gstreamer-plugins-bad-codecs
Changes of Revision 5
gstreamer-plugins-bad-codecs.changes
Changed
@@ -1,4 +1,9 @@
 -------------------------------------------------------------------
+Sat Mar 26 01:04:36 UTC 2022 - Bjørn Lie <zaitor@opensuse.org>
+
+- Update to version 1.20.1
+
+-------------------------------------------------------------------
 Fri Feb 4 20:56:11 UTC 2022 - Bjørn Lie <zaitor@opensuse.org>
 
 - Update to version 1.18.6
gstreamer-plugins-bad-codecs.spec
Changed
@@ -4,10 +4,10 @@
 %define _name gst-plugins-bad
 %define gst_branch 1.0
-%define _version 1.18.0
+%define _version 1.20.0
 
 Name: gstreamer-plugins-bad-codecs
-Version: 1.18.6
+Version: 1.20.1
 Release: 0
 Summary: Codecs/plugins for gstreamer-plugins-bad
 License: LGPL-2.1-or-later
@@ -64,6 +64,7 @@
 -Dpackage-name='Packman GStreamer-plugins-bad-codecs' \
 -Dpackage-origin='https://pmbs.links2linux.de' \
 -Dlibde265=enabled \
+ -Dgpl=enabled \
 -Dorc=enabled \
 -Ddts=enabled \
 -Dfaac=enabled \
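The one functional spec change above is the new `-Dgpl=enabled` meson option: starting with GStreamer 1.20, plugins that depend on GPL-licensed libraries are skipped at configure time unless this switch is set. A minimal sketch of the configure step the spec now performs — the `build` directory name and the trimmed option list are assumptions for illustration, not taken verbatim from the spec:

```shell
# Sketch of the meson configure call implied by the spec (hypothetical paths).
# Run from an unpacked gst-plugins-bad source tree.
meson setup build \
    -Dpackage-name='Packman GStreamer-plugins-bad-codecs' \
    -Dpackage-origin='https://pmbs.links2linux.de' \
    -Dlibde265=enabled \
    -Dgpl=enabled \
    -Dorc=enabled \
    -Ddts=enabled \
    -Dfaac=enabled
ninja -C build
```

Without `-Dgpl=enabled`, a 1.20.x configure run reportedly marks the GPL-dependent plugins as disabled even when their dependencies (such as libde265) are found, which is why the option had to be added alongside the version bump.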
gst-plugins-bad-1.18.6.tar.xz/.gitlab-ci.yml
Deleted
@@ -1,1 +0,0 @@
-include: "https://gitlab.freedesktop.org/gstreamer/gst-ci/raw/1.18/gitlab/ci_template.yml"
gst-plugins-bad-1.18.6.tar.xz/.indentignore
Deleted
@@ -1,1 +0,0 @@
-ext/sctp/usrsctp/usrsctplib/
gst-plugins-bad-1.18.6.tar.xz/README
Deleted
@@ -1,252 +0,0 @@
-GStreamer 1.18.x stable series
-
-WHAT IT IS
-----------
-
-This is GStreamer, a framework for streaming media.
-
-WHERE TO START
---------------
-
-We have a website at
-
-  https://gstreamer.freedesktop.org
-
-Our documentation, including tutorials, API reference and FAQ can be found at
-
-  https://gstreamer.freedesktop.org/documentation/
-
-You can subscribe to our mailing lists:
-
-  https://lists.freedesktop.org/mailman/listinfo/gstreamer-announce
-
-  https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
-
-We track bugs, feature requests and merge requests (patches) in GitLab at
-
-  https://gitlab.freedesktop.org/gstreamer/
-
-You can join us on IRC - #gstreamer on irc.freenode.org
-
-GStreamer 1.0 series
---------------------
-
-Starring
-
-  GSTREAMER
-
-The core around which all other modules revolve. Base functionality and
-libraries, some essential elements, documentation, and testing.
-
-  BASE
-
-A well-groomed and well-maintained collection of GStreamer plug-ins and
-elements, spanning the range of possible types of elements one would want
-to write for GStreamer.
-
-And introducing, for the first time ever, on the development screen ...
-
-  THE GOOD
-
-  --- "Such ingratitude. After all the times I've saved your life."
-
-A collection of plug-ins you'd want to have right next to you on the
-battlefield. Shooting sharp and making no mistakes, these plug-ins have it
-all: good looks, good code, and good licensing. Documented and dressed up
-in tests. If you're looking for a role model to base your own plug-in on,
-here it is.
-
-If you find a plot hole or a badly lip-synced line of code in them,
-let us know - it is a matter of honour for us to ensure Blondie doesn't look
-like he's been walking 100 miles through the desert without water.
-
-  THE UGLY
-
-  --- "When you have to shoot, shoot. Don't talk."
-
-There are times when the world needs a color between black and white.
-Quality code to match the good's, but two-timing, backstabbing and ready to
-sell your freedom down the river. These plug-ins might have a patent noose
-around their neck, or a lock-up license, or any other problem that makes you
-think twice about shipping them.
-
-We don't call them ugly because we like them less. Does a mother love her
-son less because he's not as pretty as the other ones ? No - she commends
-him on his great personality. These plug-ins are the life of the party.
-And we'll still step in and set them straight if you report any unacceptable
-behaviour - because there are two kinds of people in the world, my friend:
-those with a rope around their neck and the people who do the cutting.
-
-  THE BAD
-
-  --- "That an accusation?"
-
-No perfectly groomed moustache or any amount of fine clothing is going to
-cover up the truth - these plug-ins are Bad with a capital B.
-They look fine on the outside, and might even appear to get the job done, but
-at the end of the day they're a black sheep. Without a golden-haired angel
-to watch over them, they'll probably land in an unmarked grave at the final
-showdown.
-
-Don't bug us about their quality - exercise your Free Software rights,
-patch up the offender and send us the patch on the fastest steed you can
-steal from the Confederates. Because you see, in this world, there's two
-kinds of people, my friend: those with loaded guns and those who dig.
-You dig.
-
-The Lowdown
------------
-
-  --- "I've never seen so many plug-ins wasted so badly."
-
-GStreamer Plug-ins has grown so big that it's hard to separate the wheat from
-the chaff. Also, distributors have brought up issues about the legal status
-of some of the plug-ins we ship. To remedy this, we've divided the previous
-set of available plug-ins into four modules:
-
-- gst-plugins-base: a small and fixed set of plug-ins, covering a wide range
-  of possible types of elements; these are continuously kept up-to-date
-  with any core changes during the development series.
-
-  - We believe distributors can safely ship these plug-ins.
-  - People writing elements should base their code on these elements.
-  - These elements come with examples, documentation, and regression tests.
-
-- gst-plugins-good: a set of plug-ins that we consider to have good quality
-  code, correct functionality, our preferred license (LGPL for the plug-in
-  code, LGPL or LGPL-compatible for the supporting library).
-
-  - We believe distributors can safely ship these plug-ins.
-  - People writing elements should base their code on these elements.
-
-- gst-plugins-ugly: a set of plug-ins that have good quality and correct
-  functionality, but distributing them might pose problems. The license
-  on either the plug-ins or the supporting libraries might not be how we'd
-  like. The code might be widely known to present patent problems.
-
-  - Distributors should check if they want/can ship these plug-ins.
-  - People writing elements should base their code on these elements.
-
-- gst-plugins-bad: a set of plug-ins that aren't up to par compared to the
-  rest. They might be close to being good quality, but they're missing
-  something - be it a good code review, some documentation, a set of tests,
-  a real live maintainer, or some actual wide use.
-  If the blanks are filled in they might be upgraded to become part of
-  either gst-plugins-good or gst-plugins-ugly, depending on the other factors.
-
-  - If the plug-ins break, you can't complain - instead, you can fix the
-    problem and send us a patch, or bribe someone into fixing them for you.
-  - New contributors can start here for things to work on.
-
-PLATFORMS
----------
-
-- Linux is of course fully supported
-- FreeBSD is reported to work; other BSDs should work too; same for Solaris
-- MacOS works, binary 1.x packages can be built using the cerbero build tool
-- Windows works; binary 1.x packages can be built using the cerbero build tool
-  - MSys/MinGW builds
-  - Microsoft Visual Studio builds are also available and supported
-- Android works, binary 1.x packages can be built using the cerbero build tool
-- iOS works
-
-INSTALLING FROM PACKAGES
-------------------------
-
-You should always prefer installing from packages first. GStreamer is
-well-maintained for a number of distributions, including Fedora, Debian,
-Ubuntu, Mandrake, Arch Linux, Gentoo, ...
-
-Only in cases where you:
-
-  - want to hack on GStreamer
-  - want to verify that a bug has been fixed
-  - do not have a sane distribution
-
-should you choose to build from source tarballs or git.
-
-Find more information about the various packages at
-
-  https://gstreamer.freedesktop.org/download/
-
-COMPILING FROM SOURCE TARBALLS
-------------------------------
-
-- again, make sure that you really need to install from source!
-  If GStreamer is one of your first projects ever that you build from source,
-  consider taking on an easier project.
-
-- you need a recent version of Meson installed, see
-
-    http://mesonbuild.com/Getting-meson.html
-
-  and
-
-    https://gitlab.freedesktop.org/gstreamer/gst-build/blob/master/README.md
-
-- run
-
-    meson build
-    ninja -C build
-
-  to build GStreamer.
-
-- if you want to install it (not required, but what you usually want to do), run
-
-    ninja -C build install
-
-- try out a simple test:
-    gst-launch-1.0 -v fakesrc num_buffers=5 ! fakesink
-  (If you didn't install GStreamer, run `./build/tools/gst-launch-1.0`)
-
-  If it outputs a bunch of messages from fakesrc and fakesink, everything is
-  ok.
-
-  If it did not work, keep in mind that you might need to adjust the
-  PATH and/or LD_LIBRARY_PATH environment variables to make the system
-  find GStreamer in the prefix where you installed (by default that is /usr/local).
-
-- After this, you're ready to install gst-plugins, which will provide the
-  functionality you're probably looking for by now, so go on and read
-  that README.
-
-COMPILING FROM GIT
-------------------
-
-You can build an uninstalled GStreamer from git for development or testing
-purposes without affecting your system installation.
-
-Get started with:
-
-    git clone https://gitlab.freedesktop.org/gstreamer/gst-build
-    meson build
-    ninja -C build
-    ninja -C build uninstalled
-
-For more information, see the `gst-build` module and its documentation:
-
-    https://gitlab.freedesktop.org/gstreamer/gst-build/blob/master/README.md
-
-
-PLUG-IN DEPENDENCIES AND LICENSES
----------------------------------
-
-GStreamer is developed under the terms of the LGPL (see COPYING file for
-details). Some of our plug-ins however rely on libraries which are available
-under other licenses. This means that if you are distributing an application
-which has a non-GPL compatible license (for instance a closed-source
-application) with GStreamer, you have to make sure not to distribute GPL-linked
-plug-ins.
-
-When using GPL-linked plug-ins, GStreamer is for all practical reasons
-under the GPL itself.
-
-HISTORY
--------
-
-The fundamental design comes from the video pipeline at Oregon Graduate
-Institute, as well as some ideas from DirectMedia. It's based on plug-ins that
-will provide the various codec and other functionality. The interface
-hopefully is generic enough for various companies (ahem, Apple) to release
-binary codecs for Linux, until such time as they get a clue and release the
-source.
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gsthls.h
Deleted
@@ -1,12 +0,0 @@
-#ifndef __GST_HLS_H__
-#define __GST_HLS_H__
-
-#include <gst/gst.h>
-
-G_BEGIN_DECLS
-
-GST_DEBUG_CATEGORY_EXTERN (hls_debug);
-
-G_END_DECLS
-
-#endif /* __GST_HLS_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkate.h
Deleted
@@ -1,55 +0,0 @@
-/*
- * GStreamer
- * Copyright (C) 2008 Vincent Penquerc'h <ogg.k.ogg.k@googlemail.com>
- *
- * Permission is hereby granted, free of charge, to any person obtaining a
- * copy of this software and associated documentation files (the "Software"),
- * to deal in the Software without restriction, including without limitation
- * the rights to use, copy, modify, merge, publish, distribute, sublicense,
- * and/or sell copies of the Software, and to permit persons to whom the
- * Software is furnished to do so, subject to the following conditions:
- *
- * The above copyright notice and this permission notice shall be included in
- * all copies or substantial portions of the Software.
- *
- * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
- * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
- * DEALINGS IN THE SOFTWARE.
- *
- * Alternatively, the contents of this file may be used under the
- * GNU Lesser General Public License Version 2.1 (the "LGPL"), in
- * which case the following provisions apply instead of the ones
- * mentioned above:
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_KATE_H__
-#define __GST_KATE_H__
-
-#include <gst/gst.h>
-
-G_BEGIN_DECLS
-
-/* nothing here any more */
-
-G_END_DECLS
-
-#endif /* __GST_KATE_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/libmms
Deleted
-(directory)
gst-plugins-bad-1.18.6.tar.xz/ext/libmms/gstmms.c
Deleted
@@ -1,637 +0,0 @@ -/* - * - * GStreamer - * Copyright (C) 1999-2001 Erik Walthinsen <omega@cse.ogi.edu> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -# include <config.h> -#endif - -#include <gst/gst.h> -#include <stdio.h> -#include <string.h> -#include "gstmms.h" - -#define DEFAULT_CONNECTION_SPEED 0 - -enum -{ - PROP_0, - PROP_LOCATION, - PROP_CONNECTION_SPEED -}; - - -GST_DEBUG_CATEGORY_STATIC (mmssrc_debug); -#define GST_CAT_DEFAULT mmssrc_debug - -static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", - GST_PAD_SRC, - GST_PAD_ALWAYS, - GST_STATIC_CAPS ("video/x-ms-asf") - ); - -static void gst_mms_finalize (GObject * gobject); -static void gst_mms_uri_handler_init (gpointer g_iface, gpointer iface_data); - -static void gst_mms_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec); -static void gst_mms_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec); - -static gboolean gst_mms_query (GstBaseSrc * src, GstQuery * query); - -static gboolean gst_mms_start (GstBaseSrc * bsrc); -static gboolean gst_mms_stop (GstBaseSrc * bsrc); -static gboolean gst_mms_is_seekable (GstBaseSrc * src); -static gboolean gst_mms_get_size (GstBaseSrc * src, guint64 * 
size); -static gboolean gst_mms_prepare_seek_segment (GstBaseSrc * src, - GstEvent * event, GstSegment * segment); -static gboolean gst_mms_do_seek (GstBaseSrc * src, GstSegment * segment); - -static GstFlowReturn gst_mms_create (GstPushSrc * psrc, GstBuffer ** buf); - -static gboolean gst_mms_uri_set_uri (GstURIHandler * handler, - const gchar * uri, GError ** error); - -#define gst_mms_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstMMS, gst_mms, GST_TYPE_PUSH_SRC, - G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_mms_uri_handler_init)); - -/* initialize the plugin's class */ -static void -gst_mms_class_init (GstMMSClass * klass) -{ - GObjectClass *gobject_class = (GObjectClass *) klass; - GstElementClass *gstelement_class = (GstElementClass *) klass; - GstBaseSrcClass *gstbasesrc_class = (GstBaseSrcClass *) klass; - GstPushSrcClass *gstpushsrc_class = (GstPushSrcClass *) klass; - - gobject_class->set_property = gst_mms_set_property; - gobject_class->get_property = gst_mms_get_property; - gobject_class->finalize = gst_mms_finalize; - - g_object_class_install_property (gobject_class, PROP_LOCATION, - g_param_spec_string ("location", "location", - "Host URL to connect to. 
Accepted are mms://, mmsu://, mmst:// URL types", - NULL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - /* Note: connection-speed is intentionaly limited to G_MAXINT as libmms - * uses int for it */ - g_object_class_install_property (gobject_class, PROP_CONNECTION_SPEED, - g_param_spec_uint64 ("connection-speed", "Connection Speed", - "Network connection speed in kbps (0 = unknown)", - 0, G_MAXINT / 1000, DEFAULT_CONNECTION_SPEED, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - gst_element_class_add_static_pad_template (gstelement_class, &src_factory); - - gst_element_class_set_static_metadata (gstelement_class, - "MMS streaming source", "Source/Network", - "Receive data streamed via MSFT Multi Media Server protocol", - "Maciej Katafiasz <mathrick@users.sourceforge.net>"); - - GST_DEBUG_CATEGORY_INIT (mmssrc_debug, "mmssrc", 0, "MMS Source Element"); - - gstbasesrc_class->start = GST_DEBUG_FUNCPTR (gst_mms_start); - gstbasesrc_class->stop = GST_DEBUG_FUNCPTR (gst_mms_stop); - - gstpushsrc_class->create = GST_DEBUG_FUNCPTR (gst_mms_create); - - gstbasesrc_class->is_seekable = GST_DEBUG_FUNCPTR (gst_mms_is_seekable); - gstbasesrc_class->get_size = GST_DEBUG_FUNCPTR (gst_mms_get_size); - gstbasesrc_class->prepare_seek_segment = - GST_DEBUG_FUNCPTR (gst_mms_prepare_seek_segment); - gstbasesrc_class->do_seek = GST_DEBUG_FUNCPTR (gst_mms_do_seek); - gstbasesrc_class->query = GST_DEBUG_FUNCPTR (gst_mms_query); -} - -/* initialize the new element - * instantiate pads and add them to element - * set functions - * initialize structure - */ -static void -gst_mms_init (GstMMS * mmssrc) -{ - mmssrc->uri_name = NULL; - mmssrc->current_connection_uri_name = NULL; - mmssrc->connection = NULL; - mmssrc->connection_speed = DEFAULT_CONNECTION_SPEED; -} - -static void -gst_mms_finalize (GObject * gobject) -{ - GstMMS *mmssrc = GST_MMS (gobject); - - /* We may still have a connection open, as we preserve unused / pristine - open connections in stop to reuse them in start. 
*/ - if (mmssrc->connection) { - mmsx_close (mmssrc->connection); - mmssrc->connection = NULL; - } - - if (mmssrc->current_connection_uri_name) { - g_free (mmssrc->current_connection_uri_name); - mmssrc->current_connection_uri_name = NULL; - } - - if (mmssrc->uri_name) { - g_free (mmssrc->uri_name); - mmssrc->uri_name = NULL; - } - - G_OBJECT_CLASS (parent_class)->finalize (gobject); -} - -/* FIXME operating in TIME rather than BYTES could remove this altogether - * and be more convenient elsewhere */ -static gboolean -gst_mms_query (GstBaseSrc * src, GstQuery * query) -{ - GstMMS *mmssrc = GST_MMS (src); - gboolean res = TRUE; - GstFormat format; - gint64 value; - - switch (GST_QUERY_TYPE (query)) { - case GST_QUERY_POSITION: - gst_query_parse_position (query, &format, &value); - if (format != GST_FORMAT_BYTES) { - res = FALSE; - break; - } - value = (gint64) mmsx_get_current_pos (mmssrc->connection); - gst_query_set_position (query, format, value); - break; - case GST_QUERY_DURATION: - if (!mmsx_get_seekable (mmssrc->connection)) { - res = FALSE; - break; - } - gst_query_parse_duration (query, &format, &value); - switch (format) { - case GST_FORMAT_BYTES: - value = (gint64) mmsx_get_length (mmssrc->connection); - gst_query_set_duration (query, format, value); - break; - case GST_FORMAT_TIME: - value = mmsx_get_time_length (mmssrc->connection) * GST_SECOND; - gst_query_set_duration (query, format, value); - break; - default: - res = FALSE; - } - break; - default: - /* chain to parent */ - res = - GST_BASE_SRC_CLASS (parent_class)->query (GST_BASE_SRC (src), query); - break; - } - - return res; -} - - -static gboolean -gst_mms_prepare_seek_segment (GstBaseSrc * src, GstEvent * event, - GstSegment * segment) -{ - GstSeekType cur_type, stop_type; - gint64 cur, stop; - GstSeekFlags flags; - GstFormat seek_format; - gdouble rate; - - gst_event_parse_seek (event, &rate, &seek_format, &flags, - &cur_type, &cur, &stop_type, &stop); - - if (seek_format != GST_FORMAT_BYTES 
&& seek_format != GST_FORMAT_TIME) { - GST_LOG_OBJECT (src, "Only byte or time seeking is supported"); - return FALSE; - } - - if (stop_type != GST_SEEK_TYPE_NONE) { - GST_LOG_OBJECT (src, "Stop seeking not supported"); - return FALSE; - } - - if (cur_type != GST_SEEK_TYPE_NONE && cur_type != GST_SEEK_TYPE_SET) { - GST_LOG_OBJECT (src, "Only absolute seeking is supported"); - return FALSE; - } - - /* We would like to convert from GST_FORMAT_TIME to GST_FORMAT_BYTES here - when needed, but we cannot as to do that we need to actually do the seek, - so we handle this in do_seek instead. */ - - /* FIXME implement relative seeking, we could do any needed relevant - seeking calculations here (in seek_format metrics), before the re-init - of the segment. */ - - gst_segment_init (segment, seek_format); - gst_segment_do_seek (segment, rate, seek_format, flags, cur_type, cur, - stop_type, stop, NULL); - - return TRUE; -} - -static gboolean -gst_mms_do_seek (GstBaseSrc * src, GstSegment * segment) -{ - gint64 start; - GstMMS *mmssrc = GST_MMS (src); - - if (segment->format == GST_FORMAT_TIME) { - if (!mmsx_time_seek (NULL, mmssrc->connection, - (double) segment->start / GST_SECOND)) { - GST_LOG_OBJECT (mmssrc, "mmsx_time_seek() failed"); - return FALSE; - } - start = mmsx_get_current_pos (mmssrc->connection); - GST_INFO_OBJECT (mmssrc, - "performed seek to %" GST_TIME_FORMAT ", offset after " "seek: %" - G_GINT64_FORMAT, GST_TIME_ARGS (segment->start), start); - } else if (segment->format == GST_FORMAT_BYTES) { - start = mmsx_seek (NULL, mmssrc->connection, segment->start, SEEK_SET); - /* mmsx_seek will close and reopen the connection when seeking with the - mmsh protocol, if the reopening fails this is indicated with -1 */ - if (start == -1) { - GST_DEBUG_OBJECT (mmssrc, "connection broken during seek"); - return FALSE; - } - GST_INFO_OBJECT (mmssrc, "performed seek to: %" G_GINT64_FORMAT " bytes, " - "result: %" G_GINT64_FORMAT, segment->start, start); - } else { - 
GST_DEBUG_OBJECT (mmssrc, "unsupported seek segment format: %s", - GST_STR_NULL (gst_format_get_name (segment->format))); - return FALSE; - } - gst_segment_init (segment, GST_FORMAT_BYTES); - gst_segment_do_seek (segment, segment->rate, GST_FORMAT_BYTES, - GST_SEEK_FLAG_NONE, GST_SEEK_TYPE_SET, start, GST_SEEK_TYPE_NONE, - segment->stop, NULL); - return TRUE; -} - - -/* get function - * this function generates new data when needed - */ - - -static GstFlowReturn -gst_mms_create (GstPushSrc * psrc, GstBuffer ** buf) -{ - GstMMS *mmssrc = GST_MMS (psrc); - guint8 *data; - guint blocksize; - gint result; - goffset offset; - - *buf = NULL; - - offset = mmsx_get_current_pos (mmssrc->connection); - - /* Check if a seek perhaps has wrecked our connection */ - if (offset == -1) { - GST_ERROR_OBJECT (mmssrc, - "connection broken (probably an error during mmsx_seek_time during a convert query) returning FLOW_ERROR"); - return GST_FLOW_ERROR; - } - - /* Choose blocksize best for optimum performance */ - if (offset == 0) - blocksize = mmsx_get_asf_header_len (mmssrc->connection); - else - blocksize = mmsx_get_asf_packet_len (mmssrc->connection); - - data = g_try_malloc (blocksize); - if (!data) { - GST_ERROR_OBJECT (mmssrc, "Failed to allocate %u bytes", blocksize); - return GST_FLOW_ERROR; - } - - GST_LOG_OBJECT (mmssrc, "reading %d bytes", blocksize); - result = mmsx_read (NULL, mmssrc->connection, (char *) data, blocksize); - /* EOS? 
*/ - if (result == 0) - goto eos; - - *buf = gst_buffer_new_wrapped (data, result); - GST_BUFFER_OFFSET (*buf) = offset; - - GST_LOG_OBJECT (mmssrc, "Returning buffer with offset %" G_GOFFSET_FORMAT - " and size %u", offset, result); - - return GST_FLOW_OK; - -eos: - { - GST_DEBUG_OBJECT (mmssrc, "EOS"); - g_free (data); - *buf = NULL; - return GST_FLOW_EOS; - } -} - -static gboolean -gst_mms_is_seekable (GstBaseSrc * src) -{ - GstMMS *mmssrc = GST_MMS (src); - - return mmsx_get_seekable (mmssrc->connection); -} - -static gboolean -gst_mms_get_size (GstBaseSrc * src, guint64 * size) -{ - GstMMS *mmssrc = GST_MMS (src); - - /* non seekable usually means live streams, and get_length() returns, - erm, interesting values for live streams */ - if (!mmsx_get_seekable (mmssrc->connection)) - return FALSE; - - *size = mmsx_get_length (mmssrc->connection); - return TRUE; -} - -static gboolean -gst_mms_start (GstBaseSrc * bsrc) -{ - GstMMS *mms = GST_MMS (bsrc); - guint bandwidth_avail; - - if (!mms->uri_name || *mms->uri_name == '\0') - goto no_uri; - - if (mms->connection_speed) - bandwidth_avail = mms->connection_speed; - else - bandwidth_avail = G_MAXINT; - - /* If we already have a connection, and the uri isn't changed, reuse it, - as connecting is expensive. */ - if (mms->connection) { - if (!strcmp (mms->uri_name, mms->current_connection_uri_name)) { - GST_DEBUG_OBJECT (mms, "Reusing existing connection for %s", - mms->uri_name); - return TRUE; - } else { - mmsx_close (mms->connection); - g_free (mms->current_connection_uri_name); - mms->current_connection_uri_name = NULL; - } - } - - /* FIXME: pass some sane arguments here */ - GST_DEBUG_OBJECT (mms, - "Trying mms_connect (%s) with bandwidth constraint of %d bps", - mms->uri_name, bandwidth_avail); - mms->connection = mmsx_connect (NULL, NULL, mms->uri_name, bandwidth_avail); - if (mms->connection) { - /* Save the uri name so that it can be checked for connection reusing, - see above. 
*/ - mms->current_connection_uri_name = g_strdup (mms->uri_name); - GST_DEBUG_OBJECT (mms, "Connect successful"); - return TRUE; - } else { - gchar *url, *location; - - GST_ERROR_OBJECT (mms, - "Could not connect to this stream, redirecting to rtsp"); - location = strstr (mms->uri_name, "://"); - if (location == NULL || *location == '\0' || *(location + 3) == '\0') - goto no_uri; - url = g_strdup_printf ("rtsp://%s", location + 3); - - gst_element_post_message (GST_ELEMENT_CAST (mms), - gst_message_new_element (GST_OBJECT_CAST (mms), - gst_structure_new ("redirect", "new-location", G_TYPE_STRING, url, - NULL))); - - /* post an error message as well, so that applications that don't handle - * redirect messages get to see a proper error message */ - GST_ELEMENT_ERROR (mms, RESOURCE, OPEN_READ, - ("Could not connect to streaming server."), - ("A redirect message was posted on the bus and should have been " - "handled by the application.")); - - return FALSE; - } - -no_uri: - { - GST_ELEMENT_ERROR (mms, RESOURCE, OPEN_READ, - ("No URI to open specified"), (NULL)); - return FALSE; - } -} - -static gboolean -gst_mms_stop (GstBaseSrc * bsrc) -{ - GstMMS *mms = GST_MMS (bsrc); - - if (mms->connection != NULL) { - /* Check if the connection is still pristine, that is if no more then - just the mmslib cached asf header has been read. 
If it is still pristine - preserve it as we often are re-started with the same URL and connecting - is expensive */ - if (mmsx_get_current_pos (mms->connection) > - mmsx_get_asf_header_len (mms->connection)) { - mmsx_close (mms->connection); - mms->connection = NULL; - g_free (mms->current_connection_uri_name); - mms->current_connection_uri_name = NULL; - } - } - return TRUE; -} - -static void -gst_mms_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - GstMMS *mmssrc = GST_MMS (object); - - switch (prop_id) { - case PROP_LOCATION: - gst_mms_uri_set_uri (GST_URI_HANDLER (mmssrc), - g_value_get_string (value), NULL); - break; - case PROP_CONNECTION_SPEED: - GST_OBJECT_LOCK (mmssrc); - mmssrc->connection_speed = g_value_get_uint64 (value) * 1000; - GST_OBJECT_UNLOCK (mmssrc); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_mms_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstMMS *mmssrc = GST_MMS (object); - - GST_OBJECT_LOCK (mmssrc); - switch (prop_id) { - case PROP_LOCATION: - if (mmssrc->uri_name) - g_value_set_string (value, mmssrc->uri_name); - break; - case PROP_CONNECTION_SPEED: - g_value_set_uint64 (value, mmssrc->connection_speed / 1000); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } - GST_OBJECT_UNLOCK (mmssrc); -} - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and pad templates - * register the features - */ -static gboolean -plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "mmssrc", GST_RANK_NONE, GST_TYPE_MMS); -} - -static GstURIType -gst_mms_uri_get_type (GType type) -{ - return GST_URI_SRC; -} - -static const gchar *const * -gst_mms_uri_get_protocols (GType type) -{ - static const gchar *protocols[] = { "mms", "mmsh", "mmst", "mmsu", NULL }; - - return 
protocols; -} - -static gchar * -gst_mms_uri_get_uri (GstURIHandler * handler) -{ - GstMMS *src = GST_MMS (handler); - - /* FIXME: make thread-safe */ - return g_strdup (src->uri_name); -} - -static gchar * -gst_mms_src_make_valid_uri (const gchar * uri) -{ - gchar *protocol; - const gchar *colon, *tmp; - gsize len; - - if (!uri || !gst_uri_is_valid (uri)) - return NULL; - - protocol = gst_uri_get_protocol (uri); - - if ((strcmp (protocol, "mms") != 0) && (strcmp (protocol, "mmsh") != 0) && - (strcmp (protocol, "mmst") != 0) && (strcmp (protocol, "mmsu") != 0)) { - g_free (protocol); - return FALSE; - } - g_free (protocol); - - colon = strstr (uri, "://"); - if (!colon) - return NULL; - - tmp = colon + 3; - len = strlen (tmp); - if (len == 0) - return NULL; - - /* libmms segfaults if there's no hostname or - * no / after the hostname - */ - colon = strstr (tmp, "/"); - if (colon == tmp) - return NULL; - - if (strstr (tmp, "/") == NULL) { - gchar *ret; - - len = strlen (uri); - ret = g_malloc0 (len + 2); - memcpy (ret, uri, len); - ret[len] = '/'; - return ret; - } else { - return g_strdup (uri); - } -} - -static gboolean -gst_mms_uri_set_uri (GstURIHandler * handler, const gchar * uri, - GError ** error) -{ - GstMMS *src = GST_MMS (handler); - gchar *fixed_uri; - - fixed_uri = gst_mms_src_make_valid_uri (uri); - if (!fixed_uri && uri) { - g_set_error (error, GST_URI_ERROR, GST_URI_ERROR_BAD_URI, - "Invalid MMS URI"); - return FALSE; - } - - GST_OBJECT_LOCK (src); - g_free (src->uri_name); - src->uri_name = fixed_uri; - GST_OBJECT_UNLOCK (src); - - return TRUE; -} - -static void -gst_mms_uri_handler_init (gpointer g_iface, gpointer iface_data) -{ - GstURIHandlerInterface *iface = (GstURIHandlerInterface *) g_iface; - - iface->get_type = gst_mms_uri_get_type; - iface->get_protocols = gst_mms_uri_get_protocols; - iface->get_uri = gst_mms_uri_get_uri; - iface->set_uri = gst_mms_uri_set_uri; -} - - -/* this is the structure that gst-register looks for - * so keep the 
name plugin_desc, or you cannot get your plug-in registered */ -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - mms, - "Microsoft Multi Media Server streaming protocol support", - plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.18.6.tar.xz/ext/libmms/gstmms.h
Deleted
@@ -1,49 +0,0 @@
-/*
- * gstmms.h: header file for gst-mms plugin
- */
-
-#ifndef __GST_MMS_H__
-#define __GST_MMS_H__
-
-#include <gst/gst.h>
-#include <libmms/mmsx.h>
-#include <gst/base/gstpushsrc.h>
-
-G_BEGIN_DECLS
-
-/* #define's don't like whitespacey bits */
-#define GST_TYPE_MMS \
-  (gst_mms_get_type())
-#define GST_MMS(obj) \
-  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_MMS,GstMMS))
-#define GST_MMS_CLASS(klass) \
-  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_MMS,GstMMSClass))
-#define GST_IS_MMS(obj) \
-  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_MMS))
-#define GST_IS_MMS_CLASS(klass) \
-  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_MMS))
-
-typedef struct _GstMMS GstMMS;
-typedef struct _GstMMSClass GstMMSClass;
-
-struct _GstMMS
-{
-  GstPushSrc parent;
-
-  gchar *uri_name;
-  gchar *current_connection_uri_name;
-  guint64 connection_speed;
-
-  mmsx_t *connection;
-};
-
-struct _GstMMSClass
-{
-  GstPushSrcClass parent_class;
-};
-
-GType gst_mms_get_type (void);
-
-G_END_DECLS
-
-#endif /* __GST_MMS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/libmms/meson.build
Deleted
@@ -1,15 +0,0 @@
-mms_dep = dependency('libmms', version : '>= 0.4', required : get_option('libmms'))
-
-if mms_dep.found()
-  gstmms = library('gstmms',
-    'gstmms.c',
-    c_args : gst_plugins_bad_args,
-    link_args : noseh_link_args,
-    include_directories : [configinc],
-    dependencies : [gstbase_dep, mms_dep],
-    install : true,
-    install_dir : plugins_install_dir,
-  )
-  pkgconfig.generate(gstmms, install_dir : plugins_pkgconfig_install_dir)
-  plugins += [gstmms]
-endif
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ofa
Deleted
-(directory)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ofa/gstofa.c
Deleted
@@ -1,279 +0,0 @@ -/* GStreamer ofa fingerprinting element - * Copyright (C) 2006 M. Derezynski - * Copyright (C) 2008 Eric Buehl - * Copyright (C) 2008 Sebastian Dröge <slomo@circular-chaos.org> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301 USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include <gst/gst.h> -#include <ofa1/ofa.h> -#include "gstofa.h" - -#define PAD_CAPS \ - "audio/x-raw, " \ - "format = { S16LE, S16BE }, " \ - "rate = (int) [ 1, MAX ], " \ - "channels = (int) [ 1, 2 ]" - -GST_DEBUG_CATEGORY_STATIC (gst_ofa_debug); -#define GST_CAT_DEFAULT gst_ofa_debug - -enum -{ - PROP_0, - PROP_FINGERPRINT, -}; - -#define parent_class gst_ofa_parent_class -G_DEFINE_TYPE (GstOFA, gst_ofa, GST_TYPE_AUDIO_FILTER); - -static void gst_ofa_finalize (GObject * object); -static void gst_ofa_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec); -static GstFlowReturn gst_ofa_transform_ip (GstBaseTransform * trans, - GstBuffer * buf); -static gboolean gst_ofa_sink_event (GstBaseTransform * trans, GstEvent * event); - -static void -gst_ofa_finalize (GObject * object) -{ - GstOFA *ofa = GST_OFA (object); - - if (ofa->adapter) { - g_object_unref (ofa->adapter); - ofa->adapter = NULL; - } - - g_free (ofa->fingerprint); - 
ofa->fingerprint = NULL; - - G_OBJECT_CLASS (parent_class)->finalize (object); -} - -static void -gst_ofa_class_init (GstOFAClass * klass) -{ - GstBaseTransformClass *gstbasetrans_class = GST_BASE_TRANSFORM_CLASS (klass); - GstAudioFilterClass *audio_filter_class = GST_AUDIO_FILTER_CLASS (klass); - GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstCaps *caps; - - gobject_class->get_property = gst_ofa_get_property; - - g_object_class_install_property (gobject_class, PROP_FINGERPRINT, - g_param_spec_string ("fingerprint", "Resulting fingerprint", - "Resulting fingerprint", NULL, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - gobject_class->finalize = gst_ofa_finalize; - - gstbasetrans_class->transform_ip = GST_DEBUG_FUNCPTR (gst_ofa_transform_ip); - gstbasetrans_class->sink_event = GST_DEBUG_FUNCPTR (gst_ofa_sink_event); - gstbasetrans_class->passthrough_on_same_caps = TRUE; - - gst_element_class_set_static_metadata (gstelement_class, "OFA", - "MusicIP Fingerprinting element", - "Find a music fingerprint using MusicIP's libofa", - "Milosz Derezynski <internalerror@gmail.com>, " - "Eric Buehl <eric.buehl@gmail.com>"); - - caps = gst_caps_from_string (PAD_CAPS); - gst_audio_filter_class_add_pad_templates (audio_filter_class, caps); - gst_caps_unref (caps); -} - -static void -create_fingerprint (GstOFA * ofa) -{ - GstAudioFilter *audiofilter = GST_AUDIO_FILTER (ofa); - const guint8 *samples; - const gchar *fingerprint; - gint rate, channels, endianness; - GstTagList *tags; - gsize available; - - available = gst_adapter_available (ofa->adapter); - - if (available == 0) { - GST_WARNING_OBJECT (ofa, "No data to take fingerprint from"); - ofa->record = FALSE; - return; - } - - rate = GST_AUDIO_INFO_RATE (&audiofilter->info); - channels = GST_AUDIO_INFO_CHANNELS (&audiofilter->info); - if (GST_AUDIO_INFO_ENDIANNESS (&audiofilter->info) == G_BIG_ENDIAN) - endianness = OFA_BIG_ENDIAN; - else - 
endianness = OFA_LITTLE_ENDIAN; - - - GST_DEBUG_OBJECT (ofa, "Generating fingerprint for %" G_GSIZE_FORMAT - " samples", available / sizeof (gint16)); - - samples = gst_adapter_map (ofa->adapter, available); - - fingerprint = ofa_create_print ((unsigned char *) samples, endianness, - available / sizeof (gint16), rate, (channels == 2) ? 1 : 0); - - gst_adapter_unmap (ofa->adapter); - gst_adapter_flush (ofa->adapter, available); - - if (fingerprint == NULL) { - GST_WARNING_OBJECT (ofa, "Failed to generate fingerprint"); - goto done; - } - - GST_INFO_OBJECT (ofa, "Generated fingerprint: %s", fingerprint); - ofa->fingerprint = g_strdup (fingerprint); - - // FIXME: combine with upstream tags - tags = gst_tag_list_new (GST_TAG_OFA_FINGERPRINT, ofa->fingerprint, NULL); - gst_pad_push_event (GST_BASE_TRANSFORM_SRC_PAD (ofa), - gst_event_new_tag (tags)); - - g_object_notify (G_OBJECT (ofa), "fingerprint"); - -done: - - ofa->record = FALSE; -} - -static gboolean -gst_ofa_sink_event (GstBaseTransform * trans, GstEvent * event) -{ - GstOFA *ofa = GST_OFA (trans); - - switch (GST_EVENT_TYPE (event)) { - case GST_EVENT_FLUSH_STOP: - case GST_EVENT_SEGMENT: - GST_DEBUG_OBJECT (ofa, "Got %s event, clearing buffer", - GST_EVENT_TYPE_NAME (event)); - gst_adapter_clear (ofa->adapter); - /* FIXME: should we really always reset this instead of using an - * already-existing fingerprint? 
Assumes fingerprints are always - * extracted in a separate pipeline instead of a live playback - * situation */ - ofa->record = TRUE; - g_free (ofa->fingerprint); - ofa->fingerprint = NULL; - break; - case GST_EVENT_EOS: - /* we got to the end of the stream but never generated a fingerprint - * (probably under 135 seconds) - */ - if (!ofa->fingerprint && ofa->record) - create_fingerprint (ofa); - break; - default: - break; - } - - return GST_BASE_TRANSFORM_CLASS (parent_class)->sink_event (trans, event); -} - -static void -gst_ofa_init (GstOFA * ofa) -{ - gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (ofa), TRUE); - - ofa->fingerprint = NULL; - ofa->record = TRUE; - - ofa->adapter = gst_adapter_new (); -} - -static GstFlowReturn -gst_ofa_transform_ip (GstBaseTransform * trans, GstBuffer * buf) -{ - GstAudioFilter *audiofilter = GST_AUDIO_FILTER (trans); - GstOFA *ofa = GST_OFA (trans); - guint64 nframes; - GstClockTime duration; - gint rate, channels; - - rate = GST_AUDIO_INFO_RATE (&audiofilter->info); - channels = GST_AUDIO_INFO_CHANNELS (&audiofilter->info); - - if (rate == 0 || channels == 0) - return GST_FLOW_NOT_NEGOTIATED; - - if (!ofa->record) - return GST_FLOW_OK; - - gst_adapter_push (ofa->adapter, gst_buffer_copy (buf)); - - nframes = gst_adapter_available (ofa->adapter) / (channels * 2); - duration = GST_FRAMES_TO_CLOCK_TIME (nframes, rate); - - if (duration >= 135 * GST_SECOND && ofa->fingerprint == NULL) - create_fingerprint (ofa); - - return GST_FLOW_OK; -} - -static void -gst_ofa_get_property (GObject * object, guint prop_id, GValue * value, - GParamSpec * pspec) -{ - GstOFA *ofa = GST_OFA (object); - - switch (prop_id) { - case PROP_FINGERPRINT: - g_value_set_string (value, ofa->fingerprint); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - - -static gboolean -plugin_init (GstPlugin * plugin) -{ - gboolean ret; - int major, minor, rev; - - GST_DEBUG_CATEGORY_INIT (gst_ofa_debug, "ofa", 0, 
"ofa element"); - - ofa_get_version (&major, &minor, &rev); - - GST_DEBUG ("libofa %d.%d.%d", major, minor, rev); - - ret = gst_element_register (plugin, "ofa", GST_RANK_NONE, GST_TYPE_OFA); - - if (ret) { - /* TODO: get this into core */ - gst_tag_register (GST_TAG_OFA_FINGERPRINT, GST_TAG_FLAG_META, - G_TYPE_STRING, "ofa fingerprint", "OFA fingerprint", NULL); - } - - return ret; -} - -/* FIXME: someone write a libofa replacement with an LGPL or BSD license */ -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - ofa, - "Calculate MusicIP fingerprint from audio files", - plugin_init, VERSION, "GPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ofa/gstofa.h
Deleted
@@ -1,79 +0,0 @@
-/* GStreamer
- *
- * gstofa.h
- *
- * Copyright (C) 2006 M. Derezynski
- * Copyright (C) 2008 Eric Buehl
- * Copyright (C) 2008 Sebastian Dröge <slomo@circular-chaos.org>
- *
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301 USA.
- */
-
-#ifndef __GST_OFA_H__
-#define __GST_OFA_H__
-
-#include <gst/gst.h>
-#include <gst/base/gstadapter.h>
-#include <gst/audio/gstaudiofilter.h>
-#include <gst/audio/audio.h>
-
-G_BEGIN_DECLS
-
-#define GST_TYPE_OFA \
-  (gst_ofa_get_type())
-#define GST_OFA(obj) \
-  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_OFA,GstOFA))
-#define GST_OFA_CLASS(klass) \
-  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_OFA,GstOFAClass))
-#define GST_IS_OFA(obj) \
-  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_OFA))
-#define GST_IS_OFA_CLASS(klass) \
-  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_OFA))
-
-#define GST_TAG_OFA_FINGERPRINT "ofa-fingerprint"
-
-typedef struct _GstOFA GstOFA;
-typedef struct _GstOFAClass GstOFAClass;
-
-
-/**
- * GstOFA:
- *
- * Opaque #GstOFA data structure
- */
-
-struct _GstOFA
-{
-  GstAudioFilter audiofilter;
-
-  /*< private > */
-
-  GstAdapter *adapter;
-  char *fingerprint;
-  gboolean record;
-};
-
-struct _GstOFAClass
-{
-  GstAudioFilterClass audiofilter_class;
-};
-
-GType gst_ofa_get_type (void);
-
-G_END_DECLS
-
-#endif /* __GST_OFA_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ofa/meson.build
Deleted
@@ -1,13 +0,0 @@
-ofa_dep = dependency('libofa', version: '>= 0.9.3', required: get_option('ofa'))
-
-if ofa_dep.found()
-  gstofa = library('gstofa', 'gstofa.c',
-    c_args: gst_plugins_bad_args,
-    include_directories: [configinc],
-    dependencies: [gstaudio_dep, ofa_dep],
-    install: true,
-    install_dir: plugins_install_dir,
-  )
-  pkgconfig.generate(gstofa, install_dir: plugins_pkgconfig_install_dir)
-  plugins += [gstofa]
-endif
View file
gst-plugins-bad-1.18.6.tar.xz/ext/sndfile/gstsf.h
Deleted
@@ -1,44 +0,0 @@
-/* GStreamer libsndfile plugin
- * Copyright (C) 2003 Andy Wingo <wingo at pobox dot com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-
-#ifndef __GST_SFSINK_H__
-#define __GST_SFSINK_H__
-
-
-#include <gst/gst.h>
-#include <sndfile.h>
-
-
-G_BEGIN_DECLS
-
-GstCaps *gst_sf_create_audio_template_caps (void);
-
-#define GST_TYPE_SF_MAJOR_TYPES (gst_sf_major_types_get_type())
-#define GST_TYPE_SF_MINOR_TYPES (gst_sf_minor_types_get_type())
-
-GType gst_sf_major_types_get_type (void);
-GType gst_sf_minor_types_get_type (void);
-
-GType gst_sf_dec_get_type (void);
-
-G_END_DECLS
-
-
-#endif /* __GST_SFSINK_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/sctptransport.c
Deleted
@@ -1,270 +0,0 @@ -/* GStreamer - * Copyright (C) 2018 Matthew Waters <matthew@centricular.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -# include "config.h" -#endif - -#include <stdio.h> - -#include "sctptransport.h" -#include "gstwebrtcbin.h" - -#define GST_CAT_DEFAULT gst_webrtc_sctp_transport_debug -GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); - -enum -{ - SIGNAL_0, - ON_RESET_STREAM_SIGNAL, - LAST_SIGNAL, -}; - -enum -{ - PROP_0, - PROP_TRANSPORT, - PROP_STATE, - PROP_MAX_MESSAGE_SIZE, - PROP_MAX_CHANNELS, -}; - -static guint gst_webrtc_sctp_transport_signals[LAST_SIGNAL] = { 0 }; - -#define gst_webrtc_sctp_transport_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstWebRTCSCTPTransport, gst_webrtc_sctp_transport, - GST_TYPE_OBJECT, GST_DEBUG_CATEGORY_INIT (gst_webrtc_sctp_transport_debug, - "webrtcsctptransport", 0, "webrtcsctptransport");); - -typedef void (*SCTPTask) (GstWebRTCSCTPTransport * sctp, gpointer user_data); - -struct task -{ - GstWebRTCSCTPTransport *sctp; - SCTPTask func; - gpointer user_data; - GDestroyNotify notify; -}; - -static void -_execute_task (GstWebRTCBin * webrtc, struct task *task) -{ - if (task->func) - task->func (task->sctp, task->user_data); -} - -static void -_free_task (struct task *task) 
-{ - gst_object_unref (task->sctp); - - if (task->notify) - task->notify (task->user_data); - g_free (task); -} - -static void -_sctp_enqueue_task (GstWebRTCSCTPTransport * sctp, SCTPTask func, - gpointer user_data, GDestroyNotify notify) -{ - struct task *task = g_new0 (struct task, 1); - - task->sctp = gst_object_ref (sctp); - task->func = func; - task->user_data = user_data; - task->notify = notify; - - gst_webrtc_bin_enqueue_task (sctp->webrtcbin, - (GstWebRTCBinFunc) _execute_task, task, (GDestroyNotify) _free_task, - NULL); -} - -static void -_emit_stream_reset (GstWebRTCSCTPTransport * sctp, gpointer user_data) -{ - guint stream_id = GPOINTER_TO_UINT (user_data); - - g_signal_emit (sctp, - gst_webrtc_sctp_transport_signals[ON_RESET_STREAM_SIGNAL], 0, stream_id); -} - -static void -_on_sctp_dec_pad_removed (GstElement * sctpdec, GstPad * pad, - GstWebRTCSCTPTransport * sctp) -{ - guint stream_id; - - if (sscanf (GST_PAD_NAME (pad), "src_%u", &stream_id) != 1) - return; - - _sctp_enqueue_task (sctp, (SCTPTask) _emit_stream_reset, - GUINT_TO_POINTER (stream_id), NULL); -} - -static void -_on_sctp_association_established (GstElement * sctpenc, gboolean established, - GstWebRTCSCTPTransport * sctp) -{ - GST_OBJECT_LOCK (sctp); - if (established) - sctp->state = GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED; - else - sctp->state = GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED; - sctp->association_established = established; - GST_OBJECT_UNLOCK (sctp); - - g_object_notify (G_OBJECT (sctp), "state"); -} - -static void -gst_webrtc_sctp_transport_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ -// GstWebRTCSCTPTransport *sctp = GST_WEBRTC_SCTP_TRANSPORT (object); - - switch (prop_id) { - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_webrtc_sctp_transport_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstWebRTCSCTPTransport *sctp = 
GST_WEBRTC_SCTP_TRANSPORT (object); - - switch (prop_id) { - case PROP_TRANSPORT: - g_value_set_object (value, sctp->transport); - break; - case PROP_STATE: - g_value_set_enum (value, sctp->state); - break; - case PROP_MAX_MESSAGE_SIZE: - g_value_set_uint64 (value, sctp->max_message_size); - break; - case PROP_MAX_CHANNELS: - g_value_set_uint (value, sctp->max_channels); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_webrtc_sctp_transport_finalize (GObject * object) -{ - GstWebRTCSCTPTransport *sctp = GST_WEBRTC_SCTP_TRANSPORT (object); - - g_signal_handlers_disconnect_by_data (sctp->sctpdec, sctp); - g_signal_handlers_disconnect_by_data (sctp->sctpenc, sctp); - - gst_object_unref (sctp->sctpdec); - gst_object_unref (sctp->sctpenc); - - g_clear_object (&sctp->transport); - - G_OBJECT_CLASS (parent_class)->finalize (object); -} - -static void -gst_webrtc_sctp_transport_constructed (GObject * object) -{ - GstWebRTCSCTPTransport *sctp = GST_WEBRTC_SCTP_TRANSPORT (object); - guint association_id; - - association_id = g_random_int_range (0, G_MAXUINT16); - - sctp->sctpdec = - g_object_ref_sink (gst_element_factory_make ("sctpdec", NULL)); - g_object_set (sctp->sctpdec, "sctp-association-id", association_id, NULL); - sctp->sctpenc = - g_object_ref_sink (gst_element_factory_make ("sctpenc", NULL)); - g_object_set (sctp->sctpenc, "sctp-association-id", association_id, NULL); - - g_signal_connect (sctp->sctpdec, "pad-removed", - G_CALLBACK (_on_sctp_dec_pad_removed), sctp); - g_signal_connect (sctp->sctpenc, "sctp-association-established", - G_CALLBACK (_on_sctp_association_established), sctp); - - G_OBJECT_CLASS (parent_class)->constructed (object); -} - -static void -gst_webrtc_sctp_transport_class_init (GstWebRTCSCTPTransportClass * klass) -{ - GObjectClass *gobject_class = (GObjectClass *) klass; - - gobject_class->constructed = gst_webrtc_sctp_transport_constructed; - gobject_class->get_property = 
gst_webrtc_sctp_transport_get_property; - gobject_class->set_property = gst_webrtc_sctp_transport_set_property; - gobject_class->finalize = gst_webrtc_sctp_transport_finalize; - - g_object_class_install_property (gobject_class, - PROP_TRANSPORT, - g_param_spec_object ("transport", - "WebRTC DTLS Transport", - "DTLS transport used for this SCTP transport", - GST_TYPE_WEBRTC_DTLS_TRANSPORT, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, - PROP_STATE, - g_param_spec_enum ("state", - "WebRTC SCTP Transport state", "WebRTC SCTP Transport state", - GST_TYPE_WEBRTC_SCTP_TRANSPORT_STATE, - GST_WEBRTC_SCTP_TRANSPORT_STATE_NEW, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, - PROP_MAX_MESSAGE_SIZE, - g_param_spec_uint64 ("max-message-size", - "Maximum message size", - "Maximum message size as reported by the transport", 0, G_MAXUINT64, - 0, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, - PROP_MAX_CHANNELS, - g_param_spec_uint ("max-channels", - "Maximum number of channels", "Maximum number of channels", - 0, G_MAXUINT16, 0, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - /** - * GstWebRTCSCTPTransport::reset-stream: - * @object: the #GstWebRTCSCTPTransport - * @stream_id: the SCTP stream that was reset - */ - gst_webrtc_sctp_transport_signals[ON_RESET_STREAM_SIGNAL] = - g_signal_new ("stream-reset", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, G_TYPE_UINT); -} - -static void -gst_webrtc_sctp_transport_init (GstWebRTCSCTPTransport * nice) -{ -} - -GstWebRTCSCTPTransport * -gst_webrtc_sctp_transport_new (void) -{ - return g_object_new (GST_TYPE_WEBRTC_SCTP_TRANSPORT, NULL); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/sctptransport.h
Deleted
@@ -1,66 +0,0 @@
-/* GStreamer
- * Copyright (C) 2018 Matthew Waters <matthew@centricular.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_WEBRTC_SCTP_TRANSPORT_H__
-#define __GST_WEBRTC_SCTP_TRANSPORT_H__
-
-#include <gst/gst.h>
-/* libnice */
-#include <agent.h>
-#include <gst/webrtc/webrtc.h>
-#include "gstwebrtcice.h"
-
-G_BEGIN_DECLS
-
-GType gst_webrtc_sctp_transport_get_type(void);
-#define GST_TYPE_WEBRTC_SCTP_TRANSPORT (gst_webrtc_sctp_transport_get_type())
-#define GST_WEBRTC_SCTP_TRANSPORT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_WEBRTC_SCTP_TRANSPORT,GstWebRTCSCTPTransport))
-#define GST_IS_WEBRTC_SCTP_TRANSPORT(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_WEBRTC_SCTP_TRANSPORT))
-#define GST_WEBRTC_SCTP_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass) ,GST_TYPE_WEBRTC_SCTP_TRANSPORT,GstWebRTCSCTPTransportClass))
-#define GST_IS_WEBRTC_SCTP_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_SCTP_TRANSPORT))
-#define GST_WEBRTC_SCTP_TRANSPORT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_SCTP_TRANSPORT,GstWebRTCSCTPTransportClass))
-
-struct _GstWebRTCSCTPTransport
-{
-  GstObject parent;
-
-  GstWebRTCDTLSTransport *transport;
-  GstWebRTCSCTPTransportState state;
-  guint64 max_message_size;
-  guint max_channels;
-
-  gboolean association_established;
-
-  gulong sctpdec_block_id;
-  GstElement *sctpdec;
-  GstElement *sctpenc;
-
-  GstWebRTCBin *webrtcbin;
-};
-
-struct _GstWebRTCSCTPTransportClass
-{
-  GstObjectClass parent_class;
-};
-
-GstWebRTCSCTPTransport * gst_webrtc_sctp_transport_new (void);
-
-G_END_DECLS
-
-#endif /* __GST_WEBRTC_SCTP_TRANSPORT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wpe/gstwpesrc.cpp
Deleted
@@ -1,735 +0,0 @@ -/* Copyright (C) <2018> Philippe Normand <philn@igalia.com> - * Copyright (C) <2018> Žan Doberšek <zdobersek@igalia.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -/** - * SECTION:element-wpesrc - * @title: wpesrc - * - * The wpesrc element is used to produce a video texture representing a web page - * rendered off-screen by WPE. - * - * Starting from WPEBackend-FDO 1.6.x, software rendering support is available. This - * features allows wpesrc to be used on machines without GPU, and/or for testing - * purpose. To enable it, set the `LIBGL_ALWAYS_SOFTWARE=true` environment - * variable and make sure `video/x-raw, format=BGRA` caps are negotiated by the - * wpesrc element. - * - * Since: 1.16 - * - * ## Example launch lines - * - * |[ - * gst-launch-1.0 -v wpesrc location="https://gstreamer.freedesktop.org" ! queue ! glimagesink - * ]| - * Shows the GStreamer website homepage - * - * |[ - * LIBGL_ALWAYS_SOFTWARE=true gst-launch-1.0 -v wpesrc num-buffers=50 location="https://gstreamer.freedesktop.org" ! videoconvert ! pngenc ! multifilesink location=/tmp/snapshot-%05d.png - * ]| - * Saves the first 50 video frames generated for the GStreamer website as PNG files in /tmp. 
- * - * |[ - * gst-play-1.0 --videosink gtkglsink wpe://https://gstreamer.freedesktop.org - * ]| - * Shows the GStreamer website homepage as played with GstPlayer in a GTK+ window. - * - * |[ - * gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 ! glimagesink wpesrc location="file:///home/phil/Downloads/plunk/index.html" draw-background=0 ! m. videotestsrc ! queue ! glupload ! glcolorconvert ! m. - * ]| - * Composite WPE with a video stream in a single OpenGL scene. - * - * |[ - * gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 sink_0::height=818 sink_0::width=1920 ! gtkglsink wpesrc location="file:///home/phil/Downloads/plunk/index.html" draw-background=0 ! m. uridecodebin uri="http://192.168.1.44/Sintel.2010.1080p.mkv" name=d d. ! queue ! glupload ! glcolorconvert ! m. - * ]| - * Composite WPE with a video stream, sink_0 pad properties have to match the video dimensions. - */ - -/* - * TODO: - * - Audio support (requires an AudioSession implementation in WebKit and a WPEBackend-fdo API for it) - * - DMABuf support (requires changes in WPEBackend-fdo to expose DMABuf planes and fds) - * - Custom EGLMemory allocator - * - Better navigation events handling (would require a new GstNavigation API) - */ - -#ifdef HAVE_CONFIG_H -#include <config.h> -#endif - -#include "gstwpesrc.h" -#include <gst/gl/gl.h> -#include <gst/gl/egl/gstglmemoryegl.h> -#include <gst/gl/wayland/gstgldisplay_wayland.h> -#include <gst/video/video.h> -#include <xkbcommon/xkbcommon.h> - -#include "WPEThreadedView.h" - -GST_DEBUG_CATEGORY (wpe_src_debug); -#define GST_CAT_DEFAULT wpe_src_debug - -#define DEFAULT_WIDTH 1920 -#define DEFAULT_HEIGHT 1080 -#define DEFAULT_FPS_N 30 -#define DEFAULT_FPS_D 1 - -enum -{ - PROP_0, - PROP_LOCATION, - PROP_DRAW_BACKGROUND -}; - -enum -{ - SIGNAL_CONFIGURE_WEB_VIEW, - SIGNAL_LOAD_BYTES, - LAST_SIGNAL -}; -static guint gst_wpe_src_signals[LAST_SIGNAL] = { 0 }; - -struct _GstWpeSrc -{ - GstGLBaseSrc parent; - - /* properties */ - gchar *location; - gboolean 
draw_background; - - GBytes *bytes; - gboolean gl_enabled; - - gint64 n_frames; /* total frames sent */ - - WPEView *view; -}; - -static void gst_wpe_src_uri_handler_init (gpointer iface, gpointer data); - -#define gst_wpe_src_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstWpeSrc, gst_wpe_src, GST_TYPE_GL_BASE_SRC, - G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_wpe_src_uri_handler_init)); - -#if ENABLE_SHM_BUFFER_SUPPORT -#define WPE_RAW_CAPS "; video/x-raw, " \ - "format = (string) BGRA, " \ - "width = " GST_VIDEO_SIZE_RANGE ", " \ - "height = " GST_VIDEO_SIZE_RANGE ", " \ - "framerate = " GST_VIDEO_FPS_RANGE ", " \ - "pixel-aspect-ratio = (fraction)1/1" -#else -#define WPE_RAW_CAPS "" -#endif - -#define WPE_BASIC_CAPS "video/x-raw(memory:GLMemory), " \ - "format = (string) RGBA, " \ - "width = " GST_VIDEO_SIZE_RANGE ", " \ - "height = " GST_VIDEO_SIZE_RANGE ", " \ - "framerate = " GST_VIDEO_FPS_RANGE ", " \ - "pixel-aspect-ratio = (fraction)1/1, texture-target = (string)2D" - -#define WPE_SRC_CAPS WPE_BASIC_CAPS WPE_RAW_CAPS -#define WPE_SRC_DOC_CAPS WPE_BASIC_CAPS "; video/x-raw, format = (string) BGRA" - -static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", - GST_PAD_SRC, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (WPE_SRC_CAPS)); - -static GstFlowReturn -gst_wpe_src_create (GstBaseSrc * bsrc, guint64 offset, guint length, GstBuffer ** buf) -{ - GstGLBaseSrc *gl_src = GST_GL_BASE_SRC (bsrc); - GstWpeSrc *src = GST_WPE_SRC (bsrc); - GstFlowReturn ret = GST_FLOW_ERROR; - GstBuffer *locked_buffer; - GstClockTime next_time; - gint64 ts_offset = 0; - - GST_OBJECT_LOCK (src); - if (src->gl_enabled) { - GST_OBJECT_UNLOCK (src); - return GST_CALL_PARENT_WITH_DEFAULT (GST_BASE_SRC_CLASS, create, (bsrc, offset, length, buf), ret); - } - - locked_buffer = src->view->buffer (); - if (locked_buffer == NULL) { - GST_OBJECT_UNLOCK (src); - GST_ELEMENT_ERROR (src, RESOURCE, FAILED, - ("WPE View did not render a buffer"), (NULL)); - return ret; - } - 
*buf = gst_buffer_copy_deep (locked_buffer); - - g_object_get(gl_src, "timestamp-offset", &ts_offset, NULL); - - /* The following code mimics the behaviour of GLBaseSrc::fill */ - GST_BUFFER_TIMESTAMP (*buf) = ts_offset + gl_src->running_time; - GST_BUFFER_OFFSET (*buf) = src->n_frames; - src->n_frames++; - GST_BUFFER_OFFSET_END (*buf) = src->n_frames; - if (gl_src->out_info.fps_n) { - next_time = gst_util_uint64_scale_int (src->n_frames * GST_SECOND, - gl_src->out_info.fps_d, gl_src->out_info.fps_n); - GST_BUFFER_DURATION (*buf) = next_time - gl_src->running_time; - } else { - next_time = ts_offset; - GST_BUFFER_DURATION (*buf) = GST_CLOCK_TIME_NONE; - } - - GST_LOG_OBJECT (src, "Created buffer from SHM %" GST_PTR_FORMAT, *buf); - - gl_src->running_time = next_time; - - ret = GST_FLOW_OK; - GST_OBJECT_UNLOCK (src); - return ret; -} - -static gboolean -gst_wpe_src_fill_memory (GstGLBaseSrc * bsrc, GstGLMemory * memory) -{ - GstWpeSrc *src = GST_WPE_SRC (bsrc); - const GstGLFuncs *gl; - guint tex_id; - GstEGLImage *locked_image; - - if (!gst_gl_context_check_feature (GST_GL_CONTEXT (bsrc->context), - "EGL_KHR_image_base")) { - GST_ERROR_OBJECT (src, "EGL_KHR_image_base is not supported"); - return FALSE; - } - - GST_OBJECT_LOCK (src); - - gl = bsrc->context->gl_vtable; - tex_id = gst_gl_memory_get_texture_id (memory); - locked_image = src->view->image (); - - if (!locked_image) { - GST_OBJECT_UNLOCK (src); - return TRUE; - } - - gl->ActiveTexture (GL_TEXTURE0 + memory->plane); - gl->BindTexture (GL_TEXTURE_2D, tex_id); - gl->EGLImageTargetTexture2D (GL_TEXTURE_2D, - gst_egl_image_get_image (locked_image)); - gl->Flush (); - GST_OBJECT_UNLOCK (src); - return TRUE; -} - -static gboolean -gst_wpe_src_start (GstWpeSrc * src) -{ - GstGLContext *context = NULL; - GstGLDisplay *display = NULL; - GstGLBaseSrc *base_src = GST_GL_BASE_SRC (src); - - GST_INFO_OBJECT (src, "Starting up"); - GST_OBJECT_LOCK (src); - - if (src->gl_enabled) { - context = base_src->context; - 
display = base_src->display; - } - - GST_DEBUG_OBJECT (src, "Will fill GLMemories: %d\n", src->gl_enabled); - - auto & thread = WPEContextThread::singleton (); - src->view = thread.createWPEView (src, context, display, - GST_VIDEO_INFO_WIDTH (&base_src->out_info), - GST_VIDEO_INFO_HEIGHT (&base_src->out_info)); - - if (!src->view) { - GST_OBJECT_UNLOCK (src); - GST_ELEMENT_ERROR (src, RESOURCE, FAILED, - ("WPEBackend-FDO EGL display initialisation failed"), (NULL)); - return FALSE; - } - - if (src->bytes != NULL) { - src->view->loadData (src->bytes); - g_bytes_unref (src->bytes); - src->bytes = NULL; - } - - src->n_frames = 0; - GST_OBJECT_UNLOCK (src); - return TRUE; -} - -static gboolean -gst_wpe_src_decide_allocation (GstBaseSrc * base_src, GstQuery * query) -{ - GstGLBaseSrc *gl_src = GST_GL_BASE_SRC (base_src); - GstWpeSrc *src = GST_WPE_SRC (base_src); - GstCapsFeatures *caps_features; - - GST_OBJECT_LOCK (src); - caps_features = gst_caps_get_features (gl_src->out_caps, 0); - if (caps_features != NULL && gst_caps_features_contains (caps_features, GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { - src->gl_enabled = TRUE; - } else { - src->gl_enabled = FALSE; - } - - if (src->gl_enabled) { - GST_OBJECT_UNLOCK (src); - return GST_CALL_PARENT_WITH_DEFAULT(GST_BASE_SRC_CLASS, decide_allocation, (base_src, query), FALSE); - } - GST_OBJECT_UNLOCK (src); - return gst_wpe_src_start (src); -} - -static gboolean -gst_wpe_src_gl_start (GstGLBaseSrc * base_src) -{ - GstWpeSrc *src = GST_WPE_SRC (base_src); - return gst_wpe_src_start (src); -} - -static void -gst_wpe_src_stop_unlocked (GstWpeSrc * src) -{ - if (src->view) { - delete src->view; - src->view = NULL; - } -} - -static void -gst_wpe_src_gl_stop (GstGLBaseSrc * base_src) -{ - GstWpeSrc *src = GST_WPE_SRC (base_src); - - GST_OBJECT_LOCK (src); - gst_wpe_src_stop_unlocked (src); - GST_OBJECT_UNLOCK (src); -} - -static gboolean -gst_wpe_src_stop (GstBaseSrc * base_src) -{ - GstWpeSrc *src = GST_WPE_SRC (base_src); - - /* we 
can call this always, GstGLBaseSrc is smart enough to not crash if - * gst_gl_base_src_gl_start() has not been called from chaining up - * gst_wpe_src_decide_allocation() */ - if (!GST_CALL_PARENT_WITH_DEFAULT(GST_BASE_SRC_CLASS, stop, (base_src), FALSE)) - return FALSE; - - GST_OBJECT_LOCK (src); - - /* if gl-enabled, gst_wpe_src_stop_unlocked() would have already been called - * inside gst_wpe_src_gl_stop() from the base class stopping the OpenGL - * context */ - if (!src->gl_enabled) - gst_wpe_src_stop_unlocked (src); - - GST_OBJECT_UNLOCK (src); - return TRUE; -} - -static GstCaps * -gst_wpe_src_fixate (GstBaseSrc * base_src, GstCaps * caps) -{ - GstWpeSrc *src = GST_WPE_SRC (base_src); - GstStructure *structure; - gint width, height; - - caps = gst_caps_make_writable (caps); - structure = gst_caps_get_structure (caps, 0); - - gst_structure_fixate_field_nearest_int (structure, "width", DEFAULT_WIDTH); - gst_structure_fixate_field_nearest_int (structure, "height", DEFAULT_HEIGHT); - - if (gst_structure_has_field (structure, "framerate")) - gst_structure_fixate_field_nearest_fraction (structure, "framerate", - DEFAULT_FPS_N, DEFAULT_FPS_D); - else - gst_structure_set (structure, "framerate", GST_TYPE_FRACTION, DEFAULT_FPS_N, - DEFAULT_FPS_D, NULL); - - caps = GST_BASE_SRC_CLASS (parent_class)->fixate (base_src, caps); - GST_INFO_OBJECT (base_src, "Fixated caps to %" GST_PTR_FORMAT, caps); - - if (src->view) { - gst_structure_get (structure, "width", G_TYPE_INT, &width, "height", G_TYPE_INT, &height, NULL); - src->view->resize (width, height); - } - return caps; -} - -void -gst_wpe_src_configure_web_view (GstWpeSrc * src, WebKitWebView * webview) -{ - GValue args[2] = { {0}, {0} }; - - g_value_init (&args[0], GST_TYPE_ELEMENT); - g_value_set_object (&args[0], src); - g_value_init (&args[1], G_TYPE_OBJECT); - g_value_set_object (&args[1], webview); - - g_signal_emitv (args, gst_wpe_src_signals[SIGNAL_CONFIGURE_WEB_VIEW], 0, - NULL); - - g_value_unset (&args[0]); - 
g_value_unset (&args[1]); -} - -static void -gst_wpe_src_load_bytes (GstWpeSrc * src, GBytes * bytes) -{ - if (src->view && GST_STATE (GST_ELEMENT_CAST (src)) > GST_STATE_NULL) - src->view->loadData (bytes); - else - src->bytes = g_bytes_ref (bytes); -} - -static gboolean -gst_wpe_src_set_location (GstWpeSrc * src, const gchar * location, - GError ** error) -{ - g_free (src->location); - src->location = g_strdup (location); - if (src->view) - src->view->loadUri (src->location); - - return TRUE; -} - -static void -gst_wpe_src_set_draw_background (GstWpeSrc * src, gboolean draw_background) -{ - if (src->view) - src->view->setDrawBackground (draw_background); - src->draw_background = draw_background; -} - -static void -gst_wpe_src_set_property (GObject * object, guint prop_id, const GValue * value, - GParamSpec * pspec) -{ - GstWpeSrc *src = GST_WPE_SRC (object); - - switch (prop_id) { - case PROP_LOCATION: - { - const gchar *location; - - location = g_value_get_string (value); - if (location == NULL) { - GST_WARNING_OBJECT (src, "location property cannot be NULL"); - return; - } - - if (!gst_wpe_src_set_location (src, location, NULL)) { - GST_WARNING_OBJECT (src, "badly formatted location"); - return; - } - break; - } - case PROP_DRAW_BACKGROUND: - gst_wpe_src_set_draw_background (src, g_value_get_boolean (value)); - break; - default: - break; - } -} - -static void -gst_wpe_src_get_property (GObject * object, guint prop_id, GValue * value, - GParamSpec * pspec) -{ - GstWpeSrc *src = GST_WPE_SRC (object); - - switch (prop_id) { - case PROP_LOCATION: - g_value_set_string (value, src->location); - break; - case PROP_DRAW_BACKGROUND: - g_value_set_boolean (value, src->draw_background); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static gboolean -gst_wpe_src_event (GstPad * pad, GstObject * parent, GstEvent * event) -{ - gboolean ret = FALSE; - GstWpeSrc *src = GST_WPE_SRC (parent); - - if (GST_EVENT_TYPE (event) == 
GST_EVENT_NAVIGATION) { - const gchar *key; - gint button; - gdouble x, y, delta_x, delta_y; - - GST_DEBUG_OBJECT (src, "Processing event %" GST_PTR_FORMAT, event); - if (!src->view) { - return FALSE; - } - switch (gst_navigation_event_get_type (event)) { - case GST_NAVIGATION_EVENT_KEY_PRESS: - case GST_NAVIGATION_EVENT_KEY_RELEASE: - if (gst_navigation_event_parse_key_event (event, &key)) { - /* FIXME: This is wrong... The GstNavigation API should pass - hardware-level information, not high-level keysym strings */ - uint32_t keysym = - (uint32_t) xkb_keysym_from_name (key, XKB_KEYSYM_NO_FLAGS); - struct wpe_input_keyboard_event wpe_event; - wpe_event.key_code = keysym; - wpe_event.pressed = - gst_navigation_event_get_type (event) == - GST_NAVIGATION_EVENT_KEY_PRESS; - src->view->dispatchKeyboardEvent (wpe_event); - ret = TRUE; - } - break; - case GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS: - case GST_NAVIGATION_EVENT_MOUSE_BUTTON_RELEASE: - if (gst_navigation_event_parse_mouse_button_event (event, &button, &x, - &y)) { - struct wpe_input_pointer_event wpe_event; - wpe_event.time = GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); - wpe_event.type = wpe_input_pointer_event_type_button; - wpe_event.x = (int) x; - wpe_event.y = (int) y; - if (button == 1) { - wpe_event.modifiers = wpe_input_pointer_modifier_button1; - } else if (button == 2) { - wpe_event.modifiers = wpe_input_pointer_modifier_button2; - } else if (button == 3) { - wpe_event.modifiers = wpe_input_pointer_modifier_button3; - } else if (button == 4) { - wpe_event.modifiers = wpe_input_pointer_modifier_button4; - } else if (button == 5) { - wpe_event.modifiers = wpe_input_pointer_modifier_button5; - } - wpe_event.button = button; - wpe_event.state = - gst_navigation_event_get_type (event) == - GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS; - src->view->dispatchPointerEvent (wpe_event); - ret = TRUE; - } - break; - case GST_NAVIGATION_EVENT_MOUSE_MOVE: - if (gst_navigation_event_parse_mouse_move_event (event, 
&x, &y)) { - struct wpe_input_pointer_event wpe_event; - wpe_event.time = GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); - wpe_event.type = wpe_input_pointer_event_type_motion; - wpe_event.x = (int) x; - wpe_event.y = (int) y; - src->view->dispatchPointerEvent (wpe_event); - ret = TRUE; - } - break; - case GST_NAVIGATION_EVENT_MOUSE_SCROLL: - if (gst_navigation_event_parse_mouse_scroll_event (event, &x, &y, - &delta_x, &delta_y)) { -#if WPE_CHECK_VERSION(1, 6, 0) - struct wpe_input_axis_2d_event wpe_event; - if (delta_x) { - wpe_event.x_axis = delta_x; - } else { - wpe_event.y_axis = delta_y; - } - wpe_event.base.time = - GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); - wpe_event.base.type = - static_cast < wpe_input_axis_event_type > - (wpe_input_axis_event_type_mask_2d | - wpe_input_axis_event_type_motion_smooth); - wpe_event.base.x = (int) x; - wpe_event.base.y = (int) y; - src->view->dispatchAxisEvent (wpe_event.base); -#else - struct wpe_input_axis_event wpe_event; - if (delta_x) { - wpe_event.axis = 1; - wpe_event.value = delta_x; - } else { - wpe_event.axis = 0; - wpe_event.value = delta_y; - } - wpe_event.time = GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); - wpe_event.type = wpe_input_axis_event_type_motion; - wpe_event.x = (int) x; - wpe_event.y = (int) y; - src->view->dispatchAxisEvent (wpe_event); -#endif - ret = TRUE; - } - break; - default: - break; - } - /* FIXME: No touch events handling support in GstNavigation */ - } - - if (!ret) { - ret = gst_pad_event_default (pad, parent, event); - } else { - gst_event_unref (event); - } - return ret; -} - -static void -gst_wpe_src_init (GstWpeSrc * src) -{ - GstPad *pad = gst_element_get_static_pad (GST_ELEMENT_CAST (src), "src"); - - gst_pad_set_event_function (pad, gst_wpe_src_event); - gst_object_unref (pad); - - src->draw_background = TRUE; - - gst_base_src_set_live (GST_BASE_SRC_CAST (src), TRUE); -} - -static GstURIType -gst_wpe_src_uri_get_type (GType) -{ - return GST_URI_SRC; -} - 
-static const gchar *const * -gst_wpe_src_get_protocols (GType) -{ - static const char *protocols[] = { "wpe", NULL }; - return protocols; -} - -static gchar * -gst_wpe_src_get_uri (GstURIHandler * handler) -{ - GstWpeSrc *src = GST_WPE_SRC (handler); - return g_strdup_printf ("wpe://%s", src->location); -} - -static gboolean -gst_wpe_src_set_uri (GstURIHandler * handler, const gchar * uri, - GError ** error) -{ - GstWpeSrc *src = GST_WPE_SRC (handler); - - return gst_wpe_src_set_location (src, uri + 6, error); -} - -static void -gst_wpe_src_uri_handler_init (gpointer iface_ptr, gpointer data) -{ - GstURIHandlerInterface *iface = (GstURIHandlerInterface *) iface_ptr; - - iface->get_type = gst_wpe_src_uri_get_type; - iface->get_protocols = gst_wpe_src_get_protocols; - iface->get_uri = gst_wpe_src_get_uri; - iface->set_uri = gst_wpe_src_set_uri; -} - -static void -gst_wpe_src_class_init (GstWpeSrcClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); - GstGLBaseSrcClass *gl_base_src_class = GST_GL_BASE_SRC_CLASS (klass); - GstBaseSrcClass *base_src_class = GST_BASE_SRC_CLASS (klass); - GstPadTemplate *tmpl; - GstCaps *doc_caps; - - gobject_class->set_property = gst_wpe_src_set_property; - gobject_class->get_property = gst_wpe_src_get_property; - - g_object_class_install_property (gobject_class, PROP_LOCATION, - g_param_spec_string ("location", "location", - "The URL to display", - "", (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); - g_object_class_install_property (gobject_class, PROP_DRAW_BACKGROUND, - g_param_spec_boolean ("draw-background", "Draws the background", - "Whether to draw the WebView background", TRUE, - (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); - - gst_element_class_set_static_metadata (gstelement_class, - "WPE source", "Source/Video", - "Creates a video stream from a WPE browser", - "Philippe Normand <philn@igalia.com>, Žan Doberšek 
<zdobersek@igalia.com>"); - - tmpl = gst_static_pad_template_get (&src_factory); - gst_element_class_add_pad_template (gstelement_class, tmpl); - - base_src_class->fixate = GST_DEBUG_FUNCPTR (gst_wpe_src_fixate); - base_src_class->create = GST_DEBUG_FUNCPTR (gst_wpe_src_create); - base_src_class->decide_allocation = GST_DEBUG_FUNCPTR (gst_wpe_src_decide_allocation); - base_src_class->stop = GST_DEBUG_FUNCPTR (gst_wpe_src_stop); - - gl_base_src_class->supported_gl_api = - static_cast < GstGLAPI > - (GST_GL_API_OPENGL | GST_GL_API_OPENGL3 | GST_GL_API_GLES2); - gl_base_src_class->gl_start = GST_DEBUG_FUNCPTR (gst_wpe_src_gl_start); - gl_base_src_class->gl_stop = GST_DEBUG_FUNCPTR (gst_wpe_src_gl_stop); - gl_base_src_class->fill_gl_memory = - GST_DEBUG_FUNCPTR (gst_wpe_src_fill_memory); - - doc_caps = gst_caps_from_string (WPE_SRC_DOC_CAPS); - gst_pad_template_set_documentation_caps (tmpl, doc_caps); - gst_clear_caps (&doc_caps); - - /** - * GstWpeSrc::configure-web-view: - * @src: the object which received the signal - * @webview: the webView - * - * Allow application to configure the webView settings. - */ - gst_wpe_src_signals[SIGNAL_CONFIGURE_WEB_VIEW] = - g_signal_new ("configure-web-view", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, G_TYPE_OBJECT); - - /** - * GstWpeSrc::load-bytes: - * @src: the object which received the signal - * @bytes: the GBytes data to load - * - * Load the specified bytes into the internal webView. 
- */ - gst_wpe_src_signals[SIGNAL_LOAD_BYTES] = - g_signal_new_class_handler ("load-bytes", G_TYPE_FROM_CLASS (klass), - static_cast < GSignalFlags > (G_SIGNAL_RUN_LAST | G_SIGNAL_ACTION), - G_CALLBACK (gst_wpe_src_load_bytes), NULL, NULL, NULL, - G_TYPE_NONE, 1, G_TYPE_BYTES); -} - -static gboolean -plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (wpe_src_debug, "wpesrc", 0, "WPE Source"); - - return gst_element_register (plugin, "wpesrc", GST_RANK_NONE, - GST_TYPE_WPE_SRC); -} - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, - wpe, "WPE src plugin", plugin_init, VERSION, GST_LICENSE, PACKAGE, - GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wpe/gstwpesrc.h
Deleted
@@ -1,46 +0,0 @@ -/* Copyright (C) <2018> Philippe Normand <philn@igalia.com> - * Copyright (C) <2018> Žan Doberšek <zdobersek@igalia.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#pragma once - -#include <gst/gst.h> -#include <gst/gl/gl.h> -#include <wpe/webkit.h> - -G_BEGIN_DECLS - -#define GST_TYPE_WPE_SRC (gst_wpe_src_get_type()) -#define GST_WPE_SRC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_WPE_SRC,GstWpeSrc)) -#define GST_WPE_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_WPE_SRC,GstWpeSrcClass)) -#define GST_IS_WPE_SRC(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_WPE_SRC)) -#define GST_IS_WPE_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_WPE_SRC)) - -typedef struct _GstWpeSrc GstWpeSrc; -typedef struct _GstWpeSrcClass GstWpeSrcClass; - -struct _GstWpeSrcClass -{ - GstGLBaseSrcClass parent_class; -}; - -GType gst_wpe_src_get_type (void); - -void gst_wpe_src_configure_web_view(GstWpeSrc * src, WebKitWebView * webview); - -G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstplugin.h
Deleted
@@ -1,63 +0,0 @@ -/* - * GStreamer - * Copyright (C) 2005 Thomas Vander Stichele <thomas@apestaart.org> - * Copyright (C) 2005 Ronald S. Bultje <rbultje@ronald.bitfreak.net> - * Copyright (C) 2010 Luis de Bethencourt <luis@debethencourt.com> - * - * Permission is hereby granted, free of charge, to any person obtaining a - * copy of this software and associated documentation files (the "Software"), - * to deal in the Software without restriction, including without limitation - * the rights to use, copy, modify, merge, publish, distribute, sublicense, - * and/or sell copies of the Software, and to permit persons to whom the - * Software is furnished to do so, subject to the following conditions: - * - * The above copyright notice and this permission notice shall be included in - * all copies or substantial portions of the Software. - * - * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR - * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, - * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE - * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER - * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING - * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER - * DEALINGS IN THE SOFTWARE. - * - * Alternatively, the contents of this file may be used under the - * GNU Lesser General Public License Version 2.1 (the "LGPL"), in - * which case the following provisions apply instead of the ones - * mentioned above: - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. 
- * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifndef __GST_GAUDI_EFFECTS_H__ -#define __GST_GAUDI_EFFECTS_H__ - -#include <gst/gst.h> - -G_BEGIN_DECLS - -gboolean gst_burn_plugin_init (GstPlugin *plugin); -gboolean gst_chromium_plugin_init (GstPlugin *plugin); -gboolean gst_dilate_plugin_init (GstPlugin *plugin); -gboolean gst_dodge_plugin_init (GstPlugin *plugin); -gboolean gst_exclusion_plugin_init (GstPlugin *plugin); -gboolean gst_gauss_blur_plugin_init (GstPlugin *plugin); -gboolean gst_solarize_plugin_init (GstPlugin *plugin); - -G_END_DECLS - -#endif /* __GST_GAUDI_EFFECTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/hooks
Deleted
-(directory)
View file
gst-plugins-bad-1.18.6.tar.xz/hooks/pre-commit.hook
Deleted
@@ -1,83 +0,0 @@ -#!/bin/sh -# -# Check that the code follows a consistent code style -# - -# Check for existence of indent, and error out if not present. -# On some *bsd systems the binary seems to be called gnunindent, -# so check for that first. - -version=`gnuindent --version 2>/dev/null` -if test "x$version" = "x"; then - version=`gindent --version 2>/dev/null` - if test "x$version" = "x"; then - version=`indent --version 2>/dev/null` - if test "x$version" = "x"; then - echo "GStreamer git pre-commit hook:" - echo "Did not find GNU indent, please install it before continuing." - exit 1 - else - INDENT=indent - fi - else - INDENT=gindent - fi -else - INDENT=gnuindent -fi - -case `$INDENT --version` in - GNU*) - ;; - default) - echo "GStreamer git pre-commit hook:" - echo "Did not find GNU indent, please install it before continuing." - echo "(Found $INDENT, but it doesn't seem to be GNU indent)" - exit 1 - ;; -esac - -INDENT_PARAMETERS="--braces-on-if-line \ - --case-brace-indentation0 \ - --case-indentation2 \ - --braces-after-struct-decl-line \ - --line-length80 \ - --no-tabs \ - --cuddle-else \ - --dont-line-up-parentheses \ - --continuation-indentation4 \ - --honour-newlines \ - --tab-size8 \ - --indent-level2 \ - --leave-preprocessor-space" - -echo "--Checking style--" -for file in `git diff-index --cached --name-only HEAD --diff-filter=ACMR| grep "\.c$"` ; do - # nf is the temporary checkout. This makes sure we check against the - # revision in the index (and not the checked out version). - nf=`git checkout-index --temp ${file} | cut -f 1` - newfile=`mktemp /tmp/${nf}.XXXXXX` || exit 1 - $INDENT ${INDENT_PARAMETERS} \ - $nf -o $newfile 2>> /dev/null - # FIXME: Call indent twice as it tends to do line-breaks - # different for every second call. - $INDENT ${INDENT_PARAMETERS} \ - $newfile 2>> /dev/null - diff -u -p "${nf}" "${newfile}" - r=$? 
- rm "${newfile}" - rm "${nf}" - if [ $r != 0 ] ; then -echo "=================================================================================================" -echo " Code style error in: $file " -echo " " -echo " Please fix before committing. Don't forget to run git add before trying to commit again. " -echo " If the whole file is to be committed, this should work (run from the top-level directory): " -echo " " -echo " gst-indent $file; git add $file; git commit" -echo " " -echo "=================================================================================================" - exit 1 - fi -done -echo "--Checking style pass--"
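The deleted hook above probes for a GNU indent binary under three names (gnuindent, gindent, indent) with nested if/else blocks. That probe order can be sketched as a small reusable helper; this is an illustrative rewrite, not part of the hook itself, and `first_available` is a hypothetical name:

```shell
# Sketch of the hook's indent-binary probe as a reusable helper:
# print the first command from the argument list that is found on PATH.
first_available() {
  for cand in "$@"; do
    if command -v "$cand" >/dev/null 2>&1; then
      printf '%s\n' "$cand"
      return 0
    fi
  done
  return 1
}

# Mirrors the hook's probe order (the binary carries a different
# name on some *BSD systems):
INDENT=$(first_available gnuindent gindent indent) || {
  echo "Did not find GNU indent, please install it before continuing."
}
```

Unlike the original nested `if` chain, the loop form makes the probe order a single list, which is easier to extend with further candidate names.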
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig
Deleted
-(directory)
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/.gitignore
Deleted
@@ -1,1 +0,0 @@ -*.pc
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-bad-audio-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@audiolibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer bad audio library, uninstalled -Description: Bad audio library for GStreamer, Not Installed -Version: @VERSION@ -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ - -Libs: -L${libdir} -lgstbadaudio-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-bad-audio.pc.in
Deleted
@@ -1,13 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ -pluginsdir=@libdir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer bad audio library, uninstalled -Description: Bad audio library for GStreamer elements, Not Installed -Version: @VERSION@ -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ - -Libs: -L${libdir} -lgstbadaudio-@GST_API_VERSION@ -Cflags: -I${includedir}
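The `@VAR@` placeholders in the `.pc.in` templates above (`@prefix@`, `@libdir@`, `@GST_API_VERSION@`, `@VERSION@`, …) are filled in by the build system when it generates the installed `.pc` file, while `${var}`-style references are left for pkg-config itself to expand. A minimal sketch of that substitution step, assuming hypothetical values — `fill_pc_template` is not part of the build system:

```shell
# Hypothetical sketch of the @VAR@ substitution performed when a .pc file
# is generated from a .pc.in template. pkg-config-style ${var} references
# are deliberately left untouched.
fill_pc_template() {
  sed -e 's|@prefix@|'"$1"'|g' \
      -e 's|@libdir@|'"$2"'|g' \
      -e 's|@GST_API_VERSION@|'"$3"'|g'
}

printf 'libdir=@libdir@\nLibs: -L${libdir} -lgstbadaudio-@GST_API_VERSION@\n' \
  | fill_pc_template /usr /usr/lib64 1.0
```

This distinction between the two placeholder styles is why the templates can mix them freely: `@libdir@` becomes a concrete path at build time, while `-L${libdir}` survives into the installed file and is resolved by pkg-config at query time.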
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-codecparsers-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@codecparserslibdir@ -includedir=@abs_top_builddir@/gst-libs - -Name: GStreamer codec parsers, Uninstalled -Description: Bitstream parsers for GStreamer elements, uninstalled -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstcodecparsers-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-codecparsers.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer codec parsers -Description: Bitstream parsers for GStreamer elements -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstcodecparsers-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-insertbin-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@insertbinlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Insert Bin, Uninstalled -Description: Bin to automatically and insertally link elements, uninstalled -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstinsertbin-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-insertbin.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Insert Bin -Description: Bin to automatically and insertally link elements -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstinsertbin-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-mpegts-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@mpegtslibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer MPEG-TS, Uninstalled -Description: GStreamer MPEG-TS support, uninstalled -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstmpegts-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-mpegts.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer MPEG-TS -Description: GStreamer MPEG-TS support -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstmpegts-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-photography-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@playerlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Photography, Uninstalled -Description: GStreamer Photography digital image capture library, uninstalled -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstphotography-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-photography.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Photography -Description: GStreamer Photography digital image capture library -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstphotography-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-player-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@playerlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Player, Uninstalled -Description: GStreamer Player convenience library, uninstalled -Requires: gstreamer-@GST_API_VERSION@ gstreamer-video-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstplayer-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-player.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Player -Description: GStreamer Player convenience library -Requires: gstreamer-@GST_API_VERSION@ gstreamer-video-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstplayer-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-plugins-bad-uninstalled.pc.in
Deleted
@@ -1,14 +0,0 @@ -# the standard variables don't make sense for an uninstalled copy -prefix= -exec_prefix= -libdir= -# includedir is builddir because it is used to find gstconfig.h in places -includedir=@abs_top_builddir@/gst-libs -pluginsdir=@abs_top_builddir@ - -Name: GStreamer Bad Plugin libraries, Uninstalled -Description: Streaming media framework, bad plugins libraries, uninstalled -Version: @VERSION@ -Requires: gstreamer-@GST_API_VERSION@ -Libs: -L@audiolibdir@ -L@basecamerabinsrclibdir@ -L@codecparserslibdir@ -L@insertbinlibdir@ -L@photographylibdir@ -L@mpegtslibdir@ -L@playerlibdir@ -L@webrtclibdir@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-plugins-bad.pc.in
Deleted
@@ -1,13 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ -pluginsdir=@libdir@/gstreamer-@GST_API_VERSION@ - - -Name: GStreamer Bad Plugin libraries -Description: Streaming media framework, bad plugins libraries -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -Cflags: -I${includedir}
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-sctp-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=${pcfiledir}/../gst-libs/gst/sctp -includedir=${pcfiledir}/../gst-libs - -Name: GStreamer SCTP Library -Description: SCTP helper functions, uninstalled -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} ${libdir}/libgstsctp-@GST_API_VERSION@.la -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-sctp.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer SCTP Library -Description: SCTP helper functions -Requires: gstreamer-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstsctp-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-transcoder-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@playerlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Transcoder, Uninstalled -Description: GStreamer Transcoder library, uninstalled -Requires: gstreamer-@GST_API_VERSION@ gstreamer-pbutils-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgsttranscoder-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-transcoder.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Transcoder -Description: GStreamer Transcoder library -Requires: gstreamer-@GST_API_VERSION@ gstreamer-pbutils-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgsttranscoder-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-vulkan-uninstalled.pc.in
Deleted
@@ -1,14 +0,0 @@ -prefix= -exec_prefix= -libdir=@vulkanlibdir@ -includedir=@abs_top_srcdir@/gst-libs -vulkan_winsys=@VULKAN_WINSYS@ - - -Name: GStreamer Vulkan, Uninstalled -Description: GStreamer Vulkan support, uninstalled -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ gstreamer-video-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstvulkan-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-vulkan-wayland-uninstalled.pc.in
Deleted
@@ -1,9 +0,0 @@ -prefix= -exec_prefix= -libdir=@vulkanlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Vulkan (Wayland Specifics), Uninstalled -Description: GStreamer Vulkan support (Wayland Specifics), uninstalled -Requires: gstreamer-vulkan-@GST_API_VERSION@ wayland-client -Version: @VERSION@
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-vulkan-wayland.pc.in
Deleted
@@ -1,9 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Vulkan (Wayland Specifics) -Description: GStreamer Vulkan support (Wayland Specifics) -Requires: gstreamer-vulkan-@GST_API_VERSION@ wayland-client -Version: @VERSION@
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-vulkan-xcb-uninstalled.pc.in
Deleted
@@ -1,9 +0,0 @@ -prefix= -exec_prefix= -libdir=@vulkanlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Vulkan (XCB Specifics), Uninstalled -Description: GStreamer Vulkan support (XCB Specifics), uninstalled -Requires: gstreamer-vulkan-@GST_API_VERSION@ xcb -Version: @VERSION@
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-vulkan-xcb.pc.in
Deleted
@@ -1,9 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Vulkan (XCB Specifics) -Description: GStreamer Vulkan support (XCB Specifics) -Requires: gstreamer-vulkan-@GST_API_VERSION@ xcb -Version: @VERSION@
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-vulkan.pc.in
Deleted
@@ -1,13 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ -vulkan_winsys=@VULKAN_WINSYS@ - -Name: GStreamer Vulkan -Description: GStreamer Vulkan support -Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ gstreamer-video-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstvulkan-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-wayland-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix= -exec_prefix= -libdir=@waylandlibdir@ -includedir=@abs_top_srcdir@/gst-libs - -Name: GStreamer Wayland, Uninstalled -Description: GStreamer Wayland support, uninstalled -Requires: gstreamer-@GST_API_VERSION@ gstreamer-video-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstwayland-@GST_API_VERSION@ -Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-wayland.pc.in
Deleted
@@ -1,12 +0,0 @@ -prefix=@prefix@ -exec_prefix=@exec_prefix@ -libdir=@libdir@ -includedir=@includedir@/gstreamer-@GST_API_VERSION@ - -Name: GStreamer Wayland -Description: GStreamer Wayland support -Requires: gstreamer-@GST_API_VERSION@ gstreamer-video-@GST_API_VERSION@ -Version: @VERSION@ -Libs: -L${libdir} -lgstwayland-@GST_API_VERSION@ -Cflags: -I${includedir} -
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-webrtc-uninstalled.pc.in
Deleted
@@ -1,12 +0,0 @@
-prefix=
-exec_prefix=
-libdir=@webrtclibdir@
-includedir=@abs_top_srcdir@/gst-libs
-
-Name: GStreamer WebRTC, Uninstalled
-Description: GStreamer WebRTC support, uninstalled
-Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ gstreamer-sdp-@GST_API_VERSION@
-Version: @VERSION@
-Libs: -L${libdir} -lgstwebrtc-@GST_API_VERSION@
-Cflags: -I@abs_top_srcdir@/gst-libs -I@abs_top_builddir@/gst-libs
-
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/gstreamer-webrtc.pc.in
Deleted
@@ -1,12 +0,0 @@
-prefix=@prefix@
-exec_prefix=@exec_prefix@
-libdir=@libdir@
-includedir=@includedir@/gstreamer-@GST_API_VERSION@
-
-Name: GStreamer WebRTC
-Description: GStreamer WebRTC support
-Requires: gstreamer-@GST_API_VERSION@ gstreamer-base-@GST_API_VERSION@ gstreamer-sdp-@GST_API_VERSION@
-Version: @VERSION@
-Libs: -L${libdir} -lgstwebrtc-@GST_API_VERSION@
-Cflags: -I${includedir}
-
View file
gst-plugins-bad-1.18.6.tar.xz/pkgconfig/meson.build
Deleted
@@ -1,69 +0,0 @@
-pkgconf = configuration_data()
-
-pkgconf.set('prefix', join_paths(get_option('prefix')))
-pkgconf.set('exec_prefix', '${prefix}')
-pkgconf.set('libdir', '${prefix}/@0@'.format(get_option('libdir')))
-pkgconf.set('includedir', '${prefix}/@0@'.format(get_option('includedir')))
-pkgconf.set('GST_API_VERSION', api_version)
-pkgconf.set('VERSION', gst_version)
-
-# needed for generating -uninstalled.pc files
-pkgconf.set('abs_top_builddir', join_paths(meson.current_build_dir(), '..'))
-pkgconf.set('abs_top_srcdir', join_paths(meson.current_source_dir(), '..'))
-pkgconf.set('audiolibdir', join_paths(meson.build_root(), gstbadaudio.outdir()))
-pkgconf.set('transcoderlibdir', join_paths(meson.build_root(), gst_transcoder.outdir()))
-pkgconf.set('codecparserslibdir', join_paths(meson.build_root(), gstcodecparsers.outdir()))
-pkgconf.set('insertbinlibdir', join_paths(meson.build_root(), gstinsertbin.outdir()))
-pkgconf.set('mpegtslibdir', join_paths(meson.build_root(), gstmpegts.outdir()))
-pkgconf.set('playerlibdir', join_paths(meson.build_root(), gstplayer.outdir()))
-pkgconf.set('basecamerabinsrclibdir', join_paths(meson.build_root(), gstbasecamerabin.outdir()))
-pkgconf.set('photographylibdir', join_paths(meson.build_root(), gstphotography.outdir()))
-pkgconf.set('webrtclibdir', join_paths(meson.build_root(), gstwebrtc.outdir()))
-
-pkg_install_dir = '@0@/pkgconfig'.format(get_option('libdir'))
-
-pkg_libs = [
-  'bad-audio',
-  'codecparsers',
-  'insertbin',
-  'mpegts',
-  'photography',
-  'player',
-  'plugins-bad',
-  'sctp',
-  'transcoder',
-  'webrtc',
-]
-
-#if use_wayland
-#  pkgconf.set('waylandlibdir', join_paths(meson.build_root(), gstwayland.outdir()))
-#  pkg_libs += 'wayland'
-#endif
-
-if gstvulkan_dep.found()
-  pkgconf.set('vulkanlibdir', join_paths(meson.build_root(), gstvulkan.outdir()))
-  pkgconf.set('VULKAN_WINSYS', ' '.join(enabled_vulkan_winsys))
-  pkg_libs += 'vulkan'
-
-  if enabled_vulkan_winsys.contains('xcb')
-    pkg_libs += 'vulkan-xcb'
-  endif
-  if enabled_vulkan_winsys.contains('wayland')
-    pkg_libs += 'vulkan-wayland'
-  endif
-endif
-
-foreach p : pkg_libs
-  infile = 'gstreamer-@0@.pc.in'.format(p)
-  outfile = 'gstreamer-@0@-@1@.pc'.format(p, api_version)
-  configure_file(input : infile,
-    output : outfile,
-    configuration : pkgconf,
-    install_dir : pkg_install_dir)
-
-  infile = 'gstreamer-@0@-uninstalled.pc.in'.format(p)
-  outfile = 'gstreamer-@0@-@1@-uninstalled.pc'.format(p, api_version)
-  configure_file(input : infile,
-    output : outfile,
-    configuration : pkgconf)
-endforeach
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11_fwd.h
Deleted
@@ -1,92 +0,0 @@
-/* GStreamer
- * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_D3D11_FWD_H__
-#define __GST_D3D11_FWD_H__
-
-#include <gst/gst.h>
-#include "gstd3d11config.h"
-
-#ifndef INITGUID
-#include <initguid.h>
-#endif
-
-#if (D3D11_HEADER_VERSION >= 4)
-#include <d3d11_4.h>
-#elif (D3D11_HEADER_VERSION >= 3)
-#include <d3d11_3.h>
-#elif (D3D11_HEADER_VERSION >= 2)
-#include <d3d11_2.h>
-#elif (D3D11_HEADER_VERSION >= 1)
-#include <d3d11_1.h>
-#else
-#include <d3d11.h>
-#endif
-
-#if (DXGI_HEADER_VERSION >= 6)
-#include <dxgi1_6.h>
-#elif (DXGI_HEADER_VERSION >= 5)
-#include <dxgi1_5.h>
-#elif (DXGI_HEADER_VERSION >= 4)
-#include <dxgi1_4.h>
-#elif (DXGI_HEADER_VERSION >= 3)
-#include <dxgi1_3.h>
-#elif (DXGI_HEADER_VERSION >= 2)
-#include <dxgi1_2.h>
-#else
-#include <dxgi.h>
-#endif
-
-G_BEGIN_DECLS
-
-typedef struct _GstD3D11Device GstD3D11Device;
-typedef struct _GstD3D11DeviceClass GstD3D11DeviceClass;
-typedef struct _GstD3D11DevicePrivate GstD3D11DevicePrivate;
-
-typedef struct _GstD3D11AllocationParams GstD3D11AllocationParams;
-typedef struct _GstD3D11Memory GstD3D11Memory;
-typedef struct _GstD3D11Allocator GstD3D11Allocator;
-typedef struct _GstD3D11AllocatorClass GstD3D11AllocatorClass;
-typedef struct _GstD3D11AllocatorPrivate GstD3D11AllocatorPrivate;
-
-typedef struct _GstD3D11BufferPool GstD3D11BufferPool;
-typedef struct _GstD3D11BufferPoolClass GstD3D11BufferPoolClass;
-typedef struct _GstD3D11BufferPoolPrivate GstD3D11BufferPoolPrivate;
-
-typedef struct _GstD3D11Format GstD3D11Format;
-
-typedef struct _GstD3D11BaseFilter GstD3D11BaseFilter;
-typedef struct _GstD3D11BaseFilterClass GstD3D11BaseFilterClass;
-
-typedef struct _GstD3D11Upload GstD3D11Upload;
-typedef struct _GstD3D11UploadClass GstD3D11UploadClass;
-
-typedef struct _GstD3D11Download GstD3D11Download;
-typedef struct _GstD3D11DownloadClass GstD3D11DownloadClass;
-
-typedef struct _GstD3D11ColorConvert GstD3D11ColorConvert;
-typedef struct _GstD3D11ColorConvertClass GstD3D11ColorConvertClass;
-
-typedef struct _GstD3D11Decoder GstD3D11Decoder;
-typedef struct _GstD3D11DecoderClass GstD3D11DecoderClass;
-typedef struct _GstD3D11DecoderPrivate GstD3D11DecoderPrivate;
-
-G_END_DECLS
-
-#endif /* __GST_D3D11_FWD_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11basefilter.c
Deleted
@@ -1,265 +0,0 @@
-/* GStreamer
- * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifdef HAVE_CONFIG_H
-# include <config.h>
-#endif
-
-#include "gstd3d11basefilter.h"
-#include "gstd3d11utils.h"
-#include "gstd3d11device.h"
-#include "gstd3d11bufferpool.h"
-#include "gstd3d11memory.h"
-
-GST_DEBUG_CATEGORY_STATIC (gst_d3d11_base_filter_debug);
-#define GST_CAT_DEFAULT gst_d3d11_base_filter_debug
-
-enum
-{
-  PROP_0,
-  PROP_ADAPTER,
-};
-
-#define DEFAULT_ADAPTER -1
-
-#define gst_d3d11_base_filter_parent_class parent_class
-G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstD3D11BaseFilter, gst_d3d11_base_filter,
-    GST_TYPE_BASE_TRANSFORM, GST_DEBUG_CATEGORY_INIT (GST_CAT_DEFAULT,
-        "d3d11basefilter", 0, "d3d11 basefilter"));
-
-static void gst_d3d11_base_filter_set_property (GObject * object, guint prop_id,
-    const GValue * value, GParamSpec * pspec);
-static void gst_d3d11_base_filter_get_property (GObject * object, guint prop_id,
-    GValue * value, GParamSpec * pspec);
-static void gst_d3d11_base_filter_dispose (GObject * object);
-static void gst_d3d11_base_filter_set_context (GstElement * element,
-    GstContext * context);
-static gboolean gst_d3d11_base_filter_start (GstBaseTransform * trans);
-static gboolean gst_d3d11_base_filter_stop (GstBaseTransform * trans);
-static gboolean gst_d3d11_base_filter_set_caps (GstBaseTransform * trans,
-    GstCaps * incaps, GstCaps * outcaps);
-static gboolean gst_d3d11_base_filter_get_unit_size (GstBaseTransform * trans,
-    GstCaps * caps, gsize * size);
-static gboolean
-gst_d3d11_base_filter_query (GstBaseTransform * trans,
-    GstPadDirection direction, GstQuery * query);
-
-static void
-gst_d3d11_base_filter_class_init (GstD3D11BaseFilterClass * klass)
-{
-  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
-  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
-  GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass);
-
-  gobject_class->set_property = gst_d3d11_base_filter_set_property;
-  gobject_class->get_property = gst_d3d11_base_filter_get_property;
-  gobject_class->dispose = gst_d3d11_base_filter_dispose;
-
-  /**
-   * GstD3D11BaseFilter:adapter:
-   *
-   * Adapter index for creating device (-1 for default)
-   *
-   * Since: 1.18
-   */
-  g_object_class_install_property (gobject_class, PROP_ADAPTER,
-      g_param_spec_int ("adapter", "Adapter",
-          "Adapter index for creating device (-1 for default)",
-          -1, G_MAXINT32, DEFAULT_ADAPTER,
-          G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY |
-          G_PARAM_STATIC_STRINGS));
-
-  element_class->set_context =
-      GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_set_context);
-
-  trans_class->passthrough_on_same_caps = TRUE;
-
-  trans_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_start);
-  trans_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_stop);
-  trans_class->set_caps = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_set_caps);
-  trans_class->get_unit_size =
-      GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_get_unit_size);
-  trans_class->query = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_query);
-
-  gst_type_mark_as_plugin_api (GST_TYPE_D3D11_BASE_FILTER, 0);
-}
-
-static void
-gst_d3d11_base_filter_init (GstD3D11BaseFilter * filter)
-{
-  filter->adapter = DEFAULT_ADAPTER;
-}
-
-static void
-gst_d3d11_base_filter_set_property (GObject * object, guint prop_id,
-    const GValue * value, GParamSpec * pspec)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (object);
-
-  switch (prop_id) {
-    case PROP_ADAPTER:
-      filter->adapter = g_value_get_int (value);
-      break;
-    default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
-      break;
-  }
-}
-
-static void
-gst_d3d11_base_filter_get_property (GObject * object, guint prop_id,
-    GValue * value, GParamSpec * pspec)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (object);
-
-  switch (prop_id) {
-    case PROP_ADAPTER:
-      g_value_set_int (value, filter->adapter);
-      break;
-    default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
-      break;
-  }
-}
-
-static void
-gst_d3d11_base_filter_dispose (GObject * object)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (object);
-
-  gst_clear_object (&filter->device);
-
-  G_OBJECT_CLASS (parent_class)->dispose (object);
-}
-
-static void
-gst_d3d11_base_filter_set_context (GstElement * element, GstContext * context)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (element);
-
-  gst_d3d11_handle_set_context (element,
-      context, filter->adapter, &filter->device);
-
-  GST_ELEMENT_CLASS (parent_class)->set_context (element, context);
-}
-
-static gboolean
-gst_d3d11_base_filter_start (GstBaseTransform * trans)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans);
-
-  if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (filter),
-          filter->adapter, &filter->device)) {
-    GST_ERROR_OBJECT (filter, "Failed to get D3D11 device");
-    return FALSE;
-  }
-
-  return TRUE;
-}
-
-static gboolean
-gst_d3d11_base_filter_stop (GstBaseTransform * trans)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans);
-
-  gst_clear_object (&filter->device);
-
-  return TRUE;
-}
-
-static gboolean
-gst_d3d11_base_filter_set_caps (GstBaseTransform * trans, GstCaps * incaps,
-    GstCaps * outcaps)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans);
-  GstVideoInfo in_info, out_info;
-  GstD3D11BaseFilterClass *klass;
-  gboolean res;
-
-  if (!filter->device) {
-    GST_ERROR_OBJECT (filter, "No available D3D11 device");
-    return FALSE;
-  }
-
-  /* input caps */
-  if (!gst_video_info_from_caps (&in_info, incaps))
-    goto invalid_caps;
-
-  /* output caps */
-  if (!gst_video_info_from_caps (&out_info, outcaps))
-    goto invalid_caps;
-
-  klass = GST_D3D11_BASE_FILTER_GET_CLASS (filter);
-  if (klass->set_info)
-    res = klass->set_info (filter, incaps, &in_info, outcaps, &out_info);
-  else
-    res = TRUE;
-
-  if (res) {
-    filter->in_info = in_info;
-    filter->out_info = out_info;
-  }
-
-  return res;
-
-  /* ERRORS */
-invalid_caps:
-  {
-    GST_ERROR_OBJECT (filter, "invalid caps");
-    return FALSE;
-  }
-}
-
-static gboolean
-gst_d3d11_base_filter_get_unit_size (GstBaseTransform * trans, GstCaps * caps,
-    gsize * size)
-{
-  gboolean ret = FALSE;
-  GstVideoInfo info;
-
-  ret = gst_video_info_from_caps (&info, caps);
-  if (ret)
-    *size = GST_VIDEO_INFO_SIZE (&info);
-
-  return TRUE;
-}
-
-static gboolean
-gst_d3d11_base_filter_query (GstBaseTransform * trans,
-    GstPadDirection direction, GstQuery * query)
-{
-  GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans);
-
-  switch (GST_QUERY_TYPE (query)) {
-    case GST_QUERY_CONTEXT:
-    {
-      gboolean ret;
-      ret = gst_d3d11_handle_context_query (GST_ELEMENT (filter), query,
-          filter->device);
-      if (ret)
-        return TRUE;
-      break;
-    }
-    default:
-      break;
-  }
-
-  return GST_BASE_TRANSFORM_CLASS (parent_class)->query (trans, direction,
-      query);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11bufferpool.c
Deleted
@@ -1,397 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11bufferpool.h" -#include "gstd3d11memory.h" -#include "gstd3d11device.h" - -GST_DEBUG_CATEGORY_STATIC (gst_d3d11_buffer_pool_debug); -#define GST_CAT_DEFAULT gst_d3d11_buffer_pool_debug - -struct _GstD3D11BufferPoolPrivate -{ - GstD3D11Device *device; - GstD3D11Allocator *allocator; - - /* initial buffer used for calculating buffer size */ - GstBuffer *initial_buffer; - - gboolean add_videometa; - GstD3D11AllocationParams *d3d11_params; -}; - -#define gst_d3d11_buffer_pool_parent_class parent_class -G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11BufferPool, - gst_d3d11_buffer_pool, GST_TYPE_BUFFER_POOL); - -static void gst_d3d11_buffer_pool_dispose (GObject * object); -static const gchar **gst_d3d11_buffer_pool_get_options (GstBufferPool * pool); -static gboolean gst_d3d11_buffer_pool_set_config (GstBufferPool * pool, - GstStructure * config); -static GstFlowReturn gst_d3d11_buffer_pool_alloc (GstBufferPool * pool, - GstBuffer ** buffer, GstBufferPoolAcquireParams * params); -static void gst_d3d11_buffer_pool_flush_start (GstBufferPool * pool); -static void 
gst_d3d11_buffer_pool_flush_stop (GstBufferPool * pool); - -static void -gst_d3d11_buffer_pool_class_init (GstD3D11BufferPoolClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstBufferPoolClass *bufferpool_class = GST_BUFFER_POOL_CLASS (klass); - - gobject_class->dispose = gst_d3d11_buffer_pool_dispose; - - bufferpool_class->get_options = gst_d3d11_buffer_pool_get_options; - bufferpool_class->set_config = gst_d3d11_buffer_pool_set_config; - bufferpool_class->alloc_buffer = gst_d3d11_buffer_pool_alloc; - bufferpool_class->flush_start = gst_d3d11_buffer_pool_flush_start; - bufferpool_class->flush_stop = gst_d3d11_buffer_pool_flush_stop; - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_buffer_pool_debug, "d3d11bufferpool", 0, - "d3d11bufferpool object"); -} - -static void -gst_d3d11_buffer_pool_init (GstD3D11BufferPool * self) -{ - self->priv = gst_d3d11_buffer_pool_get_instance_private (self); -} - -static void -gst_d3d11_buffer_pool_dispose (GObject * object) -{ - GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (object); - GstD3D11BufferPoolPrivate *priv = self->priv; - - if (priv->d3d11_params) - gst_d3d11_allocation_params_free (priv->d3d11_params); - priv->d3d11_params = NULL; - - gst_clear_buffer (&priv->initial_buffer); - gst_clear_object (&priv->device); - gst_clear_object (&priv->allocator); - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static const gchar ** -gst_d3d11_buffer_pool_get_options (GstBufferPool * pool) -{ - /* NOTE: d3d11 memory does not support alignment */ - static const gchar *options[] = { GST_BUFFER_POOL_OPTION_VIDEO_META, NULL }; - - return options; -} - -static gboolean -gst_d3d11_buffer_pool_set_config (GstBufferPool * pool, GstStructure * config) -{ - GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); - GstD3D11BufferPoolPrivate *priv = self->priv; - GstVideoInfo info; - GstCaps *caps = NULL; - guint min_buffers, max_buffers; - GstAllocator *allocator = NULL; - gboolean ret = TRUE; - 
D3D11_TEXTURE2D_DESC *desc; - gint i; - - if (!gst_buffer_pool_config_get_params (config, &caps, NULL, &min_buffers, - &max_buffers)) - goto wrong_config; - - if (caps == NULL) - goto no_caps; - - /* now parse the caps from the config */ - if (!gst_video_info_from_caps (&info, caps)) - goto wrong_caps; - - GST_LOG_OBJECT (pool, "%dx%d, caps %" GST_PTR_FORMAT, info.width, info.height, - caps); - - if (!gst_buffer_pool_config_get_allocator (config, &allocator, NULL)) - goto wrong_config; - - gst_clear_buffer (&priv->initial_buffer); - gst_clear_object (&priv->allocator); - - if (allocator) { - if (!GST_IS_D3D11_ALLOCATOR (allocator)) { - goto wrong_allocator; - } else { - priv->allocator = gst_object_ref (allocator); - } - } else { - priv->allocator = gst_d3d11_allocator_new (priv->device); - g_assert (priv->allocator); - } - - priv->add_videometa = gst_buffer_pool_config_has_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_META); - - if (priv->d3d11_params) - gst_d3d11_allocation_params_free (priv->d3d11_params); - priv->d3d11_params = - gst_buffer_pool_config_get_d3d11_allocation_params (config); - if (!priv->d3d11_params) { - /* allocate memory with resource format by default */ - priv->d3d11_params = - gst_d3d11_allocation_params_new (priv->device, &info, 0, 0); - } - - desc = priv->d3d11_params->desc; - - /* resolution of semi-planar formats must be multiple of 2 */ - if (desc[0].Format == DXGI_FORMAT_NV12 || desc[0].Format == DXGI_FORMAT_P010 - || desc[0].Format == DXGI_FORMAT_P016) { - if (desc[0].Width % 2 || desc[0].Height % 2) { - gint width, height; - GstVideoAlignment align; - - GST_WARNING_OBJECT (self, "Resolution %dx%d is not mutiple of 2, fixing", - desc[0].Width, desc[0].Height); - - width = GST_ROUND_UP_2 (desc[0].Width); - height = GST_ROUND_UP_2 (desc[0].Height); - - gst_video_alignment_reset (&align); - align.padding_right = width - desc[0].Width; - align.padding_bottom = height - desc[0].Height; - - gst_d3d11_allocation_params_alignment 
(priv->d3d11_params, &align); - } - } -#ifndef GST_DISABLE_GST_DEBUG - { - GST_LOG_OBJECT (self, "Direct3D11 Allocation params"); - GST_LOG_OBJECT (self, "\tD3D11AllocationFlags: 0x%x", - priv->d3d11_params->flags); - for (i = 0; GST_VIDEO_MAX_PLANES; i++) { - if (desc[i].Format == DXGI_FORMAT_UNKNOWN) - break; - GST_LOG_OBJECT (self, "\t[plane %d] %dx%d, DXGI format %d", - i, desc[i].Width, desc[i].Height, desc[i].Format); - GST_LOG_OBJECT (self, "\t[plane %d] MipLevel %d, ArraySize %d", - i, desc[i].MipLevels, desc[i].ArraySize); - GST_LOG_OBJECT (self, - "\t[plane %d] SampleDesc.Count %d, SampleDesc.Quality %d", - i, desc[i].SampleDesc.Count, desc[i].SampleDesc.Quality); - GST_LOG_OBJECT (self, "\t[plane %d] Usage %d", i, desc[i].Usage); - GST_LOG_OBJECT (self, - "\t[plane %d] BindFlags 0x%x", i, desc[i].BindFlags); - GST_LOG_OBJECT (self, - "\t[plane %d] CPUAccessFlags 0x%x", i, desc[i].CPUAccessFlags); - GST_LOG_OBJECT (self, - "\t[plane %d] MiscFlags 0x%x", i, desc[i].MiscFlags); - } - } -#endif - - if ((priv->d3d11_params->flags & GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY)) { - guint max_array_size = 0; - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (desc[i].Format == DXGI_FORMAT_UNKNOWN) - break; - - if (desc[i].ArraySize > max_array_size) - max_array_size = desc[i].ArraySize; - } - - if (max_buffers == 0 || max_buffers > max_array_size) { - GST_WARNING_OBJECT (pool, - "Array pool is requested but allowed pool size %d > ArraySize %d", - max_buffers, max_array_size); - max_buffers = max_array_size; - } - } - - gst_d3d11_buffer_pool_alloc (pool, &priv->initial_buffer, NULL); - - if (!priv->initial_buffer) { - GST_ERROR_OBJECT (pool, "Could not create initial buffer"); - return FALSE; - } - - self->buffer_size = gst_buffer_get_size (priv->initial_buffer); - - gst_buffer_pool_config_set_params (config, - caps, self->buffer_size, min_buffers, max_buffers); - - return GST_BUFFER_POOL_CLASS (parent_class)->set_config (pool, config) && ret; - - /* ERRORS */ 
-wrong_config: - { - GST_WARNING_OBJECT (pool, "invalid config"); - return FALSE; - } -no_caps: - { - GST_WARNING_OBJECT (pool, "no caps in config"); - return FALSE; - } -wrong_caps: - { - GST_WARNING_OBJECT (pool, - "failed getting geometry from caps %" GST_PTR_FORMAT, caps); - return FALSE; - } -wrong_allocator: - { - GST_WARNING_OBJECT (pool, "Incorrect allocator type for this pool"); - return FALSE; - } -} - -static GstFlowReturn -gst_d3d11_buffer_pool_alloc (GstBufferPool * pool, GstBuffer ** buffer, - GstBufferPoolAcquireParams * params) -{ - GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); - GstD3D11BufferPoolPrivate *priv = self->priv; - GstMemory *mem; - GstBuffer *buf; - GstD3D11AllocationParams *d3d11_params = priv->d3d11_params; - GstVideoInfo *info = &d3d11_params->info; - GstVideoInfo *aligned_info = &d3d11_params->aligned_info; - gint n_texture = 0; - gint i; - gsize offset[GST_VIDEO_MAX_PLANES] = { 0, }; - - /* consume pre-allocated buffer if any */ - if (G_UNLIKELY (priv->initial_buffer)) { - *buffer = priv->initial_buffer; - priv->initial_buffer = NULL; - - return GST_FLOW_OK; - } - - buf = gst_buffer_new (); - - if (d3d11_params->d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { - for (n_texture = 0; n_texture < GST_VIDEO_INFO_N_PLANES (info); n_texture++) { - d3d11_params->plane = n_texture; - mem = gst_d3d11_allocator_alloc (priv->allocator, d3d11_params); - if (!mem) - goto error; - - gst_buffer_append_memory (buf, mem); - } - } else { - d3d11_params->plane = 0; - mem = gst_d3d11_allocator_alloc (priv->allocator, priv->d3d11_params); - n_texture++; - - if (!mem) - goto error; - - gst_buffer_append_memory (buf, mem); - } - - /* calculate offset */ - for (i = 0; i < n_texture && i < GST_VIDEO_MAX_PLANES - 1; i++) { - offset[i + 1] = offset[i] + - d3d11_params->stride[i] * GST_VIDEO_INFO_COMP_HEIGHT (aligned_info, i); - } - - if (priv->add_videometa) { - GST_DEBUG_OBJECT (self, "adding GstVideoMeta"); - gst_buffer_add_video_meta_full 
(buf, GST_VIDEO_FRAME_FLAG_NONE, - GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_WIDTH (info), - GST_VIDEO_INFO_HEIGHT (info), GST_VIDEO_INFO_N_PLANES (info), - offset, d3d11_params->stride); - } - - *buffer = buf; - - return GST_FLOW_OK; - -error: - gst_buffer_unref (buf); - - GST_ERROR_OBJECT (self, "cannot create texture memory"); - - return GST_FLOW_ERROR; -} - -static void -gst_d3d11_buffer_pool_flush_start (GstBufferPool * pool) -{ - GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); - GstD3D11BufferPoolPrivate *priv = self->priv; - - if (priv->allocator) - gst_d3d11_allocator_set_flushing (priv->allocator, TRUE); -} - -static void -gst_d3d11_buffer_pool_flush_stop (GstBufferPool * pool) -{ - GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); - GstD3D11BufferPoolPrivate *priv = self->priv; - - if (priv->allocator) - gst_d3d11_allocator_set_flushing (priv->allocator, FALSE); -} - -GstBufferPool * -gst_d3d11_buffer_pool_new (GstD3D11Device * device) -{ - GstD3D11BufferPool *pool; - GstD3D11Allocator *alloc; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - pool = g_object_new (GST_TYPE_D3D11_BUFFER_POOL, NULL); - alloc = gst_d3d11_allocator_new (device); - - pool->priv->device = gst_object_ref (device); - pool->priv->allocator = alloc; - - return GST_BUFFER_POOL_CAST (pool); -} - -GstD3D11AllocationParams * -gst_buffer_pool_config_get_d3d11_allocation_params (GstStructure * config) -{ - GstD3D11AllocationParams *ret; - - if (!gst_structure_get (config, "d3d11-allocation-params", - GST_TYPE_D3D11_ALLOCATION_PARAMS, &ret, NULL)) - ret = NULL; - - return ret; -} - -void -gst_buffer_pool_config_set_d3d11_allocation_params (GstStructure * config, - GstD3D11AllocationParams * params) -{ - g_return_if_fail (config != NULL); - g_return_if_fail (params != NULL); - - gst_structure_set (config, "d3d11-allocation-params", - GST_TYPE_D3D11_ALLOCATION_PARAMS, params, NULL); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11bufferpool.h
Deleted
@@ -1,70 +0,0 @@
-/*
- * GStreamer
- * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_D3D11_BUFFER_POOL_H__
-#define __GST_D3D11_BUFFER_POOL_H__
-
-#include <gst/gst.h>
-#include <gst/video/video.h>
-
-#include "gstd3d11_fwd.h"
-
-G_BEGIN_DECLS
-
-#define GST_TYPE_D3D11_BUFFER_POOL (gst_d3d11_buffer_pool_get_type())
-#define GST_D3D11_BUFFER_POOL(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_D3D11_BUFFER_POOL, GstD3D11BufferPool))
-#define GST_D3D11_BUFFER_POOL_CLASS(klass) (G_TYPE_CHECK_CLASS((klass), GST_TYPE_D3D11_BUFFER_POOL, GstD3D11BufferPoolClass))
-#define GST_IS_D3D11_BUFFER_POOL(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_D3D11_BUFFER_POOL))
-#define GST_IS_D3D11_BUFFER_POOL_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_BUFFER_POOL))
-#define GST_D3D11_BUFFER_POOL_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_BUFFER_POOL, GstD3D11BufferPoolClass))
-
-struct _GstD3D11BufferPool
-{
-  GstBufferPool parent;
-
-  /* re-calculated buffer size based on d3d11 pitch and stride */
-  guint buffer_size;
-
-  /*< private >*/
-  GstD3D11BufferPoolPrivate *priv;
-
-  gpointer _gst_reserved[GST_PADDING];
-};
-
-struct _GstD3D11BufferPoolClass
-{
-  GstBufferPoolClass bufferpool_class;
-
-  /*< private >*/
-  gpointer _gst_reserved[GST_PADDING];
-};
-
-GType gst_d3d11_buffer_pool_get_type (void);
-
-GstBufferPool * gst_d3d11_buffer_pool_new (GstD3D11Device *device);
-
-GstD3D11AllocationParams * gst_buffer_pool_config_get_d3d11_allocation_params (GstStructure * config);
-
-void gst_buffer_pool_config_set_d3d11_allocation_params (GstStructure * config,
-    GstD3D11AllocationParams * params);
-
-G_END_DECLS
-
-#endif /* __GST_D3D11_BUFFER_POOL_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11colorconvert.c
Deleted
@@ -1,1576 +0,0 @@ -/* GStreamer - * Copyright (C) <1999> Erik Walthinsen <omega@cse.ogi.edu> - * Copyright (C) 2005-2012 David Schleef <ds@schleef.org> - * Copyright (C) 2012-2014 Matthew Waters <ystree00@gmail.com> - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * Copyright (C) <2019> Jeongki Kim <jeongki.kim@jeongki.kim> - * Copyright (C) 2020 Thibault Saunier <tsaunier@igalia.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -/** - * SECTION:element-d3d11convert - * @title: d3d11colorconvert - * - * This element resizes video frames and change color space. - * By default the element will try to negotiate to the same size on the source - * and sinkpad so that no scaling is needed. - * It is therefore safe to insert this element in a pipeline to - * get more robust behaviour without any cost if no scaling is needed. - * - * ## Example launch line - * |[ - * gst-launch-1.0 -v videotestsrc ! video/x-raw,format=NV12 ! d3d11upload ! d3d11convert ! d3d11videosink - * ]| - * This will output a test video (generated in NV12 format) in a video - * window. If the video sink selected does not support NV12 - * d3d11colorconvert will automatically convert the video to a format understood - * by the video sink. 
- * - */ - -#ifdef HAVE_CONFIG_H -# include <config.h> -#endif - -#include "gstd3d11colorconvert.h" -#include "gstd3d11utils.h" -#include "gstd3d11memory.h" -#include "gstd3d11device.h" -#include "gstd3d11bufferpool.h" - -GST_DEBUG_CATEGORY_STATIC (gst_d3d11_color_convert_debug); -#define GST_CAT_DEFAULT gst_d3d11_color_convert_debug - -static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", - GST_PAD_SINK, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS)) - ); - -static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", - GST_PAD_SRC, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS)) - ); - -#define gst_d3d11_color_convert_parent_class parent_class -G_DEFINE_TYPE (GstD3D11ColorConvert, - gst_d3d11_color_convert, GST_TYPE_D3D11_BASE_FILTER); - -static void gst_d3d11_color_convert_dispose (GObject * object); -static GstCaps *gst_d3d11_color_convert_transform_caps (GstBaseTransform * - trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter); -static GstCaps *gst_d3d11_color_convert_fixate_caps (GstBaseTransform * - base, GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); -static gboolean gst_d3d11_color_convert_filter_meta (GstBaseTransform * trans, - GstQuery * query, GType api, const GstStructure * params); -static gboolean -gst_d3d11_color_convert_propose_allocation (GstBaseTransform * trans, - GstQuery * decide_query, GstQuery * query); -static gboolean -gst_d3d11_color_convert_decide_allocation (GstBaseTransform * trans, - 
GstQuery * query); - -static GstFlowReturn gst_d3d11_color_convert_transform (GstBaseTransform * - trans, GstBuffer * inbuf, GstBuffer * outbuf); -static gboolean gst_d3d11_color_convert_set_info (GstD3D11BaseFilter * filter, - GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, - GstVideoInfo * out_info); - -/* copies the given caps */ -static GstCaps * -gst_d3d11_color_convert_caps_remove_format_and_rangify_size_info (GstCaps * - caps) -{ - GstStructure *st; - GstCapsFeatures *f; - gint i, n; - GstCaps *res; - GstCapsFeatures *feature = - gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); - - res = gst_caps_new_empty (); - - n = gst_caps_get_size (caps); - for (i = 0; i < n; i++) { - st = gst_caps_get_structure (caps, i); - f = gst_caps_get_features (caps, i); - - /* If this is already expressed by the existing caps - * skip this structure */ - if (i > 0 && gst_caps_is_subset_structure_full (res, st, f)) - continue; - - st = gst_structure_copy (st); - /* Only remove format info for the cases when we can actually convert */ - if (!gst_caps_features_is_any (f) - && gst_caps_features_is_equal (f, feature)) { - gst_structure_set (st, "width", GST_TYPE_INT_RANGE, 1, G_MAXINT, - "height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL); - /* if pixel aspect ratio, make a range of it */ - if (gst_structure_has_field (st, "pixel-aspect-ratio")) { - gst_structure_set (st, "pixel-aspect-ratio", - GST_TYPE_FRACTION_RANGE, 1, G_MAXINT, G_MAXINT, 1, NULL); - } - gst_structure_remove_fields (st, "format", "colorimetry", "chroma-site", - NULL); - } - - gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); - } - gst_caps_features_free (feature); - - return res; -} - -static void -gst_d3d11_color_convert_class_init (GstD3D11ColorConvertClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); - 
GstD3D11BaseFilterClass *bfilter_class = GST_D3D11_BASE_FILTER_CLASS (klass); - - gobject_class->dispose = gst_d3d11_color_convert_dispose; - - gst_element_class_add_static_pad_template (element_class, &sink_template); - gst_element_class_add_static_pad_template (element_class, &src_template); - - gst_element_class_set_static_metadata (element_class, - "Direct3D11 colorspace converter and scaler", - "Filter/Converter/Scaler/Video/Hardware", - "Resizes video and allows color conversion using D3D11", - "Seungha Yang <seungha.yang@navercorp.com>, " - "Jeongki Kim <jeongki.kim@jeongki.kim>"); - - trans_class->passthrough_on_same_caps = TRUE; - - trans_class->transform_caps = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_transform_caps); - trans_class->fixate_caps = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_fixate_caps); - trans_class->filter_meta = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_filter_meta); - trans_class->propose_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_propose_allocation); - trans_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_decide_allocation); - trans_class->transform = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_transform); - - bfilter_class->set_info = - GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_set_info); - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_color_convert_debug, - "d3d11convert", 0, "d3d11convert element"); -} - -static void -gst_d3d11_color_convert_init (GstD3D11ColorConvert * self) -{ -} - -static void -gst_d3d11_color_convert_clear_shader_resource (GstD3D11ColorConvert * self) -{ - gint i; - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (self->shader_resource_view[i]) { - ID3D11ShaderResourceView_Release (self->shader_resource_view[i]); - self->shader_resource_view[i] = NULL; - } - - if (self->render_target_view[i]) { - ID3D11RenderTargetView_Release (self->render_target_view[i]); - self->render_target_view[i] = NULL; - } - } - - self->num_input_view = 0; - self->num_output_view = 0; - - for 
(i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (self->in_texture[i]) { - ID3D11Texture2D_Release (self->in_texture[i]); - self->in_texture[i] = NULL; - } - - if (self->out_texture[i]) { - ID3D11Texture2D_Release (self->out_texture[i]); - self->out_texture[i] = NULL; - } - } - - if (self->converter) - gst_d3d11_color_converter_free (self->converter); - self->converter = NULL; -} - -static void -gst_d3d11_color_convert_dispose (GObject * object) -{ - GstD3D11ColorConvert *self = GST_D3D11_COLOR_CONVERT (object); - - gst_d3d11_color_convert_clear_shader_resource (self); - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static GstCaps * -gst_d3d11_color_convert_transform_caps (GstBaseTransform * - trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter) -{ - GstCaps *tmp, *tmp2; - GstCaps *result; - - /* Get all possible caps that we can transform to */ - tmp = gst_d3d11_color_convert_caps_remove_format_and_rangify_size_info (caps); - - if (filter) { - tmp2 = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); - gst_caps_unref (tmp); - tmp = tmp2; - } - - result = tmp; - - GST_DEBUG_OBJECT (trans, "transformed %" GST_PTR_FORMAT " into %" - GST_PTR_FORMAT, caps, result); - - return result; -} - -/* - * This is an incomplete matrix of input formats and a score for the preferred output - * format. 
- * - * out: RGB24 RGB16 ARGB AYUV YUV444 YUV422 YUV420 YUV411 YUV410 PAL GRAY - * in - * RGB24 0 2 1 2 2 3 4 5 6 7 8 - * RGB16 1 0 1 2 2 3 4 5 6 7 8 - * ARGB 2 3 0 1 4 5 6 7 8 9 10 - * AYUV 3 4 1 0 2 5 6 7 8 9 10 - * YUV444 2 4 3 1 0 5 6 7 8 9 10 - * YUV422 3 5 4 2 1 0 6 7 8 9 10 - * YUV420 4 6 5 3 2 1 0 7 8 9 10 - * YUV411 4 6 5 3 2 1 7 0 8 9 10 - * YUV410 6 8 7 5 4 3 2 1 0 9 10 - * PAL 1 3 2 6 4 6 7 8 9 0 10 - * GRAY 1 4 3 2 1 5 6 7 8 9 0 - * - * PAL or GRAY are never preferred; if we can, we would convert to PAL instead - * of GRAY, though - * less subsampling is preferred and if any, preferably horizontal - * We would like to keep the alpha, even if we would need to do colorspace conversion - * or lose depth. - */ -#define SCORE_FORMAT_CHANGE 1 -#define SCORE_DEPTH_CHANGE 1 -#define SCORE_ALPHA_CHANGE 1 -#define SCORE_CHROMA_W_CHANGE 1 -#define SCORE_CHROMA_H_CHANGE 1 -#define SCORE_PALETTE_CHANGE 1 - -#define SCORE_COLORSPACE_LOSS 2 /* RGB <-> YUV */ -#define SCORE_DEPTH_LOSS 4 /* change bit depth */ -#define SCORE_ALPHA_LOSS 8 /* lose the alpha channel */ -#define SCORE_CHROMA_W_LOSS 16 /* horizontal subsample */ -#define SCORE_CHROMA_H_LOSS 32 /* vertical subsample */ -#define SCORE_PALETTE_LOSS 64 /* convert to palette format */ -#define SCORE_COLOR_LOSS 128 /* convert to GRAY */ - -#define COLORSPACE_MASK (GST_VIDEO_FORMAT_FLAG_YUV | \ - GST_VIDEO_FORMAT_FLAG_RGB | GST_VIDEO_FORMAT_FLAG_GRAY) -#define ALPHA_MASK (GST_VIDEO_FORMAT_FLAG_ALPHA) -#define PALETTE_MASK (GST_VIDEO_FORMAT_FLAG_PALETTE) - -/* calculate how much loss a conversion would incur */ -static void -score_value (GstBaseTransform * base, const GstVideoFormatInfo * in_info, - const GValue * val, gint * min_loss, const GstVideoFormatInfo ** out_info) -{ - const gchar *fname; - const GstVideoFormatInfo *t_info; - GstVideoFormatFlags in_flags, t_flags; - gint loss; - - fname = g_value_get_string (val); - t_info = gst_video_format_get_info (gst_video_format_from_string (fname)); - if (!t_info || 
t_info->format == GST_VIDEO_FORMAT_UNKNOWN) - return; - - /* accept input format immediately without loss */ - if (in_info == t_info) { - *min_loss = 0; - *out_info = t_info; - return; - } - - loss = SCORE_FORMAT_CHANGE; - - in_flags = GST_VIDEO_FORMAT_INFO_FLAGS (in_info); - in_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; - in_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; - in_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; - - t_flags = GST_VIDEO_FORMAT_INFO_FLAGS (t_info); - t_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; - t_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; - t_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; - - if ((t_flags & PALETTE_MASK) != (in_flags & PALETTE_MASK)) { - loss += SCORE_PALETTE_CHANGE; - if (t_flags & PALETTE_MASK) - loss += SCORE_PALETTE_LOSS; - } - - if ((t_flags & COLORSPACE_MASK) != (in_flags & COLORSPACE_MASK)) { - loss += SCORE_COLORSPACE_LOSS; - if (t_flags & GST_VIDEO_FORMAT_FLAG_GRAY) - loss += SCORE_COLOR_LOSS; - } - - if ((t_flags & ALPHA_MASK) != (in_flags & ALPHA_MASK)) { - loss += SCORE_ALPHA_CHANGE; - if (in_flags & ALPHA_MASK) - loss += SCORE_ALPHA_LOSS; - } - - if ((in_info->h_sub[1]) != (t_info->h_sub[1])) { - loss += SCORE_CHROMA_H_CHANGE; - if ((in_info->h_sub[1]) < (t_info->h_sub[1])) - loss += SCORE_CHROMA_H_LOSS; - } - if ((in_info->w_sub[1]) != (t_info->w_sub[1])) { - loss += SCORE_CHROMA_W_CHANGE; - if ((in_info->w_sub[1]) < (t_info->w_sub[1])) - loss += SCORE_CHROMA_W_LOSS; - } - - if ((in_info->bits) != (t_info->bits)) { - loss += SCORE_DEPTH_CHANGE; - if ((in_info->bits) > (t_info->bits)) - loss += SCORE_DEPTH_LOSS + (in_info->bits - t_info->bits); - } - - GST_DEBUG_OBJECT (base, "score %s -> %s = %d", - GST_VIDEO_FORMAT_INFO_NAME (in_info), - GST_VIDEO_FORMAT_INFO_NAME (t_info), loss); - - if (loss < *min_loss) { - GST_DEBUG_OBJECT (base, "found new best %d", loss); - *out_info = t_info; - *min_loss = loss; - } -} - -static void -gst_d3d11_color_convert_fixate_format (GstBaseTransform * trans, - GstCaps * caps, GstCaps * result) -{ - GstStructure 
*ins, *outs; - const gchar *in_format; - const GstVideoFormatInfo *in_info, *out_info = NULL; - gint min_loss = G_MAXINT; - guint i, capslen; - - ins = gst_caps_get_structure (caps, 0); - in_format = gst_structure_get_string (ins, "format"); - if (!in_format) { - return; - } - - GST_DEBUG_OBJECT (trans, "source format %s", in_format); - - in_info = - gst_video_format_get_info (gst_video_format_from_string (in_format)); - if (!in_info) - return; - - outs = gst_caps_get_structure (result, 0); - - capslen = gst_caps_get_size (result); - GST_DEBUG ("iterate %d structures", capslen); - for (i = 0; i < capslen; i++) { - GstStructure *tests; - const GValue *format; - - tests = gst_caps_get_structure (result, i); - format = gst_structure_get_value (tests, "format"); - gst_structure_remove_fields (tests, "height", "width", "pixel-aspect-ratio", - "display-aspect-ratio", NULL); - /* should not happen */ - if (format == NULL) - continue; - - if (GST_VALUE_HOLDS_LIST (format)) { - gint j, len; - - len = gst_value_list_get_size (format); - GST_DEBUG_OBJECT (trans, "have %d formats", len); - for (j = 0; j < len; j++) { - const GValue *val; - - val = gst_value_list_get_value (format, j); - if (G_VALUE_HOLDS_STRING (val)) { - score_value (trans, in_info, val, &min_loss, &out_info); - if (min_loss == 0) - break; - } - } - } else if (G_VALUE_HOLDS_STRING (format)) { - score_value (trans, in_info, format, &min_loss, &out_info); - } - } - if (out_info) - gst_structure_set (outs, "format", G_TYPE_STRING, - GST_VIDEO_FORMAT_INFO_NAME (out_info), NULL); -} - -static GstCaps * -gst_d3d11_color_convert_get_fixed_format (GstBaseTransform * trans, - GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) -{ - GstCaps *result; - - result = gst_caps_intersect (othercaps, caps); - if (gst_caps_is_empty (result)) { - gst_caps_unref (result); - result = gst_caps_copy (othercaps); - } - - gst_d3d11_color_convert_fixate_format (trans, caps, result); - - /* fixate remaining fields */ - 
result = gst_caps_fixate (result); - - if (direction == GST_PAD_SINK) { - if (gst_caps_is_subset (caps, result)) { - gst_caps_replace (&result, caps); - } - } - - return result; -} - -static GstCaps * -gst_d3d11_color_convert_fixate_size (GstBaseTransform * base, - GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) -{ - GstStructure *ins, *outs; - const GValue *from_par, *to_par; - GValue fpar = { 0, }, tpar = { - 0,}; - - othercaps = gst_caps_truncate (othercaps); - othercaps = gst_caps_make_writable (othercaps); - ins = gst_caps_get_structure (caps, 0); - outs = gst_caps_get_structure (othercaps, 0); - - from_par = gst_structure_get_value (ins, "pixel-aspect-ratio"); - to_par = gst_structure_get_value (outs, "pixel-aspect-ratio"); - - /* If we're fixating from the sinkpad we always set the PAR and - * assume that missing PAR on the sinkpad means 1/1 and - * missing PAR on the srcpad means undefined - */ - if (direction == GST_PAD_SINK) { - if (!from_par) { - g_value_init (&fpar, GST_TYPE_FRACTION); - gst_value_set_fraction (&fpar, 1, 1); - from_par = &fpar; - } - if (!to_par) { - g_value_init (&tpar, GST_TYPE_FRACTION_RANGE); - gst_value_set_fraction_range_full (&tpar, 1, G_MAXINT, G_MAXINT, 1); - to_par = &tpar; - } - } else { - if (!to_par) { - g_value_init (&tpar, GST_TYPE_FRACTION); - gst_value_set_fraction (&tpar, 1, 1); - to_par = &tpar; - - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, - NULL); - } - if (!from_par) { - g_value_init (&fpar, GST_TYPE_FRACTION); - gst_value_set_fraction (&fpar, 1, 1); - from_par = &fpar; - } - } - - /* we have both PAR but they might not be fixated */ - { - gint from_w, from_h, from_par_n, from_par_d, to_par_n, to_par_d; - gint w = 0, h = 0; - gint from_dar_n, from_dar_d; - gint num, den; - - /* from_par should be fixed */ - g_return_val_if_fail (gst_value_is_fixed (from_par), othercaps); - - from_par_n = gst_value_get_fraction_numerator (from_par); - from_par_d = 
gst_value_get_fraction_denominator (from_par); - - gst_structure_get_int (ins, "width", &from_w); - gst_structure_get_int (ins, "height", &from_h); - - gst_structure_get_int (outs, "width", &w); - gst_structure_get_int (outs, "height", &h); - - /* if both width and height are already fixed, we can't do anything - * about it anymore */ - if (w && h) { - guint n, d; - - GST_DEBUG_OBJECT (base, "dimensions already set to %dx%d, not fixating", - w, h); - if (!gst_value_is_fixed (to_par)) { - if (gst_video_calculate_display_ratio (&n, &d, from_w, from_h, - from_par_n, from_par_d, w, h)) { - GST_DEBUG_OBJECT (base, "fixating to_par to %dx%d", n, d); - if (gst_structure_has_field (outs, "pixel-aspect-ratio")) - gst_structure_fixate_field_nearest_fraction (outs, - "pixel-aspect-ratio", n, d); - else if (n != d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - n, d, NULL); - } - } - goto done; - } - - /* Calculate input DAR */ - if (!gst_util_fraction_multiply (from_w, from_h, from_par_n, from_par_d, - &from_dar_n, &from_dar_d)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - GST_DEBUG_OBJECT (base, "Input DAR is %d/%d", from_dar_n, from_dar_d); - - /* If either width or height is fixed there's not much we - * can do either except choosing a height or width and PAR - * that matches the DAR as well as possible - */ - if (h) { - GstStructure *tmp; - gint set_w, set_par_n, set_par_d; - - GST_DEBUG_OBJECT (base, "height is fixed (%d)", h); - - /* If the PAR is fixed too, there's not much to do - * except choosing the width that is nearest to the - * width with the same DAR */ - if (gst_value_is_fixed (to_par)) { - to_par_n = gst_value_get_fraction_numerator (to_par); - to_par_d = gst_value_get_fraction_denominator (to_par); - - GST_DEBUG_OBJECT (base, "PAR is fixed %d/%d", to_par_n, to_par_d); - - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, 
to_par_d, - to_par_n, &num, &den)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - w = (guint) gst_util_uint64_scale_int_round (h, num, den); - gst_structure_fixate_field_nearest_int (outs, "width", w); - - goto done; - } - - /* The PAR is not fixed and it's quite likely that we can set - * an arbitrary PAR. */ - - /* Check if we can keep the input width */ - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int (tmp, "width", from_w); - gst_structure_get_int (tmp, "width", &set_w); - - /* Might have failed but try to keep the DAR nonetheless by - * adjusting the PAR */ - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, h, set_w, - &to_par_n, &to_par_d)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - gst_structure_free (tmp); - goto done; - } - - if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) - gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); - gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", - to_par_n, to_par_d); - gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, - &set_par_d); - gst_structure_free (tmp); - - /* Check if the adjusted PAR is accepted */ - if (set_par_n == to_par_n && set_par_d == to_par_d) { - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "width", G_TYPE_INT, set_w, - "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, - NULL); - goto done; - } - - /* Otherwise scale the width to the new PAR and check if the - * adjusted width is accepted. 
If all that fails we can't keep - * the DAR */ - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, - set_par_n, &num, &den)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - w = (guint) gst_util_uint64_scale_int_round (h, num, den); - gst_structure_fixate_field_nearest_int (outs, "width", w); - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - set_par_n, set_par_d, NULL); - - goto done; - } else if (w) { - GstStructure *tmp; - gint set_h, set_par_n, set_par_d; - - GST_DEBUG_OBJECT (base, "width is fixed (%d)", w); - - /* If the PAR is fixed too, there's not much to do - * except choosing the height that is nearest to the - * height with the same DAR */ - if (gst_value_is_fixed (to_par)) { - to_par_n = gst_value_get_fraction_numerator (to_par); - to_par_d = gst_value_get_fraction_denominator (to_par); - - GST_DEBUG_OBJECT (base, "PAR is fixed %d/%d", to_par_n, to_par_d); - - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, - to_par_n, &num, &den)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - h = (guint) gst_util_uint64_scale_int_round (w, den, num); - gst_structure_fixate_field_nearest_int (outs, "height", h); - - goto done; - } - - /* The PAR is not fixed and it's quite likely that we can set - * an arbitrary PAR. 
*/ - - /* Check if we can keep the input height */ - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int (tmp, "height", from_h); - gst_structure_get_int (tmp, "height", &set_h); - - /* Might have failed but try to keep the DAR nonetheless by - * adjusting the PAR */ - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, w, - &to_par_n, &to_par_d)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - gst_structure_free (tmp); - goto done; - } - if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) - gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); - gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", - to_par_n, to_par_d); - gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, - &set_par_d); - gst_structure_free (tmp); - - /* Check if the adjusted PAR is accepted */ - if (set_par_n == to_par_n && set_par_d == to_par_d) { - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "height", G_TYPE_INT, set_h, - "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, - NULL); - goto done; - } - - /* Otherwise scale the height to the new PAR and check if the - * adjusted height is accepted. 
If all that fails we can't keep - * the DAR */ - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, - set_par_n, &num, &den)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - h = (guint) gst_util_uint64_scale_int_round (w, den, num); - gst_structure_fixate_field_nearest_int (outs, "height", h); - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - set_par_n, set_par_d, NULL); - - goto done; - } else if (gst_value_is_fixed (to_par)) { - GstStructure *tmp; - gint set_h, set_w, f_h, f_w; - - to_par_n = gst_value_get_fraction_numerator (to_par); - to_par_d = gst_value_get_fraction_denominator (to_par); - - /* Calculate scale factor for the PAR change */ - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_n, - to_par_d, &num, &den)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - /* Try to keep the input height (because of interlacing) */ - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int (tmp, "height", from_h); - gst_structure_get_int (tmp, "height", &set_h); - - /* This might have failed but try to scale the width - * to keep the DAR nonetheless */ - w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); - gst_structure_fixate_field_nearest_int (tmp, "width", w); - gst_structure_get_int (tmp, "width", &set_w); - gst_structure_free (tmp); - - /* We kept the DAR and the height is nearest to the original height */ - if (set_w == w) { - gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", - G_TYPE_INT, set_h, NULL); - goto done; - } - - f_h = set_h; - f_w = set_w; - - /* If the former failed, try to keep the input width at least */ - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int 
(tmp, "width", from_w); - gst_structure_get_int (tmp, "width", &set_w); - - /* This might have failed but try to scale the height - * to keep the DAR nonetheless */ - h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); - gst_structure_fixate_field_nearest_int (tmp, "height", h); - gst_structure_get_int (tmp, "height", &set_h); - gst_structure_free (tmp); - - /* We kept the DAR and the width is nearest to the original width */ - if (set_h == h) { - gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", - G_TYPE_INT, set_h, NULL); - goto done; - } - - /* If all this failed, keep the dimensions with the DAR that was closest - * to the correct DAR. This changes the DAR but there's not much else to - * do here. - */ - if (set_w * ABS (set_h - h) < ABS (f_w - w) * f_h) { - f_h = set_h; - f_w = set_w; - } - gst_structure_set (outs, "width", G_TYPE_INT, f_w, "height", G_TYPE_INT, - f_h, NULL); - goto done; - } else { - GstStructure *tmp; - gint set_h, set_w, set_par_n, set_par_d, tmp2; - - /* width, height and PAR are not fixed but passthrough is not possible */ - - /* First try to keep the height and width as well as possible - * and scale PAR */ - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int (tmp, "height", from_h); - gst_structure_get_int (tmp, "height", &set_h); - gst_structure_fixate_field_nearest_int (tmp, "width", from_w); - gst_structure_get_int (tmp, "width", &set_w); - - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, set_w, - &to_par_n, &to_par_d)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - gst_structure_free (tmp); - goto done; - } - - if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) - gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); - gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", - to_par_n, to_par_d); - gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, 
- &set_par_d); - gst_structure_free (tmp); - - if (set_par_n == to_par_n && set_par_d == to_par_d) { - gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", - G_TYPE_INT, set_h, NULL); - - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - set_par_n, set_par_d, NULL); - goto done; - } - - /* Otherwise try to scale width to keep the DAR with the set - * PAR and height */ - if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, - set_par_n, &num, &den)) { - GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), - ("Error calculating the output scaled size - integer overflow")); - goto done; - } - - w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int (tmp, "width", w); - gst_structure_get_int (tmp, "width", &tmp2); - gst_structure_free (tmp); - - if (tmp2 == w) { - gst_structure_set (outs, "width", G_TYPE_INT, tmp2, "height", - G_TYPE_INT, set_h, NULL); - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - set_par_n, set_par_d, NULL); - goto done; - } - - /* ... 
or try the same with the height */ - h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); - tmp = gst_structure_copy (outs); - gst_structure_fixate_field_nearest_int (tmp, "height", h); - gst_structure_get_int (tmp, "height", &tmp2); - gst_structure_free (tmp); - - if (tmp2 == h) { - gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", - G_TYPE_INT, tmp2, NULL); - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - set_par_n, set_par_d, NULL); - goto done; - } - - /* If all fails we can't keep the DAR and take the nearest values - * for everything from the first try */ - gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", - G_TYPE_INT, set_h, NULL); - if (gst_structure_has_field (outs, "pixel-aspect-ratio") || - set_par_n != set_par_d) - gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, - set_par_n, set_par_d, NULL); - } - } - -done: - if (from_par == &fpar) - g_value_unset (&fpar); - if (to_par == &tpar) - g_value_unset (&tpar); - - return othercaps; -} - -static GstCaps * -gst_d3d11_color_convert_fixate_caps (GstBaseTransform * base, - GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) -{ - GstCaps *format; - - GST_DEBUG_OBJECT (base, - "trying to fixate othercaps %" GST_PTR_FORMAT " based on caps %" - GST_PTR_FORMAT, othercaps, caps); - - format = gst_d3d11_color_convert_get_fixed_format (base, direction, caps, - othercaps); - - if (gst_caps_is_empty (format)) { - GST_ERROR_OBJECT (base, "Could not convert formats"); - return format; - } - - othercaps = - gst_d3d11_color_convert_fixate_size (base, direction, caps, othercaps); - if (gst_caps_get_size (othercaps) == 1) { - gint i; - const gchar *format_fields[] = { "format", "colorimetry", "chroma-site" }; - GstStructure *format_struct = gst_caps_get_structure (format, 0); - GstStructure *fixated_struct; - - othercaps = gst_caps_make_writable 
(othercaps); - fixated_struct = gst_caps_get_structure (othercaps, 0); - - for (i = 0; i < G_N_ELEMENTS (format_fields); i++) { - if (gst_structure_has_field (format_struct, format_fields[i])) { - gst_structure_set (fixated_struct, format_fields[i], G_TYPE_STRING, - gst_structure_get_string (format_struct, format_fields[i]), NULL); - } else { - gst_structure_remove_field (fixated_struct, format_fields[i]); - } - } - } - gst_caps_unref (format); - - GST_DEBUG_OBJECT (base, "fixated othercaps to %" GST_PTR_FORMAT, othercaps); - - return othercaps; -} - -static gboolean -gst_d3d11_color_convert_filter_meta (GstBaseTransform * trans, - GstQuery * query, GType api, const GstStructure * params) -{ - /* This element cannot pass through the crop meta, because it would convert the - * wrong sub-region of the image, and worse, our output image may not be large - * enough for the crop to be applied later */ - if (api == GST_VIDEO_CROP_META_API_TYPE) - return FALSE; - - /* propose all other metadata upstream */ - return TRUE; -} - -static gboolean -gst_d3d11_color_convert_propose_allocation (GstBaseTransform * trans, - GstQuery * decide_query, GstQuery * query) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstVideoInfo info; - GstBufferPool *pool = NULL; - GstCaps *caps; - guint n_pools, i; - GstStructure *config; - guint size; - GstD3D11AllocationParams *d3d11_params; - - if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, - decide_query, query)) - return FALSE; - - /* passthrough, we're done */ - if (decide_query == NULL) - return TRUE; - - gst_query_parse_allocation (query, &caps, NULL); - - if (caps == NULL) - return FALSE; - - if (!gst_video_info_from_caps (&info, caps)) - return FALSE; - - n_pools = gst_query_get_n_allocation_pools (query); - for (i = 0; i < n_pools; i++) { - gst_query_parse_nth_allocation_pool (query, i, &pool, NULL, NULL, NULL); - if (!GST_IS_D3D11_BUFFER_POOL (pool)) { - gst_object_unref (pool); - pool = NULL; 
- } - } - - if (!pool) - pool = gst_d3d11_buffer_pool_new (filter->device); - - config = gst_buffer_pool_get_config (pool); - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - - d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); - if (!d3d11_params) { - d3d11_params = gst_d3d11_allocation_params_new (filter->device, &info, 0, - D3D11_BIND_SHADER_RESOURCE); - } else { - /* Set bind flag */ - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { - d3d11_params->desc[i].BindFlags |= D3D11_BIND_SHADER_RESOURCE; - } - } - - gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); - gst_d3d11_allocation_params_free (d3d11_params); - - /* size will be updated by d3d11 buffer pool */ - gst_buffer_pool_config_set_params (config, caps, 0, 0, 0); - - if (!gst_buffer_pool_set_config (pool, config)) - goto config_failed; - - gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); - gst_query_add_allocation_meta (query, - GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL); - - size = GST_D3D11_BUFFER_POOL (pool)->buffer_size; - gst_query_add_allocation_pool (query, pool, size, 0, 0); - - gst_object_unref (pool); - - return TRUE; - - /* ERRORS */ -config_failed: - { - GST_ERROR_OBJECT (filter, "failed to set config"); - gst_object_unref (pool); - return FALSE; - } -} - -static gboolean -gst_d3d11_color_convert_decide_allocation (GstBaseTransform * trans, - GstQuery * query) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstCaps *outcaps = NULL; - GstBufferPool *pool = NULL; - guint size, min = 0, max = 0; - GstStructure *config; - GstD3D11AllocationParams *d3d11_params; - gboolean update_pool = FALSE; - GstVideoInfo info; - gint i; - - gst_query_parse_allocation (query, &outcaps, NULL); - - if (!outcaps) - return FALSE; - - if (!gst_video_info_from_caps (&info, outcaps)) - return FALSE; - - size = GST_VIDEO_INFO_SIZE (&info); - - if (gst_query_get_n_allocation_pools 
(query) > 0) { - gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); - if (pool && !GST_IS_D3D11_BUFFER_POOL (pool)) { - gst_object_unref (pool); - pool = NULL; - } - - update_pool = TRUE; - } - - if (!pool) - pool = gst_d3d11_buffer_pool_new (filter->device); - - config = gst_buffer_pool_get_config (pool); - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - - d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); - if (!d3d11_params) { - d3d11_params = gst_d3d11_allocation_params_new (filter->device, &info, 0, - D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET); - } else { - /* Set bind flag */ - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { - d3d11_params->desc[i].BindFlags |= - (D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET); - } - } - - gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); - gst_d3d11_allocation_params_free (d3d11_params); - - gst_buffer_pool_config_set_params (config, outcaps, size, min, max); - gst_buffer_pool_set_config (pool, config); - - size = GST_D3D11_BUFFER_POOL (pool)->buffer_size; - - if (update_pool) - gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); - else - gst_query_add_allocation_pool (query, pool, size, min, max); - - gst_object_unref (pool); - - return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, - query); -} - -static gboolean -create_shader_input_resource (GstD3D11ColorConvert * self, - GstD3D11Device * device, const GstD3D11Format * format, GstVideoInfo * info) -{ - D3D11_TEXTURE2D_DESC texture_desc = { 0, }; - D3D11_SHADER_RESOURCE_VIEW_DESC view_desc = { 0 }; - HRESULT hr; - ID3D11Device *device_handle; - ID3D11Texture2D *tex[GST_VIDEO_MAX_PLANES] = { NULL, }; - ID3D11ShaderResourceView *view[GST_VIDEO_MAX_PLANES] = { NULL, }; - gint i; - - if (self->num_input_view) - return TRUE; - - device_handle = gst_d3d11_device_get_device_handle (device); - - 
texture_desc.MipLevels = 1; - texture_desc.ArraySize = 1; - texture_desc.SampleDesc.Count = 1; - texture_desc.SampleDesc.Quality = 0; - texture_desc.Usage = D3D11_USAGE_DEFAULT; - texture_desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; - - if (format->dxgi_format == DXGI_FORMAT_UNKNOWN) { - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) - break; - - texture_desc.Width = GST_VIDEO_INFO_COMP_WIDTH (info, i); - texture_desc.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); - texture_desc.Format = format->resource_format[i]; - - hr = ID3D11Device_CreateTexture2D (device_handle, - &texture_desc, NULL, &tex[i]); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); - goto error; - } - } - } else { - gboolean is_semiplanar = FALSE; - - if (format->dxgi_format == DXGI_FORMAT_NV12 || - format->dxgi_format == DXGI_FORMAT_P010 || - format->dxgi_format == DXGI_FORMAT_P016) - is_semiplanar = TRUE; - - texture_desc.Width = GST_VIDEO_INFO_WIDTH (info); - texture_desc.Height = GST_VIDEO_INFO_HEIGHT (info); - texture_desc.Format = format->dxgi_format; - - /* semiplanar format resolution should be an even number */ - if (is_semiplanar) { - texture_desc.Width = GST_ROUND_UP_2 (texture_desc.Width); - texture_desc.Height = GST_ROUND_UP_2 (texture_desc.Height); - } - - hr = ID3D11Device_CreateTexture2D (device_handle, - &texture_desc, NULL, &tex[0]); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); - goto error; - } - - if (is_semiplanar) { - ID3D11Resource_AddRef (tex[0]); - tex[1] = tex[0]; - } - } - - view_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; - view_desc.Texture2D.MipLevels = 1; - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) - break; - - view_desc.Format = format->resource_format[i]; - hr = ID3D11Device_CreateShaderResourceView 
(device_handle, - (ID3D11Resource *) tex[i], &view_desc, &view[i]); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR_OBJECT (self, - "Failed to create resource view (0x%x)", (guint) hr); - goto error; - } - } - - self->num_input_view = i; - - GST_DEBUG_OBJECT (self, - "%d shader resource view created", self->num_input_view); - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - self->in_texture[i] = tex[i]; - self->shader_resource_view[i] = view[i]; - } - - return TRUE; - -error: - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (view[i]) - ID3D11ShaderResourceView_Release (view[i]); - } - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (tex[i]) - ID3D11Texture2D_Release (tex[i]); - } - - return FALSE; -} - -static gboolean -create_shader_output_resource (GstD3D11ColorConvert * self, - GstD3D11Device * device, const GstD3D11Format * format, GstVideoInfo * info) -{ - D3D11_TEXTURE2D_DESC texture_desc = { 0, }; - D3D11_RENDER_TARGET_VIEW_DESC view_desc = { 0, }; - HRESULT hr; - ID3D11Device *device_handle; - ID3D11Texture2D *tex[GST_VIDEO_MAX_PLANES] = { NULL, }; - ID3D11RenderTargetView *view[GST_VIDEO_MAX_PLANES] = { NULL, }; - gint i; - - if (self->num_output_view) - return TRUE; - - device_handle = gst_d3d11_device_get_device_handle (device); - - texture_desc.MipLevels = 1; - texture_desc.ArraySize = 1; - texture_desc.SampleDesc.Count = 1; - texture_desc.SampleDesc.Quality = 0; - texture_desc.Usage = D3D11_USAGE_DEFAULT; - texture_desc.BindFlags = - D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE; - - if (format->dxgi_format == DXGI_FORMAT_UNKNOWN) { - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) - break; - - texture_desc.Width = GST_VIDEO_INFO_COMP_WIDTH (info, i); - texture_desc.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); - texture_desc.Format = format->resource_format[i]; - - hr = ID3D11Device_CreateTexture2D (device_handle, - &texture_desc, NULL, &tex[i]); - if (!gst_d3d11_result (hr, 
device)) { - GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); - goto error; - } - } - } else { - gboolean is_semiplanar = FALSE; - - if (format->dxgi_format == DXGI_FORMAT_NV12 || - format->dxgi_format == DXGI_FORMAT_P010 || - format->dxgi_format == DXGI_FORMAT_P016) - is_semiplanar = TRUE; - - texture_desc.Width = GST_VIDEO_INFO_WIDTH (info); - texture_desc.Height = GST_VIDEO_INFO_HEIGHT (info); - texture_desc.Format = format->dxgi_format; - - /* semiplanar format resolution should be an even number */ - if (is_semiplanar) { - texture_desc.Width = GST_ROUND_UP_2 (texture_desc.Width); - texture_desc.Height = GST_ROUND_UP_2 (texture_desc.Height); - } - - hr = ID3D11Device_CreateTexture2D (device_handle, - &texture_desc, NULL, &tex[0]); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); - goto error; - } - - if (is_semiplanar) { - ID3D11Resource_AddRef (tex[0]); - tex[1] = tex[0]; - } - } - - view_desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; - view_desc.Texture2D.MipSlice = 0; - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) - break; - - view_desc.Format = format->resource_format[i]; - hr = ID3D11Device_CreateRenderTargetView (device_handle, - (ID3D11Resource *) tex[i], &view_desc, &view[i]); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR_OBJECT (self, - "Failed to create %dth render target view (0x%x)", i, (guint) hr); - goto error; - } - } - - self->num_output_view = i; - - GST_DEBUG_OBJECT (self, "%d render view created", self->num_output_view); - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - self->out_texture[i] = tex[i]; - self->render_target_view[i] = view[i]; - } - - return TRUE; - -error: - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (view[i]) - ID3D11RenderTargetView_Release (view[i]); - } - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (tex[i]) - ID3D11Texture2D_Release (tex[i]); - } - - 
return FALSE; -} - -static gboolean -gst_d3d11_color_convert_set_info (GstD3D11BaseFilter * filter, - GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, - GstVideoInfo * out_info) -{ - GstD3D11ColorConvert *self = GST_D3D11_COLOR_CONVERT (filter); - const GstVideoInfo *unknown_info; - - if (gst_base_transform_is_passthrough (GST_BASE_TRANSFORM (filter))) - return TRUE; - - gst_d3d11_color_convert_clear_shader_resource (self); - - GST_DEBUG_OBJECT (self, "Setup convert with format %s -> %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)), - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); - - /* if present, these must match */ - if (in_info->interlace_mode != out_info->interlace_mode) - goto format_mismatch; - - /* FIXME: add support border */ - if (in_info->width == out_info->width && in_info->height == out_info->height - && in_info->finfo == out_info->finfo) { - gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (filter), TRUE); - return TRUE; - } - - self->in_d3d11_format = - gst_d3d11_device_format_from_gst (filter->device, - GST_VIDEO_INFO_FORMAT (in_info)); - if (!self->in_d3d11_format) { - unknown_info = in_info; - goto format_unknown; - } - - self->out_d3d11_format = - gst_d3d11_device_format_from_gst (filter->device, - GST_VIDEO_INFO_FORMAT (out_info)); - if (!self->out_d3d11_format) { - unknown_info = out_info; - goto format_unknown; - } - - self->converter = gst_d3d11_color_converter_new (filter->device, - in_info, out_info); - - if (!self->converter) { - GST_ERROR_OBJECT (self, "couldn't set converter"); - return FALSE; - } - - /* setup D3D11_BOX struct for fallback copy */ - self->in_src_box.left = 0; - self->in_src_box.top = 0; - self->in_src_box.front = 0; - self->in_src_box.back = 1; - self->in_src_box.right = GST_VIDEO_INFO_WIDTH (in_info); - self->in_src_box.bottom = GST_VIDEO_INFO_HEIGHT (in_info); - - self->out_src_box.left = 0; - self->out_src_box.top = 0; - self->out_src_box.front = 0; - 
self->out_src_box.back = 1; - self->out_src_box.right = GST_VIDEO_INFO_WIDTH (out_info); - self->out_src_box.bottom = GST_VIDEO_INFO_HEIGHT (out_info); - - return TRUE; - - /* ERRORS */ -format_mismatch: - { - GST_ERROR_OBJECT (self, "input and output formats do not match"); - return FALSE; - } -format_unknown: - { - GST_ERROR_OBJECT (self, - "%s couldn't be converted to d3d11 format", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (unknown_info))); - return FALSE; - } -} - -static GstFlowReturn -gst_d3d11_color_convert_transform (GstBaseTransform * trans, - GstBuffer * inbuf, GstBuffer * outbuf) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstD3D11ColorConvert *self = GST_D3D11_COLOR_CONVERT (trans); - ID3D11DeviceContext *context_handle; - ID3D11ShaderResourceView *resource_view[GST_VIDEO_MAX_PLANES] = { NULL, }; - ID3D11RenderTargetView *render_view[GST_VIDEO_MAX_PLANES] = { NULL, }; - gint i, j, view_index; - gboolean copy_input = FALSE; - gboolean copy_output = FALSE; - GstD3D11Device *device = filter->device; - - context_handle = gst_d3d11_device_get_device_context_handle (device); - - view_index = 0; - for (i = 0; i < gst_buffer_n_memory (inbuf); i++) { - GstMemory *mem = gst_buffer_peek_memory (inbuf, i); - GstD3D11Memory *d3d11_mem; - GstMapInfo info; - - g_assert (gst_is_d3d11_memory (mem)); - - d3d11_mem = (GstD3D11Memory *) mem; - /* map to transfer pending staging data if any */ - if (d3d11_mem->desc.Usage == D3D11_USAGE_DEFAULT) { - gst_memory_map (mem, &info, GST_MAP_READ | GST_MAP_D3D11); - gst_memory_unmap (mem, &info); - } - - if (gst_d3d11_memory_ensure_shader_resource_view (d3d11_mem)) { - GST_TRACE_OBJECT (self, "Use input texture resource without copy"); - - for (j = 0; j < d3d11_mem->num_shader_resource_views; j++) { - resource_view[view_index] = d3d11_mem->shader_resource_view[j]; - view_index++; - } - } else { - GST_TRACE_OBJECT (self, "Render using fallback input texture"); - copy_input = TRUE; - - if 
(!create_shader_input_resource (self, device, - self->in_d3d11_format, &filter->in_info)) { - GST_ERROR_OBJECT (self, "Failed to configure fallback input texture"); - return GST_FLOW_ERROR; - } - break; - } - } - - /* if input memory has no resource view, - * copy texture into our fallback texture */ - if (copy_input) { - gst_d3d11_device_lock (device); - for (i = 0; i < gst_buffer_n_memory (inbuf); i++) { - GstMemory *mem = gst_buffer_peek_memory (inbuf, i); - GstD3D11Memory *d3d11_mem; - - g_assert (gst_is_d3d11_memory (mem)); - - d3d11_mem = (GstD3D11Memory *) mem; - - ID3D11DeviceContext_CopySubresourceRegion (context_handle, - (ID3D11Resource *) self->in_texture[i], 0, 0, 0, 0, - (ID3D11Resource *) d3d11_mem->texture, d3d11_mem->subresource_index, - &self->in_src_box); - } - gst_d3d11_device_unlock (device); - } - - view_index = 0; - for (i = 0; i < gst_buffer_n_memory (outbuf); i++) { - GstMemory *mem = gst_buffer_peek_memory (outbuf, i); - GstD3D11Memory *d3d11_mem; - - g_assert (gst_is_d3d11_memory (mem)); - - d3d11_mem = (GstD3D11Memory *) mem; - - if (gst_d3d11_memory_ensure_render_target_view (d3d11_mem)) { - GST_TRACE_OBJECT (self, "Render to output texture directly"); - - for (j = 0; j < d3d11_mem->num_render_target_views; j++) { - render_view[view_index] = d3d11_mem->render_target_view[j]; - view_index++; - } - } else { - GST_TRACE_OBJECT (self, "Render to fallback output texture"); - copy_output = TRUE; - - if (!create_shader_output_resource (self, device, self->out_d3d11_format, - &filter->out_info)) { - GST_ERROR_OBJECT (self, "Failed to configure fallback output texture"); - return GST_FLOW_ERROR; - } - break; - } - } - - if (!gst_d3d11_color_converter_convert (self->converter, - copy_input ? self->shader_resource_view : resource_view, - copy_output ? 
self->render_target_view : render_view)) { - GST_ERROR_OBJECT (self, "Failed to convert"); - - return GST_FLOW_ERROR; - } - - if (copy_output) { - gst_d3d11_device_lock (device); - for (i = 0; i < gst_buffer_n_memory (outbuf); i++) { - GstMemory *mem = gst_buffer_peek_memory (outbuf, i); - GstD3D11Memory *d3d11_mem; - - g_assert (gst_is_d3d11_memory (mem)); - - d3d11_mem = (GstD3D11Memory *) mem; - - ID3D11DeviceContext_CopySubresourceRegion (context_handle, - (ID3D11Resource *) d3d11_mem->texture, d3d11_mem->subresource_index, - 0, 0, 0, (ID3D11Resource *) self->out_texture[i], 0, - &self->out_src_box); - } - gst_d3d11_device_unlock (device); - } else { - for (i = 0; i < gst_buffer_n_memory (outbuf); i++) { - GstMemory *mem = gst_buffer_peek_memory (outbuf, i); - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - } - } - - return GST_FLOW_OK; -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11colorconvert.h
Deleted
@@ -1,68 +0,0 @@ -/* GStreamer - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifndef __GST_D3D11_COLOR_CONVERT_H__ -#define __GST_D3D11_COLOR_CONVERT_H__ - -#include <gst/gst.h> - -#include "gstd3d11basefilter.h" -#include "gstd3d11colorconverter.h" - -G_BEGIN_DECLS - -#define GST_TYPE_D3D11_COLOR_CONVERT (gst_d3d11_color_convert_get_type()) -#define GST_D3D11_COLOR_CONVERT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_COLOR_CONVERT,GstD3D11ColorConvert)) -#define GST_D3D11_COLOR_CONVERT_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_COLOR_CONVERT,GstD3D11ColorConvertClass)) -#define GST_D3D11_COLOR_CONVERT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_COLOR_CONVERT,GstD3D11ColorConvertClass)) -#define GST_IS_D3D11_COLOR_CONVERT(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_COLOR_CONVERT)) -#define GST_IS_D3D11_COLOR_CONVERT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_COLOR_CONVERT)) - -struct _GstD3D11ColorConvert -{ - GstD3D11BaseFilter parent; - - const GstD3D11Format *in_d3d11_format; - const GstD3D11Format *out_d3d11_format; - - ID3D11Texture2D *in_texture[GST_VIDEO_MAX_PLANES]; - ID3D11ShaderResourceView 
*shader_resource_view[GST_VIDEO_MAX_PLANES]; - guint num_input_view; - - ID3D11Texture2D *out_texture[GST_VIDEO_MAX_PLANES]; - ID3D11RenderTargetView *render_target_view[GST_VIDEO_MAX_PLANES]; - guint num_output_view; - - GstD3D11ColorConverter *converter; - - /* used for fallback texture copy */ - D3D11_BOX in_src_box; - D3D11_BOX out_src_box; -}; - -struct _GstD3D11ColorConvertClass -{ - GstD3D11BaseFilterClass parent_class; -}; - -GType gst_d3d11_color_convert_get_type (void); - -G_END_DECLS - -#endif /* __GST_D3D11_COLOR_CONVERT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11colorconverter.c
Deleted
@@ -1,1432 +0,0 @@ -/* GStreamer - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * Copyright (C) <2019> Jeongki Kim <jeongki.kim@jeongki.kim> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -# include <config.h> -#endif - -#include "gstd3d11colorconverter.h" -#include "gstd3d11utils.h" -#include "gstd3d11device.h" -#include "gstd3d11shader.h" -#include "gstd3d11format.h" - -#include <string.h> - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_colorconverter_debug); -#define GST_CAT_DEFAULT gst_d3d11_colorconverter_debug - -#define CONVERTER_MAX_QUADS 2 - -/* *INDENT-OFF* */ -typedef struct -{ - FLOAT trans_matrix[12]; - FLOAT padding[4]; -} PixelShaderColorTransform; - -typedef struct -{ - struct { - FLOAT x; - FLOAT y; - FLOAT z; - } position; - struct { - FLOAT x; - FLOAT y; - } texture; -} VertexData; - -typedef struct -{ - const gchar *constant_buffer; - const gchar *func; -} PixelShaderTemplate; - -#define COLOR_TRANSFORM_COEFF \ - "cbuffer PixelShaderColorTransform : register(b0)\n" \ - "{\n" \ - " float3x4 trans_matrix;\n" \ - " float3 padding;\n" \ - "};\n" - -#define HLSL_FUNC_YUV_TO_RGB \ - "float3 yuv_to_rgb (float3 yuv)\n" \ - "{\n" \ - " yuv += float3(-0.062745f, -0.501960f, -0.501960f);\n" \ - " yuv = mul(yuv, 
trans_matrix);\n" \ - " return saturate(yuv);\n" \ - "}\n" - -#define HLSL_FUNC_RGB_TO_YUV \ - "float3 rgb_to_yuv (float3 rgb)\n" \ - "{\n" \ - " float3 yuv;\n" \ - " yuv = mul(rgb, trans_matrix);\n" \ - " yuv += float3(0.062745f, 0.501960f, 0.501960f);\n" \ - " return saturate(yuv);\n" \ - "}\n" - -static const PixelShaderTemplate templ_REORDER = - { NULL, NULL }; - -static const PixelShaderTemplate templ_YUV_to_RGB = - { COLOR_TRANSFORM_COEFF, HLSL_FUNC_YUV_TO_RGB }; - -static const PixelShaderTemplate templ_RGB_to_YUV = - { COLOR_TRANSFORM_COEFF, HLSL_FUNC_RGB_TO_YUV }; - -static const gchar templ_REORDER_BODY[] = - " output.Plane_0 = shaderTexture[0].Sample(samplerState, input.Texture);\n"; - -static const gchar templ_VUYA_to_RGB_BODY[] = - " float4 sample, rgba;\n" - " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).z;\n" - " sample.y = shaderTexture[0].Sample(samplerState, input.Texture).y;\n" - " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).x;\n" - " sample.a = shaderTexture[0].Sample(samplerState, input.Texture).a;\n" - " rgba.rgb = yuv_to_rgb (sample.xyz);\n" - " rgba.a = sample.a;\n" - " output.Plane_0 = rgba;\n"; - -static const gchar templ_RGB_to_VUYA_BODY[] = - " float4 sample, vuya;\n" - " sample = shaderTexture[0].Sample(samplerState, input.Texture);\n" - " vuya.zyx = rgb_to_yuv (sample.rgb);\n" - " vuya.a = sample.a;\n" - " output.Plane_0 = vuya;\n"; - -/* YUV to RGB conversion */ -static const gchar templ_PLANAR_YUV_to_RGB_BODY[] = - " float4 sample, rgba;\n" - " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x * %d;\n" - " sample.y = shaderTexture[1].Sample(samplerState, input.Texture).x * %d;\n" - " sample.z = shaderTexture[2].Sample(samplerState, input.Texture).x * %d;\n" - " rgba.rgb = yuv_to_rgb (sample.xyz);\n" - " rgba.a = 1.0;\n" - " output.Plane_0 = rgba;\n"; - -static const gchar templ_SEMI_PLANAR_to_RGB_BODY[] = - " float4 sample, rgba;\n" - " sample.x = 
shaderTexture[0].Sample(samplerState, input.Texture).x;\n" - " sample.yz = shaderTexture[1].Sample(samplerState, input.Texture).xy;\n" - " rgba.rgb = yuv_to_rgb (sample.xyz);\n" - " rgba.a = 1.0;\n" - " output.Plane_0 = rgba;\n"; - -/* RGB to YUV conversion */ -static const gchar templ_RGB_to_LUMA_BODY[] = - " float4 sample, rgba;\n" - " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n" - " sample.xyz = rgb_to_yuv (rgba.rgb);\n" - " sample.y = 0.0;\n" - " sample.z = 0.0;\n" - " sample.a = 0.0;\n" - " sample.x = sample.x / %d;\n" - " output.Plane_0 = sample;\n"; - -static const gchar templ_RGB_to_SEMI_PLANAR_CHROMA_BODY[] = - " float4 sample, rgba;\n" - " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n" - " sample.xyz = rgb_to_yuv (rgba.rgb);\n" - " sample.x = sample.y;\n" - " sample.y = sample.z;\n" - " sample.z = 0.0;\n" - " sample.a = 0.0;\n" - " output.Plane_0 = sample;\n"; - -static const gchar templ_RGB_to_PLANAR_CHROMA_BODY[] = - " float4 sample, rgba;\n" - " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n" - " sample.xyz = rgb_to_yuv (rgba.rgb);\n" - " output.Plane_0 = float4(sample.y / %d, 0.0, 0.0, 0.0);\n" - " output.Plane_1 = float4(sample.z / %d, 0.0, 0.0, 0.0);\n"; - -/* YUV to YUV conversion */ -static const gchar templ_LUMA_to_LUMA_BODY[] = - " float4 sample;\n" - " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x * %d;\n" - " output.Plane_0 = float4(sample.x / %d, 0.0, 0.0, 0.0);\n"; - -static const gchar templ_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY[] = - " float4 sample;\n" - " sample.y = shaderTexture[1].Sample(samplerState, input.Texture).x * %d;\n" - " sample.z = shaderTexture[2].Sample(samplerState, input.Texture).x * %d;\n" - " output.Plane_0 = float4(sample.yz, 0.0, 0.0);\n"; - -static const gchar templ_SEMI_PLANAR_TO_PLANAR_CHROMA_BODY[] = - " float4 sample;\n" - " sample.yz = shaderTexture[1].Sample(samplerState, input.Texture).xy;\n" - " output.Plane_0 = 
float4(sample.y / %d, 0.0, 0.0, 0.0);\n" - " output.Plane_1 = float4(sample.z / %d, 0.0, 0.0, 0.0);\n"; - -static const gchar templ_SEMI_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY[] = - " float4 sample;\n" - " sample.yz = shaderTexture[1].Sample(samplerState, input.Texture).xy;\n" - " output.Plane_0 = float4(sample.yz, 0.0, 0.0);\n"; - -static const gchar templ_PLANAR_TO_PLANAR_CHROMA_BODY[] = - " float4 sample;\n" - " sample.y = shaderTexture[1].Sample(samplerState, input.Texture).x * %d;\n" - " sample.z = shaderTexture[2].Sample(samplerState, input.Texture).x * %d;\n" - " output.Plane_0 = float4(sample.y / %d, 0.0, 0.0, 0.0);\n" - " output.Plane_1 = float4(sample.z / %d, 0.0, 0.0, 0.0);\n"; - -/* VUYA to YUV */ -static const gchar templ_VUYA_to_LUMA_BODY[] = - " float4 sample;\n" - " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).z;\n" - " output.Plane_0 = float4(sample.x / %d, 0.0, 0.0, 0.0);\n"; - -static const gchar templ_VUYA_TO_PLANAR_CHROMA_BODY[] = - " float4 sample;\n" - " sample.yz = shaderTexture[0].Sample(samplerState, input.Texture).yx;\n" - " output.Plane_0 = float4(sample.y / %d, 0.0, 0.0, 0.0);\n" - " output.Plane_1 = float4(sample.z / %d, 0.0, 0.0, 0.0);\n"; - -static const gchar templ_VUYA_TO_SEMI_PLANAR_CHROMA_BODY[] = - " float4 sample;\n" - " sample.yz = shaderTexture[0].Sample(samplerState, input.Texture).yx;\n" - " output.Plane_0 = float4(sample.yz, 0.0, 0.0);\n"; - -/* YUV to VUYA */ -static const gchar templ_PLANAR_to_VUYA_BODY[] = - " float4 sample;\n" - " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).x * %d;\n" - " sample.y = shaderTexture[1].Sample(samplerState, input.Texture).x * %d;\n" - " sample.x = shaderTexture[2].Sample(samplerState, input.Texture).x * %d;\n" - " output.Plane_0 = float4(sample.xyz, 1.0f);\n"; - -static const gchar templ_SEMI_PLANAR_to_VUYA_BODY[] = - " float4 sample;\n" - " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).x;\n" - " sample.xy = 
shaderTexture[1].Sample(samplerState, input.Texture).yx;\n" - " output.Plane_0 = float4(sample.xyz, 1.0f);\n"; - -static const gchar templ_pixel_shader[] = - /* constant buffer */ - "%s\n" - "Texture2D shaderTexture[4];\n" - "SamplerState samplerState;\n" - "\n" - "struct PS_INPUT\n" - "{\n" - " float4 Position: SV_POSITION;\n" - " float3 Texture: TEXCOORD0;\n" - "};\n" - "\n" - "struct PS_OUTPUT\n" - "{\n" - " float4 Plane_0: SV_TARGET0;\n" - " float4 Plane_1: SV_TARGET1;\n" - "};\n" - "\n" - /* rgb <-> yuv function */ - "%s\n" - "PS_OUTPUT main(PS_INPUT input)\n" - "{\n" - " PS_OUTPUT output;\n" - "%s" - " return output;\n" - "}\n"; - -static const gchar templ_vertex_shader[] = - "struct VS_INPUT\n" - "{\n" - " float4 Position : POSITION;\n" - " float4 Texture : TEXCOORD0;\n" - "};\n" - "\n" - "struct VS_OUTPUT\n" - "{\n" - " float4 Position: SV_POSITION;\n" - " float4 Texture: TEXCOORD0;\n" - "};\n" - "\n" - "VS_OUTPUT main(VS_INPUT input)\n" - "{\n" - " return input;\n" - "}\n"; - -/* *INDENT-ON* */ - -typedef struct -{ - const PixelShaderTemplate *templ; - gchar *ps_body[CONVERTER_MAX_QUADS]; - PixelShaderColorTransform transform; -} ConvertInfo; - -struct _GstD3D11ColorConverter -{ - GstD3D11Device *device; - GstVideoInfo in_info; - GstVideoInfo out_info; - - const GstD3D11Format *in_d3d11_format; - const GstD3D11Format *out_d3d11_format; - - guint num_input_view; - guint num_output_view; - - GstD3D11Quad *quad[CONVERTER_MAX_QUADS]; - - D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES]; - - RECT crop_rect; - gint input_texture_width; - gint input_texture_height; - ID3D11Buffer *vertex_buffer; - gboolean update_vertex; - - ConvertInfo convert_info; -}; - -/* from video-converter.c */ -typedef struct -{ - gfloat dm[4][4]; -} MatrixData; - -static void -color_matrix_set_identity (MatrixData * m) -{ - gint i, j; - - for (i = 0; i < 4; i++) { - for (j = 0; j < 4; j++) { - m->dm[i][j] = (i == j); - } - } -} - -static void -color_matrix_copy (MatrixData * d, const 
MatrixData * s) -{ - gint i, j; - - for (i = 0; i < 4; i++) - for (j = 0; j < 4; j++) - d->dm[i][j] = s->dm[i][j]; -} - -/* Perform 4x4 matrix multiplication: - * - @dst@ = @a@ * @b@ - * - @dst@ may be a pointer to @a@ and/or @b@ - */ -static void -color_matrix_multiply (MatrixData * dst, MatrixData * a, MatrixData * b) -{ - MatrixData tmp; - gint i, j, k; - - for (i = 0; i < 4; i++) { - for (j = 0; j < 4; j++) { - gfloat x = 0; - for (k = 0; k < 4; k++) { - x += a->dm[i][k] * b->dm[k][j]; - } - tmp.dm[i][j] = x; - } - } - color_matrix_copy (dst, &tmp); -} - -static void -color_matrix_offset_components (MatrixData * m, gfloat a1, gfloat a2, gfloat a3) -{ - MatrixData a; - - color_matrix_set_identity (&a); - a.dm[0][3] = a1; - a.dm[1][3] = a2; - a.dm[2][3] = a3; - color_matrix_multiply (m, &a, m); -} - -static void -color_matrix_scale_components (MatrixData * m, gfloat a1, gfloat a2, gfloat a3) -{ - MatrixData a; - - color_matrix_set_identity (&a); - a.dm[0][0] = a1; - a.dm[1][1] = a2; - a.dm[2][2] = a3; - color_matrix_multiply (m, &a, m); -} - -static void -color_matrix_debug (GstD3D11ColorConverter * self, const MatrixData * s) -{ - GST_DEBUG ("[%f %f %f %f]", - s->dm[0][0], s->dm[0][1], s->dm[0][2], s->dm[0][3]); - GST_DEBUG ("[%f %f %f %f]", - s->dm[1][0], s->dm[1][1], s->dm[1][2], s->dm[1][3]); - GST_DEBUG ("[%f %f %f %f]", - s->dm[2][0], s->dm[2][1], s->dm[2][2], s->dm[2][3]); - GST_DEBUG ("[%f %f %f %f]", - s->dm[3][0], s->dm[3][1], s->dm[3][2], s->dm[3][3]); -} - -static void -color_matrix_YCbCr_to_RGB (MatrixData * m, gfloat Kr, gfloat Kb) -{ - gfloat Kg = 1.0 - Kr - Kb; - MatrixData k = { - { - {1., 0., 2 * (1 - Kr), 0.}, - {1., -2 * Kb * (1 - Kb) / Kg, -2 * Kr * (1 - Kr) / Kg, 0.}, - {1., 2 * (1 - Kb), 0., 0.}, - {0., 0., 0., 1.}, - } - }; - - color_matrix_multiply (m, &k, m); -} - -static void -color_matrix_RGB_to_YCbCr (MatrixData * m, gfloat Kr, gfloat Kb) -{ - gfloat Kg = 1.0 - Kr - Kb; - MatrixData k; - gfloat x; - - k.dm[0][0] = Kr; - k.dm[0][1] = 
Kg; - k.dm[0][2] = Kb; - k.dm[0][3] = 0; - - x = 1 / (2 * (1 - Kb)); - k.dm[1][0] = -x * Kr; - k.dm[1][1] = -x * Kg; - k.dm[1][2] = x * (1 - Kb); - k.dm[1][3] = 0; - - x = 1 / (2 * (1 - Kr)); - k.dm[2][0] = x * (1 - Kr); - k.dm[2][1] = -x * Kg; - k.dm[2][2] = -x * Kb; - k.dm[2][3] = 0; - - k.dm[3][0] = 0; - k.dm[3][1] = 0; - k.dm[3][2] = 0; - k.dm[3][3] = 1; - - color_matrix_multiply (m, &k, m); -} - -static void -compute_matrix_to_RGB (GstD3D11ColorConverter * self, MatrixData * data, - GstVideoInfo * info) -{ - gdouble Kr = 0, Kb = 0; - gint offset[4], scale[4]; - - /* bring color components to [0..1.0] range */ - gst_video_color_range_offsets (info->colorimetry.range, info->finfo, offset, - scale); - - color_matrix_offset_components (data, -offset[0], -offset[1], -offset[2]); - color_matrix_scale_components (data, 1 / ((float) scale[0]), - 1 / ((float) scale[1]), 1 / ((float) scale[2])); - - if (!GST_VIDEO_INFO_IS_RGB (info)) { - /* bring components to R'G'B' space */ - if (gst_video_color_matrix_get_Kr_Kb (info->colorimetry.matrix, &Kr, &Kb)) - color_matrix_YCbCr_to_RGB (data, Kr, Kb); - } - color_matrix_debug (self, data); -} - -static void -compute_matrix_to_YUV (GstD3D11ColorConverter * self, MatrixData * data, - GstVideoInfo * info) -{ - gdouble Kr = 0, Kb = 0; - gint offset[4], scale[4]; - - if (!GST_VIDEO_INFO_IS_RGB (info)) { - /* bring components to YCbCr space */ - if (gst_video_color_matrix_get_Kr_Kb (info->colorimetry.matrix, &Kr, &Kb)) - color_matrix_RGB_to_YCbCr (data, Kr, Kb); - } - - /* bring color components to nominal range */ - gst_video_color_range_offsets (info->colorimetry.range, info->finfo, offset, - scale); - - color_matrix_scale_components (data, (float) scale[0], (float) scale[1], - (float) scale[2]); - color_matrix_offset_components (data, offset[0], offset[1], offset[2]); - - color_matrix_debug (self, data); -} - -static gboolean -converter_get_matrix (GstD3D11ColorConverter * self, MatrixData * matrix, - GstVideoInfo * in_info, 
GstVideoInfo * out_info) -{ - gboolean same_matrix; - guint in_bits, out_bits; - - in_bits = GST_VIDEO_INFO_COMP_DEPTH (in_info, 0); - out_bits = GST_VIDEO_INFO_COMP_DEPTH (out_info, 0); - - same_matrix = in_info->colorimetry.matrix == out_info->colorimetry.matrix; - - GST_DEBUG ("matrix %d -> %d (%d)", in_info->colorimetry.matrix, - out_info->colorimetry.matrix, same_matrix); - - color_matrix_set_identity (matrix); - - if (same_matrix) { - GST_DEBUG ("conversion matrix is not required"); - return FALSE; - } - - if (in_bits < out_bits) { - gint scale = 1 << (out_bits - in_bits); - color_matrix_scale_components (matrix, - 1 / (float) scale, 1 / (float) scale, 1 / (float) scale); - } - - GST_DEBUG ("to RGB matrix"); - compute_matrix_to_RGB (self, matrix, in_info); - GST_DEBUG ("current matrix"); - color_matrix_debug (self, matrix); - - GST_DEBUG ("to YUV matrix"); - compute_matrix_to_YUV (self, matrix, out_info); - GST_DEBUG ("current matrix"); - color_matrix_debug (self, matrix); - - if (in_bits > out_bits) { - gint scale = 1 << (in_bits - out_bits); - color_matrix_scale_components (matrix, - (float) scale, (float) scale, (float) scale); - } - - GST_DEBUG ("final matrix"); - color_matrix_debug (self, matrix); - - return TRUE; -} - -static gboolean -setup_convert_info_rgb_to_rgb (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *convert_info = &self->convert_info; - - convert_info->templ = &templ_REORDER; - convert_info->ps_body[0] = g_strdup_printf (templ_REORDER_BODY); - - return TRUE; -} - -static gboolean -setup_convert_info_yuv_to_rgb (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - - info->templ = &templ_YUV_to_RGB; - - switch (GST_VIDEO_INFO_FORMAT (in_info)) { - case GST_VIDEO_FORMAT_VUYA: - info->ps_body[0] = g_strdup_printf (templ_VUYA_to_RGB_BODY); - break; - case GST_VIDEO_FORMAT_I420: - 
info->ps_body[0] = - g_strdup_printf (templ_PLANAR_YUV_to_RGB_BODY, 1, 1, 1); - break; - case GST_VIDEO_FORMAT_I420_10LE: - info->ps_body[0] = - g_strdup_printf (templ_PLANAR_YUV_to_RGB_BODY, 64, 64, 64); - break; - case GST_VIDEO_FORMAT_NV12: - case GST_VIDEO_FORMAT_P010_10LE: - case GST_VIDEO_FORMAT_P016_LE: - info->ps_body[0] = g_strdup_printf (templ_SEMI_PLANAR_to_RGB_BODY); - break; - default: - GST_FIXME_OBJECT (self, - "Unhandled input format %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info))); - return FALSE; - } - - return TRUE; -} - -static gboolean -setup_convert_info_rgb_to_yuv (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - - info->templ = &templ_RGB_to_YUV; - - switch (GST_VIDEO_INFO_FORMAT (out_info)) { - case GST_VIDEO_FORMAT_VUYA: - info->ps_body[0] = g_strdup_printf (templ_RGB_to_VUYA_BODY); - break; - case GST_VIDEO_FORMAT_NV12: - case GST_VIDEO_FORMAT_P010_10LE: - case GST_VIDEO_FORMAT_P016_LE: - info->ps_body[0] = g_strdup_printf (templ_RGB_to_LUMA_BODY, 1); - info->ps_body[1] = g_strdup_printf (templ_RGB_to_SEMI_PLANAR_CHROMA_BODY); - break; - case GST_VIDEO_FORMAT_I420: - info->ps_body[0] = g_strdup_printf (templ_RGB_to_LUMA_BODY, 1); - info->ps_body[1] = - g_strdup_printf (templ_RGB_to_PLANAR_CHROMA_BODY, 1, 1); - break; - case GST_VIDEO_FORMAT_I420_10LE: - info->ps_body[0] = g_strdup_printf (templ_RGB_to_LUMA_BODY, 64); - info->ps_body[1] = - g_strdup_printf (templ_RGB_to_PLANAR_CHROMA_BODY, 64, 64); - break; - default: - GST_FIXME_OBJECT (self, - "Unhandled output format %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); - return FALSE; - } - - return TRUE; -} - -static gboolean -setup_convert_info_planar_to_planar (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint mul = 1; - gint div = 1; - - info->templ = 
&templ_REORDER; - - if (GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_I420_10LE) - mul = 64; - - if (GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_I420_10LE) - div = 64; - - info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, mul, div); - info->ps_body[1] = - g_strdup_printf (templ_PLANAR_TO_PLANAR_CHROMA_BODY, mul, mul, div, div); - - return TRUE; -} - -static gboolean -setup_convert_info_planar_to_semi_planar (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint mul = 1; - gint div = 1; - - info->templ = &templ_REORDER; - - if (GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_I420_10LE) - mul = 64; - - info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, mul, div); - info->ps_body[1] = - g_strdup_printf (templ_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY, mul, mul); - - return TRUE; -} - -static gboolean -setup_convert_info_semi_planar_to_planar (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint mul = 1; - gint div = 1; - - info->templ = &templ_REORDER; - - if (GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_I420_10LE) - div = 64; - - info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, mul, div); - info->ps_body[1] = - g_strdup_printf (templ_SEMI_PLANAR_TO_PLANAR_CHROMA_BODY, div, div); - - return TRUE; -} - -static gboolean -setup_convert_info_semi_planar_to_semi_planar (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint mul = 1; - gint div = 1; - - info->templ = &templ_REORDER; - - info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, mul, div); - info->ps_body[1] = - g_strdup_printf (templ_SEMI_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY); - - return TRUE; -} - -static gboolean -setup_convert_info_vuya_to_vuya (GstD3D11ColorConverter * self, - 
const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - - info->templ = &templ_REORDER; - - info->ps_body[0] = g_strdup_printf (templ_REORDER_BODY); - - return TRUE; -} - -static gboolean -setup_convert_info_vuya_to_planar (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint div = 1; - - info->templ = &templ_REORDER; - - if (GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_I420_10LE) - div = 64; - - info->ps_body[0] = g_strdup_printf (templ_VUYA_to_LUMA_BODY, div); - info->ps_body[1] = - g_strdup_printf (templ_VUYA_TO_PLANAR_CHROMA_BODY, div, div); - - return TRUE; -} - -static gboolean -setup_convert_info_vuya_to_semi_planar (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint div = 1; - - info->templ = &templ_REORDER; - - info->ps_body[0] = g_strdup_printf (templ_VUYA_to_LUMA_BODY, div); - info->ps_body[1] = g_strdup_printf (templ_VUYA_TO_SEMI_PLANAR_CHROMA_BODY); - - return TRUE; -} - -static gboolean -setup_convert_info_planar_to_vuya (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - gint mul = 1; - - info->templ = &templ_REORDER; - - if (GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_I420_10LE) - mul = 64; - - info->ps_body[0] = g_strdup_printf (templ_PLANAR_to_VUYA_BODY, mul, mul, mul); - - return TRUE; -} - -static gboolean -setup_convert_info_semi_planar_to_vuya (GstD3D11ColorConverter * self, - const GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - ConvertInfo *info = &self->convert_info; - - info->templ = &templ_REORDER; - - info->ps_body[0] = g_strdup_printf (templ_SEMI_PLANAR_to_VUYA_BODY); - - return TRUE; -} - -static gboolean -setup_convert_info_yuv_to_yuv (GstD3D11ColorConverter * self, - const 
GstVideoInfo * in_info, const GstVideoInfo * out_info) -{ - gboolean in_planar, out_planar; - gboolean in_vuya, out_vuya; - - in_vuya = GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_VUYA; - out_vuya = GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_VUYA; - in_planar = (GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_I420 || - GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_I420_10LE); - out_planar = (GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_I420 || - GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_I420_10LE); - - if (in_vuya && out_vuya) { - return setup_convert_info_vuya_to_vuya (self, in_info, out_info); - } else if (in_vuya) { - if (out_planar) - return setup_convert_info_vuya_to_planar (self, in_info, out_info); - else - return setup_convert_info_vuya_to_semi_planar (self, in_info, out_info); - } else if (out_vuya) { - if (in_planar) - return setup_convert_info_planar_to_vuya (self, in_info, out_info); - else - return setup_convert_info_semi_planar_to_vuya (self, in_info, out_info); - } - - if (in_planar) { - if (out_planar) - return setup_convert_info_planar_to_planar (self, in_info, out_info); - else - return setup_convert_info_planar_to_semi_planar (self, in_info, out_info); - } else { - if (out_planar) - return setup_convert_info_semi_planar_to_planar (self, in_info, out_info); - else - return setup_convert_info_semi_planar_to_semi_planar (self, in_info, - out_info); - } - - return FALSE; -} - -static gboolean -gst_d3d11_color_convert_setup_shader (GstD3D11ColorConverter * self, - GstD3D11Device * device, GstVideoInfo * in_info, GstVideoInfo * out_info) -{ - HRESULT hr; - D3D11_SAMPLER_DESC sampler_desc = { 0, }; - D3D11_INPUT_ELEMENT_DESC input_desc[2] = { 0, }; - D3D11_BUFFER_DESC buffer_desc = { 0, }; - D3D11_MAPPED_SUBRESOURCE map; - VertexData *vertex_data; - WORD *indices; - ID3D11Device *device_handle; - ID3D11DeviceContext *context_handle; - ConvertInfo *convert_info = &self->convert_info; - ID3D11PixelShader 
*ps[CONVERTER_MAX_QUADS] = { NULL, NULL }; - ID3D11VertexShader *vs = NULL; - ID3D11InputLayout *layout = NULL; - ID3D11SamplerState *sampler = NULL; - ID3D11Buffer *const_buffer = NULL; - ID3D11Buffer *vertex_buffer = NULL; - ID3D11Buffer *index_buffer = NULL; - const guint index_count = 2 * 3; - gboolean ret = TRUE; - gint i; - - device_handle = gst_d3d11_device_get_device_handle (device); - context_handle = gst_d3d11_device_get_device_context_handle (device); - - /* bilinear filtering */ - sampler_desc.Filter = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; - sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP; - sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP; - sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP; - sampler_desc.ComparisonFunc = D3D11_COMPARISON_ALWAYS; - sampler_desc.MinLOD = 0; - sampler_desc.MaxLOD = D3D11_FLOAT32_MAX; - - hr = ID3D11Device_CreateSamplerState (device_handle, &sampler_desc, &sampler); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create sampler state, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - for (i = 0; i < CONVERTER_MAX_QUADS; i++) { - gchar *shader_code = NULL; - - if (convert_info->ps_body[i]) { - shader_code = g_strdup_printf (templ_pixel_shader, - convert_info->templ->constant_buffer ? - convert_info->templ->constant_buffer : "", - convert_info->templ->func ? 
convert_info->templ->func : "", - convert_info->ps_body[i]); - - ret = gst_d3d11_create_pixel_shader (device, shader_code, &ps[i]); - g_free (shader_code); - if (!ret) { - GST_ERROR ("Couldn't create pixel shader"); - goto clear; - } - } - } - - if (convert_info->templ->constant_buffer) { - D3D11_BUFFER_DESC const_buffer_desc = { 0, }; - - const_buffer_desc.Usage = D3D11_USAGE_DYNAMIC; - const_buffer_desc.ByteWidth = sizeof (PixelShaderColorTransform); - const_buffer_desc.BindFlags = D3D11_BIND_CONSTANT_BUFFER; - const_buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; - const_buffer_desc.MiscFlags = 0; - const_buffer_desc.StructureByteStride = 0; - - hr = ID3D11Device_CreateBuffer (device_handle, &const_buffer_desc, NULL, - &const_buffer); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create constant buffer, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - gst_d3d11_device_lock (device); - hr = ID3D11DeviceContext_Map (context_handle, - (ID3D11Resource *) const_buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &map); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't map constant buffer, hr: 0x%x", (guint) hr); - gst_d3d11_device_unlock (device); - ret = FALSE; - goto clear; - } - - memcpy (map.pData, &convert_info->transform, - sizeof (PixelShaderColorTransform)); - - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) const_buffer, 0); - gst_d3d11_device_unlock (device); - } - - input_desc[0].SemanticName = "POSITION"; - input_desc[0].SemanticIndex = 0; - input_desc[0].Format = DXGI_FORMAT_R32G32B32_FLOAT; - input_desc[0].InputSlot = 0; - input_desc[0].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; - input_desc[0].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; - input_desc[0].InstanceDataStepRate = 0; - - input_desc[1].SemanticName = "TEXCOORD"; - input_desc[1].SemanticIndex = 0; - input_desc[1].Format = DXGI_FORMAT_R32G32_FLOAT; - input_desc[1].InputSlot = 0; - input_desc[1].AlignedByteOffset = 
D3D11_APPEND_ALIGNED_ELEMENT; - input_desc[1].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; - input_desc[1].InstanceDataStepRate = 0; - - if (!gst_d3d11_create_vertex_shader (device, templ_vertex_shader, - input_desc, G_N_ELEMENTS (input_desc), &vs, &layout)) { - GST_ERROR ("Couldn't create vertex shader"); - ret = FALSE; - goto clear; - } - - /* setup vertex buffer and index buffer */ - buffer_desc.Usage = D3D11_USAGE_DYNAMIC; - buffer_desc.ByteWidth = sizeof (VertexData) * 4; - buffer_desc.BindFlags = D3D11_BIND_VERTEX_BUFFER; - buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; - - hr = ID3D11Device_CreateBuffer (device_handle, &buffer_desc, NULL, - &vertex_buffer); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create vertex buffer, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - buffer_desc.Usage = D3D11_USAGE_DYNAMIC; - buffer_desc.ByteWidth = sizeof (WORD) * index_count; - buffer_desc.BindFlags = D3D11_BIND_INDEX_BUFFER; - buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; - - hr = ID3D11Device_CreateBuffer (device_handle, &buffer_desc, NULL, - &index_buffer); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create index buffer, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - gst_d3d11_device_lock (device); - hr = ID3D11DeviceContext_Map (context_handle, - (ID3D11Resource *) vertex_buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &map); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't map vertex buffer, hr: 0x%x", (guint) hr); - gst_d3d11_device_unlock (device); - ret = FALSE; - goto clear; - } - - vertex_data = (VertexData *) map.pData; - - hr = ID3D11DeviceContext_Map (context_handle, - (ID3D11Resource *) index_buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &map); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't map index buffer, hr: 0x%x", (guint) hr); - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) vertex_buffer, 0); - gst_d3d11_device_unlock (device); - ret = 
FALSE; - goto clear; - } - - indices = (WORD *) map.pData; - - /* bottom left */ - vertex_data[0].position.x = -1.0f; - vertex_data[0].position.y = -1.0f; - vertex_data[0].position.z = 0.0f; - vertex_data[0].texture.x = 0.0f; - vertex_data[0].texture.y = 1.0f; - - /* top left */ - vertex_data[1].position.x = -1.0f; - vertex_data[1].position.y = 1.0f; - vertex_data[1].position.z = 0.0f; - vertex_data[1].texture.x = 0.0f; - vertex_data[1].texture.y = 0.0f; - - /* top right */ - vertex_data[2].position.x = 1.0f; - vertex_data[2].position.y = 1.0f; - vertex_data[2].position.z = 0.0f; - vertex_data[2].texture.x = 1.0f; - vertex_data[2].texture.y = 0.0f; - - /* bottom right */ - vertex_data[3].position.x = 1.0f; - vertex_data[3].position.y = -1.0f; - vertex_data[3].position.z = 0.0f; - vertex_data[3].texture.x = 1.0f; - vertex_data[3].texture.y = 1.0f; - - /* clockwise indexing */ - indices[0] = 0; /* bottom left */ - indices[1] = 1; /* top left */ - indices[2] = 2; /* top right */ - - indices[3] = 3; /* bottom right */ - indices[4] = 0; /* bottom left */ - indices[5] = 2; /* top right */ - - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) vertex_buffer, 0); - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) index_buffer, 0); - gst_d3d11_device_unlock (device); - - self->quad[0] = gst_d3d11_quad_new (device, - ps[0], vs, layout, sampler, NULL, NULL, const_buffer, vertex_buffer, - sizeof (VertexData), index_buffer, DXGI_FORMAT_R16_UINT, index_count); - - if (ps[1]) { - self->quad[1] = gst_d3d11_quad_new (device, - ps[1], vs, layout, sampler, NULL, NULL, const_buffer, vertex_buffer, - sizeof (VertexData), index_buffer, DXGI_FORMAT_R16_UINT, index_count); - } - - self->num_input_view = GST_VIDEO_INFO_N_PLANES (in_info); - self->num_output_view = GST_VIDEO_INFO_N_PLANES (out_info); - - /* holds vertex buffer for crop rect update */ - self->vertex_buffer = vertex_buffer; - ID3D11Buffer_AddRef (vertex_buffer); - - self->crop_rect.left = 0; - 
self->crop_rect.top = 0; - self->crop_rect.right = GST_VIDEO_INFO_WIDTH (in_info); - self->crop_rect.bottom = GST_VIDEO_INFO_HEIGHT (in_info); - - self->input_texture_width = GST_VIDEO_INFO_WIDTH (in_info); - self->input_texture_height = GST_VIDEO_INFO_HEIGHT (in_info); - -clear: - for (i = 0; i < CONVERTER_MAX_QUADS; i++) { - if (ps[i]) - ID3D11PixelShader_Release (ps[i]); - } - if (vs) - ID3D11VertexShader_Release (vs); - if (layout) - ID3D11InputLayout_Release (layout); - if (sampler) - ID3D11SamplerState_Release (sampler); - if (const_buffer) - ID3D11Buffer_Release (const_buffer); - if (vertex_buffer) - ID3D11Buffer_Release (vertex_buffer); - if (index_buffer) - ID3D11Buffer_Release (index_buffer); - - return ret; -} - -GstD3D11ColorConverter * -gst_d3d11_color_converter_new (GstD3D11Device * device, - GstVideoInfo * in_info, GstVideoInfo * out_info) -{ - const GstVideoInfo *unknown_info; - const GstD3D11Format *in_d3d11_format; - const GstD3D11Format *out_d3d11_format; - gboolean is_supported = FALSE; - MatrixData matrix; - GstD3D11ColorConverter *converter = NULL; - gboolean ret; - gint i; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - g_return_val_if_fail (in_info != NULL, NULL); - g_return_val_if_fail (out_info != NULL, NULL); - - GST_DEBUG ("Setup convert with format %s -> %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)), - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); - - in_d3d11_format = - gst_d3d11_device_format_from_gst (device, - GST_VIDEO_INFO_FORMAT (in_info)); - if (!in_d3d11_format) { - unknown_info = in_info; - goto format_unknown; - } - - out_d3d11_format = - gst_d3d11_device_format_from_gst (device, - GST_VIDEO_INFO_FORMAT (out_info)); - if (!out_d3d11_format) { - unknown_info = out_info; - goto format_unknown; - } - - converter = g_new0 (GstD3D11ColorConverter, 1); - converter->device = gst_object_ref (device); - - if (GST_VIDEO_INFO_IS_RGB (in_info)) { - if (GST_VIDEO_INFO_IS_RGB 
(out_info)) { - is_supported = - setup_convert_info_rgb_to_rgb (converter, in_info, out_info); - } else if (GST_VIDEO_INFO_IS_YUV (out_info)) { - is_supported = - setup_convert_info_rgb_to_yuv (converter, in_info, out_info); - } - } else if (GST_VIDEO_INFO_IS_YUV (in_info)) { - if (GST_VIDEO_INFO_IS_RGB (out_info)) { - is_supported = - setup_convert_info_yuv_to_rgb (converter, in_info, out_info); - } else if (GST_VIDEO_INFO_IS_YUV (out_info)) { - is_supported = - setup_convert_info_yuv_to_yuv (converter, in_info, out_info); - } - } - - if (!is_supported) { - goto conversion_not_supported; - } - - if (converter_get_matrix (converter, &matrix, in_info, out_info)) { - PixelShaderColorTransform *transform = &converter->convert_info.transform; - - /* padding the last column for 16bytes alignment */ - transform->trans_matrix[0] = matrix.dm[0][0]; - transform->trans_matrix[1] = matrix.dm[0][1]; - transform->trans_matrix[2] = matrix.dm[0][2]; - transform->trans_matrix[3] = 0; - transform->trans_matrix[4] = matrix.dm[1][0]; - transform->trans_matrix[5] = matrix.dm[1][1]; - transform->trans_matrix[6] = matrix.dm[1][2]; - transform->trans_matrix[7] = 0; - transform->trans_matrix[8] = matrix.dm[2][0]; - transform->trans_matrix[9] = matrix.dm[2][1]; - transform->trans_matrix[10] = matrix.dm[2][2]; - transform->trans_matrix[11] = 0; - } - - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (out_info); i++) { - converter->viewport[i].TopLeftX = 0; - converter->viewport[i].TopLeftY = 0; - converter->viewport[i].Width = GST_VIDEO_INFO_COMP_WIDTH (out_info, i); - converter->viewport[i].Height = GST_VIDEO_INFO_COMP_HEIGHT (out_info, i); - converter->viewport[i].MinDepth = 0.0f; - converter->viewport[i].MaxDepth = 1.0f; - } - - ret = gst_d3d11_color_convert_setup_shader (converter, - device, in_info, out_info); - - if (!ret) { - GST_ERROR ("Couldn't setup shader"); - gst_d3d11_color_converter_free (converter); - converter = NULL; - } else { - converter->in_info = *in_info; - converter->out_info 
= *out_info; - } - - return converter; - - /* ERRORS */ -format_unknown: - { - GST_ERROR ("%s couldn't be converted to d3d11 format", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (unknown_info))); - return NULL; - } -conversion_not_supported: - { - GST_ERROR ("Conversion %s to %s not supported", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)), - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); - gst_d3d11_color_converter_free (converter); - return NULL; - } -} - -void -gst_d3d11_color_converter_free (GstD3D11ColorConverter * converter) -{ - gint i; - - g_return_if_fail (converter != NULL); - - for (i = 0; i < CONVERTER_MAX_QUADS; i++) { - if (converter->quad[i]) - gst_d3d11_quad_free (converter->quad[i]); - - g_free (converter->convert_info.ps_body[i]); - } - - if (converter->vertex_buffer) - ID3D11Buffer_Release (converter->vertex_buffer); - - gst_clear_object (&converter->device); - g_free (converter); -} - -/* must be called with gst_d3d11_device_lock since ID3D11DeviceContext is not - * thread-safe */ -static gboolean -gst_d3d11_color_converter_update_vertex_buffer (GstD3D11ColorConverter * self) -{ - D3D11_MAPPED_SUBRESOURCE map; - VertexData *vertex_data; - ID3D11DeviceContext *context_handle; - HRESULT hr; - FLOAT u, v; - const RECT *crop_rect = &self->crop_rect; - gint texture_width = self->input_texture_width; - gint texture_height = self->input_texture_height; - - context_handle = gst_d3d11_device_get_device_context_handle (self->device); - - hr = ID3D11DeviceContext_Map (context_handle, - (ID3D11Resource *) self->vertex_buffer, 0, D3D11_MAP_WRITE_DISCARD, - 0, &map); - - if (!gst_d3d11_result (hr, self->device)) { - GST_ERROR ("Couldn't map vertex buffer, hr: 0x%x", (guint) hr); - return FALSE; - } - - vertex_data = (VertexData *) map.pData; - - /* bottom left */ - u = (crop_rect->left / (gfloat) texture_width) - 0.5f / texture_width; - v = (crop_rect->bottom / (gfloat) texture_height) - 0.5f / texture_height; - - 
vertex_data[0].position.x = -1.0f; - vertex_data[0].position.y = -1.0f; - vertex_data[0].position.z = 0.0f; - vertex_data[0].texture.x = u; - vertex_data[0].texture.y = v; - - /* top left */ - u = (crop_rect->left / (gfloat) texture_width) - 0.5f / texture_width; - v = (crop_rect->top / (gfloat) texture_height) - 0.5f / texture_height; - - vertex_data[1].position.x = -1.0f; - vertex_data[1].position.y = 1.0f; - vertex_data[1].position.z = 0.0f; - vertex_data[1].texture.x = u; - vertex_data[1].texture.y = v; - - /* top right */ - u = (crop_rect->right / (gfloat) texture_width) - 0.5f / texture_width; - v = (crop_rect->top / (gfloat) texture_height) - 0.5f / texture_height; - - vertex_data[2].position.x = 1.0f; - vertex_data[2].position.y = 1.0f; - vertex_data[2].position.z = 0.0f; - vertex_data[2].texture.x = u; - vertex_data[2].texture.y = v; - - /* bottom right */ - u = (crop_rect->right / (gfloat) texture_width) - 0.5f / texture_width; - v = (crop_rect->bottom / (gfloat) texture_height) - 0.5f / texture_height; - - vertex_data[3].position.x = 1.0f; - vertex_data[3].position.y = -1.0f; - vertex_data[3].position.z = 0.0f; - vertex_data[3].texture.x = u; - vertex_data[3].texture.y = v; - - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) self->vertex_buffer, 0); - - self->update_vertex = FALSE; - - return TRUE; -} - -gboolean -gst_d3d11_color_converter_convert (GstD3D11ColorConverter * converter, - ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], - ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) -{ - gboolean ret; - - g_return_val_if_fail (converter != NULL, FALSE); - g_return_val_if_fail (srv != NULL, FALSE); - g_return_val_if_fail (rtv != NULL, FALSE); - - gst_d3d11_device_lock (converter->device); - ret = gst_d3d11_color_converter_convert_unlocked (converter, srv, rtv); - gst_d3d11_device_unlock (converter->device); - - return ret; -} - -gboolean -gst_d3d11_color_converter_convert_unlocked (GstD3D11ColorConverter * converter, - 
ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], - ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) -{ - gboolean ret; - ID3D11Resource *resource; - D3D11_TEXTURE2D_DESC desc; - - g_return_val_if_fail (converter != NULL, FALSE); - g_return_val_if_fail (srv != NULL, FALSE); - g_return_val_if_fail (rtv != NULL, FALSE); - - /* check texture resolution and update crop area */ - ID3D11ShaderResourceView_GetResource (srv[0], &resource); - ID3D11Texture2D_GetDesc ((ID3D11Texture2D *) resource, &desc); - ID3D11Resource_Release (resource); - - if (converter->update_vertex || - desc.Width != converter->input_texture_width || - desc.Height != converter->input_texture_height) { - GST_DEBUG ("Update vertex buffer, texture resolution: %dx%d", - desc.Width, desc.Height); - - converter->input_texture_width = desc.Width; - converter->input_texture_height = desc.Height; - - if (!gst_d3d11_color_converter_update_vertex_buffer (converter)) { - GST_ERROR ("Cannot update vertex buffer"); - return FALSE; - } - } - - ret = gst_d3d11_draw_quad_unlocked (converter->quad[0], converter->viewport, - 1, srv, converter->num_input_view, rtv, 1, NULL); - - if (!ret) - return FALSE; - - if (converter->quad[1]) { - ret = gst_d3d11_draw_quad_unlocked (converter->quad[1], - &converter->viewport[1], converter->num_output_view - 1, - srv, converter->num_input_view, &rtv[1], converter->num_output_view - 1, - NULL); - - if (!ret) - return FALSE; - } - - return TRUE; -} - -gboolean -gst_d3d11_color_converter_update_rect (GstD3D11ColorConverter * converter, - RECT * rect) -{ - g_return_val_if_fail (converter != NULL, FALSE); - g_return_val_if_fail (rect != NULL, FALSE); - - converter->viewport[0].TopLeftX = rect->left; - converter->viewport[0].TopLeftY = rect->top; - converter->viewport[0].Width = rect->right - rect->left; - converter->viewport[0].Height = rect->bottom - rect->top; - - switch (GST_VIDEO_INFO_FORMAT (&converter->out_info)) { - case GST_VIDEO_FORMAT_NV12: - case 
GST_VIDEO_FORMAT_P010_10LE: - case GST_VIDEO_FORMAT_P016_LE: - case GST_VIDEO_FORMAT_I420: - case GST_VIDEO_FORMAT_I420_10LE:{ - gint i; - converter->viewport[1].TopLeftX = converter->viewport[0].TopLeftX / 2; - converter->viewport[1].TopLeftY = converter->viewport[0].TopLeftY / 2; - converter->viewport[1].Width = converter->viewport[0].Width / 2; - converter->viewport[1].Height = converter->viewport[0].Height / 2; - - for (i = 2; i < GST_VIDEO_INFO_N_PLANES (&converter->out_info); i++) - converter->viewport[i] = converter->viewport[1]; - - break; - } - default: - if (converter->num_output_view > 1) - g_assert_not_reached (); - break; - } - - return TRUE; -} - -gboolean -gst_d3d11_color_converter_update_crop_rect (GstD3D11ColorConverter * converter, - RECT * crop_rect) -{ - g_return_val_if_fail (converter != NULL, FALSE); - g_return_val_if_fail (crop_rect != NULL, FALSE); - - if (converter->crop_rect.left != crop_rect->left || - converter->crop_rect.top != crop_rect->top || - converter->crop_rect.right != crop_rect->right || - converter->crop_rect.bottom != crop_rect->bottom) { - converter->crop_rect = *crop_rect; - - /* vertex buffer will be updated on next convert() call */ - converter->update_vertex = TRUE; - } - - return TRUE; -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11colorconverter.h
Deleted
@@ -1,53 +0,0 @@
-/* GStreamer
- * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_D3D11_COLOR_CONVERTER_H__
-#define __GST_D3D11_COLOR_CONVERTER_H__
-
-#include <gst/gst.h>
-#include <gst/video/video.h>
-#include "gstd3d11_fwd.h"
-
-G_BEGIN_DECLS
-
-typedef struct _GstD3D11ColorConverter GstD3D11ColorConverter;
-
-GstD3D11ColorConverter * gst_d3d11_color_converter_new (GstD3D11Device * device,
-                                                        GstVideoInfo * in_info,
-                                                        GstVideoInfo * out_info);
-
-void gst_d3d11_color_converter_free (GstD3D11ColorConverter * converter);
-
-gboolean gst_d3d11_color_converter_convert (GstD3D11ColorConverter * converter,
-                                            ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES],
-                                            ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES]);
-
-gboolean gst_d3d11_color_converter_convert_unlocked (GstD3D11ColorConverter * converter,
-                                                     ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES],
-                                                     ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES]);
-
-gboolean gst_d3d11_color_converter_update_rect (GstD3D11ColorConverter * converter,
-                                                RECT *rect);
-
-gboolean gst_d3d11_color_converter_update_crop_rect (GstD3D11ColorConverter * converter,
-                                                     RECT *crop_rect);
-
-G_END_DECLS
-
-#endif /* __GST_D3D11_COLOR_CONVERTER_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11decoder.c
Deleted
@@ -1,1827 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - * - * NOTE: some of implementations are copied/modified from Chromium code - * - * Copyright 2015 The Chromium Authors. All rights reserved. - * - * Redistribution and use in source and binary forms, with or without - * modification, are permitted provided that the following conditions are - * met: - * - * * Redistributions of source code must retain the above copyright - * notice, this list of conditions and the following disclaimer. - * * Redistributions in binary form must reproduce the above - * copyright notice, this list of conditions and the following disclaimer - * in the documentation and/or other materials provided with the - * distribution. - * * Neither the name of Google Inc. nor the names of its - * contributors may be used to endorse or promote products derived from - * this software without specific prior written permission. - * - * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS - * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT - * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR - * A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT - * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, - * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT - * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, - * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY - * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT - * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE - * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - */ - -#ifdef HAVE_CONFIG_H -#include <config.h> -#endif - -#include "gstd3d11decoder.h" -#include "gstd3d11memory.h" -#include "gstd3d11bufferpool.h" -#include "gstd3d11device.h" -#include "gstd3d11colorconverter.h" -#include <string.h> - -GST_DEBUG_CATEGORY (d3d11_decoder_debug); -#define GST_CAT_DEFAULT d3d11_decoder_debug - -enum -{ - PROP_0, - PROP_DEVICE, -}; - -struct _GstD3D11DecoderPrivate -{ - GstD3D11Device *device; - - ID3D11VideoDevice *video_device; - ID3D11VideoContext *video_context; - - ID3D11VideoDecoder *decoder; - - GstBufferPool *internal_pool; - - gint display_width; - gint display_height; - - gboolean use_array_of_texture; - guint pool_size; - guint8 next_view_id; - - /* for staging */ - ID3D11Texture2D *staging; - gsize staging_texture_offset[GST_VIDEO_MAX_PLANES]; - gint stating_texture_stride[GST_VIDEO_MAX_PLANES]; - - GUID decoder_profile; - - /* For device specific workaround */ - gboolean can_direct_rendering; - - /* for internal shader */ - GstD3D11ColorConverter *converter; - ID3D11Texture2D *shader_resource_texture; - ID3D11ShaderResourceView *shader_resource_view[GST_VIDEO_MAX_PLANES]; - ID3D11Texture2D *fallback_shader_output_texture; - ID3D11RenderTargetView *fallback_render_target_view[GST_VIDEO_MAX_PLANES]; - DXGI_FORMAT resource_formats[GST_VIDEO_MAX_PLANES]; - guint num_resource_views; -}; - -#define OUTPUT_VIEW_QUARK _decoder_output_view_get() -static GQuark -_decoder_output_view_get (void) -{ 
- static gsize g_quark = 0; - - if (g_once_init_enter (&g_quark)) { - gsize quark = - (gsize) g_quark_from_static_string ("GstD3D11DecoderOutputView"); - g_once_init_leave (&g_quark, quark); - } - return (GQuark) g_quark; -} - -static void gst_d3d11_decoder_constructed (GObject * object); -static void gst_d3d11_decoder_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec); -static void gst_d3d11_decoder_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec); -static void gst_d3d11_decoder_dispose (GObject * obj); - -#define parent_class gst_d3d11_decoder_parent_class -G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11Decoder, - gst_d3d11_decoder, GST_TYPE_OBJECT); - -static void -gst_d3d11_decoder_class_init (GstD3D11DecoderClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - - gobject_class->constructed = gst_d3d11_decoder_constructed; - gobject_class->set_property = gst_d3d11_decoder_set_property; - gobject_class->get_property = gst_d3d11_decoder_get_property; - gobject_class->dispose = gst_d3d11_decoder_dispose; - - g_object_class_install_property (gobject_class, PROP_DEVICE, - g_param_spec_object ("device", "Device", - "D3D11 Device to use", GST_TYPE_D3D11_DEVICE, - G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); - - GST_DEBUG_CATEGORY_INIT (d3d11_decoder_debug, - "d3d11decoder", 0, "Direct3D11 Base Video Decoder object"); -} - -static void -gst_d3d11_decoder_init (GstD3D11Decoder * self) -{ - self->priv = gst_d3d11_decoder_get_instance_private (self); -} - -static void -gst_d3d11_decoder_constructed (GObject * object) -{ - GstD3D11Decoder *self = GST_D3D11_DECODER (object); - GstD3D11DecoderPrivate *priv = self->priv; - HRESULT hr; - ID3D11Device *device_handle; - ID3D11DeviceContext *device_context_handle; - - if (!priv->device) { - GST_ERROR_OBJECT (self, "No D3D11Device available"); - return; - } - - device_handle = gst_d3d11_device_get_device_handle 
(priv->device); - device_context_handle = - gst_d3d11_device_get_device_context_handle (priv->device); - - hr = ID3D11Device_QueryInterface (device_handle, &IID_ID3D11VideoDevice, - (void **) &priv->video_device); - - if (!gst_d3d11_result (hr, priv->device) || !priv->video_device) { - GST_WARNING_OBJECT (self, "Cannot create VideoDevice Object: 0x%x", - (guint) hr); - priv->video_device = NULL; - - return; - } - - hr = ID3D11DeviceContext_QueryInterface (device_context_handle, - &IID_ID3D11VideoContext, (void **) &priv->video_context); - - if (!gst_d3d11_result (hr, priv->device) || !priv->video_context) { - GST_WARNING_OBJECT (self, "Cannot create VideoContext Object: 0x%x", - (guint) hr); - priv->video_context = NULL; - - goto fail; - } - - return; - -fail: - if (priv->video_device) { - ID3D11VideoDevice_Release (priv->video_device); - priv->video_device = NULL; - } - - if (priv->video_context) { - ID3D11VideoContext_Release (priv->video_context); - priv->video_context = NULL; - } - - return; -} - -static void -gst_d3d11_decoder_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - GstD3D11Decoder *self = GST_D3D11_DECODER (object); - GstD3D11DecoderPrivate *priv = self->priv; - - switch (prop_id) { - case PROP_DEVICE: - priv->device = g_value_dup_object (value); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_d3d11_decoder_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11Decoder *self = GST_D3D11_DECODER (object); - GstD3D11DecoderPrivate *priv = self->priv; - - switch (prop_id) { - case PROP_DEVICE: - g_value_set_object (value, priv->device); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static gboolean -gst_d3d11_decoder_close (GstD3D11Decoder * self) -{ - GstD3D11DecoderPrivate *priv = self->priv; - - gst_d3d11_decoder_reset (self); - - if 
(priv->video_device) { - ID3D11VideoDevice_Release (priv->video_device); - priv->video_device = NULL; - } - - if (priv->video_context) { - ID3D11VideoContext_Release (priv->video_context); - priv->video_context = NULL; - } - - return TRUE; -} - -static void -gst_d3d11_decoder_reset_unlocked (GstD3D11Decoder * decoder) -{ - GstD3D11DecoderPrivate *priv; - gint i; - - priv = decoder->priv; - gst_clear_object (&priv->internal_pool); - - if (priv->decoder) { - ID3D11VideoDecoder_Release (priv->decoder); - priv->decoder = NULL; - } - - if (priv->staging) { - ID3D11Texture2D_Release (priv->staging); - priv->staging = NULL; - } - - if (priv->converter) { - gst_d3d11_color_converter_free (priv->converter); - priv->converter = NULL; - } - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (priv->shader_resource_view[i]) { - ID3D11ShaderResourceView_Release (priv->shader_resource_view[i]); - priv->shader_resource_view[i] = NULL; - } - - if (priv->fallback_render_target_view[i]) { - ID3D11RenderTargetView_Release (priv->fallback_render_target_view[i]); - priv->fallback_render_target_view[i] = NULL; - } - } - - if (priv->shader_resource_texture) { - ID3D11Texture2D_Release (priv->shader_resource_texture); - priv->shader_resource_texture = NULL; - } - - if (priv->fallback_shader_output_texture) { - ID3D11Texture2D_Release (priv->fallback_shader_output_texture); - priv->fallback_shader_output_texture = NULL; - } - - decoder->opened = FALSE; -} - -void -gst_d3d11_decoder_reset (GstD3D11Decoder * decoder) -{ - GstD3D11DecoderPrivate *priv; - - g_return_if_fail (GST_IS_D3D11_DECODER (decoder)); - - priv = decoder->priv; - gst_d3d11_device_lock (priv->device); - gst_d3d11_decoder_reset_unlocked (decoder); - gst_d3d11_device_unlock (priv->device); -} - -static void -gst_d3d11_decoder_dispose (GObject * obj) -{ - GstD3D11Decoder *self = GST_D3D11_DECODER (obj); - GstD3D11DecoderPrivate *priv = self->priv; - - if (priv->device) { - gst_d3d11_decoder_close (self); - gst_object_unref 
(priv->device); - priv->device = NULL; - } - - G_OBJECT_CLASS (parent_class)->dispose (obj); -} - -GstD3D11Decoder * -gst_d3d11_decoder_new (GstD3D11Device * device) -{ - GstD3D11Decoder *decoder; - GstD3D11DecoderPrivate *priv; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - decoder = g_object_new (GST_TYPE_D3D11_DECODER, "device", device, NULL); - priv = decoder->priv; - - if (!priv->video_device || !priv->video_context) { - gst_object_unref (decoder); - return NULL; - } - - gst_object_ref_sink (decoder); - - return decoder; -} - -static void -gst_d3d11_decoder_output_view_free (GstD3D11DecoderOutputView * view) -{ - GST_LOG_OBJECT (view->device, "Free view %p, view id: %d", view, - view->view_id); - - if (view->handle) { - gst_d3d11_device_lock (view->device); - ID3D11VideoDecoderOutputView_Release (view->handle); - gst_d3d11_device_unlock (view->device); - } - - gst_object_unref (view->device); - g_free (view); -} - -static gboolean -gst_d3d11_decoder_ensure_output_view (GstD3D11Decoder * self, - GstBuffer * buffer) -{ - GstD3D11DecoderPrivate *priv = self->priv; - D3D11_VIDEO_DECODER_OUTPUT_VIEW_DESC view_desc = { 0, }; - GstD3D11Memory *mem; - GstD3D11DecoderOutputView *view; - ID3D11VideoDecoderOutputView *view_handle; - HRESULT hr; - guint view_id = 0; - - mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, 0); - - view = (GstD3D11DecoderOutputView *) - gst_mini_object_get_qdata (GST_MINI_OBJECT_CAST (mem), OUTPUT_VIEW_QUARK); - - if (view) { - GST_TRACE_OBJECT (self, "Reuse view id %d", view->view_id); - return TRUE; - } - - view_desc.DecodeProfile = priv->decoder_profile; - view_desc.ViewDimension = D3D11_VDOV_DIMENSION_TEXTURE2D; - view_desc.Texture2D.ArraySlice = mem->subresource_index; - if (priv->use_array_of_texture) { - view_id = priv->next_view_id; - - priv->next_view_id++; - /* valid view range is [0, 126] */ - priv->next_view_id %= 127; - } else { - view_id = mem->subresource_index; - } - - GST_LOG_OBJECT (self, "Create 
decoder output view with index %d", view_id); - - hr = ID3D11VideoDevice_CreateVideoDecoderOutputView (priv->video_device, - (ID3D11Resource *) mem->texture, &view_desc, &view_handle); - if (!gst_d3d11_result (hr, priv->device)) { - GST_ERROR_OBJECT (self, - "Could not create decoder output view index %d, hr: 0x%x", - view_id, (guint) hr); - return FALSE; - } - - view = g_new0 (GstD3D11DecoderOutputView, 1); - view->device = gst_object_ref (priv->device); - view->handle = view_handle; - view->view_id = view_id; - - gst_mini_object_set_qdata (GST_MINI_OBJECT_CAST (mem), OUTPUT_VIEW_QUARK, - view, (GDestroyNotify) gst_d3d11_decoder_output_view_free); - - return TRUE; -} - -/* Must be called from D3D11Device thread */ -static gboolean -gst_d3d11_decoder_prepare_output_view_pool (GstD3D11Decoder * self, - GstVideoInfo * info, guint coded_width, guint coded_height, - guint pool_size, const GUID * decoder_profile) -{ - GstD3D11DecoderPrivate *priv = self->priv; - GstD3D11AllocationParams *alloc_params = NULL; - GstBufferPool *pool = NULL; - GstStructure *config = NULL; - GstCaps *caps = NULL; - GstVideoAlignment align; - GstD3D11AllocationFlags alloc_flags = 0; - gint bind_flags = D3D11_BIND_DECODER; - - gst_clear_object (&priv->internal_pool); - - if (!priv->use_array_of_texture) { - alloc_flags = GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY; - } else { - /* array of texture can have shader resource view */ - bind_flags |= D3D11_BIND_SHADER_RESOURCE; - } - - alloc_params = gst_d3d11_allocation_params_new (priv->device, info, - alloc_flags, bind_flags); - - if (!alloc_params) { - GST_ERROR_OBJECT (self, "Failed to create allocation param"); - goto error; - } - - if (!priv->use_array_of_texture) - alloc_params->desc[0].ArraySize = pool_size; - gst_video_alignment_reset (&align); - - align.padding_right = coded_width - GST_VIDEO_INFO_WIDTH (info); - align.padding_bottom = coded_height - GST_VIDEO_INFO_HEIGHT (info); - if (!gst_d3d11_allocation_params_alignment (alloc_params, 
&align)) { - GST_ERROR_OBJECT (self, "Cannot set alignment"); - goto error; - } - - pool = gst_d3d11_buffer_pool_new (priv->device); - if (!pool) { - GST_ERROR_OBJECT (self, "Failed to create buffer pool"); - goto error; - } - - /* Setup buffer pool */ - config = gst_buffer_pool_get_config (pool); - caps = gst_video_info_to_caps (info); - if (!caps) { - GST_ERROR_OBJECT (self, "Couldn't convert video info to caps"); - goto error; - } - - gst_buffer_pool_config_set_params (config, caps, GST_VIDEO_INFO_SIZE (info), - 0, pool_size); - gst_buffer_pool_config_set_d3d11_allocation_params (config, alloc_params); - gst_caps_unref (caps); - gst_d3d11_allocation_params_free (alloc_params); - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - - if (!gst_buffer_pool_set_config (pool, config)) { - GST_ERROR_OBJECT (self, "Invalid pool config"); - goto error; - } - - if (!gst_buffer_pool_set_active (pool, TRUE)) { - GST_ERROR_OBJECT (self, "Couldn't activate pool"); - goto error; - } - - priv->internal_pool = pool; - priv->pool_size = pool_size; - - return TRUE; - -error: - if (alloc_params) - gst_d3d11_allocation_params_free (alloc_params); - if (pool) - gst_object_unref (pool); - if (caps) - gst_caps_unref (caps); - - return FALSE; -} - -gboolean -gst_d3d11_decoder_get_supported_decoder_profile (GstD3D11Decoder * decoder, - const GUID ** decoder_profiles, guint profile_size, GUID * selected_profile) -{ - GstD3D11DecoderPrivate *priv; - GUID *guid_list = NULL; - const GUID *profile = NULL; - guint available_profile_count; - gint i, j; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - g_return_val_if_fail (decoder_profiles != NULL, FALSE); - g_return_val_if_fail (profile_size > 0, FALSE); - g_return_val_if_fail (selected_profile != NULL, FALSE); - - priv = decoder->priv; - - available_profile_count = - ID3D11VideoDevice_GetVideoDecoderProfileCount (priv->video_device); - - if (available_profile_count == 0) { - 
GST_WARNING_OBJECT (decoder, "No available decoder profile"); - return FALSE; - } - - GST_DEBUG_OBJECT (decoder, - "Have %u available decoder profiles", available_profile_count); - guid_list = g_alloca (sizeof (GUID) * available_profile_count); - - for (i = 0; i < available_profile_count; i++) { - hr = ID3D11VideoDevice_GetVideoDecoderProfile (priv->video_device, - i, &guid_list[i]); - if (!gst_d3d11_result (hr, priv->device)) { - GST_WARNING_OBJECT (decoder, "Failed to get %d th decoder profile", i); - return FALSE; - } - } - -#ifndef GST_DISABLE_GST_DEBUG - GST_LOG_OBJECT (decoder, "Supported decoder GUID"); - for (i = 0; i < available_profile_count; i++) { - const GUID *guid = &guid_list[i]; - - GST_LOG_OBJECT (decoder, - "\t { %8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x }", - (guint) guid->Data1, (guint) guid->Data2, (guint) guid->Data3, - guid->Data4[0], guid->Data4[1], guid->Data4[2], guid->Data4[3], - guid->Data4[4], guid->Data4[5], guid->Data4[6], guid->Data4[7]); - } - - GST_LOG_OBJECT (decoder, "Requested decoder GUID"); - for (i = 0; i < profile_size; i++) { - const GUID *guid = decoder_profiles[i]; - - GST_LOG_OBJECT (decoder, - "\t { %8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x }", - (guint) guid->Data1, (guint) guid->Data2, (guint) guid->Data3, - guid->Data4[0], guid->Data4[1], guid->Data4[2], guid->Data4[3], - guid->Data4[4], guid->Data4[5], guid->Data4[6], guid->Data4[7]); - } -#endif - - for (i = 0; i < profile_size; i++) { - for (j = 0; j < available_profile_count; j++) { - if (IsEqualGUID (decoder_profiles[i], &guid_list[j])) { - profile = decoder_profiles[i]; - break; - } - } - } - - if (!profile) { - GST_WARNING_OBJECT (decoder, "No supported decoder profile"); - return FALSE; - } - - *selected_profile = *profile; - - GST_DEBUG_OBJECT (decoder, - "Selected guid " - "{ %8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x }", - (guint) selected_profile->Data1, (guint) selected_profile->Data2, - (guint) 
selected_profile->Data3, - selected_profile->Data4[0], selected_profile->Data4[1], - selected_profile->Data4[2], selected_profile->Data4[3], - selected_profile->Data4[4], selected_profile->Data4[5], - selected_profile->Data4[6], selected_profile->Data4[7]); - - return TRUE; -} - -gboolean -gst_d3d11_decoder_open (GstD3D11Decoder * decoder, GstD3D11Codec codec, - GstVideoInfo * info, guint coded_width, guint coded_height, - guint pool_size, const GUID ** decoder_profiles, guint profile_size) -{ - GstD3D11DecoderPrivate *priv; - const GstD3D11Format *d3d11_format; - HRESULT hr; - BOOL can_support = FALSE; - guint config_count; - D3D11_VIDEO_DECODER_CONFIG *config_list; - D3D11_VIDEO_DECODER_CONFIG *best_config = NULL; - D3D11_VIDEO_DECODER_DESC decoder_desc = { 0, }; - D3D11_TEXTURE2D_DESC staging_desc = { 0, }; - GUID selected_profile; - gint i; - guint aligned_width, aligned_height; - guint alignment; - GstD3D11DeviceVendor vendor; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - g_return_val_if_fail (codec > GST_D3D11_CODEC_NONE, FALSE); - g_return_val_if_fail (codec < GST_D3D11_CODEC_LAST, FALSE); - g_return_val_if_fail (info != NULL, FALSE); - g_return_val_if_fail (coded_width >= GST_VIDEO_INFO_WIDTH (info), FALSE); - g_return_val_if_fail (coded_height >= GST_VIDEO_INFO_HEIGHT (info), FALSE); - g_return_val_if_fail (pool_size > 0, FALSE); - g_return_val_if_fail (decoder_profiles != NULL, FALSE); - g_return_val_if_fail (profile_size > 0, FALSE); - - priv = decoder->priv; - decoder->opened = FALSE; - priv->use_array_of_texture = FALSE; - - d3d11_format = gst_d3d11_device_format_from_gst (priv->device, - GST_VIDEO_INFO_FORMAT (info)); - if (!d3d11_format || d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { - GST_ERROR_OBJECT (decoder, "Could not determine dxgi format from %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (info))); - return FALSE; - } - - gst_d3d11_device_lock (priv->device); - if 
(!gst_d3d11_decoder_get_supported_decoder_profile (decoder, - decoder_profiles, profile_size, &selected_profile)) { - goto error; - } - - hr = ID3D11VideoDevice_CheckVideoDecoderFormat (priv->video_device, - &selected_profile, d3d11_format->dxgi_format, &can_support); - if (!gst_d3d11_result (hr, priv->device) || !can_support) { - GST_ERROR_OBJECT (decoder, - "VideoDevice could not support dxgi format %d, hr: 0x%x", - d3d11_format->dxgi_format, (guint) hr); - goto error; - } - - gst_d3d11_decoder_reset_unlocked (decoder); - - priv->can_direct_rendering = TRUE; - - vendor = gst_d3d11_get_device_vendor (priv->device); - switch (vendor) { - case GST_D3D11_DEVICE_VENDOR_XBOX: - case GST_D3D11_DEVICE_VENDOR_QUALCOMM: - /* FIXME: Need to figure out Xbox device's behavior - * https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1312 - * - * Qualcomm driver seems to be buggy in zero-copy scenario - */ - priv->can_direct_rendering = FALSE; - break; - default: - break; - } - - /* NOTE: other dxva implementations (ffmpeg and vlc) do this - * and they say the required alignment were mentioned by dxva spec. - * See ff_dxva2_common_frame_params() in dxva.c of ffmpeg and - * directx_va_Setup() in directx_va.c of vlc. - * But... where it is? */ - switch (codec) { - case GST_D3D11_CODEC_H265: - /* See directx_va_Setup() impl. 
in vlc */ - if (vendor != GST_D3D11_DEVICE_VENDOR_XBOX) - alignment = 128; - else - alignment = 16; - break; - default: - alignment = 16; - break; - } - - aligned_width = GST_ROUND_UP_N (coded_width, alignment); - aligned_height = GST_ROUND_UP_N (coded_height, alignment); - if (aligned_width != coded_width || aligned_height != coded_height) { - GST_DEBUG_OBJECT (decoder, - "coded resolution %dx%d is not aligned to %d, adjust to %dx%d", - coded_width, coded_height, alignment, aligned_width, aligned_height); - } - - decoder_desc.SampleWidth = aligned_width; - decoder_desc.SampleHeight = aligned_height; - decoder_desc.OutputFormat = d3d11_format->dxgi_format; - decoder_desc.Guid = selected_profile; - - hr = ID3D11VideoDevice_GetVideoDecoderConfigCount (priv->video_device, - &decoder_desc, &config_count); - if (!gst_d3d11_result (hr, priv->device) || config_count == 0) { - GST_ERROR_OBJECT (decoder, "Could not get decoder config count, hr: 0x%x", - (guint) hr); - goto error; - } - - GST_DEBUG_OBJECT (decoder, "Total %d config available", config_count); - - config_list = g_alloca (sizeof (D3D11_VIDEO_DECODER_CONFIG) * config_count); - - for (i = 0; i < config_count; i++) { - hr = ID3D11VideoDevice_GetVideoDecoderConfig (priv->video_device, - &decoder_desc, i, &config_list[i]); - if (!gst_d3d11_result (hr, priv->device)) { - GST_ERROR_OBJECT (decoder, "Could not get decoder %dth config, hr: 0x%x", - i, (guint) hr); - goto error; - } - - /* FIXME: need support DXVA_Slice_H264_Long ?? 
*/ - /* this config uses DXVA_Slice_H264_Short */ - switch (codec) { - case GST_D3D11_CODEC_H264: - if (config_list[i].ConfigBitstreamRaw == 2) - best_config = &config_list[i]; - break; - case GST_D3D11_CODEC_H265: - case GST_D3D11_CODEC_VP9: - case GST_D3D11_CODEC_VP8: - if (config_list[i].ConfigBitstreamRaw == 1) - best_config = &config_list[i]; - break; - default: - g_assert_not_reached (); - goto error; - } - - if (best_config) - break; - } - - if (best_config == NULL) { - GST_ERROR_OBJECT (decoder, "Could not determine decoder config"); - goto error; - } - - GST_DEBUG_OBJECT (decoder, "ConfigDecoderSpecific 0x%x", - best_config->ConfigDecoderSpecific); - - /* FIXME: Revisit this at some point. - * Some 4K VP9 + super frame enabled streams would be broken with - * this configuration (driver crash) on Intel and Nvidia - */ -#if 0 - /* bit 14 is equal to 1b means this config support array of texture and - * it's recommended type as per DXVA spec */ - if ((best_config->ConfigDecoderSpecific & 0x4000) == 0x4000) { - GST_DEBUG_OBJECT (decoder, "Config support array of texture"); - priv->use_array_of_texture = TRUE; - } -#endif - - if (!gst_d3d11_decoder_prepare_output_view_pool (decoder, - info, aligned_width, aligned_height, pool_size, &selected_profile)) { - GST_ERROR_OBJECT (decoder, "Couldn't prepare output view pool"); - goto error; - } - - hr = ID3D11VideoDevice_CreateVideoDecoder (priv->video_device, - &decoder_desc, best_config, &priv->decoder); - if (!gst_d3d11_result (hr, priv->device) || !priv->decoder) { - GST_ERROR_OBJECT (decoder, - "Could not create decoder object, hr: 0x%x", (guint) hr); - goto error; - } - - GST_DEBUG_OBJECT (decoder, "Decoder object %p created", priv->decoder); - - priv->display_width = GST_VIDEO_INFO_WIDTH (info); - priv->display_height = GST_VIDEO_INFO_HEIGHT (info); - - /* create stage texture to copy out */ - staging_desc.Width = aligned_width; - staging_desc.Height = aligned_height; - staging_desc.MipLevels = 1; - 
staging_desc.Format = d3d11_format->dxgi_format; - staging_desc.SampleDesc.Count = 1; - staging_desc.ArraySize = 1; - staging_desc.Usage = D3D11_USAGE_STAGING; - staging_desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ; - - priv->staging = gst_d3d11_device_create_texture (priv->device, - &staging_desc, NULL); - if (!priv->staging) { - GST_ERROR_OBJECT (decoder, "Couldn't create staging texture"); - goto error; - } - - memset (priv->staging_texture_offset, - 0, sizeof (priv->staging_texture_offset)); - memset (priv->stating_texture_stride, - 0, sizeof (priv->stating_texture_stride)); - - priv->decoder_profile = selected_profile; - decoder->opened = TRUE; - - /* VP9 codec allows internal frame resizing. To handle that case, we need to - * configure the converter here. - * - * Note: d3d11videoprocessor seemingly does not work well, and its - * YUV-to-YUV resizing ability varies depending on the device. - * To guarantee the conversion, a shader is used - * instead of d3d11videoprocessor. - * - * TODO: VP8 has the same resizing spec.
- * Need to handle VP8 here when VP8 support is added - */ - if (codec == GST_D3D11_CODEC_VP9) { - D3D11_TEXTURE2D_DESC texture_desc = { 0, }; - D3D11_RENDER_TARGET_VIEW_DESC render_desc = { 0, }; - D3D11_SHADER_RESOURCE_VIEW_DESC resource_desc = { 0, }; - ID3D11Device *device_handle; - RECT rect; - - priv->converter = gst_d3d11_color_converter_new (priv->device, info, info); - - rect.left = 0; - rect.top = 0; - rect.right = priv->display_width; - rect.bottom = priv->display_height; - gst_d3d11_color_converter_update_rect (priv->converter, &rect); - - device_handle = gst_d3d11_device_get_device_handle (priv->device); - - texture_desc.Width = aligned_width; - texture_desc.Height = aligned_height; - texture_desc.MipLevels = 1; - texture_desc.Format = d3d11_format->dxgi_format; - texture_desc.SampleDesc.Count = 1; - texture_desc.ArraySize = 1; - texture_desc.Usage = D3D11_USAGE_DEFAULT; - texture_desc.BindFlags = D3D11_BIND_RENDER_TARGET; - - priv->fallback_shader_output_texture = - gst_d3d11_device_create_texture (priv->device, &texture_desc, NULL); - if (!priv->fallback_shader_output_texture) { - GST_ERROR_OBJECT (decoder, "Couldn't create shader output texture"); - goto error; - } - - texture_desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; - priv->shader_resource_texture = - gst_d3d11_device_create_texture (priv->device, &texture_desc, NULL); - if (!priv->shader_resource_texture) { - GST_ERROR_OBJECT (decoder, "Couldn't create shader input texture"); - goto error; - } - - switch (texture_desc.Format) { - case DXGI_FORMAT_B8G8R8A8_UNORM: - case DXGI_FORMAT_R8G8B8A8_UNORM: - case DXGI_FORMAT_R10G10B10A2_UNORM: - case DXGI_FORMAT_R8_UNORM: - case DXGI_FORMAT_R8G8_UNORM: - case DXGI_FORMAT_R16_UNORM: - case DXGI_FORMAT_R16G16_UNORM: - priv->num_resource_views = 1; - priv->resource_formats[0] = texture_desc.Format; - break; - case DXGI_FORMAT_AYUV: - priv->num_resource_views = 1; - priv->resource_formats[0] = DXGI_FORMAT_R8G8B8A8_UNORM; - break; - case DXGI_FORMAT_NV12: -
priv->num_resource_views = 2; - priv->resource_formats[0] = DXGI_FORMAT_R8_UNORM; - priv->resource_formats[1] = DXGI_FORMAT_R8G8_UNORM; - break; - case DXGI_FORMAT_P010: - case DXGI_FORMAT_P016: - priv->num_resource_views = 2; - priv->resource_formats[0] = DXGI_FORMAT_R16_UNORM; - priv->resource_formats[1] = DXGI_FORMAT_R16G16_UNORM; - break; - default: - g_assert_not_reached (); - break; - } - - render_desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; - render_desc.Texture2D.MipSlice = 0; - - for (i = 0; i < priv->num_resource_views; i++) { - render_desc.Format = priv->resource_formats[i]; - - hr = ID3D11Device_CreateRenderTargetView (device_handle, - (ID3D11Resource *) priv->fallback_shader_output_texture, &render_desc, - &priv->fallback_render_target_view[i]); - if (!gst_d3d11_result (hr, priv->device)) { - GST_ERROR_OBJECT (decoder, - "Failed to create %dth render target view (0x%x)", i, (guint) hr); - goto error; - } - } - - resource_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; - resource_desc.Texture2D.MipLevels = 1; - - for (i = 0; i < priv->num_resource_views; i++) { - resource_desc.Format = priv->resource_formats[i]; - hr = ID3D11Device_CreateShaderResourceView (device_handle, - (ID3D11Resource *) priv->shader_resource_texture, &resource_desc, - &priv->shader_resource_view[i]); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_ERROR_OBJECT (decoder, - "Failed to create %dth resource view (0x%x)", i, (guint) hr); - goto error; - } - } - } - - gst_d3d11_device_unlock (priv->device); - - return TRUE; - -error: - gst_d3d11_decoder_reset_unlocked (decoder); - gst_d3d11_device_unlock (priv->device); - - return FALSE; -} - -gboolean -gst_d3d11_decoder_begin_frame (GstD3D11Decoder * decoder, - GstD3D11DecoderOutputView * output_view, guint content_key_size, - gconstpointer content_key) -{ - GstD3D11DecoderPrivate *priv; - guint retry_count = 0; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - g_return_val_if_fail 
(output_view != NULL, FALSE); - g_return_val_if_fail (output_view->handle != NULL, FALSE); - - priv = decoder->priv; - - do { - GST_LOG_OBJECT (decoder, "Try begin frame, retry count %d", retry_count); - gst_d3d11_device_lock (priv->device); - hr = ID3D11VideoContext_DecoderBeginFrame (priv->video_context, - priv->decoder, output_view->handle, content_key_size, content_key); - gst_d3d11_device_unlock (priv->device); - - if (hr == E_PENDING && retry_count < 50) { - GST_LOG_OBJECT (decoder, "GPU busy, try again"); - - /* HACK: no better idea other than sleep... - * 1ms waiting like msdkdec */ - g_usleep (1000); - } else { - if (gst_d3d11_result (hr, priv->device)) - GST_LOG_OBJECT (decoder, "Success with retry %d", retry_count); - break; - } - - retry_count++; - } while (TRUE); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_ERROR_OBJECT (decoder, "Failed to begin frame, hr: 0x%x", (guint) hr); - return FALSE; - } - - return TRUE; -} - -gboolean -gst_d3d11_decoder_end_frame (GstD3D11Decoder * decoder) -{ - GstD3D11DecoderPrivate *priv; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - - priv = decoder->priv; - - gst_d3d11_device_lock (priv->device); - hr = ID3D11VideoContext_DecoderEndFrame (priv->video_context, priv->decoder); - gst_d3d11_device_unlock (priv->device); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_WARNING_OBJECT (decoder, "EndFrame failed, hr: 0x%x", (guint) hr); - return FALSE; - } - - return TRUE; -} - -gboolean -gst_d3d11_decoder_get_decoder_buffer (GstD3D11Decoder * decoder, - D3D11_VIDEO_DECODER_BUFFER_TYPE type, guint * buffer_size, - gpointer * buffer) -{ - GstD3D11DecoderPrivate *priv; - UINT size; - void *decoder_buffer; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - - priv = decoder->priv; - - gst_d3d11_device_lock (priv->device); - hr = ID3D11VideoContext_GetDecoderBuffer (priv->video_context, - priv->decoder, type, &size, &decoder_buffer); - 
gst_d3d11_device_unlock (priv->device); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_WARNING_OBJECT (decoder, "Getting buffer type %d error, hr: 0x%x", - type, (guint) hr); - return FALSE; - } - - *buffer_size = size; - *buffer = decoder_buffer; - - return TRUE; -} - -gboolean -gst_d3d11_decoder_release_decoder_buffer (GstD3D11Decoder * decoder, - D3D11_VIDEO_DECODER_BUFFER_TYPE type) -{ - GstD3D11DecoderPrivate *priv; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - - priv = decoder->priv; - - gst_d3d11_device_lock (priv->device); - hr = ID3D11VideoContext_ReleaseDecoderBuffer (priv->video_context, - priv->decoder, type); - gst_d3d11_device_unlock (priv->device); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_WARNING_OBJECT (decoder, "ReleaseDecoderBuffer failed, hr: 0x%x", - (guint) hr); - return FALSE; - } - - return TRUE; -} - -gboolean -gst_d3d11_decoder_submit_decoder_buffers (GstD3D11Decoder * decoder, - guint buffer_count, const D3D11_VIDEO_DECODER_BUFFER_DESC * buffers) -{ - GstD3D11DecoderPrivate *priv; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - - priv = decoder->priv; - - gst_d3d11_device_lock (priv->device); - hr = ID3D11VideoContext_SubmitDecoderBuffers (priv->video_context, - priv->decoder, buffer_count, buffers); - gst_d3d11_device_unlock (priv->device); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_WARNING_OBJECT (decoder, "SubmitDecoderBuffers failed, hr: 0x%x", - (guint) hr); - return FALSE; - } - - return TRUE; -} - -GstBuffer * -gst_d3d11_decoder_get_output_view_buffer (GstD3D11Decoder * decoder) -{ - GstD3D11DecoderPrivate *priv; - GstBuffer *buf = NULL; - GstFlowReturn ret; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - - priv = decoder->priv; - - ret = gst_buffer_pool_acquire_buffer (priv->internal_pool, &buf, NULL); - - if (ret != GST_FLOW_OK || !buf) { - GST_ERROR_OBJECT (decoder, "Couldn't get buffer from pool, ret %s", - 
gst_flow_get_name (ret)); - return NULL; - } - - if (!gst_d3d11_decoder_ensure_output_view (decoder, buf)) { - GST_ERROR_OBJECT (decoder, "Output view unavailable"); - gst_buffer_unref (buf); - - return NULL; - } - - return buf; -} - -GstD3D11DecoderOutputView * -gst_d3d11_decoder_get_output_view_from_buffer (GstD3D11Decoder * decoder, - GstBuffer * buffer) -{ - GstMemory *mem; - GstD3D11DecoderOutputView *view; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), NULL); - g_return_val_if_fail (GST_IS_BUFFER (buffer), NULL); - - mem = gst_buffer_peek_memory (buffer, 0); - if (!gst_is_d3d11_memory (mem)) { - GST_WARNING_OBJECT (decoder, "memory is not d3d11 memory"); - return NULL; - } - - view = (GstD3D11DecoderOutputView *) - gst_mini_object_get_qdata (GST_MINI_OBJECT_CAST (mem), OUTPUT_VIEW_QUARK); - - if (!view) { - GST_WARNING_OBJECT (decoder, "memory does not have output view"); - } - - return view; -} - -guint -gst_d3d11_decoder_get_output_view_index (GstD3D11Decoder * decoder, - ID3D11VideoDecoderOutputView * view_handle) -{ - D3D11_VIDEO_DECODER_OUTPUT_VIEW_DESC view_desc; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), 0xff); - g_return_val_if_fail (view_handle != NULL, 0xff); - - ID3D11VideoDecoderOutputView_GetDesc (view_handle, &view_desc); - - return view_desc.Texture2D.ArraySlice; -} - -static gboolean -copy_to_system (GstD3D11Decoder * self, GstVideoInfo * info, gint display_width, - gint display_height, gboolean need_convert, - GstBuffer * decoder_buffer, GstBuffer * output) -{ - GstD3D11DecoderPrivate *priv = self->priv; - GstVideoFrame out_frame; - gint i; - GstD3D11Memory *in_mem; - D3D11_MAPPED_SUBRESOURCE map; - HRESULT hr; - ID3D11Texture2D *in_texture; - guint in_subresource_index; - ID3D11DeviceContext *device_context = - gst_d3d11_device_get_device_context_handle (priv->device); - - if (!gst_video_frame_map (&out_frame, info, output, GST_MAP_WRITE)) { - GST_ERROR_OBJECT (self, "Couldn't map output buffer"); - return FALSE; -
} - - in_mem = (GstD3D11Memory *) gst_buffer_peek_memory (decoder_buffer, 0); - - in_texture = in_mem->texture; - in_subresource_index = in_mem->subresource_index; - - gst_d3d11_device_lock (priv->device); - - if (need_convert) { - D3D11_BOX src_box; - RECT rect; - ID3D11ShaderResourceView **srv = NULL; - - GST_LOG_OBJECT (self, "convert resolution, %dx%d -> %dx%d", - display_width, display_height, - priv->display_width, priv->display_height); - - src_box.left = 0; - src_box.top = 0; - src_box.front = 0; - src_box.back = 1; - src_box.right = GST_ROUND_UP_2 (display_width); - src_box.bottom = GST_ROUND_UP_2 (display_height); - - /* array of texture can be used for shader resource view */ - if (priv->use_array_of_texture && - gst_d3d11_memory_ensure_shader_resource_view (in_mem)) { - GST_TRACE_OBJECT (self, "Decoded texture supports shader resource view"); - srv = in_mem->shader_resource_view; - } - - if (!srv) { - /* copy decoded texture into shader resource texture */ - GST_TRACE_OBJECT (self, - "Copy decoded texture to internal shader texture"); - ID3D11DeviceContext_CopySubresourceRegion (device_context, - (ID3D11Resource *) priv->shader_resource_texture, 0, 0, 0, 0, - (ID3D11Resource *) in_mem->texture, in_mem->subresource_index, - &src_box); - - srv = priv->shader_resource_view; - } - - rect.left = 0; - rect.top = 0; - rect.right = display_width; - rect.bottom = display_height; - - gst_d3d11_color_converter_update_crop_rect (priv->converter, &rect); - - if (!gst_d3d11_color_converter_convert_unlocked (priv->converter, - srv, priv->fallback_render_target_view)) { - GST_ERROR_OBJECT (self, "Failed to convert"); - goto error; - } - - in_texture = priv->fallback_shader_output_texture; - in_subresource_index = 0; - } - - ID3D11DeviceContext_CopySubresourceRegion (device_context, - (ID3D11Resource *) priv->staging, 0, 0, 0, 0, - (ID3D11Resource *) in_texture, in_subresource_index, NULL); - - hr = ID3D11DeviceContext_Map (device_context, - (ID3D11Resource *) 
priv->staging, 0, D3D11_MAP_READ, 0, &map); - - if (!gst_d3d11_result (hr, priv->device)) { - GST_ERROR_OBJECT (self, "Failed to map, hr: 0x%x", (guint) hr); - goto error; - } - - /* calculate stride and offset only once */ - if (priv->stating_texture_stride[0] == 0) { - D3D11_TEXTURE2D_DESC desc; - gsize dummy; - - ID3D11Texture2D_GetDesc (priv->staging, &desc); - - gst_d3d11_dxgi_format_get_size (desc.Format, desc.Width, desc.Height, - map.RowPitch, priv->staging_texture_offset, - priv->stating_texture_stride, &dummy); - } - - for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&out_frame); i++) { - guint8 *src, *dst; - gint j; - gint width; - - src = (guint8 *) map.pData + priv->staging_texture_offset[i]; - dst = GST_VIDEO_FRAME_PLANE_DATA (&out_frame, i); - width = GST_VIDEO_FRAME_COMP_WIDTH (&out_frame, i) * - GST_VIDEO_FRAME_COMP_PSTRIDE (&out_frame, i); - - for (j = 0; j < GST_VIDEO_FRAME_COMP_HEIGHT (&out_frame, i); j++) { - memcpy (dst, src, width); - dst += GST_VIDEO_FRAME_PLANE_STRIDE (&out_frame, i); - src += priv->stating_texture_stride[i]; - } - } - - gst_video_frame_unmap (&out_frame); - ID3D11DeviceContext_Unmap (device_context, (ID3D11Resource *) priv->staging, - 0); - gst_d3d11_device_unlock (priv->device); - - return TRUE; - -error: - gst_d3d11_device_unlock (priv->device); - return FALSE; -} - -static gboolean -copy_to_d3d11 (GstD3D11Decoder * self, GstVideoInfo * info, gint display_width, - gint display_height, gboolean need_convert, - GstBuffer * decoder_buffer, GstBuffer * output) -{ - GstD3D11DecoderPrivate *priv = self->priv; - GstD3D11Memory *in_mem; - GstD3D11Memory *out_mem; - D3D11_BOX src_box; - ID3D11Texture2D *in_texture; - guint in_subresource_index; - ID3D11DeviceContext *device_context = - gst_d3d11_device_get_device_context_handle (priv->device); - - gst_d3d11_device_lock (priv->device); - - in_mem = (GstD3D11Memory *) gst_buffer_peek_memory (decoder_buffer, 0); - out_mem = (GstD3D11Memory *) gst_buffer_peek_memory (output, 0); - - 
in_texture = in_mem->texture; - in_subresource_index = in_mem->subresource_index; - - src_box.left = 0; - src_box.top = 0; - src_box.front = 0; - src_box.back = 1; - - if (need_convert) { - gboolean need_copy = FALSE; - ID3D11RenderTargetView **rtv; - ID3D11ShaderResourceView **srv = NULL; - RECT rect; - - GST_LOG_OBJECT (self, "convert resolution, %dx%d -> %dx%d", - display_width, display_height, - priv->display_width, priv->display_height); - - if (!gst_d3d11_memory_ensure_render_target_view (out_mem)) { - /* convert to fallback output view */ - GST_LOG_OBJECT (self, "output memory cannot support render target view"); - rtv = priv->fallback_render_target_view; - need_copy = TRUE; - } else { - rtv = out_mem->render_target_view; - } - - src_box.right = GST_ROUND_UP_2 (display_width); - src_box.bottom = GST_ROUND_UP_2 (display_height); - - /* array of texture can be used for shader resource view */ - if (priv->use_array_of_texture && - gst_d3d11_memory_ensure_shader_resource_view (in_mem)) { - GST_TRACE_OBJECT (self, "Decoded texture supports shader resource view"); - srv = in_mem->shader_resource_view; - } - - if (!srv) { - /* copy decoded texture into shader resource texture */ - GST_TRACE_OBJECT (self, - "Copy decoded texture to internal shader texture"); - ID3D11DeviceContext_CopySubresourceRegion (device_context, - (ID3D11Resource *) priv->shader_resource_texture, 0, 0, 0, 0, - (ID3D11Resource *) in_texture, in_mem->subresource_index, &src_box); - - srv = priv->shader_resource_view; - } - - rect.left = 0; - rect.top = 0; - rect.right = display_width; - rect.bottom = display_height; - - gst_d3d11_color_converter_update_crop_rect (priv->converter, &rect); - - if (!gst_d3d11_color_converter_convert_unlocked (priv->converter, srv, rtv)) { - GST_ERROR_OBJECT (self, "Failed to convert"); - goto error; - } - - if (!need_copy) { - gst_d3d11_device_unlock (priv->device); - return TRUE; - } - - in_texture = priv->fallback_shader_output_texture; - in_subresource_index = 
0; - } - - src_box.right = GST_ROUND_UP_2 (priv->display_width); - src_box.bottom = GST_ROUND_UP_2 (priv->display_height); - - ID3D11DeviceContext_CopySubresourceRegion (device_context, - (ID3D11Resource *) out_mem->texture, - out_mem->subresource_index, 0, 0, 0, - (ID3D11Resource *) in_texture, in_subresource_index, &src_box); - - GST_MINI_OBJECT_FLAG_SET (out_mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - gst_d3d11_device_unlock (priv->device); - - return TRUE; - -error: - gst_d3d11_device_unlock (priv->device); - return FALSE; -} - -gboolean -gst_d3d11_decoder_process_output (GstD3D11Decoder * decoder, - GstVideoInfo * info, gint display_width, gint display_height, - GstBuffer * decoder_buffer, GstBuffer * output) -{ - GstD3D11DecoderPrivate *priv; - gboolean can_device_copy = TRUE; - gboolean need_convert = FALSE; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - g_return_val_if_fail (GST_IS_BUFFER (decoder_buffer), FALSE); - g_return_val_if_fail (GST_IS_BUFFER (output), FALSE); - - priv = decoder->priv; - - need_convert = priv->converter && - (priv->display_width != display_width || - priv->display_height != display_height); - - /* if decoder buffer is intended to be outputted and we don't need to - * do post processing, do nothing here */ - if (decoder_buffer == output && !need_convert) - return TRUE; - - /* decoder buffer must have single memory */ - if (gst_buffer_n_memory (decoder_buffer) == gst_buffer_n_memory (output)) { - GstMemory *mem; - GstD3D11Memory *dmem; - - mem = gst_buffer_peek_memory (output, 0); - if (!gst_is_d3d11_memory (mem)) { - can_device_copy = FALSE; - goto do_process; - } - - dmem = (GstD3D11Memory *) mem; - if (dmem->device != priv->device) - can_device_copy = FALSE; - } else { - can_device_copy = FALSE; - } - -do_process: - if (can_device_copy) { - return copy_to_d3d11 (decoder, info, display_width, display_height, - need_convert, decoder_buffer, output); - } - - return copy_to_system (decoder, info, 
display_width, display_height, - need_convert, decoder_buffer, output); -} - -gboolean -gst_d3d11_decoder_negotiate (GstVideoDecoder * decoder, - GstVideoCodecState * input_state, GstVideoFormat format, - guint width, guint height, GstVideoCodecState ** output_state, - gboolean * downstream_supports_d3d11) -{ - GstCaps *peer_caps; - GstVideoCodecState *state; - - g_return_val_if_fail (GST_IS_VIDEO_DECODER (decoder), FALSE); - g_return_val_if_fail (input_state != NULL, FALSE); - g_return_val_if_fail (format != GST_VIDEO_FORMAT_UNKNOWN, FALSE); - g_return_val_if_fail (width > 0, FALSE); - g_return_val_if_fail (height > 0, FALSE); - g_return_val_if_fail (output_state != NULL, FALSE); - g_return_val_if_fail (downstream_supports_d3d11 != NULL, FALSE); - - state = gst_video_decoder_set_output_state (decoder, - format, width, height, input_state); - state->caps = gst_video_info_to_caps (&state->info); - - if (*output_state) - gst_video_codec_state_unref (*output_state); - *output_state = state; - - peer_caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (decoder)); - GST_DEBUG_OBJECT (decoder, "Allowed caps %" GST_PTR_FORMAT, peer_caps); - - *downstream_supports_d3d11 = FALSE; - - if (!peer_caps || gst_caps_is_any (peer_caps)) { - GST_DEBUG_OBJECT (decoder, - "cannot determine output format, use system memory"); - } else { - GstCapsFeatures *features; - guint size = gst_caps_get_size (peer_caps); - guint i; - - for (i = 0; i < size; i++) { - features = gst_caps_get_features (peer_caps, i); - if (features && gst_caps_features_contains (features, - GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { - GST_DEBUG_OBJECT (decoder, "found D3D11 memory feature"); - gst_caps_set_features (state->caps, 0, - gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, NULL)); - - *downstream_supports_d3d11 = TRUE; - break; - } - } - } - gst_clear_caps (&peer_caps); - - return TRUE; -} - -gboolean -gst_d3d11_decoder_decide_allocation (GstVideoDecoder * decoder, - GstQuery * query, 
GstD3D11Device * device, GstD3D11Codec codec, - gboolean use_d3d11_pool) -{ - GstCaps *outcaps; - GstBufferPool *pool = NULL; - guint n, size, min, max; - GstVideoInfo vinfo = { 0, }; - GstStructure *config; - GstD3D11AllocationParams *d3d11_params; - - g_return_val_if_fail (GST_IS_VIDEO_DECODER (decoder), FALSE); - g_return_val_if_fail (query != NULL, FALSE); - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); - g_return_val_if_fail (codec > GST_D3D11_CODEC_NONE && - codec < GST_D3D11_CODEC_LAST, FALSE); - - gst_query_parse_allocation (query, &outcaps, NULL); - - if (!outcaps) { - GST_DEBUG_OBJECT (decoder, "No output caps"); - return FALSE; - } - - gst_video_info_from_caps (&vinfo, outcaps); - n = gst_query_get_n_allocation_pools (query); - if (n > 0) - gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); - - /* create our own pool */ - if (pool && (use_d3d11_pool && !GST_D3D11_BUFFER_POOL (pool))) { - gst_object_unref (pool); - pool = NULL; - } - - if (!pool) { - if (use_d3d11_pool) - pool = gst_d3d11_buffer_pool_new (device); - else - pool = gst_video_buffer_pool_new (); - - min = max = 0; - size = (guint) vinfo.size; - } - - config = gst_buffer_pool_get_config (pool); - gst_buffer_pool_config_set_params (config, outcaps, size, min, max); - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - - if (use_d3d11_pool) { - GstVideoAlignment align; - gint width, height; - - gst_video_alignment_reset (&align); - - d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); - if (!d3d11_params) - d3d11_params = gst_d3d11_allocation_params_new (device, &vinfo, 0, 0); - - width = GST_VIDEO_INFO_WIDTH (&vinfo); - height = GST_VIDEO_INFO_HEIGHT (&vinfo); - - /* need alignment to copy decoder output texture to downstream texture */ - align.padding_right = GST_ROUND_UP_16 (width) - width; - align.padding_bottom = GST_ROUND_UP_16 (height) - height; - if (!gst_d3d11_allocation_params_alignment 
(d3d11_params, &align)) { - GST_ERROR_OBJECT (decoder, "Cannot set alignment"); - return FALSE; - } - - if (codec == GST_D3D11_CODEC_VP9) { - /* Needs render target bind flag so that it can be used for - * output of shader pipeline if internal resizing is required */ - d3d11_params->desc[0].BindFlags |= D3D11_BIND_RENDER_TARGET; - } - - gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); - gst_d3d11_allocation_params_free (d3d11_params); - } - - gst_buffer_pool_set_config (pool, config); - if (use_d3d11_pool) - size = GST_D3D11_BUFFER_POOL (pool)->buffer_size; - - if (n > 0) - gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); - else - gst_query_add_allocation_pool (query, pool, size, min, max); - gst_object_unref (pool); - - return TRUE; -} - -gboolean -gst_d3d11_decoder_supports_direct_rendering (GstD3D11Decoder * decoder) -{ - GstD3D11DecoderPrivate *priv; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - - priv = decoder->priv; - - return priv->can_direct_rendering; -} - -/* Keep sync with chromium and keep in sorted order. 
- * See supported_profile_helpers.cc in chromium */ -static const guint legacy_amd_list[] = { - 0x130f, 0x6700, 0x6701, 0x6702, 0x6703, 0x6704, 0x6705, 0x6706, 0x6707, - 0x6708, 0x6709, 0x6718, 0x6719, 0x671c, 0x671d, 0x671f, 0x6720, 0x6721, - 0x6722, 0x6723, 0x6724, 0x6725, 0x6726, 0x6727, 0x6728, 0x6729, 0x6738, - 0x6739, 0x673e, 0x6740, 0x6741, 0x6742, 0x6743, 0x6744, 0x6745, 0x6746, - 0x6747, 0x6748, 0x6749, 0x674a, 0x6750, 0x6751, 0x6758, 0x6759, 0x675b, - 0x675d, 0x675f, 0x6760, 0x6761, 0x6762, 0x6763, 0x6764, 0x6765, 0x6766, - 0x6767, 0x6768, 0x6770, 0x6771, 0x6772, 0x6778, 0x6779, 0x677b, 0x6798, - 0x67b1, 0x6821, 0x683d, 0x6840, 0x6841, 0x6842, 0x6843, 0x6849, 0x6850, - 0x6858, 0x6859, 0x6880, 0x6888, 0x6889, 0x688a, 0x688c, 0x688d, 0x6898, - 0x6899, 0x689b, 0x689c, 0x689d, 0x689e, 0x68a0, 0x68a1, 0x68a8, 0x68a9, - 0x68b0, 0x68b8, 0x68b9, 0x68ba, 0x68be, 0x68bf, 0x68c0, 0x68c1, 0x68c7, - 0x68c8, 0x68c9, 0x68d8, 0x68d9, 0x68da, 0x68de, 0x68e0, 0x68e1, 0x68e4, - 0x68e5, 0x68e8, 0x68e9, 0x68f1, 0x68f2, 0x68f8, 0x68f9, 0x68fa, 0x68fe, - 0x9400, 0x9401, 0x9402, 0x9403, 0x9405, 0x940a, 0x940b, 0x940f, 0x9440, - 0x9441, 0x9442, 0x9443, 0x9444, 0x9446, 0x944a, 0x944b, 0x944c, 0x944e, - 0x9450, 0x9452, 0x9456, 0x945a, 0x945b, 0x945e, 0x9460, 0x9462, 0x946a, - 0x946b, 0x947a, 0x947b, 0x9480, 0x9487, 0x9488, 0x9489, 0x948a, 0x948f, - 0x9490, 0x9491, 0x9495, 0x9498, 0x949c, 0x949e, 0x949f, 0x94a0, 0x94a1, - 0x94a3, 0x94b1, 0x94b3, 0x94b4, 0x94b5, 0x94b9, 0x94c0, 0x94c1, 0x94c3, - 0x94c4, 0x94c5, 0x94c6, 0x94c7, 0x94c8, 0x94c9, 0x94cb, 0x94cc, 0x94cd, - 0x9500, 0x9501, 0x9504, 0x9505, 0x9506, 0x9507, 0x9508, 0x9509, 0x950f, - 0x9511, 0x9515, 0x9517, 0x9519, 0x9540, 0x9541, 0x9542, 0x954e, 0x954f, - 0x9552, 0x9553, 0x9555, 0x9557, 0x955f, 0x9580, 0x9581, 0x9583, 0x9586, - 0x9587, 0x9588, 0x9589, 0x958a, 0x958b, 0x958c, 0x958d, 0x958e, 0x958f, - 0x9590, 0x9591, 0x9593, 0x9595, 0x9596, 0x9597, 0x9598, 0x9599, 0x959b, - 0x95c0, 0x95c2, 0x95c4, 0x95c5, 0x95c6, 0x95c7, 
0x95c9, 0x95cc, 0x95cd, - 0x95ce, 0x95cf, 0x9610, 0x9611, 0x9612, 0x9613, 0x9614, 0x9615, 0x9616, - 0x9640, 0x9641, 0x9642, 0x9643, 0x9644, 0x9645, 0x9647, 0x9648, 0x9649, - 0x964a, 0x964b, 0x964c, 0x964e, 0x964f, 0x9710, 0x9711, 0x9712, 0x9713, - 0x9714, 0x9715, 0x9802, 0x9803, 0x9804, 0x9805, 0x9806, 0x9807, 0x9808, - 0x9809, 0x980a, 0x9830, 0x983d, 0x9850, 0x9851, 0x9874, 0x9900, 0x9901, - 0x9903, 0x9904, 0x9905, 0x9906, 0x9907, 0x9908, 0x9909, 0x990a, 0x990b, - 0x990c, 0x990d, 0x990e, 0x990f, 0x9910, 0x9913, 0x9917, 0x9918, 0x9919, - 0x9990, 0x9991, 0x9992, 0x9993, 0x9994, 0x9995, 0x9996, 0x9997, 0x9998, - 0x9999, 0x999a, 0x999b, 0x999c, 0x999d, 0x99a0, 0x99a2, 0x99a4 -}; - -static const guint legacy_intel_list[] = { - 0x102, 0x106, 0x116, 0x126, 0x152, 0x156, 0x166, - 0x402, 0x406, 0x416, 0x41e, 0xa06, 0xa16, 0xf31, -}; - -static gint -binary_search_compare (const guint * a, const guint * b) -{ - return *a - *b; -} - -/* Certain AMD GPU drivers like R600, R700, Evergreen and Cayman and some second - * generation Intel GPU drivers crash if we create a video device with a - * resolution higher than 1920 x 1088. This function checks if the GPU is in - * this list and if yes returns true. 
*/ -gboolean -gst_d3d11_decoder_util_is_legacy_device (GstD3D11Device * device) -{ - const guint amd_id[] = { 0x1002, 0x1022 }; - const guint intel_id = 0x8086; - guint device_id = 0; - guint vendor_id = 0; - guint *match = NULL; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); - - g_object_get (device, "device-id", &device_id, "vendor-id", &vendor_id, NULL); - - if (vendor_id == amd_id[0] || vendor_id == amd_id[1]) { - match = - (guint *) gst_util_array_binary_search ((gpointer) legacy_amd_list, - G_N_ELEMENTS (legacy_amd_list), sizeof (guint), - (GCompareDataFunc) binary_search_compare, - GST_SEARCH_MODE_EXACT, &device_id, NULL); - } else if (vendor_id == intel_id) { - match = - (guint *) gst_util_array_binary_search ((gpointer) legacy_intel_list, - G_N_ELEMENTS (legacy_intel_list), sizeof (guint), - (GCompareDataFunc) binary_search_compare, - GST_SEARCH_MODE_EXACT, &device_id, NULL); - } - - if (match) { - GST_DEBUG_OBJECT (device, "it's legacy device"); - return TRUE; - } - - return FALSE; -} - -gboolean -gst_d3d11_decoder_supports_format (GstD3D11Decoder * decoder, - const GUID * decoder_profile, DXGI_FORMAT format) -{ - GstD3D11DecoderPrivate *priv; - HRESULT hr; - BOOL can_support = FALSE; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - g_return_val_if_fail (decoder_profile != NULL, FALSE); - g_return_val_if_fail (format != DXGI_FORMAT_UNKNOWN, FALSE); - - priv = decoder->priv; - - hr = ID3D11VideoDevice_CheckVideoDecoderFormat (priv->video_device, - decoder_profile, format, &can_support); - if (!gst_d3d11_result (hr, priv->device) || !can_support) { - GST_DEBUG_OBJECT (decoder, - "VideoDevice could not support dxgi format %d, hr: 0x%x", - format, (guint) hr); - - return FALSE; - } - - return TRUE; -} - -/* Don't call this method with legacy device */ -gboolean -gst_d3d11_decoder_supports_resolution (GstD3D11Decoder * decoder, - const GUID * decoder_profile, DXGI_FORMAT format, guint width, guint height) -{ - 
D3D11_VIDEO_DECODER_DESC desc; - GstD3D11DecoderPrivate *priv; - HRESULT hr; - UINT config_count; - - g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); - g_return_val_if_fail (decoder_profile != NULL, FALSE); - g_return_val_if_fail (format != DXGI_FORMAT_UNKNOWN, FALSE); - - priv = decoder->priv; - - desc.SampleWidth = width; - desc.SampleHeight = height; - desc.OutputFormat = format; - desc.Guid = *decoder_profile; - - hr = ID3D11VideoDevice_GetVideoDecoderConfigCount (priv->video_device, - &desc, &config_count); - if (!gst_d3d11_result (hr, priv->device) || config_count == 0) { - GST_DEBUG_OBJECT (decoder, "Could not get decoder config count, hr: 0x%x", - (guint) hr); - return FALSE; - } - - return TRUE; -} - -/** - * gst_d3d11_decoder_class_data_new: - * @device: (transfer none): a #GstD3D11Device - * @sink_caps: (transfer full): a #GstCaps - * @src_caps: (transfer full): a #GstCaps - * - * Create new #GstD3D11DecoderClassData - * - * Returns: (transfer full): the new #GstD3D11DecoderClassData - */ -GstD3D11DecoderClassData * -gst_d3d11_decoder_class_data_new (GstD3D11Device * device, - GstCaps * sink_caps, GstCaps * src_caps) -{ - GstD3D11DecoderClassData *ret; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - g_return_val_if_fail (sink_caps != NULL, NULL); - g_return_val_if_fail (src_caps != NULL, NULL); - - ret = g_new0 (GstD3D11DecoderClassData, 1); - - /* class data will be leaked if the element never gets instantiated */ - GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - - g_object_get (device, "adapter", &ret->adapter, - "device-id", &ret->device_id, "vendor-id", &ret->vendor_id, - "description", &ret->description, NULL); - ret->sink_caps = sink_caps; - ret->src_caps = src_caps; - - return ret; -} - -void -gst_d3d11_decoder_class_data_free (GstD3D11DecoderClassData * data) -{ - if (!data) - return; - - gst_clear_caps 
(&data->sink_caps); - gst_clear_caps (&data->src_caps); - g_free (data->description); - g_free (data); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11device.c
Deleted
@@ -1,1168 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11device.h" -#include "gstd3d11utils.h" -#include "gstd3d11format.h" -#include "gmodule.h" - -#if HAVE_D3D11SDKLAYERS_H -#include <d3d11sdklayers.h> -static GModule *d3d11_debug_module = NULL; - -/* mingw header does not define D3D11_RLDO_IGNORE_INTERNAL - * D3D11_RLDO_SUMMARY = 0x1, - D3D11_RLDO_DETAIL = 0x2, - * D3D11_RLDO_IGNORE_INTERNAL = 0x4 - */ -#define GST_D3D11_RLDO_FLAGS (0x2 | 0x4) -#endif - -#if HAVE_DXGIDEBUG_H -#include <dxgidebug.h> -typedef HRESULT (WINAPI * DXGIGetDebugInterface_t) (REFIID riid, - void **ppDebug); -static GModule *dxgi_debug_module = NULL; -static DXGIGetDebugInterface_t GstDXGIGetDebugInterface = NULL; - -#endif - -#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H) -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_debug_layer_debug); -#endif -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_device_debug); -#define GST_CAT_DEFAULT gst_d3d11_device_debug - -enum -{ - PROP_0, - PROP_ADAPTER, - PROP_DEVICE_ID, - PROP_VENDOR_ID, - PROP_HARDWARE, - PROP_DESCRIPTION, - PROP_ALLOW_TEARING, -}; - -#define DEFAULT_ADAPTER 0 - -struct 
_GstD3D11DevicePrivate -{ - guint adapter; - guint device_id; - guint vendor_id; - gboolean hardware; - gchar *description; - gboolean allow_tearing; - - ID3D11Device *device; - ID3D11DeviceContext *device_context; - - IDXGIFactory1 *factory; - GstD3D11DXGIFactoryVersion factory_ver; - D3D_FEATURE_LEVEL feature_level; - GstD3D11Format format_table[GST_D3D11_N_FORMATS]; - - GRecMutex extern_lock; - -#if HAVE_D3D11SDKLAYERS_H - ID3D11Debug *d3d11_debug; - ID3D11InfoQueue *d3d11_info_queue; -#endif - -#if HAVE_DXGIDEBUG_H - IDXGIDebug *dxgi_debug; - IDXGIInfoQueue *dxgi_info_queue; -#endif -}; - -#define gst_d3d11_device_parent_class parent_class -G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11Device, gst_d3d11_device, GST_TYPE_OBJECT); - -static void gst_d3d11_device_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec); -static void gst_d3d11_device_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec); -static void gst_d3d11_device_constructed (GObject * object); -static void gst_d3d11_device_dispose (GObject * object); -static void gst_d3d11_device_finalize (GObject * object); - -#if HAVE_D3D11SDKLAYERS_H -static gboolean -gst_d3d11_device_enable_d3d11_debug (void) -{ - static gsize _init = 0; - - /* If all below libraries are unavailable, d3d11 device would fail with - * D3D11_CREATE_DEVICE_DEBUG flag */ - if (g_once_init_enter (&_init)) { - d3d11_debug_module = - g_module_open ("d3d11sdklayers.dll", G_MODULE_BIND_LAZY); - - if (!d3d11_debug_module) - d3d11_debug_module = - g_module_open ("d3d11_1sdklayers.dll", G_MODULE_BIND_LAZY); - - g_once_init_leave (&_init, 1); - } - - return ! 
!d3d11_debug_module; -} - -static inline GstDebugLevel -d3d11_message_severity_to_gst (D3D11_MESSAGE_SEVERITY level) -{ - switch (level) { - case D3D11_MESSAGE_SEVERITY_CORRUPTION: - case D3D11_MESSAGE_SEVERITY_ERROR: - return GST_LEVEL_ERROR; - case D3D11_MESSAGE_SEVERITY_WARNING: - return GST_LEVEL_WARNING; - case D3D11_MESSAGE_SEVERITY_INFO: - return GST_LEVEL_INFO; - case D3D11_MESSAGE_SEVERITY_MESSAGE: - return GST_LEVEL_DEBUG; - default: - break; - } - - return GST_LEVEL_LOG; -} - -void -gst_d3d11_device_d3d11_debug (GstD3D11Device * device, - const gchar * file, const gchar * function, gint line) -{ - GstD3D11DevicePrivate *priv = device->priv; - D3D11_MESSAGE *msg; - SIZE_T msg_len = 0; - HRESULT hr; - UINT64 num_msg, i; - - if (!priv->d3d11_info_queue) - return; - - num_msg = ID3D11InfoQueue_GetNumStoredMessages (priv->d3d11_info_queue); - - for (i = 0; i < num_msg; i++) { - GstDebugLevel level; - - hr = ID3D11InfoQueue_GetMessage (priv->d3d11_info_queue, i, NULL, &msg_len); - - if (FAILED (hr) || msg_len == 0) { - return; - } - - msg = (D3D11_MESSAGE *) g_alloca (msg_len); - hr = ID3D11InfoQueue_GetMessage (priv->d3d11_info_queue, i, msg, &msg_len); - - level = d3d11_message_severity_to_gst (msg->Severity); - gst_debug_log (gst_d3d11_debug_layer_debug, level, file, function, line, - G_OBJECT (device), "D3D11InfoQueue: %s", msg->pDescription); - } - - ID3D11InfoQueue_ClearStoredMessages (priv->d3d11_info_queue); - - return; -} -#else -void -gst_d3d11_device_d3d11_debug (GstD3D11Device * device, - const gchar * file, const gchar * function, gint line) -{ - /* do nothing */ - return; -} -#endif - -#if HAVE_DXGIDEBUG_H -static gboolean -gst_d3d11_device_enable_dxgi_debug (void) -{ - static gsize _init = 0; - gboolean ret = FALSE; - - /* If all below libraries are unavailable, d3d11 device would fail with - * D3D11_CREATE_DEVICE_DEBUG flag */ - if (g_once_init_enter (&_init)) { -#if (!GST_D3D11_WINAPI_ONLY_APP) - dxgi_debug_module = g_module_open 
("dxgidebug.dll", G_MODULE_BIND_LAZY); - - if (dxgi_debug_module) - g_module_symbol (dxgi_debug_module, - "DXGIGetDebugInterface", (gpointer *) & GstDXGIGetDebugInterface); - ret = ! !GstDXGIGetDebugInterface; -#elif (DXGI_HEADER_VERSION >= 3) - ret = TRUE; -#endif - g_once_init_leave (&_init, 1); - } - - return ret; -} - -static HRESULT -gst_d3d11_device_dxgi_get_device_interface (REFIID riid, void **debug) -{ -#if (!GST_D3D11_WINAPI_ONLY_APP) - if (GstDXGIGetDebugInterface) { - return GstDXGIGetDebugInterface (riid, debug); - } -#elif (DXGI_HEADER_VERSION >= 3) - return DXGIGetDebugInterface1 (0, riid, debug); -#endif - - return E_NOINTERFACE; -} - -static inline GstDebugLevel -dxgi_info_queue_message_severity_to_gst (DXGI_INFO_QUEUE_MESSAGE_SEVERITY level) -{ - switch (level) { - case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_CORRUPTION: - case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_ERROR: - return GST_LEVEL_ERROR; - case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_WARNING: - return GST_LEVEL_WARNING; - case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_INFO: - return GST_LEVEL_INFO; - case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_MESSAGE: - return GST_LEVEL_DEBUG; - default: - break; - } - - return GST_LEVEL_LOG; -} - -void -gst_d3d11_device_dxgi_debug (GstD3D11Device * device, - const gchar * file, const gchar * function, gint line) -{ - GstD3D11DevicePrivate *priv = device->priv; - DXGI_INFO_QUEUE_MESSAGE *msg; - SIZE_T msg_len = 0; - HRESULT hr; - UINT64 num_msg, i; - - if (!priv->dxgi_info_queue) - return; - - num_msg = IDXGIInfoQueue_GetNumStoredMessages (priv->dxgi_info_queue, - DXGI_DEBUG_ALL); - - for (i = 0; i < num_msg; i++) { - GstDebugLevel level; - - hr = IDXGIInfoQueue_GetMessage (priv->dxgi_info_queue, - DXGI_DEBUG_ALL, i, NULL, &msg_len); - - if (FAILED (hr) || msg_len == 0) { - return; - } - - msg = (DXGI_INFO_QUEUE_MESSAGE *) g_alloca (msg_len); - hr = IDXGIInfoQueue_GetMessage (priv->dxgi_info_queue, - DXGI_DEBUG_ALL, i, msg, &msg_len); - - level = dxgi_info_queue_message_severity_to_gst 
(msg->Severity); - gst_debug_log (gst_d3d11_debug_layer_debug, level, file, function, line, - G_OBJECT (device), "DXGIInfoQueue: %s", msg->pDescription); - } - - IDXGIInfoQueue_ClearStoredMessages (priv->dxgi_info_queue, DXGI_DEBUG_ALL); - - return; -} -#else -void -gst_d3d11_device_dxgi_debug (GstD3D11Device * device, - const gchar * file, const gchar * function, gint line) -{ - /* do nothing */ - return; -} -#endif - -static void -gst_d3d11_device_class_init (GstD3D11DeviceClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - - gobject_class->set_property = gst_d3d11_device_set_property; - gobject_class->get_property = gst_d3d11_device_get_property; - gobject_class->constructed = gst_d3d11_device_constructed; - gobject_class->dispose = gst_d3d11_device_dispose; - gobject_class->finalize = gst_d3d11_device_finalize; - - g_object_class_install_property (gobject_class, PROP_ADAPTER, - g_param_spec_uint ("adapter", "Adapter", - "DXGI Adapter index for creating device", - 0, G_MAXUINT32, DEFAULT_ADAPTER, - G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_DEVICE_ID, - g_param_spec_uint ("device-id", "Device Id", - "DXGI Device ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_VENDOR_ID, - g_param_spec_uint ("vendor-id", "Vendor Id", - "DXGI Vendor ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_HARDWARE, - g_param_spec_boolean ("hardware", "Hardware", - "Whether hardware device or not", TRUE, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_DESCRIPTION, - g_param_spec_string ("description", "Description", - "Human readable device description", NULL, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, 
PROP_ALLOW_TEARING, - g_param_spec_boolean ("allow-tearing", "Allow tearing", - "Whether dxgi device supports allow-tearing feature or not", FALSE, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); -} - -static void -gst_d3d11_device_init (GstD3D11Device * self) -{ - GstD3D11DevicePrivate *priv; - - priv = gst_d3d11_device_get_instance_private (self); - priv->adapter = DEFAULT_ADAPTER; - - g_rec_mutex_init (&priv->extern_lock); - - self->priv = priv; -} - -static gboolean -can_support_format (GstD3D11Device * self, DXGI_FORMAT format) -{ - GstD3D11DevicePrivate *priv = self->priv; - ID3D11Device *handle = priv->device; - HRESULT hr; - UINT supported; - D3D11_FORMAT_SUPPORT flags = - (D3D11_FORMAT_SUPPORT_TEXTURE2D | D3D11_FORMAT_SUPPORT_RENDER_TARGET | - D3D11_FORMAT_SUPPORT_SHADER_SAMPLE); - - if (!gst_d3d11_is_windows_8_or_greater ()) { - GST_WARNING_OBJECT (self, "DXGI format %d needs Windows 8 or greater", - (guint) format); - return FALSE; - } - - hr = ID3D11Device_CheckFormatSupport (handle, format, &supported); - if (!gst_d3d11_result (hr, NULL)) { - GST_WARNING_OBJECT (self, "DXGI format %d is not supported by device", - (guint) format); - return FALSE; - } - - if ((supported & flags) != flags) { - GST_WARNING_OBJECT (self, - "DXGI format %d doesn't support flag 0x%x (supported flag 0x%x)", - (guint) format, (guint) supported, (guint) flags); - return FALSE; - } - - GST_INFO_OBJECT (self, "Device supports DXGI format %d", (guint) format); - - return TRUE; -} - -static void -gst_d3d11_device_setup_format_table (GstD3D11Device * self) -{ - GstD3D11DevicePrivate *priv = self->priv; - guint n_formats = 0; - - /* RGB formats */ - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_BGRA; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_B8G8R8A8_UNORM; - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_B8G8R8A8_UNORM; - n_formats++; - - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_RGBA; - 
priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8G8B8A8_UNORM; - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R8G8B8A8_UNORM; - n_formats++; - - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_RGB10A2_LE; - priv->format_table[n_formats].resource_format[0] = - DXGI_FORMAT_R10G10B10A2_UNORM; - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R10G10B10A2_UNORM; - n_formats++; - - /* YUV packed */ - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_VUYA; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8G8B8A8_UNORM; - if (can_support_format (self, DXGI_FORMAT_AYUV)) - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_AYUV; - else - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; - n_formats++; - - /* YUV semi-planar */ - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_NV12; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; - priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8G8_UNORM; - if (can_support_format (self, DXGI_FORMAT_NV12)) - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_NV12; - else - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; - n_formats++; - - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_P010_10LE; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; - priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16G16_UNORM; - if (can_support_format (self, DXGI_FORMAT_P010)) - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_P010; - else - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; - n_formats++; - - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_P016_LE; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; - priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16G16_UNORM; - if (can_support_format (self, DXGI_FORMAT_P016)) - priv->format_table[n_formats].dxgi_format = 
DXGI_FORMAT_P016; - else - priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; - n_formats++; - - /* YUV plannar */ - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I420; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; - priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8_UNORM; - priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R8_UNORM; - n_formats++; - - priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I420_10LE; - priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; - priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; - priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; - n_formats++; - - g_assert (n_formats == GST_D3D11_N_FORMATS); -} - -static void -gst_d3d11_device_constructed (GObject * object) -{ - GstD3D11Device *self = GST_D3D11_DEVICE (object); - GstD3D11DevicePrivate *priv = self->priv; - IDXGIAdapter1 *adapter = NULL; - IDXGIFactory1 *factory = NULL; - HRESULT hr; - UINT d3d11_flags = D3D11_CREATE_DEVICE_BGRA_SUPPORT; - - static const D3D_FEATURE_LEVEL feature_levels[] = { - D3D_FEATURE_LEVEL_11_1, - D3D_FEATURE_LEVEL_11_0, - D3D_FEATURE_LEVEL_10_1, - D3D_FEATURE_LEVEL_10_0, - D3D_FEATURE_LEVEL_9_3, - D3D_FEATURE_LEVEL_9_2, - D3D_FEATURE_LEVEL_9_1 - }; - D3D_FEATURE_LEVEL selected_level; - - GST_DEBUG_OBJECT (self, - "Built with DXGI header version %d", DXGI_HEADER_VERSION); - -#if HAVE_DXGIDEBUG_H - if (gst_debug_category_get_threshold (gst_d3d11_debug_layer_debug) > - GST_LEVEL_NONE) { - if (gst_d3d11_device_enable_dxgi_debug ()) { - IDXGIDebug *debug = NULL; - IDXGIInfoQueue *info_queue = NULL; - - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "dxgi debug library was loaded"); - hr = gst_d3d11_device_dxgi_get_device_interface (&IID_IDXGIDebug, - (void **) &debug); - - if (SUCCEEDED (hr)) { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "IDXGIDebug interface available"); - 
priv->dxgi_debug = debug; - - hr = gst_d3d11_device_dxgi_get_device_interface (&IID_IDXGIInfoQueue, - (void **) &info_queue); - if (SUCCEEDED (hr)) { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "IDXGIInfoQueue interface available"); - priv->dxgi_info_queue = info_queue; - } - } - } else { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "couldn't load dxgi debug library"); - } - } -#endif - -#if (DXGI_HEADER_VERSION >= 5) - hr = CreateDXGIFactory1 (&IID_IDXGIFactory5, (void **) &factory); - if (!gst_d3d11_result (hr, NULL)) { - GST_INFO_OBJECT (self, "IDXGIFactory5 was unavailable"); - factory = NULL; - } else { - BOOL allow_tearing; - - hr = IDXGIFactory5_CheckFeatureSupport ((IDXGIFactory5 *) factory, - DXGI_FEATURE_PRESENT_ALLOW_TEARING, (void *) &allow_tearing, - sizeof (allow_tearing)); - - priv->allow_tearing = SUCCEEDED (hr) && allow_tearing; - - hr = S_OK; - } - - priv->factory_ver = GST_D3D11_DXGI_FACTORY_5; -#endif - - if (!factory) { - priv->factory_ver = GST_D3D11_DXGI_FACTORY_1; - hr = CreateDXGIFactory1 (&IID_IDXGIFactory1, (void **) &factory); - } - - if (!gst_d3d11_result (hr, NULL)) { - GST_ERROR_OBJECT (self, "cannot create dxgi factory, hr: 0x%x", (guint) hr); - goto error; - } - - if (IDXGIFactory1_EnumAdapters1 (factory, priv->adapter, - &adapter) == DXGI_ERROR_NOT_FOUND) { - GST_WARNING_OBJECT (self, "No adapter for index %d", priv->adapter); - goto error; - } else { - DXGI_ADAPTER_DESC1 desc; - - hr = IDXGIAdapter1_GetDesc1 (adapter, &desc); - if (SUCCEEDED (hr)) { - gchar *description = NULL; - gboolean is_hardware = FALSE; - - /* DXGI_ADAPTER_FLAG_SOFTWARE is missing in dxgi.h of mingw */ - if ((desc.Flags & 0x2) != 0x2) { - is_hardware = TRUE; - } - - description = g_utf16_to_utf8 (desc.Description, -1, NULL, NULL, NULL); - GST_DEBUG_OBJECT (self, - "adapter index %d: D3D11 device vendor-id: 0x%04x, device-id: 0x%04x, " - "Flags: 0x%x, %s", - priv->adapter, desc.VendorId, desc.DeviceId, desc.Flags, description); 
- - priv->vendor_id = desc.VendorId; - priv->device_id = desc.DeviceId; - priv->hardware = is_hardware; - priv->description = description; - } - } - -#if HAVE_D3D11SDKLAYERS_H - if (gst_debug_category_get_threshold (gst_d3d11_debug_layer_debug) > - GST_LEVEL_NONE) { - /* DirectX SDK should be installed on system for this */ - if (gst_d3d11_device_enable_d3d11_debug ()) { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "d3d11 debug library was loaded"); - d3d11_flags |= D3D11_CREATE_DEVICE_DEBUG; - } else { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "couldn't load d3d11 debug library"); - } - } -#endif - - hr = D3D11CreateDevice ((IDXGIAdapter *) adapter, D3D_DRIVER_TYPE_UNKNOWN, - NULL, d3d11_flags, feature_levels, G_N_ELEMENTS (feature_levels), - D3D11_SDK_VERSION, &priv->device, &selected_level, &priv->device_context); - priv->feature_level = selected_level; - - if (!gst_d3d11_result (hr, NULL)) { - /* Retry if the system could not recognize D3D_FEATURE_LEVEL_11_1 */ - hr = D3D11CreateDevice ((IDXGIAdapter *) adapter, D3D_DRIVER_TYPE_UNKNOWN, - NULL, d3d11_flags, &feature_levels[1], - G_N_ELEMENTS (feature_levels) - 1, D3D11_SDK_VERSION, &priv->device, - &selected_level, &priv->device_context); - priv->feature_level = selected_level; - } - - /* if D3D11_CREATE_DEVICE_DEBUG was enabled but couldn't create device, - * try it without the flag again */ - if (FAILED (hr) && (d3d11_flags & D3D11_CREATE_DEVICE_DEBUG) == - D3D11_CREATE_DEVICE_DEBUG) { - GST_WARNING_OBJECT (self, "Couldn't create d3d11 device with debug flag"); - - d3d11_flags &= ~D3D11_CREATE_DEVICE_DEBUG; - - hr = D3D11CreateDevice ((IDXGIAdapter *) adapter, D3D_DRIVER_TYPE_UNKNOWN, - NULL, d3d11_flags, feature_levels, G_N_ELEMENTS (feature_levels), - D3D11_SDK_VERSION, &priv->device, &selected_level, - &priv->device_context); - priv->feature_level = selected_level; - - if (!gst_d3d11_result (hr, NULL)) { - /* Retry if the system could not recognize D3D_FEATURE_LEVEL_11_1 */ 
- hr = D3D11CreateDevice ((IDXGIAdapter *) adapter, D3D_DRIVER_TYPE_UNKNOWN, - NULL, d3d11_flags, &feature_levels[1], - G_N_ELEMENTS (feature_levels) - 1, D3D11_SDK_VERSION, &priv->device, - &selected_level, &priv->device_context); - priv->feature_level = selected_level; - } - } - - if (gst_d3d11_result (hr, NULL)) { - GST_DEBUG_OBJECT (self, "Selected feature level 0x%x", selected_level); - } else { - GST_ERROR_OBJECT (self, "cannot create d3d11 device, hr: 0x%x", (guint) hr); - goto error; - } - - priv->factory = factory; - -#if HAVE_D3D11SDKLAYERS_H - if ((d3d11_flags & D3D11_CREATE_DEVICE_DEBUG) == D3D11_CREATE_DEVICE_DEBUG) { - ID3D11Debug *debug; - ID3D11InfoQueue *info_queue; - - hr = ID3D11Device_QueryInterface (priv->device, - &IID_ID3D11Debug, (void **) &debug); - - if (SUCCEEDED (hr)) { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "D3D11Debug interface available"); - priv->d3d11_debug = debug; - - hr = ID3D11Device_QueryInterface (priv->device, - &IID_ID3D11InfoQueue, (void **) &info_queue); - if (SUCCEEDED (hr)) { - GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, - "ID3D11InfoQueue interface available"); - priv->d3d11_info_queue = info_queue; - } - } - } -#endif - - IDXGIAdapter1_Release (adapter); - gst_d3d11_device_setup_format_table (self); - - G_OBJECT_CLASS (parent_class)->constructed (object); - - return; - -error: - if (factory) - IDXGIFactory1_Release (factory); - - if (adapter) - IDXGIAdapter1_Release (adapter); - - G_OBJECT_CLASS (parent_class)->constructed (object); - - return; -} - -static void -gst_d3d11_device_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - GstD3D11Device *self = GST_D3D11_DEVICE (object); - GstD3D11DevicePrivate *priv = self->priv; - - switch (prop_id) { - case PROP_ADAPTER: - priv->adapter = g_value_get_uint (value); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void 
-gst_d3d11_device_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11Device *self = GST_D3D11_DEVICE (object); - GstD3D11DevicePrivate *priv = self->priv; - - switch (prop_id) { - case PROP_ADAPTER: - g_value_set_uint (value, priv->adapter); - break; - case PROP_DEVICE_ID: - g_value_set_uint (value, priv->device_id); - break; - case PROP_VENDOR_ID: - g_value_set_uint (value, priv->vendor_id); - break; - case PROP_HARDWARE: - g_value_set_boolean (value, priv->hardware); - break; - case PROP_DESCRIPTION: - g_value_set_string (value, priv->description); - break; - case PROP_ALLOW_TEARING: - g_value_set_boolean (value, priv->allow_tearing); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_d3d11_device_dispose (GObject * object) -{ - GstD3D11Device *self = GST_D3D11_DEVICE (object); - GstD3D11DevicePrivate *priv = self->priv; - - GST_LOG_OBJECT (self, "dispose"); - - if (priv->device) { - ID3D11Device_Release (priv->device); - priv->device = NULL; - } - - if (priv->device_context) { - ID3D11DeviceContext_Release (priv->device_context); - priv->device_context = NULL; - } - - if (priv->factory) { - IDXGIFactory1_Release (priv->factory); - priv->factory = NULL; - } -#if HAVE_D3D11SDKLAYERS_H - if (priv->d3d11_debug) { - ID3D11Debug_ReportLiveDeviceObjects (priv->d3d11_debug, - (D3D11_RLDO_FLAGS) GST_D3D11_RLDO_FLAGS); - ID3D11Debug_Release (priv->d3d11_debug); - priv->d3d11_debug = NULL; - } - - if (priv->d3d11_info_queue) { - gst_d3d11_device_d3d11_debug (self, __FILE__, GST_FUNCTION, __LINE__); - - ID3D11InfoQueue_Release (priv->d3d11_info_queue); - priv->d3d11_info_queue = NULL; - } -#endif - -#if HAVE_DXGIDEBUG_H - if (priv->dxgi_debug) { - IDXGIDebug_ReportLiveObjects (priv->dxgi_debug, DXGI_DEBUG_ALL, - (DXGI_DEBUG_RLO_FLAGS) GST_D3D11_RLDO_FLAGS); - IDXGIDebug_Release (priv->dxgi_debug); - priv->dxgi_debug = NULL; - } - - if (priv->dxgi_info_queue) { - 
gst_d3d11_device_dxgi_debug (self, __FILE__, GST_FUNCTION, __LINE__); - - IDXGIInfoQueue_Release (priv->dxgi_info_queue); - priv->dxgi_info_queue = NULL; - } -#endif - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_d3d11_device_finalize (GObject * object) -{ - GstD3D11Device *self = GST_D3D11_DEVICE (object); - GstD3D11DevicePrivate *priv = self->priv; - - GST_LOG_OBJECT (self, "finalize"); - - g_rec_mutex_clear (&priv->extern_lock); - g_free (priv->description); - - G_OBJECT_CLASS (parent_class)->finalize (object); -} - -/** - * gst_d3d11_device_new: - * @adapter: the index of adapter for creating d3d11 device - * - * Returns: (transfer full) (nullable): a new #GstD3D11Device for @adapter or %NULL - * when failed to create D3D11 device with given adapter index. - */ -GstD3D11Device * -gst_d3d11_device_new (guint adapter) -{ - GstD3D11Device *device = NULL; - GstD3D11DevicePrivate *priv; - - device = g_object_new (GST_TYPE_D3D11_DEVICE, "adapter", adapter, NULL); - - priv = device->priv; - - if (!priv->device || !priv->device_context) { - GST_WARNING ("Cannot create d3d11 device with adapter %d", adapter); - gst_clear_object (&device); - } else { - gst_object_ref_sink (device); - } - - return device; -} - -/** - * gst_d3d11_device_get_device_handle: - * @device: a #GstD3D11Device - * - * Used for various D3D11 APIs directly. - * Caller must not destroy returned device object. - * - * Returns: (transfer none): the ID3D11Device - */ -ID3D11Device * -gst_d3d11_device_get_device_handle (GstD3D11Device * device) -{ - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - return device->priv->device; -} - -/** - * gst_d3d11_device_get_device_context_handle: - * @device: a #GstD3D11Device - * - * Used for various D3D11 APIs directly. - * Caller must not destroy returned device object. 
- * - * Returns: (transfer none): the ID3D11DeviceContext - */ -ID3D11DeviceContext * -gst_d3d11_device_get_device_context_handle (GstD3D11Device * device) -{ - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - return device->priv->device_context; -} - -GstD3D11DXGIFactoryVersion -gst_d3d11_device_get_chosen_dxgi_factory_version (GstD3D11Device * device) -{ - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), - GST_D3D11_DXGI_FACTORY_UNKNOWN); - - return device->priv->factory_ver; -} - -D3D_FEATURE_LEVEL -gst_d3d11_device_get_chosen_feature_level (GstD3D11Device * device) -{ - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), 0); - - return device->priv->feature_level; -} - -/** - * gst_d3d11_device_create_swap_chain: - * @device: a #GstD3D11Device - * @desc: a DXGI_SWAP_CHAIN_DESC structure for swapchain - * - * Create a IDXGISwapChain object. Caller must release returned swap chain object - * via IDXGISwapChain_Release() - * - * Returns: (transfer full) (nullable): a new IDXGISwapChain or %NULL - * when failed to create swap chain with given @desc - */ -IDXGISwapChain * -gst_d3d11_device_create_swap_chain (GstD3D11Device * device, - const DXGI_SWAP_CHAIN_DESC * desc) -{ - GstD3D11DevicePrivate *priv; - HRESULT hr; - IDXGISwapChain *swap_chain = NULL; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - priv = device->priv; - - gst_d3d11_device_lock (device); - hr = IDXGIFactory1_CreateSwapChain (priv->factory, (IUnknown *) priv->device, - (DXGI_SWAP_CHAIN_DESC *) desc, &swap_chain); - gst_d3d11_device_unlock (device); - - if (!gst_d3d11_result (hr, device)) { - GST_WARNING_OBJECT (device, "Cannot create SwapChain Object: 0x%x", - (guint) hr); - swap_chain = NULL; - } - - return swap_chain; -} - -#if (DXGI_HEADER_VERSION >= 2) -#if (!GST_D3D11_WINAPI_ONLY_APP) -/** - * gst_d3d11_device_create_swap_chain_for_hwnd: - * @device: a #GstD3D11Device - * @hwnd: HWND handle - * @desc: a DXGI_SWAP_CHAIN_DESC1 structure for swapchain - * 
@fullscreen_desc: (nullable): a DXGI_SWAP_CHAIN_FULLSCREEN_DESC - * structure for swapchain - * @output: (nullable): a IDXGIOutput interface for the output to restrict content to - * - * Create a IDXGISwapChain1 object. Caller must release returned swap chain object - * via IDXGISwapChain1_Release() - * - * Returns: (transfer full) (nullable): a new IDXGISwapChain1 or %NULL - * when failed to create swap chain with given @desc - */ -IDXGISwapChain1 * -gst_d3d11_device_create_swap_chain_for_hwnd (GstD3D11Device * device, - HWND hwnd, const DXGI_SWAP_CHAIN_DESC1 * desc, - const DXGI_SWAP_CHAIN_FULLSCREEN_DESC * fullscreen_desc, - IDXGIOutput * output) -{ - GstD3D11DevicePrivate *priv; - IDXGISwapChain1 *swap_chain = NULL; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - priv = device->priv; - - gst_d3d11_device_lock (device); - hr = IDXGIFactory2_CreateSwapChainForHwnd ((IDXGIFactory2 *) priv->factory, - (IUnknown *) priv->device, hwnd, desc, fullscreen_desc, - output, &swap_chain); - gst_d3d11_device_unlock (device); - - if (!gst_d3d11_result (hr, device)) { - GST_WARNING_OBJECT (device, "Cannot create SwapChain Object: 0x%x", - (guint) hr); - swap_chain = NULL; - } - - return swap_chain; -} -#endif /* GST_D3D11_WINAPI_ONLY_APP */ - -#if GST_D3D11_WINAPI_ONLY_APP -/** - * gst_d3d11_device_create_swap_chain_for_core_window: - * @device: a #GstD3D11Device - * @core_window: CoreWindow handle - * @desc: a DXGI_SWAP_CHAIN_DESC1 structure for swapchain - * @output: (nullable): a IDXGIOutput interface for the output to restrict content to - * - * Create a IDXGISwapChain1 object. 
Caller must release returned swap chain object - * via IDXGISwapChain1_Release() - * - * Returns: (transfer full) (nullable): a new IDXGISwapChain1 or %NULL - * when failed to create swap chain with given @desc - */ -IDXGISwapChain1 * -gst_d3d11_device_create_swap_chain_for_core_window (GstD3D11Device * device, - guintptr core_window, const DXGI_SWAP_CHAIN_DESC1 * desc, - IDXGIOutput * output) -{ - GstD3D11DevicePrivate *priv; - IDXGISwapChain1 *swap_chain = NULL; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - priv = device->priv; - - gst_d3d11_device_lock (device); - hr = IDXGIFactory2_CreateSwapChainForCoreWindow ((IDXGIFactory2 *) - priv->factory, (IUnknown *) priv->device, (IUnknown *) core_window, desc, - output, &swap_chain); - gst_d3d11_device_unlock (device); - - if (!gst_d3d11_result (hr, device)) { - GST_WARNING_OBJECT (device, "Cannot create SwapChain Object: 0x%x", - (guint) hr); - swap_chain = NULL; - } - - return swap_chain; -} - -/** - * gst_d3d11_device_create_swap_chain_for_composition: - * @device: a #GstD3D11Device - * @desc: a DXGI_SWAP_CHAIN_DESC1 structure for swapchain - * @output: (nullable): a IDXGIOutput interface for the output to restrict content to - * - * Create a IDXGISwapChain1 object. 
Caller must release returned swap chain object - * via IDXGISwapChain1_Release() - * - * Returns: (transfer full) (nullable): a new IDXGISwapChain1 or %NULL - * when failed to create swap chain with given @desc - */ -IDXGISwapChain1 * -gst_d3d11_device_create_swap_chain_for_composition (GstD3D11Device * device, - const DXGI_SWAP_CHAIN_DESC1 * desc, IDXGIOutput * output) -{ - GstD3D11DevicePrivate *priv; - IDXGISwapChain1 *swap_chain = NULL; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - priv = device->priv; - - gst_d3d11_device_lock (device); - hr = IDXGIFactory2_CreateSwapChainForComposition ((IDXGIFactory2 *) - priv->factory, (IUnknown *) priv->device, desc, output, &swap_chain); - gst_d3d11_device_unlock (device); - - if (!gst_d3d11_result (hr, device)) { - GST_WARNING_OBJECT (device, "Cannot create SwapChain Object: 0x%x", - (guint) hr); - swap_chain = NULL; - } - - return swap_chain; -} -#endif /* GST_D3D11_WINAPI_ONLY_APP */ -#endif /* (DXGI_HEADER_VERSION >= 2) */ - -/** - * gst_d3d11_device_release_swap_chain: - * @device: a #GstD3D11Device - * @swap_chain: a IDXGISwapChain - * - * Release a @swap_chain from device thread - */ -void -gst_d3d11_device_release_swap_chain (GstD3D11Device * device, - IDXGISwapChain * swap_chain) -{ - g_return_if_fail (GST_IS_D3D11_DEVICE (device)); - g_return_if_fail (swap_chain != NULL); - - gst_d3d11_device_lock (device); - IDXGISwapChain_Release (swap_chain); - gst_d3d11_device_unlock (device); -} - -ID3D11Texture2D * -gst_d3d11_device_create_texture (GstD3D11Device * device, - const D3D11_TEXTURE2D_DESC * desc, - const D3D11_SUBRESOURCE_DATA * inital_data) -{ - GstD3D11DevicePrivate *priv; - HRESULT hr; - ID3D11Texture2D *texture; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - g_return_val_if_fail (desc != NULL, NULL); - - priv = device->priv; - - hr = ID3D11Device_CreateTexture2D (priv->device, desc, inital_data, &texture); - if (!gst_d3d11_result (hr, device)) { - 
GST_ERROR ("Failed to create texture (0x%x)", (guint) hr); - - GST_WARNING ("Direct3D11 Allocation params"); - GST_WARNING ("\t%dx%d, DXGI format %d", - desc->Width, desc->Height, desc->Format); - GST_WARNING ("\tMipLevel %d, ArraySize %d", - desc->MipLevels, desc->ArraySize); - GST_WARNING ("\tSampleDesc.Count %d, SampleDesc.Quality %d", - desc->SampleDesc.Count, desc->SampleDesc.Quality); - GST_WARNING ("\tUsage %d", desc->Usage); - GST_WARNING ("\tBindFlags 0x%x", desc->BindFlags); - GST_WARNING ("\tCPUAccessFlags 0x%x", desc->CPUAccessFlags); - GST_WARNING ("\tMiscFlags 0x%x", desc->MiscFlags); - texture = NULL; - } - - return texture; -} - -void -gst_d3d11_device_release_texture (GstD3D11Device * device, - ID3D11Texture2D * texture) -{ - g_return_if_fail (GST_IS_D3D11_DEVICE (device)); - g_return_if_fail (texture != NULL); - - ID3D11Texture2D_Release (texture); -} - -void -gst_d3d11_device_lock (GstD3D11Device * device) -{ - GstD3D11DevicePrivate *priv; - - g_return_if_fail (GST_IS_D3D11_DEVICE (device)); - - priv = device->priv; - - GST_TRACE_OBJECT (device, "device locking"); - g_rec_mutex_lock (&priv->extern_lock); - GST_TRACE_OBJECT (device, "device locked"); -} - -void -gst_d3d11_device_unlock (GstD3D11Device * device) -{ - GstD3D11DevicePrivate *priv; - - g_return_if_fail (GST_IS_D3D11_DEVICE (device)); - - priv = device->priv; - - g_rec_mutex_unlock (&priv->extern_lock); - GST_TRACE_OBJECT (device, "device unlocked"); -} - -const GstD3D11Format * -gst_d3d11_device_format_from_gst (GstD3D11Device * device, - GstVideoFormat format) -{ - GstD3D11DevicePrivate *priv; - gint i; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - priv = device->priv; - - for (i = 0; i < G_N_ELEMENTS (priv->format_table); i++) { - if (priv->format_table[i].format == format) - return &priv->format_table[i]; - } - - return NULL; -}
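gst_d3d11_device_format_from_gst at the end of this file resolves a GstVideoFormat to its DXGI mapping with a plain linear scan of the table filled in by gst_d3d11_device_setup_format_table. A minimal standalone sketch of that lookup pattern, with simplified stand-in enums rather than the real GStreamer/DXGI types:

```c
#include <stddef.h>

/* Simplified stand-ins for the GStreamer video formats and DXGI formats. */
typedef enum { VIDEO_FORMAT_BGRA, VIDEO_FORMAT_NV12, VIDEO_FORMAT_P010 } VideoFormat;
typedef enum { FMT_UNKNOWN, FMT_B8G8R8A8, FMT_NV12, FMT_P010 } DxgiFormat;

typedef struct {
  VideoFormat format;     /* GStreamer-side format */
  DxgiFormat dxgi_format; /* native DXGI format, FMT_UNKNOWN when unsupported */
} FormatEntry;

static const FormatEntry format_table[] = {
  { VIDEO_FORMAT_BGRA, FMT_B8G8R8A8 },
  { VIDEO_FORMAT_NV12, FMT_NV12 },
  { VIDEO_FORMAT_P010, FMT_P010 },
};

/* Linear scan, mirroring the shape of gst_d3d11_device_format_from_gst(). */
static const FormatEntry *
format_from_gst (VideoFormat format)
{
  size_t i;

  for (i = 0; i < sizeof (format_table) / sizeof (format_table[0]); i++) {
    if (format_table[i].format == format)
      return &format_table[i];
  }

  return NULL;                  /* no mapping known for this format */
}
```

The table is small and built once per device, so a linear scan is cheaper and simpler here than a hash table.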
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11device.h
Deleted
@@ -1,144 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifndef __GST_D3D11_DEVICE_H__ -#define __GST_D3D11_DEVICE_H__ - -#include <gst/gst.h> -#include <gst/video/video.h> - -#include "gstd3d11_fwd.h" - -G_BEGIN_DECLS - -#define GST_TYPE_D3D11_DEVICE (gst_d3d11_device_get_type()) -#define GST_D3D11_DEVICE(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_DEVICE,GstD3D11Device)) -#define GST_D3D11_DEVICE_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_D3D11_DEVICE,GstD3D11DeviceClass)) -#define GST_D3D11_DEVICE_GET_CLASS(obj) (GST_D3D11_DEVICE_CLASS(G_OBJECT_GET_CLASS(obj))) -#define GST_IS_D3D11_DEVICE(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_DEVICE)) -#define GST_IS_D3D11_DEVICE_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_DEVICE)) -#define GST_D3D11_DEVICE_CAST(obj) ((GstD3D11Device*)(obj)) - -#define GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE "gst.d3d11.device.handle" - -/** - * GstD3D11DeviceThreadFunc: - * @device: a #GstD3D11Device - * @data: user data - * - * Represents a function to run in the D3D11 device thread with @device and @data - */ -typedef void (*GstD3D11DeviceThreadFunc) (GstD3D11Device * device, gpointer data); - 
-typedef enum -{ - GST_D3D11_DXGI_FACTORY_UNKNOWN = 0, - GST_D3D11_DXGI_FACTORY_1, - GST_D3D11_DXGI_FACTORY_2, - GST_D3D11_DXGI_FACTORY_3, - GST_D3D11_DXGI_FACTORY_4, - GST_D3D11_DXGI_FACTORY_5, -} GstD3D11DXGIFactoryVersion; - - -struct _GstD3D11Device -{ - GstObject parent; - - GstD3D11DevicePrivate *priv; - - /*< private >*/ - gpointer _gst_reserved[GST_PADDING]; -}; - -struct _GstD3D11DeviceClass -{ - GstObjectClass parent_class; - - /*< private >*/ - gpointer _gst_reserved[GST_PADDING]; -}; - -GType gst_d3d11_device_get_type (void); - -GstD3D11Device * gst_d3d11_device_new (guint adapter); - -ID3D11Device * gst_d3d11_device_get_device_handle (GstD3D11Device * device); - -ID3D11DeviceContext * gst_d3d11_device_get_device_context_handle (GstD3D11Device * device); - -GstD3D11DXGIFactoryVersion gst_d3d11_device_get_chosen_dxgi_factory_version (GstD3D11Device * device); - -D3D_FEATURE_LEVEL gst_d3d11_device_get_chosen_feature_level (GstD3D11Device * device); - -IDXGISwapChain * gst_d3d11_device_create_swap_chain (GstD3D11Device * device, - const DXGI_SWAP_CHAIN_DESC * desc); - -#if (DXGI_HEADER_VERSION >= 2) -#if (!GST_D3D11_WINAPI_ONLY_APP) -IDXGISwapChain1 * gst_d3d11_device_create_swap_chain_for_hwnd (GstD3D11Device * device, - HWND hwnd, - const DXGI_SWAP_CHAIN_DESC1 * desc, - const DXGI_SWAP_CHAIN_FULLSCREEN_DESC * fullscreen_desc, - IDXGIOutput * output); -#endif - -#if GST_D3D11_WINAPI_ONLY_APP -IDXGISwapChain1 * gst_d3d11_device_create_swap_chain_for_core_window (GstD3D11Device * device, - guintptr core_window, - const DXGI_SWAP_CHAIN_DESC1 * desc, - IDXGIOutput * output); - -IDXGISwapChain1 * gst_d3d11_device_create_swap_chain_for_composition (GstD3D11Device * device, - const DXGI_SWAP_CHAIN_DESC1 * desc, - IDXGIOutput * output); - -#endif /* GST_D3D11_WINAPI_ONLY_APP */ -#endif /* (DXGI_HEADER_VERSION >= 2) */ - -void gst_d3d11_device_release_swap_chain (GstD3D11Device * device, - IDXGISwapChain * swap_chain); - -ID3D11Texture2D * 
gst_d3d11_device_create_texture (GstD3D11Device * device, - const D3D11_TEXTURE2D_DESC * desc, - const D3D11_SUBRESOURCE_DATA *inital_data); - -void gst_d3d11_device_release_texture (GstD3D11Device * device, - ID3D11Texture2D * texture); - -void gst_d3d11_device_lock (GstD3D11Device * device); - -void gst_d3d11_device_unlock (GstD3D11Device * device); - -void gst_d3d11_device_d3d11_debug (GstD3D11Device * device, - const gchar * file, - const gchar * function, - gint line); - -void gst_d3d11_device_dxgi_debug (GstD3D11Device * device, - const gchar * file, - const gchar * function, - gint line); - -const GstD3D11Format * gst_d3d11_device_format_from_gst (GstD3D11Device * device, - GstVideoFormat format); - -G_END_DECLS - -#endif /* __GST_D3D11_DEVICE_H__ */
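gst_d3d11_device_constructed in the .c file above applies two independent fallbacks when D3D11CreateDevice fails: it retries without D3D_FEATURE_LEVEL_11_1 for runtimes that do not recognize that level, and retries without D3D11_CREATE_DEVICE_DEBUG when the debug layer cannot be loaded. A standalone sketch of that retry ladder with a stubbed create function (no real D3D11 calls; the stub's failure behaviour is an assumption chosen purely to exercise both fallbacks):

```c
#include <stdbool.h>

enum { LEVEL_11_1 = 0xb100, LEVEL_11_0 = 0xb000 };
#define FLAG_DEBUG 0x2

/* Stub standing in for D3D11CreateDevice: this fake driver rejects both
 * the 11_1 feature level and the debug-layer flag. */
static bool
stub_create_device (const int *levels, int n_levels, unsigned flags,
    int *out_level)
{
  if (flags & FLAG_DEBUG)
    return false;               /* debug layer not installed */
  if (n_levels > 0 && levels[0] == LEVEL_11_1)
    return false;               /* old runtime: 11_1 unknown */
  if (n_levels > 0) {
    *out_level = levels[0];
    return true;
  }
  return false;
}

/* Mirrors the retry ladder in gst_d3d11_device_constructed(). */
static bool
create_with_fallbacks (unsigned flags, int *out_level)
{
  static const int levels[] = { LEVEL_11_1, LEVEL_11_0 };
  const int n = sizeof (levels) / sizeof (levels[0]);

  if (stub_create_device (levels, n, flags, out_level))
    return true;
  /* Retry without D3D_FEATURE_LEVEL_11_1 */
  if (stub_create_device (levels + 1, n - 1, flags, out_level))
    return true;
  /* Retry both attempts again without the debug flag */
  if (flags & FLAG_DEBUG) {
    flags &= ~FLAG_DEBUG;
    if (stub_create_device (levels, n, flags, out_level))
      return true;
    if (stub_create_device (levels + 1, n - 1, flags, out_level))
      return true;
  }
  return false;
}
```

The ordering matters: the debug flag is stripped last, so a debug build still gets the debug layer whenever the driver can provide it.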
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11download.c
Deleted
@@ -1,324 +0,0 @@ -/* GStreamer - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -# include <config.h> -#endif - -#include "gstd3d11download.h" -#include "gstd3d11memory.h" -#include "gstd3d11device.h" -#include "gstd3d11bufferpool.h" -#include "gstd3d11utils.h" - -GST_DEBUG_CATEGORY_STATIC (gst_d3d11_download_debug); -#define GST_CAT_DEFAULT gst_d3d11_download_debug - -static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", - GST_PAD_SINK, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE (GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) - )); - -static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", - GST_PAD_SRC, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, 
GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE (GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) - )); - -#define gst_d3d11_download_parent_class parent_class -G_DEFINE_TYPE (GstD3D11Download, gst_d3d11_download, - GST_TYPE_D3D11_BASE_FILTER); - -static GstCaps *gst_d3d11_download_transform_caps (GstBaseTransform * trans, - GstPadDirection direction, GstCaps * caps, GstCaps * filter); -static gboolean gst_d3d11_download_propose_allocation (GstBaseTransform * trans, - GstQuery * decide_query, GstQuery * query); -static gboolean gst_d3d11_download_decide_allocation (GstBaseTransform * trans, - GstQuery * query); -static GstFlowReturn gst_d3d11_download_transform (GstBaseTransform * trans, - GstBuffer * inbuf, GstBuffer * outbuf); - -static void -gst_d3d11_download_class_init (GstD3D11DownloadClass * klass) -{ - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); - - gst_element_class_add_static_pad_template (element_class, &sink_template); - gst_element_class_add_static_pad_template (element_class, &src_template); - - gst_element_class_set_static_metadata (element_class, - "Direct3D11 downloader", "Filter/Video", - "Downloads D3D11 texture memory into system memory", - "Seungha Yang <seungha.yang@navercorp.com>"); - - trans_class->passthrough_on_same_caps = TRUE; - - trans_class->transform_caps = - GST_DEBUG_FUNCPTR (gst_d3d11_download_transform_caps); - trans_class->propose_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_download_propose_allocation); - trans_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_download_decide_allocation); - trans_class->transform = GST_DEBUG_FUNCPTR 
(gst_d3d11_download_transform); - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_download_debug, - "d3d11download", 0, "d3d11download Element"); -} - -static void -gst_d3d11_download_init (GstD3D11Download * download) -{ -} - -static GstCaps * -_set_caps_features (const GstCaps * caps, const gchar * feature_name) -{ - GstCaps *tmp = gst_caps_copy (caps); - guint n = gst_caps_get_size (tmp); - guint i = 0; - - for (i = 0; i < n; i++) - gst_caps_set_features (tmp, i, - gst_caps_features_from_string (feature_name)); - - return tmp; -} - -static GstCaps * -gst_d3d11_download_transform_caps (GstBaseTransform * trans, - GstPadDirection direction, GstCaps * caps, GstCaps * filter) -{ - GstCaps *result, *tmp; - - GST_DEBUG_OBJECT (trans, - "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, - (direction == GST_PAD_SINK) ? "sink" : "src"); - - if (direction == GST_PAD_SINK) { - tmp = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); - tmp = gst_caps_merge (gst_caps_ref (caps), tmp); - } else { - GstCaps *newcaps; - tmp = gst_caps_ref (caps); - - newcaps = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); - tmp = gst_caps_merge (tmp, newcaps); - } - - if (filter) { - result = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); - gst_caps_unref (tmp); - } else { - result = tmp; - } - - GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, result); - - return result; -} - -static gboolean -gst_d3d11_download_propose_allocation (GstBaseTransform * trans, - GstQuery * decide_query, GstQuery * query) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstVideoInfo info; - GstBufferPool *pool; - GstCaps *caps; - guint size; - - if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, - decide_query, query)) - return FALSE; - - /* passthrough, we're done */ - if (decide_query == NULL) - return TRUE; - - gst_query_parse_allocation (query, &caps, NULL); - - if (caps == NULL) - return FALSE; - - if 
(!gst_video_info_from_caps (&info, caps)) - return FALSE; - - if (gst_query_get_n_allocation_pools (query) == 0) { - GstCapsFeatures *features; - GstStructure *config; - gboolean is_d3d11 = FALSE; - - features = gst_caps_get_features (caps, 0); - - if (features && gst_caps_features_contains (features, - GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { - GST_DEBUG_OBJECT (filter, "upstream support d3d11 memory"); - pool = gst_d3d11_buffer_pool_new (filter->device); - is_d3d11 = TRUE; - } else { - pool = gst_video_buffer_pool_new (); - } - - config = gst_buffer_pool_get_config (pool); - - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_META); - - /* d3d11 pool does not support video alignment */ - if (!is_d3d11) { - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); - } - - size = GST_VIDEO_INFO_SIZE (&info); - gst_buffer_pool_config_set_params (config, caps, size, 0, 0); - - if (!gst_buffer_pool_set_config (pool, config)) - goto config_failed; - - gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); - - /* d3d11 buffer pool might update buffer size by self */ - if (is_d3d11) { - size = GST_D3D11_BUFFER_POOL (pool)->buffer_size; - } - - gst_query_add_allocation_pool (query, pool, size, 0, 0); - - gst_object_unref (pool); - } - - return TRUE; - - /* ERRORS */ -config_failed: - { - GST_ERROR_OBJECT (filter, "failed to set config"); - gst_object_unref (pool); - return FALSE; - } -} - -static gboolean -gst_d3d11_download_decide_allocation (GstBaseTransform * trans, - GstQuery * query) -{ - GstBufferPool *pool = NULL; - GstStructure *config; - guint min, max, size; - gboolean update_pool; - GstCaps *outcaps = NULL; - - if (gst_query_get_n_allocation_pools (query) > 0) { - gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); - - if (!pool) - gst_query_parse_allocation (query, &outcaps, NULL); - - update_pool = TRUE; - } else { - GstVideoInfo vinfo; - - gst_query_parse_allocation 
(query, &outcaps, NULL); - gst_video_info_from_caps (&vinfo, outcaps); - size = vinfo.size; - min = max = 0; - update_pool = FALSE; - } - - if (!pool) - pool = gst_video_buffer_pool_new (); - - config = gst_buffer_pool_get_config (pool); - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - if (outcaps) - gst_buffer_pool_config_set_params (config, outcaps, size, 0, 0); - gst_buffer_pool_set_config (pool, config); - - if (update_pool) - gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); - else - gst_query_add_allocation_pool (query, pool, size, min, max); - - gst_object_unref (pool); - - return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, - query); -} - -static GstFlowReturn -gst_d3d11_download_transform (GstBaseTransform * trans, GstBuffer * inbuf, - GstBuffer * outbuf) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstVideoFrame in_frame, out_frame; - gint i; - - if (!gst_video_frame_map (&in_frame, &filter->in_info, inbuf, - GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF)) - goto invalid_buffer; - - if (!gst_video_frame_map (&out_frame, &filter->out_info, outbuf, - GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF)) { - gst_video_frame_unmap (&in_frame); - goto invalid_buffer; - } - - for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&in_frame); i++) { - if (!gst_video_frame_copy_plane (&out_frame, &in_frame, i)) { - GST_ERROR_OBJECT (filter, "Couldn't copy %dth plane", i); - return GST_FLOW_ERROR; - } - } - - gst_video_frame_unmap (&out_frame); - gst_video_frame_unmap (&in_frame); - - return GST_FLOW_OK; - - /* ERRORS */ -invalid_buffer: - { - GST_ELEMENT_WARNING (filter, CORE, NOT_IMPLEMENTED, (NULL), - ("invalid video buffer received")); - return GST_FLOW_ERROR; - } -}
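gst_d3d11_download_transform above copies the mapped frame plane by plane with gst_video_frame_copy_plane; each plane is a rectangle of valid bytes inside a possibly wider stride, which is the usual layout for GPU-staged memory. A standalone sketch of one such plane copy (plain memcpy per row, a simplified stand-in for the GStreamer helper):

```c
#include <string.h>

/* Copy one video plane row by row. src and dst may use different strides
 * (padding bytes to the right of each row), so a single memcpy of the
 * whole plane would be wrong whenever the strides differ. */
static void
copy_plane (unsigned char *dst, int dst_stride,
    const unsigned char *src, int src_stride, int row_bytes, int rows)
{
  int y;

  for (y = 0; y < rows; y++)
    memcpy (dst + (size_t) y * dst_stride,
        src + (size_t) y * src_stride, row_bytes);
}
```

For a full frame this would be called once per plane, exactly as the transform loops over GST_VIDEO_FRAME_N_PLANES.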
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11format.c
Deleted
@@ -1,476 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11format.h" -#include "gstd3d11utils.h" -#include "gstd3d11device.h" -#include "gstd3d11memory.h" - -#include <string.h> - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_format_debug); -#define GST_CAT_DEFAULT gst_d3d11_format_debug - -guint -gst_d3d11_dxgi_format_n_planes (DXGI_FORMAT format) -{ - switch (format) { - case DXGI_FORMAT_B8G8R8A8_UNORM: - case DXGI_FORMAT_R8G8B8A8_UNORM: - case DXGI_FORMAT_R10G10B10A2_UNORM: - case DXGI_FORMAT_AYUV: - case DXGI_FORMAT_R8_UNORM: - case DXGI_FORMAT_R8G8_UNORM: - case DXGI_FORMAT_R16_UNORM: - case DXGI_FORMAT_R16G16_UNORM: - return 1; - case DXGI_FORMAT_NV12: - case DXGI_FORMAT_P010: - case DXGI_FORMAT_P016: - return 2; - default: - break; - } - - return 0; -} - -gboolean -gst_d3d11_dxgi_format_get_size (DXGI_FORMAT format, guint width, guint height, - guint pitch, gsize offset[GST_VIDEO_MAX_PLANES], - gint stride[GST_VIDEO_MAX_PLANES], gsize * size) -{ - g_return_val_if_fail (format != DXGI_FORMAT_UNKNOWN, FALSE); - - switch (format) { - case DXGI_FORMAT_B8G8R8A8_UNORM: - case DXGI_FORMAT_R8G8B8A8_UNORM: - case 
DXGI_FORMAT_R10G10B10A2_UNORM: - case DXGI_FORMAT_AYUV: - case DXGI_FORMAT_R8_UNORM: - case DXGI_FORMAT_R8G8_UNORM: - case DXGI_FORMAT_R16_UNORM: - case DXGI_FORMAT_R16G16_UNORM: - offset[0] = 0; - stride[0] = pitch; - *size = pitch * height; - break; - case DXGI_FORMAT_NV12: - case DXGI_FORMAT_P010: - case DXGI_FORMAT_P016: - offset[0] = 0; - stride[0] = pitch; - offset[1] = offset[0] + stride[0] * height; - stride[1] = pitch; - *size = offset[1] + stride[1] * GST_ROUND_UP_2 (height / 2); - break; - default: - return FALSE; - } - - GST_LOG ("Calculated buffer size: %" G_GSIZE_FORMAT - " (dxgi format:%d, %dx%d, Pitch %d)", - *size, format, width, height, pitch); - - return TRUE; -} - -/** - * gst_d3d11_device_get_supported_caps: - * @device: a #GstD3DDevice - * @flags: D3D11_FORMAT_SUPPORT flags - * - * Check supported format with given flags - * - * Returns: a #GstCaps representing supported format - */ -GstCaps * -gst_d3d11_device_get_supported_caps (GstD3D11Device * device, - D3D11_FORMAT_SUPPORT flags) -{ - ID3D11Device *d3d11_device; - HRESULT hr; - gint i; - GValue v_list = G_VALUE_INIT; - GstCaps *supported_caps; - static const GstVideoFormat format_list[] = { - GST_VIDEO_FORMAT_BGRA, - GST_VIDEO_FORMAT_RGBA, - GST_VIDEO_FORMAT_RGB10A2_LE, - GST_VIDEO_FORMAT_VUYA, - GST_VIDEO_FORMAT_NV12, - GST_VIDEO_FORMAT_P010_10LE, - GST_VIDEO_FORMAT_P016_LE, - GST_VIDEO_FORMAT_I420, - GST_VIDEO_FORMAT_I420_10LE, - }; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - d3d11_device = gst_d3d11_device_get_device_handle (device); - g_value_init (&v_list, GST_TYPE_LIST); - - for (i = 0; i < G_N_ELEMENTS (format_list); i++) { - UINT format_support = 0; - GstVideoFormat format; - const GstD3D11Format *d3d11_format; - - d3d11_format = gst_d3d11_device_format_from_gst (device, format_list[i]); - if (!d3d11_format || d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) - continue; - - format = d3d11_format->format; - hr = ID3D11Device_CheckFormatSupport (d3d11_device, 
- d3d11_format->dxgi_format, &format_support); - - if (SUCCEEDED (hr) && ((format_support & flags) == flags)) { - GValue v_str = G_VALUE_INIT; - g_value_init (&v_str, G_TYPE_STRING); - - GST_LOG_OBJECT (device, "d3d11 device can support %s with flags 0x%x", - gst_video_format_to_string (format), flags); - g_value_set_string (&v_str, gst_video_format_to_string (format)); - gst_value_list_append_and_take_value (&v_list, &v_str); - } - } - - supported_caps = gst_caps_new_simple ("video/x-raw", - "width", GST_TYPE_INT_RANGE, 1, G_MAXINT, - "height", GST_TYPE_INT_RANGE, 1, G_MAXINT, - "framerate", GST_TYPE_FRACTION_RANGE, 0, 1, G_MAXINT, 1, NULL); - gst_caps_set_value (supported_caps, "format", &v_list); - g_value_unset (&v_list); - - gst_caps_set_features_simple (supported_caps, - gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)); - - return supported_caps; -} - -#if (DXGI_HEADER_VERSION >= 5) -gboolean -gst_d3d11_hdr_meta_data_to_dxgi (GstVideoMasteringDisplayInfo * minfo, - GstVideoContentLightLevel * cll, DXGI_HDR_METADATA_HDR10 * dxgi_hdr10) -{ - g_return_val_if_fail (dxgi_hdr10 != NULL, FALSE); - - memset (dxgi_hdr10, 0, sizeof (DXGI_HDR_METADATA_HDR10)); - - if (minfo) { - dxgi_hdr10->RedPrimary[0] = minfo->display_primaries[0].x; - dxgi_hdr10->RedPrimary[1] = minfo->display_primaries[0].y; - dxgi_hdr10->GreenPrimary[0] = minfo->display_primaries[1].x; - dxgi_hdr10->GreenPrimary[1] = minfo->display_primaries[1].y; - dxgi_hdr10->BluePrimary[0] = minfo->display_primaries[2].x; - dxgi_hdr10->BluePrimary[1] = minfo->display_primaries[2].y; - - dxgi_hdr10->WhitePoint[0] = minfo->white_point.x; - dxgi_hdr10->WhitePoint[1] = minfo->white_point.y; - dxgi_hdr10->MaxMasteringLuminance = minfo->max_display_mastering_luminance; - dxgi_hdr10->MinMasteringLuminance = minfo->min_display_mastering_luminance; - } - - if (cll) { - dxgi_hdr10->MaxContentLightLevel = cll->max_content_light_level; - dxgi_hdr10->MaxFrameAverageLightLevel = 
cll->max_frame_average_light_level; - } - - return TRUE; -} -#endif - -#if (DXGI_HEADER_VERSION >= 4) -typedef enum -{ - GST_DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 = 0, - GST_DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 = 1, - GST_DXGI_COLOR_SPACE_RGB_STUDIO_G22_NONE_P709 = 2, - GST_DXGI_COLOR_SPACE_RGB_STUDIO_G22_NONE_P2020 = 3, - GST_DXGI_COLOR_SPACE_RESERVED = 4, - GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_NONE_P709_X601 = 5, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P601 = 6, - GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_LEFT_P601 = 7, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P709 = 8, - GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_LEFT_P709 = 9, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P2020 = 10, - GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_LEFT_P2020 = 11, - GST_DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020 = 12, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G2084_LEFT_P2020 = 13, - GST_DXGI_COLOR_SPACE_RGB_STUDIO_G2084_NONE_P2020 = 14, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_TOPLEFT_P2020 = 15, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G2084_TOPLEFT_P2020 = 16, - GST_DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P2020 = 17, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_GHLG_TOPLEFT_P2020 = 18, - GST_DXGI_COLOR_SPACE_YCBCR_FULL_GHLG_TOPLEFT_P2020 = 19, - GST_DXGI_COLOR_SPACE_RGB_STUDIO_G24_NONE_P709 = 20, - GST_DXGI_COLOR_SPACE_RGB_STUDIO_G24_NONE_P2020 = 21, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G24_LEFT_P709 = 22, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G24_LEFT_P2020 = 23, - GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G24_TOPLEFT_P2020 = 24, - GST_DXGI_COLOR_SPACE_CUSTOM = 0xFFFFFFFF -} GST_DXGI_COLOR_SPACE_TYPE; - -/* https://docs.microsoft.com/en-us/windows/win32/api/dxgicommon/ne-dxgicommon-dxgi_color_space_type */ - -#define MAKE_COLOR_MAP(d,r,m,t,p) \ - { GST_DXGI_COLOR_SPACE_ ##d, GST_VIDEO_COLOR_RANGE ##r, \ - GST_VIDEO_COLOR_MATRIX_ ##m, GST_VIDEO_TRANSFER_ ##t, \ - GST_VIDEO_COLOR_PRIMARIES_ ##p } - -static const GstDxgiColorSpace rgb_colorspace_map[] = { - /* RGB_FULL_G22_NONE_P709 */ - MAKE_COLOR_MAP 
(RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT709, BT709), - - /* RGB_FULL_G10_NONE_P709 */ - MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, GAMMA10, BT709), - - /* RGB_STUDIO_G22_NONE_P709 */ - MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _16_235, UNKNOWN, BT709, BT709), - - /* RGB_STUDIO_G22_NONE_P2020 */ - MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _16_235, UNKNOWN, BT2020_10, BT2020), - MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _16_235, UNKNOWN, BT2020_12, BT2020), - - /* RGB_FULL_G2084_NONE_P2020 */ - MAKE_COLOR_MAP (RGB_FULL_G2084_NONE_P2020, _0_255, UNKNOWN, SMPTE2084, - BT2020), - - /* RGB_STUDIO_G2084_NONE_P2020 */ - MAKE_COLOR_MAP (RGB_STUDIO_G2084_NONE_P2020, - _16_235, UNKNOWN, SMPTE2084, BT2020), - - /* RGB_FULL_G22_NONE_P2020 */ - MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, BT2020_10, BT2020), - MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, BT2020_12, BT2020), - - /* RGB_STUDIO_G24_NONE_P709 */ - MAKE_COLOR_MAP (RGB_STUDIO_G24_NONE_P709, _16_235, UNKNOWN, SRGB, BT709), - - /* RGB_STUDIO_G24_NONE_P2020 */ - MAKE_COLOR_MAP (RGB_STUDIO_G24_NONE_P709, _16_235, UNKNOWN, SRGB, BT2020), -}; - -static const GstDxgiColorSpace yuv_colorspace_map[] = { - /* YCBCR_FULL_G22_NONE_P709_X601 */ - MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT709, BT709), - - /* YCBCR_STUDIO_G22_LEFT_P601 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT601, SMPTE170M), - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT709, SMPTE170M), - - /* YCBCR_FULL_G22_LEFT_P601 */ - MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT601, SMPTE170M), - MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT709, SMPTE170M), - - /* YCBCR_STUDIO_G22_LEFT_P709 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT709, BT709), - - /* YCBCR_FULL_G22_LEFT_P709 */ - MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT709, BT709), - - /* YCBCR_STUDIO_G22_LEFT_P2020 */ - MAKE_COLOR_MAP 
(YCBCR_STUDIO_G22_LEFT_P2020, _16_235, BT2020, BT2020_10, - BT2020), - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P2020, _16_235, BT2020, BT2020_12, - BT2020), - - /* YCBCR_FULL_G22_LEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P2020, _0_255, BT2020, BT2020_10, BT2020), - MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P2020, _0_255, BT2020, BT2020_12, BT2020), - - /* YCBCR_STUDIO_G2084_LEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G2084_LEFT_P2020, _16_235, BT2020, SMPTE2084, - BT2020), - - /* YCBCR_STUDIO_G22_TOPLEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_TOPLEFT_P2020, _16_235, BT2020, BT2020_10, - BT2020), - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_TOPLEFT_P2020, _16_235, BT2020, BT2020_12, - BT2020), - - /* YCBCR_STUDIO_G2084_TOPLEFT_P2020 */ - /* FIXME: check chroma-site to differentiate this from - * YCBCR_STUDIO_G2084_LEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G2084_TOPLEFT_P2020, _16_235, BT2020, SMPTE2084, - BT2020), - - /* YCBCR_STUDIO_GHLG_TOPLEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_GHLG_TOPLEFT_P2020, _16_235, BT2020, - ARIB_STD_B67, BT2020), - - /* YCBCR_STUDIO_GHLG_TOPLEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_FULL_GHLG_TOPLEFT_P2020, _0_255, BT2020, ARIB_STD_B67, - BT2020), - - /* YCBCR_STUDIO_G24_LEFT_P709 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, SRGB, BT709), - - /* YCBCR_STUDIO_G24_LEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G24_LEFT_P2020, _16_235, BT2020, SRGB, BT2020), - - /* YCBCR_STUDIO_G24_TOPLEFT_P2020 */ - /* FIXME: check chroma-site to differentiate this from - * YCBCR_STUDIO_G24_LEFT_P2020 */ - MAKE_COLOR_MAP (YCBCR_STUDIO_G24_TOPLEFT_P2020, _16_235, BT2020, SRGB, - BT2020), -}; - -#define SCORE_RANGE_MISMATCH 1 -#define SCORE_MATRIX_MISMATCH 5 -#define SCORE_TRANSFER_MISMATCH 5 -#define SCORE_PRIMARY_MISMATCH 10 - -static gint -get_score (GstVideoInfo * info, const GstDxgiColorSpace * color_map, - gboolean is_yuv) -{ - gint loss = 0; - GstVideoColorimetry *color = &info->colorimetry; - - if (color->range != color_map->range) 
- loss += SCORE_RANGE_MISMATCH; - - if (is_yuv && color->matrix != color_map->matrix) - loss += SCORE_MATRIX_MISMATCH; - - if (color->transfer != color_map->transfer) - loss += SCORE_TRANSFER_MISMATCH; - - if (color->primaries != color_map->primaries) - loss += SCORE_PRIMARY_MISMATCH; - - return loss; -} - -static const GstDxgiColorSpace * -gst_d3d11_video_info_to_dxgi_color_space_rgb (GstVideoInfo * info) -{ - gint best_score = G_MAXINT; - gint score, i; - const GstDxgiColorSpace *colorspace = NULL; - - for (i = 0; i < G_N_ELEMENTS (rgb_colorspace_map); i++) { - score = get_score (info, &rgb_colorspace_map[i], TRUE); - - if (score < best_score) { - best_score = score; - colorspace = &rgb_colorspace_map[i]; - - if (score == 0) - break; - } - } - - return colorspace; -} - -static const GstDxgiColorSpace * -gst_d3d11_video_info_to_dxgi_color_space_yuv (GstVideoInfo * info) -{ - gint best_score = G_MAXINT; - gint score, i; - const GstDxgiColorSpace *colorspace = NULL; - - for (i = 0; i < G_N_ELEMENTS (yuv_colorspace_map); i++) { - score = get_score (info, &yuv_colorspace_map[i], TRUE); - - if (score < best_score) { - best_score = score; - colorspace = &yuv_colorspace_map[i]; - - if (score == 0) - break; - } - } - - return colorspace; -} - -const GstDxgiColorSpace * -gst_d3d11_video_info_to_dxgi_color_space (GstVideoInfo * info) -{ - g_return_val_if_fail (info != NULL, NULL); - - if (GST_VIDEO_INFO_IS_RGB (info)) { - return gst_d3d11_video_info_to_dxgi_color_space_rgb (info); - } else if (GST_VIDEO_INFO_IS_YUV (info)) { - return gst_d3d11_video_info_to_dxgi_color_space_yuv (info); - } - - return NULL; -} - -const GstDxgiColorSpace * -gst_d3d11_find_swap_chain_color_space (GstVideoInfo * info, - IDXGISwapChain3 * swapchain, gboolean use_hdr10) -{ - const GstDxgiColorSpace *colorspace = NULL; - gint best_score = G_MAXINT; - gint i; - - g_return_val_if_fail (info != NULL, FALSE); - g_return_val_if_fail (swapchain != NULL, FALSE); - - if (!GST_VIDEO_INFO_IS_RGB (info)) { - 
GST_WARNING ("Swapchain colorspace should be RGB format"); - return FALSE; - } - - for (i = 0; i < G_N_ELEMENTS (rgb_colorspace_map); i++) { - UINT can_support = 0; - HRESULT hr; - gint score; - GST_DXGI_COLOR_SPACE_TYPE cur_type = - rgb_colorspace_map[i].dxgi_color_space_type; - - /* FIXME: Non-HDR colorspace with BT2020 primaries will break rendering. - * https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/1175 - * To workaround it, BT709 colorspace will be chosen for non-HDR case. - */ - if (!use_hdr10 && - rgb_colorspace_map[i].primaries == GST_VIDEO_COLOR_PRIMARIES_BT2020) - continue; - - hr = IDXGISwapChain3_CheckColorSpaceSupport (swapchain, - cur_type, &can_support); - - if (FAILED (hr)) - continue; - - if ((can_support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) == - DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) { - score = get_score (info, &rgb_colorspace_map[i], FALSE); - - GST_DEBUG ("colorspace %d supported, score %d", cur_type, score); - - if (score < best_score) { - best_score = score; - colorspace = &rgb_colorspace_map[i]; - } - } - } - - return colorspace; -} - -#endif
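The deleted `gstd3d11format.c` picks a DXGI color space for a given `GstVideoInfo` by scoring every entry of a colorimetry mapping table: each mismatching field adds a weighted penalty (`SCORE_*_MISMATCH`), and the candidate with the lowest total wins, with an early exit on an exact match. A minimal standalone sketch of that scheme, with the GstVideo enums simplified to plain ints and our own function names:

```c
#include <assert.h>
#include <limits.h>

/* Weights mirror the deleted code: primaries matter most,
 * range least; the matrix only counts for YUV formats. */
#define SCORE_RANGE_MISMATCH    1
#define SCORE_MATRIX_MISMATCH   5
#define SCORE_TRANSFER_MISMATCH 5
#define SCORE_PRIMARY_MISMATCH  10

typedef struct
{
  int range, matrix, transfer, primaries;
} ColorMap;

static int
score (const ColorMap * want, const ColorMap * cand, int is_yuv)
{
  int loss = 0;

  if (want->range != cand->range)
    loss += SCORE_RANGE_MISMATCH;
  if (is_yuv && want->matrix != cand->matrix)
    loss += SCORE_MATRIX_MISMATCH;
  if (want->transfer != cand->transfer)
    loss += SCORE_TRANSFER_MISMATCH;
  if (want->primaries != cand->primaries)
    loss += SCORE_PRIMARY_MISMATCH;

  return loss;
}

static int
best_match (const ColorMap * want, const ColorMap * cands, int n, int is_yuv)
{
  int best = -1, best_score = INT_MAX;
  int i;

  for (i = 0; i < n; i++) {
    int s = score (want, &cands[i], is_yuv);

    if (s < best_score) {
      best_score = s;
      best = i;
      if (s == 0)
        break;                  /* exact match, stop early */
    }
  }
  return best;
}
```

The weighting makes a primaries mismatch twice as costly as a transfer or matrix mismatch, so a candidate with the right gamut is preferred even when its range flag is wrong.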
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11format.h
Deleted
@@ -1,85 +0,0 @@
-/* GStreamer
- * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_D3D11_FORMAT_H__
-#define __GST_D3D11_FORMAT_H__
-
-#include <gst/gst.h>
-#include <gst/video/video.h>
-
-#include "gstd3d11_fwd.h"
-
-#define GST_D3D11_FORMATS \
-  "{ BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }"
-#define GST_D3D11_N_FORMATS 9
-
-G_BEGIN_DECLS
-
-typedef struct _GstDxgiColorSpace GstDxgiColorSpace;
-
-struct _GstD3D11Format
-{
-  GstVideoFormat format;
-
-  /* direct mapping to dxgi format if applicable */
-  DXGI_FORMAT dxgi_format;
-
-  /* formats for texture processing */
-  DXGI_FORMAT resource_format[GST_VIDEO_MAX_COMPONENTS];
-};
-
-struct _GstDxgiColorSpace
-{
-  guint dxgi_color_space_type;
-  GstVideoColorRange range;
-  GstVideoColorMatrix matrix;
-  GstVideoTransferFunction transfer;
-  GstVideoColorPrimaries primaries;
-};
-
-guint     gst_d3d11_dxgi_format_n_planes (DXGI_FORMAT format);
-
-gboolean  gst_d3d11_dxgi_format_get_size (DXGI_FORMAT format,
-                                          guint width,
-                                          guint height,
-                                          guint pitch,
-                                          gsize offset[GST_VIDEO_MAX_PLANES],
-                                          gint stride[GST_VIDEO_MAX_PLANES],
-                                          gsize *size);
-
-GstCaps * gst_d3d11_device_get_supported_caps (GstD3D11Device * device,
-                                               D3D11_FORMAT_SUPPORT flags);
-
-#if (DXGI_HEADER_VERSION >= 5)
-gboolean  gst_d3d11_hdr_meta_data_to_dxgi (GstVideoMasteringDisplayInfo * minfo,
-                                           GstVideoContentLightLevel * cll,
-                                           DXGI_HDR_METADATA_HDR10 * dxgi_hdr10);
-#endif
-
-#if (DXGI_HEADER_VERSION >= 4)
-const GstDxgiColorSpace * gst_d3d11_video_info_to_dxgi_color_space (GstVideoInfo * info);
-
-const GstDxgiColorSpace * gst_d3d11_find_swap_chain_color_space (GstVideoInfo * info,
-                                                                 IDXGISwapChain3 * swapchain,
-                                                                 gboolean use_hdr10);
-#endif
-
-G_END_DECLS
-
-#endif /* __GST_D3D11_FORMAT_H__ */
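Among the declarations deleted with this header is `gst_d3d11_dxgi_format_get_size ()`, which fills per-plane offsets and strides for a DXGI format and returns the total buffer size. For the two-plane NV12/P010/P016 case, the layout computed by the deleted implementation can be sketched standalone like this (the function name and signature here are ours, not GStreamer API):

```c
#include <assert.h>
#include <stddef.h>

/* Two-plane (NV12/P010/P016) layout: a full-height luma plane
 * followed by a chroma plane of GST_ROUND_UP_2 (height / 2) rows,
 * both using the texture pitch as stride. */
static size_t
two_plane_size (unsigned height, unsigned pitch,
    size_t offset[2], int stride[2])
{
  /* GST_ROUND_UP_2: round the chroma row count up to an even number */
  unsigned chroma_rows = ((height / 2) + 1) & ~1u;

  offset[0] = 0;
  stride[0] = (int) pitch;
  offset[1] = offset[0] + (size_t) stride[0] * height;
  stride[1] = (int) pitch;

  return offset[1] + (size_t) stride[1] * chroma_rows;
}
```

Note that `pitch` is the driver-reported row pitch of the mapped texture, not the visible width, so the result can be larger than `width * height * 3 / 2` on padded textures.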
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11h264dec.c
Deleted
@@ -1,1281 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - * - * NOTE: some of implementations are copied/modified from Chromium code - * - * Copyright 2015 The Chromium Authors. All rights reserved. - * - * Redistribution and use in source and binary forms, with or without - * modification, are permitted provided that the following conditions are - * met: - * - * * Redistributions of source code must retain the above copyright - * notice, this list of conditions and the following disclaimer. - * * Redistributions in binary form must reproduce the above - * copyright notice, this list of conditions and the following disclaimer - * in the documentation and/or other materials provided with the - * distribution. - * * Neither the name of Google Inc. nor the names of its - * contributors may be used to endorse or promote products derived from - * this software without specific prior written permission. - * - * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS - * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT - * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR - * A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT - * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, - * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT - * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, - * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY - * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT - * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE - * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - */ - -#ifdef HAVE_CONFIG_H -#include <config.h> -#endif - -#include "gstd3d11h264dec.h" -#include "gstd3d11memory.h" -#include "gstd3d11bufferpool.h" - -#include <gst/codecs/gsth264decoder.h> -#include <string.h> - -/* HACK: to expose dxva data structure on UWP */ -#ifdef WINAPI_PARTITION_DESKTOP -#undef WINAPI_PARTITION_DESKTOP -#endif -#define WINAPI_PARTITION_DESKTOP 1 -#include <d3d9.h> -#include <dxva.h> - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_h264_dec_debug); -#define GST_CAT_DEFAULT gst_d3d11_h264_dec_debug - -enum -{ - PROP_0, - PROP_ADAPTER, - PROP_DEVICE_ID, - PROP_VENDOR_ID, -}; - -/* copied from d3d11.h since mingw header doesn't define them */ -DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_H264_IDCT_FGT, 0x1b81be67, 0xa0c7, - 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5); -DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_NOFGT, 0x1b81be68, 0xa0c7, - 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5); -DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_FGT, 0x1b81be69, 0xa0c7, - 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5); - -typedef struct _GstD3D11H264Dec -{ - GstH264Decoder parent; - - GstVideoCodecState *output_state; - - GstD3D11Device *device; - - guint width, height; - guint coded_width, coded_height; - guint bitdepth; - guint chroma_format_idc; - GstVideoFormat out_format; - - gint max_dpb_size; - - /* Array of DXVA_Slice_H264_Short */ - GArray *slice_list; - - GstD3D11Decoder 
*d3d11_decoder; - - /* Pointing current bitstream buffer */ - gboolean bad_aligned_bitstream_buffer; - guint written_buffer_size; - guint remaining_buffer_size; - guint8 *bitstream_buffer_data; - - gboolean use_d3d11_output; - - DXVA_PicEntry_H264 ref_frame_list[16]; - INT field_order_cnt_list[16][2]; - USHORT frame_num_list[16]; - UINT used_for_reference_flags; - USHORT non_existing_frame_flags; -} GstD3D11H264Dec; - -typedef struct _GstD3D11H264DecClass -{ - GstH264DecoderClass parent_class; - guint adapter; - guint device_id; - guint vendor_id; -} GstD3D11H264DecClass; - -static GstElementClass *parent_class = NULL; - -#define GST_D3D11_H264_DEC(object) ((GstD3D11H264Dec *) (object)) -#define GST_D3D11_H264_DEC_GET_CLASS(object) \ - (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11H264DecClass)) - -static void gst_d3d11_h264_dec_get_property (GObject * object, - guint prop_id, GValue * value, GParamSpec * pspec); -static void gst_d3d11_h264_dec_dispose (GObject * object); -static void gst_d3d11_h264_dec_set_context (GstElement * element, - GstContext * context); - -static gboolean gst_d3d11_h264_dec_open (GstVideoDecoder * decoder); -static gboolean gst_d3d11_h264_dec_close (GstVideoDecoder * decoder); -static gboolean gst_d3d11_h264_dec_negotiate (GstVideoDecoder * decoder); -static gboolean gst_d3d11_h264_dec_decide_allocation (GstVideoDecoder * - decoder, GstQuery * query); -static gboolean gst_d3d11_h264_dec_src_query (GstVideoDecoder * decoder, - GstQuery * query); - -/* GstH264Decoder */ -static gboolean gst_d3d11_h264_dec_new_sequence (GstH264Decoder * decoder, - const GstH264SPS * sps, gint max_dpb_size); -static gboolean gst_d3d11_h264_dec_new_picture (GstH264Decoder * decoder, - GstVideoCodecFrame * frame, GstH264Picture * picture); -static GstFlowReturn gst_d3d11_h264_dec_output_picture (GstH264Decoder * - decoder, GstVideoCodecFrame * frame, GstH264Picture * picture); -static gboolean gst_d3d11_h264_dec_start_picture 
(GstH264Decoder * decoder, - GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb); -static gboolean gst_d3d11_h264_dec_decode_slice (GstH264Decoder * decoder, - GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, - GArray * ref_pic_list1); -static gboolean gst_d3d11_h264_dec_end_picture (GstH264Decoder * decoder, - GstH264Picture * picture); - -static void -gst_d3d11_h264_dec_class_init (GstD3D11H264DecClass * klass, gpointer data) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); - GstH264DecoderClass *h264decoder_class = GST_H264_DECODER_CLASS (klass); - GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; - gchar *long_name; - - gobject_class->get_property = gst_d3d11_h264_dec_get_property; - gobject_class->dispose = gst_d3d11_h264_dec_dispose; - - g_object_class_install_property (gobject_class, PROP_ADAPTER, - g_param_spec_uint ("adapter", "Adapter", - "DXGI Adapter index for creating device", - 0, G_MAXUINT32, cdata->adapter, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_DEVICE_ID, - g_param_spec_uint ("device-id", "Device Id", - "DXGI Device ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_VENDOR_ID, - g_param_spec_uint ("vendor-id", "Vendor Id", - "DXGI Vendor ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - parent_class = g_type_class_peek_parent (klass); - - klass->adapter = cdata->adapter; - klass->device_id = cdata->device_id; - klass->vendor_id = cdata->vendor_id; - - element_class->set_context = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_set_context); - - long_name = g_strdup_printf ("Direct3D11 H.264 %s Decoder", - cdata->description); - gst_element_class_set_metadata (element_class, long_name, - 
"Codec/Decoder/Video/Hardware", - "A Direct3D11 based H.264 video decoder", - "Seungha Yang <seungha.yang@navercorp.com>"); - g_free (long_name); - - gst_element_class_add_pad_template (element_class, - gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, - cdata->sink_caps)); - gst_element_class_add_pad_template (element_class, - gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, - cdata->src_caps)); - gst_d3d11_decoder_class_data_free (cdata); - - decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_open); - decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_close); - decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_negotiate); - decoder_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_decide_allocation); - decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_src_query); - - h264decoder_class->new_sequence = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_new_sequence); - h264decoder_class->new_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_new_picture); - h264decoder_class->output_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_output_picture); - h264decoder_class->start_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_start_picture); - h264decoder_class->decode_slice = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_decode_slice); - h264decoder_class->end_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_end_picture); -} - -static void -gst_d3d11_h264_dec_init (GstD3D11H264Dec * self) -{ - self->slice_list = g_array_new (FALSE, TRUE, sizeof (DXVA_Slice_H264_Short)); -} - -static void -gst_d3d11_h264_dec_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11H264DecClass *klass = GST_D3D11_H264_DEC_GET_CLASS (object); - - switch (prop_id) { - case PROP_ADAPTER: - g_value_set_uint (value, klass->adapter); - break; - case PROP_DEVICE_ID: - g_value_set_uint (value, klass->device_id); - break; - case PROP_VENDOR_ID: - g_value_set_uint (value, 
klass->vendor_id); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_d3d11_h264_dec_dispose (GObject * object) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (object); - - if (self->slice_list) { - g_array_unref (self->slice_list); - self->slice_list = NULL; - } - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_d3d11_h264_dec_set_context (GstElement * element, GstContext * context) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (element); - GstD3D11H264DecClass *klass = GST_D3D11_H264_DEC_GET_CLASS (self); - - gst_d3d11_handle_set_context (element, context, klass->adapter, - &self->device); - - GST_ELEMENT_CLASS (parent_class)->set_context (element, context); -} - -/* Clear all codec specific (e.g., SPS) data */ -static void -gst_d3d11_h264_dec_reset (GstD3D11H264Dec * self) -{ - self->width = 0; - self->height = 0; - self->coded_width = 0; - self->coded_height = 0; - self->bitdepth = 0; - self->chroma_format_idc = 0; - self->out_format = GST_VIDEO_FORMAT_UNKNOWN; - self->max_dpb_size = 0; -} - -static gboolean -gst_d3d11_h264_dec_open (GstVideoDecoder * decoder) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - GstD3D11H264DecClass *klass = GST_D3D11_H264_DEC_GET_CLASS (self); - - if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), klass->adapter, - &self->device)) { - GST_ERROR_OBJECT (self, "Cannot create d3d11device"); - return FALSE; - } - - self->d3d11_decoder = gst_d3d11_decoder_new (self->device); - - if (!self->d3d11_decoder) { - GST_ERROR_OBJECT (self, "Cannot create d3d11 decoder"); - gst_clear_object (&self->device); - return FALSE; - } - - gst_d3d11_h264_dec_reset (self); - - return TRUE; -} - -static gboolean -gst_d3d11_h264_dec_close (GstVideoDecoder * decoder) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - - if (self->output_state) - gst_video_codec_state_unref (self->output_state); - self->output_state = NULL; - - 
gst_clear_object (&self->d3d11_decoder); - gst_clear_object (&self->device); - - return TRUE; -} - -static gboolean -gst_d3d11_h264_dec_negotiate (GstVideoDecoder * decoder) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - GstH264Decoder *h264dec = GST_H264_DECODER (decoder); - - if (!gst_d3d11_decoder_negotiate (decoder, h264dec->input_state, - self->out_format, self->width, self->height, &self->output_state, - &self->use_d3d11_output)) - return FALSE; - - return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); -} - -static gboolean -gst_d3d11_h264_dec_decide_allocation (GstVideoDecoder * decoder, - GstQuery * query) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - - if (!gst_d3d11_decoder_decide_allocation (decoder, query, self->device, - GST_D3D11_CODEC_H264, self->use_d3d11_output)) - return FALSE; - - return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation - (decoder, query); -} - -static gboolean -gst_d3d11_h264_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - - switch (GST_QUERY_TYPE (query)) { - case GST_QUERY_CONTEXT: - if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), - query, self->device)) { - return TRUE; - } - break; - default: - break; - } - - return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); -} - -static gboolean -gst_d3d11_h264_dec_new_sequence (GstH264Decoder * decoder, - const GstH264SPS * sps, gint max_dpb_size) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - gint crop_width, crop_height; - gboolean modified = FALSE; - static const GUID *supported_profiles[] = { - &GST_GUID_D3D11_DECODER_PROFILE_H264_IDCT_FGT, - &GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_NOFGT, - &GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_FGT - }; - - GST_LOG_OBJECT (self, "new sequence"); - - if (sps->frame_cropping_flag) { - crop_width = sps->crop_rect_width; - crop_height = sps->crop_rect_height; - } else { - 
crop_width = sps->width; - crop_height = sps->height; - } - - if (self->width != crop_width || self->height != crop_height || - self->coded_width != sps->width || self->coded_height != sps->height) { - GST_INFO_OBJECT (self, "resolution changed %dx%d (%dx%d)", - crop_width, crop_height, sps->width, sps->height); - self->width = crop_width; - self->height = crop_height; - self->coded_width = sps->width; - self->coded_height = sps->height; - modified = TRUE; - } - - if (self->bitdepth != sps->bit_depth_luma_minus8 + 8) { - GST_INFO_OBJECT (self, "bitdepth changed"); - self->bitdepth = sps->bit_depth_luma_minus8 + 8; - modified = TRUE; - } - - if (self->chroma_format_idc != sps->chroma_format_idc) { - GST_INFO_OBJECT (self, "chroma format changed"); - self->chroma_format_idc = sps->chroma_format_idc; - modified = TRUE; - } - - if (self->max_dpb_size < max_dpb_size) { - GST_INFO_OBJECT (self, "Requires larger DPB size (%d -> %d)", - self->max_dpb_size, max_dpb_size); - modified = TRUE; - } - - if (modified || !self->d3d11_decoder->opened) { - GstVideoInfo info; - - self->out_format = GST_VIDEO_FORMAT_UNKNOWN; - - if (self->bitdepth == 8) { - if (self->chroma_format_idc == 1) - self->out_format = GST_VIDEO_FORMAT_NV12; - else { - GST_FIXME_OBJECT (self, "Could not support 8bits non-4:2:0 format"); - } - } else if (self->bitdepth == 10) { - if (self->chroma_format_idc == 1) - self->out_format = GST_VIDEO_FORMAT_P010_10LE; - else { - GST_FIXME_OBJECT (self, "Could not support 10bits non-4:2:0 format"); - } - } - - if (self->out_format == GST_VIDEO_FORMAT_UNKNOWN) { - GST_ERROR_OBJECT (self, "Could not support bitdepth/chroma format"); - return FALSE; - } - - gst_video_info_set_format (&info, - self->out_format, self->width, self->height); - - /* Store configured DPB size here. Then, it will be referenced later - * to decide whether we need to re-open decoder object or not. 
- * For instance, if every configuration is the same apart from DPB size and - * new DPB size is decreased, we can reuse existing decoder object. - */ - self->max_dpb_size = max_dpb_size; - gst_d3d11_decoder_reset (self->d3d11_decoder); - if (!gst_d3d11_decoder_open (self->d3d11_decoder, GST_D3D11_CODEC_H264, - &info, self->coded_width, self->coded_height, - /* Additional 4 views margin for zero-copy rendering */ - max_dpb_size + 4, - supported_profiles, G_N_ELEMENTS (supported_profiles))) { - GST_ERROR_OBJECT (self, "Failed to create decoder"); - return FALSE; - } - - if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { - GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; - } - } - - return TRUE; -} - -static gboolean -gst_d3d11_h264_dec_get_bitstream_buffer (GstD3D11H264Dec * self) -{ - GST_TRACE_OBJECT (self, "Getting bitstream buffer"); - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_BITSTREAM, &self->remaining_buffer_size, - (gpointer *) & self->bitstream_buffer_data)) { - GST_ERROR_OBJECT (self, "Failed to get bitstream buffer"); - return FALSE; - } - - GST_TRACE_OBJECT (self, "Got bitstream buffer %p with size %d", - self->bitstream_buffer_data, self->remaining_buffer_size); - self->written_buffer_size = 0; - if ((self->remaining_buffer_size & 127) != 0) { - GST_WARNING_OBJECT (self, - "The size of bitstream buffer is not 128 bytes aligned"); - self->bad_aligned_bitstream_buffer = TRUE; - } else { - self->bad_aligned_bitstream_buffer = FALSE; - } - - return TRUE; -} - -static GstD3D11DecoderOutputView * -gst_d3d11_h264_dec_get_output_view_from_picture (GstD3D11H264Dec * self, - GstH264Picture * picture) -{ - GstBuffer *view_buffer; - GstD3D11DecoderOutputView *view; - - view_buffer = (GstBuffer *) gst_h264_picture_get_user_data (picture); - if (!view_buffer) { - GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); - return NULL; - } - - view = 
gst_d3d11_decoder_get_output_view_from_buffer (self->d3d11_decoder, - view_buffer); - if (!view) { - GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); - return NULL; - } - - return view; -} - -static gboolean -gst_d3d11_h264_dec_start_picture (GstH264Decoder * decoder, - GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - GstD3D11DecoderOutputView *view; - gint i; - GArray *dpb_array; - - view = gst_d3d11_h264_dec_get_output_view_from_picture (self, picture); - if (!view) { - GST_ERROR_OBJECT (self, "current picture does not have output view handle"); - return FALSE; - } - - GST_TRACE_OBJECT (self, "Begin frame"); - - if (!gst_d3d11_decoder_begin_frame (self->d3d11_decoder, view, 0, NULL)) { - GST_ERROR_OBJECT (self, "Failed to begin frame"); - return FALSE; - } - - for (i = 0; i < 16; i++) { - self->ref_frame_list[i].bPicEntry = 0xFF; - self->field_order_cnt_list[i][0] = 0; - self->field_order_cnt_list[i][1] = 0; - self->frame_num_list[i] = 0; - } - self->used_for_reference_flags = 0; - self->non_existing_frame_flags = 0; - - dpb_array = gst_h264_dpb_get_pictures_all (dpb); - - for (i = 0; i < dpb_array->len; i++) { - guint ref = 3; - GstH264Picture *other = g_array_index (dpb_array, GstH264Picture *, i); - GstD3D11DecoderOutputView *other_view; - gint id = 0xff; - - if (!other->ref) - continue; - - other_view = gst_d3d11_h264_dec_get_output_view_from_picture (self, other); - - if (other_view) - id = other_view->view_id; - - self->ref_frame_list[i].Index7Bits = id; - self->ref_frame_list[i].AssociatedFlag = other->long_term; - self->field_order_cnt_list[i][0] = other->top_field_order_cnt; - self->field_order_cnt_list[i][1] = other->bottom_field_order_cnt; - self->frame_num_list[i] = self->ref_frame_list[i].AssociatedFlag - ? 
other->long_term_pic_num : other->frame_num; - self->used_for_reference_flags |= ref << (2 * i); - self->non_existing_frame_flags |= (other->nonexisting) << i; - } - - g_array_unref (dpb_array); - g_array_set_size (self->slice_list, 0); - - return gst_d3d11_h264_dec_get_bitstream_buffer (self); -} - -static gboolean -gst_d3d11_h264_dec_new_picture (GstH264Decoder * decoder, - GstVideoCodecFrame * frame, GstH264Picture * picture) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - GstBuffer *view_buffer; - GstD3D11Memory *mem; - - view_buffer = gst_d3d11_decoder_get_output_view_buffer (self->d3d11_decoder); - if (!view_buffer) { - GST_ERROR_OBJECT (self, "No available output view buffer"); - return FALSE; - } - - mem = (GstD3D11Memory *) gst_buffer_peek_memory (view_buffer, 0); - - GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT " (index %d)", - view_buffer, mem->subresource_index); - - gst_h264_picture_set_user_data (picture, - view_buffer, (GDestroyNotify) gst_buffer_unref); - - GST_LOG_OBJECT (self, "New h264picture %p", picture); - - return TRUE; -} - -static GstFlowReturn -gst_d3d11_h264_dec_output_picture (GstH264Decoder * decoder, - GstVideoCodecFrame * frame, GstH264Picture * picture) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); - GstBuffer *output_buffer = NULL; - GstBuffer *view_buffer; - - GST_LOG_OBJECT (self, - "Outputting picture %p (poc %d)", picture, picture->pic_order_cnt); - - view_buffer = (GstBuffer *) gst_h264_picture_get_user_data (picture); - - if (!view_buffer) { - GST_ERROR_OBJECT (self, "Could not get output view"); - goto error; - } - - /* if downstream is d3d11 element and forward playback case, - * expose our decoder view without copy. 
In case of reverse playback, however, - * we cannot do that since baseclass will store the decoded buffer - * up to gop size but our dpb pool cannot be increased */ - if (self->use_d3d11_output && - gst_d3d11_decoder_supports_direct_rendering (self->d3d11_decoder) && - GST_VIDEO_DECODER (self)->input_segment.rate > 0) { - GstMemory *mem; - - output_buffer = gst_buffer_ref (view_buffer); - mem = gst_buffer_peek_memory (output_buffer, 0); - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - } else { - output_buffer = gst_video_decoder_allocate_output_buffer (vdec); - } - - if (!output_buffer) { - GST_ERROR_OBJECT (self, "Couldn't allocate output buffer"); - goto error; - } - - frame->output_buffer = output_buffer; - GST_BUFFER_PTS (output_buffer) = GST_BUFFER_PTS (frame->input_buffer); - GST_BUFFER_DTS (output_buffer) = GST_CLOCK_TIME_NONE; - GST_BUFFER_DURATION (output_buffer) = - GST_BUFFER_DURATION (frame->input_buffer); - - if (!gst_d3d11_decoder_process_output (self->d3d11_decoder, - &self->output_state->info, - GST_VIDEO_INFO_WIDTH (&self->output_state->info), - GST_VIDEO_INFO_HEIGHT (&self->output_state->info), - view_buffer, output_buffer)) { - GST_ERROR_OBJECT (self, "Failed to copy buffer"); - goto error; - } - - GST_LOG_OBJECT (self, "Finish frame %" GST_TIME_FORMAT, - GST_TIME_ARGS (GST_BUFFER_PTS (output_buffer))); - - gst_h264_picture_unref (picture); - - return gst_video_decoder_finish_frame (vdec, frame); - -error: - gst_video_decoder_drop_frame (vdec, frame); - gst_h264_picture_unref (picture); - - return GST_FLOW_ERROR; -} - -static gboolean -gst_d3d11_h264_dec_submit_slice_data (GstD3D11H264Dec * self) -{ - guint buffer_size; - gpointer buffer; - guint8 *data; - gsize offset = 0; - gint i; - D3D11_VIDEO_DECODER_BUFFER_DESC buffer_desc[4] = { 0, }; - gboolean ret; - DXVA_Slice_H264_Short *slice_data; - - if (self->slice_list->len < 1) { - GST_WARNING_OBJECT (self, "Nothing to submit"); - return FALSE; - } - - slice_data = 
&g_array_index (self->slice_list, DXVA_Slice_H264_Short, - self->slice_list->len - 1); - - /* The DXVA2 spec says that written bitstream data must be 128-byte - * aligned if the bitstream buffer contains the end of a slice - * (i.e., wBadSliceChopping == 0 or 2) */ - if (slice_data->wBadSliceChopping == 0 || slice_data->wBadSliceChopping == 2) { - guint padding = - MIN (GST_ROUND_UP_128 (self->written_buffer_size) - - self->written_buffer_size, self->remaining_buffer_size); - - if (padding) { - GST_TRACE_OBJECT (self, - "Written bitstream buffer size %u is not 128 bytes aligned, " - "add padding %u bytes", self->written_buffer_size, padding); - memset (self->bitstream_buffer_data, 0, padding); - self->written_buffer_size += padding; - slice_data->SliceBytesInBuffer += padding; - } - } - - GST_TRACE_OBJECT (self, "Getting slice control buffer"); - - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL, &buffer_size, &buffer)) { - GST_ERROR_OBJECT (self, "Couldn't get slice control buffer"); - return FALSE; - } - - data = buffer; - for (i = 0; i < self->slice_list->len; i++) { - DXVA_Slice_H264_Short *slice_data = - &g_array_index (self->slice_list, DXVA_Slice_H264_Short, i); - - memcpy (data + offset, slice_data, sizeof (DXVA_Slice_H264_Short)); - offset += sizeof (DXVA_Slice_H264_Short); - } - - GST_TRACE_OBJECT (self, "Release slice control buffer"); - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL)) { - GST_ERROR_OBJECT (self, "Failed to release slice control buffer"); - return FALSE; - } - - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_BITSTREAM)) { - GST_ERROR_OBJECT (self, "Failed to release bitstream buffer"); - return FALSE; - } - - buffer_desc[0].BufferType = D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS; - buffer_desc[0].DataOffset = 0; - buffer_desc[0].DataSize = sizeof (DXVA_PicParams_H264); 
- - buffer_desc[1].BufferType = - D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX; - buffer_desc[1].DataOffset = 0; - buffer_desc[1].DataSize = sizeof (DXVA_Qmatrix_H264); - - buffer_desc[2].BufferType = D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL; - buffer_desc[2].DataOffset = 0; - buffer_desc[2].DataSize = - sizeof (DXVA_Slice_H264_Short) * self->slice_list->len; - - if (!self->bad_aligned_bitstream_buffer - && (self->written_buffer_size & 127) != 0) { - GST_WARNING_OBJECT (self, - "Written bitstream buffer size %u is not 128 bytes aligned", - self->written_buffer_size); - } - - buffer_desc[3].BufferType = D3D11_VIDEO_DECODER_BUFFER_BITSTREAM; - buffer_desc[3].DataOffset = 0; - buffer_desc[3].DataSize = self->written_buffer_size; - - ret = gst_d3d11_decoder_submit_decoder_buffers (self->d3d11_decoder, - 4, buffer_desc); - - self->written_buffer_size = 0; - self->bitstream_buffer_data = NULL; - self->remaining_buffer_size = 0; - g_array_set_size (self->slice_list, 0); - - return ret; -} - -static gboolean -gst_d3d11_h264_dec_end_picture (GstH264Decoder * decoder, - GstH264Picture * picture) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - - GST_LOG_OBJECT (self, "end picture %p, (poc %d)", - picture, picture->pic_order_cnt); - - if (!gst_d3d11_h264_dec_submit_slice_data (self)) { - GST_ERROR_OBJECT (self, "Failed to submit slice data"); - return FALSE; - } - - if (!gst_d3d11_decoder_end_frame (self->d3d11_decoder)) { - GST_ERROR_OBJECT (self, "Failed to EndFrame"); - return FALSE; - } - - return TRUE; -} - -static void -gst_d3d11_h264_dec_picture_params_from_sps (GstD3D11H264Dec * self, - const GstH264SPS * sps, gboolean field_pic, DXVA_PicParams_H264 * params) -{ -#define COPY_FIELD(f) \ - (params)->f = (sps)->f - - params->wFrameWidthInMbsMinus1 = sps->pic_width_in_mbs_minus1; - params->wFrameHeightInMbsMinus1 = sps->pic_height_in_map_units_minus1; - params->residual_colour_transform_flag = sps->separate_colour_plane_flag; - 
params->MbaffFrameFlag = sps->mb_adaptive_frame_field_flag && field_pic; - params->field_pic_flag = field_pic; - params->MinLumaBipredSize8x8Flag = sps->level_idc >= 31; - - COPY_FIELD (num_ref_frames); - COPY_FIELD (chroma_format_idc); - COPY_FIELD (frame_mbs_only_flag); - COPY_FIELD (bit_depth_luma_minus8); - COPY_FIELD (bit_depth_chroma_minus8); - COPY_FIELD (log2_max_frame_num_minus4); - COPY_FIELD (pic_order_cnt_type); - COPY_FIELD (log2_max_pic_order_cnt_lsb_minus4); - COPY_FIELD (delta_pic_order_always_zero_flag); - COPY_FIELD (direct_8x8_inference_flag); - -#undef COPY_FIELD -} - -static void -gst_d3d11_h264_dec_picture_params_from_pps (GstD3D11H264Dec * self, - const GstH264PPS * pps, DXVA_PicParams_H264 * params) -{ -#define COPY_FIELD(f) \ - (params)->f = (pps)->f - - COPY_FIELD (constrained_intra_pred_flag); - COPY_FIELD (weighted_pred_flag); - COPY_FIELD (weighted_bipred_idc); - COPY_FIELD (transform_8x8_mode_flag); - COPY_FIELD (pic_init_qs_minus26); - COPY_FIELD (chroma_qp_index_offset); - COPY_FIELD (second_chroma_qp_index_offset); - COPY_FIELD (pic_init_qp_minus26); - COPY_FIELD (num_ref_idx_l0_active_minus1); - COPY_FIELD (num_ref_idx_l1_active_minus1); - COPY_FIELD (entropy_coding_mode_flag); - COPY_FIELD (pic_order_present_flag); - COPY_FIELD (deblocking_filter_control_present_flag); - COPY_FIELD (redundant_pic_cnt_present_flag); - COPY_FIELD (num_slice_groups_minus1); - COPY_FIELD (slice_group_map_type); - -#undef COPY_FIELD -} - -static void -gst_d3d11_h264_dec_picture_params_from_slice_header (GstD3D11H264Dec * - self, const GstH264SliceHdr * slice_header, DXVA_PicParams_H264 * params) -{ - params->sp_for_switch_flag = slice_header->sp_for_switch_flag; - params->field_pic_flag = slice_header->field_pic_flag; - params->CurrPic.AssociatedFlag = slice_header->bottom_field_flag; - params->IntraPicFlag = - GST_H264_IS_I_SLICE (slice_header) || GST_H264_IS_SI_SLICE (slice_header); -} - -static gboolean -gst_d3d11_h264_dec_fill_picture_params 
(GstD3D11H264Dec * self, - const GstH264SliceHdr * slice_header, DXVA_PicParams_H264 * params) -{ - const GstH264SPS *sps; - const GstH264PPS *pps; - - g_return_val_if_fail (slice_header->pps != NULL, FALSE); - g_return_val_if_fail (slice_header->pps->sequence != NULL, FALSE); - - pps = slice_header->pps; - sps = pps->sequence; - - memset (params, 0, sizeof (DXVA_PicParams_H264)); - - params->MbsConsecutiveFlag = 1; - params->Reserved16Bits = 3; - params->ContinuationFlag = 1; - params->Reserved8BitsA = 0; - params->Reserved8BitsB = 0; - params->StatusReportFeedbackNumber = 1; - - gst_d3d11_h264_dec_picture_params_from_sps (self, - sps, slice_header->field_pic_flag, params); - gst_d3d11_h264_dec_picture_params_from_pps (self, pps, params); - gst_d3d11_h264_dec_picture_params_from_slice_header (self, - slice_header, params); - - return TRUE; -} - -static gboolean -gst_d3d11_h264_dec_decode_slice (GstH264Decoder * decoder, - GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, - GArray * ref_pic_list1) -{ - GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); - GstH264SPS *sps; - GstH264PPS *pps; - DXVA_PicParams_H264 pic_params = { 0, }; - DXVA_Qmatrix_H264 iq_matrix = { 0, }; - guint d3d11_buffer_size = 0; - gpointer d3d11_buffer = NULL; - gint i, j; - GstD3D11DecoderOutputView *view; - - pps = slice->header.pps; - sps = pps->sequence; - - view = gst_d3d11_h264_dec_get_output_view_from_picture (self, picture); - - if (!view) { - GST_ERROR_OBJECT (self, "current picture does not have output view"); - return FALSE; - } - - gst_d3d11_h264_dec_fill_picture_params (self, &slice->header, &pic_params); - - pic_params.CurrPic.Index7Bits = view->view_id; - pic_params.RefPicFlag = picture->ref; - pic_params.frame_num = picture->frame_num; - - if (pic_params.field_pic_flag && pic_params.CurrPic.AssociatedFlag) { - pic_params.CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; - pic_params.CurrFieldOrderCnt[0] = 0; - } else if (pic_params.field_pic_flag 
&& !pic_params.CurrPic.AssociatedFlag) { - pic_params.CurrFieldOrderCnt[0] = picture->top_field_order_cnt; - pic_params.CurrFieldOrderCnt[1] = 0; - } else { - pic_params.CurrFieldOrderCnt[0] = picture->top_field_order_cnt; - pic_params.CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; - } - - memcpy (pic_params.RefFrameList, self->ref_frame_list, - sizeof (pic_params.RefFrameList)); - memcpy (pic_params.FieldOrderCntList, self->field_order_cnt_list, - sizeof (pic_params.FieldOrderCntList)); - memcpy (pic_params.FrameNumList, self->frame_num_list, - sizeof (pic_params.FrameNumList)); - - pic_params.UsedForReferenceFlags = self->used_for_reference_flags; - pic_params.NonExistingFrameFlags = self->non_existing_frame_flags; - - GST_TRACE_OBJECT (self, "Getting picture param decoder buffer"); - - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS, &d3d11_buffer_size, - &d3d11_buffer)) { - GST_ERROR_OBJECT (self, - "Failed to get decoder buffer for picture parameters"); - return FALSE; - } - - memcpy (d3d11_buffer, &pic_params, sizeof (DXVA_PicParams_H264)); - - GST_TRACE_OBJECT (self, "Release picture param decoder buffer"); - - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS)) { - GST_ERROR_OBJECT (self, "Failed to release decoder buffer"); - return FALSE; - } - - if (pps->pic_scaling_matrix_present_flag) { - for (i = 0; i < 6; i++) { - for (j = 0; j < 16; j++) { - iq_matrix.bScalingLists4x4[i][j] = pps->scaling_lists_4x4[i][j]; - } - } - - for (i = 0; i < 2; i++) { - for (j = 0; j < 64; j++) { - iq_matrix.bScalingLists8x8[i][j] = pps->scaling_lists_8x8[i][j]; - } - } - } else { - for (i = 0; i < 6; i++) { - for (j = 0; j < 16; j++) { - iq_matrix.bScalingLists4x4[i][j] = sps->scaling_lists_4x4[i][j]; - } - } - - for (i = 0; i < 2; i++) { - for (j = 0; j < 64; j++) { - iq_matrix.bScalingLists8x8[i][j] = sps->scaling_lists_8x8[i][j]; - } 
- } - } - - GST_TRACE_OBJECT (self, "Getting inverse quantization matrix buffer"); - - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX, - &d3d11_buffer_size, &d3d11_buffer)) { - GST_ERROR_OBJECT (self, - "Failed to get decoder buffer for inv. quantization matrix"); - return FALSE; - } - - memcpy (d3d11_buffer, &iq_matrix, sizeof (DXVA_Qmatrix_H264)); - - GST_TRACE_OBJECT (self, "Release inverse quantization matrix buffer"); - - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX)) { - GST_ERROR_OBJECT (self, "Failed to release decoder buffer"); - return FALSE; - } - - { - guint to_write = slice->nalu.size + 3; - gboolean is_first = TRUE; - - while (to_write > 0) { - guint bytes_to_copy; - gboolean is_last = TRUE; - DXVA_Slice_H264_Short slice_short = { 0, }; - - if (self->remaining_buffer_size < to_write && self->slice_list->len > 0) { - if (!gst_d3d11_h264_dec_submit_slice_data (self)) { - GST_ERROR_OBJECT (self, "Failed to submit bitstream buffers"); - return FALSE; - } - - if (!gst_d3d11_h264_dec_get_bitstream_buffer (self)) { - GST_ERROR_OBJECT (self, "Failed to get bitstream buffer"); - return FALSE; - } - } - - /* remaining_buffer_size: the size of the remaining d3d11 decoder - * bitstream memory still available for writing - * written_buffer_size: the number of bytes already written to this d3d11 decoder - * bitstream memory - * bytes_to_copy: the number of bytes we will write to the d3d11 decoder - * bitstream memory in this loop - */ - - bytes_to_copy = to_write; - - if (bytes_to_copy > self->remaining_buffer_size) { - /* if the size of this slice is larger than the size of remaining d3d11 - * decoder bitstream memory, write the data up to the remaining d3d11 - * decoder bitstream memory size and the rest would be written to the - * next d3d11 bitstream memory */ - bytes_to_copy = self->remaining_buffer_size; - is_last = FALSE; - } - - 
if (bytes_to_copy >= 3 && is_first) { - /* normal case */ - self->bitstream_buffer_data[0] = 0; - self->bitstream_buffer_data[1] = 0; - self->bitstream_buffer_data[2] = 1; - memcpy (self->bitstream_buffer_data + 3, - slice->nalu.data + slice->nalu.offset, bytes_to_copy - 3); - } else { - /* when this NAL unit's data is split across two buffers */ - memcpy (self->bitstream_buffer_data, - slice->nalu.data + slice->nalu.offset, bytes_to_copy); - } - - /* For wBadSliceChopping value 0 or 1, BSNALunitDataLocation means - * the offset of the first start code of this slice in this d3d11 - * memory buffer. - * 1) If this is the first slice of the picture, it should be zero - * since we write the start code at offset 0 (written size before this - * slice also must be zero). - * 2) If this is not the first slice of the picture but this is the first - * d3d11 bitstream buffer (meaning that one bitstream buffer contains - * multiple slices), then this is the written size of the buffer - * excluding this loop. - * And for wBadSliceChopping value 2 or 3, this should be zero by spec */ - if (is_first) - slice_short.BSNALunitDataLocation = self->written_buffer_size; - else - slice_short.BSNALunitDataLocation = 0; - slice_short.SliceBytesInBuffer = bytes_to_copy; - - /* wBadSliceChopping: (dxva h264 spec.) - * 0: All bits for the slice are located within the corresponding - * bitstream data buffer - * 1: The bitstream data buffer contains the start of the slice, - * but not the entire slice, because the buffer is full - * 2: The bitstream data buffer contains the end of the slice. - * It does not contain the start of the slice, because the start of - * the slice was located in the previous bitstream data buffer. - * 3: The bitstream data buffer does not contain the start of the slice - * (because the start of the slice was located in the previous - * bitstream data buffer), and it does not contain the end of the slice - * (because the current bitstream data buffer is also full). 
- */ - if (is_last && is_first) { - slice_short.wBadSliceChopping = 0; - } else if (!is_last && is_first) { - slice_short.wBadSliceChopping = 1; - } else if (is_last && !is_first) { - slice_short.wBadSliceChopping = 2; - } else { - slice_short.wBadSliceChopping = 3; - } - - g_array_append_val (self->slice_list, slice_short); - self->remaining_buffer_size -= bytes_to_copy; - self->written_buffer_size += bytes_to_copy; - self->bitstream_buffer_data += bytes_to_copy; - is_first = FALSE; - to_write -= bytes_to_copy; - } - } - - return TRUE; -} - -typedef struct -{ - guint width; - guint height; -} GstD3D11H264DecResolution; - -void -gst_d3d11_h264_dec_register (GstPlugin * plugin, GstD3D11Device * device, - GstD3D11Decoder * decoder, guint rank, gboolean legacy) -{ - GType type; - gchar *type_name; - gchar *feature_name; - guint index = 0; - guint i; - gboolean ret; - GUID profile; - GTypeInfo type_info = { - sizeof (GstD3D11H264DecClass), - NULL, - NULL, - (GClassInitFunc) gst_d3d11_h264_dec_class_init, - NULL, - NULL, - sizeof (GstD3D11H264Dec), - 0, - (GInstanceInitFunc) gst_d3d11_h264_dec_init, - }; - static const GUID *supported_profiles[] = { - &GST_GUID_D3D11_DECODER_PROFILE_H264_IDCT_FGT, - &GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_NOFGT, - &GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_FGT - }; - /* values were taken from chromium. 
See supported_profile_helper.cc */ - GstD3D11H264DecResolution resolutions_to_check[] = { - {1920, 1088}, {2560, 1440}, {3840, 2160}, {4096, 2160}, - {4096, 2304} - }; - GstCaps *sink_caps = NULL; - GstCaps *src_caps = NULL; - guint max_width = 0; - guint max_height = 0; - guint resolution; - - ret = gst_d3d11_decoder_get_supported_decoder_profile (decoder, - supported_profiles, G_N_ELEMENTS (supported_profiles), &profile); - - if (!ret) { - GST_WARNING_OBJECT (device, "decoder profile unavailable"); - return; - } - - ret = gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_NV12); - if (!ret) { - GST_FIXME_OBJECT (device, "device does not support NV12 format"); - return; - } - - /* we will not check the maximum resolution for legacy devices, - * as it might cause a crash */ - if (legacy) { - max_width = resolutions_to_check[0].width; - max_height = resolutions_to_check[0].height; - } else { - for (i = 0; i < G_N_ELEMENTS (resolutions_to_check); i++) { - if (gst_d3d11_decoder_supports_resolution (decoder, &profile, - DXGI_FORMAT_NV12, resolutions_to_check[i].width, - resolutions_to_check[i].height)) { - max_width = resolutions_to_check[i].width; - max_height = resolutions_to_check[i].height; - - GST_DEBUG_OBJECT (device, - "device supports resolution %dx%d", max_width, max_height); - } else { - break; - } - } - } - - if (max_width == 0 || max_height == 0) { - GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); - return; - } - - sink_caps = gst_caps_from_string ("video/x-h264, " - "stream-format= (string) { avc, avc3, byte-stream }, " - "alignment= (string) au, " - "profile = (string) { high, main, constrained-baseline, baseline }, " - "framerate = " GST_VIDEO_FPS_RANGE); - src_caps = gst_caps_from_string ("video/x-raw(" - GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "), format = (string) NV12, " - "framerate = " GST_VIDEO_FPS_RANGE ";" - "video/x-raw, format = (string) NV12, " - "framerate = " GST_VIDEO_FPS_RANGE); - - /* To cover both landscape and 
portrait, select max value */ - resolution = MAX (max_width, max_height); - gst_caps_set_simple (sink_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - gst_caps_set_simple (src_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - - type_info.class_data = - gst_d3d11_decoder_class_data_new (device, sink_caps, src_caps); - - type_name = g_strdup ("GstD3D11H264Dec"); - feature_name = g_strdup ("d3d11h264dec"); - - while (g_type_from_name (type_name)) { - index++; - g_free (type_name); - g_free (feature_name); - type_name = g_strdup_printf ("GstD3D11H264Device%dDec", index); - feature_name = g_strdup_printf ("d3d11h264device%ddec", index); - } - - type = g_type_register_static (GST_TYPE_H264_DECODER, - type_name, &type_info, 0); - - /* make lower rank than default device */ - if (rank > 0 && index != 0) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11h265dec.c
Deleted
@@ -1,1540 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include <config.h> -#endif - -#include "gstd3d11h265dec.h" -#include "gstd3d11memory.h" -#include "gstd3d11bufferpool.h" - -#include <gst/codecs/gsth265decoder.h> -#include <string.h> - -/* HACK: to expose dxva data structure on UWP */ -#ifdef WINAPI_PARTITION_DESKTOP -#undef WINAPI_PARTITION_DESKTOP -#endif -#define WINAPI_PARTITION_DESKTOP 1 -#include <d3d9.h> -#include <dxva.h> - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_h265_dec_debug); -#define GST_CAT_DEFAULT gst_d3d11_h265_dec_debug - -enum -{ - PROP_0, - PROP_ADAPTER, - PROP_DEVICE_ID, - PROP_VENDOR_ID, -}; - -/* copied from d3d11.h since mingw header doesn't define them */ -DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN, - 0x5b11d51b, 0x2f4c, 0x4452, 0xbc, 0xc3, 0x09, 0xf2, 0xa1, 0x16, 0x0c, 0xc0); -DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN10, - 0x107af0e0, 0xef1a, 0x4d19, 0xab, 0xa8, 0x67, 0xa1, 0x63, 0x07, 0x3d, 0x13); - -typedef struct _GstD3D11H265Dec -{ - GstH265Decoder parent; - - GstVideoCodecState *output_state; - - GstD3D11Device *device; - - guint width, height; - guint coded_width, coded_height; - guint 
bitdepth; - guint chroma_format_idc; - GstVideoFormat out_format; - - /* Array of DXVA_Slice_HEVC_Short */ - GArray *slice_list; - gboolean submit_iq_data; - - GstD3D11Decoder *d3d11_decoder; - - /* Pointing current bitstream buffer */ - gboolean bad_aligned_bitstream_buffer; - guint written_buffer_size; - guint remaining_buffer_size; - guint8 *bitstream_buffer_data; - - gboolean use_d3d11_output; - - DXVA_PicEntry_HEVC ref_pic_list[15]; - INT pic_order_cnt_val_list[15]; - UCHAR ref_pic_set_st_curr_before[8]; - UCHAR ref_pic_set_st_curr_after[8]; - UCHAR ref_pic_set_lt_curr[8]; -} GstD3D11H265Dec; - -typedef struct _GstD3D11H265DecClass -{ - GstH265DecoderClass parent_class; - guint adapter; - guint device_id; - guint vendor_id; -} GstD3D11H265DecClass; - -static GstElementClass *parent_class = NULL; - -#define GST_D3D11_H265_DEC(object) ((GstD3D11H265Dec *) (object)) -#define GST_D3D11_H265_DEC_GET_CLASS(object) \ - (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11H265DecClass)) - -static void gst_d3d11_h265_dec_get_property (GObject * object, - guint prop_id, GValue * value, GParamSpec * pspec); -static void gst_d3d11_h265_dec_dispose (GObject * object); -static void gst_d3d11_h265_dec_set_context (GstElement * element, - GstContext * context); - -static gboolean gst_d3d11_h265_dec_open (GstVideoDecoder * decoder); -static gboolean gst_d3d11_h265_dec_close (GstVideoDecoder * decoder); -static gboolean gst_d3d11_h265_dec_negotiate (GstVideoDecoder * decoder); -static gboolean gst_d3d11_h265_dec_decide_allocation (GstVideoDecoder * - decoder, GstQuery * query); -static gboolean gst_d3d11_h265_dec_src_query (GstVideoDecoder * decoder, - GstQuery * query); - -/* GstH265Decoder */ -static gboolean gst_d3d11_h265_dec_new_sequence (GstH265Decoder * decoder, - const GstH265SPS * sps, gint max_dpb_size); -static gboolean gst_d3d11_h265_dec_new_picture (GstH265Decoder * decoder, - GstH265Picture * picture); -static GstFlowReturn 
gst_d3d11_h265_dec_output_picture (GstH265Decoder * - decoder, GstH265Picture * picture); -static gboolean gst_d3d11_h265_dec_start_picture (GstH265Decoder * decoder, - GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb); -static gboolean gst_d3d11_h265_dec_decode_slice (GstH265Decoder * decoder, - GstH265Picture * picture, GstH265Slice * slice); -static gboolean gst_d3d11_h265_dec_end_picture (GstH265Decoder * decoder, - GstH265Picture * picture); - -static void -gst_d3d11_h265_dec_class_init (GstD3D11H265DecClass * klass, gpointer data) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); - GstH265DecoderClass *h265decoder_class = GST_H265_DECODER_CLASS (klass); - GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; - gchar *long_name; - - gobject_class->get_property = gst_d3d11_h265_dec_get_property; - gobject_class->dispose = gst_d3d11_h265_dec_dispose; - - g_object_class_install_property (gobject_class, PROP_ADAPTER, - g_param_spec_uint ("adapter", "Adapter", - "DXGI Adapter index for creating device", - 0, G_MAXUINT32, cdata->adapter, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_DEVICE_ID, - g_param_spec_uint ("device-id", "Device Id", - "DXGI Device ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_VENDOR_ID, - g_param_spec_uint ("vendor-id", "Vendor Id", - "DXGI Vendor ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - parent_class = g_type_class_peek_parent (klass); - - klass->adapter = cdata->adapter; - klass->device_id = cdata->device_id; - klass->vendor_id = cdata->vendor_id; - - element_class->set_context = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_set_context); - - long_name = g_strdup_printf ("Direct3D11 H.265 %s Decoder", - 
cdata->description); - gst_element_class_set_metadata (element_class, long_name, - "Codec/Decoder/Video/Hardware", - "A Direct3D11 based H.265 video decoder", - "Seungha Yang <seungha.yang@navercorp.com>"); - g_free (long_name); - - gst_element_class_add_pad_template (element_class, - gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, - cdata->sink_caps)); - gst_element_class_add_pad_template (element_class, - gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, - cdata->src_caps)); - gst_d3d11_decoder_class_data_free (cdata); - - decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_open); - decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_close); - decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_negotiate); - decoder_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_decide_allocation); - decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_src_query); - - h265decoder_class->new_sequence = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_new_sequence); - h265decoder_class->new_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_new_picture); - h265decoder_class->output_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_output_picture); - h265decoder_class->start_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_start_picture); - h265decoder_class->decode_slice = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_decode_slice); - h265decoder_class->end_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_end_picture); -} - -static void -gst_d3d11_h265_dec_init (GstD3D11H265Dec * self) -{ - self->slice_list = g_array_new (FALSE, TRUE, sizeof (DXVA_Slice_HEVC_Short)); -} - -static void -gst_d3d11_h265_dec_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11H265DecClass *klass = GST_D3D11_H265_DEC_GET_CLASS (object); - - switch (prop_id) { - case PROP_ADAPTER: - g_value_set_uint (value, klass->adapter); - break; - case PROP_DEVICE_ID: - g_value_set_uint (value, 
klass->device_id); - break; - case PROP_VENDOR_ID: - g_value_set_uint (value, klass->vendor_id); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_d3d11_h265_dec_dispose (GObject * object) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (object); - - if (self->slice_list) { - g_array_unref (self->slice_list); - self->slice_list = NULL; - } - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_d3d11_h265_dec_set_context (GstElement * element, GstContext * context) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (element); - GstD3D11H265DecClass *klass = GST_D3D11_H265_DEC_GET_CLASS (self); - - gst_d3d11_handle_set_context (element, context, klass->adapter, - &self->device); - - GST_ELEMENT_CLASS (parent_class)->set_context (element, context); -} - -static gboolean -gst_d3d11_h265_dec_open (GstVideoDecoder * decoder) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - GstD3D11H265DecClass *klass = GST_D3D11_H265_DEC_GET_CLASS (self); - - if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), klass->adapter, - &self->device)) { - GST_ERROR_OBJECT (self, "Cannot create d3d11device"); - return FALSE; - } - - self->d3d11_decoder = gst_d3d11_decoder_new (self->device); - - if (!self->d3d11_decoder) { - GST_ERROR_OBJECT (self, "Cannot create d3d11 decoder"); - gst_clear_object (&self->device); - return FALSE; - } - - return TRUE; -} - -static gboolean -gst_d3d11_h265_dec_close (GstVideoDecoder * decoder) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - - if (self->output_state) - gst_video_codec_state_unref (self->output_state); - self->output_state = NULL; - - gst_clear_object (&self->d3d11_decoder); - gst_clear_object (&self->device); - - return TRUE; -} - -static gboolean -gst_d3d11_h265_dec_negotiate (GstVideoDecoder * decoder) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - GstH265Decoder *h265dec = GST_H265_DECODER (decoder); - - if 
(!gst_d3d11_decoder_negotiate (decoder, h265dec->input_state, - self->out_format, self->width, self->height, &self->output_state, - &self->use_d3d11_output)) - return FALSE; - - return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); -} - -static gboolean -gst_d3d11_h265_dec_decide_allocation (GstVideoDecoder * decoder, - GstQuery * query) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - - if (!gst_d3d11_decoder_decide_allocation (decoder, query, self->device, - GST_D3D11_CODEC_H265, self->use_d3d11_output)) - return FALSE; - - return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation - (decoder, query); -} - -static gboolean -gst_d3d11_h265_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - - switch (GST_QUERY_TYPE (query)) { - case GST_QUERY_CONTEXT: - if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), - query, self->device)) { - return TRUE; - } - break; - default: - break; - } - - return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); -} - -static gboolean -gst_d3d11_h265_dec_new_sequence (GstH265Decoder * decoder, - const GstH265SPS * sps, gint max_dpb_size) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - gint crop_width, crop_height; - gboolean modified = FALSE; - static const GUID *main_10_guid = - &GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN10; - static const GUID *main_guid = &GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN; - - GST_LOG_OBJECT (self, "new sequence"); - - if (sps->conformance_window_flag) { - crop_width = sps->crop_rect_width; - crop_height = sps->crop_rect_height; - } else { - crop_width = sps->width; - crop_height = sps->height; - } - - if (self->width != crop_width || self->height != crop_height || - self->coded_width != sps->width || self->coded_height != sps->height) { - GST_INFO_OBJECT (self, "resolution changed %dx%d (%dx%d)", - crop_width, crop_height, sps->width, sps->height); - self->width = 
crop_width; - self->height = crop_height; - self->coded_width = sps->width; - self->coded_height = sps->height; - modified = TRUE; - } - - if (self->bitdepth != sps->bit_depth_luma_minus8 + 8) { - GST_INFO_OBJECT (self, "bitdepth changed"); - self->bitdepth = sps->bit_depth_luma_minus8 + 8; - modified = TRUE; - } - - if (self->chroma_format_idc != sps->chroma_format_idc) { - GST_INFO_OBJECT (self, "chroma format changed"); - self->chroma_format_idc = sps->chroma_format_idc; - modified = TRUE; - } - - if (modified || !self->d3d11_decoder->opened) { - const GUID *profile_guid = NULL; - GstVideoInfo info; - - self->out_format = GST_VIDEO_FORMAT_UNKNOWN; - - if (self->bitdepth == 8) { - if (self->chroma_format_idc == 1) { - self->out_format = GST_VIDEO_FORMAT_NV12; - profile_guid = main_guid; - } else { - GST_FIXME_OBJECT (self, "Could not support 8bits non-4:2:0 format"); - } - } else if (self->bitdepth == 10) { - if (self->chroma_format_idc == 1) { - self->out_format = GST_VIDEO_FORMAT_P010_10LE; - profile_guid = main_10_guid; - } else { - GST_FIXME_OBJECT (self, "Could not support 10bits non-4:2:0 format"); - } - } - - if (self->out_format == GST_VIDEO_FORMAT_UNKNOWN) { - GST_ERROR_OBJECT (self, "Could not support bitdepth/chroma format"); - return FALSE; - } - - gst_video_info_set_format (&info, - self->out_format, self->width, self->height); - - gst_d3d11_decoder_reset (self->d3d11_decoder); - if (!gst_d3d11_decoder_open (self->d3d11_decoder, GST_D3D11_CODEC_H265, - &info, self->coded_width, self->coded_height, - /* Additional 4 views margin for zero-copy rendering */ - max_dpb_size + 4, &profile_guid, 1)) { - GST_ERROR_OBJECT (self, "Failed to create decoder"); - return FALSE; - } - - if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { - GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; - } - } - - return TRUE; -} - -static gboolean -gst_d3d11_h265_dec_get_bitstream_buffer (GstD3D11H265Dec * self) -{ - GST_TRACE_OBJECT 
(self, "Getting bitstream buffer");
-  if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder,
-          D3D11_VIDEO_DECODER_BUFFER_BITSTREAM, &self->remaining_buffer_size,
-          (gpointer *) & self->bitstream_buffer_data)) {
-    GST_ERROR_OBJECT (self, "Failed to get bitstream buffer");
-    return FALSE;
-  }
-
-  GST_TRACE_OBJECT (self, "Got bitstream buffer %p with size %d",
-      self->bitstream_buffer_data, self->remaining_buffer_size);
-  self->written_buffer_size = 0;
-  if ((self->remaining_buffer_size & 127) != 0) {
-    GST_WARNING_OBJECT (self,
-        "The size of bitstream buffer is not 128 bytes aligned");
-    self->bad_aligned_bitstream_buffer = TRUE;
-  } else {
-    self->bad_aligned_bitstream_buffer = FALSE;
-  }
-
-  return TRUE;
-}
-
-static GstD3D11DecoderOutputView *
-gst_d3d11_h265_dec_get_output_view_from_picture (GstD3D11H265Dec * self,
-    GstH265Picture * picture)
-{
-  GstBuffer *view_buffer;
-  GstD3D11DecoderOutputView *view;
-
-  view_buffer = (GstBuffer *) gst_h265_picture_get_user_data (picture);
-  if (!view_buffer) {
-    GST_DEBUG_OBJECT (self, "current picture does not have output view buffer");
-    return NULL;
-  }
-
-  view = gst_d3d11_decoder_get_output_view_from_buffer (self->d3d11_decoder,
-      view_buffer);
-  if (!view) {
-    GST_DEBUG_OBJECT (self, "current picture does not have output view handle");
-    return NULL;
-  }
-
-  return view;
-}
-
-static gint
-gst_d3d11_h265_dec_get_ref_index (GstD3D11H265Dec * self, gint view_id)
-{
-  gint i;
-  for (i = 0; i < G_N_ELEMENTS (self->ref_pic_list); i++) {
-    if (self->ref_pic_list[i].Index7Bits == view_id)
-      return i;
-  }
-
-  return 0xff;
-}
-
-static gboolean
-gst_d3d11_h265_dec_start_picture (GstH265Decoder * decoder,
-    GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb)
-{
-  GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder);
-  GstD3D11DecoderOutputView *view;
-  gint i, j;
-  GArray *dpb_array;
-
-  view = gst_d3d11_h265_dec_get_output_view_from_picture (self, picture);
-  if (!view) {
-    GST_ERROR_OBJECT (self, 
"current picture does not have output view handle"); - return FALSE; - } - - GST_TRACE_OBJECT (self, "Begin frame"); - - if (!gst_d3d11_decoder_begin_frame (self->d3d11_decoder, view, 0, NULL)) { - GST_ERROR_OBJECT (self, "Failed to begin frame"); - return FALSE; - } - - for (i = 0; i < 15; i++) { - self->ref_pic_list[i].bPicEntry = 0xff; - self->pic_order_cnt_val_list[i] = 0; - } - - for (i = 0; i < 8; i++) { - self->ref_pic_set_st_curr_before[i] = 0xff; - self->ref_pic_set_st_curr_after[i] = 0xff; - self->ref_pic_set_lt_curr[i] = 0xff; - } - - dpb_array = gst_h265_dpb_get_pictures_all (dpb); - - GST_LOG_OBJECT (self, "DPB size %d", dpb_array->len); - - for (i = 0; i < dpb_array->len && i < G_N_ELEMENTS (self->ref_pic_list); i++) { - GstH265Picture *other = g_array_index (dpb_array, GstH265Picture *, i); - GstD3D11DecoderOutputView *other_view; - gint id = 0xff; - - if (!other->ref) { - GST_LOG_OBJECT (self, "%dth picture in dpb is not reference, skip", i); - continue; - } - - other_view = gst_d3d11_h265_dec_get_output_view_from_picture (self, other); - - if (other_view) - id = other_view->view_id; - - self->ref_pic_list[i].Index7Bits = id; - self->ref_pic_list[i].AssociatedFlag = other->long_term; - self->pic_order_cnt_val_list[i] = other->pic_order_cnt; - } - - for (i = 0, j = 0; i < G_N_ELEMENTS (self->ref_pic_set_st_curr_before); i++) { - GstH265Picture *other = NULL; - gint id = 0xff; - - while (other == NULL && j < decoder->NumPocStCurrBefore) - other = decoder->RefPicSetStCurrBefore[j++]; - - if (other) { - GstD3D11DecoderOutputView *other_view; - - other_view = - gst_d3d11_h265_dec_get_output_view_from_picture (self, other); - - if (other_view) - id = gst_d3d11_h265_dec_get_ref_index (self, other_view->view_id); - } - - self->ref_pic_set_st_curr_before[i] = id; - } - - for (i = 0, j = 0; i < G_N_ELEMENTS (self->ref_pic_set_st_curr_after); i++) { - GstH265Picture *other = NULL; - gint id = 0xff; - - while (other == NULL && j < decoder->NumPocStCurrAfter) - 
other = decoder->RefPicSetStCurrAfter[j++]; - - if (other) { - GstD3D11DecoderOutputView *other_view; - - other_view = - gst_d3d11_h265_dec_get_output_view_from_picture (self, other); - - if (other_view) - id = gst_d3d11_h265_dec_get_ref_index (self, other_view->view_id); - } - - self->ref_pic_set_st_curr_after[i] = id; - } - - for (i = 0, j = 0; i < G_N_ELEMENTS (self->ref_pic_set_lt_curr); i++) { - GstH265Picture *other = NULL; - gint id = 0xff; - - while (other == NULL && j < decoder->NumPocLtCurr) - other = decoder->RefPicSetLtCurr[j++]; - - if (other) { - GstD3D11DecoderOutputView *other_view; - - other_view = - gst_d3d11_h265_dec_get_output_view_from_picture (self, other); - - if (other_view) - id = gst_d3d11_h265_dec_get_ref_index (self, other_view->view_id); - } - - self->ref_pic_set_lt_curr[i] = id; - } - - g_array_unref (dpb_array); - g_array_set_size (self->slice_list, 0); - - return gst_d3d11_h265_dec_get_bitstream_buffer (self); -} - -static gboolean -gst_d3d11_h265_dec_new_picture (GstH265Decoder * decoder, - GstH265Picture * picture) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - GstBuffer *view_buffer; - GstD3D11Memory *mem; - - view_buffer = gst_d3d11_decoder_get_output_view_buffer (self->d3d11_decoder); - if (!view_buffer) { - GST_ERROR_OBJECT (self, "No available output view buffer"); - return FALSE; - } - - mem = (GstD3D11Memory *) gst_buffer_peek_memory (view_buffer, 0); - - GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT " (index %d)", - view_buffer, mem->subresource_index); - - gst_h265_picture_set_user_data (picture, - view_buffer, (GDestroyNotify) gst_buffer_unref); - - GST_LOG_OBJECT (self, "New h265picture %p", picture); - - return TRUE; -} - -static GstFlowReturn -gst_d3d11_h265_dec_output_picture (GstH265Decoder * decoder, - GstH265Picture * picture) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - GstVideoCodecFrame *frame = NULL; - GstBuffer *output_buffer = NULL; - GstFlowReturn ret; - 
GstBuffer *view_buffer; - - GST_LOG_OBJECT (self, - "Outputting picture %p, poc %d", picture, picture->pic_order_cnt); - - view_buffer = (GstBuffer *) gst_h265_picture_get_user_data (picture); - - if (!view_buffer) { - GST_ERROR_OBJECT (self, "Could not get output view"); - return GST_FLOW_ERROR; - } - - frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), - picture->system_frame_number); - - /* if downstream is d3d11 element and forward playback case, - * expose our decoder view without copy. In case of reverse playback, however, - * we cannot do that since baseclass will store the decoded buffer - * up to gop size but our dpb pool cannot be increased */ - if (self->use_d3d11_output && - gst_d3d11_decoder_supports_direct_rendering (self->d3d11_decoder) && - GST_VIDEO_DECODER (self)->input_segment.rate > 0) { - GstMemory *mem; - - output_buffer = gst_buffer_ref (view_buffer); - mem = gst_buffer_peek_memory (output_buffer, 0); - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - } else { - output_buffer = - gst_video_decoder_allocate_output_buffer (GST_VIDEO_DECODER (self)); - } - - if (!output_buffer) { - GST_ERROR_OBJECT (self, "Couldn't allocate output buffer"); - return GST_FLOW_ERROR; - } - - if (!frame) { - GST_WARNING_OBJECT (self, - "Failed to find codec frame for picture %p", picture); - - GST_BUFFER_PTS (output_buffer) = picture->pts; - GST_BUFFER_DTS (output_buffer) = GST_CLOCK_TIME_NONE; - GST_BUFFER_DURATION (output_buffer) = GST_CLOCK_TIME_NONE; - } else { - frame->output_buffer = output_buffer; - GST_BUFFER_PTS (output_buffer) = GST_BUFFER_PTS (frame->input_buffer); - GST_BUFFER_DTS (output_buffer) = GST_CLOCK_TIME_NONE; - GST_BUFFER_DURATION (output_buffer) = - GST_BUFFER_DURATION (frame->input_buffer); - } - - if (!gst_d3d11_decoder_process_output (self->d3d11_decoder, - &self->output_state->info, - GST_VIDEO_INFO_WIDTH (&self->output_state->info), - GST_VIDEO_INFO_HEIGHT (&self->output_state->info), - view_buffer, 
output_buffer)) { - GST_ERROR_OBJECT (self, "Failed to copy buffer"); - if (frame) - gst_video_decoder_drop_frame (GST_VIDEO_DECODER (self), frame); - else - gst_buffer_unref (output_buffer); - - return GST_FLOW_ERROR; - } - - GST_LOG_OBJECT (self, "Finish frame %" GST_TIME_FORMAT, - GST_TIME_ARGS (GST_BUFFER_PTS (output_buffer))); - - if (frame) { - ret = gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); - } else { - ret = gst_pad_push (GST_VIDEO_DECODER_SRC_PAD (self), output_buffer); - } - - return ret; -} - -static gboolean -gst_d3d11_h265_dec_submit_slice_data (GstD3D11H265Dec * self) -{ - guint buffer_size; - gpointer buffer; - guint8 *data; - gsize offset = 0; - gint i; - D3D11_VIDEO_DECODER_BUFFER_DESC buffer_desc[4] = { 0, }; - gboolean ret; - guint buffer_count = 0; - DXVA_Slice_HEVC_Short *slice_data; - - if (self->slice_list->len < 1) { - GST_WARNING_OBJECT (self, "Nothing to submit"); - return FALSE; - } - - slice_data = &g_array_index (self->slice_list, DXVA_Slice_HEVC_Short, - self->slice_list->len - 1); - - /* DXVA2 spec is saying that written bitstream data must be 128 bytes - * aligned if the bitstream buffer contains end of slice - * (i.e., wBadSliceChopping == 0 or 2) */ - if (slice_data->wBadSliceChopping == 0 || slice_data->wBadSliceChopping == 2) { - guint padding = - MIN (GST_ROUND_UP_128 (self->written_buffer_size) - - self->written_buffer_size, self->remaining_buffer_size); - - if (padding) { - GST_TRACE_OBJECT (self, - "Written bitstream buffer size %u is not 128 bytes aligned, " - "add padding %u bytes", self->written_buffer_size, padding); - memset (self->bitstream_buffer_data, 0, padding); - self->written_buffer_size += padding; - slice_data->SliceBytesInBuffer += padding; - } - } - - GST_TRACE_OBJECT (self, "Getting slice control buffer"); - - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL, &buffer_size, &buffer)) { - GST_ERROR_OBJECT (self, "Couldn't get 
slice control buffer"); - return FALSE; - } - - data = buffer; - for (i = 0; i < self->slice_list->len; i++) { - slice_data = &g_array_index (self->slice_list, DXVA_Slice_HEVC_Short, i); - - memcpy (data + offset, slice_data, sizeof (DXVA_Slice_HEVC_Short)); - offset += sizeof (DXVA_Slice_HEVC_Short); - } - - GST_TRACE_OBJECT (self, "Release slice control buffer"); - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL)) { - GST_ERROR_OBJECT (self, "Failed to release slice control buffer"); - return FALSE; - } - - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_BITSTREAM)) { - GST_ERROR_OBJECT (self, "Failed to release bitstream buffer"); - return FALSE; - } - - buffer_desc[buffer_count].BufferType = - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS; - buffer_desc[buffer_count].DataOffset = 0; - buffer_desc[buffer_count].DataSize = sizeof (DXVA_PicParams_HEVC); - buffer_count++; - - if (self->submit_iq_data) { - buffer_desc[buffer_count].BufferType = - D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX; - buffer_desc[buffer_count].DataOffset = 0; - buffer_desc[buffer_count].DataSize = sizeof (DXVA_Qmatrix_HEVC); - buffer_count++; - } - - buffer_desc[buffer_count].BufferType = - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL; - buffer_desc[buffer_count].DataOffset = 0; - buffer_desc[buffer_count].DataSize = - sizeof (DXVA_Slice_HEVC_Short) * self->slice_list->len; - buffer_count++; - - if (!self->bad_aligned_bitstream_buffer - && (self->written_buffer_size & 127) != 0) { - GST_WARNING_OBJECT (self, - "Written bitstream buffer size %u is not 128 bytes aligned", - self->written_buffer_size); - } - - buffer_desc[buffer_count].BufferType = D3D11_VIDEO_DECODER_BUFFER_BITSTREAM; - buffer_desc[buffer_count].DataOffset = 0; - buffer_desc[buffer_count].DataSize = self->written_buffer_size; - buffer_count++; - - ret = gst_d3d11_decoder_submit_decoder_buffers 
(self->d3d11_decoder, - buffer_count, buffer_desc); - - self->written_buffer_size = 0; - self->bitstream_buffer_data = NULL; - self->remaining_buffer_size = 0; - g_array_set_size (self->slice_list, 0); - - return ret; -} - -static gboolean -gst_d3d11_h265_dec_end_picture (GstH265Decoder * decoder, - GstH265Picture * picture) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - - GST_LOG_OBJECT (self, "end picture %p, (poc %d)", - picture, picture->pic_order_cnt); - - if (!gst_d3d11_h265_dec_submit_slice_data (self)) { - GST_ERROR_OBJECT (self, "Failed to submit slice data"); - return FALSE; - } - - if (!gst_d3d11_decoder_end_frame (self->d3d11_decoder)) { - GST_ERROR_OBJECT (self, "Failed to EndFrame"); - return FALSE; - } - - return TRUE; -} - -static void -gst_d3d11_h265_dec_picture_params_from_sps (GstD3D11H265Dec * self, - const GstH265SPS * sps, DXVA_PicParams_HEVC * params) -{ -#define COPY_FIELD(f) \ - (params)->f = (sps)->f -#define COPY_FIELD_WITH_PREFIX(f) \ - (params)->G_PASTE(sps_,f) = (sps)->f - - params->PicWidthInMinCbsY = - sps->width >> (sps->log2_min_luma_coding_block_size_minus3 + 3); - params->PicHeightInMinCbsY = - sps->height >> (sps->log2_min_luma_coding_block_size_minus3 + 3); - params->sps_max_dec_pic_buffering_minus1 = - sps->max_dec_pic_buffering_minus1[sps->max_sub_layers_minus1]; - - COPY_FIELD (chroma_format_idc); - COPY_FIELD (separate_colour_plane_flag); - COPY_FIELD (bit_depth_luma_minus8); - COPY_FIELD (bit_depth_chroma_minus8); - COPY_FIELD (log2_max_pic_order_cnt_lsb_minus4); - COPY_FIELD (log2_min_luma_coding_block_size_minus3); - COPY_FIELD (log2_diff_max_min_luma_coding_block_size); - COPY_FIELD (log2_min_transform_block_size_minus2); - COPY_FIELD (log2_diff_max_min_transform_block_size); - COPY_FIELD (max_transform_hierarchy_depth_inter); - COPY_FIELD (max_transform_hierarchy_depth_intra); - COPY_FIELD (num_short_term_ref_pic_sets); - COPY_FIELD (num_long_term_ref_pics_sps); - COPY_FIELD (scaling_list_enabled_flag); - 
COPY_FIELD (amp_enabled_flag); - COPY_FIELD (sample_adaptive_offset_enabled_flag); - COPY_FIELD (pcm_enabled_flag); - - if (sps->pcm_enabled_flag) { - COPY_FIELD (pcm_sample_bit_depth_luma_minus1); - COPY_FIELD (pcm_sample_bit_depth_chroma_minus1); - COPY_FIELD (log2_min_pcm_luma_coding_block_size_minus3); - COPY_FIELD (log2_diff_max_min_pcm_luma_coding_block_size); - } - - COPY_FIELD (pcm_loop_filter_disabled_flag); - COPY_FIELD (long_term_ref_pics_present_flag); - COPY_FIELD_WITH_PREFIX (temporal_mvp_enabled_flag); - COPY_FIELD (strong_intra_smoothing_enabled_flag); - -#undef COPY_FIELD -#undef COPY_FIELD_WITH_PREFIX -} - -static void -gst_d3d11_h265_dec_picture_params_from_pps (GstD3D11H265Dec * self, - const GstH265PPS * pps, DXVA_PicParams_HEVC * params) -{ - gint i; - -#define COPY_FIELD(f) \ - (params)->f = (pps)->f -#define COPY_FIELD_WITH_PREFIX(f) \ - (params)->G_PASTE(pps_,f) = (pps)->f - - COPY_FIELD (num_ref_idx_l0_default_active_minus1); - COPY_FIELD (num_ref_idx_l1_default_active_minus1); - COPY_FIELD (init_qp_minus26); - COPY_FIELD (dependent_slice_segments_enabled_flag); - COPY_FIELD (output_flag_present_flag); - COPY_FIELD (num_extra_slice_header_bits); - COPY_FIELD (sign_data_hiding_enabled_flag); - COPY_FIELD (cabac_init_present_flag); - COPY_FIELD (constrained_intra_pred_flag); - COPY_FIELD (transform_skip_enabled_flag); - COPY_FIELD (cu_qp_delta_enabled_flag); - COPY_FIELD_WITH_PREFIX (slice_chroma_qp_offsets_present_flag); - COPY_FIELD (weighted_pred_flag); - COPY_FIELD (weighted_bipred_flag); - COPY_FIELD (transquant_bypass_enabled_flag); - COPY_FIELD (tiles_enabled_flag); - COPY_FIELD (entropy_coding_sync_enabled_flag); - COPY_FIELD (uniform_spacing_flag); - - if (pps->tiles_enabled_flag) - COPY_FIELD (loop_filter_across_tiles_enabled_flag); - - COPY_FIELD_WITH_PREFIX (loop_filter_across_slices_enabled_flag); - COPY_FIELD (deblocking_filter_override_enabled_flag); - COPY_FIELD_WITH_PREFIX (deblocking_filter_disabled_flag); - COPY_FIELD 
(lists_modification_present_flag); - COPY_FIELD (slice_segment_header_extension_present_flag); - COPY_FIELD_WITH_PREFIX (cb_qp_offset); - COPY_FIELD_WITH_PREFIX (cr_qp_offset); - - if (pps->tiles_enabled_flag) { - COPY_FIELD (num_tile_columns_minus1); - COPY_FIELD (num_tile_rows_minus1); - if (!pps->uniform_spacing_flag) { - for (i = 0; i < pps->num_tile_columns_minus1 && - i < G_N_ELEMENTS (params->column_width_minus1); i++) - COPY_FIELD (column_width_minus1[i]); - - for (i = 0; i < pps->num_tile_rows_minus1 && - i < G_N_ELEMENTS (params->row_height_minus1); i++) - COPY_FIELD (row_height_minus1[i]); - } - } - - COPY_FIELD (diff_cu_qp_delta_depth); - COPY_FIELD_WITH_PREFIX (beta_offset_div2); - COPY_FIELD_WITH_PREFIX (tc_offset_div2); - COPY_FIELD (log2_parallel_merge_level_minus2); - -#undef COPY_FIELD -#undef COPY_FIELD_WITH_PREFIX -} - -static void -gst_d3d11_h265_dec_picture_params_from_slice_header (GstD3D11H265Dec * - self, const GstH265SliceHdr * slice_header, DXVA_PicParams_HEVC * params) -{ - if (slice_header->short_term_ref_pic_set_sps_flag == 0) { - params->ucNumDeltaPocsOfRefRpsIdx = - slice_header->short_term_ref_pic_sets.NumDeltaPocsOfRefRpsIdx; - params->wNumBitsForShortTermRPSInSlice = - slice_header->short_term_ref_pic_set_size; - } -} - -static gboolean -gst_d3d11_h265_dec_fill_picture_params (GstD3D11H265Dec * self, - const GstH265SliceHdr * slice_header, DXVA_PicParams_HEVC * params) -{ - const GstH265SPS *sps; - const GstH265PPS *pps; - - g_return_val_if_fail (slice_header->pps != NULL, FALSE); - g_return_val_if_fail (slice_header->pps->sps != NULL, FALSE); - - pps = slice_header->pps; - sps = pps->sps; - - memset (params, 0, sizeof (DXVA_PicParams_HEVC)); - - /* not related to hevc syntax */ - params->NoPicReorderingFlag = 0; - params->NoBiPredFlag = 0; - params->ReservedBits1 = 0; - params->StatusReportFeedbackNumber = 1; - - gst_d3d11_h265_dec_picture_params_from_sps (self, sps, params); - gst_d3d11_h265_dec_picture_params_from_pps (self, 
pps, params); - gst_d3d11_h265_dec_picture_params_from_slice_header (self, - slice_header, params); - - return TRUE; -} - -#ifndef GST_DISABLE_GST_DEBUG -static void -gst_d3d11_h265_dec_dump_pic_params (GstD3D11H265Dec * self, - DXVA_PicParams_HEVC * params) -{ - gint i; - - GST_TRACE_OBJECT (self, "Dump current DXVA_PicParams_HEVC"); - -#define DUMP_PIC_PARAMS(p) \ - GST_TRACE_OBJECT (self, "\t" G_STRINGIFY(p) ": %d", (gint)params->p) - - DUMP_PIC_PARAMS (PicWidthInMinCbsY); - DUMP_PIC_PARAMS (PicHeightInMinCbsY); - DUMP_PIC_PARAMS (chroma_format_idc); - DUMP_PIC_PARAMS (separate_colour_plane_flag); - DUMP_PIC_PARAMS (bit_depth_chroma_minus8); - DUMP_PIC_PARAMS (NoPicReorderingFlag); - DUMP_PIC_PARAMS (NoBiPredFlag); - DUMP_PIC_PARAMS (CurrPic.Index7Bits); - DUMP_PIC_PARAMS (sps_max_dec_pic_buffering_minus1); - DUMP_PIC_PARAMS (log2_min_luma_coding_block_size_minus3); - DUMP_PIC_PARAMS (log2_diff_max_min_luma_coding_block_size); - DUMP_PIC_PARAMS (log2_min_transform_block_size_minus2); - DUMP_PIC_PARAMS (log2_diff_max_min_transform_block_size); - DUMP_PIC_PARAMS (max_transform_hierarchy_depth_inter); - DUMP_PIC_PARAMS (max_transform_hierarchy_depth_intra); - DUMP_PIC_PARAMS (num_short_term_ref_pic_sets); - DUMP_PIC_PARAMS (num_long_term_ref_pics_sps); - DUMP_PIC_PARAMS (num_ref_idx_l0_default_active_minus1); - DUMP_PIC_PARAMS (num_ref_idx_l1_default_active_minus1); - DUMP_PIC_PARAMS (init_qp_minus26); - DUMP_PIC_PARAMS (ucNumDeltaPocsOfRefRpsIdx); - DUMP_PIC_PARAMS (wNumBitsForShortTermRPSInSlice); - DUMP_PIC_PARAMS (scaling_list_enabled_flag); - DUMP_PIC_PARAMS (amp_enabled_flag); - DUMP_PIC_PARAMS (sample_adaptive_offset_enabled_flag); - DUMP_PIC_PARAMS (pcm_enabled_flag); - DUMP_PIC_PARAMS (pcm_sample_bit_depth_luma_minus1); - DUMP_PIC_PARAMS (pcm_sample_bit_depth_chroma_minus1); - DUMP_PIC_PARAMS (log2_min_pcm_luma_coding_block_size_minus3); - DUMP_PIC_PARAMS (log2_diff_max_min_pcm_luma_coding_block_size); - DUMP_PIC_PARAMS (pcm_loop_filter_disabled_flag); - 
DUMP_PIC_PARAMS (long_term_ref_pics_present_flag); - DUMP_PIC_PARAMS (sps_temporal_mvp_enabled_flag); - DUMP_PIC_PARAMS (strong_intra_smoothing_enabled_flag); - DUMP_PIC_PARAMS (dependent_slice_segments_enabled_flag); - DUMP_PIC_PARAMS (output_flag_present_flag); - DUMP_PIC_PARAMS (num_extra_slice_header_bits); - DUMP_PIC_PARAMS (sign_data_hiding_enabled_flag); - DUMP_PIC_PARAMS (cabac_init_present_flag); - - DUMP_PIC_PARAMS (constrained_intra_pred_flag); - DUMP_PIC_PARAMS (transform_skip_enabled_flag); - DUMP_PIC_PARAMS (cu_qp_delta_enabled_flag); - DUMP_PIC_PARAMS (pps_slice_chroma_qp_offsets_present_flag); - DUMP_PIC_PARAMS (weighted_pred_flag); - DUMP_PIC_PARAMS (weighted_bipred_flag); - DUMP_PIC_PARAMS (transquant_bypass_enabled_flag); - DUMP_PIC_PARAMS (tiles_enabled_flag); - DUMP_PIC_PARAMS (entropy_coding_sync_enabled_flag); - DUMP_PIC_PARAMS (uniform_spacing_flag); - DUMP_PIC_PARAMS (loop_filter_across_tiles_enabled_flag); - DUMP_PIC_PARAMS (pps_loop_filter_across_slices_enabled_flag); - DUMP_PIC_PARAMS (deblocking_filter_override_enabled_flag); - DUMP_PIC_PARAMS (pps_deblocking_filter_disabled_flag); - DUMP_PIC_PARAMS (lists_modification_present_flag); - DUMP_PIC_PARAMS (IrapPicFlag); - DUMP_PIC_PARAMS (IdrPicFlag); - DUMP_PIC_PARAMS (IntraPicFlag); - DUMP_PIC_PARAMS (pps_cb_qp_offset); - DUMP_PIC_PARAMS (pps_cr_qp_offset); - DUMP_PIC_PARAMS (num_tile_columns_minus1); - DUMP_PIC_PARAMS (num_tile_rows_minus1); - for (i = 0; i < G_N_ELEMENTS (params->column_width_minus1); i++) - GST_TRACE_OBJECT (self, "\tcolumn_width_minus1[%d]: %d", i, - params->column_width_minus1[i]); - for (i = 0; i < G_N_ELEMENTS (params->row_height_minus1); i++) - GST_TRACE_OBJECT (self, "\trow_height_minus1[%d]: %d", i, - params->row_height_minus1[i]); - DUMP_PIC_PARAMS (diff_cu_qp_delta_depth); - DUMP_PIC_PARAMS (pps_beta_offset_div2); - DUMP_PIC_PARAMS (pps_tc_offset_div2); - DUMP_PIC_PARAMS (log2_parallel_merge_level_minus2); - DUMP_PIC_PARAMS (CurrPicOrderCntVal); - - for (i = 
0; i < G_N_ELEMENTS (params->RefPicList); i++) { - GST_TRACE_OBJECT (self, "\tRefPicList[%d].Index7Bits: %d", i, - params->RefPicList[i].Index7Bits); - GST_TRACE_OBJECT (self, "\tRefPicList[%d].AssociatedFlag: %d", i, - params->RefPicList[i].AssociatedFlag); - GST_TRACE_OBJECT (self, "\tPicOrderCntValList[%d]: %d", i, - params->PicOrderCntValList[i]); - } - - for (i = 0; i < G_N_ELEMENTS (params->RefPicSetStCurrBefore); i++) { - GST_TRACE_OBJECT (self, "\tRefPicSetStCurrBefore[%d]: %d", i, - params->RefPicSetStCurrBefore[i]); - GST_TRACE_OBJECT (self, "\tRefPicSetStCurrAfter[%d]: %d", i, - params->RefPicSetStCurrAfter[i]); - GST_TRACE_OBJECT (self, "\tRefPicSetLtCurr[%d]: %d", i, - params->RefPicSetLtCurr[i]); - } - -#undef DUMP_PIC_PARAMS -} -#endif - -static gboolean -gst_d3d11_h265_dec_decode_slice (GstH265Decoder * decoder, - GstH265Picture * picture, GstH265Slice * slice) -{ - GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); - GstH265SPS *sps; - GstH265PPS *pps; - DXVA_PicParams_HEVC pic_params = { 0, }; - DXVA_Qmatrix_HEVC iq_matrix = { 0, }; - guint d3d11_buffer_size = 0; - gpointer d3d11_buffer = NULL; - gint i; - GstD3D11DecoderOutputView *view; - GstH265ScalingList *scaling_list = NULL; - - pps = slice->header.pps; - sps = pps->sps; - - view = gst_d3d11_h265_dec_get_output_view_from_picture (self, picture); - - if (!view) { - GST_ERROR_OBJECT (self, "current picture does not have output view"); - return FALSE; - } - - gst_d3d11_h265_dec_fill_picture_params (self, &slice->header, &pic_params); - - pic_params.CurrPic.Index7Bits = view->view_id; - pic_params.IrapPicFlag = GST_H265_IS_NAL_TYPE_IRAP (slice->nalu.type); - pic_params.IdrPicFlag = GST_H265_IS_NAL_TYPE_IDR (slice->nalu.type); - pic_params.IntraPicFlag = GST_H265_IS_NAL_TYPE_IRAP (slice->nalu.type); - pic_params.CurrPicOrderCntVal = picture->pic_order_cnt; - - memcpy (pic_params.RefPicList, self->ref_pic_list, - sizeof (pic_params.RefPicList)); - memcpy (pic_params.PicOrderCntValList, 
self->pic_order_cnt_val_list, - sizeof (pic_params.PicOrderCntValList)); - memcpy (pic_params.RefPicSetStCurrBefore, self->ref_pic_set_st_curr_before, - sizeof (pic_params.RefPicSetStCurrBefore)); - memcpy (pic_params.RefPicSetStCurrAfter, self->ref_pic_set_st_curr_after, - sizeof (pic_params.RefPicSetStCurrAfter)); - memcpy (pic_params.RefPicSetLtCurr, self->ref_pic_set_lt_curr, - sizeof (pic_params.RefPicSetLtCurr)); - -#ifndef GST_DISABLE_GST_DEBUG - gst_d3d11_h265_dec_dump_pic_params (self, &pic_params); -#endif - - GST_TRACE_OBJECT (self, "Getting picture param decoder buffer"); - - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS, &d3d11_buffer_size, - &d3d11_buffer)) { - GST_ERROR_OBJECT (self, - "Failed to get decoder buffer for picture parameters"); - return FALSE; - } - - memcpy (d3d11_buffer, &pic_params, sizeof (pic_params)); - - GST_TRACE_OBJECT (self, "Release picture param decoder buffer"); - - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS)) { - GST_ERROR_OBJECT (self, "Failed to release decoder buffer"); - return FALSE; - } - - if (pps->scaling_list_data_present_flag || - (sps->scaling_list_enabled_flag - && !sps->scaling_list_data_present_flag)) { - scaling_list = &pps->scaling_list; - } else if (sps->scaling_list_enabled_flag && - sps->scaling_list_data_present_flag) { - scaling_list = &sps->scaling_list; - } - - if (scaling_list) { - self->submit_iq_data = TRUE; - - memcpy (iq_matrix.ucScalingLists0, scaling_list->scaling_lists_4x4, - sizeof (iq_matrix.ucScalingLists0)); - memcpy (iq_matrix.ucScalingLists1, scaling_list->scaling_lists_8x8, - sizeof (iq_matrix.ucScalingLists1)); - memcpy (iq_matrix.ucScalingLists2, scaling_list->scaling_lists_16x16, - sizeof (iq_matrix.ucScalingLists2)); - memcpy (iq_matrix.ucScalingLists3, scaling_list->scaling_lists_32x32, - sizeof (iq_matrix.ucScalingLists3)); - - for (i = 0; i < 
6; i++)
-      iq_matrix.ucScalingListDCCoefSizeID2[i] =
-          scaling_list->scaling_list_dc_coef_minus8_16x16[i] + 8;
-
-    for (i = 0; i < 2; i++)
-      iq_matrix.ucScalingListDCCoefSizeID3[i] =
-          scaling_list->scaling_list_dc_coef_minus8_32x32[i] + 8;
-
-    GST_TRACE_OBJECT (self, "Getting inverse quantization matrix buffer");
-
-    if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder,
-            D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX,
-            &d3d11_buffer_size, &d3d11_buffer)) {
-      GST_ERROR_OBJECT (self,
-          "Failed to get decoder buffer for inv. quantization matrix");
-      return FALSE;
-    }
-
-    memcpy (d3d11_buffer, &iq_matrix, sizeof (iq_matrix));
-
-    GST_TRACE_OBJECT (self, "Release inverse quantization matrix buffer");
-
-    if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder,
-            D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX)) {
-      GST_ERROR_OBJECT (self, "Failed to release decoder buffer");
-      return FALSE;
-    }
-  } else {
-    self->submit_iq_data = FALSE;
-  }
-
-  {
-    guint to_write = slice->nalu.size + 3;
-    gboolean is_first = TRUE;
-
-    while (to_write > 0) {
-      guint bytes_to_copy;
-      gboolean is_last = TRUE;
-      DXVA_Slice_HEVC_Short slice_short = { 0, };
-
-      if (self->remaining_buffer_size < to_write && self->slice_list->len > 0) {
-        if (!gst_d3d11_h265_dec_submit_slice_data (self)) {
-          GST_ERROR_OBJECT (self, "Failed to submit bitstream buffers");
-          return FALSE;
-        }
-
-        if (!gst_d3d11_h265_dec_get_bitstream_buffer (self)) {
-          GST_ERROR_OBJECT (self, "Failed to get bitstream buffer");
-          return FALSE;
-        }
-      }
-
-      /* remaining_buffer_size: the amount of d3d11 decoder bitstream memory
-       *     still available for writing
-       * written_buffer_size: the number of bytes already written to this
-       *     d3d11 decoder bitstream memory
-       * bytes_to_copy: the number of bytes we will write to d3d11 decoder
-       *     bitstream memory in this loop iteration
-       */
-
-      bytes_to_copy = to_write;
-
-      if (bytes_to_copy > self->remaining_buffer_size) {
-        /* if the size of this slice is larger than the
size of remaining d3d11
-         * decoder bitstream memory, write the data up to the remaining d3d11
-         * decoder bitstream memory size; the rest will be written to the
-         * next d3d11 bitstream memory */
-        bytes_to_copy = self->remaining_buffer_size;
-        is_last = FALSE;
-      }
-
-      if (bytes_to_copy >= 3 && is_first) {
-        /* normal case */
-        self->bitstream_buffer_data[0] = 0;
-        self->bitstream_buffer_data[1] = 0;
-        self->bitstream_buffer_data[2] = 1;
-        memcpy (self->bitstream_buffer_data + 3,
-            slice->nalu.data + slice->nalu.offset, bytes_to_copy - 3);
-      } else {
-        /* when this NAL unit data is split across two buffers */
-        memcpy (self->bitstream_buffer_data,
-            slice->nalu.data + slice->nalu.offset, bytes_to_copy);
-      }
-
-      /* For wBadSliceChopping value 0 or 1, BSNALunitDataLocation means
-       * the offset of the first start code of this slice in this d3d11
-       * memory buffer.
-       * 1) If this is the first slice of the picture, it should be zero
-       * since we write the start code at offset 0 (the written size before
-       * this slice must also be zero).
-       * 2) If this is not the first slice of the picture but this is the
-       * first d3d11 bitstream buffer (meaning that one bitstream buffer
-       * contains multiple slices), then this is the size written to the
-       * buffer before this loop iteration.
-       * And for wBadSliceChopping value 2 or 3, this should be zero by spec */
-      if (is_first)
-        slice_short.BSNALunitDataLocation = self->written_buffer_size;
-      else
-        slice_short.BSNALunitDataLocation = 0;
-      slice_short.SliceBytesInBuffer = bytes_to_copy;
-
-      /* wBadSliceChopping: (dxva h265 spec.)
-       * 0: All bits for the slice are located within the corresponding
-       * bitstream data buffer
-       * 1: The bitstream data buffer contains the start of the slice,
-       * but not the entire slice, because the buffer is full
-       * 2: The bitstream data buffer contains the end of the slice.
-       * It does not contain the start of the slice, because the start of
-       * the slice was located in the previous bitstream data buffer.
- * 3: The bitstream data buffer does not contain the start of the slice - * (because the start of the slice was located in the previous - * bitstream data buffer), and it does not contain the end of the slice - * (because the current bitstream data buffer is also full). - */ - if (is_last && is_first) { - slice_short.wBadSliceChopping = 0; - } else if (!is_last && is_first) { - slice_short.wBadSliceChopping = 1; - } else if (is_last && !is_first) { - slice_short.wBadSliceChopping = 2; - } else { - slice_short.wBadSliceChopping = 3; - } - - g_array_append_val (self->slice_list, slice_short); - self->remaining_buffer_size -= bytes_to_copy; - self->written_buffer_size += bytes_to_copy; - self->bitstream_buffer_data += bytes_to_copy; - is_first = FALSE; - to_write -= bytes_to_copy; - } - } - - return TRUE; -} - -typedef struct -{ - guint width; - guint height; -} GstD3D11H265DecResolution; - -void -gst_d3d11_h265_dec_register (GstPlugin * plugin, GstD3D11Device * device, - GstD3D11Decoder * decoder, guint rank) -{ - GType type; - gchar *type_name; - gchar *feature_name; - guint index = 0; - guint i; - GUID profile; - GTypeInfo type_info = { - sizeof (GstD3D11H265DecClass), - NULL, - NULL, - (GClassInitFunc) gst_d3d11_h265_dec_class_init, - NULL, - NULL, - sizeof (GstD3D11H265Dec), - 0, - (GInstanceInitFunc) gst_d3d11_h265_dec_init, - }; - static const GUID *main_10_guid = - &GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN10; - static const GUID *main_guid = &GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN; - /* values were taken from chromium. - * Note that since chromium does not support hevc decoding, this list is - * the combination of lists for avc and vp9. 
- * See supported_profile_helper.cc */
-  GstD3D11H265DecResolution resolutions_to_check[] = {
-    {1920, 1088}, {2560, 1440}, {3840, 2160}, {4096, 2160},
-    {4096, 2304}, {7680, 4320}, {8192, 4320}, {8192, 8192}
-  };
-  GstCaps *sink_caps = NULL;
-  GstCaps *src_caps = NULL;
-  guint max_width = 0;
-  guint max_height = 0;
-  guint resolution;
-  gboolean have_main10 = FALSE;
-  gboolean have_main = FALSE;
-  DXGI_FORMAT format = DXGI_FORMAT_UNKNOWN;
-
-  have_main10 = gst_d3d11_decoder_get_supported_decoder_profile (decoder,
-      &main_10_guid, 1, &profile);
-  if (!have_main10) {
-    GST_DEBUG_OBJECT (device, "decoder does not support HEVC_VLD_MAIN10");
-  } else {
-    have_main10 &=
-        gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_P010);
-    have_main10 &=
-        gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_NV12);
-    if (!have_main10) {
-      GST_FIXME_OBJECT (device,
-          "device does not support P010 and/or NV12 format");
-    }
-  }
-
-  have_main = gst_d3d11_decoder_get_supported_decoder_profile (decoder,
-      &main_guid, 1, &profile);
-  if (!have_main) {
-    GST_DEBUG_OBJECT (device, "decoder does not support HEVC_VLD_MAIN");
-  } else {
-    have_main =
-        gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_NV12);
-    if (!have_main) {
-      GST_FIXME_OBJECT (device, "device does not support NV12 format");
-    }
-  }
-
-  if (!have_main10 && !have_main) {
-    GST_INFO_OBJECT (device, "device does not support h.265 decoding");
-    return;
-  }
-
-  if (have_main) {
-    profile = *main_guid;
-    format = DXGI_FORMAT_NV12;
-  } else {
-    profile = *main_10_guid;
-    format = DXGI_FORMAT_P010;
-  }
-
-  for (i = 0; i < G_N_ELEMENTS (resolutions_to_check); i++) {
-    if (gst_d3d11_decoder_supports_resolution (decoder, &profile,
-            format, resolutions_to_check[i].width,
-            resolutions_to_check[i].height)) {
-      max_width = resolutions_to_check[i].width;
-      max_height = resolutions_to_check[i].height;
-
-      GST_DEBUG_OBJECT (device,
-          "device supports resolution %dx%d", max_width, max_height);
-
} else { - break; - } - } - - if (max_width == 0 || max_height == 0) { - GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); - return; - } - - sink_caps = gst_caps_from_string ("video/x-h265, " - "stream-format=(string) { hev1, hvc1, byte-stream }, " - "alignment= (string) au, " "framerate = " GST_VIDEO_FPS_RANGE); - src_caps = gst_caps_from_string ("video/x-raw(" - GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "), " - "framerate = " GST_VIDEO_FPS_RANGE ";" - "video/x-raw, " "framerate = " GST_VIDEO_FPS_RANGE); - - if (have_main10) { - /* main10 profile covers main and main10 */ - GValue profile_list = G_VALUE_INIT; - GValue profile_value = G_VALUE_INIT; - GValue format_list = G_VALUE_INIT; - GValue format_value = G_VALUE_INIT; - - g_value_init (&profile_list, GST_TYPE_LIST); - - g_value_init (&profile_value, G_TYPE_STRING); - g_value_set_string (&profile_value, "main"); - gst_value_list_append_and_take_value (&profile_list, &profile_value); - - g_value_init (&profile_value, G_TYPE_STRING); - g_value_set_string (&profile_value, "main-10"); - gst_value_list_append_and_take_value (&profile_list, &profile_value); - - - g_value_init (&format_list, GST_TYPE_LIST); - - g_value_init (&format_value, G_TYPE_STRING); - g_value_set_string (&format_value, "NV12"); - gst_value_list_append_and_take_value (&format_list, &format_value); - - g_value_init (&format_value, G_TYPE_STRING); - g_value_set_string (&format_value, "P010_10LE"); - gst_value_list_append_and_take_value (&format_list, &format_value); - - gst_caps_set_value (sink_caps, "profile", &profile_list); - gst_caps_set_value (src_caps, "format", &format_list); - g_value_unset (&profile_list); - g_value_unset (&format_list); - } else { - gst_caps_set_simple (sink_caps, "profile", G_TYPE_STRING, "main", NULL); - gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); - } - - /* To cover both landscape and portrait, select max value */ - resolution = MAX (max_width, max_height); - gst_caps_set_simple 
(sink_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - gst_caps_set_simple (src_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - - type_info.class_data = - gst_d3d11_decoder_class_data_new (device, sink_caps, src_caps); - - type_name = g_strdup ("GstD3D11H265Dec"); - feature_name = g_strdup ("d3d11h265dec"); - - while (g_type_from_name (type_name)) { - index++; - g_free (type_name); - g_free (feature_name); - type_name = g_strdup_printf ("GstD3D11H265Device%dDec", index); - feature_name = g_strdup_printf ("d3d11h265device%ddec", index); - } - - type = g_type_register_static (GST_TYPE_H265_DECODER, - type_name, &type_info, 0); - - /* make lower rank than default device */ - if (rank > 0 && index != 0) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11memory.c
Deleted
@@ -1,876 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include <string.h> -#include "gstd3d11memory.h" -#include "gstd3d11device.h" -#include "gstd3d11utils.h" - -GST_DEBUG_CATEGORY_STATIC (gst_d3d11_allocator_debug); -#define GST_CAT_DEFAULT gst_d3d11_allocator_debug - -GstD3D11AllocationParams * -gst_d3d11_allocation_params_new (GstD3D11Device * device, GstVideoInfo * info, - GstD3D11AllocationFlags flags, gint bind_flags) -{ - GstD3D11AllocationParams *ret; - const GstD3D11Format *d3d11_format; - gint i; - - g_return_val_if_fail (info != NULL, NULL); - - d3d11_format = gst_d3d11_device_format_from_gst (device, - GST_VIDEO_INFO_FORMAT (info)); - if (!d3d11_format) { - GST_WARNING ("Couldn't get d3d11 format"); - return NULL; - } - - ret = g_new0 (GstD3D11AllocationParams, 1); - - ret->info = *info; - ret->aligned_info = *info; - ret->d3d11_format = d3d11_format; - - /* Usage Flag - * https://docs.microsoft.com/en-us/windows/win32/api/d3d11/ne-d3d11-d3d11_usage - * - * +----------------------------------------------------------+ - * | Resource Usage | Default | Dynamic | Immutable | Staging | - * 
+----------------+---------+---------+-----------+---------+ - * | GPU-Read | Yes | Yes | Yes | Yes | - * | GPU-Write | Yes | | | Yes | - * | CPU-Read | | | | Yes | - * | CPU-Write | | Yes | | Yes | - * +----------------------------------------------------------+ - */ - - /* If corresponding dxgi format is undefined, use resource format instead */ - if (d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { - g_assert (d3d11_format->resource_format[i] != DXGI_FORMAT_UNKNOWN); - - ret->desc[i].Width = GST_VIDEO_INFO_COMP_WIDTH (info, i); - ret->desc[i].Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); - ret->desc[i].MipLevels = 1; - ret->desc[i].ArraySize = 1; - ret->desc[i].Format = d3d11_format->resource_format[i]; - ret->desc[i].SampleDesc.Count = 1; - ret->desc[i].SampleDesc.Quality = 0; - ret->desc[i].Usage = D3D11_USAGE_DEFAULT; - ret->desc[i].BindFlags = bind_flags; - } - } else { - ret->desc[0].Width = GST_VIDEO_INFO_WIDTH (info); - ret->desc[0].Height = GST_VIDEO_INFO_HEIGHT (info); - ret->desc[0].MipLevels = 1; - ret->desc[0].ArraySize = 1; - ret->desc[0].Format = d3d11_format->dxgi_format; - ret->desc[0].SampleDesc.Count = 1; - ret->desc[0].SampleDesc.Quality = 0; - ret->desc[0].Usage = D3D11_USAGE_DEFAULT; - ret->desc[0].BindFlags = bind_flags; - } - - ret->flags = flags; - - return ret; -} - -gboolean -gst_d3d11_allocation_params_alignment (GstD3D11AllocationParams * params, - GstVideoAlignment * align) -{ - gint i; - guint padding_width, padding_height; - GstVideoInfo *info; - GstVideoInfo new_info; - - g_return_val_if_fail (params != NULL, FALSE); - g_return_val_if_fail (align != NULL, FALSE); - - /* d3d11 does not support stride align. 
Consider padding only */
-  padding_width = align->padding_left + align->padding_right;
-  padding_height = align->padding_top + align->padding_bottom;
-
-  info = &params->info;
-
-  if (!gst_video_info_set_format (&new_info, GST_VIDEO_INFO_FORMAT (info),
-          GST_VIDEO_INFO_WIDTH (info) + padding_width,
-          GST_VIDEO_INFO_HEIGHT (info) + padding_height)) {
-    GST_WARNING ("Failed to set format");
-    return FALSE;
-  }
-
-  params->aligned_info = new_info;
-
-  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) {
-    params->desc[i].Width = GST_VIDEO_INFO_COMP_WIDTH (&new_info, i);
-    params->desc[i].Height = GST_VIDEO_INFO_COMP_HEIGHT (&new_info, i);
-  }
-
-  return TRUE;
-}
-
-GstD3D11AllocationParams *
-gst_d3d11_allocation_params_copy (GstD3D11AllocationParams * src)
-{
-  GstD3D11AllocationParams *dst;
-
-  g_return_val_if_fail (src != NULL, NULL);
-
-  dst = g_new0 (GstD3D11AllocationParams, 1);
-  memcpy (dst, src, sizeof (GstD3D11AllocationParams));
-
-  return dst;
-}
-
-void
-gst_d3d11_allocation_params_free (GstD3D11AllocationParams * params)
-{
-  g_free (params);
-}
-
-static gint
-gst_d3d11_allocation_params_compare (const GstD3D11AllocationParams * p1,
-    const GstD3D11AllocationParams * p2)
-{
-  g_return_val_if_fail (p1 != NULL, -1);
-  g_return_val_if_fail (p2 != NULL, -1);
-
-  if (p1 == p2)
-    return 0;
-
-  return -1;
-}
-
-static void
-_init_alloc_params (GType type)
-{
-  static GstValueTable table = {
-    0, (GstValueCompareFunc) gst_d3d11_allocation_params_compare,
-    NULL, NULL
-  };
-
-  table.type = type;
-  gst_value_register (&table);
-}
-
-G_DEFINE_BOXED_TYPE_WITH_CODE (GstD3D11AllocationParams,
-    gst_d3d11_allocation_params,
-    (GBoxedCopyFunc) gst_d3d11_allocation_params_copy,
-    (GBoxedFreeFunc) gst_d3d11_allocation_params_free,
-    _init_alloc_params (g_define_type_id));
-
-struct _GstD3D11AllocatorPrivate
-{
-  /* parent texture when array-typed memory is used */
-  ID3D11Texture2D *texture;
-  guint8 array_in_use[D3D11_REQ_TEXTURE2D_ARRAY_AXIS_DIMENSION];
-
-  GMutex lock;
-
GCond cond; - - gboolean flushing; -}; - -#define gst_d3d11_allocator_parent_class parent_class -G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11Allocator, - gst_d3d11_allocator, GST_TYPE_ALLOCATOR); - -static inline D3D11_MAP -gst_map_flags_to_d3d11 (GstMapFlags flags) -{ - if ((flags & GST_MAP_READWRITE) == GST_MAP_READWRITE) - return D3D11_MAP_READ_WRITE; - else if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE) - return D3D11_MAP_WRITE; - else if ((flags & GST_MAP_READ) == GST_MAP_READ) - return D3D11_MAP_READ; - else - g_assert_not_reached (); - - return D3D11_MAP_READ; -} - -static ID3D11Texture2D * -create_staging_texture (GstD3D11Device * device, - const D3D11_TEXTURE2D_DESC * ref) -{ - D3D11_TEXTURE2D_DESC desc = { 0, }; - - desc.Width = ref->Width; - desc.Height = ref->Height; - desc.MipLevels = 1; - desc.Format = ref->Format; - desc.SampleDesc.Count = 1; - desc.ArraySize = 1; - desc.Usage = D3D11_USAGE_STAGING; - desc.CPUAccessFlags = (D3D11_CPU_ACCESS_READ | D3D11_CPU_ACCESS_WRITE); - - return gst_d3d11_device_create_texture (device, &desc, NULL); -} - -static gboolean -map_cpu_access_data (GstD3D11Memory * dmem, D3D11_MAP map_type) -{ - HRESULT hr; - gboolean ret = TRUE; - ID3D11Resource *texture = (ID3D11Resource *) dmem->texture; - ID3D11Resource *staging = (ID3D11Resource *) dmem->staging; - ID3D11DeviceContext *device_context = - gst_d3d11_device_get_device_context_handle (dmem->device); - - gst_d3d11_device_lock (dmem->device); - if (GST_MEMORY_FLAG_IS_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD)) { - ID3D11DeviceContext_CopySubresourceRegion (device_context, - staging, 0, 0, 0, 0, texture, dmem->subresource_index, NULL); - } - - hr = ID3D11DeviceContext_Map (device_context, - staging, 0, map_type, 0, &dmem->map); - - if (!gst_d3d11_result (hr, dmem->device)) { - GST_ERROR_OBJECT (GST_MEMORY_CAST (dmem)->allocator, - "Failed to map staging texture (0x%x)", (guint) hr); - ret = FALSE; - } - - gst_d3d11_device_unlock (dmem->device); - - return ret; -} - 
-static gpointer -gst_d3d11_memory_map (GstMemory * mem, gsize maxsize, GstMapFlags flags) -{ - GstD3D11Memory *dmem = (GstD3D11Memory *) mem; - - g_mutex_lock (&dmem->lock); - if ((flags & GST_MAP_D3D11) == GST_MAP_D3D11) { - if (dmem->staging && - GST_MEMORY_FLAG_IS_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD)) { - ID3D11DeviceContext *device_context = - gst_d3d11_device_get_device_context_handle (dmem->device); - - gst_d3d11_device_lock (dmem->device); - ID3D11DeviceContext_CopySubresourceRegion (device_context, - (ID3D11Resource *) dmem->texture, dmem->subresource_index, 0, 0, 0, - (ID3D11Resource *) dmem->staging, 0, NULL); - gst_d3d11_device_unlock (dmem->device); - } - - GST_MEMORY_FLAG_UNSET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD); - - if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE) - GST_MINI_OBJECT_FLAG_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - - g_assert (dmem->texture != NULL); - g_mutex_unlock (&dmem->lock); - - return dmem->texture; - } - - if (dmem->cpu_map_count == 0) { - D3D11_MAP map_type; - - /* Allocate staging texture for CPU access */ - if (!dmem->staging) { - dmem->staging = create_staging_texture (dmem->device, &dmem->desc); - if (!dmem->staging) { - GST_ERROR_OBJECT (mem->allocator, "Couldn't create staging texture"); - g_mutex_unlock (&dmem->lock); - - return NULL; - } - - /* first memory, always need download to staging */ - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - } - - map_type = gst_map_flags_to_d3d11 (flags); - - if (!map_cpu_access_data (dmem, map_type)) { - GST_ERROR_OBJECT (mem->allocator, "Couldn't map staging texture"); - g_mutex_unlock (&dmem->lock); - - return NULL; - } - } - - if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE) - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD); - - GST_MEMORY_FLAG_UNSET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - - dmem->cpu_map_count++; - g_mutex_unlock (&dmem->lock); - - return dmem->map.pData; -} - -static void 
-unmap_cpu_access_data (GstD3D11Memory * dmem) -{ - ID3D11Resource *staging = (ID3D11Resource *) dmem->staging; - ID3D11DeviceContext *device_context = - gst_d3d11_device_get_device_context_handle (dmem->device); - - gst_d3d11_device_lock (dmem->device); - ID3D11DeviceContext_Unmap (device_context, staging, 0); - gst_d3d11_device_unlock (dmem->device); -} - -static void -gst_d3d11_memory_unmap_full (GstMemory * mem, GstMapInfo * info) -{ - GstD3D11Memory *dmem = (GstD3D11Memory *) mem; - - g_mutex_lock (&dmem->lock); - if ((info->flags & GST_MAP_D3D11) == GST_MAP_D3D11) { - if ((info->flags & GST_MAP_WRITE) == GST_MAP_WRITE) - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - - g_mutex_unlock (&dmem->lock); - return; - } - - if ((info->flags & GST_MAP_WRITE)) - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD); - - dmem->cpu_map_count--; - if (dmem->cpu_map_count > 0) { - g_mutex_unlock (&dmem->lock); - return; - } - - unmap_cpu_access_data (dmem); - - g_mutex_unlock (&dmem->lock); -} - -static GstMemory * -gst_d3d11_memory_share (GstMemory * mem, gssize offset, gssize size) -{ - /* TODO: impl. 
*/ - return NULL; -} - -static GstMemory * -gst_d3d11_allocator_dummy_alloc (GstAllocator * allocator, gsize size, - GstAllocationParams * params) -{ - g_return_val_if_reached (NULL); -} - -static void -gst_d3d11_allocator_free (GstAllocator * allocator, GstMemory * mem) -{ - GstD3D11Allocator *self = GST_D3D11_ALLOCATOR (allocator); - GstD3D11AllocatorPrivate *priv = self->priv; - GstD3D11Memory *dmem = (GstD3D11Memory *) mem; - gint i; - - g_mutex_lock (&priv->lock); - priv->array_in_use[dmem->subresource_index] = 0; - g_cond_broadcast (&priv->cond); - g_mutex_unlock (&priv->lock); - - for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { - if (dmem->render_target_view[i]) - ID3D11RenderTargetView_Release (dmem->render_target_view[i]); - dmem->render_target_view[i] = NULL; - - if (dmem->shader_resource_view[i]) - ID3D11ShaderResourceView_Release (dmem->shader_resource_view[i]); - dmem->shader_resource_view[i] = NULL; - } - - if (dmem->texture) - ID3D11Texture2D_Release (dmem->texture); - - if (dmem->staging) - ID3D11Texture2D_Release (dmem->staging); - - gst_clear_object (&dmem->device); - g_mutex_clear (&dmem->lock); - - g_free (dmem); -} - -static void -gst_d3d11_allocator_dispose (GObject * object) -{ - GstD3D11Allocator *alloc = GST_D3D11_ALLOCATOR (object); - GstD3D11AllocatorPrivate *priv = alloc->priv; - - if (alloc->device && priv->texture) { - gst_d3d11_device_release_texture (alloc->device, priv->texture); - priv->texture = NULL; - } - - gst_clear_object (&alloc->device); - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_d3d11_allocator_finalize (GObject * object) -{ - GstD3D11Allocator *alloc = GST_D3D11_ALLOCATOR (object); - GstD3D11AllocatorPrivate *priv = alloc->priv; - - g_mutex_clear (&priv->lock); - g_cond_clear (&priv->cond); - - G_OBJECT_CLASS (parent_class)->finalize (object); -} - -static void -gst_d3d11_allocator_class_init (GstD3D11AllocatorClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - 
GstAllocatorClass *allocator_class = GST_ALLOCATOR_CLASS (klass); - - gobject_class->dispose = gst_d3d11_allocator_dispose; - gobject_class->finalize = gst_d3d11_allocator_finalize; - - allocator_class->alloc = gst_d3d11_allocator_dummy_alloc; - allocator_class->free = gst_d3d11_allocator_free; - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_allocator_debug, "d3d11allocator", 0, - "d3d11allocator object"); -} - -static void -gst_d3d11_allocator_init (GstD3D11Allocator * allocator) -{ - GstAllocator *alloc = GST_ALLOCATOR_CAST (allocator); - GstD3D11AllocatorPrivate *priv; - - alloc->mem_type = GST_D3D11_MEMORY_NAME; - alloc->mem_map = gst_d3d11_memory_map; - alloc->mem_unmap_full = gst_d3d11_memory_unmap_full; - alloc->mem_share = gst_d3d11_memory_share; - /* fallback copy */ - - GST_OBJECT_FLAG_SET (alloc, GST_ALLOCATOR_FLAG_CUSTOM_ALLOC); - - priv = gst_d3d11_allocator_get_instance_private (allocator); - g_mutex_init (&priv->lock); - g_cond_init (&priv->cond); - - allocator->priv = priv; -} - -GstD3D11Allocator * -gst_d3d11_allocator_new (GstD3D11Device * device) -{ - GstD3D11Allocator *allocator; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - - allocator = g_object_new (GST_TYPE_D3D11_ALLOCATOR, NULL); - allocator->device = gst_object_ref (device); - - return allocator; -} - -static gboolean -calculate_mem_size (GstD3D11Device * device, ID3D11Texture2D * texture, - D3D11_TEXTURE2D_DESC * desc, D3D11_MAP map_type, - gint stride[GST_VIDEO_MAX_PLANES], gsize * size) -{ - HRESULT hr; - gboolean ret = TRUE; - D3D11_MAPPED_SUBRESOURCE map; - gsize offset[GST_VIDEO_MAX_PLANES]; - ID3D11DeviceContext *device_context = - gst_d3d11_device_get_device_context_handle (device); - - gst_d3d11_device_lock (device); - hr = ID3D11DeviceContext_Map (device_context, - (ID3D11Resource *) texture, 0, map_type, 0, &map); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR_OBJECT (device, "Failed to map texture (0x%x)", (guint) hr); - gst_d3d11_device_unlock (device); - 
return FALSE; - } - - ret = gst_d3d11_dxgi_format_get_size (desc->Format, - desc->Width, desc->Height, map.RowPitch, offset, stride, size); - - ID3D11DeviceContext_Unmap (device_context, (ID3D11Resource *) texture, 0); - gst_d3d11_device_unlock (device); - - return ret; -} - -static void -create_shader_resource_views (GstD3D11Memory * mem) -{ - gint i; - HRESULT hr; - guint num_views = 0; - ID3D11Device *device_handle; - D3D11_SHADER_RESOURCE_VIEW_DESC resource_desc = { 0, }; - DXGI_FORMAT formats[GST_VIDEO_MAX_PLANES] = { DXGI_FORMAT_UNKNOWN, }; - - device_handle = gst_d3d11_device_get_device_handle (mem->device); - - switch (mem->desc.Format) { - case DXGI_FORMAT_B8G8R8A8_UNORM: - case DXGI_FORMAT_R8G8B8A8_UNORM: - case DXGI_FORMAT_R10G10B10A2_UNORM: - case DXGI_FORMAT_R8_UNORM: - case DXGI_FORMAT_R8G8_UNORM: - case DXGI_FORMAT_R16_UNORM: - case DXGI_FORMAT_R16G16_UNORM: - num_views = 1; - formats[0] = mem->desc.Format; - break; - case DXGI_FORMAT_AYUV: - num_views = 1; - formats[0] = DXGI_FORMAT_R8G8B8A8_UNORM; - break; - case DXGI_FORMAT_NV12: - num_views = 2; - formats[0] = DXGI_FORMAT_R8_UNORM; - formats[1] = DXGI_FORMAT_R8G8_UNORM; - break; - case DXGI_FORMAT_P010: - case DXGI_FORMAT_P016: - num_views = 2; - formats[0] = DXGI_FORMAT_R16_UNORM; - formats[1] = DXGI_FORMAT_R16G16_UNORM; - break; - default: - g_assert_not_reached (); - break; - } - - if ((mem->desc.BindFlags & D3D11_BIND_SHADER_RESOURCE) == - D3D11_BIND_SHADER_RESOURCE) { - resource_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; - resource_desc.Texture2D.MipLevels = 1; - - for (i = 0; i < num_views; i++) { - resource_desc.Format = formats[i]; - hr = ID3D11Device_CreateShaderResourceView (device_handle, - (ID3D11Resource *) mem->texture, &resource_desc, - &mem->shader_resource_view[i]); - - if (!gst_d3d11_result (hr, mem->device)) { - GST_ERROR_OBJECT (GST_MEMORY_CAST (mem)->allocator, - "Failed to create %dth resource view (0x%x)", i, (guint) hr); - goto error; - } - } - - 
mem->num_shader_resource_views = num_views; - } - - return; - -error: - for (i = 0; i < num_views; i++) { - if (mem->shader_resource_view[i]) - ID3D11ShaderResourceView_Release (mem->shader_resource_view[i]); - mem->shader_resource_view[i] = NULL; - } - - mem->num_shader_resource_views = 0; -} - -static void -create_render_target_views (GstD3D11Memory * mem) -{ - gint i; - HRESULT hr; - guint num_views = 0; - ID3D11Device *device_handle; - D3D11_RENDER_TARGET_VIEW_DESC render_desc = { 0, }; - DXGI_FORMAT formats[GST_VIDEO_MAX_PLANES] = { DXGI_FORMAT_UNKNOWN, }; - - device_handle = gst_d3d11_device_get_device_handle (mem->device); - - switch (mem->desc.Format) { - case DXGI_FORMAT_B8G8R8A8_UNORM: - case DXGI_FORMAT_R8G8B8A8_UNORM: - case DXGI_FORMAT_R10G10B10A2_UNORM: - case DXGI_FORMAT_R8_UNORM: - case DXGI_FORMAT_R8G8_UNORM: - case DXGI_FORMAT_R16_UNORM: - case DXGI_FORMAT_R16G16_UNORM: - num_views = 1; - formats[0] = mem->desc.Format; - break; - case DXGI_FORMAT_AYUV: - num_views = 1; - formats[0] = DXGI_FORMAT_R8G8B8A8_UNORM; - break; - case DXGI_FORMAT_NV12: - num_views = 2; - formats[0] = DXGI_FORMAT_R8_UNORM; - formats[1] = DXGI_FORMAT_R8G8_UNORM; - break; - case DXGI_FORMAT_P010: - case DXGI_FORMAT_P016: - num_views = 2; - formats[0] = DXGI_FORMAT_R16_UNORM; - formats[1] = DXGI_FORMAT_R16G16_UNORM; - break; - default: - g_assert_not_reached (); - break; - } - - if ((mem->desc.BindFlags & D3D11_BIND_RENDER_TARGET) == - D3D11_BIND_RENDER_TARGET) { - render_desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; - render_desc.Texture2D.MipSlice = 0; - - for (i = 0; i < num_views; i++) { - render_desc.Format = formats[i]; - - hr = ID3D11Device_CreateRenderTargetView (device_handle, - (ID3D11Resource *) mem->texture, &render_desc, - &mem->render_target_view[i]); - if (!gst_d3d11_result (hr, mem->device)) { - GST_ERROR_OBJECT (GST_MEMORY_CAST (mem)->allocator, - "Failed to create %dth render target view (0x%x)", i, (guint) hr); - goto error; - } - } - - 
mem->num_render_target_views = num_views;
-  }
-
-  return;
-
-error:
-  for (i = 0; i < num_views; i++) {
-    if (mem->render_target_view[i])
-      ID3D11RenderTargetView_Release (mem->render_target_view[i]);
-    mem->render_target_view[i] = NULL;
-  }
-
-  mem->num_render_target_views = 0;
-}
-
-GstMemory *
-gst_d3d11_allocator_alloc (GstD3D11Allocator * allocator,
-    GstD3D11AllocationParams * params)
-{
-  GstD3D11Memory *mem;
-  GstD3D11Device *device;
-  ID3D11Texture2D *texture = NULL;
-  ID3D11Texture2D *staging = NULL;
-  D3D11_TEXTURE2D_DESC *desc;
-  gsize *size;
-  gboolean is_first = FALSE;
-  guint index_to_use = 0;
-  GstD3D11AllocatorPrivate *priv;
-  GstD3D11MemoryType type = GST_D3D11_MEMORY_TYPE_TEXTURE;
-
-  g_return_val_if_fail (GST_IS_D3D11_ALLOCATOR (allocator), NULL);
-  g_return_val_if_fail (params != NULL, NULL);
-
-  priv = allocator->priv;
-  device = allocator->device;
-  desc = &params->desc[params->plane];
-  size = &params->size[params->plane];
-
-  if (*size == 0)
-    is_first = TRUE;
-
-  if ((params->flags & GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY)) {
-    gint i;
-
-  do_again:
-    g_mutex_lock (&priv->lock);
-    if (priv->flushing) {
-      GST_DEBUG_OBJECT (allocator, "we are flushing");
-      g_mutex_unlock (&priv->lock);
-
-      return NULL;
-    }
-
-    for (i = 0; i < desc->ArraySize; i++) {
-      if (priv->array_in_use[i] == 0) {
-        index_to_use = i;
-        break;
-      }
-    }
-
-    if (i == desc->ArraySize) {
-      GST_DEBUG_OBJECT (allocator, "All elements in array are used now");
-      g_cond_wait (&priv->cond, &priv->lock);
-      g_mutex_unlock (&priv->lock);
-      goto do_again;
-    }
-
-    priv->array_in_use[index_to_use] = 1;
-
-    g_mutex_unlock (&priv->lock);
-
-    if (!priv->texture) {
-      priv->texture = gst_d3d11_device_create_texture (device, desc, NULL);
-      if (!priv->texture) {
-        GST_ERROR_OBJECT (allocator, "Couldn't create texture");
-        goto error;
-      }
-    }
-
-    ID3D11Texture2D_AddRef (priv->texture);
-    texture = priv->texture;
-
-    type = GST_D3D11_MEMORY_TYPE_ARRAY;
-  } else {
-    texture =
gst_d3d11_device_create_texture (device, desc, NULL);
-    if (!texture) {
-      GST_ERROR_OBJECT (allocator, "Couldn't create texture");
-      goto error;
-    }
-  }
-
-  /* per plane, allocate a staging texture to calculate the actual size,
-   * stride, and offset */
-  if (is_first) {
-    gint num_plane;
-    gint stride[GST_VIDEO_MAX_PLANES];
-    gsize mem_size;
-    gint i;
-
-    staging = create_staging_texture (device, desc);
-    if (!staging) {
-      GST_ERROR_OBJECT (allocator, "Couldn't create staging texture");
-      goto error;
-    }
-
-    if (!calculate_mem_size (device,
-            staging, desc, D3D11_MAP_READ, stride, &mem_size))
-      goto error;
-
-    num_plane = gst_d3d11_dxgi_format_n_planes (desc->Format);
-
-    for (i = 0; i < num_plane; i++) {
-      params->stride[params->plane + i] = stride[i];
-    }
-
-    *size = mem_size;
-  }
-
-  mem = g_new0 (GstD3D11Memory, 1);
-
-  gst_memory_init (GST_MEMORY_CAST (mem),
-      0, GST_ALLOCATOR_CAST (allocator), NULL, *size, 0, 0, *size);
-
-  g_mutex_init (&mem->lock);
-  mem->info = params->info;
-  mem->plane = params->plane;
-  mem->desc = *desc;
-  mem->texture = texture;
-  mem->staging = staging;
-  mem->device = gst_object_ref (device);
-  mem->type = type;
-  mem->subresource_index = index_to_use;
-
-  if (staging)
-    GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD);
-
-  return GST_MEMORY_CAST (mem);
-
-error:
-  if (texture)
-    gst_d3d11_device_release_texture (device, texture);
-
-  if (staging)
-    gst_d3d11_device_release_texture (device, staging);
-
-  return NULL;
-}
-
-void
-gst_d3d11_allocator_set_flushing (GstD3D11Allocator * allocator,
-    gboolean flushing)
-{
-  GstD3D11AllocatorPrivate *priv;
-
-  g_return_if_fail (GST_IS_D3D11_ALLOCATOR (allocator));
-
-  priv = allocator->priv;
-
-  g_mutex_lock (&priv->lock);
-  priv->flushing = flushing;
-  g_cond_broadcast (&priv->cond);
-  g_mutex_unlock (&priv->lock);
-}
-
-gboolean
-gst_is_d3d11_memory (GstMemory * mem)
-{
-  return mem != NULL && mem->allocator != NULL &&
-      GST_IS_D3D11_ALLOCATOR (mem->allocator);
-}
-
-gboolean -gst_d3d11_memory_ensure_shader_resource_view (GstD3D11Memory * mem) -{ - g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), FALSE); - - if (mem->num_shader_resource_views) - return TRUE; - - if (!(mem->desc.BindFlags & D3D11_BIND_SHADER_RESOURCE)) { - GST_WARNING_OBJECT (GST_MEMORY_CAST (mem)->allocator, - "Need BindFlags, current flag 0x%x", mem->desc.BindFlags); - - return FALSE; - } - - create_shader_resource_views (mem); - - return ! !mem->num_shader_resource_views; -} - -gboolean -gst_d3d11_memory_ensure_render_target_view (GstD3D11Memory * mem) -{ - g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), FALSE); - - if (mem->num_render_target_views) - return TRUE; - - if (!(mem->desc.BindFlags & D3D11_BIND_RENDER_TARGET)) { - GST_WARNING_OBJECT (GST_MEMORY_CAST (mem)->allocator, - "Need BindFlags, current flag 0x%x", mem->desc.BindFlags); - - return FALSE; - } - - create_render_target_views (mem); - - return ! !mem->num_render_target_views; -}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11memory.h
Deleted
@@ -1,189 +0,0 @@ -/* - * GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifndef __GST_D3D11_MEMORY_H__ -#define __GST_D3D11_MEMORY_H__ - -#include <gst/gst.h> -#include <gst/video/video.h> - -#include "gstd3d11_fwd.h" -#include "gstd3d11format.h" - -G_BEGIN_DECLS - -#define GST_TYPE_D3D11_ALLOCATION_PARAMS (gst_d3d11_allocation_params_get_type()) -#define GST_TYPE_D3D11_ALLOCATOR (gst_d3d11_allocator_get_type()) -#define GST_D3D11_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_D3D11_ALLOCATOR, GstD3D11Allocator)) -#define GST_D3D11_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS((klass), GST_TYPE_D3D11_ALLOCATOR, GstD3D11AllocatorClass)) -#define GST_IS_D3D11_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_D3D11_ALLOCATOR)) -#define GST_IS_D3D11_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_ALLOCATOR)) -#define GST_D3D11_ALLOCATOR_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_ALLOCATOR, GstD3D11AllocatorClass)) - -#define GST_D3D11_MEMORY_NAME "D3D11Memory" - -/** - * GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY: - * - * Name of the caps feature for indicating the use of #GstD3D11Memory - */ -#define 
GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "memory:D3D11Memory" - -/** - * GST_MAP_D3D11: - * - * Flag indicating that we should map the D3D11 resource instead of to system memory. - */ -#define GST_MAP_D3D11 (GST_MAP_FLAG_LAST << 1) - -/** - * GstD3D11AllocationFlags: - * GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY: Indicates each allocated texture should be array type - */ -typedef enum -{ - GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY = (1 << 0), -} GstD3D11AllocationFlags; - -/** - * GstD3D11MemoryTransfer: - * @GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD: the texture needs downloading - * to the staging texture memory - * @GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD: the staging texture needs uploading - * to the texture - */ -typedef enum -{ - GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD = (GST_MEMORY_FLAG_LAST << 0), - GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD = (GST_MEMORY_FLAG_LAST << 1) -} GstD3D11MemoryTransfer; - -struct _GstD3D11AllocationParams -{ - /* Texture description per plane */ - D3D11_TEXTURE2D_DESC desc[GST_VIDEO_MAX_PLANES]; - - GstVideoInfo info; - GstVideoInfo aligned_info; - const GstD3D11Format *d3d11_format; - - /* size and stride of staging texture, set by allocator */ - gint stride[GST_VIDEO_MAX_PLANES]; - gsize size[GST_VIDEO_MAX_PLANES]; - - /* Current target plane for allocation */ - guint plane; - - GstD3D11AllocationFlags flags; - - /*< private >*/ - gpointer _gst_reserved[GST_PADDING_LARGE]; -}; - -typedef enum -{ - GST_D3D11_MEMORY_TYPE_TEXTURE = 0, - GST_D3D11_MEMORY_TYPE_ARRAY = 1, -} GstD3D11MemoryType; - -struct _GstD3D11Memory -{ - GstMemory mem; - - /*< public > */ - GstD3D11Device *device; - - ID3D11Texture2D *texture; - ID3D11Texture2D *staging; - - ID3D11ShaderResourceView *shader_resource_view[GST_VIDEO_MAX_PLANES]; - guint num_shader_resource_views; - - ID3D11RenderTargetView *render_target_view[GST_VIDEO_MAX_PLANES]; - guint num_render_target_views; - - GstVideoInfo info; - - guint plane; - GstD3D11MemoryType type; - - /* > 0 if this is Array typed 
memory */ - guint subresource_index; - - D3D11_TEXTURE2D_DESC desc; - D3D11_MAPPED_SUBRESOURCE map; - - /*< private >*/ - GMutex lock; - gint cpu_map_count; -}; - -struct _GstD3D11Allocator -{ - GstAllocator parent; - - GstD3D11Device *device; - - /*< private >*/ - GstD3D11AllocatorPrivate *priv; - gpointer _gst_reserved[GST_PADDING]; -}; - -struct _GstD3D11AllocatorClass -{ - GstAllocatorClass allocator_class; - - /*< private >*/ - gpointer _gst_reserved[GST_PADDING]; -}; - -GType gst_d3d11_allocation_params_get_type (void); - -GstD3D11AllocationParams * gst_d3d11_allocation_params_new (GstD3D11Device * device, - GstVideoInfo * info, - GstD3D11AllocationFlags flags, - gint bind_flags); - -GstD3D11AllocationParams * gst_d3d11_allocation_params_copy (GstD3D11AllocationParams * src); - -void gst_d3d11_allocation_params_free (GstD3D11AllocationParams * params); - -gboolean gst_d3d11_allocation_params_alignment (GstD3D11AllocationParams * parms, - GstVideoAlignment * align); - -GType gst_d3d11_allocator_get_type (void); - -GstD3D11Allocator * gst_d3d11_allocator_new (GstD3D11Device *device); - -GstMemory * gst_d3d11_allocator_alloc (GstD3D11Allocator * allocator, - GstD3D11AllocationParams * params); - -void gst_d3d11_allocator_set_flushing (GstD3D11Allocator * allocator, - gboolean flushing); - -gboolean gst_is_d3d11_memory (GstMemory * mem); - -gboolean gst_d3d11_memory_ensure_shader_resource_view (GstD3D11Memory * mem); - -gboolean gst_d3d11_memory_ensure_render_target_view (GstD3D11Memory * mem); - -G_END_DECLS - -#endif /* __GST_D3D11_MEMORY_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11overlaycompositor.c
Deleted
@@ -1,680 +0,0 @@ -/* GStreamer - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -# include <config.h> -#endif - -#include "gstd3d11overlaycompositor.h" -#include "gstd3d11utils.h" -#include "gstd3d11device.h" -#include "gstd3d11shader.h" -#include "gstd3d11format.h" - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_overlay_compositor_debug); -#define GST_CAT_DEFAULT gst_d3d11_overlay_compositor_debug - -/* *INDENT-OFF* */ -typedef struct -{ - struct { - FLOAT x; - FLOAT y; - FLOAT z; - } position; - struct { - FLOAT x; - FLOAT y; - } texture; -} VertexData; - -static const gchar templ_pixel_shader[] = - "Texture2D shaderTexture;\n" - "SamplerState samplerState;\n" - "\n" - "struct PS_INPUT\n" - "{\n" - " float4 Position: SV_POSITION;\n" - " float3 Texture: TEXCOORD0;\n" - "};\n" - "\n" - "float4 main(PS_INPUT input): SV_TARGET\n" - "{\n" - " return shaderTexture.Sample(samplerState, input.Texture);\n" - "}\n"; - -static const gchar templ_vertex_shader[] = - "struct VS_INPUT\n" - "{\n" - " float4 Position : POSITION;\n" - " float4 Texture : TEXCOORD0;\n" - "};\n" - "\n" - "struct VS_OUTPUT\n" - "{\n" - " float4 Position: SV_POSITION;\n" - " float4 Texture: TEXCOORD0;\n" - "};\n" - "\n" 
- "VS_OUTPUT main(VS_INPUT input)\n" - "{\n" - " return input;\n" - "}\n"; -/* *INDENT-ON* */ - -struct _GstD3D11OverlayCompositor -{ - GstD3D11Device *device; - GstVideoInfo out_info; - - D3D11_VIEWPORT viewport; - - ID3D11PixelShader *ps; - ID3D11VertexShader *vs; - ID3D11InputLayout *layout; - ID3D11SamplerState *sampler; - ID3D11BlendState *blend; - ID3D11Buffer *index_buffer; - - /* GstD3D11CompositionOverlay */ - GList *overlays; -}; - -typedef struct -{ - GstVideoOverlayRectangle *overlay_rect; - ID3D11Texture2D *texture; - ID3D11ShaderResourceView *srv; - GstD3D11Quad *quad; -} GstD3D11CompositionOverlay; - -static GstD3D11CompositionOverlay * -gst_d3d11_composition_overlay_new (GstD3D11OverlayCompositor * self, - GstVideoOverlayRectangle * overlay_rect) -{ - GstD3D11CompositionOverlay *overlay = NULL; - gint x, y; - guint width, height; - D3D11_SUBRESOURCE_DATA subresource_data = { 0, }; - D3D11_TEXTURE2D_DESC texture_desc = { 0, }; - D3D11_SHADER_RESOURCE_VIEW_DESC srv_desc = { 0, }; - D3D11_BUFFER_DESC buffer_desc = { 0, }; - ID3D11Buffer *vertex_buffer = NULL; - D3D11_MAPPED_SUBRESOURCE map; - VertexData *vertex_data; - GstBuffer *buf; - GstVideoMeta *vmeta; - GstMapInfo info; - guint8 *data; - gint stride; - ID3D11Texture2D *texture = NULL; - ID3D11ShaderResourceView *srv = NULL; - HRESULT hr; - ID3D11Device *device_handle; - ID3D11DeviceContext *context_handle; - GstD3D11Device *device = self->device; - const guint index_count = 2 * 3; - FLOAT x1, y1, x2, y2; - gdouble val; - - g_return_val_if_fail (overlay_rect != NULL, NULL); - - device_handle = gst_d3d11_device_get_device_handle (device); - context_handle = gst_d3d11_device_get_device_context_handle (device); - - if (!gst_video_overlay_rectangle_get_render_rectangle (overlay_rect, &x, &y, - &width, &height)) { - GST_ERROR ("Failed to get render rectangle"); - return NULL; - } - - buf = gst_video_overlay_rectangle_get_pixels_unscaled_argb (overlay_rect, - GST_VIDEO_OVERLAY_FORMAT_FLAG_NONE); - if 
(!buf) { - GST_ERROR ("Failed to get overlay buffer"); - return NULL; - } - - vmeta = gst_buffer_get_video_meta (buf); - if (!vmeta) { - GST_ERROR ("Failed to get video meta"); - return NULL; - } - - if (!gst_video_meta_map (vmeta, - 0, &info, (gpointer *) & data, &stride, GST_MAP_READ)) { - GST_ERROR ("Failed to map"); - return NULL; - } - - /* Do create texture and upload data at once, for create immutable texture */ - subresource_data.pSysMem = data; - subresource_data.SysMemPitch = stride; - subresource_data.SysMemSlicePitch = 0; - - texture_desc.Width = width; - texture_desc.Height = height; - texture_desc.MipLevels = 1; - texture_desc.ArraySize = 1; - /* FIXME: need to consider non-BGRA ? */ - texture_desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; - texture_desc.SampleDesc.Count = 1; - texture_desc.SampleDesc.Quality = 0; - texture_desc.Usage = D3D11_USAGE_IMMUTABLE; - texture_desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; - texture_desc.CPUAccessFlags = 0; - - texture = gst_d3d11_device_create_texture (device, - &texture_desc, &subresource_data); - gst_video_meta_unmap (vmeta, 0, &info); - - if (!texture) { - GST_ERROR ("Failed to create texture"); - return NULL; - } - - srv_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; - srv_desc.Texture2D.MipLevels = 1; - - hr = ID3D11Device_CreateShaderResourceView (device_handle, - (ID3D11Resource *) texture, &srv_desc, &srv); - if (!gst_d3d11_result (hr, device) || !srv) { - GST_ERROR ("Failed to create shader resource view"); - goto clear; - } - - buffer_desc.Usage = D3D11_USAGE_DYNAMIC; - buffer_desc.ByteWidth = sizeof (VertexData) * 4; - buffer_desc.BindFlags = D3D11_BIND_VERTEX_BUFFER; - buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; - - hr = ID3D11Device_CreateBuffer (device_handle, &buffer_desc, NULL, - &vertex_buffer); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create vertex buffer, hr: 0x%x", (guint) hr); - goto clear; - } - - gst_d3d11_device_lock (device); - hr = 
ID3D11DeviceContext_Map (context_handle, - (ID3D11Resource *) vertex_buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &map); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't map vertex buffer, hr: 0x%x", (guint) hr); - gst_d3d11_device_unlock (device); - goto clear; - } - - vertex_data = (VertexData *) map.pData; - /* bottom left */ - gst_util_fraction_to_double (x, GST_VIDEO_INFO_WIDTH (&self->out_info), &val); - x1 = (val * 2.0f) - 1.0f; - - gst_util_fraction_to_double (y + height, - GST_VIDEO_INFO_HEIGHT (&self->out_info), &val); - y1 = (val * -2.0f) + 1.0f; - - /* top right */ - gst_util_fraction_to_double (x + width, - GST_VIDEO_INFO_WIDTH (&self->out_info), &val); - x2 = (val * 2.0f) - 1.0f; - - gst_util_fraction_to_double (y, - GST_VIDEO_INFO_HEIGHT (&self->out_info), &val); - y2 = (val * -2.0f) + 1.0f; - - /* bottom left */ - vertex_data[0].position.x = x1; - vertex_data[0].position.y = y1; - vertex_data[0].position.z = 0.0f; - vertex_data[0].texture.x = 0.0f; - vertex_data[0].texture.y = 1.0f; - - /* top left */ - vertex_data[1].position.x = x1; - vertex_data[1].position.y = y2; - vertex_data[1].position.z = 0.0f; - vertex_data[1].texture.x = 0.0f; - vertex_data[1].texture.y = 0.0f; - - /* top right */ - vertex_data[2].position.x = x2; - vertex_data[2].position.y = y2; - vertex_data[2].position.z = 0.0f; - vertex_data[2].texture.x = 1.0f; - vertex_data[2].texture.y = 0.0f; - - /* bottom right */ - vertex_data[3].position.x = x2; - vertex_data[3].position.y = y1; - vertex_data[3].position.z = 0.0f; - vertex_data[3].texture.x = 1.0f; - vertex_data[3].texture.y = 1.0f; - - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) vertex_buffer, 0); - gst_d3d11_device_unlock (device); - - overlay = g_new0 (GstD3D11CompositionOverlay, 1); - overlay->overlay_rect = gst_video_overlay_rectangle_ref (overlay_rect); - overlay->texture = texture; - overlay->srv = srv; - overlay->quad = gst_d3d11_quad_new (device, - self->ps, self->vs, self->layout, 
self->sampler, self->blend, NULL, NULL, - vertex_buffer, sizeof (VertexData), - self->index_buffer, DXGI_FORMAT_R16_UINT, index_count); - -clear: - if (!overlay) { - if (srv) - ID3D11ShaderResourceView_Release (srv); - if (texture) - ID3D11Texture2D_Release (texture); - } - - if (vertex_buffer) - ID3D11Buffer_Release (vertex_buffer); - - return overlay; -} - -static void -gst_d3d11_composition_overlay_free (GstD3D11CompositionOverlay * overlay) -{ - if (!overlay) - return; - - if (overlay->overlay_rect) - gst_video_overlay_rectangle_unref (overlay->overlay_rect); - - if (overlay->srv) - ID3D11ShaderResourceView_Release (overlay->srv); - - if (overlay->texture) - ID3D11Texture2D_Release (overlay->texture); - - if (overlay->quad) - gst_d3d11_quad_free (overlay->quad); - - g_free (overlay); -} - -static gboolean -gst_d3d11_overlay_compositor_setup_shader (GstD3D11OverlayCompositor * self, - GstD3D11Device * device) -{ - HRESULT hr; - D3D11_SAMPLER_DESC sampler_desc = { 0, }; - D3D11_INPUT_ELEMENT_DESC input_desc[2] = { 0, }; - D3D11_BUFFER_DESC buffer_desc = { 0, }; - D3D11_BLEND_DESC blend_desc = { 0, }; - D3D11_MAPPED_SUBRESOURCE map; - WORD *indices; - ID3D11Device *device_handle; - ID3D11DeviceContext *context_handle; - ID3D11PixelShader *ps = NULL; - ID3D11VertexShader *vs = NULL; - ID3D11InputLayout *layout = NULL; - ID3D11SamplerState *sampler = NULL; - ID3D11BlendState *blend = NULL; - ID3D11Buffer *index_buffer = NULL; - const guint index_count = 2 * 3; - gboolean ret = TRUE; - - device_handle = gst_d3d11_device_get_device_handle (device); - context_handle = gst_d3d11_device_get_device_context_handle (device); - - /* bilinear filtering */ - sampler_desc.Filter = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; - sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP; - sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP; - sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP; - sampler_desc.ComparisonFunc = D3D11_COMPARISON_ALWAYS; - sampler_desc.MinLOD = 0; - 
sampler_desc.MaxLOD = D3D11_FLOAT32_MAX; - - hr = ID3D11Device_CreateSamplerState (device_handle, &sampler_desc, &sampler); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create sampler state, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - GST_LOG ("Create Pixel Shader \n%s", templ_pixel_shader); - - if (!gst_d3d11_create_pixel_shader (device, templ_pixel_shader, &ps)) { - GST_ERROR ("Couldn't create pixel shader"); - ret = FALSE; - goto clear; - } - - input_desc[0].SemanticName = "POSITION"; - input_desc[0].SemanticIndex = 0; - input_desc[0].Format = DXGI_FORMAT_R32G32B32_FLOAT; - input_desc[0].InputSlot = 0; - input_desc[0].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; - input_desc[0].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; - input_desc[0].InstanceDataStepRate = 0; - - input_desc[1].SemanticName = "TEXCOORD"; - input_desc[1].SemanticIndex = 0; - input_desc[1].Format = DXGI_FORMAT_R32G32_FLOAT; - input_desc[1].InputSlot = 0; - input_desc[1].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; - input_desc[1].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; - input_desc[1].InstanceDataStepRate = 0; - - if (!gst_d3d11_create_vertex_shader (device, templ_vertex_shader, - input_desc, G_N_ELEMENTS (input_desc), &vs, &layout)) { - GST_ERROR ("Couldn't create vertex shader"); - ret = FALSE; - goto clear; - } - - blend_desc.AlphaToCoverageEnable = FALSE; - blend_desc.IndependentBlendEnable = FALSE; - blend_desc.RenderTarget[0].BlendEnable = TRUE; - blend_desc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA; - blend_desc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA; - blend_desc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD; - blend_desc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE; - blend_desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO; - blend_desc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD; - blend_desc.RenderTarget[0].RenderTargetWriteMask = - D3D11_COLOR_WRITE_ENABLE_ALL; - - hr = 
ID3D11Device_CreateBlendState (device_handle, &blend_desc, &blend); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create blend state, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - buffer_desc.Usage = D3D11_USAGE_DYNAMIC; - buffer_desc.ByteWidth = sizeof (WORD) * index_count; - buffer_desc.BindFlags = D3D11_BIND_INDEX_BUFFER; - buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; - - hr = ID3D11Device_CreateBuffer (device_handle, &buffer_desc, NULL, - &index_buffer); - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't create index buffer, hr: 0x%x", (guint) hr); - ret = FALSE; - goto clear; - } - - gst_d3d11_device_lock (device); - hr = ID3D11DeviceContext_Map (context_handle, - (ID3D11Resource *) index_buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &map); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("Couldn't map index buffer, hr: 0x%x", (guint) hr); - gst_d3d11_device_unlock (device); - ret = FALSE; - goto clear; - } - - indices = (WORD *) map.pData; - - /* clockwise indexing */ - indices[0] = 0; /* bottom left */ - indices[1] = 1; /* top left */ - indices[2] = 2; /* top right */ - - indices[3] = 3; /* bottom right */ - indices[4] = 0; /* bottom left */ - indices[5] = 2; /* top right */ - - ID3D11DeviceContext_Unmap (context_handle, - (ID3D11Resource *) index_buffer, 0); - gst_d3d11_device_unlock (device); - - self->ps = ps; - self->vs = vs; - self->layout = layout; - self->sampler = sampler; - self->blend = blend; - self->index_buffer = index_buffer; - -clear: - if (ret) - return TRUE; - - if (ps) - ID3D11PixelShader_Release (ps); - if (vs) - ID3D11VertexShader_Release (vs); - if (layout) - ID3D11InputLayout_Release (layout); - if (sampler) - ID3D11SamplerState_Release (sampler); - if (blend) - ID3D11BlendState_Release (blend); - if (index_buffer) - ID3D11Buffer_Release (index_buffer); - - return FALSE; -} - - -GstD3D11OverlayCompositor * -gst_d3d11_overlay_compositor_new (GstD3D11Device * device, - GstVideoInfo * 
out_info) -{ - GstD3D11OverlayCompositor *compositor = NULL; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - g_return_val_if_fail (out_info != NULL, NULL); - - compositor = g_new0 (GstD3D11OverlayCompositor, 1); - - if (!gst_d3d11_overlay_compositor_setup_shader (compositor, device)) { - gst_d3d11_overlay_compositor_free (compositor); - return NULL; - } - - compositor->device = gst_object_ref (device); - compositor->out_info = *out_info; - - compositor->viewport.TopLeftX = 0; - compositor->viewport.TopLeftY = 0; - compositor->viewport.Width = GST_VIDEO_INFO_WIDTH (out_info); - compositor->viewport.Height = GST_VIDEO_INFO_HEIGHT (out_info); - compositor->viewport.MinDepth = 0.0f; - compositor->viewport.MaxDepth = 1.0f; - - return compositor; -} - -void -gst_d3d11_overlay_compositor_free (GstD3D11OverlayCompositor * compositor) -{ - g_return_if_fail (compositor != NULL); - - gst_d3d11_overlay_compositor_free_overlays (compositor); - - if (compositor->ps) - ID3D11PixelShader_Release (compositor->ps); - if (compositor->vs) - ID3D11VertexShader_Release (compositor->vs); - if (compositor->layout) - ID3D11InputLayout_Release (compositor->layout); - if (compositor->sampler) - ID3D11SamplerState_Release (compositor->sampler); - if (compositor->blend) - ID3D11BlendState_Release (compositor->blend); - if (compositor->index_buffer) - ID3D11Buffer_Release (compositor->index_buffer); - - gst_clear_object (&compositor->device); - g_free (compositor); -} - -static gint -find_in_compositor (const GstD3D11CompositionOverlay * overlay, - const GstVideoOverlayRectangle * rect) -{ - return !(overlay->overlay_rect == rect); -} - -static gboolean -is_in_video_overlay_composition (GstVideoOverlayComposition * voc, - GstD3D11CompositionOverlay * overlay) -{ - guint i; - - for (i = 0; i < gst_video_overlay_composition_n_rectangles (voc); i++) { - GstVideoOverlayRectangle *rectangle = - gst_video_overlay_composition_get_rectangle (voc, i); - if (overlay->overlay_rect == 
rectangle) - return TRUE; - } - return FALSE; -} - -gboolean -gst_d3d11_overlay_compositor_upload (GstD3D11OverlayCompositor * compositor, - GstBuffer * buf) -{ - GstVideoOverlayCompositionMeta *meta; - gint i, num_overlays; - GList *iter; - - g_return_val_if_fail (compositor != NULL, FALSE); - g_return_val_if_fail (GST_IS_BUFFER (buf), FALSE); - - meta = gst_buffer_get_video_overlay_composition_meta (buf); - - if (!meta) { - gst_d3d11_overlay_compositor_free_overlays (compositor); - return TRUE; - } - - num_overlays = gst_video_overlay_composition_n_rectangles (meta->overlay); - if (!num_overlays) { - gst_d3d11_overlay_compositor_free_overlays (compositor); - return TRUE; - } - - GST_LOG ("Upload %d overlay rectangles", num_overlays); - - /* Upload new overlay */ - for (i = 0; i < num_overlays; i++) { - GstVideoOverlayRectangle *rectangle = - gst_video_overlay_composition_get_rectangle (meta->overlay, i); - - if (!g_list_find_custom (compositor->overlays, - rectangle, (GCompareFunc) find_in_compositor)) { - GstD3D11CompositionOverlay *overlay = NULL; - - overlay = gst_d3d11_composition_overlay_new (compositor, rectangle); - - if (!overlay) - return FALSE; - - compositor->overlays = g_list_append (compositor->overlays, overlay); - } - } - - /* Remove old overlay */ - iter = compositor->overlays; - while (iter) { - GstD3D11CompositionOverlay *overlay = - (GstD3D11CompositionOverlay *) iter->data; - GList *next = iter->next; - - if (!is_in_video_overlay_composition (meta->overlay, overlay)) { - compositor->overlays = g_list_delete_link (compositor->overlays, iter); - gst_d3d11_composition_overlay_free (overlay); - } - - iter = next; - } - - return TRUE; -} - -void -gst_d3d11_overlay_compositor_free_overlays (GstD3D11OverlayCompositor * - compositor) -{ - g_return_if_fail (compositor != NULL); - - if (compositor->overlays) { - g_list_free_full (compositor->overlays, - (GDestroyNotify) gst_d3d11_composition_overlay_free); - - compositor->overlays = NULL; - } -} - 
-gboolean -gst_d3d11_overlay_compositor_update_rect (GstD3D11OverlayCompositor * - compositor, RECT * rect) -{ - g_return_val_if_fail (compositor != NULL, FALSE); - g_return_val_if_fail (rect != NULL, FALSE); - - compositor->viewport.TopLeftX = rect->left; - compositor->viewport.TopLeftY = rect->top; - compositor->viewport.Width = rect->right - rect->left; - compositor->viewport.Height = rect->bottom - rect->top; - - return TRUE; -} - -gboolean -gst_d3d11_overlay_compositor_draw (GstD3D11OverlayCompositor * compositor, - ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) -{ - gboolean ret = TRUE; - - g_return_val_if_fail (compositor != NULL, FALSE); - g_return_val_if_fail (rtv != NULL, FALSE); - - gst_d3d11_device_lock (compositor->device); - ret = gst_d3d11_overlay_compositor_draw_unlocked (compositor, rtv); - gst_d3d11_device_unlock (compositor->device); - - return ret; -} - -gboolean -gst_d3d11_overlay_compositor_draw_unlocked (GstD3D11OverlayCompositor * - compositor, ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) -{ - gboolean ret = TRUE; - GList *iter; - - g_return_val_if_fail (compositor != NULL, FALSE); - g_return_val_if_fail (rtv != NULL, FALSE); - - for (iter = compositor->overlays; iter; iter = g_list_next (iter)) { - GstD3D11CompositionOverlay *overlay = - (GstD3D11CompositionOverlay *) iter->data; - - ret = gst_d3d11_draw_quad_unlocked (overlay->quad, - &compositor->viewport, 1, &overlay->srv, 1, rtv, 1, NULL); - - if (!ret) - break; - } - - return ret; -}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11shader.c
Deleted
@@ -1,423 +0,0 @@ -/* GStreamer - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11shader.h" -#include "gstd3d11device.h" -#include "gstd3d11utils.h" -#include <gmodule.h> - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_shader_debug); -#define GST_CAT_DEFAULT gst_d3d11_shader_debug - -static GModule *d3d_compiler_module = NULL; -static pD3DCompile GstD3DCompileFunc = NULL; - -gboolean -gst_d3d11_shader_init (void) -{ - static gsize _init = 0; - - if (g_once_init_enter (&_init)) { -#if GST_D3D11_WINAPI_ONLY_APP - /* Assuming that d3d compiler library is available */ - GstD3DCompileFunc = D3DCompile; -#else - static const gchar *d3d_compiler_names[] = { - "d3dcompiler_47.dll", - "d3dcompiler_46.dll", - "d3dcompiler_45.dll", - "d3dcompiler_44.dll", - "d3dcompiler_43.dll", - }; - gint i; - for (i = 0; i < G_N_ELEMENTS (d3d_compiler_names); i++) { - d3d_compiler_module = - g_module_open (d3d_compiler_names[i], G_MODULE_BIND_LAZY); - - if (d3d_compiler_module) { - GST_INFO ("D3D compiler %s is available", d3d_compiler_names[i]); - if (!g_module_symbol (d3d_compiler_module, "D3DCompile", - (gpointer *) & GstD3DCompileFunc)) { - GST_ERROR 
("Cannot load D3DCompile symbol from %s", - d3d_compiler_names[i]); - g_module_close (d3d_compiler_module); - d3d_compiler_module = NULL; - GstD3DCompileFunc = NULL; - } else { - break; - } - } - } - - if (!GstD3DCompileFunc) - GST_WARNING ("D3D11 compiler library is unavailable"); -#endif - - g_once_init_leave (&_init, 1); - } - - return ! !GstD3DCompileFunc; -} - -static ID3DBlob * -compile_shader (GstD3D11Device * device, const gchar * shader_source, - gboolean is_pixel_shader) -{ - ID3DBlob *ret; - ID3DBlob *error = NULL; - const gchar *shader_target; - D3D_FEATURE_LEVEL feature_level; - HRESULT hr; - - if (!gst_d3d11_shader_init ()) { - GST_ERROR ("D3DCompiler is unavailable"); - return NULL; - } - - feature_level = gst_d3d11_device_get_chosen_feature_level (device); - - if (is_pixel_shader) { - if (feature_level >= D3D_FEATURE_LEVEL_10_0) - shader_target = "ps_4_0"; - else if (feature_level >= D3D_FEATURE_LEVEL_9_3) - shader_target = "ps_4_0_level_9_3"; - else - shader_target = "ps_4_0_level_9_1"; - } else { - if (feature_level >= D3D_FEATURE_LEVEL_10_0) - shader_target = "vs_4_0"; - else if (feature_level >= D3D_FEATURE_LEVEL_9_3) - shader_target = "vs_4_0_level_9_3"; - else - shader_target = "vs_4_0_level_9_1"; - } - - g_assert (GstD3DCompileFunc); - - hr = GstD3DCompileFunc (shader_source, strlen (shader_source), NULL, NULL, - NULL, "main", shader_target, 0, 0, &ret, &error); - - if (!gst_d3d11_result (hr, device)) { - const gchar *err = NULL; - - if (error) - err = ID3D10Blob_GetBufferPointer (error); - - GST_ERROR ("could not compile source, hr: 0x%x, error detail %s", - (guint) hr, GST_STR_NULL (err)); - - if (error) - ID3D10Blob_Release (error); - - return NULL; - } - - if (error) { - const gchar *err = ID3D10Blob_GetBufferPointer (error); - - GST_WARNING ("HLSL compiler warnings:\n%s", GST_STR_NULL (err)); - ID3D10Blob_Release (error); - } - - return ret; -} - -gboolean -gst_d3d11_create_pixel_shader (GstD3D11Device * device, - const gchar * source, 
ID3D11PixelShader ** shader) -{ - ID3DBlob *ps_blob; - ID3D11Device *device_handle; - HRESULT hr; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); - g_return_val_if_fail (source != NULL, FALSE); - g_return_val_if_fail (shader != NULL, FALSE); - - gst_d3d11_device_lock (device); - ps_blob = compile_shader (device, source, TRUE); - - if (!ps_blob) { - GST_ERROR ("Failed to compile pixel shader"); - gst_d3d11_device_unlock (device); - return FALSE; - } - - device_handle = gst_d3d11_device_get_device_handle (device); - hr = ID3D11Device_CreatePixelShader (device_handle, - (gpointer) ID3D10Blob_GetBufferPointer (ps_blob), - ID3D10Blob_GetBufferSize (ps_blob), NULL, shader); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("could not create pixel shader, hr: 0x%x", (guint) hr); - gst_d3d11_device_unlock (device); - return FALSE; - } - - ID3D10Blob_Release (ps_blob); - gst_d3d11_device_unlock (device); - - return TRUE; -} - -gboolean -gst_d3d11_create_vertex_shader (GstD3D11Device * device, const gchar * source, - const D3D11_INPUT_ELEMENT_DESC * input_desc, guint desc_len, - ID3D11VertexShader ** shader, ID3D11InputLayout ** layout) -{ - ID3DBlob *vs_blob; - ID3D11Device *device_handle; - HRESULT hr; - ID3D11VertexShader *vshader = NULL; - ID3D11InputLayout *in_layout = NULL; - gboolean ret = FALSE; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); - g_return_val_if_fail (source != NULL, FALSE); - g_return_val_if_fail (input_desc != NULL, FALSE); - g_return_val_if_fail (desc_len > 0, FALSE); - g_return_val_if_fail (shader != NULL, FALSE); - g_return_val_if_fail (layout != NULL, FALSE); - - gst_d3d11_device_lock (device); - vs_blob = compile_shader (device, source, FALSE); - if (!vs_blob) { - GST_ERROR ("Failed to compile shader code"); - goto done; - } - - device_handle = gst_d3d11_device_get_device_handle (device); - - hr = ID3D11Device_CreateVertexShader (device_handle, - (gpointer) ID3D10Blob_GetBufferPointer (vs_blob), - 
ID3D10Blob_GetBufferSize (vs_blob), NULL, &vshader); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("could not create vertex shader, hr: 0x%x", (guint) hr); - ID3D10Blob_Release (vs_blob); - goto done; - } - - hr = ID3D11Device_CreateInputLayout (device_handle, input_desc, - desc_len, (gpointer) ID3D10Blob_GetBufferPointer (vs_blob), - ID3D10Blob_GetBufferSize (vs_blob), &in_layout); - - if (!gst_d3d11_result (hr, device)) { - GST_ERROR ("could not create input layout shader, hr: 0x%x", (guint) hr); - ID3D10Blob_Release (vs_blob); - ID3D11VertexShader_Release (vshader); - goto done; - } - - ID3D10Blob_Release (vs_blob); - - *shader = vshader; - *layout = in_layout; - - ret = TRUE; - -done: - gst_d3d11_device_unlock (device); - - return ret; -} - -struct _GstD3D11Quad -{ - GstD3D11Device *device; - ID3D11PixelShader *ps; - ID3D11VertexShader *vs; - ID3D11InputLayout *layout; - ID3D11SamplerState *sampler; - ID3D11BlendState *blend; - ID3D11DepthStencilState *depth_stencil; - ID3D11Buffer *const_buffer; - ID3D11Buffer *vertex_buffer; - guint vertex_stride; - ID3D11Buffer *index_buffer; - DXGI_FORMAT index_format; - guint index_count; - D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES]; - ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES]; - guint num_srv; - ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES]; - guint num_rtv; -}; - -GstD3D11Quad * -gst_d3d11_quad_new (GstD3D11Device * device, ID3D11PixelShader * pixel_shader, - ID3D11VertexShader * vertex_shader, ID3D11InputLayout * layout, - ID3D11SamplerState * sampler, ID3D11BlendState * blend, - ID3D11DepthStencilState * depth_stencil, - ID3D11Buffer * const_buffer, - ID3D11Buffer * vertex_buffer, guint vertex_stride, - ID3D11Buffer * index_buffer, DXGI_FORMAT index_format, guint index_count) -{ - GstD3D11Quad *quad; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); - g_return_val_if_fail (pixel_shader != NULL, NULL); - g_return_val_if_fail (vertex_shader != NULL, NULL); - g_return_val_if_fail 
(layout != NULL, NULL); - g_return_val_if_fail (sampler != NULL, NULL); - g_return_val_if_fail (vertex_buffer != NULL, NULL); - g_return_val_if_fail (vertex_stride > 0, NULL); - g_return_val_if_fail (index_buffer != NULL, NULL); - g_return_val_if_fail (index_format != DXGI_FORMAT_UNKNOWN, NULL); - - quad = g_new0 (GstD3D11Quad, 1); - - quad->device = gst_object_ref (device); - quad->ps = pixel_shader; - quad->vs = vertex_shader; - quad->layout = layout; - quad->sampler = sampler; - quad->blend = blend; - quad->depth_stencil = depth_stencil; - quad->vertex_buffer = vertex_buffer; - quad->vertex_stride = vertex_stride; - quad->index_buffer = index_buffer; - quad->index_format = index_format; - quad->index_count = index_count; - - ID3D11PixelShader_AddRef (pixel_shader); - ID3D11VertexShader_AddRef (vertex_shader); - ID3D11InputLayout_AddRef (layout); - ID3D11SamplerState_AddRef (sampler); - - if (blend) - ID3D11BlendState_AddRef (blend); - - if (depth_stencil) - ID3D11DepthStencilState_AddRef (depth_stencil); - - if (const_buffer) { - quad->const_buffer = const_buffer; - ID3D11Buffer_AddRef (const_buffer); - } - ID3D11Buffer_AddRef (vertex_buffer); - ID3D11Buffer_AddRef (index_buffer); - - return quad; -} - -void -gst_d3d11_quad_free (GstD3D11Quad * quad) -{ - g_return_if_fail (quad != NULL); - - if (quad->ps) - ID3D11PixelShader_Release (quad->ps); - if (quad->vs) - ID3D11VertexShader_Release (quad->vs); - if (quad->layout) - ID3D11InputLayout_Release (quad->layout); - if (quad->sampler) - ID3D11SamplerState_Release (quad->sampler); - if (quad->blend) - ID3D11BlendState_Release (quad->blend); - if (quad->depth_stencil) - ID3D11DepthStencilState_Release (quad->depth_stencil); - if (quad->const_buffer) - ID3D11Buffer_Release (quad->const_buffer); - if (quad->vertex_buffer) - ID3D11Buffer_Release (quad->vertex_buffer); - if (quad->index_buffer) - ID3D11Buffer_Release (quad->index_buffer); - - gst_clear_object (&quad->device); - g_free (quad); -} - -gboolean 
-gst_d3d11_draw_quad (GstD3D11Quad * quad,
-    D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES], guint num_viewport,
-    ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], guint num_srv,
-    ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES], guint num_rtv,
-    ID3D11DepthStencilView * dsv)
-{
-  gboolean ret;
-
-  g_return_val_if_fail (quad != NULL, FALSE);
-
-  gst_d3d11_device_lock (quad->device);
-  ret = gst_d3d11_draw_quad_unlocked (quad, viewport, num_viewport,
-      srv, num_srv, rtv, num_rtv, dsv);
-  gst_d3d11_device_unlock (quad->device);
-
-  return ret;
-}
-
-gboolean
-gst_d3d11_draw_quad_unlocked (GstD3D11Quad * quad,
-    D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES], guint num_viewport,
-    ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], guint num_srv,
-    ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES], guint num_rtv,
-    ID3D11DepthStencilView * dsv)
-{
-  ID3D11DeviceContext *context_handle;
-  UINT offsets = 0;
-  ID3D11ShaderResourceView *clear_view[GST_VIDEO_MAX_PLANES] = { NULL, };
-
-  g_return_val_if_fail (quad != NULL, FALSE);
-  g_return_val_if_fail (viewport != NULL, FALSE);
-  g_return_val_if_fail (num_viewport <= GST_VIDEO_MAX_PLANES, FALSE);
-  g_return_val_if_fail (srv != NULL, FALSE);
-  g_return_val_if_fail (num_srv <= GST_VIDEO_MAX_PLANES, FALSE);
-  g_return_val_if_fail (rtv != NULL, FALSE);
-  g_return_val_if_fail (num_rtv <= GST_VIDEO_MAX_PLANES, FALSE);
-
-  context_handle = gst_d3d11_device_get_device_context_handle (quad->device);
-
-  ID3D11DeviceContext_IASetPrimitiveTopology (context_handle,
-      D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
-  ID3D11DeviceContext_IASetInputLayout (context_handle, quad->layout);
-  ID3D11DeviceContext_IASetVertexBuffers (context_handle,
-      0, 1, &quad->vertex_buffer, &quad->vertex_stride, &offsets);
-  ID3D11DeviceContext_IASetIndexBuffer (context_handle,
-      quad->index_buffer, quad->index_format, 0);
-
-  ID3D11DeviceContext_PSSetSamplers (context_handle, 0, 1, &quad->sampler);
-  ID3D11DeviceContext_VSSetShader (context_handle, quad->vs, NULL, 0);
-  ID3D11DeviceContext_PSSetShader (context_handle, quad->ps, NULL, 0);
-  ID3D11DeviceContext_RSSetViewports (context_handle, num_viewport, viewport);
-
-  if (quad->const_buffer)
-    ID3D11DeviceContext_PSSetConstantBuffers (context_handle,
-        0, 1, &quad->const_buffer);
-
-  ID3D11DeviceContext_PSSetShaderResources (context_handle, 0, num_srv, srv);
-  ID3D11DeviceContext_OMSetRenderTargets (context_handle, num_rtv, rtv, dsv);
-  ID3D11DeviceContext_OMSetBlendState (context_handle,
-      quad->blend, NULL, 0xffffffff);
-  ID3D11DeviceContext_OMSetDepthStencilState (context_handle,
-      quad->depth_stencil, 1);
-
-  ID3D11DeviceContext_DrawIndexed (context_handle, quad->index_count, 0, 0);
-
-  ID3D11DeviceContext_PSSetShaderResources (context_handle,
-      0, num_srv, clear_view);
-  ID3D11DeviceContext_OMSetRenderTargets (context_handle, 0, NULL, NULL);
-
-  return TRUE;
-}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11upload.c
Deleted
@@ -1,369 +0,0 @@ -/* GStreamer - * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -# include <config.h> -#endif - -#include "gstd3d11upload.h" -#include "gstd3d11memory.h" -#include "gstd3d11device.h" -#include "gstd3d11bufferpool.h" -#include "gstd3d11utils.h" - -#include <string.h> - -GST_DEBUG_CATEGORY_STATIC (gst_d3d11_upload_debug); -#define GST_CAT_DEFAULT gst_d3d11_upload_debug - -static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", - GST_PAD_SINK, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, - GST_D3D11_FORMATS) ";" - GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY - "," GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS)) - ); - -static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", - GST_PAD_SRC, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, 
GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS))); - -#define gst_d3d11_upload_parent_class parent_class -G_DEFINE_TYPE (GstD3D11Upload, gst_d3d11_upload, GST_TYPE_D3D11_BASE_FILTER); - -static GstCaps *gst_d3d11_upload_transform_caps (GstBaseTransform * trans, - GstPadDirection direction, GstCaps * caps, GstCaps * filter); -static gboolean gst_d3d11_upload_propose_allocation (GstBaseTransform * trans, - GstQuery * decide_query, GstQuery * query); -static gboolean gst_d3d11_upload_decide_allocation (GstBaseTransform * trans, - GstQuery * query); -static GstFlowReturn gst_d3d11_upload_transform (GstBaseTransform * trans, - GstBuffer * inbuf, GstBuffer * outbuf); - -static void -gst_d3d11_upload_class_init (GstD3D11UploadClass * klass) -{ - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); - - gst_element_class_add_static_pad_template (element_class, &sink_template); - gst_element_class_add_static_pad_template (element_class, &src_template); - - gst_element_class_set_static_metadata (element_class, - "Direct3D11 uploader", "Filter/Video", - "Uploads data into D3D11 texture memory", - "Seungha Yang <seungha.yang@navercorp.com>"); - - trans_class->passthrough_on_same_caps = TRUE; - - trans_class->transform_caps = - GST_DEBUG_FUNCPTR (gst_d3d11_upload_transform_caps); - trans_class->propose_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_upload_propose_allocation); - trans_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_upload_decide_allocation); - trans_class->transform = GST_DEBUG_FUNCPTR (gst_d3d11_upload_transform); - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_upload_debug, - "d3d11upload", 0, "d3d11upload Element"); -} - -static void -gst_d3d11_upload_init (GstD3D11Upload * upload) -{ -} - -static GstCaps * -_set_caps_features (const GstCaps * caps, 
const gchar * feature_name) -{ - guint i, j, m, n; - GstCaps *tmp; - GstCapsFeatures *overlay_feature = - gst_caps_features_from_string - (GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION); - - tmp = gst_caps_new_empty (); - - n = gst_caps_get_size (caps); - for (i = 0; i < n; i++) { - GstCapsFeatures *features, *orig_features; - GstStructure *s = gst_caps_get_structure (caps, i); - - orig_features = gst_caps_get_features (caps, i); - features = gst_caps_features_new (feature_name, NULL); - - if (gst_caps_features_is_any (orig_features)) { - gst_caps_append_structure_full (tmp, gst_structure_copy (s), - gst_caps_features_copy (features)); - - if (!gst_caps_features_contains (features, - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION)) - gst_caps_features_add (features, - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION); - } else { - m = gst_caps_features_get_size (orig_features); - for (j = 0; j < m; j++) { - const gchar *feature = gst_caps_features_get_nth (orig_features, j); - - /* if we already have the features */ - if (gst_caps_features_contains (features, feature)) - continue; - - if (g_strcmp0 (feature, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY) == 0) - continue; - - if (gst_caps_features_contains (overlay_feature, feature)) { - gst_caps_features_add (features, feature); - } - } - } - - gst_caps_append_structure_full (tmp, gst_structure_copy (s), features); - } - - gst_caps_features_free (overlay_feature); - - return tmp; -} - -static GstCaps * -gst_d3d11_upload_transform_caps (GstBaseTransform * trans, - GstPadDirection direction, GstCaps * caps, GstCaps * filter) -{ - GstCaps *result, *tmp; - - GST_DEBUG_OBJECT (trans, - "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, - (direction == GST_PAD_SINK) ? 
"sink" : "src"); - - if (direction == GST_PAD_SINK) { - tmp = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); - tmp = gst_caps_merge (gst_caps_ref (caps), tmp); - } else { - GstCaps *newcaps; - tmp = gst_caps_ref (caps); - newcaps = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); - tmp = gst_caps_merge (tmp, newcaps); - } - - if (filter) { - result = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); - gst_caps_unref (tmp); - } else { - result = tmp; - } - - GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, result); - - return result; -} - -static gboolean -gst_d3d11_upload_propose_allocation (GstBaseTransform * trans, - GstQuery * decide_query, GstQuery * query) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstVideoInfo info; - GstBufferPool *pool; - GstCaps *caps; - guint size; - - if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, - decide_query, query)) - return FALSE; - - /* passthrough, we're done */ - if (decide_query == NULL) - return TRUE; - - gst_query_parse_allocation (query, &caps, NULL); - - if (caps == NULL) - return FALSE; - - if (!gst_video_info_from_caps (&info, caps)) - return FALSE; - - if (gst_query_get_n_allocation_pools (query) == 0) { - GstCapsFeatures *features; - GstStructure *config; - gboolean is_d3d11 = FALSE; - - features = gst_caps_get_features (caps, 0); - - if (features && gst_caps_features_contains (features, - GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { - GST_DEBUG_OBJECT (filter, "upstream support d3d11 memory"); - pool = gst_d3d11_buffer_pool_new (filter->device); - is_d3d11 = TRUE; - } else { - pool = gst_video_buffer_pool_new (); - } - - config = gst_buffer_pool_get_config (pool); - - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_META); - - /* d3d11 pool does not support video alignment */ - if (!is_d3d11) { - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); - } - - 
size = GST_VIDEO_INFO_SIZE (&info); - gst_buffer_pool_config_set_params (config, caps, size, 0, 0); - - if (!gst_buffer_pool_set_config (pool, config)) - goto config_failed; - - /* d3d11 buffer pool might update buffer size by self */ - if (is_d3d11) - size = GST_D3D11_BUFFER_POOL (pool)->buffer_size; - - gst_query_add_allocation_pool (query, pool, size, 0, 0); - gst_object_unref (pool); - } - - gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); - gst_query_add_allocation_meta (query, - GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL); - - return TRUE; - - /* ERRORS */ -config_failed: - { - GST_ERROR_OBJECT (filter, "failed to set config"); - gst_object_unref (pool); - return FALSE; - } -} - -static gboolean -gst_d3d11_upload_decide_allocation (GstBaseTransform * trans, GstQuery * query) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - GstCaps *outcaps = NULL; - GstBufferPool *pool = NULL; - guint size, min, max; - GstStructure *config; - gboolean update_pool = FALSE; - GstVideoInfo vinfo; - - gst_query_parse_allocation (query, &outcaps, NULL); - - if (!outcaps) - return FALSE; - - gst_video_info_from_caps (&vinfo, outcaps); - - if (gst_query_get_n_allocation_pools (query) > 0) { - gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); - if (pool && !GST_IS_D3D11_BUFFER_POOL (pool)) { - gst_object_unref (pool); - pool = NULL; - } - - update_pool = TRUE; - } else { - size = GST_VIDEO_INFO_SIZE (&vinfo); - min = max = 0; - } - - if (!pool) { - GST_DEBUG_OBJECT (trans, "create our pool"); - - pool = gst_d3d11_buffer_pool_new (filter->device); - } - - config = gst_buffer_pool_get_config (pool); - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - gst_buffer_pool_config_set_params (config, outcaps, size, min, max); - - gst_buffer_pool_set_config (pool, config); - - /* update size with calculated one */ - size = GST_D3D11_BUFFER_POOL (pool)->buffer_size; - - if (update_pool) - 
gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); - else - gst_query_add_allocation_pool (query, pool, size, min, max); - - gst_object_unref (pool); - - return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, - query); -} - -static GstFlowReturn -gst_d3d11_upload_transform (GstBaseTransform * trans, GstBuffer * inbuf, - GstBuffer * outbuf) -{ - GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); - - GstVideoFrame in_frame, out_frame; - gint i; - GstFlowReturn ret = GST_FLOW_OK; - - if (!gst_video_frame_map (&in_frame, &filter->in_info, inbuf, - GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF)) - goto invalid_buffer; - - if (!gst_video_frame_map (&out_frame, &filter->out_info, outbuf, - GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF)) { - gst_video_frame_unmap (&in_frame); - goto invalid_buffer; - } - - for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&in_frame); i++) { - if (!gst_video_frame_copy_plane (&out_frame, &in_frame, i)) { - GST_ERROR_OBJECT (filter, "Couldn't copy %dth plane", i); - ret = GST_FLOW_ERROR; - break; - } - } - - gst_video_frame_unmap (&out_frame); - gst_video_frame_unmap (&in_frame); - - return ret; - - /* ERRORS */ -invalid_buffer: - { - GST_ELEMENT_WARNING (filter, CORE, NOT_IMPLEMENTED, (NULL), - ("invalid video buffer received")); - return GST_FLOW_ERROR; - } -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11utils.c
Deleted
@@ -1,432 +0,0 @@ -/* GStreamer
- * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifdef HAVE_CONFIG_H
-#include "config.h"
-#endif
-
-#include "gstd3d11utils.h"
-#include "gstd3d11device.h"
-
-#include <windows.h>
-#include <versionhelpers.h>
-
-GST_DEBUG_CATEGORY_STATIC (GST_CAT_CONTEXT);
-GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_utils_debug);
-#define GST_CAT_DEFAULT gst_d3d11_utils_debug
-
-static void
-_init_context_debug (void)
-{
-  static gsize _init = 0;
-
-  if (g_once_init_enter (&_init)) {
-    GST_DEBUG_CATEGORY_GET (GST_CAT_CONTEXT, "GST_CONTEXT");
-    g_once_init_leave (&_init, 1);
-  }
-}
-
-/**
- * gst_d3d11_handle_set_context:
- * @element: a #GstElement
- * @context: a #GstContext
- * @adapter: preferred adapter index, or -1 to accept any adapter
- * @device: (inout) (transfer full): location of a #GstD3D11Device
- *
- * Helper function for implementing #GstElementClass.set_context() in
- * D3D11 capable elements.
- *
- * Retrieves the #GstD3D11Device in @context and places the result in @device.
- * - * Returns: whether the @device could be set successfully - */ -gboolean -gst_d3d11_handle_set_context (GstElement * element, GstContext * context, - gint adapter, GstD3D11Device ** device) -{ - const gchar *context_type; - - g_return_val_if_fail (GST_IS_ELEMENT (element), FALSE); - g_return_val_if_fail (device != NULL, FALSE); - - _init_context_debug (); - - if (!context) - return FALSE; - - context_type = gst_context_get_context_type (context); - if (g_strcmp0 (context_type, GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE) == 0) { - const GstStructure *str; - GstD3D11Device *other_device = NULL; - guint other_adapter = 0; - - /* If we had device already, will not replace it */ - if (*device) - return TRUE; - - str = gst_context_get_structure (context); - - if (gst_structure_get (str, "device", GST_TYPE_D3D11_DEVICE, - &other_device, "adapter", G_TYPE_UINT, &other_adapter, NULL)) { - if (adapter == -1 || (guint) adapter == other_adapter) { - GST_CAT_DEBUG_OBJECT (GST_CAT_CONTEXT, - element, "Found D3D11 device context"); - *device = other_device; - - return TRUE; - } - - gst_object_unref (other_device); - } - } - - return FALSE; -} - -static void -context_set_d3d11_device (GstContext * context, GstD3D11Device * device) -{ - GstStructure *s; - guint adapter = 0; - guint device_id = 0; - guint vendor_id = 0; - gboolean hardware = FALSE; - gchar *desc = NULL; - - g_return_if_fail (context != NULL); - - g_object_get (G_OBJECT (device), "adapter", &adapter, "device-id", &device_id, - "vendor_id", &vendor_id, "hardware", &hardware, "description", &desc, - NULL); - - GST_CAT_LOG (GST_CAT_CONTEXT, - "setting GstD3D11Device(%" GST_PTR_FORMAT - ") with adapter %d on context(%" GST_PTR_FORMAT ")", - device, adapter, context); - - s = gst_context_writable_structure (context); - gst_structure_set (s, "device", GST_TYPE_D3D11_DEVICE, device, - "adapter", G_TYPE_UINT, adapter, - "device-id", G_TYPE_UINT, device_id, - "vendor-id", G_TYPE_UINT, vendor_id, - "hardware", G_TYPE_BOOLEAN, 
hardware, - "description", G_TYPE_STRING, GST_STR_NULL (desc), NULL); - g_free (desc); -} - -/** - * gst_d3d11_handle_context_query: - * @element: a #GstElement - * @query: a #GstQuery of type %GST_QUERY_CONTEXT - * @device: (transfer none) (nullable): a #GstD3D11Device - * - * Returns: Whether the @query was successfully responded to from the passed - * @device. - */ -gboolean -gst_d3d11_handle_context_query (GstElement * element, GstQuery * query, - GstD3D11Device * device) -{ - const gchar *context_type; - GstContext *context, *old_context; - - g_return_val_if_fail (GST_IS_ELEMENT (element), FALSE); - g_return_val_if_fail (GST_IS_QUERY (query), FALSE); - - _init_context_debug (); - - GST_LOG_OBJECT (element, "handle context query %" GST_PTR_FORMAT, query); - - if (!device) - return FALSE; - - gst_query_parse_context_type (query, &context_type); - if (g_strcmp0 (context_type, GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE) != 0) - return FALSE; - - gst_query_parse_context (query, &old_context); - if (old_context) - context = gst_context_copy (old_context); - else - context = gst_context_new (GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE, TRUE); - - context_set_d3d11_device (context, device); - gst_query_set_context (query, context); - gst_context_unref (context); - - GST_DEBUG_OBJECT (element, "successfully set %" GST_PTR_FORMAT - " on %" GST_PTR_FORMAT, device, query); - - return TRUE; -} - -static gboolean -pad_query (const GValue * item, GValue * value, gpointer user_data) -{ - GstPad *pad = g_value_get_object (item); - GstQuery *query = user_data; - gboolean res; - - res = gst_pad_peer_query (pad, query); - - if (res) { - g_value_set_boolean (value, TRUE); - return FALSE; - } - - GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, pad, "pad peer query failed"); - return TRUE; -} - -static gboolean -run_query (GstElement * element, GstQuery * query, GstPadDirection direction) -{ - GstIterator *it; - GstIteratorFoldFunction func = pad_query; - GValue res = { 0 }; - - g_value_init (&res, 
G_TYPE_BOOLEAN); - g_value_set_boolean (&res, FALSE); - - /* Ask neighbor */ - if (direction == GST_PAD_SRC) - it = gst_element_iterate_src_pads (element); - else - it = gst_element_iterate_sink_pads (element); - - while (gst_iterator_fold (it, func, &res, query) == GST_ITERATOR_RESYNC) - gst_iterator_resync (it); - - gst_iterator_free (it); - - return g_value_get_boolean (&res); -} - -static void -run_d3d11_context_query (GstElement * element, GstD3D11Device ** device) -{ - GstQuery *query; - GstContext *ctxt = NULL; - - /* 1) Query downstream with GST_QUERY_CONTEXT for the context and - * check if downstream already has a context of the specific type - */ - query = gst_query_new_context (GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE); - if (run_query (element, query, GST_PAD_SRC)) { - gst_query_parse_context (query, &ctxt); - if (ctxt) { - GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, - "found context (%" GST_PTR_FORMAT ") in downstream query", ctxt); - gst_element_set_context (element, ctxt); - } - } - - /* 2) although we found d3d11 device context above, the element does not want - * to use the context. Then try to find from the other direction */ - if (*device == NULL && run_query (element, query, GST_PAD_SINK)) { - gst_query_parse_context (query, &ctxt); - if (ctxt) { - GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, - "found context (%" GST_PTR_FORMAT ") in upstream query", ctxt); - gst_element_set_context (element, ctxt); - } - } - - if (*device == NULL) { - /* 3) Post a GST_MESSAGE_NEED_CONTEXT message on the bus with - * the required context type and afterwards check if a - * usable context was set now as in 1). The message could - * be handled by the parent bins of the element and the - * application. 
-     */
-    GstMessage *msg;
-
-    GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element,
-        "posting need context message");
-    msg = gst_message_new_need_context (GST_OBJECT_CAST (element),
-        GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE);
-    gst_element_post_message (element, msg);
-  }
-
-  /*
-   * Whoever responds to the need-context message performs a
-   * GstElement::set_context() with the required context, in which the element
-   * is required to update the @device or call gst_d3d11_handle_set_context().
-   */
-
-  gst_query_unref (query);
-}
-
-/**
- * gst_d3d11_ensure_element_data:
- * @element: the #GstElement running the query
- * @adapter: preferred adapter index; pass adapter >= 0 when
- *           the adapter is explicitly required. Otherwise, set -1.
- * @device: (inout): the resulting #GstD3D11Device
- *
- * Perform the steps necessary for retrieving a #GstD3D11Device
- * from the surrounding elements or from the application using the #GstContext mechanism.
- *
- * If the contents of @device are not %NULL, then no #GstContext query is
- * necessary and no #GstD3D11Device retrieval is performed.
- * - * Returns: whether a #GstD3D11Device exists in @device - */ -gboolean -gst_d3d11_ensure_element_data (GstElement * element, gint adapter, - GstD3D11Device ** device) -{ - guint target_adapter = 0; - - g_return_val_if_fail (element != NULL, FALSE); - g_return_val_if_fail (device != NULL, FALSE); - - _init_context_debug (); - - if (*device) { - GST_LOG_OBJECT (element, "already have a device %" GST_PTR_FORMAT, *device); - return TRUE; - } - - run_d3d11_context_query (element, device); - if (*device) - return TRUE; - - if (adapter > 0) - target_adapter = adapter; - - *device = gst_d3d11_device_new (target_adapter); - - if (*device == NULL) { - GST_ERROR_OBJECT (element, - "Couldn't create new device with adapter index %d", target_adapter); - return FALSE; - } else { - GstContext *context; - GstMessage *msg; - - /* Propagate new D3D11 device context */ - - context = gst_context_new (GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE, TRUE); - context_set_d3d11_device (context, *device); - - gst_element_set_context (element, context); - - GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, - "posting have context (%p) message with D3D11 device context (%p)", - context, *device); - msg = gst_message_new_have_context (GST_OBJECT_CAST (element), context); - gst_element_post_message (GST_ELEMENT_CAST (element), msg); - } - - return TRUE; -} - -gboolean -gst_d3d11_is_windows_8_or_greater (void) -{ - static gsize version_once = 0; - static gboolean ret = FALSE; - - if (g_once_init_enter (&version_once)) { -#if (!GST_D3D11_WINAPI_ONLY_APP) - if (IsWindows8OrGreater ()) - ret = TRUE; -#else - ret = TRUE; -#endif - - g_once_init_leave (&version_once, 1); - } - - return ret; -} - -GstD3D11DeviceVendor -gst_d3d11_get_device_vendor (GstD3D11Device * device) -{ - guint device_id = 0; - guint vendor_id = 0; - gchar *desc = NULL; - GstD3D11DeviceVendor vendor = GST_D3D11_DEVICE_VENDOR_UNKNOWN; - - g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); - - g_object_get (device, "device-id", 
&device_id, "vendor-id", &vendor_id,
-      "description", &desc, NULL);
-
-  switch (vendor_id) {
-    case 0:
-      if (device_id == 0 && desc && g_strrstr (desc, "SraKmd"))
-        vendor = GST_D3D11_DEVICE_VENDOR_XBOX;
-      break;
-    case 0x1002:
-    case 0x1022:
-      vendor = GST_D3D11_DEVICE_VENDOR_AMD;
-      break;
-    case 0x8086:
-      vendor = GST_D3D11_DEVICE_VENDOR_INTEL;
-      break;
-    case 0x10de:
-      vendor = GST_D3D11_DEVICE_VENDOR_NVIDIA;
-      break;
-    case 0x4d4f4351:
-      vendor = GST_D3D11_DEVICE_VENDOR_QUALCOMM;
-      break;
-    default:
-      break;
-  }
-
-  g_free (desc);
-
-  return vendor;
-}
-
-gboolean
-_gst_d3d11_result (HRESULT hr, GstD3D11Device * device, GstDebugCategory * cat,
-    const gchar * file, const gchar * function, gint line)
-{
-#ifndef GST_DISABLE_GST_DEBUG
-  gboolean ret = TRUE;
-
-  if (FAILED (hr)) {
-    gchar *error_text = NULL;
-
-    error_text = g_win32_error_message ((guint) hr);
-    /* g_win32_error_message() doesn't cover every HRESULT return code,
-     * so it could be an empty string, or NULL if there was an error
-     * in g_utf16_to_utf8() */
-    gst_debug_log (cat, GST_LEVEL_WARNING, file, function, line,
-        NULL, "D3D11 call failed: 0x%x, %s", (guint) hr,
-        GST_STR_NULL (error_text));
-    g_free (error_text);
-
-    ret = FALSE;
-  }
-#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H)
-  if (device) {
-    gst_d3d11_device_d3d11_debug (device, file, function, line);
-    gst_d3d11_device_dxgi_debug (device, file, function, line);
-  }
-#endif
-
-  return ret;
-#else
-  return SUCCEEDED (hr);
-#endif
-}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11utils.h
Deleted
@@ -1,76 +0,0 @@ -/* GStreamer
- * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifndef __GST_D3D11_UTILS_H__
-#define __GST_D3D11_UTILS_H__
-
-#include <gst/gst.h>
-#include <gst/video/video.h>
-
-#include "gstd3d11_fwd.h"
-
-G_BEGIN_DECLS
-
-typedef enum
-{
-  GST_D3D11_DEVICE_VENDOR_UNKNOWN = 0,
-  GST_D3D11_DEVICE_VENDOR_AMD,
-  GST_D3D11_DEVICE_VENDOR_INTEL,
-  GST_D3D11_DEVICE_VENDOR_NVIDIA,
-  GST_D3D11_DEVICE_VENDOR_QUALCOMM,
-  GST_D3D11_DEVICE_VENDOR_XBOX,
-} GstD3D11DeviceVendor;
-
-gboolean  gst_d3d11_handle_set_context   (GstElement * element,
-                                          GstContext * context,
-                                          gint adapter,
-                                          GstD3D11Device ** device);
-
-gboolean  gst_d3d11_handle_context_query (GstElement * element,
-                                          GstQuery * query,
-                                          GstD3D11Device * device);
-
-gboolean  gst_d3d11_ensure_element_data  (GstElement * element,
-                                          gint adapter,
-                                          GstD3D11Device ** device);
-
-gboolean  gst_d3d11_is_windows_8_or_greater (void);
-
-GstD3D11DeviceVendor gst_d3d11_get_device_vendor (GstD3D11Device * device);
-
-gboolean  _gst_d3d11_result (HRESULT hr,
-                             GstD3D11Device * device,
-                             GstDebugCategory * cat,
-                             const gchar * file,
-                             const gchar * function,
-                             gint line);
-/**
- * gst_d3d11_result:
- * @result: D3D11 API return code #HRESULT
- * @device: (nullable): Associated #GstD3D11Device
- *
- * Returns: %TRUE if D3D11 API call result is SUCCESS
- */
-#define gst_d3d11_result(result,device) \
-    _gst_d3d11_result (result, device, GST_CAT_DEFAULT, __FILE__, GST_FUNCTION, __LINE__)
-
-
-G_END_DECLS
-
-#endif /* __GST_D3D11_UTILS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11videoprocessor.c
Deleted
@@ -1,558 +0,0 @@
-/* GStreamer
- * Copyright (C) <2020> Seungha Yang <seungha.yang@navercorp.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifdef HAVE_CONFIG_H
-# include <config.h>
-#endif
-
-#include "gstd3d11device.h"
-#include "gstd3d11utils.h"
-#include "gstd3d11format.h"
-#include "gstd3d11memory.h"
-#include "gstd3d11videoprocessor.h"
-
-#include <string.h>
-
-GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_video_processor_debug);
-#define GST_CAT_DEFAULT gst_d3d11_video_processor_debug
-
-#if (D3D11_HEADER_VERSION >= 1 && DXGI_HEADER_VERSION >= 4)
-#define HAVE_VIDEO_CONTEXT_ONE
-#endif
-
-#if (D3D11_HEADER_VERSION >= 4) && (DXGI_HEADER_VERSION >= 5)
-#define HAVE_VIDEO_CONTEXT_TWO
-#endif
-
-GQuark
-gst_d3d11_video_processor_input_view_quark (void)
-{
-  static gsize quark = 0;
-
-  if (g_once_init_enter (&quark)) {
-    GQuark q = g_quark_from_static_string ("GstD3D11VideoProcessorInputView");
-    g_once_init_leave (&quark, (gsize) q);
-  }
-
-  return (GQuark) quark;
-}
-
-GQuark
-gst_d3d11_video_processor_output_view_quark (void)
-{
-  static gsize quark = 0;
-
-  if (g_once_init_enter (&quark)) {
-    GQuark q = g_quark_from_static_string ("GstD3D11VideoProcessorOutputView");
-    g_once_init_leave (&quark, (gsize) q);
-  }
-
-  return (GQuark) quark;
-}
-
-struct _GstD3D11VideoProcessor
-{
-  GstD3D11Device *device;
-
-  ID3D11VideoDevice *video_device;
-  ID3D11VideoContext *video_context;
-#ifdef HAVE_VIDEO_CONTEXT_ONE
-  ID3D11VideoContext1 *video_context1;
-#endif
-#ifdef HAVE_VIDEO_CONTEXT_TWO
-  ID3D11VideoContext2 *video_context2;
-#endif
-  ID3D11VideoProcessor *processor;
-  ID3D11VideoProcessorEnumerator *enumerator;
-
-  D3D11_VIDEO_PROCESSOR_CAPS processor_caps;
-};
-
-GstD3D11VideoProcessor *
-gst_d3d11_video_processor_new (GstD3D11Device * device, guint in_width,
-    guint in_height, guint out_width, guint out_height)
-{
-  GstD3D11VideoProcessor *self;
-  ID3D11Device *device_handle;
-  ID3D11DeviceContext *context_handle;
-  HRESULT hr;
-  D3D11_VIDEO_PROCESSOR_CONTENT_DESC desc = { 0, };
-
-  g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL);
-
-  device_handle = gst_d3d11_device_get_device_handle (device);
-  context_handle = gst_d3d11_device_get_device_context_handle (device);
-
-  self = g_new0 (GstD3D11VideoProcessor, 1);
-  self->device = gst_object_ref (device);
-
-  hr = ID3D11Device_QueryInterface (device_handle,
-      &IID_ID3D11VideoDevice, (void **) &self->video_device);
-  if (!gst_d3d11_result (hr, device))
-    goto fail;
-
-  hr = ID3D11DeviceContext_QueryInterface (context_handle,
-      &IID_ID3D11VideoContext, (void **) &self->video_context);
-  if (!gst_d3d11_result (hr, device))
-    goto fail;
-
-  /* FIXME: Add support intelace */
-  desc.InputFrameFormat = D3D11_VIDEO_FRAME_FORMAT_PROGRESSIVE;
-  desc.InputWidth = in_width;
-  desc.InputHeight = in_height;
-  desc.OutputWidth = out_width;
-  desc.OutputHeight = out_height;
-  /* TODO: make option for this */
-  desc.Usage = D3D11_VIDEO_USAGE_PLAYBACK_NORMAL;
-
-  hr = ID3D11VideoDevice_CreateVideoProcessorEnumerator (self->video_device,
-      &desc, &self->enumerator);
-  if (!gst_d3d11_result (hr, device))
-    goto fail;
-
-  hr = ID3D11VideoProcessorEnumerator_GetVideoProcessorCaps (self->enumerator,
-      &self->processor_caps);
-  if (!gst_d3d11_result (hr, device))
-    goto fail;
-
-  hr = ID3D11VideoDevice_CreateVideoProcessor (self->video_device,
-      self->enumerator, 0, &self->processor);
-  if (!gst_d3d11_result (hr, device))
-    goto fail;
-
-#ifdef HAVE_VIDEO_CONTEXT_ONE
-  hr = ID3D11VideoContext_QueryInterface (self->video_context,
-      &IID_ID3D11VideoContext1, (void **) &self->video_context1);
-  if (gst_d3d11_result (hr, device)) {
-    GST_DEBUG ("ID3D11VideoContext1 interface available");
-  }
-#endif
-#ifdef HAVE_VIDEO_CONTEXT_TWO
-  hr = ID3D11VideoContext_QueryInterface (self->video_context,
-      &IID_ID3D11VideoContext2, (void **) &self->video_context2);
-  if (gst_d3d11_result (hr, device)) {
-    GST_DEBUG ("ID3D11VideoContext2 interface available");
-  }
-#endif
-
-  return self;
-
-fail:
-  gst_d3d11_video_processor_free (self);
-
-  return NULL;
-}
-
-void
-gst_d3d11_video_processor_free (GstD3D11VideoProcessor * processor)
-{
-  g_return_if_fail (processor != NULL);
-
-  if (processor->video_device)
-    ID3D11VideoDevice_Release (processor->video_device);
-  if (processor->video_context)
-    ID3D11VideoContext_Release (processor->video_context);
-#ifdef HAVE_VIDEO_CONTEXT_ONE
-  if (processor->video_context1)
-    ID3D11VideoContext1_Release (processor->video_context1);
-#endif
-#ifdef HAVE_VIDEO_CONTEXT_TWO
-  if (processor->video_context2)
-    ID3D11VideoContext2_Release (processor->video_context2);
-#endif
-  if (processor->processor)
-    ID3D11VideoProcessor_Release (processor->processor);
-  if (processor->enumerator)
-    ID3D11VideoProcessorEnumerator_Release (processor->enumerator);
-
-  gst_clear_object (&processor->device);
-  g_free (processor);
-}
-
-static gboolean
-gst_d3d11_video_processor_supports_format (GstD3D11VideoProcessor *
-    self, DXGI_FORMAT format, gboolean is_input)
-{
-  HRESULT hr;
-  UINT flag = 0;
-
-  hr = ID3D11VideoProcessorEnumerator_CheckVideoProcessorFormat
-      (self->enumerator, format, &flag);
-
-  if (!gst_d3d11_result (hr, self->device))
-    return FALSE;
-
-  if (is_input) {
-    /* D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_INPUT, missing in mingw header */
-    if ((flag & 0x1) != 0)
-      return TRUE;
-  } else {
-    /* D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_OUTPUT, missing in mingw header */
-    if ((flag & 0x2) != 0)
-      return TRUE;
-  }
-
-  return FALSE;
-}
-
-gboolean
-gst_d3d11_video_processor_supports_input_format (GstD3D11VideoProcessor *
-    processor, DXGI_FORMAT format)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-
-  if (format == DXGI_FORMAT_UNKNOWN)
-    return FALSE;
-
-  return gst_d3d11_video_processor_supports_format (processor, format, TRUE);
-}
-
-gboolean
-gst_d3d11_video_processor_supports_output_format (GstD3D11VideoProcessor *
-    processor, DXGI_FORMAT format)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-
-  if (format == DXGI_FORMAT_UNKNOWN)
-    return FALSE;
-
-  return gst_d3d11_video_processor_supports_format (processor, format, FALSE);
-}
-
-gboolean
-gst_d3d11_video_processor_get_caps (GstD3D11VideoProcessor * processor,
-    D3D11_VIDEO_PROCESSOR_CAPS * caps)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (caps != NULL, FALSE);
-
-  *caps = processor->processor_caps;
-
-  return TRUE;
-}
-
-static void
-video_processor_color_space_from_gst (GstD3D11VideoProcessor * self,
-    GstVideoColorimetry * color, D3D11_VIDEO_PROCESSOR_COLOR_SPACE * colorspace)
-{
-  /* D3D11_VIDEO_PROCESSOR_DEVICE_CAPS_xvYCC */
-  UINT can_xvYCC = 2;
-
-  /* 0: playback, 1: video processing */
-  colorspace->Usage = 0;
-
-  if (color->range == GST_VIDEO_COLOR_RANGE_0_255) {
-    colorspace->RGB_Range = 0;
-    colorspace->Nominal_Range = D3D11_VIDEO_PROCESSOR_NOMINAL_RANGE_0_255;
-  } else {
-    /* 16-235 */
-    colorspace->RGB_Range = 1;
-    colorspace->Nominal_Range = D3D11_VIDEO_PROCESSOR_NOMINAL_RANGE_16_235;
-  }
-
-  if (color->matrix == GST_VIDEO_COLOR_MATRIX_BT601) {
-    colorspace->YCbCr_Matrix = 0;
-  } else {
-    /* BT.709, no other options (such as BT2020) */
-    colorspace->YCbCr_Matrix = 1;
-  }
-
-  if ((self->processor_caps.DeviceCaps & can_xvYCC) == can_xvYCC) {
-    colorspace->YCbCr_xvYCC = 1;
-  } else {
-    colorspace->YCbCr_xvYCC = 0;
-  }
-}
-
-gboolean
-gst_d3d11_video_processor_set_input_color_space (GstD3D11VideoProcessor *
-    processor, GstVideoColorimetry * color)
-{
-  D3D11_VIDEO_PROCESSOR_COLOR_SPACE color_space;
-
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (color != NULL, FALSE);
-
-  video_processor_color_space_from_gst (processor, color, &color_space);
-
-  ID3D11VideoContext_VideoProcessorSetStreamColorSpace
-      (processor->video_context, processor->processor, 0, &color_space);
-
-  return TRUE;
-}
-
-gboolean
-gst_d3d11_video_processor_set_output_color_space (GstD3D11VideoProcessor *
-    processor, GstVideoColorimetry * color)
-{
-  D3D11_VIDEO_PROCESSOR_COLOR_SPACE color_space;
-
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (color != NULL, FALSE);
-
-  video_processor_color_space_from_gst (processor, color, &color_space);
-
-  ID3D11VideoContext_VideoProcessorSetOutputColorSpace
-      (processor->video_context, processor->processor, &color_space);
-
-  return TRUE;
-}
-
-#if (DXGI_HEADER_VERSION >= 4)
-gboolean
-gst_d3d11_video_processor_set_input_dxgi_color_space (GstD3D11VideoProcessor *
-    processor, DXGI_COLOR_SPACE_TYPE color_space)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-
-#ifdef HAVE_VIDEO_CONTEXT_ONE
-  if (processor->video_context1) {
-    ID3D11VideoContext1_VideoProcessorSetStreamColorSpace1
-        (processor->video_context1, processor->processor, 0, color_space);
-    return TRUE;
-  }
-#endif
-
-  return FALSE;
-}
-
-gboolean
-gst_d3d11_video_processor_set_output_dxgi_color_space (GstD3D11VideoProcessor *
-    processor, DXGI_COLOR_SPACE_TYPE color_space)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-
-#ifdef HAVE_VIDEO_CONTEXT_ONE
-  if (processor->video_context1) {
-    ID3D11VideoContext1_VideoProcessorSetOutputColorSpace1
-        (processor->video_context1, processor->processor, color_space);
-    return TRUE;
-  }
-#endif
-
-  return FALSE;
-}
-#endif
-
-#if (DXGI_HEADER_VERSION >= 5)
-/* D3D11_VIDEO_PROCESSOR_FEATURE_CAPS_METADATA_HDR10
- * missing in mingw header */
-#define FEATURE_CAPS_METADATA_HDR10 (0x800)
-
-gboolean
-gst_d3d11_video_processor_set_input_hdr10_metadata (GstD3D11VideoProcessor *
-    processor, DXGI_HDR_METADATA_HDR10 * hdr10_meta)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-
-#ifdef HAVE_VIDEO_CONTEXT_TWO
-  if (processor->video_context2 && (processor->processor_caps.FeatureCaps &
-          FEATURE_CAPS_METADATA_HDR10)) {
-    if (hdr10_meta) {
-      ID3D11VideoContext2_VideoProcessorSetStreamHDRMetaData
-          (processor->video_context2, processor->processor, 0,
-          DXGI_HDR_METADATA_TYPE_HDR10, sizeof (DXGI_HDR_METADATA_HDR10),
-          hdr10_meta);
-    } else {
-      ID3D11VideoContext2_VideoProcessorSetStreamHDRMetaData
-          (processor->video_context2, processor->processor, 0,
-          DXGI_HDR_METADATA_TYPE_NONE, 0, NULL);
-    }
-
-    return TRUE;
-  }
-#endif
-
-  return FALSE;
-}
-
-gboolean
-gst_d3d11_video_processor_set_output_hdr10_metadata (GstD3D11VideoProcessor *
-    processor, DXGI_HDR_METADATA_HDR10 * hdr10_meta)
-{
-  g_return_val_if_fail (processor != NULL, FALSE);
-
-#ifdef HAVE_VIDEO_CONTEXT_TWO
-  if (processor->video_context2 && (processor->processor_caps.FeatureCaps &
-          FEATURE_CAPS_METADATA_HDR10)) {
-    if (hdr10_meta) {
-      ID3D11VideoContext2_VideoProcessorSetOutputHDRMetaData
-          (processor->video_context2, processor->processor,
-          DXGI_HDR_METADATA_TYPE_HDR10, sizeof (DXGI_HDR_METADATA_HDR10),
-          hdr10_meta);
-    } else {
-      ID3D11VideoContext2_VideoProcessorSetOutputHDRMetaData
-          (processor->video_context2, processor->processor,
-          DXGI_HDR_METADATA_TYPE_NONE, 0, NULL);
-    }
-
-    return TRUE;
-  }
-#endif
-
-  return FALSE;
-}
-#endif
-
-gboolean
-gst_d3d11_video_processor_create_input_view (GstD3D11VideoProcessor * processor,
-    D3D11_VIDEO_PROCESSOR_INPUT_VIEW_DESC * desc, ID3D11Resource * resource,
-    ID3D11VideoProcessorInputView ** view)
-{
-  HRESULT hr;
-
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (desc != NULL, FALSE);
-  g_return_val_if_fail (resource != NULL, FALSE);
-  g_return_val_if_fail (view != NULL, FALSE);
-
-  hr = ID3D11VideoDevice_CreateVideoProcessorInputView (processor->video_device,
-      resource, processor->enumerator, desc, view);
-  if (!gst_d3d11_result (hr, processor->device))
-    return FALSE;
-
-  return TRUE;
-}
-
-gboolean
-gst_d3d11_video_processor_create_output_view (GstD3D11VideoProcessor *
-    processor, D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC * desc,
-    ID3D11Resource * resource, ID3D11VideoProcessorOutputView ** view)
-{
-  HRESULT hr;
-
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (desc != NULL, FALSE);
-  g_return_val_if_fail (resource != NULL, FALSE);
-  g_return_val_if_fail (view != NULL, FALSE);
-
-  hr = ID3D11VideoDevice_CreateVideoProcessorOutputView
-      (processor->video_device, resource, processor->enumerator, desc, view);
-  if (!gst_d3d11_result (hr, processor->device))
-    return FALSE;
-
-  return TRUE;
-}
-
-void
-gst_d3d11_video_processor_input_view_release (ID3D11VideoProcessorInputView *
-    view)
-{
-  if (!view)
-    return;
-
-  ID3D11VideoProcessorInputView_Release (view);
-}
-
-void
-gst_d3d11_video_processor_output_view_release (ID3D11VideoProcessorOutputView *
-    view)
-{
-  if (!view)
-    return;
-
-  ID3D11VideoProcessorOutputView_Release (view);
-}
-
-gboolean
-gst_d3d11_video_processor_render (GstD3D11VideoProcessor * processor,
-    RECT * in_rect, ID3D11VideoProcessorInputView * in_view,
-    RECT * out_rect, ID3D11VideoProcessorOutputView * out_view)
-{
-  gboolean ret;
-
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (in_view != NULL, FALSE);
-  g_return_val_if_fail (out_view != NULL, FALSE);
-
-  gst_d3d11_device_lock (processor->device);
-  ret = gst_d3d11_video_processor_render_unlocked (processor, in_rect, in_view,
-      out_rect, out_view);
-  gst_d3d11_device_unlock (processor->device);
-
-  return ret;
-}
-
-gboolean
-gst_d3d11_video_processor_render_unlocked (GstD3D11VideoProcessor * processor,
-    RECT * in_rect, ID3D11VideoProcessorInputView * in_view,
-    RECT * out_rect, ID3D11VideoProcessorOutputView * out_view)
-{
-  HRESULT hr;
-  D3D11_VIDEO_PROCESSOR_STREAM stream = { 0, };
-
-  g_return_val_if_fail (processor != NULL, FALSE);
-  g_return_val_if_fail (in_view != NULL, FALSE);
-  g_return_val_if_fail (out_view != NULL, FALSE);
-
-  stream.Enable = TRUE;
-  stream.pInputSurface = in_view;
-
-  if (in_rect) {
-    ID3D11VideoContext_VideoProcessorSetStreamSourceRect
-        (processor->video_context, processor->processor, 0, TRUE, in_rect);
-  } else {
-    ID3D11VideoContext_VideoProcessorSetStreamSourceRect
-        (processor->video_context, processor->processor, 0, FALSE, NULL);
-  }
-
-  if (out_rect) {
-    ID3D11VideoContext_VideoProcessorSetStreamDestRect
-        (processor->video_context, processor->processor, 0, TRUE, out_rect);
-    ID3D11VideoContext_VideoProcessorSetOutputTargetRect
-        (processor->video_context, processor->processor, TRUE, out_rect);
-  } else {
-    ID3D11VideoContext_VideoProcessorSetStreamDestRect
-        (processor->video_context, processor->processor, 0, FALSE, NULL);
-    ID3D11VideoContext_VideoProcessorSetOutputTargetRect
-        (processor->video_context, processor->processor, FALSE, NULL);
-  }
-
-  hr = ID3D11VideoContext_VideoProcessorBlt (processor->video_context,
-      processor->processor, out_view, 0, 1, &stream);
-  if (!gst_d3d11_result (hr, processor->device))
-    return FALSE;
-
-  return TRUE;
-}
-
-gboolean
-gst_d3d11_video_processor_check_bind_flags_for_input_view (guint bind_flags)
-{
-  static const guint compatible_flags = (D3D11_BIND_DECODER |
-      D3D11_BIND_VIDEO_ENCODER | D3D11_BIND_RENDER_TARGET |
-      D3D11_BIND_UNORDERED_ACCESS);
-
-  if (bind_flags == 0)
-    return TRUE;
-
-  if ((bind_flags & compatible_flags) != 0)
-    return TRUE;
-
-  return FALSE;
-}
-
-gboolean
-gst_d3d11_video_processor_check_bind_flags_for_output_view (guint bind_flags)
-{
-  if ((bind_flags & D3D11_BIND_RENDER_TARGET) == D3D11_BIND_RENDER_TARGET)
-    return TRUE;
-
-  return FALSE;
-}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11videosink.c
Deleted
@@ -1,1088 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11videosink.h" -#include "gstd3d11memory.h" -#include "gstd3d11utils.h" -#include "gstd3d11device.h" -#include "gstd3d11bufferpool.h" -#include "gstd3d11format.h" -#include "gstd3d11videoprocessor.h" - -#if GST_D3D11_WINAPI_ONLY_APP -#include "gstd3d11window_corewindow.h" -#include "gstd3d11window_swapchainpanel.h" -#else -#include "gstd3d11window_win32.h" -#endif - -enum -{ - PROP_0, - PROP_ADAPTER, - PROP_FORCE_ASPECT_RATIO, - PROP_ENABLE_NAVIGATION_EVENTS, - PROP_FULLSCREEN_TOGGLE_MODE, - PROP_FULLSCREEN, -}; - -#define DEFAULT_ADAPTER -1 -#define DEFAULT_FORCE_ASPECT_RATIO TRUE -#define DEFAULT_ENABLE_NAVIGATION_EVENTS TRUE -#define DEFAULT_FULLSCREEN_TOGGLE_MODE GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_NONE -#define DEFAULT_FULLSCREEN FALSE - -static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", - GST_PAD_SINK, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," 
- GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) - )); - -GST_DEBUG_CATEGORY (d3d11_video_sink_debug); -#define GST_CAT_DEFAULT d3d11_video_sink_debug - -static void gst_d3d11_videosink_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec); -static void gst_d3d11_videosink_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec); - -static void -gst_d3d11_video_sink_video_overlay_init (GstVideoOverlayInterface * iface); -static void -gst_d3d11_video_sink_navigation_init (GstNavigationInterface * iface); - -static void gst_d3d11_video_sink_set_context (GstElement * element, - GstContext * context); -static GstCaps *gst_d3d11_video_sink_get_caps (GstBaseSink * sink, - GstCaps * filter); -static gboolean gst_d3d11_video_sink_set_caps (GstBaseSink * sink, - GstCaps * caps); - -static gboolean gst_d3d11_video_sink_start (GstBaseSink * sink); -static gboolean gst_d3d11_video_sink_stop (GstBaseSink * sink); -static gboolean gst_d3d11_video_sink_propose_allocation (GstBaseSink * sink, - GstQuery * query); -static gboolean gst_d3d11_video_sink_query (GstBaseSink * sink, - GstQuery * query); -static gboolean gst_d3d11_video_sink_unlock (GstBaseSink * sink); -static gboolean gst_d3d11_video_sink_unlock_stop (GstBaseSink * sink); - -static GstFlowReturn -gst_d3d11_video_sink_show_frame (GstVideoSink * sink, GstBuffer * buf); -static gboolean gst_d3d11_video_sink_prepare_window (GstD3D11VideoSink * self); - -#define gst_d3d11_video_sink_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstD3D11VideoSink, gst_d3d11_video_sink, - GST_TYPE_VIDEO_SINK, - G_IMPLEMENT_INTERFACE (GST_TYPE_VIDEO_OVERLAY, - gst_d3d11_video_sink_video_overlay_init); - G_IMPLEMENT_INTERFACE (GST_TYPE_NAVIGATION, - gst_d3d11_video_sink_navigation_init); - GST_DEBUG_CATEGORY_INIT (d3d11_video_sink_debug, - "d3d11videosink", 0, "Direct3D11 Video Sink")); - -static void -gst_d3d11_video_sink_class_init 
(GstD3D11VideoSinkClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstBaseSinkClass *basesink_class = GST_BASE_SINK_CLASS (klass); - GstVideoSinkClass *videosink_class = GST_VIDEO_SINK_CLASS (klass); - - gobject_class->set_property = gst_d3d11_videosink_set_property; - gobject_class->get_property = gst_d3d11_videosink_get_property; - - g_object_class_install_property (gobject_class, PROP_ADAPTER, - g_param_spec_int ("adapter", "Adapter", - "Adapter index for creating device (-1 for default)", - -1, G_MAXINT32, DEFAULT_ADAPTER, - G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | - G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_FORCE_ASPECT_RATIO, - g_param_spec_boolean ("force-aspect-ratio", - "Force aspect ratio", - "When enabled, scaling will respect original aspect ratio", - DEFAULT_FORCE_ASPECT_RATIO, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_ENABLE_NAVIGATION_EVENTS, - g_param_spec_boolean ("enable-navigation-events", - "Enable navigation events", - "When enabled, navigation events are sent upstream", - DEFAULT_ENABLE_NAVIGATION_EVENTS, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_FULLSCREEN_TOGGLE_MODE, - g_param_spec_flags ("fullscreen-toggle-mode", - "Full screen toggle mode", - "Full screen toggle mode used to trigger fullscreen mode change", - GST_D3D11_WINDOW_TOGGLE_MODE_GET_TYPE, DEFAULT_FULLSCREEN_TOGGLE_MODE, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_FULLSCREEN, - g_param_spec_boolean ("fullscreen", - "fullscreen", - "Ignored when \"fullscreen-toggle-mode\" does not include \"property\"", - DEFAULT_FULLSCREEN, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - element_class->set_context = - GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_set_context); - - 
gst_element_class_set_static_metadata (element_class, - "Direct3D11 video sink", "Sink/Video", - "A Direct3D11 based videosink", - "Seungha Yang <seungha.yang@navercorp.com>"); - - gst_element_class_add_static_pad_template (element_class, &sink_template); - - basesink_class->get_caps = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_get_caps); - basesink_class->set_caps = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_set_caps); - basesink_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_start); - basesink_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_stop); - basesink_class->propose_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_propose_allocation); - basesink_class->query = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_query); - basesink_class->unlock = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_unlock); - basesink_class->unlock_stop = - GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_unlock_stop); - - videosink_class->show_frame = - GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_show_frame); - - gst_type_mark_as_plugin_api (GST_D3D11_WINDOW_TOGGLE_MODE_GET_TYPE, 0); -} - -static void -gst_d3d11_video_sink_init (GstD3D11VideoSink * self) -{ - self->adapter = DEFAULT_ADAPTER; - self->force_aspect_ratio = DEFAULT_FORCE_ASPECT_RATIO; - self->enable_navigation_events = DEFAULT_ENABLE_NAVIGATION_EVENTS; - self->fullscreen_toggle_mode = DEFAULT_FULLSCREEN_TOGGLE_MODE; - self->fullscreen = DEFAULT_FULLSCREEN; -} - -static void -gst_d3d11_videosink_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (object); - - GST_OBJECT_LOCK (self); - switch (prop_id) { - case PROP_ADAPTER: - self->adapter = g_value_get_int (value); - break; - case PROP_FORCE_ASPECT_RATIO: - self->force_aspect_ratio = g_value_get_boolean (value); - if (self->window) - g_object_set (self->window, - "force-aspect-ratio", self->force_aspect_ratio, NULL); - break; - case PROP_ENABLE_NAVIGATION_EVENTS: - 
self->enable_navigation_events = g_value_get_boolean (value); - if (self->window) { - g_object_set (self->window, - "enable-navigation-events", self->enable_navigation_events, NULL); - } - break; - case PROP_FULLSCREEN_TOGGLE_MODE: - self->fullscreen_toggle_mode = g_value_get_flags (value); - if (self->window) { - g_object_set (self->window, - "fullscreen-toggle-mode", self->fullscreen_toggle_mode, NULL); - } - break; - case PROP_FULLSCREEN: - self->fullscreen = g_value_get_boolean (value); - if (self->window) { - g_object_set (self->window, "fullscreen", self->fullscreen, NULL); - } - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } - GST_OBJECT_UNLOCK (self); -} - -static void -gst_d3d11_videosink_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (object); - - switch (prop_id) { - case PROP_ADAPTER: - g_value_set_int (value, self->adapter); - break; - case PROP_FORCE_ASPECT_RATIO: - g_value_set_boolean (value, self->force_aspect_ratio); - break; - case PROP_ENABLE_NAVIGATION_EVENTS: - g_value_set_boolean (value, self->enable_navigation_events); - break; - case PROP_FULLSCREEN_TOGGLE_MODE: - g_value_set_flags (value, self->fullscreen_toggle_mode); - break; - case PROP_FULLSCREEN: - if (self->window) { - g_object_get_property (G_OBJECT (self->window), pspec->name, value); - } else { - g_value_set_boolean (value, self->fullscreen); - } - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_d3d11_video_sink_set_context (GstElement * element, GstContext * context) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (element); - - gst_d3d11_handle_set_context (element, context, self->adapter, &self->device); - - GST_ELEMENT_CLASS (parent_class)->set_context (element, context); -} - -static GstCaps * -gst_d3d11_video_sink_get_caps (GstBaseSink * sink, GstCaps * filter) -{ - 
GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - GstCaps *caps = NULL; - - if (self->device && !self->can_convert) { - GstCaps *overlaycaps; - GstCapsFeatures *features; - - caps = gst_d3d11_device_get_supported_caps (self->device, - D3D11_FORMAT_SUPPORT_TEXTURE2D | D3D11_FORMAT_SUPPORT_DISPLAY); - overlaycaps = gst_caps_copy (caps); - features = gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, NULL); - gst_caps_set_features_simple (overlaycaps, features); - gst_caps_append (caps, overlaycaps); - } - - if (!caps) - caps = gst_pad_get_pad_template_caps (GST_VIDEO_SINK_PAD (sink)); - - if (caps && filter) { - GstCaps *isect; - isect = gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); - gst_caps_unref (caps); - caps = isect; - } - - return caps; -} - -static gboolean -gst_d3d11_video_sink_set_caps (GstBaseSink * sink, GstCaps * caps) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - gint video_width, video_height; - gint video_par_n, video_par_d; /* video's PAR */ - gint display_par_n = 1, display_par_d = 1; /* display's PAR */ - guint num, den; - GError *error = NULL; - GstStructure *config; - gint i; - - GST_DEBUG_OBJECT (self, "set caps %" GST_PTR_FORMAT, caps); - - if (!gst_d3d11_video_sink_prepare_window (self)) - goto no_window; - - if (!gst_video_info_from_caps (&self->info, caps)) - goto invalid_format; - - video_width = GST_VIDEO_INFO_WIDTH (&self->info); - video_height = GST_VIDEO_INFO_HEIGHT (&self->info); - video_par_n = GST_VIDEO_INFO_PAR_N (&self->info); - video_par_d = GST_VIDEO_INFO_PAR_D (&self->info); - - /* get aspect ratio from caps if it's present, and - * convert video width and height to a display width and height - * using wd / hd = wv / hv * PARv / PARd */ - - /* TODO: Get display PAR */ - - if (!gst_video_calculate_display_ratio (&num, &den, video_width, - video_height, video_par_n, video_par_d, display_par_n, display_par_d)) - goto 
no_disp_ratio; - - GST_DEBUG_OBJECT (sink, - "video width/height: %dx%d, calculated display ratio: %d/%d format: %s", - video_width, video_height, num, den, - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (&self->info))); - - /* now find a width x height that respects this display ratio. - * prefer those that have one of w/h the same as the incoming video - * using wd / hd = num / den - */ - - /* start with same height, because of interlaced video - * check hd / den is an integer scale factor, and scale wd with the PAR - */ - if (video_height % den == 0) { - GST_DEBUG_OBJECT (self, "keeping video height"); - GST_VIDEO_SINK_WIDTH (self) = (guint) - gst_util_uint64_scale_int (video_height, num, den); - GST_VIDEO_SINK_HEIGHT (self) = video_height; - } else if (video_width % num == 0) { - GST_DEBUG_OBJECT (self, "keeping video width"); - GST_VIDEO_SINK_WIDTH (self) = video_width; - GST_VIDEO_SINK_HEIGHT (self) = (guint) - gst_util_uint64_scale_int (video_width, den, num); - } else { - GST_DEBUG_OBJECT (self, "approximating while keeping video height"); - GST_VIDEO_SINK_WIDTH (self) = (guint) - gst_util_uint64_scale_int (video_height, num, den); - GST_VIDEO_SINK_HEIGHT (self) = video_height; - } - - GST_DEBUG_OBJECT (self, "scaling to %dx%d", - GST_VIDEO_SINK_WIDTH (self), GST_VIDEO_SINK_HEIGHT (self)); - self->video_width = video_width; - self->video_height = video_height; - - if (GST_VIDEO_SINK_WIDTH (self) <= 0 || GST_VIDEO_SINK_HEIGHT (self) <= 0) - goto no_display_size; - - GST_OBJECT_LOCK (self); - if (!self->pending_render_rect) { - self->render_rect.x = 0; - self->render_rect.y = 0; - self->render_rect.w = GST_VIDEO_SINK_WIDTH (self); - self->render_rect.h = GST_VIDEO_SINK_HEIGHT (self); - } - - gst_d3d11_window_set_render_rectangle (self->window, - self->render_rect.x, self->render_rect.y, self->render_rect.w, - self->render_rect.h); - self->pending_render_rect = FALSE; - GST_OBJECT_UNLOCK (self); - - self->have_video_processor = FALSE; - if 
(!gst_d3d11_window_prepare (self->window, GST_VIDEO_SINK_WIDTH (self), - GST_VIDEO_SINK_HEIGHT (self), caps, &self->have_video_processor, - &error)) { - GstMessage *error_msg; - - GST_ERROR_OBJECT (self, "cannot create swapchain"); - error_msg = gst_message_new_error (GST_OBJECT_CAST (self), - error, "Failed to prepare d3d11window"); - g_clear_error (&error); - gst_element_post_message (GST_ELEMENT (self), error_msg); - - return FALSE; - } - - if (self->fallback_pool) { - gst_buffer_pool_set_active (self->fallback_pool, FALSE); - gst_object_unref (self->fallback_pool); - } - - self->fallback_pool = gst_d3d11_buffer_pool_new (self->device); - config = gst_buffer_pool_get_config (self->fallback_pool); - gst_buffer_pool_config_set_params (config, - caps, GST_VIDEO_INFO_SIZE (&self->info), 0, 2); - - { - GstD3D11AllocationParams *d3d11_params; - gint bind_flags = D3D11_BIND_SHADER_RESOURCE; - - if (self->have_video_processor) { - /* To create video processor input view, one of following bind flags - * is required - * NOTE: Any texture arrays which were created with D3D11_BIND_DECODER flag - * cannot be used for shader input. 
- * - * D3D11_BIND_DECODER - * D3D11_BIND_VIDEO_ENCODER - * D3D11_BIND_RENDER_TARGET - * D3D11_BIND_UNORDERED_ACCESS_VIEW - */ - bind_flags |= D3D11_BIND_RENDER_TARGET; - } - - d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); - if (!d3d11_params) { - d3d11_params = gst_d3d11_allocation_params_new (self->device, - &self->info, 0, bind_flags); - } else { - /* Set bind flag */ - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&self->info); i++) { - d3d11_params->desc[i].BindFlags |= bind_flags; - } - } - - gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); - gst_d3d11_allocation_params_free (d3d11_params); - } - - gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); - gst_buffer_pool_set_config (self->fallback_pool, config); - - return TRUE; - - /* ERRORS */ -invalid_format: - { - GST_DEBUG_OBJECT (sink, - "Could not locate image format from caps %" GST_PTR_FORMAT, caps); - return FALSE; - } -no_window: - { - GST_ELEMENT_ERROR (sink, RESOURCE, NOT_FOUND, (NULL), - ("Failed to open window.")); - return FALSE; - } -no_disp_ratio: - { - GST_ELEMENT_ERROR (sink, CORE, NEGOTIATION, (NULL), - ("Error calculating the output display ratio of the video.")); - return FALSE; - } -no_display_size: - { - GST_ELEMENT_ERROR (sink, CORE, NEGOTIATION, (NULL), - ("Error calculating the output display ratio of the video.")); - return FALSE; - } -} - -static void -gst_d3d11_video_sink_key_event (GstD3D11Window * window, const gchar * event, - const gchar * key, GstD3D11VideoSink * self) -{ - if (self->enable_navigation_events) { - GST_LOG_OBJECT (self, "send key event %s, key %s", event, key); - gst_navigation_send_key_event (GST_NAVIGATION (self), event, key); - } -} - -static void -gst_d3d11_video_mouse_key_event (GstD3D11Window * window, const gchar * event, - gint button, gdouble x, gdouble y, GstD3D11VideoSink * self) -{ - if (self->enable_navigation_events) { - GST_LOG_OBJECT (self, - "send mouse event %s, button 
%d (%.1f, %.1f)", event, button, x, y); - gst_navigation_send_mouse_event (GST_NAVIGATION (self), event, button, x, - y); - } -} - -static gboolean -gst_d3d11_video_sink_start (GstBaseSink * sink) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - gboolean is_hardware = TRUE; - - GST_DEBUG_OBJECT (self, "Start"); - - if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), self->adapter, - &self->device)) { - GST_ERROR_OBJECT (sink, "Cannot create d3d11device"); - return FALSE; - } - - g_object_get (self->device, "hardware", &is_hardware, NULL); - if (!is_hardware) { - GST_WARNING_OBJECT (self, "D3D11 device is running on software emulation"); - self->can_convert = FALSE; - } else { - self->can_convert = TRUE; - } - - return TRUE; -} - -static gboolean -gst_d3d11_video_sink_prepare_window (GstD3D11VideoSink * self) -{ - GstD3D11WindowNativeType window_type = GST_D3D11_WINDOW_NATIVE_TYPE_HWND; - - if (self->window) - return TRUE; - - if (!self->window_id) - gst_video_overlay_prepare_window_handle (GST_VIDEO_OVERLAY (self)); - - if (self->window_id) { - window_type = - gst_d3d11_window_get_native_type_from_handle (self->window_id); - - if (window_type != GST_D3D11_WINDOW_NATIVE_TYPE_NONE) { - GST_DEBUG_OBJECT (self, "Have window handle %" G_GUINTPTR_FORMAT, - self->window_id); - gst_video_overlay_got_window_handle (GST_VIDEO_OVERLAY (self), - self->window_id); - } - } - - GST_DEBUG_OBJECT (self, "Create window (type: %s)", - gst_d3d11_window_get_native_type_to_string (window_type)); - -#if GST_D3D11_WINAPI_ONLY_APP - if (window_type != GST_D3D11_WINDOW_NATIVE_TYPE_CORE_WINDOW && - window_type != GST_D3D11_WINDOW_NATIVE_TYPE_SWAP_CHAIN_PANEL) { - GST_ERROR_OBJECT (self, "Overlay handle must be set before READY state"); - return FALSE; - } -#endif - - switch (window_type) { -#if (!GST_D3D11_WINAPI_ONLY_APP) - case GST_D3D11_WINDOW_NATIVE_TYPE_HWND: - self->window = gst_d3d11_window_win32_new (self->device, self->window_id); - break; -#else - case 
GST_D3D11_WINDOW_NATIVE_TYPE_CORE_WINDOW: - self->window = gst_d3d11_window_core_window_new (self->device, - self->window_id); - break; - case GST_D3D11_WINDOW_NATIVE_TYPE_SWAP_CHAIN_PANEL: - self->window = gst_d3d11_window_swap_chain_panel_new (self->device, - self->window_id); - break; -#endif - default: - break; - } - - if (!self->window) { - GST_ERROR_OBJECT (self, "Cannot create d3d11window"); - return FALSE; - } - - GST_OBJECT_LOCK (self); - g_object_set (self->window, - "force-aspect-ratio", self->force_aspect_ratio, - "fullscreen-toggle-mode", self->fullscreen_toggle_mode, - "fullscreen", self->fullscreen, - "enable-navigation-events", self->enable_navigation_events, NULL); - GST_OBJECT_UNLOCK (self); - - g_signal_connect (self->window, "key-event", - G_CALLBACK (gst_d3d11_video_sink_key_event), self); - g_signal_connect (self->window, "mouse-event", - G_CALLBACK (gst_d3d11_video_mouse_key_event), self); - - return TRUE; -} - -static gboolean -gst_d3d11_video_sink_stop (GstBaseSink * sink) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - - GST_DEBUG_OBJECT (self, "Stop"); - - if (self->fallback_pool) { - gst_buffer_pool_set_active (self->fallback_pool, FALSE); - gst_object_unref (self->fallback_pool); - self->fallback_pool = NULL; - } - - if (self->window) - gst_d3d11_window_unprepare (self->window); - - gst_clear_object (&self->device); - gst_clear_object (&self->window); - - return TRUE; -} - -static gboolean -gst_d3d11_video_sink_propose_allocation (GstBaseSink * sink, GstQuery * query) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - GstStructure *config; - GstCaps *caps; - GstBufferPool *pool = NULL; - GstVideoInfo info; - guint size; - gboolean need_pool; - - if (!self->device || !self->window) - return FALSE; - - gst_query_parse_allocation (query, &caps, &need_pool); - - if (caps == NULL) - goto no_caps; - - if (!gst_video_info_from_caps (&info, caps)) - goto invalid_caps; - - /* the normal size of a frame */ - size = 
info.size; - - if (need_pool) { - gint i; - GstD3D11AllocationParams *d3d11_params; - - GST_DEBUG_OBJECT (self, "create new pool"); - - pool = gst_d3d11_buffer_pool_new (self->device); - config = gst_buffer_pool_get_config (pool); - gst_buffer_pool_config_set_params (config, caps, size, 2, - DXGI_MAX_SWAP_CHAIN_BUFFERS); - - d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); - if (!d3d11_params) { - d3d11_params = gst_d3d11_allocation_params_new (self->device, &info, 0, - D3D11_BIND_SHADER_RESOURCE); - } else { - /* Set bind flag */ - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { - d3d11_params->desc[i].BindFlags |= D3D11_BIND_SHADER_RESOURCE; - } - } - - gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); - gst_d3d11_allocation_params_free (d3d11_params); - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_META); - - if (!gst_buffer_pool_set_config (pool, config)) { - g_object_unref (pool); - goto config_failed; - } - } - - gst_query_add_allocation_pool (query, pool, size, 2, 0); - if (pool) - g_object_unref (pool); - - gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); - gst_query_add_allocation_meta (query, - GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL); - - return TRUE; - - /* ERRORS */ -no_caps: - { - GST_WARNING_OBJECT (self, "no caps specified"); - return FALSE; - } -invalid_caps: - { - GST_WARNING_OBJECT (self, "invalid caps specified"); - return FALSE; - } -config_failed: - { - GST_WARNING_OBJECT (self, "failed setting config"); - return FALSE; - } - - return TRUE; -} - -static gboolean -gst_d3d11_video_sink_query (GstBaseSink * sink, GstQuery * query) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - - switch (GST_QUERY_TYPE (query)) { - case GST_QUERY_CONTEXT: - if (gst_d3d11_handle_context_query (GST_ELEMENT (self), query, - self->device)) { - return TRUE; - } - break; - default: - break; - } - - return GST_BASE_SINK_CLASS 
(parent_class)->query (sink, query); -} - -static gboolean -gst_d3d11_video_sink_unlock (GstBaseSink * sink) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - - if (self->window) - gst_d3d11_window_unlock (self->window); - - return TRUE; -} - -static gboolean -gst_d3d11_video_sink_unlock_stop (GstBaseSink * sink) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - - if (self->window) - gst_d3d11_window_unlock_stop (self->window); - - return TRUE; -} - -static gboolean -gst_d3d11_video_sink_upload_frame (GstD3D11VideoSink * self, GstBuffer * inbuf, - GstBuffer * outbuf) -{ - GstVideoFrame in_frame, out_frame; - gboolean ret; - gint i; - - GST_LOG_OBJECT (self, "Copy to fallback buffer"); - - if (!gst_video_frame_map (&in_frame, &self->info, inbuf, - GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF)) - goto invalid_buffer; - - if (!gst_video_frame_map (&out_frame, &self->info, outbuf, - GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF)) { - gst_video_frame_unmap (&in_frame); - goto invalid_buffer; - } - - ret = gst_video_frame_copy (&out_frame, &in_frame); - - gst_video_frame_unmap (&in_frame); - gst_video_frame_unmap (&out_frame); - - if (ret) { - /* map to upload staging texture to render texture */ - for (i = 0; i < gst_buffer_n_memory (outbuf); i++) { - GstMemory *mem; - GstMapInfo map; - - mem = gst_buffer_peek_memory (outbuf, i); - if (!gst_memory_map (mem, &map, (GST_MAP_READ | GST_MAP_D3D11))) { - GST_ERROR_OBJECT (self, "cannot upload staging texture"); - ret = FALSE; - break; - } - gst_memory_unmap (mem, &map); - } - } - - return ret; - - /* ERRORS */ -invalid_buffer: - { - GST_ELEMENT_WARNING (self, CORE, NOT_IMPLEMENTED, (NULL), - ("invalid video buffer received")); - return FALSE; - } -} - -static gboolean -gst_d3d11_video_sink_copy_d3d11_to_d3d11 (GstD3D11VideoSink * self, - GstBuffer * inbuf, GstBuffer * outbuf) -{ - gint i; - ID3D11DeviceContext *context_handle = - gst_d3d11_device_get_device_context_handle (self->device); - - 
g_return_val_if_fail (gst_buffer_n_memory (inbuf) == - gst_buffer_n_memory (outbuf), FALSE); - - GST_LOG_OBJECT (self, "Copy to fallback buffer using device memory copy"); - - gst_d3d11_device_lock (self->device); - for (i = 0; i < gst_buffer_n_memory (inbuf); i++) { - GstD3D11Memory *in_mem = - (GstD3D11Memory *) gst_buffer_peek_memory (inbuf, i); - GstD3D11Memory *out_mem = - (GstD3D11Memory *) gst_buffer_peek_memory (outbuf, i); - D3D11_BOX src_box; - - /* input buffer might be larger than render size */ - src_box.left = 0; - src_box.top = 0; - src_box.front = 0; - src_box.back = 1; - src_box.right = out_mem->desc.Width; - src_box.bottom = out_mem->desc.Height; - - ID3D11DeviceContext_CopySubresourceRegion (context_handle, - (ID3D11Resource *) out_mem->texture, 0, 0, 0, 0, - (ID3D11Resource *) in_mem->texture, in_mem->subresource_index, - &src_box); - } - gst_d3d11_device_unlock (self->device); - - return TRUE; -} - -static GstFlowReturn -gst_d3d11_video_sink_show_frame (GstVideoSink * sink, GstBuffer * buf) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); - GstMapInfo map; - GstFlowReturn ret; - GstVideoRectangle rect = { 0, }; - GstBuffer *render_buf; - gboolean need_unref = FALSE; - gboolean do_device_copy = TRUE; - gint i; - - render_buf = buf; - - for (i = 0; i < gst_buffer_n_memory (buf); i++) { - GstMemory *mem; - GstD3D11Memory *dmem; - - mem = gst_buffer_peek_memory (buf, i); - if (!gst_is_d3d11_memory (mem)) { - GST_LOG_OBJECT (sink, "not a d3d11 memory, need fallback"); - render_buf = NULL; - do_device_copy = FALSE; - break; - } - - dmem = (GstD3D11Memory *) mem; - if (dmem->device != self->device) { - GST_LOG_OBJECT (sink, "different d3d11 device, need fallback"); - render_buf = NULL; - do_device_copy = FALSE; - break; - } - - if (dmem->desc.Usage == D3D11_USAGE_DEFAULT) { - if (!gst_memory_map (mem, &map, (GST_MAP_READ | GST_MAP_D3D11))) { - GST_ERROR_OBJECT (self, "cannot map d3d11 memory"); - return GST_FLOW_ERROR; - } - - 
gst_memory_unmap (mem, &map); - } - - if (gst_buffer_n_memory (buf) == 1 && self->have_video_processor && - gst_d3d11_video_processor_check_bind_flags_for_input_view - (dmem->desc.BindFlags)) { - break; - } - - if (!gst_d3d11_memory_ensure_shader_resource_view (dmem)) { - GST_LOG_OBJECT (sink, - "shader resource view is unavailable, need fallback"); - render_buf = NULL; - /* keep run loop in order to upload staging memory to device memory */ - } - } - - if (!render_buf) { - if (!self->fallback_pool || - !gst_buffer_pool_set_active (self->fallback_pool, TRUE) || - gst_buffer_pool_acquire_buffer (self->fallback_pool, &render_buf, - NULL) != GST_FLOW_OK) { - GST_ERROR_OBJECT (self, "fallback pool is unavailable"); - - return GST_FLOW_ERROR; - } - - for (i = 0; i < gst_buffer_n_memory (render_buf); i++) { - GstD3D11Memory *dmem; - - dmem = (GstD3D11Memory *) gst_buffer_peek_memory (render_buf, i); - if (!gst_d3d11_memory_ensure_shader_resource_view (dmem)) { - GST_ERROR_OBJECT (self, "fallback shader resource view is unavailable"); - gst_buffer_unref (render_buf); - - return GST_FLOW_ERROR; - } - } - - if (do_device_copy) { - if (!gst_d3d11_video_sink_copy_d3d11_to_d3d11 (self, buf, render_buf)) { - GST_ERROR_OBJECT (self, "cannot copy frame"); - gst_buffer_unref (render_buf); - - return GST_FLOW_ERROR; - } - } else if (!gst_d3d11_video_sink_upload_frame (self, buf, render_buf)) { - GST_ERROR_OBJECT (self, "cannot upload frame"); - gst_buffer_unref (render_buf); - - return GST_FLOW_ERROR; - } - - need_unref = TRUE; - } - - gst_d3d11_window_show (self->window); - - /* FIXME: add support crop meta */ - rect.w = self->video_width; - rect.h = self->video_height; - - ret = gst_d3d11_window_render (self->window, render_buf, &rect); - if (need_unref) - gst_buffer_unref (render_buf); - - if (ret == GST_D3D11_WINDOW_FLOW_CLOSED) { - GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, - ("Output window was closed"), (NULL)); - - ret = GST_FLOW_ERROR; - } - - return ret; -} - -/* 
VideoOverlay interface */ -static void -gst_d3d11_video_sink_set_window_handle (GstVideoOverlay * overlay, - guintptr window_id) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (overlay); - - GST_DEBUG ("set window handle %" G_GUINTPTR_FORMAT, window_id); - - self->window_id = window_id; -} - -static void -gst_d3d11_video_sink_set_render_rectangle (GstVideoOverlay * overlay, gint x, - gint y, gint width, gint height) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (overlay); - - GST_DEBUG_OBJECT (self, - "render rect x: %d, y: %d, width: %d, height %d", x, y, width, height); - - GST_OBJECT_LOCK (self); - if (self->window) { - gst_d3d11_window_set_render_rectangle (self->window, x, y, width, height); - } else { - self->render_rect.x = x; - self->render_rect.y = y; - self->render_rect.w = width; - self->render_rect.h = height; - self->pending_render_rect = TRUE; - } - GST_OBJECT_UNLOCK (self); -} - -static void -gst_d3d11_video_sink_expose (GstVideoOverlay * overlay) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (overlay); - - if (self->window && self->window->swap_chain) { - GstVideoRectangle rect = { 0, }; - rect.w = GST_VIDEO_SINK_WIDTH (self); - rect.h = GST_VIDEO_SINK_HEIGHT (self); - - gst_d3d11_window_render (self->window, NULL, &rect); - } -} - -static void -gst_d3d11_video_sink_video_overlay_init (GstVideoOverlayInterface * iface) -{ - iface->set_window_handle = gst_d3d11_video_sink_set_window_handle; - iface->set_render_rectangle = gst_d3d11_video_sink_set_render_rectangle; - iface->expose = gst_d3d11_video_sink_expose; -} - -/* Navigation interface */ -static void -gst_d3d11_video_sink_navigation_send_event (GstNavigation * navigation, - GstStructure * structure) -{ - GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (navigation); - gboolean handled = FALSE; - GstEvent *event = NULL; - GstVideoRectangle src = { 0, }; - GstVideoRectangle dst = { 0, }; - GstVideoRectangle result; - gdouble x, y, xscale = 1.0, yscale = 1.0; - - if (!self->window) { 
- gst_structure_free (structure); - return; - } - - if (self->force_aspect_ratio) { - /* We get the frame position using the calculated geometry from _setcaps - that respect pixel aspect ratios */ - src.w = GST_VIDEO_SINK_WIDTH (self); - src.h = GST_VIDEO_SINK_HEIGHT (self); - dst.w = self->render_rect.w; - dst.h = self->render_rect.h; - - gst_video_sink_center_rect (src, dst, &result, TRUE); - result.x += self->render_rect.x; - result.y += self->render_rect.y; - } else { - memcpy (&result, &self->render_rect, sizeof (GstVideoRectangle)); - } - - xscale = (gdouble) GST_VIDEO_INFO_WIDTH (&self->info) / result.w; - yscale = (gdouble) GST_VIDEO_INFO_HEIGHT (&self->info) / result.h; - - /* Converting pointer coordinates to the non scaled geometry */ - if (gst_structure_get_double (structure, "pointer_x", &x)) { - x = MIN (x, result.x + result.w); - x = MAX (x - result.x, 0); - gst_structure_set (structure, "pointer_x", G_TYPE_DOUBLE, - (gdouble) x * xscale, NULL); - } - if (gst_structure_get_double (structure, "pointer_y", &y)) { - y = MIN (y, result.y + result.h); - y = MAX (y - result.y, 0); - gst_structure_set (structure, "pointer_y", G_TYPE_DOUBLE, - (gdouble) y * yscale, NULL); - } - - event = gst_event_new_navigation (structure); - if (event) { - gst_event_ref (event); - handled = gst_pad_push_event (GST_VIDEO_SINK_PAD (self), event); - - if (!handled) - gst_element_post_message (GST_ELEMENT_CAST (self), - gst_navigation_message_new_event (GST_OBJECT_CAST (self), event)); - - gst_event_unref (event); - } -} - -static void -gst_d3d11_video_sink_navigation_init (GstNavigationInterface * iface) -{ - iface->send_event = gst_d3d11_video_sink_navigation_send_event; -}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11videosinkbin.c
Deleted
@@ -1,352 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11videosinkbin.h" -#include "gstd3d11videosink.h" -#include "gstd3d11upload.h" -#include "gstd3d11colorconvert.h" -#include "gstd3d11memory.h" -#include "gstd3d11utils.h" -#include "gstd3d11device.h" -#include "gstd3d11format.h" - -enum -{ - PROP_0, - /* basesink */ - PROP_SYNC, - PROP_MAX_LATENESS, - PROP_QOS, - PROP_ASYNC, - PROP_TS_OFFSET, - PROP_ENABLE_LAST_SAMPLE, - PROP_LAST_SAMPLE, - PROP_BLOCKSIZE, - PROP_RENDER_DELAY, - PROP_THROTTLE_TIME, - PROP_MAX_BITRATE, - PROP_PROCESSING_DEADLINE, - PROP_STATS, - /* videosink */ - PROP_SHOW_PREROLL_FRAME, - /* d3d11videosink */ - PROP_ADAPTER, - PROP_FORCE_ASPECT_RATIO, - PROP_ENABLE_NAVIGATION_EVENTS, - PROP_FULLSCREEN_TOGGLE_MODE, - PROP_FULLSCREEN, -}; - -/* basesink */ -#define DEFAULT_SYNC TRUE -#define DEFAULT_MAX_LATENESS -1 -#define DEFAULT_QOS FALSE -#define DEFAULT_ASYNC TRUE -#define DEFAULT_TS_OFFSET 0 -#define DEFAULT_BLOCKSIZE 4096 -#define DEFAULT_RENDER_DELAY 0 -#define DEFAULT_ENABLE_LAST_SAMPLE TRUE -#define DEFAULT_THROTTLE_TIME 0 -#define DEFAULT_MAX_BITRATE 0 -#define 
DEFAULT_DROP_OUT_OF_SEGMENT TRUE -#define DEFAULT_PROCESSING_DEADLINE (20 * GST_MSECOND) - -/* videosink */ -#define DEFAULT_SHOW_PREROLL_FRAME TRUE - -/* d3d11videosink */ -#define DEFAULT_ADAPTER -1 -#define DEFAULT_FORCE_ASPECT_RATIO TRUE -#define DEFAULT_ENABLE_NAVIGATION_EVENTS TRUE -#define DEFAULT_FULLSCREEN_TOGGLE_MODE GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_NONE -#define DEFAULT_FULLSCREEN FALSE - -static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", - GST_PAD_SINK, - GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) ";" - GST_VIDEO_CAPS_MAKE (GST_D3D11_FORMATS) "; " - GST_VIDEO_CAPS_MAKE_WITH_FEATURES - (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," - GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, - GST_D3D11_FORMATS) - )); - -GST_DEBUG_CATEGORY (d3d11_video_sink_bin_debug); -#define GST_CAT_DEFAULT d3d11_video_sink_bin_debug - -static void gst_d3d11_video_sink_bin_set_property (GObject * object, - guint prop_id, const GValue * value, GParamSpec * pspec); -static void gst_d3d11_video_sink_bin_get_property (GObject * object, - guint prop_id, GValue * value, GParamSpec * pspec); - -static void -gst_d3d11_video_sink_bin_video_overlay_init (GstVideoOverlayInterface * iface); -static void -gst_d3d11_video_sink_bin_navigation_init (GstNavigationInterface * iface); - -#define gst_d3d11_video_sink_bin_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstD3D11VideoSinkBin, gst_d3d11_video_sink_bin, - GST_TYPE_BIN, - G_IMPLEMENT_INTERFACE (GST_TYPE_VIDEO_OVERLAY, - gst_d3d11_video_sink_bin_video_overlay_init); - G_IMPLEMENT_INTERFACE (GST_TYPE_NAVIGATION, - gst_d3d11_video_sink_bin_navigation_init); - GST_DEBUG_CATEGORY_INIT (d3d11_video_sink_bin_debug, - "d3d11videosink", 0, "Direct3D11 Video Sink")); 
- -static void -gst_d3d11_video_sink_bin_class_init (GstD3D11VideoSinkBinClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - - gobject_class->set_property = gst_d3d11_video_sink_bin_set_property; - gobject_class->get_property = gst_d3d11_video_sink_bin_get_property; - - /* basesink */ - g_object_class_install_property (gobject_class, PROP_SYNC, - g_param_spec_boolean ("sync", "Sync", "Sync on the clock", DEFAULT_SYNC, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_MAX_LATENESS, - g_param_spec_int64 ("max-lateness", "Max Lateness", - "Maximum number of nanoseconds that a buffer can be late before it " - "is dropped (-1 unlimited)", -1, G_MAXINT64, DEFAULT_MAX_LATENESS, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_QOS, - g_param_spec_boolean ("qos", "Qos", - "Generate Quality-of-Service events upstream", DEFAULT_QOS, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_ASYNC, - g_param_spec_boolean ("async", "Async", - "Go asynchronously to PAUSED", DEFAULT_ASYNC, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_TS_OFFSET, - g_param_spec_int64 ("ts-offset", "TS Offset", - "Timestamp offset in nanoseconds", G_MININT64, G_MAXINT64, - DEFAULT_TS_OFFSET, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_ENABLE_LAST_SAMPLE, - g_param_spec_boolean ("enable-last-sample", "Enable Last Buffer", - "Enable the last-sample property", DEFAULT_ENABLE_LAST_SAMPLE, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_LAST_SAMPLE, - g_param_spec_boxed ("last-sample", "Last Sample", - "The last sample received in the sink", GST_TYPE_SAMPLE, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); 
- g_object_class_install_property (gobject_class, PROP_BLOCKSIZE, - g_param_spec_uint ("blocksize", "Block size", - "Size in bytes to pull per buffer (0 = default)", 0, G_MAXUINT, - DEFAULT_BLOCKSIZE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_RENDER_DELAY, - g_param_spec_uint64 ("render-delay", "Render Delay", - "Additional render delay of the sink in nanoseconds", 0, G_MAXUINT64, - DEFAULT_RENDER_DELAY, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_THROTTLE_TIME, - g_param_spec_uint64 ("throttle-time", "Throttle time", - "The time to keep between rendered buffers (0 = disabled)", 0, - G_MAXUINT64, DEFAULT_THROTTLE_TIME, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_MAX_BITRATE, - g_param_spec_uint64 ("max-bitrate", "Max Bitrate", - "The maximum bits per second to render (0 = disabled)", 0, - G_MAXUINT64, DEFAULT_MAX_BITRATE, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_PROCESSING_DEADLINE, - g_param_spec_uint64 ("processing-deadline", "Processing deadline", - "Maximum processing deadline in nanoseconds", 0, G_MAXUINT64, - DEFAULT_PROCESSING_DEADLINE, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_STATS, - g_param_spec_boxed ("stats", "Statistics", - "Sink Statistics", GST_TYPE_STRUCTURE, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - /* videosink */ - g_object_class_install_property (gobject_class, PROP_SHOW_PREROLL_FRAME, - g_param_spec_boolean ("show-preroll-frame", "Show preroll frame", - "Whether to render video frames during preroll", - DEFAULT_SHOW_PREROLL_FRAME, - G_PARAM_READWRITE | G_PARAM_CONSTRUCT | G_PARAM_STATIC_STRINGS)); - - /* d3d11videosink */ - g_object_class_install_property (gobject_class, PROP_ADAPTER, - g_param_spec_int ("adapter", "Adapter", - "Adapter index 
for creating device (-1 for default)", - -1, G_MAXINT32, DEFAULT_ADAPTER, - G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | - G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_FORCE_ASPECT_RATIO, - g_param_spec_boolean ("force-aspect-ratio", - "Force aspect ratio", - "When enabled, scaling will respect original aspect ratio", - DEFAULT_FORCE_ASPECT_RATIO, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_ENABLE_NAVIGATION_EVENTS, - g_param_spec_boolean ("enable-navigation-events", - "Enable navigation events", - "When enabled, navigation events are sent upstream", - DEFAULT_ENABLE_NAVIGATION_EVENTS, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_FULLSCREEN_TOGGLE_MODE, - g_param_spec_flags ("fullscreen-toggle-mode", - "Full screen toggle mode", - "Full screen toggle mode used to trigger fullscreen mode change", - GST_D3D11_WINDOW_TOGGLE_MODE_GET_TYPE, DEFAULT_FULLSCREEN_TOGGLE_MODE, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_FULLSCREEN, - g_param_spec_boolean ("fullscreen", - "fullscreen", - "Ignored when \"fullscreen-toggle-mode\" does not include \"property\"", - DEFAULT_FULLSCREEN, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - gst_element_class_set_static_metadata (element_class, - "Direct3D11 video sink bin", "Sink/Video", - "A Direct3D11 based videosink", - "Seungha Yang <seungha.yang@navercorp.com>"); - - gst_element_class_add_static_pad_template (element_class, &sink_template); -} - -static void -gst_d3d11_video_sink_bin_init (GstD3D11VideoSinkBin * self) -{ - GstPad *pad; - - self->upload = gst_element_factory_make ("d3d11upload", NULL); - if (!self->upload) { - GST_ERROR_OBJECT (self, "d3d11upload unavailable"); - return; - } - - self->sink = gst_element_factory_make ("d3d11videosinkelement", NULL); - if (!self->sink) { - gst_clear_object (&self->upload); - 
GST_ERROR_OBJECT (self, "d3d11videosinkelement unavailable"); - return; - } - - gst_bin_add_many (GST_BIN (self), self->upload, self->sink, NULL); - - gst_element_link_many (self->upload, self->sink, NULL); - - pad = gst_element_get_static_pad (self->upload, "sink"); - - self->sinkpad = gst_ghost_pad_new ("sink", pad); - gst_element_add_pad (GST_ELEMENT_CAST (self), self->sinkpad); - gst_object_unref (pad); -} - -static void -gst_d3d11_video_sink_bin_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (object); - GParamSpec *sink_pspec; - - sink_pspec = g_object_class_find_property (G_OBJECT_GET_CLASS (self->sink), - pspec->name); - - if (sink_pspec && G_PARAM_SPEC_TYPE (sink_pspec) == G_PARAM_SPEC_TYPE (pspec)) { - g_object_set_property (G_OBJECT (self->sink), pspec->name, value); - } else { - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - } -} - -static void -gst_d3d11_video_sink_bin_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (object); - - g_object_get_property (G_OBJECT (self->sink), pspec->name, value); -} - -/* VideoOverlay interface */ -static void -gst_d3d11_video_sink_bin_set_window_handle (GstVideoOverlay * overlay, - guintptr window_id) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (overlay); - - gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (self->sink), - window_id); -} - -static void -gst_d3d11_video_sink_bin_set_render_rectangle (GstVideoOverlay * overlay, - gint x, gint y, gint width, gint height) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (overlay); - - gst_video_overlay_set_render_rectangle (GST_VIDEO_OVERLAY (self->sink), - x, y, width, height); -} - -static void -gst_d3d11_video_sink_bin_expose (GstVideoOverlay * overlay) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (overlay); - - 
gst_video_overlay_expose (GST_VIDEO_OVERLAY (self->sink)); -} - -static void -gst_d3d11_video_sink_bin_handle_events (GstVideoOverlay * overlay, - gboolean handle_events) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (overlay); - - gst_video_overlay_handle_events (GST_VIDEO_OVERLAY (self->sink), - handle_events); -} - -static void -gst_d3d11_video_sink_bin_video_overlay_init (GstVideoOverlayInterface * iface) -{ - iface->set_window_handle = gst_d3d11_video_sink_bin_set_window_handle; - iface->set_render_rectangle = gst_d3d11_video_sink_bin_set_render_rectangle; - iface->expose = gst_d3d11_video_sink_bin_expose; - iface->handle_events = gst_d3d11_video_sink_bin_handle_events; -} - -/* Navigation interface */ -static void -gst_d3d11_video_sink_bin_navigation_send_event (GstNavigation * navigation, - GstStructure * structure) -{ - GstD3D11VideoSinkBin *self = GST_D3D11_VIDEO_SINK_BIN (navigation); - - gst_navigation_send_event (GST_NAVIGATION (self->sink), structure); -} - -static void -gst_d3d11_video_sink_bin_navigation_init (GstNavigationInterface * iface) -{ - iface->send_event = gst_d3d11_video_sink_bin_navigation_send_event; -}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11videosinkbin.h
Deleted
@@ -1,63 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifndef __GST_D3D11_VIDEO_SINK_BIN_H__ -#define __GST_D3D11_VIDEO_SINK_BIN_H__ - -#include <gst/gst.h> -#include <gst/video/video.h> -#include <gst/video/gstvideosink.h> -#include <gst/video/videooverlay.h> -#include <gst/video/navigation.h> - -#include "gstd3d11_fwd.h" - -G_BEGIN_DECLS - -#define GST_TYPE_D3D11_VIDEO_SINK_BIN (gst_d3d11_video_sink_bin_get_type()) -#define GST_D3D11_VIDEO_SINK_BIN(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_VIDEO_SINK_BIN,GstD3D11VideoSinkBin)) -#define GST_D3D11_VIDEO_SINK_BIN_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_D3D11_VIDEO_SINK_BIN,GstD3D11VideoSinkBinClass)) -#define GST_D3D11_VIDEO_SINK_BIN_GET_CLASS(obj) (GST_D3D11_VIDEO_SINK_BIN_CLASS(G_OBJECT_GET_CLASS(obj))) -#define GST_IS_D3D11_VIDEO_SINK_BIN(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_VIDEO_SINK_BIN)) -#define GST_IS_D3D11_VIDEO_SINK_BIN_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_VIDEO_SINK_BIN)) - -typedef struct _GstD3D11VideoSinkBin GstD3D11VideoSinkBin; -typedef struct _GstD3D11VideoSinkBinClass GstD3D11VideoSinkBinClass; - -struct _GstD3D11VideoSinkBin -{ - GstBin 
parent; - - GstPad *sinkpad; - - GstElement *upload; - GstElement *sink; -}; - -struct _GstD3D11VideoSinkBinClass -{ - GstBinClass parent_class; -}; - -GType gst_d3d11_video_sink_bin_get_type (void); - -G_END_DECLS - - -#endif /* __GST_D3D11_VIDEO_SINK_BIN_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11vp8dec.c
Deleted
@@ -1,935 +0,0 @@ -/* GStreamer - * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include <config.h> -#endif - -#include "gstd3d11vp8dec.h" -#include "gstd3d11memory.h" -#include "gstd3d11bufferpool.h" - -#include <gst/codecs/gstvp8decoder.h> -#include <string.h> - -/* HACK: to expose dxva data structure on UWP */ -#ifdef WINAPI_PARTITION_DESKTOP -#undef WINAPI_PARTITION_DESKTOP -#endif -#define WINAPI_PARTITION_DESKTOP 1 -#include <d3d9.h> -#include <dxva.h> - -GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_vp8_dec_debug); -#define GST_CAT_DEFAULT gst_d3d11_vp8_dec_debug - -enum -{ - PROP_0, - PROP_ADAPTER, - PROP_DEVICE_ID, - PROP_VENDOR_ID, -}; - -/* copied from d3d11.h since mingw header doesn't define them */ -DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_VP8_VLD, - 0x90b899ea, 0x3a62, 0x4705, 0x88, 0xb3, 0x8d, 0xf0, 0x4b, 0x27, 0x44, 0xe7); - -/* reference list 4 + 4 margin */ -#define NUM_OUTPUT_VIEW 8 - -typedef struct _GstD3D11Vp8Dec -{ - GstVp8Decoder parent; - - GstVideoCodecState *output_state; - GstD3D11Device *device; - GstD3D11Decoder *d3d11_decoder; - - guint width, height; - GstVideoFormat out_format; - - gboolean use_d3d11_output; -} GstD3D11Vp8Dec; - -typedef struct 
_GstD3D11Vp8DecClass -{ - GstVp8DecoderClass parent_class; - guint adapter; - guint device_id; - guint vendor_id; -} GstD3D11Vp8DecClass; - -static GstElementClass *parent_class = NULL; - -#define GST_D3D11_VP8_DEC(object) ((GstD3D11Vp8Dec *) (object)) -#define GST_D3D11_VP8_DEC_GET_CLASS(object) \ - (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11Vp8DecClass)) - -static void gst_d3d11_vp8_dec_get_property (GObject * object, - guint prop_id, GValue * value, GParamSpec * pspec); -static void gst_d3d11_vp8_dec_set_context (GstElement * element, - GstContext * context); - -static gboolean gst_d3d11_vp8_dec_open (GstVideoDecoder * decoder); -static gboolean gst_d3d11_vp8_dec_close (GstVideoDecoder * decoder); -static gboolean gst_d3d11_vp8_dec_negotiate (GstVideoDecoder * decoder); -static gboolean gst_d3d11_vp8_dec_decide_allocation (GstVideoDecoder * - decoder, GstQuery * query); -static gboolean gst_d3d11_vp8_dec_src_query (GstVideoDecoder * decoder, - GstQuery * query); - -/* GstVp8Decoder */ -static gboolean gst_d3d11_vp8_dec_new_sequence (GstVp8Decoder * decoder, - const GstVp8FrameHdr * frame_hdr); -static gboolean gst_d3d11_vp8_dec_new_picture (GstVp8Decoder * decoder, - GstVideoCodecFrame * frame, GstVp8Picture * picture); -static GstFlowReturn gst_d3d11_vp8_dec_output_picture (GstVp8Decoder * - decoder, GstVideoCodecFrame * frame, GstVp8Picture * picture); -static gboolean gst_d3d11_vp8_dec_start_picture (GstVp8Decoder * decoder, - GstVp8Picture * picture); -static gboolean gst_d3d11_vp8_dec_decode_picture (GstVp8Decoder * decoder, - GstVp8Picture * picture, GstVp8Parser * parser); -static gboolean gst_d3d11_vp8_dec_end_picture (GstVp8Decoder * decoder, - GstVp8Picture * picture); - -static void -gst_d3d11_vp8_dec_class_init (GstD3D11Vp8DecClass * klass, gpointer data) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - GstElementClass *element_class = GST_ELEMENT_CLASS (klass); - GstVideoDecoderClass *decoder_class = 
GST_VIDEO_DECODER_CLASS (klass); - GstVp8DecoderClass *vp8decoder_class = GST_VP8_DECODER_CLASS (klass); - GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; - gchar *long_name; - - gobject_class->get_property = gst_d3d11_vp8_dec_get_property; - - g_object_class_install_property (gobject_class, PROP_ADAPTER, - g_param_spec_uint ("adapter", "Adapter", - "DXGI Adapter index for creating device", - 0, G_MAXUINT32, cdata->adapter, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_DEVICE_ID, - g_param_spec_uint ("device-id", "Device Id", - "DXGI Device ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - g_object_class_install_property (gobject_class, PROP_VENDOR_ID, - g_param_spec_uint ("vendor-id", "Vendor Id", - "DXGI Vendor ID", 0, G_MAXUINT32, 0, - G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - parent_class = g_type_class_peek_parent (klass); - - klass->adapter = cdata->adapter; - klass->device_id = cdata->device_id; - klass->vendor_id = cdata->vendor_id; - - element_class->set_context = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_set_context); - - long_name = g_strdup_printf ("Direct3D11 VP8 %s Decoder", cdata->description); - gst_element_class_set_metadata (element_class, long_name, - "Codec/Decoder/Video/Hardware", - "A Direct3D11 based VP8 video decoder", - "Seungha Yang <seungha.yang@navercorp.com>"); - g_free (long_name); - - gst_element_class_add_pad_template (element_class, - gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, - cdata->sink_caps)); - gst_element_class_add_pad_template (element_class, - gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, - cdata->src_caps)); - gst_d3d11_decoder_class_data_free (cdata); - - decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_open); - decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_close); - decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_negotiate); - 
decoder_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_decide_allocation); - decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_src_query); - - vp8decoder_class->new_sequence = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_new_sequence); - vp8decoder_class->new_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_new_picture); - vp8decoder_class->output_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_output_picture); - vp8decoder_class->start_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_start_picture); - vp8decoder_class->decode_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_decode_picture); - vp8decoder_class->end_picture = - GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_end_picture); -} - -static void -gst_d3d11_vp8_dec_init (GstD3D11Vp8Dec * self) -{ -} - -static void -gst_d3d11_vp8_dec_get_property (GObject * object, guint prop_id, - GValue * value, GParamSpec * pspec) -{ - GstD3D11Vp8DecClass *klass = GST_D3D11_VP8_DEC_GET_CLASS (object); - - switch (prop_id) { - case PROP_ADAPTER: - g_value_set_uint (value, klass->adapter); - break; - case PROP_DEVICE_ID: - g_value_set_uint (value, klass->device_id); - break; - case PROP_VENDOR_ID: - g_value_set_uint (value, klass->vendor_id); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_d3d11_vp8_dec_set_context (GstElement * element, GstContext * context) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (element); - GstD3D11Vp8DecClass *klass = GST_D3D11_VP8_DEC_GET_CLASS (self); - - gst_d3d11_handle_set_context (element, context, klass->adapter, - &self->device); - - GST_ELEMENT_CLASS (parent_class)->set_context (element, context); -} - -static gboolean -gst_d3d11_vp8_dec_open (GstVideoDecoder * decoder) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - GstD3D11Vp8DecClass *klass = GST_D3D11_VP8_DEC_GET_CLASS (self); - - if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), klass->adapter, - &self->device)) { - 
GST_ERROR_OBJECT (self, "Cannot create d3d11device"); - return FALSE; - } - - self->d3d11_decoder = gst_d3d11_decoder_new (self->device); - - if (!self->d3d11_decoder) { - GST_ERROR_OBJECT (self, "Cannot create d3d11 decoder"); - gst_clear_object (&self->device); - return FALSE; - } - - return TRUE; -} - -static gboolean -gst_d3d11_vp8_dec_close (GstVideoDecoder * decoder) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - - gst_clear_object (&self->d3d11_decoder); - gst_clear_object (&self->device); - - return TRUE; -} - -static gboolean -gst_d3d11_vp8_dec_negotiate (GstVideoDecoder * decoder) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - GstVp8Decoder *vp8dec = GST_VP8_DECODER (decoder); - - if (!gst_d3d11_decoder_negotiate (decoder, vp8dec->input_state, - self->out_format, self->width, self->height, &self->output_state, - &self->use_d3d11_output)) - return FALSE; - - return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); -} - -static gboolean -gst_d3d11_vp8_dec_decide_allocation (GstVideoDecoder * decoder, - GstQuery * query) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - - if (!gst_d3d11_decoder_decide_allocation (decoder, query, self->device, - GST_D3D11_CODEC_VP8, self->use_d3d11_output)) - return FALSE; - - return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation - (decoder, query); -} - -static gboolean -gst_d3d11_vp8_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - - switch (GST_QUERY_TYPE (query)) { - case GST_QUERY_CONTEXT: - if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), - query, self->device)) { - return TRUE; - } - break; - default: - break; - } - - return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); -} - -static gboolean -gst_d3d11_vp8_dec_new_sequence (GstVp8Decoder * decoder, - const GstVp8FrameHdr * frame_hdr) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - static const GUID 
*profile_guid = &GST_GUID_D3D11_DECODER_PROFILE_VP8_VLD; - GstVideoInfo info; - - GST_LOG_OBJECT (self, "new sequence"); - - /* FIXME: support I420 */ - self->out_format = GST_VIDEO_FORMAT_NV12; - self->width = frame_hdr->width; - self->height = frame_hdr->height; - - gst_video_info_set_format (&info, - self->out_format, self->width, self->height); - - gst_d3d11_decoder_reset (self->d3d11_decoder); - if (!gst_d3d11_decoder_open (self->d3d11_decoder, GST_D3D11_CODEC_VP8, - &info, self->width, self->height, - NUM_OUTPUT_VIEW, &profile_guid, 1)) { - GST_ERROR_OBJECT (self, "Failed to create decoder"); - return FALSE; - } - - if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { - GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; - } - - return TRUE; -} - -static gboolean -gst_d3d11_vp8_dec_new_picture (GstVp8Decoder * decoder, - GstVideoCodecFrame * frame, GstVp8Picture * picture) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - GstBuffer *view_buffer; - GstD3D11Memory *mem; - - view_buffer = gst_d3d11_decoder_get_output_view_buffer (self->d3d11_decoder); - if (!view_buffer) { - GST_ERROR_OBJECT (self, "No available output view buffer"); - return FALSE; - } - - mem = (GstD3D11Memory *) gst_buffer_peek_memory (view_buffer, 0); - - GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT " (index %d)", - view_buffer, mem->subresource_index); - - gst_vp8_picture_set_user_data (picture, - view_buffer, (GDestroyNotify) gst_buffer_unref); - - GST_LOG_OBJECT (self, "New VP8 picture %p", picture); - - return TRUE; -} - -static GstFlowReturn -gst_d3d11_vp8_dec_output_picture (GstVp8Decoder * decoder, - GstVideoCodecFrame * frame, GstVp8Picture * picture) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); - GstBuffer *output_buffer = NULL; - GstBuffer *view_buffer; - - GST_LOG_OBJECT (self, "Outputting picture %p", picture); - - view_buffer = (GstBuffer *) 
gst_vp8_picture_get_user_data (picture); - - if (!view_buffer) { - GST_ERROR_OBJECT (self, "Could not get output view"); - goto error; - } - - if (!picture->frame_hdr.show_frame) { - GST_LOG_OBJECT (self, "Decode only picture %p", picture); - GST_VIDEO_CODEC_FRAME_SET_DECODE_ONLY (frame); - - gst_vp8_picture_unref (picture); - - return gst_video_decoder_finish_frame (vdec, frame); - } - - /* if downstream is d3d11 element and forward playback case, - * expose our decoder view without copy. In case of reverse playback, however, - * we cannot do that since baseclass will store the decoded buffer - * up to gop size but our dpb pool cannot be increased */ - if (self->use_d3d11_output && - gst_d3d11_decoder_supports_direct_rendering (self->d3d11_decoder) && - vdec->input_segment.rate > 0) { - GstMemory *mem; - - output_buffer = gst_buffer_ref (view_buffer); - mem = gst_buffer_peek_memory (output_buffer, 0); - GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); - } else { - output_buffer = gst_video_decoder_allocate_output_buffer (vdec); - } - - if (!output_buffer) { - GST_ERROR_OBJECT (self, "Couldn't allocate output buffer"); - goto error; - } - - frame->output_buffer = output_buffer; - - if (!gst_d3d11_decoder_process_output (self->d3d11_decoder, - &self->output_state->info, - picture->frame_hdr.width, picture->frame_hdr.height, - view_buffer, output_buffer)) { - GST_ERROR_OBJECT (self, "Failed to copy buffer"); - goto error; - } - - GST_LOG_OBJECT (self, "Finish frame %" GST_TIME_FORMAT, - GST_TIME_ARGS (GST_BUFFER_PTS (output_buffer))); - - gst_vp8_picture_unref (picture); - - return gst_video_decoder_finish_frame (vdec, frame); - -error: - gst_video_decoder_drop_frame (vdec, frame); - gst_vp8_picture_unref (picture); - - return GST_FLOW_ERROR; -} - -static GstD3D11DecoderOutputView * -gst_d3d11_vp8_dec_get_output_view_from_picture (GstD3D11Vp8Dec * self, - GstVp8Picture * picture) -{ - GstBuffer *view_buffer; - GstD3D11DecoderOutputView *view; 
- - view_buffer = (GstBuffer *) gst_vp8_picture_get_user_data (picture); - if (!view_buffer) { - GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); - return NULL; - } - - view = - gst_d3d11_decoder_get_output_view_from_buffer (self->d3d11_decoder, - view_buffer); - if (!view) { - GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); - return NULL; - } - - return view; -} - -static gboolean -gst_d3d11_vp8_dec_start_picture (GstVp8Decoder * decoder, - GstVp8Picture * picture) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - GstD3D11DecoderOutputView *view; - - view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, picture); - if (!view) { - GST_ERROR_OBJECT (self, "current picture does not have output view handle"); - return FALSE; - } - - GST_TRACE_OBJECT (self, "Begin frame"); - - if (!gst_d3d11_decoder_begin_frame (self->d3d11_decoder, view, 0, NULL)) { - GST_ERROR_OBJECT (self, "Failed to begin frame"); - return FALSE; - } - - return TRUE; -} - -static void -gst_d3d11_vp8_dec_copy_frame_params (GstD3D11Vp8Dec * self, - GstVp8Picture * picture, GstVp8Parser * parser, DXVA_PicParams_VP8 * params) -{ - const GstVp8FrameHdr *frame_hdr = &picture->frame_hdr; - gint i; - - /* 0: keyframe, 1: inter */ - params->frame_type = !frame_hdr->key_frame; - params->version = frame_hdr->version; - params->show_frame = frame_hdr->show_frame; - params->clamp_type = frame_hdr->clamping_type; - - params->filter_type = frame_hdr->filter_type; - params->filter_level = frame_hdr->loop_filter_level; - params->sharpness_level = frame_hdr->sharpness_level; - params->mode_ref_lf_delta_enabled = - parser->mb_lf_adjust.loop_filter_adj_enable; - params->mode_ref_lf_delta_update = - parser->mb_lf_adjust.mode_ref_lf_delta_update; - for (i = 0; i < 4; i++) { - params->ref_lf_deltas[i] = parser->mb_lf_adjust.ref_frame_delta[i]; - params->mode_lf_deltas[i] = parser->mb_lf_adjust.mb_mode_delta[i]; - } - 
params->log2_nbr_of_dct_partitions = frame_hdr->log2_nbr_of_dct_partitions; - params->base_qindex = frame_hdr->quant_indices.y_ac_qi; - params->y1dc_delta_q = frame_hdr->quant_indices.y_dc_delta; - params->y2dc_delta_q = frame_hdr->quant_indices.y2_dc_delta; - params->y2ac_delta_q = frame_hdr->quant_indices.y2_ac_delta; - params->uvdc_delta_q = frame_hdr->quant_indices.uv_dc_delta; - params->uvac_delta_q = frame_hdr->quant_indices.uv_ac_delta; - - params->ref_frame_sign_bias_golden = frame_hdr->sign_bias_golden; - params->ref_frame_sign_bias_altref = frame_hdr->sign_bias_alternate; - - params->refresh_entropy_probs = frame_hdr->refresh_entropy_probs; - - memcpy (params->vp8_coef_update_probs, frame_hdr->token_probs.prob, - sizeof (frame_hdr->token_probs.prob)); - - params->mb_no_coeff_skip = frame_hdr->mb_no_skip_coeff; - params->prob_skip_false = frame_hdr->prob_skip_false; - params->prob_intra = frame_hdr->prob_intra; - params->prob_last = frame_hdr->prob_last; - params->prob_golden = frame_hdr->prob_gf; - - memcpy (params->intra_16x16_prob, frame_hdr->mode_probs.y_prob, - sizeof (frame_hdr->mode_probs.y_prob)); - memcpy (params->intra_chroma_prob, frame_hdr->mode_probs.uv_prob, - sizeof (frame_hdr->mode_probs.uv_prob)); - memcpy (params->vp8_mv_update_probs, frame_hdr->mv_probs.prob, - sizeof (frame_hdr->mv_probs.prob)); -} - -static void -gst_d3d11_vp8_dec_copy_reference_frames (GstD3D11Vp8Dec * self, - DXVA_PicParams_VP8 * params) -{ - GstVp8Decoder *decoder = GST_VP8_DECODER (self); - GstD3D11DecoderOutputView *view; - - if (decoder->alt_ref_picture) { - view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, - decoder->alt_ref_picture); - if (!view) { - GST_ERROR_OBJECT (self, "picture does not have output view handle"); - return; - } - - params->alt_fb_idx.Index7Bits = view->view_id; - } else { - params->alt_fb_idx.bPicEntry = 0xff; - } - - if (decoder->golden_ref_picture) { - view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, - 
decoder->golden_ref_picture); - if (!view) { - GST_ERROR_OBJECT (self, "picture does not have output view handle"); - return; - } - - params->gld_fb_idx.Index7Bits = view->view_id; - } else { - params->gld_fb_idx.bPicEntry = 0xff; - } - - if (decoder->last_picture) { - view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, - decoder->last_picture); - if (!view) { - GST_ERROR_OBJECT (self, "picture does not have output view handle"); - return; - } - - params->lst_fb_idx.Index7Bits = view->view_id; - } else { - params->lst_fb_idx.bPicEntry = 0xff; - } -} - -static void -gst_d3d11_vp8_dec_copy_segmentation_params (GstD3D11Vp8Dec * self, - GstVp8Parser * parser, DXVA_PicParams_VP8 * params) -{ - const GstVp8Segmentation *seg = &parser->segmentation; - gint i; - - params->stVP8Segments.segmentation_enabled = seg->segmentation_enabled; - params->stVP8Segments.update_mb_segmentation_map = - seg->update_mb_segmentation_map; - params->stVP8Segments.update_mb_segmentation_data = - seg->update_segment_feature_data; - params->stVP8Segments.mb_segement_abs_delta = seg->segment_feature_mode; - - for (i = 0; i < 4; i++) { - params->stVP8Segments.segment_feature_data[0][i] = - seg->quantizer_update_value[i]; - } - - for (i = 0; i < 4; i++) { - params->stVP8Segments.segment_feature_data[1][i] = seg->lf_update_value[i]; - } - - for (i = 0; i < 3; i++) { - params->stVP8Segments.mb_segment_tree_probs[i] = seg->segment_prob[i]; - } -} - -static gboolean -gst_d3d11_vp8_dec_submit_picture_data (GstD3D11Vp8Dec * self, - GstVp8Picture * picture, DXVA_PicParams_VP8 * params) -{ - guint d3d11_buffer_size; - gpointer d3d11_buffer; - gsize buffer_offset = 0; - gboolean is_first = TRUE; - - GST_TRACE_OBJECT (self, "Getting picture params buffer"); - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS, &d3d11_buffer_size, - &d3d11_buffer)) { - GST_ERROR_OBJECT (self, - "Failed to get decoder buffer for picture parameters"); - 
return FALSE; - } - - memcpy (d3d11_buffer, params, sizeof (DXVA_PicParams_VP8)); - - GST_TRACE_OBJECT (self, "Release picture param decoder buffer"); - - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS)) { - GST_ERROR_OBJECT (self, "Failed to release decoder buffer"); - return FALSE; - } - - if (!picture->data || !picture->size) { - GST_ERROR_OBJECT (self, "No data to submit"); - return FALSE; - } - - GST_TRACE_OBJECT (self, "Submit total %" G_GSIZE_FORMAT " bytes", - picture->size); - - while (buffer_offset < picture->size) { - gsize bytes_to_copy = picture->size - buffer_offset; - gsize written_buffer_size; - gboolean is_last = TRUE; - DXVA_Slice_VPx_Short slice_short = { 0, }; - D3D11_VIDEO_DECODER_BUFFER_DESC buffer_desc[3] = { 0, }; - gboolean bad_aligned_bitstream_buffer = FALSE; - - GST_TRACE_OBJECT (self, "Getting bitstream buffer"); - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_BITSTREAM, &d3d11_buffer_size, - &d3d11_buffer)) { - GST_ERROR_OBJECT (self, "Couldn't get bitstream buffer"); - goto error; - } - - if ((d3d11_buffer_size & 127) != 0) { - GST_WARNING_OBJECT (self, - "The size of bitstream buffer is not 128 bytes aligned"); - bad_aligned_bitstream_buffer = TRUE; - } - - if (bytes_to_copy > d3d11_buffer_size) { - /* if the size of this slice is larger than the size of remaining d3d11 - * decoder bitstream memory, write the data up to the remaining d3d11 - * decoder bitstream memory size and the rest would be written to the - * next d3d11 bitstream memory */ - bytes_to_copy = d3d11_buffer_size; - is_last = FALSE; - } - - memcpy (d3d11_buffer, picture->data + buffer_offset, bytes_to_copy); - written_buffer_size = bytes_to_copy; - - /* DXVA2 spec is saying that written bitstream data must be 128 bytes - * aligned if the bitstream buffer contains end of frame - * (i.e., wBadSliceChopping == 0 or 2) */ - if (is_last) { - guint padding = 
MIN (GST_ROUND_UP_128 (bytes_to_copy) - bytes_to_copy, - d3d11_buffer_size - bytes_to_copy); - - if (padding) { - GST_TRACE_OBJECT (self, - "Written bitstream buffer size %" G_GSIZE_FORMAT - " is not 128 bytes aligned, add padding %d bytes", - bytes_to_copy, padding); - memset ((guint8 *) d3d11_buffer + bytes_to_copy, 0, padding); - written_buffer_size += padding; - } - } - - GST_TRACE_OBJECT (self, "Release bitstream buffer"); - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_BITSTREAM)) { - GST_ERROR_OBJECT (self, "Failed to release bitstream buffer"); - - goto error; - } - - slice_short.BSNALunitDataLocation = 0; - slice_short.SliceBytesInBuffer = (UINT) written_buffer_size; - - /* wBadSliceChopping: (dxva spec.) - * 0: All bits for the slice are located within the corresponding - * bitstream data buffer - * 1: The bitstream data buffer contains the start of the slice, - * but not the entire slice, because the buffer is full - * 2: The bitstream data buffer contains the end of the slice. - * It does not contain the start of the slice, because the start of - * the slice was located in the previous bitstream data buffer. - * 3: The bitstream data buffer does not contain the start of the slice - * (because the start of the slice was located in the previous - * bitstream data buffer), and it does not contain the end of the slice - * (because the current bitstream data buffer is also full). 
- */ - if (is_last && is_first) { - slice_short.wBadSliceChopping = 0; - } else if (!is_last && is_first) { - slice_short.wBadSliceChopping = 1; - } else if (is_last && !is_first) { - slice_short.wBadSliceChopping = 2; - } else { - slice_short.wBadSliceChopping = 3; - } - - GST_TRACE_OBJECT (self, "Getting slice control buffer"); - if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL, &d3d11_buffer_size, - &d3d11_buffer)) { - GST_ERROR_OBJECT (self, "Couldn't get slice control buffer"); - - goto error; - } - - memcpy (d3d11_buffer, &slice_short, sizeof (DXVA_Slice_VPx_Short)); - - GST_TRACE_OBJECT (self, "Release slice control buffer"); - if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder, - D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL)) { - GST_ERROR_OBJECT (self, "Failed to release slice control buffer"); - - goto error; - } - - buffer_desc[0].BufferType = D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS; - buffer_desc[0].DataOffset = 0; - buffer_desc[0].DataSize = sizeof (DXVA_PicParams_VP8); - - buffer_desc[1].BufferType = D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL; - buffer_desc[1].DataOffset = 0; - buffer_desc[1].DataSize = sizeof (DXVA_Slice_VPx_Short); - - if (!bad_aligned_bitstream_buffer && (written_buffer_size & 127) != 0) { - GST_WARNING_OBJECT (self, - "Written bitstream buffer size %" G_GSIZE_FORMAT - " is not 128 bytes aligned", written_buffer_size); - } - - buffer_desc[2].BufferType = D3D11_VIDEO_DECODER_BUFFER_BITSTREAM; - buffer_desc[2].DataOffset = 0; - buffer_desc[2].DataSize = written_buffer_size; - - if (!gst_d3d11_decoder_submit_decoder_buffers (self->d3d11_decoder, - 3, buffer_desc)) { - GST_ERROR_OBJECT (self, "Couldn't submit decoder buffers"); - goto error; - } - - buffer_offset += bytes_to_copy; - is_first = FALSE; - } - - return TRUE; - -error: - return FALSE; -} - -static gboolean -gst_d3d11_vp8_dec_decode_picture (GstVp8Decoder * decoder, - GstVp8Picture * picture, 
GstVp8Parser * parser) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - DXVA_PicParams_VP8 pic_params = { 0, }; - GstD3D11DecoderOutputView *view; - const GstVp8FrameHdr *frame_hdr = &picture->frame_hdr; - - view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, picture); - if (!view) { - GST_ERROR_OBJECT (self, "current picture does not have output view handle"); - return FALSE; - } - - pic_params.first_part_size = frame_hdr->first_part_size; - pic_params.width = self->width; - pic_params.height = self->height; - pic_params.CurrPic.Index7Bits = view->view_id; - pic_params.StatusReportFeedbackNumber = 1; - - gst_d3d11_vp8_dec_copy_frame_params (self, picture, parser, &pic_params); - gst_d3d11_vp8_dec_copy_reference_frames (self, &pic_params); - gst_d3d11_vp8_dec_copy_segmentation_params (self, parser, &pic_params); - - return gst_d3d11_vp8_dec_submit_picture_data (self, picture, &pic_params); -} - -static gboolean -gst_d3d11_vp8_dec_end_picture (GstVp8Decoder * decoder, GstVp8Picture * picture) -{ - GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); - - if (!gst_d3d11_decoder_end_frame (self->d3d11_decoder)) { - GST_ERROR_OBJECT (self, "Failed to EndFrame"); - return FALSE; - } - - return TRUE; -} - -typedef struct -{ - guint width; - guint height; -} GstD3D11Vp8DecResolution; - -void -gst_d3d11_vp8_dec_register (GstPlugin * plugin, GstD3D11Device * device, - GstD3D11Decoder * decoder, guint rank) -{ - GType type; - gchar *type_name; - gchar *feature_name; - guint index = 0; - guint i; - GUID profile; - GTypeInfo type_info = { - sizeof (GstD3D11Vp8DecClass), - NULL, - NULL, - (GClassInitFunc) gst_d3d11_vp8_dec_class_init, - NULL, - NULL, - sizeof (GstD3D11Vp8Dec), - 0, - (GInstanceInitFunc) gst_d3d11_vp8_dec_init, - }; - static const GUID *profile_guid = &GST_GUID_D3D11_DECODER_PROFILE_VP8_VLD; - /* values were taken from chromium. 
See supported_profile_helper.cc */ - GstD3D11Vp8DecResolution resolutions_to_check[] = { - {1920, 1088}, {2560, 1440}, {3840, 2160}, {4096, 2160}, {4096, 2304} - }; - GstCaps *sink_caps = NULL; - GstCaps *src_caps = NULL; - guint max_width = 0; - guint max_height = 0; - guint resolution; - DXGI_FORMAT format = DXGI_FORMAT_NV12; - - if (!gst_d3d11_decoder_get_supported_decoder_profile (decoder, - &profile_guid, 1, &profile)) { - GST_INFO_OBJECT (device, "device does not support VP8 decoding"); - return; - } - - for (i = 0; i < G_N_ELEMENTS (resolutions_to_check); i++) { - if (gst_d3d11_decoder_supports_resolution (decoder, &profile, - format, resolutions_to_check[i].width, - resolutions_to_check[i].height)) { - max_width = resolutions_to_check[i].width; - max_height = resolutions_to_check[i].height; - - GST_DEBUG_OBJECT (device, - "device support resolution %dx%d", max_width, max_height); - } else { - break; - } - } - - if (max_width == 0 || max_height == 0) { - GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); - return; - } - - sink_caps = gst_caps_from_string ("video/x-vp8, " - "framerate = " GST_VIDEO_FPS_RANGE); - src_caps = gst_caps_from_string ("video/x-raw(" - GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "), " - "framerate = " GST_VIDEO_FPS_RANGE ";" - "video/x-raw, " "framerate = " GST_VIDEO_FPS_RANGE); - - gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); - - /* To cover both landscape and portrait, select max value */ - resolution = MAX (max_width, max_height); - gst_caps_set_simple (sink_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - gst_caps_set_simple (src_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - - type_info.class_data = - gst_d3d11_decoder_class_data_new (device, sink_caps, src_caps); - - type_name = g_strdup ("GstD3D11Vp8Dec"); - feature_name = g_strdup ("d3d11vp8dec"); - - while 
(g_type_from_name (type_name)) { - index++; - g_free (type_name); - g_free (feature_name); - type_name = g_strdup_printf ("GstD3D11Vp8Device%dDec", index); - feature_name = g_strdup_printf ("d3d11vp8device%ddec", index); - } - - type = g_type_register_static (GST_TYPE_VP8_DECODER, - type_name, &type_info, 0); - - /* make lower rank than default device */ - if (rank > 0 && index != 0) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11vp9dec.c
Deleted
@@ -1,1269 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - * - * NOTE: some of implementations are copied/modified from Chromium code - * - * Copyright 2015 The Chromium Authors. All rights reserved. - * - * Redistribution and use in source and binary forms, with or without - * modification, are permitted provided that the following conditions are - * met: - * - * * Redistributions of source code must retain the above copyright - * notice, this list of conditions and the following disclaimer. - * * Redistributions in binary form must reproduce the above - * copyright notice, this list of conditions and the following disclaimer - * in the documentation and/or other materials provided with the - * distribution. - * * Neither the name of Google Inc. nor the names of its - * contributors may be used to endorse or promote products derived from - * this software without specific prior written permission. - * - * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS - * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT - * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR - * A PARTICULAR PURPOSE ARE DISCLAIMED. 
- * IN NO EVENT SHALL THE COPYRIGHT
- * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
- * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
- * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
- * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
- * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
- * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
- * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
- */
-
-#ifdef HAVE_CONFIG_H
-#include <config.h>
-#endif
-
-#include "gstd3d11vp9dec.h"
-#include "gstd3d11memory.h"
-#include "gstd3d11bufferpool.h"
-
-#include <gst/codecs/gstvp9decoder.h>
-#include <string.h>
-
-/* HACK: to expose dxva data structure on UWP */
-#ifdef WINAPI_PARTITION_DESKTOP
-#undef WINAPI_PARTITION_DESKTOP
-#endif
-#define WINAPI_PARTITION_DESKTOP 1
-#include <d3d9.h>
-#include <dxva.h>
-
-GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_vp9_dec_debug);
-#define GST_CAT_DEFAULT gst_d3d11_vp9_dec_debug
-
-enum
-{
-  PROP_0,
-  PROP_ADAPTER,
-  PROP_DEVICE_ID,
-  PROP_VENDOR_ID,
-};
-
-/* copied from d3d11.h since mingw header doesn't define them */
-DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_PROFILE0,
-    0x463707f8, 0xa1d0, 0x4585, 0x87, 0x6d, 0x83, 0xaa, 0x6d, 0x60, 0xb8, 0x9e);
-DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_10BIT_PROFILE2,
-    0xa4c749ef, 0x6ecf, 0x48aa, 0x84, 0x48, 0x50, 0xa7, 0xa1, 0x16, 0x5f, 0xf7);
-
-/* reference list 8 + 4 margin */
-#define NUM_OUTPUT_VIEW 12
-
-typedef struct _GstD3D11Vp9Dec
-{
-  GstVp9Decoder parent;
-
-  GstVideoCodecState *output_state;
-
-  GstD3D11Device *device;
-
-  GstD3D11Decoder *d3d11_decoder;
-
-  guint width, height;
-  GstVP9Profile profile;
-
-  GstVideoFormat out_format;
-
-  gboolean use_d3d11_output;
-} GstD3D11Vp9Dec;
-
-typedef struct _GstD3D11Vp9DecClass
-{
-  GstVp9DecoderClass parent_class;
-  guint adapter;
-  guint device_id;
-  guint vendor_id;
-} GstD3D11Vp9DecClass;
-
-static GstElementClass *parent_class = NULL;
-
-#define GST_D3D11_VP9_DEC(object) ((GstD3D11Vp9Dec *) (object))
-#define GST_D3D11_VP9_DEC_GET_CLASS(object) \
-    (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11Vp9DecClass))
-
-static void gst_d3d11_vp9_dec_get_property (GObject * object,
-    guint prop_id, GValue * value, GParamSpec * pspec);
-static void gst_d3d11_vp9_dec_set_context (GstElement * element,
-    GstContext * context);
-
-static gboolean gst_d3d11_vp9_dec_open (GstVideoDecoder * decoder);
-static gboolean gst_d3d11_vp9_dec_close (GstVideoDecoder * decoder);
-static gboolean gst_d3d11_vp9_dec_negotiate (GstVideoDecoder * decoder);
-static gboolean gst_d3d11_vp9_dec_decide_allocation (GstVideoDecoder *
-    decoder, GstQuery * query);
-static gboolean gst_d3d11_vp9_dec_src_query (GstVideoDecoder * decoder,
-    GstQuery * query);
-
-/* GstVp9Decoder */
-static gboolean gst_d3d11_vp9_dec_new_sequence (GstVp9Decoder * decoder,
-    const GstVp9FrameHdr * frame_hdr);
-static gboolean gst_d3d11_vp9_dec_new_picture (GstVp9Decoder * decoder,
-    GstVideoCodecFrame * frame, GstVp9Picture * picture);
-static GstVp9Picture *gst_d3d11_vp9_dec_duplicate_picture (GstVp9Decoder *
-    decoder, GstVp9Picture * picture);
-static GstFlowReturn gst_d3d11_vp9_dec_output_picture (GstVp9Decoder *
-    decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture);
-static gboolean gst_d3d11_vp9_dec_start_picture (GstVp9Decoder * decoder,
-    GstVp9Picture * picture);
-static gboolean gst_d3d11_vp9_dec_decode_picture (GstVp9Decoder * decoder,
-    GstVp9Picture * picture, GstVp9Dpb * dpb);
-static gboolean gst_d3d11_vp9_dec_end_picture (GstVp9Decoder * decoder,
-    GstVp9Picture * picture);
-
-static void
-gst_d3d11_vp9_dec_class_init (GstD3D11Vp9DecClass * klass, gpointer data)
-{
-  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
-  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
-  GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass);
-  GstVp9DecoderClass *vp9decoder_class = GST_VP9_DECODER_CLASS (klass);
-  GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data;
-  gchar *long_name;
-
-  gobject_class->get_property = gst_d3d11_vp9_dec_get_property;
-
-  g_object_class_install_property (gobject_class, PROP_ADAPTER,
-      g_param_spec_uint ("adapter", "Adapter",
-          "DXGI Adapter index for creating device",
-          0, G_MAXUINT32, cdata->adapter,
-          G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
-  g_object_class_install_property (gobject_class, PROP_DEVICE_ID,
-      g_param_spec_uint ("device-id", "Device Id",
-          "DXGI Device ID", 0, G_MAXUINT32, 0,
-          G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
-  g_object_class_install_property (gobject_class, PROP_VENDOR_ID,
-      g_param_spec_uint ("vendor-id", "Vendor Id",
-          "DXGI Vendor ID", 0, G_MAXUINT32, 0,
-          G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
-
-  parent_class = g_type_class_peek_parent (klass);
-
-  klass->adapter = cdata->adapter;
-  klass->device_id = cdata->device_id;
-  klass->vendor_id = cdata->vendor_id;
-
-  element_class->set_context =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_set_context);
-
-  long_name = g_strdup_printf ("Direct3D11 VP9 %s Decoder", cdata->description);
-  gst_element_class_set_metadata (element_class, long_name,
-      "Codec/Decoder/Video/Hardware",
-      "A Direct3D11 based VP9 video decoder",
-      "Seungha Yang <seungha.yang@navercorp.com>");
-  g_free (long_name);
-
-  gst_element_class_add_pad_template (element_class,
-      gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
-          cdata->sink_caps));
-  gst_element_class_add_pad_template (element_class,
-      gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
-          cdata->src_caps));
-  gst_d3d11_decoder_class_data_free (cdata);
-
-  decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_open);
-  decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_close);
-  decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_negotiate);
-  decoder_class->decide_allocation =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_decide_allocation);
-  decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_src_query);
-
-  vp9decoder_class->new_sequence =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_new_sequence);
-  vp9decoder_class->new_picture =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_new_picture);
-  vp9decoder_class->duplicate_picture =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_duplicate_picture);
-  vp9decoder_class->output_picture =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_output_picture);
-  vp9decoder_class->start_picture =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_start_picture);
-  vp9decoder_class->decode_picture =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_decode_picture);
-  vp9decoder_class->end_picture =
-      GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_end_picture);
-}
-
-static void
-gst_d3d11_vp9_dec_init (GstD3D11Vp9Dec * self)
-{
-}
-
-static void
-gst_d3d11_vp9_dec_get_property (GObject * object, guint prop_id,
-    GValue * value, GParamSpec * pspec)
-{
-  GstD3D11Vp9DecClass *klass = GST_D3D11_VP9_DEC_GET_CLASS (object);
-
-  switch (prop_id) {
-    case PROP_ADAPTER:
-      g_value_set_uint (value, klass->adapter);
-      break;
-    case PROP_DEVICE_ID:
-      g_value_set_uint (value, klass->device_id);
-      break;
-    case PROP_VENDOR_ID:
-      g_value_set_uint (value, klass->vendor_id);
-      break;
-    default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
-      break;
-  }
-}
-
-static void
-gst_d3d11_vp9_dec_set_context (GstElement * element, GstContext * context)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (element);
-  GstD3D11Vp9DecClass *klass = GST_D3D11_VP9_DEC_GET_CLASS (self);
-
-  gst_d3d11_handle_set_context (element, context, klass->adapter,
-      &self->device);
-
-  GST_ELEMENT_CLASS (parent_class)->set_context (element, context);
-}
-
-static gboolean
-gst_d3d11_vp9_dec_open (GstVideoDecoder * decoder)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  GstD3D11Vp9DecClass *klass = GST_D3D11_VP9_DEC_GET_CLASS (self);
-
-  if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), klass->adapter,
-          &self->device)) {
-    GST_ERROR_OBJECT (self, "Cannot create d3d11device");
-    return FALSE;
-  }
-
-  self->d3d11_decoder = gst_d3d11_decoder_new (self->device);
-
-  if (!self->d3d11_decoder) {
-    GST_ERROR_OBJECT (self, "Cannot create d3d11 decoder");
-    gst_clear_object (&self->device);
-    return FALSE;
-  }
-
-  return TRUE;
-}
-
-static gboolean
-gst_d3d11_vp9_dec_close (GstVideoDecoder * decoder)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-
-  gst_clear_object (&self->d3d11_decoder);
-  gst_clear_object (&self->device);
-
-  return TRUE;
-}
-
-static gboolean
-gst_d3d11_vp9_dec_negotiate (GstVideoDecoder * decoder)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder);
-
-  if (!gst_d3d11_decoder_negotiate (decoder, vp9dec->input_state,
-          self->out_format, self->width, self->height, &self->output_state,
-          &self->use_d3d11_output))
-    return FALSE;
-
-  return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder);
-}
-
-static gboolean
-gst_d3d11_vp9_dec_decide_allocation (GstVideoDecoder * decoder,
-    GstQuery * query)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-
-  if (!gst_d3d11_decoder_decide_allocation (decoder, query, self->device,
-          GST_D3D11_CODEC_VP9, self->use_d3d11_output))
-    return FALSE;
-
-  return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation
-      (decoder, query);
-}
-
-static gboolean
-gst_d3d11_vp9_dec_src_query (GstVideoDecoder * decoder, GstQuery * query)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-
-  switch (GST_QUERY_TYPE (query)) {
-    case GST_QUERY_CONTEXT:
-      if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder),
-              query, self->device)) {
-        return TRUE;
-      }
-      break;
-    default:
-      break;
-  }
-
-  return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query);
-}
-
-static gboolean
-gst_d3d11_vp9_dec_new_sequence (GstVp9Decoder * decoder,
-    const GstVp9FrameHdr * frame_hdr)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  gboolean modified = FALSE;
-  static const GUID *profile0_guid =
-      &GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_PROFILE0;
-  static const GUID *profile2_guid =
-      &GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_10BIT_PROFILE2;
-
-  GST_LOG_OBJECT (self, "new sequence");
-
-  if (self->width < frame_hdr->width || self->height < frame_hdr->height) {
-    self->width = frame_hdr->width;
-    self->height = frame_hdr->height;
-    GST_INFO_OBJECT (self, "resolution changed %dx%d",
-        self->width, self->height);
-    modified = TRUE;
-  }
-
-  if (self->profile != frame_hdr->profile) {
-    self->profile = frame_hdr->profile;
-    GST_INFO_OBJECT (self, "profile changed %d", self->profile);
-    modified = TRUE;
-  }
-
-  if (modified || !self->d3d11_decoder->opened) {
-    const GUID *profile_guid = NULL;
-    GstVideoInfo info;
-
-    self->out_format = GST_VIDEO_FORMAT_UNKNOWN;
-
-    if (self->profile == GST_VP9_PROFILE_0) {
-      self->out_format = GST_VIDEO_FORMAT_NV12;
-      profile_guid = profile0_guid;
-    } else if (self->profile == GST_VP9_PROFILE_2) {
-      self->out_format = GST_VIDEO_FORMAT_P010_10LE;
-      profile_guid = profile2_guid;
-    }
-
-    if (self->out_format == GST_VIDEO_FORMAT_UNKNOWN) {
-      GST_ERROR_OBJECT (self, "Could not support profile %d", self->profile);
-      return FALSE;
-    }
-
-    gst_video_info_set_format (&info,
-        self->out_format, self->width, self->height);
-
-    gst_d3d11_decoder_reset (self->d3d11_decoder);
-    if (!gst_d3d11_decoder_open (self->d3d11_decoder, GST_D3D11_CODEC_VP9,
-            &info, self->width, self->height,
-            NUM_OUTPUT_VIEW, &profile_guid, 1)) {
-      GST_ERROR_OBJECT (self, "Failed to create decoder");
-      return FALSE;
-    }
-
-    if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) {
-      GST_ERROR_OBJECT (self, "Failed to negotiate with downstream");
-      return FALSE;
-    }
-  }
-
-  return TRUE;
-}
-
-static gboolean
-gst_d3d11_vp9_dec_new_picture (GstVp9Decoder * decoder,
-    GstVideoCodecFrame * frame, GstVp9Picture * picture)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  GstBuffer *view_buffer;
-  GstD3D11Memory *mem;
-
-  view_buffer = gst_d3d11_decoder_get_output_view_buffer (self->d3d11_decoder);
-  if (!view_buffer) {
-    GST_ERROR_OBJECT (self, "No available output view buffer");
-    return FALSE;
-  }
-
-  mem = (GstD3D11Memory *) gst_buffer_peek_memory (view_buffer, 0);
-
-  GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT " (index %d)",
-      view_buffer, mem->subresource_index);
-
-  gst_vp9_picture_set_user_data (picture,
-      view_buffer, (GDestroyNotify) gst_buffer_unref);
-
-  GST_LOG_OBJECT (self, "New VP9 picture %p", picture);
-
-  return TRUE;
-}
-
-static GstVp9Picture *
-gst_d3d11_vp9_dec_duplicate_picture (GstVp9Decoder * decoder,
-    GstVp9Picture * picture)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  GstBuffer *view_buffer;
-  GstD3D11Memory *mem;
-  GstVp9Picture *new_picture;
-
-  view_buffer = gst_vp9_picture_get_user_data (picture);
-
-  if (!view_buffer) {
-    GST_ERROR_OBJECT (self, "Parent picture does not have output view buffer");
-    return NULL;
-  }
-
-  new_picture = gst_vp9_picture_new ();
-  new_picture->frame_hdr = picture->frame_hdr;
-
-  mem = (GstD3D11Memory *) gst_buffer_peek_memory (view_buffer, 0);
-
-  GST_LOG_OBJECT (self, "Duplicate output with buffer %" GST_PTR_FORMAT
-      " (index %d)", view_buffer, mem->subresource_index);
-
-  gst_vp9_picture_set_user_data (new_picture,
-      gst_buffer_ref (view_buffer), (GDestroyNotify) gst_buffer_unref);
-
-  return new_picture;
-}
-
-static GstFlowReturn
-gst_d3d11_vp9_dec_output_picture (GstVp9Decoder * decoder,
-    GstVideoCodecFrame * frame, GstVp9Picture * picture)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder);
-  GstBuffer *output_buffer = NULL;
-  GstFlowReturn ret;
-  GstBuffer *view_buffer;
-
-  GST_LOG_OBJECT (self, "Outputting picture %p", picture);
-
-  view_buffer = (GstBuffer *) gst_vp9_picture_get_user_data (picture);
-
-  if (!view_buffer) {
-    GST_ERROR_OBJECT (self, "Could not get output view");
-    goto error;
-  }
-
-  if (!picture->frame_hdr.show_frame) {
-    GST_LOG_OBJECT (self, "Decode only picture %p", picture);
-    if (frame) {
-      GST_VIDEO_CODEC_FRAME_SET_DECODE_ONLY (frame);
-      gst_vp9_picture_unref (picture);
-
-      return gst_video_decoder_finish_frame (vdec, frame);
-    } else {
-      /* expected case if we are decoding super frame */
-      gst_vp9_picture_unref (picture);
-
-      return GST_FLOW_OK;
-    }
-  }
-
-  /* if downstream is d3d11 element and forward playback case,
-   * expose our decoder view without copy. In case of reverse playback, however,
-   * we cannot do that since baseclass will store the decoded buffer
-   * up to gop size but our dpb pool cannot be increased */
-  if (self->use_d3d11_output &&
-      gst_d3d11_decoder_supports_direct_rendering (self->d3d11_decoder) &&
-      GST_VIDEO_DECODER (self)->input_segment.rate > 0 &&
-      GST_VIDEO_INFO_WIDTH (&self->output_state->info) ==
-      picture->frame_hdr.width
-      && GST_VIDEO_INFO_HEIGHT (&self->output_state->info) ==
-      picture->frame_hdr.height) {
-    GstMemory *mem;
-
-    output_buffer = gst_buffer_ref (view_buffer);
-    mem = gst_buffer_peek_memory (output_buffer, 0);
-    GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD);
-  } else {
-    output_buffer = gst_video_decoder_allocate_output_buffer (vdec);
-  }
-
-  if (!output_buffer) {
-    GST_ERROR_OBJECT (self, "Couldn't allocate output buffer");
-    goto error;
-  }
-
-  if (!frame) {
-    /* this is the case where super frame has multiple displayable
-     * (non-decode-only) subframes. Should be rare case but it's possible
-     * in theory */
-    GST_WARNING_OBJECT (self, "No codec frame for picture %p", picture);
-
-    GST_BUFFER_PTS (output_buffer) = picture->pts;
-    GST_BUFFER_DTS (output_buffer) = GST_CLOCK_TIME_NONE;
-    GST_BUFFER_DURATION (output_buffer) = GST_CLOCK_TIME_NONE;
-  } else {
-    frame->output_buffer = output_buffer;
-    GST_BUFFER_PTS (output_buffer) = GST_BUFFER_PTS (frame->input_buffer);
-    GST_BUFFER_DTS (output_buffer) = GST_CLOCK_TIME_NONE;
-    GST_BUFFER_DURATION (output_buffer) =
-        GST_BUFFER_DURATION (frame->input_buffer);
-  }
-
-  if (!gst_d3d11_decoder_process_output (self->d3d11_decoder,
-          &self->output_state->info,
-          picture->frame_hdr.width, picture->frame_hdr.height,
-          view_buffer, output_buffer)) {
-    GST_ERROR_OBJECT (self, "Failed to copy buffer");
-    goto error;
-  }
-
-  GST_LOG_OBJECT (self, "Finish frame %" GST_TIME_FORMAT,
-      GST_TIME_ARGS (GST_BUFFER_PTS (output_buffer)));
-
-  gst_vp9_picture_unref (picture);
-
-  if (frame) {
-    ret = gst_video_decoder_finish_frame (vdec, frame);
-  } else {
-    ret = gst_pad_push (GST_VIDEO_DECODER_SRC_PAD (self), output_buffer);
-  }
-
-  return ret;
-
-error:
-  if (frame) {
-    /* normal case */
-    gst_video_decoder_drop_frame (vdec, frame);
-  } else if (output_buffer) {
-    /* in case of super frame with multiple displayable subframes */
-    gst_buffer_unref (output_buffer);
-  }
-  gst_vp9_picture_unref (picture);
-
-  return GST_FLOW_ERROR;
-}
-
-static GstD3D11DecoderOutputView *
-gst_d3d11_vp9_dec_get_output_view_from_picture (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture)
-{
-  GstBuffer *view_buffer;
-  GstD3D11DecoderOutputView *view;
-
-  view_buffer = (GstBuffer *) gst_vp9_picture_get_user_data (picture);
-  if (!view_buffer) {
-    GST_DEBUG_OBJECT (self, "current picture does not have output view buffer");
-    return NULL;
-  }
-
-  view =
-      gst_d3d11_decoder_get_output_view_from_buffer (self->d3d11_decoder,
-      view_buffer);
-  if (!view) {
-    GST_DEBUG_OBJECT (self, "current picture does not have output view handle");
-    return NULL;
-  }
-
-  return view;
-}
-
-static gboolean
-gst_d3d11_vp9_dec_start_picture (GstVp9Decoder * decoder,
-    GstVp9Picture * picture)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  GstD3D11DecoderOutputView *view;
-
-  view = gst_d3d11_vp9_dec_get_output_view_from_picture (self, picture);
-  if (!view) {
-    GST_ERROR_OBJECT (self, "current picture does not have output view handle");
-    return FALSE;
-  }
-
-  GST_TRACE_OBJECT (self, "Begin frame");
-
-  if (!gst_d3d11_decoder_begin_frame (self->d3d11_decoder, view, 0, NULL)) {
-    GST_ERROR_OBJECT (self, "Failed to begin frame");
-    return FALSE;
-  }
-
-  return TRUE;
-}
-
-static void
-gst_d3d11_vp9_dec_copy_frame_params (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, DXVA_PicParams_VP9 * params)
-{
-  const GstVp9FrameHdr *frame_hdr = &picture->frame_hdr;
-
-  params->profile = frame_hdr->profile;
-
-  params->frame_type = frame_hdr->frame_type;
-  params->show_frame = frame_hdr->show_frame;
-  params->error_resilient_mode = frame_hdr->error_resilient_mode;
-  params->subsampling_x = picture->subsampling_x;
-  params->subsampling_y = picture->subsampling_y;
-  params->refresh_frame_context = frame_hdr->refresh_frame_context;
-  params->frame_parallel_decoding_mode =
-      frame_hdr->frame_parallel_decoding_mode;
-  params->intra_only = frame_hdr->intra_only;
-  params->frame_context_idx = frame_hdr->frame_context_idx;
-  params->reset_frame_context = frame_hdr->reset_frame_context;
-  if (frame_hdr->frame_type == GST_VP9_KEY_FRAME)
-    params->allow_high_precision_mv = 0;
-  else
-    params->allow_high_precision_mv = frame_hdr->allow_high_precision_mv;
-
-  params->width = frame_hdr->width;
-  params->height = frame_hdr->height;
-  params->BitDepthMinus8Luma = picture->bit_depth - 8;
-  params->BitDepthMinus8Chroma = picture->bit_depth - 8;
-
-  params->interp_filter = frame_hdr->mcomp_filter_type;
-  params->log2_tile_cols = frame_hdr->log2_tile_columns;
-  params->log2_tile_rows = frame_hdr->log2_tile_rows;
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_TRACE_OBJECT (self, "Dump frame params");
-  /* set before calling this function */
-  GST_TRACE_OBJECT (self, "\tCurrPic.Index7Bits: %d",
-      params->CurrPic.Index7Bits);
-  GST_TRACE_OBJECT (self, "\tuncompressed_header_size_byte_aligned: %d",
-      params->uncompressed_header_size_byte_aligned);
-  GST_TRACE_OBJECT (self, "\tfirst_partition_size: %d",
-      params->first_partition_size);
-
-  GST_TRACE_OBJECT (self, "\tprofile: %d", params->profile);
-  GST_TRACE_OBJECT (self, "\tframe_type: %d", params->frame_type);
-  GST_TRACE_OBJECT (self, "\tshow_frame: %d", params->show_frame);
-  GST_TRACE_OBJECT (self, "\terror_resilient_mode: %d",
-      params->error_resilient_mode);
-  GST_TRACE_OBJECT (self, "\tsubsampling_x: %d", params->subsampling_x);
-  GST_TRACE_OBJECT (self, "\tsubsampling_t: %d", params->subsampling_y);
-  GST_TRACE_OBJECT (self, "\trefresh_frame_context: %d",
-      params->refresh_frame_context);
-  GST_TRACE_OBJECT (self, "\tframe_parallel_decoding_mode: %d",
-      params->frame_parallel_decoding_mode);
-  GST_TRACE_OBJECT (self, "\tintra_only: %d", params->intra_only);
-  GST_TRACE_OBJECT (self, "\tframe_context_idx: %d", params->frame_context_idx);
-  GST_TRACE_OBJECT (self, "\treset_frame_context: %d",
-      params->reset_frame_context);
-  GST_TRACE_OBJECT (self, "\tallow_high_precision_mv: %d",
-      params->allow_high_precision_mv);
-  GST_TRACE_OBJECT (self, "\twidth: %d", params->width);
-  GST_TRACE_OBJECT (self, "\theight: %d", params->height);
-  GST_TRACE_OBJECT (self, "\tBitDepthMinus8Luma: %d",
-      params->BitDepthMinus8Luma);
-  GST_TRACE_OBJECT (self, "\tBitDepthMinus8Chroma: %d",
-      params->BitDepthMinus8Chroma);
-  GST_TRACE_OBJECT (self, "\tinterp_filter: %d", params->interp_filter);
-  GST_TRACE_OBJECT (self, "\tlog2_tile_cols: %d", params->log2_tile_cols);
-  GST_TRACE_OBJECT (self, "\tlog2_tile_rows: %d", params->log2_tile_rows);
-#endif
-}
-
-static void
-gst_d3d11_vp9_dec_copy_reference_frames (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, GstVp9Dpb * dpb, DXVA_PicParams_VP9 * params)
-{
-  gint i;
-
-  for (i = 0; i < GST_VP9_REF_FRAMES; i++) {
-    if (dpb->pic_list[i]) {
-      GstVp9Picture *other_pic = dpb->pic_list[i];
-      GstD3D11DecoderOutputView *view;
-
-      view = gst_d3d11_vp9_dec_get_output_view_from_picture (self, other_pic);
-      if (!view) {
-        GST_ERROR_OBJECT (self, "picture does not have output view handle");
-        return;
-      }
-
-      params->ref_frame_map[i].Index7Bits = view->view_id;
-      params->ref_frame_coded_width[i] = picture->frame_hdr.width;
-      params->ref_frame_coded_height[i] = picture->frame_hdr.height;
-    } else {
-      params->ref_frame_map[i].bPicEntry = 0xff;
-      params->ref_frame_coded_width[i] = 0;
-      params->ref_frame_coded_height[i] = 0;
-    }
-  }
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_TRACE_OBJECT (self, "Dump reference frames");
-  for (i = 0; i < GST_VP9_REF_FRAMES; i++) {
-    GST_TRACE_OBJECT (self, "\t[%d] ref_frame_map.Index7Bits: %d", i,
-        params->ref_frame_map[i].Index7Bits);
-    GST_TRACE_OBJECT (self, "\t[%d] ref_frame_coded_width: %d", i,
-        params->ref_frame_coded_width[i]);
-    GST_TRACE_OBJECT (self, "\t[%d] ref_frame_coded_height: %d", i,
-        params->ref_frame_coded_height[i]);
-  }
-#endif
-}
-
-static void
-gst_d3d11_vp9_dec_copy_frame_refs (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, DXVA_PicParams_VP9 * params)
-{
-  const GstVp9FrameHdr *frame_hdr = &picture->frame_hdr;
-  gint i;
-
-  for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) {
-    params->frame_refs[i] =
-        params->ref_frame_map[frame_hdr->ref_frame_indices[i]];
-  }
-
-  for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) {
-    params->ref_frame_sign_bias[i + 1] = frame_hdr->ref_frame_sign_bias[i];
-  }
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_TRACE_OBJECT (self, "Dump frame refs");
-  for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) {
-    GST_TRACE_OBJECT (self, "\t[%d] ref_frame_indices: %d", i,
-        frame_hdr->ref_frame_indices[i]);
-    GST_TRACE_OBJECT (self, "\t[%d] frame_refs.Index7Bits: %d", i,
-        params->frame_refs[i].Index7Bits);
-    GST_TRACE_OBJECT (self, "\t[%d] ref_frame_sign_bias: %d", i,
-        params->ref_frame_sign_bias[i + 1]);
-  }
-#endif
-}
-
-static void
-gst_d3d11_vp9_dec_copy_loop_filter_params (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, DXVA_PicParams_VP9 * params)
-{
-  const GstVp9FrameHdr *frame_hdr = &picture->frame_hdr;
-  const GstVp9LoopFilter *loopfilter = &frame_hdr->loopfilter;
-  gint i;
-
-  params->filter_level = loopfilter->filter_level;
-  params->sharpness_level = loopfilter->sharpness_level;
-  params->mode_ref_delta_enabled = loopfilter->mode_ref_delta_enabled;
-  params->mode_ref_delta_update = loopfilter->mode_ref_delta_update;
-
-  for (i = 0; i < GST_VP9_MAX_REF_LF_DELTAS; i++) {
-    params->ref_deltas[i] = loopfilter->ref_deltas[i];
-  }
-
-  for (i = 0; i < GST_VP9_MAX_MODE_LF_DELTAS; i++) {
-    params->mode_deltas[i] = loopfilter->mode_deltas[i];
-  }
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_TRACE_OBJECT (self, "Dump loop filter params");
-  GST_TRACE_OBJECT (self, "\tfilter_level: %d", params->filter_level);
-  GST_TRACE_OBJECT (self, "\tsharpness_level: %d", params->sharpness_level);
-  GST_TRACE_OBJECT (self, "\tmode_ref_delta_enabled: %d",
-      params->mode_ref_delta_enabled);
-  GST_TRACE_OBJECT (self, "\tmode_ref_delta_update: %d",
-      params->mode_ref_delta_update);
-  for (i = 0; i < GST_VP9_MAX_REF_LF_DELTAS; i++)
-    GST_TRACE_OBJECT (self, "\tref_deltas[%d]: %d", i, params->ref_deltas[i]);
-  for (i = 0; i < GST_VP9_MAX_MODE_LF_DELTAS; i++)
-    GST_TRACE_OBJECT (self, "\tmode_deltas[%d]: %d", i, params->mode_deltas[i]);
-#endif
-}
-
-static void
-gst_d3d11_vp9_dec_copy_quant_params (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, DXVA_PicParams_VP9 * params)
-{
-  const GstVp9FrameHdr *frame_hdr = &picture->frame_hdr;
-  const GstVp9QuantIndices *quant_indices = &frame_hdr->quant_indices;
-
-  params->base_qindex = quant_indices->y_ac_qi;
-  params->y_dc_delta_q = quant_indices->y_dc_delta;
-  params->uv_dc_delta_q = quant_indices->uv_dc_delta;
-  params->uv_ac_delta_q = quant_indices->uv_ac_delta;
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_TRACE_OBJECT (self, "Dump quantization params");
-  GST_TRACE_OBJECT (self, "\tbase_qindex: %d", params->base_qindex);
-  GST_TRACE_OBJECT (self, "\ty_dc_delta_q: %d", params->y_dc_delta_q);
-  GST_TRACE_OBJECT (self, "\tuv_dc_delta_q: %d", params->uv_dc_delta_q);
-  GST_TRACE_OBJECT (self, "\tuv_ac_delta_q: %d", params->uv_ac_delta_q);
-#endif
-}
-
-static void
-gst_d3d11_vp9_dec_copy_segmentation_params (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, DXVA_PicParams_VP9 * params)
-{
-  const GstVp9FrameHdr *frame_hdr = &picture->frame_hdr;
-  const GstVp9SegmentationInfo *seg = &frame_hdr->segmentation;
-  gint i;
-
-  params->stVP9Segments.enabled = seg->enabled;
-  params->stVP9Segments.update_map = seg->update_map;
-  params->stVP9Segments.temporal_update = seg->temporal_update;
-  params->stVP9Segments.abs_delta = seg->abs_delta;
-
-  for (i = 0; i < GST_VP9_SEG_TREE_PROBS; i++)
-    params->stVP9Segments.tree_probs[i] = seg->tree_probs[i];
-
-  for (i = 0; i < GST_VP9_PREDICTION_PROBS; i++)
-    params->stVP9Segments.pred_probs[i] = seg->pred_probs[i];
-
-  for (i = 0; i < GST_VP9_MAX_SEGMENTS; i++) {
-    params->stVP9Segments.feature_mask[i] = 0;
-
-    if (seg->data[i].alternate_quantizer_enabled)
-      params->stVP9Segments.feature_mask[i] |= (1 << 0);
-
-    if (seg->data[i].alternate_loop_filter_enabled)
-      params->stVP9Segments.feature_mask[i] |= (1 << 1);
-
-    if (seg->data[i].reference_frame_enabled)
-      params->stVP9Segments.feature_mask[i] |= (1 << 2);
-
-    if (seg->data[i].reference_skip)
-      params->stVP9Segments.feature_mask[i] |= (1 << 3);
-
-    params->stVP9Segments.feature_data[i][0] = seg->data[i].alternate_quantizer;
-    params->stVP9Segments.feature_data[i][1] =
-        seg->data[i].alternate_loop_filter;
-    params->stVP9Segments.feature_data[i][2] = seg->data[i].reference_frame;
-    params->stVP9Segments.feature_data[i][3] = 0;
-  }
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_TRACE_OBJECT (self, "Dump segmentation params");
-  GST_TRACE_OBJECT (self, "\tenabled: %d", params->stVP9Segments.enabled);
-  GST_TRACE_OBJECT (self, "\tupdate_map: %d", params->stVP9Segments.update_map);
-  GST_TRACE_OBJECT (self, "\ttemporal_update: %d",
-      params->stVP9Segments.temporal_update);
-  GST_TRACE_OBJECT (self, "\tabs_delta: %d", params->stVP9Segments.abs_delta);
-  for (i = 0; i < GST_VP9_SEG_TREE_PROBS; i++)
-    GST_TRACE_OBJECT (self, "\ttree_probs[%d]: %d", i,
-        params->stVP9Segments.tree_probs[i]);
-  for (i = 0; i < GST_VP9_PREDICTION_PROBS; i++)
-    GST_TRACE_OBJECT (self, "\tpred_probs[%d]: %d", i,
-        params->stVP9Segments.pred_probs[i]);
-  for (i = 0; i < GST_VP9_MAX_SEGMENTS; i++) {
-    gint j;
-    GST_TRACE_OBJECT (self, "\tfeature_mask[%d]: 0x%x", i,
-        params->stVP9Segments.feature_mask[i]);
-    for (j = 0; j < 4; j++)
-      GST_TRACE_OBJECT (self, "\tfeature_data[%d][%d]: %d", i, j,
-          params->stVP9Segments.feature_data[i][j]);
-  }
-#endif
-}
-
-static gboolean
-gst_d3d11_vp9_dec_submit_picture_data (GstD3D11Vp9Dec * self,
-    GstVp9Picture * picture, DXVA_PicParams_VP9 * params)
-{
-  guint d3d11_buffer_size;
-  gpointer d3d11_buffer;
-  gsize buffer_offset = 0;
-  gboolean is_first = TRUE;
-
-  GST_TRACE_OBJECT (self, "Getting picture params buffer");
-  if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder,
-          D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS, &d3d11_buffer_size,
-          &d3d11_buffer)) {
-    GST_ERROR_OBJECT (self,
-        "Failed to get decoder buffer for picture parameters");
-    return FALSE;
-  }
-
-  memcpy (d3d11_buffer, params, sizeof (DXVA_PicParams_VP9));
-
-  GST_TRACE_OBJECT (self, "Release picture param decoder buffer");
-
-  if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder,
-          D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS)) {
-    GST_ERROR_OBJECT (self, "Failed to release decoder buffer");
-    return FALSE;
-  }
-
-  if (!picture->data || !picture->size) {
-    GST_ERROR_OBJECT (self, "No data to submit");
-    return FALSE;
-  }
-
-  GST_TRACE_OBJECT (self, "Submit total %" G_GSIZE_FORMAT " bytes",
-      picture->size);
-
-  while (buffer_offset < picture->size) {
-    gsize bytes_to_copy = picture->size - buffer_offset;
-    gsize written_buffer_size;
-    gboolean is_last = TRUE;
-    DXVA_Slice_VPx_Short slice_short = { 0, };
-    D3D11_VIDEO_DECODER_BUFFER_DESC buffer_desc[3] = { 0, };
-    gboolean bad_aligned_bitstream_buffer = FALSE;
-
-    GST_TRACE_OBJECT (self, "Getting bitstream buffer");
-    if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder,
-            D3D11_VIDEO_DECODER_BUFFER_BITSTREAM, &d3d11_buffer_size,
-            &d3d11_buffer)) {
-      GST_ERROR_OBJECT (self, "Couldn't get bitstream buffer");
-      goto error;
-    }
-
-    if ((d3d11_buffer_size & 127) != 0) {
-      GST_WARNING_OBJECT (self,
-          "The size of bitstream buffer is not 128 bytes aligned");
-      bad_aligned_bitstream_buffer = TRUE;
-    }
-
-    if (bytes_to_copy > d3d11_buffer_size) {
-      /* if the size of this slice is larger than the size of remaining d3d11
-       * decoder bitstream memory, write the data up to the remaining d3d11
-       * decoder bitstream memory size and the rest would be written to the
-       * next d3d11 bitstream memory */
-      bytes_to_copy = d3d11_buffer_size;
-      is_last = FALSE;
-    }
-
-    memcpy (d3d11_buffer, picture->data + buffer_offset, bytes_to_copy);
-    written_buffer_size = bytes_to_copy;
-
-    /* DXVA2 spec is saying that written bitstream data must be 128 bytes
-     * aligned if the bitstream buffer contains end of frame
-     * (i.e., wBadSliceChopping == 0 or 2) */
-    if (is_last) {
-      guint padding = MIN (GST_ROUND_UP_128 (bytes_to_copy) - bytes_to_copy,
-          d3d11_buffer_size - bytes_to_copy);
-
-      if (padding) {
-        GST_TRACE_OBJECT (self,
-            "Written bitstream buffer size %" G_GSIZE_FORMAT
-            " is not 128 bytes aligned, add padding %d bytes",
-            bytes_to_copy, padding);
-        memset ((guint8 *) d3d11_buffer + bytes_to_copy, 0, padding);
-        written_buffer_size += padding;
-      }
-    }
-
-    GST_TRACE_OBJECT (self, "Release bitstream buffer");
-    if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder,
-            D3D11_VIDEO_DECODER_BUFFER_BITSTREAM)) {
-      GST_ERROR_OBJECT (self, "Failed to release bitstream buffer");
-
-      goto error;
-    }
-
-    slice_short.BSNALunitDataLocation = 0;
-    slice_short.SliceBytesInBuffer = (UINT) written_buffer_size;
-
-    /* wBadSliceChopping: (dxva spec.)
-     * 0: All bits for the slice are located within the corresponding
-     *    bitstream data buffer
-     * 1: The bitstream data buffer contains the start of the slice,
-     *    but not the entire slice, because the buffer is full
-     * 2: The bitstream data buffer contains the end of the slice.
-     *    It does not contain the start of the slice, because the start of
-     *    the slice was located in the previous bitstream data buffer.
-     * 3: The bitstream data buffer does not contain the start of the slice
-     *    (because the start of the slice was located in the previous
-     *    bitstream data buffer), and it does not contain the end of the slice
-     *    (because the current bitstream data buffer is also full).
-     */
-    if (is_last && is_first) {
-      slice_short.wBadSliceChopping = 0;
-    } else if (!is_last && is_first) {
-      slice_short.wBadSliceChopping = 1;
-    } else if (is_last && !is_first) {
-      slice_short.wBadSliceChopping = 2;
-    } else {
-      slice_short.wBadSliceChopping = 3;
-    }
-
-    GST_TRACE_OBJECT (self, "Getting slice control buffer");
-    if (!gst_d3d11_decoder_get_decoder_buffer (self->d3d11_decoder,
-            D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL, &d3d11_buffer_size,
-            &d3d11_buffer)) {
-      GST_ERROR_OBJECT (self, "Couldn't get slice control buffer");
-
-      goto error;
-    }
-
-    memcpy (d3d11_buffer, &slice_short, sizeof (DXVA_Slice_VPx_Short));
-
-    GST_TRACE_OBJECT (self, "Release slice control buffer");
-    if (!gst_d3d11_decoder_release_decoder_buffer (self->d3d11_decoder,
-            D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL)) {
-      GST_ERROR_OBJECT (self, "Failed to release slice control buffer");
-
-      goto error;
-    }
-
-    buffer_desc[0].BufferType = D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS;
-    buffer_desc[0].DataOffset = 0;
-    buffer_desc[0].DataSize = sizeof (DXVA_PicParams_VP9);
-
-    buffer_desc[1].BufferType = D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL;
-    buffer_desc[1].DataOffset = 0;
-    buffer_desc[1].DataSize = sizeof (DXVA_Slice_VPx_Short);
-
-    if (!bad_aligned_bitstream_buffer && (written_buffer_size & 127) != 0) {
-      GST_WARNING_OBJECT (self,
-          "Written bitstream buffer size %" G_GSIZE_FORMAT
-          " is not 128 bytes aligned", written_buffer_size);
-    }
-
-    buffer_desc[2].BufferType = D3D11_VIDEO_DECODER_BUFFER_BITSTREAM;
-    buffer_desc[2].DataOffset = 0;
-    buffer_desc[2].DataSize = written_buffer_size;
-
-    if (!gst_d3d11_decoder_submit_decoder_buffers (self->d3d11_decoder,
-            3, buffer_desc)) {
-      GST_ERROR_OBJECT (self, "Couldn't submit decoder buffers");
-      goto error;
-    }
-
-    buffer_offset += bytes_to_copy;
-    is_first = FALSE;
-  }
-
-  return TRUE;
-
-error:
-  return FALSE;
-}
-
-static gboolean
-gst_d3d11_vp9_dec_decode_picture (GstVp9Decoder * decoder,
-    GstVp9Picture * picture, GstVp9Dpb * dpb)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-  DXVA_PicParams_VP9 pic_params = { 0, };
-  GstD3D11DecoderOutputView *view;
-
-  view = gst_d3d11_vp9_dec_get_output_view_from_picture (self, picture);
-  if (!view) {
-    GST_ERROR_OBJECT (self, "current picture does not have output view handle");
-    return FALSE;
-  }
-
-  pic_params.CurrPic.Index7Bits = view->view_id;
-  pic_params.uncompressed_header_size_byte_aligned =
-      picture->frame_hdr.frame_header_length_in_bytes;
-  pic_params.first_partition_size = picture->frame_hdr.first_partition_size;
-  pic_params.StatusReportFeedbackNumber = 1;
-
-  gst_d3d11_vp9_dec_copy_frame_params (self, picture, &pic_params);
-  gst_d3d11_vp9_dec_copy_reference_frames (self, picture, dpb, &pic_params);
-  gst_d3d11_vp9_dec_copy_frame_refs (self, picture, &pic_params);
-  gst_d3d11_vp9_dec_copy_loop_filter_params (self, picture, &pic_params);
-  gst_d3d11_vp9_dec_copy_quant_params (self, picture, &pic_params);
-  gst_d3d11_vp9_dec_copy_segmentation_params (self, picture, &pic_params);
-
-  return gst_d3d11_vp9_dec_submit_picture_data (self, picture, &pic_params);
-}
-
-static gboolean
-gst_d3d11_vp9_dec_end_picture (GstVp9Decoder * decoder, GstVp9Picture * picture)
-{
-  GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder);
-
-  if (!gst_d3d11_decoder_end_frame (self->d3d11_decoder)) {
-    GST_ERROR_OBJECT (self, "Failed to EndFrame");
-    return FALSE;
-  }
-
-  return TRUE;
-}
-
-typedef struct
-{
-  guint width;
-  guint height;
-} GstD3D11Vp9DecResolution;
-
-void
-gst_d3d11_vp9_dec_register (GstPlugin * plugin, GstD3D11Device * device,
-    GstD3D11Decoder * decoder, guint rank)
-{
-  GType type;
-  gchar *type_name;
-  gchar *feature_name;
-  guint index = 0;
-  guint i;
-  GUID profile;
-  GTypeInfo type_info = {
-    sizeof (GstD3D11Vp9DecClass),
-    NULL,
-    NULL,
-    (GClassInitFunc) gst_d3d11_vp9_dec_class_init,
-    NULL,
-    NULL,
-    sizeof (GstD3D11Vp9Dec),
-    0,
-    (GInstanceInitFunc) gst_d3d11_vp9_dec_init,
-  };
-  static const GUID *profile2_guid =
-      &GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_10BIT_PROFILE2;
-  static const GUID *profile0_guid =
-      &GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_PROFILE0;
-  /* values were taken from chromium. See supported_profile_helper.cc */
-  GstD3D11Vp9DecResolution resolutions_to_check[] = {
-    {4096, 2160}, {4096, 2304}, {7680, 4320}, {8192, 4320}, {8192, 8192}
-  };
-  GstCaps *sink_caps = NULL;
-  GstCaps *src_caps = NULL;
-  guint max_width = 0;
-  guint max_height = 0;
-  guint resolution;
-  gboolean have_profile2 = FALSE;
-  gboolean have_profile0 = FALSE;
-  DXGI_FORMAT format = DXGI_FORMAT_UNKNOWN;
-
-  have_profile2 = gst_d3d11_decoder_get_supported_decoder_profile (decoder,
-      &profile2_guid, 1, &profile);
-  if (!have_profile2) {
-    GST_DEBUG_OBJECT (device,
-        "decoder does not support VP9_VLD_10BIT_PROFILE2");
-  } else {
-    have_profile2 &=
-        gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_P010);
-    have_profile2 &=
-        gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_NV12);
-    if (!have_profile2) {
-      GST_FIXME_OBJECT (device,
-          "device does not support P010 and/or NV12 format");
-    }
-  }
-
-  have_profile0 = gst_d3d11_decoder_get_supported_decoder_profile (decoder,
-      &profile0_guid, 1, &profile);
-  if (!have_profile0) {
-    GST_DEBUG_OBJECT (device, "decoder does not support VP9_VLD_PROFILE0");
-  } else {
-    have_profile0 =
-        gst_d3d11_decoder_supports_format (decoder, &profile, DXGI_FORMAT_NV12);
-    if (!have_profile0) {
-      GST_FIXME_OBJECT (device, "device does not support NV12 format");
-    }
-  }
-
-  if (!have_profile2 && !have_profile0) {
-    GST_INFO_OBJECT (device, "device does not support VP9 decoding");
-    return;
-  }
-
-  if (have_profile0) {
-    profile = *profile0_guid;
-    format = DXGI_FORMAT_NV12;
-  } else {
-    profile = *profile2_guid;
-    format = DXGI_FORMAT_P010;
-  }
-
-  for (i = 0; i < G_N_ELEMENTS (resolutions_to_check); i++) {
-    if (gst_d3d11_decoder_supports_resolution (decoder, &profile,
-            format, resolutions_to_check[i].width,
-            resolutions_to_check[i].height)) {
-      max_width = resolutions_to_check[i].width;
-      max_height = resolutions_to_check[i].height;
-
-      GST_DEBUG_OBJECT (device,
-          "device support resolution %dx%d", max_width, max_height);
-    } else {
-      break;
-    }
-  }
-
-  if (max_width == 0 || max_height == 0) {
-    GST_WARNING_OBJECT (device, "Couldn't query supported resolution");
-    return;
-  }
-
-  sink_caps = gst_caps_from_string ("video/x-vp9, "
-      "framerate = " GST_VIDEO_FPS_RANGE);
-  src_caps = gst_caps_from_string ("video/x-raw("
-      GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "), "
-      "framerate = " GST_VIDEO_FPS_RANGE ";"
-      "video/x-raw, " "framerate = " GST_VIDEO_FPS_RANGE);
-
-  if (have_profile2) {
-    GValue format_list = G_VALUE_INIT;
-    GValue format_value = G_VALUE_INIT;
-
-    g_value_init (&format_list, GST_TYPE_LIST);
-
-    g_value_init (&format_value, G_TYPE_STRING);
-    g_value_set_string (&format_value, "NV12");
-    gst_value_list_append_and_take_value (&format_list, &format_value);
-
-    g_value_init (&format_value, G_TYPE_STRING);
-    g_value_set_string (&format_value, "P010_10LE");
-    gst_value_list_append_and_take_value (&format_list, &format_value);
-
-    gst_caps_set_value (src_caps, "format", &format_list);
-    g_value_unset (&format_list);
-  } else {
-    gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL);
-  }
-
-  /* To cover both landscape and portrait, select max value */
-  resolution = MAX (max_width, max_height);
-  gst_caps_set_simple (sink_caps,
-      "width", GST_TYPE_INT_RANGE, 64, resolution,
-      "height", GST_TYPE_INT_RANGE, 64, resolution, NULL);
-  gst_caps_set_simple (src_caps,
-      "width", GST_TYPE_INT_RANGE, 64, resolution,
-      "height", GST_TYPE_INT_RANGE, 64, resolution, NULL);
-
-  type_info.class_data =
-      gst_d3d11_decoder_class_data_new (device, sink_caps, src_caps);
-
-  type_name = g_strdup ("GstD3D11Vp9Dec");
-  feature_name = g_strdup ("d3d11vp9dec");
-
-  while (g_type_from_name (type_name)) {
-    index++;
-    g_free (type_name);
-    g_free (feature_name);
-    type_name = g_strdup_printf ("GstD3D11Vp9Device%dDec", index);
-    feature_name = g_strdup_printf ("d3d11vp9device%ddec", index);
-  }
-
-  type = g_type_register_static (GST_TYPE_VP9_DECODER,
-      type_name, &type_info, 0);
- /* make lower rank than default device */ - if (rank > 0 && index != 0) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/plugin.c
Deleted
@@ -1,199 +0,0 @@ -/* GStreamer - * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstd3d11config.h" - -#include <gst/gst.h> -#include "gstd3d11videosink.h" -#include "gstd3d11upload.h" -#include "gstd3d11download.h" -#include "gstd3d11colorconvert.h" -#include "gstd3d11videosinkbin.h" -#include "gstd3d11shader.h" -#ifdef HAVE_DXVA_H -#include "gstd3d11utils.h" -#include "gstd3d11h264dec.h" -#include "gstd3d11h265dec.h" -#include "gstd3d11vp9dec.h" -#include "gstd3d11vp8dec.h" -#endif - -GST_DEBUG_CATEGORY (gst_d3d11_debug); -GST_DEBUG_CATEGORY (gst_d3d11_shader_debug); -GST_DEBUG_CATEGORY (gst_d3d11_colorconverter_debug); -GST_DEBUG_CATEGORY (gst_d3d11_utils_debug); -GST_DEBUG_CATEGORY (gst_d3d11_format_debug); -GST_DEBUG_CATEGORY (gst_d3d11_device_debug); -GST_DEBUG_CATEGORY (gst_d3d11_overlay_compositor_debug); -GST_DEBUG_CATEGORY (gst_d3d11_window_debug); -GST_DEBUG_CATEGORY (gst_d3d11_video_processor_debug); - -#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H) -GST_DEBUG_CATEGORY (gst_d3d11_debug_layer_debug); -#endif - -#ifdef HAVE_DXVA_H -GST_DEBUG_CATEGORY (gst_d3d11_h264_dec_debug); -GST_DEBUG_CATEGORY (gst_d3d11_h265_dec_debug); 
-GST_DEBUG_CATEGORY (gst_d3d11_vp9_dec_debug); -GST_DEBUG_CATEGORY (gst_d3d11_vp8_dec_debug); -#endif - -#define GST_CAT_DEFAULT gst_d3d11_debug - -static gboolean -plugin_init (GstPlugin * plugin) -{ - GstD3D11Device *device = NULL; - GstRank video_sink_rank = GST_RANK_NONE; - - /** - * plugin-d3d11: - * - * Since: 1.18 - */ - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_debug, "d3d11", 0, "direct3d 11 plugin"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_shader_debug, - "d3d11shader", 0, "d3d11shader"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_colorconverter_debug, - "d3d11colorconverter", 0, "d3d11colorconverter"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_utils_debug, - "d3d11utils", 0, "d3d11 utility functions"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_format_debug, - "d3d11format", 0, "d3d11 specific formats"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_device_debug, - "d3d11device", 0, "d3d11 device object"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_overlay_compositor_debug, - "d3d11overlaycompositor", 0, "d3d11overlaycompositor"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_window_debug, - "d3d11window", 0, "d3d11window"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_video_processor_debug, - "d3d11videoprocessor", 0, "d3d11videoprocessor"); - -#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H) - /* NOTE: enabled only for debug build */ - GST_DEBUG_CATEGORY_INIT (gst_d3d11_debug_layer_debug, - "d3d11debuglayer", 0, "native d3d11 and dxgi debug"); -#endif - - if (!gst_d3d11_shader_init ()) { - GST_WARNING ("Cannot initialize d3d11 shader"); - return TRUE; - } - - gst_element_register (plugin, - "d3d11upload", GST_RANK_NONE, GST_TYPE_D3D11_UPLOAD); - gst_element_register (plugin, - "d3d11download", GST_RANK_NONE, GST_TYPE_D3D11_DOWNLOAD); - gst_element_register (plugin, - "d3d11convert", GST_RANK_NONE, GST_TYPE_D3D11_COLOR_CONVERT); - gst_element_register (plugin, - "d3d11videosinkelement", GST_RANK_NONE, GST_TYPE_D3D11_VIDEO_SINK); - - device = gst_d3d11_device_new (0); - - /* FIXME: Our shader code is not 
compatible with D3D_FEATURE_LEVEL_9_3 - * or lower. So HLSL compiler cannot understand our shader code and - * therefore d3d11colorconverter cannot be configured. - * - * Known D3D_FEATURE_LEVEL_9_3 driver is - * "VirtualBox Graphics Adapter (WDDM)" - * ... and there might be some more old physical devices which don't support - * D3D_FEATURE_LEVEL_10_0. - */ - if (device) { - D3D_FEATURE_LEVEL feature_level = D3D_FEATURE_LEVEL_10_0; - - feature_level = gst_d3d11_device_get_chosen_feature_level (device); - if (feature_level >= D3D_FEATURE_LEVEL_10_0) - video_sink_rank = GST_RANK_PRIMARY; - } - - gst_element_register (plugin, - "d3d11videosink", video_sink_rank, GST_TYPE_D3D11_VIDEO_SINK_BIN); - -#ifdef HAVE_DXVA_H - /* DXVA2 API is availble since Windows 8 */ - if (gst_d3d11_is_windows_8_or_greater ()) { - gint i = 0; - - GST_DEBUG_CATEGORY_INIT (gst_d3d11_h264_dec_debug, - "d3d11h264dec", 0, "Direct3D11 H.264 Video Decoder"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_vp9_dec_debug, - "d3d11vp9dec", 0, "Direct3D11 VP9 Video Decoder"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_h265_dec_debug, - "d3d11h265dec", 0, "Direct3D11 H.265 Video Decoder"); - GST_DEBUG_CATEGORY_INIT (gst_d3d11_vp8_dec_debug, - "d3d11vp8dec", 0, "Direct3D11 VP8 Decoder"); - - do { - GstD3D11Decoder *decoder = NULL; - gboolean legacy; - gboolean hardware; - - if (!device) - device = gst_d3d11_device_new (i); - - if (!device) - break; - - g_object_get (device, "hardware", &hardware, NULL); - if (!hardware) - goto clear; - - decoder = gst_d3d11_decoder_new (device); - if (!decoder) - goto clear; - - legacy = gst_d3d11_decoder_util_is_legacy_device (device); - - gst_d3d11_h264_dec_register (plugin, - device, decoder, GST_RANK_SECONDARY, legacy); - if (!legacy) { - gst_d3d11_h265_dec_register (plugin, device, decoder, - GST_RANK_SECONDARY); - gst_d3d11_vp9_dec_register (plugin, device, decoder, - GST_RANK_SECONDARY); - gst_d3d11_vp8_dec_register (plugin, device, decoder, - GST_RANK_SECONDARY); - } - - clear: 
- gst_clear_object (&device); - gst_clear_object (&decoder); - i++; - } while (1); - } -#endif - - gst_clear_object (&device); - - return TRUE; -} - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - d3d11, - "Direct3D11 plugin", - plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/h264-ctrls.h
Deleted
@@ -1,212 +0,0 @@ -/* SPDX-License-Identifier: GPL-2.0 */ -/* - * These are the H.264 state controls for use with stateless H.264 - * codec drivers. - * - * It turns out that these structs are not stable yet and will undergo - * more changes. So keep them private until they are stable and ready to - * become part of the official public API. - */ - -#ifndef _H264_CTRLS_H_ -#define _H264_CTRLS_H_ - -#include "linux/videodev2.h" - -/* Our pixel format isn't stable at the moment */ -#define V4L2_PIX_FMT_H264_SLICE v4l2_fourcc('S', '2', '6', '4') /* H264 parsed slices */ - -/* - * This is put insanely high to avoid conflicting with controls that - * would be added during the phase where those controls are not - * stable. It should be fixed eventually. - */ -#define V4L2_CID_MPEG_VIDEO_H264_SPS (V4L2_CID_MPEG_BASE+1000) -#define V4L2_CID_MPEG_VIDEO_H264_PPS (V4L2_CID_MPEG_BASE+1001) -#define V4L2_CID_MPEG_VIDEO_H264_SCALING_MATRIX (V4L2_CID_MPEG_BASE+1002) -#define V4L2_CID_MPEG_VIDEO_H264_SLICE_PARAMS (V4L2_CID_MPEG_BASE+1003) -#define V4L2_CID_MPEG_VIDEO_H264_DECODE_PARAMS (V4L2_CID_MPEG_BASE+1004) -#define V4L2_CID_MPEG_VIDEO_H264_DECODE_MODE (V4L2_CID_MPEG_BASE+1005) -#define V4L2_CID_MPEG_VIDEO_H264_START_CODE (V4L2_CID_MPEG_BASE+1006) - -/* enum v4l2_ctrl_type type values */ -#define V4L2_CTRL_TYPE_H264_SPS 0x0110 -#define V4L2_CTRL_TYPE_H264_PPS 0x0111 -#define V4L2_CTRL_TYPE_H264_SCALING_MATRIX 0x0112 -#define V4L2_CTRL_TYPE_H264_SLICE_PARAMS 0x0113 -#define V4L2_CTRL_TYPE_H264_DECODE_PARAMS 0x0114 - -enum v4l2_mpeg_video_h264_decode_mode { - V4L2_MPEG_VIDEO_H264_DECODE_MODE_SLICE_BASED, - V4L2_MPEG_VIDEO_H264_DECODE_MODE_FRAME_BASED, -}; - -enum v4l2_mpeg_video_h264_start_code { - V4L2_MPEG_VIDEO_H264_START_CODE_NONE, - V4L2_MPEG_VIDEO_H264_START_CODE_ANNEX_B, -}; - -#define V4L2_H264_SPS_CONSTRAINT_SET0_FLAG 0x01 -#define V4L2_H264_SPS_CONSTRAINT_SET1_FLAG 0x02 -#define V4L2_H264_SPS_CONSTRAINT_SET2_FLAG 0x04 -#define V4L2_H264_SPS_CONSTRAINT_SET3_FLAG 0x08 
-#define V4L2_H264_SPS_CONSTRAINT_SET4_FLAG 0x10 -#define V4L2_H264_SPS_CONSTRAINT_SET5_FLAG 0x20 - -#define V4L2_H264_SPS_FLAG_SEPARATE_COLOUR_PLANE 0x01 -#define V4L2_H264_SPS_FLAG_QPPRIME_Y_ZERO_TRANSFORM_BYPASS 0x02 -#define V4L2_H264_SPS_FLAG_DELTA_PIC_ORDER_ALWAYS_ZERO 0x04 -#define V4L2_H264_SPS_FLAG_GAPS_IN_FRAME_NUM_VALUE_ALLOWED 0x08 -#define V4L2_H264_SPS_FLAG_FRAME_MBS_ONLY 0x10 -#define V4L2_H264_SPS_FLAG_MB_ADAPTIVE_FRAME_FIELD 0x20 -#define V4L2_H264_SPS_FLAG_DIRECT_8X8_INFERENCE 0x40 - -struct v4l2_ctrl_h264_sps { - __u8 profile_idc; - __u8 constraint_set_flags; - __u8 level_idc; - __u8 seq_parameter_set_id; - __u8 chroma_format_idc; - __u8 bit_depth_luma_minus8; - __u8 bit_depth_chroma_minus8; - __u8 log2_max_frame_num_minus4; - __u8 pic_order_cnt_type; - __u8 log2_max_pic_order_cnt_lsb_minus4; - __u8 max_num_ref_frames; - __u8 num_ref_frames_in_pic_order_cnt_cycle; - __s32 offset_for_ref_frame[255]; - __s32 offset_for_non_ref_pic; - __s32 offset_for_top_to_bottom_field; - __u16 pic_width_in_mbs_minus1; - __u16 pic_height_in_map_units_minus1; - __u32 flags; -}; - -#define V4L2_H264_PPS_FLAG_ENTROPY_CODING_MODE 0x0001 -#define V4L2_H264_PPS_FLAG_BOTTOM_FIELD_PIC_ORDER_IN_FRAME_PRESENT 0x0002 -#define V4L2_H264_PPS_FLAG_WEIGHTED_PRED 0x0004 -#define V4L2_H264_PPS_FLAG_DEBLOCKING_FILTER_CONTROL_PRESENT 0x0008 -#define V4L2_H264_PPS_FLAG_CONSTRAINED_INTRA_PRED 0x0010 -#define V4L2_H264_PPS_FLAG_REDUNDANT_PIC_CNT_PRESENT 0x0020 -#define V4L2_H264_PPS_FLAG_TRANSFORM_8X8_MODE 0x0040 -#define V4L2_H264_PPS_FLAG_PIC_SCALING_MATRIX_PRESENT 0x0080 - -struct v4l2_ctrl_h264_pps { - __u8 pic_parameter_set_id; - __u8 seq_parameter_set_id; - __u8 num_slice_groups_minus1; - __u8 num_ref_idx_l0_default_active_minus1; - __u8 num_ref_idx_l1_default_active_minus1; - __u8 weighted_bipred_idc; - __s8 pic_init_qp_minus26; - __s8 pic_init_qs_minus26; - __s8 chroma_qp_index_offset; - __s8 second_chroma_qp_index_offset; - __u16 flags; -}; - -struct 
v4l2_ctrl_h264_scaling_matrix { - __u8 scaling_list_4x4[6][16]; - __u8 scaling_list_8x8[6][64]; -}; - -struct v4l2_h264_weight_factors { - __s16 luma_weight[32]; - __s16 luma_offset[32]; - __s16 chroma_weight[32][2]; - __s16 chroma_offset[32][2]; -}; - -struct v4l2_h264_pred_weight_table { - __u16 luma_log2_weight_denom; - __u16 chroma_log2_weight_denom; - struct v4l2_h264_weight_factors weight_factors[2]; -}; - -#define V4L2_H264_SLICE_TYPE_P 0 -#define V4L2_H264_SLICE_TYPE_B 1 -#define V4L2_H264_SLICE_TYPE_I 2 -#define V4L2_H264_SLICE_TYPE_SP 3 -#define V4L2_H264_SLICE_TYPE_SI 4 - -#define V4L2_H264_SLICE_FLAG_FIELD_PIC 0x01 -#define V4L2_H264_SLICE_FLAG_BOTTOM_FIELD 0x02 -#define V4L2_H264_SLICE_FLAG_DIRECT_SPATIAL_MV_PRED 0x04 -#define V4L2_H264_SLICE_FLAG_SP_FOR_SWITCH 0x08 - -struct v4l2_ctrl_h264_slice_params { - /* Size in bytes, including header */ - __u32 size; - - /* Offset in bytes to the start of slice in the OUTPUT buffer. */ - __u32 start_byte_offset; - - /* Offset in bits to slice_data() from the beginning of this slice. */ - __u32 header_bit_size; - - __u16 first_mb_in_slice; - __u8 slice_type; - __u8 pic_parameter_set_id; - __u8 colour_plane_id; - __u8 redundant_pic_cnt; - __u16 frame_num; - __u16 idr_pic_id; - __u16 pic_order_cnt_lsb; - __s32 delta_pic_order_cnt_bottom; - __s32 delta_pic_order_cnt0; - __s32 delta_pic_order_cnt1; - - struct v4l2_h264_pred_weight_table pred_weight_table; - /* Size in bits of dec_ref_pic_marking() syntax element. */ - __u32 dec_ref_pic_marking_bit_size; - /* Size in bits of pic order count syntax. */ - __u32 pic_order_cnt_bit_size; - - __u8 cabac_init_idc; - __s8 slice_qp_delta; - __s8 slice_qs_delta; - __u8 disable_deblocking_filter_idc; - __s8 slice_alpha_c0_offset_div2; - __s8 slice_beta_offset_div2; - __u8 num_ref_idx_l0_active_minus1; - __u8 num_ref_idx_l1_active_minus1; - __u32 slice_group_change_cycle; - - /* - * Entries on each list are indices into - * v4l2_ctrl_h264_decode_params.dpb[]. 
- */ - __u8 ref_pic_list0[32]; - __u8 ref_pic_list1[32]; - - __u32 flags; -}; - -#define V4L2_H264_DPB_ENTRY_FLAG_VALID 0x01 -#define V4L2_H264_DPB_ENTRY_FLAG_ACTIVE 0x02 -#define V4L2_H264_DPB_ENTRY_FLAG_LONG_TERM 0x04 -#define V4L2_H264_DPB_ENTRY_FLAG_FIELD 0x08 -#define V4L2_H264_DPB_ENTRY_FLAG_BOTTOM_FIELD 0x10 - -struct v4l2_h264_dpb_entry { - __u64 reference_ts; - __u16 frame_num; - __u16 pic_num; - /* Note that field is indicated by v4l2_buffer.field */ - __s32 top_field_order_cnt; - __s32 bottom_field_order_cnt; - __u32 flags; /* V4L2_H264_DPB_ENTRY_FLAG_* */ -}; - -#define V4L2_H264_DECODE_PARAM_FLAG_IDR_PIC 0x01 - -struct v4l2_ctrl_h264_decode_params { - struct v4l2_h264_dpb_entry dpb[16]; - __u16 num_slices; - __u16 nal_ref_idc; - __s32 top_field_order_cnt; - __s32 bottom_field_order_cnt; - __u32 flags; /* V4L2_H264_DECODE_PARAM_FLAG_* */ -}; - -#endif
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/vp8-ctrls.h
Deleted
@@ -1,112 +0,0 @@ -/* SPDX-License-Identifier: GPL-2.0 */ -/* - * These are the VP8 state controls for use with stateless VP8 - * codec drivers. - * - * It turns out that these structs are not stable yet and will undergo - * more changes. So keep them private until they are stable and ready to - * become part of the official public API. - */ - -#ifndef _VP8_CTRLS_H_ -#define _VP8_CTRLS_H_ - -#include <linux/types.h> - -#define V4L2_PIX_FMT_VP8_FRAME v4l2_fourcc('V', 'P', '8', 'F') - -#define V4L2_CID_MPEG_VIDEO_VP8_FRAME_HEADER (V4L2_CID_MPEG_BASE + 2000) -#define V4L2_CTRL_TYPE_VP8_FRAME_HEADER 0x301 - -#define V4L2_VP8_SEGMENT_HEADER_FLAG_ENABLED 0x01 -#define V4L2_VP8_SEGMENT_HEADER_FLAG_UPDATE_MAP 0x02 -#define V4L2_VP8_SEGMENT_HEADER_FLAG_UPDATE_FEATURE_DATA 0x04 -#define V4L2_VP8_SEGMENT_HEADER_FLAG_DELTA_VALUE_MODE 0x08 - -struct v4l2_vp8_segment_header { - __s8 quant_update[4]; - __s8 lf_update[4]; - __u8 segment_probs[3]; - __u8 padding; - __u32 flags; -}; - -#define V4L2_VP8_LF_HEADER_ADJ_ENABLE 0x01 -#define V4L2_VP8_LF_HEADER_DELTA_UPDATE 0x02 -#define V4L2_VP8_LF_FILTER_TYPE_SIMPLE 0x04 -struct v4l2_vp8_loopfilter_header { - __s8 ref_frm_delta[4]; - __s8 mb_mode_delta[4]; - __u8 sharpness_level; - __u8 level; - __u16 padding; - __u32 flags; -}; - -struct v4l2_vp8_quantization_header { - __u8 y_ac_qi; - __s8 y_dc_delta; - __s8 y2_dc_delta; - __s8 y2_ac_delta; - __s8 uv_dc_delta; - __s8 uv_ac_delta; - __u16 padding; -}; - -struct v4l2_vp8_entropy_header { - __u8 coeff_probs[4][8][3][11]; - __u8 y_mode_probs[4]; - __u8 uv_mode_probs[3]; - __u8 mv_probs[2][19]; - __u8 padding[3]; -}; - -struct v4l2_vp8_entropy_coder_state { - __u8 range; - __u8 value; - __u8 bit_count; - __u8 padding; -}; - -#define V4L2_VP8_FRAME_HEADER_FLAG_KEY_FRAME 0x01 -#define V4L2_VP8_FRAME_HEADER_FLAG_EXPERIMENTAL 0x02 -#define V4L2_VP8_FRAME_HEADER_FLAG_SHOW_FRAME 0x04 -#define V4L2_VP8_FRAME_HEADER_FLAG_MB_NO_SKIP_COEFF 0x08 -#define V4L2_VP8_FRAME_HEADER_FLAG_SIGN_BIAS_GOLDEN 
0x10 -#define V4L2_VP8_FRAME_HEADER_FLAG_SIGN_BIAS_ALT 0x20 - -#define VP8_FRAME_IS_KEY_FRAME(hdr) \ - (!!((hdr)->flags & V4L2_VP8_FRAME_HEADER_FLAG_KEY_FRAME)) - -struct v4l2_ctrl_vp8_frame_header { - struct v4l2_vp8_segment_header segment_header; - struct v4l2_vp8_loopfilter_header lf_header; - struct v4l2_vp8_quantization_header quant_header; - struct v4l2_vp8_entropy_header entropy_header; - struct v4l2_vp8_entropy_coder_state coder_state; - - __u16 width; - __u16 height; - - __u8 horizontal_scale; - __u8 vertical_scale; - - __u8 version; - __u8 prob_skip_false; - __u8 prob_intra; - __u8 prob_last; - __u8 prob_gf; - __u8 num_dct_parts; - - __u32 first_part_size; - __u32 first_part_header_bits; - __u32 dct_part_sizes[8]; - - __u64 last_frame_ts; - __u64 golden_frame_ts; - __u64 alt_frame_ts; - - __u64 flags; -}; - -#endif
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadisplay.c
Deleted
@@ -1,414 +0,0 @@ -/* GStreamer - * Copyright (C) 2020 Igalia, S.L. - * Author: Víctor Jáquez <vjaquez@igalia.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#include "gstvadisplay.h" -#include "gstvaprofile.h" -#include "gstvavideoformat.h" - -GST_DEBUG_CATEGORY (gst_va_display_debug); -#define GST_CAT_DEFAULT gst_va_display_debug - -typedef struct _GstVaDisplayPrivate GstVaDisplayPrivate; -struct _GstVaDisplayPrivate -{ - GRecMutex lock; - VADisplay display; - - gboolean foreign; - gboolean init; -}; - -#define gst_va_display_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstVaDisplay, gst_va_display, GST_TYPE_OBJECT, - G_ADD_PRIVATE (GstVaDisplay); - GST_DEBUG_CATEGORY_INIT (gst_va_display_debug, "vadisplay", 0, - "VA Display")); -enum -{ - PROP_VA_DISPLAY = 1, - N_PROPERTIES -}; - -static GParamSpec *g_properties[N_PROPERTIES]; - -#define GET_PRIV(obj) gst_va_display_get_instance_private (GST_VA_DISPLAY (obj)) - -static gboolean -_gst_va_display_driver_filter (VADisplay display) -{ - const char *vendor = vaQueryVendorString (display); - - GST_INFO ("VA-API driver vendor: %s", vendor); - - /* XXX(victor): driver quirks & driver whitelist */ - - return TRUE; -} - -static void 
-gst_va_display_set_display (GstVaDisplay * self, gpointer display) -{ - GstVaDisplayPrivate *priv = GET_PRIV (self); - - if (!display) - return; - - if (vaDisplayIsValid (display) == 0) { - GST_WARNING_OBJECT (self, - "User's VA display is invalid. An internal one will be tried."); - return; - } - - if (!_gst_va_display_driver_filter (display)) - return; - - priv->display = display; - priv->foreign = TRUE; - - /* assume driver is already initialized */ - priv->init = TRUE; -} - -static void -gst_va_display_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - GstVaDisplay *self = GST_VA_DISPLAY (object); - - switch (prop_id) { - case PROP_VA_DISPLAY:{ - gpointer display = g_value_get_pointer (value); - gst_va_display_set_display (self, display); - break; - } - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_va_display_get_property (GObject * object, guint prop_id, GValue * value, - GParamSpec * pspec) -{ - GstVaDisplayPrivate *priv = GET_PRIV (object); - - switch (prop_id) { - case PROP_VA_DISPLAY: - g_value_set_pointer (value, priv->display); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void -gst_va_display_constructed (GObject * object) -{ - GstVaDisplay *self = GST_VA_DISPLAY (object); - GstVaDisplayPrivate *priv = GET_PRIV (object); - GstVaDisplayClass *klass = GST_VA_DISPLAY_GET_CLASS (object); - - if (!priv->display && klass->create_va_display) - priv->display = klass->create_va_display (self); - - G_OBJECT_CLASS (parent_class)->constructed (object); -} - -static void -gst_va_display_dispose (GObject * object) -{ - GstVaDisplayPrivate *priv = GET_PRIV (object); - - if (priv->display && !priv->foreign) - vaTerminate (priv->display); - priv->display = NULL; - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_va_display_finalize (GObject * object) -{ - GstVaDisplayPrivate 
*priv = GET_PRIV (object); - - g_rec_mutex_clear (&priv->lock); - - G_OBJECT_CLASS (parent_class)->finalize (object); -} - -static void -gst_va_display_class_init (GstVaDisplayClass * klass) -{ - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - - gobject_class->set_property = gst_va_display_set_property; - gobject_class->get_property = gst_va_display_get_property; - gobject_class->constructed = gst_va_display_constructed; - gobject_class->dispose = gst_va_display_dispose; - gobject_class->finalize = gst_va_display_finalize; - - g_properties[PROP_VA_DISPLAY] = - g_param_spec_pointer ("va-display", "VADisplay", "VA Display handler", - G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS); - - g_object_class_install_properties (gobject_class, N_PROPERTIES, g_properties); -} - -static void -gst_va_display_init (GstVaDisplay * self) -{ - GstVaDisplayPrivate *priv = GET_PRIV (self); - - g_rec_mutex_init (&priv->lock); -} - -void -gst_va_display_lock (GstVaDisplay * self) -{ - GstVaDisplayPrivate *priv; - - g_return_if_fail (GST_IS_VA_DISPLAY (self)); - - priv = GET_PRIV (self); - - g_rec_mutex_lock (&priv->lock); -} - -void -gst_va_display_unlock (GstVaDisplay * self) -{ - GstVaDisplayPrivate *priv; - - g_return_if_fail (GST_IS_VA_DISPLAY (self)); - - priv = GET_PRIV (self); - - g_rec_mutex_unlock (&priv->lock); -} - -#ifndef GST_DISABLE_GST_DEBUG -static gchar * -_strip_msg (const char *message) -{ - gchar *msg = g_strdup (message); - if (!msg) - return NULL; - return g_strstrip (msg); -} - -static void -_va_warning (gpointer object, const char *message) -{ - GstVaDisplay *self = GST_VA_DISPLAY (object); - gchar *msg; - - if ((msg = _strip_msg (message))) { - GST_WARNING_OBJECT (self, "VA error: %s", msg); - g_free (msg); - } -} - -static void -_va_info (gpointer object, const char *message) -{ - GstVaDisplay *self = GST_VA_DISPLAY (object); - gchar *msg; - - if ((msg = _strip_msg (message))) { - GST_INFO_OBJECT (self, "VA info: %s", msg); - g_free 
(msg); - } -} -#endif - -/** - * gst_va_display_initialize: - * @self: a #GstVaDisplay - * - * If the display is set by the user (foreign) it is assumed that the - * driver is already initialized, thus this function is noop. - * - * If the display is opened internally, this function will initialize - * the driver and it will set driver's message callbacks. - * - * NOTE: this function is supposed to be private, only used by - * GstVaDisplay descendants. - * - * Returns: %TRUE if the VA driver can be initialized; %FALSE - * otherwise - **/ -gboolean -gst_va_display_initialize (GstVaDisplay * self) -{ - GstVaDisplayPrivate *priv; - VAStatus status; - int major_version = -1, minor_version = -1; - - g_return_val_if_fail (GST_IS_VA_DISPLAY (self), FALSE); - - priv = GET_PRIV (self); - - if (priv->init) - return TRUE; - - if (!priv->display) - return FALSE; - -#ifndef GST_DISABLE_GST_DEBUG - vaSetErrorCallback (priv->display, _va_warning, self); - vaSetInfoCallback (priv->display, _va_info, self); -#endif - - status = vaInitialize (priv->display, &major_version, &minor_version); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR_OBJECT (self, "vaInitialize: %s", vaErrorStr (status)); - return FALSE; - } - - GST_INFO_OBJECT (self, "VA-API version %d.%d", major_version, minor_version); - - priv->init = TRUE; - - if (!_gst_va_display_driver_filter (priv->display)) - return FALSE; - - return TRUE; -} - -VADisplay -gst_va_display_get_va_dpy (GstVaDisplay * self) -{ - VADisplay dpy = 0; - - g_return_val_if_fail (GST_IS_VA_DISPLAY (self), 0); - - g_object_get (self, "va-display", &dpy, NULL); - return dpy; -} - -GArray * -gst_va_display_get_profiles (GstVaDisplay * self, guint32 codec, - VAEntrypoint entrypoint) -{ - GArray *ret = NULL; - VADisplay dpy; - VAEntrypoint *entrypoints; - VAProfile *profiles; - VAStatus status; - gint i, j, num_entrypoints = 0, num_profiles = 0; - - g_return_val_if_fail (GST_IS_VA_DISPLAY (self), NULL); - - dpy = gst_va_display_get_va_dpy (self); - - 
gst_va_display_lock (self); - num_profiles = vaMaxNumProfiles (dpy); - num_entrypoints = vaMaxNumEntrypoints (dpy); - gst_va_display_unlock (self); - - profiles = g_new (VAProfile, num_profiles); - entrypoints = g_new (VAEntrypoint, num_entrypoints); - - gst_va_display_lock (self); - status = vaQueryConfigProfiles (dpy, profiles, &num_profiles); - gst_va_display_unlock (self); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR ("vaQueryConfigProfile: %s", vaErrorStr (status)); - goto bail; - } - - for (i = 0; i < num_profiles; i++) { - if (codec != gst_va_profile_codec (profiles[i])) - continue; - - gst_va_display_lock (self); - status = vaQueryConfigEntrypoints (dpy, profiles[i], entrypoints, - &num_entrypoints); - gst_va_display_unlock (self); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR ("vaQueryConfigEntrypoints: %s", vaErrorStr (status)); - goto bail; - } - - for (j = 0; j < num_entrypoints; j++) { - if (entrypoints[j] == entrypoint) { - if (!ret) - ret = g_array_new (FALSE, FALSE, sizeof (VAProfile)); - g_array_append_val (ret, profiles[i]); - break; - } - } - } - -bail: - g_free (entrypoints); - g_free (profiles); - return ret; -} - -GArray * -gst_va_display_get_image_formats (GstVaDisplay * self) -{ - GArray *ret = NULL; - GstVideoFormat format; - VADisplay dpy = gst_va_display_get_va_dpy (self); - VAImageFormat *va_formats; - VAStatus status; - int i, max, num = 0; - - gst_va_display_lock (self); - max = vaMaxNumImageFormats (dpy); - gst_va_display_unlock (self); - if (max == 0) - return NULL; - - va_formats = g_new (VAImageFormat, max); - - gst_va_display_lock (self); - status = vaQueryImageFormats (dpy, va_formats, &num); - gst_va_display_unlock (self); - - if (status != VA_STATUS_SUCCESS) { - GST_ERROR ("vaQueryImageFormats: %s", vaErrorStr (status)); - goto bail; - } - - ret = g_array_sized_new (FALSE, FALSE, sizeof (GstVideoFormat), num); - for (i = 0; i < num; i++) { - format = gst_va_video_format_from_va_image_format (&va_formats[i]); - if (format 
!= GST_VIDEO_FORMAT_UNKNOWN) - g_array_append_val (ret, format); - } - - if (ret->len == 0) { - g_array_unref (ret); - ret = NULL; - } - -bail: - g_free (va_formats); - return ret; -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadisplay.h
Deleted
@@ -1,49 +0,0 @@ -/* GStreamer - * Copyright (C) 2020 Igalia, S.L. - * Author: Víctor Jáquez <vjaquez@igalia.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#pragma once - -#include <gst/gst.h> -#include <va/va.h> - -G_BEGIN_DECLS - -#define GST_TYPE_VA_DISPLAY (gst_va_display_get_type()) -G_DECLARE_DERIVABLE_TYPE (GstVaDisplay, gst_va_display, GST, VA_DISPLAY, GstObject) - -struct _GstVaDisplayClass -{ - /*< private > */ - GstObjectClass parent_class; - - /*< public > */ - gpointer (*create_va_display) (GstVaDisplay * self); -}; - -void gst_va_display_lock (GstVaDisplay * self); -void gst_va_display_unlock (GstVaDisplay * self); -gboolean gst_va_display_initialize (GstVaDisplay * self); -VADisplay gst_va_display_get_va_dpy (GstVaDisplay * self); -GArray * gst_va_display_get_profiles (GstVaDisplay * self, - guint32 codec, - VAEntrypoint entrypoint); -GArray * gst_va_display_get_image_formats (GstVaDisplay * display); - -G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadisplay_drm.c
Deleted
@@ -1,188 +0,0 @@
-/* GStreamer
- * Copyright (C) 2020 Igalia, S.L.
- *     Author: Víctor Jáquez <vjaquez@igalia.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifdef HAVE_CONFIG_H
-#include "config.h"
-#endif
-
-#include "gstvadisplay_drm.h"
-#include <fcntl.h>
-#include <unistd.h>
-#include <sys/types.h>
-#include <sys/stat.h>
-#include <va/va_drm.h>
-
-#if HAVE_LIBDRM
-#include <xf86drm.h>
-#endif
-
-struct _GstVaDisplayDrm
-{
-  GstVaDisplay parent;
-
-  gchar *path;
-  gint fd;
-};
-
-GST_DEBUG_CATEGORY_EXTERN (gst_va_display_debug);
-#define GST_CAT_DEFAULT gst_va_display_debug
-
-#define gst_va_display_drm_parent_class parent_class
-G_DEFINE_TYPE (GstVaDisplayDrm, gst_va_display_drm, GST_TYPE_VA_DISPLAY);
-
-enum
-{
-  PROP_PATH = 1,
-  N_PROPERTIES
-};
-
-static GParamSpec *g_properties[N_PROPERTIES];
-
-#define MAX_DEVICES 8
-
-static void
-gst_va_display_drm_set_property (GObject * object, guint prop_id,
-    const GValue * value, GParamSpec * pspec)
-{
-  GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (object);
-
-  switch (prop_id) {
-    case PROP_PATH:
-      self->path = g_value_dup_string (value);
-      break;
-    default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
-      break;
-  }
-}
-
-static void
-gst_va_display_drm_get_property (GObject * object, guint prop_id,
-    GValue * value, GParamSpec * pspec)
-{
-  GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (object);
-
-  switch (prop_id) {
-    case PROP_PATH:
-      g_value_set_string (value, self->path);
-      break;
-    default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
-      break;
-  }
-}
-
-static void
-gst_va_display_drm_finalize (GObject * object)
-{
-  GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (object);
-
-  g_free (self->path);
-  if (self->fd > -1)
-    close (self->fd);
-
-  G_OBJECT_CLASS (parent_class)->finalize (object);
-}
-
-static gpointer
-gst_va_display_drm_create_va_display (GstVaDisplay * display)
-{
-  int fd, saved_errno = 0;
-  GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (display);
-
-  fd = open (self->path, O_RDWR);
-  saved_errno = errno;
-  if (fd < 0) {
-    GST_WARNING_OBJECT (self, "Failed to open %s: %s", self->path,
-        g_strerror (saved_errno));
-    close (fd);
-    return 0;
-  }
-#if HAVE_LIBDRM
-  {
-    drmVersion *version;
-
-    version = drmGetVersion (fd);
-    if (!version) {
-      GST_ERROR_OBJECT (self, "Device %s is not a DRM render node", self->path);
-      return 0;
-    }
-    GST_INFO_OBJECT (self, "DRM render node with kernel driver %s",
-        version->name);
-    drmFreeVersion (version);
-  }
-#endif
-
-  self->fd = fd;
-  return vaGetDisplayDRM (self->fd);
-}
-
-static void
-gst_va_display_drm_class_init (GstVaDisplayDrmClass * klass)
-{
-  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
-  GstVaDisplayClass *vadisplay_class = GST_VA_DISPLAY_CLASS (klass);
-
-  gobject_class->set_property = gst_va_display_drm_set_property;
-  gobject_class->get_property = gst_va_display_drm_get_property;
-  gobject_class->finalize = gst_va_display_drm_finalize;
-
-  vadisplay_class->create_va_display = gst_va_display_drm_create_va_display;
-
-  g_properties[PROP_PATH] =
-      g_param_spec_string ("path", "render-path", "The path of DRM device",
-      "/dev/dri/renderD128",
-      G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT_ONLY);
-
-  g_object_class_install_properties (gobject_class, N_PROPERTIES, g_properties);
-}
-
-static void
-gst_va_display_drm_init (GstVaDisplayDrm * self)
-{
-  self->fd = -1;
-}
-
-/**
- * gst_va_display_drm_new_from_path:
- * @path: the path to the DRM device
- *
- * Creates a new #GstVaDisplay from a DRM device . It will try to open
- * and operate the device in @path.
- *
- * Returns: a newly allocated #GstVaDisplay if the specified DRM
- * render device could be opened and initialized; otherwise %NULL
- * is returned.
- **/
-GstVaDisplay *
-gst_va_display_drm_new_from_path (const gchar * path)
-{
-  GstVaDisplay *dpy;
-
-  g_return_val_if_fail (path, NULL);
-
-  dpy = g_object_new (GST_TYPE_VA_DISPLAY_DRM, "path", path, NULL);
-  if (!gst_va_display_initialize (dpy)) {
-    gst_object_unref (dpy);
-    return NULL;
-  }
-
-  return gst_object_ref_sink (dpy);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadisplay_drm.h
Deleted
@@ -1,33 +0,0 @@
-/* GStreamer
- * Copyright (C) 2020 Igalia, S.L.
- *     Author: Víctor Jáquez <vjaquez@igalia.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#pragma once
-
-#include "gstvadisplay.h"
-
-G_BEGIN_DECLS
-
-#define GST_TYPE_VA_DISPLAY_DRM (gst_va_display_drm_get_type())
-G_DECLARE_FINAL_TYPE (GstVaDisplayDrm, gst_va_display_drm, GST, VA_DISPLAY_DRM,
-    GstVaDisplay)
-
-GstVaDisplay * gst_va_display_drm_new_from_path (const gchar * path);
-
-G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadisplay_wrapped.c
Deleted
@@ -1,69 +0,0 @@
-/* GStreamer
- * Copyright (C) 2020 Igalia, S.L.
- *     Author: Víctor Jáquez <vjaquez@igalia.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#ifdef HAVE_CONFIG_H
-#include "config.h"
-#endif
-
-#include "gstvadisplay_wrapped.h"
-
-struct _GstVaDisplayWrapped
-{
-  GstVaDisplay parent;
-};
-
-#define gst_va_display_wrapped_parent_class parent_class
-G_DEFINE_TYPE (GstVaDisplayWrapped, gst_va_display_wrapped,
-    GST_TYPE_VA_DISPLAY);
-
-static void
-gst_va_display_wrapped_class_init (GstVaDisplayWrappedClass * klass)
-{
-}
-
-static void
-gst_va_display_wrapped_init (GstVaDisplayWrapped * self)
-{
-}
-
-/**
- * gst_va_display_wrapped_new:
- * @handle: a #VADisplay to wrap
- *
- * Creates a #GstVaDisplay wrapping an already created and initialized
- * VADisplay.
- *
- * Returns: a new #GstVaDisplay if @handle is valid, Otherwise %NULL.
- **/
-GstVaDisplay *
-gst_va_display_wrapped_new (guintptr handle)
-{
-  GstVaDisplay *dpy;
-
-  g_return_val_if_fail (handle, NULL);
-
-  dpy = g_object_new (GST_TYPE_VA_DISPLAY_WRAPPED, "va-display", handle, NULL);
-  if (!gst_va_display_initialize (dpy)) {
-    gst_object_unref (dpy);
-    return NULL;
-  }
-
-  return gst_object_ref_sink (dpy);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadisplay_wrapped.h
Deleted
@@ -1,33 +0,0 @@
-/* GStreamer
- * Copyright (C) 2020 Igalia, S.L.
- *     Author: Víctor Jáquez <vjaquez@igalia.com>
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public
- * License along with this library; if not, write to the
- * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#pragma once
-
-#include "gstvadisplay.h"
-
-G_BEGIN_DECLS
-
-#define GST_TYPE_VA_DISPLAY_WRAPPED gst_va_display_wrapped_get_type()
-G_DECLARE_FINAL_TYPE (GstVaDisplayWrapped, gst_va_display_wrapped, GST,
-    VA_DISPLAY_WRAPPED, GstVaDisplay)
-
-GstVaDisplay * gst_va_display_wrapped_new (guintptr handle);
-
-G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/ofa.c
Deleted
@@ -1,371 +0,0 @@ -/* GStreamer - * Copyright (C) 2008 Sebastian Dröge <slomo@circular-chaos.org> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -#include <gst/check/gstcheck.h> - -static gboolean found_fingerprint = FALSE; - -static gboolean -bus_handler (GstBus * bus, GstMessage * message, gpointer data) -{ - GMainLoop *loop = (GMainLoop *) data; - - switch (message->type) { - case GST_MESSAGE_EOS: - g_main_loop_quit (loop); - break; - case GST_MESSAGE_WARNING: - case GST_MESSAGE_ERROR:{ - GError *gerror; - gchar *debug; - - if (message->type == GST_MESSAGE_WARNING) - gst_message_parse_warning (message, &gerror, &debug); - else - gst_message_parse_error (message, &gerror, &debug); - gst_object_default_error (GST_MESSAGE_SRC (message), gerror, debug); - g_error_free (gerror); - g_free (debug); - g_main_loop_quit (loop); - break; - } - case GST_MESSAGE_TAG: - { - GstTagList *tag_list; - gchar *fpr, *p; - - gst_message_parse_tag (message, &tag_list); - - GST_DEBUG ("tag message: %" GST_PTR_FORMAT, tag_list); - - if (!gst_tag_list_get_value_index (tag_list, "ofa-fingerprint", 0)) { - gst_tag_list_unref (tag_list); - break; - } - - fail_unless (gst_tag_list_get_string (tag_list, "ofa-fingerprint", &fpr)); - - p = fpr; - while (*p) { - fail_unless (g_ascii_isalnum 
(*p) || *p == '=' || *p == '+' - || *p == '/'); - p++; - } - - g_free (fpr); - gst_tag_list_unref (tag_list); - - found_fingerprint = TRUE; - - g_main_loop_quit (loop); - break; - } - default: - break; - } - - return TRUE; -} - -GST_START_TEST (test_ofa_le_1ch) -{ - GstElement *pipeline; - GstElement *audiotestsrc, *audioconvert, *capsfilter, *ofa, *fakesink; - - GstBus *bus; - GMainLoop *loop; - GstCaps *caps; - gint64 position; - GstFormat fmt = GST_FORMAT_TIME; - guint bus_watch = 0; - - pipeline = gst_pipeline_new ("pipeline"); - fail_unless (pipeline != NULL); - - audiotestsrc = gst_element_factory_make ("audiotestsrc", "src"); - fail_unless (audiotestsrc != NULL); - g_object_set (G_OBJECT (audiotestsrc), "wave", 0, "freq", 440.0, NULL); - - audioconvert = gst_element_factory_make ("audioconvert", "audioconvert"); - fail_unless (audioconvert != NULL); - g_object_set (G_OBJECT (audioconvert), "dithering", 0, NULL); - - capsfilter = gst_element_factory_make ("capsfilter", "capsfilter"); - fail_unless (capsfilter != NULL); - caps = gst_caps_new_simple ("audio/x-raw", "format", G_TYPE_STRING, "S16LE", - "rate", G_TYPE_INT, 44100, "channels", G_TYPE_INT, 1, NULL); - g_object_set (G_OBJECT (capsfilter), "caps", caps, NULL); - gst_caps_unref (caps); - - ofa = gst_element_factory_make ("ofa", "ofa"); - fail_unless (ofa != NULL); - - fakesink = gst_element_factory_make ("fakesink", "sink"); - fail_unless (fakesink != NULL); - - gst_bin_add_many (GST_BIN (pipeline), audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL); - - fail_unless (gst_element_link_many (audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL)); - - loop = g_main_loop_new (NULL, TRUE); - fail_unless (loop != NULL); - - bus = gst_element_get_bus (pipeline); - fail_unless (bus != NULL); - bus_watch = gst_bus_add_watch (bus, bus_handler, loop); - gst_object_unref (bus); - - found_fingerprint = FALSE; - gst_element_set_state (pipeline, GST_STATE_PLAYING); - g_main_loop_run (loop); - - 
fail_unless (gst_element_query_position (audiotestsrc, fmt, &position)); - fail_unless (position >= 135 * GST_SECOND); - - gst_element_set_state (pipeline, GST_STATE_NULL); - - fail_unless (found_fingerprint == TRUE); - g_object_unref (pipeline); - g_main_loop_unref (loop); - g_source_remove (bus_watch); -} - -GST_END_TEST; - - -GST_START_TEST (test_ofa_be_1ch) -{ - GstElement *pipeline; - GstElement *audiotestsrc, *audioconvert, *capsfilter, *ofa, *fakesink; - GstBus *bus; - GMainLoop *loop; - GstCaps *caps; - gint64 position; - GstFormat fmt = GST_FORMAT_TIME; - guint bus_watch = 0; - - pipeline = gst_pipeline_new ("pipeline"); - fail_unless (pipeline != NULL); - - audiotestsrc = gst_element_factory_make ("audiotestsrc", "src"); - fail_unless (audiotestsrc != NULL); - g_object_set (G_OBJECT (audiotestsrc), "wave", 0, "freq", 440.0, NULL); - - audioconvert = gst_element_factory_make ("audioconvert", "audioconvert"); - fail_unless (audioconvert != NULL); - g_object_set (G_OBJECT (audioconvert), "dithering", 0, NULL); - - capsfilter = gst_element_factory_make ("capsfilter", "capsfilter"); - fail_unless (capsfilter != NULL); - caps = gst_caps_new_simple ("audio/x-raw", "format", G_TYPE_STRING, "S16BE", - "rate", G_TYPE_INT, 44100, "channels", G_TYPE_INT, 1, NULL); - g_object_set (G_OBJECT (capsfilter), "caps", caps, NULL); - gst_caps_unref (caps); - - ofa = gst_element_factory_make ("ofa", "ofa"); - fail_unless (ofa != NULL); - - fakesink = gst_element_factory_make ("fakesink", "sink"); - fail_unless (fakesink != NULL); - - gst_bin_add_many (GST_BIN (pipeline), audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL); - - fail_unless (gst_element_link_many (audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL)); - - loop = g_main_loop_new (NULL, TRUE); - fail_unless (loop != NULL); - - bus = gst_element_get_bus (pipeline); - fail_unless (bus != NULL); - bus_watch = gst_bus_add_watch (bus, bus_handler, loop); - gst_object_unref (bus); - - 
found_fingerprint = FALSE; - gst_element_set_state (pipeline, GST_STATE_PLAYING); - g_main_loop_run (loop); - - fail_unless (gst_element_query_position (audiotestsrc, fmt, &position)); - fail_unless (position >= 135 * GST_SECOND); - - gst_element_set_state (pipeline, GST_STATE_NULL); - - fail_unless (found_fingerprint == TRUE); - g_object_unref (pipeline); - g_main_loop_unref (loop); - g_source_remove (bus_watch); -} - -GST_END_TEST; - -GST_START_TEST (test_ofa_le_2ch) -{ - GstElement *pipeline; - GstElement *audiotestsrc, *audioconvert, *capsfilter, *ofa, *fakesink; - GstBus *bus; - GMainLoop *loop; - GstCaps *caps; - gint64 position; - GstFormat fmt = GST_FORMAT_TIME; - guint bus_watch = 0; - - pipeline = gst_pipeline_new ("pipeline"); - fail_unless (pipeline != NULL); - - audiotestsrc = gst_element_factory_make ("audiotestsrc", "src"); - fail_unless (audiotestsrc != NULL); - g_object_set (G_OBJECT (audiotestsrc), "wave", 0, "freq", 440.0, NULL); - - audioconvert = gst_element_factory_make ("audioconvert", "audioconvert"); - fail_unless (audioconvert != NULL); - g_object_set (G_OBJECT (audioconvert), "dithering", 0, NULL); - - capsfilter = gst_element_factory_make ("capsfilter", "capsfilter"); - fail_unless (capsfilter != NULL); - caps = gst_caps_new_simple ("audio/x-raw", "format", G_TYPE_STRING, "S16LE", - "rate", G_TYPE_INT, 44100, "channels", G_TYPE_INT, 2, NULL); - g_object_set (G_OBJECT (capsfilter), "caps", caps, NULL); - gst_caps_unref (caps); - - ofa = gst_element_factory_make ("ofa", "ofa"); - fail_unless (ofa != NULL); - - fakesink = gst_element_factory_make ("fakesink", "sink"); - fail_unless (fakesink != NULL); - - gst_bin_add_many (GST_BIN (pipeline), audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL); - - fail_unless (gst_element_link_many (audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL)); - - loop = g_main_loop_new (NULL, TRUE); - fail_unless (loop != NULL); - - bus = gst_element_get_bus (pipeline); - fail_unless (bus != 
NULL); - bus_watch = gst_bus_add_watch (bus, bus_handler, loop); - gst_object_unref (bus); - - found_fingerprint = FALSE; - gst_element_set_state (pipeline, GST_STATE_PLAYING); - g_main_loop_run (loop); - - fail_unless (gst_element_query_position (audiotestsrc, fmt, &position)); - fail_unless (position >= 135 * GST_SECOND); - - gst_element_set_state (pipeline, GST_STATE_NULL); - - fail_unless (found_fingerprint == TRUE); - g_object_unref (pipeline); - g_main_loop_unref (loop); - g_source_remove (bus_watch); -} - -GST_END_TEST; - - -GST_START_TEST (test_ofa_be_2ch) -{ - GstElement *pipeline; - GstElement *audiotestsrc, *audioconvert, *capsfilter, *ofa, *fakesink; - GstBus *bus; - GMainLoop *loop; - GstCaps *caps; - gint64 position; - GstFormat fmt = GST_FORMAT_TIME; - guint bus_watch = 0; - - pipeline = gst_pipeline_new ("pipeline"); - fail_unless (pipeline != NULL); - - audiotestsrc = gst_element_factory_make ("audiotestsrc", "src"); - fail_unless (audiotestsrc != NULL); - g_object_set (G_OBJECT (audiotestsrc), "wave", 0, "freq", 440.0, NULL); - - audioconvert = gst_element_factory_make ("audioconvert", "audioconvert"); - fail_unless (audioconvert != NULL); - g_object_set (G_OBJECT (audioconvert), "dithering", 0, NULL); - - capsfilter = gst_element_factory_make ("capsfilter", "capsfilter"); - fail_unless (capsfilter != NULL); - caps = gst_caps_new_simple ("audio/x-raw", "format", G_TYPE_STRING, "S16BE", - "rate", G_TYPE_INT, 44100, "channels", G_TYPE_INT, 2, NULL); - g_object_set (G_OBJECT (capsfilter), "caps", caps, NULL); - gst_caps_unref (caps); - - ofa = gst_element_factory_make ("ofa", "ofa"); - fail_unless (ofa != NULL); - - fakesink = gst_element_factory_make ("fakesink", "sink"); - fail_unless (fakesink != NULL); - - gst_bin_add_many (GST_BIN (pipeline), audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL); - - fail_unless (gst_element_link_many (audiotestsrc, audioconvert, capsfilter, - ofa, fakesink, NULL)); - - loop = g_main_loop_new (NULL, 
TRUE); - fail_unless (loop != NULL); - - bus = gst_element_get_bus (pipeline); - fail_unless (bus != NULL); - bus_watch = gst_bus_add_watch (bus, bus_handler, loop); - gst_object_unref (bus); - - found_fingerprint = FALSE; - gst_element_set_state (pipeline, GST_STATE_PLAYING); - g_main_loop_run (loop); - - fail_unless (gst_element_query_position (audiotestsrc, fmt, &position)); - fail_unless (position >= 135 * GST_SECOND); - - gst_element_set_state (pipeline, GST_STATE_NULL); - - fail_unless (found_fingerprint == TRUE); - g_object_unref (pipeline); - g_main_loop_unref (loop); - g_source_remove (bus_watch); -} - -GST_END_TEST; - -static Suite * -ofa_suite (void) -{ - Suite *s = suite_create ("OFA"); - TCase *tc_chain = tcase_create ("linear"); - - /* time out after 120s, not the default 3 */ - tcase_set_timeout (tc_chain, 120); - - suite_add_tcase (s, tc_chain); - tcase_add_test (tc_chain, test_ofa_le_1ch); - tcase_add_test (tc_chain, test_ofa_be_1ch); - tcase_add_test (tc_chain, test_ofa_le_2ch); - tcase_add_test (tc_chain, test_ofa_be_2ch); - - return s; -} - -GST_CHECK_MAIN (ofa)
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/libs/player.c
Deleted
@@ -1,1735 +0,0 @@ -/* GStreamer - * - * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> - * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> - * - * This library is free software; you can redistribute it and/or - * modify it under the terms of the GNU Library General Public - * License as published by the Free Software Foundation; either - * version 2 of the License, or (at your option) any later version. - * - * This library is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - * Library General Public License for more details. - * - * You should have received a copy of the GNU Library General Public - * License along with this library; if not, write to the - * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, - * Boston, MA 02110-1301, USA. - */ - -/* TODO: - * - start with pause, go to playing - * - play, pause, play - * - set uri in play/pause - * - play/pause after eos - * - seek in play/pause/stopped, after eos, back to 0, after duration - * - http buffering - */ - -#ifdef HAVE_CONFIG_H -#include "config.h" -#endif - -#ifdef HAVE_VALGRIND -# include <valgrind/valgrind.h> -#endif - -#include <gst/check/gstcheck.h> - -#define fail_unless_equals_int(a, b) \ -G_STMT_START { \ - int first = a; \ - int second = b; \ - fail_unless(first == second, \ - "'" #a "' (%d) is not equal to '" #b"' (%d)", first, second); \ -} G_STMT_END; - -#define fail_unless_equals_uint64(a, b) \ -G_STMT_START { \ - guint64 first = a; \ - guint64 second = b; \ - fail_unless(first == second, \ - "'" #a "' (%" G_GUINT64_FORMAT ") is not equal to '" #b"' (%" \ - G_GUINT64_FORMAT ")", first, second); \ -} G_STMT_END; - -#define fail_unless_equals_double(a, b) \ -G_STMT_START { \ - double first = a; \ - double second = b; \ - fail_unless(first == second, \ - "'" #a "' (%lf) is not equal to '" #b"' (%lf)", first, 
second); \ -} G_STMT_END; - -#include <gst/player/player.h> - -START_TEST (test_create_and_free) -{ - GstPlayer *player; - - player = gst_player_new (NULL, NULL); - fail_unless (player != NULL); - g_object_unref (player); -} - -END_TEST; - -START_TEST (test_set_and_get_uri) -{ - GstPlayer *player; - gchar *uri; - - player = gst_player_new (NULL, NULL); - - fail_unless (player != NULL); - - gst_player_set_uri (player, "file:///path/to/a/file"); - uri = gst_player_get_uri (player); - - fail_unless (g_strcmp0 (uri, "file:///path/to/a/file") == 0); - - g_free (uri); - g_object_unref (player); -} - -END_TEST; - -START_TEST (test_set_and_get_position_update_interval) -{ - GstPlayer *player; - guint interval = 0; - GstStructure *config; - - player = gst_player_new (NULL, NULL); - - fail_unless (player != NULL); - - config = gst_player_get_config (player); - gst_player_config_set_position_update_interval (config, 500); - interval = gst_player_config_get_position_update_interval (config); - fail_unless (interval == 500); - gst_player_set_config (player, config); - - g_object_unref (player); -} - -END_TEST; - -typedef enum -{ - STATE_CHANGE_BUFFERING, - STATE_CHANGE_DURATION_CHANGED, - STATE_CHANGE_END_OF_STREAM, - STATE_CHANGE_ERROR, - STATE_CHANGE_WARNING, - STATE_CHANGE_POSITION_UPDATED, - STATE_CHANGE_STATE_CHANGED, - STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED, - STATE_CHANGE_MEDIA_INFO_UPDATED, - STATE_CHANGE_SEEK_DONE, - STATE_CHANGE_URI_LOADED, -} TestPlayerStateChange; - -static const gchar * -test_player_state_change_get_name (TestPlayerStateChange change) -{ - switch (change) { - case STATE_CHANGE_BUFFERING: - return "buffering"; - case STATE_CHANGE_DURATION_CHANGED: - return "duration-changed"; - case STATE_CHANGE_END_OF_STREAM: - return "end-of-stream"; - case STATE_CHANGE_WARNING: - return "warning"; - case STATE_CHANGE_ERROR: - return "error"; - case STATE_CHANGE_POSITION_UPDATED: - return "position-updated"; - case STATE_CHANGE_STATE_CHANGED: - return 
"state-changed"; - case STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED: - return "video-dimensions-changed"; - case STATE_CHANGE_MEDIA_INFO_UPDATED: - return "media-info-updated"; - case STATE_CHANGE_SEEK_DONE: - return "seek-done"; - case STATE_CHANGE_URI_LOADED: - return "uri-loaded"; - default: - g_assert_not_reached (); - break; - } -} - -typedef struct _TestPlayerState TestPlayerState; -struct _TestPlayerState -{ - GMainLoop *loop; - - gint buffering_percent; - guint64 position, duration, seek_done_position; - gboolean end_of_stream, error, warning, seek_done; - GstPlayerState state; - gint width, height; - GstPlayerMediaInfo *media_info; - gchar *uri_loaded; - gboolean stopping; - - void (*test_callback) (GstPlayer * player, TestPlayerStateChange change, - TestPlayerState * old_state, TestPlayerState * new_state); - gpointer test_data; -}; - -static void -test_player_state_change_debug (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - GST_DEBUG_OBJECT (player, "Changed %s:\n" - "\tbuffering %d%% -> %d%%\n" - "\tposition %" GST_TIME_FORMAT " -> %" GST_TIME_FORMAT "\n" - "\tduration %" GST_TIME_FORMAT " -> %" GST_TIME_FORMAT "\n" - "\tseek position %" GST_TIME_FORMAT " -> %" GST_TIME_FORMAT "\n" - "\tend-of-stream %d -> %d\n" - "\terror %d -> %d\n" - "\tseek_done %d -> %d\n" - "\tstate %s -> %s\n" - "\twidth/height %d/%d -> %d/%d\n" - "\tmedia_info %p -> %p\n" - "\turi_loaded %s -> %s", - test_player_state_change_get_name (change), - old_state->buffering_percent, new_state->buffering_percent, - GST_TIME_ARGS (old_state->position), GST_TIME_ARGS (new_state->position), - GST_TIME_ARGS (old_state->duration), GST_TIME_ARGS (new_state->duration), - GST_TIME_ARGS (old_state->seek_done_position), - GST_TIME_ARGS (new_state->seek_done_position), old_state->end_of_stream, - new_state->end_of_stream, old_state->error, new_state->error, - old_state->seek_done, new_state->seek_done, - gst_player_state_get_name 
(old_state->state), - gst_player_state_get_name (new_state->state), old_state->width, - old_state->height, new_state->width, new_state->height, - old_state->media_info, new_state->media_info, - old_state->uri_loaded, new_state->uri_loaded); -} - -static void -test_player_state_reset (TestPlayerState * state) -{ - state->buffering_percent = 100; - state->position = state->duration = state->seek_done_position = -1; - state->end_of_stream = state->error = state->seek_done = FALSE; - state->state = GST_PLAYER_STATE_STOPPED; - state->width = state->height = 0; - state->media_info = NULL; - state->stopping = FALSE; - g_clear_pointer (&state->uri_loaded, g_free); -} - -static void -buffering_cb (GstPlayer * player, gint percent, TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->buffering_percent = percent; - test_player_state_change_debug (player, STATE_CHANGE_BUFFERING, &old_state, - state); - state->test_callback (player, STATE_CHANGE_BUFFERING, &old_state, state); -} - -static void -duration_changed_cb (GstPlayer * player, guint64 duration, - TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->duration = duration; - test_player_state_change_debug (player, STATE_CHANGE_DURATION_CHANGED, - &old_state, state); - state->test_callback (player, STATE_CHANGE_DURATION_CHANGED, &old_state, - state); -} - -static void -end_of_stream_cb (GstPlayer * player, TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->end_of_stream = TRUE; - test_player_state_change_debug (player, STATE_CHANGE_END_OF_STREAM, - &old_state, state); - state->test_callback (player, STATE_CHANGE_END_OF_STREAM, &old_state, state); -} - -static void -error_cb (GstPlayer * player, GError * error, TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->error = TRUE; - 
test_player_state_change_debug (player, STATE_CHANGE_ERROR, &old_state, - state); - state->test_callback (player, STATE_CHANGE_ERROR, &old_state, state); -} - -static void -warning_cb (GstPlayer * player, GError * error, TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->warning = TRUE; - test_player_state_change_debug (player, STATE_CHANGE_WARNING, &old_state, - state); - state->test_callback (player, STATE_CHANGE_WARNING, &old_state, state); -} - -static void -position_updated_cb (GstPlayer * player, guint64 position, - TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->position = position; - test_player_state_change_debug (player, STATE_CHANGE_POSITION_UPDATED, - &old_state, state); - state->test_callback (player, STATE_CHANGE_POSITION_UPDATED, &old_state, - state); -} - -static void -media_info_updated_cb (GstPlayer * player, GstPlayerMediaInfo * media_info, - TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->media_info = media_info; - - test_player_state_change_debug (player, STATE_CHANGE_MEDIA_INFO_UPDATED, - &old_state, state); - state->test_callback (player, STATE_CHANGE_MEDIA_INFO_UPDATED, &old_state, - state); -} - -static void -state_changed_cb (GstPlayer * player, GstPlayerState player_state, - TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping || player_state == GST_PLAYER_STATE_STOPPED); - - state->state = player_state; - - if (player_state == GST_PLAYER_STATE_STOPPED) - test_player_state_reset (state); - - test_player_state_change_debug (player, STATE_CHANGE_STATE_CHANGED, - &old_state, state); - state->test_callback (player, STATE_CHANGE_STATE_CHANGED, &old_state, state); -} - -static void -video_dimensions_changed_cb (GstPlayer * player, gint width, gint height, - TestPlayerState * state) -{ - TestPlayerState old_state = *state; 
- - g_assert (!state->stopping); - - state->width = width; - state->height = height; - test_player_state_change_debug (player, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED, - &old_state, state); - state->test_callback (player, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED, - &old_state, state); -} - -static void -seek_done_cb (GstPlayer * player, guint64 position, TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - g_assert (!state->stopping); - - state->seek_done = TRUE; - state->seek_done_position = position; - test_player_state_change_debug (player, STATE_CHANGE_SEEK_DONE, - &old_state, state); - state->test_callback (player, STATE_CHANGE_SEEK_DONE, &old_state, state); -} - -static void -uri_loaded_cb (GstPlayer * player, const gchar * uri, TestPlayerState * state) -{ - TestPlayerState old_state = *state; - - state->uri_loaded = g_strdup (uri); - state->test_callback (player, STATE_CHANGE_URI_LOADED, &old_state, state); -} - -static GstPlayer * -test_player_new (TestPlayerState * state) -{ - GstPlayer *player; - GstElement *playbin, *fakesink; - - player = - gst_player_new (NULL, - gst_player_g_main_context_signal_dispatcher_new (NULL)); - fail_unless (player != NULL); - - test_player_state_reset (state); - - playbin = gst_player_get_pipeline (player); - fakesink = gst_element_factory_make ("fakesink", "audio-sink"); - g_object_set (fakesink, "sync", TRUE, NULL); - g_object_set (playbin, "audio-sink", fakesink, NULL); - fakesink = gst_element_factory_make ("fakesink", "video-sink"); - g_object_set (fakesink, "sync", TRUE, NULL); - g_object_set (playbin, "video-sink", fakesink, NULL); - gst_object_unref (playbin); - - g_signal_connect (player, "buffering", G_CALLBACK (buffering_cb), state); - g_signal_connect (player, "duration-changed", - G_CALLBACK (duration_changed_cb), state); - g_signal_connect (player, "end-of-stream", G_CALLBACK (end_of_stream_cb), - state); - g_signal_connect (player, "error", G_CALLBACK (error_cb), state); - g_signal_connect (player, 
"warning", G_CALLBACK (warning_cb), state); - g_signal_connect (player, "position-updated", - G_CALLBACK (position_updated_cb), state); - g_signal_connect (player, "state-changed", G_CALLBACK (state_changed_cb), - state); - g_signal_connect (player, "media-info-updated", - G_CALLBACK (media_info_updated_cb), state); - g_signal_connect (player, "video-dimensions-changed", - G_CALLBACK (video_dimensions_changed_cb), state); - g_signal_connect (player, "seek-done", G_CALLBACK (seek_done_cb), state); - g_signal_connect (player, "uri-loaded", G_CALLBACK (uri_loaded_cb), state); - - return player; -} - -static void -test_player_stopped_cb (GstPlayer * player, TestPlayerStateChange change, - TestPlayerState * old_state, TestPlayerState * new_state) -{ - if (new_state->state == GST_PLAYER_STATE_STOPPED) { - g_main_loop_quit (new_state->loop); - } -} - -static void -stop_player (GstPlayer * player, TestPlayerState * state) -{ - if (state->state != GST_PLAYER_STATE_STOPPED) { - /* Make sure all pending operations are finished so the player won't - * appear as 'leaked' to leak detection tools. */ - state->test_callback = test_player_stopped_cb; - gst_player_stop (player); - state->stopping = TRUE; - g_main_loop_run (state->loop); - } -} - -static void -test_play_audio_video_eos_cb (GstPlayer * player, TestPlayerStateChange change, - TestPlayerState * old_state, TestPlayerState * new_state) -{ - gint step = GPOINTER_TO_INT (new_state->test_data); - gboolean video; - - video = ! !(step & 0x10); - step = (step & (~0x10)); - - switch (step) { - case 0: - fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); - if (video) - fail_unless (g_str_has_suffix (new_state->uri_loaded, - "audio-video-short.ogg")); - else - fail_unless (g_str_has_suffix (new_state->uri_loaded, - "audio-short.ogg")); - new_state->test_data = - GINT_TO_POINTER ((video ? 
0x10 : 0x00) | (step + 1)); - break; - case 1: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_STOPPED); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_BUFFERING); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - break; - case 2: - fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - break; - case 3: - fail_unless_equals_int (change, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED); - if (video) { - fail_unless_equals_int (new_state->width, 320); - fail_unless_equals_int (new_state->height, 240); - } else { - fail_unless_equals_int (new_state->width, 0); - fail_unless_equals_int (new_state->height, 0); - } - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - break; - case 4: - fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - break; - case 5: - fail_unless_equals_int (change, STATE_CHANGE_DURATION_CHANGED); - fail_unless_equals_uint64 (new_state->duration, - G_GUINT64_CONSTANT (464399092)); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - break; - case 6: - fail_unless_equals_int (change, STATE_CHANGE_POSITION_UPDATED); - fail_unless_equals_uint64 (new_state->position, G_GUINT64_CONSTANT (0)); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - break; - case 7: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_BUFFERING); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_PLAYING); - new_state->test_data = - GINT_TO_POINTER ((video ? 
0x10 : 0x00) | (step + 1)); - break; - case 8: - if (change == STATE_CHANGE_POSITION_UPDATED) { - fail_unless (old_state->position <= new_state->position); - } else { - fail_unless_equals_uint64 (old_state->position, old_state->duration); - fail_unless_equals_int (change, STATE_CHANGE_END_OF_STREAM); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - } - break; - case 9: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_PLAYING); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_STOPPED); - new_state->test_data = - GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); - g_main_loop_quit (new_state->loop); - break; - default: - fail (); - break; - } -} - -START_TEST (test_play_audio_eos) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_audio_video_eos_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/audio-short.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 10); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_audio_info (GstPlayerMediaInfo * media_info) -{ - gint i = 0; - GList *list; - - for (list = gst_player_media_info_get_audio_streams (media_info); - list != NULL; list = list->next) { - GstPlayerStreamInfo *stream = (GstPlayerStreamInfo *) list->data; - GstPlayerAudioInfo *audio_info = (GstPlayerAudioInfo *) stream; - - fail_unless (gst_player_stream_info_get_tags (stream) != NULL); - fail_unless (gst_player_stream_info_get_caps (stream) != NULL); - 
fail_unless_equals_string (gst_player_stream_info_get_stream_type (stream), - "audio"); - - if (i == 0) { - fail_unless_equals_string (gst_player_stream_info_get_codec (stream), - "MPEG-1 Layer 3 (MP3)"); - fail_unless_equals_int (gst_player_audio_info_get_sample_rate - (audio_info), 48000); - fail_unless_equals_int (gst_player_audio_info_get_channels (audio_info), - 2); - fail_unless_equals_int (gst_player_audio_info_get_max_bitrate - (audio_info), 192000); - fail_unless (gst_player_audio_info_get_language (audio_info) != NULL); - } else { - fail_unless_equals_string (gst_player_stream_info_get_codec (stream), - "MPEG-4 AAC"); - fail_unless_equals_int (gst_player_audio_info_get_sample_rate - (audio_info), 48000); - fail_unless_equals_int (gst_player_audio_info_get_channels (audio_info), - 6); - fail_unless (gst_player_audio_info_get_language (audio_info) != NULL); - } - - i++; - } -} - -static void -test_video_info (GstPlayerMediaInfo * media_info) -{ - GList *list; - - for (list = gst_player_media_info_get_video_streams (media_info); - list != NULL; list = list->next) { - gint fps_d, fps_n; - guint par_d, par_n; - GstPlayerStreamInfo *stream = (GstPlayerStreamInfo *) list->data; - GstPlayerVideoInfo *video_info = (GstPlayerVideoInfo *) stream; - - fail_unless (gst_player_stream_info_get_tags (stream) != NULL); - fail_unless (gst_player_stream_info_get_caps (stream) != NULL); - fail_unless_equals_int (gst_player_stream_info_get_index (stream), 0); - fail_unless (strstr (gst_player_stream_info_get_codec (stream), - "H.264") != NULL - || strstr (gst_player_stream_info_get_codec (stream), "H264") != NULL); - fail_unless_equals_int (gst_player_video_info_get_width (video_info), 320); - fail_unless_equals_int (gst_player_video_info_get_height (video_info), 240); - gst_player_video_info_get_framerate (video_info, &fps_n, &fps_d); - fail_unless_equals_int (fps_n, 24); - fail_unless_equals_int (fps_d, 1); - gst_player_video_info_get_pixel_aspect_ratio (video_info, &par_n, 
&par_d); - fail_unless_equals_int (par_n, 33); - fail_unless_equals_int (par_d, 20); - } -} - -static void -test_subtitle_info (GstPlayerMediaInfo * media_info) -{ - GList *list; - - for (list = gst_player_media_info_get_subtitle_streams (media_info); - list != NULL; list = list->next) { - GstPlayerStreamInfo *stream = (GstPlayerStreamInfo *) list->data; - GstPlayerSubtitleInfo *sub = (GstPlayerSubtitleInfo *) stream; - - fail_unless_equals_string (gst_player_stream_info_get_stream_type (stream), - "subtitle"); - fail_unless (gst_player_stream_info_get_tags (stream) != NULL); - fail_unless (gst_player_stream_info_get_caps (stream) != NULL); - fail_unless_equals_string (gst_player_stream_info_get_codec (stream), - "Timed Text"); - fail_unless (gst_player_subtitle_info_get_language (sub) != NULL); - } -} - -static void -test_media_info_object (GstPlayer * player, GstPlayerMediaInfo * media_info) -{ - GList *list; - - /* global tag */ - fail_unless (gst_player_media_info_is_seekable (media_info) == TRUE); - fail_unless (gst_player_media_info_get_tags (media_info) != NULL); - fail_unless_equals_string (gst_player_media_info_get_title (media_info), - "Sintel"); - fail_unless_equals_string (gst_player_media_info_get_container_format - (media_info), "Matroska"); - fail_unless (gst_player_media_info_get_image_sample (media_info) == NULL); - fail_unless (strstr (gst_player_media_info_get_uri (media_info), - "sintel.mkv") != NULL); - - /* number of streams */ - list = gst_player_media_info_get_stream_list (media_info); - fail_unless (list != NULL); - fail_unless_equals_int (g_list_length (list), 10); - - list = gst_player_media_info_get_video_streams (media_info); - fail_unless (list != NULL); - fail_unless_equals_int (g_list_length (list), 1); - - list = gst_player_media_info_get_audio_streams (media_info); - fail_unless (list != NULL); - fail_unless_equals_int (g_list_length (list), 2); - - list = gst_player_media_info_get_subtitle_streams (media_info); - fail_unless (list 
!= NULL); - fail_unless_equals_int (g_list_length (list), 7); - - /* test subtitle */ - test_subtitle_info (media_info); - - /* test audio */ - test_audio_info (media_info); - - /* test video */ - test_video_info (media_info); -} - -static void -test_play_media_info_cb (GstPlayer * player, TestPlayerStateChange change, - TestPlayerState * old_state, TestPlayerState * new_state) -{ - gint completed = GPOINTER_TO_INT (new_state->test_data); - - if (change == STATE_CHANGE_MEDIA_INFO_UPDATED) { - test_media_info_object (player, new_state->media_info); - new_state->test_data = GINT_TO_POINTER (completed + 1); - g_main_loop_quit (new_state->loop); - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) { - g_main_loop_quit (new_state->loop); - } -} - -START_TEST (test_play_media_info) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_media_info_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 1); - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_error_invalid_external_suburi_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint steps = GPOINTER_TO_INT (new_state->test_data); - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - gchar *suburi; - - suburi = gst_filename_to_uri (TEST_PATH "/foo.srt", NULL); - fail_unless (suburi != NULL); - - new_state->test_data = GINT_TO_POINTER (steps + 1); - /* load invalid 
suburi */ - gst_player_set_subtitle_uri (player, suburi); - g_free (suburi); - - } else if (steps && change == STATE_CHANGE_WARNING) { - new_state->test_data = GINT_TO_POINTER (steps + 1); - g_main_loop_quit (new_state->loop); - - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) - g_main_loop_quit (new_state->loop); -} - -static void -test_play_stream_disable_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint steps = GPOINTER_TO_INT (new_state->test_data) & 0xf; - gint mask = GPOINTER_TO_INT (new_state->test_data) & 0xf0; - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - new_state->test_data = GINT_TO_POINTER (0x10 + steps + 1); - gst_player_set_audio_track_enabled (player, FALSE); - - } else if (mask == 0x10 && change == STATE_CHANGE_POSITION_UPDATED) { - GstPlayerAudioInfo *audio; - - audio = gst_player_get_current_audio_track (player); - fail_unless (audio == NULL); - new_state->test_data = GINT_TO_POINTER (0x20 + steps + 1); - gst_player_set_subtitle_track_enabled (player, FALSE); - - } else if (mask == 0x20 && change == STATE_CHANGE_POSITION_UPDATED) { - GstPlayerSubtitleInfo *sub; - - sub = gst_player_get_current_subtitle_track (player); - fail_unless (sub == NULL); - new_state->test_data = GINT_TO_POINTER (0x30 + steps + 1); - g_main_loop_quit (new_state->loop); - - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) { - g_main_loop_quit (new_state->loop); - } -} - -START_TEST (test_play_stream_disable) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_stream_disable_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); - fail_unless (uri != NULL); 
- gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 0x33); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_stream_switch_audio_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint steps = GPOINTER_TO_INT (new_state->test_data); - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - gint ret; - - new_state->test_data = GINT_TO_POINTER (steps + 1); - ret = gst_player_set_audio_track (player, 1); - fail_unless_equals_int (ret, 1); - - } else if (steps && change == STATE_CHANGE_POSITION_UPDATED) { - gint index; - GstPlayerAudioInfo *audio; - - audio = gst_player_get_current_audio_track (player); - fail_unless (audio != NULL); - index = gst_player_stream_info_get_index ((GstPlayerStreamInfo *) audio); - fail_unless_equals_int (index, 1); - g_object_unref (audio); - - new_state->test_data = GINT_TO_POINTER (steps + 1); - g_main_loop_quit (new_state->loop); - - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) { - g_main_loop_quit (new_state->loop); - } -} - -START_TEST (test_play_stream_switch_audio) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_stream_switch_audio_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); - - stop_player (player, &state); - g_object_unref (player); - 
g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_stream_switch_subtitle_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint steps = GPOINTER_TO_INT (new_state->test_data); - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - gint ret; - - new_state->test_data = GINT_TO_POINTER (steps + 1); - ret = gst_player_set_subtitle_track (player, 5); - fail_unless_equals_int (ret, 1); - - } else if (steps && change == STATE_CHANGE_POSITION_UPDATED) { - gint index; - GstPlayerSubtitleInfo *sub; - - sub = gst_player_get_current_subtitle_track (player); - fail_unless (sub != NULL); - index = gst_player_stream_info_get_index ((GstPlayerStreamInfo *) sub); - fail_unless_equals_int (index, 5); - g_object_unref (sub); - - new_state->test_data = GINT_TO_POINTER (steps + 1); - g_main_loop_quit (new_state->loop); - - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) { - g_main_loop_quit (new_state->loop); - } -} - -START_TEST (test_play_stream_switch_subtitle) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_stream_switch_subtitle_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -START_TEST (test_play_error_invalid_external_suburi) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new 
(NULL, FALSE); - state.test_callback = test_play_error_invalid_external_suburi_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/audio-video.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static gboolean -has_subtitle_stream (TestPlayerState * new_state) -{ - if (gst_player_media_info_get_subtitle_streams (new_state->media_info)) - return TRUE; - - return FALSE; -} - -static void -test_play_external_suburi_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint steps = GPOINTER_TO_INT (new_state->test_data); - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - gchar *suburi; - - suburi = gst_filename_to_uri (TEST_PATH "/test_sub.srt", NULL); - fail_unless (suburi != NULL); - - gst_player_set_subtitle_uri (player, suburi); - g_free (suburi); - new_state->test_data = GINT_TO_POINTER (steps + 1); - - } else if (change == STATE_CHANGE_MEDIA_INFO_UPDATED && - has_subtitle_stream (new_state)) { - gchar *current_suburi, *suburi; - - current_suburi = gst_player_get_subtitle_uri (player); - fail_unless (current_suburi != NULL); - suburi = gst_filename_to_uri (TEST_PATH "/test_sub.srt", NULL); - fail_unless (suburi != NULL); - - fail_unless_equals_int (g_strcmp0 (current_suburi, suburi), 0); - - g_free (current_suburi); - g_free (suburi); - new_state->test_data = GINT_TO_POINTER (steps + 1); - g_main_loop_quit (new_state->loop); - - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) - g_main_loop_quit (new_state->loop); -} - -START_TEST (test_play_external_suburi) -{ 
- GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_external_suburi_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/audio-video.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_rate_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint steps = GPOINTER_TO_INT (new_state->test_data) & 0xf; - gint mask = GPOINTER_TO_INT (new_state->test_data) & 0xf0; - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - guint64 dur = -1, pos = -1; - - g_object_get (player, "position", &pos, "duration", &dur, NULL); - pos = pos + dur * 0.2; /* seek 20% */ - gst_player_seek (player, pos); - - /* default rate should be 1.0 */ - fail_unless_equals_double (gst_player_get_rate (player), 1.0); - new_state->test_data = GINT_TO_POINTER (mask + steps + 1); - } else if (change == STATE_CHANGE_END_OF_STREAM || - change == STATE_CHANGE_ERROR) { - g_main_loop_quit (new_state->loop); - } else if (steps == 1 && change == STATE_CHANGE_SEEK_DONE) { - if (mask == 0x10) - gst_player_set_rate (player, 1.5); - else if (mask == 0x20) - gst_player_set_rate (player, -1.0); - - new_state->test_data = GINT_TO_POINTER (mask + steps + 1); - } else if (steps && (change == STATE_CHANGE_POSITION_UPDATED)) { - if (steps == 10) { - g_main_loop_quit (new_state->loop); - } else { - if (mask == 0x10 && (new_state->position > old_state->position)) - new_state->test_data = GINT_TO_POINTER (mask + 
steps + 1); - else if (mask == 0x20 && (new_state->position < old_state->position)) - new_state->test_data = GINT_TO_POINTER (mask + steps + 1); - } - } -} - -START_TEST (test_play_forward_rate) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_rate_cb; - state.test_data = GINT_TO_POINTER (0x10); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/audio.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & 0xf, 10); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -START_TEST (test_play_backward_rate) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_rate_cb; - state.test_data = GINT_TO_POINTER (0x20); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/audio.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & 0xf, 10); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -START_TEST (test_play_audio_video_eos) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_audio_video_eos_cb; - state.test_data = GINT_TO_POINTER (0x10); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - 
uri = gst_filename_to_uri (TEST_PATH "/audio-video-short.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & (~0x10), 10); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_error_invalid_uri_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint step = GPOINTER_TO_INT (new_state->test_data); - - switch (step) { - case 0: - fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); - fail_unless_equals_string (new_state->uri_loaded, "foo://bar"); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 1: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_STOPPED); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_BUFFERING); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 2: - fail_unless_equals_int (change, STATE_CHANGE_ERROR); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 3: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_BUFFERING); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_STOPPED); - new_state->test_data = GINT_TO_POINTER (step + 1); - g_main_loop_quit (new_state->loop); - break; - default: - fail (); - break; - } -} - -START_TEST (test_play_error_invalid_uri) -{ - GstPlayer *player; - TestPlayerState state; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_error_invalid_uri_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - gst_player_set_uri (player, "foo://bar"); - - 
gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 4); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_error_invalid_uri_and_play_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint step = GPOINTER_TO_INT (new_state->test_data); - gchar *uri; - - switch (step) { - case 0: - fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); - fail_unless_equals_string (new_state->uri_loaded, "foo://bar"); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 1: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_STOPPED); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_BUFFERING); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 2: - fail_unless_equals_int (change, STATE_CHANGE_ERROR); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 3: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_BUFFERING); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_STOPPED); - new_state->test_data = GINT_TO_POINTER (step + 1); - - uri = gst_filename_to_uri (TEST_PATH "/audio-short.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - break; - case 4: - fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); - fail_unless (g_str_has_suffix (new_state->uri_loaded, "audio-short.ogg")); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 5: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_STOPPED); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_BUFFERING); - 
new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 6: - fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 7: - fail_unless_equals_int (change, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED); - fail_unless_equals_int (new_state->width, 0); - fail_unless_equals_int (new_state->height, 0); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 8: - fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 9: - fail_unless_equals_int (change, STATE_CHANGE_DURATION_CHANGED); - fail_unless_equals_uint64 (new_state->duration, - G_GUINT64_CONSTANT (464399092)); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 10: - fail_unless_equals_int (change, STATE_CHANGE_POSITION_UPDATED); - fail_unless_equals_uint64 (new_state->position, G_GUINT64_CONSTANT (0)); - new_state->test_data = GINT_TO_POINTER (step + 1); - break; - case 11: - fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); - fail_unless_equals_int (old_state->state, GST_PLAYER_STATE_BUFFERING); - fail_unless_equals_int (new_state->state, GST_PLAYER_STATE_PLAYING); - new_state->test_data = GINT_TO_POINTER (step + 1); - g_main_loop_quit (new_state->loop); - break; - default: - fail (); - break; - } -} - -START_TEST (test_play_error_invalid_uri_and_play) -{ - GstPlayer *player; - TestPlayerState state; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_error_invalid_uri_and_play_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - gst_player_set_uri (player, "foo://bar"); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 12); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref 
(state.loop); -} - -END_TEST; - -static void -test_play_seek_done_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - gint step = GPOINTER_TO_INT (new_state->test_data) & (~0x10); - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !step) { - gst_player_seek (player, 0); - new_state->test_data = GINT_TO_POINTER (step + 1); - } else if (change == STATE_CHANGE_SEEK_DONE || change == STATE_CHANGE_ERROR) { - fail_unless_equals_int (change, STATE_CHANGE_SEEK_DONE); - fail_unless_equals_uint64 (new_state->seek_done_position, - G_GUINT64_CONSTANT (0)); - new_state->test_data = GINT_TO_POINTER (step + 1); - g_main_loop_quit (new_state->loop); - } -} - -START_TEST (test_play_audio_video_seek_done) -{ - GstPlayer *player; - TestPlayerState state; - gchar *uri; - - memset (&state, 0, sizeof (state)); - state.loop = g_main_loop_new (NULL, FALSE); - state.test_callback = test_play_seek_done_cb; - state.test_data = GINT_TO_POINTER (0); - - player = test_player_new (&state); - - fail_unless (player != NULL); - - uri = gst_filename_to_uri (TEST_PATH "/audio-video.ogg", NULL); - fail_unless (uri != NULL); - gst_player_set_uri (player, uri); - g_free (uri); - - gst_player_play (player); - g_main_loop_run (state.loop); - - fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & (~0x10), 2); - - stop_player (player, &state); - g_object_unref (player); - g_main_loop_unref (state.loop); -} - -END_TEST; - -static void -test_play_position_update_interval_cb (GstPlayer * player, - TestPlayerStateChange change, TestPlayerState * old_state, - TestPlayerState * new_state) -{ - static gboolean do_quit = TRUE; - static GstClockTime last_position = GST_CLOCK_TIME_NONE; - - gint steps = GPOINTER_TO_INT (new_state->test_data); - - if (new_state->state == GST_PLAYER_STATE_PLAYING && !steps) { - new_state->test_data = GINT_TO_POINTER (steps + 1); - } else if (steps && change == STATE_CHANGE_POSITION_UPDATED) { - 
-    GstClockTime position = gst_player_get_position (player);
-    new_state->test_data = GINT_TO_POINTER (steps + 1);
-
-    if (GST_CLOCK_TIME_IS_VALID (last_position)) {
-      GstClockTime interval = GST_CLOCK_DIFF (last_position, position);
-      GST_DEBUG_OBJECT (player,
-          "position update interval: %" GST_TIME_FORMAT "\n",
-          GST_TIME_ARGS (interval));
-      fail_unless (interval > (590 * GST_MSECOND)
-          && interval < (610 * GST_MSECOND));
-    }
-
-    last_position = position;
-
-    if (do_quit && position >= 2000 * GST_MSECOND) {
-      do_quit = FALSE;
-      g_main_loop_quit (new_state->loop);
-    }
-  } else if (change == STATE_CHANGE_END_OF_STREAM ||
-      change == STATE_CHANGE_ERROR) {
-    g_main_loop_quit (new_state->loop);
-  }
-}
-
-static gboolean
-quit_loop_cb (gpointer user_data)
-{
-  GMainLoop *loop = user_data;
-  g_main_loop_quit (loop);
-
-  return G_SOURCE_REMOVE;
-}
-
-START_TEST (test_play_position_update_interval)
-{
-  GstPlayer *player;
-  TestPlayerState state;
-  gchar *uri;
-  GstStructure *config;
-
-  memset (&state, 0, sizeof (state));
-  state.loop = g_main_loop_new (NULL, FALSE);
-  state.test_callback = test_play_position_update_interval_cb;
-  state.test_data = GINT_TO_POINTER (0);
-
-  player = test_player_new (&state);
-
-  config = gst_player_get_config (player);
-  gst_player_config_set_position_update_interval (config, 600);
-  gst_player_set_config (player, config);
-
-  fail_unless (player != NULL);
-
-  uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL);
-  fail_unless (uri != NULL);
-  gst_player_set_uri (player, uri);
-  g_free (uri);
-
-  gst_player_play (player);
-  g_main_loop_run (state.loop);
-
-  fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 5);
-
-  /* Disable position updates */
-  gst_player_stop (player);
-
-  config = gst_player_get_config (player);
-  gst_player_config_set_position_update_interval (config, 0);
-  gst_player_set_config (player, config);
-
-  g_timeout_add (2000, quit_loop_cb, state.loop);
-  g_main_loop_run (state.loop);
-
-  fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 5);
-
-  stop_player (player, &state);
-  g_object_unref (player);
-  g_main_loop_unref (state.loop);
-}
-
-END_TEST;
-
-static void
-test_restart_cb (GstPlayer * player,
-    TestPlayerStateChange change, TestPlayerState * old_state,
-    TestPlayerState * new_state)
-{
-  gint steps = GPOINTER_TO_INT (new_state->test_data);
-
-  if (!steps && change == STATE_CHANGE_URI_LOADED) {
-    fail_unless (g_str_has_suffix (new_state->uri_loaded, "sintel.mkv"));
-    new_state->test_data = GINT_TO_POINTER (steps + 1);
-  } else if (change == STATE_CHANGE_STATE_CHANGED
-      && new_state->state == GST_PLAYER_STATE_BUFFERING) {
-    new_state->test_data = GINT_TO_POINTER (steps + 1);
-    g_main_loop_quit (new_state->loop);
-  }
-}
-
-static void
-test_restart_cb2 (GstPlayer * player,
-    TestPlayerStateChange change, TestPlayerState * old_state,
-    TestPlayerState * new_state)
-{
-  gint steps = GPOINTER_TO_INT (new_state->test_data);
-
-  if (!steps && change == STATE_CHANGE_URI_LOADED) {
-    fail_unless (g_str_has_suffix (new_state->uri_loaded, "audio-short.ogg"));
-    new_state->test_data = GINT_TO_POINTER (steps + 1);
-  } else if (change == STATE_CHANGE_STATE_CHANGED
-      && new_state->state == GST_PLAYER_STATE_BUFFERING) {
-    new_state->test_data = GINT_TO_POINTER (steps + 1);
-    g_main_loop_quit (new_state->loop);
-  }
-}
-
-
-START_TEST (test_restart)
-{
-  GstPlayer *player;
-  TestPlayerState state;
-  gchar *uri;
-
-  memset (&state, 0, sizeof (state));
-  state.loop = g_main_loop_new (NULL, FALSE);
-  state.test_callback = test_restart_cb;
-  state.test_data = GINT_TO_POINTER (0);
-
-  player = test_player_new (&state);
-
-  fail_unless (player != NULL);
-
-  uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL);
-  fail_unless (uri != NULL);
-  gst_player_set_uri (player, uri);
-  g_free (uri);
-
-  gst_player_play (player);
-  g_main_loop_run (state.loop);
-  fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2);
-  stop_player (player, &state);
-
-  /* Try again with another URI */
-  state.test_data = GINT_TO_POINTER (0);
-  state.test_callback = test_restart_cb2;
-
-  uri = gst_filename_to_uri (TEST_PATH "/audio-short.ogg", NULL);
-  fail_unless (uri != NULL);
-  gst_player_set_uri (player, uri);
-  g_free (uri);
-
-  gst_player_play (player);
-  g_main_loop_run (state.loop);
-  fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2);
-  stop_player (player, &state);
-
-  g_object_unref (player);
-  g_main_loop_unref (state.loop);
-}
-
-END_TEST;
-
-#define TEST_USER_AGENT "test user agent"
-
-static void
-source_setup_cb (GstElement * playbin, GstElement * source, GMainLoop * loop)
-{
-  gchar *user_agent;
-
-  g_object_get (source, "user-agent", &user_agent, NULL);
-  fail_unless_equals_string (user_agent, TEST_USER_AGENT);
-  g_free (user_agent);
-
-  g_main_loop_quit (loop);
-}
-
-START_TEST (test_user_agent)
-{
-  GstPlayer *player;
-  GMainLoop *loop;
-  GstElement *pipeline;
-  GstStructure *config;
-  gchar *user_agent;
-
-  loop = g_main_loop_new (NULL, FALSE);
-  player = gst_player_new (NULL, NULL);
-  fail_unless (player != NULL);
-
-  gst_player_set_uri (player, "http://badger.com/test.mkv");
-
-  config = gst_player_get_config (player);
-  gst_player_config_set_user_agent (config, TEST_USER_AGENT);
-
-  user_agent = gst_player_config_get_user_agent (config);
-  fail_unless_equals_string (user_agent, TEST_USER_AGENT);
-  g_free (user_agent);
-
-  gst_player_set_config (player, config);
-
-  pipeline = gst_player_get_pipeline (player);
-  g_signal_connect (pipeline, "source-setup", G_CALLBACK (source_setup_cb),
-      loop);
-
-  gst_player_pause (player);
-  g_main_loop_run (loop);
-
-  gst_object_unref (pipeline);
-
-  g_object_unref (player);
-  g_main_loop_unref (loop);
-}
-
-END_TEST;
-
-static Suite *
-player_suite (void)
-{
-  Suite *s = suite_create ("GstPlayer");
-
-  TCase *tc_general = tcase_create ("general");
-
-  /* Use a longer timeout */
-#ifdef HAVE_VALGRIND
-  if (RUNNING_ON_VALGRIND) {
-    tcase_set_timeout (tc_general, 5 * 60);
-  } else
-#endif
-  {
-    tcase_set_timeout (tc_general, 2 * 60);
-  }
-
-  tcase_add_test (tc_general, test_create_and_free);
-  tcase_add_test (tc_general, test_set_and_get_uri);
-  tcase_add_test (tc_general, test_set_and_get_position_update_interval);
-
-  /* Skip playback tests for now.
-   * See https://bugzilla.gnome.org/show_bug.cgi?id=787374 */
-
-#ifdef HAVE_VALGRIND
-  if (RUNNING_ON_VALGRIND) {
-  } else
-#endif
-  {
-    tcase_skip_broken_test (tc_general, test_play_position_update_interval);
-  }
-  tcase_skip_broken_test (tc_general, test_play_audio_eos);
-  tcase_skip_broken_test (tc_general, test_play_audio_video_eos);
-  tcase_skip_broken_test (tc_general, test_play_error_invalid_uri);
-  tcase_skip_broken_test (tc_general, test_play_error_invalid_uri_and_play);
-  tcase_skip_broken_test (tc_general, test_play_media_info);
-  tcase_skip_broken_test (tc_general, test_play_stream_disable);
-  tcase_skip_broken_test (tc_general, test_play_stream_switch_audio);
-  tcase_skip_broken_test (tc_general, test_play_stream_switch_subtitle);
-  tcase_skip_broken_test (tc_general, test_play_error_invalid_external_suburi);
-  tcase_skip_broken_test (tc_general, test_play_external_suburi);
-  tcase_skip_broken_test (tc_general, test_play_forward_rate);
-  tcase_skip_broken_test (tc_general, test_play_backward_rate);
-  tcase_skip_broken_test (tc_general, test_play_audio_video_seek_done);
-  tcase_skip_broken_test (tc_general, test_restart);
-  tcase_skip_broken_test (tc_general, test_user_agent);
-
-  suite_add_tcase (s, tc_general);
-
-  return s;
-}
-
-GST_CHECK_MAIN (player)
gst-plugins-bad-1.18.6.tar.xz/tests/examples/d3d11videosink
Deleted
-(directory)
gst-plugins-bad-1.18.6.tar.xz/tests/examples/d3d11videosink/meson.build
Deleted
@@ -1,9 +0,0 @@
-if host_system == 'windows'
-  executable('d3d11videosink',
-    ['d3d11videosink.c', 'd3d11videosink-kb.c'],
-    c_args : gst_plugins_bad_args,
-    include_directories : [configinc, libsinc],
-    dependencies: [gst_dep, gstbase_dep, gstvideo_dep],
-    install: false,
-  )
-endif
gst-plugins-bad-1.18.6.tar.xz/ChangeLog -> gst-plugins-bad-1.20.1.tar.xz/ChangeLog
Changed
@@ -1,26 +1,911 @@ -=== release 1.18.6 === +=== release 1.20.1 === -2022-02-02 15:07:37 +0000 Tim-Philipp Müller <tim@centricular.com> +2022-03-14 11:33:33 +0000 Tim-Philipp Müller <tim@centricular.com> + + * NEWS: + * RELEASE: + * gst-plugins-bad.doap: + * meson.build: + Release 1.20.1 + +2022-03-14 11:33:25 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + Update ChangeLogs for 1.20.1 + +2022-03-04 10:02:56 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/videoparsers/gstvp9parse.c: + vp9parse: Fix auto-plugging of HW frame decoder + Decoders that required frame aligmment and didn't have an associated + alpha decoder were skipped. This is because the parser was constructing + caps based on the software alpha decoder, which specify super-frame + alignment. + Iterate over the caps to filter the one that have a matching codec-alpha, with + the semantic the no codec-alpha field means codec-alpha=false. Then if + everything was removed, callback to the original, so that the first non-alpha + decoder will be picked. + Fixes #820 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1949> + +2022-01-14 23:42:27 -0600 Tim Mooney <Tim.Mooney@ndsu.edu> + + * sys/v4l2codecs/linux/types-compat.h: + v4l2: include <sys/ioccom.h> on Illumos + Needed for _IOR/_IORW + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1947> + +2022-03-13 00:17:48 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvbaseenc.c: + nvenc: Fix deadlock because of too strict buffer pool size + The pool size might need to be larger than encoding surface pool size. + Also, because we always copy input frame into internal CUDA memory, + there's no reason to restrict max size of buffer pool. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1942> + +2022-03-11 23:20:26 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvh265dec.c: + nvh265sldec: Always fill SPS/PPS related parameters + Address compare was not a valid approach since it works + only if SPS/PPS id are changed. Otherwise it will always point to + the same address of member variables of h265parser. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1941> + +2022-03-11 19:32:59 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + h264decoder: Fix invalid memory access + gst_h264_dpb_needs_bump() can be called with null picture + in case of live + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1929> + +2022-03-10 18:40:12 +0000 Tim-Philipp Müller <tim@centricular.com> + + * gst/sdp/gstsdpdemux.c: + sdpdemux: add media attributes to caps to fix ptp clock handling + Those are needed by rtpjitterbuffer to do the right thing, e.g. + a=ts-refclk:ptp=IEEE1588-2008:00-**-**-**-**-**-**-**:0 + a=mediaclk:direct=1266592257 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1925> + +2022-03-10 10:33:56 +0100 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/mpegtsbase.c: + * gst/mpegtsdemux/mpegtsbase.h: + mpegts: Handle glib < 2.58 + By using a workaround to the lack of g_ptr_array_steal_index. + Fixes #1078 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1919> + +2021-11-02 09:20:55 +0100 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/mpegtsbase.c: + tsbase: Handle more program updates + There could be a case where the new program has the same program number as the + previous one ... but is actually located on a PID previously used for elementary + stream. In that case the program is guaranteed to not be an update of the + previous program but a completely new one. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1919> + +2021-11-02 09:18:57 +0100 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/mpegtsbase.c: + * gst/mpegtsdemux/mpegtsbase.h: + mpegtsbase: Use an array to track programs + We need to be able to look for programs by their PID also. Using a hash table + was a bit sub-par (and overkill) for storing a range of programs. + This is needed because there could potentially be two programs with the same + program id but different PMT PID (while one is being deactivated the new one + would "exist"). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1919> + +2022-03-07 18:46:55 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/gstbasetsmux.c: + mpegtsmux: Start last_ts with GST_CLOCK_TIME_NONE + And use the output segment position for the outgoing timestamp while it + is. This is needed to delay the calculation of `output_ts_offset` until + we actually have a usable timestamp, as tsmux will output a few initial + packets while `last_ts` is still unset. + Without this, the calculation would use the initial `0` value, which did + not have the intended effect of making VBR mode behave like CBR mode, + but always calculated an offset equal to the selected start time. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1895> + +2022-03-07 18:46:08 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/gstbasetsmux.c: + mpegtsmux: Use GST_CLOCK_STIME_NONE for output_ts_offset + It's a GstClockTimeDiff, thus GST_CLOCK_TIME_NONE isn't appropriate. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1895> + +2022-03-07 10:19:53 +0000 Philippe Normand <philn@igalia.com> + + * tests/check/libs/play.c: + gstplay: tests: Keep track of errors/warnings + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1871> + +2022-03-07 10:16:36 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay.c: + gstplay: Do not error out on message parsing failures + Specially when parsing errors and warnings, the details field can be NULL and + the gst_structure_get() call would return FALSE in such cases, triggering false + positive errors. + Follow-up for #1063 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1871> + +2022-03-07 10:14:43 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay.c: + * gst-libs/gst/play/gstplay.h: + gstplay: Fix warning parsing API + The GError is an out parameter, so should be a ** parameter, like the details + parameter. + See also #1063 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1871> + +2022-03-04 14:17:47 +0100 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Handle PES headers bigger than a mpeg-ts packet + While the actual PES header parser could notify us that it needed more data, we + would never actually act on it. + This commit will accumulate incoming packets in such situation and re-attempt + the header parsing. + Fixes #1027 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1858> + +2022-03-04 09:57:02 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay.c: + * gst-libs/gst/play/gstplay.h: + play: Fix error parsing API + The GError is an out parameter, so should be a ** parameter, like the details + parameter. 
+ Fixes #1063 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1850> + +2022-02-21 10:49:15 +0100 Sebastian Fricke <sebastian.fricke@collabora.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/gstwpe.cpp: + * ext/wpe/gstwpe.h: + Remove the uninstalled term + Remove the symbolic link `gst-uninstalled` which points to `gst-env`. + The `uninstalled` is the old name and the project should stick to a + single name for the procedure. + Remove the term from all the files, exceptions are variables from + dependencies like `uninstalled_variables` from pkgconfig and + `meson-uninstalled`. + Adjust mentions of the script in the documentation and README. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1829> + +2022-02-23 11:10:11 +0100 Sebastian Fricke <sebastian.fricke@collabora.com> + + * README.md: + Maintain build instructions at a single location + Do not maintain similar build instructions within each gst-plugins-* + subproject and the subproject/gstreamer subproject. Use the build + instructions from the mono-repository and link to them via hyperlink. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1829> + +2022-02-25 15:00:05 +0800 jinsl00000 <jinsl00000@msn.cn> + + * sys/ipcpipeline/gstipcpipelinecomm.c: + * sys/ipcpipeline/meson.build: + ipcpipeline: fix crash and error on windows with SOCKET or _pipe() + The fd was in different meanings on windows: + POSIX read and write use the fd as a file descriptor. + The gst_poll use the fd as a WSASocket. + This patch use WSASocket as default on windows. This is a temporary measure, because IPC has many different implement. There may be a better way in the future. 
+ See #1044 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1822> + +2022-02-28 16:33:23 +0100 Guillaume Desmottes <guillaume.desmottes@onestream.live> + + * ext/gs/meson.build: + gs: look for google_cloud_cpp_storage.pc + storage_client.pc was legacy and has been removed: + https://github.com/googleapis/google-cloud-cpp/commit/df6fa3611cdfbc37d40e1451afa91fd7d2e7d5f7#diff-bc35ad7c2fe631fd5578a06092412dba81c7ddd27bb25df7e17bb13771799afcL743 + No need to keep looking for storage_client.pc as a fallback as 1.25.0, + our minimum version, already ships google_cloud_cpp_storage.pc + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1825> + +2022-02-24 20:26:46 +0530 Sanchayan Maity <sanchayan@asymptotic.io> + + * ext/ldac/gstldacenc.c: + ldac: Set eqmid in caps + We set the eqmid in caps to be usable downstream by rtpldacpay for + knowing the frame count. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1804> + +2022-01-31 16:13:32 +0200 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsmux/gstbasetsmux.c: + tsmux: Skip empty buffers + They can be created e.g. by aggregator when there is a gap. Such buffers + should not be muxed at all. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1801> + +2022-02-01 14:51:27 +0200 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/gstbasetsmux.h: + tsmux: Lock mux->tsmux, the programs hash table, and pad streams + They contain implementations that are not thread-safe (e.g. GList, GHashTable). 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1800> + +2022-02-21 16:45:50 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * docs/plugins/gst_plugins_cache.json: + bad:docs: Add vaav1dec in documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1780> + +2022-02-18 16:23:09 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: vp9: Fix reset_frame_context parameter + It was assumed that the kernel parameters would match with the bitstream value + but instead the author when with another set of value. Surprisingly, this + makes no difference with the resulting fluster score. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1754> + +2022-02-18 16:02:27 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: vp9: Only fill compressed headers if needed + Fixes: 13944cf3ee871722 ("v4l2codecs: vp9: Make compressed hdr control optional") + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1754> + +2022-01-15 00:04:05 -0600 Tim Mooney <Tim.Mooney@ndsu.edu> + + * ext/curl/meson.build: + * gst/festival/meson.build: + * meson.build: + * sys/shm/meson.build: + meson: check for libsocket and libnsl + If present, add '-lsocket' and '-lnsl' to network_deps. + ext/curl/meson.build: add network_deps to dependencies + gst/festival/meson.build: same + sys/shm/meson.build: same + Fixes linking issues on Illumos distros. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1736> + +2022-02-14 16:18:54 +0300 Dmitry Osipenko <dmitry.osipenko@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Correct scaling matrix ABI check + Scaling matrix V4L UAPI control not presents on NVIDIA Tegra, the default + matrix should be used in this case. Mark scaling matrix presence optional. 
+ Fixes: 47bfa71530c ("v4l2codecs: h264: Improve ABI check ") + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1735> + +2022-02-16 02:23:58 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11converter.cpp: + d3d11converter: Fix for missing GRAY conversion + Add missing Y410 -> GRAY and GRAY -> semi-planar conversion + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1733> + +2022-02-16 02:11:53 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11converter.cpp: + d3d11converter: Don't use FIXME_OBJECT for non-GstObject + ... and print ERROR messages for unexpected input/output formats + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1733> + +2022-02-14 12:57:44 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: unlock mutex on -1 start_offfset + Closing #1013 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1718> + +2022-02-08 15:22:39 +0100 Jan Alexander Steffens (heftig) <heftig@archlinux.org> + + * ext/openaptx/gstopenaptxdec.h: + * ext/openaptx/gstopenaptxenc.h: + * ext/openaptx/meson.build: + openaptx: Support libfreeaptx + [libfreeaptx][1] is a fork of libopenapt 0.2.0, used by pipewire. 
+ [1]: https://github.com/iamthehorker/libfreeaptx + Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1642 + Closes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1589 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1721> + +2022-02-15 02:26:46 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11converter.cpp: + d3d11converter: Fix RGB to GRAY conversion + Fix typo in shader code + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1719> + +2022-02-12 10:05:11 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/webrtc/dtlstransport.c: + dtlstransport: Notify ICE transport property changes + The application might track the underlying ICE transport, so not notifying + changes might lead to use-after-free issues. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1717> + +2022-02-12 14:51:51 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavpp.c: + vavpp: Fix the caps leak in the transform_caps() function. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1715> + +2022-02-10 12:52:30 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Add temporal unit check when TD is absent. + The current manner for deciding the new temporal unit is based on + temporal delimiter(TD) OBU. We only start a new temporal unit when + the TD comes. + But some streams do not have TD at all, which makes the output "TU" + alignment fail to work. We now add check based on the relationship + between the different layers and it can successfully judge the TU edge. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1712> + +2022-02-04 17:12:15 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: let the parse continue when MISSING_OBU_REFERENCE error. 
+ Some streams may have verbose OBUs before a valid sequence header. We + should let the parse continue rather than return a error. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1712> + +2022-02-04 11:40:18 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Continue when we fail to detect the alignment. + Some streams may have problematic OBUs at the beginning, which causes + the parse fail to detect the alignment and return error. For example, + there may be verbose OBUs before a valid sequence, which should be + discarded until we meet a valid sequence. We should let the parse + continue when we meet such cases, rather than just return error. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1712> + +2021-03-30 19:23:12 +0900 Seungha Yang <seungha@centricular.com> + + * gst/ivfparse/gstivfparse.c: + ivfparse: Don't set zero resolution on caps + It could be zero if the information is not available at ivfparse + side, or not implemented. In that case, simply don't set + width/height on caps, otherwise downstream would be confused + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1709> + +2022-02-10 01:48:23 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfaudioenc.cpp: + mfaudioenc: Handle empty IMFMediaBuffer + IMFMediaBuffer may not hold encoded data, which seems to happen + while draining. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1705> + +2022-02-06 23:20:32 +0900 Sangchul Lee <sc11.lee@samsung.com> + + * ext/webrtc/gstwebrtcice.c: + webrtcice: Fix memory leaks in gst_webrtc_ice_add_candidate() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1704> + +2022-02-07 12:34:53 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/va/meson.build: + va: Fix and simplify build recipe. + 1. 
Use api_version variable rather than static string. + 2. Remove pkgconfig generation since currently the library + is not installed, only used internally. + 3. Rely on dependency "required" to abort compilation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1702> + +2022-02-07 11:27:57 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/va/meson.build: + * sys/va/meson.build: + va: Remove libgudev crumbs in library. + In commit e699aaeb we moved linking of libgudev to the plugin rather + the library, because it's only used in the plugin. But the dependency + check is still done in library. + This patch removes the dependency check in library, and updates the + dependency check in plugin. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1702> + +2022-02-01 00:50:53 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvh265dec.c: + nvh265sldec: Fix for decoding 12bits stream + We've been exposing main-444-12 profile as a supported profile + in its sinkpad template but not actaully. Adding code to + covert 12 and 16 bits as well. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1660> + +2022-02-01 00:12:06 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvdecoder.h: + * sys/nvcodec/gstnvh264dec.c: + * sys/nvcodec/gstnvh265dec.c: + * sys/nvcodec/gstnvvp8dec.c: + * sys/nvcodec/gstnvvp9dec.c: + nvdecoder: Fix for HEVC 4:4:4 format decoding + Map chroma_format_idc == 3 (which means 4:4:4 subsampling) correctly, + also pass coded bitdepth for decoder initialization instead of + inferring it from output format since they can be different. + Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/949 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1660> + +2022-02-05 17:36:41 +0300 Igor V. 
Kovalenko <igor.v.kovalenko@gmail.com> + + * meson_options.txt: + qroverlay: move to plugins that need external deps + qroverlay requires libqrencode dependency, move it next to similar plugins. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1658> + +2022-02-07 16:17:28 +0000 Tim-Philipp Müller <tim@centricular.com> + + * meson.build: + Back to development + +=== release 1.20.0 === + +2022-02-03 19:53:25 +0000 Tim-Philipp Müller <tim@centricular.com> * ChangeLog: * NEWS: + * README: * RELEASE: * gst-plugins-bad.doap: * meson.build: - Release 1.18.6 + Release 1.20.0 + +2022-02-03 19:53:18 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + Update ChangeLogs for 1.20.0 + +2022-02-02 09:58:15 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/dash/gstdashsink.c: + dashsink: doc cleanup + Remove max-files mention in the command line test + Fix some typos + Use mpegtsdemux instead of tsdemux in the pipeline description + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1624> + +2022-02-01 00:18:44 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + nvdecoder: Fix for display resolution setup + Display resolution should be cropped rect, not coded resolution. + Otherwise decoded output from NVDEC might be wrong. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1617> + +2022-02-01 03:00:41 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtdec.c: + * sys/applemedia/vtenc.c: + * sys/applemedia/vtutil.c: + * sys/applemedia/vtutil.h: + applemedia: Disable 64RGBALE support on older macOS + The kCVPixelFormatType_64RGBALE enum is only available on macOS Big + Sur (11.3) and newer. We also cannot use that while configuring the + encoder or decoder on older macOS. + Define the symbol unconditionally, but only use it when we're running + on Big Sur with __builtin_available(). 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1613> + +2022-02-01 03:04:32 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtenc.c: + applemedia: Remove some unnecessary variables + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1613> + +2022-02-01 05:07:04 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * docs/meson.build: + * sys/applemedia/avfassetsrc.m: + * sys/applemedia/iosassetsrc.m: + docs: Add objc and objcpp files to hotdoc gst_c_sources + Hotdoc should be able to extract and parse comments out of these. Just + need to be careful to only add the glob in directories that actually + contain *.m (objc) and *.mm (objcpp) files. + Also fix some doc comments and remove redundant ones. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1614> + +2022-01-14 11:26:42 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * ext/wpe/meson.build: + wpe: Support wpe-webkit-1.1 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1522> + +2022-01-29 10:24:36 +0000 Philippe Normand <philn@igalia.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/gstwpe.cpp: + * ext/wpe/gstwpe.h: + * ext/wpe/meson.build: + * ext/wpe/wpe-extension/meson.build: + wpe: Install WebExtension in pkglibdir + The uninstalled WebExtension takes precedence over the installed one, so that + audio support works in local developer builds as well. + Fixes #975 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1602> + +2022-01-30 19:06:29 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/plugin.m: + * sys/applemedia/vtenc.c: + docs: Rename "OS X" to "macOS" in some documentation + No one uses the term "Mac OS X" anymore, it's "macOS". "OS X" is even + worse, because people will usually start the search with "mac". 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1607> + +2022-01-28 23:15:28 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/applemedia/avfvideosrc.m: + * sys/applemedia/vtdec.c: + * sys/applemedia/vtenc.c: + applemedia: Document vtenc / vtdec elements + Also preserve-alpha property should only be exposed on the + vtenc_prores element since h264 does not support transparency. + Fixes https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/94 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1601> + +2022-01-28 14:39:35 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Improve ABI check + This moves the ABI check to the registration, so we don't expose + decoders with the wrong ABI or that are just broken somehow. It + also makes few enhancement: + - Handle missing, but required controls + - Prints the controls macro name instead of id + This should fix RK3399 support with a currently release minor + regression in the Hantro driver that cause errors. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1599> + +2022-01-28 17:11:41 +0000 Philippe Normand <philn@igalia.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.c: + * ext/webrtc/utils.h: + * ext/webrtc/webrtcdatachannel.c: + * ext/webrtc/webrtcsdp.c: + * gst-libs/gst/webrtc/meson.build: + * gst-libs/gst/webrtc/webrtc.c: + * gst-libs/gst/webrtc/webrtc_fwd.h: + * tests/check/elements/webrtcbin.c: + webrtc: Expose RTCError enum + The error codes not complying with the spec are now notified with the + GST_WEBRTC_ERROR_INTERNAL_FAILURE code. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1485> + +2022-01-29 04:46:24 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Fix typo in doc + s/elemenet/element/g + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1598> + +=== release 1.19.90 === + +2022-01-28 14:28:35 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + * NEWS: + * RELEASE: + * gst-plugins-bad.doap: + * meson.build: + Release 1.19.90 + +2022-01-28 14:28:28 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + Update ChangeLogs for 1.19.90 + +2022-01-27 11:22:54 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Make cb max values symmetrical to their min values. + Intel drivers expose some colorbalance's maximum values much more + bigger than their minimum values, given their middle values (default + value). This means, in practice, that the real middle point between + the maximum and minimum values implies a major change in the color + balance, which is not expected by the GStreamer color balance logic. + This patch makes the given maximum value symmetrical to the minimum + value, given the middle one (default value). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1580> + +2022-01-27 11:49:53 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: Fix debug assertion in register functions + As now, we warn if the decoder have no support src pixel format, but that + warning is called before the type (hence the debug category) is initialized. 
+ Fix this by moving the debug category init out of the type initialization, + into the register functions. + This will fix an assertion that occurs in the register function and allows + relevant logs to be seen by the users. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1588> 2022-01-27 17:56:29 +0100 Jakub Adam <jakub.adam@collabora.com> * ext/webrtc/gstwebrtcbin.c: webrtcbin: Chain up to parent constructed method Failing to do so makes GstWebRTCBin invisible to the leaks tracer. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2570> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1587> + +2022-01-27 01:40:17 +0000 Tim-Philipp Müller <tim@centricular.com> + + * po/LINGUAS: + * po/ro.po: + gst-plugins-bad: update translations + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1579> + +2022-01-27 03:10:39 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + d3d11av1dec: Fix typo in debug message + Fixing a copy-and-paste mistake; it's the AV1 decoder, not VP8 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1576> + +2022-01-25 12:32:50 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvabasetransform.c: + va: basetransform: Pass component index not plane index. + This is an issue detected and fixed in commit 3897b24f for other + libraries and elements. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1566> + +2022-01-24 11:14:14 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + va: filter & postproc: Match color with caps features. + When fixating color, there might be "other caps" with color spaces not + supported by the caps features exposed in the vapostproc's source pad + caps template (perhaps it's a bug somewhere else in GStreamer).
+ This solution checks if the proposed format exists in the filter + within the caps feature associated with the proposed format. + The check is done with the new filter's function + gst_va_filter_has_video_format(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1559> + +2022-01-10 13:34:21 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/tsmux/tsmux.c: + * gst/mpegtsmux/tsmux/tsmuxstream.c: + * gst/mpegtsmux/tsmux/tsmuxstream.h: + tsmux: Allow specifying PMT order via the prog-map + Look for an entry `PMT_<PID>` in the `prog-map`, which specifies the + relative index of the stream in the PMT. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1510> + +2022-01-10 14:16:28 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/tsmux/tsmux.c: + tsmux: Deterministically order program streams by PID + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1510> + +2022-01-10 12:59:58 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/tsmux/tsmux.c: + tsmux: Deterministically order PAT programs by number + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1510> + +2022-01-10 13:03:11 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/tsmux/tsmux.c: + * gst/mpegtsmux/tsmux/tsmuxstream.c: + * gst/mpegtsmux/tsmux/tsmuxstream.h: + tsmux: Remove program_array_index + It's only used for removal. Let's just scan the array. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1510> + +2022-01-10 12:31:42 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/tsmux/tsmux.c: + * gst/mpegtsmux/tsmux/tsmux.h: + tsmux: Replace streams GArray with GPtrArray + This is more appropriate. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1510> + +2022-01-19 23:17:23 +0900 Sangchul Lee <sc11.lee@samsung.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/webrtcdatachannel.c: + * ext/webrtc/webrtcsdp.c: + webrtc: Fix memory leaks + Redundant conditions and unreachable code are also removed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1544> + +2020-07-14 11:11:11 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + * sys/v4l2codecs/gstv4l2format.h: + v4l2codecs: Unify the src template caps format + Notably NV12_4L4 ended up being applied to only the VP9 decoder. This fixes the + situation by using a central define for all static src pad templated formats. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/965> + +2022-01-21 14:13:39 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/linux/v4l2-controls.h: + * sys/v4l2codecs/linux/videodev2.h: + v4l2codecs: Sync kernel headers against 5.16.0 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/965> + +2022-01-21 11:13:55 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/msdk/msdk.c: + msdk: Avoid noisy registry when no MSDK device. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1550> + +2022-01-21 10:53:21 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/va/gstvadisplay.c: + va: libs: Avoid noisy registry when no VA device.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1550> + +2022-01-20 05:59:36 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + * sys/wasapi2/gstwasapi2util.c: + * sys/wasapi2/gstwasapi2util.h: + * sys/wasapi2/meson.build: + wasapi2: Fix for device open failure on old OS + To open an automatic stream routing aware device, + at least the Windows 10 Anniversary Update is required. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1545> + +2022-01-18 03:03:30 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + wasapi2ringbuffer: Fix for desynced buffer-size and segsize + GstAudioRingBufferSpec::segsize has been configured by using + the device period, but GstWasapi2RingBuffer was referencing the + buffer size returned by IAudioClient::GetBufferSize() + which is most likely larger than the device period. + Fixing to sync them. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1533> + +2021-12-30 16:52:17 +0100 Robert Mader <robert.mader@collabora.com> + + * ext/wayland/wlwindow.c: + * ext/wayland/wlwindow.h: + waylandsink: Ensure correct mapping of area_surface + If the `area_surface` got unmapped when changing to the `READY` or + `NULL` state, we currently don't remap it when playback resumes and + `wp_viewporter` is supported. Without `wp_viewporter` we do remap + it, but rather unintentionally and also when not wanted. + On Weston this has not been a big problem as it so far wrongly maps + subsurfaces of unmapped surfaces anyway - i.e. only the black + background was missing on resume. On other compositors and future + Weston this prevents the `video_surface` from getting remapped. + Shuffle things around to ensure `area_surface` is mapped in the + right situations and do some minor cleanup.
+ See also https://gitlab.freedesktop.org/wayland/weston/-/issues/426 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1483> + +2022-01-16 02:21:43 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11deinterlace.cpp: + d3d11deinterlace: Do not restrict minimum resolution to 64x64 + The value 64 was a completely arbitrary one, and this element + will be able to support smaller resolutions + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1528> + +2022-01-06 22:00:11 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11compositor.cpp: + d3d11compositor: Don't try to read empty buffer + The queued buffer may not be a readable buffer in case + upstream sends a GAP event or so. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1497> + +2022-01-13 14:17:09 +0100 Robert Mader <robert.mader@collabora.com> + + * tests/examples/waylandsink/main.c: + waylandsink: Fix alpha value for the test pattern in example + The background-color property is in big-endian ARGB, resulting in + an alpha value of `0`. This accidentally used to work on all common + compositors, but on Weston this now correctly results in a black + background. + See also https://gitlab.freedesktop.org/wayland/weston/-/issues/577 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446> + +2021-12-13 13:16:06 +0100 Robert Mader <robert.mader@collabora.com> + + * ext/wayland/wldisplay.c: + * ext/wayland/wlwindow.c: + waylandsink: Use wl_surface_damage_buffer() instead of wl_surface_damage() + The latter, doing damage in surface coordinates instead of buffer + coordinates, has been deprecated. The reason for that is that it + is more prone to bugs, both on the client and the compositor side, + especially when paired with buffer scale, `wp_viewporter` or + buffer transforms.
+ Unfortunately, on Weston this risks running into + https://gitlab.freedesktop.org/wayland/weston/-/issues/446 + (which causes trouble for several other projects as well). However, + that bug only affects cases where we run in sync mode, i.e. only + during resizes. In practice I haven't been able to observe the + issue. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446> + +2021-12-13 12:21:06 +0100 Robert Mader <robert.mader@collabora.com> + + * ext/wayland/wlwindow.c: + waylandsink: Use G_MAXINT32 for surface damage + Each time we call `wl_surface_damage()` we want to do full surface + damage. Like Mesa, just use `G_MAXINT32` to ensure we always do + full damage, reducing the need to track the right dimensions. + `window->video_rectangle` is now unused, but we keep it around for + now as we may need it again in the future. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446> + +2021-12-30 18:14:24 +0100 Robert Mader <robert.mader@collabora.com> + + * ext/wayland/wlwindow.c: + waylandsink: Only call wl_surface_damage() when buffer content changed + From the spec: + > This request is used to describe the regions where the pending + > buffer is different from the current surface contents + We currently also call `wl_surface_damage()` on surfaces without + new or still compositor-held buffers, e.g. when resizing the window. + In that case we call it on `area_surface_wrapper`, even though it + gets resized via `wp_viewport_set_destination()`, in which case + the compositor is in charge of repainting the area on screen. + Doing so is currently not forbidden by the spec, however it might + be in the future, see + https://gitlab.freedesktop.org/wayland/wayland/-/issues/267 + Thus let's stay close to the spec and only call `wl_surface_damage()` + when we just attached a buffer. + Right now this prevents runtime assertions in Mutter.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446> + +2021-12-13 14:31:06 +0100 Robert Mader <robert.mader@collabora.com> + + * ext/wayland/wlwindow.c: + waylandsink: Simplify input region handling + We only need to unset the input region for the area surface when + we don't have our own toplevel surface. By default, the input region + covers the whole surface, thus no need to change it on resize. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446> + +2021-12-13 12:00:10 +0100 Robert Mader <robert.mader@collabora.com> + + * ext/wayland/wlwindow.c: + waylandsink: Use G_MAXINT32 for opaque regions + `gst_wl_window_set_opaque` does not get called on window resizes, + potentially leaving opaque regions too small. + According to the spec opaque regions can be bigger than the surface + size - parts that fall outside of the surface will get ignored. + Thus we can simply use `G_MAXINT32` and be sure that the whole + surface will always be covered. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446> 2022-01-11 13:21:55 -0500 Dave Piché <dave.piche@motorolasolutions.com> * ext/webrtc/gstwebrtcbin.c: webrtc: fix log error message in function gst_webrtc_bin_set_local_description - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2569> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1511> 2022-01-13 11:31:55 +0000 Tim-Philipp Müller <tim@centricular.com> @@ -32,7 +917,56 @@ We would error out because when we reach the end of the loop without having found a closed caption packet the flow return variable is still FLOW_ERROR which is what it has been initialised to.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2568> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1518> + +2022-01-13 10:36:24 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + codecparsers: h265parser: return invalid profile if len is 0. + Though the profiles[0] is inited as GST_H265_PROFILE_INVALID in + gst_h265_profile_tier_level_get_profile(), the profile detection may + change its content later. So the return of profiles[0] may not be an + invalid profile even if the len is 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1517> + +2022-01-13 10:11:52 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + codecparsers: h265parser: Fix the index incrementation error in append_profile(). + The current "*idx++" operation just dereferences the pointer and increments the pointer + itself, not the content of the pointer. This causes the count of the profiles + to always be 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1517> + +2022-01-04 04:56:55 +0300 Dmitry Osipenko <digetx@gmail.com> + + * sys/kms/gstkmssink.c: + kmssink: Support auto-detection of NVIDIA Tegra driver + NVIDIA Tegra SoCs have a separate (from GPU) display controller. It's + the primary display device on all Tegra SoCs. Add Tegra to the list + of primary DRM drivers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1514> + +2022-01-08 00:16:29 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstcccombiner.c: + cccombiner: fix s334-1a scheduling + The previous code was mistakenly trying to compute a cc_type out + of the first byte in the byte triplet, whereas it is to be interpreted + as: + > Bit b7 of the LINE value is the field number (0 for field 2; 1 for field 1). + > Bits b6 and b5 are 0.
Bits b4-b0 form a 5-bit unsigned integer which + > represents the offset + The same mistake was made when creating padding packets. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1496> + +2022-01-05 22:48:31 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstcccombiner.c: + cccombiner: merge buffers for both fields with caption type s334-1a + Other elements such as line21encoder expect both fields to be present + in the same meta, not one meta per field. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1496> 2022-01-10 15:24:13 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> @@ -41,7 +975,78 @@ Fix cb_offset and cr_offset data type from guint8 to guint16. According to spec, cb_offset and cr_offset are 9 bits long, while guint8 can cause integer overflow, and thus we change to guint16. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2567> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1508> + +2022-01-05 02:07:59 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * docs/meson.build: + * ext/opencv/meson.build: + * gst-libs/gst/vulkan/meson.build: + * meson.build: + * sys/msdk/meson.build: + meson: Add explicit check: kwarg to all run_command() calls + This is required since Meson 0.61.0, and causes a warning to be + emitted otherwise: + https://github.com/mesonbuild/meson/commit/2c079d855ed87488bdcc6c5c06f59abdb9b85b6c + https://github.com/mesonbuild/meson/issues/9300 + This exposed a bunch of places where we had broken run_command() + calls, unnecessary run_command() calls, and places where check: true + should be used.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1507> + +2022-01-05 10:53:55 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * gst/codecalpha/gstalphacombine.c: + alphacombine: update example launch line + Fix typos and a missing videoconvert element to demonstrate + the alphacombine element. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1494> + +2021-12-24 23:09:59 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Set the "tu" as the default alignment. + The tu (temporal unit) is more widely used than the current alignment. + We now change the default alignment to tu. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1468> + +2021-12-24 21:50:01 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Fix the wrong DELTA_UNIT flag setting for key frames. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1468> + +2021-12-22 12:36:15 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Copy the PTS and DURATION when we create data. + We need to create header buffers for annex b format. These + buffers should inherit the PTS and DURATION from the original buffers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1468> + +2022-01-03 21:02:47 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtenc.c: + vtenc: Signal ignored alpha component with ProRes + When the image is opaque but the output ProRes format has an alpha + component (4 component, 32 bits per pixel), Apple requires that we + signal that it should be ignored by setting the depth to 24 bits per + pixel. Not doing so causes the encoded files to fail validation. + So we set that in the caps and qtmux sets the depth value in the + container, which will be read by demuxers so that decoders can skip + those bytes entirely.
qtdemux does this, but vtdec does not use this + information at present. + The sister change was made in qtmux and qtdemux in: + https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/merge_requests/1061 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1489> + +2022-01-03 22:15:12 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + codecparsers: h265parser: Correct the read of slice_sao_chroma_flag. + According to the SPEC, for parsing the slice header, we should read the + slice_sao_chroma_flag only when ChromaArrayType is not equal to 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1488> 2021-12-29 21:29:02 +0100 Rafał Dzięgiel <rafostar.github@gmail.com> @@ -51,15 +1056,7 @@ new "font/*" mime types. Some new encoders are already using the updated mime types. We need to also add them to the support list in order for assrender to correctly identify them as fonts. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2566> - -2021-01-20 12:04:48 +0100 Rafał Dzięgiel <rafostar.github@gmail.com> - - * ext/assrender/gstassrender.c: - assrender: Add "application/vnd.ms-opentype" mimetype detection - The "application/vnd.ms-opentype" mimetype is commonly used in many fonts attached in the matroska videos. - Assrender should treat it as compatible without the need of parsing the file extension. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2566> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1481> 2021-12-29 21:28:56 +0100 Rafał Dzięgiel <rafostar.github@gmail.com> @@ -68,7 +1065,215 @@ TTC stands for "TrueType Collection" file. We can pass it into libass as any other attachment. Add it to the supported extensions list, so the fonts it contains will be used. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2566> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1481> + +2022-01-02 09:38:43 +0000 Philippe Normand <philn@igalia.com> + + * ext/webrtc/webrtcdatachannel.c: + webrtcdatachannel: Notify buffered-amount property updates + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1484> + +2021-12-27 03:15:10 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + d3d11decoder: Negotiate again on the first output buffer + ... unconditionally. There may be updated field in sinkpad caps + after the new_sequence() call (HDR related ones for example), + then we should signal the information to downstream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1474> + +2021-12-29 15:02:03 +0000 Philippe Normand <philn@igalia.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Fix null pointer dereference + If there is no jitterbuffer stats we should not attempt to store them in the + global stats structure. + Also add a g_return_if_fail in _gst_structure_take_structure() about this + because it is a programmer error to pass an invalid pointer address there. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1479> + +2021-08-12 19:14:16 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + codecparsers: av1parse: Add the DECODE_ONLY flag to output buffer. + When the alignment is ALIGN_FRAME and the output buf contains a frame + which is not to be shown, the GST_BUFFER_FLAG_DECODE_ONLY flag should + be set. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1050> + +2021-12-14 12:38:25 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Fall back to last packet ssrc if caps don't provide it + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1448> + +2021-12-14 11:28:42 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Use our own caps instead of the sticky event + The sticky event seems to get cleared sometimes. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1448> + +2021-12-14 11:28:13 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + webrtcbin: Store the ssrc of the last received packet + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1448> + +2021-12-13 16:57:06 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtc stats: Remove duplicate structure get + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1448> + +2021-12-13 16:56:37 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtc stats: Add more details about codecs into the stats + This makes the output a little closer to what the upstream stats are. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1448> + +2021-12-23 10:06:58 +1100 Brad Hards <bradh@frogmouth.net> + + * ChangeLog: + * data/targets/file-extension/ts.gep: + doc: typo fix for streaming + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1463> + +2021-12-21 12:55:59 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaprofile.c: + va: av1dec: Use named profiles to replace the numeric ones. + Use named AV1 profiles (i.e., main, high, and professional) to replace + the old 0, 1, 2 profiles.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1456> + +2021-12-21 01:08:40 +0900 Seungha Yang <seungha@centricular.com> + + * ext/aom/gstav1enc.c: + av1enc: Update for newly designed AV1 profile signalling + Accept named AV1 profiles (i.e., main, high, and professional) + as well + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1456> + +2021-12-19 21:48:51 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + d3d11av1dec: Update sinkpad template for profile + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1456> + +2021-12-19 21:44:19 +0900 Seungha Yang <seungha@centricular.com> + + * gst/videoparsers/gstav1parse.c: + * tests/check/elements/av1parse.c: + av1parse: Use descriptive profile name instead of numeric + As per AV1 specification Annex A, AV1 profiles have explicit and + descriptive names for each seq_profile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1456> + +2021-12-19 21:47:18 +0900 Seungha Yang <seungha@centricular.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Remove trailing white space + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1456> + +2021-12-17 22:24:57 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + d3d11av1dec: Fix for Cdef param + av1parser will increase the sec_strength values by 1 if parsed + values were equal to 3 as defined in spec. But DXVA wants unmodified + ones. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1455> + +2021-12-17 19:49:42 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + d3d11av1dec: Sync DXVA AV1 data structure with released header + Update AV1 data structure based on Windows 11 SDK header + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1455> + +2021-12-15 12:27:24 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * tests/check/libs/h265parser.c: + tests: h265parser: Add test for multiple compatibility profiles. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1440> + +2021-12-14 19:56:48 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst/videoparsers/gsth265parse.c: + h265parser: Compare upstream profile with the one in SPS. + Compare if the upstream profile in caps is the same as the one parsed in + the SPS. If they are different, use the bigger one for simplicity and + better chances to decode it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1440> + +2021-12-15 11:58:07 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + codecparsers: h265parser: Use a table map to get profile. + Instead of a sequence of if statements, declare a table to map profile + idc with profiles and traverse it. + Also, first add the profile from the parsed profile idc and later add, + into the profile array, the profile from the compatibility flags. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1440> + +2021-12-14 19:36:56 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + codecparsers: h265parser: Verify all possible profiles. + It's possible for a HEVC stream to have multiple profiles given the + compatibility bits.
Instead of returning a single profile, the internal + gst_h265_profile_tier_level_get_profiles() returns an array with all + its possible profiles. + Profiles are appended into the array only if the generated profile + is not invalid. + gst_h265_profile_tier_level_get_profile() is rewritten in terms of + gst_h265_profile_tier_level_get_profiles(), returning the first + profile found in the array. + And gst_h265_get_profile_from_sps() is also rewritten in terms of + gst_h265_profile_tier_level_get_profiles(), but traversing the array + verifying if the proposed profile is actually valid by Annex A.3.x of + the specification. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1440> + +2021-12-09 03:00:56 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: bind transceiver's fec-percentage to encoder percentage + Allows for dynamic control of the applied FEC overhead + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1429> + +2021-12-07 23:55:22 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/transportstream.c: + * ext/webrtc/transportstream.h: + webrtcbin: fix ulpfec / red for the BUNDLE case + * Add fec / red encoders as direct children of webrtcbin, instead + of providing them to rtpbin through the request-fec-encoder signal. + That is because they need to be placed before the rtpfunnel, which + is placed upstream of rtpbin. + * Update configuration of red decoders to set a list of RED payloads + on them, instead of setting the pt property. + That is because there may be one RED pt per media in the same session. + * Connect to request-fec-decoder-full instead of request-fec-decoder, + in order to instantiate FEC decoders according to the payload type + of the stream.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1429> + +2021-12-10 21:58:33 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11pluginutils.cpp: + d3d11videosink: Use only tested color space for swapchain + We are querying supported swapchain colorspace via + CheckColorSpaceSupport() but it doesn't seem to be reliable. + Use only tested full-range RGB formats which are: + - sRGB + - BT709 primaries with linear RGB + - BT2020 primaries with PQ gamma + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1433> 2021-12-11 11:33:39 -0300 Thibault Saunier <tsaunier@igalia.com> @@ -95,13 +1300,936 @@ ERROR: pipeline doesn't want to preroll. Freeing pipeline ... ``` - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2565> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1441> + +2021-12-10 15:46:41 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + doc: Update vp9alphadecodebin doc cache + A new field was added to the template caps. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-10 15:18:56 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphacombine.c: + alphacombine: Fix for early allocation queries + When using playbin3, it seems that the alpha decode is always first to + push caps and run an allocation query. As the format changes from sink + and alpha were not synchronized, the allocation query could end up + being run before the caps are pushed. That may lead to a failing query, + which makes the decoder think there is no GstVideoMeta downstream and + most likely CPU-copy the frame. + This patch implements a format cookie to track and synchronize the + format changes on both pads, fixing the racy performance issue.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-10 14:09:44 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + * sys/v4l2codecs/gstv4l2codecvp9dec.h: + * sys/v4l2codecs/plugin.c: + v4l2codecs: vp9: Add alpha decodebin wrapper + This will allow HW accelerated decoding of WebM alpha videos. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-10 14:09:06 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/plugin.c: + v4l2codecs: plugin: Minor style fix + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-10 14:08:32 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2codecs: decoder: Improve logging of timed out request + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-10 14:07:18 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + * sys/v4l2codecs/gstv4l2codecmpeg2dec.h: + * sys/v4l2codecs/plugin.c: + v4l2codecs: mpeg2: Check that the decoder output formats + This is to avoid exposing a decoder for which we don't support any + output format. This happens on platforms using vendor formats or + not yet supported tile formats. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-10 14:04:40 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstvp9alphadecodebin.c: + vp9alphadecodebin: Fix auto-plugging v4l2slvp9dec + This adds the alignment field to the template caps. Without this field + set, the auto-plugger will see fixed caps and will use + gst_caps_is_subset() against the caps produced by the parser. This is a + challenge for all cases where a parser can do conversion.
This is fixed + by adding the alignment field, which makes the auto-pluggers do an + intersection of the caps, as they get unfixed caps after intersection now. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1439> + +2021-12-09 19:55:04 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11pluginutils.cpp: + * sys/d3d11/gstd3d11pluginutils.h: + * sys/d3d11/gstd3d11window.cpp: + d3d11window: Remove hack related to color space selection + Use input GstVideoColorPrimaries without any special case handling, + otherwise the rendered image color would be very wrong. + The hack was added to work around an issue that some Intel driver + couldn't handle wide color gamut images without HDR10 metadata, specifically PQ images. + But device capability can be checked via a method added in + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1723 + so there's no issue now. + Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1175 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1430> + +2021-12-09 19:51:04 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11window.cpp: + d3d11window: Fix typo in debug message + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1430> + +2021-12-08 11:08:30 +0100 Benjamin Gaignard <benjamin.gaignard@collabora.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Fix return value type + The return value should be GstFlowReturn, not gboolean + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1426> + +2021-12-07 17:09:11 +0100 Benjamin Gaignard <benjamin.gaignard@collabora.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Fix return value if klass->new_picture isn't set + If klass->new_picture isn't set we need to initialize + ret with GST_FLOW_OK to avoid an unwanted error case + Fixes: 5b405d15858b ("codecs: h265decoder: Use
GstFlowReturn everywhere") + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1426> + +2021-12-06 16:47:14 +0000 Philippe Normand <philn@igalia.com> + + * ext/wpe/gstwpevideosrc.cpp: + wpevideosrc: Use basesrc event vfunc + Allows for basic default handling from the base class. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1422> + +2021-12-03 13:24:25 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ext/teletextdec/gstteletextdec.c: + teletextdec: fix minor string leak + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1416> + +2021-12-02 15:52:06 +0100 Marc Leeman <m.leeman@televic.com> + + * gst/rist/gstristsink.c: + ristsink: set properties on children early + The properties on the udpsink/udpsrc elements need to be set before + there is any state change. If not, in a network without default gateway, + udpsink tries to bind an a NULL interface and fails. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1411> + +2021-12-03 07:53:54 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + v4l2codecs: mpeg2: Fix selected sizeimage + Due to a copy paste bug, the bitdepth was never set and that was leading + to requesting sizeimage of 0. Previously that worked since the driver + would in that case pick a size for us. But now the we bumped the minimum + to 4KB, the driver happily allocate 4KB of bitstream which lead to + decoding error. + As MPEG2 have a fixed bitdeph of 8, use a define instead of the run-time + variable. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1415> + +2021-12-01 12:16:40 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: vp9: Drop frames on non-keyframe format change + V4L2 does not yet support this feature, this will skip over the + transition portion up to the next keyframe. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1404> + +2021-12-01 09:51:57 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: Fix renegotiation + If we hold the last reference to the allocator, leaving the device + streaming will cause an EBUSY error when trying to free the allocated + buffers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1404> + +2021-12-02 16:26:08 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11vp9dec: Drop frames on non-keyframe format change + ... in case of NVIDIA GPU + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1407> + +2021-12-02 16:04:21 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvvp9dec.c: + nvvp9sldec: Drop frames on non-keyframe format change + NVDEC doesn't seem to be able to handle the case + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1407> + +2021-12-02 16:03:14 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.h: + codecs: vp9: Drop frames on non-keyframe format change + ... if the subclass does not support the case + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1407> + +2021-12-01 12:10:42 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: vp9: Also consider render_width/height + Also emits new_sequence if on keyframe and the render_width/height have + changed. The subclass can always optimize this if the frame resolution + didn't change; the output caps need to reflect this though.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1407> + +2021-11-30 10:24:37 +0100 Marc Leeman <m.leeman@televic.com> + + * gst/rtp/gstrtpsink.c: + rtpsink: set properties on children early + The properties on the udpsink/udpsrc elements need to be set before + there is any state change. If not, in a network without default gateway, + udpsink tries to bind on a NULL interface and fails. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1398> + +2021-11-30 14:48:03 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: vp9: Remove unneeded picture data + The GstV4l2Request now holds a reference on the picture buffer and is + refcounted already. This effectively removes usage of GRefCount, which is only + available in GLib 2.58, while we support 2.56. + Fixes #910 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1402> + +2021-11-30 17:05:22 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + * sys/v4l2codecs/gstv4l2decoder.h: + v4l2codecs: decoder: Add method to get the pic_buf + This helper will be needed for VP9 frame duplication. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1402> + +2021-11-30 16:08:18 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: vp9: Add missing error checks in decide_allocation + This could otherwise lead to a crash.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1402> + +2021-11-24 11:17:22 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Fix typo in comment + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1387> + +2021-11-24 11:17:40 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Properly set pic_num/frame_num + The V4L2 uAPI uses pic_num for both PicNum and ShortTermPicNum. It also + does the same for both FrameNum and LongTermFrameIdx. This change does + not change the fluster score, but fixes a visual corruption noticed + with some third party streams. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1387> + +2021-11-23 16:35:16 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvasurfacecopy.c: + va: Use a lock to protect the surface copy by using vpp. + If we use vpp to do the surface copy, its operation is not atomic. + We need to keep the filter's context unchanged during the whole + copy process. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1373> + +2021-11-23 21:10:55 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: Use the GstVaSurfaceCopy of the allocator atomically. + The mem_copy() of the allocator can be called simultaneously from + different threads. We should use atomic pointer operations to create + and use the GstVaSurfaceCopy of the allocator. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1373> + +2021-11-22 16:07:27 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: DMA allocator: Set the copied memory properly when popped from pool. + The current code does not set the copied memory correctly when it is popped + from the surface cache pool. + 1.
We forget to ref the allocator, which causes the allocator to be freed + unexpectedly, and we get a crash later because of a memory violation. + 2. We forget to add ref_mems_count, which causes a surface leak because + the surface cannot be pushed back to the cache pool again. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1373> + +2021-04-22 16:29:20 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2codecs: gstv4l2decoder: set minimum sizeimage + Set minimum sizeimage such that there is enough space for any overhead + introduced by the codec. + Notably fix a vp9 issue in which a small image would not have a + bitstream buffer large enough to accommodate it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1012> + +2021-04-07 16:15:32 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + v4l2codecs: gstv4l2codecsvp9dec: implement a render delay + The v4l2 backend supports delayed output for performance reasons. + It is then possible to use render delays to queue multiple requests + simultaneously, thus increasing performance. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1012> + +2021-03-30 13:30:46 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp9dec.c: + * sys/v4l2codecs/gstv4l2codecvp9dec.h: + * sys/v4l2codecs/gstv4l2format.c: + * sys/v4l2codecs/linux/videodev2.h: + * sys/v4l2codecs/meson.build: + * sys/v4l2codecs/plugin.c: + v4l2codecs: vp9dec: Implement VP9 v4l2 decoder + Implement a v4l2 based vp9 decoder element based on the preexisting vp8 + v4l2 decoder.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1012> + +2021-07-14 16:21:59 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/linux/v4l2-controls.h: + * sys/v4l2codecs/linux/videodev2.h: + v4l2codecs: update to the new uAPI + The new VP9 stateless API is on its way to being destaged. Update our + headers to match. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1012> + +2021-04-08 10:40:03 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: gstvp9decoder: copy frame->system_frame_number into picture + A comment in gstvp9picture.h states that picture->system_frame_number + should get its value from frame->system_frame_number, but in fact + it was never copied at all. + Fix it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1012> + +2021-04-08 10:38:55 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.h: + * gst-libs/gst/codecs/gstvp9statefulparser.c: + * gst-libs/gst/codecs/gstvp9statefulparser.h: + codecs: gstvp9{decoder|statefulparser}: optionally parse compressed headers + Rework gstvp9{decoder|statefulparser} to optionally parse compressed headers. + The information in these headers might be needed by accelerators + downstream, so optionally parse them if downstream requests it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1012> + +2021-11-23 20:12:06 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * tests/check/elements/webrtcbin.c: + webrtcbin: deduplicate extmaps + When an extmap is defined twice for the same ID, firefox complains and + errors out (chrome is smart enough to accept strict duplicates). + To work around this, we deduplicate extmap attributes, and also error + out when a different extmap is defined for the same ID.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1383> + +2021-11-23 13:30:17 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2decoder: drain() only when significant sequence changes. + There is a lot of info in the mpeg2 sequence (also including ext, + display_ext and scalable_ext). We need to notify the subclass about + its change, but not all the changes should trigger a drain(), which + may change the output picture order. For example, the matrix changes + in the sequence header do not change the decoder context, so there is no need + to trigger a drain(). + Fixes: #899 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1375> + +2021-11-23 23:52:18 +0900 Seungha Yang <seungha@centricular.com> + + * ext/openjpeg/gstopenjpegenc.h: + openjpegenc: Fix build warning + Compiling C object subprojects/gst-plugins-bad/ext/openjpeg/gstopenjpeg.dll.p/gstopenjpegenc.c.obj + ../subprojects/gst-plugins-bad/ext/openjpeg/gstopenjpegenc.c(416): + warning C4133: '=': incompatible types - from 'GstFlowReturn (__cdecl *)(GstVideoEncoder *,GstVideoCodecFrame *)' to + 'gboolean (__cdecl *)(GstVideoEncoder *,GstVideoCodecFrame *)' + ../subprojects/gst-plugins-bad/ext/openjpeg/gstopenjpegenc.c(418): + warning C4133: '=': incompatible types - from 'GstFlowReturn (__cdecl *)(GstVideoEncoder *,GstVideoCodecFrame *)' to + 'gboolean (__cdecl *)(GstVideoEncoder *,GstVideoCodecFrame *)' + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1378> + +2021-11-23 09:28:57 +0100 Guillaume Desmottes <guillaume.desmottes@onestream.live> + + * ext/gs/gstgssink.cpp: + gssink: add metadata property + This property can be used to set metadata on the storage object.
+ A similar API has been added to the S3 sink already, see + https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/613 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1377> + +2021-11-23 00:25:07 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11bufferpool.cpp: + * gst-libs/gst/d3d11/gstd3d11utils.cpp: + d3d11: Update comments + Remove a copy & paste mistake (this is not GstGL) and add more + description. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1374> + +2021-11-19 15:13:28 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvabasetransform.c: + * sys/va/gstvabasetransform.h: + * sys/va/gstvadeinterlace.c: + * sys/va/gstvavpp.c: + vapostproc, vadeinterlace: don't transform caps if no intersection. + If caps to transform don't intersect with those supported by the VA + filter (VAEntrypointVideoProc) then return them as is, because + pass-through mode is the only possibility. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1369> + +2021-11-16 10:40:03 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Append featured caps rather than merge. + So it would be possible to honor upstream preference. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1369> + +2021-10-25 19:22:05 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Remove dead code. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1369> + +2021-10-25 19:22:19 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: Validate input parameter in internal function.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1369> + +2021-11-15 10:15:38 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: log drm modifier + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1369> + +2021-11-22 17:34:22 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavpp.c: + va: vpp: Fix the memory leak in fixate_caps(). + For the BaseTransform class, the fixate_caps() function takes + ownership of "othercaps", so we should clear it in our subclass. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1372> + +2021-11-13 12:22:36 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/player/gstplayer.c: + player: Ensure the GstPlay is created before the wrapped renderer + The GstPlayerWrappedVideoRenderer implicitly depends on GstPlay. + Fixes #878 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1345> + +2021-11-13 12:17:23 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay.c: + play: Allow runtime configuration of video-renderer + This is a requirement for GstPlayer when using the default overlay interface + provided by the pipeline. The GstPlayerWrappedVideoRenderer requires a valid + pipeline, but that's available only after the GstPlay thread has successfully + started. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1345> + +2021-11-19 19:02:20 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvadecoder.c: + va: decoder: Also ref the display when duplicating pictures. + _destroy_buffers() will check the display handle using + g_return_val_if_fail(), so we should not generate an invalid pointer + warning.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1368> + +2021-11-17 22:51:00 +0900 Seungha Yang <seungha@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + docs: Update doc cache for Windows specific plugins + Updating doc cache for d3d11, mediafoundation and wasapi2 plugins + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1360> + +2021-11-18 00:16:41 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2sink.c: + wasapi2: Fix typo in doc + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1360> + +2021-11-17 22:46:17 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/gstmfvp9enc.cpp: + * sys/mediafoundation/plugin.c: + mediafoundation: Skip doc for non-default encoder elements + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1360> + +2021-10-21 19:41:15 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + * sys/d3d11/plugin.cpp: + d3d11: Skip doc for non-default decoder and deinterlacer elements + Just skip doc for non-default decoder/deinterlacer elements + since there are multiple elements in case the system has + multiple GPUs. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1360> + +2021-11-19 13:27:54 +0900 Wonchul Lee <wonchul.dev@gmail.com> + + * sys/d3d11/gstd3d11window_win32.cpp: + d3d11: Fix deadlock while doing unprepare + ShowWindow() could be blocked while doing gst_d3d11_window_win32_unprepare + when an external window handle is provided to d3d11videosink in a multi-threaded + environment.
+ The condition under which the issue happened is: the UI thread is waiting for a + background thread that changes the d3d11videosink state to NULL, and the + background thread would try to send a window message to the queue. + The queue is already occupied by the UI thread, so the background + thread will be blocked. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1366> + +2021-11-16 12:56:38 +0000 Philippe Normand <philn@igalia.com> + + * ext/wpe/gstwpevideosrc.cpp: + wpevideosrc: Fix frame stuttering in GL rendering path + Make sure the EGLImage we're rendering to the GL memory stays alive long enough, + until the GL memory has been destroyed. + This change fixes tearing and black flash artefacts that were happening + because the EGLImage was sometimes destroyed before the sink actually rendered + the associated texture. + Fixes #889 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1354> + +2021-11-16 12:53:35 +0000 Philippe Normand <philn@igalia.com> + + * ext/wpe/gstwpevideosrc.cpp: + wpevideosrc: Run through gst-indent + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1354> + +2021-11-11 19:11:25 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ext/rsvg/gstrsvgoverlay.c: + * gst-libs/gst/basecamerabinsrc/gstbasecamerasrc.c: + docs: fix unnecessary ampersand, < and > escaping in code blocks + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1340> + +2020-11-05 10:23:13 +0100 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpcrfbase.c: + avtp: crf: Process also local CRF streams + Without this patch locally generated CRF streams will be ignored. + Therefore the same network interface could not be both CRF talker and + CRF listener.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1074> + +2021-11-09 15:10:21 +0100 Jean Felder <jean.felder@gmail.com> + + * gst/id3tag/id3tag.c: + id3tag: Map GST_TAG_MUSICBRAINZ_RELEASETRACKID + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1331> + +2021-11-09 15:04:59 +0100 Jean Felder <jean.felder@gmail.com> + + * gst/id3tag/id3tag.c: + id3tag: Map GST_TAG_MUSICBRAINZ_RELEASEGROUPID + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1331> + +2021-11-09 15:04:00 +0100 Jean Felder <jean.felder@gmail.com> + + * gst/id3tag/id3tag.c: + id3tag: Remove trailing whitespace + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1331> + +2021-10-06 15:54:09 +0200 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpsrc.c: + avtpsrc: Use correct size for provided buffers + Without this patch the following pipeline would send packets containing + garbage in the data section. + $ gst-launch-1.0 avtpsrc ! avtpsink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1077> + +2020-10-15 14:49:58 +0200 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpcrfsync.c: + avtp: crfsync: Warn when CRF packet not yet received + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1075> + +2021-03-03 10:08:57 +0100 Timo Wischer <timo.wischer@de.bosch.com> + + * tests/check/elements/avtpcrfbase.c: + test: avtp: crf: Check for rounding errors + on average period calculation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1073> + +2020-11-11 16:50:28 +0100 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpcrfbase.c: + * ext/avtp/gstavtpcrfbase.h: + * tests/check/elements/avtpcrfbase.c: + avtp: crf: Use double for average period calculation + to also support CRF intervals like 64 events every 1,333,333 ns + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1073> + +2021-01-12 10:03:32 +0100 Timo Wischer <timo.wischer@de.bosch.com> + + * tests/check/elements/avtpcrfbase.c: + tests: avtp: crf: Test for timestamp_interval > 1 + in case of CRF AVTPDUs with a single CRF timestamp. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1076> + +2020-09-16 17:12:32 +0200 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpcrfbase.c: + * ext/avtp/gstavtpcrfbase.h: + avtp: crf: Properly handle one timestamp per PDU + The average_period should always represent the time between two + events. The specification defines the event time as the time + between audio samples, video frame sync, video line sync, etc. + In case of one timestamp per PDU the timestamp_interval identifies + the number of events between the timestamp of one PDU and the + timestamp of the next PDU. + As described in IEEE 1722-2016 chapter + "10.4.12 timestamp_interval field", timestamp_interval shall be + nonzero. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1076> + +2021-11-08 20:18:51 +0100 Thomas Klausner <tk@giga.or.at> + + * sys/shm/meson.build: + shm: NetBSD build fix + shm_unlink() and friends live in librt on NetBSD. Adapt the build system.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1323> + +2021-05-21 16:56:33 -0300 Martin Reboredo <yakoyoku@gmail.com> + + * ext/aom/gstav1enc.c: + aom: Set fixed_qp_offsets to a deactivated value + aom only uses fixed_qp_offsets with the + Constant Quality (Q) Rate Control mode; + previously this was blocking any usage + with another Rate Control mode. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1198> 2021-11-05 13:12:14 +0100 Antonio Ospite <antonio.ospite@collabora.com> * sys/magicleap/meson.build: magicleap: update lumin_rt libraries names to the latest official version - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2561> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1314> + +2021-10-05 01:07:57 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264dec: Calculate the latency by its bump mode. + The current latency calculation just uses the num_reorder_frames, + which is not very precise. We should consider the bump mode of the + DPB: the faster it bumps, the lower the latency we will have.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1046> + +2021-11-04 19:06:22 +0000 Thibault Saunier <tsaunier@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/nvcodec/gstcudabasefilter.c: + * sys/nvcodec/gstcudabasetransform.c: + * sys/nvcodec/gstcudaconvert.c: + * sys/nvcodec/gstcudadownload.c: + * sys/nvcodec/gstcudascale.c: + * sys/nvcodec/gstcudaupload.c: + * sys/nvcodec/gstnvdec.c: + doc: Update nvdec documentation + Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/870 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1309> + +2021-11-03 17:09:52 +0900 Seungha Yang <seungha@centricular.com> + + * gst/videoparsers/gsth264parse.c: + * gst/videoparsers/gsth264parse.h: + h264parse: Don't insert extra AUD if it exists in bitstream already + An AUD NALU in packetized format is completely valid, and therefore we should not + assume that we should insert an AUD for packetized -> bytestream + conversion. + Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/862 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1296> + +2021-11-03 20:36:09 +0900 Seungha Yang <seungha@centricular.com> + + * tests/check/elements/h264parse.c: + tests: h264parse: Add test for AUD insertion + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1296> + +2021-11-04 16:36:05 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + wasapi2ringbuffer: Fix client object leak + Check whether the ringbuffer is holding a client object already, since + open_device() may be called multiple times + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1307> + +2021-11-04 12:48:28 +0200 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Clear errors from finding codec preferences before the next iteration + The media is just skipped and the error is not propagated
to the caller, + so keeping it around here would cause assertions a bit later when trying + to set a new error over the old one. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1291> + +2021-11-04 12:45:34 +0200 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Move addition of attributes to the caps after making sure they're not empty or any + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1291> + +2021-11-02 11:21:34 +0200 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Don't require fixed caps when querying caps for a transceiver pad to match it with a media + Upstream caps might for example be + application/x-rtp,media=audio,encoding-name={OPUS, X-GST-OPUS-DRAFT-SPITTKA-00, multiopus} + and while that is not fixed caps it is enough to match it with a media. + Only caps structures that have the correct structure name and that have + the media and encoding-name field are preserved, but if both are present + then these caps are used as "codec preferences". + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1291> + +2021-11-03 18:44:03 +0000 Tim-Philipp Müller <tim@centricular.com> + + * meson.build: + Back to development + +=== release 1.19.3 === + +2021-11-03 15:43:36 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + * NEWS: + * RELEASE: + * gst-plugins-bad.doap: + * meson.build: + Release 1.19.3 + +2021-11-03 15:43:32 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + Update ChangeLogs for 1.19.3 + +2021-11-02 09:40:43 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2: Drain before a new_sequence gets signalled + The decoder may need to re-allocate the output buffer; it is easier if all + pictures have been output.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-04-08 16:24:49 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + v4l2codecs: gstv4l2codecsmpeg2dec: implement a render delay + The v4l2 backend supports delayed output for performance reasons. + It is then possible to use render delays to queue multiple requests + simultaneously, thus increasing performance. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-04-08 16:07:23 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstmpeg2decoder.h: + codecs: gstmpeg2decoder: add support for render delay + Some decoding APIs support delayed output for performance reasons. + One example would be to request decoding for multiple frames and + then query for the oldest frame in the output queue. + This also increases throughput for transcoding and improves seek + performance when supported by the underlying backend. + Introduce support in the mpeg2 base class, so that backends that + support render delays can actually implement it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-04-06 16:40:28 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/gstv4l2codecmpeg2dec.c: + * sys/v4l2codecs/gstv4l2codecmpeg2dec.h: + * sys/v4l2codecs/meson.build: + * sys/v4l2codecs/plugin.c: + v4l2codecs: Implement a MPEG2 V4L2 decoder element + Implement a MPEG2 V4L2 decoder element based on the previous h264 + implementation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-04-06 16:42:54 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/linux/videodev2.h: + v4l2codecs: mpeg2: update to the new uAPI + The mpeg2 stateless api has undergone changes as it is being + destaged. Update the v4l2-controls header to match.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-03-26 15:52:21 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/va/gstvampeg2dec.c: + sys: va: GstVaMpeg2Dec: use slice sc_offset and size + Seeing as how GstMpeg2Slice will now record the start code offset + as well as its size with the above field taken into account, the + manual computation in this class is not needed. + Remove it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-03-26 15:31:51 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstmpeg2picture.h: + codecs: GstMpeg2Slice: add field for sc_offset and size + Downstream might need the start code offset when decoding. + Previously this computation would be scattered in multiple sites. This + is error-prone, so move it to the base class. Subclasses can access + slice->sc_offset directly without computing the address themselves, + knowing that the size will also take the start code into account. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1013> + +2021-10-20 12:11:49 +0100 James Cowgill <james.cowgill@blaize.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + v4l2codecs: Handle allocator creation failure + If `VIDIOC_REQBUFS` doesn't return enough buffers, the allocator creation + function can fail and return `NULL`. Handle this by generating an error + and returning instead of segfaulting. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1220> + +2021-10-20 12:08:49 +0100 James Cowgill <james.cowgill@blaize.com> + + * sys/v4l2codecs/gstv4l2codecallocator.c: + v4l2codecs: Fix segfault when destroying non-detached allocator + The GstV4l2CodecAllocator dispose function clears `self->decoder` but + the finalize function then tries to use it if the allocator has not been + detached yet.
+ Fix by detaching in the dispose function before we clear + `self->decoder`. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1220> + +2021-10-29 16:08:20 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * gst/codecalpha/gstalphacombine.c: + alphacombine: use the same allocation query data for both decoders + This allows downstream elements to set allocation query parameters for both + decoders, which should be always the same. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1277> + +2021-10-31 13:43:40 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtdec.c: + * sys/applemedia/vtenc.c: + applemedia: Add ARGB64_BE, RGBA64_LE support to vtenc/vtdec + We can add this now that ARGB64_BE videoconvert support was added in: + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1247 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-20 02:19:33 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtenc.c: + * sys/applemedia/vtenc.h: + vtenc: Add FieldDetail properties for interlaced input + Standard interlace handling: + * If we have interlace-mode=interleaved and the field order, we just + set it when creating the session + * If we have interlace-mode=(interleaved|mixed) and no field order, we + set the field order on the first buffer + The encoder session does not support changing the FieldDetail after it + has started encoding frames, so we cannot support mixed streams + correctly. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-20 01:49:29 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtenc.c: + * sys/applemedia/vtenc.h: + vtenc: Add a property to forcibly ignore alpha values + This PropertyKey is not documented in any headers anywhere, so we need + to define it ourselves. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-19 23:53:39 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtenc.c: + vtenc: Set colorimetry information + It looks like VideoToolbox doesn't support all our colorimetries. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-17 18:54:10 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/coremediabuffer.h: + * sys/applemedia/vtdec.c: + * sys/applemedia/vtenc.c: + * sys/applemedia/vtenc.h: + * sys/applemedia/vtutil.c: + * sys/applemedia/vtutil.h: + applemedia: Add ProRes support to vtenc and vtdec + For vtdec, we continue to prefer NV12; else we pick whatever + downstream wants. In the special case where we're decoding 10-bit or + 12-bit ProRes formats, we will prefer AYUV64. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-17 19:19:15 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtenc.c: + vtenc: Improve error reporting in chain function + Otherwise it is quite difficult to figure out why the chain function + failed. Also assert not reached for case statements that should not be + hit. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-14 12:14:49 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * sys/applemedia/vtdec.c: + vtdec: Remove dead code in switch statement + We never advertise these formats, so these cases will never be hit. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1214> + +2021-10-30 16:22:39 +0300 Sebastian Dröge <sebastian@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + video: Fix order of new video formats + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1282> + +2021-10-30 00:58:55 +0100 Tim-Philipp Müller <tim@centricular.com> + + * gst/mpegtsmux/gstatscmux.c: + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/gstmpegtsmux.c: + Couple more g_memdup() -> g_memdup2() fixes + Fixes deprecation warnings with newer GLib versions. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1279> + +2021-10-30 00:52:42 +0100 Tim-Philipp Müller <tim@centricular.com> + + * ext/dtls/gstdtlssrtpenc.c: + dtls: don't use deprecated g_binding_get_source() with newer GLib versions + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1279> 2021-10-30 01:41:51 +0900 Seungha Yang <seungha@centricular.com> @@ -109,7 +2237,113 @@ codecs: h265decoder: Fix per-slice leak As documented, slice header parsed via gst_h265_parser_parse_slice_hdr() should be cleared, otherwise it would result in memory leak. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2559> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1274> + +2021-10-26 16:05:24 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + * sys/va/gstvah265dec.c: + * sys/va/gstvampeg2dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavp9dec.c: + va: Delay decoders downstream negotiation. + Delay decoders downstream negotiation just before an output frame + needs to be allocated. + This is required, at least for H.264 and H.265 decoders, since + codec_data might trigger a new sequence before finishing upstream + negotiation, and sink pad caps need to be set before setting source pad + caps, particularly to forward HDR fields.
The other decoders are + changed too in order to keep the same structure among them. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1257> + +2021-10-26 09:41:53 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvabasedec.c: + vabasedec: Move warning message to decoder's category. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1257> + +2021-10-26 09:28:10 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaav1dec.c: + * sys/va/gstvabasedec.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvah265dec.c: + * sys/va/gstvampeg2dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavp9dec.c: + va: Move common variable need_negotiation to GstBaseDec. + This is a common variable to all decoders, so it's sound to move it to + the base helper. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1257> + +2021-10-26 09:23:42 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaav1dec.c: + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvah265dec.c: + * sys/va/gstvampeg2dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavp9dec.c: + va: Move back parent_object to each element. + Using GstBaseDec hack to access the parent_object of each element in + the element itself is a bit fragile. It would be better to keep its + own parent object as the usual global variable. It would make it + resistant to code changes. + The GstBaseDec macro to access the parent object now it's internal to + base decoder. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1257> + +2021-10-23 00:44:57 +0200 Piotrek Brzeziński <piotr@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + video-format: Add support for ARGB64 LE/BE and similar variants + Co-authored-by: Sebastian Dröge <sebastian@centricular.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1247> + +2021-09-26 21:34:30 +0200 Heiko Becker <heirecka@exherbo.org> + + * ext/neon/meson.build: + neon: Allow building against neon 0.32.x + No API/ABI changes: https://github.com/notroj/neon/blob/0.32.0/NEWS#L3 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1267> + +2021-10-25 11:37:45 +0100 Tim-Philipp Müller <tim@centricular.com> + + * meson.build: + meson: require matching GStreamer dep versions for unstable development releases + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/929 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1244> + +2021-10-27 00:20:57 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/webrtctransceiver.c: + webrtcbin: fix check_negotiation computing on caps event + It seems logical that check_negotiation be true if received_caps + is *not* equal to the new caps. + Also clean up handling of transceivers' ssrc events, as this + patch triggered a leaky code path. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1233> + +2021-10-23 01:54:05 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: connect input stream when receiving caps + .. 
if a current direction has already been set + When `webrtcbin` has created an offer based on codec_preferences, + it might not have received caps on its sinkpads by the time a + remote description is set, in which case we want to connect the + input stream upon actual reception of the caps instead. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1233> + +2021-10-18 15:23:48 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: consider pads with trans->codec_preferences ready + .. when determining whether we can emit on-negotiation-needed + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1233> 2021-10-28 17:41:54 +0100 Tim-Philipp Müller <tim@centricular.com> @@ -120,7 +2354,1591 @@ in which case we'll try to map a NULL buffer which will generate lots of criticals. Fixes #855 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2558> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1265> + +2021-10-20 17:46:10 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/msdk/gstmsdkh265enc.c: + msdk: Insert hdr sei at hevc encoder + There are two HDR SEIs defined in spec: mastering display colour volume and + content light level. Add insertion of HDR SEIs when they are available + during encoding. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1242> + +2021-10-21 16:12:06 +0100 Rob Agar <rob@engineeredarts.co.uk> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Also check data channel transport when collating connection state + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/838 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1224> + +2021-10-27 11:54:09 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Color fixation will choose othercaps' structure. 
+ gst_va_fixate_format() will iterate all othercaps' structures to find + the one with the least information lost at color conversion. If a structure + with the same color format is found, the iteration stops. It's like a + smart truncation. Then, this function also will choose the caps + feature. + Later this structure is used to fixate its size and no further truncation + is needed. + Don't intersect at fixate, since it kills possible resizing. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1261> + +2021-10-27 11:53:28 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Ignore direction at orientation swapping. + The direction of the negotiation doesn't matter. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1261> + +2021-10-27 10:31:04 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Consider video orientation for border calculation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1261> + +2021-10-06 16:00:56 +0200 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpsrc.c: + avtpsrc: Retry receive with same buffer size + Without this patch, in case of a retry, recv() will be called with a + negative size argument.
+ Signed-off-by: Timo Wischer <timo.wischer@de.bosch.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1078> + +2021-10-26 16:00:36 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstcccombiner.c: + cccombiner: fix default value when installing schedule property + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1252> + +2021-10-26 15:58:26 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstcccombiner.c: + cccombiner: fix emission of selected-samples in one case + Detected while reading the code, cccombiner must set + self->current_video_buffer to NULL *after* emitting selected-samples + in order for the application to get a useful return when peeking + the next video sample. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1252> + +2021-10-26 01:09:58 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstcccombiner.c: + * tests/check/elements/cccombiner.c: + cccombiner: stop attaching caption buffers when caption pad has gone EOS + When schedule is true (as is the case by default), we insert padding + when no caption data is present in the schedule queue, and previously + weren't checking whether the caption pad had gone EOS, leading to + infinite scheduling of padding after EOS on the caption pad. + Rectify that by adding a "drain" parameter to dequeue_caption() + In addition, update the captions_and_eos test to push valid cc_data + in: without this cccombiner was attaching padding buffers it had + generated itself, and with that patch would now stop attaching + said padding to the second buffer. By pushing valid, non-padding + cc_data we ensure a caption buffer is indeed attached to the first + and second video buffers. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1252> + +2021-10-20 13:19:00 +0200 Mats Lindestam <matslm@axis.com> + + * ext/curl/gstcurlsshsink.c: + * ext/curl/gstcurlsshsink.h: + * tests/check/elements/curlsftpsink.c: + curlsftpsink: Add support for sha256 fingerprint + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1193> + +2021-10-21 11:09:07 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/msdk/gstmsdkvpp.c: + msdkvpp: Add 12bit formats + Add 12bit formats for different chroma samplings at sink pad and + src pad, including P012_LE, Y212_LE and Y412_LE. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1218> + +2021-10-25 18:52:24 +0200 Floris <weersproductions@gmail.com> + + * ext/gs/README.md: + gs: update README to use fixed versions + Use specific versions, instead of relying on 'master'. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1246> + +2021-10-25 18:47:46 +0200 Floris <weersproductions@gmail.com> + + * ext/gs/gstgssrc.cpp: + gssrc: use default blocksize + The blocksize is set to 3 * 1024 * 1024 / 2, which is the default download_size of Google-Cloud-CPP. + Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/846 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1246> + +2021-10-25 16:53:14 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah265dec.c: + vah265dec: Fix end_picture() vmethod. + Since commit 88437a9c the signature of h265decoder's end_picture() + changed to return GstFlowReturn, but vah265dec was not updated. + This commit fixes this regression. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1248> + +2021-10-12 17:32:30 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: Fix possible memory leaks + At gst_va_dmabuf_allocator_setup_buffer_full, static code analysis tool + does not know number of objects in descriptor is always larger than 0 if + export_surface_to_dmabuf succeeds. Thus, the tool will assume buf is + allocated with mem but not released when desc.num_objects equals to 0 + and raise a mem leak issue. + For gst_va_dambuf_memories_setup, we should also inform the tool that + n_planes will be larger than 0 by checking the value at very beginning. + Then, the defect similar to above will not be raised during static analysis. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1241> + +2021-10-25 01:02:28 +0100 Tim-Philipp Müller <tim@centricular.com> + + * po/af.po: + * po/ast.po: + * po/az.po: + * po/bg.po: + * po/ca.po: + * po/cs.po: + * po/da.po: + * po/de.po: + * po/el.po: + * po/en_GB.po: + * po/eo.po: + * po/es.po: + * po/eu.po: + * po/fi.po: + * po/fr.po: + * po/fur.po: + * po/gl.po: + * po/hr.po: + * po/hu.po: + * po/id.po: + * po/it.po: + * po/ja.po: + * po/ky.po: + * po/lt.po: + * po/lv.po: + * po/mt.po: + * po/nb.po: + * po/nl.po: + * po/or.po: + * po/pl.po: + * po/pt_BR.po: + * po/ro.po: + * po/ru.po: + * po/sk.po: + * po/sl.po: + * po/sq.po: + * po/sr.po: + * po/sv.po: + * po/tr.po: + * po/uk.po: + * po/vi.po: + * po/zh_CN.po: + * po/zh_TW.po: + gst-plugins-bad: update translations + Fixes #656 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1240> + +2021-10-23 19:26:06 +0200 Andoni Morales Alastruey <amorales@fluendo.com> + + * gst-libs/gst/d3d11/gstd3d11device.cpp: + d3d11: add support for new debug layer versions + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1237> + +2021-01-20 12:04:48 +0100 Rafał Dzięgiel 
<rafostar.github@gmail.com> + + * ext/assrender/gstassrender.c: + assrender: Add "application/vnd.ms-opentype" mimetype detection + The "application/vnd.ms-opentype" mimetype is commonly used in many fonts attached in the matroska videos. + Assrender should treat it as compatible without the need of parsing the file extension. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1207> + +2021-10-22 18:13:46 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openjpeg/meson.build: + wrap: libopenjp2: use patch version 7 + Add support for win32 build + Disable the binary to avoid the thirdparty + dependency to be checked. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1229> + +2021-10-21 23:35:41 -0300 Martin Reboredo <yakoyoku@gmail.com> + + * gst-libs/gst/vulkan/gstvkutils.c: + * gst-libs/gst/vulkan/gstvkutils.h: + gstvulkan: Constify code in create_shader + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1226> + +2021-10-21 00:33:06 +0100 Tim-Philipp Müller <tim@centricular.com> + + * meson_options.txt: + meson: default to gpl=disabled for gst-plugins-bad and -ugly + This will only affect individual/tarball module builds, as the + options yield to the parent project which was set to gpl=disabled + by default already. We kept it as auto in the original commit + to accommodate the need to update cerbero as well, which had to + be done separately after the initial commit. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1217> + +2021-01-20 13:38:03 +0100 Rafał Dzięgiel <rafostar.github@gmail.com> + + * ext/assrender/gstassrender.c: + assrender: Do not iterate over mimetypes without filename + No point spending time on iterating and comparing strings if we + are going to reject the value due to missing filename anyway. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1206> + +2021-01-20 11:46:17 +0100 Rafał Dzięgiel <rafostar.github@gmail.com> + + * ext/assrender/gstassrender.c: + assrender: Fix mimetype detection + Previously gst_structure_has_name was used to get a string to compare with supported mimetypes. + This is incorrect as above function returns a user defined structure name which is + not the structure mimetype value. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1206> + +2021-10-21 19:17:18 +1100 Matthew Waters <matthew@centricular.com> + + * gst-libs/gst/vulkan/gstvkdebug.c: + * gst-libs/gst/vulkan/gstvkdebug.h: + * gst-libs/gst/vulkan/gstvkswapper.c: + vulkan/swapper: add some debug logging for surface size and present modes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1219> + +2021-10-21 00:28:25 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/interlace/gstinterlace.c: + interlace: Replace custom lock with object lock + The object lock is sufficient for the task of protecting against + object property data races. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1039> + +2021-10-21 00:37:47 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/interlace/gstinterlace.c: + interlace: Protect all properties with the lock + Avoid blatant data races here. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1039> + +2021-10-21 00:36:47 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/interlace/gstinterlace.c: + interlace: Reset src_fps_d together with src_fps_n + These fields belong together. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1039> + +2021-10-21 00:35:00 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/interlace/gstinterlace.c: + interlace: Clear stored_fields together with stored_frame + These fields belong together. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1039> + +2021-10-21 00:31:24 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/interlace/gstinterlace.c: + interlace: Reset after changing state to READY + Trying to reset before the pads have been deactivated races with the + streaming thread. There was also a buggy buffer clear leaving a dangling + `stored_frame` pointer around. Use `gst_interlace_reset` so this happens + properly. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1039> + +2021-10-20 14:34:42 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/nvcodec/gstnvenc.c: + * sys/nvcodec/gstnvh264enc.c: + nvh264enc: add constrained-baseline to the caps profiles + In practice, when baseline is requested from the encoder it + produces constrained baseline, and it is already reflected + in the profile-iop flags. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1213> + +2021-10-21 01:47:07 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfplatloader.c: + * sys/mediafoundation/meson.build: + mediafoundation: Fix for UWP build + We don't support D3D11 interop for UWP because some APIs + (specifically MFTEnum2) are desktop application only. + However, the code for symbol loading is commonly used by both UWP and WIN32. + Just link GModule unconditionally which is UWP compatible, and simply don't + try to load any library/symbol dynamically when D3D11 interop is unavailable. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1216> + +2021-10-20 00:54:26 +0100 Tim-Philipp Müller <tim@centricular.com> + + * REQUIREMENTS: + * docs/plugins/gst_plugins_cache.json: + * ext/libmms/gstmms.c: + * ext/libmms/gstmms.h: + * ext/libmms/meson.build: + * ext/meson.build: + * meson_options.txt: + mms: remove mmssrc plugin + Doubtful that anyone still needs that or there are even + any streams left out there. + MMS was deprecated in 2003 (in favour of RTSP) and support for + it was dropped with Microsoft Media Services 2008. + https://en.wikipedia.org/wiki/Microsoft_Media_Server + https://sdp.ppona.com/news2008.html + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/821 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1211> + +2021-10-19 18:18:25 +0100 Tim-Philipp Müller <tim@centricular.com> + + * sys/msdk/meson.build: + * sys/va/meson.build: + meson: va, msdk: simplify dep.get_variable() use + With recent Meson versions we can just write dep.get_variable('foo') + instead of dep.get_variable(pkgconfig: 'driverdir', internal: 'driverdir'). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1183> + +2021-10-18 15:47:00 +0100 Tim-Philipp Müller <tim@centricular.com> + + * tests/check/meson.build: + * tests/validate/meson.build: + meson: update for meson.build_root() and .build_source() deprecation + -> use meson.project_build_root() or .global_build_root() instead. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1183> + +2021-10-18 00:40:14 +0100 Tim-Philipp Müller <tim@centricular.com> + + * docs/meson.build: + * ext/onnx/meson.build: + * ext/opencv/meson.build: + * ext/wayland/meson.build: + * meson.build: + * sys/msdk/meson.build: + * tests/check/meson.build: + meson: update for dep.get_pkgconfig_variable() deprecation + ... in favour of dep.get_variable('foo', ..) 
which in some + cases allows for further cleanups in future since we can + extract variables from pkg-config dependencies as well as + internal dependencies using this mechanism. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1183> + +2021-10-18 00:03:47 +0100 Tim-Philipp Müller <tim@centricular.com> + + * meson.build: + meson: clean up conditional paths after version bump + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1183> + +2020-11-19 18:03:11 +0100 Rafał Dzięgiel <rafostar.github@gmail.com> + + * ext/assrender/gstassrender.c: + assrender: fix smooth scaling by disabling hinting + When ass hinting value is set to anything other than NONE, + subtitles cannot use smooth scaling, thus all animations will jitter. + The libass author warns about possibility of breaking some scripts when it is enabled, + so lets do what is recommended and disable it to get the smooth scaling working. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1201> + +2021-05-07 11:11:31 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * ext/srt/gstsrtsink.c: + srt: Plug leak of headers + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1040> + +2020-08-31 17:17:56 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/gstbasetsmux.c: + mpegtsmux: Avoid crash when best pad gets flushed + The 'best' pad might receive a flush event between us picking it and us + popping the buffer. In this case, the buffer will be missing. 
+ Similar to https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/merge_requests/711 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1042> + +2021-10-17 11:39:57 +0100 Tim-Philipp Müller <tim@centricular.com> + + * tools/meson.build: + tools: Define G_LOG_DOMAIN for various tools as well + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1009> + +2021-10-01 15:31:18 +0100 Tim-Philipp Müller <tim@centricular.com> + + * gst-libs/gst/adaptivedemux/meson.build: + * gst-libs/gst/audio/meson.build: + * gst-libs/gst/basecamerabinsrc/meson.build: + * gst-libs/gst/codecparsers/meson.build: + * gst-libs/gst/codecs/meson.build: + * gst-libs/gst/d3d11/meson.build: + * gst-libs/gst/insertbin/meson.build: + * gst-libs/gst/interfaces/meson.build: + * gst-libs/gst/isoff/meson.build: + * gst-libs/gst/mpegts/meson.build: + * gst-libs/gst/opencv/meson.build: + * gst-libs/gst/play/meson.build: + * gst-libs/gst/player/meson.build: + * gst-libs/gst/sctp/meson.build: + * gst-libs/gst/transcoder/meson.build: + * gst-libs/gst/uridownloader/meson.build: + * gst-libs/gst/va/meson.build: + * gst-libs/gst/vulkan/meson.build: + * gst-libs/gst/wayland/meson.build: + * gst-libs/gst/webrtc/meson.build: + * gst-libs/gst/winrt/meson.build: + gst-plugins-bad: define G_LOG_DOMAIN for all libraries + Fixes #634 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1009> + +2021-10-06 13:38:35 +0200 Antonio Ospite <antonio.ospite@collabora.com> + + * ext/aes/meson.build: + aes: specify the required OpenSSL version + The code in the aes elements assumes OpenSSL >= 1.1.0: + - implicit library initialization; + - version retrieved with OpenSSL_version(OPENSSL_VERSION); + and it fails to build with older versions. + Specify the required OpenSSL version explicitly in meson.build so that + the elements are excluded on older systems (e.g. Ubuntu 16.04) and the + rest of GStreamer can still build. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1067> + +2021-10-11 13:05:24 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadeinterlace.c: + vadeinterlace: Accept ANY feature. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1024> + +2021-10-11 13:04:19 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadeinterlace.c: + vadeinterlace: Fixate interlace-mode and framerate accordingly. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1024> + +2021-10-14 07:03:26 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Accept ANY feature. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1024> + +2021-10-13 19:27:41 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Traverse caps features in gst_va_vpp_caps_remove_fields() + The previous code had a potential failure for multiple caps features. Now + each caps feature in each structure is reviewed, and if it has a supported + feature, the structure is processed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1024> + +2021-10-13 17:08:12 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Refactor gst_va_vpp_complete_caps_features() + gst_va_vpp_complete_caps_features() now receives the @feature_name to + add and returns if @caps doesn't provide it. + So, instead of two nested loops, now the function is a single loop, + traversing @caps to find if each structure already contains the requested + @features_name. + It's important to add missing caps features with @caps, in order + not to lose information. + The function caller does the external loop by calling it for each + available caps feature.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1024> + +2021-10-11 18:57:48 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Split caps transform in two phases. + In order to make the caps transformation more readable, the operation + was split into two phases: + 1. Rangify the supported caps structures. + 2. Add the missing (and supported) caps features. + Step 1 modified its logic by copying any unrecognized structure. + This is a preliminary step required for allowing ANY caps feature as + passthrough. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1024> + +2021-10-08 12:38:04 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Change many GST_{DEBUG, LOG, etc} into _OBJECT + Log files with several demuxers running at once can otherwise get + confusing + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1096> + +2021-10-08 12:36:58 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Issue GST_ELEMENT_WARNING for continuity errors + The application might want to make use of these. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1096> + +2021-10-14 18:38:26 +0100 Tim-Philipp Müller <tim@centricular.com> + + * meson.build: + meson: bump meson requirement to >= 0.59 + For monorepo build and ugly/bad, for advanced feature + option API like get_option('xyz').required(..) which + we use in combination with the 'gpl' option. + For the rest of the modules, for consistency (people will likely + use newer features based on the top-level requirement). 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1084> + +2021-09-19 00:55:34 +0100 Tim-Philipp Müller <tim@centricular.com> + + * ext/dts/meson.build: + * ext/faad/meson.build: + * ext/iqa/iqa.c: + * ext/iqa/meson.build: + * ext/mpeg2enc/meson.build: + * ext/mplex/meson.build: + * ext/resindvd/meson.build: + * ext/x265/meson.build: + * meson.build: + * meson_options.txt: + meson: add 'gpl' option and only build plugins with (A)GPL deps if explicitly enabled + Require explicit opt-in to build plugins with (A)GPL dependencies. + Keep ugly/bad options on 'auto' for now so cerbero doesn't fail. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1084> + +2020-10-30 16:02:22 +0200 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsmux/gstbasetsmux.c: + basetsmux: Support for caps changes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/981> + +2020-11-12 12:17:14 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/gstbasetsmux.c: + basetsmux: Clean up gst_base_ts_mux_create_stream + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/981> + +2021-08-31 16:35:06 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsdemux/mpegtsbase.c: + mpegtsbase: Search SCTE-35 DRF_ID_CUEI in multiple registration descriptors + There are streams in the wild that have to add a SCTE-35 trigger in + another e.g. GA94 stream. Most encoders would replace the GA94 + descriptor ID with the CUEI one temporarily, but there are some that + will add two registration ID descriptors, one with GA94 and one with + CUEI. + Failing to parse the CUEI registration ID in that case would return + FALSE in _stream_is_private_section , therefore setting it as known PES + and pushing packets downstream instead of calling handle_psi. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/979> + +2021-10-01 14:36:48 +0200 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Improve gap detection + We should also take into account whether data is currently pending when checking + for gap on streams. It could very well be that some streams have very low + bitrate (and spread out) data. For those we don't want to push out a gap event. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1179> + +2021-09-05 11:57:18 +0200 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/mpegtspacketizer.c: + * gst/mpegtsdemux/mpegtspacketizer.h: + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Handle "negative" timestamps + This is only enabled in push time mode. Furthermore it's only enabled for now if + PCR is to be ignored. + The problem is dealing with streams where the initial PTS/DTS observation might + be greater than following ones (from other PID for example). Before this patch, + this would result in sending buffers without any timestamp which would cause a + wide variety of issues. + Instead, pad segment and buffer timestamps with an extra + value (packetizer->extra_shift, default to 2s), to ensure that we can get valid + timestamps on outgoing buffers (even if that means they are before the segment + start). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1179> + +2021-09-05 11:55:55 +0200 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Handle streams with bogus PTS vs DTS + PTS and DTS should be within a reasonable distance of each other. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1179> + +2021-09-05 11:53:05 +0200 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/mpegtspacketizer.c: + tsdemux: Handle PTS->TS at wraparound + This has been a FIXME for ages. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1179> + +2021-10-14 14:07:07 +0100 Rob Agar <rob@engineeredarts.co.uk> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: fix prevention of webrtcbin deletion due to ref held by probe callback + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/810 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1150> + +2021-10-16 19:01:27 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11convert.cpp: + d3d11: d3d11{convert,scale}: Add add-borders property + Functionally identical to that of videoscale element. + When disabled, d3d11convert or d3d11scale element will scale + image without adding borders, meaning that display aspect ratio + will not be preserved. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1170> + +2021-10-16 10:58:53 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + vapostproc: Add add-borders property to keep dar + Just as videoscale, it enables add-borders property (FALSE by default) + in vapostproc to add border, if necessary, to keep the display aspect + ratio from the original image. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1169> + +2021-10-16 10:51:57 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Fix early fixation. + First copy missing fields and then fixate all remaining fields. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1169> + +2021-10-14 19:08:49 +0100 Tim-Philipp Müller <tim@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/meson.build: + * ext/ofa/gstofa.c: + * ext/ofa/gstofa.h: + * ext/ofa/meson.build: + * meson_options.txt: + * tests/check/elements/ofa.c: + ofa: remove ofa audio fingerprinting plugin + I think the MusicIP database for this has been defunct for years, + so I can't imagine this plugin is particularly useful or still + used by anyone. + See https://musicbrainz.org/doc/Fingerprinting#PUID + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1153> + +2021-10-16 22:43:32 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/meson.build: + * sys/wasapi2/meson.build: + meson: wasapi2,mediafoundation: Work around Windows SDK header issue + Some SDK headers are not standard compliant, so MSVC will + complain when such headers are in use with "/permissive-" compile + option. Use "/Zc:twoPhase-" to work around the issue as documented in + https://docs.microsoft.com/en-us/cpp/build/reference/permissive-standards-conformance?view=msvc-160#windows-header-issues + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1174> + +2021-10-16 09:29:28 -0300 Thibault Saunier <tsaunier@igalia.com> + + * ext/fdkaac/meson.build: + meson: Mark newly fdkaac/ogg/vorbis as allow fallback + This way when the dep is `auto` we will fallback if the system + dependency is not available. 
+ And use https to get libvorbis + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1171> + +2021-10-16 01:15:06 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfplatloader.c: + * sys/mediafoundation/gstmfplatloader.h: + * sys/mediafoundation/gstmftransform.cpp: + * sys/mediafoundation/gstmftransform.h: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/meson.build: + * sys/mediafoundation/plugin.c: + mediafoundation: Use GetProcAddress() for OS version dependent symbols + We are using some symbols which are not available on Windows 7, + specifically D3D11 interop related ones + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1167> + +2021-10-06 03:26:25 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Transfer colorimetry at fixate if possible. + Taken from videoconvert element. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1110> + +2021-10-12 15:52:48 -0300 Thibault Saunier <tsaunier@igalia.com> + + * docs/meson.build: + meson: Streamline the way we detect when to build documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1093> + +2020-06-27 00:39:00 -0400 Thibault Saunier <tsaunier@igalia.com> + + * docs/meson.build: + * gst-libs/gst/adaptivedemux/meson.build: + * gst-libs/gst/audio/meson.build: + * gst-libs/gst/basecamerabinsrc/meson.build: + * gst-libs/gst/codecparsers/meson.build: + * gst-libs/gst/codecs/meson.build: + * gst-libs/gst/d3d11/meson.build: + * gst-libs/gst/insertbin/meson.build: + * gst-libs/gst/interfaces/meson.build: + * gst-libs/gst/mpegts/meson.build: + * gst-libs/gst/opencv/meson.build: + * gst-libs/gst/play/meson.build: + * gst-libs/gst/player/meson.build: + * gst-libs/gst/sctp/meson.build: + * gst-libs/gst/transcoder/meson.build: + * gst-libs/gst/uridownloader/meson.build: + * gst-libs/gst/va/meson.build: + * 
gst-libs/gst/vulkan/meson.build: + * gst-libs/gst/wayland/meson.build: + * gst-libs/gst/webrtc/meson.build: + * meson.build: + meson: List libraries and their corresponding gir definition + Introduces a `libraries` variable that contains all libraries in a + list with the following format: + ``` meson + libraries = [ + [pkg_name, { + 'lib': library_object + 'gir': [ {full gir definition in a dict } ] + ], + .... + ] + ``` + It therefore refactors the way we build the gir so that we can reuse the + same information to build them against 'gstreamer-full' in gst-build + when linking statically + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1093> + +2020-06-27 00:37:39 -0400 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/audio/meson.build: + * gst-libs/gst/basecamerabinsrc/meson.build: + * gst-libs/gst/codecs/meson.build: + * gst-libs/gst/insertbin/meson.build: + * gst-libs/gst/mpegts/meson.build: + * gst-libs/gst/play/meson.build: + * gst-libs/gst/player/meson.build: + * gst-libs/gst/vulkan/meson.build: + * gst-libs/gst/webrtc/meson.build: + meson: Mark files as files() + Making it more robust and future proof + And fix issues that it creates + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1093> + +2021-09-13 17:53:12 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/audio/meson.build: + bad:audio: Add generated files sources in declare_dependency + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1093> + +2021-10-15 23:18:41 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmftransform.cpp: + * sys/mediafoundation/gstmfutils.cpp: + * sys/mediafoundation/gstmfvideoenc.cpp: + mediafoundation: Fix various string constness handling + ... with fixing typo (g_strup -> g_strdup) + Constness needs to be explicit in C++ world otherwise compiler + would complain about that. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1164> + +2021-10-15 10:03:46 +0100 Rob Agar <rob@engineeredarts.co.uk> + + * tests/examples/webrtc/webrtcrenego.c: + Fix missing transceiver unref in WebRTC renegotiation example + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1160> + +2021-10-13 17:17:44 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * tests/examples/va/multiple-vpp.c: + tests:va: Fix null ptr dereference in multi-vpp + The pointer err was dereferenced before the null check, which Coverity + flagged as a null pointer dereference issue. Modify the code to do the + null check of err first, then dereference it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1143> + +2021-10-13 15:58:29 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/va/gstvadecoder.c: + va: Fix null ptr dereference for vadecoder + Making a null check in gst_va_decode_picture_free () indicates pic->buffers or pic->slices + can be null, then in _destroy_buffers () the pointers are dereferenced, which is detected + as dereference after null check by Coverity. Thus, modify the code to do the null check in + _destroy_buffers (). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1143> + +2021-04-05 10:29:37 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsdemux/mpegtsbase.c: + * gst/mpegtsdemux/mpegtsbase.h: + * gst/mpegtsdemux/tsdemux.c: + * gst/mpegtsdemux/tsdemux.h: + tsdemux: Handle delayed seek events + Store the event in case it cannot be processed immediately and process + it after the first segment has been produced. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/980> + +2021-04-05 10:28:51 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/mpegtsdemux/tsdemux.c: + * gst/mpegtsdemux/tsdemux.h: + tsdemux: Protect demux->segment_event with a mutex + Would otherwise cause weird issues when processing a delayed seek event + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/980> + +2021-10-14 14:35:45 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264: Fix wrong type of ret variable + This ret is not a GstFlowReturn. This broke v4l2 decoder which does not + implement new_picture() virtual function. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1154> + +2021-10-13 21:45:34 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11desktopdup.h: + * sys/d3d11/gstd3d11screencapture.cpp: + * sys/d3d11/gstd3d11screencapture.h: + * sys/d3d11/gstd3d11screencapturedevice.cpp: + * sys/d3d11/gstd3d11screencapturedevice.h: + * sys/d3d11/gstd3d11screencapturesrc.cpp: + * sys/d3d11/gstd3d11screencapturesrc.h: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.cpp: + * tests/examples/d3d11/d3d11screencapturesrc.cpp: + * tests/examples/d3d11/meson.build: + d3d11: Rename screen capture element + Old name "desktopdup" may confuse users. 
Now it's renamed to + "screencapture" + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1136> + +2021-10-05 18:52:25 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + transcoder: Set state back to NULL after run() finishes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1063> + +2021-09-13 18:02:03 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.h: + transcoder: Use full path for includes in 'gsttranscoder.h' + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1063> + +2021-10-12 17:50:31 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * gst-libs/gst/va/gstvadisplay_drm.c: + va:display: Don't close an fd with negative value + Cannot pass negative parameter to close() and thus no need to apply + close() when fd < 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1131> + +2021-10-12 17:48:17 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/va/gstvadecoder.c: + va: Fix error handling for decoder + Need to check if va decoder is closed successfully. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1131> + +2021-10-12 17:44:27 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/va/gstvabasetransform.c: + va: Fix error handling for base transform + Need to check the returned value of gst_buffer_pool_set_active() when + setting the active status of buffer pool. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1131> + +2021-10-13 21:26:53 -0300 Thibault Saunier <tsaunier@igalia.com> + + * ext/avtp/meson.build: + meson:avtp: Error out if sock_txtime is not present and avtp is enabled + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1142> + +2021-10-10 01:56:32 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecparsers/gsth264parser.c: + * gst-libs/gst/codecparsers/gsth265parser.c: + codecparsers: {h264,h265}parser: Fix typo around SEI nalu generator + Fix to create correct SEI nalu when the size of payloadType and/or + payloadSize is larger than 255 (0xff) + Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1601 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1106> + +2021-10-05 20:15:44 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvavpp.c: + vapostproc: Negotiate interlaced. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1109> + +2021-10-05 20:15:09 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Copy missing fields at fixate. + When caps negotiation implies a caps feature change, some fields might + get lost. This patch brings them back from input caps. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1109> + +2021-10-05 20:15:09 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Simplify size fixate. + gst_va_vpp_fixate_size() returned the fixated caps, but that is not + needed since `othercaps` are modified inline. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1109> + +2021-10-05 20:15:09 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + vapostproc: Simplify fixate. 
+ The first approach to fixate was simply a copy&paste of both + videoconvert and videoscale, trying to keep their logic as isolated + as possible. But that brought duplicated and sparse logic. + This patch merges both approaches, simplifying the fixate operation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1109> + +2021-10-05 17:41:57 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadeinterlace.c: + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + va: filter, deinterlace, vpp: Add gst_va_buffer_get_surface_flags(). + Add a helper function to get, from GstVideoInfo and GstBuffer flags, + the VA interlace surface flags. This is currently used by the vadeinterlace + element, but it will be used in vapostproc too if it can process + interlaced frames. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1109> + +2021-09-22 14:50:40 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/msdk/gstmsdkenc.c: + msdkenc: fix vp9enc initialization failure + MediaSDK does not support handling extbuff with id + MFX_EXTBUFF_VIDEO_SIGNAL_INFO for mjpegenc and vp9enc. Hence, we need to + exclude mjpeg and vp9 when passing color properties to MediaSDK during + msdkenc initialization. 
+ Fix issue: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/764 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1101> + +2021-10-10 17:04:13 +0900 Seungha Yang <seungha@centricular.com> + + * tests/examples/d3d11/d3d11desktopdupsrc.cpp: + * tests/examples/d3d11/d3d11device.cpp: + * tests/examples/d3d11/d3d11device.h: + * tests/examples/d3d11/d3d11videosink-kb.c: + * tests/examples/d3d11/d3d11videosink-kb.h: + * tests/examples/d3d11/d3d11videosink-shared-texture-d3d9ex.cpp: + * tests/examples/d3d11/d3d11videosink-shared-texture.cpp: + * tests/examples/d3d11/d3d11videosink.c: + * tests/examples/d3d11/meson.build: + * tests/examples/meson.build: + examples: d3d11: Add a desktop capture example + ... with d3d11 desktop capture device provider + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1103> + +2021-10-05 21:49:38 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11desktopdup.cpp: + * sys/d3d11/gstd3d11desktopdup.h: + * sys/d3d11/gstd3d11desktopdupdevice.cpp: + * sys/d3d11/gstd3d11desktopdupdevice.h: + * sys/d3d11/gstd3d11desktopdupsrc.cpp: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.cpp: + d3d11: Add device provider for d3d11desktopdupsrc + ... and add support for multi-GPU/multi-monitor + By using newly added "monitor-handle" property, user can specify a + monitor to be captured via HMONITOR handle. 
+ Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1673 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1103> + +2021-10-11 15:23:08 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/ttml/gstttmlparse.c: + * ext/ttml/gstttmlrender.c: + ttml: fix log init + The log system should be initialized before calling a log function + Fix regression after: + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1112> + +2021-10-04 13:30:37 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * tests/check/elements/vapostproc.c: + * tests/check/meson.build: + tests: va: Add VA buffer copy tests. + It should only work for raw buffers, but fails on dmabuf since it + should have a drm modifier. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-06 15:20:50 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: dmabuf: Use GstVaSurfaceCopy, if possible. + If the dmabuf-based buffer to copy contains only one memory, and there are + memories available in the allocator's pool, a fast memory copy using + GstVaSurfaceCopy is possible, regardless of the drm modifier. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-05 15:21:01 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: Use GstVaSurfaceCopy. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-05 13:36:56 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvasurfacecopy.c: + * sys/va/gstvasurfacecopy.h: + * sys/va/meson.build: + * sys/va/vasurfaceimage.c: + * sys/va/vasurfaceimage.h: + va: Add GstVaSurfaceCopy class. + This new class is a helper for fast/tricky copying of surfaces. 
First it + tries to copy using the function vaCopy in libva 1.12. If that fails, or + vaCopy is not available, it tries to instantiate a GstVaFilter with the + allocator's parameters, and if that succeeds, it's used for copying the + source surface. + This is required for dmabuf surfaces with a drm modifier. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-05 13:21:00 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/vasurfaceimage.c: + * sys/va/vasurfaceimage.h: + va: filter: Enable passing VASurfaceID in GstVaSample. + Initially GstVaSample processed its GstBuffer member to get the + VASurfaceID. But there might be cases where we already have the VASurfaceID + to be processed by the filter. + This patch enables the possibility to pass the surfaces rather than + the buffers. In order to validate the surfaces, a function to check + them was added. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-07 21:51:55 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/meson.build: + * sys/va/vasurfaceimage.c: + * sys/va/vasurfaceimage.h: + va: Split VA memory handling in different files. + Take out the VA memory wrappers from gstvaallocator.c to an external + file exposing the functions. + This is going to be needed for the copy helper object. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-05 06:54:25 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.h: + va: allocator: Add missing header file. + Added stdint.h because uintptr_t is used. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-10-04 21:31:53 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadisplay_priv.c: + * sys/va/gstvadisplay_priv.h: + va: display: Add gst_va_display_has_vpp() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1023> + +2021-09-30 14:59:31 +0200 Benjamin Gaignard <benjamin.gaignard@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/linux/v4l2-controls.h: + v4l2codecs: Align v4l2-controls header with kernel 5.15-rc3 + Update v4l2-controls to be aligned with kernel 5.15-rc3. + Fix VP8 decoder to use the correct field name. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1081> + +2021-10-08 23:07:32 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/gstmfvideoenc.h: + * sys/mediafoundation/gstmfvp9enc.cpp: + mediafoundation: mfvideoenc: Use DXGI adapter LUID + Make use of new DXGI adapter LUID based device context sharing. + Note that we were using DXGI adapter LUID to open MFT already. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1098> + +2021-10-08 22:37:20 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11: d3d11decoder: Use DXGI adapter LUID + ... instead of index of DXGI adapter. 
+ The order of IDXGIAdapter1 enumerated via IDXGIFactory1::EnumAdapters1 + can vary, even without rebooting, when the GPU preference order + is updated by the user (for example, via the NVIDIA Control Panel + on a multi-GPU laptop system), and an eGPU is another possible case. + So, for an element which requires a fixed target GPU, + index based device enumeration is unreliable. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1098> + +2021-10-08 21:39:44 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11utils.cpp: + * gst-libs/gst/d3d11/gstd3d11utils.h: + d3d11: d3d11utils: Add support for DXGI Adapter LUID based D3D11 device context sharing + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1098> + +2021-10-08 19:41:22 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.cpp: + * gst-libs/gst/d3d11/gstd3d11device.h: + * gst-libs/gst/d3d11/gstd3d11utils.cpp: + * gst-libs/gst/d3d11/gstd3d11utils.h: + * tests/check/libs/d3d11device.cpp: + * tests/check/meson.build: + d3d11: d3d11device: Add gst_d3d11_device_new_{for_adapter_luid,wrapped} + * gst_d3d11_device_new_for_adapter_luid() + Used for creating a D3D11 device for a DXGI adapter (i.e., GPU) + corresponding to a LUID (Locally Unique Identifier). + This method can be useful for interop with other APIs such as + Direct3D12, MediaFoundation, CUDA, etc. + * gst_d3d11_device_new_wrapped() + Allows creating a new GstD3D11Device object by using an already + configured ID3D11Device. 
This is conceptually equivalent to + gst_gl_context_new_wrapped() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1098> + +2021-10-08 17:16:02 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.cpp: + * sys/d3d11/gstd3d11window.cpp: + d3d11: d3d11device: Remove "allow-tearing" property + Plugin can query DXGI_FEATURE_PRESENT_ALLOW_TEARING without d3d11device + help + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1098> + +2021-10-08 21:14:52 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * tests/check/elements/webrtcbin.c: + webrtcbin: Use the same promise reply structure name everywhere + This was an inconsistent mix of different names in the past. The name + has no meaning at all so let's set all to "application/x-gst-promise". + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1099> + +2021-10-08 15:44:37 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * docs/plugins/gst_plugins_cache.json: + * gst/debugutils/gstchopmydata.c: + chopmydata: Fix FIXMEs in gst_element_class_set_static_metadata + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1097> + +2021-10-06 03:19:30 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11desktopdupsrc.cpp: + d3d11: d3d11desktopdupsrc: Add support for non-D3D11 downstream element + By this commit, application doesn't need to configure d3d11download + element for software pipeline which will make things simpler + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1082> + +2021-10-06 22:06:44 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstav1decoder.c: + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth265decoder.c: + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstvp8decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: Use 
GST_VIDEO_DECODER_ERROR() only for decoding error case + The GST_VIDEO_DECODER_ERROR() should be used only for robust/error-resilient + decoding purposes. Any other error codes such as not-negotiated or flushing + should be returned unmodified so that upstream is able to handle + them immediately. (for example, an application might want to try another + decoder element on not-negotiated) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1070> + +2021-10-07 01:54:29 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Update for remaining gboolean to GstFlowReturn port + Fix for spurious/spammy warning and wrong function return type + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1085> + +2021-09-23 17:36:20 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * ext/openjpeg/meson.build: + meson: Fix build with -Dopenjpeg=disabled + tests/check/meson.build uses the openjpeg_dep variable + unconditionally, and the subdir_done() is useless anyway, since the + plugin is only built if openjpeg_dep.found() is true. Fixes: + ..\tests\check\meson.build:23:0: ERROR: Unknown variable "openjpeg_dep". + In particular, this fixes the build on UWP since we disable openjpeg + explicitly in Cerbero when building for UWP. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1069> + +2021-10-01 14:35:06 +0200 Guillaume Desmottes <guillaume@desmottes.be> + + * ext/gs/README.md: + bad: gs: update README + - add one missing dep + - change install path to match monorepo + - fix current dirs + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1008> + +2021-09-17 13:02:38 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadeinterlace.c: + * sys/va/gstvavpp.c: + * tests/examples/va/main.c: + va: Use macro rather than VAMemory feature string. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1061> + +2021-09-24 11:53:56 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsdemux/mpegtsparse.c: + mpegtsparse: Don't assert the packet_size when filling for EOS + If the packetizer got reset for any reason (failure to find PCR?) then + the packet_size can be zero here even though we already enqueued some + packets. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1038> + +2021-09-22 00:05:43 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstmpeg2decoder.h: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/va/gstvampeg2dec.c: + codecs: mpeg2decoder: Use GstFlowReturn everywhere + boolean return value is not sufficient for representing the reason + of error in most cases. For instance, any errors around new_sequence() + would mean negotiation error, not just *ERROR*. + And some subclasses will allocate buffer/memory/surface on new_picture() + but it could be failed because of expected error, likely flushing + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1019> + +2021-09-21 22:21:51 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/nvcodec/gstnvh264dec.c: + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/va/gstvah264dec.c: + codecs: h264decoder: Use GstFlowReturn everywhere + boolean return value is not sufficient for representing the reason + of error in most cases. For instance, any errors around new_sequence() + would mean negotiation error, not just *ERROR*. 
+ And some subclasses will allocate buffer/memory/surface on new_picture() + but it could be failed because of expected error, likely flushing + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1019> + +2021-09-21 00:23:13 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + * gst-libs/gst/codecs/gsth265decoder.h: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/nvcodec/gstnvh265dec.c: + * sys/va/gstvah265dec.c: + codecs: h265decoder: Use GstFlowReturn everywhere + boolean return value is not sufficient for representing the reason + of error in most cases. For instance, any errors around new_sequence() + would mean negotiation error, not just *ERROR*. + And some subclasses will allocate buffer/memory/surface on new_picture() + but it could be failed because of expected error, likely flushing + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1019> + +2021-10-02 21:22:23 +0900 Seungha Yang <seungha@centricular.com> + + * ext/closedcaption/bit_slicer.c: + * ext/closedcaption/io-sim.c: + * ext/closedcaption/misc.h: + * ext/closedcaption/sampling_par.c: + closedcaption: Fix broken debug function macros with MSVC build + warning C4003: not enough arguments for function-like macro invocation 'warning' + G_STMT_END macro is extended to the below form with MSVC + __pragma(warning(push)) \ + __pragma(warning(disable:4127)) \ + while(0) \ + __pragma(warning(pop)) + So MSVC preprocessor will extend it further to + __pragma(VBI_CAT_LEVEL_LOG(push)) ... + Should rename warning() debug macro function therefore. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1018> + +2021-10-02 20:12:07 +0900 Seungha Yang <seungha@centricular.com> + + * gst/mpegtsmux/gstbasetsmux.c: + mpegtsmux: basetsmux: Don't try to return value from void function + gstbasetsmux.c(1508): warning C4098: 'free_splice': 'void' function returning a value + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1018> + +2021-10-03 16:53:54 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadevice.c: + * sys/va/meson.build: + va: Make libgudev dependency optional. + libgudev is a problematic dependency, particularly in sandboxed + environments, such as flatpak. + This patch implements a way to get the available VA devices using + a brute-force traversal of the /dev/drm/renderD* directory, thus making it + usable in those sandboxed environments. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1027> + +2021-10-03 15:45:58 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/va/meson.build: + * sys/va/meson.build: + va: meson: Move back libgudev dependency to plugin. + When libgstva was moved, the libgudev dependency was moved as part of the + library, though it's not used by the library but by the plugin. This patch + moves the libgudev dependency back to the plugin. + Also, HAVE_LIBDRM is moved to the library, which is the one that uses it. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1027> + +2021-10-03 19:14:07 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvh264dec.c: + nvcodec: nvh264sldec: Add support for interlaced stream + Implement missing interlaced stream support + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1026> + +2021-10-03 17:41:40 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvh264dec.c: + nvcodec: nvh264sldec: Consider additional render delay DPB pictures + At least additional 4 pictures are required + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1026> + +2021-10-03 17:37:02 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvvp9dec.c: + nvcodec: nvvp9sldec: Fix for VP9 profile2 decoding + Fix for output video format to be selected correctly + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1026> + +2021-10-03 02:14:17 +0900 Seungha Yang <seungha@centricular.com> + + * sys/msdk/meson.build: + msdk: meson: Fix build on Windows + subprojects\gst-plugins-bad\sys\msdk\meson.build:160:2: ERROR: Unknown variable "libva_dep". + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1022> + +2021-10-03 01:45:57 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11: d3d11vp9dec: Fix use_prev_in_find_mv_refs value setting + "last_show_frame" should be updated based on + GstVp9FrameHeader::show_frame, not show_existing_frame + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1021> + +2021-09-18 22:51:53 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: {vp8,vp9}decoder: Drain on new_sequence() + Decoder should drain queued frame (if any) and empty DPB before + starting new sequence. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/987> + +2021-10-01 00:27:42 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvvp8dec.c: + * sys/nvcodec/gstnvvp9dec.c: + nvcodec: nv{vp8,vp9}sldec: Implement get_preferred_output_delay() + Equivalent to that of nvh264sldec. Use render delay in case of non-live + pipeline for the better throughput performance. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/987> + +2021-10-01 01:00:24 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: {vp8,vp9}decoder: Cleanup drain code + Make them consistent with h26x decoder baseclass + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/987> + +2021-10-01 02:58:44 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + codecs: vp8decoder: Fix typo + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/987> + +2021-10-02 20:21:41 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.h: + codecs: vp9decoder: add support for render delay + Some decoding APIs support delayed output for performance reasons. + One example would be to request decoding for multiple frames and + then query for the oldest frame in the output queue. + This also increases throughput for transcoding and improves seek + performance when supported by the underlying backend. + Introduce support in the vp9 base class, so that backends that + support render delays can actually implement it. 
+ Co-authored by Seungha Yang <seungha@centricular.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/987> + +2021-10-02 19:47:45 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.h: + codecs: vp9decoder: Fix class struct documentation + s/GstVp9Decoder/GstVp9DecoderClass + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/987> + +2021-09-30 17:38:33 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/meson.build: + va: meson: Update and enhance meson syntax usage. + This patch contains two updates: + 1. Instead of checking for dependency already checked just to verify a + version, we use the dependency version API. + 2. Update the deprecated function get_pkgconfig_variable. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/997> + +2021-09-30 17:43:09 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/msdk/meson.build: + msdk: meson: Don't get dependency variable before it's valid. + It's possible to have installed MediaSDK environment + package (libmfx-dev in Debian) without libva environment package. This + setup will lead to a breakage of meson configuration. + The fix is to get the libva's driver directory variable after the + dependency is validated as found. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/998> + +2021-09-30 13:32:44 +0200 Marc Leeman <m.leeman@televic.com> + + * gst/rist/gstristsink.c: + ristsink: set sync to FALSE on RTCP sink + See commit 921e9a54: rtpsink: set sync off on rtcp_sink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/993> + +2021-09-30 13:19:40 +0200 Marc Leeman <m.leeman@televic.com> + + * gst/rtp/gstrtpsink.c: + rtpsink: set sync off on rtcp_sink + When using the following setup (the error can be reproduced using + simpler sender pipelines), the receiver resynchronises the clock on RTCP + packets. 
The effect was that a couple of seconds were cut out of the + playback because an initial RTCP packet was dropped. + When sending out all RTCP packets (setting sync=FALSE on the RTCP + udpsink), the playback is fine. + This syncs rtpsink with rtpsrc (where this property was already set). + gst-launch-1.0 filesrc location=899-en.mp3 \ + ! mpegaudioparse \ + ! mpg123audiodec \ + ! audioconvert \ + ! audioresample \ + ! avenc_g722 \ + ! rtpg722pay + ! rtpsink uri=rtp://239.1.2.3:1234 + gst-launch-1.0 uridecodebin rtp://239.1.2.3:1234?encoding-name=G722 \ + ! autoaudiosink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/993> + +2020-09-17 15:06:38 +0200 Marc Leeman <m.leeman@televic.com> + + * gst/rtp/gstrtpsrc.c: + * tests/check/elements/rtpsrc.c: + rtpmanagerbad: do not set iface on sink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/994> + +2021-08-27 19:19:57 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfdevice.c: + * sys/mediafoundation/gstwin32devicewatcher.cpp: + * sys/mediafoundation/gstwin32devicewatcher.h: + * sys/mediafoundation/meson.build: + mfdeviceprovider: Add support for device update + Similar to the wasapi2 plugin, GstWinRT library will be used for UWP, + and adding new GstWin32DeviceWatcher object implementation for + Win32 desktop application. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/947> + +2021-08-26 22:38:37 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2device.c: + * sys/wasapi2/gstwasapi2device.h: + * sys/wasapi2/meson.build: + * sys/wasapi2/plugin.c: + wasapi2deviceprovider: Add support for device update + ... 
by using newly implemented GstWinRT library + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/947> + +2021-08-26 19:47:51 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/meson.build: + * gst-libs/gst/winrt/gstwinrt.h: + * gst-libs/gst/winrt/gstwinrtdevicewatcher.cpp: + * gst-libs/gst/winrt/gstwinrtdevicewatcher.h: + * gst-libs/gst/winrt/meson.build: + * gst-libs/gst/winrt/winrt-prelude.h: + libs: Introduce GstWinRT library + Adding a helper library for various WinRT specific implementations. + Currently this library supports only DeviceWatcher abstraction object + which can be used for dynamic device add/remove detection. + See also + https://docs.microsoft.com/en-us/uwp/api/windows.devices.enumeration.devicewatcher?view=winrt-20348 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/947> + +2021-09-28 10:11:15 +1000 Brad Hards <bradh@frogmouth.net> + + * README: + * RELEASE: + doc: update IRC links to OFTC + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/945> + +2021-09-18 23:37:20 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.cpp: + * sys/d3d11/gstd3d11window.cpp: + d3d11videosink: Add support for crop meta + ... when upstream element is d3d11. + Note that, if upstream element is not d3d11, crop meta is almost + pointless since d3d11videosink will upload video frame to GPU memory + in any case. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/933> + +2021-09-18 23:37:59 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.cpp: + d3d11videosink: Perform propose_allocation() even when we have no configured window + In order to support d3d11 device update, d3d11videosink will configure + window on the first buffer. So, there might not be configured + window when propose_allocation() is required. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/933> + +2021-09-27 15:30:25 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/gs/gstgscommon.cpp: + * ext/gs/gstgscommon.h: + * ext/gs/gstgssink.cpp: + * ext/gs/gstgssrc.cpp: + gs: Add support for authenticating via Service Account Credentials + This allows authenticating directly with Server Account credentials + instead of having it configured on host system separately, and thus + allows using arbitrary accounts configured/selected at runtime. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/934> + +2021-09-27 14:56:21 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/gs/gstgs.cpp: + * ext/gs/gstgscommon.cpp: + * ext/gs/gstgscommon.h: + * ext/gs/gstgssink.cpp: + * ext/gs/gstgssink.h: + * ext/gs/gstgssrc.cpp: + * ext/gs/gstgssrc.h: + gs: Fix indentation and make it consistent + Apparently this partially used clang-format's default settings, so let's + use that for everything now. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/934> + +2021-09-26 01:07:02 +0100 Tim-Philipp Müller <tim@centricular.com> + + * meson.build: + Back to development + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/925> 2021-09-23 19:30:32 +0300 Sebastian Dröge <sebastian@centricular.com> @@ -131,7 +3949,486 @@ the pipeline configuration might become inconsistent, e.g. with regards to latency. 
See https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/737 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2556> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/900> + +2021-09-25 00:09:00 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + * gst-libs/gst/mpegts/gst-scte-section.h: + * gst-libs/gst/mpegts/gstmpegtssection.c: + * gst/mpegtsdemux/tsdemux.c: + mpegts: add missing Since comments after SCTE 35 work + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-06-08 23:25:58 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/mpegtsmux/gstbasetsmux.c: + basetsmux: use private copy of g_ptr_array_copy + This function is only present since glib 2.62 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-05-04 14:38:28 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/mpegtsmux/gstbasetsmux.c: + basetsmux: fix SCTE pts_adjustment with offsets + When there are elements between the demuxer and the muxer that + introduce an offset to the running time, or when offsets are + set on pads by the application, this shift must be taken into + account when calculating the final pts_adjustment. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-23 01:22:32 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + * gst-libs/gst/mpegts/gst-scte-section.h: + * gst/mpegtsmux/gstbasetsmux.c: + basetsmux: rework SCTE section handling to handle passthrough + mpegtsmux can receive SCTE sections from two origins: events + created by the application, and events forwarded downstream by + mpegtsdemux, containing sections that may not have been fully + parsed, and additional data to help tsmux translate times to + the correct domain, both for requesting keyframes and calculating + an accurate pts_adjustment. + The complete approach is documented further in a comment above + the relevant function. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-23 01:19:21 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gstmpegtssection.c: + mpegtspacketizer: handle "packetizing" already packetized data + .. when the section didn't have a packetizer. This can happen + as a result of building a new section from a copy of the original + data of another section. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-23 01:15:08 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/mpegtsdemux/tsdemux.c: + * gst/mpegtsdemux/tsdemux.h: + tsdemux: switch SCTE 35 sections handling to a passthrough model + Instead of modifying the splice times in the incoming sections + to running time and expecting eg mpegtsmux to convert those back + to its local PES time domain, which might be impossible when + those splice times are encrypted or the specification is extended, + transmit the needed information to the muxer as separate fields in + the event: + * A pts offset field can be used by the muxer in order to calculate + a final pts_adjustment + * A rtime_map can be used by the muxer to determine the correct + running times at which it should request keyframes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-14 00:27:16 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + scte-section: add support for packetizing splice_program_flag='0' + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-13 23:56:06 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + scte-section: add support for packetizing schedule events + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-13 23:42:54 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + scte-section: Add TODO for porting to gst_bit_* + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-13 23:38:16 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + * gst-libs/gst/mpegts/gst-scte-section.h: + scte-section: add support for parsing splice components + Part-of: 
<https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-13 20:51:09 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + * gst-libs/gst/mpegts/gst-scte-section.h: + scte-section: add support for SCHEDULE commands + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-13 20:47:36 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + scte-section: fix typo + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-13 20:44:54 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + * gst-libs/gst/mpegts/gst-scte-section.h: + gst-scte-section: implement partial parsing + In cases where either the SIT is encrypted, or an unknown + command is encountered, we still want to send an event downstream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 17:57:42 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + * gst-libs/gst/mpegts/gst-scte-section.h: + * tests/examples/mpegts/ts-scte-writer.c: + scte35-section: semantic API break + Document that the constructors for the splice events expect + a running time, as users of the API can not be expected to + predict the appropriate local PTS. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 17:37:28 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.c: + scte-section: add support for packetizing time_signal splices + time_signal splices are trivial, they only contain a splice_time() + and all the relevant information is carried in descriptors. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 00:58:33 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/gstbasetsmux.h: + basetsmux: extend SCTE 35 support + Makes it possible to support passing SCTE 35 cue points from + demuxer to muxer, while preserving correct timing. + This will also improve ex nihilo cue points injection, as splice + times and durations are now interpreted as running time values, + and may trigger key unit requests. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 00:36:43 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/mpegtsdemux/tsdemux.c: + * gst/mpegtsdemux/tsdemux.h: + tsdemux: Expose send-scte35-events property + When enabled, SCTE 35 sections (eg ad placement opportunities) + are forwarded as events downstream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 00:26:50 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/mpegtsdemux/mpegtsbase.c: + * gst/mpegtsdemux/mpegtsbase.h: + mpegtsbase: expose vmethod to let subclass handle sections + This can be used by tsdemux to handle and forward SCTE 35 + sections. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 00:23:09 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gstmpegtssection.c: + * gst-libs/gst/mpegts/gstmpegtssection.h: + mpegtssection: expose event constructor + This allows the demuxer to forward sections of interest downstream, + for example SCTE 35 splice information. These can then be reinjected + appropriately by a muxer for example. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +2021-04-06 00:21:58 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * gst-libs/gst/mpegts/gst-scte-section.h: + scte-section.h: fix type macros + Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/913> + +=== release 1.19.2 === + +2021-09-23 01:34:47 +0100 Tim-Philipp Müller <tim@centricular.com> + + * ChangeLog: + * NEWS: + * RELEASE: + * gst-plugins-bad.doap: + * meson.build: + Release 1.19.2 + +2021-09-22 14:17:35 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/audiobuffersplit/gstaudiobuffersplit.c: + audiobuffersplit: Remove unneeded buffer_clip wrapper + This is just a small cleanup noticed while reading. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2544> + +2020-07-10 19:31:13 +0530 Vivek R <123vivekr@gmail.com> + + * ext/opencv/gstcvtracker.cpp: + * ext/opencv/gstcvtracker.h: + opencv: cvtracker: add draw property + This property controls the drawing of rectangle around the tracked object. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2454> + +2020-05-24 23:37:25 +0530 Vivek R <123vivekr@gmail.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/opencv/gstcvtracker.cpp: + * ext/opencv/gstcvtracker.h: + * ext/opencv/gstopencv.cpp: + * ext/opencv/meson.build: + * tests/meson.build: + * tests/validate/meson.build: + * tests/validate/opencv/cvtracker.validatetest: + * tests/validate/opencv/cvtracker/flow-expectations/log-tracker-src-expected: + opencv: add cvtracker plugin + This adds an object tracker plugin. 
+ Tracker implementations from https://docs.opencv.org/3.4/d0/d0a/classcv_1_1Tracker.html + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2454> + +2020-05-25 10:35:30 +0530 Vivek R <123vivekr@gmail.com> + + * ext/opencv/meson.build: + opencv: patch to ensure headers are detected + This patch is used to ensure opencv headers are detected. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2454> + +2021-09-19 01:18:00 +0100 Tim-Philipp Müller <tim@centricular.com> + + * gst/mpegpsmux/mpegpsmux_aac.c: + * gst/mpegpsmux/mpegpsmux_aac.h: + * gst/mpegpsmux/mpegpsmux_h264.c: + * gst/mpegpsmux/mpegpsmux_h264.h: + * gst/mpegtsmux/gstatscmux.c: + * gst/mpegtsmux/gstatscmux.h: + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/gstbasetsmux.h: + * gst/mpegtsmux/gstbasetsmuxaac.c: + * gst/mpegtsmux/gstbasetsmuxaac.h: + * gst/mpegtsmux/gstbasetsmuxjpeg2000.c: + * gst/mpegtsmux/gstbasetsmuxjpeg2000.h: + * gst/mpegtsmux/gstbasetsmuxopus.c: + * gst/mpegtsmux/gstbasetsmuxopus.h: + * gst/mpegtsmux/gstbasetsmuxttxt.c: + * gst/mpegtsmux/gstbasetsmuxttxt.h: + * gst/mpegtsmux/gstmpegtsmux.c: + * gst/mpegtsmux/gstmpegtsmux.h: + * gst/mpegtsmux/gstmpegtsmuxplugin.c: + * gst/mpegtsmux/tsmux/tsmux.c: + * gst/mpegtsmux/tsmux/tsmux.h: + * gst/mpegtsmux/tsmux/tsmuxcommon.h: + * gst/mpegtsmux/tsmux/tsmuxstream.c: + * gst/mpegtsmux/tsmux/tsmuxstream.h: + mpegtsmux, mpegpsmux: remove GPL from choice of licenses and add SPDX license identifiers + Some people need to avoid inclusion of GPL code for their use cases and thus + get easily spooked by GPL license headers. This code is actually licensed + under different licenses, only one of which is GPL, and it's already possible + to just upgrade from LGPL to GPL anyway so having the GPL listed explicitly + as one of the choices doesn't really add anything. So remove GPL from the list + and also add SPDX license identifiers while we're at it. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2539> + +2021-08-24 03:54:27 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi/gstmmdeviceenumerator.cpp: + * sys/wasapi/gstmmdeviceenumerator.h: + * sys/wasapi/gstwasapidevice.c: + * sys/wasapi/gstwasapidevice.h: + * sys/wasapi/gstwasapisink.c: + * sys/wasapi/gstwasapisink.h: + * sys/wasapi/gstwasapisrc.c: + * sys/wasapi/gstwasapisrc.h: + * sys/wasapi/gstwasapiutil.c: + * sys/wasapi/gstwasapiutil.h: + * sys/wasapi/meson.build: + * tests/check/elements/wasapi.c: + wasapideviceprovider: Add support for dynamic device add/remove + Adding IMMDeviceEnumerator::RegisterEndpointNotificationCallback + in order to support device monitoring. + On OnDeviceAdded(), OnDeviceRemoved(), and OnDefaultDeviceChanged() + callback, wasapi device provider implementation will enumerate + devices again and will notify newly added and removed device + via GstDeviceProvider API. + As a bonus point, this IMMDeviceEnumerator abstraction object + will spawn a dedicated internal COM thread, so various COM thread + related issues of WASAPI plugin can be resolved by this commit. + Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1649 + Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1110 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2484> + +2021-02-01 16:21:59 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * tests/check/elements/openjpeg.c: + * tests/check/meson.build: + openjpeg: add unit test + Test various format supported with subframes. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2021-01-22 10:39:56 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/openjpeg/gstopenjpegdec.c: + * ext/openjpeg/gstopenjpegenc.c: + * gst/videoparsers/gstjpeg2000parse.c: + jpeg2000parse + openjpeg: Switch striped mode to its own caps + It's not compatible with any other element that use the non-striped + mode. In addition, in this mode, we require that every frame have the + same number of stripes or that the MARKER bit be present, which is + different from the other + formats too. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-12-23 11:03:34 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openjpeg/gstopenjpegdec.c: + openjpegdec: Fix crash with AYUV64 in subframe mode + Remove useless generic fill_frame methods to use + the packed one for AYUV and AYUV64. + Fix gst-launch-1.0 -v videotestsrc ! + video/x-raw,width=640,height=480,format=AYUV64 ! openjpegenc + num-stripes=8 ! openjpegdec max-threads=8 ! videoconvert ! + autovideosink sync=false + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-04-24 16:15:42 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openjpeg/gstopenjpeg.h: + * ext/openjpeg/gstopenjpegdec.c: + * ext/openjpeg/gstopenjpegdec.h: + openjpegdec: support for a multithreaded decoding. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-01-13 14:02:39 -0500 Aaron Boxer <aaron.boxer@collabora.com> + + * ext/openjpeg/gstopenjpegdec.c: + * ext/openjpeg/gstopenjpegdec.h: + openjpegdec: enable sub frame mode + Rebuild output frame from multiple stripes input. + Keep the first frame and fill it with the following stripes to finish + a complete frame only once. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-12-22 18:19:40 -0500 Olivier Crête <olivier.crete@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/mpegtsmux/gstmpegtsmux.c: + mpegtsmux: Require frame alignment for JPEG 2000 + We have yet to implement stripe alignment with the required descriptor. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-12-22 18:15:52 -0500 Olivier Crête <olivier.crete@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/openjpeg/gstopenjpegdec.c: + openjpegdec: Reject stripes for now + They're not implemented. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-12-22 18:20:35 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/openjpeg/gstopenjpegenc.c: + openjpegenc: Only allow stripe with image/x-jpc format + It's the only format that our MPEG-TS muxer allows and the carriage of + JPEG 2000 stripes is only defined for MPEG-TS as far as I know. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2020-04-21 20:56:03 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/openjpeg/gstopenjpeg.h: + * ext/openjpeg/gstopenjpegenc.c: + * ext/openjpeg/gstopenjpegenc.h: + openjpegenc: support for a multithreaded encoding. + This commit introduces a multithreaded encoder allowing + multiple stripes or subframes to be encoded in separate threads. + This feature aims to improve the overall latency of a codec + pipeline. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/979> + +2021-09-18 12:02:15 +0100 Philippe Normand <philn@igalia.com> + + * ext/wpe/WPEThreadedView.cpp: + wpe: context thread dispatch fixes + Use dedicated mutex/cond/flag for jobs being dispatched in the context thread. 
+ The previous code was signalling the thread startup condition, which is wrong.
+ When WPEContextThread::dispatch() is invoked it means the thread has already
+ correctly been started up.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2533>
+
+2021-09-18 12:01:39 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ * ext/wpe/WPEThreadedView.h:
+ wpe: Properly wait on context thread startup condition
+ Fixes #1661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2533>
+
+2021-09-20 09:41:32 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ doc: Update kmssink caps cache
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2530>
+
+2021-09-17 16:21:39 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/kms/gstkmsutils.c:
+ kmssink: Add RGB16/BGR16 support
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2530>
+
+2021-09-17 16:14:36 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/kms/gstkmsutils.c:
+ kmssink: Sort formats according to GST_VIDEO_FORMATS_ALL
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2530>
+
+2021-09-17 15:42:25 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/kms/gstkmsutils.c:
+ kmssink: Remove big endian format inversion
+ This was a misinterpretation of the DRM documentation. The formats are
+ fixed regardless of the CPU, but some formats are expressed in the opposite
+ order from GStreamer. The same change was done in waylandsink 2 years ago.
+ https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/commit/b393b650ab9bfb9654fc116163ab331907216d74
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2530>
+
+2021-09-17 15:41:41 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/kms/gstkmsutils.c:
+ kmssink: Add NV61 support
+ This is handled identically to NV16, so there is no reason not to include it.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2530>
+
+2021-09-17 15:39:54 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/kms/gstkmsutils.c:
+ kmssink: Add NV24 support
+ This was tested on the RK3566 platform, using the vendor DRM driver.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2530>
+
+2021-09-18 00:33:12 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/codecs/gstav1decoder.c:
+ * gst-libs/gst/codecs/gstav1decoder.h:
+ * sys/d3d11/gstd3d11av1dec.cpp:
+ * sys/va/gstvaav1dec.c:
+ codecs: av1decoder: Use GstFlowReturn everywhere
+ The same modification as that of the VP8 decoder
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2528>
+
+2021-09-18 00:09:24 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/codecs/gstvp9decoder.c:
+ * gst-libs/gst/codecs/gstvp9decoder.h:
+ * sys/d3d11/gstd3d11vp9dec.cpp:
+ * sys/nvcodec/gstnvvp9dec.c:
+ * sys/va/gstvavp9dec.c:
+ codecs: vp9decoder: Use GstFlowReturn everywhere
+ The same modification as that of the VP8 decoder
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2528>
+
+2021-09-17 23:23:06 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/codecs/gstvp8decoder.c:
+ * gst-libs/gst/codecs/gstvp8decoder.h:
+ * sys/d3d11/gstd3d11vp8dec.cpp:
+ * sys/nvcodec/gstnvvp8dec.c:
+ * sys/v4l2codecs/gstv4l2codecvp8dec.c:
+ * sys/va/gstvavp8dec.c:
+ codecs: vp8decoder: Use GstFlowReturn everywhere
+ A boolean return value is not sufficient for representing the reason
+ for an error in most cases. For instance, any errors around new_sequence()
+ would mean a negotiation error, not just *ERROR*.
+ And some subclasses will allocate buffer/memory/surface on new_picture(),
+ but it could fail because of an expected error, likely flushing
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2528>

 2021-08-16 18:15:42 +0300 Vivia Nikolaidou <vivia@ahiru.eu>

@@ -144,44 +4441,499 @@
 together, which is equivalent of capturing a DVB stream while switching
 channels. A set-top box would know that we switched the channels and
 reset the demuxer, but in practice this might not happen.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2555>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2468>

-2021-02-25 14:09:50 +0100 Stéphane Cerveau <scerveau@collabora.com>
+2021-09-20 11:35:51 +0300 Sebastian Dröge <sebastian@centricular.com>

- * ext/zxing/gstzxing.cpp:
- * ext/zxing/meson.build:
- * tests/check/elements/zxing.c:
- zxing: update to support version 1.1.1
- Support new API in 1.1.1
- Update the supported input video format.
- Update tests to use parse_launch
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2554>
+ * gst-libs/gst/play/gstplay-media-info.c:
+ * gst-libs/gst/play/gstplay.c:
+ * gst-libs/gst/player/gstplayer-media-info.c:
+ * gst-libs/gst/player/gstplayer.c:
+ player: Fix/add various annotations
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2538>

-2021-10-21 00:31:24 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com>
+2021-09-18 18:07:43 +0900 Seungha Yang <seungha@centricular.com>

- * gst/interlace/gstinterlace.c:
- interlace: Reset after changing state to READY
- Trying to reset before the pads have been deactivated races with the
- streaming thread. There was also a buggy buffer clear leaving a dangling
- `stored_frame` pointer around. Use `gst_interlace_reset` so this happens
- properly.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2553>
+ * sys/va/meson.build:
+ meson: va: Make AV1 support always optional
+ Otherwise meson configure with -Dva=enabled will fail
+ when the installed libva version is < 1.8
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2532>

-2021-09-09 00:12:51 +0100 Tim-Philipp Müller <tim@centricular.com>
+2021-09-18 11:03:16 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>

- * meson.build:
- Back to development
+ * sys/va/gstvacaps.c:
+ va: caps: Don't use image formats for decoded frames.
+ Initially we tried to use the internal color conversion used in i965
+ and Gallium drivers when decoding. But this approach has shown
+ limitations and problems.
+ This patch completely removes the possible color conversion at
+ decoding, since it showed problems with deinterlacing, for example:
+ gst-launch-1.0 filesrc location=interlaced.mpg2 ! parsebin ! vampeg2dec ! vadeinterlace ! xvimagesink
+ Allowing only the surface formats when decoding is more stable.
+ For color conversion it is better to do it explicitly with vapostproc.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2531>
+
+2021-04-27 11:59:15 +0200 Marijn Suijten <marijns95@gmail.com>
-=== release 1.18.5 ===
+ * gst-libs/gst/player/gstplayer-media-info.c:
+ * gst-libs/gst/player/gstplayer.c:
+ player: Add missing nullable annotations
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2197>

-2021-09-08 20:03:37 +0100 Tim-Philipp Müller <tim@centricular.com>
+2021-04-27 11:58:58 +0200 Marijn Suijten <marijns95@gmail.com>

- * ChangeLog:
- * NEWS:
- * RELEASE:
- * gst-plugins-bad.doap:
- * meson.build:
- Release 1.18.5
+ * gst-libs/gst/play/gstplay-media-info.c:
+ * gst-libs/gst/play/gstplay.c:
+ play: Add missing nullable annotations
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2197>
+
+2021-09-18 14:29:25 +0200 Fabian Orccon <cfoch.fabian@gmail.com>
+
+ * sys/shm/meson.build:
+ sys: shm: Define shm_enable and shm_deps before escaping meson subdir
+ Fixes meson configure in tests if the shm plugin is disabled
+ Fixes #1664
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2534>
+
+2021-09-18 02:27:51 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11videosink.cpp:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window.h:
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ d3d11videosink: Display title of content if possible
+ Update the title text of the window (currently it's always "Direct3D11 renderer")
+ when we are rendering on an internal HWND, not an external HWND.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2529>
+
+2021-09-18 01:32:11 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11videosink.cpp:
+ d3d11videosink: Remove unused enum value
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2529>
+
+2021-09-15 13:59:17 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com>
+
+ * sys/msdk/gstmsdkenc.c:
+ msdkenc: Pass color properties to MediaSDK for encoding
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2523>
+
+2021-09-15 16:32:02 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com>
+
+ * sys/msdk/gstmsdkh265enc.c:
+ * sys/msdk/gstmsdkh265enc.h:
+ msdkh265enc: Add profile main10 still picture for hevc
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2524>
+
+2021-09-16 17:12:58 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * ext/wayland/gstwaylandsink.c:
+ * ext/wayland/wlbuffer.c:
+ waylandsink: Fix double render check
+ Our code does not support rendering the same wl_buffer twice in a row, so it
+ tries to skip that case, but for this it relied on the GstBuffer pointer,
+ while the cache actually works at the GstMemory level now. To avoid this,
+ compare the GstWlBuffer instead.
+ This fixes a crash when used in zero-copy mode with the videorate element.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2526>
+
+2021-09-07 09:45:54 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * gst-libs/gst/codecs/gstmpeg2decoder.c:
+ codecs: mpeg2decoder: Use tsg framerate for latency.
+ Latency setting relies on src pad caps, but they aren't set when the
+ function is called, and latency is never updated.
+ In order to fix it, this patch uses the TSG framerate first, and if it's
+ not set yet, sinkpad caps are used to get the framerate.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2514>
+
+2021-09-16 00:59:37 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11av1dec.cpp:
+ * sys/d3d11/gstd3d11av1dec.h:
+ * sys/d3d11/gstd3d11decoder.cpp:
+ * sys/d3d11/gstd3d11decoder.h:
+ * sys/d3d11/gstd3d11h264dec.cpp:
+ * sys/d3d11/gstd3d11h264dec.h:
+ * sys/d3d11/gstd3d11h265dec.cpp:
+ * sys/d3d11/gstd3d11h265dec.h:
+ * sys/d3d11/gstd3d11mpeg2dec.cpp:
+ * sys/d3d11/gstd3d11mpeg2dec.h:
+ * sys/d3d11/gstd3d11vp8dec.cpp:
+ * sys/d3d11/gstd3d11vp8dec.h:
+ * sys/d3d11/gstd3d11vp9dec.cpp:
+ * sys/d3d11/gstd3d11vp9dec.h:
+ * sys/d3d11/plugin.cpp:
+ d3d11decoder: Refactor for more unified decoding flow
+ ... and various code cleanup.
+ * Move spread-out decoding API calls into one method
+ Previously, the decoding flow of most codecs was:
+ - Call DecoderBeginFrame() on start_picture()
+ - Call {Get,Release}DecoderBuffer() on decode_slice()
+ - Call SubmitDecoderBuffers() and DecoderEndFrame() on end_picture()
+ Such spread-out API calls make it hard to keep track of the status
+ of decoding. Now it will be done at once in a new method.
+ * Drop the code for non-zero wBadSliceChopping
+ When the bitstream buffer provided by the driver is not sufficient
+ to write compressed bitstream data, the host decoder needs to make use
+ of wBadSliceChopping so that the driver can understand there are
+ multiple bitstream buffers. But it's a bit unrealistic and
+ not tested. Since FFmpeg's DXVA implementation doesn't support it,
+ we might be able to ignore the case for now.
+ * Make code more portable
+ Consider common logic of GstCodecs -> DXVA translation for all D3D APIs
+ (i.e., D3D9, D3D11, and D3D12).
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2525>
+
+2021-09-15 23:41:39 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11av1dec.cpp:
+ * sys/d3d11/gstd3d11decoder.cpp:
+ * sys/d3d11/gstd3d11decoder.h:
+ * sys/d3d11/gstd3d11h264dec.cpp:
+ * sys/d3d11/gstd3d11h265dec.cpp:
+ * sys/d3d11/gstd3d11mpeg2dec.cpp:
+ * sys/d3d11/gstd3d11vp8dec.cpp:
+ * sys/d3d11/gstd3d11vp9dec.cpp:
+ d3d11decoder: Remove duplicated class_init and property related code
+ Move them into the decoder helper code to remove duplication
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2525>
+
+2021-09-11 00:43:26 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11av1dec.cpp:
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11compositorbin.cpp:
+ * sys/d3d11/gstd3d11converter.cpp:
+ * sys/d3d11/gstd3d11deinterlace.cpp:
+ * sys/d3d11/gstd3d11desktopdup.cpp:
+ * sys/d3d11/gstd3d11desktopdupsrc.cpp:
+ * sys/d3d11/gstd3d11h264dec.cpp:
+ * sys/d3d11/gstd3d11h265dec.cpp:
+ * sys/d3d11/gstd3d11mpeg2dec.cpp:
+ * sys/d3d11/gstd3d11overlaycompositor.cpp:
+ * sys/d3d11/gstd3d11pluginutils.cpp:
+ * sys/d3d11/gstd3d11shader.cpp:
+ * sys/d3d11/gstd3d11videoprocessor.cpp:
+ * sys/d3d11/gstd3d11vp8dec.cpp:
+ * sys/d3d11/gstd3d11vp9dec.cpp:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window_corewindow.cpp:
+ * sys/d3d11/gstd3d11window_dummy.cpp:
+ * sys/d3d11/gstd3d11window_swapchainpanel.cpp:
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ * sys/d3d11/meson.build:
+ * sys/d3d11/plugin.cpp:
+ d3d11: Get rid of "extern "C"" wrapping for GST_DEBUG_CATEGORY_EXTERN
+ Instead, change the file defining the debug category to cpp
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2525>
+
+2020-06-30 11:15:43 -0700 U. Artie Eoff <ullysses.a.eoff@intel.com>
+
+ * ext/closedcaption/meson.build:
+ * tests/check/meson.build:
+ tests: skip cc tests if plugin is disabled
+ Skip the closedcaption element tests if the
+ closedcaption option is disabled at compile
+ time (i.e. -Dclosedcaption=disabled).
+ v2: rename pangocairo_dep to avoid conflict
+ with later definition in ext/ttml/meson.build
+ as suggested by @tpm.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1388>
+
+2021-08-31 17:16:05 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Check bumping again after inserting current picture.
+ In order to get the lowest latency, we can add another bumping check after
+ inserting the current picture into the DPB immediately. That can avoid
+ waiting for another decoding cycle of the next frame and so the latency
+ is lower.
+ Fix: #1628
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2501>
+
+2021-08-31 17:37:11 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264picture.c:
+ codecs: h264: Add protection to to_insert picture in bump check.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2501>
+
+2021-08-31 16:39:06 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Improve the policy to infer max_num_reorder_frames.
+ The max_num_reorder_frames number can change the way we bump the
+ pictures in the DPB. The smaller it is, the lower latency we will
+ get. So it is important for live mode streams, but it is not given
+ in VUI parameters sometimes. We now improve the policy to infer it:
+ 1. Never guess it in "strict" compliance mode.
+ 2. For baseline and constrained baseline profiles, which do not have
+ B frames, set it to 0.
+ 3. For -intra only profiles, set it to 0.
+ 4. Otherwise, do not guess it.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2501>
+
+2021-09-14 20:57:30 -0700 U. Artie Eoff <ullysses.a.eoff@intel.com>
+
+ * ext/aes/meson.build:
+ * tests/check/meson.build:
+ tests: skip aes test if elements not built
+ In ext/aes/meson.build, the aes_dep will return
+ not-found if -Daes=disabled, regardless of whether
+ openssl is found or not. Thus, we don't need a
+ separate check for the option. This will also
+ ensure that aes_dep is always defined and we can
+ use it in the tests/check/meson.build unit.
+ Fixes #1660
+ v2: handle -Daes=disabled, too.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2522>
+
+2021-08-31 17:33:02 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/gstwpesrcbin.cpp:
+ wpe: Add support for web:// URIs
+ The CEF source already supports this. No good reason for wpesrc not to ;)
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2503>
+
+2021-07-23 23:38:22 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah264dec.c:
+ va: h264dec: Try to use ConstrainedBaseline or Main to decode Baseline.
+ In H.264, the Baseline profile is widely misused. A lot of streams declare
+ that they are Baseline, but in fact they just conform to ConstrainedBaseline.
+ The features such as FMO and ASO are not used at all.
+ If the decoder does not strictly conform to the spec, we can just use the Baseline
+ or Main profile to decode them and avoid failures on lots of streams.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2428>
+
+2021-07-28 23:19:15 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Improve the fast bump for the live mode.
+ We control the policy of fast bump by the profile and the compliance
+ property. For baseline and constrained baseline profiles, we can use
+ a more radical bump policy. The user can also change the bump policy by
+ setting the compliance property at run time.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2432>
+
+2021-07-28 22:48:21 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264picture.c:
+ * gst-libs/gst/codecs/gsth264picture.h:
+ codecs: h264: Change the low_latency to an enum for dpb_needs_bump().
+ The bool parameter of low_latency is not enough. We have multiple policies for
+ low latency bumping, from the safest to something radical. So we need an enum
+ to represent the proper latency requirement.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2432>
+
+2021-07-26 16:09:19 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ * gst-libs/gst/codecs/gsth264decoder.h:
+ codecs: h264dec: Add a compliance property to control behavior.
+ Some features such as the low-latency DPB bumping and mapping the
+ baseline profile as the constrained-baseline profile do not conform
+ to the official H264 spec. But in practice, they are very useful and
+ are widely needed. We add this compliance property to control the
+ behavior of the decoder, making it fit more requirements.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2432>
+
+2021-09-12 12:23:36 +0100 Philippe Normand <philn@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ docs: Update cache
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2521>
+
+2021-09-07 10:55:10 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * sys/va/gstvafilter.c:
+ va: Update vapostproc documentation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-09-07 10:16:05 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvavpp.c:
+ va: Update todo lists, removing deinterlacing.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-23 11:24:40 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * sys/va/gstvadeinterlace.c:
+ * sys/va/gstvadeinterlace.h:
+ * sys/va/meson.build:
+ * sys/va/plugin.c:
+ Add vadeinterlace element.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-24 13:53:12 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvafilter.h:
+ va: filter: Add past and future frames in GstVaSample.
+ And add them in the pipeline structure if they are provided.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-24 13:33:29 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvafilter.h:
+ va: filter: Add gst_va_filter_add_deinterlace_buffer()
+ This function decorates gst_va_filter_add_filter_buffer() to get the
+ number of past and future frames to hold, given the method.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2020-12-21 18:17:24 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvafilter.h:
+ va: filter: Add deinterlacing method parameter.
+ To expose that GObject parameter, a new helper function is added:
+ gst_va_filter_install_deinterlace_properties()
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-23 16:29:36 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ va: filter: Protect filters array from overwrite.
+ It's possible to modify the filters array from another GStreamer
+ thread, and the post-processing operation is not atomic, so the filter
+ array is reffed while the VA pipeline is processed.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-23 15:24:55 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ va: filter: Add helper function to query pipeline caps.
+ This function is going to be shared for future deinterlace filter
+ processing.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-23 15:16:16 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ va: filter: Shuffle _destroy_filters_unlocked().
+ In order to put it near its caller.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-10 17:55:43 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ * sys/va/gstvavpp.h:
+ * sys/va/plugin.c:
+ vapostproc: Move up color balance detection to plugin.
+ In order to install the color balance interface, a GstVaFilter is
+ instantiated and queried to know if it supports the color balance
+ filter. It was done just after the GObject was registered. Now, it's
+ done before.
+ The reason for this change is that the deinterlace element has to be
+ registered only if the deinterlace filter is available, using only one
+ instance of GstVaFilter.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-09-07 11:35:09 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * sys/va/gstvabasetransform.c:
+ va: basetransform: Update documentation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-09-09 18:26:56 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvabasetransform.h:
+ va: basetransform: Add autoptr clean up function.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-23 18:44:30 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvabasetransform.c:
+ va: basetransform: Use copy_metadata() at buffer import.
+ Instead of using only gst_buffer_copy_into(), use the copy_metadata()
+ vmethod to copy what's needed.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-08-23 10:40:32 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ vapostproc: don't chain up transform_meta()
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2495>
+
+2021-07-29 12:20:30 -0300 Daniel Almeida <daniel.almeida@collabora.com>
+
+ * gst-libs/gst/codecs/gstvp9statefulparser.c:
+ codecs: gstvp9statefulparser: feature_data should be 0 if feature_enable is 0
+ The spec says in 6.2.11 that feature_data[i][j] should be zero if
+ feature_enabled[i][j] is zero. Instead we retained the old value in the parser.
+ Fix it.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2449>
+
+2021-09-08 05:28:22 +0200 Marek Vasut <marex@denx.de>
+
+ * gst-libs/gst/codecparsers/gsth264parser.c:
+ gsth264parser: Fix handling of NALs with emulation byte set
+ In case a set of NALs with emulation_prevention_three_byte is decoded using
+ a hardware decoder like the Hantro G1, a wrong struct v4l2_ctrl_h264_decode_params
+ .dec_ref_pic_marking_bit_size is passed into the kernel, which results in
+ decoding artifacts. Subtract the number of emulation three bytes from the
+ .dec_ref_pic_m->bit_size to get the correct bit size and avoid having these
+ artifacts. Apply the exact same fix to slice->pic_order_cnt_bit_size as well.
+ The following NALs (7, 8, 6, 5) decode with artifacts,
+ .dec_ref_pic_marking_bit_size is set to 10 without this patch.
+ 00000000 00 00 00 01 27 4d 00 20 89 8b 60 3c 04 bf 2e 02 |....'M. ..`<....|
+ 00000010 d4 18 04 18 c0 c0 01 77 00 00 5d c1 7b df 05 00 |.......w..].{...|
+ 00000020 00 00 01 28 ee 1f 20 00 00 01 06 05 10 b9 ed b9 |...(.. .........|
+ 00000030 30 5d 21 4b 71 83 71 2c 10 a3 14 bb 29 80 00 00 |0]!Kq.q,....)...|
+ 00000040 01 25 b8 00 05 00 00 03 03 7f fa 78 1e e7 fd fe |.%.........x....|
+ ^^^^^^^^^^^^--- emulation 3 byte
+ 00000050 b4 62 7a 31 ff 7d 81 fd 26 d8 62 b6 d6 25 46 ae |.bz1.}..&.b..%F.|
+ The following NALs (7, 8, 6, 5) decode fine,
+ .dec_ref_pic_marking_bit_size is set to 2 without this patch.
+ 00000000 00 00 00 01 27 4d 00 20 89 8b 60 3c 04 bf 2e 02 |....'M. ..`<....|
+ 00000010 d4 18 04 18 c0 c0 01 77 00 00 5d c1 7b df 05 00 |.......w..].{...|
+ 00000020 00 00 01 28 ee 1f 20 00 00 01 06 05 10 b9 ed b9 |...(.. .........|
+ 00000030 30 5d 21 4b 71 83 71 2c 10 a3 14 bb 29 80 00 00 |0]!Kq.q,....)...|
+ 00000040 01 25 b8 00 04 c0 00 03 7f fa 78 1e e7 fd fe b4 |.%........x.....|
+ 00000050 62 7a 31 ff 7d 81 fd 26 d8 62 b6 d6 25 46 ae ce |bz1.}..&.b..%F..|
+ Fixes: d0d65fa875 ("codecparsers: h264: record dec_ref_pic_marking() size")
+ Fixes: 0cc7d6f093 ("codecparsers: h264: record pic_order_cnt elements size")
+ Signed-off-by: Marek Vasut <marex@denx.de>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2517>
+
+2021-09-03 14:57:09 -0400 Aaron Boxer <aaron.boxer@collabora.com>
+
+ * gst-libs/gst/codecparsers/gsth264parser.c:
+ gsth264parser: reject memory management control op greater than 6
+ This prevents an assertion from being thrown in
+ gst_h264_dpb_perform_memory_management_control_operation
+ if a corrupt NAL has a control op greater than 6
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2508>
+
+2021-08-24 09:59:59 +0000 Ung, Teng En <teng.en.ung@intel.com>
+
+ * sys/msdk/gstmsdk.c:
+ * sys/msdk/gstmsdkav1dec.c:
+ * sys/msdk/gstmsdkdec.c:
+ * sys/msdk/gstmsdkenc.c:
+ * sys/msdk/gstmsdkh264dec.c:
+ * sys/msdk/gstmsdkh264enc.c:
+ * sys/msdk/gstmsdkh265dec.c:
+ * sys/msdk/gstmsdkh265enc.c:
+ * sys/msdk/gstmsdkmjpegdec.c:
+ * sys/msdk/gstmsdkmjpegenc.c:
+ * sys/msdk/gstmsdkmpeg2dec.c:
+ * sys/msdk/gstmsdkmpeg2enc.c:
+ * sys/msdk/gstmsdkvc1dec.c:
+ * sys/msdk/gstmsdkvp8dec.c:
+ * sys/msdk/gstmsdkvp9dec.c:
+ * sys/msdk/gstmsdkvp9enc.c:
+ * sys/msdk/gstmsdkvpp.c:
+ * sys/msdk/msdk.c:
+ * sys/msdk/msdk.h:
+ msdk: Adjust the plugin and factories description based on MFX_VERSION.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2485>

 2021-09-08 17:32:30 +0200 Mathieu Duponchelle <mathieu@centricular.com>

@@ -190,22 +4942,6 @@
 hotdoc doesn't know about that symbol
 Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2518>

-2021-03-31 18:07:40 -0300 Thibault Saunier <tsaunier@igalia.com>
-
- * gst/rtp/gstrtpsrc.c:
- rtpsrc: Fix wrong/NULL URI handling
- We can reset the URI to NULL and this fix a deadlock in that case or
- when the URI was invalid.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2512>
-
-2021-06-03 11:24:53 +0200 Edward Hervey <edward@centricular.com>
-
- * gst/mpegtsdemux/tsdemux.c:
- tsdemux: Clear all streams when rewinding
- This avoids sending out partial invalid data downstream which could cause
- decoders (ex: `dvdlpmdec`) to error out.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2511>
-
 2021-08-30 23:26:39 +1000 Jan Schmidt <jan@centricular.com>

 * ext/mpeg2enc/gstmpeg2enc.cc:

@@ -217,22 +4953,206 @@
 consume all RAM. I don't really see any reason to have more than 1
 outstanding encoded frame, so remove the queue and limit things to 1
 pending frame.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2510>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2499>
+
+2021-09-01 17:35:45 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/wpe-extension/gstwpeaudiosink.c:
+ wpe: Fix race condition on teardown
+ There was a race when going to PAUSED while pushing a buffer to the
+ pipeline process (where we weren't even cancelling anything).
+ This rework bases all the cancellation around the GCancellable
+ "cancelled" signal, trying to ensure that the streaming thread will not
+ block once a cancel operation happens.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2504>
+
+2021-09-01 17:26:04 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/gstwpesrcbin.cpp:
+ wpe: Use the new element.get_current_running_time API
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2504>
+
+2021-09-01 17:24:45 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/gstwpesrcbin.cpp:
+ wpe: Mark first buffer as starting at 0
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2504>

 2021-09-02 22:06:52 +0900 Seungha Yang <seungha@centricular.com>

 * gst/videoparsers/gstvideoparseutils.c:
 videoparseutils: Fix for wrong CEA708 minimum size check
 The minimum possible size of valid CEA708 data is 3 bytes, not 7 bytes
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2507>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2505>

-2021-08-05 23:11:26 +0200 Mathieu Duponchelle <mathieu@centricular.com>
+2021-08-29 11:04:17 +0100 Philippe Normand <philn@igalia.com>

- * ext/mpeg2enc/gstmpeg2encpicturereader.cc:
- mpeg2enc: fix interlace-mode detection
- Previously, the code was always assuming progressive input,
- fix this by looking at the caps.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2500>
+ * ext/wpe/gstwpevideosrc.cpp:
+ wpevideosrc: Uniformise default value for draw-background property
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2498>
+
+2021-08-29 10:30:53 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/gstwpevideosrc.cpp:
+ wpevideosrc: Implement basic heuristic for raw caps negotiation
+ Before this patch, raw caps could already be negotiated with a capsfilter, but in
+ cases where wpesrc is being auto-plugged this approach can't be used.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2498>
+
+2021-08-29 10:28:57 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/gstwpevideosrc.cpp:
+ wpevideosrc: Ensure debug category is set
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2498>
+
+2021-07-15 21:10:14 +0200 Mathieu Duponchelle <mathieu@centricular.com>
+
+ * ext/closedcaption/gstcccombiner.c:
+ cccombiner: fix scheduling with interlaced video buffers
+ The initial code was written with the misunderstanding that
+ IS_TOP_FIELD indicated that an interlaced buffer contained
+ a top field, not that it contained only a top field
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2413>
+
+2021-08-27 15:41:32 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/kms/gstkmssink.c:
+ Revert "kmssink: Fix fallback path for driver not able to scale scenario"
+ This reverts commit d2a7b763bef3ca51f0c84cdac52eeed85b0db8fb.
+ After this change, non-scaled renders were not centred as expected.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2496>
+
+2021-08-20 13:28:51 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com>
+
+ * gst-libs/gst/codecs/gstav1decoder.c:
+ codecs: av1dec: Fix to output frame with highest spatial layer
+ During the output process, if there are multiple frames in a TU (i.e. multi-spatial
+ layers case), only one frame with the highest spatial layer id should be selected
+ according to the AV1 spec. The highest spatial layer id is obtained from the idc
+ value of the operating point.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2475>
+
+2021-08-24 14:33:42 +0100 Alex Ashley <bugzilla@ashley-family.net>
+
+ * ext/dash/gstxmlhelper.c:
+ * tests/check/elements/dash_mpd.c:
+ dashdemux: copy ContentProtection element including xml namespaces
+ Commit bc09d8cc changed gstmpdparser to put the entire
+ <ContentProtection> element in the "value" field, so that DRMs
+ other than PlayReady could make use of the data inside this
+ element.
+ However, the data in the "value" field does not include any
+ XML namespace declarations that are used within the element. This
+ causes problems for a namespace aware XML parser that wants to
+ make use of this data.
+ This commit modifies the way the XML is converted to a string
+ so that XML namespaces are preserved in the output.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2487>
+
+2021-08-26 21:26:01 +0300 Vivia Nikolaidou <vivia@ahiru.eu>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/debugutils/gsterrorignore.c:
+ * gst/debugutils/gsterrorignore.h:
+ errorignore: Add ignore-eos mode
+ It's otherwise very complicated to ignore GST_FLOW_EOS without a
+ ghostpad's chain function to rewrite.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2492>
+
+2021-08-27 17:25:04 +1000 Brad Hards <bradh@frogmouth.net>
+
+ * gst-libs/gst/codecparsers/gsth264parser.c:
+ gsth264parser: fix typo in debug message
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2493>
+
+2021-08-26 04:12:07 +0200 Mathieu Duponchelle <mathieu@centricular.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/timecode/gsttimecodestamper.c:
+ * gst/timecode/gsttimecodestamper.h:
+ timecodestamper: add support for closedcaption input
+ Some closedcaption elements like sccenc expect input buffers
+ to have timecode metas.
The original use case is to serialize
+ closed captions extracted from a video stream; in that case,
+ ccextractor copies the video time code metas to the closed
+ caption buffers, but no such mechanism exists when creating
+ a CC stream ex nihilo.
+ Remedy that by having timecodestamper accept closedcaption
+ input caps, as long as they have a framerate.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2490>
+
+2021-07-06 12:31:42 -0400 Aaron Boxer <aaron.boxer@collabora.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/aes/gstaes.c:
+ * ext/aes/gstaesdec.c:
+ * ext/aes/gstaesdec.h:
+ * ext/aes/gstaesenc.c:
+ * ext/aes/gstaesenc.h:
+ * ext/aes/gstaeshelper.c:
+ * ext/aes/gstaeshelper.h:
+ * ext/aes/meson.build:
+ * ext/meson.build:
+ * meson_options.txt:
+ * tests/check/elements/aesdec.c:
+ * tests/check/elements/aesenc.c:
+ * tests/check/meson.build:
+ aes: add aes encryption and decryption elements
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1505>
+
+2021-05-10 12:02:20 +0200 Johan Sternerup <johast@axis.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtcbin: Return typed "sctp-transport"
+ With GstWebRTCSCTPTransport type exposed we can now define
+ "sctp-transport" property as being of this type.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2214>
+
+2021-05-07 08:12:25 +0200 Johan Sternerup <johast@axis.com>
+
+ * ext/webrtc/gstwebrtcbin.c:
+ * ext/webrtc/gstwebrtcbin.h:
+ * ext/webrtc/meson.build:
+ * ext/webrtc/webrtcdatachannel.c:
+ * ext/webrtc/webrtcdatachannel.h:
+ * ext/webrtc/webrtcsctptransport.c:
+ * ext/webrtc/webrtcsctptransport.h:
+ * gst-libs/gst/webrtc/meson.build:
+ * gst-libs/gst/webrtc/sctptransport.c:
+ * gst-libs/gst/webrtc/sctptransport.h:
+ * gst-libs/gst/webrtc/webrtc-priv.h:
+ * gst-libs/gst/webrtc/webrtc_fwd.h:
+ webrtc: Split sctptransport into lib and implementation parts
+ GstWebRTCSCTPTransport is now made into an abstract base class
+ that only contains property specifications matching the
+ RTCSctpTransport interface of the W3C WebRTC specification, see
+ https://w3c.github.io/webrtc-pc/#rtcsctptransport-interface. This
+ class is put into the WebRTC library to expose it for applications and
+ to allow for generation of bindings for non-dynamic languages using
+ GObject introspection.
+ The actual implementation is moved to the subclass WebRTCSCTPTransport
+ located in the WebRTC plugin.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2214>
+
+2021-05-03 10:45:42 +0200 Johan Sternerup <johast@axis.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtcbin: Expose SCTP Transport
+ Being able to access the SCTP Transport object from the application
+ means the application can access the associated DTLS Transport object
+ and its ICE Transport object. This means we can observe the ICE state
+ also for a data-channel-only session. The collated
+ ice-connection-state on webrtcbin only includes the ICE Transport
+ objects that reside on the RTP transceivers (which is exactly how it
+ is specified in
+ https://w3c.github.io/webrtc-pc/#rtciceconnectionstate-enum).
+ For the consent freshness functionality (RFC 7675) to work the ICE
+ state must be accessible and consequently the SCTP transport must be
+ accessible for enabling consent freshness checking for a
+ data-channel-only session.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2214>

2021-04-20 20:04:33 +0100 Tim-Philipp Müller <tim@centricular.com>

@@ -247,13 +5167,13 @@
 in our case. Just don't tell the base class about our headers,
 they will be sent at the beginning of each IDR frame anyway.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2478>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2178>

2021-04-20 19:43:53 +0100 Tim-Philipp Müller <tim@centricular.com>

 * ext/openh264/gstopenh264enc.cpp:
- openh264enc: fix header buffer leak
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2478>
+ openh264enc: fix caps and header buffer leak
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2178>

2021-04-20 19:11:12 +0100 Tim-Philipp Müller <tim@centricular.com>

@@ -265,13 +5185,99 @@
 NAL is at in the bitstream buffer (psBsBuf + nal_offset).
 This was broken in commit 17113695.
 Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1576
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2478>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2178>
+
+2021-08-22 00:33:58 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11bufferpool.cpp:
+ * gst-libs/gst/d3d11/gstd3d11bufferpool.h:
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11convert.cpp:
+ * sys/d3d11/gstd3d11decoder.cpp:
+ * sys/d3d11/gstd3d11deinterlace.cpp:
+ * sys/d3d11/gstd3d11desktopdupsrc.cpp:
+ * sys/d3d11/gstd3d11download.cpp:
+ * sys/d3d11/gstd3d11upload.cpp:
+ * sys/d3d11/gstd3d11videosink.cpp:
+ * sys/mediafoundation/gstmfvideoenc.cpp:
+ d3d11bufferpool: Hide buffer_size field from header
+ User can get the required buffer size by using buffer pool config.
+ Since d3d11 implementation is a candidate for public library in the future,
+ we need to hide everything from header as much as possible.
+ Note that the total size of allocated d3d11 texture memory by GPU is not
+ a controllable factor. It depends on hardware specific alignment/padding
+ requirement. So, GstD3D11 implementation updates actual buffer size
+ by allocating D3D11 texture, since there's no way to get CPU accessible
+ memory size without allocating real D3D11 texture.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2482>
+
+2021-08-21 02:20:11 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/nvcodec/cuda-converter.c:
+ * sys/nvcodec/gstcudaconvert.c:
+ * sys/nvcodec/gstnvbaseenc.c:
+ * sys/nvcodec/gstnvdec.c:
+ * sys/nvcodec/gstnvdecoder.c:
+ * sys/nvcodec/gstnvenc.c:
+ * sys/nvcodec/gstnvenc.h:
+ nvcodec: Fix various typos
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2481>
+
+2021-08-21 02:10:37 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/nvcodec/cuda-converter.h:
+ * sys/nvcodec/gstcudacontext.h:
+ * sys/nvcodec/gstcudafilter.h:
+ * sys/nvcodec/gstcudaloader.h:
+ * sys/nvcodec/gstcudanvrtc.h:
+ * sys/nvcodec/gstcudautils.h:
+ * sys/nvcodec/gstcuvidloader.h:
+ * sys/nvcodec/gstnvbaseenc.h:
+ * sys/nvcodec/gstnvenc.h:
+ * sys/nvcodec/gstnvh264dec.h:
+ * sys/nvcodec/gstnvh264enc.h:
+ * sys/nvcodec/gstnvh265dec.h:
+ * sys/nvcodec/gstnvh265enc.h:
+ * sys/nvcodec/gstnvrtcloader.h:
+ nvcodec: Get rid of G_GNUC_INTERNAL
+ Our default symbol visibility is hidden, so G_GNUC_INTERNAL
+ is pointless
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2481>
+
+2021-08-19 16:45:18 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/v4l2codecs/gstv4l2codech264dec.c:
+ v4l2codecs: h264: Fix split field handling
+ Split fields end up on multiple pictures and require accessing the
+ other_field to complete the information (POC).
+ This also cleans up the DPB from non-references (was not useful) and
+ properly merges fields instead of keeping them duplicated. This fixes most
+ of the interlaced decoding seen in fluster.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2474>
+
+2021-08-19 11:40:22 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/v4l2codecs/gstv4l2codech264dec.c:
+ v4l2codec: h264: Implement support for split fields
+ When a frame is composed of two fields, the base class now splits the
+ picture in two. In order to support this, we need to ensure that the picture
+ buffer is held in the VB2 queue so that the second field gets decoded into
+ it. This also implements the new_field_picture() virtual and sets the
+ previous request on the new picture.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2474>
+
+2021-08-20 11:23:57 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/v4l2codecs/gstv4l2codech264dec.c:
+ v4l2codecs: h264: Fix filling weight factors
+ This was a typo, the wrong index was used to set l1 weight (b-frames).
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2480>

2021-08-20 14:34:53 +0200 Edward Hervey <edward@centricular.com>

 * ext/dash/gstdashdemux.c:
 dashdemux: Properly initialize GError
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2479>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2476>

2021-08-19 21:56:05 +0900 Seungha Yang <seungha@centricular.com>

@@ -285,7 +5291,7 @@
 See also
 https://docs.microsoft.com/en-us/windows/win32/medfound/image-stride
 Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1646
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2477>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2473>

2021-08-18 11:14:37 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>

@@ -294,7 +5300,292 @@
 The emulation bytes need to be removed as bytes, not bits.
 This fixes decoding issues with files that have emulation bytes with
 the Cedrus driver.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2472>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2471>
+
+2021-08-12 14:08:19 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * tests/examples/va/multiple-vpp.c:
+ example: va: Add skin tone enhancement.
+ If a camera is used as input stream and the skin tone parameter is available
+ in vapostproc, and no random changes are enabled, the skin tone will
+ be enabled.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2470>
+
+2021-08-17 14:04:41 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ vapostproc: Use vapostproc as debug category name.
+ Otherwise it is difficult to remember the different name.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2470>
+
+2021-08-12 13:54:34 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * tests/examples/va/multiple-vpp.c:
+ examples: va: Add random cropping.
+ And remove unused caps filter.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2443>
+
+2021-07-28 13:04:50 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ vapostproc: Disable cropping in pass-through mode.
+ Originally, if a buffer arrives with crop meta but downstream doesn't
+ handle crop allocation meta, vapostproc tried to reconfigure itself to
+ non pass-through mode automatically. Sadly, this behavior was based on
+ the wrong assumption that the propose_allocation() vmethod would bring
+ the downstream allocation query, but it does not.
+ Now, if vapostproc is in pass-through mode, the cropping is passed to
+ downstream. Pass-through mode can be disabled via a parameter.
+ Finally, if pass-through mode isn't enabled, it's assumed the buffer
+ is going to be processed and, if cropping, downstream already
+ negotiated the cropped frame size, thus it's required to do the
+ cropping inside vapostproc to avoid artifacts because of the size of
+ downstream allocated buffers.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2443>
+
+2021-08-17 14:54:21 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ vapostproc: Update filters update_properties().
+ Right after instantiating the VA filter and changing the element
+ state, rebuild the image filters.
+ This will fix a regression from f20b3b815, where properties in a
+ gst-launch pipeline are not applied.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2443>
+
+2021-08-18 09:13:45 +0300 Sebastian Dröge <sebastian@centricular.com>
+
+ * sys/decklink/gstdecklinkvideosrc.cpp:
+ decklinkvideosrc: Fix PAL/NTSC widescreen autodetection when switching back to non-widescreen
+ Previously it would only switch to widescreen but never back.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2469>
+
+2021-07-20 18:15:11 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com>
+
+ * sys/msdk/gstmsdkvpp.c:
+ msdkvpp: Fix frc from lower fps to higher fps
+ There are three framerate conversion algorithms described in
+ <https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md>,
+ interpolation is not implemented so far and thus the distributed timestamp algorithm
+ is considered to be more practical which evenly distributes output timestamps
+ according to output framerate. In this case, newly generated frames are inserted
+ between current frame and previous one, timestamp is calculated by msdk API.
+ This implementation first pushes newly generated buffers (outbuf_new) forward and
+ the current buffer (outbuf) is handled at the last round by the base transform automatically.
+ A flag "create_new_surface" is used to indicate if new surfaces have been generated
+ and then push new outbuf forward accordingly.
+ Considering the upstream element may not be the msdk element, it is necessary to
+ always set the input surface timestamp as same as input buffer's timestamp and
+ convert it to msdk timestamp.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2418>
+
+2021-05-06 22:22:12 +1000 Matthew Waters <matthew@centricular.com>
+
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtc: improve matching on the correct jitterbuffer
+ The mapping between an RTP session and the SDP m= line is not always the
+ same, especially when BUNDLEing is used.
+ This causes a failure in a specific case where, when bundling,
+ if mline 0 is a data channel, and mline 1 an audio/video section,
+ then retrieving the transceiver at mline 0 (rtp session used) will fail
+ and cause an assertion.
+ This fix is actually potentially a regression for cases where the remote
+ part does not provide the a=ssrc: media level SDP attributes as is now
+ becoming common, especially when simulcast is involved.
+ The correct fix actually requires reading out header extensions as used
+ with bundle for signalling, in the actual data, what media and therefore
+ transceiver is being used.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2467>
+
+2021-08-16 13:45:39 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/va/gstvadisplay.h:
+ * sys/va/gstvaav1dec.c:
+ * sys/va/gstvabasedec.c:
+ * sys/va/gstvacaps.c:
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvah265dec.c:
+ * sys/va/gstvampeg2dec.c:
+ * sys/va/gstvavp8dec.c:
+ * sys/va/gstvavp9dec.c:
+ * sys/va/gstvavpp.c:
+ va: Use GST_CAPS_FEATURE_MEMORY_VA to replace "memory:VAMemory".
+ "memory:VAMemory" is a commonly used string which notates our VA-kind
+ memory type. We now use a definition in the va lib to replace the plain
+ string usage.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2466>
+
+2021-08-16 13:32:51 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvacaps.c:
+ * sys/va/gstvafilter.c:
+ va: Use MEMORY_DMABUF definition to replace "memory:DMABuf" strings.
+ GST_CAPS_FEATURE_MEMORY_DMABUF is already a common definition, we should
+ just use it rather than use the "memory:DMABuf" strings by ourselves.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2466>
+
+2021-08-09 19:02:56 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/fdkaac/gstfdkaacenc.c:
+ fdkaacdec: Add Converter class to hint gst-validate
+ fdkaacdec has minimal conversion capability; adding the Converter class allows
+ gst-validate to behave properly and not emit an error when it notices that the
+ number of channels or rate mismatch in and out.
+ Same logic as with opusdec, see: https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1142>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2462>
+
+2021-06-09 23:29:43 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/wasapi2/plugin.c:
+ wasapi2: Increase rank to primary + 1
+ wasapi2 plugin should be preferred over the old wasapi plugin if available because:
+ * wasapi2 supports automatic stream routing, and it's a highly recommended
+ feature for applications by MS. See also
+ https://docs.microsoft.com/en-us/windows/win32/coreaudio/automatic-stream-routing
+ * This implementation should be free of the various COM threading issues by design
+ since wasapi2 plugin spawns a new dedicated COM thread and all COM objects'
+ life-cycles are managed correctly.
+ There are unsolved COM issues around the old wasapi plugin. Such issues are
+ very tricky to solve unless the old wasapi plugin's threading model
+ is re-designed.
+ Note that, in case of UWP, wasapi2 plugin's rank is primary + 1 already
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2314>
+
+2021-08-12 20:39:24 +0200 Mathieu Duponchelle <mathieu@centricular.com>
+
+ * ext/closedcaption/gstccconverter.c:
+ ccconverter: fix overflow when not doing framerate conversion
+ When converting from one framerate to another, counters are
+ reset periodically; however, when not converting they never are,
+ and can_generate_output ends up making overflow-prone calculations
+ with large values for input_frames and output_frames.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2465>
+
+2021-08-12 15:26:27 +0300 Sebastian Dröge <sebastian@centricular.com>
+
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtcbin: Don't assume that non-audio medias are video medias when creating transceivers
+ And print the unknown media kind in the logs.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2464>
+
+2021-08-12 15:25:50 +0300 Sebastian Dröge <sebastian@centricular.com>
+
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtcbin: Use the correct media for deciding the media kind when creating the transceiver from the SDP
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2464>
+
+2021-07-29 21:30:32 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Output the picture directly if already a frame.
+ We forgot one case: the frame and field pictures may be mixed
+ together. For this case, the dpb is interlaced while the last picture
+ may be a complete frame. We do not need to cache that complete picture
+ and should output it directly.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2448>
+
+2021-08-06 17:11:55 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvacaps.c:
+ va: caps: Make the template raw video caps classified by features.
+ The current output of raw video caps is not good. When we have multiple
+ profiles and each profile supports different formats, the output of
+ gst-inspect may look like:
+ SRC template: 'src'
+ Availability: Always
+ Capabilities:
+ video/x-raw(memory:VAMemory)
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: NV12
+ video/x-raw
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: NV12
+ video/x-raw(memory:VAMemory)
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: P010_10LE
+ video/x-raw
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: P010_10LE
+ video/x-raw(memory:VAMemory)
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: P012_LE
+ video/x-raw
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: P012_LE
+ The gst_caps_simplify() does not classify the caps by the same features, but
+ just leaves them interleaved.
We need to handle them manually here, the
+ result should be:
+ SRC template: 'src'
+ Availability: Always
+ Capabilities:
+ video/x-raw
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: { (string)P010_10LE, (string)P012_LE, (string)NV12 }
+ video/x-raw(memory:VAMemory)
+ width: [ 1, 16384 ]
+ height: [ 1, 16384 ]
+ format: { (string)P010_10LE, (string)P012_LE, (string)NV12 }
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2456>
+
+2021-07-27 13:22:02 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ vapostproc: Inherit from GstVaBaseTransform.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2442>
+
+2021-02-17 17:15:22 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvabasetransform.c:
+ * sys/va/gstvabasetransform.h:
+ * sys/va/meson.build:
+ va: Add base transform class.
+ This base transform class is a derivable class for VA-based filters,
+ for example vapostproc right now, but it will be used also for
+ future elements such as vadeinterlace.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2442>
+
+2021-07-27 13:03:37 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvapool.c:
+ * sys/va/gstvapool.h:
+ va: pool: Add gst_va_pool_new_with_config().
+ It is a helper function.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2442>
+
+2021-08-10 02:48:01 +0900 Seungha Yang <seungha@centricular.com>
+
+ d3d11window: Misc code cleanup
+ * Remove unnecessary upcasting. We are now dealing with C++ class objects
+ and don't need explicit C-style casting in the C++ world
+ * Use helper macro IID_PPV_ARGS() everywhere. It will make code
+ a little shorter.
+ * Use ComPtr smart pointer instead of calling manual IUnknown::Release()
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2461>
+
+2021-08-10 02:48:45 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.cpp:
+ d3d11compositor: Fix indent
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2461>
+
+2021-05-28 17:36:15 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/openh264/gstopenh264enc.cpp:
+ * ext/openh264/meson.build:
+ openh264: Respect level set downstream
+ We were not specifying the requested level to openh264 meaning that
+ it was choosing anything and was not respecting what was specified
+ downstream
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2289>

2021-08-04 15:02:01 +0800 He Junyan <junyan.he@intel.com>

@@ -304,7 +5595,389 @@
 use gst_object_get_parent to get the full object path name, which
 needs to lock the object. But we are already in a locked context and
 so this will cause a deadlock, the pipeline cannot exit normally.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2460>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2451>
+
+2021-07-22 20:58:02 +0000 R S Nikhil Krishna <rsnk96@gmail.com>
+
+ * ext/rtmp/gstrtmpsrc.c:
+ rtmpsrc: mention setting librtmp flags in docs
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2424>
+
+2021-08-05 23:11:26 +0200 Mathieu Duponchelle <mathieu@centricular.com>
+
+ * ext/mpeg2enc/gstmpeg2encpicturereader.cc:
+ mpeg2enc: fix interlace-mode detection
+ Previously, the code was always assuming progressive input,
+ fix this by looking at the caps.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2455>
+
+2021-05-23 19:15:25 +0100 Tim-Philipp Müller <tim@centricular.com>
+
+ * ext/curl/gstcurlbasesink.c:
+ * ext/curl/gstcurlhttpsrc.c:
+ * ext/faad/gstfaad.c:
+ * ext/hls/gsthlsdemux.c:
+ * ext/teletextdec/gstteletextdec.c:
+ * gst-libs/gst/codecparsers/gsth264parser.c:
+ * gst-libs/gst/mpegts/gst-dvb-descriptor.c:
+ * gst-libs/gst/mpegts/gstmpegtsdescriptor.c:
+ * gst-libs/gst/mpegts/gstmpegtssection.c:
+ * gst/audiovisualizers/gstspectrascope.c:
+ * gst/mpegpsmux/mpegpsmux.c:
+ * gst/mpegtsdemux/mpegtspacketizer.c:
+ * gst/mpegtsdemux/tsdemux.c:
+ * gst/mxf/mxfaes-bwf.c:
+ * gst/mxf/mxfdms1.c:
+ * gst/mxf/mxfmetadata.c:
+ * gst/mxf/mxfmpeg.c:
+ * gst/mxf/mxftypes.c:
+ * gst/rtmp2/rtmp/amf.c:
+ * meson.build:
+ * sys/androidmedia/gstamcaudiodec.c:
+ * sys/androidmedia/gstamcvideodec.c:
+ * sys/androidmedia/jni/gstamc-codeclist-jni.c:
+ * sys/androidmedia/jni/gstamc-format-jni.c:
+ * sys/androidmedia/magicleap/gstamc-format-ml.c:
+ * tests/check/libs/mpegts.c:
+ Use g_memdup2() where available and add fallback for older GLib versions
+ g_memdup() is deprecated since GLib 2.68 and we want to avoid
+ deprecation warnings with recent versions of GLib.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2280>
+
+2021-08-05 13:02:00 +0300 Sebastian Dröge <sebastian@centricular.com>
+
+ * gst/timecode/gsttimecodestamper.c:
+ timecodestamper: Fix latency calculation
+ The LTC extra latency is in ms already and not in frames, so multiplying
+ with the framerate will end up with a wrong number.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2453>
+
+2021-07-18 00:51:04 +0800 Haihao Xiang <haihao.xiang@intel.com>
+
+ * sys/msdk/gstmsdkcontext.c:
+ msdk: make sure child context is destroyed first
+ The parent context shares some resources with child context, so the
+ child context should be destroyed first, otherwise the command below
+ will trigger a segmentation fault
+ $> gst-launch-1.0 videotestsrc num-buffers=100 ! msdkh264enc ! \
+ msdkh264dec ! fakesink videotestsrc num-buffers=50 ! \
+ msdkh264enc ! msdkh264dec ! fakesink
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2435>
+
+2021-08-02 16:22:06 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11videosink.cpp:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window.h:
+ d3d11videosink: Fix warning around GstVideoOverlay::expose()
+ When expose() is called, d3d11videosink needs to redraw using
+ cached buffer, so gst_d3d11_window_render() should allow null buffer.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2450>
+
+2021-07-31 01:05:47 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11videosink.cpp:
+ d3d11videosink: Forward navigation event without modification
+ Current implementation for translating native coordinate and
+ video coordinate is very wrong because d3d11videosink doesn't
+ understand native HWND's coordinate. That should be handled
+ by GstD3D11Window implementation as an enhancement.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2450>
+
+2021-07-31 00:59:14 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11videosink.cpp:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window.h:
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ d3d11videosink: Add support for GstVideoOverlay::set_render_rectangle
+ Inspired by an MR https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2382
+ The idea is that we can make use of MoveWindow() in the WIN32 d3d11window
+ implementation safely because the WIN32 d3d11window implementation creates
+ an internal HWND even when an external HWND is set and then subclassing is used to
+ draw on the internal HWND in any case. So the coordinates passed to MoveWindow()
+ will be relative to the parent HWND, and it maps well to the concept of
+ set_render_rectangle().
+ On MoveWindow() event, a WM_SIZE event will be generated by the OS and then
+ the GstD3D11WindowWin32 implementation will update the render area including swapchain
+ correspondingly, as if it's a normal window move/resize case.
+ But in case of UWP (CoreWindow or SwapChainPanel), we need more research to
+ meet the expected behavior of set_render_rectangle()
+ Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1416
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2450>
+
+2021-07-29 18:05:35 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvavp8dec.c:
+ va: vp8: fix the overflow in _fill_quant_matrix().
+ The gint8 of qi and qi_base may overflow when calculating the matrix
+ parameters and change the decoding result.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2447>
+
+2021-06-21 00:19:17 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11device.cpp:
+ * gst-libs/gst/d3d11/gstd3d11format.h:
+ * tests/check/elements/d3d11colorconvert.c:
+ d3d11: Disable packed and subsampled YUV formats
+ Direct3D11 sampler doesn't support them very well, and conversion
+ outputs usually result in poor visual quality with our shader code.
+ Should disable support for such formats for now
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2344>
+
+2021-07-26 16:43:47 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com>
+
+ * gst/mpegtsdemux/tsdemux.c:
+ tsdemux: Notify when ignore_pcr is set
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2437>
+
+2021-07-27 23:53:06 +0200 Mathieu Duponchelle <mathieu@centricular.com>
+
+ * ext/closedcaption/gstcccombiner.c:
+ cccombiner: fix CDP padding detection
+ While a cc_data_pkt with cc_valid 0 should be considered padding,
+ it might be followed up by valid DTVCC packets, and should not
+ cause the whole CDP packet to get discarded.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2440>
+
+2021-07-27 12:51:08 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Improve the find_first_field_picture().
+ We need to consider the first field of the last picture when the
+ last picture can not enter the DPB.
+ Another change is, when the prev field's frame_num is not equal to the
+ current field's frame_num, we should also return FALSE because it
+ is also a case of losing some field.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2430>
+
+2021-07-27 12:16:13 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: consider the last field when adding a picture to the DPB.
+ There are cases where the first field of the last picture is not a
+ ref but the second field is a ref. We need to add both of them
+ because the bumping always needs a complete frame in the DPB.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2430>
+
+2021-07-27 10:51:03 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Consider the field case when directly outputting.
+ For interlaced streams, it is also possible that the last frame is
+ not able to be inserted into the DPB when the DPB is full and the last
+ frame is a non-ref. For this case, we need to hold an extra ref for
+ the first field of the last frame and wait for the complete frame
+ with both top and bottom fields. For progressive streams, the
+ behaviour is unchanged.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2430>
+
+2021-07-26 01:16:34 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264picture.c:
+ codecs: h264dec: Fix an error print of dpb_add.
+ When the dpb is interlaced, the max size should be 2*dpb->max_num_frames;
+ correct the error print info accordingly.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2430>
+
+2021-07-28 16:11:36 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/decklink/linux/DeckLinkAPIDispatch.cpp:
+ decklink: Don't print error for dlopen failure
+ This is not a fatal error on systems without decklink
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2441>
+
+2021-07-27 12:21:41 +0200 Imanol Fernandez <ifernandez@igalia.com>
+
+ * gst-libs/gst/play/gstplay.c:
+ * gst-libs/gst/player/gstplayer.c:
+ player: Add static keyword to _config_quark_table
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2439>
+
+2021-07-27 14:52:38 +1000 Matthew Waters <matthew@centricular.com>
+
+ * sys/applemedia/meson.build:
+ * sys/applemedia/videotexturecache-vulkan.mm:
+ applemedia: silence a couple of MoltenVK warnings
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2433>
+
+2021-07-27 11:49:47 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * gst/debugutils/gstfakeaudiosink.c:
+ * gst/debugutils/gstfakevideosink.c:
+ debugutils: Only proxy the properties once
+ The needed once call was removed accidentally during porting. This was caught by
+ the CI as a memory leak.
+ Related to !2426
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2438>
+
+2021-07-27 12:13:43 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfd10.c:
+ mxf: Handle D10 "picture only" variant
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/80
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2436>
+
+2021-07-24 13:19:39 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvafilter.h:
+ * sys/va/gstvavpp.c:
+ va: filter: refactor convert_surface() to process()
+ The idea of this change is to add process_with_generator() in the
+ future, where multiple input surfaces are processed for blending.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2431>
+
+2021-07-18 12:46:08 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvafilter.h:
+ * sys/va/gstvavpp.c:
+ va: filter: Refactor set_formats() to set_video_info().
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2431>
+
+2021-07-18 17:13:16 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ vapostproc: Don't add video alignment option in buffer pool.
+ vapostproc will not call gst_buffer_pool_config_set_video_alignment(),
+ thus this option is not required.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2431>
+
+2021-07-27 09:37:49 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfvc3.c:
+ mxfvc3: Also accept clip-wrapped vc-3
+ We can now handle this fine
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2434>
+
+2021-07-27 07:59:52 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ mxfdemux: Handle EOS with non-frame wrapping
+ When reaching the end of a non-frame-wrapped track in pull mode, we want to force
+ the switch to the next non-eos pad. This is similar to when we exceed the
+ maximum drift.
+ Fixes issues on EOS where not everything would be drained out and stray errors
+ would pop up.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2434>
+
+2021-07-25 07:52:06 +0200 Edward Hervey <edward@centricular.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/mxf/mxfdemux.c:
+ mxfdemux: More granular interleaved content handling
+ An interleave of 500ms can be way too big for some downstream queueing
+ elements.
Instead use a smaller 100ms interleave and silence the various
+ warnings about resyncing (it's normal)
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2434>
+
+2021-07-23 09:36:10 +0100 Philippe Normand <philn@igalia.com>
+
+ * gst/debugutils/gstfakeaudiosink.c:
+ * gst/debugutils/gstfakesinkutils.c:
+ * gst/debugutils/gstfakesinkutils.h:
+ * gst/debugutils/gstfakevideosink.c:
+ * gst/debugutils/meson.build:
+ debugutils: De-duplicate proxy_properties function to a new utils module
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2426>
+
+2021-06-08 01:40:34 +0900 Seungha Yang <seungha@centricular.com>
+
+ * meson_options.txt:
+ * sys/asio/gstasiodeviceprovider.cpp:
+ * sys/asio/gstasiodeviceprovider.h:
+ * sys/asio/gstasioobject.cpp:
+ * sys/asio/gstasioobject.h:
+ * sys/asio/gstasioringbuffer.cpp:
+ * sys/asio/gstasioringbuffer.h:
+ * sys/asio/gstasiosink.cpp:
+ * sys/asio/gstasiosink.h:
+ * sys/asio/gstasiosrc.cpp:
+ * sys/asio/gstasiosrc.h:
+ * sys/asio/gstasioutils.cpp:
+ * sys/asio/gstasioutils.h:
+ * sys/asio/meson.build:
+ * sys/asio/plugin.c:
+ * sys/meson.build:
+ Introduce Steinberg ASIO (Audio Streaming Input/Output) plugin
+ Adds a new plugin for ASIO devices.
+ Although there is a standard low-level audio API, WASAPI, on Windows,
+ ASIO is still broadly used for audio devices which are aimed at
+ professional use cases. For such devices, the ASIO API may offer better
+ quality and latency depending on the manufacturer's
+ driver implementation.
+ In order to build this plugin, the user should provide the path to the
+ ASIO SDK as a build option, "asio-sdk-path".
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2309>
+
+2021-06-06 22:32:08 +0900 Seungha Yang <seungha@centricular.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/audiolatency/gstaudiolatency.c:
+ * gst/audiolatency/gstaudiolatency.h:
+ audiolatency: Expose samplesperbuffer property
+ ... so the user can set the number of required samples.
+ For instance, our default value is 240 samples
+ (about 5ms latency at a sample rate of 48000), which might
+ be larger than the actual buffer size of the audio capture device.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2307>
+
+2021-07-23 22:02:05 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264decoder: let print_ref_pic_list_b print the correct list name.
+ print_ref_pic_list_b now not only needs to trace ref_pic_list_b0/1,
+ but also needs to trace ref_frame_list_0_short_term. We need to pass the
+ name directly to it rather than an index to refer to ref_pic_list_b0/1.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2425>
+
+2021-07-23 12:31:17 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Fix a typo in construct_ref_field_pic_lists_b.
+ The array sort of ref_frame_list_0_short_term has a typo. The
+ typo leaves this list out of POC ascending order and generates wrong
+ decoding results for interlaced streams.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2425>
+
+2021-07-15 05:12:37 -0700 Devarsh Thakkar <devarsh.thakkar@xilinx.com>
+
+ * sys/kms/gstkmssink.c:
+ kmssink: Fix fallback path for driver not able to scale scenario
+ When the driver returns an error on the update plane request, kmssink
+ disables the scaling and retries the plane update.
+ While doing so, kmssink was matching the source rectangle dimensions
+ to the target rectangle dimensions which were calculated
+ as per scaling, but this is incorrect; instead, what we want here is
+ that the target rectangle dimensions should match the source rectangle
+ dimensions, as scaling is disabled now, so we match the result
+ rectangle dimensions with the source rectangle dimensions.
+ While at it, also match the result rectangle coordinates for
+ horizontal and vertical offsets with the source rectangle coordinates,
+ since there is no scaling being done and so no recentering is
+ required.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2415>
+
+2021-07-23 16:49:49 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst/videoparsers/gstvp9parse.c:
+ videoparsers: vp9: Need to process the first frame even if not a key frame.
+ Some cut VP9 streams begin with a non-key frame. The current code
+ just bails out of parse_process_frame() if it is not a key frame. Because
+ of this, we do not set valid caps before we push the data of the
+ first frame (even though this first frame will be discarded by the downstream
+ decoder because it is not a key frame).
+ A pipeline such as:
+ gst-launch-1.0 filesrc location=some.ivf ! ivfparse ! vp9parse !
+ vavp9dec ! fakesink
+ will get a negotiation error and the pipeline can not continue. The
+ correct behaviour should be: the decoder discards the first frame and
+ continues to decode later frames successfully.
+ So, when the parser does not have valid stream info (this should be the first
+ frame case), we should continue and report caps.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2427>

2021-07-21 19:40:17 +0530 Nirbheek Chauhan <nirbheek@centricular.com>

@@ -313,7 +5986,452 @@
pipewiresrc outputs audio buffers without a valid duration, so we need
to calculate it manually in that case.
Upstream issue: https://gitlab.freedesktop.org/pipewire/pipewire/-/issues/1438
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2444>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2419>
+
+2021-07-22 22:00:38 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Do not assign frame->output_buffer until output_picture.
+ We may need to drop slices such as RASL pictures with the NoRaslOutputFlag, so
+ the current picture of h265decoder may be freed. We should not assign
+ frame->output_buffer until we really output it. Otherwise, later slices will
+ allocate another picture and trigger the assert:
+ gst_video_decoder_allocate_output_frame_with_params:
+ assertion 'frame->output_buffer == NULL' failed
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2421>
+
+2021-07-22 15:14:26 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mpegtsdemux/tsdemux.c:
+ tsdemux: Handle PCR-less streams
+ Some programs specify a PCR PID but don't actually store any PCR values, or
+ store them way too far apart.
+ In order to gracefully handle those situations, we will queue up to a certain
+ number of pending buffers before deciding to give up on that PCR PID and not use
+ any (i.e. using DTS/PTS values as-is)
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1629
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2422>
+
+2021-07-22 10:44:27 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: H265: Add odd bit depth and chroma depth in get_rtformat.
+ In H265, the stream may have an odd bit depth such as 9 or 11. And
+ the bit depth of luma and chroma may differ. For example, a
+ stream with a luma depth of 8 and a chroma depth of 9 should use the
+ 10 bit rtformat as the decoded picture format.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2420>
+
+2021-07-21 00:04:18 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ * gst-libs/gst/codecs/gsth264picture.c:
+ * gst-libs/gst/codecs/gsth264picture.h:
+ codecs: h264dec: Improve the algorithm for low latency mode.
+ In low_latency mode, try to bump the picture as soon as possible
+ without disordering the frames:
+ 1. We can directly output continuous non-reference frames.
+ 2. Consider max_num_reorder_frames, which is especially useful for
+ I-P mode.
+ 3. Consider the leading pictures with negative POC.
+ 4. Output small POC pictures when a non-reference frame comes.
+ 5. Output pictures whose POC increment is <= 2. This is not 100% safe,
+ but in practice this condition can be used.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-20 23:49:12 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ * gst-libs/gst/codecs/gsth264picture.c:
+ * gst-libs/gst/codecs/gsth264picture.h:
+ codecs: h264dec: Add helper function dpb_set_max_num_reorder_frames.
+ The max_num_reorder_frames can be useful for the bump check. We store it
+ in the DPB, and the decoder does not need it now.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-20 23:36:38 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ * gst-libs/gst/codecs/gsth264picture.c:
+ * gst-libs/gst/codecs/gsth264picture.h:
+ codecs: h264dec: Add a flag to record whether a picture is a reference.
+ The picture->ref field will change from time to time according to the decoder's
+ state and the reference sliding window. We need another flag to record whether
+ the picture is a reference picture when it is created, and this can help
+ the bumping check.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-12 00:31:54 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ codecs: h264dec: Change the order of dpb_add and dpb_bump.
+ The current behavior is different from the SPEC. We should check
+ and bump the DPB or drain the DPB before we insert the current
+ picture into it. Otherwise, the output pictures may be disordered.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-12 00:06:49 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264decoder.c:
+ * gst-libs/gst/codecs/gsth264picture.c:
+ * gst-libs/gst/codecs/gsth264picture.h:
+ codecs: h264dec: Modify the DPB need-bump check.
+ According to the spec, we should not add the current picture into the DPB
+ when we check whether it needs to bump, so the checks of the IDR and
+ the "memory_management_control_operation equal to 5" are not needed.
+ And the spec also says that the DPB only needs to bump when there is
+ no empty frame buffer left (we handle the IDR cases in other places).
+ We need to follow that, and the max_num_reorder_frames is useless here.
+ We also subtract 1 in has_empty_frame_buffer because the current frame
+ has not been added yet.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-12 00:01:58 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264picture.c:
+ * gst-libs/gst/codecs/gsth264picture.h:
+ codecs: h264dec: Make dpb_has_empty_frame_buffer a codecs API.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-05 23:53:25 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth264picture.c:
+ codecs: h264dec: Set picture to a small poc when mem_mgmt_5.
+ When the current frame's memory_management_control_operation is equal to 5, that
+ means we need to drain the dpb and the current picture acts as an IDR frame.
+ So it should have a smaller poc than the later pictures to ensure the output
+ order.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2373>
+
+2021-07-15 11:12:01 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ mxfdemux: Make gst-indent on the CI happy
+ grmbl
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-15 10:59:39 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ * gst/mxf/mxfdemux.h:
+ mxfdemux: Handle non-frame wrapping
+ * If we have an index table for non-framed essence, we can handle it
+ * The demuxer has a state which indicates whether it will next fetch a KLV or
+ data contained *within* a KLV.
+ * The position on Essence Tracks always corresponds to the next entry to fetch;
+ the demuxer offset will be skipped accordingly whenever we switch between
+ partitions (in case of resyncs). A copy of the main clip/custom KLV for that
+ partition is kept to track the position within the essence of that partition.
+ * For clip/custom-wrapped raw audio, if the edit rate is too small (and would
+ cause plenty of tiny buffers to be outputted), specify a minimum number of edit
+ units per buffer.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-15 10:45:46 +0200 Edward Hervey <edward@centricular.com>
+
+ mxfdemux: Use KLV for position/content tracking
+ * For pull-based scheduling, this avoids pulling content if it's not needed (ex: skipping filler
+ packets, not downloading the content if we only need to know if/where an essence
+ packet is, etc...). Allows reducing i/o usage to the minimum.
+ * This also allows doing sub-klv position tracking, and opens the way for
+ non-frame-wrapping handling
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-15 10:28:31 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ mxfdemux: Output the topology of the file in debug logs
+ This provides a summary of the number/type of tracks in the Material and File
+ Packages
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-15 10:16:34 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ mxfdemux: Refactor pull seek
+ In order to figure out the exact start position (backed by a keyframe) across
+ all tracks, we first figure out the backing keyframe position, and *then* seek
+ to that position.
+ Avoids ending up in situations where we would properly seek to the backing
+ keyframe on video ... but not on the audio streams (they would have been set to
+ the original non-keyframe position). Fixes key-unit seeking.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-14 07:58:01 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfaes-bwf.c:
+ * gst/mxf/mxfessence.h:
+ mxfaes-bwf: Handle new custom-constant-sized variant
+ Defined by Amendment 2:2013 to SMPTE ST 382:2007
+ Also define a new "UNKNOWN" wrapping type to distinguish it from known
+ wrapping types
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-14 07:54:38 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfmpeg.c:
+ mxfmpeg: Fix essence coding detection
+ The picture essence coding matching was wrong. Use the proper "base" MXFUL for
+ video mpeg compression for matching.
+ Also handle the case where some old files would put the essence container label
+ in the essence coding field
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-01 08:35:01 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ * gst/mxf/mxfdemux.h:
+ * gst/mxf/mxftypes.c:
+ * gst/mxf/mxftypes.h:
+ mxfdemux: Refactor index table and offset handling
+ * Streamline offset <=> entry handling. Historically the demuxer didn't support
+ information from index tables and stored the discovered information in an array
+ per track. When index table support was added, a parallel system was set up for
+ that relationship. This commit unifies this into one system with the
+ `find_edit_entry()` and `find_entry_for_offset()` functions.
+ * By extension, per-track offset entry tables are only created/used if no index
+ table is present for those tracks.
+ * Use index table information as-is. The index table system from MXF is quite
+ complex and there are various ways to use the information contained
+ within. Instead of converting that information we store the data from the tables
+ as-is and extract the needed information when needed.
+ * Handle index tables without entries (i.e. all content package units are of the
+ same size).
+ * Allow collecting index table segments as we go instead of only once if a
+ random-index-pack is present. This also improves support of some files in
+ push-mode.
+ * When searching for keyframe entries, use the keyframe_offset if
+ present (speeds up searching).
+ * For interleaved content (i.e. several tracks in the same essence container),
+ we use a system to be able to identify the position of each track in the delta
+ entries of index tables.
+ * Handle temporal offsets only on tracks which *do* need them (as specified in the
+ delta entries of the index tables).
If present, those offsets are stored in a
+ pre-processed table which allows computing PTS from DTS with a simple offset.
+ * Add a quirk for files which are known to have wrongly stored temporal
+ offsets.
+ * Overall opens the way to handle more types of MXF files, especially those with
+ non-frame-wrapping.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-06-29 15:29:36 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ mxfdemux: Drop duplicate seek events
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-06-24 09:53:08 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxftypes.c:
+ mxf: Improve index entry debug log
+ By printing out the various known flag values
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-06-23 09:08:33 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfmetadata.c:
+ mxf: Demote error message when resolving valid empty reference
+ A Source Clip can have a zero'd SourcePackageID and SourceTrackID; this indicates
+ it terminates the source reference chain
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-06-17 16:38:54 +0200 Edward Hervey <edward@centricular.com>
+
+ * gst/mxf/mxfdemux.c:
+ * gst/mxf/mxfdemux.h:
+ mxfdemux: Handle temporal reordering shift
+ This is similar to how the same issue was handled in qtdemux.
+ In order for the "DTS <= PTS" constraint to be respected, we calculate the
+ maximum temporal reordering that can happen (via index tables).
+ If there is a non-0 temporal reordering, we:
+ * Shift all outgoing PTS by that amount
+ * Shift the segment for that stream by that amount
+ * Don't modify DTS (i.e.
they might end up having negative running-time, before
+ the start of the segment)
+ Also ensure all entries have a valid PTS set; previously this wouldn't be set
+ for entries with a temporal offset of 0.
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/584
+ (and maybe a lot of other issues)
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2371>
+
+2021-07-17 20:49:15 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavp9dec.c:
+ va: vp9dec: Minor cleanups.
+ Added a comment with a future to-do, enhanced another comment and
+ fixed a typo in an error log message.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2417>
+
+2021-07-17 20:48:21 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ va: decoder: Group decoder methods.
+ Move up gst_va_decoder_get_config() to group the decoder functions in the
+ same file area.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2417>
+
+2021-07-17 20:45:48 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvaav1dec.c:
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvah265dec.c:
+ * sys/va/gstvampeg2dec.c:
+ * sys/va/gstvavp8dec.c:
+ * sys/va/gstvavp9dec.c:
+ va: Refactor _format_changed() to _config_is_equal().
+ Change gst_va_decoder_format_changed() to
+ gst_va_decoder_config_is_equal(), which is more consistent with other
+ GStreamer API.
+ The function call is replaced but it has to be negated because the
+ return value is the opposite.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2417>
+
+2021-07-17 20:37:52 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ * sys/va/gstvavp9dec.c:
+ va: Refactor _change_resolution() to _update_frame_size().
+ Rename gst_va_decoder_change_resolution() to
+ gst_va_decoder_update_frame_size(), which resembles
+ gst_va_decoder_set_frame_size().
+ Also added a comment to clarify the function's use and made the error
+ message more specific.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2417>
+
+2021-07-17 20:29:45 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvaav1dec.c:
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvah265dec.c:
+ * sys/va/gstvampeg2dec.c:
+ * sys/va/gstvavp8dec.c:
+ * sys/va/gstvavp9dec.c:
+ va: Refactor _set_format() to _set_frame_size().
+ Renamed gst_va_decoder_set_format() to
+ gst_va_decoder_set_frame_size_with_surfaces(), which better reflects
+ the passed parameters. Internally it creates the vaContext.
+ Added gst_va_decoder_set_frame_size(), which is an alias of
+ gst_va_decoder_set_frame_size_with_surfaces() without surfaces. This
+ is the function which replaces gst_va_decoder_set_format() where
+ used.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2417>
+
+2021-07-16 15:24:11 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth265decoder.c:
+ * gst-libs/gst/codecs/gsth265decoder.h:
+ * sys/nvcodec/gstnvh265dec.c:
+ codecs: h265decoder: Fix a typo of NumPocTotalCurr when processing the ref pic list.
+ We should use the NumPocTotalCurr value stored in the decoder, which is a calculated
+ valid value, rather than the invalid value in the slice header. Most of the
+ time, the NumPocTotalCurr is 0, which makes tmp_refs very short and
+ causes wrong decoding results.
+ By the way, NumPocTotalCurr is not the correct name specified in the H265 spec;
+ its name should be NumPicTotalCurr. We change it to the correct name.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2414>
+
+2021-07-16 13:21:11 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Do not add non-reference frames into the ref list.
+ The VA ReferenceFrames should only contain reference frames; we
+ should not add non-reference frames into this list.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2414>
+
+2021-07-15 19:44:21 +0900 Seungha Yang <seungha@centricular.com>
+
+ * tests/check/meson.build:
+ tests: Enable closedcaption test on Windows
+ ... if closedcaption plugin is available
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2411>
+
+2021-07-15 16:44:18 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/mediafoundation/gstmfvideoenc.cpp:
+ mfvideoenc: Disable RGB format support
+ Some GPUs support the BGRA format and it will be converted to a subsampled
+ YUV format by the GPU internally. Disable this implicit conversion
+ since the conversion parameters such as input/output colorimetry
+ are not exposed nor written in the bitstream (e.g., VUI).
+ We prefer explicit conversion via our conversion elements.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2410>
+
+2021-07-15 21:32:54 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Fix a temp var overflow bug when writing the pred weight table.
+ The temp guint8 vars for delta_chroma_offset_l0 and delta_chroma_offset_l1
+ can not cover the full range of delta_chroma_weight_l0/1 in the slice
+ header. When an overflow happens, the decoding result is wrong.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2412>
+
+2021-07-12 12:08:20 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/dash/gstmpdparser.c:
+ * tests/check/elements/dash_mpd.c:
+ dash: Store entire ContentProtection node in protection event data
+ Some manifests use the ContentProtection node to store additional information
+ such as the license server url. Our MPD parser used to process the
+ ContentProtection node, extracting Playready PSSH boxes. However for other DRM
+ systems, only the `value` attribute was passed down to the protection event, so
+ for example, Widevine data was not parsed at all and "Widevine" was passed to
+ the event, which is not very useful for decryptors that require PSSH init
+ data.
+ Parsing should now be done by decryptors which will receive the entire
+ ContentProtection XML node as a string. This gives more "freedom" to the
+ decryptor which can then detect and parse custom nodes as well.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2400>
+
+2021-07-14 22:36:52 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvavp9dec.c:
+ va: vp9dec: We need to check the resolution changes for every frame.
+ VP9 streams have the ability to change the resolution dynamically
+ at any time point. They do not send a KEY frame before changing the
+ resolution; even an INTER frame can change the resolution immediately.
+ So we need to check the resolution change for each frame and do the
+ re-negotiation if needed.
+ Some insane streams may play in resolution A first and then dynamically
+ change to B, and after 1 or 2 frames use a show_existing_frame to
+ repeat the old frame of resolution A. So not only new_picture(),
+ but also duplicate_picture() needs to check this.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2407>
+
+2021-07-14 14:43:51 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvavp9dec.c:
+ va: vp9dec: Do not re-create context for dynamic resolution changes.
+ The driver for VP9 should have the ability to handle dynamic resolution
+ changes. So if only the resolution changes, we should not re-create the config
+ and context in negotiation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2407>
+
+2021-07-14 14:27:34 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ va: decoder: Add helper functions to get and change the resolution.
+ For some codecs such as VP9, the config and context can change the
+ resolution dynamically. When we only change the width and height, there is
+ no need to re-create the config and context. The helper function can just
+ change the resolution without re-creating the config and context.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2407>
+
+2021-05-29 06:03:26 +1000 Jan Schmidt <jan@centricular.com>
+
+ * gst/mpegtsmux/tsmux/tsmux.c:
+ mpegtsmux: Quieten "missed PCR" warnings in VBR mode.
+ When the muxer is operating in VBR mode, it's kind of expected
+ for now that we might not put the PCR in exactly the right place,
+ because the muxer doesn't schedule packets that way. In that case
+ don't warn constantly about the PCR ending up a few ms off target.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2295>

2021-07-13 21:38:10 +1000 Matthew Waters <matthew@centricular.com>

@@ -324,15 +6442,168 @@
not synchronised, Then the GSource destruction can access freed GMainContext
resources and cause a crash. This is not super common but can happen.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2416> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2405> -2021-07-09 09:09:14 +0100 Philippe Normand <philn@igalia.com> +2021-07-08 14:25:23 +0200 Mads Buvik Sandvei <madssandvei@protonmail.com> - * ext/dash/gstdashdemux.c: - dashdemux: Remove duplicate logging call - Follow-up from: - https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2392 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2395> + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Always free messages while parsing SEI + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2390> + +2021-07-14 19:39:11 +0900 Seungha Yang <seungha@centricular.com> + + * gst/videoparsers/gstvp9parse.c: + vp9parse: Skip parsing decode-only frame + A decode-only frame (i.e., show_existing_frame == 1) doesn't hold + any valid information apart from the index of the frame to be duplicated. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2408> + +2021-07-13 16:55:30 +0100 Philippe Normand <philn@igalia.com> + + * ext/wpe/gstwpesrcbin.cpp: + wpesrcbin: Use gst_buffer_new_memdup() + g_memdup() is deprecated. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2406> + +2021-07-12 23:25:02 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstvp9statefulparser.c: + codecs: vp9statefulparser: do not init segmentation_abs_or_delta_update. + The segmentation_abs_or_delta_update is a stateful variable; it should not + be re-initialized every time the segmentation is parsed.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2403> + +2021-07-12 23:21:29 +0900 Seungha Yang <seungha@centricular.com> + + * tests/check/elements/wasapi2.c: + tests: wasapi2: Add more device reuse cases + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2402> + +2021-07-12 22:17:22 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + wasapi2ringbuffer: Close IAudioClient on GstAudioRingBuffer::release + IAudioClient interface is not reusable once it's initialized. + So we should close the handle and reopen it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2402> + +2021-07-13 03:35:22 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + wasapi2ringbuffer: Run gst-indent + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2402> + +2021-07-12 09:01:06 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: fix underflow in last_frame_idx + The spec mandates this field be parsed using unsigned arithmetic. Nevertheless, + av1parser will use -1 apparently as an uninitialized value in + gst_av1_parse_frame_header. This immediately underflows last_frame_idx + though, since its type was defined as guint8. Fix this by converting to gint8. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2401> + +2021-03-18 10:55:58 +0100 Jakub Janků <jjanku@redhat.com> + + * sys/wasapi/gstwasapisink.c: + wasapi: fix reinit of audioclient in prepare() + When the sink goes from PLAYING to READY and then back to PLAYING, + the initialization of the audioclient in prepare() fails with the + error AUDCLNT_E_ALREADY_INITIALIZED. As a result, the playback + stops. 
+ To fix this, we need to drop the AudioClient in unprepare() and + grab a new one in prepare() to be able to initialize it again + with the new buffer spec. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2096> + +2021-03-17 22:45:57 +0100 Jakub Janků <jjanku@redhat.com> + + * sys/wasapi/gstwasapisink.c: + * sys/wasapi/gstwasapisrc.c: + * sys/wasapi/gstwasapiutil.c: + * sys/wasapi/gstwasapiutil.h: + wasapi: split gst_wasapi_util_get_device_client() + The functionality now resides in + gst_wasapi_util_get_device() and + gst_wasapi_util_get_audio_client(). + This is a preparatory patch. It will be used in the following + patch to init/deinit the AudioClient separately from the device. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2096> + +2021-07-11 18:14:46 +0200 Jakub Janků <janku.jakub.jj@gmail.com> + + * tests/check/elements/wasapi.c: + * tests/check/meson.build: + tests: wasapi: check PLAYING -> READY -> PLAYING + Such a sequence of state changes is valid and no error should happen. + At the moment, the test fails. The following patches aim to fix it. + Partially based on the code in tests/check/elements/wasapi2.c + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2096> + +2021-07-09 14:55:43 +0200 Guido Günther <agx@sigxcpu.org> + + * gst-libs/gst/play/gstplay-signal-adapter.c: + play: Emit correct signal + SIGNAL_MEDIA_INFO_UPDATED should be emitted on media info changes, + not SIGNAL_VIDEO_DIMENSIONS_CHANGED.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2398> + +2021-03-05 09:18:15 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/vulkan/gstvkcommandpool.h: + * gst-libs/gst/vulkan/gstvkdescriptorcache.h: + * gst-libs/gst/vulkan/gstvkdevice.h: + * gst-libs/gst/vulkan/gstvkdisplay.h: + * gst-libs/gst/vulkan/gstvkfullscreenquad.h: + * gst-libs/gst/vulkan/gstvkhandlepool.h: + * gst-libs/gst/vulkan/gstvkinstance.h: + * gst-libs/gst/vulkan/gstvkphysicaldevice.h: + * gst-libs/gst/vulkan/gstvkqueue.h: + * gst-libs/gst/vulkan/gstvkswapper.h: + * gst-libs/gst/vulkan/gstvktrash.h: + * gst-libs/gst/vulkan/gstvkwindow.h: + vulkan: Declare missing auto-pointer cleanup functions. + Also removed a couple of guards since, given the GLib dependency, they + are always set. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2396> + +2021-03-03 12:54:20 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/vulkan/gstvkvideofilter.c: + vulkan: filter: Use filter variable name for choosing queue. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2396> + +2021-03-03 08:50:13 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/vulkan/vksink.c: + vulkansink: Fix element metadata. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2396> + +2021-06-25 09:19:25 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * sys/msdk/gstmsdkvpp.c: + msdkvpp: use NV12 as default format on srcpad + By default, the sinkpad is NV12 format and the srcpad is BGRA format; the + different formats will trigger an implicit format conversion in + msdkvpp, which will cause a performance drop.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2394> + +2021-06-01 08:40:17 +0900 Dominique Martinet <dominique.martinet@atmark-techno.com> + + * gst-libs/gst/wayland/wayland.c: + gst-libs/gst/wayland: handle display passing better + Failure to pass a display in 'handle' would result in an uninitialized value + being returned, which would often segfault later down the road when trying + to initialize the GStreamer context with it. + Check the return value of gst_structure_get() to make sure we return valid + data. + Furthermore, the gstglimagesink in gst-plugins-base also has a similar + mechanism but uses 'display' as the field name to pass the value; instead of + requiring the application to behave differently depending on what sink + was automatically detected, just try to read both values here, with display + being the new default. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2292> 2021-07-08 14:46:11 +0100 Philippe Normand <philn@igalia.com> @@ -340,7 +6611,20 @@ dashdemux: Log protection events on corresponding pad GstDashDemuxStream is not a GstObject, so use its pad as associated object when emitting log messages. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2392> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2389> + +2021-07-08 16:49:27 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvafilter.c: + va: vpp: Improve the color properties setting. + The current setting of color properties is not quite correct, and + we get some kind of "unknown Color Standard for YUV format" + warnings printed out by drivers. The video-color library already provides + some standard APIs for us, and we can use them directly. + We also change the logic to: find the exact match or explicit + standard first. If none is found, we continue to find the most similar + one.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2385> 2021-07-08 19:03:06 +0900 Seungha Yang <seungha@centricular.com> @@ -350,7 +6634,246 @@ Given caps do not need to be strictly a subset of device caps. Accept them if device caps and requested caps can intersect Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1619 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2391> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2384> + +2021-07-08 02:24:18 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstcccombiner.c: + cccombiner: mark field 0 as valid when generating padding CDP + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2383> + +2021-07-06 17:14:21 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideobuffer.cpp: + * sys/mediafoundation/gstmfvideobuffer.h: + mfvideobuffer: Don't error for unexpected Unlock/Unlock2D call + Some GPU vendors' MFT implementations call IMFMediaBuffer::Unlock() + without a previous IMFMediaBuffer::Lock() call. This is obviously + a driver bug, but we can ignore the Unlock call. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2381> + +2021-06-30 10:30:43 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/dash/gstdashsink.c: + dashsink: fix crash with no pad name for representation + If there is no pad name, the representation id + was NULL, causing a crash when writing the MPD file. + gst-launch-1.0 videotestsrc num-buffers=900 ! video/x-raw, width=800, + height=600, framerate=30/1 ! x264enc ! video/x-h264, profile=high !
+ dashsink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2064> + +2021-03-09 11:40:43 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/dash/gstdashsink.c: + dashsink: Add signals for allowing custom playlist/fragment + Instead of always going through the file system API we allow the + application to modify the behaviour. For the playlist itself and + fragments, the application can provide a GOutputStream. In addition the + sink notifies the application whenever a fragment can be deleted. + Following the HLS change: + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/918 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2064> + +2021-07-06 14:06:24 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265dec: Disable the POC order warning for negative POC. + There may be leading frames after the IRAP frames, which have negative + POC. These frames are allowed and they will be displayed before + the IRAP frame, so the warning should not be triggered for them. + Initializing last_output_poc to G_MININT32 avoids this. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2380> + +2021-07-06 13:38:16 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264dec: Disable the POC order warning for negative POC. + There may be leading frames after the IDR frame, which have negative + POC. These frames are allowed and they will be displayed before + the IDR frame, so the warning should not be triggered for them. + Initializing last_output_poc to G_MININT32 avoids this.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2380> + +2021-06-25 15:57:03 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/interlace/gstinterlace.c: + interlace: Push the reconfigure event in the right direction + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2361> + +2021-07-05 15:44:34 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + va: basedec: Fix some artifacts when doing the crop copy. + The default video converter setting will add some artifacts into + the picture for 10/12 bit conversions. This makes the MD5 checksum + differ from the original picture. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2379> + +2021-07-05 02:05:03 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + d3d11decoder: Enable zero-copy for Qualcomm + The Qualcomm GPU works fine with the current implementation now. + A noticeable difference between when it was disabled and the current + d3d11 implementation is that we now support the GstD3D11Memory + pool, so there is no more frequent re-binding of decoder surfaces. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2377> + +2021-07-05 07:42:39 +0200 Edward Hervey <edward@centricular.com> + + * gst/mxf/mxfdemux.c: + * gst/mxf/mxfmetadata.c: + * gst/mxf/mxfmetadata.h: + mxfdemux: Check validity of interleaved File Package + As specified by the S377 MXF core specification, if a file package has + interleaved content, then all tracks must be using the same Edit Rate + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2378> + +2021-07-05 01:54:02 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11vp9dec: Fix for incorrect use_prev_in_find_mv_refs setting + Set use_prev_in_find_mv_refs depending on context.
The value seems + to be used by AMD and Qualcomm (Intel and NVIDIA don't show a difference + per testing) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2376> + +2021-05-11 14:07:14 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkallocator_libva.c: + * sys/msdk/gstmsdksystemmemory.c: + * sys/msdk/gstmsdkvideomemory.c: + * sys/msdk/gstmsdkvpp.c: + * sys/msdk/msdk.c: + * sys/msdk/msdk_libva.c: + gstmsdkvpp: add RGBP and BGRP in src pad + It requires MFX version 2.4+ + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2234> + +2021-07-04 00:36:27 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfsourcereader.cpp: + * sys/mediafoundation/gstmftransform.cpp: + mediafoundation: Port to IID_PPV_ARGS + Make code shorter where possible + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2375> + +2021-07-04 00:24:09 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfaacenc.cpp: + * sys/mediafoundation/gstmfmp3enc.cpp: + mfaudioenc: Remove pointless enumeration for hardware audio encoder + Hardware audio encoders can exist in theory, but they are untested + and we are not sure whether they can be preferred over the software + implementation provided by MS + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2375> + +2021-07-03 23:12:08 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfaacenc.cpp: + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfmp3enc.cpp: + * sys/mediafoundation/gstmfvp9enc.cpp: + mediafoundation: Fix typos + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2375> + +2021-07-03 22:56:48 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfaacenc.cpp: + * sys/mediafoundation/gstmfaudioenc.cpp: + * 
sys/mediafoundation/gstmfcapturewinrt.cpp: + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfmp3enc.cpp: + * sys/mediafoundation/gstmfsourcereader.cpp: + * sys/mediafoundation/gstmftransform.cpp: + * sys/mediafoundation/gstmfutils.cpp: + * sys/mediafoundation/gstmfvideobuffer.cpp: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/gstmfvp9enc.cpp: + * sys/mediafoundation/mediacapturewrapper.cpp: + mediafoundation: Run gst-indent + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2375> + +2021-06-26 21:42:37 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvah265dec.c: + va: Consider the compatibility when we get_profile() for H265 decoder. + Add the compatible profiles when we decide the final profile used for decoding. + The final profile candidates include: + 1. The profile directly specified by SPS, which is the exact one. + 2. The compatible profiles decided by the upstream element such as h265parse. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2322> + +2021-06-27 15:34:28 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gsth265parse.c: + h265parse: Add special profile case for profile_idc 0. + This is a work-around to identify some main profile streams having + a wrong profile_idc. There are some wrongly encoded main profile streams + which don't have any of the profile_idc values mentioned in Annex A; + instead, general_profile_idc has been set to zero and the + general_profile_compatibility_flag[general_profile_idc] is TRUE. + Assume them to be MAIN profile for now. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2322> + +2021-06-26 15:11:47 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gsth265parse.c: + h265parse: Map -intra profiles to non-intra compatible profiles.
+ All the -intra profiles can map to non-intra profiles as compatible + profiles, except the monochrome case for main and main-10. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2322> + +2021-07-01 19:27:28 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavp9dec.c: + va: vp9dec: update segmentation and store the result. + The segmentation is stateful; its information may depend on the previous + segmentation setting. For example, if loop_filter_delta_enabled is TRUE, + the filter_level[GST_VP9_REF_FRAME_INTRA][1] should inherit the previous + frame's value and cannot be calculated from the current frame's segmentation + data only. So we need to maintain the segmentation state inside the vp9 + decoder and update it when a new frame header comes. + We also fix the CLAMP issue of lvl_seg and intra_lvl, caused by their wrong + uint type. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2369> + +2021-06-30 15:23:15 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstvp9statefulparser.c: + codecparsers: vp9statefulparser: Fix the gst_vp9_get_qindex clamp issue. + The alternate quantizer is a delta value and should be of int type. + We wrongly marked it as uint, which makes CLAMP (data, 0, 255) + always choose 255 rather than 0 when data < 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2369> + +2021-06-30 15:32:42 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstvp9parser.h: + codecparsers: vp9parser: Use macro to define the size of filter_level in Segmentation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2369> + +2021-06-30 12:15:42 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstvp9parser.h: + codecparsers: vp9parser: Delete the redundant redefinition of MAX_LOOP_FILTER. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2369> + +2021-06-29 23:21:24 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: dma: Fail when mapping the non-linear buffer. + The current way of DMA buffer mapping simply forwards the job + to the parent's map function, which is a mmap(). That cannot handle + non-linear buffers, such as tiled or compressed ones. The incorrect + mapping of such buffers causes broken images, which get reported + as bugs. We should directly block this kind of mapping to avoid the + misunderstanding. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2353> 2021-07-02 13:10:25 +1000 Matthew Waters <matthew@centricular.com> @@ -361,7 +6884,28 @@ framerates. The existing caps negotiation code was allowing any framerate to convert to a cdp output which is incorrect and would hit an assertion later. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2388> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2372> + +2021-06-09 15:16:39 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Consider the conformance window changes when new_sequence(). + The change of conformance_window_flag and the crop window size also has an impact on the + output resolution and caps. So it deserves a trigger of new_sequence() to notify + the subclass to update caps and pool.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2312> + +2021-06-16 01:07:09 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + h265decoder: Don't hold reference to GstVideoCodecFrame for dropped picture + We are dropping a RASL (Random Access Skipped Leading picture) which + is associated with an IRAP (Intra Random Access Picture) that has + NoRaslOutputFlag equal to 1, since the RASL picture will not be + output and should not be used as a reference picture. + So the corresponding GstVideoCodecFrame should be released immediately; + otherwise the GstVideoDecoder base class will hold the unused frame. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2330> 2021-06-21 13:23:13 +0200 Edward Hervey <edward@centricular.com> @@ -372,7 +6916,25 @@ variants. In order to properly advance the streams, the essence handler returns an empty GAP buffer which gets converted to a GST_EVENT_GAP. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2387> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2345> + +2021-06-30 18:11:46 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideoenc.cpp: + mfvideoenc: Don't ignore previous flow return value + In case of an ASYNC MFT (hardware encoder), we were ignoring previous + finish_frame or pad_push return values, so errors weren't propagated. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2370> + +2021-05-20 00:49:15 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11compositor.cpp: + * sys/d3d11/gstd3d11compositor.h: + * sys/d3d11/gstd3d11compositorbin.cpp: + d3d11compositor: Add scaling policy to support PAR-aware scaling + Identical to https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1156 + but for D3D11.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2263> 2021-06-30 13:56:49 +0900 youngsoo.lee <youngsoo15.lee@gmail.com> @@ -383,7 +6945,307 @@ Mat_(std::initializer_list<_Tp> values); fatal error: too many errors emitted, stopping now [-ferror-limit=] 35 warnings and 20 errors generated. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2386> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2368> + +2021-03-03 15:38:45 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + v4l2codecs: vp8: Check kernel version + Print a warning if the kernel version is too old. + Signed-off-by: Ezequiel Garcia <ezequiel@collabora.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2075> + +2021-03-02 18:13:27 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/linux/v4l2-controls.h: + * sys/v4l2codecs/linux/videodev2.h: + * sys/v4l2codecs/linux/vp8-ctrls.h: + * sys/v4l2codecs/plugin.c: + v4l2codecs: vp8: Update to the new uAPI + Starting from Linux v5.13, the V4L2 stateless VP8 uAPI + is updated and stable. + Signed-off-by: Ezequiel Garcia <ezequiel@collabora.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2075> + +2021-06-27 01:15:49 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvrtcloader.c: + nvcodec: Enhance CUDA runtime compiler library loading on Windows + The name of the installed CUDA runtime compiler library is formed like + nvrtc64_{major-version}{minor-version}_0.dll on Windows + (which is different from what is documented in https://docs.nvidia.com/cuda/nvrtc/index.html). + And the minor version might not be exactly the same as that of CUDA.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2362> + +2021-06-14 18:49:20 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11av1dec.cpp: + * sys/d3d11/gstd3d11av1dec.h: + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.c: + d3d11: Add AV1 decoder + Introduce Direct3D11/DXVA AV1 decoder element + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2365> + +2021-06-27 23:09:30 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstav1decoder.c: + av1decoder: Store display resolution for duplicated picture + Target display resolution might be required by subclass implementation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2365> + +2021-06-27 20:35:49 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstav1decoder.c: + av1decoder: Fix debug typo + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2365> + +2021-06-27 20:19:39 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + av1parser: Fix tile size calculation + Remaining size should exclude already read "tile size bits". + And see also "5.11.1. General tile group OBU syntax" + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2365> + +2021-06-28 21:13:56 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/transportreceivebin.c: + webrtc receivebin: Drop serialized queries before receive queue + If they're not dropped, they can be blocked in the queue even if it is + leaky in the case where there is a buffer being pushed downstream. Since + in webrtc, it's unlikely that there will be a special allocator to + receive RTP packets, there is almost no downside to just ignoring the + queries. + Also drop queries if they get caught in the pad probe after the queue. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2363> + +2021-06-26 14:31:01 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/transportreceivebin.c: + * ext/webrtc/transportreceivebin.h: + webrtc receivebin: Only set queue to leaky when the pad is blocked + When the pad is no longer blocked, remove the leakiness to make sure + everything gets into the jitterbuffer. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2363> + +2021-06-26 14:25:39 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/transportreceivebin.c: + webrtc receivebin: Don't unblock pad until sender is unblocked + As the OpenSSL session is created when the receiver goes into + playing, we have to wait for the ICE session to be connected before we + can start delivering packets to the DTLS element. + Fixes #1599 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2363> + +2021-06-24 13:17:09 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * gst-libs/gst/webrtc/dtlstransport.c: + webrtcbin: Sync to the clock per stream and not per bundle + By using the clocksync inside the dtlssrtpenc, all streams inside a + bundle are synchronized together. This will cause problems if their + buffers are not already arriving synchronized: clocksync would wait for + a buffer on one stream and then buffers from the other stream(s) with + lower timestamps would all be sent out too late. + Placing the clocksync before the rtpbin and rtpfunnel synchronizes each + stream individually and they will be sent out more smoothly as a result.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2355> + +2021-06-24 14:58:12 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/transportreceivebin.c: + * ext/webrtc/transportsendbin.c: + * ext/webrtc/transportstream.c: + * gst-libs/gst/webrtc/rtpreceiver.c: + * gst-libs/gst/webrtc/rtpreceiver.h: + webrtc: Remove the webrtc-priv.h header from public headers + And this time for real, also import it in a couple more places + inside the webrtc element to make it build. + Fixes #1607 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2359> + +2021-06-09 17:29:19 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaav1dec.c: + va: change AV1 GstVideoAlignment setting to left-top corner. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-18 10:37:06 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvah264dec.c: + va: h264dec: Set the GstVideoAlignment correctly. + We should set GstVideoAlignment based on the sequence's crop information. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-09 17:21:18 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvah265dec.c: + va: h265dec: Set the GstVideoAlignment correctly. + We should set GstVideoAlignment based on the conformance window info. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-09 17:19:04 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvapool.c: + va: pool: Add VideoCropMeta to the buffer if crop_top/left > 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-09 17:14:42 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + va: basedec: Copy the frames into other_pool if needed. 
+ If the decoder's crop_top/left value > 0 and the downstream does not + support the VideoCropMeta, we need to manually copy the frames + into the other_pool and output those. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-09 15:44:33 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + va: basedec: Setup the other_pool to copy output if crop_left/top. + If the decoder has a crop_top/left value > 0 (e.g. the conformance + window in H265), the real output picture is located in the middle + of the decoded buffer. If the downstream can + support VideoCropMeta, a VideoCropMeta is added to convey the + real picture's coordinates and size. If not, we need to copy + it manually and the other_pool is needed. We always assume that + the decoded picture starts from the top-left corner, so there is no + need to do this if only the crop_bottom/right values > 0. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-07 00:49:49 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvapool.c: + va: No need to set the alignment for VideoMeta + The base va decoder's video_align is just used for calculating the + real decoded buffer's width and height. It does not have meaning + for the VideoMeta, because it does not align to the real picture + in the output buffer. We will use VideoCropMeta to replace it later. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-03 00:07:05 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvapool.c: + va: Delete the useless align expand in va_pool_set_config(). + The base va decoder's video_align is just used for calculating the + real decoded buffer's width and height, while gst_video_info_align + just calculates the offsets and strides based on the video_align.
But + all the offsets and strides are overwritten in gst_va_dmabuf_allocator_try + or gst_va_allocator_try, which make that calculation useless. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2298> + +2021-06-28 17:41:38 +0300 Sebastian Dröge <sebastian@centricular.com> + + * gst-libs/gst/webrtc/webrtc-priv.h: + * gst-libs/gst/webrtc/webrtc_fwd.h: + webrtc: Re-add WebRTC object docs to the public headers + So they end up in the generated documentation and the Since markers + appear in the .gir files too. + Also remove wrong "Since: 1.16" markers for some objects that were + available since 1.14.0 already. + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1609 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2366> + +2021-06-25 10:20:06 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.c: + webrtcbin: Set transceiver kind and codec preferences immediately when creating it + Otherwise the on-new-transceiver signal will always be emitted with kind + set to UNKNOWN and no codec preferences although both are often known at + this point already. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2360> + +2021-06-25 12:14:03 +0300 Sebastian Dröge <sebastian@centricular.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin: Add a test for setting codec preferences as part of "on-new-transceiver" when setting the remote offer + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2310> + +2021-06-25 12:13:42 +0300 Sebastian Dröge <sebastian@centricular.com> + + * tests/check/elements/webrtcbin.c: + webrtc: Use fail_unless_equals_string() for string assertions + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2310> + +2021-06-08 11:40:14 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Store newly created transceivers when creating an answer also in the seen transceivers list + Otherwise it might be used a second time for another media afterwards. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2310> + +2021-06-08 11:39:27 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: When creating a new transceiver as part of creating the answer also take its codec preferences into account + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2310> + +2021-06-08 11:38:11 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Fix a couple of caps leaks of the offer caps + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2310> + +2021-06-24 12:28:11 +0100 Philippe Normand <philn@igalia.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Stop transceivers update after first SDP error on data channel + When invalid SDP is supplied, _update_data_channel_from_sdp_media() sets the + GError, so it is invalid to continue any further SDP processing, we have to exit + early when the first error 
is raised. + This change is similar to the one applied in + 064428cb34572fa1a018ebbaba6925967ba99dc0. + See also #1595 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2356> + +2021-06-21 16:50:46 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin test: Fix race in new test + Pull a buffer from a sink to make sure that the caps are already + set before trying to update them. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2348> + +2021-06-22 16:12:57 +0800 Mengkejiergeli Ba <mengkejiergeli.ba@intel.com> + + * sys/msdk/gstmsdkenc.c: + * sys/msdk/gstmsdkenc.h: + * sys/msdk/gstmsdkvp9enc.c: + msdk: fix qp range for vp9enc + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2349> + +2021-06-10 11:46:35 +0300 Sebastian Dröge <sebastian@centricular.com> + + * gst/timecode/gstavwait.c: + avwait: Don't consider it a segment change if the segment is the same except for the position + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2319> + +2021-06-21 17:13:33 +0900 Seungha Yang <seungha@centricular.com> + + d3d11: Add support for GRAY and more YUV formats + By this commit, following formats will be newly supported by d3d11 elements + * Y444_{8, 12, 16}LE formats: + Similar to other planar formats. Such Y444 variants are not supported + by Direct3D11 natively, but we can simply map each plane by + using R8 and/or R16 texture. + * P012_LE: + It is not different from P016_LE, but defining P012 and P016 separately + for more explicit signalling. Note that DXVA uses P016 texture + for 12bits encoded bitstreams. 
+ * GRAY: + This format is required for some codecs (e.g., AV1) if monochrome + is supported + * 4:2:0 planar 12bits (I420_12LE) and 4:2:2 planar 8, 10, 12bits + formats (Y42B, I422_10LE, and I422_12LE) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2346> 2021-06-10 11:42:24 +0300 Sebastian Dröge <sebastian@centricular.com> @@ -394,7 +7256,26 @@ Previously we would've created a pad with a random pid that would become "sink_0", and then on a new muxer instance a pad "sink_0" and tsmux would've then failed because 0 is not a valid PID. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2358> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2318> + +2021-06-23 01:43:08 +0900 Seungha Yang <seungha@centricular.com> + + mfvideoenc: Enhance B-frame timestamp handling + When B-frame is enabled, encoder seems to adjust PTS of encoded sample + by using frame duration. + For instance, one observed timestamp pattern by using B-frame enabled + and 30fps stream is: + * Frame-1: MF pts 0:00.033333300 MF dts 0:00.000000000 + * Frame-2: MF pts 0:00.133333300 MF dts 0:00.033333300 + * Frame-3: MF pts 0:00.066666600 MF dts 0:00.066666600 + * Frame-4: MF pts 0:00.099999900 MF dts 0:00.100000000 + We can notice that the amount of PTS shift is frame duration and + Frame-4 exhibits PTS < DTS. + To compensate shifted timestamp, we should + calculate the timestamp offset and re-calculate DTS correspondingly. + Otherwise, total timeline of output stream will be shifted, and that + can cause time sync issue. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2354> 2021-06-10 11:36:43 +0300 Sebastian Dröge <sebastian@centricular.com> @@ -405,7 +7286,158 @@ was requested. 
Resetting is re-creating the internal muxer object and as such resetting the pid counter, so the next requested pad would get the same pid as the first requested pad which then leads to collisions. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2357> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2317> + +2021-06-22 02:34:18 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfvideoenc.cpp: + mfh264enc, mfh265enc: Set profile string to src caps + Set configured profile to src caps so that downstream can figure + out selected profile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2347> + +2021-04-21 16:24:00 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/webrtcdatachannel.h: + * gst-libs/gst/webrtc/datachannel.c: + * gst-libs/gst/webrtc/datachannel.h: + * gst-libs/gst/webrtc/webrtc-priv.h: + webrtc lib: Make the datachannel struct private + This will prevent any unsafe access. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2241> + +2021-04-21 16:19:41 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/dtlstransport.c: + * gst-libs/gst/webrtc/dtlstransport.h: + * gst-libs/gst/webrtc/webrtc-priv.h: + webrtc lib: Make the DTLSTransport struct private + This will prevent any unsafe access. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2241> + +2021-04-21 16:17:23 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/nicetransport.h: + * gst-libs/gst/webrtc/icetransport.c: + * gst-libs/gst/webrtc/icetransport.h: + * gst-libs/gst/webrtc/webrtc-priv.h: + webrtc lib: Make the icetransport struct private + This will prevent any unsafe access. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2241> + +2021-04-21 16:04:26 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/webrtc-priv.h: + webrtc lib: Make the rtpreceiver struct private + This will prevent any unsafe access. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2241> + +2021-04-21 16:00:57 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + * gst-libs/gst/webrtc/webrtc-priv.h: + webrtc lib: Make the rtpsender struct private + This will prevent any unsafe access. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2241> + +2021-04-21 16:00:34 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/webrtctransceiver.h: + * gst-libs/gst/webrtc/rtptransceiver.c: + * gst-libs/gst/webrtc/rtptransceiver.h: + * gst-libs/gst/webrtc/webrtc-priv.h: + webrtc lib: Make the transceiver struct private + This will prevent any unsafe access. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2241> + +2021-06-18 19:26:35 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/x265/gstx265enc.c: + x265enc: add negative DTS support + Use the same set_min_pts approach as x264enc. + Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/304 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2340> + +2021-06-17 20:10:35 +0900 Seungha Yang <seungha@centricular.com> + + * sys/decklink/gstdecklinkaudiosrc.cpp: + decklinkaudiosrc: Don't assume that stream time is always valid + As per SDK doc, IDeckLinkInputCallback::VideoInputFrameArrived + method might not provide video frame and it can be null. + In that case, given stream_time can be invalid. 
+ So, we should not try to convert GST_CLOCK_TIME_NONE + by using gst_clock_adjust_with_calibration() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2337> + +2021-06-14 13:16:30 -0400 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + transcoder: Fix usage of g_error_propagate + In the error callback we were propagating an error we were not owning + which is incorrect use of the API. + Also we were clearing a GError we already propagated which is wrong + as propagating gives ownership away. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2325> + +2021-06-14 13:13:24 -0400 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + transcoder: Add a missing object unlocking + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2325> + +2021-06-14 15:07:05 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/faad/gstfaad.c: + faad: fix typo in element documentation + seealso is now see_also + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2323> + +2021-06-17 20:17:14 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * tests/check/elements/msdkh264enc.c: + tests: msdkh264dec: Run test only if factory is available. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2336> + +2021-06-17 11:25:11 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/msdk/gstmsdkcontext.c: + msdk: Demote error log message to warning. + It is not an error that the available hardware doesn't support VA-API/MSDK. Just + none plugin features will be registered. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2336> + +2021-06-20 18:48:21 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11_private.h: + * gst-libs/gst/d3d11/gstd3d11bufferpool.cpp: + * gst-libs/gst/d3d11/gstd3d11device.cpp: + * gst-libs/gst/d3d11/gstd3d11format.cpp: + * gst-libs/gst/d3d11/gstd3d11memory.cpp: + * gst-libs/gst/d3d11/gstd3d11utils.cpp: + * gst-libs/gst/d3d11/meson.build: + libs: d3d11: Port to C++ + In general, C++ COM APIs are slightly less verbose and more readable + than C APIs. And C++ supports some helper methods + (smart pointer and C++ only macros for example) which are not allowed for C. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2343> + +2021-06-16 10:23:37 -0700 U. Artie Eoff <ullysses.a.eoff@intel.com> + + * sys/msdk/gstmsdk.c: + * sys/msdk/meson.build: + msdk: declare external dependencies + Track kernel and VA driver dependencies so gstreamer + will re-inspect the plugin if any of them change. + Also, do not blacklist the plugin if !msdk_is_available + since it could be a transient issue caused by one or + more external dependency issues (e.g. wrong/missing + driver specified, but corrected by user later on). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2335> 2021-06-17 01:00:33 +0900 Seungha Yang <seungha@centricular.com> @@ -419,7 +7451,50 @@ downstream if such parameter set NAL units don't exist in inband bitstream. Therefore, parse elements should re-send parameter set NAL units like the case of flush event. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2351> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2334> + +2021-06-16 10:31:13 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + vp8decoder: Drain the output queue on EOS/finish + The finish() virtual method was flushing the queue, instead push the + remaining buffers. It is not required to reset in finish() unlike + drain(). This a regression causing last frame to always be lost. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2333> + +2021-06-16 10:30:18 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + v4l2slvp8dec: Only ask for output delay once per negotiation + While it's technically possible to change it per frame, asking for + that every frame is not very useful. This mimic H264 decoder better. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2333> + +2021-06-16 16:56:14 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + va: Improve the default mapping between rt_format and video format. + We add 12 bits entries into this default mapping. And the old mapping + is not precise. For example, the NV12 should not be used as the default + mapping for VA_RT_FORMAT_YUV422 and VA_RT_FORMAT_YUV444, it is even not + a 422 or 444 format. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2332> + +2021-06-16 16:43:40 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvah265dec.c: + va: Add 12 bits rt_format setting in H265. + In order to support 12 bits format decoding, we need to add the + support for 12 bits rt_format in H265. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2332>
+
+2021-06-16 16:32:30 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvavideoformat.c:
+ va: Fix a typo in video format mapping.
+ GST_VIDEO_FORMAT_Y412_LE is a 4:4:4 format and so should be mapped
+ to VA_RT_YUV444_12 rt format.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2332>

2021-06-15 21:36:43 +0800 He Junyan <junyan.he@intel.com>

@@ -428,7 +7503,352 @@
 The GST_H265_PROFILE_MAIN_444_10 profile should be compatible with
 GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444_10, not the current
 GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_10.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2350>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2328>
+
+2020-12-15 18:11:08 +0800 Randy Li (ayaka) <ayaka@soulik.info>
+
+ * ext/wayland/gstwaylandsink.c:
+ * ext/wayland/gstwaylandsink.h:
+ * ext/wayland/wlwindow.c:
+ * ext/wayland/wlwindow.h:
+ waylandsink: prevent frame callback being released twice
+ For those using a context from the application, which
+ would be the embedded video case: if the frame callback
+ arrives at the same time as the window is finalizing,
+ a wayland proxy object would be destroyed twice, leading
+ to the refcount dropping below zero the second time, which can
+ trigger an abort() in wayland.
+ For the top-level window case, which has a direct connection
+ to the compositor, we can stop the message queue so that
+ the frame callback won't happen at the same time as the
+ window is finalizing, so we don't think that case is affected.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1883> + +2021-06-14 16:04:52 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphadecodebin.c: + * sys/v4l2codecs/gstv4l2codecalphadecodebin.c: + alphadecodebin: Fix stall due to QoS + alphacombine element is a simple element that assumes buffers are always + paired, or at least that missing buffers are signalled with a GAP. The QoS + implementation in the GstVideoDecoder base class allow decoders dropping + frames independently and that could lead to stall in alphacombine. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2326> + +2021-02-02 11:02:02 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/openjpeg/gstopenjpegenc.c: + * gst-libs/gst/codecparsers/gstjpeg2000sampling.c: + * gst-libs/gst/codecparsers/gstjpeg2000sampling.h: + * gst/videoparsers/gstjpeg2000parse.c: + jpeg2000parse, openjpeg: add support for YCrCb 4:1:1 sampling + Add YCrCb 4:1:1 support in openjpeg elements + and fix in jpeg2000parse the YCrCb 4:1:0 support + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2321> + +2021-06-10 23:35:38 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Don't print error log when no DPB texture is available + ... but we are flushing. The condition is quite expected situation + when pipeline is in the middle of seeking operation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2320> + +2021-05-23 18:17:38 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * sys/msdk/gstmsdkenc.c: + msdkenc: add extbrc support in ext-coding-props property + The SDK can support external bitrate control [1], so add extbrc + to enable this feature. + [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxextcodingoption2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2139> + +2021-05-23 18:13:25 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/msdk/gstmsdkenc.c: + * sys/msdk/gstmsdkenc.h: + msdkenc: add ext-coding-props for external coding options + This property supports passing multiple parameters using GstStructure. + Example usage: + ext-coding-props="props,key0=value0,key1=value1,..." + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2139> + +2021-06-05 21:59:50 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: Fix the H265 poc out of order warning. + We always get a warning such as: + h265decoder gsth265decoder.c:1432:gst_h265_decoder_do_output_picture: \ + <vah265dec0> Outputting out of order 255 -> 0, likely a broken stream + in H265 decoder. + The problem is caused because we fail to reset the last_output_poc when + we get IDR and BLA. The incoming IDR and BLA frame already bump all the + frames in the DPB, but we forget to reset the last_output_poc, which + make the POC out of order and generate the warning all the time. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2294> + +2021-06-10 01:09:44 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2sink.c: + wasapi2sink: Fix ringbuffer object leak + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2315> + +2021-06-10 00:24:24 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + wasapi2ringbuffer: Implement GstAudioRingBuffer::pause() + WASAPI doesn't support PAUSE so it's not different from Stop(). + When pipeline is in paused state, we don't need to waste CPU resource + for feeding silent buffers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2315> + +2021-06-07 01:49:26 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11_fwd.h: + * gst-libs/gst/d3d11/gstd3d11memory.c: + * gst-libs/gst/d3d11/gstd3d11memory.h: + d3d11memory: Implement GstAllocator::mem_copy method + There are a few places which require deep copy + (basesink on drain for example). Also this implementation can be + useful for future use case. + One probable future use case is that copying DPB texture to + another texture for in-place transform since our DPB texture is never + writable, and therefore copying is unavoidable. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2308> + +2021-06-08 21:35:20 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + * sys/wasapi2/gstwasapi2client.h: + * sys/wasapi2/gstwasapi2device.c: + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + * sys/wasapi2/gstwasapi2src.c: + * sys/wasapi2/gstwasapi2util.c: + * sys/wasapi2/gstwasapi2util.h: + wasapi2src: Add support for loopback recording + ... and add various device error handling. + This loopback implementation is functionally identical to that of wasapisrc. 
+ When it's enabled, wasapi2src will read data from render device instead of + capture device. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2311> + +2021-05-10 20:45:28 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + * sys/wasapi2/gstwasapi2client.h: + * sys/wasapi2/gstwasapi2device.c: + * sys/wasapi2/gstwasapi2ringbuffer.cpp: + * sys/wasapi2/gstwasapi2ringbuffer.h: + * sys/wasapi2/gstwasapi2sink.c: + * sys/wasapi2/gstwasapi2sink.h: + * sys/wasapi2/gstwasapi2src.c: + * sys/wasapi2/gstwasapi2src.h: + * sys/wasapi2/gstwasapi2util.c: + * sys/wasapi2/gstwasapi2util.h: + * sys/wasapi2/meson.build: + * sys/wasapi2/plugin.c: + wasapi2: Rewrite plugin and implement audioringbuffer subclass + ... based on MediaFoundation work queue API. + By this commit, wasapi2 plugin will make use of pull mode scheduling + with audioringbuffer subclass. + There are several drawbacks of audiosrc/audiosink subclassing + (not audiobasesrc/audiobasesink) for WASAPI API, which are: + * audiosrc/audiosink classes try to set high priority to + read/write thread via MMCSS (Multimedia Class Scheduler Service) + but it's not allowed in case of UWP application. + In order to use MMCSS in UWP, application should use MediaFoundation + work queue indirectly. + Since audiosrc/audiosink scheduling model is not compatible with + MediaFoundation's work queue model, audioringbuffer subclassing + is required. + * WASAPI capture device might report larger packet size than expected + (i.e., larger frames we can read than expected frame size per period). + Meanwhile, in any case, application should drain all packets at that moment. + In order to handle the case, wasapi/wasapi2 plugins were making use of + GstAdapter which is obviously sub-optimal because it requires additional + memory allocation and copy. + By implementing audioringbuffer subclassing, we can avoid such inefficiency. 
+ In this commit, all the device read/write operations will be moved + to newly implemented wasapi2ringbuffer class and + existing wasapi2client class will take care of device enumeration + and activation parts only. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2306> + +2021-06-06 17:32:59 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + wasapi2: Use AUDCLNT_STREAMFLAGS_NOPERSIST flag + ... so that we can disable persistence of our mute/volume status + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2306> + +2021-06-06 17:28:56 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2src.c: + wasapi2src: Fix doc typo + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2306> + +2021-05-26 00:12:59 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/wpe-extension/gstwpebusmsgforwarder.c: + wpe: Rename `undeserializable_type` to `not_deserializable_type` + Making it more readable. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-25 23:58:27 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/gstwpesrcbin.cpp: + * ext/wpe/wpe-extension/gstwpebusmsgforwarder.c: + wpe: Make forwarded messages layout more like GstBinForwaded messages + Making it look more like how we do this kind of things in other places. + See: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252#note_927653 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 10:52:01 -0400 Thibault Saunier <tsaunier@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/wpe/gstwpesrcbin.cpp: + * tests/examples/wpe/wpe.c: + wpe: Make wpesrc!video pad an always pad + There should always be a `video` pad no matter what. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 10:31:53 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/wpe-extension/gstwpeextension.c: + wpe: Remove unused env var + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 10:31:37 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/wpe-extension/gstwpeaudiosink.c: + wpe: Fix atomic usage + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 10:29:11 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/gstwpesrcbin.cpp: + wpe: Add a note able requiring tracing subsystem for message forwarding + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 10:18:21 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/wpe-extension/gstwpeaudiosink.c: + wpe: Fix check on whether MEMFD_CREATE is available + The ordering of the ifdef was wrong + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 10:13:01 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/gstwpesrcbin.cpp: + * ext/wpe/wpe-extension/gstwpebusmsgforwarder.c: + wpe: Plug a leak + We were freeing after returning + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-21 09:54:33 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/WPEThreadedView.cpp: + Revert "wpe: Properly respect LIBGL_ALWAYS_SOFTWARE" + This causes issues I didn't see: + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252#note_927633 + Let's just tell people to use capsfilter to force software rendering in + `wpesrc` for now. + The intent was to allow forcing it easily in playbin2 for the CI, but + we will do it some other way and see when time comes. + This reverts commit 9415106b029e5469ca28d882dc46ecc38786d4c9. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2273> + +2021-05-28 15:18:53 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/debugutils/debugutilsbad.c: + * gst/debugutils/gstdebugutilsbadelements.h: + * gst/debugutils/gstvideocodectestsink.c: + * gst/debugutils/gstvideocodectestsink.h: + * gst/debugutils/meson.build: + debugutils: Introduce videocodectestsink + This is a video specific sink used to test video CODEC conformance. This is similar + to a combination of filesink and testsink, but will skip over any type of + padding that GStreamer Video library introduces. This is needed in order to obtain the + correct checksum or raw yuv data. + This element currently support writing back non-padded raw I420 through the + location property and will calculate an MD5 and post it as an element message + of type conformance/checksum. More output format or checksum type could be + added in the future as needed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2287> + +2021-06-04 01:44:47 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/vulkan/gstvkinstance.c: + vkinstance: Don't abort in case that system has no available vulkan device + Specification doesn't have restriction that returned + pPhysicalDeviceCount value must be non-zero + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2304> + +2021-06-03 11:24:53 +0200 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Clear all streams when rewinding + This avoids sending out partial invalid data downstream which could cause + decoders (ex: `dvdlpmdec`) to error out. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2301> + +2021-05-29 01:48:15 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.c: + * gst-libs/gst/d3d11/gstd3d11format.h: + * sys/d3d11/gstd3d11converter.cpp: + * tests/check/elements/d3d11colorconvert.c: + d3d11: Add support for YV12 and NV21 formats + Handle UV swapped 4:2:0 8bits formats + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2290> + +2021-06-03 18:28:26 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11window_win32.cpp: + d3d11window_win32: Ensure closing internal HWND from window thread + Window handle must be closed from its own message thread + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2302> + +2021-06-03 10:31:39 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + doc: Update cache after pixel format reorder + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2303> + +2021-06-03 10:03:19 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/msdk/gstmsdk.c: + * sys/msdk/gstmsdkav1dec.c: + * sys/msdk/gstmsdkh264dec.c: + * sys/msdk/gstmsdkh264enc.c: + * sys/msdk/gstmsdkh265dec.c: + * sys/msdk/gstmsdkh265enc.c: + * sys/msdk/gstmsdkmjpegdec.c: + * sys/msdk/gstmsdkmjpegenc.c: + * sys/msdk/gstmsdkmpeg2dec.c: + * sys/msdk/gstmsdkmpeg2enc.c: + * sys/msdk/gstmsdkvc1dec.c: + * sys/msdk/gstmsdkvp8dec.c: + * sys/msdk/gstmsdkvp9dec.c: + * sys/msdk/gstmsdkvp9enc.c: + * sys/msdk/gstmsdkvpp.c: + doc: add the msdk elements + Supported elements: + msdkav1dec, msdkh264dec, msdkh264enc, msdkh265dec, msdkh265enc, + msdkmjpegdec, msdkmjpegenc, msdkmpeg2dec, msdkmpeg2enc, msdkvc1dec, + msdkvp8dec, msdkvp9dec, msdkvp9enc, msdkvpp. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2296>

2021-06-02 14:17:13 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>

@@ -439,14 +7859,190 @@
 event ever, not just on mutter.
 This fixes the following warning when using mutter compositor (aka gnome-shell):
 waylandsink wlwindow.c:304:gst_wl_window_new_toplevel: The compositor did not send configure event.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2300>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2299>
+
+2021-06-02 11:26:41 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * tests/check/elements/camerabin.c:
+ test: camerabin: Fix buffer size calculation
+ We were assuming that the GStreamer size for RGB (24bit packed) data was width x
+ height x 3, but GStreamer defaults to a specific alignment. Use the GstVideoInfo API
+ in order to obtain the buffer size.
+ This fixes a failure seen when trying to merge: https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/merge_requests/998
+ which makes us negotiate 1x1 instead of 16x16 in this test.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2297>
+
+2021-05-31 17:51:58 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/codecs/gsth265decoder.c:
+ * gst-libs/gst/codecs/gsth265picture.c:
+ * gst-libs/gst/codecs/gsth265picture.h:
+ codecs: Integrate H265 DPB full check into need_bump().
+ The current DPB check of H265 is not quite correct. The current frame
+ is already in the DPB when we check whether the DPB is full.
+ For example, the DPB max size is 16 and we have 15 ref frames in the
+ DPB, so gst_h265_dpb_delete_unused() cleans out nothing, and then, plus
+ the current frame, the DPB is 16. This causes an error return, but in
+ fact, the stream is correct.
+ We now integrate the DPB full check into the need_bump() function.
+ We add the current frame into the DPB and then check whether the picture
+ num is bigger than max_num_pics of the DPB (which means there is no room for
+ the current picture). If true, we bump the DPB immediately.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2291>
+
+2021-06-01 15:28:57 +0100 Tim-Philipp Müller <tim@centricular.com>
+
+ * meson.build:
+ Back to development
+
+=== release 1.19.1 ===
+
+2021-06-01 00:14:22 +0100 Tim-Philipp Müller <tim@centricular.com>
+
+ * ChangeLog:
+ * NEWS:
+ * README:
+ * RELEASE:
+ * gst-plugins-bad.doap:
+ * meson.build:
+ Release 1.19.1
+
+2021-04-08 10:11:52 -0300 Daniel Almeida <daniel.almeida@collabora.com>
+
+ * sys/v4l2codecs/gstv4l2codecvp8dec.c:
+ v4l2codecs: gstv4l2codecsvp8dec: implement a render delay
+ The v4l2 backend supports delayed output for performance reasons.
+ It is then possible to use render delays to queue multiple requests
+ simultaneously, thus increasing performance.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2150>
+
+2021-04-07 18:24:27 -0300 Daniel Almeida <daniel.almeida@collabora.com>
+
+ * gst-libs/gst/codecs/gstvp8decoder.c:
+ * gst-libs/gst/codecs/gstvp8decoder.h:
+ codecs: gstvp8decoder: add support for render delay
+ Some decoding APIs support delayed output for performance reasons.
+ One example would be to request decoding for multiple frames and
+ then query for the oldest frame in the output queue.
+ This also increases throughput for transcoding and improves seek
+ performance when supported by the underlying backend.
+ Introduce support in the vp8 base class, so that backends that
+ support render delays can actually implement it.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2150>
+
+2021-05-17 10:49:41 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtcbin: Stop transceivers update after first SDP error
+ When invalid SDP is supplied, _update_transceiver_from_sdp_media() sets the
+ GError, so it is invalid to continue any further SDP processing; we have to exit
+ early when the first error is raised.
+ Fixes #1595
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2254>
+
+2021-05-28 23:21:19 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11device.c:
+ * sys/d3d11/gstd3d11decoder.cpp:
+ d3d11: Suppress some warning logs
+ We use gst_d3d11_device_new() for enumerating devices, which can
+ fail for some reason. Don't print a warning log in that case.
+ The decoding capability check is the same case as well.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2286>
+
+2021-05-28 17:05:02 -0400 Roman Sivriver <roman@rsiv.net>
+
+ * ext/hls/gsthlssink2.c:
+ hlssink2: Initialize debug category to prevent an assert with `fatal-warnings`
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2288>
+
+2021-05-21 20:02:53 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.cpp:
+ d3d11compositor: Reuse converter on alpha update
+ ...
instead of creating a new converter object
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2276>
+
+2021-05-18 01:24:29 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11convert.cpp:
+ * sys/d3d11/gstd3d11converter.cpp:
+ * sys/d3d11/gstd3d11converter.h:
+ * sys/d3d11/gstd3d11overlaycompositor.cpp:
+ * sys/d3d11/gstd3d11shader.cpp:
+ * sys/d3d11/gstd3d11shader.h:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window_dummy.cpp:
+ d3d11converter: Introduce config to be extensible
+ Add a config argument like that of GstVideoConverter so that
+ we can add more options without modifying existing methods
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2276>
+
+2021-05-21 21:30:42 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11converter.cpp:
+ * sys/d3d11/gstd3d11overlaycompositor.cpp:
+ * sys/d3d11/gstd3d11shader.cpp:
+ * sys/d3d11/gstd3d11shader.h:
+ d3d11shader: Don't hold state object in GstD3D11Quad
+ We might want to update the state object
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2276>
+
+2021-05-27 16:22:42 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com>
+
+ * sys/v4l2codecs/gstv4l2codech264dec.c:
+ * sys/v4l2codecs/gstv4l2codech264dec.h:
+ * sys/v4l2codecs/gstv4l2codecvp8dec.c:
+ * sys/v4l2codecs/gstv4l2codecvp8dec.h:
+ * sys/v4l2codecs/plugin.c:
+ v4lcodecs: Validate src formats
+ This adds src format validation, which avoids registering elements for
+ drivers whose src formats we don't support. This also special-cases the
+ AlphaDecodeBin wrapper, as we know that the alphacombine element
+ only supports I420 and NV12 for now.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2272>
+
+2021-05-22 16:29:09 -0300 Daniel Almeida <daniel.almeida@collabora.com>
+
+ * sys/v4l2codecs/gstv4l2codecalphadecodebin.c:
+ * sys/v4l2codecs/gstv4l2codecalphadecodebin.h:
+ * sys/v4l2codecs/gstv4l2codech264dec.c:
+ * sys/v4l2codecs/gstv4l2codecvp8dec.c:
+ * sys/v4l2codecs/gstv4l2decoder.c:
+ * sys/v4l2codecs/gstv4l2decoder.h:
+ * sys/v4l2codecs/meson.build:
+ v4l2codecs: add wrappers for alpha decode
+ codecalpha is a new plugin introduced to support VP8/VP9 alpha as
+ defined in the WebM and Matroska specifications. It splits the stream
+ into two streams, one for the alpha and one for the actual content,
+ then it decodes them separately with vpxdec and finally combines the
+ results as A420 or AV12 (i.e. YUV + an extra alpha plane).
+ The workflow above is set up by means of a bin, gstcodecalphabin.
+ This patch reproduces the same workflow in the v4l2codecs namespace,
+ thus using the new v4l2 stateless decoders for hardware acceleration.
+ This is so we can register the new alpha decode elements only if the
+ hardware produces formats we support, i.e. I420 or NV12 for now.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2272>
+
+2021-05-19 18:45:19 -0300 Daniel Almeida <daniel.almeida@collabora.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/codecalpha/gstalphacombine.c:
+ codecalpha: alphacombine: add support for NV12/AV12
+ Alpha combine works by appending the GstMemory for the alpha channel
+ to the GstBuffer containing I420, thereby pushing A420 on its src pad.
+ Add support for the same workflow for NV12, thereby producing the
+ recently introduced AV12 format (NV12 + Alpha).
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2277>

2021-05-25 20:21:34 +0900 Seungha Yang <seungha@centricular.com>

 * gst/interlace/gstinterlace.c:
 interlace: Don't set field-order field for progressive caps
 That would cause negotiation issue
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2293>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2282>

2021-05-25 19:47:28 +0900 Seungha Yang <seungha@centricular.com>

@@ -456,7 +8052,193 @@
 not resulting format. Fixing negotiation error with
 gst-launch-1.0 videotestsrc ! video/x-raw,framerate=25/1 ! interlace field-pattern=0 ! fakesink
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2293>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2282>
+
+2021-05-26 16:37:06 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.cpp:
+ d3d11compositor: Fix caps update handling
+ New caps are applied only once the previous buffer, if any, has been consumed.
+ So the latest given caps might not correspond to the current buffer
+ to be handled.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2284>
+
+2021-05-20 13:47:11 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * gst-libs/gst/va/gstvadisplay.h:
+ * gst-libs/gst/va/gstvadisplay_drm.h:
+ * gst-libs/gst/va/gstvadisplay_wrapped.h:
+ libs: va: display: Handle auto clean up macros.
+ Add G_DEFINE_AUTOPTR_CLEANUP_FUNC macro for display classes, so auto
+ pointers are available to users.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2269>
+
+2021-05-17 19:06:34 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Add current picture into reference list for SCC.
+ The current picture is not in the DPB, so we need to add it manually
+ to the reference list when SCC is enabled.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2255>
+
+2021-05-21 23:47:14 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Set Screen Content extension (SCC) for picture parameters.
+ We already declare the support of HEVC screen content extension profiles
+ in the profile mapping list, but we fail to generate the correct VA picture
+ parameter buffers. This may cause a GPU hang.
+ We need to fill the buffer of VAPictureParameterBufferHEVCExtension correctly.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2255>
+
+2021-05-17 17:47:07 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Use get_profile_from_sps() to recognize the profile.
+ The function gst_h265_get_profile_from_sps() is better than
+ gst_h265_profile_tier_level_get_profile() for recognizing
+ the profile of the stream, because it considers compatibility.
+ It is also used by h265parse to recognize the profile. So it is
+ better to keep the same behaviour as the parser and other decoders.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2255>
+
+2021-05-21 23:21:12 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Set range extension for picture and slice parameters.
+ We already declare the support of HEVC range extension profiles in
+ the profile mapping list, but we fail to generate the correct VA
+ picture and slice parameter buffers. This may cause a GPU hang.
+ We need to fill the buffers of VAPictureParameterBufferHEVCExtension
+ and VASliceParameterBufferHEVCExtension correctly.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2255>
+
+2021-05-24 18:18:52 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Use picture and slice extension parameters.
+ This is a transitional commit to later implement extended and screen
+ profiles.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2255>
+
+2021-05-24 18:34:25 +0100 Tim-Philipp Müller <tim@centricular.com>
+
+ * ext/aom/gstav1enc.c:
+ * ext/dash/gstdashdemux.c:
+ * ext/dtls/gstdtlsdec.c:
+ * ext/dtls/gstdtlsenc.c:
+ * ext/fdkaac/gstfdkaacenc.c:
+ * ext/sctp/gstsctpenc.c:
+ * ext/sndfile/gstsfdec.c:
+ * gst/videoparsers/gstav1parse.c:
+ * gst/videoparsers/gstmpeg4videoparse.c:
+ * tests/check/elements/kate.c:
+ * tests/check/elements/pcapparse.c:
+ Use gst_buffer_new_memdup()
+ Update for function rename in core.
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/827
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2281>
+
+2021-05-22 18:05:18 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ * ext/wpe/WPEThreadedView.h:
+ * ext/wpe/gstwpesrcbin.cpp:
+ * ext/wpe/gstwpevideosrc.cpp:
+ * ext/wpe/meson.build:
+ wpe: Bump wpebackend-fdo version requirement to 1.8
+ Debian bullseye has this version already, and this allows us to get rid of many
+ ifdefs. The mouse scroll handling is actually functional now as well.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2278>
+
+2021-05-23 16:10:53 +0100 Tim-Philipp Müller <tim@centricular.com>
+
+ * ext/aom/gstav1enc.c:
+ * ext/dash/gstdashdemux.c:
+ * ext/dtls/gstdtlsdec.c:
+ * ext/dtls/gstdtlsenc.c:
+ * ext/fdkaac/gstfdkaacenc.c:
+ * ext/sctp/gstsctpenc.c:
+ * ext/sndfile/gstsfdec.c:
+ * gst/videoparsers/gstav1parse.c:
+ * gst/videoparsers/gstmpeg4videoparse.c:
+ * tests/check/elements/kate.c:
+ * tests/check/elements/pcapparse.c:
+ Use new gst_buffer_new_copy()
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2279>
+
+2021-05-21 15:18:21 -0300 Daniel Almeida <daniel.almeida@collabora.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ doc: update gst_plugins_cache.json
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1152>
+
+2021-05-14 20:22:26 +0200 Jakub Adam <jakub.adam@collabora.com>
+
+ * sys/winscreencap/dxgicapture.c:
+ * sys/winscreencap/dxgicapture.h:
+ * sys/winscreencap/gstdxgiscreencapsrc.c:
+ dxgiscreencapsrc: renegotiate caps on resolution change
+ When the desktop gets resized, recreate the textures and renegotiate the
+ source caps with the updated video dimensions.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2249>
+
+2021-05-18 14:09:01 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvah265dec.c:
+ va: h265dec: Set LastSliceOfPic for multi-sliced frames.
+ VA-API HEVC decoding needs to know which is the last slice of a
+ picture, but slices are processed sequentially, so we don't know the
+ last slice until all the slices have already been pushed into the
+ VABuffer array.
+ In order to mark the last slice, they are pushed into the
+ VABuffer array with a delay of one slice: the first slice is
+ held, and when the second slice comes, the first one is pushed
+ while holding the second, and so on.
Finally, at end_picture(),
+ the last slice is marked and pushed into the array.
+ Co-author: Victor Jaquez <vjaquez@igalia.com>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2246>
+
+2021-05-20 17:03:15 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11desktopdup.cpp:
+ * sys/d3d11/gstd3d11desktopdup.h:
+ * sys/d3d11/gstd3d11desktopdupsrc.cpp:
+ * sys/d3d11/gstd3d11desktopdupsrc.h:
+ d3d11desktopdupsrc: Add support for desktop size/rotation mode change
+ Re-negotiates with updated size on desktop size
+ (i.e., resolution, scaling factor), and rotation mode change
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2268>
+
+2021-05-20 10:09:57 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com>
+
+ * gst/mpegtsmux/tsmux/tsmux.c:
+ mpegtsmux: Fixup program array indices after stream removal
+ Each stream stores the `program_array_index` of its position in its
+ program's `streams` array. When we remove a stream from this array, we
+ need to correct the `program_array_index` of all streams that were
+ backshifted by the removal.
+ Also extract the removal into a new function and add some more safety
+ checks.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2266>
+
+2021-05-20 18:49:01 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11memory.c:
+ d3d11memory: Protect map and unmap with device lock
+ We should lock the memory object with gst_d3d11_device_lock() first;
+ then GST_D3D11_MEMORY_LOCK() needs to be used.
+ One observed deadlock case is that:
+ - Thread A takes the d3d11 device lock
+ - At the same time, Thread B tries a CPU map of the d3d11memory, which requires
+ the d3d11 device lock as well, but it's already taken by Thread A.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2267>
+
+2021-05-20 18:38:17 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11memory.c:
+ d3d11memory: Add trace log for debugging locking thread
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2267>

2021-05-20 15:39:39 +0900 Seungha Yang <seungha@centricular.com>

@@ -464,7 +8246,7 @@
 audiolatency: Drop incoming downstream sticky events
 stream-start, caps, and segment events will be pushed by the internal
 audiotestsrc element.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2271>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2265>

2021-05-20 15:28:13 +0900 Seungha Yang <seungha@centricular.com>

@@ -473,7 +8255,267 @@
 The expected use case of the audiolatency element is to mimic an audio capture
 device, which is most likely a live source. So the audiolatency element should
 use live mode as well.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2271>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2265>
+
+2021-05-19 18:48:29 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/gstwpesrcbin.cpp:
+ * ext/wpe/meson.build:
+ wpe: Bump WPE dependency to 2.28
+ The new audio feature depends on WPE 2.28 so we should just bump our
+ requirement to that.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2264>
+
+2021-05-20 00:51:08 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11compositor.h:
+ * sys/d3d11/gstd3d11compositorbin.cpp:
+ d3d11compositor: Fix missing D3D11 prefix
+ Fix typo, no functional change
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2262>
+
+2021-05-18 17:49:23 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/wpe/gstwpe.cpp:
+ * ext/wpe/gstwpesrcbin.cpp:
+ wpe: Update doc cache
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252>
+
+2021-04-21 23:14:13 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ wpe: Properly respect LIBGL_ALWAYS_SOFTWARE
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252>
+
+2021-05-01 21:48:23 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ * ext/wpe/gstwpesrcbin.cpp:
+ * ext/wpe/wpe-extension/gstwpebusmsgforwarder.c:
+ * ext/wpe/wpe-extension/gstwpeextension.c:
+ * ext/wpe/wpe-extension/gstwpeextension.h:
+ * ext/wpe/wpe-extension/meson.build:
+ wpe: Relay messages from WPE internal pipelines
+ It is based on a tracer, as it allows us to very easily get
+ every message that is posted on any bus inside the process.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252>
+
+2021-04-19 20:46:46 -0400 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ * ext/wpe/WPEThreadedView.h:
+ * ext/wpe/gstwpe.cpp:
+ * ext/wpe/gstwpe.h:
+ * ext/wpe/gstwpesrcbin.cpp:
+ * ext/wpe/gstwpesrcbin.h:
+ * ext/wpe/gstwpevideosrc.cpp:
+ * ext/wpe/meson.build:
+ * ext/wpe/wpe-extension/gstwpeaudiosink.c:
+ * ext/wpe/wpe-extension/gstwpeextension.c:
+ * ext/wpe/wpe-extension/gstwpeextension.h:
+ * ext/wpe/wpe-extension/meson.build:
+ wpe: Base wpe audio implementation on a web extension
+ This makes the implementation simpler and enables us to map
+ webviews and audio streams much more easily
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252>
+
+2019-12-08 13:16:38 +0000 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ wpe: Enable WebAudio
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252>
+
+2019-12-08 11:49:20 +0000 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ * ext/wpe/WPEThreadedView.h:
+ * ext/wpe/gstwpe-private.h:
+ * ext/wpe/gstwpe.cpp:
+ * ext/wpe/gstwpesrcbin.cpp:
+ * ext/wpe/gstwpesrcbin.h:
+ * ext/wpe/gstwpevideosrc.cpp:
+ * ext/wpe/meson.build:
+ * tests/examples/meson.build:
+ * tests/examples/wpe/meson.build:
+ * tests/examples/wpe/wpe.c:
+ wpe: Implement audio support
+ The wpesrc bin now exposes "sometimes" audio src pads, one for every PCM audio
+ stream created by WPEWebKit.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252> + +2021-03-10 17:27:52 -0300 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/WPEThreadedView.h: + * ext/wpe/gstwpe.cpp: + * ext/wpe/gstwpesrcbin.cpp: + * ext/wpe/gstwpesrcbin.h: + * ext/wpe/gstwpevideosrc.cpp: + * ext/wpe/gstwpevideosrc.h: + * ext/wpe/meson.build: + wpe: Move wpesrc to wpevideosrc and add a wrapper bin `wpesrc` + Currently the bin contains a single element but we are going + to implement audio support and expose extra pads for audio + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2252> + +2021-05-18 00:43:23 -0400 Doug Nazar <nazard@nazar.ca> + + * ext/sctp/gstsctpenc.c: + sctp: Ensure pad is still a child of element before removal + During pipeline shutdown there are several competing paths to remove + pads. Avoids tests failing due to: + Unexpected critical/warning: Padname '':sink_1 does not belong to element sctpenc1 when removing + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2256> + +2021-05-17 09:13:28 -0400 Doug Nazar <nazard@nazar.ca> + + * ext/sctp/gstsctpdec.c: + sctp: Fix race of pad removal during reset/stop + Both reset & stop remove existing pads. Can result in warning from + gst_element_remove_pad(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2256> + +2021-05-17 09:11:54 -0400 Doug Nazar <nazard@nazar.ca> + + * ext/webrtc/webrtcdatachannel.c: + webrtcbin: Fix race bringing up sctp data channel + Notifying before pads are linked can cause the stream to fail to start. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2256>
+
+2021-05-13 21:11:30 +1000 Matthew Waters <matthew@centricular.com>
+
+ * ext/webrtc/gstwebrtcbin.c:
+ * ext/webrtc/gstwebrtcice.c:
+ webrtcbin: advertise harder the rtcp-mux-only requirement
+ And ignore rtcp ICE candidates
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2239>
+
+2021-05-14 10:47:05 -0500 Sid Sethupathi <sid.sethupathi@gmail.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/webrtc/gstwebrtcbin.c:
+ webrtcbin: update default jb latency docs
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2242>
+
+2021-05-18 16:38:04 -0400 Doug Nazar <nazard@nazar.ca>
+
+ * ext/dtls/gstdtlsenc.c:
+ dtls: Let sender know when we are flushing
+ Prevents an endless loop during shutdown where we end up sending 0 bytes.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2229>
+
+2021-05-18 16:31:47 -0400 Doug Nazar <nazard@nazar.ca>
+
+ * ext/dtls/gstdtlsconnection.c:
+ * ext/dtls/gstdtlsconnection.h:
+ dtls: Add ability to set custom GstFlowReturn on callback error
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2229>
+
+2021-05-18 20:26:38 -0400 Olivier Crête <olivier.crete@collabora.com>
+
+ * ext/webrtc/transportsendbin.c:
+ * ext/webrtc/transportsendbin.h:
+ webrtc: Remove redundant context object in transportsendbin
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2260>
+
+2021-05-18 20:18:28 -0400 Olivier Crête <olivier.crete@collabora.com>
+
+ * ext/webrtc/transportsendbin.c:
+ * ext/webrtc/transportsendbin.h:
+ webrtc: Wait until ICE is connected to start DTLS handshake process
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2260>
+
+2021-05-18 18:29:16 -0400 Olivier Crête <olivier.crete@collabora.com>
+
+ * ext/webrtc/transportsendbin.c:
+ *
ext/webrtc/transportsendbin.h:
+ webrtcbin: Remove pad probe on nicesink
+ This pad probe is no longer necessary; libnice now drops
+ all buffers before the stream is connected. This pad probe
+ also caused deadlocks in some situations.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2260>
+
+2021-05-17 20:59:19 -0400 Olivier Crête <olivier.crete@collabora.com>
+
+ * ext/kate/gstkatedec.c:
+ * ext/kate/gstkateenc.c:
+ * ext/kate/gstkateparse.c:
+ * ext/kate/gstkatetag.c:
+ * ext/kate/gstkatetiger.c:
+ kate: Initialize debug categories
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2258>
+
+2021-05-13 10:27:49 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * gst-libs/gst/va/gstvadisplay.c:
+ * gst-libs/gst/va/gstvadisplay.h:
+ * gst-libs/gst/va/gstvadisplay_drm.c:
+ * gst-libs/gst/va/gstvadisplay_wrapped.c:
+ libs: va: Documentation and annotations.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2196>
+
+2021-05-07 17:05:38 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/va/meson.build:
+ * tests/examples/va/main.c:
+ * tests/examples/va/meson.build:
+ * tests/examples/va/multiple-vpp.c:
+ examples: va: Update the VA examples because of the new va lib.
+ Because we introduced the new va lib, the va examples need to include
+ new header files and link against more libraries.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2196>
+
+2021-05-13 18:46:21 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * gst-libs/gst/va/gstvadisplay_wrapped.c:
+ * gst-libs/gst/va/gstvadisplay_wrapped.h:
+ * sys/va/gstvautils.c:
+ libs: va: display_wrapper: Use gpointer for VADisplay.
+ In order to be coherent across the whole implementation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2196>
+
+2021-05-06 18:23:23 +0800 He Junyan <junyan.he@intel.com>
+
+ * gst-libs/gst/meson.build:
+ * gst-libs/gst/va/gstvadisplay.c:
+ * gst-libs/gst/va/gstvadisplay.h:
+ * gst-libs/gst/va/gstvadisplay_drm.c:
+ * gst-libs/gst/va/gstvadisplay_drm.h:
+ * gst-libs/gst/va/gstvadisplay_wrapped.c:
+ * gst-libs/gst/va/gstvadisplay_wrapped.h:
+ * gst-libs/gst/va/meson.build:
+ * gst-libs/gst/va/va-prelude.h:
+ * gst-libs/gst/va/va_fwd.h:
+ * sys/va/gstvaallocator.h:
+ * sys/va/gstvacaps.c:
+ * sys/va/gstvacaps.h:
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ * sys/va/gstvadevice.c:
+ * sys/va/gstvadevice.h:
+ * sys/va/gstvadisplay.h:
+ * sys/va/gstvadisplay_priv.c:
+ * sys/va/gstvadisplay_priv.h:
+ * sys/va/gstvafilter.c:
+ * sys/va/gstvafilter.h:
+ * sys/va/gstvautils.c:
+ * sys/va/gstvautils.h:
+ * sys/va/gstvavpp.c:
+ * sys/va/meson.build:
+ libs: va: Move the VA common logic as a lib.
+ The VA acceleration now has more usage on Linux-like platforms,
+ such as the MSDK. The different plugins based on the VA acceleration
+ need to share some common logic and types. We now move the display
+ related functions and types into a common va lib.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2196>
+
+2021-05-17 11:42:07 +0800 mkba <mengker1031@outlook.com>
+
+ * sys/msdk/gstmsdkh265enc.c:
+ msdk: add profile main-still-picture for hevc encoder
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2253>

2021-05-15 00:39:57 +0900 Seungha Yang <seungha@centricular.com>

@@ -483,13 +8525,532 @@
 buffer size of progressive and interleaved formats could be different
 because we are rounding up the height of all planes of an interlaced frame to
 be a multiple of two.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2270>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2244>
+
+2021-03-01 12:09:43 +0800 Haihao Xiang <haihao.xiang@intel.com>
+
+ * sys/msdk/gstmsdkcontext.c:
+ msdk: use MFXJoinSession() to join the parent and child sessions
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1503>
+
+2021-02-04 15:27:13 +0800 Haihao Xiang <haihao.xiang@intel.com>
+
+ * sys/msdk/gstmsdkcontext.c:
+ * sys/msdk/msdk.c:
+ * sys/msdk/msdk.h:
+ msdk: use a new method to create mfx session when using oneVPL dispatcher
+ In oneVPL, MFXLoad() and MFXCreateSession() are required to create a
+ workable mfx session[1]
+ [1] https://spec.oneapi.com/versions/latest/elements/oneVPL/source/programming_guide/VPL_prg_session.html#onevpl-dispatcher
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1503>
+
+2021-02-18 13:38:25 +0800 Haihao Xiang <haihao.xiang@intel.com>
+
+ * meson_options.txt:
+ * sys/msdk/meson.build:
+ msdk: allow users to build this plugin against MFX version 2.2+ (oneVPL)
+ Intel oneVPL SDK (oneVPL) is a successor to Intel Media SDK (MSDK)[1].
+ Users may use -Dmfx_api=MSDK or -Dmfx_api=oneVPL to specify the required
+ SDK when building this plugin.
If the SDK is not specified, meson will + try MSDK firstly, then oneVPL if MSDK is not available + Version 2.2+ is required in this patch because pkg-config file was not + provided officially before version 2.2 + [1]https://spec.oneapi.com/versions/latest/elements/oneVPL/source/appendix/VPL_intel_media_sdk.html + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1503> + +2021-05-14 11:56:49 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkvp9dec.c: + msdkvp9dec: do not include mfxvp9.h + The VP9 related definitions in mfxvp9.h are available under the + condition of 'MFX_VERSION >= MFX_VERSION_NEXT', which implies that these + definitions are never used in a public release. + This is in preparation for oneVPL support because mfxvp9.h was + removed from oneVPL + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1503> + +2020-08-04 12:53:35 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/msdk.c: + * sys/msdk/msdk.h: + msdk: don't load user plugins for MFX version 2.0+ + MFX version 2.0+ no longer supports user plugins, please refer to the + links for details + https://spec.oneapi.com/versions/latest/elements/oneVPL/source/appendix/VPL_intel_media_sdk.html#msdk-full-name-feature-removals + https://github.com/oneapi-src/oneVPL + This is in preparation for oneVPL support + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1503> + +2020-08-04 12:55:35 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/msdk.c: + msdk: exclude the audio code for MFX version 2.0+ + MFX version 2.0+ no longer supports audio functions, please refer to the + links below for details + https://spec.oneapi.com/versions/latest/elements/oneVPL/source/appendix/VPL_intel_media_sdk.html#msdk-full-name-feature-removals + https://github.com/oneapi-src/oneVPL + This is in preparation for oneVPL support + Part-of: 
<https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1503> + +2021-05-14 14:08:17 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphacombine.c: + alphacombine: Ignore all events coming from the alpha_pad + As per usage of this element, everything from this pad is a + duplicate. Instead of implementing needless aggregation, simply + drop all events from this pad and let the ones from the main stream + pass through. Also stop proxying some queries from the alpha pad, too. + This fixes a racy test failure: + - validate.file.playback.scrub_forward_seeking.opus_vp9-alpha_webm + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2247> + +2021-05-14 14:05:59 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstcodecalphademux.c: + codecalphademux: Do not set a GstFlowReturn from a boolean + This was a small oversight: gst_pad_send_event() returns a boolean, + so setting it into ret could confuse the flow combiner. Though, + it didn't cause a bug, since both 0 and 1 indicate success (though 1 is + undefined). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2247> + +2021-05-14 14:04:00 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstcodecalphademux.c: + codecalphademux: Remove eos flow return workaround + It turns out that downstream returning OK after EOS is a bug in + multiqueue. As we moved to queue, we no longer have this issue. + Let's keep the code clean and just assume that downstream will + keep returning EOS and allow convergence of flow. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2247> + +2021-05-13 15:18:34 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/openh264/gstopenh264element.c: + openh264: Don't use GOnce for ABI check + It turns out the value used for g_once_* APIs can't be + zero.
And this is a very cheap check, so let's just do it every time. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2240> + +2021-05-13 15:25:57 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtc test: Print content of error GstMessage + Makes it easier to interpret the result of the CI! + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-06 13:52:32 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin tests: Add test for intersection src pad caps + This checks that the codec preferences are intersected also with what + the src pad can handle. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-14 19:46:56 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtc test: Add explicit test clock + This way the test clock is not linked to the multiple harnesses + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-06 17:58:15 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Intersect answer with codec prefs & capabilities + In case the local capabilities changed since the last negotiation, + we need to re-intersect and see if the result would be different. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-06 17:50:38 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Ignore current caps for codec negotiation + On the sink pad, we want the caps of the current stream, those + are the "received_caps" field. If we haven't received caps yet, then + we only care about the caps that the next element can accept, that is + the caps from the peer pad (and the preferences).
Otherwise, we prevent + re-negotiation to a better codec when possible. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-05 19:21:18 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Remove dead code + The function is only called to create an offer, so no + need to pass the offer parameter and then check it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-05 19:18:02 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtc test: Add test for codec preferences negotiation + Validate that it does the intersection with the caps from + the sink pad and rejects the offer creation otherwise. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-05 19:00:11 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.h: + webrtcbin: Refactor codec preference retrieval + Now intersect against pads on both sides if they are available. + If the intersection fails, we now just reject the creation of the offer + or answer as it means that the codec_preferences are too restrictive or + that the caps on both sides the webrtcbin are not compatible. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-30 17:04:12 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Intersect codec preferences with caps from pads + When creating an offer or an answer, also take into account + the caps on the pads as well as the codec preferences when both are set. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-30 16:21:14 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * tests/check/elements/webrtcbin.c: + webrtcbin: Implement caps queries on sinkpad based on codec preferences + Also includes a unit test. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-30 15:04:33 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Hold transceiver lock when accessing codec_preferences + This is required to allow the applications to modify the preferences. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-30 14:55:41 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtptransceiver.c: + webrtcbin: Hold lock while accessing the codec preferences + They could be changed at runtime by the application, so take the lock + when modifying them. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-21 15:55:00 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin tests: Use properties to access the inside of the transceiver object + This will allow hiding the insides from unsafe application access. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-21 15:54:14 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtptransceiver.c: + webrtc rtptransceiver: Implement "codec-preferences" property + This allows safer access to the internals of the codec-preferences + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-21 15:38:00 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtptransceiver.c: + webrtc rtptransceiver: Implement "kind" property + Implement the property as read-only to follow the WebRTC spec + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-21 15:34:07 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtptransceiver.c: + webrtc rtptransceiver: Implement "current-direction" property + Implement the property as read-only to follow the WebRTC spec + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-04-21 15:29:18 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtptransceiver.c: + webrtc rtptransceiver: Implement "mid" property + Implement the property as read-only to follow the WebRTC spec + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2183> + +2021-05-12 17:32:20 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphadecodebin.c: + alphadecodebin: Use normal queues instead of multiqueue + The multiqueue was too flexible for our needs, allowing queuing past + the configured threshold. It also didn't work well when trying to + propagate EOS flow return.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2238> + +2021-05-12 17:29:02 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphacombine.c: + alphacombine: Implement flow return propagation + The EOS handling was not done the proper way. Instead of this, implement + proper propagation of the flow return for the alpha chain function. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2238> + +2021-05-12 15:13:11 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstcodecalphademux.c: + codecalphademux: Fix handling of flow combine + As the alphacombine is simplified to receive matching pairs of buffers, + we can't just stop streaming when we receive EOS from downstream. Due + to the usage of queue, the moment we get this return value may differ. + Though, by continuing pushing, we override the last_flowret on the pad, + which can make us miss that we effectively can combine all flow into + EOS. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2238> + +2021-04-29 17:14:43 -0400 Thibault Saunier <tsaunier@igalia.com> + + * gst/debugutils/gsttestsrcbin.c: + testbinsrc: Handle setting URI on the fly + Reusing existing streams when possible + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2210> + +2021-01-20 14:55:09 +0800 Bing Song <bing.song@nxp.com> + + * data/meson.build: + * data/targets/file-extension/ts.gep: + transcoding: add encoding target for TS. + Add encoding target for streaming.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1965> + +2021-04-29 16:51:27 +0200 Johan Sternerup <johast@axis.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin: Add unit test for closing of data channels + Add test for verifying that the data channel "close" action signal + triggers an SCTP_RESET_STREAMS request that is propagated to the other + side and eventually leads to both sides closing properly. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2186> + +2021-04-22 10:43:55 +0200 Johan Sternerup <johast@axis.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + webrtcbin: Fix deadlock when receiving new sctp stream + When receiving an sctp message for a stream that does not yet have an + sctpdec pad associated with it, we end up in + _on_sctpdec_pad_added. At this point we're holding the sctpassociation + lock. Then it's not possible to take the pc_lock because then code + executing under the pc_lock (which means anything in the webrtc + thread) may not take the sctpassociation lock. For example, running + the data channel close procedure from the webrtc thread means we + eventually end up sending a SCTP_RESET_STREAMS packet which needs to + grab the sctpassociation lock. + This means _on_sctpdec_pad_added simply cannot take the pc_lock, and + also it is not possible to postpone the channel creation as we need to + link the pads right there. The solution is to introduce a more + granular dc_lock that protects only the things that need to be done + to create the datachannel.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2186> + +2021-04-20 10:45:46 +0200 Johan Sternerup <johast@axis.com> + + * ext/sctp/sctpassociation.c: + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/sctptransport.c: + * ext/webrtc/webrtcdatachannel.c: + * ext/webrtc/webrtcdatachannel.h: + webrtcbin: Support closing of data channels + Support for closing WebRTC data channels as described in RFC + 8831 (section 6.7) is now fully supported. This means that we can now + reuse data channels that have been closed properly. Previously, an + application that created a lot of short-lived on-demand data channels + would quickly exhaust resources held by lingering non-closed data + channels. + We now use a one-to-one style socket interface to SCTP just like the + Google implementation (i.e. SOCK_STREAM instead of SOCK_SEQPACKET, see + RFC 6458). For some reason the socket interface to use was made + optional through a property "use-sock-stream" even though code wasn't + written to handle the SOCK_SEQPACKET style. Specifically the + SCTP_RESET_STREAMS command wouldn't work without passing the correct + association id. Changing the default interface to use from + SOCK_SEQPACKET to SOCK_STREAM now means we don't have to bother about + the association id as there is only one association per socket. For + the SCTP_RESET_STREAMS command we set it to SCTP_ALL_ASSOC just to + match the Google implementation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2186> + +2021-05-07 16:30:49 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/videoparsers/gstvp9parse.c: + vp9parse: Manually fixate codec-alpha field + This is a newly introduced field, and we interpret it as false when missing in + the caps. Otherwise, a simple capsfilter will just add the missing field and + keep going, despite the upstream caps being a superset.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-05-07 11:28:21 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstplugin.c: + doc: codecalpha: Add plugin documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-05-06 09:12:34 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + doc: Add codecalpha plugin to the plugins cache + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-04-22 16:50:17 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphadecodebin.c: + * gst/codecalpha/gstalphadecodebin.h: + * gst/codecalpha/gstplugin.c: + * gst/codecalpha/gstvp8alphadecodebin.c: + * gst/codecalpha/gstvp8alphadecodebin.h: + * gst/codecalpha/gstvp9alphadecodebin.c: + * gst/codecalpha/gstvp9alphadecodebin.h: + * gst/codecalpha/meson.build: + alphadecodebin: Add wrappers to decode VP8/VP9 alpha + This includes base class with wrappers bin that will create a static + pipeline capable of handling the VP8/VP9 alpha channel decoding + using two instances of vp8/vp9dec element each. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-04-02 15:07:22 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstalphacombine.c: + * gst/codecalpha/gstalphacombine.h: + * gst/codecalpha/gstplugin.c: + * gst/codecalpha/meson.build: + codecalpha: Implement alphacombine element + This element will merge video buffers in order to use the alpha stream + luma plane as the alpha of the video stream. The implementation is zero-copy + and currently only support merging I420 stream with an I420, NV12 or GRAY8 + alpha stream. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-03-30 15:34:11 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstcodecalphademux.c: + alphacodecdemux: Implement meta demuxing + Produce two streams from a buffer that has GstVideoCodecAlphaMeta + attached. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-03-24 16:48:35 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/codecalpha/gstcodecalphademux.c: + * gst/codecalpha/gstcodecalphademux.h: + * gst/codecalpha/gstplugin.c: + * gst/codecalpha/meson.build: + * gst/meson.build: + * meson_options.txt: + Introduce CODEC Alpha plugin + This plugin contains a set of utility elements allowing to extract, + decode and combine CODEC (typically VP8/VP9) alpha stream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2199> + +2021-05-11 13:57:59 +0100 Daniel Stone <daniel@fooishbar.org> + + * ext/openaptx/meson.build: + openaptx: Fix to v0.2.0 due to license change + openaptx has recently changed its license to explicitly exclude + 'Freedesktop projects' from using it, which would include GStreamer, as + well as shifting to base terms of GPLv3: + https://github.com/pali/libopenaptx/commit/811bc18586d634042618d633727ac0281d4170b8 + This unilateral license change is legally dubious in many ways. + The original work came from ffmpeg under the LGPL v2.1, to which third + parties may not add additional restrictions (per sections 2 and 7 of the + LGPL v2.1), so LGPLv2.1 + may-not-use restrictions are not permissible + without the explicit consent of the original copyright holder. 
+ The upgrade to LGPL v3.0 without explicit consent from the original + copyright holder is in itself permissible through the upgrade terms of + the LGPL, however the additional restrictions imposed again conflict + with sections 7 and 10 of the GPLv3 (as the base of the LGPLv3, with + those sections not being invalidated by the additional LGPLv3 text). + Though it does not impact the legal validity of the redeclaration of + licensing, the claims that freedesktop.org has violated the terms of the + openaptx license in the past are false; the work was contributed to the + PulseAudio project with an explicit open license, with the original + contributor later attempting to revoke permission for its use, despite + the explicit terms of the license giving no ability to do so as they + lack a change-of-heart provision. + The claims that Collabora violated the license are even more baseless; + they are based on an assertion that when I (acting on behalf of + freedesktop.org rather than Collabora, in my own unpaid time) banned + users from freedesktop.org's GitLab instance due to sustained violations + of the Code of Conduct users agree to when creating an account on that + platform, this somehow constituted a violation of the license. Even if + Collabora were somehow involved in this - which they were not at all - + there is no requirement under open licenses that users be given + unlimited access under all terms to any platform on the internet. Such + terms would mean that open development could only be conducted on + completely unmoderated platforms, which does not stand up to any + scrutiny. 
+ Regardless of the declared license having no legal validity, the LGPL's + explicit provision in both v2.1 and v3.0 for such additional + restrictions to be stripped, and the low likelihood of it ever being + used together with GStreamer as its licensing terms would not be + acceptable to any distribution, enforcing a version check seems like the + safest way to ensure complete legal clarity, not put users or + downstreams in any jeopardy, and comply with the author's stated wishes + for v0.2.1 and above to not be used by GStreamer. + Signed-off-by: Daniel Stone <daniel@fooishbar.org> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2235> + +2021-05-11 10:21:27 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + doc: Update cache after RGBP pixel format addition + Related to https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1141 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2236> + +2021-05-09 23:42:46 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaav1dec.c: + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvah265dec.c: + * sys/va/gstvampeg2dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavp9dec.c: + va: Do not use a common parent_class in vabasedec. + We have only one copy of gst_va_base_dec_parent_class inside the + vabasedec, so it can not handle the case when there are multiple va + decoders inside one pipeline. The pipeline: + gst-launch-1.0 filesrc location=xxx.h264 ! h264parse \ + ! vah264dec ! msdkh265enc ! vah265dec ! fakesink + generates an assertion of + "invalid cast from 'GstVaH264Dec' to 'GstH265Decoder" + and gets a crash. + We should keep the parent_class for each decoder type.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2231> + +2021-05-07 16:02:04 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + libs: codecs: h264decoder: Assert output_picture virtual method. + For new code it's nice to assert if the derived class implemented the + output_picture virtual method. Otherwise a segmentation fault + occurs. All other decoders assert this method. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2228> + +2021-05-06 18:37:45 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/gstwpesrc.cpp: + wpe: Properly free property fields + The set location (in two places) and loaded bytes were not freed when + the element is destroyed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2222> + +2021-05-06 19:17:29 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * ext/wpe/gstwpesrc.cpp: + wpe: Properly lock property fields + Use the object lock for the following fields: + - `bytes`: Written by the `load-bytes` signal unless running; consumed + on start. + - `draw_background`: Read and written by the `draw-background` + property. + - `location`: Read and written by the `location` property and the URI + handler. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2222> 2021-05-07 11:13:06 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> * gst/rtp/gstrtpsrc.c: rtpsrc: Plug leak of rtcp_send_addr - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2251> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2226> 2021-05-07 11:13:46 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -498,13 +9059,74 @@ Bizarrely, it returned a pad from the child rtpbin. 
I noticed because our application leaked the implicitly created ghost pad. Make an explicit ghost pad so this works properly. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2250> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2227> 2021-05-07 11:12:39 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> * gst/rist/gstristsrc.c: rist: Plug leak of rtcp_send_addr - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2248> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2225> + +2021-05-07 11:10:17 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaav1dec.c: + va: av1dec: Avoid structure overwrite. + VADecPictureParameterBufferAV1.mode_control_fields.bits were filled + twice, overwriting to zeros the first assignation. This patch unifies + both assignations. + Also it makes explicit an enum casting between libva and gstreamer; it + removes the assignation to zero a deprecated parameter; and use an + appropriate assertion. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2223> + +2021-05-06 17:07:51 +1000 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.c: + webrtc: only add nack pli by default if kind is video + Sending/receiving PLI's (Picture Loss Indication) for non-video doesn't + really make sense. This also matches what the browsers do. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2220> + +2021-05-06 17:06:44 +1000 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.c: + * ext/webrtc/utils.h: + webrtc: move webrtc_kind_from_caps() to utils + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2220> + +2021-04-21 17:34:26 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + * sys/wasapi2/gstwasapi2client.h: + * sys/wasapi2/gstwasapi2sink.c: + * sys/wasapi2/gstwasapi2src.c: + * sys/wasapi2/gstwasapi2util.h: + wasapi2: Propagate HRESULT error code everywhere + ... instead of boolean value which cannot notify the reason + of the operation failure. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2219> + +2021-05-06 10:46:15 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/hls/gsthlssink.c: + * ext/hls/gsthlssink2.c: + * ext/hls/gstm3u8playlist.c: + * ext/hls/gstm3u8playlist.h: + hlssink(2): Don't write deprecated EXT-X-ALLOW-CACHE metadata + It's deprecated since quite a few versions and various validators + complain about it. Instead of the in-manifest metadata this should be + handled by the normal HTTP caching headers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2221> + +2021-05-06 01:35:04 +0900 Seungha Yang <seungha@centricular.com> + + * sys/decklink/gstdecklink.cpp: + decklinkvideosrc: Fix crash when mode is not specified + In that case, we will get "VideoInputFrameArrived" callback + without "VideoInputFormatChanged" + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2218> 2021-05-05 12:34:38 +0530 Nirbheek Chauhan <nirbheek@centricular.com> @@ -516,40 +9138,381 @@ the PPS flag from the parser state. This is important because there are encoders that don't generated a PPS after every SPS. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2245> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2217> -2021-01-28 12:28:03 +0100 Michael Olbrich <m.olbrich@pengutronix.de> +2021-04-20 22:18:09 +0200 François Laignel <fengalin@free.fr> - * gst/videoparsers/gsth264parse.c: - * tests/check/elements/h264parse.c: - h264parse: don't invalidate the last PPS when parsing a new SPS - When a SPS is received then any previous PPS remains valid. So don't clear - the PPS flag from the parser state. - This is important because there are encoders that don't generated a PPS after - every SPS. - Closes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/571 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2243> + * ext/dash/gstdashsink.c: + * ext/dtls/gstdtlssrtpdec.c: + * ext/dtls/gstdtlssrtpenc.c: + * ext/hls/gsthlssink2.c: + * ext/resindvd/resindvdbin.c: + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/transportreceivebin.c: + * gst/camerabin2/gstwrappercamerabinsrc.c: + * gst/rist/gstristsink.c: + * gst/rtp/gstrtpsink.c: + * gst/sdp/gstsdpdemux.c: + * gst/transcode/gsttranscodebin.c: + * gst/transcode/gsturitranscodebin.c: + * sys/dvb/dvbbasebin.c: + * sys/uvch264/gstuvch264_src.c: + * tests/check/elements/asfmux.c: + * tests/check/elements/cccombiner.c: + * tests/check/elements/dtls.c: + * tests/check/elements/mpegtsmux.c: + * tests/check/elements/mplex.c: + * tests/check/elements/webrtcbin.c: + * tests/examples/playout.c: + Use gst_element_request_pad_simple... + Instead of the deprecated gst_element_get_request_pad. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2180> + +2021-05-04 12:29:14 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * gst/mxf/mxfdemux.c: + mxf: check EOS cond with any segment's flag + The previous test was preventing the pad to be in EOS + when the segment position was greater than segment stop. + It ended up consuming all the data before getting in EOS. + Regarding GST_SEEK_FLAG_SEGMENT it seems to be + correctly handled later in the method. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2173> + +2021-04-19 18:25:06 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * gst/mxf/mxfdemux.c: + mxfdemux: fix keyframe detection in index + An index entry should be considered as a keyframe + if the flags allow a random access only. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2173> 2021-04-24 10:43:47 +0000 Antonio Rojas <arojas@archlinux.org> * ext/openexr/gstopenexrdec.cpp: Fix build with OpenEXR 3 Add a header that is no longer transitively included - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2216> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2195> + +2021-04-22 19:21:01 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * gst-libs/gst/codecs/gstvp9statefulparser.c: + codecs: gstvp9statefulparser: do not carry over segmentation flags + Do not carry over segmentation flags from previous frames. The spec + says in 7.2.10 that the feature data carry over from previous frames + if not updated, but the flags do not. + Consider what would happen if a flag B is to depend on a flag A, and + B carries over as set from another frame. Further consider that A is + now not set in this particular frame. This leads to the invalid state + in which flag B is set but flag A isn't. + This might cause the bitstream to be rejected by accelerators down + the line. 
+ Fix it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2203> + +2021-04-29 21:44:07 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11desktopdup.cpp: + * sys/d3d11/gstd3d11desktopdup.h: + * sys/d3d11/gstd3d11desktopdupsrc.cpp: + d3d11desktopdup: Don't ignore error DXGI_ERROR_UNSUPPORTED + Although Microsoft's DXGIDesktopDuplication example is considering + the DXGI_ERROR_UNSUPPORTED as an expected error + (See https://github.com/microsoft/Windows-classic-samples/tree/master/Samples/DXGIDesktopDuplication) + it might not be recoverable error if application is + run against a discrete GPU + (See https://docs.microsoft.com/en-US/troubleshoot/windows-client/shell-experience/error-when-dda-capable-app-is-against-gpu) + Do early error out if the error happens while opening device, + instead of retrying it forever. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2208> + +2021-04-29 22:10:15 +0200 Jakub Adam <jakub.adam@collabora.com> + + * sys/d3d11/gstd3d11desktopdup.cpp: + d3d11desktopdup: Support desktop switches + Before creating output duplication interface, call SetThreadDesktop() + with HDESK of the current input desktop in case a desktop switch has + occurred. + This allows d3d11desktopdupsrc to capture Windows User Account Control + (UAC) prompts, which appear on a separate secure desktop. Otherwise + IDXGIOutput1::DuplicateOutput() will return E_ACCESSDENIED and the + element won't produce any frames as long as the UAC screen is active. + Note that in order to access secure desktop the application still has to + run at LOCAL_SYSTEM privileges. For GStreamer applications running with + regular user privileges this change has no effect. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2209> + +2021-04-27 18:08:30 +0000 Jakub Adam <jakub.adam@collabora.com> + + * sys/winscreencap/dxgicapture.c: + dxgicapture: reinitialize duplication interface on ERROR_ACCESS_LOST + IDXGIOutputDuplication can become invalid, for example when there's a + desktop switch or resolution change, or a Windows User Account Control prompt + appears on screen. + When that happens, try to re-create the duplication interface for the + changed output. Note that in the case of a UAC prompt this operation will + fail if the GStreamer process doesn't run at LOCAL_SYSTEM privileges. In + such a situation the source element won't create any frames as long as the + output is occupied by the UAC screen. + In order to enable UAC access to sufficiently privileged GStreamer + processes, call SetThreadDesktop() with the desktop handle that + currently receives user input before creating our output duplication. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2204> + +2021-04-29 09:35:51 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdksystemmemory.c: + * sys/msdk/gstmsdkvideomemory.c: + msdk: set correct parameters for BGRx frame + Otherwise, when mapping a BGRx frame into CPU memory, the CPU will get wrong + data for the B, G, R components + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2205> + +2021-04-29 21:12:42 +1000 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtc: advertise support for transport-cc rtcp-fb by default + Still requires explicit enabling by the application through the header + extension on all the relevant payloaders.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2207> + +2021-04-29 21:11:25 +1000 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtc/stats: provide the twcc stats when available + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2207> + +2021-04-28 10:52:29 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: Disable derived for Gallium if RGB and reading. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-04-22 17:08:13 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: Disable derived for i965 if YUV and writing. + The problem with uploading YUV frames using derived images is that + derived images imply tiling, so frames are wrongly uploaded. + Though derived for reading might work, we cannot know the Intel graphics + generation to validate the caching. Overall, it's safer to disable derived + images for i965. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-04-22 17:07:28 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadisplay.c: + * sys/va/gstvadisplay.h: + va: display: Fix typo. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-04-22 12:42:35 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: Hack for i965 to get linear RGB DMABufs. + i965 driver has a hack to provide linear dmabufs, which is required for RGB + formats, since they are directly uploaded by glupload, ignoring tiled modifiers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-04-22 15:51:27 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: Remove unused parameter. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-03-31 11:04:17 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: Set usage hint generic if DMABuf. + iHD driver sets a tiled DRM modifier if surface's usage hint is set to + VPP_WRITE. This results in a garbled rendering when using glimagesink. + This patch changes the usage hint to generic if the caps feature is + DMABuf. Either way only iHD driver, so far, uses the usage hint flag. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-04-20 12:52:26 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: Get info from caps in decide_allocation() + decide_allocation() occurs before set_caps(), where out_info is set, + thus setting srcpad_info with zeros or old values. Instead, the + caps from the allocation query are converted and used. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2127> + +2021-04-23 13:56:43 +0200 Timo Wischer <timo.wischer@de.bosch.com> + + * ext/avtp/gstavtpcrfbase.h: + avtp: crf: Remove superfluous sink_event variable + This variable was introduced by commit 12ad2a4bcd6c ("avtp: Introduce + the CRF Sync Element") but it was never used: + $ git log -G "sink_event" -- ext/avtp + Signed-off-by: Timo Wischer <timo.wischer@de.bosch.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2201> + +2020-02-17 14:11:15 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkh265dec.c: + msdkh265dec: Add support for error report too + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/909> + +2019-12-06 12:48:37 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkdec.c: + * sys/msdk/gstmsdkdec.h: + * sys/msdk/gstmsdkdecproputil.c: + * sys/msdk/gstmsdkdecproputil.h: + * 
sys/msdk/gstmsdkh264dec.c: + msdkh264dec: report error to user + Sometimes the user wants to know what the error is when decoding a stream. + This commit adds a property of report-error to msdkh264dec. When + report-error is TRUE, msdkh264dec may catch bitstream errors and frame + corruption, then report the error to the application by using GST_ELEMENT_ERROR + Refer to the code in + https://github.com/Intel-Media-SDK/MediaSDK/tree/master/samples + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/909> + +2019-12-06 12:02:50 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkdec.c: + * sys/msdk/gstmsdkdec.h: + msdkdec: allow sub class to add extra parameters for additional configuration + MSDK allows the user to add extended buffers to a bitstream for additional + configuration. This commit is to support this feature in this plugin + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/909> + +2021-04-27 21:52:31 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11basefilter.cpp: + * sys/d3d11/gstd3d11compositor.cpp: + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11videosink.cpp: + d3d11: Handle device change + If the incoming buffer holds another d3d11 device, and the user wants any device + (i.e., adapter index wasn't specified explicitly) update our device + with that of the buffer. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2191> + +2021-04-23 19:29:55 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.cpp: + d3d11videosink: Delay window setup as much as possible + ... 
so that videosink can handle device update with + d3d11 device of the first buffer + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2191> + +2021-04-23 18:44:41 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11compositor.cpp: + * sys/d3d11/gstd3d11convert.cpp: + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11desktopdupsrc.cpp: + * sys/d3d11/gstd3d11upload.cpp: + d3d11: Don't accept buffer pool which holds different device + At the moment, d3d11 plugin doesn't support texture sharing between + different device + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2191> + +2021-04-23 18:45:48 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + d3d11decoder: Run gst-indent + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2191> + +2021-02-23 11:56:53 -0500 Aaron Boxer <boxerab@gmail.com> + + * ext/meson.build: + * ext/onnx/gstonnx.c: + * ext/onnx/gstonnxclient.cpp: + * ext/onnx/gstonnxclient.h: + * ext/onnx/gstonnxelement.c: + * ext/onnx/gstonnxelement.h: + * ext/onnx/gstonnxobjectdetector.cpp: + * ext/onnx/gstonnxobjectdetector.h: + * ext/onnx/meson.build: + * meson_options.txt: + onnx: add plugin to apply ONNX neural network models to video + This MR provides a transform element that leverage ONNX runtime + to run AI inference on a broad range of neural network toolkits, running + on either CPU or GPU. ONNX supports 16 different providers at the + moment, so with ONNX we immediately get support for Nvidia, AMD, Xilinx + and many others. + For the first release, this plugin adds a gstonnxobjectdetector element to + detect objects in video frames. Meta data generated by the model is + attached to the video buffer as a custom GstObjectDetectorMeta meta. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1997> + +2021-04-26 18:00:27 +0300 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklinkvideosrc.cpp: + decklinkvideosrc: Fix AFD/Bar VANC size check + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2193> + +2021-04-23 18:05:06 +0300 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklinkvideosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.h: + decklinkvideosrc: Automatically detect widescreen vs. normal NTSC/PAL + Based on the AFD aspect ratio flag the source can detect (in mode=auto) + whether this NTSC/PAL mode is actually a normal or a widescreen one and + select the caps according to that. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2193> 2021-03-30 12:39:21 -0400 Olivier Crête <olivier.crete@collabora.com> * gst/jpegformat/gstjpegparse.c: jpegparse: Don't generate timestamp for 0/1 framerates - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2202> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2194> -2021-04-23 10:02:29 +0100 Tim-Philipp Müller <tim@centricular.com> +2021-04-23 23:20:54 +0900 Seungha Yang <seungha@centricular.com> - * ext/webrtc/sctptransport.c: - Revert "webrtc: Fix sctp task's return type." - This reverts commit 1407a433cce82e0d3ca44e6413a747cd257f6cc8. - The earlier function signature change this goes with (!2104 ) - was not backported. + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Set flushing to internal pool on flush event + d3d11 decoders use internal pool for DPB texture and + Gst*Decoder::new_picture() will be blocked if internal pool is full. 
+ We should be able to unblock it on flush-start event as expected. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2192> + +2021-04-23 16:53:16 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11compositor.cpp: + * sys/d3d11/gstd3d11decoder.cpp: + d3d11: Fix wrong GstD3D11BufferPool type check + Fix typos + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2190> + +2021-03-31 18:07:40 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst/rtp/gstrtpsrc.c: + rtpsrc: Fix wrong/NULL URI handling + We can reset the URI to NULL and this fixes a deadlock in that case or + when the URI was invalid. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2132> + +2021-04-22 16:45:27 +0000 Nazar Mokrynskyi <nazar@mokrynskyi.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: downgrade "dropping ICE candidates from SDP" from warning to debug level + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2187> + +2021-04-16 20:39:35 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Attach rtpbin even for data channels + This is required because the same transport may later be used for RTP. + In which case the RTCP needs to flow bi-directionally already. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2172> + +2021-03-04 00:41:09 -0800 Frederich Munch <colsebas@hotmail.com> + + * ext/webrtc/nicetransport.c: + Fix missing unref of nice-agent causing sockets to never close. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1960> 2021-04-22 16:09:40 -0400 Doug Nazar <nazard@nazar.ca> @@ -557,7 +9520,21 @@ webrtc: Fix sctp task's return type. GstWebRTCBinFunc expects a GstStructure* return type. Fixes segfault on PowerPC. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2189> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2188> + +2021-04-22 15:50:15 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideoenc.cpp: + mfvideoenc: Fix UWP build + Add missing GST_MF_HAVE_D3D11 define guard + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2185> + +2021-04-22 15:42:23 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + wasapi2: Fix UWP build + KSAUDIO_SPEAKER_* defines are WINAPI_PARTITION_DESKTOP only + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2185> 2021-04-21 21:43:59 +0200 Mathieu Duponchelle <mathieu@centricular.com> @@ -567,21 +9544,128 @@ 03031037fafd2d535bbefb1fdf6024b5d1159043 , two instructions were inverted resulting in the stop always being adjusted by 0 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2184> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2182> + +2021-04-20 23:51:49 -0400 Doug Nazar <nazard@nazar.ca> + + * tests/check/elements/netsim.c: + tests/netsim: Set src caps before creating buffers + GstHarness requires the source pad caps to be set before + buffer allocations. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2179> + +2021-04-20 02:00:18 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.c: + * gst-libs/gst/d3d11/gstd3d11format.h: + * tests/check/elements/d3d11colorconvert.c: + d3d11: Add support for BGRx and RGBx formats + For such formats, we can re-use existing BGRA/RGBA implementations + but ignoring alpha channel + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2174> + +2021-04-20 18:37:15 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + wasapi2: Implement default audio channel mask + Some capture devices might not provide a channel mask value which will + result in capturing failure because of unknown channel mask in case + that device generates more than 2 channels. Although it might not + be correct, we can assume a channel mask with the given number of channels. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2177> + +2021-04-20 18:40:40 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + wasapi2client: Simplify set caps + Don't need to iterate all structures to set identical values + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2177> + +2021-04-20 18:48:18 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2client.cpp: + wasapi2client: Run gst-indent + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2177> + +2021-04-13 17:35:58 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin test: Don't fail if data channel is created + In tests that voluntarily create a data channel. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2168> + +2021-04-19 19:06:50 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Filter caps isn't fixed + Fix an assertion because the filter parameter passed to + gst_caps_is_equal_fixed() wasn't fixed. So use the regular + gst_caps_is_equal() instead. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2175> + +2021-04-20 02:04:03 +0900 Seungha Yang <seungha@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + d3d11: Update plugin doc cache + Updating for removed d3d11videosink wrapper bin and the change of + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2113 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2169> + +2021-04-17 20:37:13 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.cpp: + * sys/d3d11/gstd3d11videosink.h: + * sys/d3d11/gstd3d11videosinkbin.cpp: + * sys/d3d11/gstd3d11videosinkbin.h: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.c: + d3d11: Remove d3d11videosink wrapper bin + Drop d3d11videosink wrapper bin and handle texture upload + in d3d11videosink. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2169> + +2021-04-18 13:49:59 +0100 Philippe Normand <philn@igalia.com> + + * ext/webrtcdsp/gstwebrtcdsp.cpp: + webrtcdsp: Propagate VAD to audio level meta + Whenever the voice activity changed on the stream, update or create an + AudioLevelMeta and associate it to the corresponding buffer. 
+ Fixes #1073 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2170> + +2021-04-19 13:06:23 +0300 Sebastian Dröge <sebastian@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/closedcaption/gstcccombiner.c: + cccombiner: Use correct enum when registering the max-scheduled property + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2171> 2021-04-15 14:06:59 -0400 Thibault Saunier <tsaunier@igalia.com> * ext/wpe/WPEThreadedView.cpp: wpe: Remove code targeting WebKit < 2.24 We already depend on wk >= 2.24 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2167> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2165> 2021-04-15 13:28:42 -0400 Thibault Saunier <tsaunier@igalia.com> * ext/wpe/WPEThreadedView.cpp: wpe: Make threaded view singleton creation thread safe It was leading to interesting failures. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2167> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2165> + +2021-04-15 00:02:55 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11pluginutils.cpp: + d3d11: pluginutils: Fix wrong gst_memory_unmap() on _map() failure + It was obvious typo + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2164> + +2021-04-13 17:15:22 -0400 Doug Nazar <nazard@nazar.ca> + + * tests/check/elements/avtpcvfdepay.c: + tests/avtp: increase timeout of test_depayloader_fragmented_big + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2160> 2021-04-14 01:59:23 -0400 Doug Nazar <nazard@nazar.ca> @@ -589,10 +9673,2560 @@ check: fix dash_mpdparser_check_mpd_client_set_methods test. Setting guint64 valist properties without type specifier fails on 32bit archs. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2162> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2161> + +2021-04-13 16:34:15 -0400 Doug Nazar <nazard@nazar.ca> + + * tests/check/elements/line21.c: + line21enc: fix remove-caption-meta property test + It's possible for the same address to be allocated to the decoded + metadata. Switch test to actually detect if it was removed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2159> + +2021-04-13 06:40:43 -0400 Doug Nazar <nazard@nazar.ca> + + * tests/check/elements/shm.c: + tests: fix shm test deadlock + Stopping the consumer first would occasionally allow the producer + to fill the shm segment causing it to block in send() and unable + to be stopped. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2158> + +2021-04-13 05:54:37 -0400 Doug Nazar <nazard@nazar.ca> + + check: Fix test dash_mpdparser_xlink_period + Test used http://404/ERROR/XML.period as an invalid url. Curl now + interprets that as a 32bit int and tries an actual connect which + times out. Use .invalid as an IANA reserved domain for invalid DNS. + curl -v http://404/ERROR/XML.period + * Trying 0.0.1.148:80... + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2157> + +2021-04-13 15:42:09 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: Fix an unmap typo in _va_copy. + No need to unmap the src memory when failing to allocate the + dst mem. It has not been mapped yet. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2156> + +2021-04-06 12:03:32 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkcontext.c: + msdk: don't fall back to the default device + Otherwise when the user sets a wrong device, the warning message doesn't get + printed if the user doesn't set the right debug level in the environment; this + behavior might mislead the user that the wrong device is being used. + This fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1567 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2138> + +2021-04-12 17:54:31 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Simplify answer_caps intersection code a little + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-04-12 15:35:41 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin test: Wait for set-local-desc & set-remote-desc to continue + To avoid racing between the SDPs being set and the next step of the + test, let's wait for setting the SDP both locally and remotely to succeed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-04-01 14:51:30 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + * ext/webrtc/webrtcdatachannel.c: + webrtcbin: Move GstPromise reply to operation framework + This makes it possible to reply to all promises in a consistent way + without having to do an unlock/relock that is always risky. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-04-01 14:41:11 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Make sure PC_LOCK is released when replying to promise + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-31 11:56:10 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Take PC lock around all entry points + All of those action signals change the internal state, so + protect it by using the PC_LOCK + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-31 11:49:36 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Take PC_LOCK when requesting new pad + This is needed to avoid having the state change under us. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-31 11:41:45 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin test: Add test for the case where a second m-line is renegotiated + This is for the case where the answerer forces a specific media type + for an m-line, but the origin offer only has the other media type. In this + case, we will create a second transceiver on receiving the offer and add + the desired media type using renegotiation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-31 11:40:28 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Ensure that query caps method returns valid caps + This means rejecting any caps that aren't fixed. Also, use a filter + that will create unfixed caps if the other side just returns ANY. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-31 11:33:21 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Associate the stream with a new transceiver + Otherwise, this newly created transceiver has no stream and it + aborts later when it tries to connect the input pad. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-31 11:30:16 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Match unassociated transceiver by kind too + When a new m-line comes in that doesn't have a transceiver, only match + existing transceivers of the same kind. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-30 18:01:56 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Fix typo in name of error GstStructure + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-30 16:16:50 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtc test: Verify that forcing different kinds on peers fails + If the offer contains an audio kind and a video kind, forcing them both + at m-line zero will fail. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-30 16:04:33 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtc tests: Verify that create-offer is rejected when needed + Verify that it gets rejected if an m-line at index 1 is requested but + there is no m-line 0. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-29 19:47:21 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin test: Add test for various cases where get_request_pad is meant to fail + This should ensure that the recently added code works. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 21:09:04 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Enforce direction on request sink pad with a specific name + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 20:55:36 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * tests/check/elements/webrtcbin.c: + webrtcbin: Try to match an existing transceiver on pad request + This should avoid creating extra transceivers that are duplicated. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 20:02:13 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Validate locked m-lines in set*Description + Verify that the remote description matches the locked m-lines, otherwise + just reject the SDP. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 19:38:57 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + webrtcbin: Remove unused session_mid_map + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 18:15:50 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.h: + webrtcbin: Enforce m-line restrictions when creating offer + First fail the offer creation if the mid of an existing offer doesn't + match a forced m-line. 
+ Then, for all newly added mlines, first look for a transceiver that + forces this m-line, then add a "floating" one, then the data channel. + And repeat this until we're out of transceivers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 15:57:15 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/webrtctransceiver.h: + webrtcbin: Remember if a transceiver had a forced m-line + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 15:54:35 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Enforce same-kind on request sink pad with a specific name + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 15:23:34 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Enforce compatible caps on pad request + If a pad is requested with certain caps and there is already a + transceiver, reject the pad request if the caps don't match. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 15:19:09 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Reject pad request for a specific m-line if it already exists + This way, the app developer is in control. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 15:02:50 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Make request-pad validation an early return + This reduces the indentation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-26 14:48:58 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Add document for webrtcbin itself to generated doc + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-23 20:18:24 -0400 Olivier Crête <olivier.crete@collabora.com> + + * tests/check/elements/webrtcbin.c: + webrtcbin test: Test adding a stream to a stream+datachannel + This use-case was previously broken by the expectation of having + a 1-1 match between the pad id and the m-line index + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-23 19:51:00 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtc: Reset received_caps when releasing pad + This is to work around a race where the pad is accessed in the + webrtc main thread while being released. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-03-23 17:51:16 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + webrtcbin: Split pad name from mline + The simple case where this breaks is if you add a + datachannel and want to add a new pad (a new media) after. Another + case where this is broken is if the order of the media is forced to + something different by the peer. + It's simpler to just split both things completely. In practice, the + pads will be named in the order in which they are allocated, so it + shouldn't change the current behaviour, just enable new ones. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2104> + +2021-02-25 05:04:00 +1100 Jan Schmidt <jan@centricular.com> + + * gst/switchbin/gstswitchbin.c: + switchbin: When collecting srcpad caps, don't intersect with path caps. 
+ The path caps describe the input caps that will select each path, don't + intersect those with the srcpad caps, which could be completely + different. Instead, when querying allowed caps for the srcpad, just + construct the union of all possible output caps from all path srcpads. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2018> + +2021-02-16 15:00:07 +1100 Jan Schmidt <jan@centricular.com> + + * gst/switchbin/gstswitchbin.c: + switchbin: Don't report sink pad caps for src pad queries. + When handling a caps query on the src pad, don't return the union + of input caps. Even when not active, a path element can be queried + for srcpad template caps, or for dropping paths the allowed downstream + caps is anything - as data will be dropped anyway. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2018> + +2021-02-25 15:22:15 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * gst/accurip/gstaccurip.c: + * gst/accurip/gstaccurip.h: + * gst/adpcmdec/adpcmdec.c: + * gst/adpcmenc/adpcmenc.c: + * gst/aiff/aiff.c: + * gst/aiff/aiffelements.h: + * gst/aiff/aiffmux.c: + * gst/aiff/aiffparse.c: + * gst/aiff/gstaiffelement.c: + * gst/aiff/meson.build: + * gst/asfmux/gstasf.c: + * gst/asfmux/gstasfmux.c: + * gst/asfmux/gstasfmux.h: + * gst/asfmux/gstasfparse.c: + * gst/asfmux/gstasfparse.h: + * gst/asfmux/gstrtpasfpay.c: + * gst/asfmux/gstrtpasfpay.h: + * gst/audiobuffersplit/gstaudiobuffersplit.c: + * gst/audiobuffersplit/gstaudiobuffersplit.h: + * gst/audiofxbad/gstaudiochannelmix.c: + * gst/audiofxbad/gstaudiochannelmix.h: + * gst/audiofxbad/gstaudiofxbad.c: + * gst/audiolatency/gstaudiolatency.c: + * gst/audiolatency/gstaudiolatency.h: + * gst/audiomixmatrix/gstaudiomixmatrix.c: + * gst/audiomixmatrix/gstaudiomixmatrix.h: + * gst/audiovisualizers/gstspacescope.c: + * gst/audiovisualizers/gstspacescope.h: + * gst/audiovisualizers/gstspectrascope.c: + * gst/audiovisualizers/gstspectrascope.h: + * 
gst/audiovisualizers/gstsynaescope.c: + * gst/audiovisualizers/gstsynaescope.h: + * gst/audiovisualizers/gstwavescope.c: + * gst/audiovisualizers/gstwavescope.h: + * gst/audiovisualizers/plugin.c: + * gst/autoconvert/gstautoconvert.c: + * gst/autoconvert/gstautoconvert.h: + * gst/autoconvert/gstautovideoconvert.c: + * gst/autoconvert/gstautovideoconvert.h: + * gst/autoconvert/plugin.c: + * gst/bayer/gstbayer.c: + * gst/bayer/gstbayer2rgb.c: + * gst/bayer/gstbayerelements.h: + * gst/bayer/gstrgb2bayer.c: + * gst/camerabin2/gstcamerabin2.c: + * gst/camerabin2/gstcamerabin2.h: + * gst/camerabin2/gstplugin.c: + * gst/camerabin2/gstviewfinderbin.c: + * gst/camerabin2/gstviewfinderbin.h: + * gst/camerabin2/gstwrappercamerabinsrc.c: + * gst/camerabin2/gstwrappercamerabinsrc.h: + * gst/coloreffects/gstchromahold.c: + * gst/coloreffects/gstchromahold.h: + * gst/coloreffects/gstcoloreffects.c: + * gst/coloreffects/gstcoloreffects.h: + * gst/coloreffects/gstplugin.c: + * gst/debugutils/debugutilsbad.c: + * gst/debugutils/fpsdisplaysink.c: + * gst/debugutils/gstchecksumsink.c: + * gst/debugutils/gstchopmydata.c: + * gst/debugutils/gstclockselect.c: + * gst/debugutils/gstcompare.c: + * gst/debugutils/gstdebugspy.c: + * gst/debugutils/gstdebugutilsbadelements.h: + * gst/debugutils/gsterrorignore.c: + * gst/debugutils/gstfakeaudiosink.c: + * gst/debugutils/gstfakevideosink.c: + * gst/debugutils/gsttestsrcbin.c: + * gst/debugutils/gstwatchdog.c: + * gst/dvbsubenc/gstdvbsubenc.c: + * gst/dvbsubenc/gstdvbsubenc.h: + * gst/dvbsuboverlay/gstdvbsuboverlay.c: + * gst/dvbsuboverlay/gstdvbsuboverlay.h: + * gst/dvdspu/gstdvdspu.c: + * gst/dvdspu/gstdvdspu.h: + * gst/faceoverlay/gstfaceoverlay.c: + * gst/faceoverlay/gstfaceoverlay.h: + * gst/festival/gstfestival.c: + * gst/festival/gstfestival.h: + * gst/fieldanalysis/gstfieldanalysis.c: + * gst/fieldanalysis/gstfieldanalysis.h: + * gst/freeverb/gstfreeverb.c: + * gst/freeverb/gstfreeverb.h: + * gst/gaudieffects/gstburn.c: + * 
gst/gaudieffects/gstburn.h: + * gst/gaudieffects/gstchromium.c: + * gst/gaudieffects/gstchromium.h: + * gst/gaudieffects/gstdilate.c: + * gst/gaudieffects/gstdilate.h: + * gst/gaudieffects/gstdodge.c: + * gst/gaudieffects/gstdodge.h: + * gst/gaudieffects/gstexclusion.c: + * gst/gaudieffects/gstexclusion.h: + * gst/gaudieffects/gstgaussblur.c: + * gst/gaudieffects/gstgaussblur.h: + * gst/gaudieffects/gstplugin.c: + * gst/gaudieffects/gstplugin.h: + * gst/gaudieffects/gstsolarize.c: + * gst/gaudieffects/gstsolarize.h: + * gst/gdp/gstgdp.c: + * gst/gdp/gstgdpdepay.c: + * gst/gdp/gstgdpdepay.h: + * gst/gdp/gstgdpelement.c: + * gst/gdp/gstgdpelements.h: + * gst/gdp/gstgdppay.c: + * gst/gdp/gstgdppay.h: + * gst/gdp/meson.build: + * gst/geometrictransform/gstbulge.c: + * gst/geometrictransform/gstbulge.h: + * gst/geometrictransform/gstcircle.c: + * gst/geometrictransform/gstcircle.h: + * gst/geometrictransform/gstdiffuse.c: + * gst/geometrictransform/gstdiffuse.h: + * gst/geometrictransform/gstfisheye.c: + * gst/geometrictransform/gstfisheye.h: + * gst/geometrictransform/gstkaleidoscope.c: + * gst/geometrictransform/gstkaleidoscope.h: + * gst/geometrictransform/gstmarble.c: + * gst/geometrictransform/gstmarble.h: + * gst/geometrictransform/gstmirror.c: + * gst/geometrictransform/gstmirror.h: + * gst/geometrictransform/gstperspective.c: + * gst/geometrictransform/gstperspective.h: + * gst/geometrictransform/gstpinch.c: + * gst/geometrictransform/gstpinch.h: + * gst/geometrictransform/gstrotate.c: + * gst/geometrictransform/gstrotate.h: + * gst/geometrictransform/gstsphere.c: + * gst/geometrictransform/gstsphere.h: + * gst/geometrictransform/gstsquare.c: + * gst/geometrictransform/gstsquare.h: + * gst/geometrictransform/gststretch.c: + * gst/geometrictransform/gststretch.h: + * gst/geometrictransform/gsttunnel.c: + * gst/geometrictransform/gsttunnel.h: + * gst/geometrictransform/gsttwirl.c: + * gst/geometrictransform/gsttwirl.h: + * gst/geometrictransform/gstwaterripple.c: 
+ * gst/geometrictransform/gstwaterripple.h: + * gst/geometrictransform/plugin.c: + * gst/id3tag/gstid3mux.c: + * gst/id3tag/gstid3mux.h: + * gst/inter/gstinter.c: + * gst/inter/gstinteraudiosink.c: + * gst/inter/gstinteraudiosink.h: + * gst/inter/gstinteraudiosrc.c: + * gst/inter/gstinteraudiosrc.h: + * gst/inter/gstintersubsink.c: + * gst/inter/gstintersubsink.h: + * gst/inter/gstintersubsrc.c: + * gst/inter/gstintersubsrc.h: + * gst/inter/gstintervideosink.c: + * gst/inter/gstintervideosink.h: + * gst/inter/gstintervideosrc.c: + * gst/inter/gstintervideosrc.h: + * gst/interlace/gstinterlace.c: + * gst/ivfparse/gstivfparse.c: + * gst/ivfparse/gstivfparse.h: + * gst/ivtc/gstcombdetect.c: + * gst/ivtc/gstcombdetect.h: + * gst/ivtc/gstivtc.c: + * gst/ivtc/gstivtc.h: + * gst/jp2kdecimator/gstjp2kdecimator.c: + * gst/jp2kdecimator/gstjp2kdecimator.h: + * gst/jpegformat/gstjifmux.c: + * gst/jpegformat/gstjifmux.h: + * gst/jpegformat/gstjpegformat.c: + * gst/jpegformat/gstjpegparse.c: + * gst/jpegformat/gstjpegparse.h: + * gst/librfb/gstrfbsrc.c: + * gst/librfb/gstrfbsrc.h: + * gst/midi/midi.c: + * gst/midi/midiparse.c: + * gst/midi/midiparse.h: + * gst/mpegdemux/gstmpegdemux.c: + * gst/mpegdemux/gstmpegdemux.h: + * gst/mpegdemux/plugin.c: + * gst/mpegpsmux/mpegpsmux.c: + * gst/mpegpsmux/mpegpsmux.h: + * gst/mpegtsdemux/gsttsdemux.c: + * gst/mpegtsdemux/mpegtsbase.c: + * gst/mpegtsdemux/mpegtsparse.c: + * gst/mpegtsdemux/mpegtsparse.h: + * gst/mpegtsdemux/tsdemux.c: + * gst/mpegtsdemux/tsdemux.h: + * gst/mpegtsmux/gstatscmux.c: + * gst/mpegtsmux/gstatscmux.h: + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/gstmpegtsmux.c: + * gst/mpegtsmux/gstmpegtsmux.h: + * gst/mpegtsmux/gstmpegtsmuxplugin.c: + * gst/mxf/gstmxfelement.c: + * gst/mxf/gstmxfelements.h: + * gst/mxf/meson.build: + * gst/mxf/mxf.c: + * gst/mxf/mxfdemux.c: + * gst/mxf/mxfmux.c: + * gst/netsim/gstnetsim.c: + * gst/netsim/gstnetsim.h: + * gst/onvif/gstrtponvif.c: + * gst/onvif/gstrtponvifparse.c: + * 
gst/onvif/gstrtponvifparse.h: + * gst/onvif/gstrtponviftimestamp.c: + * gst/onvif/gstrtponviftimestamp.h: + * gst/pcapparse/gstirtspparse.c: + * gst/pcapparse/gstirtspparse.h: + * gst/pcapparse/gstpcapparse.c: + * gst/pcapparse/gstpcapparse.h: + * gst/pcapparse/plugin.c: + * gst/pnm/gstpnm.c: + * gst/pnm/gstpnmdec.c: + * gst/pnm/gstpnmdec.h: + * gst/pnm/gstpnmenc.c: + * gst/pnm/gstpnmenc.h: + * gst/proxy/gstproxy.c: + * gst/proxy/gstproxysink.c: + * gst/proxy/gstproxysink.h: + * gst/proxy/gstproxysrc.c: + * gst/proxy/gstproxysrc.h: + * gst/rawparse/gstaudioparse.c: + * gst/rawparse/gstaudioparse.h: + * gst/rawparse/gstvideoparse.c: + * gst/rawparse/gstvideoparse.h: + * gst/rawparse/plugin.c: + * gst/removesilence/gstremovesilence.c: + * gst/removesilence/gstremovesilence.h: + * gst/rist/gstrist.c: + * gst/rist/gstrist.h: + * gst/rist/gstristplugin.c: + * gst/rist/gstristrtpdeext.c: + * gst/rist/gstristrtpext.c: + * gst/rist/gstristrtxreceive.c: + * gst/rist/gstristrtxsend.c: + * gst/rist/gstristsink.c: + * gst/rist/gstristsrc.c: + * gst/rist/gstroundrobin.c: + * gst/rist/gstroundrobin.h: + * gst/rist/meson.build: + * gst/rtmp2/gstrtmp2.c: + * gst/rtmp2/gstrtmp2element.c: + * gst/rtmp2/gstrtmp2elements.h: + * gst/rtmp2/gstrtmp2sink.c: + * gst/rtmp2/gstrtmp2src.c: + * gst/rtmp2/meson.build: + * gst/rtp/gstrtpsink.c: + * gst/rtp/gstrtpsink.h: + * gst/rtp/gstrtpsrc.c: + * gst/rtp/gstrtpsrc.h: + * gst/rtp/plugin.c: + * gst/sdp/gstsdpdemux.c: + * gst/sdp/gstsdpdemux.h: + * gst/sdp/gstsdpelem.c: + * gst/sdp/gstsdpsrc.c: + * gst/sdp/gstsdpsrc.h: + * gst/segmentclip/gstaudiosegmentclip.c: + * gst/segmentclip/gstaudiosegmentclip.h: + * gst/segmentclip/gstvideosegmentclip.c: + * gst/segmentclip/gstvideosegmentclip.h: + * gst/segmentclip/plugin.c: + * gst/siren/gstsiren.c: + * gst/siren/gstsirendec.c: + * gst/siren/gstsirendec.h: + * gst/siren/gstsirenenc.c: + * gst/siren/gstsirenenc.h: + * gst/smooth/gstsmooth.c: + * gst/smooth/gstsmooth.h: + * gst/speed/gstspeed.c: + * 
gst/speed/gstspeed.h: + * gst/subenc/gstsrtenc.c: + * gst/subenc/gstsrtenc.h: + * gst/subenc/gstsubenc.c: + * gst/subenc/gstwebvttenc.c: + * gst/subenc/gstwebvttenc.h: + * gst/switchbin/gstswitchbin.c: + * gst/switchbin/gstswitchbin.h: + * gst/switchbin/plugin.c: + * gst/timecode/gstavwait.c: + * gst/timecode/gstavwait.h: + * gst/timecode/gsttimecodestamper.c: + * gst/timecode/gsttimecodestamper.h: + * gst/timecode/plugin.c: + * gst/transcode/gsttranscodebin.c: + * gst/transcode/gsttranscodeelement.c: + * gst/transcode/gsttranscodeelements.h: + * gst/transcode/gsttranscodeplugin.c: + * gst/transcode/gsturitranscodebin.c: + * gst/transcode/meson.build: + * gst/videofilters/gstscenechange.c: + * gst/videofilters/gstscenechange.h: + * gst/videofilters/gstvideodiff.c: + * gst/videofilters/gstvideodiff.h: + * gst/videofilters/gstvideofiltersbad.c: + * gst/videofilters/gstzebrastripe.c: + * gst/videofilters/gstzebrastripe.h: + * gst/videoframe_audiolevel/gstvideoframe-audiolevel.c: + * gst/videoframe_audiolevel/gstvideoframe-audiolevel.h: + * gst/videoparsers/gstav1parse.c: + * gst/videoparsers/gstdiracparse.c: + * gst/videoparsers/gsth263parse.c: + * gst/videoparsers/gsth264parse.c: + * gst/videoparsers/gsth265parse.c: + * gst/videoparsers/gstjpeg2000parse.c: + * gst/videoparsers/gstmpeg4videoparse.c: + * gst/videoparsers/gstmpegvideoparse.c: + * gst/videoparsers/gstpngparse.c: + * gst/videoparsers/gstvc1parse.c: + * gst/videoparsers/gstvideoparserselement.c: + * gst/videoparsers/gstvideoparserselements.h: + * gst/videoparsers/gstvp9parse.c: + * gst/videoparsers/meson.build: + * gst/videoparsers/plugin.c: + * gst/videosignal/gstsimplevideomark.c: + * gst/videosignal/gstsimplevideomark.h: + * gst/videosignal/gstsimplevideomarkdetect.c: + * gst/videosignal/gstsimplevideomarkdetect.h: + * gst/videosignal/gstvideoanalyse.c: + * gst/videosignal/gstvideoanalyse.h: + * gst/videosignal/gstvideosignal.c: + * gst/vmnc/vmncdec.c: + * gst/vmnc/vmncdec.h: + * gst/y4m/gsty4mdec.c: + 
* gst/y4m/gsty4mdec.h: + gst-plugins: allow per feature registration + Split plugin into features including + dynamic types which can be individually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2110> + +2021-04-10 20:34:26 +0200 Helmut Januschka <helmut@januschka.com> + + * gst/rtmp2/rtmp/rtmpclient.c: + allow NetStream.Play.PublishNotify Message + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2154> + +2021-03-26 11:00:50 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * sys/bluez/bluez-plugin.c: + * sys/bluez/gsta2dpsink.c: + * sys/bluez/gsta2dpsink.h: + * sys/bluez/gstavdtpsink.c: + * sys/bluez/gstavdtpsink.h: + * sys/bluez/gstavdtpsrc.c: + * sys/bluez/gstavdtpsrc.h: + * sys/bluez/gstbluezelement.c: + * sys/bluez/gstbluezelements.h: + * sys/bluez/meson.build: + * sys/decklink/gstdecklink.cpp: + * sys/decklink/gstdecklink.h: + * sys/decklink/gstdecklinkaudiosink.cpp: + * sys/decklink/gstdecklinkaudiosink.h: + * sys/decklink/gstdecklinkaudiosrc.cpp: + * sys/decklink/gstdecklinkaudiosrc.h: + * sys/decklink/gstdecklinkdeviceprovider.cpp: + * sys/decklink/gstdecklinkdeviceprovider.h: + * sys/decklink/gstdecklinkplugin.cpp: + * sys/decklink/gstdecklinkvideosink.cpp: + * sys/decklink/gstdecklinkvideosink.h: + * sys/decklink/gstdecklinkvideosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.h: + * sys/decklink/meson.build: + * sys/dvb/dvbbasebin.c: + * sys/dvb/dvbbasebin.h: + * sys/dvb/gstdvb.c: + * sys/dvb/gstdvbelement.c: + * sys/dvb/gstdvbelements.h: + * sys/dvb/gstdvbsrc.c: + * sys/dvb/gstdvbsrc.h: + * sys/dvb/meson.build: + * sys/fbdev/gstfbdevsink.c: + * sys/fbdev/gstfbdevsink.h: + * sys/ipcpipeline/gstipcpipeline.c: + * sys/ipcpipeline/gstipcpipelineelement.c: + * 
sys/ipcpipeline/gstipcpipelineelements.h: + * sys/ipcpipeline/gstipcpipelinesink.c: + * sys/ipcpipeline/gstipcpipelinesrc.c: + * sys/ipcpipeline/gstipcslavepipeline.c: + * sys/ipcpipeline/meson.build: + * sys/kms/gstkmssink.c: + * sys/kms/gstkmssink.h: + * sys/magicleap/mlaudiosink.c: + * sys/magicleap/mlaudiosink.h: + * sys/magicleap/plugin.c: + * sys/opensles/meson.build: + * sys/opensles/opensles.c: + * sys/opensles/opensles.h: + * sys/opensles/openslesplugin.c: + * sys/opensles/openslessink.c: + * sys/opensles/openslessink.h: + * sys/opensles/openslessrc.c: + * sys/opensles/openslessrc.h: + * sys/shm/gstshm.c: + * sys/shm/gstshmsink.c: + * sys/shm/gstshmsink.h: + * sys/shm/gstshmsrc.c: + * sys/shm/gstshmsrc.h: + * sys/uvch264/gstuvch264.c: + * sys/uvch264/gstuvch264_mjpgdemux.c: + * sys/uvch264/gstuvch264_mjpgdemux.h: + * sys/uvch264/gstuvch264_src.c: + * sys/uvch264/gstuvch264_src.h: + * sys/uvch264/gstuvch264deviceprovider.c: + * sys/uvch264/gstuvch264deviceprovider.h: + plugins-sys: allow per feature registration + Split plugin into features including + dynamic types which can be individually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2116> + +2021-04-09 01:54:50 +0900 Seungha Yang <seungha@centricular.com> + + codecs: vp9decoder: Update docs + * Remove "FIXME 1.20": All the bits are addressed already by using + vp9parse element + * Fix copy & paste errors: Some comments were copied from h264decoder + blindly. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2151> + +2021-04-09 12:45:46 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: vp9decoder: Make duplicate_picture() vfunc optional + The default implementation was required when superframe parsing + was handled by vp9decoder. For instance, if a superframe consists + of multiple frames with show_existing_frame header, it was vague + which GstVp9Picture should consume GstVideoCodecFrame. + After the 1.18 release, we introduced the vp9parse element and + superframe should be handled by the upstream vp9parse element now. + So, we don't need to care about the superframe at vp9decoder class + level anymore. Simply, a frame corresponding to show_existing_frame + can be dropped if the subclass doesn't implement duplicate_picture(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2151> + +2021-03-30 14:40:53 +0100 Philippe Normand <philn@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/debugutils/debugutilsbad.c: + * gst/debugutils/gstfakeaudiosink.c: + * gst/debugutils/gstfakeaudiosink.h: + * gst/debugutils/meson.build: + debugutils: Add fakeaudiosink element + This element can be useful for CI purposes on machines not running any system + audio daemon. The element implements the GstStreamVolume interface. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2125> + +2021-04-08 14:53:52 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codecallocator.c: + v4l2codecs: allocator: Keep dmabuf mapped + DMABuf allocator already implements DMABuf Sync, meaning that doing + mmap/munmap (unless the mode has changed) is not required. In fact, on + systems with IOMMU it makes the kernel redo the mmu table which is visible + in the CPU usage. 
+ This change reduces CPU usage when decoding + bbb_sunflower_2160p_60fps_normal.mp4 on RK3399 SoC from over 30% to + around 15%. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2152> + +2021-04-03 14:16:22 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: Implement mem_copy for VA memory. + Implementation of mem_copy() virtual method for GstVaAllocator. + It's a deep copy where a new VA memory is popped out from the pool or, + if the pool is empty, a new memory is allocated. The original memory is + mapped to read, and if its VAImage is not derived and the size to copy is + the whole surface, the mapped VAImage of the original memory is put in + the new memory. Otherwise a slow memcpy is done between both memories. + Fixes: #1568 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2136> + +2021-04-08 20:29:29 +0800 Zhao Zhili <zhilizhao@tencent.com> + + * ext/srt/gstsrtobject.c: + srtobject: fix optlen of srt_getsockflag + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2149> + +2021-01-14 14:24:06 +0800 Haihua Hu <jared.hu@nxp.com> + + * gst/videoparsers/gstjpeg2000parse.c: + jpeg2000parse: fix critical log when playing a gray colorspace video + Need to guess the color space based on the number of components when it cannot + be obtained from sink caps + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1955> + +2020-12-11 16:33:39 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/ldac/gstldacenc.c: + ldacenc: Emit message on errors + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1875> + +2020-12-11 16:26:00 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/sbc/gstsbcenc.c: + sbc: Return hard error on allocation or mapping error + Also post a message on the bus in these cases. + wpe: Emit load-progress messages + Part-of: 
<https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1875> + +2020-10-25 16:39:48 +0000 Matthieu De Beule <matthieu.de@beule.be> + + * gst-libs/gst/player/gstplayer.c: + Tell programmers that set_volume uses linear scale (fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1439) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1722> + +2020-12-11 14:52:20 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/ldac/gstldacenc.c: + * ext/sbc/gstsbcenc.c: + sbc/ldac: Don't use GST_CAPS_NONE to mean NULL + The GST_CAPS_NONE macro actually returns an instance of + an empty caps. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1874> + +2021-03-30 17:24:38 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: vp9decoder: Allow decoding start with intra-only frame + As per spec "7.2 Uncompressed header semantics" and + "8.2 Frame order constraints", decoding can start with an intra-only + frame. This commit fixes the vp90-2-16-intra-only.webm + bitstream test failure. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2112> + +2021-03-29 02:11:22 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.h: + * sys/d3d11/gstd3d11vp9dec.cpp: + * sys/nvcodec/gstnvvp9dec.c: + * sys/va/gstvavp9dec.c: + codecs: vp9decoder: Pass GstVideoCodecFrame to duplicate_picture() + ... 
and fix picture duplication logic for vavp9dec + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2112> + +2021-03-30 11:49:43 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.h: + * gst-libs/gst/codecs/gstvp9picture.h: + * sys/d3d11/gstd3d11vp9dec.cpp: + * sys/nvcodec/gstnvvp9dec.c: + * sys/va/gstvavp9dec.c: + codecs: vp9decoder: Port to GstVp9StatefulParser + Use the newly implemented VP9 parser. Since the new GstVp9FrameHeader + struct holds all the information of the stream, the baseclass will not + pass the parser object to the new_sequence() method anymore. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2112> + +2021-03-27 15:32:59 +0900 Seungha Yang <seungha@centricular.com> + + codecparsers: Reimplement VP9 parser + The existing VP9 parser implementation doesn't provide information + required by other stateless decoding APIs (i.e., DXVA and NVDEC); + specifically, loop filter and segmentation parameters might not exist for + the current frame. So the parser needs to fill in the information using previously + parsed information. + We could update the gstvp9parser implementation so that it can provide + all information required by stateless decoding APIs with a huge API break, + or add more ugly structs to it. + Instead of doing so, this commit introduces a new VP9 parser implementation. + What is different from the existing one? + * All variables will follow the specification as much as possible: + VP9 Bitstream & Decoding Process Specification - v0.6 31st March 2016 + * The parser will fill all the information required for decoding a frame + into the GstVp9FrameHeader struct. In case of the old VP9 parser, the + user needs to read additional data from the parser's member variables. 
+ * GstVp9StatefulParser object struct is completely opaque + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2112> + +2021-03-28 16:11:23 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: vp9decoder: Don't check codec change with show_existing_frame + Show existing frame will zero the frame_type value but it doesn't mean + it's a keyframe. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2112> + +2021-04-06 16:24:39 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2codecs: Fix holding of reference picture buffer + The picture buffer (V4L2 CAPTURE buffer) was being released immediately + when the request was done. This was problematic since even after the + request is done, the picture buffer might still be used as a reference + and should not be reused for further decoding yet. + This change effectively binds the picture buffer lifetime to the request. + So that if the picture is never shown (decode only frame) or the request + queue is full before the buffer is displayed, the picture buffer will + remain alive. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2142> + +2021-04-07 07:48:57 -0400 Doug Nazar <nazard@nazar.ca> + + * gst/rtmp2/rtmp/rtmpmessage.c: + rtmp2: Use correct size of write macro for param2. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2146> + +2021-04-01 07:59:45 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: remove unsupported formats because of driver bugs + Add a way to filter out video formats from caps because of unresolved + bugs in drivers. In this case for media-driver (iHD) where some RGB32 + formats are not handled correctly. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2129> + +2021-03-31 09:59:21 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavideoformat.c: + va: videoformats: Map more color formats. + Added Y212_LE, Y412_LE, P012_LE, Y444, RGB16, RGB and BGR10A2_LE in + the static map between VA and GStreamer color formats. This synchronizes + the map used in gstreamer-vaapi and this plugin. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2129> + +2021-03-31 09:50:46 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadisplay.c: + * sys/va/gstvavideoformat.c: + * sys/va/gstvavideoformat.h: + va: videoformats: Fix RGB32 mapping between VA and GStreamer. + Different VA drivers might have different definitions for RGB32 color + formats because of different bit interpretations. Sadly the specification + doesn't clarify these interpretations. So VA users have to figure out + what's the correct mapping with its rendering color format + definition. + This patch aims to fix the static map structure after the + VAImageFormats are queried. There is another static map with the + different interpretations of the RGB32 formats; these are compared with + the given VAImageFormat and then, with the GStreamer color format, used to update + the mapping table. + Finally, some RGB32 color formats were added. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2129> + +2021-04-07 01:03:15 -0400 Doug Nazar <nazard@nazar.ca> + + * ext/avtp/gstavtpcvfdepay.c: + * ext/avtp/gstavtpcvfpay.c: + * ext/avtp/gstavtpsrc.c: + avtp: Fix log format macros + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2145> + +2021-04-06 13:07:52 -0300 Daniel Almeida <daniel.almeida@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2codecs: gstv4l2decoder.c: Add missing include + Add missing include for sys/ioctl.h so that these warnings disappear + when compiling: + ../subprojects/gst-plugins-bad/sys/v4l2codecs/gstv4l2decoder.c:179:9: + warning: implicit declaration of function ‘ioctl’ + [-Wimplicit-function-declaration] + Signed-off-by: Daniel Almeida <daniel.almeida@collabora.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2140> + +2021-04-06 19:18:45 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/curl/gstcurlsftpsink.c: + curlsftpsink: Don't run GST_DEBUG_OBJECT() on a class struct + It's supposed to be a GObject. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2141> + +2021-03-29 15:29:30 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * sys/msdk/gstmsdkenc.c: + * sys/msdk/gstmsdkh265enc.c: + msdkh265enc: add support for RGB 10bit format + The SDK can support A2RGB10 format [1]. A2RGB10 format corresponds + to BGR10A2_LE format in gstreamer. A2RGB10 format only supports + low-power mode. + Example: + gst-launch-1.0 videotestsrc ! video/x-raw,format=BGR10A2_LE \ + ! msdkh265enc low-power=1 ! 
fakesink + [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxframedata + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2126> + +2021-03-31 16:18:04 +0200 Wim Taymans <wtaymans@redhat.com> + + * gst-libs/gst/vulkan/android/gstvkwindow_android.c: + * gst-libs/gst/vulkan/cocoa/gstvkwindow_cocoa.m: + * gst-libs/gst/vulkan/gstvkapi.h: + * gst-libs/gst/vulkan/gstvkhandle.h: + * gst-libs/gst/vulkan/ios/gstvkwindow_ios.m: + * gst-libs/gst/vulkan/wayland/gstvkwindow_wayland.c: + * gst-libs/gst/vulkan/win32/gstvkwindow_win32.c: + * gst-libs/gst/vulkan/xcb/gstvkwindow_xcb.c: + * sys/applemedia/videotexturecache-vulkan.mm: + vulkan: provide a custom VK_DEFINE_NON_DISPATCHABLE_HANDLE + If the application did not define one yet, define our own + VK_DEFINE_NON_DISPATCHABLE_HANDLE that is independent of the + architecture. + Vulkan, by default, provides a define that depends on the architecture, + which causes the symbol type to be different. This causes an + architecture dependent .gir file, which then causes multilib + installation problems because the .gir files can't be shared. + Make it possible to override the format specifier and provide + a default one that is compatible with the default non dispatchable + handle. + Return VK_NULL_HANDLE from functions that return a non-dispatchable + handle. + Fixes #1566 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2130> + +2021-03-26 17:48:09 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + va: postproc, filter: add disable-passthrough property + vapostproc tries to be in passthrough mode as much as possible. But + there might be situations where the user wants to force processing of the + frames. For example, when upstream sets the crop meta and the user + wants VA to do that cropping, rather than downstream. 
+ For those situations this property will disable the passthrough mode, + if it's enabled. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2058> + +2021-02-23 09:01:10 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: enable cropping by crop meta + If incoming buffers have crop meta it's done by vapostproc, iff + vapostproc is not in passthrough mode and downstream doesn't handle + it. + This patch announces the crop meta API in the proposed bufferpool, while + it stops filtering meta APIs, since it only filtered the crop API. + Also if downstream supports crop and video metas, vapostproc + announces both meta APIs in the upstream bufferpool. + Finally, the meta is removed from the buffer if the crop is enabled. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2058> + +2021-03-04 15:19:25 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + va: filter: add gst_va_filter_enable_cropping () + This will toggle the cropping operation in the filter + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2058> + +2021-01-23 12:53:25 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + va: filter, vpp: add and use GstVaSample struct + This new struct describes the input and output GstBuffers to + post-process, including VA flags. It also contains the VASurfaceID and + VARectangle, but those are private, completed inside GstVaFilter. + It is used to pass arguments to the gst_va_filter_convert_surface() function. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2058> + +2021-02-28 08:38:36 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: Use allocation caps when creating sink pool. 
+ When an input buffer needs to be copied into a VA memory, it's + required to create a buffer pool. This patch uses the + propose_allocation() caps to instantiate the allocator and pool, + instead of the negotiated caps, which rather represents the resolution + to display. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2058> + +2021-01-22 23:54:50 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/closedcaption/gstline21dec.c: + line21dec: relax caps requirements + Instead of requiring interlaced video, simply skip CC detection + when the input is progressive. + This allows placing line21decoder unconditionally in pipelines, + without having to worry about whether the input stream will be + interlaced, or even worse interlacing just in case! + + update doc cache + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1885> + +2020-12-16 01:02:53 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstline21dec.c: + * ext/closedcaption/gstline21dec.h: + line21dec: expose mode property + That new property can be used to control whether and how + detected CC meta should be inserted in the list of existing + CC meta on the input frame (if there was any). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1885> + +2020-12-15 22:01:33 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/closedcaption/gstline21dec.c: + * ext/closedcaption/gstline21dec.h: + line21dec: expose ntsc-only property + When this is set, the element only tries to detect CC when the + height is 525 or 486 (NTSC resolutions). The height is already + checked. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1885> + +2021-03-31 11:52:07 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: Use derived images only if not mapped for reading. + Derived images are direct maps to surface bits, but in Intel Gen7 to + Gen9, that memory is not cacheable, thus reading can be very slow (it + might produce timeouts in tests such as fluster). + This patch first tries to determine whether derived images are possible, and + later uses them only if mapping is not for reading. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2128> + +2021-03-31 11:13:52 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvacaps.c: + va: caps: Add image formats in raw caps only for non-iHD. + This plugin, for decoders more concretely, assumes that a VA config + can do certain color conversions when mapping frames onto CPU's + memory. + This assumption was valid for i965 and Gallium drivers which generate + valid outputs in bitstream testers (e.g. fluster). Nonetheless, with iHD, + even when it generates acceptable rendered frames, the output MD5 of the + tests wasn't valid. + This patch appends the image formats, for color conversion when mapping + to memory, for non-iHD drivers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2128> + +2021-04-01 15:09:45 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11convert.cpp: + * sys/d3d11/gstd3d11upload.cpp: + d3d11: Fix for UYVY/VYUY format rendering + Don't assume that non-native DXGI formats support RTV and/or SRV. 
+ We are mapping UYVY and VYUY formats to DXGI_FORMAT_R8G8_B8G8_UNORM + which doesn't support render target view + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2133> + +2021-03-25 03:16:05 +1100 Jan Schmidt <jan@centricular.com> + + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/gstbasetsmux.h: + mpegtsmux: Respect the start-time-selection property. + Use the start time provided by the aggregator base class for output + times. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2105> + +2021-03-29 15:24:38 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * sys/msdk/gstmsdkh265enc.c: + * sys/msdk/gstmsdkh265enc.h: + msdkh265enc: add dblk-idc property + The SDK can support deblocking reference structure [1], so add a new + property to enable this feature. + [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxextcodingoption2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2122> + +2021-03-29 15:18:13 +0800 Yinhang Liu <yinhang.liu@intel.com> + + * sys/msdk/gstmsdkh264enc.c: + * sys/msdk/gstmsdkh264enc.h: + msdkh264enc: add dblk-idc property + The SDK can support deblocking reference structure [1], so add a new + property to enable this feature. + [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxextcodingoption2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2122> + +2021-03-30 11:34:54 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/flite/gstflite.c: + * ext/flite/gstflitetestsrc.c: + flite: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2123> + +2021-03-30 11:27:11 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/dtls/plugin.c: + dtls: hotfix: allow per feature registration + Use of GST_ELEMENT_REGISTER in plugin.c + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2123> + +2021-03-26 19:47:06 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/gs/meson.build: + gs: remove clang formatting + remove clang formatting during + the build. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2115> + +2021-03-26 11:41:50 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/gs/gstgs.cpp: + * ext/gs/gstgssink.cpp: + * ext/gs/gstgssink.h: + * ext/gs/gstgssrc.cpp: + * ext/gs/gstgssrc.h: + gs: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2115> + +2021-03-29 12:15:18 +0300 Sebastian Dröge <sebastian@centricular.com> + + * gst/timecode/gstavwait.c: + avwait: Don't reset time tracking when receiving the same segment again + This causes avwait to go back into "dropping" mode until audio and video + are synced again, which is unnecessary when the segment didn't actually + change. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2121> + +2021-03-17 14:30:09 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvdecoder.h: + * sys/nvcodec/gstnvh264dec.c: + * sys/nvcodec/gstnvh265dec.c: + * sys/nvcodec/gstnvvp8dec.c: + * sys/nvcodec/gstnvvp9dec.c: + nvcodec: nvsldec: Refactor graphics api resource handling + * Move GL context object to GstNVDecoder object, and remove + duplicated handling of each codec decoder element + * Don't create GL context too early. We can create it only if + we need to negotiate with downstream gl element. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2089> + +2021-03-17 14:38:40 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkh264enc.c: + * sys/msdk/gstmsdkh264enc.h: + * sys/msdk/gstmsdkh265enc.c: + * sys/msdk/gstmsdkh265enc.h: + * sys/msdk/msdk-enums.c: + * sys/msdk/msdk-enums.h: + msdkenc{h264,h265}: add intra-refresh-type property + The SDK allows user to specify the intra refresh type which can improve + error resilience without significant impact on encoded bitstream size + caused by I frames [1] + [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxextcodingoption2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2090> + +2021-03-28 12:03:09 +0200 Marijn Suijten <marijns95@gmail.com> + + * gst-libs/gst/d3d11/gstd3d11memory.h: + * gst-libs/gst/webrtc/webrtc_fwd.h: + Add @ prefix to enum-variant references in documentation + Found while working on GStreamer-rs documentation, some enums had this + bit of text pasted verbatim in the enum documentation rather than + attached to the enum-variant. 
Fortunately it seems these in WebRTC and + D3D11 are the only ones matching the non-@-prefixed pattern: + ^ \* GST_\w+:\s*\w+ + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2118> + +2021-03-26 12:20:07 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/rtmp2/rtmp/rtmpconnection.c: + rtmp2/connection: Separate inner from outer cancelling + The connection cancels itself when it is closed. To prevent the + cancellable passed to `gst_rtmp_connection_new` from being unexpectedly + cancelled, separate inner from outer cancellation by holding two + cancellables. + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1558 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2111> + +2021-03-28 12:06:24 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11vp9dec: Remove debug dump functions + Existing debug messages are not quite useful + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2117> + +2021-03-28 16:06:55 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Move zero-copy decision logic into decoder object + Get rid of all duplicated code for the zero-copy decision and output buffer + allocation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2117> + +2021-03-26 22:40:34 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11vp9dec: Ignore show_frame flag in output_picture() + The base class will not call output_picture() if the picture shouldn't be output. 
+ Note that the show_frame flag can be zero when show_existing_frame is set + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2114> + +2021-03-26 22:27:38 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: vp9decoder: Fix to output frame when show_existing_frame flag is set + When the show_existing_frame flag is set, the show_frame flag is zero, + but we should output the previously decoded frame as specified in the frame header. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2114> + +2021-03-26 21:06:59 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Do not hardcode the minimum resolution limit to 64 + The decoder should be able to support resolutions lower than 64x64 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2113> + +2021-03-25 21:17:07 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.cpp: + * sys/d3d11/gstd3d11videosinkbin.cpp: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window.h: + d3d11videosink: Remove DirectWrite related dead code + It's no longer enabled since we moved the core part to gst-libs + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2108> + +2021-03-25 03:24:11 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideoenc.cpp: + mfvideoenc: Don't pass 0/1 framerate to MFT + Some MFT implementations do not accept a 0/1 framerate, which will + result in an encoder open failure. If the framerate is unknown, + we will use an arbitrary 25/1 framerate value. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2106> + +2021-03-23 13:48:09 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11memory.c: + * gst-libs/gst/d3d11/gstd3d11memory.h: + * sys/d3d11/gstd3d11decoder.cpp: + d3d11decoder: Resurrect zero-copy for fixed-size DPB pool + Enable zero-copy if downstream proposed a pool, so that the decoder + knows the number of buffers required by downstream. + Otherwise the decoder will copy when our DPB pool has insufficient + buffers for later decoding operations. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2097> + +2021-03-20 19:52:16 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Implement array-of-texture DPB again + Re-implementation of array-of-texture DPB based on the d3d11 memory pool. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2097> + +2021-03-18 22:31:55 +0900 Seungha Yang <seungha@centricular.com> + + d3d11: Implement memory pool + Major changes: + * GstD3D11Allocator: This allocator is now a device-independent object + which can allocate GstD3D11Memory objects for any GstD3D11Device. + Users can get this object via gst_allocator_find(GST_D3D11_MEMORY_NAME) + * GstD3D11PoolAllocator: A new allocator implementation for texture pools. + From now on, GstD3D11BufferPool will make use of this memory pool allocator + to avoid frequent texture reallocation. 
That usually happens because + of buffer copies (gst_buffer_make_writable, for example). + In addition to that, GstD3D11BufferPool will provide GstBuffer with + GstVideoMeta, because CPU access to a GstD3D11Memory without GstVideoMeta + is almost impossible since GPU drivers need padding for stride alignment. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2097> + +2021-03-20 22:11:49 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11memory.c: + * gst-libs/gst/d3d11/gstd3d11memory.h: + * sys/d3d11/gstd3d11decoder.cpp: + d3d11decoder: Temporarily remove zero-copy related code + We will re-implement it based on the memory pool + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2097> + +2021-03-23 09:33:49 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: Set one buffer in pools as minimum. + Some elements, such as videorate, check that the minimum number of + buffers in the proposed pool differs from the maximum, since + they might hold one or more buffers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2102> + +2021-03-23 19:19:14 +0200 Sebastian Dröge <sebastian@centricular.com> + + * gst-libs/gst/codecparsers/gsth264parser.c: + * gst-libs/gst/codecparsers/gsth265parser.c: + * tests/check/libs/h264parser.c: + * tests/check/libs/h265parser.c: + h2645parser: Catch overflows in AVC/HEVC NAL unit length calculations + Offset and size are stored as 32 bit guint and might overflow when + adding the nal_length_size, so let's avoid that. + For the size this would happen if the AVC/HEVC NAL unit size happens to + be stored in 4 bytes and is 4294967292 or higher, which is likely + corrupted data anyway. + For the offset this is something for the caller of these functions to + take care of, but it is unlikely to happen as it would require parsing a + >4GB buffer. 
+ Allowing these overflows causes all kinds of follow-up bugs in the + h2645parse elements, ranging from infinite loops and memory leaks to + potential memory corruptions. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2103> + +2021-02-25 09:59:50 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/zxing/gstzxing.cpp: + * ext/zxing/gstzxing.h: + * ext/zxing/gstzxingplugin.c: + zxing: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-25 09:57:00 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/zbar/gstzbar.c: + * ext/zbar/gstzbar.h: + zbar: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-25 09:51:52 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/x265/gstx265enc.c: + * ext/x265/gstx265enc.h: + x265: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-25 09:45:10 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/wpe/gstwpesrc.cpp: + * ext/wpe/gstwpesrc.h: + wpe: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-25 09:27:19 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/wildmidi/gstwildmididec.c: + * ext/wildmidi/gstwildmididec.h: + wildmidi: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-25 08:18:54 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/webrtcdsp/gstwebrtcdsp.cpp: + * ext/webrtcdsp/gstwebrtcdsp.h: + * ext/webrtcdsp/gstwebrtcdspplugin.cpp: + * ext/webrtcdsp/gstwebrtcechoprobe.cpp: + * ext/webrtcdsp/gstwebrtcechoprobe.h: + * ext/webrtcdsp/meson.build: + webrtcdsp: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-25 08:04:42 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/webp/gstwebp.c: + * ext/webp/gstwebpdec.c: + * ext/webp/gstwebpdec.h: + * ext/webp/gstwebpenc.c: + * ext/webp/gstwebpenc.h: + webp: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 18:56:55 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/wayland/gstwaylandsink.c: + * ext/wayland/gstwaylandsink.h: + wayland: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 18:45:15 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/vulkan/gstvulkan.c: + * ext/vulkan/gstvulkanelement.c: + * ext/vulkan/gstvulkanelements.h: + * ext/vulkan/meson.build: + * ext/vulkan/vkcolorconvert.c: + * ext/vulkan/vkdeviceprovider.c: + * ext/vulkan/vkdownload.c: + * ext/vulkan/vkimageidentity.c: + * ext/vulkan/vksink.c: + * ext/vulkan/vkupload.c: + * ext/vulkan/vkviewconvert.c: + vulkan: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 17:34:50 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/voamrwbenc/gstvoamrwb.c: + * ext/voamrwbenc/gstvoamrwbenc.c: + * ext/voamrwbenc/gstvoamrwbenc.h: + voamrwbenc: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 17:32:34 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/voaacenc/gstvoaac.c: + * ext/voaacenc/gstvoaacenc.c: + * ext/voaacenc/gstvoaacenc.h: + voaacenc: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 13:07:30 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/ttml/gstttmlelement.c: + * ext/ttml/gstttmlelements.h: + * ext/ttml/gstttmlparse.c: + * ext/ttml/gstttmlplugin.c: + * ext/ttml/gstttmlrender.c: + * ext/ttml/meson.build: + ttml: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 12:52:08 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/srtp/gstsrtp.c: + * ext/srtp/gstsrtpdec.c: + * ext/srtp/gstsrtpdec.h: + * ext/srtp/gstsrtpelement.c: + * ext/srtp/gstsrtpelements.h: + * ext/srtp/gstsrtpenc.c: + * ext/srtp/gstsrtpenc.h: + * ext/srtp/gstsrtpplugin.c: + * ext/srtp/meson.build: + srtp: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-24 12:39:22 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/srt/gstsrt.c: + * ext/srt/gstsrtelement.c: + * ext/srt/gstsrtelements.h: + * ext/srt/gstsrtplugin.c: + * ext/srt/gstsrtsink.c: + * ext/srt/gstsrtsrc.c: + * ext/srt/meson.build: + srt: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:54:56 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/spandsp/gstdtmfdetect.c: + * ext/spandsp/gstdtmfdetect.h: + * ext/spandsp/gstspandsp.c: + * ext/spandsp/gstspanplc.c: + * ext/spandsp/gstspanplc.h: + * ext/spandsp/gsttonegeneratesrc.c: + * ext/spandsp/gsttonegeneratesrc.h: + spandsp: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:41:41 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/soundtouch/gstbpmdetect.cc: + * ext/soundtouch/gstbpmdetect.hh: + * ext/soundtouch/gstpitch.cc: + * ext/soundtouch/gstpitch.hh: + * ext/soundtouch/plugin.c: + soundtouch: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:30:50 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/sndfile/gstsf.c: + * ext/sndfile/gstsfdec.c: + * ext/sndfile/gstsfdec.h: + * ext/sndfile/gstsfelement.c: + * ext/sndfile/gstsfelements.h: + * ext/sndfile/gstsfsink.h: + * ext/sndfile/meson.build: + sndfile: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:18:39 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/smoothstreaming/gstmssdemux.c: + * ext/smoothstreaming/gstmssdemux.h: + * ext/smoothstreaming/gstsmoothstreaming-plugin.c: + smoothstreaming: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:14:53 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/sctp/gstsctpdec.c: + * ext/sctp/gstsctpdec.h: + * ext/sctp/gstsctpenc.c: + * ext/sctp/gstsctpenc.h: + * ext/sctp/gstsctpplugin.c: + sctp: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:09:18 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/sbc/gstsbcdec.c: + * ext/sbc/gstsbcdec.h: + * ext/sbc/gstsbcenc.c: + * ext/sbc/gstsbcenc.h: + * ext/sbc/sbc-plugin.c: + sbc: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 12:00:13 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/rtmp/gstrtmp.c: + * ext/rtmp/gstrtmpelement.c: + * ext/rtmp/gstrtmpelements.h: + * ext/rtmp/gstrtmpsink.c: + * ext/rtmp/gstrtmpsink.h: + * ext/rtmp/gstrtmpsrc.c: + * ext/rtmp/meson.build: + rtmp: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 11:53:12 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/rsvg/gstrsvg.c: + * ext/rsvg/gstrsvgdec.c: + * ext/rsvg/gstrsvgdec.h: + * ext/rsvg/gstrsvgoverlay.c: + * ext/rsvg/gstrsvgoverlay.h: + rsvg: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-19 11:40:40 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/resindvd/plugin.c: + * ext/resindvd/resindvdbin.c: + * ext/resindvd/resindvdbin.h: + resindvd: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 16:23:42 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/qroverlay/gstdebugqroverlay.c: + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlayelement.c: + * ext/qroverlay/gstqroverlayelements.h: + * ext/qroverlay/gstqroverlayplugin.c: + * ext/qroverlay/meson.build: + qroverlay: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 15:56:44 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/opus/gstopus.c: + * ext/opus/gstopusparse.c: + * ext/opus/gstopusparse.h: + opus: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 15:48:12 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openni2/gstopenni2.cpp: + * ext/openni2/gstopenni2src.cpp: + * ext/openni2/gstopenni2src.h: + openni2: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 15:42:44 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openmpt/gstopenmptdec.c: + * ext/openmpt/gstopenmptdec.h: + * ext/openmpt/plugin.c: + openmpt: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 15:30:06 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openjpeg/gstopenjpeg.c: + * ext/openjpeg/gstopenjpegdec.c: + * ext/openjpeg/gstopenjpegdec.h: + * ext/openjpeg/gstopenjpegenc.c: + * ext/openjpeg/gstopenjpegenc.h: + openjpeg: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 15:21:40 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openh264/gstopenh264dec.cpp: + * ext/openh264/gstopenh264element.c: + * ext/openh264/gstopenh264elements.h: + * ext/openh264/gstopenh264enc.cpp: + * ext/openh264/gstopenh264plugin.c: + * ext/openh264/meson.build: + openh264: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. + More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 14:08:34 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openexr/gstopenexr.c: + * ext/openexr/gstopenexrdec.cpp: + * ext/openexr/gstopenexrdec.h: + openexr: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here: + https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199 + https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038> + +2021-02-18 13:34:54 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/opencv/gstcameracalibrate.cpp: + * ext/opencv/gstcameracalibrate.h: + * ext/opencv/gstcameraundistort.cpp: + * ext/opencv/gstcameraundistort.h: + * ext/opencv/gstcvdilate.cpp: + * ext/opencv/gstcvdilate.h: + * ext/opencv/gstcvequalizehist.cpp: + * ext/opencv/gstcvequalizehist.h: + * ext/opencv/gstcverode.cpp: + * ext/opencv/gstcverode.h: + * ext/opencv/gstcvlaplace.cpp: + * ext/opencv/gstcvlaplace.h: + * ext/opencv/gstcvsmooth.cpp: + * ext/opencv/gstcvsmooth.h: + * ext/opencv/gstcvsobel.cpp: + * ext/opencv/gstcvsobel.h: + * ext/opencv/gstdewarp.cpp: + * ext/opencv/gstdewarp.h: + * ext/opencv/gstdisparity.cpp: + * ext/opencv/gstdisparity.h: + * ext/opencv/gstedgedetect.cpp: + * ext/opencv/gstedgedetect.h: + * ext/opencv/gstfaceblur.cpp: + * ext/opencv/gstfaceblur.h: + * ext/opencv/gstfacedetect.cpp: + * ext/opencv/gstfacedetect.h: + * ext/opencv/gstgrabcut.cpp: + * ext/opencv/gstgrabcut.h: + * ext/opencv/gsthanddetect.cpp: + * ext/opencv/gsthanddetect.h: + * ext/opencv/gstmotioncells.cpp: + * ext/opencv/gstmotioncells.h: + * ext/opencv/gstopencv.cpp: + * ext/opencv/gstretinex.cpp: + * ext/opencv/gstretinex.h: + * ext/opencv/gstsegmentation.cpp: + * ext/opencv/gstsegmentation.h: + * ext/opencv/gstskindetect.cpp: + * ext/opencv/gstskindetect.h: + * ext/opencv/gsttemplatematch.cpp: + * ext/opencv/gsttemplatematch.h: + * ext/opencv/gsttextoverlay.cpp: + * ext/opencv/gsttextoverlay.h: + opencv: allow per feature registration + Split plugin into features including + dynamic types which can be indiviually + registered during a static build. 
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:58:28 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/openaptx/gstopenaptxdec.c:
+ * ext/openaptx/gstopenaptxdec.h:
+ * ext/openaptx/gstopenaptxenc.c:
+ * ext/openaptx/gstopenaptxenc.h:
+ * ext/openaptx/openaptx-plugin.c:
+ openaptx: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:52:51 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/openal/gstopenal.c:
+ * ext/openal/gstopenalelement.c:
+ * ext/openal/gstopenalelements.h:
+ * ext/openal/gstopenalsink.c:
+ * ext/openal/gstopenalsrc.c:
+ * ext/openal/meson.build:
+ openal: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:41:53 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/ofa/gstofa.c:
+ * ext/ofa/gstofa.h:
+ ofa: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:35:34 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/neon/gstneonhttpsrc.c:
+ * ext/neon/gstneonhttpsrc.h:
+ neon: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:24:18 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/musepack/gstmusepackdec.c:
+ * ext/musepack/gstmusepackdec.h:
+ musepack: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:17:20 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/mplex/gstmplex.cc:
+ * ext/mplex/gstmplex.hh:
+ mplex: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:14:38 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/mpeg2enc/gstmpeg2enc.cc:
+ * ext/mpeg2enc/gstmpeg2enc.hh:
+ mpeg2enc: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 10:10:16 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/modplug/gstmodplug.cc:
+ * ext/modplug/gstmodplug.h:
+ modplug: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 09:56:08 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/mdns/gstmicrodns.c:
+ mdns: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 09:52:08 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/libmms/gstmms.c:
+ * ext/libmms/gstmms.h:
+ libmms: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 09:50:21 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/libde265/gstlibde265.c:
+ * ext/libde265/libde265-dec.c:
+ * ext/libde265/libde265-dec.h:
+ libde265: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-18 09:48:04 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/ldac/gstldacenc.c:
+ * ext/ldac/gstldacenc.h:
+ * ext/ldac/ldac-plugin.c:
+ ldac: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:38:16 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/kate/gstkate.c:
+ * ext/kate/gstkatedec.c:
+ * ext/kate/gstkateelement.c:
+ * ext/kate/gstkateelements.h:
+ * ext/kate/gstkateenc.c:
+ * ext/kate/gstkateparse.c:
+ * ext/kate/gstkatetag.c:
+ * ext/kate/gstkatetiger.c:
+ * ext/kate/gstkateutil.c:
+ * ext/kate/meson.build:
+ kate: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:26:42 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/isac/gstisac.c:
+ * ext/isac/gstisacdec.c:
+ * ext/isac/gstisacdec.h:
+ * ext/isac/gstisacenc.c:
+ * ext/isac/gstisacenc.h:
+ isac: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:23:21 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/iqa/iqa.c:
+ * ext/iqa/iqa.h:
+ iqa: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:17:08 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/hls/gsthls.h:
+ * ext/hls/gsthlsdemux.c:
+ * ext/hls/gsthlsdemux.h:
+ * ext/hls/gsthlselement.c:
+ * ext/hls/gsthlselements.h:
+ * ext/hls/gsthlsplugin.c:
+ * ext/hls/gsthlssink.c:
+ * ext/hls/gsthlssink2.c:
+ * ext/hls/gstm3u8playlist.c:
+ * ext/hls/m3u8.c:
+ * ext/hls/m3u8.h:
+ * ext/hls/meson.build:
+ hls: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:07:42 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/gsm/gstgsm.c:
+ * ext/gsm/gstgsmdec.c:
+ * ext/gsm/gstgsmdec.h:
+ * ext/gsm/gstgsmenc.c:
+ * ext/gsm/gstgsmenc.h:
+ gsm: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:04:20 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/gme/gstgme.c:
+ * ext/gme/gstgme.h:
+ gme: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 18:01:05 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/fluidsynth/gstfluiddec.c:
+ * ext/fluidsynth/gstfluiddec.h:
+ fluidsynth: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 16:05:02 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/fdkaac/gstfdkaac.c:
+ * ext/fdkaac/gstfdkaacdec.c:
+ * ext/fdkaac/gstfdkaacdec.h:
+ * ext/fdkaac/gstfdkaacenc.c:
+ * ext/fdkaac/gstfdkaacenc.h:
+ * ext/fdkaac/gstfdkaacplugin.c:
+ * ext/fdkaac/meson.build:
+ fdkaac: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 15:59:49 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/faad/gstfaad.c:
+ * ext/faad/gstfaad.h:
+ faad: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 15:59:36 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/faac/gstfaac.c:
+ * ext/faac/gstfaac.h:
+ faac: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 15:35:10 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/dts/gstdtsdec.c:
+ * ext/dts/gstdtsdec.h:
+ dts: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 12:22:07 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/dtls/gstdtlsdec.c:
+ * ext/dtls/gstdtlselement.c:
+ * ext/dtls/gstdtlselements.h:
+ * ext/dtls/gstdtlsenc.c:
+ * ext/dtls/gstdtlssrtpdec.c:
+ * ext/dtls/gstdtlssrtpdemux.c:
+ * ext/dtls/gstdtlssrtpenc.c:
+ * ext/dtls/meson.build:
+ dtls: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 12:10:31 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/directfb/dfbvideosink.c:
+ * ext/directfb/dfbvideosink.h:
+ directfb: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 12:07:48 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/dc1394/gstdc1394src.c:
+ * ext/dc1394/gstdc1394src.h:
+ dc1394: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 12:03:05 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/dash/gstdashdemux.c:
+ * ext/dash/gstdashdemux.h:
+ * ext/dash/gstdashsink.c:
+ * ext/dash/gstdashsink.h:
+ * ext/dash/gstplugin.c:
+ dash: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 11:55:14 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/curl/gstcurl.c:
+ * ext/curl/gstcurlelement.c:
+ * ext/curl/gstcurlelements.h:
+ * ext/curl/gstcurlfilesink.c:
+ * ext/curl/gstcurlftpsink.c:
+ * ext/curl/gstcurlhttpsink.c:
+ * ext/curl/gstcurlhttpsrc.c:
+ * ext/curl/gstcurlsftpsink.c:
+ * ext/curl/gstcurlsmtpsink.c:
+ * ext/curl/meson.build:
+ curl: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 11:43:33 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/colormanagement/gstcolormanagement.c:
+ * ext/colormanagement/gstlcms.c:
+ * ext/colormanagement/gstlcms.h:
+ colormanagement: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 11:31:35 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/closedcaption/gstcccombiner.c:
+ * ext/closedcaption/gstcccombiner.h:
+ * ext/closedcaption/gstccconverter.c:
+ * ext/closedcaption/gstccconverter.h:
+ * ext/closedcaption/gstccextractor.c:
+ * ext/closedcaption/gstccextractor.h:
+ * ext/closedcaption/gstceaccoverlay.c:
+ * ext/closedcaption/gstceaccoverlay.h:
+ * ext/closedcaption/gstclosedcaption.c:
+ * ext/closedcaption/gstline21dec.c:
+ * ext/closedcaption/gstline21dec.h:
+ * ext/closedcaption/gstline21enc.c:
+ * ext/closedcaption/gstline21enc.h:
+ closedcaption: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 10:23:15 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/chromaprint/gstchromaprint.c:
+ * ext/chromaprint/gstchromaprint.h:
+ chromaprint: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 10:13:45 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/bz2/gstbz2.c:
+ * ext/bz2/gstbz2dec.c:
+ * ext/bz2/gstbz2dec.h:
+ * ext/bz2/gstbz2enc.c:
+ * ext/bz2/gstbz2enc.h:
+ bz2: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 10:10:39 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/bs2b/gstbs2b.c:
+ * ext/bs2b/gstbs2b.h:
+ bs2b: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 10:05:20 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/avtp/gstavtp.c:
+ * ext/avtp/gstavtpaafdepay.c:
+ * ext/avtp/gstavtpaafdepay.h:
+ * ext/avtp/gstavtpaafpay.c:
+ * ext/avtp/gstavtpaafpay.h:
+ * ext/avtp/gstavtpcrfcheck.c:
+ * ext/avtp/gstavtpcrfcheck.h:
+ * ext/avtp/gstavtpcrfsync.c:
+ * ext/avtp/gstavtpcrfsync.h:
+ * ext/avtp/gstavtpcvfdepay.c:
+ * ext/avtp/gstavtpcvfdepay.h:
+ * ext/avtp/gstavtpcvfpay.c:
+ * ext/avtp/gstavtpcvfpay.h:
+ * ext/avtp/gstavtpsink.c:
+ * ext/avtp/gstavtpsink.h:
+ * ext/avtp/gstavtpsrc.c:
+ * ext/avtp/gstavtpsrc.h:
+ avtp: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-02-17 09:45:04 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * ext/assrender/gstassrender.c:
+ * ext/assrender/gstassrender.h:
+ assrender: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2020-08-17 09:52:11 -0400 Julian Bouzas <julian.bouzas@collabora.com>
+
+ * ext/aom/gstaom.c:
+ * ext/aom/gstav1dec.c:
+ * ext/aom/gstav1dec.h:
+ * ext/aom/gstav1enc.c:
+ * ext/aom/gstav1enc.h:
+ aom: allow per feature registration
+ Split plugin into features including
+ dynamic types which can be individually
+ registered during a static build.
+ More details here:
+ https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/199
+ https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/661
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2038>
+
+2021-03-23 16:26:13 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11decoder.cpp:
+ * sys/d3d11/meson.build:
+ d3d11decoder: Enable high precision clock if needed
+ We've been doing retry with 1ms sleep if DecoderBeginFrame()
+ returned E_PENDING which means application should call
+ DecoderBeginFrame() again because GPU is busy.
+ The 1ms sleep() during retry would result in usually about 15ms delay
+ in reality because of bad clock precision on Windows.
+ To improve throughput performance, this commit will enable
+ high precision clock only for NVIDIA platform since
+ DecoderBeginFrame() call on the other GPU vendors seems to
+ succeed without retry.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2099>
+
+2021-03-03 16:03:07 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * gst/mpegdemux/gstmpegdemux.c:
+ * gst/mpegdemux/gstmpegdemux.h:
+ mpegpsdemux: fix accurate seek
+ In an accurate seek, the segment start should be
+ the same as the one requested in the seek.
+ The start should be kept as the one from the
+ segment if it's inferior.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2048>
+
+2021-03-03 14:11:21 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * gst/mpegdemux/gstmpegdemux.c:
+ * gst/mpegdemux/gstmpegdemux.h:
+ mpegpsdemux: Keep seqnum events
+ Keep the same seqnum of the new segment events for each
+ of the streams.
+ Keep the segment to send the EOS event.
+ Keep the seek seqnum for segment and flush event.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2048>
+
+2021-03-01 16:23:09 +0100 Stéphane Cerveau <scerveau@collabora.com>
+
+ * gst/mpegdemux/gstmpegdemux.c:
+ mpegpsdemux: avoid early EOS
+ In a case of a scr different from 0, after a seek,
+ the src_segment.stop has been updated with the duration
+ not including the base_time (scr). The segment position
+ needs to be tested upon segment.stop + base_time (scr)
+ to check for an EOS.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2048>
+
+2021-03-19 16:17:41 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ va: postproc: disable passthrough as soon as possible
+ After the VA filter creation, when changing the element's state from NULL
+ to READY, immediately checks for any filter operation requested by the user.
+ If any, the passthrough mode is disabled early, so there's no need for a
+ future renegotiation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2094>
+
+2021-03-19 16:14:08 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvavpp.c:
+ va: postproc: rename function to gst_va_vpp_update_passthrough
+ Since it's widely used, a proper name will reflect its importance.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2094>
2021-03-22 14:34:36 +1100 Matthew Waters <matthew@centricular.com>
+ * ext/closedcaption/gstccconverter.c:
* ext/colormanagement/gstlcms.c:
* ext/curl/gstcurlqueue.h:
* ext/iqa/iqa.c:
@@ -605,6 +12239,8 @@
* gst-libs/gst/adaptivedemux/gstadaptivedemux.c:
* gst-libs/gst/audio/gstnonstreamaudiodecoder.c:
* gst-libs/gst/codecparsers/gstmpegvideometa.c:
+ * gst-libs/gst/d3d11/gstd3d11device.c:
+ * gst-libs/gst/d3d11/gstd3d11utils.c:
* gst-libs/gst/sctp/sctpreceivemeta.c:
* gst-libs/gst/sctp/sctpsendmeta.c:
* gst-libs/gst/vulkan/android/gstvkwindow_android.c:
@@ -649,16 +12285,16 @@
* sys/applemedia/iosurfaceglmemory.c:
* sys/applemedia/iosurfacevulkanmemory.c:
* sys/bluez/gstavdtpsrc.h:
- * sys/d3d11/gstd3d11decoder.c:
- * sys/d3d11/gstd3d11device.c:
- * sys/d3d11/gstd3d11shader.c:
- * sys/d3d11/gstd3d11utils.c:
- * sys/d3d11/gstd3d11videoprocessor.c:
+ * sys/d3d11/gstd3d11deinterlace.cpp:
+ * sys/d3d11/gstd3d11shader.cpp:
* sys/d3d11/gstd3d11window.cpp:
* sys/d3d11/gstd3d11window_win32.cpp:
* sys/ipcpipeline/gstipcpipelinecomm.c:
+ * sys/mediafoundation/gstmftransform.cpp:
+ * sys/mediafoundation/gstmfvideobuffer.h:
* sys/msdk/gstmsdkcontextutil.c:
* sys/nvcodec/gstcudacontext.c:
+ * sys/nvcodec/gstcudanvrtc.c:
* sys/nvcodec/gstcudautils.c:
* sys/nvcodec/gstnvbaseenc.h:
* sys/opensles/openslescommon.c:
@@ -674,69 +12310,626 @@
with atomic operations.
https://gitlab.gnome.org/GNOME/glib/-/merge_requests/1719
Discovered in https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/868
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2155>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2098>
-2021-04-07 07:48:57 -0400 Doug Nazar <nazard@nazar.ca>
+2021-03-20 16:26:21 +0900 Seungha Yang <seungha@centricular.com>
- * gst/rtmp2/rtmp/rtmpmessage.c:
- rtmp2: Use correct size of write macro for param2.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2147>
+ * sys/mediafoundation/gstmfvideoenc.cpp:
+ * sys/mediafoundation/meson.build:
+ mfvideoenc: Enable Direct3D multi-thread protection
+ As documented by MS. See also
+ https://docs.microsoft.com/en-us/windows/win32/medfound/supporting-direct3d-11-video-decoding-in-media-foundation#open-a-device-handle
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2095>
+
+2021-03-20 16:15:35 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11device.c:
+ * gst-libs/gst/d3d11/gstd3d11memory.c:
+ * gst-libs/gst/d3d11/meson.build:
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11convert.cpp:
+ * sys/d3d11/gstd3d11converter.cpp:
+ * sys/d3d11/gstd3d11decoder.cpp:
+ * sys/d3d11/gstd3d11deinterlace.cpp:
+ * sys/d3d11/gstd3d11desktopdup.cpp:
+ * sys/d3d11/gstd3d11overlaycompositor.cpp:
+ * sys/d3d11/gstd3d11pluginutils.cpp:
+ * sys/d3d11/gstd3d11videoprocessor.cpp:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window_corewindow.cpp:
+ * sys/d3d11/gstd3d11window_dummy.cpp:
+ * sys/d3d11/gstd3d11window_swapchainpanel.cpp:
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ * sys/mediafoundation/gstmfvideoenc.cpp:
+ Revert "d3d11: Enable native multi-thread protection layer and make use of it"
+ This reverts commit 872b7f503c49442e559f6a381416c6a84b76a3c6.
+ Native multi-thread protection layer seems to be consuming more CPU
+ resource than application side protection approach in some cases
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2095>
-2021-03-26 12:20:07 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com>
+2021-03-19 16:36:41 +1100 Matthew Waters <matthew@centricular.com>
- * gst/rtmp2/rtmp/rtmpconnection.c:
- rtmp2/connection: Separate inner from outer cancelling
- The connection cancels itself when it is closed. To avoid the
- cancellable passed to `gst_rtmp_connection_new` from being unexpectedly
- cancelled, separate inner from outer cancellation by holding two
- cancellables.
- Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1558
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2119>
+ * tests/check/elements/webrtcbin.c:
+ tests/webrtc: check for more sdp things across the board
+ e.g.
+ - test for a=setup:$val and direction attributes in all tests
+ - test number of media sections
+ - test number of formats in each m= section (for audio/video)
+ - test no duplicate formats
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2093>
-2021-03-25 03:24:11 +0900 Seungha Yang <seungha@centricular.com>
+2020-06-23 12:41:27 -0700 Julien <jisorce@oblong.com>
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/gs/.clang-format:
+ * ext/gs/README.md:
+ * ext/gs/gstgs.cpp:
+ * ext/gs/gstgscommon.cpp:
+ * ext/gs/gstgscommon.h:
+ * ext/gs/gstgssink.cpp:
+ * ext/gs/gstgssink.h:
+ * ext/gs/gstgssrc.cpp:
+ * ext/gs/gstgssrc.h:
+ * ext/gs/meson.build:
+ * ext/meson.build:
+ * meson_options.txt:
+ gs: add source and sink for Google Cloud Storage
+ Useful when having a service that runs a GStreamer pipeline
+ or application in Google Cloud to avoid storing the inputs
+ and outputs in the running container or service. For example
+ when analyzing a video from a Google Cloud Storage bucket
+ and extracting images or converting the video and then uploading
+ the results into another Google Cloud Storage bucket.
+ - gssrc allows to read from a file located in Google Cloud
+ Storage and it supports seeking.
+ - gssink allows to write to a file located in Google Cloud
+ Storage. There are 2 modes, one similar to multifilesink and
+ the other similar to filesink.
+ Example:
+ gst-launch-1.0 gssrc location=gs://mybucket/videos/sample.mp4 ! decodebin ! glimagesink
+ gst-launch-1.0 playbin uri=gs://mybucket/videos/sample.mp4
+ gst-launch-1.0 videotestsrc num-buffers=5 ! pngenc ! gssink object-name="img/img%05d.png" bucket-name="mybucket" next-file=buffer
+ gst-launch-1.0 filesrc location=sample.mp4 ! gssink object-name="videos/video.mp4" bucket-name="mybucket" next-file=none
+ When running locally simply set GOOGLE_APPLICATION_CREDENTIALS. But
+ when running in Google Cloud Run or Google Cloud Engine, just set the
+ "service-account-email" property on each element.
+ Closes #1264
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1369>
+
+2021-03-17 23:53:04 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11device.c:
+ * gst-libs/gst/d3d11/gstd3d11memory.c:
+ * gst-libs/gst/d3d11/meson.build:
+ * sys/d3d11/gstd3d11compositor.cpp:
+ * sys/d3d11/gstd3d11convert.cpp:
+ * sys/d3d11/gstd3d11converter.cpp:
+ * sys/d3d11/gstd3d11decoder.cpp:
+ * sys/d3d11/gstd3d11deinterlace.cpp:
+ * sys/d3d11/gstd3d11desktopdup.cpp:
+ * sys/d3d11/gstd3d11overlaycompositor.cpp:
+ * sys/d3d11/gstd3d11pluginutils.cpp:
+ * sys/d3d11/gstd3d11videoprocessor.cpp:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window_corewindow.cpp:
+ * sys/d3d11/gstd3d11window_dummy.cpp:
+ * sys/d3d11/gstd3d11window_swapchainpanel.cpp:
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ * sys/mediafoundation/gstmfvideoenc.cpp:
- mfvideoenc: Don't pass 0/1 framerate to MFT
- Some MFT implementations do not accept 0/1 framerate and it will
- result in encoder open failure. If framerate is unknown,
- we will use arbitrary 25/1 framerate value.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2109>
+ d3d11: Enable native multi-thread protection layer and make use of it
+ ... instead of our own GRecMutex locking. In this way, any other
+ Direct3D11 client (MediaFoundation for example) can safely call
+ any Direct3D11 API even when we are sharing our Direct3D11 device
+ with others.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2092>
-2021-03-23 19:19:14 +0200 Sebastian Dröge <sebastian@centricular.com>
+2021-02-26 03:28:29 +1100 Jan Schmidt <jan@centricular.com>
- * gst-libs/gst/codecparsers/gsth264parser.c:
- * gst-libs/gst/codecparsers/gsth265parser.c:
- * tests/check/libs/h264parser.c:
- * tests/check/libs/h265parser.c:
- h2645parser: Catch overflows in AVC/HEVC NAL unit length calculations
- Offset and size are stored as 32 bit guint and might overflow when
- adding the nal_length_size, so let's avoid that.
- For the size this would happen if the AVC/HEVC NAL unit size happens to
- be stored in 4 bytes and is 4294967292 or higher, which is likely
- corrupted data anyway.
- For the offset this is something for the caller of these functions to
- take care of but is unlikely to happen as it would require parsing on a
- >4GB buffer.
- Allowing these overflows causes all kinds of follow-up bugs in the
- h2645parse elements, ranging from infinite loops and memory leaks to
- potential memory corruptions.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2107>
+ * tests/examples/mpegts/meson.build:
+ * tests/examples/mpegts/tsmux-prog-map.c:
+ examples: Add an mpegtsmux example of prog-map usage.
+ Add an example of how to construct the prog-map structure for
+ the MPEG-TS muxers and assign streams to programs, and set PCR
+ and PMT PIDs.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2039> -2021-03-15 19:49:19 +0000 Tim-Philipp Müller <tim@centricular.com> +2021-02-26 02:53:33 +1100 Jan Schmidt <jan@centricular.com> - * meson.build: - Back to development + * gst/mpegtsmux/gstbasetsmux.c: + * gst/mpegtsmux/tsmux/tsmux.c: + * gst/mpegtsmux/tsmux/tsmux.h: + mpegtsmux: Add PMT_%d support to prog-map. + Support a PMT_%d field in the prog-map, that's optionally used + to set the PMT for each program in the mux. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2039> -=== release 1.18.4 === +2021-03-12 18:10:18 +1100 Jan Schmidt <jan@centricular.com> -2021-03-15 17:49:16 +0000 Tim-Philipp Müller <tim@centricular.com> + * gst/mpegtsmux/tsmux/tsmux.c: + * gst/mpegtsmux/tsmux/tsmux.h: + mpegtsmux: Don't write PCR until PAT/PMT are output. + Make sure streams start cleanly with a PAT/PMT and defer the first PCR + output until after that. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2073> - * ChangeLog: - * NEWS: - * RELEASE: - * gst-plugins-bad.doap: - * meson.build: - Release 1.18.4 +2021-03-11 18:21:11 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/mpegtsmux/tsmux/tsmux.c: + tsmux: finalize PCR timing for complete accuracy + In order to always insert a PCR packet right on time, we need to + check whether one is needed when outputting any packet, not only + a packet for the PCR stream. Most of the PCR packets will remain + data-carrying packets, but as a last resort we may insert stuffing + packets on the PCR stream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2073> + +2021-03-11 18:05:25 +1100 Jan Schmidt <jan@centricular.com> + + * gst/mpegtsmux/tsmux/tsmux.c: + mpegtsmux: Improve PCR/SI scheduling. 
+ Change PCR / SI scheduling so that instead of checking if + the current PCR is larger than the next target time, instead + check if the PCR of the next packet would be too late, so PCR + and SI are always scheduled earlier than the target, not later. + There are still cases where PCR can be written too late though, + because we don't check before each output packet. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2073> + +2021-03-11 18:05:10 +1100 Jan Schmidt <jan@centricular.com> + + * gst/mpegtsmux/tsmux/tsmuxstream.c: + tsmuxstream: Fix comment typo + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2073> + +2021-03-09 13:22:10 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/closedcaption/gstcccombiner.c: + * ext/closedcaption/gstcccombiner.h: + * tests/check/elements/cccombiner.c: + cccombiner: implement scheduling + Prior to that, cccombiner's behaviour was essentially that of + a funnel: it strictly looked at input timestamps to associate + together video and caption buffers. + This patch instead exposes a "schedule" property, with a default + of TRUE, to control whether caption buffers should be smoothly + scheduled, in order to have exactly one per output video buffer. + This can involve rewriting input captions, for example when the + input is CDP sequence counters are rewritten, time codes are dropped + and potentially re-injected if the input video frame had a time code + meta. + Caption buffers may also get split up in order to assign captions to + the correct field when the input is interlaced. + This can also imply that the input will drift from synchronization, + when there isn't enough padding in the input stream to catch up. In + that case the element will start dropping old caption buffers once + the number of buffers in its internal queue reaches a certain limit + (configurable). 
+ The property is exposed so that existing users of cccombiner can + revert back to the original behaviour, but should eventually be + removed, as that behaviour was simply inadequate. + This commit also disallows changing the input caption type, as + this would needlessly complicate implementation, and removes + the corresponding test. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2076> + +2021-03-17 19:26:12 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11convert.cpp: + * sys/d3d11/gstd3d11upload.cpp: + d3d11: Use render-target and shader-resource bind flags by default + Even if bind flags is not needed by an element, other element + might need such bind flags. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2091> + +2021-03-15 00:04:21 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavpp.c: + va: vpp: Fix features lost in transform_caps(). + When we transform the caps from the sink to src, or vice versa, the + "caps" passed to us may only contain parts of the features. Which + makes our vpp lose some feature in caps and get a negotiation error. + The correct way should be: + Cleaning the format and resolution of that caps, but adding all VA, + DMA features to it, making it a full feature caps. Then, clipping it + with the pad template. 
+ fixes: #1551 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2081> + +2021-03-15 16:25:36 -0300 Thibault Saunier <tsaunier@igalia.com> + + * ext/wpe/WPEThreadedView.cpp: + wpe: Ignore 'error-cancelled' 'failures' + This happens when the user use the 'load-bytes' signal and nothing is wrong there + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2085> + +2021-03-16 19:09:59 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + d3d11decoder: WARNING if ID3D11VideoDevice is unavailable, not ERROR + gst_d3d11_decoder_new() method is also used for device capability + checking during plugin init. Although we are checking hardware + flag prior to that, it doesn't guarantee ID3D11VideoDevice interface. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2088> + +2021-03-16 17:56:51 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfsourcereader.cpp: + * sys/mediafoundation/gstmftransform.cpp: + mediafoundation: Fix resource leak + IMFActivate would hold its internal objects unless user call ShutdownObject(), + even if we release the IMFActivate. Here internal objects may + include Direct3D objects, such as texture, device handle for example. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2087> + +2021-03-16 15:58:57 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11convert.cpp: + d3d11colorconvert: Fix caps leak + GstBaseTransform::fixate_caps() takes ownership of passed + othercaps argument. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2086> + +2021-03-13 19:00:18 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * tests/examples/va/meson.build: + * tests/examples/va/multiple-vpp.c: + va: example: multiple-vpp: test sharpen with dynamic controller + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2082> + +2021-03-13 18:57:37 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: synchronize segment with stream time + This is required to use dynamic controllable parameters. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2082> + +2021-03-15 18:26:03 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Do negotiation again per forward/reverse playback mode change + For reverse playback, we are always copying decoded + frame to downstream buffer. So the pool size can be + and need to be large enough. + In case that forward playback, however, we need to restrict + the max pool size for performance reason. Otherwise decoder + will keep copying decoded texture to downstream buffer pool + if decoding is faster than downstream throughput + performance and also there are queue element between them. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2083> + +2021-03-15 19:48:56 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.cpp: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window.h: + d3d11videosink: Avoid switching conversion tool during playback + Decoder might be able to copy decoded texture to the other buffer pool + during playback depending on context. 
In that case, copied one + has no D3D11_BIND_DECODER bind flag. + If we used ID3D11VideoProcessor previously for decoder texture, + and incoming texture supports ID3D11VideoProcessor as well even if it has no + D3D11_BIND_DECODER flag (having D3D11_BIND_RENDER_TARGET for example), + allow zero-copying instead of using our fallback texture. + Frequent conversion tool change (between ID3D11VideoProcessor and generic shader) + might result in inconsistent image quality. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2084> + +2021-03-12 13:50:59 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + va: postproc: only drop filters if they change + Currently, at every frame the filters array is recreated. This is not + optimal, since it should be only rebuilt if the VA filter's related + properties change. This patches does that by using a flag. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2078> + +2021-03-14 16:11:12 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Refactor device open step and negotiation + * Remove redundant method arguments + * Don't allocate staging texture if downstream supports d3d11 memory + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2079> + +2021-03-14 15:08:01 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Move profile GUID handling into decoder 
object + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2079> + +2021-03-14 14:26:17 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + d3d11decoder: Get rid of private struct + Completely hide member variables + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2079> + +2021-03-14 12:50:21 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + Revert "d3d11vp9dec: Add support for internal frame resizing" + This reverts commit 58a4c33a0e4f4e5415d8578166716e0d65c0c27e + We should use ID3D11VideoProcessor instead of shader + to avoid copy. We need to revisit this topic later + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2079> + +2021-03-13 22:47:55 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.c: + * gst-libs/gst/d3d11/gstd3d11device.h: + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11videoprocessor.cpp: + d3d11device: Hold ID3D11VideoDevice and ID3D11VideoContext object + ... instead of QueryInterface-ing per elements. Note that + ID3D11VideoDevice and ID3D11VideoContext objects might not be available + if device doesn't support video interface. + So GstD3D11Device object will create those objects only when requested. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2079> + +2021-03-14 13:01:37 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11desktopdup.cpp: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window_corewindow.cpp: + * sys/d3d11/gstd3d11window_dummy.cpp: + * sys/d3d11/gstd3d11window_swapchainpanel.cpp: + * sys/d3d11/gstd3d11window_win32.cpp: + d3d11: Run gst-indent for all C++ code + Since all d3d11 plugin implementation code are C++, we need to + run gst-indent manually. This is preparation for later + "gst-indent sys/d3d11/*.cpp" run. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2077> + +2021-03-13 17:40:57 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11basefilter.cpp: + * sys/d3d11/gstd3d11colorconverter.h: + * sys/d3d11/gstd3d11compositor.cpp: + * sys/d3d11/gstd3d11compositorbin.cpp: + * sys/d3d11/gstd3d11convert.cpp: + * sys/d3d11/gstd3d11convert.h: + * sys/d3d11/gstd3d11converter.cpp: + * sys/d3d11/gstd3d11converter.h: + * sys/d3d11/gstd3d11decoder.cpp: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11desktopdup.cpp: + * sys/d3d11/gstd3d11desktopdupsrc.cpp: + * sys/d3d11/gstd3d11download.cpp: + * sys/d3d11/gstd3d11h264dec.cpp: + * sys/d3d11/gstd3d11h265dec.cpp: + * sys/d3d11/gstd3d11mpeg2dec.cpp: + * sys/d3d11/gstd3d11overlaycompositor.cpp: + * sys/d3d11/gstd3d11pluginutils.cpp: + * sys/d3d11/gstd3d11pluginutils.h: + * sys/d3d11/gstd3d11shader.cpp: + * sys/d3d11/gstd3d11upload.cpp: + * sys/d3d11/gstd3d11videoprocessor.cpp: + * sys/d3d11/gstd3d11videoprocessor.h: + * sys/d3d11/gstd3d11videosink.cpp: + * sys/d3d11/gstd3d11videosinkbin.cpp: + * sys/d3d11/gstd3d11vp8dec.cpp: + * sys/d3d11/gstd3d11vp9dec.cpp: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window.h: + * sys/d3d11/gstd3d11window_dummy.cpp: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.c: + d3d11: Port to C++ + 
Direct3D11 objects are COM, and most COM C APIs are verbose + (C++ is a little better). So, by using C++ APIs, we can make code + shorter and more readable. + Moreover, "ComPtr" helper class (which is C++ only) can be + utilized, that is very helpful for avoiding error-prone COM refcounting + issue/leak. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2077> + +2021-03-12 12:36:52 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * tests/examples/va/multiple-vpp.c: + va: example: multiple-vpp: test direction change + If the driver supports it (iHD, so far) and the parameter -d is set, + the direction of the video will be changed randomly. + In the code you can select, at compilation time, if the direction + change is done by element's property or by pipeline events. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2074> + +2021-03-11 18:53:09 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: update passthrough and reconfigure pads + Added helper function _update_passthrough() which will define and set + the pass-through mode of the filter, and it'll either reconfigure both + pads or it will just mark the src pad for renegotiation or nothing at + all. + There are cases where both pads have to be reconfigured (direction + changed, for example), other when just src pad has to (filters + updated) or none (changing to ready state). + The requirement of renegotiation depends on the need to enable/disable + its VA buffer pools. + This patch sets pass-through mode by default, so the buffer pools + aren't allocated if no filtering/direction operations are defined, + which is the correct behavior. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2074> + +2021-02-25 14:09:50 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/zxing/gstzxing.cpp: + * ext/zxing/meson.build: + * tests/check/elements/zxing.c: + zxing: update to support version 1.1.1 + Support new API in 1.1.1 + Update the supported input video format. + Update tests to use parse_launch + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2037> + +2021-03-10 13:10:28 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * gst/videoparsers/gstmpegvideoparse.c: + mpegvideoparse: do not clip the frame + If the current buffer is delta unit such as P or B + frame, the buffer should not be clipped and need to + let the decoder handle the segment boundary situation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2070> + +2021-03-11 02:36:28 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.c: + d3d11device: Fix wrong printf formatting + Add missing '%' there + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2069> + +2021-02-20 11:36:42 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder-private.h: + * gst-libs/gst/transcoder/gsttranscoder-signal-adapter.c: + * gst-libs/gst/transcoder/gsttranscoder.c: + * gst-libs/gst/transcoder/gsttranscoder.h: + transcoder: Add state-changed signal + Similar to GstPlayer, a new signal for state tracking is now emitted at runtime, + as a commodity for applications which then don't need to monitor the pipeline + GstBus for state changes anymore. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2028> + +2020-12-07 10:47:30 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay-signal-adapter.c: + * gst-libs/gst/player/gstplayer-media-info-private.h: + * gst-libs/gst/player/gstplayer-media-info.c: + * gst-libs/gst/player/gstplayer-wrapped-video-renderer-private.h: + * gst-libs/gst/player/gstplayer-wrapped-video-renderer.c: + * gst-libs/gst/player/gstplayer.c: + * gst-libs/gst/player/meson.build: + player: Rewrite as GstPlay wrapper + For the time being the GstPlayer library remains as a wrapper for GstPlay, in + order to keep existing applications working and give them time to port to + GstPlay. GstPlayer will remain in -bad for a couple cycles and the plan for + GstPlay is to move it to -base before 1.20. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2020-12-07 09:56:26 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay.c: + play: Flush API bus before exiting main loop + Otherwise the bus might attempt to dispatch queued messages after the thread + ended, causing runtime warnings. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2020-11-29 18:55:48 +0000 Philippe Normand <philn@igalia.com> + + * tests/check/libs/play.c: + * tests/check/meson.build: + play: tests: Switch user-agent test to a real HTTP server + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2020-11-14 10:56:51 +0000 Philippe Normand <philn@igalia.com> + + * meson_options.txt: + * tests/check/libs/play.c: + * tests/check/meson.build: + play: tests: Refactor to use new Message bus API + Instead of relying on an extra GMainLoop, the messages are poped from the player + bus and handled synchronously. This should avoid flaky behaviors. 
+ Fixes #608 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2020-11-14 10:47:53 +0000 Philippe Normand <philn@igalia.com> + + * gst-libs/gst/play/gstplay.c: + play: Plug media_info leak + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2020-04-28 21:09:40 +0200 Stephan Hesse <stephan@emliri.com> + + * gst-libs/gst/play/gstplay.c: + play: Rename internal buffering field to buffering_percent + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2019-11-02 16:14:13 +0100 Stephan Hesse <stephan@emliri.com> + + * docs/libs/play/index.md: + * docs/libs/play/sitemap.txt: + * docs/meson.build: + * gst-libs/gst/meson.build: + * gst-libs/gst/play/gstplay-media-info-private.h: + * gst-libs/gst/play/gstplay-media-info.c: + * gst-libs/gst/play/gstplay-media-info.h: + * gst-libs/gst/play/gstplay-message-private.h: + * gst-libs/gst/play/gstplay-signal-adapter.c: + * gst-libs/gst/play/gstplay-signal-adapter.h: + * gst-libs/gst/play/gstplay-types.h: + * gst-libs/gst/play/gstplay-video-overlay-video-renderer.c: + * gst-libs/gst/play/gstplay-video-overlay-video-renderer.h: + * gst-libs/gst/play/gstplay-video-renderer-private.h: + * gst-libs/gst/play/gstplay-video-renderer.c: + * gst-libs/gst/play/gstplay-video-renderer.h: + * gst-libs/gst/play/gstplay-visualization.c: + * gst-libs/gst/play/gstplay-visualization.h: + * gst-libs/gst/play/gstplay.c: + * gst-libs/gst/play/gstplay.h: + * gst-libs/gst/play/meson.build: + * gst-libs/gst/play/play-prelude.h: + * gst-libs/gst/play/play.h: + play: Introducing the new playback library + This aims to be a replacement for the GstPlayer library. In GstPlay, notifications are + sent as application messages through a dedicated GstBus. The GMainContext-based + signal dispatcher was replaced by a GObject signal adapter, now relying on the + bus to emit its signals. 
The signal dispatcher is now optional and fully + decoupled from the GstPlay object. + Co-authored with: Philippe Normand <philn@igalia.com> + Fixes #394 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2061> + +2021-03-09 13:00:10 +0200 Sebastian Dröge <sebastian@centricular.com> + + * gst/timecode/gstavwait.c: + avwait: Don't post messages with the mutex locked + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2063> + +2021-03-01 20:53:53 +1100 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * tests/check/elements/webrtcbin.c: + webrtc: don't generate duplicate rtx payloads when bundle-policy is set + It was possible to generate a SDP that had an RTX payload type + that matched one of the media payload types when providing caps via + codec_preferences without any sink pads. + Fixes + m=video 9 UDP/TLS/RTP/SAVPF 96 + ... + a=rtpmap:96 VP8/90000 + a=rtcp-fb:96 nack pli + a=fmtp:96 apt=96 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2046> 2021-03-08 14:30:52 +0200 Vivia Nikolaidou <vivia@ahiru.eu> @@ -744,14 +12937,14 @@ * tests/check/meson.build: tests: Add negotiation tests for the interlace elements Many complicated cases exist. Would be good to have some checks. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2067> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2062> 2021-03-08 20:59:14 +0200 Vivia Nikolaidou <vivia@ahiru.eu> * gst/interlace/gstinterlace.c: interlace: Discard stored_frame on EOS and PAUSED_TO_READY Would otherwise leak it. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2067> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2062> 2021-03-08 16:16:25 +0200 Vivia Nikolaidou <vivia@ahiru.eu> @@ -763,22 +12956,7 @@ GST_PAD_SET_ACCEPT_INTERSECT is not set on the sinkpad, and the field-order is missing in the sink template but can be present in the outside caps. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2067> - -2021-03-10 13:10:28 +0100 Stéphane Cerveau <scerveau@collabora.com> - - * gst/videoparsers/gstmpegvideoparse.c: - mpegvideoparse: do not clip the frame - If the current buffer is delta unit such as P or B - frame, the buffer should not be clipped and need to - let the decoder handle the segment boundary situation. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2070> - -2021-03-09 13:00:10 +0200 Sebastian Dröge <sebastian@centricular.com> - - * gst/timecode/gstavwait.c: - avwait: Don't post messages with the mutex locked - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2068> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2062> 2021-03-07 16:47:07 +0900 Seungha Yang <seungha@centricular.com> @@ -786,7 +12964,22 @@ nvh264sldec: Reopen decoder object if larger DPB size is required Equivalent to the d3d11h264dec fix https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1839 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2060> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2059> + +2021-03-03 01:23:20 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconverter.c: + d3d11: Fix an HLSL compiler warning + warning X3578: Output value 'main' is not completely initialized + Part-of: 
<https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2052> + +2021-01-27 10:55:13 +0800 Bing Song <bing.song@nxp.com> + + * tools/gst-transcoder.c: + transcoder: handle SIGINT and SIGHUP + Handle SIGINT and SIGHUP in transcoder. Or the output file maybe corrupt. + Fixes #1507 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1987> 2021-03-04 17:42:28 +0900 Seungha Yang <seungha@centricular.com> @@ -805,31 +12998,333 @@ 3) SPS update without resolution change, only required DPB size is updated to 6 - decoder object should be re-opened but didn't happen because we didn't update max_dpb_size at 2). - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2057> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2056> -2020-11-26 05:55:29 +0900 Seungha Yang <seungha@centricular.com> +2021-03-03 16:19:39 +0000 Tim-Philipp Müller <tim@centricular.com> - * sys/d3d11/gstd3d11h264dec.c: - d3d11h264dec: Reconfigure decoder object on DPB size change - Even if resolution and/or bitdepth is not updated, required - DPB size can be changed per SPS update and it could be even - larger than previously configured size of DPB. If so, we need - to reconfigure DPB d3d11 texture pool again. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2057> + * docs/plugins/gst_plugins_cache.json: + * gst/interlace/gstinterlace.c: + interlace: add more formats, esp 10-bit, 12-bit and 16-bit ones + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2054> + +2021-02-16 11:23:17 +0100 Benjamin Gaignard <benjamin.gaignard@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: retrieve interlaced information + Lets the decoder knows if the frames are interlaced or not. 
+ Provide this information to the driver while filling reference + pictures fields in slice params structure + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-11-27 16:00:03 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Reduce controls for subsequent slices + Only the SLICE_PARAMS and PRED_WEIGHTS are needed for the second and + following slices. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-08-14 10:13:09 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Implement optional scaling matrix + The new H.264 uAPI requires that all drivers support + scaling matrix only as an option, when a non-flat + scaling matrix is provided in the bitstream headers. + Take advantage of this and avoid passing the scaling + matrix if not needed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-09-30 14:34:15 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Only set SPS control if needed + Given V4L2 controls are cached in V4L2, there is no need + to set them if they don't change. Set the SPS control + only if a new sequence was received by the parser. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-09-30 14:22:14 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Only slice-based need SLICE_PARAMS and PRED_WEIGHTS + Frame-based decoding mode doesn't require SLICE_PARAMS and + PRED_WEIGHTS controls. + Moreover, if the driver doesn't support these two controls, trying + to set them will fail. Fix this by only setting these on + slice-based decoding mode. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-09-30 14:14:41 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2decoder.c: + * sys/v4l2codecs/gstv4l2decoder.h: + v4l2codecs: h264: Add API checks + Check that the V4L2 H264 controls' sizes match + our expectation. If not, then probably there's an API + mismatch which will cause errors or decoding corruption. + Also, print a warning if the kernel version is too old. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-09-30 10:40:51 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/linux/h264-ctrls.h: + * sys/v4l2codecs/linux/media.h: + * sys/v4l2codecs/linux/types-compat.h: + * sys/v4l2codecs/linux/v4l2-common.h: + * sys/v4l2codecs/linux/v4l2-controls.h: + * sys/v4l2codecs/linux/videodev2.h: + * sys/v4l2codecs/plugin.c: + v4l2codecs: h264: Update to the new uAPI + Starting from Linux v5.11, the V4L2 stateless H.264 uAPI + is updated and stable. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2020-09-30 10:33:59 -0300 Ezequiel Garcia <ezequiel@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2codecs: h264: Set the scaling matrix present flag unconditionally + We are currently always setting and passing a scaling matrix, + so need to set this flag accordingly. Passing a scaling matrix + optionally will be implemented in follow-up commit. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1624> + +2021-03-02 12:46:24 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: postproc: mention the possibility of color balance + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2050> + +2021-03-02 12:46:06 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + docs: plugins update VA elements + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2050> + +2021-03-02 12:44:12 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavp8dec.c: + * sys/va/gstvavp9dec.c: + va: vp8dec, vp9dec: only set NV12 color format for documentation + Mention in documentation only the most used output format in VA-API. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2050> + +2021-03-02 22:01:26 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + d3d11convert: Forward colorimetry and chroma-site from upstream + Adopt the improvement of https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1033 + into d3d11. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 17:47:03 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + d3d11convert: Add support for border drawing + ... and fix wrong resizing when downstream requested PAR value + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 21:35:00 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + d3d11convert: Prefer video processor over shader + ... if video processor was used previously. Otherwise, switching + between video processor and shader would result in inconsistent + output image quality. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 18:07:36 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videoprocessor.c: + d3d11videoprocessor: Disable auto processing mode explicitly + Don't allow auto processing (e.g., denoising), as it might result + in unexpected output. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 21:10:24 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11memory.c: + d3d11memory: Fix for wrong texture_array_size returns + Fix mismatched return values + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 21:13:18 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + d3d11decoder: Add trace log for DPB pool size debugging + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 20:45:22 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + d3d11decoder: Always use render-target bind flag for downstream pool + To convert decoded texture into other format, downstream would use + video processor instead of shader. In order for downstream to + be able to use video processor even if we copied decoded texture + into downstream pool, we should set this bind flag. Otherwise, + downstream would keep switching video processor and shader + to convert format which would result in inconsistent image quality. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-02 20:37:04 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11_fwd.h: + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11videoprocessor.c: + * sys/d3d11/gstd3d11videoprocessor.h: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window_dummy.cpp: + * sys/d3d11/gstd3d11window_win32.cpp: + d3d11: Fix wrong preprocessing blocks + Missed in https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/464 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2051> + +2021-03-01 13:44:09 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavpp.c: + va: vpp: fix wrong caps logic in vpp_transform_caps(). + The current gst_va_vpp_transform_caps() returns caps such as: + video/x-raw(memory:VAMemory), width=(int)[ 16, 16384 ], height=(int)[ 16, 16384 ], + interlace-mode=(string)progressive, format=(string){ NV12, I420, YV12, YUY2, RGBA, + BGRA, P010_10LE, ARGB, ABGR, VUYA }; video/x-raw(memory:DMABuf), width=(int)[ 16, + 16384 ], height=(int)[ 16, 16384 ], interlace-mode=(string)progressive, format=(string) + { NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR, VUYA }; video/x-raw, + width=(int)[ 16, 16384 ], height=(int)[ 16, 16384 ], interlace-mode=(string)progressive, + format=(string){ VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, I420, P010_10LE }; + video/x-raw(memory:VAMemory), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], + interlace-mode=(string)progressive; video/x-raw(memory:DMABuf), width=(int)[ 1, 2147483647 ], + height=(int)[ 1, 2147483647 ], interlace-mode=(string)progressive; video/x-raw, width=(int) + [ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], interlace-mode=(string)progressive + This is not correct: it mixes the template caps and the input query caps together.
+ The correct way is to clip the template caps with the input caps (remove format + and rangify size). The correct result is: + video/x-raw(memory:VAMemory), width=(int)[ 16, 16384 ], height=(int)[ 16, 16384 ], interlace-mode=(string)progressive, format=(string){ NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, + ARGB, ABGR, VUYA }; video/x-raw(memory:DMABuf), width=(int)[ 16, + 16384 ], height=(int)[ 16, 16384 ], interlace-mode=(string)progressive, format=(string){ NV12, I420, YV12, YUY2, RGBA, + BGRA, P010_10LE, ARGB, ABGR, VUYA }; video/x-raw, width=(int)[ 16, + 16384 ], height=(int)[ 16, 16384 ], interlace-mode=(string)progressive, format=(string){ VUYA, GRAY8, NV12, NV21, YUY2, + UYVY, YV12, I420, P010_10LE } + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2045> 2021-03-01 16:23:37 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> * gst/mpegtsdemux/mpegtsparse.c: mpegtsparse: Fix switched DTS/PTS when set-timestamps=false Fixes 30ee21eae36e7279f63b77167ba1dcf5f70b8e83.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2049> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2047> + +2019-08-15 08:25:26 -0700 Ilya Kreymer <ikreymer@gmail.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/webrtc/gstwebrtcice.c: + * ext/webrtc/gstwebrtcice.h: + * ext/webrtc/icestream.c: + * tests/check/elements/webrtcbin.c: + webrtc ice: Add 'min/max-rtp-port' props for setting RTP port range + default min port == 0, max port == 65535 -- if min port == 0, uses existing random port selection (range ignored) + add 'gathering_started' flag to avoid changing ports after gathering has started + validity checks: min port <= max port enforced, error thrown otherwise + include tests to ensure port range is being utilized (by @hhardy) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/119> + +2021-02-25 11:58:57 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcice.c: + webrtc ice: Only ever request one component, it's always rtcpmux + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/119> + +2021-02-26 15:40:01 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + transcoder: Add some missing API guards + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2044> + +2021-02-26 15:36:48 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder-signal-adapter.c: + transcoder: Fix potential use of uninitialized variables + gst_structure_get won't touch variables if the field is not present, + leading to potential use of uninitialized vars + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2044> + +2021-02-26 15:31:29 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder-private.h: + * 
gst-libs/gst/transcoder/gsttranscoder-signal-adapter.c: + * gst-libs/gst/transcoder/gsttranscoder-signal-adapter.h: + * gst-libs/gst/transcoder/gsttranscoder.c: + * gst-libs/gst/transcoder/gsttranscoder.h: + * tools/gst-transcoder.c: + transcoder: Rework the API to create/get SignalAdapter + We can only have a single GstTranscoderSignalAdapter object for a + given GstTranscoder object; this is enforced by not exposing + a constructor and instead adding a method to GstTranscoder to get the + signal adapter (internally creating it when needed). We can still + cleanly ensure that the signal adapter is running for the requested + GMainContext and return NULL if it is not the case. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2044> + +2021-02-22 16:59:25 +1100 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcice.c: + webrtcbin: use regular ice nomination by default + 1. We don't currently deal with an a=ice-options in the SDP which means + we currently violate https://tools.ietf.org/html/rfc5245#section-8.1.1 + which states: "If its peer is using ICE options (present in + an ice-options attribute from the peer) that the agent does not + understand, the agent MUST use a regular nomination algorithm." + 2. The recommendation is to default to regular nomination in both RFC5245 + and RFC8445. The libnice change for this is + https://gitlab.freedesktop.org/libnice/libnice/-/merge_requests/125 + which requires an API break in libnice. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2031> + +2021-02-24 18:43:07 +0000 Philippe Normand <philn@igalia.com> + + * docs/meson.build: + * gst-libs/gst/transcoder/gsttranscoder.c: + transcoder: Remove unneeded gst_init call + We can safely assume GStreamer is already initialized from here.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1840> + +2020-11-25 22:25:28 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder-message-private.h: + * gst-libs/gst/transcoder/gsttranscoder-signal-adapter.c: + * gst-libs/gst/transcoder/gsttranscoder-signal-adapter.h: + * gst-libs/gst/transcoder/gsttranscoder.c: + * gst-libs/gst/transcoder/gsttranscoder.h: + * gst-libs/gst/transcoder/meson.build: + * tools/gst-transcoder.c: + transcoder: Port to a GstBus API instead + Following the move made by GstPlayer in: + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/35 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1840> + +2020-11-25 22:21:35 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + * gst-libs/gst/transcoder/gsttranscoder.h: + * gst-libs/gst/transcoder/meson.build: + * gst-libs/gst/transcoder/transcoder-prelude.h: + transcoder: Automatically generate enums GTypes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1840> + +2020-11-25 22:01:30 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.h: + transcoder: Port to G_DECLARE + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1840> 2021-02-26 16:36:58 +0200 Sebastian Dröge <sebastian@centricular.com> * sys/decklink/gstdecklinkvideosink.cpp: decklinkvideosink: Use correct numerator for 29.97fps It's not 0.2997fps. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2043> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2042> 2021-02-26 11:39:10 +0100 Edward Hervey <edward@centricular.com> @@ -838,7 +13333,37 @@ Use the hardware reference clock time when the frame was finished being captured instead of a time much further down the road. 
This improves the stability/accuracy of buffer times. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2041> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2040> + +2021-02-24 19:14:42 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: No need for fourcc to create surface. + In commits 430aa327 and a119a940 there is a regression, since it is + possible to create surfaces without a fourcc; only the chroma (rtformat) is + required. + This regression is shown on the radeonsi driver with certain color + formats. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2035> + +2021-02-24 13:06:51 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + * tests/examples/va/meson.build: + * tests/examples/va/multiple-vpp.c: + va: vpp: implement GstColorBalance interface + And modify the multiple-vpp example to use it with the -r parameter. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2033> + +2021-02-23 17:22:40 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: add controllable and mutable playing to GParamFlags + Add controllable and mutable playing to the common GParamFlags. + Also use these common flags for video-direction + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2033> 2021-02-24 16:57:06 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> @@ -847,20 +13372,998 @@ * ext/vulkan/vkviewconvert.c: vulkan: Fix elements' long names. Fix vkcolorconvert and vkviewconvert long names.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2036> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2034> + +2021-01-12 15:33:49 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkcontext.c: + * sys/msdk/meson.build: + msdk: allow the user to specify a drm device via an env variable + The user may specify the required device via GST_MSDK_DRM_DEVICE + Example: + GST_MSDK_DRM_DEVICE=/dev/dri/card0 gst-launch-1.0 videotestsrc ! msdkh264enc + ! fakesink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1953> + +2021-01-19 15:36:29 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaav1dec.c: + * sys/va/gstvaav1dec.h: + * sys/va/gstvabasedec.h: + * sys/va/gstvaprofile.c: + * sys/va/meson.build: + * sys/va/plugin.c: + VA: Add the vaav1dec element as the av1 decoder. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1636> + +2021-01-19 15:17:58 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvadecoder.c: + * sys/va/gstvadecoder.h: + VA: Add the aux surface for gst buffer used by decoder. + The AV1 codec needs to support the film grain feature. When the film + grain feature is enabled, we need two surfaces as the output of the + decoded picture, one without the film grain effect and the other one with + it. The first one acts as the reference and is needed for later pictures' + reconstruction, and the second one is the real display output. + So we need to attach another aux surface to the gst buffer/mem and make + that aux surface the target of vaBeginPicture. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1636> + +2021-01-19 15:07:38 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvadecoder.c: + * sys/va/gstvadecoder.h: + VA: Add a helper function, decoder_add_slice_buffer_with_n_params.
+ Some codecs such as AV1 need several parameters associated with one + slice. It may have multiple tiles within one slice and each tile needs + its own description parameter. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1636> + +2021-01-19 14:59:45 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstav1decoder.c: + * gst-libs/gst/codecs/gstav1decoder.h: + * gst-libs/gst/codecs/gstav1picture.c: + * gst-libs/gst/codecs/gstav1picture.h: + * gst-libs/gst/codecs/meson.build: + codecs: AV1decoder: Add the AV1 decoder base class. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1636> + +2021-02-23 13:47:29 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: replace assert with error log in va alloc. + We should print an error log rather than assert when the fourcc or + the rt_format of the va allocator is unrecognized. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1636> + +2021-02-21 17:38:38 +0900 Seungha Yang <seungha@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/d3d11/gstd3d11compositor.c: + * sys/d3d11/gstd3d11compositorbin.c: + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11desktopdupsrc.c: + * sys/d3d11/gstd3d11download.c: + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11mpeg2dec.c: + * sys/d3d11/gstd3d11upload.c: + * sys/d3d11/gstd3d11videosink.c: + * sys/d3d11/gstd3d11videosinkbin.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + * sys/d3d11/plugin.c: + d3d11: Documentation update + * Update class metadata + * for wrapper bin elements to be distinguishable from internal elements.
+ * D3D11 -> Direct3D11 for consistency + * Add missing Since mark everywhere + * Update plugin cache + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2029> + +2021-02-21 20:38:37 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11colorconvert.h: + d3d11: Reorganize class hierarchy of convert elements + AS-IS: + D3D11Convert class is baseclass of D3D11ColorConvert and D3D11Scale + * GstD3D11Convert + |_ GstD3D11ColorConvert + |_ GstD3D11Scale + TO-BE: + Introducing a new base class for color conversion and/or rescale elements + * GstD3D11BaseConvert + |_ GstD3D11Convert + |_ GstD3D11ColorConvert + |_ GstD3D11Scale + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2029> + +2021-02-21 17:35:40 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11deinterlace.cpp: + d3d11deinterlace: Add missing system memory caps features on templates + This element can support system memory + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2029> + +2021-02-18 09:53:09 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + va: filter, vpp: process colorimetry + A new filter method was added: gst_va_filter_set_formats(). In this + way the input & output GstVideoInfo are processed only once per stream + negotiation, and not per frame. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2023> + +2021-02-18 05:58:25 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: check if filter is open on set_orientation() + Because the method requires that pipeline_caps is filled.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2023> + +2021-02-17 18:56:29 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: human readable background color + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2023> + +2021-02-17 18:55:14 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: fail immediately if vaBeginPicture() fails + There's no need to try vaRenderPicture() if vaBeginPicture() failed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2023> + +2021-02-17 18:30:10 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: destroy pipeline buffer after destroying filters + In 6ae24948 the pipeline buffer destroy was removed, assuming it + wasn't required. Nonetheless, debugging the code revealed what looks like a + buffer leak in the iHD driver, since the ID of the buffer kept increasing. + The difference now is that the filter buffers are destroyed first + and the pipeline buffer later. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2023> + +2021-02-19 14:27:39 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavpp.c: + va: vpp: Add raw buffer copy when needed. + Just like the decoder, the vapostproc also needs to copy the output + buffer to a raw buffer if downstream elements only support raw caps + and do not support the video meta. + A pipeline like: + gst-launch-1.0 filesrc location=xxxx ! h264parse ! vah264dec ! \ + vapostproc ! capsfilter caps=video/x-raw,width=55,height=128 ! \ + filesink location=xxx + needs this logic to dump the data correctly.
+ fixes: #1523 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2026> 2021-02-19 00:03:00 +0000 Tim-Philipp Müller <tim@centricular.com> * gst/sdp/gstsdpsrc.c: sdpsrc: fix double free if sdp is provided as string via the property Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1532 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2027> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2025> + +2021-02-18 21:38:37 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/gstwasapi2device.c: + wasapi2device: Make wasapi2 device distinguishable from wasapi device + Both the wasapi2 and wasapi plugins use the WASAPI API. So "device.api=wasapi" + would make sense for the wasapi2 plugin as well. But people would be + confused by the identical "device.api=wasapi" property if the intended + plugin is wasapi, not wasapi2. This change will make them distinguishable + by using the "device.api" device property.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2024> + +2021-01-13 00:27:40 +0900 Seungha Yang <seungha@centricular.com> + + * sys/wasapi2/meson.build: + wasapi2: Always build if Windows 10 SDK is available + Add support for building the wasapi2 plugin if the Windows 10 SDK is + available on the system + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1951> + +2021-02-08 12:24:58 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/dtls/gstdtlssrtpbin.c: + * ext/dtls/gstdtlssrtpdec.c: + * ext/dtls/gstdtlssrtpenc.c: + dtls: use GST_WARNING instead of g_warning + No need for a g_warning, which always fails + with gst-inspect -a + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2010> + +2021-01-28 12:28:03 +0100 Michael Olbrich <m.olbrich@pengutronix.de> + + * gst/videoparsers/gsth264parse.c: + * tests/check/elements/h264parse.c: + h264parse: don't invalidate the last PPS when parsing a new SPS + When an SPS is received, any previous PPS remains valid. So don't clear + the PPS flag from the parser state. + This is important because there are encoders that don't generate a PPS after + every SPS. + Closes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/571 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2019> + +2021-02-17 15:15:09 +0200 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/videoparsers/gsth265parse.c: + h265parse: Detect height change on field-based interlaced files + The first time update_src_caps is called, no frame has been parsed yet, + therefore we don't know whether the file has alternate-field interlacing + mode. If we run it again after we have a frame, it might be that now we + have the SEI pic_struct parsed, and therefore we know that it's + field-based interlaced, so the height must be multiplied by + two. Earlier this was not detected as a change.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2022> + +2020-09-19 21:39:06 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * tests/examples/va/meson.build: + * tests/examples/va/multiple-vpp.c: + va: add multiple-vpp example + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2015> + +2021-02-15 15:54:11 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: set workaround only for i965 driver + In commit 117453b9 an i965 driver workaround was added for all drivers, because + at that time we didn't have a driver implementation API. + Now there's one. This patch sets the workaround only for the i965 driver. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2021> + +2021-02-17 13:46:03 +0200 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/videoparsers/gsth265parse.c: + h265parse: Fix FPS/duration for interlaced files + There can be h265 files with frame-based, not field-based, interlacing. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2020> + +2021-02-12 18:43:00 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: MT-safe queue & dequeue dmabuf-based memories + One problem that the va dmabuf allocator had is when preparing a buffer from + dmabuf memories in the allocator pool, especially when a buffer is composed of + several memories. These memories have to be of a certain number and in a certain + order. + This patch stores the number of memories and their addresses in order when a + dmabuf-based buffer is created, and when preparing a buffer, it is reconstructed + with this info. + Finally, instead of pushing the memories as soon as they are unrefed, they are + held until GstVaBufferSurface's ref_mems_count reaches zero (all the memories + related to that buffer/surface are unrefed).
Until that happens, all the + memories are pushed back into the queue, locked, ensuring that all the memories + related to a single buffer (with the same surface) remain contiguous, so the + buffer reconstruction is assured. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2013> + +2021-02-15 15:34:56 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvapool.c: + va: pool, allocator: free memories at bufferpool's stop() + This patch frees the memories in the allocator's pool after the bufferpool frees + all its buffers, sync'ing them at the stop() vmethod. + By doing this, the current logic in flush_start() is no longer valid, so the vmethod is removed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2013> + +2021-02-12 15:40:33 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + va: allocator: remove unused public functions + Deleted the public functions: + gst_va_dmabuf_allocator_wait_for_memory() + gst_va_allocator_wait_for_memory() + And all the wait/cond support in the allocator's pool. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2013> + +2021-02-12 13:26:24 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvapool.c: + va: pool: simplify the logic + Instead of removing memories from buffers at reset_buffer()/release_buffer(), the + bufferpool operation is kept as originally designed, while the allocator pool is + still used too. Thus, this patch restores the buffer size configuration while removing + the release_buffer(), reset_buffer() and acquire_buffer() vmethod overloads. + Then, when the bufferpool base class decides to discard a buffer, the VA + surface-based memory is returned to the allocator pool when its last reference + is freed, and later reused if a new buffer is allocated again.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2013> + +2021-02-07 16:12:56 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvapool.c: + va: pool: use allocator pool at alloc() + Check if the allocator pool has memories available before creating a + new one, but only if the pool is not starting. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2013> + +2021-02-08 12:25:07 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: log buffer at dmabuf setup and prepare + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2013> + +2021-01-22 00:10:28 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11deinterlace.cpp: + * sys/d3d11/gstd3d11deinterlace.h: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.c: + d3d11: Add support for deinterlacing by using ID3D11VideoProcessor interface + Add a new element d3d11deinterlace to support deinterlacing. + Similar to d3d11videosink and d3d11compositor, this element is + a wrapper bin of a set of child elements including helpful + conversion elements (upload/download and color convert) + to make this element usable between non-d3d11 elements.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2016> + +2021-02-14 06:23:55 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11format.c: + * gst-libs/gst/d3d11/gstd3d11format.h: + d3d11: Add a method for conversion from DXGI format to GstVideoFormat + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2016> + +2021-01-22 03:26:29 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/videoparsers/gsth264parse.c: + h264parse: fix timestamping of interlaced fields in output + Instead of relying on GstBaseParse's default behaviour of computing + the duration of a parsed buffer based on the framerate passed + to gst_base_parse_set_framerate(), we instead compute the duration + ourselves, as we have more information available. + In particular, this means we now output buffers with a duration + that matches that of raw interlaced buffers when each field is + output in a separate buffer. + This fixes DTS interpolation performed by GstBaseParse, as the + previous behaviour of outputting each field with the duration of + a full frame was messing up the base class calculations. + When not enough information is available, h264parse simply falls + back to calculating the duration based on the framerate and hopes + for the best, as was the case previously.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1973> + +2021-02-14 21:01:32 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11mpeg2dec.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11decoder: Take account of min buffers of downstream buffer pool + Since our decoder DPB texture pool cannot be grown once it's + configured, we should pre-allocate a sufficient number of textures + for zero-copy playback (but not too many). + The "min buffers" allocation query parameter can be a hint for + the number of required textures in addition to the DPB size. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2017> + +2020-10-29 10:54:45 -0300 Thibault Saunier <tsaunier@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/openh264/gstopenh264enc.cpp: + openh264enc: Add support for main and high profiles + Those are supported (to a certain extent) so we should not limit + ourselves to baseline + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1789> + +2021-02-11 16:04:12 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2decoder: Move frame_unref to handle_frame. + In the current code, we call frame_unref only when the frame is + outputted. This is OK for normal playback, but when a seek happens, + the frames stored in the DPB are not outputted, which causes a memory + leak. + The correct way is that we should call frame_unref every time we + finish handle_frame(), which is also the behaviour of the H264/H265 + decoders.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2014> + +2021-02-07 02:26:02 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11h265dec.c: + d3d11h265dec: Add support for interlaced stream + Note that we have no D3D11 deinterlace element yet. + If downstream, including all the other D3D11 elements, does not + support the format:Interlaced caps feature, the aspect ratio will + be adjusted as an alternative approach. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2008> + +2021-02-07 00:21:06 +0900 Seungha Yang <seungha@centricular.com> + + codecs: h265decoder: Add support for interlaced stream + * Invoke GstH265DecoderClass::new_sequence() method per interlaced + stream status update so that the subclass can update caps. + * Parse picture timing SEI and set buffer flags on the GstH265Picture + object. The subclass can refer to it, as in our h264decoder + implementation. + * Remove pointless GstH265PictureField enum + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2008> + +2021-02-10 00:59:05 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvah265dec.c: + vah265dec: Don't need to pass picture structure to VA + This code came from the gstvaapidecoder_h265 implementation, + but the picture structure is always GST_VAAPI_PICTURE_STRUCTURE_FRAME. + Moreover, in theory, VA doesn't need to know the picture structure for + decoding an HEVC stream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2008> + +2021-02-06 22:02:59 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvh265dec.c: + nvh265sldec: Remove pointless field picture parameter setup + HEVC has no decoding flow for interlaced, field picture referencing.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2008> + +2021-02-08 00:07:26 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvavpp.c: + va: vpp: Make the global lock only to DMA buffer's import. + The normal gst_va_buffer_get_surface does not need a global lock. + Too big a lock may lower the performance. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2009> + +2021-02-05 14:05:53 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: log dmabuf and surface at pool push or pop + In order to keep track of the dmabuf fds and surface numbers, log messages are + added at memory_release() (queue push) and prepare_buffer() (queue pop). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1999> + +2021-02-02 06:43:27 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah265dec.c: + va: h265dec: fix HVC1 stream format name + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1999> + +2021-02-01 23:19:27 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvabasedec.c: + va: basedec: refactor context query + Context query is handled by both source and sink queries. This patch + factors out its handling into a common utility function.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1999> + +2020-12-21 18:10:44 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: use a common GParamFlags definition + Instead of repeating the same code throughout gst_va_filter_install_properties() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1999> + +2021-01-22 16:54:05 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: remove spurious if validation + The first if checks for caps, thus the else doesn't need to recheck for the + opposite. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1999> + +2021-02-05 18:13:32 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: fix frame copy + There were two problems with frame copy: + 1. The input video info comes from the color format, not from the allocated VA + surface; the sink video info needs to be updated according to the + allocator's data. + 2. The parameters of `gst_video_frame_copy()` were backwards. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-02 18:05:46 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: request video and alignment metas for src pool + This is for the pool used when importing raw video frames to surfaces. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-04 16:43:02 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: transform_size() must return FALSE + The transform_size() basetransform vmethod is used when there's no output buffer + pool and allocates a system memory buffer. With VA this cannot be allowed, since + it needs VASurfaces to process. + Thus transform_size() is not required, but to play it safe let's return FALSE.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-02 16:22:34 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: copy input buffer flags and timestamps + Strictly speaking, right now it's not required to do this copy, but let's play + it safe and assume that in the future this metadata might be required while + doing the postprocessing. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-01 23:55:11 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: handle context query + Previously vapostproc didn't communicate its context through the query mechanism, + which is required for context sharing. This patch adds the missing bits. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-01 23:50:12 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: don't copy color, size or orientation video metas + If they are processed by the element. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-05 16:46:00 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: don't break passthrough if no color balance required + The function `_add_filter_cb_buffer()` returned TRUE if no color balance filters + are required, but that is wrong, since it will break the passthrough. This + patch returns FALSE, which is the correct value for the situation.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-01-15 14:07:19 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: use gst_clear_caps() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2007> + +2021-02-02 16:23:28 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: don't destroy pipeline buffer + This was only required by i915 driver before libva-2.0 because it didn't + conform. + Also changes the way _destroy_filters() is called, now inside a locked block, so + it must not lock in it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2006> + +2021-02-01 16:57:49 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: lock member variables access + While gst_va_filter_open() and gst_va_filter_close() remain non-thread-safe, the + other API calls that modify member variables are locked. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2005> + +2021-02-03 23:39:00 +0100 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + * ext/srt/gstsrtsink.c: + * ext/srt/gstsrtsrc.c: + srt: preserve ABI compatibility + Reintroduce socket descriptor parameter removed in 327ad84e to + "caller-added" and "caller-removed" signals, just set it always to zero. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2004> + +2021-02-04 03:42:05 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11mpeg2dec.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11decoder: Fix deadlock when DPB texture pool is full + Unlike other stateless decoder implementations (e.g., VA), + our DPB pool cannot be grown since we are using a + texture array (pre-allocated, fixed-size d3d11 texture pool). + So, if there's no more available texture to use, + there's no way other than copying it to downstream's + d3d11 buffer pool. Otherwise a deadlock will happen. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2003> + +2021-02-02 19:10:13 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11memory.c: + * gst-libs/gst/d3d11/gstd3d11memory.h: + d3d11memory: Add a method for querying texture array size + ... and the number of textures in use. + A Direct3D11 texture array is usually used for the decoder DPB pool, + and d3d11 decoder elements might want to know + whether there's an available texture resource or not. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2003> + +2020-12-14 20:34:15 +0100 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + * ext/srt/gstsrtsink.c: + * ext/srt/gstsrtsrc.c: + srt: don't pass SRT socket ID to "caller-added,removed" signals + The caller's IP and port are enough for unique identification. Don't leak + the socket handle, since using it in unadvised libsrt calls from the + application could break the SRT element.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1772> + +2020-11-04 17:14:03 +0100 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + srtobject: add caller address to stats structure + In listener mode, gst_stats() returns an independent set of + statistics for every connected caller. Having the caller's IP and port + present in each structure makes it possible to correlate the statistics with a + particular caller that has been announced by the "caller-added" signal. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1772> + +2021-02-03 14:27:14 +0200 Vivia Nikolaidou <vivia@ahiru.eu> + + * gst/videoparsers/gsth265parse.c: + * gst/videoparsers/gsth265parse.h: + h265parse: Support for alternate-field interlacing + Also don't set interlacing information on the caps, see #1313 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1996> + +2021-02-02 18:25:31 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/videoparsers/gsth264parse.c: + * gst/videoparsers/gsth265parse.c: + h264/h265parse: Add VideoTimeCodeMeta to the outgoing buffer + The parsers attempted to add the meta to the incoming buffer, which + might not be the outgoing buffer or may not have been writable yet. + To fix this, call `gst_buffer_make_writable` earlier and make sure to + use the `parse_buffer` to add the meta. + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1521 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2002> + +2021-01-27 15:32:26 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/va/gstvadevice.c: + va: sort the device queue + This way, the elements will be registered per drm node in order of + renderD128, renderD129, ... etc., so an element with a constant name will be + registered on renderD128 on hardware with multiple drm nodes.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1988> + +2021-02-02 04:33:09 +0900 Seungha Yang <seungha@centricular.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2decoder: Small documentation fix + Fixing documentation even though those methods are v4l2codecs plugin internals + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/2000> + +2021-01-29 09:43:07 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2codecs: h264: Enable 1 frame delay on non-live + When doing non-live decoding, enable 1 frame of delay. This will ensure + that we queue the next decoding job before we actually wait for the previous + to complete. This improves throughput notably on RK3399. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1881> + +2021-01-29 09:41:22 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + * sys/v4l2codecs/gstv4l2decoder.h: + v4l2codecs: Add support for render delay + This adds support for render delay in the decoder helper. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1881> + +2021-01-27 15:55:43 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2decoder.c: + v4l2codecs: Coding style fix + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1881> + +2021-01-27 15:53:49 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/gstv4l2decoder.c: + * sys/v4l2codecs/gstv4l2decoder.h: + v4l2codecs: Poll inside set_done() + This removes the need for the gst_v4l2_decoder_is_done() helper and + slightly simplifies the subclass code.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1881> + +2020-12-18 16:36:16 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/gstv4l2decoder.c: + * sys/v4l2codecs/gstv4l2decoder.h: + v4l2codecs: Make request structure ref-counted + This adds a non-thread safe refcount to the GstV4l2Request. This will + allow holding on to more than one request in order to implement render + delay. This is made non-thread safe for speed as we know this will all + happen on the same streaming thread. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1881> + +2020-12-14 17:07:01 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/v4l2codecs/gstv4l2codecvp8dec.c: + * sys/v4l2codecs/gstv4l2decoder.c: + * sys/v4l2codecs/gstv4l2decoder.h: + v4l2codecs: Rework handling of queues and pending requests + Starting from this patch, all queue and dequeue operations happening + on V4L2 are now abstracted with the request. Buffers are dequeued + automatically when pending requests are marked done and only 1 in-flight + request is now used. + Along with fixing issues with requests not being reused with slice + decoders, this change reduces the memory footprint by allocating only + two bitstream buffers.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1881> 2021-01-29 02:09:05 -0500 Staz M <staz@staz.io> * sys/decklink/gstdecklink.cpp: decklink: Fixed decklinkvideosink auto format detection - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1998> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1994> + +2021-01-28 04:03:37 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvh264dec.c: + nvh264sldec: Add support for output-delay to improve throughput performance + The NVDEC API supports delaying getting the decoded output, and the + delay recommended by the API document is 4 frames. In case throughput is + a more critical factor than latency, we can prefer delayed output + as recommended by NVIDIA. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1925> + +2020-12-29 19:54:35 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264decoder.h: + codecs: h264decoder: Add support for output delay + Some decoding APIs support delayed output, or a command for decoding + a frame doesn't need to be sequential to the corresponding command for + getting the decoded frame. For instance, a subclass might be able to + request decoding for multiple frames and then get one (the oldest) + decoded frame. + If the aforementioned case is supported by a specific decoding API, + delayed output might show better throughput performance. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1925> + +2021-01-27 17:09:07 -0500 Arun Raghavan <arun@asymptotic.io> + + * ext/ldac/gstldacenc.h: + * ext/ldac/meson.build: + ldac: Use pkg-config instead of raw lib/header search + The ldacBT library includes pkg-config files for the standard and ABR + libraries, so let's just use that instead of doing a header/library + search.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1990> + +2021-01-28 02:02:28 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.c: + d3d11videosink: Don't limit max buffers of buffer pool + In some cases, especially reverse playback, we would need more than + two buffers. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1989> + +2021-01-27 04:34:13 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/gstd3d11device.c: + * sys/d3d11/gstd3d11shader.c: + d3d11: Suppress some warning debug messages + * Don't warn for live objects, since ID3D11Debug itself seems to be + holding a refcount of ID3D11Device at the moment we call + ID3D11Debug::ReportLiveDeviceObjects(). It would always report + live objects + * The device might not be able to support some formats (e.g., P010), + especially in case of a WARP device. We don't need to warn about that. + * gst_d3d11_device_new() can be used for device enumeration. Therefore, + don't warn even if we cannot create a D3D11 device with the given adapter index. + * Don't warn for HLSL compiler warnings. It's just noise and + should not be a critical thing at all + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1986> + +2020-12-11 05:23:20 +0900 Seungha Yang <seungha@centricular.com> + + * tests/examples/d3d11videosink/d3d11device.cpp: + * tests/examples/d3d11videosink/d3d11device.h: + * tests/examples/d3d11videosink/d3d11videosink-shared-texture-d3d9ex.cpp: + * tests/examples/d3d11videosink/d3d11videosink-shared-texture.cpp: + * tests/examples/d3d11videosink/meson.build: + examples: Add d3d11videosink examples for shared-texture use cases + Add two examples to demonstrate "draw-on-shared-texture" use cases.
+ d3d11videosink will draw the application's own texture without a copy + by: + - enabling the "draw-on-shared-texture" property + - making use of the "begin-draw" and "draw" signals + The application will then render the shared texture + to the swapchain's backbuffer by using + 1) Direct3D11 APIs + 2) Or, Direct3D9Ex + interop APIs + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1873> + +2020-12-23 23:49:12 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videosink.c: + * sys/d3d11/gstd3d11videosink.h: + * sys/d3d11/gstd3d11videosinkbin.c: + * sys/d3d11/gstd3d11videosinkbin.h: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window.h: + * sys/d3d11/gstd3d11window_dummy.cpp: + * sys/d3d11/gstd3d11window_dummy.h: + * sys/d3d11/meson.build: + d3d11videosink: Add support for drawing on application's own texture + Add a way to support drawing on the application's texture instead of + the usual window handle. + To make use of this new feature, the application should follow the steps below. + 1) Enable this feature by using the "draw-on-shared-texture" property + 2) Watch the "begin-draw" signal + 3) In the "begin-draw" signal handler, the application can request drawing + by using the "draw" signal action. Note that the "draw" signal action + should happen before the "begin-draw" signal handler returns + NOTE 1) For texture sharing, creating a texture with the + D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX flag is strongly recommended + if possible, because we cannot ensure sync for a texture + which was created with D3D11_RESOURCE_MISC_SHARED, + and it would cause glitches in the ID3D11VideoProcessor use case. + NOTE 2) Direct3D9Ex doesn't support sharing of a texture which was + created with D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX. In other words, + D3D11_RESOURCE_MISC_SHARED is the only option for Direct3D11/Direct3D9Ex interop.
+ NOTE 3) Because of the missing synchronization around ID3D11VideoProcessor, + if the shared texture was created with D3D11_RESOURCE_MISC_SHARED, + d3d11videosink might use a fallback texture to convert the DXVA texture + to a normal Direct3D texture. The converted texture will then be + copied to the user-provided shared texture. + * Why not use the generic appsink approach? + In order for the application to be able to store video data + which was produced by GStreamer in the application's own texture, + there are two possible approaches: + one is copying our texture into the application's own texture, + and the other is drawing on the application's own texture directly. + The former (the appsink way) cannot be zero-copy by nature. + In order to support zero-copy processing, we need to draw on + the application's own texture directly. + For example, assume that the application wants an RGBA texture. + Then we can imagine the following case. + "d3d11h264dec ! d3d11convert ! video/x-raw(memory:D3D11Memory),format=RGBA ! appsink" + ^ + |_ allocate new Direct3D texture for RGBA format + In the above case, d3d11convert will allocate new texture(s) for the RGBA format + and then the application will copy our RGBA texture into + the application's own texture. One extra texture allocation plus a per-frame GPU copy + will therefore happen in that case. + Moreover, in order for the application to be able to access + our texture, we need to allocate the texture with additional flags for + the application's Direct3D11 device to be able to read the texture data. + That would be another implementation burden on our side. + But with this MR, we can configure the pipeline in this way: + "d3d11h264dec ! d3d11videosink". + In that way, we can save at least one texture allocation and + a per-frame texture copy, since d3d11videosink will convert the incoming texture + into the application's texture format directly without a copy. + * What if we expose the texture without conversion and the application does + the conversion by itself? + As mentioned above, for the application to be able to access our texture + from the application's Direct3D11 device, we need to allocate the texture + in a special form. But in some cases, that might not be possible. + Also, if a texture belongs to the decoder DPB, exposing such a texture + to the application is unsafe, and a usual Direct3D11 shader cannot handle + such a texture. To convert the format, the ID3D11VideoProcessor API needs to + be used, but that would be an implementation burden for the application. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1873> + +2021-01-20 20:04:20 +0800 Haihua Hu <jared.hu@nxp.com> + + * ext/dash/gstmpdhelper.c: + dashsink: add h265 codec support + Return hvc1 for video/x-h265 mime type in mpd helper function + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1966> + +2021-01-23 23:25:30 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: set the default alignment for input and output. + 1. Set the default output alignment to frame, rather than the current + alignment of obu. This makes it the same behaviour as h264/h265 + parse, which default align to AU. + 2. Set the default input alignment to byte. It can handle the "not + enough data" error while the OBU alignment can not. Also make it + conform to the comments. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1979> + +2021-01-23 19:26:59 +0800 He Junyan <junyan.he@intel.com> + + * tests/check/elements/av1parse.c: + * tests/check/elements/av1parse.h: + test: Add more test cases for the av1parse obu aligned output. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1979> + +2021-01-23 19:21:21 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Reset the annex_b when meet TU inside a buffer.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1979> + +2021-01-23 19:05:57 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Output each OBU when output is aligned to obu. + The current behaviour for obu aligned output is not very precise. + Several OBUs will be output together within one gst buffer. We + should output each gst buffer containing just one OBU. This is + the same way the h264/h265 parsers do it when NAL aligned. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1979> + +2021-01-23 17:38:12 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Always copy the OBU to cache. + The current optimization when the input alignment and output alignment are + the same is not quite correct. We simply copy the data from the input + buffer to the output buffer, but we fail to consider the dropping of + OBUs. When we need to drop some OBUs (such as filtering out the OBUs + of some temporal ID), we cannot do a simple copy. So we need to + always copy the input OBUs into a cache. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1979> + +2021-01-23 17:26:25 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Improve the logic when to drop the OBU. + When dropping some OBUs, we need to go on. The current manner can make + the data access go out of range of the buffer mapping. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1979> + +2021-01-26 11:12:28 +0100 Marijn Suijten <marijns95@gmail.com> + + * ext/ldac/gstldacenc.c: + ext/ldac: Move duplicate sampling rates into #define + Because there was a typo in one of the duplicates already (see previous + commit) it is much safer to specify these once and only once.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1985> + +2021-01-26 11:02:21 +0100 Marijn Suijten <marijns95@gmail.com> + + * ext/ldac/gstldacenc.c: + ext/ldac: Fix typo in 88200(0) stereo encoder sampling rate + Fixes: a57681455 ("ext: Add LDAC encoder") + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1985> + +2021-01-11 01:06:24 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11mpeg2dec.c: + * sys/d3d11/gstd3d11mpeg2dec.h: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.c: + d3d11: Add support for MPEG-2 video decoding + Add DXVA/Direct3D11 API based MPEG-2 decoder element + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1969> + +2020-11-27 16:18:29 +1100 Matthew Waters <matthew@centricular.com> + + * ext/wpe/WPEThreadedView.cpp: + wpesrc: fix possible small deadlock on shutdown + Problem is that unreffing the EGLImage/SHM Buffer while holding the + images_mutex lock may deadlock when a new buffer is advertised and + an attempt is made to lock the images_mutex there. + The advertisement of the new image/buffer is performed in the + WPEContextThread and the blocking dispatch when unreffing wants to run + something on the WPEContextThread however images_mutex has already been + locked by the destructor. + Delay unreffing images/buffers outside of images_mutex and instead just + clear the relevant fields within the lock. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1843> 2021-01-20 18:16:17 +0800 Haihua Hu <jared.hu@nxp.com> @@ -868,43 +14371,615 @@ dashsink: fix double unref of sinkpad caps no need to unref caps in gst_mpd_helper_get_XXX_codec_from_mime it will be unref in caller gst_dash_sink_get_stream_metadata() - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1991> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1981> + +2021-01-22 16:56:24 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: Fix a typo in frame_restoration_type setting. + Fixes: #1500 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1974> + +2021-01-22 14:01:01 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + av1parse: Fix some issues in the src caps. + 1. Add the mono_chrome to identify 4:0:0 chroma-format. + 2. Correct the mapping between subsampling_x/y and chroma-format. + There is no 4:4:0 format definition in AV1. And 4:4:4 should + let both subsampling_x/y be equal to 0. + 3. Send the chroma-format when the color space is not RGB. + Fixes: #1502 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1974> + +2021-01-22 13:25:50 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstvp9parse.c: + vp9parse: Fix the subsampling_x/y to chroma format mapping. + The chroma format 4:4:4 needs both subsampling_x and subsampling_y + equal to 0. 
+ Fixes: #1502 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1974> + +2021-01-22 21:10:59 +1100 Matthew Waters <matthew@centricular.com> + + * gst-libs/gst/vulkan/gstvkutils.c: + vulkan: remove duplicated check + Checking the same value twice is pointless + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1504 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1977> + +2021-01-22 19:26:18 +1100 Matthew Waters <matthew@centricular.com> + + * ext/ldac/meson.build: + ldac: also look for the ldac/ldacBT.h header. + Otherwise there will be a scenario where the library can be found but + not the header and a compilation build error will result + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1975> + +2021-01-22 09:35:30 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: fix assignation to proper variable + Fix the result of a wrong copy&paste + Fixes: #1501 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1976> + +2021-01-21 04:41:44 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideoenc.cpp: + mfvideoenc: Add support for P010 d3d11 texture + Add P010 Direct3D11 texture format support + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1970> + +2021-01-20 02:29:43 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11compositor.c: + * sys/d3d11/gstd3d11compositorbin.c: + * sys/d3d11/gstd3d11desktopdupsrc.c: + * sys/d3d11/gstd3d11download.c: + * sys/d3d11/gstd3d11pluginutils.c: + * sys/d3d11/gstd3d11pluginutils.h: + * sys/d3d11/gstd3d11upload.c: + * sys/d3d11/gstd3d11videosink.c: + * sys/d3d11/gstd3d11videosinkbin.c: + * sys/d3d11/plugin.c: + d3d11: Don't use hardcoded maximum resolution value + Maximum supported texture dimension is pre-defined 
based on + feature level and it couldn't be INT_MAX in any case. + See also https://docs.microsoft.com/en-us/windows/win32/direct3d11/overviews-direct3d-11-devices-downlevel-intro + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1964> + +2021-01-16 19:14:06 +0800 He Junyan <junyan.he@intel.com> + + * docs/plugins/gst_plugins_cache.json: + doc: Add the av1 parse element. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1614> + +2021-01-16 16:48:38 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: Exclude the size of obu_size when identify OBU. + obu->obu_size does not contain the bytes of obu_size itself; we need + to exclude it when doing the sanity check. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1614> + +2021-01-06 23:33:24 +0800 He Junyan <junyan.he@intel.com> + + * tests/check/elements/av1parse.c: + * tests/check/elements/av1parse.h: + * tests/check/meson.build: + test: Add test cases for av1parse element. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1614> + +2020-09-22 14:54:19 +0800 He Junyan <junyan.he@intel.com> + + * gst/videoparsers/gstav1parse.c: + * gst/videoparsers/gstav1parse.h: + * gst/videoparsers/meson.build: + * gst/videoparsers/plugin.c: + videoparsers: av1: Add the AV1 parse. + This AV1 parser implements the conversion between the obu, tu and + frame alignments, and the conversion between the obu-stream and + annexb stream-formats. + TODO: + 1. May need a property of operating_point to filter the OBUs + 2. May add a property to disable deep parse. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1614> + +2021-01-20 00:57:05 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.h: + codecs: mpeg2decoder: Fix a typo in header file's comment.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1963>
+
+2021-01-18 20:30:44 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ d3d11videosink: Fix ugly thread name for Win32 window impl.
+ No need to put Win32 twice
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1962>
+
+2021-01-18 20:28:14 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11window_corewindow.cpp:
+ * sys/d3d11/gstd3d11window_swapchainpanel.cpp:
+ d3d11videosink: Fix MSVC build warnings around UWP code
+ gstd3d11window_corewindow.cpp(408): warning C4189:
+ 'storage': local variable is initialized but not referenced
+ gstd3d11window_corewindow.cpp(490): warning C4189:
+ 'self': local variable is initialized but not referenced
+ gstd3d11window_swapchainpanel.cpp(481): warning C4189:
+ 'self': local variable is initialized but not referenced
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1962>
+
+2021-01-18 19:17:14 +0900 Seungha Yang <seungha@centricular.com>
+
+ * gst-libs/gst/d3d11/gstd3d11config.h.meson:
+ * gst-libs/gst/d3d11/meson.build:
+ * sys/d3d11/gstd3d11videosink.c:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/meson.build:
+ d3d11: Allow building UWP features with Desktop features if possible
+ WINAPI_PARTITION_DESKTOP and WINAPI_PARTITION_APP can coexist.
+ Although UWP-only binaries should be used for the production stage,
+ this change will be useful for the development stage
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1962>
+
+2020-12-28 02:35:38 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11decoder.c:
+ d3d11decoder: Do more retry for ID3D11VideoContext::DecoderBeginFrame failure
+ Some GPUs (especially NVIDIA) complain that the GPU is still busy
+ even after we retried 50 times with a 1 ms sleep per failure.
+ Because DXVA/D3D11 doesn't provide API for "GPU-IS-READY-TO-DECODE" + like signal, there seems to be still no better solution other than sleep. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1913> 2021-01-18 19:23:30 +0900 Seungha Yang <seungha@centricular.com> * sys/d3d11/gstd3d11videosink.c: d3d11videosink: Fix build error on UWP gstd3d11videosink.c(662): error C2065: 'sink': undeclared identifier - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1972> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1961> -2021-01-14 02:17:31 +0000 Tim-Philipp Müller <tim@centricular.com> +2021-01-17 01:16:17 +0800 He Junyan <junyan.he@intel.com> - * meson.build: - Back to development - -=== release 1.18.3 === + * sys/va/gstvabasedec.c: + * sys/va/gstvautils.c: + va: Fix some gst_object_unref error because the pointer is NULL. + !1957 introduces some error of gst_object_unref for NULL pointer. + Fixes all of them. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1959> -2021-01-13 21:11:22 +0000 Tim-Philipp Müller <tim@centricular.com> +2021-01-15 16:05:06 +0800 He Junyan <junyan.he@intel.com> - * ChangeLog: - * NEWS: - * RELEASE: - * gst-plugins-bad.doap: - * meson.build: - Release 1.18.3 + * sys/va/gstvadecoder.c: + va: Make the caps pointer operation atomic in vadecoder. + The vadecoder's srcpad_caps and sinkpad_caps pointers are outside of the + mutex protection. Just make all operation for them atomic. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1957> -2020-12-21 14:06:53 +0530 Raju Babannavar <raju.babannavar@gmail.com> +2021-01-15 15:22:07 +0800 He Junyan <junyan.he@intel.com> - * gst/dvbsuboverlay/dvb-sub.c: - dvbsuboverlay: Add support for dynamic resolution update. 
- Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1487 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1954> + * sys/va/gstvabasedec.c: + * sys/va/gstvautils.c: + va: Fix a latent race condition in vabasedec. + The vabasedec's display and decoder are created/destroyed between + the gst_va_base_dec_open/close pair. All the data and event handling + functions are between this pair and so the accessing to these pointers + are safe. But the query function can be called anytime. So we need to: + 1. Make these pointers operation in open/close and query atomic. + 2. Hold an extra ref during query function to avoid it destroyed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1957> -2020-12-16 18:32:25 +0200 Sebastian Dröge <sebastian@centricular.com> +2021-01-14 14:37:32 +0200 Sebastian Dröge <sebastian@centricular.com> * sys/decklink/gstdecklinkaudiosrc.cpp: - decklinkaudiosrc: Fix duration of the first audio frame after each discont - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1891> + decklinkaudiosrc: Allow disabling audio sample alignment code by setting the alignment-threshold to 0 + And handle setting it to GST_CLOCK_TIME_NONE as always aligning without + ever detecting a discont. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1956> + +2020-12-21 05:11:03 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh264enc.h: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfh265enc.h: + * sys/mediafoundation/gstmftransform.cpp: + * sys/mediafoundation/gstmftransform.h: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/gstmfvideoenc.h: + * sys/mediafoundation/gstmfvp9enc.cpp: + * sys/mediafoundation/gstmfvp9enc.h: + * sys/mediafoundation/meson.build: + * sys/mediafoundation/plugin.c: + mfvideoenc: Add support for Direct3D11 texture + Initial support for d3d11 texture so that encoder can copy + upstream d3d11 texture into encoder's own texture pool without + downloading memory. + This implementation requires MFTEnum2() API for creating + MFT (Media Foundation Transform) object for specific GPU but + the API is Windows 10 desktop only. So UWP is not target + of this change. 
+ See also https://docs.microsoft.com/en-us/windows/win32/api/mfapi/nf-mfapi-mftenum2 + Note that, for MF plugin to be able to support old OS versions + without breakage, this commit will load MFTEnum2() symbol + by using g_module_open() + Summary of required system environment: + - Needs Windows 10 (probably at least RS 1 update) + - GPU should support ExtendedNV12SharedTextureSupported feature + - Desktop application only (UWP is not supported yet) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1903> + +2021-01-12 19:12:42 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/webrtctransceiver.c: + * gst-libs/gst/webrtc/rtpreceiver.c: + * gst-libs/gst/webrtc/rtpsender.c: + webrtc: expose transport property on sender and receiver + As advised by !1366#note_629558 , the nice transport should be + accessed through: + > transceiver->sender/receiver->transport/rtcp_transport->icetransport + All the objects on the path can be accessed through properties + except sender/receiver->transport. This patch addresses that. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1952> + +2020-12-21 02:47:45 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/d3d11/d3d11-prelude.h: + * gst-libs/gst/d3d11/gstd3d11.h: + * gst-libs/gst/d3d11/gstd3d11_fwd.h: + * gst-libs/gst/d3d11/gstd3d11_private.h: + * gst-libs/gst/d3d11/gstd3d11bufferpool.c: + * gst-libs/gst/d3d11/gstd3d11bufferpool.h: + * gst-libs/gst/d3d11/gstd3d11config.h.meson: + * gst-libs/gst/d3d11/gstd3d11device.c: + * gst-libs/gst/d3d11/gstd3d11device.h: + * gst-libs/gst/d3d11/gstd3d11format.c: + * gst-libs/gst/d3d11/gstd3d11format.h: + * gst-libs/gst/d3d11/gstd3d11memory.c: + * gst-libs/gst/d3d11/gstd3d11memory.h: + * gst-libs/gst/d3d11/gstd3d11utils.c: + * gst-libs/gst/d3d11/gstd3d11utils.h: + * gst-libs/gst/d3d11/meson.build: + * gst-libs/gst/meson.build: + * sys/d3d11/gstd3d11basefilter.c: + * sys/d3d11/gstd3d11basefilter.h: + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11colorconverter.c: + * sys/d3d11/gstd3d11colorconverter.h: + * sys/d3d11/gstd3d11compositor.c: + * sys/d3d11/gstd3d11compositor.h: + * sys/d3d11/gstd3d11compositorbin.c: + * sys/d3d11/gstd3d11compositorbin.h: + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11desktopdup.cpp: + * sys/d3d11/gstd3d11desktopdup.h: + * sys/d3d11/gstd3d11desktopdupsrc.c: + * sys/d3d11/gstd3d11desktopdupsrc.h: + * sys/d3d11/gstd3d11download.c: + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11overlaycompositor.c: + * sys/d3d11/gstd3d11overlaycompositor.h: + * sys/d3d11/gstd3d11pluginutils.c: + * sys/d3d11/gstd3d11pluginutils.h: + * sys/d3d11/gstd3d11shader.c: + * sys/d3d11/gstd3d11shader.h: + * sys/d3d11/gstd3d11upload.c: + * sys/d3d11/gstd3d11utils.c: + * sys/d3d11/gstd3d11videoprocessor.c: + * sys/d3d11/gstd3d11videoprocessor.h: + * sys/d3d11/gstd3d11videosink.c: + * sys/d3d11/gstd3d11videosink.h: + * sys/d3d11/gstd3d11videosinkbin.c: + * 
sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + * sys/d3d11/gstd3d11window.cpp: + * sys/d3d11/gstd3d11window.h: + * sys/d3d11/gstd3d11window_corewindow.cpp: + * sys/d3d11/gstd3d11window_corewindow.h: + * sys/d3d11/gstd3d11window_swapchainpanel.cpp: + * sys/d3d11/gstd3d11window_swapchainpanel.h: + * sys/d3d11/gstd3d11window_win32.cpp: + * sys/d3d11/gstd3d11window_win32.h: + * sys/d3d11/meson.build: + * sys/d3d11/plugin.c: + d3d11: Move core methods to gst-libs + Move d3d11 device, memory, buffer pool and minimal method + to gst-libs so that other plugins can access d3d11 resource. + Since Direct3D is primary graphics API on Windows, we need + this infrastructure for various plugins can share GPU resource + without downloading GPU memory. + Note that this implementation is public only for -bad scope + for now. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/464> + +2021-01-12 00:13:22 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvaallocator.c: + va: allocator: Fix deadlock caused by double lock + Trivial bug fix for deadlock + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1949> + +2021-01-04 19:34:40 +1100 Matthew Waters <matthew@centricular.com> + + * ext/wpe/gstwpesrc.cpp: + wpesrc: replace object lock usage with a new lock + Using the object lock is problematic for anything that can dispatch to + another thread which is what createWPEView() does inside + gst_wpe_src_start(). Using the object lock there can cause a deadlock. + One example of such a deadlock is when createWPEView is called, but + another (or the same) wpesrc is on the WPEContextThread and e.g. posts a + bus message. This message propagations takes and releases the object + lock of numerous elements in quick succession for determining various + information about the elements in the bin. 
If the object lock is + already held, then the message propagation will block and stall bin + processing (state changes, other messages) and wpe servicing any events. + Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1490 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1934> + +2021-01-10 23:16:55 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264picture: Count only complete complementary field pair for dpb fullness decision + Our DPB implementation was designed as such that allowing + temporary DPB overflow in the middle of field picture decoding + and incomplete field pair should not trigger DPB bumping. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1947> + +2021-01-10 23:11:01 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Add support for field-pair input frame + In case that upstream pushed buffer as a frame unit, not picture + unit for interlaced stream, baseclass should be able to detect + AU boundary (i.e., complementary field pair). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1947> + +2021-01-10 22:01:27 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Remove unused private variables + ... and reset() method to clear internal status at one place + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1947> + +2020-12-22 02:29:03 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: try harder not to pick duplicate media ids + On renegotiation, or when the user has specified a mid for + a transceiver, we need to avoid picking a duplicate mid for + a transceiver that doesn't yet have one. 
+ Also assign the mid we created to the transceiver, that doesn't + fix a specific bug but seems to make sense to me. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1902> + +2021-01-07 23:47:35 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/meson.build: + * tests/examples/va/meson.build: + meson: va: Skip configuration on non-linux environment + VA plugin is linux-only plugin, so we can skip it earlier. + Note that this plugin is making use of libdrm meson fallback, + which is unusable on the other platforms such as Windows + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1946> + +2021-01-07 12:41:16 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkenc.c: + msdkenc: the unit for max-frame-size is kbyte + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1944> + +2021-01-07 09:21:47 +0100 Edward Hervey <edward.hervey@collabora.co.uk> + + * ext/srt/gstsrtobject.c: + srt: Define options added in later revisions + Allows compiling the plugin against old headers. + For SRTO_BINDTODEVICE there's nothing we can do, since the value depends on + configuration options of the library. Nice. + Fixes build with libsrt < 1.4.2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1945> + +2020-10-16 19:30:59 +0200 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + srtobject: distinguish authentication error messages + Use GST_RESOURCE_ERROR_NOT_AUTHORIZED code in posted error messages + related to SRT authentication (e.g. incorrect or missing password) so + that the application can recognize them more easily. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1943> + +2020-10-16 19:27:37 +0200 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + srtobject: detect socket errors from srt_epoll_wait() + On an error event, epoll wait puts the failed socket in both readfds and + writefds. We can take advantage of this and avoid explicitly checking + socket state before every read or write attempt. + In addition, srt_getrejectreason() will give us more detailed + description of the connection failure. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1943> + +2020-12-30 13:51:21 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcice.c: + * ext/webrtc/transportsendbin.c: + * ext/webrtc/transportsendbin.h: + * ext/webrtc/transportstream.c: + * ext/webrtc/transportstream.h: + * ext/webrtc/webrtctransceiver.h: + webrtcbin: Remove remnant of non-rtcp-mux mode + There was some code left that wasn't used anymore. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1930> + +2020-11-24 22:25:15 +0100 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + srtobject: make possible to specify more sockopts in SRT URI + Any socket option that can be passed to libsrt's srt-live-transmit + through SRT URI query string is now recognized. + Also make the code that applies options to SRT sockets more generic. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1842> + +2020-08-26 14:33:57 +0200 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtsrc.c: + srtsrc: fix typos + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1541> + +2020-08-25 13:44:42 +0200 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtsink.c: + srtsink: remove unused connection_mode variable + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1541> + +2020-11-23 16:12:39 +0100 Jakub Adam <jakub.adam@collabora.com> + + * ext/srt/gstsrtobject.c: + srtobject: obey "wait-for-connection" in caller mode + The pipeline now gets stuck in gst_srt_object_write_one() until the + receiver comes online, which may or may not be desired based on the use + case. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1836> + +2021-01-05 14:18:39 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvampeg2dec.c: + va: mpeg2dec: refactor the picture reference filling + Add the helper function _get_surface_id() which extracts the + VASurfaceID from the passed picture. This function gets the surface of + the next and previous reference picture. + Instead of if-statements, this refactor uses a switch-statement with a + fall-through, for P-type pictures, making the code a bit more readable. + Also it adds quirks for gallium driver, which cannot handle invalid + surfaces as forwarding nor backwarding references, so the function fails. + Also iHD cannot handle them, but to avoid failing, the current picture + is used as self-reference. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1939> + +2021-01-05 14:16:45 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvampeg2dec.c: + va: mpeg2dec: set first field either frame or has a first field + Add a helper function _is_frame_start() which check if picture has a + frame structure or if it has not an interlaced first field yet. This + function is used with filling is_first_field parameter. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1939> + +2021-01-06 16:38:14 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2decoder: decode only if B and not closed gop + Mark as decode only if picture type is B, without previous picture in DBP and + closed_gop is 0 as might be understood in "6.3.8 Group of pictures header". + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1939> + +2021-01-06 12:48:14 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2decoder: rename variables + Since prev_picture and next_picture are plain pointers, not pointer to pointers, + it's misleading to name them with _ptr suffix. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1939> + +2021-01-04 21:02:35 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadisplay.c: + * sys/va/gstvadisplay.h: + va: display: parse and set driver implementation + This enum can be used for quirk handling. It's not a property because + the driver enum list might change, it's not static, thus avoiding the + update of GType declaration. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1938>
+
+2021-01-04 20:56:26 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvadisplay.c:
+ * sys/va/gstvadisplay.h:
+ va: display: add function precondition check
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1938>
+
+2020-08-25 19:12:13 +0200 Jakub Adam <jakub.adam@collabora.com>
+
+ * ext/srt/gstsrtobject.c:
+ srtobject: post a message on the bus when broken socket is detected
+ So that the application gets notified and may react to it.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1935>
+
+2020-12-30 23:29:47 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvampeg2dec.c:
+ va: mpeg2dec: Use the current picture's surface when missing reference.
+ When the reference frames are missing, we should not just discard the current
+ frame. Some streams have a group of pictures header. It is an optional header
+ that can be used immediately before a coded I-frame to indicate to the decoder
+ whether the first consecutive B-pictures immediately following the coded I-frame can
+ be reconstructed properly in the case of a random access.
+ In that case, the B frames may miss the previous reference and can still be
+ correctly decoded. We also notice that the second field of the I frame may
+ be set to P type, and it only references its first field.
+ We should not skip all those frames, and even when a frame really misses its
+ reference frame, some technique such as inserting a grey picture should be used
+ to handle these cases.
+ The driver crashes when it needs to access the reference picture while we set
+ forward_reference_picture or backward_reference_picture to VA_INVALID_ID. We
+ now set it to the current picture to avoid this. This is just a temporary workaround.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1929> + +2020-12-30 23:14:01 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2decoder: Creating the field based on its arriving time. + Spec says: + In a frame picture top_field_first being set to ‘1’ indicates that the + top field of the reconstructed frame is the first field output by the + decoding process. top_field_first being set to ‘0’ indicates that the + bottom field of the reconstructed frame is the first field output by + decoding process. + Here, the "output" should be interpreted just as the output order, not + including the decoding order. The field should be decoded as the order + they comes in the stream. Namely, no matter top_field_first is 0 or 1, + the first coming field is the first one to be decoded. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1929> + +2021-01-01 16:00:10 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvampeg2dec.c: + va: mpeg2dec: Apply buffer_flags to the output buffer. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1929> + +2021-01-01 15:56:03 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstmpeg2picture.h: + codecs: Add buffer_flags for mpeg2 picture. + We need to store the buffer flags such as GST_VIDEO_BUFFER_FLAG_INTERLACED + and GST_VIDEO_BUFFER_FLAG_TFF for interlaced video. Without these flags, + the VPP and display elements can not apply filter correctly. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1929> + +2020-12-30 23:00:51 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: Reset the quant matrices for each sequence in mpeg2 decoder. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1929>
+
+2020-10-27 11:52:09 +0530 Raghavendra <raghavendra.rao@collabora.com>
+
+ * ext/srt/gstsrtobject.c:
+ * ext/srt/gstsrtobject.h:
+ * ext/srt/gstsrtsink.c:
+ * ext/srt/gstsrtsink.h:
+ * ext/srt/gstsrtsrc.c:
+ * ext/srt/gstsrtsrc.h:
+ srt: Add authentication to srtsink and srtsrc elements
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1725>

2020-12-30 22:52:01 +0800 Haihua Hu <jared.hu@nxp.com>

@@ -915,7 +14990,52 @@
 GstDateTime object, this object will be unref twice and cause reference
 count issue. Should use g_value_dup_boxed() to copy this object.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1932>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1928>
+
+2020-12-23 16:11:42 +0800 Haihao Xiang <haihao.xiang@intel.com>
+
+ * sys/msdk/gstmsdkh264enc.c:
+ * sys/msdk/gstmsdkh264enc.h:
+ * sys/msdk/gstmsdkh265enc.c:
+ * sys/msdk/gstmsdkh265enc.h:
+ msdkenc{h264,h265}: add min-qp and max-qp properties
+ The SDK allows the user to set a QP range [1], so add min-qp and max-qp to
+ specify the QP range. By default, there are no limitations on QP.
+ [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxextcodingoption2
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1908>
+
+2020-12-23 13:36:02 +0800 Haihao Xiang <haihao.xiang@intel.com>
+
+ * sys/msdk/gstmsdkh264enc.c:
+ * sys/msdk/gstmsdkh264enc.h:
+ * sys/msdk/gstmsdkh265enc.c:
+ * sys/msdk/gstmsdkh265enc.h:
+ msdkenc{h264,h265}: add p-pyramid property
+ The SDK can support the P-Pyramid reference structure [1], so add a new
+ property to enable this feature in msdkenc{h264,h265}.
+ [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#preftype + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1908> + +2020-12-22 16:17:18 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkh265enc.c: + * sys/msdk/gstmsdkh265enc.h: + msdkh265enc: add b-pyramid property + Like as msdkh264enc, b-pyramid is added to enable B-Pyramid reference + structure for H265 encoding + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1908> + +2020-12-22 14:54:59 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkh265enc.c: + * sys/msdk/gstmsdkh265enc.h: + * sys/msdk/msdk-enums.c: + * sys/msdk/msdk-enums.h: + msdkh265enc: add transform-skip property + Since the SDK API 1.26, TransformSkip was added to control + transform_skip_enabled_flag setting in PPS [1] + [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#mfxextcodingoption3 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1908> 2020-12-29 09:41:05 +0800 Haihao Xiang <haihao.xiang@intel.com> @@ -925,7 +15045,142 @@ finalize. See https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1867#note_739346 for the double free issue. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1927> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1916> + +2020-12-29 13:29:05 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: PLI/FIR/NACK direction are the opposite of the media + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1924> + +2020-12-29 13:15:10 +0200 Sebastian Dröge <sebastian@centricular.com> + + * ext/assrender/gstassrender.c: + assrender: Don't try unlocking unlocked mutex + When flushing right at the beginning of the video chain function or + when failing negotiation at the top of the function, the assrender mutex + would be unlocked without being previously locked. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1918> + +2020-12-27 22:16:13 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11compositor.c: + d3d11compositor: Add support for resolution change + Not only for position update (e.g., xpos, ypos), + we need to configure shader again per resolution change of each + input stream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1912> + +2020-12-28 04:33:11 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11shader.c: + d3d11shader: Fix ID3DBlob object leak + Even if HLSL compiler was able to compile our shader code, D3DCompile() + might return ID3DBlob object for compile warnings and the object + should be released. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1914> + +2020-12-28 17:13:22 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: Fix a typo in mpeg2 stateless decoder base class. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1915> + +2020-12-24 20:07:09 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvampeg2dec.c: + va: mpeg2dec: cosmetic changes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1798> + +2020-12-27 15:47:13 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstmpeg2decoder.h: + * gst-libs/gst/codecs/gstmpeg2picture.c: + * gst-libs/gst/codecs/gstmpeg2picture.h: + codecs: mpeg2decoder: fix documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1798> + +2020-12-24 16:20:31 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + codecs: mpeg2decoder: simplify macros + For constructors, instead of casting to pointers, cast to the structures. + For compare, use inlined functions. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1798> + +2020-12-18 22:28:41 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.h: + * sys/va/gstvadecoder.c: + * sys/va/gstvampeg2dec.c: + * sys/va/gstvampeg2dec.h: + * sys/va/meson.build: + * sys/va/plugin.c: + va: Add mpeg2 VA decoder. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1798> + +2020-12-18 21:25:08 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstmpeg2decoder.c: + * gst-libs/gst/codecs/gstmpeg2decoder.h: + * gst-libs/gst/codecs/gstmpeg2picture.c: + * gst-libs/gst/codecs/gstmpeg2picture.h: + * gst-libs/gst/codecs/meson.build: + codecs: Add mpeg2 stateless decoder base class. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1798> + +2020-12-27 03:16:28 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfh264enc.cpp: + * sys/mediafoundation/gstmfh265enc.cpp: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/gstmfvideoenc.h: + * sys/mediafoundation/gstmfvp9enc.cpp: + mfvideoenc: Re-define default GOP size value + The behavior for zero AVEncMPVGOPSize value would be + varying depending on GPU vendor implementation and some + GPU will produce keyframe only once at the beginning of encoding. + That's unlikely expected result for users. + To make this property behave consistently among various GPUs, + this commit will change default value of "gop-size" property to -1 + which means "auto". When "gop-size" is unspecified, then + mfvideoenc will calculate GOP size based on framerate + like that of our x264enc implementation. + See also + https://docs.microsoft.com/en-us/windows/win32/directshow/avencmpvgopsize-property + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1911> + +2020-12-27 03:43:11 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideoenc.cpp: + mfvideoenc: Fix use of uninitialized value + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1911> + +2020-12-24 21:31:04 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11device.c: + * sys/d3d11/gstd3d11utils.c: + d3d11device: Add property for getting adapter LUID + LUID (Locally Unique Identifier) can used for identifying GPU + and that's required for some Windows APIs (e.g., MFTEnum2()) to setup device. 
See also
+ https://docs.microsoft.com/en-us/windows/win32/api/mfapi/nf-mfapi-mftenum2
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1910>
+
+2020-12-26 20:39:07 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/mediafoundation/gstmfh264enc.cpp:
+ * sys/mediafoundation/gstmfh265enc.cpp:
+ * sys/mediafoundation/gstmfvideoenc.cpp:
+ * sys/mediafoundation/gstmfvideoenc.h:
+ * sys/mediafoundation/gstmfvp9enc.cpp:
+ * sys/mediafoundation/plugin.c:
+ mfvideoenc: Remove duplicated class registration code
+ Each codec subclass has the same code for class/element registration,
+ so we can move the code into one helper method, and that will make
+ future enhancements simpler.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1909>

2020-12-10 11:11:04 +0800 Haihao Xiang <haihao.xiang@intel.com>

@@ -940,25 +15195,133 @@
 unexpected, so we should call MFXVideoCORE_QueryPlatform after
 MFXVideoCORE_SetHandle on Linux
 [1] https://github.com/Intel-Media-SDK/MediaSDK/blob/master/doc/mediasdk-man.md#working-with-va-api-applications
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1921>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1867>

-2020-12-28 04:33:11 +0900 Seungha Yang <seungha@centricular.com>
+2020-12-23 21:21:55 +0900 Seungha Yang <seungha@centricular.com>

+ * sys/d3d11/gstd3d11device.c:
+ * sys/d3d11/gstd3d11device.h:
 * sys/d3d11/gstd3d11shader.c:
- d3d11shader: Fix ID3DBlob object leak
- Even if HLSL compiler was able to compile our shader code, D3DCompile()
- might return ID3DBlob object for compile warnings and the object
- should be released.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1922>
+ * sys/d3d11/plugin.c:
+ d3d11: Remove unnecessary helper methods
+ We can query the selected D3D_FEATURE_LEVEL and factory version
+ by using the native D3D11 API
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1906>
+
+2020-11-21 03:20:36 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11desktopdup.cpp:
+ * sys/d3d11/gstd3d11desktopdup.h:
+ * sys/d3d11/gstd3d11desktopdupsrc.c:
+ * sys/d3d11/gstd3d11desktopdupsrc.h:
+ * sys/d3d11/meson.build:
+ * sys/d3d11/plugin.c:
+ d3d11: Re-implement Desktop Duplication source
+ Add a new video source element "d3d11desktopdupsrc" for capturing the desktop image
+ via Desktop Duplication, based on Microsoft's Desktop Duplication sample available at
+ https://github.com/microsoft/Windows-classic-samples/tree/master/Samples/DXGIDesktopDuplication
+ This element is expected to be a replacement for the existing dxgiscreencapsrc
+ element in the winscreencap plugin.
+ Currently this element can support (but dxgiscreencapsrc cannot):
+ - Copying the captured D3D11 texture to the output buffer without download
+ - Desktop session transitions,
+ e.g., it can capture the desktop without error even in cases such as
+ "Lock desktop" and "Permission dialog"
+ - Multiple d3d11desktopdupsrc elements can capture the same monitor
+ Not yet implemented features:
+ - Cropping rect is not implemented, but that can be handled by downstream
+ - Multi-monitor is not supported.
But that can also be implemented by a
+ downstream element, for example via multiple d3d11desktopdup elements
+ with d3d11compositor
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1855>

-2020-12-29 13:15:10 +0200 Sebastian Dröge <sebastian@centricular.com>
+2020-12-22 00:47:09 +0900 Seungha Yang <seungha@centricular.com>

- * ext/assrender/gstassrender.c:
- assrender: Don't try unlocking unlocked mutex
- When flushing right at the beginning of the video chain function or
- when failing negotiation at the top of the function, the assrender mutex
- would be unlocked without being previously locked.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1923>
+ * sys/d3d11/gstd3d11device.c:
+ * sys/d3d11/gstd3d11device.h:
+ * sys/d3d11/gstd3d11utils.c:
+ * sys/d3d11/plugin.c:
+ d3d11device: Add an optional flags argument for creating device
+ Extend the gst_d3d11_device_new() method so that the caller can specify
+ the D3D11_CREATE_DEVICE_FLAG value to use.
+ See https://docs.microsoft.com/en-us/windows/win32/api/d3d11/ne-d3d11-d3d11_create_device_flag
+ for more details about D3D11_CREATE_DEVICE_FLAG
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1901>
+
+2020-12-21 14:06:53 +0530 Raju Babannavar <raju.babannavar@gmail.com>
+
+ * gst/dvbsuboverlay/dvb-sub.c:
+ dvbsuboverlay: Add support for dynamic resolution update.
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1487
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1897>
+
+2020-12-21 02:56:55 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11device.h:
+ d3d11device: Remove dead code
+ We don't use this method since the commit
+ 0788492461e1b559230cc5c3a354fe5f48f95f8b
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1892>
+
+2020-12-20 02:39:40 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11colorconvert.c:
+ * sys/d3d11/gstd3d11compositor.c:
+ * sys/d3d11/gstd3d11decoder.c:
+ * sys/d3d11/gstd3d11download.c:
+ * sys/d3d11/gstd3d11h264dec.c:
+ * sys/d3d11/gstd3d11h265dec.c:
+ * sys/d3d11/gstd3d11memory.c:
+ * sys/d3d11/gstd3d11memory.h:
+ * sys/d3d11/gstd3d11upload.c:
+ * sys/d3d11/gstd3d11utils.c:
+ * sys/d3d11/gstd3d11utils.h:
+ * sys/d3d11/gstd3d11videoprocessor.c:
+ * sys/d3d11/gstd3d11videoprocessor.h:
+ * sys/d3d11/gstd3d11videosink.c:
+ * sys/d3d11/gstd3d11vp8dec.c:
+ * sys/d3d11/gstd3d11vp9dec.c:
+ * sys/d3d11/gstd3d11window.cpp:
+ d3d11: Privatize d3d11memory implementation
+ Hide most of the symbols of the GstD3D11Memory object.
+ GstD3D11Memory is one of the primary resources for the incoming d3d11 library
+ and it's expected to be an extensible feature.
+ Hiding the implementation details will be helpful for later use cases.
+ Summary of this commit:
+ * Now all native Direct3D11 resources are private to GstD3D11Memory.
+ To access native resources, getter methods need to be used
+ or the generic map (e.g., gst_memory_map) API should be called,
+ apart from some exceptional cases such as the d3d11decoder case.
+ * Various helper methods are added for GstBuffer-related operations
+ and in order to remove duplicated code.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1892>
+
+2020-12-20 01:06:24 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.c:
+ * sys/d3d11/gstd3d11decoder.c:
+ * sys/d3d11/gstd3d11utils.c:
+ * sys/d3d11/gstd3d11utils.h:
+ * sys/d3d11/gstd3d11videosink.c:
+ d3d11: Add a helper method for d3d11bufferpool setup
+ Remove duplicated code for d3d11bufferpool setup.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1892>
+
+2020-12-19 00:40:53 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11decoder.c:
+ * sys/d3d11/gstd3d11device.c:
+ * sys/d3d11/gstd3d11device.h:
+ * sys/d3d11/gstd3d11memory.c:
+ * sys/d3d11/gstd3d11overlaycompositor.c:
+ * sys/d3d11/gstd3d11window_corewindow.cpp:
+ * sys/d3d11/gstd3d11window_swapchainpanel.cpp:
+ * sys/d3d11/gstd3d11window_win32.cpp:
+ d3d11device: Remove optional helper methods
+ Most Direct3D11 APIs can be called without the GstD3D11Device
+ abstraction. This is part of the prework for a public GstD3D11 library
+ to introduce minimal APIs
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1892>

2020-12-20 22:12:44 +0900 Seungha Yang <seungha@centricular.com>

@@ -968,7 +15331,170 @@
 overlay window handle especially playbin/playsink scenario since playsink
 will set given overlay handle on videosink once READY state change of
 videosink is ensured.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1905>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1893>
+
+2020-08-19 03:19:26 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/mediafoundation/gstmftransform.cpp:
+ * sys/mediafoundation/gstmftransform.h:
+ * sys/mediafoundation/gstmfvideoenc.cpp:
+ * sys/mediafoundation/gstmfvideoenc.h:
+ mfvideoenc: Improve latency performance for hardware encoder
+ Unlike a software MFT (Media Foundation Transform), which is synchronous
+ in terms of processing input and output data, a hardware MFT works
+ in asynchronous mode: output data might not be available right after
+ we push input data into the MFT.
+ Note that an async MFT will fire two events: one is "METransformNeedInput",
+ which happens when the MFT can accept more input data,
+ and the other is "METransformHaveOutput", which signals
+ that there's pending data which can be outputted immediately.
+ To listen for the events, we can wait synchronously via
+ IMFMediaEventGenerator::GetEvent() or make use of an IMFAsyncCallback
+ object, which is the asynchronous way; the event will then be notified
+ from Media Foundation's internal worker queue thread.
+ To handle such asynchronous operation, the previous working flow was
+ as follows (IMFMediaEventGenerator::GetEvent() was used until now):
+ - Check if there is pending output data and push the data toward downstream.
+ - Pull events (from the streaming thread) until there's at least
+ one pending "METransformNeedInput" event
+ - Then, push one input into the MFT from the streaming thread
+ - Check if there is a pending "METransformHaveOutput" again.
+ If there is, push the new output data to downstream
+ (it's unlikely there is pending output data at this moment)
+ The above flow was processed from the upstream streaming thread. That means
+ even if there's available output data, it could be outputted later,
+ when the next buffer is pushed from the upstream streaming thread.
+ It would introduce at least one frame of latency in the case of a live stream.
+ To reduce such latency, this commit modifies the flow to be fully
+ asynchronous, as hardware MFT was designed to be, and to be able to
+ output encoded data whenever it's available. More specifically, an
+ IMFAsyncCallback object will be used for handling
+ "METransformNeedInput" and "METransformHaveOutput" events from
+ Media Foundation's internal thread, and new output data will
+ also be outputted from the Media Foundation thread.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1520>
+
+2020-12-16 18:32:25 +0200 Sebastian Dröge <sebastian@centricular.com>
+
+ * sys/decklink/gstdecklinkaudiosrc.cpp:
+ decklinkaudiosrc: Fix duration of the first audio frame after each discont
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1886>
+
+2020-12-16 00:28:08 +0530 Biswapriyo Nath <nathbappai@gmail.com>
+
+ * sys/mediafoundation/gstmfdevice.h:
+ mediafoundation: Fix redefinition of variables.
+ Remove the duplicate GstMFDevice and GstMFDeviceProvider declarations.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1884>
+
+2020-12-17 04:41:18 +1100 Jan Schmidt <jan@centricular.com>
+
+ * gst/audiobuffersplit/gstaudiobuffersplit.c:
+ audiobuffersplit: Calculate the correct size for fixed size buffers
+ Fix the output-buffer-size property to do what it says by calculating
+ the correct audio buffer size for that target size, rounded down to
+ the nearest whole number of samples.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1887>
+
+2020-12-10 12:35:07 +0200 Sebastian Dröge <sebastian@centricular.com>
+
+ * sys/decklink/gstdecklinkaudiosrc.cpp:
+ * sys/decklink/gstdecklinkvideosrc.cpp:
+ decklink: Implement GstBaseSrc::get_caps() to return more constrained caps
+ Instead of the template caps we can return a subset of them based on the
+ selected properties.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1868>
+
+2020-10-30 02:21:11 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/wasapi2/gstwasapi2client.cpp:
+ wasapi2: Ensure unmute when opening audio client
+ The ISimpleAudioVolume::SetMute() status seems to be preserved even
+ after the process is terminated. In order to start the audio client in an
+ unmuted state, always disable mute when opening the audio client.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1731>
+
+2020-12-14 16:12:22 +0100 Edward Hervey <edward@centricular.com>
+
+ * gst/mpegtsdemux/mpegtsparse.c:
+ tsparse: Don't use non-object for debugging statement
+ Use the pad instead
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1880>
+
+2020-12-14 10:56:39 +0100 Edward Hervey <edward@centricular.com>
+
+ * tests/examples/mpegts/ts-parser.c:
+ examples/ts-parser: Use the section type for descriptor identification
+ Some descriptors can only be present in some sections
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1880>
+
+2020-12-14 10:56:02 +0100 Edward Hervey <edward@centricular.com>
+
+ * tests/examples/mpegts/ts-parser.c:
+ examples/ts-parser: Try more descriptor/stream types
+ These were added recently
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1880>
+
+2020-12-09 09:14:12 +0100 Edward Hervey <edward@centricular.com>
+
+ * gst/mpegtsdemux/mpegtsbase.c:
+ * gst/mpegtsdemux/mpegtsbase.h:
+ mpegts: Don't add non-padded streams to collection on updates
+ When carrying over existing GstStream to a new GstStreamCollection we need to
+ check whether they *actually* were being used in the previous collection.
+ This avoids adding unknown streams (metadata, PSI, etc...) to the collection on
+ updates.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1880>
+
+2020-11-22 18:48:08 +0100 Edward Hervey <edward@centricular.com>
+
+ * gst-libs/gst/mpegts/gst-dvb-descriptor.h:
+ * gst-libs/gst/mpegts/gst-dvb-section.c:
+ * gst-libs/gst/mpegts/gst-dvb-section.h:
+ * gst-libs/gst/mpegts/gstmpegtssection.c:
+ * gst-libs/gst/mpegts/gstmpegtssection.h:
+ * tests/examples/mpegts/ts-parser.c:
+ mpegts: Add support for SIT sections
+ Selection Information Tables (EN 300 468)
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1852>
+
+2020-12-14 10:50:02 +0100 Edward Hervey <edward@centricular.com>
+
+ * docs/libs/mpegts/index.md:
+ * gst-libs/gst/mpegts/gst-atsc-descriptor.h:
+ * gst-libs/gst/mpegts/gst-atsc-section.c:
+ * gst-libs/gst/mpegts/gst-atsc-section.h:
+ * gst-libs/gst/mpegts/gst-dvb-section.c:
+ * gst-libs/gst/mpegts/gst-dvb-section.h:
+ * gst-libs/gst/mpegts/gst-hdmv-section.h:
+ * gst-libs/gst/mpegts/gst-isdb-descriptor.h:
+ * gst-libs/gst/mpegts/gst-scte-section.c:
+ * gst-libs/gst/mpegts/gst-scte-section.h:
+ * gst-libs/gst/mpegts/gstmpegts-private.h:
+ * gst-libs/gst/mpegts/gstmpegtsdescriptor.c:
+ * gst-libs/gst/mpegts/gstmpegtsdescriptor.h:
+ * gst-libs/gst/mpegts/gstmpegtssection.c:
+ * gst-libs/gst/mpegts/gstmpegtssection.h:
+ * gst-libs/gst/mpegts/meson.build:
+ * gst-libs/gst/mpegts/mpegts.c:
+ * gst-libs/gst/mpegts/mpegts.h:
+ mpegts: Update documentation
+ * Split up into appropriate individual header files
+ * Document more sections and structures
+ * Add a well-known list of registration ids
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1879>
+
+2020-12-10 16:29:31 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * gst-libs/gst/player/gstplayer.c:
+ * gst-libs/gst/transcoder/gsttranscoder.c:
+ player/transcoder: Use bus signal watch
+ Instead of implementing exactly the same thing ourselves but making
+ `GstBus` not know that it is the case.
+ Since we are *sure* that the bus can't have been accessed at the point
+ where we add the watch, we are guaranteed that the current thread's
+ main context is going to be used.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1870>

2020-12-10 15:37:14 +0800 Lim Siew Hoon <siew.hoon.lim@intel.com>

@@ -976,7 +15502,63 @@
 intervideosrc: fix negotiation of interlaced caps
 In 1.0 the field in caps is called "interlace-mode", not "interlaced".
 Fixes #1480
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1878>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1869>
+
+2020-12-11 21:45:25 -0500 Arun Raghavan <arun@asymptotic.io>
+
+ * ext/openaptx/meson.build:
+ * meson_options.txt:
+ openaptx: Drop lib prefix from option name for consistency
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1876>
+
+2020-12-11 08:45:06 +0000 Igor Kovalenko <igor.v.kovalenko@gmail.com>
+
+ * ext/meson.build:
+ * ext/openaptx/gstopenaptxdec.c:
+ * ext/openaptx/gstopenaptxdec.h:
+ * ext/openaptx/gstopenaptxenc.c:
+ * ext/openaptx/gstopenaptxenc.h:
+ * ext/openaptx/meson.build:
+ * ext/openaptx/openaptx-plugin.c:
+ * ext/openaptx/openaptx-plugin.h:
+ * meson_options.txt:
+ openaptx: add aptX and aptX-HD codecs using libopenaptx
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1871>
+
+2020-10-19 14:56:43 +0100 Philippe Normand <philn@igalia.com>
+
+ * ext/wpe/WPEThreadedView.cpp:
+ * ext/wpe/gstwpesrc.cpp:
+ wpe: Emit load-progress messages
+ The estimated-load-progress value can be used on the application side to display a
+ progress bar, for instance.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1710>
+
+2020-12-08 16:46:42 +0200 Vivia Nikolaidou <vivia@ahiru.eu>
+
+ * gst/mpegtsmux/gstbasetsmux.c:
+ basetsmux: Don't send the capsheader if src pad has no caps
+ That means we're shutting down, so there's no point in the streamheader
+ being sent
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1864>
+
+2020-12-04 17:02:00 +1100 Matthew Waters <matthew@centricular.com>
+
+ * gst/rtmp2/rtmp/rtmpclient.c:
+ * gst/rtmp2/rtmp/rtmpconnection.c:
+ * gst/rtmp2/rtmp/rtmpconnection.h:
+ rtmp2/connection: pass the parent cancellable down to the connection
+ Otherwise, when rtmp2src cancels an inflight operation that has a queued
+ message stored, the rtmp connection operation is not stopped.
+ If the cancellation occurs during rtmp connection start up, then
+ rtmp2src does not have any way of accessing the connection object as it
+ has not been returned yet. As a result, rtmp2src will cancel, the
+ connection will still be processing things, and the
+ GMainContext/GMainLoop associated with the outstanding operation will be
+ destroyed. All outstanding operations and the rtmpconnection object will
+ therefore be leaked in this case.
+ Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1425
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1862>

2020-12-07 14:54:28 +0100 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com>

@@ -1000,67 +15582,101 @@
 #9 gst_srt_object_write at gst-plugins-bad/ext/srt/gstsrtobject.c:1598
 #10 gst_srt_sink_render at gst-plugins-bad/ext/srt/gstsrtsink.c:179
 Fixes d2d00e07acc2b1ab1ae5a728ef5dc33c9dee7869.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1863>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1861>

-2020-12-04 17:02:00 +1100 Matthew Waters <matthew@centricular.com>
+2020-11-25 16:24:25 +0200 Sebastian Dröge <sebastian@centricular.com>

- * gst/rtmp2/rtmp/rtmpclient.c:
- * gst/rtmp2/rtmp/rtmpconnection.c:
- * gst/rtmp2/rtmp/rtmpconnection.h:
- rtmp2/connection: pass the parent cancellable down to the connection
- Otherwise, when rtpm2src cancels an inflight operation that has a queued
- message stored, then the rtmp connection operation is not stopped.
- If the cancellation occurs during rtmp connection start up, then
- rtpm2src does not have any way of accessing the connection object as it
- has not been returned yet. As a result, rtpm2src will cancel, the
- connection will still be processing things and the
- GMainContext/GMainLoop associated with the outstanding operation will be
- destroyed. All outstanding operations and the rtmpconnection object will
- therefore be leaked in this case.
- Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1425
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1865>
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/closedcaption/gstccconverter.c:
+ * ext/closedcaption/gstccconverter.h:
+ ccconverter: Add property to specify which sections to include in CDP packets
+ Various software, including ffmpeg's Decklink support, fails to parse CDP
+ packets that contain anything but CC data.
+ Based on this property, timecodes are not written into the CDP packets
+ even if they're present.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1833>

-2020-12-06 23:57:01 +0000 Tim-Philipp Müller <tim@centricular.com>
+2020-11-25 14:54:09 +0200 Sebastian Dröge <sebastian@centricular.com>

- * meson.build:
- Back to development
+ * ext/closedcaption/gstccconverter.c:
+ ccconverter: Refactor code to only retrieve the timecode meta once
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1833>

-=== release 1.18.2 ===
+2020-12-06 18:03:47 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>

-2020-12-06 13:24:10 +0000 Tim-Philipp Müller <tim@centricular.com>
+ * sys/va/gstvadecoder.h:
+ va: decode: fix display type
+ A VADisplay type was used instead of a pointer to GstVaDisplay; on
+ certain platforms they are the same, so the compiler didn't complain.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1860>

- * ChangeLog:
- * NEWS:
- * RELEASE:
- * gst-plugins-bad.doap:
- * meson.build:
- Release 1.18.2
+2020-07-03 12:25:31 +0200 Marc Leeman <m.leeman@televic.com>

-2020-08-25 14:56:50 +0100 Chris Bass <floobleflam@gmail.com>
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/rtp/gstrtpsrc.c:
+ * gst/rtp/gstrtpsrc.h:
+ rtpmanagerbad: allow setting caps on rtpsrc
+ rtpsrc tries to do a lookup of the caps based on the encoding-name. For
+ not-so-standard encodings, the caps can be set, avoiding the lookup.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1406>

- * ext/ttml/ttmlparse.c:
- ttmlparse: Handle whitespace before XML declaration
- When ttmlparse is in, e.g., an MPEG-DASH pipeline, there may be
- whitespace between successive TTML documents in ttmlparse's accumulated
- input. As libxml2 will fail to parse documents that have whitespace
- before the opening XML declaration, ensure that any preceding whitespace
- is not passed to libxml2.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1859>

-2020-08-25 14:54:31 +0100 Chris Bass <floobleflam@gmail.com>
+2020-11-22 04:39:57 +0900 Seungha Yang <seungha@centricular.com>

+ * sys/d3d11/gstd3d11videosink.c:
+ * sys/d3d11/gstd3d11videosinkbin.c:
+ * sys/d3d11/gstd3d11window.cpp:
+ * sys/d3d11/gstd3d11window.h:
+ * sys/d3d11/meson.build:
+ d3d11videosink: Add a property to support rendering statistics data on window
+ Add a new property "render-stats" to allow rendering statistics
+ data on the window for debugging and/or development purposes.
+ Text rendering will be accelerated by the GPU since this implementation
+ uses the Direct2D/DirectWrite API and Direct3D inter-op for minimal overhead.
+ Specifically, text data will be rendered on the swapchain backbuffer
+ directly without any copy/allocation of an extra texture.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1830>

- * ext/ttml/ttmlparse.c:
- ttmlparse: Ensure only single TTML doc parsed
- The parser handles only one TTML file at a time, therefore if there are
- multiple TTML documents in the input, parse only the first.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1859>
+2020-12-04 03:40:17 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11colorconvert.c:
+ * sys/d3d11/gstd3d11window.cpp:
+ d3d11: Protect ID3D11VideoContext with lock
+ Like the d3d11 immediate context (i.e., ID3D11DeviceContext), the
+ ID3D11VideoContext API is not thread safe. It must therefore be protected.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1856>
+
+2020-12-03 17:13:15 +0100 Mathieu Duponchelle <mathieu@centricular.com>
+
+ * docs/meson.build:
+ docs: don't exit the subdir when optional deps aren't found
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1854>
+
+2020-12-02 11:29:08 +0100 Edward Hervey <edward@centricular.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/opencv/gstretinex.cpp:
+ * ext/opencv/gstretinex.h:
+ opencv: Expose retinex parameters
+ Makes the plugin a tad more useful :)
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1845>
+
+2020-10-12 14:12:07 +0300 Marius Vlad <marius.vlad@collabora.com>
+
+ * gst-libs/gst/wayland/meson.build:
+ gst-libs/gst/wayland: Install "unstable" wayland header
+ Context creation and retrieval are required; the symbols are exported
+ but the header was missing. Users most likely define GST_USE_UNSTABLE_API,
+ so they're aware of the implications of using a header that might change
+ between releases.
+ Signed-off-by: Marius Vlad <marius.vlad@collabora.com>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1688>

2020-12-03 14:12:06 +0100 Edward Hervey <edward@centricular.com>

 * ext/hls/gsthlsdemux.c:
 hlsdemux: Use actual object for logging
 i.e. the pad of the stream
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1858>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1853>

2020-12-03 06:55:00 -0500 Arun Raghavan <arun@asymptotic.io>

@@ -1075,30 +15691,14 @@
 curl: Remove incorrect GST_DEBUG_OBJECT() calls
 klass is not a GstObject, and these debug prints
 should likely not be around anyway.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1857>
-
-2020-11-10 14:48:28 +0100 Edward Hervey <edward@centricular.com>
-
- * gst-libs/gst/adaptivedemux/gstadaptivedemux.c:
- adaptivedemux: Don't calculate bitrate for header/index fragments
- They are generally substantially smaller than regular fragments, and therefore
- we end up pushing totally wrong bitrates downstream.
- Fixes erratic buffering issues with DASH introduced by
- 66f5e874352016e29f555e3ce693b23474e476db
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1819>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1851>

-2020-11-09 11:41:10 +0100 Edward Hervey <edward@centricular.com>
+2020-11-25 17:59:54 +0100 Edward Hervey <edward@centricular.com>

- * ext/dash/gstdashdemux.c:
- * gst-libs/gst/adaptivedemux/gstadaptivedemux.c:
- * gst-libs/gst/adaptivedemux/gstadaptivedemux.h:
- adaptivedemux: Store QoS values on the element
- Storing it per-stream requires taking the manifest lock which can apparently be
- held for aeons. And since the QoS event comes from the video rendering thread
- we *really* do not want to do that.
- Storing it as-is in the element is fine, the important part is knowing the
- earliest time downstream.
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1819>
+ * sys/nvcodec/gstcudanvrtc.c:
+ cuda: Fix lowest targeted architecture for CUDA >= 11.0
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1469
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1835>

2020-11-05 13:48:27 +0200 Edward Hervey <edward@centricular.com>

@@ -1112,7 +15712,7 @@
 Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1419
 And properly forward PTS/DTS for program pads (which wasn't the case before)
 Original patch by Vivia Nikolaidou <vivia@ahiru.eu>
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1849>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1769>

2020-12-02 09:39:45 +0200 Sebastian Dröge <sebastian@centricular.com>

@@ -1120,14 +15720,161 @@
 adaptivedemux: Don't log with non-GObject objects
 Instead of using the streams, log with the pad of the streams.
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1457
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1847>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1844>

-2020-12-02 15:08:44 +0000 Nicolas Dufresne <nicolas@ndufresne.ca>
+2020-11-20 11:29:46 -0300 Thibault Saunier <tsaunier@igalia.com>

- * sys/v4l2codecs/gstv4l2decoder.c:
- Revert "v4l2codecs: decoder: Unmark previously pending request"
- This reverts commit a3e6d9fc24098fc27fa3fb10c4d189fa61e67500
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1848>
+ * gst/transcode/gsttranscodebin.c:
+ * tools/gst-transcoder.c:
+ transcodebin: Minor error message enhancement
+
+2020-11-19 22:56:46 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * gst/transcode/gsttranscodebin.c:
+ transcodebin: Unlock while setting decodebin caps
+ Otherwise it will deadlock while recursing up to notify parent object property changes
+
+2020-11-19 18:31:34 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * gst/transcode/gsttranscodebin.c:
+ transcodebin: Avoid plugin converter if filter handles ANY caps
+ For example, identity or clocksync or these kinds of elements can be
+ used with any data flow, and we should not enforce decoding to raw in
+ that case.
+
+2020-11-19 18:39:33 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * gst/transcode/gsttranscodebin.c:
+ transcodebin: Add filter as soon as it is set
+ Instead of waiting, so that we can simply use a clocksync element as
+ filter; otherwise we won't know the pipeline is live, as it won't
+ return NO_PREROLL as one would expect in that case.
+ Adding it right away shouldn't create any issue, both ways are fine.
+
+2020-11-19 18:29:15 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/transcode/gsturitranscodebin.c:
+ uritranscodebin: Add `setup-source` and `element-setup` signals
+ The same way as playbinX does it, as it is often quite useful
+
+2020-11-19 17:55:10 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * gst/transcode/gsttranscodebin.c:
+ * gst/transcode/gsturitranscodebin.c:
+ transcode: Port to encodebin2
+ This allows supporting muxing sinks like hlssink2 or splitmux
+
+2020-11-19 17:55:10 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * gst-libs/gst/transcoder/gsttranscoder.c:
+ transcoder: Handle the case where several errors are posted
+ There were cases where the loop was already destroyed when we were
+ receiving the following message.
+
+2020-11-19 17:54:28 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * gst-libs/gst/transcoder/gsttranscoder.c:
+ transcoder: Minor refactoring to output better debug logs
+
+2020-11-19 17:51:56 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * docs/plugins/gst_plugins_cache.json:
+ * ext/hls/gsthlssink2.c:
+ hlssink2: Mark as Muxer
+ That way it is usable by encodebin2. This is what splitmux does already.
+
+2020-11-30 17:12:14 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvadecoder.c:
+ va: decoder: Picture dups only holds GstBuffer
+ Also removes the warning log message when destroying buffers in the
+ picture's free()
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1841>
+
+2020-11-30 15:01:01 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvah265dec.c:
+ * sys/va/gstvavp8dec.c:
+ * sys/va/gstvavp9dec.c:
+ va: Remove gst_va_decoder_destroy_buffers()
+ Since GstVaDecodePicture is destroyed completely with its free() function and
+ it's used as destroy notify by codecs picture, there's no need to call
+ gst_va_decoder_destroy_buffers() externally, since the codecs base classes
+ destroy the codec picture when it's required.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1841>
+
+2020-11-26 14:04:31 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvadecoder.c:
+ * sys/va/gstvadecoder.h:
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvah265dec.c:
+ * sys/va/gstvavp8dec.c:
+ * sys/va/gstvavp9dec.c:
+ va: Destroy picture unreleased buffers when finalize.
+ The current way of GstVaDecodePicture's finalize will leak some
+ resources such as parameter buffers and slice data.
+ The current way deliberately leaves this resource-releasing logic
+ to va decoder related functions and triggers a warning if we free the
+ GstVaDecodePicture without releasing these resources.
+ But in practice, sometimes, you do not have the chance to release
+ these resources before the picture is freed. For example, H264/Mpeg2
+ support multi-slice NALs/packets for one frame. It is possible that
+ we already succeeded in parsing and generating the first several slices'
+ data via _decode_slice(), but then we get a wrong slice NAL/packet
+ and fail to parse it.
We decide to discard the whole frame in the
+ decoder's base class; it just frees the current picture and does not
+ trigger the subclass's function again. In this kind of case, we do
+ not have the chance to clean up the resources, and they will
+ be leaked.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1841>
+
+2020-11-21 19:00:02 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/qroverlay/gstbaseqroverlay.c:
+ * ext/qroverlay/gstbaseqroverlay.h:
+ * ext/qroverlay/gstdebugqroverlay.c:
+ * ext/qroverlay/gstqroverlay.c:
+ qroverlay: Reuse the same OverlayComposition object when possible
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1829>
+
+2020-11-20 11:28:25 -0300 Thibault Saunier <tsaunier@igalia.com>
+
+ * ext/qroverlay/gstbaseqroverlay.c:
+ * ext/qroverlay/gstbaseqroverlay.h:
+ * ext/qroverlay/gstdebugqroverlay.c:
+ * ext/qroverlay/gstqroverlay.c:
+ qroverlay: Rework basing it on overlaycomposition
+ The base class is now a bin which wraps the `overlaycomposition`
+ element and implements the `draw` signal.
+ This way we support all the video formats the GstVideoOverlayComposition
+ API supports, and the blending code can be reused. It is also possible
+ to have the blending happen in the sinks now.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1829>
+
+2020-11-26 05:55:29 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11h264dec.c:
+ d3d11h264dec: Reconfigure decoder object on DPB size change
+ Even if the resolution and/or bitdepth is not updated, the required
+ DPB size can change per SPS update, and it could be even
+ larger than the previously configured size of the DPB. If so, we need
+ to reconfigure the DPB d3d11 texture pool again.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1839> + +2020-11-25 17:52:42 +0100 Marijn Suijten <marijns95@gmail.com> + + * gst/audiobuffersplit/gstaudiobuffersplit.c: + * gst/inter/gstinteraudiosrc.c: + audio: Use new AudioFormatInfo::fill_silence function + The function is renamed to be properly associated with AudioFormatInfo + (its instance) instead of AudioFormat (an unrelated enum), see [1] for + the rename itself. + [1]: https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/940 2020-11-05 17:14:22 +0000 Philippe Normand <philn@igalia.com> @@ -1135,26 +15882,640 @@ player: Fix get_current_subtitle_track annotation As the info returned is a new object, the annotation should be transfer-full, similarly to the get_current_{audio,video}_track() implementations. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1834> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1775> -2020-11-11 18:21:25 +0900 Seungha Yang <seungha@centricular.com> +2020-11-23 20:44:27 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * gst/mpegdemux/gstmpegdemux.c: - mpegdemux: Set duration on seeking query if possible - Set duration on seeking query in the same way as duration query handler. - Otherwise application might get confused as if the duration is unknown. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1831> + * sys/va/gstvaallocator.c: + va: allocator: add a memory pool object helper + Since both allocators use a memory pool, with its mutex and cond, this patch + refactors it into a single internal object, implementing a generic GstMemory + pool. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1815> -2020-11-09 18:27:14 +0100 Edward Hervey <edward@centricular.com> +2020-11-17 14:53:05 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * gst/mpegtsdemux/mpegtspacketizer.c: - mpegtspacketizer: Handle PCR issues with adaptive streams - A lot of content producers out there targetting "adaptive streaming" are riddled - with non-compliant PCR streams (essentially all the players out there just use - PTS/DTS and don't care about the PCR). - In order to gracefully cope with these, we detect them appropriately and any - small (< 15s) PCR resets get gracefully ignored. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1787> + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvapool.c: + va: pool, allocator: honor GST_BUFFER_POOL_ACQUIRE_FLAG_DONTWAIT + In order to honor GST_BUFFER_POOL_ACQUIRE_FLAG_DONTWAIT in VA pool, allocators' + wait_for_memory() has to be decoupled from their prepare_buffer() so it could be + called in pools' acquire_buffer() if the flag is not set. + wait_for_memory() functions are blocking so the received memories are assigned + to the first requested buffer in case of multithreaded calls. For this a new mutex was + added. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1815> + +2020-11-17 13:18:37 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvapool.c: + va: allocator: broadcast when flushing + This patch handles the case when the bufferpool requests a new buffer while + flushing. + Also fixes the usage of g_cond_wait(), which demands to be used + inside a loop to avoid spurious wakeups. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1815> + +2020-11-17 13:17:03 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: free allocator when a mem is held + An application, using for example appsink, can hold buffers from any + va allocator after setting the pipeline to NULL. We need to destroy + the allocator when that memory is unrefed. + This patch juggles a bit with the allocator reference count in + memories in order to achieve this: + 1. When memory is created no alloc ref is modified + 2. When memory is released, alloc ref is decreased + 3. When memory is reassigned to a buffer, alloc ref is increased + 4. When memory is flushed, alloc ref is increased because it is going + to be decreased in gst_memory_unref() + Also this patch moves the deallocation of member variables to + finalize() rather than dispose() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1815> + +2020-11-23 17:01:52 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: dmabuf: initialize cond + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1815> + +2020-11-20 17:32:44 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcstats.c: + * ext/webrtc/transportstream.c: + * ext/webrtc/transportstream.h: + webrtc: Make ssrc map into separate data structures + They now contain a weak reference and that could be freed later + causing strange crashes as GWeakRef are not movable. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-15 21:23:08 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Get the remote-inbound stats from the right RTPSource + This also means that we need to get the clock-rate from the codec instead + of from the RTPSource, as the remote one doesn't include a clock rate. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-15 19:36:45 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + * ext/webrtc/gstwebrtcstats.c: + * ext/webrtc/gstwebrtcstats.h: + webrtcbin: Implement getting stats for a specific pad + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-10 18:21:19 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Also return the raw rtpsource stats for more information + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-09 20:59:58 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Avoid copy of GstStructure + Instead transfer the ownership to the new structure + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-09 20:45:10 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Remove receiver side when sending + Those are just invalid and just reflect what we sent. We'd need to parse the + RTCP XR packets from the other side to know more about those. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-09 20:27:40 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Extract statistics from the rtpjitterbuffer + And expose them as standardised webrtc statistics + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-09 18:45:57 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/transportstream.c: + * ext/webrtc/transportstream.h: + webrtcbin: Store the rtpjitterbuffer instances to extract stats from them + Store them as weak refs to avoid having to worry about freeing later and because + the new-jitterbuffer is on a different thread + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-09 19:59:18 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Document all RTP missing fields according to the latest spec + Just document all the missing fields and document which ones will never + be implemented because they depend on the codec or depayloader + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-09 19:38:15 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: RTCP computed RTT is only available at sender + The receiver doesn't have the information to compute it. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-10-08 17:11:30 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcstats.c: + webrtcstats: Remove redundant lines + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1766> + +2020-11-04 17:06:02 -0500 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpreceiver.c: + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + webrtc: Also remove rtcp_transport from the structure + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1765> + +2020-11-02 19:55:46 -0500 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpreceiver.c: + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + webrtc: Remove APIs to set transport on sender/receiver + They're never used. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1765> + +2020-11-02 19:49:55 -0500 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/nicetransport.c: + * ext/webrtc/transportreceivebin.c: + * ext/webrtc/transportsendbin.c: + * ext/webrtc/transportsendbin.h: + * ext/webrtc/transportstream.c: + * ext/webrtc/transportstream.h: + * ext/webrtc/webrtctransceiver.c: + * gst-libs/gst/webrtc/dtlstransport.c: + * gst-libs/gst/webrtc/dtlstransport.h: + * gst-libs/gst/webrtc/rtpreceiver.c: + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + webrtc: Remove non rtcp-mux code + RTCP mux is now always required by the WebRTC spec + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1765> + +2020-11-20 15:01:03 +0000 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstnvdec.c: + nvcodec: Assume 25fps if framerate is invalid when calculating latency + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1826> + +2020-11-20 22:26:14 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: fix memory leak + gst_h264_dbp_get_picture_all() returns a full transfer of the GArray, which + needs be unrefed. But it is not unrefed in + gst_h264_decoder_find_first_field_picture() leaking it. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1827> + +2020-11-20 16:07:36 +0100 Edward Hervey <edward@centricular.com> + + * gst-libs/gst/mpegts/gst-dvb-descriptor.c: + * gst-libs/gst/mpegts/gst-dvb-descriptor.h: + * gst-libs/gst/mpegts/gstmpegtsdescriptor.c: + * gst-libs/gst/mpegts/gstmpegtsdescriptor.h: + mpegts: Documentation fixes + gtk-doc was complaining :) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1825> + +2020-11-20 13:24:24 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ext/qroverlay/gstdebugqroverlay.c: + * ext/qroverlay/gstdebugqroverlay.h: + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + qroverlay: unset executable flag on source files + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1824> + +2020-11-20 13:22:48 +0000 Tim-Philipp Müller <tim@centricular.com> + + * ext/qroverlay/meson.build: + qroverlay: fix auto detection of json-glib for plugin + Only want to check for json-glib when libqrencode was found, + but also it shouldn't be required but depend on the option. + Fixes #1465 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1824> + +2020-11-19 21:15:25 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconverter.c: + * sys/d3d11/gstd3d11device.c: + * sys/d3d11/gstd3d11format.c: + * sys/d3d11/gstd3d11format.h: + * sys/d3d11/gstd3d11memory.c: + * tests/check/elements/d3d11colorconvert.c: + d3d11: Add support for packed 4:2:2 and 4:4:4 10bits formats + Add support for Y210 and Y410 formats which are commonly used format + for en/decoders on Windows. Note that those formats cannot be used for + render target (output) of shader. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1821> + +2020-10-02 18:47:16 -0400 Olivier Crête <olivier.crete@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/openh264/gstopenh264dec.cpp: + openh264dec: Accept constrained-high and progressive-high profiles + They're just subsets of the high profile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1634> + +2020-10-02 18:47:06 -0400 Olivier Crête <olivier.crete@collabora.com> + + * sys/d3d11/gstd3d11h264dec.c: + d3d11h264dec: Accept constrained-high and progressive-high profiles + They're just subsets of the high profile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1634> + +2020-10-02 18:46:56 -0400 Olivier Crête <olivier.crete@collabora.com> + + * sys/msdk/gstmsdkh264dec.c: + msdkh264dec: Accept constrained-high and progressive-high profiles + They're just subsets of the high profile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1634> + +2020-09-22 15:42:37 -0400 Olivier Crête <olivier.crete@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvh264dec.c: + nvdec: Accept progressive-high and constrained-high profiles + They're subsets of the high profiles with no interlacing and + no B-frames for constrained + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1634> + +2020-09-28 13:33:00 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: add the set_operating_point() API. + AV1 can support multiple layers when scalability is enabled. We + need an API to set the operating point and filter the OBUs just + belonging to some layers (the layers are specified by the operating + point). 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-10-09 16:13:28 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: Add an API to reset the annex_b state only. + In practice, we encounter streams that have one or more broken temporal + units. When that kind of broken temporal unit is in annex b format, the + whole temporal unit should be discarded. + But the temporal units before it are correct and can be used. More + importantly, because of the broken temporal unit, the parser is in a wrong + state and all later temporal units are also parsed incorrectly. + We need to add this API to reset the annex_b state only when we meet + some temporal unit error. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-10-09 16:01:35 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: clean the seen_frame_header in parse_tile_group(). + The current seen_frame_header is not cleaned correctly. According + to the spec, it should be cleaned when tiles are parsed completely. + Also delete a redundant seen_frame_header init in reset_state(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-09-29 13:15:37 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: fix a typo in parse_metadata_scalability + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-09-28 18:22:08 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: Do not assert in identify_one_obu when checking annex b size. + Some buggy streams just write the wrong temporal unit and frame size in + the stream. We should return failure rather than assert to abort. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-09-22 19:16:30 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: Add unknown AV1 profile define for sanity check. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-07-24 14:54:37 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: Improve the parse_tile_info. + 1. store more tile info when parsing tile group. + The column, row, tile offset and tile data size are all useful for + decoder process, especially for HW kind decoder such as VAAPI dec. + Also fix the tile group skip size for each tile data. + 2. No min_inner_tile_width requirement in newest spec. + 3. Calculate the sbs of each tile for both uniform tile and non-uniform + tile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-07-28 17:25:44 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: Fix a tile info read typo in frame header. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-25 19:44:48 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: Fix a typo when getting value of segmentation params. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-25 16:33:26 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: add valid check for global motion params. + The global motion params and their matrix values need to be verified + before we use them. If they are invalid, we should notify the decoder + that they should not be used. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-25 15:25:56 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: uint8 range is not enough for av1_bitstreamfn_ns + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-25 15:25:06 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: delete duplicated GST_AV1_GM_ABS_ALPHA_BITS define. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-27 21:33:14 +0800 He Junyan <junyan.he@intel.com> + + * tests/check/libs/av1parser.c: + test: av1parser: update the test result because of bug fixing. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-24 15:29:56 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: Improve the loop filter setting. + 1. loop_filter_ref_deltas should be int because it needs to compare + with 0. + 2. Move the loop filter init logic to setup_past_independence() and + load_previous(), which makes it more precise with regard to the spec. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-08-14 14:40:49 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: Fix an error report for metadata obu. + The metadata OBUs, for example, ITUT_T35 has an undefined payload such + as itu_t_t35_payload_bytes field in AV1 spec, which may cause the failure + of parsing the trailing bits. We can give a warning and ignore this kind + of error. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-07-28 15:06:04 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + codecparsers: av1: Fix a level index bug in sequence. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-07-24 12:49:10 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstav1parser.c: + * gst-libs/gst/codecparsers/gstav1parser.h: + codecparsers: av1: all ref idx should be gint8. + All the ref index need to compare with 0 in reference index decision + algorithm. We also need to init them to -1. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1464> + +2020-11-14 18:48:05 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvah264dec.c: + va: h264dec: Add support for interlaced stream + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-16 16:29:04 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: implement gst_va_h264_dec_new_field_picture() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-14 20:46:30 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvah264dec.c: + va: h264dec: Fix picture_height_in_mbs_minus1 + Fix for interlaced stream (when sps->frame_mbs_only_flag is equal to one) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-16 16:29:46 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: admit baseline if stream obeys A.2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-15 00:20:54 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264decoder.h: + codecs: h264decoder: Add 
support for field ref picture list modification + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-17 18:39:56 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + * sys/va/gstvah264dec.c: + codecs: h264decoder: Add more option arguments for reference picture getter + In case that "pic_order_cnt_type" is equal to zero, ref picture + list for B slice should not include non-existing picture + as per spec 8.2.4.2.3. And, the second field is not needed + for the process of frame picture reference list construction + since it needs to be frame unit, not field picture in that case. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-17 18:59:35 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264decoder: Split gap picture as well if needed + field pair pictures might be required for reference list + depending on context. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-05 18:09:06 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + h264dec: Fix POC calculation for type 0 + This is mostly for future use as it only fixes the calculation for interlaced + cases, the frame case seemed correct already. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-17 03:11:46 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Don't try to construct RefPicList0 and RefPicList1 if not required + We were trying to construct reference picture list even for + I slice before this commit. Reference list is required only for + P, SP and B slices. 
Also, if all existing reference pictures + are gap pictures, we don't need to construct lists. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1812> + +2020-11-03 01:59:46 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvapool.c: + va: pool: Check the force_videometa for all memory types. + force_videometa should mean that the buffer must use video meta to + map correctly. When the stride or the offset of the alloc_info is + different from the src caps, the downstream must use video meta. + So this flag should not link with the RAW caps only. All kinds of + caps(memory:VAMemory, memory:DMABuf) should have this flag. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1711> + +2020-11-17 00:18:22 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + va: basedec: Improve the decide_allocation(). + In decide_allocation(), we now just use the other_pool for frames + copy when the src caps is raw. This can make the logic a little + clear. There is no need for us to check the alignment and video + meta again. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1711> + +2020-11-16 23:53:39 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + va: basedec: fallback to system memory if downstream caps is any. + When the downstream element reports an ANY caps, and it also fails to + support VideoMeta, we should fallback to the system memory. + Note: the basetransform kind elements never return valid allocation + query before set_caps(). So, if a basetransform return an ANY sink + caps, we always fallback to system memory for it. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1711> + +2020-11-16 04:38:28 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvah264dec.c: + vah264dec: Fix for long term reference picture signalling + Allocate a GArray which is used to fill + VAPictureParameterBufferH264.ReferenceFrames (called per frame), + instead of alloc/free per frame. + Also this commit is to fix the condition where long-term reference + picture is needed for VAPictureParameterBufferH264.ReferenceFrames + entry. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1813> + +2020-11-15 03:41:27 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264decoder: Fix MMCO type 1 for interlaced stream + If field_pic_flag of current picture is equal to zero, + both fields of the reference field pair should be marked as + "unused for reference" + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1810> + +2020-11-15 02:59:24 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264decoder: Fix MMCO type 3 for interlaced stream + Depending on short-ref picture corresponding to picNumX value, + there's a condition that only one field should be updated to + be non-reference picture. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1810> + +2020-11-15 00:55:09 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Split frame picture into field pictures if needed + In case of interlaced stream, frame pictures need to be split + into fields for the reference marking process. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1810> + +2020-11-16 00:27:28 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + codecs: h264decoder: Add util macro for frame/field picture identification + Add a macro to check whether given GstH264Picture is for frame or field + decoding. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1810> + +2020-11-16 20:44:06 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11window.cpp: + d3d11window: Prefer full color range for display target colorspace + We don't need to preserve input color range for transformed target + color space. Also some GPUs don't seem to be happy with 16-235 + color range for RGB color space. + Also, since our default display target color space is + DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709, choosing full color range + would make more sense. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1814> + +2020-08-15 02:02:44 +1000 Jan Schmidt <jan@centricular.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/WPEThreadedView.h: + wpe: Don't crash when running on X11. + Don't assume the available EGL display is a wayland display - + instead, check that the GStreamer GL context is EGL, and then + use gst_gl_display_egl_from_gl_display to create a + GstGLDisplayEGL from that, which also adds refcounting + around the underlying EGLDisplay. 
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1385 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1752> + +2020-11-13 20:25:36 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: dmabuf: log unknown surface format + It is possible that surface format is not assigned, keeping its default + GStreamer value: unknown, but gst_video_format_to_string() doesn't print + unknown format, so this patch does it manually. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1811> + +2020-11-13 20:20:47 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: dmabuf: destroy VASurface if no pooled buffer + When gst_va_dmabuf_allocator_setup_buffer_full() receives info (not NULL) it is + supposed that this buffer is not part of the allocator pool, so it has to be + de-allocated as soon it is freed. + This patch sets the destroy notify of the assigned GstVaBufferSurface if info is + not NULL. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1811> + +2020-11-14 03:20:19 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvah264dec.c: + vah264dec: Allow missing reference picture + baseclass might provide reference picture list with null picture. + Ensure picture before filling picture information. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1809> + +2020-11-14 03:16:07 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Don't give up to decode due to missing reference picture + Missing reference picture is very common thing for broken/malformed stream. + Decoder should be able to keep decoding if it's not a very critical error. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1809> 2020-11-13 17:50:03 +0100 Edward Hervey <edward@centricular.com> @@ -1162,7 +16523,200 @@ mpegtsdemux: Fix off by one error Turns out timestamps of zero are valid :) Fixes issues with streams where the PTS/DTS would be equal to the first PCR. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1820> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1807> + +2020-11-06 02:45:21 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11h264dec: Add support for interlaced stream + Add support for interlaced stream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-10 01:28:03 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + codecs: h264decoder: Add support for interlaced stream + Initial support for interlaced stream. Subclass should implement + new_field_picture() vfunc. Otherwise, baseclass will assume that + subclass doesn't support interlaced stream. + Restrictions: + * Reference picture modification process for interlaced stream + is not implemented yet + * PAFF (Picture Adaptive Frame Field) is not properly implemented. 
+ * Field display ordering (e.g., top-field-first) decision should + be enhanced via picture timing SEI parsing + * Gaps in field pictures should be handled + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-05 04:16:54 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + codecs: h264decoder: Rename DPB methods + Clarify whether it's for a picture (field) or a frame in order to + support interlaced streams, because the DPB size is in frame units, not pictures, + in case of interlaced streams. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-05 03:47:35 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264decoder.h: + codecs: h264decoder: Remove interlaced stream related constraints + ... and add a new_field_picture() vfunc so that subclasses can ensure + interlaced decoding support. + The method will be used later with interlaced stream support. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-12 23:49:01 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.h: + codecs: h264decoder: Move to inline GstH264DecoderClass documentation + Don't duplicate documentation for class vfuncs. Hotdoc doesn't seem + to be happy with duplicated documentation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-06 01:45:36 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + * sys/d3d11/gstd3d11h264dec.c: + * sys/nvcodec/gstnvh264dec.c: + * sys/v4l2codecs/gstv4l2codech264dec.c: + * sys/va/gstvah264dec.c: + codecs: h264decoder: Store reference picture type using enum value + Managing reference picture type by using two variables + (ref and long_term) is redundant, and it can be + represented by using a single enum value. + This is to sync this implementation with gstreamer-vaapi so as to + make comparison between this and gstreamer-vaapi easier, and also + to minimize the changes required for subclasses to be able + to support interlaced streams. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-11 01:56:52 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264decoder: Minor documentation fix + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534> + +2020-11-13 23:18:20 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Remove DPB size related spammy debug message + It's not informative at all if the SPS wasn't updated. Also, we are already + printing a DPB size related debug message in another place. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1806> + +2020-11-12 22:27:08 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/videoparsers/gsth264parse.c: + h264parse: try harder to update timecode + NumClockTS is the maximum number of timecodes the pic_timing SEI + can carry, but it is perfectly OK for it to carry fewer, and have + one of the clock_timestamp_flags set to 0. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1804> + +2020-11-12 22:32:00 +0100 Mathieu Duponchelle <mathieu@centricular.com> + + * gst/videoparsers/gsth264parse.c: + h264parse: fix installing of update-timecode property + Simply fixes a typo that did not have any adverse effect, + and avoids hardcoding the initializer + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1805> + +2020-11-12 19:43:22 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Don't fill gap picture if it's not allowed + We should fill gap pictures only if sps->gaps_in_frame_num_value_allowed_flag + is set. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1801> + +2020-04-16 10:06:29 -0400 Aaron Boxer <aaron.boxer@collabora.com> + + * ext/openjpeg/gstopenjpegenc.c: + openjpegenc: store stripe offset when encoding image + The decoder can simply read this offset after decoding + to know where to blit the stripe to the full frame + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1800> + +2020-03-24 09:15:30 -0400 Aaron Boxer <aaron.boxer@collabora.com> + + * ext/openjpeg/gstopenjpegenc.c: + * ext/openjpeg/meson.build: + openjpegenc: take subsampling into account when calculating stripe height + We calculate the minimum of (stripe height * sub-sampling) across all components + to ensure that all component dimensions are consistent with sub-sampling. + The last stripe for each component is simply the remaining height. 
+ limit wavelet resolutions for "thin" stripes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1800> + +2020-03-12 13:41:40 +0100 Stéphane Cerveau <scerveau@collabora.com> + + * ext/openjpeg/gstopenjpegenc.c: + openjpegenc: fix memory leak from mstream + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1800> + +2020-01-13 14:00:38 -0500 Aaron Boxer <aaron.boxer@collabora.com> + + * ext/openjpeg/gstopenjpegenc.c: + openjpegenc: fail negotiation in handle_frame if alignment mismatch + If the encoder is in stripe mode, then downstream must also support stripes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1800> + +2020-11-12 21:46:59 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvdecoder.h: + * sys/nvcodec/gstnvh264dec.c: + * sys/nvcodec/gstnvh265dec.c: + * sys/nvcodec/gstnvvp8dec.c: + * sys/nvcodec/gstnvvp9dec.c: + nvcodec: Fix various typos + Not sure where the DECOCER came from + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1803> + +2020-11-12 13:33:26 +0100 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvabasedec.c: + * sys/va/gstvapool.c: + va: comments to explain code + There are a couple of parts where the code seems, at least to me, a bit obscure or + confusing. So let's add an explanation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1802> + +2020-11-10 14:48:28 +0100 Edward Hervey <edward@centricular.com> + + * gst-libs/gst/adaptivedemux/gstadaptivedemux.c: + adaptivedemux: Don't calculate bitrate for header/index fragments + They are generally substantially smaller than regular fragments, and therefore + we end up pushing totally wrong bitrates downstream. 
+ Fixes erratic buffering issues with DASH introduced by + 66f5e874352016e29f555e3ce693b23474e476db + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1021> + +2020-11-09 11:41:10 +0100 Edward Hervey <edward@centricular.com> + + * ext/dash/gstdashdemux.c: + * gst-libs/gst/adaptivedemux/gstadaptivedemux.c: + * gst-libs/gst/adaptivedemux/gstadaptivedemux.h: + adaptivedemux: Store QoS values on the element + Storing it per-stream requires taking the manifest lock which can apparently be + held for aeons. And since the QoS event comes from the video rendering thread + we *really* do not want to do that. + Storing it as-is in the element is fine, the important part is knowing the + earliest time downstream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1021> 2020-11-10 14:48:28 +0100 Edward Hervey <edward@centricular.com> @@ -1172,7 +16726,7 @@ we end up pushing totally wrong bitrates downstream. Fixes erratic buffering issues with DASH introduced by 66f5e874352016e29f555e3ce693b23474e476db - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1818> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1786> 2020-11-11 18:07:57 +0100 Edward Hervey <edward@centricular.com> @@ -1183,20 +16737,338 @@ In order to avoid such issues, keep a reference to the old variant until we're sure we don't need it anymore Fixes cases of double-free on variants and its contents - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1817> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1799> + +2020-11-12 00:42:59 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + * gst-libs/gst/codecs/gstvp8decoder.h: + codecs: vp8decoder: Fix two typos in struct names. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1797> + +2020-10-27 19:53:44 +0530 Sanchayan Maity <sanchayan@asymptotic.io> + + * sys/bluez/gsta2dpsink.c: + gsta2dpsink: Fix GstPad leak + The sinkpad returned by a call to gst_element_get_static_pad needs to be + unrefed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1621> + +2020-09-30 17:12:04 +0530 Arun Raghavan <arun@asymptotic.io> + + * docs/plugins/gst_plugins_cache.json: + * sys/bluez/gsta2dpsink.c: + bluez: a2dpsink: Add support for LDAC to a2dpsink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1621> + +2020-09-30 13:28:08 +0530 Arun Raghavan <arun@asymptotic.io> + + * docs/plugins/gst_plugins_cache.json: + * sys/bluez/a2dp-codecs.h: + * sys/bluez/gstavdtpsink.c: + * sys/bluez/gstavdtputil.c: + bluez: avdtpsink: Add support for LDAC to avdtpsink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1621> + +2020-09-18 17:35:24 +0530 Sanchayan Maity <sanchayan@asymptotic.io> + + * ext/ldac/gstldacenc.c: + * ext/ldac/gstldacenc.h: + * ext/ldac/ldac-plugin.c: + * ext/ldac/meson.build: + * ext/meson.build: + * meson_options.txt: + ext: Add LDAC encoder + LDAC is an audio coding technology developed by Sony that enables the + transmission of High-Resolution (Hi-Res) audio contents over Bluetooth. + Currently Adaptive Bit Rate (ABR) as supported by libldac encoder is not + implemented. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1621> + +2020-11-11 18:21:25 +0900 Seungha Yang <seungha@centricular.com> + + * gst/mpegdemux/gstmpegdemux.c: + mpegdemux: Set duration on seeking query if possible + Set duration on seeking query in the same way as duration query handler. + Otherwise application might get confused as if the duration is unknown. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1791> + +2020-11-11 13:39:37 +0200 Raul Tambre <raul@tambre.ee> + + * ext/webrtc/meson.build: + webrtc: Update libnice version requirement to 0.1.17 + Since !1366 nice_agent_get_sockets() is used, which requires 0.1.17. + Update the version requirement accordingly. + Fixes #1459. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1792> + +2020-11-03 17:48:02 +0100 Edward Hervey <edward@centricular.com> + + * ext/hls/gsthlsdemux.c: + * ext/hls/gsthlsdemux.h: + hlsdemux: Re-use streams if possible + When switching variants, try to re-use existing streams/pads instead of creating + new ones. When dealing with urisourcebin and decodebin3 this is not only the + expected way but also avoids a lot of buffering/hang issues. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1757> + +2020-11-04 10:36:21 +0100 Edward Hervey <edward@centricular.com> + + * ext/hls/m3u8.c: + * ext/hls/m3u8.h: + m3u8: Make a debug function usable elsewhere + The rest of the code might want to use this + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1757> + +2020-07-12 00:18:38 -0400 Thibault Saunier <tsaunier@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * ext/qroverlay/gstdebugqroverlay.c: + qroverlay: Generate documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-12 00:03:04 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstdebugqroverlay.c: + * ext/qroverlay/gstdebugqroverlay.h: + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + * ext/qroverlay/meson.build: + qroverlay: Add a qroverlay element that allows overlaying any data + This moves `gstqroverlay.c` to `gstdebugqroverlay.c` and implements + a simple `gstqroverlay` element. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-11 23:43:01 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + qroverlay: Rename qroverlay to debugqroverlay + The element is specially focus on debugging purposes and not a generique QR overlay + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-11 23:36:03 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + * ext/qroverlay/meson.build: + qroverlay: Factor out qroverlay logic to a base class + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-11 23:35:55 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstbaseqroverlay.c: + * ext/qroverlay/gstbaseqroverlay.h: + qroverlay: Factor out qroverlay logic to a base class + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-11 23:06:16 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + qroverlay: Make subclassable + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-11 20:42:51 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + * ext/qroverlay/meson.build: + qroverlay: Port to VideoFilter + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-11 15:04:57 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + qroverlay: Make default pizel-size 3 + Otherwise zbar isn't able to read the produced qrcodes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-09 14:14:45 -0400 Thibault Saunier <tsaunier@igalia.com> + + * 
ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + * ext/qroverlay/meson.build: + qroverlay: Cleanup the way we build the json using json-glib + And reindent the .h file removing tabs + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-09 13:05:20 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + qroverlay: Fix copyright + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-09 12:51:23 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + qroverlay: Fix some warnings + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-09 12:49:51 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + qroverlay: Minor renaming and documentation fixes + Matching usual namings + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-07-09 12:37:55 -0400 Thibault Saunier <tsaunier@igalia.com> + + * ext/meson.build: + * ext/qroverlay/gstqroverlay.c: + * ext/qroverlay/gstqroverlay.h: + * ext/qroverlay/meson.build: + * meson_options.txt: + qroverlay: Import from gst-qroverlay + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1730> + +2020-10-30 23:22:01 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvvp9dec.c: + * sys/nvcodec/gstnvvp9dec.h: + * sys/nvcodec/meson.build: + * sys/nvcodec/plugin.c: + nvcodec: Add VP9 stateless decoder element + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1738> + +2020-10-30 21:20:57 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvdecoder.h: + * sys/nvcodec/gstnvh264dec.c: + * sys/nvcodec/gstnvh265dec.c: + * 
sys/nvcodec/gstnvvp8dec.c: + nvcodec: nvdecoder: Move to refcount based GstNvDecoderFrame + This refcount based way would be helpful for sharing nvdec frame among + multiple codec pictures and later zero-copy use case. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1738> + +2020-10-30 23:38:15 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvdecoder.h: + nvcodec: nvdecoder: Get rid of G_GNUC_INTERNAL + default is visibility=hidden. Don't need to use G_GNUC_INTERNAL + for new code therefore. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1738> + +2020-10-30 20:37:44 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/gstnvvp8dec.c: + * sys/nvcodec/gstnvvp8dec.h: + * sys/nvcodec/meson.build: + * sys/nvcodec/plugin.c: + nvcodec: Add VP8 stateless decoder element + Like other nvcodec stateless decoders, the rank of this new nvvp8sldec + element will be secondary for now. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1738> + +2020-10-30 23:26:49 +0900 Seungha Yang <seungha@centricular.com> + + * sys/nvcodec/plugin.c: + nvcodec: nvsldec: Fix typo in debug message + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1738> + +2020-11-09 18:27:14 +0100 Edward Hervey <edward@centricular.com> + + * gst/mpegtsdemux/mpegtspacketizer.c: + mpegtspacketizer: Handle PCR issues with adaptive streams + A lot of content producers out there targetting "adaptive streaming" are riddled + with non-compliant PCR streams (essentially all the players out there just use + PTS/DTS and don't care about the PCR). + In order to gracefully cope with these, we detect them appropriately and any + small (< 15s) PCR resets get gracefully ignored. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1785> 2020-10-30 14:07:02 +0000 Julian Bouzas <julian.bouzas@collabora.com> * sys/nvcodec/gstcudautils.c: nvcodec: leave g_once_init when all quarks are initialized - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1784> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1782> + +2020-11-09 23:22:09 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264decoder: Fix missing drain handling in bumping + Should've been included in commit 5527cc4a2e7ce8eeee1d8a717f99252477d6015f + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1783> + +2020-11-09 23:04:32 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Try reference picture marking process in any case + ... even if there are some invalid conditions + (because of a broken stream, our implementation's fault, or so on). + Otherwise the baseclass will keep such reference pictures and + it would result in a full DPB. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1783> + +2020-11-09 11:44:36 +0100 Edward Hervey <edward@centricular.com> + + * tests/examples/mpegts/ts-parser.c: + examples: Properly handle extended descriptors + By checking the extended tag. Provides a bit more information (if the extended tag + is known) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1781> + +2020-11-08 19:08:25 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11h264dec.c: + d3d11h264dec: Fix for MbaffFrameFlag and FrameNumList + As per spec 7.4.3 Slice header semantics, the flag value is derived as + MbaffFrameFlag = (mb_adaptive_frame_field_flag && !field_pic_flag) + and DXVA uses the value. 
+ Regarding FrameNumList, in case of a long-term ref, the FrameNumList[i] + value should be long_term_frame_idx, not long_term_pic_num. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1780> + +2020-11-05 19:30:35 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264decoder: Reset frame number per MMCO type 5 + It should be cleared so as to avoid wrong frame gap detection + for following pictures. + Passing 4 more conformance bitstream tests + * MR2_TANDBERG_E + * MR3_TANDBERG_B + * MR4_TANDBERG_C + * MR5_TANDBERG_C + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1768> + +2020-11-05 18:42:37 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + codecs: h264decoder: Fix for MMCO type 2 + As per 8.2.5.4.2, we should mark a picture which has + LongTermPicNum == long_term_pic_num as "unused for reference", + not pic_num. 
+ Passing conformance bitstream test with MR2_MW_A + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1768> + +2020-11-05 18:27:11 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + codecs: h264picture: Add more trace log + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1768> 2020-11-05 13:30:49 +0000 Jason Pereira <mindriot88@users.noreply.github.com> * docs/plugins/gst_plugins_cache.json: * sys/decklink/gstdecklink.cpp: decklink: correct framerate 2KDCI 23.98 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1774> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1771> + +2020-11-05 09:11:03 +0100 Rafostar <40623528+Rafostar@users.noreply.github.com> + + * gst-libs/gst/player/gstplayer.c: + doc: player: mention that get_pipeline method needs unref + All other methods in docs clearly mention that an unref is needed, so should `get_pipeline()`. #1450 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1764> 2020-11-05 09:01:47 +0100 Rafostar <40623528+Rafostar@users.noreply.github.com> @@ -1204,7 +17076,84 @@ player: call ref_sink on pipeline Otherwise `gst_player_get_pipeline()` will return a floating reference which may confuse bindings and lead to crash. Fixes #1450 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1770> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1763> + +2020-11-04 18:43:41 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * meson.build: + meson: Enable some MSVC warnings for parity with GCC/Clang + This makes it easier to do development with MSVC by making it warn + on common issues that GCC/Clang error out for in our CI configuration. 
+ Continuation from https://gitlab.freedesktop.org/gstreamer/gst-build/-/merge_requests/223 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1760> + +2020-10-21 09:01:31 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/va/gstvabasedec.h: + * sys/va/gstvah265dec.c: + * sys/va/gstvah265dec.h: + * sys/va/meson.build: + * sys/va/plugin.c: + va: Add HEVC decoding support + This adds HEVC decoding support into the new VA plugin. This implementation has + been tested using the ITU conformance test (through fluster). It fails all + MAIN10 tests, as this is not implemented yet, along with the following: + CONFWIN_A_Sony_1 (looks fine, but md5sum is incorrect) + PICSIZE_A_Bossen_1 (height too high) + PICSIZE_B_Bossen_1 (same) + VPSSPSPPS_A_MainConcept_1 (parser issue) + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1714> + +2020-11-03 16:05:48 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + h265parser: Fix wrong warning message + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1714> + +2020-11-03 11:23:15 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + h265decoder: Remove unused WpOffsetHalfRangeC + This is only needed for the VA implementation of weight tables and isn't used + within the base class. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1714> + +2020-11-02 00:08:04 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + codecs: h264decoder: Rework for DPB management + Sync with recent h265decoder DPB implementation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1761> + +2020-11-04 18:47:30 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + codecs: h264decoder: Remove unused pts variable + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1761> + +2020-11-03 14:12:45 +0900 youngh.lee <youngh.lee@lge.com> + + * gst/aiff/aiffparse.c: + aiffparse: Also set a channel mask for 2 channels + And only add debug output at FIXME level when using the fallback + channel mask, not for those defined in the AIFF spec. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1756> + +2020-06-23 10:29:42 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/icestream.c: + * ext/webrtc/nicetransport.c: + * ext/webrtc/nicetransport.h: + webrtc: Add properties to change the socket buffer sizes to ice object + libnice doesn't touch the kernel buffer sizes. When dealing with RTP data, + it's generally advisable to increase them to avoid dropping packets locally. + This is especially important when running multiple higher bitrate streams at + the same time. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1366>
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1759> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1754> + +2020-11-03 02:14:21 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth265picture.c: + codecs: h265picture: Minor update for coding style + It's GstH265Dpb, not GstH265Decoder + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1755> + +2020-11-03 01:53:15 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + * gst-libs/gst/codecs/gsth265picture.c: + * gst-libs/gst/codecs/gsth265picture.h: + codecs: h265decoder: Make GstVideoCodecFrame hold the last reference of the buffer + The functionality of passing the last reference of GstH265Picture + was silently dropped by the commit eeffd91109a409063e866337452eedd392649775 + This commit will make it work again. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1755> + +2020-11-03 01:41:13 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Clear GstVideoCodecFrame on DPB clear if needed + h265decoder might need to clear the DPB depending on context even if + it's not a flushing case. So the associated GstVideoCodecFrame needs to be + released in the non-flushing case too. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1755> + +2020-11-03 00:57:46 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Don't drain DPB on EOB/EOS/IDR nalu + The DPB bumping decision per end-of-bitstream, end-of-sequence or IDR nal + should be done per spec. In short, draining on EOB/EOS/IDR is undefined + behavior as per spec. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1755> + +2020-11-01 18:32:56 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + h265decoder: Complete dependent slice header + This will save the last independent slice and fill in the missing + information for dependent slices. This was left over during the porting + from gstreamer-vaapi. The private variable prev_independent_slice was + already there. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1750> + +2020-11-01 18:30:34 -0500 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + h265decoder: Prevent possible infinite loop + Theoretically, one could produce a broken stream that would lead to an + infinite loop in the specified algorithm to calculate l0/l1 reference lists. + This patch will exit early if this condition is met. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1750> + +2020-10-22 12:38:11 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + * gst-libs/gst/codecparsers/gsth265parser.h: + h265parse: Add missing const qualifier + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1750> + +2020-11-02 22:47:20 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + Revert "d3d11decoder: Use D3D11/DXGI standard colorimetry" + This reverts commit a52fc6deeda203add520cb59ae0026d109ecda95. + The change breaks H264/HEVC conformance bitstream tests + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1753> 2020-11-02 08:46:25 +0000 Randy Li <ayaka@soulik.info> @@ -1223,7 +17242,67 @@ In most cases, this typo would work. But not for ARGB8888 and XRGB8888, whose shm formats are not based on a fourcc and would never appear in the format enumeration. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1758> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1751> + +2020-11-01 03:58:30 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + d3d11decoder: Use D3D11/DXGI standard colorimetry + D3D11/DXGI supports a smaller set of colorimetry than all possible + combinations. This restriction would make more streams convertible + by using ID3D11VideoProcessor + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1743> + +2020-10-31 03:28:55 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11window.cpp: + d3d11window: Use ID3D11VideoProcessor only if device supports corresponding conversion + ... and drop support for ID3D11VideoProcessor if the device doesn't + support the ID3D11VideoContext1 interface and therefore we cannot + query conversion supportability. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1743> + +2020-11-01 20:52:11 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + d3d11h{264,265}dec: Submit picture level parameters only once + Submit PICTURE_PARAMETERS and INVERSE_QUANTIZATION_MATRIX + buffers only once per picture. Multiple submission is redundant. + Also this modification would fix broken hevc decoding with + dependent slices. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1749> + +2020-10-31 20:36:13 +0900 Seungha Yang <seungha@centricular.com> + + codecs: h265decoder: Rework for DPB management + * Move all DPB bumping process into GstH265Dpb internals + * Handle DPB add process in the GstH265Dpb struct + * Make the implementation 1:1 mappable with the HEVC specification + * Fix wrong DPB bumping implementation especially when no_output_of_prior_pics_flag + was specified. 
+ With fixes from Nicolas Dufresne <nicolas.dufresne@collabora.com> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1748> + +2020-10-31 20:31:51 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11decoder: Get rid of framerate field from pad template + Framerate is optional value and we don't have any framerate + related restriction for those elements. This commit is to fix + negotiation failure when upstream doesn't set framerate on caps. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1747> + +2020-10-31 21:46:16 +1100 Jan Schmidt <jan@centricular.com> + + * tests/check/elements/dtls.c: + tests: Don't set dtlsenc state before linking. + Link the dtlsenc in the testsuite before setting it to paused, as it + starts a pad task that can generate a not-linked error otherwise. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1744> 2020-10-31 01:23:36 +1100 Jan Schmidt <jan@centricular.com> @@ -1232,7 +17311,7 @@ If the DTLS elements fail, they post a bus error and don't signal any key negotiation. Catch the bus error and fail the test early instead of letting it hang and time out. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1746> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1741> 2020-10-30 22:52:18 +1100 Jan Schmidt <jan@centricular.com> @@ -1242,7 +17321,7 @@ Call the parent state_change function first when changing state downward, to make sure that the element has stopped before cleaning it up. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1746> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1741> 2020-10-30 22:49:22 +1100 Jan Schmidt <jan@centricular.com> @@ -1252,7 +17331,7 @@ might be incomplete and leave a partial bio_buffer behind. If the DTLS connection is already marked closed, drop out of dtls_connection_process early without asserting. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1746> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1741> 2020-10-30 16:31:18 +1100 Jan Schmidt <jan@centricular.com> @@ -1262,15 +17341,112 @@ between checking the is_closed flag and enqueueing a source on the main context. Protect the main context with the object lock instead of the PC lock, and hold a ref briefly to make sure it stays alive. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1746> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1741> -2020-10-31 21:46:16 +1100 Jan Schmidt <jan@centricular.com> +2020-07-08 17:24:36 -0400 Olivier Crête <olivier.crete@collabora.com> - * tests/check/elements/dtls.c: - tests: Don't set dtlsenc state before linking. - Link the dtlsenc in the testsuite before setting it to paused, as it - starts a pad task that can generate a not-linked error otherwise. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1745> + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + * ext/webrtc/gstwebrtcice.c: + * ext/webrtc/gstwebrtcice.h: + * ext/webrtc/meson.build: + * ext/webrtc/sctptransport.c: + * ext/webrtc/sctptransport.h: + * ext/webrtc/webrtctransceiver.c: + * ext/webrtc/webrtctransceiver.h: + webrtc: Set the DSCP markings based on the priority + This matches how the WebRTC javascript API works and the Chrome implementation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1707> + +2020-07-09 13:39:03 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + rtpsender: Add API to set the priority + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1707> + +2020-07-09 13:42:35 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/webrtctransceiver.h: + rtptransceiver: Store the SSRC of the current stream + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1707> + +2020-07-08 19:13:33 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * gst-libs/gst/webrtc/rtptransceiver.h: + * gst-libs/gst/webrtc/webrtc_fwd.h: + webrtc: Save the media kind in the transceiver + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1707> + +2020-07-09 13:45:20 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Remove unused function + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1707> + +2020-10-02 21:38:00 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/rtpsender.h: + * gst-libs/gst/webrtc/rtptransceiver.h: + webrtc: Document more objects + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1707> + +2020-10-31 00:37:48 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11decoder: Allow 10bits only profiles + HEVC_VLD_Main10 and VP9_VLD_10bit_Profile2 might not support + 8bit format (i.e., NV12) depending on GPU vendor. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1742> + +2020-10-25 13:33:21 +0200 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklink.cpp: + decklink: Remove \n from debug output + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1721> + +2020-10-25 13:32:26 +0200 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklink.cpp: + * sys/decklink/gstdecklinkaudiosink.cpp: + * sys/decklink/gstdecklinkaudiosrc.cpp: + * sys/decklink/gstdecklinkvideosink.cpp: + * sys/decklink/gstdecklinkvideosrc.cpp: + decklink: Correctly indent everything + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1721> + +2020-10-25 13:30:55 +0200 Sebastian Dröge <sebastian@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/decklink/gstdecklink.cpp: + * sys/decklink/gstdecklink.h: + * sys/decklink/gstdecklinkvideosink.cpp: + * sys/decklink/gstdecklinkvideosink.h: + * sys/decklink/gstdecklinkvideosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.h: + decklink: Add a default profile id + This causes no changes to the profile but keeps the existing settings. + The profile can also be changed from e.g. the card's configuration + application and in that case probably should be left alone. + The default is the new value as it keeps the profile setting as it is, + which is consistent with the previous behaviour in 1.18. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1721> + +2020-10-25 13:14:11 +0200 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklink.cpp: + decklink: Mark internal function as static + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1721> + +2020-10-25 13:13:37 +0200 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklink.cpp: + decklink: Remove some dead code + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1721> 2020-10-30 10:02:32 +0200 Sebastian Dröge <sebastian@centricular.com> @@ -1279,7 +17455,26 @@ The widescreen modes moved after GST_DECKLINK_MODE_2160p60 and using them now would cause an assertion. This is a regression from 309f6187fef890c7ffa49305f38e89beac3b1423. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1740> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1737> + +2020-08-25 14:56:50 +0100 Chris Bass <floobleflam@gmail.com> + + * ext/ttml/ttmlparse.c: + ttmlparse: Handle whitespace before XML declaration + When ttmlparse is in, e.g., an MPEG-DASH pipeline, there may be + whitespace between successive TTML documents in ttmlparse's accumulated + input. As libxml2 will fail to parse documents that have whitespace + before the opening XML declaration, ensure that any preceding whitespace + is not passed to libxml2. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1539> + +2020-08-25 14:54:31 +0100 Chris Bass <floobleflam@gmail.com> + + * ext/ttml/ttmlparse.c: + ttmlparse: Ensure only single TTML doc parsed + The parser handles only one TTML file at a time, therefore if there are + multiple TTML documents in the input, parse only the first. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1539> 2020-10-29 13:43:16 -0400 Xavier Claessens <xavier.claessens@collabora.com> @@ -1287,13 +17482,82 @@ amc: Fix crash when encoding AVC gstamcvideoenc.c calls gst_amc_avc_profile_to_string() with alternatives set to NULL which causes a crash. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1739> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1732> -2020-10-18 17:59:44 +0200 Nicola Murino <nicola.murino@gmail.com> +2020-03-19 15:07:47 +0100 Guillaume Desmottes <guillaume.desmottes@collabora.com> - * ext/opencv/meson.build: - opencv: allow compilation against 4.5.x - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1735> + * ext/isac/gstisac.c: + * ext/isac/gstisacdec.c: + * ext/isac/gstisacdec.h: + * ext/isac/gstisacenc.c: + * ext/isac/gstisacenc.h: + * ext/isac/gstisacutils.c: + * ext/isac/gstisacutils.h: + * ext/isac/meson.build: + * ext/meson.build: + * meson_options.txt: + isac: add iSAC plugin + Wrapper on the iSAC reference encoder and decoder from webrtc, + see https://en.wikipedia.org/wiki/Internet_Speech_Audio_Codec + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1124> + +2020-10-28 11:49:54 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst/transcode/gsttranscodebin.c: + transcodebin: Create the decodebin in _init + This way user can request pads right from the beginning + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1151> + +2020-10-25 18:04:05 +0000 Philippe Normand <philn@igalia.com> + + * gst/transcode/gsttranscodebin.c: + transcodebin: Accept more than one stream + Look-up the stream matching the given ID also after building the stream list + from the received collection. Without this change the transcoder would discard + the second incoming stream. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1151> + +2020-03-20 09:27:48 -0300 Thibault Saunier <tsaunier@igalia.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/transcode/gsttranscodebin.c: + * gst/transcode/gsturitranscodebin.c: + transcodebin: Port to decodebin3 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1151> + +2020-03-19 09:35:57 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + * gst/transcode/gsturitranscodebin.c: + uritranscodebin: Move to using a urisourcebin for our source. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1151> + +2020-03-19 09:34:54 -0300 Thibault Saunier <tsaunier@igalia.com> + + * gst-libs/gst/transcoder/gsttranscoder.c: + transcoder: Base sync transcoding variant on a GMainLoop + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1151> + +2020-10-29 06:13:05 +0000 Randy Li <ayaka@soulik.info> + + * ext/wayland/gstwaylandsink.c: + * ext/wayland/wlwindow.c: + * ext/wayland/wlwindow.h: + waylandsink: release frame callback when destroyed + We use a frame callback from the surface to indicate + that the last buffer has been rendered, but if we destroy the surface + while that callback is still pending, it may crash the wayland event + queue. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1729> + +2020-10-28 19:00:43 +0900 Seungha Yang <seungha@centricular.com> + + * gst/rtmp2/gstrtmp2src.c: + rtmp2src: Set buffer timestamp on output buffer + This timestamp information would be useful for the queue2 element + when calculating the time level, and it also makes buffering decisions + more reliable. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1727> 2020-10-28 00:47:49 +0900 Seungha Yang <seungha@centricular.com> @@ -1301,45 +17565,723 @@ d3d11videoprocessor: Fix wrong input/output supportability check The flag argument of ID3D11VideoProcessorEnumerator::CheckVideoProcessorFormat method is output value, not input. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1736> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1726> -2020-09-12 02:48:43 +0200 Jan Alexander Steffens (heftig) <heftig@archlinux.org> +2020-10-25 02:27:52 +0900 Seungha Yang <seungha@centricular.com> - * tests/check/elements/svthevcenc.c: - tests: svthevcenc: Fix test_encode_simple - Pick the same I420 format the other test use. Without this, the source - picks AYUV64, which fails. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1734> + * sys/nvcodec/gstnvdecoder.c: + * sys/nvcodec/gstnvdecoder.h: + * sys/nvcodec/gstnvh264dec.c: + * sys/nvcodec/gstnvh265dec.c: + nvcodec: nvsldec: Add support for CUDA memory + Add CUDA memory support. Note that zero copying is not supported yet + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1720> -2020-09-23 17:04:55 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> +2020-10-26 05:15:33 +0900 Seungha Yang <seungha@centricular.com> - * gst/mpegtsmux/gstbasetsmux.c: - mpegtsmux: Restore intervals when creating TsMux - Otherwise the settings from the properties would be overwritten with - the defaults. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1733> + * sys/d3d11/gstd3d11memory.c: + d3d11memory: Adjust log level for some spammy debug messages + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1723> -2020-10-27 12:34:33 +0000 Tim-Philipp Müller <tim@centricular.com> +2020-10-26 05:11:45 +0900 Seungha Yang <seungha@centricular.com> - * meson.build: - Back to development + * sys/d3d11/gstd3d11colorconvert.c: + d3d11convert: Use ID3D11VideoProcessor only if device supports colorspace + Check whether conversion with given combination of input/output + format and dxgi colorspace is supported or not by driver. + If not, we should use shader. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1723> -=== release 1.18.1 === +2020-10-26 05:09:40 +0900 Seungha Yang <seungha@centricular.com> -2020-10-26 11:14:43 +0000 Tim-Philipp Müller <tim@centricular.com> + * sys/d3d11/gstd3d11videoprocessor.c: + * sys/d3d11/gstd3d11videoprocessor.h: + d3d11videoprocessor: Add a method for device's conversion caps check + Add a wrapper method for + ID3D11VideoProcessorEnumerator1::CheckVideoProcessorFormatConversion. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1723> - * ChangeLog: - * NEWS: - * RELEASE: - * gst-plugins-bad.doap: +2020-10-26 05:04:55 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11format.c: + d3d11format: Map more colorimetry with dxgi colorspace + Map more logically identical set of GstVideoColorimetry formats + with dxgi color space. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1723> + +2020-10-25 23:13:46 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvapool.c: + va: pool: Set the video_alignment after we get image info. + The set_format() of the allocator may change the stride of the + alloc_info. 
We should update the video_align.stride_align based + on it. Otherwise, we get a warning in gst_video_meta_validate_alignment(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1698> + +2020-10-26 11:50:59 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvah264dec.c: + va: h264dec: Set the padding to VideoAlignment's right. + In our va implementation, we just use the image's info to map the buffer. + The padding info just plays a role as a place holder to expand the + allocation size in caps when the decoding size is bigger than the display + size. So the padding_right or padding_left does not change the result. + But we find that if using padding_left, it is hard to meet the requirement + of gst_video_meta_validate_alignment(), when the video meta's stride + is different from the allocation width. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1698> + +2020-10-26 01:22:12 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvabasedec.c: + va: basedec: No need to call base class' decide_allocation(). + We have already done the job in gst_va_base_dec_decide_allocation() + and there is no need to call the base class' decide_allocation() again. The base + class' decide_allocation() will set_format() again and let us do the + image/surface testing again, which hurts performance and is not needed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1698> + +2020-10-20 14:31:22 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: use _update_image_info() to set allocator parameters. + Use this standalone function to update the allocator info and make + all ensure_image() and mem_alloc() API clean. + We also change the default way of using the image. We now set the non-derived + manner as the default, and if it fails, then fall back 
+ On a lot of platforms, the derived image does not have caches, so the + read and write operations have very low performance. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1698> + +2020-10-20 14:09:35 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: Add a helper function to update the image info. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1698> + +2020-10-19 23:32:44 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.c: + va: allocator: Decide the allocator's parameters when set_format(). + Moving the parameters testing and setting from the allocator_alloc_full() + to the allocator_try(). The allocator_alloc_full() will be called every + time when we need to allocate a new memory. But all these parameters such + as the surface and the image format, rt_format, etc, are unchanged during + the whole allocator lifetime. Just setting them in set_format() is enough. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1698> + +2020-10-18 17:59:44 +0200 Nicola Murino <nicola.murino@gmail.com> + + * ext/opencv/meson.build: + opencv: allow compilation against 4.5.x + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1709> + +2020-03-24 09:18:28 -0400 Aaron Boxer <aaron.boxer@collabora.com> + + * gst/videoparsers/gstjpeg2000parse.c: + jpeg2000parse: sub-sampling parse should take component into account + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1653> + +2020-04-21 14:16:45 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * gst/videoparsers/gstjpeg2000parse.c: + jpeg2000parse: no pts interpolation with subframe. + The jpeg2000parser must not interpolate PTS with subframes. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1653> + +2020-01-13 14:01:19 -0500 Aaron Boxer <aaron.boxer@collabora.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/videoparsers/gstjpeg2000parse.c: + jpeg2000parse: support frame and stripe alignment in caps + forward alignment and num-stripes caps properties + Use caps height when setting caps for subframe + We want downstream to use full frame height, not subframe height + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1653> + +2020-10-25 11:46:29 +0200 Sebastian Dröge <sebastian@centricular.com> + + * sys/decklink/gstdecklinkaudiosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.cpp: + decklink: Reset skip counters when starting the sources + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/378> + +2018-05-10 14:05:12 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * sys/decklink/gstdecklinkaudiosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.cpp: + decklink*src: Post a warning message on the bus about dropped frames + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/378> + +2017-11-28 13:44:18 +0100 Georg Lippitsch <glippitsch@toolsonair.com> + + * sys/decklink/gstdecklinkaudiosrc.cpp: + * sys/decklink/gstdecklinkaudiosrc.h: + * sys/decklink/gstdecklinkvideosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.h: + decklink*src: Aggregate dropped frame/packet logging + decklink*src currently prints a log entry for every dropped frame and + audio packet. That completely spams the logs. + This change aggregates information about dropped packets and only prints + a message once when dropping starts, and a summary when dropping ends. 
+ Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/705 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/378> + +2020-10-24 20:59:55 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11memory.c: + * sys/d3d11/gstd3d11memory.h: + d3d11memory: Protect view object with lock + Make resource allocation more thread-safe + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1718> + +2020-10-24 02:47:22 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11videoprocessor.c: + * sys/d3d11/gstd3d11videoprocessor.h: + d3d11convert: Add support for conversion using ID3D11VideoProcessor + Output texture of d3d11 decoder cannot have the bind flag + D3D11_BIND_SHADER_RESOURCE (meaning that it cannot be used for shader + input resource). So d3d11convert (and it's subclasses) was copying + texture into another internal texture to use d3d11 shader. + It's obviously overhead and we can avoid texture copy for + colorspace conversion or resizing via ID3D11VideoProcessor + as it supports decoder output texture. + This commit would be a visible optimization for d3d11 decoder with + d3d11compositor use case because we can avoid texture copy per frame. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1718> + +2020-10-24 02:33:29 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11memory.c: + * sys/d3d11/gstd3d11memory.h: + d3d11memory: Store ID3D11VideoProcessorOutputView object + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1718> + +2020-10-23 22:29:57 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Fix picture leaks because of reference set. + The last frame's reference set has no one to cleanup. We need to + clean all pictures in the stop() func. 
+ We also add a helper function to cleanup all the pictures in the + reference picture set. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1717> + +2020-10-23 21:21:05 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + codecs: h265decoder: Fix 3 ref array leaks in finalize. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1717> + +2020-10-23 16:59:00 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11videoprocessor.c: + * sys/d3d11/gstd3d11videoprocessor.h: + * sys/d3d11/gstd3d11window.cpp: + d3d11window: Reuse ID3D11VideoProcessorInputView if possible + GstMemory object could be disposed if GstBuffer is not allocated + by GstD3D11BufferPool such as via gst_buffer_copy() and/or + gst_buffer_make_writable(). So attaching qdata on GstMemory + object would cause unnecessary view alloc/free. + By using view pool which is implemented in GstD3D11Allocator, + we can avoid redundant view alloc/free. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1716> + +2020-10-21 16:28:11 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11memory.c: + * sys/d3d11/gstd3d11memory.h: + d3d11memory: Implement ID3D11VideoProcessorInputView pool + Similar to ID3D11VideoDecoderOutputView pool implementation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1716> + +2018-04-07 16:33:47 -0400 Xavier Claessens <xavier.claessens@collabora.com> + + * gst-libs/gst/audio/meson.build: + * gst-libs/gst/codecparsers/meson.build: + * gst-libs/gst/insertbin/meson.build: + * gst-libs/gst/interfaces/meson.build: + * gst-libs/gst/mpegts/meson.build: + * gst-libs/gst/player/meson.build: + * gst-libs/gst/sctp/meson.build: + * gst-libs/gst/transcoder/meson.build: + * gst-libs/gst/vulkan/meson.build: + * gst-libs/gst/wayland/meson.build: + * gst-libs/gst/webrtc/meson.build: * meson.build: - Release 1.18.1 + * pkgconfig/.gitignore: + * pkgconfig/gstreamer-bad-audio-uninstalled.pc.in: + * pkgconfig/gstreamer-bad-audio.pc.in: + * pkgconfig/gstreamer-codecparsers-uninstalled.pc.in: + * pkgconfig/gstreamer-codecparsers.pc.in: + * pkgconfig/gstreamer-insertbin-uninstalled.pc.in: + * pkgconfig/gstreamer-insertbin.pc.in: + * pkgconfig/gstreamer-mpegts-uninstalled.pc.in: + * pkgconfig/gstreamer-mpegts.pc.in: + * pkgconfig/gstreamer-photography-uninstalled.pc.in: + * pkgconfig/gstreamer-photography.pc.in: + * pkgconfig/gstreamer-player-uninstalled.pc.in: + * pkgconfig/gstreamer-player.pc.in: + * pkgconfig/gstreamer-plugins-bad-uninstalled.pc.in: + * pkgconfig/gstreamer-plugins-bad.pc.in: + * pkgconfig/gstreamer-sctp-uninstalled.pc.in: + * pkgconfig/gstreamer-sctp.pc.in: + * pkgconfig/gstreamer-transcoder-uninstalled.pc.in: + * pkgconfig/gstreamer-transcoder.pc.in: + * pkgconfig/gstreamer-vulkan-uninstalled.pc.in: + * pkgconfig/gstreamer-vulkan-wayland-uninstalled.pc.in: + * pkgconfig/gstreamer-vulkan-wayland.pc.in: + * 
pkgconfig/gstreamer-vulkan-xcb-uninstalled.pc.in: + * pkgconfig/gstreamer-vulkan-xcb.pc.in: + * pkgconfig/gstreamer-vulkan.pc.in: + * pkgconfig/gstreamer-wayland-uninstalled.pc.in: + * pkgconfig/gstreamer-wayland.pc.in: + * pkgconfig/gstreamer-webrtc-uninstalled.pc.in: + * pkgconfig/gstreamer-webrtc.pc.in: + * pkgconfig/meson.build: + Meson: Use pkg-config generator + +2020-07-30 20:23:37 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + * gst-libs/gst/codecs/gsth265decoder.h: + * sys/d3d11/gstd3d11h265dec.c: + * sys/nvcodec/gstnvh265dec.c: + h265decoder: Add support for l0/l1 + Add support for reference list needed for VA-API and some V4L2 decoders. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1713> + +2020-07-28 18:37:38 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth265decoder.c: + * gst-libs/gst/codecs/gsth265decoder.h: + * gst-libs/gst/codecs/gsth265picture.c: + * gst-libs/gst/codecs/gsth265picture.h: + * sys/d3d11/gstd3d11h265dec.c: + * sys/nvcodec/gstnvh265dec.c: + h265decoder: Sync with the H264 implementation + This ensures that we get the last reference to picture being outputed, + avoiding GstBuffer structure copies and simplifying the buffer management. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1713> + +2020-10-20 17:31:17 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11decoder: Directly access ID3D11VideoDecoderOutputView for decoding + Decoder output view is stored in GstD3D11Memory object instead of + wrapper struct now. So qdata is no more required. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1712> + +2020-10-20 01:59:35 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11memory.c: + * sys/d3d11/gstd3d11memory.h: + d3d11memory: Implement ID3D11VideoDecoderOutputView pool + Similar to texture-array pool, we can reuse decoder output view + since the life time of output view is identical to that of texture-array. + In this way, we can avoid frequent output view alloc/free. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1712> + +2020-10-04 23:39:05 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11memory.c: + d3d11memory: Move to GArray to store texture-array status + The size D3D11_REQ_TEXTURE2D_ARRAY_AXIS_DIMENSION is 2048 + which is too large in practice especially for a texture + of dpb + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1712> + +2020-10-12 19:20:10 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvabasedec.h: + * sys/va/gstvadecoder.c: + * sys/va/gstvadecoder.h: + * sys/va/gstvaprofile.c: + * sys/va/gstvavp9dec.c: + * sys/va/gstvavp9dec.h: + * sys/va/meson.build: + * sys/va/plugin.c: + va: Add VP9 decoder + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1702> + +2020-10-16 15:46:20 +0100 Philippe Normand <philn@igalia.com> + + * ext/wpe/gstwpesrc.cpp: + wpe: Convert launch lines to markdown and move since tag + Seems like the examples don't appear in the generated docs because the Since tag + was badly positioned in the doc blurb. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1706> + +2020-10-16 10:35:36 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9picture.h: + codecs: vp9decoder: Add segmentation to picture. 
+ VA-API needs AC and DC quant scales for both luma and chroma, and the loop + filter level for the current frame, but these values are not available outside + the private GstVp9Parser structure. And these values may change from frame + to frame, so they are picture specific. + This patch adds a GstVp9Segmentation structure array to GstVp9Picture to expose + it to derived classes. This approach is safer than passing the parser through + the picture handling flow. + Also, in order to fix the Documentation CI, this patch marks the + GstVp9Picture structure as private. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1700> + +2020-10-12 11:07:47 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9decoder.h: + * sys/d3d11/gstd3d11vp9dec.c: + codecs: vp9decoder: Pass parser as new_sequence() parameter. + In order to know the chroma format, besides profile, subsampling_x and + subsampling_y are needed (Spec 7.2.2 Color config semantics). These values are + in GstVp9Parser but not in GstVp9Framehdr. + Also, bit_depth is available in the parser but not the frame header. Moreover, those + values are copied to the picture structure later. + In the case of VA-API, to configure the pipeline, it is required to know the chroma + format and depth. + It is possible to know the chroma and depth through caps coming from vp9parser, but + that requires string parsing. It is less error prone to get these values + through the parser structure in the new_sequence() virtual method. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1700> + +2020-09-23 16:43:30 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst/rtp/gstrtpsrc.c: + rtpsrc: Cleanup on BYE, timeout or when pad is reused + In this patch, we enabled the 'autoremove' feature of rtpbin and also call + 'clear-ssrc' on the rtpssrcdemux element when a pad is being reused. 
This ensures
+ that the jitterbuffer is removed and no threads accumulate.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1575>
+
+2020-09-04 14:18:13 +0300 George Kiagiadakis <george.kiagiadakis@collabora.com>
+
+ * gst/rtp/gstrtpsrc.c:
+ rtpsrc: re-use the same src pad for streams that have the same payload type
+ Also use the payload type when naming pads; this will make it easier to identify
+ pads and simplify the code.
+ Fixes #1395
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1575>
+
+2020-06-03 01:26:12 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11compositor.c:
+ * sys/d3d11/gstd3d11compositor.h:
+ * sys/d3d11/gstd3d11compositorbin.c:
+ * sys/d3d11/gstd3d11compositorbin.h:
+ * sys/d3d11/meson.build:
+ * sys/d3d11/plugin.c:
+ d3d11: Introduce d3d11compositor element
+ Add a new video composition element which is equivalent to the compositor
+ and glvideomixer elements. When d3d11 decoder elements are used,
+ d3d11compositor can do efficient graphics memory handling
+ (zero copying, or at least copying memory within GPU memory space).
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323>
+
+2020-08-05 17:27:30 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11shader.c:
+ d3d11shader: Allow drawing without shader resource view
+ ... for the case where we are rendering on a target without an input texture.
+ For example, we might want to draw an arbitrary shape on a render target view
+ without a shader resource view.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323>
+
+2020-08-03 03:19:34 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11colorconvert.c:
+ * sys/d3d11/gstd3d11colorconvert.h:
+ * sys/d3d11/plugin.c:
+ d3d11convert: Add new subclasses for only color convert or resize
+ The new d3d11colorconvert and d3d11scale elements will perform only
+ colorspace conversion and rescaling, respectively. These new elements
+ are useful when only colorspace conversion or rescaling is required
+ and the other part should be done by other elements.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323>
+
+2020-08-01 20:16:52 +0900 Seungha Yang <seungha@centricular.com>
+
+ * sys/d3d11/gstd3d11colorconverter.c:
+ * sys/d3d11/gstd3d11colorconverter.h:
+ d3d11colorconverter: Allow setting alpha value to use
+ ... used for the reordering case for now. In other words, non-alpha formats
+ such as NV12 are not supported yet.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323> + +2020-06-03 01:20:41 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11colorconverter.c: + * sys/d3d11/gstd3d11colorconverter.h: + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11overlaycompositor.c: + * sys/d3d11/gstd3d11shader.c: + * sys/d3d11/gstd3d11shader.h: + * sys/d3d11/gstd3d11window.cpp: + d3d11colorconverter: Add support conversion with blending + This is pre-work for d3d11compositor support + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323> + +2020-06-03 00:59:15 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconverter.c: + * sys/d3d11/gstd3d11colorconverter.h: + d3d11colorconverter: Add method to support updating destination rect + It's equivalent to GST_VIDEO_CONVERTER_OPT_DEST_* options of GstVideoConverter + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323> + +2020-06-03 00:46:13 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconverter.c: + * sys/d3d11/gstd3d11colorconverter.h: + * sys/d3d11/gstd3d11decoder.c: + * sys/d3d11/gstd3d11overlaycompositor.c: + * sys/d3d11/gstd3d11overlaycompositor.h: + * sys/d3d11/gstd3d11window.cpp: + d3d11: Clarify target rect to be updated + Rename internal methods to clarify which rect (i.e., input or output) + should be updated + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323> + +2020-10-02 10:02:38 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstnvdec.c: + * sys/nvcodec/gstnvdec.h: + nvcodec: Report latency in decoder based on max-display-delay + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2020-10-02 09:22:34 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstnvdec.c: + * sys/nvcodec/gstnvdec.h: + 
nvcodec: Add max-display-delay decoder property + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2020-09-24 11:25:33 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstnvdec.c: + nvcodec: Fix compiler error if OpenGL is not enabled + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2020-09-24 10:33:58 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstnvdecoder.c: + nvcodec: Add missing CUDAMemory src caps in h264 decoder + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2020-09-23 13:49:43 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstnvh264enc.c: + * sys/nvcodec/gstnvh265enc.c: + nvcodec: Add missing CUDAMemory sink caps in h264 and h265 encoders + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2020-09-22 13:07:19 -0400 Julian Bouzas <julian.bouzas@collabora.com> + + * sys/nvcodec/gstcudadownload.c: + nvcodec: Fix description of cudadownload element + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2019-08-11 15:02:04 +0900 Seungha Yang <seungha.yang@navercorp.com> + + * tests/check/elements/cudaconvert.c: + * tests/check/elements/cudafilter.c: + * tests/check/meson.build: + tests: Add CUDA filter unit tests + Adding a test for buffer meta and colorspace conversion + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2019-10-16 22:43:09 +0900 Seungha Yang <seungha.yang@navercorp.com> + + * sys/nvcodec/gstcudafilter.c: + * sys/nvcodec/gstcudascale.c: + * sys/nvcodec/gstcudascale.h: + * sys/nvcodec/meson.build: + nvcodec: Add CUDA video scale element + Add new element for video resizing using CUDA + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2019-10-16 22:42:55 +0900 Seungha Yang 
<seungha.yang@navercorp.com> + + * sys/nvcodec/gstcudabasefilter.c: + * sys/nvcodec/gstcudabasefilter.h: + * sys/nvcodec/gstcudaconvert.c: + * sys/nvcodec/gstcudaconvert.h: + * sys/nvcodec/gstcudafilter.c: + * sys/nvcodec/gstcudafilter.h: + * sys/nvcodec/meson.build: + * sys/nvcodec/plugin.c: + nvcodec: Add CUDA video convert element + Add new element for colorspace conversion using CUDA. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2019-10-16 22:42:39 +0900 Seungha Yang <seungha.yang@navercorp.com> + + * sys/nvcodec/cuda-converter.c: + * sys/nvcodec/cuda-converter.h: + * sys/nvcodec/gstcudacontext.c: + * sys/nvcodec/gstcudacontext.h: + * sys/nvcodec/gstcudaloader.c: + * sys/nvcodec/gstcudaloader.h: + * sys/nvcodec/meson.build: + * sys/nvcodec/stub/cuda.h: + nvcodec: Add generic CUDA video convert object + Introducing generic video convert object similar to video-converter + but using CUDA. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> + +2019-10-16 22:42:24 +0900 Seungha Yang <seungha.yang@navercorp.com> + + * sys/nvcodec/gstcudanvrtc.c: + * sys/nvcodec/gstcudanvrtc.h: + * sys/nvcodec/gstnvrtcloader.c: + * sys/nvcodec/gstnvrtcloader.h: + * sys/nvcodec/meson.build: + * sys/nvcodec/stub/nvrtc.h: + nvcodec: Add support runtime CUDA kernel source compilation + Add util functions for runtime CUDA kernel source compilation + using NVRTC library. Like other nvcodec dependent libraries, + NVRTC library will be loaded via g_module_open. + Note that the NVRTC library naming is not g_module_open friendly + on Windows. + (i.e., nvrtc64_{CUDA major version}{CUDA minor version}.dll). + So users can specify the dll name using GST_NVCODEC_NVRTC_LIBNAME + environment. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633>
+
+2019-10-16 22:42:06 +0900 Seungha Yang <seungha.yang@navercorp.com>
+
+ * sys/nvcodec/gstcudabasetransform.c:
+ * sys/nvcodec/gstcudabasetransform.h:
+ * sys/nvcodec/gstcudadownload.c:
+ * sys/nvcodec/gstcudadownload.h:
+ * sys/nvcodec/gstcudaupload.c:
+ * sys/nvcodec/gstcudaupload.h:
+ * sys/nvcodec/meson.build:
+ * sys/nvcodec/plugin.c:
+ nvcodec: Add CUDA upload/download elements with base class for CUDA filters
+ Similar to the glupload/gldownload elements but for CUDA memory.
+ They help transfer memory between system memory and the NVIDIA GPU.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633>
+
+2019-10-16 22:21:05 +0900 Seungha Yang <seungha.yang@navercorp.com>
+
+ * sys/nvcodec/gstnvbaseenc.c:
+ * sys/nvcodec/gstnvdec.c:
+ nvcodec: Peer direct access support
+ If the devices support direct access to each other, use device-to-device
+ memory copy without staging host memory.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633>
+
+2019-08-30 17:19:44 +0900 Seungha Yang <seungha.yang@navercorp.com>
+
+ * sys/nvcodec/gstcudacontext.c:
+ * sys/nvcodec/gstcudacontext.h:
+ * sys/nvcodec/gstcudaloader.c:
+ * sys/nvcodec/gstcudaloader.h:
+ cudacontext: Enable direct CUDA memory access over multiple GPUs
+ If the device contexts can access each other, enable peer access
+ for better interoperability.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633>
+
+2019-08-30 13:57:15 +0900 Seungha Yang <seungha.yang@navercorp.com>
+
+ * sys/nvcodec/gstnvbaseenc.c:
+ * sys/nvcodec/gstnvbaseenc.h:
+ * sys/nvcodec/gstnvenc.c:
+ nvenc: Support CUDA buffer pool
+ When upstream supports CUDA memory (only nvdec for now), we will create a
+ CUDA buffer pool.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633>
+
+2019-08-30 13:55:25 +0900 Seungha Yang <seungha.yang@navercorp.com>
+
+ * sys/nvcodec/gstnvdec.c:
+ * sys/nvcodec/gstnvdec.h:
+ nvdec: Support CUDA buffer pool
+ If downstream can accept the CUDA memory caps feature (currently nvenc only),
+ CUDA memory is always preferred.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633>
+
+2019-08-19 18:02:56 +0900 Seungha Yang <seungha.yang@navercorp.com>
+
+ * sys/nvcodec/gstcudabufferpool.c:
+ * sys/nvcodec/gstcudabufferpool.h:
+ * sys/nvcodec/gstcudaloader.c:
+ * sys/nvcodec/gstcudaloader.h:
+ * sys/nvcodec/gstcudamemory.c:
+ * sys/nvcodec/gstcudamemory.h:
+ * sys/nvcodec/meson.build:
+ * sys/nvcodec/stub/cuda.h:
+ nvcodec: Add CUDA specific memory and bufferpool
+ Introduce a CUDA buffer pool with generic CUDA memory support.
+ Like GL memory, any element which is able to access CUDA device
+ memory directly can map this CUDA memory without upload/download
+ overhead via the "GST_MAP_CUDA" map flag.
+ The usual GstMemory map/unmap is also possible through internal staging memory.
+ For staging, CUDA host-allocated memory is used (see the cuMemAllocHost API).
+ This memory allows system access but has lower overhead
+ during GPU upload/download than normal system memory.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1633> 2020-10-16 12:29:02 +0100 Andrew Wesie <andrew@theori.io> * gst-libs/gst/codecparsers/gsth264parser.c: codecparsers: h264parser: guard against ref_pic_markings overflow - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1704> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1703> + +2020-10-16 00:48:01 +0100 Tim-Philipp Müller <tim@centricular.com> + + * ext/hls/gsthlssink2.c: + hlssink2: fix and flesh out docs + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1699> + +2020-10-15 18:26:48 +0200 Stéphane Cerveau <scerveau@collabora.com> + + * ext/dash/gstxmlhelper.c: + * meson.build: + meson: update glib minimum version to 2.56 + In order to support the symbol g_enum_to_string in various + project using GStreamer ( gst-validate etc.), the glib minimum + version should be 2.56.0. + Remove compat code as glib requirement + is now > 2.56 + Version used by Ubuntu 18.04 LTS + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1695> + +2020-10-12 01:04:13 +0800 He Junyan <junyan.he@intel.com> + + * sys/d3d11/gstd3d11vp8dec.c: + d3d11: vp8dec: No need to check show_frame flag when output_picture. + The VP8 base class has already handled it for us. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1670> + +2020-10-12 00:57:24 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecs/gstvp8decoder.c: + codecs: vp8decoder: handle the show_frame check in base class. + Move the show_frame check from sub class to vp8 decoder's base class. + Calling the sub class' output_picture() function only when the frame + is displayed and marking the other automatically as decode only. + This is done to avoid logic and code repetition in subclasses. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1670> + +2020-10-16 02:06:49 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * sys/d3d11/gstd3d11vp9dec.c: + codecs: vp9decoder: handle the show_frame check in base class + Same as vp8 decoder update https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1670 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1697> 2020-10-15 12:08:19 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> @@ -1348,7 +18290,7 @@ In preparation for !1670, this will allow the base class from skipping frames that should not be displayed. Previously it would complain about unordered decoding taking place in the driver. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1701> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1696> 2020-10-15 12:05:45 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> @@ -1357,7 +18299,7 @@ requests are executed in order, so while dequeuing sink buffers for previous request, also mark these request as no longer pending. This will allow reusing the request later. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1701> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1696> 2020-10-15 11:35:04 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> @@ -1366,7 +18308,43 @@ Pass the pointer instead of NULL in order to find and remove properly any pending request from the queue. This coding error was leading to use after free in error and early exit cases. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1701>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1696>
+
+2020-10-14 19:04:44 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvabasedec.c:
+ va: basedec: Create the other pool anyway.
+ Fix a bug in _create_other_pool(). The old way of checking
+ base->other_pool meant that other_pool could never change until
+ gst_va_base_dec_stop() stopped the current decoding context.
+ But in some streams, the resolution may change during the decoding
+ process, and we need to re-negotiate the buffer pool. Then, the
+ old other_pool cannot be cleaned up correctly and the new, correct one
+ cannot be created.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1692>
+
+2020-10-14 16:54:54 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvabasedec.c:
+ va: basedec: Unmap the src frame when mapping the dst frame fails.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1692>
+
+2020-10-13 15:28:24 +0800 He Junyan <junyan.he@intel.com>
+
+ * sys/va/gstvapool.c:
+ va: bufferpool: use release_buffer to clean up the memory.
+ The current bufferpool wastes all pre-allocated buffers when the
+ buffer pool is activated.
+ pool->priv->size is 0 for the va buffer pool. And every time,
+ reset_buffer() cleans all the memory and makes the buffer size 0, so that
+ the gst_buffer can be cached in the buffer pool.
+ But when the buffer pool is activated, default_start() just
+ allocates the buffers and release_buffer() immediately destroys all the
+ pre-allocated buffers and surfaces because
+ gst_buffer_get_size (buffer) != pool->priv->size.
+ We need to use release_buffer() to do the cleanup job at pool
+ start time.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1686> 2020-10-04 02:02:16 +0900 Seungha Yang <seungha@centricular.com> @@ -1374,7 +18352,47 @@ h265parse: Don't enable passthrough by default SEI messages contain various information which wouldn't be conveyed by using upstream CAPS (HDR, timecode for example). - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1694> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1639> + +2020-10-13 13:11:06 +0300 Vivia Nikolaidou <vivia@ahiru.eu> + + * docs/plugins/gst_plugins_cache.json: + * ext/opencv/gstcameracalibrate.cpp: + cameracalibrate: Improve gst-inspect documentation + Thanks to @kazz_naka on Twitter + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1691> + +2020-10-07 21:13:09 +1100 Matthew Waters <matthew@centricular.com> + + * ext/wpe/WPEThreadedView.cpp: + * ext/wpe/gstwpesrc.cpp: + wpesrc: add some debug logging around WPEView creation/destruction + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1663> + +2020-10-07 21:14:55 +1100 Matthew Waters <matthew@centricular.com> + + * ext/wpe/gstwpesrc.cpp: + wpesrc: fix a memory leak of the bytes + free the previous GBytes if load-bytes is called multiple times + before view creation. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1663> + +2020-10-06 22:19:21 +1100 Matthew Waters <matthew@centricular.com> + + * ext/wpe/gstwpesrc.cpp: + wpesrc: only create webview if not already created + e.g. _decide_allocation() can be called multiple times throughout the + element's lifetime and we only want to create the view once rather than + overwriting. + Fixes a leak of the WPEView under certain circumstances. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1663> + +2020-10-02 12:06:59 +1000 Matthew Waters <matthew@centricular.com> + + * ext/wpe/WPEThreadedView.cpp: + wpe: free a previous pending image/shm buffer + Don't blindly overwrite a possibly previously set buffer. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1663> 2020-10-12 14:15:49 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1383,7 +18401,7 @@ `delay` should be a GstClockTimeDiff since SRT time is int64_t. All values are in local time so we should never see a srctime that's in the future. If we do, clamp the delay to 0 and warn about it. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1690> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1674> 2020-10-12 14:12:24 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1394,7 +18412,7 @@ Delivery mode (TSBPDMODE) is not enabled"][1] so it may not apply to us, but it's best to be defensive. [1]: https://github.com/Haivision/srt/blob/v1.4.2/docs/API.md#sending-and-receiving - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1690> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1674> 2020-10-12 14:09:28 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1403,48 +18421,199 @@ If we don't have a clock, stop the source instead of asserting in gst_clock_get_time. This can happen when the element is removed from the pipeline while it's playing. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1690> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1674> -2020-10-12 11:50:59 +0100 Tim-Philipp Müller <tim@centricular.com> +2020-10-12 13:56:50 +0200 Marc Leeman <m.leeman@televic.com> - * docs/plugins/gst_plugins_cache.json: - * gst/rtmp2/gstrtmp2.c: - * gst/rtmp2/gstrtmp2sink.c: - rtmp2sink: don't expose stop-commands property in backported patches - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1628> + * gst/rtp/gstrtpsink.h: + rtpmanagerbad: remove duplicate parent declaration + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1689> -2020-08-19 14:51:17 +0300 Nazar Mokrynskyi <nazar@mokrynskyi.com> +2020-10-12 11:55:46 +0100 Tim-Philipp Müller <tim@centricular.com> - * docs/meson.build: * gst/rtmp2/gstrtmp2sink.c: - * gst/rtmp2/rtmp/rtmpclient.h: - rtmp2sink: add docs section with since marker on new stop-commands property - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1628> + rtmp2sink: fix since marker on new "stop-commands" property + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1687> -2020-08-18 19:16:40 +0300 Nazar Mokrynskyi <nazar@mokrynskyi.com> +2020-10-09 16:00:18 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * docs/plugins/gst_plugins_cache.json: - * gst/rtmp2/gstrtmp2sink.c: - rtmp2: fix code style, update documentation cache - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1628> + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvavp8dec.c: + va: basedec: copy frames logic to decide_allocation() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1673> -2020-08-18 14:05:26 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> +2020-10-09 
15:47:43 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * gst/rtmp2/gstrtmp2.c: - * gst/rtmp2/gstrtmp2sink.c: - * gst/rtmp2/rtmp/rtmpclient.c: - * gst/rtmp2/rtmp/rtmpclient.h: - rtmp2: Clean up (improve) GstRtmpStopCommands type - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1628> + * sys/va/gstvabasedec.c: + va: basedec: refactor the other video pool instantiation + Just a code clean up + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1673> -2020-05-02 04:49:42 +0300 Nazar Mokrynskyi <nazar@mokrynskyi.com> +2020-10-08 19:39:56 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * gst/rtmp2/gstrtmp2sink.c: - * gst/rtmp2/rtmp/rtmpclient.c: - * gst/rtmp2/rtmp/rtmpclient.h: - rtmp2sink: handle EOS event and close stream - https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1285 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1628> + * sys/va/gstvabasedec.c: + * sys/va/gstvabasedec.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/meson.build: + va: basedec: add gstvabasedec helper + This is a helper for all decoders. + It is not an abstract subclass, just merely a helper that avoids code + duplication among the decoders. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1673> + +2020-10-09 10:33:58 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavp8dec.c: + va: vp8dec: add element documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1673> + +2020-10-09 12:27:12 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: set video alignment definition earlier + This patch renames need_cropping variable to need_videoalign which is clearer + with its function. And now GstVideoAlignment is part of GstVaH264Dec structure, + so it can be set earlier. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1673> + +2020-10-05 16:40:55 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/dtls/gstdtlsconnection.c: + dtlsconnection: Ignore OpenSSL system call errors + OpenSSL shouldn't be making real system calls, so we can safely + ignore syscall errors. System interactions should happen through + our BIO. So especially don't look at the system's errno, as it + should be meaningless. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1656> + +2020-09-12 02:48:43 +0200 Jan Alexander Steffens (heftig) <heftig@archlinux.org> + + * tests/check/elements/svthevcenc.c: + tests: svthevcenc: Fix test_encode_simple + Pick the same I420 format the other test use. Without this, the source + picks AYUV64, which fails. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1573> + +2020-10-07 18:03:20 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvapool.c: + * sys/va/gstvapool.h: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavpp.c: + va: allocator: add _set_format() and _get_format() + Since allocators keep an available memory queue to reuse, video format and usage + hint are now persistant while allocator's memories are around. + This patch adds _set_format() and _get_format() for both VA allocators. + _set_format() validates if given format can be used or reused. If no allocated + surface previously it creates a dummy one to fetch its offsets and + strides. Updated info is returned to callee. + GstVaPool uses _set_format() at config to verify the allocator capacity and to + get the surfaces offsets and strides, which are going to be used by the video + meta. + Allocator extracted caps are compared with caps from config and if they have + different strides or offsets, force_videometa is set. 
+ A new bufferpool method gst_va_pool_requires_video_meta() is added to return
+ the value of force_videometa. This value is checked in order to know whether
+ decoders need to copy the surface if downstream doesn't announce video meta
+ support.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-08 14:10:41 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvacaps.c:
+ * sys/va/gstvacaps.h:
+ va: caps: added gst_caps_is_raw()
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-08 10:26:54 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvapool.c:
+ va: pool: call parent's start() method
+ Without preallocating buffers and memories, a deadlock in the pool allocator is
+ highly probable, since it might hit the case where a buffer is returned to the
+ pool but its memories are still held by a copy downstream, without other
+ preallocated buffers available.
+ This is kind of a hack, where buffer_reset() follows the normal path if it's
+ called from start().
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-07 16:18:30 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvapool.c:
+ va: pool: fix log's object
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-08 10:30:28 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvaallocator.c:
+ va: allocator: remove noisy log message
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-07 11:08:49 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvaallocator.c:
+ va: allocator: add a todo for dmabuf_memories_setup()
+ It would be nice to add a surface pool for this type of surface allocation in
+ order to have better control over them.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667> + +2020-10-07 10:16:27 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: add a surface counter + Every time a new surface is created the counter increases by one, and when it is + destroyed (or will be destroyed in case of GstVaAllocator), the counter is + decreased. Then, at allocator dispose, it is warning if the counter is not zero. + This counter will be also used to check if the allocator can change its + configuration if the counter is zero or can not. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667> + +2020-10-06 20:01:04 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: remove GstVideoInfo from GstVaBufferSurface + Don't store it them anymore since it is related with the negotiated stream and + not the concrete buffer. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667> + +2020-10-06 19:54:26 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvadecoder.c: + * sys/va/gstvavpp.c: + va: remove GstVideoInfo parameter from _get_surface() functions + There shouldn't be need to retrieve GstVideoInfo per buffer or memory since it + is the same for all the negotiated stream. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667> + +2020-10-06 19:40:16 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvavpp.c: + va: vpp: don't fetch video info from buffer + Instead of fetching video info from the buffer, use the already set ones. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-07 12:49:44 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvah264dec.c:
+ * sys/va/gstvavp8dec.c:
+ * sys/va/gstvavpp.c:
+ va: dec, vpp: don't get buffer size from allocators
+ Since the buffer size is now ignored by the bufferpool, there's no need to get
+ the value from the allocator.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>
+
+2020-10-04 11:14:38 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
+
+ * sys/va/gstvapool.c:
+ va: pool: ignore size in config
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1667>

2020-10-05 11:07:25 +0200 Guillaume Desmottes <guillaume.desmottes@collabora.com>

@@ -1457,7 +18626,266 @@ Decoders producing alternate, such as OMX on the Zynq, should change the interlace-mode on their output caps. Fix https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/825
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1683>
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1655>
+
+2020-10-09 10:24:50 +0200 Jacek Tomaszewski <lord.jacold@gmail.com>
+
+ * COPYING:
+ * COPYING.LIB:
+ Replace LGPL v2 with LGPL v2.1 in COPYING and remove COPYING.LIB
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1631>
+
+2020-10-02 11:42:07 +0200 Jacek Tomaszewski <lord.jacold@gmail.com>
+
+ * COPYING:
+ Replace GPL v2 with LGPL v2 in COPYING file
+ Fixes #1422
+ https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1422
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1631>
+
+2020-10-08 17:52:05 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com>
+
+ * ext/srt/gstsrtsink.c:
+ srt: Consume the error from gst_srt_object_write
+ Instead of leaking it.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1668> + +2020-10-08 17:48:20 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * ext/srt/gstsrtobject.c: + srt: Check socket state before retrieving payload size + The connection might be broken, which we should detect instead of just + aborting the write. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1669> + +2020-10-08 18:25:59 +0200 Jakub Adam <jakub.adam@collabora.com> + + * ext/x265/gstx265enc.c: + x265enc: fix deadlock on reconfig + Don't attempt to obtain encoder lock that is already held by + gst_x265_enc_encode_frame(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1671> + +2020-10-07 11:04:30 +0300 Sebastian Dröge <sebastian@centricular.com> + + * pkgconfig/gstreamer-webrtc-uninstalled.pc.in: + * pkgconfig/gstreamer-webrtc.pc.in: + webrtc: Require gstreamer-sdp in the pkg-config file + Some headers include API from it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1660> + +2020-10-06 11:45:36 +0200 Edward Hervey <edward@centricular.com> + + * ext/srt/gstsrtobject.c: + * ext/srt/gstsrtobject.h: + * ext/srt/gstsrtsrc.c: + * ext/srt/gstsrtsrc.h: + srtsrc: Fix timestamping + SRT provides the original timestamp of a packet (with drift/skew corrected for + local clock), which is what should be used for timestamping the outgoing + buffers. This ensures that we output the packets with the same timestamp (and by + extension rate) as the original feed. + Also detect if packets were dropped (by checking the sequence number) and + properly set DISCONT flag on the outgoing buffer. 
+ Finally, answer the latency queries. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1658> + +2020-10-07 05:05:25 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfsourcereader.cpp: + mfvideosrc: Use only the first video stream per device + A non-first video stream might not work with the current + implementation. It could be non-video (e.g., photo source) and then + ReadSample() might be blocked forever. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1661> + +2020-10-08 03:15:21 +0900 Seungha Yang <seungha@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * sys/decklink/gstdecklink.cpp: + * sys/decklink/gstdecklinkvideosink.cpp: + * sys/decklink/gstdecklinkvideosrc.cpp: + decklink: Update doc + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1665> + +2020-10-08 01:39:42 +0900 Seungha Yang <seungha@centricular.com> + + * sys/decklink/win/DeckLinkAPI.h: + * sys/decklink/win/DeckLinkAPI_i.c: + decklink: Update Windows headers with SDK 11.2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1665> + +2020-10-08 01:33:35 +0900 Seungha Yang <seungha@centricular.com> + + * sys/decklink/osx/DeckLinkAPI.h: + * sys/decklink/osx/DeckLinkAPIConfiguration.h: + * sys/decklink/osx/DeckLinkAPIDeckControl.h: + * sys/decklink/osx/DeckLinkAPIDiscovery.h: + * sys/decklink/osx/DeckLinkAPIDispatch.cpp: + * sys/decklink/osx/DeckLinkAPIModes.h: + * sys/decklink/osx/DeckLinkAPIStreaming.h: + * sys/decklink/osx/DeckLinkAPITypes.h: + * sys/decklink/osx/DeckLinkAPIVersion.h: + decklink: Update OSX headers with SDK 11.2 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1665> + +2019-06-25 11:51:32 +0200 Tim <Timothee.Autin@ifremer.fr> + + * sys/decklink/gstdecklink.cpp: + * sys/decklink/gstdecklink.h: + * sys/decklink/gstdecklinkvideosink.cpp: + * 
sys/decklink/gstdecklinkvideosink.h: + * sys/decklink/gstdecklinkvideosrc.cpp: + * sys/decklink/gstdecklinkvideosrc.h: + * sys/decklink/linux/DeckLinkAPI.h: + * sys/decklink/linux/DeckLinkAPIConfiguration.h: + * sys/decklink/linux/DeckLinkAPIDeckControl.h: + * sys/decklink/linux/DeckLinkAPIDiscovery.h: + * sys/decklink/linux/DeckLinkAPIDispatch.cpp: + * sys/decklink/linux/DeckLinkAPIModes.h: + * sys/decklink/linux/DeckLinkAPITypes.h: + * sys/decklink/linux/DeckLinkAPIVersion.h: + decklink: Updated DeckLink SDK to 11.2 to support DeckLink 8K Pro + Updated DeckLink SDK to version 11.2 in order to support newer cards like the DeckLink 8K Pro. + This required replacing the duplex property with a profile property. + Profile values can be the following: + - bmdProfileOneSubDeviceFullDuplex + - bmdProfileOneSubDeviceHalfDuplex + - bmdProfileTwoSubDevicesFullDuplex + - bmdProfileTwoSubDevicesHalfDuplex + - bmdProfileFourSubDevicesHalfDuplex + Fixes #987 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1665> + +2020-10-07 17:37:25 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfsourcereader.cpp: + mfvideosrc: Fix invalid memory access when outputting jpeg + Don't access unknown-dangerous-nonsense address + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1662> + +2020-10-08 18:50:12 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * gst-libs/gst/webrtc/rtptransceiver.h: + * gst-libs/gst/webrtc/webrtc_fwd.h: + Revert "webrtc: Save the media kind in the transceiver" + This reverts commit f54d8e99457996303b8477b1f3a710f0fabd1cc6. + It breaks the CI until the C# bindings are fixed. + +2020-10-08 18:49:57 +0300 Sebastian Dröge <sebastian@centricular.com> + + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + Revert "rtpsender: Add API to set the priority" + This reverts commit a8b287c76472c8d7fd38800807c482d020ff4a63. 
+ It breaks the CI until the C# bindings are fixed. + +2020-10-08 18:49:56 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/webrtctransceiver.h: + Revert "rtptransceiver: Store the SSRC of the current stream" + This reverts commit d1da271f255101dbe95a426d9f5065d300b53e5a. + It breaks the CI until the C# bindings are fixed. + +2020-10-08 18:49:55 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + Revert "webrtcbin: Remove unused function" + This reverts commit 39723dbe934186c11f7b2a2b04c0af7932a1509c. + It breaks the CI until the C# bindings are fixed. + +2020-10-08 18:49:54 +0300 Sebastian Dröge <sebastian@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + * ext/webrtc/gstwebrtcice.c: + * ext/webrtc/gstwebrtcice.h: + * ext/webrtc/meson.build: + * ext/webrtc/sctptransport.c: + * ext/webrtc/sctptransport.h: + * ext/webrtc/webrtctransceiver.c: + * ext/webrtc/webrtctransceiver.h: + Revert "webrtc: Set the DSCP markings based on the priority" + This reverts commit 8ba08598bbe51f3b1f063ae22605f9608865f16b. + It breaks the CI until the C# bindings are fixed. + +2020-10-08 18:49:53 +0300 Sebastian Dröge <sebastian@centricular.com> + + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/rtpsender.h: + * gst-libs/gst/webrtc/rtptransceiver.h: + Revert "webrtc: Document more objects" + This reverts commit ad68c6b1eb7c73c66dc9d1dbf1a8cc47fd489c61. + It breaks the CI until the C# bindings are fixed. + +2020-10-08 18:49:50 +0300 Sebastian Dröge <sebastian@centricular.com> + + * gst-libs/gst/webrtc/rtpsender.h: + * gst-libs/gst/webrtc/rtptransceiver.h: + Revert "webrtc: Add hotdoc style since tags" + This reverts commit 63a5fa818c31ecbe43891c077a38b6b162d73c28. + It breaks the CI until the C# bindings are fixed. 
+ +2020-10-06 16:52:48 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpsender.h: + * gst-libs/gst/webrtc/rtptransceiver.h: + webrtc: Add hotdoc style since tags + We're stuck having to add a separate comment for now. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> + +2020-10-02 21:38:00 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpreceiver.h: + * gst-libs/gst/webrtc/rtpsender.h: + * gst-libs/gst/webrtc/rtptransceiver.h: + webrtc: Document more objects + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> + +2020-07-08 17:24:36 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcbin.h: + * ext/webrtc/gstwebrtcice.c: + * ext/webrtc/gstwebrtcice.h: + * ext/webrtc/meson.build: + * ext/webrtc/sctptransport.c: + * ext/webrtc/sctptransport.h: + * ext/webrtc/webrtctransceiver.c: + * ext/webrtc/webrtctransceiver.h: + webrtc: Set the DSCP markings based on the priority + This matches how the WebRTC javascript API works and the Chrome implementation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> + +2020-07-09 13:45:20 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Remove unused function + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> + +2020-07-09 13:42:35 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/webrtctransceiver.h: + rtptransceiver: Store the SSRC of the current stream + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> + +2020-07-09 13:39:03 -0400 Olivier Crête <olivier.crete@collabora.com> + + * gst-libs/gst/webrtc/rtpsender.c: + * gst-libs/gst/webrtc/rtpsender.h: + rtpsender: Add API to set the priority + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> + +2020-07-08 19:13:33 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * gst-libs/gst/webrtc/rtptransceiver.h: + * gst-libs/gst/webrtc/webrtc_fwd.h: + webrtc: Save the media kind in the transceiver + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1425> 2020-10-06 13:39:23 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1465,7 +18893,7 @@ srt: Remove unused sa_family tracking Now that SRT no longer needs the family when creating the socket, this code has become useless. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1685> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1659> 2020-10-02 03:39:40 +0200 Niklas Hambüchen <mail@nh2.me> @@ -1476,20 +18904,20 @@ `srt_create_socket()` was added in https://github.com/Haivision/srt/commit/4b897ba92d34f1829a1c6e419eeab17f0763a0fc and srt `v1.3.0` is the first release that has it. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1685> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1659> 2020-10-01 17:31:13 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> * ext/srt/gstsrt.c: srt: Register a log handler - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1685> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1659> 2020-09-25 19:17:35 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> * ext/srt/gstsrtobject.c: srt: Avoid removing invalid sockets from the polls This would provoke error messages from SRT. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1685> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1659> 2020-09-25 19:08:17 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1498,7 +18926,7 @@ `srt_startup` can also return 1 if it was successful. Avoid warning in this case. Avoid a race when checking whether we need to call it at all. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1685> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1659> 2020-10-06 12:35:12 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1509,94 +18937,219 @@ This reverts the linger workaround in commit 84f8dbd932029220ee86154dd and extends srt_constant_params to support other types than int. 
[1]: https://github.com/Haivision/srt/blob/master/docs/APISocketOptions.md - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1685> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1659> -2020-10-07 11:04:30 +0300 Sebastian Dröge <sebastian@centricular.com> +2020-09-29 18:52:43 +0900 Seungha Yang <seungha@centricular.com> - * pkgconfig/gstreamer-webrtc-uninstalled.pc.in: - * pkgconfig/gstreamer-webrtc.pc.in: - webrtc: Require gstreamer-sdp in the pkg-config file - Some headers include API from it. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1678> + * sys/d3d11/gstd3d11upload.c: + d3d11upload: Allow passthrough for system memory + ... like how d3d11download and gl{upload,download} do. + This should've been part of the commit 9b72b04daddafb1c86cb6ab5923c593a70bc4166 + but I missed. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1645> -2020-10-06 11:45:36 +0200 Edward Hervey <edward@centricular.com> +2020-10-04 10:01:31 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * ext/srt/gstsrtobject.c: - * ext/srt/gstsrtobject.h: - * ext/srt/gstsrtsrc.c: - * ext/srt/gstsrtsrc.h: - srtsrc: Fix timestamping - SRT provides the original timestamp of a packet (with drift/skew corrected for - local clock), which is what should be used for timestamping the outgoing - buffers. This ensures that we output the packets with the same timestamp (and by - extension rate) as the original feed. - Also detect if packets were dropped (by checking the sequence number) and - properly set DISCONT flag on the outgoing buffer. - Finally answer the latency queries - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1677> + * sys/va/gstvaallocator.c: + va: allocator: refactor flush methods for both allocators + Since the logic is the same, it can be generalized in a single common + function. 
+ Also the methods run the common function with a lock and signal the + buffers' conditional. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-08 17:52:05 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> +2020-10-03 16:37:54 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * ext/srt/gstsrtsink.c: - srt: Consume the error from gst_srt_object_write - Instead of leaking it. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1681> + * sys/va/gstvaallocator.c: + va: allocator: refactor GstVaDmabufAllocator + Move code down to group it. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-05 16:40:55 -0400 Olivier Crête <olivier.crete@collabora.com> +2020-10-03 16:30:14 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * ext/dtls/gstdtlsconnection.c: - dtlsconnection: Ignore OpenSSL system call errors - OpenSSL shouldn't be making real system calls, so we can safely - ignore syscall errors. System interactions should happen through - our BIO. So especially don't look at the system's errno, as it - should be meaningless. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1684> + * sys/va/gstvaallocator.c: + va: allocator: refactor GstVaBufferSurface + Move code up and add namespace to methods, and rename + _creating_buffer_surface() to the canonical + gst_va_buffer_surface_new() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-09 10:24:50 +0200 Jacek Tomaszewski <lord.jacold@gmail.com> +2020-09-30 19:35:14 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * COPYING: - * COPYING.LIB: - Replace LGPL v2 with LGPL v2.1 in COPYING and remove COPYING.LIB - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1682> + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvapool.c: + va: implement pooled allocators + 1. Allocators don't implement memory free() methods since all the memories will + implement dispose() returning FALSE + 2. Memory/miniobject dispose() will act as memory release, enqueueing the + released memory + 3. A new allocator's method prepare_buffer() which queries the released memory + queue and will add the required memories to the buffer. + 4. Allocators add a GCond to synchronize dispose() and prepare_buffer() + 5. A new allocator's method flush() which will actually free the memories. + While the bufferpool will + 1. Remove all the memories at reset_buffer() + 2. Implement acquire_buffer() calling allocator's prepare_buffer() + 3. Implement flush_start() calling allocator's flush() + 4. start() is disabled since it pre-allocs buffers but also calls + our reset_buffer() which will drop the memories and later the + buffers are ditched, something we don't want. This approach avoids + buffer pre-allocation. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-02 11:42:07 +0200 Jacek Tomaszewski <lord.jacold@gmail.com> +2020-09-30 15:54:18 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * COPYING: - Replace GPL v2 with LGPL v2 in COPYING file - Fixes #1422 - https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1422 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1682> + * sys/va/gstvaallocator.c: + va: allocator: use gst_clear_object() for _buffer_surface_unref() + Even if this function is only used by gst_va_dmabuf_memories_setup(), it might + get reused later by GstVaDmabufAllocator's functions. This change makes the + function less fragile. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-08 17:48:20 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> +2020-09-30 15:53:39 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * ext/srt/gstsrtobject.c: - srt: Check socket state before retrieving payload size - The connection might be broken, which we should detect instead of just - aborting the write. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1680> + * sys/va/gstvaallocator.c: + va: allocator: renamed gst_va_dmabuf_memory_release() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-08 18:25:59 +0200 Jakub Adam <jakub.adam@collabora.com> +2020-09-30 15:48:12 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * ext/x265/gstx265enc.c: - x265enc: fix deadlock on reconfig - Don't attempt to obtain encoder lock that is already held by - gst_x265_enc_encode_frame(). 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1679> + * sys/va/gstvaallocator.c: + va: allocator: renamed available_mems queue + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-10-07 05:05:25 +0900 Seungha Yang <seungha@centricular.com> +2020-09-30 15:45:54 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * sys/mediafoundation/gstmfsourcereader.cpp: - mfvideosrc: Use only the first video stream per device - Non-first video stream might not be working with current - implementation. It could be non-video (e.g., photo source) and then - ReadSample() might be blocked forever. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1676> + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + * sys/va/gstvapool.c: + va: allocator: rename gst_va_dmabuf_allocator_setup_buffer() + Since it's related with GstVaDmabufAllocator. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> -2020-09-25 22:00:26 +0530 raghavendra <raghavendra.rao@collabora.com> +2020-09-29 15:03:11 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> - * ext/srt/gstsrtobject.c: - srtobject: typecast SRTO_LINGER to linger - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1675> + * sys/va/gstvaallocator.c: + va: allocator: calculated surface frame internally + Instead of using gst_buffer_get_size() just add the memory sizes reported by + exported fd. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> + +2020-09-28 16:59:44 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: make GstVaMemory shareable + Renamed the first variable member of GstVaMemory from parent to mem in + order to avoid confusion with GstMemory's parent. 
+ When freeing the structure, the memory's parent is checked in order to + decide whether the surface has to be destroyed or not, since only the parent + class has to destroy it. + Removed GST_MEMORY_FLAG_NO_SHARE in memory initialization, since it is + deprecated. + Implemented the allocator's share virtual method, which creates a new + shallow GstVaMemory structure based on the passed one, which will be + its parent. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> + +2020-09-28 16:50:16 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: remove copy method for GstVaMemory + Since the memory has to be shareable. That will be addressed in the next + commits. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> + +2020-09-24 17:32:47 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: remove va allocator mem_is_span() vmethod + Since it is the default in the base class. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1626> + +2020-10-01 03:47:13 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11download.c: + * sys/d3d11/gstd3d11upload.c: + * sys/d3d11/gstd3d11utils.c: + * sys/d3d11/gstd3d11utils.h: + d3d11upload/d3d11download: Make use of staging buffer + ... instead of direct cpu map for d3d11memory object. In this way, + we don't need a per-GstD3D11Memory staging texture. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1627> + +2020-10-03 18:53:46 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11bufferpool.c: + * sys/d3d11/gstd3d11memory.c: + * sys/d3d11/gstd3d11memory.h: + * sys/d3d11/gstd3d11utils.c: + * sys/d3d11/gstd3d11utils.h: + d3d11: Don't hold staging texture + Staging texture is used for memory transfer between system and + gpu memory. 
Apart from d3d11{upload,download} elements, however, + it should happen very rarely. + Before this commit, d3d11bufferpool was allocating at least one + staging texture in order to calculate the cpu accessible memory size, + and it was kept unconditionally for later use of the texture. + But this increases system memory usage. Although the GstD3D11Memory + object is implemented to support CPU access, most memory + transfer will happen in d3d11{upload,download} elements. + By this commit, the initial staging texture will be freed immediately + once the cpu accessible memory size is calculated. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1627> + +2020-09-26 03:27:39 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11_fwd.h: + * sys/d3d11/gstd3d11basefilter.h: + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11colorconvert.h: + * sys/d3d11/gstd3d11decoder.h: + * sys/d3d11/gstd3d11download.c: + * sys/d3d11/gstd3d11download.h: + * sys/d3d11/gstd3d11upload.c: + * sys/d3d11/gstd3d11upload.h: + * sys/d3d11/gstd3d11videosink.c: + * sys/d3d11/gstd3d11videosink.h: + * sys/d3d11/gstd3d11videosinkbin.c: + * sys/d3d11/gstd3d11videosinkbin.h: + d3d11: Move to G_DECLARE_FINAL_TYPE + ... and remove unnecessary forward declaration. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1627> + +2020-10-04 16:33:47 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvacaps.c: + * sys/va/gstvacaps.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavpp.c: + va: caps: centralize caps feature discovering + These functions were repeated in the different implemented + elements. This patch centralizes them. 
+ The side effect is that the dmabuf memory type is no longer checked against the + current VAContext, but assuming that dmabuf is a consequence of caps + negotiation from dynamically generated caps templates, where the context's + memory types are validated, there's no need to validate them twice. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1644> + +2020-10-04 12:43:35 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + va: filter: fix counter variable reuse + There was a bug reusing the counter variable i in nested loops. Also + the patch makes the code cleaner. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1643> + +2019-10-02 11:17:09 +0200 Lars Lundqvist <larslu@axis.com> + + * ext/curl/gstcurlbasesink.c: + curlbasesink: Add curl seek callback + Adding functionality to handle SEEK_SET enables rewinding of sent data. + In the HTTP case, this happens after an HTTP 401 has been received from + the other end. This will result in the sent data being resent. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1616> 2020-09-29 15:52:21 +0300 Sebastian Dröge <sebastian@centricular.com> @@ -1604,7 +19157,7 @@ * sys/decklink/gstdecklink.cpp: decklink: Correctly order the different dependent mode tables One was forgotten in 309f6187fef890c7ffa49305f38e89beac3b1423. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1652> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1617> 2020-09-19 00:26:35 +0900 Seungha Yang <seungha@centricular.com> @@ -1627,7 +19180,7 @@ when loopback capture client is running with event-driven mode. To work around the bug, event signalling should be handled manually for read thread to wake up. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1651> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1588> 2020-09-29 23:46:00 +1000 Matthew Waters <matthew@centricular.com> @@ -1636,7 +19189,7 @@ Always chain up to the parent _stop() implementation as it unrefs some caps (among other things). Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1409 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1650> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1618> 2020-06-16 11:16:37 +0900 Hosang Lee <hosang10.lee@lge.com> @@ -1656,7 +19209,112 @@ size error will occur. Clear out the live adapter on seek so that no unnecessary remaining data is pushed out together with the new fragment after seeking. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1649> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1345> + +2020-09-30 10:47:45 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/plugin.c: + va: simplify VPP detection + Also, the previous code failed when VPP was not present, blacklisting the + plugin. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1619> + +2020-06-15 15:24:07 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdkallocator.h: + * sys/msdk/gstmsdkallocator_libva.c: + * sys/msdk/gstmsdkcontext.h: + * sys/msdk/gstmsdkvideomemory.c: + msdk: call vaExportSurfaceHandle() to get DMABuf FD + Compared to vaAcquireBufferHandle(), vaExportSurfaceHandle() may + provide the handle details, so we needn't call vaDeriveImage(). 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1353> + +2020-07-01 09:03:21 -0700 Ederson de Souza <ederson.desouza@intel.com> + + * ext/avtp/gstavtpcvfpay.c: + * tests/check/elements/avtpcrfcheck.c: + * tests/check/elements/avtpcrfsync.c: + * tests/check/elements/avtpcvfdepay.c: + tests/avtp: Fix coverity issues + Fixes sign extension issues, unchecked return values and some constant + expression results. + CID: 1465073, 1465074, 1465075, 1465076, 1465077 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1398> + +2020-07-08 09:08:31 -0700 Ederson de Souza <ederson.desouza@intel.com> + + * ext/avtp/gstavtpcvfdepay.c: + * ext/avtp/gstavtpcvfpay.c: + * ext/avtp/gstavtpsrc.c: + avtp: Change "%lu" for G_GUINT64_FORMAT + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1398> + +2020-09-25 22:00:26 +0530 raghavendra <raghavendra.rao@collabora.com> + + * ext/srt/gstsrtobject.c: + srtobject: typecast SRTO_LINGER to linger + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1615> + +2020-09-24 01:24:40 +0800 He Junyan <junyan.he@intel.com> + + * gst-libs/gst/codecparsers/gstvp8parser.c: + codecparsers: vp8parser: clear the frame_hdr before parsing. + An uninitialized frame_hdr may contain garbage and produce wrong + results after the parsing process. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1587> + +2020-09-20 23:29:00 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvah264dec.c: + * sys/va/gstvavp8dec.c: + * sys/va/gstvavp8dec.h: + * sys/va/meson.build: + * sys/va/plugin.c: + va: Implement the VA vp8 decoder. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1587> + +2020-09-21 23:08:05 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvaallocator.h: + * sys/va/gstvadecoder.h: + * sys/va/gstvautils.h: + va: codestyle: Clear all tabs in header files + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1587> + +2020-09-21 12:51:53 +0800 He Junyan <junyan.he@intel.com> + + * sys/va/gstvadecoder.c: + * sys/va/gstvadecoder.h: + * sys/va/gstvah264dec.c: + * sys/va/gstvautils.h: + va: util: make the _format_changed a common decoder function. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1587> + +2020-09-24 12:36:26 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: allocate output buffers according to DPB size + Instead of allocating the maximal number of references for output + buffers, this patch reduces the memory footprint in many cases by + just allocating the output buffers required for the DPB. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1595> + +2020-09-24 12:29:49 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadecoder.c: + * sys/va/gstvadecoder.h: + * sys/va/gstvah264dec.c: + va: decoder: store output buffer rather than surface + GstVaDecodePicture stored the processed VASurfaceID, under the + assumption that the bufferpool will keep the referenced buffers, but + this approach is fragile. + This patch changes GstVaDecodePicture to store the output buffer, + which already contains its VASurfaceID, and provides a new method to + retrieve the VASurfaceID directly from the picture. 
+ Based on He Junyan's <junyan.he@intel.com> patches for + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1587 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1594> 2020-09-23 20:08:46 +1000 Matthew Waters <matthew@centricular.com> @@ -1664,7 +19322,155 @@ vtdec/vulkan: use Shared storage mode for IOSurface textures Fixes a debug assertion with i(Pad)OS 14: 'IOSurface textures must use MTLStorageModeShared' - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1648> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1592> + +2020-09-23 17:04:55 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/mpegtsmux/gstbasetsmux.c: + mpegtsmux: Restore intervals when creating TsMux + Otherwise the settings from the properties would be overwritten with + the defaults. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1593> + +2020-09-19 14:26:42 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvavpp.c: + * sys/va/plugin.c: + va: vpp: global lock to handle shared buffers + Add a global mutex for exclusive access to shared stream buffers, such + as DMABufs or VASurfaces after a tee: + LIBVA_DRIVER_NAME=iHD \ + gst-launch-1.0 v4l2src ! tee name=t t. ! queue ! \ + vapostproc skin-tone=9 ! xvimagesink \ + t. ! queue ! vapostproc ! 
xvimagesink + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-16 09:18:11 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvafilter.c: + * sys/va/gstvafilter.h: + * sys/va/gstvavpp.c: + * sys/va/gstvavpp.h: + * sys/va/meson.build: + * sys/va/plugin.c: + va: add vapostproc element + Video postprocessor for VA-API + Functionalities: resize frames, change format, import buffers, apply + filters (such as denoise, sharpen, orientation, if the driver offers them). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-20 13:49:33 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: export composed layer for YUY2 and UYVY + This is a result of an error reported by the i965 driver, which can only + export a composed layer for these formats. This seems to work too with + iHD. These formats are not exposed as native surfaces in Gallium. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-19 16:52:10 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvaallocator.h: + va: allocator: add gst_va_dmabuf_buffer_setup() + This function will take an array of DMABuf GstMemory and an array of + fds, and create a VASurfaceID with those fds. Later that VASurfaceID is + attached to each DMABuf through GstVaBufferSurface. + In order to free the surface, GstVaBufferSurface now has a GstVaDisplay + member, and _buffer_surface_unref() was added. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-19 16:48:39 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: try to get VASurfaceID from every DMABuf allocator + Relax the check of the allocator type, because now the qdata can be + attached for other DMABuf allocators.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-09-12 13:10:18 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + * sys/va/gstvavideoformat.c: + va: allocator: try to create surface without fourcc but chroma only + There are, in VPP, surfaces that don't support 4:2:2 fourccs but do + support the chroma. So this patch gives that opportunity to the + driver. + This patch also simplifies + gst_va_video_surface_format_from_image_format() to just an iterator + over the surfaces' available formats. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-19 16:45:49 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvaallocator.c: + va: allocator: create surfaces with VASurfaceAttribExternalBuffers + Add a new parameter to _create_surfaces(): a pointer to + VASurfaceAttribExternalBuffers. + If it's defined, the memory type is changed to DRM_PRIME and a new item is + added to the VASurfaceAttrib array with + VASurfaceAttribExternalBufferDescriptor. + Also, the VASurfaceAttrib for pixel format is not mandatory anymore. If the fourcc + parameter is 0, it is not added to the array, relying on the chroma. This is + useful when creating surfaces for uploading or downloading images.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-20 13:46:12 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvapool.c: + va: pool: use gst_object_replace() for allocator + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-09-16 19:14:30 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadevice.c: + va: device: use gst_clear_object() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-08-18 19:12:46 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvacaps.c: + * sys/va/gstvacaps.h: + va: caps: expose gst_caps_set_format_array() + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1529> + +2020-09-22 19:59:41 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: vah264dec: fix documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1591> + +2020-09-23 10:58:43 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264: unref leaked caps + Unref a leaked caps at set_latency(). + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1590> + +2020-07-20 10:13:13 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/gstmsdk.c: + * sys/msdk/gstmsdkav1dec.c: + * sys/msdk/gstmsdkav1dec.h: + * sys/msdk/meson.build: + msdk: add support for AV1 decoding + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1448> + +2020-09-19 21:43:24 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvautils.c: + va: utils: use GstObject for GstVaDisplay in context + Thus an application can fetch the GstVaDisplay through the sync bus + without knowing the specific implementation, and share it or + extract properties.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1586> + +2020-09-19 21:36:58 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvautils.c: + va: utils: fix code style and wrong log message + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1586> 2020-09-14 09:48:48 +0100 Philippe Normand <philn@igalia.com> @@ -1672,7 +19478,7 @@ wpe: Plug event leak Handled events don't go through the default pad event handler, so they need to be unreffed in this case. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1647> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1568> 2020-08-25 01:57:55 +1000 Jan Schmidt <jan@centricular.com> @@ -1683,14 +19489,14 @@ leading to a crash. This commit fixes the crash, but not the underlying failure - a 2nd wpesrc can still error out instead. Partially fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1386 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1647> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1568> 2020-09-11 12:51:56 +0100 Philippe Normand <philn@igalia.com> * ext/wpe/WPEThreadedView.cpp: wpe: Plug SHM buffer leaks Fixes #1409 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1647> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1568> 2020-09-10 14:39:58 +0100 Philippe Normand <philn@igalia.com> @@ -1702,7 +19508,32 @@ multiple wpesrc elements are created in sequence. Without this patch the first view might receive erroneous buffer notifications. 
Fixes #1386 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1647> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1568> + +2020-09-11 18:17:20 +0530 Sanchayan Maity <sanchayan@asymptotic.io> + + * docs/plugins/gst_plugins_cache.json: + * gst/audiobuffersplit/gstaudiobuffersplit.c: + * gst/audiobuffersplit/gstaudiobuffersplit.h: + audiobuffersplit: Add support for specifying output buffer size + Currently for buffer splitting only output duration can be specified. + Allow specifying a buffer size in bytes for splitting. + Consider a use case of the below pipeline + appsrc ! rtpL16pay ! capsfilter ! rtpbin ! udpsink + Maintaining MTU for RTP transfer is desirable, but in a scenario + where the buffers being pushed to appsrc do not adhere to this, + an audiobuffersplit element placed between appsrc and rtpL16pay + with an output buffer size specified considering the MTU can help + mitigate this. + While rtpL16pay already has an MTU setting, in a case where an + incoming buffer has a size close to the MTU, e.g. with an MTU of + 1280, a buffer of size 1276 bytes would be split into two buffers, + one of 1268 and another of 8 bytes, considering an RTP header size of + 12 bytes. Putting audiobuffersplit between appsrc and rtpL16pay + can take care of this. + While buffer duration could still be used, being able to specify + the size in bytes is helpful here. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1578> 2020-09-10 13:05:23 +0100 Philippe Normand <philn@igalia.com> @@ -1711,7 +19542,71 @@ The load-failed and load-failed-with-tls-errors signals expect distinct callback signatures.
Fixes #1388 - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1646> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1566> + +2020-09-20 08:31:23 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: add documentation + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1585> + +2019-12-10 19:54:43 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/msdk.c: + msdk: enable GPUCopy + Note it works for system memory only + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/910> + +2019-12-10 19:47:03 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * sys/msdk/msdk.c: + msdk: call MFXInitEx instead of MFXInit + MFXInitEx has more control than MFXInit. The current setting in this + commit is identical to MFXInit + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/910> + +2020-09-19 14:08:46 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadecoder.c: + va: decoder: render picture only if data + Call vaRenderPicture() only if buffer or slice data is available. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1584> + +2020-09-17 19:54:28 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvadecoder.c: + va: decoder: warn if decode fails on a surface + Instead of logging an error if a step fails, it logs a warning message, + reducing the noise and obeying the rule for errors, since the program + doesn't end + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1584> + +2020-09-17 19:52:29 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + va: h264dec: check if pad has fixed caps at caps query + Otherwise it will always reply with the possible driver caps, which + generates problems with Web MSE players.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1583> + +2020-09-19 05:39:32 +0900 Seungha Yang <seungha@centricular.com> + + * sys/va/gstvah264dec.c: + va: h264dec: Don't need to set pts/dts/duration on outputting frame + It will be handled by videodecoder baseclass + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1582> + +2020-03-25 20:50:01 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/gstwebrtcice.c: + * ext/webrtc/meson.build: + webrtcbin: Accept end-of-candidate and pass it to libnice + libnice now supports the concept of end-of-candidate, so use the API + for it. This also means that if you don't do that, the webrtcbin will + never declare the connection as failed. + This requires bumping the dependency to libnice 0.1.16 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1139> 2020-09-17 17:39:25 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> @@ -1724,7 +19619,34 @@ GstVideoInfo when setting the src rectangle. This fixes an issue with 1080p video displaying repeated or green at the padded bottom 8 lines (seen with v4l2codecs).
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1642> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1580> + +2020-09-18 01:41:35 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11colorconvert.c: + * sys/d3d11/gstd3d11colorconverter.c: + * sys/d3d11/gstd3d11device.c: + * sys/d3d11/gstd3d11download.c: + * sys/d3d11/gstd3d11format.c: + * sys/d3d11/gstd3d11format.h: + * sys/d3d11/gstd3d11memory.c: + * sys/d3d11/gstd3d11shader.c: + * sys/d3d11/gstd3d11upload.c: + * sys/d3d11/gstd3d11videosink.c: + * sys/d3d11/gstd3d11videosinkbin.c: + * tests/check/elements/d3d11colorconvert.c: + d3d11: Add support for packed 8bits 4:2:2 YUV formats + Note that newly added formats (YUY2, UYVY, and VYUY) are not supported + render target view formats. So such formats can be only input of d3d11convert + or d3d11videosink. Another note is that YUY2 format is a very common + format for hardware en/decoders on Windows. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1581> + +2020-08-26 17:48:06 -0400 Olivier Crête <olivier.crete@collabora.com> + + * ext/webrtc/gstwebrtcbin.c: + webrtcbin: Merge the RTX SSRCs from all transceivers when bundling + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1545> 2020-09-15 17:09:57 +0200 Marian Cichy <m.cichy@pengutronix.de> @@ -1735,7 +19657,176 @@ As 64-bit values for rate, depth, format, channels does not make much sense and since any other functionality in gstreamer expects G_TYPE_INT for channels and rate, we should stick to that - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1641> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1576> + +2020-06-09 10:10:12 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + * gst-libs/gst/codecparsers/gsth265parser.h: + * gst/videoparsers/gsth265parse.c: + * tests/check/libs/h265parser.c: + h265parse: recognize more HEVC extension streams + There are streams which have the right general_profile_idc and + general_profile_compatibility_flag, but don't have the right extension + flags. We may try to use chroma_format_idc and bit_depth to + recognize these streams. + e.g. 
+ https://www.itu.int/wftp3/av-arch/jctvc-site/bitstream_exchange/draft_conformance/SCC/IBF_Disabled_A_MediaTek_2.zip + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1328> + +2020-06-05 13:01:06 +0800 Haihao Xiang <haihao.xiang@intel.com> + + * gst-libs/gst/codecparsers/gsth265parser.c: + * tests/check/libs/h265parser.c: + h265parser: select the right profile for high throughput SCC stream + Currently screen-extended-high-throughput-444 is recognized as + screen-extended-main-444, screen-extended-high-throughput-444-10 is + recognized as screen-extended-main-444-10 because they have the same + extension flags, so without this patch, it is possible that a decoder + which supports SCC but doesn't support throughput SCC will try to decode + a throughput SCC stream. + e.g. + https://www.itu.int/wftp3/av-arch/jctvc-site/bitstream_exchange/draft_conformance/SCC/HT_A_SCC_Apple_2.zip + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1328> + +2020-09-08 14:43:49 +0800 Randy Li (ayaka) <ayaka@soulik.info> + + * sys/msdk/gstmsdkvpp.c: + msdk: vpp: fixup passthrough checking for DMA + I think it is just a typo from e1a90f1ec9 + msdkvpp: Disable passthrough if memory capsfeature changes + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1559> + +2020-09-15 17:51:51 +0200 Emmanuel Gil Peyrot <linkmauve@linkmauve.fr> + + * ext/wayland/wlshmallocator.c: + * meson.build: + waylandsink: Use memfd_create() when available + This (so-far) Linux- and FreeBSD-only API lets users create file + descriptors purely in memory, without any backing file on the filesystem + and the race condition which could ensue when unlink()ing it. + It also allows seals to be placed on the file, ensuring to every other + process that we won’t be allowed to shrink the contents, potentially + causing a SIGBUS when they try reading it. + This patch is best viewed with the -w option of git log -p. 
+ It is an almost exact copy of Wayland commit + 6908c8c85a2e33e5654f64a55cd4f847bf385cae, see + https://gitlab.freedesktop.org/wayland/wayland/merge_requests/4 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1577> + +2020-09-10 21:19:43 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + codecs: vp9decoder: Relying on upstream vp9parse for super frame handling + By this way, we can simplify the decoding flow. Moreover, we don't + need to worry about the case where multiple visible-frames are + composed in one super-frame, since upstream vp9parse will split + them per frame unit. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1567> + +2020-09-10 21:42:49 +0900 Seungha Yang <seungha@centricular.com> + + * gst-libs/gst/codecs/gstvp9decoder.c: + * gst-libs/gst/codecs/gstvp9picture.c: + * gst-libs/gst/codecs/gstvp9picture.h: + codecs: vp9decoder: Remove unused pts variable + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1567> + +2020-09-12 00:12:03 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.c: + d3d11vp9dec: Don't need to consider output_picture() call without GstVideoCodecFrame + Baseclass will be updated in order to ensure GstVideoCodecFrame. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1567> + +2020-09-10 20:32:13 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11vp9dec.c: + d3d11vp9dec: Specify profile and alignment on sink template + Set supported profile(s) on sink template caps, so that decodebin + can filter out this element if profile of given vp9 stream is not + supported by hardware decoder. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1567> + +2020-09-10 21:31:38 +0900 Seungha Yang <seungha@centricular.com> + + * sys/d3d11/gstd3d11h264dec.c: + * sys/d3d11/gstd3d11h265dec.c: + * sys/d3d11/gstd3d11vp8dec.c: + * sys/d3d11/gstd3d11vp9dec.c: + d3d11decoder: Cleanup code + * Don't need to set pts/dts/duration on output buffer of frame. + it's handled by baseclass + * Remove meaningless debug output + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1567> + +2020-08-22 12:44:16 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + codecs: h264decoder: Calculate and set latency + Add gst_h264_decoder_set_latency(), which calculates and sets + latency on base decoder class, after new_sequence is called. + This assumes that in new_sequence() vmethod, callee negotiates + downstream caps. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1531> + +2020-08-22 12:47:23 +0200 Víctor Manuel Jáquez Leal <vjaquez@igalia.com> + + * sys/va/gstvah264dec.c: + Revert "va: h264dec: set latency" + This reverts commit 3aedef4c8601dcafb065d8095a927f1cd528056f. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1531> + +2020-09-14 14:42:36 +0300 Jordan Petridis <jordan@centricular.com> + + * tests/check/gst-plugins-bad.supp: + validate: plug leak in gssdp + These are triggered by the webrtcbin tests + https://gitlab.gnome.org/GNOME/gssdp/-/issues/10 + +2020-09-04 23:34:16 +0800 yychao <yychao@realtek.com> + + * gst-libs/gst/mpegts/gst-dvb-descriptor.c: + * gst-libs/gst/mpegts/gst-dvb-descriptor.h: + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Parse Audio Preselection Descriptor + For Dolby AC4 audio experience, parsing PMTs/APD from transport stream layer for all available presentations. + Refer to ETSI EN 300 468 V1.16.1 (2019-05) + 1. 6.4.1 Audio preselection descriptor + 2. 
Table M.1: Mapping of codec specific values to the audio preselection descriptor + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1555> + +2020-09-04 23:28:58 +0800 yychao <yychao@realtek.com> + + * gst-libs/gst/mpegts/gstmpegtsdescriptor.c: + * gst-libs/gst/mpegts/gstmpegtsdescriptor.h: + * gst/mpegtsdemux/mpegtsbase.c: + * gst/mpegtsdemux/mpegtsbase.h: + * gst/mpegtsdemux/tsdemux.c: + tsdemux: Add new API for fetching extended descriptors + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1555> + +2020-08-26 15:45:35 +1000 Matthew Waters <matthew@centricular.com> + + * ext/webrtc/gstwebrtcbin.c: + * ext/webrtc/utils.h: + * ext/webrtc/webrtcsdp.c: + * ext/webrtc/webrtcsdp.h: + * tests/check/elements/webrtcbin.c: + webrtc: propagate more errors through the promise + Return errors on promises when things fail where available. + Things like parsing errors, invalid states, missing fields, unsupported + transitions, etc. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1565> + +2020-07-01 07:35:08 +0530 Nirbheek Chauhan <nirbheek@centricular.com> + + * gst-libs/gst/vulkan/meson.build: + meson: Do not warn when a windowing system is not found + Error out when the vulkan option is enabled, and just print + a message() otherwise. This is more correct and also allows us to pass + --fatal-meson-warnings more reliably. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1391> 2020-09-10 10:35:11 -0700 Adam Williamson <awilliam@redhat.com> @@ -1745,7 +19836,78 @@ be set even if the option is disabled. 
Fixes https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1406 Signed-off-by: Adam Williamson <awilliam@redhat.com> - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1638> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1570> + +2020-09-10 23:12:10 +0200 Mathieu Duponchelle <mathieu@centricular.com> + + * ext/openh264/gstopenh264dec.cpp: + openh264dec: port to new request_sync_point() API + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1571> + +2020-07-28 18:32:03 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * gst-libs/gst/codecs/gsth264decoder.c: + * gst-libs/gst/codecs/gsth264decoder.h: + * gst-libs/gst/codecs/gsth264picture.c: + * gst-libs/gst/codecs/gsth264picture.h: + h264decoder: Fix various typos + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1563> + +2020-07-28 18:39:52 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2slh264dec: Minor cleanup + Move a few variables into their respective scopes. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1563> + +2020-07-22 15:40:14 -0400 Nicolas Dufresne <nicolas.dufresne@collabora.com> + + * sys/v4l2codecs/gstv4l2codech264dec.c: + v4l2slh264dec: Fix B-Frame weight table + We were not setting the luma l1 weight table.
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1563> + +2020-09-09 21:38:33 +0900 Seungha Yang <seungha@centricular.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/videoparsers/gstvp9parse.c: + * gst/videoparsers/gstvp9parse.h: + * gst/videoparsers/meson.build: + * gst/videoparsers/plugin.c: + * tests/check/elements/vp9parse.c: + * tests/check/elements/vp9parse.h: + * tests/check/meson.build: + videoparsers: Add vp9parse element + Adding vp9parse element to parse various stream information such as + resolution, profile, and so on. If upstream does not provide resolution and/or + profile, this would be useful for decodebin pipeline for autoplugging + suitable decoder element depending on template caps of each decoder element. + In addition, vp9parse element supports unpacking superframe into + single frame for decoders. The vp9 superframe is a frame which consists + of multiple frames (or superframe with one frame is allowed) followed by superframe + index block. Then unpacked each frame will be considered as normal frame + by decoder. The decision for unpacking will be done by downstream element's + "alignment" caps field, which can be "super-frame" or "frame". + If downstream specifies the "alignment" as "frame", + then vp9parse element will split an incoming superframe into single frames + and the superframe index (located at the end of the superframe) data + will be discarded by vp9parse element. 
+ Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1041> + +2020-06-16 21:09:36 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfcapturewinrt.cpp: + * sys/mediafoundation/gstmfsourceobject.c: + * sys/mediafoundation/gstmfsourceobject.h: + * sys/mediafoundation/gstmfsourcereader.cpp: + * sys/mediafoundation/gstmfvideosrc.c: + * sys/mediafoundation/mediacapturewrapper.cpp: + * sys/mediafoundation/mediacapturewrapper.h: + mfvideosrc: Set timestamp on buffer when it's captured + Capture the timestamp immediately when new frame is arrived, + instead of doing that on ::create() method. There would be + time gap between captured time and outputting time. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1351> 2020-09-04 16:53:03 +0200 Mathieu Duponchelle <mathieu@centricular.com> @@ -1755,7 +19917,7 @@ * tests/check/elements/line21.c: line21enc: add remove-caption-meta property Similar to #GstCCExtractor:remove-caption-meta - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1637> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1554> 2020-09-04 02:38:58 +0200 Mathieu Duponchelle <mathieu@centricular.com> @@ -1767,13 +19929,13 @@ with: * height == 525 (standard NTSC, line 21 / 22) * height == 486 (NTSC usable lines + 6 lines for VBI, line 1 / 2) - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1637> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1554> 2020-09-04 02:33:52 +0200 Mathieu Duponchelle <mathieu@centricular.com> * ext/closedcaption/gstline21enc.c: line21enc: add support for CDP closed caption meta - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1637> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1554> 2020-08-27 11:23:01 +0200 Edward Hervey 
<edward@centricular.com> @@ -1782,7 +19944,7 @@ Some HTTP servers don't provide fragment sizes (with the Content-Length HTTP header). In order to still figure out a nominal bitrate (for usage by queue2), calculate on when we're done downloading a fragment. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1635> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1544> 2020-08-27 11:18:56 +0200 Edward Hervey <edward@centricular.com> @@ -1793,7 +19955,21 @@ Content-Length we also need to check whether it was valid when calculating bitrates. Avoids returning completely bogus bitrates with gogol's video streaming services - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1635> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1544> + +2020-08-12 20:31:32 +0900 Seungha Yang <seungha@centricular.com> + + * sys/mediafoundation/gstmfvideobuffer.cpp: + * sys/mediafoundation/gstmfvideobuffer.h: + * sys/mediafoundation/gstmfvideoenc.cpp: + * sys/mediafoundation/meson.build: + * sys/mediafoundation/plugin.c: + mfvideoenc: Add support for zero-copy encoding + Add custom IMFMediaBuffer and IMF2DBuffer implementation in order to + keep track of lifecycle of Media Foundation memory object. + By this new implementation, we can pass raw memory of upstream buffer + to Media Foundation without copy. + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1518> 2020-09-01 13:28:44 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1805,7 +19981,40 @@ not a good idea, as the latter can get blocked on the streaming thread. Have get_stats read the values directly, adding a lock to ensure we don't read garbage. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1629> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1550> + +2020-08-19 14:51:17 +0300 Nazar Mokrynskyi <nazar@mokrynskyi.com> + + * docs/meson.build: + * gst/rtmp2/gstrtmp2sink.c: + * gst/rtmp2/rtmp/rtmpclient.h: + rtmp2sink: add docs section with since marker on new stop-commands property + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1256> + +2020-08-18 19:16:40 +0300 Nazar Mokrynskyi <nazar@mokrynskyi.com> + + * docs/plugins/gst_plugins_cache.json: + * gst/rtmp2/gstrtmp2sink.c: + rtmp2: fix code style, update documentation cache + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1256> + +2020-08-18 14:05:26 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> + + * gst/rtmp2/gstrtmp2.c: + * gst/rtmp2/gstrtmp2sink.c: + * gst/rtmp2/rtmp/rtmpclient.c: + * gst/rtmp2/rtmp/rtmpclient.h: + rtmp2: Clean up (improve) GstRtmpStopCommands type + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1256> + +2020-05-02 04:49:42 +0300 Nazar Mokrynskyi <nazar@mokrynskyi.com> + + * gst/rtmp2/gstrtmp2sink.c: + * gst/rtmp2/rtmp/rtmpclient.c: + * gst/rtmp2/rtmp/rtmpclient.h: + rtmp2sink: handle EOS event and close stream + https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1285 + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1256> 2020-09-02 15:29:49 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1816,7 +20025,7 @@ Use an iterator when the code isn't simple to avoid deadlock. When we find the best pad, take a reference so a concurrent pad release doesn't destroy the pad before we're done with it. 
- Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1625> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1553> 2020-09-01 13:29:30 +0200 Jan Alexander Steffens (heftig) <jan.steffens@ltnglobal.com> @@ -1825,7 +20034,7 @@ It was looking at the "outer" peer of the ghost pad, not the "inner" peer (the target). It provided the wrong pad to gst_element_release_request_pad. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1623> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1551> 2020-09-08 10:48:56 +0200 Edward Hervey <edward@centricular.com> @@ -1838,16 +20047,14 @@ on the purpose of streams based on their PID. Therefore, when requesting a pad with a specific PID, make sure it is not a restricted PID. - Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1620> + Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1561> -2020-09-14 14:42:36 +0300 Jordan Petridis <jordan@centricular.com> +2020-09-08 17:30:42 +0100 Tim-Philipp Müller <tim@centricular.com> - * tests/check/gst-plugins-bad.supp: - validate: plug leak in gssdp - These are triggered by the webrtcbin tests - https://gitlab.gnome.org/GNOME/gssdp/-/issues/10 + * .gitlab-ci.yml: + ci: include template from gst-ci master branch again -2020-09-08 17:44:29 +0100 Tim-Philipp Müller <tim@centricular.com> +2020-09-08 16:58:50 +0100 Tim-Philipp Müller <tim@centricular.com> * meson.build: Back to development @@ -152134,6 +170341,7 @@ === release 0.10.21 === +('0', '10', '21') 2011-01-21 21:13:22 +0000 Tim-Philipp Müller <tim.muller@collabora.co.uk> * ChangeLog: @@ -190444,8 +208652,6 @@ every file but apart from that no changes compared to the latest SVN versions happened. 
-=== release 0.10.21 === - 2008-09-01 16:05:45 +0000 Edward Hervey <bilboed@bilboed.com> tests/check/elements/audioresample.c: Now that GstBaseTransform is 'fixed' ... remove cruft from tests. @@ -193950,8 +212156,6 @@ (deinterleave_suite): Add some more deinterleave unit test bits I had locally. -=== release 0.10.20 === - 2008-05-14 13:57:41 +0000 Tim-Philipp Müller <tim@centricular.net> gst/audioresample/gstaudioresample.c: Revert previous change which made basetransform handle buffer_alloc and which b... @@ -195508,8 +213712,6 @@ Use AG_GST_CHECK_PLUGIN and AG_GST_DISABLE_PLUGIN to simplify which plug-ins are included/excluded. (#498222) -=== release 0.10.19 === - 2008-03-03 06:04:02 +0000 Sebastian Dröge <slomo@circular-chaos.org> Correct all relevant warnings found by the sparse semantic code analyzer. This include marking several symbols static... @@ -199686,8 +217888,6 @@ (gst_speex_resample_update_state): Only post the latency message if we have a resampler state already. -=== release 0.10.17 === - 2007-11-23 10:21:11 +0000 Sebastian Dröge <slomo@circular-chaos.org> gst/audioresample/gstaudioresample.c: Implement latency query. @@ -204655,8 +222855,6 @@ work with earlier versions due to GstChildProxy brokeness). Also up requirements to last released core/base. -=== release 0.10.15 === - 2007-04-27 15:33:46 +0000 Julien Moutte <julien@moutte.net> ext/theora/theoradec.c: Calculate buffer duration correctly to generate a perfect stream (#433888). @@ -206986,8 +225184,6 @@ Original commit message from CVS: move amrwb code -=== release 0.10.12 === - 2007-01-04 12:49:47 +0000 Thomas Vander Stichele <thomas@apestaart.org> configure.ac: split out GST_CFLAGS into GST_PLUGINS_BASE_CFLAGS and GST_CFLAGS so that GST_BASE_CFLAGS can go inbetwe...
gst-plugins-bad-1.18.6.tar.xz/NEWS -> gst-plugins-bad-1.20.1.tar.xz/NEWS
Changed
@@ -1,14 +1,14 @@ -GStreamer 1.18 Release Notes +GStreamer 1.20 Release Notes -GStreamer 1.18.0 was originally released on 8 September 2020. +GStreamer 1.20.0 was originally released on 3 February 2022. -The latest bug-fix release in the 1.18 series is 1.18.6 and was released -on 2 February 2022. +The latest bug-fix release in the 1.20 series is 1.20.1 and was released +on 14 March 2022. -See https://gstreamer.freedesktop.org/releases/1.18/ for the latest +See https://gstreamer.freedesktop.org/releases/1.20/ for the latest version of this document. -Last updated: Wednesday 2 February 2022, 11:30 UTC (log) +Last updated: Monday 14 March 2022, 00:30 UTC (log) Introduction @@ -21,1781 +21,1583 @@ Highlights -- GstTranscoder: new high level API for applications to transcode - media files from one format to another - -- High Dynamic Range (HDR) video information representation and - signalling enhancements - -- Instant playback rate change support - -- Active Format Description (AFD) and Bar Data support - -- RTSP server and client implementations gained ONVIF trick modes - support - -- Hardware-accelerated video decoding on Windows via DXVA2 / - Direct3D11 - -- Microsoft Media Foundation plugin for video capture and - hardware-accelerated video encoding on Windows - -- qmlgloverlay: New overlay element that renders a QtQuick scene over - the top of an input video stream - -- imagesequencesrc: New element to easily create a video stream from a - sequence of jpeg or png images - -- dashsink: New sink to produce DASH content - -- dvbsubenc: New DVB Subtitle encoder element - -- MPEG-TS muxing now also supports TV broadcast compliant muxing with - constant bitrate muxing and SCTE-35 support - -- rtmp2: New RTMP client source and sink element from-scratch - implementation - -- svthevcenc: New SVT-HEVC-based H.265 video encoder - -- vaapioverlay: New compositor element using VA-API - -- rtpmanager gained support for Google’s Transport-Wide Congestion - Control (twcc) RTP 
extension - -- splitmuxsink and splitmuxsrc gained support for auxiliary video - streams - -- webrtcbin now contains some initial support for renegotiation - involving stream addition and removal - -- RTP support was enhanced with new RTP source and sink elements to - easily set up RTP streaming via rtp:// URIs - -- avtp: New Audio Video Transport Protocol (AVTP) plugin for - Time-Sensitive Applications - -- Support for the Video Services Forum’s Reliable Internet Stream - Transport (RIST) TR-06-1 Simple Profile - -- Universal Windows Platform (UWP) support - -- rpicamsrc: New element for capturing from the Raspberry Pi camera - -- RTSP Server TCP interleaved backpressure handling improvements as - well as support for Scale/Speed headers - -- GStreamer Editing Services gained support for nested timelines, - per-clip speed rate control and the OpenTimelineIO format. - -- Autotools build system has been removed in favour of Meson +- Development in GitLab was switched to a single git repository + containing all the modules +- GstPlay: new high-level playback library, replaces GstPlayer +- WebM Alpha decoding support +- Encoding profiles can now be tweaked with additional + application-specified element properties +- Compositor: multi-threaded video conversion and mixing +- RTP header extensions: unified support in RTP depayloader and + payloader base classes +- SMPTE 2022-1 2-D Forward Error Correction support +- Smart encoding (pass through) support for VP8, VP9, H.265 in + encodebin and transcodebin +- Runtime compatibility support for libsoup2 and libsoup3 (libsoup3 + support experimental) +- Video decoder subframe support +- Video decoder automatic packet-loss, data corruption, and keyframe + request handling for RTP / WebRTC / RTSP +- mp4 and Matroska muxers now support profile/level/resolution changes + for H.264/H.265 input streams (i.e. 
codec data changing on the fly)
+- mp4 muxing mode that initially creates a fragmented mp4 which is
+ converted to a regular mp4 on EOS
+- Audio support for the WebKit Port for Embedded (WPE) web page source
+ element
+- CUDA based video color space convert and rescale elements and
+ upload/download elements
+- NVIDIA memory:NVMM support for OpenGL glupload and gldownload
+ elements
+- Many WebRTC improvements
+- The new VA-API plugin implementation was fleshed out with more decoders
+ and new postproc elements
+- AppSink API to retrieve events in addition to buffers and buffer
+ lists
+- AppSrc gained more configuration options for the internal queue
+ (leakiness, limits in buffers and time, getters to read current
+ levels)
+- Updated Rust bindings and many new Rust plugins
+- Improved support for custom minimal GStreamer builds
+- Support for building against FFmpeg 5.0
+- Linux Stateless CODEC support gained MPEG-2 and VP9
+- Windows Direct3D11/DXVA decoder gained AV1 and MPEG-2 support
+- Lots of new plugins, features, performance improvements and bug
+ fixes Major new features and changes Noteworthy new features and API -Instant playback rate changes +- gst_element_get_request_pad() has been deprecated in favour of the
+ newly-added gst_element_request_pad_simple() which does the exact
+ same thing but has a less confusing name that hopefully makes clear
+ that the function requests a new pad rather than just retrieving an
+ already-existing request pad.
+
+Development in GitLab was switched to a single git repository containing all the modules
+
+The GStreamer multimedia framework is a set of libraries and plugins
+split into a number of distinct modules which are released independently
+and which have so far been developed in separate git repositories in
+freedesktop.org GitLab. 
+ +In addition to these separate git repositories there was a gst-build +module that would use the Meson build system’s subproject feature to +download each individual module and then build everything in one go. It +would also provide an uninstalled development environment that made it +easy to work on GStreamer and use or test versions other than the +system-installed GStreamer version. + +All of these modules have now (as of 28 September 2021) been merged into +a single git repository (“Mono repository” or “monorepo”) which should +simplify development workflows and continuous integration, especially +where changes need to be made to multiple modules at once. + +This mono repository merge will primarily affect GStreamer developers +and contributors and anyone who has workflows based on the GStreamer git +repositories. + +The Rust bindings and Rust plugins modules have not been merged into the +mono repository at this time because they follow a different release +cycle. + +The mono repository lives in the existing GStreamer core git repository +in GitLab in the new main branch and all future development will happen +on this branch. + +Modules will continue to be released as separate tarballs. + +For more details, please see the GStreamer mono repository FAQ. + +GstPlay: new high-level playback library replacing GstPlayer + +- GstPlay is a new high-level playback library that replaces the older + GstPlayer API. It is basically the same API as GstPlayer but + refactored to use bus messages for application notifications instead + of GObject signals. There is still a signal adapter object for those + who prefer signals. Since the existing GstPlayer API is already in + use in various applications, it didn’t seem like a good idea to + break it entirely. Instead a new API was added, and it is expected + that this new GstPlay API will be moved to gst-plugins-base in + future. 
+
+- The existing GstPlayer API is scheduled for deprecation and will be
+ removed at some point in the future (e.g. in GStreamer 1.24), so
+ application developers are urged to migrate to the new GstPlay API
+ at their earliest convenience.
+
+WebM alpha decoding
+
+- Implement WebM alpha decoding (VP8/VP9 with alpha), which required
+ support and additions in various places. This is supported both with
+ software decoders and hardware-accelerated decoders.
+
+- VP8/VP9 don’t support alpha components natively in the codec, so the
+ way this is implemented in WebM is by encoding the alpha plane with
+ transparency data as a separate VP8/VP9 stream. Inside the WebM
+ container (a variant of Matroska) this is coded as a single video
+ track with the “normal” VP8/VP9 video data making up the main video
+ data and each frame of video having an encoded alpha frame attached
+ to it as extra data ("BlockAdditional").
+
+ matroskademux has been extended to extract this per-frame alpha side
+ data and attach it in the form of a GstVideoCodecAlphaMeta to the
+ regular video buffers. Note that this new meta is specific to this
+ VP8/VP9 alpha support and can’t be used to just add alpha support to
+ other codecs that don’t support it. Lastly, matroskademux also
+ advertises the fact that the streams contain alpha in the caps.
+
+- The new codecalpha plugin contains various bits of infrastructure to
+ support autoplugging and debugging:
+
+ - codecalphademux splits out the alpha stream from the metas on
+ the regular VP8/VP9 buffers
+ - alphacombine takes two decoded raw video streams (one alpha, one
+ the regular video) and combines them into a video stream with
+ alpha
+ - vp8alphadecodebin + vp9alphadecodebin are wrapper bins that use
+ the regular vp8dec and vp9dec software decoders to decode
+ regular and alpha streams and combine them again. To decodebin
+ these look like regular decoders. 
+ - The V4L2 CODEC plugin has stateless VP8/VP9 decoders that can + decode both alpha and non-alpha stream with a single decoder + instance + +- A new AV12 video format was added which is basically NV12 with an + alpha plane, which is more convenient for many hardware-accelerated + decoders. + +- Watch Nicolas Dufresne’s LCA 2022 talk “Bringing WebM Alpha support + to GStreamer” for all the details and a demo. + +RTP Header Extensions Base Class and Automatic Header Extension Handling in RTP Payloaders and Depayloaders + +- RTP Header Extensions are specified in RFC 5285 and provide a way to + add small pieces of data to RTP packets in between the RTP header + and the RTP payload. This is often used for per-frame metadata, + extended timestamps or other application-specific extra data. There + are several commonly-used extensions specified in various RFCs, but + senders are free to put any kind of data in there, as long as sender + and receiver both know what that data is. Receivers that don’t know + about the header extensions will just skip the extra data without + ever looking at it. These header extensions can often be combined + with any kind of payload format, so may need to be supported by many + RTP payloader and depayloader elements. + +- Inserting and extracting RTP header extension data has so far been a + bit inconvenient in GStreamer: There are functions to add and + retrieve RTP header extension data from RTP packets, but nothing + works automatically, even for common extensions. People would have + to do the insertion/extraction either in custom elements + before/after the RTP payloader/depayloader, or inside pad probes, + which isn’t very nice. + +- This release adds various pieces of new infrastructure for generic + RTP header extension handling, as well as some implementations for + common extensions: + + - GstRTPHeaderExtension is a new helper base class for reading and + writing RTP header extensions. 
Nominally this subclasses + GstElement, but only so these extensions are stored in the + registry where they can be looked up by URI or name. They don’t + have pads and don’t get added to the pipeline graph as an + element. + + - "add-extension" and "clear-extension" action signals on RTP + payloaders and depayloaders for manual extension management + + - The "request-extension" signal will be emitted if an extension + is encountered that requires explicit mapping by the application + + - new "auto-header-extension" property on RTP payloaders and + depayloaders for automatic handling of known header extensions. + This is enabled by default. The extensions must be signalled via + caps / SDP. + + - RTP header extension implementations: + + - rtphdrextclientaudiolevel: Client-to-Mixer Audio Level + Indication (RFC 6464) (also see below) + - rtphdrextcolorspace: Color Space extension, extends RTP + packets with color space and high dynamic range (HDR) + information + - rtphdrexttwcc: Transport Wide Congestion Control support + +- gst_rtp_buffer_remove_extension_data() is a new helper function to + remove an RTP header extension from an RTP buffer + +- The existing gst_rtp_buffer_set_extension_data() now also supports + shrinking the extension data in size + +AppSink and AppSrc improvements + +- appsink: new API to pull events out of appsink in addition to + buffers and buffer lists. + + There was previously no way for users to receive incoming events + from appsink properly serialised with the data flow, even if they + are serialised events. The reason for that is that the only way to + intercept events was via a pad probe on the appsink sink pad, but + there is also internal queuing inside of appsink, so it’s difficult + to ascertain the right order of everything in all cases. + + There is now a new "new-serialized-event" signal which will be + emitted when there’s a new event pending (just like the existing + "new-sample" signal). 
The "emit-signals" property must be set to + TRUE in order to activate this (but it’s also fine to just pull from + the application thread without using the signals). + + gst_app_sink_pull_object() and gst_app_sink_try_pull_object() can be + used to pull out either an event or a new sample carrying a buffer + or buffer list, whatever is next in the queue. + + EOS events will be filtered and will not be returned. EOS handling + can be done the usual way, same as with _pull_sample(). + +- appsrc: allow configuration of internal queue limits in time and + buffers and add leaky mode. + + There is internal queuing inside appsrc so the application thread + can push data into the element which will then be picked up by the + source element’s streaming thread and pushed into the pipeline from + that streaming thread. This queue is unlimited by default and until + now it was only possible to set a maximum size limit in bytes. When + that byte limit is reached, the pushing thread (application thread) + would be blocked until more space becomes available. + + A limit in bytes is not particularly useful for many use cases, so + now it is possible to also configure limits in time and buffers + using the new "max-time" and "max-buffers" properties. Of course + there are also matching new read-only"current-level-buffers" and + "current-level-time properties" properties to query the current fill + level of the internal queue in time and buffers. + + And as if that wasn’t enough the internal queue can also be + configured as leaky using the new "leaky-type" property. That way + when the queue is full the application thread won’t be blocked when + it tries to push in more data, but instead either the new buffer + will be dropped or the oldest data in the queue will be dropped. 
+ +Better string serialization of nested GstCaps and GstStructures + +- New string serialisation format for structs and caps that can handle + nested structs and caps properly by using brackets to delimit nested + items (e.g. some-struct, some-field=[nested-struct, nested=true]). + Unlike the default format the new variant can also support more than + one level of nesting. For backwards-compatibility reasons the old + format is still output by default when serialising caps and structs + using the existing API. The new functions gst_caps_serialize() and + gst_structure_serialize() can be used to output strings in the new + format. + +Convenience API for custom GstMetas + +- New convenience API to register and create custom GstMetas: + gst_meta_register_custom() and gst_buffer_add_custom_meta(). Such + custom meta is backed by a GstStructure and does not require that + users of the API expose their GstMeta implementation as public API + for other components to make use of it. In addition, it provides a + simpler interface by ignoring the impl vs. api distinction that the + regular API exposes. This new API is meant to be the meta + counterpart to custom events and messages, and to be more convenient + than the lower-level API when the absolute best performance isn’t a + requirement. The reason it’s less performant than a “proper” meta is + that a proper meta is just a C struct in the end whereas this goes + through the GstStructure API which has a bit more overhead, which + for most scenarios is negligible however. This new API is useful for + experimentation or proprietary metas, but also has some limitations: + it can only be used if there’s a single producer of these metas; + registering the same custom meta multiple times or from multiple + places is not allowed. 
+ +Additional Element Properties on Encoding Profiles + +- GstEncodingProfile: The new "element-properties" and + gst_encoding_profile_set_element_properties() API allows + applications to set additional element properties on encoding + profiles to configure muxers and encoders. So far the encoding + profile template was the only place where this could be specified, + but often what applications want to do is take a ready-made encoding + profile shipped by GStreamer or the application and then tweak the + settings on top of that, which is now possible with this API. Since + applications can’t always know in advance what encoder element will + be used in the end, it’s even possible to specify properties on a + per-element basis. + + Encoding Profiles are used in the encodebin, transcodebin and + camerabin elements and APIs to configure output formats (containers + and elementary streams). + +Audio Level Indication Meta for RFC 6464 + +- New GstAudioLevelMeta containing Audio Level Indication as per RFC + 6464 + +- The level element has been updated to add GstAudioLevelMeta on + buffers if the "audio-level-meta" property is set to TRUE. This can + then in turn be picked up by RTP payloaders to signal the audio + level to receivers through RTP header extensions (see above). + +- New Client-to-Mixer Audio Level Indication (RFC6464) RTP Header + Extension which should be automatically created and used by RTP + payloaders and depayloaders if their "auto-header-extension" + property is enabled and if the extension is part of the RTP caps. 
+
+Automatic packet loss, data corruption and keyframe request handling for video decoders
+
+- The GstVideoDecoder base class has gained various new APIs to
+ automatically handle packet loss and data corruption better by
+ default, especially in RTP, RTSP and WebRTC streaming scenarios, and
+ to give subclasses more control over how they want to handle
+ missing data:
+
+ - Video decoder subclasses can mark output frames as corrupted via
+ the new GST_VIDEO_CODEC_FRAME_FLAG_CORRUPTED flag
+
+ - A new "discard-corrupted-frames" property allows applications to
+ configure decoders so that corrupted frames are directly
+ discarded instead of being forwarded inside the pipeline. This
+ is a replacement for the "output-corrupt" property of the FFmpeg
+ decoders.
+
+ - RTP depayloaders can now signal to decoders that data is missing
+ when sending GAP events for lost packets. GAP events can be sent
+ for various reasons in a GStreamer pipeline. Often they are just
+ used to let downstream elements know that there isn’t a buffer
+ available at the moment, so downstream elements can move on
+ instead of waiting for one. They are also sent by RTP
+ depayloaders in the case that packets are missing, however, and
+ so far a decoder was not able to differentiate the two cases.
+ This has been remedied now: GAP events can be decorated with
+ gst_event_set_gap_flags() and GST_GAP_FLAG_MISSING_DATA to let
+ decoders know what happened, and decoders can then use that in
+ some cases to handle missing data better.
+
+ - The GstVideoDecoder::handle_missing_data vfunc was added to
+ inform subclasses about packet loss or missing data and let them
+ handle it in their own way if they like.
+
+ - gst_video_decoder_set_needs_sync_point() lets subclasses signal
+ that they need the stream to start with a sync point. If
+ enabled, the base class will discard all non-sync point frames
+ in the beginning and after a flush and does not pass them to the
+ subclass. 
Furthermore, if the first frame is not a sync point, + the base class will try and request a sync frame from upstream + by sending a force-key-unit event (see next items). + + - New "automatic-request-sync-points" and + "automatic-request-sync-point-flags" properties to automatically + request sync points when needed, e.g. on packet loss or if the + first frame is not a keyframe. Applications may want to enable + this on decoders operating in e.g. RTP/WebRTC/RTSP receiver + pipelines. + + - The new "min-force-key-unit-interval" property can be used to + ensure there’s a minimal interval between keyframe requests to + upstream (and/or the sender) and we’re not flooding the sender + with key unit requests. + + - gst_video_decoder_request_sync_point() allows subclasses to + request a new sync point (e.g. if they choose to do their own + missing data handling). This will still honour the + "min-force-key-unit-interval" property if set. + +Improved support for custom minimal GStreamer builds + +- Element registration and registration of other plugin features + inside plugin init functions has been improved in order to + facilitate minimal custom GStreamer builds. 
+ +- A number of new macros have been added to declare and create + per-element and per-plugin feature register functions in all + plugins, and then call those from the per-plugin plugin_init + functions: + + - GST_ELEMENT_REGISTER_DEFINE, + GST_DEVICE_PROVIDER_REGISTER_DEFINE, + GST_DYNAMIC_TYPE_REGISTER_DEFINE, GST_TYPE_FIND_REGISTER_DEFINE + for the actual registration call with GStreamer + - GST_ELEMENT_REGISTER, GST_DEVICE_PROVIDER_REGISTER, + GST_DYNAMIC_TYPE_REGISTER, GST_PLUGIN_STATIC_REGISTER, + GST_TYPE_FIND_REGISTER to call the registration function defined + by the REGISTER_DEFINE macro + - GST_ELEMENT_REGISTER_DECLARE, + GST_DEVICE_PROVIDER_REGISTER_DECLARE, + GST_DYNAMIC_TYPE_REGISTER_DECLARE, + GST_TYPE_FIND_REGISTER_DECLARE to declare the registration + function defined by the REGISTER_DEFINE macro + - and various variants for advanced use cases. + +- This means that applications can call the per-element and per-plugin + feature registration functions for only the elements they need + instead of registering plugins as a whole with all kinds of elements + that may not be required (e.g. encoder and decoder instead of just + decoder). In case of static linking all unused functions and their + dependencies would be removed in this case by the linker, which + helps minimise binary size for custom builds. -Changing the playback rate as quickly as possible so far always required -a flushing seek. This generally works, but has the disadvantage of -flushing all data from the playback pipeline and requiring the demuxer -or parser to do a full-blown seek including resetting its internal state -and resetting the position of the data source. It might also require -considerable decoding effort to get to the right position to resume -playback from at the higher rate. - -This release adds a new mechanism to achieve quasi-instant rate changes -in certain playback pipelines without interrupting the flow of data in -the pipeline. 
This is activated by sending a seek with the -GST_SEEK_FLAG_INSTANT_RATE_CHANGE flag and start_type = stop_type = -GST_SEEK_TYPE_NONE. This flag does not work for all pipelines, in which -case it is necessary to fall back to sending a full flushing seek to -change the playback rate. When using this flag, the seek event is only -allowed to change the current rate and can modify the trickmode flags -(e.g. keyframe only or not), but it is not possible to change the -current playback position, playback direction or do a flush. - -This is particularly useful for streaming use cases like HLS or DASH -where the streaming download should not be interrupted when changing -rate. - -Instant rate changing is handled in the pipeline in a specific sequence -which is detailed in the seeking design docs. Most elements don’t need -to worry about this, only elements that sync to the clock need some -special handling which is implemented in the GstBaseSink base class, so -should be taken care of automatically in most normal playback pipelines -and sink elements. - -See Jan’s GStreamer Conference 2019 talk “Changing Playback Rate -Instantly” for more information. - -You can try this feature by passing the -i command line option to -gst-play-1.0. It is supported at least by qtdemux, tsdemux, hlsdemux, -and dashdemux. - -Google Transport-Wide Congestion Control - -rtpmanager now supports the parsing and generating of RTCP messages for -the Google Transport-Wide Congestion Control RTP Extension, as described -in: -https://tools.ietf.org/html/draft-holmer-rmcat-transport-wide-cc-extensions-01. - -This “just” provides the required plumbing/infrastructure, it does not -actually make effect any actual congestion control on the sender side, -but rather provides information for applications to use to make such -decisions. - -See Håvard’s “Google Transport-Wide Congestion Control” talk for more -information about this feature. 
- -GstTranscoder: a new high-level transcoding API for applications - -The new GstTranscoder library, along with transcodebin and -uritranscodebin elements, provides high level API for applications to -transcode media files from one format to another. Watch Thibault’s talk -“GstTranscoder: A High Level API to Quickly Implement Transcoding -Capabilities in your Applications” for more information. - -This also comes with a gst-transcoder-1.0 command line utility to -transcode one URI into another URI based on the specified encoding -profile. - -Active Format Description (AFD) and Bar Data support - -The GstVideo Ancillary Data API has gained support for Active Format -Description (AFD) and Bar data. - -This includes various two new buffer metas: GstVideoAFDMeta and -GstVideoBarMeta. - -GStreamer now also parses and extracts AFD/Bar data in the h264/h265 -video parsers, and supports both capturing them and outputting them in -the decklink elements. See Aaron’s lightning talk at the GStreamer -Conference for more background. - -ONVIF trick modes support in both GStreamer RTSP server and client - -- Support for the various trick modes described in section 6 of the - ONVIF streaming spec has been implemented in both gst-rtsp-server - and rtspsrc. -- Various new properties in rtspsrc must be set to take advantage of - the ONVIF support -- Examples are available here: test-onvif-server.c and - test-onvif-client.c -- Watch Mathieu Duponchelle’s talk “Implementing a Trickmode Player - with ONVIF, RTSP and GStreamer” for more information and a live - demo. - -GStreamer Codecs library with decoder base classes - -This introduces a new library in gst-plugins-bad which contains a set of -base classes that handle bitstream parsing and state tracking for the -purpose of decoding different codecs. Currently H264, H265, VP8 and VP9 -are supported. 
These bases classes are meant primarily for internal use -in GStreamer and are used in various decoder elements in connection with -low level decoding APIs like DXVA, NVDEC, VAAPI and V4L2 State Less -decoders. The new library is named gstreamer-codecs-1.0 / -libgstcodecs-1.0 and is not yet guaranteed to be API stable across major -versions. - -MPEG-TS muxing improvements - -The GStreamer MPEG-TS muxer has seen major improvements on various -fronts in this cycle: - -- It has been ported to the GstAggregator base class which means it - can work in defined-latency mode with live input sources and - continue streaming if one of the inputs stops producing data. - -- atscmux, a new ATSC-specific tsmux subclass - -- Constant Bit Rate (CBR) muxing support via the new bitrate property - which allows setting the target bitrate in bps. If this is set the - muxer will insert null packets as padding to achieve the desired - multiplex-wide constant bitrate. - -- compliance fixes for TV broadcasting use cases (esp. ATSC). See - Jan’s talk “TV Broadcast compliant MPEG-TS” for details. - -- Streams can now be added and removed at runtime: Until now, any - streams in tsmux had to be present when the element started - outputting its first buffer. Now they can appear at any point during - the stream, or even disappear and reappear later using the same PID. - -- new pcr-interval property allows applications to configure the - desired interval instead of hardcoding it - -- basic SCTE-35 support. This is enabled by setting the scte-35-pid - property on the muxer. Sending SCTE-35 commands is then done by - creating the appropriate SCTE-35 GstMpegtsSection and sending them - on the muxer. +- gst_init() will automatically call a gst_init_static_plugins() + function if one exists. -- MPEG-2 AAC handling improvements +- See the GStreamer static build documentation and Stéphane’s blog + post Generate a minimal GStreamer build, tailored to your needs for + more details. 
New elements -- New qmlgloverlay element for rendering a QtQuick scene over the top - of a video stream. qmlgloverlay requires that Qt support adopting an - external OpenGL context and is known to work on X11 and Windows. - Wayland is known not to work due to limitations within Qt. Check out - the example to see how it works. - -- The clocksync element is a generic element that can be placed in a - pipeline to synchronise passing buffers to the clock at that point. - This is similar to identity sync=true, but because it isn’t - GstBaseTransform-based, it can process GstBufferLists without - breaking them into separate GstBuffers. It is also more discoverable - than the identity option. Note that you do not need to insert this - element into your pipeline to make GStreamer sync to the pipeline - clock, this is usually handled automatically by the elements in the - pipeline (sources and sinks mostly). This element is useful to feed - non-live input such as local files into elements that expect live - input such as webrtcbin.` - -- New imagesequencesrc element to easily create a video stream from a - sequence of JPEG or PNG images (or any other encoding where the type - can be detected), basically a multifilesrc made specifically for - image sequences. - -- rpicamsrc element for capturing raw or encoded video (H.264, MJPEG) - from the Raspberry Pi camera. This works much like the popular - raspivid command line utility but outputs data nicely timestamped - and formatted in order to integrate nicely with other GStreamer - elements. Also comes with a device provider so applications can - discover the camera if available. - -- aatv and cacatv video filters that transform video ASCII art style - -- avtp: new Audio Video Transport Protocol (AVTP) plugin for Linux. - See Andre Guedes’ talk “Audio/Video Bridging (AVB) support in - GStreamer” for more details. - -- clockselect: a pipeline element that enables clock selection/forcing - via gst-launch pipeline syntax. 
- -- dashsink: Add new sink to produce DASH content. See Stéphane’s talk - or blog post for details. - -- dvbsubenc: a DVB subtitle encoder element - -- microdns: a libmicrodns-based mdns device provider to discover RTSP - cameras on the local network - -- mlaudiosink: new audio sink element for the Magic Leap platform, - accompanied by an MLSDK implementation in the amc plugin - -- msdkvp9enc: VP9 encoder element for the Intel MediaSDK - -- rist: new plugin implementing support for the Video Services Forum’s - Reliable Internet Stream Transport (RIST) TR-06-1 Simple Profile. - See Nicolas’ blog post “GStreamer support for the RIST - Specification” for more details. - -- rtmp2: new RTMP client source and sink elements with fully - asynchronous network operations, better robustness and additional - features such as handling ping and stats messages, and adobe-style - authentication. The new rtmp2src and rtmp2sink elements should be - API-compatible with the old rtmpsrc / rtmpsink elements and should - work as drop-in replacements. - -- new RTP source and sink elements to easily set up RTP streaming via - rtp:// URIs: The rtpsink and rtpsrc elements add an URI interface so - that streams can be decoded with decodebin using rtp:// URIs. These - can be used as follows: ``` gst-launch-1.0 videotestsrc ! x264enc ! - rtph264pay config-interval=3 ! rtpsink uri=rtp://239.1.1.1:1234 - - gst-launch-1.0 videotestsrc ! x264enc ! rtph264pay config-interval=1 - ! rtpsink uri=rtp://239.1.2.3:5000 gst-launch-1.0 rtpsrc - uri=rtp://239.1.2.3:5000?encoding-name=H264 ! rtph264depay ! - avdec_h264 ! videoconvert ! xvimagesink - - gst-launch-1.0 videotestsrc ! avenc_mpeg4 ! rtpmp4vpay - config-interval=1 ! rtpsink uri=rtp://239.1.2.3:5000 gst-launch-1.0 - rtpsrc uri=rtp://239.1.2.3:5000?encoding-name=MP4V-ES ! rtpmp4vdepay - ! avdec_mpeg4 ! videoconvert ! 
xvimagesink ``` - -- svthevcenc: new SVT-HEVC-based H.265 video encoder - -- switchbin: new helper element which chooses between a set of - processing chains (paths) based on input caps, and changes the - active chain if new caps arrive. Paths are child objects, which are - accessed by the GstChildProxy interface. See the switchbin - documentation for a usage example. - -- vah264dec: new experimental va plugin with an element for H.264 - decoding with VA-API using GStreamer’s new stateless decoder - infrastructure (see Linux section below). - -- v4l2codecs: introduce an V4L2 CODECs Accelerator supporting the new - CODECs uAPI in the Linux kernel (see Linux section below) +- New aesdec and aesenc elements for AES encryption and decryption in + a custom format. -- zxing new plugin to detect QR codes and barcodes, based on libzxing - -- also see the Rust plugins section below which contains plenty of new - exciting plugins written in Rust! +- New encodebin2 element with dynamic/sometimes source pads in order + to support the option of doing the muxing outside of encodebin, + e.g. in combination with a splitmuxsink. + +- New fakeaudiosink and videocodectestsink elements for testing and + debugging (see below for more details) + +- rtpisacpay, rtpisacdepay: new RTP payloader and depayloader for iSAC + audio codec + +- rtpst2022-1-fecdec, rtpst2022-1-fecenc: new elements providing SMPTE + 2022-1 2-D Forward Error Correction. More details in Mathieu’s blog + post. + +- isac: new plugin wrapping the Internet Speech Audio Codec reference + encoder and decoder from the WebRTC project. 
+ +- asio: plugin for Steinberg ASIO (Audio Streaming Input/Output) API + +- gssrc, gssink: add source and sink for Google Cloud Storage + +- onnx: new plugin to apply ONNX neural network models to video + +- openaptx: aptX and aptX-HD codecs using libopenaptx (v0.2.0) + +- qroverlay, debugqroverlay: new elements that allow overlaying data + on top of video in the form of a QR code + +- cvtracker: new OpenCV-based tracker element + +- av1parse, vp9parse: new parsers for AV1 and VP9 video + +- va: work on the new VA-API plugin implementation for + hardware-accelerated video decoding and encoding has continued at + pace, with various new decoders and filters having joined the + initial vah264dec: + + - vah265dec: VA-API H.265 decoder + - vavp8dec: VA-API VP8 decoder + - vavp9dec: VA-API VP9 decoder + - vaav1dec: VA-API AV1 decoder + - vampeg2dec: VA-API MPEG-2 decoder + - vadeinterlace: VA-API deinterlace filter + - vapostproc: VA-API postproc filter (color conversion, + resizing, cropping, color balance, video rotation, skin tone + enhancement, denoise, sharpen) + + See Víctor’s blog post “GstVA in GStreamer 1.20” for more details + and what’s coming up next. + +- vaapiav1dec: new AV1 decoder element (in gstreamer-vaapi) + +- msdkav1dec: hardware-accelerated AV1 decoder using the Intel Media + SDK / oneVPL + +- nvcodec plugin for NVIDIA NVCODEC API for hardware-accelerated video + encoding and decoding: + + - cudaconvert, cudascale: new CUDA based video color space convert + and rescale elements + - cudaupload, cudadownload: new helper elements for memory + transfer between CUDA and system memory spaces + - nvvp8sldec, nvvp9sldec: new GstCodecs-based VP8/VP9 decoders + +- Various new hardware-accelerated elements for Windows: + + - d3d11screencapturesrc: new desktop capture element, including a + GstDeviceProvider implementation to enumerate/select target + monitors for capture. 
+ - d3d11av1dec and d3d11mpeg2dec: AV1 and MPEG-2 decoders + - d3d11deinterlace: deinterlacing filter + - d3d11compositor: video composing element + - see Windows section below for more details + +- new Rust plugins: + + - audiornnoise: Removes noise from an audio stream + - awstranscribeparse: Parses AWS audio transcripts into timed text + buffers + - ccdetect: Detects if valid closed captions are present in a + closed captions stream + - cea608tojson: Converts CEA-608 Closed Captions to a JSON + representation + - cmafmux: CMAF fragmented mp4 muxer + - dashmp4mux: DASH fragmented mp4 muxer + - isofmp4mux: ISO fragmented mp4 muxer + - ebur128level: EBU R128 Loudness Level Measurement + - ffv1dec: FFV1 video decoder + - gtk4paintablesink: GTK4 video sink, which provides a + GdkPaintable that can be rendered in various widgets + - hlssink3: HTTP Live Streaming sink + - hrtfrender: Head-Related Transfer Function (HRTF) renderer + - hsvdetector: HSV colorspace detector + - hsvfilter: HSV colorspace filter + - jsongstenc: Wraps buffers containing any valid top-level JSON + structures into higher level JSON objects, and outputs those as + ndjson + - jsongstparse: Parses ndjson as output by jsongstenc + - jsontovtt: converts JSON to WebVTT subtitles + - regex: Applies regular expression operations on text + - roundedcorners: Adds rounded corners to video + - spotifyaudiosrc: Spotify source + - textahead: Display upcoming text buffers ahead (e.g. 
for + Karaoke) + - transcriberbin: passthrough bin that transcribes raw audio to + closed captions using awstranscriber and puts the captions as + metas onto the video + - tttojson: Converts timed text to a JSON representation + - uriplaylistbin: Playlist source bin + - webpdec-rs: WebP image decoder with animation support + +- New plugin codecalpha with elements to assist with WebM Alpha + decoding + + - codecalphademux: Split stream with GstVideoCodecAlphaMeta into + two streams + - alphacombine: Combine two raw video stream (I420 or NV12) as one + stream with alpha channel (A420 or AV12) + - vp8alphadecodebin: A bin to handle software decoding of VP8 with + alpha + - vp9alphadecodebin: A bin to handle software decoding of VP9 with + alpha + +- New hardware accelerated elements for Linux: + + - v4l2slmpeg2dec: Support for Linux Stateless MPEG-2 decoders + - v4l2slvp9dec: Support for Linux Stateless VP9 decoders + - v4l2slvp8alphadecodebin: Support HW accelerated VP8 with alpha + layer decoding + - v4l2slvp9alphadecodebin: Support HW accelerated VP9 with alpha + layer decoding New element features and additions -GStreamer core - -- filesink: Add a new “full” buffer mode. Previously the default and - full modes were the same. Now the default mode is like before: it - accumulates all buffers in a buffer list until the threshold is - reached and then writes them all out, potentially in multiple - writes. The new full mode works by always copying memory to a single - memory area and writing everything out with a single write once the - threshold is reached. - -- multiqueue: Add stats property and - current-level-{buffers, bytes, time} pad properties to query the - current levels of the corresponding internal queue. 
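The new filesink “full” buffer mode mentioned above is selected through the existing buffer-mode property; a minimal sketch (output filename and buffer size are illustrative):

```shell
# buffer-mode=full accumulates up to buffer-size bytes in a single memory
# area and flushes it with one write, instead of the default mode's
# potentially multiple writes per buffer-list.
gst-launch-1.0 audiotestsrc num-buffers=500 ! wavenc \
  ! filesink location=tone.wav buffer-mode=full buffer-size=65536
```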
- -Plugins Base - -- alsa: implement a device provider - -- alsasrc: added use-driver-timestamp property to force use of - pipeline timestamps (and disable driver timestamps) if so desired - -- audioconvert: fix changing the mix-matrix property at runtime - -- appsrc: added support for segment forwarding or custom GstSegments - via GstSample, enabled via the handle-segment-change property. This - only works for segments in TIME format for now. - -- compositor: various performance optimisations, checkerboard drawing - fixes, and support for VUYA format - -- encodebin: Fix and refactor smart encoding; ensure that a single - segment is pushed into encoders; improve force-key-unit event - handling. - -- opusenc: Add low delay option (audio-type=restricted-lowdelay) to - disable the SILK layer and achieve only 5ms delay. - -- opusdec: add stats property to retrieve various decoder statistics. - -- uridecodebin3: Let decodebin3 do its stream selection if no one - answers - -- decodebin3: Avoid overriding explicit user selection of streams - -- playbin: add flag to force use of software decoders over any - hardware decoders that might also be available - -- playbin3, playbin: propagate sink context - -- rawvideoparse: Fix tiling support, allow setting colorimetry - -- subparse: output plain utf8 text instead of pango-markup formatted - text if downstream requires it, useful for interop with elements - that only accept utf8-formatted subtitles such as muxers or closed - caption converters. 
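The low-delay Opus option above is a single encoder property; a hedged sketch, pairing it with the encoder’s smallest frame size to reach the quoted 5 ms delay (frame-size is the standard opusenc property, not something introduced in this release):

```shell
# restricted-lowdelay disables the SILK layer; 5 ms frames minimise
# the algorithmic delay of the CELT-only path.
gst-launch-1.0 audiotestsrc is-live=true ! audioconvert ! audioresample \
  ! opusenc audio-type=restricted-lowdelay frame-size=5 \
  ! opusdec ! autoaudiosink
```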
- -- tcpserversrc, tcpclientsrc: add stats property with TCP connection - stats (some are only available on Linux though) - -- timeoverlay: add show-times-as-dates, datetime-format and - datetime-epoch properties to display times with dates - -- videorate: Fix changing rate property during playback; reverse - playback fixes; update QoS events taking into account our rate - -- videoscale: pass through and transform size sensitive metas instead - of just dropping them - -Plugins Good - -- avidemux can handle H.265 video now. Our advice remains to - immediately cease all contact and communication with anyone who - hands you H.265 video in an AVI container, however. - -- avimux: Add support for S24LE and S32LE raw audio and v210 raw video - formats; support more than 2 channels of raw audio. - -- souphttpsrc: disable session sharing and cookie jar when the cookies - property is set; correctly handle seeks past the end of the content - -- deinterlace: new YADIF deinterlace method which should provide - better quality than the existing methods and is LGPL licensed; - alternate fields are supported as input to the deinterlacer as well - now, and there were also fixes for switching the deinterlace mode on - the fly. - -- flvmux: in streamable mode allow adding new pads even if the initial - header has already been written. Old clients will only process the - initial stream, new clients will get a header with the new streams. - The skip-backwards-streams property can be used to force flvmux to - skip and drop a few buffers rather than produce timestamps that go - backward and confuse librtmp-based clients. There’s also better - handling for timestamp rollover when streaming for a long time. - -- imagefreeze: Add live mode, which can be enabled via the new is-live - property. In this mode frames will only be output in PLAYING state - according to the negotiated framerate, skipping frames if the output - can’t keep up (e.g. because it’s blocked downstream). 
This makes it - possible to actually use imagefreeze in live pipelines without - having to manually ensure somehow that it starts outputting at the - current running time and without still risking to fall behind - without recovery. - -- matroskademux, qtdemux: Provide audio lead-in for some lossy formats - when doing accurate seeks, to make sure we can actually decode - samples at the desired position. This is especially important for - non-linear audio/video editing use-cases. - -- matroskademux, matroskamux: Handle interlaced field order (tff, bff) - -- matroskamux: - - - new offset-to-zero property to offset all streams to start at - zero. This takes the timestamp of the earliest stream and - offsets it so that it starts at 0. Some software (VLC, - ffmpeg-based) does not properly handle Matroska files that start - at timestamps much bigger than zero, which could happen with - live streams. - - added a creation-time property to explicitly set the creation - time to write into the file headers. Useful when remuxing, for - example, but also for live feeds where the DateUTC header can be - set a UTC timestamp corresponding to the beginning of the file. - - the muxer now also always waits for caps on sparse streams, and - warns if caps arrive after the header has already been sent, - otherwise the subtitle track might be silently absent in the - final file. This might affect applications that send sparse data - into matroskamux via an appsrc element, which will usually not - send out the initial caps before it sends out the first buffer. 
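The matroskamux behaviour described above is driven by plain properties; a minimal sketch (output filename hypothetical):

```shell
# offset-to-zero rebases all stream timestamps so the file starts at 0,
# which keeps VLC/ffmpeg-based players happy when capturing live feeds.
gst-launch-1.0 videotestsrc num-buffers=300 ! x264enc \
  ! matroskamux offset-to-zero=true ! filesink location=out.mkv
```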
- -- pulseaudio: device provider improvements: fix discovery of - newly-added devices and hide the alsa device provider if we provide - alsa devices - -- qtdemux: raw audio handling improvements, support for AC4 audio, and - key-units trickmode interval support - -- qtmux: - - - was ported to the GstAggregator base class which allows for - better handling of live inputs, but might entail minor - behavioural changes for sparse inputs if inputs are not live. - - has also gained a force-create-timecode-trak property to create - a timecode trak in non-mov flavors, which may not be supported - by Apple but is supported by other software such as Final Cut - Pro X - - also a force-chunks property to force the creation of chunks - even in single-stream files, which is required for Apple ProRes - certification. - - also supports 8k resolutions in prefill mode with ProRes. - -- rtpbin gained a request-jitterbuffer signal which allows - applications to plug in their own jitterbuffer implementation such - as the threadsharing jitterbuffer from the Rust plugins, for - example. - -- rtprtxsend: add clock-rate-map property to allow generic RTP input - caps without a clock-rate whilst still supporting the max-size-time - property for bundled streams. - -- rtpssrcdemux: introduce max-streams property to guard against - attacks where the sender changes SSRC for every RTP packet. - -- rtph264pay, rtph265pay: implement STAP-A and various aggregation - modes controlled by the new aggregate-mode property: none to not - aggregate NAL units (as before), zero-latency to aggregate NAL units - until a VCL or suffix unit is included, or max to aggregate all NAL - units with the same timestamp (which adds one frame of latency). The - default has been kept at none for backwards compatibility reasons - and because various RTP/RTSP implementations don’t handle aggregation - well. For WebRTC use cases this should be set to zero-latency, - however. 
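For the WebRTC case called out above, the payloader aggregation mode is a single property; a sketch (host/port are placeholders, and in a real WebRTC setup the payloader would feed webrtcbin rather than udpsink):

```shell
# zero-latency aggregates NAL units into STAP-A packets without adding
# a frame of latency - the recommended setting for WebRTC senders.
gst-launch-1.0 videotestsrc is-live=true ! x264enc tune=zerolatency \
  ! rtph264pay aggregate-mode=zero-latency ! udpsink host=127.0.0.1 port=5004
```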
- -- rtpmp4vpay: add support for config-interval=-1 to resend headers - with each IDR keyframe, like other video payloaders. - -- rtpvp8depay: Add wait-for-keyframe property for waiting until the - next keyframe after packet loss. Useful if the video stream was not - encoded with error resilience enabled, in which case packet loss - tends to cause very bad artefacts when decoding, and waiting for the - next keyframe instead improves user experience considerably. - -- splitmuxsink and splitmuxsrc can now handle auxiliary video streams - in addition to the primary video stream. The primary video stream is - still used to select fragment cut points at keyframe boundaries. - Auxiliary video streams may be broken up at any packet - so - fragments may not start with a keyframe for those streams. - -- splitmuxsink: - - - new muxer-preset and sink-preset properties for setting - muxer/sink presets - - a new start-index property to set the initial fragment id - - and a new muxer-pad-map property which explicitly maps - splitmuxsink pads to the muxer pads they should connect to, - overriding the implicit logic that tries to match pads but - yields arbitrary names. - - Also includes the actual sink element in the fragment-opened and - fragment-closed element messages now, which is especially useful - for sinks without a location property or when finalisation of - the fragments is done asynchronously. 
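The new splitmuxsink fragment numbering can be seen in context with a short sketch; the location pattern and durations are illustrative (max-size-time is the standard splitmuxsink property, in nanoseconds):

```shell
# start-index=10 makes the first fragment out_00010.mp4; fragments are
# cut at keyframe boundaries roughly every 10 seconds.
gst-launch-1.0 videotestsrc is-live=true ! x264enc key-int-max=30 \
  ! splitmuxsink location=out_%05d.mp4 start-index=10 max-size-time=10000000000
```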
- -- videocrop: add support for Y444, Y41B and Y42B pixel formats - -- vp8enc, vp9enc: change default value of VP8E_SET_STATIC_THRESHOLD - from 0 to 1 which matches what Google WebRTC does and results in - lower CPU usage; also added a new bit-per-pixel property to select a - better default bitrate - -- v4l2: add support for ABGR, xBGR, RGBA, and RGBx formats and for - handling interlaced video in alternate fields interlace mode (one - field per buffer instead of one frame per picture with both fields - interleaved) - -- v4l2: Profile and level probing support for H264, H265, MPEG-4, - MPEG-2, VP8, and VP9 video encoders and decoders - -Plugins Ugly - -- asfdemux: extract more metadata: disc number and disc count - -- x264enc: - - - respect YouTube bitrate recommendation when user sets the - YouTube profile preset - - separate high-10 video formats from 8-bit formats to improve - depth negotiation and only advertise suitable input raw formats - for the desired output depth - - forward downstream colorimetry and chroma-site restrictions to - upstream elements - - support more color primaries/mappings - -Plugins Bad - -- av1enc: add threads, row-mt and tile-{columns,rows} properties for - this AOMedia AV1 encoder - -- ccconverter: implement support for CDP framerate conversions - -- ccextractor: Add remove-caption-meta property to remove caption - metas from the outgoing video buffers - -- decklink: add support for 2K DCI video modes, widescreen NTSC/PAL, - and for parsing/outputting AFD/Bar data. Also implement a simple - device provider for Decklink devices. - -- dtlsrtpenc: add rtp-sync property which synchronises RTP streams to - the pipeline clock before passing them to funnel for merging with - RTCP. 
- -- fdkaac: also decode MPEG-2 AAC; encoder now supports more - multichannel/surround sound layouts - -- hlssink2: add action signals for custom playlist/fragment handling: - Instead of always going through the file system API we allow the - application to modify the behaviour. For the playlist itself and - fragments, the application can provide a GOutputStream. In addition - the sink notifies the application whenever a fragment can be - deleted. - -- interlace: can now output data in alternate fields mode; added field - switching mode for 2:2 field pattern - -- iqa: Add a mode property to enable strict mode that checks that all - the input streams have the exact same number of frames; also - implement the child proxy interface - -- mpeg2enc: add disable-encode-retries property for lower CPU usage - -- mpeg4videoparse: allow re-sending codec config at IDR via - config-interval=-1 - -- mpegtsparse: new alignment property to determine number of TS - packets per output buffer, useful for feeding an MPEG-TS stream for - sending via udpsink. This can be used in combination with the - split-on-rai property that makes sure to start a new output buffer - for any TS packet with the Random Access Indicator set. Also set - delta unit buffer flag on non-random-access buffers. - -- mpegdemux: add an ignore-scr property to ignore the SCR in - non-compliant MPEG-PS streams with a broken SCR, which will work as - long as PTS/DTS in the PES header is consistently increasing. - -- tsdemux: - - - add an ignore-pcr property to ignore MPEG-TS streams with broken - PCR streams on which we can’t reliably recover correct - timestamps. 
- - new latency property to allow applications to lower the - advertised worst-case latency of 700ms if they know their - streams support this (must have timestamps in higher frequency - than required by the spec) - - support for AC4 audio - -- msdk - Intel Media SDK plugin for hardware-accelerated video - decoding and encoding on Windows and Linux: - - - mappings for more video formats: Y210, Y410, P012_LE, Y212_LE - - encoders now support bitrate changes and input format changes in - playing state - - msdkh264enc, msdkh265enc: add support for CEA708 closed caption - insertion - - msdkh264enc, msdkh265enc: set Region of Interest (ROI) region - from ROI metas - - msdkh264enc, msdkh265enc: new tune property to enable low-power - mode - - msdkh265enc: add support 12-bit 4:2:0 encoding and 8-bit 4:2:2 - encoding and VUYA, Y210, and Y410 as input formats - - msdkh265enc: add support for screen content coding extension - - msdkh265dec: add support for main-12/main-12-intra, - main-422-10/main-422-10-intra 10bit, - main-422-10/main-422-10-intra 8bit, - main-422-12/main-422-12-intra, main-444-10/main-444-10-intra, - main-444-12/main-444-12-intra, and main-444 profiles - - msdkvp9dec: add support for 12-bit 4:4:4 - - msdkvpp: add support for Y410 and Y210 formats, cropping via - properties, and a new video-direction property. - -- mxf: Add support for CEA-708 CDP from S436 essence tracks. mxfdemux - can now handle Apple ProRes - -- nvdec: add H264 + H265 stateless codec implementation nvh264sldec - and nvh265sldec with fewer features but improved latency. You can - set the environment variable GST_USE_NV_STATELESS_CODEC=h264 to use - the stateless decoder variant as nvh264dec instead of the “normal” - NVDEC decoder implementation. 
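Since the stateless-NVDEC switch described above is an environment variable, it can be tried without any code changes; a sketch with a hypothetical input file:

```shell
# Registers the stateless decoder implementation under the nvh264dec
# name instead of the regular NVDEC-based decoder.
GST_USE_NV_STATELESS_CODEC=h264 gst-launch-1.0 filesrc location=in.h264 \
  ! h264parse ! nvh264dec ! videoconvert ! autovideosink
```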
- -- nvdec: add support for 12-bit 4:4:4/4:2:0 and 10-bit 4:2:0 decoding - -- nvenc: - - - add more rate-control options, support for B-frame encoding (if - device supports it), an aud property to toggle Access Unit - Delimiter insertion, and qp-{min,max,const}-{i,p,b} properties. - - the weighted-pred property enables weighted prediction. - - support for more input formats, namely 8-bit and 10-bit RGB - formats (BGRA, RGBA, RGB10A2, BGR10A2) and YV12 and VUYA. - - on-the-fly resolution changes are now supported as well. - - in case there are multiple GPUs on the system, there are also - per-GPU elements registered now, since different devices will - have different capabilities. - - nvh265enc can now support 10-bit YUV 4:4:4 encoding and 8-bit - 4:4:4 / 10-bit 4:2:0 formats up to 8K resolution (with some - devices). In case of HDR content HDR related SEI nals will be - inserted automatically. - -- openjpeg: enable multi-threaded decoding and add support for - sub-frame encoding (for lower latency) - -- rtponviftimestamp: add opt-out “drop-out-of-segment” property - -- spanplc: new stats property - -- srt: add support for IPv6 and for using hostnames instead of IP - addresses; add streamid property, but also allow passing the id via - the stream URI; add wait-for-connection property to srtsink - -- timecodestamper: this element was rewritten with an updated API - (properties); it has gained many new properties, seeking support and - support for linear timecode (LTC) from an audio stream. 
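The srt features above (stream id via the URI, plus the new srtsink property) can be sketched like this; host, port and stream id are placeholders:

```shell
# wait-for-connection=false starts pushing data before a caller connects,
# instead of blocking; the stream id is carried in the URI query.
gst-launch-1.0 videotestsrc is-live=true ! x264enc ! mpegtsmux \
  ! srtsink uri="srt://127.0.0.1:7001?streamid=mystream" wait-for-connection=false
```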
- -- uvch264src now comes with a device provider to advertise available - camera sources that support this interface (mostly Logitech C920s) - -- wpe: Add software rendering support and support for mouse scroll - events - -- x265enc: support more 8/10/12 bits 4:2:0, 4:2:2 and 4:4:4 profiles; - add support for mastering display info and content light level - encoding SEIs - -gst-libav +- assrender: handle more font mime types; better interaction with + matroskademux for embedded fonts -- Add mapping for SpeedHQ video codec used by NDI +- audiobuffersplit: Add support for specifying output buffer size in + bytes (not just duration) -- Add mapping for aptX and aptX-HD +- audiolatency: new "samplesperbuffer" property so users can configure + the number of samples per buffer. The default value is 240 samples + which is equivalent to 5ms latency with a sample rate of 48000, + which might be larger than actual buffer size of audio capture + device. + +- audiomixer, audiointerleave, GstAudioAggregator: now keep a count of + samples that are dropped or processed as statistic and can be made + to post QoS messages on the bus whenever samples are dropped by + setting the "qos-messages" property on input pads. + +- audiomixer, compositor: improved handling of new inputs added at + runtime. New API was added to the GstAggregator base class to allow + subclasses to opt into an aggregation mode where inactive pads are + ignored when processing input buffers + (gst_aggregator_set_ignore_inactive_pads(), + gst_aggregator_pad_is_inactive()). An “inactive pad” in this context + is a pad which, in live mode, hasn’t yet received a first buffer, + but has been waited on at least once. What would happen usually in + this case is that the aggregator would wait for data on this pad + every time, up to the maximum configured latency. 
This would + inadvertently push mixer elements in live mode to the configured + latency envelope and delay processing when new inputs are added at + runtime until these inputs have actually produced data. This is + usually undesirable. With this new API, new inputs can be added + (requested) and configured and they won’t delay the data processing. + Applications can opt into this new behaviour by setting the + "ignore-inactive-pads" property on compositor, audiomixer or other + GstAudioAggregator-based elements. + +- cccombiner: implement “scheduling” of captions. So far cccombiner’s + behaviour was essentially that of a funnel: it strictly looked at + input timestamps to associate together video and caption buffers. + Now it will try to smoothly schedule caption buffers in order to + have exactly one per output video buffer. This might involve + rewriting input captions, for example when the input is CDP then + sequence counters are rewritten, time codes are dropped and + potentially re-injected if the input video frame had a time code + meta. This can also lead to the input drifting from synchronisation, + when there isn’t enough padding in the input stream to catch up. In + that case the element will start dropping old caption buffers once + the number of buffers in its internal queue reaches a certain limit + (configurable via the "max-scheduled" property). The new original + funnel-like behaviour can be restored by setting the "scheduling" + property to FALSE. + +- ccconverter: new "cdp-mode" property to specify which sections to + include in CDP packets (timecode, CC data, service info). Various + software, including FFmpeg’s Decklink support, fails parsing CDP + packets that contain anything but CC data in the CDP packets. + +- clocksync: new "sync-to-first" property for automatic timestamp + offset setup: if set clocksync will set up the "ts-offset" value + based on the first buffer and the pipeline’s running time when the + first buffer arrived. 
The newly configured "ts-offset" in this case + would be the value that allows outputting the first buffer without + waiting on the clock. This is useful for example to feed a non-live + input into an already-running pipeline. + +- compositor: + + - multi-threaded input conversion and compositing. Set the + "max-threads" property to activate this. + - new "sizing-policy" property to support display aspect ratio + (DAR)-aware scaling. By default the image is scaled to fill the + configured destination rectangle without padding and without + keeping the aspect ratio. With sizing-policy=keep-aspect-ratio + the input image is scaled to fit the destination rectangle + specified by GstCompositorPad:{xpos, ypos, width, height} + properties preserving the aspect ratio. As a result, the image + will be centered in the destination rectangle with padding if + necessary. + - new "zero-size-is-unscaled" property on input pads. By default + pad width=0 or pad height=0 mean that the stream should not be + scaled in that dimension. But if the "zero-size-is-unscaled" + property is set to FALSE a width or height of 0 is instead + interpreted to mean that the input image on that pad should not + be composited, which is useful when creating animations where an + input image is made smaller and smaller until it disappears. 
+ - improved handling of new inputs at runtime via + "ignore-inactive-pads" property (see above for details) + - allow output format with alpha even if none of the inputs have + alpha (also glvideomixer and other GstVideoAggregator + subclasses) + +- dashsink: add H.265 codec support and signals for allowing custom + playlist/fragment output + +- decodebin3: + + - improved decoder selection, especially for hardware decoders + - make input activation “atomic” when adding inputs dynamically + - better interleave handling: take into account decoder latency + for interleave size + +- decklink: + + - Updated DeckLink SDK to 11.2 to support DeckLink 8K Pro + - decklinkvideosrc: + - More accurate and stable capture timestamps: use the + hardware reference clock time when the frame was finished + being captured instead of a clock time much further down the + road. + - Automatically detect widescreen vs. normal NTSC/PAL + +- encodebin: + + - add “smart encoding” support for H.265, VP8 and VP9 (i.e. only + re-encode where needed and otherwise pass through encoded video + as-is). + - H.264/H.265 smart encoding improvements: respect user-specified + stream-format, but if not specified default to avc3/hvc1 with + in-band SPS/PPS/VPS signalling for more flexibility. + - new encodebin2 element with dynamic/sometimes source pads in + order to support the option of doing the muxing outside of + encodebin, e.g. in combination with splitmuxsink. + - add APIs to set element properties on encoding profiles (see + below) + +- errorignore: new "ignore-eos" property to also ignore FLOW_EOS from + downstream elements + +- giosrc: add support for growing source files: applications can + specify that the underlying file being read is growing by setting + the "is-growing" property. If set, the source won’t EOS when it + reaches the end of the file, but will instead start monitoring it + and will start reading data again whenever a change is detected. 
The + new "waiting-data" and "done-waiting-data" signals keep the + application informed about the current state. + +- gtksink, gtkglsink: + + - scroll event support: forwarded as navigation events into the + pipeline + - "video-aspect-ratio-override" property to force a specific + aspect ratio + - "rotate-method" property and support automatic rotation based on + image tags + +- identity: new "stats" property allows applications to retrieve the + number of bytes and buffers that have passed through so far. + +- interlace: add support for more formats, esp 10-bit, 12-bit and + 16-bit ones + +- jack: new "low-latency" property for automatic latency-optimized + setting and "port-names" property to select ports explicitly + +- jpegdec: support output conversion to RGB using libjpeg-turbo (for + certain input files) + +- line21dec: + + - "mode" property to control whether and how detected closed + captions should be inserted in the list of existing closed + caption metas on the input frame (if any): add, drop, or + replace. + - "ntsc-only" property to only look for captions if video has NTSC + resolution + +- line21enc: new "remove-caption-meta" property to remove metas from output + buffers after encoding the captions into the video data; support for + CDP closed captions + +- matroskademux, matroskamux: Add support for ffv1, a lossless + intra-frame video coding format. + +- matroskamux: accept in-band SPS/PPS/VPS for H.264 and H.265 + (i.e. stream-format avc3 and hev1) which allows on-the-fly + profile/level/resolution changes. + +- matroskamux: new "cluster-timestamp-offset" property, useful for use + cases where the container timestamps should map to some absolute + wall clock time, for example. + +- rtpsrc: add "caps" property to allow explicit setting of the caps + where needed + +- mpegts: support SCTE-35 pass-through via new "send-scte35-events" + property on MPEG-TS demuxer tsdemux. When enabled, SCTE 35 sections + (e.g. 
ad placement opportunities) are forwarded as events downstream + where they can be picked up again by mpegtsmux. This required a + semantic change in the SCTE-35 section API: timestamps are now in + running time instead of muxer pts. + +- tsdemux: Handle PCR-less MPEG-TS streams; more robust timestamp + handling in certain corner cases and for poorly muxed streams. + +- mpegtsmux: + + - More conformance improvements to make MPEG-TS analysers happy: + - PCR timing accuracy: Improvements to the way mpegtsmux + outputs PCR observations in CBR mode, so that a PCR + observation is always inserted when needed, so that we never + miss the configured pcr-interval, as that triggers various + MPEG-TS analyser errors. + - Improved PCR/SI scheduling + - Don’t write PCR until PAT/PMT are output to make sure streams + start cleanly with a PAT/PMT. + - Allow overriding the automatic PMT PID selection via + application-supplied PMT_%d fields in the prog-map + structure/property. + +- mp4mux: + + - new "first-moov-then-finalise" mode for fragmented output where + the output will start with a self-contained moov atom for the + first fragment, and then produce regular fragments. Then at the + end when the file is finalised, the initial moov is invalidated + and a new moov is written covering the entire file. This way the + file is a “fragmented mp4” file while it is still being written + out, and remains playable at all times, but at the end it is + turned into a regular mp4 file (with former fragment headers + remaining as unused junk data in the file). + - support H.264 avc3 and H.265 hvc1 stream formats as input where + the codec data is signalled in-band inside the bitstream instead + of caps/file headers. + - support profile/level/resolution changes for H.264/H.265 input + streams (i.e. codec data changing on the fly). 
Each codec_data
+ is put into its own SampleTableEntry inside the stsd, unless the
+ input is in avc3 stream format in which case it’s written
+ in-band and not in the headers.
+
+- multifilesink: new "min-keyframe-distance" property to make
+ minimum distance between keyframes in next-file=key-frame mode
+ configurable instead of hard-coding it to 10 seconds.
+
+- mxfdemux has seen a big refactoring to support non-frame wrappings
+ and more accurate timestamp/seek handling for some formats
+
+- msdk plugin for hardware-accelerated video encoding and decoding
+ using the Intel Media SDK:
+
+ - oneVPL support (Intel oneAPI Video Processing Library)
+ - AV1 decoding support
+ - H.264 decoder now supports constrained-high and progressive-high
+ profiles
+ - H.264 encoder:
+ - more configuration options (properties):
+ "intra-refresh-type", "min-qp", "max-qp", "p-pyramid",
+ "dblk-idc"
+ - H.265 encoder:
+ - can output main-still-picture profile
+ - now inserts HDR SEIs (mastering display colour volume and
+ content light level)
+ - more configuration options (properties):
+ "intra-refresh-type", "min-qp", "max-qp", "p-pyramid",
+ "b-pyramid", "dblk-idc", "transform-skip"
+ - support for 10-bit RGB format
+ - External bitrate control in encoders
+ - Video post proc element msdkvpp gained support for 12-bit pixel
+ formats P012_LE, Y212_LE and Y412_LE
+
+- nvh264sldec: interlaced stream support
+
+- openh264enc: support main, high, constrained-high and
+ progressive-high profiles
+
+- openjpeg: support for multithreaded decoding and encoding
+
+- rtspsrc: now supports IPv6 also for tunneled mode (RTSP-over-HTTP);
+ new "ignore-x-server-reply" property to ignore the
+ x-server-ip-address server header reply in case of HTTP tunneling,
+ as it is often broken.
+
+- souphttpsrc: Runtime compatibility support for libsoup2 and
+ libsoup3.
libsoup3 is the latest major version of libsoup, but
+ libsoup2 and libsoup3 can’t co-exist in the same process because
+ there is no namespacing or versioning for GObject types. As a
+ result, it would be awkward if the GStreamer souphttpsrc plugin
+ linked to a specific version of libsoup, because it would only work
+ with applications that use the same version of libsoup. To make this
+ work, the soup plugin now tries to determine the libsoup version
+ used by the application (and its other dependencies) at runtime on
+ systems where GStreamer is linked dynamically. libsoup3 support is
+ still considered somewhat experimental at this point. Distro
+ packagers, please take note of the souphttpsrc plugin dependency
+ changes mentioned in the build and dependencies section below.
+
+- srtsrc, srtsink: add signals for the application to accept/reject
+ incoming connections
+
+- timeoverlay: new elapsed-running-time time mode which shows the
+ running time since the first running time (and each flush-stop).
+
+- udpsrc: new timestamping mode to retrieve packet receive timestamps
+ from the kernel via socket control messages (SO_TIMESTAMPNS) on
+ supported platforms
+
+- uritranscodebin: new setup-source and element-setup signals for
+ applications to configure the elements used
+
+- v4l2codecs: the plugin gained support for 4x4 and 32x32 tile
+ formats, enabling additional platforms and direct rendering, along
+ with an important reduction in memory usage.
+
+- v4l2slh264dec now implements the final Linux uAPI as shipped on
+ Linux 5.11 and later.
+
+- valve: add "drop-mode" property and provide two new modes of
+ operation: in drop-mode=forward-sticky-events sticky events
+ (stream-start, segment, tags, caps, etc.) are forwarded downstream
+ even when dropping is enabled; drop-mode=transform-to-gap will in
+ addition also convert buffers into gap events when dropping is
+ enabled, which lets downstream elements know that time is advancing
+ and might allow for preroll in many scenarios.
By default all events
+ and all buffers are dropped when dropping is enabled, which can
+ cause problems with caps negotiation not progressing or branches not
+ prerolling when dropping is enabled.
+
+- videocrop: support for many more pixel formats, e.g. planar YUV
+ formats with > 8 bits and GBR* video formats; can now also accept
+ video not backed by system memory as long as downstream supports the
+ GstCropMeta
+
+- videotestsrc: new smpte-rp-219 pattern for SMPTE75 RP-219 conformant
+ color bars
+
+- vp8enc: finish support for temporal scalability: two new properties
+ ("temporal-scalability-layer-flags",
+ "temporal-scalability-layer-sync-flags") and a unit change on the
+ "temporal-scalability-target-bitrate" property (now expects bps);
+ also make temporal scalability details available to RTP payloaders
+ as buffer metadata.
+
+- vp9enc: new properties to tweak encoder performance:
+
+ - "aq-mode" to configure adaptive quantization modes
+ - "frame-parallel-decoding" to configure whether to create a
+ bitstream that reduces decoding dependencies between frames,
+ which allows staged parallel processing of more than one video
+ frame in the decoder. (Defaults to TRUE)
+ - "row-mt", "tile-columns" and "tile-rows" so multithreading can
+ be enabled on a per-tile basis, instead of on a per tile-column
+ basis. In combination with the new "tile-rows" property, this
+ allows the encoder to make much better use of the available CPU
+ power.
+
+- vp9dec, vp9enc: add support for 10-bit 4:2:0 and 4:2:2 YUV, as well
+ as 8-bit 4:4:4
+
+- vp8enc, vp9enc now default to “good quality” for the deadline
+ property rather than “best quality”. Having the deadline set to best
+ quality causes the encoder to be absurdly slow; most real-life users
+ will prefer good-enough quality with better performance instead.
+
+- wpesrc:
+
+ - implement audio support: a new sometimes source pad will be
+ created for each audio stream created by the web engine.
+ - move wpesrc to wpevideosrc and add a wrapper bin wpesrc to also + support audio + - also handles web:// URIs now (same as cefsrc) + - post messages with the estimated load progress on the bus + +- x265enc: add negative DTS support, which means timestamps are now + offset by 1h same as with x264enc + +RTP Payloaders and Depayloaders + +- rtpisacpay, rtpisacdepay: new RTP payloader and depayloader for iSAC + audio codec + +- rtph264depay: + + - new "request-keyframe" property to make the depayloader + automatically request a new keyframe from the sender on packet + loss, consistent with the new property on rtpvp8depay. + - new "wait-for-keyframe" property to make depayloader wait for a + new keyframe at the beginning and after packet loss (only + effective if the depayloader outputs AUs), consistent with the + existing property on rtpvp8depay. + +- rtpopuspay, rtpopusdepay: support libwebrtc-compatible multichannel + audio in addition to the previously supported multichannel audio + modes + +- rtpopuspay: add DTX (Discontinuous Transmission) support + +- rtpvp8depay: new "request-keyframe" property to make the depayloader + automatically request a new keyframe from the sender on packet loss. + +- rtpvp8pay: temporal scaling support + +- rtpvp9depay: Improved SVC handling (aggregate all layers) + +RTP Infrastructure + +- rtpst2022-1-fecdec, rtpst2022-1-fecenc: new elements providing SMPTE + 2022-1 2-D Forward Error Correction. More details in Mathieu’s blog + post. -- avivf_mux: support VP9 and AV1 +- rtpreddec: BUNDLE support -- avvidenc: shift output buffer timestamps and output segment by 1h - just like x264enc does, to allow for negative DTS. 
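The keyframe-recovery properties on the RTP depayloaders above can be tried out with a simple receiving pipeline. This is only an illustrative sketch: the port, payload type and decoder choice are placeholders, and the properties require a GStreamer 1.20 build.

```shell
# Hypothetical H.264 RTP receiver exercising the new depayloader
# properties described above; port, caps and decoder are examples only.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
  ! rtpjitterbuffer \
  ! rtph264depay request-keyframe=true wait-for-keyframe=true \
  ! avdec_h264 ! videoconvert ! autovideosink
```

With request-keyframe=true the depayloader asks the sender for a fresh keyframe when it detects packet loss, while wait-for-keyframe=true makes it hold output until a keyframe arrives, so corrupted inter-frames are not pushed to the decoder.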
+- rtpredenc, rtpulpfecenc: add support for Transport-wide Congestion + Control (TWCC) -- avviddec: Limit default number of decoder threads on systems with - more than 16 cores, as the number of threads used in avdec has a - direct impact on the latency of the decoder, which is of as many - frames as threads, so a large numbers of threads can make for - latency levels that can be problematic in some applications. - -- avviddec: Add thread-type property that allows applications to - specify the preferred multithreading method (auto, frame, slice). - Note that thread-type=frame may introduce additional latency - especially in live pipelines, since it introduces a decoding delay - of number of thread frames. +- rtpsession: new "twcc-feedback-interval" property to allow RTCP TWCC + reports to be scheduled on a timer instead of per marker-bit. Plugin and library moves - There were no plugin moves or library moves in this cycle. -- The rpicamsrc element was moved into -good from an external - repository on github. - Plugin removals The following elements or plugins have been removed: -- The yadif video deinterlacing plugin from gst-plugins-bad, which was - one of the few GPL licensed plugins, has been removed in favour of - deinterlace method=yadif. - -- The avdec_cdgraphics CD Graphics video decoder element from - gst-libav was never usable in GStreamer and we now have a cdgdec - element written in Rust in gst-plugins-rs to replace it. - -- The VDPAU plugin has been unmaintained and unsupported for a very - long time and does not have the feature set we expect from - hardware-accelerated video decoders. It’s been superseded by the - nvcodec plugin leveraging NVIDIA’s NVDEC API. +- The ofa audio fingerprinting plugin has been removed. The MusicIP + database has been defunct for years so this plugin is likely neither + useful nor used by anyone. + +- The mms plugin containing mmssrc has been removed. 
It seems unlikely + anyone still needs this or that there are even any streams left out + there. The MMS protocol was deprecated in 2003 (in favour of RTSP) + and support for it was dropped with Microsoft Media Services 2008, + and Windows Media Player apparently also does not support it any + more. Miscellaneous API additions -GStreamer core - -- gst_task_resume(): This new API allows resuming a task if it was - paused, while leaving it in stopped state if it was stopped or not - started yet. This can be useful for callback-based driver workflows, - where you basically want to pause and resume the task when buffers - are notified while avoiding the race with a gst_task_stop() coming - from another thread. - -- info: add printf extensions GST_TIMEP_FORMAT and GST_STIMEP_FORMAT - for printing GstClockTime/GstClockTimeDiff pointers, which is much - more convenient to use in debug log statements than the usual - GST_TIME_FORMAT-followed-by-GST_TIME_ARGS dance. Also add an - explicit GST_STACK_TRACE_SHOW_NONE enum value. - -- gst_element_get_current_clock_time() and - gst_element_get_current_running_time(): new helper functions for - getting an element clock’s time, and the clock time minus base time, - respectively. Useful when adding additional input branches to - elements such as compositor, audiomixer, flvmux, interleave or - input-selector to determine initial pad offsets and such. - -- seeking: Add GST_SEEK_FLAG_TRICKMODE_FORWARD_PREDICTED to just skip - B-frames during trick mode, showing both keyframes + P-frame, and - add support for it in h264parse and h265parse. - -- elementfactory: add GST_ELEMENT_FACTORY_TYPE_HARDWARE to allow - elements to advertise that they are hardware-based or interact with - hardware. This has multiple applications: - - - it makes it possible to easily differentiate hardware and - software based element implementations such as audio or video - encoders and decoders. 
This is useful in order to force the use - of software decoders for specific use cases, or to check if a - selected decoder is actually hardware-accelerated or not. - - elements interacting with hardware and their respective drivers - typically don’t know the actually supported capabilities until - the element is set into at least READY state and can open a - device handle and probe the hardware. - -- gst_uri_from_string_escaped(): identical to gst_uri_from_string() - except that the userinfo and fragment components of the URI will not - be unescaped while parsing. This is needed for correctly parsing - usernames or passwords with : in them . - -- paramspecs: new GstParamSpec flag GST_PARAM_CONDITIONALLY_AVAILABLE - to indicate that a property might not always exist. - -- gst_bin_iterate_all_by_element_factory_name() finds elements in a - bin by factory name - -- pad: gst_pad_get_single_internal_link() is a new convenience - function to return the single internal link of a pad, which is - useful e.g. to retrieve the output pad of a new multiqueue request - pad. - -- datetime: Add constructors to create datetimes with timestamps in - microseconds, gst_date_time_new_from_unix_epoch_local_time_usecs() - and gst_date_time_new_from_unix_epoch_utc_usecs(). - -- gst_debug_log_get_lines() gets debug log lines formatted in the same - way the default log handler would print them - -- GstSystemClock: Add GST_CLOCK_TYPE_TAI as GStreamer abstraction for - CLOCK_TAI, to support transmission offloading features where network - packets are timestamped with the time they are deemed to be actually - transmitted. Useful in combination with the new AVTP plugin. - -- miscellaneous utility functions: gst_clear_uri(), - gst_structure_take(). - -- harness: Added gst_harness_pull_until_eos() - -- GstBaseSrc: - - - gst_base_src_new_segment() allows subclasses to update the - segment to be used at runtime from the ::create() function. 
This - deprecates gst_base_src_new_seamless_segment() - - gst_base_src_negotiate() allows subclasses to trigger format - renegotiation at runtime from inside the ::create() or ::alloc() - function - -- GstBaseSink: new stats property and gst_base_sink_get_stats() method - to retrieve various statistics such as average frame rate and - dropped/rendered buffers. - -- GstBaseTransform: gst_base_transform_reconfigure() is now public - API, useful for subclasses that need to completely re-implement the - ::submit_input_buffer() virtual method - -- GstAggregator: - - - gst_aggregator_update_segment() allows subclasses to update the - output segment at runtime. Subclasses should use this function - rather than push a segment event onto the source pad directly. - - new sample selection API: - - subclasses should now call gst_aggregator_selected_samples() - from their ::aggregate() implementation to signal that they - have selected the next samples they will aggregate - - GstAggregator will then emit the samples-selected signal - where handlers can then look up samples per pad via - gst_aggregator_peek_next_sample(). - - This is useful for example to atomically update input pad - properties in mixer subclasses such as compositor. - Applications can now update properties with precise control - of when these changes will take effect, and for which input - buffer(s). - - gst_aggregator_finish_buffer_list() allows subclasses to push - out a buffer list, improving efficiency in some cases. - - a ::negotiate() virtual method was added, for consistency with - other base classes and to allow subclasses to completely - override the negotiation behaviour. - - the new ::sink_event_pre_queue() and ::sink_query_pre_queue() - virtual methods allow subclasses to intercept or handle - serialized events and queries before they’re queued up - internally. 
-
-GStreamer Plugins Base Libraries
-
-Audio library
-
-- audioaggregator, audiomixer: new output-buffer-duration-fraction
- property which allows use cases such as keeping the buffers output
- by compositor on one branch and audiomixer on another perfectly
- aligned, by requiring the compositor to output a n/d frame rate, and
- setting output-buffer-duration-fraction to d/n on the audiomixer.
-
-- GstAudioDecoder: new max-errors property so applications can
- configure at what point the decoder should error out, or tell it to
- just keep going
-
-- gst_audio_make_raw_caps() and gst_audio_formats_raw() are
- bindings-friendly versions of the GST_AUDIO_CAPS_MAKE() C macro.
-
-- gst_audio_info_from_caps() now handles encoded audio formats as well
-
-PbUtils library
-
-- GstEncodingProfile:
- - Do not restrict number of similar profiles in a container
- - add GstValue serialization function
-- codec utils now support more H.264/H.265 profiles/levels and have
- improved extension handling
-
-RTP library
-
-- rtpbasepayloader: Add scale-rtptime property for scaling RTP
- timestamp according to the segment rate (equivalent to RTSP speed
- parameter). This is useful for ONVIF trickmodes via RTSP.
-
-- rtpbasepayload: add experimental property for embedding twcc
- sequencenumbers for Transport-Wide Congestion Control (gated behind
- the GST_RTP_ENABLE_EXPERIMENTAL_TWCC_PROPERTY environment
- variable) - more generic API for enabling this is expected to land
- in the next development cycle.
-
-- rtcpbuffer: add RTPFB_TYPE_TWCC for Transport-Wide Congestion
- Control
-
-- rtpbuffer: add
- gst_rtp_buffer_get_extension_onebyte_header_from_bytes(), so that
- one can parse the GBytes returned by
- gst_rtp_buffer_get_extension_bytes()
-
-- rtpbasedepayload: Add max-reorder property to make the
- previously-hardcoded value for when to consider a sender to have
- restarted configurable.
In some scenarios it’s particularly useful
- to set max-reorder=0 to disable the behaviour that the depayloader
- will drop packets: when max-reorder is set to 0 all
- reordered/duplicate packets are considered coming from a restarted
- sender.
-
-RTSP library
-
-- add gst_rtsp_url_get_request_uri_with_control() to create request
- uri combined with control url
-
-- GstRTSPConnection: add the possibility to limit the Content-Length
- for RTSP messages via
- gst_rtsp_connection_set_content_length_limit(). The same
- functionality is also exposed in gst-rtsp-server.
-
-SDP library
-
-- add support for parsing the extmap attribute from caps and storing
- it inside caps. The extmap attribute allows mapping RTP extension
- header IDs to well-known RTP extension header specifications. See
- RFC8285 for details.
+Core
-Tags library
+- gst_buffer_new_memdup() is a convenience function for the
+ widely-used gst_buffer_new_wrapped(g_memdup(data,size),size)
+ pattern.
+
+- gst_caps_features_new_single() creates a new single GstCapsFeatures,
+ avoiding the need to use the vararg function with NULL terminator
+ for simple cases.
+
+- gst_element_type_set_skip_documentation() can be used by plugins to
+ signal that certain elements should not be included in the GStreamer
+ plugin documentation. This is useful for plugins where elements are
+ registered dynamically based on hardware capabilities and/or where
+ the available plugins and properties vary from system to system.
+ This is used in the d3d11 plugin for example to ensure that only the
+ list of default elements is advertised in the documentation.
+
+- gst_type_find_suggest_empty_simple() is a new convenience function
+ for typefinders for cases where there’s only a media type and no
+ other fields.
+ +- New API to create elements and set properties at construction time, + which is not only convenient, but also allows GStreamer elements to + have construct-only properties: gst_element_factory_make_full(), + gst_element_factory_make_valist(), + gst_element_factory_make_with_properties(), + gst_element_factory_create_full(), + gst_element_factory_create_valist(), + gst_element_factory_create_with_properties(). + +- GstSharedTaskPool: new “shared” task pool subclass with slightly + different default behaviour than the existing GstTaskPool which + would create unlimited number of threads for new tasks. The shared + task pool creates up to N threads (default: 1) and then distributes + pending tasks to those threads round-robin style, and blocks if no + thread is available. It is possible to join tasks. This can be used + by plugins to implement simple multi-threaded processing and is used + for the new multi-threaded video conversion and compositing done in + GstVideoAggregator, videoconverter and compositor. + +Plugins Base Utils library + +- GstDiscoverer: + + - gst_discoverer_container_info_get_tags() was added to retrieve + global/container tags (vs. per-stream tags). Per-Stream tags can + be retrieved via the existing + gst_discoverer_stream_info_get_tags(). + gst_discoverer_info_get_tags(), which for many files returns a + confusing mix of stream and container tags, has been deprecated + in favour of the container/stream-specific functions. + - gst_discoverer_stream_info_get_stream_number() returns a unique + integer identifier for a given stream within the given + GstDiscoverer context. (If this matches the stream number inside + the container bitstream that’s by coincidence and not by + design.) + +- gst_pb_utils_get_caps_description_flags() can be used to query + whether certain caps represent a container, audio, video, image, + subtitles, tags, or something else. This only works for formats + known to GStreamer. 
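The GstDiscoverer additions above (container vs. per-stream tags, unique stream numbers) surface through the same discoverer machinery that powers the gst-discoverer-1.0 tool shipped with gst-plugins-base, which offers a quick way to see what the discoverer reports for a file. The filename below is just a placeholder:

```shell
# Print verbose stream topology and per-stream tags for a local file.
# "sample.mkv" is a placeholder; any local media file will do.
gst-discoverer-1.0 -v sample.mkv
```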
+
+- gst_pb_utils_get_file_extension_from_caps() returns a possible file
+ extension for given caps.
+
+- gst_codec_utils_h264_get_profile_flags_level(): Parses profile,
+ flags, and level from H.264 AvcC codec_data. The format of H.264
+ AVCC extradata/sequence_header is documented in the ITU-T H.264
+ specification section 7.3.2.1.1 as well as in ISO/IEC 14496-15
+ section 5.3.3.1.2.
+
+- gst_codec_utils_caps_get_mime_codec() to convert caps to an RFC 6381
+ compatible MIME codec string. Useful for providing the codecs
+ field inside the Content-Type HTTP header for container formats,
+ such as mp4 or Matroska.
+
+GStreamer OpenGL integration library and plugins
+
+- glcolorconvert: added support for converting the video formats A420,
+ AV12, BGR, BGRA, RGBP and BGRP.
+
+- Added support to GstGLBuffer for persistent buffer mappings where a
+ Pixel Buffer Object (PBO) can be mapped by both the CPU and the GPU.
+ This removes a memcpy() when uploading textures or vertices,
+ particularly when software decoders (e.g. libav) are direct
+ rendering into our memory. Improves transfer performance
+ significantly. Requires OpenGL 4.4, GL_ARB_buffer_storage or
+ GL_EXT_buffer_storage.
+
+- Added various helper functions for handling 4x4 matrices of affine
+ transformations as used by GstVideoAffineTransformationMeta.
+
+- Add support to GstGLContext for allowing the application to control
+ the config (EGLConfig, GLXConfig, etc.) used when creating the OpenGL
+ context. This makes it possible to choose between RGB16, RGB10A2
+ or RGBA8 back/front buffer configurations that were previously
+ hardcoded. GstGLContext also supports retrieving the configuration
+ it was created with or from an externally provided OpenGL context
+ handle. This infrastructure is also used to create a compatible
+ config from an application/externally provided OpenGL context in
+ order to improve compatibility with other OpenGL frameworks and GUI
+ toolkits.
A new environment variable GST_GL_CONFIG was also added to + be able to request a specific configuration from the command line. + Note: different platforms will have different functionality + available. + +- Add support for choosing between EGL and WGL at runtime when running + on Windows. Previously this was a build-time switch. Allows use in + e.g. Gtk applications on Windows that target EGL/ANGLE without + recompiling GStreamer. gst_gl_display_new_with_type() can be used by + applications to choose a specific display type to use. + +- Build fixes to explicitly check for Broadcom-specific libraries on + older versions of the Raspberry Pi platform. The Broadcom OpenGL ES + and EGL libraries have different filenames. Using the vc4 Mesa + driver on the Raspberry Pi is not affected. + +- Added support to glupload and gldownload for transferring RGBA + buffers using the memory:NVMM available on the Nvidia Tegra family + of embedded devices. + +- Added support for choosing libOpenGL and libGLX as used in a GLVND + environment on unix-based platforms. This allows using desktop + OpenGL and EGL without pulling in any GLX symbols as would be + required with libGL. -- update to latest iso-code and support more languages +Video library -- add tags for acoustid id & acoustid fingerprint, plus MusicBrainz ID - handling fixes +- New raw video formats: -Video library + - AV12 (NV12 with alpha plane) + - RGBP and BGRP (planar RGB formats) + - ARGB64 variants with specified endianness instead of host + endianness: + - ARGB64_LE, ARGB64_BE + - RGBA64_BE, RGBA64_LE + - BGRA64_BE, BGRA64_LE + - ABGR64_BE, ABGR64_LE + +- gst_video_orientation_from_tag() is new convenience API to parse the + image orientation from a GstTagList. 
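As a quick smoke test for one of the new raw formats listed above, a caps filter can force negotiation of that format. This is a sketch only: it assumes the videoconvert element in your 1.20 build already supports AV12; if it does not, the pipeline will simply fail to negotiate.

```shell
# Ask videoconvert to produce AV12 (NV12 plus an alpha plane) and
# discard the converted frames; num-buffers keeps the run short.
gst-launch-1.0 videotestsrc num-buffers=30 \
  ! videoconvert ! video/x-raw,format=AV12 ! fakesink
```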
-- High Dynamic Range (HDR) video information representation and - signalling enhancements: +- GstVideoDecoder subframe support (see below) - - New APIs for HDR video information representation and - signalling: - - GstVideoMasteringDisplayInfo: display color volume info as - per SMPTE ST 2086 - - GstVideoContentLightLevel: content light level specified in - CEA-861.3, Appendix A. - - plus functions to serialise/deserialise and add them to or - parse them from caps - - gst_video_color_{matrix,primaries,transfer}_{to,from}_iso(): - new utilility functions for conversion from/to ISO/IEC - 23001-8 - - add ARIB STD-B67 transfer chracteristic function - - add SMPTE ST 2084 support and BT 2100 colorimetry - - define bt2020-10 transfer characteristics for clarity: - bt707, bt2020-10, and bt2020-12 transfer characteristics are - functionally identical but have their own unique values in - the specification. - - h264parse, h265parse: Parse mastering display info and content - light level from SEIs. - - matroskademux: parse HDR metadata - - matroskamux: Write MasteringMetadata and Max{CLL,FALL}. Enable - muxing with HDR meta data if upstream provided it - - avviddec: Extract HDR information if any and map bt2020-10, PQ - and HLG transfer functions - -- added bt601 transfer function (for completeness) - -- support for more pixel formats: - - - Y412 (packed 12 bits 4:4:4:4) - - Y212 (packed 12 bits 4:2:2) - - P012 (semi-planar 4:2:0) - - P016_{LE,BE} (semi-planar 16 bits 4:2:0) - - Y444_16{LE,BE} (planar 16 bits 4:4:4) - - RGB10A2_LE (packed 10-bit RGB with 2-bit alpha channel) - - NV12_32L32 (NV12 with 32x32 tiles in linear order) - - NV12_4L4 (NV12 with 4x4 tiles in linear order) - -- GstVideoDecoder: - - - new max-errors property so applications can configure at what - point the decoder should error out, or tell it to just keep - going - - - new qos property to disable dropping frames because of QoS, and - post QoS messages on the bus when dropping frames. 
This is - useful for example in a scenario where the decoded video is - tee-ed off to go into a live sink that syncs to the clock in one - branch, and an encoding and save to file pipeline in the other - branch. In that case one wouldn’t want QoS events from the video - sink make the decoder drop frames because that would also leave - gaps in the encoding branch then. - -- GstVideoEncoder: - - - gst_video_encoder_finish_subframe() is new API to push out - subframes (e.g. slices), so encoders can split the encoding into - subframes, which can be useful to reduce the overall end-to-end - latency as we no longer need to wait for the full frame to be - encoded to start decoding or sending out the data. - - new min-force-key-unit-interval property allows configuring the - minimum interval between force-key-unit requests and prevents a - big bitrate increase if a lot of key-units are requested in a - short period of time (as might happen in live streaming RTP - pipelines when packet loss is detected). - - various force-key-unit event handling fixes - -- GstVideoAggregator, compositor, glvideomixer: expose - max-last-buffer-repeat property on pads. This can be used to have a - compositor display either the background or a stream on a lower - zorder after a live input stream freezes for a certain amount of - time, for example because of network issues. - -- gst_video_format_info_component() is new API to find out which - components are packed into a given plane, which is useful to prevent - us from assuming a 1-1 mapping between planes and components. - -- gst_video_make_raw_caps() and gst_video_formats_raw() are - bindings-friendly versions of the GST_VIDEO_CAPS_MAKE() C macro. - -- video-blend: Add support for blending on top of 16 bit per component - formats, which makes sure we can support every currently supported - raw video format for blending subtitles or logos on top of video. 
- -- GST_VIDEO_BUFFER_IS_TOP_FIELD() and - GST_VIDEO_BUFFER_IS_BOTTOM_FIELD() convenience macros to check - whether the video buffer contains only the top field or bottom field - of an interlaced picture. - -- GstVideoMeta now includes an alignment field with the - GstVideoAlignment so buffer producers can explicitly specify the - exact geometry of the planes, allowing users to easily know the - padded size and height of each plane. Default values will be used if - this is not set. - - Use gst_video_meta_set_alignment() to set the alignment and - gst_video_meta_get_plane_size() or gst_video_meta_get_plane_height() - to compute the plane sizes or plane heights based on the information - in the video meta. +- GstVideoCodecState now also carries some HDR metadata -- gst_video_info_align_full() works like gst_video_info_align() but - also retrieves the plane sizes. +- Ancillary video data: implement transform functions for AFD/Bar + metas, so they will be forwarded in more cases MPEG-TS library -- support for SCTE-35 sections - -- extend support for ATSC tables: +This library only handles section parsing and such, see above for +changes to the actual mpegtsmux and mpegtsdemux elements. - - System Time Table (STT) - - Master Guide Table (MGT) - - Rating Region Table (RRT) +- many additions and improvements to SCTE-35 section parsing +- new API for fetching extended descriptors: + gst_mpegts_find_descriptor_with_extension() +- add support for SIT sections (Selection Information Tables) +- expose event-from-section constructor gst_event_new_mpegts_section() +- parse Audio Preselection Descriptor needed for Dolby AC-4 + +GstWebRTC library + webrtcbin + +- Change the way in which sink pads and transceivers are matched + together to support easier usage. If a pad is created without a + specific index (i.e. using sink_%u as the pad template), then an + available compatible transceiver will be searched for. If a specific + index is requested (i.e. 
sink_1) then if a transceiver for that m-line already exists, that
  transceiver must match the new sink pad request. If there is no
  transceiver available in either scenario, a new transceiver is
  created. If a mixture of both sink_1 and sink_%u requests results
  in an impossible situation, an error will be produced at pad
  request time or from create offer/answer.

- webrtcbin now uses regular ICE nomination instead of libnice's
  default of aggressive ICE nomination. Regular ICE nomination is
  the default recommended by various relevant standards and improves
  connectivity in specific network scenarios.

- Add support for limiting the port range used for RTP with the
  addition of the min-rtp-port and max-rtp-port properties on the
  ICE object.

- Expose the SCTP transport as a property on webrtcbin to more
  closely match the WebRTC specification.

- Added support for taking into account the data channel transport
  state when determining the value of the "connection-state"
  property. Previous versions of the WebRTC spec did not include the
  data channel state when computing this value.

- Add configuration for choosing the size of the underlying sockets
  used for transporting media data

- Always advertise support for the transport-cc RTCP feedback
  protocol as rtpbin supports it. For full support, the configured
  caps (input or through codec-preferences) need to include the
  relevant RTP header extension.

- Numerous fixes to caps and media handling to fail fast when an
  incompatible situation is detected.

- Improved support for attaching the required media after a remote
  offer has been set.

- Add support for dynamically changing the amount of FEC used for a
  particular stream.

- webrtcbin now stops further SDP processing at the first error it
  encounters.

- Completed support for either the local or the remote peer closing
  a data channel.
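In application code, the new min-rtp-port/max-rtp-port properties
mentioned above are set on the ICE object. A hedged sketch, not a
complete program: it assumes webrtcbin exposes its ICE agent via an
"ice-agent" property (check your webrtcbin version), and omits all
pipeline setup and error handling.

```c
#include <gst/gst.h>

/* Restrict webrtcbin's RTP candidate ports to [50000, 50100].
 * The min-rtp-port/max-rtp-port property names come from the notes
 * above; the "ice-agent" accessor is an assumption. */
static void
limit_rtp_port_range (GstElement * webrtcbin)
{
  GObject *ice = NULL;

  g_object_get (webrtcbin, "ice-agent", &ice, NULL);
  if (ice != NULL) {
    g_object_set (ice, "min-rtp-port", 50000, "max-rtp-port", 50100, NULL);
    g_object_unref (ice);
  }
}
```

Setting the range on the ICE object rather than on webrtcbin itself
matches where the properties were added according to the notes above.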
- Various fixes when performing BUNDLEing of the media streams in
  relation to RTX and FEC usage.

- Add support for writing out QoS DSCP markings on outgoing packets
  to improve reliability in some network scenarios.

- Improvements to the statistics returned by the get-stats signal,
  including the addition of the raw statistics from the internal
  RTPSource and the TWCC stats when available.

- The webrtc library does not expose any objects with public fields
  anymore. Instead, properties have been added to replace that
  functionality. If you are accessing such fields in your
  application, switch to the corresponding properties.

GstCodecs and Video Parsers

- Support for render delays to improve throughput across all CODECs
  (used with NVDEC and V4L2).

- lots of improvements to the parsers and the codec parsing decoder
  base classes (H.264, H.265, VP8, VP9, AV1, MPEG-2) used for
  various hardware-accelerated decoder APIs.

Bindings support

- gst_allocation_params_new() allocates a GstAllocationParams struct
  on the heap. This should only be used by bindings (and freed via
  gst_allocation_params_free() afterwards). In C code you would
  allocate this on the stack and only init it in place.

- gst_debug_log_literal() can be used to log a string to the debug
  log without going through any printf format expansion and the
  associated overhead. This is mostly useful for bindings such as
  the Rust bindings, which may have done their own formatting
  already.

- Provide non-inlined versions of refcounting APIs for various
  GStreamer mini objects, so that they can be consumed by bindings
  (e.g.
gstreamer-sharp): gst_buffer_ref, gst_buffer_unref,
  gst_clear_buffer, gst_buffer_copy, gst_buffer_replace,
  gst_buffer_list_ref, gst_buffer_list_unref, gst_clear_buffer_list,
  gst_buffer_list_copy, gst_buffer_list_replace, gst_buffer_list_take,
  gst_caps_ref, gst_caps_unref, gst_clear_caps, gst_caps_replace,
  gst_caps_take, gst_context_ref, gst_context_unref, gst_context_copy,
  gst_context_replace, gst_event_replace, gst_event_steal,
  gst_event_take, gst_event_ref, gst_event_unref, gst_clear_event,
  gst_event_copy, gst_memory_ref, gst_memory_unref, gst_message_ref,
  gst_message_unref, gst_clear_message, gst_message_copy,
  gst_message_replace, gst_message_take, gst_promise_ref,
  gst_promise_unref, gst_query_ref, gst_query_unref, gst_clear_query,
  gst_query_copy, gst_query_replace, gst_query_take, gst_sample_ref,
  gst_sample_unref, gst_sample_copy, gst_tag_list_ref,
  gst_tag_list_unref, gst_clear_tag_list, gst_tag_list_replace,
  gst_tag_list_take, gst_uri_copy, gst_uri_ref, gst_uri_unref,
  gst_clear_uri.

- expose a GType for GstMiniObject

- gst_device_provider_probe() now returns a non-floating device
  object

API Deprecations

- gst_element_get_request_pad() has been deprecated in favour of the
  newly-added gst_element_request_pad_simple() which does the exact
  same thing but has a less confusing name that hopefully makes
  clear that the function requests a new pad rather than just
  retrieving an already-existing request pad.

- gst_discoverer_info_get_tags(), which for many files returns a
  confusing mix of stream and container tags, has been deprecated in
  favour of the container-specific and stream-specific functions,
  gst_discoverer_container_info_get_tags() and
  gst_discoverer_stream_info_get_tags().

- gst_video_sink_center_rect() was deprecated in favour of the more
  generic newly-added gst_video_center_rect().
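The request-pad deprecation above is a pure rename. A migration
sketch that keeps building against older GStreamer versions (the
version guard value is an assumption based on the 1.19 development
cycle; "sink_%u" is just an example pad template):

```c
#include <gst/gst.h>

/* Request a sink pad from e.g. a muxer, using the non-deprecated
 * name where available. Both functions behave identically. */
static GstPad *
request_sink_pad (GstElement * mux)
{
#if GST_CHECK_VERSION (1, 19, 1)
  return gst_element_request_pad_simple (mux, "sink_%u");
#else
  return gst_element_get_request_pad (mux, "sink_%u");
#endif
}
```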
- The GST_MEMORY_FLAG_NO_SHARE flag has been deprecated, as it tends
  to cause problems and prevents sub-buffering. If pooling or
  lifetime tracking is required, memories should be allocated
  through a custom GstAllocator instead of relying on the lifetime
  of the buffers the memories were originally attached to, which is
  fragile anyway.

- The GstPlayer high-level playback library is being replaced with
  the new GstPlay library (see above). GstPlayer should be
  considered deprecated at this point and will be marked as such in
  the next development cycle. Applications should be ported to
  GstPlay.

- GStreamer Editing Services: ges_video_transition_set_border(),
  ges_video_transition_get_border(),
  ges_video_transition_set_inverted() and
  ges_video_transition_is_inverted() have been deprecated, use
  ges_timeline_element_set_children_properties() instead.

Miscellaneous performance, latency and memory optimisations
More video conversion fast paths

- v210 ↔ I420, YV12, Y42B, UYVY and YUY2
- A420 → RGB
Less jitter when waiting on the system clock

- Better system clock wait accuracy, less jitter: where available,
  clock_nanosleep is used for higher accuracy for waits below 500
  microseconds, and waits below 2 ms will first use the regular
  waiting system and then clock_nanosleep for the remainder. The
  various wait implementations have a latency ranging from 50 to
  500+ microseconds. While this is not a major issue when dealing
  with a low number of waits per second (for example video), it does
  introduce a non-negligible jitter for synchronisation of higher
  packet rate systems.

Video decoder subframe support

- The GstVideoDecoder base class gained API to process input at the
  sub-frame level. That way video decoders can start decoding slices
  before they have received the full input frame in its entirety (to
  the extent this is supported by the codec, of course). This helps
  with CPU utilisation and reduces latency.

- This functionality is now being used in the OpenJPEG JPEG 2000
  decoder, the FFmpeg H.264 decoder (in case of NAL-aligned input)
  and the OpenMAX H.264/H.265 decoders (in case of NAL-aligned
  input).

Miscellaneous other changes and enhancements

- GstDeviceMonitor no longer fails to start just because one of the
  device providers failed to start.
  That could happen for example on systems where the pulseaudio
  device provider is installed, but pulseaudio isn't actually
  running and ALSA is used for audio instead. In the same vein the
  device monitor now keeps track of which providers have been
  started (via the new gst_device_provider_is_started()) and only
  stops actually running device providers when stopping the device
  monitor.

- On embedded systems it can be useful to create a registry that can
  be shared and read by multiple processes running as different
  users. It is now possible to set the new GST_REGISTRY_MODE
  environment variable to specify the file mode for the registry
  file, which by default is set to be only user readable/writable.

- GstNetClientClock will signal lost sync in case the remote time
  resets (e.g. because the device power cycles), by emitting the
  "synced" signal with a synced=FALSE parameter, so applications can
  take action.

- gst_value_deserialize_with_pspec() allows deserialisation with a
  hint for what the target GType should be. This allows for example
  passing arrays of flags through the command line or
  gst_util_set_object_arg(), e.g.: foo="<bar,bar+baz>".

- It's now possible to create an empty GstVideoOverlayComposition
  without any rectangles by passing a NULL rectangle to
  gst_video_overlay_composition_new(). This is useful for bindings
  and simplifies application code in some places.

Tracing framework, debugging and testing improvements

- New factories tracer to list loaded elements (and other plugin
  features). This can be useful to collect a list of elements needed
  for an application, which in turn can be used to create a tailored
  minimal GStreamer build that contains just the elements needed and
  nothing else.
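The factories tracer mentioned above is enabled like any other
tracer, via environment variables. A sketch, assuming a GStreamer
1.20 installation; the exact log output format is not guaranteed:

```shell
# Run a short pipeline with the tracer active and keep only the
# tracer output; tracers log through the GST_TRACER debug category.
GST_TRACERS=factories GST_DEBUG=GST_TRACER:7 \
  gst-launch-1.0 audiotestsrc num-buffers=10 ! fakesink 2>&1 \
  | grep GST_TRACER
```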
- New plugin-feature-loaded tracing hook for use by tracers like the
  new factories tracer

- GstHarness: Add gst_harness_set_live() so that harnesses can be
  set to non-live and return is-live=false in latency queries if
  needed. Default behaviour is to always return is-live=true in
  latency queries.

- navseek: new "hold-eos" property. When enabled, the element will
  hold back an EOS event until the next keystroke (via navigation
  events). This can be used to keep a video sink showing the last
  frame of a video pipeline until a key is pressed instead of
  tearing it down immediately on EOS.

- New fakeaudiosink element: mimics an audio sink and can be used
  for testing and CI pipelines on systems where no audio system is
  installed or running. It differs from fakesink in that it only
  supports audio caps and syncs to the clock by default like a
  normal audio sink. It also implements the GstStreamVolume
  interface like most audio sinks do.

- New videocodectestsink element for video codec conformance
  testing: calculates MD5 checksums for video frames and skips any
  padding whilst doing so. Can optionally also write back the video
  data with padding removed into a file for easy byte-by-byte
  comparison with reference data.

Tools

gst-inspect-1.0

- Can sort the list of plugins by passing --sort=name as command
  line option

gst-launch-1.0
- will now error out on top-level properties that don't exist and
  which were silently ignored before

- On Windows the high-resolution clock is enabled now, which
  provides better clock and timer performance (see the Windows
  section below for more details)

gst-play-1.0

- New --start-position command line argument to start playback from
  the specified position

- Audio can be muted/unmuted in interactive mode by pressing the m
  key

- On Windows the high-resolution clock is enabled now (see the
  Windows section below for more details)

gst-device-monitor-1.0

- New --include-hidden command line argument to also show "hidden"
  device providers

ges-launch-1.0

- New interactive mode that allows seeking and such. Can be disabled
  by passing the --no-interactive argument on the command line.

- Option to forward tags

- Allow using an existing clip to determine the rendering format
  (both topology and profile) via the new --profile-from command
  line argument.

GStreamer RTSP server
- GstRTSPMediaFactory gained API to disable RTCP
  (gst_rtsp_media_factory_set_enable_rtcp(), "enable-rtcp"
  property). Previously RTCP was always allowed for all RTSP medias.
  With this change it is possible to disable RTCP completely,
  irrespective of whether the client wants to do RTCP or not.

- Make a mount point of / work correctly. While not allowed by the
  RTSP 2 spec, the RTSP 1 spec is silent on this and it is used in
  the wild. It is now possible to use / as a mount path in
  gst-rtsp-server, e.g. rtsp://example.com/ would work with this
  now. Note that query/fragment parts of the URI are not necessarily
  correctly handled, and behaviour will differ between various
  client/server implementations; so use it if you must, but don't
  bug us if it doesn't work with third party clients as you'd hoped.
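The new RTCP toggle above is applied on the media factory. A minimal
sketch (the pipeline string and mount path are illustrative; server
creation, attaching to a main loop, and error handling are omitted):

```c
#include <gst/rtsp-server/rtsp-server.h>

/* Serve a test stream at rtsp://<host>:8554/ with RTCP disabled. */
static void
add_test_mount (GstRTSPServer * server)
{
  GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);
  GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

  gst_rtsp_media_factory_set_launch (factory,
      "( videotestsrc is-live=true ! x264enc ! rtph264pay name=pay0 pt=96 )");

  /* new in 1.20: refuse RTCP regardless of what the client requests */
  gst_rtsp_media_factory_set_enable_rtcp (factory, FALSE);

  /* a mount point of "/" is now handled correctly, per the note above */
  gst_rtsp_mount_points_add_factory (mounts, "/", factory);
  g_object_unref (mounts);
}
```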
- multithreading fixes (races, refcounting issues, deadlocks)

- ONVIF audio backchannel fixes

- ONVIF trick mode optimisations

- rtspclientsink: new "update-sdp" signal that allows updating the
  SDP before sending it to the server via ANNOUNCE. This can be used
  to add additional metadata to the SDP, for example. The order and
  number of medias must not be changed, however.

GStreamer VAAPI

- new AV1 decoder element (vaapiav1dec)

- H.264 decoder: handle stereoscopic 3D video with frame packing
  arrangement SEI messages

- H.265 encoder: added Screen Content Coding extensions support

- H.265 decoder: gained MAIN_444_12 profile support (decoded to
  Y412_LE), and 4:2:2 12-bit support (decoded to Y212_LE)
- vaapipostproc: gained BT2020 color standard support

- vaapidecode: now generates caps templates dynamically at runtime
  in order to advertise actually supported caps instead of all
  theoretically supported caps.
- GST_VAAPI_DRM_DEVICE environment variable to force a specified DRM
  device when a DRM display is used. It is ignored when other types
  of displays are used. By default /dev/dri/renderD128 is used for
  the DRM display.

GStreamer OMX
- -- Implemented an OpenTimelineIO GES formatter. This means GES and - GStreamer can now load and save projects in all the formats - supported by otio. - -- Implemented a GESMarkerList object which allow setting timed - metadata on any GES object. - -- Fixed audio rendering issues during clip transition by ensuring that - a single segment is pushed into encoders. - -- The GESUriClipAsset API is now MT safe. - -- Added ges_meta_container_register_static_meta() to allow fixing a - type for a specific metadata without actually setting a value. - -- The framepositioner element now handles resizing the project and - keeps the same positioning when the aspect ratio is not changed . - -- Reworked the documentation, making it more comprehensive and much - more detailed. - -- Added APIs to retrieve natural size and framerate of a clip (for - example in the case of URIClip it is the framerate/size of the - underlying file). - -- ges_container_edit() is now deprecated and GESTimelineElement gained - the ges_timeline_element_edit() method so the editing API is now - usable from any element in the timeline. - -- GESProject::loading was added so applications can be notified about - when a new timeline starts loading. - -- Implemented the GstStream API in GESTimeline. - -- Added a way to add a timeoverlay inside the test source (potentially - with timecodes). 
- -- Added APIs to convert times to frame numbers and vice versa: - - - ges_timeline_get_frame_time() - - - ges_timeline_get_frame_at() - - - ges_clip_asset_get_frame_time() - - - ges_clip_get_timeline_time_from_source_frame() - - Quite a few validate tests have been implemented to check the - behavior for various demuxer/codec formats - -- Added ges_layer_set_active_for_tracks() which allows muting layers - for the specified tracks - -- Deprecated GESImageSource and GESMultiFileSource now that we have - imagesequencesrc which handles the imagesequence “protocol” - -- Stopped exposing ‘deinterlacing’ children properties for clip types - where they do not make sense. - -- Added support for simple time remapping effects +- framepositioner: new "operator" property to access blending modes in + the compositor +- timeline: Implement snapping to markers +- smart-mixer: Add support for d3d11compositor and glvideomixer +- titleclip: add "draw-shadow" child property +- ges:// URI support to define a timeline from a description. +- command-line-formatter + - Add track management to timeline description + - Add keyframe support +- ges-launch-1.0: + - Add an interactive mode where we can seek etc… + - Add option to forward tags + - Allow using an existing clip to determine the rendering format + (both topology and profile) via new --profile-from command line + argument. +- Fix static build GStreamer validate -- Introduced the concept of “Test files” allowing to implement “all - included” test cases, meaning that inside the file the following can - be defined: - - - The application arguments - - The validate configurations - - The validate scenario - - This replaces the previous big dictionary file in - gst-validate-launcher to implement specific test cases. - - We set several variables inside the files (as well as inside - scenarios and config files) to make them relocatable. 
-
-  The file format has been enhanced so it is easier to read and write:
-  for example, lines ending with a comma or (curly) brackets can now
-  be used as continuation markers, so you do not need to add \ at the
-  end of lines to write a structure on several lines.
-
-- Support the imagesequence “protocol” and added integration tests for
-  it.
-
-- Added action types to allow the scenario to run the Test Clock for
-  better reproducibility of tests.
-
-- Support generating tests to check that seeking is frame accurate
-  (based on SSIM).
-
-- Added ways to record buffer checksums (in different ways) in the
-  validateflow module.
-
-- Added vp9 encoding tests.
-
-- Enhanced the seeking action types implementation to allow support
-  for segment seeks.
-
-- Output improvements:
-
-  - Logs are now in markdown format (and bat is used to dump them
-    if available).
-  - File format issues in scenarios/configs/tests files are nicely
-    reported with the line numbers now.
+- report: Add a way to force backtraces on reports even if not a
+  critical issue (GST_VALIDATE_ISSUE_FLAGS_FORCE_BACKTRACE)
+- Add a flag to gst_validate_replace_variables_in_string() to allow
+  defining how to resolve variables in structs
+- Add gst_validate_bin_monitor_get_scenario() to get the bin monitor
+  scenario, which is useful for applications that use Validate
+  directly.
+- Add an expected-values parameter to wait, message-type=XX allowing
+  more precise filtering of the message we are waiting for.
+- Add config file support: each test can now use a config file for the
+  given media file used to test.
+- Add support to check properties of object properties +- scenario: Add an "action-done" signal to signal when an action is + done +- scenario: Add a "run-command" action type +- scenario: Allow forcing running action on idle from scenario file +- scenario: Allow iterating over arrays in foreach +- scenario: Rename ‘interlaced’ action to ‘non-blocking’ +- scenario: Add a non-blocking flag to the wait signal GStreamer Python Bindings -- Python 2.x is no longer supported +- Fixes for Python 3.10 +- Various build fixes +- at least one known breaking change caused by g-i annotation changes + (see below) -- Support mapping buffers without any memcpy: - - - Added a ContextManager to make the API more pythonic - - with buf.map(Gst.MapFlags.READ | Gst.MapFlags.WRITE) as info: - info.data[42] = 0 +GStreamer C# Bindings -- Added high-level helper API for constructing pipelines: +- Fix GstDebugGraphDetails enum +- Updated to latest GtkSharp +- Updated to include GStreamer 1.20 API - - Gst.Bin.make_and_add(factory_name, instance_name=None) - - Gst.Element.link_many(element, ...) +GStreamer Rust Bindings and Rust Plugins -GStreamer C# Bindings +- The GStreamer Rust bindings are released separately with a different + release cadence that’s tied to gtk-rs, but the latest release has + already been updated for the upcoming new GStreamer 1.20 API (v1_20 + feature). + +- gst-plugins-rs, the module containing GStreamer plugins written in + Rust, has also seen lots of activity with many new elements and + plugins. See the New Elements section above for a list of new Rust + elements. -- Bind gst_buffer_new_wrapped() manually to fix memory handling. +Build and Dependencies -- Fix gst_promise_new_with_change_func() where bindgen didn’t properly - detect the func as a closure. +- Meson 0.59 or newer is now required to build GStreamer. -- Declare GstVideoOverlayComposition and GstVideoOverlayRectangle as - opaque type and subclasses of Gst.MiniObject. 
This changes the API - but without this all usage will cause memory corruption or simply - not work. +- The GLib requirement has been bumped to GLib 2.56 or newer (from + March 2018). -- on Windows, look for gstreamer, glib and gobject DLLs using the MSVC - naming convention (i.e. gstvideo-1.0-0.dll instead of - libgstvideo-1.0-0.dll). +- The wpe plugin now requires wpe >= 2.28 and wpebackend-fdo >= 1.8 - The names of these DLLs have to be hardcoded in the bindings, and - most C# users will probably be using the Microsoft toolchain anyway. +- The souphttpsrc plugin is no longer linked against libsoup but + instead tries to pick up either libsoup2 or libsoup3 dynamically at + runtime. Distro packagers please ensure to add a dependency on one + of the libsoup runtimes to the gst-plugins-good package so that + there is actually a libsoup for the plugin to find! - This means that the MSVC compiler is now required to build the - bindings, MingW will no longer work out of the box. +Explicit opt-in required for build of certain plugins with (A)GPL dependencies -GStreamer Rust Bindings and Rust Plugins +Some plugins have GPL- or AGPL-licensed dependencies and those plugins +will no longer be built by default unless you have explicitly opted in +to allow (A)GPL-licensed dependencies by passing -Dgpl=enabled to Meson, +even if the required dependencies are available. -The GStreamer Rust bindings are released separately with a different -release cadence that’s tied to gtk-rs, but the latest release has -already been updated for the new GStreamer 1.18 API, so there’s -absolutely no excuse why your next GStreamer application can’t be -written in Rust anymore. - -gst-plugins-rs, the module containing GStreamer plugins written in Rust, -has also seen lots of activity with many new elements and plugins. - -What follows is a list of elements and plugins available in -gst-plugins-rs, so people don’t miss out on all those potentially useful -elements that have no C equivalent. 
-
-Rust audio plugins
-
-- audiornnoise: New element for audio denoising which implements the
-  noise removal algorithm of the Xiph RNNoise library, in Rust
-- rsaudioecho: Port of the audioecho element from gst-plugins-good
-- rsaudioloudnorm: Live audio loudness normalization element based on
-  the FFmpeg af_loudnorm filter
-- claxondec: FLAC lossless audio codec decoder element based on the
-  pure-Rust claxon implementation
-- csoundfilter: Audio filter that can use any filter defined via the
-  Csound audio programming language
-- lewtondec: Vorbis audio decoder element based on the pure-Rust
-  lewton implementation
-
-Rust video plugins
-
-- cdgdec/cdgparse: Decoder and parser for the CD+G video codec based
-  on a pure-Rust CD+G implementation, used for example by karaoke CDs
-- cea608overlay: CEA-608 Closed Captions overlay element
-- cea608tott: CEA-608 Closed Captions to timed-text (e.g. VTT or SRT
-  subtitles) converter
-- tttocea608: CEA-608 Closed Captions from timed-text converter
-- mccenc/mccparse: MacCaption Closed Caption format encoder and parser
-- sccenc/sccparse: Scenarist Closed Caption format encoder and parser
-- dav1dec: AV1 video decoder based on the dav1d decoder implementation
-  by the VLC project
-- rav1enc: AV1 video encoder based on the fast and pure-Rust rav1e
-  encoder implementation
-- rsflvdemux: Alternative to the flvdemux FLV demuxer element from
-  gst-plugins-good, not feature-equivalent yet
-- rsgifenc/rspngenc: GIF/PNG encoder elements based on the pure-Rust
-  implementations by the image-rs project
-
-Rust text plugins
-
-- textwrap: Element for line-wrapping timed text (e.g.
subtitles) for - better screen-fitting, including hyphenation support for some - languages - -Rust network plugins - -- reqwesthttpsrc: HTTP(S) source element based on the Rust - reqwest/hyper HTTP implementations and almost feature-equivalent - with the main GStreamer HTTP source souphttpsrc -- s3src/s3sink: Source/sink element for the Amazon S3 cloud storage -- awstranscriber: Live audio to timed text transcription element using - the Amazon AWS Transcribe API - -Generic Rust plugins - -- sodiumencrypter/sodiumdecrypter: Encryption/decryption element based - on libsodium/NaCl -- togglerecord: Recording element that allows to pause/resume - recordings easily and considers keyframe boundaries -- fallbackswitch/fallbacksrc: Elements for handling potentially - failing (network) sources, restarting them on errors/timeout and - showing a fallback stream instead -- threadshare: Set of elements that provide alternatives for various - existing GStreamer elements but allow to share the streaming threads - between each other to reduce the number of threads -- rsfilesrc/rsfilesink: File source/sink elements as replacements for - the existing filesrc/filesink elements +See Building plugins with (A)GPL-licensed dependencies for more details +and a non-exhaustive list of plugins affected. -Build and Dependencies +gst-build: replaced by mono repository -- The Autotools build system has finally been removed in favour of the - Meson build system. Developers who currently use gst-uninstalled - should move to gst-build. - -- API and plugin documentation are no longer built with gtk_doc. The - gtk_doc documentation has been removed in favour of a new unified - documentation module built with hotdoc (also see “Documentation - improvements” section below). Distributors should use the - documentation release tarball instead of trying to package hotdoc - and building the documentation from scratch. 
- -- gst-plugins-bad now includes an internal copy of libusrsctp, as - there are problems in usrsctp with global shared state, lack of API - stability guarantees, and the absence of any kind of release - process. We also can’t rely on distros shipping a version with the - fixes we need. Both firefox and Chrome bundle their own copies too. - It is still possible to build against an external copy of usrsctp if - so desired. - -- nvcodec no longer needs the NVIDIA NVDEC/NVENC SDKs available at - build time, only at runtime. This allows distributions to ship this - plugin by default and it will just start to work when the required - run-time SDK libraries are installed by the user, without users - needing to build and install the plugin from source. - -- the gst-editing-services tarball is now named gst-editing-services - for consistency (used to be gstreamer-editing-services). - -- the gst-validate tarball has been superseded by the gst-devtools - tarball for consistency with the git module name. - -gst-build - -gst-build is a meta-module and serves primarily as our uninstalled -development environment. It makes it easy to build most of GStreamer, -but unlike Cerbero it only comes with a limited number of external -dependencies that can be built as subprojects if they are not found on -the system. - -gst-build is based on Meson and replaces the old autotools -gst-uninstalled script. - -- The ‘uninstalled’ target has been renamed to ‘devenv’ - -- Experimental gstreamer-full library containing all built plugins and - their deps when building with -Ddefault_library=static. A monolithic - library is easier to distribute, and may be required in some - environments. GStreamer core, GLib and GObject are always included, - but external dependencies are still dynamically linked. The - gst-full-libraries meson option allows adding other GStreamer - libraries to the gstreamer-full build. 
This is an experiment for now - and its behaviour or API may still change in future releases. - -- Add glib-networking as a subproject when glib is a subproject and - load gio modules in the devenv, tls option control whether to use - openssl or gnutls. - -- git-worktree: Allow multiple worktrees for subproject branches - -- Guard against meson being run from inside the uninstalled devenv, as - this might have unexpected consequences. - -- our ffmpeg and x264 meson ports have been updated to the latest - stable version (you might need to update the subprojects checkout - manually though, or just remove the checkouts so meson checks out - the latest version again; improvements for this are pending in - meson, but not merged yet). +See mono repository section above and the GStreamer mono repository FAQ. Cerbero @@ -1803,1568 +1605,494 @@ on platforms where dependencies are not readily available, such as Windows, Android, iOS and macOS. -General improvements +General Cerbero improvements -- Recipe build steps are done in parallel wherever possible. This - leads to massive improvements in overall build time. -- Several recipes were ported to Meson, which improved build times -- Moved from using both GnuTLS and OpenSSL to only OpenSSL -- Moved from yasm to nasm for all assembly compilation -- Support zsh when running the cerbero shell command -- Numerous version upgrades for dependencies -- Default to xz for tarball binary packages. bz2 can be selected with - the --compress-method option to package. 
-
-- Added boolean variant for controlling the optimization level:
-  -v optimization
-- Ship .pc pkgconfig files for all plugins in the binary packages
-- CMake and nasm will only be built by Cerbero if the system versions
-  are unusable
-- The nvcodec variant was removed and the nvcodec plugin is built by
-  default now (as it no longer requires the SDK to be installed at
-  build time, only at runtime)
-
-macOS / iOS
-
-- Minimum iOS SDK version bumped to 11.0
-- Minimum macOS SDK version bumped to 10.11
-- No longer need to manually add support for newer iOS SDK versions
-- Added Vulkan elements via MoltenVK
-- Build times were improved by code-signing all build tools
-- macOS framework ships all gstreamer libraries instead of an outdated
-  subset
-- Ship pkg-config in the macOS framework package
-- fontconfig: Fix EXC_BAD_ACCESS crash on iOS ARM64
-- Improved App Store compatibility by setting LC_VERSION_MIN_MACOSX,
-  fixing relocations, and improved bitcode support
+- Plugin removed: libvisual
+- New plugins: rtpmanagerbad and rist

-Windows
+macOS / iOS specific Cerbero improvements

-- MinGW-GCC toolchain was updated to 8.2. It uses the Universal CRT
-  instead of MSVCRT, which eliminates cross-CRT issues in the Visual
-  Studio build.
-- Require Windows 7 or newer for running binaries produced by Cerbero
-- Require Windows x86_64 for running Cerbero to build binary packages
-- Cerbero no longer uses C:/gstreamer/1.0 as a prefix when building.
-  That prefix is reserved for use by the MSI installers.
-- Several recipes can now be built with Visual Studio instead of MinGW.
-  Ported to meson: opus, libsrtp, harfbuzz, cairo, openh264, libsoup,
-  libusrsctp. Existing build system: libvpx, openssl.
-- Support building using Visual Studio for 32-bit x86. Previously we
-  only supported building for 32-bit x86 using the MinGW toolchain.
-- Fixed annoying msgmerge popups in the middle of cerbero builds -- Added configuration options vs_install_path and vs_install_version - for specifying custom search locations for older Visual Studio - versions that do not support vswhere. You can set these in - ~/.cerbero/cerbero.cbc where ~ is the MSYS homedir, not your Windows - homedir. -- New Windows-specific plugins: d3d11, mediafoundation, wasapi2 -- Numerous compatibility and reliability fixes when running Cerbero on - Windows, especially non-English locales -- proxy-libintl now exports the same symbols as gettext, which makes - it a drop-in replacement -- New mapping variant for selecting the Visual Studio CRT to use: - -v vscrt=<value>. Valid values are md, mdd, and auto (default). A - separate prefix is used when building with either md (release) or - mdd (debug), and the outputted package will have +debug in the - filename. This variant is also used for selecting the correct Qt - libraries (debug vs release) to use when building with -v qt5 on - Windows. -- Support cross-compile on Windows to Windows ARM64 and ARMv7 -- Support cross-compile on Windows to the Universal Windows Platform - (UWP). Only the subset of plugins that can be built entirely with - Visual Studio will be selected in this case. To do so, use the - config/cross-uwp-universal.cbc configuration, which will build - ARM64, x86, and x86_64 binaries linked to the release CRT, with - optimizations enabled, and debugging turned on. You can combine this - with -v vscrt=mdd to produce binaries linked to the debug CRT. You - can turn off optimizations with the -v nooptimization variant. +- XCode 12 support +- macOS OS release support is now future-proof, similar to iOS +- macOS Apple Silicon (ARM64) cross-compile support has been added, + including Universal binaries. There is a known bug regarding this on + ARM64. 
+- Running Cerbero itself on macOS Apple Silicon (ARM64) is currently + experimental and is known to have bugs + +Windows specific Cerbero improvements + +- Visual Studio 2022 support has been added +- bootstrap is faster since it requires building fewer build-tools + recipes on Windows +- package is faster due to better scheduling of recipe stages and + elimination of unnecessary autotools regeneration +- The following plugins are no longer built on Windows: + - a52dec (another decoder is still available in libav) + - dvdread + - resindvd Windows MSI installer -- Require Windows 7 or newer for running GStreamer -- Fixed some issues with shipping of pkg-config in the Windows - installers -- Plugin PDB debug files are now shipped in the development package, - not the runtime package -- Ship installers for 32-bit binaries built with Visual Studio -- Ship debug and release “universal” (ARM64, X86, and X86_64) tarballs - built for the Universal Windows Platform -- Windows MSI installers now install into separate prefixes when - building with MSVC and MinGW. Previously both would be installed - into C:/gstreamer/1.0/x86 or C:/gstreamer/1.0/x86_64. 
Now, the - installation prefixes are: - - ---------------------------------------------------------------------------------------------------------------- - Target Path Build options - --------------------------- ------------------------------------ ----------------------------------------------- - MinGW 32-bit C:/gstreamer/1.0/mingw_x86 -c config/win32.cbc - - MinGW 64-bit C:/gstreamer/1.0/mingw_x86_64 -c config/win64.cbc - - MSVC 32-bit C:/gstreamer/1.0/msvc_x86 -c config/win32.cbc -v visualstudio +- no major changes - MSVC 64-bit C:/gstreamer/1.0/msvc_x86_64 -c config/win64.cbc -v visualstudio +Linux specific Cerbero improvements - MSVC 32-bit (debug) C:/gstreamer/1.0/msvc-debug_x86 -c config/win32.cbc -v visualstudio,vscrt=mdd +- Fedora, Debian OS release support is now more future-proof +- Amazon Linux 2 support has been added - MSVC 64-bit (debug) C:/gstreamer/1.0/msvc-debug_x86_64 -c config/win64.cbc -v visualstudio,vscrt=mdd - ---------------------------------------------------------------------------------------------------------------- +Android specific Cerbero improvements -Note: UWP binary packages are tarballs, not MSI installers. - -Linux - -- Support creating MSI installers using WiX when cross-compiling to - Windows -- Support running cross-windows binaries with Wine when using the - shell and runit cerbero commands -- Added bash-completion support inside the cerbero shell on Linux -- Require a system-wide installation of openssl on Linux -- Added variant -v vaapi to build gstreamer-vaapi and the new gstva - plugin -- Debian packaging was disabled because it does not work. Help in - fixing this is appreciated. -- Trimmed the list of packages needed for bootstrap on Linux - -Android - -- Updated to NDK r21 -- Support Vulkan -- Support Qt 5.14+ binary package layout +- no major changes Platform-specific changes and improvements Android -- opensles: Remove hard-coded buffer-/latency-time values and allow - openslessink to handle 48kHz streams. 
- -- photography interface and camera source: Add additional settings - relevant to Android such as: Exposure mode property, extra colour - tone values (aqua, emboss, sketch, neon), extra scene modes - (backlight, flowers, AR, HDR), and missing virtual methods for - exposure mode, analog gain, lens focus, colour temperature, min & - max exposure time. Add new effects and scene modes to Camera - parameters. +- No major changes macOS and iOS -- vtdec can now output to Vulkan-backed memory for zerocopy support - with the Vulkan elements. - -Windows - -- d3d11videosink: new Direct3D11-based video sink with support for - HDR10 rendering if supported. - -- Hardware-accelerated video decoding on Windows via DXVA2 / - Direct3D11 using native Windows APIs rather than per-vendor SDKs - (like MSDK for Intel or NVCODEC for NVidia). Plus modern Direct3D11 - integration rather than the almost 20-year old Direct3D9 from - Windows XP times used in d3dvideosink. Formats supported for - decoding are H.264, H.265, VP8, and VP9, and zero-copy operation - should be supported in combination with the new d3d11videosink. See - Seungha’s blog post “Windows DXVA2 (via Direct3D 11) Support in - GStreamer 1.17” for more details. - -- Microsoft Media Foundation plugin for hardware-accelerated video - encoding on Windows using native Windows APIs rather than per-vendor - SDKs. Formats supported for encoding are H.264, H.265 and VP9. Also - includes audio encoders for AAC and MP3. See Seungha’s blog post - “Bringing Microsoft Media Foundation to GStreamer” for some more - details about this. - -- new mfvideosrc video capture source element using the latest Windows - APIs rather than ancient APIs used by ksvideosrc/winks. ksvideosrc - should be considered deprecated going forward. - -- d3d11: add d3d11convert, a color space conversion and rescaling - element using shaders, and introduce d3d11upload and d3d11download - elements that work just like glupload and gldownload but for D3D11. 
- -- Universal Windows Platform (UWP) support, including official - GStreamer binary packages for it. Check out Nirbheek’s latest blog - post “GStreamer 1.18 supports the Universal Windows Platform” for - more details. +- applemedia: add ProRes support to vtenc and vtdec -- systemclock correctness and reliability fixes, and also don’t start - the system clock at 0 any longer (which shouldn’t make any - difference to anyone, as absolute clock time values are supposed to - be meaningless in themselves, only the rate of increase matters). - -- toolchain specific plugin registry: the registry cache is now named - differently for MSVC and MinGW toolchains/packages, which should - avoid problems when switching between binaries built with a - different toolchain. - -- new wasapi2 plugin mainly to support UWP applications. The core - logic of this plugin is almost identical to existing wasapi plugin, - but the main target is Windows 10 and UWP. This plugin uses WinRT - APIs, so will likely not work on Windows 8 or older. Unlike the - existing wasapi plugin, this plugin supports automatic stream - routing (auto fallback when device was removed) and device level - mute/volume control. Exclusive streaming mode is not supported, - however, and loopback features are not implemented yet. It is also - only possible to build this plugin with MSVC and the Windows 10 SDK, - it can’t be cross-compiled with the MingW toolchain. - -- new dxgiscreencapsrc element which uses the Desktop Duplication API - to capture the desktop screen at high speed. This is only supported - on Windows 8 or later. Compared to the existing elements - dxgiscreencapsrc offers much better performance, works in High DPI - environments and draws an accurate mouse cursor. - -- d3dvideosink was downgraded to secondary rank, d3d11videosink is - preferred now. Support OverlayComposition for GPU overlay - compositing of subtitles and logos. 
+- The GStreamer.framework location is now relocatable and is not + required to be /Library/Frameworks/ -- debug log output fixes, esp. with a non-UTF8 locale/codepage +- Cerbero now supports cross-compiling to macOS running on Apple + Silicon (ARM64), and Universal binaries are now available that can + be used on both X86_64 and ARM64 macOS. -- speex, jack: fixed crashes on Windows caused by cross-CRT issues +Windows -- gst-play-1.0 interactive keyboard controls now also work on Windows +- On Windows the high-resolution clock is enabled now in the + gst-launch-1.0 and gst-play-1.0 command line tools, which provides + better clock and timer performance on Windows, at the cost of higher + power consumption. By default, without the high-resolution clock + enabled, the timer precision on Windows is system-dependent and may + be as bad as 15ms which is not good enough for many multimedia + applications. Developers may want to do the same in their Windows + applications if they think it’s a good idea for their application + use case, and depending on the Windows version they target. This is + not done automatically by GStreamer because on older Windows + versions (pre-Windows 10) this affects a global Windows setting and + also there’s a power consumption vs. performance trade-off that may + differ from application to application. + +- dxgiscreencapsrc now supports resolution changes + +- The wasapi2 audio plugin was rewritten and now has a higher rank + than the old wasapi plugin since it has a number of additional + features such as automatic stream routing, and no + known-but-hard-to-fix issues. The plugin is always built if the + Windows 10 SDK is available now. + +- The wasapi device providers now detect and notify dynamic device + additions/removals + +- d3d11screencapturesrc: new desktop capture element, including + GstDeviceProvider implementation to enumerate/select target monitors + for capture. 
+ +- Direct3D11/DXVA decoder now supports AV1 and MPEG-2 codecs + (d3d11av1dec, d3d11mpeg2dec) + +- VP9 decoding got more reliable and stable thanks to a newly written + codec parser + +- Support for decoding interlaced H.264/AVC streams + +- Hardware-accelerated video deinterlacing (d3d11deinterlace) and + video mixing (d3d11compositor) + +- Video mixing with the Direct3D11 API (d3d11compositor) + +- MediaFoundation API based hardware encoders gained the ability to + receive Direct3D11 textures as an input + +- Seungha’s blog post “GStreamer ❤ Windows: A primer on the cool stuff + you’ll find in the 1.20 release” describes many of the + Windows-related improvements in more detail Linux -- kmssink: Add support for P010 and P016 formats +- bluez: LDAC Bluetooth audio codec support in a2dpsink and avdtpsink, + as well as an LDAC RTP payloader (rtpldacpay) and an LDAC audio + encoder (ldacenc) -- vah264dec: new experimental va plugin with an element for H.264 - decoding with VA-API. This novel approach, different from - gstreamer-vaapi, uses the gstcodecs library for decoder state - handling, which it is hoped will make for cleaner code because it - uses VA-API without further layers or wrappers. Check out Víctor’s - blog post “New VA-API H.264 decoder in gst-plugins-bad” for the full - lowdown and the limitations of this new plugin, and how to give it a - spin. - -- v4l2codecs: introduce a V4L2 CODECs Accelerator. This plugin will - support the new CODECs uAPI in the Linux kernel, which consists of - an accelerator interface similar to DXVA, NVDEC, VDPAU and VAAPI. So - far H.264 and VP8 are supported. This is used on certain embedded - systems such as i.mx8m, rk3288, rk3399, Allwinner H-series SoCs. +- kmssink: gained support for NV24, NV61, RGB16/BGR16 formats; + auto-detect NVIDIA Tegra driver Documentation improvements -- unified documentation containing tutorials, API docs, plugin docs, - etc. 
all under one roof, shipped in form of a documentation release - tarball containing both devhelp and html documentation. - -- all documentation is now generated using hotdoc, gtk-doc is no - longer used. Distributors should use the above-mentioned - documentation release tarball instead of trying to package hotdoc - and building the documentation from scratch. - -- there is now documentation for wrapper plugins like gst-libav and - frei0r, as well as tracer plugins. - -- for more info, check out Thibault’s “GStreamer Documentation” - lightning talk from the 2019 GStreamer Conference. - -- new API for plugins to support the documentation system: - - - new GParamSpecFlag GST_PARAM_DOC_SHOW_DEFAULT to make - gst-inspect-1.0 (and the documentation) show the paramspec’s - default value rather than the actually set value as default - - GstPadTemplate getter and setter for “documentation caps”, - gst_pad_template_set_documentation_caps() and - gst_pad_template_get_documentation_caps(): This can be used in - elements where the caps of pad templates are dynamically - generated and/or dependent on the environment, to override the - caps shown in the documentation (usually to advertise the full - set of possible caps). - - gst_type_mark_as_plugin_api() for marking types as plugin API, - used for plugin-internal types like enums, flags, pad - subclasses, boxed types, and such. - -Possibly Breaking Changes - -- GstVideo: the canonical list of raw video formats (for use in caps) - has been reordered, so video elements such as videotestsrc or - videoconvert might negotiate to a different format now than before. - The new format might be a higher-quality format or require more - processing overhead, which might affect pipeline performance. - -- mpegtsdemux used to wrongly advertise H.264 and H.265 video - elementary streams as alignment=nal. 
This has now been fixed and - changed to alignment=none, which means an h264parse or h265parse - element is now required after tsdemux for some pipelines where there - wasn’t one before, e.g. in transmuxing scenarios (tsdemux ! tsmux). - Pipelines without such a parser may now fail to link or error out at - runtime. As parsers after demuxers and before muxers have been - generally required for a long time now it is hoped that this will - only affect a small number of applications or pipelines. - -- The Android opensles audio source and sink used to have hard-coded - buffer-/latency-time values of 20ms. This is no longer needed with - newer Android versions and has now been removed. This means a higher - or lower value might now be negotiated by default, which can affect - pipeline performance and latency. +- hardware-accelerated GPU plugins will now no longer always list all + the element variants for all available GPUs, since those are + system-dependent and it’s confusing for users to see those in the + documentation just because the GStreamer developer who generated the + docs had multiple GPUs to play with at the time. Instead just show + the default elements. + +Possibly Breaking and Other Noteworthy Behavioural Changes + +- gst_parse_launch(), gst_parse_bin_from_description() and friends + will now error out when setting properties that don’t exist on + top-level bins. They were silently ignored before. + +- The GstWebRTC library does not expose any objects anymore with + public fields. Instead properties have been added to replace that + functionality. If you are accessing such fields in your application, + switch to the corresponding properties. + +- playbin and uridecodebin now emit the source-setup signal before the + element is added to the bin and linked so that the source element is + already configured before any scheduling query comes in, which is + useful for elements such as appsrc or giostreamsrc. 
+ +
+- The source element inside urisourcebin (used inside uridecodebin3
+  which is used inside playbin3) is no longer called "source". This
+  shouldn’t affect anyone hopefully, because there’s a "setup-source"
+  signal to configure the source element and no one should rely on
+  names of internal elements anyway.
+
+- The vp8enc element now expects bps (bits per second) for the
+  "temporal-scalability-target-bitrate" property, which is consistent
+  with the "target-bitrate" property. Since additional configuration
+  is required with modern libvpx to make temporal scaling work anyway,
+  chances are that very few people will have been using this property.
+
+- vp8enc and vp9enc now default to “good quality” for the "deadline"
+  property rather than “best quality”. Having the deadline set to best
+  quality causes the encoder to be absurdly slow; most real-life users
+  will want the good quality tradeoff instead.
+
+- The experimental GstTranscoder library API in gst-plugins-bad was
+  changed from a GObject signal-based notification mechanism to a
+  GstBus/message-based mechanism akin to GstPlayer/GstPlay.
+
+- MPEG-TS SCTE-35 API: semantic change for SCTE-35 splice commands:
+  timestamps passed by the application should be in running time now,
+  since users of the API can’t really be expected to predict the local
+  PTS of the muxer.
+
+- The GstContext used by souphttpsrc to share the session between
+  multiple element instances has changed. Previously it provided
+  direct access to the internal SoupSession object; now it only
+  provides access to an opaque, internal type. This change is
+  necessary because SoupSession is not thread-safe at all and can’t be
+  shared safely between arbitrary external code and souphttpsrc.
+
+- Python bindings: GObject-introspection related annotation fixes have
+  led to a case of a GstVideo.VideoInfo-related function signature
+  changing in the Python bindings (possibly one or two other cases
+  too).
This is for a function that should never have been exposed in
+  the first place though, so the bindings are being updated to throw
+  an exception in that case, and the correct replacement API has been
+  added in the form of an override.

 Known Issues

-- GStreamer 1.18 versions <= 1.18.4 would fail to build on Linux with
-  Meson 0.58 due to an issue with the include directories.
-  GStreamer >= 1.18.5 includes a fix for this.
+- nothing in particular at this point (but also see possibly breaking
+  changes section above)

 Contributors

-Aaron Boxer, Adam Duskett, Adam x Nilsson, Adrian Negreanu, Akinobu
-Mita, Alban Browaeys, Alcaro, Alexander Lapajne, Alexandru Băluț, Alex
-Ashley, Alex Hoenig, Alicia Boya García, Alistair Buxton, Ali Yousuf,
-Ambareesh “Amby” Balaji, Amr Mahdi, Andoni Morales Alastruey, Andreas
-Frisch, Andre Guedes, Andrew Branson, Andrey Sazonov, Antonio Ospite,
-aogun, Arun Raghavan, Askar Safin, AsociTon, A. Wilcox, Axel Mårtensson,
-Ayush Mittal, Bastian Bouchardon, Benjamin Otte, Bilal Elmoussaoui,
-Brady J.
Garvin, Branko Subasic, Camilo Celis Guzman, Carlos Rafael -Giani, Charlie Turner, Cheng-Chang Wu, Chris Ayoup, Chris Lord, -Christoph Reiter, cketti, Damian Hobson-Garcia, Daniel Klamt, Daniel -Molkentin, Danny Smith, David Bender, David Gunzinger, David Ing, David -Svensson Fors, David Trussel, Debarshi Ray, Derek Lesho, Devarsh -Thakkar, dhilshad, Dimitrios Katsaros, Dmitriy Purgin, Dmitry Shusharin, -Dominique Leuenberger, Dong Il Park, Doug Nazar, dudengke, Dylan McCall, -Dylan Yip, Ederson de Souza, Edward Hervey, Eero Nurkkala, Eike Hein, -ekwange, Eric Marks, Fabian Greffrath, Fabian Orccon, Fabio D’Urso, -Fabrice Bellet, Fabrice Fontaine, Fanchao L, Felix Yan, Fernando -Herrrera, Francisco Javier Velázquez-García, Freyr, Fuwei Tang, Gaurav -Kalra, George Kiagiadakis, Georgii Staroselskii, Georg Lippitsch, Georg -Ottinger, gla, Göran Jönsson, Gordon Hart, Gregor Boirie, Guillaume -Desmottes, Guillermo Rodríguez, Haakon Sporsheim, Haihao Xiang, Haihua -Hu, Havard Graff, Håvard Graff, Heinrich Kruger, He Junyan, Henry -Wilkes, Hosang Lee, Hou Qi, Hu Qian, Hyunjun Ko, ibauer, Ignacio Casal -Quinteiro, Ilya Smelykh, Jake Barnes, Jakub Adam, James Cowgill, James -Westman, Jan Alexander Steffens, Jan Schmidt, Jan Tojnar, Javier Celaya, -Jeffy Chen, Jennifer Berringer, Jens Göpfert, Jérôme Laheurte, Jim -Mason, Jimmy Ohn, J. Kim, Joakim Johansson, Jochen Henneberg, Johan -Bjäreholt, Johan Sternerup, John Bassett, Jonas Holmberg, Jonas Larsson, -Jonathan Matthew, Jordan Petridis, Jose Antonio Santos Cadenas, Josep -Torra, Jose Quaresma, Josh Matthews, Joshua M. 
Doe, Juan Navarro, -Juergen Werner, Julian Bouzas, Julien Isorce, Jun-ichi OKADA, Justin -Chadwell, Justin Kim, Keri Henare, Kevin JOLY, Kevin King, Kevin Song, -Knut Andre Tidemann, Kristofer Björkström, krivoguzovVlad, Kyrylo -Polezhaiev, Lenny Jorissen, Linus Svensson, Loïc Le Page, Loïc Minier, -Lucas Stach, Ludvig Rappe, Luka Blaskovic, luke.lin, Luke Yelavich, -Marcin Kolny, Marc Leeman, Marco Felsch, Marcos Kintschner, Marek -Olejnik, Mark Nauwelaerts, Markus Ebner, Martin Liska, Martin Theriault, -Mart Raudsepp, Matej Knopp, Mathieu Duponchelle, Mats Lindestam, Matthew -Read, Matthew Waters, Matus Gajdos, Maxim Paymushkin, Maxim P. -Dementiev, Michael Bunk, Michael Gruner, Michael Olbrich, Miguel París -Díaz, Mikhail Fludkov, Milian Wolff, Millan Castro, Muhammet Ilendemli, -Nacho García, Nayana Topolsky, Nian Yan, Nicola Murino, Nicolas -Dufresne, Nicolas Pernas Maradei, Niels De Graef, Nikita Bobkov, Niklas -Hambüchen, Nirbheek Chauhan, Ognyan Tonchev, okuoku, Oleksandr -Kvl,Olivier Crête, Ondřej Hruška, Pablo Marcos Oltra, Patricia Muscalu, -Peter Seiderer, Peter Workman, Philippe Normand, Philippe Renon, Philipp -Zabel, Pieter Willem Jordaan, Piotr Drąg, Ralf Sippl, Randy Li, Rasmus -Thomsen, Ratchanan Srirattanamet, Raul Tambre, Ray Tiley, Richard -Kreckel, Rico Tzschichholz, R Kh, Robert Rosengren, Robert Tiemann, -Roman Shpuntov, Roman Sivriver, Ruben Gonzalez, Rubén Gonzalez, -rubenrua, Ryan Huang, Sam Gigliotti, Santiago Carot-Nemesio, Saunier -Thibault, Scott Kanowitz, Sebastian Dröge, Sebastiano Barrera, Seppo -Yli-Olli, Sergey Nazaryev, Seungha Yang, Shinya Saito, Silvio -Lazzeretti, Simon Arnling Bååth, Siwon Kang, sohwan.park, Song Bing, -Soohyun Lee, Srimanta Panda, Stefano Buora, Stefan Sauer, Stéphane -Cerveau, Stian Selnes, Sumaid Syed, Swayamjeet, Thiago Santos, Thibault -Saunier, Thomas Bluemel, Thomas Coldrick, Thor Andreassen, Tim-Philipp -Müller, Ting-Wei Lan, Tobias Ronge, trilene, Tulio Beloqui, U. 
Artie -Eoff, VaL Doroshchuk, Varunkumar Allagadapa, Vedang Patel, Veerabadhran -G, Víctor Manuel Jáquez Leal, Vivek R, Vivia Nikolaidou, Wangfei, Wang -Zhanjun, Wim Taymans, Wonchul Lee, Xabier Rodriguez Calvar, Xavier -Claessens, Xidorn Quan, Xu Guangxin, Yan Wang, Yatin Maan, Yeongjin -Jeong, yychao, Zebediah Figura, Zeeshan Ali, Zeid Bekli, Zhiyuan Sraf, -Zoltán Imets, +Aaron Boxer, Adam Leppky, Adam Williamson, Alba Mendez, Alejandro +González, Aleksandr Slobodeniuk, Alexander Vandenbulcke, Alex Ashley, +Alicia Boya García, Andika Triwidada, Andoni Morales Alastruey, Andrew +Wesie, Andrey Moiseev, Antonio Ospite, Antonio Rojas, Arthur Crippa +Búrigo, Arun Raghavan, Ashley Brighthope, Axel Kellermann, Baek, Bastien +Nocera, Bastien Reboulet, Benjamin Gaignard, Bing Song, Binh Truong, +Biswapriyo Nath, Brad Hards, Brad Smith, Brady J. Garvin, Branko +Subasic, Camilo Celis Guzman, Chris Bass, ChrisDuncanAnyvision, Chris +White, Corentin Damman, Daniel Almeida, Daniel Knobe, Daniel Stone, +david, David Fernandez, David Keijser, David Phung, Devarsh Thakkar, +Dinesh Manajipet, Dmitry Samoylov, Dmitry Shusharin, Dominique Martinet, +Doug Nazar, Ederson de Souza, Edward Hervey, Emmanuel Gil Peyrot, +Enrique Ocaña González, Ezequiel Garcia, Fabian Orccon, Fabrice +Fontaine, Fernando Jimenez Moreno, Florian Karydes, Francisco Javier +Velázquez-García, François Laignel, Frederich Munch, Fredrik Pålsson, +George Kiagiadakis, Georg Lippitsch, Göran Jönsson, Guido Günther, +Guillaume Desmottes, Guiqin Zou, Haakon Sporsheim, Haelwenn (lanodan) +Monnier, Haihao Xiang, Haihua Hu, Havard Graff, He Junyan, Helmut +Januschka, Henry Wilkes, Hosang Lee, Hou Qi, Ignacio Casal Quinteiro, +Igor Kovalenko, Ilya Kreymer, Imanol Fernandez, Jacek Tomaszewski, Jade +Macho, Jakub Adam, Jakub Janků, Jan Alexander Steffens (heftig), Jan +Schmidt, Jason Carrete, Jason Pereira, Jay Douglass, Jeongki Kim, Jérôme +Laheurte, Jimmi Holst Christensen, Johan Sternerup, John Hassell, John +Lindgren, 
John-Mark Bell, Jonathan Matthew, Jordan Petridis, Jose +Quaresma, Julian Bouzas, Julien, Kai Uwe Broulik, Kasper Steensig +Jensen, Kellermann Axel, Kevin Song, Khem Raj, Knut Inge Hvidsten, Knut +Saastad, Kristofer Björkström, Lars Lundqvist, Lawrence Troup, Lim Siew +Hoon, Lucas Stach, Ludvig Rappe, Luis Paulo Fernandes de Barros, Luke +Yelavich, Mads Buvik Sandvei, Marc Leeman, Marco Felsch, Marek Vasut, +Marian Cichy, Marijn Suijten, Marius Vlad, Markus Ebner, Mart Raudsepp, +Matej Knopp, Mathieu Duponchelle, Matthew Waters, Matthieu De Beule, +Mengkejiergeli Ba, Michael de Gans, Michael Olbrich, Michael Tretter, +Michal Dzik, Miguel Paris, Mikhail Fludkov, mkba, Nazar Mokrynskyi, +Nicholas Jackson, Nicola Murino, Nicolas Dufresne, Niklas Hambüchen, +Nikolay Sivov, Nirbheek Chauhan, Olivier Blin, Olivier Crete, Olivier +Crête, Paul Goulpié, Per Förlin, Peter Boba, P H, Philippe Normand, +Philipp Zabel, Pieter Willem Jordaan, Piotrek Brzeziński, Rafał +Dzięgiel, Rafostar, raghavendra, Raghavendra, Raju Babannavar, Raleigh +Littles III, Randy Li, Randy Li (ayaka), Ratchanan Srirattanamet, Raul +Tambre, reed.lawrence, Ricky Tang, Robert Rosengren, Robert Swain, Robin +Burchell, Roman Sivriver, R S Nikhil Krishna, Ruben Gonzalez, Ruslan +Khamidullin, Sanchayan Maity, Scott Moreau, Sebastian Dröge, Sergei +Kovalev, Seungha Yang, Sid Sethupathi, sohwan.park, Sonny Piers, Staz M, +Stefan Brüns, Stéphane Cerveau, Stephan Hesse, Stian Selnes, Stirling +Westrup, Théo MAILLART, Thibault Saunier, Tim, Timo Wischer, Tim-Philipp +Müller, Tim Schneider, Tobias Ronge, Tom Schoonjans, Tulio Beloqui, +tyler-aicradle, U. 
Artie Eoff, Ung, Val Doroshchuk, VaL Doroshchuk, +Víctor Manuel Jáquez Leal, Vivek R, Vivia Nikolaidou, Vivienne +Watermeier, Vladimir Menshakov, Will Miller, Wim Taymans, Xabier +Rodriguez Calvar, Xavier Claessens, Xℹ Ruoyao, Yacine Bandou, Yinhang +Liu, youngh.lee, youngsoo.lee, yychao, Zebediah Figura, Zhang yuankun, +Zhang Yuankun, Zhao, Zhao Zhili, , Aleksandar Topic, Antonio Ospite, +Bastien Nocera, Benjamin Gaignard, Brad Hards, Carlos Falgueras García, +Célestin Marot, Corentin Damman, Corentin Noël, Daniel Almeida, Daniel +Knobe, Danny Smith, Dave Piché, Dmitry Osipenko, Fabrice Fontaine, +fjmax, Florian Zwoch, Guillaume Desmottes, Haihua Hu, Heinrich Kruger, +He Junyan, Jakub Adam, James Cowgill, Jan Alexander Steffens (heftig), +Jean Felder, Jeongki Kim, Jiri Uncovsky, Joe Todd, Jordan Petridis, +Krystian Wojtas, Marc-André Lureau, Marcin Kolny, Marc Leeman, Mark +Nauwelaerts, Martin Reboredo, Mathieu Duponchelle, Matthew Waters, +Mengkejiergeli Ba, Michael Gruner, Nicolas Dufresne, Nirbheek Chauhan, +Olivier Crête, Philippe Normand, Rafał Dzięgiel, Ralf Sippl, Robert +Mader, Sanchayan Maity, Sangchul Lee, Sebastian Dröge, Seungha Yang, +Stéphane Cerveau, Teh Yule Kim, Thibault Saunier, Thomas Klausner, Timo +Wischer, Tim-Philipp Müller, Tobias Reineke, Tomasz Andrzejak, Trung Do, +Tyler Compton, Ung, Víctor Manuel Jáquez Leal, Vivia Nikolaidou, Wim +Taymans, wngecn, Wonchul Lee, wuchang li, Xavier Claessens, Xi Ruoyao, +Yoshiharu Hirose, Zhao, … and many others who have contributed bug reports, translations, sent suggestions or helped testing. -Stable 1.18 branch +Stable 1.20 branch -After the 1.18.0 release there will be several 1.18.x bug-fix releases +After the 1.20.0 release there will be several 1.20.x bug-fix releases which will contain bug fixes which have been deemed suitable for a stable branch, but no new features or intrusive changes will be added to -a bug-fix release usually. 
The 1.18.x bug-fix releases will be made from -the git 1.18 branch, which will be a stable branch. - -1.18.0 - -1.18.0 was released on 8 September 2020. - -1.18.1 - -The first 1.18 bug-fix release (1.18.1) was released on 26 October 2020. - -This release only contains bugfixes and it should be safe to update from -1.18.0. - -Highlighted bugfixes in 1.18.1 - -- important security fixes -- bug fixes and memory leak fixes -- various stability and reliability improvements - -gstreamer - -- aggregator: make peek() has() pop() drop() buffer API threadsafe -- gstvalue: don’t write to const char * -- meson: Disallow DbgHelp for UWP build -- info: Fix build on Windows ARM64 device -- build: use cpu_family for arch checks -- basetransform: Fix in/outbuf confusion of _default_transform_meta -- Fix documentation -- info: Load DbgHelp.dll using g_module_open() -- padtemplate: mark documentation caps as may be leaked -- gstmeta: intern registered impl string -- aggregator: Hold SRC_LOCK while unblocking via SRC_BROADCAST() -- ptp_helper_post_install.sh: deal with none -- skip elements/leak.c if tracer is not available -- aggregator: Wake up source pad in PAUSED<->PLAYING transitions -- input-selector: Wake up blocking pads when releasing them -- ptp: Also handle gnu/kfreebsd - -gst-plugins-base - -- theoradec: Set telemetry options only if they are nonzero -- glslstage: delete shader on finalize of stage -- urisourcebin: Fix crash caused by use after free -- decodebin3: Store stream-start event on output pad before exposing - it -- Add some missing nullable annotations -- typefind/xdgmime: Validate mimetypes to be valid GstStructure names - before using them -- uridecodebin3: Forward upstream events to decodebin3 directly -- video-converter: Add fast paths from v210 to I420/YV12, Y42B, UYVY - and YUY2 -- videoaggregator: Limit accepted caps by template caps -- gstrtpbuffer: fix header extension length validation -- decodebin3: only force streams-selected seqnum after a - 
select-streams -- videodecoder: don’t copy interlace-mode from reference state -- enable abi checks -- multihandlesink: Don’t pass NULL caps to gst_caps_is_equal -- audio: video: Fix in/outbuf confusion of transform_meta -- meson: Always wrap “prefix” option with join_paths() to make Windows - happy -- videoaggregator: ensure peek_next_sample() uses the correct caps -- meson: Actually build gstgl without implicit include dirs -- videoaggregator: Don’t require any pads to be configured for - negotiating source pad caps -- gst-libs: gl: Fix documentation typo and clarify - gl_memory_texsubimage -- audioaggregator: Reset offset if the output rate is renegotiated -- video-anc: Implement transform functions for AFD/Bar metas -- appsrc: Wake up the create() function on caps changes -- rtpbasepayload: do not forget delayed segment when forwarding gaps - -gst-plugins-good - -- v4l2object: Only offer inactive pools and if needed -- vpx: Fix the check to unfixed/unknown framerate to set bitrate -- qmlglsink: fix crash when created/destroyed in quick succession -- rtputils: Count metas with an empty tag list for copying/keeping -- rtpbin: Remove the rtpjitterbuffer with the stream -- rtph26*depay: drop FU’s without a corresponding start bit -- imagefreeze: Response caps query from srcpad -- rtpmp4gdepay: Allow lower-case “aac-hbr” instead of correct - “AAC-hbr” -- rtspsrc: Fix push-backchannel-buffer parameter mismatch -- jpegdec: check buffer size before dereferencing -- flvmux: Move stream skipping to GstAggregatorPadClass.skip_buffer -- v4l2object: plug memory leak -- splitmuxsink: fix sink pad release while PLAYING - -gst-plugins-bad - -- codecparsers: h264parser: guard against ref_pic_markings overflow -- v4l2codecs: Various fixes -- h265parse: Don’t enable passthrough by default -- srt: Fix “Fix timestamping” -- srt: Fixes for 1.4.2 -- dtlsconnection: Ignore OpenSSL system call errors -- h265parse: set interlace-mode=interleaved on interlaced content -- Replace GPL v2 
with LGPL v2 in COPYING file -- srt: Consume the error from gst_srt_object_write -- srt: Check socket state before retrieving payload size -- x265enc: fix deadlock on reconfig -- webrtc: Require gstreamer-sdp in the pkg-config file -- srtsrc: Fix timestamping -- mfvideosrc: Use only the first video stream per device -- srtobject: typecast SRTO_LINGER to linger -- decklink: Correctly order the different dependent mode tables -- wasapisrc: Make sure that wasapisrc produces data in loopback mode -- wpesrc: fix some caps leaks using the non-GL output -- smoothstreaming: clear live adapter on seek -- vtdec/vulkan: use Shared storage mode for IOSurface textures -- wpe: Move webview load waiting to WPEView -- wpe: Use proper callback for TLS errors signal handling -- kmssink: Do not source using padded width/height -- avtp: avtpaafdepay: fix crash when building caps -- opencv: set opencv_dep when option is disabled to fix the build -- line21encoder: miscellaneous enhancements -- Hls youtube issues with urisourcebin/queue2 -- rtmp2: Replace stats queue with stats lock -- rtmp2sink: support EOS event for graceful connection shutdown -- mpegtsmux: Make handling of sinkpads thread-safe -- hlssink2: Actually release splitmuxsink’s pads -- mpegtsmux: Don’t create streams with reserved PID - -gst-plugins-ugly - -- no changes - -gst-libav - -- avaudenc/avvidenc: Reopen encoding session if it’s required -- avauddec/audenc/videnc: Don’t return GST_FLOW_EOS when draining -- avauddec/avviddec: Avoid dropping non-OK flow return -- avcodecmap: Enable 24 bit WMA Lossless decoding - -gst-rtsp-server - -- rtsp-stream: collect rtp info when blocking -- rtsp-media: set a 0 storage size for TCP receivers -- rtsp-stream: preroll on gap events -- rtsp-media: do not unblock on unsuspend - -gstreamer-vaapi - -- decoder: don’t reply src caps query with allowed if pad is fixed -- plugins: decode: fix a DMA caps typo in ensure_allowed_srcpad_caps - -gstreamer-sharp - -- Add bindings for some 
missing 1.18 API - -gst-omx - -- omxvideodec: support interlace-mode=interleaved input - -gst-python - -- no changes - -gst-editing-services - -- ges: Do not recreate auto-transitions when changing clip assets -- ges: Fix a copy/paste mistake in meson file - -gst-integration-testsuites - -- medias: Update for h265parse passthrough behavior change -- update validate.test.h265parse.alternate test - -gst-build - -- windows: Detect Strawberry Perl and error out early -- {pygobject,pycairo}.wrap: point to stable refs - -Cerbero build tool and packaging changes in 1.18.1 - -- Add macOS Big Sur support -- gst-plugins-bad: Ship rtpmanagerbad plugin -- gstreamer-1.0: Don’t enable DbgHelp for UWP build -- pango: fix font corruption on windows -- cairo: use thread local storage to grant one windows HDC per thread -- small fixes for Xcode 12 -- cerbero: Re-add alsa-devel to bootstrap on Linux -- FreeType: update to 2.10.4 to fix security vulnerability - -Contributors to 1.18.1 - -Aaron Boxer, Adam Williamson, Andrew Wesie, Arun Raghavan, Bastien -Reboulet, Brent Gardner, Edward Hervey, François Laignel, Guillaume -Desmottes, Havard Graff, He Junyan, Hosang Lee, Jacek Tomaszewski, Jakub -Adam, Jan Alexander Steffens (heftig), Jan Schmidt, Jérôme Laheurte, -Jordan Petridis, Marc Leeman, Marian Cichy, Marijn Suijten, Mathieu -Duponchelle, Matthew Waters, Michael Tretter, Nazar Mokrynskyi, Nicolas -Dufresne, Niklas Hambüchen, Nirbheek Chauhan, Olivier Crête, Philippe -Normand, raghavendra, Ricky Tang, Sebastian Dröge, Seungha Yang, -sohwan.park, Stéphane Cerveau, Thibault Saunier, Tim-Philipp Müller, Tom -Schoonjans, Víctor Manuel Jáquez Leal, Will Miller, Xavier Claessens, Xℹ -Ruoyao, Zebediah Figura, - -… and many others who have contributed bug reports, translations, sent -suggestions or helped testing. Thank you all! 
- -List of merge requests and issues fixed in 1.18.1 - -- List of Merge Requests applied in 1.18.1 -- List of Issues fixed in 1.18.1 - -1.18.2 - -The second 1.18 bug-fix release (1.18.2) was released on 6 December -2020. - -This release only contains bugfixes and it should be safe to update from -1.18.x. +a bug-fix release usually. The 1.20.x bug-fix releases will be made from +the git 1.20 branch, which will be a stable branch. -Highlighted bugfixes in 1.18.2 +1.20.0 -- Fix MPEG-TS timestamping regression when playing DVB streams -- compositor: fix artefacts in certain input scaling/conversion - situations and make sure that the output format is actually - supported, plus renegotiation fixes -- Fix sftp:// URI playback in decodebin/playbin via giosrc -- adaptivedemux/dashdemux/hlsdemux fixes -- rtsp-server fixes -- android media: fix crash when encoding AVC -- fix races in various unit tests -- lots of other bug fixes and memory leak fixes -- various stability, performance and reliability improvements -- g-i annotation fixes -- build fixes +1.20.0 was released on 3 February 2022. -gstreamer - -- bin: When removing a sink, check if the EOS status changed -- info: colorize PIDs in log messages -- aggregator: Include min-upstream-latency in buffering time, helps - especially with performance issues on single core systems where - there are a lot of threads running -- typefind: copy seqnum to new segment event, fixing issues with - oggdemux operating in push mode with typefind operating in pull mode -- identity, clocksync: Also provide system clock if sync=false -- queue2: Fix modes in scheduling query handling -- harness: Handle element not being set cleanly -- g-i: Add some missing nullable annotations, and fix some nullable - annotations: - - gst_test_clock_process_next_clock_id() returns nullable - - gst_stream_type_get_name() is not nullable -- build: fix build issue when compiling for 32-bit architectures with - 64-bit time_t (e.g. 
riscv32) by increasing padding in - GstClockEntryImpl in gst_private.h - -gst-plugins-base - -- gl/eagl: internal view resize fixes for glimagesink -- video-converter: increase the number of cache lines for resampling, - fixes significant color issues and artefacts with “special” resizing - parameters in compositor -- compositor: Don’t crash in prepare_frame() if the pad was just - removed -- decodebin3: Properly handle caps query with no filter -- videoaggregator: Guarantee that the output format is supported -- videoaggregator: Fix locking around vagg->info -- gluploadelement: Avoid race condition of base class’ context -- gluploadelement: Avoid race condition of inside upload creation -- gl: Fix prototype of glGetSynciv() -- tcpserversink: Don’t assume g_socket_get_remote_address() succeeds -- video-aggregator: Fix renegotiation when using convert pads -- videoaggregator: document and fix locking in convert pad -- audiodecoder, videodecoder: Don’t reset max-errors property value in - reset() -- audioencoder: Fix incorrect GST_LOG_OBJECT usage -- pbutils: Fix segfault when using invalid encoding profile -- g-i: videometa: gir annotate the size of plane array in new API -- examples/gl/gtk: Add missing dependency on gstgl -- video: fix doc warning - -gst-plugins-good - -- rpicamsrc: add vchostif library as it is required to build - successful -- deinterlace: Enable x86 assembly with nasm on MSVC -- v4l2: caps negotiate wrong as interlace feature -- aacparse: Fix caps change handling -- rtspsrc: Use URI hash for stream id -- flvmux: Release pads via GstAggregator -- qtmux: Chain up when releasing pad, and fix some locking -- matroska-mux: Fix sparse stream crash -- Splitmux testsuite races - -gst-plugins-bad - -- tsparse: timestamp packetized buffers, fixing timestamp handling - regression in connection with dvbsrc in MeTV -- ttmlparse: fix issues in aggregation of input TTML -- mpegdemux: Set duration on seeking query if possible, fixes seeking - in MPEG-PS streams 
in gst-play-1.0 -- mpegtsdemux: Fix off by one error -- adaptivedemux: Store QoS values on the element -- adaptivedemux: Don’t calculate bitrate for header/index fragments -- hlsdemux: Don’t double-free variant streams on errors -- mpegtspacketizer: Handle PCR issues with adaptive streams -- player: call ref_sink on pipeline -- vkdeviceprovider: Avoid deadlock on physical device -- wlvideoformat: fix DMA format convertor -- Webrtc shutdown crashes -- decklink: Update enum value bounds check in gst_decklink_get_mode() -- decklink: correct framerate 2KDCI 23.98 -- amc: Fix crash when encoding AVC -- d3d11videoprocessor: Fix wrong input/output supportability check -- opencv: allow compilation against 4.5.x -- tests: svthevcenc: Fix test_encode_simple -- tests: dtls: Don’t set dtlsenc state before linking -- mpegtsmux: Restore intervals when creating TsMux -- adaptivedemux, hlsdemux, curl: Use actual object for logging -- gi: player: Fix get_current_subtitle_track() annotation - -gst-plugins-ugly - -- no changes - -gst-libav - -- avauddec: Check planar-ness of frame rather than context, fixes - issue with aptX HD decoding - -gst-rtsp-server - -- stream: collect a clock_rate when blocking -- media: Ignore GstRTSPStreamBlocking from incomplete streams, to - prevent cases with prerolling when the inactive stream prerolls - first and the server proceeds without waiting for the active stream. - When there are no complete streams (during DESCRIBE), we will listen - to all streams. -- media: Use guint64 for setting the size-time property on rtpstorage, - fixes potential crashes or memory corruption. 
-- media: Get rates only on sender streams, fixing issue with ONVIF - audio backchannel streams -- media: Plug memory leak - -gstreamer-vaapi - -- H265 decoder: Fix a typo in scc reference setting - -gstreamer-sharp - -- no changes - -gst-omx - -- no changes - -gst-python - -- no changes - -gst-editing-services - -- Fix static build -- ges_init(): Fix potential initialisation crash on error +1.20.1 -gst-integration-testsuites - -- no changes - -gst-build - -- gst-env: use Path.open() in get_pkgconfig_variable_from_pcfile(), - fixes issues with python 3.5 -- subprojects: pin orc to 0.4.32 release (was 0.4.29) and pin libpsl - to 0.21.1 (was master) - -Cerbero build tool and packaging changes in 1.18.2 - -- build-tools: copy the removed site.py from setuptools, fixing python - programs (like meson) from using libraries from incorrect places - -Contributors to 1.18.2 - -Arun Raghavan, Bing Song, Chris Bass, Chris Duncan, Chris White, David -Keijser, David Phung, Edward Hervey, Fabrice Fontaine, Guillaume -Desmottes, Guiqin Zou, He Junyan, Jan Alexander Steffens (heftig), Jan -Schmidt, Jason Pereira, Jonathan Matthew, Jose Quaresma, Julian Bouzas, -Khem Raj, Kristofer Björkström, Marijn Suijten, Mart Raudsepp, Mathieu -Duponchelle, Matthew Waters, Nicola Murino, Nicolas Dufresne, Nirbheek -Chauhan, Olivier Crête, Philippe Normand, Rafostar, Randy Li, Sanchayan -Maity, Sebastian Dröge, Seungha Yang, Thibault Saunier, Tim-Philipp -Müller, Vivia Nikolaidou, Xavier Claessens - -… and many others who have contributed bug reports, translations, sent -suggestions or helped testing. Thank you all! - -List of merge requests and issues fixed in 1.18.2 - -- List of Merge Requests applied in 1.18.2 -- List of Issues fixed in 1.18.2 - -1.18.3 - -The third 1.18 bug-fix release (1.18.3) was released on 13 January 2021. +The first 1.20 bug-fix release (1.20.1) was released on 14 March 2022. This release only contains bugfixes and it should be safe to update from -1.18.x. 
- -Highlighted bugfixes in 1.18.3 - -- fix ogg playback regression for ogg files that also have ID3 or APE - tags -- compositor: fix artefacts and invalid memory access when blending - subsampled formats -- exported mini object ref/unref/copy functions for use in bindings - such as gstreamer-sharp -- Add support for Apple silicon (M1) to cerbero package builder -- Ship RIST plugin in binary packages -- various stability, performance and reliability improvements -- memory leak fixes -- build fixes - -gstreamer - -- gst: Add non-inline ref/unref/copy/replace methods for various mini - objects (buffer, bufferlist, caps, context, event, memory, message, - promise, query, sample, taglist, uri) for use in bindings such as - gstreamer-sharp -- harness: don’t use GST_DEBUG_OBJECT with GstHarness which is not a - GObject - -gst-plugins-base - -- audiorate: Make buffer writable before changing its metadata -- compositor: fix blending of subsampled components -- decodebin3: When reconfiguring a slot make sure that the ghostpad is - unlinked -- decodebin3: Release selection lock when pushing EOS -- encodebasebin: Ensure that parsers are compatible with selected - encoders -- tagdemux: resize and trim buffer in place to fix interaction with - oggdemux -- videoaggregator: Pop out old buffers on timeout -- video-blend: fix blending 8-bit and 16-bit frames together -- appsrc: fix signal documentation -- gl: document some GL caps specifics -- libvisual: workaround clang compiler warning - -gst-plugins-good - -- deinterlace: fix build of assembly optimisations on macOS -- splitmuxsink: Avoid deadlock when releasing a pad from a running - muxer -- splitmuxsink: fix bogus fragment split -- v4l2object: Map correct video format for RGBA -- videoflip: fix possible crash when changing video-direction/method - while running - -gst-plugins-bad - -- assrender: fix mutex handling in certain flushing/error situations -- dvbsuboverlay: Add support for dynamic resolution update -- dashsink: fix 
critical log of dynamic pipeline -- d3d11shader: Fix ID3DBlob object leak -- d3d11videosink: Prepare window once streaming started -- decklinkaudiosrc: Fix duration of the first audio frame after each - discont -- intervideosrc: fix negotiation of interlaced caps -- msdk: needn’t close mfx session when failed, fixes double free / - potential crash -- msdk: check GstMsdkContext instead of mfxSession instance -- srt: fix locking when retrieving stats -- rtmp2src: fix leaks when connection is cancelled during startup or - connection fails - -gst-plugins-ugly - -- no changes - -gst-libav - -- avauddec: Drain decoder on decoding failure, fixes timestamps after - decoding errors - -gst-rtsp-server - -- rtsp-media: Only count senders when counting blocked streams -- rtsp-client: Only unref client watch context on finalize, to avoid - deadlock - -gstreamer-vaapi - -- no changes - -gstreamer-sharp - -- no changes - -gst-omx - -- no changes - -gst-python - -- no changes - -gst-editing-services - -- launch: Ensure to add required ref to profiles from project -- tests: fix meson test env setup to make sure we use the right - gst-plugin-scanner - -gst-integration-testsuites - -- no changes - -gst-build - -- meson: Update zlib.wrap to use wrapdb instead of github fork - -Cerbero build tool and packaging changes in 1.18.3 - -- Add support for Apple silicon -- Build and ship RIST plugin - -Contributors to 1.18.3 - -Andoni Morales Alastruey, Edward Hervey, Haihao Xiang, Haihua Hu, Hou -Qi, Ignacio Casal Quinteiro, Jakub Adam, Jan Alexander Steffens -(heftig), Jan Schmidt, Jordan Petridis, Lawrence Troup, Lim Siew Hoon, -Mathieu Duponchelle, Matthew Waters, Nicolas Dufresne, Raju Babannavar, -Sebastian Dröge, Seungha Yang, Thibault Saunier, Tim-Philipp Müller, -Tobias Ronge, Vivia Nikolaidou, - -… and many others who have contributed bug reports, translations, sent -suggestions or helped testing. Thank you all! 
-
-List of merge requests and issues fixed in 1.18.3
-
-- List of Merge Requests applied in 1.18.3
-- List of Issues fixed in 1.18.3
+1.20.0.

-1.18.4
+Highlighted bugfixes in 1.20.1

-The fourth 1.18 bug-fix release (1.18.4) was released on 15 March 2021.
-
-This release only contains bugfixes and security fixes and it should be
-safe to update from 1.18.x.
-
-Highlighted bugfixes in 1.18.4
-
-- important security fixes for ID3 tag reading, matroska and realmedia
-  parsing, and gst-libav audio decoding
-- audiomixer, audioaggregator: input buffer handling fixes
-- decodebin3: improve stream-selection message handling
-- uridecodebin3: make “caps” property work
-- wavenc: fix writing of INFO chunks in some cases
-- v4l2: bt601 colorimetry, allow encoder resolution changes, fix
-  decoder frame rate negotiation
-- decklinkvideosink: fix auto format detection, and fixes for 29.97fps
-  framerate output
-- mpeg-2 video handling fixes when seeking
-- avviddec: fix bufferpool negotiation and possible memory corruption
-  when changing resolution
-- various stability, performance and reliability improvements
-- memory leak fixes
-- build fixes: rpicamsrc, qt overlay example, d3d11videosink on UWP
+- deinterlace: various bug fixes for yadif and greedy methods
+- gtk video sink: Fix rotation not being applied when paused
+- gst-play-1.0: Fix trick-mode handling in keyboard shortcut
+- jpegdec: fix RGB conversion handling
+- matroskademux: improved ProRes video handling
+- matroskamux: Handle multiview-mode/flags/pixel-aspect-ratio caps
+  fields correctly when checking caps equality on input caps changes
+- videoaggregator fixes (negative rate handling, current position
+  rounding)
+- soup http plugin: Lookup libsoup dylib files on Apple platforms &
+  fix Cerbero static build on Android and iOS
+- Support build against libfreeaptx in openaptx plugin
+- Fix linking issues on Illumos distros
+- GstPlay: Fix new error + warning parsing API (was unusable before)
+- mpegtsmux: VBR
muxing fixes +- nvdecoder: Various fixes for 4:4:4 and high-bitdepth decoding +- Support build against libfreeaptx in openaptx plugin +- webrtc: Various fixes to the webrtc-sendrecv python example +- macOS: support a relocatable GStreamer.framework on macOS (see below + for details) +- macOS: fix applemedia plugin failing to load on ARM64 macOS +- windows: ship wavpack library +- gst-python: Fix build with Python 3.11 +- various bug fixes, memory leak fixes, and other stability and + reliability improvements gstreamer -- info: Don’t leak log function user_data if the debug system is - compiled out -- task: Use SetThreadDescription() Win32 API for setting thread names, - which preserves thread names in dump files. -- buffer, memory: Mark info in map functions as caller-allocates and - pass allocation params as const pointers where possible -- clock: define AUTO_CLEANUP_FREE_FUNC for GstClockID +- plugin loader: show the reason when spawning of gst-plugin-scanner + fails +- registry, plugin loading: fix dynamic relocation if + GST_PLUGIN_SUBDIR (libdir) is not a single subdirectory; improve + GST_PLUGIN_SUBDIR handling +- context: fix transfer annotation on gst_context_writable_structure() + for bindings +- baseparse: Don’t truncate the duration to milliseconds in + gst_base_parse_convert_default() +- bufferpool: Deactivate pool and get rid of references to other + objects from dispose instead of finalize gst-plugins-base -- tag: id3v2: fix frame size check and potential invalid reads -- audio: Fix gst_audio_buffer_truncate() meta handling for - non-interleaved audio -- audioresample: respect buffer layout when draining -- audioaggregator: fix input_buffer ownership -- decodebin3: change stream selection message owner, so that the app - sends the stream-selection event to the right element -- rtspconnection: correct data_size when tunneled mode -- uridecodebin3: make caps property work -- video-converter: Don’t upsample invalid lines -- videodecoder: Fix racy 
critical when pool negotiation occurs during - flush -- video: Convert gst_video_info_to_caps() to take self as const ptr -- examples: added qt core dependency for qt overlay example +- typefindfunctions: Fix WebVTT format detection for very short files +- gldisplay: Reorder GST_GL_WINDOW check for egl-device +- rtpbasepayload: Copy all buffer metadata instead of just GstMetas + for the input meta buffer +- codec-utils: Avoid out-of-bounds error +- navigation: Fix Since markers for mouse scroll events +- videoaggregator: Fix for unhandled negative rate +- videoaggregator: Use floor() to calculate current position +- video-color: Fix for missing clipping in PQ EOTF function +- gst-play-1.0: Fix trick-mode handling in keyboard shortcut +- audiovisualizer: shader: Fix out of bound write gst-plugins-good -- matroskademux: header parsing fixes -- rpicamsrc: depend on posix threads and vchiq_arm to fix build on - raspios again -- wavenc: Fixed INFO chunk corruption, caused by odd sized data not - being padded -- wavpackdec: Add floating point format support to fix distortions in - some cases -- v4l2: recognize V4L2 bt601 colorimetry again -- v4l2videoenc: support resolution change stream encode -- v4l2h265codec: fix HEVC profile string issue -- v4l2object: Need keep same transfer as input caps -- v4l2videodec: Fix vp8 and vp9 streams can’t play on board with - vendor bsp -- v4l2videodec: fix src side frame rate negotiation +- deinterlace: various bug fixes for yadif method +- deinterlace: Refactor greedyh and fix planar formats +- deinterlace: Prevent race between method configuration and latency + query +- gtk video sink: Fix rotation not being applied when paused +- jpegdec: fix RGB conversion handling +- matroskademux: improved ProRes video handling +- matroskamux: Handle multiview-mode/flags/pixel-aspect-ratio caps + fields correctly when checking caps equality on input caps changes +- rtprtx: don’t access type-system per buffer (performance + optimisation); code 
cleanups +- rtpulpfecenc: fix unmatched g_slice_free() +- rtpvp8depay: fix crash when making GstRTPPacketLost custom event +- qtmux: Don’t post an error message if pushing a sample failed with + FLUSHING (e.g. on pipeline shutdown) +- soup: Lookup libsoup dylib files on Apple platforms & fix Cerbero + static build on Android and iOS +- souphttpsrc: element not present on iOS after 1.20.0 update +- v4l2tuner: return NULL if no norm set +- v4l2bufferpool: Fix race condition between qbuf and pool streamoff +- meson: Don’t build lame plugin with -Dlame=disabled gst-plugins-bad -- avwait: Don’t post messages with the mutex locked -- d3d11h264dec: Reconfigure decoder object on DPB size change and keep - track of actually configured DPB size -- dashsink: fix double unref of sinkpad caps -- decklinkvideosink: Use correct numerator for 29.97fps -- decklinkvideosink: fix auto format detection -- decklinksrc: Use a more accurate capture time -- d3d11videosink: Fix build error on UWP -- interlace: negotiation and buffer leak fixes -- mpegvideoparse: do not clip, so decoder receives data from keyframe - even if it’s before the segment start -- mpegtsparse: Fix switched DTS/PTS when set-timestamps=false -- nvh264sldec: Reopen decoder object if larger DPB size is required -- sdpsrc: fix double free if sdp is provided as string via the - property -- vulkan: Fix elements long name. 
+- GstPlay: Fix new error + warning parsing API (was unusable before) +- av1parse: let the parser continue on verbose OBUs +- d3d11converter: Fix RGB to GRAY conversion, broken debug messages, + and add missing GRAY conversion +- gs: look for google_cloud_cpp_storage.pc +- ipcpipeline: fix crash and error on windows with SOCKET or _pipe() +- ivfparse: Don’t set zero resolution on caps +- mpegtsdemux: Handle PES headers bigger than a mpeg-ts packet; fix + locking in error code path; handle more program updates +- mpegtsmux: Start last_ts with GST_CLOCK_TIME_NONE to fix VBR muxing + behaviour +- mpegtsmux: Thread safety fixes: lock mux->tsmux, the programs hash + table, and pad streams +- mpegtsmux: Skip empty buffers +- osxaudiodeviceprovider: Add initial support for duplex devices on + OSX +- rtpldacpay: Fix missing payload information +- sdpdemux: add media attributes to caps, fixes ptp clock handling +- mfaudioenc: Handle empty IMFMediaBuffer +- nvdecoder: Various fixes for 4:4:4 and high-bitdepth decoding +- nvenc: Fix deadlock because of too strict buffer pool size +- va: fix library build issues, caps leaks in the vpp transform + function, and add vaav1dec to documentation +- v4l2codecs: vp9: Minor fixes +- v4l2codecs: h264: Correct scaling matrix ABI check +- dtlstransport: Notify ICE transport property changes +- webrtc: Various fixes to the webrtc-sendrecv python example +- webrtc-ice: Fix memory leaks in gst_webrtc_ice_add_candidate() +- Support build against libfreeaptx in openaptx plugin +- Fix linking issues on Illumos distros gst-plugins-ugly -- rmdemux: Make sure we have enough data available when parsing - audio/video packets +- x264enc: fix plugin long-name and description gst-libav -- avviddec: take the maximum of the height/coded_height -- viddec: don’t configure an incorrect buffer pool when receiving a - gap event -- audiodec: fix stack overflow in gst_ffmpeg_channel_layout_to_gst() +- No changes gst-rtsp-server -- rtspclientsink: fix deadlock 
on shutdown if no data has been - received yet -- rtspclientsink: fix leaks in unit tests -- rtsp-stream: avoid deadlock in send_func -- rtsp-client: cleanup transports during TEARDOWN +- Fix race in rtsp-client when tunneling over HTTP gstreamer-vaapi -- h264 encoder: append encoder exposure to aud -- postproc: Fix a problem of propose_allocation when passthrough -- glx: Iterate over FBConfig and select 8 bit color size +- No changes gstreamer-sharp -- no changes +- No changes gst-omx -- no changes +- No changes gst-python -- no changes +- Fix build with Python 3.11 gst-editing-services -- group: Use proper group constructor +- Update validate test scenarios for videoaggregator rounding + behaviour change gst-integration-testsuites -- no changes - -gst-build - -- no changes - -Cerbero build tool and packaging changes in 1.18.4 - -- macOS: more BigSur fixes -- glib: Backport patch to set thread names on Windows 10 - -Contributors to 1.18.4 - -Alicia Boya García, Ashley Brighthope, Bing Song, Branko Subasic, Edward -Hervey, Guillaume Desmottes, Haihua Hu, He Junyan, Hou Qi, Jan Alexander -Steffens (heftig), Jeongki Kim, Jordan Petridis, Knobe, Kristofer -Björkström, Marijn Suijten, Matthew Waters, Paul Goulpié, Philipp Zabel, -Rafał Dzięgiel, Sebastian Dröge, Seungha Yang, Staz M, Stéphane Cerveau, -Thibault Saunier, Tim-Philipp Müller, Víctor Manuel Jáquez Leal, Vivia -Nikolaidou, Vladimir Menshakov, - -… and many others who have contributed bug reports, translations, sent -suggestions or helped testing. Thank you all! - -List of merge requests and issues fixed in 1.18.4 - -- List of Merge Requests applied in 1.18.4 -- List of Issues fixed in 1.18.4 - -1.18.5 - -The fifth 1.18 bug-fix release (1.18.5) was released on 8 September -2021. - -This release only contains bugfixes and security fixes and it should be -safe to update from 1.18.x. 
- -Highlighted bugfixes in 1.18.5 - -- basesink: fix reverse frame stepping -- downloadbuffer/sparsefile: several fixes for win32 -- systemclock: Update monotonic reference time when re-scheduling, - fixes high CPU usage with gnome-music when pausing playback -- audioaggregator: fix glitches when resyncing on discont -- compositor: Fix NV12 blend operation -- rtspconnection: Add IPv6 support for tunneled mode -- avidemux: fix playback of some H.264-in-AVI streams -- jpegdec: Fix crash when interlaced field height is not DCT block - size aligned -- qmlglsink: Keep old buffers around a bit longer if they were bound - by QML -- qml: qtitem: don’t potentially leak a large number of buffers -- rtpjpegpay: fix image corruption when compiled with MSVC on Windows -- rtspsrc: seeking improvements -- rtpjitterbuffer: Avoid generation of invalid timestamps -- rtspsrc: Fix behaviour of select-streams, new-manager, - request-rtcp-key and before-send signals with GLib >= 2.62 -- multiudpsink: Fix broken SO_SNDBUF get/set on Windows -- openh264enc: fix broken sps/pps header generation and some minor - leaks -- mpeg2enc: fix interlace-mode detection and unbound memory usage if - encoder can’t keep up -- mfvideosrc: Fix for negative MF stride and for negotiation when - interlace-mode is specified -- tsdemux: fix seek-with-stop regression and decoding errors after - seeking with dvdlpcmdec -- rtsp-server: seek handling improvements -- gst-libav: fix build (and other issues) with ffmpeg 4.4 -- cerbero: spandsp: Fix build error with Visual Studio 2019 -- win32 packages: Fix hang in GLib when G_SLICE environment variable - is set - -gstreamer - -- aggregator: Release the SRC lock while querying latency -- aggregator: Release pads’ peeked buffer when removing the pad or - finalizing it -- basesink: Don’t swap rstart/rstop when stepping -- basesrc: Print segments with GST_SEGMENT_FORMAT and not - GST_PTR_FORMAT -- childproxy: init value in gst_child_proxy_get_property() if needed -- 
clocksync: Fix providing system clock by default -- concat: Properly propagate seqnum of segment events -- concat: adjust running time offsets on downstream events -- concat: fix locking in SEGMENT event handler -- downloadbuffer/sparsefile: several fixes for win32 -- element: NULL the lists of contexts in dispose() -- multiqueue: Use running time of gap events for wakeups. -- multiqueue: Ensure peer pad exists when iterating internal links -- pad: Keep IDLE probe hook alive during immediate callback -- pad: Ensure last flow return is set on sink pads in push mode -- pad: Don’t spam the debug log at INFO level when default-chaining a - buffer list -- pad: clear probes holding mutex -- parse-launch: Fix a critical when using the : operator. -- parse-launch: Don’t do delayed property setting for top-level - properties. -- plugin: load plugins with unknown license strings -- ptpclock: Don’t leak the GList -- queue2: Refuse all serialized queries when posting buffering - messages -- systemclock: Update monotonic reference time when re-scheduling -- High CPU usage in 1.18 (but not master) when pausing playback in - gnome-music -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) - -gst-plugins-base - -- appsrc: Don’t leak buffer list while wrongly unreffing buffer on - EOS/flushing -- audioaggregator: Don’t overwrite already written samples -- audioaggregator: Resync on the next buffer when dropping a buffer on - discont resyncing -- audiobasesink: Fix of double lock release -- audioaggregator: Don’t overwrite already written samples -- audiobasesrc: Fix divide by zero assertion -- clockoverlay: Fix broken string formatting by strftime() on Windows -- compositor: Fix NV12 blend operation -- giosrc: Don’t leak scheme string in gst_gio_src_query() -- giobasesink: Handle incomplete writes in gst_gio_base_sink_render() -- gl/wayland: Use consistent wl_display when creating work queue for - proxy wrapper -- gl: Fix build when Meson >= 0.58.0rc1 -- 
gl/wayland: provide a dummy global_remove function -- playbin2: fix base_time selection when flush seeking live (such as - with RTSP) -- rtspconnection: Add IPv6 support for tunneled mode -- rtspconnection: Consistently translate GIOError to GstRTSPResult - (for rtspsrc) -- rawbaseparse: check destination format correctly -- uridecodebin: Don’t force floating reference for future reusable - decodebin -- parsebin: Put stream flags in GstStream -- splitmuxsink: always use factory property when set -- video-converter: Set up matrix tables only once. -- videoscale: Performance degradation from 1.16.2 -> 1.18.4 -- videotestsrc: Fix a leak when computing alpha caps -- audio/video-converter: Plug some minor leaks -- audio,video-format: Make generate_raw_formats idempotent for - assertions -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) -- Fix build issue on MinGW64 - -gst-plugins-good - -- avidemux: Also detect 0x000001 as H264 byte-stream start code in - codec_data -- deinterlace: Plug a method subobject leak -- deinterlace: Drop field-order field if outputting progressive -- jpegdec: Fix crash when interlaced field height is not DCT block - size aligned -- qmlglsink: Keep old buffers around a bit longer if they were bound - by QML -- qml: qtitem: don’t potentially leak a large number of buffers -- qtdemux: Force stream-start push when re-using EOS’d streams -- qtmux: for Apple ProRes, allow overriding pixel bit depth, e.g. when - exporting an opaque image, yet with alpha. -- qtmux: Make sure to write 64-bit STCO table when needed. 
-- rtpjpegpay: fix image corruption when compiled with MSVC on Windows -- rtpptdemux: Remove pads also in PAUSED->READY -- rtph265depay: update codec_data in caps regardless of format -- rtspsrc: Do not overwrite the known duration after a seek -- rtspsrc: De-dup seek event seqnums to avoid multiple seeks -- rtspsrc: Fix race saving seek event seqnum -- rtspsrc: Using multicast UDP has no relation to seekability, also - add some logging -- rtpjitterbuffer: Fix parsing of the mediaclk:direct= field -- rtpjitterbuffer: Avoid generation of invalid timestamps -- rtpjitterbuffer: Check srcresult before waiting on the condition - variable too -- rtpjitterbuffer: More logging when calculating rfc7273 timestamps -- rtspsrc: Fix more signals -- rtspsrc: Fix accumulation of before-send signal return values -- souphttpsrc: Always use the content decoder but set - `Accept-Encoding:… -- udpsrc: Plug leaks of saddr in error cases -- multiudpsink: Fix broken SO_SNDBUF get/set on Windows -- v4l2object: Add interlace-mode back to caps for camera -- v4l2object: Use default colorimetry if that in caps is unknown -- V4l2object: Avoid colorimetry mismatch for streams with invalid - colorimetry -- v4l2object: Add support for hdr10 stream playback -- wavparse: adtl/note/labl chunk parsing fixes -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) -- 1.18.4: build fails with glib 2.67.6 and gcc-11: argument 2 of - ‘__atomic_load’ must not be a pointer to a ‘volatile’ type - -gst-plugins-bad - -- audiolatency: Use live mode audiotestsrc -- audiolatency: Handle audio buffers with invalid duration -- ccconverter: fix framerate caps negotiation from non-cdp to cdp -- dashdemux: Properly initalize GError, remove duplicate logging call -- dashdemux: Log protection events on corresponding pad -- dashdemux: fix dash_mpdparser_check_mpd_client_set_methods unit test -- h264parse,h265parse: Push parameter set NAL units again per - segment-done -- h265parse: Fix a typo in 
get_compatible_profile_caps() -- h265parse: don’t invalidate the last PPS when parsing a new SPS -- h264parse: improve PPS handling -- h2645parser: Catch overflows in AVC/HEVC NAL unit length - calculations -- interlace: Don’t set field-order field for progressive caps, fixes - negotiation issues -- interlace: Fix too small buffer size error -- jpegparse: Don’t generate timestamp for 0/1 framerates -- opencv: fix build error on macOS -- openexr: Fix build with OpenEXR 3 -- openh264enc: fix broken sps/pps header generation and some minor - leaks -- mpeg2enc: fix interlace-mode detection on input video -- mpeg2enc: Only allow 1 pending frame for encoding (fixes unbound - memory usage in case encoder can’t keep up with input) -- mfvideoenc: Don’t pass 0/1 framerate to MFT -- mfvideosrc: Fix for negative MF stride -- mfvideosrc: Fix negotiation when interlace-mode is specified -- mxfvanc: Handle empty ANC essence -- rtmp2src: workaround a GLib race when destroying a - GMainContext/GSource -- rtpsrc: Plug leak of rtcp_send_addr and fix setting URI back to NULL -- rtpsink: Return proper pad from _request_new_pad() -- rist: Plug leak of rtcp_send_addr -- rtmp2: Use correct size of write macro for param2. 
-- rtmp2/connection: Separate inner from outer cancelling -- tsmux: When selecting random PIDs, name the pads according to those - PIDs -- tsmux: Recheck existing pad PIDs when requesting a new pad with a - random pid -- tsdemux: fix seek with stop regression -- tsdemux: Clear all streams when rewinding, fixes the case where the - demuxer sends out partial invalid data downstream after a seek which - causes some decoders (such as dvdlpmdec) to error out -- v4l2slh264dec: Fix slice header bit size calculation -- videoparseutils: Fix for wrong CEA708 minimum size check -- waylandsink: Fix for missing initial configure -- wpe: Make threaded view singleton creation thread safe -- x265: Fix a deadlock when failing to create the x265enc -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) - -gst-plugins-ugly - -- asfdemux/realmedia: Drop duplicate seek events -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) - -gst-libav - -- avmux: Blacklist ttml subtitles (fixes crash with ffmpeg >= 4.4) -- avmux: fix segfault when a plugin’s long_name is NULL -- avviddec: Fix size of linesize parameter -- avviddec: Take into account coded_height for pool -- avdemux: fix build with FFmpeg 4.4 - -gst-rtsp-server - -- rtsp-media: Ensure the bus watch is removed during unprepare -- rtsp-media: Add one more case to seek avoidance -- rtsp-media: Improve skipping trickmode seek -- Fix a few memory leaks - -gstreamer-vaapi - -- plugins: Demote rank of vaapipostproc and vaapioverlay to match - other filters -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) - -gst-editing-services - -- xml-formatter: Fix allocation size of buffer -- framepositioner: Fix runtime warning -- Don’t use volatile to mean atomic (fixes compiler warnings with - gcc 11) - -gst-devtools - -- scenario: Fix EOS handling in seek_forward.scenario -- validate-utils: Only modify structure fields that really need - updates -- Don’t use volatile to 
mean atomic (fixes compiler warnings with - gcc 11) +- Update validate test scenarios for videoaggregator rounding + behaviour change -gst-integration-testsuites +Development build environment -- validate: Update interlace_deinterlace_alternate to remove - field-order from expected caps +- gst-env: various clean-ups and documentation improvements -gst-build +Cerbero build tool and packaging changes in 1.20.1 -- git-update: Make fetching of external repos non-fatal on the CI -- gst-env: Windows: Fix looking for cmd_or_ps.ps1 in the wrong - directory -- Pin gst-plugins-rs subproject to 0.7 branch - -Cerbero build tool and packaging changes in 1.18.5 - -- cerbero: Add a dotted progress bar for urllib downloads -- libunwind: make sure all pkgconfig files get included in the devel - package -- openssl.recipe: Bump to 1.1.1k -- glib: Fix hang on Windows when G_SLICE env is configured -- utils: Support latest Debian release names -- enums: generate fedora version strings automatically -- Rework cmake build system -- spandsp: Fix build error with Visual Studio 2019 - -Contributors to 1.18.5 - -Alba Mendez, Andoni Morales Alastruey, Antonio Rojas, Bartłomiej -Kurzeja, Binh Truong, Daniel Knobe, Doug Nazar, Edward Hervey, He -Junyan, Hou Qi, Jan Alexander Steffens (heftig), Jan Schmidt, Marijn -Suijten, Mathieu Duponchelle, Matthew Waters, Michael Olbrich, Miguel -Paris, Nicholas Jackson, Nicolas Dufresne, Nirbheek Chauhan, Olivier -Crête, Per Förlin, Philippe Normand, Robin Burchell, Ruslan Khamidullin, -Scott Moreau, Sebastian Dröge, Sergei Kovalev, Seungha Yang, Stéphane -Cerveau, Steve McDaniel, Thibault Saunier, Tim-Philipp Müller, Víctor -Manuel Jáquez Leal, Xavier Claessens, Youngsoo Lee, +- Fix nasm version check +- Disable certificate checking on RHEL/CentOS 7 +- packages: Ship wavpack.dll for Windows +- osx/universal: make the library name relocatable +- macOS: In order to support a relocatable GStreamer.framework on + macOS, an application may now need to add an 
rpath entry to the + location of the GStreamer.framework (which could be bundled with the + application itself). Some build systems will do this for you by + default. +- Disable MoltenVK on macOS arm64 to fix applemedia plugin loading +- Fix applemedia plugin failing to load on ARM64 macOS + +Contributors to 1.20.1 + +Bastien Nocera, Branko Subasic, David Svensson Fors, Dmitry Osipenko, +Edward Hervey, Guillaume Desmottes, Havard Graff, Heiko Becker, He +Junyan, Igor V. Kovalenko, Jan Alexander Steffens (heftig), Jan Schmidt, +jinsl00000, Joseph Donofry, Jose Quaresma, Marek Vasut, Matthew Waters, +Mengkejiergeli Ba, Nicolas Dufresne, Nirbheek Chauhan, Philippe Normand, +Qi Hou, Rouven Czerwinski, Ruben Gonzalez, Sanchayan Maity, Sangchul +Lee, Sebastian Dröge, Sebastian Fricke, Sebastian Groß, Sebastian +Mueller, Sebastian Wick, Seungha Yang, Stéphane Cerveau, Thibault +Saunier, Tim Mooney, Tim-Philipp Müller, Víctor Manuel Jáquez Leal, +Vivia Nikolaidou, Zebediah Figura, … and many others who have contributed bug reports, translations, sent suggestions or helped testing. Thank you all! -List of merge requests and issues fixed in 1.18.5 - -- List of Merge Requests applied in 1.18.5 -- List of Issues fixed in 1.18.5 - -1.18.6 - -The sixth 1.18 bug-fix release (1.18.6) was released on 2 February 2022. - -This release only contains bugfixes and security fixes and it should be -safe to update from 1.18.x. 
- -Highlighted bugfixes in 1.18.6 - -- tagdemux: Fix crash when presented with malformed files (security - fix) -- video-converter: Fix broken gamma remap with high bitdepth YUV - output -- shout2send: Fix issues with libshout >= 2.4.2 -- mxfdemux: fix regression with VANC tracks that only contains packet - types we don’t handle -- Better plugin loading error reporting on Windows -- Fixes for deprecations in Python 3.10 -- build fixes, memory leak fixes, reliability fixes -- security fixes - -gstreamer - -- gstplugin: Fix for UWP build -- gstplugin: Better warnings on plugin load failure on Windows -- gst-ptp-helper: Do not disable multicast loopback -- concat: fix qos event handling -- pluginfeature: Fix object leak -- baseparse: fix invalid avg_bitrate after reset -- multiqueue: Fix query unref race on flush -- gst: Initialize optional event/message fields when parsing -- bitwriter: Fix the trailing bits lost when getting its data. -- multiqueue: never consider a queue that is not waiting -- input-selector: Use proper segments when cleaning cached buffers - -gst-plugins-base - -- tagdemux: Fix crash when presented with malformed files (security - fix) -- videoencoder: make sure the buffer is writable before modifying - metadata -- video-converter: Fix for broken gamma remap with high bitdepth YUV - output -- sdpmessage: fix mapping single char fmtp params -- oggdemux: fix a race in push mode when performing the duration seek -- uridecodebin: Fix critical warnings -- audio-converter: Fix resampling when there’s nothing to output -- tcp: fix build on Solaris -- uridecodebin3: Nullify current item after all play items are freed. 
-- audio-resampler: Fix segfault when we can’t output any frames -- urisourcebin: Handle sources with dynamic pads and pads already - present -- playbin2/3: autoplug/caps: don’t expand caps to ANY -- uridecodebin3/urisourcebin: Reusability fixes -- rtspconnection: Only reset timeout when socket is unused -- gstvideoaggregator.c: fix build with gcc 4.8 - -gst-plugins-good - -- rtspsrc: Fix critical while serializing timeout element message -- multifilesrc: fix caps leak -- shout2: Add compatibility for libshout >= 2.4.2 shout_open return - values -- v4l2: Update fmt if padded height is greater than fmt height -- v4l2bufferpool: set video alignment of video meta -- qtmux: fix deadlock in gst_qt_mux_prepare_moov_recovery -- matroska: Add support for muxing/demuxing ffv1 -- qtdemux: Try to build AAC codec-data whenever it’s possible - -gst-plugins-bad - -- interlace: Fix a double-unref on shutdown -- webrtcbin: Chain up to parent constructed method -- webrtc: fix log error message in function - gst_webrtc_bin_set_local_description -- mxfdemux: don’t error out if VANC track only contains packets we - don’t handle -- av1parser: Fix data type of film grain param -- assrender: Support RFC8081 mime types -- pitch: Specify layout as required for negotiation -- magicleap: update lumin_rt libraries names to the latest official - version -- codecs: h265decoder: Fix per-slice leak -- mpeg4videoparse: fix criticals trying to insert configs that don’t - exist yet -- webrtcbin: Always set SINK/SRC flags -- mpegtspacketizer: memcmp potentially seen_before data -- zxing: update to support version 1.1.1 - -gst-plugins-ugly - -- No changes - -gst-libav - -- avcodecmap: Add support for GBRA_10LE/BE - -gst-rtsp-server - -- rtsp-stream: fix get_rates raciness -- rtsp-media: Only unprepare a media if it was not already unpreparing - anyway -- rtsp-media: Unprepare suspended medias too -- rtsp-client: make sure sessmedia will not get freed while used -- rtsp-media: Also mark receive-only 
(RECORD) medias as prepared when - unsuspending -- rtsp-session: Don’t unref medias twice if it is removed inside… -- examples: Fix leak in appsrc2 example - -gstreamer-vaapi - -- libs: video-format: Check if formats map is not NULL -- vaapidecode: Autogenerate caps template -- vaapipostproc: copy over metadata also when using system allocated - buffer - -gst-python - -- Avoid treating float as int (fix for Python 3.10) - -gst-editing-services - -- meson: Remove duplicate definition of ‘examples’ option - -gst-devtools - -- No changes - -gst-integration-testsuites - -- No changes - -gst-build - -- env: Fix deprecations from python 3.10 -- Various fixes for macOS -- update FFmpeg wrap to 4.3.3 - -Cerbero build tool and packaging changes in 1.18.6 - -- Some fixes for Fedora 34 -- cerbero: Backport fix for removed loop param of PriorityQueue() -- cerbero: Fix support for Fedora 35 -- Add support for Visual Studio 2022 -- openssl.recipe: Fix crash on iOS TestFlight -- UnixBootstrapper: remove sudo as root user -- bzip2.recipe: bump version to 1.0.8 -- openssl.recipe: upgrade to version 1.1.1l - -Contributors to 1.18.6 - -Antonio Ospite, Célestin Marot, Dave Piché, Erlend Eriksen, Fabrice -Fontaine, Guillaume Desmottes, Haihua Hu, He Junyan, Jakub Adam, Jan -Alexander Steffens (heftig), Jan Schmidt, Jeremy Cline, Jordan Petridis, -Mathieu Duponchelle, Matthew Waters, Mengkejiergeli Ba, Michael Gruner, -Nirbheek Chauhan, Ognyan Tonchev, Pascal Hache, Rafał Dzięgiel, -Sebastian Dröge, Seungha Yang, Stéphane Cerveau, Teng En Ung,Thibault -Saunier, Thomas Klausner, Tim-Philipp Müller, Tobias Reineke, Tobias -Ronge, Tomasz Andrzejak, Trung Do, Víctor Manuel Jáquez Leal, Vivia -Nikolaidou, - -… and many others who have contributed bug reports, translations, sent -suggestions or helped testing. Thank you all! 
+List of merge requests and issues fixed in 1.20.1 -List of merge requests and issues fixed in 1.18.6 +- List of Merge Requests applied in 1.20.1 +- List of Issues fixed in 1.20.1 -- List of Merge Requests applied in 1.18.6 -- List of Issues fixed in 1.18.6 +Schedule for 1.22 -Schedule for 1.20 +Our next major feature release will be 1.22, and 1.21 will be the +unstable development version leading up to the stable 1.22 release. The +development of 1.21/1.22 will happen in the git main branch. -Our next major feature release will be 1.20, and will be released in -early February 2022. You can track its progress on the 1.20 Release -Notes page. +The plan for the 1.22 development cycle is yet to be confirmed. Assuming +no major project-wide reorganisations in the 1.22 cycle we might try and +aim for a release around August 2022. -1.20 will be backwards-compatible to the stable 1.18, 1.16, 1.14, 1.12, -1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series. +1.22 will be backwards-compatible to the stable 1.20, 1.18, 1.16, 1.14, +1.12, 1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series. ------------------------------------------------------------------------ These release notes have been prepared by Tim-Philipp Müller with -contributions from Mathieu Duponchelle, Matthew Waters, Nirbheek -Chauhan, Sebastian Dröge, Thibault Saunier, and Víctor Manuel Jáquez -Leal. +contributions from Matthew Waters, Nicolas Dufresne, Nirbheek Chauhan, +Sebastian Dröge and Seungha Yang. License: CC BY-SA 4.0
View file
gst-plugins-bad-1.20.1.tar.xz/README.md
Added
@@ -0,0 +1,196 @@ +GStreamer 1.20.x stable series + +WHAT IT IS +---------- + +This is GStreamer, a framework for streaming media. + +WHERE TO START +-------------- + +We have a website at + + https://gstreamer.freedesktop.org + +Our documentation, including tutorials, API reference and FAQ can be found at + + https://gstreamer.freedesktop.org/documentation/ + +You can subscribe to our mailing lists: + + https://lists.freedesktop.org/mailman/listinfo/gstreamer-announce + + https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel + +We track bugs, feature requests and merge requests (patches) in GitLab at + + https://gitlab.freedesktop.org/gstreamer/ + +You can join us on IRC - #gstreamer on irc.oftc.net + +GStreamer 1.0 series +-------------------- + +Starring + + GSTREAMER + +The core around which all other modules revolve. Base functionality and +libraries, some essential elements, documentation, and testing. + + BASE + +A well-groomed and well-maintained collection of GStreamer plug-ins and +elements, spanning the range of possible types of elements one would want +to write for GStreamer. + +And introducing, for the first time ever, on the development screen ... + + THE GOOD + + --- "Such ingratitude. After all the times I've saved your life." + +A collection of plug-ins you'd want to have right next to you on the +battlefield. Shooting sharp and making no mistakes, these plug-ins have it +all: good looks, good code, and good licensing. Documented and dressed up +in tests. If you're looking for a role model to base your own plug-in on, +here it is. + +If you find a plot hole or a badly lip-synced line of code in them, +let us know - it is a matter of honour for us to ensure Blondie doesn't look +like he's been walking 100 miles through the desert without water. + + THE UGLY + + --- "When you have to shoot, shoot. Don't talk." + +There are times when the world needs a color between black and white. 
+Quality code to match the good's, but two-timing, backstabbing and ready to +sell your freedom down the river. These plug-ins might have a patent noose +around their neck, or a lock-up license, or any other problem that makes you +think twice about shipping them. + +We don't call them ugly because we like them less. Does a mother love her +son less because he's not as pretty as the other ones ? No - she commends +him on his great personality. These plug-ins are the life of the party. +And we'll still step in and set them straight if you report any unacceptable +behaviour - because there are two kinds of people in the world, my friend: +those with a rope around their neck and the people who do the cutting. + + THE BAD + + --- "That an accusation?" + +No perfectly groomed moustache or any amount of fine clothing is going to +cover up the truth - these plug-ins are Bad with a capital B. +They look fine on the outside, and might even appear to get the job done, but +at the end of the day they're a black sheep. Without a golden-haired angel +to watch over them, they'll probably land in an unmarked grave at the final +showdown. + +Don't bug us about their quality - exercise your Free Software rights, +patch up the offender and send us the patch on the fastest steed you can +steal from the Confederates. Because you see, in this world, there's two +kinds of people, my friend: those with loaded guns and those who dig. +You dig. + +The Lowdown +----------- + + --- "I've never seen so many plug-ins wasted so badly." + +GStreamer Plug-ins has grown so big that it's hard to separate the wheat from +the chaff. Also, distributors have brought up issues about the legal status +of some of the plug-ins we ship. 
To remedy this, we've divided the previous +set of available plug-ins into four modules: + +- gst-plugins-base: a small and fixed set of plug-ins, covering a wide range + of possible types of elements; these are continuously kept up-to-date + with any core changes during the development series. + + - We believe distributors can safely ship these plug-ins. + - People writing elements should base their code on these elements. + - These elements come with examples, documentation, and regression tests. + +- gst-plugins-good: a set of plug-ins that we consider to have good quality + code, correct functionality, our preferred license (LGPL for the plug-in + code, LGPL or LGPL-compatible for the supporting library). + + - We believe distributors can safely ship these plug-ins. + - People writing elements should base their code on these elements. + +- gst-plugins-ugly: a set of plug-ins that have good quality and correct + functionality, but distributing them might pose problems. The license + on either the plug-ins or the supporting libraries might not be how we'd + like. The code might be widely known to present patent problems. + + - Distributors should check if they want/can ship these plug-ins. + - People writing elements should base their code on these elements. + +- gst-plugins-bad: a set of plug-ins that aren't up to par compared to the + rest. They might be close to being good quality, but they're missing + something - be it a good code review, some documentation, a set of tests, + a real live maintainer, or some actual wide use. + If the blanks are filled in they might be upgraded to become part of + either gst-plugins-good or gst-plugins-ugly, depending on the other factors. + + - If the plug-ins break, you can't complain - instead, you can fix the + problem and send us a patch, or bribe someone into fixing them for you. + - New contributors can start here for things to work on. 
+ +PLATFORMS +--------- + +- Linux is of course fully supported +- FreeBSD is reported to work; other BSDs should work too; same for Solaris +- MacOS works, binary 1.x packages can be built using the cerbero build tool +- Windows works; binary 1.x packages can be built using the cerbero build tool + - MSys/MinGW builds + - Microsoft Visual Studio builds are also available and supported +- Android works, binary 1.x packages can be built using the cerbero build tool +- iOS works + +INSTALLING FROM PACKAGES +------------------------ + +You should always prefer installing from packages first. GStreamer is +well-maintained for a number of distributions, including Fedora, Debian, +Ubuntu, Mandrake, Arch Linux, Gentoo, ... + +Only in cases where you: + + - want to hack on GStreamer + - want to verify that a bug has been fixed + - do not have a sane distribution + +should you choose to build from source tarballs or git. + +Find more information about the various packages at + + https://gstreamer.freedesktop.org/download/ + +For in-depth instructions about building GStreamer visit: +[getting-started](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/README.md#getting-started). + +PLUG-IN DEPENDENCIES AND LICENSES +--------------------------------- + +GStreamer is developed under the terms of the LGPL (see COPYING file for +details). Some of our plug-ins however rely on libraries which are available +under other licenses. This means that if you are distributing an application +which has a non-GPL compatible license (for instance a closed-source +application) with GStreamer, you have to make sure not to distribute GPL-linked +plug-ins. + +When using GPL-linked plug-ins, GStreamer is for all practical reasons +under the GPL itself. + +HISTORY +------- + +The fundamental design comes from the video pipeline at Oregon Graduate +Institute, as well as some ideas from DirectMedia. It's based on plug-ins that +will provide the various codec and other functionality. 
The interface +hopefully is generic enough for various companies (ahem, Apple) to release +binary codecs for Linux, until such time as they get a clue and release the +source.
View file
gst-plugins-bad-1.18.6.tar.xz/RELEASE -> gst-plugins-bad-1.20.1.tar.xz/RELEASE
Changed
@@ -1,4 +1,4 @@ -This is GStreamer gst-plugins-bad 1.18.6. +This is GStreamer gst-plugins-bad 1.20.1. The GStreamer team is thrilled to announce a new major feature release of your favourite cross-platform multimedia framework! @@ -6,13 +6,12 @@ As always, this release is again packed with new features, bug fixes and other improvements. -The 1.18 release series adds new features on top of the 1.16 series and is -part of the API and ABI-stable 1.x release series of the GStreamer multimedia -framework. +The 1.20 release series adds new features on top of the 1.18 series and is +part of the API and ABI-stable 1.x release series. Full release notes can be found at: - https://gstreamer.freedesktop.org/releases/1.18/ + https://gstreamer.freedesktop.org/releases/1.20/ Binaries for Android, iOS, Mac OS X and Windows will usually be provided shortly after the release. @@ -60,7 +59,7 @@ directory: https://gstreamer.freedesktop.org/src/gstreamer/ The git repository and details how to clone it can be found at -https://gitlab.freedesktop.org/gstreamer/ +https://gitlab.freedesktop.org/gstreamer/gstreamer/ ==== Homepage ==== @@ -68,10 +67,9 @@ ==== Support and Bugs ==== -We have recently moved from GNOME Bugzilla to GitLab on freedesktop.org -for bug reports and feature requests: +We track bugs and feature requests in GitLab: - https://gitlab.freedesktop.org/gstreamer + https://gitlab.freedesktop.org/gstreamer/gstreamer/ Please submit patches via GitLab as well, in form of Merge Requests. See @@ -84,11 +82,14 @@ There is also a #gstreamer IRC channel on the OFTC IRC network. +Please do not submit support requests in GitLab, we only use it +for bug tracking and merge requests review. 
+ ==== Developers ==== -GStreamer source code repositories can be found on GitLab on freedesktop.org: +The GStreamer source code repository can be found on GitLab on freedesktop.org: - https://gitlab.freedesktop.org/gstreamer + https://gitlab.freedesktop.org/gstreamer/gstreamer/ and can also be cloned from there and this is also where you can submit Merge Requests or file issues for bugs or feature requests.
View file
gst-plugins-bad-1.18.6.tar.xz/REQUIREMENTS -> gst-plugins-bad-1.20.1.tar.xz/REQUIREMENTS
Changed
@@ -45,10 +45,6 @@ http://www.videolan.org/libdca.html musepack (for musepack audio codec/format) (http://www.musepack.net/) -nas (for the NAS sound server sink) - (http://radscan.com/nas.html) -libmms (for MMS protocol support) - (http://www.sf.net/projects/libmms) libamrnb (for AMR-NB support) (http://www.penguin.cz/~utx/amr) libamrwb (for AMR-WB support)
View file
gst-plugins-bad-1.18.6.tar.xz/data/meson.build -> gst-plugins-bad-1.20.1.tar.xz/data/meson.build
Changed
@@ -9,6 +9,7 @@ 'targets/file-extension/webm.gep', 'targets/file-extension/flv.gep', 'targets/file-extension/mp4.gep', + 'targets/file-extension/ts.gep', 'targets/file-extension/avi.gep',], ], ['online-services', ['targets/online-service/youtube.gep',]],
View file
gst-plugins-bad-1.20.1.tar.xz/data/targets/file-extension/ts.gep
Added
@@ -0,0 +1,33 @@ +[GStreamer Encoding Target] +name=ts +category=file-extension +description=Default target for files with a .ts extension + +[profile-default] +name=default +type=container +description=Default profile for files with a .ts extension. Suitable for streaming. +format=video/mpegts,systemstream=true,packetsize=188 + +[streamprofile-default-0] +parent=default +type=audio +format=audio/mpeg,mpegversion=4,base-profile=lc,rate={48000,96000},channels=2;audio/mpeg,mpegversion=4,base-profile=lc,rate={48000,96000} +restriction=audio/x-raw,channels=6,channel-mask=0x3f;audio/x-raw,channels=2 + +[streamprofile-default-1] +parent=default +type=video +format=video/x-h264 +preset=Profile YouTube +pass=0 + +[streamprofile-default-2] +parent=default +type=audio +format=audio/mpeg,mpegversion=4 + +[streamprofile-default-3] +parent=default +type=video +format=video/x-h264
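The `.gep` encoding target added above is a GKeyFile (INI-style) description of a container profile plus its stream profiles. As an illustration only — applications would normally load targets through GStreamer's encoding-profile API (e.g. `gst_encoding_target_load()`), not by hand — the container caps of such a file can be inspected with Python's standard-library `configparser`:

```python
# Illustration only: .gep encoding targets use an INI-style GKeyFile layout.
# This pulls the container format caps out of a target with configparser;
# it is NOT the GStreamer API, just a peek at the file format.
import configparser

GEP_SNIPPET = """\
[GStreamer Encoding Target]
name=ts
category=file-extension
description=Default target for files with a .ts extension

[profile-default]
name=default
type=container
description=Default profile for files with a .ts extension. Suitable for streaming.
format=video/mpegts,systemstream=true,packetsize=188
"""

def container_format(text: str) -> str:
    """Return the caps string of the container profile in a .gep target."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    for section in parser.sections():
        if parser.get(section, "type", fallback=None) == "container":
            return parser.get(section, "format")
    raise ValueError("no container profile found")

print(container_format(GEP_SNIPPET))  # video/mpegts,systemstream=true,packetsize=188
```

The `format=` value is an ordinary caps string, which is why the target above can pin MPEG-TS specifics such as `packetsize=188`.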
View file
gst-plugins-bad-1.18.6.tar.xz/docs/libs/mpegts/index.md -> gst-plugins-bad-1.20.1.tar.xz/docs/libs/mpegts/index.md
Changed
@@ -1,7 +1,104 @@ -# Mpeg TS helper library +# MPEG-TS helper library This library should be linked to by getting cflags and libs from gstreamer-plugins-bad-{{ gst_api_version.md }}.pc and adding -lgstmpegts-{{ gst_api_version.md }} to the library flags. > NOTE: This library API is considered *unstable* + +## Purpose + +The MPEG-TS helper library provides a collection of definitions, objects, +enumerations and functions to assist with dealing with the base *MPEG-2 +Transport Stream* (MPEG-TS) format (as defined by `ISO/IEC 13818-1` and `ITU-T +H.222.0`) and its derivatives (`DVB`, `ATSC`, `SCTE`, `ARIB`, `Blu-ray`, `AVCHD`, +...). + + +This library provides helpers for dealing with: + +* The various Section Information (SI) and Program-Specific Information (PSI), + handled with the [GstMpegtsSection](GstMpegtsSection) object and related + functions. + +* The various descriptors present in SI/PSI, handled with the + [GstMpegtsDescriptor](GstMpegtsDescriptor) object and related functions. + + +This library does not cover: + +* Parsing MPEG-TS packets (PSI or PES) and extracting the sections. One can use + an existing demuxer/parser element for this, or parse the packets + themselves. + +* Generating and multiplexing MPEG-TS packets and sections. One can use an existing + muxer element for this. + +Applications, or external elements, can interact with the existing MPEG-TS +elements via [messages](gst_message_new_mpegts_section) (to receive sections) or +[events](gst_mpegts_section_send_event) (to send sections). + +## Specification and References + +As much as possible, the information contained in this library is based on the +official Specification and/or References listed below: + +### `MPEG-TS` + +This is the base specification from which all variants are derived. It covers +the basic sections (for program signalling) and descriptors. All variants must +abide by this specification. 
+ +* `ISO/IEC 13818-1` and `ITU-T H.222.0`: *"Information technology – Generic + coding of moving pictures and associated audio information: Systems"*. The two + specifications are identical, the ITU one is more easily available (*nudge*). + +### `SMPTE-RA` : *SMPTE Registration Authority* +The official registration authority for MPEG-TS. This is used for the base +[Registration Descriptor](gst_mpegts_descriptor_parse_registration), which +allows a stream to be unambiguously identified when it is not specified in a standard +(yet). + +* <http://smpte-ra.org/> + +### `DVB` : *Digital Video Broadcasting* + +This standards body covers the variant of MPEG-TS used in Europe, Oceania, and +most of Asia and Africa. The standards are actually published by the `ETSI` +(European Telecommunications Standards Institute). + +* `ETSI EN 300 468`: *"Digital Video Broadcasting (DVB); Specification for + Service Information (SI) in DVB systems"*. Covers all the sections and + descriptors used in DVB variants. +* `ETSI EN 101 154`: *"Digital Video Broadcasting (DVB); Specification for the + use of Video and Audio Coding in Broadcast and Broadband + Applications"*. Provides more details about signalling/sections for audio/video + codecs. + +### `ATSC` : *Advanced Television Systems Committee* + +This set of standards covers the variants of MPEG-TS used in North America. +* `ATSC A/53-3` : *"ATSC Digital Television Standard, Part 3 – Service Multiplex + and Transport Subsystem Characteristics"*. How ATSC extends the base MPEG-TS. +* `ATSC A/65` : *"ATSC Standard: Program and System Information Protocol for + Terrestrial Broadcast and Cable"*. Covers all sections and descriptors used in + ATSC 1.0 variants. +* `ATSC A/90` : *"ATSC Data Broadcast Standard"*. Extensions for data transfer + (i.e. DSM-CC). +* `ATSC A/107` : *"ATSC 2.0 Standard"*. Adds a few more descriptors. +* `ATSC Code Points Registry` : The list of stream types, descriptor types, + etc. used by ATSC standards. 
+ +### `SCTE` : *Society of Cable Telecommunications Engineers* + +This set of standards evolved in parallel with ATSC in North America. Most of it +has been merged into ATSC and other standards since. + +* `ANSI/SCTE 35` : *"Digital Program Insertion Cueing Message for Cable"* + +### `DSM-CC` : "Digital Storage Media - Command & Control" + +This ISO standard is the base for asynchronously carrying "files" over MPEG-TS. + +* `ISO/IEC 13818-6` : *"Information technology — Generic coding of moving + pictures and associated audio information — Part 6: Extensions for DSM-CC"*.
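The library documentation above deliberately leaves raw packet parsing to demuxer/parser elements or to the application ("parse the packets themselves"). For readers who want to do the latter, here is a minimal sketch — plain Python, not part of the GStreamer API — of decoding the fixed 4-byte header of a 188-byte transport packet as laid out in `ISO/IEC 13818-1` / `ITU-T H.222.0`:

```python
# Minimal sketch (not GStreamer API): decode the 4-byte header of one
# 188-byte MPEG-TS packet per ISO/IEC 13818-1 / ITU-T H.222.0, section 2.4.3.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def parse_ts_header(packet: bytes) -> dict:
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not a valid 188-byte transport stream packet")
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),  # PSI section / PES packet begins here
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],  # 13-bit packet identifier
        "scrambling": (packet[3] >> 6) & 0x03,
        "adaptation_field": bool(packet[3] & 0x20),
        "has_payload": bool(packet[3] & 0x10),
        "continuity_counter": packet[3] & 0x0F,
    }

# Example: a PAT packet header (PID 0x0000) with payload_unit_start set.
pat = bytes([0x47, 0x40, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pat)["pid"])  # 0
```

Once the payload of such packets is reassembled into a section, that is the point where this helper library takes over, via `GstMpegtsSection` and the descriptor parsing functions.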
View file
gst-plugins-bad-1.20.1.tar.xz/docs/libs/play
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/docs/libs/play/index.md
Added
@@ -0,0 +1,3 @@ +# Play Library + +> NOTE: This library API is considered *unstable*
View file
gst-plugins-bad-1.20.1.tar.xz/docs/libs/play/sitemap.txt
Added
@@ -0,0 +1,1 @@ +gi-index
View file
gst-plugins-bad-1.18.6.tar.xz/docs/meson.build -> gst-plugins-bad-1.20.1.tar.xz/docs/meson.build
Changed
@@ -9,13 +9,29 @@ subdir_done() endif +if static_build + if get_option('doc').enabled() + error('Documentation enabled but not supported when building statically.') + endif + + message('Building statically, can\'t build the documentation') + subdir_done() +endif + +if not build_gir + if get_option('doc').enabled() + error('Documentation enabled but introspection not built.') + endif + + message('Introspection not built, won\'t build documentation requiring it') +endif + required_hotdoc_extensions = ['gi-extension', 'c-extension', 'gst-extension'] if gst_dep.type_name() == 'internal' gst_proj = subproject('gstreamer') plugins_cache_generator = gst_proj.get_variable('plugins_cache_generator') else - required_hotdoc_extensions += ['gst-extension'] - plugins_cache_generator = find_program(join_paths(gst_dep.get_pkgconfig_variable('libexecdir'), 'gstreamer-' + api_version, 'gst-plugins-doc-cache-generator'), + plugins_cache_generator = find_program(join_paths(gst_dep.get_variable('libexecdir'), 'gstreamer-' + api_version, 'gst-plugins-doc-cache-generator'), required: false) endif @@ -41,7 +57,7 @@ endif hotdoc_req = '>= 0.11.0' -hotdoc_version = run_command(hotdoc_p, '--version').stdout() +hotdoc_version = run_command(hotdoc_p, '--version', check: false).stdout() if not hotdoc_version.version_compare(hotdoc_req) if get_option('doc').enabled() error('Hotdoc version @0@ not found, got @1@'.format(hotdoc_req, hotdoc_version)) @@ -59,19 +75,9 @@ endif message('@0@ extensions not found, not building documentation requiring it'.format(extension)) - subdir_done() endif endforeach -if not build_gir - if get_option('doc').enabled() - error('Documentation enabled but introspection not built.') - endif - - message('Introspection not built, can\'t build the documentation') - subdir_done() -endif - build_hotdoc = true docconf = configuration_data() @@ -86,8 +92,7 @@ foreach f: [ 'gst/frei0r/frei0r.h', 'gst/mxf/mxful.c', - 'gst-libs/gst/mpegts/gstmpegts-private.h', - 
'gst-libs/gst/player/*-private.h', + 'gst-libs/gst/*/*-private.h', 'gst-libs/gst/codecparsers/nalutils.h', 'ext/closedcaption/bcd.h', 'ext/closedcaption/bit_slicer.[ch]', @@ -110,6 +115,7 @@ if build_gir libs = [ {'name': 'mpegts', 'gir': mpegts_gir, 'lib': gstmpegts_dep}, + {'name': 'play', 'gir': play_gir, 'lib': gstplay_dep}, {'name': 'player', 'gir': player_gir, 'lib': gstplayer_dep}, {'name': 'insertbin', 'gir': insertbin_gir, 'lib': gstinsertbin_dep}, {'name': 'codecparsers', 'lib': gstcodecparsers_dep}, @@ -209,10 +215,8 @@ with open("@0@") as f: print(':'.join(json.load(f).keys()), end='') -'''.format(plugins_cache)) - -assert(list_plugin_res.returncode() == 0, - 'Could not list plugins from @0@'.format(plugins_cache)) +'''.format(plugins_cache), + check: true) plugins_doc = [] foreach plugin_name: list_plugin_res.stdout().split(':') @@ -224,9 +228,10 @@ gst_index: 'plugins/index.md', gst_smart_index: true, gst_c_sources: [ - join_paths(root_rel, 'sys/*/*.[ch]'), + join_paths(root_rel, 'sys/*/*.[cmh]'), join_paths(root_rel, 'sys/*/*.cpp'), join_paths(root_rel, 'sys/*/*.cc'), + join_paths(root_rel, 'sys/*/*.mm'), join_paths(root_rel, 'sys/*/*.hh'), join_paths(root_rel, 'ext/*/*.[ch]'), join_paths(root_rel, 'ext/*/*.cpp'),
View file
gst-plugins-bad-1.18.6.tar.xz/docs/plugins/gst_plugins_cache.json -> gst-plugins-bad-1.20.1.tar.xz/docs/plugins/gst_plugins_cache.json
Changed
@@ -186,6 +186,212 @@ "tracers": {}, "url": "Unknown package origin" }, + "aes": { + "description": "AES encryption/decryption plugin", + "elements": { + "aesdec": { + "author": "Rabindra Harlalka <Rabindra.Harlalka@nice.com>", + "description": "AES buffer decryption", + "hierarchy": [ + "GstAesDec", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Generic/Filter", + "long-name": "aesdec", + "pad-templates": { + "sink": { + "caps": "ANY", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "ANY", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "cipher": { + "blurb": "cipher mode", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "aes-128-cbc (0)", + "mutable": "ready", + "readable": true, + "type": "GstAesCipher", + "writable": true + }, + "iv": { + "blurb": "AES encryption initialization vector (in hexadecimal). Length must equal AES block length (16 bytes)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + }, + "key": { + "blurb": "AES encryption key (in hexadecimal). Length (in bytes) must be equivalent to the number of bits in the key length : 16 bytes for AES 128 and 32 bytes for AES 256", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + }, + "per-buffer-padding": { + "blurb": "If true, pad each buffer using PKCS7 padding scheme. 
Otherwise, onlypad final buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + }, + "serialize-iv": { + "blurb": "Read initialization vector from first 16 bytes of first buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "rank": "primary" + }, + "aesenc": { + "author": "Rabindra Harlalka <Rabindra.Harlalka@nice.com>", + "description": "AES buffer encryption", + "hierarchy": [ + "GstAesEnc", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Generic/Filter", + "long-name": "aesenc", + "pad-templates": { + "sink": { + "caps": "ANY", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "ANY", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "cipher": { + "blurb": "cipher mode", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "aes-128-cbc (0)", + "mutable": "ready", + "readable": true, + "type": "GstAesCipher", + "writable": true + }, + "iv": { + "blurb": "AES encryption initialization vector (in hexadecimal). Length must equal AES block length (16 bytes)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + }, + "key": { + "blurb": "AES encryption key (in hexadecimal). 
Length (in bytes) must be equivalent to the number of bits in the key length : 16 bytes for AES 128 and 32 bytes for AES 256", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + }, + "per-buffer-padding": { + "blurb": "If true, pad each buffer using PKCS7 padding scheme. Otherwise, onlypad final buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + }, + "serialize-iv": { + "blurb": "Store initialization vector in first 16 bytes of first buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "rank": "primary" + } + }, + "filename": "gstaes", + "license": "LGPL", + "other-types": { + "GstAesCipher": { + "kind": "enum", + "values": [ + { + "desc": "AES 128 bit cipher key using CBC method", + "name": "aes-128-cbc", + "value": "0" + }, + { + "desc": "AES 256 bit cipher key using CBC method", + "name": "aes-256-cbc", + "value": "1" + } + ] + } + }, + "package": "GStreamer Bad Plug-ins", + "source": "gst-plugins-bad", + "tracers": {}, + "url": "Unknown package origin" + }, "aiff": { "description": "Create and parse Audio Interchange File Format (AIFF) files", "elements": { @@ -707,6 +913,764 @@ "tracers": {}, "url": "Unknown package origin" }, + "applemedia": { + "description": "Elements for capture and codec access on Apple OS X and iOS", + "elements": { + "atdec": { + "author": "Alessandro Decina <alessandro.d@gmail.com>", + "description": "AudioToolbox based audio decoder", + "hierarchy": [ + "GstATDec", + "GstAudioDecoder", + "GstElement", + "GstObject", + 
"GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Audio", + "long-name": "AudioToolbox based audio decoder", + "pad-templates": { + "sink": { + "caps": "audio/mpeg:\n mpegversion: 4\n framed: true\n channels: [ 1, 2147483647 ]\naudio/mpeg:\n mpegversion: 1\n layer: [ 1, 3 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "audio/x-raw:\n format: S16LE\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2147483647 ]\n layout: interleaved\naudio/x-raw:\n format: F32LE\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2147483647 ]\n layout: interleaved\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "marginal" + }, + "avfassetsrc": { + "author": "Andoni Morales Alastruey amorales@fluendo.com", + "description": "Read and decode samples from AVFoundation assets using the AVFAssetReader API", + "hierarchy": [ + "GstAVFAssetSrc", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstURIHandler" + ], + "klass": "Source/Codec", + "long-name": "Source and decoder for AVFoundation assets", + "pad-templates": { + "audio": { + "caps": "audio/x-raw:\n format: F32LE\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2 ]\n layout: interleaved\n", + "direction": "src", + "presence": "sometimes" + }, + "video": { + "caps": "video/x-raw:\n format: NV12\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n", + "direction": "src", + "presence": "sometimes" + } + }, + "properties": { + "uri": { + "blurb": "URI of the asset to read", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + } + }, + "rank": "secondary" + }, + "avfvideosrc": { + "author": "Ole André Vadla Ravnås <oleavr@soundrop.com>", + "description": "Reads frames from an iOS AVFoundation device", + "hierarchy": [ + "GstAVFVideoSrc", + "GstPushSrc", 
+ "GstBaseSrc", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Source/Video/Hardware", + "long-name": "Video Source (AVFoundation)", + "pad-templates": { + "src": { + "caps": "video/x-raw(memory:GLMemory):\n format: UYVY\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n texture-target: rectangle\nvideo/x-raw:\n format: { NV12, UYVY, YUY2 }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\nvideo/x-raw:\n format: BGRA\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "capture-screen": { + "blurb": "Enable screen capture functionality", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "capture-screen-cursor": { + "blurb": "Enable cursor capture while capturing screen", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "capture-screen-mouse-clicks": { + "blurb": "Enable mouse clicks capture while capturing screen", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "device-index": { + "blurb": "The zero-based device index", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "-1", + "max": "2147483647", + "min": "-1", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "device-name": { + "blurb": "The name of the currently opened 
capture device", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": false + }, + "device-type": { + "blurb": "The general type of a video capture device", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "default (0)", + "mutable": "null", + "readable": true, + "type": "GstAVFVideoSourceDeviceType", + "writable": true + }, + "do-stats": { + "blurb": "Enable logging of statistics", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "fps": { + "blurb": "Last measured framerate, if statistics are enabled", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "-1", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": false + }, + "orientation": { + "blurb": "The orientation of the video", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "default (0)", + "mutable": "null", + "readable": true, + "type": "GstAVFVideoSourceOrientation", + "writable": true + }, + "position": { + "blurb": "The position of the capture device (front or back-facing)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "default (0)", + "mutable": "null", + "readable": true, + "type": "GstAVFVideoSourcePosition", + "writable": true + } + }, + "rank": "secondary" + }, + "avsamplebufferlayersink": { + "author": "Matthew Waters <matthew@centricular.com>", + "description": "A videosink based on AVSampleBuffers", + "hierarchy": [ + 
"GstAVSampleVideoSink", + "GstVideoSink", + "GstBaseSink", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Sink/Video", + "long-name": "AV Sample video sink", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { RGB, BGR, ARGB, BGRA, ABGR, RGBA, YUY2, UYVY, NV12, I420 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + } + }, + "properties": { + "force-aspect-ratio": { + "blurb": "When enabled, scaling will respect original aspect ratio", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "layer": { + "blurb": "The CoreAnimation layer that can be placed in the render tree", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "mutable": "null", + "readable": true, + "type": "gpointer", + "writable": false + } + }, + "rank": "none" + }, + "vtdec": { + "author": "Ole André Vadla Ravnås <oleavr@soundrop.com>; Alessandro Decina <alessandro.d@gmail.com>", + "description": "Apple VideoToolbox Decoder", + "hierarchy": [ + "GstVtdec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Apple VideoToolbox decoder", + "pad-templates": { + "sink": { + "caps": "video/x-h264:\n stream-format: avc\n alignment: au\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\nvideo/mpeg:\n mpegversion: 2\n systemstream: false\n parsed: true\nimage/jpeg:\nvideo/x-prores:\n variant: { (string)standard, (string)hq, (string)lt, (string)proxy, (string)4444, (string)4444xq }\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n format: { NV12, AYUV64, RGBA64_LE, ARGB64_BE }\n width: [ 1, 
2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:GLMemory):\n format: { NV12, AYUV64, RGBA64_LE, ARGB64_BE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n texture-target: rectangle\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "secondary" + }, + "vtdec_hw": { + "author": "Ole André Vadla Ravnås <oleavr@soundrop.com>; Alessandro Decina <alessandro.d@gmail.com>", + "description": "Apple VideoToolbox Decoder", + "hierarchy": [ + "GstVtdecHw", + "GstVtdec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Apple VideoToolbox decoder (hardware only)", + "pad-templates": { + "sink": { + "caps": "video/x-h264:\n stream-format: avc\n alignment: au\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\nvideo/mpeg:\n mpegversion: 2\n systemstream: false\n parsed: true\nimage/jpeg:\nvideo/x-prores:\n variant: { (string)standard, (string)hq, (string)lt, (string)proxy, (string)4444, (string)4444xq }\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n format: { NV12, AYUV64, RGBA64_LE, ARGB64_BE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:GLMemory):\n format: { NV12, AYUV64, RGBA64_LE, ARGB64_BE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n texture-target: rectangle\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "primary + 1" + }, + "vtenc_h264": { + "author": "Ole André Vadla Ravnås <oleavr@soundrop.com>, Dominik Röttsches <dominik.rottsches@intel.com>", + "description": "H.264 encoder", + "hierarchy": [ + "vtenc_h264", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstPreset" + ], + "klass": 
"Codec/Encoder/Video/Hardware", + "long-name": "H.264 encoder", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { AYUV64, UYVY, NV12, I420, RGBA64_LE, ARGB64_BE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-h264:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive, (string)interleaved }\n stream-format: avc\n alignment: au\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "allow-frame-reordering": { + "blurb": "Whether to allow frame reordering or not", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "bitrate": { + "blurb": "Target video bitrate in kbps (0 = auto)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-keyframe-interval": { + "blurb": "Maximum number of frames between keyframes (0 = auto)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "max-keyframe-interval-duration": { + "blurb": "Maximum number of nanoseconds between keyframes (0 = no limit)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "quality": { + "blurb": "The desired 
compression quality", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0.5", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true + }, + "realtime": { + "blurb": "Configure the encoder for realtime output", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "rank": "primary" + }, + "vtenc_h264_hw": { + "author": "Ole André Vadla Ravnås <oleavr@soundrop.com>, Dominik Röttsches <dominik.rottsches@intel.com>", + "description": "H.264 (HW only) encoder", + "hierarchy": [ + "vtenc_h264_hw", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstPreset" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "H.264 (HW only) encoder", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { AYUV64, UYVY, NV12, I420, RGBA64_LE, ARGB64_BE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-h264:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive, (string)interleaved }\n stream-format: avc\n alignment: au\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "allow-frame-reordering": { + "blurb": "Whether to allow frame reordering or not", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "bitrate": { + "blurb": "Target video bitrate in kbps (0 = auto)", + "conditionally-available": false, + "construct": 
true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-keyframe-interval": { + "blurb": "Maximum number of frames between keyframes (0 = auto)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "max-keyframe-interval-duration": { + "blurb": "Maximum number of nanoseconds between keyframes (0 = no limit)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "quality": { + "blurb": "The desired compression quality", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0.5", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true + }, + "realtime": { + "blurb": "Configure the encoder for realtime output", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "rank": "primary" + }, + "vtenc_prores": { + "author": "Ole André Vadla Ravnås <oleavr@soundrop.com>, Dominik Röttsches <dominik.rottsches@intel.com>", + "description": "Apple ProRes encoder", + "hierarchy": [ + "vtenc_prores", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstPreset" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "Apple ProRes encoder", + "pad-templates": { + "sink": { + "caps": 
"video/x-raw:\n format: { AYUV64, UYVY, NV12, I420, RGBA64_LE, ARGB64_BE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-prores:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive, (string)interleaved }\n variant: { (string)standard, (string)4444xq, (string)4444, (string)hq, (string)lt, (string)proxy }\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "allow-frame-reordering": { + "blurb": "Whether to allow frame reordering or not", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "bitrate": { + "blurb": "Target video bitrate in kbps (0 = auto)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-keyframe-interval": { + "blurb": "Maximum number of frames between keyframes (0 = auto)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "max-keyframe-interval-duration": { + "blurb": "Maximum number of nanoseconds between keyframes (0 = no limit)", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "preserve-alpha": { + "blurb": "Video alpha values (non opaque) need to be 
preserved", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "quality": { + "blurb": "The desired compression quality", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "0.5", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true + }, + "realtime": { + "blurb": "Configure the encoder for realtime output", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "rank": "primary" + } + }, + "filename": "gstapplemedia", + "license": "LGPL", + "other-types": { + "GstAVFVideoSourceDeviceType": { + "kind": "enum", + "values": [ + { + "desc": "A built-in wide angle camera. 
These devices are suitable for general purpose use.", + "name": "wide-angle", + "value": "1" + }, + { + "desc": "A built-in camera device with a longer focal length than a wide-angle camera.", + "name": "telephoto", + "value": "2" + }, + { + "desc": "A dual camera device, combining built-in wide-angle and telephoto cameras that work together as a single capture device.", + "name": "dual", + "value": "3" + }, + { + "desc": "Default", + "name": "default", + "value": "0" + } + ] + }, + "GstAVFVideoSourceOrientation": { + "kind": "enum", + "values": [ + { + "desc": "Indicates that video should be oriented vertically, top at the top.", + "name": "portrait", + "value": "1" + }, + { + "desc": "Indicates that video should be oriented vertically, top at the bottom.", + "name": "portrat-upside-down", + "value": "2" + }, + { + "desc": "Indicates that video should be oriented horizontally, top on the left.", + "name": "landscape-right", + "value": "3" + }, + { + "desc": "Indicates that video should be oriented horizontally, top on the right.", + "name": "landscape-left", + "value": "4" + }, + { + "desc": "Default", + "name": "default", + "value": "0" + } + ] + }, + "GstAVFVideoSourcePosition": { + "kind": "enum", + "values": [ + { + "desc": "Front-facing camera", + "name": "front", + "value": "1" + }, + { + "desc": "Back-facing camera", + "name": "back", + "value": "2" + }, + { + "desc": "Default", + "name": "default", + "value": "0" + } + ] + } + }, + "package": "GStreamer Bad Plug-ins", + "source": "gst-plugins-bad", + "tracers": {}, + "url": "Unknown package origin" + }, "asfmux": { "description": "ASF Muxer Plugin", "elements": { @@ -894,7 +1858,7 @@ "long-name": "ASS/SSA Render", "pad-templates": { "src": { - "caps": "video/x-raw:\n format: { BGRx, RGBx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, I420, YV12, AYUV, YUY2, UYVY, v308, Y41B, Y42B, Y444, NV12, NV21, A420, YUV9, YVU9, IYU1, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 
2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { BGRx, RGBx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, I420, YV12, AYUV, YUY2, UYVY, v308, Y41B, Y42B, Y444, NV12, NV21, A420, YUV9, YVU9, IYU1, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, 
IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" }, @@ -904,7 +1868,7 @@ "presence": "always" }, "video_sink": { - "caps": "video/x-raw:\n format: { BGRx, RGBx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, I420, YV12, AYUV, YUY2, UYVY, v308, Y41B, Y42B, Y444, NV12, NV21, A420, YUV9, YVU9, IYU1, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { BGRx, RGBx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, I420, YV12, AYUV, YUY2, UYVY, v308, Y41B, Y42B, Y444, NV12, NV21, A420, YUV9, YVU9, IYU1, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, 
ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" } @@ -1054,6 +2018,20 @@ "type": "GstFraction", "writable": true }, + "output-buffer-size": { + "blurb": "Output block size in bytes, takes precedence over buffer duration when set to non zero", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "ready", + "readable": true, + "type": "guint", + "writable": true + }, "strict-buffer-size": { "blurb": "Discard the last samples at EOS or discont if they are too small to fill a buffer", "conditionally-available": false, @@ -1247,6 +2225,20 @@ "readable": true, "type": "gboolean", "writable": true + }, + "samplesperbuffer": { + "blurb": "Number of samples in each outgoing buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "240", + "max": "2147483647", + "min": "1", + "mutable": "null", + "readable": 
true, + "type": "gint", + "writable": true } }, "rank": "primary" @@ -2258,7 +3250,7 @@ "long-name": "Bluetooth A2DP sink", "pad-templates": { "sink": { - "caps": "audio/x-sbc:\n rate: { (int)16000, (int)32000, (int)44100, (int)48000 }\n channels: [ 1, 2 ]\n channel-mode: { (string)mono, (string)dual, (string)stereo, (string)joint }\n blocks: { (int)4, (int)8, (int)12, (int)16 }\n subbands: { (int)4, (int)8 }\nallocation-method: { (string)snr, (string)loudness }\n bitpool: [ 2, 64 ]\naudio/mpeg:\n", + "caps": "audio/x-sbc:\n rate: { (int)16000, (int)32000, (int)44100, (int)48000 }\n channels: [ 1, 2 ]\n channel-mode: { (string)mono, (string)dual, (string)stereo, (string)joint }\n blocks: { (int)4, (int)8, (int)12, (int)16 }\n subbands: { (int)4, (int)8 }\nallocation-method: { (string)snr, (string)loudness }\n bitpool: [ 2, 64 ]\naudio/mpeg:\naudio/x-ldac:\n", "direction": "sink", "presence": "always" } @@ -2318,7 +3310,7 @@ "long-name": "Bluetooth AVDTP sink", "pad-templates": { "sink": { - "caps": "application/x-rtp:\n media: audio\n payload: [ 96, 127 ]\n clock-rate: { (int)16000, (int)32000, (int)44100, (int)48000 }\n encoding-name: SBC\napplication/x-rtp:\n media: audio\n payload: 14\n clock-rate: 90000\napplication/x-rtp:\n media: audio\n payload: [ 96, 127 ]\n clock-rate: 90000\n encoding-name: MPA\n", + "caps": "application/x-rtp:\n media: audio\n payload: [ 96, 127 ]\n clock-rate: { (int)16000, (int)32000, (int)44100, (int)48000 }\n encoding-name: SBC\napplication/x-rtp:\n media: audio\n payload: 14\n clock-rate: 90000\napplication/x-rtp:\n media: audio\n payload: [ 96, 127 ]\n clock-rate: 90000\n encoding-name: MPA\napplication/x-rtp:\n media: audio\n payload: [ 96, 127 ]\n clock-rate: { (int)44100, (int)48000, (int)88200, (int)96000 }\n encoding-name: X-GST-LDAC\n", "direction": "sink", "presence": "always" } @@ -3207,12 +4199,12 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, 
A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, 
A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" }, "video_sink": { - 
"caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, 
ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 
0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" } @@ -3304,7 +4296,34 @@ "type": "GstAggregatorPad" } }, - "properties": {}, + "properties": { + "max-scheduled": { + "blurb": "Maximum number of buffers to queue for scheduling", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "30", + "max": "-1", + "min": "0", + "mutable": "ready", + "readable": true, + "type": "guint", + "writable": true + }, + "schedule": { + "blurb": "Schedule caption buffers so that exactly one is output per video frame", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + } + }, "rank": "none" }, "ccconverter": { @@ -3332,7 +4351,20 @@ "presence": "always" } }, - "properties": {}, + "properties": { + "cdp-mode": { + "blurb": "Select which CDP sections to store in CDP packets", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "cc-svc-info+cc-data+time-code", + "mutable": "null", + "readable": true, + "type": "GstCCConverterCDPMode", + "writable": true + } + }, "rank": "none" }, "ccextractor": { @@ -3397,17 +4429,42 @@ "long-name": "Line 21 CC Decoder", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { I420, YUY2, YVYU, UYVY, VYUY, v210 }\n interlace-mode: interleaved\n", + "caps": "video/x-raw:\n format: { I420, YUY2, YVYU, UYVY, VYUY, v210 }\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n format: { I420, YUY2, YVYU, UYVY, VYUY, v210 }\n interlace-mode: interleaved\n", + "caps": "video/x-raw:\n format: { I420, YUY2, YVYU, UYVY, VYUY, v210 }\n", "direction": "src", "presence": "always" } }, - "properties": {}, + "properties": { + "mode": { + "blurb": "Control whether and how detected CC meta should be inserted in the list 
of existing CC meta on a frame (if any).", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "add (0)", + "mutable": "null", + "readable": true, + "type": "GstLine21DecoderMode", + "writable": true + }, + "ntsc-only": { + "blurb": "Whether line 21 decoding should only be attempted when the input resolution matches NTSC", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + } + }, "rank": "none" }, "line21encoder": { @@ -3456,6 +4513,26 @@ "filename": "gstclosedcaption", "license": "LGPL", "other-types": { + "GstCCConverterCDPMode": { + "kind": "flags", + "values": [ + { + "desc": "Store time code information in CDP packets", + "name": "time-code", + "value": "0x00000001" + }, + { + "desc": "Store CC data in CDP packets", + "name": "cc-data", + "value": "0x00000002" + }, + { + "desc": "Store CC service information in CDP packets", + "name": "cc-svc-info", + "value": "0x00000004" + } + ] + }, "GstCeaCcOverlayWinHPos": { "kind": "enum", "values": [ @@ -3480,6 +4557,177 @@ "value": "3" } ] + }, + "GstLine21DecoderMode": { + "kind": "enum", + "values": [ + { + "desc": "add new CC meta on top of other CC meta, if any", + "name": "add", + "value": "0" + }, + { + "desc": "ignore CC if a CC meta was already present", + "name": "drop", + "value": "1" + }, + { + "desc": "replace existing CC meta", + "name": "replace", + "value": "2" + } + ] + } + }, + "package": "GStreamer Bad Plug-ins", + "source": "gst-plugins-bad", + "tracers": {}, + "url": "Unknown package origin" + }, + "codecalpha": { + "description": "CODEC Alpha Utilities", + "elements": { + "alphacombine": { + "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com>", + "description": "Use luma from an opaque stream as alpha plane on another", + "hierarchy": [ + "GstAlphaCombine", 
+ "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Demuxer", + "long-name": "Alpha Combiner", + "pad-templates": { + "alpha": { + "caps": "video/x-raw:\n format: { GRAY8, I420, NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "sink": { + "caps": "video/x-raw:\n format: { I420, NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n format: { A420, AV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "codecalphademux": { + "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com>", + "description": "Extract and expose as a stream the CODEC alpha.", + "hierarchy": [ + "GstCodecAlphaDemux", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Demuxer", + "long-name": "CODEC Alpha Demuxer", + "pad-templates": { + "alpha": { + "caps": "ANY", + "direction": "src", + "presence": "always" + }, + "sink": { + "caps": "ANY", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "ANY", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "vp8alphadecodebin": { + "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com>", + "description": "Wrapper bin to decode VP8 with alpha stream.", + "hierarchy": [ + "GstVp8AlphaDecodeBin", + "GstAlphaDecodeBin", + "GstBin", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstChildProxy" + ], + "klass": "Codec/Decoder/Video", + "long-name": "VP8 Alpha Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp8:\n codec-alpha: true\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "ANY", + 
"direction": "src", + "presence": "always" + } + }, + "rank": "primary + 10" + }, + "vp9alphadecodebin": { + "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com>", + "description": "Wrapper bin to decode VP9 with alpha stream.", + "hierarchy": [ + "GstVp9AlphaDecodeBin", + "GstAlphaDecodeBin", + "GstBin", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstChildProxy" + ], + "klass": "Codec/Decoder/Video", + "long-name": "VP9 Alpha Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp9:\n codec-alpha: true\n alignment: super-frame\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "ANY", + "direction": "src", + "presence": "always" + } + }, + "rank": "primary + 10" + } + }, + "filename": "gstcodecalpha", + "license": "LGPL", + "other-types": { + "GstAlphaDecodeBin": { + "hierarchy": [ + "GstAlphaDecodeBin", + "GstBin", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstChildProxy" + ], + "kind": "object" } }, "package": "GStreamer Bad Plug-ins", @@ -5260,11 +6508,84 @@ "d3d11": { "description": "Direct3D11 plugin", "elements": { - "d3d11convert": { - "author": "Seungha Yang <seungha.yang@navercorp.com>, Jeongki Kim <jeongki.kim@jeongki.kim>", - "description": "Resizes video and allow color conversion using D3D11", + "d3d11av1dec": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "Direct3D11/DXVA based AV1 video decoder", + "hierarchy": [ + "GstD3D11AV1Dec", + "GstAV1Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Direct3D11/DXVA AV1 NVIDIA GeForce RTX 3060 Laptop GPU Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-av1:\n alignment: frame\n profile: 0\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": 
"video/x-raw(memory:D3D11Memory):\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\nvideo/x-raw:\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "59071", + "max": "9223372036854775807", + "min": "-9223372036854775808", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": false + }, + "device-id": { + "blurb": "DXGI Device ID", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "9504", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": false + }, + "vendor-id": { + "blurb": "DXGI Vendor ID", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "4318", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": false + } + }, + "rank": "secondary" + }, + "d3d11colorconvert": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "Color conversion using Direct3D11", "hierarchy": [ "GstD3D11ColorConvert", + "GstD3D11BaseConvert", "GstD3D11BaseFilter", "GstBaseTransform", "GstElement", @@ -5272,27 +6593,226 @@ "GInitiallyUnowned", "GObject" ], - "klass": "Filter/Converter/Scaler/Video/Hardware", - "long-name": "Direct3D11 colorspace converter and scaler", + "klass": "Filter/Converter/Video/Hardware", + "long-name": "Direct3D11 colorspace converter", "pad-templates": { "sink": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 
]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, 
NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" } }, "rank": "none" }, - "d3d11download": { - "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "Downloads D3D11 texture memory into system memory", + "d3d11compositor": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "A Direct3D11 compositor bin", "hierarchy": [ - "GstD3D11Download", + "GstD3D11CompositorBin", + "GstBin", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstChildProxy" + ], + "klass": "Filter/Editor/Video/Compositor", + "long-name": "Direct3D11 Compositor Bin", + "pad-templates": { + "sink_%%u": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "request", + "type": "GstD3D11CompositorBinInput" + }, + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 
]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always", + "type": "GstD3D11CompositorBinPad" + } + }, + "properties": { + "adapter": { + "blurb": "Adapter index for creating device (-1 for default)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "-1", + "mutable": "ready", + "readable": true, + "type": "gint", + "writable": true + }, + "background": { + "blurb": "Background type", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "checker (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBackground", + "writable": true + }, + "emit-signals": { + "blurb": "Send signals", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "latency": { + "blurb": "Additional latency in live mode to allow upstream to take longer to produce buffers for the current position (in nanoseconds)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "min-upstream-latency": { + "blurb": "When sources with a higher latency are expected to be plugged in dynamically after the aggregator has started playing, this allows overriding the minimum latency reported by the initial source(s). 
This is only taken into account when larger than the actually reported minimum latency. (nanoseconds)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "mixer": { + "blurb": "The d3d11 mixer chain to use", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "mutable": "null", + "readable": true, + "type": "GstElement", + "writable": false + }, + "start-time": { + "blurb": "Start time to use if start-time-selection=set", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "18446744073709551615", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "start-time-selection": { + "blurb": "Decides which start time is output", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "zero (0)", + "mutable": "null", + "readable": true, + "type": "GstAggregatorStartTimeSelection", + "writable": true + } + }, + "rank": "secondary" + }, + "d3d11compositorelement": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "A Direct3D11 compositor", + "hierarchy": [ + "GstD3D11Compositor", + "GstVideoAggregator", + "GstAggregator", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstChildProxy" + ], + "klass": "Filter/Editor/Video/Compositor", + "long-name": "Direct3D11 Compositor", + "pad-templates": { + "sink_%%u": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { RGBA, BGRA }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "request", + "type": 
"GstD3D11CompositorPad" + }, + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { RGBA, BGRA }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always", + "type": "GstAggregatorPad" + } + }, + "properties": { + "adapter": { + "blurb": "Adapter index for creating device (-1 for default)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "-1", + "max": "2147483647", + "min": "-1", + "mutable": "ready", + "readable": true, + "type": "gint", + "writable": true + }, + "background": { + "blurb": "Background type", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "checker (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBackground", + "writable": true + } + }, + "rank": "none" + }, + "d3d11convert": { + "author": "Seungha Yang <seungha.yang@navercorp.com>, Jeongki Kim <jeongki.kim@jeongki.kim>", + "description": "Resizes video and allows color conversion using Direct3D11", + "hierarchy": [ + "GstD3D11Convert", + "GstD3D11BaseConvert", "GstD3D11BaseFilter", "GstBaseTransform", "GstElement", @@ -5300,44 +6820,60 @@ "GInitiallyUnowned", "GObject" ], - "klass": "Filter/Video", - "long-name": "Direct3D11 downloader", + "klass": "Filter/Converter/Scaler/Video/Hardware", + "long-name": "Direct3D11 colorspace converter and scaler", "pad-templates": { "sink": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 
]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 
2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" } }, + "properties": { + "add-borders": { + "blurb": "Add black borders if necessary to keep the display aspect ratio", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "playing", + "readable": true, + "type": "gboolean", + "writable": true + } + }, "rank": "none" }, - "d3d11h264dec": { - "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "A Direct3D11 based H.264 video decoder", + "d3d11deinterlace": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "A Direct3D11 based deinterlacer bin", "hierarchy": [ - "GstD3D11H264Dec", - "GstH264Decoder", - "GstVideoDecoder", + "GstD3D11DeinterlaceBin", + "GstBin", "GstElement", "GstObject", "GInitiallyUnowned", "GObject" ], - "klass": "Codec/Decoder/Video/Hardware", - "long-name": "Direct3D11 H.264 Intel(R) Iris(R) Plus Graphics Decoder", + "interfaces": [ + "GstChildProxy" + ], + "klass": "Filter/Effect/Video/Deinterlace/Hardware", + "long-name": "Direct3D11 Intel(R) Iris(R) Plus Graphics Deinterlacer Bin", "pad-templates": { "sink": { - "caps": "video/x-h264:\n stream-format: { (string)avc, (string)avc3, (string)byte-stream }\n alignment: au\n profile: { (string)high, (string)main, (string)constrained-baseline, (string)baseline }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 64, 4096 ]\n height: [ 64, 4096 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, 
I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: NV12\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 64, 4096 ]\n height: [ 64, 4096 ]\nvideo/x-raw:\n format: NV12\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 64, 4096 ]\n height: [ 64, 4096 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n 
width: [ 64, 8192 ]\n height: [ 64, 8192 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", "direction": "src", "presence": "always" } @@ -5371,6 +6907,42 @@ "type": "guint", "writable": false }, + "method": { + "blurb": "Deinterlace Method. You can set multiple methods as a flagset, and the element will select one of them automatically. If the deinterlacing device fails to deinterlace with the given method, it may fall back to another one", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "bob", + "mutable": "ready", + "readable": true, + "type": "GstD3D11DeinterlaceMethod", + "writable": true + }, + "qos": { + "blurb": "Handle Quality-of-Service events", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "supported-methods": { + "blurb": "Set of supported deinterlace methods by device", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "mocomp+adaptive+bob+blend", + "mutable": "null", + "readable": true, + "type": "GstD3D11DeinterlaceMethod", + "writable": false + }, "vendor-id": { "blurb": "DXGI Vendor ID", "conditionally-available": false, @@ -5386,30 +6958,29 @@ "writable": false } }, - "rank": "secondary" + 
"rank": "marginal" }, - "d3d11h265dec": { - "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "A Direct3D11 based H.265 video decoder", + "d3d11deinterlaceelement": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "A Direct3D11 based deinterlacer", "hierarchy": [ - "GstD3D11H265Dec", - "GstH265Decoder", - "GstVideoDecoder", + "GstD3D11Deinterlace", + "GstBaseTransform", "GstElement", "GstObject", "GInitiallyUnowned", "GObject" ], - "klass": "Codec/Decoder/Video/Hardware", - "long-name": "Direct3D11 H.265 Intel(R) Iris(R) Plus Graphics Decoder", + "klass": "Filter/Effect/Video/Deinterlace/Hardware", + "long-name": "Direct3D11 Intel(R) Iris(R) Plus Graphics Deinterlacer", "pad-templates": { "sink": { - "caps": "video/x-h265:\n stream-format: { (string)hev1, (string)hvc1, (string)byte-stream }\n alignment: au\n framerate: [ 0/1, 2147483647/1 ]\n profile: { (string)main, (string)main-10 }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw(memory:D3D11Memory):\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\nvideo/x-raw:\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -5443,6 +7014,30 @@ "type": "guint", "writable": false }, + "method": { + "blurb": "Deinterlace Method. 
You can set multiple methods as a flagset, and the element will select one of them automatically. If the deinterlacing device fails to deinterlace with the given method, it may fall back to another one", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "bob", + "mutable": "ready", + "readable": true, + "type": "GstD3D11DeinterlaceMethod", + "writable": true + }, + "supported-methods": { + "blurb": "Set of supported deinterlace methods by device", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "mocomp+adaptive+bob+blend", + "mutable": "null", + "readable": true, + "type": "GstD3D11DeinterlaceMethod", + "writable": false + }, "vendor-id": { "blurb": "DXGI Vendor ID", "conditionally-available": false, @@ -5458,13 +7053,13 @@ "writable": false } }, - "rank": "secondary" + "rank": "none" }, - "d3d11upload": { + "d3d11download": { "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "Uploads data into D3D11 texture memory", + "description": "Downloads Direct3D11 texture memory into system memory", "hierarchy": [ - "GstD3D11Upload", + "GstD3D11Download", "GstD3D11BaseFilter", "GstBaseTransform", "GstElement", @@ -5473,214 +7068,303 @@ "GObject" ], "klass": "Filter/Video", - "long-name": "Direct3D11 uploader", + "long-name": "Direct3D11 downloader", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, 
P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 
2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" } }, "rank": "none" }, - "d3d11videosink": { + "d3d11h264dec": { "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "A Direct3D11 based videosink", + "description": "Direct3D11/DXVA based H.264 video decoder", "hierarchy": [ - 
"GstD3D11VideoSinkBin", - "GstBin", + "GstD3D11H264Dec", + "GstH264Decoder", + "GstVideoDecoder", "GstElement", "GstObject", "GInitiallyUnowned", "GObject" ], - "interfaces": [ - "GstChildProxy", - "GstVideoOverlay", - "GstNavigation" - ], - "klass": "Sink/Video", - "long-name": "Direct3D11 video sink bin", + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Direct3D11/DXVA H.264 Intel(R) Iris(R) Plus Graphics Decoder", "pad-templates": { "sink": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-h264:\n stream-format: { (string)avc, (string)avc3, (string)byte-stream }\n alignment: au\n profile: { (string)high, (string)progressive-high, (string)constrained-high, (string)main, (string)constrained-baseline, (string)baseline }\n width: [ 1, 4096 ]\n height: [ 1, 4096 ]\n", "direction": "sink", "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: NV12\n width: [ 1, 4096 ]\n height: [ 1, 4096 ]\nvideo/x-raw:\n format: NV12\n width: [ 1, 4096 ]\n height: [ 1, 4096 ]\n", + "direction": "src", + "presence": "always" } }, "properties": { - "adapter": { - "blurb": "Adapter 
index for creating device (-1 for default)", + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "-1", - "max": "2147483647", - "min": "-1", - "mutable": "ready", - "readable": true, - "type": "gint", - "writable": true - }, - "async": { - "blurb": "Go asynchronously to PAUSED", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "true", + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", "mutable": "null", "readable": true, - "type": "gboolean", - "writable": true + "type": "gint64", + "writable": false }, - "blocksize": { - "blurb": "Size in bytes to pull per buffer (0 = default)", + "device-id": { + "blurb": "DXGI Device ID", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "4096", + "default": "35410", "max": "-1", "min": "0", "mutable": "null", "readable": true, "type": "guint", - "writable": true - }, - "enable-last-sample": { - "blurb": "Enable the last-sample property", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "true", - "mutable": "null", - "readable": true, - "type": "gboolean", - "writable": true + "writable": false }, - "enable-navigation-events": { - "blurb": "When enabled, navigation events are sent upstream", + "vendor-id": { + "blurb": "DXGI Vendor ID", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "true", + "default": "32902", + "max": "-1", + "min": "0", "mutable": "null", "readable": true, - "type": "gboolean", - "writable": true + "type": "guint", + "writable": false + } + }, + "rank": "secondary" + }, + "d3d11h265dec": { + "author": "Seungha Yang 
<seungha.yang@navercorp.com>", + "description": "Direct3D11/DXVA based H.265 video decoder", + "hierarchy": [ + "GstD3D11H265Dec", + "GstH265Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Direct3D11/DXVA H.265 Intel(R) Iris(R) Plus Graphics Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-h265:\n stream-format: { (string)hev1, (string)hvc1, (string)byte-stream }\n alignment: au\n profile: { (string)main, (string)main-10 }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n", + "direction": "sink", + "presence": "always" }, - "force-aspect-ratio": { - "blurb": "When enabled, scaling will respect original aspect ratio", + "src": { + "caps": "video/x-raw:\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n\nvideo/x-raw(format:Interlaced):\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n interlace-mode: alternate\n\nvideo/x-raw(memory:D3D11Memory):\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "true", + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", "mutable": "null", "readable": true, - "type": "gboolean", - "writable": true + "type": "gint64", + "writable": false }, - "fullscreen": { - "blurb": "Ignored when \"fullscreen-toggle-mode\" does not include \"property\"", + "device-id": { + "blurb": "DXGI Device ID", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "false", + "default": "35410", + "max": "-1", + "min": "0", "mutable": "null", "readable": true, - "type": "gboolean", - "writable": 
true + "type": "guint", + "writable": false }, - "fullscreen-toggle-mode": { - "blurb": "Full screen toggle mode used to trigger fullscreen mode change", + "vendor-id": { + "blurb": "DXGI Vendor ID", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "none", + "default": "32902", + "max": "-1", + "min": "0", "mutable": "null", "readable": true, - "type": "GstD3D11WindowFullscreenToggleMode", - "writable": true + "type": "guint", + "writable": false + } + }, + "rank": "secondary" + }, + "d3d11mpeg2dec": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "Direct3D11/DXVA based MPEG2 video decoder", + "hierarchy": [ + "GstD3D11Mpeg2Dec", + "GstMpeg2Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Direct3D11/DXVA MPEG2 Intel(R) Iris(R) Plus Graphics Decoder", + "pad-templates": { + "sink": { + "caps": "video/mpeg:\n mpegversion: 2\n systemstream: false\n profile: { (string)main, (string)simple }\n width: [ 1, 1920 ]\n height: [ 1, 1920 ]\n", + "direction": "sink", + "presence": "always" }, - "last-sample": { - "blurb": "The last sample received in the sink", + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: NV12\n width: [ 1, 1920 ]\n height: [ 1, 1920 ]\nvideo/x-raw:\n format: NV12\n width: [ 1, 1920 ]\n height: [ 1, 1920 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", "mutable": "null", "readable": true, - "type": "GstSample", + "type": "gint64", "writable": false }, - "max-bitrate": { - "blurb": "The maximum bits per second to render (0 = 
disabled)", + "device-id": { + "blurb": "DXGI Device ID", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "18446744073709551615", + "default": "35410", + "max": "-1", "min": "0", "mutable": "null", "readable": true, - "type": "guint64", - "writable": true - }, - "max-lateness": { - "blurb": "Maximum number of nanoseconds that a buffer can be late before it is dropped (-1 unlimited)", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "5000000", - "max": "9223372036854775807", - "min": "-1", - "mutable": "null", - "readable": true, - "type": "gint64", - "writable": true + "type": "guint", + "writable": false }, - "processing-deadline": { - "blurb": "Maximum processing deadline in nanoseconds", + "vendor-id": { + "blurb": "DXGI Vendor ID", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "15000000", - "max": "18446744073709551615", + "default": "32902", + "max": "-1", "min": "0", "mutable": "null", "readable": true, - "type": "guint64", - "writable": true + "type": "guint", + "writable": false + } + }, + "rank": "secondary" + }, + "d3d11scale": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "Resizes video using Direct3D11", + "hierarchy": [ + "GstD3D11Scale", + "GstD3D11BaseConvert", + "GstD3D11BaseFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Converter/Video/Scaler/Hardware", + "long-name": "Direct3D11 scaler", + "pad-templates": { + "sink": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n 
framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" }, - "qos": { - "blurb": "Generate Quality-of-Service events upstream", + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "add-borders": { + "blurb": "Add black borders if necessary to keep the display aspect ratio", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, "default": "true", - "mutable": "null", + "mutable": "playing", "readable": true, "type": "gboolean", "writable": true - }, - "render-delay": { - "blurb": "Additional render delay of the sink in nanoseconds", + } + }, + "rank": "none" + }, + "d3d11screencapturesrc": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "Capture desktop image by using Desktop Duplication API", + "hierarchy": [ + "GstD3D11ScreenCaptureSrc", + "GstBaseSrc", + "GstElement", + 
"GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Source/Video", + "long-name": "Direct3D11 screen capture src", + "pad-templates": { + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: BGRA\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: BGRA\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "monitor-handle": { + "blurb": "A HMONITOR handle of monitor to capture", "conditionally-available": false, "construct": false, "construct-only": false, @@ -5688,79 +7372,69 @@ "default": "0", "max": "18446744073709551615", "min": "0", - "mutable": "null", + "mutable": "ready", "readable": true, "type": "guint64", "writable": true }, - "show-preroll-frame": { - "blurb": "Whether to render video frames during preroll", - "conditionally-available": false, - "construct": true, - "construct-only": false, - "controllable": false, - "default": "true", - "mutable": "null", - "readable": true, - "type": "gboolean", - "writable": true - }, - "stats": { - "blurb": "Sink Statistics", + "monitor-index": { + "blurb": "Zero-based index for monitor to capture (-1 = primary monitor)", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "application/x-gst-base-sink-stats, average-rate=(double)0, dropped=(guint64)0, rendered=(guint64)0;", - "mutable": "null", - "readable": true, - "type": "GstStructure", - "writable": false - }, - "sync": { - "blurb": "Sync on the clock", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "true", - "mutable": "null", + "default": "-1", + "max": "2147483647", + "min": "-1", + "mutable": "ready", "readable": true, - "type": "gboolean", + "type": "gint", "writable": true }, - "throttle-time": { - "blurb": "The time to keep between rendered 
buffers (0 = disabled)", + "show-cursor": { + "blurb": "Whether to show mouse cursor", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "18446744073709551615", - "min": "0", + "default": "false", "mutable": "null", "readable": true, - "type": "guint64", + "type": "gboolean", "writable": true + } + }, + "rank": "none" + }, + "d3d11upload": { + "author": "Seungha Yang <seungha.yang@navercorp.com>", + "description": "Uploads data into Direct3D11 texture memory", + "hierarchy": [ + "GstD3D11Upload", + "GstD3D11BaseFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Video", + "long-name": "Direct3D11 uploader", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, 
YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" }, - "ts-offset": { - "blurb": "Timestamp offset in nanoseconds", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "0", - "max": "9223372036854775807", - "min": "-9223372036854775808", - "mutable": "null", - "readable": true, - "type": "gint64", - "writable": true + "src": { + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 
]\n", + "direction": "src", + "presence": "always" } }, - "rank": "primary" + "rank": "none" }, - "d3d11videosinkelement": { + "d3d11videosink": { "author": "Seungha Yang <seungha.yang@navercorp.com>", "description": "A Direct3D11 based videosink", "hierarchy": [ @@ -5780,7 +7454,7 @@ "long-name": "Direct3D11 video sink", "pad-templates": { "sink": { - "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, VUYA, NV12, P010_10LE, P016_LE, I420, I420_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:D3D11Memory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition):\n format: { BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, P010_10LE, 
P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, GRAY8, GRAY16_LE, Y410 }\n width: [ 1, 16384 ]\n height: [ 1, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" } @@ -5800,6 +7474,18 @@ "type": "gint", "writable": true }, + "draw-on-shared-texture": { + "blurb": "Draw on user provided shared texture instead of window. When enabled, user can pass application's own texture to sink by using \"draw\" action signal on \"begin-draw\" signal handler, so that sink can draw video data on application's texture. Supported texture formats for user texture are DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_FORMAT_B8G8R8A8_UNORM, and DXGI_FORMAT_R10G10B10A2_UNORM.", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + }, "enable-navigation-events": { "blurb": "When enabled, navigation events are sent upstream", "conditionally-available": false, @@ -5849,11 +7535,41 @@ "writable": true } }, - "rank": "none" + "rank": "primary", + "signals": { + "begin-draw": { + "args": [], + "return-type": "void", + "when": "last" + }, + "draw": { + "action": true, + "args": [ + { + "name": "arg0", + "type": "gpointer" + }, + { + "name": "arg1", + "type": "guint" + }, + { + "name": "arg2", + "type": "guint64" + }, + { + "name": "arg3", + "type": "guint64" + } + ], + "return-type": "gboolean", + "when": "last" + } + } }, "d3d11vp8dec": { "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "A Direct3D11 based VP8 video decoder", + "description": "Direct3D11/DXVA based VP8 video decoder", "hierarchy": [ "GstD3D11Vp8Dec", "GstVp8Decoder", @@ -5864,32 +7580,32 @@ "GObject" ], "klass": "Codec/Decoder/Video/Hardware", - "long-name": "Direct3D11 VP8 Intel(R) Iris(R) Plus Graphics Decoder", + "long-name": "Direct3D11/DXVA VP8 
Intel(R) Iris(R) Plus Graphics Decoder", "pad-templates": { "sink": { - "caps": "video/x-vp8:\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 64, 4096 ]\n height: [ 64, 4096 ]\n", + "caps": "video/x-vp8:\n width: [ 1, 4096 ]\n height: [ 1, 4096 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw(memory:D3D11Memory):\n framerate: [ 0/1, 2147483647/1 ]\n format: NV12\n width: [ 64, 4096 ]\n height: [ 64, 4096 ]\nvideo/x-raw:\n framerate: [ 0/1, 2147483647/1 ]\n format: NV12\n width: [ 64, 4096 ]\n height: [ 64, 4096 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: NV12\n width: [ 1, 4096 ]\n height: [ 1, 4096 ]\nvideo/x-raw:\n format: NV12\n width: [ 1, 4096 ]\n height: [ 1, 4096 ]\n", "direction": "src", "presence": "always" } }, "properties": { - "adapter": { - "blurb": "DXGI Adapter index for creating device", + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "-1", - "min": "0", + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", "mutable": "null", "readable": true, - "type": "guint", + "type": "gint64", "writable": false }, "device-id": { @@ -5925,7 +7641,7 @@ }, "d3d11vp9dec": { "author": "Seungha Yang <seungha.yang@navercorp.com>", - "description": "A Direct3D11 based VP9 video decoder", + "description": "Direct3D11/DXVA based VP9 video decoder", "hierarchy": [ "GstD3D11Vp9Dec", "GstVp9Decoder", @@ -5936,32 +7652,32 @@ "GObject" ], "klass": "Codec/Decoder/Video/Hardware", - "long-name": "Direct3D11 VP9 Intel(R) Iris(R) Plus Graphics Decoder", + "long-name": "Direct3D11/DXVA VP9 Intel(R) Iris(R) Plus Graphics Decoder", "pad-templates": { "sink": { - "caps": "video/x-vp9:\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-vp9:\n alignment: frame\n profile: { 
(string)0, (string)2 }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw(memory:D3D11Memory):\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\nvideo/x-raw:\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-raw(memory:D3D11Memory):\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\nvideo/x-raw:\n format: { NV12, P010_10LE }\n width: [ 1, 8192 ]\n height: [ 1, 8192 ]\n", "direction": "src", "presence": "always" } }, "properties": { - "adapter": { - "blurb": "DXGI Adapter index for creating device", + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", "conditionally-available": false, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "-1", - "min": "0", + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", "mutable": "null", "readable": true, - "type": "guint", + "type": "gint64", "writable": false }, "device-id": { @@ -5999,6 +7715,18 @@ "filename": "gstd3d11", "license": "LGPL", "other-types": { + "GstD3D11BaseConvert": { + "hierarchy": [ + "GstD3D11BaseConvert", + "GstD3D11BaseFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object" + }, "GstD3D11BaseFilter": { "hierarchy": [ "GstD3D11BaseFilter", @@ -6026,6 +7754,698 @@ } } }, + "GstD3D11CompositorBackground": { + "kind": "enum", + "values": [ + { + "desc": "Checker pattern", + "name": "checker", + "value": "0" + }, + { + "desc": "Black", + "name": "black", + "value": "1" + }, + { + "desc": "White", + "name": "white", + "value": "2" + }, + { + "desc": "Transparent Background to enable further compositing", + "name": "transparent", + "value": "3" + } + ] + }, + "GstD3D11CompositorBinInput": { + 
"hierarchy": [ + "GstD3D11CompositorBinInput", + "GstD3D11CompositorBinPad", + "GstGhostPad", + "GstProxyPad", + "GstPad", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { + "alpha": { + "blurb": "Alpha of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true + }, + "blend-dest-alpha": { + "blurb": "Blend factor for destination alpha, \"*-color\" values are not allowed", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "zero (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "blend-dest-rgb": { + "blurb": "Blend factor for destination RGB", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "zero (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "blend-factor-alpha": { + "blurb": "Blend factor for alpha component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-factor-blue": { + "blurb": "Blend factor for blue component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-factor-green": { + "blurb": "Blend factor for green component when blend type is 
\"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-factor-red": { + "blurb": "Blend factor for red component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-op-alpha": { + "blurb": "Blend equation for alpha", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "add (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlendOperation", + "writable": true + }, + "blend-op-rgb": { + "blurb": "Blend equation for RGB", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "add (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlendOperation", + "writable": true + }, + "blend-src-alpha": { + "blurb": "Blend factor for source alpha, \"*-color\" values are not allowed", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "zero (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "blend-src-rgb": { + "blurb": "Blend factor for source RGB", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "zero (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "height": { + "blurb": "Height of the picture", + "conditionally-available": false, + "construct": 
false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "max-last-buffer-repeat": { + "blurb": "Repeat last buffer for time (in ns, -1=until EOS), behaviour on EOS is not affected", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "playing", + "readable": true, + "type": "guint64", + "writable": true + }, + "repeat-after-eos": { + "blurb": "Repeat the last frame after EOS until all pads are EOS", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "sizing-policy": { + "blurb": "Sizing policy to use for image scaling", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "none (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorSizingPolicy", + "writable": true + }, + "width": { + "blurb": "Width of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "xpos": { + "blurb": "X position of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "ypos": { + "blurb": "Y position of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": 
true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "zorder": { + "blurb": "Z Order of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + } + } + }, + "GstD3D11CompositorBinPad": { + "hierarchy": [ + "GstD3D11CompositorBinPad", + "GstGhostPad", + "GstProxyPad", + "GstPad", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { + "emit-signals": { + "blurb": "Send signals to signal data consumption", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "signals": { + "buffer-consumed": { + "args": [ + { + "name": "arg0", + "type": "GstBuffer" + } + ], + "return-type": "void", + "when": "first" + } + } + }, + "GstD3D11CompositorBlend": { + "kind": "enum", + "values": [ + { + "desc": "The blend factor is (0, 0, 0, 0)", + "name": "zero", + "value": "0" + }, + { + "desc": "The blend factor is (1, 1, 1, 1)", + "name": "one", + "value": "1" + }, + { + "desc": "The blend factor is (Rs, Gs, Bs, As)", + "name": "src-color", + "value": "2" + }, + { + "desc": "The blend factor is (1 - Rs, 1 - Gs, 1 - Bs, 1 - As)", + "name": "inv-src-color", + "value": "3" + }, + { + "desc": "The blend factor is (As, As, As, As)", + "name": "src-alpha", + "value": "4" + }, + { + "desc": "The blend factor is (1 - As, 1 - As, 1 - As, 1 - As)", + "name": "inv-src-alpha", + "value": "5" + }, + { + "desc": "The blend factor is (Ad, Ad, Ad, Ad)", + "name": "dest-alpha", + "value": "6" + }, + { + "desc": "The blend factor is (1 - Ad, 1 - Ad, 1 - Ad, 1 - Ad)", + "name": "inv-dest-alpha", + "value": 
"7" + }, + { + "desc": "The blend factor is (Rd, Gd, Bd, Ad)", + "name": "dest-color", + "value": "8" + }, + { + "desc": "The blend factor is (1 - Rd, 1 - Gd, 1 - Bd, 1 - Ad)", + "name": "inv-dest-color", + "value": "9" + }, + { + "desc": "The blend factor is (f, f, f, 1); where f = min(As, 1 - Ad)", + "name": "src-alpha-sat", + "value": "10" + }, + { + "desc": "User defined blend factor", + "name": "blend-factor", + "value": "11" + }, + { + "desc": "Inverse of user defined blend factor", + "name": "inv-blend-factor", + "value": "12" + } + ] + }, + "GstD3D11CompositorBlendOperation": { + "kind": "enum", + "values": [ + { + "desc": "Add source and background", + "name": "add", + "value": "0" + }, + { + "desc": "Subtract source from background", + "name": "subtract", + "value": "1" + }, + { + "desc": "Subtract background from source", + "name": "rev-subtract", + "value": "2" + }, + { + "desc": "Minimum of source and background", + "name": "min", + "value": "3" + }, + { + "desc": "Maximum of source and background", + "name": "max", + "value": "4" + } + ] + }, + "GstD3D11CompositorPad": { + "hierarchy": [ + "GstD3D11CompositorPad", + "GstVideoAggregatorPad", + "GstAggregatorPad", + "GstPad", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { + "alpha": { + "blurb": "Alpha of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "1", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true + }, + "blend-dest-alpha": { + "blurb": "Blend factor for destination alpha, \"*-color\" values are not allowed", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "inv-src-alpha (5)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "blend-dest-rgb": { + "blurb": "Blend factor for 
destination RGB", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "inv-src-alpha (5)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "blend-factor-alpha": { + "blurb": "Blend factor for alpha component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "1", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-factor-blue": { + "blurb": "Blend factor for blue component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "1", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-factor-green": { + "blurb": "Blend factor for green component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "1", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-factor-red": { + "blurb": "Blend factor for red component when blend type is \"blend-factor\" or \"inv-blend-factor\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "1", + "max": "1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "blend-op-alpha": { + "blurb": "Blend equation for alpha", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "add (0)", + "mutable": "null", + "readable": true, + "type": 
"GstD3D11CompositorBlendOperation", + "writable": true + }, + "blend-op-rgb": { + "blurb": "Blend equation for RGB", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "add (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlendOperation", + "writable": true + }, + "blend-src-alpha": { + "blurb": "Blend factor for source alpha, \"*-color\" values are not allowed", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "one (1)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "blend-src-rgb": { + "blurb": "Blend factor for source RGB", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "src-alpha (4)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorBlend", + "writable": true + }, + "height": { + "blurb": "Height of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "sizing-policy": { + "blurb": "Sizing policy to use for image scaling", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "none (0)", + "mutable": "null", + "readable": true, + "type": "GstD3D11CompositorSizingPolicy", + "writable": true + }, + "width": { + "blurb": "Width of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "xpos": { + "blurb": "X position of the picture", + 
"conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "ypos": { + "blurb": "Y position of the picture", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "0", + "max": "2147483647", + "min": "-2147483648", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + } + } + }, + "GstD3D11CompositorSizingPolicy": { + "kind": "enum", + "values": [ + { + "desc": "None: Image is scaled to fill configured destination rectangle without padding or keeping the aspect ratio", + "name": "none", + "value": "0" + }, + { + "desc": "Keep Aspect Ratio: Image is scaled to fit destination rectangle specified by GstCompositorPad:{xpos, ypos, width, height} with preserved aspect ratio. Resulting image will be centered in the destination rectangle with padding if necessary", + "name": "keep-aspect-ratio", + "value": "1" + } + ] + }, + "GstD3D11DeinterlaceMethod": { + "kind": "flags", + "values": [ + { + "desc": "Blend: Blending top/bottom field pictures into one frame. Framerate will be preserved (e.g., 60i -> 30p)", + "name": "blend", + "value": "0x00000001" + }, + { + "desc": "Bob: Interpolating missing lines by using the adjacent lines. Framerate will be doubled (e,g, 60i -> 60p)", + "name": "bob", + "value": "0x00000002" + }, + { + "desc": "Adaptive: Interpolating missing lines by using spatial/temporal references. Framerate will be doubled (e,g, 60i -> 60p)", + "name": "adaptive", + "value": "0x00000004" + }, + { + "desc": "Motion Compensation: Recreating missing lines by using motion vector. 
Framerate will be doubled (e,g, 60i -> 60p)", + "name": "mocomp", + "value": "0x00000008" + } + ] + }, "GstD3D11WindowFullscreenToggleMode": { "kind": "flags", "values": [ @@ -6370,7 +8790,29 @@ "writable": true } }, - "rank": "none" + "rank": "none", + "signals": { + "get-fragment-stream": { + "args": [ + { + "name": "arg0", + "type": "gchararray" + } + ], + "return-type": "GOutputStream", + "when": "last" + }, + "get-playlist-stream": { + "args": [ + { + "name": "arg0", + "type": "gchararray" + } + ], + "return-type": "GOutputStream", + "when": "last" + } + } } }, "filename": "gstdash", @@ -6670,7 +9112,7 @@ }, "chopmydata": { "author": "David Schleef <ds@schleef.org>", - "description": "FIXME", + "description": "Split up a stream into randomly-sized buffers", "hierarchy": [ "GstChopMyData", "GstElement", @@ -6679,7 +9121,7 @@ "GObject" ], "klass": "Generic", - "long-name": "FIXME", + "long-name": "Chop my data", "pad-templates": { "sink": { "caps": "ANY", @@ -6970,6 +9412,18 @@ "type": "GstFlowReturn", "writable": true }, + "ignore-eos": { + "blurb": "Whether to ignore GST_FLOW_EOS", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, "ignore-error": { "blurb": "Whether to ignore GST_FLOW_ERROR", "conditionally-available": false, @@ -7009,6 +9463,339 @@ }, "rank": "none" }, + "fakeaudiosink": { + "author": "Philippe Normand <philn@igalia.com>", + "description": "Fake audio renderer", + "hierarchy": [ + "GstFakeAudioSink", + "GstBin", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstChildProxy", + "GstStreamVolume" + ], + "klass": "Audio/Sink", + "long-name": "Fake Audio Sink", + "pad-templates": { + "sink": { + "caps": "audio/x-raw:\n format: { F64LE, F64BE, F32LE, F32BE, S32LE, S32BE, U32LE, U32BE, S24_32LE, S24_32BE, U24_32LE, U24_32BE, S24LE, S24BE, 
U24LE, U24BE, S20LE, S20BE, U20LE, U20BE, S18LE, S18BE, U18LE, U18BE, S16LE, S16BE, U16LE, U16BE, S8, U8 }\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2147483647 ]\n", + "direction": "sink", + "presence": "always" + } + }, + "properties": { + "async": { + "blurb": "Go asynchronously to PAUSED", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "blocksize": { + "blurb": "Size in bytes to pull per buffer (0 = default)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "4096", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "can-activate-pull": { + "blurb": "Can activate in pull mode", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "can-activate-push": { + "blurb": "Can activate in push mode", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "drop-out-of-segment": { + "blurb": "Drop and don't render / hand off out-of-segment buffers", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "dump": { + "blurb": "Dump buffer contents to stdout", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "playing", + "readable": true, + "type": "gboolean", + "writable": true + }, + 
"enable-last-sample": { + "blurb": "Enable the last-sample property", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "last-message": { + "blurb": "The message describing current status", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": false + }, + "last-sample": { + "blurb": "The last sample received in the sink", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "mutable": "null", + "readable": true, + "type": "GstSample", + "writable": false + }, + "max-bitrate": { + "blurb": "The maximum bits per second to render (0 = disabled)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "max-lateness": { + "blurb": "Maximum number of nanoseconds that a buffer can be late before it is dropped (-1 unlimited)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "18446744073709551615", + "max": "9223372036854775807", + "min": "-1", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": true + }, + "mute": { + "blurb": "Mute the audio channel without changing the volume", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "num-buffers": { + "blurb": "Number of buffers to accept going EOS", + "conditionally-available": 
false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "-1", + "max": "2147483647", + "min": "-1", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "processing-deadline": { + "blurb": "Maximum processing time for a buffer in nanoseconds", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "20000000", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "qos": { + "blurb": "Generate Quality-of-Service events upstream", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "render-delay": { + "blurb": "Additional render delay of the sink in nanoseconds", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "signal-handoffs": { + "blurb": "Send a signal before unreffing the buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "silent": { + "blurb": "Don't produce last_message events", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "playing", + "readable": true, + "type": "gboolean", + "writable": true + }, + "state-error": { + "blurb": "Generate a state change error", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "none (0)", + 
"mutable": "null", + "readable": true, + "type": "GstFakeSinkStateError", + "writable": true + }, + "stats": { + "blurb": "Sink Statistics", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "application/x-gst-base-sink-stats, average-rate=(double)0, dropped=(guint64)0, rendered=(guint64)0;", + "mutable": "null", + "readable": true, + "type": "GstStructure", + "writable": false + }, + "sync": { + "blurb": "Sync on the clock", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "throttle-time": { + "blurb": "The time to keep between rendered buffers (0 = disabled)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "18446744073709551615", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint64", + "writable": true + }, + "ts-offset": { + "blurb": "Timestamp offset in nanoseconds", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "9223372036854775807", + "min": "-9223372036854775808", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": true + }, + "volume": { + "blurb": "The audio volume, 1.0=100%%", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "10", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true + } + }, + "rank": "none" + }, "fakevideosink": { "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com>", "description": "Fake video display that allows zero-copy", @@ -7027,7 +9814,7 @@ "long-name": "Fake Video Sink", "pad-templates": { "sink": { - "caps": "video/x-raw(ANY):\n 
format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", 
"direction": "sink", "presence": "always" } @@ -7561,6 +10348,42 @@ "rank": "none", "signals": {} }, + "videocodectestsink": { + "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com", + "description": "Sink to test video CODEC conformance", + "hierarchy": [ + "GstVideoCodecTestSink", + "GstBaseSink", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Debug/video/Sink", + "long-name": "Video CODEC Test Sink", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { I420, I420_10LE, NV12 }\n", + "direction": "sink", + "presence": "always" + } + }, + "properties": { + "location": { + "blurb": "File path to store non-padded I420 stream (optional).", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": true + } + }, + "rank": "none" + }, "watchdog": { "author": "David Schleef <ds@schleef.org>", "description": "Watches for pauses in stream buffers", @@ -8019,18 +10842,6 @@ "type": "gint", "writable": true }, - "duplex-mode": { - "blurb": "Certain DeckLink devices such as the DeckLink Quad 2 and the DeckLink Duo 2 support configuration of the duplex mode of individual sub-devices.A sub-device configured as full-duplex will use two connectors, which allows simultaneous capture and playback, internal keying, and fill & key scenarios.A half-duplex sub-device will use a single connector as an individual capture or playback channel.", - "conditionally-available": false, - "construct": true, - "construct-only": false, - "controllable": false, - "default": "half (0)", - "mutable": "null", - "readable": true, - "type": "GstDecklinkDuplexMode", - "writable": true - }, "hw-serial-number": { "blurb": "The serial number (hardware ID) of the Decklink card", "conditionally-available": false, @@ -8081,6 +10892,18 @@ "type": "GstDecklinkModes", "writable": true }, + "profile": { + 
"blurb": "Certain DeckLink devices such as the DeckLink 8K Pro, the DeckLink Quad 2 and the DeckLink Duo 2 support multiple profiles to configure the capture and playback behavior of its sub-devices.For the DeckLink Duo 2 and DeckLink Quad 2, a profile is shared between any 2 sub-devices that utilize the same connectors. For the DeckLink 8K Pro, a profile is shared between all 4 sub-devices. Any sub-devices that share a profile are considered to be part of the same profile group.DeckLink Duo 2 support configuration of the duplex mode of individual sub-devices.", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "default (0)", + "mutable": "null", + "readable": true, + "type": "GstDecklinkProfileId", + "writable": true + }, "timecode-format": { "blurb": "Timecode format type to use for playback", "conditionally-available": false, @@ -8182,18 +11005,6 @@ "type": "gboolean", "writable": true }, - "duplex-mode": { - "blurb": "Certain DeckLink devices such as the DeckLink Quad 2 and the DeckLink Duo 2 support configuration of the duplex mode of individual sub-devices.A sub-device configured as full-duplex will use two connectors, which allows simultaneous capture and playback, internal keying, and fill & key scenarios.A half-duplex sub-device will use a single connector as an individual capture or playback channel.", - "conditionally-available": false, - "construct": true, - "construct-only": false, - "controllable": false, - "default": "half (0)", - "mutable": "null", - "readable": true, - "type": "GstDecklinkDuplexMode", - "writable": true - }, "hw-serial-number": { "blurb": "The serial number (hardware ID) of the Decklink card", "conditionally-available": false, @@ -8254,6 +11065,18 @@ "type": "gboolean", "writable": true }, + "profile": { + "blurb": "Certain DeckLink devices such as the DeckLink 8K Pro, the DeckLink Quad 2 and the DeckLink Duo 2 support multiple profiles to configure the capture 
and playback behavior of their sub-devices. For the DeckLink Duo 2 and DeckLink Quad 2, a profile is shared between any 2 sub-devices that utilize the same connectors. For the DeckLink 8K Pro, a profile is shared between all 4 sub-devices. Any sub-devices that share a profile are considered to be part of the same profile group. The DeckLink Duo 2 supports configuration of the duplex mode of individual sub-devices.", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "default (0)", + "mutable": "null", + "readable": true, + "type": "GstDecklinkProfileId", + "writable": true + }, "signal": { "blurb": "True if there is a valid input signal available", "conditionally-available": false, @@ -8411,21 +11234,6 @@ } ] }, - "GstDecklinkDuplexMode": { - "kind": "enum", - "values": [ - { - "desc": "Half-Duplex", - "name": "half", - "value": "0" - }, - { - "desc": "Full-Duplex", - "name": "full", - "value": "1" - } - ] - }, "GstDecklinkKeyerMode": { "kind": "enum", "values": [ @@ -8671,6 +11479,41 @@ } ] }, + "GstDecklinkProfileId": { + "kind": "enum", + "values": [ + { + "desc": "Default, don't change profile", + "name": "default", + "value": "0" + }, + { + "desc": "One sub-device, Full-Duplex", + "name": "one-sub-device-full", + "value": "1" + }, + { + "desc": "One sub-device, Half-Duplex", + "name": "one-sub-device-half", + "value": "2" + }, + { + "desc": "Two sub-devices, Full-Duplex", + "name": "two-sub-devices-full", + "value": "3" + }, + { + "desc": "Two sub-devices, Half-Duplex", + "name": "two-sub-devices-half", + "value": "4" + }, + { + "desc": "Four sub-devices, Half-Duplex", + "name": "four-sub-devices-half", + "value": "5" + } + ] + }, "GstDecklinkTimecodeFormat": { "kind": "enum", "values": [ @@ -11704,7 +14547,7 @@ "long-name": "DVB Subtitles Overlay", "pad-templates": { "src": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, 
GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, 
GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" }, @@ -11714,7 +14557,7 @@ "presence": "always" }, 
"video_sink": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, 
ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 
]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" } @@ -12215,7 +15058,7 @@ "interfaces": [ "GstPreset" ], - "klass": "Codec/Encoder/Audio", + "klass": "Codec/Encoder/Audio/Converter", "long-name": "FDK AAC audio encoder", "pad-templates": { "sink": { @@ -24470,6 +27313,195 @@ "tracers": {}, "url": "Unknown package origin" }, + "gs": { + "description": "Read and write from and to a Google Cloud Storage", + "elements": { + "gssink": { + "author": "Julien Isorce <jisorce@oblong.com>", + "description": "Write buffers to a sequentially named set of files on Google Cloud Storage", + "hierarchy": [ + "GstGsSink", + "GstBaseSink", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Sink/File", + "long-name": "Google Cloud Storage Sink", + "pad-templates": { + "sink": { + "caps": "ANY", + "direction": "sink", + "presence": "always" + } + }, + "properties": { + "bucket-name": { + "blurb": "Google Cloud Storage Bucket Name", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": true + }, + "index": { + "blurb": "Index to use with location property to create file names. 
The index is incremented by one for each buffer written.", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, + "next-file": { + "blurb": "When to start a new file", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "buffer (0)", + "mutable": "null", + "readable": true, + "type": "GstGsSinkNext", + "writable": true + }, + "object-name": { + "blurb": "Full path name of the remote file", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "%%s_%%05d", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": true + }, + "post-messages": { + "blurb": "Post a message for each file with information of the buffer", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "service-account-email": { + "blurb": "Service Account Email to use for credentials", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + }, + "start-date": { + "blurb": "Start date in iso8601 format", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + } + }, + "rank": "none" + }, + "gssrc": { + "author": "Julien Isorce <jisorce@oblong.com>", + "description": "Read from an arbitrary point in a file on Google Cloud Storage", + "hierarchy": [ + "GstGsSrc", + 
"GstBaseSrc", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstURIHandler" + ], + "klass": "Source/File", + "long-name": "Google Cloud Storage Source", + "pad-templates": { + "src": { + "caps": "ANY", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "location": { + "blurb": "Location of the file to read", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + }, + "service-account-email": { + "blurb": "Service Account Email to use for credentials", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "ready", + "readable": true, + "type": "gchararray", + "writable": true + } + }, + "rank": "none" + } + }, + "filename": "gstgs", + "license": "LGPL", + "other-types": { + "GstGsSinkNext": { + "kind": "enum", + "values": [ + { + "desc": "New file for each buffer", + "name": "buffer", + "value": "0" + }, + { + "desc": "New file for each buffer", + "name": "Only one file, no next file", + "value": "1" + } + ] + } + }, + "package": "GStreamer Bad Plug-ins", + "source": "gst-plugins-bad", + "tracers": {}, + "url": "Unknown package origin" + }, "gsm": { "description": "GSM encoder/decoder", "elements": { @@ -24696,7 +27728,7 @@ "interfaces": [ "GstChildProxy" ], - "klass": "Sink", + "klass": "Sink/Muxer", "long-name": "HTTP Live Streaming sink", "pad-templates": { "audio": { @@ -25132,7 +28164,7 @@ "long-name": "Internal video sink", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, 
P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" } @@ -25168,7 +28200,7 @@ "long-name": "Internal video source", "pad-templates": { "src": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, 
A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" } @@ -25229,12 +28261,12 @@ "long-name": "Interlace filter", 
"pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { AYUV, YUY2, UYVY, I420, YV12, Y42B, Y444, NV12, NV21 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: progressive\nvideo/x-raw:\n format: { AYUV, YUY2, UYVY, I420, YV12, Y42B, Y444, NV12, NV21 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: interleaved\n field-order: { (string)top-field-first, (string)bottom-field-first }\nvideo/x-raw:\n format: { AYUV, YUY2, UYVY, I420, YV12, Y42B, Y444, NV12, NV21 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: mixed\n\nvideo/x-raw(format:Interlaced):\n format: { AYUV, YUY2, UYVY, I420, YV12, Y42B, Y444, NV12, NV21 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: alternate\n", + "caps": "video/x-raw:\n format: { AYUV64, Y412_BE, Y412_LE, A444_10BE, A444_10LE, AYUV, VUYA, A422_10BE, A422_10LE, A420_10BE, A420_10LE, A420, Y444_16BE, Y444_16LE, Y444_12BE, Y444_12LE, Y410, Y444_10BE, Y444_10LE, v308, IYU2, Y444, NV24, v216, I422_12BE, I422_12LE, Y212_BE, Y212_LE, UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, P016_BE, P016_LE, I420_12BE, I420_12LE, P012_BE, P012_LE, NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, I420, YV12, NV12, NV21, IYU1, Y41B, YUV9, YVU9 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: progressive\nvideo/x-raw:\n format: { AYUV64, Y412_BE, Y412_LE, A444_10BE, A444_10LE, AYUV, VUYA, A422_10BE, A422_10LE, A420_10BE, A420_10LE, A420, Y444_16BE, Y444_16LE, Y444_12BE, Y444_12LE, Y410, Y444_10BE, Y444_10LE, v308, IYU2, Y444, NV24, v216, I422_12BE, I422_12LE, Y212_BE, Y212_LE, UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, P016_BE, 
P016_LE, I420_12BE, I420_12LE, P012_BE, P012_LE, NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, I420, YV12, NV12, NV21, IYU1, Y41B, YUV9, YVU9 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: interleaved\n field-order: { (string)top-field-first, (string)bottom-field-first }\nvideo/x-raw:\n format: { AYUV64, Y412_BE, Y412_LE, A444_10BE, A444_10LE, AYUV, VUYA, A422_10BE, A422_10LE, A420_10BE, A420_10LE, A420, Y444_16BE, Y444_16LE, Y444_12BE, Y444_12LE, Y410, Y444_10BE, Y444_10LE, v308, IYU2, Y444, NV24, v216, I422_12BE, I422_12LE, Y212_BE, Y212_LE, UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, P016_BE, P016_LE, I420_12BE, I420_12LE, P012_BE, P012_LE, NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, I420, YV12, NV12, NV21, IYU1, Y41B, YUV9, YVU9 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: mixed\n\nvideo/x-raw(format:Interlaced):\n format: { AYUV64, Y412_BE, Y412_LE, A444_10BE, A444_10LE, AYUV, VUYA, A422_10BE, A422_10LE, A420_10BE, A420_10LE, A420, Y444_16BE, Y444_16LE, Y444_12BE, Y444_12LE, Y410, Y444_10BE, Y444_10LE, v308, IYU2, Y444, NV24, v216, I422_12BE, I422_12LE, Y212_BE, Y212_LE, UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, P016_BE, P016_LE, I420_12BE, I420_12LE, P012_BE, P012_LE, NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, I420, YV12, NV12, NV21, IYU1, Y41B, YUV9, YVU9 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: alternate\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n format: { AYUV, YUY2, UYVY, I420, YV12, Y42B, Y444, NV12, NV21 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)interleaved, 
(string)mixed }\n\nvideo/x-raw(format:Interlaced):\n format: { AYUV, YUY2, UYVY, I420, YV12, Y42B, Y444, NV12, NV21 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: alternate\n", + "caps": "video/x-raw:\n format: { AYUV64, Y412_BE, Y412_LE, A444_10BE, A444_10LE, AYUV, VUYA, A422_10BE, A422_10LE, A420_10BE, A420_10LE, A420, Y444_16BE, Y444_16LE, Y444_12BE, Y444_12LE, Y410, Y444_10BE, Y444_10LE, v308, IYU2, Y444, NV24, v216, I422_12BE, I422_12LE, Y212_BE, Y212_LE, UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, P016_BE, P016_LE, I420_12BE, I420_12LE, P012_BE, P012_LE, NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, I420, YV12, NV12, NV21, IYU1, Y41B, YUV9, YVU9 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)interleaved, (string)mixed }\n\nvideo/x-raw(format:Interlaced):\n format: { AYUV64, Y412_BE, Y412_LE, A444_10BE, A444_10LE, AYUV, VUYA, A422_10BE, A422_10LE, A420_10BE, A420_10LE, A420, Y444_16BE, Y444_16LE, Y444_12BE, Y444_12LE, Y410, Y444_10BE, Y444_10LE, v308, IYU2, Y444, NV24, v216, I422_12BE, I422_12LE, Y212_BE, Y212_LE, UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, P016_BE, P016_LE, I420_12BE, I420_12LE, P012_BE, P012_LE, NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, I420, YV12, NV12, NV21, IYU1, Y41B, YUV9, YVU9 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: alternate\n", "direction": "src", "presence": "always" } @@ -26312,7 +29344,7 @@ "long-name": "KMS video sink", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { BGRA, BGRx, RGBA, RGBx, RGB, BGR, P010_10LE, P016_LE, UYVY, YUY2, YVYU, I420, YV12, Y42B, NV12, NV21, NV16 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 
]\n", + "caps": "video/x-raw:\n format: { BGRA, RGBA, P016_LE, P010_10LE, NV24, BGRx, RGBx, RGB, BGR, Y42B, NV61, NV16, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, RGB16, BGR16 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" } @@ -206719,17 +209751,31 @@ "long-name": "Media Foundation Intel® Quick Sync Video H.264 Encoder MFT", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { NV12, BGRA }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-raw:\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n\nvideo/x-raw(memory:D3D11Memory):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)high, (string)main, (string)baseline }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)high, (string)main, (string)constrained-baseline, (string)baseline }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", "direction": "src", "presence": "always" } }, "properties": { + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", + "conditionally-available": true, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": false + }, "bframes": { "blurb": "The maximum number of consecutive B frames", "conditionally-available": true, @@ -206770,18 +209816,30 @@ "type": "gboolean", "writable": true }, + "d3d11-aware": { + "blurb": "Whether device can support Direct3D11 interop", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", 
+ "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": false + }, "gop-size": { - "blurb": "The number of pictures from one GOP header to the next, (0 = MFT default)", + "blurb": "The number of pictures from one GOP header to the next. Depending on GPU vendor implementation, zero gop-size might produce only one keyframe at the beginning (-1 for automatic)", "conditionally-available": true, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "-2", - "min": "0", + "default": "-1", + "max": "2147483647", + "min": "-1", "mutable": "null", "readable": true, - "type": "guint", + "type": "gint", "writable": true }, "low-latency": { @@ -206942,7 +210000,7 @@ "long-name": "Media Foundation Intel® Hardware H265 Encoder MFT", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { NV12, BGRA, P010_10LE }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-raw:\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12, P010_10LE }\n\nvideo/x-raw(memory:D3D11Memory):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12, P010_10LE }\n", "direction": "sink", "presence": "always" }, @@ -206953,6 +210011,20 @@ } }, "properties": { + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", + "conditionally-available": true, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": false + }, "bframes": { "blurb": "The maximum number of consecutive B frames", "conditionally-available": true, @@ -206981,18 +210053,30 @@ "type": "guint", "writable": true }, + "d3d11-aware": { + "blurb": "Whether device can support Direct3D11 interop", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + 
"mutable": "null", + "readable": true, + "type": "gboolean", + "writable": false + }, "gop-size": { - "blurb": "The number of pictures from one GOP header to the next, (0 = MFT default)", + "blurb": "The number of pictures from one GOP header to the next. Depending on GPU vendor implementation, zero gop-size might produce only one keyframe at the beginning (-1 for automatic)", "conditionally-available": true, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "-2", - "min": "0", + "default": "-1", + "max": "2147483647", + "min": "-1", "mutable": "null", "readable": true, - "type": "guint", + "type": "gint", "writable": true }, "low-latency": { @@ -207260,7 +210344,7 @@ "long-name": "Media Foundation Intel® Hardware VP9 Encoder MFT", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { NV12 }\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n", + "caps": "video/x-raw:\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n\nvideo/x-raw(memory:D3D11Memory):\n width: [ 64, 8192 ]\n height: [ 64, 8192 ]\n format: { NV12 }\n", "direction": "sink", "presence": "always" }, @@ -207271,6 +210355,20 @@ } }, "properties": { + "adapter-luid": { + "blurb": "DXGI Adapter LUID (Locally Unique Identifier) of created device", + "conditionally-available": true, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "86670", + "max": "9223372036854775807", + "min": "-9223372036854775808", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": false + }, "bitrate": { "blurb": "Bitrate in kbit/sec", "conditionally-available": false, @@ -207285,18 +210383,30 @@ "type": "guint", "writable": true }, + "d3d11-aware": { + "blurb": "Whether device can support Direct3D11 interop", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + 
"writable": false + }, "gop-size": { - "blurb": "The number of pictures from one GOP header to the next, (0 = MFT default)", + "blurb": "The number of pictures from one GOP header to the next. Depending on GPU vendor implementation, zero gop-size might produce only one keyframe at the beginning (-1 for automatic)", "conditionally-available": true, "construct": false, "construct-only": false, "controllable": false, - "default": "0", - "max": "-2", - "min": "0", + "default": "-1", + "max": "2147483647", + "min": "-1", "mutable": "null", "readable": true, - "type": "guint", + "type": "gint", "writable": true }, "low-latency": { @@ -207482,72 +210592,6 @@ "tracers": {}, "url": "Unknown package origin" }, - "mms": { - "description": "Microsoft Multi Media Server streaming protocol support", - "elements": { - "mmssrc": { - "author": "Maciej Katafiasz <mathrick@users.sourceforge.net>", - "description": "Receive data streamed via MSFT Multi Media Server protocol", - "hierarchy": [ - "GstMMS", - "GstPushSrc", - "GstBaseSrc", - "GstElement", - "GstObject", - "GInitiallyUnowned", - "GObject" - ], - "interfaces": [ - "GstURIHandler" - ], - "klass": "Source/Network", - "long-name": "MMS streaming source", - "pad-templates": { - "src": { - "caps": "video/x-ms-asf:\n", - "direction": "src", - "presence": "always" - } - }, - "properties": { - "connection-speed": { - "blurb": "Network connection speed in kbps (0 = unknown)", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "0", - "max": "2147483", - "min": "0", - "mutable": "null", - "readable": true, - "type": "guint64", - "writable": true - }, - "location": { - "blurb": "Host URL to connect to. 
Accepted are mms://, mmsu://, mmst:// URL types", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "NULL", - "mutable": "null", - "readable": true, - "type": "gchararray", - "writable": true - } - }, - "rank": "none" - } - }, - "filename": "gstmms", - "license": "LGPL", - "other-types": {}, - "package": "GStreamer Bad Plug-ins", - "source": "gst-plugins-bad", - "tracers": {}, - "url": "Unknown package origin" - }, "modplug": { "description": ".MOD audio decoding", "elements": { @@ -208696,6 +211740,18 @@ "readable": true, "type": "gint", "writable": true + }, + "send-scte35-events": { + "blurb": "Whether SCTE 35 sections should be forwarded as events", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true } }, "rank": "primary", @@ -208897,7 +211953,7 @@ "long-name": "MPEG Transport Stream Muxer", "pad-templates": { "sink_%%d": { - "caps": "video/mpeg:\n parsed: true\n mpegversion: { (int)1, (int)2, (int)4 }\n systemstream: false\nvideo/x-dirac:\nimage/x-jpc:\nvideo/x-h264:\n stream-format: byte-stream\n alignment: { (string)au, (string)nal }\nvideo/x-h265:\n stream-format: byte-stream\n alignment: { (string)au, (string)nal }\naudio/mpeg:\n parsed: true\n mpegversion: 1\naudio/mpeg:\n framed: true\n mpegversion: { (int)2, (int)4 }\n stream-format: { (string)adts, (string)raw }\naudio/x-lpcm:\n width: { (int)16, (int)20, (int)24 }\n rate: { (int)48000, (int)96000 }\n channels: [ 1, 8 ]\n dynamic_range: [ 0, 255 ]\n emphasis: { (boolean)false, (boolean)true }\n mute: { (boolean)false, (boolean)true }\naudio/x-ac3:\n framed: true\naudio/x-dts:\n framed: true\naudio/x-opus:\n channels: [ 1, 8 ]\nchannel-mapping-family: { (int)0, (int)1 }\nsubpicture/x-dvb:\napplication/x-teletext:\nmeta/x-klv:\n parsed: true\nimage/x-jpc:\n profile: [ 0, 
49151 ]\n", + "caps": "video/mpeg:\n parsed: true\n mpegversion: { (int)1, (int)2, (int)4 }\n systemstream: false\nvideo/x-dirac:\nimage/x-jpc:\n alignment: frame\nvideo/x-h264:\n stream-format: byte-stream\n alignment: { (string)au, (string)nal }\nvideo/x-h265:\n stream-format: byte-stream\n alignment: { (string)au, (string)nal }\naudio/mpeg:\n parsed: true\n mpegversion: 1\naudio/mpeg:\n framed: true\n mpegversion: { (int)2, (int)4 }\n stream-format: { (string)adts, (string)raw }\naudio/x-lpcm:\n width: { (int)16, (int)20, (int)24 }\n rate: { (int)48000, (int)96000 }\n channels: [ 1, 8 ]\n dynamic_range: [ 0, 255 ]\n emphasis: { (boolean)false, (boolean)true }\n mute: { (boolean)false, (boolean)true }\naudio/x-ac3:\n framed: true\naudio/x-dts:\n framed: true\naudio/x-opus:\n channels: [ 1, 8 ]\nchannel-mapping-family: { (int)0, (int)1 }\nsubpicture/x-dvb:\napplication/x-teletext:\nmeta/x-klv:\n parsed: true\nimage/x-jpc:\n alignment: frame\n profile: [ 0, 49151 ]\n", "direction": "sink", "presence": "request", "type": "GstBaseTsMuxPad" @@ -209297,6 +212353,1869 @@ "tracers": {}, "url": "Unknown package origin" }, + "msdk": { + "description": "MFX API (Intel(R) Media SDK) based elements", + "elements": { + "msdkav1dec": { + "author": "Haihao Xiang <haihao.xiang@intel.com>", + "description": "AV1 video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkAV1Dec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK AV1 decoder", + "pad-templates": { + "sink": { + "caps": "video/x-av1\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: { NV12, P010_10LE, VUYA, Y410 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: { NV12, P010_10LE, VUYA, Y410 }\n width: [ 1, 2147483647 ]\n height: 
[ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": {}, + "rank": "none" + }, + "msdkh264dec": { + "author": "Scott D Phillips <scott.d.phillips@intel.com>", + "description": "H264 video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkH264Dec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK H264 decoder", + "pad-templates": { + "sink": { + "caps": "video/x-h264\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n stream-format: byte-stream\n alignment: au\n profile: { (string)high, (string)progressive-high, (string)constrained-high, (string)main, (string)baseline, (string)constrained-baseline }", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "output-order": { + "blurb": "Decoded frames output order", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "display (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkDecOutputOrder", + "writable": true + } + }, + "rank": "none" + }, + "msdkh264enc": { + "author": "Josep Torra <jtorra@oblong.com>", + "description": "H264 video encoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkH264Enc", + "GstMsdkEnc", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "Intel MSDK H264 encoder", 
+ "pad-templates": { + "sink": { + "caps": "video/x-raw\n format: { NV12, I420, YV12, YUY2, UYVY, BGRA }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-h264\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n stream-format: byte-stream\n alignment: au\n profile: { (string)high, (string)main, (string)baseline, (string)constrained-baseline }\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "b-pyramid": { + "blurb": "Enable B-Pyramid Reference structure", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "cabac": { + "blurb": "Enable CABAC entropy coding", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "dblk-idc": { + "blurb": "Option of disable deblocking idc", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "frame-packing": { + "blurb": "Set frame packing mode for Stereoscopic content", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "none (-1)", + "mutable": "null", + "readable": true, + "type": "GstMsdkH264EncFramePacking", + "writable": true + }, + "intra-refresh-type": { + 
"blurb": "Set intra refresh type", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "no (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncIntraRefreshType", + "writable": true + }, + "low-power": { + "blurb": "Enable low power mode (DEPRECATED, use tune instead)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "max-qp": { + "blurb": "Maximum quantizer for I/P/B frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-slice-size": { + "blurb": "Maximum slice size in bytes (if enabled MSDK will ignore the control over num-slices)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "4294967295", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "min-qp": { + "blurb": "Minimal quantizer for I/P/B frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "p-pyramid": { + "blurb": "Enable P-Pyramid Reference structure", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "rc-lookahead-ds": { + "blurb": "Down sampling mode in look ahead bitrate control", + "conditionally-available": false, + "construct": false, + 
"construct-only": false, + "controllable": false, + "default": "default (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncRCLookAheadDownsampling", + "writable": true + }, + "trellis": { + "blurb": "Enable Trellis Quantization", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "None (0x00000000)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncTrellisQuantization", + "writable": true + }, + "tune": { + "blurb": "Encoder tuning option", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "auto (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncTuneMode", + "writable": true + } + }, + "rank": "none" + }, + "msdkh265dec": { + "author": "Scott D Phillips <scott.d.phillips@intel.com>", + "description": "H265 video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkH265Dec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK H265 decoder", + "pad-templates": { + "sink": { + "caps": "video/x-h265\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n stream-format: byte-stream\n alignment: au\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: { NV12, P010_10LE, YUY2, Y210, VUYA, Y410, P012_LE, Y212_LE, Y412_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: { NV12, P010_10LE, YUY2, Y210, VUYA, Y410, P012_LE, Y212_LE, Y412_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "output-order": { + "blurb": "Decoded frames output order", + 
"conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "display (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkDecOutputOrder", + "writable": true + } + }, + "rank": "none" + }, + "msdkh265enc": { + "author": "Josep Torra <jtorra@oblong.com>", + "description": "H265 video encoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkH265Enc", + "GstMsdkEnc", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "Intel MSDK H265 encoder", + "pad-templates": { + "sink": { + "caps": "video/x-raw\n format: { NV12, I420, YV12, YUY2, UYVY, BGRA, P010_10LE, VUYA, Y410, Y210, P012_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: { NV12, P010_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-h265\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n stream-format: byte-stream\n alignment: au\n profile: { (string)main, (string)main-10, (string)main-444, (string)main-444-10, (string)main-422-10, (string)main-12, (string)screen-extended-main, (string)screen-extended-main-10, (string)screen-extended-main-444, (string)screen-extended-main-444-10 }\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "b-pyramid": { + "blurb": "Enable B-Pyramid Reference structure", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "dblk-idc": { + "blurb": "Option of disable deblocking idc", + "conditionally-available": 
false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "intra-refresh-type": { + "blurb": "Set intra refresh type", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "no (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncIntraRefreshType", + "writable": true + }, + "low-power": { + "blurb": "Enable low power mode (DEPRECATED, use tune instead)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "max-qp": { + "blurb": "Maximum quantizer for I/P/B frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-slice-size": { + "blurb": "Maximum slice size in bytes (if enabled MSDK will ignore the control over num-slices)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "4294967295", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "min-qp": { + "blurb": "Minimal quantizer for I/P/B frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "num-tile-cols": { + "blurb": "number of columns for tiled encoding", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + 
"max": "8192", + "min": "1", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "num-tile-rows": { + "blurb": "number of rows for tiled encoding", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "8192", + "min": "1", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "p-pyramid": { + "blurb": "Enable P-Pyramid Reference structure", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "transform-skip": { + "blurb": "Transform Skip option", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "auto (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncTransformSkip", + "writable": true + }, + "tune": { + "blurb": "Encoder tuning option", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "auto (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncTuneMode", + "writable": true + } + }, + "rank": "none" + }, + "msdkmjpegdec": { + "author": "Scott D Phillips <scott.d.phillips@intel.com>", + "description": "MJPEG video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkMJPEGDec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK MJPEG decoder", + "pad-templates": { + "sink": { + "caps": "image/jpeg\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n parsed: true\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: { NV12, YUY2 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 
0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: { NV12, YUY2 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": {}, + "rank": "none" + }, + "msdkmjpegenc": { + "author": "Scott D Phillips <scott.d.phillips@intel.com>", + "description": "MJPEG video encoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkMJPEGEnc", + "GstMsdkEnc", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "Intel MSDK MJPEG encoder", + "pad-templates": { + "sink": { + "caps": "video/x-raw\n format: { NV12, I420, YV12, YUY2, UYVY, BGRA }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "image/jpeg\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "quality": { + "blurb": "Quality of encoding", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "85", + "max": "100", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + } + }, + "rank": "none" + }, + "msdkmpeg2dec": { + "author": "Sreerenj Balachandran <sreerenj.balachandran@intel.com>", + "description": "MPEG2 video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkMPEG2Dec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": 
"Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK MPEG2 decoder", + "pad-templates": { + "sink": { + "caps": "video/mpeg\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n mpegversion: 2\n systemstream: false\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "output-order": { + "blurb": "Decoded frames output order", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "display (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkDecOutputOrder", + "writable": true + } + }, + "rank": "none" + }, + "msdkmpeg2enc": { + "author": "Josep Torra <jtorra@oblong.com>", + "description": "MPEG2 video encoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkMPEG2Enc", + "GstMsdkEnc", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "Intel MSDK MPEG2 encoder", + "pad-templates": { + "sink": { + "caps": "video/x-raw\n format: { NV12, I420, YV12, YUY2, UYVY, BGRA }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/mpeg\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n mpegversion: 2\n systemstream: false\n profile: { 
(string)high, (string)main, (string)simple }\n", + "direction": "src", + "presence": "always" + } + }, + "properties": {}, + "rank": "none" + }, + "msdkvc1dec": { + "author": "Sreerenj Balachandran <sreerenj.balachandran@intel.com>", + "description": "VC1/WMV video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkVC1Dec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK VC1 decoder", + "pad-templates": { + "sink": { + "caps": "video/x-wmv\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n wmvversion: 3\n format: WMV3\n header-format: none\n stream-format: sequence-layer-frame-layer\n profile: { (string)simple, (string)main }\nvideo/x-wmv\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n wmvversion: 3\n format: WVC1\n header-format: asf\n stream-format: bdu\n profile: advanced\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "output-order": { + "blurb": "Decoded frames output order", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "display (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkDecOutputOrder", + "writable": true + } + }, + "rank": "none" + }, + "msdkvp8dec": { + "author": "Hyunjun Ko <zzoon@igalia.com>", + "description": "VP8 video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkVP8Dec", + "GstMsdkDec", + "GstVideoDecoder", + 
"GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK VP8 decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp8\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: NV12\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "output-order": { + "blurb": "Decoded frames output order", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "display (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkDecOutputOrder", + "writable": true + } + }, + "rank": "none" + }, + "msdkvp9dec": { + "author": "Sreerenj Balachandran <sreerenj.balachandran@intel.com>", + "description": "VP9 video decoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkVP9Dec", + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "Intel MSDK VP9 decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp9\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw\n format: { NV12, P010_10LE, VUYA, Y410, P012_LE, Y412_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: { NV12, P010_10LE, VUYA, Y410, P012_LE, Y412_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + 
"output-order": { + "blurb": "Decoded frames output order", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "display (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkDecOutputOrder", + "writable": true + } + }, + "rank": "none" + }, + "msdkvp9enc": { + "author": "Haihao Xiang <haihao.xiang@intel.com>", + "description": "VP9 video encoder based on Intel Media SDK", + "hierarchy": [ + "GstMsdkVP9Enc", + "GstMsdkEnc", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Encoder/Video/Hardware", + "long-name": "Intel MSDK VP9 encoder", + "pad-templates": { + "sink": { + "caps": "video/x-raw\n format: { NV12, I420, YV12, YUY2, UYVY, BGRA, P010_10LE, VUYA, Y410 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\nvideo/x-raw(memory:DMABuf)\n format: { NV12, P010_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: progressive\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-vp9\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n profile: { (string)0, (string)1, (string)2, (string)3 }\n", + "direction": "src", + "presence": "always" + } + }, + "properties": {}, + "rank": "none" + }, + "msdkvpp": { + "author": "Sreerenj Balachandrn <sreerenj.balachandran@intel.com>", + "description": "A MediaSDK Video Postprocessing Filter", + "hierarchy": [ + "GstMsdkVPP", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Converter/Video;Filter/Converter/Video/Scaler;Filter/Effect/Video;Filter/Effect/Video/Deinterlace", + "long-name": "MSDK Video Postprocessor", + "pad-templates": { + "sink": { + "caps": "video/x-raw\n format: { NV12, YV12, I420, YUY2, UYVY, VUYA, BGRA, BGRx, 
P010_10LE, RGB16, Y410, Y210 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: { (string)progressive, (string)interleaved, (string)mixed }\nvideo/x-raw(memory:DMABuf)\n format: { NV12, BGRA, YUY2, UYVY, VUYA, P010_10LE, RGB16, Y410, Y210 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:DMABuf)\n format: { BGRA, YUY2, UYVY, NV12, VUYA, BGRx, P010_10LE, BGR10A2_LE, YV12, Y410, Y210 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw\n format: { BGRA, NV12, YUY2, UYVY, VUYA, BGRx, P010_10LE, BGR10A2_LE, YV12, Y410, Y210 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\ninterlace-mode: { (string)progressive, (string)interleaved, (string)mixed }\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "async-depth": { + "blurb": "Depth of asynchronous pipeline", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "1", + "min": "1", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "brightness": { + "blurb": "The Brightness of the video", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "100", + "min": "-100", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "contrast": { + "blurb": "The Contrast of the video", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "10", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "crop-bottom": { + "blurb": "Pixels to crop at bottom", + 
"conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "crop-left": { + "blurb": "Pixels to crop at left", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "crop-right": { + "blurb": "Pixels to crop at right", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "crop-top": { + "blurb": "Pixels to crop at top", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "deinterlace-method": { + "blurb": "Deinterlace method to use", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "bob (1)", + "mutable": "null", + "readable": true, + "type": "GstMsdkVPPDeinterlaceMethod", + "writable": true + }, + "deinterlace-mode": { + "blurb": "Deinterlace mode to use", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "auto (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkVPPDeinterlaceMode", + "writable": true + }, + "denoise": { + "blurb": "Denoising Factor", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "100", + "min": "0", + "mutable": "null", + "readable": true, + "type": 
"guint", + "writable": true + }, + "detail": { + "blurb": "The factor of detail/edge enhancement filter algorithm", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "100", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "force-aspect-ratio": { + "blurb": "When enabled, scaling will respect original aspect ratio", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "frc-algorithm": { + "blurb": "The Framerate Control Alogorithm to use", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "none (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkVPPFrcAlgorithm", + "writable": true + }, + "hardware": { + "blurb": "Enable hardware VPP", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "hue": { + "blurb": "The hue of the video", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "180", + "min": "-180", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "mirroring": { + "blurb": "The Mirroring type (DEPRECATED, use video-direction instead)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "disable (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkVPPMirroring", + "writable": true + }, + "rotation": { + "blurb": "Rotation Angle (DEPRECATED, use video-direction instead)", + "conditionally-available": false, + 
"construct": false, + "construct-only": false, + "controllable": false, + "default": "0 (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkVPPRotation", + "writable": true + }, + "saturation": { + "blurb": "The Saturation of the video", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "10", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "scaling-mode": { + "blurb": "The Scaling mode to use", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "disable (0)", + "mutable": "null", + "readable": true, + "type": "GstMsdkVPPScalingMode", + "writable": true + }, + "video-direction": { + "blurb": "Video direction: rotation and flipping, it will override both mirroring & rotation properties if set explicitly", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "identity (0)", + "mutable": "null", + "readable": true, + "type": "GstVideoOrientationMethod", + "writable": true + } + }, + "rank": "none" + } + }, + "filename": "gstmsdk", + "license": "LGPL", + "other-types": { + "GstMsdkDec": { + "hierarchy": [ + "GstMsdkDec", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { + "async-depth": { + "blurb": "Depth of asynchronous pipeline", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "20", + "min": "1", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "hardware": { + "blurb": "Enable hardware decoders", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": 
"gboolean", + "writable": true + } + } + }, + "GstMsdkDecOutputOrder": { + "kind": "enum", + "values": [ + { + "desc": "Output frames in Display order", + "name": "display", + "value": "0" + }, + { + "desc": "Output frames in Decoded order", + "name": "decoded", + "value": "1" + } + ] + }, + "GstMsdkEnc": { + "hierarchy": [ + "GstMsdkEnc", + "GstVideoEncoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { + "accuracy": { + "blurb": "The AVBR Accuracy in the unit of tenth of percent", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "async-depth": { + "blurb": "Depth of asynchronous pipeline", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "4", + "max": "20", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "b-adapt": { + "blurb": "Adaptive B-Frame Insertion control", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "off (32)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncAdaptiveB", + "writable": true + }, + "b-frames": { + "blurb": "Number of B frames between I and P frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "bitrate": { + "blurb": "Bitrate in kbit/sec", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "2048", + "max": "2048000", + "min": "1", + "mutable": "playing", + "readable": true, + "type": "guint", + 
"writable": true + }, + "convergence": { + "blurb": "The AVBR Convergence in the unit of 100 frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "ext-coding-props": { + "blurb": "The properties for the external coding", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "mutable": "null", + "readable": true, + "type": "GstStructure", + "writable": true + }, + "gop-size": { + "blurb": "GOP Size", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "256", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "hardware": { + "blurb": "Enable hardware encoders", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "i-adapt": { + "blurb": "Adaptive I-Frame Insertion control", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "off (32)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncAdaptiveI", + "writable": true + }, + "i-frames": { + "blurb": "Number of I frames between IDR frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-frame-size": { + "blurb": "Maximum possible size (in kbyte) of any compressed frames (0: auto-calculate)", + "conditionally-available": false, + "construct": false, + "construct-only": false, 
+ "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "max-vbv-bitrate": { + "blurb": "Maximum bitrate(kbit/sec) at which data enters Video Buffering Verifier (0: auto-calculate)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2048000", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "mbbrc": { + "blurb": "Macroblock level bitrate control", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "off (32)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncMbBitrateControl", + "writable": true + }, + "num-slices": { + "blurb": "Number of slices per frame, Zero tells the encoder to choose any slice partitioning allowed by the codec standard", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "qpb": { + "blurb": "Constant quantizer for B frames (0 unlimited)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "qpi": { + "blurb": "Constant quantizer for I frames (0 unlimited). 
Also used as ICQQuality or QVBRQuality for different RateControl methods", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "qpp": { + "blurb": "Constant quantizer for P frames (0 unlimited)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "51", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "rate-control": { + "blurb": "Rate control method", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "cbr (1)", + "mutable": "null", + "readable": true, + "type": "GstMsdkEncRateControl", + "writable": true + }, + "rc-lookahead": { + "blurb": "Number of frames to look ahead for Rate control", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "10", + "max": "100", + "min": "10", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "ref-frames": { + "blurb": "Number of reference frames", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "target-usage": { + "blurb": "1: Best quality, 4: Balanced, 7: Best speed", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "4", + "max": "7", + "min": "1", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + } + } + }, + "GstMsdkEncAdaptiveB": { + "kind": "enum", + "values": [ + { + "desc": "SDK decides what to do", + "name": "auto", + 
"value": "0" + }, + { + "desc": "Disable Adaptive B-Frame insertion", + "name": "off", + "value": "32" + }, + { + "desc": "Enable Aaptive B-Frame insertion", + "name": "on", + "value": "16" + } + ] + }, + "GstMsdkEncAdaptiveI": { + "kind": "enum", + "values": [ + { + "desc": "SDK desides what to do", + "name": "auto", + "value": "0" + }, + { + "desc": "Disable Adaptive I frame insertion", + "name": "off", + "value": "32" + }, + { + "desc": "Enable Aaptive I frame insertion", + "name": "on", + "value": "16" + } + ] + }, + "GstMsdkEncIntraRefreshType": { + "kind": "enum", + "values": [ + { + "desc": "No (default)", + "name": "no", + "value": "0" + }, + { + "desc": "Vertical", + "name": "vertical", + "value": "1" + }, + { + "desc": "Horizontal", + "name": "horizontal", + "value": "2" + }, + { + "desc": "Slice", + "name": "slice", + "value": "3" + } + ] + }, + "GstMsdkEncMbBitrateControl": { + "kind": "enum", + "values": [ + { + "desc": "SDK desides what to do", + "name": "auto", + "value": "0" + }, + { + "desc": "Disable Macroblock level bit rate control", + "name": "off", + "value": "32" + }, + { + "desc": "Enable Macroblock level bit rate control", + "name": "on", + "value": "16" + } + ] + }, + "GstMsdkEncRCLookAheadDownsampling": { + "kind": "enum", + "values": [ + { + "desc": "SDK desides what to do", + "name": "default", + "value": "0" + }, + { + "desc": "No downsampling", + "name": "off", + "value": "1" + }, + { + "desc": "Down sample 2-times before estimation", + "name": "2x", + "value": "2" + }, + { + "desc": "Down sample 4-times before estimation", + "name": "4x", + "value": "3" + } + ] + }, + "GstMsdkEncRateControl": { + "kind": "enum", + "values": [ + { + "desc": "Constant Bitrate", + "name": "cbr", + "value": "1" + }, + { + "desc": "Variable Bitrate", + "name": "vbr", + "value": "2" + }, + { + "desc": "Constant Quantizer", + "name": "cqp", + "value": "3" + }, + { + "desc": "Average Bitrate", + "name": "avbr", + "value": "4" + }, + { + "desc": "VBR with 
look ahead (Non HRD compliant)", + "name": "la_vbr", + "value": "8" + }, + { + "desc": "Intelligent CQP", + "name": "icq", + "value": "9" + }, + { + "desc": "Video Conferencing Mode (Non HRD compliant)", + "name": "vcm", + "value": "10" + }, + { + "desc": "Intelligent CQP with LA (Non HRD compliant)", + "name": "la_icq", + "value": "11" + }, + { + "desc": "HRD compliant LA", + "name": "la_hrd", + "value": "13" + }, + { + "desc": "VBR with CQP", + "name": "qvbr", + "value": "14" + } + ] + }, + "GstMsdkEncTransformSkip": { + "kind": "enum", + "values": [ + { + "desc": "SDK decides what to do", + "name": "auto", + "value": "0" + }, + { + "desc": "transform_skip_enabled_flag will be set to 0 in PPS", + "name": "off", + "value": "32" + }, + { + "desc": "transform_skip_enabled_flag will be set to 1 in PPS", + "name": "on", + "value": "16" + } + ] + }, + "GstMsdkEncTrellisQuantization": { + "kind": "flags", + "values": [ + { + "desc": "Disable for all frames", + "name": "None", + "value": "0x00000000" + }, + { + "desc": "Enable for I frames", + "name": "i", + "value": "0x00000002" + }, + { + "desc": "Enable for P frames", + "name": "p", + "value": "0x00000004" + }, + { + "desc": "Enable for B frames", + "name": "b", + "value": "0x00000008" + } + ] + }, + "GstMsdkEncTuneMode": { + "kind": "enum", + "values": [ + { + "desc": "Auto", + "name": "auto", + "value": "0" + }, + { + "desc": "None", + "name": "none", + "value": "32" + }, + { + "desc": "Low power mode", + "name": "low-power", + "value": "16" + } + ] + }, + "GstMsdkH264EncFramePacking": { + "kind": "enum", + "values": [ + { + "desc": "None (default)", + "name": "none", + "value": "-1" + }, + { + "desc": "Side by Side", + "name": "side-by-side", + "value": "3" + }, + { + "desc": "Top Bottom", + "name": "top-bottom", + "value": "7" + } + ] + }, + "GstMsdkVPPDeinterlaceMethod": { + "kind": "enum", + "values": [ + { + "desc": "Disable deinterlacing", + "name": "none", + "value": "0" + }, + { + "desc": "Bob 
deinterlacing", + "name": "bob", + "value": "1" + }, + { + "desc": "Advanced deinterlacing (Motion adaptive)", + "name": "advanced", + "value": "2" + }, + { + "desc": "Advanced deinterlacing mode without using of reference frames", + "name": "advanced-no-ref ", + "value": "11" + }, + { + "desc": "Advanced deinterlacing mode with scene change detection", + "name": "advanced-scd ", + "value": "12" + }, + { + "desc": "Field weaving", + "name": "field-weave", + "value": "13" + } + ] + }, + "GstMsdkVPPDeinterlaceMode": { + "kind": "enum", + "values": [ + { + "desc": "Auto detection", + "name": "auto", + "value": "0" + }, + { + "desc": "Force deinterlacing", + "name": "interlaced", + "value": "1" + }, + { + "desc": "Never deinterlace", + "name": "disabled", + "value": "2" + } + ] + }, + "GstMsdkVPPFrcAlgorithm": { + "kind": "enum", + "values": [ + { + "desc": "No FrameRate Control algorithm", + "name": "none", + "value": "0" + }, + { + "desc": "Frame dropping/repetition, Preserve timestamp", + "name": "preserve-ts", + "value": "1" + }, + { + "desc": "Frame dropping/repetition, Distribute timestamp", + "name": "distribute-ts", + "value": "2" + }, + { + "desc": "Frame interpolation", + "name": "interpolate", + "value": "4" + }, + { + "desc": "Frame interpolation, Preserve timestamp", + "name": "interpolate-preserve-ts", + "value": "5" + }, + { + "desc": "Frame interpolation, Distribute timestamp", + "name": "interpolate-distribute-ts", + "value": "6" + } + ] + }, + "GstMsdkVPPMirroring": { + "kind": "enum", + "values": [ + { + "desc": "Disable mirroring", + "name": "disable", + "value": "0" + }, + { + "desc": "Horizontal Mirroring", + "name": "horizontal", + "value": "1" + }, + { + "desc": "Vertical Mirroring", + "name": "vertical", + "value": "2" + } + ] + }, + "GstMsdkVPPRotation": { + "kind": "enum", + "values": [ + { + "desc": "Unrotated mode", + "name": "0", + "value": "0" + }, + { + "desc": "Rotated by 90°", + "name": "90", + "value": "90" + }, + { + "desc": "Rotated 
by 180°", + "name": "180", + "value": "180" + }, + { + "desc": "Rotated by 270°", + "name": "270", + "value": "270" + } + ] + }, + "GstMsdkVPPScalingMode": { + "kind": "enum", + "values": [ + { + "desc": "Default Scaling", + "name": "disable", + "value": "0" + }, + { + "desc": "Lowpower Scaling", + "name": "lowpower", + "value": "1" + }, + { + "desc": "High Quality Scaling", + "name": "quality", + "value": "2" + } + ] + } + }, + "package": "GStreamer Bad Plug-ins", + "source": "gst-plugins-bad", + "tracers": {}, + "url": "Unknown package origin" + }, "musepack": { "description": "Musepack decoder", "elements": { @@ -209369,7 +214288,7 @@ "construct": false, "construct-only": false, "controllable": false, - "default": "500000000", + "default": "100000000", "max": "18446744073709551615", "min": "100000000", "mutable": "null", @@ -209859,6 +214778,120 @@ "nvcodec": { "description": "GStreamer NVCODEC plugin", "elements": { + "cudaconvert": { + "author": "Seungha Yang <seungha.yang@navercorp.com>", + "description": "Converts video from one colorspace to another using CUDA", + "hierarchy": [ + "GstCudaConvert", + "GstCudaBaseFilter", + "GstCudaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Converter/Video/Hardware", + "long-name": "CUDA Colorspace converter", + "pad-templates": { + "sink": { + "caps": "video/x-raw(memory:CUDAMemory):\n format: { I420, YV12, NV12, NV21, P010_10LE, P016_LE, I420_10LE, Y444, Y444_16LE, BGRA, RGBA, RGBx, BGRx, ARGB, ABGR, RGB, BGR, BGR10A2_LE, RGB10A2_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:CUDAMemory):\n format: { I420, YV12, NV12, NV21, P010_10LE, P016_LE, I420_10LE, Y444, Y444_16LE, BGRA, RGBA, RGBx, BGRx, ARGB, ABGR, RGB, BGR, BGR10A2_LE, RGB10A2_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n 
framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "cudadownload": { + "author": "Seungha Yang <seungha.yang@navercorp.com>", + "description": "Downloads data from NVIDIA GPU via CUDA APIs", + "hierarchy": [ + "GstCudaDownload", + "GstCudaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Video", + "long-name": "CUDA downloader", + "pad-templates": { + "sink": { + "caps": "video/x-raw(memory:CUDAMemory):\nvideo/x-raw:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "cudascale": { + "author": "Seungha Yang <seungha.yang@navercorp.com>", + "description": "Resizes Video using CUDA", + "hierarchy": [ + "GstCudaScale", + "GstCudaBaseFilter", + "GstCudaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Converter/Video/Scaler/Hardware", + "long-name": "CUDA Video scaler", + "pad-templates": { + "sink": { + "caps": "video/x-raw(memory:CUDAMemory):\n format: { I420, YV12, NV12, NV21, P010_10LE, P016_LE, I420_10LE, Y444, Y444_16LE, BGRA, RGBA, RGBx, BGRx, ARGB, ABGR, RGB, BGR, BGR10A2_LE, RGB10A2_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:CUDAMemory):\n format: { I420, YV12, NV12, NV21, P010_10LE, P016_LE, I420_10LE, Y444, Y444_16LE, BGRA, RGBA, RGBx, BGRx, ARGB, ABGR, RGB, BGR, BGR10A2_LE, RGB10A2_LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "cudaupload": { + "author": "Seungha Yang <seungha.yang@navercorp.com>", + "description": "Uploads data into NVIDIA GPU via 
CUDA APIs", + "hierarchy": [ + "GstCudaUpload", + "GstCudaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Video", + "long-name": "CUDA uploader", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n\nvideo/x-raw(memory:CUDAMemory):\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:CUDAMemory):\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, "nvh264dec": { "author": "Ericsson AB, http://www.ericsson.com, Seungha Yang <seungha.yang@navercorp.com>", "description": "NVDEC video decoder", @@ -209875,12 +214908,12 @@ "long-name": "NVDEC h264 Video Decoder", "pad-templates": { "sink": { - "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)constrained-baseline, (string)baseline, (string)main, (string)high }\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n", + "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)constrained-baseline, (string)baseline, (string)main, (string)high, (string)constrained-high, (string)progressive-high }\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -209907,12 +214940,12 @@ 
"long-name": "NVENC H.264 Video Encoder", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { NV12, YV12, I420, BGRA, RGBA, Y444, VUYA }\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive }\n\nvideo/x-raw(memory:GLMemory):\n format: { NV12, YV12, I420, BGRA, RGBA, Y444, VUYA }\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive }\n", + "caps": "video/x-raw:\n format: { NV12, YV12, I420, BGRA, RGBA, Y444, VUYA }\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive }\n\nvideo/x-raw(memory:GLMemory):\n format: { NV12, YV12, I420, BGRA, RGBA, Y444, VUYA }\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive }\n\nvideo/x-raw(memory:CUDAMemory):\n format: { NV12, YV12, I420, BGRA, RGBA, Y444, VUYA }\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n interlace-mode: { (string)progressive }\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-h264:\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n stream-format: byte-stream\n alignment: au\n profile: { (string)main, (string)high, (string)high-4:4:4, (string)baseline }\n", + "caps": "video/x-h264:\n width: [ 145, 4096 ]\n height: [ 49, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n stream-format: byte-stream\n alignment: au\n profile: { (string)main, (string)high, (string)high-4:4:4, (string)baseline, (string)constrained-baseline }\n", "direction": "src", "presence": "always" } @@ -210028,12 +215061,12 @@ "long-name": "NVDEC H.264 Stateless Decoder", "pad-templates": { "sink": { - "caps": "video/x-h264:\n stream-format: { (string)avc, (string)avc3, (string)byte-stream }\n alignment: au\n profile: { (string)high, (string)main, (string)constrained-baseline, 
(string)baseline }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n", + "caps": "video/x-h264:\n stream-format: { (string)avc, (string)avc3, (string)byte-stream }\n alignment: au\n profile: { (string)high, (string)main, (string)constrained-high, (string)constrained-baseline, (string)baseline }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -210061,7 +215094,7 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n", + "caps": "video/x-raw:\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 144, 8192 ]\n 
height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n", "direction": "src", "presence": "always" } @@ -210088,7 +215121,7 @@ "long-name": "NVENC HEVC Video Encoder", "pad-templates": { "sink": { - "caps": "video/x-raw:\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:GLMemory):\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:GLMemory):\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(memory:CUDAMemory):\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", "presence": "always" }, @@ -210214,7 +215247,7 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n", + "caps": "video/x-raw:\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE 
}\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 144, 8192 ]\n height: [ 144, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE, Y444, Y444_16LE, Y444_16LE }\n", "direction": "src", "presence": "always" } @@ -210242,7 +215275,7 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 64, 32768 ]\n height: [ 64, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 64, 32768 ]\n height: [ 64, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 64, 32768 ]\n height: [ 64, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 64, 32768 ]\n height: [ 64, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 64, 32768 ]\n height: [ 64, 16384 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -210270,7 +215303,7 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -210298,7 +215331,7 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 48, 2032 ]\n height: [ 16, 2032 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 2032 ]\n height: [ 
16, 2032 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 48, 2032 ]\n height: [ 16, 2032 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 2032 ]\n height: [ 16, 2032 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 2032 ]\n height: [ 16, 2032 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -210326,7 +215359,7 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 4080 ]\n height: [ 16, 4080 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", "presence": "always" } @@ -210354,13 +215387,42 @@ "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", "direction": "src", 
"presence": "always" } }, "rank": "primary" }, + "nvvp8sldec": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "NVIDIA VP8 video decoder", + "hierarchy": [ + "GstNvVP8StatelessDec", + "GstNvVp8Dec", + "GstVp8Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "NVDEC VP8 Stateless Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp8:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 48, 4096 ]\n height: [ 16, 4096 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12 }\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "secondary" + }, "nvvp9dec": { "author": "Ericsson AB, http://www.ericsson.com, Seungha Yang <seungha.yang@navercorp.com>", "description": "NVDEC video decoder", @@ -210377,22 +215439,90 @@ "long-name": "NVDEC vp9 Video Decoder", "pad-templates": { "sink": { - "caps": "video/x-vp9:\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n", + "caps": "video/x-vp9:\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n profile: { (string)0, (string)2 }\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n", + "caps": "video/x-raw:\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE 
}\n\nvideo/x-raw(memory:GLMemory):\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n", "direction": "src", "presence": "always" } }, "rank": "primary" + }, + "nvvp9sldec": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "NVIDIA VP9 video decoder", + "hierarchy": [ + "GstNvVP9StatelessDec", + "GstNvVp9Dec", + "GstVp9Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "NVDEC VP9 Stateless Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp9:\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n profile: { (string)0, (string)2 }\n alignment: frame\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n\nvideo/x-raw(memory:GLMemory):\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n\nvideo/x-raw(memory:CUDAMemory):\n width: [ 128, 8192 ]\n height: [ 128, 8192 ]\n framerate: [ 0/1, 2147483647/1 ]\n format: { NV12, P010_10LE, P016_LE }\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "secondary" } }, "filename": "gstnvcodec", "license": "LGPL", "other-types": { + "GstCudaBaseFilter": { + "hierarchy": [ + "GstCudaBaseFilter", + "GstCudaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object" + }, + "GstCudaBaseTransform": { + "hierarchy": [ + "GstCudaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { 
+ "cuda-device-id": { + "blurb": "Set the GPU device to use for operations (-1 = auto)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "-1", + "max": "2147483647", + "min": "-1", + "mutable": "ready", + "readable": true, + "type": "gint", + "writable": true + } + } + }, "GstNvBaseEnc": { "hierarchy": [ "GstNvBaseEnc", @@ -210754,7 +215884,23 @@ "GInitiallyUnowned", "GObject" ], - "kind": "object" + "kind": "object", + "properties": { + "max-display-delay": { + "blurb": "Improves pipelining of decode with display, 0 means no delay (auto = -1)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "-1", + "max": "2147483647", + "min": "-1", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + } + } }, "GstNvDevice0H264Enc": { "hierarchy": [ @@ -210899,62 +216045,32 @@ "value": "7" } ] - } - }, - "package": "GStreamer Bad Plug-ins", - "source": "gst-plugins-bad", - "tracers": {}, - "url": "Unknown package origin" - }, - "ofa": { - "description": "Calculate MusicIP fingerprint from audio files", - "elements": { - "ofa": { - "author": "Milosz Derezynski <internalerror@gmail.com>, Eric Buehl <eric.buehl@gmail.com>", - "description": "Find a music fingerprint using MusicIP's libofa", + }, + "GstNvVp8Dec": { "hierarchy": [ - "GstOFA", - "GstAudioFilter", - "GstBaseTransform", + "GstNvVp8Dec", + "GstVp8Decoder", + "GstVideoDecoder", "GstElement", "GstObject", "GInitiallyUnowned", "GObject" ], - "klass": "MusicIP Fingerprinting element", - "long-name": "OFA", - "pad-templates": { - "sink": { - "caps": "audio/x-raw:\n format: { S16LE, S16BE }\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2 ]\n", - "direction": "sink", - "presence": "always" - }, - "src": { - "caps": "audio/x-raw:\n format: { S16LE, S16BE }\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2 ]\n", - "direction": "src", - "presence": "always" - } - 
}, - "properties": { - "fingerprint": { - "blurb": "Resulting fingerprint", - "conditionally-available": false, - "construct": false, - "construct-only": false, - "controllable": false, - "default": "NULL", - "mutable": "null", - "readable": true, - "type": "gchararray", - "writable": false - } - }, - "rank": "none" + "kind": "object" + }, + "GstNvVp9Dec": { + "hierarchy": [ + "GstNvVp9Dec", + "GstVp9Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object" } }, - "filename": "gstofa", - "license": "GPL", - "other-types": {}, "package": "GStreamer Bad Plug-ins", "source": "gst-plugins-bad", "tracers": {}, @@ -211114,7 +216230,7 @@ "elements": { "cameracalibrate": { "author": "Philippe Renon <philippe_renon@yahoo.fr>", - "description": "Performs camera calibration", + "description": "Performs camera calibration by having it point at a chessboard pattern using upstream/downstream cameraundistort", "hierarchy": [ "GstCameraCalibrate", "GstOpencvVideoFilter", @@ -211155,7 +216271,7 @@ "writable": true }, "board-height": { - "blurb": "The board height in number of items", + "blurb": "The board height in number of items (e.g. number of squares for chessboard)", "conditionally-available": false, "construct": false, "construct-only": false, @@ -211169,7 +216285,7 @@ "writable": true }, "board-width": { - "blurb": "The board width in number of items", + "blurb": "The board width in number of items (e.g. 
number of squares for chessboard)", "conditionally-available": false, "construct": false, "construct-only": false, @@ -211797,6 +216913,117 @@ }, "rank": "none" }, + "cvtracker": { + "author": "Vivek R <123vivekr@gmail.com>", + "description": "Performs object tracking on videos and stores it in video buffer metadata.", + "hierarchy": [ + "GstCVTracker", + "GstOpencvVideoFilter", + "GstVideoFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Effect/Video", + "long-name": "cvtracker", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: RGB\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n format: RGB\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "algorithm": { + "blurb": "Algorithm for tracking objects", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "MedianFlow (3)", + "mutable": "null", + "readable": true, + "type": "GstOpenCVTrackerAlgorithm", + "writable": true + }, + "draw-rect": { + "blurb": "Draw rectangle around tracked object", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "object-initial-height": { + "blurb": "Track object box's initial height", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "50", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "object-initial-width": { + "blurb": "Track object box's initial width", + 
"conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "50", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "object-initial-x": { + "blurb": "Track object box's initial X coordinate", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "50", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "object-initial-y": { + "blurb": "Track object box's initial Y coordinate", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "50", + "max": "-1", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + } + }, + "rank": "none" + }, "dewarp": { "author": "Nicola Murino <nicola.murino@gmail.com>", "description": "Dewarp fisheye images", @@ -212986,6 +218213,20 @@ } }, "properties": { + "gain": { + "blurb": "Gain", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "128", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, "method": { "blurb": "Retinex method to use", "conditionally-available": false, @@ -212998,6 +218239,20 @@ "type": "GstRetinexMethod", "writable": true }, + "offset": { + "blurb": "Offset", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "128", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, "scales": { "blurb": "Amount of gaussian filters (scales) used in multiscale retinex", "conditionally-available": false, @@ -213011,6 +218266,20 @@ "readable": true, "type": "gint", "writable": true + }, + "sigma": { + 
"blurb": "Sigma", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "14", + "max": "1.79769e+308", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gdouble", + "writable": true } }, "rank": "none" @@ -213371,6 +218640,46 @@ } ] }, + "GstOpenCVTrackerAlgorithm": { + "kind": "enum", + "values": [ + { + "desc": "the Boosting tracker", + "name": "Boosting", + "value": "0" + }, + { + "desc": "the CSRT tracker", + "name": "CSRT", + "value": "1" + }, + { + "desc": "the KCF (Kernelized Correlation Filter) tracker", + "name": "KCF", + "value": "2" + }, + { + "desc": "the Median Flow tracker", + "name": "MedianFlow", + "value": "3" + }, + { + "desc": "the MIL tracker", + "name": "MIL", + "value": "4" + }, + { + "desc": "the MOSSE (Minimum Output Sum of Squared Error) tracker", + "name": "MOSSE", + "value": "5" + }, + { + "desc": "the TLD (Tracking, learning and detection) tracker", + "name": "TLD", + "value": "6" + } + ] + }, "GstOpencvFaceBlurFlags": { "kind": "flags", "values": [ @@ -213505,7 +218814,7 @@ "long-name": "OpenH264 video decoder", "pad-templates": { "sink": { - "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)constrained-baseline, (string)baseline, (string)main, (string)high }\n", + "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)constrained-baseline, (string)baseline, (string)main, (string)high, (string)constrained-high, (string)progressive-high }\n", "direction": "sink", "presence": "always" }, @@ -213541,7 +218850,7 @@ "presence": "always" }, "src": { - "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: baseline\n", + "caps": "video/x-h264:\n stream-format: byte-stream\n alignment: au\n profile: { (string)constrained-baseline, (string)baseline, (string)main, (string)constrained-high, (string)high }\n", "direction": "src", "presence": "always" } @@ -213905,7 
+219214,7 @@ "long-name": "OpenJPEG JPEG2000 decoder", "pad-templates": { "sink": { - "caps": "image/x-j2c:\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\nimage/x-jpc:\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\nimage/jp2:\n", + "caps": "image/x-j2c:\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\nimage/x-jpc:\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\nimage/jp2:\nimage/x-jpc-striped:\n num-stripes: [ 2, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n", "direction": "sink", "presence": "always" }, @@ -213916,8 +219225,22 @@ } }, "properties": { + "max-slice-threads": { + "blurb": "Maximum number of worker threads to spawn according to the frame boundary. (0 = no thread)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint", + "writable": true + }, "max-threads": { - "blurb": "Maximum number of worker threads to spawn. (0 = auto)", + "blurb": "Maximum number of worker threads to spawn used by openjpeg internally. 
(0 = no thread)", "conditionally-available": false, "construct": false, "construct-only": false, @@ -213956,7 +219279,7 @@ "presence": "always" }, "src": { - "caps": "image/x-j2c:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n num-components: [ 1, 4 ]\n num-stripes: [ 1, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\nimage/x-jpc:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n num-components: [ 1, 4 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\nimage/jp2:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n", + "caps": "image/x-j2c:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n num-components: [ 1, 4 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\nimage/x-jpc:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n num-components: [ 1, 4 ]\n num-stripes: [ 1, 2147483647 ]\n alignment: { (string)frame, (string)stripe }\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\nimage/jp2:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\nimage/x-jpc-striped:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n num-components: [ 1, 4 ]\n sampling: { 
(string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\n num-stripes: [ 2, 2147483647 ]\n stripe-height: [ 1, 2147483647 ]\n", "direction": "src", "presence": "always" } @@ -214004,6 +219327,20 @@ "type": "gint", "writable": true }, + "num-threads": { + "blurb": "Max number of simultaneous threads to encode stripe or frame, default: encode with streaming thread.", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "2147483647", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, "progression-order": { "blurb": "Progression order", "conditionally-available": false, @@ -214767,6 +220104,238 @@ "tracers": {}, "url": "Unknown package origin" }, + "qroverlay": { + "description": "libqrencode qroverlay plugin", + "elements": { + "debugqroverlay": { + "author": "Anthony Violo <anthony.violo@ubicast.eu>", + "description": "Overlay Qrcodes over each buffer with buffer information and custom data", + "hierarchy": [ + "GstDebugQROverlay", + "GstBaseQROverlay", + "GstVideoFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Qrcode overlay containing buffer information", + "long-name": "qroverlay", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { I420 }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 16, 2147483647 ]\n height: [ 16, 2147483647 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n format: { I420 }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 16, 2147483647 ]\n height: [ 16, 2147483647 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "extra-data-array": { + "blurb": "List of comma 
separated values that the extra data value will be cycled from at each interval, example array structure : \"240,480,720,960,1200,1440,1680,1920\"", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": true + }, + "extra-data-interval-buffers": { + "blurb": "Extra data append into the Qrcode at the first buffer of each interval", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "60", + "max": "9223372036854775807", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": true + }, + "extra-data-name": { + "blurb": "Json key name for extra append data", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": true + }, + "extra-data-span-buffers": { + "blurb": "Numbers of consecutive buffers that the extra data will be inserted (counting the first buffer)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "1", + "max": "9223372036854775807", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gint64", + "writable": true + } + }, + "rank": "none" + }, + "qroverlay": { + "author": "Thibault Saunier <tsaunier@igalia.com>", + "description": "Overlay Qrcodes over each buffer with data passed in", + "hierarchy": [ + "GstQROverlay", + "GstBaseQROverlay", + "GstVideoFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Qrcode overlay containing random data", + "long-name": "qroverlay", + "pad-templates": { + "sink": { + "caps": "video/x-raw:\n format: { I420 }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 16, 2147483647 
]\n height: [ 16, 2147483647 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw:\n format: { I420 }\n framerate: [ 0/1, 2147483647/1 ]\n width: [ 16, 2147483647 ]\n height: [ 16, 2147483647 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "data": { + "blurb": "Data to write in the QRCode to be overlaid", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": true, + "default": "NULL", + "mutable": "playing", + "readable": true, + "type": "gchararray", + "writable": true + } + }, + "rank": "none" + } + }, + "filename": "gstqroverlay", + "license": "LGPL", + "other-types": { + "GstBaseQROverlay": { + "hierarchy": [ + "GstBaseQROverlay", + "GstVideoFilter", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object", + "properties": { + "pixel-size": { + "blurb": "Pixel size of each Qrcode pixel", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "3", + "max": "100", + "min": "1", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "qrcode-error-correction": { + "blurb": "qrcode-error-correction", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "Approx 15%% (1)", + "mutable": "null", + "readable": true, + "type": "GstQrcodeOverlayCorrection", + "writable": true + }, + "x": { + "blurb": "X position (in percent of the width)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "50", + "max": "100", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + }, + "y": { + "blurb": "Y position (in percent of the height)", + "conditionally-available": false, + "construct": false, + "construct-only": false, + 
"controllable": false, + "default": "50", + "max": "100", + "min": "0", + "mutable": "null", + "readable": true, + "type": "gfloat", + "writable": true + } + } + }, + "GstQrcodeOverlayCorrection": { + "kind": "enum", + "values": [ + { + "desc": "Level L", + "name": "Approx 7%%", + "value": "0" + }, + { + "desc": "Level M", + "name": "Approx 15%%", + "value": "1" + }, + { + "desc": "Level Q", + "name": "Approx 25%%", + "value": "2" + }, + { + "desc": "Level H", + "name": "Approx 30%%", + "value": "3" + } + ] + } + }, + "package": "GStreamer Bad Plug-ins", + "source": "gst-plugins-bad", + "tracers": {}, + "url": "Unknown package origin" + }, "removesilence": { "description": "Removes silence from an audio stream", "elements": { @@ -214932,7 +220501,7 @@ "presence": "sometimes" }, "video": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, 
A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "sometimes" } @@ -216303,6 +221872,18 @@ "readable": true, "type": "GstStructure", "writable": false + }, + "stop-commands": { + "blurb": "RTMP commands to send on EOS event before closing connection", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "deletestream+fcunpublish", + "mutable": "null", + "readable": true, + "type": "GstRtmpStopCommands", + "writable": true } }, "rank": "primary + 1" @@ -216581,6 +222162,31 @@ "value": "1" } ] + }, + "GstRtmpStopCommands": { + "kind": "flags", + "values": [ + { + "desc": "No command", + "name": "none", + "value": "0x00000000" + }, + { + "desc": "FCUnpublish", + "name": "fcunpublish", + "value": "0x00000001" + }, + { + "desc": "closeStream", + "name": "closestream", + "value": "0x00000002" + }, + { + "desc": "deleteStream", + "name": "deletestream", + "value": "0x00000004" + } + ] } }, "package": "GStreamer Bad Plug-ins", @@ -216734,6 +222340,17 @@ "type": "gchararray", "writable": true }, + "caps": { + "blurb": "The caps of the incoming stream", + "conditionally-available": false, + "construct": false, + 
"construct-only": false, + "controllable": false, + "mutable": "null", + "readable": true, + "type": "GstCaps", + "writable": true + }, "encoding-name": { "blurb": "Encoding name use to determine caps parameters", "conditionally-available": false, @@ -219739,12 +225356,12 @@ "presence": "request" }, "sink": { - "caps": "video/x-raw:\n framerate: [ 1/2147483647, 2147483647/1 ]\n", + "caps": "video/x-raw:\n framerate: [ 1/2147483647, 2147483647/1 ]\nclosedcaption/x-cea-608:\n framerate: [ 1/2147483647, 2147483647/1 ]\nclosedcaption/x-cea-708:\n framerate: [ 1/2147483647, 2147483647/1 ]\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "video/x-raw:\n framerate: [ 1/2147483647, 2147483647/1 ]\n", + "caps": "video/x-raw:\n framerate: [ 1/2147483647, 2147483647/1 ]\nclosedcaption/x-cea-608:\n framerate: [ 1/2147483647, 2147483647/1 ]\nclosedcaption/x-cea-708:\n framerate: [ 1/2147483647, 2147483647/1 ]\n", "direction": "src", "presence": "always" } @@ -220380,10 +225997,20 @@ "direction": "sink", "presence": "always" }, + "sink_%%u": { + "caps": "ANY", + "direction": "sink", + "presence": "request" + }, "src": { "caps": "ANY", "direction": "src", "presence": "always" + }, + "src_%%u": { + "caps": "ANY", + "direction": "src", + "presence": "sometimes" } }, "properties": { @@ -220537,7 +226164,29 @@ "writable": true } }, - "rank": "none" + "rank": "none", + "signals": { + "element-setup": { + "args": [ + { + "name": "arg0", + "type": "GstElement" + } + ], + "return-type": "void", + "when": "last" + }, + "source-setup": { + "args": [ + { + "name": "arg0", + "type": "GstElement" + } + ], + "return-type": "void", + "when": "last" + } + } } }, "filename": "gsttranscode", @@ -220591,7 +226240,7 @@ "long-name": "TTML subtitle renderer", "pad-templates": { "src": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, 
BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, 
BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" }, @@ -220601,7 +226250,7 @@ "presence": "always" }, "video_sink": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, 
GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, 
GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n\nvideo/x-raw(ANY):\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "sink", 
"presence": "always" } @@ -220715,12 +226364,12 @@ "presence": "always" }, "vfsrc": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nimage/jpeg:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, 
NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nimage/jpeg:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", "direction": "src", "presence": "always" }, "vidsrc": { - "caps": "video/x-raw:\n format: { AYUV64, ARGB64, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, GBR, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nimage/jpeg:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-h264:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n stream-format: { (string)byte-stream, (string)avc }\n alignment: au\n profile: { (string)high, (string)main, (string)baseline, (string)constrained-baseline }\n", + "caps": "video/x-raw:\n format: { ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, 
ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nimage/jpeg:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-h264:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n stream-format: { (string)byte-stream, (string)avc }\n alignment: au\n profile: { (string)high, (string)main, (string)baseline, (string)constrained-baseline }\n", "direction": "src", "presence": "always" } @@ -221319,6 +226968,76 @@ "va": { "description": "VA-API codecs plugin", "elements": { + "vaav1dec": { + "author": "He Junyan <junyan.he@intel.com>", + "description": "VA-API based AV1 video decoder", + "hierarchy": [ + "GstVaAV1Dec", + "GstAV1Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "VA-API AV1 Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-av1:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12, P010_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { NV12, P010_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 
2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "vadeinterlace": { + "author": "Víctor Jáquez <vjaquez@igalia.com>", + "description": "VA-API based deinterlacer", + "hierarchy": [ + "GstVaDeinterlace", + "GstVaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Filter/Effect/Video/Deinterlace", + "long-name": "VA-API Deinterlacer", + "pad-templates": { + "sink": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, I420, P010_10LE, RGBA, BGRA, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, I420, P010_10LE, RGBA, BGRA, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "method": { + "blurb": "Deinterlace Method", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "bob (1)", + "mutable": "playing", + "readable": true, + "type": "GstVaDeinterlaceMethods", + "writable": true + } + }, + "rank": "none" + }, "vah264dec": { "author": "Víctor Jáquez <vjaquez@igalia.com>", "description": "VA-API based H.264 video decoder", @@ -221346,11 +227065,200 @@ } }, "rank": "none" + }, + "vah265dec": { + "author": "Nicolas Dufresne <nicolas.dufresne@collabora.com>", + 
"description": "VA-API based H.265 video decoder", + "hierarchy": [ + "GstVaH265Dec", + "GstH265Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "VA-API H.265 Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-h265:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12, P010_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { NV12, P010_10LE }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "vampeg2dec": { + "author": "He Junyan <junyan.he@intel.com>", + "description": "VA-API based Mpeg2 video decoder", + "hierarchy": [ + "GstVaMpeg2dec", + "GstMpeg2Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "VA-API Mpeg2 Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-mpeg2:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "vapostproc": { + "author": "Víctor Jáquez <vjaquez@igalia.com>", + "description": "VA-API based video postprocessor", + "hierarchy": [ + "GstVaPostProc", + "GstVaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "interfaces": [ + "GstColorBalance" + ], + "klass": "Filter/Converter/Video/Scaler/Hardware", + "long-name": "VA-API 
Video Postprocessor", + "pad-templates": { + "sink": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, I420, P010_10LE, RGBA, BGRA, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, I420, P010_10LE, RGBA, BGRA, ARGB, ABGR }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "disable-passthrough": { + "blurb": "Forces passing buffers through the postprocessor", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + } + }, + "rank": "none" + }, + "vavp8dec": { + "author": "He Junyan <junyan.he@intel.com>", + "description": "VA-API based VP8 video decoder", + "hierarchy": [ + "GstVaVp8dec", + "GstVp8Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "VA-API VP8 Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp8:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { NV12 }\n width: [ 1, 
2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" + }, + "vavp9dec": { + "author": "Víctor Jáquez <vjaquez@igalia.com>", + "description": "VA-API based VP9 video decoder", + "hierarchy": [ + "GstVaVp9Dec", + "GstVp9Decoder", + "GstVideoDecoder", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Decoder/Video/Hardware", + "long-name": "VA-API VP9 Decoder", + "pad-templates": { + "sink": { + "caps": "video/x-vp9:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-raw(memory:VAMemory):\n format: { NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\nvideo/x-raw:\n format: { NV12 }\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "none" } }, "filename": "gstva", "license": "LGPL", - "other-types": {}, + "other-types": { + "GstVaBaseTransform": { + "hierarchy": [ + "GstVaBaseTransform", + "GstBaseTransform", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "kind": "object" + }, + "GstVaDeinterlaceMethods": { + "kind": "enum", + "values": [ + { + "desc": "Bob: Interpolating missing lines by using the adjacent lines.", + "name": "bob", + "value": "1" + }, + { + "desc": "Adaptive: Interpolating missing lines by using spatial/temporal references.", + "name": "adaptive", + "value": "3" + }, + { + "desc": "Compensation: Recreating missing lines by using motion vector.", + "name": "compensated", + "value": "4" + } + ] + } + }, "package": "GStreamer Bad Plug-ins", "source": "gst-plugins-bad", "tracers": {}, @@ -221521,6 +227429,33 @@ "videoparsersbad": { "description": "videoparsers", "elements": { + "av1parse": { + "author": "He Junyan <junyan.he@intel.com>", + "description": "Parses AV1 streams", + "hierarchy": [ + "GstAV1Parse", 
+ "GstBaseParse", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Parser/Converter/Video", + "long-name": "AV1 parser", + "pad-templates": { + "sink": { + "caps": "video/x-av1:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-av1:\n parsed: true\n stream-format: { (string)obu-stream, (string)annexb }\n alignment: { (string)obu, (string)tu, (string)frame }\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "secondary" + }, "diracparse": { "author": "David Schleef <ds@schleef.org>", "description": "Parses Dirac streams", @@ -221690,12 +227625,12 @@ "long-name": "JPEG 2000 parser", "pad-templates": { "sink": { - "caps": "image/jp2:\nimage/x-jpc:\nimage/x-j2c:\n", + "caps": "image/jp2:\nimage/x-jpc:\nimage/x-j2c:\nimage/x-jpc-striped:\n", "direction": "sink", "presence": "always" }, "src": { - "caps": "image/x-jpc:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\n profile: [ 0, 49151 ]\n parsed: true\nimage/x-j2c:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\n profile: [ 0, 49151 ]\n parsed: true\n", + "caps": "image/x-jpc:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\n profile: 
[ 0, 49151 ]\n parsed: true\nimage/x-j2c:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\n profile: [ 0, 49151 ]\n parsed: true\nimage/x-jpc-striped:\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n sampling: { (string)RGB, (string)BGR, (string)RGBA, (string)BGRA, (string)YCbCr-4:4:4, (string)YCbCr-4:2:2, (string)YCbCr-4:2:0, (string)YCbCr-4:1:1, (string)YCbCr-4:1:0, (string)GRAYSCALE, (string)YCbCrA-4:4:4:4 }\n colorspace: { (string)sRGB, (string)sYUV, (string)GRAY }\n profile: [ 0, 49151 ]\n num-stripes: [ 2, 2147483647 ]\n parsed: true\n", "direction": "src", "presence": "always" } @@ -221866,6 +227801,33 @@ }, "properties": {}, "rank": "none" + }, + "vp9parse": { + "author": "Seungha Yang <seungha@centricular.com>", + "description": "Parses VP9 streams", + "hierarchy": [ + "GstVp9Parse", + "GstBaseParse", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], + "klass": "Codec/Parser/Converter/Video", + "long-name": "VP9 parser", + "pad-templates": { + "sink": { + "caps": "video/x-vp9:\n", + "direction": "sink", + "presence": "always" + }, + "src": { + "caps": "video/x-vp9:\n parsed: true\n alignment: { (string)super-frame, (string)frame }\n", + "direction": "src", + "presence": "always" + } + }, + "rank": "secondary" } }, "filename": "gstvideoparsersbad", @@ -222575,7 +228537,7 @@ }, "vulkansink": { "author": "Matthew Waters <matthew@centricular.com>", - "description": "A videosink based on OpenGL", + "description": "A videosink based on Vulkan", "hierarchy": [ "GstVulkanSink", "GstVideoSink", @@ -223025,7 +228987,6 @@ "description": "Stream audio to an audio capture device through WASAPI", "hierarchy": [ "GstWasapi2Sink", - "GstAudioSink", 
"GstAudioBaseSink", "GstBaseSink", "GstElement", @@ -223108,14 +229069,13 @@ "writable": true } }, - "rank": "secondary" + "rank": "primary + 1" }, "wasapi2src": { "author": "Nirbheek Chauhan <nirbheek@centricular.com>, Ole André Vadla Ravnås <ole.andre.ravnas@tandberg.com>, Seungha Yang <seungha@centricular.com>", "description": "Stream audio from an audio capture device through WASAPI", "hierarchy": [ "GstWasapi2Src", - "GstAudioSrc", "GstAudioBaseSrc", "GstPushSrc", "GstBaseSrc", @@ -223160,6 +229120,18 @@ "type": "gpointer", "writable": true }, + "loopback": { + "blurb": "Open render device for loopback recording", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "false", + "mutable": "ready", + "readable": true, + "type": "gboolean", + "writable": true + }, "low-latency": { "blurb": "Optimize all settings for lowest latency. Always safe to enable.", "conditionally-available": false, @@ -223199,7 +229171,7 @@ "writable": true } }, - "rank": "secondary" + "rank": "primary + 1" } }, "filename": "gstwasapi2", @@ -223612,7 +229584,7 @@ "construct": false, "construct-only": false, "controllable": false, - "default": "0", + "default": "200", "max": "-1", "min": "0", "mutable": "null", @@ -223664,6 +229636,17 @@ "type": "GstWebRTCSessionDescription", "writable": false }, + "sctp-transport": { + "blurb": "The WebRTC SCTP Transport", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "mutable": "null", + "readable": true, + "type": "GstWebRTCSCTPTransport", + "writable": false + }, "signaling-state": { "blurb": "The signaling state of this element", "conditionally-available": false, @@ -223964,6 +229947,34 @@ "readable": true, "type": "gboolean", "writable": true + }, + "max-rtp-port": { + "blurb": "Maximum port for local rtp port range. 
max-rtp-port must be >= min-rtp-port", + "conditionally-available": false, + "construct": true, + "construct-only": false, + "controllable": false, + "default": "65535", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true + }, + "min-rtp-port": { + "blurb": "Minimum port for local rtp port range. min-rtp-port must be <= max-rtp-port", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "0", + "max": "65535", + "min": "0", + "mutable": "null", + "readable": true, + "type": "guint", + "writable": true } }, "signals": { @@ -224921,20 +230932,102 @@ "elements": { "wpesrc": { "author": "Philippe Normand <philn@igalia.com>, Žan Doberšek <zdobersek@igalia.com>", - "description": "Creates a video stream from a WPE browser", + "description": "Creates Audio/Video streams from a web page using WPE web engine", "hierarchy": [ "GstWpeSrc", - "GstGLBaseSrc", - "GstPushSrc", - "GstBaseSrc", + "GstBin", "GstElement", "GstObject", "GInitiallyUnowned", "GObject" ], "interfaces": [ + "GstChildProxy", "GstURIHandler" ], + "klass": "Source/Video/Audio", + "long-name": "WPE source", + "pad-templates": { + "audio_%%u": { + "caps": "audio/x-raw:\n format: F32LE\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2147483647 ]\n layout: interleaved\naudio/x-raw:\n format: F64LE\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2147483647 ]\n layout: interleaved\naudio/x-raw:\n format: S16LE\n rate: [ 1, 2147483647 ]\n channels: [ 1, 2147483647 ]\n layout: interleaved\n", + "direction": "src", + "presence": "sometimes" + }, + "src": { + "caps": "video/x-raw(memory:GLMemory):\n format: RGBA\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\npixel-aspect-ratio: 1/1\n texture-target: 2D\nvideo/x-raw:\n format: BGRA\n", + "direction": "src", + "presence": "always" + }, + "video": { + "caps": "video/x-raw(memory:GLMemory):\n format: RGBA\n 
width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\npixel-aspect-ratio: 1/1\n texture-target: 2D\nvideo/x-raw:\n format: BGRA\n width: [ 1, 2147483647 ]\n height: [ 1, 2147483647 ]\n framerate: [ 0/1, 2147483647/1 ]\npixel-aspect-ratio: 1/1\n", + "direction": "src", + "presence": "always" + } + }, + "properties": { + "draw-background": { + "blurb": "Whether to draw the WebView background", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "true", + "mutable": "null", + "readable": true, + "type": "gboolean", + "writable": true + }, + "location": { + "blurb": "The URL to display", + "conditionally-available": false, + "construct": false, + "construct-only": false, + "controllable": false, + "default": "NULL", + "mutable": "null", + "readable": true, + "type": "gchararray", + "writable": true + } + }, + "rank": "none", + "signals": { + "configure-web-view": { + "args": [ + { + "name": "arg0", + "type": "GObject" + } + ], + "return-type": "void", + "when": "last" + }, + "load-bytes": { + "action": true, + "args": [ + { + "name": "arg0", + "type": "GBytes" + } + ], + "return-type": "void", + "when": "last" + } + } + }, + "wpevideosrc": { + "author": "Philippe Normand <philn@igalia.com>, Žan Doberšek <zdobersek@igalia.com>", + "description": "Creates a video stream from a WPE browser", + "hierarchy": [ + "GstWpeVideoSrc", + "GstGLBaseSrc", + "GstPushSrc", + "GstBaseSrc", + "GstElement", + "GstObject", + "GInitiallyUnowned", + "GObject" + ], "klass": "Source/Video", "long-name": "WPE source", "pad-templates": {
View file
gst-plugins-bad-1.20.1.tar.xz/ext/aes
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaes.c
Added
@@ -0,0 +1,54 @@ +/* + * GStreamer gstreamer-aes + * + * Copyright, 2021 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * gstaes.c + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstaesdec.h" +#include "gstaesenc.h" + +/** + * SECTION:plugin-aes + * + * AES encryption and decryption + * + * See also: @aesenc, @aesdec + * Since: 1.20 + */ + + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean success = GST_ELEMENT_REGISTER (aesenc, plugin); + success = GST_ELEMENT_REGISTER (aesdec, plugin) || success; + + return success; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + aes, + "AES encryption/decryption plugin", + plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaesdec.c
Added
@@ -0,0 +1,625 @@ +/* + * GStreamer gstreamer-aesdec + * + * Copyright, LCC (C) 2015 RidgeRun, LCC <carsten.behling@ridgerun.com> + * Copyright, LCC (C) 2016 RidgeRun, LCC <jose.jimenez@ridgerun.com> + * Copyright (C) 2020 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. 
+ * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1335, USA. + */ + +/** + * SECTION:element-aesdec + * + * AES decryption + * + * ## Example + * + * |[ + * echo "This is an AES crypto test ... " > plain.txt && \ + * gst-launch-1.0 filesrc location=plain.txt ! \ + * aesenc key=1f9423681beb9a79215820f6bda73d0f iv=e9aa8e834d8d70b7e0d254ff670dd718 ! \ + * aesdec key=1f9423681beb9a79215820f6bda73d0f iv=e9aa8e834d8d70b7e0d254ff670dd718 ! \ + * filesink location=dec.txt && \ + * cat dec.txt + * + * ]| + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include <gst/gst.h> +#include <gst/base/gstbasetransform.h> +#include <string.h> +#include "gstaeshelper.h" +#include "gstaesdec.h" + +GST_DEBUG_CATEGORY_STATIC (gst_aes_dec_debug); +#define GST_CAT_DEFAULT gst_aes_dec_debug +G_DEFINE_TYPE_WITH_CODE (GstAesDec, gst_aes_dec, GST_TYPE_BASE_TRANSFORM, + GST_DEBUG_CATEGORY_INIT (gst_aes_dec_debug, "aesdec", 0, + "aesdec AES decryption element") + ); +GST_ELEMENT_REGISTER_DEFINE (aesdec, "aesdec", GST_RANK_PRIMARY, + GST_TYPE_AES_DEC); + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static void gst_aes_dec_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_aes_dec_get_property (GObject * object, guint prop_id, + GValue * value, 
GParamSpec * pspec); + +static GstFlowReturn gst_aes_dec_transform (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer * outbuf); +static GstFlowReturn gst_aes_dec_prepare_output_buffer (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer ** outbuf); +static gboolean +gst_aes_dec_sink_event (GstBaseTransform * base, GstEvent * event); + +static gboolean gst_aes_dec_start (GstBaseTransform * base); +static gboolean gst_aes_dec_stop (GstBaseTransform * base); + +/* aes_dec helper functions */ +static gboolean gst_aes_dec_openssl_init (GstAesDec * filter); +static gboolean gst_aes_dec_init_cipher (GstAesDec * filter); +static void gst_aes_dec_finalize (GObject * object); + +/* GObject vmethod implementations */ + +/* initialize class */ +static void +gst_aes_dec_class_init (GstAesDecClass * klass) +{ + GObjectClass *gobject_class; + GstElementClass *gstelement_class; + + gobject_class = (GObjectClass *) klass; + gstelement_class = (GstElementClass *) klass; + + gobject_class->set_property = gst_aes_dec_set_property; + gobject_class->get_property = gst_aes_dec_get_property; + gobject_class->finalize = gst_aes_dec_finalize; + + gst_type_mark_as_plugin_api (GST_TYPE_AES_CIPHER, 0); + + /** + * GstAesDec:cipher + * + * AES cipher mode (key length and mode) + * Currently, 128 and 256 bit keys are supported, + * in "cipher block chaining" (CBC) mode + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_CIPHER, + g_param_spec_enum ("cipher", + "Cipher", + "cipher mode", + GST_TYPE_AES_CIPHER, GST_AES_DEFAULT_CIPHER_MODE, + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstAesDec:serialize-iv + * + * If true, read initialization vector from first 16 bytes of first buffer + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_SERIALIZE_IV, + g_param_spec_boolean ("serialize-iv", "Serialize IV", + "Read initialization vector from first 16 bytes of first buffer", + 
GST_AES_DEFAULT_SERIALIZE_IV, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + + /** + * GstAesDec:per-buffer-padding + * + * If true, each buffer will be padded using PKCS7 padding + * If false, only the final buffer in the stream will be padded + * (by OpenSSL) using PKCS7 + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_PER_BUFFER_PADDING, + g_param_spec_boolean ("per-buffer-padding", "Per buffer padding", + "If true, pad each buffer using PKCS7 padding scheme. Otherwise, only" + "pad final buffer", + GST_AES_PER_BUFFER_PADDING_DEFAULT, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + + /** + * GstAesDec:key + * + * AES encryption key (hexadecimal) + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_KEY, + g_param_spec_string ("key", "Key", + "AES encryption key (in hexadecimal). Length (in bytes) must be equivalent to " + "the number of bits in the key length : " + "16 bytes for AES 128 and 32 bytes for AES 256", + (gchar *) GST_AES_DEFAULT_KEY, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + /** + * GstAesDec:iv + * + * AES encryption initialization vector (hexadecimal) + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_IV, + g_param_spec_string ("iv", "Iv", + "AES encryption initialization vector (in hexadecimal). 
" + "Length must equal AES block length (16 bytes)", + (gchar *) GST_AES_DEFAULT_IV, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + + gst_element_class_set_details_simple (gstelement_class, + "aesdec", + "Generic/Filter", + "AES buffer decryption", + "Rabindra Harlalka <Rabindra.Harlalka@nice.com>"); + + gst_element_class_add_pad_template (gstelement_class, + gst_static_pad_template_get (&src_template)); + gst_element_class_add_pad_template (gstelement_class, + gst_static_pad_template_get (&sink_template)); + + GST_BASE_TRANSFORM_CLASS (klass)->transform = + GST_DEBUG_FUNCPTR (gst_aes_dec_transform); + GST_BASE_TRANSFORM_CLASS (klass)->prepare_output_buffer = + GST_DEBUG_FUNCPTR (gst_aes_dec_prepare_output_buffer); + GST_BASE_TRANSFORM_CLASS (klass)->start = + GST_DEBUG_FUNCPTR (gst_aes_dec_start); + GST_BASE_TRANSFORM_CLASS (klass)->sink_event = + GST_DEBUG_FUNCPTR (gst_aes_dec_sink_event); + GST_BASE_TRANSFORM_CLASS (klass)->stop = GST_DEBUG_FUNCPTR (gst_aes_dec_stop); +} + +/* Initialize element + */ +static void +gst_aes_dec_init (GstAesDec * filter) +{ + GST_INFO_OBJECT (filter, "Initializing plugin"); + filter->cipher = GST_AES_DEFAULT_CIPHER_MODE; + filter->awaiting_first_buffer = TRUE; + filter->per_buffer_padding = GST_AES_PER_BUFFER_PADDING_DEFAULT; + g_mutex_init (&filter->decoder_lock); +} + +static void +gst_aes_dec_finalize (GObject * object) +{ + GstAesDec *filter = GST_AES_DEC (object); + + g_mutex_clear (&filter->decoder_lock); + G_OBJECT_CLASS (gst_aes_dec_parent_class)->finalize (object); +} + + +static void +gst_aes_dec_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstAesDec *filter = GST_AES_DEC (object); + + g_mutex_lock (&filter->decoder_lock); + /* no property may be set after first output buffer is prepared */ + if (filter->locked_properties) { + GST_WARNING_OBJECT (filter, + "Properties cannot be set once buffers begin flowing in element. 
Ignored"); + goto cleanup; + } + + switch (prop_id) { + case PROP_CIPHER: + filter->cipher = g_value_get_enum (value); + filter->evp_cipher = + EVP_get_cipherbyname (gst_aes_cipher_enum_to_string (filter->cipher)); + GST_DEBUG_OBJECT (filter, "cipher: %s", + gst_aes_cipher_enum_to_string (filter->cipher)); + break; + case PROP_SERIALIZE_IV: + filter->serialize_iv = g_value_get_boolean (value); + GST_DEBUG_OBJECT (filter, "serialize iv: %s", + filter->serialize_iv ? "TRUE" : "FALSE"); + break; + case PROP_PER_BUFFER_PADDING: + filter->per_buffer_padding = g_value_get_boolean (value); + GST_DEBUG_OBJECT (filter, "Per buffer padding: %s", + filter->per_buffer_padding ? "TRUE" : "FALSE"); + break; + case PROP_KEY: + { + guint hex_len = gst_aes_hexstring2bytearray (GST_ELEMENT (filter), + g_value_get_string (value), filter->key); + + if (!hex_len) { + GST_ERROR_OBJECT (filter, "invalid key"); + goto cleanup; + } + GST_DEBUG_OBJECT (filter, "key: %s", g_value_get_string (value)); + } + break; + case PROP_IV: + { + gchar iv_string[2 * GST_AES_BLOCK_SIZE + 1]; + guint hex_len = gst_aes_hexstring2bytearray (GST_ELEMENT (filter), + g_value_get_string (value), filter->iv); + + if (hex_len != GST_AES_BLOCK_SIZE) { + GST_ERROR_OBJECT (filter, "invalid initialization vector"); + goto cleanup; + } + GST_DEBUG_OBJECT (filter, "iv: %s", + gst_aes_bytearray2hexstring (filter->iv, iv_string, + GST_AES_BLOCK_SIZE)); + } + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + +cleanup: + g_mutex_unlock (&filter->decoder_lock); +} + +static void +gst_aes_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstAesDec *filter = GST_AES_DEC (object); + + switch (prop_id) { + case PROP_CIPHER: + g_value_set_enum (value, filter->cipher); + break; + case PROP_SERIALIZE_IV: + g_value_set_boolean (value, filter->serialize_iv); + break; + case PROP_PER_BUFFER_PADDING: + g_value_set_boolean (value, 
filter->per_buffer_padding); + break; + case PROP_KEY: + g_value_set_string (value, (gchar *) filter->key); + break; + case PROP_IV: + g_value_set_string (value, (gchar *) filter->iv); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean +gst_aes_dec_sink_event (GstBaseTransform * base, GstEvent * event) +{ + GstAesDec *filter = GST_AES_DEC (base); + + g_mutex_lock (&filter->decoder_lock); + + if (GST_EVENT_TYPE (event) == GST_EVENT_EOS) { + GST_DEBUG_OBJECT (filter, "Received EOS on sink pad"); + if (!filter->per_buffer_padding && !filter->awaiting_first_buffer) { + GstBuffer *outbuf = NULL; + gint len; + GstMapInfo outmap; + + outbuf = gst_buffer_new_allocate (NULL, EVP_MAX_BLOCK_LENGTH, NULL); + if (outbuf == NULL) { + GST_DEBUG_OBJECT (filter, + "Failed to allocate a new buffer of length %d", + EVP_MAX_BLOCK_LENGTH); + goto buffer_fail; + } + if (!gst_buffer_map (outbuf, &outmap, GST_MAP_WRITE)) { + GST_DEBUG_OBJECT (filter, + "gst_buffer_map on outbuf failed for final buffer."); + gst_buffer_unref (outbuf); + goto buffer_fail; + } + if (1 != EVP_CipherFinal_ex (filter->evp_ctx, outmap.data, &len)) { + GST_DEBUG_OBJECT (filter, "Could not finalize openssl encryption"); + gst_buffer_unmap (outbuf, &outmap); + gst_buffer_unref (outbuf); + goto cipher_fail; + } + if (len == 0) { + GST_DEBUG_OBJECT (filter, "Not pushing final buffer as length is 0"); + gst_buffer_unmap (outbuf, &outmap); + gst_buffer_unref (outbuf); + goto out; + } + GST_DEBUG_OBJECT (filter, "Pushing final buffer of length: %d", len); + gst_buffer_unmap (outbuf, &outmap); + gst_buffer_set_size (outbuf, len); + if (gst_pad_push (base->srcpad, outbuf) != GST_FLOW_OK) { + GST_DEBUG_OBJECT (filter, "Failed to push the final buffer"); + goto push_fail; + } + } else { + GST_DEBUG_OBJECT (filter, + "Not pushing final buffer as we didn't have any input"); + } + } + +out: + g_mutex_unlock (&filter->decoder_lock); + + return 
GST_BASE_TRANSFORM_CLASS (gst_aes_dec_parent_class)->sink_event (base, + event); + + /* ERROR */ +buffer_fail: + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to allocate or map buffer for writing")); + g_mutex_unlock (&filter->decoder_lock); + + return FALSE; +cipher_fail: + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Cipher finalization failed."), + ("Error while finalizing the stream")); + g_mutex_unlock (&filter->decoder_lock); + + return FALSE; +push_fail: + GST_ELEMENT_ERROR (filter, CORE, PAD, (NULL), + ("Failed to push the final buffer")); + g_mutex_unlock (&filter->decoder_lock); + + return FALSE; +} + +/* GstBaseTransform vmethod implementations */ +static GstFlowReturn +gst_aes_dec_transform (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GstAesDec *filter = GST_AES_DEC (base); + GstFlowReturn ret = GST_FLOW_ERROR; + GstMapInfo inmap, outmap; + guchar *ciphertext; + gint ciphertext_len; + guchar *plaintext; + gint plaintext_len; + guint padding = 0; + + if (!gst_buffer_map (inbuf, &inmap, GST_MAP_READ)) { + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to map buffer for reading")); + goto cleanup; + } + if (!gst_buffer_map (outbuf, &outmap, GST_MAP_WRITE)) { + gst_buffer_unmap (inbuf, &inmap); + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to map buffer for writing")); + goto cleanup; + } + /* DECRYPTING */ + ciphertext = inmap.data; + ciphertext_len = gst_buffer_get_size (inbuf); + if (filter->awaiting_first_buffer) { + if (filter->serialize_iv) { + gchar iv_string[2 * GST_AES_BLOCK_SIZE + 1]; + + if (ciphertext_len < GST_AES_BLOCK_SIZE) { + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Cipher text too short")); + goto cleanup; + } + memcpy (filter->iv, ciphertext, GST_AES_BLOCK_SIZE); + GST_DEBUG_OBJECT (filter, "read serialized iv: %s", + gst_aes_bytearray2hexstring (filter->iv, iv_string, + GST_AES_BLOCK_SIZE)); + ciphertext += GST_AES_BLOCK_SIZE; + 
ciphertext_len -= GST_AES_BLOCK_SIZE; + } + if (!gst_aes_dec_init_cipher (filter)) { + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to initialize cipher")); + goto cleanup; + } + } + plaintext = outmap.data; + + if (!EVP_CipherUpdate (filter->evp_ctx, plaintext, + &plaintext_len, ciphertext, ciphertext_len)) { + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Cipher update failed."), + ("Error while updating openssl cipher")); + goto cleanup; + } else { + if (filter->per_buffer_padding) { + gint k; + + /* sanity check on padding value */ + padding = plaintext[plaintext_len - 1]; + if (padding == 0 || padding > GST_AES_BLOCK_SIZE) { + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Corrupt cipher text."), + ("Illegal PKCS7 padding value %d", padding)); + goto cleanup; + } + for (k = 1; k < padding; ++k) { + if (plaintext[plaintext_len - 1 - k] != padding) { + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Corrupt cipher text."), + ("PKCS7 padding values must all be equal")); + goto cleanup; + } + } + /* remove padding (final block padding) */ + plaintext_len -= padding; + } + if (plaintext_len > 2 * GST_AES_BLOCK_SIZE) + GST_MEMDUMP ("First 32 bytes of plain text", plaintext, + 2 * GST_AES_BLOCK_SIZE); + } + gst_buffer_unmap (inbuf, &inmap); + gst_buffer_unmap (outbuf, &outmap); + + GST_LOG_OBJECT (filter, + "Ciphertext len: %d, Plaintext len: %d, Padding: %d", + ciphertext_len, plaintext_len, padding); + gst_buffer_set_size (outbuf, plaintext_len); + ret = GST_FLOW_OK; + +cleanup: + filter->awaiting_first_buffer = FALSE; + + return ret; +} + +static GstFlowReturn +gst_aes_dec_prepare_output_buffer (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer ** outbuf) +{ + GstAesDec *filter = GST_AES_DEC (base); + GstBaseTransformClass *bclass = GST_BASE_TRANSFORM_GET_CLASS (base); + guint out_size; + + g_mutex_lock (&filter->decoder_lock); + filter->locked_properties = TRUE; + /* we need extra space at end of output buffer + * when we let OpenSSL handle 
PKCS7 padding */ + out_size = (gint) gst_buffer_get_size (inbuf) + + (!filter->per_buffer_padding ? GST_AES_BLOCK_SIZE : 0); + + /* Since serialized IV is stripped from first buffer, + * reduce output buffer size by GST_AES_BLOCK_SIZE in this case */ + if (filter->serialize_iv && filter->awaiting_first_buffer) { + g_assert (gst_buffer_get_size (inbuf) > GST_AES_BLOCK_SIZE); + out_size -= GST_AES_BLOCK_SIZE; + } + g_mutex_unlock (&filter->decoder_lock); + + *outbuf = gst_buffer_new_allocate (NULL, out_size, NULL); + GST_LOG_OBJECT (filter, + "Input buffer size %d,\nAllocating output buffer size: %d", + (gint) gst_buffer_get_size (inbuf), out_size); + bclass->copy_metadata (base, inbuf, *outbuf); + + return GST_FLOW_OK; +} + +static gboolean +gst_aes_dec_start (GstBaseTransform * base) +{ + GstAesDec *filter = GST_AES_DEC (base); + + GST_INFO_OBJECT (filter, "Starting"); + if (!gst_aes_dec_openssl_init (filter)) { + GST_ERROR_OBJECT (filter, "OpenSSL initialization failed"); + return FALSE; + } + + if (!filter->serialize_iv) { + if (!gst_aes_dec_init_cipher (filter)) + return FALSE; + } + GST_INFO_OBJECT (filter, "Start successful"); + + return TRUE; +} + +static gboolean +gst_aes_dec_init_cipher (GstAesDec * filter) +{ + if (!EVP_CipherInit_ex (filter->evp_ctx, filter->evp_cipher, NULL, + filter->key, filter->iv, FALSE)) { + GST_ERROR_OBJECT (filter, "Could not initialize openssl cipher"); + return FALSE; + } + if (filter->per_buffer_padding) { + if (!EVP_CIPHER_CTX_set_padding (filter->evp_ctx, 0)) { + GST_ERROR_OBJECT (filter, "Could not set padding"); + return FALSE; + } + } + + return TRUE; +} + +static gboolean +gst_aes_dec_stop (GstBaseTransform * base) +{ + GstAesDec *filter = GST_AES_DEC (base); + + GST_INFO_OBJECT (filter, "Stopping"); + EVP_CIPHER_CTX_free (filter->evp_ctx); + + return TRUE; +} + +/* AesDec helper functions */ +static gboolean +gst_aes_dec_openssl_init (GstAesDec * filter) +{ + GST_DEBUG_OBJECT (filter, "Initializing with %s", + 
OpenSSL_version (OPENSSL_VERSION)); + + filter->evp_cipher = + EVP_get_cipherbyname (gst_aes_cipher_enum_to_string (filter->cipher)); + if (!filter->evp_cipher) { + GST_ERROR_OBJECT (filter, "Could not get cipher by name from openssl"); + return FALSE; + } + if (!(filter->evp_ctx = EVP_CIPHER_CTX_new ())) + return FALSE; + GST_LOG_OBJECT (filter, "Initialization successful"); + + return TRUE; +}
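The per-buffer PKCS7 sanity check that gst_aes_dec_transform() performs above (pad value must be 1..16 and every padding byte must repeat that value) can be sketched as a standalone routine. This is a minimal illustration, not part of the element's API; the function name is made up here, the block size is hard-coded to the 16-byte AES block the element uses, and an extra `padding > len` bounds guard is added for self-containment:

```c
#include <stddef.h>

#define AES_BLOCK_SIZE 16

/* Return the PKCS7 padding length found at the end of a decrypted
 * buffer, or 0 if the padding is corrupt. Mirrors the checks in
 * gst_aes_dec_transform(): the last byte gives the pad length,
 * which must be 1..16, and every padding byte must carry that
 * same value. */
size_t
pkcs7_unpad_len (const unsigned char *buf, size_t len)
{
  unsigned char padding;
  size_t k;

  if (len == 0)
    return 0;
  padding = buf[len - 1];
  if (padding == 0 || padding > AES_BLOCK_SIZE || padding > len)
    return 0;
  for (k = 1; k < padding; ++k) {
    if (buf[len - 1 - k] != padding)
      return 0;                 /* all padding bytes must be equal */
  }
  return (size_t) padding;
}
```

The decoder subtracts the returned length from plaintext_len; a return of 0 corresponds to the "Corrupt cipher text" error paths above.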
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaesdec.h
Added
@@ -0,0 +1,90 @@ +/* + * GStreamer gstreamer-aesdec + * + * Copyright, LCC (C) 2015 RidgeRun, LCC <carsten.behling@ridgerun.com> + * Copyright, 2020 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1335, USA. + */ + +#ifndef __GST_AES_DEC_H__ +#define __GST_AES_DEC_H__ + +#include "gstaeshelper.h" +#include <gst/base/gstbasetransform.h> + +G_BEGIN_DECLS + +#define GST_TYPE_AES_DEC (gst_aes_dec_get_type()) +G_DECLARE_FINAL_TYPE (GstAesDec, gst_aes_dec, GST, AES_DEC, GstBaseTransform) +#define GST_AES_DEC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_AES_DEC,GstAesDec)) +#define GST_AES_DEC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_AES_DEC,GstAesDecClass)) +#define GST_AES_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_AES_DEC,GstAesDecClass)) +#define GST_IS_AES_DEC(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_AES_DEC)) +#define GST_IS_AES_DEC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_AES_DEC)) + +struct _GstAesDec +{ + GstBaseTransform element; + + /* Properties */ + GstAesCipher cipher; + guchar key[EVP_MAX_KEY_LENGTH]; + guchar iv[GST_AES_BLOCK_SIZE]; + gboolean serialize_iv; + gboolean per_buffer_padding; + + /* Element variables */ + const EVP_CIPHER *evp_cipher; + EVP_CIPHER_CTX *evp_ctx; + gboolean awaiting_first_buffer; + GMutex decoder_lock; + /* if TRUE, then properties cannot be changed */ + gboolean locked_properties; +}; + +struct _GstAesDecClass +{ + GstBaseTransformClass parent_class; +}; + +GST_ELEMENT_REGISTER_DECLARE (aesdec) + +G_END_DECLS +#endif /* __GST_AES_DEC_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaesenc.c
Added
@@ -0,0 +1,594 @@ +/* + * GStreamer gstreamer-aesenc + * + * Copyright, LCC (C) 2015 RidgeRun, LCC <carsten.behling@ridgerun.com> + * Copyright, LCC (C) 2016 RidgeRun, LCC <jose.jimenez@ridgerun.com> + * Copyright (C) 2020 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. 
+ * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1335, USA. + */ + +/** + * SECTION:element-aesenc + * + * AES encryption + * + * ## Example + * + * |[ + * echo "This is an AES crypto test ... " > plain.txt && \ + * gst-launch-1.0 filesrc location=plain.txt ! \ + * aesenc key=1f9423681beb9a79215820f6bda73d0f iv=e9aa8e834d8d70b7e0d254ff670dd718 ! \ + * aesdec key=1f9423681beb9a79215820f6bda73d0f iv=e9aa8e834d8d70b7e0d254ff670dd718 ! \ + * filesink location=dec.txt && \ + * cat dec.txt + * + * ]| + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include <gst/gst.h> +#include <gst/base/gstbasetransform.h> +#include <string.h> +#include "gstaeshelper.h" +#include "gstaesenc.h" + +GST_DEBUG_CATEGORY_STATIC (gst_aes_enc_debug); +#define GST_CAT_DEFAULT gst_aes_enc_debug +G_DEFINE_TYPE_WITH_CODE (GstAesEnc, gst_aes_enc, GST_TYPE_BASE_TRANSFORM, + GST_DEBUG_CATEGORY_INIT (gst_aes_enc_debug, "aesenc", 0, + "aesenc AES encryption element") + ); +GST_ELEMENT_REGISTER_DEFINE (aesenc, "aesenc", GST_RANK_PRIMARY, + GST_TYPE_AES_ENC); + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static void gst_aes_enc_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_aes_enc_get_property (GObject * object, guint prop_id, + GValue * value, 
GParamSpec * pspec); + +static GstFlowReturn gst_aes_enc_transform (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer * outbuf); +static GstFlowReturn gst_aes_enc_prepare_output_buffer (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer ** outbuf); + +static gboolean gst_aes_enc_start (GstBaseTransform * base); +static gboolean gst_aes_enc_stop (GstBaseTransform * base); +static gboolean +gst_aes_enc_sink_event (GstBaseTransform * base, GstEvent * event); + +/* aes_enc helper functions */ +static gboolean gst_aes_enc_openssl_init (GstAesEnc * filter); +static void gst_aes_enc_finalize (GObject * object); + +/* GObject vmethod implementations */ + +/* initialize class */ +static void +gst_aes_enc_class_init (GstAesEncClass * klass) +{ + GObjectClass *gobject_class; + GstElementClass *gstelement_class; + + gobject_class = (GObjectClass *) klass; + gstelement_class = (GstElementClass *) klass; + + gobject_class->set_property = gst_aes_enc_set_property; + gobject_class->get_property = gst_aes_enc_get_property; + gobject_class->finalize = gst_aes_enc_finalize; + + gst_type_mark_as_plugin_api (GST_TYPE_AES_CIPHER, 0); + + /** + * GstAesEnc:cipher + * + * AES cipher mode (key length and mode) + * Currently, 128 and 256 bit keys are supported, + * in "cipher block chaining" (CBC) mode + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_CIPHER, + g_param_spec_enum ("cipher", + "Cipher", + "cipher mode", + GST_TYPE_AES_CIPHER, GST_AES_DEFAULT_CIPHER_MODE, + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstAesEnc:serialize-iv + * + * If true, store initialization vector in first 16 bytes of first buffer + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_SERIALIZE_IV, + g_param_spec_boolean ("serialize-iv", "Serialize IV", + "Store initialization vector in first 16 bytes of first buffer", + GST_AES_DEFAULT_SERIALIZE_IV, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); 
+ + /** + * GstAesEnc:per-buffer-padding + * + * If true, each buffer will be padded using PKCS7 padding + * If false, only the final buffer in the stream will be padded + * (by OpenSSL) using PKCS7 + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_PER_BUFFER_PADDING, + g_param_spec_boolean ("per-buffer-padding", "Per buffer padding", + "If true, pad each buffer using PKCS7 padding scheme. Otherwise, only " + "pad final buffer", + GST_AES_PER_BUFFER_PADDING_DEFAULT, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + + /** + * GstAesEnc:key + * + * AES encryption key (hexadecimal) + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_KEY, + g_param_spec_string ("key", "Key", + "AES encryption key (in hexadecimal). Length (in bytes) must be equivalent to " + "the number of bits in the key length : " + "16 bytes for AES 128 and 32 bytes for AES 256", + (gchar *) GST_AES_DEFAULT_KEY, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + + /** + * GstAesEnc:iv + * + * AES encryption initialization vector (hexadecimal) + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_IV, + g_param_spec_string ("iv", "Iv", + "AES encryption initialization vector (in hexadecimal). 
" + "Length must equal AES block length (16 bytes)", + (gchar *) GST_AES_DEFAULT_IV, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY)); + + gst_element_class_set_details_simple (gstelement_class, + "aesenc", + "Generic/Filter", + "AES buffer encryption", + "Rabindra Harlalka <Rabindra.Harlalka@nice.com>"); + + gst_element_class_add_pad_template (gstelement_class, + gst_static_pad_template_get (&src_template)); + gst_element_class_add_pad_template (gstelement_class, + gst_static_pad_template_get (&sink_template)); + + GST_BASE_TRANSFORM_CLASS (klass)->transform = + GST_DEBUG_FUNCPTR (gst_aes_enc_transform); + GST_BASE_TRANSFORM_CLASS (klass)->prepare_output_buffer = + GST_DEBUG_FUNCPTR (gst_aes_enc_prepare_output_buffer); + GST_BASE_TRANSFORM_CLASS (klass)->start = + GST_DEBUG_FUNCPTR (gst_aes_enc_start); + GST_BASE_TRANSFORM_CLASS (klass)->sink_event = + GST_DEBUG_FUNCPTR (gst_aes_enc_sink_event); + GST_BASE_TRANSFORM_CLASS (klass)->stop = GST_DEBUG_FUNCPTR (gst_aes_enc_stop); +} + +/* Initialize element + */ +static void +gst_aes_enc_init (GstAesEnc * filter) +{ + GST_INFO_OBJECT (filter, "Initializing plugin"); + filter->cipher = GST_AES_DEFAULT_CIPHER_MODE; + filter->awaiting_first_buffer = TRUE; + filter->per_buffer_padding = GST_AES_PER_BUFFER_PADDING_DEFAULT; + g_mutex_init (&filter->encoder_lock); +} + +static void +gst_aes_enc_finalize (GObject * object) +{ + GstAesEnc *filter = GST_AES_ENC (object); + + g_mutex_clear (&filter->encoder_lock); + G_OBJECT_CLASS (gst_aes_enc_parent_class)->finalize (object); +} + +static void +gst_aes_enc_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstAesEnc *filter = GST_AES_ENC (object); + + g_mutex_lock (&filter->encoder_lock); + /* no property may be set after first output buffer is prepared */ + if (filter->locked_properties) { + GST_WARNING_OBJECT (filter, + "Properties cannot be set once buffers begin flowing in element. 
Ignored"); + goto cleanup; + } + switch (prop_id) { + case PROP_CIPHER: + filter->cipher = g_value_get_enum (value); + filter->evp_cipher = + EVP_get_cipherbyname (gst_aes_cipher_enum_to_string (filter->cipher)); + GST_DEBUG_OBJECT (filter, "cipher: %s", + gst_aes_cipher_enum_to_string (filter->cipher)); + break; + case PROP_SERIALIZE_IV: + filter->serialize_iv = g_value_get_boolean (value); + GST_DEBUG_OBJECT (filter, "serialize iv: %s", + filter->serialize_iv ? "TRUE" : "FALSE"); + break; + case PROP_PER_BUFFER_PADDING: + filter->per_buffer_padding = g_value_get_boolean (value); + GST_DEBUG_OBJECT (filter, "Per buffer padding: %s", + filter->per_buffer_padding ? "TRUE" : "FALSE"); + break; + case PROP_KEY: + { + guint hex_len = gst_aes_hexstring2bytearray (GST_ELEMENT (filter), + g_value_get_string (value), filter->key); + + if (!hex_len) { + GST_ERROR_OBJECT (filter, "invalid key"); + goto cleanup; + } + GST_DEBUG_OBJECT (filter, "key: %s", g_value_get_string (value)); + } + break; + case PROP_IV: + { + gchar iv_string[2 * GST_AES_BLOCK_SIZE + 1]; + guint hex_len = gst_aes_hexstring2bytearray (GST_ELEMENT (filter), + g_value_get_string (value), filter->iv); + + if (hex_len != GST_AES_BLOCK_SIZE) { + GST_ERROR_OBJECT (filter, "invalid initialization vector"); + goto cleanup; + } + GST_DEBUG_OBJECT (filter, "iv: %s", + gst_aes_bytearray2hexstring (filter->iv, iv_string, + GST_AES_BLOCK_SIZE)); + } + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + +cleanup: + g_mutex_unlock (&filter->encoder_lock); +} + +static void +gst_aes_enc_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstAesEnc *filter = GST_AES_ENC (object); + + switch (prop_id) { + case PROP_CIPHER: + g_value_set_enum (value, filter->cipher); + break; + case PROP_SERIALIZE_IV: + g_value_set_boolean (value, filter->serialize_iv); + break; + case PROP_PER_BUFFER_PADDING: + g_value_set_boolean (value, 
filter->per_buffer_padding); + break; + case PROP_KEY: + g_value_set_string (value, (gchar *) filter->key); + break; + case PROP_IV: + g_value_set_string (value, (gchar *) filter->iv); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean +gst_aes_enc_sink_event (GstBaseTransform * base, GstEvent * event) +{ + GstAesEnc *filter = GST_AES_ENC (base); + g_mutex_lock (&filter->encoder_lock); + + if (GST_EVENT_TYPE (event) == GST_EVENT_EOS) { + GST_DEBUG_OBJECT (filter, "Received EOS on sink pad"); + if (!filter->per_buffer_padding && !filter->awaiting_first_buffer) { + gint len; + GstBuffer *outbuf; + GstMapInfo outmap; + + outbuf = gst_buffer_new_allocate (NULL, EVP_MAX_BLOCK_LENGTH, NULL); + if (outbuf == NULL) { + GST_DEBUG_OBJECT (filter, + "Failed to allocate a new buffer of length %d", + EVP_MAX_BLOCK_LENGTH); + goto buffer_fail; + } + if (!gst_buffer_map (outbuf, &outmap, GST_MAP_WRITE)) { + GST_DEBUG_OBJECT (filter, + "gst_buffer_map on outbuf failed for final buffer."); + gst_buffer_unref (outbuf); + goto buffer_fail; + } + if (1 != EVP_CipherFinal_ex (filter->evp_ctx, outmap.data, &len)) { + GST_DEBUG_OBJECT (filter, "Could not finalize openssl encryption"); + gst_buffer_unmap (outbuf, &outmap); + gst_buffer_unref (outbuf); + goto cipher_fail; + } + if (len == 0) { + GST_DEBUG_OBJECT (filter, "Not pushing final buffer as length is 0"); + gst_buffer_unmap (outbuf, &outmap); + gst_buffer_unref (outbuf); + goto out; + } + GST_DEBUG_OBJECT (filter, "Pushing final buffer of length: %d", len); + gst_buffer_unmap (outbuf, &outmap); + gst_buffer_set_size (outbuf, len); + if (gst_pad_push (base->srcpad, outbuf) != GST_FLOW_OK) { + GST_DEBUG_OBJECT (filter, "Failed to push the final buffer"); + goto push_fail; + } + } else { + GST_DEBUG_OBJECT (filter, + "Not pushing final buffer as we didn't have any input"); + } + } + +out: + g_mutex_unlock (&filter->encoder_lock); + + return GST_BASE_TRANSFORM_CLASS 
(gst_aes_enc_parent_class)->sink_event (base, + event); + + /* ERROR */ +buffer_fail: + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to allocate or map buffer for writing")); + g_mutex_unlock (&filter->encoder_lock); + + return FALSE; +cipher_fail: + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Cipher finalization failed."), + ("Error while finalizing the stream")); + g_mutex_unlock (&filter->encoder_lock); + + return FALSE; +push_fail: + GST_ELEMENT_ERROR (filter, CORE, PAD, (NULL), + ("Failed to push the final buffer")); + g_mutex_unlock (&filter->encoder_lock); + + return FALSE; +} + +/* GstBaseTransform vmethod implementations */ +static GstFlowReturn +gst_aes_enc_transform (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GstAesEnc *filter = GST_AES_ENC (base); + GstFlowReturn ret = GST_FLOW_ERROR; + GstMapInfo inmap, outmap; + guchar *plaintext; + gint plaintext_len; + guchar *ciphertext; + gint ciphertext_len; + gint out_len; + + if (!gst_buffer_map (inbuf, &inmap, GST_MAP_READ)) { + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to map buffer for reading")); + goto cleanup; + } + if (!gst_buffer_map (outbuf, &outmap, GST_MAP_WRITE)) { + gst_buffer_unmap (inbuf, &inmap); + GST_ELEMENT_ERROR (filter, RESOURCE, FAILED, (NULL), + ("Failed to map buffer for writing")); + goto cleanup; + } + + /* ENCRYPTING */ + plaintext = inmap.data; + plaintext_len = inmap.size; + if (filter->padding) + plaintext_len += filter->padding - GST_AES_BLOCK_SIZE; + ciphertext = outmap.data; + if (filter->awaiting_first_buffer) { + if (!EVP_CipherInit_ex (filter->evp_ctx, filter->evp_cipher, NULL, + filter->key, filter->iv, TRUE)) { + GST_ERROR_OBJECT (filter, "Could not initialize openssl cipher"); + goto cleanup; + } + if (filter->serialize_iv) { + memcpy (ciphertext, filter->iv, GST_AES_BLOCK_SIZE); + ciphertext += GST_AES_BLOCK_SIZE; + } + } + + /* encrypt unpadded buffer */ + if (!EVP_CipherUpdate (filter->evp_ctx, 
ciphertext, + &ciphertext_len, plaintext, plaintext_len)) { + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Cipher update failed."), + ("Error while updating openssl cipher")); + goto cleanup; + } else if (filter->padding) { + gint temp; + guint k; + + /* PKCS7 padding */ + memset (filter->padded_block, filter->padding, GST_AES_BLOCK_SIZE); + for (k = 0; k < GST_AES_BLOCK_SIZE - filter->padding; ++k) + filter->padded_block[k] = plaintext[plaintext_len + k]; + + /* encrypt padding buffer */ + if (!EVP_CipherUpdate (filter->evp_ctx, + ciphertext + ciphertext_len, &temp, + filter->padded_block, GST_AES_BLOCK_SIZE)) { + GST_ELEMENT_ERROR (filter, STREAM, FAILED, ("Cipher update failed."), + ("Error while updating openssl cipher")); + goto cleanup; + } else { + g_assert (temp == GST_AES_BLOCK_SIZE); + ciphertext_len += GST_AES_BLOCK_SIZE; + plaintext_len += GST_AES_BLOCK_SIZE; + } + } + gst_buffer_unmap (inbuf, &inmap); + gst_buffer_unmap (outbuf, &outmap); + + out_len = ciphertext_len + (filter->serialize_iv ? 
GST_AES_BLOCK_SIZE : 0); + gst_buffer_set_size (outbuf, out_len); + GST_LOG_OBJECT (filter, + "plaintext len: %d, ciphertext len: %d, padding: %d, output buffer length: %d", + plaintext_len, ciphertext_len, filter->padding, out_len); + ret = GST_FLOW_OK; + +cleanup: + filter->awaiting_first_buffer = FALSE; + + return ret; +} + +static GstFlowReturn +gst_aes_enc_prepare_output_buffer (GstBaseTransform * base, + GstBuffer * inbuf, GstBuffer ** outbuf) +{ + GstAesEnc *filter = GST_AES_ENC (base); + GstBaseTransformClass *bclass = GST_BASE_TRANSFORM_GET_CLASS (base); + guint out_size = (guint) gst_buffer_get_size (inbuf); + + g_mutex_lock (&filter->encoder_lock); + filter->locked_properties = TRUE; + if (filter->per_buffer_padding) { + /* pad to multiple of GST_AES_BLOCK_SIZE */ + filter->padding = + GST_AES_BLOCK_SIZE - (out_size & (GST_AES_BLOCK_SIZE - 1)); + out_size += filter->padding; + } else { + /* we need extra space at end of output buffer + * when we let OpenSSL handle PKCS7 padding */ + out_size += GST_AES_BLOCK_SIZE; + } + + /* add room for serialized IV at beginning of first output buffer */ + if (filter->serialize_iv && filter->awaiting_first_buffer) + out_size += GST_AES_BLOCK_SIZE; + g_mutex_unlock (&filter->encoder_lock); + + GST_LOG_OBJECT (filter, + "Input buffer size %d, output buffer size: %d. 
padding : %d", + (guint) gst_buffer_get_size (inbuf), out_size, filter->padding); + *outbuf = gst_buffer_new_allocate (NULL, out_size, NULL); + bclass->copy_metadata (base, inbuf, *outbuf); + + return GST_FLOW_OK; +} + +static gboolean +gst_aes_enc_start (GstBaseTransform * base) +{ + GstAesEnc *filter = GST_AES_ENC (base); + + GST_INFO_OBJECT (filter, "Starting"); + if (!gst_aes_enc_openssl_init (filter)) { + GST_ERROR_OBJECT (filter, "OpenSSL initialization failed"); + return FALSE; + } + + GST_INFO_OBJECT (filter, "Start successful"); + + return TRUE; +} + +static gboolean +gst_aes_enc_stop (GstBaseTransform * base) +{ + GstAesEnc *filter = GST_AES_ENC (base); + + GST_INFO_OBJECT (filter, "Stopping"); + EVP_CIPHER_CTX_free (filter->evp_ctx); + + return TRUE; +} + +/* AesEnc helper functions */ +static gboolean +gst_aes_enc_openssl_init (GstAesEnc * filter) +{ + GST_DEBUG_OBJECT (filter, "Initializing with %s", + OpenSSL_version (OPENSSL_VERSION)); + + filter->evp_cipher = + EVP_get_cipherbyname (gst_aes_cipher_enum_to_string (filter->cipher)); + if (!filter->evp_cipher) { + GST_ERROR_OBJECT (filter, "Could not get cipher by name from openssl"); + return FALSE; + } + if (!(filter->evp_ctx = EVP_CIPHER_CTX_new ())) + return FALSE; + GST_LOG_OBJECT (filter, "Initialization successful"); + + return TRUE; +}
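The per-buffer padding size computed in gst_aes_enc_prepare_output_buffer() above follows standard PKCS7: the pad is always 1..16 bytes, with a full extra block added when the input is already block-aligned. A minimal sketch of that computation (illustrative name, not the element's API):

```c
#include <stddef.h>

#define AES_BLOCK_SIZE 16

/* PKCS7 padding length for a given input size, as computed in
 * gst_aes_enc_prepare_output_buffer(): never 0, at most a full
 * block, so in_size + pad is always a multiple of 16. */
size_t
pkcs7_pad_len (size_t in_size)
{
  return AES_BLOCK_SIZE - (in_size & (AES_BLOCK_SIZE - 1));
}
```

This is why a block-aligned input still grows by one block: the pad length must always be recoverable from the final byte, so a zero-length pad is never emitted.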
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaesenc.h
Added
@@ -0,0 +1,92 @@ +/* + * GStreamer gstreamer-aesenc + * + * Copyright, LCC (C) 2015 RidgeRun, LCC <carsten.behling@ridgerun.com> + * Copyright, 2020 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1335, USA. + */ + +#ifndef __GST_AES_ENC_H__ +#define __GST_AES_ENC_H__ + +#include "gstaeshelper.h" +#include <gst/base/gstbasetransform.h> + +G_BEGIN_DECLS + +#define GST_TYPE_AES_ENC (gst_aes_enc_get_type()) +G_DECLARE_FINAL_TYPE (GstAesEnc, gst_aes_enc, GST, AES_ENC, GstBaseTransform) +#define GST_AES_ENC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_AES_ENC,GstAesEnc)) +#define GST_AES_ENC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_AES_ENC,GstAesEncClass)) +#define GST_AES_ENC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_AES_ENC,GstAesEncClass)) +#define GST_IS_AES_ENC(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_AES_ENC)) +#define GST_IS_AES_ENC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_AES_ENC)) + +struct _GstAesEnc +{ + GstBaseTransform element; + + /* Properties */ + GstAesCipher cipher; + guchar key[EVP_MAX_KEY_LENGTH]; + guchar iv[GST_AES_BLOCK_SIZE]; + gboolean serialize_iv; + gboolean per_buffer_padding; + + /* Element variables */ + const EVP_CIPHER *evp_cipher; + EVP_CIPHER_CTX *evp_ctx; + guchar padding; + guchar padded_block[GST_AES_BLOCK_SIZE]; + gboolean awaiting_first_buffer; + GMutex encoder_lock; + /* if TRUE, then properties cannot be changed */ + gboolean locked_properties; +}; + +struct _GstAesEncClass +{ + GstBaseTransformClass parent_class; +}; + +GST_ELEMENT_REGISTER_DECLARE (aesenc) + +G_END_DECLS +#endif /* __GST_AES_ENC_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaeshelper.c
Added
@@ -0,0 +1,187 @@ +/* + * GStreamer gstreamer-aeshelper + * + * Copyright, LCC (C) 2015 RidgeRun, LCC <carsten.behling@ridgerun.com> + * Copyright, LCC (C) 2016 RidgeRun, LCC <jose.jimenez@ridgerun.com> + * Copyright (C) 2020 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. 
+ * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1335, USA. + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstaeshelper.h" + +GType +gst_aes_cipher_get_type (void) +{ + static GType aes_cipher_type = 0; + + if (g_once_init_enter (&aes_cipher_type)) { + static GEnumValue aes_cipher_types[] = { + {GST_AES_CIPHER_128_CBC, "AES 128 bit cipher key using CBC method", + "aes-128-cbc"}, + {GST_AES_CIPHER_256_CBC, + "AES 256 bit cipher key using CBC method", + "aes-256-cbc"}, + {0, NULL, NULL}, + }; + + GType temp = g_enum_register_static ("GstAesCipher", + aes_cipher_types); + + g_once_init_leave (&aes_cipher_type, temp); + } + + return aes_cipher_type; +} + +const gchar * +gst_aes_cipher_enum_to_string (GstAesCipher cipher) +{ + switch (cipher) { + case GST_AES_CIPHER_128_CBC: + return "aes-128-cbc"; + break; + case GST_AES_CIPHER_256_CBC: + return "aes-256-cbc"; + break; + } + + return ""; +} + + +gchar +gst_aes_nibble_to_hex (gchar in) +{ + return in < 10 ? 
in + 48 : in + 55; +} + +/* + * gst_aes_bytearray2hexstring + * + * convert array of bytes to hex string + * + * @param in input byte array + * @param out allocated hex string for output + * @param len length of input byte array + * + * @return output hex string + */ +gchar * +gst_aes_bytearray2hexstring (const guchar * in, gchar * const out, + const gushort len) +{ + gushort i; + gchar high; + gchar low; + + for (i = 0; i < len; i++) { + high = (in[i] & 0xF0) >> 4; + low = in[i] & 0x0F; + out[i * 2] = gst_aes_nibble_to_hex (high); + out[i * 2 + 1] = gst_aes_nibble_to_hex (low); + } + out[len * 2] = 0; /* null terminate */ + + return out; +} + +/* + * gst_aes_hexstring2bytearray + * + * convert hex string to array of bytes + * + * @param filter calling element + * @param in input hex string + * @param out allocated byte array for output + * + * @return output byte array + */ +guint +gst_aes_hexstring2bytearray (GstElement * filter, const gchar * in, + guchar * out) +{ + gchar byte_val; + guint hex_count = 0; + + GST_LOG_OBJECT (filter, "Converting hex string to number"); + + g_return_val_if_fail (in && out, 0); + + while (*in != 0) { + /* Compute first half-byte */ + if (*in >= 'A' && *in <= 'F') { + byte_val = (*in - 55) << 4; + } else if (*in >= 'a' && *in <= 'f') { + byte_val = (*in - 87) << 4; + } else if (*in >= '0' && *in <= '9') { + byte_val = (*in - 48) << 4; + } else { + return 0; + } + in++; + if (*in == 0) { + break; + } + /* Compute second half-byte */ + if (*in >= 'A' && *in <= 'F') { + *out = (*in - 55) + byte_val; + } else if (*in >= 'a' && *in <= 'f') { + *out = (*in - 87) + byte_val; + } else if (*in >= '0' && *in <= '9') { + *out = (*in - 48) + byte_val; + } else { + return 0; + } + + GST_LOG_OBJECT (filter, "ch: %c%c, hex: 0x%x", *(in - 1), *in, *out); + in++; + out++; + if (!in || !out) + return 0; + hex_count++; + } + GST_LOG_OBJECT (filter, "Hex string conversion successful"); + + return hex_count; +}
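The hex helpers above lean on the ASCII layout of '0'-'9' (codes 48..57) and 'A'-'F' (codes 65..70), which is where the `in + 48 : in + 55` mapping comes from. A self-contained sketch of the same nibble split used by gst_aes_bytearray2hexstring() (illustrative names, not the helper's exported API):

```c
/* Map a nibble (0..15) to its uppercase hex digit, exactly as
 * gst_aes_nibble_to_hex() does: 0-9 -> '0'-'9', 10-15 -> 'A'-'F'. */
char
nibble_to_hex (char in)
{
  return in < 10 ? in + 48 : in + 55;
}

/* Split a byte into high and low nibbles and emit two hex digits
 * plus a NUL, the same hi/lo split the helper applies per byte. */
void
byte_to_hex (unsigned char byte, char out[3])
{
  out[0] = nibble_to_hex ((char) ((byte & 0xF0) >> 4));
  out[1] = nibble_to_hex ((char) (byte & 0x0F));
  out[2] = '\0';
}
```

The inverse direction in gst_aes_hexstring2bytearray() uses the matching offsets: 48 for digits, 55 for 'A'-'F', and 87 for the lowercase 'a'-'f' range (codes 97..102).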
gst-plugins-bad-1.20.1.tar.xz/ext/aes/gstaeshelper.h
Added
@@ -0,0 +1,104 @@ +/* + * GStreamer gstreamer-aeshelper + * + * Copyright, LCC (C) 2015 RidgeRun, LCC <carsten.behling@ridgerun.com> + * Copyright, 2020 Nice, Contact: Rabindra Harlalka <Rabindra.Harlalka@nice.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1335, USA. + */ + +#ifndef __GST_AES_HELPER_H__ +#define __GST_AES_HELPER_H__ + +#include <gst/gst.h> +#include <openssl/conf.h> +#include <openssl/evp.h> +#include <openssl/err.h> + +/** + * GstAesCipher: + * @GST_AES_CIPHER_128_CBC: AES cipher with 128 bit key using CBC + * @GST_AES_CIPHER_256_CBC: AES cipher with 256 bit key using CBC + * + * Type of AES cipher to use + * + * Since: 1.20 + */ + +typedef enum { + GST_AES_CIPHER_128_CBC, + GST_AES_CIPHER_256_CBC +} GstAesCipher; + +#define GST_AES_DEFAULT_SERIALIZE_IV FALSE +#define GST_AES_DEFAULT_KEY "" +#define GST_AES_DEFAULT_IV "" +#define GST_AES_DEFAULT_CIPHER_MODE GST_AES_CIPHER_128_CBC +#define GST_AES_PER_BUFFER_PADDING_DEFAULT TRUE +#define GST_AES_BLOCK_SIZE 16 +/* only 128 or 256 bit key length is supported */ +#define GST_AES_MAX_KEY_SIZE 32 + +enum +{ + PROP_0, + PROP_CIPHER, + PROP_SERIALIZE_IV, + PROP_KEY, + PROP_IV, + PROP_PER_BUFFER_PADDING +}; + +G_BEGIN_DECLS + +GType gst_aes_cipher_get_type (void); +#define GST_TYPE_AES_CIPHER (gst_aes_cipher_get_type ()) +const gchar* gst_aes_cipher_enum_to_string (GstAesCipher cipher); + +gchar +gst_aes_nibble_to_hex (gchar in); +gchar * +gst_aes_bytearray2hexstring (const guchar * in, gchar * const out, + const gushort len); +guint +gst_aes_hexstring2bytearray (GstElement * filter, const gchar * in, + guchar * out); + +G_END_DECLS +#endif /* __GST_AES_HELPER_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/aes/meson.build
Added
@@ -0,0 +1,28 @@ +aes_sources = [ + 'gstaes.c', + 'gstaeshelper.c', + 'gstaesenc.c', + 'gstaesdec.c', +] + +aes_cargs = [] +aes_dep = dependency('openssl', version : '>= 1.1.0', required : get_option('aes')) +if aes_dep.found() + aes_cargs += ['-DHAVE_OPENSSL'] +else + subdir_done() +endif + +gstaes = library('gstaes', + aes_sources, + c_args : gst_plugins_bad_args + aes_cargs, + link_args : noseh_link_args, + include_directories : [configinc], + dependencies : [gstpbutils_dep, gstvideo_dep, + aes_dep, gio_dep, libm], + install : true, + install_dir : plugins_install_dir, +) +pkgconfig.generate(gstaes, install_dir : plugins_pkgconfig_install_dir) +plugins += [gstaes] +aes_dep = declare_dependency(include_directories : include_directories('.'))
gst-plugins-bad-1.18.6.tar.xz/ext/aom/gstaom.c -> gst-plugins-bad-1.20.1.tar.xz/ext/aom/gstaom.c
Changed
@@ -29,18 +29,12 @@ static gboolean plugin_init (GstPlugin * plugin) { + gboolean ret = FALSE; - if (!gst_element_register (plugin, "av1enc", GST_RANK_PRIMARY, - GST_TYPE_AV1_ENC)) { - return FALSE; - } + ret |= GST_ELEMENT_REGISTER (av1enc, plugin); + ret |= GST_ELEMENT_REGISTER (av1dec, plugin); - if (!gst_element_register (plugin, "av1dec", GST_RANK_PRIMARY, - GST_TYPE_AV1_DEC)) { - return FALSE; - } - - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/aom/gstav1dec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/aom/gstav1dec.c
Changed
@@ -94,6 +94,8 @@ #define gst_av1_dec_parent_class parent_class G_DEFINE_TYPE (GstAV1Dec, gst_av1_dec, GST_TYPE_VIDEO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (av1dec, "av1dec", GST_RANK_PRIMARY, + GST_TYPE_AV1_DEC); static void gst_av1_dec_class_init (GstAV1DecClass * klass)
gst-plugins-bad-1.18.6.tar.xz/ext/aom/gstav1dec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/aom/gstav1dec.h
Changed
@@ -68,5 +68,7 @@ GType gst_av1_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (av1dec); + G_END_DECLS #endif /* __GST_AV1_DEC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/aom/gstav1enc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/aom/gstav1enc.c
Changed
@@ -198,6 +198,8 @@ #define gst_av1_enc_parent_class parent_class G_DEFINE_TYPE (GstAV1Enc, gst_av1_enc, GST_TYPE_VIDEO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (av1enc, "av1enc", GST_RANK_PRIMARY, + GST_TYPE_AV1_ENC); /* *INDENT-OFF* */ static GstStaticPadTemplate gst_av1_enc_sink_pad_template = @@ -426,6 +428,13 @@ av1enc->tile_columns = DEFAULT_TILE_COLUMNS; av1enc->tile_rows = DEFAULT_TILE_ROWS; +#ifdef FIXED_QP_OFFSET_COUNT + av1enc->aom_cfg.fixed_qp_offsets[0] = -1; + av1enc->aom_cfg.fixed_qp_offsets[1] = -1; + av1enc->aom_cfg.fixed_qp_offsets[2] = -1; + av1enc->aom_cfg.fixed_qp_offsets[3] = -1; + av1enc->aom_cfg.fixed_qp_offsets[4] = -1; +#endif av1enc->aom_cfg.rc_dropframe_thresh = DEFAULT_DROP_FRAME; av1enc->aom_cfg.rc_resize_mode = DEFAULT_RESIZE_MODE; av1enc->aom_cfg.rc_resize_denominator = DEFAULT_RESIZE_DENOMINATOR; @@ -580,10 +589,24 @@ if (profile_str) { gchar *endptr = NULL; - profile = g_ascii_strtoull (profile_str, &endptr, 10); - if (*endptr != '\0' || profile < 0 || profile > 3) { - GST_ERROR_OBJECT (av1enc, "Invalid profile '%s'", profile_str); - profile = DEFAULT_PROFILE; + if (g_strcmp0 (profile_str, "main") == 0) { + GST_DEBUG_OBJECT (av1enc, "Downstream profile is \"main\""); + profile = 0; + } else if (g_strcmp0 (profile_str, "high") == 0) { + profile = 1; + GST_DEBUG_OBJECT (av1enc, "Downstream profile is \"high\""); + } else if (g_strcmp0 (profile_str, "professional") == 0) { + profile = 2; + GST_DEBUG_OBJECT (av1enc, "Downstream profile is \"professional\""); + } else { + profile = g_ascii_strtoull (profile_str, &endptr, 10); + if (*endptr != '\0' || profile < 0 || profile > 3) { + GST_ERROR_OBJECT (av1enc, "Invalid profile '%s'", profile_str); + profile = DEFAULT_PROFILE; + } else { + GST_DEBUG_OBJECT (av1enc, + "Downstream profile is \"%s\"", profile_str); + } } } } @@ -752,8 +775,7 @@ } frame->output_buffer = - gst_buffer_new_wrapped (g_memdup (pkt->data.frame.buf, - pkt->data.frame.sz), pkt->data.frame.sz); + gst_buffer_new_memdup 
(pkt->data.frame.buf, pkt->data.frame.sz); if ((pkt->data.frame.flags & AOM_FRAME_IS_DROPPABLE) != 0) GST_BUFFER_FLAG_SET (frame->output_buffer, GST_BUFFER_FLAG_DROPPABLE);
gst-plugins-bad-1.18.6.tar.xz/ext/aom/gstav1enc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/aom/gstav1enc.h
Changed
@@ -135,5 +135,7 @@ GType gst_av1_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (av1enc); + G_END_DECLS #endif /* __GST_AV1_ENC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/assrender/gstassrender.c -> gst-plugins-bad-1.20.1.tar.xz/ext/assrender/gstassrender.c
Changed
@@ -113,6 +113,13 @@ #define gst_ass_render_parent_class parent_class G_DEFINE_TYPE (GstAssRender, gst_ass_render, GST_TYPE_ELEMENT); +#define _do_init \ + GST_DEBUG_CATEGORY_INIT (gst_ass_render_debug, "assrender", \ + 0, "ASS/SSA subtitle renderer");\ + GST_DEBUG_CATEGORY_INIT (gst_ass_render_lib_debug, "assrender_library",\ + 0, "ASS/SSA subtitle renderer library"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (assrender, "assrender", + GST_RANK_PRIMARY, GST_TYPE_ASS_RENDER, _do_init); static GstCaps *gst_ass_render_get_videosink_caps (GstPad * pad, GstAssRender * render, GstCaps * filter); @@ -916,7 +923,7 @@ ass_set_pixel_aspect (render->ass_renderer, (gdouble) render->info.par_n / (gdouble) render->info.par_d); ass_set_font_scale (render->ass_renderer, 1.0); - ass_set_hinting (render->ass_renderer, ASS_HINTING_LIGHT); + ass_set_hinting (render->ass_renderer, ASS_HINTING_NONE); ass_set_fonts (render->ass_renderer, "Arial", "sans-serif", 1, NULL, 1); ass_set_fonts (render->ass_renderer, NULL, "Sans", 1, NULL, 1); @@ -1563,7 +1570,7 @@ const GstStructure *structure; gboolean valid_mimetype, valid_extension; guint i; - const gchar *filename; + const gchar *mimetype, *filename; buf = gst_sample_get_buffer (sample); structure = gst_sample_get_info (sample); @@ -1571,20 +1578,23 @@ if (!buf || !structure) return; + filename = gst_structure_get_string (structure, "filename"); + if (!filename) + return; + valid_mimetype = FALSE; valid_extension = FALSE; - for (i = 0; i < G_N_ELEMENTS (mimetypes); i++) { - if (gst_structure_has_name (structure, mimetypes[i])) { - valid_mimetype = TRUE; - break; + mimetype = gst_structure_get_string (structure, "mimetype"); + if (mimetype) { + for (i = 0; i < G_N_ELEMENTS (mimetypes); i++) { + if (strcmp (mimetype, mimetypes[i]) == 0) { + valid_mimetype = TRUE; + break; + } } } - filename = gst_structure_get_string (structure, "filename"); - if (!filename) - return; - if (!valid_mimetype) { guint len = strlen (filename); const gchar 
*extension = filename + len - 4; @@ -1879,13 +1889,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_ass_render_debug, "assrender", - 0, "ASS/SSA subtitle renderer"); - GST_DEBUG_CATEGORY_INIT (gst_ass_render_lib_debug, "assrender_library", - 0, "ASS/SSA subtitle renderer library"); - - return gst_element_register (plugin, "assrender", - GST_RANK_PRIMARY, GST_TYPE_ASS_RENDER); + return GST_ELEMENT_REGISTER (assrender, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/assrender/gstassrender.h -> gst-plugins-bad-1.20.1.tar.xz/ext/assrender/gstassrender.h
Changed
@@ -94,6 +94,8 @@ GType gst_ass_render_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (assrender); + G_END_DECLS #endif
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtp.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtp.c
Changed
@@ -249,24 +249,18 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_avtp_aaf_pay_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_aaf_depay_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_sink_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_src_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_cvf_pay_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_cvf_depay_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_crf_sync_plugin_init (plugin)) - return FALSE; - if (!gst_avtp_crf_check_plugin_init (plugin)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (avtpaafpay, plugin); + ret |= GST_ELEMENT_REGISTER (avtpaafdepay, plugin); + ret |= GST_ELEMENT_REGISTER (avtpsink, plugin); + ret |= GST_ELEMENT_REGISTER (avtpsrc, plugin); + ret |= GST_ELEMENT_REGISTER (avtpcvfpay, plugin); + ret |= GST_ELEMENT_REGISTER (avtpcvfdepay, plugin); + ret |= GST_ELEMENT_REGISTER (avtpcrfsync, plugin); + ret |= GST_ELEMENT_REGISTER (avtpcrfcheck, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpaafdepay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpaafdepay.c
Changed
@@ -56,6 +56,8 @@ G_DEFINE_TYPE (GstAvtpAafDepay, gst_avtp_aaf_depay, GST_TYPE_AVTP_BASE_DEPAYLOAD); +GST_ELEMENT_REGISTER_DEFINE (avtpaafdepay, "avtpaafdepay", GST_RANK_NONE, + GST_TYPE_AVTP_AAF_DEPAY); static GstFlowReturn gst_avtp_aaf_depay_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer); @@ -317,10 +319,3 @@ gst_buffer_unref (buffer); return GST_FLOW_OK; } - -gboolean -gst_avtp_aaf_depay_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpaafdepay", GST_RANK_NONE, - GST_TYPE_AVTP_AAF_DEPAY); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpaafdepay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpaafdepay.h
Changed
@@ -58,7 +58,7 @@ GType gst_avtp_aaf_depay_get_type (void); -gboolean gst_avtp_aaf_depay_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpaafdepay); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpaafpay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpaafpay.c
Changed
@@ -88,6 +88,8 @@ #define gst_avtp_aaf_pay_parent_class parent_class G_DEFINE_TYPE (GstAvtpAafPay, gst_avtp_aaf_pay, GST_TYPE_AVTP_BASE_PAYLOAD); +GST_ELEMENT_REGISTER_DEFINE (avtpaafpay, "avtpaafpay", GST_RANK_NONE, + GST_TYPE_AVTP_AAF_PAY); static void gst_avtp_aaf_pay_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -391,10 +393,3 @@ parent, event); } } - -gboolean -gst_avtp_aaf_pay_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpaafpay", GST_RANK_NONE, - GST_TYPE_AVTP_AAF_PAY); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpaafpay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpaafpay.h
Changed
@@ -67,7 +67,7 @@ GType gst_avtp_aaf_pay_get_type (void); -gboolean gst_avtp_aaf_pay_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpaafpay); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcrfbase.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcrfbase.c
Changed
@@ -137,7 +137,8 @@ switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: thread_data->past_periods = - g_malloc0 (sizeof (guint64) * MAX_NUM_PERIODS_STORED); + g_malloc0 (sizeof (thread_data->past_periods[0]) * + MAX_NUM_PERIODS_STORED); thread_data->mr = -1; thread_data->is_running = TRUE; thread_data->thread = @@ -180,7 +181,7 @@ guint8 addr[ETH_ALEN]; int fd, res, ifindex; - fd = socket (AF_PACKET, SOCK_DGRAM, htons (ETH_P_TSN)); + fd = socket (AF_PACKET, SOCK_DGRAM, htons (ETH_P_ALL)); if (fd < 0) { GST_ERROR_OBJECT (avtpcrfbase, "Failed to open socket: %s", g_strerror (errno)); @@ -196,7 +197,7 @@ } sk_addr.sll_family = AF_PACKET; - sk_addr.sll_protocol = htons (ETH_P_TSN); + sk_addr.sll_protocol = htons (ETH_P_ALL); sk_addr.sll_ifindex = ifindex; res = bind (fd, (struct sockaddr *) &sk_addr, sizeof (sk_addr)); @@ -408,7 +409,7 @@ GstAvtpCrfThreadData *data = &avtpcrfbase->thread_data; GstClockTime first_pkt_tstamp, last_pkt_tstamp; int num_pkt_tstamps, past_periods_iter; - GstClockTime accumulate_period = 0; + gdouble accumulate_period = 0; num_pkt_tstamps = data->num_pkt_tstamps; past_periods_iter = data->past_periods_iter; @@ -430,7 +431,7 @@ if (!data->last_received_tstamp || ((data->last_seqnum + 1) % 255 != seqnum)) { - GstClockTime average_period = data->average_period; + gdouble average_period = data->average_period; if (!data->last_received_tstamp) { gdouble base_freq_mult; @@ -451,12 +452,13 @@ } data->past_periods[past_periods_iter] = - first_pkt_tstamp - data->last_received_tstamp; + (gdouble) (first_pkt_tstamp - data->last_received_tstamp) / + data->timestamp_interval; data->last_received_tstamp = first_pkt_tstamp; data->last_seqnum = seqnum; } else { data->past_periods[past_periods_iter] = - (last_pkt_tstamp - first_pkt_tstamp) / + (gdouble) (last_pkt_tstamp - first_pkt_tstamp) / (data->timestamp_interval * (num_pkt_tstamps - 1)); } @@ -508,7 +510,8 @@ g_assert (res == 0); if (media_clk_reset != data->mr) { - memset (data->past_periods, 0, 
sizeof (gint64) * MAX_NUM_PERIODS_STORED); + memset (data->past_periods, 0, + sizeof (data->past_periods[0]) * MAX_NUM_PERIODS_STORED); data->periods_stored = 0; data->average_period = 0; data->current_ts = 0;
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcrfbase.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcrfbase.h
Changed
@@ -51,10 +51,13 @@ guint64 type; guint64 mr; - GstClockTime *past_periods; + gdouble *past_periods; int past_periods_iter; int periods_stored; - GstClockTime average_period; + /** The time in ns between two events. The type of the event depends on + * the CRF type: Audio sample, video frame sync, video line sync, ... + */ + gdouble average_period; GstClockTime current_ts; GstClockTime last_received_tstamp; guint64 last_seqnum; @@ -75,8 +78,6 @@ { GstBaseTransformClass parent_class; - GstPadEventFunction sink_event; - gpointer _gst_reserved[GST_PADDING]; };
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcrfcheck.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcrfcheck.c
Changed
@@ -58,6 +58,8 @@ #define gst_avtp_crf_check_parent_class parent_class G_DEFINE_TYPE (GstAvtpCrfCheck, gst_avtp_crf_check, GST_TYPE_AVTP_CRF_BASE); +GST_ELEMENT_REGISTER_DEFINE (avtpcrfcheck, "avtpcrfcheck", GST_RANK_NONE, + GST_TYPE_AVTP_CRF_CHECK); static void gst_avtp_crf_check_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -254,10 +256,3 @@ gst_buffer_unmap (buffer, &info); return GST_FLOW_OK; } - -gboolean -gst_avtp_crf_check_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpcrfcheck", GST_RANK_NONE, - GST_TYPE_AVTP_CRF_CHECK); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcrfcheck.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcrfcheck.h
Changed
@@ -52,7 +52,7 @@ GType gst_avtp_crf_check_get_type (void); -gboolean gst_avtp_crf_check_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpcrfcheck); G_END_DECLS #endif /* __GST_AVTP_CRF_CHECK_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcrfsync.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcrfsync.c
Changed
@@ -54,7 +54,8 @@ #define gst_avtp_crf_sync_parent_class parent_class G_DEFINE_TYPE (GstAvtpCrfSync, gst_avtp_crf_sync, GST_TYPE_AVTP_CRF_BASE); - +GST_ELEMENT_REGISTER_DEFINE (avtpcrfsync, "avtpcrfsync", GST_RANK_NONE, + GST_TYPE_AVTP_CRF_SYNC); static GstFlowReturn gst_avtp_crf_sync_transform_ip (GstBaseTransform * parent, GstBuffer * buffer); @@ -148,8 +149,10 @@ GstMapInfo info; gboolean res; - if (!avg_period || !current_ts) + if (!avg_period || !current_ts) { + GST_WARNING_OBJECT (avtpcrfsync, "No CRF packet yet received!"); return GST_FLOW_OK; + } res = gst_buffer_map (buffer, &info, GST_MAP_READWRITE); if (!res) { @@ -240,10 +243,3 @@ gst_buffer_unmap (buffer, &info); return GST_FLOW_OK; } - -gboolean -gst_avtp_crf_sync_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpcrfsync", GST_RANK_NONE, - GST_TYPE_AVTP_CRF_SYNC); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcrfsync.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcrfsync.h
Changed
@@ -50,7 +50,7 @@ GType gst_avtp_crf_sync_get_type (void); -gboolean gst_avtp_crf_sync_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpcrfsync); G_END_DECLS #endif /* __GST_AVTP_CRF_SYNC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcvfdepay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcvfdepay.c
Changed
@@ -83,6 +83,8 @@ #define gst_avtp_cvf_depay_parent_class parent_class G_DEFINE_TYPE (GstAvtpCvfDepay, gst_avtp_cvf_depay, GST_TYPE_AVTP_BASE_DEPAYLOAD); +GST_ELEMENT_REGISTER_DEFINE (avtpcvfdepay, "avtpcvfdepay", GST_RANK_NONE, + GST_TYPE_AVTP_CVF_DEPAY); static void gst_avtp_cvf_depay_class_init (GstAvtpCvfDepayClass * klass) @@ -331,7 +333,8 @@ g_assert (r == 0); if (G_UNLIKELY (val != 1)) { GST_DEBUG_OBJECT (avtpcvfdepay, - "Unexpected AVTP header stream valid %ld, expected %d", val, 1); + "Unexpected AVTP header stream valid %" G_GUINT64_FORMAT + ", expected %d", val, 1); goto end; } @@ -339,7 +342,8 @@ g_assert (r == 0); if (val != avtpbasedepayload->streamid) { GST_DEBUG_OBJECT (avtpcvfdepay, - "Unexpected AVTP header stream id 0x%lx, expected 0x%lx", val, + "Unexpected AVTP header stream id 0x%" G_GINT64_MODIFIER + "x, expected 0x%" G_GINT64_MODIFIER "x", val, avtpbasedepayload->streamid); goto end; } @@ -348,7 +352,7 @@ g_assert (r == 0); if (G_UNLIKELY (val != AVTP_CVF_FORMAT_RFC)) { GST_DEBUG_OBJECT (avtpcvfdepay, - "Unexpected AVTP header format %ld, expected %d", val, + "Unexpected AVTP header format %" G_GUINT64_FORMAT ", expected %d", val, AVTP_CVF_FORMAT_RFC); goto end; } @@ -357,7 +361,7 @@ g_assert (r == 0); if (G_UNLIKELY (val != AVTP_CVF_FORMAT_SUBTYPE_H264)) { GST_DEBUG_OBJECT (avtpcvfdepay, - "Unsupported AVTP header format subtype %ld", val); + "Unsupported AVTP header format subtype %" G_GUINT64_FORMAT, val); goto end; } @@ -365,8 +369,9 @@ g_assert (r == 0); if (G_UNLIKELY (map->size < sizeof (*pdu) + val)) { GST_DEBUG_OBJECT (avtpcvfdepay, - "AVTP packet size %ld too small, expected at least %lu", - map->size - AVTP_CVF_H264_HEADER_SIZE, sizeof (*pdu) + val); + "AVTP packet size %" G_GSIZE_FORMAT " too small, expected at least %" + G_GUINT64_FORMAT, map->size - AVTP_CVF_H264_HEADER_SIZE, + sizeof (*pdu) + val); goto end; } @@ -375,8 +380,8 @@ g_assert (r == 0); if (G_UNLIKELY (val != avtpcvfdepay->seqnum)) { GST_INFO_OBJECT (avtpcvfdepay, 
- "Unexpected AVTP header seq num %lu, expected %u", val, - avtpcvfdepay->seqnum); + "Unexpected AVTP header seq num %" G_GUINT64_FORMAT ", expected %u", + val, avtpcvfdepay->seqnum); avtpcvfdepay->seqnum = val; /* This is not a reason to drop the packet, but it may be a good moment @@ -447,8 +452,8 @@ GstFlowReturn ret = GST_FLOW_OK; GST_LOG_OBJECT (avtpcvfdepay, - "Adding buffer of size %lu (nalu size %lu) to out_buffer", - gst_buffer_get_size (buffer), + "Adding buffer of size %" G_GSIZE_FORMAT " (nalu size %" + G_GSIZE_FORMAT ") to out_buffer", gst_buffer_get_size (buffer), gst_buffer_get_size (buffer) - sizeof (guint32)); if (avtpcvfdepay->out_buffer) { @@ -568,8 +573,8 @@ if (G_UNLIKELY (map->size - AVTP_CVF_H264_HEADER_SIZE < 2)) { GST_ERROR_OBJECT (avtpcvfdepay, - "Buffer too small to contain fragment headers, size: %lu", - map->size - AVTP_CVF_H264_HEADER_SIZE); + "Buffer too small to contain fragment headers, size: %" + G_GSIZE_FORMAT, map->size - AVTP_CVF_H264_HEADER_SIZE); ret = gst_avtp_cvf_depay_push_and_discard (avtpcvfdepay); goto end; } @@ -727,10 +732,3 @@ return ret; } - -gboolean -gst_avtp_cvf_depay_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpcvfdepay", GST_RANK_NONE, - GST_TYPE_AVTP_CVF_DEPAY); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcvfdepay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcvfdepay.h
Changed
@@ -57,7 +57,7 @@ GType gst_avtp_cvf_depay_get_type (void); -gboolean gst_avtp_cvf_depay_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpcvfdepay); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcvfpay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcvfpay.c
Changed
@@ -97,6 +97,8 @@ #define gst_avtp_cvf_pay_parent_class parent_class G_DEFINE_TYPE (GstAvtpCvfPay, gst_avtp_cvf_pay, GST_TYPE_AVTP_BASE_PAYLOAD); +GST_ELEMENT_REGISTER_DEFINE (avtpcvfpay, "avtpcvfpay", GST_RANK_NONE, + GST_TYPE_AVTP_CVF_PAY); static void gst_avtp_cvf_pay_class_init (GstAvtpCvfPayClass * klass) @@ -347,7 +349,8 @@ if (*offset == 0 && (nal_size + AVTP_CVF_H264_HEADER_SIZE) <= avtpcvfpay->mtu) { *last_fragment = TRUE; *offset = nal_size; - GST_DEBUG_OBJECT (avtpcvfpay, "Generated fragment with size %lu", nal_size); + GST_DEBUG_OBJECT (avtpcvfpay, + "Generated fragment with size %" G_GSIZE_FORMAT, nal_size); return gst_buffer_ref (nal); } @@ -402,8 +405,8 @@ *offset += fragment_size; - GST_DEBUG_OBJECT (avtpcvfpay, "Generated fragment with size %lu", - fragment_size); + GST_DEBUG_OBJECT (avtpcvfpay, + "Generated fragment with size %" G_GSIZE_FORMAT, fragment_size); return fragment; } @@ -533,7 +536,7 @@ nal = g_ptr_array_index (nals, i); GST_LOG_OBJECT (avtpcvfpay, - "Preparing AVTP packets for NAL whose size is %lu", + "Preparing AVTP packets for NAL whose size is %" G_GSIZE_FORMAT, gst_buffer_get_size (nal)); /* Calculate timestamps. 
Note that we do it twice, one using DTS as base, @@ -602,10 +605,10 @@ if (M) { GST_LOG_OBJECT (avtpcvfpay, "M packet sent, PTS: %" GST_TIME_FORMAT " DTS: %" GST_TIME_FORMAT " AVTP_TS: %" GST_TIME_FORMAT - " H264_TS: %" GST_TIME_FORMAT "\navtp_time: %lu h264_time: %lu", - GST_TIME_ARGS (h264_time), - GST_TIME_ARGS (avtp_time), GST_TIME_ARGS ((guint32) avtp_time), - GST_TIME_ARGS ((guint32) h264_time), avtp_time, h264_time); + " H264_TS: %" GST_TIME_FORMAT "\navtp_time: %" G_GUINT64_FORMAT + " h264_time: %" G_GUINT64_FORMAT, GST_TIME_ARGS (h264_time), + GST_TIME_ARGS (avtp_time), GST_TIME_ARGS (avtp_time & 0xffffffff), + GST_TIME_ARGS (h264_time & 0xffffffff), avtp_time, h264_time); } } @@ -662,8 +665,8 @@ GstFlowReturn ret = GST_FLOW_OK; GST_LOG_OBJECT (avtpcvfpay, - "Incoming buffer size: %lu PTS: %" GST_TIME_FORMAT " DTS: %" - GST_TIME_FORMAT, gst_buffer_get_size (buffer), + "Incoming buffer size: %" G_GSIZE_FORMAT " PTS: %" GST_TIME_FORMAT + " DTS: %" GST_TIME_FORMAT, gst_buffer_get_size (buffer), GST_TIME_ARGS (GST_BUFFER_PTS (buffer)), GST_TIME_ARGS (GST_BUFFER_DTS (buffer))); @@ -765,10 +768,3 @@ return GST_AVTP_BASE_PAYLOAD_CLASS (parent_class)->sink_event (pad, parent, event); } - -gboolean -gst_avtp_cvf_pay_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpcvfpay", GST_RANK_NONE, - GST_TYPE_AVTP_CVF_PAY); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpcvfpay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpcvfpay.h
Changed
@@ -62,7 +62,7 @@ GType gst_avtp_cvf_pay_get_type (void); -gboolean gst_avtp_cvf_pay_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpcvfpay); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpsink.c
Changed
@@ -84,7 +84,8 @@ #define gst_avtp_sink_parent_class parent_class G_DEFINE_TYPE (GstAvtpSink, gst_avtp_sink, GST_TYPE_BASE_SINK); - +GST_ELEMENT_REGISTER_DEFINE (avtpsink, "avtpsink", GST_RANK_NONE, + GST_TYPE_AVTP_SINK); static void gst_avtp_sink_finalize (GObject * gobject); static void gst_avtp_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -475,10 +476,3 @@ *start = GST_CLOCK_TIME_NONE; *end = GST_CLOCK_TIME_NONE; } - -gboolean -gst_avtp_sink_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpsink", GST_RANK_NONE, - GST_TYPE_AVTP_SINK); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpsink.h
Changed
@@ -61,7 +61,7 @@ GType gst_avtp_sink_get_type (void); -gboolean gst_avtp_sink_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpsink); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpsrc.c
Changed
@@ -75,6 +75,8 @@ #define gst_avtp_src_parent_class parent_class G_DEFINE_TYPE (GstAvtpSrc, gst_avtp_src, GST_TYPE_PUSH_SRC); +GST_ELEMENT_REGISTER_DEFINE (avtpsrc, "avtpsrc", GST_RANK_NONE, + GST_TYPE_AVTP_SRC); static void gst_avtp_src_finalize (GObject * gobject); static void gst_avtp_src_set_property (GObject * object, guint prop_id, @@ -270,13 +272,15 @@ GstMapInfo map; gsize buffer_size; ssize_t n = MAX_AVTPDU_SIZE; + ssize_t received = -1; GstAvtpSrc *avtpsrc = GST_AVTP_SRC (pushsrc); buffer_size = gst_buffer_get_size (buffer); if (G_UNLIKELY (buffer_size < MAX_AVTPDU_SIZE)) { GST_WARNING_OBJECT (avtpsrc, - "Buffer size (%lu) may not be enough to hold AVTPDU (max AVTPDU size %d)", - buffer_size, MAX_AVTPDU_SIZE); + "Buffer size (%" G_GSIZE_FORMAT + ") may not be enough to hold AVTPDU (max AVTPDU size %d)", buffer_size, + MAX_AVTPDU_SIZE); n = buffer_size; } @@ -287,8 +291,8 @@ retry: errno = 0; - n = recv (avtpsrc->sk_fd, map.data, n, 0); - if (n < 0) { + received = recv (avtpsrc->sk_fd, map.data, n, 0); + if (received < 0) { if (errno == EINTR) { goto retry; } @@ -299,14 +303,8 @@ return GST_FLOW_ERROR; } + gst_buffer_set_size (buffer, received); gst_buffer_unmap (buffer, &map); return GST_FLOW_OK; } - -gboolean -gst_avtp_src_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avtpsrc", GST_RANK_NONE, - GST_TYPE_AVTP_SRC); -}
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/gstavtpsrc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/gstavtpsrc.h
Changed
@@ -57,7 +57,7 @@ GType gst_avtp_src_get_type (void); -gboolean gst_avtp_src_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (avtpsrc); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/avtp/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/avtp/meson.build
Changed
@@ -17,7 +17,7 @@ avtp_dep = dependency('', required: false) avtp_option = get_option('avtp') -if host_machine.system() != 'linux' +if host_machine.system() != 'linux' or not cc.has_type('struct sock_txtime', prefix : '#include <linux/net_tstamp.h>') if avtp_option.enabled() error('avtp plugin enabled but host is not supported') else @@ -28,7 +28,7 @@ avtp_dep = dependency('avtp', required: avtp_option, fallback: ['avtp', 'avtp_dep']) -if avtp_dep.found() and cc.has_type('struct sock_txtime', prefix : '#include <linux/net_tstamp.h>') +if avtp_dep.found() gstavtp = library('gstavtp', avtp_sources, c_args : gst_plugins_bad_args,
gst-plugins-bad-1.18.6.tar.xz/ext/bs2b/gstbs2b.c -> gst-plugins-bad-1.20.1.tar.xz/ext/bs2b/gstbs2b.c
Changed
@@ -101,6 +101,7 @@ G_DEFINE_TYPE_WITH_CODE (GstBs2b, gst_bs2b, GST_TYPE_AUDIO_FILTER, G_IMPLEMENT_INTERFACE (GST_TYPE_PRESET, gst_preset_interface_init)); +GST_ELEMENT_REGISTER_DEFINE (bs2b, "bs2b", GST_RANK_NONE, GST_TYPE_BS2B); static void gst_bs2b_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -410,7 +411,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "bs2b", GST_RANK_NONE, GST_TYPE_BS2B); + return GST_ELEMENT_REGISTER (bs2b, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/bs2b/gstbs2b.h -> gst-plugins-bad-1.20.1.tar.xz/ext/bs2b/gstbs2b.h
Changed
@@ -59,5 +59,7 @@ GType gst_bs2b_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (bs2b); + G_END_DECLS #endif /* __GST_BS2B_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/bz2/gstbz2.c -> gst-plugins-bad-1.20.1.tar.xz/ext/bz2/gstbz2.c
Changed
@@ -30,11 +30,12 @@ static gboolean plugin_init (GstPlugin * p) { - if (!gst_element_register (p, "bz2enc", GST_RANK_NONE, GST_TYPE_BZ2ENC)) - return FALSE; - if (!gst_element_register (p, "bz2dec", GST_RANK_NONE, GST_TYPE_BZ2DEC)) - return FALSE; - return TRUE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (bz2enc, p); + ret |= GST_ELEMENT_REGISTER (bz2dec, p); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, bz2,
gst-plugins-bad-1.18.6.tar.xz/ext/bz2/gstbz2dec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/bz2/gstbz2dec.c
Changed
@@ -59,6 +59,7 @@ #define gst_bz2dec_parent_class parent_class G_DEFINE_TYPE (GstBz2dec, gst_bz2dec, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (bz2dec, "bz2dec", GST_RANK_NONE, GST_TYPE_BZ2DEC); #define DEFAULT_FIRST_BUFFER_SIZE 1024 #define DEFAULT_BUFFER_SIZE 1024
gst-plugins-bad-1.18.6.tar.xz/ext/bz2/gstbz2dec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/bz2/gstbz2dec.h
Changed
@@ -35,5 +35,7 @@
 gst_bz2dec_get_type (void) G_GNUC_CONST;
 
+GST_ELEMENT_REGISTER_DECLARE (bz2dec);
+
 G_END_DECLS
 
 #endif /* __GST_BZ2DEC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/bz2/gstbz2enc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/bz2/gstbz2enc.c
Changed
@@ -67,6 +67,7 @@
 #define gst_bz2enc_parent_class parent_class
 G_DEFINE_TYPE (GstBz2enc, gst_bz2enc, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE (bz2enc, "bz2enc", GST_RANK_NONE, GST_TYPE_BZ2ENC);
 
 static void
 gst_bz2enc_compress_end (GstBz2enc * b)
gst-plugins-bad-1.18.6.tar.xz/ext/bz2/gstbz2enc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/bz2/gstbz2enc.h
Changed
@@ -35,5 +35,7 @@
 gst_bz2enc_get_type (void) G_GNUC_CONST;
 
+GST_ELEMENT_REGISTER_DECLARE (bz2enc);
+
 G_END_DECLS
 
 #endif /* __GST_BZ2ENC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/chromaprint/gstchromaprint.c -> gst-plugins-bad-1.20.1.tar.xz/ext/chromaprint/gstchromaprint.c
Changed
@@ -61,8 +61,11 @@
   PROP_MAX_DURATION
 };
 
+static gboolean chromaprint_element_init (GstPlugin * plugin);
+
 #define parent_class gst_chromaprint_parent_class
 G_DEFINE_TYPE (GstChromaprint, gst_chromaprint, GST_TYPE_AUDIO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_CUSTOM (chromaprint, chromaprint_element_init);
 
 static void gst_chromaprint_finalize (GObject * object);
 static void gst_chromaprint_set_property (GObject * object, guint prop_id,
@@ -292,7 +295,7 @@
 }
 
 static gboolean
-plugin_init (GstPlugin * plugin)
+chromaprint_element_init (GstPlugin * plugin)
 {
   gboolean ret;
 
@@ -313,6 +316,12 @@
   return ret;
 }
 
+static gboolean
+plugin_init (GstPlugin * plugin)
+{
+  return GST_ELEMENT_REGISTER (chromaprint, plugin);
+}
+
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
     GST_VERSION_MINOR,
     chromaprint,
gst-plugins-bad-1.18.6.tar.xz/ext/chromaprint/gstchromaprint.h -> gst-plugins-bad-1.20.1.tar.xz/ext/chromaprint/gstchromaprint.h
Changed
@@ -72,6 +72,8 @@
 GType gst_chromaprint_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (chromaprint);
+
 G_END_DECLS
 
 #endif /* __GST_CHROMAPRINT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/bit_slicer.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/bit_slicer.c
Changed
@@ -463,7 +463,7 @@
   /* n_points = n_points; */
   /* raw = raw; */
 
-  warning (&bs->log, "vbi3_bit_slicer_set_params() not called.");
+  warn (&bs->log, "vbi3_bit_slicer_set_params() not called.");
 
   return FALSE;
 }
@@ -524,13 +524,13 @@
   *n_points = 0;
 
   if (bs->payload > buffer_size * 8) {
-    warning (&bs->log,
+    warn (&bs->log,
         "buffer_size %u < %u bits of payload.", buffer_size * 8, bs->payload);
     return FALSE;
   }
 
   if (bs->total_bits > max_points) {
-    warning (&bs->log,
+    warn (&bs->log,
         "max_points %u < %u CRI, FRC and payload bits.",
         max_points, bs->total_bits);
     return FALSE;
@@ -539,7 +539,7 @@
   if (low_pass_bit_slicer_Y8 == bs->func) {
     return bs->func (bs, buffer, points, n_points, raw);
   } else if (bit_slicer_Y8 != bs->func) {
-    warning (&bs->log,
+    warn (&bs->log,
         "Function not implemented for pixfmt %u.", bs->sample_format);
     return bs->func (bs, buffer,
         /* points */ NULL,
@@ -582,7 +582,7 @@
   assert (NULL != raw);
 
   if (bs->payload > buffer_size * 8) {
-    warning (&bs->log,
+    warn (&bs->log,
         "buffer_size %u < %u bits of payload.", buffer_size * 8, bs->payload);
     return FALSE;
   }
@@ -677,13 +677,12 @@
   assert (samples_per_line <= 32767);
 
   if (cri_rate > sampling_rate) {
-    warning (&bs->log,
-        "cri_rate %u > sampling_rate %u.", cri_rate, sampling_rate);
+    warn (&bs->log, "cri_rate %u > sampling_rate %u.", cri_rate, sampling_rate);
     goto failure;
   }
 
   if (payload_rate > sampling_rate) {
-    warning (&bs->log,
+    warn (&bs->log,
         "payload_rate %u > sampling_rate %u.", payload_rate, sampling_rate);
     goto failure;
   }
@@ -835,7 +834,7 @@
 
     default:
-      warning (&bs->log,
+      warn (&bs->log,
           "Unknown sample_format 0x%x.", (unsigned int) sample_format);
       return FALSE;
   }
@@ -858,7 +857,7 @@
   if ((sample_offset > samples_per_line)
       || ((cri_samples + data_samples)
           > (samples_per_line - sample_offset))) {
-    warning (&bs->log,
+    warn (&bs->log,
         "%u samples_per_line too small for "
         "sample_offset %u + %u cri_bits (%u samples) "
         "+ %u frc_bits and %u payload_bits "
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstcccombiner.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstcccombiner.c
Changed
@@ -51,8 +51,20 @@ ("closedcaption/x-cea-608,format={ (string) raw, (string) s334-1a}; " "closedcaption/x-cea-708,format={ (string) cc_data, (string) cdp }")); -G_DEFINE_TYPE (GstCCCombiner, gst_cc_combiner, GST_TYPE_AGGREGATOR); #define parent_class gst_cc_combiner_parent_class +G_DEFINE_TYPE (GstCCCombiner, gst_cc_combiner, GST_TYPE_AGGREGATOR); +GST_ELEMENT_REGISTER_DEFINE (cccombiner, "cccombiner", + GST_RANK_NONE, GST_TYPE_CCCOMBINER); + +enum +{ + PROP_0, + PROP_SCHEDULE, + PROP_MAX_SCHEDULED, +}; + +#define DEFAULT_MAX_SCHEDULED 30 +#define DEFAULT_SCHEDULE TRUE typedef struct { @@ -60,6 +72,13 @@ GstBuffer *buffer; } CaptionData; +typedef struct +{ + GstBuffer *buffer; + GstClockTime running_time; + GstClockTime stream_time; +} CaptionQueueItem; + static void caption_data_clear (CaptionData * data) { @@ -67,10 +86,18 @@ } static void +clear_scheduled (CaptionQueueItem * item) +{ + gst_buffer_unref (item->buffer); +} + +static void gst_cc_combiner_finalize (GObject * object) { GstCCCombiner *self = GST_CCCOMBINER (object); + gst_queue_array_free (self->scheduled[0]); + gst_queue_array_free (self->scheduled[1]); g_array_unref (self->current_frame_captions); self->current_frame_captions = NULL; @@ -79,6 +106,629 @@ #define GST_FLOW_NEED_DATA GST_FLOW_CUSTOM_SUCCESS +static const guint8 * +extract_cdp (const guint8 * cdp, guint cdp_len, guint * cc_data_len) +{ + GstByteReader br; + guint16 u16; + guint8 u8; + guint8 flags; + guint len = 0; + const guint8 *cc_data = NULL; + + *cc_data_len = 0; + + /* Header + footer length */ + if (cdp_len < 11) { + goto done; + } + + gst_byte_reader_init (&br, cdp, cdp_len); + u16 = gst_byte_reader_get_uint16_be_unchecked (&br); + if (u16 != 0x9669) { + goto done; + } + + u8 = gst_byte_reader_get_uint8_unchecked (&br); + if (u8 != cdp_len) { + goto done; + } + + gst_byte_reader_skip_unchecked (&br, 1); + + flags = gst_byte_reader_get_uint8_unchecked (&br); + + /* No cc_data? 
*/ + if ((flags & 0x40) == 0) { + goto done; + } + + /* cdp_hdr_sequence_cntr */ + gst_byte_reader_skip_unchecked (&br, 2); + + /* time_code_present */ + if (flags & 0x80) { + if (gst_byte_reader_get_remaining (&br) < 5) { + goto done; + } + gst_byte_reader_skip_unchecked (&br, 5); + } + + /* ccdata_present */ + if (flags & 0x40) { + guint8 cc_count; + + if (gst_byte_reader_get_remaining (&br) < 2) { + goto done; + } + u8 = gst_byte_reader_get_uint8_unchecked (&br); + if (u8 != 0x72) { + goto done; + } + + cc_count = gst_byte_reader_get_uint8_unchecked (&br); + if ((cc_count & 0xe0) != 0xe0) { + goto done; + } + cc_count &= 0x1f; + + if (cc_count == 0) + return 0; + + len = 3 * cc_count; + if (gst_byte_reader_get_remaining (&br) < len) + goto done; + + cc_data = gst_byte_reader_get_data_unchecked (&br, len); + *cc_data_len = len; + } + +done: + return cc_data; +} + +#define MAX_CDP_PACKET_LEN 256 +#define MAX_CEA608_LEN 32 + +static const struct cdp_fps_entry cdp_fps_table[] = { + {0x1f, 24000, 1001, 25, 22, 3 /* FIXME: alternating max cea608 count! 
*/ }, + {0x2f, 24, 1, 25, 22, 2}, + {0x3f, 25, 1, 24, 22, 2}, + {0x4f, 30000, 1001, 20, 18, 2}, + {0x5f, 30, 1, 20, 18, 2}, + {0x6f, 50, 1, 12, 11, 1}, + {0x7f, 60000, 1001, 10, 9, 1}, + {0x8f, 60, 1, 10, 9, 1}, +}; +static const struct cdp_fps_entry null_fps_entry = { 0, 0, 0, 0 }; + +static const struct cdp_fps_entry * +cdp_fps_entry_from_fps (guint fps_n, guint fps_d) +{ + int i; + for (i = 0; i < G_N_ELEMENTS (cdp_fps_table); i++) { + if (cdp_fps_table[i].fps_n == fps_n && cdp_fps_table[i].fps_d == fps_d) + return &cdp_fps_table[i]; + } + return &null_fps_entry; +} + + +static GstBuffer * +make_cdp (GstCCCombiner * self, const guint8 * cc_data, guint cc_data_len, + const struct cdp_fps_entry *fps_entry, const GstVideoTimeCode * tc) +{ + GstByteWriter bw; + guint8 flags, checksum; + guint i, len; + GstBuffer *ret = gst_buffer_new_allocate (NULL, MAX_CDP_PACKET_LEN, NULL); + GstMapInfo map; + + gst_buffer_map (ret, &map, GST_MAP_WRITE); + + gst_byte_writer_init_with_data (&bw, map.data, MAX_CDP_PACKET_LEN, FALSE); + gst_byte_writer_put_uint16_be_unchecked (&bw, 0x9669); + /* Write a length of 0 for now */ + gst_byte_writer_put_uint8_unchecked (&bw, 0); + + gst_byte_writer_put_uint8_unchecked (&bw, fps_entry->fps_idx); + + /* caption_service_active */ + flags = 0x02; + + /* ccdata_present */ + flags |= 0x40; + + if (tc && tc->config.fps_n > 0) + flags |= 0x80; + + /* reserved */ + flags |= 0x01; + + gst_byte_writer_put_uint8_unchecked (&bw, flags); + + gst_byte_writer_put_uint16_be_unchecked (&bw, self->cdp_hdr_sequence_cntr); + + if (tc && tc->config.fps_n > 0) { + guint8 u8; + + gst_byte_writer_put_uint8_unchecked (&bw, 0x71); + /* reserved 11 - 2 bits */ + u8 = 0xc0; + /* tens of hours - 2 bits */ + u8 |= ((tc->hours / 10) & 0x3) << 4; + /* units of hours - 4 bits */ + u8 |= (tc->hours % 10) & 0xf; + gst_byte_writer_put_uint8_unchecked (&bw, u8); + + /* reserved 1 - 1 bit */ + u8 = 0x80; + /* tens of minutes - 3 bits */ + u8 |= ((tc->minutes / 10) & 0x7) << 4; 
+ /* units of minutes - 4 bits */ + u8 |= (tc->minutes % 10) & 0xf; + gst_byte_writer_put_uint8_unchecked (&bw, u8); + + /* field flag - 1 bit */ + u8 = tc->field_count < 2 ? 0x00 : 0x80; + /* tens of seconds - 3 bits */ + u8 |= ((tc->seconds / 10) & 0x7) << 4; + /* units of seconds - 4 bits */ + u8 |= (tc->seconds % 10) & 0xf; + gst_byte_writer_put_uint8_unchecked (&bw, u8); + + /* drop frame flag - 1 bit */ + u8 = (tc->config.flags & GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME) ? 0x80 : + 0x00; + /* reserved0 - 1 bit */ + /* tens of frames - 2 bits */ + u8 |= ((tc->frames / 10) & 0x3) << 4; + /* units of frames 4 bits */ + u8 |= (tc->frames % 10) & 0xf; + gst_byte_writer_put_uint8_unchecked (&bw, u8); + } + + gst_byte_writer_put_uint8_unchecked (&bw, 0x72); + gst_byte_writer_put_uint8_unchecked (&bw, 0xe0 | fps_entry->max_cc_count); + gst_byte_writer_put_data_unchecked (&bw, cc_data, cc_data_len); + while (fps_entry->max_cc_count > cc_data_len / 3) { + gst_byte_writer_put_uint8_unchecked (&bw, 0xfa); + gst_byte_writer_put_uint8_unchecked (&bw, 0x00); + gst_byte_writer_put_uint8_unchecked (&bw, 0x00); + cc_data_len += 3; + } + + gst_byte_writer_put_uint8_unchecked (&bw, 0x74); + gst_byte_writer_put_uint16_be_unchecked (&bw, self->cdp_hdr_sequence_cntr); + self->cdp_hdr_sequence_cntr++; + /* We calculate the checksum afterwards */ + gst_byte_writer_put_uint8_unchecked (&bw, 0); + + len = gst_byte_writer_get_pos (&bw); + gst_byte_writer_set_pos (&bw, 2); + gst_byte_writer_put_uint8_unchecked (&bw, len); + + checksum = 0; + for (i = 0; i < len; i++) { + checksum += map.data[i]; + } + checksum &= 0xff; + checksum = 256 - checksum; + map.data[len - 1] = checksum; + + gst_buffer_unmap (ret, &map); + + gst_buffer_set_size (ret, len); + + return ret; +} + +static GstBuffer * +make_padding (GstCCCombiner * self, const GstVideoTimeCode * tc, guint field) +{ + GstBuffer *ret = NULL; + + switch (self->caption_type) { + case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: + { + const guint8 
cc_data[6] = { 0xfc, 0x80, 0x80, 0xf9, 0x80, 0x80 }; + + ret = make_cdp (self, cc_data, 6, self->cdp_fps_entry, tc); + break; + } + case GST_VIDEO_CAPTION_TYPE_CEA708_RAW: + { + GstMapInfo map; + + ret = gst_buffer_new_allocate (NULL, 3, NULL); + + gst_buffer_map (ret, &map, GST_MAP_WRITE); + + map.data[0] = 0xfc | (field & 0x01); + map.data[1] = 0x80; + map.data[2] = 0x80; + + gst_buffer_unmap (ret, &map); + break; + } + case GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A: + { + GstMapInfo map; + + ret = gst_buffer_new_allocate (NULL, 3, NULL); + + gst_buffer_map (ret, &map, GST_MAP_WRITE); + + map.data[0] = field == 0 ? 0x80 : 0x00; + map.data[1] = 0x80; + map.data[2] = 0x80; + + gst_buffer_unmap (ret, &map); + break; + } + case GST_VIDEO_CAPTION_TYPE_CEA608_RAW: + { + GstMapInfo map; + + ret = gst_buffer_new_allocate (NULL, 2, NULL); + + gst_buffer_map (ret, &map, GST_MAP_WRITE); + + map.data[0] = 0x80; + map.data[1] = 0x80; + + gst_buffer_unmap (ret, &map); + break; + } + default: + break; + } + + return ret; +} + +static void +queue_caption (GstCCCombiner * self, GstBuffer * scheduled, guint field) +{ + GstAggregatorPad *caption_pad; + CaptionQueueItem item; + + if (self->progressive && field == 1) { + gst_buffer_unref (scheduled); + return; + } + + caption_pad = + GST_AGGREGATOR_PAD_CAST (gst_element_get_static_pad (GST_ELEMENT_CAST + (self), "caption")); + + g_assert (gst_queue_array_get_length (self->scheduled[field]) <= + self->max_scheduled); + + if (gst_queue_array_get_length (self->scheduled[field]) == + self->max_scheduled) { + CaptionQueueItem *dropped = + gst_queue_array_pop_tail_struct (self->scheduled[field]); + + GST_WARNING_OBJECT (self, + "scheduled queue runs too long, dropping %" GST_PTR_FORMAT, dropped); + + gst_element_post_message (GST_ELEMENT_CAST (self), + gst_message_new_qos (GST_OBJECT_CAST (self), FALSE, + dropped->running_time, dropped->stream_time, + GST_BUFFER_PTS (dropped->buffer), GST_BUFFER_DURATION (dropped))); + + gst_buffer_unref 
(dropped->buffer); + } + + gst_object_unref (caption_pad); + + item.buffer = scheduled; + item.running_time = + gst_segment_to_running_time (&caption_pad->segment, GST_FORMAT_TIME, + GST_BUFFER_PTS (scheduled)); + item.stream_time = + gst_segment_to_stream_time (&caption_pad->segment, GST_FORMAT_TIME, + GST_BUFFER_PTS (scheduled)); + + gst_queue_array_push_tail_struct (self->scheduled[field], &item); +} + +static void +schedule_cdp (GstCCCombiner * self, const GstVideoTimeCode * tc, + const guint8 * data, guint len, GstClockTime pts, GstClockTime duration) +{ + const guint8 *cc_data; + guint cc_data_len; + gboolean inject = FALSE; + + if ((cc_data = extract_cdp (data, len, &cc_data_len))) { + guint8 i; + + for (i = 0; i < cc_data_len / 3; i++) { + gboolean cc_valid = (cc_data[i * 3] & 0x04) == 0x04; + guint8 cc_type = cc_data[i * 3] & 0x03; + + if (!cc_valid) + continue; + + if (cc_type == 0x00 || cc_type == 0x01) { + if (cc_data[i * 3 + 1] != 0x80 || cc_data[i * 3 + 2] != 0x80) { + inject = TRUE; + break; + } + continue; + } else { + inject = TRUE; + break; + } + } + } + + if (inject) { + GstBuffer *buf = + make_cdp (self, cc_data, cc_data_len, self->cdp_fps_entry, tc); + + /* We only set those for QoS reporting purposes */ + GST_BUFFER_PTS (buf) = pts; + GST_BUFFER_DURATION (buf) = duration; + + queue_caption (self, buf, 0); + } +} + +static void +schedule_cea608_s334_1a (GstCCCombiner * self, guint8 * data, guint len, + GstClockTime pts, GstClockTime duration) +{ + guint8 field0_data[3], field1_data[3]; + guint field0_len = 0, field1_len = 0; + guint i; + gboolean field0_608 = FALSE, field1_608 = FALSE; + + if (len % 3 != 0) { + GST_WARNING ("Invalid cc_data buffer size %u. 
Truncating to a multiple " + "of 3", len); + len = len - (len % 3); + } + + for (i = 0; i < len / 3; i++) { + if (data[i * 3] & 0x80) { + if (field0_608) + continue; + + field0_608 = TRUE; + + if (data[i * 3 + 1] == 0x80 && data[i * 3 + 2] == 0x80) + continue; + + field0_data[field0_len++] = data[i * 3]; + field0_data[field0_len++] = data[i * 3 + 1]; + field0_data[field0_len++] = data[i * 3 + 2]; + } else { + if (field1_608) + continue; + + field1_608 = TRUE; + + if (data[i * 3 + 1] == 0x80 && data[i * 3 + 2] == 0x80) + continue; + + field1_data[field1_len++] = data[i * 3]; + field1_data[field1_len++] = data[i * 3 + 1]; + field1_data[field1_len++] = data[i * 3 + 2]; + } + } + + if (field0_len > 0) { + GstBuffer *buf = gst_buffer_new_allocate (NULL, field0_len, NULL); + + gst_buffer_fill (buf, 0, field0_data, field0_len); + GST_BUFFER_PTS (buf) = pts; + GST_BUFFER_DURATION (buf) = duration; + + queue_caption (self, buf, 0); + } + + if (field1_len > 0) { + GstBuffer *buf = gst_buffer_new_allocate (NULL, field1_len, NULL); + + gst_buffer_fill (buf, 0, field1_data, field1_len); + GST_BUFFER_PTS (buf) = pts; + GST_BUFFER_DURATION (buf) = duration; + + queue_caption (self, buf, 1); + } +} + +static void +schedule_cea708_raw (GstCCCombiner * self, guint8 * data, guint len, + GstClockTime pts, GstClockTime duration) +{ + guint8 field0_data[MAX_CDP_PACKET_LEN], field1_data[3]; + guint field0_len = 0, field1_len = 0; + guint i; + gboolean field0_608 = FALSE, field1_608 = FALSE; + gboolean started_ccp = FALSE; + + if (len % 3 != 0) { + GST_WARNING ("Invalid cc_data buffer size %u. 
Truncating to a multiple " + "of 3", len); + len = len - (len % 3); + } + + for (i = 0; i < len / 3; i++) { + gboolean cc_valid = (data[i * 3] & 0x04) == 0x04; + guint8 cc_type = data[i * 3] & 0x03; + + if (!started_ccp) { + if (cc_type == 0x00) { + if (!cc_valid) + continue; + + if (field0_608) + continue; + + field0_608 = TRUE; + + if (data[i * 3 + 1] == 0x80 && data[i * 3 + 2] == 0x80) + continue; + + field0_data[field0_len++] = data[i * 3]; + field0_data[field0_len++] = data[i * 3 + 1]; + field0_data[field0_len++] = data[i * 3 + 2]; + } else if (cc_type == 0x01) { + if (!cc_valid) + continue; + + if (field1_608) + continue; + + field1_608 = TRUE; + + if (data[i * 3 + 1] == 0x80 && data[i * 3 + 2] == 0x80) + continue; + + field1_data[field1_len++] = data[i * 3]; + field1_data[field1_len++] = data[i * 3 + 1]; + field1_data[field1_len++] = data[i * 3 + 2]; + } + + continue; + } + + if (cc_type & 0x10) + started_ccp = TRUE; + + if (!cc_valid) + continue; + + if (cc_type == 0x00 || cc_type == 0x01) + continue; + + field0_data[field0_len++] = data[i * 3]; + field0_data[field0_len++] = data[i * 3 + 1]; + field0_data[field0_len++] = data[i * 3 + 2]; + } + + if (field0_len > 0) { + GstBuffer *buf = gst_buffer_new_allocate (NULL, field0_len, NULL); + + gst_buffer_fill (buf, 0, field0_data, field0_len); + GST_BUFFER_PTS (buf) = pts; + GST_BUFFER_DURATION (buf) = duration; + + queue_caption (self, buf, 0); + } + + if (field1_len > 0) { + GstBuffer *buf = gst_buffer_new_allocate (NULL, field1_len, NULL); + + gst_buffer_fill (buf, 0, field1_data, field1_len); + GST_BUFFER_PTS (buf) = pts; + GST_BUFFER_DURATION (buf) = duration; + + queue_caption (self, buf, 1); + } +} + +static void +schedule_cea608_raw (GstCCCombiner * self, guint8 * data, guint len, + GstBuffer * buffer) +{ + if (len < 2) { + return; + } + + if (data[0] != 0x80 || data[1] != 0x80) { + queue_caption (self, gst_buffer_ref (buffer), 0); + } +} + + +static void +schedule_caption (GstCCCombiner * self, 
GstBuffer * caption_buf, + const GstVideoTimeCode * tc) +{ + GstMapInfo map; + GstClockTime pts, duration; + + pts = GST_BUFFER_PTS (caption_buf); + duration = GST_BUFFER_DURATION (caption_buf); + + gst_buffer_map (caption_buf, &map, GST_MAP_READ); + + switch (self->caption_type) { + case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: + schedule_cdp (self, tc, map.data, map.size, pts, duration); + break; + case GST_VIDEO_CAPTION_TYPE_CEA708_RAW: + schedule_cea708_raw (self, map.data, map.size, pts, duration); + break; + case GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A: + schedule_cea608_s334_1a (self, map.data, map.size, pts, duration); + break; + case GST_VIDEO_CAPTION_TYPE_CEA608_RAW: + schedule_cea608_raw (self, map.data, map.size, caption_buf); + break; + default: + break; + } + + gst_buffer_unmap (caption_buf, &map); +} + +static void +dequeue_caption_one_field (GstCCCombiner * self, const GstVideoTimeCode * tc, + guint field, gboolean drain) +{ + CaptionQueueItem *scheduled; + CaptionData caption_data; + + if ((scheduled = gst_queue_array_pop_head_struct (self->scheduled[field]))) { + caption_data.buffer = scheduled->buffer; + caption_data.caption_type = self->caption_type; + g_array_append_val (self->current_frame_captions, caption_data); + } else if (!drain) { + caption_data.caption_type = self->caption_type; + caption_data.buffer = make_padding (self, tc, field); + g_array_append_val (self->current_frame_captions, caption_data); + } +} + +static void +dequeue_caption_both_fields (GstCCCombiner * self, const GstVideoTimeCode * tc, + gboolean drain) +{ + CaptionQueueItem *field0_scheduled, *field1_scheduled; + GstBuffer *field0_buffer, *field1_buffer; + CaptionData caption_data; + + field0_scheduled = gst_queue_array_pop_head_struct (self->scheduled[0]); + field1_scheduled = gst_queue_array_pop_head_struct (self->scheduled[1]); + + if (drain && !field0_scheduled && !field1_scheduled) { + return; + } + + if (field0_scheduled) { + field0_buffer = field0_scheduled->buffer; + } 
else { + field0_buffer = make_padding (self, tc, 0); + } + + if (field1_scheduled) { + field1_buffer = field1_scheduled->buffer; + } else { + field1_buffer = make_padding (self, tc, 1); + } + + caption_data.caption_type = self->caption_type; + caption_data.buffer = gst_buffer_append (field0_buffer, field1_buffer); + + g_array_append_val (self->current_frame_captions, caption_data); +} + static GstFlowReturn gst_cc_combiner_collect_captions (GstCCCombiner * self, gboolean timeout) { @@ -86,24 +736,32 @@ GST_AGGREGATOR_PAD (GST_AGGREGATOR_SRC_PAD (self)); GstAggregatorPad *caption_pad; GstBuffer *video_buf; + GstVideoTimeCodeMeta *tc_meta; + GstVideoTimeCode *tc = NULL; + gboolean caption_pad_is_eos = FALSE; g_assert (self->current_video_buffer != NULL); caption_pad = GST_AGGREGATOR_PAD_CAST (gst_element_get_static_pad (GST_ELEMENT_CAST (self), "caption")); - /* No caption pad, forward buffer directly */ if (!caption_pad) { GST_LOG_OBJECT (self, "No caption pad, passing through video"); video_buf = self->current_video_buffer; - self->current_video_buffer = NULL; gst_aggregator_selected_samples (GST_AGGREGATOR_CAST (self), GST_BUFFER_PTS (video_buf), GST_BUFFER_DTS (video_buf), GST_BUFFER_DURATION (video_buf), NULL); + self->current_video_buffer = NULL; goto done; } + tc_meta = gst_buffer_get_video_time_code_meta (self->current_video_buffer); + + if (tc_meta) { + tc = &tc_meta->tc; + } + GST_LOG_OBJECT (self, "Trying to collect captions for queued video buffer"); do { GstBuffer *caption_buf; @@ -114,6 +772,8 @@ if (!caption_buf) { if (gst_aggregator_pad_is_eos (caption_pad)) { GST_DEBUG_OBJECT (self, "Caption pad is EOS, we're done"); + + caption_pad_is_eos = TRUE; break; } else if (!timeout) { GST_DEBUG_OBJECT (self, "Need more caption data"); @@ -178,44 +838,111 @@ if (caption_time >= self->current_video_running_time_end) { gst_buffer_unref (caption_buf); break; - } else if (GST_CLOCK_TIME_IS_VALID (self->previous_video_running_time_end)) { - if (caption_time < 
self->previous_video_running_time_end) { + } else if (!self->schedule) { + if (GST_CLOCK_TIME_IS_VALID (self->previous_video_running_time_end)) { + if (caption_time < self->previous_video_running_time_end) { + GST_WARNING_OBJECT (self, + "Caption buffer before end of last video frame, dropping"); + + gst_aggregator_pad_drop_buffer (caption_pad); + gst_buffer_unref (caption_buf); + continue; + } + } else if (caption_time < self->current_video_running_time) { GST_WARNING_OBJECT (self, - "Caption buffer before end of last video frame, dropping"); + "Caption buffer before current video frame, dropping"); gst_aggregator_pad_drop_buffer (caption_pad); gst_buffer_unref (caption_buf); continue; } - } else if (caption_time < self->current_video_running_time) { - GST_WARNING_OBJECT (self, - "Caption buffer before current video frame, dropping"); - - gst_aggregator_pad_drop_buffer (caption_pad); - gst_buffer_unref (caption_buf); - continue; } /* This caption buffer has to be collected */ GST_LOG_OBJECT (self, "Collecting caption buffer %p %" GST_TIME_FORMAT " for video buffer %p", caption_buf, GST_TIME_ARGS (caption_time), self->current_video_buffer); - caption_data.caption_type = self->current_caption_type; - caption_data.buffer = caption_buf; - g_array_append_val (self->current_frame_captions, caption_data); + + caption_data.caption_type = self->caption_type; + gst_aggregator_pad_drop_buffer (caption_pad); + + if (!self->schedule) { + caption_data.buffer = caption_buf; + g_array_append_val (self->current_frame_captions, caption_data); + } else { + schedule_caption (self, caption_buf, tc); + gst_buffer_unref (caption_buf); + } } while (TRUE); + /* FIXME pad correctly according to fps */ + if (self->schedule) { + g_assert (self->current_frame_captions->len == 0); + + switch (self->caption_type) { + case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: + { + /* Only relevant in alternate and mixed mode, no need to look at the caps */ + if (GST_BUFFER_FLAG_IS_SET (self->current_video_buffer, 
+ GST_VIDEO_BUFFER_FLAG_INTERLACED)) { + if (!GST_VIDEO_BUFFER_IS_BOTTOM_FIELD (self->current_video_buffer)) { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } + } else { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } + break; + } + case GST_VIDEO_CAPTION_TYPE_CEA708_RAW: + case GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A: + { + if (self->progressive) { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } else if (GST_BUFFER_FLAG_IS_SET (self->current_video_buffer, + GST_VIDEO_BUFFER_FLAG_INTERLACED) && + GST_BUFFER_FLAG_IS_SET (self->current_video_buffer, + GST_VIDEO_BUFFER_FLAG_ONEFIELD)) { + if (GST_VIDEO_BUFFER_IS_TOP_FIELD (self->current_video_buffer)) { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } else { + dequeue_caption_one_field (self, tc, 1, caption_pad_is_eos); + } + } else { + dequeue_caption_both_fields (self, tc, caption_pad_is_eos); + } + break; + } + case GST_VIDEO_CAPTION_TYPE_CEA608_RAW: + { + if (self->progressive) { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } else if (GST_BUFFER_FLAG_IS_SET (self->current_video_buffer, + GST_VIDEO_BUFFER_FLAG_INTERLACED)) { + if (!GST_VIDEO_BUFFER_IS_BOTTOM_FIELD (self->current_video_buffer)) { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } + } else { + dequeue_caption_one_field (self, tc, 0, caption_pad_is_eos); + } + break; + } + default: + break; + } + } + gst_aggregator_selected_samples (GST_AGGREGATOR_CAST (self), GST_BUFFER_PTS (self->current_video_buffer), GST_BUFFER_DTS (self->current_video_buffer), GST_BUFFER_DURATION (self->current_video_buffer), NULL); + GST_LOG_OBJECT (self, "Attaching %u captions to buffer %p", + self->current_frame_captions->len, self->current_video_buffer); + if (self->current_frame_captions->len > 0) { guint i; - GST_LOG_OBJECT (self, "Attaching %u captions to buffer %p", - self->current_frame_captions->len, self->current_video_buffer); video_buf = 
gst_buffer_make_writable (self->current_video_buffer); self->current_video_buffer = NULL; @@ -400,14 +1127,32 @@ s = gst_caps_get_structure (caps, 0); if (strcmp (GST_OBJECT_NAME (agg_pad), "caption") == 0) { - self->current_caption_type = gst_video_caption_type_from_caps (caps); + GstVideoCaptionType caption_type = + gst_video_caption_type_from_caps (caps); + + if (self->caption_type != GST_VIDEO_CAPTION_TYPE_UNKNOWN && + caption_type != self->caption_type) { + GST_ERROR_OBJECT (self, "Changing caption type is not allowed"); + + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Changing caption type is not allowed")); + + return FALSE; + } + self->caption_type = caption_type; } else { gint fps_n, fps_d; + const gchar *interlace_mode; fps_n = fps_d = 0; gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d); + interlace_mode = gst_structure_get_string (s, "interlace-mode"); + + self->progressive = !interlace_mode + || !g_strcmp0 (interlace_mode, "progressive"); + if (fps_n != self->video_fps_n || fps_d != self->video_fps_d) { GstClockTime latency; @@ -418,6 +1163,8 @@ self->video_fps_n = fps_n; self->video_fps_d = fps_d; + self->cdp_fps_entry = cdp_fps_entry_from_fps (fps_n, fps_d); + gst_aggregator_set_src_caps (aggregator, caps); } @@ -451,7 +1198,11 @@ gst_buffer_replace (&self->current_video_buffer, NULL); g_array_set_size (self->current_frame_captions, 0); - self->current_caption_type = GST_VIDEO_CAPTION_TYPE_UNKNOWN; + self->caption_type = GST_VIDEO_CAPTION_TYPE_UNKNOWN; + + gst_queue_array_clear (self->scheduled[0]); + gst_queue_array_clear (self->scheduled[1]); + self->cdp_fps_entry = &null_fps_entry; return TRUE; } @@ -471,6 +1222,10 @@ src_pad->segment.position = GST_CLOCK_TIME_NONE; + self->cdp_hdr_sequence_cntr = 0; + gst_queue_array_clear (self->scheduled[0]); + gst_queue_array_clear (self->scheduled[1]); + return GST_FLOW_OK; } @@ -493,7 +1248,7 @@ GST_OBJECT_LOCK (self); agg_pad = g_object_new (GST_TYPE_AGGREGATOR_PAD, "name", "caption", 
"direction", GST_PAD_SINK, "template", templ, NULL); - self->current_caption_type = GST_VIDEO_CAPTION_TYPE_UNKNOWN; + self->caption_type = GST_VIDEO_CAPTION_TYPE_UNKNOWN; GST_OBJECT_UNLOCK (self); return agg_pad; @@ -655,6 +1410,61 @@ return res; } +static GstStateChangeReturn +gst_cc_combiner_change_state (GstElement * element, GstStateChange transition) +{ + GstCCCombiner *self = GST_CCCOMBINER (element); + + switch (transition) { + case GST_STATE_CHANGE_READY_TO_PAUSED: + self->schedule = self->prop_schedule; + self->max_scheduled = self->prop_max_scheduled; + break; + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + +static void +gst_cc_combiner_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstCCCombiner *self = GST_CCCOMBINER (object); + + switch (prop_id) { + case PROP_SCHEDULE: + self->prop_schedule = g_value_get_boolean (value); + break; + case PROP_MAX_SCHEDULED: + self->prop_max_scheduled = g_value_get_uint (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_cc_combiner_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstCCCombiner *self = GST_CCCOMBINER (object); + + switch (prop_id) { + case PROP_SCHEDULE: + g_value_set_boolean (value, self->prop_schedule); + break; + case PROP_MAX_SCHEDULED: + g_value_set_uint (value, self->prop_max_scheduled); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + static void gst_cc_combiner_class_init (GstCCCombinerClass * klass) { @@ -667,6 +1477,8 @@ aggregator_class = (GstAggregatorClass *) klass; gobject_class->finalize = gst_cc_combiner_finalize; + gobject_class->set_property = gst_cc_combiner_set_property; + gobject_class->get_property = gst_cc_combiner_get_property; gst_element_class_set_static_metadata (gstelement_class, "Closed Caption 
Combiner", @@ -674,6 +1486,56 @@ "Combines GstVideoCaptionMeta with video input stream", "Sebastian Dröge <sebastian@centricular.com>"); + /** + * GstCCCombiner:schedule: + * + * Controls whether caption buffers should be smoothly scheduled + * in order to have exactly one per output video buffer. + * + * This can involve rewriting input captions, for example when the + * input is CDP sequence counters are rewritten, time codes are dropped + * and potentially re-injected if the input video frame had a time code + * meta. + * + * Caption buffers may also get split up in order to assign captions to + * the correct field when the input is interlaced. + * + * This can also imply that the input will drift from synchronization, + * when there isn't enough padding in the input stream to catch up. In + * that case the element will start dropping old caption buffers once + * the number of buffers in its internal queue reaches + * #GstCCCombiner:max-scheduled. + * + * When this is set to %FALSE, the behaviour of this element is essentially + * that of a funnel. + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_SCHEDULE, g_param_spec_boolean ("schedule", + "Schedule", + "Schedule caption buffers so that exactly one is output per video frame", + DEFAULT_SCHEDULE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + + /** + * GstCCCombiner:max-scheduled: + * + * Controls the number of scheduled buffers after which the element + * will start dropping old buffers from its internal queues. See + * #GstCCCombiner:schedule. 
+ * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_MAX_SCHEDULED, g_param_spec_uint ("max-scheduled", + "Max Scheduled", + "Maximum number of buffers to queue for scheduling", 0, G_MAXUINT, + DEFAULT_MAX_SCHEDULED, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + gst_element_class_add_static_pad_template_with_gtype (gstelement_class, &sinktemplate, GST_TYPE_AGGREGATOR_PAD); gst_element_class_add_static_pad_template_with_gtype (gstelement_class, @@ -681,6 +1543,9 @@ gst_element_class_add_static_pad_template_with_gtype (gstelement_class, &captiontemplate, GST_TYPE_AGGREGATOR_PAD); + gstelement_class->change_state = + GST_DEBUG_FUNCPTR (gst_cc_combiner_change_state); + aggregator_class->aggregate = gst_cc_combiner_aggregate; aggregator_class->stop = gst_cc_combiner_stop; aggregator_class->flush = gst_cc_combiner_flush; @@ -716,5 +1581,18 @@ self->current_video_running_time = self->current_video_running_time_end = self->previous_video_running_time_end = GST_CLOCK_TIME_NONE; - self->current_caption_type = GST_VIDEO_CAPTION_TYPE_UNKNOWN; + self->caption_type = GST_VIDEO_CAPTION_TYPE_UNKNOWN; + + self->prop_schedule = DEFAULT_SCHEDULE; + self->prop_max_scheduled = DEFAULT_MAX_SCHEDULED; + self->scheduled[0] = + gst_queue_array_new_for_struct (sizeof (CaptionQueueItem), 0); + self->scheduled[1] = + gst_queue_array_new_for_struct (sizeof (CaptionQueueItem), 0); + gst_queue_array_set_clear_func (self->scheduled[0], + (GDestroyNotify) clear_scheduled); + gst_queue_array_set_clear_func (self->scheduled[1], + (GDestroyNotify) clear_scheduled); + self->cdp_hdr_sequence_cntr = 0; + self->cdp_fps_entry = &null_fps_entry; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstcccombiner.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstcccombiner.h
Changed
@@ -40,18 +40,38 @@ typedef struct _GstCCCombiner GstCCCombiner; typedef struct _GstCCCombinerClass GstCCCombinerClass; +struct cdp_fps_entry +{ + guint8 fps_idx; + guint fps_n, fps_d; + guint max_cc_count; + guint max_ccp_count; + guint max_cea608_count; +}; + struct _GstCCCombiner { GstAggregator parent; gint video_fps_n, video_fps_d; + gboolean progressive; GstClockTime previous_video_running_time_end; GstClockTime current_video_running_time; GstClockTime current_video_running_time_end; GstBuffer *current_video_buffer; GArray *current_frame_captions; - GstVideoCaptionType current_caption_type; + GstVideoCaptionType caption_type; + + gboolean prop_schedule; + guint prop_max_scheduled; + + gboolean schedule; + guint max_scheduled; + /* One queue per field */ + GstQueueArray *scheduled[2]; + guint16 cdp_hdr_sequence_cntr; + const struct cdp_fps_entry *cdp_fps_entry; }; struct _GstCCCombinerClass @@ -61,5 +81,7 @@ GType gst_cc_combiner_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (cccombiner); + G_END_DECLS #endif /* __GST_CCCOMBINER_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstccconverter.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstccconverter.c
Changed
@@ -33,6 +33,23 @@ GST_DEBUG_CATEGORY_STATIC (gst_cc_converter_debug); #define GST_CAT_DEFAULT gst_cc_converter_debug +/** + * GstCCConverterCDPMode: + * @GST_CC_CONVERTER_CDP_MODE_TIME_CODE: Store time code information in CDP packets + * @GST_CC_CONVERTER_CDP_MODE_CC_DATA: Store CC data in CDP packets + * @GST_CC_CONVERTER_CDP_MODE_CC_SVC_INFO: Store CC service information in CDP packets + * + * Since: 1.20 + */ + +enum +{ + PROP_0, + PROP_CDP_MODE, +}; + +#define DEFAULT_CDP_MODE (GST_CC_CONVERTER_CDP_MODE_TIME_CODE | GST_CC_CONVERTER_CDP_MODE_CC_DATA | GST_CC_CONVERTER_CDP_MODE_CC_SVC_INFO) + /* Ordered by the amount of information they can contain */ #define CC_CAPS \ "closedcaption/x-cea-708,format=(string) cdp; " \ @@ -52,8 +69,36 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS (CC_CAPS)); -G_DEFINE_TYPE (GstCCConverter, gst_cc_converter, GST_TYPE_BASE_TRANSFORM); #define parent_class gst_cc_converter_parent_class +G_DEFINE_TYPE (GstCCConverter, gst_cc_converter, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (ccconverter, "ccconverter", + GST_RANK_NONE, GST_TYPE_CCCONVERTER); + +#define GST_TYPE_CC_CONVERTER_CDP_MODE (gst_cc_converter_cdp_mode_get_type()) +static GType +gst_cc_converter_cdp_mode_get_type (void) +{ + static const GFlagsValue values[] = { + {GST_CC_CONVERTER_CDP_MODE_TIME_CODE, + "Store time code information in CDP packets", "time-code"}, + {GST_CC_CONVERTER_CDP_MODE_CC_DATA, "Store CC data in CDP packets", + "cc-data"}, + {GST_CC_CONVERTER_CDP_MODE_CC_SVC_INFO, + "Store CC service information in CDP packets", "cc-svc-info"}, + {0, NULL, NULL} + }; + static GType id = 0; + + if (g_once_init_enter ((gsize *) & id)) { + GType _id; + + _id = g_flags_register_static ("GstCCConverterCDPMode", values); + + g_once_init_leave ((gsize *) & id, _id); + } + + return id; +} static gboolean gst_cc_converter_transform_size (GstBaseTransform * base, @@ -839,6 +884,12 @@ if (tc && tc->config.fps_n != 0) interpolate_time_code_with_framerate (self, tc, 
out_fps_entry->fps_n, out_fps_entry->fps_d, 1, 1, &self->current_output_timecode); + + self->scratch_ccp_len = 0; + self->scratch_cea608_1_len = 0; + self->scratch_cea608_2_len = 0; + self->input_frames = 0; + self->output_frames = 0; } else { int input_frame_n, input_frame_d, output_frame_n, output_frame_d; int output_time_cmp, scale_n, scale_d, rate_cmp; @@ -1008,11 +1059,16 @@ cc_data_len = 3 * fps_entry->max_cc_count; } - /* ccdata_present | caption_service_active */ - flags = 0x42; + /* caption_service_active */ + flags = 0x02; + + /* ccdata_present */ + if ((self->cdp_mode & GST_CC_CONVERTER_CDP_MODE_CC_DATA)) + flags |= 0x40; /* time_code_present */ - if (tc && tc->config.fps_n > 0) + if ((self->cdp_mode & GST_CC_CONVERTER_CDP_MODE_TIME_CODE) && tc + && tc->config.fps_n > 0) flags |= 0x80; /* reserved */ @@ -1022,7 +1078,8 @@ gst_byte_writer_put_uint16_be_unchecked (&bw, self->cdp_hdr_sequence_cntr); - if (tc && tc->config.fps_n > 0) { + if ((self->cdp_mode & GST_CC_CONVERTER_CDP_MODE_TIME_CODE) && tc + && tc->config.fps_n > 0) { guint8 u8; gst_byte_writer_put_uint8_unchecked (&bw, 0x71); @@ -1061,14 +1118,16 @@ gst_byte_writer_put_uint8_unchecked (&bw, u8); } - gst_byte_writer_put_uint8_unchecked (&bw, 0x72); - gst_byte_writer_put_uint8_unchecked (&bw, 0xe0 | fps_entry->max_cc_count); - gst_byte_writer_put_data_unchecked (&bw, cc_data, cc_data_len); - while (fps_entry->max_cc_count > cc_data_len / 3) { - gst_byte_writer_put_uint8_unchecked (&bw, 0xfa); - gst_byte_writer_put_uint8_unchecked (&bw, 0x00); - gst_byte_writer_put_uint8_unchecked (&bw, 0x00); - cc_data_len += 3; + if ((self->cdp_mode & GST_CC_CONVERTER_CDP_MODE_CC_DATA)) { + gst_byte_writer_put_uint8_unchecked (&bw, 0x72); + gst_byte_writer_put_uint8_unchecked (&bw, 0xe0 | fps_entry->max_cc_count); + gst_byte_writer_put_data_unchecked (&bw, cc_data, cc_data_len); + while (fps_entry->max_cc_count > cc_data_len / 3) { + gst_byte_writer_put_uint8_unchecked (&bw, 0xfa); + 
gst_byte_writer_put_uint8_unchecked (&bw, 0x00); + gst_byte_writer_put_uint8_unchecked (&bw, 0x00); + cc_data_len += 3; + } } gst_byte_writer_put_uint8_unchecked (&bw, 0x74); @@ -1516,10 +1575,9 @@ static GstFlowReturn convert_cea608_raw_cea708_cdp (GstCCConverter * self, GstBuffer * inbuf, - GstBuffer * outbuf) + GstBuffer * outbuf, const GstVideoTimeCodeMeta * tc_meta) { GstMapInfo in, out; - const GstVideoTimeCodeMeta *tc_meta; const struct cdp_fps_entry *in_fps_entry, *out_fps_entry; guint cc_data_len = MAX_CDP_PACKET_LEN; guint cea608_1_len = MAX_CDP_PACKET_LEN; @@ -1555,10 +1613,6 @@ gst_buffer_unmap (inbuf, &in); cea608_1_len += n * 2; self->input_frames++; - - tc_meta = gst_buffer_get_video_time_code_meta (inbuf); - } else { - tc_meta = NULL; } out_fps_entry = cdp_fps_entry_from_fps (self->out_fps_n, self->out_fps_d); @@ -1669,10 +1723,9 @@ static GstFlowReturn convert_cea608_s334_1a_cea708_cdp (GstCCConverter * self, GstBuffer * inbuf, - GstBuffer * outbuf) + GstBuffer * outbuf, const GstVideoTimeCodeMeta * tc_meta) { GstMapInfo in, out; - const GstVideoTimeCodeMeta *tc_meta; const struct cdp_fps_entry *in_fps_entry, *out_fps_entry; guint cc_data_len = MAX_CDP_PACKET_LEN; guint cea608_1_len = MAX_CDP_PACKET_LEN, cea608_2_len = MAX_CDP_PACKET_LEN; @@ -1715,9 +1768,6 @@ } gst_buffer_unmap (inbuf, &in); self->input_frames++; - tc_meta = gst_buffer_get_video_time_code_meta (inbuf); - } else { - tc_meta = NULL; } out_fps_entry = cdp_fps_entry_from_fps (self->out_fps_n, self->out_fps_d); @@ -1840,10 +1890,9 @@ static GstFlowReturn convert_cea708_cc_data_cea708_cdp (GstCCConverter * self, GstBuffer * inbuf, - GstBuffer * outbuf) + GstBuffer * outbuf, const GstVideoTimeCodeMeta * tc_meta) { GstMapInfo in, out; - const GstVideoTimeCodeMeta *tc_meta; const struct cdp_fps_entry *in_fps_entry, *out_fps_entry; guint in_cc_data_len; guint cc_data_len = MAX_CDP_PACKET_LEN, ccp_data_len = MAX_CDP_PACKET_LEN; @@ -1856,12 +1905,10 @@ gst_buffer_map (inbuf, &in, 
GST_MAP_READ); in_cc_data = in.data; in_cc_data_len = in.size; - tc_meta = gst_buffer_get_video_time_code_meta (inbuf); self->input_frames++; } else { in_cc_data = NULL; in_cc_data_len = 0; - tc_meta = NULL; } in_fps_entry = cdp_fps_entry_from_fps (self->in_fps_n, self->in_fps_d); @@ -1912,7 +1959,7 @@ static GstFlowReturn convert_cea708_cdp_cea608_raw (GstCCConverter * self, GstBuffer * inbuf, - GstBuffer * outbuf) + GstBuffer * outbuf, const GstVideoTimeCodeMeta * tc_meta) { GstMapInfo out; GstVideoTimeCode tc = GST_VIDEO_TIME_CODE_INIT; @@ -1941,8 +1988,7 @@ gst_buffer_set_size (outbuf, cea608_1_len); - if (self->current_output_timecode.config.fps_n != 0 - && !gst_buffer_get_video_time_code_meta (inbuf)) { + if (self->current_output_timecode.config.fps_n != 0 && !tc_meta) { gst_buffer_add_video_time_code_meta (outbuf, &self->current_output_timecode); gst_video_time_code_increment_frame (&self->current_output_timecode); @@ -1953,7 +1999,7 @@ static GstFlowReturn convert_cea708_cdp_cea608_s334_1a (GstCCConverter * self, GstBuffer * inbuf, - GstBuffer * outbuf) + GstBuffer * outbuf, const GstVideoTimeCodeMeta * tc_meta) { GstMapInfo out; GstVideoTimeCode tc = GST_VIDEO_TIME_CODE_INIT; @@ -1992,8 +2038,7 @@ gst_buffer_set_size (outbuf, cc_data_len); - if (self->current_output_timecode.config.fps_n != 0 - && !gst_buffer_get_video_time_code_meta (inbuf)) { + if (self->current_output_timecode.config.fps_n != 0 && !tc_meta) { gst_buffer_add_video_time_code_meta (outbuf, &self->current_output_timecode); gst_video_time_code_increment_frame (&self->current_output_timecode); @@ -2008,7 +2053,7 @@ static GstFlowReturn convert_cea708_cdp_cea708_cc_data (GstCCConverter * self, GstBuffer * inbuf, - GstBuffer * outbuf) + GstBuffer * outbuf, const GstVideoTimeCodeMeta * tc_meta) { GstMapInfo out; GstVideoTimeCode tc = GST_VIDEO_TIME_CODE_INIT; @@ -2043,8 +2088,7 @@ gst_buffer_unmap (outbuf, &out); self->output_frames++; - if (self->current_output_timecode.config.fps_n != 0 - && 
!gst_buffer_get_video_time_code_meta (inbuf)) { + if (self->current_output_timecode.config.fps_n != 0 && !tc_meta) { gst_buffer_add_video_time_code_meta (outbuf, &self->current_output_timecode); gst_video_time_code_increment_frame (&self->current_output_timecode); @@ -2144,7 +2188,7 @@ ret = convert_cea608_raw_cea708_cc_data (self, inbuf, outbuf); break; case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: - ret = convert_cea608_raw_cea708_cdp (self, inbuf, outbuf); + ret = convert_cea608_raw_cea708_cdp (self, inbuf, outbuf, tc_meta); break; case GST_VIDEO_CAPTION_TYPE_CEA608_RAW: default: @@ -2163,7 +2207,8 @@ ret = convert_cea608_s334_1a_cea708_cc_data (self, inbuf, outbuf); break; case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: - ret = convert_cea608_s334_1a_cea708_cdp (self, inbuf, outbuf); + ret = + convert_cea608_s334_1a_cea708_cdp (self, inbuf, outbuf, tc_meta); break; case GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A: default: @@ -2182,7 +2227,8 @@ ret = convert_cea708_cc_data_cea608_s334_1a (self, inbuf, outbuf); break; case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: - ret = convert_cea708_cc_data_cea708_cdp (self, inbuf, outbuf); + ret = + convert_cea708_cc_data_cea708_cdp (self, inbuf, outbuf, tc_meta); break; case GST_VIDEO_CAPTION_TYPE_CEA708_RAW: default: @@ -2195,13 +2241,15 @@ switch (self->output_caption_type) { case GST_VIDEO_CAPTION_TYPE_CEA608_RAW: - ret = convert_cea708_cdp_cea608_raw (self, inbuf, outbuf); + ret = convert_cea708_cdp_cea608_raw (self, inbuf, outbuf, tc_meta); break; case GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A: - ret = convert_cea708_cdp_cea608_s334_1a (self, inbuf, outbuf); + ret = + convert_cea708_cdp_cea608_s334_1a (self, inbuf, outbuf, tc_meta); break; case GST_VIDEO_CAPTION_TYPE_CEA708_RAW: - ret = convert_cea708_cdp_cea708_cc_data (self, inbuf, outbuf); + ret = + convert_cea708_cdp_cea708_cc_data (self, inbuf, outbuf, tc_meta); break; case GST_VIDEO_CAPTION_TYPE_CEA708_CDP: ret = convert_cea708_cdp_cea708_cdp (self, inbuf, outbuf); @@ -2454,14 +2502,68 @@ } 
static void +gst_cc_converter_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstCCConverter *filter = GST_CCCONVERTER (object); + + switch (prop_id) { + case PROP_CDP_MODE: + filter->cdp_mode = g_value_get_flags (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_cc_converter_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstCCConverter *filter = GST_CCCONVERTER (object); + + switch (prop_id) { + case PROP_CDP_MODE: + g_value_set_flags (value, filter->cdp_mode); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void gst_cc_converter_class_init (GstCCConverterClass * klass) { + GObjectClass *gobject_class; GstElementClass *gstelement_class; GstBaseTransformClass *basetransform_class; + gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; basetransform_class = (GstBaseTransformClass *) klass; + gobject_class->set_property = gst_cc_converter_set_property; + gobject_class->get_property = gst_cc_converter_get_property; + + /** + * GstCCConverter:cdp-mode + * + * Only insert the selection sections into CEA 708 CDP packets. + * + * Various software does not handle any other information than CC data + * contained in CDP packets and might fail parsing the packets otherwise. 
+ * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_CDP_MODE, g_param_spec_flags ("cdp-mode", + "CDP Mode", + "Select which CDP sections to store in CDP packets", + GST_TYPE_CC_CONVERTER_CDP_MODE, DEFAULT_CDP_MODE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + gst_element_class_set_static_metadata (gstelement_class, "Closed Caption Converter", "Filter/ClosedCaption", @@ -2490,9 +2592,12 @@ GST_DEBUG_CATEGORY_INIT (gst_cc_converter_debug, "ccconverter", 0, "Closed Caption converter"); + + gst_type_mark_as_plugin_api (GST_TYPE_CC_CONVERTER_CDP_MODE, 0); } static void gst_cc_converter_init (GstCCConverter * self) { + self->cdp_mode = DEFAULT_CDP_MODE; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstccconverter.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstccconverter.h
Changed
@@ -43,10 +43,18 @@ #define MAX_CDP_PACKET_LEN 256 #define MAX_CEA608_LEN 32 +typedef enum { + GST_CC_CONVERTER_CDP_MODE_TIME_CODE = (1<<0), + GST_CC_CONVERTER_CDP_MODE_CC_DATA = (1<<1), + GST_CC_CONVERTER_CDP_MODE_CC_SVC_INFO = (1<<2) +} GstCCConverterCDPMode; + struct _GstCCConverter { GstBaseTransform parent; + GstCCConverterCDPMode cdp_mode; + GstVideoCaptionType input_caption_type; GstVideoCaptionType output_caption_type; @@ -80,5 +88,7 @@ GType gst_cc_converter_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ccconverter); + G_END_DECLS #endif /* __GST_CCCONVERTER_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstccextractor.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstccextractor.c
Changed
@@ -65,8 +65,10 @@ ("closedcaption/x-cea-608,format={ (string) raw, (string) s334-1a}; " "closedcaption/x-cea-708,format={ (string) cc_data, (string) cdp }")); -G_DEFINE_TYPE (GstCCExtractor, gst_cc_extractor, GST_TYPE_ELEMENT); #define parent_class gst_cc_extractor_parent_class +G_DEFINE_TYPE (GstCCExtractor, gst_cc_extractor, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (ccextractor, "ccextractor", + GST_RANK_NONE, GST_TYPE_CCEXTRACTOR); static gboolean gst_cc_extractor_sink_event (GstPad * pad, GstObject * parent, GstEvent * event);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstccextractor.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstccextractor.h
Changed
@@ -61,5 +61,7 @@ GType gst_cc_extractor_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ccextractor); + G_END_DECLS #endif /* __GST_CCEXTRACTOR_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstceaccoverlay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstceaccoverlay.c
Changed
@@ -195,6 +195,9 @@ return type; } +GST_ELEMENT_REGISTER_DEFINE (cc708overlay, "cc708overlay", + GST_RANK_PRIMARY, GST_TYPE_CEA_CC_OVERLAY); + static void gst_base_cea_cc_overlay_base_init (gpointer g_class) {
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstceaccoverlay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstceaccoverlay.h
Changed
@@ -132,5 +132,7 @@ GType gst_cea_cc_overlay_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (cc708overlay); + G_END_DECLS #endif /* __GST_CEA_CC_OVERLAY_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstclosedcaption.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstclosedcaption.c
Changed
@@ -35,25 +35,14 @@ static gboolean closedcaption_init (GstPlugin * plugin) { - gboolean ret; - - ret = gst_element_register (plugin, "cccombiner", GST_RANK_NONE, - GST_TYPE_CCCOMBINER); - - ret &= gst_element_register (plugin, "ccconverter", GST_RANK_NONE, - GST_TYPE_CCCONVERTER); - - ret &= gst_element_register (plugin, "ccextractor", GST_RANK_NONE, - GST_TYPE_CCEXTRACTOR); - - ret &= gst_element_register (plugin, "line21decoder", GST_RANK_NONE, - GST_TYPE_LINE21DECODER); - - ret &= gst_element_register (plugin, "cc708overlay", GST_RANK_PRIMARY, - GST_TYPE_CEA_CC_OVERLAY); - - ret &= gst_element_register (plugin, "line21encoder", GST_RANK_NONE, - GST_TYPE_LINE21ENCODER); + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (cccombiner, plugin); + ret |= GST_ELEMENT_REGISTER (ccconverter, plugin); + ret |= GST_ELEMENT_REGISTER (ccextractor, plugin); + ret |= GST_ELEMENT_REGISTER (line21decoder, plugin); + ret |= GST_ELEMENT_REGISTER (cc708overlay, plugin); + ret |= GST_ELEMENT_REGISTER (line21encoder, plugin); return ret; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstline21dec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstline21dec.c
Changed
@@ -37,7 +37,26 @@ GST_DEBUG_CATEGORY_STATIC (gst_line_21_decoder_debug); #define GST_CAT_DEFAULT gst_line_21_decoder_debug -#define CAPS "video/x-raw, format={ I420, YUY2, YVYU, UYVY, VYUY, v210 }, interlace-mode=interleaved" +/** + * GstLine21DecoderMode: + * @GST_LINE_21_DECODER_MODE_ADD: add new CC meta on top of other CC meta, if any + * @GST_LINE_21_DECODER_MODE_DROP: ignore CC if a CC meta was already present + * @GST_LINE_21_DECODER_MODE_REPLACE: replace existing CC meta + * + * Since: 1.20 + */ + +enum +{ + PROP_0, + PROP_NTSC_ONLY, + PROP_MODE, +}; + +#define DEFAULT_NTSC_ONLY FALSE +#define DEFAULT_MODE GST_LINE_21_DECODER_MODE_ADD + +#define CAPS "video/x-raw, format={ I420, YUY2, YVYU, UYVY, VYUY, v210 }" static GstStaticPadTemplate sinktemplate = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, @@ -49,8 +68,37 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS (CAPS)); -G_DEFINE_TYPE (GstLine21Decoder, gst_line_21_decoder, GST_TYPE_VIDEO_FILTER); #define parent_class gst_line_21_decoder_parent_class +G_DEFINE_TYPE (GstLine21Decoder, gst_line_21_decoder, GST_TYPE_VIDEO_FILTER); +GST_ELEMENT_REGISTER_DEFINE (line21decoder, "line21decoder", + GST_RANK_NONE, GST_TYPE_LINE21DECODER); + +#define GST_TYPE_LINE_21_DECODER_MODE (gst_line_21_decoder_mode_get_type()) +static GType +gst_line_21_decoder_mode_get_type (void) +{ + static const GEnumValue values[] = { + {GST_LINE_21_DECODER_MODE_ADD, + "add new CC meta on top of other CC meta, if any", "add"}, + {GST_LINE_21_DECODER_MODE_DROP, + "ignore CC if a CC meta was already present", + "drop"}, + {GST_LINE_21_DECODER_MODE_REPLACE, + "replace existing CC meta", "replace"}, + {0, NULL, NULL} + }; + static volatile GType id = 0; + + if (g_once_init_enter ((gsize *) & id)) { + GType _id; + + _id = g_enum_register_static ("GstLine21DecoderMode", values); + + g_once_init_leave ((gsize *) & id, _id); + } + + return id; +} static void gst_line_21_decoder_finalize (GObject * self); static gboolean gst_line_21_decoder_stop 
(GstBaseTransform * btrans); @@ -63,6 +111,44 @@ * trans, GstBuffer * in, GstBuffer ** out); static void +gst_line_21_decoder_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstLine21Decoder *enc = GST_LINE21DECODER (object); + + switch (prop_id) { + case PROP_MODE: + enc->mode = g_value_get_enum (value); + break; + case PROP_NTSC_ONLY: + enc->ntsc_only = g_value_get_boolean (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_line_21_decoder_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstLine21Decoder *enc = GST_LINE21DECODER (object); + + switch (prop_id) { + case PROP_MODE: + g_value_set_enum (value, enc->mode); + break; + case PROP_NTSC_ONLY: + g_value_set_boolean (value, enc->ntsc_only); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void gst_line_21_decoder_class_init (GstLine21DecoderClass * klass) { GObjectClass *gobject_class; @@ -76,6 +162,40 @@ filter_class = (GstVideoFilterClass *) klass; gobject_class->finalize = gst_line_21_decoder_finalize; + gobject_class->set_property = gst_line_21_decoder_set_property; + gobject_class->get_property = gst_line_21_decoder_get_property; + + /** + * line21decoder:ntsc-only + * + * Whether line 21 decoding should only be attempted when the + * input resolution matches NTSC (720 x 525) or NTSC usable + * lines (720 x 486) + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_NTSC_ONLY, g_param_spec_boolean ("ntsc-only", + "NTSC only", + "Whether line 21 decoding should only be attempted when the " + "input resolution matches NTSC", DEFAULT_NTSC_ONLY, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstLine21Decoder:mode + * + * Control whether and how detected CC meta should be inserted + * in the list of existing CC meta on a frame (if any). 
+ * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_MODE, g_param_spec_enum ("mode", + "Mode", + "Control whether and how detected CC meta should be inserted " + "in the list of existing CC meta on a frame (if any).", + GST_TYPE_LINE_21_DECODER_MODE, DEFAULT_MODE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); gst_element_class_set_static_metadata (gstelement_class, "Line 21 CC Decoder", @@ -96,6 +216,8 @@ GST_DEBUG_CATEGORY_INIT (gst_line_21_decoder_debug, "line21decoder", 0, "Line 21 CC Decoder"); vbi_initialize_gst_debug (); + + gst_type_mark_as_plugin_api (GST_TYPE_LINE_21_DECODER_MODE, 0); } static void @@ -103,8 +225,11 @@ { GstLine21Decoder *self = (GstLine21Decoder *) filter; + self->info = NULL; self->line21_offset = -1; self->max_line_probes = 40; + self->ntsc_only = DEFAULT_NTSC_ONLY; + self->mode = DEFAULT_MODE; } static vbi_pixfmt @@ -165,12 +290,27 @@ /* Scan the next frame from the first line */ self->line21_offset = -1; + if (!GST_VIDEO_INFO_IS_INTERLACED (in_info)) { + GST_DEBUG_OBJECT (filter, "Only interlaced formats are supported"); + self->compatible_format = FALSE; + return TRUE; + } + if (GST_VIDEO_INFO_WIDTH (in_info) != 720) { GST_DEBUG_OBJECT (filter, "Only 720 pixel wide formats are supported"); self->compatible_format = FALSE; return TRUE; } + if (self->ntsc_only && + GST_VIDEO_INFO_HEIGHT (in_info) != 525 && + GST_VIDEO_INFO_HEIGHT (in_info) != 486) { + GST_DEBUG_OBJECT (filter, + "NTSC-only, only 525 or 486 pixel high formats are supported"); + self->compatible_format = FALSE; + return TRUE; + } + if (fmt == 0) { if (GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_v210) { GST_DEBUG_OBJECT (filter, @@ -333,6 +473,15 @@ return self->converted_lines; } +static gboolean +drop_cc_meta (GstBuffer * buffer, GstMeta ** meta, gpointer unused) +{ + if ((*meta)->info->api == GST_VIDEO_CAPTION_META_API_TYPE) + *meta = NULL; + + return TRUE; +} + /* Call this to scan for CC * Returns TRUE if it was found 
and set, else FALSE */ static gboolean @@ -343,6 +492,13 @@ gboolean found = FALSE; guint8 *data; + if (self->mode == GST_LINE_21_DECODER_MODE_DROP && + gst_buffer_get_n_meta (frame->buffer, + GST_VIDEO_CAPTION_META_API_TYPE) > 0) { + GST_DEBUG_OBJECT (self, "Mode drop and buffer had CC meta, ignoring"); + return FALSE; + } + GST_DEBUG_OBJECT (self, "Starting probing. max_line_probes:%d", self->max_line_probes); @@ -372,7 +528,6 @@ } if (!found) { - GST_DEBUG_OBJECT (self, "No CC found"); self->line21_offset = -1; } else { guint base_line1 = 0, base_line2 = 0; @@ -386,6 +541,12 @@ base_line2 = 318; } + if (self->mode == GST_LINE_21_DECODER_MODE_REPLACE) { + GST_DEBUG_OBJECT (self, + "Mode replace and new CC meta, removing existing CC meta"); + gst_buffer_foreach_meta (frame->buffer, drop_cc_meta, NULL); + } + ccdata[0] |= (base_line1 < i ? i - base_line1 : 0) & 0x1f; ccdata[1] = sliced[0].data[0]; ccdata[2] = sliced[0].data[1]; @@ -426,6 +587,10 @@ GstLine21Decoder *self = (GstLine21Decoder *) btrans; vbi_raw_decoder_destroy (&self->zvbi_decoder); + if (self->info) { + gst_video_info_free (self->info); + self->info = NULL; + } return TRUE; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstline21dec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstline21dec.h
Changed
@@ -41,6 +41,12 @@ typedef struct _GstLine21Decoder GstLine21Decoder; typedef struct _GstLine21DecoderClass GstLine21DecoderClass; +typedef enum { + GST_LINE_21_DECODER_MODE_ADD, + GST_LINE_21_DECODER_MODE_DROP, + GST_LINE_21_DECODER_MODE_REPLACE, +} GstLine21DecoderMode; + struct _GstLine21Decoder { GstVideoFilter parent; @@ -60,6 +66,9 @@ guint8 *converted_lines; GstVideoInfo *info; + + gboolean ntsc_only; + GstLine21DecoderMode mode; }; struct _GstLine21DecoderClass @@ -69,5 +78,7 @@ GType gst_line_21_decoder_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (line21decoder); + G_END_DECLS #endif /* __GST_LINE21DECODER_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstline21enc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstline21enc.c
Changed
@@ -60,6 +60,8 @@ G_DEFINE_TYPE (GstLine21Encoder, gst_line_21_encoder, GST_TYPE_VIDEO_FILTER); #define parent_class gst_line_21_encoder_parent_class +GST_ELEMENT_REGISTER_DEFINE (line21encoder, "line21encoder", + GST_RANK_NONE, GST_TYPE_LINE21ENCODER); static gboolean gst_line_21_encoder_set_info (GstVideoFilter * filter, GstCaps * incaps, GstVideoInfo * in_info,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/gstline21enc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/gstline21enc.h
Changed
@@ -59,5 +59,7 @@ GType gst_line_21_encoder_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (line21encoder); + G_END_DECLS #endif /* __GST_LINE21ENCODER_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/io-sim.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/io-sim.c
Changed
@@ -541,7 +541,7 @@ } } else { bounds: - warning (caller, "Sliced line %u out of bounds.", sliced->line); + warn (caller, "Sliced line %u out of bounds.", sliced->line); return FALSE; } @@ -620,7 +620,7 @@ break; default: - warning (caller, + warn (caller, "Service 0x%08x (%s) not supported.", sliced->id, vbi_sliced_name (sliced->id)); return FALSE; @@ -646,7 +646,7 @@ n_scan_lines = sp->count[0] + sp->count[1]; if (unlikely (n_scan_lines * sp->bytes_per_line > raw_size)) { - warning (__FUNCTION__, + warn (__FUNCTION__, "(%u + %u lines) * %lu bytes_per_line " "> %lu raw_size.", sp->count[0], sp->count[1], @@ -655,7 +655,7 @@ } if (unlikely (0 != white_level && blank_level > white_level)) { - warning (__FUNCTION__, + warn (__FUNCTION__, "Invalid blanking %d or peak white level %d.", blank_level, white_level); } @@ -796,7 +796,7 @@ n_scan_lines = sp->count[0] + sp->count[1]; if (unlikely (n_scan_lines * sp->bytes_per_line > raw_size)) { - warning (__FUNCTION__, + warn (__FUNCTION__, "%u + %u lines * %lu bytes_per_line > %lu raw_size.", sp->count[0], sp->count[1], (unsigned long) sp->bytes_per_line, raw_size); @@ -805,7 +805,7 @@ if (unlikely (0 != white_level && (blank_level > black_level || black_level > white_level))) { - warning (__FUNCTION__, + warn (__FUNCTION__, "Invalid blanking %d, black %d or peak " "white level %d.", blank_level, black_level, white_level); }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/meson.build
Changed
@@ -1,4 +1,4 @@ -pangocairo_dep = dependency('pangocairo', version : '>= 1.22.0', +closedcaption_dep = dependency('pangocairo', version : '>= 1.22.0', required : get_option('closedcaption')) zvbi_sources = [ @@ -9,7 +9,7 @@ 'io-sim.c', ] -if pangocairo_dep.found() +if closedcaption_dep.found() gstclosedcaption = library('gstclosedcaption', 'gstcccombiner.c', 'gstccextractor.c', 'gstccconverter.c', 'gstclosedcaption.c', 'gstline21dec.c', 'gstcea708decoder.c', 'gstceaccoverlay.c', 'gstline21enc.c', @@ -17,7 +17,7 @@ c_args : gst_plugins_bad_args, link_args : noseh_link_args, include_directories : [configinc], - dependencies : [gstvideo_dep, gstbase_dep, gst_dep, pangocairo_dep, libm], + dependencies : [gstvideo_dep, gstbase_dep, gst_dep, closedcaption_dep, libm], install : true, install_dir : plugins_install_dir, )
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/misc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/misc.h
Changed
@@ -348,7 +348,7 @@ #ifdef G_HAVE_GNUC_VARARGS #define error(hook, templ, args...) \ VBI_CAT_LEVEL_LOG (GST_LEVEL_ERROR, NULL, templ , ##args) -#define warning(hook, templ, args...) \ +#define warn(hook, templ, args...) \ VBI_CAT_LEVEL_LOG (GST_LEVEL_WARNING, NULL, templ , ##args) #define notice(hook, templ, args...) \ VBI_CAT_LEVEL_LOG (GST_LEVEL_INFO, NULL, templ , ##args) @@ -363,7 +363,7 @@ #elif defined(G_HAVE_ISO_VARARGS) #define error(hook, templ, ...) \ VBI_CAT_LEVEL_LOG (GST_LEVEL_ERROR, NULL, templ, __VA_ARGS__) -#define warning(hook, templ, ...) \ +#define warn(hook, templ, ...) \ VBI_CAT_LEVEL_LOG (GST_LEVEL_WARNING, NULL, templ, __VA_ARGS__) #define notice(hook, templ, ...) \ VBI_CAT_LEVEL_LOG (GST_LEVEL_INFO, NULL, templ, __VA_ARGS__)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/closedcaption/sampling_par.c -> gst-plugins-bad-1.20.1.tar.xz/ext/closedcaption/sampling_par.c
Changed
@@ -374,7 +374,7 @@ if (0 == (VBI_VIDEOSTD_SET_ALL & videostd_set_req) || ((VBI_VIDEOSTD_SET_525_60 & videostd_set_req) && (VBI_VIDEOSTD_SET_625_50 & videostd_set_req))) { - warning (log, + warn (log, "Ambiguous videostd_set 0x%lx.", (unsigned long) videostd_set_req); CLEAR (*sp); return 0;
View file
gst-plugins-bad-1.18.6.tar.xz/ext/colormanagement/gstcolormanagement.c -> gst-plugins-bad-1.20.1.tar.xz/ext/colormanagement/gstcolormanagement.c
Changed
@@ -27,7 +27,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "lcms", GST_RANK_NONE, GST_TYPE_LCMS); + return GST_ELEMENT_REGISTER (lcms, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/colormanagement/gstlcms.c -> gst-plugins-bad-1.20.1.tar.xz/ext/colormanagement/gstlcms.c
Changed
@@ -158,6 +158,7 @@
     GstVideoFrame * outframe);
 
 G_DEFINE_TYPE (GstLcms, gst_lcms, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE (lcms, "lcms", GST_RANK_NONE, GST_TYPE_LCMS);
 
 static void
 gst_lcms_class_init (GstLcmsClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/colormanagement/gstlcms.h -> gst-plugins-bad-1.20.1.tar.xz/ext/colormanagement/gstlcms.h
Changed
@@ -98,5 +98,7 @@
 G_GNUC_INTERNAL GType gst_lcms_get_type (void);
 G_GNUC_INTERNAL GType gst_lcms_intent_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (lcms);
+
 G_END_DECLS
 #endif /* __GST_LCMS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurl.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurl.c
Changed
@@ -20,55 +20,25 @@ #include <config.h> #endif -#include <gst/gst-i18n-plugin.h> +#include "gstcurlelements.h" -#include "gstcurlbasesink.h" -#include "gstcurltlssink.h" -#include "gstcurlhttpsink.h" -#include "gstcurlfilesink.h" -#include "gstcurlftpsink.h" -#include "gstcurlsmtpsink.h" -#ifdef HAVE_SSH2 -#include "gstcurlsftpsink.h" -#endif -#include "gstcurlhttpsrc.h" static gboolean plugin_init (GstPlugin * plugin) { -#ifdef ENABLE_NLS - GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, - LOCALEDIR); - bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); - bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); -#endif /* ENABLE_NLS */ - - if (!gst_element_register (plugin, "curlhttpsink", GST_RANK_NONE, - GST_TYPE_CURL_HTTP_SINK)) - return FALSE; - - if (!gst_element_register (plugin, "curlfilesink", GST_RANK_NONE, - GST_TYPE_CURL_FILE_SINK)) - return FALSE; - - if (!gst_element_register (plugin, "curlftpsink", GST_RANK_NONE, - GST_TYPE_CURL_FTP_SINK)) - return FALSE; + gboolean ret = FALSE; - if (!gst_element_register (plugin, "curlsmtpsink", GST_RANK_NONE, - GST_TYPE_CURL_SMTP_SINK)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (curlhttpsink, plugin); + ret |= GST_ELEMENT_REGISTER (curlfilesink, plugin); + ret |= GST_ELEMENT_REGISTER (curlftpsink, plugin); + ret |= GST_ELEMENT_REGISTER (curlsmtpsink, plugin); #ifdef HAVE_SSH2 - if (!gst_element_register (plugin, "curlsftpsink", GST_RANK_NONE, - GST_TYPE_CURL_SFTP_SINK)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (curlsftpsink, plugin); #endif - if (!gst_element_register (plugin, "curlhttpsrc", GST_RANK_SECONDARY, - GST_TYPE_CURLHTTPSRC)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (curlhttpsrc, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlbasesink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlbasesink.c
Changed
@@ -121,6 +121,8 @@ size_t nmemb, void *stream); static size_t gst_curl_base_sink_transfer_write_cb (void *ptr, size_t size, size_t nmemb, void *stream); +static int gst_curl_base_sink_transfer_seek_cb (void *user_p, curl_off_t offset, + int origin); static size_t gst_curl_base_sink_transfer_data_buffer (GstCurlBaseSink * sink, void *curl_ptr, size_t block_size, guint * last_chunk); #ifndef GST_DISABLE_GST_DEBUG @@ -704,6 +706,21 @@ curl_easy_strerror (res)); return FALSE; } + + res = curl_easy_setopt (sink->curl, CURLOPT_SEEKDATA, sink); + if (res != CURLE_OK) { + sink->error = g_strdup_printf ("failed to set seek user data: %s", + curl_easy_strerror (res)); + return FALSE; + } + res = curl_easy_setopt (sink->curl, CURLOPT_SEEKFUNCTION, + gst_curl_base_sink_transfer_seek_cb); + if (res != CURLE_OK) { + sink->error = g_strdup_printf ("failed to set seek function: %s", + curl_easy_strerror (res)); + return FALSE; + } + /* Time out in case transfer speed in bytes per second stay below * CURLOPT_LOW_SPEED_LIMIT during CURLOPT_LOW_SPEED_TIME */ res = curl_easy_setopt (sink->curl, CURLOPT_LOW_SPEED_LIMIT, 1L); @@ -882,6 +899,45 @@ return realsize; } +static int +gst_curl_base_sink_transfer_seek_cb (void *stream, curl_off_t offset, + int origin) +{ + GstCurlBaseSink *sink; + curl_off_t buf_size; + + /* + * Origin is SEEK_SET, SEEK_CUR or SEEK_END, + * libcurl currently only passes SEEK_SET. 
+ */ + + sink = (GstCurlBaseSink *) stream; + + GST_OBJECT_LOCK (sink); + buf_size = sink->transfer_buf->offset + sink->transfer_buf->len; + + switch (origin) { + case SEEK_SET: + if ((0 <= offset) && (offset <= buf_size)) { + sink->transfer_buf->offset = offset; + sink->transfer_buf->len = buf_size - offset; + } else { + GST_OBJECT_UNLOCK (sink); + return CURL_SEEKFUNC_FAIL; + } + break; + case SEEK_CUR: + case SEEK_END: + default: + GST_OBJECT_UNLOCK (sink); + return CURL_SEEKFUNC_FAIL; + break; + } + + GST_OBJECT_UNLOCK (sink); + return CURL_SEEKFUNC_OK; +} + CURLcode gst_curl_base_sink_transfer_check (GstCurlBaseSink * sink) { @@ -1036,7 +1092,7 @@ case CURLINFO_TEXT: case CURLINFO_HEADER_IN: case CURLINFO_HEADER_OUT: - msg = g_memdup (data, size); + msg = g_memdup2 (data, size); if (size > 0) { msg[size - 1] = '\0'; g_strchomp (msg);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlelement.c
Added
@@ -0,0 +1,41 @@ +/* GStreamer + * Copyright (C) 2011 Axis Communications <dev-gstreamer@axis.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst-i18n-plugin.h> + +#include "gstcurlelements.h" + +void +curl_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { +#ifdef ENABLE_NLS + GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, + LOCALEDIR); + bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); + bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); +#endif /* ENABLE_NLS */ + g_once_init_leave (&res, TRUE); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlelements.h
Added
@@ -0,0 +1,39 @@ +/* GStreamer + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_CURL_ELEMENTS_H__ +#define __GST_CURL_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void curl_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (curlfilesink); +GST_ELEMENT_REGISTER_DECLARE (curlftpsink); +GST_ELEMENT_REGISTER_DECLARE (curlhttpsink); +GST_ELEMENT_REGISTER_DECLARE (curlhttpsrc); +GST_ELEMENT_REGISTER_DECLARE (curlsftpsink); +GST_ELEMENT_REGISTER_DECLARE (curlsmtpsink); + +#endif /* __GST_CURL_ELEMENT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlfilesink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlfilesink.c
Changed
@@ -59,6 +59,7 @@
 #include <sys/stat.h>
 #include <fcntl.h>
 
+#include "gstcurlelements.h"
 #include "gstcurlbasesink.h"
 #include "gstcurlfilesink.h"
 
@@ -94,6 +95,8 @@
 
 #define gst_curl_file_sink_parent_class parent_class
 G_DEFINE_TYPE (GstCurlFileSink, gst_curl_file_sink, GST_TYPE_CURL_BASE_SINK);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (curlfilesink, "curlfilesink",
+    GST_RANK_NONE, GST_TYPE_CURL_FILE_SINK, curl_element_init (plugin));
 
 static void
 gst_curl_file_sink_class_init (GstCurlFileSinkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlftpsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlftpsink.c
Changed
@@ -62,6 +62,7 @@
 #include <sys/stat.h>
 #include <fcntl.h>
 
+#include "gstcurlelements.h"
 #include "gstcurltlssink.h"
 #include "gstcurlftpsink.h"
 
@@ -101,6 +102,8 @@
 
 #define gst_curl_ftp_sink_parent_class parent_class
 G_DEFINE_TYPE (GstCurlFtpSink, gst_curl_ftp_sink, GST_TYPE_CURL_TLS_SINK);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (curlftpsink, "curlftpsink",
+    GST_RANK_NONE, GST_TYPE_CURL_FTP_SINK, curl_element_init (plugin));
 
 static void
 gst_curl_ftp_sink_class_init (GstCurlFtpSinkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlhttpsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlhttpsink.c
Changed
@@ -64,6 +64,7 @@ #include <sys/stat.h> #include <fcntl.h> +#include "gstcurlelements.h" #include "gstcurltlssink.h" #include "gstcurlhttpsink.h" @@ -111,7 +112,8 @@ #define gst_curl_http_sink_parent_class parent_class G_DEFINE_TYPE (GstCurlHttpSink, gst_curl_http_sink, GST_TYPE_CURL_TLS_SINK); - +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (curlhttpsink, "curlhttpsink", + GST_RANK_NONE, GST_TYPE_CURL_HTTP_SINK, curl_element_init (plugin)); /* private functions */ static gboolean proxy_setup (GstCurlBaseSink * bcsink);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlhttpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlhttpsrc.c
Changed
@@ -118,6 +118,7 @@ #include <gst/gst-i18n-plugin.h> +#include "gstcurlelements.h" #include "gstcurlhttpsrc.h" #include "gstcurlqueue.h" #include "gstcurldefaults.h" @@ -254,6 +255,8 @@ G_DEFINE_TYPE_WITH_CODE (GstCurlHttpSrc, gst_curl_http_src, GST_TYPE_PUSH_SRC, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_curl_http_src_uri_handler_init)); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (curlhttpsrc, "curlhttpsrc", + GST_RANK_SECONDARY, GST_TYPE_CURLHTTPSRC, curl_element_init (plugin)); static void gst_curl_http_src_class_init (GstCurlHttpSrcClass * klass) @@ -2124,7 +2127,7 @@ switch (type) { case CURLINFO_TEXT: case CURLINFO_HEADER_OUT: - msg = g_memdup (data, size); + msg = g_memdup2 (data, size); if (size > 0) { msg[size - 1] = '\0'; g_strchomp (msg);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlsftpsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlsftpsink.c
Changed
@@ -44,6 +44,7 @@
 #include "config.h"
 #endif
 
+#include "gstcurlelements.h"
 #include "gstcurlsshsink.h"
 #include "gstcurlsftpsink.h"
 
@@ -92,6 +93,8 @@
 
 #define gst_curl_sftp_sink_parent_class parent_class
 G_DEFINE_TYPE (GstCurlSftpSink, gst_curl_sftp_sink, GST_TYPE_CURL_SSH_SINK);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (curlsftpsink, "curlsftpsink",
+    GST_RANK_NONE, GST_TYPE_CURL_SFTP_SINK, curl_element_init (plugin));
 
 static void
 gst_curl_sftp_sink_class_init (GstCurlSftpSinkClass * klass)
@@ -103,7 +106,7 @@
   GST_DEBUG_CATEGORY_INIT (gst_curl_sftp_sink_debug, "curlsftpsink", 0,
       "curl sftp sink element");
 
-  GST_DEBUG_OBJECT (klass, "class_init");
+  GST_DEBUG ("class_init");
 
   gst_element_class_set_static_metadata (element_class,
       "Curl sftp sink",
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlsmtpsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlsmtpsink.c
Changed
@@ -72,6 +72,7 @@
 #include <sys/stat.h>
 #include <fcntl.h>
 
+#include "gstcurlelements.h"
 #include "gstcurltlssink.h"
 #include "gstcurlsmtpsink.h"
 
@@ -134,6 +135,8 @@
 
 #define gst_curl_smtp_sink_parent_class parent_class
 G_DEFINE_TYPE (GstCurlSmtpSink, gst_curl_smtp_sink, GST_TYPE_CURL_TLS_SINK);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (curlsmtpsink, "curlsmtpsink",
+    GST_RANK_NONE, GST_TYPE_CURL_SMTP_SINK, curl_element_init (plugin));
 
 static void
 gst_curl_smtp_sink_notify_transfer_end_unlocked (GstCurlSmtpSink * sink)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlsshsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlsshsink.c
Changed
@@ -66,6 +66,7 @@ PROP_SSH_KEY_PASSPHRASE, PROP_SSH_KNOWNHOSTS, PROP_SSH_HOST_PUBLIC_KEY_MD5, + PROP_SSH_HOST_PUBLIC_KEY_SHA256, PROP_SSH_ACCEPT_UNKNOWNHOST }; @@ -167,6 +168,16 @@ "remote host's public key", NULL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + g_object_class_install_property (gobject_class, + PROP_SSH_HOST_PUBLIC_KEY_SHA256, + g_param_spec_string ("ssh-host-pubkey-sha256", + "SHA256 checksum of the remote host's public key", + "SHA256 checksum (Base64 encoded) of the remote host's public key", + NULL, G_PARAM_READWRITE | GST_PARAM_CONDITIONALLY_AVAILABLE | + G_PARAM_STATIC_STRINGS)); +#endif + g_object_class_install_property (gobject_class, PROP_SSH_ACCEPT_UNKNOWNHOST, g_param_spec_boolean ("ssh-accept-unknownhost", "SSH accept unknown host", @@ -186,6 +197,7 @@ sink->ssh_key_passphrase = NULL; sink->ssh_knownhosts = NULL; sink->ssh_host_public_key_md5 = NULL; + sink->ssh_host_public_key_sha256 = NULL; sink->ssh_accept_unknownhost = FALSE; } @@ -201,6 +213,7 @@ g_free (this->ssh_key_passphrase); g_free (this->ssh_knownhosts); g_free (this->ssh_host_public_key_md5); + g_free (this->ssh_host_public_key_sha256); G_OBJECT_CLASS (parent_class)->finalize (gobject); } @@ -262,6 +275,13 @@ sink->ssh_host_public_key_md5); break; + case PROP_SSH_HOST_PUBLIC_KEY_SHA256: + g_free (sink->ssh_host_public_key_sha256); + sink->ssh_host_public_key_sha256 = g_value_dup_string (value); + GST_DEBUG_OBJECT (sink, "ssh_host_public_key_sha256 set to %s", + sink->ssh_host_public_key_sha256); + break; + case PROP_SSH_ACCEPT_UNKNOWNHOST: sink->ssh_accept_unknownhost = g_value_get_boolean (value); GST_DEBUG_OBJECT (sink, "ssh_accept_unknownhost set to %d", @@ -309,6 +329,10 @@ g_value_set_string (value, sink->ssh_host_public_key_md5); break; + case PROP_SSH_HOST_PUBLIC_KEY_SHA256: + g_value_set_string (value, sink->ssh_host_public_key_sha256); + break; + case PROP_SSH_ACCEPT_UNKNOWNHOST: g_value_set_boolean (value, 
sink->ssh_accept_unknownhost); break; @@ -371,6 +395,17 @@ return FALSE; } } +#if CURL_AT_LEAST_VERSION(7, 80, 0) + if (sink->ssh_host_public_key_sha256) { + if ((curl_err = + curl_easy_setopt (bcsink->curl, CURLOPT_SSH_HOST_PUBLIC_KEY_SHA256, + sink->ssh_host_public_key_sha256)) != CURLE_OK) { + bcsink->error = g_strdup_printf ("failed to set remote host's public " + "key SHA256: %s", curl_easy_strerror (curl_err)); + return FALSE; + } + } +#endif /* make sure we only accept PASSWORD or PUBLICKEY auth methods * (can be extended later) */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/gstcurlsshsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/gstcurlsshsink.h
Changed
@@ -74,6 +74,9 @@
                                        from remote host */
   gchar *ssh_host_public_key_md5;   /* MD5-hash of the remote host's public
                                        key: CURLOPT_SSH_HOST_PUBLIC_KEY_MD5 */
+  gchar *ssh_host_public_key_sha256; /* SHA256-hash of the remote host's public
+                                        key: CURLOPT_SSH_HOST_PUBLIC_KEY_SHA256
+                                      */
 };
 
 struct _GstCurlSshSinkClass
View file
gst-plugins-bad-1.18.6.tar.xz/ext/curl/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/curl/meson.build
Changed
@@ -1,6 +1,7 @@
 curl_sources = [
   'gstcurlbasesink.c',
   'gstcurl.c',
+  'gstcurlelement.c',
   'gstcurlfilesink.c',
   'gstcurlftpsink.c',
   'gstcurlhttpsink.c',
@@ -22,7 +23,7 @@
   curl_sources,
   c_args : gst_plugins_bad_args,
   include_directories : [configinc, libsinc],
-  dependencies : [gstbase_dep, curl_dep] + winsock2,
+  dependencies : [gstbase_dep, curl_dep] + winsock2 + network_deps,
   install : true,
   install_dir : plugins_install_dir,
 )
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstdashdemux.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstdashdemux.c
Changed
@@ -443,6 +443,8 @@
     GST_DEBUG_CATEGORY_INIT (gst_dash_demux_debug, "dashdemux", 0,
         "dashdemux element")
     );
+GST_ELEMENT_REGISTER_DEFINE (dashdemux, "dashdemux", GST_RANK_PRIMARY,
+    GST_TYPE_DASH_DEMUX);
 
 static void
 gst_dash_demux_dispose (GObject * obj)
@@ -896,7 +898,7 @@
     schemeIdUri = g_ascii_strdown (cp->schemeIdUri, -1);
     if (g_str_has_prefix (schemeIdUri, "urn:uuid:")) {
       pssi_len = strlen (cp->value);
-      pssi = gst_buffer_new_wrapped (g_memdup (cp->value, pssi_len), pssi_len);
+      pssi = gst_buffer_new_memdup (cp->value, pssi_len);
       /* RFC 4122 states that the hex part of a UUID is in lower case,
        * but some streams seem to ignore this and use upper case for the
        * protection system ID */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstdashdemux.h -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstdashdemux.h
Changed
@@ -155,6 +155,8 @@
 
 GType gst_dash_demux_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (dashdemux);
+
 G_END_DECLS
 
 #endif /* __GST_DASH_DEMUX_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstdashsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstdashsink.c
Changed
@@ -25,7 +25,7 @@ * * ## Example launch line * |[ - * gst-launch-1.0 dashsink name=dashsink max-files=5 audiotestsrc is-live=true ! avenc_aac ! dashsink.audio_0 videotestsrc is-live=true ! x264enc ! dashsink.video_0 + * gst-launch-1.0 dashsink name=dashsink audiotestsrc is-live=true ! avenc_aac ! dashsink.audio_0 videotestsrc is-live=true ! x264enc ! dashsink.video_0 * ]| * */ @@ -37,19 +37,19 @@ * Introduction: * * This element aims to generate the Media Pressentation Description XML based file - * used as DASH content in addition to the necessary media frgaments. + * used as DASH content in addition to the necessary media fragments. * Based on splitmuxsink branches to generate the media fragments, * the element will generate a new adaptation set for each media type (video/audio/test) * and a new representation for each additional stream for a media type. * ,----------------dashsink------------------, * ; ,----------splitmuxsink--------------, ; - * ,-videotestsrc-, ,-x264enc-, ; ; ,-Queue-, ,-tsdemux-, ,-filesink-, ; ; + * ,-videotestsrc-, ,-x264enc-, ; ; ,-Queue-, ,-mpegtsmux-, ,-filesink-, ; ; * ; o--o o---o--o ; o-o o-o , ; ; * '--------------' '---------' ; ; '-------' '---------' '----------' ; ; * ; '------------------------------------' ; * ; ; * ; ,----------splitmuxsink--------------, ; - * ,-audiotestsrc-, ,-avenc_aac-, ; ; ,-Queue-, ,-tsdemux-, ,-filesink-, ; ; + * ,-audiotestsrc-, ,-avenc_aac-, ; ; ,-Queue-, ,-mpegtsmux-, ,-filesink-, ; ; * ; o--o o-o--o o-o o-o ; ; ; * '--------------' '-----------' ; ; '-------' '---------' '----------' ; ; * ; '------------------------------------' ; @@ -64,12 +64,13 @@ * | |_ Representation 2 - Container/Codec - bitrate Y * * This element is able to generate static or dynamic MPD with multiple adaptation sets, - * multiple representations and multiple periods for three kind of . + * multiple representations and multiple periods for three kind of + * media streams (Video/Audio/Text). 
* * It supports any kind of stream input codec - * which can be encapsulated in Transport Stream or ISO media format. + * which can be encapsulated in Transport Stream (MPEG-TS) or ISO media format (MP4). * The current implementation is generating compliant MPDs for both static and dynamic - * prfiles with https://conformance.dashif.org/ + * profiles with https://conformance.dashif.org/ * * Limitations: * @@ -88,6 +89,7 @@ #include <gst/pbutils/pbutils.h> #include <gst/video/video.h> #include <glib/gstdio.h> +#include <gio/gio.h> #include <memory.h> @@ -173,7 +175,6 @@ PROP_0, PROP_MPD_FILENAME, PROP_MPD_ROOT_PATH, - PROP_MAX_FILES, PROP_TARGET_DURATION, PROP_SEND_KEYFRAME_REQUESTS, PROP_USE_SEGMENT_LIST, @@ -185,6 +186,15 @@ PROP_MPD_PERIOD_DURATION, }; +enum +{ + SIGNAL_GET_PLAYLIST_STREAM, + SIGNAL_GET_FRAGMENT_STREAM, + SIGNAL_LAST +}; + +static guint signals[SIGNAL_LAST]; + typedef enum { DASH_SINK_STREAM_TYPE_VIDEO = 0, @@ -217,22 +227,6 @@ GstDashSinkStreamSubtitleInfo subtitle; } GstDashSinkStreamInfo; -typedef struct _GstDashSinkStream -{ - GstDashSinkStreamType type; - GstPad *pad; - gint buffer_probe; - GstElement *splitmuxsink; - gint adaptation_set_id; - gchar *representation_id; - gchar *current_segment_location; - gchar *mimetype; - gint bitrate; - gchar *codec; - GstClockTime current_running_time_start; - GstDashSinkStreamInfo info; -} GstDashSinkStream; - struct _GstDashSink { GstBin bin; @@ -258,6 +252,26 @@ gint64 period_duration; }; +typedef struct _GstDashSinkStream +{ + GstDashSink *sink; + GstDashSinkStreamType type; + GstPad *pad; + gint buffer_probe; + GstElement *splitmuxsink; + gint adaptation_set_id; + gchar *representation_id; + gchar *current_segment_location; + gint current_segment_id; + gint next_segment_id; + gchar *mimetype; + gint bitrate; + gchar *codec; + GstClockTime current_running_time_start; + GstDashSinkStreamInfo info; + GstElement *giostreamsink; +} GstDashSinkStream; + static GstStaticPadTemplate video_sink_template = 
GST_STATIC_PAD_TEMPLATE ("video_%u", GST_PAD_SINK, @@ -277,7 +291,10 @@ GST_STATIC_CAPS_ANY); #define gst_dash_sink_parent_class parent_class -G_DEFINE_TYPE (GstDashSink, gst_dash_sink, GST_TYPE_BIN); +G_DEFINE_TYPE_WITH_CODE (GstDashSink, gst_dash_sink, GST_TYPE_BIN, + GST_DEBUG_CATEGORY_INIT (gst_dash_sink_debug, "dashsink", 0, "DashSink")); +GST_ELEMENT_REGISTER_DEFINE (dashsink, "dashsink", GST_RANK_NONE, + gst_dash_sink_get_type ()); static void gst_dash_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * spec); @@ -318,10 +335,42 @@ return NULL; } +static gchar * +gst_dash_sink_stream_get_next_name (GList * streams, GstDashSinkStreamType type) +{ + GList *l; + guint count = 0; + GstDashSinkStream *stream = NULL; + gchar *name = NULL; + + for (l = streams; l != NULL; l = l->next) { + stream = l->data; + if (stream->type == type) + count++; + } + + switch (type) { + case DASH_SINK_STREAM_TYPE_VIDEO: + name = g_strdup_printf ("video_%d", count); + break; + case DASH_SINK_STREAM_TYPE_AUDIO: + name = g_strdup_printf ("audio_%d", count); + break; + case DASH_SINK_STREAM_TYPE_SUBTITLE: + name = g_strdup_printf ("sub_%d", count); + break; + default: + name = g_strdup_printf ("unknown_%d", count); + } + + return name; +} + static void -gst_dash_sink_stream_dispose (gpointer s) +gst_dash_sink_stream_free (gpointer s) { GstDashSinkStream *stream = (GstDashSinkStream *) s; + g_object_unref (stream->sink); g_free (stream->current_segment_location); g_free (stream->representation_id); g_free (stream->mimetype); @@ -350,11 +399,56 @@ gst_mpd_client_free (sink->mpd_client); g_mutex_clear (&sink->mpd_lock); - g_list_free_full (sink->streams, gst_dash_sink_stream_dispose); + g_list_free_full (sink->streams, gst_dash_sink_stream_free); G_OBJECT_CLASS (parent_class)->finalize ((GObject *) sink); } +/* Default implementations for the signal handlers */ +static GOutputStream * +gst_dash_sink_get_playlist_stream (GstDashSink * sink, const gchar * 
location) +{ + GFile *file = g_file_new_for_path (location); + GOutputStream *ostream; + GError *err = NULL; + + ostream = + G_OUTPUT_STREAM (g_file_replace (file, NULL, FALSE, + G_FILE_CREATE_REPLACE_DESTINATION, NULL, &err)); + if (!ostream) { + GST_ELEMENT_ERROR (sink, RESOURCE, OPEN_WRITE, + (("Got no output stream for playlist '%s': %s."), location, + err->message), (NULL)); + g_clear_error (&err); + } + + g_object_unref (file); + + return ostream; +} + +static GOutputStream * +gst_dash_sink_get_fragment_stream (GstDashSink * sink, const gchar * location) +{ + GFile *file = g_file_new_for_path (location); + GOutputStream *ostream; + GError *err = NULL; + + ostream = + G_OUTPUT_STREAM (g_file_replace (file, NULL, FALSE, + G_FILE_CREATE_REPLACE_DESTINATION, NULL, &err)); + if (!ostream) { + GST_ELEMENT_ERROR (sink, RESOURCE, OPEN_WRITE, + (("Got no output stream for fragment '%s': %s."), location, + err->message), (NULL)); + g_clear_error (&err); + } + + g_object_unref (file); + + return ostream; +} + static void gst_dash_sink_class_init (GstDashSinkClass * klass) { @@ -445,17 +539,92 @@ G_MAXUINT64, DEFAULT_MPD_PERIOD_DURATION, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + /** + * GstDashSink::get-playlist-stream: + * @sink: the #GstDashSink + * @location: Location for the playlist file + * + * Returns: #GOutputStream for writing the playlist file. + * + * Since: 1.20 + */ + signals[SIGNAL_GET_PLAYLIST_STREAM] = + g_signal_new_class_handler ("get-playlist-stream", + G_TYPE_FROM_CLASS (klass), G_SIGNAL_RUN_LAST, + G_CALLBACK (gst_dash_sink_get_playlist_stream), NULL, NULL, NULL, + G_TYPE_OUTPUT_STREAM, 1, G_TYPE_STRING); + + /** + * GstDashSink::get-fragment-stream: + * @sink: the #GstDashSink + * @location: Location for the fragment file + * + * Returns: #GOutputStream for writing the fragment file. 
+ * + * Since: 1.20 + */ + signals[SIGNAL_GET_FRAGMENT_STREAM] = + g_signal_new_class_handler ("get-fragment-stream", + G_TYPE_FROM_CLASS (klass), G_SIGNAL_RUN_LAST, + G_CALLBACK (gst_dash_sink_get_fragment_stream), NULL, NULL, NULL, + G_TYPE_OUTPUT_STREAM, 1, G_TYPE_STRING); + gst_type_mark_as_plugin_api (GST_TYPE_DASH_SINK_MUXER, 0); } +static gchar * +on_format_location (GstElement * splitmuxsink, guint fragment_id, + GstDashSinkStream * dash_stream) +{ + GOutputStream *stream = NULL; + GstDashSink *sink = dash_stream->sink; + gchar *segment_tpl_path; + + dash_stream->current_segment_id = dash_stream->next_segment_id; + g_free (dash_stream->current_segment_location); + if (sink->use_segment_list) + dash_stream->current_segment_location = + g_strdup_printf ("%s" DEFAULT_SEGMENT_LIST_TPL ".%s", + dash_stream->representation_id, dash_stream->current_segment_id, + dash_muxer_list[sink->muxer].file_ext); + else { + dash_stream->current_segment_location = + g_strdup_printf ("%s" DEFAULT_SEGMENT_TEMPLATE_TPL ".%s", + dash_stream->representation_id, dash_stream->current_segment_id, + dash_muxer_list[sink->muxer].file_ext); + } + dash_stream->next_segment_id++; + + if (sink->mpd_root_path) + segment_tpl_path = + g_build_path (G_DIR_SEPARATOR_S, sink->mpd_root_path, + dash_stream->current_segment_location, NULL); + else + segment_tpl_path = g_strdup (dash_stream->current_segment_location); + + + g_signal_emit (sink, signals[SIGNAL_GET_FRAGMENT_STREAM], 0, segment_tpl_path, + &stream); + + if (!stream) + GST_ELEMENT_ERROR (sink, RESOURCE, OPEN_WRITE, + (("Got no output stream for fragment '%s'."), segment_tpl_path), + (NULL)); + else + g_object_set (dash_stream->giostreamsink, "stream", stream, NULL); + + if (stream) + g_object_unref (stream); + + g_free (segment_tpl_path); + + return NULL; +} + static gboolean gst_dash_sink_add_splitmuxsink (GstDashSink * sink, GstDashSinkStream * stream) { - GstElement *mux = NULL; - gchar *segment_tpl; - gchar *segment_tpl_path; - guint 
start_index = 0; - mux = + GstElement *mux = gst_element_factory_make (dash_muxer_list[sink->muxer].element_name, NULL); @@ -466,35 +635,33 @@ g_return_val_if_fail (mux != NULL, FALSE); stream->splitmuxsink = gst_element_factory_make ("splitmuxsink", NULL); - if (stream->splitmuxsink == NULL) { + if (!stream->splitmuxsink) { + gst_object_unref (mux); + return FALSE; + } + stream->giostreamsink = gst_element_factory_make ("giostreamsink", NULL); + if (!stream->giostreamsink) { + gst_object_unref (stream->splitmuxsink); gst_object_unref (mux); return FALSE; } gst_bin_add (GST_BIN (sink), stream->splitmuxsink); - if (sink->use_segment_list) - segment_tpl = - g_strconcat (stream->representation_id, DEFAULT_SEGMENT_LIST_TPL, - ".", dash_muxer_list[sink->muxer].file_ext, NULL); - else { - segment_tpl = - g_strconcat (stream->representation_id, DEFAULT_SEGMENT_TEMPLATE_TPL, - ".", dash_muxer_list[sink->muxer].file_ext, NULL); - start_index = 1; - } - if (sink->mpd_root_path) - segment_tpl_path = - g_build_path ("/", sink->mpd_root_path, segment_tpl, NULL); + + if (!sink->use_segment_list) + stream->current_segment_id = 1; else - segment_tpl_path = g_strdup (segment_tpl); + stream->current_segment_id = 0; + stream->next_segment_id = stream->current_segment_id; - g_object_set (stream->splitmuxsink, "location", segment_tpl_path, + g_object_set (stream->splitmuxsink, "location", NULL, "max-size-time", ((GstClockTime) sink->target_duration * GST_SECOND), - "send-keyframe-requests", TRUE, "muxer", mux, "reset-muxer", FALSE, - "send-keyframe-requests", sink->send_keyframe_requests, - "start-index", start_index, NULL); - g_free (segment_tpl); - g_free (segment_tpl_path); + "send-keyframe-requests", TRUE, "muxer", mux, "sink", + stream->giostreamsink, "reset-muxer", FALSE, "send-keyframe-requests", + sink->send_keyframe_requests, NULL); + + g_signal_connect (stream->splitmuxsink, "format-location", + G_CALLBACK (on_format_location), stream); return TRUE; } @@ -682,28 +849,44 @@ 
gint size; GError *error = NULL; gchar *mpd_filepath = NULL; + GOutputStream *file_stream = NULL; + gsize bytes_to_write; + g_mutex_lock (&sink->mpd_lock); gst_dash_sink_generate_mpd_content (sink, current_stream); if (!gst_mpd_client_get_xml_content (sink->mpd_client, &mpd_content, &size)) return; g_mutex_unlock (&sink->mpd_lock); + if (sink->mpd_root_path) mpd_filepath = - g_build_path ("/", sink->mpd_root_path, sink->mpd_filename, NULL); + g_build_path (G_DIR_SEPARATOR_S, sink->mpd_root_path, + sink->mpd_filename, NULL); else mpd_filepath = g_strdup (sink->mpd_filename); GST_DEBUG_OBJECT (sink, "a new mpd content is available: %s", mpd_content); GST_DEBUG_OBJECT (sink, "write mpd to %s", mpd_filepath); - if (!mpd_content - || !g_file_set_contents (mpd_filepath, mpd_content, -1, &error)) { + g_signal_emit (sink, signals[SIGNAL_GET_PLAYLIST_STREAM], 0, mpd_filepath, + &file_stream); + if (!file_stream) { + GST_ELEMENT_ERROR (sink, RESOURCE, OPEN_WRITE, + (("Got no output stream for fragment '%s'."), mpd_filepath), (NULL)); + } + + bytes_to_write = strlen (mpd_content); + if (!g_output_stream_write_all (file_stream, mpd_content, bytes_to_write, + NULL, NULL, &error)) { + GST_ERROR ("Failed to write mpd content: %s", error->message); GST_ELEMENT_ERROR (sink, RESOURCE, OPEN_WRITE, - (("Failed to write mpd '%s'."), error->message), (NULL)); + (("Failed to write playlist '%s'."), error->message), (NULL)); g_error_free (error); error = NULL; } + g_free (mpd_content); g_free (mpd_filepath); + g_object_unref (file_stream); } static void @@ -723,15 +906,10 @@ if (stream) { if (gst_structure_has_name (s, "splitmuxsink-fragment-opened")) { gst_dash_sink_get_stream_metadata (sink, stream); - g_free (stream->current_segment_location); - stream->current_segment_location = - g_strdup (gst_structure_get_string (s, "location")); gst_structure_get_clock_time (s, "running-time", &stream->current_running_time_start); } else if (gst_structure_has_name (s, 
"splitmuxsink-fragment-closed")) { GstClockTime running_time; - g_assert (strcmp (stream->current_segment_location, - gst_structure_get_string (s, "location")) == 0); gst_structure_get_clock_time (s, "running-time", &running_time); if (sink->running_time < running_time) sink->running_time = running_time; @@ -777,6 +955,7 @@ const gchar *split_pad_name = pad_name; stream = g_new0 (GstDashSinkStream, 1); + stream->sink = g_object_ref (sink); if (g_str_has_prefix (templ->name_template, "video")) { stream->type = DASH_SINK_STREAM_TYPE_VIDEO; stream->adaptation_set_id = ADAPTATION_SET_ID_VIDEO; @@ -789,7 +968,13 @@ stream->adaptation_set_id = ADAPTATION_SET_ID_SUBTITLE; } - stream->representation_id = g_strdup (pad_name); + if (pad_name) + stream->representation_id = g_strdup (pad_name); + else + stream->representation_id = + gst_dash_sink_stream_get_next_name (sink->streams, stream->type); + + stream->mimetype = g_strdup (dash_muxer_list[sink->muxer].mimetype); @@ -797,11 +982,11 @@ GST_ERROR_OBJECT (sink, "Unable to create splitmuxsink element for pad template name %s", templ->name_template); - gst_dash_sink_stream_dispose (stream); + gst_dash_sink_stream_free (stream); goto done; } - peer = gst_element_get_request_pad (stream->splitmuxsink, split_pad_name); + peer = gst_element_request_pad_simple (stream->splitmuxsink, split_pad_name); if (!peer) { GST_ERROR_OBJECT (sink, "Unable to request pad name %s", split_pad_name); return NULL; @@ -981,11 +1166,3 @@ break; } } - -gboolean -gst_dash_sink_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_dash_sink_debug, "dashsink", 0, "DashSink"); - return gst_element_register (plugin, "dashsink", GST_RANK_NONE, - gst_dash_sink_get_type ()); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstdashsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstdashsink.h
Changed
@@ -27,7 +27,7 @@
 #define GST_TYPE_DASH_SINK gst_dash_sink_get_type ()
 G_DECLARE_FINAL_TYPE (GstDashSink, gst_dash_sink, GST, DASH_SINK, GstBin)
 
-gboolean gst_dash_sink_plugin_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (dashsink);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstmpdhelper.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstmpdhelper.c
Changed
@@ -122,6 +122,8 @@
   name = gst_structure_get_name (s);
   if (!g_strcmp0 (name, "video/x-h264")) {
     return "avc1";
+  } else if (!g_strcmp0 (name, "video/x-h265")) {
+    return "hvc1";
   } else {
     GST_DEBUG ("No codecs for this caps name %s", name);
   }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstmpdparser.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstmpdparser.c
Changed
@@ -503,35 +503,14 @@ static void gst_mpdparser_parse_content_protection_node (GList ** list, xmlNode * a_node) { - gchar *value = NULL; - if (gst_xml_helper_get_prop_string (a_node, "value", &value)) { - if (!g_strcmp0 (value, "MSPR 2.0")) { - xmlNode *cur_node; - for (cur_node = a_node->children; cur_node; cur_node = cur_node->next) { - if (cur_node->type == XML_ELEMENT_NODE) { - if (xmlStrcmp (cur_node->name, (xmlChar *) "pro") == 0) { - GstMPDDescriptorTypeNode *new_descriptor; - new_descriptor = gst_mpd_descriptor_type_node_new ((const gchar *) - cur_node->name); - *list = g_list_append (*list, new_descriptor); - - gst_xml_helper_get_prop_string_stripped (a_node, "schemeIdUri", - &new_descriptor->schemeIdUri); - - gst_xml_helper_get_node_content (cur_node, &new_descriptor->value); - goto beach; - } - } - } - } else { - gst_mpdparser_parse_descriptor_type (list, a_node); - } - } else { - gst_mpdparser_parse_descriptor_type (list, a_node); - } -beach: - if (value) - g_free (value); + GstMPDDescriptorTypeNode *new_descriptor; + new_descriptor = gst_mpd_descriptor_type_node_new ((const gchar *) + a_node->name); + *list = g_list_append (*list, new_descriptor); + + gst_xml_helper_get_prop_string_stripped (a_node, "schemeIdUri", + &new_descriptor->schemeIdUri); + gst_xml_helper_get_node_as_string (a_node, &new_descriptor->value); } static void
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstplugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstplugin.c
Changed
@@ -1,25 +1,38 @@ +/* GStreamer + * Copyright (C) <2020> Stéphane Cerveau <scerveau@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif -#include <string.h> - -#include <gst/gst.h> - #include "gstdashdemux.h" #include "gstdashsink.h" static gboolean dash_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "dashdemux", GST_RANK_PRIMARY, - GST_TYPE_DASH_DEMUX)) - return FALSE; + gboolean ret = FALSE; - if (!gst_dash_sink_plugin_init (plugin)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (dashdemux, plugin); + ret |= GST_ELEMENT_REGISTER (dashsink, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dash/gstxmlhelper.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dash/gstxmlhelper.c
Changed
@@ -460,32 +460,6 @@ return exists; } -/* g_ascii_string_to_unsigned is available since 2.54. Get rid of this wrapper - * when we bump the version in 1.18 */ -#if !GLIB_CHECK_VERSION(2,54,0) -#define g_ascii_string_to_unsigned gst_xml_helper_ascii_string_to_unsigned -static gboolean -gst_xml_helper_ascii_string_to_unsigned (const gchar * str, guint base, - guint64 min, guint64 max, guint64 * out_num, GError ** error) -{ - guint64 number; - gchar *endptr = NULL; - - number = g_ascii_strtoull (str, &endptr, base); - - /* Be as strict as the implementation of g_ascii_string_to_unsigned in glib */ - if (errno) - return FALSE; - if (g_ascii_isspace (str[0]) || str[0] == '-' || str[0] == '+') - return FALSE; - if (*endptr != '\0' || endptr == NULL) - return FALSE; - - *out_num = number; - return TRUE; -} -#endif - gboolean gst_xml_helper_get_prop_unsigned_integer_64 (xmlNode * a_node, const gchar * property_name, guint64 default_val, guint64 * property_value) @@ -1001,11 +975,24 @@ gboolean exists = FALSE; const char *txt_encoding; xmlOutputBufferPtr out_buf; + xmlNode *ncopy = NULL; txt_encoding = (const char *) a_node->doc->encoding; out_buf = xmlAllocOutputBuffer (NULL); g_assert (out_buf != NULL); - xmlNodeDumpOutput (out_buf, a_node->doc, a_node, 0, 0, txt_encoding); + + /* Need to make a copy of XML element so that it includes namespaces + in the output, so that the resulting string can be parsed by an XML parser + that is namespace aware. 
+ Use extended=1 for recursive copy (properties, namespaces and children) */ + ncopy = xmlDocCopyNode (a_node, a_node->doc, 1); + + if (!ncopy) { + GST_WARNING ("Failed to clone XML node"); + goto done; + } + xmlNodeDumpOutput (out_buf, ncopy->doc, ncopy, 0, 0, txt_encoding); + (void) xmlOutputBufferFlush (out_buf); #ifdef LIBXML2_NEW_BUFFER if (xmlOutputBufferGetSize (out_buf) > 0) { @@ -1025,6 +1012,8 @@ exists = TRUE; } #endif // LIBXML2_NEW_BUFFER + xmlFreeNode (ncopy); +done: (void) xmlOutputBufferClose (out_buf); if (exists) {
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dc1394/gstdc1394src.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dc1394/gstdc1394src.c
Changed
@@ -87,7 +87,11 @@
 
 #define gst_dc1394_src_parent_class parent_class
-G_DEFINE_TYPE (GstDC1394Src, gst_dc1394_src, GST_TYPE_PUSH_SRC);
+G_DEFINE_TYPE_WITH_CODE (GstDC1394Src, gst_dc1394_src, GST_TYPE_PUSH_SRC,
+    GST_DEBUG_CATEGORY_INIT (dc1394_debug, "dc1394", 0, "DC1394 interface");
+    );
+GST_ELEMENT_REGISTER_DEFINE (dc1394src, "dc1394src", GST_RANK_NONE,
+    GST_TYPE_DC1394_SRC);
 
 static void gst_dc1394_src_set_property (GObject * object, guint prop_id,
     const GValue * value, GParamSpec * pspec);
@@ -1229,13 +1233,9 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (dc1394_debug, "dc1394", 0, "DC1394 interface");
-
-  return gst_element_register (plugin, "dc1394src", GST_RANK_NONE,
-      GST_TYPE_DC1394_SRC);
+  return GST_ELEMENT_REGISTER (dc1394src, plugin);
 }
 
-
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
     GST_VERSION_MINOR,
     dc1394,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dc1394/gstdc1394src.h -> gst-plugins-bad-1.20.1.tar.xz/ext/dc1394/gstdc1394src.h
Changed
@@ -55,6 +55,8 @@
 
 GType gst_dc1394_src_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (dc1394src);
+
 G_END_DECLS
 
 #endif /* __GST_DC1394_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/directfb/dfbvideosink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/directfb/dfbvideosink.c
Changed
@@ -512,8 +512,12 @@ G_IMPLEMENT_INTERFACE (GST_TYPE_NAVIGATION, gst_dfbvideosink_navigation_init); G_IMPLEMENT_INTERFACE (GST_TYPE_COLOR_BALANCE, - gst_dfbvideosink_colorbalance_init)); - + gst_dfbvideosink_colorbalance_init); + GST_DEBUG_CATEGORY_INIT (dfbvideosink_debug, "dfbvideosink", 0, + "DirectFB video sink element"); + ); +GST_ELEMENT_REGISTER_DEFINE (dfbvideosink, "dfbvideosink", GST_RANK_MARGINAL, + GST_TYPE_DFBVIDEOSINK); #ifndef GST_DISABLE_GST_DEBUG static const char * gst_dfbvideosink_get_format_name (DFBSurfacePixelFormat format) @@ -2460,14 +2464,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "dfbvideosink", GST_RANK_MARGINAL, - GST_TYPE_DFBVIDEOSINK)) - return FALSE; - - GST_DEBUG_CATEGORY_INIT (dfbvideosink_debug, "dfbvideosink", 0, - "DirectFB video sink element"); - - return TRUE; + return GST_ELEMENT_REGISTER (dfbvideosink, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/directfb/dfbvideosink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/directfb/dfbvideosink.h
Changed
@@ -156,6 +156,8 @@
 GType gst_dfbvideosink_get_type (void);
 GType gst_dfb_buffer_pool_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (dfbvideosink);
+
 G_END_DECLS
 
 #endif /* __GST_DFBVIDEOSINK_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlsconnection.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlsconnection.c
Changed
@@ -101,6 +101,7 @@ GstDtlsConnectionSendCallback send_callback; gpointer send_callback_user_data; GDestroyNotify send_callback_destroy_notify; + GstFlowReturn syscall_flow_return; gboolean timeout_pending; GThreadPool *thread_pool; @@ -600,6 +601,14 @@ g_mutex_unlock (&priv->mutex); } +void +gst_dtls_connection_set_flow_return (GstDtlsConnection * self, + GstFlowReturn flow_ret) +{ + g_return_if_fail (GST_IS_DTLS_CONNECTION (self)); + self->priv->syscall_flow_return = flow_ret; +} + GstFlowReturn gst_dtls_connection_process (GstDtlsConnection * self, gpointer data, gsize len, gsize * written, GError ** err) @@ -1002,13 +1011,19 @@ case SSL_ERROR_WANT_WRITE: GST_LOG_OBJECT (self, "SSL wants write"); return GST_FLOW_OK; - case SSL_ERROR_SYSCALL: + case SSL_ERROR_SYSCALL:{ + GstFlowReturn rc = GST_FLOW_OK; /* OpenSSL shouldn't be making real system calls, so we can safely * ignore syscall errors. System interactions should happen through * our BIO. */ - GST_DEBUG_OBJECT (self, "OpenSSL reported a syscall error, ignoring."); - return GST_FLOW_OK; + if (error_type == GST_RESOURCE_ERROR_WRITE) { + rc = self->priv->syscall_flow_return; + } + GST_DEBUG_OBJECT (self, + "OpenSSL reported a syscall error. flow_return=%i", rc); + return rc; + } default: if (self->priv->connection_state != GST_DTLS_CONNECTION_STATE_FAILED) { self->priv->connection_state = GST_DTLS_CONNECTION_STATE_FAILED; @@ -1182,6 +1197,7 @@ gboolean ret = TRUE; GST_LOG_OBJECT (self, "BIO: writing %d", size); + self->priv->syscall_flow_return = GST_FLOW_OK; if (self->priv->send_callback) ret = self->priv->send_callback (self, data, size,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlsconnection.h -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlsconnection.h
Changed
@@ -119,6 +119,11 @@
 void gst_dtls_connection_set_send_callback(GstDtlsConnection *, GstDtlsConnectionSendCallback, gpointer, GDestroyNotify);
 
 /*
+ * Sets the GstFlowReturn that be returned from gst_dtls_connection_send() if callback returns FALSE
+ */
+void gst_dtls_connection_set_flow_return(GstDtlsConnection *, GstFlowReturn);
+
+/*
  * Processes data that has been received, the transformation is done in-place.
  *
  * Returns:
@@ -142,6 +147,7 @@
 *    we received an EOS before.
 * - GST_FLOW_ERROR + err if an error happened
 * - GST_FLOW_OK + written >= 0 if processing was successful
+* - Any GstFlowReturn set with gst_dtls_connection_set_flow_return()
 */
GstFlowReturn gst_dtls_connection_send(GstDtlsConnection *, gconstpointer ptr, gsize len, gsize *written, GError **err);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlsdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlsdec.c
Changed
@@ -27,6 +27,7 @@ #include "config.h" #endif +#include "gstdtlselements.h" #include "gstdtlsdec.h" #include "gstdtlscertificate.h" @@ -48,6 +49,8 @@ #define gst_dtls_dec_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstDtlsDec, gst_dtls_dec, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_dtls_dec_debug, "dtlsdec", 0, "DTLS Decoder")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dtlsdec, "dtlsdec", GST_RANK_NONE, + GST_TYPE_DTLS_DEC, dtls_element_init (plugin)); enum { @@ -422,7 +425,7 @@ on_key_received (GstDtlsConnection * connection, gpointer key, guint cipher, guint auth, GstDtlsDec * self) { - gpointer key_dup; + GstBuffer *new_decoder_key; gchar *key_str; g_return_if_fail (GST_IS_DTLS_DEC (self)); @@ -430,15 +433,13 @@ self->srtp_cipher = cipher; self->srtp_auth = auth; - key_dup = g_memdup (key, GST_DTLS_SRTP_MASTER_KEY_LENGTH); + new_decoder_key = + gst_buffer_new_memdup (key, GST_DTLS_SRTP_MASTER_KEY_LENGTH); - if (self->decoder_key) { + if (self->decoder_key) gst_buffer_unref (self->decoder_key); - self->decoder_key = NULL; - } - self->decoder_key = - gst_buffer_new_wrapped (key_dup, GST_DTLS_SRTP_MASTER_KEY_LENGTH); + self->decoder_key = new_decoder_key; key_str = g_base64_encode (key, GST_DTLS_SRTP_MASTER_KEY_LENGTH); GST_INFO_OBJECT (self, "received key: %s", key_str);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlselement.c
Added
@@ -0,0 +1,44 @@
+/*
+ * Copyright (c) 2014, Ericsson AB. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without modification,
+ * are permitted provided that the following conditions are met:
+ *
+ * 1. Redistributions of source code must retain the above copyright notice, this
+ * list of conditions and the following disclaimer.
+ *
+ * 2. Redistributions in binary form must reproduce the above copyright notice, this
+ * list of conditions and the following disclaimer in the documentation and/or other
+ * materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
+ * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
+ * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
+ * IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
+ * INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
+ * NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+ * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
+ * WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
+ * OF SUCH DAMAGE.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstdtlselements.h"
+#include "gstdtlsconnection.h"
+
+
+#include <gst/gst.h>
+
+void
+dtls_element_init (GstPlugin * plugin)
+{
+  static gsize res = FALSE;
+  if (g_once_init_enter (&res)) {
+    gst_type_mark_as_plugin_api (GST_DTLS_TYPE_CONNECTION_STATE, 0);
+    g_once_init_leave (&res, TRUE);
+  }
+}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlselements.h
Added
@@ -0,0 +1,38 @@
+/* GStreamer
+ * Copyright (C) <2020> The Gstreamer Contributors.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+
+#ifndef __GST_DTLS_ELEMENTS_H__
+#define __GST_DTLS_ELEMENTS_H__
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include <gst/gst.h>
+
+void dtls_element_init (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (dtlsdec);
+GST_ELEMENT_REGISTER_DECLARE (dtlsenc);
+GST_ELEMENT_REGISTER_DECLARE (dtlssrtpdec);
+GST_ELEMENT_REGISTER_DECLARE (dtlssrtpdemux);
+GST_ELEMENT_REGISTER_DECLARE (dtlssrtpenc);
+
+#endif /* __GST_DTLS_ELEMENT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlsenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlsenc.c
Changed
@@ -27,6 +27,7 @@ #include "config.h" #endif +#include "gstdtlselements.h" #include "gstdtlsenc.h" #include "gstdtlsdec.h" @@ -48,6 +49,8 @@ #define gst_dtls_enc_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstDtlsEnc, gst_dtls_enc, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_dtls_enc_debug, "dtlsenc", 0, "DTLS Encoder")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dtlsenc, "dtlsenc", GST_RANK_NONE, + GST_TYPE_DTLS_ENC, dtls_element_init (plugin)); enum { @@ -562,6 +565,9 @@ GST_ELEMENT_ERROR (self, RESOURCE, WRITE, (NULL), ("%s", err->message)); g_clear_error (&err); break; + case GST_FLOW_FLUSHING: + GST_INFO_OBJECT (self, "Flushing"); + break; default: g_assert_not_reached (); break; @@ -626,7 +632,7 @@ on_key_received (GstDtlsConnection * connection, gpointer key, guint cipher, guint auth, GstDtlsEnc * self) { - gpointer key_dup; + GstBuffer *new_encoder_key; gchar *key_str; g_return_if_fail (GST_IS_DTLS_ENC (self)); @@ -635,15 +641,13 @@ self->srtp_cipher = cipher; self->srtp_auth = auth; - key_dup = g_memdup (key, GST_DTLS_SRTP_MASTER_KEY_LENGTH); + new_encoder_key = + gst_buffer_new_memdup (key, GST_DTLS_SRTP_MASTER_KEY_LENGTH); - if (self->encoder_key) { + if (self->encoder_key) gst_buffer_unref (self->encoder_key); - self->encoder_key = NULL; - } - self->encoder_key = - gst_buffer_new_wrapped (key_dup, GST_DTLS_SRTP_MASTER_KEY_LENGTH); + self->encoder_key = new_encoder_key; key_str = g_base64_encode (key, GST_DTLS_SRTP_MASTER_KEY_LENGTH); GST_INFO_OBJECT (self, "received key: %s", key_str); @@ -662,8 +666,7 @@ GST_DEBUG_OBJECT (self, "sending data from %s with length %" G_GSIZE_FORMAT, self->connection_id, length); - buffer = - data ? gst_buffer_new_wrapped (g_memdup (data, length), length) : NULL; + buffer = data ? 
gst_buffer_new_memdup (data, length) : NULL; GST_TRACE_OBJECT (self, "send data: acquiring lock"); g_mutex_lock (&self->queue_lock); @@ -677,6 +680,8 @@ GST_TRACE_OBJECT (self, "send data: releasing lock"); ret = self->src_ret == GST_FLOW_OK; + if (self->src_ret == GST_FLOW_FLUSHING) + gst_dtls_connection_set_flow_return (connection, self->src_ret); g_mutex_unlock (&self->queue_lock); return ret;
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlssrtpbin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlssrtpbin.c
Changed
@@ -218,7 +218,8 @@
       g_object_get_property (G_OBJECT (self->dtls_element), "connection-id",
           value);
     } else {
-      g_warning ("tried to get connection-id after disabling DTLS");
+      GST_WARNING_OBJECT (self,
+          "tried to get connection-id after disabling DTLS");
     }
     break;
   case PROP_KEY:
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlssrtpdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlssrtpdec.c
Changed
@@ -27,6 +27,7 @@ #include "config.h" #endif +#include "gstdtlselements.h" #include "gstdtlssrtpdec.h" #include "gstdtlsconnection.h" @@ -62,6 +63,8 @@ G_DEFINE_TYPE_WITH_CODE (GstDtlsSrtpDec, gst_dtls_srtp_dec, GST_TYPE_DTLS_SRTP_BIN, GST_DEBUG_CATEGORY_INIT (gst_dtls_srtp_dec_debug, "dtlssrtpdec", 0, "DTLS Decoder")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dtlssrtpdec, "dtlssrtpdec", + GST_RANK_NONE, GST_TYPE_DTLS_SRTP_DEC, dtls_element_init (plugin)); enum { @@ -288,8 +291,13 @@ } break; case PROP_CONNECTION_STATE: - g_object_get_property (G_OBJECT (self->bin.dtls_element), - "connection-state", value); + if (self->bin.dtls_element) { + g_object_get_property (G_OBJECT (self->bin.dtls_element), + "connection-state", value); + } else { + GST_WARNING_OBJECT (self, + "tried to get connection-state after disabling DTLS"); + } break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (self, prop_id, pspec); @@ -313,7 +321,7 @@ if (templ == gst_element_class_get_pad_template (klass, "data_src")) { GstPad *target_pad; - target_pad = gst_element_get_request_pad (self->bin.dtls_element, "src"); + target_pad = gst_element_request_pad_simple (self->bin.dtls_element, "src"); ghost_pad = gst_ghost_pad_new_from_template (name, target_pad, templ); gst_object_unref (target_pad);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlssrtpdemux.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlssrtpdemux.c
Changed
@@ -27,6 +27,7 @@
 #include "config.h"
 #endif
 
+#include "gstdtlselements.h"
 #include "gstdtlssrtpdemux.h"
 
 #define PACKET_IS_DTLS(b) (b > 0x13 && b < 0x40)
@@ -59,6 +60,9 @@
 G_DEFINE_TYPE_WITH_CODE (GstDtlsSrtpDemux, gst_dtls_srtp_demux, GST_TYPE_ELEMENT,
     GST_DEBUG_CATEGORY_INIT (gst_gst_dtls_srtp_demux_debug, "dtlssrtpdemux", 0,
         "DTLS SRTP Demultiplexer"));
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dtlssrtpdemux, "dtlssrtpdemux",
+    GST_RANK_NONE, GST_TYPE_DTLS_SRTP_DEMUX, dtls_element_init (plugin));
+
 
 static GstFlowReturn sink_chain (GstPad *, GstObject * self, GstBuffer *);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/gstdtlssrtpenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/gstdtlssrtpenc.c
Changed
@@ -27,6 +27,7 @@ #include "config.h" #endif +#include "gstdtlselements.h" #include "gstdtlssrtpenc.h" #include "gstdtlsconnection.h" @@ -64,6 +65,8 @@ G_DEFINE_TYPE_WITH_CODE (GstDtlsSrtpEnc, gst_dtls_srtp_enc, GST_TYPE_DTLS_SRTP_BIN, GST_DEBUG_CATEGORY_INIT (gst_dtls_srtp_enc_debug, "dtlssrtpenc", 0, "DTLS Decoder")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dtlssrtpenc, "dtlssrtpenc", + GST_RANK_NONE, GST_TYPE_DTLS_SRTP_ENC, dtls_element_init (plugin)); enum { @@ -265,12 +268,21 @@ NULL, auth_enum_class, NULL); } +#if GLIB_CHECK_VERSION(2,68,0) +#define binding_get_source(b) g_binding_dup_source(b) +#define unref_source(s) G_STMT_START { if(s) g_object_unref(s); } G_STMT_END +#else +#define binding_get_source(b) g_binding_get_source(b) +#define unref_source(s) /* no op */ +#endif + static gboolean transform_enum (GBinding * binding, const GValue * source_value, GValue * target_value, GEnumClass * enum_class) { GEnumValue *enum_value; const gchar *nick; + GObject *bind_src; nick = g_value_get_string (source_value); g_return_val_if_fail (nick, FALSE); @@ -278,9 +290,13 @@ enum_value = g_enum_get_value_by_nick (enum_class, nick); g_return_val_if_fail (enum_value, FALSE); - GST_DEBUG_OBJECT (g_binding_get_source (binding), + bind_src = binding_get_source (binding); + + GST_DEBUG_OBJECT (bind_src, "transforming enum from %s to %d", nick, enum_value->value); + unref_source (bind_src); + g_value_set_enum (target_value, enum_value->value); return TRUE; @@ -327,8 +343,13 @@ } break; case PROP_CONNECTION_STATE: - g_object_get_property (G_OBJECT (self->bin.dtls_element), - "connection-state", value); + if (self->bin.dtls_element) { + g_object_get_property (G_OBJECT (self->bin.dtls_element), + "connection-state", value); + } else { + GST_WARNING_OBJECT (self, + "tried to get connection-state after disabling DTLS"); + } break; case PROP_RTP_SYNC: g_value_set_boolean (value, self->rtp_sync); @@ -394,7 +415,7 @@ gst_bin_add (GST_BIN (self), clocksync); 
gst_element_sync_state_with_parent (clocksync); - target_pad = gst_element_get_request_pad (self->srtp_enc, name); + target_pad = gst_element_request_pad_simple (self->srtp_enc, name); g_return_val_if_fail (target_pad, NULL); srtp_src_name = g_strdup_printf ("rtp_src_%d", pad_n); @@ -409,7 +430,7 @@ GST_LOG_OBJECT (self, "added rtp sink pad"); } else if (templ == gst_element_class_get_pad_template (klass, "rtcp_sink_%d")) { - target_pad = gst_element_get_request_pad (self->srtp_enc, name); + target_pad = gst_element_request_pad_simple (self->srtp_enc, name); g_return_val_if_fail (target_pad, NULL); sscanf (GST_PAD_NAME (target_pad), "rtcp_sink_%d", &pad_n); @@ -424,7 +445,8 @@ GST_LOG_OBJECT (self, "added rtcp sink pad"); } else if (templ == gst_element_class_get_pad_template (klass, "data_sink")) { g_return_val_if_fail (self->bin.dtls_element, NULL); - target_pad = gst_element_get_request_pad (self->bin.dtls_element, "sink"); + target_pad = + gst_element_request_pad_simple (self->bin.dtls_element, "sink"); ghost_pad = add_ghost_pad (element, name, target_pad, templ);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/meson.build
Changed
@@ -9,6 +9,7 @@
   'gstdtlssrtpdemux.c',
   'gstdtlssrtpenc.c',
   'plugin.c',
+  'gstdtlselement.c',
 ]
 
 openssl_dep = dependency('openssl', version : '>= 1.0.1', required : get_option('dtls'))
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dtls/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dtls/plugin.c
Changed
@@ -27,29 +27,22 @@
 #include "config.h"
 #endif
 
-#include "gstdtlsdec.h"
-#include "gstdtlsenc.h"
-#include "gstdtlssrtpenc.h"
-#include "gstdtlssrtpdec.h"
-#include "gstdtlssrtpdemux.h"
-
 #include <gst/gst.h>
 
+#include "gstdtlselements.h"
+
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gst_type_mark_as_plugin_api (GST_DTLS_TYPE_CONNECTION_STATE, 0);
+  gboolean ret = FALSE;
+
+  ret |= GST_ELEMENT_REGISTER (dtlsenc, plugin);
+  ret |= GST_ELEMENT_REGISTER (dtlsdec, plugin);
+  ret |= GST_ELEMENT_REGISTER (dtlssrtpdec, plugin);
+  ret |= GST_ELEMENT_REGISTER (dtlssrtpenc, plugin);
+  ret |= GST_ELEMENT_REGISTER (dtlssrtpdemux, plugin);
 
-  return gst_element_register (plugin, "dtlsenc", GST_RANK_NONE,
-      GST_TYPE_DTLS_ENC)
-      && gst_element_register (plugin, "dtlsdec", GST_RANK_NONE,
-      GST_TYPE_DTLS_DEC)
-      && gst_element_register (plugin, "dtlssrtpdec", GST_RANK_NONE,
-      GST_TYPE_DTLS_SRTP_DEC)
-      && gst_element_register (plugin, "dtlssrtpenc", GST_RANK_NONE,
-      GST_TYPE_DTLS_SRTP_ENC)
-      && gst_element_register (plugin, "dtlssrtpdemux", GST_RANK_NONE,
-      GST_TYPE_DTLS_SRTP_DEMUX);
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dts/gstdtsdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/dts/gstdtsdec.c
Changed
@@ -126,7 +126,7 @@
         "rate = (int) [ 4000, 96000 ], " "channels = (int) [ 1, 6 ]")
     );
 
-G_DEFINE_TYPE (GstDtsDec, gst_dtsdec, GST_TYPE_AUDIO_DECODER);
+
 
 static gboolean gst_dtsdec_start (GstAudioDecoder * dec);
 static gboolean gst_dtsdec_stop (GstAudioDecoder * dec);
@@ -143,6 +143,10 @@
     const GValue * value, GParamSpec * pspec);
 static void gst_dtsdec_get_property (GObject * object, guint prop_id,
     GValue * value, GParamSpec * pspec);
+static gboolean dtsdec_element_init (GstPlugin * plugin);
+
+G_DEFINE_TYPE (GstDtsDec, gst_dtsdec, GST_TYPE_AUDIO_DECODER);
+GST_ELEMENT_REGISTER_DEFINE_CUSTOM (dtsdec, dtsdec_element_init);
 
 static void
 gst_dtsdec_class_init (GstDtsDecClass * klass)
@@ -784,7 +788,7 @@
 }
 
 static gboolean
-plugin_init (GstPlugin * plugin)
+dtsdec_element_init (GstPlugin * plugin)
 {
   GST_DEBUG_CATEGORY_INIT (dtsdec_debug, "dtsdec", 0, "DTS/DCA audio decoder");
 
@@ -792,11 +796,14 @@
   orc_init ();
 #endif
 
-  if (!gst_element_register (plugin, "dtsdec", GST_RANK_PRIMARY,
-          GST_TYPE_DTSDEC))
-    return FALSE;
+  return gst_element_register (plugin, "dtsdec", GST_RANK_PRIMARY,
+      GST_TYPE_DTSDEC);
+}
 
-  return TRUE;
+static gboolean
+plugin_init (GstPlugin * plugin)
+{
+  return GST_ELEMENT_REGISTER (dtsdec, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dts/gstdtsdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/dts/gstdtsdec.h
Changed
@@ -78,6 +78,8 @@
 
 GType gst_dtsdec_get_type(void);
 
+GST_ELEMENT_REGISTER_DECLARE (dtsdec);
+
 G_END_DECLS
 
 #endif /* __GST_DTSDEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/dts/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/dts/meson.build
Changed
@@ -1,5 +1,10 @@
+dts_opt = get_option('dts').require(gpl_allowed, error_message: '''
+  Plugin dts explicitly required via options but GPL-licensed plugins disabled via options.
+  Pass option -Dgpl=enabled to Meson to allow GPL-licensed plugins to be built.
+  ''')
+
 # Don't do any dependency checks if disabled
-if get_option('dts').disabled()
+if dts_opt.disabled()
   subdir_done()
 endif
 
@@ -9,7 +14,7 @@
   if cc.has_header_symbol('dca.h', 'dca_init')
     dca_dep = cc.find_library('dca', required : false)
   endif
-  if not dca_dep.found() and get_option('dts').enabled()
+  if not dca_dep.found() and dts_opt.enabled()
     error('DTS plugin enabled, but libdca not found')
   endif
 endif
View file
gst-plugins-bad-1.18.6.tar.xz/ext/faac/gstfaac.c -> gst-plugins-bad-1.20.1.tar.xz/ext/faac/gstfaac.c
Changed
@@ -135,6 +135,7 @@
 
 #define gst_faac_parent_class parent_class
 G_DEFINE_TYPE (GstFaac, gst_faac, GST_TYPE_AUDIO_ENCODER);
+GST_ELEMENT_REGISTER_DEFINE (faac, "faac", GST_RANK_SECONDARY, GST_TYPE_FAAC);
 
 #define GST_TYPE_FAAC_RATE_CONTROL (gst_faac_brtype_get_type ())
 static GType
@@ -781,8 +782,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  return gst_element_register (plugin, "faac", GST_RANK_SECONDARY,
-      GST_TYPE_FAAC);
+  return GST_ELEMENT_REGISTER (faac, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/faac/gstfaac.h -> gst-plugins-bad-1.20.1.tar.xz/ext/faac/gstfaac.h
Changed
@@ -73,6 +73,8 @@
 
 GType gst_faac_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (faac);
+
 G_END_DECLS
 
 #endif /* __GST_FAAC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/faad/gstfaad.c -> gst-plugins-bad-1.20.1.tar.xz/ext/faad/gstfaad.c
Changed
@@ -21,7 +21,7 @@
 /**
  * SECTION:element-faad
  * @title: faad
- * @seealso: faac
+ * @see_also: faac
  *
  * faad decodes AAC (MPEG-4 part 3) stream.
  *
@@ -104,6 +104,7 @@
 
 #define gst_faad_parent_class parent_class
 G_DEFINE_TYPE (GstFaad, gst_faad, GST_TYPE_AUDIO_DECODER);
+GST_ELEMENT_REGISTER_DEFINE (faad, "faad", GST_RANK_SECONDARY, GST_TYPE_FAAD);
 
 static void
 gst_faad_class_init (GstFaadClass * klass)
@@ -450,7 +451,7 @@
   faad->samplerate = info->samplerate;
   faad->channels = info->channels;
   g_free (faad->channel_positions);
-  faad->channel_positions = g_memdup (info->channel_position, faad->channels);
+  faad->channel_positions = g_memdup2 (info->channel_position, faad->channels);
 
   faad->bps = 16 / 8;
 
@@ -840,8 +841,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  return gst_element_register (plugin, "faad", GST_RANK_SECONDARY,
-      GST_TYPE_FAAD);
+  return GST_ELEMENT_REGISTER (faad, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/faad/gstfaad.h -> gst-plugins-bad-1.20.1.tar.xz/ext/faad/gstfaad.h
Changed
@@ -66,6 +66,8 @@
 
 GType gst_faad_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (faad);
+
 G_END_DECLS
 
 #endif /* __GST_FAAD_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/faad/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/faad/meson.build
Changed
@@ -1,12 +1,22 @@
+faad_opt = get_option('faad').require(gpl_allowed, error_message: '''
+  Plugin faad explicitly required via options but GPL-licensed plugins disabled via options.
+  Pass option -Dgpl=enabled to Meson to allow GPL-licensed plugins to be built.
+  ''')
+
+if faad_opt.disabled()
+  faad_dep = dependency('', required: false)
+  subdir_done()
+endif
+
 faad_args = [ ]
 
 have_faad = cc.has_header_symbol('neaacdec.h', 'NeAACDecOpen')
 have_faad_2_7 = have_faad and cc.has_header_symbol('neaacdec.h', 'LATM')
-if have_faad and not have_faad_2_7 and get_option('faad').enabled()
+if have_faad and not have_faad_2_7 and faad_opt.enabled()
   message('Found faad2, but too old (< v2.7.0)')
 endif
 
-faad_dep = cc.find_library('faad', required : get_option('faad'))
+faad_dep = cc.find_library('faad', required : faad_opt)
 
 if faad_dep.found() and have_faad_2_7
   gstfaad = library('gstfaad',
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fdkaac/gstfdkaac.c -> gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/gstfdkaac.c
Changed
@@ -231,18 +231,3 @@
   {0, MODE_INVALID, {GST_AUDIO_CHANNEL_POSITION_INVALID}},
 };
 /* *INDENT-ON* */
-
-static gboolean
-plugin_init (GstPlugin * plugin)
-{
-  return gst_element_register (plugin, "fdkaacenc", GST_RANK_PRIMARY,
-      GST_TYPE_FDKAACENC)
-      && gst_element_register (plugin, "fdkaacdec", GST_RANK_MARGINAL,
-      GST_TYPE_FDKAACDEC);
-}
-
-GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
-    GST_VERSION_MINOR,
-    fdkaac,
-    "Fraunhofer FDK AAC Codec plugin",
-    plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fdkaac/gstfdkaacdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/gstfdkaacdec.c
Changed
@@ -68,6 +68,8 @@
 static void gst_fdkaacdec_flush (GstAudioDecoder * dec, gboolean hard);
 
 G_DEFINE_TYPE (GstFdkAacDec, gst_fdkaacdec, GST_TYPE_AUDIO_DECODER);
+GST_ELEMENT_REGISTER_DEFINE (fdkaacdec, "fdkaacdec", GST_RANK_MARGINAL,
+    GST_TYPE_FDKAACDEC);
 
 static gboolean
 gst_fdkaacdec_start (GstAudioDecoder * dec)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fdkaac/gstfdkaacdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/gstfdkaacdec.h
Changed
@@ -55,6 +55,8 @@ GType gst_fdkaacdec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (fdkaacdec); + G_END_DECLS #endif /* __GST_FDKAACDEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fdkaac/gstfdkaacenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/gstfdkaacenc.c
Changed
@@ -95,6 +95,8 @@ static void gst_fdkaacenc_flush (GstAudioEncoder * enc); G_DEFINE_TYPE (GstFdkAacEnc, gst_fdkaacenc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (fdkaacenc, "fdkaacenc", GST_RANK_PRIMARY, + GST_TYPE_FDKAACENC); static void gst_fdkaacenc_set_property (GObject * object, guint prop_id, @@ -396,8 +398,7 @@ /* raw */ if (transmux == 0) { GstBuffer *codec_data = - gst_buffer_new_wrapped (g_memdup (enc_info.confBuf, enc_info.confSize), - enc_info.confSize); + gst_buffer_new_memdup (enc_info.confBuf, enc_info.confSize); gst_caps_set_simple (src_caps, "codec_data", GST_TYPE_BUFFER, codec_data, "stream-format", G_TYPE_STRING, "raw", NULL); gst_buffer_unref (codec_data); @@ -579,7 +580,7 @@ gst_element_class_add_static_pad_template (element_class, &src_template); gst_element_class_set_static_metadata (element_class, "FDK AAC audio encoder", - "Codec/Encoder/Audio", "FDK AAC audio encoder", + "Codec/Encoder/Audio/Converter", "FDK AAC audio encoder", "Sebastian Dröge <sebastian@centricular.com>"); GST_DEBUG_CATEGORY_INIT (gst_fdkaacenc_debug, "fdkaacenc", 0,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fdkaac/gstfdkaacenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/gstfdkaacenc.h
Changed
@@ -59,6 +59,8 @@ GType gst_fdkaacenc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (fdkaacenc); + G_END_DECLS #endif /* __GST_FDKAACENC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/gstfdkaacplugin.c
Added
@@ -0,0 +1,42 @@ +/* + * Copyright (C) 2016 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstfdkaacdec.h" +#include "gstfdkaacenc.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (fdkaacenc, plugin); + ret |= GST_ELEMENT_REGISTER (fdkaacdec, plugin); + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + fdkaac, + "Fraunhofer FDK AAC Codec plugin", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fdkaac/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/fdkaac/meson.build
Changed
@@ -1,4 +1,4 @@ -fdkaac_dep = dependency('fdk-aac', required : get_option('fdkaac')) +fdkaac_dep = dependency('fdk-aac', allow_fallback: true, required : get_option('fdkaac')) if fdkaac_dep.found() fdkaac_defines = [] @@ -14,7 +14,7 @@ endif gstfdkaac = library('gstfdkaac', - ['gstfdkaac.c', 'gstfdkaacenc.c', 'gstfdkaacdec.c'], + ['gstfdkaac.c', 'gstfdkaacenc.c', 'gstfdkaacdec.c','gstfdkaacplugin.c'], c_args : gst_plugins_bad_args + fdkaac_defines, include_directories : [configinc], dependencies : [gstaudio_dep, gstpbutils_dep, fdkaac_dep],
View file
gst-plugins-bad-1.18.6.tar.xz/ext/flite/gstflite.c -> gst-plugins-bad-1.20.1.tar.xz/ext/flite/gstflite.c
Changed
@@ -25,17 +25,12 @@ #include <flite/flite.h> GType gst_flite_test_src_get_type (void); - +GST_ELEMENT_REGISTER_DECLARE (flitetestsrc); static gboolean plugin_init (GstPlugin * plugin) { - flite_init (); - - gst_element_register (plugin, "flitetestsrc", GST_RANK_NONE, - gst_flite_test_src_get_type ()); - - return TRUE; + return GST_ELEMENT_REGISTER (flitetestsrc, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/flite/gstflitetestsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/flite/gstflitetestsrc.c
Changed
@@ -90,8 +90,11 @@ "rate = (int) 48000, " "channels = (int) [1, 8]") ); +GST_ELEMENT_REGISTER_DECLARE (flitetestsrc); #define gst_flite_test_src_parent_class parent_class G_DEFINE_TYPE (GstFliteTestSrc, gst_flite_test_src, GST_TYPE_BASE_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (flitetestsrc, "flitetestsrc", + GST_RANK_NONE, gst_flite_test_src_get_type (), flite_init ()); static void gst_flite_test_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fluidsynth/gstfluiddec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/fluidsynth/gstfluiddec.c
Changed
@@ -99,6 +99,7 @@ const GValue * value, GParamSpec * pspec); static void gst_fluid_dec_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); +static gboolean fluiddec_element_init (GstPlugin * plugin); static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, @@ -116,6 +117,7 @@ #define parent_class gst_fluid_dec_parent_class G_DEFINE_TYPE (GstFluidDec, gst_fluid_dec, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (fluiddec, fluiddec_element_init); /* initialize the plugin's class */ static void @@ -707,7 +709,7 @@ } static gboolean -plugin_init (GstPlugin * plugin) +fluiddec_element_init (GstPlugin * plugin) { GST_DEBUG_CATEGORY_INIT (gst_fluid_dec_debug, "fluiddec", 0, "Fluidsynth MIDI decoder plugin"); @@ -744,11 +746,16 @@ } } #endif - return gst_element_register (plugin, "fluiddec", GST_RANK_SECONDARY, GST_TYPE_FLUID_DEC); } +static gboolean +plugin_init (GstPlugin * plugin) +{ + return GST_ELEMENT_REGISTER (fluiddec, plugin); +} + GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, fluidsynthmidi,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/fluidsynth/gstfluiddec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/fluidsynth/gstfluiddec.h
Changed
@@ -73,6 +73,8 @@ GType gst_fluid_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (fluiddec); + G_END_DECLS #endif /* __GST_FLUID_DEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gme/gstgme.c -> gst-plugins-bad-1.20.1.tar.xz/ext/gme/gstgme.c
Changed
@@ -51,6 +51,8 @@ #define gst_gme_dec_parent_class parent_class G_DEFINE_TYPE (GstGmeDec, gst_gme_dec, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (gmedec, "gmedec", GST_RANK_PRIMARY, + GST_TYPE_GME_DEC); static GstFlowReturn gst_gme_dec_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer); @@ -506,8 +508,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "gmedec", GST_RANK_PRIMARY, - GST_TYPE_GME_DEC); + return GST_ELEMENT_REGISTER (gmedec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gme/gstgme.h -> gst-plugins-bad-1.20.1.tar.xz/ext/gme/gstgme.h
Changed
@@ -65,6 +65,8 @@ GType gst_gme_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (gmedec); + G_END_DECLS #endif /* __GST_GME_DEC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/.clang-format
Added
@@ -0,0 +1,39 @@ +# Defines the Chromium style for automatic reformatting. +# http://clang.llvm.org/docs/ClangFormatStyleOptions.html +BasedOnStyle: Chromium +# This defaults to 'Auto'. Explicitly set it for a while, so that +# 'vector<vector<int> >' in existing files gets formatted to +# 'vector<vector<int>>'. ('Auto' means that clang-format will only use +# 'int>>' if the file already contains at least one such instance.) +Standard: Cpp11 + +# Make sure code like: +# IPC_BEGIN_MESSAGE_MAP() +# IPC_MESSAGE_HANDLER(WidgetHostViewHost_Update, OnUpdate) +# IPC_END_MESSAGE_MAP() +# gets correctly indented. +MacroBlockBegin: "^\ +BEGIN_MSG_MAP|\ +BEGIN_MSG_MAP_EX|\ +BEGIN_SAFE_MSG_MAP_EX|\ +CR_BEGIN_MSG_MAP_EX|\ +IPC_BEGIN_MESSAGE_MAP|\ +IPC_BEGIN_MESSAGE_MAP_WITH_PARAM|\ +IPC_PROTOBUF_MESSAGE_TRAITS_BEGIN|\ +IPC_STRUCT_BEGIN|\ +IPC_STRUCT_BEGIN_WITH_PARENT|\ +IPC_STRUCT_TRAITS_BEGIN|\ +POLPARAMS_BEGIN|\ +PPAPI_BEGIN_MESSAGE_MAP$" +MacroBlockEnd: "^\ +CR_END_MSG_MAP|\ +END_MSG_MAP|\ +IPC_END_MESSAGE_MAP|\ +IPC_PROTOBUF_MESSAGE_TRAITS_END|\ +IPC_STRUCT_END|\ +IPC_STRUCT_TRAITS_END|\ +POLPARAMS_END|\ +PPAPI_END_MESSAGE_MAP$" + +# TODO: Remove this once clang-format r357700 is rolled in. +JavaImportGroups: ['android', 'androidx', 'com', 'dalvik', 'junit', 'org', 'com.google.android.apps.chrome', 'org.chromium', 'java', 'javax']
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/README.md
Added
@@ -0,0 +1,94 @@
+# Install the Google Cloud Storage dependencies.
+```
+sudo apt-get install \
+  cmake \
+  libcurl3-gnutls-dev \
+  libgrpc++-dev \
+  libprotobuf-dev \
+  protobuf-compiler-grpc \
+  flex bison pkg-config \
+  curl
+```
+
+# Build the Google Cloud Storage library
+
+```
+export cmake_prefix=/usr/local
+
+mkdir crc32c
+cd crc32c
+curl -sSL https://github.com/google/crc32c/archive/1.1.2.tar.gz | \
+  tar -xzf - --strip-components=1
+cmake -S . -B build \
+  -GNinja \
+  -DCMAKE_INSTALL_PREFIX:PATH=$cmake_prefix \
+  -DCMAKE_INSTALL_LIBDIR:PATH=lib \
+  -DBUILD_SHARED_LIBS=YES \
+  -DCRC32C_USE_GLOG=NO \
+  -DCRC32C_BUILD_TESTS=NO \
+  -DCRC32C_BUILD_BENCHMARKS=NO
+cmake --build build --target install
+cd ..
+
+mkdir abseil-cpp
+cd abseil-cpp
+curl -sSL https://github.com/abseil/abseil-cpp/archive/20210324.2.tar.gz | \
+  tar -xzf - --strip-components=1 && \
+  sed -i 's/^#define ABSL_OPTION_USE_\(.*\) 2/#define ABSL_OPTION_USE_\1 0/' "absl/base/options.h"
+cmake -S . -B build \
+  -GNinja \
+  -DBUILD_TESTING=NO \
+  -DCMAKE_INSTALL_PREFIX:PATH=$cmake_prefix \
+  -DCMAKE_INSTALL_LIBDIR:PATH=lib \
+  -DBUILD_SHARED_LIBS=YES
+cmake --build build --target install
+cd ..
+
+# nlohmann/json
+mkdir json
+cd json
+curl -sSL https://github.com/nlohmann/json/archive/v3.10.4.tar.gz | \
+  tar -xzf - --strip-components=1
+cmake \
+  -DCMAKE_BUILD_TYPE=Release \
+  -DBUILD_SHARED_LIBS=yes \
+  -DBUILD_TESTING=OFF \
+  -H. -Bcmake-out/nlohmann/json && \
+  cmake --build cmake-out/nlohmann/json --target install -- -j ${NCPU} && \
+  ldconfig
+cd ..
+
+mkdir google-cloud-cpp
+cd google-cloud-cpp
+curl -sSL https://github.com/googleapis/google-cloud-cpp/archive/v1.31.1.tar.gz | \
+  tar --strip-components=1 -zxf -
+cmake -S . -B build \
+  -GNinja \
+  -DCMAKE_BUILD_TYPE=Debug \
+  -DCMAKE_CXX_STANDARD=14 \
+  -DCMAKE_INSTALL_PREFIX:PATH=$cmake_prefix \
+  -DCMAKE_INSTALL_LIBDIR:PATH=lib \
+  -DBUILD_SHARED_LIBS=YES \
+  -DBUILD_TESTING=NO \
+  -DGOOGLE_CLOUD_CPP_ENABLE=storage
+cmake --build build --target install -- -v
+cd ..
+```
+
+# Running the gs elements locally
+
+When running from the command line or in a container running locally, simply set the credentials by exporting
+GOOGLE_APPLICATION_CREDENTIALS. If you are not familiar with this environment variable, check the documentation
+https://cloud.google.com/docs/authentication/getting-started
+Note that you can restrict a service account to the role Storage Admin or Storage Object Creator instead of the Project
+Owner role from the above documentation.
+
+# Running the gs elements in Google Cloud Run
+
+Add the Storage Object Viewer role to the service account assigned to the Cloud Run service where gssrc runs. For gssink
+add the role Storage Object Creator. Then just set the service-account-email property on the element.
+
+# Running the gs elements in Google Cloud Kubernetes
+
+You need to set GOOGLE_APPLICATION_CREDENTIALS in the container and ship the JSON file to which the environment variable
+points.
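Tying the README together, a hedged shell sketch of running gssink locally; the key path, bucket, and object names below are placeholders, not defaults:

```shell
# Hypothetical service-account key; create one in the GCP console and
# grant it Storage Object Creator on the target bucket.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/gs-writer.json"

BUCKET="mybucket"
PREFIX="frames"

# With the variable exported, gssink picks up the credentials itself:
#   gst-launch-1.0 videotestsrc num-buffers=5 ! pngenc ! \
#     gssink bucket-name="$BUCKET" object-name="$PREFIX/frame%05d.png" \
#     next-file=buffer

echo "uploads would land under gs://$BUCKET/$PREFIX/"
```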
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgs.cpp
Added
@@ -0,0 +1,54 @@
+/* GStreamer
+ * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com>
+ *
+ * gstgssrc.c:
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+/**
+ * plugin-gs:
+ *
+ * The gs plugin contains elements to interact with Google Cloud Storage,
+ * in particular via the gs:// protocol or by specifying the storage bucket.
+ *
+ * Since: 1.20
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstgssink.h"
+#include "gstgssrc.h"
+
+static gboolean plugin_init(GstPlugin* plugin) {
+  gboolean ret = FALSE;
+
+  ret |= GST_ELEMENT_REGISTER(gssrc, plugin);
+  ret |= GST_ELEMENT_REGISTER(gssink, plugin);
+
+  return ret;
+}
+
+GST_PLUGIN_DEFINE(GST_VERSION_MAJOR,
+                  GST_VERSION_MINOR,
+                  gs,
+                  "Read and write from and to Google Cloud Storage",
+                  plugin_init,
+                  PACKAGE_VERSION,
+                  GST_LICENSE,
+                  GST_PACKAGE_NAME,
+                  GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgscommon.cpp
Added
@@ -0,0 +1,141 @@ +/* GStreamer + * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com> + * + * gstgscommon.h: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstgscommon.h" + +#include "google/cloud/storage/oauth2/compute_engine_credentials.h" + +namespace gcs = google::cloud::storage; + +namespace { + +#if !GLIB_CHECK_VERSION(2, 62, 0) +static inline gchar* g_date_time_format_iso8601(GDateTime* datetime) { + GString* outstr = NULL; + gchar* main_date = NULL; + gint64 offset; + + // Main date and time. + main_date = g_date_time_format(datetime, "%Y-%m-%dT%H:%M:%S"); + outstr = g_string_new(main_date); + g_free(main_date); + + // Timezone. Format it as `%:::z` unless the offset is zero, in which case + // we can simply use `Z`. 
+  offset = g_date_time_get_utc_offset(datetime);
+
+  if (offset == 0) {
+    g_string_append_c(outstr, 'Z');
+  } else {
+    gchar* time_zone = g_date_time_format(datetime, "%:::z");
+    g_string_append(outstr, time_zone);
+    g_free(time_zone);
+  }
+
+  return g_string_free(outstr, FALSE);
+}
+#endif
+
+}  // namespace
+
+std::unique_ptr<google::cloud::storage::Client> gst_gs_create_client(
+    const gchar* service_account_email,
+    const gchar* service_account_credentials,
+    GError** error) {
+  if (service_account_email || service_account_credentials) {
+    google::cloud::StatusOr<std::shared_ptr<gcs::oauth2::Credentials>> creds;
+    if (service_account_credentials) {
+      creds = gcs::oauth2::CreateServiceAccountCredentialsFromJsonContents(
+          service_account_credentials,
+          {{"https://www.googleapis.com/auth/devstorage.full_control"}},
+          absl::nullopt);
+    } else {
+      // Meant to be used from a container running in the Cloud.
+      creds =
+          gcs::oauth2::CreateComputeEngineCredentials(service_account_email);
+    }
+
+    if (!creds) {
+      if (service_account_email) {
+        g_set_error(error, GST_RESOURCE_ERROR,
+                    GST_RESOURCE_ERROR_NOT_AUTHORIZED,
+                    "Could not retrieve credentials for the given service "
+                    "account %s (%s)",
+                    service_account_email, creds.status().message().c_str());
+      } else {
+        g_set_error(error, GST_RESOURCE_ERROR,
+                    GST_RESOURCE_ERROR_NOT_AUTHORIZED,
+                    "Could not retrieve credentials for the given service "
+                    "account credentials JSON (%s)",
+                    creds.status().message().c_str());
+      }
+      return nullptr;
+    }
+
+    gcs::ClientOptions client_options(std::move(creds.value()));
+    return std::make_unique<gcs::Client>(client_options,
+                                         gcs::StrictIdempotencyPolicy());
+  }
+  // Default account. This is meant to retrieve the credentials automatically
+  // using different methods.
+ google::cloud::StatusOr<gcs::ClientOptions> client_options = + gcs::ClientOptions::CreateDefaultClientOptions(); + + if (!client_options) { + g_set_error(error, GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_NOT_AUTHORIZED, + "Could not create default client options (%s)", + client_options.status().message().c_str()); + return nullptr; + } + return std::make_unique<gcs::Client>(client_options.value(), + gcs::StrictIdempotencyPolicy()); +} + +gboolean gst_gs_get_buffer_date(GstBuffer* buffer, + GDateTime* start_date, + gchar** buffer_date_str_ptr) { + gchar* buffer_date_str = NULL; + GstClockTime buffer_timestamp = GST_CLOCK_TIME_NONE; + GTimeSpan buffer_timespan = 0; + + if (!buffer || !start_date) + return FALSE; + + buffer_timestamp = GST_BUFFER_PTS(buffer); + + // GTimeSpan is in micro seconds. + buffer_timespan = GST_TIME_AS_USECONDS(buffer_timestamp); + + GDateTime* buffer_date = g_date_time_add(start_date, buffer_timespan); + if (!buffer_date) + return FALSE; + + buffer_date_str = g_date_time_format_iso8601(buffer_date); + g_date_time_unref(buffer_date); + + if (!buffer_date_str) + return FALSE; + + if (buffer_date_str_ptr) + *buffer_date_str_ptr = buffer_date_str; + + return TRUE; +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgscommon.h
Added
@@ -0,0 +1,40 @@ +/* GStreamer + * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com> + * + * gstgscommon.h: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_GS_COMMON_H__ +#define __GST_GS_COMMON_H__ + +#include <memory> + +#include <gst/gst.h> + +#include <google/cloud/storage/client.h> + +std::unique_ptr<google::cloud::storage::Client> gst_gs_create_client( + const gchar* service_account_email, + const gchar* service_account_credentials, + GError** error); + +gboolean gst_gs_get_buffer_date(GstBuffer* buffer, + GDateTime* start_date, + gchar** buffer_date_str_ptr); + +#endif // __GST_GS_COMMON_H__
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgssink.cpp
Added
@@ -0,0 +1,885 @@ +/* GStreamer + * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com> + * + * gstgssink.cpp: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +/** + * SECTION:element-gssink + * @title: gssink + * @see_also: #GstGsSrc + * + * Write incoming data to a series of sequentially-named remote files on a + * Google Cloud Storage bucket. + * + * The object-name property should contain a string with a \%d placeholder + * that will be substituted with the index for each filename. + * + * If the #GstGsSink:post-messages property is %TRUE, it sends an application + * message named `GstGsSink` after writing each buffer. + * + * The message's structure contains these fields: + * + * * #gchararray `filename`: the filename where the buffer was written. + * * #gchararray `date`: the date of the current buffer, NULL if no start date + * is provided. + * * #gint `index`: index of the buffer. + * * #GstClockTime `timestamp`: the timestamp of the buffer. + * * #GstClockTime `stream-time`: the stream time of the buffer. + * * #GstClockTime `running-time`: the running_time of the buffer. + * * #GstClockTime `duration`: the duration of the buffer. + * * #guint64 `offset`: the offset of the buffer that triggered the message. 
+ * * #guint64 `offset-end`: the offset-end of the buffer that triggered the
+ *   message.
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 videotestsrc num-buffers=15 ! pngenc ! gssink
+ * object-name="mypath/myframes/frame%05d.png" bucket-name="mybucket"
+ * next-file=buffer post-messages=true
+ * ```
+ * ### Upload 15 png images into gs://mybucket/mypath/myframes/ where the file
+ * names are frame00000.png, frame00001.png, ..., frame00014.png
+ * ```
+ * gst-launch-1.0 videotestsrc num-buffers=6 ! video/x-raw, framerate=2/1 !
+ * pngenc ! gssink start-date="2020-04-16T08:55:03Z"
+ * object-name="mypath/myframes/im_%s_%03d.png" bucket-name="mybucket"
+ * next-file=buffer post-messages=true
+ * ```
+ * ### Upload 6 png images into gs://mybucket/mypath/myframes/ where the file
+ * names are im_2020-04-16T08:55:03Z_000.png, im_2020-04-16T08:55:03Z_001.png,
+ * im_2020-04-16T08:55:04Z_002.png, im_2020-04-16T08:55:04Z_003.png,
+ * im_2020-04-16T08:55:05Z_004.png, im_2020-04-16T08:55:05Z_005.png.
+ * ```
+ * gst-launch-1.0 filesrc location=some_video.mp4 ! gssink
+ * object-name="mypath/myvideos/video.mp4" bucket-name="mybucket" next-file=none
+ * ```
+ * ### Upload any stream as a single file into Google Cloud Storage. Similar to
+ * filesink in this case. The file is then accessible from:
+ * gs://mybucket/mypath/myvideos/video.mp4
+ *
+ * See also: #GstGsSrc
+ * Since: 1.20
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstgscommon.h"
+#include "gstgssink.h"
+
+#include <algorithm>
+
+static GstStaticPadTemplate sinktemplate =
+    GST_STATIC_PAD_TEMPLATE("sink",
+                            GST_PAD_SINK,
+                            GST_PAD_ALWAYS,
+                            GST_STATIC_CAPS_ANY);
+
+GST_DEBUG_CATEGORY_STATIC(gst_gs_sink_debug);
+#define GST_CAT_DEFAULT gst_gs_sink_debug
+
+#define DEFAULT_INDEX 0
+#define DEFAULT_NEXT_FILE GST_GS_SINK_NEXT_BUFFER
+#define DEFAULT_OBJECT_NAME "%s_%05d"
+#define DEFAULT_POST_MESSAGES FALSE
+
+namespace gcs = google::cloud::storage;
+
+enum {
+  PROP_0,
+  PROP_BUCKET_NAME,
+  PROP_OBJECT_NAME,
+  PROP_INDEX,
+  PROP_POST_MESSAGES,
+  PROP_NEXT_FILE,
+  PROP_SERVICE_ACCOUNT_EMAIL,
+  PROP_START_DATE,
+  PROP_SERVICE_ACCOUNT_CREDENTIALS,
+  PROP_METADATA,
+};
+
+class GSWriteStream;
+
+struct _GstGsSink {
+  GstBaseSink parent;
+
+  std::unique_ptr<google::cloud::storage::Client> gcs_client;
+  std::unique_ptr<GSWriteStream> gcs_stream;
+  gchar* service_account_email;
+  gchar* service_account_credentials;
+  gchar* bucket_name;
+  gchar* object_name;
+  gchar* start_date_str;
+  GDateTime* start_date;
+  gint index;
+  gboolean post_messages;
+  GstGsSinkNext next_file;
+  const gchar* content_type;
+  size_t nb_percent_format;
+  gboolean percent_s_is_first;
+  GstStructure* metadata;
+};
+
+static void gst_gs_sink_finalize(GObject* object);
+
+static void gst_gs_sink_set_property(GObject* object,
+                                     guint prop_id,
+                                     const GValue* value,
+                                     GParamSpec* pspec);
+static void gst_gs_sink_get_property(GObject* object,
+                                     guint prop_id,
+                                     GValue* value,
+                                     GParamSpec* pspec);
+
+static gboolean gst_gs_sink_start(GstBaseSink* bsink);
+static gboolean gst_gs_sink_stop(GstBaseSink* sink);
+static GstFlowReturn gst_gs_sink_render(GstBaseSink* sink, GstBuffer* buffer);
+static GstFlowReturn gst_gs_sink_render_list(GstBaseSink* sink,
+                                             GstBufferList* buffer_list);
+static gboolean gst_gs_sink_set_caps(GstBaseSink* sink, GstCaps* caps);
+static gboolean gst_gs_sink_event(GstBaseSink* sink, GstEvent* event);
+
+#define GST_TYPE_GS_SINK_NEXT (gst_gs_sink_next_get_type())
+static GType gst_gs_sink_next_get_type(void) {
+  static GType gs_sink_next_type = 0;
+  static const GEnumValue next_types[] = {
+      {GST_GS_SINK_NEXT_BUFFER, "New file for each buffer", "buffer"},
+      {GST_GS_SINK_NEXT_NONE, "Only one file, no next file", "none"},
+      {0, NULL, NULL}};
+
+  if (!gs_sink_next_type) {
+    gs_sink_next_type = g_enum_register_static("GstGsSinkNext", next_types);
+  }
+
+  return gs_sink_next_type;
+}
+
+#define gst_gs_sink_parent_class parent_class
+G_DEFINE_TYPE(GstGsSink, gst_gs_sink, GST_TYPE_BASE_SINK);
+GST_ELEMENT_REGISTER_DEFINE(gssink, "gssink", GST_RANK_NONE, GST_TYPE_GS_SINK)
+
+class GSWriteStream {
+ public:
+  GSWriteStream(google::cloud::storage::Client& client,
+                const char* bucket_name,
+                const char* object_name,
+                gcs::ObjectMetadata metadata)
+      : gcs_stream_(client.WriteObject(bucket_name,
+                                       object_name,
+                                       gcs::WithObjectMetadata(metadata))) {}
+  ~GSWriteStream() { gcs_stream_.Close(); }
+
+  gcs::ObjectWriteStream& stream() { return gcs_stream_; }
+
+ private:
+  gcs::ObjectWriteStream gcs_stream_;
+};
+
+static void gst_gs_sink_class_init(GstGsSinkClass* klass) {
+  GObjectClass* gobject_class = G_OBJECT_CLASS(klass);
+  GstElementClass* gstelement_class = GST_ELEMENT_CLASS(klass);
+  GstBaseSinkClass* gstbasesink_class = GST_BASE_SINK_CLASS(klass);
+
+  gobject_class->set_property = gst_gs_sink_set_property;
+  gobject_class->get_property = gst_gs_sink_get_property;
+
+  /**
+   * GstGsSink:bucket-name:
+   *
+   * Name of the Google Cloud Storage bucket.
+ * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_BUCKET_NAME, + g_param_spec_string( + "bucket-name", "Bucket Name", "Google Cloud Storage Bucket Name", + NULL, (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstGsSink:object-name: + * + * Full path name of the remote file. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_OBJECT_NAME, + g_param_spec_string( + "object-name", "Object Name", "Full path name of the remote file", + DEFAULT_OBJECT_NAME, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstGsSink:index: + * + * Index to use with location property to create file names. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_INDEX, + g_param_spec_int( + "index", "Index", + "Index to use with location property to create file names. The " + "index is incremented by one for each buffer written.", + 0, G_MAXINT, DEFAULT_INDEX, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstGsSink:post-messages: + * + * Post a message on the GstBus for each file. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_POST_MESSAGES, + g_param_spec_boolean( + "post-messages", "Post Messages", + "Post a message for each file with information of the buffer", + DEFAULT_POST_MESSAGES, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + /** + * GstGsSink:next-file: + * + * A #GstGsSinkNext that specifies when to start a new file. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_NEXT_FILE, + g_param_spec_enum( + "next-file", "Next File", "When to start a new file", + GST_TYPE_GS_SINK_NEXT, DEFAULT_NEXT_FILE, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstGsSink:service-account-email: + * + * Service Account Email to use for credentials. 
+ * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_SERVICE_ACCOUNT_EMAIL, + g_param_spec_string( + "service-account-email", "Service Account Email", + "Service Account Email to use for credentials", NULL, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstGsSink:service-account-credentials: + * + * Service Account Credentials as a JSON string to use for credentials. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_SERVICE_ACCOUNT_CREDENTIALS, + g_param_spec_string( + "service-account-credentials", "Service Account Credentials", + "Service Account Credentials as a JSON string to use for credentials", + NULL, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstGsSink:start-date: + * + * Start date in iso8601 format. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_START_DATE, + g_param_spec_string( + "start-date", "Start Date", "Start date in iso8601 format", NULL, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstGsSink:metadata: + * + * A map of metadata to store with the object; field values need to be + * convertible to strings. 
+   *
+   * Since: 1.20
+   */
+  g_object_class_install_property(
+      gobject_class, PROP_METADATA,
+      g_param_spec_boxed(
+          "metadata", "Metadata",
+          "A map of metadata to store with the object; field values need to be "
+          "convertible to strings.",
+          GST_TYPE_STRUCTURE,
+          (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS |
+                        GST_PARAM_MUTABLE_READY)));
+
+  gobject_class->finalize = gst_gs_sink_finalize;
+
+  gstbasesink_class->start = GST_DEBUG_FUNCPTR(gst_gs_sink_start);
+  gstbasesink_class->stop = GST_DEBUG_FUNCPTR(gst_gs_sink_stop);
+  gstbasesink_class->render = GST_DEBUG_FUNCPTR(gst_gs_sink_render);
+  gstbasesink_class->render_list = GST_DEBUG_FUNCPTR(gst_gs_sink_render_list);
+  gstbasesink_class->set_caps = GST_DEBUG_FUNCPTR(gst_gs_sink_set_caps);
+  gstbasesink_class->event = GST_DEBUG_FUNCPTR(gst_gs_sink_event);
+
+  GST_DEBUG_CATEGORY_INIT(gst_gs_sink_debug, "gssink", 0, "gssink element");
+
+  gst_element_class_add_static_pad_template(gstelement_class, &sinktemplate);
+  gst_element_class_set_static_metadata(
+      gstelement_class, "Google Cloud Storage Sink", "Sink/File",
+      "Write buffers to a sequentially named set of files on Google Cloud "
+      "Storage",
+      "Julien Isorce <jisorce@oblong.com>");
+}
+
+static void gst_gs_sink_init(GstGsSink* sink) {
+  sink->gcs_client = nullptr;
+  sink->gcs_stream = nullptr;
+  sink->index = DEFAULT_INDEX;
+  sink->post_messages = DEFAULT_POST_MESSAGES;
+  sink->service_account_email = NULL;
+  sink->service_account_credentials = NULL;
+  sink->bucket_name = NULL;
+  sink->object_name = g_strdup(DEFAULT_OBJECT_NAME);
+  sink->start_date_str = NULL;
+  sink->start_date = NULL;
+  sink->next_file = DEFAULT_NEXT_FILE;
+  sink->content_type = NULL;
+  sink->nb_percent_format = 0;
+  sink->percent_s_is_first = FALSE;
+
+  gst_base_sink_set_sync(GST_BASE_SINK(sink), FALSE);
+}
+
+static void gst_gs_sink_finalize(GObject* object) {
+  GstGsSink* sink = GST_GS_SINK(object);
+
+  sink->gcs_client = nullptr;
+  sink->gcs_stream = nullptr;
+  g_free(sink->service_account_email);
+  sink->service_account_email = NULL;
+  g_free(sink->service_account_credentials);
+  sink->service_account_credentials = NULL;
+  g_free(sink->bucket_name);
+  sink->bucket_name = NULL;
+  g_free(sink->object_name);
+  sink->object_name = NULL;
+  g_free(sink->start_date_str);
+  sink->start_date_str = NULL;
+  if (sink->start_date) {
+    g_date_time_unref(sink->start_date);
+    sink->start_date = NULL;
+  }
+  sink->content_type = NULL;
+  g_clear_pointer(&sink->metadata, gst_structure_free);
+
+  G_OBJECT_CLASS(parent_class)->finalize(object);
+}
+
+static gboolean gst_gs_sink_set_object_name(GstGsSink* sink,
+                                            const gchar* object_name) {
+  g_free(sink->object_name);
+  sink->object_name = NULL;
+  sink->nb_percent_format = 0;
+  sink->percent_s_is_first = FALSE;
+
+  if (!object_name) {
+    GST_ERROR_OBJECT(sink, "Object name is null");
+    return FALSE;
+  }
+
+  const std::string name(object_name);
+  sink->nb_percent_format = std::count(name.begin(), name.end(), '%');
+  if (sink->nb_percent_format > 2) {
+    GST_ERROR_OBJECT(sink, "Object name has too many formats");
+    return FALSE;
+  }
+
+  const size_t delimiter_percent_s = name.find("%s");
+  if (delimiter_percent_s == std::string::npos) {
+    if (sink->nb_percent_format == 2) {
+      GST_ERROR_OBJECT(sink, "Object name must have just one number format");
+      return FALSE;
+    }
+    sink->object_name = g_strdup(object_name);
+    return TRUE;
+  }
+
+  const size_t delimiter_percent = name.find_first_of('%');
+  if (delimiter_percent_s == delimiter_percent) {
+    sink->percent_s_is_first = TRUE;
+
+    if (name.find("%s", delimiter_percent_s + 1) != std::string::npos) {
+      GST_ERROR_OBJECT(sink, "Object name expects max one string format");
+      return FALSE;
+    }
+  }
+
+  sink->object_name = g_strdup(object_name);
+
+  return TRUE;
+}
+
+static void gst_gs_sink_set_property(GObject* object,
+                                     guint prop_id,
+                                     const GValue* value,
+                                     GParamSpec* pspec) {
+  GstGsSink* sink = GST_GS_SINK(object);
+
+  switch (prop_id) {
+    case PROP_BUCKET_NAME:
+      g_free(sink->bucket_name);
+      sink->bucket_name = g_strdup(g_value_get_string(value));
+      break;
+    case PROP_OBJECT_NAME:
+      gst_gs_sink_set_object_name(sink, g_value_get_string(value));
+      break;
+    case PROP_INDEX:
+      sink->index = g_value_get_int(value);
+      break;
+    case PROP_POST_MESSAGES:
+      sink->post_messages = g_value_get_boolean(value);
+      break;
+    case PROP_NEXT_FILE:
+      sink->next_file = (GstGsSinkNext)g_value_get_enum(value);
+      break;
+    case PROP_SERVICE_ACCOUNT_EMAIL:
+      g_free(sink->service_account_email);
+      sink->service_account_email = g_strdup(g_value_get_string(value));
+      break;
+    case PROP_SERVICE_ACCOUNT_CREDENTIALS:
+      g_free(sink->service_account_credentials);
+      sink->service_account_credentials = g_strdup(g_value_get_string(value));
+      break;
+    case PROP_START_DATE:
+      g_free(sink->start_date_str);
+      if (sink->start_date)
+        g_date_time_unref(sink->start_date);
+      sink->start_date_str = g_strdup(g_value_get_string(value));
+      sink->start_date =
+          g_date_time_new_from_iso8601(sink->start_date_str, NULL);
+      if (!sink->start_date) {
+        GST_ERROR_OBJECT(sink, "Failed to parse start date %s",
+                         sink->start_date_str);
+        g_free(sink->start_date_str);
+        sink->start_date_str = NULL;
+      }
+      break;
+    case PROP_METADATA:
+      g_clear_pointer(&sink->metadata, gst_structure_free);
+      sink->metadata = (GstStructure*)g_value_dup_boxed(value);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);
+      break;
+  }
+}
+
+static void gst_gs_sink_get_property(GObject* object,
+                                     guint prop_id,
+                                     GValue* value,
+                                     GParamSpec* pspec) {
+  GstGsSink* sink = GST_GS_SINK(object);
+
+  switch (prop_id) {
+    case PROP_BUCKET_NAME:
+      g_value_set_string(value, sink->bucket_name);
+      break;
+    case PROP_OBJECT_NAME:
+      g_value_set_string(value, sink->object_name);
+      break;
+    case PROP_INDEX:
+      g_value_set_int(value, sink->index);
+      break;
+    case PROP_POST_MESSAGES:
+      g_value_set_boolean(value, sink->post_messages);
+      break;
+    case PROP_NEXT_FILE:
+      g_value_set_enum(value, sink->next_file);
+      break;
+    case PROP_SERVICE_ACCOUNT_EMAIL:
+      g_value_set_string(value, sink->service_account_email);
+      break;
+    case PROP_SERVICE_ACCOUNT_CREDENTIALS:
+      g_value_set_string(value, sink->service_account_credentials);
+      break;
+    case PROP_START_DATE:
+      g_value_set_string(value, sink->start_date_str);
+      break;
+    case PROP_METADATA:
+      g_value_set_boxed(value, sink->metadata);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec);
+      break;
+  }
+}
+
+static gboolean gst_gs_sink_start(GstBaseSink* bsink) {
+  GstGsSink* sink = GST_GS_SINK(bsink);
+  GError* err = NULL;
+
+  if (!sink->bucket_name) {
+    GST_ELEMENT_ERROR(sink, RESOURCE, SETTINGS, ("Bucket name is required"),
+                      GST_ERROR_SYSTEM);
+    return FALSE;
+  }
+
+  if (!sink->object_name) {
+    GST_ELEMENT_ERROR(sink, RESOURCE, SETTINGS, ("Object name is required"),
+                      GST_ERROR_SYSTEM);
+    return FALSE;
+  }
+
+  sink->content_type = "";
+
+  sink->gcs_client = gst_gs_create_client(
+      sink->service_account_email, sink->service_account_credentials, &err);
+  if (err) {
+    GST_ELEMENT_ERROR(sink, RESOURCE, OPEN_READ,
+                      ("Could not create client (%s)", err->message),
+                      GST_ERROR_SYSTEM);
+    g_clear_error(&err);
+    return FALSE;
+  }
+
+  GST_INFO_OBJECT(sink, "Using bucket name (%s) and object name (%s)",
+                  sink->bucket_name, sink->object_name);
+
+  return TRUE;
+}
+
+static gboolean gst_gs_sink_stop(GstBaseSink* bsink) {
+  GstGsSink* sink = GST_GS_SINK(bsink);
+
+  sink->gcs_client = nullptr;
+  sink->gcs_stream = nullptr;
+  sink->content_type = NULL;
+
+  return TRUE;
+}
+
+static void gst_gs_sink_post_message_full(GstGsSink* sink,
+                                          GstClockTime timestamp,
+                                          GstClockTime duration,
+                                          GstClockTime offset,
+                                          GstClockTime offset_end,
+                                          GstClockTime running_time,
+                                          GstClockTime stream_time,
+                                          const char* filename,
+                                          const gchar* date) {
+  GstStructure* s;
+
+  if (!sink->post_messages)
+    return;
+
+  s = gst_structure_new("GstGsSink", "filename", G_TYPE_STRING, filename,
+
"date", G_TYPE_STRING, date, "index", G_TYPE_INT, + sink->index, "timestamp", G_TYPE_UINT64, timestamp, + "stream-time", G_TYPE_UINT64, stream_time, + "running-time", G_TYPE_UINT64, running_time, "duration", + G_TYPE_UINT64, duration, "offset", G_TYPE_UINT64, + offset, "offset-end", G_TYPE_UINT64, offset_end, NULL); + + gst_element_post_message(GST_ELEMENT_CAST(sink), + gst_message_new_element(GST_OBJECT_CAST(sink), s)); +} + +static void gst_gs_sink_post_message_from_time(GstGsSink* sink, + GstClockTime timestamp, + GstClockTime duration, + const char* filename) { + GstClockTime running_time, stream_time; + guint64 offset, offset_end; + GstSegment* segment; + GstFormat format; + + if (!sink->post_messages) + return; + + segment = &GST_BASE_SINK(sink)->segment; + format = segment->format; + + offset = -1; + offset_end = -1; + + running_time = gst_segment_to_running_time(segment, format, timestamp); + stream_time = gst_segment_to_stream_time(segment, format, timestamp); + + gst_gs_sink_post_message_full(sink, timestamp, duration, offset, offset_end, + running_time, stream_time, filename, NULL); +} + +static void gst_gs_sink_post_message(GstGsSink* sink, + GstBuffer* buffer, + const char* filename, + const char* date) { + GstClockTime duration, timestamp; + GstClockTime running_time, stream_time; + guint64 offset, offset_end; + GstSegment* segment; + GstFormat format; + + if (!sink->post_messages) + return; + + segment = &GST_BASE_SINK(sink)->segment; + format = segment->format; + + timestamp = GST_BUFFER_PTS(buffer); + duration = GST_BUFFER_DURATION(buffer); + offset = GST_BUFFER_OFFSET(buffer); + offset_end = GST_BUFFER_OFFSET_END(buffer); + + running_time = gst_segment_to_running_time(segment, format, timestamp); + stream_time = gst_segment_to_stream_time(segment, format, timestamp); + + gst_gs_sink_post_message_full(sink, timestamp, duration, offset, offset_end, + running_time, stream_time, filename, date); +} + +struct AddMetadataIter { + GstGsSink* sink; + 
gcs::ObjectMetadata* metadata; +}; + +static gboolean add_metadata_foreach(GQuark field_id, + const GValue* value, + gpointer user_data) { + struct AddMetadataIter* it = (struct AddMetadataIter*)user_data; + GValue svalue = G_VALUE_INIT; + + g_value_init(&svalue, G_TYPE_STRING); + + if (g_value_transform(value, &svalue)) { + const gchar* key = g_quark_to_string(field_id); + const gchar* value = g_value_get_string(&svalue); + + GST_LOG_OBJECT(it->sink, "metadata '%s' -> '%s'", key, value); + it->metadata->upsert_metadata(key, value); + } else { + GST_WARNING_OBJECT(it->sink, "Failed to convert metadata '%s' to string", + g_quark_to_string(field_id)); + } + + g_value_unset(&svalue); + return TRUE; +} + +static GstFlowReturn gst_gs_sink_write_buffer(GstGsSink* sink, + GstBuffer* buffer) { + GstMapInfo map = {0}; + gchar* object_name = NULL; + gchar* buffer_date = NULL; + + if (!gst_buffer_map(buffer, &map, GST_MAP_READ)) + return GST_FLOW_ERROR; + + gcs::ObjectMetadata metadata = + gcs::ObjectMetadata().set_content_type(sink->content_type); + + if (sink->metadata) { + struct AddMetadataIter it = {sink, &metadata}; + + gst_structure_foreach(sink->metadata, add_metadata_foreach, &it); + } + + switch (sink->next_file) { + case GST_GS_SINK_NEXT_BUFFER: { + // Get buffer date if needed. 
+ if (sink->start_date) { + if (sink->nb_percent_format != 2) { + GST_ERROR_OBJECT(sink, "Object name expects date and index"); + gst_buffer_unmap(buffer, &map); + return GST_FLOW_ERROR; + } + + if (!gst_gs_get_buffer_date(buffer, sink->start_date, &buffer_date)) { + GST_ERROR_OBJECT(sink, "Could not get buffer date %s", object_name); + gst_buffer_unmap(buffer, &map); + return GST_FLOW_ERROR; + } + + if (sink->percent_s_is_first) { + object_name = + g_strdup_printf(sink->object_name, buffer_date, sink->index); + } else { + object_name = + g_strdup_printf(sink->object_name, sink->index, buffer_date); + } + } else { + if (sink->nb_percent_format != 1) { + GST_ERROR_OBJECT(sink, "Object name expects only an index"); + gst_buffer_unmap(buffer, &map); + return GST_FLOW_ERROR; + } + + object_name = g_strdup_printf(sink->object_name, sink->index); + } + + GST_INFO_OBJECT(sink, "Writing %" G_GSIZE_FORMAT " bytes", map.size); + + gcs::ObjectWriteStream gcs_stream = sink->gcs_client->WriteObject( + sink->bucket_name, object_name, gcs::WithObjectMetadata(metadata)); + + gcs_stream.write(reinterpret_cast<const char*>(map.data), map.size); + if (gcs_stream.fail()) { + GST_WARNING_OBJECT(sink, "Failed to write to %s", object_name); + } + gcs_stream.Close(); + + google::cloud::StatusOr<gcs::ObjectMetadata> object_metadata = + sink->gcs_client->GetObjectMetadata(sink->bucket_name, object_name); + if (!object_metadata) { + GST_ERROR_OBJECT( + sink, "Could not get object metadata for object %s (%s)", + object_name, object_metadata.status().message().c_str()); + gst_buffer_unmap(buffer, &map); + g_free(object_name); + g_free(buffer_date); + return GST_FLOW_ERROR; + } + + GST_INFO_OBJECT(sink, "Wrote object %s of size %" G_GUINT64_FORMAT "\n", + object_name, object_metadata->size()); + + gst_gs_sink_post_message(sink, buffer, object_name, buffer_date); + g_free(object_name); + g_free(buffer_date); + ++sink->index; + break; + } + case GST_GS_SINK_NEXT_NONE: { + if (!sink->gcs_stream) { 
+ GST_INFO_OBJECT(sink, "Opening %s", sink->object_name); + sink->gcs_stream = std::make_unique<GSWriteStream>( + *sink->gcs_client.get(), sink->bucket_name, sink->object_name, + metadata); + + if (!sink->gcs_stream->stream().IsOpen()) { + GST_ELEMENT_ERROR( + sink, RESOURCE, OPEN_READ, + ("Could not create write stream (%s)", + sink->gcs_stream->stream().last_status().message().c_str()), + GST_ERROR_SYSTEM); + gst_buffer_unmap(buffer, &map); + return GST_FLOW_OK; + } + } + + GST_INFO_OBJECT(sink, "Writing %" G_GSIZE_FORMAT " bytes", map.size); + + gcs::ObjectWriteStream& stream = sink->gcs_stream->stream(); + stream.write(reinterpret_cast<const char*>(map.data), map.size); + if (stream.fail()) { + GST_WARNING_OBJECT(sink, "Failed to write to %s", sink->object_name); + } + break; + } + default: + g_assert_not_reached(); + } + + gst_buffer_unmap(buffer, &map); + return GST_FLOW_OK; +} + +static GstFlowReturn gst_gs_sink_render(GstBaseSink* bsink, GstBuffer* buffer) { + GstGsSink* sink = GST_GS_SINK(bsink); + GstFlowReturn flow = GST_FLOW_OK; + + flow = gst_gs_sink_write_buffer(sink, buffer); + return flow; +} + +static gboolean buffer_list_copy_data(GstBuffer** buf, + guint idx, + gpointer data) { + GstBuffer* dest = GST_BUFFER_CAST(data); + guint num, i; + + if (idx == 0) + gst_buffer_copy_into(dest, *buf, GST_BUFFER_COPY_METADATA, 0, -1); + + num = gst_buffer_n_memory(*buf); + for (i = 0; i < num; ++i) { + GstMemory* mem; + + mem = gst_buffer_get_memory(*buf, i); + gst_buffer_append_memory(dest, mem); + } + + return TRUE; +} + +/* Our assumption for now is that the buffers in a buffer list should always + * end up in the same file. If someone wants different behaviour, they'll just + * have to add a property for that. 
*/ +static GstFlowReturn gst_gs_sink_render_list(GstBaseSink* sink, + GstBufferList* list) { + GstBuffer* buf; + guint size; + + size = gst_buffer_list_calculate_size(list); + GST_LOG_OBJECT(sink, "total size of buffer list %p: %u", list, size); + + /* copy all buffers in the list into one single buffer, so we can use + * the normal render function (FIXME: optimise to avoid the memcpy) */ + buf = gst_buffer_new(); + gst_buffer_list_foreach(list, buffer_list_copy_data, buf); + g_assert(gst_buffer_get_size(buf) == size); + + gst_gs_sink_render(sink, buf); + gst_buffer_unref(buf); + + return GST_FLOW_OK; +} + +static gboolean gst_gs_sink_set_caps(GstBaseSink* bsink, GstCaps* caps) { + GstGsSink* sink = GST_GS_SINK(bsink); + GstStructure* s = gst_caps_get_structure(caps, 0); + + sink->content_type = gst_structure_get_name(s); + + GST_INFO_OBJECT(sink, "Content type: %s", sink->content_type); + + return TRUE; +} + +static gboolean gst_gs_sink_event(GstBaseSink* bsink, GstEvent* event) { + GstGsSink* sink = GST_GS_SINK(bsink); + + switch (GST_EVENT_TYPE(event)) { + case GST_EVENT_EOS: + if (sink->gcs_stream) { + sink->gcs_stream = nullptr; + gst_gs_sink_post_message_from_time( + sink, GST_BASE_SINK(sink)->segment.position, -1, sink->object_name); + } + break; + default: + break; + } + + return GST_BASE_SINK_CLASS(parent_class)->event(bsink, event); +}
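The object-name template checks in `gst_gs_sink_set_object_name` above (at most two `%` directives, at most one `%s`, and a flag recording whether the date directive comes before the numeric index) can be modeled as a small standalone sketch. All names here are hypothetical, and unlike the element the duplicate-`%s` check is applied unconditionally for simplicity:

```cpp
#include <algorithm>
#include <cstddef>
#include <string>

// Hypothetical standalone model of gssink's object-name template validation.
struct TemplateInfo {
  bool valid = false;          // template accepted
  int n_formats = 0;           // number of '%' directives found
  bool percent_s_is_first = false;  // "%s" (date) precedes the index directive
};

TemplateInfo parse_object_name(const std::string& name) {
  TemplateInfo info;
  info.n_formats = static_cast<int>(std::count(name.begin(), name.end(), '%'));
  if (info.n_formats > 2)
    return info;  // too many directives, rejected

  const size_t percent_s = name.find("%s");
  if (percent_s == std::string::npos) {
    // No string directive: a single numeric directive (or none) is fine,
    // but two numeric directives are rejected.
    info.valid = info.n_formats != 2;
    return info;
  }

  if (name.find("%s", percent_s + 1) != std::string::npos)
    return info;  // at most one "%s" allowed

  info.percent_s_is_first = (name.find_first_of('%') == percent_s);
  info.valid = true;
  return info;
}
```

With `start-date` set the element expects both directives (e.g. `%s_%03d.mp4`); without it, exactly one numeric directive (e.g. `frame_%05d.jpeg`).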
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgssink.h
Added
@@ -0,0 +1,49 @@ +/* GStreamer + * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com> + * + * gstgssink.h: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_GS_SINK_H__ +#define __GST_GS_SINK_H__ + +#include <gst/base/base.h> +#include <gst/gst.h> + +G_BEGIN_DECLS + +#define GST_TYPE_GS_SINK (gst_gs_sink_get_type()) +G_DECLARE_FINAL_TYPE(GstGsSink, gst_gs_sink, GST, GS_SINK, GstBaseSink) + +/** + * GstGsSinkNext: + * @GST_GS_SINK_NEXT_BUFFER: New file for each buffer. + * @GST_GS_SINK_NEXT_NONE: Only one file like filesink. + * + * File splitting modes. + * Since: 1.20 + */ +typedef enum { + GST_GS_SINK_NEXT_BUFFER, + GST_GS_SINK_NEXT_NONE, +} GstGsSinkNext; + +GST_ELEMENT_REGISTER_DECLARE(gssink); + +G_END_DECLS +#endif // __GST_GS_SINK_H__
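The `GstGsSinkNext` modes above control file splitting: in `GST_GS_SINK_NEXT_BUFFER` mode each buffer is written to a new object whose name is the template with the running index substituted (the element uses `g_strdup_printf` for this). A rough stdlib-only equivalent, with hypothetical names:

```cpp
#include <cstdio>
#include <string>

// Sketch of GST_GS_SINK_NEXT_BUFFER naming: format the running index into
// the object-name template, e.g. "frame_%05d.jpeg" -> "frame_00000.jpeg".
std::string next_object_name(const std::string& tmpl, int index) {
  char buf[256];
  std::snprintf(buf, sizeof(buf), tmpl.c_str(), index);
  return std::string(buf);
}
```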
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgssrc.cpp
Added
@@ -0,0 +1,632 @@ +/* GStreamer + * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com> + * + * gstgssrc.c: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-gssrc + * @title: gssrc + * @see_also: #GstGsSink + * + * Read data from a file in a Google Cloud Storage bucket. + * + * ## Example launch line + * ``` + * gst-launch-1.0 gssrc location=gs://mybucket/myvideo.mkv ! decodebin ! + * glimagesink + * ``` + * ### Play a video from a Google Cloud Storage bucket. + * ``` + * gst-launch-1.0 gssrc location=gs://mybucket/myvideo.mkv ! decodebin ! navseek + * seek-offset=10 ! glimagesink + * ``` + * ### Play a video from a Google Cloud Storage bucket and seek using the keyboard + * from the terminal. 
+ * + * See also: #GstGsSink + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstgscommon.h" +#include "gstgssrc.h" + +static GstStaticPadTemplate srctemplate = + GST_STATIC_PAD_TEMPLATE("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS_ANY); + +GST_DEBUG_CATEGORY_STATIC(gst_gs_src_debug); +#define GST_CAT_DEFAULT gst_gs_src_debug + +enum { LAST_SIGNAL }; + +// https://github.com/googleapis/google-cloud-cpp/issues/2657 +#define DEFAULT_BLOCKSIZE 3 * 1024 * 1024 / 2 + +enum { + PROP_0, + PROP_LOCATION, + PROP_SERVICE_ACCOUNT_EMAIL, + PROP_SERVICE_ACCOUNT_CREDENTIALS +}; + +class GSReadStream; + +struct _GstGsSrc { + GstBaseSrc parent; + + std::unique_ptr<google::cloud::storage::Client> gcs_client; + std::unique_ptr<GSReadStream> gcs_stream; + gchar* uri; + gchar* service_account_email; + gchar* service_account_credentials; + std::string bucket_name; + std::string object_name; + guint64 read_position; + guint64 object_size; +}; + +static void gst_gs_src_finalize(GObject* object); + +static void gst_gs_src_set_property(GObject* object, + guint prop_id, + const GValue* value, + GParamSpec* pspec); +static void gst_gs_src_get_property(GObject* object, + guint prop_id, + GValue* value, + GParamSpec* pspec); + +static gboolean gst_gs_src_start(GstBaseSrc* basesrc); +static gboolean gst_gs_src_stop(GstBaseSrc* basesrc); + +static gboolean gst_gs_src_is_seekable(GstBaseSrc* src); +static gboolean gst_gs_src_get_size(GstBaseSrc* src, guint64* size); +static GstFlowReturn gst_gs_src_fill(GstBaseSrc* src, + guint64 offset, + guint length, + GstBuffer* buf); +static gboolean gst_gs_src_query(GstBaseSrc* src, GstQuery* query); + +static void gst_gs_src_uri_handler_init(gpointer g_iface, gpointer iface_data); + +#define _do_init \ + G_IMPLEMENT_INTERFACE(GST_TYPE_URI_HANDLER, gst_gs_src_uri_handler_init); \ + GST_DEBUG_CATEGORY_INIT(gst_gs_src_debug, "gssrc", 0, "gssrc element"); +#define gst_gs_src_parent_class parent_class 
+G_DEFINE_TYPE_WITH_CODE(GstGsSrc, gst_gs_src, GST_TYPE_BASE_SRC, _do_init); +GST_ELEMENT_REGISTER_DEFINE(gssrc, "gssrc", GST_RANK_NONE, GST_TYPE_GS_SRC) + +namespace gcs = google::cloud::storage; + +class GSReadStream { + public: + GSReadStream(GstGsSrc* src, + const std::int64_t start = 0, + const std::int64_t end = -1) + : gcs_stream_(src->gcs_client->ReadObject(src->bucket_name, + src->object_name, + gcs::ReadFromOffset(start))) {} + ~GSReadStream() { gcs_stream_.Close(); } + + gcs::ObjectReadStream& stream() { return gcs_stream_; } + + private: + gcs::ObjectReadStream gcs_stream_; +}; + +static void gst_gs_src_class_init(GstGsSrcClass* klass) { + GObjectClass* gobject_class = G_OBJECT_CLASS(klass); + GstElementClass* gstelement_class = GST_ELEMENT_CLASS(klass); + GstBaseSrcClass* gstbasesrc_class = GST_BASE_SRC_CLASS(klass); + + gobject_class->set_property = gst_gs_src_set_property; + gobject_class->get_property = gst_gs_src_get_property; + + /** + * GstGsSrc:location: + * + * Location of the file to read, of the form gs://bucket-name/object-name. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_LOCATION, + g_param_spec_string( + "location", "File Location", "Location of the file to read", NULL, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstGsSrc:service-account-email: + * + * Service Account Email to use for credentials. + * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_SERVICE_ACCOUNT_EMAIL, + g_param_spec_string( + "service-account-email", "Service Account Email", + "Service Account Email to use for credentials", NULL, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + /** + * GstGsSrc:service-account-credentials: + * + * Service Account Credentials as a JSON string to use for credentials. 
+ * + * Since: 1.20 + */ + g_object_class_install_property( + gobject_class, PROP_SERVICE_ACCOUNT_CREDENTIALS, + g_param_spec_string( + "service-account-credentials", "Service Account Credentials", + "Service Account Credentials as a JSON string to use for credentials", + NULL, + (GParamFlags)(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + + gobject_class->finalize = gst_gs_src_finalize; + + gst_element_class_set_static_metadata( + gstelement_class, "Google Cloud Storage Source", "Source/File", + "Read from arbitrary point from a file in a Google Cloud Storage", + "Julien Isorce <jisorce@oblong.com>"); + gst_element_class_add_static_pad_template(gstelement_class, &srctemplate); + + gstbasesrc_class->start = GST_DEBUG_FUNCPTR(gst_gs_src_start); + gstbasesrc_class->stop = GST_DEBUG_FUNCPTR(gst_gs_src_stop); + gstbasesrc_class->is_seekable = GST_DEBUG_FUNCPTR(gst_gs_src_is_seekable); + gstbasesrc_class->get_size = GST_DEBUG_FUNCPTR(gst_gs_src_get_size); + gstbasesrc_class->fill = GST_DEBUG_FUNCPTR(gst_gs_src_fill); + gstbasesrc_class->query = GST_DEBUG_FUNCPTR(gst_gs_src_query); +} + +static void gst_gs_src_init(GstGsSrc* src) { + src->gcs_stream = nullptr; + src->uri = NULL; + src->service_account_email = NULL; + src->service_account_credentials = NULL; + src->read_position = 0; + src->object_size = 0; + + gst_base_src_set_blocksize(GST_BASE_SRC(src), DEFAULT_BLOCKSIZE); + gst_base_src_set_dynamic_size(GST_BASE_SRC(src), FALSE); + gst_base_src_set_live(GST_BASE_SRC(src), FALSE); +} + +static void gst_gs_src_finalize(GObject* object) { + GstGsSrc* src = GST_GS_SRC(object); + + g_free(src->uri); + src->uri = NULL; + g_free(src->service_account_email); + src->service_account_email = NULL; + g_free(src->service_account_credentials); + src->service_account_credentials = NULL; + src->read_position = 0; + src->object_size = 0; + + G_OBJECT_CLASS(parent_class)->finalize(object); +} + +static gboolean gst_gs_src_set_location(GstGsSrc* src, + 
const gchar* location, + GError** err) { + GstState state = GST_STATE_NULL; + std::string filepath = location; + size_t delimiter = std::string::npos; + + // The element must be stopped in order to do this. + GST_OBJECT_LOCK(src); + state = GST_STATE(src); + if (state != GST_STATE_READY && state != GST_STATE_NULL) + goto wrong_state; + GST_OBJECT_UNLOCK(src); + + g_free(src->uri); + src->uri = NULL; + + if (location) { + if (g_str_has_prefix(location, "gs://")) { + src->uri = g_strdup(location); + filepath = filepath.substr(5); + } else { + src->uri = g_strdup_printf("gs://%s", location); + filepath = location; + } + + delimiter = filepath.find_first_of('/'); + if (delimiter == std::string::npos) + goto wrong_location; + + std::string bucket_name = filepath.substr(0, delimiter); + src->bucket_name = bucket_name; + src->object_name = filepath.substr(delimiter + 1); + + GST_INFO_OBJECT(src, "uri is %s", src->uri); + GST_INFO_OBJECT(src, "bucket name is %s", src->bucket_name.c_str()); + GST_INFO_OBJECT(src, "object name is %s", src->object_name.c_str()); + } + g_object_notify(G_OBJECT(src), "location"); + + return TRUE; + + // ERROR. 
+wrong_state : { + g_warning( + "Changing the `location' property on gssrc when a file is open" + "is not supported."); + if (err) + g_set_error( + err, GST_URI_ERROR, GST_URI_ERROR_BAD_STATE, + "Changing the `location' property on gssrc when a file is open is " + "not supported."); + GST_OBJECT_UNLOCK(src); + return FALSE; +} +wrong_location : { + if (err) + g_set_error(err, GST_URI_ERROR, GST_URI_ERROR_BAD_URI, + "Failed to find a bucket name"); + GST_OBJECT_UNLOCK(src); + return FALSE; +} +} + +static gboolean gst_gs_src_set_service_account_email( + GstGsSrc* src, + const gchar* service_account_email) { + if (GST_STATE(src) == GST_STATE_PLAYING || + GST_STATE(src) == GST_STATE_PAUSED) { + GST_WARNING_OBJECT(src, + "Setting a new service account email not supported in " + "PLAYING or PAUSED state"); + return FALSE; + } + + GST_OBJECT_LOCK(src); + g_free(src->service_account_email); + src->service_account_email = NULL; + + if (service_account_email) + src->service_account_email = g_strdup(service_account_email); + + GST_OBJECT_UNLOCK(src); + + return TRUE; +} + +static gboolean gst_gs_src_set_service_account_credentials( + GstGsSrc* src, + const gchar* service_account_credentials) { + if (GST_STATE(src) == GST_STATE_PLAYING || + GST_STATE(src) == GST_STATE_PAUSED) { + GST_WARNING_OBJECT( + src, + "Setting a new service account credentials not supported in " + "PLAYING or PAUSED state"); + return FALSE; + } + + GST_OBJECT_LOCK(src); + g_free(src->service_account_credentials); + src->service_account_credentials = NULL; + + if (service_account_credentials) + src->service_account_credentials = g_strdup(service_account_credentials); + + GST_OBJECT_UNLOCK(src); + + return TRUE; +} + +static void gst_gs_src_set_property(GObject* object, + guint prop_id, + const GValue* value, + GParamSpec* pspec) { + GstGsSrc* src = GST_GS_SRC(object); + + g_return_if_fail(GST_IS_GS_SRC(object)); + + switch (prop_id) { + case PROP_LOCATION: + gst_gs_src_set_location(src, 
g_value_get_string(value), NULL); + break; + case PROP_SERVICE_ACCOUNT_EMAIL: + gst_gs_src_set_service_account_email(src, g_value_get_string(value)); + break; + case PROP_SERVICE_ACCOUNT_CREDENTIALS: + gst_gs_src_set_service_account_credentials(src, + g_value_get_string(value)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); + break; + } +} + +static void gst_gs_src_get_property(GObject* object, + guint prop_id, + GValue* value, + GParamSpec* pspec) { + GstGsSrc* src = GST_GS_SRC(object); + + g_return_if_fail(GST_IS_GS_SRC(object)); + + switch (prop_id) { + case PROP_LOCATION: + GST_OBJECT_LOCK(src); + g_value_set_string(value, src->uri); + GST_OBJECT_UNLOCK(src); + break; + case PROP_SERVICE_ACCOUNT_EMAIL: + GST_OBJECT_LOCK(src); + g_value_set_string(value, src->service_account_email); + GST_OBJECT_UNLOCK(src); + break; + case PROP_SERVICE_ACCOUNT_CREDENTIALS: + GST_OBJECT_LOCK(src); + g_value_set_string(value, src->service_account_credentials); + GST_OBJECT_UNLOCK(src); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID(object, prop_id, pspec); + break; + } +} + +static gint gst_gs_read_stream(GstGsSrc* src, + guint8* data, + const guint64 offset, + const guint length) { + gint gcount = 0; + gchar* sdata = reinterpret_cast<gchar*>(data); + + gcs::ObjectReadStream& stream = src->gcs_stream->stream(); + + while (!stream.eof()) { + stream.read(sdata, length); + if (stream.status().ok()) + break; + + GST_ERROR_OBJECT(src, "Restart after (%s)", + stream.status().message().c_str()); + src->gcs_stream = std::make_unique<GSReadStream>(src, offset); + } + + gcount = stream.gcount(); + + GST_INFO_OBJECT(src, "Client read %d bytes", gcount); + + return gcount; +} + +static GstFlowReturn gst_gs_src_fill(GstBaseSrc* basesrc, + guint64 offset, + guint length, + GstBuffer* buf) { + GstGsSrc* src = GST_GS_SRC(basesrc); + guint to_read = 0; + guint bytes_read = 0; + gint ret = 0; + GstMapInfo info = {}; + guint8* data = NULL; + + if 
(G_UNLIKELY(offset != (guint64)-1 && src->read_position != offset)) { + src->gcs_stream = std::make_unique<GSReadStream>(src, offset); + src->read_position = offset; + } + + if (!gst_buffer_map(buf, &info, GST_MAP_WRITE)) + goto buffer_write_fail; + + data = info.data; + + bytes_read = 0; + to_read = length; + while (to_read > 0) { + GST_INFO_OBJECT(src, "Reading %d bytes at offset 0x%" G_GINT64_MODIFIER "x", + to_read, offset + bytes_read); + + ret = gst_gs_read_stream(src, data + bytes_read, offset, to_read); + if (G_UNLIKELY(ret < 0)) + goto could_not_read; + + if (G_UNLIKELY(ret == 0)) { + // Push any remaining data. + if (bytes_read > 0) + break; + goto eos; + } + + to_read -= ret; + bytes_read += ret; + + src->read_position += ret; + } + + GST_INFO_OBJECT( + src, "Read %" G_GUINT32_FORMAT " bytes of %" G_GUINT32_FORMAT " length", + bytes_read, length); + + gst_buffer_unmap(buf, &info); + if (bytes_read != length) + gst_buffer_resize(buf, 0, bytes_read); + + GST_BUFFER_OFFSET(buf) = offset; + GST_BUFFER_OFFSET_END(buf) = offset + bytes_read; + + return GST_FLOW_OK; + + // ERROR. 
+could_not_read : { + GST_ELEMENT_ERROR(src, RESOURCE, READ, (NULL), GST_ERROR_SYSTEM); + gst_buffer_unmap(buf, &info); + gst_buffer_resize(buf, 0, 0); + return GST_FLOW_ERROR; +} +eos : { + GST_INFO_OBJECT(src, "EOS"); + gst_buffer_unmap(buf, &info); + gst_buffer_resize(buf, 0, 0); + return GST_FLOW_EOS; +} +buffer_write_fail : { + GST_ELEMENT_ERROR(src, RESOURCE, WRITE, (NULL), ("Can't write to buffer")); + return GST_FLOW_ERROR; +} +} + +static gboolean gst_gs_src_is_seekable(GstBaseSrc* basesrc) { + return TRUE; +} + +static gboolean gst_gs_src_get_size(GstBaseSrc* basesrc, guint64* size) { + GstGsSrc* src = GST_GS_SRC(basesrc); + + *size = src->object_size; + + return TRUE; +} + +static gboolean gst_gs_src_start(GstBaseSrc* basesrc) { + GstGsSrc* src = GST_GS_SRC(basesrc); + GError* err = NULL; + + src->read_position = 0; + src->object_size = 0; + + if (src->uri == NULL || src->uri[0] == '\0') { + GST_ELEMENT_ERROR(src, RESOURCE, NOT_FOUND, + ("No uri specified for reading."), (NULL)); + return FALSE; + } + + GST_INFO_OBJECT(src, "Opening file %s", src->uri); + + src->gcs_client = gst_gs_create_client( + src->service_account_email, src->service_account_credentials, &err); + if (err) { + GST_ELEMENT_ERROR(src, RESOURCE, OPEN_READ, + ("Could not create client (%s)", err->message), + GST_ERROR_SYSTEM); + g_clear_error(&err); + return FALSE; + } + + GST_INFO_OBJECT(src, "Parsed bucket name (%s) and object name (%s)", + src->bucket_name.c_str(), src->object_name.c_str()); + + google::cloud::StatusOr<gcs::ObjectMetadata> object_metadata = + src->gcs_client->GetObjectMetadata(src->bucket_name, src->object_name); + if (!object_metadata) { + GST_ELEMENT_ERROR(src, RESOURCE, OPEN_READ, + ("Could not get object metadata (%s)", + object_metadata.status().message().c_str()), + GST_ERROR_SYSTEM); + return FALSE; + } + + src->object_size = object_metadata->size(); + + GST_INFO_OBJECT(src, "Object size %" G_GUINT64_FORMAT "\n", src->object_size); + + src->gcs_stream = 
std::make_unique<GSReadStream>(src); + + return TRUE; +} + +static gboolean gst_gs_src_stop(GstBaseSrc* basesrc) { + GstGsSrc* src = GST_GS_SRC(basesrc); + + src->gcs_stream = nullptr; + src->read_position = 0; + src->object_size = 0; + + return TRUE; +} + +static gboolean gst_gs_src_query(GstBaseSrc* src, GstQuery* query) { + gboolean ret; + + switch (GST_QUERY_TYPE(query)) { + case GST_QUERY_SCHEDULING: { + // A pushsrc can by default never operate in pull mode override + // if you want something different. + gst_query_set_scheduling(query, GST_SCHEDULING_FLAG_SEQUENTIAL, 1, -1, 0); + gst_query_add_scheduling_mode(query, GST_PAD_MODE_PUSH); + + ret = TRUE; + break; + } + default: + ret = GST_BASE_SRC_CLASS(parent_class)->query(src, query); + break; + } + return ret; +} + +static GstURIType gst_gs_src_uri_get_type(GType type) { + return GST_URI_SRC; +} + +static const gchar* const* gst_gs_src_uri_get_protocols(GType type) { + static const gchar* protocols[] = {"gs", NULL}; + + return protocols; +} + +static gchar* gst_gs_src_uri_get_uri(GstURIHandler* handler) { + GstGsSrc* src = GST_GS_SRC(handler); + + return g_strdup(src->uri); +} + +static gboolean gst_gs_src_uri_set_uri(GstURIHandler* handler, + const gchar* uri, + GError** err) { + GstGsSrc* src = GST_GS_SRC(handler); + + if (strcmp(uri, "gs://") == 0) { + // Special case for "gs://" as this is used by some applications + // to test with gst_element_make_from_uri if there's an element + // that supports the URI protocol. + gst_gs_src_set_location(src, NULL, NULL); + return TRUE; + } + + return gst_gs_src_set_location(src, uri, err); +} + +static void gst_gs_src_uri_handler_init(gpointer g_iface, gpointer iface_data) { + GstURIHandlerInterface* iface = (GstURIHandlerInterface*)g_iface; + + iface->get_type = gst_gs_src_uri_get_type; + iface->get_protocols = gst_gs_src_uri_get_protocols; + iface->get_uri = gst_gs_src_uri_get_uri; + iface->set_uri = gst_gs_src_uri_set_uri; +}
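`gst_gs_src_set_location` above strips an optional `gs://` prefix and splits the remainder at the first `/` into bucket and object names, rejecting locations without a separator. The same parsing can be sketched without GStreamer; the function name here is hypothetical:

```cpp
#include <string>
#include <utility>

// Standalone sketch of gssrc's location parsing: strip an optional "gs://"
// scheme, then split bucket name from object path at the first '/'.
// Returns a pair of empty strings when no bucket/object split exists.
std::pair<std::string, std::string> split_gs_location(std::string location) {
  const std::string scheme = "gs://";
  if (location.compare(0, scheme.size(), scheme) == 0)
    location = location.substr(scheme.size());

  const size_t slash = location.find('/');
  if (slash == std::string::npos)
    return {"", ""};  // no '/': cannot separate bucket from object

  return {location.substr(0, slash), location.substr(slash + 1)};
}
```

Both `gs://mybucket/myvideo.mkv` and the bare `mybucket/myvideo.mkv` forms are accepted, matching the element's handling of the `location` property.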
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/gstgssrc.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) 2020 Julien Isorce <jisorce@oblong.com> + * + * gstgssrc.c: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_GS_SRC_H__ +#define __GST_GS_SRC_H__ + +#include <gst/base/gstbasesrc.h> +#include <gst/gst.h> + +G_BEGIN_DECLS + +#define GST_TYPE_GS_SRC (gst_gs_src_get_type()) +G_DECLARE_FINAL_TYPE(GstGsSrc, gst_gs_src, GST, GS_SRC, GstBaseSrc) +GST_ELEMENT_REGISTER_DECLARE(gssrc); + +G_END_DECLS +#endif // __GST_GS_SRC_H__
View file
gst-plugins-bad-1.20.1.tar.xz/ext/gs/meson.build
Added
@@ -0,0 +1,22 @@ +gs_sources = [ + 'gstgscommon.cpp', + 'gstgssink.cpp', + 'gstgssrc.cpp', + 'gstgs.cpp', +] + +gs_dep = dependency('google_cloud_cpp_storage', version : '>= 1.25.0', required : get_option('gs')) + +if gs_dep.found() + gstgs = library('gstgs', + gs_sources, + c_args : gst_plugins_bad_args, + cpp_args : gst_plugins_bad_args, + include_directories : [configinc, libsinc], + dependencies : [gstbase_dep, gs_dep], + install : true, + install_dir : plugins_install_dir, + ) + pkgconfig.generate(gstgs, install_dir : plugins_pkgconfig_install_dir) + plugins += [gstgs] +endif
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gsm/gstgsm.c -> gst-plugins-bad-1.20.1.tar.xz/ext/gsm/gstgsm.c
Changed
@@ -29,14 +29,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "gsmenc", GST_RANK_PRIMARY, - GST_TYPE_GSMENC)) - return FALSE; - if (!gst_element_register (plugin, "gsmdec", GST_RANK_PRIMARY, - GST_TYPE_GSMDEC)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (gsmenc, plugin); + ret |= GST_ELEMENT_REGISTER (gsmdec, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
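The rewritten `plugin_init` above is the 1.20 registration idiom: instead of bailing out on the first failed `gst_element_register`, each `GST_ELEMENT_REGISTER` result is OR-accumulated, so the plugin loads if at least one element registered. Stripped of the GStreamer macros, the pattern reduces to this (the `register_*` helpers are stand-ins, not real API):

```c
#include <assert.h>
#include <stdbool.h>

/* Stand-ins for GST_ELEMENT_REGISTER (gsmenc, plugin) etc. */
static bool register_enc(void) { return true;  }
static bool register_dec(void) { return false; /* e.g. registration failed */ }

static bool plugin_init_sketch(void) {
  bool ret = false;
  /* |= keeps registering after a failure and reports success
   * as long as any one element made it in. */
  ret |= register_enc();
  ret |= register_dec();
  return ret;
}
```

The same accumulation appears in every `plugin_init` converted in this revision (gsm, hls, isac).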
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gsm/gstgsmdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/gsm/gstgsmdec.c
Changed
@@ -74,6 +74,8 @@ ); G_DEFINE_TYPE (GstGSMDec, gst_gsmdec, GST_TYPE_AUDIO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (gsmdec, "gsmdec", GST_RANK_PRIMARY, + GST_TYPE_GSMDEC); static void gst_gsmdec_class_init (GstGSMDecClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gsm/gstgsmdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/gsm/gstgsmdec.h
Changed
@@ -61,6 +61,8 @@ GType gst_gsmdec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (gsmdec); + G_END_DECLS #endif /* __GST_GSMDEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gsm/gstgsmenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/gsm/gstgsmenc.c
Changed
@@ -68,6 +68,8 @@ ); G_DEFINE_TYPE (GstGSMEnc, gst_gsmenc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (gsmenc, "gsmenc", GST_RANK_PRIMARY, + GST_TYPE_GSMENC); static void gst_gsmenc_class_init (GstGSMEncClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/gsm/gstgsmenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/gsm/gstgsmenc.h
Changed
@@ -59,6 +59,8 @@ GType gst_gsmenc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (gsmenc); + G_END_DECLS #endif /* __GST_GSMENC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gsthlsdemux.c -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlsdemux.c
Changed
@@ -43,6 +43,7 @@ #include <string.h> #include <gst/base/gsttypefindhelper.h> +#include "gsthlselements.h" #include "gsthlsdemux.h" static GstStaticPadTemplate srctemplate = GST_STATIC_PAD_TEMPLATE ("src_%u", @@ -119,6 +120,8 @@ #define gst_hls_demux_parent_class parent_class G_DEFINE_TYPE (GstHLSDemux, gst_hls_demux, GST_TYPE_ADAPTIVE_DEMUX); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (hlsdemux, "hlsdemux", GST_RANK_PRIMARY, + GST_TYPE_HLS_DEMUX, hls_element_init (plugin)); static void gst_hls_demux_finalize (GObject * obj) @@ -241,6 +244,10 @@ { GstAdaptiveDemux *demux = GST_ADAPTIVE_DEMUX_CAST (hlsdemux); + /* FIXME !!! + * + * No, there isn't a single output :D */ + /* Valid because hlsdemux only has a single output */ if (demux->streams) { GstAdaptiveDemuxStream *stream = demux->streams->data; @@ -494,6 +501,10 @@ return; } + GST_DEBUG_OBJECT (demux, + "is_primary_playlist:%d selected:%d playlist name '%s'", + is_primary_playlist, selected, playlist->name); + stream = gst_adaptive_demux_stream_new (demux, gst_hls_demux_create_pad (hlsdemux)); @@ -508,6 +519,79 @@ hlsdemux_stream->reset_pts = TRUE; } +static GstHLSDemuxStream * +find_adaptive_stream_for_playlist (GstAdaptiveDemux * demux, GstM3U8 * playlist) +{ + GList *tmp; + + GST_DEBUG_OBJECT (demux, "Looking for existing stream for '%s' %s", + playlist->name, playlist->uri); + + for (tmp = demux->streams; tmp; tmp = tmp->next) { + GstHLSDemuxStream *hlsstream = (GstHLSDemuxStream *) tmp->data; + if (hlsstream->playlist == playlist) + return hlsstream; + } + + return NULL; +} + +/* Returns TRUE if the previous and current (to switch to) variant are compatible. 
+ * + * That is: + * * They have the same number of streams + * * The streams are of the same type + */ +static gboolean +new_variant_is_compatible (GstAdaptiveDemux * demux) +{ + GstHLSDemux *hlsdemux = GST_HLS_DEMUX_CAST (demux); + GstHLSVariantStream *previous = hlsdemux->previous_variant; + GstHLSVariantStream *current = hlsdemux->current_variant; + gint i; + + GST_DEBUG_OBJECT (demux, + "Checking whether new variant is compatible with previous"); + + for (i = 0; i < GST_HLS_N_MEDIA_TYPES; ++i) { + GList *mlist = current->media[i]; + if (g_list_length (previous->media[i]) != g_list_length (current->media[i])) { + GST_LOG_OBJECT (demux, "Number of medias for type %s don't match", + gst_hls_media_type_get_name (i)); + return FALSE; + } + + /* Check if all new media were present in previous (if not there are new ones) */ + while (mlist != NULL) { + GstHLSMedia *media = mlist->data; + if (!gst_hls_variant_find_matching_media (previous, media)) { + GST_LOG_OBJECT (demux, + "New stream of type %s present. Variant not compatible", + gst_hls_media_type_get_name (i)); + return FALSE; + } + mlist = mlist->next; + } + + /* Check if all old media are present in current (if not some have gone) */ + mlist = previous->media[i]; + while (mlist != NULL) { + GstHLSMedia *media = mlist->data; + if (!gst_hls_variant_find_matching_media (current, media)) { + GST_LOG_OBJECT (demux, + "Old stream of type %s gone. 
Variant not compatible", + gst_hls_media_type_get_name (i)); + return FALSE; + } + mlist = mlist->next; + } + } + + GST_DEBUG_OBJECT (demux, "Variants are compatible"); + + return TRUE; +} + static gboolean gst_hls_demux_setup_streams (GstAdaptiveDemux * demux) { @@ -520,6 +604,59 @@ return FALSE; } + GST_DEBUG_OBJECT (demux, "Setting up streams"); + if (hlsdemux->streams_aware && hlsdemux->previous_variant && + new_variant_is_compatible (demux)) { + GstHLSDemuxStream *hlsstream; + GST_DEBUG_OBJECT (demux, "Have a previous variant, Re-using streams"); + + /* Carry over the main playlist */ + hlsstream = + find_adaptive_stream_for_playlist (demux, + hlsdemux->previous_variant->m3u8); + if (G_UNLIKELY (hlsstream == NULL)) + goto no_match_error; + + gst_m3u8_unref (hlsstream->playlist); + hlsstream->playlist = gst_m3u8_ref (playlist->m3u8); + + for (i = 0; i < GST_HLS_N_MEDIA_TYPES; ++i) { + GList *mlist = playlist->media[i]; + while (mlist != NULL) { + GstHLSMedia *media = mlist->data; + GstHLSMedia *old_media = + gst_hls_variant_find_matching_media (hlsdemux->previous_variant, + media); + + if (G_UNLIKELY (old_media == NULL)) { + GST_FIXME_OBJECT (demux, "Handle new stream !"); + goto no_match_error; + } + if (!g_strcmp0 (media->uri, old_media->uri)) + GST_DEBUG_OBJECT (demux, "Identical stream !"); + if (media->mtype == GST_HLS_MEDIA_TYPE_AUDIO || + media->mtype == GST_HLS_MEDIA_TYPE_VIDEO) { + hlsstream = + find_adaptive_stream_for_playlist (demux, old_media->playlist); + if (!hlsstream) + goto no_match_error; + + GST_DEBUG_OBJECT (demux, "Found matching stream"); + gst_m3u8_unref (hlsstream->playlist); + hlsstream->playlist = gst_m3u8_ref (media->playlist); + } else { + GST_DEBUG_OBJECT (demux, "Skipping stream of type %s", + gst_hls_media_type_get_name (media->mtype)); + } + + mlist = mlist->next; + } + } + + return TRUE; + } + + /* FIXME : This seems wrong and assumes there's only one stream :( */ gst_hls_demux_clear_all_pending_data (hlsdemux); /* 1 output for 
the main playlist */ @@ -533,22 +670,29 @@ if (media->uri == NULL /* || media->mtype != GST_HLS_MEDIA_TYPE_AUDIO */ ) { /* No uri means this is a placeholder for a stream * contained in another mux */ - GST_LOG_OBJECT (demux, "Skipping stream %s type %d with no URI", - media->name, media->mtype); + GST_LOG_OBJECT (demux, "Skipping stream %s type %s with no URI", + media->name, gst_hls_media_type_get_name (media->mtype)); mlist = mlist->next; continue; } - GST_LOG_OBJECT (demux, "media of type %d - %s, uri: %s", i, - media->name, media->uri); + GST_LOG_OBJECT (demux, "media of type %s - %s, uri: %s", + gst_hls_media_type_get_name (i), media->name, media->uri); create_stream_for_playlist (demux, media->playlist, FALSE, - (media->mtype == GST_HLS_MEDIA_TYPE_VIDEO || - media->mtype == GST_HLS_MEDIA_TYPE_AUDIO)); + (media->mtype == GST_HLS_MEDIA_TYPE_VIDEO + || media->mtype == GST_HLS_MEDIA_TYPE_AUDIO)); mlist = mlist->next; } } return TRUE; + +no_match_error: + { + /* POST ERROR MESSAGE */ + GST_ERROR_OBJECT (demux, "Should not happen ! 
Could not find old stream"); + return FALSE; + } } static const gchar * @@ -588,15 +732,27 @@ gst_hls_variant_find_matching_media (variant, old_media); if (new_media) { + GST_LOG_OBJECT (hlsdemux, "Found matching GstHLSMedia"); + GST_LOG_OBJECT (hlsdemux, "old_media '%s' '%s'", old_media->name, + old_media->uri); + GST_LOG_OBJECT (hlsdemux, "new_media '%s' '%s'", new_media->name, + new_media->uri); new_media->playlist->sequence = old_media->playlist->sequence; new_media->playlist->sequence_position = old_media->playlist->sequence_position; + } else { + GST_LOG_OBJECT (hlsdemux, + "Didn't find a matching variant for '%s' '%s'", old_media->name, + old_media->uri); } mlist = mlist->next; } } - gst_hls_variant_stream_unref (hlsdemux->current_variant); + if (hlsdemux->previous_variant) + gst_hls_variant_stream_unref (hlsdemux->previous_variant); + /* Steal the reference */ + hlsdemux->previous_variant = hlsdemux->current_variant; } hlsdemux->current_variant = gst_hls_variant_stream_ref (variant); @@ -855,8 +1011,8 @@ return GST_FLOW_OK; } - GST_DEBUG_OBJECT (hlsdemux, "Typefind result: %" GST_PTR_FORMAT " prob:%d", - caps, prob); + GST_DEBUG_OBJECT (stream->pad, + "Typefind result: %" GST_PTR_FORMAT " prob:%d", caps, prob); hls_stream->stream_type = caps_to_reader (caps); gst_hlsdemux_tsreader_set_type (&hls_stream->tsreader, @@ -1124,7 +1280,7 @@ g_free (hlsdemux_stream->current_key); hlsdemux_stream->current_key = g_strdup (file->key); g_free (hlsdemux_stream->current_iv); - hlsdemux_stream->current_iv = g_memdup (file->iv, sizeof (file->iv)); + hlsdemux_stream->current_iv = g_memdup2 (file->iv, sizeof (file->iv)); g_free (stream->fragment.uri); stream->fragment.uri = g_strdup (file->uri); @@ -1192,7 +1348,15 @@ gst_hls_variant_stream_unref (demux->current_variant); demux->current_variant = NULL; } + if (demux->previous_variant != NULL) { + gst_hls_variant_stream_unref (demux->previous_variant); + demux->previous_variant = NULL; + } demux->srcpad_counter = 0; + 
demux->streams_aware = GST_OBJECT_PARENT (demux) + && GST_OBJECT_FLAG_IS_SET (GST_OBJECT_PARENT (demux), + GST_BIN_FLAG_STREAMS_AWARE); + GST_DEBUG_OBJECT (demux, "Streams aware : %d", demux->streams_aware); gst_hls_demux_clear_all_pending_data (demux); GST_M3U8_CLIENT_UNLOCK (hlsdemux->client);
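The new `new_variant_is_compatible` in the hunk above declares two variants compatible when, per media type, both have the same number of streams and each stream in one variant has a match in the other (checked in both directions, so nothing appeared and nothing disappeared). On plain integer stream IDs — a deliberate simplification of the `GstHLSMedia` matching — the same two-pass check looks like:

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

static bool contains(const int *set, size_t n, int id) {
  for (size_t i = 0; i < n; i++)
    if (set[i] == id)
      return true;
  return false;
}

/* Compatible iff same count and each id of one list appears in the other,
 * mirroring the two membership passes in new_variant_is_compatible. */
static bool variants_compatible(const int *prev, size_t np,
                                const int *cur, size_t nc) {
  if (np != nc)
    return false;               /* stream counts differ */
  for (size_t i = 0; i < nc; i++)
    if (!contains(prev, np, cur[i]))
      return false;             /* a new stream appeared */
  for (size_t i = 0; i < np; i++)
    if (!contains(cur, nc, prev[i]))
      return false;             /* an old stream is gone */
  return true;
}
```

When the check passes (and the parent bin is streams-aware), the demuxer re-uses its existing streams across the variant switch instead of tearing them down.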
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gsthlsdemux.h -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlsdemux.h
Changed
@@ -27,7 +27,6 @@ #include <gst/gst.h> #include "m3u8.h" -#include "gsthls.h" #include <gst/adaptivedemux/gstadaptivedemux.h> #if defined(HAVE_OPENSSL) #include <openssl/evp.h> @@ -147,6 +146,10 @@ GstHLSMasterPlaylist *master; GstHLSVariantStream *current_variant; + /* The previous variant, used to transition streams over */ + GstHLSVariantStream *previous_variant; + + gboolean streams_aware; }; struct _GstHLSDemuxClass
View file
gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlselement.c
Added
@@ -0,0 +1,18 @@ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gsthlselements.h" + +GST_DEBUG_CATEGORY (hls_debug); + +void +hls_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + if (g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (hls_debug, "hls", 0, "HTTP Live Streaming (HLS)"); + g_once_init_leave (&res, TRUE); + } +}
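`hls_element_init` above wraps the debug-category setup in `g_once_init_enter`/`g_once_init_leave` so that hlsdemux, hlssink, and hlssink2 can each call it from their register functions while the body runs only once. A single-threaded sketch of the same guard in plain C (GLib's version is additionally thread-safe and atomic, which this is not):

```c
#include <assert.h>
#include <stdbool.h>

static int init_count = 0;      /* observable side effect for the sketch */

static void hls_element_init_sketch(void) {
  /* g_once_init_enter guards this flag check atomically in the real code */
  static bool done = false;
  if (!done) {
    init_count++;               /* stands in for GST_DEBUG_CATEGORY_INIT */
    done = true;
  }
}
```

Each per-element `_do_init` (see the hlssink/hlssink2 hunks below in this revision) can therefore call the init unconditionally.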
View file
gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlselements.h
Added
@@ -0,0 +1,18 @@ +#ifndef __GST_HLS_ELEMENT_H__ +#define __GST_HLS_ELEMENT_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +void hls_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (hlsdemux); +GST_ELEMENT_REGISTER_DECLARE (hlssink); +GST_ELEMENT_REGISTER_DECLARE (hlssink2); + +GST_DEBUG_CATEGORY_EXTERN (hls_debug); + +G_END_DECLS + +#endif /* __GST_HLS_ELEMENT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gsthlsplugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlsplugin.c
Changed
@@ -1,36 +1,24 @@ + #ifdef HAVE_CONFIG_H # include <config.h> #endif -#include <gst/gst.h> - -#include "gsthls.h" -#include "gsthlsdemux.h" -#include "gsthlssink.h" -#include "gsthlssink2.h" +#include "gsthlselements.h" -GST_DEBUG_CATEGORY (hls_debug); static gboolean -hls_init (GstPlugin * plugin) +plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (hls_debug, "hls", 0, "HTTP Live Streaming (HLS)"); - - if (!gst_element_register (plugin, "hlsdemux", GST_RANK_PRIMARY, - GST_TYPE_HLS_DEMUX) || FALSE) - return FALSE; - - if (!gst_hls_sink_plugin_init (plugin)) - return FALSE; - - if (!gst_hls_sink2_plugin_init (plugin)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (hlsdemux, plugin); + ret |= GST_ELEMENT_REGISTER (hlssink, plugin); + ret |= GST_ELEMENT_REGISTER (hlssink2, plugin); + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, hls, "HTTP Live Streaming (HLS)", - hls_init, VERSION, GST_LICENSE, PACKAGE_NAME, GST_PACKAGE_ORIGIN) + plugin_init, VERSION, GST_LICENSE, PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gsthlssink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlssink.c
Changed
@@ -33,6 +33,7 @@ #include "config.h" #endif +#include "gsthlselements.h" #include "gsthlssink.h" #include <gst/pbutils/pbutils.h> #include <gst/video/video.h> @@ -70,6 +71,11 @@ #define gst_hls_sink_parent_class parent_class G_DEFINE_TYPE (GstHlsSink, gst_hls_sink, GST_TYPE_BIN); +#define _do_init \ + hls_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_hls_sink_debug, "hlssink", 0, "HlsSink"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (hlssink, "hlssink", GST_RANK_NONE, + GST_TYPE_HLS_SINK, _do_init); static void gst_hls_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * spec); @@ -207,8 +213,7 @@ if (sink->playlist) gst_m3u8_playlist_free (sink->playlist); sink->playlist = - gst_m3u8_playlist_new (GST_M3U8_PLAYLIST_VERSION, sink->playlist_length, - FALSE); + gst_m3u8_playlist_new (GST_M3U8_PLAYLIST_VERSION, sink->playlist_length); sink->state = GST_M3U8_PLAYLIST_RENDER_INIT; } @@ -580,11 +585,3 @@ return ret; } - -gboolean -gst_hls_sink_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_hls_sink_debug, "hlssink", 0, "HlsSink"); - return gst_element_register (plugin, "hlssink", GST_RANK_NONE, - gst_hls_sink_get_type ()); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gsthlssink2.c -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gsthlssink2.c
Changed
@@ -19,14 +19,24 @@ */ /** - * SECTION:element-hlssink - * @title: hlssink + * SECTION:element-hlssink2 + * @title: hlssink2 * - * HTTP Live Streaming sink/server + * HTTP Live Streaming sink/server. Unlike the old hlssink which took a muxed + * MPEG-TS stream as input, this element takes elementary audio and video + * streams as input and handles the muxing internally. This allows hlssink2 + * to make better decisions as to when to start a new fragment and also works + * better with input streams where there isn't an encoder element upstream + * that can generate keyframes on demand as needed. + * + * This element only writes fragments and a playlist file into a specified + * directory, it does not contain an actual HTTP server to serve these files. + * Just point an external webserver to the directory with the playlist and + * fragment files. * * ## Example launch line * |[ - * gst-launch-1.0 videotestsrc is-live=true ! x264enc ! hlssink max-files=5 + * gst-launch-1.0 videotestsrc is-live=true ! x264enc ! h264parse ! 
hlssink2 max-files=5 * ]| * */ @@ -34,6 +44,7 @@ #include "config.h" #endif +#include "gsthlselements.h" #include "gsthlssink2.h" #include <gst/pbutils/pbutils.h> #include <gst/video/video.h> @@ -87,6 +98,11 @@ #define gst_hls_sink2_parent_class parent_class G_DEFINE_TYPE (GstHlsSink2, gst_hls_sink2, GST_TYPE_BIN); +#define _do_init \ + hls_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_hls_sink2_debug, "hlssink2", 0, "HlsSink2"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (hlssink2, "hlssink2", GST_RANK_NONE, + GST_TYPE_HLS_SINK2, _do_init); static void gst_hls_sink2_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * spec); @@ -186,7 +202,7 @@ gst_element_class_add_static_pad_template (element_class, &audio_template); gst_element_class_set_static_metadata (element_class, - "HTTP Live Streaming sink", "Sink", "HTTP Live Streaming sink", + "HTTP Live Streaming sink", "Sink/Muxer", "HTTP Live Streaming sink", "Alessandro Decina <alessandro.d@gmail.com>, " "Sebastian Dröge <sebastian@centricular.com>"); @@ -361,8 +377,7 @@ if (sink->playlist) gst_m3u8_playlist_free (sink->playlist); sink->playlist = - gst_m3u8_playlist_new (GST_M3U8_PLAYLIST_VERSION, sink->playlist_length, - FALSE); + gst_m3u8_playlist_new (GST_M3U8_PLAYLIST_VERSION, sink->playlist_length); g_queue_foreach (&sink->old_locations, (GFunc) g_free, NULL); g_queue_clear (&sink->old_locations); @@ -510,7 +525,7 @@ is_audio = strcmp (templ->name_template, "audio") == 0; peer = - gst_element_get_request_pad (sink->splitmuxsink, + gst_element_request_pad_simple (sink->splitmuxsink, is_audio ? "audio_0" : "video"); if (!peer) return NULL; @@ -673,11 +688,3 @@ break; } } - -gboolean -gst_hls_sink2_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_hls_sink2_debug, "hlssink2", 0, "HlsSink2"); - return gst_element_register (plugin, "hlssink2", GST_RANK_NONE, - gst_hls_sink2_get_type ()); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gstm3u8playlist.c -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gstm3u8playlist.c
Changed
@@ -21,8 +21,8 @@ #include <glib.h> -#include "gsthls.h" #include "gstm3u8playlist.h" +#include "gsthlselements.h" #define GST_CAT_DEFAULT hls_debug @@ -69,14 +69,13 @@ } GstM3U8Playlist * -gst_m3u8_playlist_new (guint version, guint window_size, gboolean allow_cache) +gst_m3u8_playlist_new (guint version, guint window_size) { GstM3U8Playlist *playlist; playlist = g_new0 (GstM3U8Playlist, 1); playlist->version = version; playlist->window_size = window_size; - playlist->allow_cache = allow_cache; playlist->type = GST_M3U8_PLAYLIST_TYPE_EVENT; playlist->end_list = FALSE; playlist->entries = g_queue_new (); @@ -155,9 +154,6 @@ g_string_append_printf (playlist_str, "#EXT-X-VERSION:%d\n", playlist->version); - g_string_append_printf (playlist_str, "#EXT-X-ALLOW-CACHE:%s\n", - playlist->allow_cache ? "YES" : "NO"); - g_string_append_printf (playlist_str, "#EXT-X-MEDIA-SEQUENCE:%d\n", playlist->sequence_number - playlist->entries->length);
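With the `allow_cache` field dropped, the rendered playlist header in this file keeps only the `#EXT-X-VERSION` and `#EXT-X-MEDIA-SEQUENCE` tags shown in the hunk (the `#EXT-X-ALLOW-CACHE` tag was removed from the HLS specification). As the hunk shows, the media sequence is the running sequence number minus the number of entries still in the window; a stand-alone sketch of just these two header lines:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Render the header tags kept by the playlist writer after the
 * EXT-X-ALLOW-CACHE removal. */
static int render_header(char *buf, size_t len, int version,
                         int sequence_number, int n_entries) {
  return snprintf(buf, len,
                  "#EXT-X-VERSION:%d\n"
                  "#EXT-X-MEDIA-SEQUENCE:%d\n",
                  version, sequence_number - n_entries);
}
```

This is only the fragment of the render path touched by the diff, not the whole playlist writer.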
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/gstm3u8playlist.h -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/gstm3u8playlist.h
Changed
@@ -31,7 +31,6 @@ struct _GstM3U8Playlist { guint version; - gboolean allow_cache; gint window_size; gint type; gboolean end_list; @@ -49,9 +48,8 @@ } GstM3U8PlaylistRenderState; -GstM3U8Playlist * gst_m3u8_playlist_new (guint version, - guint window_size, - gboolean allow_cache); +GstM3U8Playlist * gst_m3u8_playlist_new (guint version, + guint window_size); void gst_m3u8_playlist_free (GstM3U8Playlist * playlist);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/m3u8.c -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/m3u8.c
Changed
@@ -26,8 +26,8 @@ #include <glib.h> #include <string.h> -#include "gsthls.h" #include "m3u8.h" +#include "gsthlselements.h" #define GST_CAT_DEFAULT hls_debug @@ -1300,9 +1300,9 @@ return GST_HLS_MEDIA_TYPE_INVALID; } -#define GST_HLS_MEDIA_TYPE_NAME(mtype) gst_m3u8_hls_media_type_get_nick(mtype) -static inline const gchar * -gst_m3u8_hls_media_type_get_nick (GstHLSMediaType mtype) +#define GST_HLS_MEDIA_TYPE_NAME(mtype) gst_hls_media_type_get_name(mtype) +const gchar * +gst_hls_media_type_get_name (GstHLSMediaType mtype) { static const gchar *nicks[GST_HLS_N_MEDIA_TYPES] = { "audio", "video", "subtitle", "closed-captions"
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/m3u8.h -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/m3u8.h
Changed
@@ -24,7 +24,7 @@ #ifndef __M3U8_H__ #define __M3U8_H__ -#include <glib.h> +#include <gst/gst.h> G_BEGIN_DECLS @@ -176,6 +176,8 @@ void gst_hls_media_unref (GstHLSMedia * media); +const gchar * gst_hls_media_type_get_name (GstHLSMediaType mtype); + struct _GstHLSVariantStream { gchar *name; /* This will be the "name" of the playlist, the original
View file
gst-plugins-bad-1.18.6.tar.xz/ext/hls/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/hls/meson.build
Changed
@@ -1,6 +1,7 @@ hls_sources = [ 'gsthlsdemux.c', 'gsthlsdemux-util.c', + 'gsthlselement.c', 'gsthlsplugin.c', 'gsthlssink.c', 'gsthlssink2.c',
View file
gst-plugins-bad-1.18.6.tar.xz/ext/iqa/iqa.c -> gst-plugins-bad-1.20.1.tar.xz/ext/iqa/iqa.c
Changed
@@ -175,7 +175,10 @@ /* GstIqa */ #define gst_iqa_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstIqa, gst_iqa, GST_TYPE_VIDEO_AGGREGATOR, - G_IMPLEMENT_INTERFACE (GST_TYPE_CHILD_PROXY, gst_iqa_child_proxy_init)); + G_IMPLEMENT_INTERFACE (GST_TYPE_CHILD_PROXY, gst_iqa_child_proxy_init); + GST_DEBUG_CATEGORY_INIT (gst_iqa_debug, "iqa", 0, "iqa"); + ); +GST_ELEMENT_REGISTER_DEFINE (iqa, "iqa", GST_RANK_PRIMARY, GST_TYPE_IQA); #ifdef HAVE_DSSIM inline static unsigned char @@ -510,11 +513,10 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_iqa_debug, "iqa", 0, "iqa"); - - return gst_element_register (plugin, "iqa", GST_RANK_PRIMARY, GST_TYPE_IQA); + return GST_ELEMENT_REGISTER (iqa, plugin); } +// FIXME: effective iqa plugin license should be AGPL3+ ! GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, iqa,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/iqa/iqa.h -> gst-plugins-bad-1.20.1.tar.xz/ext/iqa/iqa.h
Changed
@@ -61,6 +61,8 @@ GType gst_iqa_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (iqa); + G_END_DECLS #endif /* __GST_IQA_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/iqa/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/iqa/meson.build
Changed
@@ -1,4 +1,9 @@ -dssim_dep = dependency('dssim', required : get_option('iqa'), +iqa_opt = get_option('iqa').require(gpl_allowed, error_message: ''' + Plugin iqa explicitly required via options but (A)GPL-licensed plugins disabled via options. + Pass option -Dgpl=enabled to Meson to allow (A)GPL-licensed plugins to be built. + ''') + +dssim_dep = dependency('dssim', required: iqa_opt, fallback: ['dssim', 'dssim_dep']) if dssim_dep.found()
View file
gst-plugins-bad-1.20.1.tar.xz/ext/isac
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisac.c
Added
@@ -0,0 +1,54 @@ +/* iSAC plugin + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +/** + * plugin-isac: + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <string.h> + +#include <gst/gst.h> + +#include "gstisacenc.h" +#include "gstisacdec.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (isacenc, plugin); + ret |= GST_ELEMENT_REGISTER (isacdec, plugin); + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + isac, + "iSAC plugin", plugin_init, VERSION, "LGPL", PACKAGE_NAME, + GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisacdec.c
Added
@@ -0,0 +1,297 @@ +/* iSAC decoder + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +/** + * SECTION:element-isacdec + * @title: isacdec + * @short_description: iSAC audio decoder + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstisacdec.h" +#include "gstisacutils.h" + +#include <modules/audio_coding/codecs/isac/main/include/isac.h> + +GST_DEBUG_CATEGORY_STATIC (isacdec_debug); +#define GST_CAT_DEFAULT isacdec_debug + +#define SAMPLE_SIZE 2 /* 16-bits samples */ +#define MAX_OUTPUT_SAMPLES 960 /* decoder produces max 960 samples */ +#define MAX_OUTPUT_SIZE (SAMPLE_SIZE * MAX_OUTPUT_SAMPLES) + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/isac, " + "rate = (int) { 16000, 32000 }, " "channels = (int) 1") + ); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/x-raw, " + "format = (string) " GST_AUDIO_NE (S16) ", " + "rate = (int) { 16000, 32000 }, " + "layout = (string) interleaved, " "channels = (int) 1") + ); + 
+struct _GstIsacDec +{ + /*< private > */ + GstAudioDecoder parent; + + ISACStruct *isac; + + /* properties */ +}; + +#define gst_isacdec_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstIsacDec, gst_isacdec, + GST_TYPE_AUDIO_DECODER, + GST_DEBUG_CATEGORY_INIT (isacdec_debug, "isacdec", 0, + "debug category for isacdec element")); +GST_ELEMENT_REGISTER_DEFINE (isacdec, "isacdec", GST_RANK_PRIMARY, + GST_TYPE_ISACDEC); + +static gboolean +gst_isacdec_start (GstAudioDecoder * dec) +{ + GstIsacDec *self = GST_ISACDEC (dec); + gint16 ret; + + g_assert (!self->isac); + ret = WebRtcIsac_Create (&self->isac); + CHECK_ISAC_RET (ret, Create); + + return TRUE; +} + +static gboolean +gst_isacdec_stop (GstAudioDecoder * dec) +{ + GstIsacDec *self = GST_ISACDEC (dec); + + if (self->isac) { + gint16 ret; + + ret = WebRtcIsac_Free (self->isac); + CHECK_ISAC_RET (ret, Free); + self->isac = NULL; + } + + return TRUE; +} + +static gboolean +gst_isacdec_set_format (GstAudioDecoder * dec, GstCaps * input_caps) +{ + GstIsacDec *self = GST_ISACDEC (dec); + GstAudioInfo output_format; + gint16 ret; + gboolean result; + GstStructure *s; + gint rate, channels; + GstCaps *output_caps; + + GST_DEBUG_OBJECT (self, "input caps: %" GST_PTR_FORMAT, input_caps); + + s = gst_caps_get_structure (input_caps, 0); + if (!s) + return FALSE; + + if (!gst_structure_get_int (s, "rate", &rate)) { + GST_ERROR_OBJECT (self, "'rate' missing in input caps: %" GST_PTR_FORMAT, + input_caps); + return FALSE; + } + + if (!gst_structure_get_int (s, "channels", &channels)) { + GST_ERROR_OBJECT (self, + "'channels' missing in input caps: %" GST_PTR_FORMAT, input_caps); + return FALSE; + } + + gst_audio_info_set_format (&output_format, GST_AUDIO_FORMAT_S16LE, rate, + channels, NULL); + + output_caps = gst_audio_info_to_caps (&output_format); + GST_DEBUG_OBJECT (self, "output caps: %" GST_PTR_FORMAT, output_caps); + gst_caps_unref (output_caps); + + ret = WebRtcIsac_SetDecSampRate (self->isac, rate); + 
CHECK_ISAC_RET (ret, SetDecSampleRate); + + WebRtcIsac_DecoderInit (self->isac); + + result = gst_audio_decoder_set_output_format (dec, &output_format); + + gst_audio_decoder_set_plc_aware (dec, TRUE); + + return result; +} + +static GstFlowReturn +gst_isacdec_plc (GstIsacDec * self, GstClockTime duration) +{ + GstAudioDecoder *dec = GST_AUDIO_DECODER (self); + guint nb_plc_frames; + GstBuffer *output; + GstMapInfo map_write; + size_t ret; + + /* Decoder produces 30 ms PLC frames */ + nb_plc_frames = duration / (30 * GST_MSECOND); + + GST_DEBUG_OBJECT (self, + "GAP of %" GST_TIME_FORMAT " detected, request PLC for %d frames", + GST_TIME_ARGS (duration), nb_plc_frames); + + output = + gst_audio_decoder_allocate_output_buffer (dec, + nb_plc_frames * MAX_OUTPUT_SIZE); + + if (!gst_buffer_map (output, &map_write, GST_MAP_WRITE)) { + GST_ERROR_OBJECT (self, "Failed to map output buffer"); + gst_buffer_unref (output); + return GST_FLOW_ERROR; + } + + ret = + WebRtcIsac_DecodePlc (self->isac, (gint16 *) map_write.data, + nb_plc_frames); + + gst_buffer_unmap (output, &map_write); + + if (ret < 0) { + /* error */ + gint16 code = WebRtcIsac_GetErrorCode (self->isac); + GST_WARNING_OBJECT (self, "Failed to produce PLC: %s (%d)", + isac_error_code_to_str (code), code); + gst_buffer_unref (output); + return GST_FLOW_ERROR; + } else if (ret == 0) { + GST_DEBUG_OBJECT (self, "Decoder didn't produce any PLC frame"); + gst_buffer_unref (output); + return GST_FLOW_OK; + } + + gst_buffer_set_size (output, ret * SAMPLE_SIZE); + + GST_LOG_OBJECT (self, "Produced %" G_GSIZE_FORMAT " PLC samples", ret); + + return gst_audio_decoder_finish_frame (dec, output, 1); +} + +static GstFlowReturn +gst_isacdec_handle_frame (GstAudioDecoder * dec, GstBuffer * input) +{ + GstIsacDec *self = GST_ISACDEC (dec); + GstMapInfo map_read, map_write; + GstBuffer *output; + gint16 ret, speech_type[1]; + gsize input_size; + + /* Can't drain the decoder */ + if (!input) + return GST_FLOW_OK; + + if 
(!gst_buffer_get_size (input)) { + /* Base class detected a gap in the stream, try to do PLC */ + return gst_isacdec_plc (self, GST_BUFFER_DURATION (input)); + } + + if (!gst_buffer_map (input, &map_read, GST_MAP_READ)) { + GST_ELEMENT_ERROR (self, RESOURCE, READ, ("Failed to map input buffer"), + (NULL)); + return GST_FLOW_ERROR; + } + + input_size = map_read.size; + + output = gst_audio_decoder_allocate_output_buffer (dec, MAX_OUTPUT_SIZE); + if (!gst_buffer_map (output, &map_write, GST_MAP_WRITE)) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, ("Failed to map output buffer"), + (NULL)); + gst_buffer_unref (output); + gst_buffer_unmap (input, &map_read); + return GST_FLOW_ERROR; + } + + ret = WebRtcIsac_Decode (self->isac, map_read.data, map_read.size, + (gint16 *) map_write.data, speech_type); + + gst_buffer_unmap (input, &map_read); + gst_buffer_unmap (output, &map_write); + + if (ret < 0) { + /* error */ + gint16 code = WebRtcIsac_GetErrorCode (self->isac); + GST_WARNING_OBJECT (self, "Failed to decode: %s (%d)", + isac_error_code_to_str (code), code); + gst_buffer_unref (output); + /* Give a chance to decode next frames */ + return GST_FLOW_OK; + } else if (ret == 0) { + GST_DEBUG_OBJECT (self, "Decoder didn't produce any frame"); + gst_buffer_unref (output); + output = NULL; + } else { + gst_buffer_set_size (output, ret * SAMPLE_SIZE); + } + + GST_LOG_OBJECT (self, "Decoded %d samples from %" G_GSIZE_FORMAT " bytes", + ret, input_size); + + return gst_audio_decoder_finish_frame (dec, output, 1); +} + +static void +gst_isacdec_class_init (GstIsacDecClass * klass) +{ + GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); + GstAudioDecoderClass *base_class = GST_AUDIO_DECODER_CLASS (klass); + + base_class->start = GST_DEBUG_FUNCPTR (gst_isacdec_start); + base_class->stop = GST_DEBUG_FUNCPTR (gst_isacdec_stop); + base_class->set_format = GST_DEBUG_FUNCPTR (gst_isacdec_set_format); + base_class->handle_frame = GST_DEBUG_FUNCPTR 
(gst_isacdec_handle_frame); + + gst_element_class_set_static_metadata (gstelement_class, "iSAC decoder", + "Codec/Decoder/Audio", + "iSAC audio decoder", + "Guillaume Desmottes <guillaume.desmottes@collabora.com>"); + + gst_element_class_add_static_pad_template (gstelement_class, &sink_template); + gst_element_class_add_static_pad_template (gstelement_class, &src_template); +} + +static void +gst_isacdec_init (GstIsacDec * self) +{ + self->isac = NULL; +}
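The PLC path in gstisacdec.c above works in whole 30 ms frames: the gap duration is divided by the frame length, and the output buffer is later trimmed to `ret * SAMPLE_SIZE` bytes. A minimal standalone sketch of that arithmetic (`GST_MSECOND` is one millisecond in nanoseconds; the `SAMPLE_SIZE` value here is an illustrative assumption, not necessarily the plugin's constant):

```c
#include <stdint.h>
#include <stddef.h>

#define GST_MSECOND  ((uint64_t) 1000000)   /* 1 ms expressed in nanoseconds */
#define SAMPLE_SIZE  2                      /* 16-bit PCM samples (assumption) */

/* Number of whole 30 ms PLC frames covering a gap of `duration_ns`,
 * mirroring `nb_plc_frames = duration / (30 * GST_MSECOND)` above. */
static inline unsigned
plc_frames_for_gap (uint64_t duration_ns)
{
  return (unsigned) (duration_ns / (30 * GST_MSECOND));
}

/* Bytes occupied by `nsamples` decoded samples, as used when trimming
 * the output buffer with gst_buffer_set_size (output, ret * SAMPLE_SIZE). */
static inline size_t
plc_output_bytes (size_t nsamples)
{
  return nsamples * SAMPLE_SIZE;
}
```

Note the integer division: a gap of 45 ms requests only one 30 ms PLC frame, matching the truncating arithmetic in the decoder.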
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisacdec.h
Added
@@ -0,0 +1,36 @@ +/* iSAC decoder + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +#ifndef __GST_ISAC_DEC_H__ +#define __GST_ISAC_DEC_H__ + +#include <gst/audio/audio.h> + +G_BEGIN_DECLS + +#define GST_TYPE_ISACDEC gst_isacdec_get_type () +G_DECLARE_FINAL_TYPE(GstIsacDec, gst_isacdec, GST, ISACDEC, GstAudioDecoder) + +GST_ELEMENT_REGISTER_DECLARE (isacdec); + +G_END_DECLS + +#endif /* __GST_ISAC_DEC_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisacenc.c
Added
@@ -0,0 +1,437 @@ +/* iSAC encoder + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +/** + * SECTION:element-isacenc + * @title: isacenc + * @short_description: iSAC audio encoder + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstisacenc.h" +#include "gstisacutils.h" + +#include <modules/audio_coding/codecs/isac/main/include/isac.h> + +GST_DEBUG_CATEGORY_STATIC (isacenc_debug); +#define GST_CAT_DEFAULT isacenc_debug + +/* Buffer size used in the simpleKenny.c test app from webrtc */ +#define OUTPUT_BUFFER_SIZE 1200 + +#define GST_TYPE_ISACENC_OUTPUT_FRAME_LEN (gst_isacenc_output_frame_len_get_type ()) +static GType +gst_isacenc_output_frame_len_get_type (void) +{ + static GType qtype = 0; + + if (qtype == 0) { + static const GEnumValue values[] = { + {30, "30 ms", "30 ms"}, + {60, "60 ms", "60 ms, only usable in wideband mode (16 kHz)"}, + {0, NULL, NULL} + }; + + qtype = g_enum_register_static ("GstIsacEncOutputFrameLen", values); + } + return qtype; +} + +enum +{ + PROP_0, + PROP_OUTPUT_FRAME_LEN, + PROP_BITRATE, + PROP_MAX_PAYLOAD_SIZE, + PROP_MAX_RATE, +}; + +#define 
GST_ISACENC_OUTPUT_FRAME_LEN_DEFAULT (30) +#define GST_ISACENC_BITRATE_DEFAULT (32000) +#define GST_ISACENC_MAX_PAYLOAD_SIZE_DEFAULT (-1) +#define GST_ISACENC_MAX_RATE_DEFAULT (-1) + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/x-raw, " + "format = (string) " GST_AUDIO_NE (S16) ", " + "rate = (int) { 16000, 32000 }, " + "layout = (string) interleaved, " "channels = (int) 1") + ); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/isac, " + "rate = (int) { 16000, 32000 }, " "channels = (int) 1") + ); + +typedef enum +{ + ENCODER_MODE_WIDEBAND, /* 16 kHz */ + ENCODER_MODE_SUPER_WIDEBAND, /* 32 kHz */ +} EncoderMode; + +struct _GstIsacEnc +{ + /*< private > */ + GstAudioEncoder parent; + + ISACStruct *isac; + EncoderMode mode; + gint samples_per_frame; /* number of samples in one input frame */ + gsize frame_size; /* size, in bytes, of one input frame */ + guint nb_processed_input_frames; /* number of input frames processed by the encoder since the last produced encoded data */ + + /* properties */ + gint output_frame_len; + gint bitrate; + gint max_payload_size; + gint max_rate; +}; + +#define gst_isacenc_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstIsacEnc, gst_isacenc, + GST_TYPE_AUDIO_ENCODER, + GST_DEBUG_CATEGORY_INIT (isacenc_debug, "isacenc", 0, + "debug category for isacenc element")); +GST_ELEMENT_REGISTER_DEFINE (isacenc, "isacenc", GST_RANK_PRIMARY, + GST_TYPE_ISACENC); + +static gboolean +gst_isacenc_start (GstAudioEncoder * enc) +{ + GstIsacEnc *self = GST_ISACENC (enc); + gint16 ret; + + g_assert (!self->isac); + ret = WebRtcIsac_Create (&self->isac); + CHECK_ISAC_RET (ret, Create); + + self->nb_processed_input_frames = 0; + + return TRUE; +} + +static gboolean +gst_isacenc_stop (GstAudioEncoder * enc) +{ + GstIsacEnc *self = GST_ISACENC (enc); + + if (self->isac) { + 
gint16 ret; + + ret = WebRtcIsac_Free (self->isac); + CHECK_ISAC_RET (ret, Free); + self->isac = NULL; + } + + return TRUE; +} + +static gboolean +gst_isacenc_set_format (GstAudioEncoder * enc, GstAudioInfo * info) +{ + GstIsacEnc *self = GST_ISACENC (enc); + GstCaps *input_caps, *output_caps; + gint16 ret; + gboolean result; + + switch (GST_AUDIO_INFO_RATE (info)) { + case 16000: + self->mode = ENCODER_MODE_WIDEBAND; + break; + case 32000: + self->mode = ENCODER_MODE_SUPER_WIDEBAND; + break; + default: + g_assert_not_reached (); + return FALSE; + } + + input_caps = gst_audio_info_to_caps (info); + output_caps = gst_caps_new_simple ("audio/isac", + "channels", G_TYPE_INT, GST_AUDIO_INFO_CHANNELS (info), + "rate", G_TYPE_INT, GST_AUDIO_INFO_RATE (info), NULL); + + GST_DEBUG_OBJECT (self, "input caps: %" GST_PTR_FORMAT, input_caps); + GST_DEBUG_OBJECT (self, "output caps: %" GST_PTR_FORMAT, output_caps); + + ret = WebRtcIsac_SetEncSampRate (self->isac, GST_AUDIO_INFO_RATE (info)); + CHECK_ISAC_RET (ret, SetEncSampleRate); + + /* TODO: add support for automatically adjusted bit rate and frame + * length (codingMode = 0). 
*/ + ret = WebRtcIsac_EncoderInit (self->isac, 1); + CHECK_ISAC_RET (ret, EncoderInit); + + if (self->mode == ENCODER_MODE_SUPER_WIDEBAND && self->output_frame_len != 30) { + GST_ERROR_OBJECT (self, + "Only output-frame-len=30 is supported in super-wideband mode (32 kHz)"); + return FALSE; + } + + if (self->mode == ENCODER_MODE_WIDEBAND && (self->bitrate < 10000 + || self->bitrate > 32000)) { + GST_ERROR_OBJECT (self, + "bitrate range is 10000 to 32000 bps in wideband mode (16 kHz)"); + return FALSE; + } else if (self->mode == ENCODER_MODE_SUPER_WIDEBAND && (self->bitrate < 10000 + || self->bitrate > 56000)) { + GST_ERROR_OBJECT (self, + "bitrate range is 10000 to 56000 bps in super-wideband mode (32 kHz)"); + return FALSE; + } + + ret = WebRtcIsac_Control (self->isac, self->bitrate, self->output_frame_len); + CHECK_ISAC_RET (ret, Control); + + if (self->max_payload_size != GST_ISACENC_MAX_PAYLOAD_SIZE_DEFAULT) { + GST_DEBUG_OBJECT (self, "set max payload size to %d bytes", + self->max_payload_size); + ret = WebRtcIsac_SetMaxPayloadSize (self->isac, self->max_payload_size); + CHECK_ISAC_RET (ret, SetMaxPayloadSize); + } + + if (self->max_rate != GST_ISACENC_MAX_RATE_DEFAULT) { + GST_DEBUG_OBJECT (self, "set max rate to %d bits/sec", self->max_rate); + ret = WebRtcIsac_SetMaxRate (self->isac, self->max_rate); + CHECK_ISAC_RET (ret, SetMaxRate); + } + + result = gst_audio_encoder_set_output_format (enc, output_caps); + + /* input size is 10ms */ + self->samples_per_frame = GST_AUDIO_INFO_RATE (info) / 100; + self->frame_size = self->samples_per_frame * GST_AUDIO_INFO_BPS (info); + + GST_DEBUG_OBJECT (self, "input frame: %d samples, %" G_GSIZE_FORMAT " bytes", + self->samples_per_frame, self->frame_size); + + gst_audio_encoder_set_frame_samples_min (enc, self->samples_per_frame); + gst_audio_encoder_set_frame_samples_max (enc, self->samples_per_frame); + gst_audio_encoder_set_hard_min (enc, TRUE); + + gst_caps_unref (input_caps); + gst_caps_unref (output_caps); + 
return result; +} + +static GstFlowReturn +gst_isacenc_handle_frame (GstAudioEncoder * enc, GstBuffer * input) +{ + GstIsacEnc *self = GST_ISACENC (enc); + GstMapInfo map_read; + gint16 ret; + GstFlowReturn flow_ret = GST_FLOW_ERROR; + gsize offset = 0; + + /* Can't drain the encoder */ + if (!input) + return GST_FLOW_OK; + + if (!gst_buffer_map (input, &map_read, GST_MAP_READ)) { + GST_ELEMENT_ERROR (self, RESOURCE, READ, ("Failed to map input buffer"), + (NULL)); + return GST_FLOW_ERROR; + } + + GST_LOG_OBJECT (self, "Received %" G_GSIZE_FORMAT " bytes", map_read.size); + + while (offset + self->frame_size <= map_read.size) { + GstBuffer *output; + GstMapInfo map_write; + + output = gst_audio_encoder_allocate_output_buffer (enc, OUTPUT_BUFFER_SIZE); + if (!gst_buffer_map (output, &map_write, GST_MAP_WRITE)) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, ("Failed to map output buffer"), + (NULL)); + gst_buffer_unref (output); + goto out; + } + + ret = + WebRtcIsac_Encode (self->isac, + (const gint16 *) (map_read.data + offset), map_write.data); + + gst_buffer_unmap (output, &map_write); + self->nb_processed_input_frames++; + offset += self->frame_size; + + if (ret == 0) { + /* buffering */ + gst_buffer_unref (output); + continue; + } else if (ret < 0) { + /* error */ + gint16 code = WebRtcIsac_GetErrorCode (self->isac); + GST_ELEMENT_ERROR (self, LIBRARY, ENCODE, ("Failed to encode frame"), + ("Failed to encode: %s (%d)", isac_error_code_to_str (code), code)); + gst_buffer_unref (output); + goto out; + } else { + /* encoded */ + GST_LOG_OBJECT (self, "Encoded %d input frames to %d bytes", + self->nb_processed_input_frames, ret); + + gst_buffer_set_size (output, ret); + + flow_ret = + gst_audio_encoder_finish_frame (enc, output, + self->nb_processed_input_frames * self->samples_per_frame); + + if (flow_ret != GST_FLOW_OK) + goto out; + + self->nb_processed_input_frames = 0; + } + } + + flow_ret = GST_FLOW_OK; +out: + gst_buffer_unmap (input, &map_read); + return 
flow_ret; +} + +static void +gst_isacenc_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstIsacEnc *self = GST_ISACENC (object); + + switch (prop_id) { + case PROP_OUTPUT_FRAME_LEN: + self->output_frame_len = g_value_get_enum (value); + break; + case PROP_BITRATE: + self->bitrate = g_value_get_int (value); + break; + case PROP_MAX_PAYLOAD_SIZE: + self->max_payload_size = g_value_get_int (value); + break; + case PROP_MAX_RATE: + self->max_rate = g_value_get_int (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_isacenc_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstIsacEnc *self = GST_ISACENC (object); + + switch (prop_id) { + case PROP_OUTPUT_FRAME_LEN: + g_value_set_enum (value, self->output_frame_len); + break; + case PROP_BITRATE: + g_value_set_int (value, self->bitrate); + break; + case PROP_MAX_PAYLOAD_SIZE: + g_value_set_int (value, self->max_payload_size); + break; + case PROP_MAX_RATE: + g_value_set_int (value, self->max_rate); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_isacenc_class_init (GstIsacEncClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); + GstAudioEncoderClass *base_class = GST_AUDIO_ENCODER_CLASS (klass); + + gobject_class->set_property = gst_isacenc_set_property; + gobject_class->get_property = gst_isacenc_get_property; + + base_class->start = GST_DEBUG_FUNCPTR (gst_isacenc_start); + base_class->stop = GST_DEBUG_FUNCPTR (gst_isacenc_stop); + base_class->set_format = GST_DEBUG_FUNCPTR (gst_isacenc_set_format); + base_class->handle_frame = GST_DEBUG_FUNCPTR (gst_isacenc_handle_frame); + + g_object_class_install_property (gobject_class, PROP_OUTPUT_FRAME_LEN, + g_param_spec_enum ("output-frame-len", "Output Frame 
Length", + "Length, in ms, of output frames", + GST_TYPE_ISACENC_OUTPUT_FRAME_LEN, + GST_ISACENC_OUTPUT_FRAME_LEN_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + + g_object_class_install_property (gobject_class, PROP_BITRATE, + g_param_spec_int ("bitrate", "Bitrate", + "Average Bitrate (ABR) in bits/sec", + 10000, 56000, + GST_ISACENC_BITRATE_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + + g_object_class_install_property (gobject_class, PROP_MAX_PAYLOAD_SIZE, + g_param_spec_int ("max-payload-size", "Max Payload Size", + "Maximum payload size, in bytes. Range is 120 to 400 at 16 kHz " + "and 120 to 600 at 32 kHz (-1 = encoder default)", + -1, 600, + GST_ISACENC_MAX_PAYLOAD_SIZE_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + + g_object_class_install_property (gobject_class, PROP_MAX_RATE, + g_param_spec_int ("max-rate", "Max Rate", + "Maximum rate, in bits/sec, which the codec may not exceed for any " + "signal packet. Range is 32000 to 53400 at 16 kHz " + "and 32000 to 160000 at 32 kHz (-1 = encoder default)", + -1, 160000, + GST_ISACENC_MAX_PAYLOAD_SIZE_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + + gst_element_class_set_static_metadata (gstelement_class, "iSAC encoder", + "Codec/Encoder/Audio", + "iSAC audio encoder", + "Guillaume Desmottes <guillaume.desmottes@collabora.com>"); + + gst_element_class_add_static_pad_template (gstelement_class, &sink_template); + gst_element_class_add_static_pad_template (gstelement_class, &src_template); +} + +static void +gst_isacenc_init (GstIsacEnc * self) +{ + self->output_frame_len = GST_ISACENC_OUTPUT_FRAME_LEN_DEFAULT; + self->bitrate = GST_ISACENC_BITRATE_DEFAULT; + self->max_payload_size = GST_ISACENC_MAX_PAYLOAD_SIZE_DEFAULT; + self->max_rate = GST_ISACENC_MAX_RATE_DEFAULT; +}
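The `set_format` callback in gstisacenc.c above enforces mode-dependent limits: 60 ms output frames are only allowed in wideband (16 kHz), and the bitrate range is 10000-32000 bps at 16 kHz versus 10000-56000 bps at 32 kHz. A standalone sketch of those checks (the enum and function names are illustrative, not the plugin's):

```c
#include <stdbool.h>

typedef enum
{
  MODE_WIDEBAND,                /* 16 kHz */
  MODE_SUPER_WIDEBAND,          /* 32 kHz */
} enc_mode_t;

/* Mirrors the parameter validation in gst_isacenc_set_format(). */
static bool
isac_params_valid (enc_mode_t mode, int bitrate, int frame_len_ms)
{
  if (frame_len_ms != 30 && frame_len_ms != 60)
    return false;
  if (mode == MODE_SUPER_WIDEBAND && frame_len_ms != 30)
    return false;                       /* 60 ms only at 16 kHz */
  if (bitrate < 10000)
    return false;
  if (mode == MODE_WIDEBAND && bitrate > 32000)
    return false;
  if (mode == MODE_SUPER_WIDEBAND && bitrate > 56000)
    return false;
  return true;
}
```

The encoder also consumes fixed 10 ms input frames (`rate / 100` samples), which is why `handle_frame` walks the mapped input buffer in `frame_size` steps and counts processed frames until `WebRtcIsac_Encode` emits a packet.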
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisacenc.h
Added
@@ -0,0 +1,36 @@ +/* iSAC encoder + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +#ifndef __GST_ISAC_ENC_H__ +#define __GST_ISAC_ENC_H__ + +#include <gst/audio/audio.h> + +G_BEGIN_DECLS + +#define GST_TYPE_ISACENC gst_isacenc_get_type () +G_DECLARE_FINAL_TYPE(GstIsacEnc, gst_isacenc, GST, ISACENC, GstAudioEncoder) + +GST_ELEMENT_REGISTER_DECLARE (isacenc); + +G_END_DECLS + +#endif /* __GST_ISAC_ENC_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisacutils.c
Added
@@ -0,0 +1,85 @@ +/* iSAC plugin utils + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +#include "gstisacutils.h" + +#include <modules/audio_coding/codecs/isac/main/source/settings.h> + +const gchar * +isac_error_code_to_str (gint code) +{ + switch (code) { + case ISAC_MEMORY_ALLOCATION_FAILED: + return "allocation failed"; + case ISAC_MODE_MISMATCH: + return "mode mismatch"; + case ISAC_DISALLOWED_BOTTLENECK: + return "disallowed bottleneck"; + case ISAC_DISALLOWED_FRAME_LENGTH: + return "disallowed frame length"; + case ISAC_UNSUPPORTED_SAMPLING_FREQUENCY: + return "unsupported sampling frequency"; + case ISAC_RANGE_ERROR_BW_ESTIMATOR: + return "range error bandwitch estimator"; + case ISAC_ENCODER_NOT_INITIATED: + return "encoder not initiated"; + case ISAC_DISALLOWED_CODING_MODE: + return "disallowed coding mode"; + case ISAC_DISALLOWED_FRAME_MODE_ENCODER: + return "disallowed frame mode encoder"; + case ISAC_DISALLOWED_BITSTREAM_LENGTH: + return "disallowed bitstream length"; + case ISAC_PAYLOAD_LARGER_THAN_LIMIT: + return "payload larger than limit"; + case ISAC_DISALLOWED_ENCODER_BANDWIDTH: + return "disallowed encoder 
bandwith"; + case ISAC_DECODER_NOT_INITIATED: + return "decoder not initiated"; + case ISAC_EMPTY_PACKET: + return "empty packet"; + case ISAC_DISALLOWED_FRAME_MODE_DECODER: + return "disallowed frame mode decoder"; + case ISAC_RANGE_ERROR_DECODE_FRAME_LENGTH: + return "range error decode frame length"; + case ISAC_RANGE_ERROR_DECODE_BANDWIDTH: + return "range error decode bandwith"; + case ISAC_RANGE_ERROR_DECODE_PITCH_GAIN: + return "range error decode pitch gain"; + case ISAC_RANGE_ERROR_DECODE_PITCH_LAG: + return "range error decode pitch lag"; + case ISAC_RANGE_ERROR_DECODE_LPC: + return "range error decode lpc"; + case ISAC_RANGE_ERROR_DECODE_SPECTRUM: + return "range error decode spectrum"; + case ISAC_LENGTH_MISMATCH: + return "length mismatch"; + case ISAC_RANGE_ERROR_DECODE_BANDWITH: + return "range error decode bandwith"; + case ISAC_DISALLOWED_BANDWIDTH_MODE_DECODER: + return "disallowed bandwitch mode decoder"; + case ISAC_DISALLOWED_LPC_MODEL: + return "disallowed lpc model"; + case ISAC_INCOMPATIBLE_FORMATS: + return "incompatible formats"; + } + + return "<unknown>"; +}
gst-plugins-bad-1.20.1.tar.xz/ext/isac/gstisacutils.h
Added
@@ -0,0 +1,40 @@ +/* iSAC plugin utils + * + * Copyright (C) 2020 Collabora Ltd. + * Author: Guillaume Desmottes <guillaume.desmottes@collabora.com>, Collabora Ltd. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the Free + * Software Foundation, Inc., 51 Franklin Street, Fifth Floor, + * Boston, MA 02110-1301 USA. + */ + +#ifndef __GST_ISAC_UTILS_H__ +#define __GST_ISAC_UTILS_H__ + +#include <glib.h> + +G_BEGIN_DECLS + +const gchar * isac_error_code_to_str (gint code); + +#define CHECK_ISAC_RET(ret, function) \ + if (ret == -1) {\ + gint16 code = WebRtcIsac_GetErrorCode (self->isac);\ + GST_WARNING_OBJECT (self, "WebRtcIsac_"#function " call failed: %s (%d)", isac_error_code_to_str (code), code);\ + return FALSE;\ + } + +G_END_DECLS + +#endif /* __GST_ISAC_UTILS_H__ */
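The `CHECK_ISAC_RET` macro above expands to a bare braced `if`, which works at statement level in this code; the conventional hardening for such early-return check macros is a `do { } while (0)` wrapper so the expansion also behaves as a single statement next to an unbraced `if`/`else` at the call site. A standalone sketch of that pattern with a stub error lookup (all names here are illustrative):

```c
#include <stdio.h>
#include <stdbool.h>

static const char *
err_to_str (int code)
{
  return code == -1 ? "generic failure" : "<unknown>";
}

/* do/while(0) makes the macro expand to exactly one statement, so it
 * composes safely with surrounding if/else and requires a semicolon. */
#define CHECK_RET(ret, what)                                          \
  do {                                                                \
    if ((ret) == -1) {                                                \
      fprintf (stderr, "%s failed: %s\n", (what), err_to_str (ret));  \
      return false;                                                   \
    }                                                                 \
  } while (0)

/* Example call site, analogous to gst_isacenc_start() checking
 * the return of WebRtcIsac_Create(). */
static bool
init_codec (int simulated_ret)
{
  CHECK_RET (simulated_ret, "Create");
  return true;
}
```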
gst-plugins-bad-1.20.1.tar.xz/ext/isac/meson.build
Added
@@ -0,0 +1,20 @@ +webrtc_audio_coding_dep = dependency('webrtc-audio-coding-1', required: get_option('isac')) + +if webrtc_audio_coding_dep.found() + isac_sources = [ + 'gstisac.c', + 'gstisacenc.c', + 'gstisacdec.c', + 'gstisacutils.c', + ] + + gstisac = library('gstisac', isac_sources, + c_args : gst_plugins_bad_args, + include_directories : [configinc], + dependencies : [gstaudio_dep, webrtc_audio_coding_dep], + install : true, + install_dir : plugins_install_dir, + ) + pkgconfig.generate(gstisac, install_dir : plugins_pkgconfig_install_dir) + plugins += [gstisac] +endif
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkate.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkate.c
Changed
@@ -47,67 +47,23 @@ #include "config.h" #endif -#include <string.h> +#include "gstkateelements.h" -#include <gst/gst.h> - -#include "gstkate.h" -#include "gstkatedec.h" -#include "gstkateenc.h" -#include "gstkateparse.h" -#include "gstkatetag.h" - -#undef HAVE_TIGER -#ifdef HAVE_TIGER -#include "gstkatetiger.h" -#endif - -GST_DEBUG_CATEGORY (gst_katedec_debug); -GST_DEBUG_CATEGORY (gst_kateenc_debug); -GST_DEBUG_CATEGORY (gst_kateparse_debug); -GST_DEBUG_CATEGORY (gst_katetag_debug); -GST_DEBUG_CATEGORY (gst_kateutil_debug); -#ifdef HAVE_TIGER -GST_DEBUG_CATEGORY (gst_katetiger_debug); -#endif static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_katedec_debug, "katedec", 0, "Kate decoder"); - GST_DEBUG_CATEGORY_INIT (gst_kateenc_debug, "kateenc", 0, "Kate encoder"); - GST_DEBUG_CATEGORY_INIT (gst_kateparse_debug, "kateparse", 0, "Kate parser"); - GST_DEBUG_CATEGORY_INIT (gst_katetag_debug, "katetag", 0, "Kate tagger"); - GST_DEBUG_CATEGORY_INIT (gst_kateutil_debug, "kateutil", 0, - "Kate utility functions"); -#ifdef HAVE_TIGER - GST_DEBUG_CATEGORY_INIT (gst_katetiger_debug, "tiger", 0, - "Kate Tiger renderer"); -#endif - - if (!gst_element_register (plugin, "katedec", GST_RANK_PRIMARY, - GST_TYPE_KATE_DEC)) - return FALSE; - - if (!gst_element_register (plugin, "kateenc", GST_RANK_NONE, - GST_TYPE_KATE_ENC)) - return FALSE; - - if (!gst_element_register (plugin, "kateparse", GST_RANK_NONE, - GST_TYPE_KATE_PARSE)) - return FALSE; - - if (!gst_element_register (plugin, "katetag", GST_RANK_NONE, - GST_TYPE_KATE_TAG)) - return FALSE; + gboolean ret = FALSE; + ret |= GST_ELEMENT_REGISTER (katedec, plugin); + ret |= GST_ELEMENT_REGISTER (kateenc, plugin); + ret |= GST_ELEMENT_REGISTER (kateparse, plugin); + ret |= GST_ELEMENT_REGISTER (katetag, plugin); #ifdef HAVE_TIGER - if (!gst_element_register (plugin, "tiger", GST_RANK_PRIMARY, - GST_TYPE_KATE_TIGER)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (tiger, plugin); #endif - return TRUE; + 
return ret; } /* this is the structure that gstreamer looks for to register plugins
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkatedec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkatedec.c
Changed
@@ -79,7 +79,7 @@ #include <gst/gst.h> -#include "gstkate.h" +#include "gstkateelements.h" #include "gstkatespu.h" #include "gstkatedec.h" @@ -115,8 +115,15 @@ GST_KATE_SPU_MIME_TYPE) ); +GST_DEBUG_CATEGORY (gst_katedec_debug); + #define gst_kate_dec_parent_class parent_class G_DEFINE_TYPE (GstKateDec, gst_kate_dec, GST_TYPE_ELEMENT); +#define _do_init \ + kate_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_katedec_debug, "katedec", 0, "Kate decoder"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (katedec, "katedec", GST_RANK_PRIMARY, + GST_TYPE_KATE_DEC, _do_init); static void gst_kate_dec_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkateelement.c
Added
@@ -0,0 +1,72 @@ +/* + * GStreamer + * Copyright 2005 Thomas Vander Stichele <thomas@apestaart.org> + * Copyright 2005 Ronald S. Bultje <rbultje@ronald.bitfreak.net> + * Copyright 2008 Vincent Penquerc'h <ogg.k.ogg.k@googlemail.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. 
+ * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <string.h> + +#include <gst/gst.h> + +#include "gstkateelements.h" + +#undef HAVE_TIGER +#ifdef HAVE_TIGER +#include "gstkatetiger.h" +#endif + +GST_DEBUG_CATEGORY (gst_kateutil_debug); + +void +kate_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + if (g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (gst_kateutil_debug, "kateutil", 0, + "Kate utility functions"); + g_once_init_leave (&res, TRUE); + } +}
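`kate_element_init ()` above uses GLib's `g_once_init_enter`/`g_once_init_leave` so the shared debug category is initialized exactly once regardless of which Kate element registers first. The same once-only idiom can be sketched with POSIX `pthread_once` (the real code relies on GLib; this is just the pattern):

```c
#include <pthread.h>

static pthread_once_t init_once = PTHREAD_ONCE_INIT;
static int init_count;          /* instrumentation: shows the body runs once */

static void
do_init (void)
{
  /* one-time shared setup, e.g. creating a common debug category */
  init_count++;
}

/* Safe to call from every element's registration path; only the first
 * caller executes do_init(), later callers return immediately. */
static void
element_init (void)
{
  pthread_once (&init_once, do_init);
}
```

This is why each element's `GST_ELEMENT_REGISTER_DEFINE_WITH_CODE` in the 1.20 diffs can call `kate_element_init (plugin)` unconditionally from its `_do_init` block.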
gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkateelements.h
Added
@@ -0,0 +1,62 @@ +/* + * GStreamer + * Copyright (C) 2008 Vincent Penquerc'h <ogg.k.ogg.k@googlemail.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. 
+ * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_KATE_ELEMENT_H__ +#define __GST_KATE_ELEMENT_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +void kate_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (katedec); +GST_ELEMENT_REGISTER_DECLARE (kateenc); +GST_ELEMENT_REGISTER_DECLARE (kateparse); +GST_ELEMENT_REGISTER_DECLARE (tiger); +GST_ELEMENT_REGISTER_DECLARE (katetag); + + +G_END_DECLS + +#endif /* __GST_KATE_ELEMENT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkateenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkateenc.c
Changed
@@ -83,7 +83,7 @@ #include <gst/gsttagsetter.h> #include <gst/tag/tag.h> -#include "gstkate.h" +#include "gstkateelements.h" #include "gstkateutil.h" #include "gstkatespu.h" #include "gstkateenc.h" @@ -144,9 +144,16 @@ static gboolean gst_kate_enc_source_query (GstPad * pad, GstObject * parent, GstQuery * query); +GST_DEBUG_CATEGORY (gst_kateenc_debug); + #define gst_kate_enc_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstKateEnc, gst_kate_enc, GST_TYPE_ELEMENT, G_IMPLEMENT_INTERFACE (GST_TYPE_TAG_SETTER, NULL)); +#define _do_init \ + kate_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_kateenc_debug, "kateenc", 0, "Kate encoder"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (kateenc, "kateenc", GST_RANK_NONE, + GST_TYPE_KATE_ENC, _do_init); /* initialize the plugin's class */ static void
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkateparse.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkateparse.c
Changed
@@ -57,7 +57,7 @@ # include "config.h" #endif -#include "gstkate.h" +#include "gstkateelements.h" #include "gstkateutil.h" #include "gstkateparse.h" @@ -78,8 +78,15 @@ GST_STATIC_CAPS ("subtitle/x-kate; application/x-kate") ); +GST_DEBUG_CATEGORY (gst_kateparse_debug); + #define gst_kate_parse_parent_class parent_class G_DEFINE_TYPE (GstKateParse, gst_kate_parse, GST_TYPE_ELEMENT); +#define _do_init \ + kate_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_kateparse_debug, "kateparse", 0, "Kate parser"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (kateparse, "kateparse", GST_RANK_NONE, + GST_TYPE_KATE_PARSE, _do_init); static GstFlowReturn gst_kate_parse_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer);
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkatetag.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkatetag.c
Changed
@@ -69,6 +69,7 @@ #include <kate/kate.h> +#include "gstkateelements.h" #include "gstkatetag.h" @@ -92,9 +93,15 @@ GValue * value, GParamSpec * pspec); static void gst_kate_tag_dispose (GObject * object); +GST_DEBUG_CATEGORY (gst_katetag_debug); #define gst_kate_tag_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstKateTag, gst_kate_tag, GST_TYPE_KATE_PARSE, G_IMPLEMENT_INTERFACE (GST_TYPE_TAG_SETTER, NULL)); +#define _do_init \ + kate_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_katetag_debug, "katetag", 0, "Kate tagger"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (katetag, "katetag", GST_RANK_NONE, + GST_TYPE_KATE_TAG, _do_init); static void gst_kate_tag_class_init (GstKateTagClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkatetiger.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkatetiger.c
Changed
@@ -80,7 +80,7 @@ #include <gst/glib-compat-private.h> #include <gst/video/video.h> -#include "gstkate.h" +#include "gstkateelements.h" #include "gstkatetiger.h" GST_DEBUG_CATEGORY_EXTERN (gst_katetiger_debug); @@ -174,7 +174,15 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS (TIGER_VIDEO_CAPS)); +GST_DEBUG_CATEGORY (gst_katetiger_debug); + GST_BOILERPLATE (GstKateTiger, gst_kate_tiger, GstElement, GST_TYPE_ELEMENT); +#define _do_init \ + kate_element_init (plugin); \ + GST_DEBUG_CATEGORY_INIT (gst_katetiger_debug, "tiger", 0, \ + "Kate Tiger renderer"); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (tiger, "tiger", GST_RANK_NONE, + GST_TYPE_KATE_TIGER, _do_init); static GType gst_kate_tiger_font_effect_get_type (void)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/kate/gstkateutil.c -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/gstkateutil.c
Changed
@@ -28,7 +28,6 @@ #include <tiger/tiger.h> #endif #include <gst/tag/tag.h> -#include "gstkate.h" #include "gstkateutil.h" GST_DEBUG_CATEGORY_EXTERN (gst_kateutil_debug);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/kate/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/kate/meson.build
Changed
@@ -1,5 +1,6 @@ kate_sources = [ 'gstkate.c', + 'gstkateelement.c', 'gstkatedec.c', 'gstkateenc.c', 'gstkateparse.c',
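All the kate hunks above follow the same 1.20 registration pattern: each element's `_do_init` first calls a shared `kate_element_init (plugin)` and only then sets up its own debug category, so plugin-wide setup runs at most once no matter which element gets registered first. A minimal plain-C sketch of that idempotent shared-init idea (names hypothetical and single-threaded for brevity; GStreamer itself guards this with g_once-style primitives):

```c
#include <assert.h>
#include <stdio.h>

static int shared_init_runs = 0;

/* Analogous to kate_element_init(): every element calls it,
 * but the body runs only on the first call. */
static void kate_like_element_init (void)
{
  static int done = 0;          /* GStreamer uses g_once_init_enter() here */
  if (!done) {
    done = 1;
    shared_init_runs++;
    printf ("shared kate init\n");
  }
}

/* Stand-ins for each element's _do_init block. */
static void register_kateenc (void)
{
  kate_like_element_init ();
  /* per-element setup (debug category, type registration, ...) */
}

static void register_kateparse (void)
{
  kate_like_element_init ();
}
```

Calling `register_kateenc()` and `register_kateparse()` in any order leaves `shared_init_runs` at 1, which is the property the `kate_element_init (plugin)` call in every `_do_init` relies on.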
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ldac
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ldac/gstldacenc.c
Added
@@ -0,0 +1,625 @@ +/* GStreamer LDAC audio encoder + * Copyright (C) 2020 Asymptotic <sanchayan@asymptotic.io> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + * + */ + +/** + * SECTION:element-ldacenc + * @title: ldacenc + * + * This element encodes raw integer PCM audio into a Bluetooth LDAC audio. + * + * ## Example pipeline + * |[ + * gst-launch-1.0 -v audiotestsrc ! ldacenc ! rtpldacpay mtu=679 ! avdtpsink + * ]| Encode a sine wave into LDAC, RTP payload it and send over bluetooth + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <string.h> + +#include "gstldacenc.h" + +/* + * MTU size required for LDAC A2DP streaming. Required for initializing the + * encoder. 
+ */ +#define GST_LDAC_MTU_REQUIRED 679 + +GST_DEBUG_CATEGORY_STATIC (ldac_enc_debug); +#define GST_CAT_DEFAULT ldac_enc_debug + +#define parent_class gst_ldac_enc_parent_class +G_DEFINE_TYPE (GstLdacEnc, gst_ldac_enc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (ldacenc, "ldacenc", GST_RANK_NONE, + GST_TYPE_LDAC_ENC); + +#define SAMPLE_RATES "44100, 48000, 88200, 96000" + +static GstStaticPadTemplate ldac_enc_sink_factory = +GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS + ("audio/x-raw, format=(string) { S16LE, S24LE, S32LE, F32LE }, " + "rate = (int) { " SAMPLE_RATES " }, channels = (int) [ 1, 2 ] ")); + +static GstStaticPadTemplate ldac_enc_src_factory = + GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/x-ldac, " + "rate = (int) { " SAMPLE_RATES " }, " + "channels = (int) 1, channel-mode = (string)mono; " + "audio/x-ldac, " + "rate = (int) { " SAMPLE_RATES " }, " + "channels = (int) 2, channel-mode = (string) { dual, stereo }")); + +enum +{ + PROP_0, + PROP_EQMID +}; + +static void gst_ldac_enc_get_property (GObject * object, + guint property_id, GValue * value, GParamSpec * pspec); +static void gst_ldac_enc_set_property (GObject * object, + guint property_id, const GValue * value, GParamSpec * pspec); + +static gboolean gst_ldac_enc_start (GstAudioEncoder * enc); +static gboolean gst_ldac_enc_stop (GstAudioEncoder * enc); +static gboolean gst_ldac_enc_set_format (GstAudioEncoder * enc, + GstAudioInfo * info); +static gboolean gst_ldac_enc_negotiate (GstAudioEncoder * enc); +static GstFlowReturn gst_ldac_enc_handle_frame (GstAudioEncoder * enc, + GstBuffer * buffer); +static guint gst_ldac_enc_get_num_frames (guint eqmid, guint channels); +static guint gst_ldac_enc_get_frame_length (guint eqmid, guint channels); +static guint gst_ldac_enc_get_num_samples (guint rate); + +#define GST_LDAC_EQMID (gst_ldac_eqmid_get_type ()) +static GType +gst_ldac_eqmid_get_type (void) +{ + static 
GType ldac_eqmid_type = 0; + static const GEnumValue eqmid_types[] = { + {GST_LDAC_EQMID_HQ, "HQ", "hq"}, + {GST_LDAC_EQMID_SQ, "SQ", "sq"}, + {GST_LDAC_EQMID_MQ, "MQ", "mq"}, + {0, NULL, NULL} + }; + + if (!ldac_eqmid_type) + ldac_eqmid_type = g_enum_register_static ("GstLdacEqmid", eqmid_types); + + return ldac_eqmid_type; +} + +static void +gst_ldac_enc_class_init (GstLdacEncClass * klass) +{ + GstAudioEncoderClass *encoder_class = GST_AUDIO_ENCODER_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GObjectClass *gobject_class = (GObjectClass *) klass; + + gobject_class->set_property = gst_ldac_enc_set_property; + gobject_class->get_property = gst_ldac_enc_get_property; + + encoder_class->start = GST_DEBUG_FUNCPTR (gst_ldac_enc_start); + encoder_class->stop = GST_DEBUG_FUNCPTR (gst_ldac_enc_stop); + encoder_class->set_format = GST_DEBUG_FUNCPTR (gst_ldac_enc_set_format); + encoder_class->handle_frame = GST_DEBUG_FUNCPTR (gst_ldac_enc_handle_frame); + encoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_ldac_enc_negotiate); + + g_object_class_install_property (gobject_class, PROP_EQMID, + g_param_spec_enum ("eqmid", "Encode Quality Mode Index", + "Encode Quality Mode Index. 
0: High Quality 1: Standard Quality " + "2: Mobile Use Quality", GST_LDAC_EQMID, + GST_LDAC_EQMID_SQ, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + gst_element_class_add_static_pad_template (element_class, + &ldac_enc_sink_factory); + gst_element_class_add_static_pad_template (element_class, + &ldac_enc_src_factory); + + gst_element_class_set_static_metadata (element_class, + "Bluetooth LDAC audio encoder", "Codec/Encoder/Audio", + "Encode an LDAC audio stream", + "Sanchayan Maity <sanchayan@asymptotic.io>"); + + GST_DEBUG_CATEGORY_INIT (ldac_enc_debug, "ldacenc", 0, + "LDAC encoding element"); +} + +static void +gst_ldac_enc_init (GstLdacEnc * self) +{ + GST_PAD_SET_ACCEPT_TEMPLATE (GST_AUDIO_ENCODER_SINK_PAD (self)); + self->eqmid = GST_LDAC_EQMID_SQ; + self->channel_mode = 0; + self->init_done = FALSE; +} + +static void +gst_ldac_enc_set_property (GObject * object, guint property_id, + const GValue * value, GParamSpec * pspec) +{ + GstLdacEnc *self = GST_LDAC_ENC (object); + + switch (property_id) { + case PROP_EQMID: + self->eqmid = g_value_get_enum (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec); + break; + } +} + +static void +gst_ldac_enc_get_property (GObject * object, guint property_id, + GValue * value, GParamSpec * pspec) +{ + GstLdacEnc *self = GST_LDAC_ENC (object); + + switch (property_id) { + case PROP_EQMID: + g_value_set_enum (value, self->eqmid); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec); + break; + } +} + +static GstCaps * +gst_ldac_enc_do_negotiate (GstAudioEncoder * audio_enc) +{ + GstLdacEnc *enc = GST_LDAC_ENC (audio_enc); + GstCaps *caps, *filter_caps; + GstCaps *output_caps = NULL; + GstStructure *s; + + /* Negotiate output format based on downstream caps restrictions */ + caps = gst_pad_get_allowed_caps (GST_AUDIO_ENCODER_SRC_PAD (enc)); + + if (caps == NULL) + caps = gst_static_pad_template_get_caps (&ldac_enc_src_factory); + else if 
(gst_caps_is_empty (caps)) + goto failure; + + /* Fixate output caps */ + filter_caps = gst_caps_new_simple ("audio/x-ldac", "rate", G_TYPE_INT, + enc->rate, "channels", G_TYPE_INT, enc->channels, NULL); + output_caps = gst_caps_intersect (caps, filter_caps); + gst_caps_unref (filter_caps); + + if (output_caps == NULL || gst_caps_is_empty (output_caps)) { + GST_WARNING_OBJECT (enc, "Couldn't negotiate output caps with input rate " + "%d and input channels %d and allowed output caps %" GST_PTR_FORMAT, + enc->rate, enc->channels, caps); + goto failure; + } + + gst_clear_caps (&caps); + + GST_DEBUG_OBJECT (enc, "fixating caps %" GST_PTR_FORMAT, output_caps); + output_caps = gst_caps_truncate (output_caps); + s = gst_caps_get_structure (output_caps, 0); + if (enc->channels == 1) + gst_structure_fixate_field_string (s, "channel-mode", "mono"); + else + gst_structure_fixate_field_string (s, "channel-mode", "stereo"); + s = NULL; + + /* In case there's anything else left to fixate */ + output_caps = gst_caps_fixate (output_caps); + gst_caps_set_simple (output_caps, "framed", G_TYPE_BOOLEAN, TRUE, NULL); + + /* Set EQMID in caps to be used downstream by rtpldacpay */ + gst_caps_set_simple (output_caps, "eqmid", G_TYPE_INT, enc->eqmid, NULL); + + GST_INFO_OBJECT (enc, "output caps %" GST_PTR_FORMAT, output_caps); + + if (enc->channels == 1) + enc->channel_mode = LDACBT_CHANNEL_MODE_MONO; + else + enc->channel_mode = LDACBT_CHANNEL_MODE_STEREO; + + return output_caps; + +failure: + if (output_caps) + gst_caps_unref (output_caps); + if (caps) + gst_caps_unref (caps); + return NULL; +} + +static gboolean +gst_ldac_enc_negotiate (GstAudioEncoder * audio_enc) +{ + GstLdacEnc *enc = GST_LDAC_ENC (audio_enc); + GstCaps *output_caps = NULL; + + output_caps = gst_ldac_enc_do_negotiate (audio_enc); + if (output_caps == NULL) { + GST_ERROR_OBJECT (enc, "failed to negotiate"); + return FALSE; + } + + if (!gst_audio_encoder_set_output_format (audio_enc, output_caps)) { + 
GST_ERROR_OBJECT (enc, "failed to configure output caps on src pad"); + gst_caps_unref (output_caps); + return FALSE; + } + gst_caps_unref (output_caps); + + return GST_AUDIO_ENCODER_CLASS (parent_class)->negotiate (audio_enc); +} + +static gboolean +gst_ldac_enc_set_format (GstAudioEncoder * audio_enc, GstAudioInfo * info) +{ + GstLdacEnc *enc = GST_LDAC_ENC (audio_enc); + GstCaps *output_caps = NULL; + guint num_ldac_frames, num_samples; + gint ret = 0; + + enc->rate = GST_AUDIO_INFO_RATE (info); + enc->channels = GST_AUDIO_INFO_CHANNELS (info); + + switch (GST_AUDIO_INFO_FORMAT (info)) { + case GST_AUDIO_FORMAT_S16: + enc->ldac_fmt = LDACBT_SMPL_FMT_S16; + break; + case GST_AUDIO_FORMAT_S24: + enc->ldac_fmt = LDACBT_SMPL_FMT_S24; + break; + case GST_AUDIO_FORMAT_S32: + enc->ldac_fmt = LDACBT_SMPL_FMT_S32; + break; + case GST_AUDIO_FORMAT_F32: + enc->ldac_fmt = LDACBT_SMPL_FMT_F32; + break; + default: + GST_ERROR_OBJECT (enc, "Invalid audio format"); + return FALSE; + } + + output_caps = gst_ldac_enc_do_negotiate (audio_enc); + if (output_caps == NULL) { + GST_ERROR_OBJECT (enc, "failed to negotiate"); + return FALSE; + } + + if (!gst_audio_encoder_set_output_format (audio_enc, output_caps)) { + GST_ERROR_OBJECT (enc, "failed to configure output caps on src pad"); + gst_caps_unref (output_caps); + return FALSE; + } + gst_caps_unref (output_caps); + + num_samples = gst_ldac_enc_get_num_samples (enc->rate); + num_ldac_frames = gst_ldac_enc_get_num_frames (enc->eqmid, enc->channels); + gst_audio_encoder_set_frame_samples_min (audio_enc, + num_samples * num_ldac_frames); + + /* + * If initialisation was already done means caps have changed, close the + * handle. Closed handle can be initialised and used again. + */ + if (enc->init_done) { + ldacBT_close_handle (enc->ldac); + enc->init_done = FALSE; + } + + /* + * libldac exposes a bluetooth centric API and emits multiple LDAC frames + * depending on the MTU. 
The MTU is required for LDAC A2DP streaming, is + * inclusive of the RTP header and is required by the encoder. The internal + * encoder API is not exposed in the public interface. + */ + ret = + ldacBT_init_handle_encode (enc->ldac, GST_LDAC_MTU_REQUIRED, enc->eqmid, + enc->channel_mode, enc->ldac_fmt, enc->rate); + if (ret != 0) { + GST_ERROR_OBJECT (enc, "Failed to initialize LDAC handle, ret: %d", ret); + return FALSE; + } + enc->init_done = TRUE; + + return TRUE; +} + +static GstFlowReturn +gst_ldac_enc_handle_frame (GstAudioEncoder * audio_enc, GstBuffer * buffer) +{ + GstLdacEnc *enc = GST_LDAC_ENC (audio_enc); + GstMapInfo in_map, out_map; + GstAudioInfo *info; + GstBuffer *outbuf; + const guint8 *in_data; + guint8 *out_data; + gint encoded, to_encode = 0; + gint samples_consumed = 0; + guint frames, frame_len; + guint ldac_enc_read = 0; + guint frame_count = 0; + + if (buffer == NULL) + return GST_FLOW_OK; + + if (!gst_buffer_map (buffer, &in_map, GST_MAP_READ)) { + GST_ELEMENT_ERROR (audio_enc, STREAM, FAILED, (NULL), + ("Failed to map data from input buffer")); + return GST_FLOW_ERROR; + } + + info = gst_audio_encoder_get_audio_info (audio_enc); + ldac_enc_read = LDACBT_ENC_LSU * info->bpf; + /* + * We may produce extra frames at the end of encoding process (See below). + * Consider some additional frames while allocating output buffer if this + * happens. + */ + frames = (in_map.size / ldac_enc_read) + 4; + + frame_len = gst_ldac_enc_get_frame_length (enc->eqmid, info->channels); + outbuf = gst_audio_encoder_allocate_output_buffer (audio_enc, + frames * frame_len); + if (outbuf == NULL) + goto no_buffer; + + gst_buffer_map (outbuf, &out_map, GST_MAP_WRITE); + in_data = in_map.data; + out_data = out_map.data; + to_encode = in_map.size; + + /* + * ldacBT_encode does not generate an output frame each time it is called. + * For each invocation, it consumes number of sample * bpf bytes of data. 
+ * Depending on the eqmid setting and channels, it will emit multiple frames + * only after the required number of frames are packed for payloading. Below + * for loop exists primarily to handle this. + */ + for (;;) { + guint8 pcm[LDACBT_MAX_LSU * 4 /* bytes/sample */ * 2 /* ch */ ] = { 0 }; + gint ldac_frame_num, written; + guint8 *inp_data = NULL; + gboolean done = FALSE; + gint ret; + + /* + * Even with minimum frame samples specified in set_format with EOS, + * we may get a buffer which is not a multiple of LDACBT_ENC_LSU. LDAC + * encoder always reads a multiple of this and to handle this scenario + * we use local PCM array and in the last iteration when buffer bytes + * < LDACBT_ENC_LSU * bpf, we copy only to_encode bytes to prevent + * walking off the end of input buffer and the rest of the bytes in + * PCM buffer would be zero, so should be safe from encoding point of + * view. + */ + if (to_encode < 0) { + /* + * We got < LDACBT_ENC_LSU * bpf for last iteration. Force the encoder + * to encode the remaining bytes in buffer by passing NULL to the input + * PCM buffer argument. + */ + inp_data = NULL; + done = TRUE; + } else if (to_encode >= ldac_enc_read) { + memcpy (pcm, in_data, ldac_enc_read); + inp_data = &pcm[0]; + } else if (to_encode > 0 && to_encode < ldac_enc_read) { + memcpy (pcm, in_data, to_encode); + inp_data = &pcm[0]; + } + + /* + * Note that while we do not explicitly pass length of data to library + * anywhere, based on the initialization considering eqmid and rate, the + * library will consume a fix number of samples per call. This combined + * with the previous step ensures that the library does not read outside + * of in_data and out_data. 
+ */ + ret = ldacBT_encode (enc->ldac, (void *) inp_data, &encoded, + (guint8 *) out_data, &written, &ldac_frame_num); + if (ret < 0) { + GST_ELEMENT_ERROR (enc, STREAM, ENCODE, (NULL), + ("encoding error, ret = %d written = %d", ret, ldac_frame_num)); + goto encoding_error; + } else { + to_encode -= encoded; + in_data = in_data + encoded; + out_data = out_data + written; + frame_count += ldac_frame_num; + + GST_LOG_OBJECT (enc, + "To Encode: %d, Encoded: %d, Written: %d, LDAC Frames: %d", to_encode, + encoded, written, ldac_frame_num); + + if (done || (to_encode == 0 && encoded == ldac_enc_read)) + break; + } + } + + gst_buffer_unmap (outbuf, &out_map); + + if (frame_count > 0) { + samples_consumed = in_map.size / info->bpf; + gst_buffer_set_size (outbuf, frame_count * frame_len); + } else { + samples_consumed = 0; + gst_buffer_replace (&outbuf, NULL); + } + + gst_buffer_unmap (buffer, &in_map); + + return gst_audio_encoder_finish_frame (audio_enc, outbuf, samples_consumed); + +no_buffer: + { + gst_buffer_unmap (buffer, &in_map); + + GST_ELEMENT_ERROR (enc, STREAM, FAILED, (NULL), + ("could not allocate output buffer")); + + return GST_FLOW_ERROR; + } +encoding_error: + { + gst_buffer_unmap (buffer, &in_map); + + ldacBT_free_handle (enc->ldac); + + enc->ldac = NULL; + + return GST_FLOW_ERROR; + } +} + +static gboolean +gst_ldac_enc_start (GstAudioEncoder * audio_enc) +{ + GstLdacEnc *enc = GST_LDAC_ENC (audio_enc); + + GST_INFO_OBJECT (enc, "Setup LDAC codec"); + /* Note that this only allocates the LDAC handle */ + enc->ldac = ldacBT_get_handle (); + if (enc->ldac == NULL) { + GST_ERROR_OBJECT (enc, "Failed to get LDAC handle"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_ldac_enc_stop (GstAudioEncoder * audio_enc) +{ + GstLdacEnc *enc = GST_LDAC_ENC (audio_enc); + + GST_INFO_OBJECT (enc, "Finish LDAC codec"); + + if (enc->ldac) { + ldacBT_free_handle (enc->ldac); + enc->ldac = NULL; + } + + enc->eqmid = GST_LDAC_EQMID_SQ; + enc->channel_mode 
= 0; + enc->init_done = FALSE; + + return TRUE; +} + +/** + * gst_ldac_enc_get_frame_length + * @eqmid: Encode Quality Mode Index + * @channels: Number of channels + * + * Returns: Frame length. + */ +static guint +gst_ldac_enc_get_frame_length (guint eqmid, guint channels) +{ + g_assert (channels == 1 || channels == 2); + + switch (eqmid) { + /* Encode setting for High Quality */ + case GST_LDAC_EQMID_HQ: + return 165 * channels; + /* Encode setting for Standard Quality */ + case GST_LDAC_EQMID_SQ: + return 110 * channels; + /* Encode setting for Mobile use Quality */ + case GST_LDAC_EQMID_MQ: + return 55 * channels; + default: + break; + } + + g_assert_not_reached (); + + /* If assertion gets compiled out */ + return 110 * channels; +} + +/** + * gst_ldac_enc_get_num_frames + * @eqmid: Encode Quality Mode Index + * @channels: Number of channels + * + * Returns: Number of LDAC frames per packet. + */ +static guint +gst_ldac_enc_get_num_frames (guint eqmid, guint channels) +{ + g_assert (channels == 1 || channels == 2); + + switch (eqmid) { + /* Encode setting for High Quality */ + case GST_LDAC_EQMID_HQ: + return 4 / channels; + /* Encode setting for Standard Quality */ + case GST_LDAC_EQMID_SQ: + return 6 / channels; + /* Encode setting for Mobile use Quality */ + case GST_LDAC_EQMID_MQ: + return 12 / channels; + default: + break; + } + + g_assert_not_reached (); + + /* If assertion gets compiled out */ + return 6 / channels; +} + +/** + * gst_ldac_enc_get_num_samples + * @rate: Sampling rate + * + * Number of samples in input PCM signal for encoding is fixed to + * LDACBT_ENC_LSU viz. 128 samples/channel and it is not affected + * by sampling frequency. However, frame size is 128 samples at 44.1 + * and 48 KHz and 256 at 88.2 and 96 KHz. 
+ * + * Returns: Number of samples / channel + */ +static guint +gst_ldac_enc_get_num_samples (guint rate) +{ + switch (rate) { + case 44100: + case 48000: + return 128; + case 88200: + case 96000: + return 256; + default: + break; + } + + g_assert_not_reached (); + + /* If assertion gets compiled out */ + return 128; +}
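The three lookup helpers at the end of gstldacenc.c fit together arithmetically: for every quality mode and channel count, frame length times frames-per-packet comes to the same 660-byte payload, which is why the encoder can be initialised with the fixed 679-byte `GST_LDAC_MTU_REQUIRED` (660 bytes of LDAC frames plus RTP overhead). A standalone sketch re-deriving the tables, in plain C rather than the libldac API:

```c
#include <assert.h>

/* Tables mirrored from gst_ldac_enc_get_frame_length(),
 * gst_ldac_enc_get_num_frames() and gst_ldac_enc_get_num_samples().
 * eqmid: 0 = HQ, 1 = SQ, 2 = MQ; channels: 1 or 2. */
static unsigned ldac_frame_bytes (unsigned eqmid, unsigned channels)
{
  static const unsigned per_channel[3] = { 165, 110, 55 };
  return per_channel[eqmid] * channels;
}

static unsigned ldac_frames_per_packet (unsigned eqmid, unsigned channels)
{
  static const unsigned mono_frames[3] = { 4, 6, 12 };
  return mono_frames[eqmid] / channels;
}

static unsigned ldac_samples_per_channel (unsigned rate)
{
  /* 128 samples/channel at 44.1/48 kHz, 256 at 88.2/96 kHz */
  return (rate == 88200 || rate == 96000) ? 256 : 128;
}

/* Every quality mode packs the same 660-byte payload per packet,
 * leaving headroom for the RTP header inside the 679-byte MTU. */
static unsigned ldac_payload_bytes (unsigned eqmid, unsigned channels)
{
  return ldac_frame_bytes (eqmid, channels)
      * ldac_frames_per_packet (eqmid, channels);
}
```

For example, HQ stereo packs 2 frames of 330 bytes and MQ mono packs 12 frames of 55 bytes; both give 660 bytes per packet.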
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ldac/gstldacenc.h
Added
@@ -0,0 +1,67 @@ +/* GStreamer LDAC audio encoder + * Copyright (C) 2020 Asymptotic <sanchayan@asymptotic.io> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + */ + +#include <gst/gst.h> +#include <gst/audio/audio.h> + +#include <ldacBT.h> + +G_BEGIN_DECLS + +#define GST_TYPE_LDAC_ENC \ + (gst_ldac_enc_get_type()) +#define GST_LDAC_ENC(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_LDAC_ENC,GstLdacEnc)) +#define GST_LDAC_ENC_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_LDAC_ENC,GstLdacEncClass)) +#define GST_IS_LDAC_ENC(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_LDAC_ENC)) +#define GST_IS_LDAC_ENC_CLASS(obj) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_LDAC_ENC)) + +typedef struct _GstLdacEnc GstLdacEnc; +typedef struct _GstLdacEncClass GstLdacEncClass; + +typedef enum +{ + GST_LDAC_EQMID_HQ = 0, + GST_LDAC_EQMID_SQ, + GST_LDAC_EQMID_MQ +} GstLdacEqmid; + +struct _GstLdacEnc { + GstAudioEncoder audio_encoder; + GstLdacEqmid eqmid; + + guint rate; + guint channels; + guint channel_mode; + gboolean init_done; + + LDACBT_SMPL_FMT_T ldac_fmt; + HANDLE_LDAC_BT ldac; +}; + +struct _GstLdacEncClass { + GstAudioEncoderClass audio_encoder_class; +}; + +GType gst_ldac_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ldacenc); + +G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ldac/ldac-plugin.c
Added
@@ -0,0 +1,37 @@ +/* GStreamer LDAC audio plugin + * Copyright (C) 2020 Asymptotic <sanchayan@asymptotic.io> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstldacenc.h" +#include <string.h> + +static gboolean +plugin_init (GstPlugin * plugin) +{ + return GST_ELEMENT_REGISTER (ldacenc, plugin); +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + ldac, + "LDAC bluetooth audio support", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ldac/meson.build
Added
@@ -0,0 +1,19 @@ +ldac_sources = [ + 'ldac-plugin.c', + 'gstldacenc.c', +] + +ldac_dep = dependency('ldacBT-enc', required : get_option('ldac')) + +if ldac_dep.found() + gstldac = library('gstldac', + ldac_sources, + c_args : gst_plugins_bad_args, + include_directories : [configinc], + dependencies : [gstaudio_dep, ldac_dep], + install : true, + install_dir : plugins_install_dir, + ) + pkgconfig.generate(gstldac, install_dir : plugins_pkgconfig_install_dir) + plugins += [gstldac] +endif
View file
gst-plugins-bad-1.18.6.tar.xz/ext/libde265/gstlibde265.c -> gst-plugins-bad-1.20.1.tar.xz/ext/libde265/gstlibde265.c
Changed
@@ -28,10 +28,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret; - - ret = gst_libde265_dec_plugin_init (plugin); - return ret; + return GST_ELEMENT_REGISTER (libde265dec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/libde265/libde265-dec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/libde265/libde265-dec.c
Changed
@@ -47,6 +47,8 @@ #define parent_class gst_libde265_dec_parent_class G_DEFINE_TYPE (GstLibde265Dec, gst_libde265_dec, GST_TYPE_VIDEO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (libde265dec, "libde265dec", + GST_RANK_SECONDARY, GST_TYPE_LIBDE265_DEC); static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, @@ -872,14 +874,3 @@ gst_buffer_unmap (frame->input_buffer, &info); return GST_FLOW_ERROR; } - -gboolean -gst_libde265_dec_plugin_init (GstPlugin * plugin) -{ - /* create an elementfactory for the libde265 decoder element */ - if (!gst_element_register (plugin, "libde265dec", - GST_RANK_SECONDARY, GST_TYPE_LIBDE265_DEC)) - return FALSE; - - return TRUE; -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/libde265/libde265-dec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/libde265/libde265-dec.h
Changed
@@ -64,8 +64,8 @@ GType gst_libde265_dec_get_type (void); -G_END_DECLS +GST_ELEMENT_REGISTER_DECLARE (libde265dec) -gboolean gst_libde265_dec_plugin_init (GstPlugin * plugin); +G_END_DECLS #endif /* __GST_LIBDE265_DEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mdns/gstmicrodns.c -> gst-plugins-bad-1.20.1.tar.xz/ext/mdns/gstmicrodns.c
Changed
@@ -23,14 +23,14 @@ #include "gstmicrodnsdevice.h" +GST_DEVICE_PROVIDER_REGISTER_DECLARE (microdnsdeviceprovider); +GST_DEVICE_PROVIDER_REGISTER_DEFINE (microdnsdeviceprovider, + "microdnsdeviceprovider", GST_RANK_PRIMARY, GST_TYPE_MDNS_DEVICE_PROVIDER); + static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_device_provider_register (plugin, "microdnsdeviceprovider", - GST_RANK_PRIMARY, GST_TYPE_MDNS_DEVICE_PROVIDER)) - return FALSE; - - return TRUE; + return GST_DEVICE_PROVIDER_REGISTER (microdnsdeviceprovider, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/meson.build
Changed
@@ -1,3 +1,4 @@ +subdir('aes') subdir('assrender') subdir('aom') subdir('avtp') @@ -18,13 +19,15 @@ subdir('flite') subdir('fluidsynth') subdir('gme') +subdir('gs') subdir('gsm') subdir('hls') subdir('iqa') +subdir('isac') subdir('kate') subdir('ladspa') +subdir('ldac') subdir('libde265') -subdir('libmms') subdir('lv2') subdir('mdns') subdir('modplug') @@ -32,8 +35,9 @@ subdir('mplex') subdir('musepack') subdir('neon') -subdir('ofa') +subdir('onnx') subdir('openal') +subdir('openaptx') subdir('opencv') subdir('openexr') subdir('openh264') @@ -41,6 +45,7 @@ subdir('openmpt') subdir('openni2') subdir('opus') +subdir('qroverlay') subdir('resindvd') subdir('rsvg') subdir('rtmp')
View file
gst-plugins-bad-1.18.6.tar.xz/ext/modplug/gstmodplug.cc -> gst-plugins-bad-1.20.1.tar.xz/ext/modplug/gstmodplug.cc
Changed
@@ -129,6 +129,8 @@ #define parent_class gst_modplug_parent_class G_DEFINE_TYPE (GstModPlug, gst_modplug, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (modplug, "modplug", + GST_RANK_PRIMARY, GST_TYPE_MODPLUG); static void gst_modplug_class_init (GstModPlugClass * klass) @@ -902,8 +904,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "modplug", - GST_RANK_PRIMARY, GST_TYPE_MODPLUG); + return GST_ELEMENT_REGISTER (modplug, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/modplug/gstmodplug.h -> gst-plugins-bad-1.20.1.tar.xz/ext/modplug/gstmodplug.h
Changed
@@ -84,6 +84,8 @@ GType gst_modplug_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (modplug); + G_END_DECLS #endif /* __GST_MODPLUG_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mpeg2enc/gstmpeg2enc.cc -> gst-plugins-bad-1.20.1.tar.xz/ext/mpeg2enc/gstmpeg2enc.cc
Changed
@@ -106,10 +106,12 @@ guint prop_id, const GValue * value, GParamSpec * pspec); static gboolean gst_mpeg2enc_src_activate_mode (GstPad * pad, GstObject * parent, GstPadMode mode, gboolean active); +static gboolean mpeg2enc_element_init (GstPlugin * plugin); #define gst_mpeg2enc_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstMpeg2enc, gst_mpeg2enc, GST_TYPE_VIDEO_ENCODER, G_IMPLEMENT_INTERFACE (GST_TYPE_PRESET, NULL)); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (mpeg2enc, mpeg2enc_element_init); static void gst_mpeg2enc_class_init (GstMpeg2encClass * klass) @@ -790,7 +792,7 @@ #endif static gboolean -plugin_init (GstPlugin * plugin) +mpeg2enc_element_init (GstPlugin * plugin) { #ifndef GST_DISABLE_GST_DEBUG old_handler = mjpeg_log_set_handler (gst_mpeg2enc_log_callback); @@ -803,6 +805,12 @@ GST_RANK_MARGINAL, GST_TYPE_MPEG2ENC); } +static gboolean +plugin_init (GstPlugin * plugin) +{ + return GST_ELEMENT_REGISTER (mpeg2enc, plugin); +} + GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, mpeg2enc,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mpeg2enc/gstmpeg2enc.hh -> gst-plugins-bad-1.20.1.tar.xz/ext/mpeg2enc/gstmpeg2enc.hh
Changed
@@ -98,6 +98,8 @@ GType gst_mpeg2enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (mpeg2enc); + G_END_DECLS #endif /* __GST_MPEG2ENC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mpeg2enc/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/mpeg2enc/meson.build
Changed
@@ -1,21 +1,26 @@ +mpeg2enc_opt = get_option('mpeg2enc').require(gpl_allowed, error_message: ''' + Plugin mpeg2enc explicitly required via options but GPL-licensed plugins disabled via options. + Pass option -Dgpl=enabled to Meson to allow GPL-licensed plugins to be built. + ''') + # mjpegtools upstream breaks API constantly and doesn't export the version in # a header anywhere. The configure file has a lot of logic to support old # versions, but it all seems untested and broken. Require 2.0.0. Can be changed # if someone complains. -mjpegtools_dep = dependency('mjpegtools', version : '>=2.0.0', required : get_option('mpeg2enc')) -mpeg2enc_dep = cxx.find_library('mpeg2encpp', required : get_option('mpeg2enc')) - -no_warn_args = [] -foreach arg : [ - '-Wno-mismatched-tags', # mjpeg headers get class/struct mixed up - '-Wno-header-guard', - ] - if cxx.has_argument(arg) - no_warn_args += [arg] - endif -endforeach +mjpegtools_dep = dependency('mjpegtools', version : '>=2.0.0', required : mpeg2enc_opt) +mpeg2enc_dep = cxx.find_library('mpeg2encpp', required : mpeg2enc_opt) if mjpegtools_dep.found() and mpeg2enc_dep.found() + no_warn_args = [] + foreach arg : [ + '-Wno-mismatched-tags', # mjpeg headers get class/struct mixed up + '-Wno-header-guard', + ] + if cxx.has_argument(arg) + no_warn_args += [arg] + endif + endforeach + gstmpeg2enc = library('gstmpeg2enc', 'gstmpeg2enc.cc', 'gstmpeg2encoptions.cc',
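The `get_option('mpeg2enc').require(gpl_allowed, ...)` gating above relies on two pieces defined elsewhere in the tree: a `gpl` feature option and a `gpl_allowed` helper variable. A hedged sketch of what those look like (the exact wording in meson_options.txt and the top-level meson.build is an assumption inferred from the `-Dgpl=enabled` hint in the error message and the spec file change):

```meson
# meson_options.txt (assumed)
option('gpl', type : 'feature', value : 'disabled',
       description : 'Allow build of plugins that have (A)GPL-licensed dependencies')

# top-level meson.build (assumed)
gpl_allowed = get_option('gpl').allowed()
```

With this shape, `-Dmpeg2enc=enabled` without `-Dgpl=enabled` fails configuration with the error message in the diff, while `auto` silently skips the plugin.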
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mplex/gstmplex.cc -> gst-plugins-bad-1.20.1.tar.xz/ext/mplex/gstmplex.cc
Changed
@@ -115,9 +115,11 @@ guint prop_id, GValue * value, GParamSpec * pspec); static void gst_mplex_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); +static gboolean mplex_element_init (GstPlugin * plugin); #define parent_class gst_mplex_parent_class G_DEFINE_TYPE (GstMplex, gst_mplex, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (mplex, mplex_element_init); static void gst_mplex_class_init (GstMplexClass * klass) @@ -802,7 +804,7 @@ #endif static gboolean -plugin_init (GstPlugin * plugin) +mplex_element_init (GstPlugin * plugin) { #ifndef GST_DISABLE_GST_DEBUG old_handler = mjpeg_log_set_handler (gst_mplex_log_callback); @@ -814,6 +816,12 @@ return gst_element_register (plugin, "mplex", GST_RANK_NONE, GST_TYPE_MPLEX); } +static gboolean +plugin_init (GstPlugin * plugin) +{ + return GST_ELEMENT_REGISTER (mplex, plugin); +} + GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, mplex,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mplex/gstmplex.hh -> gst-plugins-bad-1.20.1.tar.xz/ext/mplex/gstmplex.hh
Changed
@@ -117,6 +117,8 @@ GType gst_mplex_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (mplex); + G_END_DECLS #endif /* __GST_MPLEX_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/mplex/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/mplex/meson.build
Changed
@@ -1,6 +1,11 @@ -# See: ext/mpeg2enc for note about mjpegtools dep -mjpegtools_dep = dependency('mjpegtools', version : '>=2.0.0', required : get_option('mplex')) -mplex2_dep = cxx.find_library('mplex2', required : get_option('mplex')) +mplex_opt = get_option('mplex').require(gpl_allowed, error_message: ''' + Plugin mplex explicitly required via options but GPL-licensed plugins disabled via options. + Pass option -Dgpl=enabled to Meson to allow GPL-licensed plugins to be built. + ''') + +# See: ext/mpeg2enc for note about mjpegtools dep +mjpegtools_dep = dependency('mjpegtools', version : '>=2.0.0', required : mplex_opt) +mplex2_dep = cxx.find_library('mplex2', required : mplex_opt) if mjpegtools_dep.found() and mplex2_dep.found() gstmplex2 = library('gstmplex',
View file
gst-plugins-bad-1.18.6.tar.xz/ext/musepack/gstmusepackdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/musepack/gstmusepackdec.c
Changed
@@ -74,8 +74,12 @@ gst_musepackdec_change_state (GstElement * element, GstStateChange transition); #define parent_class gst_musepackdec_parent_class -G_DEFINE_TYPE (GstMusepackDec, gst_musepackdec, GST_TYPE_ELEMENT); - +G_DEFINE_TYPE_WITH_CODE (GstMusepackDec, gst_musepackdec, GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (musepackdec_debug, "musepackdec", 0, + "mpc decoder"); + ); +GST_ELEMENT_REGISTER_DEFINE (musepackdec, "musepackdec", + GST_RANK_PRIMARY, GST_TYPE_MUSEPACK_DEC); static void gst_musepackdec_class_init (GstMusepackDecClass * klass) { @@ -628,10 +632,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (musepackdec_debug, "musepackdec", 0, "mpc decoder"); - - return gst_element_register (plugin, "musepackdec", - GST_RANK_PRIMARY, GST_TYPE_MUSEPACK_DEC); + return GST_ELEMENT_REGISTER (musepackdec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/musepack/gstmusepackdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/musepack/gstmusepackdec.h
Changed
@@ -61,6 +61,8 @@ GType gst_musepackdec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (musepackdec); + G_END_DECLS #endif /* __GST_MUSEPACK_DEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/neon/gstneonhttpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/neon/gstneonhttpsrc.c
Changed
@@ -109,7 +109,12 @@ #define parent_class gst_neonhttp_src_parent_class G_DEFINE_TYPE_WITH_CODE (GstNeonhttpSrc, gst_neonhttp_src, GST_TYPE_PUSH_SRC, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, - gst_neonhttp_src_uri_handler_init)); + gst_neonhttp_src_uri_handler_init); + GST_DEBUG_CATEGORY_INIT (neonhttpsrc_debug, "neonhttpsrc", 0, + "NEON HTTP src"); + ); +GST_ELEMENT_REGISTER_DEFINE (neonhttpsrc, "neonhttpsrc", GST_RANK_NONE, + GST_TYPE_NEONHTTP_SRC); static void gst_neonhttp_src_class_init (GstNeonhttpSrcClass * klass) @@ -1099,11 +1104,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (neonhttpsrc_debug, "neonhttpsrc", 0, - "NEON HTTP src"); - - return gst_element_register (plugin, "neonhttpsrc", GST_RANK_NONE, - GST_TYPE_NEONHTTP_SRC); + return GST_ELEMENT_REGISTER (neonhttpsrc, plugin); } /* this is the structure that gst-register looks for
View file
gst-plugins-bad-1.18.6.tar.xz/ext/neon/gstneonhttpsrc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/neon/gstneonhttpsrc.h
Changed
@@ -84,6 +84,8 @@ GType gst_neonhttp_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (neonhttpsrc); + G_END_DECLS #endif /* __GST_NEONHTTP_SRC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/neon/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/neon/meson.build
Changed
@@ -1,4 +1,4 @@ -neon_dep = dependency('neon', version: ['>= 0.27', '<= 0.31.99'], +neon_dep = dependency('neon', version: ['>= 0.27', '<= 0.32.99'], required : get_option('neon')) if neon_dep.found()
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnx.c
Added
@@ -0,0 +1,40 @@ +/* + * GStreamer gstreamer-onnx + * Copyright (C) 2021 Collabora Ltd + * + * gstonnx.c + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstonnxobjectdetector.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + GST_ELEMENT_REGISTER (onnx_object_detector, plugin); + + return TRUE; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + onnx, + "ONNX neural network plugin", + plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnxclient.cpp
Added
@@ -0,0 +1,423 @@ +/* + * GStreamer gstreamer-onnxclient + * Copyright (C) 2021 Collabora Ltd + * + * gstonnxclient.cpp + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstonnxclient.h" +#include <providers/cpu/cpu_provider_factory.h> +#ifdef GST_ML_ONNX_RUNTIME_HAVE_CUDA +#include <providers/cuda/cuda_provider_factory.h> +#endif +#include <exception> +#include <fstream> +#include <iostream> +#include <limits> +#include <numeric> +#include <cmath> +#include <sstream> + +namespace GstOnnxNamespace +{ +template < typename T > + std::ostream & operator<< (std::ostream & os, const std::vector < T > &v) +{ + os << "["; + for (size_t i = 0; i < v.size (); ++i) + { + os << v[i]; + if (i != v.size () - 1) + { + os << ", "; + } + } + os << "]"; + + return os; +} + +GstMlOutputNodeInfo::GstMlOutputNodeInfo (void):index + (GST_ML_NODE_INDEX_DISABLED), + type (ONNXTensorElementDataType::ONNX_TENSOR_ELEMENT_DATA_TYPE_FLOAT) +{ +} + +GstOnnxClient::GstOnnxClient ():session (nullptr), + width (0), + height (0), + channels (0), + dest (nullptr), + m_provider (GST_ONNX_EXECUTION_PROVIDER_CPU), + inputImageFormat (GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC), + fixedInputImageSize (true) +{ + for (size_t i = 0; i < GST_ML_OUTPUT_NODE_NUMBER_OF; ++i) + 
outputNodeIndexToFunction[i] = (GstMlOutputNodeFunction) i; +} + +GstOnnxClient::~GstOnnxClient () +{ + delete session; + delete[]dest; +} + +Ort::Env & GstOnnxClient::getEnv (void) +{ + static Ort::Env env (OrtLoggingLevel::ORT_LOGGING_LEVEL_WARNING, + "GstOnnxNamespace"); + + return env; +} + +int32_t GstOnnxClient::getWidth (void) +{ + return width; +} + +int32_t GstOnnxClient::getHeight (void) +{ + return height; +} + +bool GstOnnxClient::isFixedInputImageSize (void) +{ + return fixedInputImageSize; +} + +std::string GstOnnxClient::getOutputNodeName (GstMlOutputNodeFunction nodeType) +{ + switch (nodeType) { + case GST_ML_OUTPUT_NODE_FUNCTION_DETECTION: + return "detection"; + break; + case GST_ML_OUTPUT_NODE_FUNCTION_BOUNDING_BOX: + return "bounding box"; + break; + case GST_ML_OUTPUT_NODE_FUNCTION_SCORE: + return "score"; + break; + case GST_ML_OUTPUT_NODE_FUNCTION_CLASS: + return "label"; + break; + }; + + return ""; +} + +void GstOnnxClient::setInputImageFormat (GstMlModelInputImageFormat format) +{ + inputImageFormat = format; +} + +GstMlModelInputImageFormat GstOnnxClient::getInputImageFormat (void) +{ + return inputImageFormat; +} + +std::vector < const char *>GstOnnxClient::getOutputNodeNames (void) +{ + return outputNames; +} + +void GstOnnxClient::setOutputNodeIndex (GstMlOutputNodeFunction node, + gint index) +{ + g_assert (index < GST_ML_OUTPUT_NODE_NUMBER_OF); + outputNodeInfo[node].index = index; + if (index != GST_ML_NODE_INDEX_DISABLED) + outputNodeIndexToFunction[index] = node; +} + +gint GstOnnxClient::getOutputNodeIndex (GstMlOutputNodeFunction node) +{ + return outputNodeInfo[node].index; +} + +void GstOnnxClient::setOutputNodeType (GstMlOutputNodeFunction node, + ONNXTensorElementDataType type) +{ + outputNodeInfo[node].type = type; +} + +ONNXTensorElementDataType + GstOnnxClient::getOutputNodeType (GstMlOutputNodeFunction node) +{ + return outputNodeInfo[node].type; +} + +bool GstOnnxClient::hasSession (void) +{ + return session != 
nullptr; +} + +bool GstOnnxClient::createSession (std::string modelFile, + GstOnnxOptimizationLevel optim, GstOnnxExecutionProvider provider) +{ + if (session) + return true; + + GraphOptimizationLevel onnx_optim; + switch (optim) { + case GST_ONNX_OPTIMIZATION_LEVEL_DISABLE_ALL: + onnx_optim = GraphOptimizationLevel::ORT_DISABLE_ALL; + break; + case GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_BASIC: + onnx_optim = GraphOptimizationLevel::ORT_ENABLE_BASIC; + break; + case GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_EXTENDED: + onnx_optim = GraphOptimizationLevel::ORT_ENABLE_EXTENDED; + break; + case GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_ALL: + onnx_optim = GraphOptimizationLevel::ORT_ENABLE_ALL; + break; + default: + onnx_optim = GraphOptimizationLevel::ORT_ENABLE_EXTENDED; + break; + }; + + Ort::SessionOptions sessionOptions; + // for debugging + //sessionOptions.SetIntraOpNumThreads (1); + sessionOptions.SetGraphOptimizationLevel (onnx_optim); + m_provider = provider; + switch (m_provider) { + case GST_ONNX_EXECUTION_PROVIDER_CUDA: +#ifdef GST_ML_ONNX_RUNTIME_HAVE_CUDA + Ort::ThrowOnError (OrtSessionOptionsAppendExecutionProvider_CUDA + (sessionOptions, 0)); +#else + return false; +#endif + break; + default: + break; + + }; + session = new Ort::Session (getEnv (), modelFile.c_str (), sessionOptions); + auto inputTypeInfo = session->GetInputTypeInfo (0); + std::vector < int64_t > inputDims = + inputTypeInfo.GetTensorTypeAndShapeInfo ().GetShape (); + if (inputImageFormat == GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC) { + height = inputDims[1]; + width = inputDims[2]; + channels = inputDims[3]; + } else { + channels = inputDims[1]; + height = inputDims[2]; + width = inputDims[3]; + } + + fixedInputImageSize = width > 0 && height > 0; + GST_DEBUG ("Number of Output Nodes: %d", (gint) session->GetOutputCount ()); + + Ort::AllocatorWithDefaultOptions allocator; + GST_DEBUG ("Input name: %s", session->GetInputName (0, allocator)); + + for (size_t i = 0; i < session->GetOutputCount (); ++i) { + 
auto output_name = session->GetOutputName (i, allocator); + outputNames.push_back (output_name); + auto type_info = session->GetOutputTypeInfo (i); + auto tensor_info = type_info.GetTensorTypeAndShapeInfo (); + + if (i < GST_ML_OUTPUT_NODE_NUMBER_OF) { + auto function = outputNodeIndexToFunction[i]; + outputNodeInfo[function].type = tensor_info.GetElementType (); + } + } + + return true; +} + +std::vector < GstMlBoundingBox > GstOnnxClient::run (uint8_t * img_data, + GstVideoMeta * vmeta, std::string labelPath, float scoreThreshold) +{ + auto type = getOutputNodeType (GST_ML_OUTPUT_NODE_FUNCTION_CLASS); + return (type == + ONNXTensorElementDataType::ONNX_TENSOR_ELEMENT_DATA_TYPE_FLOAT) ? + doRun < float >(img_data, vmeta, labelPath, scoreThreshold) + : doRun < int >(img_data, vmeta, labelPath, scoreThreshold); +} + +void GstOnnxClient::parseDimensions (GstVideoMeta * vmeta) +{ + int32_t newWidth = fixedInputImageSize ? width : vmeta->width; + int32_t newHeight = fixedInputImageSize ? height : vmeta->height; + + if (!dest || width * height < newWidth * newHeight) { + delete[] dest; + dest = new uint8_t[newWidth * newHeight * channels]; + } + width = newWidth; + height = newHeight; +} + +template < typename T > std::vector < GstMlBoundingBox > + GstOnnxClient::doRun (uint8_t * img_data, GstVideoMeta * vmeta, + std::string labelPath, float scoreThreshold) +{ + std::vector < GstMlBoundingBox > boundingBoxes; + if (!img_data) + return boundingBoxes; + + parseDimensions (vmeta); + + Ort::AllocatorWithDefaultOptions allocator; + auto inputName = session->GetInputName (0, allocator); + auto inputTypeInfo = session->GetInputTypeInfo (0); + std::vector < int64_t > inputDims = + inputTypeInfo.GetTensorTypeAndShapeInfo ().GetShape (); + inputDims[0] = 1; + if (inputImageFormat == GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC) { + inputDims[1] = height; + inputDims[2] = width; + } else { + inputDims[2] = height; + inputDims[3] = width; + } + + std::ostringstream buffer; + buffer << 
inputDims; + GST_DEBUG ("Input dimensions: %s", buffer.str ().c_str ()); + + // copy video frame + uint8_t *srcPtr[3] = { img_data, img_data + 1, img_data + 2 }; + uint32_t srcSamplesPerPixel = 3; + switch (vmeta->format) { + case GST_VIDEO_FORMAT_RGBA: + srcSamplesPerPixel = 4; + break; + case GST_VIDEO_FORMAT_BGRA: + srcSamplesPerPixel = 4; + srcPtr[0] = img_data + 2; + srcPtr[1] = img_data + 1; + srcPtr[2] = img_data + 0; + break; + case GST_VIDEO_FORMAT_ARGB: + srcSamplesPerPixel = 4; + srcPtr[0] = img_data + 1; + srcPtr[1] = img_data + 2; + srcPtr[2] = img_data + 3; + break; + case GST_VIDEO_FORMAT_ABGR: + srcSamplesPerPixel = 4; + srcPtr[0] = img_data + 3; + srcPtr[1] = img_data + 2; + srcPtr[2] = img_data + 1; + break; + case GST_VIDEO_FORMAT_BGR: + srcPtr[0] = img_data + 2; + srcPtr[1] = img_data + 1; + srcPtr[2] = img_data + 0; + break; + default: + break; + } + size_t destIndex = 0; + uint32_t stride = vmeta->stride[0]; + if (inputImageFormat == GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC) { + for (int32_t j = 0; j < height; ++j) { + for (int32_t i = 0; i < width; ++i) { + for (int32_t k = 0; k < channels; ++k) { + dest[destIndex++] = *srcPtr[k]; + srcPtr[k] += srcSamplesPerPixel; + } + } + // correct for stride + for (uint32_t k = 0; k < 3; ++k) + srcPtr[k] += stride - srcSamplesPerPixel * width; + } + } else { + size_t frameSize = width * height; + uint8_t *destPtr[3] = { dest, dest + frameSize, dest + 2 * frameSize }; + for (int32_t j = 0; j < height; ++j) { + for (int32_t i = 0; i < width; ++i) { + for (int32_t k = 0; k < channels; ++k) { + destPtr[k][destIndex] = *srcPtr[k]; + srcPtr[k] += srcSamplesPerPixel; + } + destIndex++; + } + // correct for stride + for (uint32_t k = 0; k < 3; ++k) + srcPtr[k] += stride - srcSamplesPerPixel * width; + } + } + + const size_t inputTensorSize = width * height * channels; + auto memoryInfo = + Ort::MemoryInfo::CreateCpu (OrtAllocatorType::OrtArenaAllocator, + OrtMemType::OrtMemTypeDefault); + std::vector < Ort::Value > 
inputTensors; + inputTensors.push_back (Ort::Value::CreateTensor < uint8_t > (memoryInfo, + dest, inputTensorSize, inputDims.data (), inputDims.size ())); + std::vector < const char *>inputNames { inputName }; + + std::vector < Ort::Value > modelOutput = session->Run (Ort::RunOptions { nullptr}, + inputNames.data (), + inputTensors.data (), 1, outputNames.data (), outputNames.size ()); + + auto numDetections = + modelOutput[getOutputNodeIndex + (GST_ML_OUTPUT_NODE_FUNCTION_DETECTION)].GetTensorMutableData < float >(); + auto bboxes = + modelOutput[getOutputNodeIndex + (GST_ML_OUTPUT_NODE_FUNCTION_BOUNDING_BOX)].GetTensorMutableData < float >(); + auto scores = + modelOutput[getOutputNodeIndex + (GST_ML_OUTPUT_NODE_FUNCTION_SCORE)].GetTensorMutableData < float >(); + T *labelIndex = nullptr; + if (getOutputNodeIndex (GST_ML_OUTPUT_NODE_FUNCTION_CLASS) != + GST_ML_NODE_INDEX_DISABLED) { + labelIndex = + modelOutput[getOutputNodeIndex + (GST_ML_OUTPUT_NODE_FUNCTION_CLASS)].GetTensorMutableData < T > (); + } + if (labels.empty () && !labelPath.empty ()) + labels = ReadLabels (labelPath); + + for (int i = 0; i < numDetections[0]; ++i) { + if (scores[i] > scoreThreshold) { + std::string label = ""; + + if (labelIndex && !labels.empty ()) + label = labels[labelIndex[i] - 1]; + auto score = scores[i]; + auto y0 = bboxes[i * 4] * height; + auto x0 = bboxes[i * 4 + 1] * width; + auto bheight = bboxes[i * 4 + 2] * height - y0; + auto bwidth = bboxes[i * 4 + 3] * width - x0; + boundingBoxes.push_back (GstMlBoundingBox (label, score, x0, y0, bwidth, + bheight)); + } + } + return boundingBoxes; +} + +std::vector < std::string > + GstOnnxClient::ReadLabels (const std::string & labelsFile) +{ + std::vector < std::string > labels; + std::string line; + std::ifstream fp (labelsFile); + while (std::getline (fp, line)) + labels.push_back (line); + + return labels; + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnxclient.h
Added
@@ -0,0 +1,117 @@ +/* + * GStreamer gstreamer-onnxclient + * Copyright (C) 2021 Collabora Ltd + * + * gstonnxclient.h + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifndef __GST_ONNX_CLIENT_H__ +#define __GST_ONNX_CLIENT_H__ + +#include <gst/gst.h> +#include <onnxruntime_cxx_api.h> +#include <gst/video/video.h> +#include "gstonnxelement.h" +#include <string> +#include <vector> + +namespace GstOnnxNamespace { + enum GstMlOutputNodeFunction { + GST_ML_OUTPUT_NODE_FUNCTION_DETECTION, + GST_ML_OUTPUT_NODE_FUNCTION_BOUNDING_BOX, + GST_ML_OUTPUT_NODE_FUNCTION_SCORE, + GST_ML_OUTPUT_NODE_FUNCTION_CLASS, + GST_ML_OUTPUT_NODE_NUMBER_OF, + }; + + const gint GST_ML_NODE_INDEX_DISABLED = -1; + + struct GstMlOutputNodeInfo { + GstMlOutputNodeInfo(void); + gint index; + ONNXTensorElementDataType type; + }; + + struct GstMlBoundingBox { + GstMlBoundingBox(std::string lbl, + float score, + float _x0, + float _y0, + float _width, + float _height):label(lbl), + score(score), x0(_x0), y0(_y0), width(_width), height(_height) { + } + GstMlBoundingBox():GstMlBoundingBox("", 0.0f, 0.0f, 0.0f, 0.0f, 0.0f) { + } + std::string label; + float score; + float x0; + float y0; + float width; + float height; + }; + + class GstOnnxClient { + public: + GstOnnxClient(void); + 
~GstOnnxClient(void); + bool createSession(std::string modelFile, GstOnnxOptimizationLevel optim, + GstOnnxExecutionProvider provider); + bool hasSession(void); + void setInputImageFormat(GstMlModelInputImageFormat format); + GstMlModelInputImageFormat getInputImageFormat(void); + void setOutputNodeIndex(GstMlOutputNodeFunction nodeType, gint index); + gint getOutputNodeIndex(GstMlOutputNodeFunction nodeType); + void setOutputNodeType(GstMlOutputNodeFunction nodeType, + ONNXTensorElementDataType type); + ONNXTensorElementDataType getOutputNodeType(GstMlOutputNodeFunction type); + std::string getOutputNodeName(GstMlOutputNodeFunction nodeType); + std::vector < GstMlBoundingBox > run(uint8_t * img_data, + GstVideoMeta * vmeta, + std::string labelPath, + float scoreThreshold); + std::vector < GstMlBoundingBox > &getBoundingBoxes(void); + std::vector < const char *>getOutputNodeNames(void); + bool isFixedInputImageSize(void); + int32_t getWidth(void); + int32_t getHeight(void); + private: + void parseDimensions(GstVideoMeta * vmeta); + template < typename T > std::vector < GstMlBoundingBox > + doRun(uint8_t * img_data, GstVideoMeta * vmeta, std::string labelPath, + float scoreThreshold); + std::vector < std::string > ReadLabels(const std::string & labelsFile); + Ort::Env & getEnv(void); + Ort::Session * session; + int32_t width; + int32_t height; + int32_t channels; + uint8_t *dest; + GstOnnxExecutionProvider m_provider; + std::vector < Ort::Value > modelOutput; + std::vector < std::string > labels; + // !! indexed by function + GstMlOutputNodeInfo outputNodeInfo[GST_ML_OUTPUT_NODE_NUMBER_OF]; + // !! indexed by array index + size_t outputNodeIndexToFunction[GST_ML_OUTPUT_NODE_NUMBER_OF]; + std::vector < const char *>outputNames; + GstMlModelInputImageFormat inputImageFormat; + bool fixedInputImageSize; + }; +} + +#endif /* __GST_ONNX_CLIENT_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnxelement.c
Added
@@ -0,0 +1,104 @@ +/* + * GStreamer gstreamer-onnxelement + * Copyright (C) 2021 Collabora Ltd + * + * gstonnxelement.c + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstonnxelement.h" + +GType +gst_onnx_optimization_level_get_type (void) +{ + static GType onnx_optimization_type = 0; + + if (g_once_init_enter (&onnx_optimization_type)) { + static GEnumValue optimization_level_types[] = { + {GST_ONNX_OPTIMIZATION_LEVEL_DISABLE_ALL, "Disable all optimization", + "disable-all"}, + {GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_BASIC, + "Enable basic optimizations (redundant node removals))", + "enable-basic"}, + {GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_EXTENDED, + "Enable extended optimizations (redundant node removals + node fusions)", + "enable-extended"}, + {GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_ALL, + "Enable all possible optimizations", "enable-all"}, + {0, NULL, NULL}, + }; + + GType temp = g_enum_register_static ("GstOnnxOptimizationLevel", + optimization_level_types); + + g_once_init_leave (&onnx_optimization_type, temp); + } + + return onnx_optimization_type; +} + +GType +gst_onnx_execution_provider_get_type (void) +{ + static GType onnx_execution_type = 0; + + if (g_once_init_enter (&onnx_execution_type)) { + 
static GEnumValue execution_provider_types[] = { + {GST_ONNX_EXECUTION_PROVIDER_CPU, "CPU execution provider", + "cpu"}, + {GST_ONNX_EXECUTION_PROVIDER_CUDA, + "CUDA execution provider", + "cuda"}, + {0, NULL, NULL}, + }; + + GType temp = g_enum_register_static ("GstOnnxExecutionProvider", + execution_provider_types); + + g_once_init_leave (&onnx_execution_type, temp); + } + + return onnx_execution_type; +} + +GType +gst_ml_model_input_image_format_get_type (void) +{ + static GType ml_model_input_image_format = 0; + + if (g_once_init_enter (&ml_model_input_image_format)) { + static GEnumValue ml_model_input_image_format_types[] = { + {GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC, + "Height Width Channel (HWC) a.k.a. interleaved image data format", + "hwc"}, + {GST_ML_MODEL_INPUT_IMAGE_FORMAT_CHW, + "Channel Height Width (CHW) a.k.a. planar image data format", + "chw"}, + {0, NULL, NULL}, + }; + + GType temp = g_enum_register_static ("GstMlModelInputImageFormat", + ml_model_input_image_format_types); + + g_once_init_leave (&ml_model_input_image_format, temp); + } + + return ml_model_input_image_format; +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnxelement.h
Added
@@ -0,0 +1,64 @@ +/* + * GStreamer gstreamer-onnxelement + * Copyright (C) 2021 Collabora Ltd + * + * gstonnxelement.h + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ONNX_ELEMENT_H__ +#define __GST_ONNX_ELEMENT_H__ + +#include <gst/gst.h> + +typedef enum +{ + GST_ONNX_OPTIMIZATION_LEVEL_DISABLE_ALL, + GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_BASIC, + GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_EXTENDED, + GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_ALL, +} GstOnnxOptimizationLevel; + +typedef enum +{ + GST_ONNX_EXECUTION_PROVIDER_CPU, + GST_ONNX_EXECUTION_PROVIDER_CUDA, +} GstOnnxExecutionProvider; + +typedef enum { + /* Height Width Channel (a.k.a. interleaved) format */ + GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC, + + /* Channel Height Width (a.k.a. 
planar) format */ + GST_ML_MODEL_INPUT_IMAGE_FORMAT_CHW, +} GstMlModelInputImageFormat; + + +G_BEGIN_DECLS + +GType gst_onnx_optimization_level_get_type (void); +#define GST_TYPE_ONNX_OPTIMIZATION_LEVEL (gst_onnx_optimization_level_get_type ()) + +GType gst_onnx_execution_provider_get_type (void); +#define GST_TYPE_ONNX_EXECUTION_PROVIDER (gst_onnx_execution_provider_get_type ()) + +GType gst_ml_model_input_image_format_get_type (void); +#define GST_TYPE_ML_MODEL_INPUT_IMAGE_FORMAT (gst_ml_model_input_image_format_get_type ()) + +G_END_DECLS + +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnxobjectdetector.cpp
Added
@@ -0,0 +1,670 @@ +/* + * GStreamer gstreamer-onnxobjectdetector + * Copyright (C) 2021 Collabora Ltd. + * + * gstonnxobjectdetector.c + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-onnxobjectdetector + * @short_description: Detect objects in video frame + * + * This element can apply a generic ONNX object detection model such as YOLO or SSD + * to each video frame. + * + * To install ONNX on your system, recursively clone this repository + * https://github.com/microsoft/onnxruntime.git + * + * and build and install with cmake: + * + * CPU: + * + * cmake -Donnxruntime_BUILD_SHARED_LIB:ON -DBUILD_TESTING:OFF \ + * $SRC_DIR/onnxruntime/cmake && make -j8 && sudo make install + * + * + * GPU : + * + * cmake -Donnxruntime_BUILD_SHARED_LIB:ON -DBUILD_TESTING:OFF -Donnxruntime_USE_CUDA:ON \ + * -Donnxruntime_CUDA_HOME=$CUDA_PATH -Donnxruntime_CUDNN_HOME=$CUDA_PATH \ + * $SRC_DIR/onnxruntime/cmake && make -j8 && sudo make install + * + * + * where : + * + * 1. $SRC_DIR and $BUILD_DIR are local source and build directories + * 2. To run with CUDA, both CUDA and cuDNN libraries must be installed. + * $CUDA_PATH is an environment variable set to the CUDA root path. 
+ * On Linux, it would be /usr/local/cuda-XX.X where XX.X is the installed version of CUDA. + * + * + * ## Example launch command: + * + * (note: an object detection model has 3 or 4 output nodes, but there is no naming convention + * to indicate which node outputs the bounding box, which node outputs the label, etc. + * So, the `onnxobjectdetector` element has properties to map each node's functionality to its + * respective node index in the specified model ) + * + * ``` + * GST_DEBUG=objectdetector:5 gst-launch-1.0 multifilesrc \ + * location=000000088462.jpg caps=image/jpeg,framerate=\(fraction\)30/1 ! jpegdec ! \ + * videoconvert ! \ + * onnxobjectdetector \ + * box-node-index=0 \ + * class-node-index=1 \ + * score-node-index=2 \ + * detection-node-index=3 \ + * execution-provider=cpu \ + * model-file=model.onnx \ + * label-file=COCO_classes.txt ! \ + * videoconvert ! \ + * autovideosink + * ``` + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstonnxobjectdetector.h" +#include "gstonnxclient.h" + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/video/gstvideometa.h> +#include <stdlib.h> +#include <string.h> +#include <glib.h> + +GST_DEBUG_CATEGORY_STATIC (onnx_object_detector_debug); +#define GST_CAT_DEFAULT onnx_object_detector_debug +#define GST_ONNX_MEMBER( self ) ((GstOnnxNamespace::GstOnnxClient *) (self->onnx_ptr)) +GST_ELEMENT_REGISTER_DEFINE (onnx_object_detector, "onnxobjectdetector", + GST_RANK_PRIMARY, GST_TYPE_ONNX_OBJECT_DETECTOR); + +/* GstOnnxObjectDetector properties */ +enum +{ + PROP_0, + PROP_MODEL_FILE, + PROP_LABEL_FILE, + PROP_SCORE_THRESHOLD, + PROP_DETECTION_NODE_INDEX, + PROP_BOUNDING_BOX_NODE_INDEX, + PROP_SCORE_NODE_INDEX, + PROP_CLASS_NODE_INDEX, + PROP_INPUT_IMAGE_FORMAT, + PROP_OPTIMIZATION_LEVEL, + PROP_EXECUTION_PROVIDER +}; + + +#define GST_ONNX_OBJECT_DETECTOR_DEFAULT_EXECUTION_PROVIDER GST_ONNX_EXECUTION_PROVIDER_CPU +#define GST_ONNX_OBJECT_DETECTOR_DEFAULT_OPTIMIZATION_LEVEL 
GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_EXTENDED +#define GST_ONNX_OBJECT_DETECTOR_DEFAULT_SCORE_THRESHOLD 0.3f /* 0 to 1 */ + +static GstStaticPadTemplate gst_onnx_object_detector_src_template = +GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ RGB,RGBA,BGR,BGRA }")) + ); + +static GstStaticPadTemplate gst_onnx_object_detector_sink_template = +GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ RGB,RGBA,BGR,BGRA }")) + ); + +static void gst_onnx_object_detector_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_onnx_object_detector_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_onnx_object_detector_finalize (GObject * object); +static GstFlowReturn gst_onnx_object_detector_transform_ip (GstBaseTransform * + trans, GstBuffer * buf); +static gboolean gst_onnx_object_detector_process (GstBaseTransform * trans, + GstBuffer * buf); +static gboolean gst_onnx_object_detector_create_session (GstBaseTransform * trans); +static GstCaps *gst_onnx_object_detector_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter_caps); + +G_DEFINE_TYPE (GstOnnxObjectDetector, gst_onnx_object_detector, + GST_TYPE_BASE_TRANSFORM); + +static void +gst_onnx_object_detector_class_init (GstOnnxObjectDetectorClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + GstElementClass *element_class = (GstElementClass *) klass; + GstBaseTransformClass *basetransform_class = (GstBaseTransformClass *) klass; + + GST_DEBUG_CATEGORY_INIT (onnx_object_detector_debug, "onnxobjectdetector", + 0, "onnx_objectdetector"); + gobject_class->set_property = gst_onnx_object_detector_set_property; + gobject_class->get_property = gst_onnx_object_detector_get_property; + gobject_class->finalize = 
gst_onnx_object_detector_finalize; + + /** + * GstOnnxObjectDetector:model-file + * + * ONNX model file + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), PROP_MODEL_FILE, + g_param_spec_string ("model-file", + "ONNX model file", "ONNX model file", NULL, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstOnnxObjectDetector:label-file + * + * Label file for ONNX model + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), PROP_LABEL_FILE, + g_param_spec_string ("label-file", + "Label file", "Label file associated with model", NULL, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + + /** + * GstOnnxObjectDetector:detection-node-index + * + * Index of model detection node + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_DETECTION_NODE_INDEX, + g_param_spec_int ("detection-node-index", + "Detection node index", + "Index of neural network output node corresponding to number of detected objects", + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, + GstOnnxNamespace::GST_ML_OUTPUT_NODE_NUMBER_OF-1, + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + + /** + * GstOnnxObjectDetector:bounding-box-node-index + * + * Index of model bounding box node + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_BOUNDING_BOX_NODE_INDEX, + g_param_spec_int ("box-node-index", + "Bounding box node index", + "Index of neural network output node corresponding to bounding box", + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, + GstOnnxNamespace::GST_ML_OUTPUT_NODE_NUMBER_OF-1, + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstOnnxObjectDetector:score-node-index + * + * Index of model score node + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS 
(klass), + PROP_SCORE_NODE_INDEX, + g_param_spec_int ("score-node-index", + "Score node index", + "Index of neural network output node corresponding to score", + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, + GstOnnxNamespace::GST_ML_OUTPUT_NODE_NUMBER_OF-1, + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstOnnxObjectDetector:class-node-index + * + * Index of model class (label) node + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_CLASS_NODE_INDEX, + g_param_spec_int ("class-node-index", + "Class node index", + "Index of neural network output node corresponding to class (label)", + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, + GstOnnxNamespace::GST_ML_OUTPUT_NODE_NUMBER_OF-1, + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + + /** + * GstOnnxObjectDetector:score-threshold + * + * Threshold for deciding when to remove boxes based on score + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), PROP_SCORE_THRESHOLD, + g_param_spec_float ("score-threshold", + "Score threshold", + "Threshold for deciding when to remove boxes based on score", + 0.0, 1.0, + GST_ONNX_OBJECT_DETECTOR_DEFAULT_SCORE_THRESHOLD, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstOnnxObjectDetector:input-image-format + * + * Model input image format + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_INPUT_IMAGE_FORMAT, + g_param_spec_enum ("input-image-format", + "Input image format", + "Input image format", + GST_TYPE_ML_MODEL_INPUT_IMAGE_FORMAT, + GST_ML_MODEL_INPUT_IMAGE_FORMAT_HWC, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstOnnxObjectDetector:optimization-level + * + * ONNX optimization level + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), 
+ PROP_OPTIMIZATION_LEVEL, + g_param_spec_enum ("optimization-level", + "Optimization level", + "ONNX optimization level", + GST_TYPE_ONNX_OPTIMIZATION_LEVEL, + GST_ONNX_OPTIMIZATION_LEVEL_ENABLE_EXTENDED, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstOnnxObjectDetector:execution-provider + * + * ONNX execution provider + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_EXECUTION_PROVIDER, + g_param_spec_enum ("execution-provider", + "Execution provider", + "ONNX execution provider", + GST_TYPE_ONNX_EXECUTION_PROVIDER, + GST_ONNX_EXECUTION_PROVIDER_CPU, (GParamFlags) + (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_element_class_set_static_metadata (element_class, "onnxobjectdetector", + "Filter/Effect/Video", + "Apply neural network to detect objects in video frames", + "Aaron Boxer <aaron.boxer@collabora.com>, Marcus Edel <marcus.edel@collabora.com>"); + gst_element_class_add_pad_template (element_class, + gst_static_pad_template_get (&gst_onnx_object_detector_sink_template)); + gst_element_class_add_pad_template (element_class, + gst_static_pad_template_get (&gst_onnx_object_detector_src_template)); + basetransform_class->transform_ip = + GST_DEBUG_FUNCPTR (gst_onnx_object_detector_transform_ip); + basetransform_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_onnx_object_detector_transform_caps); +} + +static void +gst_onnx_object_detector_init (GstOnnxObjectDetector * self) +{ + self->onnx_ptr = new GstOnnxNamespace::GstOnnxClient (); + self->onnx_disabled = false; +} + +static void +gst_onnx_object_detector_finalize (GObject * object) +{ + GstOnnxObjectDetector *self = GST_ONNX_OBJECT_DETECTOR (object); + + g_free (self->model_file); + delete GST_ONNX_MEMBER (self); + G_OBJECT_CLASS (gst_onnx_object_detector_parent_class)->finalize (object); +} + +static void +gst_onnx_object_detector_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + 
GstOnnxObjectDetector *self = GST_ONNX_OBJECT_DETECTOR (object); + const gchar *filename; + auto onnxClient = GST_ONNX_MEMBER (self); + + switch (prop_id) { + case PROP_MODEL_FILE: + filename = g_value_get_string (value); + if (filename + && g_file_test (filename, + (GFileTest) (G_FILE_TEST_EXISTS | G_FILE_TEST_IS_REGULAR))) { + if (self->model_file) + g_free (self->model_file); + self->model_file = g_strdup (filename); + } else { + GST_WARNING_OBJECT (self, "Model file '%s' not found!", filename); + gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (self), TRUE); + } + break; + case PROP_LABEL_FILE: + filename = g_value_get_string (value); + if (filename + && g_file_test (filename, + (GFileTest) (G_FILE_TEST_EXISTS | G_FILE_TEST_IS_REGULAR))) { + if (self->label_file) + g_free (self->label_file); + self->label_file = g_strdup (filename); + } else { + GST_WARNING_OBJECT (self, "Label file '%s' not found!", filename); + } + break; + case PROP_SCORE_THRESHOLD: + GST_OBJECT_LOCK (self); + self->score_threshold = g_value_get_float (value); + GST_OBJECT_UNLOCK (self); + break; + case PROP_OPTIMIZATION_LEVEL: + self->optimization_level = + (GstOnnxOptimizationLevel) g_value_get_enum (value); + break; + case PROP_EXECUTION_PROVIDER: + self->execution_provider = + (GstOnnxExecutionProvider) g_value_get_enum (value); + break; + case PROP_DETECTION_NODE_INDEX: + onnxClient->setOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_DETECTION, + g_value_get_int (value)); + break; + case PROP_BOUNDING_BOX_NODE_INDEX: + onnxClient->setOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_BOUNDING_BOX, + g_value_get_int (value)); + break; + case PROP_SCORE_NODE_INDEX: + onnxClient->setOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_SCORE, + g_value_get_int (value)); + break; + case PROP_CLASS_NODE_INDEX: + onnxClient->setOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_CLASS, + g_value_get_int (value)); 
+ break; + case PROP_INPUT_IMAGE_FORMAT: + onnxClient->setInputImageFormat ((GstMlModelInputImageFormat) + g_value_get_enum (value)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_onnx_object_detector_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstOnnxObjectDetector *self = GST_ONNX_OBJECT_DETECTOR (object); + auto onnxClient = GST_ONNX_MEMBER (self); + + switch (prop_id) { + case PROP_MODEL_FILE: + g_value_set_string (value, self->model_file); + break; + case PROP_LABEL_FILE: + g_value_set_string (value, self->label_file); + break; + case PROP_SCORE_THRESHOLD: + GST_OBJECT_LOCK (self); + g_value_set_float (value, self->score_threshold); + GST_OBJECT_UNLOCK (self); + break; + case PROP_OPTIMIZATION_LEVEL: + g_value_set_enum (value, self->optimization_level); + break; + case PROP_EXECUTION_PROVIDER: + g_value_set_enum (value, self->execution_provider); + break; + case PROP_DETECTION_NODE_INDEX: + g_value_set_int (value, + onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_DETECTION)); + break; + case PROP_BOUNDING_BOX_NODE_INDEX: + g_value_set_int (value, + onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_BOUNDING_BOX)); + break; + case PROP_SCORE_NODE_INDEX: + g_value_set_int (value, + onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_SCORE)); + break; + case PROP_CLASS_NODE_INDEX: + g_value_set_int (value, + onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_CLASS)); + break; + case PROP_INPUT_IMAGE_FORMAT: + g_value_set_enum (value, onnxClient->getInputImageFormat ()); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean +gst_onnx_object_detector_create_session (GstBaseTransform * trans) +{ + GstOnnxObjectDetector *self = GST_ONNX_OBJECT_DETECTOR 
(trans); + auto onnxClient = GST_ONNX_MEMBER (self); + + GST_OBJECT_LOCK (self); + if (self->onnx_disabled || onnxClient->hasSession ()) { + GST_OBJECT_UNLOCK (self); + + return TRUE; + } + if (self->model_file) { + gboolean ret = GST_ONNX_MEMBER (self)->createSession (self->model_file, + self->optimization_level, + self->execution_provider); + if (!ret) { + GST_ERROR_OBJECT (self, + "Unable to create ONNX session. Detection disabled."); + } else { + auto outputNames = onnxClient->getOutputNodeNames (); + + for (size_t i = 0; i < outputNames.size (); ++i) + GST_INFO_OBJECT (self, "Output node index: %d for node: %s", (gint) i, + outputNames[i]); + if (outputNames.size () < 3) { + GST_ERROR_OBJECT (self, + "Number of output tensor nodes %d does not match the 3 or 4 nodes " + "required for an object detection model. Detection is disabled.", + (gint) outputNames.size ()); + self->onnx_disabled = TRUE; + } + // sanity check on output node indices + if (onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_DETECTION) == + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED) { + GST_ERROR_OBJECT (self, + "Output detection node index not set. Detection disabled."); + self->onnx_disabled = TRUE; + } + if (onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_BOUNDING_BOX) == + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED) { + GST_ERROR_OBJECT (self, + "Output bounding box node index not set. Detection disabled."); + self->onnx_disabled = TRUE; + } + if (onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_SCORE) == + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED) { + GST_ERROR_OBJECT (self, + "Output score node index not set. 
Detection disabled."); + self->onnx_disabled = TRUE; + } + if (outputNames.size () == 4 && onnxClient->getOutputNodeIndex + (GstOnnxNamespace::GST_ML_OUTPUT_NODE_FUNCTION_CLASS) == + GstOnnxNamespace::GST_ML_NODE_INDEX_DISABLED) { + GST_ERROR_OBJECT (self, + "Output class node index not set. Detection disabled."); + self->onnx_disabled = TRUE; + } + // model is not usable, so fail + if (self->onnx_disabled) { + GST_ELEMENT_WARNING (self, RESOURCE, FAILED, + ("ONNX model cannot be used for object detection"), (NULL)); + + return FALSE; + } + } + } else { + self->onnx_disabled = TRUE; + } + GST_OBJECT_UNLOCK (self); + if (self->onnx_disabled){ + gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (self), TRUE); + } + + return TRUE; +} + + +static GstCaps * +gst_onnx_object_detector_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter_caps) +{ + GstOnnxObjectDetector *self = GST_ONNX_OBJECT_DETECTOR (trans); + auto onnxClient = GST_ONNX_MEMBER (self); + GstCaps *other_caps; + guint i; + + if ( !gst_onnx_object_detector_create_session (trans) ) + return NULL; + GST_LOG_OBJECT (self, "transforming caps %" GST_PTR_FORMAT, caps); + + if (gst_base_transform_is_passthrough (trans) + || (!onnxClient->isFixedInputImageSize ())) + return gst_caps_ref (caps); + + other_caps = gst_caps_new_empty (); + for (i = 0; i < gst_caps_get_size (caps); ++i) { + GstStructure *structure, *new_structure; + + structure = gst_caps_get_structure (caps, i); + new_structure = gst_structure_copy (structure); + gst_structure_set (new_structure, "width", G_TYPE_INT, + onnxClient->getWidth (), "height", G_TYPE_INT, + onnxClient->getHeight (), NULL); + GST_LOG_OBJECT (self, + "transformed structure %2d: %" GST_PTR_FORMAT " => %" + GST_PTR_FORMAT, i, structure, new_structure); + gst_caps_append_structure (other_caps, new_structure); + } + + if (!gst_caps_is_empty (other_caps) && filter_caps) { + GstCaps *tmp = gst_caps_intersect_full 
(other_caps, filter_caps, + GST_CAPS_INTERSECT_FIRST); + gst_caps_replace (&other_caps, tmp); + gst_caps_unref (tmp); + } + + return other_caps; +} + + +static GstFlowReturn +gst_onnx_object_detector_transform_ip (GstBaseTransform * trans, + GstBuffer * buf) +{ + if (!gst_base_transform_is_passthrough (trans) + && !gst_onnx_object_detector_process (trans, buf)) { + GST_ELEMENT_WARNING (trans, STREAM, FAILED, + ("ONNX object detection failed"), (NULL)); + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static gboolean +gst_onnx_object_detector_process (GstBaseTransform * trans, GstBuffer * buf) +{ + GstMapInfo info; + GstVideoMeta *vmeta = gst_buffer_get_video_meta (buf); + + if (!vmeta) { + GST_WARNING_OBJECT (trans, "missing video meta"); + return FALSE; + } + if (gst_buffer_map (buf, &info, GST_MAP_READ)) { + GstOnnxObjectDetector *self = GST_ONNX_OBJECT_DETECTOR (trans); + auto boxes = GST_ONNX_MEMBER (self)->run (info.data, vmeta, + self->label_file ? self->label_file : "", + self->score_threshold); + for (auto & b:boxes) { + auto vroi_meta = gst_buffer_add_video_region_of_interest_meta (buf, + GST_ONNX_OBJECT_DETECTOR_META_NAME, + b.x0, b.y0, + b.width, + b.height); + if (!vroi_meta) { + GST_WARNING_OBJECT (trans, + "Unable to attach GstVideoRegionOfInterestMeta to buffer"); + gst_buffer_unmap (buf, &info); + return FALSE; + } + auto s = gst_structure_new (GST_ONNX_OBJECT_DETECTOR_META_PARAM_NAME, + GST_ONNX_OBJECT_DETECTOR_META_FIELD_LABEL, + G_TYPE_STRING, + b.label.c_str (), + GST_ONNX_OBJECT_DETECTOR_META_FIELD_SCORE, + G_TYPE_DOUBLE, + b.score, + NULL); + gst_video_region_of_interest_meta_add_param (vroi_meta, s); + GST_DEBUG_OBJECT (self, + "Object detected with label: %s, score: %f, bounding box: (%f,%f,%f,%f)", + b.label.c_str (), b.score, b.x0, b.y0, + b.x0 + b.width, b.y0 + b.height); + } + gst_buffer_unmap (buf, &info); + } + + return TRUE; +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/gstonnxobjectdetector.h
Added
@@ -0,0 +1,93 @@ +/* + * GStreamer gstreamer-onnxobjectdetector + * Copyright (C) 2021 Collabora Ltd + * + * gstonnxobjectdetector.h + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ONNX_OBJECT_DETECTOR_H__ +#define __GST_ONNX_OBJECT_DETECTOR_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/video/gstvideofilter.h> +#include "gstonnxelement.h" + +G_BEGIN_DECLS + +#define GST_TYPE_ONNX_OBJECT_DETECTOR (gst_onnx_object_detector_get_type()) +G_DECLARE_FINAL_TYPE (GstOnnxObjectDetector, gst_onnx_object_detector, GST, ONNX_OBJECT_DETECTOR, GstBaseTransform) +#define GST_ONNX_OBJECT_DETECTOR(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_ONNX_OBJECT_DETECTOR,GstOnnxObjectDetector)) +#define GST_ONNX_OBJECT_DETECTOR_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_ONNX_OBJECT_DETECTOR,GstOnnxObjectDetectorClass)) +#define GST_ONNX_OBJECT_DETECTOR_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_ONNX_OBJECT_DETECTOR,GstOnnxObjectDetectorClass)) +#define GST_IS_ONNX_OBJECT_DETECTOR(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_ONNX_OBJECT_DETECTOR)) +#define GST_IS_ONNX_OBJECT_DETECTOR_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_ONNX_OBJECT_DETECTOR)) + +#define GST_ONNX_OBJECT_DETECTOR_META_NAME 
"onnx-object_detector" +#define GST_ONNX_OBJECT_DETECTOR_META_PARAM_NAME "extra-data" +#define GST_ONNX_OBJECT_DETECTOR_META_FIELD_LABEL "label" +#define GST_ONNX_OBJECT_DETECTOR_META_FIELD_SCORE "score" + +/** + * GstOnnxObjectDetector: + * + * @model_file: model file + * @label_file: label file + * @score_threshold: score threshold + * @confidence_threshold: confidence threshold + * @iou_threshold: iou threshold + * @optimization_level: ONNX optimization level + * @execution_provider: ONNX execution provider + * @onnx_ptr: opaque pointer to ONNX implementation + * + * Since: 1.20 + */ +struct _GstOnnxObjectDetector +{ + GstBaseTransform basetransform; + gchar *model_file; + gchar *label_file; + gfloat score_threshold; + gfloat confidence_threshold; + gfloat iou_threshold; + GstOnnxOptimizationLevel optimization_level; + GstOnnxExecutionProvider execution_provider; + gpointer onnx_ptr; + gboolean onnx_disabled; + + void (*process) (GstOnnxObjectDetector * onnx_object_detector, + GstVideoFrame * inframe, GstVideoFrame * outframe); +}; + +/** + * GstOnnxObjectDetectorClass: + * + * @parent_class: base transform base class + * + * Since: 1.20 + */ +struct _GstOnnxObjectDetectorClass +{ + GstBaseTransformClass parent_class; +}; + +GST_ELEMENT_REGISTER_DECLARE (onnx_object_detector) + +G_END_DECLS + +#endif /* __GST_ONNX_OBJECT_DETECTOR_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/onnx/meson.build
Added
@@ -0,0 +1,32 @@ +if get_option('onnx').disabled() + subdir_done() +endif + + +onnxrt_dep = dependency('libonnxruntime',required : get_option('onnx')) + +if onnxrt_dep.found() + onnxrt_include_root = onnxrt_dep.get_variable('includedir') + onnxrt_includes = [onnxrt_include_root / 'core/session', onnxrt_include_root / 'core'] + onnxrt_dep_args = [] + + compiler = meson.get_compiler('cpp') + if compiler.has_header(onnxrt_include_root / 'core/providers/cuda/cuda_provider_factory.h') + onnxrt_dep_args = ['-DGST_ML_ONNX_RUNTIME_HAVE_CUDA'] + endif + gstonnx = library('gstonnx', + 'gstonnx.c', + 'gstonnxelement.c', + 'gstonnxobjectdetector.cpp', + 'gstonnxclient.cpp', + c_args : gst_plugins_bad_args, + cpp_args: onnxrt_dep_args, + link_args : noseh_link_args, + include_directories : [configinc, libsinc, onnxrt_includes], + dependencies : [gstbase_dep, gstvideo_dep, onnxrt_dep, libm], + install : true, + install_dir : plugins_install_dir, + ) + pkgconfig.generate(gstonnx, install_dir : plugins_pkgconfig_install_dir) + plugins += [gstonnx] + endif
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openal/gstopenal.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openal/gstopenal.c
Changed
@@ -26,32 +26,19 @@ #include "config.h" #endif -#include "gstopenalsink.h" -#include "gstopenalsrc.h" +#include "gstopenalelements.h" #include <gst/gst-i18n-plugin.h> -GST_DEBUG_CATEGORY (openal_debug); static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "openalsink", GST_RANK_SECONDARY, - GST_TYPE_OPENAL_SINK)) - return FALSE; - - if (!gst_element_register (plugin, "openalsrc", GST_RANK_SECONDARY, - GST_TYPE_OPENAL_SRC)) - return FALSE; - -#ifdef ENABLE_NLS - GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, - LOCALEDIR); - bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); - bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); -#endif + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (openalsink, plugin); + ret |= GST_ELEMENT_REGISTER (openalsrc, plugin); - GST_DEBUG_CATEGORY_INIT (openal_debug, "openal", 0, "openal plugins"); - return TRUE; + return ret; }
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openal/gstopenalelement.c
Added
@@ -0,0 +1,51 @@ +/* + * GStreamer openal plugin + * + * Copyright (C) 2005 Wim Taymans <wim@fluendo.com> + * Copyright (C) 2006 Tim-Philipp Müller <tim centricular net> + * Copyright (C) 2009-2010 Chris Robinson <chris.kcat@gmail.com> + * Copyright (C) 2013 Juan Manuel Borges Caño <juanmabcmail@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstopenalelements.h" +#include "gstopenalsink.h" +#include "gstopenalsrc.h" + +#include <gst/gst-i18n-plugin.h> + +GST_DEBUG_CATEGORY (openal_debug); + +void +openal_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + if (g_once_init_enter (&res)) { +#ifdef ENABLE_NLS + GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, + LOCALEDIR); + bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); + bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); +#endif + GST_DEBUG_CATEGORY_INIT (openal_debug, "openal", 0, "openal plugins"); + g_once_init_leave (&res, TRUE); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openal/gstopenalelements.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2020> The Gstreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_OPENAL_ELEMENTS_H__ +#define __GST_OPENAL_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void openal_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (openalsink); +GST_ELEMENT_REGISTER_DECLARE (openalsrc); + +#endif /* __GST_OPENAL_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openal/gstopenalsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openal/gstopenalsink.c
Changed
@@ -61,6 +61,7 @@ GST_DEBUG_CATEGORY_EXTERN (openal_debug); #define GST_CAT_DEFAULT openal_debug +#include "gstopenalelements.h" #include "gstopenalsink.h" static void gst_openal_sink_dispose (GObject * object); @@ -159,6 +160,8 @@ #define checkALError() checkALError(__FILE__, __LINE__) G_DEFINE_TYPE (GstOpenALSink, gst_openal_sink, GST_TYPE_AUDIO_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (openalsink, "openalsink", + GST_RANK_SECONDARY, GST_TYPE_OPENAL_SINK, openal_element_init (plugin)); static void gst_openal_sink_dispose (GObject * object)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openal/gstopenalsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openal/gstopenalsrc.c
Changed
@@ -81,6 +81,7 @@ GST_DEBUG_CATEGORY_EXTERN (openal_debug); #define GST_CAT_DEFAULT openal_debug +#include "gstopenalelements.h" #include "gstopenalsrc.h" static void gst_openal_src_dispose (GObject * object); @@ -140,6 +141,8 @@ ); G_DEFINE_TYPE (GstOpenalSrc, gst_openal_src, GST_TYPE_AUDIO_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (openalsrc, "openalsrc", + GST_RANK_SECONDARY, GST_TYPE_OPENAL_SRC, openal_element_init (plugin)); static void gst_openal_src_dispose (GObject * object)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openal/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/openal/meson.build
Changed
@@ -2,7 +2,7 @@ if openal_dep.found() gstopenal = library('gstopenal', - 'gstopenal.c', 'gstopenalsink.c', 'gstopenalsrc.c', + 'gstopenal.c', 'gstopenalelement.c', 'gstopenalsink.c', 'gstopenalsrc.c', c_args: gst_plugins_bad_args, include_directories: [configinc, libsinc], dependencies: [gstaudio_dep, openal_dep],
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/gstopenaptxdec.c
Added
@@ -0,0 +1,344 @@ +/* GStreamer openaptx audio decoder + * + * Copyright (C) 2020 Igor V. Kovalenko <igor.v.kovalenko@gmail.com> + * Copyright (C) 2020 Thomas Weißschuh <thomas@t-8ch.de> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + * + */ + +/** + * SECTION:element-openaptxdec + * @title: openaptxdec + * + * This element decodes Bluetooth aptX or aptX-HD stream to raw S24LE integer stereo PCM audio. + * Accepts audio/aptx or audio/aptx-hd input streams. + * + * ## Example pipelines + * |[ + * gst-launch-1.0 -v audiotestsrc ! avenc_aptx ! openaptxdec ! audioconvert ! autoaudiosink + * ]| Decode a sine wave encoded with AV encoder and listen to result. + * |[ + * gst-launch-1.0 -v audiotestsrc ! avenc_aptx ! openaptxdec autosync=0 ! audioconvert ! autoaudiosink + * ]| Decode a sine wave encoded with AV encoder and listen to result, stream autosync disabled. 
+ * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstopenaptxdec.h" +#include "openaptx-plugin.h" + +enum +{ + PROP_0, + PROP_AUTOSYNC +}; + +GST_DEBUG_CATEGORY_STATIC (openaptx_dec_debug); +#define GST_CAT_DEFAULT openaptx_dec_debug + +#define parent_class gst_openaptx_dec_parent_class +G_DEFINE_TYPE (GstOpenaptxDec, gst_openaptx_dec, GST_TYPE_AUDIO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (openaptxdec, "openaptxdec", GST_RANK_NONE, + GST_TYPE_OPENAPTX_DEC); + +static GstStaticPadTemplate openaptx_dec_sink_factory = + GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/aptx-hd, channels = 2, rate = [ 1, MAX ]; " + "audio/aptx, channels = 2, rate = [ 1, MAX ]")); + +static GstStaticPadTemplate openaptx_dec_src_factory = +GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/x-raw, format = S24LE," + " rate = [ 1, MAX ], channels = 2, layout = interleaved")); + +static GstFlowReturn +gst_openaptx_dec_handle_frame (GstAudioDecoder * audio_dec, GstBuffer * buffer) +{ + GstOpenaptxDec *dec = GST_OPENAPTX_DEC (audio_dec); + GstMapInfo out_map; + GstBuffer *outbuf = NULL; + GstFlowReturn ret; + guint num_frames; + gsize frame_len, output_size; + gssize input_size, processed = 0; + gsize written = 0; + gint synced; + gsize dropped; + + /* no fancy draining */ + if (G_UNLIKELY (!buffer)) + input_size = 0; + else + input_size = gst_buffer_get_size (buffer); + + frame_len = aptx_frame_size (dec->hd); + + if (!dec->autosync) { + /* we assume all frames are of the same size, this is implied by the + * input caps applying to the whole input buffer, and the parser should + * also have made sure of that */ + if (G_UNLIKELY (input_size % frame_len != 0)) + goto mixed_frames; + } + + num_frames = input_size / frame_len; + + /* need one extra frame if autosync is enabled */ + if (dec->autosync) + ++num_frames; + + output_size = num_frames * APTX_SAMPLES_PER_FRAME * APTX_SAMPLE_SIZE; + + 
outbuf = gst_audio_decoder_allocate_output_buffer (audio_dec, output_size); + + if (outbuf == NULL) + goto no_output_buffer; + + if (!gst_buffer_map (outbuf, &out_map, GST_MAP_WRITE)) { + gst_buffer_replace (&outbuf, NULL); + goto no_output_buffer_map; + } + + if (G_LIKELY (buffer)) { + GstMapInfo in_map; + + if (!gst_buffer_map (buffer, &in_map, GST_MAP_READ)) { + gst_buffer_unmap (outbuf, &out_map); + gst_buffer_replace (&outbuf, NULL); + goto map_failed; + } + + if (dec->autosync) { + processed = aptx_decode_sync (dec->aptx_c, in_map.data, in_map.size, + out_map.data, output_size, &written, &synced, &dropped); + } else { + processed = aptx_decode (dec->aptx_c, in_map.data, in_map.size, + out_map.data, out_map.size, &written); + } + + gst_buffer_unmap (buffer, &in_map); + } else { + if (dec->autosync) { + dropped = aptx_decode_sync_finish (dec->aptx_c); + synced = 1; + } + } + + if (dec->autosync) { + if (!synced) { + GST_WARNING_OBJECT (dec, "%s stream is not synchronized", + aptx_name (dec->hd)); + } + if (dropped) { + GST_WARNING_OBJECT (dec, + "%s decoder dropped %" G_GSIZE_FORMAT " bytes from stream", + aptx_name (dec->hd), dropped); + } + } + + if (processed != input_size) { + GST_WARNING_OBJECT (dec, + "%s decoding error, processed = %" G_GSSIZE_FORMAT ", " + "written = %" G_GSSIZE_FORMAT ", input size = %" G_GSIZE_FORMAT, + aptx_name (dec->hd), processed, written, input_size); + } + + gst_buffer_unmap (outbuf, &out_map); + + GST_LOG_OBJECT (dec, "%s written = %" G_GSSIZE_FORMAT, + aptx_name (dec->hd), written); + +done: + if (G_LIKELY (outbuf)) { + if (G_LIKELY (written > 0)) + gst_buffer_set_size (outbuf, written); + else + gst_buffer_replace (&outbuf, NULL); + } + + ret = gst_audio_decoder_finish_frame (audio_dec, outbuf, 1); + + if (G_UNLIKELY (!buffer)) + ret = GST_FLOW_EOS; + + return ret; + +/* ERRORS */ +mixed_frames: + { + GST_WARNING_OBJECT (dec, "inconsistent input data/frames, skipping"); + goto done; + } +no_output_buffer_map: + { + 
GST_ELEMENT_ERROR (dec, RESOURCE, FAILED, + ("Could not map output buffer"), + ("Failed to map allocated output buffer for write access.")); + return GST_FLOW_ERROR; + } +no_output_buffer: + { + GST_ELEMENT_ERROR (dec, RESOURCE, FAILED, + ("Could not allocate output buffer"), + ("Audio decoder failed to allocate output buffer to hold an audio frame.")); + return GST_FLOW_ERROR; + } +map_failed: + { + GST_ELEMENT_ERROR (dec, RESOURCE, FAILED, + ("Could not map input buffer"), + ("Failed to map incoming buffer for read access.")); + return GST_FLOW_ERROR; + } +} + +static gboolean +gst_openaptx_dec_set_format (GstAudioDecoder * audio_dec, GstCaps * caps) +{ + GstOpenaptxDec *dec = GST_OPENAPTX_DEC (audio_dec); + GstAudioInfo info; + GstStructure *s; + gint rate; + + s = gst_caps_get_structure (caps, 0); + gst_structure_get_int (s, "rate", &rate); + + /* let's see what is in the output caps */ + dec->hd = gst_structure_has_name (s, "audio/aptx-hd"); + + /* reinitialize codec */ + if (dec->aptx_c) + aptx_finish (dec->aptx_c); + + GST_INFO_OBJECT (dec, "Initialize %s codec", aptx_name (dec->hd)); + dec->aptx_c = aptx_init (dec->hd); + + /* set up output format */ + gst_audio_info_init (&info); + gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S24LE, + rate, APTX_NUM_CHANNELS, NULL); + gst_audio_decoder_set_output_format (audio_dec, &info); + + return TRUE; +} + +static void +gst_openaptx_dec_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstOpenaptxDec *dec = GST_OPENAPTX_DEC (object); + + switch (prop_id) { + case PROP_AUTOSYNC: + dec->autosync = g_value_get_boolean (value); + break; + default: + break; + } +} + +static void +gst_openaptx_dec_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstOpenaptxDec *dec = GST_OPENAPTX_DEC (object); + + switch (prop_id) { + case PROP_AUTOSYNC:{ + g_value_set_boolean (value, dec->autosync); + break; + } + default:{ + 
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + } +} + +static gboolean +gst_openaptx_dec_start (GstAudioDecoder * audio_dec) +{ + return TRUE; +} + +static gboolean +gst_openaptx_dec_stop (GstAudioDecoder * audio_dec) +{ + GstOpenaptxDec *dec = GST_OPENAPTX_DEC (audio_dec); + + GST_INFO_OBJECT (dec, "Finish openaptx codec"); + + if (dec->aptx_c) { + aptx_finish (dec->aptx_c); + dec->aptx_c = NULL; + } + + return TRUE; +} + +static void +gst_openaptx_dec_class_init (GstOpenaptxDecClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstAudioDecoderClass *base_class = GST_AUDIO_DECODER_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + + gobject_class->set_property = + GST_DEBUG_FUNCPTR (gst_openaptx_dec_set_property); + gobject_class->get_property = + GST_DEBUG_FUNCPTR (gst_openaptx_dec_get_property); + + g_object_class_install_property (gobject_class, PROP_AUTOSYNC, + g_param_spec_boolean ("autosync", "Auto sync", + "Gracefully handle partially corrupted stream in which some bytes are missing", + APTX_AUTOSYNC_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + base_class->start = GST_DEBUG_FUNCPTR (gst_openaptx_dec_start); + base_class->stop = GST_DEBUG_FUNCPTR (gst_openaptx_dec_stop); + base_class->set_format = GST_DEBUG_FUNCPTR (gst_openaptx_dec_set_format); + base_class->handle_frame = GST_DEBUG_FUNCPTR (gst_openaptx_dec_handle_frame); + + gst_element_class_add_static_pad_template (element_class, + &openaptx_dec_sink_factory); + gst_element_class_add_static_pad_template (element_class, + &openaptx_dec_src_factory); + + gst_element_class_set_static_metadata (element_class, + "Bluetooth aptX/aptX-HD audio decoder using libopenaptx", + "Codec/Decoder/Audio", + "Decode an aptX or aptX-HD audio stream using libopenaptx", + "Igor V. 
Kovalenko <igor.v.kovalenko@gmail.com>, " + "Thomas Weißschuh <thomas@t-8ch.de>"); + + GST_DEBUG_CATEGORY_INIT (openaptx_dec_debug, "openaptxdec", 0, + "openaptx decoding element"); +} + +static void +gst_openaptx_dec_init (GstOpenaptxDec * dec) +{ + gst_audio_decoder_set_needs_format (GST_AUDIO_DECODER (dec), TRUE); + gst_audio_decoder_set_use_default_pad_acceptcaps (GST_AUDIO_DECODER_CAST + (dec), TRUE); + GST_PAD_SET_ACCEPT_TEMPLATE (GST_AUDIO_DECODER_SINK_PAD (dec)); + + dec->aptx_c = NULL; + + dec->autosync = APTX_AUTOSYNC_DEFAULT; +}
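The decoder above sizes its output from aptX's fixed framing: each 4-byte aptX frame (6 bytes for aptX-HD) decodes to 8 S24LE samples, i.e. 24 bytes of stereo PCM. A standalone sketch of that size arithmetic, with constants mirrored from openaptx-plugin.h (no GStreamer required; function names here are illustrative, not part of the plugin):

```c
#include <stddef.h>

/* Constants mirrored from openaptx-plugin.h */
#define APTX_NUM_CHANNELS        2      /* always stereo */
#define APTX_SAMPLE_SIZE         3      /* S24LE: 3 bytes per sample */
#define APTX_SAMPLES_PER_CHANNEL 4
#define APTX_SAMPLES_PER_FRAME   (APTX_SAMPLES_PER_CHANNEL * APTX_NUM_CHANNELS)

/* Encoded frame size: LLRR (4 bytes) for aptX, LLLRRR (6 bytes) for aptX-HD */
static size_t
aptx_frame_size_c (int hd)
{
  return (hd ? 3 : 2) * APTX_NUM_CHANNELS;
}

/* PCM bytes produced by decoding all whole frames in an input buffer */
static size_t
aptx_decoded_size (size_t input_size, int hd)
{
  size_t frames = input_size / aptx_frame_size_c (hd);
  return frames * (APTX_SAMPLES_PER_FRAME * APTX_SAMPLE_SIZE);
}
```

For example, a 400-byte aptX buffer holds 100 whole frames and decodes to 2400 bytes of PCM; interpreted as aptX-HD it holds 66 whole frames (1584 bytes) with 4 bytes left over, the kind of inconsistency the element's size warnings report.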
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/gstopenaptxdec.h
Added
@@ -0,0 +1,51 @@ +/* GStreamer openaptx audio decoder + * + * Copyright (C) 2020 Igor V. Kovalenko <igor.v.kovalenko@gmail.com> + * Copyright (C) 2020 Thomas Weißschuh <thomas@t-8ch.de> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + * + */ +#ifndef __GST_OPENAPTXDEC_H__ +#define __GST_OPENAPTXDEC_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> + +#ifdef USE_FREEAPTX +#include <freeaptx.h> +#else +#include <openaptx.h> +#endif + +G_BEGIN_DECLS + +#define GST_TYPE_OPENAPTX_DEC (gst_openaptx_dec_get_type()) +G_DECLARE_FINAL_TYPE (GstOpenaptxDec, gst_openaptx_dec, GST, OPENAPTX_DEC, GstAudioDecoder) + +struct _GstOpenaptxDec { + GstAudioDecoder audio_decoder; + + gboolean hd; + gboolean autosync; + + struct aptx_context *aptx_c; +}; + +GST_ELEMENT_REGISTER_DECLARE (openaptxdec) + +G_END_DECLS + +#endif /* __GST_OPENAPTXDEC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/gstopenaptxenc.c
Added
@@ -0,0 +1,319 @@ +/* GStreamer openaptx audio encoder + * + * Copyright (C) 2020 Igor V. Kovalenko <igor.v.kovalenko@gmail.com> + * Copyright (C) 2020 Thomas Weißschuh <thomas@t-8ch.de> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + * + */ + +/** + * SECTION:element-openaptxenc + * @title: openaptxenc + * + * This element encodes raw S24LE integer stereo PCM audio into a Bluetooth aptX or aptX-HD stream. + * Produces audio/aptx or audio/aptx-hd output streams. + * + * ## Example pipelines + * |[ + * gst-launch-1.0 -v audiotestsrc ! openaptxenc ! avdec_aptx ! audioconvert ! autoaudiosink + * ]| Encode a sine wave into aptX, AV decode it and listen to result. + * |[ + * gst-launch-1.0 -v audiotestsrc ! openaptxenc ! avdec_aptx_hd ! audioconvert ! autoaudiosink + * ]| Encode a sine wave into aptX-HD, AV decode it and listen to result. 
+ * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstopenaptxenc.h" +#include "openaptx-plugin.h" + +GST_DEBUG_CATEGORY_STATIC (openaptx_enc_debug); +#define GST_CAT_DEFAULT openaptx_enc_debug + +#define gst_openaptx_enc_parent_class parent_class + +G_DEFINE_TYPE (GstOpenaptxEnc, gst_openaptx_enc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (openaptxenc, "openaptxenc", GST_RANK_NONE, + GST_TYPE_OPENAPTX_ENC); + +static GstStaticPadTemplate openaptx_enc_sink_factory = +GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/x-raw, format = S24LE," + " rate = [ 1, MAX ], channels = 2, layout = interleaved")); + +static GstStaticPadTemplate openaptx_enc_src_factory = + GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/aptx-hd, channels = 2, rate = [ 1, MAX ]; " + "audio/aptx, channels = 2, rate = [ 1, MAX ]")); + + +static gboolean gst_openaptx_enc_start (GstAudioEncoder * enc); +static gboolean gst_openaptx_enc_stop (GstAudioEncoder * enc); +static gboolean gst_openaptx_enc_set_format (GstAudioEncoder * enc, + GstAudioInfo * info); +static GstFlowReturn gst_openaptx_enc_handle_frame (GstAudioEncoder * enc, + GstBuffer * buffer); + +static gint64 +gst_openaptx_enc_get_latency (GstOpenaptxEnc * enc, gint rate) +{ + gint64 latency = + gst_util_uint64_scale (APTX_LATENCY_SAMPLES, GST_SECOND, rate); + GST_DEBUG_OBJECT (enc, "Latency: %" GST_TIME_FORMAT, GST_TIME_ARGS (latency)); + return latency; +} + +static gboolean +gst_openaptx_enc_set_format (GstAudioEncoder * audio_enc, GstAudioInfo * info) +{ + GstOpenaptxEnc *enc = GST_OPENAPTX_ENC (audio_enc); + GstStructure *s; + GstCaps *caps, *output_caps = NULL; + gint rate; + gint64 encoder_latency; + gint ret; + + rate = GST_AUDIO_INFO_RATE (info); + + /* negotiate output format based on downstream caps restrictions */ + caps = gst_pad_get_allowed_caps (GST_AUDIO_ENCODER_SRC_PAD (enc)); + + if (caps == NULL) + caps = 
gst_static_pad_template_get_caps (&openaptx_enc_src_factory); + else if (gst_caps_is_empty (caps)) + goto failure; + + /* let's see what is in the output caps */ + s = gst_caps_get_structure (caps, 0); + enc->hd = gst_structure_has_name (s, "audio/aptx-hd"); + + gst_clear_caps (&caps); + + output_caps = gst_caps_new_simple (enc->hd ? "audio/aptx-hd" : "audio/aptx", + "channels", G_TYPE_INT, APTX_NUM_CHANNELS, + "rate", G_TYPE_INT, rate, NULL); + + GST_INFO_OBJECT (enc, "output caps %" GST_PTR_FORMAT, output_caps); + + /* reinitialize codec */ + if (enc->aptx_c) + aptx_finish (enc->aptx_c); + + GST_INFO_OBJECT (enc, "Initialize %s codec", aptx_name (enc->hd)); + enc->aptx_c = aptx_init (enc->hd); + + encoder_latency = gst_openaptx_enc_get_latency (enc, rate); + gst_audio_encoder_set_latency (audio_enc, encoder_latency, encoder_latency); + + /* we want to be handed all available samples in handle_frame, but always + * enough to encode a frame */ + gst_audio_encoder_set_frame_samples_min (audio_enc, APTX_SAMPLES_PER_CHANNEL); + gst_audio_encoder_set_frame_samples_max (audio_enc, APTX_SAMPLES_PER_CHANNEL); + gst_audio_encoder_set_frame_max (audio_enc, 0); + + /* FIXME: what to do with left-over samples at the end? can we encode them? 
*/ + gst_audio_encoder_set_hard_min (audio_enc, TRUE); + + ret = gst_audio_encoder_set_output_format (audio_enc, output_caps); + gst_caps_unref (output_caps); + + return ret; + +failure: + if (output_caps) + gst_caps_unref (output_caps); + if (caps) + gst_caps_unref (caps); + return FALSE; +} + +static GstFlowReturn +gst_openaptx_enc_handle_frame (GstAudioEncoder * audio_enc, GstBuffer * buffer) +{ + GstOpenaptxEnc *enc = GST_OPENAPTX_ENC (audio_enc); + GstMapInfo out_map; + GstBuffer *outbuf = NULL; + GstFlowReturn ret; + guint frames; + gsize frame_len, output_size; + gssize processed = 0; + gsize written = 0; + + /* fixed encoded frame size hd=0: LLRR, hd=1: LLLRRR */ + frame_len = aptx_frame_size (enc->hd); + + if (G_UNLIKELY (!buffer)) { + GST_DEBUG_OBJECT (enc, "Finish encoding"); + frames = APTX_FINISH_FRAMES; + } else { + frames = gst_buffer_get_size (buffer) / + (APTX_SAMPLE_SIZE * APTX_SAMPLES_PER_FRAME); + + if (frames == 0) { + GST_WARNING_OBJECT (enc, "Odd input stream size detected, skipping"); + goto mixed_frames; + } + } + + output_size = frames * frame_len; + outbuf = gst_audio_encoder_allocate_output_buffer (audio_enc, output_size); + + if (outbuf == NULL) + goto no_output_buffer; + + if (!gst_buffer_map (outbuf, &out_map, GST_MAP_WRITE)) { + gst_buffer_replace (&outbuf, NULL); + goto no_output_buffer_map; + } + + if (G_LIKELY (buffer)) { + GstMapInfo in_map; + + if (!gst_buffer_map (buffer, &in_map, GST_MAP_READ)) { + gst_buffer_unmap (outbuf, &out_map); + gst_buffer_replace (&outbuf, NULL); + goto map_failed; + } + + GST_LOG_OBJECT (enc, + "encoding %" G_GSIZE_FORMAT " samples into %u %s frames", + in_map.size / (APTX_NUM_CHANNELS * APTX_SAMPLE_SIZE), frames, + aptx_name (enc->hd)); + + processed = aptx_encode (enc->aptx_c, in_map.data, in_map.size, + out_map.data, output_size, &written); + + gst_buffer_unmap (buffer, &in_map); + } else { + aptx_encode_finish (enc->aptx_c, out_map.data, output_size, &written); + output_size = written; + } + + if 
(processed < 0 || written != output_size) { + GST_WARNING_OBJECT (enc, + "%s encoding error, processed = %" G_GSSIZE_FORMAT ", " + "written = %" G_GSSIZE_FORMAT ", expected = %" G_GSIZE_FORMAT, + aptx_name (enc->hd), processed, written, frames * frame_len); + } + + gst_buffer_unmap (outbuf, &out_map); + + GST_LOG_OBJECT (enc, "%s written = %" G_GSSIZE_FORMAT, + aptx_name (enc->hd), written); + +done: + if (G_LIKELY (outbuf)) { + if (G_LIKELY (written > 0)) + gst_buffer_set_size (outbuf, written); + else + gst_buffer_replace (&outbuf, NULL); + } + + ret = gst_audio_encoder_finish_frame (audio_enc, outbuf, + written / frame_len * APTX_SAMPLES_PER_CHANNEL); + + if (G_UNLIKELY (!buffer)) + ret = GST_FLOW_EOS; + + return ret; + +/* ERRORS */ +mixed_frames: + { + GST_WARNING_OBJECT (enc, "inconsistent input data/frames, skipping"); + goto done; + } +no_output_buffer_map: + { + GST_ELEMENT_ERROR (enc, RESOURCE, FAILED, + ("Could not map output buffer"), + ("Failed to map allocated output buffer for write access.")); + return GST_FLOW_ERROR; + } +no_output_buffer: + { + GST_ELEMENT_ERROR (enc, RESOURCE, FAILED, + ("Could not allocate output buffer"), + ("Audio encoder failed to allocate output buffer to hold an audio frame.")); + return GST_FLOW_ERROR; + } +map_failed: + { + GST_ELEMENT_ERROR (enc, RESOURCE, FAILED, + ("Could not map input buffer"), + ("Failed to map incoming buffer for read access.")); + return GST_FLOW_ERROR; + } +} + +static gboolean +gst_openaptx_enc_start (GstAudioEncoder * audio_enc) +{ + return TRUE; +} + +static gboolean +gst_openaptx_enc_stop (GstAudioEncoder * audio_enc) +{ + GstOpenaptxEnc *enc = GST_OPENAPTX_ENC (audio_enc); + + GST_INFO_OBJECT (enc, "Finish openaptx codec"); + + if (enc->aptx_c) { + aptx_finish (enc->aptx_c); + enc->aptx_c = NULL; + } + + return TRUE; +} + +static void +gst_openaptx_enc_class_init (GstOpenaptxEncClass * klass) +{ + GstAudioEncoderClass *base_class = GST_AUDIO_ENCODER_CLASS (klass); + GstElementClass 
*element_class = GST_ELEMENT_CLASS (klass); + + base_class->start = GST_DEBUG_FUNCPTR (gst_openaptx_enc_start); + base_class->stop = GST_DEBUG_FUNCPTR (gst_openaptx_enc_stop); + base_class->set_format = GST_DEBUG_FUNCPTR (gst_openaptx_enc_set_format); + base_class->handle_frame = GST_DEBUG_FUNCPTR (gst_openaptx_enc_handle_frame); + + gst_element_class_add_static_pad_template (element_class, + &openaptx_enc_sink_factory); + gst_element_class_add_static_pad_template (element_class, + &openaptx_enc_src_factory); + + gst_element_class_set_static_metadata (element_class, + "Bluetooth aptX/aptX-HD audio encoder using libopenaptx", + "Codec/Encoder/Audio", + "Encode an aptX or aptX-HD audio stream using libopenaptx", + "Igor V. Kovalenko <igor.v.kovalenko@gmail.com>, " + "Thomas Weißschuh <thomas@t-8ch.de>"); + + GST_DEBUG_CATEGORY_INIT (openaptx_enc_debug, "openaptxenc", 0, + "openaptx encoding element"); +} + +static void +gst_openaptx_enc_init (GstOpenaptxEnc * enc) +{ + GST_PAD_SET_ACCEPT_TEMPLATE (GST_AUDIO_ENCODER_SINK_PAD (enc)); + + enc->aptx_c = NULL; +}
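gst_openaptx_enc_set_format above reports a fixed encoder latency of APTX_LATENCY_SAMPLES (90) samples, converted to clock time with gst_util_uint64_scale. A minimal sketch of that conversion in plain 64-bit arithmetic (safe here, since 90 × 10⁹ is far below the uint64 limit; the function name is illustrative):

```c
#include <stdint.h>

#define APTX_LATENCY_SAMPLES 90
#define NSEC_PER_SEC UINT64_C (1000000000)      /* GST_SECOND */

/* Latency in nanoseconds for a given sample rate; truncating division,
 * matching gst_util_uint64_scale's round-down behaviour */
static uint64_t
aptx_latency_ns (uint64_t rate)
{
  return APTX_LATENCY_SAMPLES * NSEC_PER_SEC / rate;
}
```

At 44100 Hz that is 2040816 ns (about 2 ms); at 48000 Hz exactly 1875000 ns.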
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/gstopenaptxenc.h
Added
@@ -0,0 +1,50 @@ +/* GStreamer openaptx audio encoder + * + * Copyright (C) 2020 Igor V. Kovalenko <igor.v.kovalenko@gmail.com> + * Copyright (C) 2020 Thomas Weißschuh <thomas@t-8ch.de> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA + * + */ +#ifndef __GST_OPENAPTXENC_H__ +#define __GST_OPENAPTXENC_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> + +#ifdef USE_FREEAPTX +#include <freeaptx.h> +#else +#include <openaptx.h> +#endif + +G_BEGIN_DECLS + +#define GST_TYPE_OPENAPTX_ENC (gst_openaptx_enc_get_type()) +G_DECLARE_FINAL_TYPE (GstOpenaptxEnc, gst_openaptx_enc, GST, OPENAPTX_ENC, GstAudioEncoder) + +struct _GstOpenaptxEnc { + GstAudioEncoder audio_encoder; + + gboolean hd; + + struct aptx_context *aptx_c; +}; + +GST_ELEMENT_REGISTER_DECLARE(openaptxenc); + +G_END_DECLS + +#endif /* __GST_OPENAPTXENC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/meson.build
Added
@@ -0,0 +1,31 @@ +openaptx_sources = [ + 'openaptx-plugin.c', + 'gstopenaptxdec.c', + 'gstopenaptxenc.c', +] + +if not get_option('openaptx').allowed() + subdir_done() +endif + +openaptx_defines = [] + +openaptx_dep = dependency('libfreeaptx', version : '>= 0.1.1', required : false) +if openaptx_dep.found() + openaptx_defines += ['-DUSE_FREEAPTX'] +else + openaptx_dep = dependency('libopenaptx', version : '== 0.2.0', required : get_option('openaptx')) +endif + +if openaptx_dep.found() + gstopenaptx = library('gstopenaptx', + openaptx_sources, + c_args : gst_plugins_bad_args + openaptx_defines, + include_directories : [configinc], + dependencies : [gstaudio_dep, openaptx_dep], + install : true, + install_dir : plugins_install_dir, + ) + pkgconfig.generate(gstopenaptx, install_dir : plugins_pkgconfig_install_dir) + plugins += [gstopenaptx] +endif
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/openaptx-plugin.c
Added
@@ -0,0 +1,43 @@ +/* GStreamer openaptx audio plugin + * + * Copyright (C) 2020 Igor V. Kovalenko <igor.v.kovalenko@gmail.com> + * Copyright (C) 2020 Thomas Weißschuh <thomas@t-8ch.de> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "openaptx-plugin.h" +#include "gstopenaptxdec.h" +#include "gstopenaptxenc.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + ret |= GST_ELEMENT_REGISTER (openaptxdec, plugin); + ret |= GST_ELEMENT_REGISTER (openaptxenc, plugin); + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + openaptx, + "Open Source implementation of Audio Processing Technology codec (aptX)", + plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openaptx/openaptx-plugin.h
Added
@@ -0,0 +1,61 @@ +/* GStreamer openaptx audio plugin + * + * Copyright (C) 2020 Igor V. Kovalenko <igor.v.kovalenko@gmail.com> + * Copyright (C) 2020 Thomas Weißschuh <thomas@t-8ch.de> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifndef __GST_OPENAPTX_PLUGIN_H__ +#define __GST_OPENAPTX_PLUGIN_H__ + +#include <glib.h> + +#define APTX_HD_DEFAULT 1 +#define APTX_AUTOSYNC_DEFAULT TRUE + +#define APTX_LATENCY_SAMPLES 90 + +/* always stereo */ +#define APTX_NUM_CHANNELS 2 + +/* always S24LE */ +#define APTX_SAMPLE_SIZE 3 + +/* always 4 samples per channel*/ +#define APTX_SAMPLES_PER_CHANNEL 4 + +/* always 4 stereo samples */ +#define APTX_SAMPLES_PER_FRAME (APTX_SAMPLES_PER_CHANNEL * APTX_NUM_CHANNELS) + +/* fixed encoded frame size hd=0: LLRR, hd=1: LLLRRR */ +#define APTX_FRAME_SIZE (2 * APTX_NUM_CHANNELS) +#define APTX_HD_FRAME_SIZE (3 * APTX_NUM_CHANNELS) + +/* while finishing encoding, up to 92 frames will be produced */ +#define APTX_FINISH_FRAMES 92 + +static inline const char* aptx_name(gboolean hd) +{ + return hd ? "aptX-HD" : "aptX"; +} + +/* fixed encoded frame size hd=FALSE: LLRR, hd=TRUE: LLLRRR */ +static inline gsize aptx_frame_size(gboolean hd) +{ + return hd ? APTX_HD_FRAME_SIZE : APTX_FRAME_SIZE; +} + +#endif /* __GST_OPENAPTX_PLUGIN_H__ */
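The constants above pin down aptX's fixed compression ratio: one encoded frame carries APTX_SAMPLES_PER_FRAME × APTX_SAMPLE_SIZE = 24 bytes of PCM in 4 bytes (aptX) or 6 bytes (aptX-HD). A small check of that arithmetic (helper name is illustrative, not part of the header):

```c
/* Constants from openaptx-plugin.h */
#define APTX_NUM_CHANNELS        2
#define APTX_SAMPLE_SIZE         3      /* S24LE */
#define APTX_SAMPLES_PER_CHANNEL 4
#define APTX_SAMPLES_PER_FRAME   (APTX_SAMPLES_PER_CHANNEL * APTX_NUM_CHANNELS)
#define APTX_FRAME_SIZE          (2 * APTX_NUM_CHANNELS)
#define APTX_HD_FRAME_SIZE       (3 * APTX_NUM_CHANNELS)

/* PCM bytes represented by one encoded frame */
#define PCM_BYTES_PER_FRAME (APTX_SAMPLES_PER_FRAME * APTX_SAMPLE_SIZE)

/* Fixed compression ratio: 6:1 for aptX, 4:1 for aptX-HD */
static int
aptx_compression_ratio (int hd)
{
  return PCM_BYTES_PER_FRAME / (hd ? APTX_HD_FRAME_SIZE : APTX_FRAME_SIZE);
}
```

For 44.1 kHz S24LE stereo (2116.8 kbit/s raw), that puts the aptX bitstream at 352.8 kbit/s and aptX-HD at 529.2 kbit/s.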
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcameracalibrate.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcameracalibrate.cpp
Changed
@@ -157,8 +157,13 @@ return camera_calibration_pattern_type; } -G_DEFINE_TYPE (GstCameraCalibrate, gst_camera_calibrate, - GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstCameraCalibrate, gst_camera_calibrate, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_camera_calibrate_debug, "cameracalibrate", 0, + "Performs camera calibration"); + ); +GST_ELEMENT_REGISTER_DEFINE (cameracalibrate, "cameracalibrate", GST_RANK_NONE, + GST_TYPE_CAMERA_CALIBRATE); static void gst_camera_calibrate_dispose (GObject * object); static void gst_camera_calibrate_set_property (GObject * object, guint prop_id, @@ -204,13 +209,13 @@ g_object_class_install_property (gobject_class, PROP_BOARD_WIDTH, g_param_spec_int ("board-width", "Board Width", - "The board width in number of items", + "The board width in number of items (e.g. number of squares for chessboard)", 1, G_MAXINT, DEFAULT_BOARD_WIDTH, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); g_object_class_install_property (gobject_class, PROP_BOARD_HEIGHT, g_param_spec_int ("board-height", "Board Height", - "The board height in number of items", + "The board height in number of items (e.g. 
number of squares for chessboard)", 1, G_MAXINT, DEFAULT_BOARD_WIDTH, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); @@ -276,7 +281,8 @@ gst_element_class_set_static_metadata (element_class, "cameracalibrate", "Filter/Effect/Video", - "Performs camera calibration", + "Performs camera calibration by having it point at a chessboard pattern " + "using upstream/downstream cameraundistort", "Philippe Renon <philippe_renon@yahoo.fr>"); /* add sink and source pad templates */ @@ -289,7 +295,8 @@ templ = gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, caps); gst_element_class_add_pad_template (element_class, templ); - gst_type_mark_as_plugin_api (GST_TYPE_CAMERA_CALIBRATION_PATTERN, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_CAMERA_CALIBRATION_PATTERN, + (GstPluginAPIFlags) 0); } /* initialize the new element @@ -324,8 +331,8 @@ calib->flags = cv::fisheye::CALIB_FIX_SKEW | cv::fisheye::CALIB_RECOMPUTE_EXTRINSIC | /*cv::fisheye::CALIB_FIX_K1 | */ - cv::fisheye::CALIB_FIX_K2 | cv::fisheye::CALIB_FIX_K3 | cv:: - fisheye::CALIB_FIX_K4; + cv::fisheye::CALIB_FIX_K2 | cv::fisheye::CALIB_FIX_K3 | cv::fisheye:: + CALIB_FIX_K4; } calib->mode = CAPTURING; //DETECTION; @@ -752,18 +759,3 @@ return ok; } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_camera_calibrate_plugin_init (GstPlugin * plugin) -{ - /* debug category for filtering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_camera_calibrate_debug, "cameracalibrate", - 0, "Performs camera calibration"); - - return gst_element_register (plugin, "cameracalibrate", GST_RANK_NONE, - GST_TYPE_CAMERA_CALIBRATE); -}
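Each OpenCV diff in this revision repeats the same GStreamer 1.20 migration: the hand-written gst_*_plugin_init functions are dropped in favour of the GST_ELEMENT_REGISTER_DEFINE / GST_ELEMENT_REGISTER_DECLARE macro pair, with the debug category initialized via G_DEFINE_TYPE_WITH_CODE instead. A toy sketch of the shape of code those macros generate (simplified stand-ins, not the real gst/gst.h expansion):

```c
#include <string.h>

/* Toy stand-ins for GstPlugin / gst_element_register, just to show the
 * indirection the macros introduce; the real types live in gst/gst.h. */
typedef struct
{
  const char *registered[8];
  int n;
} ToyPlugin;

static int
toy_element_register (ToyPlugin * p, const char *name, int rank)
{
  (void) rank;
  p->registered[p->n++] = name;
  return 1;                     /* TRUE */
}

/* Roughly what GST_ELEMENT_REGISTER_DEFINE (e, "name", rank, type)
 * provides: a public per-element gst_element_register_<e> () helper... */
#define ELEMENT_REGISTER_DEFINE(e, name, rank) \
  int gst_element_register_##e (ToyPlugin * plugin) \
  { return toy_element_register (plugin, name, rank); }

/* ...which GST_ELEMENT_REGISTER (e, plugin) then calls from plugin_init: */
#define ELEMENT_REGISTER(e, plugin) gst_element_register_##e (plugin)

ELEMENT_REGISTER_DEFINE (cvdilate, "cvdilate", 0)
ELEMENT_REGISTER_DEFINE (cverode, "cverode", 0)
```

The payoff is that each element registers itself through one declared symbol, so a plugin's top-level plugin_init shrinks to a list of GST_ELEMENT_REGISTER calls, as seen in the deleted blocks above.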
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcameracalibrate.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcameracalibrate.h
Changed
@@ -107,7 +107,7 @@ GType gst_camera_calibrate_get_type (void); -gboolean gst_camera_calibrate_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cameracalibrate); G_END_DECLS #endif /* __GST_CAMERA_CALIBRATE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcameraundistort.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcameraundistort.cpp
Changed
@@ -97,8 +97,13 @@ PROP_SETTINGS }; -G_DEFINE_TYPE (GstCameraUndistort, gst_camera_undistort, - GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstCameraUndistort, gst_camera_undistort, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_camera_undistort_debug, "cameraundistort", 0, + "Performs camera undistortion"); + ); +GST_ELEMENT_REGISTER_DEFINE (cameraundistort, "cameraundistort", GST_RANK_NONE, + GST_TYPE_CAMERA_UNDISTORT); static void gst_camera_undistort_dispose (GObject * object); static void gst_camera_undistort_set_property (GObject * object, guint prop_id, @@ -403,18 +408,3 @@ GST_BASE_TRANSFORM_CLASS (gst_camera_undistort_parent_class)->src_event (trans, event); } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_camera_undistort_plugin_init (GstPlugin * plugin) -{ - /* debug category for filtering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_camera_undistort_debug, "cameraundistort", - 0, "Performs camera undistortion"); - - return gst_element_register (plugin, "cameraundistort", GST_RANK_NONE, - GST_TYPE_CAMERA_UNDISTORT); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcameraundistort.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcameraundistort.h
Changed
@@ -97,7 +97,7 @@ GType gst_camera_undistort_get_type (void); -gboolean gst_camera_undistort_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cameraundistort); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvdilate.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvdilate.cpp
Changed
@@ -64,7 +64,11 @@ GST_DEBUG_CATEGORY_STATIC (gst_cv_dilate_debug); #define GST_CAT_DEFAULT gst_cv_dilate_debug -G_DEFINE_TYPE (GstCvDilate, gst_cv_dilate, GST_TYPE_CV_DILATE_ERODE); +G_DEFINE_TYPE_WITH_CODE (GstCvDilate, gst_cv_dilate, GST_TYPE_CV_DILATE_ERODE, + GST_DEBUG_CATEGORY_INIT (gst_cv_dilate_debug, "cvdilate", 0, "cvdilate"); + ); +GST_ELEMENT_REGISTER_DEFINE (cvdilate, "cvdilate", GST_RANK_NONE, + GST_TYPE_CV_DILATE); static GstFlowReturn gst_cv_dilate_transform_ip (GstOpencvVideoFilter * filter, GstBuffer * buf, cv::Mat img); @@ -106,12 +110,3 @@ return GST_FLOW_OK; } - -gboolean -gst_cv_dilate_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_cv_dilate_debug, "cvdilate", 0, "cvdilate"); - - return gst_element_register (plugin, "cvdilate", GST_RANK_NONE, - GST_TYPE_CV_DILATE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvdilate.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvdilate.h
Changed
@@ -77,7 +77,7 @@ GType gst_cv_dilate_get_type (void); -gboolean gst_cv_dilate_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cvdilate); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvequalizehist.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvequalizehist.cpp
Changed
@@ -75,8 +75,13 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("GRAY8"))); -G_DEFINE_TYPE (GstCvEqualizeHist, gst_cv_equalize_hist, - GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstCvEqualizeHist, gst_cv_equalize_hist, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_cv_equalize_hist_debug, "cvequalizehist", 0, + "cvequalizehist"); + ); +GST_ELEMENT_REGISTER_DEFINE (cvequalizehist, "cvequalizehist", GST_RANK_NONE, + GST_TYPE_CV_EQUALIZE_HIST); static GstFlowReturn gst_cv_equalize_hist_transform (GstOpencvVideoFilter * filter, GstBuffer * buf, cv::Mat img, GstBuffer * outbuf, cv::Mat outimg); @@ -115,12 +120,3 @@ cv::equalizeHist (img, outimg); return GST_FLOW_OK; } - -gboolean -gst_cv_equalize_hist_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_cv_equalize_hist_debug, "cvequalizehist", 0, - "cvequalizehist"); - return gst_element_register (plugin, "cvequalizehist", GST_RANK_NONE, - GST_TYPE_CV_EQUALIZE_HIST); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvequalizehist.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvequalizehist.h
Changed
@@ -75,7 +75,7 @@ GType gst_cv_equalize_hist_get_type (void); -gboolean gst_cv_equalize_hist_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cvequalizehist); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcverode.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcverode.cpp
Changed
@@ -64,7 +64,11 @@ GST_DEBUG_CATEGORY_STATIC (gst_cv_erode_debug); #define GST_CAT_DEFAULT gst_cv_erode_debug -G_DEFINE_TYPE (GstCvErode, gst_cv_erode, GST_TYPE_CV_DILATE_ERODE); +G_DEFINE_TYPE_WITH_CODE (GstCvErode, gst_cv_erode, GST_TYPE_CV_DILATE_ERODE, + GST_DEBUG_CATEGORY_INIT (gst_cv_erode_debug, "cverode", 0, "cverode"); + ); +GST_ELEMENT_REGISTER_DEFINE (cverode, "cverode", GST_RANK_NONE, + GST_TYPE_CV_ERODE); static GstFlowReturn gst_cv_erode_transform_ip (GstOpencvVideoFilter * filter, GstBuffer * buf, cv::Mat img); @@ -106,12 +110,3 @@ return GST_FLOW_OK; } - -gboolean -gst_cv_erode_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_cv_erode_debug, "cverode", 0, "cverode"); - - return gst_element_register (plugin, "cverode", GST_RANK_NONE, - GST_TYPE_CV_ERODE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcverode.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcverode.h
Changed
@@ -77,7 +77,7 @@ GType gst_cv_erode_get_type (void); -gboolean gst_cv_erode_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cverode); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvlaplace.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvlaplace.cpp
Changed
@@ -96,8 +96,12 @@ #define DEFAULT_SHIFT 0.0 #define DEFAULT_MASK TRUE -G_DEFINE_TYPE (GstCvLaplace, gst_cv_laplace, GST_TYPE_OPENCV_VIDEO_FILTER); - +G_DEFINE_TYPE_WITH_CODE (GstCvLaplace, gst_cv_laplace, + GST_TYPE_OPENCV_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_cv_laplace_debug, + "cvlaplace", 0, "cvlaplace"); + ); +GST_ELEMENT_REGISTER_DEFINE (cvlaplace, "cvlaplace", GST_RANK_NONE, + GST_TYPE_CV_LAPLACE); static void gst_cv_laplace_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_cv_laplace_get_property (GObject * object, guint prop_id, @@ -274,12 +278,3 @@ return GST_FLOW_OK; } - -gboolean -gst_cv_laplace_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_cv_laplace_debug, "cvlaplace", 0, "cvlaplace"); - - return gst_element_register (plugin, "cvlaplace", GST_RANK_NONE, - GST_TYPE_CV_LAPLACE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvlaplace.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvlaplace.h
Changed
@@ -84,7 +84,7 @@ GType gst_cv_laplace_get_type (void); -gboolean gst_cv_laplace_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cvlaplace); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvsmooth.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvsmooth.cpp
Changed
@@ -135,7 +135,12 @@ #define DEFAULT_WIDTH G_MAXINT #define DEFAULT_HEIGHT G_MAXINT -G_DEFINE_TYPE (GstCvSmooth, gst_cv_smooth, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstCvSmooth, gst_cv_smooth, + GST_TYPE_OPENCV_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_cv_smooth_debug, + "cvsmooth", 0, "cvsmooth"); + ); +GST_ELEMENT_REGISTER_DEFINE (cvsmooth, "cvsmooth", GST_RANK_NONE, + GST_TYPE_CV_SMOOTH); static void gst_cv_smooth_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -423,12 +428,3 @@ return GST_FLOW_OK; } - -gboolean -gst_cv_smooth_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_cv_smooth_debug, "cvsmooth", 0, "cvsmooth"); - - return gst_element_register (plugin, "cvsmooth", GST_RANK_NONE, - GST_TYPE_CV_SMOOTH); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvsmooth.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvsmooth.h
Changed
@@ -88,7 +88,7 @@ GType gst_cv_smooth_get_type (void); -gboolean gst_cv_smooth_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cvsmooth); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvsobel.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvsobel.cpp
Changed
@@ -95,7 +95,11 @@ #define DEFAULT_APERTURE_SIZE 3 #define DEFAULT_MASK TRUE -G_DEFINE_TYPE (GstCvSobel, gst_cv_sobel, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstCvSobel, gst_cv_sobel, GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_cv_sobel_debug, "cvsobel", 0, "cvsobel"); + ); +GST_ELEMENT_REGISTER_DEFINE (cvsobel, "cvsobel", GST_RANK_NONE, + GST_TYPE_CV_SOBEL); static void gst_cv_sobel_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -269,12 +273,3 @@ return GST_FLOW_OK; } - -gboolean -gst_cv_sobel_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_cv_sobel_debug, "cvsobel", 0, "cvsobel"); - - return gst_element_register (plugin, "cvsobel", GST_RANK_NONE, - GST_TYPE_CV_SOBEL); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstcvsobel.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvsobel.h
Changed
@@ -83,7 +83,7 @@ GType gst_cv_sobel_get_type (void); -gboolean gst_cv_sobel_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (cvsobel); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvtracker.cpp
Added
@@ -0,0 +1,408 @@
+/*
+ * GStreamer
+ * Copyright (C) 2020 Vivek R <123vivekr@gmail.com>
+ * Copyright (C) 2021 Cesar Fabian Orccon Chipana <cfoch.fabian@gmail.com>
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a
+ * copy of this software and associated documentation files (the "Software"),
+ * to deal in the Software without restriction, including without limitation
+ * the rights to use, copy, modify, merge, publish, distribute, sublicense,
+ * and/or sell copies of the Software, and to permit persons to whom the
+ * Software is furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in
+ * all copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+ * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+ * DEALINGS IN THE SOFTWARE.
+ *
+ * Alternatively, the contents of this file may be used under the
+ * GNU Lesser General Public License Version 2.1 (the "LGPL"), in
+ * which case the following provisions apply instead of the ones
+ * mentioned above:
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+/**
+ * SECTION:element-cvtracker
+ *
+ * Performs object tracking on videos and stores it in video buffer metadata.
+ *
+ * ## Example launch line
+ *
+ * ```
+ * gst-launch-1.0 v4l2src ! videoconvert ! cvtracker box-x=50 box-y=50 box-wdith=50 box-height=50 ! videoconvert ! xvimagesink
+ * ```
+ *
+ * Since: 1.20
+ */
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include "gstcvtracker.h"
+
+GST_DEBUG_CATEGORY_STATIC (gst_cvtracker_debug);
+#define GST_CAT_DEFAULT gst_cvtracker_debug
+
+#define DEFAULT_PROP_INITIAL_X 50
+#define DEFAULT_PROP_INITIAL_Y 50
+#define DEFAULT_PROP_INITIAL_WIDTH 50
+#define DEFAULT_PROP_INITIAL_HEIGHT 50
+
+enum
+{
+  PROP_0,
+  PROP_INITIAL_X,
+  PROP_INITIAL_Y,
+  PROP_INITIAL_WIDTH,
+  PROP_INITIAL_HEIGHT,
+  PROP_ALGORITHM,
+  PROP_DRAW,
+};
+
+#define GST_OPENCV_TRACKER_ALGORITHM (tracker_algorithm_get_type ())
+
+/**
+ * GstOpenCVTrackerAlgorithm:
+ *
+ * Since: 1.20
+ */
+static GType
+tracker_algorithm_get_type (void)
+{
+  static GType algorithm = 0;
+  static const GEnumValue algorithms[] = {
+    {GST_OPENCV_TRACKER_ALGORITHM_BOOSTING, "the Boosting tracker", "Boosting"},
+    {GST_OPENCV_TRACKER_ALGORITHM_CSRT, "the CSRT tracker", "CSRT"},
+    {GST_OPENCV_TRACKER_ALGORITHM_KCF,
+        "the KCF (Kernelized Correlation Filter) tracker",
+        "KCF"},
+    {GST_OPENCV_TRACKER_ALGORITHM_MEDIANFLOW, "the Median Flow tracker",
+        "MedianFlow"},
+    {GST_OPENCV_TRACKER_ALGORITHM_MIL, "the MIL tracker", "MIL"},
+    {GST_OPENCV_TRACKER_ALGORITHM_MOSSE,
+        "the MOSSE (Minimum Output Sum of Squared Error) tracker", "MOSSE"},
+    {GST_OPENCV_TRACKER_ALGORITHM_TLD,
+        "the TLD (Tracking, learning and detection) tracker",
+        "TLD"},
+    {0, NULL, NULL},
+  };
+
+  if (!algorithm) {
+    algorithm =
+        g_enum_register_static ("GstOpenCVTrackerAlgorithm", algorithms);
+  }
+  return algorithm;
+}
+
+static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink",
+    GST_PAD_SINK,
+    GST_PAD_ALWAYS,
+    GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB"))
+    );
+
+static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src",
+    GST_PAD_SRC,
+    GST_PAD_ALWAYS,
+    GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB"))
+    );
+
+G_DEFINE_TYPE_WITH_CODE (GstCVTracker, gst_cvtracker,
+    GST_TYPE_OPENCV_VIDEO_FILTER,
+    GST_DEBUG_CATEGORY_INIT (gst_cvtracker_debug, "cvtracker", 0,
+        "Performs object tracking on videos and stores it in video buffer "
+        "metadata"));
+GST_ELEMENT_REGISTER_DEFINE (cvtracker, "cvtracker", GST_RANK_NONE,
+    GST_TYPE_OPENCV_TRACKER);
+
+static void gst_cvtracker_set_property (GObject * object,
+    guint prop_id, const GValue * value, GParamSpec * pspec);
+static void gst_cvtracker_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec);
+
+static GstFlowReturn gst_cvtracker_transform_ip (GstOpencvVideoFilter
+    * filter, GstBuffer * buf, cv::Mat img);
+
+static void
+gst_cvtracker_finalize (GObject * obj)
+{
+  GstCVTracker *filter = GST_OPENCV_TRACKER (obj);
+
+  filter->tracker.release ();
+  filter->roi.release ();
+
+  G_OBJECT_CLASS (gst_cvtracker_parent_class)->finalize (obj);
+}
+
+static void
+gst_cvtracker_class_init (GstCVTrackerClass * klass)
+{
+  GObjectClass *gobject_class;
+  GstOpencvVideoFilterClass *gstopencvbasefilter_class;
+  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
+
+  gobject_class = (GObjectClass *) klass;
+  gobject_class->finalize = GST_DEBUG_FUNCPTR (gst_cvtracker_finalize);
+  gstopencvbasefilter_class = (GstOpencvVideoFilterClass *) klass;
+
+  gstopencvbasefilter_class->cv_trans_ip_func = gst_cvtracker_transform_ip;
+
+  gobject_class->set_property = gst_cvtracker_set_property;
+  gobject_class->get_property = gst_cvtracker_get_property;
+
+  /*
+   * Tracker API in versions older than OpenCV 4.5.1 worked with a ROI based
+   * on Rect<double>. However newer versions use Rect<int>. Running the same
+   * tracker type on different versions may lead to round up errors.
+   * To avoid inconsistencies from the GStreamer side depending on the OpenCV
+   * version, use integer properties independently on the OpenCV.
+   **/
+  g_object_class_install_property (gobject_class, PROP_INITIAL_X,
+      g_param_spec_uint ("object-initial-x", "Initial X coordinate",
+          "Track object box's initial X coordinate", 0, G_MAXUINT,
+          DEFAULT_PROP_INITIAL_X,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INITIAL_Y,
+      g_param_spec_uint ("object-initial-y", "Initial Y coordinate",
+          "Track object box's initial Y coordinate", 0, G_MAXUINT,
+          DEFAULT_PROP_INITIAL_Y,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INITIAL_WIDTH,
+      g_param_spec_uint ("object-initial-width", "Object Initial Width",
+          "Track object box's initial width", 0, G_MAXUINT,
+          DEFAULT_PROP_INITIAL_WIDTH,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INITIAL_HEIGHT,
+      g_param_spec_uint ("object-initial-height", "Object Initial Height",
+          "Track object box's initial height", 0, G_MAXUINT,
+          DEFAULT_PROP_INITIAL_HEIGHT,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_ALGORITHM,
+      g_param_spec_enum ("algorithm", "Algorithm",
+          "Algorithm for tracking objects", GST_OPENCV_TRACKER_ALGORITHM,
+          GST_OPENCV_TRACKER_ALGORITHM_MEDIANFLOW,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_DRAW,
+      g_param_spec_boolean ("draw-rect", "Display",
+          "Draw rectangle around tracked object",
+          TRUE, (GParamFlags) G_PARAM_READWRITE));
+
+  gst_element_class_set_static_metadata (element_class,
+      "cvtracker",
+      "Filter/Effect/Video",
+      "Performs object tracking on videos and stores it in video buffer metadata.",
+      "Vivek R <123vivekr@gmail.com>");
+
+  gst_element_class_add_static_pad_template (element_class, &src_factory);
+  gst_element_class_add_static_pad_template (element_class, &sink_factory);
+
+  gst_type_mark_as_plugin_api (GST_OPENCV_TRACKER_ALGORITHM,
+      (GstPluginAPIFlags) 0);
+}
+
+static void
+gst_cvtracker_init (GstCVTracker * filter)
+{
+  filter->x = DEFAULT_PROP_INITIAL_X;
+  filter->y = DEFAULT_PROP_INITIAL_Y;
+  filter->width = DEFAULT_PROP_INITIAL_WIDTH;
+  filter->height = DEFAULT_PROP_INITIAL_HEIGHT;
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+  filter->tracker = cv::legacy::upgradeTrackingAPI(
+      cv::legacy::TrackerMedianFlow::create());
+#else
+  filter->tracker = cv::TrackerMedianFlow::create();
+#endif
+  filter->draw = TRUE;
+  filter->post_debug_info = TRUE;
+
+  gst_opencv_video_filter_set_in_place (GST_OPENCV_VIDEO_FILTER_CAST (filter),
+      TRUE);
+  filter->algorithm = GST_OPENCV_TRACKER_ALGORITHM_MEDIANFLOW;
+}
+
+static void
+gst_cvtracker_set_property (GObject * object, guint prop_id,
+    const GValue * value, GParamSpec * pspec)
+{
+  GstCVTracker *filter = GST_OPENCV_TRACKER (object);
+
+  switch (prop_id) {
+    case PROP_INITIAL_X:
+      filter->x = g_value_get_uint (value);
+      break;
+    case PROP_INITIAL_Y:
+      filter->y = g_value_get_uint (value);
+      break;
+    case PROP_INITIAL_WIDTH:
+      filter->width = g_value_get_uint (value);
+      break;
+    case PROP_INITIAL_HEIGHT:
+      filter->height = g_value_get_uint (value);
+      break;
+    case PROP_ALGORITHM:
+      filter->algorithm = g_value_get_enum (value);
+      break;
+    case PROP_DRAW:
+      filter->draw = g_value_get_boolean (value);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+create_cvtracker (GstCVTracker * filter)
+{
+  switch (filter->algorithm) {
+    case GST_OPENCV_TRACKER_ALGORITHM_BOOSTING:
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+      filter->tracker = cv::legacy::upgradeTrackingAPI(
+          cv::legacy::TrackerBoosting::create());
+#else
+      filter->tracker = cv::TrackerBoosting::create();
+#endif
+      break;
+    case GST_OPENCV_TRACKER_ALGORITHM_CSRT:
+      filter->tracker = cv::TrackerCSRT::create ();
+      break;
+    case GST_OPENCV_TRACKER_ALGORITHM_KCF:
+      filter->tracker = cv::TrackerKCF::create ();
+      break;
+    case GST_OPENCV_TRACKER_ALGORITHM_MEDIANFLOW:
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+      filter->tracker = cv::legacy::upgradeTrackingAPI(
+          cv::legacy::TrackerMedianFlow::create());
+#else
+      filter->tracker = cv::TrackerMedianFlow::create();
+#endif
+      break;
+    case GST_OPENCV_TRACKER_ALGORITHM_MIL:
+      filter->tracker = cv::TrackerMIL::create ();
+      break;
+    case GST_OPENCV_TRACKER_ALGORITHM_MOSSE:
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+      filter->tracker = cv::legacy::upgradeTrackingAPI(
+          cv::legacy::TrackerMOSSE::create());
+#else
+      filter->tracker = cv::TrackerMOSSE::create ();
+#endif
+      break;
+    case GST_OPENCV_TRACKER_ALGORITHM_TLD:
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+      filter->tracker = cv::legacy::upgradeTrackingAPI(
+          cv::legacy::TrackerTLD::create());
+#else
+      filter->tracker = cv::TrackerTLD::create();
+#endif
+      break;
+  }
+}
+
+static void
+gst_cvtracker_get_property (GObject * object, guint prop_id,
+    GValue * value, GParamSpec * pspec)
+{
+  GstCVTracker *filter = GST_OPENCV_TRACKER (object);
+
+  switch (prop_id) {
+    case PROP_INITIAL_X:
+      g_value_set_uint (value, filter->x);
+      break;
+    case PROP_INITIAL_Y:
+      g_value_set_uint (value, filter->y);
+      break;
+    case PROP_INITIAL_WIDTH:
+      g_value_set_uint (value, filter->width);
+      break;
+    case PROP_INITIAL_HEIGHT:
+      g_value_set_uint (value, filter->height);
+      break;
+    case PROP_ALGORITHM:
+      g_value_set_enum (value, filter->algorithm);
+      break;
+    case PROP_DRAW:
+      g_value_set_boolean (value, filter->draw);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static GstFlowReturn
+gst_cvtracker_transform_ip (GstOpencvVideoFilter * base,
+    GstBuffer * buf, cv::Mat img)
+{
+  GstCVTracker *filter = GST_OPENCV_TRACKER (base);
+  GstStructure *s;
+  GstMessage *msg;
+
+  if (filter->roi.empty ()) {
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+    filter->roi = new (cv::Rect);
+#else
+    filter->roi = new (cv::Rect2d);
+#endif
+    filter->roi->x = filter->x;
+    filter->roi->y = filter->y;
+    filter->roi->width = filter->width;
+    filter->roi->height = filter->height;
+    create_cvtracker (filter);
+    filter->tracker->init (img, *filter->roi);
+  } else if (filter->tracker->update (img, *filter->roi)) {
+#if (!(CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1))
+    /* Round values to avoid inconsistencies depending on the OpenCV version. */
+    filter->roi->x = cvRound (filter->roi->x);
+    filter->roi->y = cvRound (filter->roi->y);
+    filter->roi->width = cvRound (filter->roi->width);
+    filter->roi->height = cvRound (filter->roi->height);
+#endif
+    s = gst_structure_new ("object",
+        "x", G_TYPE_UINT, (guint) filter->roi->x,
+        "y", G_TYPE_UINT, (guint) filter->roi->y,
+        "width", G_TYPE_UINT, (guint) filter->roi->width,
+        "height", G_TYPE_UINT, (guint) filter->roi->height, NULL);
+    msg = gst_message_new_element (GST_OBJECT (filter), s);
+    gst_buffer_add_video_region_of_interest_meta (buf, "object",
+        filter->roi->x, filter->roi->y, filter->roi->width,
+        filter->roi->height);
+    gst_element_post_message (GST_ELEMENT (filter), msg);
+    if (filter->draw)
+      cv::rectangle (img, *filter->roi, cv::Scalar (255, 0, 0), 2, 1);
+    if (!(filter->post_debug_info))
+      filter->post_debug_info = TRUE;
+  } else if (filter->post_debug_info) {
+    GST_DEBUG_OBJECT (filter, "tracker lost");
+    filter->post_debug_info = FALSE;
+  }
+
+  return GST_FLOW_OK;
+}
gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstcvtracker.h
Added
@@ -0,0 +1,115 @@
+/*
+ * GStreamer
+ * Copyright (C) 2020 Vivek R <123vivekr@gmail.com>
+ * Copyright (C) 2021 Cesar Fabian Orccon Chipana <cfoch.fabian@gmail.com>
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a
+ * copy of this software and associated documentation files (the "Software"),
+ * to deal in the Software without restriction, including without limitation
+ * the rights to use, copy, modify, merge, publish, distribute, sublicense,
+ * and/or sell copies of the Software, and to permit persons to whom the
+ * Software is furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in
+ * all copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+ * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+ * DEALINGS IN THE SOFTWARE.
+ *
+ * Alternatively, the contents of this file may be used under the
+ * GNU Lesser General Public License Version 2.1 (the "LGPL"), in
+ * which case the following provisions apply instead of the ones
+ * mentioned above:
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_CVTRACKER_H__
+#define __GST_CVTRACKER_H__
+
+#include <gst/video/gstvideometa.h>
+#include <gst/opencv/gstopencvvideofilter.h>
+#include <opencv2/core.hpp>
+#include <opencv2/imgproc.hpp>
+#include <opencv2/tracking.hpp>
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+#include <opencv2/tracking/tracking_legacy.hpp>
+#endif
+
+G_BEGIN_DECLS
+
+/* #defines don't like whitespacey bits */
+#define GST_TYPE_OPENCV_TRACKER \
+  (gst_cvtracker_get_type())
+#define GST_OPENCV_TRACKER(obj) \
+  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_OPENCV_TRACKER,GstCVTracker))
+#define GST_OPENCV_TRACKER_CLASS(klass) \
+  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_OPENCV_TRACKER,GstCVTrackerClass))
+#define GST_IS_OPENCV_TRACKER(obj) \
+  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_OPENCV_TRACKER))
+#define GST_IS_OPENCV_TRACKER_CLASS(klass) \
+  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_OPENCV_TRACKER))
+
+typedef struct _GstCVTracker GstCVTracker;
+typedef struct _GstCVTrackerClass GstCVTrackerClass;
+
+struct _GstCVTracker
+{
+  GstOpencvVideoFilter element;
+
+  guint x;
+  guint y;
+  guint width;
+  guint height;
+  gint algorithm;
+  gboolean draw;
+  gboolean post_debug_info;
+
+  cv::Ptr<cv::Tracker> tracker;
+#if CV_VERSION_MAJOR == 4 && CV_VERSION_MINOR >= 5 && CV_VERSION_REVISION >= 1
+  cv::Ptr<cv::Rect> roi;
+#else
+  cv::Ptr<cv::Rect2d> roi;
+#endif
+};
+
+typedef enum {
+  GST_OPENCV_TRACKER_ALGORITHM_BOOSTING,
+  GST_OPENCV_TRACKER_ALGORITHM_CSRT,
+  GST_OPENCV_TRACKER_ALGORITHM_KCF,
+  GST_OPENCV_TRACKER_ALGORITHM_MEDIANFLOW,
+  GST_OPENCV_TRACKER_ALGORITHM_MIL,
+  GST_OPENCV_TRACKER_ALGORITHM_MOSSE,
+  GST_OPENCV_TRACKER_ALGORITHM_TLD,
+} GstOpenCVTrackerAlgorithm;
+
+struct _GstCVTrackerClass
+{
+  GstOpencvVideoFilterClass parent_class;
+};
+
+GType gst_cvtracker_get_type (void);
+
+GST_ELEMENT_REGISTER_DECLARE (cvtracker);
+
+G_END_DECLS
+
+#endif /* __GST_CVTRACKER_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstdewarp.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstdewarp.cpp
Changed
@@ -129,7 +129,11 @@ return dewarp_interpolation_mode_type; } -G_DEFINE_TYPE (GstDewarp, gst_dewarp, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstDewarp, gst_dewarp, GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_dewarp_debug, "dewarp", 0, + "Dewarp fisheye images"); + ); +GST_ELEMENT_REGISTER_DEFINE (dewarp, "dewarp", GST_RANK_NONE, GST_TYPE_DEWARP); static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, @@ -714,13 +718,3 @@ return ret; } - -gboolean -gst_dewarp_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_dewarp_debug, "dewarp", - 0, "Dewarp fisheye images"); - - return gst_element_register (plugin, "dewarp", GST_RANK_NONE, - GST_TYPE_DEWARP); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstdewarp.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstdewarp.h
Changed
@@ -106,7 +106,7 @@ GType gst_dewarp_get_type (void); -gboolean gst_dewarp_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (dewarp); G_END_DECLS #endif /* __GST_DEWARP_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstdisparity.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstdisparity.cpp
Changed
@@ -176,7 +176,12 @@ GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB")) ); -G_DEFINE_TYPE (GstDisparity, gst_disparity, GST_TYPE_ELEMENT); +G_DEFINE_TYPE_WITH_CODE (GstDisparity, gst_disparity, GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (gst_disparity_debug, "disparity", 0, + "Stereo image disparity (depth) map calculation"); + ); +GST_ELEMENT_REGISTER_DEFINE (disparity, "disparity", GST_RANK_NONE, + GST_TYPE_DISPARITY); static void gst_disparity_finalize (GObject * object); static void gst_disparity_set_property (GObject * object, guint prop_id, @@ -585,24 +590,6 @@ return ret; } - - - - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_disparity_plugin_init (GstPlugin * disparity) -{ - GST_DEBUG_CATEGORY_INIT (gst_disparity_debug, "disparity", 0, - "Stereo image disparity (depth) map calculation"); - return gst_element_register (disparity, "disparity", GST_RANK_NONE, - GST_TYPE_DISPARITY); -} - - static void initialise_disparity (GstDisparity * fs, int width, int height, int nchannels) {
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstdisparity.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstdisparity.h
Changed
@@ -108,7 +108,7 @@ GType gst_disparity_get_type (void); -gboolean gst_disparity_plugin_init (GstPlugin * disparity); +GST_ELEMENT_REGISTER_DECLARE (disparity); G_END_DECLS #endif /* __GST_DISPARITY_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstedgedetect.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstedgedetect.cpp
Changed
@@ -97,7 +97,12 @@ GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB")) ); -G_DEFINE_TYPE (GstEdgeDetect, gst_edge_detect, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstEdgeDetect, gst_edge_detect, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_edge_detect_debug, "edgedetect", 0, + "Performs canny edge detection on videos and images");); +GST_ELEMENT_REGISTER_DEFINE (edgedetect, "edgedetect", GST_RANK_NONE, + GST_TYPE_EDGE_DETECT); static void gst_edge_detect_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -269,18 +274,3 @@ return GST_FLOW_OK; } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_edge_detect_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_edge_detect_debug, "edgedetect", - 0, "Performs canny edge detection on videos and images"); - - return gst_element_register (plugin, "edgedetect", GST_RANK_NONE, - GST_TYPE_EDGE_DETECT); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstedgedetect.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstedgedetect.h
Changed
@@ -86,7 +86,7 @@ GType gst_edge_detect_get_type (void); -gboolean gst_edge_detect_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (edgedetect); G_END_DECLS #endif /* __GST_EDGE_DETECT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstfaceblur.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstfaceblur.cpp
Changed
@@ -143,7 +143,11 @@ GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB")) ); -G_DEFINE_TYPE (GstFaceBlur, gst_face_blur, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstFaceBlur, gst_face_blur, + GST_TYPE_OPENCV_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_face_blur_debug, + "faceblur", 0, "Blurs faces in images and videos");); +GST_ELEMENT_REGISTER_DEFINE (faceblur, "faceblur", GST_RANK_NONE, + GST_TYPE_FACE_BLUR); static void gst_face_blur_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -386,19 +390,3 @@ } return cascade; } - - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_face_blur_plugin_init (GstPlugin * plugin) -{ - /* debug category for filtering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_face_blur_debug, "faceblur", - 0, "Blurs faces in images and videos"); - - return gst_element_register (plugin, "faceblur", GST_RANK_NONE, - GST_TYPE_FACE_BLUR); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstfaceblur.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstfaceblur.h
Changed
@@ -93,7 +93,7 @@ GType gst_face_blur_get_type (void); -gboolean gst_face_blur_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (faceblur); G_END_DECLS #endif /* __GST_FACE_BLUR_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstfacedetect.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstfacedetect.cpp
Changed
@@ -225,7 +225,12 @@ GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB")) ); -G_DEFINE_TYPE (GstFaceDetect, gst_face_detect, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstFaceDetect, gst_face_detect, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_face_detect_debug, "facedetect", 0, + "Performs face detection on videos and images, providing detected positions via bus messages");); +GST_ELEMENT_REGISTER_DEFINE (facedetect, "facedetect", GST_RANK_NONE, + GST_TYPE_FACE_DETECT); static void gst_face_detect_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -789,20 +794,3 @@ return cascade; } - - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_face_detect_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_face_detect_debug, "facedetect", - 0, - "Performs face detection on videos and images, providing detected positions via bus messages"); - - return gst_element_register (plugin, "facedetect", GST_RANK_NONE, - GST_TYPE_FACE_DETECT); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstfacedetect.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstfacedetect.h
Changed
@@ -117,7 +117,7 @@ GType gst_face_detect_get_type (void); -gboolean gst_face_detect_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (facedetect); G_END_DECLS #endif /* __GST_FACE_DETECT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstgrabcut.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstgrabcut.cpp
Changed
@@ -107,7 +107,12 @@ #define DEFAULT_TEST_MODE FALSE #define DEFAULT_SCALE 1.6 -G_DEFINE_TYPE (GstGrabcut, gst_grabcut, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstGrabcut, gst_grabcut, GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_grabcut_debug, "grabcut", 0, + "Grabcut image segmentation on either input alpha or input bounding box");); +GST_ELEMENT_REGISTER_DEFINE (grabcut, "grabcut", GST_RANK_NONE, + GST_TYPE_GRABCUT); + static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, @@ -353,23 +358,6 @@ return GST_FLOW_OK; } -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_grabcut_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages - * - */ - GST_DEBUG_CATEGORY_INIT (gst_grabcut_debug, "grabcut", - 0, - "Grabcut image segmentation on either input alpha or input bounding box"); - - return gst_element_register (plugin, "grabcut", GST_RANK_NONE, - GST_TYPE_GRABCUT); -} void compose_matrix_from_image (Mat output, Mat input)
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstgrabcut.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstgrabcut.h
Changed
@@ -93,7 +93,7 @@ GType gst_grabcut_get_type (void); -gboolean gst_grabcut_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (grabcut); G_END_DECLS #endif /* __GST_GRABCUT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gsthanddetect.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gsthanddetect.cpp
Changed
@@ -127,7 +127,11 @@ G_DEFINE_TYPE_WITH_CODE (GstHanddetect, gst_handdetect, GST_TYPE_OPENCV_VIDEO_FILTER, G_IMPLEMENT_INTERFACE (GST_TYPE_NAVIGATION, - gst_handdetect_navigation_interface_init);); + gst_handdetect_navigation_interface_init); + GST_DEBUG_CATEGORY_INIT (gst_handdetect_debug, + "handdetect", 0, "opencv hand gesture detection")); +GST_ELEMENT_REGISTER_DEFINE (handdetect, "handdetect", GST_RANK_NONE, + GST_TYPE_HANDDETECT); static void gst_handdetect_navigation_interface_init (GstNavigationInterface * iface) @@ -624,16 +628,3 @@ return cascade; } - -/* Entry point to initialize the plug-in - * Initialize the plug-in itself - * Register the element factories and other features - */ -gboolean -gst_handdetect_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_handdetect_debug, - "handdetect", 0, "opencv hand gesture detection"); - return gst_element_register (plugin, "handdetect", GST_RANK_NONE, - GST_TYPE_HANDDETECT); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gsthanddetect.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gsthanddetect.h
Changed
@@ -95,7 +95,7 @@ GType gst_handdetect_get_type (void); -gboolean gst_handdetect_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (handdetect); G_END_DECLS #endif /* __GST_HANDDETECT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstmotioncells.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstmotioncells.cpp
Changed
@@ -140,7 +140,12 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB"))); -G_DEFINE_TYPE (GstMotioncells, gst_motion_cells, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstMotioncells, gst_motion_cells, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_motion_cells_debug, "motioncells", 0, + "Performs motion detection on videos, providing detected positions via bus messages");); +GST_ELEMENT_REGISTER_DEFINE (motioncells, "motioncells", GST_RANK_NONE, + GST_TYPE_MOTIONCELLS); static void gst_motion_cells_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -1118,20 +1123,3 @@ } return GST_FLOW_OK; } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_motion_cells_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_motion_cells_debug, - "motioncells", - 0, - "Performs motion detection on videos, providing detected positions via bus messages"); - - return gst_element_register (plugin, "motioncells", GST_RANK_NONE, - GST_TYPE_MOTIONCELLS); -}
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstmotioncells.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstmotioncells.h
Changed
@@ -99,7 +99,7 @@ GType gst_motion_cells_get_type (void); -gboolean gst_motion_cells_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (motioncells); G_END_DECLS #endif /* __GST_MOTION_CELLS_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstopencv.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstopencv.cpp
Changed
@@ -44,74 +44,38 @@ #include "gstdewarp.h" #include "gstcameracalibrate.h" #include "gstcameraundistort.h" +#include "gstcvtracker.h" + static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_cv_dilate_plugin_init (plugin)) - return FALSE; - - if (!gst_cv_equalize_hist_plugin_init (plugin)) - return FALSE; - - if (!gst_cv_erode_plugin_init (plugin)) - return FALSE; - - if (!gst_cv_laplace_plugin_init (plugin)) - return FALSE; - - if (!gst_cv_smooth_plugin_init (plugin)) - return FALSE; - - if (!gst_cv_sobel_plugin_init (plugin)) - return FALSE; - - if (!gst_edge_detect_plugin_init (plugin)) - return FALSE; - - if (!gst_face_blur_plugin_init (plugin)) - return FALSE; - - if (!gst_face_detect_plugin_init (plugin)) - return FALSE; - - if (!gst_motion_cells_plugin_init (plugin)) - return FALSE; - - if (!gst_template_match_plugin_init (plugin)) - return FALSE; - - if (!gst_opencv_text_overlay_plugin_init (plugin)) - return FALSE; - - if (!gst_handdetect_plugin_init (plugin)) - return FALSE; - - if (!gst_skin_detect_plugin_init (plugin)) - return FALSE; - - if (!gst_retinex_plugin_init (plugin)) - return FALSE; - - if (!gst_segmentation_plugin_init (plugin)) - return FALSE; - - if (!gst_grabcut_plugin_init (plugin)) - return FALSE; - - if (!gst_disparity_plugin_init (plugin)) - return FALSE; - - if (!gst_dewarp_plugin_init (plugin)) - return FALSE; - - if (!gst_camera_calibrate_plugin_init (plugin)) - return FALSE; - - if (!gst_camera_undistort_plugin_init (plugin)) - return FALSE; - - return TRUE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (cvdilate, plugin); + ret |= GST_ELEMENT_REGISTER (cvequalizehist, plugin); + ret |= GST_ELEMENT_REGISTER (cverode, plugin); + ret |= GST_ELEMENT_REGISTER (cvlaplace, plugin); + ret |= GST_ELEMENT_REGISTER (cvsmooth, plugin); + ret |= GST_ELEMENT_REGISTER (cvsobel, plugin); + ret |= GST_ELEMENT_REGISTER (edgedetect, plugin); + ret |= GST_ELEMENT_REGISTER (faceblur, plugin); + ret |= GST_ELEMENT_REGISTER (facedetect, 
plugin); + ret |= GST_ELEMENT_REGISTER (motioncells, plugin); + ret |= GST_ELEMENT_REGISTER (templatematch, plugin); + ret |= GST_ELEMENT_REGISTER (opencvtextoverlay, plugin); + ret |= GST_ELEMENT_REGISTER (handdetect, plugin); + ret |= GST_ELEMENT_REGISTER (skindetect, plugin); + ret |= GST_ELEMENT_REGISTER (retinex, plugin); + ret |= GST_ELEMENT_REGISTER (segmentation, plugin); + ret |= GST_ELEMENT_REGISTER (grabcut, plugin); + ret |= GST_ELEMENT_REGISTER (disparity, plugin); + ret |= GST_ELEMENT_REGISTER (dewarp, plugin); + ret |= GST_ELEMENT_REGISTER (cameracalibrate, plugin); + ret |= GST_ELEMENT_REGISTER (cameraundistort, plugin); + ret |= GST_ELEMENT_REGISTER (cvtracker, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
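The hunk above replaces the old early-return chain of `gst_*_plugin_init()` calls with OR-accumulated `GST_ELEMENT_REGISTER` invocations. The behavioural difference is worth noting: the old code failed the whole plugin on the first element that could not register, while the new code loads the plugin as long as at least one element registers. A minimal stand-alone sketch of that accumulation logic (using hypothetical stand-in functions, not the real GStreamer macros):

```c
#include <stdbool.h>

/* Hypothetical stand-ins for GST_ELEMENT_REGISTER (element, plugin):
 * each returns TRUE on successful registration. */
static bool register_ok (void)  { return true;  }
static bool register_bad (void) { return false; }

/* Mirrors the new plugin_init shape: OR-accumulate results so the plugin
 * still initialises when at least one element registers, instead of
 * returning FALSE at the first failure as the old version did. */
static bool plugin_init_sketch (void)
{
  bool ret = false;

  ret |= register_ok ();
  ret |= register_bad ();   /* a single failure no longer aborts init */

  return ret;
}
```

The same pattern recurs in the openexr, openh264 and openjpeg plugin entry points further down in this revision.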
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstretinex.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstretinex.cpp
Changed
@@ -79,7 +79,10 @@ { PROP_0, PROP_METHOD, - PROP_SCALES + PROP_SCALES, + PROP_SIGMA, + PROP_GAIN, + PROP_OFFSET, }; typedef enum { @@ -89,6 +92,9 @@ #define DEFAULT_METHOD METHOD_BASIC #define DEFAULT_SCALES 3 +#define DEFAULT_SIGMA 14.0 +#define DEFAULT_GAIN 128 +#define DEFAULT_OFFSET 128 #define GST_TYPE_RETINEX_METHOD (gst_retinex_method_get_type ()) static GType @@ -106,7 +112,13 @@ return etype; } -G_DEFINE_TYPE (GstRetinex, gst_retinex, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstRetinex, gst_retinex, GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_retinex_debug, "retinex", 0, + "Multiscale retinex for colour image enhancement"); + ); +GST_ELEMENT_REGISTER_DEFINE (retinex, "retinex", GST_RANK_NONE, + GST_TYPE_RETINEX); + static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, @@ -160,6 +172,42 @@ 4, DEFAULT_SCALES, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + /** + * GstRetinex:sigma: + * + * Sigma + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_SIGMA, + g_param_spec_double ("sigma", "Sigma", + "Sigma", 0.0, G_MAXDOUBLE, DEFAULT_SIGMA, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstRetinex:gain: + * + * Gain + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_GAIN, + g_param_spec_int ("gain", "gain", + "Gain", 0, G_MAXINT, DEFAULT_GAIN, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstRetinex:offset: + * + * Offset + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_OFFSET, + g_param_spec_int ("offset", "Offset", + "Offset", 0, G_MAXINT, DEFAULT_OFFSET, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + gst_element_class_set_static_metadata (element_class, "Retinex image colour enhancement", "Filter/Effect/Video", "Multiscale retinex for colour image enhancement", @@ -182,6 +230,9 @@ filter->method = 
DEFAULT_METHOD; filter->scales = DEFAULT_SCALES; filter->current_scales = 0; + filter->gain = DEFAULT_GAIN; + filter->offset = DEFAULT_OFFSET; + filter->sigma = DEFAULT_SIGMA; gst_opencv_video_filter_set_in_place (GST_OPENCV_VIDEO_FILTER_CAST (filter), TRUE); } @@ -217,6 +268,15 @@ case PROP_SCALES: retinex->scales = g_value_get_int (value); break; + case PROP_SIGMA: + retinex->sigma = g_value_get_double (value); + break; + case PROP_GAIN: + retinex->gain = g_value_get_int (value); + break; + case PROP_OFFSET: + retinex->offset = g_value_get_int (value); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -236,6 +296,15 @@ case PROP_SCALES: g_value_set_int (value, filter->scales); break; + case PROP_SIGMA: + g_value_set_double (value, filter->sigma); + break; + case PROP_GAIN: + g_value_set_int (value, filter->gain); + break; + case PROP_OFFSET: + g_value_set_int (value, filter->offset); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -265,9 +334,6 @@ Mat img) { GstRetinex *retinex = GST_RETINEX (filter); - double sigma = 14.0; - int gain = 128; - int offset = 128; int filter_size; /* Basic retinex restoration. The image and a filtered image are converted @@ -280,7 +346,7 @@ log (retinex->cvA, retinex->cvB); /* Compute log of blurred image */ - filter_size = (int) floor (sigma * 6) / 2; + filter_size = (int) floor (retinex->sigma * 6) / 2; filter_size = filter_size * 2 + 1; img.convertTo (retinex->cvD, retinex->cvD.type ()); @@ -292,7 +358,7 @@ subtract (retinex->cvB, retinex->cvC, retinex->cvA); /* Restore */ - retinex->cvA.convertTo (img, img.type (), (float) gain, (float) offset); + retinex->cvA.convertTo (img, img.type (), (float) retinex->gain, (float) retinex->offset); } /* Multiscale retinex restoration. 
The image and a set of filtered images are converted to the log domain and subtracted from the original with some set @@ -339,25 +405,8 @@ } /* Restore */ - retinex->cvB.convertTo (img, img.type (), (float) gain, (float) offset); + retinex->cvB.convertTo (img, img.type (), (float) retinex->gain, (float) retinex->offset); } return GST_FLOW_OK; } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_retinex_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages - * - */ - GST_DEBUG_CATEGORY_INIT (gst_retinex_debug, "retinex", - 0, "Multiscale retinex for colour image enhancement"); - - return gst_element_register (plugin, "retinex", GST_RANK_NONE, - GST_TYPE_RETINEX); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstretinex.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstretinex.h
Changed
@@ -72,6 +72,9 @@ double *weights; double *sigmas; + double sigma; + int gain, offset; + cv::Mat cvA; cv::Mat cvB; cv::Mat cvC; @@ -85,7 +88,7 @@ GType gst_retinex_get_type (void); -gboolean gst_retinex_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (retinex); G_END_DECLS #endif /* __GST_RETINEX_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstsegmentation.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstsegmentation.cpp
Changed
@@ -139,7 +139,13 @@ return etype; } -G_DEFINE_TYPE (GstSegmentation, gst_segmentation, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstSegmentation, gst_segmentation, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_segmentation_debug, "segmentation", 0, + "Performs Foreground/Background segmentation in video sequences"); + ); +GST_ELEMENT_REGISTER_DEFINE (segmentation, "segmentation", GST_RANK_NONE, + GST_TYPE_SEGMENTATION); static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, @@ -450,21 +456,6 @@ return GST_FLOW_OK; } -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_segmentation_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_segmentation_debug, "segmentation", - 0, "Performs Foreground/Background segmentation in video sequences"); - - return gst_element_register (plugin, "segmentation", GST_RANK_NONE, - GST_TYPE_SEGMENTATION); -} - - #ifdef CODE_FROM_OREILLY_BOOK /* See license at the beginning of the page */ /*
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstsegmentation.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstsegmentation.h
Changed
@@ -118,7 +118,7 @@ GType gst_segmentation_get_type (void); -gboolean gst_segmentation_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (segmentation); G_END_DECLS #endif /* __GST_SEGMENTATION_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstskindetect.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstskindetect.cpp
Changed
@@ -99,7 +99,14 @@ return etype; } -G_DEFINE_TYPE (GstSkinDetect, gst_skin_detect, GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstSkinDetect, gst_skin_detect, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_skin_detect_debug, "skindetect", 0, + "Performs skin detection on videos and images"); + ); +GST_ELEMENT_REGISTER_DEFINE (skindetect, "skindetect", GST_RANK_NONE, + GST_TYPE_SKIN_DETECT); + static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, @@ -385,20 +392,3 @@ return GST_FLOW_OK; } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_skin_detect_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages - * - */ - GST_DEBUG_CATEGORY_INIT (gst_skin_detect_debug, "skindetect", - 0, "Performs skin detection on videos and images"); - - return gst_element_register (plugin, "skindetect", GST_RANK_NONE, - GST_TYPE_SKIN_DETECT); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gstskindetect.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gstskindetect.h
Changed
@@ -84,7 +84,7 @@ GType gst_skin_detect_get_type (void); -gboolean gst_skin_detect_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (skindetect); G_END_DECLS #endif /* __GST_SKIN_DETECT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gsttemplatematch.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gsttemplatematch.cpp
Changed
@@ -99,8 +99,13 @@ GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("BGR")) ); -G_DEFINE_TYPE (GstTemplateMatch, gst_template_match, - GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstTemplateMatch, gst_template_match, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_template_match_debug, "templatematch", 0, + "Performs template matching on videos and images, providing detected positions via bus messages"); + ); +GST_ELEMENT_REGISTER_DEFINE (templatematch, "templatematch", + GST_RANK_NONE, GST_TYPE_TEMPLATE_MATCH); static void gst_template_match_finalize (GObject * object); static void gst_template_match_set_property (GObject * object, guint prop_id, @@ -379,19 +384,3 @@ } return GST_FLOW_OK; } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_template_match_plugin_init (GstPlugin * templatematch) -{ - /* debug category for fltering log messages */ - GST_DEBUG_CATEGORY_INIT (gst_template_match_debug, "templatematch", - 0, - "Performs template matching on videos and images, providing detected positions via bus messages"); - - return gst_element_register (templatematch, "templatematch", GST_RANK_NONE, - GST_TYPE_TEMPLATE_MATCH); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gsttemplatematch.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gsttemplatematch.h
Changed
@@ -83,7 +83,7 @@ GType gst_template_match_get_type (void); -gboolean gst_template_match_plugin_init (GstPlugin * templatematch); +GST_ELEMENT_REGISTER_DECLARE (templatematch); G_END_DECLS #endif /* __GST_TEMPLATE_MATCH_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gsttextoverlay.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gsttextoverlay.cpp
Changed
@@ -111,8 +111,13 @@ GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("RGB")) ); -G_DEFINE_TYPE (GstOpencvTextOverlay, gst_opencv_text_overlay, - GST_TYPE_OPENCV_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstOpencvTextOverlay, gst_opencv_text_overlay, + GST_TYPE_OPENCV_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (gst_opencv_text_overlay_debug, "opencvtextoverlay", + 0, "Template opencvtextoverlay"); + ); +GST_ELEMENT_REGISTER_DEFINE (opencvtextoverlay, "opencvtextoverlay", + GST_RANK_NONE, GST_TYPE_OPENCV_TEXT_OVERLAY); static void gst_opencv_text_overlay_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -336,22 +341,3 @@ return GST_FLOW_OK; } - - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_opencv_text_overlay_plugin_init (GstPlugin * plugin) -{ - /* debug category for fltering log messages - * - * exchange the string 'Template opencvtextoverlay' with your description - */ - GST_DEBUG_CATEGORY_INIT (gst_opencv_text_overlay_debug, "opencvtextoverlay", - 0, "Template opencvtextoverlay"); - - return gst_element_register (plugin, "opencvtextoverlay", GST_RANK_NONE, - GST_TYPE_OPENCV_TEXT_OVERLAY); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/gsttextoverlay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/gsttextoverlay.h
Changed
@@ -87,7 +87,7 @@ }; GType gst_opencv_text_overlay_get_type (void); -gboolean gst_opencv_text_overlay_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (opencvtextoverlay); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/opencv/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/opencv/meson.build
Changed
@@ -30,7 +30,8 @@ 'camerautils.cpp', 'cameraevent.cpp', 'gstcameracalibrate.cpp', - 'gstcameraundistort.cpp' + 'gstcameraundistort.cpp', + 'gstcvtracker.cpp' ] libopencv_headers = [ @@ -41,6 +42,7 @@ 'opencv2/objdetect.hpp', 'opencv2/opencv.hpp', 'opencv2/video.hpp', + 'opencv2/tracking.hpp', ] libopencv4_headers = [ @@ -51,6 +53,7 @@ 'opencv4/opencv2/objdetect.hpp', 'opencv4/opencv2/opencv.hpp', 'opencv4/opencv2/video.hpp', + 'opencv4/opencv2/tracking.hpp', ] gstopencv_cargs = ['-DGST_HAAR_CASCADES_DIR="@0@"'] @@ -65,7 +68,9 @@ opencv_found = false endif endforeach -else +endif + +if not opencv_found opencv_dep = dependency('opencv4', version : ['>= 4.0.0', '< 4.6.0'], required : false) opencv_found = opencv_dep.found() if opencv_found @@ -79,20 +84,20 @@ endif if opencv_found - opencv_prefix = opencv_dep.get_pkgconfig_variable('prefix') + opencv_prefix = opencv_dep.get_variable('prefix') gstopencv_cargs += ['-DOPENCV_PREFIX="' + opencv_prefix + '"'] # Check the data dir used by opencv for its xml data files # Use prefix from pkg-config to be compatible with cross-compilation - r = run_command('test', '-d', opencv_prefix + '/share/opencv') + r = run_command('test', '-d', opencv_prefix + '/share/opencv', check: false) if r.returncode() == 0 gstopencv_cargs += '-DOPENCV_PATH_NAME="opencv"' else - r = run_command('test', '-d', opencv_prefix + '/share/OpenCV') + r = run_command('test', '-d', opencv_prefix + '/share/OpenCV', check: false) if r.returncode() == 0 gstopencv_cargs += '-DOPENCV_PATH_NAME="OpenCV"' else - r = run_command('test', '-d', opencv_prefix + '/share/opencv4') + r = run_command('test', '-d', opencv_prefix + '/share/opencv4', check: false) if r.returncode() == 0 gstopencv_cargs += '-DOPENCV_PATH_NAME="opencv4"' else @@ -113,7 +118,7 @@ gstopencv = library('gstopencv', gstopencv_sources, cpp_args : gst_plugins_bad_args + gstopencv_cargs + [ '-DGST_USE_UNSTABLE_API' ], - link_args : noseh_link_args, + link_args : [noseh_link_args, '-lopencv_tracking'], 
include_directories : [configinc, libsinc], dependencies : [gstbase_dep, gstvideo_dep, opencv_dep, gstopencv_dep], install : true,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openexr/gstopenexr.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openexr/gstopenexr.c
Changed
@@ -29,11 +29,9 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "openexrdec", GST_RANK_PRIMARY, - GST_TYPE_OPENEXR_DEC)) - return FALSE; - return TRUE; + + return GST_ELEMENT_REGISTER (openexrdec, plugin);; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openexr/gstopenexrdec.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/openexr/gstopenexrdec.cpp
Changed
@@ -113,6 +113,8 @@ #define parent_class gst_openexr_dec_parent_class G_DEFINE_TYPE (GstOpenEXRDec, gst_openexr_dec, GST_TYPE_VIDEO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (openexrdec, "openexrdec", GST_RANK_PRIMARY, + GST_TYPE_OPENEXR_DEC); static void gst_openexr_dec_class_init (GstOpenEXRDecClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openexr/gstopenexrdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/openexr/gstopenexrdec.h
Changed
@@ -58,6 +58,8 @@ GType gst_openexr_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (openexrdec); + G_END_DECLS #endif /* __GST_OPENEXR_DEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openh264/gstopenh264dec.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/openh264/gstopenh264dec.cpp
Changed
@@ -27,6 +27,7 @@ #include "config.h" #endif +#include "gstopenh264elements.h" #include "gstopenh264dec.h" #include <wels/codec_ver.h> @@ -59,9 +60,10 @@ GstVideoCodecFrame * frame); static gboolean gst_openh264dec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query); +static gboolean openh264dec_element_init (GstPlugin * plugin); #if HAVE_OPENH264_MAIN_PROFILE -#define SUPPORTED_PROFILE_STR "profile=(string){ constrained-baseline, baseline, main, high }" +#define SUPPORTED_PROFILE_STR "profile=(string){ constrained-baseline, baseline, main, high, constrained-high, progressive-high }" #else #define SUPPORTED_PROFILE_STR "profile=(string){ constrained-baseline, baseline }" #endif @@ -88,6 +90,7 @@ GST_TYPE_VIDEO_DECODER, GST_DEBUG_CATEGORY_INIT (gst_openh264dec_debug_category, "openh264dec", 0, "debug category for openh264dec element")); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (openh264dec, openh264dec_element_init); static void gst_openh264dec_class_init (GstOpenh264DecClass * klass) @@ -298,9 +301,9 @@ if (ret != dsErrorFree) { /* Request a key unit from upstream */ GST_DEBUG_OBJECT (openh264dec, "Requesting a key unit"); - gst_pad_push_event (GST_VIDEO_DECODER_SINK_PAD (decoder), - gst_video_event_new_upstream_force_key_unit (GST_CLOCK_TIME_NONE, - FALSE, 0)); + + gst_video_decoder_request_sync_point (decoder, frame, + (GstVideoDecoderRequestSyncPointFlags) 0); GST_LOG_OBJECT (openh264dec, "error decoding nal, return code: %d", ret); gst_video_codec_frame_unref (frame); @@ -448,3 +451,14 @@ return TRUE; } + +static gboolean +openh264dec_element_init (GstPlugin * plugin) +{ + if (openh264_element_init (plugin)) + return gst_element_register (plugin, "openh264dec", GST_RANK_MARGINAL, + GST_TYPE_OPENH264DEC); + + GST_ERROR ("Incorrect library version loaded, expecting %s", g_strCodecVer); + return FALSE; +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openh264/gstopenh264element.c
Added
@@ -0,0 +1,48 @@ +/* + * Copyright (c) 2014, Ericsson AB. All rights reserved. + * + * Redistribution and use in source and binary forms, with or without modification, + * are permitted provided that the following conditions are met: + * + * 1. Redistributions of source code must retain the above copyright notice, this + * list of conditions and the following disclaimer. + * + * 2. Redistributions in binary form must reproduce the above copyright notice, this + * list of conditions and the following disclaimer in the documentation and/or other + * materials provided with the distribution. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND + * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED + * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. + * IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, + * INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT + * NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR + * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, + * WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) + * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY + * OF SUCH DAMAGE. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> +#include <wels/codec_api.h> +#include <wels/codec_ver.h> +#include <string.h> +#include "gstopenh264elements.h" + + +gboolean +openh264_element_init (GstPlugin * plugin) +{ + OpenH264Version libver; + /* g_stCodecVersion is the version detected at build time as defined in the + * headers and WelsGetCodecVersion() is the version detected at runtime. + * This is a safeguard to avoid crashes since OpenH264 has been changing + * ABI without changing the SONAME. 
+ */ + libver = WelsGetCodecVersion (); + return (memcmp (&libver, &g_stCodecVersion, sizeof (libver)) == 0); +}
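The new `gstopenh264element.c` above factors the ABI safeguard out of `plugin_init`: the version structure compiled into the headers (`g_stCodecVersion`) is compared byte-for-byte against the one reported by the loaded library (`WelsGetCodecVersion()`). A stand-alone sketch of that check, using a hypothetical version struct in place of OpenH264's real `OpenH264Version`:

```c
#include <stdbool.h>
#include <string.h>

/* Hypothetical stand-in for OpenH264Version; the real code compares the
 * build-time g_stCodecVersion against WelsGetCodecVersion() at runtime. */
typedef struct {
  unsigned major, minor, revision, reserved;
} Version;

/* Field-for-field equality via memcmp, as in openh264_element_init().
 * This rejects any mismatch, guarding against OpenH264's history of
 * ABI breaks without SONAME bumps. */
static bool
versions_match (const Version *built, const Version *loaded)
{
  return memcmp (built, loaded, sizeof (*built)) == 0;
}
```

Note that `memcmp` on a struct is safe here only because the compared struct is plain unsigned integers with no padding; the real OpenH264 struct has the same property.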
View file
gst-plugins-bad-1.20.1.tar.xz/ext/openh264/gstopenh264elements.h
Added
@@ -0,0 +1,37 @@ +/* GStreamer + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_OPENH264_ELEMENT_H__ +#define __GST_OPENH264_ELEMENT_H__ + +#include <config.h> + +#include <gst/gst.h> + +G_BEGIN_DECLS + +gboolean openh264_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (openh264dec); +GST_ELEMENT_REGISTER_DECLARE (openh264enc); + +G_END_DECLS + +#endif /* __GST_OPENH264_ELEMENT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openh264/gstopenh264enc.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/openh264/gstopenh264enc.cpp
Changed
@@ -27,8 +27,10 @@ #include "config.h" #endif +#include "gstopenh264elements.h" #include "gstopenh264enc.h" +#include <gst/pbutils/pbutils.h> #include <gst/gst.h> #include <gst/base/base.h> #include <gst/video/video.h> @@ -162,6 +164,7 @@ gint usage_type); static void gst_openh264enc_set_rate_control (GstOpenh264Enc * openh264enc, gint rc_mode); +static gboolean openh264enc_element_init (GstPlugin * plugin); #define DEFAULT_BITRATE (128000) @@ -222,8 +225,11 @@ GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS - ("video/x-h264, stream-format=(string)\"byte-stream\", alignment=(string)\"au\", profile=(string)\"baseline\"") - ); + ("video/x-h264, " + "stream-format=(string)\"byte-stream\", alignment=(string)\"au\"," + "profile = (string) { constrained-baseline, baseline, main, constrained-high, high }" + ) +); /* class initialization */ @@ -232,6 +238,7 @@ G_IMPLEMENT_INTERFACE (GST_TYPE_PRESET, NULL); GST_DEBUG_CATEGORY_INIT (gst_openh264enc_debug_category, "openh264enc", 0, "debug category for openh264enc element")); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (openh264enc, openh264enc_element_init); static void gst_openh264enc_class_init (GstOpenh264EncClass * klass) @@ -678,6 +685,42 @@ return TRUE; } +static guint8 +gst_openh264enc_get_level_from_caps (GstCaps *outcaps, GstCaps *allowed_caps) +{ + GstStructure *s = gst_caps_get_structure (outcaps, 0); + const gchar * level = gst_structure_get_string (gst_caps_get_structure (allowed_caps, 0), "level"); + + if (!level) + return LEVEL_UNKNOWN; + + gst_structure_set (s, "level", G_TYPE_STRING, level, NULL); + return gst_codec_utils_h264_get_level_idc (level); +} + +static EProfileIdc +gst_openh264enc_get_profile_from_caps (GstCaps *outcaps, GstCaps *allowed_caps) +{ + EProfileIdc oh264_profile = PRO_BASELINE; + GstStructure *allowed_s = gst_caps_get_structure (allowed_caps, 0); + GstStructure *s = gst_caps_get_structure (outcaps, 0); + const gchar *profile = gst_structure_get_string (allowed_s, "profile"); + + if (!profile) + 
return oh264_profile; + + gst_structure_set (s, "profile", G_TYPE_STRING, profile, NULL); + if (!g_strcmp0 (profile, "constrained-baseline") || + !g_strcmp0 (profile, "baseline")) + return PRO_BASELINE; + else if (!g_strcmp0 (profile, "main")) + return PRO_MAIN; + else if (!g_strcmp0 (profile, "high")) + return PRO_HIGH; + + g_assert_not_reached (); + return PRO_BASELINE; +} static gboolean gst_openh264enc_set_format (GstVideoEncoder * encoder, @@ -694,6 +737,7 @@ GstVideoCodecState *output_state; openh264enc->frame_count = 0; int video_format = videoFormatI420; + GstCaps *allowed_caps = NULL; debug_caps = gst_caps_to_string (state->caps); GST_DEBUG_OBJECT (openh264enc, "gst_e26d4_enc_set_format called, caps: %s", @@ -721,6 +765,12 @@ unsigned int uiTraceLevel = WELS_LOG_ERROR; openh264enc->encoder->SetOption (ENCODER_OPTION_TRACE_LEVEL, &uiTraceLevel); + outcaps = gst_static_pad_template_get_caps (&gst_openh264enc_src_template); + outcaps = gst_caps_make_writable (outcaps); + allowed_caps = gst_pad_get_allowed_caps (GST_VIDEO_ENCODER_SRC_PAD (encoder)); + allowed_caps = gst_caps_make_writable (allowed_caps); + allowed_caps = gst_caps_fixate (allowed_caps); + GST_OBJECT_LOCK (openh264enc); openh264enc->encoder->GetDefaultParams (&enc_params); @@ -753,13 +803,16 @@ enc_params.bPrefixNalAddingCtrl = 0; enc_params.fMaxFrameRate = fps_n * 1.0 / fps_d; enc_params.iLoopFilterDisableIdc = openh264enc->deblocking_mode; - enc_params.sSpatialLayers[0].uiProfileIdc = PRO_BASELINE; + enc_params.sSpatialLayers[0].uiProfileIdc = gst_openh264enc_get_profile_from_caps (outcaps, allowed_caps); + enc_params.sSpatialLayers[0].uiLevelIdc = (ELevelIdc) gst_openh264enc_get_level_from_caps (outcaps, allowed_caps); enc_params.sSpatialLayers[0].iVideoWidth = enc_params.iPicWidth; enc_params.sSpatialLayers[0].iVideoHeight = enc_params.iPicHeight; enc_params.sSpatialLayers[0].fFrameRate = fps_n * 1.0 / fps_d; enc_params.sSpatialLayers[0].iSpatialBitrate = enc_params.iTargetBitrate; 
enc_params.sSpatialLayers[0].iMaxSpatialBitrate = enc_params.iMaxBitrate; + gst_clear_caps (&allowed_caps); + if (openh264enc->slice_mode == GST_OPENH264_SLICE_MODE_N_SLICES) { if (openh264enc->num_slices == 1) slice_mode = SM_SINGLE_SLICE; @@ -803,10 +856,6 @@ openh264enc->encoder->SetOption (ENCODER_OPTION_DATAFORMAT, &video_format); - outcaps = - gst_caps_copy (gst_static_pad_template_get_caps - (&gst_openh264enc_src_template)); - output_state = gst_video_encoder_set_output_state (encoder, outcaps, state); gst_video_codec_state_unref (output_state); @@ -1007,3 +1056,13 @@ return GST_FLOW_OK; } +static gboolean +openh264enc_element_init (GstPlugin * plugin) +{ + if (openh264_element_init (plugin)) + return gst_element_register (plugin, "openh264enc", GST_RANK_MARGINAL, + GST_TYPE_OPENH264ENC); + + GST_ERROR ("Incorrect library version loaded, expecting %s", g_strCodecVer); + return FALSE; +}
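The encoder hunk above adds `gst_openh264enc_get_profile_from_caps`, which fixates the allowed downstream caps and maps the negotiated profile string onto OpenH264's `EProfileIdc`, falling back to baseline when no profile is set. A sketch of that string-to-enum mapping, with placeholder integer values standing in for the real `PRO_BASELINE`/`PRO_MAIN`/`PRO_HIGH` constants:

```c
#include <string.h>

/* Placeholder values standing in for OpenH264's EProfileIdc constants;
 * the actual enum values are defined by wels/codec_app_def.h. */
enum { PROFILE_BASELINE = 66, PROFILE_MAIN = 77, PROFILE_HIGH = 100 };

/* Mirrors the strcmp chain in gst_openh264enc_get_profile_from_caps:
 * an unset (NULL) profile falls back to baseline, and constrained-baseline
 * is encoded as baseline. */
static int
profile_from_string (const char *profile)
{
  if (!profile
      || !strcmp (profile, "constrained-baseline")
      || !strcmp (profile, "baseline"))
    return PROFILE_BASELINE;
  if (!strcmp (profile, "main"))
    return PROFILE_MAIN;
  if (!strcmp (profile, "high"))
    return PROFILE_HIGH;

  /* the real element asserts unreachable here, since the caps template
   * restricts the profile strings that can be negotiated */
  return PROFILE_BASELINE;
}
```

The real function additionally writes the chosen profile string back into the output caps with `gst_structure_set`, so downstream sees the profile the encoder was actually configured for.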
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openh264/gstopenh264plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openh264/gstopenh264plugin.c
Changed
@@ -31,31 +31,18 @@ #include <config.h> #endif -#include <gst/gst.h> -#include <wels/codec_api.h> -#include <wels/codec_ver.h> -#include <string.h> -#include "gstopenh264dec.h" -#include "gstopenh264enc.h" +#include "gstopenh264elements.h" + static gboolean plugin_init (GstPlugin * plugin) { - /* g_stCodecVersion is the version detected at build time as defined in the - * headers and WelsGetCodecVersion() is the version detected at runtime. - * This is a safeguard to avoid crashes since OpenH264 has been changing - * ABI without changing the SONAME. - */ - OpenH264Version libver = WelsGetCodecVersion (); - if (memcmp (&libver, &g_stCodecVersion, sizeof (libver)) == 0) { - gst_element_register (plugin, "openh264dec", GST_RANK_MARGINAL, - GST_TYPE_OPENH264DEC); - gst_element_register (plugin, "openh264enc", GST_RANK_MARGINAL, - GST_TYPE_OPENH264ENC); - } else { - GST_ERROR ("Incorrect library version loaded, expecting %s", g_strCodecVer); - } - return TRUE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (openh264dec, plugin); + ret |= GST_ELEMENT_REGISTER (openh264enc, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openh264/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/openh264/meson.build
Changed
@@ -1,6 +1,7 @@ openh264_sources = [ 'gstopenh264dec.cpp', 'gstopenh264enc.cpp', + 'gstopenh264element.c', 'gstopenh264plugin.c', ] @@ -14,7 +15,7 @@ c_args : gst_plugins_bad_args, link_args : noseh_link_args, include_directories : [configinc], - dependencies : [gstvideo_dep, openh264_dep], + dependencies : [gstvideo_dep, openh264_dep, gstpbutils_dep, ], install : true, install_dir : plugins_install_dir, )
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/gstopenjpeg.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/gstopenjpeg.c
Changed
@@ -30,14 +30,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "openjpegdec", GST_RANK_PRIMARY, - GST_TYPE_OPENJPEG_DEC)) - return FALSE; - if (!gst_element_register (plugin, "openjpegenc", GST_RANK_PRIMARY, - GST_TYPE_OPENJPEG_ENC)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (openjpegdec, plugin); + ret |= GST_ELEMENT_REGISTER (openjpegenc, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/gstopenjpeg.h -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/gstopenjpeg.h
Changed
@@ -23,4 +23,29 @@ #include <openjpeg.h> +typedef enum +{ + OPENJPEG_ERROR_NONE = 0, + OPENJPEG_ERROR_INIT, + OPENJPEG_ERROR_ENCODE, + OPENJPEG_ERROR_DECODE, + OPENJPEG_ERROR_OPEN, + OPENJPEG_ERROR_MAP_READ, + OPENJPEG_ERROR_MAP_WRITE, + OPENJPEG_ERROR_FILL_IMAGE, + OPENJPEG_ERROR_NEGOCIATE, + OPENJPEG_ERROR_ALLOCATE, +} OpenJPEGErrorCode; + +typedef struct +{ + GstVideoCodecFrame *frame; + GstBuffer *output_buffer; + GstBuffer *input_buffer; + gint stripe; + OpenJPEGErrorCode last_error; + gboolean direct; + gboolean last_subframe; +} GstOpenJPEGCodecMessage; + #endif /* __GST_OPENJPEG_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/gstopenjpegdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/gstopenjpegdec.c
Changed
@@ -20,6 +20,23 @@ * */ +/** + * SECTION:element-openjpegdec + * @title: openjpegdec + * @see_also: openjpegenc + * + * openjpegdec decodes openjpeg stream. + * + * ## Example launch lines + * |[ + * gst-launch-1.0 -v videotestsrc num-buffers=10 ! openjpegenc ! jpeg2000parse ! openjpegdec ! videoconvert ! autovideosink sync=false + * ]| Encode and decode whole frames. + * |[ + * gst-launch-1.0 -v videotestsrc num-buffers=10 ! openjpegenc num-threads=8 num-stripes=8 ! jpeg2000parse ! openjpegdec max-slice-threads=8 ! videoconvert ! autovideosink sync=fals + * ]| Encode and decode frame split with stripes. + * + */ + #ifdef HAVE_CONFIG_H #include "config.h" #endif @@ -36,15 +53,24 @@ { PROP_0, PROP_MAX_THREADS, + PROP_MAX_SLICE_THREADS, PROP_LAST }; #define GST_OPENJPEG_DEC_DEFAULT_MAX_THREADS 0 +/* prototypes */ +static void gst_openjpeg_dec_finalize (GObject * object); + +static GstStateChangeReturn +gst_openjpeg_dec_change_state (GstElement * element, GstStateChange transition); + static gboolean gst_openjpeg_dec_start (GstVideoDecoder * decoder); static gboolean gst_openjpeg_dec_stop (GstVideoDecoder * decoder); static gboolean gst_openjpeg_dec_set_format (GstVideoDecoder * decoder, GstVideoCodecState * state); +static gboolean gst_openjpeg_dec_flush (GstVideoDecoder * decoder); +static gboolean gst_openjpeg_dec_finish (GstVideoDecoder * decoder); static GstFlowReturn gst_openjpeg_dec_handle_frame (GstVideoDecoder * decoder, GstVideoCodecFrame * frame); static gboolean gst_openjpeg_dec_decide_allocation (GstVideoDecoder * decoder, @@ -54,6 +80,13 @@ static void gst_openjpeg_dec_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); +static gboolean gst_openjpeg_dec_decode_frame_multiple (GstVideoDecoder * + decoder, GstVideoCodecFrame * frame); +static gboolean gst_openjpeg_dec_decode_frame_single (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame); + +static void gst_openjpeg_dec_pause_loop (GstOpenJPEGDec * self, + 
GstFlowReturn flow_ret); #if G_BYTE_ORDER == G_LITTLE_ENDIAN #define GRAY16 "GRAY16_LE" @@ -69,7 +102,10 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS ("image/x-j2c, " GST_JPEG2000_SAMPLING_LIST "; " - "image/x-jpc, " GST_JPEG2000_SAMPLING_LIST "; " "image/jp2") + "image/x-jpc," + GST_JPEG2000_SAMPLING_LIST "; image/jp2 ; " + "image/x-jpc-striped, " + "num-stripes = (int) [2, MAX], " GST_JPEG2000_SAMPLING_LIST) ); static GstStaticPadTemplate gst_openjpeg_dec_src_template = @@ -83,6 +119,8 @@ #define parent_class gst_openjpeg_dec_parent_class G_DEFINE_TYPE (GstOpenJPEGDec, gst_openjpeg_dec, GST_TYPE_VIDEO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (openjpegdec, "openjpegdec", + GST_RANK_PRIMARY, GST_TYPE_OPENJPEG_DEC); static void gst_openjpeg_dec_class_init (GstOpenJPEGDecClass * klass) @@ -105,8 +143,13 @@ "Decode JPEG2000 streams", "Sebastian Dröge <sebastian.droege@collabora.co.uk>"); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_openjpeg_dec_change_state); + video_decoder_class->start = GST_DEBUG_FUNCPTR (gst_openjpeg_dec_start); video_decoder_class->stop = GST_DEBUG_FUNCPTR (gst_openjpeg_dec_stop); + video_decoder_class->flush = GST_DEBUG_FUNCPTR (gst_openjpeg_dec_flush); + video_decoder_class->finish = GST_DEBUG_FUNCPTR (gst_openjpeg_dec_finish); video_decoder_class->set_format = GST_DEBUG_FUNCPTR (gst_openjpeg_dec_set_format); video_decoder_class->handle_frame = @@ -114,17 +157,31 @@ video_decoder_class->decide_allocation = gst_openjpeg_dec_decide_allocation; gobject_class->set_property = gst_openjpeg_dec_set_property; gobject_class->get_property = gst_openjpeg_dec_get_property; + gobject_class->finalize = gst_openjpeg_dec_finalize; /** - * GstOpenJPEGDec:max-threads: + * GstOpenJPEGDec:max-slice-threads: * * Maximum number of worker threads to spawn. 
(0 = auto) * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), + PROP_MAX_SLICE_THREADS, g_param_spec_int ("max-slice-threads", + "Maximum slice decoding threads", + "Maximum number of worker threads to spawn according to the frame boundary. (0 = no thread)", + 0, G_MAXINT, GST_OPENJPEG_DEC_DEFAULT_MAX_THREADS, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + /** + * GstOpenJPEGDec:max-threads: + * + * Maximum number of worker threads to spawn used by openjpeg internally. (0 = no thread) + * * Since: 1.18 */ g_object_class_install_property (G_OBJECT_CLASS (klass), PROP_MAX_THREADS, - g_param_spec_int ("max-threads", "Maximum decode threads", - "Maximum number of worker threads to spawn. (0 = auto)", + g_param_spec_int ("max-threads", "Maximum openjpeg threads", + "Maximum number of worker threads to spawn used by openjpeg internally. (0 = no thread)", 0, G_MAXINT, GST_OPENJPEG_DEC_DEFAULT_MAX_THREADS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); @@ -144,8 +201,13 @@ GST_PAD_SET_ACCEPT_TEMPLATE (GST_VIDEO_DECODER_SINK_PAD (self)); opj_set_default_decoder_parameters (&self->params); self->sampling = GST_JPEG2000_SAMPLING_NONE; - self->max_threads = GST_OPENJPEG_DEC_DEFAULT_MAX_THREADS; + self->max_slice_threads = GST_OPENJPEG_DEC_DEFAULT_MAX_THREADS; + self->available_threads = GST_OPENJPEG_DEC_DEFAULT_MAX_THREADS; self->num_procs = g_get_num_processors (); + g_mutex_init (&self->messages_lock); + g_mutex_init (&self->decoding_lock); + g_cond_init (&self->messages_cond); + g_queue_init (&self->messages); } static gboolean @@ -154,6 +216,11 @@ GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); GST_DEBUG_OBJECT (self, "Starting"); + self->available_threads = self->max_slice_threads; + if (self->available_threads) + self->decode_frame = gst_openjpeg_dec_decode_frame_multiple; + else + self->decode_frame = gst_openjpeg_dec_decode_frame_single; return TRUE; } @@ -164,6 +231,8 @@ GstOpenJPEGDec *self = GST_OPENJPEG_DEC (video_decoder); 
GST_DEBUG_OBJECT (self, "Stopping"); + g_mutex_lock (&self->messages_lock); + gst_pad_stop_task (GST_VIDEO_DECODER_SRC_PAD (video_decoder)); if (self->output_state) { gst_video_codec_state_unref (self->output_state); @@ -174,12 +243,77 @@ gst_video_codec_state_unref (self->input_state); self->input_state = NULL; } - + g_mutex_unlock (&self->messages_lock); GST_DEBUG_OBJECT (self, "Stopped"); return TRUE; } +static void +gst_openjpeg_dec_finalize (GObject * object) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (object); + + g_mutex_clear (&self->messages_lock); + g_mutex_clear (&self->decoding_lock); + g_cond_clear (&self->messages_cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static GstStateChangeReturn +gst_openjpeg_dec_change_state (GstElement * element, GstStateChange transition) +{ + GstOpenJPEGDec *self; + GstStateChangeReturn ret = GST_STATE_CHANGE_SUCCESS; + + g_return_val_if_fail (GST_IS_OPENJPEG_DEC (element), + GST_STATE_CHANGE_FAILURE); + self = GST_OPENJPEG_DEC (element); + + switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + break; + case GST_STATE_CHANGE_READY_TO_PAUSED: + self->draining = FALSE; + self->started = FALSE; + self->flushing = FALSE; + break; + case GST_STATE_CHANGE_PAUSED_TO_PLAYING: + break; + case GST_STATE_CHANGE_PAUSED_TO_READY: + self->flushing = TRUE; + g_mutex_lock (&self->drain_lock); + self->draining = FALSE; + g_cond_broadcast (&self->drain_cond); + g_mutex_unlock (&self->drain_lock); + break; + default: + break; + } + + ret = + GST_ELEMENT_CLASS (gst_openjpeg_dec_parent_class)->change_state + (element, transition); + + if (ret == GST_STATE_CHANGE_FAILURE) + return ret; + + switch (transition) { + case GST_STATE_CHANGE_PLAYING_TO_PAUSED: + break; + case GST_STATE_CHANGE_PAUSED_TO_READY: + self->started = FALSE; + self->downstream_flow_ret = GST_FLOW_FLUSHING; + break; + case GST_STATE_CHANGE_READY_TO_NULL: + break; + default: + break; + } + + return ret; +} static void 
gst_openjpeg_dec_set_property (GObject * object, @@ -188,6 +322,9 @@ GstOpenJPEGDec *dec = (GstOpenJPEGDec *) object; switch (prop_id) { + case PROP_MAX_SLICE_THREADS: + g_atomic_int_set (&dec->max_slice_threads, g_value_get_int (value)); + break; case PROP_MAX_THREADS: g_atomic_int_set (&dec->max_threads, g_value_get_int (value)); break; @@ -204,6 +341,9 @@ GstOpenJPEGDec *dec = (GstOpenJPEGDec *) object; switch (prop_id) { + case PROP_MAX_SLICE_THREADS: + g_value_set_int (value, g_atomic_int_get (&dec->max_slice_threads)); + break; case PROP_MAX_THREADS: g_value_set_int (value, g_atomic_int_get (&dec->max_threads)); break; @@ -232,13 +372,21 @@ } else if (gst_structure_has_name (s, "image/x-j2c")) { self->codec_format = OPJ_CODEC_J2K; self->is_jp2c = TRUE; - } else if (gst_structure_has_name (s, "image/x-jpc")) { + } else if (gst_structure_has_name (s, "image/x-jpc") || + gst_structure_has_name (s, "image/x-jpc-striped")) { self->codec_format = OPJ_CODEC_J2K; self->is_jp2c = FALSE; } else { g_return_val_if_reached (FALSE); } + if (gst_structure_has_name (s, "image/x-jpc-striped")) { + gst_structure_get_int (s, "num-stripes", &self->num_stripes); + gst_video_decoder_set_subframe_mode (decoder, TRUE); + } else { + self->num_stripes = 1; + gst_video_decoder_set_subframe_mode (decoder, FALSE); + } self->sampling = gst_jpeg2000_sampling_from_string (gst_structure_get_string (s, @@ -268,16 +416,16 @@ } static void -fill_frame_packed8_4 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_packed8_4 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint8 *data_out, *tmp; const gint *data_in[4]; gint dstride; gint off[4]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); @@ -286,9 +434,13 @@ off[c] = 0x80 * image->comps[c].sgnd; } - for (y = 0; y < h; y++) { + /* copy only 
the stripe content (image) to the full size frame */ + y0 = image->y0; + y1 = image->y1; + GST_DEBUG_OBJECT (self, "y0=%d y1=%d", y0, y1); + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { /* alpha, from 4'th input channel */ tmp[0] = off[3] + *data_in[3]; @@ -308,16 +460,16 @@ } static void -fill_frame_packed16_4 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_packed16_4 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint16 *data_out, *tmp; const gint *data_in[4]; gint dstride; gint shift[4], off[4]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; @@ -329,9 +481,11 @@ 8), 0); } - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { /* alpha, from 4'th input channel */ tmp[0] = off[3] + (*data_in[3] << shift[3]); @@ -351,16 +505,16 @@ } static void -fill_frame_packed8_3 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_packed8_3 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint8 *data_out, *tmp; const gint *data_in[3]; gint dstride; gint off[3]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); @@ -368,10 +522,11 @@ data_in[c] = image->comps[c].data; off[c] = 0x80 * image->comps[c].sgnd; }; - - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { tmp[0] = off[0] + *data_in[0]; tmp[1] = off[1] + *data_in[1]; @@ -386,16 +541,16 @@ } static void -fill_frame_packed16_3 (GstVideoFrame * 
frame, opj_image_t * image) +fill_frame_packed16_3 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint16 *data_out, *tmp; const gint *data_in[3]; gint dstride; gint shift[3], off[3]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; @@ -407,9 +562,11 @@ 8), 0); } - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { tmp[1] = off[0] + (*data_in[0] << shift[0]); tmp[2] = off[1] + (*data_in[1] << shift[1]); @@ -426,16 +583,16 @@ /* for grayscale with alpha */ static void -fill_frame_packed8_2 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_packed8_2 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint8 *data_out, *tmp; const gint *data_in[2]; gint dstride; gint off[2]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); @@ -444,9 +601,11 @@ off[c] = 0x80 * image->comps[c].sgnd; }; - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { /* alpha, from 2nd input channel */ tmp[0] = off[1] + *data_in[1]; @@ -464,16 +623,16 @@ /* for grayscale with alpha */ static void -fill_frame_packed16_2 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_packed16_2 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint16 *data_out, *tmp; const gint *data_in[2]; gint dstride; gint shift[2], off[2]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = 
GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; @@ -485,9 +644,11 @@ 8), 0); } - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { /* alpha, from 2nd input channel */ tmp[0] = off[1] + (*data_in[1] << shift[1]); @@ -505,46 +666,44 @@ static void -fill_frame_planar8_1 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_planar8_1 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h; + gint x, y, y0, y1, w; guint8 *data_out, *tmp; const gint *data_in; gint dstride; gint off; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); data_in = image->comps[0].data; off = 0x80 * image->comps[0].sgnd; - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - - for (x = 0; x < w; x++) { - *tmp = off + *data_in; - - tmp++; - data_in++; - } + for (x = 0; x < w; x++) + *tmp++ = off + *data_in++; data_out += dstride; } } static void -fill_frame_planar16_1 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_planar16_1 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h; + gint x, y, y0, y1, w; guint16 *data_out, *tmp; const gint *data_in; gint dstride; gint shift, off; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; @@ -555,92 +714,93 @@ MAX (MIN (GST_VIDEO_FRAME_COMP_DEPTH (frame, 0) - image->comps[0].prec, 8), 0); - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - - for (x = 0; x < w; x++) { - *tmp = off + (*data_in << shift); 
- - tmp++; - data_in++; - } + for (x = 0; x < w; x++) + *tmp++ = off + (*data_in++ << shift); data_out += dstride; } } static void -fill_frame_planar8_3 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_planar8_3 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint c, x, y, w, h; + gint c, x, y, y0, y1, w; guint8 *data_out, *tmp; const gint *data_in; gint dstride, off; for (c = 0; c < 3; c++) { + opj_image_comp_t *comp = image->comps + c; + w = GST_VIDEO_FRAME_COMP_WIDTH (frame, c); - h = GST_VIDEO_FRAME_COMP_HEIGHT (frame, c); dstride = GST_VIDEO_FRAME_COMP_STRIDE (frame, c); data_out = GST_VIDEO_FRAME_COMP_DATA (frame, c); - data_in = image->comps[c].data; - off = 0x80 * image->comps[c].sgnd; - - for (y = 0; y < h; y++) { + data_in = comp->data; + off = 0x80 * comp->sgnd; + + /* copy only the stripe content (image) to the full size frame */ + y0 = comp->y0; + y1 = comp->y0 + comp->h; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - - for (x = 0; x < w; x++) { - *tmp = off + *data_in; - tmp++; - data_in++; - } + for (x = 0; x < w; x++) + *tmp++ = off + *data_in++; data_out += dstride; } } } static void -fill_frame_planar16_3 (GstVideoFrame * frame, opj_image_t * image) +fill_frame_planar16_3 (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint c, x, y, w, h; + gint c, x, y, y0, y1, w; guint16 *data_out, *tmp; const gint *data_in; gint dstride; gint shift, off; for (c = 0; c < 3; c++) { + opj_image_comp_t *comp = image->comps + c; + w = GST_VIDEO_FRAME_COMP_WIDTH (frame, c); - h = GST_VIDEO_FRAME_COMP_HEIGHT (frame, c); dstride = GST_VIDEO_FRAME_COMP_STRIDE (frame, c) / 2; data_out = (guint16 *) GST_VIDEO_FRAME_COMP_DATA (frame, c); - data_in = image->comps[c].data; - off = (1 << (image->comps[c].prec - 1)) * image->comps[c].sgnd; + data_in = comp->data; + off = (1 << (comp->prec - 1)) * comp->sgnd; shift = - MAX (MIN (GST_VIDEO_FRAME_COMP_DEPTH (frame, c) - image->comps[c].prec, 
- 8), 0); + MAX (MIN (GST_VIDEO_FRAME_COMP_DEPTH (frame, c) - comp->prec, 8), 0); - for (y = 0; y < h; y++) { + /* copy only the stripe content (image) to the full size frame */ + y0 = comp->y0; + y1 = comp->y0 + comp->h; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - - for (x = 0; x < w; x++) { - *tmp = off + (*data_in << shift); - tmp++; - data_in++; - } + for (x = 0; x < w; x++) + *tmp++ = off + (*data_in++ << shift); data_out += dstride; } } } static void -fill_frame_planar8_3_generic (GstVideoFrame * frame, opj_image_t * image) +fill_frame_planar8_3_generic (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint8 *data_out, *tmp; const gint *data_in[3]; gint dstride; gint dx[3], dy[3], off[3]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); @@ -651,9 +811,11 @@ off[c] = 0x80 * image->comps[c].sgnd; } - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { tmp[0] = 0xff; tmp[1] = off[0] + data_in[0][((y / dy[0]) * w + x) / dx[0]]; @@ -666,51 +828,16 @@ } static void -fill_frame_planar8_4_generic (GstVideoFrame * frame, opj_image_t * image) -{ - gint x, y, w, h, c; - guint8 *data_out, *tmp; - const gint *data_in[4]; - gint dstride; - gint dx[4], dy[4], off[4]; - - w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); - data_out = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); - dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); - - for (c = 0; c < 4; c++) { - data_in[c] = image->comps[c].data; - dx[c] = image->comps[c].dx; - dy[c] = image->comps[c].dy; - off[c] = 0x80 * image->comps[c].sgnd; - } - - for (y = 0; y < h; y++) { - tmp = data_out; - - for (x = 0; x < w; x++) { - tmp[0] = off[3] + data_in[3][((y / dy[3]) * w + 
x) / dx[3]]; - tmp[1] = off[0] + data_in[0][((y / dy[0]) * w + x) / dx[0]]; - tmp[2] = off[1] + data_in[1][((y / dy[1]) * w + x) / dx[1]]; - tmp[3] = off[2] + data_in[2][((y / dy[2]) * w + x) / dx[2]]; - tmp += 4; - } - data_out += dstride; - } -} - -static void -fill_frame_planar16_3_generic (GstVideoFrame * frame, opj_image_t * image) +fill_frame_planar16_3_generic (GstOpenJPEGDec * self, GstVideoFrame * frame, + opj_image_t * image) { - gint x, y, w, h, c; + gint x, y, y0, y1, w, c; guint16 *data_out, *tmp; const gint *data_in[3]; gint dstride; gint dx[3], dy[3], shift[3], off[3]; w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); data_out = (guint16 *) GST_VIDEO_FRAME_PLANE_DATA (frame, 0); dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; @@ -724,9 +851,11 @@ 8), 0); } - for (y = 0; y < h; y++) { + y0 = image->y0; + y1 = image->y1; + data_out += y0 * dstride; + for (y = y0; y < y1; y++) { tmp = data_out; - for (x = 0; x < w; x++) { tmp[0] = 0xff; tmp[1] = off[0] + (data_in[0][((y / dy[0]) * w + x) / dx[0]] << shift[0]); @@ -738,44 +867,6 @@ } } -static void -fill_frame_planar16_4_generic (GstVideoFrame * frame, opj_image_t * image) -{ - gint x, y, w, h, c; - guint16 *data_out, *tmp; - const gint *data_in[4]; - gint dstride; - gint dx[4], dy[4], shift[4], off[4]; - - w = GST_VIDEO_FRAME_WIDTH (frame); - h = GST_VIDEO_FRAME_HEIGHT (frame); - data_out = (guint16 *) GST_VIDEO_FRAME_PLANE_DATA (frame, 0); - dstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; - - for (c = 0; c < 4; c++) { - dx[c] = image->comps[c].dx; - dy[c] = image->comps[c].dy; - data_in[c] = image->comps[c].data; - off[c] = (1 << (image->comps[c].prec - 1)) * image->comps[c].sgnd; - shift[c] = - MAX (MIN (GST_VIDEO_FRAME_COMP_DEPTH (frame, c) - image->comps[c].prec, - 8), 0); - } - - for (y = 0; y < h; y++) { - tmp = data_out; - - for (x = 0; x < w; x++) { - tmp[0] = off[3] + (data_in[3][((y / dy[3]) * w + x) / dx[3]] << shift[3]); - tmp[1] = off[0] + 
(data_in[0][((y / dy[0]) * w + x) / dx[0]] << shift[0]); - tmp[2] = off[1] + (data_in[1][((y / dy[1]) * w + x) / dx[1]] << shift[1]); - tmp[3] = off[2] + (data_in[2][((y / dy[2]) * w + x) / dx[2]] << shift[2]); - tmp += 4; - } - data_out += dstride; - } -} - static gint get_highest_prec (opj_image_t * image) { @@ -793,11 +884,13 @@ gst_openjpeg_dec_negotiate (GstOpenJPEGDec * self, opj_image_t * image) { GstVideoFormat format; - gint width, height; if (image->color_space == OPJ_CLRSPC_UNKNOWN || image->color_space == 0) image->color_space = self->color_space; + if (!self->input_state) + return GST_FLOW_FLUSHING; + switch (image->color_space) { case OPJ_CLRSPC_SRGB: if (image->numcomps == 4) { @@ -920,10 +1013,10 @@ } if (get_highest_prec (image) == 8) { - self->fill_frame = fill_frame_planar8_4_generic; + self->fill_frame = fill_frame_packed8_4; format = GST_VIDEO_FORMAT_AYUV; } else if (image->comps[3].prec <= 16) { - self->fill_frame = fill_frame_planar16_4_generic; + self->fill_frame = fill_frame_packed16_4; format = GST_VIDEO_FORMAT_AYUV64; } else { GST_ERROR_OBJECT (self, "Unsupported depth %d", image->comps[0].prec); @@ -998,19 +1091,16 @@ return GST_FLOW_NOT_NEGOTIATED; } - width = image->x1 - image->x0; - height = image->y1 - image->y0; - if (!self->output_state || self->output_state->info.finfo->format != format || - self->output_state->info.width != width || - self->output_state->info.height != height) { + self->output_state->info.width != self->input_state->info.width || + self->output_state->info.height != self->input_state->info.height) { if (self->output_state) gst_video_codec_state_unref (self->output_state); self->output_state = gst_video_decoder_set_output_state (GST_VIDEO_DECODER (self), format, - width, height, self->input_state); - + self->input_state->info.width, self->input_state->info.height, + self->input_state); if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) return GST_FLOW_NOT_NEGOTIATED; } @@ -1106,35 +1196,221 @@ return 
OPJ_TRUE; } -static GstFlowReturn -gst_openjpeg_dec_handle_frame (GstVideoDecoder * decoder, - GstVideoCodecFrame * frame) +static gboolean +gst_openjpeg_dec_is_last_input_subframe (GstVideoDecoder * dec, + GstOpenJPEGCodecMessage * message) { - GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); - GstFlowReturn ret = GST_FLOW_OK; - gint64 deadline; + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (dec); + + return (message->last_subframe || message->stripe == self->num_stripes); +} + +static gboolean +gst_openjpeg_dec_is_last_output_subframe (GstVideoDecoder * dec, + GstOpenJPEGCodecMessage * message) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (dec); + + return (gst_video_decoder_get_processed_subframe_index (dec, + message->frame) == (self->num_stripes - 1)); +} + + +static gboolean +gst_openjpeg_dec_has_pending_job_to_finish (GstOpenJPEGDec * self) +{ + gboolean res = FALSE; + if (self->downstream_flow_ret != GST_FLOW_OK) + return res; + g_mutex_lock (&self->messages_lock); + res = (!g_queue_is_empty (&self->messages) + || (self->available_threads < self->max_slice_threads)); + g_mutex_unlock (&self->messages_lock); + return res; +} + +static GstOpenJPEGCodecMessage * +gst_openjpeg_decode_message_new (GstOpenJPEGDec * self, + GstVideoCodecFrame * frame, int num_stripe) +{ + GstOpenJPEGCodecMessage *message = g_slice_new0 (GstOpenJPEGCodecMessage); + GST_DEBUG_OBJECT (self, "message: %p", message); + message->frame = gst_video_codec_frame_ref (frame); + message->stripe = num_stripe; + message->last_error = OPENJPEG_ERROR_NONE; + message->input_buffer = gst_buffer_ref (frame->input_buffer); + message->last_subframe = GST_BUFFER_FLAG_IS_SET (frame->input_buffer, + GST_BUFFER_FLAG_MARKER); + return message; +} + +static GstOpenJPEGCodecMessage * +gst_openjpeg_decode_message_free (GstOpenJPEGDec * self, + GstOpenJPEGCodecMessage * message) +{ + if (!message) + return message; + gst_buffer_unref (message->input_buffer); + gst_video_codec_frame_unref (message->frame); + 
GST_DEBUG_OBJECT (self, "message: %p", message); + g_slice_free (GstOpenJPEGCodecMessage, message); + return NULL; +} + +static GstOpenJPEGCodecMessage * +gst_openjpeg_dec_wait_for_new_message (GstOpenJPEGDec * self, gboolean dry_run) +{ + GstOpenJPEGCodecMessage *message = NULL; + g_mutex_lock (&self->messages_lock); + if (dry_run && self->available_threads == self->max_slice_threads) + goto done; + if (!g_queue_is_empty (&self->messages) && !dry_run) { + message = g_queue_pop_head (&self->messages); + } else { + g_cond_wait (&self->messages_cond, &self->messages_lock); + } + +done: + g_mutex_unlock (&self->messages_lock); + return message; +} + +static void +gst_openjpeg_dec_pause_loop (GstOpenJPEGDec * self, GstFlowReturn flow_ret) +{ + g_mutex_lock (&self->drain_lock); + GST_DEBUG_OBJECT (self, "Pause the loop draining %d flow_ret %s", + self->draining, gst_flow_get_name (flow_ret)); + if (self->draining) { + self->draining = FALSE; + g_cond_broadcast (&self->drain_cond); + } + gst_pad_pause_task (GST_VIDEO_DECODER_SRC_PAD (self)); + self->downstream_flow_ret = flow_ret; + self->started = FALSE; + g_mutex_unlock (&self->drain_lock); +} + +static void +gst_openjpeg_dec_loop (GstOpenJPEGDec * self) +{ + GstOpenJPEGCodecMessage *message; + GstVideoDecoder *decoder = GST_VIDEO_DECODER (self); + GstFlowReturn flow_ret = GST_FLOW_OK; + + message = gst_openjpeg_dec_wait_for_new_message (self, FALSE); + if (message) { + GST_DEBUG_OBJECT (self, + "received message for frame %p stripe %d last_error %d threads %d", + message->frame, message->stripe, message->last_error, + self->available_threads); + + if (self->flushing) + goto flushing; + + if (message->last_error != OPENJPEG_ERROR_NONE) + goto decode_error; + + g_mutex_lock (&self->decoding_lock); + + if (gst_openjpeg_dec_is_last_output_subframe (decoder, message)) + flow_ret = gst_video_decoder_finish_frame (decoder, message->frame); + else + gst_video_decoder_finish_subframe (decoder, message->frame); + g_mutex_unlock 
(&self->decoding_lock); + message = gst_openjpeg_decode_message_free (self, message); + g_cond_broadcast (&self->messages_cond); + } + + if (flow_ret != GST_FLOW_OK) + goto flow_error; + + if (self->draining && !gst_openjpeg_dec_has_pending_job_to_finish (self)) + gst_openjpeg_dec_pause_loop (self, GST_FLOW_OK); + + if (self->flushing) + goto flushing; + + return; + +decode_error: + { + GST_ELEMENT_ERROR (self, LIBRARY, FAILED, (NULL), + ("OpenJPEG decode failed: %d", message->last_error)); + gst_video_codec_frame_unref (message->frame); + gst_pad_push_event (GST_VIDEO_DECODER_SRC_PAD (self), gst_event_new_eos ()); + gst_openjpeg_dec_pause_loop (self, GST_FLOW_ERROR); + gst_openjpeg_decode_message_free (self, message); + return; + } + +flushing: + { + GST_DEBUG_OBJECT (self, "Flushing -- stopping task"); + if (message) { + gst_video_codec_frame_unref (message->frame); + gst_openjpeg_decode_message_free (self, message); + } + gst_openjpeg_dec_pause_loop (self, GST_FLOW_FLUSHING); + return; + } + +flow_error: + { + if (flow_ret == GST_FLOW_EOS) { + GST_DEBUG_OBJECT (self, "EOS"); + + gst_pad_push_event (GST_VIDEO_DECODER_SRC_PAD (self), + gst_event_new_eos ()); + } else if (flow_ret < GST_FLOW_EOS) { + GST_ELEMENT_ERROR (self, STREAM, FAILED, + ("Internal data stream error."), ("stream stopped, reason %s", + gst_flow_get_name (flow_ret))); + + gst_pad_push_event (GST_VIDEO_DECODER_SRC_PAD (self), + gst_event_new_eos ()); + } else if (flow_ret == GST_FLOW_FLUSHING) { + GST_DEBUG_OBJECT (self, "Flushing -- stopping task"); + } + gst_openjpeg_dec_pause_loop (self, flow_ret); + + return; + } + +} + +#define DECODE_ERROR(self, message, err_code, mutex_unlock) { \ + GST_WARNING_OBJECT(self, "An error occurred err_code=%d", err_code);\ + message->last_error = err_code; \ + if (mutex_unlock) \ + g_mutex_unlock (&self->decoding_lock);\ + goto done; \ +} + +static void +gst_openjpeg_dec_decode_stripe (GstElement * element, gpointer user_data) +{ + GstOpenJPEGDec *self = 
GST_OPENJPEG_DEC (element); + GstVideoDecoder *decoder = GST_VIDEO_DECODER (element); + GstOpenJPEGCodecMessage *message = (GstOpenJPEGCodecMessage *) user_data; GstMapInfo map; - opj_codec_t *dec; - opj_stream_t *stream; - MemStream mstream; - opj_image_t *image; GstVideoFrame vframe; + opj_codec_t *dec = NULL; + opj_stream_t *stream = NULL; + MemStream mstream; + opj_image_t *image = NULL; opj_dparameters_t params; gint max_threads; - GST_DEBUG_OBJECT (self, "Handling frame"); + GstFlowReturn ret; + gint i; - deadline = gst_video_decoder_get_max_decode_time (decoder, frame); - if (deadline < 0) { - GST_LOG_OBJECT (self, "Dropping too late frame: deadline %" G_GINT64_FORMAT, - deadline); - ret = gst_video_decoder_drop_frame (decoder, frame); - return ret; - } + GST_DEBUG_OBJECT (self, "Start to decode stripe %p %d", message->frame, + message->stripe); dec = opj_create_decompress (self->codec_format); if (!dec) - goto initialization_error; + DECODE_ERROR (self, message, OPENJPEG_ERROR_INIT, FALSE); if (G_UNLIKELY (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_TRACE)) { @@ -1151,24 +1427,25 @@ if (self->ncomps) params.jpwl_exp_comps = self->ncomps; if (!opj_setup_decoder (dec, ¶ms)) - goto open_error; + DECODE_ERROR (self, message, OPENJPEG_ERROR_OPEN, FALSE); max_threads = g_atomic_int_get (&self->max_threads); - if (max_threads == 0) + if (max_threads > self->num_procs) max_threads = self->num_procs; if (!opj_codec_set_threads (dec, max_threads)) GST_WARNING_OBJECT (self, "Failed to set %d number of threads", max_threads); - if (!gst_buffer_map (frame->input_buffer, &map, GST_MAP_READ)) - goto map_read_error; + if (!gst_buffer_map (message->input_buffer, &map, GST_MAP_READ)) + DECODE_ERROR (self, message, OPENJPEG_ERROR_MAP_READ, FALSE); + if (self->is_jp2c && map.size < 8) - goto open_error; + DECODE_ERROR (self, message, OPENJPEG_ERROR_MAP_READ, FALSE); stream = opj_stream_create (4096, OPJ_TRUE); if (!stream) - goto open_error; + DECODE_ERROR 
(self, message, OPENJPEG_ERROR_OPEN, FALSE); mstream.data = map.data + (self->is_jp2c ? 8 : 0); mstream.offset = 0; @@ -1183,119 +1460,286 @@ image = NULL; if (!opj_read_header (stream, dec, &image)) - goto decode_error; + DECODE_ERROR (self, message, OPENJPEG_ERROR_DECODE, FALSE); if (!opj_decode (dec, stream, image)) - goto decode_error; - - { - gint i; + DECODE_ERROR (self, message, OPENJPEG_ERROR_DECODE, FALSE); - for (i = 0; i < image->numcomps; i++) { - if (image->comps[i].data == NULL) - goto decode_error; - } + for (i = 0; i < image->numcomps; i++) { + if (image->comps[i].data == NULL) + DECODE_ERROR (self, message, OPENJPEG_ERROR_DECODE, FALSE); } - gst_buffer_unmap (frame->input_buffer, &map); + gst_buffer_unmap (message->input_buffer, &map); + + g_mutex_lock (&self->decoding_lock); ret = gst_openjpeg_dec_negotiate (self, image); if (ret != GST_FLOW_OK) - goto negotiate_error; + DECODE_ERROR (self, message, OPENJPEG_ERROR_NEGOCIATE, TRUE); - ret = gst_video_decoder_allocate_output_frame (decoder, frame); - if (ret != GST_FLOW_OK) - goto allocate_error; + if (message->frame->output_buffer == NULL) { + ret = gst_video_decoder_allocate_output_frame (decoder, message->frame); + if (ret != GST_FLOW_OK) + DECODE_ERROR (self, message, OPENJPEG_ERROR_ALLOCATE, TRUE); + } if (!gst_video_frame_map (&vframe, &self->output_state->info, - frame->output_buffer, GST_MAP_WRITE)) - goto map_write_error; + message->frame->output_buffer, GST_MAP_WRITE)) + DECODE_ERROR (self, message, OPENJPEG_ERROR_MAP_WRITE, TRUE); + + if (message->stripe) + self->fill_frame (self, &vframe, image); + else { + GST_ERROR_OBJECT (decoder, "current_stripe should be greater than 0"); + DECODE_ERROR (self, message, OPENJPEG_ERROR_MAP_WRITE, TRUE); + } + gst_video_frame_unmap (&vframe); + g_mutex_unlock (&self->decoding_lock); + message->last_error = OPENJPEG_ERROR_NONE; + GST_DEBUG_OBJECT (self, "Finished decoding stripe frame=%p stripe=%d", + message->frame, message->stripe); +done: + if 
(!message->direct) { + g_mutex_lock (&self->messages_lock); + self->available_threads++; + g_queue_push_tail (&self->messages, message); + g_mutex_unlock (&self->messages_lock); + g_cond_broadcast (&self->messages_cond); + } - self->fill_frame (&vframe, image); + if (stream) { + opj_end_decompress (dec, stream); + opj_stream_destroy (stream); + } + if (image) + opj_image_destroy (image); + if (dec) + opj_destroy_codec (dec); +} - gst_video_frame_unmap (&vframe); +static GstFlowReturn +gst_openjpeg_dec_decode_frame_multiple (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); + GstOpenJPEGCodecMessage *message = NULL; + guint current_stripe = + gst_video_decoder_get_input_subframe_index (decoder, frame); + + if (!self->started) { + GST_DEBUG_OBJECT (self, "Starting task"); + gst_pad_start_task (GST_VIDEO_DECODER_SRC_PAD (self), + (GstTaskFunction) gst_openjpeg_dec_loop, decoder, NULL); + self->started = TRUE; + } + /* Make sure to release the base class stream lock, otherwise + * _loop() can't call _finish_frame() and we might block forever + * because no input buffers are released */ + GST_VIDEO_DECODER_STREAM_UNLOCK (self); + + while (!self->available_threads) + gst_openjpeg_dec_wait_for_new_message (self, TRUE); + + GST_VIDEO_DECODER_STREAM_LOCK (self); + + if (self->downstream_flow_ret != GST_FLOW_OK) + return self->downstream_flow_ret; + + g_mutex_lock (&self->messages_lock); + message = gst_openjpeg_decode_message_new (self, frame, current_stripe); + GST_LOG_OBJECT (self, + "About to enqueue a decoding message from frame %p stripe %d", frame, + message->stripe); + + if (self->available_threads) + self->available_threads--; + g_mutex_unlock (&self->messages_lock); + + gst_element_call_async (GST_ELEMENT (self), + (GstElementCallAsyncFunc) gst_openjpeg_dec_decode_stripe, message, NULL); + if (gst_video_decoder_get_subframe_mode (decoder) + && gst_openjpeg_dec_is_last_input_subframe (decoder, message)) 
+ gst_video_decoder_have_last_subframe (decoder, frame); + return GST_FLOW_OK; +} - opj_end_decompress (dec, stream); - opj_stream_destroy (stream); - opj_image_destroy (image); - opj_destroy_codec (dec); +static GstFlowReturn +gst_openjpeg_dec_decode_frame_single (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); + GstOpenJPEGCodecMessage *message = NULL; + guint current_stripe = + gst_video_decoder_get_input_subframe_index (decoder, frame); + GstFlowReturn ret = GST_FLOW_OK; - ret = gst_video_decoder_finish_frame (decoder, frame); + message = gst_openjpeg_decode_message_new (self, frame, current_stripe); + message->direct = TRUE; + gst_openjpeg_dec_decode_stripe (GST_ELEMENT (decoder), message); + if (message->last_error != OPENJPEG_ERROR_NONE) { + GST_WARNING_OBJECT + (self, "An error occurred (%d) during JPEG decoding", + message->last_error); + self->last_error = message->last_error; + ret = GST_FLOW_ERROR; + goto done; + } + if (gst_openjpeg_dec_is_last_output_subframe (decoder, message)) + ret = gst_video_decoder_finish_frame (decoder, message->frame); + else + gst_video_decoder_finish_subframe (decoder, message->frame); +done: + gst_openjpeg_decode_message_free (self, message); return ret; +} -initialization_error: - { +static gboolean +gst_openjpeg_dec_flush (GstVideoDecoder * decoder) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); + + GST_DEBUG_OBJECT (self, "Flushing decoder"); + + /* Wait until the srcpad loop is stopped; + * unlock GST_VIDEO_DECODER_STREAM_LOCK to prevent deadlocks + * caused by using this lock from inside the loop function */ + GST_VIDEO_DECODER_STREAM_UNLOCK (self); + gst_pad_stop_task (GST_VIDEO_DECODER_SRC_PAD (decoder)); + GST_DEBUG_OBJECT (self, "Flushing -- task stopped"); + GST_VIDEO_DECODER_STREAM_LOCK (self); + + /* Reset our state */ + self->started = FALSE; + GST_DEBUG_OBJECT (self, "Flush finished"); + + return TRUE; +} + +static GstFlowReturn 
+gst_openjpeg_dec_handle_frame (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); + GstFlowReturn ret = GST_FLOW_OK; + gint64 deadline; + guint current_stripe = + gst_video_decoder_get_input_subframe_index (decoder, frame); + + if (self->downstream_flow_ret != GST_FLOW_OK) { gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, LIBRARY, INIT, - ("Failed to initialize OpenJPEG decoder"), (NULL)); - return GST_FLOW_ERROR; + return self->downstream_flow_ret; } -map_read_error: - { - opj_destroy_codec (dec); - gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, CORE, FAILED, - ("Failed to map input buffer"), (NULL)); - return GST_FLOW_ERROR; - } -open_error: - { - opj_destroy_codec (dec); - gst_buffer_unmap (frame->input_buffer, &map); - gst_video_codec_frame_unref (frame); + GST_DEBUG_OBJECT (self, "Handling frame with current stripe %d", + current_stripe); + + deadline = gst_video_decoder_get_max_decode_time (decoder, frame); + if (self->drop_subframes || deadline < 0) { + GST_INFO_OBJECT (self, + "Dropping too late frame: deadline %" G_GINT64_FORMAT, deadline); + self->drop_subframes = TRUE; + if (current_stripe == self->num_stripes || + GST_BUFFER_FLAG_IS_SET (frame->input_buffer, GST_BUFFER_FLAG_MARKER)) { + ret = gst_video_decoder_drop_frame (decoder, frame); + self->drop_subframes = FALSE; + } else { + gst_video_decoder_drop_subframe (decoder, frame); + } - GST_ELEMENT_ERROR (self, LIBRARY, INIT, - ("Failed to open OpenJPEG stream"), (NULL)); - return GST_FLOW_ERROR; + goto done; } -decode_error: - { - if (image) - opj_image_destroy (image); - opj_stream_destroy (stream); - opj_destroy_codec (dec); - gst_buffer_unmap (frame->input_buffer, &map); - gst_video_codec_frame_unref (frame); - GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, - ("Failed to decode OpenJPEG stream"), (NULL), ret); - return ret; + ret = self->decode_frame (decoder, frame); + if (ret != GST_FLOW_OK) { + 
GST_WARNING_OBJECT (self, "Unable to decode the frame with flow error: %s", + gst_flow_get_name (ret)); + goto error; } -negotiate_error: - { - opj_image_destroy (image); - opj_stream_destroy (stream); - opj_destroy_codec (dec); - gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, - ("Failed to negotiate"), (NULL)); - return ret; +done: + return ret; + +error: + switch (self->last_error) { + case OPENJPEG_ERROR_INIT: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to initialize OpenJPEG decoder"), (NULL)); + break; + case OPENJPEG_ERROR_MAP_READ: + GST_ELEMENT_ERROR (self, CORE, FAILED, + ("Failed to map input buffer"), (NULL)); + break; + case OPENJPEG_ERROR_MAP_WRITE: + GST_ELEMENT_ERROR (self, CORE, FAILED, + ("Failed to map input buffer"), (NULL)); + break; + case OPENJPEG_ERROR_FILL_IMAGE: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to fill OpenJPEG image"), (NULL)); + break; + case OPENJPEG_ERROR_OPEN: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to open OpenJPEG data"), (NULL)); + break; + case OPENJPEG_ERROR_DECODE: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to decode OpenJPEG data"), (NULL)); + break; + case OPENJPEG_ERROR_NEGOCIATE: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to negociate OpenJPEG data"), (NULL)); + break; + case OPENJPEG_ERROR_ALLOCATE: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to allocate OpenJPEG data"), (NULL)); + break; + default: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to encode OpenJPEG data"), (NULL)); + break; } -allocate_error: - { - opj_image_destroy (image); - opj_stream_destroy (stream); - opj_destroy_codec (dec); - gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, CORE, FAILED, - ("Failed to allocate output buffer"), (NULL)); - return ret; + return GST_FLOW_ERROR; +} + +static GstFlowReturn +gst_openjpeg_dec_finish (GstVideoDecoder * decoder) +{ + GstOpenJPEGDec *self = GST_OPENJPEG_DEC (decoder); + + 
GST_DEBUG_OBJECT (self, "Draining component"); + + if (!self->started) { + GST_DEBUG_OBJECT (self, "Component not started yet"); + return GST_FLOW_OK; } -map_write_error: - { - opj_image_destroy (image); - opj_stream_destroy (stream); - opj_destroy_codec (dec); - gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, CORE, FAILED, - ("Failed to map output buffer"), (NULL)); - return GST_FLOW_ERROR; + self->draining = TRUE; + if (!gst_openjpeg_dec_has_pending_job_to_finish (self)) { + GST_DEBUG_OBJECT (self, "Component ready"); + g_cond_broadcast (&self->messages_cond); + return GST_FLOW_OK; } + + /* Make sure to release the base class stream lock, otherwise + * _loop() can't call _finish_frame() and we might block forever + * because no input buffers are released */ + GST_VIDEO_DECODER_STREAM_UNLOCK (self); + + g_mutex_lock (&self->drain_lock); + GST_DEBUG_OBJECT (self, "Waiting until component is drained"); + + while (self->draining) + g_cond_wait (&self->drain_cond, &self->drain_lock); + + GST_DEBUG_OBJECT (self, "Drained component"); + + g_mutex_unlock (&self->drain_lock); + GST_VIDEO_DECODER_STREAM_LOCK (self); + self->started = FALSE; + return GST_FLOW_OK; } static gboolean
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/gstopenjpegdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/gstopenjpegdec.h
Changed
@@ -58,11 +58,36 @@ GstJPEG2000Sampling sampling; gint ncomps; gint max_threads; /* atomic */ + gint max_slice_threads; /* internal openjpeg threading system */ gint num_procs; + gint num_stripes; + gboolean drop_subframes; - void (*fill_frame) (GstVideoFrame *frame, opj_image_t * image); + void (*fill_frame) (GstOpenJPEGDec *self, + GstVideoFrame *frame, opj_image_t * image); + + gboolean (*decode_frame) (GstVideoDecoder * decoder, GstVideoCodecFrame *frame); opj_dparameters_t params; + + guint available_threads; + GQueue messages; + + GCond messages_cond; + GMutex messages_lock; + GMutex decoding_lock; + GstFlowReturn downstream_flow_ret; + gboolean flushing; + + /* Draining state */ + GMutex drain_lock; + GCond drain_cond; + /* TRUE if EOS buffers shouldn't be forwarded */ + gboolean draining; /* protected by drain_lock */ + + int last_error; + + gboolean started; }; struct _GstOpenJPEGDecClass @@ -72,6 +97,8 @@ GType gst_openjpeg_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (openjpegdec); + G_END_DECLS #endif /* __GST_OPENJPEG_DEC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/gstopenjpegenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/gstopenjpegenc.c
Changed
@@ -20,6 +20,23 @@ * */ +/** + * SECTION:element-openjpegenc + * @title: openjpegenc + * @see_also: openjpegdec + * + * openjpegenc encodes raw video stream. + * + * ## Example launch lines + * |[ + * gst-launch-1.0 -v videotestsrc num-buffers=10 ! openjpegenc ! jpeg2000parse ! openjpegdec ! videoconvert ! autovideosink sync=false + * ]| Encode and decode whole frames. + * |[ + * gst-launch-1.0 -v videotestsrc num-buffers=10 ! openjpegenc num-threads=8 num-stripes=8 ! jpeg2000parse ! openjpegdec max-threads=8 ! videoconvert ! autovideosink sync=fals + * ]| Encode and decode frame split with stripes. + * + */ + #ifdef HAVE_CONFIG_H #include "config.h" #endif @@ -28,6 +45,7 @@ #include <gst/codecparsers/gstjpeg2000sampling.h> #include <string.h> +#include <math.h> GST_DEBUG_CATEGORY_STATIC (gst_openjpeg_enc_debug); #define GST_CAT_DEFAULT gst_openjpeg_enc_debug @@ -68,9 +86,11 @@ PROP_TILE_WIDTH, PROP_TILE_HEIGHT, PROP_NUM_STRIPES, + PROP_NUM_THREADS, PROP_LAST }; + #define DEFAULT_NUM_LAYERS 1 #define DEFAULT_NUM_RESOLUTIONS 6 #define DEFAULT_PROGRESSION_ORDER OPJ_LRCP @@ -79,6 +99,13 @@ #define DEFAULT_TILE_WIDTH 0 #define DEFAULT_TILE_HEIGHT 0 #define GST_OPENJPEG_ENC_DEFAULT_NUM_STRIPES 1 +#define GST_OPENJPEG_ENC_DEFAULT_NUM_THREADS 0 + +/* prototypes */ +static void gst_openjpeg_enc_finalize (GObject * object); + +static GstStateChangeReturn +gst_openjpeg_enc_change_state (GstElement * element, GstStateChange transition); static void gst_openjpeg_enc_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -93,6 +120,12 @@ GstVideoCodecFrame * frame); static gboolean gst_openjpeg_enc_propose_allocation (GstVideoEncoder * encoder, GstQuery * query); +static GstFlowReturn gst_openjpeg_enc_encode_frame_multiple (GstVideoEncoder * + encoder, GstVideoCodecFrame * frame); +static GstFlowReturn gst_openjpeg_enc_encode_frame_single (GstVideoEncoder * + encoder, GstVideoCodecFrame * frame); +static GstOpenJPEGCodecMessage + * 
gst_openjpeg_encode_message_free (GstOpenJPEGCodecMessage * message); #if G_BYTE_ORDER == G_LITTLE_ENDIAN #define GRAY16 "GRAY16_LE" @@ -119,20 +152,31 @@ "width = (int) [1, MAX], " "height = (int) [1, MAX], " "num-components = (int) [1, 4], " - "num-stripes = (int) [1, MAX], " GST_JPEG2000_SAMPLING_LIST "," GST_JPEG2000_COLORSPACE_LIST "; " "image/x-jpc, " "width = (int) [1, MAX], " "height = (int) [1, MAX], " "num-components = (int) [1, 4], " + "num-stripes = (int) [1, MAX], " + "alignment = (string) { frame, stripe }, " GST_JPEG2000_SAMPLING_LIST "," GST_JPEG2000_COLORSPACE_LIST "; " - "image/jp2, " "width = (int) [1, MAX], " "height = (int) [1, MAX]") + "image/jp2, " "width = (int) [1, MAX], " + "height = (int) [1, MAX] ;" + "image/x-jpc-striped, " + "width = (int) [1, MAX], " + "height = (int) [1, MAX], " + "num-components = (int) [1, 4], " + GST_JPEG2000_SAMPLING_LIST ", " + GST_JPEG2000_COLORSPACE_LIST ", " + "num-stripes = (int) [2, MAX], stripe-height = (int) [1 , MAX]") ); #define parent_class gst_openjpeg_enc_parent_class G_DEFINE_TYPE (GstOpenJPEGEnc, gst_openjpeg_enc, GST_TYPE_VIDEO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (openjpegenc, "openjpegenc", + GST_RANK_PRIMARY, GST_TYPE_OPENJPEG_ENC); static void gst_openjpeg_enc_class_init (GstOpenJPEGEncClass * klass) @@ -147,6 +191,10 @@ gobject_class->set_property = gst_openjpeg_enc_set_property; gobject_class->get_property = gst_openjpeg_enc_get_property; + gobject_class->finalize = gst_openjpeg_enc_finalize; + + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_openjpeg_enc_change_state); g_object_class_install_property (gobject_class, PROP_NUM_LAYERS, g_param_spec_int ("num-layers", "Number of layers", @@ -196,6 +244,18 @@ "Number of stripes for low latency encoding. 
(1 = low latency disabled)", 1, G_MAXINT, GST_OPENJPEG_ENC_DEFAULT_NUM_STRIPES, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + /** + * GstOpenJPEGEnc:num-threads: + * + * Max number of simultaneous threads to encode stripes, default: encode with streaming thread + * + * Since: 1.20 + */ + g_object_class_install_property (G_OBJECT_CLASS (klass), PROP_NUM_THREADS, + g_param_spec_uint ("num-threads", "Number of threads", + "Max number of simultaneous threads to encode stripe or frame, default: encode with streaming thread.", + 0, G_MAXINT, GST_OPENJPEG_ENC_DEFAULT_NUM_THREADS, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); gst_element_class_add_static_pad_template (element_class, &gst_openjpeg_enc_src_template); @@ -256,6 +316,10 @@ && self->params.cp_tdy != 0); self->num_stripes = GST_OPENJPEG_ENC_DEFAULT_NUM_STRIPES; + g_cond_init (&self->messages_cond); + g_queue_init (&self->messages); + + self->available_threads = GST_OPENJPEG_ENC_DEFAULT_NUM_THREADS; } static void @@ -293,6 +357,9 @@ case PROP_NUM_STRIPES: self->num_stripes = g_value_get_int (value); break; + case PROP_NUM_THREADS: + self->available_threads = g_value_get_uint (value); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -330,6 +397,9 @@ case PROP_NUM_STRIPES: g_value_set_int (value, self->num_stripes); break; + case PROP_NUM_THREADS: + g_value_set_uint (value, self->available_threads); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -342,6 +412,10 @@ GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (encoder); GST_DEBUG_OBJECT (self, "Starting"); + if (self->available_threads) + self->encode_frame = gst_openjpeg_enc_encode_frame_multiple; + else + self->encode_frame = gst_openjpeg_enc_encode_frame_single; return TRUE; } @@ -368,6 +442,56 @@ return TRUE; } +static void +gst_openjpeg_enc_flush_messages (GstOpenJPEGEnc * self) +{ + GstOpenJPEGCodecMessage *enc_params; + + GST_OBJECT_LOCK (self); + while ((enc_params = 
g_queue_pop_head (&self->messages))) { + gst_openjpeg_encode_message_free (enc_params); + } + g_cond_broadcast (&self->messages_cond); + GST_OBJECT_UNLOCK (self); +} + +static void +gst_openjpeg_enc_finalize (GObject * object) +{ + GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (object); + + gst_openjpeg_enc_flush_messages (self); + g_cond_clear (&self->messages_cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static GstStateChangeReturn +gst_openjpeg_enc_change_state (GstElement * element, GstStateChange transition) +{ + GstOpenJPEGEnc *self; + + g_return_val_if_fail (GST_IS_OPENJPEG_ENC (element), + GST_STATE_CHANGE_FAILURE); + self = GST_OPENJPEG_ENC (element); + + switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + break; + case GST_STATE_CHANGE_READY_TO_PAUSED: + break; + case GST_STATE_CHANGE_PAUSED_TO_PLAYING: + break; + case GST_STATE_CHANGE_PAUSED_TO_READY: + gst_openjpeg_enc_flush_messages (self); + break; + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + static guint get_stripe_height (GstOpenJPEGEnc * self, guint slice_num, guint frame_height) { @@ -414,8 +538,6 @@ } data_in += sstride; } - image->y1 -= image->y0; - image->y0 = 0; } static void @@ -454,8 +576,6 @@ } data_in += sstride; } - image->y1 -= image->y0; - image->y0 = 0; } static void @@ -491,8 +611,6 @@ } data_in += sstride; } - image->y1 -= image->y0; - image->y0 = 0; } static void @@ -504,12 +622,15 @@ gint sstride; for (c = 0; c < 3; c++) { + opj_image_comp_t *comp = image->comps + c; + w = GST_VIDEO_FRAME_COMP_WIDTH (frame, c); - h = image->comps[c].h; + h = comp->h; sstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, c) / 2; data_in = - (guint16 *) GST_VIDEO_FRAME_COMP_DATA (frame, c) + image->y0 * sstride; - data_out = image->comps[c].data; + (guint16 *) GST_VIDEO_FRAME_COMP_DATA (frame, + c) + (image->y0 / comp->dy) * sstride; + data_out = comp->data; for (y = 0; y < h; y++) { tmp = data_in; @@ -521,8 +642,6 @@ 
data_in += sstride; } } - image->y1 -= image->y0; - image->y0 = 0; } static void @@ -534,12 +653,15 @@ gint sstride; for (c = 0; c < 3; c++) { + opj_image_comp_t *comp = image->comps + c; + w = GST_VIDEO_FRAME_COMP_WIDTH (frame, c); - h = image->comps[c].h; + h = comp->h; sstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, c); data_in = - (guint8 *) GST_VIDEO_FRAME_COMP_DATA (frame, c) + image->y0 * sstride; - data_out = image->comps[c].data; + (guint8 *) GST_VIDEO_FRAME_COMP_DATA (frame, + c) + (image->y0 / comp->dy) * sstride; + data_out = comp->data; for (y = 0; y < h; y++) { tmp = data_in; @@ -551,8 +673,6 @@ data_in += sstride; } } - image->y1 -= image->y0; - image->y0 = 0; } static void @@ -562,12 +682,14 @@ const guint8 *data_in, *tmp; gint *data_out; gint sstride; + opj_image_comp_t *comp = image->comps; w = GST_VIDEO_FRAME_COMP_WIDTH (frame, 0); - h = image->comps[0].h; + h = comp->h; sstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); data_in = - (guint8 *) GST_VIDEO_FRAME_COMP_DATA (frame, 0) + image->y0 * sstride; + (guint8 *) GST_VIDEO_FRAME_COMP_DATA (frame, + 0) + (image->y0 / comp->dy) * sstride; data_out = image->comps[0].data; for (y = 0; y < h; y++) { @@ -579,8 +701,6 @@ } data_in += sstride; } - image->y1 -= image->y0; - image->y0 = 0; } static void @@ -590,13 +710,15 @@ const guint16 *data_in, *tmp; gint *data_out; gint sstride; + opj_image_comp_t *comp = image->comps; w = GST_VIDEO_FRAME_COMP_WIDTH (frame, 0); - h = image->comps[0].h; + h = comp->h; sstride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0) / 2; data_in = - (guint16 *) GST_VIDEO_FRAME_COMP_DATA (frame, 0) + image->y0 * sstride; - data_out = image->comps[0].data; + (guint16 *) GST_VIDEO_FRAME_COMP_DATA (frame, + 0) + (image->y0 / comp->dy) * sstride; + data_out = comp->data; for (y = 0; y < h; y++) { tmp = data_in; @@ -607,8 +729,6 @@ } data_in += sstride; } - image->y1 -= image->y0; - image->y0 = 0; } static gboolean @@ -618,7 +738,6 @@ GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (encoder); 
GstCaps *allowed_caps, *caps; GstStructure *s; - const gchar *str = NULL; const gchar *colorspace = NULL; GstJPEG2000Sampling sampling = GST_JPEG2000_SAMPLING_NONE; gint ncomps; @@ -631,20 +750,46 @@ gst_video_codec_state_unref (self->input_state); self->input_state = gst_video_codec_state_ref (state); - allowed_caps = gst_pad_get_allowed_caps (GST_VIDEO_ENCODER_SRC_PAD (encoder)); - allowed_caps = gst_caps_truncate (allowed_caps); - s = gst_caps_get_structure (allowed_caps, 0); - if (gst_structure_has_name (s, "image/jp2")) { - self->codec_format = OPJ_CODEC_JP2; - self->is_jp2c = FALSE; - } else if (gst_structure_has_name (s, "image/x-j2c")) { - self->codec_format = OPJ_CODEC_J2K; - self->is_jp2c = TRUE; - } else if (gst_structure_has_name (s, "image/x-jpc")) { + if (stripe_mode) { + GstCaps *template_caps = gst_caps_new_empty_simple ("image/x-jpc-striped"); + GstCaps *my_caps; + + my_caps = gst_pad_query_caps (GST_VIDEO_ENCODER_SRC_PAD (encoder), + template_caps); + gst_caps_unref (template_caps); + + allowed_caps = gst_pad_peer_query_caps (GST_VIDEO_ENCODER_SRC_PAD (encoder), + my_caps); + gst_caps_unref (my_caps); + + if (gst_caps_is_empty (allowed_caps)) { + gst_caps_unref (allowed_caps); + GST_WARNING_OBJECT (self, "Striped JPEG 2000 not accepted downstream"); + return FALSE; + } + self->codec_format = OPJ_CODEC_J2K; self->is_jp2c = FALSE; + allowed_caps = gst_caps_truncate (allowed_caps); + s = gst_caps_get_structure (allowed_caps, 0); } else { - g_return_val_if_reached (FALSE); + allowed_caps = + gst_pad_get_allowed_caps (GST_VIDEO_ENCODER_SRC_PAD (encoder)); + allowed_caps = gst_caps_truncate (allowed_caps); + + s = gst_caps_get_structure (allowed_caps, 0); + if (gst_structure_has_name (s, "image/jp2")) { + self->codec_format = OPJ_CODEC_JP2; + self->is_jp2c = FALSE; + } else if (gst_structure_has_name (s, "image/x-j2c")) { + self->codec_format = OPJ_CODEC_J2K; + self->is_jp2c = TRUE; + } else if (gst_structure_has_name (s, "image/x-jpc")) { + 
self->codec_format = OPJ_CODEC_J2K; + self->is_jp2c = FALSE; + } else { + g_return_val_if_reached (FALSE); + } } switch (state->info.finfo->format) { @@ -695,7 +840,6 @@ g_assert_not_reached (); } - /* sampling */ /* note: encoder re-orders channels so that alpha channel is encoded as the last channel */ switch (state->info.finfo->format) { @@ -724,6 +868,9 @@ case GST_VIDEO_FORMAT_YUV9: sampling = GST_JPEG2000_SAMPLING_YBR410; break; + case GST_VIDEO_FORMAT_Y41B: + sampling = GST_JPEG2000_SAMPLING_YBR411; + break; case GST_VIDEO_FORMAT_I420_10LE: case GST_VIDEO_FORMAT_I420_10BE: case GST_VIDEO_FORMAT_I420: @@ -748,30 +895,23 @@ g_return_val_if_reached (FALSE); if (stripe_mode) { - str = gst_structure_get_string (s, "alignment"); - if (!str || strcmp (str, "stripe") != 0) { - GST_ERROR_OBJECT (self, - "Number of stripes set to %d, but alignment=stripe not supported downstream", - self->num_stripes); - return FALSE; - } - } - - if (sampling != GST_JPEG2000_SAMPLING_NONE) { - caps = gst_caps_new_simple (gst_structure_get_name (s), + caps = gst_caps_new_simple ("image/x-jpc-striped", "colorspace", G_TYPE_STRING, colorspace, "sampling", G_TYPE_STRING, gst_jpeg2000_sampling_to_string (sampling), "num-components", G_TYPE_INT, ncomps, - "alignment", G_TYPE_STRING, - stripe_mode ? 
"stripe" : "frame", - "num-stripes", G_TYPE_INT, self->num_stripes, NULL); + "num-stripes", G_TYPE_INT, self->num_stripes, + "stripe-height", G_TYPE_INT, + get_stripe_height (self, 0, + GST_VIDEO_INFO_COMP_HEIGHT (&state->info, 0)), NULL); + } else if (sampling != GST_JPEG2000_SAMPLING_NONE) { + caps = gst_caps_new_simple (gst_structure_get_name (s), + "colorspace", G_TYPE_STRING, colorspace, + "sampling", G_TYPE_STRING, gst_jpeg2000_sampling_to_string (sampling), + "num-components", G_TYPE_INT, ncomps, NULL); } else { caps = gst_caps_new_simple (gst_structure_get_name (s), "colorspace", G_TYPE_STRING, colorspace, - "num-components", G_TYPE_INT, ncomps, - "alignment", G_TYPE_STRING, - stripe_mode ? "stripe" : "frame", - "num-stripes", G_TYPE_INT, self->num_stripes, NULL); + "num-components", G_TYPE_INT, ncomps, NULL); } gst_caps_unref (allowed_caps); @@ -789,12 +929,10 @@ gst_openjpeg_enc_fill_image (GstOpenJPEGEnc * self, GstVideoFrame * frame, guint slice_num) { - gint i, ncomps; + gint i, ncomps, temp, min_height = INT_MAX; opj_image_cmptparm_t *comps; OPJ_COLOR_SPACE colorspace; opj_image_t *image; - guint nominal_stripe_height = - GST_VIDEO_FRAME_HEIGHT (frame) / self->num_stripes; ncomps = GST_VIDEO_FRAME_N_COMPONENTS (frame); comps = g_new0 (opj_image_cmptparm_t, ncomps); @@ -804,13 +942,27 @@ comps[i].bpp = GST_VIDEO_FRAME_COMP_DEPTH (frame, i); comps[i].sgnd = 0; comps[i].w = GST_VIDEO_FRAME_COMP_WIDTH (frame, i); - comps[i].h = - get_stripe_height (self, slice_num, GST_VIDEO_FRAME_COMP_HEIGHT (frame, - i)); comps[i].dx = - GST_VIDEO_FRAME_WIDTH (frame) / GST_VIDEO_FRAME_COMP_WIDTH (frame, i); + (guint) ((float) GST_VIDEO_FRAME_WIDTH (frame) / + GST_VIDEO_FRAME_COMP_WIDTH (frame, i) + 0.5f); comps[i].dy = - GST_VIDEO_FRAME_HEIGHT (frame) / GST_VIDEO_FRAME_COMP_HEIGHT (frame, i); + (guint) ((float) GST_VIDEO_FRAME_HEIGHT (frame) / + GST_VIDEO_FRAME_COMP_HEIGHT (frame, i) + 0.5f); + temp = + (GST_VIDEO_FRAME_COMP_HEIGHT (frame, + i) / self->num_stripes) * 
comps[i].dy; + if (temp < min_height) + min_height = temp; + } + + for (i = 0; i < ncomps; i++) { + gint nominal_height = min_height / comps[i].dy; + + comps[i].h = (slice_num < self->num_stripes) ? + nominal_height + : GST_VIDEO_FRAME_COMP_HEIGHT (frame, + i) - (self->num_stripes - 1) * nominal_height; + } if ((frame->info.finfo->flags & GST_VIDEO_FORMAT_FLAG_YUV)) @@ -823,14 +975,22 @@ g_return_val_if_reached (NULL); image = opj_image_create (ncomps, comps, colorspace); + if (!image) { + GST_WARNING_OBJECT (self, + "Unable to create a JPEG image. first component height=%d", + ncomps ? comps[0].h : 0); + return NULL; + } + g_free (comps); image->x0 = 0; image->x1 = GST_VIDEO_FRAME_WIDTH (frame); - image->y0 = slice_num * nominal_stripe_height; + image->y0 = (slice_num - 1) * min_height; image->y1 = - image->y0 + get_stripe_height (self, slice_num, - GST_VIDEO_FRAME_HEIGHT (frame)); + (slice_num < + self->num_stripes) ? image->y0 + + min_height : GST_VIDEO_FRAME_HEIGHT (frame); self->fill_image (image, frame); return image; @@ -929,156 +1089,356 @@ return OPJ_TRUE; } -static GstFlowReturn -gst_openjpeg_enc_handle_frame (GstVideoEncoder * encoder, - GstVideoCodecFrame * frame) +static gboolean +gst_openjpeg_encode_is_last_subframe (GstVideoEncoder * enc, int stripe) { - GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (encoder); - GstFlowReturn ret = GST_FLOW_OK; - opj_codec_t *enc; - opj_stream_t *stream; + GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (enc); + + return (stripe == self->num_stripes); +} + +static GstOpenJPEGCodecMessage * +gst_openjpeg_encode_message_new (GstOpenJPEGEnc * self, + GstVideoCodecFrame * frame, int num_stripe) +{ + GstOpenJPEGCodecMessage *message = g_slice_new0 (GstOpenJPEGCodecMessage); + + message->frame = gst_video_codec_frame_ref (frame); + message->stripe = num_stripe; + message->last_error = OPENJPEG_ERROR_NONE; + + return message; +} + +static GstOpenJPEGCodecMessage * +gst_openjpeg_encode_message_free (GstOpenJPEGCodecMessage * message) +{ + 
if (message) { + gst_video_codec_frame_unref (message->frame); + if (message->output_buffer) + gst_buffer_unref (message->output_buffer); + g_slice_free (GstOpenJPEGCodecMessage, message); + } + return NULL; +} + +#define ENCODE_ERROR(encode_params, err_code) { \ + encode_params->last_error = err_code; \ + goto done; \ +} + +/* callback method to be called asynchronously or not*/ +static void +gst_openjpeg_enc_encode_stripe (GstElement * element, gpointer user_data) +{ + GstOpenJPEGCodecMessage *message = (GstOpenJPEGCodecMessage *) user_data; + GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (element); + opj_codec_t *enc = NULL; + opj_stream_t *stream = NULL; MemStream mstream; - opj_image_t *image; + opj_image_t *image = NULL; GstVideoFrame vframe; - guint i; - GST_DEBUG_OBJECT (self, "Handling frame"); + GST_INFO_OBJECT (self, "Encode stripe %d/%d", message->stripe, + self->num_stripes); - for (i = 0; i < self->num_stripes; ++i) { - enc = opj_create_compress (self->codec_format); - if (!enc) - goto initialization_error; + mstream.data = NULL; + enc = opj_create_compress (self->codec_format); + if (!enc) + ENCODE_ERROR (message, OPENJPEG_ERROR_INIT); - if (G_UNLIKELY (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= - GST_LEVEL_TRACE)) { - opj_set_info_handler (enc, gst_openjpeg_enc_opj_info, self); - opj_set_warning_handler (enc, gst_openjpeg_enc_opj_warning, self); - opj_set_error_handler (enc, gst_openjpeg_enc_opj_error, self); - } else { - opj_set_info_handler (enc, NULL, NULL); - opj_set_warning_handler (enc, NULL, NULL); - opj_set_error_handler (enc, NULL, NULL); - } + if (G_UNLIKELY (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= + GST_LEVEL_TRACE)) { + opj_set_info_handler (enc, gst_openjpeg_enc_opj_info, self); + opj_set_warning_handler (enc, gst_openjpeg_enc_opj_warning, self); + opj_set_error_handler (enc, gst_openjpeg_enc_opj_error, self); + } else { + opj_set_info_handler (enc, NULL, NULL); + opj_set_warning_handler (enc, NULL, NULL); + 
opj_set_error_handler (enc, NULL, NULL); + } + if (!gst_video_frame_map (&vframe, &self->input_state->info, + message->frame->input_buffer, GST_MAP_READ)) + ENCODE_ERROR (message, OPENJPEG_ERROR_MAP_READ); + image = gst_openjpeg_enc_fill_image (self, &vframe, message->stripe); + gst_video_frame_unmap (&vframe); + if (!image) + ENCODE_ERROR (message, OPENJPEG_ERROR_FILL_IMAGE); + + if (vframe.info.finfo->flags & GST_VIDEO_FORMAT_FLAG_RGB) { + self->params.tcp_mct = 1; + } + opj_setup_encoder (enc, &self->params, image); + stream = opj_stream_create (4096, OPJ_FALSE); + if (!stream) + ENCODE_ERROR (message, OPENJPEG_ERROR_OPEN); + + mstream.allocsize = 4096; + mstream.data = g_malloc (mstream.allocsize); + mstream.offset = 0; + mstream.size = 0; + + opj_stream_set_read_function (stream, read_fn); + opj_stream_set_write_function (stream, write_fn); + opj_stream_set_skip_function (stream, skip_fn); + opj_stream_set_seek_function (stream, seek_fn); + opj_stream_set_user_data (stream, &mstream, NULL); + opj_stream_set_user_data_length (stream, mstream.size); + + if (!opj_start_compress (enc, image, stream)) + ENCODE_ERROR (message, OPENJPEG_ERROR_ENCODE); + + if (!opj_encode (enc, stream)) + ENCODE_ERROR (message, OPENJPEG_ERROR_ENCODE); + + if (!opj_end_compress (enc, stream)) + ENCODE_ERROR (message, OPENJPEG_ERROR_ENCODE); + + opj_image_destroy (image); + image = NULL; + opj_stream_destroy (stream); + stream = NULL; + opj_destroy_codec (enc); + enc = NULL; + + message->output_buffer = gst_buffer_new (); + + if (self->is_jp2c) { + GstMapInfo map; + GstMemory *mem; + + mem = gst_allocator_alloc (NULL, 8, NULL); + gst_memory_map (mem, &map, GST_MAP_WRITE); + GST_WRITE_UINT32_BE (map.data, mstream.size + 8); + GST_WRITE_UINT32_BE (map.data + 4, GST_MAKE_FOURCC ('j', 'p', '2', 'c')); + gst_memory_unmap (mem, &map); + gst_buffer_append_memory (message->output_buffer, mem); + } - if (!gst_video_frame_map (&vframe, &self->input_state->info, - frame->input_buffer, 
GST_MAP_READ)) - goto map_read_error; + gst_buffer_append_memory (message->output_buffer, + gst_memory_new_wrapped (0, mstream.data, mstream.allocsize, 0, + mstream.size, mstream.data, (GDestroyNotify) g_free)); + message->last_error = OPENJPEG_ERROR_NONE; + + GST_INFO_OBJECT (self, + "Stripe %d encoded successfully, pass it to the streaming thread", + message->stripe); + +done: + if (message->last_error != OPENJPEG_ERROR_NONE) { + if (mstream.data) + g_free (mstream.data); + if (enc) + opj_destroy_codec (enc); + if (image) + opj_image_destroy (image); + if (stream) + opj_stream_destroy (stream); + } + if (!message->direct) { + GST_OBJECT_LOCK (self); + g_queue_push_tail (&self->messages, message); + g_cond_signal (&self->messages_cond); + GST_OBJECT_UNLOCK (self); + } - image = gst_openjpeg_enc_fill_image (self, &vframe, i); - if (!image) - goto fill_image_error; - gst_video_frame_unmap (&vframe); +} - if (vframe.info.finfo->flags & GST_VIDEO_FORMAT_FLAG_RGB) { - self->params.tcp_mct = 1; - } - opj_setup_encoder (enc, &self->params, image); - stream = opj_stream_create (4096, OPJ_FALSE); - if (!stream) - goto open_error; - - mstream.allocsize = 4096; - mstream.data = g_malloc (mstream.allocsize); - mstream.offset = 0; - mstream.size = 0; - - opj_stream_set_read_function (stream, read_fn); - opj_stream_set_write_function (stream, write_fn); - opj_stream_set_skip_function (stream, skip_fn); - opj_stream_set_seek_function (stream, seek_fn); - opj_stream_set_user_data (stream, &mstream, NULL); - opj_stream_set_user_data_length (stream, mstream.size); - - if (!opj_start_compress (enc, image, stream)) - goto encode_error; - - if (!opj_encode (enc, stream)) - goto encode_error; - - if (!opj_end_compress (enc, stream)) - goto encode_error; - - opj_image_destroy (image); - opj_stream_destroy (stream); - opj_destroy_codec (enc); - - frame->output_buffer = gst_buffer_new (); - - if (self->is_jp2c) { - GstMapInfo map; - GstMemory *mem; - - mem = gst_allocator_alloc (NULL, 8, 
NULL); - gst_memory_map (mem, &map, GST_MAP_WRITE); - GST_WRITE_UINT32_BE (map.data, mstream.size + 8); - GST_WRITE_UINT32_BE (map.data + 4, GST_MAKE_FOURCC ('j', 'p', '2', 'c')); - gst_memory_unmap (mem, &map); - gst_buffer_append_memory (frame->output_buffer, mem); - } +static GstOpenJPEGCodecMessage * +gst_openjpeg_enc_wait_for_new_message (GstOpenJPEGEnc * self) +{ + GstOpenJPEGCodecMessage *message = NULL; - gst_buffer_append_memory (frame->output_buffer, - gst_memory_new_wrapped (0, mstream.data, mstream.allocsize, 0, - mstream.size, NULL, (GDestroyNotify) g_free)); + GST_OBJECT_LOCK (self); + while (g_queue_is_empty (&self->messages)) + g_cond_wait (&self->messages_cond, GST_OBJECT_GET_LOCK (self)); + message = g_queue_pop_head (&self->messages); + GST_OBJECT_UNLOCK (self); - GST_VIDEO_CODEC_FRAME_SET_SYNC_POINT (frame); - ret = - (i == - self->num_stripes - - 1) ? gst_video_encoder_finish_frame (encoder, - frame) : gst_video_encoder_finish_subframe (encoder, frame); + return message; +} +static GstFlowReturn +gst_openjpeg_enc_encode_frame_multiple (GstVideoEncoder * encoder, + GstVideoCodecFrame * frame) +{ + GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (encoder); + GstFlowReturn ret = GST_FLOW_OK; + guint i; + guint encoded_stripes = 0; + guint enqueued_stripes = 0; + GstOpenJPEGCodecMessage *message = NULL; + + /* The method receives a frame and split it into n stripes and + * and create a thread per stripe to encode it. + * As the number of stripes can be greater than the + * available threads to encode, there is two loop, one to + * count the enqueues stripes and one to count the encoded + * stripes. 
+ */ + while (encoded_stripes < self->num_stripes) { + for (i = 1; + i <= self->available_threads + && enqueued_stripes < (self->num_stripes - encoded_stripes); i++) { + message = + gst_openjpeg_encode_message_new (self, frame, i + encoded_stripes); + GST_LOG_OBJECT (self, + "About to enqueue an encoding message from frame %p stripe %d", frame, + message->stripe); + gst_element_call_async (GST_ELEMENT (self), + (GstElementCallAsyncFunc) gst_openjpeg_enc_encode_stripe, message, + NULL); + enqueued_stripes++; + } + while (enqueued_stripes > 0) { + message = gst_openjpeg_enc_wait_for_new_message (self); + if (!message) + continue; + enqueued_stripes--; + if (message->last_error == OPENJPEG_ERROR_NONE) { + GST_LOG_OBJECT (self, + "About to push frame %p stripe %d", frame, message->stripe); + frame->output_buffer = gst_buffer_ref (message->output_buffer); + if (gst_openjpeg_encode_is_last_subframe (encoder, encoded_stripes + 1)) { + GST_VIDEO_CODEC_FRAME_SET_SYNC_POINT (frame); + ret = gst_video_encoder_finish_frame (encoder, frame); + } else + ret = gst_video_encoder_finish_subframe (encoder, frame); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT + (self, "An error occurred pushing the frame %s", + gst_flow_get_name (ret)); + goto done; + } + encoded_stripes++; + message = gst_openjpeg_encode_message_free (message); + } else { + GST_WARNING_OBJECT + (self, "An error occurred %d during the JPEG encoding", + message->last_error); + gst_video_codec_frame_unref (frame); + self->last_error = message->last_error; + ret = GST_FLOW_ERROR; + goto done; + } + } } +done: + gst_openjpeg_encode_message_free (message); return ret; +} -initialization_error: - { - gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, LIBRARY, INIT, - ("Failed to initialize OpenJPEG encoder"), (NULL)); - return GST_FLOW_ERROR; - } -map_read_error: - { - opj_destroy_codec (enc); - gst_video_codec_frame_unref (frame); - - GST_ELEMENT_ERROR (self, CORE, FAILED, - ("Failed to map input 
buffer"), (NULL)); - return GST_FLOW_ERROR; +static GstFlowReturn +gst_openjpeg_enc_encode_frame_single (GstVideoEncoder * encoder, + GstVideoCodecFrame * frame) +{ + GstOpenJPEGEnc *self = GST_OPENJPEG_ENC (encoder); + GstFlowReturn ret = GST_FLOW_OK; + guint i; + GstOpenJPEGCodecMessage *message = NULL; + + for (i = 1; i <= self->num_stripes; ++i) { + message = gst_openjpeg_encode_message_new (self, frame, i); + message->direct = TRUE; + gst_openjpeg_enc_encode_stripe (GST_ELEMENT (self), message); + if (message->last_error != OPENJPEG_ERROR_NONE) { + GST_WARNING_OBJECT + (self, "An error occurred %d during the JPEG encoding", + message->last_error); + gst_video_codec_frame_unref (frame); + self->last_error = message->last_error; + ret = GST_FLOW_ERROR; + goto done; + } + frame->output_buffer = gst_buffer_ref (message->output_buffer); + if (gst_openjpeg_encode_is_last_subframe (encoder, message->stripe)) { + GST_VIDEO_CODEC_FRAME_SET_SYNC_POINT (frame); + ret = gst_video_encoder_finish_frame (encoder, frame); + } else + ret = gst_video_encoder_finish_subframe (encoder, frame); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT + (self, "An error occurred pushing the frame %s", + gst_flow_get_name (ret)); + goto done; + } + message = gst_openjpeg_encode_message_free (message); } -fill_image_error: - { - opj_destroy_codec (enc); - gst_video_frame_unmap (&vframe); - gst_video_codec_frame_unref (frame); - GST_ELEMENT_ERROR (self, LIBRARY, INIT, - ("Failed to fill OpenJPEG image"), (NULL)); - return GST_FLOW_ERROR; - } -open_error: - { - opj_image_destroy (image); - opj_destroy_codec (enc); - gst_video_codec_frame_unref (frame); - - GST_ELEMENT_ERROR (self, LIBRARY, INIT, - ("Failed to open OpenJPEG data"), (NULL)); - return GST_FLOW_ERROR; +done: + gst_openjpeg_encode_message_free (message); + return ret; +} + +static GstFlowReturn +gst_openjpeg_enc_handle_frame (GstVideoEncoder * encoder, + GstVideoCodecFrame * frame) +{ + GstOpenJPEGEnc *self = GST_OPENJPEG_ENC 
(encoder); + GstFlowReturn ret = GST_FLOW_OK; + GstVideoFrame vframe; + gboolean subframe_mode = + self->num_stripes != GST_OPENJPEG_ENC_DEFAULT_NUM_STRIPES; + + GST_DEBUG_OBJECT (self, "Handling frame"); + + if (subframe_mode) { + gint min_res; + + /* due to limitations in openjpeg library, + * number of wavelet resolutions must not exceed floor(log(stripe height)) + 1 */ + if (!gst_video_frame_map (&vframe, &self->input_state->info, + frame->input_buffer, GST_MAP_READ)) { + gst_video_codec_frame_unref (frame); + self->last_error = OPENJPEG_ERROR_MAP_READ; + goto error; + } + /* find stripe with least height */ + min_res = + get_stripe_height (self, self->num_stripes - 1, + GST_VIDEO_FRAME_COMP_HEIGHT (&vframe, 0)); + min_res = MIN (min_res, get_stripe_height (self, 0, + GST_VIDEO_FRAME_COMP_HEIGHT (&vframe, 0))); + /* take log to find correct number of wavelet resolutions */ + min_res = min_res > 1 ? (gint) log (min_res) + 1 : 1; + self->params.numresolution = MIN (min_res + 1, self->params.numresolution); + gst_video_frame_unmap (&vframe); } -encode_error: - { - opj_stream_destroy (stream); - g_free (mstream.data); - opj_image_destroy (image); - opj_destroy_codec (enc); - gst_video_codec_frame_unref (frame); - - GST_ELEMENT_ERROR (self, STREAM, ENCODE, - ("Failed to encode OpenJPEG stream"), (NULL)); - return GST_FLOW_ERROR; + if (self->encode_frame (encoder, frame) != GST_FLOW_OK) + goto error; + + return ret; + +error: + switch (self->last_error) { + case OPENJPEG_ERROR_INIT: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to initialize OpenJPEG encoder"), (NULL)); + break; + case OPENJPEG_ERROR_MAP_READ: + GST_ELEMENT_ERROR (self, CORE, FAILED, + ("Failed to map input buffer"), (NULL)); + break; + case OPENJPEG_ERROR_FILL_IMAGE: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to fill OpenJPEG image"), (NULL)); + break; + case OPENJPEG_ERROR_OPEN: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to open OpenJPEG data"), (NULL)); + break; + case 
OPENJPEG_ERROR_ENCODE: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to encode OpenJPEG data"), (NULL)); + break; + default: + GST_ELEMENT_ERROR (self, LIBRARY, INIT, + ("Failed to encode OpenJPEG data"), (NULL)); + break; } + gst_openjpeg_enc_flush_messages (self); + return GST_FLOW_ERROR; } static gboolean
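The two-loop scheduling in `gst_openjpeg_enc_encode_frame_multiple` above caps the number of in-flight stripes at the number of available worker threads, then drains the message queue before enqueuing the next batch. A minimal sketch of just that batching arithmetic, in plain C with hypothetical names (no GStreamer types involved):

```c
#include <assert.h>

/* How many stripes may be enqueued in one batch: never more than the
 * available worker threads, and never more than the stripes still pending. */
static unsigned batch_size (unsigned available_threads,
    unsigned num_stripes, unsigned encoded)
{
  unsigned pending = num_stripes - encoded;
  return pending < available_threads ? pending : available_threads;
}

/* Simulate the outer loop: count how many enqueue/drain rounds are needed
 * to encode all stripes, assuming every batch completes successfully. */
static unsigned rounds_needed (unsigned available_threads, unsigned num_stripes)
{
  unsigned encoded = 0, rounds = 0;
  while (encoded < num_stripes) {
    encoded += batch_size (available_threads, num_stripes, encoded);
    rounds++;
  }
  return rounds;
}
```

In the real element the drain step is `gst_openjpeg_enc_wait_for_new_message`, which blocks on the `GCond` until a worker pushes its result onto the queue.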
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/gstopenjpegenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/gstopenjpegenc.h
Changed
@@ -55,9 +55,17 @@ gboolean is_jp2c; void (*fill_image) (opj_image_t * image, GstVideoFrame *frame); + GstFlowReturn (*encode_frame) (GstVideoEncoder * encoder, GstVideoCodecFrame *frame); opj_cparameters_t params; gint num_stripes; + + guint available_threads; + GQueue messages; + + GCond messages_cond; + + OpenJPEGErrorCode last_error; }; struct _GstOpenJPEGEncClass @@ -67,6 +75,8 @@ GType gst_openjpeg_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (openjpegenc); + G_END_DECLS #endif /* __GST_OPENJPEG_ENC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/openjpeg/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/openjpeg/meson.build
Changed
@@ -6,13 +6,10 @@ openjpeg_cargs = [] -if get_option('openjpeg').disabled() - subdir_done() -endif - openjpeg_dep = dependency('libopenjp2', version : '>=2.2', fallback : ['libopenjp2', 'libopenjp2_dep'], - required : get_option('openjpeg')) + required : get_option('openjpeg'), + default_options: ['build_codec=false']) if openjpeg_dep.found() gstopenjpeg = library('gstopenjpeg', @@ -21,7 +18,7 @@ link_args : noseh_link_args, include_directories : [configinc], dependencies : [gst_dep, gstvideo_dep, openjpeg_dep, - gstcodecparsers_dep], + gstcodecparsers_dep, libm], install : true, install_dir : plugins_install_dir, )
gst-plugins-bad-1.18.6.tar.xz/ext/openmpt/gstopenmptdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openmpt/gstopenmptdec.c
Changed
@@ -95,7 +95,8 @@ G_DEFINE_TYPE (GstOpenMptDec, gst_openmpt_dec, GST_TYPE_NONSTREAM_AUDIO_DECODER); - +GST_ELEMENT_REGISTER_DEFINE (openmptdec, "openmptdec", GST_RANK_PRIMARY + 2, + gst_openmpt_dec_get_type ()); static void gst_openmpt_dec_finalize (GObject * object);
gst-plugins-bad-1.18.6.tar.xz/ext/openmpt/gstopenmptdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/openmpt/gstopenmptdec.h
Changed
@@ -74,6 +74,7 @@ GType gst_openmpt_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (openmptdec); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/openmpt/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/openmpt/plugin.c
Changed
@@ -29,11 +29,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret = TRUE; - ret = ret - && gst_element_register (plugin, "openmptdec", GST_RANK_PRIMARY + 2, - gst_openmpt_dec_get_type ()); - return ret; + return GST_ELEMENT_REGISTER (openmptdec, plugin); }
gst-plugins-bad-1.18.6.tar.xz/ext/openni2/gstopenni2.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/openni2/gstopenni2.cpp
Changed
@@ -48,10 +48,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_openni2src_plugin_init (plugin)) - return FALSE; - - return TRUE; + return GST_ELEMENT_REGISTER (openni2src, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/openni2/gstopenni2src.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/openni2/gstopenni2src.cpp
Changed
@@ -111,6 +111,8 @@ #define parent_class gst_openni2_src_parent_class G_DEFINE_TYPE (GstOpenni2Src, gst_openni2_src, GST_TYPE_PUSH_SRC); +GST_ELEMENT_REGISTER_DEFINE (openni2src, "openni2src", GST_RANK_NONE, + GST_TYPE_OPENNI2_SRC); static void gst_openni2_src_class_init (GstOpenni2SrcClass * klass) @@ -512,14 +514,6 @@ return GST_BASE_SRC_CLASS (parent_class)->decide_allocation (bsrc, query); } -gboolean -gst_openni2src_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "openni2src", GST_RANK_NONE, - GST_TYPE_OPENNI2_SRC); -} - - static gboolean openni2_initialise_library (void) {
gst-plugins-bad-1.18.6.tar.xz/ext/openni2/gstopenni2src.h -> gst-plugins-bad-1.20.1.tar.xz/ext/openni2/gstopenni2src.h
Changed
@@ -75,7 +75,8 @@ }; GType gst_openni2_src_get_type (void); -gboolean gst_openni2src_plugin_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (openni2src); G_END_DECLS #endif /* __GST_OPENNI2_SRC_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/opus/gstopus.c -> gst-plugins-bad-1.20.1.tar.xz/ext/opus/gstopus.c
Changed
@@ -23,18 +23,10 @@ #include "gstopusparse.h" -#include <gst/tag/tag.h> - static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "opusparse", GST_RANK_NONE, - GST_TYPE_OPUS_PARSE)) - return FALSE; - - gst_tag_register_musicbrainz_tags (); - - return TRUE; + return GST_ELEMENT_REGISTER (opusparse, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/opus/gstopusparse.c -> gst-plugins-bad-1.20.1.tar.xz/ext/opus/gstopusparse.c
Changed
@@ -45,6 +45,7 @@ #include <gst/audio/audio.h> #include <gst/pbutils/pbutils.h> +#include <gst/tag/tag.h> GST_DEBUG_CATEGORY_STATIC (opusparse_debug); #define GST_CAT_DEFAULT opusparse_debug @@ -65,7 +66,7 @@ GST_STATIC_CAPS ("audio/x-opus") ); -G_DEFINE_TYPE (GstOpusParse, gst_opus_parse, GST_TYPE_BASE_PARSE); + static gboolean gst_opus_parse_start (GstBaseParse * parse); static gboolean gst_opus_parse_stop (GstBaseParse * parse); @@ -73,6 +74,10 @@ GstBaseParseFrame * frame, gint * skip); static GstFlowReturn gst_opus_parse_parse_frame (GstBaseParse * base, GstBaseParseFrame * frame); +static gboolean opusparse_element_init (GstPlugin * plugin); + +G_DEFINE_TYPE (GstOpusParse, gst_opus_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (opusparse, opusparse_element_init); static void gst_opus_parse_class_init (GstOpusParseClass * klass) @@ -428,3 +433,14 @@ return GST_FLOW_OK; } + +static gboolean +opusparse_element_init (GstPlugin * plugin) +{ + if (!gst_element_register (plugin, "opusparse", GST_RANK_NONE, + GST_TYPE_OPUS_PARSE)) + return FALSE; + + gst_tag_register_musicbrainz_tags (); + return TRUE; +}
gst-plugins-bad-1.18.6.tar.xz/ext/opus/gstopusparse.h -> gst-plugins-bad-1.20.1.tar.xz/ext/opus/gstopusparse.h
Changed
@@ -57,6 +57,8 @@ GType gst_opus_parse_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (opusparse); + G_END_DECLS #endif /* __GST_OPUS_PARSE_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay
Added
+(directory)
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstbaseqroverlay.c
Added
@@ -0,0 +1,398 @@ +/* + * GStreamer + * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net> + * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu> + * Copyright (c) 2020 Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 59 Temple Place - Suite 330, + * Boston, MA 02111-1307, USA. + */ + +/** + * plugin-qroverlay + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/base/gstbasetransform.h> +#include <json-glib/json-glib.h> + +#include <qrencode.h> +#include <string.h> +#include <stdio.h> +#include <stdlib.h> + +#include "gstbaseqroverlay.h" + +GST_DEBUG_CATEGORY_STATIC (gst_base_qr_overlay_debug); +#define GST_CAT_DEFAULT gst_base_qr_overlay_debug + +enum +{ + PROP_0, + PROP_X_AXIS, + PROP_Y_AXIS, + PROP_PIXEL_SIZE, + PROP_QRCODE_ERROR_CORRECTION, +}; + +typedef struct _GstBaseQROverlayPrivate GstBaseQROverlayPrivate; +struct _GstBaseQROverlayPrivate +{ + gfloat qrcode_size; + guint qrcode_quality; + guint span_frame; + QRecLevel level; + gfloat x_percent; + gfloat y_percent; + GstElement *overlaycomposition; + GstVideoInfo info; + gboolean valid; + + GstPad *sinkpad, *srcpad; + GstVideoOverlayComposition *prev_overlay; +}; + +#define PRIV(s) gst_base_qr_overlay_get_instance_private (GST_BASE_QR_OVERLAY (s)) + +#define 
OVERLAY_COMPOSITION_CAPS GST_VIDEO_CAPS_MAKE (GST_VIDEO_OVERLAY_COMPOSITION_BLEND_FORMATS) + +#define ALL_CAPS OVERLAY_COMPOSITION_CAPS ";" \ + GST_VIDEO_CAPS_MAKE_WITH_FEATURES ("ANY", GST_VIDEO_FORMATS_ALL) + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (ALL_CAPS) + ); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (ALL_CAPS) + ); + +#define DEFAULT_PROP_QUALITY 1 +#define DEFAULT_PROP_PIXEL_SIZE 3 + +#define GST_TYPE_QRCODE_QUALITY (gst_qrcode_quality_get_type()) +static GType +gst_qrcode_quality_get_type (void) +{ + static GType qrcode_quality_type = 0; + + static const GEnumValue qrcode_quality[] = { + {0, "Level L", "Approx 7%"}, + {1, "Level M", "Approx 15%"}, + {2, "Level Q", "Approx 25%"}, + {3, "Level H", "Approx 30%"}, + {0, NULL, NULL}, + }; + + if (!qrcode_quality_type) { + qrcode_quality_type = + g_enum_register_static ("GstQrcodeOverlayCorrection", qrcode_quality); + } + return qrcode_quality_type; +} + +#define gst_base_qr_overlay_parent_class parent_class +G_DEFINE_TYPE_WITH_PRIVATE (GstBaseQROverlay, gst_base_qr_overlay, + GST_TYPE_BIN); + +static void gst_base_qr_overlay_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_base_qr_overlay_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); + +static void +gst_base_qr_overlay_caps_changed_cb (GstBaseQROverlay * self, GstCaps * caps, + gint window_width, gint window_height, GstElement * overlay) +{ + GstBaseQROverlayPrivate *priv = PRIV (self); + + if (gst_video_info_from_caps (&priv->info, caps)) + priv->valid = TRUE; + else + priv->valid = FALSE; +} + +static GstVideoOverlayComposition * +draw_overlay (GstBaseQROverlay * self, QRcode * qrcode) +{ + guint8 *qr_data, *pixels; + gint stride, pstride, y, x, yy, square_size; + gsize offset, 
line_offset; + GstVideoInfo info; + GstVideoOverlayRectangle *rect; + GstVideoOverlayComposition *comp; + GstBuffer *buf; + GstBaseQROverlayPrivate *priv = PRIV (self); + + gst_video_info_init (&info); + + square_size = (qrcode->width + 4 * 2) * priv->qrcode_size; + gst_video_info_set_format (&info, GST_VIDEO_FORMAT_ARGB, square_size, + square_size); + + pixels = g_malloc ((size_t) info.size); + stride = info.stride[0]; + pstride = info.finfo->pixel_stride[0]; + + /* White background */ + for (y = 0; y < info.height; y++) + memset (&pixels[y * stride], 0xff, stride); + + /* Draw the black QR code blocks with 4px white space around it + * on top */ + line_offset = 4 * priv->qrcode_size * stride; + qr_data = qrcode->data; + for (y = 0; y < qrcode->width; y++) { + for (x = 0; x < (qrcode->width); x++) { + for (yy = 0; yy < priv->qrcode_size * pstride; yy += pstride) { + if (!(*qr_data & 1)) + continue; + + offset = + (((line_offset + (stride * (yy / pstride))) + + x * priv->qrcode_size * pstride)) + + (priv->qrcode_size * pstride) + (4 * priv->qrcode_size * pstride); + + for (gint i = 0; i < priv->qrcode_size * pstride; i += pstride) { + pixels[offset + i] = 0x00; + pixels[offset + i + 1] = 0x00; + pixels[offset + i + 2] = 0x00; + } + } + qr_data++; + } + line_offset += (stride * priv->qrcode_size); + } + + buf = gst_buffer_new_wrapped (pixels, info.size); + gst_buffer_add_video_meta (buf, GST_VIDEO_FRAME_FLAG_NONE, + GST_VIDEO_OVERLAY_COMPOSITION_FORMAT_RGB, info.width, info.height); + + x = (int) (priv->info.width - square_size) * (priv->x_percent / 100); + x = GST_ROUND_DOWN_2 (x); + y = (int) (priv->info.height - square_size) * (priv->y_percent / 100); + y = GST_ROUND_DOWN_4 (y); + + rect = gst_video_overlay_rectangle_new_raw (buf, x, y, + info.width, info.height, GST_VIDEO_OVERLAY_FORMAT_FLAG_NONE); + comp = gst_video_overlay_composition_new (rect); + gst_video_overlay_rectangle_unref (rect); + + return comp; +} + +static GstVideoOverlayComposition * 
+gst_base_qr_overlay_draw_cb (GstBaseQROverlay * self, GstSample * sample, + GstElement * _) +{ + GstBaseQROverlayPrivate *priv = PRIV (self); + QRcode *qrcode; + gchar *content; + gboolean reuse_previous = FALSE; + GstVideoOverlayComposition *overlay = NULL; + GstBuffer *buffer = gst_sample_get_buffer (sample); + GstSegment *segment = gst_sample_get_segment (sample); + GstClockTime rtime = gst_segment_to_running_time (segment, GST_FORMAT_TIME, + GST_BUFFER_PTS (buffer)); + + if (!priv->valid) { + GST_ERROR_OBJECT (self, "Trying to draw before negotiation?"); + + return NULL; + } + + if (GST_CLOCK_TIME_IS_VALID (rtime)) + gst_object_sync_values (GST_OBJECT (self), rtime); + + content = + GST_BASE_QR_OVERLAY_GET_CLASS (self)->get_content (GST_BASE_QR_OVERLAY + (self), buffer, &priv->info, &reuse_previous); + if (reuse_previous && priv->prev_overlay) { + overlay = gst_video_overlay_composition_ref (priv->prev_overlay); + } else if (content) { + GST_INFO_OBJECT (self, "String will be encoded : %s", content); + qrcode = + QRcode_encodeString (content, 0, priv->qrcode_quality, QR_MODE_8, 0); + + if (qrcode) { + GST_DEBUG_OBJECT (self, "String encoded"); + overlay = draw_overlay (GST_BASE_QR_OVERLAY (self), qrcode); + gst_mini_object_replace (((GstMiniObject **) & priv->prev_overlay), + (GstMiniObject *) overlay); + } else { + GST_WARNING_OBJECT (self, "Could not encode content: %s", content); + } + } + g_free (content); + + return overlay; +} + +/* GObject vmethod implementations */ + +static void +gst_base_qr_overlay_dispose (GObject * object) +{ + GstBaseQROverlayPrivate *priv = PRIV (object); + + gst_mini_object_replace (((GstMiniObject **) & priv->prev_overlay), NULL); +} + +/* initialize the qroverlay's class */ +static void +gst_base_qr_overlay_class_init (GstBaseQROverlayClass * klass) +{ + GObjectClass *gobject_class; + GstElementClass *gstelement_class; + + gobject_class = (GObjectClass *) klass; + gstelement_class = (GstElementClass *) klass; + + 
gobject_class->set_property = gst_base_qr_overlay_set_property; + gobject_class->get_property = gst_base_qr_overlay_get_property; + gobject_class->dispose = gst_base_qr_overlay_dispose; + + GST_DEBUG_CATEGORY_INIT (gst_base_qr_overlay_debug, "qroverlay", 0, + "Qrcode overlay base class"); + + g_object_class_install_property (gobject_class, + PROP_X_AXIS, g_param_spec_float ("x", + "X position (in percent of the width)", + "X position (in percent of the width)", + 0.0, 100.0, 50.0, G_PARAM_READWRITE)); + + g_object_class_install_property (gobject_class, + PROP_Y_AXIS, g_param_spec_float ("y", + "Y position (in percent of the height)", + "Y position (in percent of the height)", + 0.0, 100.0, 50.0, G_PARAM_READWRITE)); + + g_object_class_install_property (gobject_class, + PROP_PIXEL_SIZE, g_param_spec_float ("pixel-size", + "pixel-size", "Pixel size of each Qrcode pixel", + 1, 100.0, DEFAULT_PROP_PIXEL_SIZE, G_PARAM_READWRITE)); + + g_object_class_install_property (gobject_class, PROP_QRCODE_ERROR_CORRECTION, + g_param_spec_enum ("qrcode-error-correction", "qrcode-error-correction", + "qrcode-error-correction", GST_TYPE_QRCODE_QUALITY, + DEFAULT_PROP_QUALITY, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + gst_element_class_add_pad_template (gstelement_class, + gst_static_pad_template_get (&src_template)); + gst_element_class_add_pad_template (gstelement_class, + gst_static_pad_template_get (&sink_template)); + + gst_type_mark_as_plugin_api (GST_TYPE_QRCODE_QUALITY, 0); + gst_type_mark_as_plugin_api (GST_TYPE_QRCODE_QUALITY, 0); +} + +/* initialize the new element + * initialize instance structure + */ +static void +gst_base_qr_overlay_init (GstBaseQROverlay * self) +{ + GstBaseQROverlayPrivate *priv = PRIV (self); + + priv->x_percent = 50.0; + priv->y_percent = 50.0; + priv->qrcode_quality = DEFAULT_PROP_QUALITY; + priv->span_frame = 0; + priv->qrcode_size = DEFAULT_PROP_PIXEL_SIZE; + priv->overlaycomposition = + gst_element_factory_make ("overlaycomposition", 
NULL); + gst_video_info_init (&priv->info); + + if (priv->overlaycomposition) { + GstPadTemplate *sink_tmpl = gst_static_pad_template_get (&sink_template); + GstPadTemplate *src_tmpl = gst_static_pad_template_get (&src_template); + + gst_bin_add (GST_BIN (self), priv->overlaycomposition); + + gst_element_add_pad (GST_ELEMENT_CAST (self), + gst_ghost_pad_new_from_template ("sink", + priv->overlaycomposition->sinkpads->data, sink_tmpl)); + gst_element_add_pad (GST_ELEMENT_CAST (self), + gst_ghost_pad_new_from_template ("src", + priv->overlaycomposition->srcpads->data, src_tmpl)); + gst_object_unref (sink_tmpl); + gst_object_unref (src_tmpl); + + g_signal_connect_swapped (priv->overlaycomposition, "draw", + G_CALLBACK (gst_base_qr_overlay_draw_cb), self); + g_signal_connect_swapped (priv->overlaycomposition, "caps-changed", + G_CALLBACK (gst_base_qr_overlay_caps_changed_cb), self); + } +} + +static void +gst_base_qr_overlay_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstBaseQROverlayPrivate *priv = PRIV (object); + + switch (prop_id) { + case PROP_X_AXIS: + priv->x_percent = g_value_get_float (value); + break; + case PROP_Y_AXIS: + priv->y_percent = g_value_get_float (value); + break; + case PROP_PIXEL_SIZE: + priv->qrcode_size = g_value_get_float (value); + break; + case PROP_QRCODE_ERROR_CORRECTION: + priv->qrcode_quality = g_value_get_enum (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_base_qr_overlay_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstBaseQROverlayPrivate *priv = PRIV (object); + + switch (prop_id) { + case PROP_X_AXIS: + g_value_set_float (value, priv->x_percent); + break; + case PROP_Y_AXIS: + g_value_set_float (value, priv->y_percent); + break; + case PROP_PIXEL_SIZE: + g_value_set_float (value, priv->qrcode_size); + break; + case PROP_QRCODE_ERROR_CORRECTION: + 
g_value_set_enum (value, priv->qrcode_quality); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +}
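The geometry in `draw_overlay` above sizes the overlay as the QR module count plus a 4-module quiet zone on each side, scaled by the per-module pixel size, then places it as a percentage of the remaining space with even/4-aligned offsets. A small sketch of that arithmetic, with hypothetical function names and `round_down` mirroring `GST_ROUND_DOWN_N` for power-of-two N only:

```c
#include <assert.h>

/* Round v down to a multiple of n (n must be a power of two),
 * like GST_ROUND_DOWN_2 / GST_ROUND_DOWN_4. */
static int round_down (int v, int n) { return v & ~(n - 1); }

/* QR square edge: code width plus a 4-module border on both sides,
 * scaled by the per-module pixel size. */
static int overlay_square_size (int qr_width, int pixel_size)
{
  return (qr_width + 4 * 2) * pixel_size;
}

/* Horizontal offset as a percentage of the free space, rounded down to
 * an even pixel; the vertical offset uses a multiple of 4 instead. */
static int overlay_x (int frame_width, int square, float x_percent)
{
  int x = (int) ((frame_width - square) * (x_percent / 100));
  return round_down (x, 2);
}
```

The alignment keeps the rectangle on positions that blend cleanly for subsampled video formats.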
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstbaseqroverlay.h
Added
@@ -0,0 +1,43 @@ +/* + * GStreamer + * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net> + * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu> + * Copyright (c) 2020 Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 59 Temple Place - Suite 330, + * Boston, MA 02111-1307, USA. + */ + +#ifndef __GST_BASE_QR_OVERLAY_H__ +#define __GST_BASE_QR_OVERLAY_H__ + +#include <gst/gst.h> +#include <gst/video/gstvideofilter.h> + +G_BEGIN_DECLS +#define GST_TYPE_BASE_QR_OVERLAY (gst_base_qr_overlay_get_type()) +G_DECLARE_DERIVABLE_TYPE (GstBaseQROverlay, gst_base_qr_overlay, GST, BASE_QR_OVERLAY, GstVideoFilter); + +struct _GstBaseQROverlayClass +{ + GstBinClass parent; + + gchar* (*get_content) (GstBaseQROverlay *self, GstBuffer *buf, GstVideoInfo *info, + gboolean *reuse_previous); +}; + +G_END_DECLS +#endif /* __GST_BASE_QR_OVERLAY_H__ */ +
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstdebugqroverlay.c
Added
@@ -0,0 +1,284 @@ +/* + * GStreamer + * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net> + * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 59 Temple Place - Suite 330, + * Boston, MA 02111-1307, USA. + */ + +/** + * SECTION:element-debugqroverlay + * + * This element will build a Json string that contains a description of the + * buffer and will convert the string to a QRcode. The QRcode contains a + * timestamp, a buffer number, a framerate and some custom extra-data. Each + * frame will have a Qrcode overlaid in the video stream. Some properties are + * available to set the position and to define its size. You can add custom data + * with the properties #debugqroverlay:extra-data-name and + * #debugqroverlay:extra-data-array. You can also define the quality of the Qrcode + * with #GstBaseQROverlay:qrcode-error-correction. You can also define interval and + * span of #debugqroverlay:extra-data-name #debugqroverlay:extra-data-array + * + * ## Example launch line + * + * ``` bash + * gst-launch -v -m videotestsrc ! debugqroverlay ! 
fakesink silent=TRUE + * ``` + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/base/gstbasetransform.h> +#include <json-glib/json-glib.h> + +#include <qrencode.h> +#include <string.h> +#include <stdio.h> +#include <stdlib.h> + +#include "gstqroverlayelements.h" +#include "gstdebugqroverlay.h" + + +GST_DEBUG_CATEGORY_STATIC (gst_debug_qr_overlay_debug); +#define GST_CAT_DEFAULT gst_debug_qr_overlay_debug + +static gchar *get_qrcode_content (GstBaseQROverlay * base, GstBuffer * buf, + GstVideoInfo * info, gboolean * reuse_prev); + +enum +{ + PROP_0, + PROP_DATA_INTERVAL_BUFFERS, + PROP_DATA_SPAN_BUFFERS, + PROP_EXTRA_DATA_NAME, + PROP_EXTRA_DATA_ARRAY, +}; + +struct _GstDebugQROverlay +{ + GstBaseQROverlay parent; + + guint32 frame_number; + guint array_counter; + guint array_size; + guint span_frame; + guint64 extra_data_interval_buffers; + guint64 extra_data_span_buffers; + gchar *extra_data_name; + gchar *extra_data_str; + gchar **extra_data_array; + gfloat x_percent; + gfloat y_percent; + gboolean silent; + gboolean extra_data_enabled; +}; + +#define DEFAULT_PROP_QUALITY 1 +#define DEFAULT_PROP_PIXEL_SIZE 3 + +#define gst_debug_qr_overlay_parent_class parent_class +G_DEFINE_TYPE (GstDebugQROverlay, gst_debug_qr_overlay, + GST_TYPE_BASE_QR_OVERLAY); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (debugqroverlay, "debugqroverlay", + GST_RANK_NONE, GST_TYPE_DEBUG_QR_OVERLAY, qroverlay_element_init (plugin)); + + +static void gst_debug_qr_overlay_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_debug_qr_overlay_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); + +/* initialize the qroverlay's class */ +static void +gst_debug_qr_overlay_class_init (GstDebugQROverlayClass * klass) +{ + GObjectClass *gobject_class; + GstElementClass *gstelement_class; + + gobject_class = (GObjectClass *) klass; + gstelement_class = 
(GstElementClass *) klass; + + GST_DEBUG_CATEGORY_INIT (gst_debug_qr_overlay_debug, "debugqroverlay", 0, + "Qrcode overlay element"); + + gobject_class->set_property = gst_debug_qr_overlay_set_property; + gobject_class->get_property = gst_debug_qr_overlay_get_property; + + g_object_class_install_property (gobject_class, + PROP_DATA_INTERVAL_BUFFERS, + g_param_spec_int64 ("extra-data-interval-buffers", + "extra-data-interval-buffers", + "Extra data append into the Qrcode at the first buffer of each " + " interval", 0, G_MAXINT64, 60, G_PARAM_READWRITE)); + + g_object_class_install_property (gobject_class, + PROP_DATA_SPAN_BUFFERS, g_param_spec_int64 ("extra-data-span-buffers", + "extra-data-span-buffers", + "Numbers of consecutive buffers that the extra data will be inserted " + " (counting the first buffer)", 0, G_MAXINT64, 1, G_PARAM_READWRITE)); + + g_object_class_install_property (gobject_class, + PROP_EXTRA_DATA_NAME, g_param_spec_string ("extra-data-name", + "Extra data name", + "Json key name for extra append data", NULL, G_PARAM_READWRITE)); + + g_object_class_install_property (gobject_class, + PROP_EXTRA_DATA_ARRAY, g_param_spec_string ("extra-data-array", + "Extra data array", + "List of comma separated values that the extra data value will be " + " cycled from at each interval, example array structure :" + " \"240,480,720,960,1200,1440,1680,1920\"", NULL, G_PARAM_READWRITE)); + + + gst_element_class_set_details_simple (gstelement_class, + "qroverlay", + "Qrcode overlay containing buffer information", + "Overlay Qrcodes over each buffer with buffer information and custom data", + "Anthony Violo <anthony.violo@ubicast.eu>"); + + gst_type_mark_as_plugin_api (GST_TYPE_BASE_QR_OVERLAY, 0); + + GST_BASE_QR_OVERLAY_CLASS (klass)->get_content = + GST_DEBUG_FUNCPTR (get_qrcode_content); +} + +/* initialize the new element + * initialize instance structure + */ +static void +gst_debug_qr_overlay_init (GstDebugQROverlay * filter) +{ + filter->frame_number = 1; + 
+  filter->x_percent = 50.0;
+  filter->y_percent = 50.0;
+  filter->array_counter = 0;
+  filter->array_size = 0;
+  filter->extra_data_interval_buffers = 60;
+  filter->extra_data_span_buffers = 1;
+  filter->span_frame = 0;
+}
+
+static void
+gst_debug_qr_overlay_set_property (GObject * object, guint prop_id,
+    const GValue * value, GParamSpec * pspec)
+{
+  GstDebugQROverlay *filter = GST_DEBUG_QR_OVERLAY (object);
+
+  switch (prop_id) {
+    case PROP_DATA_INTERVAL_BUFFERS:
+      filter->extra_data_interval_buffers = g_value_get_int64 (value);
+      break;
+    case PROP_DATA_SPAN_BUFFERS:
+      filter->extra_data_span_buffers = g_value_get_int64 (value);
+      break;
+    case PROP_EXTRA_DATA_NAME:
+      filter->extra_data_name = g_value_dup_string (value);
+      break;
+    case PROP_EXTRA_DATA_ARRAY:
+    {
+      g_clear_pointer (&filter->extra_data_str, g_free);
+      g_clear_pointer (&filter->extra_data_array, g_strfreev);
+      filter->extra_data_str = g_value_dup_string (value);
+      if (filter->extra_data_str) {
+        filter->extra_data_array = g_strsplit (filter->extra_data_str, ",", -1);
+        filter->array_size = g_strv_length (filter->extra_data_array);
+      }
+      break;
+    }
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+gst_debug_qr_overlay_get_property (GObject * object, guint prop_id,
+    GValue * value, GParamSpec * pspec)
+{
+  GstDebugQROverlay *filter = GST_DEBUG_QR_OVERLAY (object);
+
+  switch (prop_id) {
+    case PROP_DATA_INTERVAL_BUFFERS:
+      g_value_set_int64 (value, filter->extra_data_interval_buffers);
+      break;
+    case PROP_DATA_SPAN_BUFFERS:
+      g_value_set_int64 (value, filter->extra_data_span_buffers);
+      break;
+    case PROP_EXTRA_DATA_NAME:
+      g_value_set_string (value, filter->extra_data_name);
+      break;
+    case PROP_EXTRA_DATA_ARRAY:
+      g_value_set_string (value, filter->extra_data_str);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static gchar *
+get_qrcode_content (GstBaseQROverlay * base, GstBuffer * buf,
+    GstVideoInfo * info, gboolean * reuse_prev)
+{
+  GstDebugQROverlay *filter = GST_DEBUG_QR_OVERLAY (base);
+  GString *res = g_string_new (NULL);
+  JsonGenerator *jgen;
+  gchar *framerate_string = g_strdup_printf ("%d/%d", info->fps_n, info->fps_d);
+
+  JsonObject *jobj = json_object_new ();
+  JsonNode *root = json_node_new (JSON_NODE_OBJECT);
+
+  *reuse_prev = FALSE;
+  json_object_set_int_member (jobj, "TIMESTAMP",
+      (gint64) GST_BUFFER_TIMESTAMP (buf));
+  json_object_set_int_member (jobj, "BUFFERCOUNT",
+      (gint64) filter->frame_number);
+  json_object_set_string_member (jobj, "FRAMERATE", framerate_string);
+  json_object_set_string_member (jobj, "NAME", GST_ELEMENT_NAME (filter));
+  g_free (framerate_string);
+
+  if (filter->extra_data_array && filter->extra_data_name &&
+      (filter->frame_number == 1
+          || filter->frame_number % filter->extra_data_interval_buffers == 1
+          || (filter->span_frame > 0
+              && filter->span_frame < filter->extra_data_span_buffers))) {
+    json_object_set_string_member (jobj, filter->extra_data_name,
+        filter->extra_data_array[filter->array_counter]);
+
+    filter->span_frame++;
+    if (filter->span_frame == filter->extra_data_span_buffers) {
+      filter->array_counter++;
+      filter->span_frame = 0;
+      if (filter->array_counter >= filter->array_size)
+        filter->array_counter = 0;
+    }
+  }
+
+  jgen = json_generator_new ();
+  json_node_set_object (root, jobj);
+  json_generator_set_root (jgen, root);
+  res = json_generator_to_gstring (jgen, res);
+  g_object_unref (jgen);
+  filter->frame_number++;
+
+  return g_strdup (g_string_free (res, FALSE));
+}
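The condition in `get_qrcode_content()` above decides which buffers carry the extra-data member: the very first buffer, every `extra_data_interval_buffers`-th buffer, and the following buffers while a span is open, cycling through the comma-separated array. A small Python sketch of that scheduling (illustrative only; the function name and parameters are made up, not part of the plugin API):

```python
def frames_with_extra_data(num_frames, interval=60, span=1, array_size=4):
    """Return (frame_number, array_index) pairs for buffers that the
    condition in get_qrcode_content() above would tag with extra data."""
    tagged = []
    span_frame = 0
    array_counter = 0
    for frame_number in range(1, num_frames + 1):
        if (frame_number == 1
                or frame_number % interval == 1
                or 0 < span_frame < span):
            tagged.append((frame_number, array_counter))
            # same book-keeping as the C code: advance within the span,
            # then move to the next array entry (wrapping around)
            span_frame += 1
            if span_frame == span:
                array_counter = (array_counter + 1) % array_size
                span_frame = 0
    return tagged
```

For example, with `interval=4`, `span=2` and a two-entry array, frames 1-2 carry entry 0, frames 5-6 carry entry 1, and frame 9 wraps back to entry 0 — matching the defaults of 60/1 where only every 60th buffer is tagged.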
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstdebugqroverlay.h
Added
@@ -0,0 +1,33 @@
+/*
+ * GStreamer
+ * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net>
+ * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifndef __GST_DEBUG_QR_OVERLAY_H__
+#define __GST_DEBUG_QR_OVERLAY_H__
+
+#include "gstbaseqroverlay.h"
+
+G_BEGIN_DECLS
+#define GST_TYPE_DEBUG_QR_OVERLAY (gst_debug_qr_overlay_get_type())
+
+G_DECLARE_FINAL_TYPE (GstDebugQROverlay, gst_debug_qr_overlay, GST, DEBUG_QR_OVERLAY, GstBaseQROverlay);
+
+G_END_DECLS
+#endif /* __GST_DEBUG_QR_OVERLAY_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstqroverlay.c
Added
@@ -0,0 +1,158 @@
+/*
+ * GStreamer
+ * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net>
+ * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu>
+ * Copyright (c) 2020 Thibault Saunier <tsaunier@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+/**
+ * SECTION:element-qroverlay
+ *
+ * Element to set random data on a qroverlay.
+ *
+ * ## Example launch line
+ *
+ * ``` bash
+ * gst-launch -v -m videotestsrc ! qroverlay ! fakesink silent=TRUE
+ * ```
+ *
+ * Since: 1.20
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/gst.h>
+#include <gst/base/gstbasetransform.h>
+#include <json-glib/json-glib.h>
+
+#include <qrencode.h>
+#include <string.h>
+#include <stdio.h>
+#include <stdlib.h>
+
+#include "gstqroverlayelements.h"
+#include "gstqroverlay.h"
+
+enum
+{
+  PROP_0,
+  PROP_DATA,
+};
+
+struct _GstQROverlay
+{
+  GstBaseQROverlay parent;
+  gchar *data;
+
+  gboolean data_changed;
+};
+
+#define gst_qr_overlay_parent_class parent_class
+G_DEFINE_TYPE (GstQROverlay, gst_qr_overlay, GST_TYPE_BASE_QR_OVERLAY);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (qroverlay, "qroverlay", GST_RANK_NONE,
+    GST_TYPE_QR_OVERLAY, qroverlay_element_init (plugin));
+
+static gchar *
+get_qrcode_content (GstBaseQROverlay * base, GstBuffer * buf,
+    GstVideoInfo * info, gboolean * reuse_prev)
+{
+  gchar *content;
+  GstQROverlay *self = GST_QR_OVERLAY (base);
+
+  GST_OBJECT_LOCK (self);
+  content = g_strdup (self->data);
+  *reuse_prev = self->data_changed;
+  GST_OBJECT_UNLOCK (self);
+
+  return content;
+}
+
+static void
+gst_qr_overlay_set_property (GObject * object, guint prop_id,
+    const GValue * value, GParamSpec * pspec)
+{
+  GstQROverlay *self = GST_QR_OVERLAY (object);
+
+  switch (prop_id) {
+    case PROP_DATA:
+      GST_OBJECT_LOCK (self);
+      self->data = g_value_dup_string (value);
+      self->data_changed = TRUE;
+      GST_OBJECT_UNLOCK (self);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+gst_qr_overlay_get_property (GObject * object, guint prop_id,
+    GValue * value, GParamSpec * pspec)
+{
+  GstQROverlay *self = GST_QR_OVERLAY (object);
+
+  switch (prop_id) {
+    case PROP_DATA:
+      g_value_set_string (value, self->data);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+gst_qr_overlay_class_init (GstQROverlayClass * klass)
+{
+  GObjectClass *gobject_class;
+  GstElementClass *gstelement_class;
+
+  gobject_class = (GObjectClass *) klass;
+  gstelement_class = (GstElementClass *) klass;
+
+  gobject_class->set_property = gst_qr_overlay_set_property;
+  gobject_class->get_property = gst_qr_overlay_get_property;
+
+  gst_element_class_set_details_simple (gstelement_class,
+      "qroverlay",
+      "Qrcode overlay containing random data",
+      "Overlay Qrcodes over each buffer with data passed in",
+      "Thibault Saunier <tsaunier@igalia.com>");
+
+  g_object_class_install_property (gobject_class,
+      PROP_DATA, g_param_spec_string ("data",
+          "Data",
+          "Data to write in the QRCode to be overlaid",
+          NULL,
+          G_PARAM_READWRITE | GST_PARAM_MUTABLE_PLAYING |
+          GST_PARAM_CONTROLLABLE));
+
+  GST_BASE_QR_OVERLAY_CLASS (klass)->get_content =
+      GST_DEBUG_FUNCPTR (get_qrcode_content);
+}
+
+/* initialize the new element
+ * initialize instance structure
+ */
+static void
+gst_qr_overlay_init (GstQROverlay * filter)
+{
+}
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstqroverlay.h
Added
@@ -0,0 +1,33 @@
+/*
+ * GStreamer
+ * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net>
+ * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifndef __GST_QR_OVERLAY_H__
+#define __GST_QR_OVERLAY_H__
+
+#include "gstbaseqroverlay.h"
+
+G_BEGIN_DECLS
+#define GST_TYPE_QR_OVERLAY (gst_qr_overlay_get_type())
+
+G_DECLARE_FINAL_TYPE (GstQROverlay, gst_qr_overlay, GST, QR_OVERLAY, GstBaseQROverlay);
+
+G_END_DECLS
+#endif /* __GST_QR_OVERLAY_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstqroverlayelement.c
Added
@@ -0,0 +1,44 @@
+/*
+ * GStreamer
+ * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net>
+ * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu>
+ * Copyright (c) 2020 Thibault Saunier <tsaunier@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/gst.h>
+
+#include "gstqroverlayelements.h"
+
+GST_DEBUG_CATEGORY_STATIC (gst_qr_overlay_debug);
+#define GST_CAT_DEFAULT gst_qr_overlay_debug
+
+void
+qroverlay_element_init (GstPlugin * plugin)
+{
+  static gsize res = FALSE;
+
+  if (g_once_init_enter (&res)) {
+    GST_DEBUG_CATEGORY_INIT (gst_qr_overlay_debug, "qroverlay", 0,
+        "Qrcode overlay element");
+    g_once_init_leave (&res, TRUE);
+  }
+}
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstqroverlayelements.h
Added
@@ -0,0 +1,38 @@
+/*
+ * GStreamer
+ * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net>
+ * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+#ifndef __GST_QR_OVERLAY_H__
+#define __GST_QR_OVERLAY_H__
+
+#include "gstbaseqroverlay.h"
+
+G_BEGIN_DECLS
+#define GST_TYPE_QR_OVERLAY (gst_qr_overlay_get_type())
+
+G_DECLARE_FINAL_TYPE (GstQROverlay, gst_qr_overlay, GST, QR_OVERLAY, GstBaseQROverlay);
+
+
+void qroverlay_element_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (debugqroverlay);
+GST_ELEMENT_REGISTER_DECLARE (qroverlay);
+
+G_END_DECLS
+#endif /* __GST_QR_OVERLAY_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/gstqroverlayplugin.c
Added
@@ -0,0 +1,60 @@
+/*
+ * GStreamer
+ * Copyright (C) 2006 Stefan Kost <ensonic@users.sf.net>
+ * Copyright (c) 2020 Anthony Violo <anthony.violo@ubicast.eu>
+ * Copyright (c) 2020 Thibault Saunier <tsaunier@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
+ * Boston, MA 02111-1307, USA.
+ */
+
+/**
+ * SECTION:element-qroverlay
+ *
+ * Element to set random data on a qroverlay.
+ *
+ * ## Example launch line
+ *
+ * ``` bash
+ * gst-launch -v -m videotestsrc ! qroverlay ! fakesink silent=TRUE
+ * ```
+ *
+ * Since: 1.20
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/gst.h>
+
+#include "gstqroverlayelements.h"
+
+static gboolean
+plugin_init (GstPlugin * plugin)
+{
+  gboolean ret = FALSE;
+
+  ret |= GST_ELEMENT_REGISTER (debugqroverlay, plugin);
+  ret |= GST_ELEMENT_REGISTER (qroverlay, plugin);
+
+  return ret;
+}
+
+GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
+    GST_VERSION_MINOR,
+    qroverlay,
+    "libqrencode qroverlay plugin",
+    plugin_init, VERSION, "LGPL", PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.20.1.tar.xz/ext/qroverlay/meson.build
Added
@@ -0,0 +1,16 @@
+qrencode_dep = dependency('libqrencode', required: get_option('qroverlay'))
+if qrencode_dep.found()
+  json_dep = dependency('json-glib-1.0', fallback : ['json-glib', 'json_glib_dep'], required: get_option('qroverlay'))
+  if json_dep.found()
+    gstqroverlay = library('gstqroverlay', ['gstqroverlay.c', 'gstdebugqroverlay.c', 'gstbaseqroverlay.c', 'gstqroverlayelement.c', 'gstqroverlayplugin.c'],
+      c_args : gst_plugins_bad_args,
+      include_directories : [configinc],
+      dependencies : [gstvideo_dep, qrencode_dep, json_dep],
+      install : true,
+      install_dir : plugins_install_dir,
+    )
+    pkgconfig.generate(gstqroverlay, install_dir : plugins_pkgconfig_install_dir)
+    plugins += [gstqroverlay]
+  endif
+endif
+
gst-plugins-bad-1.18.6.tar.xz/ext/resindvd/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/resindvd/meson.build
Changed
@@ -1,3 +1,8 @@
+resindvd_opt = get_option('resindvd').require(gpl_allowed, error_message: '''
+  Plugin resindvd explicitly required via options but GPL-licensed plugins disabled via options.
+  Pass option -Dgpl=enabled to Meson to allow GPL-licensed plugins to be built.
+  ''')
+
 resindvd_sources = [
   'gstmpegdemux.c',
   'gstmpegdesc.c',
@@ -10,8 +15,8 @@
   'rsnparsetter.c',
 ]
 
-dvdnav_dep = dependency('dvdnav', version : '>= 4.1.2', required : get_option('resindvd'))
-dvdread_dep = dependency('dvdread', version : '>= 4.1.2', required : get_option('resindvd'))
+dvdnav_dep = dependency('dvdnav', version : '>= 4.1.2', required : resindvd_opt)
+dvdread_dep = dependency('dvdread', version : '>= 4.1.2', required : resindvd_opt)
 
 if dvdnav_dep.found() and dvdread_dep.found()
   gstresindvd = library('gstresindvd',
gst-plugins-bad-1.18.6.tar.xz/ext/resindvd/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/resindvd/plugin.c
Changed
@@ -25,32 +25,12 @@
 #include "resindvdbin.h"
 #include "gstmpegdemux.h"
 
-#include <gst/gst-i18n-plugin.h>
-
-GST_DEBUG_CATEGORY (resindvd_debug);
-#define GST_CAT_DEFAULT resindvd_debug
-
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gboolean result = TRUE;
-
-  GST_DEBUG_CATEGORY_INIT (resindvd_debug, "resindvd",
-      0, "DVD playback elements from resindvd");
-
-#ifdef ENABLE_NLS
-  GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE,
-      LOCALEDIR);
-  bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR);
-  bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8");
-#endif
-
-  result &= gst_element_register (plugin, "rsndvdbin",
-      GST_RANK_PRIMARY, RESIN_TYPE_DVDBIN);
-
-  result &= gst_flups_demux_plugin_init (plugin);
-
-  return result;
+  return GST_ELEMENT_REGISTER (rsndvdbin, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/resindvd/resindvdbin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/resindvd/resindvdbin.c
Changed
@@ -28,6 +28,7 @@
 #include <gst/pbutils/missing-plugins.h>
 #include <gst/video/video.h>
 #include <gst/audio/audio.h>
+#include <gst/gst-i18n-plugin.h>
 
 #include "resindvdbin.h"
 #include "resindvdsrc.h"
@@ -84,10 +85,12 @@
 static void rsn_dvdbin_finalize (GObject * object);
 static void rsn_dvdbin_uri_handler_init (gpointer g_iface,
     gpointer iface_data);
+static gboolean rsndvdbin_element_init (GstPlugin * plugin);
 
 #define rsn_dvdbin_parent_class parent_class
 G_DEFINE_TYPE_WITH_CODE (RsnDvdBin, rsn_dvdbin, GST_TYPE_BIN,
     G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, rsn_dvdbin_uri_handler_init));
+GST_ELEMENT_REGISTER_DEFINE_CUSTOM (rsndvdbin, rsndvdbin_element_init);
 
 static void demux_pad_added (GstElement * element, GstPad * pad,
     RsnDvdBin * dvdbin);
@@ -100,6 +103,10 @@
     GstStateChange transition);
 static void rsn_dvdbin_no_more_pads (RsnDvdBin * dvdbin);
 
+
+GST_DEBUG_CATEGORY (resindvd_debug);
+#define GST_CAT_DEFAULT resindvd_debug
+
 static void
 rsn_dvdbin_class_init (RsnDvdBinClass * klass)
 {
@@ -711,7 +718,7 @@
   /* Request a pad from multiqueue, then connect this one, then
    * discover the corresponding output pad and return it */
-  mq_sink = gst_element_get_request_pad (dvdbin->pieces[DVD_ELEM_MQUEUE],
+  mq_sink = gst_element_request_pad_simple (dvdbin->pieces[DVD_ELEM_MQUEUE],
       "sink_%u");
   if (mq_sink == NULL)
     return FALSE;
@@ -788,13 +795,13 @@
     GST_LOG_OBJECT (dvdbin, "Found subpicture pad w/ caps %" GST_PTR_FORMAT,
        caps);
     dest_pad =
-        gst_element_get_request_pad (dvdbin->pieces[DVD_ELEM_SPU_SELECT],
+        gst_element_request_pad_simple (dvdbin->pieces[DVD_ELEM_SPU_SELECT],
        "sink_%u");
     skip_mq = TRUE;
   } else if (can_sink_caps (dvdbin->pieces[DVD_ELEM_AUDDEC], caps)) {
     GST_LOG_OBJECT (dvdbin, "Found audio pad w/ caps %" GST_PTR_FORMAT, caps);
     dest_pad =
-        gst_element_get_request_pad (dvdbin->pieces[DVD_ELEM_AUD_SELECT],
+        gst_element_request_pad_simple (dvdbin->pieces[DVD_ELEM_AUD_SELECT],
         "sink_%u");
   } else {
     GstStructure *s;
@@ -1029,3 +1036,26 @@
 
   return ret;
 }
+
+static gboolean
+rsndvdbin_element_init (GstPlugin * plugin)
+{
+  gboolean result = TRUE;
+
+  GST_DEBUG_CATEGORY_INIT (resindvd_debug, "resindvd",
+      0, "DVD playback elements from resindvd");
+
+#ifdef ENABLE_NLS
+  GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE,
+      LOCALEDIR);
+  bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR);
+  bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8");
+#endif
+
+  result &= gst_element_register (plugin, "rsndvdbin",
+      GST_RANK_PRIMARY, RESIN_TYPE_DVDBIN);
+
+  result &= gst_flups_demux_plugin_init (plugin);
+
+  return result;
+}
gst-plugins-bad-1.18.6.tar.xz/ext/resindvd/resindvdbin.h -> gst-plugins-bad-1.20.1.tar.xz/ext/resindvd/resindvdbin.h
Changed
@@ -86,6 +86,8 @@
 
 GType rsn_dvdbin_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (rsndvdbin);
+
 G_END_DECLS
 
 #endif /* __RESINDVDBIN_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/rsvg/gstrsvg.c -> gst-plugins-bad-1.20.1.tar.xz/ext/rsvg/gstrsvg.c
Changed
@@ -29,11 +29,12 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  return (gst_element_register (plugin, "rsvgoverlay",
-          GST_RANK_NONE, GST_TYPE_RSVG_OVERLAY)
-      &&
-      gst_element_register (plugin, "rsvgdec", GST_RANK_PRIMARY,
-          GST_TYPE_RSVG_DEC));
+  gboolean ret = FALSE;
+
+  ret |= GST_ELEMENT_REGISTER (rsvgoverlay, plugin);
+  ret |= GST_ELEMENT_REGISTER (rsvgdec, plugin);
+
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/rsvg/gstrsvgdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/rsvg/gstrsvgdec.c
Changed
@@ -59,6 +59,8 @@
 
 #define gst_rsv_dec_parent_class parent_class
 G_DEFINE_TYPE (GstRsvgDec, gst_rsvg_dec, GST_TYPE_VIDEO_DECODER);
+GST_ELEMENT_REGISTER_DEFINE (rsvgdec, "rsvgdec", GST_RANK_PRIMARY,
+    GST_TYPE_RSVG_DEC);
 
 static gboolean gst_rsvg_dec_stop (GstVideoDecoder * decoder);
 static gboolean gst_rsvg_dec_set_format (GstVideoDecoder * decoder,
gst-plugins-bad-1.18.6.tar.xz/ext/rsvg/gstrsvgdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/rsvg/gstrsvgdec.h
Changed
@@ -70,6 +70,7 @@
 };
 
 GType gst_rsvg_dec_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (rsvgdec);
 
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/rsvg/gstrsvgoverlay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/rsvg/gstrsvgoverlay.c
Changed
@@ -53,7 +53,7 @@
  * gst-launch-1.0 -v videotestsrc ! videoconvert ! rsvgoverlay name=overlay ! videoconvert ! autovideosink filesrc location=foo.svg ! image/svg ! overlay.data_sink
  * ]| does the same by feeding data through the data_sink pad. You can also specify the SVG data itself as parameter:
  * |[
- * gst-launch-1.0 -v videotestsrc ! videoconvert ! rsvgoverlay data='<svg viewBox="0 0 800 600"><image x="80%" y="80%" width="10%" height="10%" xlink:href="foo.jpg" /></svg>' ! videoconvert ! autovideosink
+ * gst-launch-1.0 -v videotestsrc ! videoconvert ! rsvgoverlay data='<svg viewBox="0 0 800 600"><image x="80%" y="80%" width="10%" height="10%" xlink:href="foo.jpg" /></svg>' ! videoconvert ! autovideosink
  * ]|
  *
  */
@@ -123,6 +123,8 @@
 
 #define gst_rsv_overlay_parent_class parent_class
 G_DEFINE_TYPE (GstRsvgOverlay, gst_rsvg_overlay, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE (rsvgoverlay, "rsvgoverlay", GST_RANK_NONE,
+    GST_TYPE_RSVG_OVERLAY);
 
 static void gst_rsvg_overlay_finalize (GObject * object);
gst-plugins-bad-1.18.6.tar.xz/ext/rsvg/gstrsvgoverlay.h -> gst-plugins-bad-1.20.1.tar.xz/ext/rsvg/gstrsvgoverlay.h
Changed
@@ -73,6 +73,7 @@
 };
 
 GType gst_rsvg_overlay_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (rsvgoverlay);
 
 G_END_DECLS
 
 #endif /* __GST_RSVG_OVERLAY_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/rtmp/gstrtmp.c -> gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/gstrtmp.c
Changed
@@ -31,87 +31,15 @@
 
 #include <gst/gst.h>
 
-#include "gstrtmpsrc.h"
-#include "gstrtmpsink.h"
-
-#ifndef GST_DISABLE_GST_DEBUG
-GST_DEBUG_CATEGORY_STATIC (rtmp_debug);
-
-static void
-gst_rtmp_log_callback (int level, const gchar * fmt, va_list vl)
-{
-  GstDebugLevel gst_level;
-
-  switch (level) {
-    case RTMP_LOGCRIT:
-    case RTMP_LOGERROR:
-      gst_level = GST_LEVEL_ERROR;
-      break;
-    case RTMP_LOGWARNING:
-      gst_level = GST_LEVEL_WARNING;
-      break;
-    case RTMP_LOGINFO:
-      gst_level = GST_LEVEL_INFO;
-      break;
-    case RTMP_LOGDEBUG:
-      gst_level = GST_LEVEL_DEBUG;
-      break;
-    case RTMP_LOGDEBUG2:
-      gst_level = GST_LEVEL_LOG;
-      break;
-    default:
-      gst_level = GST_LEVEL_TRACE;
-      break;
-  }
-
-  gst_debug_log_valist (rtmp_debug, gst_level, "", "", 0, NULL, fmt, vl);
-}
-
-static void
-_set_debug_level (void)
-{
-  GstDebugLevel gst_level;
-
-  RTMP_LogSetCallback (gst_rtmp_log_callback);
-  gst_level = gst_debug_category_get_threshold (rtmp_debug);
-
-  switch (gst_level) {
-    case GST_LEVEL_ERROR:
-      RTMP_LogSetLevel (RTMP_LOGERROR);
-      break;
-    case GST_LEVEL_WARNING:
-    case GST_LEVEL_FIXME:
-      RTMP_LogSetLevel (RTMP_LOGWARNING);
-      break;
-    case GST_LEVEL_INFO:
-      RTMP_LogSetLevel (RTMP_LOGINFO);
-      break;
-    case GST_LEVEL_DEBUG:
-      RTMP_LogSetLevel (RTMP_LOGDEBUG);
-      break;
-    case GST_LEVEL_LOG:
-      RTMP_LogSetLevel (RTMP_LOGDEBUG2);
-      break;
-    default:                    /* _TRACE and beyond */
-      RTMP_LogSetLevel (RTMP_LOGALL);
-  }
-}
-#endif
+#include "gstrtmpelements.h"
 
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gboolean ret;
-
-#ifndef GST_DISABLE_GST_DEBUG
-  GST_DEBUG_CATEGORY_INIT (rtmp_debug, "rtmp", 0, "libRTMP logging");
-  _set_debug_level ();
-#endif
+  gboolean ret = FALSE;
 
-  ret = gst_element_register (plugin, "rtmpsrc", GST_RANK_PRIMARY,
-      GST_TYPE_RTMP_SRC);
-  ret &= gst_element_register (plugin, "rtmpsink", GST_RANK_PRIMARY,
-      GST_TYPE_RTMP_SINK);
+  ret |= GST_ELEMENT_REGISTER (rtmpsrc, plugin);
+  ret |= GST_ELEMENT_REGISTER (rtmpsink, plugin);
 
   return ret;
 }
gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/gstrtmpelement.c
Added
@@ -0,0 +1,113 @@
+/* GStreamer
+ * Copyright (C) 1999,2000 Erik Walthinsen <omega@cse.ogi.edu>
+ *                    2000 Wim Taymans <wtay@chello.be>
+ *                    2002 Kristian Rietveld <kris@gtk.org>
+ *               2002,2003 Colin Walters <walters@gnu.org>
+ *               2001,2010 Bastien Nocera <hadess@hadess.net>
+ *                    2010 Sebastian Dröge <sebastian.droege@collabora.co.uk>
+ *                    2010 Jan Schmidt <thaytan@noraisin.net>
+ *
+ * rtmpsrc.c:
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/gst.h>
+
+#include "gstrtmpelements.h"
+#include "gstrtmpsrc.h"
+#include "gstrtmpsink.h"
+
+#ifndef GST_DISABLE_GST_DEBUG
+GST_DEBUG_CATEGORY_STATIC (rtmp_debug);
+
+static void
+gst_rtmp_log_callback (int level, const gchar * fmt, va_list vl)
+{
+  GstDebugLevel gst_level;
+
+  switch (level) {
+    case RTMP_LOGCRIT:
+    case RTMP_LOGERROR:
+      gst_level = GST_LEVEL_ERROR;
+      break;
+    case RTMP_LOGWARNING:
+      gst_level = GST_LEVEL_WARNING;
+      break;
+    case RTMP_LOGINFO:
+      gst_level = GST_LEVEL_INFO;
+      break;
+    case RTMP_LOGDEBUG:
+      gst_level = GST_LEVEL_DEBUG;
+      break;
+    case RTMP_LOGDEBUG2:
+      gst_level = GST_LEVEL_LOG;
+      break;
+    default:
+      gst_level = GST_LEVEL_TRACE;
+      break;
+  }
+
+  gst_debug_log_valist (rtmp_debug, gst_level, "", "", 0, NULL, fmt, vl);
+}
+
+static void
+_set_debug_level (void)
+{
+  GstDebugLevel gst_level;
+
+  RTMP_LogSetCallback (gst_rtmp_log_callback);
+  gst_level = gst_debug_category_get_threshold (rtmp_debug);
+
+  switch (gst_level) {
+    case GST_LEVEL_ERROR:
+      RTMP_LogSetLevel (RTMP_LOGERROR);
+      break;
+    case GST_LEVEL_WARNING:
+    case GST_LEVEL_FIXME:
+      RTMP_LogSetLevel (RTMP_LOGWARNING);
+      break;
+    case GST_LEVEL_INFO:
+      RTMP_LogSetLevel (RTMP_LOGINFO);
+      break;
+    case GST_LEVEL_DEBUG:
+      RTMP_LogSetLevel (RTMP_LOGDEBUG);
+      break;
+    case GST_LEVEL_LOG:
+      RTMP_LogSetLevel (RTMP_LOGDEBUG2);
+      break;
+    default:                    /* _TRACE and beyond */
+      RTMP_LogSetLevel (RTMP_LOGALL);
+  }
+}
+#endif
+
+void
+rtmp_element_init (GstPlugin * plugin)
+{
+  static gsize res = FALSE;
+  if (g_once_init_enter (&res)) {
+#ifndef GST_DISABLE_GST_DEBUG
+    GST_DEBUG_CATEGORY_INIT (rtmp_debug, "rtmp", 0, "libRTMP logging");
+    _set_debug_level ();
+#endif
+    g_once_init_leave (&res, TRUE);
+  }
+}
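The first switch in `gst_rtmp_log_callback()` above translates librtmp's log levels into GStreamer debug levels, with anything unrecognized falling through to trace. The same mapping as a quick Python lookup table (level names as strings, purely illustrative):

```python
# librtmp log level -> GStreamer debug level, mirroring the first switch
# in gstrtmpelement.c above (RTMP_LOGCRIT and RTMP_LOGERROR collapse to
# the same GStreamer level, just like the shared case labels in C).
RTMP_TO_GST = {
    "RTMP_LOGCRIT": "GST_LEVEL_ERROR",
    "RTMP_LOGERROR": "GST_LEVEL_ERROR",
    "RTMP_LOGWARNING": "GST_LEVEL_WARNING",
    "RTMP_LOGINFO": "GST_LEVEL_INFO",
    "RTMP_LOGDEBUG": "GST_LEVEL_DEBUG",
    "RTMP_LOGDEBUG2": "GST_LEVEL_LOG",
}


def gst_level_for(rtmp_level):
    # the C default branch maps any other level to GST_LEVEL_TRACE
    return RTMP_TO_GST.get(rtmp_level, "GST_LEVEL_TRACE")
```

Note that `_set_debug_level()` runs the mapping in the opposite direction, picking the librtmp verbosity from the current threshold of the `rtmp` debug category.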
gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/gstrtmpelements.h
Added
@@ -0,0 +1,35 @@
+/* GStreamer
+ * Copyright (C) <2020> Julian Bouzas <julian.bouzas@collabora.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+
+#ifndef __GST_RSVG_H__
+#define __GST_RSVG_H__
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include <gst/gst.h>
+
+void rtmp_element_init (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (rtmpsink);
+GST_ELEMENT_REGISTER_DECLARE (rtmpsrc);
+
+#endif /* __GST_RSVG_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/rtmp/gstrtmpsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/gstrtmpsink.c
Changed
@@ -41,6 +41,7 @@
 
 #include <gst/gst.h>
 
+#include "gstrtmpelements.h"
 #include "gstrtmpsink.h"
 
 #ifdef G_OS_WIN32
@@ -83,6 +84,8 @@
 G_DEFINE_TYPE_WITH_CODE (GstRTMPSink, gst_rtmp_sink, GST_TYPE_BASE_SINK,
     G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER,
         gst_rtmp_sink_uri_handler_init));
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (rtmpsink, "rtmpsink", GST_RANK_PRIMARY,
+    GST_TYPE_RTMP_SINK, rtmp_element_init (plugin));
 
 /* initialize the plugin's class */
 static void
gst-plugins-bad-1.18.6.tar.xz/ext/rtmp/gstrtmpsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/gstrtmpsink.h
Changed
@@ -64,6 +64,7 @@
 
 GType gst_rtmp_sink_get_type (void);
 
+
 G_END_DECLS
 
 #endif /* __GST_RTMP_SINK_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/rtmp/gstrtmpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/gstrtmpsrc.c
Changed
@@ -31,11 +31,21 @@
 * This plugin reads data from a local or remote location specified
 * by an URI. This location can be specified using any protocol supported by
 * the RTMP library, i.e. rtmp, rtmpt, rtmps, rtmpe, rtmfp, rtmpte and rtmpts.
+ * The URL/location can contain extra connection or session parameters
+ * for librtmp, such as 'flashver=version'. See the librtmp documentation
+ * for more detail. Of particular interest can be setting `live=1` to certain
+ * RTMP streams that don't seem to be playing otherwise.
+ *
 * ## Example launch lines
 * |[
 * gst-launch-1.0 -v rtmpsrc location=rtmp://somehost/someurl ! fakesink
 * ]| Open an RTMP location and pass its content to fakesink.
+ *
+ * |[
+ * gst-launch-1.0 rtmpsrc location="rtmp://somehost/someurl live=1" ! fakesink
+ * ]| Open an RTMP location and pass its content to fakesink while passing the
+ * live=1 flag to librtmp
 *
 */
@@ -45,6 +55,7 @@
 
 #include <gst/gst-i18n-plugin.h>
 
+#include "gstrtmpelements.h"
 #include "gstrtmpsrc.h"
 
 #include <stdio.h>
@@ -104,6 +115,8 @@
 G_DEFINE_TYPE_WITH_CODE (GstRTMPSrc, gst_rtmp_src, GST_TYPE_PUSH_SRC,
     G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER,
         gst_rtmp_src_uri_handler_init));
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (rtmpsrc, "rtmpsrc", GST_RANK_PRIMARY,
+    GST_TYPE_RTMP_SRC, rtmp_element_init (plugin));
 
 static void
 gst_rtmp_src_class_init (GstRTMPSrcClass * klass)
gst-plugins-bad-1.18.6.tar.xz/ext/rtmp/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/rtmp/meson.build
Changed
@@ -1,5 +1,6 @@
 rtmp_sources = [
   'gstrtmp.c',
+  'gstrtmpelement.c',
   'gstrtmpsink.c',
   'gstrtmpsrc.c',
 ]
gst-plugins-bad-1.18.6.tar.xz/ext/sbc/gstsbcdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sbc/gstsbcdec.c
Changed
@@ -48,6 +48,8 @@
 
 #define parent_class gst_sbc_dec_parent_class
 G_DEFINE_TYPE (GstSbcDec, gst_sbc_dec, GST_TYPE_AUDIO_DECODER);
+GST_ELEMENT_REGISTER_DEFINE (sbcdec, "sbcdec", GST_RANK_PRIMARY,
+    GST_TYPE_SBC_DEC);
 
 static GstStaticPadTemplate sbc_dec_sink_factory =
     GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
gst-plugins-bad-1.18.6.tar.xz/ext/sbc/gstsbcdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/sbc/gstsbcdec.h
Changed
@@ -56,4 +56,6 @@
 
 GType gst_sbc_dec_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (sbcdec);
+
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/sbc/gstsbcenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sbc/gstsbcenc.c
Changed
@@ -47,6 +47,7 @@
 #define GST_CAT_DEFAULT sbc_enc_debug
 
 G_DEFINE_TYPE (GstSbcEnc, gst_sbc_enc, GST_TYPE_AUDIO_ENCODER);
+GST_ELEMENT_REGISTER_DEFINE (sbcenc, "sbcenc", GST_RANK_NONE, GST_TYPE_SBC_ENC);
 
 static GstStaticPadTemplate sbc_enc_sink_factory =
     GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
@@ -88,11 +89,11 @@
 
   /* negotiate output format based on downstream caps restrictions */
   caps = gst_pad_get_allowed_caps (GST_AUDIO_ENCODER_SRC_PAD (enc));
-  if (caps == GST_CAPS_NONE || gst_caps_is_empty (caps))
-    goto failure;
 
   if (caps == NULL)
     caps = gst_static_pad_template_get_caps (&sbc_enc_src_factory);
+  else if (gst_caps_is_empty (caps))
+    goto failure;
 
   /* fixate output caps */
   filter_caps = gst_caps_new_simple ("audio/x-sbc", "rate", G_TYPE_INT,
@@ -287,8 +288,6 @@
     gst_buffer_replace (&outbuf, NULL);
   }
 
-done:
-
   gst_buffer_unmap (buffer, &in_map);
 
   return gst_audio_encoder_finish_frame (audio_enc, outbuf,
@@ -297,13 +296,16 @@
   /* ERRORS */
 no_buffer:
   {
-    GST_ERROR_OBJECT (enc, "could not allocate output buffer");
-    goto done;
+    gst_buffer_unmap (buffer, &in_map);
+    GST_ELEMENT_ERROR (enc, STREAM, FAILED, (NULL),
+        ("Could not allocate output buffer"));
+    return GST_FLOW_ERROR;
   }
 map_failed:
   {
-    GST_ERROR_OBJECT (enc, "could not map input buffer");
-    goto done;
+    GST_ELEMENT_ERROR (enc, STREAM, FAILED, (NULL),
+        ("Could not map input buffer"));
+    return GST_FLOW_ERROR;
   }
 }
gst-plugins-bad-1.18.6.tar.xz/ext/sbc/gstsbcenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/sbc/gstsbcenc.h
Changed
@@ -58,4 +58,6 @@ GType gst_sbc_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (sbcenc); + G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/sbc/sbc-plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sbc/sbc-plugin.c
Changed
@@ -24,14 +24,15 @@ #include "gstsbcdec.h" #include "gstsbcenc.h" -#include <string.h> static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "sbcdec", GST_RANK_PRIMARY, GST_TYPE_SBC_DEC); - gst_element_register (plugin, "sbcenc", GST_RANK_NONE, GST_TYPE_SBC_ENC); - return TRUE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (sbcdec, plugin); + ret |= GST_ELEMENT_REGISTER (sbcenc, plugin); + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
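The rewritten `plugin_init` above replaces the old unconditional registration with an OR-accumulated result, so the plugin still loads if at least one of its elements registered. A minimal plain-C sketch of that accumulation (the `register_*` functions are hypothetical stand-ins for `GST_ELEMENT_REGISTER`, since this sketch does not link against GStreamer):

```c
#include <stdbool.h>

/* Hypothetical stand-ins for GST_ELEMENT_REGISTER (sbcdec, plugin)
 * and GST_ELEMENT_REGISTER (sbcenc, plugin). */
static bool register_sbcdec(void) { return true; }
static bool register_sbcenc(void) { return false; } /* simulate a failure */

/* Mirrors the new plugin_init: every registration is attempted,
 * and the plugin reports success if any one of them succeeded. */
static bool plugin_init_sketch(void)
{
    bool ret = false;

    ret |= register_sbcdec();
    ret |= register_sbcenc();
    return ret;
}
```

The same `ret |= GST_ELEMENT_REGISTER (...)` pattern recurs in the sctp, soundtouch and spandsp plugin hunks below.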
gst-plugins-bad-1.18.6.tar.xz/ext/sctp/gstsctpdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sctp/gstsctpdec.c
Changed
@@ -39,6 +39,8 @@ #define gst_sctp_dec_parent_class parent_class G_DEFINE_TYPE (GstSctpDec, gst_sctp_dec, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (sctpdec, "sctpdec", GST_RANK_NONE, + GST_TYPE_SCTP_DEC); static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, @@ -604,8 +606,11 @@ remove_pad (GstSctpDec * self, GstPad * pad) { stop_srcpad_task (pad); + GST_PAD_STREAM_LOCK (pad); gst_pad_set_active (pad, FALSE); - gst_element_remove_pad (GST_ELEMENT (self), pad); + if (gst_object_has_as_parent (GST_OBJECT (pad), GST_OBJECT (self))) + gst_element_remove_pad (GST_ELEMENT (self), pad); + GST_PAD_STREAM_UNLOCK (pad); GST_OBJECT_LOCK (self); gst_flow_combiner_remove_pad (self->flow_combiner, pad); GST_OBJECT_UNLOCK (self);
gst-plugins-bad-1.18.6.tar.xz/ext/sctp/gstsctpdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/sctp/gstsctpdec.h
Changed
@@ -63,6 +63,7 @@ }; GType gst_sctp_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (sctpdec); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/sctp/gstsctpenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sctp/gstsctpenc.c
Changed
@@ -36,6 +36,8 @@ #define gst_sctp_enc_parent_class parent_class G_DEFINE_TYPE (GstSctpEnc, gst_sctp_enc, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (sctpenc, "sctpenc", GST_RANK_NONE, + GST_TYPE_SCTP_ENC); static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink_%u", GST_PAD_SINK, @@ -481,7 +483,10 @@ if (self->sctp_association) gst_sctp_association_reset_stream (self->sctp_association, stream_id); - gst_element_remove_pad (element, pad); + GST_PAD_STREAM_LOCK (pad); + if (gst_object_has_as_parent (GST_OBJECT (pad), GST_OBJECT (element))) + gst_element_remove_pad (element, pad); + GST_PAD_STREAM_UNLOCK (pad); } static void @@ -891,7 +896,7 @@ GST_DEBUG_OBJECT (self, "Received output packet of size %" G_GSIZE_FORMAT, length); - gstbuf = gst_buffer_new_wrapped (g_memdup (buf, length), length); + gstbuf = gst_buffer_new_memdup (buf, length); item = g_new0 (GstDataQueueItem, 1); item->object = GST_MINI_OBJECT (gstbuf);
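One change in the hunk above swaps `gst_buffer_new_wrapped (g_memdup (buf, length), length)` for `gst_buffer_new_memdup (buf, length)`: GLib's `g_memdup()` takes a `guint` length and can silently truncate sizes above `G_MAXUINT` on 64-bit platforms, while the helper added in GStreamer 1.20 takes a `gsize`. A plain-C sketch of a size-clean duplication helper (`memdup_sz` is a hypothetical name, not a GLib or GStreamer API):

```c
#include <stdlib.h>
#include <string.h>

/* Duplicate len bytes of src into freshly malloc'd memory,
 * using size_t throughout so large lengths are never truncated.
 * Returns NULL on allocation failure; caller frees the copy. */
static void *memdup_sz(const void *src, size_t len)
{
    void *copy = malloc(len);
    if (copy != NULL)
        memcpy(copy, src, len);
    return copy;
}
```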
gst-plugins-bad-1.18.6.tar.xz/ext/sctp/gstsctpenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/sctp/gstsctpenc.h
Changed
@@ -72,6 +72,7 @@ }; GType gst_sctp_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (sctpenc); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/sctp/gstsctpplugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sctp/gstsctpplugin.c
Changed
@@ -35,12 +35,13 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "sctpenc", GST_RANK_NONE, - GST_TYPE_SCTP_ENC) - && gst_element_register (plugin, "sctpdec", GST_RANK_NONE, - GST_TYPE_SCTP_DEC); -} + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (sctpenc, plugin); + ret |= GST_ELEMENT_REGISTER (sctpdec, plugin); + return ret; +} #ifndef PACKAGE #define PACKAGE "sctp"
gst-plugins-bad-1.18.6.tar.xz/ext/sctp/sctpassociation.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sctp/sctpassociation.c
Changed
@@ -234,7 +234,7 @@ self->state = GST_SCTP_ASSOCIATION_STATE_NEW; - self->use_sock_stream = FALSE; + self->use_sock_stream = TRUE; usrsctp_register_address ((void *) self); } @@ -546,6 +546,7 @@ length = (socklen_t) (sizeof (struct sctp_reset_streams) + sizeof (guint16)); srs = (struct sctp_reset_streams *) g_malloc0 (length); + srs->srs_assoc_id = SCTP_ALL_ASSOC; srs->srs_flags = SCTP_STREAM_RESET_OUTGOING; srs->srs_number_streams = 1; srs->srs_stream_list[0] = stream_id;
gst-plugins-bad-1.18.6.tar.xz/ext/smoothstreaming/gstmssdemux.c -> gst-plugins-bad-1.20.1.tar.xz/ext/smoothstreaming/gstmssdemux.c
Changed
@@ -109,7 +109,8 @@ #define gst_mss_demux_parent_class parent_class G_DEFINE_TYPE (GstMssDemux, gst_mss_demux, GST_TYPE_ADAPTIVE_DEMUX); - +GST_ELEMENT_REGISTER_DEFINE (mssdemux, "mssdemux", + GST_RANK_PRIMARY, GST_TYPE_MSS_DEMUX); static void gst_mss_demux_dispose (GObject * object); static void gst_mss_demux_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
gst-plugins-bad-1.18.6.tar.xz/ext/smoothstreaming/gstmssdemux.h -> gst-plugins-bad-1.20.1.tar.xz/ext/smoothstreaming/gstmssdemux.h
Changed
@@ -79,7 +79,7 @@ }; GType gst_mss_demux_get_type (void); - +GST_ELEMENT_REGISTER_DECLARE (mssdemux); G_END_DECLS #endif /* __GST_MSSDEMUX_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/smoothstreaming/gstsmoothstreaming-plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/smoothstreaming/gstsmoothstreaming-plugin.c
Changed
@@ -31,11 +31,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "mssdemux", - GST_RANK_PRIMARY, GST_TYPE_MSS_DEMUX)) - return FALSE; - - return TRUE; + return GST_ELEMENT_REGISTER (mssdemux, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/sndfile/gstsf.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/gstsf.c
Changed
@@ -25,100 +25,13 @@ #include <gst/gst-i18n-plugin.h> #include <string.h> -#include "gstsf.h" +#include "gstsfelements.h" -/* sf formats */ - -GstCaps * -gst_sf_create_audio_template_caps (void) -{ - GstCaps *caps = gst_caps_new_empty (); - SF_FORMAT_INFO format_info; - const gchar *fmt; - gint k, count; - - sf_command (NULL, SFC_GET_FORMAT_MAJOR_COUNT, &count, sizeof (gint)); - - for (k = 0; k < count; k++) { - format_info.format = k; - sf_command (NULL, SFC_GET_FORMAT_MAJOR, &format_info, sizeof (format_info)); - - switch (format_info.format) { - case SF_FORMAT_IRCAM: /* Berkeley/IRCAM/CARL */ - fmt = "audio/x-ircam"; - break; - case SF_FORMAT_NIST: /* Sphere NIST format. */ - fmt = "audio/x-nist"; - break; - case SF_FORMAT_PAF: /* Ensoniq PARIS file format. */ - fmt = "audio/x-paris"; - break; - case SF_FORMAT_SDS: /* Midi Sample Dump Standard */ - fmt = "audio/x-sds"; - break; - case SF_FORMAT_SVX: /* Amiga IFF / SVX8 / SV16 format. */ - fmt = "audio/x-svx"; - break; - case SF_FORMAT_VOC: /* VOC files. */ - fmt = "audio/x-voc"; - break; - case SF_FORMAT_W64: /* Sonic Foundry's 64 bit RIFF/WAV */ - fmt = "audio/x-w64"; - break; - case SF_FORMAT_XI: /* Fasttracker 2 Extended Instrument */ - fmt = "audio/x-xi"; - break; - case SF_FORMAT_RF64: /* RF64 WAV file */ - fmt = "audio/x-rf64"; - break; - /* does not make sense to expose that */ - case SF_FORMAT_RAW: /* RAW PCM data. 
*/ - /* we have other elements to handle these */ - case SF_FORMAT_AIFF: /* Apple/SGI AIFF format */ - case SF_FORMAT_AU: /* Sun/NeXT AU format */ - case SF_FORMAT_FLAC: /* FLAC lossless file format */ - case SF_FORMAT_OGG: /* Xiph OGG container */ - case SF_FORMAT_WAV: /* Microsoft WAV format */ - case SF_FORMAT_WAVEX: /* MS WAVE with WAVEFORMATEX */ - fmt = NULL; - GST_LOG ("skipping format '%s'", format_info.name); - break; - case SF_FORMAT_MAT4: /* Matlab (tm) V4.2 / GNU Octave 2.0 */ - case SF_FORMAT_MAT5: /* Matlab (tm) V5.0 / GNU Octave 2.1 */ - case SF_FORMAT_PVF: /* Portable Voice Format */ - case SF_FORMAT_HTK: /* HMM Tool Kit format */ - case SF_FORMAT_AVR: /* Audio Visual Research */ - case SF_FORMAT_SD2: /* Sound Designer 2 */ - case SF_FORMAT_CAF: /* Core Audio File format */ - case SF_FORMAT_WVE: /* Psion WVE format */ - case SF_FORMAT_MPC2K: /* Akai MPC 2000 sampler */ - default: - fmt = NULL; - GST_WARNING ("format 0x%x: '%s' is not mapped", format_info.format, - format_info.name); - } - if (fmt != NULL) { - gst_caps_append_structure (caps, gst_structure_new_empty (fmt)); - } - } - return gst_caps_simplify (caps); -} static gboolean plugin_init (GstPlugin * plugin) { -#ifdef ENABLE_NLS - GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, - LOCALEDIR); - bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); - bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); -#endif /* ENABLE_NLS */ - - if (!gst_element_register (plugin, "sfdec", GST_RANK_MARGINAL, - gst_sf_dec_get_type ())) - return FALSE; - - return TRUE; + return GST_ELEMENT_REGISTER (sfdec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/sndfile/gstsfdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/gstsfdec.c
Changed
@@ -25,6 +25,7 @@ #include <gst/gst-i18n-plugin.h> #include <gst/audio/audio.h> +#include "gstsfelements.h" #include "gstsfdec.h" #define FORMATS \ @@ -62,7 +63,8 @@ GST_DEBUG_CATEGORY_INIT (gst_sf_dec_debug, "sfdec", 0, "sfdec element"); #define gst_sf_dec_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstSFDec, gst_sf_dec, GST_TYPE_ELEMENT, _do_init); - +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (sfdec, "sfdec", GST_RANK_MARGINAL, + GST_TYPE_SF_DEC, sf_element_init (plugin)); /* sf virtual io */ static sf_count_t @@ -126,7 +128,7 @@ gst_sf_vio_write (const void *ptr, sf_count_t count, void *user_data) { GstSFDec *self = GST_SF_DEC (user_data); - GstBuffer *buffer = gst_buffer_new_wrapped (g_memdup (ptr, count), count); + GstBuffer *buffer = gst_buffer_new_memdup (ptr, count); if (gst_pad_push (self->srcpad, buffer) == GST_FLOW_OK) { return count;
gst-plugins-bad-1.18.6.tar.xz/ext/sndfile/gstsfdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/gstsfdec.h
Changed
@@ -22,7 +22,6 @@ #define __GST_SF_DEC_H__ -#include "gstsf.h" #include <gst/base/gstbasesrc.h> @@ -69,6 +68,7 @@ GstElementClass parent_class; }; +GType gst_sf_dec_get_type (void); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/gstsfelement.c
Added
@@ -0,0 +1,121 @@ +/* GStreamer libsndfile plugin + * Copyright (C) 2003 Andy Wingo <wingo at pobox dot com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst-i18n-plugin.h> +#include <string.h> + +#include "gstsfelements.h" + +/* sf formats */ + +GstCaps * +gst_sf_create_audio_template_caps (void) +{ + GstCaps *caps = gst_caps_new_empty (); + SF_FORMAT_INFO format_info; + const gchar *fmt; + gint k, count; + + sf_command (NULL, SFC_GET_FORMAT_MAJOR_COUNT, &count, sizeof (gint)); + + for (k = 0; k < count; k++) { + format_info.format = k; + sf_command (NULL, SFC_GET_FORMAT_MAJOR, &format_info, sizeof (format_info)); + + switch (format_info.format) { + case SF_FORMAT_IRCAM: /* Berkeley/IRCAM/CARL */ + fmt = "audio/x-ircam"; + break; + case SF_FORMAT_NIST: /* Sphere NIST format. */ + fmt = "audio/x-nist"; + break; + case SF_FORMAT_PAF: /* Ensoniq PARIS file format. */ + fmt = "audio/x-paris"; + break; + case SF_FORMAT_SDS: /* Midi Sample Dump Standard */ + fmt = "audio/x-sds"; + break; + case SF_FORMAT_SVX: /* Amiga IFF / SVX8 / SV16 format. */ + fmt = "audio/x-svx"; + break; + case SF_FORMAT_VOC: /* VOC files. 
*/ + fmt = "audio/x-voc"; + break; + case SF_FORMAT_W64: /* Sonic Foundry's 64 bit RIFF/WAV */ + fmt = "audio/x-w64"; + break; + case SF_FORMAT_XI: /* Fasttracker 2 Extended Instrument */ + fmt = "audio/x-xi"; + break; + case SF_FORMAT_RF64: /* RF64 WAV file */ + fmt = "audio/x-rf64"; + break; + /* does not make sense to expose that */ + case SF_FORMAT_RAW: /* RAW PCM data. */ + /* we have other elements to handle these */ + case SF_FORMAT_AIFF: /* Apple/SGI AIFF format */ + case SF_FORMAT_AU: /* Sun/NeXT AU format */ + case SF_FORMAT_FLAC: /* FLAC lossless file format */ + case SF_FORMAT_OGG: /* Xiph OGG container */ + case SF_FORMAT_WAV: /* Microsoft WAV format */ + case SF_FORMAT_WAVEX: /* MS WAVE with WAVEFORMATEX */ + fmt = NULL; + GST_LOG ("skipping format '%s'", format_info.name); + break; + case SF_FORMAT_MAT4: /* Matlab (tm) V4.2 / GNU Octave 2.0 */ + case SF_FORMAT_MAT5: /* Matlab (tm) V5.0 / GNU Octave 2.1 */ + case SF_FORMAT_PVF: /* Portable Voice Format */ + case SF_FORMAT_HTK: /* HMM Tool Kit format */ + case SF_FORMAT_AVR: /* Audio Visual Research */ + case SF_FORMAT_SD2: /* Sound Designer 2 */ + case SF_FORMAT_CAF: /* Core Audio File format */ + case SF_FORMAT_WVE: /* Psion WVE format */ + case SF_FORMAT_MPC2K: /* Akai MPC 2000 sampler */ + default: + fmt = NULL; + GST_WARNING ("format 0x%x: '%s' is not mapped", format_info.format, + format_info.name); + } + if (fmt != NULL) { + gst_caps_append_structure (caps, gst_structure_new_empty (fmt)); + } + } + return gst_caps_simplify (caps); +} + +void +sf_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { +#ifdef ENABLE_NLS + GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, + LOCALEDIR); + bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); + bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); +#endif /* ENABLE_NLS */ + g_once_init_leave (&res, TRUE); + } +}
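The new `sf_element_init()` above guards its gettext setup with `g_once_init_enter()`/`g_once_init_leave()` so the body runs exactly once even when several element registrations trigger it. A self-contained sketch of the same once-only guard using C11 atomics (unlike the GLib pair, this version does not block concurrent callers until initialization has finished):

```c
#include <stdatomic.h>
#include <stdbool.h>

static atomic_bool initialized = false;
static atomic_int init_count = 0;

/* Stand-in for the one-time body (binding the gettext domain in
 * the real sf_element_init); here it just counts how often it ran. */
static void do_setup(void)
{
    atomic_fetch_add(&init_count, 1);
}

/* Analogue of sf_element_init(): only the first caller to flip
 * the flag executes the setup body; later calls are no-ops. */
static void element_init_once(void)
{
    bool expected = false;
    if (atomic_compare_exchange_strong(&initialized, &expected, true))
        do_setup();
}
```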
gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/gstsfelements.h
Added
@@ -0,0 +1,38 @@ +/* GStreamer libsndfile plugin + * Copyright (C) 2003 Andy Wingo <wingo at pobox dot com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_SFELEMENTS_H__ +#define __GST_SFELEMENTS_H__ + + +#include <gst/gst.h> +#include <sndfile.h> + +G_BEGIN_DECLS + +GstCaps *gst_sf_create_audio_template_caps (void); +void sf_element_init(GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (sfdec); + +G_END_DECLS + + +#endif /* __GST_SFELEMENTS_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/sndfile/gstsfsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/gstsfsink.h
Changed
@@ -66,6 +66,11 @@ GstBaseSinkClass parent_class; }; +#define GST_TYPE_SF_MAJOR_TYPES (gst_sf_major_types_get_type()) +#define GST_TYPE_SF_MINOR_TYPES (gst_sf_minor_types_get_type()) + +GType gst_sf_major_types_get_type (void); +GType gst_sf_minor_types_get_type (void); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/sndfile/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/sndfile/meson.build
Changed
@@ -2,7 +2,7 @@ if sndfile_dep.found() gstsndfile = library('gstsndfile', - 'gstsf.c', 'gstsfdec.c', # 'gstsfsink.c', 'gstsfsrc.c', + 'gstsf.c', 'gstsfelement.c', 'gstsfdec.c', # 'gstsfsink.c', 'gstsfsrc.c', c_args: gst_plugins_bad_args, include_directories: [configinc, libsinc], dependencies: [gstaudio_dep, gst_dep, sndfile_dep],
gst-plugins-bad-1.18.6.tar.xz/ext/soundtouch/gstbpmdetect.cc -> gst-plugins-bad-1.20.1.tar.xz/ext/soundtouch/gstbpmdetect.cc
Changed
@@ -79,6 +79,8 @@ #define gst_bpm_detect_parent_class parent_class G_DEFINE_TYPE_WITH_PRIVATE (GstBPMDetect, gst_bpm_detect, GST_TYPE_AUDIO_FILTER); +GST_ELEMENT_REGISTER_DEFINE (bpmdetect, "bpmdetect", GST_RANK_NONE, + GST_TYPE_BPM_DETECT); static void gst_bpm_detect_finalize (GObject * object); static gboolean gst_bpm_detect_stop (GstBaseTransform * trans);
gst-plugins-bad-1.18.6.tar.xz/ext/soundtouch/gstbpmdetect.hh -> gst-plugins-bad-1.20.1.tar.xz/ext/soundtouch/gstbpmdetect.hh
Changed
@@ -51,6 +51,7 @@ }; GType gst_bpm_detect_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (bpmdetect); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/soundtouch/gstpitch.cc -> gst-plugins-bad-1.20.1.tar.xz/ext/soundtouch/gstpitch.cc
Changed
@@ -120,6 +120,8 @@ #define gst_pitch_parent_class parent_class G_DEFINE_TYPE_WITH_PRIVATE (GstPitch, gst_pitch, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (pitch, "pitch", GST_RANK_NONE, + GST_TYPE_PITCH); static void gst_pitch_class_init (GstPitchClass * klass)
gst-plugins-bad-1.18.6.tar.xz/ext/soundtouch/gstpitch.hh -> gst-plugins-bad-1.20.1.tar.xz/ext/soundtouch/gstpitch.hh
Changed
@@ -89,6 +89,7 @@ }; GType gst_pitch_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (pitch); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/soundtouch/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/soundtouch/plugin.c
Changed
@@ -28,9 +28,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "pitch", GST_RANK_NONE, GST_TYPE_PITCH) - && gst_element_register (plugin, "bpmdetect", GST_RANK_NONE, - GST_TYPE_BPM_DETECT); + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (pitch, plugin); + ret |= GST_ELEMENT_REGISTER (bpmdetect, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gstdtmfdetect.c -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gstdtmfdetect.c
Changed
@@ -93,6 +93,8 @@ GstEvent * event); G_DEFINE_TYPE (GstDtmfDetect, gst_dtmf_detect, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (dtmfdetect, "dtmfdetect", + GST_RANK_MARGINAL, GST_TYPE_DTMF_DETECT); static void gst_dtmf_detect_class_init (GstDtmfDetectClass * klass) @@ -278,11 +280,3 @@ return GST_BASE_TRANSFORM_CLASS (gst_dtmf_detect_parent_class)->sink_event (trans, event); } - - -gboolean -gst_dtmf_detect_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "dtmfdetect", - GST_RANK_MARGINAL, GST_TYPE_DTMF_DETECT); -}
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gstdtmfdetect.h -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gstdtmfdetect.h
Changed
@@ -64,7 +64,7 @@ GType gst_dtmf_detect_get_type (void); -gboolean gst_dtmf_detect_plugin_init (GstPlugin *plugin); +GST_ELEMENT_REGISTER_DECLARE (dtmfdetect); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gstspandsp.c -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gstspandsp.c
Changed
@@ -31,10 +31,13 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "spanplc", - GST_RANK_PRIMARY, GST_TYPE_SPAN_PLC) && - gst_dtmf_detect_plugin_init (plugin) && - gst_tone_generate_src_plugin_init (plugin); + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (spanplc, plugin); + ret |= GST_ELEMENT_REGISTER (dtmfdetect, plugin); + ret |= GST_ELEMENT_REGISTER (tonegeneratesrc, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gstspanplc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gstspanplc.c
Changed
@@ -38,6 +38,8 @@ #include <gst/audio/audio.h> G_DEFINE_TYPE (GstSpanPlc, gst_span_plc, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (spanplc, "spanplc", GST_RANK_PRIMARY, + GST_TYPE_SPAN_PLC); GST_DEBUG_CATEGORY_STATIC (gst_span_plc_debug); #define GST_CAT_DEFAULT gst_span_plc_debug
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gstspanplc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gstspanplc.h
Changed
@@ -59,6 +59,7 @@ }; GType gst_span_plc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (spanplc); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gsttonegeneratesrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gsttonegeneratesrc.c
Changed
@@ -63,8 +63,13 @@ ); #define gst_tone_generate_src_parent_class parent_class -G_DEFINE_TYPE (GstToneGenerateSrc, gst_tone_generate_src, GST_TYPE_PUSH_SRC); +G_DEFINE_TYPE_WITH_CODE (GstToneGenerateSrc, gst_tone_generate_src, + GST_TYPE_PUSH_SRC, GST_DEBUG_CATEGORY_INIT (tone_generate_src_debug, + "tonegeneratesrc", 0, "Telephony Tone Test Source"); + ); +GST_ELEMENT_REGISTER_DEFINE (tonegeneratesrc, "tonegeneratesrc", GST_RANK_NONE, + GST_TYPE_TONE_GENERATE_SRC); static void gst_tone_generate_src_finalize (GObject * object); static void gst_tone_generate_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -403,13 +408,3 @@ break; } } - -gboolean -gst_tone_generate_src_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (tone_generate_src_debug, "tonegeneratesrc", 0, - "Telephony Tone Test Source"); - - return gst_element_register (plugin, "tonegeneratesrc", - GST_RANK_NONE, GST_TYPE_TONE_GENERATE_SRC); -}
gst-plugins-bad-1.18.6.tar.xz/ext/spandsp/gsttonegeneratesrc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/spandsp/gsttonegeneratesrc.h
Changed
@@ -82,7 +82,7 @@ }; GType gst_tone_generate_src_get_type (void); -gboolean gst_tone_generate_src_plugin_init (GstPlugin *plugin); +GST_ELEMENT_REGISTER_DECLARE (tonegeneratesrc); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrt.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrt.c
Changed
@@ -22,13 +22,10 @@ #include "config.h" #endif +#include "gstsrtelements.h" #include "gstsrtsrc.h" #include "gstsrtsink.h" -GST_DEBUG_CATEGORY_STATIC (gst_debug_srtlib); -GST_DEBUG_CATEGORY (gst_debug_srtobject); -#define GST_CAT_DEFAULT gst_debug_srtobject - #ifndef GST_REMOVE_DEPRECATED #define GST_TYPE_SRT_CLIENT_SRC gst_srt_client_src_get_type() @@ -55,9 +52,20 @@ static GType gst_srt_server_sink_get_type (void); G_DEFINE_TYPE (GstSRTClientSrc, gst_srt_client_src, GST_TYPE_SRT_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtclientsrc, "srtclientsrc", + GST_RANK_NONE, GST_TYPE_SRT_CLIENT_SRC, srt_element_init (plugin)); + G_DEFINE_TYPE (GstSRTServerSrc, gst_srt_server_src, GST_TYPE_SRT_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtserversrc, "srtserversrc", + GST_RANK_NONE, GST_TYPE_SRT_SERVER_SRC, srt_element_init (plugin)); + G_DEFINE_TYPE (GstSRTClientSink, gst_srt_client_sink, GST_TYPE_SRT_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtclientsink, "srtclientsink", + GST_RANK_NONE, GST_TYPE_SRT_CLIENT_SINK, srt_element_init (plugin)); + G_DEFINE_TYPE (GstSRTServerSink, gst_srt_server_sink, GST_TYPE_SRT_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtserversink, "srtserversink", + GST_RANK_NONE, GST_TYPE_SRT_SERVER_SINK, srt_element_init (plugin)); static void gst_srt_client_src_init (GstSRTClientSrc * src) @@ -100,92 +108,3 @@ } #endif - -#ifndef GST_DISABLE_GST_DEBUG -static void -gst_srt_log_handler (void *opaque, int level, const char *file, int line, - const char *area, const char *message) -{ - GstDebugLevel gst_level; - - switch (level) { - case LOG_CRIT: - gst_level = GST_LEVEL_ERROR; - break; - - case LOG_ERR: - gst_level = GST_LEVEL_WARNING; - break; - - case LOG_WARNING: - gst_level = GST_LEVEL_INFO; - break; - - case LOG_NOTICE: - gst_level = GST_LEVEL_DEBUG; - break; - - case LOG_DEBUG: - gst_level = GST_LEVEL_LOG; - break; - - default: - gst_level = GST_LEVEL_FIXME; - break; - } - - if (G_UNLIKELY (gst_level <= _gst_debug_min)) { - 
gst_debug_log (gst_debug_srtlib, gst_level, file, area, line, NULL, "%s", - message); - } -} -#endif - -static gboolean -plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_debug_srtobject, "srtobject", 0, "SRT Object"); - GST_DEBUG_CATEGORY_INIT (gst_debug_srtlib, "srtlib", 0, "SRT Library"); - -#ifndef GST_DISABLE_GST_DEBUG - srt_setloghandler (NULL, gst_srt_log_handler); - srt_setlogflags (SRT_LOGF_DISABLE_TIME | SRT_LOGF_DISABLE_THREADNAME | - SRT_LOGF_DISABLE_SEVERITY | SRT_LOGF_DISABLE_EOL); - srt_setloglevel (LOG_DEBUG); -#endif - - if (!gst_element_register (plugin, "srtsrc", GST_RANK_PRIMARY, - GST_TYPE_SRT_SRC)) - return FALSE; - - if (!gst_element_register (plugin, "srtsink", GST_RANK_PRIMARY, - GST_TYPE_SRT_SINK)) - return FALSE; - - /* deprecated */ -#ifndef GST_REMOVE_DEPRECATED - if (!gst_element_register (plugin, "srtclientsrc", GST_RANK_NONE, - GST_TYPE_SRT_CLIENT_SRC)) - return FALSE; - - if (!gst_element_register (plugin, "srtserversrc", GST_RANK_NONE, - GST_TYPE_SRT_SERVER_SRC)) - return FALSE; - - if (!gst_element_register (plugin, "srtclientsink", GST_RANK_NONE, - GST_TYPE_SRT_CLIENT_SINK)) - return FALSE; - - if (!gst_element_register (plugin, "srtserversink", GST_RANK_NONE, - GST_TYPE_SRT_SERVER_SINK)) - return FALSE; -#endif - - return TRUE; -} - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - srt, - "transfer data via SRT", - plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtelement.c
Added
@@ -0,0 +1,90 @@ +/* GStreamer + * Copyright (C) 2017, Collabora Ltd. + * Author:Justin Kim <justin.kim@collabora.com> + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstsrtelements.h" +#include <srt/srt.h> + + +GST_DEBUG_CATEGORY_STATIC (gst_debug_srtlib); +GST_DEBUG_CATEGORY (gst_debug_srtobject); +#define GST_CAT_DEFAULT gst_debug_srtobject + +#ifndef GST_DISABLE_GST_DEBUG +static void +gst_srt_log_handler (void *opaque, int level, const char *file, int line, + const char *area, const char *message) +{ + GstDebugLevel gst_level; + + switch (level) { + case LOG_CRIT: + gst_level = GST_LEVEL_ERROR; + break; + + case LOG_ERR: + gst_level = GST_LEVEL_WARNING; + break; + + case LOG_WARNING: + gst_level = GST_LEVEL_INFO; + break; + + case LOG_NOTICE: + gst_level = GST_LEVEL_DEBUG; + break; + + case LOG_DEBUG: + gst_level = GST_LEVEL_LOG; + break; + + default: + gst_level = GST_LEVEL_FIXME; + break; + } + + if (G_UNLIKELY (gst_level <= _gst_debug_min)) { + gst_debug_log (gst_debug_srtlib, gst_level, file, area, line, NULL, "%s", + message); + } +} +#endif + +void +srt_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if 
(g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (gst_debug_srtobject, "srtobject", 0, "SRT Object"); + GST_DEBUG_CATEGORY_INIT (gst_debug_srtlib, "srtlib", 0, "SRT Library"); +#ifndef GST_DISABLE_GST_DEBUG + srt_setloghandler (NULL, gst_srt_log_handler); + srt_setlogflags (SRT_LOGF_DISABLE_TIME | SRT_LOGF_DISABLE_THREADNAME | + SRT_LOGF_DISABLE_SEVERITY | SRT_LOGF_DISABLE_EOL); + srt_setloglevel (LOG_DEBUG); +#endif + g_once_init_leave (&res, TRUE); + } +}
gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtelements.h
Added
@@ -0,0 +1,39 @@ +/* GStreamer + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_SRT_ELEMENTS_H__ +#define __GST_SRT_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void srt_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (srtclientsrc); +GST_ELEMENT_REGISTER_DECLARE (srtserversrc); +GST_ELEMENT_REGISTER_DECLARE (srtclientsink); +GST_ELEMENT_REGISTER_DECLARE (srtserversink); +GST_ELEMENT_REGISTER_DECLARE (srtsink); +GST_ELEMENT_REGISTER_DECLARE (srtsrc); + +#endif /* __GST_SRT_ELEMENT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrtobject.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtobject.c
Changed
@@ -37,6 +37,25 @@ GST_DEBUG_CATEGORY_EXTERN (gst_debug_srtobject); #define GST_CAT_DEFAULT gst_debug_srtobject +#if SRT_VERSION_VALUE > 0x10402 +#define SRTSOCK_ERROR_DEBUG ("libsrt reported: %s", srt_rejectreason_str (reason)) +#else +/* srt_rejectreason_str() is unavailable in libsrt 1.4.2 and prior due to + * unexported symbol. See https://github.com/Haivision/srt/pull/1728. */ +#define SRTSOCK_ERROR_DEBUG ("libsrt reported reject reason code %d", reason) +#endif + +/* Define options added in later revisions */ +#if SRT_VERSION_VALUE < 0x10402 +#define SRTO_DRIFTTRACER 37 +/* We can't define SRTO_BINDTODEVICE since it depends on configuration flags *sigh* */ +#define SRTO_RETRANSMITALGO 61 +#endif + +#define ELEMENT_WARNING_SRTSOCK_ERROR(code, reason) \ + GST_ELEMENT_WARNING (srtobject->element, RESOURCE, code, \ + ("Error on SRT socket. Trying to reconnect."), SRTSOCK_ERROR_DEBUG) + enum { PROP_URI = 1, @@ -51,6 +70,7 @@ PROP_STATS, PROP_WAIT_FOR_CONNECTION, PROP_STREAMID, + PROP_AUTHENTICATION, PROP_LAST }; @@ -106,7 +126,7 @@ * socket. Deliver the stats to the app before we throw them away. 
*/ gst_structure_free (stats); - g_signal_emit_by_name (srtobject->element, "caller-removed", caller->sock, + g_signal_emit_by_name (srtobject->element, "caller-removed", 0, caller->sockaddr); } @@ -132,6 +152,54 @@ }; /* *INDENT-ON* */ +typedef struct +{ + const gchar *name; + SRT_SOCKOPT opt; + GType gtype; +} SrtOption; + +SrtOption srt_options[] = { + {"mss", SRTO_MSS, G_TYPE_INT}, + {"fc", SRTO_FC, G_TYPE_INT}, + {"sndbuf", SRTO_SNDBUF, G_TYPE_INT}, + {"rcvbuf", SRTO_RCVBUF, G_TYPE_INT}, + {"maxbw", SRTO_MAXBW, G_TYPE_INT64}, + {"tsbpdmode", SRTO_TSBPDMODE, G_TYPE_BOOLEAN}, + {"latency", SRTO_LATENCY, G_TYPE_INT}, + {"inputbw", SRTO_INPUTBW, G_TYPE_INT64}, + {"oheadbw", SRTO_OHEADBW, G_TYPE_INT}, + {"passphrase", SRTO_PASSPHRASE, G_TYPE_STRING}, + {"pbkeylen", SRTO_PBKEYLEN, G_TYPE_INT}, + {"ipttl", SRTO_IPTTL, G_TYPE_INT}, + {"iptos", SRTO_IPTOS, G_TYPE_INT}, + {"tlpktdrop", SRTO_TLPKTDROP, G_TYPE_BOOLEAN}, + {"snddropdelay", SRTO_SNDDROPDELAY, G_TYPE_INT}, + {"nakreport", SRTO_NAKREPORT, G_TYPE_BOOLEAN}, + {"conntimeo", SRTO_CONNTIMEO, G_TYPE_INT}, + {"drifttracer", SRTO_DRIFTTRACER, G_TYPE_BOOLEAN}, + {"lossmaxttl", SRTO_LOSSMAXTTL, G_TYPE_INT}, + {"rcvlatency", SRTO_RCVLATENCY, G_TYPE_INT}, + {"peerlatency", SRTO_PEERLATENCY, G_TYPE_INT}, + {"minversion", SRTO_MINVERSION, G_TYPE_INT}, + {"streamid", SRTO_STREAMID, G_TYPE_STRING}, + {"congestion", SRTO_CONGESTION, G_TYPE_STRING}, + {"messageapi", SRTO_MESSAGEAPI, G_TYPE_BOOLEAN}, + {"payloadsize", SRTO_PAYLOADSIZE, G_TYPE_INT}, + {"transtype", SRTO_TRANSTYPE, G_TYPE_INT}, + {"kmrefreshrate", SRTO_KMREFRESHRATE, G_TYPE_INT}, + {"kmpreannounce", SRTO_KMPREANNOUNCE, G_TYPE_INT}, + {"enforcedencryption", SRTO_ENFORCEDENCRYPTION, G_TYPE_BOOLEAN}, + {"ipv6only", SRTO_IPV6ONLY, G_TYPE_INT}, + {"peeridletimeo", SRTO_PEERIDLETIMEO, G_TYPE_INT}, +#if SRT_VERSION_VALUE >= 0x10402 + {"bindtodevice", SRTO_BINDTODEVICE, G_TYPE_STRING}, +#endif + {"packetfilter", SRTO_PACKETFILTER, G_TYPE_STRING}, + {"retransmitalgo", 
SRTO_RETRANSMITALGO, G_TYPE_INT}, + {NULL} +}; + static gint srt_init_refcount = 0; static GSocketAddress * @@ -185,11 +253,68 @@ } static gboolean +gst_srt_object_apply_socket_option (SRTSOCKET sock, SrtOption * option, + const GValue * value, GError ** error) +{ + union + { + int32_t i; + int64_t i64; + gboolean b; + const gchar *c; + } u; + const void *optval = &u; + gint optlen; + + if (!G_VALUE_HOLDS (value, option->gtype)) { + goto bad_type; + } + + switch (option->gtype) { + case G_TYPE_INT: + u.i = g_value_get_int (value); + optlen = sizeof u.i; + break; + case G_TYPE_INT64: + u.i64 = g_value_get_int64 (value); + optlen = sizeof u.i64; + break; + case G_TYPE_BOOLEAN: + u.b = g_value_get_boolean (value); + optlen = sizeof u.b; + break; + case G_TYPE_STRING: + u.c = g_value_get_string (value); + optval = u.c; + optlen = u.c ? strlen (u.c) : 0; + if (optlen == 0) { + return TRUE; + } + break; + default: + goto bad_type; + } + + if (srt_setsockopt (sock, 0, option->opt, optval, optlen)) { + g_set_error (error, GST_LIBRARY_ERROR, GST_LIBRARY_ERROR_SETTINGS, + "failed to set %s (reason: %s)", option->name, srt_getlasterror_str ()); + return FALSE; + } + + return TRUE; + +bad_type: + g_set_error (error, GST_LIBRARY_ERROR, GST_LIBRARY_ERROR_SETTINGS, + "option %s has unsupported type", option->name); + return FALSE; +} + +static gboolean gst_srt_object_set_common_params (SRTSOCKET sock, GstSRTObject * srtobject, GError ** error) { const struct srt_constant_params *params = srt_params; - const gchar *passphrase; + SrtOption *option = srt_options; GST_OBJECT_LOCK (srtobject->element); @@ -202,53 +327,15 @@ } } - passphrase = gst_structure_get_string (srtobject->parameters, "passphrase"); - if (passphrase != NULL && passphrase[0] != '\0') { - int32_t pbkeylen; - - if (srt_setsockopt (sock, 0, SRTO_PASSPHRASE, passphrase, - strlen (passphrase))) { - g_set_error (error, GST_LIBRARY_ERROR, GST_LIBRARY_ERROR_SETTINGS, - "failed to set passphrase (reason: %s)", 
srt_getlasterror_str ()); + for (; option->name; ++option) { + const GValue *val; - goto err; - } - - if (!gst_structure_get_int (srtobject->parameters, "pbkeylen", &pbkeylen)) { - pbkeylen = GST_SRT_DEFAULT_PBKEYLEN; - } - - if (srt_setsockopt (sock, 0, SRTO_PBKEYLEN, &pbkeylen, sizeof pbkeylen)) { - g_set_error (error, GST_LIBRARY_ERROR, GST_LIBRARY_ERROR_SETTINGS, - "failed to set pbkeylen (reason: %s)", srt_getlasterror_str ()); - goto err; - } - } - - { - int32_t latency; - - if (!gst_structure_get_int (srtobject->parameters, "latency", &latency)) - latency = GST_SRT_DEFAULT_LATENCY; - if (srt_setsockopt (sock, 0, SRTO_LATENCY, &latency, sizeof latency)) { - g_set_error (error, GST_LIBRARY_ERROR, GST_LIBRARY_ERROR_SETTINGS, - "failed to set latency (reason: %s)", srt_getlasterror_str ()); + val = gst_structure_get_value (srtobject->parameters, option->name); + if (val && !gst_srt_object_apply_socket_option (sock, option, val, error)) { goto err; } } - if (gst_structure_has_field (srtobject->parameters, "streamid")) { - const gchar *streamid; - - streamid = gst_structure_get_string (srtobject->parameters, "streamid"); - if (streamid != NULL && streamid[0] != '\0') { - if (srt_setsockopt (sock, 0, SRTO_STREAMID, streamid, strlen (streamid))) { - g_set_error (error, GST_LIBRARY_ERROR, GST_LIBRARY_ERROR_SETTINGS, - "failed to set stream ID (reason: %s)", srt_getlasterror_str ()); - } - } - } - GST_OBJECT_UNLOCK (srtobject->element); return TRUE; @@ -338,7 +425,8 @@ gst_structure_set_value (srtobject->parameters, "passphrase", value); break; case PROP_PBKEYLEN: - gst_structure_set_value (srtobject->parameters, "pbkeylen", value); + gst_structure_set (srtobject->parameters, "pbkeylen", G_TYPE_INT, + g_value_get_enum (value), NULL); break; case PROP_WAIT_FOR_CONNECTION: srtobject->wait_for_connection = g_value_get_boolean (value); @@ -346,6 +434,9 @@ case PROP_STREAMID: gst_structure_set_value (srtobject->parameters, "streamid", value); break; + case 
PROP_AUTHENTICATION: + srtobject->authentication = g_value_get_boolean (value); + break; default: goto err; } @@ -403,8 +494,8 @@ GstSRTKeyLength v; GST_OBJECT_LOCK (srtobject->element); - if (!gst_structure_get_enum (srtobject->parameters, "pbkeylen", - GST_TYPE_SRT_KEY_LENGTH, (gint *) & v)) { + if (!gst_structure_get_int (srtobject->parameters, "pbkeylen", + (gint *) & v)) { GST_WARNING_OBJECT (srtobject->element, "Failed to get 'pbkeylen'"); v = GST_SRT_KEY_LENGTH_NO_KEY; } @@ -450,6 +541,9 @@ gst_structure_get_string (srtobject->parameters, "streamid")); GST_OBJECT_UNLOCK (srtobject->element); break; + case PROP_AUTHENTICATION: + g_value_set_boolean (value, srtobject->authentication); + break; default: return FALSE; } @@ -594,6 +688,22 @@ "Stream ID for the SRT access control", "", G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | G_PARAM_STATIC_STRINGS)); + + /** + * GstSRTSink:authentication: + * + * Boolean to authenticate a connection. If TRUE, + * the incoming connection is authenticated. Else, + * all the connections are accepted. 
+ * + * Since: 1.20 + * + */ + g_object_class_install_property (gobject_class, PROP_AUTHENTICATION, + g_param_spec_boolean ("authentication", + "Authentication", + "Authenticate a connection", + FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void @@ -637,6 +747,65 @@ } static void +gst_srt_object_set_int64_value (GstStructure * s, const gchar * key, + const gchar * value) +{ + gst_structure_set (s, key, G_TYPE_INT64, + g_ascii_strtoll (value, NULL, 10), NULL); +} + +static void +gst_srt_object_set_boolean_value (GstStructure * s, const gchar * key, + const gchar * value) +{ + gboolean bool_val; + static const gchar *true_names[] = { + "1", "yes", "on", "true", NULL + }; + static const gchar *false_names[] = { + "0", "no", "off", "false", NULL + }; + + if (g_strv_contains (true_names, value)) { + bool_val = TRUE; + } else if (g_strv_contains (false_names, value)) { + bool_val = FALSE; + } else { + return; + } + + gst_structure_set (s, key, G_TYPE_BOOLEAN, bool_val, NULL); +} + +static void +gst_srt_object_set_socket_option (GstStructure * s, const gchar * key, + const gchar * value) +{ + SrtOption *option = srt_options; + + for (; option->name; ++option) { + if (g_str_equal (key, option->name)) { + switch (option->gtype) { + case G_TYPE_INT: + gst_srt_object_set_int_value (s, key, value); + break; + case G_TYPE_INT64: + gst_srt_object_set_int64_value (s, key, value); + break; + case G_TYPE_STRING: + gst_srt_object_set_string_value (s, key, value); + break; + case G_TYPE_BOOLEAN: + gst_srt_object_set_boolean_value (s, key, value); + break; + } + + break; + } + } +} + +static void gst_srt_object_validate_parameters (GstStructure * s, GstUri * uri) { GstSRTConnectionMode connection_mode = GST_SRT_CONNECTION_MODE_NONE; @@ -724,17 +893,10 @@ gst_srt_object_set_string_value (srtobject->parameters, key, value); } else if (!g_strcmp0 ("localport", key)) { gst_srt_object_set_uint_value (srtobject->parameters, key, value); - } else if (!g_strcmp0 ("passphrase", key)) 
{ - gst_srt_object_set_string_value (srtobject->parameters, key, value); - } else if (!g_strcmp0 ("pbkeylen", key)) { - gst_srt_object_set_enum_value (srtobject->parameters, - GST_TYPE_SRT_KEY_LENGTH, key, value); - } else if (!g_strcmp0 ("streamid", key)) { - gst_srt_object_set_string_value (srtobject->parameters, key, value); - } else if (!g_strcmp0 ("latency", key)) { - gst_srt_object_set_int_value (srtobject->parameters, key, value); } else if (!g_strcmp0 ("poll-timeout", key)) { gst_srt_object_set_int_value (srtobject->parameters, key, value); + } else { + gst_srt_object_set_socket_option (srtobject->parameters, key, value); } } @@ -828,7 +990,7 @@ g_mutex_unlock (&srtobject->sock_lock); /* notifying caller-added */ - g_signal_emit_by_name (srtobject->element, "caller-added", caller->sock, + g_signal_emit_by_name (srtobject->element, "caller-added", 0, caller->sockaddr); if (gst_uri_handler_get_uri_type (GST_URI_HANDLER (srtobject->element)) == @@ -838,6 +1000,62 @@ } } +static GSocketAddress * +peeraddr_to_g_socket_address (const struct sockaddr *peeraddr) +{ + gsize peeraddr_len; + + switch (peeraddr->sa_family) { + case AF_INET: + peeraddr_len = sizeof (struct sockaddr_in); + break; + case AF_INET6: + peeraddr_len = sizeof (struct sockaddr_in6); + break; + default: + g_warning ("Unsupported address family %d", peeraddr->sa_family); + return NULL; + } + return g_socket_address_new_from_native ((gpointer) peeraddr, peeraddr_len); +} + +static gint +srt_listen_callback_func (GstSRTObject * self, SRTSOCKET sock, int hs_version, + const struct sockaddr *peeraddr, const char *stream_id) +{ + GSocketAddress *addr = peeraddr_to_g_socket_address (peeraddr); + + if (!addr) { + GST_WARNING_OBJECT (self->element, + "Invalid peer address. 
Rejecting sink %d streamid: %s", sock, + stream_id); + return -1; + } + + if (self->authentication) { + gboolean authenticated = FALSE; + + /* notifying caller-connecting */ + g_signal_emit_by_name (self->element, "caller-connecting", addr, + stream_id, &authenticated); + + if (!authenticated) + goto reject; + } + + GST_DEBUG_OBJECT (self->element, + "Accepting sink %d streamid: %s", sock, stream_id); + g_object_unref (addr); + return 0; +reject: + /* notifying caller-rejected */ + GST_WARNING_OBJECT (self->element, + "Rejecting sink %d streamid: %s", sock, stream_id); + g_signal_emit_by_name (self->element, "caller-rejected", addr, stream_id); + g_object_unref (addr); + return -1; +} + static gboolean gst_srt_object_wait_connect (GstSRTObject * srtobject, GCancellable * cancellable, gpointer sa, size_t sa_len, GError ** error) @@ -916,10 +1134,16 @@ srtobject->listener_sock = sock; + /* Register the SRT listen callback */ + if (srt_listen_callback (srtobject->listener_sock, + (srt_listen_callback_fn *) srt_listen_callback_func, srtobject)) { + goto failed; + } + srtobject->thread = g_thread_try_new ("GstSRTObjectListener", thread_func, srtobject, error); - - if (*error != NULL) { + if (srtobject->thread == NULL) { + GST_ERROR_OBJECT (srtobject->element, "Failed to start thread"); goto failed; } @@ -1307,11 +1531,11 @@ SRTSOCKET rsock; gint rsocklen = 1; - int pollret; + SRTSOCKET wsock; + gint wsocklen = 1; - pollret = srt_epoll_wait (poll_id, &rsock, - &rsocklen, 0, 0, poll_timeout, NULL, 0, NULL, 0); - if (pollret < 0) { + if (srt_epoll_wait (poll_id, &rsock, &rsocklen, &wsock, &wsocklen, + poll_timeout, NULL, 0, NULL, 0) < 0) { gint srt_errno = srt_getlasterror (NULL); if (srt_errno != SRT_ETIMEOUT) { @@ -1320,34 +1544,30 @@ continue; } - if (rsocklen < 0) { - GST_WARNING_OBJECT (srtobject->element, - "abnormal SRT socket is detected"); - srt_close (rsock); - } + if (wsocklen == 1 && rsocklen == 1) { + /* Socket reported in wsock AND rsock signifies an error. 
*/ + gint reason = srt_getrejectreason (wsock); + gboolean is_auth_error = (reason == SRT_REJ_BADSECRET + || reason == SRT_REJ_UNSECURE); - switch (srt_getsockstate (rsock)) { - case SRTS_BROKEN: - case SRTS_NONEXIST: - case SRTS_CLOSED: - if (connection_mode == GST_SRT_CONNECTION_MODE_LISTENER) { - /* Caller has been disappeared. */ - return 0; - } else { - GST_WARNING_OBJECT (srtobject->element, - "Invalid SRT socket. Trying to reconnect"); - gst_srt_object_close (srtobject); - if (!gst_srt_object_open_internal (srtobject, cancellable, error)) { - return -1; - } - continue; + if (is_auth_error) { + ELEMENT_WARNING_SRTSOCK_ERROR (NOT_AUTHORIZED, reason); + } + + if (connection_mode == GST_SRT_CONNECTION_MODE_LISTENER) { + /* Caller has disappeared. */ + return 0; + } else { + if (!is_auth_error) { + ELEMENT_WARNING_SRTSOCK_ERROR (READ, reason); } - case SRTS_CONNECTED: - /* good to go */ - break; - default: - /* not-ready */ - continue; + + gst_srt_object_close (srtobject); + if (!gst_srt_object_open_internal (srtobject, cancellable, error)) { + return -1; + } + } + continue; } @@ -1458,7 +1678,7 @@ gssize len = 0; const guint8 *msg = mapinfo->data; gint sent; - gint payload_size, optlen = 1; + gint payload_size, optlen = sizeof (payload_size); SRTCaller *caller = callers->data; callers = callers->next; @@ -1516,9 +1736,11 @@ gssize len = 0; gint poll_timeout; const guint8 *msg = mapinfo->data; - gint payload_size, optlen = 1; + gint payload_size, optlen = sizeof (payload_size); + gboolean wait_for_connection; GST_OBJECT_LOCK (srtobject->element); + wait_for_connection = srtobject->wait_for_connection; if (!gst_structure_get_int (srtobject->parameters, "poll-timeout", &poll_timeout)) { poll_timeout = GST_SRT_DEFAULT_POLL_TIMEOUT; @@ -1534,6 +1756,8 @@ } while (len < mapinfo->size) { + SRTSOCKET rsock; + gint rsocklen = 1; SRTSOCKET wsock; gint wsocklen = 1; @@ -1544,30 +1768,33 @@ break; } - if (srt_epoll_wait (srtobject->poll_id, 0, 0, &wsock, + if 
(!wait_for_connection && + srt_getsockstate (srtobject->sock) == SRTS_CONNECTING) { + GST_LOG_OBJECT (srtobject->element, + "Not connected yet. Dropping the buffer."); + break; + } + + if (srt_epoll_wait (srtobject->poll_id, &rsock, &rsocklen, &wsock, &wsocklen, poll_timeout, NULL, 0, NULL, 0) < 0) { continue; } - switch (srt_getsockstate (wsock)) { - case SRTS_BROKEN: - case SRTS_NONEXIST: - case SRTS_CLOSED: - GST_WARNING_OBJECT (srtobject->element, - "Invalid SRT socket. Trying to reconnect"); - gst_srt_object_close (srtobject); - if (!gst_srt_object_open_internal (srtobject, cancellable, error)) { - return -1; - } - continue; - case SRTS_CONNECTED: - /* good to go */ - GST_LOG_OBJECT (srtobject->element, "good to go"); - break; - default: - GST_WARNING_OBJECT (srtobject->element, "not ready"); - /* not-ready */ - continue; + if (wsocklen == 1 && rsocklen == 1) { + /* Socket reported in wsock AND rsock signifies an error. */ + gint reason = srt_getrejectreason (wsock); + + if (reason == SRT_REJ_BADSECRET || reason == SRT_REJ_UNSECURE) { + ELEMENT_WARNING_SRTSOCK_ERROR (NOT_AUTHORIZED, reason); + } else { + ELEMENT_WARNING_SRTSOCK_ERROR (WRITE, reason); + } + + gst_srt_object_close (srtobject); + if (!gst_srt_object_open_internal (srtobject, cancellable, error)) { + return -1; + } + continue; } if (srt_getsockflag (wsock, SRTO_PAYLOADSIZE, &payload_size, &optlen)) { @@ -1719,6 +1946,9 @@ tmp = get_stats_for_srtsock (caller->sock, is_sender, &bytes); + gst_structure_set (tmp, "caller-address", G_TYPE_SOCKET_ADDRESS, + caller->sockaddr, NULL); + g_value_array_append (callers_stats, NULL); v = g_value_array_get_nth (callers_stats, callers_stats->n_values - 1); g_value_init (v, GST_TYPE_STRUCTURE);
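The gstsrtobject.c changes above replace the hand-written per-option `srt_setsockopt()` calls with a table-driven scheme: any URI query key not handled specially is looked up in the sentinel-terminated `srt_options` array and coerced to that option's GType. A standalone C sketch of that lookup follows; the option names mirror the patch, but the numeric ids are illustrative placeholders, not the real `SRTO_*` constants:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Miniature version of the sentinel-terminated option table the patch
 * introduces. Iteration must stop when the name field is NULL: the
 * terminator is {NULL}, so the loop condition checks option->name,
 * never the array pointer itself. */
typedef struct
{
  const char *name;
  int opt;                      /* stand-in for an SRTO_* socket option id */
} MiniOption;

const MiniOption mini_options[] = {
  {"latency", 1},
  {"streamid", 2},
  {"messageapi", 3},
  {NULL, 0}
};

/* Returns the option id for a URI query key, or -1 for an unknown key
 * (the patch silently ignores unknown keys the same way). */
int
mini_option_lookup (const char *key)
{
  const MiniOption *option = mini_options;

  for (; option->name != NULL; ++option) {
    if (strcmp (option->name, key) == 0)
      return option->opt;
  }
  return -1;
}
```

This is why adding a new libsrt option in the real code is a one-line table entry instead of a new `if`/`setsockopt` block.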
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrtobject.h -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtobject.h
Changed
@@ -70,6 +70,8 @@ gboolean wait_for_connection; + gboolean authentication; + guint64 previous_bytes; };
View file
gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtplugin.c
Added
@@ -0,0 +1,52 @@ +/* GStreamer + * Copyright (C) 2017, Collabora Ltd. + * Author:Justin Kim <justin.kim@collabora.com> + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstsrtelements.h" + + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (srtsrc, plugin); + ret |= GST_ELEMENT_REGISTER (srtsink, plugin); + + /* deprecated */ +#ifndef GST_REMOVE_DEPRECATED + ret |= GST_ELEMENT_REGISTER (srtclientsrc, plugin); + ret |= GST_ELEMENT_REGISTER (srtserversrc, plugin); + ret |= GST_ELEMENT_REGISTER (srtclientsink, plugin); + ret |= GST_ELEMENT_REGISTER (srtserversink, plugin); +#endif + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + srt, + "transfer data via SRT", + plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrtsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtsink.c
Changed
@@ -42,6 +42,7 @@ #include <config.h> #endif +#include "gstsrtelements.h" #include "gstsrtsink.h" static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", @@ -56,7 +57,8 @@ { SIG_CALLER_ADDED, SIG_CALLER_REMOVED, - + SIG_CALLER_REJECTED, + SIG_CALLER_CONNECTING, LAST_SIGNAL }; @@ -67,12 +69,37 @@ static gchar *gst_srt_sink_uri_get_uri (GstURIHandler * handler); static gboolean gst_srt_sink_uri_set_uri (GstURIHandler * handler, const gchar * uri, GError ** error); +static gboolean default_caller_connecting (GstSRTSink * self, + GSocketAddress * addr, const gchar * username, gpointer data); +static gboolean authentication_accumulator (GSignalInvocationHint * ihint, + GValue * return_accu, const GValue * handler_return, gpointer data); #define gst_srt_sink_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstSRTSink, gst_srt_sink, GST_TYPE_BASE_SINK, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_srt_sink_uri_handler_init) GST_DEBUG_CATEGORY_INIT (GST_CAT_DEFAULT, "srtsink", 0, "SRT Sink")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtsink, "srtsink", GST_RANK_PRIMARY, + GST_TYPE_SRT_SINK, srt_element_init (plugin)); + +static gboolean +default_caller_connecting (GstSRTSink * self, + GSocketAddress * addr, const gchar * stream_id, gpointer data) +{ + /* Accept all connections. */ + return TRUE; +} + +static gboolean +authentication_accumulator (GSignalInvocationHint * ihint, + GValue * return_accu, const GValue * handler_return, gpointer data) +{ + gboolean ret = g_value_get_boolean (handler_return); + /* Handlers return TRUE on authentication success and we want to stop on + * the first failure. 
*/ + g_value_set_boolean (return_accu, ret); + return ret; +} static void gst_srt_sink_set_property (GObject * object, @@ -122,14 +149,10 @@ gst_srt_sink_start (GstBaseSink * bsink) { GstSRTSink *self = GST_SRT_SINK (bsink); - GstSRTConnectionMode connection_mode = GST_SRT_CONNECTION_MODE_NONE; GError *error = NULL; gboolean ret = FALSE; - gst_structure_get_enum (self->srtobject->parameters, "mode", - GST_TYPE_SRT_CONNECTION_MODE, (gint *) & connection_mode); - ret = gst_srt_object_open (self->srtobject, self->cancellable, &error); if (!ret) { @@ -147,6 +170,7 @@ { GstSRTSink *self = GST_SRT_SINK (bsink); + g_clear_pointer (&self->headers, gst_buffer_list_unref); gst_srt_object_close (self->srtobject); return TRUE; @@ -280,14 +304,15 @@ gobject_class->set_property = gst_srt_sink_set_property; gobject_class->get_property = gst_srt_sink_get_property; gobject_class->finalize = gst_srt_sink_finalize; + klass->caller_connecting = default_caller_connecting; /** * GstSRTSink::caller-added: * @gstsrtsink: the srtsink element that emitted this signal - * @sock: the client socket descriptor that was added to srtsink - * @addr: the #GSocketAddress that describes the @sock - * - * The given socket descriptor was added to srtsink. + * @unused: always zero (for ABI compatibility with previous versions) + * @addr: the #GSocketAddress of the new caller + * + * A new caller has connected to @gstsrtsink. */ signals[SIG_CALLER_ADDED] = g_signal_new ("caller-added", G_TYPE_FROM_CLASS (klass), @@ -297,10 +322,10 @@ /** * GstSRTSink::caller-removed: * @gstsrtsink: the srtsink element that emitted this signal - * @sock: the client socket descriptor that was added to srtsink - * @addr: the #GSocketAddress that describes the @sock + * @unused: always zero (for ABI compatibility with previous versions) + * @addr: the #GSocketAddress of the caller * - * The given socket descriptor was removed from srtsink. + * The given caller has disconnected. 
*/ signals[SIG_CALLER_REMOVED] = g_signal_new ("caller-removed", G_TYPE_FROM_CLASS (klass), @@ -308,6 +333,41 @@ caller_added), NULL, NULL, NULL, G_TYPE_NONE, 2, G_TYPE_INT, G_TYPE_SOCKET_ADDRESS); + /** + * GstSRTSink::caller-rejected: + * @gstsrtsink: the srtsink element that emitted this signal + * @addr: the #GSocketAddress that describes the client socket + * @stream_id: the stream Id to which the caller wants to connect + * + * A caller's connection to srtsink in listener mode has been rejected. + * + * Since: 1.20 + * + */ + signals[SIG_CALLER_REJECTED] = + g_signal_new ("caller-rejected", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, + 2, G_TYPE_SOCKET_ADDRESS, G_TYPE_STRING); + + /** + * GstSRTSink::caller-connecting: + * @gstsrtsink: the srtsink element that emitted this signal + * @addr: the #GSocketAddress that describes the client socket + * @stream_id: the stream Id to which the caller wants to connect + * + * Whether to accept or reject a caller's connection to srtsink in listener mode. + * The caller's connection is rejected if the callback returns FALSE, else + * the connection is accepted. + * + * Since: 1.20 + * + */ + signals[SIG_CALLER_CONNECTING] = + g_signal_new ("caller-connecting", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, G_STRUCT_OFFSET (GstSRTSinkClass, caller_connecting), + authentication_accumulator, NULL, NULL, G_TYPE_BOOLEAN, + 2, G_TYPE_SOCKET_ADDRESS, G_TYPE_STRING); + gst_srt_object_install_properties_helper (gobject_class); gst_element_class_add_static_pad_template (gstelement_class, &sink_template);
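The new `caller-connecting` signal in gstsrtsink.c is emitted with `authentication_accumulator`, which keeps each handler's boolean return and aborts the emission on the first FALSE, so one rejecting handler vetoes the caller. A plain-C sketch of that accumulation rule follows; the handler list and names are hypothetical stand-ins for GObject signal emission, not part of the patch:

```c
#include <assert.h>
#include <string.h>

typedef int (*CallerConnectingFn) (const char *stream_id);

/* Mimics authentication_accumulator: run handlers in connection order,
 * remember the latest return value, and stop at the first rejection. */
int
emit_caller_connecting (CallerConnectingFn *handlers, int n_handlers,
    const char *stream_id)
{
  int accepted = 1;             /* the default class handler accepts everyone */
  int i;

  for (i = 0; i < n_handlers; i++) {
    accepted = handlers[i] (stream_id);
    if (!accepted)
      break;                    /* first FALSE wins; emission stops here */
  }
  return accepted;
}

/* Example handlers: one that accepts everything (like the default
 * caller_connecting class handler) and one that rejects a known stream id. */
int
accept_all (const char *stream_id)
{
  (void) stream_id;
  return 1;
}

int
reject_guest (const char *stream_id)
{
  return strcmp (stream_id, "guest") != 0;
}
```

With this rule, installing no application handler leaves only the accept-everything default, matching the documented behaviour when `authentication` is FALSE.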
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrtsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtsink.h
Changed
@@ -56,6 +56,10 @@ void (*caller_added) (GstSRTSink *self, int sock, GSocketAddress * addr); void (*caller_removed) (GstSRTSink *self, int sock, GSocketAddress * addr); + void (*caller_rejected) (GstSRTSink *self, GSocketAddress * peer_address, + const gchar * stream_id, gpointer data); + gboolean (*caller_connecting) (GstSRTSink *self, GSocketAddress * peer_address, + const gchar * stream_id, gpointer data); }; GType gst_srt_sink_get_type (void);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrtsrc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtsrc.c
Changed
@@ -45,6 +45,7 @@ #include <config.h> #endif +#include "gstsrtelements.h" #include "gstsrtsrc.h" static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", @@ -59,7 +60,8 @@ { SIG_CALLER_ADDED, SIG_CALLER_REMOVED, - + SIG_CALLER_REJECTED, + SIG_CALLER_CONNECTING, LAST_SIGNAL }; @@ -70,12 +72,37 @@ static gchar *gst_srt_src_uri_get_uri (GstURIHandler * handler); static gboolean gst_srt_src_uri_set_uri (GstURIHandler * handler, const gchar * uri, GError ** error); +static gboolean src_default_caller_connecting (GstSRTSrc * self, + GSocketAddress * addr, const gchar * username, gpointer data); +static gboolean src_authentication_accumulator (GSignalInvocationHint * ihint, + GValue * return_accu, const GValue * handler_return, gpointer data); #define gst_srt_src_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstSRTSrc, gst_srt_src, GST_TYPE_PUSH_SRC, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_srt_src_uri_handler_init) GST_DEBUG_CATEGORY_INIT (GST_CAT_DEFAULT, "srtsrc", 0, "SRT Source")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtsrc, "srtsrc", GST_RANK_PRIMARY, + GST_TYPE_SRT_SRC, srt_element_init (plugin)); + +static gboolean +src_default_caller_connecting (GstSRTSrc * self, + GSocketAddress * addr, const gchar * stream_id, gpointer data) +{ + /* Accept all connections. */ + return TRUE; +} + +static gboolean +src_authentication_accumulator (GSignalInvocationHint * ihint, + GValue * return_accu, const GValue * handler_return, gpointer data) +{ + gboolean ret = g_value_get_boolean (handler_return); + /* Handlers return TRUE on authentication success and we want to stop on + * the first failure. 
*/ + g_value_set_boolean (return_accu, ret); + return ret; +} static gboolean gst_srt_src_start (GstBaseSrc * bsrc) @@ -333,14 +360,15 @@ gobject_class->set_property = gst_srt_src_set_property; gobject_class->get_property = gst_srt_src_get_property; gobject_class->finalize = gst_srt_src_finalize; + klass->caller_connecting = src_default_caller_connecting; /** * GstSRTSrc::caller-added: - * @gstsrtsink: the srtsink element that emitted this signal - * @sock: the client socket descriptor that was added to srtsink - * @addr: the #GSocketAddress that describes the @sock - * - * The given socket descriptor was added to srtsink. + * @gstsrtsrc: the srtsrc element that emitted this signal + * @unused: always zero (for ABI compatibility with previous versions) + * @addr: the #GSocketAddress of the new caller + * + * A new caller has connected to srtsrc. */ signals[SIG_CALLER_ADDED] = g_signal_new ("caller-added", G_TYPE_FROM_CLASS (klass), @@ -349,11 +377,11 @@ /** * GstSRTSrc::caller-removed: - * @gstsrtsink: the srtsink element that emitted this signal - * @sock: the client socket descriptor that was added to srtsink - * @addr: the #GSocketAddress that describes the @sock + * @gstsrtsrc: the srtsrc element that emitted this signal + * @unused: always zero (for ABI compatibility with previous versions) + * @addr: the #GSocketAddress of the caller * - * The given socket descriptor was removed from srtsink. + * The given caller has disconnected. */ signals[SIG_CALLER_REMOVED] = g_signal_new ("caller-removed", G_TYPE_FROM_CLASS (klass), @@ -361,6 +389,41 @@ caller_added), NULL, NULL, NULL, G_TYPE_NONE, 2, G_TYPE_INT, G_TYPE_SOCKET_ADDRESS); + /** + * GstSRTSrc::caller-rejected: + * @gstsrtsrc: the srtsrc element that emitted this signal + * @addr: the #GSocketAddress that describes the client socket + * @stream_id: the stream Id to which the caller wants to connect + * + * A caller's connection to srtsrc in listener mode has been rejected. 
+ * + * Since: 1.20 + * + */ + signals[SIG_CALLER_REJECTED] = + g_signal_new ("caller-rejected", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, G_STRUCT_OFFSET (GstSRTSrcClass, caller_rejected), + NULL, NULL, NULL, G_TYPE_NONE, 2, G_TYPE_SOCKET_ADDRESS, G_TYPE_STRING); + + /** + * GstSRTSrc::caller-connecting: + * @gstsrtsrc: the srtsrc element that emitted this signal + * @addr: the #GSocketAddress that describes the client socket + * @stream_id: the stream Id to which the caller wants to connect + * + * Whether to accept or reject a caller's connection to srtsrc in listener mode. + * The caller's connection is rejected if the callback returns FALSE, else + * the connection is accepted. + * + * Since: 1.20 + * + */ + signals[SIG_CALLER_CONNECTING] = + g_signal_new ("caller-connecting", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, G_STRUCT_OFFSET (GstSRTSrcClass, caller_connecting), + src_authentication_accumulator, NULL, NULL, G_TYPE_BOOLEAN, + 2, G_TYPE_SOCKET_ADDRESS, G_TYPE_STRING); + gst_srt_object_install_properties_helper (gobject_class); gst_element_class_add_static_pad_template (gstelement_class, &src_template);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srt/gstsrtsrc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/gstsrtsrc.h
Changed
@@ -58,6 +58,10 @@ void (*caller_added) (GstSRTSrc *self, int sock, GSocketAddress * addr); void (*caller_removed) (GstSRTSrc *self, int sock, GSocketAddress * addr); + void (*caller_rejected) (GstSRTSrc *self, GSocketAddress * peer_address, + const gchar * stream_id, gpointer data); + gboolean (*caller_connecting) (GstSRTSrc *self, GSocketAddress * peer_address, + const gchar * stream_id, gpointer data); }; GType gst_srt_src_get_type (void);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srt/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/srt/meson.build
Changed
@@ -1,5 +1,7 @@ srt_sources = [ 'gstsrt.c', + 'gstsrtelement.c', + 'gstsrtplugin.c', 'gstsrtobject.c', 'gstsrtsink.c', 'gstsrtsrc.c'
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srtp/gstsrtp.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtp.c
Changed
@@ -297,26 +297,3 @@ return size; } - -static gboolean -plugin_init (GstPlugin * plugin) -{ - srtp_init (); - - if (!gst_srtp_enc_plugin_init (plugin)) - return FALSE; - - if (!gst_srtp_dec_plugin_init (plugin)) - return FALSE; - - gst_type_mark_as_plugin_api (GST_TYPE_SRTP_AUTH_TYPE, 0); - gst_type_mark_as_plugin_api (GST_TYPE_SRTP_CIPHER_TYPE, 0); - - return TRUE; -} - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - srtp, - "GStreamer SRTP", - plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srtp/gstsrtpdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpdec.c
Changed
@@ -115,8 +115,8 @@ * */ +#include "gstsrtpelements.h" #include "gstsrtpdec.h" - #include <gst/rtp/gstrtpbuffer.h> #include <string.h> @@ -177,7 +177,11 @@ static guint gst_srtp_dec_signals[LAST_SIGNAL] = { 0 }; -G_DEFINE_TYPE (GstSrtpDec, gst_srtp_dec, GST_TYPE_ELEMENT); +G_DEFINE_TYPE_WITH_CODE (GstSrtpDec, gst_srtp_dec, GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (gst_srtp_dec_debug, "srtpdec", 0, "SRTP dec"); + ); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtpdec, "srtpdec", GST_RANK_NONE, + GST_TYPE_SRTP_DEC, srtp_element_init (plugin)); static void gst_srtp_dec_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -1570,17 +1574,3 @@ } return res; } - - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_srtp_dec_plugin_init (GstPlugin * srtpdec) -{ - GST_DEBUG_CATEGORY_INIT (gst_srtp_dec_debug, "srtpdec", 0, "SRTP dec"); - - return gst_element_register (srtpdec, "srtpdec", GST_RANK_NONE, - GST_TYPE_SRTP_DEC); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srtp/gstsrtpdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpdec.h
Changed
@@ -98,7 +98,6 @@ GType gst_srtp_dec_get_type (void); -gboolean gst_srtp_dec_plugin_init (GstPlugin * plugin); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpelement.c
Added
@@ -0,0 +1,41 @@ +/* + * GStreamer - GStreamer SRTP encoder and decoder + * + * Copyright 2009-2013 Collabora Ltd. + * @author: Gabriel Millaire <gabriel.millaire@collabora.co.uk> + * @author: Olivier Crete <olivier.crete@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 59 Temple Place - Suite 330, + * Boston, MA 02111-1307, USA. + */ + + +#define GLIB_DISABLE_DEPRECATION_WARNINGS + +#include "gstsrtpelements.h" + + +void +srtp_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { + srtp_init (); + gst_type_mark_as_plugin_api (GST_TYPE_SRTP_AUTH_TYPE, 0); + gst_type_mark_as_plugin_api (GST_TYPE_SRTP_CIPHER_TYPE, 0); + g_once_init_leave (&res, TRUE); + } +}
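The new `srtp_element_init()` above wraps `srtp_init()` and the `gst_type_mark_as_plugin_api()` calls in `g_once_init_enter()`/`g_once_init_leave()`, so the shared setup runs exactly once no matter which of srtpenc/srtpdec registers first. A simplified, single-threaded C sketch of that latch follows (the GLib pair additionally provides the atomic, thread-safe semantics this sketch omits):

```c
#include <assert.h>

/* Counts how often the guarded body actually executed; stands in for
 * srtp_init() and the type registrations in the real code. */
int init_runs = 0;

void
element_init_once (void)
{
  static int done = 0;          /* the latch; g_once uses a gsize + atomics */

  if (!done) {
    init_runs++;                /* one-time setup goes here */
    done = 1;
  }
}
```

This is why the per-element `GST_ELEMENT_REGISTER_DEFINE_WITH_CODE` blocks in srtpenc.c and srtpdec.c can both call `srtp_element_init()` without double-initializing libsrtp.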
View file
gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpelements.h
Added
@@ -0,0 +1,63 @@ +/* + * GStreamer - GStreamer SRTP encoder + * + * Copyright 2011-2013 Collabora Ltd. + * @author: Olivier Crete <olivier.crete@collabora.com> + * + * Permission is hereby granted, free of charge, to any person obtaining a + * copy of this software and associated documentation files (the "Software"), + * to deal in the Software without restriction, including without limitation + * the rights to use, copy, modify, merge, publish, distribute, sublicense, + * and/or sell copies of the Software, and to permit persons to whom the + * Software is furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + * DEALINGS IN THE SOFTWARE. + * + * Alternatively, the contents of this file may be used under the + * GNU Lesser General Public License Version 2.1 (the "LGPL"), in + * which case the following provisions apply instead of the ones + * mentioned above: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 59 Temple Place - Suite 330, + * Boston, MA 02111-1307, USA. + */ +#ifndef __GST_SRTP_ELEMENTS_H__ +#define __GST_SRTP_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstsrtp.h" +#include "gstsrtpenums.h" +#include "gstsrtp-enumtypes.h" + +#include <gst/gst.h> + +void srtp_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (srtpdec); +GST_ELEMENT_REGISTER_DECLARE (srtpenc); + +#endif /* __GST_SRTP_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srtp/gstsrtpenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpenc.c
Changed
@@ -106,6 +106,7 @@ * will be added to every buffer. */ +#include "gstsrtpelements.h" #include "gstsrtpenc.h" #include <gst/rtp/gstrtpbuffer.h> @@ -201,7 +202,10 @@ GST_STATIC_CAPS ("application/x-srtcp") ); -G_DEFINE_TYPE (GstSrtpEnc, gst_srtp_enc, GST_TYPE_ELEMENT); +G_DEFINE_TYPE_WITH_CODE (GstSrtpEnc, gst_srtp_enc, GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (gst_srtp_enc_debug, "srtpenc", 0, "SRTP Enc");); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (srtpenc, "srtpenc", GST_RANK_NONE, + GST_TYPE_SRTP_ENC, srtp_element_init (plugin)); static guint gst_srtp_enc_signals[LAST_SIGNAL] = { 0 }; @@ -1492,16 +1496,3 @@ { return gst_srtp_enc_sink_event (pad, parent, event, TRUE); } - -/* entry point to initialize the plug-in - * initialize the plug-in itself - * register the element factories and other features - */ -gboolean -gst_srtp_enc_plugin_init (GstPlugin * srtpenc) -{ - GST_DEBUG_CATEGORY_INIT (gst_srtp_enc_debug, "srtpenc", 0, "SRTP Enc"); - - return gst_element_register (srtpenc, "srtpenc", GST_RANK_NONE, - GST_TYPE_SRTP_ENC); -}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srtp/gstsrtpenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpenc.h
Changed
@@ -95,8 +95,6 @@ GType gst_srtp_enc_get_type (void); -gboolean gst_srtp_enc_plugin_init (GstPlugin * plugin); - G_END_DECLS #endif /* __GST_SRTPENC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/ext/srtp/gstsrtpplugin.c
Added
@@ -0,0 +1,45 @@ +/* + * GStreamer - GStreamer SRTP encoder and decoder + * + * Copyright 2009-2013 Collabora Ltd. + * @author: Gabriel Millaire <gabriel.millaire@collabora.co.uk> + * @author: Olivier Crete <olivier.crete@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 59 Temple Place - Suite 330, + * Boston, MA 02111-1307, USA. + */ + + +#define GLIB_DISABLE_DEPRECATION_WARNINGS + +#include "gstsrtpelements.h" + + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (srtpenc, plugin); + ret |= GST_ELEMENT_REGISTER (srtpdec, plugin); + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + srtp, + "GStreamer SRTP", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/srtp/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/srtp/meson.build
Changed
@@ -1,5 +1,7 @@ srtp_sources = [ 'gstsrtp.c', + 'gstsrtpelement.c', + 'gstsrtpplugin.c', 'gstsrtpdec.c', 'gstsrtpenc.c', ]
View file
gst-plugins-bad-1.18.6.tar.xz/ext/teletextdec/gstteletextdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/teletextdec/gstteletextdec.c
Changed
@@ -215,8 +215,7 @@ g_object_class_install_property (gobject_class, PROP_SUBS_TEMPLATE, g_param_spec_string ("subtitles-template", "Subtitles output template", "Output template used to print each one of the subtitles lines", - g_strescape ("%s\n", NULL), - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + "%s\\n", G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_FONT_DESCRIPTION, g_param_spec_string ("font-description", "Pango font description", @@ -540,7 +539,7 @@ n_lines = teletext->frame->current_slice - teletext->frame->sliced_begin; GST_LOG_OBJECT (teletext, "Completed frame, decoding new %d lines", n_lines); - s = g_memdup (teletext->frame->sliced_begin, + s = g_memdup2 (teletext->frame->sliced_begin, n_lines * sizeof (vbi_sliced)); vbi_decode (teletext->decoder, s, n_lines, teletext->last_ts); /* From vbi_decode():
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ttml/gstttmlelement.c
Added
@@ -0,0 +1,42 @@ +/* GStreamer + * Copyright (C) <1999> Erik Walthinsen <omega@cse.ogi.edu> + * Copyright (C) 2004 Ronald S. Bultje <rbultje@ronald.bitfreak.net> + * Copyright (C) 2006 Tim-Philipp Müller <tim centricular net> + * Copyright (C) <2015> British Broadcasting Corporation <dash@rd.bbc.co.uk> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstttmlelements.h" + + +GST_DEBUG_CATEGORY (ttmlrender_debug); + +void +ttml_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { + gst_plugin_add_dependency_simple (plugin, "GST_TTML_AUTOPLUG", NULL, NULL, + GST_PLUGIN_DEPENDENCY_FLAG_NONE); + g_once_init_leave (&res, TRUE); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/ttml/gstttmlelements.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_TTML_ELEMENTS_H__ +#define __GST_TTML_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void ttml_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (ttmlparse); +GST_ELEMENT_REGISTER_DECLARE (ttmlrender); + +#endif /* __GST_TTML_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ttml/gstttmlparse.c -> gst-plugins-bad-1.20.1.tar.xz/ext/ttml/gstttmlparse.c
Changed
@@ -50,12 +50,14 @@ #include <sys/types.h> #include <glib.h> +#include "gstttmlelements.h" #include "gstttmlparse.h" #include "ttmlparse.h" -GST_DEBUG_CATEGORY_EXTERN (ttmlparse_debug); +GST_DEBUG_CATEGORY (ttmlparse_debug); #define GST_CAT_DEFAULT ttmlparse_debug + #define DEFAULT_ENCODING NULL static GstStaticPadTemplate sink_templ = GST_STATIC_PAD_TEMPLATE ("sink", @@ -82,9 +84,11 @@ static GstFlowReturn gst_ttml_parse_chain (GstPad * sinkpad, GstObject * parent, GstBuffer * buf); +static gboolean gst_element_ttmlparse_init (GstPlugin * plugin); #define gst_ttml_parse_parent_class parent_class G_DEFINE_TYPE (GstTtmlParse, gst_ttml_parse, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (ttmlparse, gst_element_ttmlparse_init); static void gst_ttml_parse_dispose (GObject * object) @@ -596,3 +600,21 @@ return ret; } + +static gboolean +gst_element_ttmlparse_init (GstPlugin * plugin) +{ + guint rank = GST_RANK_NONE; + + ttml_element_init (plugin); + + GST_DEBUG_CATEGORY_INIT (ttmlparse_debug, "ttmlparse", 0, "TTML parser"); + + /* We don't want this autoplugged by default yet for now */ + if (g_getenv ("GST_TTML_AUTOPLUG")) { + GST_INFO_OBJECT (plugin, "Registering ttml elements with primary rank."); + rank = GST_RANK_PRIMARY; + } + + return gst_element_register (plugin, "ttmlparse", rank, GST_TYPE_TTML_PARSE); +}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ttml/gstttmlplugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/ttml/gstttmlplugin.c
Changed
@@ -24,35 +24,18 @@ #include <config.h> #endif -#include "gstttmlparse.h" -#include "gstttmlrender.h" +#include "gstttmlelements.h" -GST_DEBUG_CATEGORY (ttmlparse_debug); -GST_DEBUG_CATEGORY (ttmlrender_debug); static gboolean plugin_init (GstPlugin * plugin) { - guint rank = GST_RANK_NONE; + gboolean ret = FALSE; - /* We don't want this autoplugged by default yet for now */ - if (g_getenv ("GST_TTML_AUTOPLUG")) { - GST_INFO_OBJECT (plugin, "Registering ttml elements with primary rank."); - rank = GST_RANK_PRIMARY; - } + ret |= GST_ELEMENT_REGISTER (ttmlparse, plugin); + ret |= GST_ELEMENT_REGISTER (ttmlrender, plugin); - gst_plugin_add_dependency_simple (plugin, "GST_TTML_AUTOPLUG", NULL, NULL, - GST_PLUGIN_DEPENDENCY_FLAG_NONE); - - if (!gst_element_register (plugin, "ttmlparse", rank, GST_TYPE_TTML_PARSE)) - return FALSE; - if (!gst_element_register (plugin, "ttmlrender", rank, GST_TYPE_TTML_RENDER)) - return FALSE; - - GST_DEBUG_CATEGORY_INIT (ttmlparse_debug, "ttmlparse", 0, "TTML parser"); - GST_DEBUG_CATEGORY_INIT (ttmlrender_debug, "ttmlrender", 0, "TTML renderer"); - - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ttml/gstttmlrender.c -> gst-plugins-bad-1.20.1.tar.xz/ext/ttml/gstttmlrender.c
Changed
@@ -48,6 +48,7 @@ #include <string.h> #include <math.h> +#include "gstttmlelements.h" #include "gstttmlrender.h" #include "subtitle.h" #include "subtitlemeta.h" @@ -196,6 +197,7 @@ images, GstTtmlDirection direction); static gboolean gst_ttml_render_color_is_transparent (GstSubtitleColor * color); +static gboolean gst_element_ttmlrender_init (GstPlugin * plugin); GType gst_ttml_render_get_type (void) @@ -222,6 +224,8 @@ return type; } +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (ttmlrender, gst_element_ttmlrender_init); + static void gst_ttml_render_base_init (gpointer g_class) { @@ -3059,3 +3063,22 @@ return ret; } + +static gboolean +gst_element_ttmlrender_init (GstPlugin * plugin) +{ + guint rank = GST_RANK_NONE; + + ttml_element_init (plugin); + + GST_DEBUG_CATEGORY_INIT (ttmlrender_debug, "ttmlrender", 0, "TTML renderer"); + + /* We don't want this autoplugged by default yet for now */ + if (g_getenv ("GST_TTML_AUTOPLUG")) { + GST_INFO_OBJECT (plugin, "Registering ttml elements with primary rank."); + rank = GST_RANK_PRIMARY; + } + + return gst_element_register (plugin, "ttmlrender", rank, + GST_TYPE_TTML_RENDER); +}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/ttml/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/ttml/meson.build
Changed
@@ -10,6 +10,7 @@ 'gstttmlparse.c', 'ttmlparse.c', 'gstttmlrender.c', + 'gstttmlelement.c', 'gstttmlplugin.c'], c_args : gst_plugins_bad_args, include_directories : [configinc],
View file
gst-plugins-bad-1.18.6.tar.xz/ext/voaacenc/gstvoaac.c -> gst-plugins-bad-1.20.1.tar.xz/ext/voaacenc/gstvoaac.c
Changed
@@ -26,8 +26,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "voaacenc", - GST_RANK_SECONDARY, GST_TYPE_VOAACENC); + return GST_ELEMENT_REGISTER (voaacenc, plugin); }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/voaacenc/gstvoaacenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/voaacenc/gstvoaacenc.c
Changed
@@ -107,6 +107,8 @@ GstBuffer * in_buf); G_DEFINE_TYPE (GstVoAacEnc, gst_voaacenc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (voaacenc, "voaacenc", + GST_RANK_SECONDARY, GST_TYPE_VOAACENC); static void gst_voaacenc_set_property (GObject * object, guint prop_id,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/voaacenc/gstvoaacenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/voaacenc/gstvoaacenc.h
Changed
@@ -70,6 +70,8 @@ GType gst_voaacenc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (voaacenc); + G_END_DECLS #endif /* __GST_VOAACENC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/voamrwbenc/gstvoamrwb.c -> gst-plugins-bad-1.20.1.tar.xz/ext/voamrwbenc/gstvoamrwb.c
Changed
@@ -26,11 +26,9 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "voamrwbenc", - GST_RANK_SECONDARY, GST_TYPE_VOAMRWBENC); + return GST_ELEMENT_REGISTER (voamrwbenc, plugin); } - GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, voamrwbenc,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/voamrwbenc/gstvoamrwbenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/voamrwbenc/gstvoamrwbenc.c
Changed
@@ -115,6 +115,8 @@ GstBuffer * in_buf); G_DEFINE_TYPE (GstVoAmrWbEnc, gst_voamrwbenc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (voamrwbenc, "voamrwbenc", + GST_RANK_SECONDARY, GST_TYPE_VOAMRWBENC); static void gst_voamrwbenc_set_property (GObject * object, guint prop_id,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/voamrwbenc/gstvoamrwbenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/voamrwbenc/gstvoamrwbenc.h
Changed
@@ -58,6 +58,8 @@ GType gst_voamrwbenc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (voamrwbenc); + G_END_DECLS #endif /* __GST_VOAMRWBENC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/gstvulkan.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/gstvulkan.c
Changed
@@ -36,50 +36,29 @@ #include "vkdownload.h" #include "vkviewconvert.h" #include "vkdeviceprovider.h" +#include "gstvulkanelements.h" -#define GST_CAT_DEFAULT gst_vulkan_debug -GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_vulkan_debug, "vulkan", 0, "vulkan"); + gboolean ret = FALSE; - if (!gst_element_register (plugin, "vulkansink", - GST_RANK_NONE, GST_TYPE_VULKAN_SINK)) { - return FALSE; - } + ret |= GST_DEVICE_PROVIDER_REGISTER (vulkandeviceprovider, plugin); - if (!gst_element_register (plugin, "vulkanupload", - GST_RANK_NONE, GST_TYPE_VULKAN_UPLOAD)) { - return FALSE; - } + ret |= GST_ELEMENT_REGISTER (vulkansink, plugin); - if (!gst_element_register (plugin, "vulkandownload", - GST_RANK_NONE, GST_TYPE_VULKAN_DOWNLOAD)) { - return FALSE; - } + ret |= GST_ELEMENT_REGISTER (vulkanupload, plugin); - if (!gst_element_register (plugin, "vulkancolorconvert", - GST_RANK_NONE, GST_TYPE_VULKAN_COLOR_CONVERT)) { - return FALSE; - } + ret |= GST_ELEMENT_REGISTER (vulkandownload, plugin); - if (!gst_element_register (plugin, "vulkanimageidentity", - GST_RANK_NONE, GST_TYPE_VULKAN_IMAGE_IDENTITY)) { - return FALSE; - } + ret |= GST_ELEMENT_REGISTER (vulkancolorconvert, plugin); - if (!gst_element_register (plugin, "vulkanviewconvert", - GST_RANK_NONE, GST_TYPE_VULKAN_VIEW_CONVERT)) { - return FALSE; - } + ret |= GST_ELEMENT_REGISTER (vulkanimageidentity, plugin); - if (!gst_device_provider_register (plugin, "vulkandeviceprovider", - GST_RANK_MARGINAL, GST_TYPE_VULKAN_DEVICE_PROVIDER)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (vulkanviewconvert, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/gstvulkanelement.c
Added
@@ -0,0 +1,46 @@ +/* + * GStreamer + * Copyright (C) 2015 Matthew Waters <matthew@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:plugin-vulkan + * @title: vulkan + * + * Cross-platform Vulkan plugin. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvulkanelements.h" + +#define GST_CAT_DEFAULT gst_vulkan_debug +GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); + +void +vulkan_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (gst_vulkan_debug, "vulkan", 0, "vulkan"); + g_once_init_leave (&res, TRUE); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/gstvulkanelements.h
Added
@@ -0,0 +1,41 @@ +/* GStreamer + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_VULKAN_ELEMENTS_H__ +#define __GST_VULKAN_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void vulkan_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (vulkancolorconvert); +GST_ELEMENT_REGISTER_DECLARE (vulkandownload); +GST_ELEMENT_REGISTER_DECLARE (vulkanimageidentity); +GST_ELEMENT_REGISTER_DECLARE (vulkansink); +GST_ELEMENT_REGISTER_DECLARE (vulkanupload); +GST_ELEMENT_REGISTER_DECLARE (vulkanviewconvert); +GST_DEVICE_PROVIDER_REGISTER_DECLARE (vulkandeviceprovider); + + +#endif /* __GST_VULKAN_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/meson.build
Changed
@@ -17,6 +17,7 @@ vulkan_sources = [ 'gstvulkan.c', + 'gstvulkanelement.c', 'vkcolorconvert.c', 'vkdownload.c', 'vkdeviceprovider.c',
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vkcolorconvert.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vkcolorconvert.c
Changed
@@ -43,6 +43,8 @@ #include "shaders/rgb_to_yuy2.frag.h" #include "shaders/rgb_to_nv12.frag.h" +#include "gstvulkanelements.h" + GST_DEBUG_CATEGORY (gst_debug_vulkan_color_convert); #define GST_CAT_DEFAULT gst_debug_vulkan_color_convert @@ -795,6 +797,8 @@ GST_TYPE_VULKAN_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_debug_vulkan_color_convert, "vulkancolorconvert", 0, "Vulkan Color Convert")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vulkancolorconvert, "vulkancolorconvert", + GST_RANK_NONE, GST_TYPE_VULKAN_COLOR_CONVERT, vulkan_element_init (plugin)); struct yuv_info {
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vkdeviceprovider.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vkdeviceprovider.c
Changed
@@ -23,6 +23,7 @@ #include "config.h" #endif +#include "gstvulkanelements.h" #include "vkdeviceprovider.h" #include <string.h> @@ -39,7 +40,10 @@ G_DEFINE_TYPE_WITH_CODE (GstVulkanDeviceProvider, gst_vulkan_device_provider, GST_TYPE_DEVICE_PROVIDER, GST_DEBUG_CATEGORY_INIT (GST_CAT_DEFAULT, - "vulkandevice", 0, "Vulkan Device");); + "vulkandevice", 0, "Vulkan Device"); + ); +GST_DEVICE_PROVIDER_REGISTER_DEFINE (vulkandeviceprovider, + "vulkandeviceprovider", GST_RANK_MARGINAL, GST_TYPE_VULKAN_DEVICE_PROVIDER); static void gst_vulkan_device_provider_finalize (GObject * object); static void gst_vulkan_device_provider_set_property (GObject * object,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vkdownload.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vkdownload.c
Changed
@@ -31,6 +31,7 @@ #include <string.h> +#include "gstvulkanelements.h" #include "vkdownload.h" GST_DEBUG_CATEGORY (gst_debug_vulkan_download); @@ -470,6 +471,8 @@ G_DEFINE_TYPE_WITH_CODE (GstVulkanDownload, gst_vulkan_download, GST_TYPE_BASE_TRANSFORM, GST_DEBUG_CATEGORY_INIT (gst_debug_vulkan_download, "vulkandownload", 0, "Vulkan Downloader")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vulkandownload, "vulkandownload", + GST_RANK_NONE, GST_TYPE_VULKAN_DOWNLOAD, vulkan_element_init (plugin)); static void gst_vulkan_download_class_init (GstVulkanDownloadClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vkimageidentity.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vkimageidentity.c
Changed
@@ -31,6 +31,7 @@ #include <string.h> +#include "gstvulkanelements.h" #include "vkimageidentity.h" #include "shaders/identity.vert.h" @@ -83,6 +84,9 @@ GST_TYPE_VULKAN_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_debug_vulkan_image_identity, "vulkanimageidentity", 0, "Vulkan Image identity")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vulkanimageidentity, + "vulkanimageidentity", GST_RANK_NONE, GST_TYPE_VULKAN_IMAGE_IDENTITY, + vulkan_element_init (plugin)); static void gst_vulkan_image_identity_class_init (GstVulkanImageIdentityClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vksink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vksink.c
Changed
@@ -32,6 +32,7 @@ //#include <gst/video/videooverlay.h> +#include "gstvulkanelements.h" #include "vksink.h" GST_DEBUG_CATEGORY (gst_debug_vulkan_sink); @@ -108,6 +109,8 @@ gst_vulkan_sink_video_overlay_init); G_IMPLEMENT_INTERFACE (GST_TYPE_NAVIGATION, gst_vulkan_sink_navigation_interface_init)); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vulkansink, "vulkansink", GST_RANK_NONE, + GST_TYPE_VULKAN_SINK, vulkan_element_init (plugin)); static void gst_vulkan_sink_class_init (GstVulkanSinkClass * klass) @@ -143,7 +146,7 @@ GST_TYPE_VULKAN_DEVICE, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); gst_element_class_set_metadata (element_class, "Vulkan video sink", - "Sink/Video", "A videosink based on OpenGL", + "Sink/Video", "A videosink based on Vulkan", "Matthew Waters <matthew@centricular.com>"); gst_element_class_add_static_pad_template (element_class,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vkupload.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vkupload.c
Changed
@@ -30,7 +30,7 @@ #endif #include <string.h> - +#include "gstvulkanelements.h" #include "vkupload.h" GST_DEBUG_CATEGORY (gst_debug_vulkan_upload); @@ -1164,6 +1164,8 @@ G_DEFINE_TYPE_WITH_CODE (GstVulkanUpload, gst_vulkan_upload, GST_TYPE_BASE_TRANSFORM, GST_DEBUG_CATEGORY_INIT (gst_debug_vulkan_upload, "vulkanupload", 0, "Vulkan Uploader")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vulkanupload, "vulkanupload", + GST_RANK_NONE, GST_TYPE_VULKAN_UPLOAD, vulkan_element_init (plugin)); static void gst_vulkan_upload_class_init (GstVulkanUploadClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/vulkan/vkviewconvert.c -> gst-plugins-bad-1.20.1.tar.xz/ext/vulkan/vkviewconvert.c
Changed
@@ -31,6 +31,7 @@ #include <string.h> +#include "gstvulkanelements.h" #include "vkviewconvert.h" #include "shaders/identity.vert.h" @@ -561,6 +562,8 @@ GST_TYPE_VULKAN_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_debug_vulkan_view_convert, "vulkanviewconvert", 0, "Vulkan View Convert")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vulkanviewconvert, "vulkanviewconvert", + GST_RANK_NONE, GST_TYPE_VULKAN_VIEW_CONVERT, vulkan_element_init (plugin)); static void gst_vulkan_view_convert_class_init (GstVulkanViewConvertClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/gstwaylandsink.c -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/gstwaylandsink.c
Changed
@@ -122,6 +122,8 @@ gst_wayland_sink_videooverlay_init) G_IMPLEMENT_INTERFACE (GST_TYPE_WAYLAND_VIDEO, gst_wayland_sink_waylandvideo_init)); +GST_ELEMENT_REGISTER_DEFINE (waylandsink, "waylandsink", GST_RANK_MARGINAL, + GST_TYPE_WAYLAND_SINK); /* A tiny GstVideoBufferPool subclass that modify the options to remove * VideoAlignment. To support VideoAlignment we would need to pass the padded @@ -404,6 +406,14 @@ gst_wl_window_render (sink->window, NULL, NULL); } } + + g_mutex_lock (&sink->render_lock); + if (sink->callback) { + wl_callback_destroy (sink->callback); + sink->callback = NULL; + } + sink->redraw_pending = FALSE; + g_mutex_unlock (&sink->render_lock); break; case GST_STATE_CHANGE_READY_TO_NULL: g_mutex_lock (&sink->display_lock); @@ -417,12 +427,9 @@ * to avoid requesting them again from the application if/when we are * restarted (GstVideoOverlay behaves like that in other sinks) */ - if (sink->display && !sink->window) { /* -> the window was toplevel */ + if (sink->display && !sink->window) /* -> the window was toplevel */ g_clear_object (&sink->display); - g_mutex_lock (&sink->render_lock); - sink->redraw_pending = FALSE; - g_mutex_unlock (&sink->render_lock); - } + g_mutex_unlock (&sink->display_lock); g_clear_object (&sink->pool); break; @@ -637,9 +644,12 @@ g_mutex_lock (&sink->render_lock); sink->redraw_pending = FALSE; - g_mutex_unlock (&sink->render_lock); - wl_callback_destroy (callback); + if (sink->callback) { + wl_callback_destroy (callback); + sink->callback = NULL; + } + g_mutex_unlock (&sink->render_lock); } static const struct wl_callback_listener frame_callback_listener = { @@ -660,6 +670,7 @@ sink->redraw_pending = TRUE; callback = wl_surface_frame (surface); + sink->callback = callback; wl_callback_add_listener (callback, &frame_callback_listener, sink); if (G_UNLIKELY (sink->video_info_changed && !redraw)) { @@ -818,7 +829,7 @@ if (G_UNLIKELY (!wbuf)) goto no_wl_buffer_shm; - gst_buffer_add_wl_buffer (to_render, wbuf, sink->display); + wlbuffer = gst_buffer_add_wl_buffer (to_render, wbuf, sink->display); } if (!gst_video_frame_map (&dst, &sink->video_info, to_render, @@ -842,12 +853,13 @@ if (!wbuf) goto no_wl_buffer; - gst_buffer_add_wl_buffer (buffer, wbuf, sink->display); + wlbuffer = gst_buffer_add_wl_buffer (buffer, wbuf, sink->display); to_render = buffer; render: /* drop double rendering */ - if (G_UNLIKELY (to_render == sink->last_buffer)) { + if (G_UNLIKELY (wlbuffer == + gst_buffer_get_wl_buffer (sink->display, sink->last_buffer))) { GST_LOG_OBJECT (sink, "Buffer already being rendered"); goto done; } @@ -1052,8 +1064,7 @@ gst_wl_shm_allocator_register (); - return gst_element_register (plugin, "waylandsink", GST_RANK_MARGINAL, - GST_TYPE_WAYLAND_SINK); + return GST_ELEMENT_REGISTER (waylandsink, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/gstwaylandsink.h -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/gstwaylandsink.h
Changed
@@ -67,6 +67,8 @@ gboolean redraw_pending; GMutex render_lock; GstBuffer *last_buffer; + + struct wl_callback *callback; }; struct _GstWaylandSinkClass @@ -76,6 +78,8 @@ GType gst_wayland_sink_get_type (void) G_GNUC_CONST; +GST_ELEMENT_REGISTER_DECLARE (waylandsink); + G_END_DECLS #endif /* __GST_WAYLAND_VIDEO_SINK_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/meson.build
Changed
@@ -11,7 +11,7 @@ libdrm_dep = dependency('libdrm', version: '>= 2.4.55', required:get_option('wayland')) if use_wayland - protocols_datadir = wl_protocol_dep.get_pkgconfig_variable('pkgdatadir') + protocols_datadir = wl_protocol_dep.get_variable('pkgdatadir') protocol_defs = [ ['/stable/viewporter/viewporter.xml', 'viewporter-protocol.c', 'viewporter-client-protocol.h'],
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/wlbuffer.c -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/wlbuffer.c
Changed
@@ -185,10 +185,18 @@ GstWlBuffer * gst_buffer_get_wl_buffer (GstWlDisplay * display, GstBuffer * gstbuffer) { - GstMemory *mem0 = gst_buffer_peek_memory (gstbuffer, 0); - GstWlBuffer *wlbuf = gst_wl_display_lookup_buffer (display, mem0); + GstMemory *mem0; + GstWlBuffer *wlbuf; + + if (!gstbuffer) + return NULL; + + mem0 = gst_buffer_peek_memory (gstbuffer, 0); + + wlbuf = gst_wl_display_lookup_buffer (display, mem0); if (wlbuf) wlbuf->current_gstbuffer = gstbuffer; + return wlbuf; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/wldisplay.c -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/wldisplay.c
Changed
@@ -214,7 +214,7 @@ if (g_strcmp0 (interface, "wl_compositor") == 0) { self->compositor = wl_registry_bind (registry, id, &wl_compositor_interface, - MIN (version, 3)); + MIN (version, 4)); } else if (g_strcmp0 (interface, "wl_subcompositor") == 0) { self->subcompositor = wl_registry_bind (registry, id, &wl_subcompositor_interface, 1);
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/wlshmallocator.c -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/wlshmallocator.c
Changed
@@ -49,16 +49,33 @@ /* TODO: make use of the allocation params, if necessary */ - /* allocate shm pool */ - snprintf (filename, 1024, "%s/%s-%d-%s", g_get_user_runtime_dir (), - "wayland-shm", init++, "XXXXXX"); - - fd = g_mkstemp (filename); - if (fd < 0) { - GST_ERROR_OBJECT (self, "opening temp file %s failed: %s", filename, - strerror (errno)); - return NULL; +#ifdef HAVE_MEMFD_CREATE + fd = memfd_create ("gst-wayland-shm", MFD_CLOEXEC | MFD_ALLOW_SEALING); + if (fd >= 0) { + /* We can add this seal before calling posix_fallocate(), as + * the file is currently zero-sized anyway. + * + * There is also no need to check for the return value, we + * couldn't do anything with it anyway. + */ + fcntl (fd, F_ADD_SEALS, F_SEAL_SHRINK); + } else +#endif + { + /* allocate shm pool */ + snprintf (filename, 1024, "%s/%s-%d-%s", g_get_user_runtime_dir (), + "wayland-shm", init++, "XXXXXX"); + + fd = g_mkstemp (filename); + if (fd < 0) { + GST_ERROR_OBJECT (self, "opening temp file %s failed: %s", filename, + strerror (errno)); + return NULL; + } + + unlink (filename); } + if (ftruncate (fd, size) < 0) { GST_ERROR_OBJECT (self, "ftruncate failed: %s", strerror (errno)); close (fd); @@ -84,8 +101,6 @@ * need it to release the miniobject lock */ gst_memory_unmap (mem, &info); - unlink (filename); - return mem; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/wlwindow.c -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/wlwindow.c
Changed
@@ -43,6 +43,8 @@
 static void gst_wl_window_finalize (GObject * gobject);
 
+static void gst_wl_window_update_borders (GstWlWindow * window);
+
 static void
 handle_xdg_toplevel_close (void *data, struct xdg_toplevel *xdg_toplevel)
 {
@@ -222,11 +224,7 @@
         window->video_surface);
   }
 
-  /* do not accept input */
-  region = wl_compositor_create_region (display->compositor);
-  wl_surface_set_input_region (window->area_surface, region);
-  wl_region_destroy (region);
-
+  /* never accept input events on the video surface */
   region = wl_compositor_create_region (display->compositor);
   wl_surface_set_input_region (window->video_surface, region);
   wl_region_destroy (region);
@@ -345,8 +343,14 @@
     struct wl_surface * parent, GMutex * render_lock)
 {
   GstWlWindow *window;
+  struct wl_region *region;
 
   window = gst_wl_window_new_internal (display, render_lock);
 
+  /* do not accept input events on the area surface when embedded */
+  region = wl_compositor_create_region (display->compositor);
+  wl_surface_set_input_region (window->area_surface, region);
+  wl_region_destroy (region);
+
   /* embed in parent */
   window->area_subsurface =
       wl_subcompositor_get_subsurface (display->subcompositor,
@@ -407,22 +411,9 @@
 
   wl_subsurface_set_position (window->video_subsurface, res.x, res.y);
 
-  if (commit) {
-    wl_surface_damage (window->video_surface_wrapper, 0, 0, res.w, res.h);
+  if (commit)
     wl_surface_commit (window->video_surface_wrapper);
-  }
-
-  if (gst_wl_window_is_toplevel (window)) {
-    struct wl_region *region;
-
-    region = wl_compositor_create_region (window->display->compositor);
-    wl_region_add (region, 0, 0, window->render_rectangle.w,
-        window->render_rectangle.h);
-    wl_surface_set_input_region (window->area_surface, region);
-    wl_region_destroy (region);
-  }
-
 
   /* this is saved for use in wl_surface_damage */
   window->video_rectangle = res;
 }
@@ -433,16 +424,14 @@
 
   /* Set area opaque */
   region = wl_compositor_create_region (window->display->compositor);
-  wl_region_add (region, 0, 0, window->render_rectangle.w,
-      window->render_rectangle.h);
+  wl_region_add (region, 0, 0, G_MAXINT32, G_MAXINT32);
   wl_surface_set_opaque_region (window->area_surface, region);
   wl_region_destroy (region);
 
   if (!GST_VIDEO_INFO_HAS_ALPHA (info)) {
     /* Set video opaque */
     region = wl_compositor_create_region (window->display->compositor);
-    wl_region_add (region, 0, 0, window->render_rectangle.w,
-        window->render_rectangle.h);
+    wl_region_add (region, 0, 0, G_MAXINT32, G_MAXINT32);
     wl_surface_set_opaque_region (window->video_surface, region);
     wl_region_destroy (region);
   }
@@ -464,22 +453,27 @@
 
   if (G_LIKELY (buffer)) {
     gst_wl_buffer_attach (buffer, window->video_surface_wrapper);
-    wl_surface_damage (window->video_surface_wrapper, 0, 0,
-        window->video_rectangle.w, window->video_rectangle.h);
+    wl_surface_damage_buffer (window->video_surface_wrapper, 0, 0, G_MAXINT32,
+        G_MAXINT32);
     wl_surface_commit (window->video_surface_wrapper);
+
+    if (!window->is_area_surface_mapped) {
+      gst_wl_window_update_borders (window);
+      wl_surface_commit (window->area_surface_wrapper);
+      window->is_area_surface_mapped = TRUE;
+    }
   } else {
     /* clear both video and parent surfaces */
     wl_surface_attach (window->video_surface_wrapper, NULL, 0, 0);
     wl_surface_commit (window->video_surface_wrapper);
     wl_surface_attach (window->area_surface_wrapper, NULL, 0, 0);
     wl_surface_commit (window->area_surface_wrapper);
+    window->is_area_surface_mapped = FALSE;
   }
 
   if (G_UNLIKELY (info)) {
     /* commit also the parent (area_surface) in order to change
      * the position of the video_subsurface */
-    wl_surface_damage (window->area_surface_wrapper, 0, 0,
-        window->render_rectangle.w, window->render_rectangle.h);
     wl_surface_commit (window->area_surface_wrapper);
     wl_subsurface_set_desync (window->video_subsurface);
   }
@@ -501,12 +495,19 @@
   GstWlBuffer *gwlbuf;
   GstAllocator *alloc;
 
-  if (window->no_border_update)
-    return;
+  if (window->display->viewporter) {
+    wp_viewport_set_destination (window->area_viewport,
+        window->render_rectangle.w, window->render_rectangle.h);
+
+    if (window->is_area_surface_mapped) {
+      /* The area_surface is already visible and only needed to get resized.
+       * We don't need to attach a new buffer and are done here. */
+      return;
+    }
+  }
 
   if (window->display->viewporter) {
     width = height = 1;
-    window->no_border_update = TRUE;
   } else {
     width = window->render_rectangle.w;
     height = window->render_rectangle.h;
@@ -527,6 +528,8 @@
       window->display, &info);
   gwlbuf = gst_buffer_add_wl_buffer (buf, wlbuf, window->display);
   gst_wl_buffer_attach (gwlbuf, window->area_surface_wrapper);
+  wl_surface_damage_buffer (window->area_surface_wrapper, 0, 0, G_MAXINT32,
+      G_MAXINT32);
 
   /* at this point, the GstWlBuffer keeps the buffer
    * alive and will free it on wl_buffer::release */
@@ -540,6 +543,10 @@
 {
   g_return_if_fail (window != NULL);
 
+  if (window->render_rectangle.x == x && window->render_rectangle.y == y &&
+      window->render_rectangle.w == w && window->render_rectangle.h == h)
+    return;
+
   window->render_rectangle.x = x;
   window->render_rectangle.y = y;
   window->render_rectangle.w = w;
@@ -549,11 +556,8 @@
   if (window->area_subsurface)
     wl_subsurface_set_position (window->area_subsurface, x, y);
 
-  /* change the size of the area */
-  if (window->area_viewport)
-    wp_viewport_set_destination (window->area_viewport, w, h);
-
-  gst_wl_window_update_borders (window);
+  if (window->is_area_surface_mapped)
+    gst_wl_window_update_borders (window);
 
   if (!window->configured)
     return;
@@ -563,7 +567,6 @@
     gst_wl_window_resize_video_surface (window, TRUE);
   }
 
-  wl_surface_damage (window->area_surface_wrapper, 0, 0, w, h);
   wl_surface_commit (window->area_surface_wrapper);
 
   if (window->video_width != 0)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wayland/wlwindow.h -> gst-plugins-bad-1.20.1.tar.xz/ext/wayland/wlwindow.h
Changed
@@ -68,10 +68,9 @@
   /* the size of the video in the buffers */
   gint video_width, video_height;
 
-  /* this will be set when viewporter is available and black background has
-   * already been set on the area_subsurface */
-  gboolean no_border_update;
-
+  /* when this is not set both the area_surface and the video_surface are not
+   * visible and certain steps should be skipped */
+  gboolean is_area_surface_mapped;
 };
 
 struct _GstWlWindowClass
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webp/gstwebp.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webp/gstwebp.c
Changed
@@ -31,10 +31,12 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gst_webp_dec_register (plugin);
-  gst_webp_enc_register (plugin);
+  gboolean ret = FALSE;
 
-  return TRUE;
+  ret |= GST_ELEMENT_REGISTER (webpdec, plugin);
+  ret |= GST_ELEMENT_REGISTER (webpenc, plugin);
+
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webp/gstwebpdec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webp/gstwebpdec.c
Changed
@@ -79,6 +79,8 @@
 #define gst_webp_dec_parent_class parent_class
 G_DEFINE_TYPE (GstWebPDec, gst_webp_dec, GST_TYPE_VIDEO_DECODER);
+GST_ELEMENT_REGISTER_DEFINE (webpdec, "webpdec",
+    GST_RANK_PRIMARY, GST_TYPE_WEBP_DEC);
 
 static void
 gst_webp_dec_class_init (GstWebPDecClass * klass)
@@ -489,10 +491,3 @@
 done:
   return ret;
 }
-
-gboolean
-gst_webp_dec_register (GstPlugin * plugin)
-{
-  return gst_element_register (plugin, "webpdec",
-      GST_RANK_PRIMARY, GST_TYPE_WEBP_DEC);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webp/gstwebpdec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webp/gstwebpdec.h
Changed
@@ -69,7 +69,8 @@
 };
 
 GType gst_webp_dec_get_type (void);
-gboolean gst_webp_dec_register (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (webpdec);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webp/gstwebpenc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webp/gstwebpenc.c
Changed
@@ -107,6 +107,8 @@
 #define gst_webp_enc_parent_class parent_class
 G_DEFINE_TYPE (GstWebpEnc, gst_webp_enc, GST_TYPE_VIDEO_ENCODER);
+GST_ELEMENT_REGISTER_DEFINE (webpenc, "webpenc",
+    GST_RANK_PRIMARY, GST_TYPE_WEBP_ENC);
 
 static void
 gst_webp_enc_class_init (GstWebpEncClass * klass)
@@ -398,10 +400,3 @@
   gst_video_codec_state_unref (enc->input_state);
   return TRUE;
 }
-
-gboolean
-gst_webp_enc_register (GstPlugin * plugin)
-{
-  return gst_element_register (plugin, "webpenc",
-      GST_RANK_PRIMARY, GST_TYPE_WEBP_ENC);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webp/gstwebpenc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webp/gstwebpenc.h
Changed
@@ -70,7 +70,8 @@
 };
 
 GType gst_webp_enc_get_type (void);
-gboolean gst_webp_enc_register (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (webpenc);
 
 G_END_DECLS

 #endif /* __GST_WEBPENC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/gstwebrtcbin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/gstwebrtcbin.c
Changed
@@ -29,7 +29,11 @@ #include "webrtcsdp.h" #include "webrtctransceiver.h" #include "webrtcdatachannel.h" -#include "sctptransport.h" +#include "webrtcsctptransport.h" + +#include "gst/webrtc/webrtc-priv.h" + +#include <gst/rtp/rtp.h> #include <stdio.h> #include <stdlib.h> @@ -53,11 +57,19 @@ #define ICE_LOCK(w) (g_mutex_lock (ICE_GET_LOCK(w))) #define ICE_UNLOCK(w) (g_mutex_unlock (ICE_GET_LOCK(w))) +#define DC_GET_LOCK(w) (&w->priv->dc_lock) +#define DC_LOCK(w) (g_mutex_lock (DC_GET_LOCK(w))) +#define DC_UNLOCK(w) (g_mutex_unlock (DC_GET_LOCK(w))) /* The extra time for the rtpstorage compared to the RTP jitterbuffer (in ms) */ #define RTPSTORAGE_EXTRA_TIME (50) -/* +#define DEFAULT_JB_LATENCY 200 + +/** + * SECTION: element-webrtcbin + * title: webrtcbin + * * This webrtcbin implements the majority of the W3's peerconnection API and * implementation guide where possible. Generating offers, answers and setting * local and remote SDP's are all supported. Both media descriptions and @@ -78,10 +90,10 @@ * configuration. Some cases are outlined below for a simple single * audio/video/data session: * - * - max-bundle (requires rtcp-muxing) uses a single transport for all + * - max-bundle uses a single transport for all * media/data transported. Renegotiation involves adding/removing the * necessary streams to the existing transports. - * - max-compat without rtcp-mux involves two TransportStream per media stream + * - max-compat involves two TransportStream per media stream * to transport the rtp and the rtcp packets and a single TransportStream for * all data channels. Each stream change involves modifying the associated * TransportStream/s as necessary. 
@@ -100,6 +112,9 @@ */ static void _update_need_negotiation (GstWebRTCBin * webrtc); +static GstPad *_connect_input_stream (GstWebRTCBin * webrtc, + GstWebRTCBinPad * pad); + #define GST_CAT_DEFAULT gst_webrtc_bin_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -200,17 +215,6 @@ G_DEFINE_TYPE (GstWebRTCBinPad, gst_webrtc_bin_pad, GST_TYPE_GHOST_PAD); static void -gst_webrtc_bin_pad_set_property (GObject * object, guint prop_id, - const GValue * value, GParamSpec * pspec) -{ - switch (prop_id) { - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } -} - -static void gst_webrtc_bin_pad_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { @@ -248,7 +252,6 @@ GObjectClass *gobject_class = (GObjectClass *) klass; gobject_class->get_property = gst_webrtc_bin_pad_get_property; - gobject_class->set_property = gst_webrtc_bin_pad_set_property; gobject_class->finalize = gst_webrtc_bin_pad_finalize; g_object_class_install_property (gobject_class, @@ -259,6 +262,35 @@ G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); } +static void +gst_webrtc_bin_pad_update_ssrc_event (GstWebRTCBinPad * wpad) +{ + if (wpad->received_caps) { + WebRTCTransceiver *trans = (WebRTCTransceiver *) wpad->trans; + GstPad *pad = GST_PAD (wpad); + + gst_event_take (&trans->ssrc_event, + gst_event_new_custom (GST_EVENT_CUSTOM_DOWNSTREAM_STICKY, + gst_structure_new ("GstWebRtcBinUpdateTos", "ssrc", G_TYPE_UINT, + trans->current_ssrc, NULL))); + + gst_pad_send_event (pad, gst_event_ref (trans->ssrc_event)); + } +} + +static GList * +_get_pending_sink_transceiver (GstWebRTCBin * webrtc, GstWebRTCBinPad * pad) +{ + GList *ret; + + for (ret = webrtc->priv->pending_sink_transceivers; ret; ret = ret->next) { + if (ret->data == pad) + break; + } + + return ret; +} + static gboolean gst_webrtcbin_sink_event (GstPad * pad, GstObject * parent, GstEvent * event) { @@ -271,12 +303,50 @@ gst_event_parse_caps (event, &caps); check_negotiation = 
(!wpad->received_caps - || gst_caps_is_equal (wpad->received_caps, caps)); + || !gst_caps_is_equal (wpad->received_caps, caps)); gst_caps_replace (&wpad->received_caps, caps); GST_DEBUG_OBJECT (parent, "On %" GST_PTR_FORMAT " checking negotiation? %u, caps %" GST_PTR_FORMAT, pad, check_negotiation, caps); + + if (check_negotiation) { + WebRTCTransceiver *trans = WEBRTC_TRANSCEIVER (wpad->trans); + const GstStructure *s; + + s = gst_caps_get_structure (caps, 0); + gst_structure_get_uint (s, "ssrc", &trans->current_ssrc); + gst_webrtc_bin_pad_update_ssrc_event (wpad); + } + + /* A remote description might have been set while the pad hadn't + * yet received caps, delaying the connection of the input stream + */ + PC_LOCK (webrtc); + if (wpad->trans) { + GST_OBJECT_LOCK (wpad->trans); + if (wpad->trans->current_direction == + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY + || wpad->trans->current_direction == + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV) { + GList *pending = _get_pending_sink_transceiver (webrtc, wpad); + + if (pending) { + GST_LOG_OBJECT (pad, "Connecting input stream to rtpbin with " + "transceiver %" GST_PTR_FORMAT " and caps %" GST_PTR_FORMAT, + wpad->trans, wpad->received_caps); + _connect_input_stream (webrtc, wpad); + gst_pad_remove_probe (GST_PAD (pad), wpad->block_id); + wpad->block_id = 0; + gst_object_unref (pending->data); + webrtc->priv->pending_sink_transceivers = + g_list_delete_link (webrtc->priv->pending_sink_transceivers, + pending); + } + } + GST_OBJECT_UNLOCK (wpad->trans); + } + PC_UNLOCK (webrtc); } else if (GST_EVENT_TYPE (event) == GST_EVENT_EOS) { check_negotiation = TRUE; } @@ -290,11 +360,116 @@ return gst_pad_event_default (pad, parent, event); } +static gboolean +gst_webrtcbin_sink_query (GstPad * pad, GstObject * parent, GstQuery * query) +{ + GstWebRTCBinPad *wpad = GST_WEBRTC_BIN_PAD (pad); + gboolean ret = FALSE; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_ACCEPT_CAPS: + GST_OBJECT_LOCK (wpad->trans); + if 
(wpad->trans->codec_preferences) { + GstCaps *caps; + + gst_query_parse_accept_caps (query, &caps); + + gst_query_set_accept_caps_result (query, + gst_caps_can_intersect (caps, wpad->trans->codec_preferences)); + ret = TRUE; + } + GST_OBJECT_UNLOCK (wpad->trans); + break; + + case GST_QUERY_CAPS: + { + GstCaps *codec_preferences = NULL; + + GST_OBJECT_LOCK (wpad->trans); + if (wpad->trans->codec_preferences) + codec_preferences = gst_caps_ref (wpad->trans->codec_preferences); + GST_OBJECT_UNLOCK (wpad->trans); + + if (codec_preferences) { + GstCaps *filter = NULL; + GstCaps *filter_prefs = NULL; + GstPad *target; + + gst_query_parse_caps (query, &filter); + + if (filter) { + filter_prefs = gst_caps_intersect_full (filter, codec_preferences, + GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (codec_preferences); + } else { + filter_prefs = codec_preferences; + } + + target = gst_ghost_pad_get_target (GST_GHOST_PAD (pad)); + if (target) { + GstCaps *result; + + result = gst_pad_query_caps (target, filter_prefs); + gst_query_set_caps_result (query, result); + gst_caps_unref (result); + + gst_object_unref (target); + } else { + gst_query_set_caps_result (query, filter_prefs); + } + + gst_caps_unref (filter_prefs); + ret = TRUE; + } + break; + } + default: + break; + } + + if (ret) + return TRUE; + + return gst_pad_query_default (pad, parent, query); +} + + static void gst_webrtc_bin_pad_init (GstWebRTCBinPad * pad) { } +static GstPadProbeReturn +webrtc_bin_pad_buffer_cb (GstPad * pad, GstPadProbeInfo * info, + gpointer user_data) +{ + GstWebRTCBinPad *wpad; + GstBuffer *buf; + GstRTPBuffer rtpbuf = GST_RTP_BUFFER_INIT; + + if (info->type & GST_PAD_PROBE_TYPE_BUFFER) { + buf = GST_PAD_PROBE_INFO_BUFFER (info); + } else { + GstBufferList *list; + + list = GST_PAD_PROBE_INFO_BUFFER_LIST (info); + buf = gst_buffer_list_get (list, 0); + } + + if (buf == NULL) + return GST_PAD_PROBE_OK; + + if (!gst_rtp_buffer_map (buf, GST_MAP_READ, &rtpbuf)) + return GST_PAD_PROBE_OK; + + wpad 
= GST_WEBRTC_BIN_PAD (pad); + wpad->last_ssrc = gst_rtp_buffer_get_ssrc (&rtpbuf); + + gst_rtp_buffer_unmap (&rtpbuf); + + return GST_PAD_PROBE_OK; +} + static GstWebRTCBinPad * gst_webrtc_bin_pad_new (const gchar * name, GstPadDirection direction) { @@ -314,6 +489,10 @@ gst_object_unref (template); gst_pad_set_event_function (GST_PAD (pad), gst_webrtcbin_sink_event); + gst_pad_set_query_function (GST_PAD (pad), gst_webrtcbin_sink_query); + + gst_pad_add_probe (GST_PAD (pad), GST_PAD_PROBE_TYPE_BUFFER | + GST_PAD_PROBE_TYPE_BUFFER_LIST, webrtc_bin_pad_buffer_cb, NULL, NULL); GST_DEBUG_OBJECT (pad, "new visible pad with direction %s", direction == GST_PAD_SRC ? "src" : "sink"); @@ -326,9 +505,6 @@ GST_DEBUG_CATEGORY_INIT (gst_webrtc_bin_debug, "webrtcbin", 0, "webrtcbin element");); -static GstPad *_connect_input_stream (GstWebRTCBin * webrtc, - GstWebRTCBinPad * pad); - enum { SIGNAL_0, @@ -368,7 +544,8 @@ PROP_BUNDLE_POLICY, PROP_ICE_TRANSPORT_POLICY, PROP_ICE_AGENT, - PROP_LATENCY + PROP_LATENCY, + PROP_SCTP_TRANSPORT, }; static guint gst_webrtc_bin_signals[LAST_SIGNAL] = { 0 }; @@ -412,18 +589,6 @@ g_array_append_val (webrtc->priv->ice_stream_map, item); } -typedef struct -{ - guint session_id; - gchar *mid; -} SessionMidItem; - -static void -clear_session_mid_item (SessionMidItem * item) -{ - g_free (item->mid); -} - typedef gboolean (*FindTransceiverFunc) (GstWebRTCRTPTransceiver * p1, gconstpointer data); @@ -453,6 +618,9 @@ static gboolean transceiver_match_for_mline (GstWebRTCRTPTransceiver * trans, guint * mline) { + if (trans->stopped) + return FALSE; + return trans->mline == *mline; } @@ -571,6 +739,7 @@ return channel->parent.id == *id; } +/* always called with dc_lock held */ static WebRTCDataChannel * _find_data_channel_for_id (GstWebRTCBin * webrtc, gint id) { @@ -622,21 +791,21 @@ typedef struct { GstPadDirection direction; - guint mlineindex; + guint mline; } MLineMatch; static gboolean pad_match_for_mline (GstWebRTCBinPad * pad, const MLineMatch * 
match) { return GST_PAD_DIRECTION (pad) == match->direction - && pad->mlineindex == match->mlineindex; + && pad->trans->mline == match->mline; } static GstWebRTCBinPad * _find_pad_for_mline (GstWebRTCBin * webrtc, GstPadDirection direction, - guint mlineindex) + guint mline) { - MLineMatch m = { direction, mlineindex }; + MLineMatch m = { direction, mline }; return _find_pad (webrtc, &m, (FindPadFunc) pad_match_for_mline); } @@ -749,14 +918,17 @@ static gboolean _execute_op (GstWebRTCBinTask * op) { + GstStructure *s; + PC_LOCK (op->webrtc); if (op->webrtc->priv->is_closed) { + PC_UNLOCK (op->webrtc); + if (op->promise) { GError *error = - g_error_new (GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_CLOSED, + g_error_new (GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INVALID_STATE, "webrtcbin is closed. aborting execution."); - GstStructure *s = - gst_structure_new ("application/x-gstwebrtcbin-promise-error", + GstStructure *s = gst_structure_new ("application/x-gst-promise", "error", G_TYPE_ERROR, error, NULL); gst_promise_reply (op->promise, s); @@ -768,10 +940,16 @@ goto out; } - op->op (op->webrtc, op->data); + s = op->op (op->webrtc, op->data); -out: PC_UNLOCK (op->webrtc); + + if (op->promise) + gst_promise_reply (op->promise, s); + else if (s) + gst_structure_free (s); + +out: return G_SOURCE_REMOVE; } @@ -845,11 +1023,8 @@ for (i = 0; i < webrtc->priv->transceivers->len; i++) { GstWebRTCRTPTransceiver *rtp_trans = g_ptr_array_index (webrtc->priv->transceivers, i); - WebRTCTransceiver *trans = WEBRTC_TRANSCEIVER (rtp_trans); - TransportStream *stream = trans->stream; - GstWebRTCICETransport *transport, *rtcp_transport; + GstWebRTCICETransport *transport; GstWebRTCICEConnectionState ice_state; - gboolean rtcp_mux = FALSE; if (rtp_trans->stopped) { GST_TRACE_OBJECT (webrtc, "transceiver %p stopped", rtp_trans); @@ -861,8 +1036,6 @@ continue; } - g_object_get (stream, "rtcp-mux", &rtcp_mux, NULL); - transport = webrtc_transceiver_get_dtls_transport (rtp_trans)->transport; /* 
get transport state */ @@ -878,24 +1051,6 @@ if (ice_state != STATE (CONNECTED) && ice_state != STATE (COMPLETED) && ice_state != STATE (CLOSED)) all_connected_completed_or_closed = FALSE; - - rtcp_transport = - webrtc_transceiver_get_rtcp_dtls_transport (rtp_trans)->transport; - - if (!rtcp_mux && rtcp_transport && transport != rtcp_transport) { - g_object_get (rtcp_transport, "state", &ice_state, NULL); - GST_TRACE_OBJECT (webrtc, "transceiver %p RTCP state 0x%x", rtp_trans, - ice_state); - any_state |= (1 << ice_state); - - if (ice_state != STATE (NEW) && ice_state != STATE (CLOSED)) - all_new_or_closed = FALSE; - if (ice_state != STATE (COMPLETED) && ice_state != STATE (CLOSED)) - all_completed_or_closed = FALSE; - if (ice_state != STATE (CONNECTED) && ice_state != STATE (COMPLETED) - && ice_state != STATE (CLOSED)) - all_connected_completed_or_closed = FALSE; - } } GST_TRACE_OBJECT (webrtc, "ICE connection state: 0x%x", any_state); @@ -956,9 +1111,8 @@ WebRTCTransceiver *trans = WEBRTC_TRANSCEIVER (rtp_trans); TransportStream *stream = trans->stream; GstWebRTCDTLSTransport *dtls_transport; - GstWebRTCICETransport *transport, *rtcp_transport; + GstWebRTCICETransport *transport; GstWebRTCICEGatheringState ice_state; - gboolean rtcp_mux = FALSE; if (rtp_trans->stopped || stream == NULL) { GST_TRACE_OBJECT (webrtc, "transceiver %p stopped or unassociated", @@ -972,8 +1126,6 @@ GST_TRACE_OBJECT (webrtc, "transceiver %p has no mid", rtp_trans); } - g_object_get (stream, "rtcp-mux", &rtcp_mux, NULL); - dtls_transport = webrtc_transceiver_get_dtls_transport (rtp_trans); if (dtls_transport == NULL) { GST_WARNING ("Transceiver %p has no DTLS transport", rtp_trans); @@ -989,22 +1141,6 @@ any_state |= (1 << ice_state); if (ice_state != STATE (COMPLETE)) all_completed = FALSE; - - dtls_transport = webrtc_transceiver_get_rtcp_dtls_transport (rtp_trans); - if (dtls_transport == NULL) { - GST_WARNING ("Transceiver %p has no DTLS RTCP transport", rtp_trans); - continue; - } - 
rtcp_transport = dtls_transport->transport; - - if (!rtcp_mux && rtcp_transport && rtcp_transport != transport) { - g_object_get (rtcp_transport, "gathering-state", &ice_state, NULL); - GST_TRACE_OBJECT (webrtc, "transceiver %p RTCP gathering state: 0x%x", - rtp_trans, ice_state); - any_state |= (1 << ice_state); - if (ice_state != STATE (COMPLETE)) - all_completed = FALSE; - } } GST_TRACE_OBJECT (webrtc, "ICE gathering state: 0x%x", any_state); @@ -1048,12 +1184,9 @@ for (i = 0; i < webrtc->priv->transceivers->len; i++) { GstWebRTCRTPTransceiver *rtp_trans = g_ptr_array_index (webrtc->priv->transceivers, i); - WebRTCTransceiver *trans = WEBRTC_TRANSCEIVER (rtp_trans); - TransportStream *stream = trans->stream; - GstWebRTCDTLSTransport *transport, *rtcp_transport; + GstWebRTCDTLSTransport *transport; GstWebRTCICEConnectionState ice_state; GstWebRTCDTLSTransportState dtls_state; - gboolean rtcp_mux = FALSE; if (rtp_trans->stopped) { GST_TRACE_OBJECT (webrtc, "transceiver %p stopped", rtp_trans); @@ -1064,7 +1197,6 @@ continue; } - g_object_get (stream, "rtcp-mux", &rtcp_mux, NULL); transport = webrtc_transceiver_get_dtls_transport (rtp_trans); /* get transport state */ @@ -1093,38 +1225,40 @@ if (ice_state != ICE_STATE (CONNECTED) && ice_state != ICE_STATE (COMPLETED) && ice_state != ICE_STATE (CLOSED)) ice_all_connected_completed_or_closed = FALSE; + } - rtcp_transport = webrtc_transceiver_get_rtcp_dtls_transport (rtp_trans); + // also check data channel transport state + if (webrtc->priv->data_channel_transport) { + GstWebRTCDTLSTransport *transport = + webrtc->priv->data_channel_transport->transport; + GstWebRTCICEConnectionState ice_state; + GstWebRTCDTLSTransportState dtls_state; - if (!rtcp_mux && rtcp_transport && rtcp_transport != transport) { - g_object_get (rtcp_transport, "state", &dtls_state, NULL); - GST_TRACE_OBJECT (webrtc, "transceiver %p RTCP DTLS state: 0x%x", - rtp_trans, dtls_state); - any_dtls_state |= (1 << dtls_state); - - if (dtls_state != 
DTLS_STATE (NEW) && dtls_state != DTLS_STATE (CLOSED)) - dtls_all_new_or_closed = FALSE; - if (dtls_state != DTLS_STATE (NEW) - && dtls_state != DTLS_STATE (CONNECTING)) - dtls_all_new_connecting_or_checking = FALSE; - if (dtls_state != DTLS_STATE (CONNECTED) - && dtls_state != DTLS_STATE (CLOSED)) - dtls_all_connected_completed_or_closed = FALSE; - - g_object_get (rtcp_transport->transport, "state", &ice_state, NULL); - GST_TRACE_OBJECT (webrtc, "transceiver %p RTCP ICE state: 0x%x", - rtp_trans, ice_state); - any_ice_state |= (1 << ice_state); - - if (ice_state != ICE_STATE (NEW) && ice_state != ICE_STATE (CLOSED)) - ice_all_new_or_closed = FALSE; - if (ice_state != ICE_STATE (NEW) && ice_state != ICE_STATE (CHECKING)) - ice_all_new_connecting_or_checking = FALSE; - if (ice_state != ICE_STATE (CONNECTED) - && ice_state != ICE_STATE (COMPLETED) - && ice_state != ICE_STATE (CLOSED)) - ice_all_connected_completed_or_closed = FALSE; - } + g_object_get (transport, "state", &dtls_state, NULL); + GST_TRACE_OBJECT (webrtc, "data channel transport DTLS state: 0x%x", + dtls_state); + any_dtls_state |= (1 << dtls_state); + + if (dtls_state != DTLS_STATE (NEW) && dtls_state != DTLS_STATE (CLOSED)) + dtls_all_new_or_closed = FALSE; + if (dtls_state != DTLS_STATE (NEW) && dtls_state != DTLS_STATE (CONNECTING)) + dtls_all_new_connecting_or_checking = FALSE; + if (dtls_state != DTLS_STATE (CONNECTED) + && dtls_state != DTLS_STATE (CLOSED)) + dtls_all_connected_completed_or_closed = FALSE; + + g_object_get (transport->transport, "state", &ice_state, NULL); + GST_TRACE_OBJECT (webrtc, "data channel transport ICE state: 0x%x", + ice_state); + any_ice_state |= (1 << ice_state); + + if (ice_state != ICE_STATE (NEW) && ice_state != ICE_STATE (CLOSED)) + ice_all_new_or_closed = FALSE; + if (ice_state != ICE_STATE (NEW) && ice_state != ICE_STATE (CHECKING)) + ice_all_new_connecting_or_checking = FALSE; + if (ice_state != ICE_STATE (CONNECTED) && ice_state != ICE_STATE (COMPLETED) + && 
ice_state != ICE_STATE (CLOSED)) + ice_all_connected_completed_or_closed = FALSE; } GST_TRACE_OBJECT (webrtc, "ICE connection state: 0x%x. DTLS connection " @@ -1156,7 +1290,7 @@ /* All RTCIceTransports and RTCDtlsTransports are in the new or closed * state, or there are no transports. */ if ((dtls_all_new_or_closed && ice_all_new_or_closed) - || webrtc->priv->transceivers->len == 0) { + || webrtc->priv->transports->len == 0) { GST_TRACE_OBJECT (webrtc, "returning new"); return STATE (NEW); } @@ -1193,7 +1327,7 @@ #undef STATE } -static void +static GstStructure * _update_ice_gathering_state_task (GstWebRTCBin * webrtc, gpointer data) { GstWebRTCICEGatheringState old_state = webrtc->ice_gathering_state; @@ -1231,6 +1365,8 @@ g_object_notify (G_OBJECT (webrtc), "ice-gathering-state"); PC_LOCK (webrtc); } + + return NULL; } static void @@ -1240,7 +1376,7 @@ NULL, NULL); } -static void +static GstStructure * _update_ice_connection_state_task (GstWebRTCBin * webrtc, gpointer data) { GstWebRTCICEConnectionState old_state = webrtc->ice_connection_state; @@ -1266,6 +1402,8 @@ g_object_notify (G_OBJECT (webrtc), "ice-connection-state"); PC_LOCK (webrtc); } + + return NULL; } static void @@ -1275,7 +1413,7 @@ NULL, NULL); } -static void +static GstStructure * _update_peer_connection_state_task (GstWebRTCBin * webrtc, gpointer data) { GstWebRTCPeerConnectionState old_state = webrtc->peer_connection_state; @@ -1301,6 +1439,8 @@ g_object_notify (G_OBJECT (webrtc), "connection-state"); PC_LOCK (webrtc); } + + return NULL; } static void @@ -1327,7 +1467,11 @@ wpad = GST_WEBRTC_BIN_PAD (l->data); if (GST_PAD_DIRECTION (l->data) == GST_PAD_SINK && !wpad->received_caps && (!wpad->trans || !wpad->trans->stopped)) { - goto done; + if (wpad->trans && wpad->trans->codec_preferences) { + continue; + } else { + goto done; + } } } @@ -1505,7 +1649,7 @@ return FALSE; } -static void +static GstStructure * _check_need_negotiation_task (GstWebRTCBin * webrtc, gpointer unused) { if 
(webrtc->priv->need_negotiation) { @@ -1515,6 +1659,8 @@ 0); PC_LOCK (webrtc); } + + return NULL; } /* http://w3c.github.io/webrtc-pc/#dfn-update-the-negotiation-needed-flag */ @@ -1547,55 +1693,183 @@ } static GstCaps * +_query_pad_caps (GstWebRTCBin * webrtc, GstWebRTCRTPTransceiver * rtp_trans, + GstWebRTCBinPad * pad, GstCaps * filter, GError ** error) +{ + GstCaps *caps; + guint i, n; + + caps = gst_pad_peer_query_caps (GST_PAD (pad), filter); + GST_LOG_OBJECT (webrtc, "Using peer query caps: %" GST_PTR_FORMAT, caps); + + /* Only return an error if actual empty caps were returned from the query. */ + if (gst_caps_is_empty (caps)) { + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE, + "Caps negotiation on pad %s failed", GST_PAD_NAME (pad)); + gst_clear_caps (&caps); + gst_caps_unref (filter); + return NULL; + } + + n = gst_caps_get_size (caps); + if (n > 0) { + /* Make sure the caps are complete enough to figure out the media type and + * encoding-name, otherwise they would match with basically any media. */ + caps = gst_caps_make_writable (caps); + for (i = n; i > 0; i--) { + const GstStructure *s = gst_caps_get_structure (caps, i - 1); + + if (!gst_structure_has_name (s, "application/x-rtp") || + !gst_structure_has_field (s, "media") || + !gst_structure_has_field (s, "encoding-name")) { + gst_caps_remove_structure (caps, i - 1); + } + } + } + + /* If the filtering above resulted in empty caps, or the caps were ANY to + * begin with, then don't report and error but just NULL. + * + * This would be the case if negotiation would not fail but the peer does + * not have any specific enough preferred caps that would allow us to + * use them further. 
+ */ + if (gst_caps_is_any (caps) || gst_caps_is_empty (caps)) { + GST_DEBUG_OBJECT (webrtc, "Peer caps not specific enough"); + gst_clear_caps (&caps); + } + + gst_caps_unref (filter); + + return caps; +} + +static GstCaps * _find_codec_preferences (GstWebRTCBin * webrtc, - GstWebRTCRTPTransceiver * rtp_trans, GstPadDirection direction, - guint media_idx) + GstWebRTCRTPTransceiver * rtp_trans, guint media_idx, GError ** error) { WebRTCTransceiver *trans = (WebRTCTransceiver *) rtp_trans; GstCaps *ret = NULL; + GstCaps *codec_preferences = NULL; + GstWebRTCBinPad *pad = NULL; + GstPadDirection direction; + + g_assert (rtp_trans); + g_assert (error && *error == NULL); GST_LOG_OBJECT (webrtc, "retrieving codec preferences from %" GST_PTR_FORMAT, trans); - if (rtp_trans && rtp_trans->codec_preferences) { + GST_OBJECT_LOCK (rtp_trans); + if (rtp_trans->codec_preferences) { GST_LOG_OBJECT (webrtc, "Using codec preferences: %" GST_PTR_FORMAT, rtp_trans->codec_preferences); - ret = gst_caps_ref (rtp_trans->codec_preferences); - } else { - GstWebRTCBinPad *pad = NULL; + codec_preferences = gst_caps_ref (rtp_trans->codec_preferences); + } + GST_OBJECT_UNLOCK (rtp_trans); + + if (rtp_trans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY) + direction = GST_PAD_SRC; + else + direction = GST_PAD_SINK; + + pad = _find_pad_for_transceiver (webrtc, direction, rtp_trans); + + /* try to find a pad */ + if (!pad) + pad = _find_pad_for_mline (webrtc, direction, media_idx); + + /* For the case where we have set our transceiver to sendrecv, but the + * sink pad has not been requested yet. 
+   */
+  if (!pad &&
+      rtp_trans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV) {
+
+    pad = _find_pad_for_transceiver (webrtc, GST_PAD_SRC, rtp_trans);
   /* try to find a pad */
-  if (!trans
-      || !(pad = _find_pad_for_transceiver (webrtc, direction, rtp_trans)))
-    pad = _find_pad_for_mline (webrtc, direction, media_idx);
-
-  if (!pad) {
-    if (trans && trans->last_configured_caps)
-      ret = gst_caps_ref (trans->last_configured_caps);
+    if (!pad)
+      pad = _find_pad_for_mline (webrtc, GST_PAD_SRC, media_idx);
+  }
+
+  if (pad) {
+    GstCaps *caps = NULL;
+
+    if (pad->received_caps) {
+      caps = gst_caps_ref (pad->received_caps);
     } else {
-      GstCaps *caps = NULL;
+      static GstStaticCaps static_filter =
+          GST_STATIC_CAPS ("application/x-rtp, "
+          "media = (string) { audio, video }, payload = (int) [ 0, 127 ]");
+      GstCaps *filter = gst_static_caps_get (&static_filter);
+
+      filter = gst_caps_make_writable (filter);
+
+      if (rtp_trans->kind == GST_WEBRTC_KIND_AUDIO)
+        gst_caps_set_simple (filter, "media", G_TYPE_STRING, "audio", NULL);
+      else if (rtp_trans->kind == GST_WEBRTC_KIND_VIDEO)
+        gst_caps_set_simple (filter, "media", G_TYPE_STRING, "video", NULL);
-      if (pad->received_caps) {
-        caps = gst_caps_ref (pad->received_caps);
-      } else if ((caps = gst_pad_get_current_caps (GST_PAD (pad)))) {
-        GST_LOG_OBJECT (webrtc, "Using current pad caps: %" GST_PTR_FORMAT,
-            caps);
-      } else {
-        if ((caps = gst_pad_peer_query_caps (GST_PAD (pad), NULL)))
-          GST_LOG_OBJECT (webrtc, "Using peer query caps: %" GST_PTR_FORMAT,
-              caps);
+      caps = _query_pad_caps (webrtc, rtp_trans, pad, filter, error);
+    }
+    gst_object_unref (pad);
+
+    if (*error)
+      goto out;
+
+    if (caps &&
+        rtp_trans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV) {
+      GstWebRTCBinPad *srcpad =
+          _find_pad_for_transceiver (webrtc, GST_PAD_SRC, rtp_trans);
+
+      if (srcpad) {
+        caps = _query_pad_caps (webrtc, rtp_trans, srcpad, caps, error);
+        gst_object_unref (srcpad);
+
+        if (*error)
+          goto out;
       }
-      if (caps) {
-        if (trans)
-          gst_caps_replace (&trans->last_configured_caps, caps);
+    }
-        ret = caps;
+    if (caps && codec_preferences) {
+      GstCaps *intersection;
+
+      intersection = gst_caps_intersect_full (codec_preferences, caps,
+          GST_CAPS_INTERSECT_FIRST);
+      gst_clear_caps (&caps);
+
+      if (gst_caps_is_empty (intersection)) {
+        g_set_error (error, GST_WEBRTC_ERROR,
+            GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+            "Caps negotiation on pad %s failed against codec preferences",
+            GST_PAD_NAME (pad));
+        gst_clear_caps (&intersection);
+      } else {
+        caps = intersection;
       }
+    }
-    gst_object_unref (pad);
+    if (caps) {
+      if (trans)
+        gst_caps_replace (&trans->last_configured_caps, caps);
+
+      ret = caps;
     }
   }
+  if (!ret) {
+    if (codec_preferences)
+      ret = gst_caps_ref (codec_preferences);
+    else if (trans->last_configured_caps)
+      ret = gst_caps_ref (trans->last_configured_caps);
+  }
+
+out:
+
+  if (codec_preferences)
+    gst_caps_unref (codec_preferences);
+
   if (!ret)
     GST_DEBUG_OBJECT (trans, "Could not find caps for mline %u", media_idx);
 
@@ -1606,11 +1880,16 @@
 _add_supported_attributes_to_caps (GstWebRTCBin * webrtc,
     WebRTCTransceiver * trans, const GstCaps * caps)
 {
+  GstWebRTCKind kind;
   GstCaps *ret;
   guint i;
 
+  if (caps == NULL)
+    return NULL;
+
   ret = gst_caps_make_writable (caps);
+  kind = webrtc_kind_from_caps (ret);
 
   for (i = 0; i < gst_caps_get_size (ret); i++) {
     GstStructure *s = gst_caps_get_structure (ret, i);
@@ -1618,11 +1897,11 @@
     if (!gst_structure_has_field (s, "rtcp-fb-nack"))
       gst_structure_set (s, "rtcp-fb-nack", G_TYPE_BOOLEAN, TRUE, NULL);
 
-    if (!gst_structure_has_field (s, "rtcp-fb-nack-pli"))
+    if (kind == GST_WEBRTC_KIND_VIDEO
+        && !gst_structure_has_field (s, "rtcp-fb-nack-pli"))
       gst_structure_set (s, "rtcp-fb-nack-pli", G_TYPE_BOOLEAN, TRUE, NULL);
-    /* FIXME: is this needed? */
-    /*if (!gst_structure_has_field (s, "rtcp-fb-transport-cc"))
-       gst_structure_set (s, "rtcp-fb-nack-pli", G_TYPE_BOOLEAN, TRUE, NULL); */
+    if (!gst_structure_has_field (s, "rtcp-fb-transport-cc"))
+      gst_structure_set (s, "rtcp-fb-transport-cc", G_TYPE_BOOLEAN, TRUE, NULL);
 
     /* FIXME: codec-specific parameters? */
   }
@@ -1652,9 +1931,263 @@
   _update_peer_connection_state (webrtc);
 }
 
+static gboolean
+match_ssrc (GstWebRTCRTPTransceiver * rtp_trans, gconstpointer data)
+{
+  WebRTCTransceiver *trans = (WebRTCTransceiver *) rtp_trans;
+
+  return (trans->current_ssrc == GPOINTER_TO_UINT (data));
+}
+
+static gboolean
+_on_sending_rtcp (GObject * internal_session, GstBuffer * buffer,
+    gboolean early, gpointer user_data)
+{
+  GstWebRTCBin *webrtc = user_data;
+  GstRTCPBuffer rtcp = GST_RTCP_BUFFER_INIT;
+  GstRTCPPacket packet;
+
+  if (!gst_rtcp_buffer_map (buffer, GST_MAP_READ, &rtcp))
+    goto done;
+
+  if (gst_rtcp_buffer_get_first_packet (&rtcp, &packet)) {
+    if (gst_rtcp_packet_get_type (&packet) == GST_RTCP_TYPE_SR) {
+      guint32 ssrc;
+      GstWebRTCRTPTransceiver *rtp_trans;
+      WebRTCTransceiver *trans;
+
+      gst_rtcp_packet_sr_get_sender_info (&packet, &ssrc, NULL, NULL, NULL,
+          NULL);
+
+      rtp_trans = _find_transceiver (webrtc, GUINT_TO_POINTER (ssrc),
+          match_ssrc);
+      trans = (WebRTCTransceiver *) rtp_trans;
+
+      if (rtp_trans && rtp_trans->sender && trans->ssrc_event) {
+        GstPad *pad;
+        gchar *pad_name = NULL;
+
+        pad_name =
+            g_strdup_printf ("send_rtcp_src_%u",
+            rtp_trans->sender->transport->session_id);
+        pad = gst_element_get_static_pad (webrtc->rtpbin, pad_name);
+        g_free (pad_name);
+        if (pad) {
+          gst_pad_push_event (pad, gst_event_ref (trans->ssrc_event));
+          gst_object_unref (pad);
+        }
+      }
+    }
+  }
+
+  gst_rtcp_buffer_unmap (&rtcp);
+
+done:
+  /* False means we don't care about suppression */
+  return FALSE;
+}
+
+static void
+gst_webrtc_bin_attach_tos_to_session (GstWebRTCBin * webrtc, guint session_id)
+{
+  GObject *internal_session = NULL;
+
+  g_signal_emit_by_name (webrtc->rtpbin, "get-internal-session",
+      session_id, &internal_session);
+
+  if (internal_session) {
+    g_signal_connect (internal_session, "on-sending-rtcp",
+        G_CALLBACK (_on_sending_rtcp), webrtc);
+    g_object_unref (internal_session);
+  }
+}
+
+static void
+weak_free (GWeakRef * weak)
+{
+  g_weak_ref_clear (weak);
+  g_free (weak);
+}
+
+static GstPadProbeReturn
+_nicesink_pad_probe (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
+{
+  GstWebRTCBin *webrtc = g_weak_ref_get ((GWeakRef *) user_data);
+
+  if (!webrtc)
+    return GST_PAD_PROBE_REMOVE;
+
+  if (GST_EVENT_TYPE (GST_PAD_PROBE_INFO_EVENT (info))
+      == GST_EVENT_CUSTOM_DOWNSTREAM_STICKY) {
+    const GstStructure *s =
+        gst_event_get_structure (GST_PAD_PROBE_INFO_EVENT (info));
+
+    if (gst_structure_has_name (s, "GstWebRtcBinUpdateTos")) {
+      guint ssrc;
+      gint priority;
+
+      if (gst_structure_get_uint (s, "ssrc", &ssrc)) {
+        GstWebRTCRTPTransceiver *rtp_trans;
+
+        rtp_trans = _find_transceiver (webrtc, GUINT_TO_POINTER (ssrc),
+            match_ssrc);
+        if (rtp_trans) {
+          WebRTCTransceiver *trans = WEBRTC_TRANSCEIVER (rtp_trans);
+          GstWebRTCICEStream *stream = _find_ice_stream_for_session (webrtc,
+              trans->stream->session_id);
+          guint8 dscp = 0;
+
+          /* Set DSCP field based on
+           * https://tools.ietf.org/html/draft-ietf-tsvwg-rtcweb-qos-18#section-5
+           */
+          switch (rtp_trans->sender->priority) {
+            case GST_WEBRTC_PRIORITY_TYPE_VERY_LOW:
+              dscp = 8;         /* CS1 */
+              break;
+            case GST_WEBRTC_PRIORITY_TYPE_LOW:
+              dscp = 0;         /* DF */
+              break;
+            case GST_WEBRTC_PRIORITY_TYPE_MEDIUM:
+              switch (rtp_trans->kind) {
+                case GST_WEBRTC_KIND_AUDIO:
+                  dscp = 46;    /* EF */
+                  break;
+                case GST_WEBRTC_KIND_VIDEO:
+                  dscp = 38;    /* AF43 *//* TODO: differentiate non-interactive */
+                  break;
+                case GST_WEBRTC_KIND_UNKNOWN:
+                  dscp = 0;
+                  break;
+              }
+              break;
+            case GST_WEBRTC_PRIORITY_TYPE_HIGH:
+              switch (rtp_trans->kind) {
+                case GST_WEBRTC_KIND_AUDIO:
+                  dscp = 46;    /* EF */
+                  break;
+                case GST_WEBRTC_KIND_VIDEO:
+                  dscp = 36;    /* AF42 *//* TODO: differentiate non-interactive */
+                  break;
+                case GST_WEBRTC_KIND_UNKNOWN:
+                  dscp = 0;
+                  break;
+              }
+              break;
+          }
+
+          gst_webrtc_ice_set_tos (webrtc->priv->ice, stream, dscp << 2);
+        }
+      } else if (gst_structure_get_enum (s, "sctp-priority",
+              GST_TYPE_WEBRTC_PRIORITY_TYPE, &priority)) {
+        guint8 dscp = 0;
+
+        /* Set DSCP field based on
+         * https://tools.ietf.org/html/draft-ietf-tsvwg-rtcweb-qos-18#section-5
+         */
+        switch (priority) {
+          case GST_WEBRTC_PRIORITY_TYPE_VERY_LOW:
+            dscp = 8;           /* CS1 */
+            break;
+          case GST_WEBRTC_PRIORITY_TYPE_LOW:
+            dscp = 0;           /* DF */
+            break;
+          case GST_WEBRTC_PRIORITY_TYPE_MEDIUM:
+            dscp = 10;          /* AF11 */
+            break;
+          case GST_WEBRTC_PRIORITY_TYPE_HIGH:
+            dscp = 18;          /* AF21 */
+            break;
+        }
+        if (webrtc->priv->data_channel_transport)
+          gst_webrtc_ice_set_tos (webrtc->priv->ice,
+              webrtc->priv->data_channel_transport->stream, dscp << 2);
+      }
+    }
+  }
+
+  gst_object_unref (webrtc);
+
+  return GST_PAD_PROBE_OK;
+}
+
+static void gst_webrtc_bin_attach_tos (GstWebRTCBin * webrtc);
+
+static void
+gst_webrtc_bin_update_sctp_priority (GstWebRTCBin * webrtc)
+{
+  GstWebRTCPriorityType sctp_priority = 0;
+  guint i;
+
+  if (!webrtc->priv->sctp_transport)
+    return;
+
+  DC_LOCK (webrtc);
+  for (i = 0; i < webrtc->priv->data_channels->len; i++) {
+    GstWebRTCDataChannel *channel
+        = g_ptr_array_index (webrtc->priv->data_channels, i);
+
+    sctp_priority = MAX (sctp_priority, channel->priority);
+  }
+  DC_UNLOCK (webrtc);
+
+  /* Default priority is low means DSCP field is left as 0 */
+  if (sctp_priority == 0)
+    sctp_priority = GST_WEBRTC_PRIORITY_TYPE_LOW;
+
+  /* Nobody asks for DSCP, leave it as-is */
+  if (sctp_priority == GST_WEBRTC_PRIORITY_TYPE_LOW &&
+      !webrtc->priv->tos_attached)
+    return;
+
+  /* If one stream has a non-default priority, then everyone else does too */
+  gst_webrtc_bin_attach_tos (webrtc);
+
+  webrtc_sctp_transport_set_priority (webrtc->priv->sctp_transport,
+      sctp_priority);
+}
+
+static void
+gst_webrtc_bin_attach_probe_to_ice_sink (GstWebRTCBin * webrtc,
+    GstWebRTCICETransport * transport)
+{
+  GstPad *pad;
+  GWeakRef *weak;
+
+  pad = gst_element_get_static_pad (transport->sink, "sink");
+
+  weak = g_new0 (GWeakRef, 1);
+  g_weak_ref_init (weak, webrtc);
+
+  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
+      _nicesink_pad_probe, weak, (GDestroyNotify) weak_free);
+  gst_object_unref (pad);
+}
+
+static void
+gst_webrtc_bin_attach_tos (GstWebRTCBin * webrtc)
+{
+  guint i;
+
+  if (webrtc->priv->tos_attached)
+    return;
+  webrtc->priv->tos_attached = TRUE;
+
+  for (i = 0; i < webrtc->priv->transports->len; i++) {
+    TransportStream *stream = g_ptr_array_index (webrtc->priv->transports, i);
+
+    gst_webrtc_bin_attach_tos_to_session (webrtc, stream->session_id);
+
+    gst_webrtc_bin_attach_probe_to_ice_sink (webrtc,
+        stream->transport->transport);
+  }
+
+  gst_webrtc_bin_update_sctp_priority (webrtc);
+}
+
 static WebRTCTransceiver *
 _create_webrtc_transceiver (GstWebRTCBin * webrtc,
-    GstWebRTCRTPTransceiverDirection direction, guint mline)
+    GstWebRTCRTPTransceiverDirection direction, guint mline, GstWebRTCKind kind,
+    GstCaps * codec_preferences)
 {
   WebRTCTransceiver *trans;
   GstWebRTCRTPTransceiver *rtp_trans;
@@ -1667,9 +2200,15 @@
   rtp_trans = GST_WEBRTC_RTP_TRANSCEIVER (trans);
   rtp_trans->direction = direction;
   rtp_trans->mline = mline;
+  rtp_trans->kind = kind;
+  rtp_trans->codec_preferences =
+      codec_preferences ? gst_caps_ref (codec_preferences) : NULL;
   /* FIXME: We don't support stopping transceiver yet so they're always not
     stopped */
   rtp_trans->stopped = FALSE;
 
+  g_signal_connect_object (sender, "notify::priority",
+      G_CALLBACK (gst_webrtc_bin_attach_tos), webrtc, G_CONNECT_SWAPPED);
+
   g_ptr_array_add (webrtc->priv->transceivers, trans);
 
   gst_object_unref (sender);
@@ -1686,6 +2225,7 @@
 {
   GstWebRTCDTLSTransport *transport;
   TransportStream *ret;
+  gchar *pad_name;
 
   /* FIXME: how to parametrize the sender and the receiver */
   ret = transport_stream_new (webrtc, session_id);
@@ -1698,16 +2238,24 @@
       G_CALLBACK (_on_ice_transport_notify_gathering_state), webrtc);
   g_signal_connect (G_OBJECT (transport), "notify::state",
       G_CALLBACK (_on_dtls_transport_notify_state), webrtc);
+  if (webrtc->priv->tos_attached)
+    gst_webrtc_bin_attach_probe_to_ice_sink (webrtc, transport->transport);
 
-  if ((transport = ret->rtcp_transport)) {
-    g_signal_connect (G_OBJECT (transport->transport),
-        "notify::state", G_CALLBACK (_on_ice_transport_notify_state), webrtc);
-    g_signal_connect (G_OBJECT (transport->transport),
-        "notify::gathering-state",
-        G_CALLBACK (_on_ice_transport_notify_gathering_state), webrtc);
-    g_signal_connect (G_OBJECT (transport), "notify::state",
-        G_CALLBACK (_on_dtls_transport_notify_state), webrtc);
-  }
+  gst_bin_add (GST_BIN (webrtc), GST_ELEMENT (ret->send_bin));
+  gst_bin_add (GST_BIN (webrtc), GST_ELEMENT (ret->receive_bin));
+  g_ptr_array_add (webrtc->priv->transports, ret);
+
+  pad_name = g_strdup_printf ("recv_rtcp_sink_%u", ret->session_id);
+  if (!gst_element_link_pads (GST_ELEMENT (ret->receive_bin), "rtcp_src",
+          GST_ELEMENT (webrtc->rtpbin), pad_name))
+    g_warn_if_reached ();
+  g_free (pad_name);
+
+  pad_name = g_strdup_printf ("send_rtcp_src_%u", ret->session_id);
+  if (!gst_element_link_pads (GST_ELEMENT (webrtc->rtpbin), pad_name,
+          GST_ELEMENT (ret->send_bin), "rtcp_sink"))
+    g_warn_if_reached ();
+  g_free (pad_name);
 
   GST_TRACE_OBJECT (webrtc,
       "Create transport %" GST_PTR_FORMAT " for session %u", ret, session_id);
@@ -1719,28 +2267,11 @@
 _get_or_create_rtp_transport_channel (GstWebRTCBin * webrtc, guint session_id)
 {
   TransportStream *ret;
-  gchar *pad_name;
 
   ret = _find_transport_for_session (webrtc, session_id);
 
-  if (!ret) {
+  if (!ret)
     ret = _create_transport_channel (webrtc, session_id);
-    gst_bin_add (GST_BIN (webrtc), GST_ELEMENT (ret->send_bin));
-    gst_bin_add (GST_BIN (webrtc), GST_ELEMENT (ret->receive_bin));
-    g_ptr_array_add (webrtc->priv->transports, ret);
-
-    pad_name = g_strdup_printf ("recv_rtcp_sink_%u", ret->session_id);
-    if (!gst_element_link_pads (GST_ELEMENT (ret->receive_bin), "rtcp_src",
-            GST_ELEMENT (webrtc->rtpbin), pad_name))
-      g_warn_if_reached ();
-    g_free (pad_name);
-
-    pad_name = g_strdup_printf ("send_rtcp_src_%u", ret->session_id);
-    if (!gst_element_link_pads (GST_ELEMENT (webrtc->rtpbin), pad_name,
-            GST_ELEMENT (ret->send_bin), "rtcp_sink"))
-      g_warn_if_reached ();
-    g_free (pad_name);
-  }
 
   gst_element_sync_state_with_parent (GST_ELEMENT (ret->send_bin));
   gst_element_sync_state_with_parent (GST_ELEMENT (ret->receive_bin));
@@ -1754,32 +2285,38 @@
     GParamSpec * pspec, GstWebRTCBin * webrtc)
 {
   GstWebRTCDataChannelState ready_state;
-  guint i;
 
   g_object_get (channel, "ready-state", &ready_state, NULL);
 
   if (ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_OPEN) {
-    gboolean found = FALSE;
+    gboolean found;
 
-    for (i = 0; i < webrtc->priv->pending_data_channels->len; i++) {
-      WebRTCDataChannel *c;
-
-      c = g_ptr_array_index (webrtc->priv->pending_data_channels, i);
-      if (c == channel) {
-        found = TRUE;
-        g_ptr_array_remove_index (webrtc->priv->pending_data_channels, i);
-        break;
-      }
-    }
+    DC_LOCK (webrtc);
+    found = g_ptr_array_remove (webrtc->priv->pending_data_channels, channel);
     if (found == FALSE) {
       GST_FIXME_OBJECT (webrtc, "Received open for unknown data channel");
+      DC_UNLOCK (webrtc);
       return;
     }
 
-    g_ptr_array_add (webrtc->priv->data_channels, channel);
+    g_ptr_array_add (webrtc->priv->data_channels, gst_object_ref (channel));
+    DC_UNLOCK (webrtc);
+
+    gst_webrtc_bin_update_sctp_priority (webrtc);
 
     g_signal_emit (webrtc, gst_webrtc_bin_signals[ON_DATA_CHANNEL_SIGNAL], 0,
-        gst_object_ref (channel));
+        channel);
+  } else if (ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED) {
+    gboolean found;
+
+    DC_LOCK (webrtc);
+    found = g_ptr_array_remove (webrtc->priv->pending_data_channels, channel)
+        || g_ptr_array_remove (webrtc->priv->data_channels, channel);
+
+    if (found == FALSE) {
+      GST_FIXME_OBJECT (webrtc, "Received close for unknown data channel");
    }
+    DC_UNLOCK (webrtc);
  }
 }
 
@@ -1794,7 +2331,7 @@
   if (sscanf (GST_PAD_NAME (pad), "src_%u", &stream_id) != 1)
     return;
 
-  PC_LOCK (webrtc);
+  DC_LOCK (webrtc);
   channel = _find_data_channel_for_id (webrtc, stream_id);
   if (!channel) {
     channel = g_object_new (WEBRTC_TYPE_DATA_CHANNEL, NULL);
@@ -1811,6 +2348,7 @@
 
     g_ptr_array_add (webrtc->priv->pending_data_channels, channel);
   }
+  DC_UNLOCK (webrtc);
 
   g_signal_connect (channel, "notify::ready-state",
       G_CALLBACK (_on_data_channel_ready_state), webrtc);
@@ -1820,11 +2358,10 @@
     GST_WARNING_OBJECT (channel, "Failed to link sctp pad %s with channel %"
         GST_PTR_FORMAT, GST_PAD_NAME (pad), channel);
   gst_object_unref (sink_pad);
-  PC_UNLOCK (webrtc);
 }
 
 static void
-_on_sctp_state_notify (GstWebRTCSCTPTransport * sctp, GParamSpec * pspec,
+_on_sctp_state_notify (WebRTCSCTPTransport * sctp, GParamSpec * pspec,
     GstWebRTCBin * webrtc)
 {
   GstWebRTCSCTPTransportState state;
@@ -1834,9 +2371,9 @@
   if (state == GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED) {
     int i;
 
-    PC_LOCK (webrtc);
     GST_DEBUG_OBJECT (webrtc, "SCTP association established");
 
+    DC_LOCK (webrtc);
     for (i = 0; i < webrtc->priv->data_channels->len; i++) {
       WebRTCDataChannel *channel;
 
@@ -1847,7 +2384,7 @@
       if (!channel->parent.negotiated && !channel->opened)
         webrtc_data_channel_start_negotiation (channel);
     }
-    PC_UNLOCK (webrtc);
+    DC_UNLOCK (webrtc);
   }
 }
 
@@ -1855,13 +2392,13 @@
 static void
 _on_sctp_notify_dtls_state (GstWebRTCDTLSTransport * transport,
     GParamSpec * pspec, GstWebRTCBin * webrtc);
 
-static void
+static GstStructure *
 _sctp_check_dtls_state_task (GstWebRTCBin * webrtc, gpointer unused)
 {
   TransportStream *stream;
   GstWebRTCDTLSTransport *transport;
   GstWebRTCDTLSTransportState dtls_state;
-  GstWebRTCSCTPTransport *sctp_transport;
+  WebRTCSCTPTransport *sctp_transport;
 
   stream = webrtc->priv->data_channel_transport;
   transport = stream->transport;
@@ -1871,7 +2408,7 @@
   if (dtls_state != GST_WEBRTC_DTLS_TRANSPORT_STATE_CONNECTED) {
     GST_DEBUG_OBJECT (webrtc,
         "Data channel DTLS connection is not ready yet: %d", dtls_state);
-    return;
+    return NULL;
   }
 
   GST_DEBUG_OBJECT (webrtc, "Data channel DTLS connection is now ready");
@@ -1879,7 +2416,7 @@
 
   /* Not locked state anymore so this was already taken care of before */
   if (!gst_element_is_locked_state (sctp_transport->sctpdec))
-    return;
+    return NULL;
 
   /* Start up the SCTP elements now that the DTLS connection is established */
   gst_element_set_locked_state (sctp_transport->sctpdec, FALSE);
@@ -1902,6 +2439,8 @@
 
   g_signal_handlers_disconnect_by_func (transport, _on_sctp_notify_dtls_state,
       webrtc);
+
+  return NULL;
 }
 
 static void
@@ -1947,24 +2486,17 @@
 {
   if (!webrtc->priv->data_channel_transport) {
     TransportStream *stream;
-    GstWebRTCSCTPTransport *sctp_transport;
-    int i;
+    WebRTCSCTPTransport *sctp_transport;
 
     stream = _find_transport_for_session (webrtc, session_id);
-    if (!stream) {
+    if (!stream)
      stream = _create_transport_channel (webrtc, session_id);
-      gst_bin_add (GST_BIN (webrtc), GST_ELEMENT (stream->send_bin));
-      gst_bin_add (GST_BIN (webrtc), GST_ELEMENT (stream->receive_bin));
-      g_ptr_array_add (webrtc->priv->transports, stream);
-    }
 
     webrtc->priv->data_channel_transport = stream;
 
-    g_object_set (stream, "rtcp-mux", TRUE, NULL);
-
     if (!(sctp_transport = webrtc->priv->sctp_transport)) {
-      sctp_transport = gst_webrtc_sctp_transport_new ();
+      sctp_transport = webrtc_sctp_transport_new ();
       sctp_transport->transport =
          g_object_ref (webrtc->priv->data_channel_transport->transport);
       sctp_transport->webrtcbin = webrtc;
@@ -2003,14 +2535,6 @@
             GST_ELEMENT (stream->send_bin), "data_sink"))
       g_warn_if_reached ();
 
-    for (i = 0; i < webrtc->priv->data_channels->len; i++) {
-      WebRTCDataChannel *channel;
-
-      channel = g_ptr_array_index (webrtc->priv->data_channels, i);
-
-      webrtc_data_channel_link_to_sctp (channel, webrtc->priv->sctp_transport);
-    }
-
     gst_element_sync_state_with_parent (GST_ELEMENT (stream->send_bin));
     gst_element_sync_state_with_parent (GST_ELEMENT (stream->receive_bin));
 
@@ -2028,6 +2552,8 @@
     }
 
     webrtc->priv->sctp_transport = sctp_transport;
+
+    gst_webrtc_bin_update_sctp_priority (webrtc);
   }
 
   return webrtc->priv->data_channel_transport;
@@ -2274,12 +2800,153 @@
   g_free (val);
 }
 
+static gchar *
+_parse_extmap (GQuark field_id, const GValue * value, GError ** error)
+{
+  gchar *ret = NULL;
+
+  if (G_VALUE_HOLDS_STRING (value)) {
+    ret = g_value_dup_string (value);
+  } else if (G_VALUE_HOLDS (value, GST_TYPE_ARRAY)
+      && gst_value_array_get_size (value) == 3) {
+    const GValue *val;
+    const gchar *direction, *extensionname, *extensionattributes;
+
+    val = gst_value_array_get_value (value, 0);
+    direction = g_value_get_string (val);
+
+    val = gst_value_array_get_value (value, 1);
+    extensionname = g_value_get_string (val);
+
+    val = gst_value_array_get_value (value, 2);
+    extensionattributes = g_value_get_string (val);
+
+    if (!extensionname || *extensionname == '\0')
+      goto done;
+
+    if (direction && *direction != '\0' && extensionattributes
+        && *extensionattributes != '\0') {
+      ret =
+          g_strdup_printf ("/%s %s %s", direction, extensionname,
+          extensionattributes);
+    } else if (direction && *direction != '\0') {
+      ret = g_strdup_printf ("/%s %s", direction, extensionname);
+    } else if (extensionattributes && *extensionattributes != '\0') {
+      ret = g_strdup_printf ("%s %s", extensionname, extensionattributes);
+    } else {
+      ret = g_strdup (extensionname);
+    }
+  }
+
+  if (!ret && error) {
+    gchar *val_str = gst_value_serialize (value);
+
+    g_set_error (error, GST_WEBRTC_ERROR,
+        GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+        "Invalid value for %s: %s", g_quark_to_string (field_id), val_str);
+    g_free (val_str);
+  }
+
+done:
+  return ret;
+}
+
+typedef struct
+{
+  gboolean ret;
+  GstStructure *extmap;
+  GError **error;
+} ExtmapData;
+
+static gboolean
+_dedup_extmap_field (GQuark field_id, const GValue * value, ExtmapData * data)
+{
+  gboolean is_extmap =
+      g_str_has_prefix (g_quark_to_string (field_id), "extmap-");
+
+  if (!data->ret)
+    goto done;
+
+  if (is_extmap) {
+    gchar *new_value = _parse_extmap (field_id, value, data->error);
+
+    if (!new_value) {
+      data->ret = FALSE;
+      goto done;
+    }
+
+    if (gst_structure_id_has_field (data->extmap, field_id)) {
+      gchar *old_value =
+          _parse_extmap (field_id, gst_structure_id_get_value (data->extmap,
+              field_id), NULL);
+
+      g_assert (old_value);
+
+      if (g_strcmp0 (new_value, old_value)) {
+        GST_ERROR
+            ("extmap contains different values for id %s (%s != %s)",
+            g_quark_to_string (field_id), old_value, new_value);
+        g_set_error (data->error, GST_WEBRTC_ERROR,
+            GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+            "extmap contains different values for id %s (%s != %s)",
+            g_quark_to_string (field_id), old_value, new_value);
+        data->ret = FALSE;
+      }
+
+      g_free (old_value);
+
+    }
+
+    if (data->ret) {
+      gst_structure_id_set_value (data->extmap, field_id, value);
+    }
+
+    g_free (new_value);
+  }
+
+done:
+  return !is_extmap;
+}
+
+static GstStructure *
+_gather_extmap (GstCaps * caps, GError ** error)
+{
+  ExtmapData edata =
+      { TRUE, gst_structure_new_empty ("application/x-extmap"), error };
+  guint i, n;
+
+  n = gst_caps_get_size (caps);
+
+  for (i = 0; i < n; i++) {
+    GstStructure *s = gst_caps_get_structure (caps, i);
+
+    gst_structure_filter_and_map_in_place (s,
+        (GstStructureFilterMapFunc) _dedup_extmap_field, &edata);
+
+    if (!edata.ret) {
+      gst_clear_structure (&edata.extmap);
+      break;
+    }
+  }
+
+  return edata.extmap;
+}
+
+static gboolean
+_copy_field (GQuark field_id, const GValue * value, GstStructure * s)
+{
+  gst_structure_id_set_value (s, field_id, value);
+
+  return TRUE;
+}
+
 /* based off https://tools.ietf.org/html/draft-ietf-rtcweb-jsep-18#section-5.2.1 */
 static gboolean
 sdp_media_from_transceiver (GstWebRTCBin * webrtc, GstSDPMedia * media,
-    GstWebRTCRTPTransceiver * trans, GstWebRTCSDPType type, guint media_idx,
+    GstWebRTCRTPTransceiver * trans, guint media_idx,
     GString * bundled_mids, guint bundle_idx, gchar * bundle_ufrag,
-    gchar * bundle_pwd, GArray * reserved_pts)
+    gchar * bundle_pwd, GArray * reserved_pts, GHashTable * all_mids,
+    GError ** error)
 {
   /* TODO:
    * rtp header extensions
@@ -2295,6 +2962,7 @@
   gchar *direction, *sdp_mid, *ufrag, *pwd;
   gboolean bundle_only;
   GstCaps *caps;
+  GstStructure *extmap;
   int i;
 
   if (trans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE
@@ -2354,14 +3022,7 @@
   gst_sdp_media_add_attribute (media, direction, "");
   g_free (direction);
 
-  if (type == GST_WEBRTC_SDP_TYPE_OFFER) {
-    caps = _find_codec_preferences (webrtc, trans, GST_PAD_SINK, media_idx);
-    caps =
-        _add_supported_attributes_to_caps (webrtc, WEBRTC_TRANSCEIVER (trans),
-        caps);
-  } else {
-    g_assert_not_reached ();
-  }
+  caps = _find_codec_preferences (webrtc, trans, media_idx, error);
 
   if (!caps || gst_caps_is_empty (caps) || gst_caps_is_any (caps)) {
     GST_WARNING_OBJECT (webrtc, "no caps available for transceiver, skipping");
@@ -2370,11 +3031,38 @@
     return FALSE;
   }
 
+  caps = gst_caps_make_writable (caps);
+
+  /* When an extmap is defined twice for the same ID, firefox complains and
+   * errors out (chrome is smart enough to accept strict duplicates).
+   *
+   * To work around this, we deduplicate extmap attributes, and also error
+   * out when a different extmap is defined for the same ID.
+   *
+   * _gather_extmap will strip out all extmap- fields, which will then be
+   * added upon adding the first format for the media.
+   */
+  extmap = _gather_extmap (caps, error);
+
+  if (!extmap) {
+    GST_ERROR_OBJECT (webrtc,
+        "Failed to build extmap for transceiver %" GST_PTR_FORMAT, trans);
+    gst_caps_unref (caps);
+    return FALSE;
+  }
+
+  caps = _add_supported_attributes_to_caps (webrtc, WEBRTC_TRANSCEIVER (trans),
+      caps);
+
   for (i = 0; i < gst_caps_get_size (caps); i++) {
     GstCaps *format = gst_caps_new_empty ();
-    const GstStructure *s = gst_caps_get_structure (caps, i);
+    GstStructure *s = gst_structure_copy (gst_caps_get_structure (caps, i));
+
+    if (i == 0) {
+      gst_structure_foreach (extmap, (GstStructureForeachFunc) _copy_field, s);
+    }
 
-    gst_caps_append_structure (format, gst_structure_copy (s));
+    gst_caps_append_structure (format, s);
 
     GST_DEBUG_OBJECT (webrtc, "Adding %u-th caps %" GST_PTR_FORMAT
         " to %u-th media", i, format, media_idx);
@@ -2386,7 +3074,9 @@
     gst_caps_unref (format);
   }
 
-  if (type == GST_WEBRTC_SDP_TYPE_OFFER) {
+  gst_clear_structure (&extmap);
+
+  {
     const GstStructure *s = gst_caps_get_structure (caps, 0);
     gint clockrate = -1;
     gint rtx_target_pt;
@@ -2421,10 +3111,18 @@
   if (trans->mid) {
     gst_sdp_media_add_attribute (media, "mid", trans->mid);
   } else {
-    sdp_mid = g_strdup_printf ("%s%u", gst_sdp_media_get_media (media),
-        webrtc->priv->media_counter++);
-    gst_sdp_media_add_attribute (media, "mid", sdp_mid);
-    g_free (sdp_mid);
+    /* Make sure to avoid mid collisions */
+    while (TRUE) {
+      sdp_mid = g_strdup_printf ("%s%u", gst_sdp_media_get_media (media),
+          webrtc->priv->media_counter++);
+      if (g_hash_table_contains (all_mids, (gpointer) sdp_mid)) {
+        g_free (sdp_mid);
+      } else {
+        gst_sdp_media_add_attribute (media, "mid", sdp_mid);
+        g_hash_table_insert (all_mids, sdp_mid, NULL);
+        break;
+      }
+    }
   }
 
   /* TODO:
@@ -2465,6 +3163,7 @@
     gint pt;
 
     if (gst_structure_get_int (s, "payload", &pt)) {
+      GST_TRACE_OBJECT (pad, "have reserved pt %u from received caps", pt);
       g_array_append_val (reserved_pts, pt);
     }
   }
@@ -2475,11 +3174,34 @@
 {
   GstElement *element = GST_ELEMENT (webrtc);
   GArray *reserved_pts = g_array_new (FALSE, FALSE, sizeof (guint));
+  guint i;
 
   GST_OBJECT_LOCK (webrtc);
   g_list_foreach (element->sinkpads, (GFunc) gather_pad_pt, reserved_pts);
   g_list_foreach (webrtc->priv->pending_pads, (GFunc) gather_pad_pt,
       reserved_pts);
+
+  for (i = 0; i < webrtc->priv->transceivers->len; i++) {
+    GstWebRTCRTPTransceiver *trans;
+
+    trans = g_ptr_array_index (webrtc->priv->transceivers, i);
+    GST_OBJECT_LOCK (trans);
+    if (trans->codec_preferences) {
+      guint j, n;
+      gint pt;
+
+      n = gst_caps_get_size (trans->codec_preferences);
+      for (j = 0; j < n; j++) {
+        GstStructure *s = gst_caps_get_structure (trans->codec_preferences, j);
+        if (gst_structure_get_int (s, "payload", &pt)) {
+          GST_TRACE_OBJECT (trans, "have reserved pt %u from codec preferences",
+              pt);
+          g_array_append_val (reserved_pts, pt);
+        }
+      }
+    }
+    GST_OBJECT_UNLOCK (trans);
+  }
   GST_OBJECT_UNLOCK (webrtc);
 
   return reserved_pts;
@@ -2488,7 +3210,7 @@
 static gboolean
 _add_data_channel_offer (GstWebRTCBin * webrtc, GstSDPMessage * msg,
     GstSDPMedia * media, GString * bundled_mids, guint bundle_idx,
-    gchar * bundle_ufrag, gchar * bundle_pwd)
+    gchar * bundle_ufrag, gchar * bundle_pwd, GHashTable * all_mids)
 {
   GstSDPMessage *last_offer = _get_latest_self_generated_sdp (webrtc);
   gchar *ufrag, *pwd, *sdp_mid;
@@ -2549,10 +3271,18 @@
 
     gst_sdp_media_add_attribute (media, "mid", mid);
   } else {
-    sdp_mid = g_strdup_printf ("%s%u", gst_sdp_media_get_media (media),
-        webrtc->priv->media_counter++);
-    gst_sdp_media_add_attribute (media, "mid", sdp_mid);
-    g_free (sdp_mid);
+    /* Make sure to avoid mid collisions */
+    while (TRUE) {
+      sdp_mid = g_strdup_printf ("%s%u", gst_sdp_media_get_media (media),
+          webrtc->priv->media_counter++);
+      if (g_hash_table_contains (all_mids, (gpointer) sdp_mid)) {
+        g_free (sdp_mid);
+      } else {
+        gst_sdp_media_add_attribute (media, "mid", sdp_mid);
+        g_hash_table_insert (all_mids, sdp_mid, NULL);
+        break;
+      }
+    }
   }
 
   if (bundled_mids) {
@@ -2574,13 +3304,17 @@
 
 /* TODO: use the options argument */
 static GstSDPMessage *
-_create_offer_task (GstWebRTCBin * webrtc, const GstStructure * options)
+_create_offer_task (GstWebRTCBin * webrtc, const GstStructure * options,
+    GError ** error)
 {
-  GstSDPMessage *ret;
+  GstSDPMessage *ret = NULL;
   GString *bundled_mids = NULL;
   gchar *bundle_ufrag = NULL;
   gchar *bundle_pwd = NULL;
   GArray *reserved_pts = NULL;
+  GHashTable *all_mids =
+      g_hash_table_new_full (g_str_hash, g_str_equal, g_free, NULL);
+
   GstSDPMessage *last_offer = _get_latest_self_generated_sdp (webrtc);
   GList *seen_transceivers = NULL;
   guint media_idx = 0;
@@ -2617,7 +3351,7 @@
     guint bundle_media_index;
 
     reserved_pts = gather_reserved_pts (webrtc);
-    if (last_offer && _parse_bundle (last_offer, &last_bundle) && last_bundle
+    if (last_offer && _parse_bundle (last_offer, &last_bundle, NULL) && last_bundle
         && last_bundle[0]
         && _get_bundle_index (last_offer, last_bundle, &bundle_media_index)) {
      bundle_ufrag =
@@ -2652,9 +3386,21 @@
 
       if (trans->mid && g_strcmp0 (trans->mid, last_mid) == 0) {
         GstSDPMedia *media;
+        const gchar *mid;
+        WebRTCTransceiver *wtrans = WEBRTC_TRANSCEIVER (trans);
 
         g_assert (!g_list_find (seen_transceivers, trans));
 
+        if (wtrans->mline_locked && trans->mline != media_idx) {
+          g_set_error (error, GST_WEBRTC_ERROR,
+              GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+              "Previous negotiatied transceiver %"
+              GST_PTR_FORMAT " with mid %s was in mline %d but transceiver"
+              " has locked mline %u", trans, trans->mid, media_idx,
+              trans->mline);
+          goto cancel_offer;
+        }
+
         GST_LOG_OBJECT (webrtc, "using previous negotiatied transceiver %"
            GST_PTR_FORMAT " with mid %s into media index %u", trans,
            trans->mid, media_idx);
@@ -2663,13 +3409,22 @@
         gst_sdp_media_copy (last_media, &media);
         _media_replace_direction (media, trans->direction);
 
-        if (bundled_mids) {
-          const gchar *mid = gst_sdp_media_get_attribute_val (media, "mid");
+        mid = gst_sdp_media_get_attribute_val (media, "mid");
+        g_assert (mid);
 
-          g_assert (mid);
-          g_string_append_printf (bundled_mids, " %s", mid);
+        if (g_hash_table_contains (all_mids, mid)) {
+          gst_sdp_media_free (media);
+          g_set_error (error, GST_WEBRTC_ERROR,
+              GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+              "Duplicate mid %s when creating offer", mid);
+          goto cancel_offer;
         }
 
+        g_hash_table_insert (all_mids, g_strdup (mid), NULL);
+
+        if (bundled_mids)
+          g_string_append_printf (bundled_mids, " %s", mid);
+
         gst_sdp_message_add_media (ret, media);
         media_idx++;
 
@@ -2683,7 +3438,7 @@
       GstSDPMedia media = { 0, };
       gst_sdp_media_init (&media);
       if (_add_data_channel_offer (webrtc, ret, &media, bundled_mids, 0,
-              bundle_ufrag, bundle_pwd)) {
+              bundle_ufrag, bundle_pwd, all_mids)) {
         gst_sdp_message_add_media (ret, &media);
         media_idx++;
       } else {
@@ -2693,20 +3448,107 @@
     }
   }
 
-  /* add any extra streams */
+  /* First, go over all transceivers and gather existing mids */
   for (i = 0; i < webrtc->priv->transceivers->len; i++) {
     GstWebRTCRTPTransceiver *trans;
-    GstSDPMedia media = { 0, };
 
     trans = g_ptr_array_index (webrtc->priv->transceivers, i);
 
-    /* don't add transceivers twice */
     if (g_list_find (seen_transceivers, trans))
       continue;
 
-    /* don't add stopped transceivers */
-    if (trans->stopped)
-      continue;
+    if (trans->mid) {
+      if (g_hash_table_contains (all_mids, trans->mid)) {
+        g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+            "Duplicate mid %s when creating offer", trans->mid);
+        goto cancel_offer;
+      }
+
+      g_hash_table_insert (all_mids, g_strdup (trans->mid), NULL);
+    }
+  }
+
+
+  /* add any extra streams */
+  for (;;) {
+    GstWebRTCRTPTransceiver *trans = NULL;
+    GstSDPMedia media = { 0, };
+
+    /* First find a transceiver requesting this m-line */
+    trans = _find_transceiver_for_mline (webrtc, media_idx);
+
+    if (trans) {
+      /* We can't have seen it already, because it is locked to this line */
+      g_assert (!g_list_find (seen_transceivers, trans));
+      seen_transceivers = g_list_prepend (seen_transceivers, trans);
+    } else {
+      /* Otherwise find a free transceiver */
+      for (i = 0; i < webrtc->priv->transceivers->len; i++) {
+        WebRTCTransceiver *wtrans;
+
+        trans = g_ptr_array_index (webrtc->priv->transceivers, i);
+        wtrans = WEBRTC_TRANSCEIVER (trans);
+
+        /* don't add transceivers twice */
+        if (g_list_find (seen_transceivers, trans))
+          continue;
+
+        /* Ignore transceivers with a locked mline, as they would have been
+         * found above or will be used later */
+        if (wtrans->mline_locked)
+          continue;
+
+        seen_transceivers = g_list_prepend (seen_transceivers, trans);
+        /* don't add stopped transceivers */
+        if (trans->stopped) {
+          continue;
+        }
+
+        /* Otherwise take it */
+        break;
+      }
+
+      /* Stop if we got all transceivers */
+      if (i == webrtc->priv->transceivers->len) {
+
+        /* But try to add a data channel first, we do it here, because
+         * it can allow a locked m-line to be put after, so we need to
+         * do another iteration after.
+         */
+        if (_message_get_datachannel_index (ret) == G_MAXUINT) {
+          GstSDPMedia media = { 0, };
+          gst_sdp_media_init (&media);
+          if (_add_data_channel_offer (webrtc, ret, &media, bundled_mids, 0,
+                  bundle_ufrag, bundle_pwd, all_mids)) {
+            gst_sdp_message_add_media (ret, &media);
+            media_idx++;
+            continue;
+          } else {
+            gst_sdp_media_uninit (&media);
+          }
+        }
+
+        /* Verify that we didn't ignore any locked m-line transceivers */
+        for (i = 0; i < webrtc->priv->transceivers->len; i++) {
+          WebRTCTransceiver *wtrans;
+
+          trans = g_ptr_array_index (webrtc->priv->transceivers, i);
+          wtrans = WEBRTC_TRANSCEIVER (trans);
+          /* don't add transceivers twice */
+          if (g_list_find (seen_transceivers, trans))
+            continue;
+          g_assert (wtrans->mline_locked);
+
+          g_set_error (error, GST_WEBRTC_ERROR,
+              GST_WEBRTC_ERROR_INTERNAL_FAILURE,
+              "Tranceiver %" GST_PTR_FORMAT " with mid %s has locked mline %d"
+              " but the whole offer only has %u sections", trans, trans->mid,
+              trans->mline, media_idx);
+          goto cancel_offer;
+        }
+        break;
+      }
+    }
 
     gst_sdp_media_init (&media);
 
@@ -2717,9 +3559,11 @@
     GST_LOG_OBJECT (webrtc, "adding transceiver %" GST_PTR_FORMAT " at media "
         "index %u", trans, media_idx);
 
-    if (sdp_media_from_transceiver (webrtc, &media, trans,
-            GST_WEBRTC_SDP_TYPE_OFFER, media_idx, bundled_mids, 0, bundle_ufrag,
-            bundle_pwd, reserved_pts)) {
+    if (sdp_media_from_transceiver (webrtc, &media, trans, media_idx,
+            bundled_mids, 0, bundle_ufrag, bundle_pwd, reserved_pts, all_mids,
+            error)) {
+      /* as per JSEP, a=rtcp-mux-only is only added for new streams */
+      gst_sdp_media_add_attribute (&media, "rtcp-mux-only", "");
       gst_sdp_message_add_media (ret, &media);
       media_idx++;
     } else {
@@ -2728,26 +3572,19 @@
 
     if (webrtc->bundle_policy == GST_WEBRTC_BUNDLE_POLICY_NONE) {
       g_array_free (reserved_pts, TRUE);
+      reserved_pts = NULL;
     }
-    seen_transceivers = g_list_prepend (seen_transceivers, trans);
+
+    if (*error)
+      goto cancel_offer;
   }
 
   if (webrtc->bundle_policy != GST_WEBRTC_BUNDLE_POLICY_NONE) {
     g_array_free (reserved_pts, TRUE);
+    reserved_pts = NULL;
   }
 
-  /* add a data channel if exists and not renegotiated */
-  if (_message_get_datachannel_index (ret) == G_MAXUINT) {
-    GstSDPMedia media = { 0, };
-    gst_sdp_media_init (&media);
-    if (_add_data_channel_offer (webrtc, ret, &media, bundled_mids, 0,
-            bundle_ufrag, bundle_pwd)) {
-      gst_sdp_message_add_media (ret, &media);
-      media_idx++;
-    } else {
-      gst_sdp_media_uninit (&media);
-    }
-  }
+  webrtc->priv->max_sink_pad_serial = MAX (webrtc->priv->max_sink_pad_serial,
+      media_idx);
 
   g_assert (media_idx == gst_sdp_message_medias_len (ret));
 
@@ -2756,18 +3593,11 @@
     gst_sdp_message_add_attribute (ret, "group", mids);
     g_free (mids);
+    bundled_mids = NULL;
   }
 
-  if (bundle_ufrag)
-    g_free (bundle_ufrag);
-
-  if (bundle_pwd)
-    g_free (bundle_pwd);
-
   /* FIXME: pre-emptively setup receiving elements when needed */
 
-  g_list_free (seen_transceivers);
-
   if (webrtc->priv->last_generated_answer)
     gst_webrtc_session_description_free (webrtc->priv->last_generated_answer);
   webrtc->priv->last_generated_answer = NULL;
@@ -2780,7 +3610,29 @@
         gst_webrtc_session_description_new (GST_WEBRTC_SDP_TYPE_OFFER, copy);
   }
 
+out:
+  if (reserved_pts)
+    g_array_free (reserved_pts, TRUE);
+
+  g_hash_table_unref (all_mids);
+
+  g_list_free (seen_transceivers);
+
+  if (bundle_ufrag)
+    g_free (bundle_ufrag);
+
+  if (bundle_pwd)
+    g_free (bundle_pwd);
+
+  if (bundled_mids)
+    g_string_free (bundled_mids, TRUE);
+
   return ret;
+
+cancel_offer:
+  gst_sdp_message_free (ret);
+  ret = NULL;
+  goto out;
 }
 
 static void
@@ -2882,6 +3734,23 @@
   }
 }
 
+static gboolean
+_update_transceiver_kind_from_caps (GstWebRTCRTPTransceiver * trans,
+    const GstCaps * caps)
+{
+  GstWebRTCKind kind = webrtc_kind_from_caps (caps);
+
+  if (trans->kind == kind)
+    return TRUE;
+
+  if (trans->kind == GST_WEBRTC_KIND_UNKNOWN) {
+    trans->kind = kind;
+    return TRUE;
+  } else {
+    return FALSE;
+  }
+}
+
 static void
 _get_rtx_target_pt_and_ssrc_from_caps (GstCaps * answer_caps, gint * target_pt,
     guint * target_ssrc)
@@ -2894,7 +3763,8 @@
 
 /* TODO: use the options argument */
 static GstSDPMessage *
-_create_answer_task (GstWebRTCBin * webrtc, const GstStructure * options)
+_create_answer_task (GstWebRTCBin * webrtc, const GstStructure * options,
+    GError ** error)
 {
   GstSDPMessage *ret = NULL;
   const GstWebRTCSessionDescription *pending_remote =
@@ -2909,12 +3779,13 @@
   GstSDPMessage *last_answer = _get_latest_self_generated_sdp (webrtc);
 
   if (!webrtc->pending_remote_description) {
-    GST_ERROR_OBJECT (webrtc,
+    g_set_error_literal (error, GST_WEBRTC_ERROR,
+        GST_WEBRTC_ERROR_INVALID_STATE,
         "Asked to create an answer without a remote description");
     return NULL;
   }
 
-  if (!_parse_bundle (pending_remote->sdp, &bundled))
+  if (!_parse_bundle (pending_remote->sdp, &bundled, error))
     goto out;
 
   if (bundled) {
@@ -2922,8 +3793,8 @@
     guint bundle_media_index;
 
     if (!_get_bundle_index (pending_remote->sdp, bundled, &bundle_idx)) {
-      GST_ERROR_OBJECT (webrtc, "Bundle tag is %s but no media found matching",
-          bundled[0]);
+      g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR,
+          "Bundle tag is %s but no media found matching", bundled[0]);
       goto out;
     }
 
@@ -2931,7 +3802,7 @@
     bundled_mids = g_string_new ("BUNDLE");
   }
 
-  if (last_answer && _parse_bundle (last_answer, &last_bundle)
+  if (last_answer && _parse_bundle (last_answer, &last_bundle, NULL)
       && last_bundle && last_bundle[0]
       && _get_bundle_index (last_answer, last_bundle, &bundle_media_index)) {
     bundle_ufrag =
@@ -3086,26 +3957,33 @@
             gst_sdp_message_get_media (last_answer, i);
         const gchar *last_mid =
             gst_sdp_media_get_attribute_val (last_media, "mid");
+        GstCaps *current_caps;
 
         /* FIXME: assumes no shenanigans with recycling transceivers */
         g_assert (g_strcmp0 (mid, last_mid) == 0);
 
-        if (!answer_caps
-            && (rtp_trans->direction ==
-                GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV
-                || rtp_trans->direction ==
-                GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY))
-          answer_caps =
-              _find_codec_preferences (webrtc, rtp_trans, GST_PAD_SINK, i);
-        if (!answer_caps
-            && (rtp_trans->direction ==
-                GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV
-                || rtp_trans->direction ==
-                GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY))
-          answer_caps =
-              _find_codec_preferences (webrtc, rtp_trans, GST_PAD_SRC, i);
-        if (!answer_caps)
-          answer_caps = _rtp_caps_from_media (last_media);
+        current_caps = _find_codec_preferences (webrtc, rtp_trans, i, error);
+        if (*error) {
+          gst_caps_unref (offer_caps);
+          goto rejected;
+        }
+        if (!current_caps)
+          current_caps = _rtp_caps_from_media (last_media);
+
+        if (current_caps) {
+          answer_caps = gst_caps_intersect (offer_caps, current_caps);
+          if (gst_caps_is_empty (answer_caps)) {
+            GST_WARNING_OBJECT (webrtc, "Caps from offer for m-line %d (%"
+                GST_PTR_FORMAT ") don't intersect with caps from codec"
+                " preferences and transceiver %" GST_PTR_FORMAT, i, offer_caps,
+                current_caps);
+            gst_caps_unref (current_caps);
+            gst_caps_unref (answer_caps);
+            gst_caps_unref (offer_caps);
+            goto rejected;
+          }
+          gst_caps_unref (current_caps);
+        }
 
        /* XXX: In theory we're meant to use the sendrecv formats for the
        *
inactive direction however we don't know what that may be and would @@ -3126,8 +4004,11 @@ continue; } - trans_caps = - _find_codec_preferences (webrtc, rtp_trans, GST_PAD_SINK, j); + trans_caps = _find_codec_preferences (webrtc, rtp_trans, j, error); + if (*error) { + gst_caps_unref (offer_caps); + goto rejected; + } GST_TRACE_OBJECT (webrtc, "trying to compare %" GST_PTR_FORMAT " and %" GST_PTR_FORMAT, offer_caps, trans_caps); @@ -3137,25 +4018,19 @@ * that we cannot actually support */ if (trans_caps) { answer_caps = gst_caps_intersect (offer_caps, trans_caps); - if (answer_caps && !gst_caps_is_empty (answer_caps)) { - GST_LOG_OBJECT (webrtc, - "found compatible transceiver %" GST_PTR_FORMAT - " for offer media %u", rtp_trans, i); - if (trans_caps) - gst_caps_unref (trans_caps); - break; - } else { - if (answer_caps) { - gst_caps_unref (answer_caps); - answer_caps = NULL; + gst_caps_unref (trans_caps); + if (answer_caps) { + if (!gst_caps_is_empty (answer_caps)) { + GST_LOG_OBJECT (webrtc, + "found compatible transceiver %" GST_PTR_FORMAT + " for offer media %u", rtp_trans, i); + break; } - if (trans_caps) - gst_caps_unref (trans_caps); - rtp_trans = NULL; + gst_caps_unref (answer_caps); + answer_caps = NULL; } - } else { - rtp_trans = NULL; } + rtp_trans = NULL; } } @@ -3164,33 +4039,66 @@ g_assert (answer_caps != NULL); } else { /* if no transceiver, then we only receive that stream and respond with - * the exact same caps */ - /* FIXME: how to validate that subsequent elements can actually receive - * this payload/format */ + * the intersection with the transceivers codec preferences caps */ answer_dir = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY; - answer_caps = gst_caps_ref (offer_caps); } - if (gst_caps_is_empty (answer_caps)) { - GST_WARNING_OBJECT (webrtc, "Could not create caps for media"); - if (rtp_trans) - gst_object_unref (rtp_trans); - gst_caps_unref (answer_caps); - goto rejected; - } + if (!rtp_trans) { + GstCaps *trans_caps; + GstWebRTCKind 
kind = GST_WEBRTC_KIND_UNKNOWN; - seen_transceivers = g_list_prepend (seen_transceivers, rtp_trans); + if (g_strcmp0 (gst_sdp_media_get_media (offer_media), "audio") == 0) + kind = GST_WEBRTC_KIND_AUDIO; + else if (g_strcmp0 (gst_sdp_media_get_media (offer_media), + "video") == 0) + kind = GST_WEBRTC_KIND_VIDEO; + else + GST_LOG_OBJECT (webrtc, "Unknown media kind %s", + GST_STR_NULL (gst_sdp_media_get_media (offer_media))); - if (!rtp_trans) { - trans = _create_webrtc_transceiver (webrtc, answer_dir, i); + trans = _create_webrtc_transceiver (webrtc, answer_dir, i, kind, NULL); rtp_trans = GST_WEBRTC_RTP_TRANSCEIVER (trans); GST_LOG_OBJECT (webrtc, "Created new transceiver %" GST_PTR_FORMAT - " for mline %u", trans, i); + " for mline %u with media kind %d", trans, i, kind); + + trans_caps = _find_codec_preferences (webrtc, rtp_trans, i, error); + if (*error) { + gst_caps_unref (offer_caps); + goto rejected; + } + + GST_TRACE_OBJECT (webrtc, "trying to compare %" GST_PTR_FORMAT + " and %" GST_PTR_FORMAT, offer_caps, trans_caps); + + /* FIXME: technically this is a little overreaching as some fields we + * can deal with not having and/or we may have unrecognized fields + * that we cannot actually support */ + if (trans_caps) { + answer_caps = gst_caps_intersect (offer_caps, trans_caps); + gst_caps_unref (trans_caps); + } else { + answer_caps = gst_caps_ref (offer_caps); + } } else { trans = WEBRTC_TRANSCEIVER (rtp_trans); } + seen_transceivers = g_list_prepend (seen_transceivers, rtp_trans); + + if (gst_caps_is_empty (answer_caps)) { + GST_WARNING_OBJECT (webrtc, "Could not create caps for media"); + gst_caps_unref (answer_caps); + gst_caps_unref (offer_caps); + goto rejected; + } + + if (!_update_transceiver_kind_from_caps (rtp_trans, answer_caps)) + GST_WARNING_OBJECT (webrtc, + "Trying to change transceiver %d kind from %d to %d", + rtp_trans->mline, rtp_trans->kind, + webrtc_kind_from_caps (answer_caps)); + if (!trans->do_nack) { + answer_caps = 
gst_caps_make_writable (answer_caps); for (k = 0; k < gst_caps_get_size (answer_caps); k++) { @@ -3227,6 +4135,7 @@ if (answer_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE) { GST_WARNING_OBJECT (webrtc, "Could not intersect offer direction with " "transceiver direction"); + gst_caps_unref (offer_caps); goto rejected; } _media_replace_direction (media, answer_dir); @@ -3258,10 +4167,16 @@ if (0) { rejected: - GST_INFO_OBJECT (webrtc, "media %u rejected", i); + if (error && *error) + GST_INFO_OBJECT (webrtc, "media %u rejected: %s", i, (*error)->message); + else + GST_INFO_OBJECT (webrtc, "media %u rejected", i); gst_sdp_media_free (media); gst_sdp_media_copy (offer_media, &media); gst_sdp_media_set_port_info (media, 0, 0); + /* Clear error here as it is not propagated to the caller and the media + * is just skipped, i.e. more iterations are going to happen. */ + g_clear_error (error); } gst_sdp_message_add_media (ret, media); gst_sdp_media_free (media); @@ -3308,24 +4223,24 @@ struct create_sdp { GstStructure *options; - GstPromise *promise; GstWebRTCSDPType type; }; -static void +static GstStructure * _create_sdp_task (GstWebRTCBin * webrtc, struct create_sdp *data) { GstWebRTCSessionDescription *desc = NULL; GstSDPMessage *sdp = NULL; GstStructure *s = NULL; + GError *error = NULL; GST_INFO_OBJECT (webrtc, "creating %s sdp with options %" GST_PTR_FORMAT, gst_webrtc_sdp_type_to_string (data->type), data->options); if (data->type == GST_WEBRTC_SDP_TYPE_OFFER) - sdp = _create_offer_task (webrtc, data->options); + sdp = _create_offer_task (webrtc, data->options, &error); else if (data->type == GST_WEBRTC_SDP_TYPE_ANSWER) - sdp = _create_answer_task (webrtc, data->options); + sdp = _create_answer_task (webrtc, data->options, &error); else { g_assert_not_reached (); goto out; @@ -3336,15 +4251,21 @@ s = gst_structure_new ("application/x-gst-promise", gst_webrtc_sdp_type_to_string (data->type), GST_TYPE_WEBRTC_SESSION_DESCRIPTION, desc, NULL); + } else { + 
g_warn_if_fail (error != NULL); + GST_WARNING_OBJECT (webrtc, "returning error: %s", + error ? error->message : "Unknown"); + s = gst_structure_new ("application/x-gst-promise", + "error", G_TYPE_ERROR, error, NULL); + g_clear_error (&error); } out: - PC_UNLOCK (webrtc); - gst_promise_reply (data->promise, s); - PC_LOCK (webrtc); if (desc) gst_webrtc_session_description_free (desc); + + return s; } static void @@ -3352,7 +4273,6 @@ { if (data->options) gst_structure_free (data->options); - gst_promise_unref (data->promise); g_free (data); } @@ -3364,16 +4284,14 @@ if (options) data->options = gst_structure_copy (options); - data->promise = gst_promise_ref (promise); data->type = GST_WEBRTC_SDP_TYPE_OFFER; if (!gst_webrtc_bin_enqueue_task (webrtc, (GstWebRTCBinFunc) _create_sdp_task, data, (GDestroyNotify) _free_create_sdp_data, promise)) { GError *error = - g_error_new (GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_CLOSED, + g_error_new (GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INVALID_STATE, "Could not create offer. webrtcbin is closed"); - GstStructure *s = - gst_structure_new ("application/x-gstwebrtcbin-promise-error", + GstStructure *s = gst_structure_new ("application/x-gst-promise", "error", G_TYPE_ERROR, error, NULL); gst_promise_reply (promise, s); @@ -3390,16 +4308,14 @@ if (options) data->options = gst_structure_copy (options); - data->promise = gst_promise_ref (promise); data->type = GST_WEBRTC_SDP_TYPE_ANSWER; if (!gst_webrtc_bin_enqueue_task (webrtc, (GstWebRTCBinFunc) _create_sdp_task, data, (GDestroyNotify) _free_create_sdp_data, promise)) { GError *error = - g_error_new (GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_CLOSED, + g_error_new (GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INVALID_STATE, "Could not create answer. 
webrtcbin is closed."); - GstStructure *s = - gst_structure_new ("application/x-gstwebrtcbin-promise-error", + GstStructure *s = gst_structure_new ("application/x-gst-promise", "error", G_TYPE_ERROR, error, NULL); gst_promise_reply (promise, s); @@ -3410,17 +4326,25 @@ static GstWebRTCBinPad * _create_pad_for_sdp_media (GstWebRTCBin * webrtc, GstPadDirection direction, - guint media_idx) + GstWebRTCRTPTransceiver * trans, guint serial) { GstWebRTCBinPad *pad; gchar *pad_name; + if (direction == GST_PAD_SINK) { + if (serial == G_MAXUINT) + serial = webrtc->priv->max_sink_pad_serial++; + } else { + serial = trans->mline; + } + pad_name = g_strdup_printf ("%s_%u", direction == GST_PAD_SRC ? "src" : "sink", - media_idx); + serial); pad = gst_webrtc_bin_pad_new (pad_name, direction); g_free (pad_name); - pad->mlineindex = media_idx; + + pad->trans = gst_object_ref (trans); return pad; } @@ -3452,81 +4376,191 @@ return ret; } +static GstElement * +_build_fec_encoder (GstWebRTCBin * webrtc, WebRTCTransceiver * trans) +{ + GstElement *ret = NULL; + GstElement *prev = NULL; + guint ulpfec_pt = 0; + guint red_pt = 0; + GstPad *sinkpad = NULL; + GstWebRTCRTPTransceiver *rtp_trans = GST_WEBRTC_RTP_TRANSCEIVER (trans); + + if (trans->stream) { + ulpfec_pt = + transport_stream_get_pt (trans->stream, "ULPFEC", rtp_trans->mline); + red_pt = transport_stream_get_pt (trans->stream, "RED", rtp_trans->mline); + } + + if (ulpfec_pt || red_pt) + ret = gst_bin_new (NULL); + + if (ulpfec_pt) { + GstElement *fecenc = gst_element_factory_make ("rtpulpfecenc", NULL); + GstCaps *caps = transport_stream_get_caps_for_pt (trans->stream, ulpfec_pt); + + GST_DEBUG_OBJECT (webrtc, + "Creating ULPFEC encoder for mline %u with pt %d", rtp_trans->mline, + ulpfec_pt); + + gst_bin_add (GST_BIN (ret), fecenc); + sinkpad = gst_element_get_static_pad (fecenc, "sink"); + g_object_set (fecenc, "pt", ulpfec_pt, "percentage", + trans->fec_percentage, NULL); + + g_object_bind_property (rtp_trans, 
"fec-percentage", fecenc, "percentage", + G_BINDING_BIDIRECTIONAL); + + if (caps && !gst_caps_is_empty (caps)) { + const GstStructure *s = gst_caps_get_structure (caps, 0); + const gchar *media = gst_structure_get_string (s, "media"); + + if (!g_strcmp0 (media, "video")) + g_object_set (fecenc, "multipacket", TRUE, NULL); + } + + prev = fecenc; + } + + if (red_pt) { + GstElement *redenc = gst_element_factory_make ("rtpredenc", NULL); + + GST_DEBUG_OBJECT (webrtc, "Creating RED encoder for mline %u with pt %d", + rtp_trans->mline, red_pt); + + gst_bin_add (GST_BIN (ret), redenc); + if (prev) + gst_element_link (prev, redenc); + else + sinkpad = gst_element_get_static_pad (redenc, "sink"); + + g_object_set (redenc, "pt", red_pt, "allow-no-red-blocks", TRUE, NULL); + + prev = redenc; + } + + if (sinkpad) { + GstPad *ghost = gst_ghost_pad_new ("sink", sinkpad); + gst_object_unref (sinkpad); + gst_element_add_pad (ret, ghost); + } + + if (prev) { + GstPad *srcpad = gst_element_get_static_pad (prev, "src"); + GstPad *ghost = gst_ghost_pad_new ("src", srcpad); + gst_object_unref (srcpad); + gst_element_add_pad (ret, ghost); + } + + return ret; +} + + static GstPad * _connect_input_stream (GstWebRTCBin * webrtc, GstWebRTCBinPad * pad) { /* * Not-bundle case: * - * ,-------------------------webrtcbin-------------------------, - * ; ; - * ; ,-------rtpbin-------, ,--transport_send_%u--, ; - * ; ; send_rtp_src_%u o---o rtp_sink ; ; - * ; ; ; ; ; ; - * ; ; send_rtcp_src_%u o---o rtcp_sink ; ; - * ; sink_%u ; ; '---------------------' ; - * o----------o send_rtp_sink_%u ; ; - * ; '--------------------' ; - * '--------------------- -------------------------------------' + * ,--------------------------------------------webrtcbin--------------------------------------------, + * ; ; + * ; ,-------rtpbin-------, ,--transport_send_%u--, ; + * ; ; send_rtp_src_%u o---o rtp_sink ; ; + * ; ,---clocksync---, ; ; ; ; ; + * ; ; ; ; send_rtcp_src_%u o---o rtcp_sink ; ; + * ; sink_%u ; ; 
,---fec encoder---, ; ; '---------------------' ; + * o---------o sink src o-o sink src o--o send_rtp_sink_%u ; ; + * ; '---------------' ,-----------------, '--------------------' ; + * '-------------------------------------------------------------------------------------------------' */ /* * Bundle case: - * ,--------------------------------webrtcbin--------------------------------, - * ; ; - * ; ,-------rtpbin-------, ,--transport_send_%u--, ; - * ; ; send_rtp_src_%u o---o rtp_sink ; ; - * ; ; ; ; ; ; - * ; ; send_rtcp_src_%u o---o rtcp_sink ; ; - * ; sink_%u ,---funnel---, ; ; '---------------------' ; - * o---------o sink_%u ; ; ; ; - * ; sink_%u ; src o-o send_rtp_sink_%u ; ; - * o---------o sink_%u ; ; ; ; - * ; '------------' '--------------------' ; - * '-------------------------------------------------------------------------' + * ,-----------------------------------------------------webrtcbin---------------------------------------------------, + * ; ; + * ; ,-------rtpbin-------, ,--transport_send_%u--, ; + * ; ; send_rtp_src_%u o---o rtp_sink ; ; + * ; ; ; ; ; ; + * ; sink_%u ,---clocksync---, ,---fec encoder---, ,---funnel---, ; send_rtcp_src_%u o---o rtcp_sink ; ; + * o----------o sink src o-o sink src o--o sink_%u ; ; ; '---------------------' ; + * ; '---------------' ,-----------------, ; ; ; ; ; + * ; ; src o-o send_rtp_sink_%u ; ; + * ; sink_%u ,---clocksync---, ,---fec encoder---, ; ; ; ; ; + * o----------o sink src o-o sink src o--o sink%u ; '--------------------' ; + * ; '---------------' ,-----------------, '------------' ; + * '-----------------------------------------------------------------------------------------------------------------' */ GstPadTemplate *rtp_templ; - GstPad *rtp_sink; + GstPad *rtp_sink, *sinkpad, *srcpad; gchar *pad_name; WebRTCTransceiver *trans; + GstElement *clocksync; + GstElement *fec_encoder; g_return_val_if_fail (pad->trans != NULL, NULL); - GST_INFO_OBJECT (pad, "linking input stream %u", pad->mlineindex); - 
trans = WEBRTC_TRANSCEIVER (pad->trans); + GST_INFO_OBJECT (pad, "linking input stream %u", pad->trans->mline); + g_assert (trans->stream); + clocksync = gst_element_factory_make ("clocksync", NULL); + g_object_set (clocksync, "sync", TRUE, NULL); + gst_bin_add (GST_BIN (webrtc), clocksync); + gst_element_sync_state_with_parent (clocksync); + + srcpad = gst_element_get_static_pad (clocksync, "src"); + sinkpad = gst_element_get_static_pad (clocksync, "sink"); + + if ((fec_encoder = _build_fec_encoder (webrtc, trans))) { + GstPad *fec_sink; + + gst_bin_add (GST_BIN (webrtc), fec_encoder); + gst_element_sync_state_with_parent (fec_encoder); + + fec_sink = gst_element_get_static_pad (fec_encoder, "sink"); + gst_pad_link (srcpad, fec_sink); + gst_object_unref (srcpad); + gst_object_unref (fec_sink); + srcpad = gst_element_get_static_pad (fec_encoder, "src"); + } + if (!webrtc->rtpfunnel) { rtp_templ = _find_pad_template (webrtc->rtpbin, GST_PAD_SINK, GST_PAD_REQUEST, "send_rtp_sink_%u"); g_assert (rtp_templ); - pad_name = g_strdup_printf ("send_rtp_sink_%u", pad->mlineindex); + pad_name = g_strdup_printf ("send_rtp_sink_%u", pad->trans->mline); rtp_sink = gst_element_request_pad (webrtc->rtpbin, rtp_templ, pad_name, NULL); g_free (pad_name); - gst_ghost_pad_set_target (GST_GHOST_PAD (pad), rtp_sink); + gst_pad_link (srcpad, rtp_sink); gst_object_unref (rtp_sink); - pad_name = g_strdup_printf ("send_rtp_src_%u", pad->mlineindex); + gst_ghost_pad_set_target (GST_GHOST_PAD (pad), sinkpad); + + pad_name = g_strdup_printf ("send_rtp_src_%u", pad->trans->mline); if (!gst_element_link_pads (GST_ELEMENT (webrtc->rtpbin), pad_name, GST_ELEMENT (trans->stream->send_bin), "rtp_sink")) g_warn_if_reached (); g_free (pad_name); } else { - gchar *pad_name = g_strdup_printf ("sink_%u", pad->mlineindex); + gchar *pad_name = g_strdup_printf ("sink_%u", pad->trans->mline); GstPad *funnel_sinkpad = - gst_element_get_request_pad (webrtc->rtpfunnel, pad_name); + 
gst_element_request_pad_simple (webrtc->rtpfunnel, pad_name); - gst_ghost_pad_set_target (GST_GHOST_PAD (pad), funnel_sinkpad); + gst_pad_link (srcpad, funnel_sinkpad); + gst_ghost_pad_set_target (GST_GHOST_PAD (pad), sinkpad); g_free (pad_name); gst_object_unref (funnel_sinkpad); } + gst_object_unref (srcpad); + gst_object_unref (sinkpad); + gst_element_sync_state_with_parent (GST_ELEMENT (trans->stream->send_bin)); return GST_PAD (pad); @@ -3631,7 +4665,7 @@ if (stream == NULL) stream = _find_ice_stream_for_session (webrtc, mlineindex); if (stream == NULL) { - GST_WARNING_OBJECT (webrtc, + GST_DEBUG_OBJECT (webrtc, "Unknown mline %u, dropping ICE candidates from SDP", mlineindex); return; } @@ -3712,6 +4746,7 @@ g_object_set (stream->rtxsend, "payload-type-map", pt_map, NULL); gst_structure_free (pt_map); + g_free (rtx_pt); } } @@ -3775,6 +4810,7 @@ } item.pt = pt; + item.media_idx = media_idx; gst_caps_unref (outcaps); g_array_append_val (stream->ptmap, item); @@ -3788,19 +4824,33 @@ _update_transceiver_from_sdp_media (GstWebRTCBin * webrtc, const GstSDPMessage * sdp, guint media_idx, TransportStream * stream, GstWebRTCRTPTransceiver * rtp_trans, - GStrv bundled, guint bundle_idx) + GStrv bundled, guint bundle_idx, GError ** error) { WebRTCTransceiver *trans = WEBRTC_TRANSCEIVER (rtp_trans); GstWebRTCRTPTransceiverDirection prev_dir = rtp_trans->current_direction; GstWebRTCRTPTransceiverDirection new_dir; const GstSDPMedia *media = gst_sdp_message_get_media (sdp, media_idx); GstWebRTCDTLSSetup new_setup; - gboolean new_rtcp_mux, new_rtcp_rsize; + gboolean new_rtcp_rsize; ReceiveState receive_state = RECEIVE_STATE_UNSET; int i; rtp_trans->mline = media_idx; + if (!g_strcmp0 (gst_sdp_media_get_media (media), "audio")) { + if (rtp_trans->kind == GST_WEBRTC_KIND_VIDEO) + GST_FIXME_OBJECT (webrtc, + "Updating video transceiver to audio, which isn't fully supported."); + rtp_trans->kind = GST_WEBRTC_KIND_AUDIO; + } + + if (!g_strcmp0 (gst_sdp_media_get_media (media), 
"video")) { + if (rtp_trans->kind == GST_WEBRTC_KIND_AUDIO) + GST_FIXME_OBJECT (webrtc, + "Updating audio transceiver to video, which isn't fully supported."); + rtp_trans->kind = GST_WEBRTC_KIND_VIDEO; + } + for (i = 0; i < gst_sdp_media_attributes_len (media); i++) { const GstSDPAttribute *attr = gst_sdp_media_get_attribute (media, i); @@ -3825,26 +4875,32 @@ local_setup = _get_dtls_setup_from_media (local_media); remote_setup = _get_dtls_setup_from_media (remote_media); new_setup = _get_final_setup (local_setup, remote_setup); - if (new_setup == GST_WEBRTC_DTLS_SETUP_NONE) + if (new_setup == GST_WEBRTC_DTLS_SETUP_NONE) { + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Cannot intersect direction attributes for media %u", media_idx); return; + } local_dir = _get_direction_from_media (local_media); remote_dir = _get_direction_from_media (remote_media); new_dir = _get_final_direction (local_dir, remote_dir); - - if (new_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE) + if (new_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE) { + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Cannot intersect dtls setup attributes for media %u", media_idx); return; + } if (prev_dir != GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE && new_dir != GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_INACTIVE && prev_dir != new_dir) { - GST_FIXME_OBJECT (webrtc, "implement transceiver direction changes"); + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE, + "transceiver direction changes are not implemented. 
Media %u", + media_idx); return; } if (!bundled || bundle_idx == media_idx) { - new_rtcp_mux = _media_has_attribute_key (local_media, "rtcp-mux") - && _media_has_attribute_key (remote_media, "rtcp-mux"); new_rtcp_rsize = _media_has_attribute_key (local_media, "rtcp-rsize") && _media_has_attribute_key (remote_media, "rtcp-rsize"); @@ -3857,8 +4913,6 @@ g_object_unref (session); } } - - g_object_set (stream, "rtcp-mux", new_rtcp_mux, NULL); } } @@ -3919,18 +4973,16 @@ if (new_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY || new_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV) { GstWebRTCBinPad *pad = - _find_pad_for_mline (webrtc, GST_PAD_SINK, media_idx); + _find_pad_for_transceiver (webrtc, GST_PAD_SINK, rtp_trans); if (pad) { GST_DEBUG_OBJECT (webrtc, "found existing send pad %" GST_PTR_FORMAT " for transceiver %" GST_PTR_FORMAT, pad, trans); - g_assert (pad->trans == rtp_trans); - g_assert (pad->mlineindex == media_idx); gst_object_unref (pad); } else { GST_DEBUG_OBJECT (webrtc, "creating new send pad for transceiver %" GST_PTR_FORMAT, trans); - pad = _create_pad_for_sdp_media (webrtc, GST_PAD_SINK, media_idx); - pad->trans = gst_object_ref (rtp_trans); + pad = _create_pad_for_sdp_media (webrtc, GST_PAD_SINK, rtp_trans, + G_MAXUINT); _connect_input_stream (webrtc, pad); _add_pad (webrtc, pad); } @@ -3938,18 +4990,16 @@ if (new_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY || new_dir == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV) { GstWebRTCBinPad *pad = - _find_pad_for_mline (webrtc, GST_PAD_SRC, media_idx); + _find_pad_for_transceiver (webrtc, GST_PAD_SRC, rtp_trans); if (pad) { GST_DEBUG_OBJECT (webrtc, "found existing receive pad %" GST_PTR_FORMAT " for transceiver %" GST_PTR_FORMAT, pad, trans); - g_assert (pad->trans == rtp_trans); - g_assert (pad->mlineindex == media_idx); gst_object_unref (pad); } else { GST_DEBUG_OBJECT (webrtc, "creating new receive pad for transceiver %" GST_PTR_FORMAT, trans); - pad = _create_pad_for_sdp_media 
(webrtc, GST_PAD_SRC, media_idx); - pad->trans = gst_object_ref (rtp_trans); + pad = _create_pad_for_sdp_media (webrtc, GST_PAD_SRC, rtp_trans, + G_MAXUINT); if (!trans->stream) { TransportStream *item; @@ -4036,7 +5086,8 @@ static void _update_data_channel_from_sdp_media (GstWebRTCBin * webrtc, - const GstSDPMessage * sdp, guint media_idx, TransportStream * stream) + const GstSDPMessage * sdp, guint media_idx, TransportStream * stream, + GError ** error) { const GstSDPMedia *local_media, *remote_media; GstWebRTCDTLSSetup local_setup, remote_setup, new_setup; @@ -4055,18 +5106,25 @@ local_setup = _get_dtls_setup_from_media (local_media); remote_setup = _get_dtls_setup_from_media (remote_media); new_setup = _get_final_setup (local_setup, remote_setup); - if (new_setup == GST_WEBRTC_DTLS_SETUP_NONE) + if (new_setup == GST_WEBRTC_DTLS_SETUP_NONE) { + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Cannot intersect dtls setup for media %u", media_idx); return; + } /* data channel is always rtcp-muxed to avoid generating ICE candidates * for RTCP */ - g_object_set (stream, "rtcp-mux", TRUE, "dtls-client", + g_object_set (stream, "dtls-client", new_setup == GST_WEBRTC_DTLS_SETUP_ACTIVE, NULL); local_port = _get_sctp_port_from_media (local_media); remote_port = _get_sctp_port_from_media (local_media); - if (local_port == -1 || remote_port == -1) + if (local_port == -1 || remote_port == -1) { + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Could not find sctp port for media %u (local %i, remote %i)", + media_idx, local_port, remote_port); return; + } if (0 == (local_max_size = _get_sctp_max_message_size_from_media (local_media))) @@ -4096,6 +5154,7 @@ remote_port, NULL); } + DC_LOCK (webrtc); for (i = 0; i < webrtc->priv->data_channels->len; i++) { WebRTCDataChannel *channel; @@ -4113,6 +5172,7 @@ webrtc_data_channel_start_negotiation (channel); } } + DC_UNLOCK (webrtc); stream->active = TRUE; @@ -4124,12 +5184,16 
@@ _find_compatible_unassociated_transceiver (GstWebRTCRTPTransceiver * p1, gconstpointer data) { + GstWebRTCKind kind = GPOINTER_TO_INT (data); + if (p1->mid) return FALSE; if (p1->mline != -1) return FALSE; if (p1->stopped) return FALSE; + if (p1->kind != GST_WEBRTC_KIND_UNKNOWN && p1->kind != kind) + return FALSE; return TRUE; } @@ -4161,7 +5225,7 @@ queue_srcpad = gst_element_get_static_pad (queue, "src"); pad_name = g_strdup_printf ("send_rtp_sink_%d", session_id); - rtp_sink = gst_element_get_request_pad (webrtc->rtpbin, pad_name); + rtp_sink = gst_element_request_pad_simple (webrtc->rtpbin, pad_name); g_free (pad_name); gst_pad_link (queue_srcpad, rtp_sink); gst_object_unref (queue_srcpad); @@ -4179,7 +5243,7 @@ static gboolean _update_transceivers_from_sdp (GstWebRTCBin * webrtc, SDPSource source, - GstWebRTCSessionDescription * sdp) + GstWebRTCSessionDescription * sdp, GError ** error) { int i; gboolean ret = FALSE; @@ -4190,14 +5254,14 @@ /* FIXME: With some peers, it's possible we could have * multiple bundles to deal with, although I've never seen one yet */ if (webrtc->bundle_policy != GST_WEBRTC_BUNDLE_POLICY_NONE) - if (!_parse_bundle (sdp->sdp, &bundled)) + if (!_parse_bundle (sdp->sdp, &bundled, error)) goto done; if (bundled) { if (!_get_bundle_index (sdp->sdp, bundled, &bundle_idx)) { - GST_ERROR_OBJECT (webrtc, "Bundle tag is %s but no media found matching", - bundled[0]); + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Bundle tag is %s but no media found matching", bundled[0]); goto done; } @@ -4248,14 +5312,25 @@ webrtc_transceiver_set_transport ((WebRTCTransceiver *) trans, stream); if (source == SDP_LOCAL && sdp->type == GST_WEBRTC_SDP_TYPE_OFFER && !trans) { - GST_ERROR ("State mismatch. Could not find local transceiver by mline."); + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "State mismatch. 
Could not find local transceiver by mline %u", i); goto done; } else { if (g_strcmp0 (gst_sdp_media_get_media (media), "audio") == 0 || g_strcmp0 (gst_sdp_media_get_media (media), "video") == 0) { + GstWebRTCKind kind = GST_WEBRTC_KIND_UNKNOWN; + /* No existing transceiver, find an unused one */ if (!trans) { - trans = _find_transceiver (webrtc, NULL, + if (g_strcmp0 (gst_sdp_media_get_media (media), "audio") == 0) + kind = GST_WEBRTC_KIND_AUDIO; + else if (g_strcmp0 (gst_sdp_media_get_media (media), "video") == 0) + kind = GST_WEBRTC_KIND_VIDEO; + else + GST_LOG_OBJECT (webrtc, "Unknown media kind %s", + GST_STR_NULL (gst_sdp_media_get_media (media))); + + trans = _find_transceiver (webrtc, GINT_TO_POINTER (kind), (FindTransceiverFunc) _find_compatible_unassociated_transceiver); } @@ -4265,15 +5340,21 @@ * that calls to setDirection will change the value. Nothing about * a default value when the transceiver is created internally */ if (!trans) { - trans = - GST_WEBRTC_RTP_TRANSCEIVER (_create_webrtc_transceiver (webrtc, - _get_direction_from_media (media), i)); + WebRTCTransceiver *t = _create_webrtc_transceiver (webrtc, + _get_direction_from_media (media), i, kind, NULL); + webrtc_transceiver_set_transport (t, stream); + trans = GST_WEBRTC_RTP_TRANSCEIVER (t); } _update_transceiver_from_sdp_media (webrtc, sdp->sdp, i, stream, - trans, bundled, bundle_idx); + trans, bundled, bundle_idx, error); + if (error && *error) + goto done; } else if (_message_media_is_datachannel (sdp->sdp, i)) { - _update_data_channel_from_sdp_media (webrtc, sdp->sdp, i, stream); + _update_data_channel_from_sdp_media (webrtc, sdp->sdp, i, stream, + error); + if (error && *error) + goto done; } else { GST_ERROR_OBJECT (webrtc, "Unknown media type in SDP at index %u", i); } @@ -4297,15 +5378,107 @@ return ret; } +static gboolean +check_transceivers_not_removed (GstWebRTCBin * webrtc, + GstWebRTCSessionDescription * previous, GstWebRTCSessionDescription * new) +{ + if (!previous) + return 
TRUE; + + if (gst_sdp_message_medias_len (previous->sdp) > + gst_sdp_message_medias_len (new->sdp)) + return FALSE; + + return TRUE; +} + +static gboolean +check_locked_mlines (GstWebRTCBin * webrtc, GstWebRTCSessionDescription * sdp, + GError ** error) +{ + guint i; + + for (i = 0; i < gst_sdp_message_medias_len (sdp->sdp); i++) { + const GstSDPMedia *media = gst_sdp_message_get_media (sdp->sdp, i); + GstWebRTCRTPTransceiver *rtp_trans; + WebRTCTransceiver *trans; + + rtp_trans = _find_transceiver_for_sdp_media (webrtc, sdp->sdp, i); + /* only look for matching mid */ + if (rtp_trans == NULL) + continue; + + trans = WEBRTC_TRANSCEIVER (rtp_trans); + + /* We only validate the locked mlines for now */ + if (!trans->mline_locked) + continue; + + if (rtp_trans->mline != i) { + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE, + "m-line with mid %s is at position %d, but was locked to %d, " + "rejecting", rtp_trans->mid, i, rtp_trans->mline); + return FALSE; + } + + if (rtp_trans->kind != GST_WEBRTC_KIND_UNKNOWN) { + if (!g_strcmp0 (gst_sdp_media_get_media (media), "audio") && + rtp_trans->kind != GST_WEBRTC_KIND_AUDIO) { + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE, + "m-line %d was locked to audio, but SDP has %s media", i, + gst_sdp_media_get_media (media)); + return FALSE; + } + + if (!g_strcmp0 (gst_sdp_media_get_media (media), "video") && + rtp_trans->kind != GST_WEBRTC_KIND_VIDEO) { + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE, + "m-line %d was locked to video, but SDP has %s media", i, + gst_sdp_media_get_media (media)); + return FALSE; + } + } + } + + return TRUE; +} + + struct set_description { - GstPromise *promise; SDPSource source; GstWebRTCSessionDescription *sdp; }; +static GstWebRTCSessionDescription * +get_previous_description (GstWebRTCBin * webrtc, SDPSource source, + GstWebRTCSDPType type) +{ + switch (type) { + case GST_WEBRTC_SDP_TYPE_OFFER: + case 
GST_WEBRTC_SDP_TYPE_PRANSWER: + case GST_WEBRTC_SDP_TYPE_ANSWER: + if (source == SDP_LOCAL) { + return webrtc->current_local_description; + } else { + return webrtc->current_remote_description; + } + case GST_WEBRTC_SDP_TYPE_ROLLBACK: + return NULL; + default: + /* other values mean memory corruption/uninitialized! */ + g_assert_not_reached (); + break; + } + + return NULL; +} + /* http://w3c.github.io/webrtc-pc/#set-description */ -static void +static GstStructure * _set_description_task (GstWebRTCBin * webrtc, struct set_description *sd) { GstWebRTCSignalingState new_signaling_state = webrtc->signaling_state; @@ -4329,29 +5502,34 @@ g_free (type_str); } - if (!validate_sdp (webrtc->signaling_state, sd->source, sd->sdp, &error)) { - GST_ERROR_OBJECT (webrtc, "%s", error->message); - g_clear_error (&error); - goto out; - } - - if (webrtc->priv->is_closed) { - GST_WARNING_OBJECT (webrtc, "we are closed"); + if (!validate_sdp (webrtc->signaling_state, sd->source, sd->sdp, &error)) goto out; - } if (webrtc->bundle_policy != GST_WEBRTC_BUNDLE_POLICY_NONE) - if (!_parse_bundle (sd->sdp->sdp, &bundled)) + if (!_parse_bundle (sd->sdp->sdp, &bundled, &error)) goto out; if (bundled) { if (!_get_bundle_index (sd->sdp->sdp, bundled, &bundle_idx)) { - GST_ERROR_OBJECT (webrtc, "Bundle tag is %s but no media found matching", - bundled[0]); + g_set_error (&error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Bundle tag is %s but no matching media found", bundled[0]); goto out; } } + if (!check_transceivers_not_removed (webrtc, + get_previous_description (webrtc, sd->source, sd->sdp->type), + sd->sdp)) { + g_set_error_literal (&error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "m=lines removed from the SDP. 
Processing a completely new connection " + "is not currently supported."); + goto out; + } + + if (!check_locked_mlines (webrtc, sd->sdp, &error)) + goto out; + switch (sd->sdp->type) { case GST_WEBRTC_SDP_TYPE_OFFER:{ if (sd->source == SDP_LOCAL) { @@ -4497,7 +5675,8 @@ GList *tmp; /* media modifications */ - _update_transceivers_from_sdp (webrtc, sd->source, sd->sdp); + if (!_update_transceivers_from_sdp (webrtc, sd->source, sd->sdp, &error)) + goto out; for (tmp = webrtc->priv->pending_sink_transceivers; tmp;) { GstWebRTCBinPad *pad = GST_WEBRTC_BIN_PAD (tmp->data); @@ -4511,13 +5690,13 @@ continue; } - if (pad->mlineindex >= gst_sdp_message_medias_len (sd->sdp->sdp)) { + if (pad->trans->mline >= gst_sdp_message_medias_len (sd->sdp->sdp)) { GST_DEBUG_OBJECT (pad, "not mentioned in this description. Skipping"); tmp = tmp->next; continue; } - media = gst_sdp_message_get_media (sd->sdp->sdp, pad->mlineindex); + media = gst_sdp_message_get_media (sd->sdp->sdp, pad->trans->mline); /* skip rejected media */ if (gst_sdp_media_get_port (media) == 0) { /* FIXME: arrange for an appropriate flow return */ @@ -4577,11 +5756,7 @@ if (split[0] && sscanf (split[0], "%u", &ssrc) && split[1] && g_str_has_prefix (split[1], "cname:")) { - SsrcMapItem ssrc_item; - - ssrc_item.media_idx = i; - ssrc_item.ssrc = ssrc; - g_array_append_val (item->remote_ssrcmap, ssrc_item); + g_ptr_array_add (item->remote_ssrcmap, ssrcmap_item_new (ssrc, i)); } g_strfreev (split); } @@ -4676,16 +5851,20 @@ out: g_strfreev (bundled); - PC_UNLOCK (webrtc); - gst_promise_reply (sd->promise, NULL); - PC_LOCK (webrtc); + if (error) { + GstStructure *s = gst_structure_new ("application/x-gst-promise", + "error", G_TYPE_ERROR, error, NULL); + GST_WARNING_OBJECT (webrtc, "returning error: %s", error->message); + g_clear_error (&error); + return s; + } else { + return NULL; + } } static void _free_set_description_data (struct set_description *sd) { - if (sd->promise) - gst_promise_unref (sd->promise); if 
(sd->sdp) gst_webrtc_session_description_free (sd->sdp); g_free (sd); @@ -4703,8 +5882,6 @@ goto bad_input; sd = g_new0 (struct set_description, 1); - if (promise != NULL) - sd->promise = gst_promise_ref (promise); sd->source = SDP_REMOTE; sd->sdp = gst_webrtc_session_description_copy (remote_sdp); @@ -4712,10 +5889,9 @@ (GstWebRTCBinFunc) _set_description_task, sd, (GDestroyNotify) _free_set_description_data, promise)) { GError *error = - g_error_new (GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_CLOSED, + g_error_new (GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INVALID_STATE, "Could not set remote description. webrtcbin is closed."); - GstStructure *s = - gst_structure_new ("application/x-gstwebrtcbin-promise-error", + GstStructure *s = gst_structure_new ("application/x-gst-promise", "error", G_TYPE_ERROR, error, NULL); gst_promise_reply (promise, s); @@ -4744,8 +5920,6 @@ goto bad_input; sd = g_new0 (struct set_description, 1); - if (promise != NULL) - sd->promise = gst_promise_ref (promise); sd->source = SDP_LOCAL; sd->sdp = gst_webrtc_session_description_copy (local_sdp); @@ -4753,10 +5927,9 @@ (GstWebRTCBinFunc) _set_description_task, sd, (GDestroyNotify) _free_set_description_data, promise)) { GError *error = - g_error_new (GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_CLOSED, + g_error_new (GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INVALID_STATE, "Could not set local description. 
webrtcbin is closed"); - GstStructure *s = - gst_structure_new ("application/x-gstwebrtcbin-promise-error", + GstStructure *s = gst_structure_new ("application/x-gst-promise", "error", G_TYPE_ERROR, error, NULL); gst_promise_reply (promise, s); @@ -4773,7 +5946,7 @@ } } -static void +static GstStructure * _add_ice_candidate_task (GstWebRTCBin * webrtc, IceCandidateItem * item) { if (!webrtc->current_local_description || !webrtc->current_remote_description) { @@ -4787,6 +5960,8 @@ } else { _add_ice_candidate (webrtc, item, FALSE); } + + return NULL; } static void @@ -4804,16 +5979,18 @@ item = g_new0 (IceCandidateItem, 1); item->mlineindex = mline; - if (!g_ascii_strncasecmp (attr, "a=candidate:", 12)) - item->candidate = g_strdup (attr); - else if (!g_ascii_strncasecmp (attr, "candidate:", 10)) - item->candidate = g_strdup_printf ("a=%s", attr); + if (attr && attr[0] != 0) { + if (!g_ascii_strncasecmp (attr, "a=candidate:", 12)) + item->candidate = g_strdup (attr); + else if (!g_ascii_strncasecmp (attr, "candidate:", 10)) + item->candidate = g_strdup_printf ("a=%s", attr); + } gst_webrtc_bin_enqueue_task (webrtc, (GstWebRTCBinFunc) _add_ice_candidate_task, item, (GDestroyNotify) _free_ice_candidate_item, NULL); } -static void +static GstStructure * _on_local_ice_candidate_task (GstWebRTCBin * webrtc) { gsize i; @@ -4823,7 +6000,7 @@ if (webrtc->priv->pending_local_ice_candidates->len == 0) { ICE_UNLOCK (webrtc); GST_LOG_OBJECT (webrtc, "No ICE candidates to process right now"); - return; /* Nothing to process */ + return NULL; /* Nothing to process */ } /* Take the array so we can process it all and free it later * without holding the lock @@ -4870,6 +6047,8 @@ } g_array_free (items, TRUE); + + return NULL; } static void @@ -4898,16 +6077,6 @@ } } -/* https://www.w3.org/TR/webrtc/#dfn-stats-selection-algorithm */ -static GstStructure * -_get_stats_from_selector (GstWebRTCBin * webrtc, gpointer selector) -{ - if (selector) - GST_FIXME_OBJECT (webrtc, "Implement 
stats selection"); - - return gst_structure_copy (webrtc->priv->stats); -} - struct get_stats { GstPad *pad; @@ -4925,28 +6094,14 @@ } /* https://www.w3.org/TR/webrtc/#dom-rtcpeerconnection-getstats() */ -static void +static GstStructure * _get_stats_task (GstWebRTCBin * webrtc, struct get_stats *stats) { - GstStructure *s; - gpointer selector = NULL; - - gst_webrtc_bin_update_stats (webrtc); - - if (stats->pad) { - GstWebRTCBinPad *wpad = GST_WEBRTC_BIN_PAD (stats->pad); - - if (wpad->trans) { - if (GST_PAD_DIRECTION (wpad) == GST_PAD_SRC) { - selector = wpad->trans->receiver; - } else { - selector = wpad->trans->sender; - } - } - } + /* Our selector is the pad, + * https://www.w3.org/TR/webrtc/#dfn-stats-selection-algorithm + */ - s = _get_stats_from_selector (webrtc, selector); - gst_promise_reply (stats->promise, s); + return gst_webrtc_bin_create_stats (webrtc, stats->pad); } static void @@ -4967,9 +6122,9 @@ if (!gst_webrtc_bin_enqueue_task (webrtc, (GstWebRTCBinFunc) _get_stats_task, stats, (GDestroyNotify) _free_get_stats, promise)) { GError *error = - g_error_new (GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_CLOSED, + g_error_new (GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_INVALID_STATE, "Could not retrieve statistics. 
webrtcbin is closed."); - GstStructure *s = gst_structure_new ("application/x-gst-promise-error", + GstStructure *s = gst_structure_new ("application/x-gst-promise", "error", G_TYPE_ERROR, error, NULL); gst_promise_reply (promise, s); @@ -4983,18 +6138,19 @@ GstWebRTCRTPTransceiverDirection direction, GstCaps * caps) { WebRTCTransceiver *trans; - GstWebRTCRTPTransceiver *rtp_trans; g_return_val_if_fail (direction != GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE, NULL); - trans = _create_webrtc_transceiver (webrtc, direction, -1); + PC_LOCK (webrtc); + + trans = + _create_webrtc_transceiver (webrtc, direction, -1, + webrtc_kind_from_caps (caps), caps); GST_LOG_OBJECT (webrtc, "Created new unassociated transceiver %" GST_PTR_FORMAT, trans); - rtp_trans = GST_WEBRTC_RTP_TRANSCEIVER (trans); - if (caps) - rtp_trans->codec_preferences = gst_caps_ref (caps); + PC_UNLOCK (webrtc); return gst_object_ref (trans); } @@ -5011,6 +6167,8 @@ GArray *arr = g_array_new (FALSE, TRUE, sizeof (GstWebRTCRTPTransceiver *)); int i; + PC_LOCK (webrtc); + g_array_set_clear_func (arr, (GDestroyNotify) _deref_and_unref); for (i = 0; i < webrtc->priv->transceivers->len; i++) { @@ -5019,6 +6177,7 @@ gst_object_ref (trans); g_array_append_val (arr, trans); } + PC_UNLOCK (webrtc); return arr; } @@ -5028,6 +6187,8 @@ { GstWebRTCRTPTransceiver *trans = NULL; + PC_LOCK (webrtc); + if (idx >= webrtc->priv->transceivers->len) { GST_ERROR_OBJECT (webrtc, "No transceiver for idx %d", idx); goto done; @@ -5037,18 +6198,25 @@ gst_object_ref (trans); done: + PC_UNLOCK (webrtc); return trans; } static gboolean gst_webrtc_bin_add_turn_server (GstWebRTCBin * webrtc, const gchar * uri) { + gboolean ret; + g_return_val_if_fail (GST_IS_WEBRTC_BIN (webrtc), FALSE); g_return_val_if_fail (uri != NULL, FALSE); GST_DEBUG_OBJECT (webrtc, "Adding turn server: %s", uri); - return gst_webrtc_ice_add_turn_server (webrtc->priv->ice, uri); + PC_LOCK (webrtc); + ret = gst_webrtc_ice_add_turn_server (webrtc->priv->ice, uri); + 
PC_UNLOCK (webrtc); + + return ret; } static gboolean @@ -5121,7 +6289,7 @@ if (webrtc->priv->sctp_transport) { /* Let transport be the connection's [[SctpTransport]] slot. * - * If the [[DataChannelId]] slot is not null, transport is in + * If the [[DataChannelId]] slot is not null, transport is in * connected state and [[DataChannelId]] is greater or equal to the * transport's [[MaxChannels]] slot, throw an OperationError. */ @@ -5136,6 +6304,7 @@ return NULL; PC_LOCK (webrtc); + DC_LOCK (webrtc); /* check if the id has been used already */ if (id != -1) { WebRTCDataChannel *channel = _find_data_channel_for_id (webrtc, id); @@ -5143,6 +6312,7 @@ GST_ELEMENT_WARNING (webrtc, LIBRARY, SETTINGS, ("Attempting to add a data channel with a duplicate ID: %i", id), NULL); + DC_UNLOCK (webrtc); PC_UNLOCK (webrtc); return NULL; } @@ -5155,6 +6325,7 @@ if (id == -1) { GST_ELEMENT_WARNING (webrtc, RESOURCE, NOT_FOUND, ("%s", "Failed to generate an identifier for a data channel"), NULL); + DC_UNLOCK (webrtc); PC_UNLOCK (webrtc); return NULL; } @@ -5165,24 +6336,31 @@ "max-retransmits", max_retransmits, "protocol", protocol, "negotiated", negotiated, "id", id, "priority", priority, NULL); - if (ret) { - gst_bin_add (GST_BIN (webrtc), ret->appsrc); - gst_bin_add (GST_BIN (webrtc), ret->appsink); - - gst_element_sync_state_with_parent (ret->appsrc); - gst_element_sync_state_with_parent (ret->appsink); - - ret = gst_object_ref (ret); - ret->webrtcbin = webrtc; - g_ptr_array_add (webrtc->priv->data_channels, ret); - webrtc_data_channel_link_to_sctp (ret, webrtc->priv->sctp_transport); - if (webrtc->priv->sctp_transport && - webrtc->priv->sctp_transport->association_established - && !ret->parent.negotiated) { - webrtc_data_channel_start_negotiation (ret); - } else { - _update_need_negotiation (webrtc); - } + if (!ret) { + DC_UNLOCK (webrtc); + PC_UNLOCK (webrtc); + return ret; + } + + gst_bin_add (GST_BIN (webrtc), ret->appsrc); + gst_bin_add (GST_BIN (webrtc), ret->appsink); + + 
gst_element_sync_state_with_parent (ret->appsrc); + gst_element_sync_state_with_parent (ret->appsink); + + ret = gst_object_ref (ret); + ret->webrtcbin = webrtc; + g_ptr_array_add (webrtc->priv->data_channels, ret); + DC_UNLOCK (webrtc); + + gst_webrtc_bin_update_sctp_priority (webrtc); + webrtc_data_channel_link_to_sctp (ret, webrtc->priv->sctp_transport); + if (webrtc->priv->sctp_transport && + webrtc->priv->sctp_transport->association_established + && !ret->parent.negotiated) { + webrtc_data_channel_start_negotiation (ret); + } else { + _update_need_negotiation (webrtc); } PC_UNLOCK (webrtc); @@ -5222,8 +6400,7 @@ media_idx = session_id; for (i = 0; i < stream->remote_ssrcmap->len; i++) { - SsrcMapItem *item = - &g_array_index (stream->remote_ssrcmap, SsrcMapItem, i); + SsrcMapItem *item = g_ptr_array_index (stream->remote_ssrcmap, i); if (item->ssrc == ssrc) { media_idx = item->media_idx; found_ssrc = TRUE; @@ -5290,31 +6467,39 @@ } } +static gboolean +_merge_structure (GQuark field_id, const GValue * value, gpointer user_data) +{ + GstStructure *s = user_data; + + gst_structure_id_set_value (s, field_id, value); + + return TRUE; +} + static GstElement * on_rtpbin_request_aux_sender (GstElement * rtpbin, guint session_id, GstWebRTCBin * webrtc) { TransportStream *stream; gboolean have_rtx = FALSE; - GstStructure *pt_map = NULL; GstElement *ret = NULL; - GstWebRTCRTPTransceiver *trans; stream = _find_transport_for_session (webrtc, session_id); - trans = _find_transceiver (webrtc, &session_id, - (FindTransceiverFunc) transceiver_match_for_mline); if (stream) - have_rtx = transport_stream_get_pt (stream, "RTX") != 0; + have_rtx = transport_stream_get_pt (stream, "RTX", -1) != 0; - GST_LOG_OBJECT (webrtc, "requesting aux sender for stream %" GST_PTR_FORMAT - " with transport %" GST_PTR_FORMAT " and pt map %" GST_PTR_FORMAT, stream, - trans, pt_map); + GST_LOG_OBJECT (webrtc, "requesting aux sender for stream %" GST_PTR_FORMAT, + stream); if (have_rtx) { GstElement 
*rtx; GstPad *pad; gchar *name; + GstStructure *merged_local_rtx_ssrc_map = + gst_structure_new_empty ("application/x-rtp-ssrc-map"); + guint i; if (stream->rtxsend) { GST_WARNING_OBJECT (webrtc, "rtprtxsend already created! rtpbin bug?!"); @@ -5327,9 +6512,18 @@ g_object_set (rtx, "max-size-packets", 500, NULL); _set_rtx_ptmap_from_stream (webrtc, stream); - if (WEBRTC_TRANSCEIVER (trans)->local_rtx_ssrc_map) - g_object_set (rtx, "ssrc-map", - WEBRTC_TRANSCEIVER (trans)->local_rtx_ssrc_map, NULL); + for (i = 0; i < webrtc->priv->transceivers->len; i++) { + WebRTCTransceiver *trans = + WEBRTC_TRANSCEIVER (g_ptr_array_index (webrtc->priv->transceivers, + i)); + + if (trans->stream == stream && trans->local_rtx_ssrc_map) + gst_structure_foreach (trans->local_rtx_ssrc_map, + _merge_structure, merged_local_rtx_ssrc_map); + } + + g_object_set (rtx, "ssrc-map", merged_local_rtx_ssrc_map, NULL); + gst_structure_free (merged_local_rtx_ssrc_map); gst_bin_add (GST_BIN (ret), rtx); @@ -5349,9 +6543,6 @@ } out: - if (pt_map) - gst_structure_free (pt_map); - return ret; } @@ -5363,20 +6554,43 @@ GstElement *prev = NULL; GstPad *sinkpad = NULL; TransportStream *stream; - gint red_pt = 0; gint rtx_pt = 0; + GValue red_pt_array = { 0, }; + gboolean have_red_pt = FALSE; + + g_value_init (&red_pt_array, GST_TYPE_ARRAY); stream = _find_transport_for_session (webrtc, session_id); if (stream) { - red_pt = transport_stream_get_pt (stream, "RED"); - rtx_pt = transport_stream_get_pt (stream, "RTX"); + guint i = 0; + + for (i = 0; i < stream->ptmap->len; i++) { + PtMapItem *item = &g_array_index (stream->ptmap, PtMapItem, i); + + if (!gst_caps_is_empty (item->caps)) { + GstStructure *s = gst_caps_get_structure (item->caps, 0); + + if (!g_strcmp0 (gst_structure_get_string (s, "encoding-name"), "RED")) { + GValue ptval = { 0, }; + + g_value_init (&ptval, G_TYPE_INT); + g_value_set_int (&ptval, item->pt); + gst_value_array_append_value (&red_pt_array, &ptval); + g_value_unset (&ptval); + + 
have_red_pt = TRUE; + } + } + } + + rtx_pt = transport_stream_get_pt (stream, "RTX", -1); } GST_LOG_OBJECT (webrtc, "requesting aux receiver for stream %" GST_PTR_FORMAT, stream); - if (red_pt || rtx_pt) + if (have_red_pt || rtx_pt) ret = gst_bin_new (NULL); if (rtx_pt) { @@ -5396,15 +6610,14 @@ prev = gst_object_ref (stream->rtxreceive); } - if (red_pt) { + if (have_red_pt) { GstElement *rtpreddec = gst_element_factory_make ("rtpreddec", NULL); - GST_DEBUG_OBJECT (webrtc, "Creating RED decoder for pt %d in session %u", - red_pt, session_id); + GST_DEBUG_OBJECT (webrtc, "Creating RED decoder in session %u", session_id); gst_bin_add (GST_BIN (ret), rtpreddec); - g_object_set (rtpreddec, "pt", red_pt, NULL); + g_object_set_property (G_OBJECT (rtpreddec), "payloads", &red_pt_array); if (prev) gst_element_link (prev, rtpreddec); @@ -5432,6 +6645,7 @@ } out: + g_value_unset (&red_pt_array); return ret; error: @@ -5441,12 +6655,12 @@ } static GstElement * -on_rtpbin_request_fec_decoder (GstElement * rtpbin, guint session_id, - GstWebRTCBin * webrtc) +on_rtpbin_request_fec_decoder_full (GstElement * rtpbin, guint session_id, + guint ssrc, guint pt, GstWebRTCBin * webrtc) { TransportStream *stream; GstElement *ret = NULL; - gint pt = 0; + gint fec_pt = 0; GObject *internal_storage; stream = _find_transport_for_session (webrtc, session_id); @@ -5461,100 +6675,28 @@ * we detect it (by connecting to ptdemux:new-payload-type for * example) */ - if (stream) - pt = transport_stream_get_pt (stream, "ULPFEC"); - - if (pt) { - GST_DEBUG_OBJECT (webrtc, "Creating ULPFEC decoder for pt %d in session %u", - pt, session_id); - ret = gst_element_factory_make ("rtpulpfecdec", NULL); - g_signal_emit_by_name (webrtc->rtpbin, "get-internal-storage", session_id, - &internal_storage); - - g_object_set (ret, "pt", pt, "storage", internal_storage, NULL); - g_object_unref (internal_storage); - } - - return ret; -} - -static GstElement * -on_rtpbin_request_fec_encoder (GstElement * rtpbin, guint 
session_id, - GstWebRTCBin * webrtc) -{ - GstElement *ret = NULL; - GstElement *prev = NULL; - TransportStream *stream; - guint ulpfec_pt = 0; - guint red_pt = 0; - GstPad *sinkpad = NULL; - GstWebRTCRTPTransceiver *trans; - - stream = _find_transport_for_session (webrtc, session_id); - trans = _find_transceiver (webrtc, &session_id, - (FindTransceiverFunc) transceiver_match_for_mline); - if (stream) { - ulpfec_pt = transport_stream_get_pt (stream, "ULPFEC"); - red_pt = transport_stream_get_pt (stream, "RED"); - } - - if (ulpfec_pt || red_pt) - ret = gst_bin_new (NULL); - - if (ulpfec_pt) { - GstElement *fecenc = gst_element_factory_make ("rtpulpfecenc", NULL); - GstCaps *caps = transport_stream_get_caps_for_pt (stream, ulpfec_pt); - - GST_DEBUG_OBJECT (webrtc, - "Creating ULPFEC encoder for session %d with pt %d", session_id, - ulpfec_pt); - - gst_bin_add (GST_BIN (ret), fecenc); - sinkpad = gst_element_get_static_pad (fecenc, "sink"); - g_object_set (fecenc, "pt", ulpfec_pt, "percentage", - WEBRTC_TRANSCEIVER (trans)->fec_percentage, NULL); + guint i; + for (i = 0; i < stream->ptmap->len; i++) { + PtMapItem *item = &g_array_index (stream->ptmap, PtMapItem, i); - if (caps && !gst_caps_is_empty (caps)) { - const GstStructure *s = gst_caps_get_structure (caps, 0); - const gchar *media = gst_structure_get_string (s, "media"); - - if (!g_strcmp0 (media, "video")) - g_object_set (fecenc, "multipacket", TRUE, NULL); + if (item->pt == pt) { + fec_pt = transport_stream_get_pt (stream, "ULPFEC", item->media_idx); + break; + } } - - prev = fecenc; } - if (red_pt) { - GstElement *redenc = gst_element_factory_make ("rtpredenc", NULL); - - GST_DEBUG_OBJECT (webrtc, "Creating RED encoder for session %d with pt %d", - session_id, red_pt); - - gst_bin_add (GST_BIN (ret), redenc); - if (prev) - gst_element_link (prev, redenc); - else - sinkpad = gst_element_get_static_pad (redenc, "sink"); - - g_object_set (redenc, "pt", red_pt, "allow-no-red-blocks", TRUE, NULL); - - prev = 
redenc; - } - - if (sinkpad) { - GstPad *ghost = gst_ghost_pad_new ("sink", sinkpad); - gst_object_unref (sinkpad); - gst_element_add_pad (ret, ghost); - } + if (fec_pt) { + GST_DEBUG_OBJECT (webrtc, "Creating ULPFEC decoder for pt %d in session %u", + fec_pt, session_id); + ret = gst_element_factory_make ("rtpulpfecdec", NULL); + g_signal_emit_by_name (webrtc->rtpbin, "get-internal-storage", session_id, + &internal_storage); - if (prev) { - GstPad *srcpad = gst_element_get_static_pad (prev, "src"); - GstPad *ghost = gst_ghost_pad_new ("src", srcpad); - gst_object_unref (srcpad); - gst_element_add_pad (ret, ghost); + g_object_set (ret, "pt", fec_pt, "storage", internal_storage, NULL); + g_object_unref (internal_storage); } return ret; @@ -5593,7 +6735,7 @@ on_rtpbin_ssrc_active (GstElement * rtpbin, guint session_id, guint ssrc, GstWebRTCBin * webrtc) { - GST_INFO_OBJECT (webrtc, "session %u ssrc %u active", session_id, ssrc); + GST_TRACE_OBJECT (webrtc, "session %u ssrc %u active", session_id, ssrc); } static void @@ -5636,7 +6778,7 @@ on_rtpbin_sender_ssrc_active (GstElement * rtpbin, guint session_id, guint ssrc, GstWebRTCBin * webrtc) { - GST_INFO_OBJECT (webrtc, "session %u ssrc %u sender ssrc active", session_id, + GST_TRACE_OBJECT (webrtc, "session %u ssrc %u sender ssrc active", session_id, ssrc); } @@ -5644,18 +6786,47 @@ on_rtpbin_new_jitterbuffer (GstElement * rtpbin, GstElement * jitterbuffer, guint session_id, guint ssrc, GstWebRTCBin * webrtc) { - GstWebRTCRTPTransceiver *trans; + TransportStream *stream; + guint i; - trans = _find_transceiver (webrtc, &session_id, - (FindTransceiverFunc) transceiver_match_for_mline); + PC_LOCK (webrtc); + GST_INFO_OBJECT (webrtc, "new jitterbuffer %" GST_PTR_FORMAT " for " + "session %u ssrc %u", jitterbuffer, session_id, ssrc); - if (trans) { - /* We don't set do-retransmission on rtpbin as we want per-session control */ - g_object_set (jitterbuffer, "do-retransmission", - WEBRTC_TRANSCEIVER (trans)->do_nack, NULL); 
- } else { - g_assert_not_reached (); + if (!(stream = _find_transport_for_session (webrtc, session_id))) { + g_warn_if_reached (); + goto out; + } + + /* XXX: this will fail with no ssrc in the remote sdp as used with e.g. simulcast + * newer SDP versions from chrome/firefox */ + for (i = 0; i < stream->remote_ssrcmap->len; i++) { + SsrcMapItem *item = g_ptr_array_index (stream->remote_ssrcmap, i); + + if (item->ssrc == ssrc) { + GstWebRTCRTPTransceiver *trans; + gboolean do_nack; + + trans = _find_transceiver_for_mline (webrtc, item->media_idx); + if (!trans) { + g_warn_if_reached (); + break; + } + + do_nack = WEBRTC_TRANSCEIVER (trans)->do_nack; + /* We don't set do-retransmission on rtpbin as we want per-session control */ + GST_LOG_OBJECT (webrtc, "setting do-nack=%s for transceiver %" + GST_PTR_FORMAT " with transport %" GST_PTR_FORMAT + " rtp session %u ssrc %u", do_nack ? "true" : "false", trans, stream, + session_id, ssrc); + g_object_set (jitterbuffer, "do-retransmission", do_nack, NULL); + + g_weak_ref_set (&item->rtpjitterbuffer, jitterbuffer); + break; + } } +out: + PC_UNLOCK (webrtc); } static void @@ -5694,10 +6865,8 @@ G_CALLBACK (on_rtpbin_request_aux_receiver), webrtc); g_signal_connect (rtpbin, "new-storage", G_CALLBACK (on_rtpbin_new_storage), webrtc); - g_signal_connect (rtpbin, "request-fec-decoder", - G_CALLBACK (on_rtpbin_request_fec_decoder), webrtc); - g_signal_connect (rtpbin, "request-fec-encoder", - G_CALLBACK (on_rtpbin_request_fec_encoder), webrtc); + g_signal_connect (rtpbin, "request-fec-decoder-full", + G_CALLBACK (on_rtpbin_request_fec_decoder_full), webrtc); g_signal_connect (rtpbin, "on-bye-ssrc", G_CALLBACK (on_rtpbin_bye_ssrc), webrtc); g_signal_connect (rtpbin, "on-bye-timeout", @@ -5786,57 +6955,181 @@ return GST_PAD_PROBE_OK; } + static GstPad * gst_webrtc_bin_request_new_pad (GstElement * element, GstPadTemplate * templ, const gchar * name, const GstCaps * caps) { GstWebRTCBin *webrtc = GST_WEBRTC_BIN (element); + 
GstWebRTCRTPTransceiver *trans = NULL; GstWebRTCBinPad *pad = NULL; guint serial; + gboolean lock_mline = FALSE; if (!_have_nice_elements (webrtc) || !_have_dtls_elements (webrtc)) return NULL; - if (templ->direction == GST_PAD_SINK || - g_strcmp0 (templ->name_template, "sink_%u") == 0) { - GstWebRTCRTPTransceiver *trans; + if (templ->direction != GST_PAD_SINK || + g_strcmp0 (templ->name_template, "sink_%u") != 0) { + GST_ERROR_OBJECT (element, "Requested pad that shouldn't be requestable"); + return NULL; + } - GST_OBJECT_LOCK (webrtc); - if (name == NULL || strlen (name) < 6 || !g_str_has_prefix (name, "sink_")) { - /* no name given when requesting the pad, use next available int */ - serial = webrtc->priv->max_sink_pad_serial++; - } else { - /* parse serial number from requested padname */ - serial = g_ascii_strtoull (&name[5], NULL, 10); - if (serial > webrtc->priv->max_sink_pad_serial) - webrtc->priv->max_sink_pad_serial = serial; - } - GST_OBJECT_UNLOCK (webrtc); + PC_LOCK (webrtc); + + if (name == NULL || strlen (name) < 6 || !g_str_has_prefix (name, "sink_")) { + /* no name given when requesting the pad, use next available int */ + serial = webrtc->priv->max_sink_pad_serial++; + } else { + /* parse serial number from requested padname */ + serial = g_ascii_strtoull (&name[5], NULL, 10); + lock_mline = TRUE; + } + + if (lock_mline) { + GstWebRTCBinPad *pad2; - pad = _create_pad_for_sdp_media (webrtc, GST_PAD_SINK, serial); trans = _find_transceiver_for_mline (webrtc, serial); - if (!trans) { - trans = - GST_WEBRTC_RTP_TRANSCEIVER (_create_webrtc_transceiver (webrtc, - GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV, serial)); - GST_LOG_OBJECT (webrtc, "Created new transceiver %" GST_PTR_FORMAT - " for mline %u", trans, serial); - } else { - GST_LOG_OBJECT (webrtc, "Using existing transceiver %" GST_PTR_FORMAT - " for mline %u", trans, serial); + + if (trans) { + /* Reject transceivers that are only for receiving ... 
*/ + if (trans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY || + trans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_INACTIVE) { + gchar *direction = + g_enum_to_string (GST_TYPE_WEBRTC_RTP_TRANSCEIVER_DIRECTION, + trans->direction); + GST_ERROR_OBJECT (element, "Tried to request a new sink pad %s for" + " existing m-line %d, but the transceiver's direction is %s", + name, serial, direction); + g_free (direction); + goto error_out; + } + + /* Reject transceivers that already have a pad allocated */ + pad2 = _find_pad_for_transceiver (webrtc, GST_PAD_SINK, trans); + if (pad2) { + GST_ERROR_OBJECT (element, "Trying to request pad %s for m-line %d, " + " but the transceiver associated with this m-line already has pad" + " %s", name, serial, GST_PAD_NAME (pad2)); + gst_object_unref (pad2); + goto error_out; + } + + if (caps) { + GST_OBJECT_LOCK (trans); + if (trans->codec_preferences && + !gst_caps_can_intersect (caps, trans->codec_preferences)) { + GST_ERROR_OBJECT (element, "Tried to request a new sink pad %s for" + " existing m-line %d, but requested caps %" GST_PTR_FORMAT + " don't match existing codec preferences %" GST_PTR_FORMAT, + name, serial, caps, trans->codec_preferences); + GST_OBJECT_UNLOCK (trans); + goto error_out; + } + GST_OBJECT_UNLOCK (trans); + + if (trans->kind != GST_WEBRTC_KIND_UNKNOWN) { + GstWebRTCKind kind = webrtc_kind_from_caps (caps); + + if (trans->kind != kind) { + GST_ERROR_OBJECT (element, "Tried to request a new sink pad %s for" + " existing m-line %d, but requested caps %" GST_PTR_FORMAT + " don't match transceiver kind %d", + name, serial, caps, trans->kind); + goto error_out; + } + } + } + } + } + + /* Let's try to find a free transceiver that matches */ + if (!trans) { + GstWebRTCKind kind = GST_WEBRTC_KIND_UNKNOWN; + guint i; + + kind = webrtc_kind_from_caps (caps); + + for (i = 0; i < webrtc->priv->transceivers->len; i++) { + GstWebRTCRTPTransceiver *tmptrans = + g_ptr_array_index (webrtc->priv->transceivers, 
i); + GstWebRTCBinPad *pad2; + gboolean has_matching_caps; + + /* Ignore transceivers with a non-matching kind */ + if (tmptrans->kind != GST_WEBRTC_KIND_UNKNOWN && + kind != GST_WEBRTC_KIND_UNKNOWN && tmptrans->kind != kind) + continue; + + /* Ignore stopped transmitters */ + if (tmptrans->stopped) + continue; + + /* Ignore transceivers that are only for receiving ... */ + if (tmptrans->direction == GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY + || tmptrans->direction == + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_INACTIVE) + continue; + + /* Ignore transceivers that already have a pad allocated */ + pad2 = _find_pad_for_transceiver (webrtc, GST_PAD_SINK, tmptrans); + if (pad2) { + gst_object_unref (pad2); + continue; + } + + GST_OBJECT_LOCK (tmptrans); + has_matching_caps = (caps && tmptrans->codec_preferences && + !gst_caps_can_intersect (caps, tmptrans->codec_preferences)); + GST_OBJECT_UNLOCK (tmptrans); + /* Ignore transceivers with non-matching caps */ + if (!has_matching_caps) + continue; + + trans = tmptrans; + break; + } + } + + if (!trans) { + trans = GST_WEBRTC_RTP_TRANSCEIVER (_create_webrtc_transceiver (webrtc, + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV, -1, + webrtc_kind_from_caps (caps), NULL)); + GST_LOG_OBJECT (webrtc, "Created new transceiver %" GST_PTR_FORMAT, trans); + } else { + GST_LOG_OBJECT (webrtc, "Using existing transceiver %" GST_PTR_FORMAT + " for mline %u", trans, serial); + if (caps) { + if (!_update_transceiver_kind_from_caps (trans, caps)) + GST_WARNING_OBJECT (webrtc, + "Trying to change transceiver %d kind from %d to %d", + serial, trans->kind, webrtc_kind_from_caps (caps)); } - pad->trans = gst_object_ref (trans); + } + pad = _create_pad_for_sdp_media (webrtc, GST_PAD_SINK, trans, serial); - pad->block_id = gst_pad_add_probe (GST_PAD (pad), GST_PAD_PROBE_TYPE_BLOCK | - GST_PAD_PROBE_TYPE_BUFFER | GST_PAD_PROBE_TYPE_BUFFER_LIST, - (GstPadProbeCallback) sink_pad_block, NULL, NULL); - webrtc->priv->pending_sink_transceivers = - 
g_list_append (webrtc->priv->pending_sink_transceivers, - gst_object_ref (pad)); - _add_pad (webrtc, pad); + pad->block_id = gst_pad_add_probe (GST_PAD (pad), GST_PAD_PROBE_TYPE_BLOCK | + GST_PAD_PROBE_TYPE_BUFFER | GST_PAD_PROBE_TYPE_BUFFER_LIST, + (GstPadProbeCallback) sink_pad_block, NULL, NULL); + webrtc->priv->pending_sink_transceivers = + g_list_append (webrtc->priv->pending_sink_transceivers, + gst_object_ref (pad)); + + if (lock_mline) { + WebRTCTransceiver *wtrans = WEBRTC_TRANSCEIVER (trans); + wtrans->mline_locked = TRUE; + trans->mline = serial; } + PC_UNLOCK (webrtc); + + _add_pad (webrtc, pad); + return GST_PAD (pad); + +error_out: + PC_UNLOCK (webrtc); + return NULL; } static void @@ -5853,6 +7146,7 @@ if (webrtc_pad->trans) gst_object_unref (webrtc_pad->trans); webrtc_pad->trans = NULL; + gst_caps_replace (&webrtc_pad->received_caps, NULL); PC_UNLOCK (webrtc); _remove_pad (webrtc, webrtc_pad); @@ -5992,6 +7286,9 @@ case PROP_LATENCY: g_value_set_uint (value, webrtc->priv->jb_latency); break; + case PROP_SCTP_TRANSPORT: + g_value_set_object (value, webrtc->priv->sctp_transport); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -6069,10 +7366,6 @@ g_array_free (webrtc->priv->pending_local_ice_candidates, TRUE); webrtc->priv->pending_local_ice_candidates = NULL; - if (webrtc->priv->session_mid_map) - g_array_free (webrtc->priv->session_mid_map, TRUE); - webrtc->priv->session_mid_map = NULL; - if (webrtc->priv->pending_pads) g_list_free_full (webrtc->priv->pending_pads, (GDestroyNotify) _free_pending_pad); @@ -6104,10 +7397,7 @@ gst_webrtc_session_description_free (webrtc->priv->last_generated_offer); webrtc->priv->last_generated_offer = NULL; - if (webrtc->priv->stats) - gst_structure_free (webrtc->priv->stats); - webrtc->priv->stats = NULL; - + g_mutex_clear (DC_GET_LOCK (webrtc)); g_mutex_clear (ICE_GET_LOCK (webrtc)); g_mutex_clear (PC_GET_LOCK (webrtc)); g_cond_clear (PC_GET_COND (webrtc)); @@ -6275,7 +7565,22 
@@ PROP_LATENCY, g_param_spec_uint ("latency", "Latency", "Default duration to buffer in the jitterbuffers (in ms)", - 0, G_MAXUINT, 200, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + 0, G_MAXUINT, DEFAULT_JB_LATENCY, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCBin:sctp-transport: + * + * The WebRTC SCTP Transport + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_SCTP_TRANSPORT, + g_param_spec_object ("sctp-transport", "WebRTC SCTP Transport", + "The WebRTC SCTP Transport", + GST_TYPE_WEBRTC_SCTP_TRANSPORT, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); /** * GstWebRTCBin::create-offer: @@ -6329,7 +7634,8 @@ * GstWebRTCBin::add-ice-candidate: * @object: the #webrtcbin * @mline_index: the index of the media description in the SDP - * @ice-candidate: an ice candidate + * @ice-candidate: an ice candidate or NULL/"" to mark that no more candidates + * will arrive */ gst_webrtc_bin_signals[ADD_ICE_CANDIDATE_SIGNAL] = g_signal_new_class_handler ("add-ice-candidate", @@ -6558,11 +7864,6 @@ g_signal_handlers_disconnect_by_data (stream->transport->transport, webrtc); g_signal_handlers_disconnect_by_data (stream->transport, webrtc); } - if (stream->rtcp_transport) { - g_signal_handlers_disconnect_by_data (stream->rtcp_transport->transport, - webrtc); - g_signal_handlers_disconnect_by_data (stream->rtcp_transport, webrtc); - } gst_object_unref (object); } @@ -6585,6 +7886,7 @@ g_cond_init (PC_GET_COND (webrtc)); g_mutex_init (ICE_GET_LOCK (webrtc)); + g_mutex_init (DC_GET_LOCK (webrtc)); webrtc->rtpbin = _create_rtpbin (webrtc); gst_bin_add (GST_BIN (webrtc), webrtc->rtpbin); @@ -6600,11 +7902,6 @@ webrtc->priv->pending_data_channels = g_ptr_array_new_with_free_func ((GDestroyNotify) gst_object_unref); - webrtc->priv->session_mid_map = - g_array_new (FALSE, TRUE, sizeof (SessionMidItem)); - g_array_set_clear_func (webrtc->priv->session_mid_map, - (GDestroyNotify) clear_session_mid_item); - webrtc->priv->ice_stream_map 
= g_array_new (FALSE, TRUE, sizeof (IceStreamItem)); webrtc->priv->pending_remote_ice_candidates = @@ -6619,4 +7916,5 @@ /* we start off closed until we move to READY */ webrtc->priv->is_closed = TRUE; + webrtc->priv->jb_latency = DEFAULT_JB_LATENCY; }
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/gstwebrtcbin.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/gstwebrtcbin.h
Changed
@@ -24,6 +24,7 @@ #include "fwd.h" #include "gstwebrtcice.h" #include "transportstream.h" +#include "webrtcsctptransport.h" G_BEGIN_DECLS @@ -42,11 +43,11 @@ { GstGhostPad parent; - guint mlineindex; - GstWebRTCRTPTransceiver *trans; gulong block_id; + guint32 last_ssrc; + GstCaps *received_caps; }; @@ -97,16 +98,18 @@ gboolean bundle; GPtrArray *transceivers; - GArray *session_mid_map; GPtrArray *transports; GPtrArray *data_channels; /* list of data channels we've received a sctp stream for but no data * channel protocol for */ GPtrArray *pending_data_channels; + /* dc_lock protects data_channels and pending_data_channels */ + /* lock ordering is pc_lock first, then dc_lock */ + GMutex dc_lock; guint jb_latency; - GstWebRTCSCTPTransport *sctp_transport; + WebRTCSCTPTransport *sctp_transport; TransportStream *data_channel_transport; GstWebRTCICE *ice; @@ -140,10 +143,10 @@ GstWebRTCSessionDescription *last_generated_offer; GstWebRTCSessionDescription *last_generated_answer; - GstStructure *stats; + gboolean tos_attached; }; -typedef void (*GstWebRTCBinFunc) (GstWebRTCBin * webrtc, gpointer data); +typedef GstStructure *(*GstWebRTCBinFunc) (GstWebRTCBin * webrtc, gpointer data); typedef struct {
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/gstwebrtcice.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/gstwebrtcice.c
Changed
@@ -56,6 +56,8 @@ PROP_AGENT, PROP_ICE_TCP, PROP_ICE_UDP, + PROP_MIN_RTP_PORT, + PROP_MAX_RTP_PORT, }; static guint gst_webrtc_ice_signals[LAST_SIGNAL] = { 0 }; @@ -138,35 +140,6 @@ g_thread_unref (ice->priv->thread); } -#if 0 -static NiceComponentType -_webrtc_component_to_nice (GstWebRTCICEComponent comp) -{ - switch (comp) { - case GST_WEBRTC_ICE_COMPONENT_RTP: - return NICE_COMPONENT_TYPE_RTP; - case GST_WEBRTC_ICE_COMPONENT_RTCP: - return NICE_COMPONENT_TYPE_RTCP; - default: - g_assert_not_reached (); - return 0; - } -} - -static GstWebRTCICEComponent -_nice_component_to_webrtc (NiceComponentType comp) -{ - switch (comp) { - case NICE_COMPONENT_TYPE_RTP: - return GST_WEBRTC_ICE_COMPONENT_RTP; - case NICE_COMPONENT_TYPE_RTCP: - return GST_WEBRTC_ICE_COMPONENT_RTCP; - default: - g_assert_not_reached (); - return 0; - } -} -#endif struct NiceStreamItem { guint session_id; @@ -266,7 +239,7 @@ struct NiceStreamItem item; item.session_id = session_id; - item.nice_stream_id = nice_agent_add_stream (ice->priv->nice_agent, 2); + item.nice_stream_id = nice_agent_add_stream (ice->priv->nice_agent, 1); item.stream = gst_webrtc_ice_stream_new (ice, item.nice_stream_id); g_array_append_val (ice->priv->nice_stream_map, item); @@ -380,16 +353,6 @@ g_free (uri); break; } - ret = nice_agent_set_relay_info (ice->priv->nice_agent, - item->nice_stream_id, NICE_COMPONENT_TYPE_RTCP, - gst_uri_get_host (turn_server), gst_uri_get_port (turn_server), - user, pass, relays[i]); - if (!ret) { - gchar *uri = gst_uri_to_string (turn_server); - GST_ERROR_OBJECT (ice, "Failed to set TURN server '%s'", uri); - g_free (uri); - break; - } } g_free (user); g_free (pass); @@ -652,7 +615,7 @@ return FALSE; } -/* must start with "a=candidate:" */ +/* candidate must start with "a=candidate:" or be NULL*/ void gst_webrtc_ice_add_candidate (GstWebRTCICE * ice, GstWebRTCICEStream * stream, const gchar * candidate) @@ -664,31 +627,38 @@ item = _find_item (ice, -1, -1, stream); g_return_if_fail (item != 
NULL); + if (candidate == NULL) { + nice_agent_peer_candidate_gathering_done (ice->priv->nice_agent, + item->nice_stream_id); + return; + } + cand = nice_agent_parse_remote_candidate_sdp (ice->priv->nice_agent, item->nice_stream_id, candidate); if (!cand) { /* might be a .local candidate */ char *prefix = NULL, *address = NULL, *postfix = NULL; - char *new_addr, *new_candidate; + char *new_addr = NULL, *new_candidate = NULL; char *new_candv[4] = { NULL, }; + gboolean failure = TRUE; if (!get_candidate_address (candidate, &prefix, &address, &postfix)) { GST_WARNING_OBJECT (ice, "Failed to retrieve address from candidate %s", candidate); - goto fail; + goto done; } if (!g_str_has_suffix (address, ".local")) { GST_WARNING_OBJECT (ice, "candidate address \'%s\' does not end " "with \'.local\'", address); - goto fail; + goto done; } /* FIXME: async */ if (!(new_addr = _resolve_host (ice, address))) { GST_WARNING_OBJECT (ice, "Failed to resolve %s", address); - goto fail; + goto done; } new_candv[0] = prefix; @@ -702,24 +672,29 @@ cand = nice_agent_parse_remote_candidate_sdp (ice->priv->nice_agent, item->nice_stream_id, new_candidate); - g_free (new_candidate); if (!cand) { GST_WARNING_OBJECT (ice, "Could not parse candidate \'%s\'", new_candidate); - goto fail; + goto done; } + failure = FALSE; + + done: g_free (prefix); - g_free (new_addr); + g_free (address); g_free (postfix); - - if (0) { - fail: - g_free (prefix); - g_free (address); - g_free (postfix); + g_free (new_addr); + g_free (new_candidate); + if (failure) return; - } + } + + if (cand->component_id == 2) { + /* we only support rtcp-mux so rtcp candidates are useless for us */ + GST_INFO_OBJECT (ice, "Dropping RTCP candidate %s", candidate); + nice_candidate_free (cand); + return; } candidates = g_slist_append (candidates, cand); @@ -861,6 +836,18 @@ ice->priv->on_candidate_notify = notify; } +void +gst_webrtc_ice_set_tos (GstWebRTCICE * ice, GstWebRTCICEStream * stream, + guint tos) +{ + struct 
NiceStreamItem *item; + + item = _find_item (ice, -1, -1, stream); + g_return_if_fail (item != NULL); + + nice_agent_set_stream_tos (ice->priv->nice_agent, item->nice_stream_id, tos); +} + static void _clear_ice_stream (struct NiceStreamItem *item) { @@ -1012,6 +999,21 @@ g_object_set_property (G_OBJECT (ice->priv->nice_agent), "ice-udp", value); break; + + case PROP_MIN_RTP_PORT: + ice->min_rtp_port = g_value_get_uint (value); + if (ice->min_rtp_port > ice->max_rtp_port) + g_warning ("Set min-rtp-port to %u which is larger than" + " max-rtp-port %u", ice->min_rtp_port, ice->max_rtp_port); + break; + + case PROP_MAX_RTP_PORT: + ice->max_rtp_port = g_value_get_uint (value); + if (ice->min_rtp_port > ice->max_rtp_port) + g_warning ("Set max-rtp-port to %u which is smaller than" + " min-rtp-port %u", ice->max_rtp_port, ice->min_rtp_port); + break; + default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -1036,6 +1038,15 @@ g_object_get_property (G_OBJECT (ice->priv->nice_agent), "ice-udp", value); break; + + case PROP_MIN_RTP_PORT: + g_value_set_uint (value, ice->min_rtp_port); + break; + + case PROP_MAX_RTP_PORT: + g_value_set_uint (value, ice->max_rtp_port); + break; + default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -1077,11 +1088,15 @@ gst_webrtc_ice_constructed (GObject * object) { GstWebRTCICE *ice = GST_WEBRTC_ICE (object); + NiceAgentOption options = 0; _start_thread (ice); - ice->priv->nice_agent = nice_agent_new (ice->priv->main_context, - NICE_COMPATIBILITY_RFC5245); + options |= NICE_AGENT_OPTION_ICE_TRICKLE; + options |= NICE_AGENT_OPTION_REGULAR_NOMINATION; + + ice->priv->nice_agent = nice_agent_new_full (ice->priv->main_context, + NICE_COMPATIBILITY_RFC5245, options); g_signal_connect (ice->priv->nice_agent, "new-candidate-full", G_CALLBACK (_on_new_candidate), ice); @@ -1119,6 +1134,37 @@ TRUE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); /** + * GstWebRTCICE:min-rtp-port: + * + * Minimum port for 
local rtp port range. + * min-rtp-port must be <= max-rtp-port + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_MIN_RTP_PORT, + g_param_spec_uint ("min-rtp-port", "ICE RTP candidate min port", + "Minimum port for local rtp port range. " + "min-rtp-port must be <= max-rtp-port", + 0, 65535, 0, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCICE:max-rtp-port: + * + * Maximum port for local rtp port range. + * min-rtp-port must be <= max-rtp-port + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_MAX_RTP_PORT, + g_param_spec_uint ("max-rtp-port", "ICE RTP candidate max port", + "Maximum port for local rtp port range. " + "max-rtp-port must be >= min-rtp-port", + 0, 65535, 65535, + G_PARAM_READWRITE | G_PARAM_CONSTRUCT | G_PARAM_STATIC_STRINGS)); + + /** * GstWebRTCICE::add-local-ip-address: * @object: the #GstWebRTCICE * @address: The local IP address
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/gstwebrtcice.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/gstwebrtcice.h
Changed
@@ -51,6 +51,9 @@ GHashTable *turn_servers; GstWebRTCICEPrivate *priv; + + guint min_rtp_port; + guint max_rtp_port; }; struct _GstWebRTCICEClass @@ -101,6 +104,9 @@ gpointer user_data, GDestroyNotify notify); +void gst_webrtc_ice_set_tos (GstWebRTCICE * ice, + GstWebRTCICEStream * stream, + guint tos); G_END_DECLS #endif /* __GST_WEBRTC_ICE_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/gstwebrtcstats.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/gstwebrtcstats.c
Changed
@@ -31,6 +31,8 @@ #include "utils.h" #include "webrtctransceiver.h" +#include <stdlib.h> + #define GST_CAT_DEFAULT gst_webrtc_stats_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -81,19 +83,134 @@ return s; } +static void +_gst_structure_take_structure (GstStructure * s, const char *fieldname, + GstStructure ** value_s) +{ + GValue v = G_VALUE_INIT; + + g_return_if_fail (GST_IS_STRUCTURE (*value_s)); + + g_value_init (&v, GST_TYPE_STRUCTURE); + g_value_take_boxed (&v, *value_s); + + gst_structure_take_value (s, fieldname, &v); + + *value_s = NULL; +} + #define CLOCK_RATE_VALUE_TO_SECONDS(v,r) ((double) v / (double) clock_rate) #define FIXED_16_16_TO_DOUBLE(v) ((double) ((v & 0xffff0000) >> 16) + ((v & 0xffff) / 65536.0)) #define FIXED_32_32_TO_DOUBLE(v) ((double) ((v & G_GUINT64_CONSTANT (0xffffffff00000000)) >> 32) + ((v & G_GUINT64_CONSTANT (0xffffffff)) / 4294967296.0)) +/* https://www.w3.org/TR/webrtc-stats/#remoteinboundrtpstats-dict* */ +static gboolean +_get_stats_from_remote_rtp_source_stats (GstWebRTCBin * webrtc, + TransportStream * stream, const GstStructure * source_stats, + guint ssrc, guint clock_rate, const gchar * codec_id, + const gchar * transport_id, GstStructure * s) +{ + gboolean have_rb = FALSE, internal = FALSE; + int lost; + GstStructure *r_in; + gchar *r_in_id, *out_id; + guint32 rtt; + guint fraction_lost, jitter; + double ts; + + gst_structure_get_double (s, "timestamp", &ts); + gst_structure_get (source_stats, "internal", G_TYPE_BOOLEAN, &internal, + "have-rb", G_TYPE_BOOLEAN, &have_rb, NULL); + + /* This isn't what we're looking for */ + if (internal == TRUE || have_rb == FALSE) + return FALSE; + + r_in_id = g_strdup_printf ("rtp-remote-inbound-stream-stats_%u", ssrc); + out_id = g_strdup_printf ("rtp-outbound-stream-stats_%u", ssrc); + + r_in = gst_structure_new_empty (r_in_id); + _set_base_stats (r_in, GST_WEBRTC_STATS_REMOTE_INBOUND_RTP, ts, r_in_id); + + /* RTCRtpStreamStats */ + gst_structure_set (r_in, "local-id", 
G_TYPE_STRING, out_id, NULL); + gst_structure_set (r_in, "ssrc", G_TYPE_UINT, ssrc, NULL); + gst_structure_set (r_in, "codec-id", G_TYPE_STRING, codec_id, NULL); + gst_structure_set (r_in, "transport-id", G_TYPE_STRING, transport_id, NULL); + /* To be added: kind */ + + /* RTCReceivedRtpStreamStats */ + + if (gst_structure_get_int (source_stats, "rb-packetslost", &lost)) + gst_structure_set (r_in, "packets-lost", G_TYPE_INT, lost, NULL); + + if (clock_rate && gst_structure_get_uint (source_stats, "rb-jitter", &jitter)) + gst_structure_set (r_in, "jitter", G_TYPE_DOUBLE, + CLOCK_RATE_VALUE_TO_SECONDS (jitter, clock_rate), NULL); + + /* RTCReceivedRtpStreamStats: + + unsigned long long packetsReceived; + unsigned long packetsDiscarded; + unsigned long packetsRepaired; + unsigned long burstPacketsLost; + unsigned long burstPacketsDiscarded; + unsigned long burstLossCount; + unsigned long burstDiscardCount; + double burstLossRate; + double burstDiscardRate; + double gapLossRate; + double gapDiscardRate; + + Can't be implemented frame re-assembly happens after rtpbin: + + unsigned long framesDropped; + unsigned long partialFramesLost; + unsigned long fullFramesLost; + */ + + /* RTCRemoteInboundRTPStreamStats */ + + if (gst_structure_get_uint (source_stats, "rb-fractionlost", &fraction_lost)) + gst_structure_set (r_in, "fraction-lost", G_TYPE_DOUBLE, + (double) fraction_lost / 256.0, NULL); + + if (gst_structure_get_uint (source_stats, "rb-round-trip", &rtt)) { + /* 16.16 fixed point to double */ + double val = FIXED_16_16_TO_DOUBLE (rtt); + gst_structure_set (r_in, "round-trip-time", G_TYPE_DOUBLE, val, NULL); + } + + /* RTCRemoteInboundRTPStreamStats: + + To be added: + + DOMString localId; + double totalRoundTripTime; + unsigned long long reportsReceived; + unsigned long long roundTripTimeMeasurements; + */ + + gst_structure_set (r_in, "gst-rtpsource-stats", GST_TYPE_STRUCTURE, + source_stats, NULL); + + _gst_structure_take_structure (s, r_in_id, &r_in); + + g_free 
(r_in_id); + g_free (out_id); + + return TRUE; +} + /* https://www.w3.org/TR/webrtc-stats/#inboundrtpstats-dict* https://www.w3.org/TR/webrtc-stats/#outboundrtpstats-dict* */ static void _get_stats_from_rtp_source_stats (GstWebRTCBin * webrtc, - const GstStructure * source_stats, const gchar * codec_id, - const gchar * transport_id, GstStructure * s) + TransportStream * stream, const GstStructure * source_stats, + const gchar * codec_id, const gchar * transport_id, GstStructure * s) { guint ssrc, fir, pli, nack, jitter; - int lost, clock_rate; + int clock_rate; guint64 packets, bytes; gboolean internal; double ts; @@ -103,48 +220,10 @@ G_TYPE_INT, &clock_rate, "internal", G_TYPE_BOOLEAN, &internal, NULL); if (internal) { - GstStructure *r_in, *out; + GstStructure *out; gchar *out_id, *r_in_id; out_id = g_strdup_printf ("rtp-outbound-stream-stats_%u", ssrc); - r_in_id = g_strdup_printf ("rtp-remote-inbound-stream-stats_%u", ssrc); - - r_in = gst_structure_new_empty (r_in_id); - _set_base_stats (r_in, GST_WEBRTC_STATS_REMOTE_INBOUND_RTP, ts, r_in_id); - - /* RTCStreamStats */ - gst_structure_set (r_in, "local-id", G_TYPE_STRING, out_id, NULL); - gst_structure_set (r_in, "ssrc", G_TYPE_UINT, ssrc, NULL); - gst_structure_set (r_in, "codec-id", G_TYPE_STRING, codec_id, NULL); - gst_structure_set (r_in, "transport-id", G_TYPE_STRING, transport_id, NULL); - /* XXX: mediaType, trackId, sliCount, qpSum */ - - if (gst_structure_get_uint64 (source_stats, "packets-received", &packets)) - gst_structure_set (r_in, "packets-received", G_TYPE_UINT64, packets, - NULL); - if (gst_structure_get_int (source_stats, "packets-lost", &lost)) - gst_structure_set (r_in, "packets-lost", G_TYPE_INT, lost, NULL); - if (gst_structure_get_uint (source_stats, "jitter", &jitter)) - gst_structure_set (r_in, "jitter", G_TYPE_DOUBLE, - CLOCK_RATE_VALUE_TO_SECONDS (jitter, clock_rate), NULL); - -/* XXX: RTCReceivedRTPStreamStats - double fractionLost; - unsigned long packetsDiscarded; - unsigned long 
packetsFailedDecryption; - unsigned long packetsRepaired; - unsigned long burstPacketsLost; - unsigned long burstPacketsDiscarded; - unsigned long burstLossCount; - unsigned long burstDiscardCount; - double burstLossRate; - double burstDiscardRate; - double gapLossRate; - double gapDiscardRate; -*/ - - /* RTCRemoteInboundRTPStreamStats */ - /* XXX: framesDecoded, lastPacketReceivedTimestamp */ out = gst_structure_new_empty (out_id); _set_base_stats (out, GST_WEBRTC_STATS_OUTBOUND_RTP, ts, out_id); @@ -153,48 +232,114 @@ gst_structure_set (out, "ssrc", G_TYPE_UINT, ssrc, NULL); gst_structure_set (out, "codec-id", G_TYPE_STRING, codec_id, NULL); gst_structure_set (out, "transport-id", G_TYPE_STRING, transport_id, NULL); - if (gst_structure_get_uint (source_stats, "sent-fir-count", &fir)) - gst_structure_set (out, "fir-count", G_TYPE_UINT, fir, NULL); - if (gst_structure_get_uint (source_stats, "sent-pli-count", &pli)) - gst_structure_set (out, "pli-count", G_TYPE_UINT, pli, NULL); - if (gst_structure_get_uint (source_stats, "sent-nack-count", &nack)) - gst_structure_set (out, "nack-count", G_TYPE_UINT, nack, NULL); - /* XXX: mediaType, trackId, sliCount, qpSum */ + /* To be added: kind */ + -/* RTCSentRTPStreamStats */ + /* RTCSentRtpStreamStats */ if (gst_structure_get_uint64 (source_stats, "octets-sent", &bytes)) gst_structure_set (out, "bytes-sent", G_TYPE_UINT64, bytes, NULL); if (gst_structure_get_uint64 (source_stats, "packets-sent", &packets)) gst_structure_set (out, "packets-sent", G_TYPE_UINT64, packets, NULL); -/* XXX: - unsigned long packetsDiscardedOnSend; - unsigned long long bytesDiscardedOnSend; -*/ /* RTCOutboundRTPStreamStats */ - gst_structure_set (out, "remote-id", G_TYPE_STRING, r_in_id, NULL); -/* XXX: - DOMHighResTimeStamp lastPacketSentTimestamp; - double targetBitrate; - unsigned long framesEncoded; - double totalEncodeTime; - double averageRTCPInterval; -*/ - gst_structure_set (s, out_id, GST_TYPE_STRUCTURE, out, NULL); - gst_structure_set 
(s, r_in_id, GST_TYPE_STRUCTURE, r_in, NULL); - gst_structure_free (out); - gst_structure_free (r_in); + if (gst_structure_get_uint (source_stats, "recv-fir-count", &fir)) + gst_structure_set (out, "fir-count", G_TYPE_UINT, fir, NULL); + if (gst_structure_get_uint (source_stats, "recv-pli-count", &pli)) + gst_structure_set (out, "pli-count", G_TYPE_UINT, pli, NULL); + if (gst_structure_get_uint (source_stats, "recv-nack-count", &nack)) + gst_structure_set (out, "nack-count", G_TYPE_UINT, nack, NULL); + /* XXX: mediaType, trackId, sliCount, qpSum */ + + r_in_id = g_strdup_printf ("rtp-remote-inbound-stream-stats_%u", ssrc); + if (gst_structure_has_field (s, r_in_id)) + gst_structure_set (out, "remote-id", G_TYPE_STRING, r_in_id, NULL); + g_free (r_in_id); + + /* RTCOutboundRTPStreamStats: + + To be added: + + unsigned long sliCount; + unsigned long rtxSsrc; + DOMString mediaSourceId; + DOMString senderId; + DOMString remoteId; + DOMString rid; + DOMHighResTimeStamp lastPacketSentTimestamp; + unsigned long long headerBytesSent; + unsigned long packetsDiscardedOnSend; + unsigned long long bytesDiscardedOnSend; + unsigned long fecPacketsSent; + unsigned long long retransmittedPacketsSent; + unsigned long long retransmittedBytesSent; + double averageRtcpInterval; + record<USVString, unsigned long long> perDscpPacketsSent; + + Not relevant because webrtcbin doesn't encode: + + double targetBitrate; + unsigned long long totalEncodedBytesTarget; + unsigned long frameWidth; + unsigned long frameHeight; + unsigned long frameBitDepth; + double framesPerSecond; + unsigned long framesSent; + unsigned long hugeFramesSent; + unsigned long framesEncoded; + unsigned long keyFramesEncoded; + unsigned long framesDiscardedOnSend; + unsigned long long qpSum; + unsigned long long totalSamplesSent; + unsigned long long samplesEncodedWithSilk; + unsigned long long samplesEncodedWithCelt; + boolean voiceActivityFlag; + double totalEncodeTime; + double totalPacketSendDelay; + 
RTCQualityLimitationReason qualityLimitationReason; + record<DOMString, double> qualityLimitationDurations; + unsigned long qualityLimitationResolutionChanges; + DOMString encoderImplementation; + */ + + /* Store the raw stats from GStreamer into the structure for advanced + * information. + */ + gst_structure_set (out, "gst-rtpsource-stats", GST_TYPE_STRUCTURE, + source_stats, NULL); + + _gst_structure_take_structure (s, out_id, &out); g_free (out_id); - g_free (r_in_id); } else { GstStructure *in, *r_out; gchar *r_out_id, *in_id; - gboolean have_rb = FALSE, have_sr = FALSE; + gboolean have_sr = FALSE; + GstStructure *jb_stats = NULL; + guint i; + guint64 jb_lost, duplicates, late, rtx_success; + + gst_structure_get (source_stats, "have-sr", G_TYPE_BOOLEAN, &have_sr, NULL); + + for (i = 0; i < stream->remote_ssrcmap->len; i++) { + SsrcMapItem *item = g_ptr_array_index (stream->remote_ssrcmap, i); + + if (item->ssrc == ssrc) { + GObject *jb = g_weak_ref_get (&item->rtpjitterbuffer); - gst_structure_get (source_stats, "have-rb", G_TYPE_BOOLEAN, &have_rb, - "have-sr", G_TYPE_BOOLEAN, &have_sr, NULL); + if (jb) { + g_object_get (jb, "stats", &jb_stats, NULL); + g_object_unref (jb); + } + break; + } + } + + if (jb_stats) + gst_structure_get (jb_stats, "num-lost", G_TYPE_UINT64, &jb_lost, + "num-duplicates", G_TYPE_UINT64, &duplicates, "num-late", + G_TYPE_UINT64, &late, "rtx-success-count", G_TYPE_UINT64, + &rtx_success, NULL); in_id = g_strdup_printf ("rtp-inbound-stream-stats_%u", ssrc); r_out_id = g_strdup_printf ("rtp-remote-outbound-stream-stats_%u", ssrc); @@ -202,47 +347,113 @@ in = gst_structure_new_empty (in_id); _set_base_stats (in, GST_WEBRTC_STATS_INBOUND_RTP, ts, in_id); - /* RTCStreamStats */ + /* RTCRtpStreamStats */ gst_structure_set (in, "ssrc", G_TYPE_UINT, ssrc, NULL); gst_structure_set (in, "codec-id", G_TYPE_STRING, codec_id, NULL); gst_structure_set (in, "transport-id", G_TYPE_STRING, transport_id, NULL); - if (gst_structure_get_uint 
(source_stats, "recv-fir-count", &fir)) - gst_structure_set (in, "fir-count", G_TYPE_UINT, fir, NULL); - if (gst_structure_get_uint (source_stats, "recv-pli-count", &pli)) - gst_structure_set (in, "pli-count", G_TYPE_UINT, pli, NULL); - if (gst_structure_get_uint (source_stats, "recv-nack-count", &nack)) - gst_structure_set (in, "nack-count", G_TYPE_UINT, nack, NULL); - /* XXX: mediaType, trackId, sliCount, qpSum */ + /* To be added: kind */ + + + /* RTCReceivedRtpStreamStats */ - /* RTCReceivedRTPStreamStats */ if (gst_structure_get_uint64 (source_stats, "packets-received", &packets)) gst_structure_set (in, "packets-received", G_TYPE_UINT64, packets, NULL); - if (gst_structure_get_uint64 (source_stats, "octets-received", &bytes)) - gst_structure_set (in, "bytes-received", G_TYPE_UINT64, bytes, NULL); - if (gst_structure_get_int (source_stats, "packets-lost", &lost)) - gst_structure_set (in, "packets-lost", G_TYPE_INT, lost, NULL); + if (jb_stats) + gst_structure_set (in, "packets-lost", G_TYPE_UINT64, jb_lost, NULL); if (gst_structure_get_uint (source_stats, "jitter", &jitter)) gst_structure_set (in, "jitter", G_TYPE_DOUBLE, CLOCK_RATE_VALUE_TO_SECONDS (jitter, clock_rate), NULL); -/* - RTCReceivedRTPStreamStats - double fractionLost; - unsigned long packetsDiscarded; - unsigned long packetsFailedDecryption; - unsigned long packetsRepaired; - unsigned long burstPacketsLost; - unsigned long burstPacketsDiscarded; - unsigned long burstLossCount; - unsigned long burstDiscardCount; - double burstLossRate; - double burstDiscardRate; - double gapLossRate; - double gapDiscardRate; -*/ - /* RTCInboundRTPStreamStats */ + if (jb_stats) + gst_structure_set (in, "packets-discarded", G_TYPE_UINT64, late, + "packets-repaired", G_TYPE_UINT64, rtx_success, NULL); + + /* + RTCReceivedRtpStreamStats + + To be added: + + unsigned long long burstPacketsLost; + unsigned long long burstPacketsDiscarded; + unsigned long burstLossCount; + unsigned long burstDiscardCount; + double 
burstLossRate; + double burstDiscardRate; + double gapLossRate; + double gapDiscardRate; + + Not relevant because webrtcbin doesn't decode: + + unsigned long framesDropped; + unsigned long partialFramesLost; + unsigned long fullFramesLost; + */ + + /* RTCInboundRtpStreamStats */ gst_structure_set (in, "remote-id", G_TYPE_STRING, r_out_id, NULL); - /* XXX: framesDecoded, lastPacketReceivedTimestamp */ + + if (gst_structure_get_uint64 (source_stats, "octets-received", &bytes)) + gst_structure_set (in, "bytes-received", G_TYPE_UINT64, bytes, NULL); + + if (gst_structure_get_uint (source_stats, "sent-fir-count", &fir)) + gst_structure_set (in, "fir-count", G_TYPE_UINT, fir, NULL); + if (gst_structure_get_uint (source_stats, "sent-pli-count", &pli)) + gst_structure_set (in, "pli-count", G_TYPE_UINT, pli, NULL); + if (gst_structure_get_uint (source_stats, "sent-nack-count", &nack)) + gst_structure_set (in, "nack-count", G_TYPE_UINT, nack, NULL); + if (jb_stats) + gst_structure_set (in, "packets-duplicated", G_TYPE_UINT64, duplicates, + NULL); + + /* RTCInboundRtpStreamStats: + + To be added: + + required DOMString receiverId; + double averageRtcpInterval; + unsigned long long headerBytesReceived; + unsigned long long fecPacketsReceived; + unsigned long long fecPacketsDiscarded; + unsigned long long bytesReceived; + unsigned long long packetsFailedDecryption; + record<USVString, unsigned long long> perDscpPacketsReceived; + unsigned long nackCount; + unsigned long firCount; + unsigned long pliCount; + unsigned long sliCount; + double jitterBufferDelay; + + Not relevant because webrtcbin doesn't decode or depayload: + unsigned long framesDecoded; + unsigned long keyFramesDecoded; + unsigned long frameWidth; + unsigned long frameHeight; + unsigned long frameBitDepth; + double framesPerSecond; + unsigned long long qpSum; + double totalDecodeTime; + double totalInterFrameDelay; + double totalSquaredInterFrameDelay; + boolean voiceActivityFlag; + DOMHighResTimeStamp 
lastPacketReceivedTimestamp; + double totalProcessingDelay; + DOMHighResTimeStamp estimatedPlayoutTimestamp; + unsigned long long jitterBufferEmittedCount; + unsigned long long totalSamplesReceived; + unsigned long long totalSamplesDecoded; + unsigned long long samplesDecodedWithSilk; + unsigned long long samplesDecodedWithCelt; + unsigned long long concealedSamples; + unsigned long long silentConcealedSamples; + unsigned long long concealmentEvents; + unsigned long long insertedSamplesForDeceleration; + unsigned long long removedSamplesForAcceleration; + double audioLevel; + double totalAudioEnergy; + double totalSamplesDuration; + unsigned long framesReceived; + DOMString decoderImplementation; + */ r_out = gst_structure_new_empty (r_out_id); _set_base_stats (r_out, GST_WEBRTC_STATS_REMOTE_OUTBOUND_RTP, ts, r_out_id); @@ -251,30 +462,67 @@ gst_structure_set (r_out, "codec-id", G_TYPE_STRING, codec_id, NULL); gst_structure_set (r_out, "transport-id", G_TYPE_STRING, transport_id, NULL); - if (have_rb) { - guint32 rtt; - if (gst_structure_get_uint (source_stats, "rb-round-trip", &rtt)) { - /* 16.16 fixed point to double */ - double val = FIXED_16_16_TO_DOUBLE (rtt); - gst_structure_set (r_out, "round-trip-time", G_TYPE_DOUBLE, val, NULL); - } - } else { - /* default values */ - gst_structure_set (r_out, "round-trip-time", G_TYPE_DOUBLE, 0.0, NULL); - } - /* XXX: mediaType, trackId, sliCount, qpSum */ + /* XXX: mediaType, trackId */ + + /* RTCSentRtpStreamStats */ -/* RTCSentRTPStreamStats */ if (have_sr) { - if (gst_structure_get_uint64 (source_stats, "sr-octet-count", &bytes)) - gst_structure_set (r_out, "bytes-sent", G_TYPE_UINT64, bytes, NULL); - if (gst_structure_get_uint64 (source_stats, "sr-packet-count", &packets)) - gst_structure_set (r_out, "packets-sent", G_TYPE_UINT64, packets, NULL); + guint sr_bytes, sr_packets; + + if (gst_structure_get_uint (source_stats, "sr-octet-count", &sr_bytes)) + gst_structure_set (r_out, "bytes-sent", G_TYPE_UINT, sr_bytes, 
NULL); + if (gst_structure_get_uint (source_stats, "sr-packet-count", &sr_packets)) + gst_structure_set (r_out, "packets-sent", G_TYPE_UINT, sr_packets, + NULL); } -/* XXX: - unsigned long packetsDiscardedOnSend; - unsigned long long bytesDiscardedOnSend; -*/ + + /* RTCSentRtpStreamStats: + + To be added: + + unsigned long rtxSsrc; + DOMString mediaSourceId; + DOMString senderId; + DOMString remoteId; + DOMString rid; + DOMHighResTimeStamp lastPacketSentTimestamp; + unsigned long long headerBytesSent; + unsigned long packetsDiscardedOnSend; + unsigned long long bytesDiscardedOnSend; + unsigned long fecPacketsSent; + unsigned long long retransmittedPacketsSent; + unsigned long long retransmittedBytesSent; + double averageRtcpInterval; + unsigned long sliCount; + + Can't be implemented because we don't decode: + + double targetBitrate; + unsigned long long totalEncodedBytesTarget; + unsigned long frameWidth; + unsigned long frameHeight; + unsigned long frameBitDepth; + double framesPerSecond; + unsigned long framesSent; + unsigned long hugeFramesSent; + unsigned long framesEncoded; + unsigned long keyFramesEncoded; + unsigned long framesDiscardedOnSend; + unsigned long long qpSum; + unsigned long long totalSamplesSent; + unsigned long long samplesEncodedWithSilk; + unsigned long long samplesEncodedWithCelt; + boolean voiceActivityFlag; + double totalEncodeTime; + double totalPacketSendDelay; + RTCQualityLimitationReason qualityLimitationReason; + record<DOMString, double> qualityLimitationDurations; + unsigned long qualityLimitationResolutionChanges; + record<USVString, unsigned long long> perDscpPacketsSent; + DOMString encoderImplementation; + */ + + /* RTCRemoteOutboundRtpStreamStats */ if (have_sr) { guint64 ntptime; @@ -290,11 +538,22 @@ gst_structure_set (r_out, "local-id", G_TYPE_STRING, in_id, NULL); - gst_structure_set (s, in_id, GST_TYPE_STRUCTURE, in, NULL); - gst_structure_set (s, r_out_id, GST_TYPE_STRUCTURE, r_out, NULL); + /* To be added: + reportsSent 
+ */ + + /* Store the raw stats from GStreamer into the structure for advanced + * information. + */ + if (jb_stats) + _gst_structure_take_structure (in, "gst-rtpjitterbuffer-stats", + &jb_stats); - gst_structure_free (in); - gst_structure_free (r_out); + gst_structure_set (in, "gst-rtpsource-stats", GST_TYPE_STRUCTURE, + source_stats, NULL); + + _gst_structure_take_structure (s, in_id, &in); + _gst_structure_take_structure (s, r_out_id, &r_out); g_free (in_id); g_free (r_out_id); @@ -304,7 +563,8 @@ /* https://www.w3.org/TR/webrtc-stats/#candidatepair-dict* */ static gchar * _get_stats_from_ice_transport (GstWebRTCBin * webrtc, - GstWebRTCICETransport * transport, GstStructure * s) + GstWebRTCICETransport * transport, const GstStructure * twcc_stats, + GstStructure * s) { GstStructure *stats; gchar *id; @@ -362,6 +622,14 @@ }; */ + /* XXX: these stats are at the rtp session level but there isn't a specific + * stats structure for that. The RTCIceCandidatePairStats is the closest with + * the 'availableIncomingBitrate' and 'availableOutgoingBitrate' fields + */ + if (twcc_stats) + gst_structure_set (stats, "gst-twcc-stats", GST_TYPE_STRUCTURE, twcc_stats, + NULL); + gst_structure_set (s, id, GST_TYPE_STRUCTURE, stats, NULL); gst_structure_free (stats); @@ -371,7 +639,8 @@ /* https://www.w3.org/TR/webrtc-stats/#dom-rtctransportstats */ static gchar * _get_stats_from_dtls_transport (GstWebRTCBin * webrtc, - GstWebRTCDTLSTransport * transport, GstStructure * s) + GstWebRTCDTLSTransport * transport, const GstStructure * twcc_stats, + GstStructure * s) { GstStructure *stats; gchar *id; @@ -419,7 +688,9 @@ gst_structure_set (s, id, GST_TYPE_STRUCTURE, stats, NULL); gst_structure_free (stats); - ice_id = _get_stats_from_ice_transport (webrtc, transport->transport, s); + ice_id = + _get_stats_from_ice_transport (webrtc, transport->transport, twcc_stats, + s); g_free (ice_id); return id; @@ -428,11 +699,12 @@ static void _get_stats_from_transport_channel (GstWebRTCBin * 
webrtc, TransportStream * stream, const gchar * codec_id, guint ssrc, - GstStructure * s) + guint clock_rate, GstStructure * s) { GstWebRTCDTLSTransport *transport; GObject *rtp_session; - GstStructure *rtp_stats; + GObject *gst_rtp_session; + GstStructure *rtp_stats, *twcc_stats; GValueArray *source_stats; gchar *transport_id; double ts; @@ -442,13 +714,14 @@ transport = stream->transport; if (!transport) - transport = stream->transport; - if (!transport) return; g_signal_emit_by_name (webrtc->rtpbin, "get-internal-session", stream->session_id, &rtp_session); g_object_get (rtp_session, "stats", &rtp_stats, NULL); + g_signal_emit_by_name (webrtc->rtpbin, "get-session", + stream->session_id, &gst_rtp_session); + g_object_get (gst_rtp_session, "twcc-stats", &twcc_stats, NULL); gst_structure_get (rtp_stats, "source-stats", G_TYPE_VALUE_ARRAY, &source_stats, NULL); @@ -458,7 +731,8 @@ "transport %" GST_PTR_FORMAT, stream, rtp_session, source_stats->n_values, transport); - transport_id = _get_stats_from_dtls_transport (webrtc, transport, s); + transport_id = + _get_stats_from_dtls_transport (webrtc, transport, twcc_stats, s); /* construct stats objects */ for (i = 0; i < source_stats->n_values; i++) { @@ -469,29 +743,38 @@ stats = gst_value_get_structure (val); /* skip foreign sources */ - gst_structure_get (stats, "ssrc", G_TYPE_UINT, &stats_ssrc, NULL); - if (ssrc && stats_ssrc && ssrc != stats_ssrc) - continue; - - _get_stats_from_rtp_source_stats (webrtc, stats, codec_id, transport_id, s); + if (gst_structure_get_uint (stats, "ssrc", &stats_ssrc) && + ssrc == stats_ssrc) + _get_stats_from_rtp_source_stats (webrtc, stream, stats, codec_id, + transport_id, s); + else if (gst_structure_get_uint (stats, "rb-ssrc", &stats_ssrc) && + ssrc == stats_ssrc) + _get_stats_from_remote_rtp_source_stats (webrtc, stream, stats, ssrc, + clock_rate, codec_id, transport_id, s); } g_object_unref (rtp_session); + g_object_unref (gst_rtp_session); gst_structure_free (rtp_stats); + if 
(twcc_stats) + gst_structure_free (twcc_stats); g_value_array_free (source_stats); g_free (transport_id); } /* https://www.w3.org/TR/webrtc-stats/#codec-dict* */ -static void +static gboolean _get_codec_stats_from_pad (GstWebRTCBin * webrtc, GstPad * pad, - GstStructure * s, gchar ** out_id, guint * out_ssrc) + GstStructure * s, gchar ** out_id, guint * out_ssrc, guint * out_clock_rate) { + GstWebRTCBinPad *wpad = GST_WEBRTC_BIN_PAD (pad); GstStructure *stats; - GstCaps *caps; + GstCaps *caps = NULL; gchar *id; double ts; guint ssrc = 0; + gint clock_rate = 0; + gboolean has_caps_ssrc = FALSE; gst_structure_get_double (s, "timestamp", &ts); @@ -499,10 +782,15 @@ id = g_strdup_printf ("codec-stats-%s", GST_OBJECT_NAME (pad)); _set_base_stats (stats, GST_WEBRTC_STATS_CODEC, ts, id); - caps = gst_pad_get_current_caps (pad); + if (wpad->received_caps) + caps = gst_caps_ref (wpad->received_caps); + GST_DEBUG_OBJECT (pad, "Pad caps are: %" GST_PTR_FORMAT, caps); if (caps && gst_caps_is_fixed (caps)) { GstStructure *caps_s = gst_caps_get_structure (caps, 0); - gint pt, clock_rate; + gint pt; + const gchar *encoding_name, *media, *encoding_params; + GstSDPMedia sdp_media = { 0 }; + guint channels = 0; if (gst_structure_get_int (caps_s, "payload", &pt)) gst_structure_set (stats, "payload-type", G_TYPE_UINT, pt, NULL); @@ -510,10 +798,45 @@ if (gst_structure_get_int (caps_s, "clock-rate", &clock_rate)) gst_structure_set (stats, "clock-rate", G_TYPE_UINT, clock_rate, NULL); - if (gst_structure_get_uint (caps_s, "ssrc", &ssrc)) + if (gst_structure_get_uint (caps_s, "ssrc", &ssrc)) { gst_structure_set (stats, "ssrc", G_TYPE_UINT, ssrc, NULL); + has_caps_ssrc = TRUE; + } + + media = gst_structure_get_string (caps_s, "media"); + encoding_name = gst_structure_get_string (caps_s, "encoding-name"); + encoding_params = gst_structure_get_string (caps_s, "encoding-params"); + + if (media || encoding_name) { + gchar *mime_type; + + mime_type = g_strdup_printf ("%s/%s", media ? 
media : "", + encoding_name ? encoding_name : ""); + gst_structure_set (stats, "mime-type", G_TYPE_STRING, mime_type, NULL); + g_free (mime_type); + } + + if (encoding_params) + channels = atoi (encoding_params); + if (channels) + gst_structure_set (stats, "channels", G_TYPE_UINT, channels, NULL); + + if (gst_pad_get_direction (pad) == GST_PAD_SRC) + gst_structure_set (stats, "codec-type", G_TYPE_STRING, "decode", NULL); + else + gst_structure_set (stats, "codec-type", G_TYPE_STRING, "encode", NULL); + + gst_sdp_media_init (&sdp_media); + if (gst_sdp_media_set_media_from_caps (caps, &sdp_media) == GST_SDP_OK) { + const gchar *fmtp = gst_sdp_media_get_attribute_val (&sdp_media, "fmtp"); - /* FIXME: codecType, mimeType, channels, sdpFmtpLine, implementation, transportId */ + if (fmtp) { + gst_structure_set (stats, "sdp-fmtp-line", G_TYPE_STRING, fmtp, NULL); + } + } + gst_sdp_media_uninit (&sdp_media); + + /* FIXME: transportId */ } if (caps) @@ -529,6 +852,11 @@ if (out_ssrc) *out_ssrc = ssrc; + + if (out_clock_rate) + *out_clock_rate = clock_rate; + + return has_caps_ssrc; } static gboolean @@ -537,9 +865,11 @@ GstWebRTCBinPad *wpad = GST_WEBRTC_BIN_PAD (pad); TransportStream *stream; gchar *codec_id; - guint ssrc; + guint ssrc, clock_rate; + gboolean has_caps_ssrc; - _get_codec_stats_from_pad (webrtc, pad, s, &codec_id, &ssrc); + has_caps_ssrc = _get_codec_stats_from_pad (webrtc, pad, s, &codec_id, &ssrc, + &clock_rate); if (!wpad->trans) goto out; @@ -548,15 +878,19 @@ if (!stream) goto out; - _get_stats_from_transport_channel (webrtc, stream, codec_id, ssrc, s); + if (!has_caps_ssrc) + ssrc = wpad->last_ssrc; + + _get_stats_from_transport_channel (webrtc, stream, codec_id, ssrc, + clock_rate, s); out: g_free (codec_id); return TRUE; } -void -gst_webrtc_bin_update_stats (GstWebRTCBin * webrtc) +GstStructure * +gst_webrtc_bin_create_stats (GstWebRTCBin * webrtc, GstPad * pad) { GstStructure *s = gst_structure_new_empty ("application/x-webrtc-stats"); double ts = 
monotonic_time_as_double_milliseconds (); @@ -579,12 +913,13 @@ gst_structure_free (pc_stats); } - gst_element_foreach_pad (GST_ELEMENT (webrtc), - (GstElementForeachPadFunc) _get_stats_from_pad, s); + if (pad) + _get_stats_from_pad (webrtc, pad, s); + else + gst_element_foreach_pad (GST_ELEMENT (webrtc), + (GstElementForeachPadFunc) _get_stats_from_pad, s); gst_structure_remove_field (s, "timestamp"); - if (webrtc->priv->stats) - gst_structure_free (webrtc->priv->stats); - webrtc->priv->stats = s; + return s; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/gstwebrtcstats.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/gstwebrtcstats.h
Changed
@@ -28,7 +28,8 @@ G_BEGIN_DECLS G_GNUC_INTERNAL -void gst_webrtc_bin_update_stats (GstWebRTCBin * webrtc); +GstStructure * gst_webrtc_bin_create_stats (GstWebRTCBin * webrtc, + GstPad * pad); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/icestream.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/icestream.c
Changed
@@ -46,6 +46,7 @@ { gboolean gathered; GList *transports; + gboolean gathering_started; }; #define gst_webrtc_ice_stream_parent_class parent_class @@ -187,11 +188,35 @@ } g_object_get (stream->ice, "agent", &agent, NULL); + + if (!stream->priv->gathering_started) { + if (stream->ice->min_rtp_port != 0 || stream->ice->max_rtp_port != 65535) { + if (stream->ice->min_rtp_port > stream->ice->max_rtp_port) { + GST_ERROR_OBJECT (stream->ice, + "invalid port range: min-rtp-port %d must be <= max-rtp-port %d", + stream->ice->min_rtp_port, stream->ice->max_rtp_port); + return FALSE; + } + + nice_agent_set_port_range (agent, stream->stream_id, + NICE_COMPONENT_TYPE_RTP, stream->ice->min_rtp_port, + stream->ice->max_rtp_port); + } + /* mark as gathering started to prevent changing ports again */ + stream->priv->gathering_started = TRUE; + } + if (!nice_agent_gather_candidates (agent, stream->stream_id)) { g_object_unref (agent); return FALSE; } + for (l = stream->priv->transports; l; l = l->next) { + GstWebRTCNiceTransport *trans = l->data; + + gst_webrtc_nice_transport_update_buffer_size (trans); + } + g_object_unref (agent); return TRUE; }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/meson.build
Changed
@@ -4,7 +4,7 @@ 'gstwebrtcstats.c', 'icestream.c', 'nicetransport.c', - 'sctptransport.c', + 'webrtcsctptransport.c', 'gstwebrtcbin.c', 'transportreceivebin.c', 'transportsendbin.c', @@ -15,7 +15,7 @@ 'webrtcdatachannel.c', ] -libnice_dep = dependency('nice', version : '>=0.1.14', required : get_option('webrtc'), +libnice_dep = dependency('nice', version : '>=0.1.17', required : get_option('webrtc'), fallback : ['libnice', 'libnice_dep'], default_options: ['tests=disabled']) @@ -25,7 +25,7 @@ c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API'], include_directories : [configinc], dependencies : [gio_dep, libnice_dep, gstbase_dep, gstsdp_dep, - gstapp_dep, gstwebrtc_dep, gstsctp_dep], + gstapp_dep, gstwebrtc_dep, gstsctp_dep, gstrtp_dep], install : true, install_dir : plugins_install_dir, )
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/nicetransport.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/nicetransport.c
Changed
@@ -24,6 +24,8 @@ #include "nicetransport.h" #include "icestream.h" +#include <gio/gnetworking.h> + #define GST_CAT_DEFAULT gst_webrtc_nice_transport_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -37,6 +39,8 @@ { PROP_0, PROP_STREAM, + PROP_SEND_BUFFER_SIZE, + PROP_RECEIVE_BUFFER_SIZE }; //static guint gst_webrtc_nice_transport_signals[LAST_SIGNAL] = { 0 }; @@ -44,6 +48,9 @@ struct _GstWebRTCNiceTransportPrivate { gboolean running; + + gint send_buffer_size; + gint receive_buffer_size; }; #define gst_webrtc_nice_transport_parent_class parent_class @@ -115,6 +122,14 @@ gst_object_unref (nice->stream); nice->stream = g_value_dup_object (value); break; + case PROP_SEND_BUFFER_SIZE: + nice->priv->send_buffer_size = g_value_get_int (value); + gst_webrtc_nice_transport_update_buffer_size (nice); + break; + case PROP_RECEIVE_BUFFER_SIZE: + nice->priv->receive_buffer_size = g_value_get_int (value); + gst_webrtc_nice_transport_update_buffer_size (nice); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -131,6 +146,12 @@ case PROP_STREAM: g_value_set_object (value, nice->stream); break; + case PROP_SEND_BUFFER_SIZE: + g_value_set_int (value, nice->priv->send_buffer_size); + break; + case PROP_RECEIVE_BUFFER_SIZE: + g_value_set_int (value, nice->priv->receive_buffer_size); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -147,6 +168,50 @@ G_OBJECT_CLASS (parent_class)->finalize (object); } +void +gst_webrtc_nice_transport_update_buffer_size (GstWebRTCNiceTransport * nice) +{ + NiceAgent *agent = NULL; + GPtrArray *sockets; + guint i; + + g_object_get (nice->stream->ice, "agent", &agent, NULL); + g_assert (agent != NULL); + + sockets = nice_agent_get_sockets (agent, nice->stream->stream_id, 1); + if (sockets == NULL) { + g_object_unref (agent); + return; + } + + for (i = 0; i < sockets->len; i++) { + GSocket *gsocket = g_ptr_array_index (sockets, i); +#ifdef SO_SNDBUF + if 
(nice->priv->send_buffer_size != 0) {
+ GError *gerror = NULL;
+ if (!g_socket_set_option (gsocket, SOL_SOCKET, SO_SNDBUF,
+ nice->priv->send_buffer_size, &gerror))
+ GST_WARNING_OBJECT (nice, "Could not set send buffer size : %s",
+ gerror->message);
+ g_clear_error (&gerror);
+ }
+#endif
+#ifdef SO_RCVBUF
+ if (nice->priv->receive_buffer_size != 0) {
+ GError *gerror = NULL;
+ if (!g_socket_set_option (gsocket, SOL_SOCKET, SO_RCVBUF,
+ nice->priv->receive_buffer_size, &gerror))
+ GST_WARNING_OBJECT (nice, "Could not set receive buffer size : %s",
+ gerror->message);
+ g_clear_error (&gerror);
+ }
+#endif
+ }
+ g_ptr_array_unref (sockets);
+ g_object_unref (agent);
+}
+
+
 static void
 _on_new_selected_pair (NiceAgent * agent, guint stream_id,
 NiceComponentType component, NiceCandidate * lcandidate,
@@ -245,6 +310,34 @@
 "WebRTC ICE Stream", "ICE stream associated with this transport",
 GST_TYPE_WEBRTC_ICE_STREAM,
 G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS));
+
+ /**
+ * GstWebRTCNiceTransport:send-buffer-size:
+ *
+ * Size of the kernel send buffer in bytes, 0=default
+ *
+ * Since: 1.20
+ */
+
+ g_object_class_install_property (G_OBJECT_CLASS (klass),
+ PROP_SEND_BUFFER_SIZE, g_param_spec_int ("send-buffer-size",
+ "Send Buffer Size",
+ "Size of the kernel send buffer in bytes, 0=default", 0, G_MAXINT, 0,
+ G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
+
+ /**
+ * GstWebRTCNiceTransport:receive-buffer-size:
+ *
+ * Size of the kernel receive buffer in bytes, 0=default
+ *
+ * Since: 1.20
+ */
+
+ g_object_class_install_property (G_OBJECT_CLASS (klass),
+ PROP_RECEIVE_BUFFER_SIZE, g_param_spec_int ("receive-buffer-size",
+ "Receive Buffer Size",
+ "Size of the kernel receive buffer in bytes, 0=default", 0, G_MAXINT,
+ 0, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
 }
 
 static void
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/nicetransport.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/nicetransport.h
Changed
@@ -26,6 +26,8 @@ #include <gst/webrtc/webrtc.h> #include "gstwebrtcice.h" +#include "gst/webrtc/webrtc-priv.h" + G_BEGIN_DECLS GType gst_webrtc_nice_transport_get_type(void); @@ -50,8 +52,11 @@ GstWebRTCICETransportClass parent_class; }; -GstWebRTCNiceTransport * gst_webrtc_nice_transport_new (GstWebRTCICEStream * stream, - GstWebRTCICEComponent component); +GstWebRTCNiceTransport * gst_webrtc_nice_transport_new (GstWebRTCICEStream * stream, + GstWebRTCICEComponent component); + +void gst_webrtc_nice_transport_update_buffer_size (GstWebRTCNiceTransport * nice); + G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/transportreceivebin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/transportreceivebin.c
Changed
@@ -23,26 +23,19 @@
 #include "transportreceivebin.h"
 #include "utils.h"
+#include "gst/webrtc/webrtc-priv.h"
 
 /*
- * ,----------------------------transport_receive_%u---------------------------,
- * ; (rtp/data) ;
- * ; ,-nicesrc-, ,-capsfilter-, ,--queue--, ,-dtlssrtpdec-, ,-funnel-, ;
- * ; ; src o-o sink src o-osink srco-osink rtp_srco-------o sink_0 ; ;
- * ; '---------' '------------' '---------' ; ; ; src o--o rtp_src
- * ; ; rtcp_srco---, ,-o sink_1 ; ;
- * ; ; ; ; ; '--------' ;
- * ; ; data_srco-, ; ; ,-funnel-, ;
- * ; (rtcp) '-------------' ; '-+-o sink_0 ; ;
- * ; ,-nicesrc-, ,-capsfilter-, ,--queue--, ,-dtlssrtpdec-, ; ,-' ; src o--o rtcp_src
- * ; ; src o-o sink src o-osink srco-osink rtp_srco-+-' ,-o sink_1 ; ;
- * ; '---------' '------------' '---------' ; ; ; ; '--------' ;
- * ; ; rtcp_srco-+---' ,-funnel-, ;
- * ; ; ; '-----o sink_0 ; ;
- * ; ; data_srco-, ; src o--o data_src
- * ; '-------------' '-----o sink_1 ; ;
- * ; '--------' ;
- * '---------------------------------------------------------------------------'
+ * ,-----------------------transport_receive_%u------------------,
+ * ; ;
+ * ; ,-nicesrc-, ,-capsfilter-, ,---queue---, ,-dtlssrtpdec-, ;
+ * ; ; src o-o sink src o-o sink src o-osink rtp_srco---o rtp_src
+ * ; '---------' '------------' '-----------' ; ; ;
+ * ; ; rtcp_srco---o rtcp_src
+ * ; ; ; ;
+ * ; ; data_srco---o data_src
+ * ; '-------------' ;
+ * '-------------------------------------------------------------'
 *
 * Do we really want to be *that* permissive in what we accept?
 *
@@ -103,7 +96,7 @@
 * them. 
Sticky events would be forwarded again later once we unblock * and we don't want to forward them here already because that might * cause a spurious GST_FLOW_FLUSHING */ - if (GST_IS_EVENT (info->data)) + if (GST_IS_EVENT (info->data) || GST_IS_QUERY (info->data)) return GST_PAD_PROBE_DROP; /* But block on any actual data-flow so we don't accidentally send that @@ -119,14 +112,32 @@ transport_receive_bin_set_receive_state (TransportReceiveBin * receive, ReceiveState state) { + GstWebRTCICEConnectionState icestate; g_mutex_lock (&receive->pad_block_lock); if (receive->receive_state != state) { - GST_DEBUG_OBJECT (receive, "changing receive state to %s", + GST_DEBUG_OBJECT (receive, "Requested change of receive state to %s", _receive_state_to_string (state)); } + receive->receive_state = state; + + g_object_get (receive->stream->transport->transport, "state", &icestate, + NULL); + if (state == RECEIVE_STATE_PASS) { + if (icestate == GST_WEBRTC_ICE_CONNECTION_STATE_CONNECTED || + icestate == GST_WEBRTC_ICE_CONNECTION_STATE_COMPLETED) { + GST_LOG_OBJECT (receive, "Unblocking nicesrc because ICE is connected."); + } else { + GST_LOG_OBJECT (receive, "Can't unblock nicesrc yet because ICE " + "is not connected, it is %d", icestate); + state = RECEIVE_STATE_BLOCK; + } + } + if (state == RECEIVE_STATE_PASS) { + g_object_set (receive->queue, "leaky", 0, NULL); + if (receive->rtp_block) _free_pad_block (receive->rtp_block); receive->rtp_block = NULL; @@ -136,6 +147,7 @@ receive->rtcp_block = NULL; } else { g_assert (state == RECEIVE_STATE_BLOCK); + g_object_set (receive->queue, "leaky", 2, NULL); if (receive->rtp_block == NULL) { GstWebRTCDTLSTransport *transport; GstElement *dtlssrtpdec; @@ -155,29 +167,20 @@ (GstPadProbeCallback) pad_block, receive, NULL); gst_object_unref (peer_pad); gst_object_unref (pad); - - transport = receive->stream->rtcp_transport; - dtlssrtpdec = transport->dtlssrtpdec; - pad = gst_element_get_static_pad (dtlssrtpdec, "sink"); - peer_pad = 
gst_pad_get_peer (pad); - receive->rtcp_block = - _create_pad_block (GST_ELEMENT (receive), peer_pad, 0, NULL, NULL); - receive->rtcp_block->block_id = - gst_pad_add_probe (peer_pad, - GST_PAD_PROBE_TYPE_BLOCK | - GST_PAD_PROBE_TYPE_DATA_DOWNSTREAM, - (GstPadProbeCallback) pad_block, receive, NULL); - gst_object_unref (peer_pad); - gst_object_unref (pad); } } } - - receive->receive_state = state; g_mutex_unlock (&receive->pad_block_lock); } static void +_on_notify_ice_connection_state (GstWebRTCICETransport * transport, + GParamSpec * pspec, TransportReceiveBin * receive) +{ + transport_receive_bin_set_receive_state (receive, receive->receive_state); +} + +static void transport_receive_bin_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { @@ -250,9 +253,6 @@ elem = receive->stream->transport->transport->src; gst_element_set_locked_state (elem, TRUE); gst_element_set_state (elem, GST_STATE_PLAYING); - elem = receive->stream->rtcp_transport->transport->src; - gst_element_set_locked_state (elem, TRUE); - gst_element_set_state (elem, GST_STATE_PLAYING); break; } default: @@ -270,9 +270,6 @@ elem = receive->stream->transport->transport->src; gst_element_set_locked_state (elem, FALSE); gst_element_set_state (elem, GST_STATE_NULL); - elem = receive->stream->rtcp_transport->transport->src; - gst_element_set_locked_state (elem, FALSE); - gst_element_set_state (elem, GST_STATE_NULL); if (receive->rtp_block) _free_pad_block (receive->rtp_block); @@ -297,13 +294,25 @@ GST_WARNING_OBJECT (receive, "Internal receive queue overrun. 
Dropping data"); } +static GstPadProbeReturn +drop_serialized_queries (GstPad * pad, GstPadProbeInfo * info, + TransportReceiveBin * receive) +{ + GstQuery *query = GST_PAD_PROBE_INFO_QUERY (info); + + if (GST_QUERY_IS_SERIALIZED (query)) + return GST_PAD_PROBE_DROP; + else + return GST_PAD_PROBE_PASS; +} + static void transport_receive_bin_constructed (GObject * object) { TransportReceiveBin *receive = TRANSPORT_RECEIVE_BIN (object); GstWebRTCDTLSTransport *transport; GstPad *ghost, *pad; - GstElement *capsfilter, *funnel, *queue; + GstElement *capsfilter; GstCaps *caps; g_return_if_fail (receive->stream); @@ -317,46 +326,25 @@ g_object_set (capsfilter, "caps", caps, NULL); gst_caps_unref (caps); - queue = gst_element_factory_make ("queue", NULL); + receive->queue = gst_element_factory_make ("queue", NULL); /* FIXME: make this configurable? */ - g_object_set (queue, "leaky", 2, "max-size-time", (guint64) 0, + g_object_set (receive->queue, "leaky", 2, "max-size-time", (guint64) 0, "max-size-buffers", 0, "max-size-bytes", 5 * 1024 * 1024, NULL); - g_signal_connect (queue, "overrun", G_CALLBACK (rtp_queue_overrun), receive); - - gst_bin_add (GST_BIN (receive), GST_ELEMENT (queue)); - gst_bin_add (GST_BIN (receive), GST_ELEMENT (capsfilter)); - if (!gst_element_link_pads (capsfilter, "src", queue, "sink")) - g_warn_if_reached (); - - if (!gst_element_link_pads (queue, "src", transport->dtlssrtpdec, "sink")) - g_warn_if_reached (); - - gst_bin_add (GST_BIN (receive), GST_ELEMENT (transport->transport->src)); - if (!gst_element_link_pads (GST_ELEMENT (transport->transport->src), "src", - GST_ELEMENT (capsfilter), "sink")) - g_warn_if_reached (); + g_signal_connect (receive->queue, "overrun", G_CALLBACK (rtp_queue_overrun), + receive); - /* link ice src, dtlsrtp together for rtcp */ - transport = receive->stream->rtcp_transport; - gst_bin_add (GST_BIN (receive), GST_ELEMENT (transport->dtlssrtpdec)); - - capsfilter = gst_element_factory_make ("capsfilter", NULL); - caps 
= gst_caps_new_empty_simple ("application/x-rtcp"); - g_object_set (capsfilter, "caps", caps, NULL); - gst_caps_unref (caps); - - queue = gst_element_factory_make ("queue", NULL); - /* FIXME: make this configurable? */ - g_object_set (queue, "leaky", 2, "max-size-time", (guint64) 0, - "max-size-buffers", 0, "max-size-bytes", 5 * 1024 * 1024, NULL); - g_signal_connect (queue, "overrun", G_CALLBACK (rtp_queue_overrun), receive); + pad = gst_element_get_static_pad (receive->queue, "sink"); + gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, + (GstPadProbeCallback) drop_serialized_queries, receive, NULL); + gst_object_unref (pad); - gst_bin_add (GST_BIN (receive), queue); + gst_bin_add (GST_BIN (receive), GST_ELEMENT (receive->queue)); gst_bin_add (GST_BIN (receive), GST_ELEMENT (capsfilter)); - if (!gst_element_link_pads (capsfilter, "src", queue, "sink")) + if (!gst_element_link_pads (capsfilter, "src", receive->queue, "sink")) g_warn_if_reached (); - if (!gst_element_link_pads (queue, "src", transport->dtlssrtpdec, "sink")) + if (!gst_element_link_pads (receive->queue, "src", transport->dtlssrtpdec, + "sink")) g_warn_if_reached (); gst_bin_add (GST_BIN (receive), GST_ELEMENT (transport->transport->src)); @@ -364,52 +352,32 @@ GST_ELEMENT (capsfilter), "sink")) g_warn_if_reached (); - /* create funnel for rtp_src */ - funnel = gst_element_factory_make ("funnel", NULL); - gst_bin_add (GST_BIN (receive), funnel); - if (!gst_element_link_pads (receive->stream->transport->dtlssrtpdec, - "rtp_src", funnel, "sink_0")) - g_warn_if_reached (); - if (!gst_element_link_pads (receive->stream->rtcp_transport->dtlssrtpdec, - "rtp_src", funnel, "sink_1")) - g_warn_if_reached (); - - pad = gst_element_get_static_pad (funnel, "src"); + /* expose rtp_src */ + pad = + gst_element_get_static_pad (receive->stream->transport->dtlssrtpdec, + "rtp_src"); receive->rtp_src = gst_ghost_pad_new ("rtp_src", pad); gst_element_add_pad (GST_ELEMENT (receive), receive->rtp_src); 
gst_object_unref (pad);
 
- /* create funnel for rtcp_src */
- funnel = gst_element_factory_make ("funnel", NULL);
- gst_bin_add (GST_BIN (receive), funnel);
- if (!gst_element_link_pads (receive->stream->transport->dtlssrtpdec,
- "rtcp_src", funnel, "sink_0"))
- g_warn_if_reached ();
- if (!gst_element_link_pads (receive->stream->rtcp_transport->dtlssrtpdec,
- "rtcp_src", funnel, "sink_1"))
- g_warn_if_reached ();
-
- pad = gst_element_get_static_pad (funnel, "src");
+ /* expose rtcp_src */
+ pad = gst_element_get_static_pad (receive->stream->transport->dtlssrtpdec,
+ "rtcp_src");
 receive->rtcp_src = gst_ghost_pad_new ("rtcp_src", pad);
 gst_element_add_pad (GST_ELEMENT (receive), receive->rtcp_src);
 gst_object_unref (pad);
 
- /* create funnel for data_src */
- funnel = gst_element_factory_make ("funnel", NULL);
- gst_bin_add (GST_BIN (receive), funnel);
- if (!gst_element_link_pads (receive->stream->transport->dtlssrtpdec,
- "data_src", funnel, "sink_0"))
- g_warn_if_reached ();
- if (!gst_element_link_pads (receive->stream->rtcp_transport->dtlssrtpdec,
- "data_src", funnel, "sink_1"))
- g_warn_if_reached ();
-
- pad = gst_element_get_static_pad (funnel, "src");
+ /* expose data_src */
+ pad = gst_element_request_pad_simple (receive->stream->transport->dtlssrtpdec,
+ "data_src");
 ghost = gst_ghost_pad_new ("data_src", pad);
 gst_element_add_pad (GST_ELEMENT (receive), ghost);
 gst_object_unref (pad);
 
+ g_signal_connect_after (receive->stream->transport->transport,
+ "notify::state", G_CALLBACK (_on_notify_ice_connection_state), receive);
+
 G_OBJECT_CLASS (parent_class)->constructed (object);
 }
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/transportreceivebin.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/transportreceivebin.h
Changed
@@ -52,6 +52,7 @@ struct pad_block *rtcp_block; GMutex pad_block_lock; ReceiveState receive_state; + GstElement *queue; }; struct _TransportReceiveBinClass
View file
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/transportsendbin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/transportsendbin.c
Changed
@@ -23,22 +23,19 @@ #include "transportsendbin.h" #include "utils.h" +#include "gst/webrtc/webrtc-priv.h" /* - * ,------------------------transport_send_%u-------------------------, - * ; ,-----dtlssrtpenc---, ; - * data_sink o--------------------------o data_sink ; ; - * ; ; ; ,---nicesink---, ; - * rtp_sink o--------------------------o rtp_sink_0 src o--o sink ; ; - * ; ; ; '--------------' ; - * ; ,--outputselector--, ,-o rtcp_sink_0 ; ; - * ; ; src_0 o-' '-------------------' ; - * rtcp_sink ;---o sink ; ,----dtlssrtpenc----, ,---nicesink---, ; - * ; ; src_1 o---o rtcp_sink_0 src o--o sink ; ; - * ; '------------------' '-------------------' '--------------' ; - * '------------------------------------------------------------------' + * ,--------------transport_send_%u-------- ---, + * ; ,-----dtlssrtpenc---, ; + * data_sink o---o data_sink ; ; + * ; ; ; ,---nicesink---, ; + * rtp_sink o---o rtp_sink_0 src o--o sink ; ; + * ; ; ; '--------------' ; + * rtcp_sink o---o rtcp_sink_0 ; ; + * ; '-------------------' + * '-------------------------------------------' * - * outputselecter is used to switch between rtcp-mux and no rtcp-mux * * FIXME: Do we need a valve drop=TRUE for the no RTCP case? 
*/ @@ -73,7 +70,6 @@ { PROP_0, PROP_STREAM, - PROP_RTCP_MUX, }; #define TSB_GET_LOCK(tsb) (&tsb->lock) @@ -83,24 +79,6 @@ static void cleanup_blocks (TransportSendBin * send); static void -_set_rtcp_mux (TransportSendBin * send, gboolean rtcp_mux) -{ - GstPad *active_pad; - - if (rtcp_mux) - active_pad = gst_element_get_static_pad (send->outputselector, "src_0"); - else - active_pad = gst_element_get_static_pad (send->outputselector, "src_1"); - send->rtcp_mux = rtcp_mux; - GST_OBJECT_UNLOCK (send); - - g_object_set (send->outputselector, "active-pad", active_pad, NULL); - - gst_object_unref (active_pad); - GST_OBJECT_LOCK (send); -} - -static void transport_send_bin_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { @@ -112,9 +90,6 @@ /* XXX: weak-ref this? Note, it's construct-only so can't be changed later */ send->stream = TRANSPORT_STREAM (g_value_get_object (value)); break; - case PROP_RTCP_MUX: - _set_rtcp_mux (send, g_value_get_boolean (value)); - break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -133,9 +108,6 @@ case PROP_STREAM: g_value_set_object (value, send->stream); break; - case PROP_RTCP_MUX: - g_value_set_boolean (value, send->rtcp_mux); - break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -200,9 +172,9 @@ * arguably the element should be able to deal with this itself or * we should only add it once/if we get the encoding keys */ TSB_LOCK (send); - gst_element_set_locked_state (send->rtp_ctx.dtlssrtpenc, TRUE); - gst_element_set_locked_state (send->rtcp_ctx.dtlssrtpenc, TRUE); + gst_element_set_locked_state (send->dtlssrtpenc, TRUE); send->active = TRUE; + send->has_clientness = FALSE; TSB_UNLOCK (send); break; } @@ -213,20 +185,12 @@ /* RTP */ /* unblock the encoder once the key is set, this should also be automatic */ elem = send->stream->transport->dtlssrtpenc; - send->rtp_ctx.rtp_block = block_peer_pad (elem, "rtp_sink_0"); + 
send->rtp_block = block_peer_pad (elem, "rtp_sink_0"); /* Also block the RTCP pad on the RTP encoder, in case we mux RTCP */ - send->rtp_ctx.rtcp_block = block_peer_pad (elem, "rtcp_sink_0"); + send->rtcp_block = block_peer_pad (elem, "rtcp_sink_0"); /* unblock ice sink once a connection is made, this should also be automatic */ elem = send->stream->transport->transport->sink; - send->rtp_ctx.nice_block = block_peer_pad (elem, "sink"); - /* RTCP */ - elem = send->stream->rtcp_transport->dtlssrtpenc; - /* Block the RTCP DTLS encoder */ - send->rtcp_ctx.rtcp_block = block_peer_pad (elem, "rtcp_sink_0"); - /* unblock ice sink once a connection is made, this should also be automatic */ - elem = send->stream->rtcp_transport->transport->sink; - send->rtcp_ctx.nice_block = block_peer_pad (elem, "sink"); TSB_UNLOCK (send); break; } @@ -256,8 +220,7 @@ send->active = FALSE; cleanup_blocks (send); - gst_element_set_locked_state (send->rtp_ctx.dtlssrtpenc, FALSE); - gst_element_set_locked_state (send->rtcp_ctx.dtlssrtpenc, FALSE); + gst_element_set_locked_state (send->dtlssrtpenc, FALSE); TSB_UNLOCK (send); break; @@ -272,13 +235,7 @@ static void _on_dtls_enc_key_set (GstElement * dtlssrtpenc, TransportSendBin * send) { - TransportSendBinDTLSContext *ctx; - - if (dtlssrtpenc == send->rtp_ctx.dtlssrtpenc) - ctx = &send->rtp_ctx; - else if (dtlssrtpenc == send->rtcp_ctx.dtlssrtpenc) - ctx = &send->rtcp_ctx; - else { + if (dtlssrtpenc != send->dtlssrtpenc) { GST_WARNING_OBJECT (send, "Received dtls-enc key info for unknown element %" GST_PTR_FORMAT, dtlssrtpenc); @@ -293,24 +250,40 @@ } GST_LOG_OBJECT (send, "Unblocking %" GST_PTR_FORMAT " pads", dtlssrtpenc); - _free_pad_block (ctx->rtp_block); - _free_pad_block (ctx->rtcp_block); - ctx->rtp_block = ctx->rtcp_block = NULL; + _free_pad_block (send->rtp_block); + _free_pad_block (send->rtcp_block); + send->rtp_block = send->rtcp_block = NULL; done: TSB_UNLOCK (send); } static void +maybe_start_enc (TransportSendBin * send) +{ + 
GstWebRTCICEConnectionState state;
+
+ if (!send->has_clientness) {
+ GST_LOG_OBJECT (send, "Can't start DTLS yet because client-ness is not known");
+ return;
+ }
+
+ g_object_get (send->stream->transport->transport, "state", &state, NULL);
+ if (state != GST_WEBRTC_ICE_CONNECTION_STATE_CONNECTED &&
+ state != GST_WEBRTC_ICE_CONNECTION_STATE_COMPLETED) {
+ GST_LOG_OBJECT (send, "Can't start DTLS yet because ICE is not connected.");
+ return;
+ }
+
+ gst_element_set_locked_state (send->dtlssrtpenc, FALSE);
+ gst_element_sync_state_with_parent (send->dtlssrtpenc);
+}
+
+static void
 _on_notify_dtls_client_status (GstElement * dtlssrtpenc,
 GParamSpec * pspec, TransportSendBin * send)
 {
- TransportSendBinDTLSContext *ctx;
- if (dtlssrtpenc == send->rtp_ctx.dtlssrtpenc)
- ctx = &send->rtp_ctx;
- else if (dtlssrtpenc == send->rtcp_ctx.dtlssrtpenc)
- ctx = &send->rtcp_ctx;
- else {
+ if (dtlssrtpenc != send->dtlssrtpenc) {
 GST_WARNING_OBJECT (send,
 "Received dtls-enc client mode for unknown element %"
 GST_PTR_FORMAT, dtlssrtpenc);
@@ -324,11 +297,12 @@
 goto done;
 }
 
+ send->has_clientness = TRUE;
 GST_DEBUG_OBJECT (send,
- "DTLS-SRTP encoder configured. Unlocking it and changing state %"
- GST_PTR_FORMAT, ctx->dtlssrtpenc);
- gst_element_set_locked_state (ctx->dtlssrtpenc, FALSE);
- gst_element_sync_state_with_parent (ctx->dtlssrtpenc);
+ "DTLS-SRTP encoder configured. 
Unlocking it and maybe changing state %" + GST_PTR_FORMAT, dtlssrtpenc); + maybe_start_enc (send); + done: TSB_UNLOCK (send); } @@ -337,116 +311,62 @@ _on_notify_ice_connection_state (GstWebRTCICETransport * transport, GParamSpec * pspec, TransportSendBin * send) { - GstWebRTCICEConnectionState state; - - g_object_get (transport, "state", &state, NULL); - - if (state == GST_WEBRTC_ICE_CONNECTION_STATE_CONNECTED || - state == GST_WEBRTC_ICE_CONNECTION_STATE_COMPLETED) { - TSB_LOCK (send); - if (transport == send->stream->transport->transport) { - if (send->rtp_ctx.nice_block) { - GST_LOG_OBJECT (send, "Unblocking pad %" GST_PTR_FORMAT, - send->rtp_ctx.nice_block->pad); - _free_pad_block (send->rtp_ctx.nice_block); - send->rtp_ctx.nice_block = NULL; - } - } else if (transport == send->stream->rtcp_transport->transport) { - if (send->rtcp_ctx.nice_block) { - GST_LOG_OBJECT (send, "Unblocking pad %" GST_PTR_FORMAT, - send->rtcp_ctx.nice_block->pad); - _free_pad_block (send->rtcp_ctx.nice_block); - send->rtcp_ctx.nice_block = NULL; - } - } - TSB_UNLOCK (send); - } -} - -static void -tsb_setup_ctx (TransportSendBin * send, TransportSendBinDTLSContext * ctx, - GstWebRTCDTLSTransport * transport) -{ - GstElement *dtlssrtpenc, *nicesink; - - dtlssrtpenc = ctx->dtlssrtpenc = transport->dtlssrtpenc; - nicesink = ctx->nicesink = transport->transport->sink; - - /* unblock the encoder once the key is set */ - g_signal_connect (dtlssrtpenc, "on-key-set", - G_CALLBACK (_on_dtls_enc_key_set), send); - /* Bring the encoder up to current state only once the is-client prop is set */ - g_signal_connect (dtlssrtpenc, "notify::is-client", - G_CALLBACK (_on_notify_dtls_client_status), send); - gst_bin_add (GST_BIN (send), GST_ELEMENT (dtlssrtpenc)); - - /* unblock ice sink once it signals a connection */ - g_signal_connect (transport->transport, "notify::state", - G_CALLBACK (_on_notify_ice_connection_state), send); - gst_bin_add (GST_BIN (send), GST_ELEMENT (nicesink)); - - if 
(!gst_element_link_pads (GST_ELEMENT (dtlssrtpenc), "src", nicesink, - "sink")) - g_warn_if_reached (); + TSB_LOCK (send); + maybe_start_enc (send); + TSB_UNLOCK (send); } static void transport_send_bin_constructed (GObject * object) { TransportSendBin *send = TRANSPORT_SEND_BIN (object); - GstWebRTCDTLSTransport *transport; GstPadTemplate *templ; GstPad *ghost, *pad; g_return_if_fail (send->stream); - g_object_bind_property (send, "rtcp-mux", send->stream, "rtcp-mux", - G_BINDING_BIDIRECTIONAL); + send->dtlssrtpenc = send->stream->transport->dtlssrtpenc; + send->nicesink = send->stream->transport->transport->sink; - /* Output selector to direct the RTCP for muxed-mode */ - send->outputselector = gst_element_factory_make ("output-selector", NULL); - gst_bin_add (GST_BIN (send), send->outputselector); - - /* RTP */ - transport = send->stream->transport; - /* Do the common init for the context struct */ - tsb_setup_ctx (send, &send->rtp_ctx, transport); + /* unblock the encoder once the key is set */ + g_signal_connect (send->dtlssrtpenc, "on-key-set", + G_CALLBACK (_on_dtls_enc_key_set), send); + /* Bring the encoder up to current state only once the is-client prop is set */ + g_signal_connect (send->dtlssrtpenc, "notify::is-client", + G_CALLBACK (_on_notify_dtls_client_status), send); + /* unblock ice sink once it signals a connection */ + g_signal_connect (send->stream->transport->transport, "notify::state", + G_CALLBACK (_on_notify_ice_connection_state), send); - templ = _find_pad_template (transport->dtlssrtpenc, - GST_PAD_SINK, GST_PAD_REQUEST, "rtp_sink_%d"); - pad = gst_element_request_pad (transport->dtlssrtpenc, templ, "rtp_sink_0", - NULL); + gst_bin_add (GST_BIN (send), GST_ELEMENT (send->dtlssrtpenc)); + gst_bin_add (GST_BIN (send), GST_ELEMENT (send->nicesink)); - if (!gst_element_link_pads (GST_ELEMENT (send->outputselector), "src_0", - GST_ELEMENT (transport->dtlssrtpenc), "rtcp_sink_0")) + if (!gst_element_link_pads (GST_ELEMENT (send->dtlssrtpenc), 
"src", + send->nicesink, "sink")) g_warn_if_reached (); + templ = _find_pad_template (send->dtlssrtpenc, GST_PAD_SINK, GST_PAD_REQUEST, + "rtp_sink_%d"); + pad = gst_element_request_pad (send->dtlssrtpenc, templ, "rtp_sink_0", NULL); + ghost = gst_ghost_pad_new ("rtp_sink", pad); gst_element_add_pad (GST_ELEMENT (send), ghost); gst_object_unref (pad); /* push the data stream onto the RTP dtls element */ - templ = _find_pad_template (transport->dtlssrtpenc, - GST_PAD_SINK, GST_PAD_REQUEST, "data_sink"); - pad = gst_element_request_pad (transport->dtlssrtpenc, templ, "data_sink", - NULL); + templ = _find_pad_template (send->dtlssrtpenc, GST_PAD_SINK, GST_PAD_REQUEST, + "data_sink"); + pad = gst_element_request_pad (send->dtlssrtpenc, templ, "data_sink", NULL); ghost = gst_ghost_pad_new ("data_sink", pad); gst_element_add_pad (GST_ELEMENT (send), ghost); gst_object_unref (pad); /* RTCP */ - transport = send->stream->rtcp_transport; /* Do the common init for the context struct */ - tsb_setup_ctx (send, &send->rtcp_ctx, transport); - templ = _find_pad_template (transport->dtlssrtpenc, - GST_PAD_SINK, GST_PAD_REQUEST, "rtcp_sink_%d"); - - if (!gst_element_link_pads (GST_ELEMENT (send->outputselector), "src_1", - GST_ELEMENT (transport->dtlssrtpenc), "rtcp_sink_0")) - g_warn_if_reached (); - - pad = gst_element_get_static_pad (send->outputselector, "sink"); + templ = _find_pad_template (send->dtlssrtpenc, GST_PAD_SINK, GST_PAD_REQUEST, + "rtcp_sink_%d"); + pad = gst_element_request_pad (send->dtlssrtpenc, templ, "rtcp_sink_0", NULL); ghost = gst_ghost_pad_new ("rtcp_sink", pad); gst_element_add_pad (GST_ELEMENT (send), ghost); @@ -456,45 +376,30 @@ } static void -cleanup_ctx_blocks (TransportSendBinDTLSContext * ctx) +cleanup_blocks (TransportSendBin * send) { - if (ctx->rtp_block) { - _free_pad_block (ctx->rtp_block); - ctx->rtp_block = NULL; - } - - if (ctx->rtcp_block) { - _free_pad_block (ctx->rtcp_block); - ctx->rtcp_block = NULL; + if (send->rtp_block) { + 
_free_pad_block (send->rtp_block); + send->rtp_block = NULL; } - if (ctx->nice_block) { - _free_pad_block (ctx->nice_block); - ctx->nice_block = NULL; + if (send->rtcp_block) { + _free_pad_block (send->rtcp_block); + send->rtcp_block = NULL; } } static void -cleanup_blocks (TransportSendBin * send) -{ - cleanup_ctx_blocks (&send->rtp_ctx); - cleanup_ctx_blocks (&send->rtcp_ctx); -} - -static void transport_send_bin_dispose (GObject * object) { TransportSendBin *send = TRANSPORT_SEND_BIN (object); TSB_LOCK (send); - if (send->rtp_ctx.nicesink) { - g_signal_handlers_disconnect_by_data (send->rtp_ctx.nicesink, send); - send->rtp_ctx.nicesink = NULL; - } - if (send->rtcp_ctx.nicesink) { - g_signal_handlers_disconnect_by_data (send->rtcp_ctx.nicesink, send); - send->rtcp_ctx.nicesink = NULL; + if (send->nicesink) { + g_signal_handlers_disconnect_by_data (send->nicesink, send); + send->nicesink = NULL; } + cleanup_blocks (send); TSB_UNLOCK (send); @@ -623,12 +528,6 @@ "The TransportStream for this sending bin", transport_stream_get_type (), G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, - PROP_RTCP_MUX, - g_param_spec_boolean ("rtcp-mux", "RTCP Mux", - "Whether RTCP packets are muxed with RTP packets", - FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/transportsendbin.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/transportsendbin.h
Changed
@@ -34,18 +34,6 @@
 
 typedef struct _TransportSendBinDTLSContext TransportSendBinDTLSContext;
 
-struct _TransportSendBinDTLSContext {
-  GstElement *dtlssrtpenc;
-  GstElement *nicesink;
-
-  /* Block on the dtlssrtpenc RTP sink pad, if any */
-  struct pad_block *rtp_block;
-  /* Block on the dtlssrtpenc RTCP sink pad, if any */
-  struct pad_block *rtcp_block;
-  /* Block on the nicesink sink pad, if any */
-  struct pad_block *nice_block;
-};
-
 struct _TransportSendBin
 {
   GstBin parent;
@@ -54,21 +42,16 @@
   gboolean active;              /* Flag that's cleared on shutdown */
   TransportStream *stream;      /* parent transport stream */
 
-  gboolean rtcp_mux;
-  GstElement *outputselector;
+  GstElement *dtlssrtpenc;
+  GstElement *nicesink;
 
-  TransportSendBinDTLSContext rtp_ctx;
-  TransportSendBinDTLSContext rtcp_ctx;
+  gboolean has_clientness;
 
-  /*
+  /* Block on the dtlssrtpenc RTP sink pad, if any */
   struct pad_block *rtp_block;
-  struct pad_block *rtcp_mux_block;
-  struct pad_block *rtp_nice_block;
-
+  /* Block on the dtlssrtpenc RTCP sink pad, if any */
   struct pad_block *rtcp_block;
-  struct pad_block *rtcp_nice_block;
-  */
 };
 
 struct _TransportSendBinClass
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/transportstream.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/transportstream.c
Changed
@@ -27,6 +27,7 @@ #include "gstwebrtcice.h" #include "gstwebrtcbin.h" #include "utils.h" +#include "gst/webrtc/webrtc-priv.h" #define transport_stream_parent_class parent_class G_DEFINE_TYPE (TransportStream, transport_stream, GST_TYPE_OBJECT); @@ -36,7 +37,6 @@ PROP_0, PROP_WEBRTC, PROP_SESSION_ID, - PROP_RTCP_MUX, PROP_DTLS_CLIENT, }; @@ -55,13 +55,18 @@ } int -transport_stream_get_pt (TransportStream * stream, const gchar * encoding_name) +transport_stream_get_pt (TransportStream * stream, const gchar * encoding_name, + guint media_idx) { guint i; gint ret = 0; for (i = 0; i < stream->ptmap->len; i++) { PtMapItem *item = &g_array_index (stream->ptmap, PtMapItem, i); + + if (media_idx != -1 && media_idx != item->media_idx) + continue; + if (!gst_caps_is_empty (item->caps)) { GstStructure *s = gst_caps_get_structure (item->caps, 0); if (!g_strcmp0 (gst_structure_get_string (s, "encoding-name"), @@ -124,9 +129,6 @@ case PROP_SESSION_ID: stream->session_id = g_value_get_uint (value); break; - case PROP_RTCP_MUX: - stream->rtcp_mux = g_value_get_boolean (value); - break; case PROP_DTLS_CLIENT: stream->dtls_client = g_value_get_boolean (value); break; @@ -148,9 +150,6 @@ case PROP_SESSION_ID: g_value_set_uint (value, stream->session_id); break; - case PROP_RTCP_MUX: - g_value_set_boolean (value, stream->rtcp_mux); - break; case PROP_DTLS_CLIENT: g_value_set_boolean (value, stream->dtls_client); break; @@ -178,10 +177,6 @@ gst_object_unref (stream->transport); stream->transport = NULL; - if (stream->rtcp_transport) - gst_object_unref (stream->rtcp_transport); - stream->rtcp_transport = NULL; - if (stream->rtxsend) gst_object_unref (stream->rtxsend); stream->rtxsend = NULL; @@ -201,7 +196,7 @@ TransportStream *stream = TRANSPORT_STREAM (object); g_array_free (stream->ptmap, TRUE); - g_array_free (stream->remote_ssrcmap, TRUE); + g_ptr_array_free (stream->remote_ssrcmap, TRUE); G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -213,19 +208,12 @@ GstWebRTCBin 
*webrtc; GstWebRTCICETransport *ice_trans; - stream->transport = gst_webrtc_dtls_transport_new (stream->session_id, FALSE); - stream->rtcp_transport = - gst_webrtc_dtls_transport_new (stream->session_id, TRUE); + stream->transport = gst_webrtc_dtls_transport_new (stream->session_id); webrtc = GST_WEBRTC_BIN (gst_object_get_parent (GST_OBJECT (object))); g_object_bind_property (stream->transport, "client", stream, "dtls-client", G_BINDING_BIDIRECTIONAL); - g_object_bind_property (stream->rtcp_transport, "client", stream, - "dtls-client", G_BINDING_BIDIRECTIONAL); - - g_object_bind_property (stream->transport, "certificate", - stream->rtcp_transport, "certificate", G_BINDING_BIDIRECTIONAL); /* Need to go full Java and have a transport manager? * Or make the caller set the ICE transport up? */ @@ -242,12 +230,6 @@ gst_webrtc_dtls_transport_set_transport (stream->transport, ice_trans); gst_object_unref (ice_trans); - ice_trans = - gst_webrtc_ice_find_transport (webrtc->priv->ice, stream->stream, - GST_WEBRTC_ICE_COMPONENT_RTCP); - gst_webrtc_dtls_transport_set_transport (stream->rtcp_transport, ice_trans); - gst_object_unref (ice_trans); - stream->send_bin = g_object_new (transport_send_bin_get_type (), "stream", stream, NULL); gst_object_ref_sink (stream->send_bin); @@ -288,12 +270,6 @@ G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, - PROP_RTCP_MUX, - g_param_spec_boolean ("rtcp-mux", "RTCP Mux", - "Whether RTCP packets are muxed with RTP packets", - FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, PROP_DTLS_CLIENT, g_param_spec_boolean ("dtls-client", "DTLS client", "Whether we take the client role in DTLS negotiation", @@ -307,12 +283,32 @@ gst_caps_unref (item->caps); } +SsrcMapItem * +ssrcmap_item_new (guint32 ssrc, guint media_idx) +{ + SsrcMapItem *ssrc_item = g_slice_new (SsrcMapItem); + + ssrc_item->media_idx = media_idx; + 
ssrc_item->ssrc = ssrc; + g_weak_ref_init (&ssrc_item->rtpjitterbuffer, NULL); + + return ssrc_item; +} + +static void +ssrcmap_item_free (SsrcMapItem * item) +{ + g_weak_ref_clear (&item->rtpjitterbuffer); + g_slice_free (SsrcMapItem, item); +} + static void transport_stream_init (TransportStream * stream) { stream->ptmap = g_array_new (FALSE, TRUE, sizeof (PtMapItem)); g_array_set_clear_func (stream->ptmap, (GDestroyNotify) clear_ptmap_item); - stream->remote_ssrcmap = g_array_new (FALSE, TRUE, sizeof (SsrcMapItem)); + stream->remote_ssrcmap = g_ptr_array_new_with_free_func ( + (GDestroyNotify) ssrcmap_item_free); } TransportStream *
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/transportstream.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/transportstream.h
Changed
@@ -34,6 +34,7 @@
 typedef struct
 {
   guint8 pt;
+  guint media_idx;
   GstCaps *caps;
 } PtMapItem;
 
@@ -41,16 +42,17 @@
 {
   guint32 ssrc;
   guint media_idx;
+  GWeakRef rtpjitterbuffer;     /* for stats */
 } SsrcMapItem;
 
+SsrcMapItem *           ssrcmap_item_new (guint32 ssrc,
+                                          guint media_idx);
+
 struct _TransportStream
 {
   GstObject parent;
 
   guint session_id;             /* session_id */
-  gboolean rtcp;
-  gboolean rtcp_mux;
-  gboolean rtcp_rsize;
   gboolean dtls_client;
   gboolean active;              /* TRUE if any mline in the bundle/transport is active */
   TransportSendBin *send_bin;   /* bin containing all the sending transport elements */
@@ -58,10 +60,9 @@
 
   GstWebRTCICEStream *stream;
   GstWebRTCDTLSTransport *transport;
-  GstWebRTCDTLSTransport *rtcp_transport;
 
   GArray *ptmap;                /* array of PtMapItem's */
-  GArray *remote_ssrcmap;       /* array of SsrcMapItem's */
+  GPtrArray *remote_ssrcmap;    /* array of SsrcMapItem's */
   gboolean output_connected;    /* whether receive bin is connected to rtpbin */
 
   GstElement *rtxsend;
@@ -76,7 +77,8 @@
 TransportStream *       transport_stream_new (GstWebRTCBin * webrtc,
                                               guint session_id);
 int                     transport_stream_get_pt (TransportStream * stream,
-                                                 const gchar * encoding_name);
+                                                 const gchar * encoding_name,
+                                                 guint media_idx);
 int *                   transport_stream_get_all_pt (TransportStream * stream,
                                                      const gchar * encoding_name,
                                                      gsize * pt_len);
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/utils.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/utils.c
Changed
@@ -26,12 +26,6 @@
 #include "utils.h"
 #include "gstwebrtcbin.h"
 
-GQuark
-gst_webrtc_bin_error_quark (void)
-{
-  return g_quark_from_static_string ("gst-webrtc-bin-error-quark");
-}
-
 GstPadTemplate *
 _find_pad_template (GstElement * element, GstPadDirection direction,
     GstPadPresence presence, const gchar * name)
@@ -205,3 +199,27 @@
 
   return ret;
 }
+
+GstWebRTCKind
+webrtc_kind_from_caps (const GstCaps * caps)
+{
+  GstStructure *s;
+  const gchar *media;
+
+  if (!caps || gst_caps_get_size (caps) == 0)
+    return GST_WEBRTC_KIND_UNKNOWN;
+
+  s = gst_caps_get_structure (caps, 0);
+
+  media = gst_structure_get_string (s, "media");
+  if (media == NULL)
+    return GST_WEBRTC_KIND_UNKNOWN;
+
+  if (!g_strcmp0 (media, "audio"))
+    return GST_WEBRTC_KIND_AUDIO;
+
+  if (!g_strcmp0 (media, "video"))
+    return GST_WEBRTC_KIND_VIDEO;
+
+  return GST_WEBRTC_KIND_UNKNOWN;
+}
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/utils.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/utils.h
Changed
@@ -26,22 +26,6 @@
 
 G_BEGIN_DECLS
 
-#define GST_WEBRTC_BIN_ERROR gst_webrtc_bin_error_quark ()
-GQuark gst_webrtc_bin_error_quark (void);
-
-typedef enum
-{
-  GST_WEBRTC_BIN_ERROR_FAILED,
-  GST_WEBRTC_BIN_ERROR_INVALID_SYNTAX,
-  GST_WEBRTC_BIN_ERROR_INVALID_MODIFICATION,
-  GST_WEBRTC_BIN_ERROR_INVALID_STATE,
-  GST_WEBRTC_BIN_ERROR_BAD_SDP,
-  GST_WEBRTC_BIN_ERROR_FINGERPRINT,
-  GST_WEBRTC_BIN_ERROR_SCTP_FAILURE,
-  GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE,
-  GST_WEBRTC_BIN_ERROR_CLOSED,
-} GstWebRTCError;
-
 GstPadTemplate *        _find_pad_template (GstElement * element,
                                             GstPadDirection direction,
                                             GstPadPresence presence,
@@ -80,6 +64,8 @@
 const gchar *           _g_checksum_to_webrtc_string (GChecksumType type);
 G_GNUC_INTERNAL
 GstCaps *               _rtp_caps_from_media (const GstSDPMedia * media);
+G_GNUC_INTERNAL
+GstWebRTCKind           webrtc_kind_from_caps (const GstCaps * caps);
 
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/webrtcdatachannel.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtcdatachannel.c
Changed
@@ -221,11 +221,13 @@ GDestroyNotify notify; }; -static void +static GstStructure * _execute_task (GstWebRTCBin * webrtc, struct task *task) { if (task->func) task->func (task->channel, task->user_data); + + return NULL; } static void @@ -279,17 +281,26 @@ _transport_closed (WebRTCDataChannel * channel) { GError *error; + gboolean both_sides_closed; GST_WEBRTC_DATA_CHANNEL_LOCK (channel); error = channel->stored_error; channel->stored_error = NULL; + + both_sides_closed = + channel->peer_closed && channel->parent.buffered_amount <= 0; + if (both_sides_closed || error) { + channel->peer_closed = FALSE; + } GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); if (error) { gst_webrtc_data_channel_on_error (GST_WEBRTC_DATA_CHANNEL (channel), error); g_clear_error (&error); } - gst_webrtc_data_channel_on_close (GST_WEBRTC_DATA_CHANNEL (channel)); + if (both_sides_closed || error) { + gst_webrtc_data_channel_on_close (GST_WEBRTC_DATA_CHANNEL (channel)); + } } static void @@ -297,6 +308,9 @@ { GstPad *pad, *peer; + GST_INFO_OBJECT (channel, "Closing outgoing SCTP stream %i label \"%s\"", + channel->parent.id, channel->parent.label); + pad = gst_element_get_static_pad (channel->appsrc, "src"); peer = gst_pad_get_peer (pad); gst_object_unref (pad); @@ -319,31 +333,44 @@ { /* https://www.w3.org/TR/webrtc/#data-transport-closing-procedure */ GST_WEBRTC_DATA_CHANNEL_LOCK (channel); - if (channel->parent.ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED - || channel->parent.ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING) { + if (channel->parent.ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED) { GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); return; - } - channel->parent.ready_state = GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING; - GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); - g_object_notify (G_OBJECT (channel), "ready-state"); + } else if (channel->parent.ready_state == + GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING) { + _channel_enqueue_task (channel, (ChannelTask) _transport_closed, NULL, + 
NULL); + } else if (channel->parent.ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_OPEN) { + channel->parent.ready_state = GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING; + GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); + g_object_notify (G_OBJECT (channel), "ready-state"); - GST_WEBRTC_DATA_CHANNEL_LOCK (channel); - if (channel->parent.buffered_amount <= 0) { - _channel_enqueue_task (channel, (ChannelTask) _close_sctp_stream, - NULL, NULL); + GST_WEBRTC_DATA_CHANNEL_LOCK (channel); + if (channel->parent.buffered_amount <= 0) { + _channel_enqueue_task (channel, (ChannelTask) _close_sctp_stream, + NULL, NULL); + } } GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); } static void -_on_sctp_reset_stream (GstWebRTCSCTPTransport * sctp, guint stream_id, +_on_sctp_stream_reset (WebRTCSCTPTransport * sctp, guint stream_id, WebRTCDataChannel * channel) { - if (channel->parent.id == stream_id) - _channel_enqueue_task (channel, (ChannelTask) _transport_closed, + if (channel->parent.id == stream_id) { + GST_INFO_OBJECT (channel, + "Received channel close for SCTP stream %i label \"%s\"", + channel->parent.id, channel->parent.label); + + GST_WEBRTC_DATA_CHANNEL_LOCK (channel); + channel->peer_closed = TRUE; + GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); + + _channel_enqueue_task (channel, (ChannelTask) _close_procedure, GUINT_TO_POINTER (stream_id), NULL); + } } static void @@ -386,8 +413,8 @@ GST_INFO_OBJECT (channel, "Received channel open"); if (channel->parent.negotiated) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Data channel was signalled as negotiated already"); g_return_val_if_reached (GST_FLOW_ERROR); } @@ -437,7 +464,7 @@ channel->opened = TRUE; GST_INFO_OBJECT (channel, "Received channel open for SCTP stream %i " - "label %s protocol %s ordered %s", channel->parent.id, + "label \"%s\" protocol %s ordered %s", channel->parent.id, channel->parent.label, 
channel->parent.protocol, channel->parent.ordered ? "true" : "false"); @@ -452,16 +479,15 @@ ret = gst_app_src_push_buffer (GST_APP_SRC (channel->appsrc), buffer); if (ret != GST_FLOW_OK) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, - "Could not send ack packet"); + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Could not send ack packet"); return ret; } return ret; } else { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Unknown message type in control protocol"); return GST_FLOW_ERROR; } @@ -470,8 +496,8 @@ { g_free (label); g_free (proto); - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, "Failed to parse packet"); + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to parse packet"); g_return_val_if_reached (GST_FLOW_ERROR); } } @@ -523,14 +549,14 @@ buffer = gst_sample_get_buffer (sample); if (!buffer) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, "No buffer to handle"); + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "No buffer to handle"); return GST_FLOW_ERROR; } receive = gst_sctp_buffer_get_receive_meta (buffer); if (!receive) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "No SCTP Receive meta on the buffer"); return GST_FLOW_ERROR; } @@ -539,8 +565,8 @@ case DATA_CHANNEL_PPID_WEBRTC_CONTROL:{ GstMapInfo info = GST_MAP_INFO_INIT; if (!gst_buffer_map (buffer, &info, GST_MAP_READ)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to map received buffer"); 
ret = GST_FLOW_ERROR; } else { @@ -553,8 +579,8 @@ case DATA_CHANNEL_PPID_WEBRTC_STRING_PARTIAL:{ GstMapInfo info = GST_MAP_INFO_INIT; if (!gst_buffer_map (buffer, &info, GST_MAP_READ)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to map received buffer"); ret = GST_FLOW_ERROR; } else { @@ -569,8 +595,8 @@ case DATA_CHANNEL_PPID_WEBRTC_BINARY_PARTIAL:{ struct map_info *info = g_new0 (struct map_info, 1); if (!gst_buffer_map (buffer, &info->map_info, GST_MAP_READ)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to map received buffer"); ret = GST_FLOW_ERROR; } else { @@ -591,8 +617,8 @@ NULL); break; default: - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Unknown SCTP PPID %u received", receive->ppid); ret = GST_FLOW_ERROR; break; @@ -671,13 +697,14 @@ buffer = construct_open_packet (channel); GST_INFO_OBJECT (channel, "Sending channel open for SCTP stream %i " - "label %s protocol %s ordered %s", channel->parent.id, + "label \"%s\" protocol %s ordered %s", channel->parent.id, channel->parent.label, channel->parent.protocol, channel->parent.ordered ? 
"true" : "false"); GST_WEBRTC_DATA_CHANNEL_LOCK (channel); channel->parent.buffered_amount += gst_buffer_get_size (buffer); GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); + g_object_notify (G_OBJECT (&channel->parent), "buffered-amount"); if (gst_app_src_push_buffer (GST_APP_SRC (channel->appsrc), buffer) == GST_FLOW_OK) { @@ -685,8 +712,8 @@ _channel_enqueue_task (channel, (ChannelTask) _emit_on_open, NULL, NULL); } else { GError *error = NULL; - g_set_error (&error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (&error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to send DCEP open packet"); _channel_store_error (channel, error); _channel_enqueue_task (channel, (ChannelTask) _close_procedure, NULL, NULL); @@ -737,8 +764,8 @@ g_return_if_fail (data != NULL); if (!_is_within_max_message_size (channel, size)) { GError *error = NULL; - g_set_error (&error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (&error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Requested to send data that is too large"); _channel_store_error (channel, error); _channel_enqueue_task (channel, (ChannelTask) _close_procedure, NULL, @@ -761,13 +788,14 @@ GST_WEBRTC_DATA_CHANNEL_LOCK (channel); channel->parent.buffered_amount += gst_buffer_get_size (buffer); GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); + g_object_notify (G_OBJECT (&channel->parent), "buffered-amount"); ret = gst_app_src_push_buffer (GST_APP_SRC (channel->appsrc), buffer); if (ret != GST_FLOW_OK) { GError *error = NULL; - g_set_error (&error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, "Failed to send data"); + g_set_error (&error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to send data"); _channel_store_error (channel, error); _channel_enqueue_task (channel, (ChannelTask) _close_procedure, NULL, NULL); } @@ -793,12 +821,12 @@ ppid = DATA_CHANNEL_PPID_WEBRTC_STRING_EMPTY; } else { gsize 
size = strlen (str); - gchar *str_copy = g_strdup (str); + gchar *str_copy; if (!_is_within_max_message_size (channel, size)) { GError *error = NULL; - g_set_error (&error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, + g_set_error (&error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Requested to send a string that is too large"); _channel_store_error (channel, error); _channel_enqueue_task (channel, (ChannelTask) _close_procedure, NULL, @@ -806,6 +834,7 @@ return; } + str_copy = g_strdup (str); buffer = gst_buffer_new_wrapped_full (GST_MEMORY_FLAG_READONLY, str_copy, size, 0, size, str_copy, g_free); @@ -822,13 +851,14 @@ GST_WEBRTC_DATA_CHANNEL_LOCK (channel); channel->parent.buffered_amount += gst_buffer_get_size (buffer); GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); + g_object_notify (G_OBJECT (&channel->parent), "buffered-amount"); ret = gst_app_src_push_buffer (GST_APP_SRC (channel->appsrc), buffer); if (ret != GST_FLOW_OK) { GError *error = NULL; - g_set_error (&error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_DATA_CHANNEL_FAILURE, "Failed to send string"); + g_set_error (&error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE, "Failed to send string"); _channel_store_error (channel, error); _channel_enqueue_task (channel, (ChannelTask) _close_procedure, NULL, NULL); } @@ -900,6 +930,7 @@ NULL); } GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); + g_object_notify (G_OBJECT (&channel->parent), "buffered-amount"); } return GST_PAD_PROBE_OK; @@ -976,7 +1007,7 @@ static void _data_channel_set_sctp_transport (WebRTCDataChannel * channel, - GstWebRTCSCTPTransport * sctp) + WebRTCSCTPTransport * sctp) { g_return_if_fail (GST_IS_WEBRTC_DATA_CHANNEL (channel)); g_return_if_fail (GST_IS_WEBRTC_SCTP_TRANSPORT (sctp)); @@ -989,18 +1020,17 @@ GST_OBJECT (sctp)); if (sctp) { - g_signal_connect (sctp, "stream-reset", G_CALLBACK (_on_sctp_reset_stream), + g_signal_connect (sctp, "stream-reset", G_CALLBACK (_on_sctp_stream_reset), 
channel); g_signal_connect (sctp, "notify::state", G_CALLBACK (_on_sctp_notify_state), channel); - _on_sctp_notify_state_unlocked (G_OBJECT (sctp), channel); } GST_WEBRTC_DATA_CHANNEL_UNLOCK (channel); } void webrtc_data_channel_link_to_sctp (WebRTCDataChannel * channel, - GstWebRTCSCTPTransport * sctp_transport) + WebRTCSCTPTransport * sctp_transport) { if (sctp_transport && !channel->sctp_transport) { gint id; @@ -1016,6 +1046,8 @@ channel->sctp_transport->sctpenc, pad_name)) g_warn_if_reached (); g_free (pad_name); + + _on_sctp_notify_state_unlocked (G_OBJECT (sctp_transport), channel); } } }
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/webrtcdatachannel.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtcdatachannel.h
Changed
@@ -24,7 +24,9 @@
 #include <gst/webrtc/webrtc_fwd.h>
 #include <gst/webrtc/dtlstransport.h>
 #include <gst/webrtc/datachannel.h>
-#include "sctptransport.h"
+#include "webrtcsctptransport.h"
+
+#include "gst/webrtc/webrtc-priv.h"
 
 G_BEGIN_DECLS
 
@@ -43,7 +45,7 @@
 {
   GstWebRTCDataChannel              parent;
 
-  GstWebRTCSCTPTransport           *sctp_transport;
+  WebRTCSCTPTransport              *sctp_transport;
 
   GstElement                       *appsrc;
   GstElement                       *appsink;
@@ -51,6 +53,7 @@
   gboolean                          opened;
   gulong                            src_probe;
   GError                           *stored_error;
+  gboolean                          peer_closed;
 
   gpointer                          _padding[GST_PADDING];
 };
@@ -65,7 +68,7 @@
 void webrtc_data_channel_start_negotiation (WebRTCDataChannel *channel);
 G_GNUC_INTERNAL
 void webrtc_data_channel_link_to_sctp (WebRTCDataChannel *channel,
-    GstWebRTCSCTPTransport *sctp_transport);
+    WebRTCSCTPTransport *sctp_transport);
 
 G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtcsctptransport.c
Added
@@ -0,0 +1,251 @@ +/* GStreamer + * Copyright (C) 2018 Matthew Waters <matthew@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +# include "config.h" +#endif + +#include <stdio.h> + +#include "webrtcsctptransport.h" +#include "gstwebrtcbin.h" + +#define GST_CAT_DEFAULT webrtc_sctp_transport_debug +GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); + +enum +{ + SIGNAL_0, + ON_STREAM_RESET_SIGNAL, + LAST_SIGNAL, +}; + +enum +{ + PROP_0, + PROP_TRANSPORT, + PROP_STATE, + PROP_MAX_MESSAGE_SIZE, + PROP_MAX_CHANNELS, +}; + +static guint webrtc_sctp_transport_signals[LAST_SIGNAL] = { 0 }; + +#define webrtc_sctp_transport_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (WebRTCSCTPTransport, webrtc_sctp_transport, + GST_TYPE_WEBRTC_SCTP_TRANSPORT, + GST_DEBUG_CATEGORY_INIT (webrtc_sctp_transport_debug, + "webrtcsctptransport", 0, "webrtcsctptransport");); + +typedef void (*SCTPTask) (WebRTCSCTPTransport * sctp, gpointer user_data); + +struct task +{ + WebRTCSCTPTransport *sctp; + SCTPTask func; + gpointer user_data; + GDestroyNotify notify; +}; + +static GstStructure * +_execute_task (GstWebRTCBin * webrtc, struct task *task) +{ + if (task->func) + task->func (task->sctp, task->user_data); + return NULL; +} + +static void +_free_task 
(struct task *task) +{ + gst_object_unref (task->sctp); + + if (task->notify) + task->notify (task->user_data); + g_free (task); +} + +static void +_sctp_enqueue_task (WebRTCSCTPTransport * sctp, SCTPTask func, + gpointer user_data, GDestroyNotify notify) +{ + struct task *task = g_new0 (struct task, 1); + + task->sctp = gst_object_ref (sctp); + task->func = func; + task->user_data = user_data; + task->notify = notify; + + gst_webrtc_bin_enqueue_task (sctp->webrtcbin, + (GstWebRTCBinFunc) _execute_task, task, (GDestroyNotify) _free_task, + NULL); +} + +static void +_emit_stream_reset (WebRTCSCTPTransport * sctp, gpointer user_data) +{ + guint stream_id = GPOINTER_TO_UINT (user_data); + + g_signal_emit (sctp, + webrtc_sctp_transport_signals[ON_STREAM_RESET_SIGNAL], 0, stream_id); +} + +static void +_on_sctp_dec_pad_removed (GstElement * sctpdec, GstPad * pad, + WebRTCSCTPTransport * sctp) +{ + guint stream_id; + + if (sscanf (GST_PAD_NAME (pad), "src_%u", &stream_id) != 1) + return; + + _sctp_enqueue_task (sctp, (SCTPTask) _emit_stream_reset, + GUINT_TO_POINTER (stream_id), NULL); +} + +static void +_on_sctp_association_established (GstElement * sctpenc, gboolean established, + WebRTCSCTPTransport * sctp) +{ + GST_OBJECT_LOCK (sctp); + if (established) + sctp->state = GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED; + else + sctp->state = GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED; + sctp->association_established = established; + GST_OBJECT_UNLOCK (sctp); + + g_object_notify (G_OBJECT (sctp), "state"); +} + +void +webrtc_sctp_transport_set_priority (WebRTCSCTPTransport * sctp, + GstWebRTCPriorityType priority) +{ + GstPad *pad; + + pad = gst_element_get_static_pad (sctp->sctpenc, "src"); + gst_pad_push_event (pad, + gst_event_new_custom (GST_EVENT_CUSTOM_DOWNSTREAM_STICKY, + gst_structure_new ("GstWebRtcBinUpdateTos", "sctp-priority", + GST_TYPE_WEBRTC_PRIORITY_TYPE, priority, NULL))); + gst_object_unref (pad); +} + +static void +webrtc_sctp_transport_get_property (GObject * 
object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + WebRTCSCTPTransport *sctp = WEBRTC_SCTP_TRANSPORT (object); + + switch (prop_id) { + case PROP_TRANSPORT: + g_value_set_object (value, sctp->transport); + break; + case PROP_STATE: + g_value_set_enum (value, sctp->state); + break; + case PROP_MAX_MESSAGE_SIZE: + g_value_set_uint64 (value, sctp->max_message_size); + break; + case PROP_MAX_CHANNELS: + g_value_set_uint (value, sctp->max_channels); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +webrtc_sctp_transport_finalize (GObject * object) +{ + WebRTCSCTPTransport *sctp = WEBRTC_SCTP_TRANSPORT (object); + + g_signal_handlers_disconnect_by_data (sctp->sctpdec, sctp); + g_signal_handlers_disconnect_by_data (sctp->sctpenc, sctp); + + gst_object_unref (sctp->sctpdec); + gst_object_unref (sctp->sctpenc); + + g_clear_object (&sctp->transport); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +webrtc_sctp_transport_constructed (GObject * object) +{ + WebRTCSCTPTransport *sctp = WEBRTC_SCTP_TRANSPORT (object); + guint association_id; + + association_id = g_random_int_range (0, G_MAXUINT16); + + sctp->sctpdec = + g_object_ref_sink (gst_element_factory_make ("sctpdec", NULL)); + g_object_set (sctp->sctpdec, "sctp-association-id", association_id, NULL); + sctp->sctpenc = + g_object_ref_sink (gst_element_factory_make ("sctpenc", NULL)); + g_object_set (sctp->sctpenc, "sctp-association-id", association_id, NULL); + g_object_set (sctp->sctpenc, "use-sock-stream", TRUE, NULL); + + g_signal_connect (sctp->sctpdec, "pad-removed", + G_CALLBACK (_on_sctp_dec_pad_removed), sctp); + g_signal_connect (sctp->sctpenc, "sctp-association-established", + G_CALLBACK (_on_sctp_association_established), sctp); + + G_OBJECT_CLASS (parent_class)->constructed (object); +} + +static void +webrtc_sctp_transport_class_init (WebRTCSCTPTransportClass * klass) +{ + GObjectClass *gobject_class = 
(GObjectClass *) klass; + + gobject_class->constructed = webrtc_sctp_transport_constructed; + gobject_class->get_property = webrtc_sctp_transport_get_property; + gobject_class->finalize = webrtc_sctp_transport_finalize; + + g_object_class_override_property (gobject_class, PROP_TRANSPORT, "transport"); + g_object_class_override_property (gobject_class, PROP_STATE, "state"); + g_object_class_override_property (gobject_class, + PROP_MAX_MESSAGE_SIZE, "max-message-size"); + g_object_class_override_property (gobject_class, + PROP_MAX_CHANNELS, "max-channels"); + + /** + * WebRTCSCTPTransport::stream-reset: + * @object: the #WebRTCSCTPTransport + * @stream_id: the SCTP stream that was reset + */ + webrtc_sctp_transport_signals[ON_STREAM_RESET_SIGNAL] = + g_signal_new ("stream-reset", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, G_TYPE_UINT); +} + +static void +webrtc_sctp_transport_init (WebRTCSCTPTransport * nice) +{ +} + +WebRTCSCTPTransport * +webrtc_sctp_transport_new (void) +{ + return g_object_new (TYPE_WEBRTC_SCTP_TRANSPORT, NULL); +}
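The webrtcsctptransport.c hunks above queue work onto webrtcbin's thread as a `struct task` carrying a callback, an opaque `user_data` pointer, and a `GDestroyNotify` that releases the payload once the task has run. A minimal sketch of that func/user_data/notify ownership pattern in plain C, without GLib (all names below are hypothetical stand-ins, not the real API):

```c
#include <stdlib.h>

/* A task owns a callback, an opaque payload, and an optional destroy
 * notify that releases the payload after the task has executed. */
typedef void (*TaskFunc) (void *user_data);
typedef void (*TaskDestroyNotify) (void *user_data);

struct task {
  TaskFunc func;
  void *user_data;
  TaskDestroyNotify notify;
};

static struct task *
task_new (TaskFunc func, void *user_data, TaskDestroyNotify notify)
{
  struct task *task = calloc (1, sizeof (*task));

  task->func = func;
  task->user_data = user_data;
  task->notify = notify;
  return task;
}

/* Run the callback, then let the notify clean up the payload and free
 * the task itself (the roles of _execute_task and _free_task above). */
static void
task_execute_and_free (struct task *task)
{
  task->func (task->user_data);
  if (task->notify)
    task->notify (task->user_data);
  free (task);
}

static void
demo_func (void *user_data)
{
  *(int *) user_data += 1;      /* the "work" */
}

static void
demo_notify (void *user_data)
{
  *(int *) user_data += 10;     /* the "cleanup", runs exactly once */
}

static int
run_demo (void)
{
  int value = 0;
  struct task *task = task_new (demo_func, &value, demo_notify);

  task_execute_and_free (task);
  return value;                 /* 1 from func + 10 from notify */
}
```

As in `_execute_task`/`_free_task`, the destroy notify fires exactly once, after the callback, so the caller can hand over ownership of the payload at enqueue time.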
gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtcsctptransport.h
Added
@@ -0,0 +1,74 @@ +/* GStreamer + * Copyright (C) 2018 Matthew Waters <matthew@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __WEBRTC_SCTP_TRANSPORT_H__ +#define __WEBRTC_SCTP_TRANSPORT_H__ + +#include <gst/gst.h> +#include <gst/webrtc/webrtc.h> +#include <gst/webrtc/sctptransport.h> +#include "gstwebrtcice.h" + +#include "gst/webrtc/webrtc-priv.h" + +G_BEGIN_DECLS + +GType webrtc_sctp_transport_get_type(void); +#define TYPE_WEBRTC_SCTP_TRANSPORT (webrtc_sctp_transport_get_type()) +#define WEBRTC_SCTP_TRANSPORT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),TYPE_WEBRTC_SCTP_TRANSPORT,WebRTCSCTPTransport)) +#define WEBRTC_IS_SCTP_TRANSPORT(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),TYPE_WEBRTC_SCTP_TRANSPORT)) +#define WEBRTC_SCTP_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass) ,TYPE_WEBRTC_SCTP_TRANSPORT,WebRTCSCTPTransportClass)) +#define WEBRTC_SCTP_IS_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,TYPE_WEBRTC_SCTP_TRANSPORT)) +#define WEBRTC_SCTP_TRANSPORT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,TYPE_WEBRTC_SCTP_TRANSPORT,WebRTCSCTPTransportClass)) + +typedef struct _WebRTCSCTPTransport WebRTCSCTPTransport; +typedef struct _WebRTCSCTPTransportClass WebRTCSCTPTransportClass; + +struct _WebRTCSCTPTransport +{ + 
GstWebRTCSCTPTransport parent; + + GstWebRTCDTLSTransport *transport; + GstWebRTCSCTPTransportState state; + guint64 max_message_size; + guint max_channels; + + gboolean association_established; + + gulong sctpdec_block_id; + GstElement *sctpdec; + GstElement *sctpenc; + + GstWebRTCBin *webrtcbin; +}; + +struct _WebRTCSCTPTransportClass +{ + GstWebRTCSCTPTransportClass parent_class; +}; + +WebRTCSCTPTransport * webrtc_sctp_transport_new (void); + +void +webrtc_sctp_transport_set_priority (WebRTCSCTPTransport *sctp, + GstWebRTCPriorityType priority); + +G_END_DECLS + +#endif /* __WEBRTC_SCTP_TRANSPORT_H__ */
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/webrtcsdp.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtcsdp.c
Changed
@@ -84,8 +84,8 @@ gchar *state_str = _enum_value_to_string (GST_TYPE_WEBRTC_SIGNALING_STATE, state); gchar *type_str = _enum_value_to_string (GST_TYPE_WEBRTC_SDP_TYPE, type); - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_INVALID_STATE, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INVALID_STATE, "Not in the correct state (%s) for setting %s %s description", state_str, _sdp_source_to_string (source), type_str); g_free (state_str); @@ -108,8 +108,8 @@ key = gst_sdp_message_get_key (sdp->sdp); if (!IS_EMPTY_SDP_ATTRIBUTE (key->data)) { - g_set_error_literal (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_BAD_SDP, "sdp contains a k line"); + g_set_error_literal (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "sdp contains a k line"); return FALSE; } @@ -122,8 +122,8 @@ if (!IS_EMPTY_SDP_ATTRIBUTE (message_fingerprint) && !IS_EMPTY_SDP_ATTRIBUTE (media_fingerprint)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_FINGERPRINT, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_FINGERPRINT_FAILURE, "No fingerprint lines in sdp for media %u", i); return FALSE; } @@ -132,8 +132,8 @@ } if (!IS_EMPTY_SDP_ATTRIBUTE (media_fingerprint) && g_strcmp0 (fingerprint, media_fingerprint) != 0) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_FINGERPRINT, + g_set_error (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_FINGERPRINT_FAILURE, "Fingerprint in media %u differs from %s fingerprint. " "\'%s\' != \'%s\'", i, message_fingerprint ? 
"global" : "previous", fingerprint, media_fingerprint); @@ -178,8 +178,8 @@ _check_trickle_ice (GstSDPMessage * msg, GError ** error) { if (!_session_has_attribute_key_value (msg, "ice-options", "trickle")) { - g_set_error_literal (error, GST_WEBRTC_BIN_ERROR, - GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error_literal (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "No required \'a=ice-options:trickle\' line in sdp"); } return TRUE; @@ -204,7 +204,7 @@ { const gchar *mid = gst_sdp_media_get_attribute_val (media, "mid"); if (IS_EMPTY_SDP_ATTRIBUTE (mid)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u is missing or contains an empty \'mid\' attribute", media_idx); return FALSE; @@ -248,13 +248,13 @@ static const gchar *valid_setups[] = { "actpass", "active", "passive", NULL }; const gchar *setup = gst_sdp_media_get_attribute_val (media, "setup"); if (IS_EMPTY_SDP_ATTRIBUTE (setup)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u is missing or contains an empty \'setup\' attribute", media_idx); return FALSE; } if (!g_strv_contains (valid_setups, setup)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u contains unknown \'setup\' attribute, \'%s\'", media_idx, setup); return FALSE; @@ -268,7 +268,7 @@ { const gchar *dtls_id = gst_sdp_media_get_attribute_val (media, "ice-pwd"); if (IS_EMPTY_SDP_ATTRIBUTE (dtls_id)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u is missing or contains an empty \'dtls-id\' attribute", media_idx); return FALSE; @@ -307,13 +307,13 @@ media_in_bundle = is_bundle && g_strv_contains ((const 
gchar **) group_members, mid); if (!_media_get_ice_ufrag (sdp->sdp, i)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u is missing or contains an empty \'ice-ufrag\' attribute", i); goto fail; } if (!_media_get_ice_pwd (sdp->sdp, i)) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u is missing or contains an empty \'ice-pwd\' attribute", i); goto fail; } @@ -327,7 +327,7 @@ if (!bundle_ice_ufrag) bundle_ice_ufrag = ice_ufrag; else if (g_strcmp0 (bundle_ice_ufrag, ice_ufrag) != 0) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u has different ice-ufrag values in bundle. " "%s != %s", i, bundle_ice_ufrag, ice_ufrag); goto fail; @@ -335,7 +335,7 @@ if (!bundle_ice_pwd) { bundle_ice_pwd = ice_pwd; } else if (g_strcmp0 (bundle_ice_pwd, ice_pwd) != 0) { - g_set_error (error, GST_WEBRTC_BIN_ERROR, GST_WEBRTC_BIN_ERROR_BAD_SDP, + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, "media %u has different ice-pwd values in bundle. 
" "%s != %s", i, bundle_ice_pwd, ice_pwd); goto fail; @@ -569,6 +569,7 @@ GST_TRACE ("replace setup:%s with setup:%s", attr->value, setup_str); gst_sdp_attribute_set (&new_attr, "setup", setup_str); gst_sdp_media_replace_attribute (media, i, &new_attr); + g_free (setup_str); return; } } @@ -872,7 +873,7 @@ } gboolean -_parse_bundle (GstSDPMessage * sdp, GStrv * bundled) +_parse_bundle (GstSDPMessage * sdp, GStrv * bundled, GError ** error) { const gchar *group; gboolean ret = FALSE; @@ -883,8 +884,9 @@ *bundled = g_strsplit (group + strlen ("BUNDLE "), " ", 0); if (!(*bundled)[0]) { - GST_ERROR ("Invalid format for BUNDLE group, expected at least " - "one mid (%s)", group); + g_set_error (error, GST_WEBRTC_ERROR, GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR, + "Invalid format for BUNDLE group, expected at least one mid (%s)", + group); g_strfreev (*bundled); *bundled = NULL; goto done;
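The webrtcsdp.c changes above move error reporting from the element-private `GST_WEBRTC_BIN_ERROR` codes to the shared `GST_WEBRTC_ERROR` domain, and give `_parse_bundle` a `GError **` out-parameter so a malformed BUNDLE group produces a propagatable error instead of only a log line. A sketch of that out-parameter convention in plain C, using a hypothetical `struct error` in place of `GError` (the names and the simplified `BUNDLE` check are illustrative, not the real parser):

```c
#include <stdio.h>
#include <string.h>

/* Stand-in for GError: a code plus a human-readable message. */
struct error {
  int code;
  char message[128];
};

enum { ERROR_SDP_SYNTAX = 1 };

static void
set_error (struct error *err, int code, const char *message)
{
  if (!err)
    return;                     /* callers may pass NULL, like GError** */
  err->code = code;
  snprintf (err->message, sizeof (err->message), "%s", message);
}

/* Returns 1 on success; on failure returns 0 and fills *err, the same
 * convention _parse_bundle now follows. */
static int
parse_bundle_line (const char *line, struct error *err)
{
  static const char prefix[] = "BUNDLE ";

  /* Require the prefix and at least one mid after it. */
  if (strncmp (line, prefix, sizeof (prefix) - 1) != 0 ||
      line[sizeof (prefix) - 1] == '\0') {
    set_error (err, ERROR_SDP_SYNTAX,
        "Invalid format for BUNDLE group, expected at least one mid");
    return 0;
  }
  return 1;
}
```

The caller decides whether to inspect the error or ignore it by passing NULL, which is why `set_error` must tolerate a NULL destination.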
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/webrtcsdp.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtcsdp.h
Changed
@@ -101,7 +101,8 @@ guint * idx); G_GNUC_INTERNAL gboolean _parse_bundle (GstSDPMessage * sdp, - GStrv * bundled); + GStrv * bundled, + GError ** error); G_GNUC_INTERNAL const gchar * _media_get_ice_pwd (const GstSDPMessage * msg,
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/webrtctransceiver.c -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtctransceiver.c
Changed
@@ -32,7 +32,8 @@ G_DEFINE_TYPE_WITH_CODE (WebRTCTransceiver, webrtc_transceiver, GST_TYPE_WEBRTC_RTP_TRANSCEIVER, GST_DEBUG_CATEGORY_INIT (webrtc_transceiver_debug, - "webrtctransceiver", 0, "webrtctransceiver");); + "webrtctransceiver", 0, "webrtctransceiver"); + ); #define DEFAULT_FEC_TYPE GST_WEBRTC_FEC_TYPE_NONE #define DEFAULT_DO_NACK FALSE @@ -59,19 +60,17 @@ gst_object_replace ((GstObject **) & trans->stream, (GstObject *) stream); - if (rtp_trans->sender) + if (rtp_trans->sender) { gst_object_replace ((GstObject **) & rtp_trans->sender->transport, (GstObject *) stream->transport); - if (rtp_trans->receiver) + g_object_notify (G_OBJECT (rtp_trans->sender), "transport"); + } + + if (rtp_trans->receiver) { gst_object_replace ((GstObject **) & rtp_trans->receiver->transport, (GstObject *) stream->transport); - - if (rtp_trans->sender) - gst_object_replace ((GstObject **) & rtp_trans->sender->rtcp_transport, - (GstObject *) stream->rtcp_transport); - if (rtp_trans->receiver) - gst_object_replace ((GstObject **) & rtp_trans->receiver->rtcp_transport, - (GstObject *) stream->rtcp_transport); + g_object_notify (G_OBJECT (rtp_trans->receiver), "transport"); + } } GstWebRTCDTLSTransport * @@ -88,20 +87,6 @@ return NULL; } -GstWebRTCDTLSTransport * -webrtc_transceiver_get_rtcp_dtls_transport (GstWebRTCRTPTransceiver * trans) -{ - g_return_val_if_fail (WEBRTC_IS_TRANSCEIVER (trans), NULL); - - if (trans->sender) { - return trans->sender->rtcp_transport; - } else if (trans->receiver) { - return trans->receiver->rtcp_transport; - } - - return NULL; -} - static void webrtc_transceiver_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) @@ -173,6 +158,8 @@ gst_caps_replace (&trans->last_configured_caps, NULL); + gst_event_replace (&trans->ssrc_event, NULL); + G_OBJECT_CLASS (parent_class)->finalize (object); }
gst-plugins-bad-1.18.6.tar.xz/ext/webrtc/webrtctransceiver.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtc/webrtctransceiver.h
Changed
@@ -22,6 +22,7 @@ #include "fwd.h" #include <gst/webrtc/rtptransceiver.h> +#include "gst/webrtc/webrtc-priv.h" #include "transportstream.h" G_BEGIN_DECLS @@ -39,6 +40,8 @@ TransportStream *stream; GstStructure *local_rtx_ssrc_map; + guint current_ssrc; + GstEvent *ssrc_event; /* Properties */ GstWebRTCFECType fec_type; @@ -46,6 +49,8 @@ gboolean do_nack; GstCaps *last_configured_caps; + + gboolean mline_locked; }; struct _WebRTCTransceiverClass @@ -61,7 +66,6 @@ TransportStream * stream); GstWebRTCDTLSTransport * webrtc_transceiver_get_dtls_transport (GstWebRTCRTPTransceiver * trans); -GstWebRTCDTLSTransport * webrtc_transceiver_get_rtcp_dtls_transport (GstWebRTCRTPTransceiver * trans); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/webrtcdsp/gstwebrtcdsp.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtcdsp/gstwebrtcdsp.cpp
Changed
@@ -274,7 +274,11 @@ webrtc::VoiceDetection::Likelihood voice_detection_likelihood; }; -G_DEFINE_TYPE (GstWebrtcDsp, gst_webrtc_dsp, GST_TYPE_AUDIO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstWebrtcDsp, gst_webrtc_dsp, GST_TYPE_AUDIO_FILTER, + GST_DEBUG_CATEGORY_INIT (webrtc_dsp_debug, "webrtcdsp", 0, + "libwebrtcdsp wrapping elements");); +GST_ELEMENT_REGISTER_DEFINE (webrtcdsp, "webrtcdsp", GST_RANK_NONE, + GST_TYPE_WEBRTC_DSP); static const gchar * webrtc_error_to_string (gint err) @@ -438,12 +442,24 @@ } static void -gst_webrtc_vad_post_message (GstWebrtcDsp *self, GstClockTime timestamp, +gst_webrtc_vad_post_activity (GstWebrtcDsp *self, GstBuffer *buffer, gboolean stream_has_voice) { + GstClockTime timestamp = GST_BUFFER_PTS (buffer); GstBaseTransform *trans = GST_BASE_TRANSFORM_CAST (self); GstStructure *s; GstClockTime stream_time; + GstAudioLevelMeta *meta; + guint8 level; + + level = self->apm->level_estimator ()->RMS (); + meta = gst_buffer_get_audio_level_meta (buffer); + if (meta) { + meta->voice_activity = stream_has_voice; + meta->level = level; + } else { + gst_buffer_add_audio_level_meta (buffer, level, stream_has_voice); + } stream_time = gst_segment_to_stream_time (&trans->segment, GST_FORMAT_TIME, timestamp); @@ -498,7 +514,7 @@ gboolean stream_has_voice = apm->voice_detection ()->stream_has_voice (); if (stream_has_voice != self->stream_has_voice) - gst_webrtc_vad_post_message (self, GST_BUFFER_PTS (buffer), stream_has_voice); + gst_webrtc_vad_post_activity (self, buffer, stream_has_voice); self->stream_has_voice = stream_has_voice; } @@ -712,6 +728,7 @@ apm->voice_detection ()->set_likelihood (self->voice_detection_likelihood); apm->voice_detection ()->set_frame_size_ms ( self->voice_detection_frame_size_ms); + apm->level_estimator ()->Enable (true); } GST_OBJECT_UNLOCK (self); @@ -1118,27 +1135,3 @@ gst_type_mark_as_plugin_api (GST_TYPE_WEBRTC_ECHO_SUPPRESSION_LEVEL, (GstPluginAPIFlags) 0); gst_type_mark_as_plugin_api 
(GST_TYPE_WEBRTC_VOICE_DETECTION_LIKELIHOOD, (GstPluginAPIFlags) 0); } - -static gboolean -plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT - (webrtc_dsp_debug, "webrtcdsp", 0, "libwebrtcdsp wrapping elements"); - - if (!gst_element_register (plugin, "webrtcdsp", GST_RANK_NONE, - GST_TYPE_WEBRTC_DSP)) { - return FALSE; - } - if (!gst_element_register (plugin, "webrtcechoprobe", GST_RANK_NONE, - GST_TYPE_WEBRTC_ECHO_PROBE)) { - return FALSE; - } - - return TRUE; -} - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - webrtcdsp, - "Voice pre-processing using WebRTC Audio Processing Library", - plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.18.6.tar.xz/ext/webrtcdsp/gstwebrtcdsp.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtcdsp/gstwebrtcdsp.h
Changed
@@ -52,6 +52,8 @@ GType gst_webrtc_dsp_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (webrtcdsp); + G_END_DECLS #endif /* __GST_WEBRTC_DSP_H__ */
gst-plugins-bad-1.20.1.tar.xz/ext/webrtcdsp/gstwebrtcdspplugin.cpp
Added
@@ -0,0 +1,90 @@ +/* + * WebRTC Audio Processing Elements + * + * Copyright 2016 Collabora Ltd + * @author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + */ + +/** + * SECTION:element-webrtcdsp + * @short_description: Audio Filter using WebRTC Audio Processing library + * + * A voice enhancement filter based on the WebRTC Audio Processing library. This + * library provides a wide variety of enhancement algorithms. This element + * tries to enable as much as possible. The currently enabled enhancements are + * High Pass Filter, Echo Canceller, Noise Suppression, Automatic Gain Control, + * and some extended filters. + * + * While the webrtcdsp element can be used alone, there is an exception for the + * echo canceller. The echo canceller needs to be aware of the far end streams + * that are played to loudspeakers. For this, you must place a webrtcechoprobe + * element at that far end. Note that the sample rate must match between + * webrtcdsp and the webrtcechoprobe. The number of channels, though, can differ. + * The probe is found by the DSP element using its object name. By default, + * webrtcdsp looks for webrtcechoprobe0, which means it just works if you have + * a single probe and DSP. 
+ * + * The probe can only be used within the same top level GstPipeline. + * Additionally, to simplify the code, the probe element must be created + * before the DSP sink pad is activated. It does not need to be in any + * particular state and does not even need to be added to the pipeline yet. + * + * # Example launch line + * + * As a convenience, the echo canceller can be tested using an echo loop. In + * this configuration, one would expect a single echo to be heard. + * + * |[ + * gst-launch-1.0 pulsesrc ! webrtcdsp ! webrtcechoprobe ! pulsesink + * ]| + * + * In a real environment, you'll place the probe before the playback, but only + * process the far end streams. The DSP should be placed as close as possible + * to the audio capture. The following pipeline is abstracted and does not + * represent a real pipeline. + * + * |[ + * gst-launch-1.0 far-end-src ! audio/x-raw,rate=48000 ! webrtcechoprobe ! pulsesink \ + * pulsesrc ! audio/x-raw,rate=48000 ! webrtcdsp ! far-end-sink + * ]| + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstwebrtcdsp.h" +#include "gstwebrtcechoprobe.h" + + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (webrtcdsp, plugin); + ret |= GST_ELEMENT_REGISTER (webrtcechoprobe, plugin); + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + webrtcdsp, + "Voice pre-processing using WebRTC Audio Processing Library", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.18.6.tar.xz/ext/webrtcdsp/gstwebrtcechoprobe.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtcdsp/gstwebrtcechoprobe.cpp
Changed
@@ -78,6 +78,8 @@ G_DEFINE_TYPE (GstWebrtcEchoProbe, gst_webrtc_echo_probe, GST_TYPE_AUDIO_FILTER); +GST_ELEMENT_REGISTER_DEFINE (webrtcechoprobe, "webrtcechoprobe", + GST_RANK_NONE, GST_TYPE_WEBRTC_ECHO_PROBE); static gboolean gst_webrtc_echo_probe_setup (GstAudioFilter * filter, const GstAudioInfo * info)
gst-plugins-bad-1.18.6.tar.xz/ext/webrtcdsp/gstwebrtcechoprobe.h -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtcdsp/gstwebrtcechoprobe.h
Changed
@@ -87,6 +87,8 @@ GType gst_webrtc_echo_probe_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (webrtcechoprobe); + GstWebrtcEchoProbe *gst_webrtc_acquire_echo_probe (const gchar * name); void gst_webrtc_release_echo_probe (GstWebrtcEchoProbe * probe); gint gst_webrtc_echo_probe_read (GstWebrtcEchoProbe * self,
gst-plugins-bad-1.18.6.tar.xz/ext/webrtcdsp/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/webrtcdsp/meson.build
Changed
@@ -1,6 +1,7 @@ webrtc_sources = [ 'gstwebrtcdsp.cpp', - 'gstwebrtcechoprobe.cpp' + 'gstwebrtcechoprobe.cpp', + 'gstwebrtcdspplugin.cpp' ] webrtc_dep = dependency('webrtc-audio-processing', version : ['>= 0.2', '< 0.4'],
gst-plugins-bad-1.18.6.tar.xz/ext/wildmidi/gstwildmididec.c -> gst-plugins-bad-1.20.1.tar.xz/ext/wildmidi/gstwildmididec.c
Changed
@@ -111,7 +111,8 @@ G_DEFINE_TYPE (GstWildmidiDec, gst_wildmidi_dec, GST_TYPE_NONSTREAM_AUDIO_DECODER); - +GST_ELEMENT_REGISTER_DEFINE (wildmididec, "wildmididec", GST_RANK_MARGINAL, + gst_wildmidi_dec_get_type ()); static void gst_wildmidi_dec_finalize (GObject * object); @@ -677,8 +678,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "wildmididec", GST_RANK_MARGINAL, - gst_wildmidi_dec_get_type ()); + return GST_ELEMENT_REGISTER (wildmididec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/ext/wildmidi/gstwildmididec.h -> gst-plugins-bad-1.20.1.tar.xz/ext/wildmidi/gstwildmididec.h
Changed
@@ -62,6 +62,7 @@ GType gst_wildmidi_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (wildmididec); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/ext/wpe/WPEThreadedView.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/wpe/WPEThreadedView.cpp
Changed
@@ -22,6 +22,8 @@ #endif #include "WPEThreadedView.h" +#include "gstwpe.h" +#include "gstwpesrcbin.h" #include <gst/gl/gl.h> #include <gst/gl/egl/gsteglimage.h> @@ -31,20 +33,10 @@ #include <cstdio> #include <mutex> -#if ENABLE_SHM_BUFFER_SUPPORT #include <wpe/unstable/fdo-shm.h> -#endif - -GST_DEBUG_CATEGORY_EXTERN (wpe_src_debug); -#define GST_CAT_DEFAULT wpe_src_debug -#if defined(WPE_FDO_CHECK_VERSION) && WPE_FDO_CHECK_VERSION(1, 3, 0) -#define USE_DEPRECATED_FDO_EGL_IMAGE 0 -#define WPE_GLIB_SOURCE_PRIORITY G_PRIORITY_DEFAULT -#else -#define USE_DEPRECATED_FDO_EGL_IMAGE 1 -#define WPE_GLIB_SOURCE_PRIORITY -70 -#endif +GST_DEBUG_CATEGORY_EXTERN (wpe_view_debug); +#define GST_CAT_DEFAULT wpe_view_debug class GMutexHolder { public: @@ -85,7 +77,9 @@ { GMutexHolder lock(threading.mutex); threading.thread = g_thread_new("WPEContextThread", s_viewThread, this); - g_cond_wait(&threading.cond, &threading.mutex); + while (!threading.ready) { + g_cond_wait(&threading.cond, &threading.mutex); + } GST_DEBUG("thread spawned"); } } @@ -104,30 +98,49 @@ template<typename Function> void WPEContextThread::dispatch(Function func) { - struct Payload { + struct Job { + Job(Function& f) + : func(f) + { + g_mutex_init(&mutex); + g_cond_init(&cond); + dispatched = FALSE; + } + ~Job() { + g_mutex_clear(&mutex); + g_cond_clear(&cond); + } + + void dispatch() { + GMutexHolder lock(mutex); + func(); + dispatched = TRUE; + g_cond_signal(&cond); + } + + void waitCompletion() { + GMutexHolder lock(mutex); + while(!dispatched) { + g_cond_wait(&cond, &mutex); + } + } + Function& func; + GMutex mutex; + GCond cond; + gboolean dispatched; }; - struct Payload payload { func }; + struct Job job(func); GSource* source = g_idle_source_new(); g_source_set_callback(source, [](gpointer data) -> gboolean { - auto& view = WPEContextThread::singleton(); - GMutexHolder lock(view.threading.mutex); - - auto* payload = static_cast<struct Payload*>(data); - payload->func(); - - 
g_cond_signal(&view.threading.cond); + auto* job = static_cast<struct Job*>(data); + job->dispatch(); return G_SOURCE_REMOVE; - }, &payload, nullptr); - g_source_set_priority(source, WPE_GLIB_SOURCE_PRIORITY); - - { - GMutexHolder lock(threading.mutex); - g_source_attach(source, glib.context); - g_cond_wait(&threading.cond, &threading.mutex); - } - + }, &job, nullptr); + g_source_set_priority(source, G_PRIORITY_DEFAULT); + g_source_attach(source, glib.context); + job.waitCompletion(); g_source_unref(source); } @@ -146,6 +159,7 @@ [](gpointer data) -> gboolean { auto& view = *static_cast<WPEContextThread*>(data); GMutexHolder lock(view.threading.mutex); + view.threading.ready = TRUE; g_cond_signal(&view.threading.cond); return G_SOURCE_REMOVE; }, @@ -165,27 +179,178 @@ return nullptr; } -WPEView* WPEContextThread::createWPEView(GstWpeSrc* src, GstGLContext* context, GstGLDisplay* display, int width, int height) +#ifdef G_OS_UNIX +static void +initialize_web_extensions (WebKitWebContext *context) +{ + const gchar *local_path = gst_wpe_get_devenv_extension_path (); + const gchar *path = g_file_test (local_path, G_FILE_TEST_IS_DIR) ? local_path : G_STRINGIFY (WPE_EXTENSION_INSTALL_DIR); + GST_INFO ("Loading WebExtension from %s", path); + webkit_web_context_set_web_extensions_directory (context, path); +} + +static void +webkit_extension_gerror_msg_received (GstWpeSrc *src, GVariant *params) +{ + GstStructure *structure; + GstMessage *forwarded; + const gchar *src_path, *src_type, *src_name, *error_domain, *msg, *debug_str, *details_str; + gint message_type; + guint32 error_code; + + g_variant_get (params, "(issssusss)", + &message_type, + &src_type, + &src_name, + &src_path, + &error_domain, + &error_code, + &msg, + &debug_str, + &details_str + ); + + GError *error = g_error_new(g_quark_from_string(error_domain), error_code, "%s", msg); + GstStructure *details = (details_str[0] != '\0') ? 
gst_structure_new_from_string(details_str) : NULL; + gchar * our_message = g_strdup_printf( + "`%s` posted from %s running inside the web page", + debug_str, src_path + ); + + + if (message_type == GST_MESSAGE_ERROR) { + forwarded = + gst_message_new_error_with_details(GST_OBJECT(src), error, + our_message, details); + } else if (message_type == GST_MESSAGE_WARNING) { + forwarded = + gst_message_new_warning_with_details(GST_OBJECT(src), error, + our_message, details); + } else { + forwarded = + gst_message_new_info_with_details(GST_OBJECT(src), error, our_message, details); + } + + structure = gst_structure_new ("WpeForwarded", + "message", GST_TYPE_MESSAGE, forwarded, + "wpe-original-src-name", G_TYPE_STRING, src_name, + "wpe-original-src-type", G_TYPE_STRING, src_type, + "wpe-original-src-path", G_TYPE_STRING, src_path, + NULL + ); + + g_free (our_message); + gst_element_post_message(GST_ELEMENT(src), gst_message_new_custom(GST_MESSAGE_ELEMENT, + GST_OBJECT(src), structure)); + g_error_free(error); + gst_message_unref (forwarded); +} + +static void +webkit_extension_bus_message_received (GstWpeSrc *src, GVariant *params) +{ + GstStructure *original_structure, *structure; + const gchar *src_name, *src_type, *src_path, *struct_str; + GstMessageType message_type; + GstMessage *forwarded; + + g_variant_get (params, "(issss)", + &message_type, + &src_name, + &src_type, + &src_path, + &struct_str + ); + + original_structure = (struct_str[0] != '\0') ? 
gst_structure_new_from_string(struct_str) : NULL; + if (!original_structure) + { + if (struct_str[0] != '\0') + GST_ERROR_OBJECT(src, "Could not deserialize: %s", struct_str); + original_structure = gst_structure_new_empty("wpesrc"); + + } + + forwarded = gst_message_new_custom(message_type, + GST_OBJECT (src), original_structure); + structure = gst_structure_new ("WpeForwarded", + "message", GST_TYPE_MESSAGE, forwarded, + "wpe-original-src-name", G_TYPE_STRING, src_name, + "wpe-original-src-type", G_TYPE_STRING, src_type, + "wpe-original-src-path", G_TYPE_STRING, src_path, + NULL + ); + + gst_element_post_message(GST_ELEMENT(src), gst_message_new_custom(GST_MESSAGE_ELEMENT, + GST_OBJECT(src), structure)); + + gst_message_unref (forwarded); +} + +static gboolean +webkit_extension_msg_received (WebKitWebContext *context, + WebKitUserMessage *message, + GstWpeSrc *src) +{ + const gchar *name = webkit_user_message_get_name (message); + GVariant *params = webkit_user_message_get_parameters (message); + gboolean res = TRUE; + + if (!g_strcmp0(name, "gstwpe.new_stream")) { + guint32 id = g_variant_get_uint32 (g_variant_get_child_value (params, 0)); + const gchar *capsstr = g_variant_get_string (g_variant_get_child_value (params, 1), NULL); + GstCaps *caps = gst_caps_from_string (capsstr); + const gchar *stream_id = g_variant_get_string (g_variant_get_child_value (params, 2), NULL); + gst_wpe_src_new_audio_stream(src, id, caps, stream_id); + gst_caps_unref (caps); + } else if (!g_strcmp0(name, "gstwpe.set_shm")) { + auto fdlist = webkit_user_message_get_fd_list (message); + gint id = g_variant_get_uint32 (g_variant_get_child_value (params, 0)); + gst_wpe_src_set_audio_shm (src, fdlist, id); + } else if (!g_strcmp0(name, "gstwpe.new_buffer")) { + guint32 id = g_variant_get_uint32 (g_variant_get_child_value (params, 0)); + guint64 size = g_variant_get_uint64 (g_variant_get_child_value (params, 1)); + gst_wpe_src_push_audio_buffer (src, id, size); + + 
webkit_user_message_send_reply(message, webkit_user_message_new ("gstwpe.buffer_processed", NULL)); + } else if (!g_strcmp0(name, "gstwpe.pause")) { + guint32 id = g_variant_get_uint32 (params); + + gst_wpe_src_pause_audio_stream (src, id); + } else if (!g_strcmp0(name, "gstwpe.stop")) { + guint32 id = g_variant_get_uint32 (params); + + gst_wpe_src_stop_audio_stream (src, id); + } else if (!g_strcmp0(name, "gstwpe.bus_gerror_message")) { + webkit_extension_gerror_msg_received (src, params); + } else if (!g_strcmp0(name, "gstwpe.bus_message")) { + webkit_extension_bus_message_received (src, params); + } else { + res = FALSE; + g_error("Unknown event: %s", name); + } + + return res; +} +#endif + +WPEView* WPEContextThread::createWPEView(GstWpeVideoSrc* src, GstGLContext* context, GstGLDisplay* display, int width, int height) { GST_DEBUG("context %p display %p, size (%d,%d)", context, display, width, height); static std::once_flag s_loaderFlag; std::call_once(s_loaderFlag, [] { -#if defined(WPE_BACKEND_CHECK_VERSION) && WPE_BACKEND_CHECK_VERSION(1, 2, 0) wpe_loader_init("libWPEBackend-fdo-1.0.so"); -#endif }); WPEView* view = nullptr; dispatch([&]() mutable { - if (!glib.web_context) { - auto* manager = webkit_website_data_manager_new_ephemeral(); - glib.web_context = webkit_web_context_new_with_website_data_manager(manager); - g_object_unref(manager); - } + auto* manager = webkit_website_data_manager_new_ephemeral(); + auto web_context = webkit_web_context_new_with_website_data_manager(manager); + g_object_unref(manager); - view = new WPEView(glib.web_context, src, context, display, width, height); + view = new WPEView(web_context, src, context, display, width, height); }); if (view && view->hasUri()) { @@ -199,7 +364,14 @@ static gboolean s_loadFailed(WebKitWebView*, WebKitLoadEvent, gchar* failing_uri, GError* error, gpointer data) { - GstWpeSrc* src = GST_WPE_SRC(data); + GstWpeVideoSrc* src = GST_WPE_VIDEO_SRC(data); + + if (g_error_matches(error, 
WEBKIT_NETWORK_ERROR, WEBKIT_NETWORK_ERROR_CANCELLED)) { + GST_INFO_OBJECT (src, "Loading cancelled."); + + return FALSE; + } + GST_ELEMENT_ERROR (GST_ELEMENT_CAST(src), RESOURCE, FAILED, (NULL), ("Failed to load %s (%s)", failing_uri, error->message)); return FALSE; } @@ -210,8 +382,42 @@ return FALSE; } -WPEView::WPEView(WebKitWebContext* web_context, GstWpeSrc* src, GstGLContext* context, GstGLDisplay* display, int width, int height) +static void s_loadProgressChaned(GObject* object, GParamSpec*, gpointer data) +{ + GstElement* src = GST_ELEMENT_CAST (data); + // The src element is locked already so we can't call + // gst_element_post_message(). Instead retrieve the bus manually and use it + // directly. + GstBus* bus = GST_ELEMENT_BUS (src); + double estimatedProgress; + g_object_get(object, "estimated-load-progress", &estimatedProgress, nullptr); + gst_object_ref (bus); + gst_bus_post (bus, gst_message_new_element(GST_OBJECT_CAST(src), gst_structure_new("wpe-stats", "estimated-load-progress", G_TYPE_DOUBLE, estimatedProgress * 100, nullptr))); + gst_object_unref (bus); +} + +WPEView::WPEView(WebKitWebContext* web_context, GstWpeVideoSrc* src, GstGLContext* context, GstGLDisplay* display, int width, int height) +{ +#ifdef G_OS_UNIX { + GstObject *parent = gst_object_get_parent (GST_OBJECT (src)); + + if (parent && GST_IS_WPE_SRC (parent)) { + audio.init_ext_sigid = g_signal_connect (web_context, + "initialize-web-extensions", + G_CALLBACK (initialize_web_extensions), + NULL); + audio.extension_msg_sigid = g_signal_connect (web_context, + "user-message-received", + G_CALLBACK (webkit_extension_msg_received), + parent); + GST_INFO_OBJECT (parent, "Enabled audio"); + } + + gst_clear_object (&parent); +} +#endif // G_OS_UNIX + g_mutex_init(&threading.ready_mutex); g_cond_init(&threading.ready_cond); threading.ready = FALSE; @@ -219,62 +425,76 @@ g_mutex_init(&images_mutex); if (context) gst.context = GST_GL_CONTEXT(gst_object_ref(context)); - if (display) + if 
(display) { gst.display = GST_GL_DISPLAY(gst_object_ref(display)); + } wpe.width = width; wpe.height = height; - EGLDisplay eglDisplay = EGL_NO_DISPLAY; - if (context && display) - eglDisplay = gst_gl_display_egl_get_from_native(GST_GL_DISPLAY_TYPE_WAYLAND, gst_gl_display_get_handle(display)); - GST_DEBUG("eglDisplay %p", eglDisplay); + if (context && display) { + if (gst_gl_context_get_gl_platform(context) == GST_GL_PLATFORM_EGL) { + gst.display_egl = gst_gl_display_egl_from_gl_display (gst.display); + } else { + GST_DEBUG ("Available GStreamer GL Context is not EGL - not creating an EGL display from it"); + } + } + + if (gst.display_egl) { + EGLDisplay eglDisplay = (EGLDisplay)gst_gl_display_get_handle (GST_GL_DISPLAY(gst.display_egl)); + GST_DEBUG("eglDisplay %p", eglDisplay); - if (eglDisplay) { m_isValid = wpe_fdo_initialize_for_egl_display(eglDisplay); GST_DEBUG("FDO EGL display initialisation result: %d", m_isValid); } else { -#if ENABLE_SHM_BUFFER_SUPPORT m_isValid = wpe_fdo_initialize_shm(); GST_DEBUG("FDO SHM initialisation result: %d", m_isValid); -#else - GST_WARNING("FDO SHM support is available only in WPEBackend-FDO 1.7.0"); -#endif } if (!m_isValid) return; - if (eglDisplay) { + if (gst.display_egl) { wpe.exportable = wpe_view_backend_exportable_fdo_egl_create(&s_exportableEGLClient, this, wpe.width, wpe.height); } else { -#if ENABLE_SHM_BUFFER_SUPPORT wpe.exportable = wpe_view_backend_exportable_fdo_create(&s_exportableClient, this, wpe.width, wpe.height); -#endif } auto* wpeViewBackend = wpe_view_backend_exportable_fdo_get_view_backend(wpe.exportable); auto* viewBackend = webkit_web_view_backend_new(wpeViewBackend, (GDestroyNotify) wpe_view_backend_exportable_fdo_destroy, wpe.exportable); -#if defined(WPE_BACKEND_CHECK_VERSION) && WPE_BACKEND_CHECK_VERSION(1, 1, 0) wpe_view_backend_add_activity_state(wpeViewBackend, wpe_view_activity_state_visible | wpe_view_activity_state_focused | wpe_view_activity_state_in_window); -#endif - webkit.view = 
WEBKIT_WEB_VIEW(g_object_new(WEBKIT_TYPE_WEB_VIEW, "web-context", web_context, "backend", viewBackend, nullptr)); + webkit.view = WEBKIT_WEB_VIEW(g_object_new(WEBKIT_TYPE_WEB_VIEW, + "web-context", web_context, + "backend", viewBackend, + nullptr)); g_signal_connect(webkit.view, "load-failed", G_CALLBACK(s_loadFailed), src); g_signal_connect(webkit.view, "load-failed-with-tls-errors", G_CALLBACK(s_loadFailedWithTLSErrors), src); + g_signal_connect(webkit.view, "notify::estimated-load-progress", G_CALLBACK(s_loadProgressChaned), src); - gst_wpe_src_configure_web_view(src, webkit.view); + auto* settings = webkit_web_view_get_settings(webkit.view); + webkit_settings_set_enable_webaudio(settings, TRUE); - const gchar* location; + gst_wpe_video_src_configure_web_view(src, webkit.view); + + gchar* location; gboolean drawBackground = TRUE; g_object_get(src, "location", &location, "draw-background", &drawBackground, nullptr); setDrawBackground(drawBackground); - if (location) + if (location) { loadUriUnlocked(location); + g_free(location); + } } WPEView::~WPEView() { + GstEGLImage *egl_pending = NULL; + GstEGLImage *egl_committed = NULL; + GstBuffer *shm_pending = NULL; + GstBuffer *shm_committed = NULL; + GST_TRACE ("%p destroying", this); + g_mutex_clear(&threading.ready_mutex); g_cond_clear(&threading.ready_cond); @@ -282,23 +502,43 @@ GMutexHolder lock(images_mutex); if (egl.pending) { - gst_egl_image_unref(egl.pending); + egl_pending = egl.pending; egl.pending = nullptr; } if (egl.committed) { - gst_egl_image_unref(egl.committed); + egl_committed = egl.committed; egl.committed = nullptr; } if (shm.pending) { - gst_buffer_unref(shm.pending); + GST_TRACE ("%p freeing shm pending %" GST_PTR_FORMAT, this, shm.pending); + shm_pending = shm.pending; shm.pending = nullptr; } if (shm.committed) { - gst_buffer_unref(shm.committed); + GST_TRACE ("%p freeing shm commited %" GST_PTR_FORMAT, this, shm.committed); + shm_committed = shm.committed; shm.committed = nullptr; } } + if 
(egl_pending) + gst_egl_image_unref (egl_pending); + if (egl_committed) + gst_egl_image_unref (egl_committed); + if (shm_pending) + gst_buffer_unref (shm_pending); + if (shm_committed) + gst_buffer_unref (shm_committed); + + if (audio.init_ext_sigid) { + WebKitWebContext* web_context = webkit_web_view_get_context (webkit.view); + + g_signal_handler_disconnect(web_context, audio.init_ext_sigid); + g_signal_handler_disconnect(web_context, audio.extension_msg_sigid); + audio.init_ext_sigid = 0; + audio.extension_msg_sigid = 0; + } + WPEContextThread::singleton().dispatch([&]() { if (webkit.view) { g_object_unref(webkit.view); @@ -306,6 +546,11 @@ } }); + if (gst.display_egl) { + gst_object_unref(gst.display_egl); + gst.display_egl = nullptr; + } + if (gst.display) { gst_object_unref(gst.display); gst.display = nullptr; @@ -321,6 +566,7 @@ } g_mutex_clear(&images_mutex); + GST_TRACE ("%p destroyed", this); } void WPEView::notifyLoadFinished() @@ -343,6 +589,7 @@ { GstEGLImage* ret = nullptr; bool dispatchFrameComplete = false; + GstEGLImage *prev_image = NULL; { GMutexHolder lock(images_mutex); @@ -353,12 +600,10 @@ GST_IS_EGL_IMAGE(egl.committed) ? GST_MINI_OBJECT_REFCOUNT_VALUE(GST_MINI_OBJECT_CAST(egl.committed)) : 0); if (egl.pending) { - auto* previousImage = egl.committed; + prev_image = egl.committed; egl.committed = egl.pending; egl.pending = nullptr; - if (previousImage) - gst_egl_image_unref(previousImage); dispatchFrameComplete = true; } @@ -366,6 +611,9 @@ ret = egl.committed; } + if (prev_image) + gst_egl_image_unref(prev_image); + if (dispatchFrameComplete) frameComplete(); @@ -376,6 +624,7 @@ { GstBuffer* ret = nullptr; bool dispatchFrameComplete = false; + GstBuffer *prev_image = NULL; { GMutexHolder lock(images_mutex); @@ -386,12 +635,10 @@ GST_IS_BUFFER(shm.committed) ? 
GST_MINI_OBJECT_REFCOUNT_VALUE(GST_MINI_OBJECT_CAST(shm.committed)) : 0); if (shm.pending) { - auto* previousImage = shm.committed; + prev_image = shm.committed; shm.committed = shm.pending; shm.pending = nullptr; - if (previousImage) - gst_buffer_unref(previousImage); dispatchFrameComplete = true; } @@ -399,6 +646,9 @@ ret = shm.committed; } + if (prev_image) + gst_buffer_unref(prev_image); + if (dispatchFrameComplete) frameComplete(); @@ -463,13 +713,8 @@ { s_view->dispatch([&]() { GST_TRACE("Dispatch release exported image %p", imagePointer); -#if USE_DEPRECATED_FDO_EGL_IMAGE - wpe_view_backend_exportable_fdo_egl_dispatch_release_image(wpe.exportable, - static_cast<EGLImageKHR>(imagePointer)); -#else wpe_view_backend_exportable_fdo_egl_dispatch_release_exported_image(wpe.exportable, static_cast<struct wpe_fdo_egl_exported_image*>(imagePointer)); -#endif }); } @@ -483,25 +728,20 @@ ImageContext* imageContext = g_slice_new(ImageContext); imageContext->view = this; imageContext->image = static_cast<gpointer>(image); - EGLImageKHR eglImage; -#if USE_DEPRECATED_FDO_EGL_IMAGE - eglImage = static_cast<EGLImageKHR>(image); -#else - eglImage = wpe_fdo_egl_exported_image_get_egl_image(static_cast<struct wpe_fdo_egl_exported_image*>(image)); -#endif + EGLImageKHR eglImage = wpe_fdo_egl_exported_image_get_egl_image(static_cast<struct wpe_fdo_egl_exported_image*>(image)); auto* gstImage = gst_egl_image_new_wrapped(gst.context, eglImage, GST_GL_RGBA, imageContext, s_releaseImage); { GMutexHolder lock(images_mutex); GST_TRACE("EGLImage %p wrapped in GstEGLImage %" GST_PTR_FORMAT, eglImage, gstImage); + gst_clear_mini_object ((GstMiniObject **) &egl.pending); egl.pending = gstImage; notifyLoadFinished(); } } -#if ENABLE_SHM_BUFFER_SUPPORT struct SHMBufferContext { WPEView* view; struct wpe_fdo_shm_exported_buffer* buffer; @@ -553,21 +793,13 @@ { GMutexHolder lock(images_mutex); GST_TRACE("SHM buffer %p wrapped in buffer %" GST_PTR_FORMAT, buffer, gstBuffer); + gst_clear_buffer 
(&shm.pending); shm.pending = gstBuffer; notifyLoadFinished(); } } -#endif struct wpe_view_backend_exportable_fdo_egl_client WPEView::s_exportableEGLClient = { -#if USE_DEPRECATED_FDO_EGL_IMAGE - // export_egl_image - [](void* data, EGLImageKHR image) { - auto& view = *static_cast<WPEView*>(data); - view.handleExportedImage(static_cast<gpointer>(image)); - }, - nullptr, nullptr, -#else // export_egl_image nullptr, [](void* data, struct wpe_fdo_egl_exported_image* image) { @@ -575,12 +807,10 @@ view.handleExportedImage(static_cast<gpointer>(image)); }, nullptr, -#endif // USE_DEPRECATED_FDO_EGL_IMAGE // padding nullptr, nullptr }; -#if ENABLE_SHM_BUFFER_SUPPORT struct wpe_view_backend_exportable_fdo_client WPEView::s_exportableClient = { nullptr, nullptr, @@ -592,7 +822,6 @@ nullptr, nullptr, }; -#endif void WPEView::s_releaseImage(GstEGLImage* image, gpointer data) {
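The recurring change in the `WPEThreadedView.cpp` hunks above is that `WPEView::image()`, `WPEView::buffer()` and the destructor now swap the pending/committed pointers *inside* the `images_mutex` critical section but defer the `gst_egl_image_unref()`/`gst_buffer_unref()` until *after* the lock is dropped, so a destroy-notify that re-enters the view cannot deadlock on the same mutex. A minimal sketch of that pattern in plain C with pthreads (the `Frame`/`View` names and refcounting are illustrative stand-ins for `GstEGLImage`/`GstBuffer`, not GStreamer API):

```c
#include <assert.h>
#include <pthread.h>
#include <stdlib.h>

/* Hypothetical refcounted frame, standing in for GstEGLImage / GstBuffer. */
typedef struct { int refcount; } Frame;

static Frame *frame_ref(Frame *f) { f->refcount++; return f; }
static void frame_unref(Frame *f) { if (--f->refcount == 0) free(f); }

typedef struct {
    pthread_mutex_t lock;
    Frame *pending;    /* most recently produced frame (owned) */
    Frame *committed;  /* frame currently handed to the consumer (owned) */
} View;

/* Promote pending to committed, as WPEView::image() does in the diff:
 * the pointer swap happens under the lock, but the unref of the previous
 * committed frame is deferred until the lock is released, so any release
 * callback that re-enters the view cannot deadlock on `lock`. */
static Frame *view_take_committed(View *v)
{
    Frame *prev = NULL, *ret = NULL;

    pthread_mutex_lock(&v->lock);
    if (v->pending) {
        prev = v->committed;          /* remember, do NOT unref yet */
        v->committed = v->pending;    /* ownership transfer */
        v->pending = NULL;
    }
    if (v->committed)
        ret = frame_ref(v->committed);
    pthread_mutex_unlock(&v->lock);

    if (prev)
        frame_unref(prev);            /* released outside the critical section */
    return ret;
}
```

The same shape appears three times in the diff (EGL images, SHM buffers, and the destructor), each collecting the objects to release into locals and dropping them only after the `GMutexHolder` scope ends.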
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wpe/WPEThreadedView.h -> gst-plugins-bad-1.20.1.tar.xz/ext/wpe/WPEThreadedView.h
Changed
@@ -22,33 +22,29 @@ #include <EGL/egl.h> #include <glib.h> #include <gst/gl/gstglfuncs.h> +#include <gst/gl/egl/gstgldisplay_egl.h> #include <wpe/fdo.h> #include <wpe/fdo-egl.h> #include <wpe/webkit.h> -#include "gstwpesrc.h" +#include "gstwpevideosrc.h" typedef struct _GstGLContext GstGLContext; typedef struct _GstGLDisplay GstGLDisplay; typedef struct _GstEGLImage GstEGLImage; -#if defined(WPE_FDO_CHECK_VERSION) && WPE_FDO_CHECK_VERSION(1, 7, 0) -#define ENABLE_SHM_BUFFER_SUPPORT 1 -#else -#define ENABLE_SHM_BUFFER_SUPPORT 0 -#endif - class WPEView { public: - WPEView(WebKitWebContext*, GstWpeSrc*, GstGLContext*, GstGLDisplay*, int width, int height); + WPEView(WebKitWebContext*, GstWpeVideoSrc*, GstGLContext*, GstGLDisplay*, int width, int height); ~WPEView(); bool operator!() const { return m_isValid; } - /* Used by wpesrc */ + /* Used by wpevideosrc */ void resize(int width, int height); void loadUri(const gchar*); void loadData(GBytes*); void setDrawBackground(gboolean); + GstEGLImage* image(); GstBuffer* buffer(); @@ -63,9 +59,7 @@ protected: void handleExportedImage(gpointer); -#if ENABLE_SHM_BUFFER_SUPPORT void handleExportedBuffer(struct wpe_fdo_shm_exported_buffer*); -#endif private: struct wpe_view_backend* backend() const; @@ -74,27 +68,24 @@ void notifyLoadFinished(); void releaseImage(gpointer); -#if ENABLE_SHM_BUFFER_SUPPORT void releaseSHMBuffer(gpointer); static void s_releaseSHMBuffer(gpointer); -#endif struct { GstGLContext* context; GstGLDisplay* display; - } gst { nullptr, nullptr }; + GstGLDisplayEGL* display_egl; + } gst { nullptr, nullptr, nullptr }; static struct wpe_view_backend_exportable_fdo_egl_client s_exportableEGLClient; -#if ENABLE_SHM_BUFFER_SUPPORT static struct wpe_view_backend_exportable_fdo_client s_exportableClient; -#endif static void s_releaseImage(GstEGLImage*, gpointer); struct { struct wpe_view_backend_exportable_fdo* exportable; int width; int height; - } wpe { nullptr, 0, 0 }; + } wpe { nullptr, 0, 0, }; struct { 
gchar* uri; @@ -123,6 +114,11 @@ GstBuffer* committed; } shm { nullptr, nullptr }; + struct { + gulong init_ext_sigid; + gulong extension_msg_sigid; + } audio {0, 0}; + }; class WPEContextThread { @@ -132,7 +128,7 @@ WPEContextThread(); ~WPEContextThread(); - WPEView* createWPEView(GstWpeSrc*, GstGLContext*, GstGLDisplay*, int width, int height); + WPEView* createWPEView(GstWpeVideoSrc*, GstGLContext*, GstGLDisplay*, int width, int height); template<typename Function> void dispatch(Function); @@ -142,6 +138,7 @@ struct { GMutex mutex; GCond cond; + gboolean ready; GThread* thread { nullptr }; } threading;
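The header diff adds a `gboolean ready;` field next to the existing `GMutex`/`GCond` pair in the `threading` struct: the context thread flags readiness under the mutex and signals, while the constructor waits on the condition with a predicate loop. A small self-contained C sketch of that handshake with pthreads (the `Gate`/`worker` names are illustrative, not from the GStreamer sources):

```c
#include <assert.h>
#include <pthread.h>

typedef struct {
    pthread_mutex_t mutex;
    pthread_cond_t cond;
    int ready;                        /* the predicate guarded by mutex */
} Gate;

/* Worker side: publish the state change under the lock, then signal. */
static void *worker(void *arg)
{
    Gate *g = arg;
    pthread_mutex_lock(&g->mutex);
    g->ready = 1;
    pthread_cond_signal(&g->cond);
    pthread_mutex_unlock(&g->mutex);
    return NULL;
}

/* Waiter side: loop on the predicate, never a bare wait, so spurious
 * wakeups and signal-before-wait races are both handled. */
static void gate_wait(Gate *g)
{
    pthread_mutex_lock(&g->mutex);
    while (!g->ready)
        pthread_cond_wait(&g->cond, &g->mutex);
    pthread_mutex_unlock(&g->mutex);
}
```

Storing the flag explicitly (rather than inferring readiness from `thread != NULL`) is what makes the predicate loop possible, which is why the field is added alongside the existing mutex and cond.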
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/gstwpe.cpp
Added
@@ -0,0 +1,61 @@ +/* Copyright (C) <2018> Philippe Normand <philn@igalia.com> + * Copyright (C) <2018> Žan Doberšek <zdobersek@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstwpevideosrc.h" +#include "gstwpesrcbin.h" +#include "gstwpe.h" + +static gchar *extension_path = NULL; + +GST_DEBUG_CATEGORY (wpe_video_src_debug); +GST_DEBUG_CATEGORY (wpe_view_debug); +GST_DEBUG_CATEGORY (wpe_src_debug); + +const gchar *gst_wpe_get_devenv_extension_path (void) +{ + return extension_path; +} + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean result; + gchar *dirname = g_path_get_dirname (gst_plugin_get_filename (plugin)); + + GST_DEBUG_CATEGORY_INIT (wpe_video_src_debug, "wpevideosrc", 0, "WPE Video Source"); + GST_DEBUG_CATEGORY_INIT (wpe_view_debug, "wpeview", 0, "WPE Threaded View"); + GST_DEBUG_CATEGORY_INIT (wpe_src_debug, "wpesrc", 0, "WPE Source"); + + extension_path = g_build_filename (dirname, "wpe-extension", NULL); + g_free (dirname); + result = gst_element_register (plugin, "wpevideosrc", GST_RANK_NONE, + GST_TYPE_WPE_VIDEO_SRC); + result &= gst_element_register(plugin, "wpesrc", GST_RANK_NONE, GST_TYPE_WPE_SRC); + return result; +} + +GST_PLUGIN_DEFINE 
(GST_VERSION_MAJOR, GST_VERSION_MINOR, + wpe, "WPE src plugin", plugin_init, VERSION, GST_LICENSE, PACKAGE, + GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/gstwpe.h
Added
@@ -0,0 +1,24 @@ +/* Copyright (C) <2018, 2019> Philippe Normand <philn@igalia.com> + * Copyright (C) <2018, 2019> Žan Doberšek <zdobersek@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/gst.h> + +const gchar *gst_wpe_get_devenv_extension_path (void);
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/gstwpesrcbin.cpp
Added
@@ -0,0 +1,533 @@ +/* Copyright (C) <2018, 2019> Philippe Normand <philn@igalia.com> + * Copyright (C) <2018, 2019> Žan Doberšek <zdobersek@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-wpesrc + * @title: wpesrc + * + * The wpesrc element is used to produce a video texture representing a web page + * rendered off-screen by WPE. + * + * Starting from WPEBackend-FDO 1.6.x, software rendering support is available. + * This features allows wpesrc to be used on machines without GPU, and/or for + * testing purpose. To enable it, set the `LIBGL_ALWAYS_SOFTWARE=true` + * environment variable and make sure `video/x-raw, format=BGRA` caps are + * negotiated by the wpesrc element. + * + * ## Example launch lines + * + * ### Show the GStreamer website homepage + * + * ``` + * gst-launch-1.0 -v wpesrc location="https://gstreamer.freedesktop.org" ! queue ! glimagesink + * ``` + * + * ### Save the first 50 video frames generated for the GStreamer website as PNG files in /tmp + * + * ``` + * LIBGL_ALWAYS_SOFTWARE=true gst-launch-1.0 -v wpesrc num-buffers=50 location="https://gstreamer.freedesktop.org" ! videoconvert ! pngenc ! 
multifilesink location=/tmp/snapshot-%05d.png + * ``` + * + * + * ### Show the GStreamer website homepage as played with GstPlayer in a GTK+ window + * + * ``` + * gst-play-1.0 --videosink gtkglsink wpe://https://gstreamer.freedesktop.org + * ``` + * + * The `web://` URI protocol is also supported, as an alias to `wpe://`. Since: 1.20 + * + * ### Composite WPE with a video stream in a single OpenGL scene + * + * ``` + * gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 ! glimagesink wpesrc location="file:///home/phil/Downloads/plunk/index.html" draw-background=0 ! m. videotestsrc ! queue ! glupload ! glcolorconvert ! m. + * ``` + * + * + * ### Composite WPE with a video stream, sink_0 pad properties have to match the video dimensions + * + * ``` + * gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 sink_0::height=818 sink_0::width=1920 ! gtkglsink wpesrc location="file:///home/phil/Downloads/plunk/index.html" draw-background=0 ! m. uridecodebin uri="http://192.168.1.44/Sintel.2010.1080p.mkv" name=d d. ! queue ! glupload ! glcolorconvert ! m. + * ``` + * + * Additionally, any audio stream created by WPE is exposed as "sometimes" audio + * source pads. + * + * This source also relays GStreamer bus messages from the GStreamer pipelines + * running inside the web pages as [element custom](gst_message_new_custom) + * messages which structure is called `WpeForwarded` and has the following + * fields: + * + * * `message`: The original #GstMessage + * * `wpesrc-original-src-name`: Name of the original element posting the + * message + * * `wpesrc-original-src-type`: Name of the GType of the original element + * posting the message + * * `wpesrc-original-src-path`: [Path](gst_object_get_path_string) of the + * original element positing the message + * + * Note: This feature will be disabled if you disable the tracer subsystem. 
+ */ + +#include "gstwpesrcbin.h" +#include "gstwpevideosrc.h" +#include "gstwpe.h" +#include "WPEThreadedView.h" + +#include <gst/allocators/allocators.h> +#include <gst/base/gstflowcombiner.h> +#include <wpe/extensions/audio.h> + +#include <sys/mman.h> +#include <unistd.h> + +G_DEFINE_TYPE (GstWpeAudioPad, gst_wpe_audio_pad, GST_TYPE_GHOST_PAD); + +static void +gst_wpe_audio_pad_class_init (GstWpeAudioPadClass * klass) +{ +} + +static void +gst_wpe_audio_pad_init (GstWpeAudioPad * pad) +{ + gst_audio_info_init (&pad->info); + pad->discont_pending = FALSE; + pad->buffer_time = 0; +} + +static GstWpeAudioPad * +gst_wpe_audio_pad_new (const gchar * name) +{ + GstWpeAudioPad *pad = GST_WPE_AUDIO_PAD (g_object_new (gst_wpe_audio_pad_get_type (), + "name", name, "direction", GST_PAD_SRC, NULL)); + + if (!gst_ghost_pad_construct (GST_GHOST_PAD (pad))) { + gst_object_unref (pad); + return NULL; + } + + return pad; +} + +struct _GstWpeSrc +{ + GstBin parent; + + GstAllocator *fd_allocator; + GstElement *video_src; + GHashTable *audio_src_pads; + GstFlowCombiner *flow_combiner; + gchar *uri; +}; + +enum +{ + PROP_0, + PROP_LOCATION, + PROP_DRAW_BACKGROUND +}; + +enum +{ + SIGNAL_LOAD_BYTES, + LAST_SIGNAL +}; + +static guint gst_wpe_video_src_signals[LAST_SIGNAL] = { 0 }; + +static void gst_wpe_src_uri_handler_init (gpointer iface, gpointer data); + +GST_DEBUG_CATEGORY_EXTERN (wpe_src_debug); +#define GST_CAT_DEFAULT wpe_src_debug + +#define gst_wpe_src_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstWpeSrc, gst_wpe_src, GST_TYPE_BIN, + G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_wpe_src_uri_handler_init)); + +/** + * GstWpeSrc!video + * + * Since: 1.20 + */ +static GstStaticPadTemplate video_src_factory = +GST_STATIC_PAD_TEMPLATE ("video", GST_PAD_SRC, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw(memory:GLMemory), " + "format = (string) RGBA, " + "width = " GST_VIDEO_SIZE_RANGE ", " + "height = " GST_VIDEO_SIZE_RANGE ", " + "framerate = " GST_VIDEO_FPS_RANGE 
", " + "pixel-aspect-ratio = (fraction)1/1," + "texture-target = (string)2D" + "; video/x-raw, " + "format = (string) BGRA, " + "width = " GST_VIDEO_SIZE_RANGE ", " + "height = " GST_VIDEO_SIZE_RANGE ", " + "framerate = " GST_VIDEO_FPS_RANGE ", " + "pixel-aspect-ratio = (fraction)1/1" + )); + +/** + * GstWpeSrc!audio_%u + * + * Each audio stream in the renderer web page will expose the and `audio_%u` + * #GstPad. + * + * Since: 1.20 + */ +static GstStaticPadTemplate audio_src_factory = +GST_STATIC_PAD_TEMPLATE ("audio_%u", GST_PAD_SRC, GST_PAD_SOMETIMES, + GST_STATIC_CAPS ( \ + GST_AUDIO_CAPS_MAKE (GST_AUDIO_NE (F32)) ", layout=(string)interleaved; " \ + GST_AUDIO_CAPS_MAKE (GST_AUDIO_NE (F64)) ", layout=(string)interleaved; " \ + GST_AUDIO_CAPS_MAKE (GST_AUDIO_NE (S16)) ", layout=(string)interleaved" \ +)); + +static GstFlowReturn +gst_wpe_src_chain_buffer (GstPad * pad, GstObject * parent, GstBuffer * buffer) +{ + GstWpeSrc *src = GST_WPE_SRC (gst_object_get_parent (parent)); + GstFlowReturn result, chain_result; + + chain_result = gst_proxy_pad_chain_default (pad, GST_OBJECT_CAST (src), buffer); + result = gst_flow_combiner_update_pad_flow (src->flow_combiner, pad, chain_result); + gst_object_unref (src); + + if (result == GST_FLOW_FLUSHING) + return chain_result; + + return result; +} + +void +gst_wpe_src_new_audio_stream (GstWpeSrc *src, guint32 id, GstCaps *caps, const gchar *stream_id) +{ + GstWpeAudioPad *audio_pad; + GstPad *pad; + gchar *name; + GstEvent *stream_start; + GstSegment segment; + + name = g_strdup_printf ("audio_%u", id); + audio_pad = gst_wpe_audio_pad_new (name); + pad = GST_PAD_CAST (audio_pad); + g_free (name); + + gst_pad_set_active (pad, TRUE); + gst_element_add_pad (GST_ELEMENT_CAST (src), pad); + gst_flow_combiner_add_pad (src->flow_combiner, pad); + + GST_DEBUG_OBJECT (src, "Adding pad: %" GST_PTR_FORMAT, pad); + + stream_start = gst_event_new_stream_start (stream_id); + gst_pad_push_event (pad, stream_start); + + 
gst_audio_info_from_caps (&audio_pad->info, caps); + gst_pad_push_event (pad, gst_event_new_caps (caps)); + + gst_segment_init (&segment, GST_FORMAT_TIME); + gst_pad_push_event (pad, gst_event_new_segment (&segment)); + + g_hash_table_insert (src->audio_src_pads, GUINT_TO_POINTER (id), audio_pad); +} + +void +gst_wpe_src_set_audio_shm (GstWpeSrc* src, GUnixFDList *fds, guint32 id) +{ + gint fd; + GstWpeAudioPad *audio_pad = GST_WPE_AUDIO_PAD (g_hash_table_lookup (src->audio_src_pads, GUINT_TO_POINTER (id))); + + g_return_if_fail (GST_IS_WPE_SRC (src)); + g_return_if_fail (fds); + g_return_if_fail (g_unix_fd_list_get_length (fds) == 1); + g_return_if_fail (audio_pad->fd <= 0); + + fd = g_unix_fd_list_get (fds, 0, NULL); + g_return_if_fail (fd >= 0); + + if (audio_pad->fd > 0) + close(audio_pad->fd); + + audio_pad->fd = dup(fd); +} + +void +gst_wpe_src_push_audio_buffer (GstWpeSrc* src, guint32 id, guint64 size) +{ + GstWpeAudioPad *audio_pad = GST_WPE_AUDIO_PAD (g_hash_table_lookup (src->audio_src_pads, GUINT_TO_POINTER (id))); + GstBuffer *buffer; + + g_return_if_fail (audio_pad->fd > 0); + + GST_TRACE_OBJECT (audio_pad, "Handling incoming audio packet"); + + gpointer data = mmap (0, size, PROT_READ, MAP_PRIVATE, audio_pad->fd, 0); + buffer = gst_buffer_new_memdup (data, size); + munmap (data, size); + gst_buffer_add_audio_meta (buffer, &audio_pad->info, size, NULL); + + audio_pad->buffer_time = gst_element_get_current_running_time (GST_ELEMENT (src)); + GST_BUFFER_DTS (buffer) = audio_pad->buffer_time; + GST_BUFFER_PTS (buffer) = audio_pad->buffer_time; + + GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DISCONT); + if (audio_pad->discont_pending) { + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DISCONT); + audio_pad->discont_pending = FALSE; + } + + gst_flow_combiner_update_pad_flow (src->flow_combiner, GST_PAD (audio_pad), + gst_pad_push (GST_PAD_CAST (audio_pad), buffer)); +} + +static void +gst_wpe_src_remove_audio_pad (GstWpeSrc *src, GstPad *pad) +{ + 
GST_DEBUG_OBJECT (src, "Removing pad: %" GST_PTR_FORMAT, pad); + gst_element_remove_pad (GST_ELEMENT_CAST (src), pad); + gst_flow_combiner_remove_pad (src->flow_combiner, pad); +} + +void +gst_wpe_src_stop_audio_stream(GstWpeSrc* src, guint32 id) +{ + GstPad *pad = GST_PAD (g_hash_table_lookup (src->audio_src_pads, GUINT_TO_POINTER (id))); + g_return_if_fail (GST_IS_PAD (pad)); + + GST_INFO_OBJECT(pad, "Stopping"); + gst_pad_push_event (pad, gst_event_new_eos ()); + gst_wpe_src_remove_audio_pad (src, pad); + g_hash_table_remove (src->audio_src_pads, GUINT_TO_POINTER (id)); +} + +void +gst_wpe_src_pause_audio_stream(GstWpeSrc* src, guint32 id) +{ + GstWpeAudioPad *audio_pad = GST_WPE_AUDIO_PAD (g_hash_table_lookup (src->audio_src_pads, GUINT_TO_POINTER (id))); + GstPad *pad = GST_PAD_CAST (audio_pad); + g_return_if_fail (GST_IS_PAD (pad)); + + GST_INFO_OBJECT(pad, "Pausing"); + gst_pad_push_event (pad, gst_event_new_gap (audio_pad->buffer_time, GST_CLOCK_TIME_NONE)); + + audio_pad->discont_pending = TRUE; +} + +static void +gst_wpe_src_load_bytes (GstWpeVideoSrc * src, GBytes * bytes) +{ + GstWpeSrc *self = GST_WPE_SRC (src); + + if (self->video_src) + g_signal_emit_by_name (self->video_src, "load-bytes", bytes, NULL); +} + +static void +gst_wpe_src_set_location (GstWpeSrc * src, const gchar * location) +{ + g_object_set (src->video_src, "location", location, NULL); +} + +static void +gst_wpe_src_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstWpeSrc *self = GST_WPE_SRC (object); + + if (self->video_src) + g_object_get_property (G_OBJECT (self->video_src), pspec->name, value); +} + +static void +gst_wpe_src_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstWpeSrc *self = GST_WPE_SRC (object); + + if (self->video_src) { + if (prop_id == PROP_LOCATION) + gst_wpe_src_set_location (self, g_value_get_string (value)); + else + g_object_set_property (G_OBJECT (self->video_src), 
pspec->name, value); + } +} + +static GstURIType +gst_wpe_src_uri_get_type (GType) +{ + return GST_URI_SRC; +} + +static const gchar *const * +gst_wpe_src_get_protocols (GType) +{ + static const char *protocols[] = { "wpe", "web", NULL }; + return protocols; +} + +static gchar * +gst_wpe_src_get_uri (GstURIHandler * handler) +{ + GstWpeSrc *src = GST_WPE_SRC (handler); + + return g_strdup (src->uri); +} + +static gboolean +gst_wpe_src_set_uri (GstURIHandler * handler, const gchar * uri, + GError ** error) +{ + GstWpeSrc *src = GST_WPE_SRC (handler); + + if (src->uri) { + g_free (src->uri); + } + src->uri = g_strdup (uri); + gst_wpe_src_set_location(src, uri + 6); + return TRUE; +} + +static void +gst_wpe_src_uri_handler_init (gpointer iface_ptr, gpointer data) +{ + GstURIHandlerInterface *iface = (GstURIHandlerInterface *) iface_ptr; + + iface->get_type = gst_wpe_src_uri_get_type; + iface->get_protocols = gst_wpe_src_get_protocols; + iface->get_uri = gst_wpe_src_get_uri; + iface->set_uri = gst_wpe_src_set_uri; +} + +static void +gst_wpe_src_init (GstWpeSrc * src) +{ + GstPad *pad; + GstPad *ghost_pad; + GstProxyPad *proxy_pad; + + gst_bin_set_suppressed_flags (GST_BIN_CAST (src), + static_cast<GstElementFlags>(GST_ELEMENT_FLAG_SOURCE | GST_ELEMENT_FLAG_SINK)); + GST_OBJECT_FLAG_SET (src, GST_ELEMENT_FLAG_SOURCE); + + src->fd_allocator = gst_fd_allocator_new (); + src->audio_src_pads = g_hash_table_new (g_direct_hash, g_direct_equal); + src->flow_combiner = gst_flow_combiner_new (); + src->video_src = gst_element_factory_make ("wpevideosrc", NULL); + + gst_bin_add (GST_BIN_CAST (src), src->video_src); + + pad = gst_element_get_static_pad (GST_ELEMENT_CAST (src->video_src), "src"); + ghost_pad = gst_ghost_pad_new_from_template ("video", pad, + gst_static_pad_template_get (&video_src_factory)); + proxy_pad = gst_proxy_pad_get_internal (GST_PROXY_PAD (ghost_pad)); + gst_pad_set_active (GST_PAD_CAST (proxy_pad), TRUE); + + gst_element_add_pad (GST_ELEMENT_CAST (src), 
GST_PAD_CAST (ghost_pad)); + gst_flow_combiner_add_pad (src->flow_combiner, GST_PAD_CAST (ghost_pad)); + gst_pad_set_chain_function (GST_PAD_CAST (proxy_pad), gst_wpe_src_chain_buffer); + + gst_object_unref (proxy_pad); + gst_object_unref (pad); +} + +static gboolean +gst_wpe_audio_remove_audio_pad (gint32 *id, GstPad *pad, GstWpeSrc *self) +{ + gst_wpe_src_remove_audio_pad (self, pad); + + return TRUE; +} + +static GstStateChangeReturn +gst_wpe_src_change_state (GstElement * element, GstStateChange transition) +{ + GstStateChangeReturn result; + GstWpeSrc *src = GST_WPE_SRC (element); + + GST_DEBUG_OBJECT (src, "%s", gst_state_change_get_name (transition)); + result = GST_CALL_PARENT_WITH_DEFAULT (GST_ELEMENT_CLASS , change_state, (element, transition), GST_STATE_CHANGE_FAILURE); + + switch (transition) { + case GST_STATE_CHANGE_PAUSED_TO_READY:{ + g_hash_table_foreach_remove (src->audio_src_pads, (GHRFunc) gst_wpe_audio_remove_audio_pad, src); + gst_flow_combiner_reset (src->flow_combiner); + break; + } + default: + break; + } + + return result; +} + +static void +gst_wpe_src_finalize (GObject *object) +{ + GstWpeSrc *src = GST_WPE_SRC (object); + + g_hash_table_unref (src->audio_src_pads); + gst_flow_combiner_free (src->flow_combiner); + gst_object_unref (src->fd_allocator); + g_free (src->uri); + + GST_CALL_PARENT (G_OBJECT_CLASS, finalize, (object)); +} + +static void +gst_wpe_src_class_init (GstWpeSrcClass * klass) +{ + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->set_property = gst_wpe_src_set_property; + gobject_class->get_property = gst_wpe_src_get_property; + gobject_class->finalize = gst_wpe_src_finalize; + + g_object_class_install_property (gobject_class, PROP_LOCATION, + g_param_spec_string ("location", "location", "The URL to display", "", + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, 
PROP_DRAW_BACKGROUND, + g_param_spec_boolean ("draw-background", "Draws the background", + "Whether to draw the WebView background", TRUE, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_element_class_set_static_metadata (element_class, "WPE source", + "Source/Video/Audio", "Creates Audio/Video streams from a web" + " page using WPE web engine", + "Philippe Normand <philn@igalia.com>, Žan Doberšek " + "<zdobersek@igalia.com>"); + + /** + * GstWpeSrc::load-bytes: + * @src: the object which received the signal + * @bytes: the GBytes data to load + * + * Load the specified bytes into the internal webView. + */ + gst_wpe_video_src_signals[SIGNAL_LOAD_BYTES] = + g_signal_new_class_handler ("load-bytes", G_TYPE_FROM_CLASS (klass), + static_cast < GSignalFlags > (G_SIGNAL_RUN_LAST | G_SIGNAL_ACTION), + G_CALLBACK (gst_wpe_src_load_bytes), NULL, NULL, NULL, G_TYPE_NONE, 1, + G_TYPE_BYTES); + + element_class->change_state = GST_DEBUG_FUNCPTR (gst_wpe_src_change_state); + + gst_element_class_add_static_pad_template (element_class, &video_src_factory); + gst_element_class_add_static_pad_template (element_class, &audio_src_factory); +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/gstwpesrcbin.h
Added
@@ -0,0 +1,81 @@ +/* Copyright (C) <2018, 2019> Philippe Normand <philn@igalia.com> + * Copyright (C) <2018, 2019> Žan Doberšek <zdobersek@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#pragma once + +#include <gst/gst.h> +#include <gst/audio/audio.h> + +#ifdef G_OS_UNIX +#include <gio/gunixfdlist.h> +#endif + +G_BEGIN_DECLS + +GType gst_wpe_audio_pad_get_type(void); +#define GST_TYPE_WPE_AUDIO_PAD (gst_wpe_audio_pad_get_type()) +#define GST_WPE_AUDIO_PAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_WPE_AUDIO_PAD,GstWpeAudioPad)) +#define GST_IS_WPE_AUDIO_PAD(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_WPE_AUDIO_PAD)) +#define GST_WPE_AUDIO_PAD_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass) ,GST_TYPE_WPE_AUDIO_PAD,GstWpeAudioPadClass)) +#define GST_IS_WPE_AUDIO_PAD_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WPE_AUDIO_PAD)) +#define GST_WPE_AUDIO_PAD_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WPE_AUDIO_PAD,GstWpeAudioPadClass)) + +typedef struct _GstWpeAudioPad GstWpeAudioPad; +typedef struct _GstWpeAudioPadClass GstWpeAudioPadClass; + +struct _GstWpeAudioPad +{ + GstGhostPad parent; + + GstAudioInfo info; + GstClockTime buffer_time; + gboolean discont_pending; + gint fd; +}; + +struct _GstWpeAudioPadClass +{ + 
GstGhostPadClass parent_class; +}; + + +#define GST_TYPE_WPE_SRC (gst_wpe_src_get_type()) +#define GST_WPE_SRC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_WPE_SRC,GstWpeSrc)) +#define GST_WPE_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_WPE_SRC,GstWpeSrcClass)) +#define GST_IS_WPE_SRC(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_WPE_SRC)) +#define GST_IS_WPE_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_WPE_SRC)) + +typedef struct _GstWpeSrc GstWpeSrc; +typedef struct _GstWpeSrcClass GstWpeSrcClass; + +struct _GstWpeSrcClass +{ + GstBinClass parent_class; +}; + +GType gst_wpe_src_get_type (void); + +void gst_wpe_src_new_audio_stream(GstWpeSrc *src, guint32 id, GstCaps *caps, const gchar *stream_id); +void gst_wpe_src_set_audio_shm (GstWpeSrc* src, GUnixFDList *fds, guint32 id); +void gst_wpe_src_push_audio_buffer (GstWpeSrc* src, guint32 id, guint64 size); +void gst_wpe_src_pause_audio_stream (GstWpeSrc* src, guint32 id); +void gst_wpe_src_stop_audio_stream (GstWpeSrc* src, guint32 id); + +G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/gstwpevideosrc.cpp
Added
@@ -0,0 +1,749 @@ +/* Copyright (C) <2018> Philippe Normand <philn@igalia.com> + * Copyright (C) <2018> Žan Doberšek <zdobersek@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-wpevideosrc + * @title: wpevideosrc + * + * The wpevideosrc element is used to produce a video texture representing a web page + * rendered off-screen by WPE. + * + * Starting from WPEBackend-FDO 1.6.x, software rendering support is available. This + * feature allows wpevideosrc to be used on machines without a GPU, and/or for testing + * purposes. To enable it, set the `LIBGL_ALWAYS_SOFTWARE=true` environment + * variable and make sure `video/x-raw, format=BGRA` caps are negotiated by the + * wpevideosrc element. + * + * As the webview loading is usually not instantaneous, the wpevideosrc element emits + * messages indicating the load progress, in percent. The value is an estimate + * based on the total number of bytes expected to be received for a document, + * including all its possible subresources and child documents. The application + * can handle these `element` messages synchronously, for instance, in order to + * display a progress bar or other visual load indicator. 
The load percent value + * is stored in the message structure as a double value named + * `estimated-load-progress` and the structure name is `wpe-stats`. + * + * ## Example launch lines + * + * ```shell + * gst-launch-1.0 -v wpevideosrc location="https://gstreamer.freedesktop.org" ! queue ! glimagesink + * ``` + * Shows the GStreamer website homepage + * + * ```shell + * LIBGL_ALWAYS_SOFTWARE=true gst-launch-1.0 -v wpevideosrc num-buffers=50 location="https://gstreamer.freedesktop.org" \ + * videoconvert ! pngenc ! multifilesink location=/tmp/snapshot-%05d.png + * ``` + * Saves the first 50 video frames generated for the GStreamer website as PNG files in /tmp. + * + * ```shell + * gst-play-1.0 --videosink gtkglsink wpe://https://gstreamer.freedesktop.org + * ``` + * Shows the GStreamer website homepage as played with GstPlayer in a GTK+ window. + * + * ```shell + * gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 ! glimagesink wpevideosrc location="file:///tmp/asset.html" draw-background=0 \ + * ! m. videotestsrc ! queue ! glupload ! glcolorconvert ! m. + * ``` + * Composite WPE with a video stream in a single OpenGL scene. + * + * ```shell + * gst-launch-1.0 glvideomixer name=m sink_1::zorder=0 sink_0::height=818 sink_0::width=1920 ! gtkglsink \ + * wpevideosrc location="file:///tmp/asset.html" draw-background=0 ! m. + * uridecodebin uri="http://example.com/Sintel.2010.1080p.mkv" name=d d. ! queue ! glupload ! glcolorconvert ! m. + * ``` + * Composite WPE with a video stream, sink_0 pad properties have to match the video dimensions. 
+ * + * Since: 1.16 + */ + +/* + * TODO: + * - DMABuf support (requires changes in WPEBackend-fdo to expose DMABuf planes and fds) + * - Custom EGLMemory allocator + * - Better navigation events handling (would require a new GstNavigation API) + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstwpevideosrc.h" +#include <gst/gl/gl.h> +#include <gst/gl/egl/gstglmemoryegl.h> +#include <gst/gl/wayland/gstgldisplay_wayland.h> +#include <gst/video/video.h> +#include <xkbcommon/xkbcommon.h> + +#include "WPEThreadedView.h" + +#define DEFAULT_WIDTH 1920 +#define DEFAULT_HEIGHT 1080 +#define DEFAULT_FPS_N 30 +#define DEFAULT_FPS_D 1 +#define DEFAULT_DRAW_BACKGROUND TRUE + +enum +{ + PROP_0, + PROP_LOCATION, + PROP_DRAW_BACKGROUND +}; + +enum +{ + SIGNAL_CONFIGURE_WEB_VIEW, + SIGNAL_LOAD_BYTES, + LAST_SIGNAL +}; +static guint gst_wpe_video_src_signals[LAST_SIGNAL] = { 0 }; + +struct _GstWpeVideoSrc +{ + GstGLBaseSrc parent; + + /* properties */ + gchar *location; + gboolean draw_background; + + GBytes *bytes; + gboolean gl_enabled; + + gint64 n_frames; /* total frames sent */ + + WPEView *view; + + GMutex lock; +}; + +#define WPE_LOCK(o) g_mutex_lock(&(o)->lock) +#define WPE_UNLOCK(o) g_mutex_unlock(&(o)->lock) + +GST_DEBUG_CATEGORY_EXTERN (wpe_video_src_debug); +#define GST_CAT_DEFAULT wpe_video_src_debug + +#define gst_wpe_video_src_parent_class parent_class +G_DEFINE_TYPE (GstWpeVideoSrc, gst_wpe_video_src, GST_TYPE_GL_BASE_SRC); + +#define WPE_RAW_CAPS "video/x-raw, " \ + "format = (string) BGRA, " \ + "width = " GST_VIDEO_SIZE_RANGE ", " \ + "height = " GST_VIDEO_SIZE_RANGE ", " \ + "framerate = " GST_VIDEO_FPS_RANGE ", " \ + "pixel-aspect-ratio = (fraction)1/1" + +#define WPE_GL_CAPS "video/x-raw(memory:GLMemory), " \ + "format = (string) RGBA, " \ + "width = " GST_VIDEO_SIZE_RANGE ", " \ + "height = " GST_VIDEO_SIZE_RANGE ", " \ + "framerate = " GST_VIDEO_FPS_RANGE ", " \ + "pixel-aspect-ratio = (fraction)1/1, texture-target = (string)2D" + 
+#define WPE_VIDEO_SRC_CAPS WPE_GL_CAPS "; " WPE_RAW_CAPS +#define WPE_VIDEO_SRC_DOC_CAPS WPE_GL_CAPS "; video/x-raw, format = (string) BGRA" + +static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (WPE_VIDEO_SRC_CAPS)); + +static GstFlowReturn +gst_wpe_video_src_create (GstBaseSrc * bsrc, guint64 offset, guint length, + GstBuffer ** buf) +{ + GstGLBaseSrc *gl_src = GST_GL_BASE_SRC (bsrc); + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (bsrc); + GstFlowReturn ret = GST_FLOW_ERROR; + GstBuffer *locked_buffer; + GstClockTime next_time; + gint64 ts_offset = 0; + + WPE_LOCK (src); + if (src->gl_enabled) { + WPE_UNLOCK (src); + return GST_CALL_PARENT_WITH_DEFAULT (GST_BASE_SRC_CLASS, create, (bsrc, + offset, length, buf), ret); + } + + locked_buffer = src->view->buffer (); + if (locked_buffer == NULL) { + WPE_UNLOCK (src); + GST_ELEMENT_ERROR (src, RESOURCE, FAILED, + ("WPE View did not render a buffer"), (NULL)); + return ret; + } + *buf = gst_buffer_copy_deep (locked_buffer); + + g_object_get (gl_src, "timestamp-offset", &ts_offset, NULL); + + /* The following code mimics the behaviour of GLBaseSrc::fill */ + GST_BUFFER_TIMESTAMP (*buf) = ts_offset + gl_src->running_time; + GST_BUFFER_OFFSET (*buf) = src->n_frames; + src->n_frames++; + GST_BUFFER_OFFSET_END (*buf) = src->n_frames; + if (gl_src->out_info.fps_n) { + next_time = gst_util_uint64_scale_int (src->n_frames * GST_SECOND, + gl_src->out_info.fps_d, gl_src->out_info.fps_n); + GST_BUFFER_DURATION (*buf) = next_time - gl_src->running_time; + } else { + next_time = ts_offset; + GST_BUFFER_DURATION (*buf) = GST_CLOCK_TIME_NONE; + } + + GST_LOG_OBJECT (src, "Created buffer from SHM %" GST_PTR_FORMAT, *buf); + + gl_src->running_time = next_time; + + ret = GST_FLOW_OK; + WPE_UNLOCK (src); + return ret; +} + +static GQuark +_egl_image_quark (void) +{ + static GQuark quark = 0; + + if (!quark) + quark = g_quark_from_static_string ("GstWPEEGLImage"); + 
return quark; +} + +static gboolean +gst_wpe_video_src_fill_memory (GstGLBaseSrc * bsrc, GstGLMemory * memory) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (bsrc); + const GstGLFuncs *gl; + guint tex_id; + GstEGLImage *locked_image; + + if (!gst_gl_context_check_feature (GST_GL_CONTEXT (bsrc->context), + "EGL_KHR_image_base")) { + GST_ERROR_OBJECT (src, "EGL_KHR_image_base is not supported"); + return FALSE; + } + + WPE_LOCK (src); + + gl = bsrc->context->gl_vtable; + tex_id = gst_gl_memory_get_texture_id (memory); + locked_image = src->view->image (); + + if (!locked_image) { + WPE_UNLOCK (src); + return TRUE; + } + + // The EGLImage is implicitly associated with the memory we're filling, so we + // need to ensure their life cycles are tied. + gst_mini_object_set_qdata (GST_MINI_OBJECT_CAST (memory), _egl_image_quark (), + gst_egl_image_ref (locked_image), (GDestroyNotify) gst_egl_image_unref); + + gl->ActiveTexture (GL_TEXTURE0 + memory->plane); + gl->BindTexture (GL_TEXTURE_2D, tex_id); + gl->EGLImageTargetTexture2D (GL_TEXTURE_2D, + gst_egl_image_get_image (locked_image)); + gl->Flush (); + WPE_UNLOCK (src); + return TRUE; +} + +static gboolean +gst_wpe_video_src_start (GstWpeVideoSrc * src) +{ + GstGLContext *context = NULL; + GstGLDisplay *display = NULL; + GstGLBaseSrc *base_src = GST_GL_BASE_SRC (src); + gboolean created_view = FALSE; + GBytes *bytes; + + GST_INFO_OBJECT (src, "Starting up"); + WPE_LOCK (src); + + if (src->gl_enabled) { + context = base_src->context; + display = base_src->display; + } + + GST_DEBUG_OBJECT (src, "Will %sfill GLMemories", + src->gl_enabled ? 
"" : "NOT "); + + auto & thread = WPEContextThread::singleton (); + + if (!src->view) { + src->view = thread.createWPEView (src, context, display, + GST_VIDEO_INFO_WIDTH (&base_src->out_info), + GST_VIDEO_INFO_HEIGHT (&base_src->out_info)); + created_view = TRUE; + GST_DEBUG_OBJECT (src, "created view %p", src->view); + } + + if (!src->view) { + WPE_UNLOCK (src); + GST_ELEMENT_ERROR (src, RESOURCE, FAILED, + ("WPEBackend-FDO EGL display initialisation failed"), (NULL)); + return FALSE; + } + + GST_OBJECT_LOCK (src); + bytes = src->bytes; + src->bytes = NULL; + GST_OBJECT_UNLOCK (src); + + if (bytes != NULL) { + src->view->loadData (bytes); + g_bytes_unref (bytes); + } + + if (created_view) { + src->n_frames = 0; + } + WPE_UNLOCK (src); + return TRUE; +} + +static gboolean +gst_wpe_video_src_decide_allocation (GstBaseSrc * base_src, GstQuery * query) +{ + GstGLBaseSrc *gl_src = GST_GL_BASE_SRC (base_src); + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (base_src); + GstCapsFeatures *caps_features; + + WPE_LOCK (src); + caps_features = gst_caps_get_features (gl_src->out_caps, 0); + if (caps_features != NULL + && gst_caps_features_contains (caps_features, + GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { + src->gl_enabled = TRUE; + } else { + src->gl_enabled = FALSE; + } + + if (src->gl_enabled) { + WPE_UNLOCK (src); + return GST_CALL_PARENT_WITH_DEFAULT (GST_BASE_SRC_CLASS, decide_allocation, + (base_src, query), FALSE); + } + WPE_UNLOCK (src); + return gst_wpe_video_src_start (src); +} + +static gboolean +gst_wpe_video_src_gl_start (GstGLBaseSrc * base_src) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (base_src); + return gst_wpe_video_src_start (src); +} + +static void +gst_wpe_video_src_stop_unlocked (GstWpeVideoSrc * src) +{ + if (src->view) { + GST_DEBUG_OBJECT (src, "deleting view %p", src->view); + delete src->view; + src->view = NULL; + } +} + +static void +gst_wpe_video_src_gl_stop (GstGLBaseSrc * base_src) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (base_src); + + 
WPE_LOCK (src); + gst_wpe_video_src_stop_unlocked (src); + WPE_UNLOCK (src); +} + +static gboolean +gst_wpe_video_src_stop (GstBaseSrc * base_src) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (base_src); + + /* we can always call this, GstGLBaseSrc is smart enough not to crash if + * gst_gl_base_src_gl_start() has not been called from chaining up + * gst_wpe_video_src_decide_allocation() */ + if (!GST_CALL_PARENT_WITH_DEFAULT (GST_BASE_SRC_CLASS, stop, (base_src), + FALSE)) + return FALSE; + + WPE_LOCK (src); + + /* if gl-enabled, gst_wpe_video_src_stop_unlocked() would have already been called + * inside gst_wpe_video_src_gl_stop() from the base class stopping the OpenGL + * context */ + if (!src->gl_enabled) + gst_wpe_video_src_stop_unlocked (src); + + WPE_UNLOCK (src); + return TRUE; +} + +static GstCaps * +gst_wpe_video_src_fixate (GstBaseSrc * base_src, GstCaps * combined_caps) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (base_src); + GstStructure *structure; + gint width, height; + GstCaps *caps; + + /* In situations where software GL support is explicitly requested, select raw + * caps, otherwise perform default caps negotiation. Unfortunately at this + * point we don't know yet if a GL context will be usable or not, so we can't + * check the element GstContext. 
+ */ + if (!g_strcmp0 (g_getenv ("LIBGL_ALWAYS_SOFTWARE"), "true")) { + caps = gst_caps_from_string (WPE_RAW_CAPS); + } else { + caps = gst_caps_make_writable (combined_caps); + } + + structure = gst_caps_get_structure (caps, 0); + + gst_structure_fixate_field_nearest_int (structure, "width", DEFAULT_WIDTH); + gst_structure_fixate_field_nearest_int (structure, "height", DEFAULT_HEIGHT); + + if (gst_structure_has_field (structure, "framerate")) + gst_structure_fixate_field_nearest_fraction (structure, "framerate", + DEFAULT_FPS_N, DEFAULT_FPS_D); + else + gst_structure_set (structure, "framerate", GST_TYPE_FRACTION, DEFAULT_FPS_N, + DEFAULT_FPS_D, NULL); + + caps = GST_BASE_SRC_CLASS (parent_class)->fixate (base_src, caps); + GST_INFO_OBJECT (base_src, "Fixated caps to %" GST_PTR_FORMAT, caps); + + if (src->view) { + gst_structure_get (structure, "width", G_TYPE_INT, &width, "height", + G_TYPE_INT, &height, NULL); + src->view->resize (width, height); + } + return caps; +} + +void +gst_wpe_video_src_configure_web_view (GstWpeVideoSrc * src, + WebKitWebView * webview) +{ + GValue args[2] = { {0}, {0} }; + + g_value_init (&args[0], GST_TYPE_ELEMENT); + g_value_set_object (&args[0], src); + g_value_init (&args[1], G_TYPE_OBJECT); + g_value_set_object (&args[1], webview); + + g_signal_emitv (args, gst_wpe_video_src_signals[SIGNAL_CONFIGURE_WEB_VIEW], 0, + NULL); + + g_value_unset (&args[0]); + g_value_unset (&args[1]); +} + +static void +gst_wpe_video_src_load_bytes (GstWpeVideoSrc * src, GBytes * bytes) +{ + if (src->view && GST_STATE (GST_ELEMENT_CAST (src)) > GST_STATE_NULL) { + src->view->loadData (bytes); + } else { + GST_OBJECT_LOCK (src); + if (src->bytes) + g_bytes_unref (src->bytes); + src->bytes = g_bytes_ref (bytes); + GST_OBJECT_UNLOCK (src); + } +} + +static gboolean +gst_wpe_video_src_set_location (GstWpeVideoSrc * src, const gchar * location, + GError ** error) +{ + GST_OBJECT_LOCK (src); + g_free (src->location); + src->location = g_strdup (location); + 
GST_OBJECT_UNLOCK (src); + + if (src->view) + src->view->loadUri (location); + + return TRUE; +} + +static void +gst_wpe_video_src_set_draw_background (GstWpeVideoSrc * src, + gboolean draw_background) +{ + GST_OBJECT_LOCK (src); + src->draw_background = draw_background; + GST_OBJECT_UNLOCK (src); + + if (src->view) + src->view->setDrawBackground (draw_background); +} + +static void +gst_wpe_video_src_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (object); + + switch (prop_id) { + case PROP_LOCATION: + { + const gchar *location; + + location = g_value_get_string (value); + if (location == NULL) { + GST_WARNING_OBJECT (src, "location property cannot be NULL"); + return; + } + + if (!gst_wpe_video_src_set_location (src, location, NULL)) { + GST_WARNING_OBJECT (src, "badly formatted location"); + return; + } + break; + } + case PROP_DRAW_BACKGROUND: + gst_wpe_video_src_set_draw_background (src, g_value_get_boolean (value)); + break; + default: + break; + } +} + +static void +gst_wpe_video_src_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (object); + + switch (prop_id) { + case PROP_LOCATION: + GST_OBJECT_LOCK (src); + g_value_set_string (value, src->location); + GST_OBJECT_UNLOCK (src); + break; + case PROP_DRAW_BACKGROUND: + GST_OBJECT_LOCK (src); + g_value_set_boolean (value, src->draw_background); + GST_OBJECT_UNLOCK (src); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean +gst_wpe_video_src_event (GstBaseSrc * base_src, GstEvent * event) +{ + gboolean ret = FALSE; + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (base_src); + + if (src->view && GST_EVENT_TYPE (event) == GST_EVENT_NAVIGATION) { + const gchar *key; + gint button; + gdouble x, y, delta_x, delta_y; + + GST_DEBUG_OBJECT (src, "Processing event %" GST_PTR_FORMAT, event); + 
switch (gst_navigation_event_get_type (event)) { + case GST_NAVIGATION_EVENT_KEY_PRESS: + case GST_NAVIGATION_EVENT_KEY_RELEASE: + if (gst_navigation_event_parse_key_event (event, &key)) { + /* FIXME: This is wrong... The GstNavigation API should pass + hardware-level information, not high-level keysym strings */ + uint32_t keysym = + (uint32_t) xkb_keysym_from_name (key, XKB_KEYSYM_NO_FLAGS); + struct wpe_input_keyboard_event wpe_event; + wpe_event.key_code = keysym; + wpe_event.pressed = + gst_navigation_event_get_type (event) == + GST_NAVIGATION_EVENT_KEY_PRESS; + src->view->dispatchKeyboardEvent (wpe_event); + ret = TRUE; + } + break; + case GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS: + case GST_NAVIGATION_EVENT_MOUSE_BUTTON_RELEASE: + if (gst_navigation_event_parse_mouse_button_event (event, &button, &x, + &y)) { + struct wpe_input_pointer_event wpe_event; + wpe_event.time = GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); + wpe_event.type = wpe_input_pointer_event_type_button; + wpe_event.x = (int) x; + wpe_event.y = (int) y; + if (button == 1) { + wpe_event.modifiers = wpe_input_pointer_modifier_button1; + } else if (button == 2) { + wpe_event.modifiers = wpe_input_pointer_modifier_button2; + } else if (button == 3) { + wpe_event.modifiers = wpe_input_pointer_modifier_button3; + } else if (button == 4) { + wpe_event.modifiers = wpe_input_pointer_modifier_button4; + } else if (button == 5) { + wpe_event.modifiers = wpe_input_pointer_modifier_button5; + } + wpe_event.button = button; + wpe_event.state = + gst_navigation_event_get_type (event) == + GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS; + src->view->dispatchPointerEvent (wpe_event); + ret = TRUE; + } + break; + case GST_NAVIGATION_EVENT_MOUSE_MOVE: + if (gst_navigation_event_parse_mouse_move_event (event, &x, &y)) { + struct wpe_input_pointer_event wpe_event; + wpe_event.time = GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); + wpe_event.type = wpe_input_pointer_event_type_motion; + wpe_event.x = (int) x; 
+ wpe_event.y = (int) y; + src->view->dispatchPointerEvent (wpe_event); + ret = TRUE; + } + break; + case GST_NAVIGATION_EVENT_MOUSE_SCROLL: + if (gst_navigation_event_parse_mouse_scroll_event (event, &x, &y, + &delta_x, &delta_y)) { + struct wpe_input_axis_event wpe_event; + if (delta_x) { + wpe_event.axis = 1; + wpe_event.value = delta_x; + } else { + wpe_event.axis = 0; + wpe_event.value = delta_y; + } + wpe_event.time = GST_TIME_AS_MSECONDS (GST_EVENT_TIMESTAMP (event)); + wpe_event.type = wpe_input_axis_event_type_motion; + wpe_event.x = (int) x; + wpe_event.y = (int) y; + src->view->dispatchAxisEvent (wpe_event); + ret = TRUE; + } + break; + default: + break; + } + /* FIXME: No touch events handling support in GstNavigation */ + } + + if (!ret) { + ret = + GST_CALL_PARENT_WITH_DEFAULT (GST_BASE_SRC_CLASS, event, (base_src, + event), FALSE); + } + return ret; +} + +static void +gst_wpe_video_src_init (GstWpeVideoSrc * src) +{ + src->draw_background = DEFAULT_DRAW_BACKGROUND; + + gst_base_src_set_live (GST_BASE_SRC_CAST (src), TRUE); + + g_mutex_init (&src->lock); +} + +static void +gst_wpe_video_src_finalize (GObject * object) +{ + GstWpeVideoSrc *src = GST_WPE_VIDEO_SRC (object); + + g_free (src->location); + g_clear_pointer (&src->bytes, g_bytes_unref); + g_mutex_clear (&src->lock); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_wpe_video_src_class_init (GstWpeVideoSrcClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); + GstGLBaseSrcClass *gl_base_src_class = GST_GL_BASE_SRC_CLASS (klass); + GstBaseSrcClass *base_src_class = GST_BASE_SRC_CLASS (klass); + GstPadTemplate *tmpl; + GstCaps *doc_caps; + + gobject_class->set_property = gst_wpe_video_src_set_property; + gobject_class->get_property = gst_wpe_video_src_get_property; + gobject_class->finalize = gst_wpe_video_src_finalize; + + g_object_class_install_property (gobject_class, 
PROP_LOCATION, + g_param_spec_string ("location", "location", + "The URL to display", + "", (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_DRAW_BACKGROUND, + g_param_spec_boolean ("draw-background", "Draws the background", + "Whether to draw the WebView background", DEFAULT_DRAW_BACKGROUND, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_element_class_set_static_metadata (gstelement_class, + "WPE source", "Source/Video", + "Creates a video stream from a WPE browser", + "Philippe Normand <philn@igalia.com>, Žan Doberšek <zdobersek@igalia.com>"); + + tmpl = gst_static_pad_template_get (&src_factory); + gst_element_class_add_pad_template (gstelement_class, tmpl); + + base_src_class->fixate = GST_DEBUG_FUNCPTR (gst_wpe_video_src_fixate); + base_src_class->create = GST_DEBUG_FUNCPTR (gst_wpe_video_src_create); + base_src_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_wpe_video_src_decide_allocation); + base_src_class->stop = GST_DEBUG_FUNCPTR (gst_wpe_video_src_stop); + base_src_class->event = GST_DEBUG_FUNCPTR (gst_wpe_video_src_event); + + gl_base_src_class->supported_gl_api = + static_cast < GstGLAPI > + (GST_GL_API_OPENGL | GST_GL_API_OPENGL3 | GST_GL_API_GLES2); + gl_base_src_class->gl_start = GST_DEBUG_FUNCPTR (gst_wpe_video_src_gl_start); + gl_base_src_class->gl_stop = GST_DEBUG_FUNCPTR (gst_wpe_video_src_gl_stop); + gl_base_src_class->fill_gl_memory = + GST_DEBUG_FUNCPTR (gst_wpe_video_src_fill_memory); + + doc_caps = gst_caps_from_string (WPE_VIDEO_SRC_DOC_CAPS); + gst_pad_template_set_documentation_caps (tmpl, doc_caps); + gst_clear_caps (&doc_caps); + + /** + * GstWpeVideoSrc::configure-web-view: + * @src: the object which received the signal + * @webview: the webView + * + * Allow application to configure the webView settings. 
+ */ + gst_wpe_video_src_signals[SIGNAL_CONFIGURE_WEB_VIEW] = + g_signal_new ("configure-web-view", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, G_TYPE_OBJECT); + + /** + * GstWpeVideoSrc::load-bytes: + * @src: the object which received the signal + * @bytes: the GBytes data to load + * + * Load the specified bytes into the internal webView. + */ + gst_wpe_video_src_signals[SIGNAL_LOAD_BYTES] = + g_signal_new_class_handler ("load-bytes", G_TYPE_FROM_CLASS (klass), + static_cast < GSignalFlags > (G_SIGNAL_RUN_LAST | G_SIGNAL_ACTION), + G_CALLBACK (gst_wpe_video_src_load_bytes), NULL, NULL, NULL, + G_TYPE_NONE, 1, G_TYPE_BYTES); +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/gstwpevideosrc.h
Added
@@ -0,0 +1,12 @@ +#pragma once + +#include <wpe/webkit.h> +#include <gst/gl/gl.h> + +G_BEGIN_DECLS + +#define GST_TYPE_WPE_VIDEO_SRC (gst_wpe_video_src_get_type ()) +G_DECLARE_FINAL_TYPE (GstWpeVideoSrc, gst_wpe_video_src, GST, WPE_VIDEO_SRC, GstGLBaseSrc); + +void gst_wpe_video_src_configure_web_view (GstWpeVideoSrc * src, WebKitWebView * webview); +G_END_DECLS \ No newline at end of file
View file
gst-plugins-bad-1.18.6.tar.xz/ext/wpe/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/wpe/meson.build
Changed
@@ -6,8 +6,11 @@ subdir_done() endif -wpe_dep = dependency('wpe-webkit-1.0', version : '>= 2.24', required : get_option('wpe')) -wpe_fdo_dep = dependency('wpebackend-fdo-1.0', required : get_option('wpe')) +wpe_dep = dependency('wpe-webkit-1.1', version : '>= 2.28', required : false) +if not wpe_dep.found() + wpe_dep = dependency('wpe-webkit-1.0', version : '>= 2.28', required : get_option('wpe')) +endif +wpe_fdo_dep = dependency('wpebackend-fdo-1.0', version : '>= 1.8', required : get_option('wpe')) egl_dep = dependency('egl', required : get_option('wpe')) xkbcommon_dep = dependency('xkbcommon', version : '>= 0.8', required : get_option('wpe')) wl_server_dep = dependency('wayland-server', required : get_option('wpe')) @@ -16,12 +19,19 @@ subdir_done() endif +wpe_extension_install_dir = get_option('prefix') / get_option('libdir') / meson.project_name() / 'wpe-extension' + +giounix_dep = dependency('gio-unix-2.0', required: false) gstwpe = library('gstwpe', - ['WPEThreadedView.cpp', 'gstwpesrc.cpp'], - dependencies : [egl_dep, wpe_dep, wpe_fdo_dep, gstvideo_dep, gstbase_dep, gstgl_dep, xkbcommon_dep, wl_server_dep], - cpp_args : gst_plugins_bad_args + ['-DHAVE_CONFIG_H=1'], + ['WPEThreadedView.cpp', 'gstwpe.cpp', 'gstwpevideosrc.cpp', 'gstwpesrcbin.cpp'], + dependencies : [egl_dep, wpe_dep, wpe_fdo_dep, gstallocators_dep, gstaudio_dep, gstvideo_dep, gstbase_dep, gstgl_dep, xkbcommon_dep, wl_server_dep, giounix_dep], + cpp_args : gst_plugins_bad_args + ['-DHAVE_CONFIG_H=1', '-DWPE_EXTENSION_INSTALL_DIR=' + wpe_extension_install_dir], include_directories : [configinc], install : true, install_dir : plugins_install_dir) + +if giounix_dep.found() + subdir('wpe-extension') +endif pkgconfig.generate(gstwpe, install_dir : plugins_pkgconfig_install_dir) plugins += [gstwpe]
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/wpe-extension
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/wpe-extension/gstwpeaudiosink.c
Added
@@ -0,0 +1,311 @@ +/* Copyright (C) <2020> Philippe Normand <philn@igalia.com> + * Copyright (C) <2021> Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or modify it under the terms of the + * GNU Library General Public License as published by the Free Software Foundation; either version 2 + * of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without + * even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public License along with this + * library; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#define _GNU_SOURCE +#include <stdio.h> +#include <unistd.h> +#include <sys/mman.h> +#include <sys/types.h> +#include "gstwpeextension.h" + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#define gst_wpe_audio_sink_parent_class parent_class +GST_DEBUG_CATEGORY (wpe_audio_sink_debug); +#define GST_CAT_DEFAULT wpe_audio_sink_debug + +struct _GstWpeAudioSink +{ + GstBaseSink parent; + + guint32 id; + GCancellable *cancellable; + + gchar *caps; + + GMutex buf_lock; + GCond buf_cond; + GUnixFDList *fdlist; +}; + +static guint id = -1; /* atomic */ + +G_DEFINE_TYPE_WITH_CODE (GstWpeAudioSink, gst_wpe_audio_sink, + GST_TYPE_BASE_SINK, GST_DEBUG_CATEGORY_INIT (wpe_audio_sink_debug, + "wpeaudio_sink", 0, "WPE Sink");); + +static GstStaticPadTemplate audio_sink_factory = +GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("audio/x-raw")); + +static void +message_consumed_cb (GObject * source_object, GAsyncResult * res, + GstWpeAudioSink * self) +{ + g_mutex_lock (&self->buf_lock); + g_cond_broadcast (&self->buf_cond); + g_mutex_unlock (&self->buf_lock); 
+} + +static GstFlowReturn +render (GstBaseSink * sink, GstBuffer * buf) +{ + gssize written_bytes; + static int init = 0; + char filename[1024]; + const gint *fds; + WebKitUserMessage *msg; + GstMapInfo info; + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (sink); + + if (!self->caps) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("Trying to render buffer before caps were set"), (NULL)); + + return GST_FLOW_ERROR; + } + + if (!gst_buffer_map (buf, &info, GST_MAP_READ)) { + GST_ELEMENT_ERROR (self, RESOURCE, READ, ("Failed to map input buffer"), + (NULL)); + + return GST_FLOW_ERROR; + } + + if (!self->fdlist) { + gint fds[1] = { -1 }; + +#ifdef HAVE_MEMFD_CREATE + fds[0] = memfd_create ("gstwpe-shm", MFD_CLOEXEC); +#endif + + if (fds[0] < 0) { + /* allocate shm pool */ + snprintf (filename, 1024, "%s/%s-%d-%s", g_get_user_runtime_dir (), + "gstwpe-shm", init++, "XXXXXX"); + + fds[0] = g_mkstemp (filename); + if (fds[0] < 0) { + gst_buffer_unmap (buf, &info); + GST_ELEMENT_ERROR (self, RESOURCE, READ, + ("opening temp file %s failed: %s", filename, strerror (errno)), + (NULL)); + return GST_FLOW_ERROR; + } + + unlink (filename); + } + + if (fds[0] <= 0) + goto write_error; + + self->fdlist = g_unix_fd_list_new_from_array (fds, 1); + msg = webkit_user_message_new_with_fd_list ("gstwpe.set_shm", + g_variant_new ("(u)", self->id), self->fdlist); + gst_wpe_extension_send_message (msg, self->cancellable, NULL, NULL); + } + + fds = g_unix_fd_list_peek_fds (self->fdlist, NULL); + if (ftruncate (fds[0], info.size) == -1) + goto write_error; + + written_bytes = write (fds[0], info.data, info.size); + if (written_bytes < 0) + goto write_error; + + if (written_bytes != (gssize) info.size) + goto write_error; + + if (lseek (fds[0], 0, SEEK_SET) == -1) + goto write_error; + + msg = webkit_user_message_new ("gstwpe.new_buffer", + g_variant_new ("(ut)", self->id, info.size)); + + g_mutex_lock (&self->buf_lock); + gst_wpe_extension_send_message (msg, self->cancellable, 
(GAsyncReadyCallback) message_consumed_cb, self); + g_cond_wait (&self->buf_cond, &self->buf_lock); + g_mutex_unlock (&self->buf_lock); + + gst_buffer_unmap (buf, &info); + + return GST_FLOW_OK; + +write_error: + gst_buffer_unmap (buf, &info); + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, ("Couldn't write memfd: %s", + strerror (errno)), (NULL)); + + return GST_FLOW_ERROR; +} + +static gboolean +set_caps (GstBaseSink * sink, GstCaps * caps) +{ + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (sink); + gchar *stream_id; + + if (self->caps) { + GST_ERROR_OBJECT (sink, "Renegotiation is not supported yet"); + + return FALSE; + } + + self->caps = gst_caps_to_string (caps); + self->id = g_atomic_int_add (&id, 1); + stream_id = gst_pad_get_stream_id (GST_BASE_SINK_PAD (sink)); + gst_wpe_extension_send_message (webkit_user_message_new ("gstwpe.new_stream", + g_variant_new ("(uss)", self->id, self->caps, stream_id)), + self->cancellable, NULL, NULL); + g_free (stream_id); + + return TRUE; +} + +static gboolean +unlock (GstBaseSink * sink) +{ + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (sink); + + g_cancellable_cancel (self->cancellable); + + return TRUE; +} + +static gboolean +unlock_stop (GstBaseSink * sink) +{ + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (sink); + GCancellable *cancellable = self->cancellable; + + self->cancellable = g_cancellable_new (); + g_object_unref (cancellable); + + return TRUE; +} + +static void +_cancelled_cb (GCancellable * _cancellable, GstWpeAudioSink * self) +{ + g_mutex_lock (&self->buf_lock); + g_cond_broadcast (&self->buf_cond); + g_mutex_unlock (&self->buf_lock); +} + +static gboolean +stop (GstBaseSink * sink) +{ + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (sink); + + if (!self->caps) { + GST_DEBUG_OBJECT (sink, "Stopped before started"); + + return TRUE; + } + + /* Stop processing and claim buffers back */ + g_cancellable_cancel (self->cancellable); + + GST_DEBUG_OBJECT (sink, "Stopping %d", self->id); + gst_wpe_extension_send_message 
(webkit_user_message_new ("gstwpe.stop", + g_variant_new_uint32 (self->id)), self->cancellable, NULL, NULL); + + return TRUE; +} + +static GstStateChangeReturn +change_state (GstElement * element, GstStateChange transition) +{ + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (element); + + switch (transition) { + case GST_STATE_CHANGE_PAUSED_TO_PLAYING: + { + if (g_cancellable_is_cancelled (self->cancellable)) { + GCancellable *cancellable = self->cancellable; + self->cancellable = g_cancellable_new (); + + g_object_unref (cancellable); + } + break; + } + case GST_STATE_CHANGE_PLAYING_TO_PAUSED: + g_cancellable_cancel (self->cancellable); + + gst_wpe_extension_send_message (webkit_user_message_new ("gstwpe.pause", + g_variant_new_uint32 (self->id)), NULL, NULL, NULL); + + break; + default: + break; + } + + return GST_CALL_PARENT_WITH_DEFAULT (GST_ELEMENT_CLASS, + change_state, (element, transition), GST_STATE_CHANGE_SUCCESS); +} + +static void +dispose (GObject * object) +{ + GstWpeAudioSink *self = GST_WPE_AUDIO_SINK (object); + + g_clear_object (&self->cancellable); + g_clear_pointer (&self->caps, g_free); +} + +static void +gst_wpe_audio_sink_init (GstWpeAudioSink * self) +{ + GstElementClass *klass = GST_ELEMENT_GET_CLASS (self); + GstPadTemplate *pad_template = + gst_element_class_get_pad_template (GST_ELEMENT_CLASS (klass), "sink"); + g_return_if_fail (pad_template != NULL); + + self->cancellable = g_cancellable_new (); + g_cancellable_connect (self->cancellable, + G_CALLBACK (_cancelled_cb), self, NULL); +} + +static void +gst_wpe_audio_sink_class_init (GstWpeAudioSinkClass * klass) +{ + GstPadTemplate *tmpl; + GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); + GObjectClass *object_class = G_OBJECT_CLASS (klass); + GstBaseSinkClass *gstbasesink_class = (GstBaseSinkClass *) klass; + + object_class->dispose = dispose; + + gst_element_class_set_static_metadata (gstelement_class, + "WPE internal audio sink", "Sink/Audio", + "Internal sink to be used in 
wpe when running inside gstwpe", + "Thibault Saunier <tsaunier@igalia.com>"); + + tmpl = gst_static_pad_template_get (&audio_sink_factory); + gst_element_class_add_pad_template (gstelement_class, tmpl); + + gstelement_class->change_state = GST_DEBUG_FUNCPTR (change_state); + gstbasesink_class->stop = GST_DEBUG_FUNCPTR (stop); + gstbasesink_class->unlock = GST_DEBUG_FUNCPTR (unlock); + gstbasesink_class->unlock_stop = GST_DEBUG_FUNCPTR (unlock_stop); + gstbasesink_class->render = GST_DEBUG_FUNCPTR (render); + gstbasesink_class->set_caps = GST_DEBUG_FUNCPTR (set_caps); +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/wpe-extension/gstwpebusmsgforwarder.c
Added
@@ -0,0 +1,181 @@ +/* Copyright (C) <2021> Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or modify it under the terms of the + * GNU Library General Public License as published by the Free Software Foundation; either version 2 + * of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without + * even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public License along with this + * library; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#include "gstwpeextension.h" + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +GST_DEBUG_CATEGORY (wpe_bus_msg_forwarder_debug); +#define GST_CAT_DEFAULT wpe_bus_msg_forwarder_debug + +struct _GstWpeBusMsgForwarder +{ + GstTracer parent; + GCancellable *cancellable; +}; + +G_DEFINE_TYPE_WITH_CODE (GstWpeBusMsgForwarder, gst_wpe_bus_msg_forwarder, + GST_TYPE_TRACER, GST_DEBUG_CATEGORY_INIT (wpe_bus_msg_forwarder_debug, + "wpebusmsgforwarder", 0, "WPE message forwarder");); + +static void +dispose (GObject * object) +{ + GstWpeBusMsgForwarder *self = GST_WPE_BUS_MSG_FORWARDER (object); + + g_clear_object (&self->cancellable); +} + +static WebKitUserMessage * +create_gerror_bus_msg (GstElement * element, GstMessage * message) +{ + GError *error; + gchar *debug_str, *details_structure, *src_path; + WebKitUserMessage *msg; + const GstStructure *details = NULL; + + if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_ERROR) { + gst_message_parse_error (message, &error, &debug_str); + gst_message_parse_error_details (message, &details); + } else if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_WARNING) { + gst_message_parse_warning (message, &error, 
&debug_str); + gst_message_parse_warning_details (message, &details); + } else { + gst_message_parse_info (message, &error, &debug_str); + gst_message_parse_info_details (message, &details); + } + + details_structure = + details ? gst_structure_to_string (details) : g_strdup (""); + src_path = gst_object_get_path_string (GST_MESSAGE_SRC (message)); + + msg = webkit_user_message_new ("gstwpe.bus_gerror_message", + /* (message_type, src_path, error_domain, error_code, msg, debug_str, details_structure) */ + g_variant_new ("(issssusss)", + GST_MESSAGE_TYPE (message), + GST_MESSAGE_SRC_NAME (message), + G_OBJECT_TYPE_NAME (GST_MESSAGE_SRC (message)), + src_path, + g_quark_to_string (error->domain), + error->code, error->message, debug_str, details_structure) + ); + g_free (src_path); + + return msg; +} + +/* Those types can't be deserialized on the receiver + * side, so we just ignore them for now */ +#define IS_NOT_DESERIALIZABLE_TYPE(value) \ + (g_type_is_a ((G_VALUE_TYPE (value)), G_TYPE_OBJECT) || \ + g_type_is_a ((G_VALUE_TYPE (value)), G_TYPE_ERROR) || \ + g_type_is_a ((G_VALUE_TYPE (value)), GST_TYPE_CONTEXT) || \ + g_type_is_a ((G_VALUE_TYPE (value)), G_TYPE_POINTER)) + +static gboolean +cleanup_structure (GQuark field_id, GValue * value, gpointer self) +{ + /* We need soome API in core to make that happen cleanly */ + if (IS_NOT_DESERIALIZABLE_TYPE (value)) { + return FALSE; + } + + if (GST_VALUE_HOLDS_LIST (value)) { + gint i; + + for (i = 0; i < gst_value_list_get_size (value); i++) { + if (IS_NOT_DESERIALIZABLE_TYPE (gst_value_list_get_value (value, i))) + return FALSE; + } + } + + if (GST_VALUE_HOLDS_ARRAY (value)) { + gint i; + + for (i = 0; i < gst_value_array_get_size (value); i++) { + if (IS_NOT_DESERIALIZABLE_TYPE (gst_value_array_get_value (value, i))) + return FALSE; + } + } + + return TRUE; +} + +static void +gst_message_post_cb (GObject * object, GstClockTime ts, GstElement * element, + GstMessage * message) +{ + gchar *str; + WebKitUserMessage 
*msg = NULL; + GstStructure *structure; + GstWpeBusMsgForwarder *self; + const GstStructure *message_struct; + + if (!GST_IS_PIPELINE (element)) + return; + + self = GST_WPE_BUS_MSG_FORWARDER (object); + message_struct = gst_message_get_structure (message); + structure = message_struct ? gst_structure_copy (message_struct) : NULL; + + if (structure) { + gst_structure_filter_and_map_in_place (structure, cleanup_structure, self); + str = gst_structure_to_string (structure); + } else { + str = g_strdup (""); + } + + /* we special case error as gst can't serialize/de-serialize it */ + if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_ERROR + || GST_MESSAGE_TYPE (message) == GST_MESSAGE_WARNING + || GST_MESSAGE_TYPE (message) == GST_MESSAGE_INFO) { + msg = create_gerror_bus_msg (element, message); + } else { + gchar *src_path = gst_object_get_path_string (GST_MESSAGE_SRC (message)); + msg = webkit_user_message_new ("gstwpe.bus_message", + g_variant_new ("(issss)", + GST_MESSAGE_TYPE (message), + GST_MESSAGE_SRC_NAME (message), + G_OBJECT_TYPE_NAME (GST_MESSAGE_SRC (message)), src_path, str)); + g_free (src_path); + } + if (msg) + gst_wpe_extension_send_message (msg, self->cancellable, NULL, NULL); + g_free (str); +} + +static void +constructed (GObject * object) +{ + GstTracer *tracer = GST_TRACER (object); + gst_tracing_register_hook (tracer, "element-post-message-pre", + G_CALLBACK (gst_message_post_cb)); +} + +static void +gst_wpe_bus_msg_forwarder_init (GstWpeBusMsgForwarder * self) +{ + self->cancellable = g_cancellable_new (); +} + +static void +gst_wpe_bus_msg_forwarder_class_init (GstWpeBusMsgForwarderClass * klass) +{ + GObjectClass *object_class = G_OBJECT_CLASS (klass); + + object_class->dispose = dispose; + object_class->constructed = constructed; +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/wpe-extension/gstwpeextension.c
Added
@@ -0,0 +1,60 @@ +/* Copyright (C) <2021> Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstwpeextension.h" + +#include <errno.h> +#include <fcntl.h> +#include <sys/socket.h> + +#include <gst/gst.h> +#include <gmodule.h> +#include <gio/gunixfdlist.h> +#include <wpe/webkit-web-extension.h> + +G_MODULE_EXPORT void webkit_web_extension_initialize (WebKitWebExtension * + extension); + +static WebKitWebExtension *global_extension = NULL; + +void +webkit_web_extension_initialize (WebKitWebExtension * extension) +{ + g_return_if_fail (!global_extension); + + gst_init (NULL, NULL); + + /* Register our own audio sink to */ + gst_element_register (NULL, "gstwpeaudiosink", GST_RANK_PRIMARY + 500, + gst_wpe_audio_sink_get_type ()); + gst_object_unref (g_object_new (gst_wpe_bus_msg_forwarder_get_type (), NULL)); + + global_extension = extension; + GST_INFO_OBJECT (global_extension, "Setting as global extension."); +} + +void +gst_wpe_extension_send_message (WebKitUserMessage * msg, + GCancellable * cancellable, GAsyncReadyCallback cb, gpointer udata) +{ + webkit_web_extension_send_message_to_context (global_extension, msg, + cancellable, cb, udata); +}
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/wpe-extension/gstwpeextension.h
Added
@@ -0,0 +1,26 @@ +/* Copyright (C) <2021> Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or modify it under the terms of the + * GNU Library General Public License as published by the Free Software Foundation; either version 2 + * of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without + * even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public License along with this + * library; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <wpe/webkit-web-extension.h> +#include <gio/gunixfdlist.h> +#include <gst/gst.h> +#include <gst/base/gstbasesink.h> + +void gst_wpe_extension_send_message (WebKitUserMessage *msg, GCancellable *cancellable, GAsyncReadyCallback cb, gpointer udata); + +G_DECLARE_FINAL_TYPE (GstWpeAudioSink, gst_wpe_audio_sink, GST, WPE_AUDIO_SINK, GstBaseSink); +G_DECLARE_FINAL_TYPE (GstWpeBusMsgForwarder, gst_wpe_bus_msg_forwarder, GST, WPE_BUS_MSG_FORWARDER, GstTracer); \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/ext/wpe/wpe-extension/meson.build
Added
@@ -0,0 +1,7 @@ +library('gstwpeextension', + ['gstwpeextension.c', 'gstwpeaudiosink.c', 'gstwpebusmsgforwarder.c'], + dependencies : [wpe_dep, gst_dep, gstbase_dep, giounix_dep], + c_args : ['-DHAVE_CONFIG_H=1'], + include_directories : [configinc], + install : true, + install_dir : wpe_extension_install_dir)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/x265/gstx265enc.c -> gst-plugins-bad-1.20.1.tar.xz/ext/x265/gstx265enc.c
Changed
@@ -197,10 +197,12 @@ const GValue * value, GParamSpec * pspec); static void gst_x265_enc_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); +static gboolean x265enc_element_init (GstPlugin * plugin); #define gst_x265_enc_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstX265Enc, gst_x265_enc, GST_TYPE_VIDEO_ENCODER, G_IMPLEMENT_INTERFACE (GST_TYPE_PRESET, NULL)); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (x265enc, x265enc_element_init); static gboolean gst_x265_enc_add_x265_chroma_format (GstStructure * s, @@ -641,6 +643,10 @@ g_ptr_array_set_size (x265enc->peer_profiles, 0); + /* make sure that we have enough time for first DTS, + this is probably overkill for most streams */ + gst_video_encoder_set_min_pts (encoder, GST_SECOND * 60 * 60 * 1000); + return TRUE; } @@ -1735,7 +1741,7 @@ } static gboolean -plugin_init (GstPlugin * plugin) +x265enc_element_init (GstPlugin * plugin) { GST_DEBUG_CATEGORY_INIT (x265_enc_debug, "x265enc", 0, "h265 encoding element"); @@ -1776,6 +1782,12 @@ GST_RANK_PRIMARY, GST_TYPE_X265_ENC); } +static gboolean +plugin_init (GstPlugin * plugin) +{ + return GST_ELEMENT_REGISTER (x265enc, plugin); +} + GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, x265,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/x265/gstx265enc.h -> gst-plugins-bad-1.20.1.tar.xz/ext/x265/gstx265enc.h
Changed
@@ -84,5 +84,7 @@ GType gst_x265_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (x265enc); + G_END_DECLS #endif /* __GST_X265_ENC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/x265/meson.build -> gst-plugins-bad-1.20.1.tar.xz/ext/x265/meson.build
Changed
@@ -1,4 +1,9 @@ -x265_dep = dependency('x265', required: get_option('x265')) +x265_opt = get_option('x265').require(gpl_allowed, error_message: ''' + Plugin x265 explicitly required via options but GPL-licensed plugins disabled via options. + Pass option -Dgpl=enabled to Meson to allow GPL-licensed plugins to be built. + ''') + +x265_dep = dependency('x265', required: x265_opt) if x265_dep.found() gstx265 = library('gstx265', 'gstx265enc.c',
View file
gst-plugins-bad-1.18.6.tar.xz/ext/zbar/gstzbar.c -> gst-plugins-bad-1.20.1.tar.xz/ext/zbar/gstzbar.c
Changed
@@ -110,7 +110,9 @@ GstVideoFrame * frame); #define gst_zbar_parent_class parent_class -G_DEFINE_TYPE (GstZBar, gst_zbar, GST_TYPE_VIDEO_FILTER); +G_DEFINE_TYPE_WITH_CODE (GstZBar, gst_zbar, GST_TYPE_VIDEO_FILTER, + GST_DEBUG_CATEGORY_INIT (zbar_debug, "zbar", 0, "zbar");); +GST_ELEMENT_REGISTER_DEFINE (zbar, "zbar", GST_RANK_NONE, GST_TYPE_ZBAR); static void gst_zbar_class_init (GstZBarClass * g_class) @@ -361,9 +363,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (zbar_debug, "zbar", 0, "zbar"); - - return gst_element_register (plugin, "zbar", GST_RANK_NONE, GST_TYPE_ZBAR); + return GST_ELEMENT_REGISTER (zbar, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/ext/zbar/gstzbar.h -> gst-plugins-bad-1.20.1.tar.xz/ext/zbar/gstzbar.h
Changed
@@ -67,6 +67,8 @@ GType gst_zbar_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (zbar); + G_END_DECLS #endif /* __GST_VIDEO_ZBAR_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/zxing/gstzxing.cpp -> gst-plugins-bad-1.20.1.tar.xz/ext/zxing/gstzxing.cpp
Changed
@@ -191,6 +191,7 @@ GST_TYPE_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (zxing_debug, "zxing", 0, "debug category for zxing element")); +GST_ELEMENT_REGISTER_DEFINE (zxing, "zxing", GST_RANK_MARGINAL, GST_TYPE_ZXING); static void gst_zxing_class_init (GstZXingClass * g_class)
View file
gst-plugins-bad-1.18.6.tar.xz/ext/zxing/gstzxing.h -> gst-plugins-bad-1.20.1.tar.xz/ext/zxing/gstzxing.h
Changed
@@ -27,7 +27,7 @@ #define GST_TYPE_ZXING gst_zxing_get_type () G_DECLARE_FINAL_TYPE(GstZXing, gst_zxing, GST, ZXING, GstVideoFilter) - +GST_ELEMENT_REGISTER_DECLARE (zxing); G_END_DECLS #endif /* __GST_VIDEO_ZXING_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/ext/zxing/gstzxingplugin.c -> gst-plugins-bad-1.20.1.tar.xz/ext/zxing/gstzxingplugin.c
Changed
@@ -32,7 +32,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "zxing", GST_RANK_MARGINAL, GST_TYPE_ZXING); + GST_ELEMENT_REGISTER (zxing, plugin); return TRUE; }
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/adaptivedemux/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/adaptivedemux/meson.build
Changed
@@ -1,9 +1,10 @@ adaptivedemux_sources = files('gstadaptivedemux.c') adaptivedemux_headers = files('gstadaptivedemux.h') +pkg_name = 'gstreamer-adaptivedemux-1.0' gstadaptivedemux = library('gstadaptivedemux-' + api_version, adaptivedemux_sources, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_ADAPTIVE_DEMUX'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_ADAPTIVE_DEMUX', '-DG_LOG_DOMAIN="GStreamer-AdaptiveDemux"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -11,7 +12,10 @@ install : true, dependencies : [gstbase_dep, gsturidownloader_dep], ) +libraries += [[pkg_name, {'lib': gstadaptivedemux}]] gstadaptivedemux_dep = declare_dependency(link_with : gstadaptivedemux, include_directories : [libsinc], dependencies : [gstbase_dep, gsturidownloader_dep]) + +meson.override_dependency(pkg_name, gstadaptivedemux_dep) \ No newline at end of file
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/audio/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/audio/meson.build
Changed
@@ -1,11 +1,11 @@ -badaudio_sources = ['gstnonstreamaudiodecoder.c', 'gstplanaraudioadapter.c'] -badaudio_headers = ['gstnonstreamaudiodecoder.h', 'audio-bad-prelude.h', 'gstplanaraudioadapter.h'] +badaudio_sources = files('gstnonstreamaudiodecoder.c', 'gstplanaraudioadapter.c') +badaudio_headers = files('gstnonstreamaudiodecoder.h', 'audio-bad-prelude.h', 'gstplanaraudioadapter.h') install_headers(badaudio_headers, subdir : 'gstreamer-1.0/gst/audio') gstbadaudio = library('gstbadaudio-' + api_version, badaudio_sources, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_AUDIO_BAD'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_AUDIO_BAD', '-DG_LOG_DOMAIN="GStreamer-AudioBad"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -14,21 +14,41 @@ dependencies : [gstaudio_dep, gstbase_dep], ) +library_def = {'lib': gstbadaudio} +pkg_name = 'gstreamer-bad-audio-1.0' +pkgconfig.generate(gstbadaudio, + libraries : [gst_dep, gstbase_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'Bad audio library for GStreamer elements', +) +gen_sources = [] + +if build_gir + gir = { + 'sources' : badaudio_sources + badaudio_headers, + 'namespace' : 'GstBadAudio', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0', 'GstAudio-1.0', 'GstBase-1.0'], + 'install' : true, + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'], + 'dependencies' : [gstbase_dep, gstaudio_dep] + } + library_def += {'gir': [gir]} + if not static_build + audio_gir = gnome.generate_gir(gstbadaudio, kwargs: gir) + gen_sources += audio_gir + endif +endif +libraries += [[pkg_name, library_def]] + gstbadaudio_dep = declare_dependency(link_with : gstbadaudio, include_directories : [libsinc], + sources: gen_sources, dependencies : [gstaudio_dep, gstbase_dep]) -if 
build_gir - audio_gir = gnome.generate_gir(gstbadaudio, - sources : badaudio_sources + badaudio_headers, - namespace : 'GstBadAudio', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-bad-audio-1.0', - includes : ['Gst-1.0', 'GstAudio-1.0', 'GstBase-1.0'], - install : true, - extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'], - dependencies : [gstbase_dep, gstaudio_dep] - ) -endif +meson.override_dependency(pkg_name, gstbadaudio_dep)
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/basecamerabinsrc/gstbasecamerasrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/basecamerabinsrc/gstbasecamerasrc.c
Changed
@@ -46,13 +46,13 @@ * // pad templates should be a #GstStaticPadTemplate with direction * // #GST_PAD_SRC and name "vidsrc", "imgsrc" and "vfsrc" * gst_element_class_add_static_pad_template (gstelement_class, - * &vidsrc_template); + * &vidsrc_template); * gst_element_class_add_static_pad_template (gstelement_class, - * &imgsrc_template); + * &imgsrc_template); * gst_element_class_add_static_pad_template (gstelement_class, - * &vfsrc_template); + * &vfsrc_template); * // see #GstElementDetails - * gst_element_class_set_details (gstelement_class, &details); + * gst_element_class_set_details (gstelement_class, &details); * } * ]| *
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/basecamerabinsrc/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/basecamerabinsrc/meson.build
Changed
@@ -1,19 +1,19 @@ -camerabin_sources = [ +camerabin_sources = files( 'gstcamerabin-enum.c', 'gstcamerabinpreview.c', 'gstbasecamerasrc.c', -] -camerabin_headers = [ +) +camerabin_headers = files( 'basecamerabinsrc-prelude.h', 'gstcamerabin-enum.h', 'gstcamerabinpreview.h', 'gstbasecamerasrc.h', -] +) install_headers(camerabin_headers, subdir : 'gstreamer-1.0/gst/basecamerabinsrc') gstbasecamerabin = library('gstbasecamerabinsrc-' + api_version, camerabin_sources, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_BASE_CAMERA_BIN_SRC'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_BASE_CAMERA_BIN_SRC', '-DG_LOG_DOMAIN="GStreamer-BaseCameraBinSrc"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -22,24 +22,32 @@ dependencies : [gstapp_dep], ) -_sources = [] -if build_gir - basecamerabin_gir = gnome.generate_gir(gstbasecamerabin, - sources : camerabin_sources + camerabin_headers, - namespace : 'GstBadBaseCameraBin', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-bad-base-camerabinsrc-1.0', - includes : ['Gst-1.0', 'GstApp-1.0'], - install : false, # Only for the documentation - extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'], - dependencies : [gstapp_dep], - build_by_default : true, - ) - _sources += [basecamerabin_gir] +library_def = {'lib': gstbasecamerabin} +pkg_name = 'gstreamer-bad-base-camerabinsrc-1.0' +gen_sources = [] +if build_gir and not static_build + gir = { + 'sources' : camerabin_sources + camerabin_headers, + 'namespace' : 'GstBadBaseCameraBin', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0', 'GstApp-1.0'], + 'install' : false, # Only for the documentation + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'], + 'dependencies' : [gstapp_dep], + 'build_by_default' : true, 
+ } + library_def += {'gir': [gir]} + if not static_build + basecamerabin_gir = gnome.generate_gir(gstbasecamerabin, kwargs: gir) + gen_sources += basecamerabin_gir + endif endif gstbasecamerabin_dep = declare_dependency(link_with : gstbasecamerabin, include_directories : [libsinc], + sources: gen_sources, dependencies : [gstapp_dep]) +meson.override_dependency(pkg_name, gstbasecamerabin_dep)
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gstav1parser.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gstav1parser.c
Changed
@@ -195,6 +195,88 @@ return r + (v >> 1); } +/* Shift down with rounding for use when n >= 0, value >= 0 */ +static guint64 +av1_helper_round_power_of_two (guint64 value, guint16 n) +{ + return (value + (((guint64) (1) << n) >> 1)) >> n; +} + + /* Shift down with rounding for signed integers, for use when n >= 0 */ +static gint64 +av1_helper_round_power_of_two_signed (gint64 value, guint16 n) +{ + return (value < 0) ? -((gint64) (av1_helper_round_power_of_two (-value, n))) + : (gint64) av1_helper_round_power_of_two (value, n); +} + +static gint +av1_helper_msb (guint n) +{ + int log = 0; + guint value = n; + int i; + + g_assert (n != 0); + + for (i = 4; i >= 0; --i) { + const gint shift = (1 << i); + const guint x = value >> shift; + if (x != 0) { + value = x; + log += shift; + } + } + return log; +} + +static const guint16 div_lut[GST_AV1_DIV_LUT_NUM + 1] = { + 16384, 16320, 16257, 16194, 16132, 16070, 16009, 15948, 15888, 15828, 15768, + 15709, 15650, 15592, 15534, 15477, 15420, 15364, 15308, 15252, 15197, 15142, + 15087, 15033, 14980, 14926, 14873, 14821, 14769, 14717, 14665, 14614, 14564, + 14513, 14463, 14413, 14364, 14315, 14266, 14218, 14170, 14122, 14075, 14028, + 13981, 13935, 13888, 13843, 13797, 13752, 13707, 13662, 13618, 13574, 13530, + 13487, 13443, 13400, 13358, 13315, 13273, 13231, 13190, 13148, 13107, 13066, + 13026, 12985, 12945, 12906, 12866, 12827, 12788, 12749, 12710, 12672, 12633, + 12596, 12558, 12520, 12483, 12446, 12409, 12373, 12336, 12300, 12264, 12228, + 12193, 12157, 12122, 12087, 12053, 12018, 11984, 11950, 11916, 11882, 11848, + 11815, 11782, 11749, 11716, 11683, 11651, 11619, 11586, 11555, 11523, 11491, + 11460, 11429, 11398, 11367, 11336, 11305, 11275, 11245, 11215, 11185, 11155, + 11125, 11096, 11067, 11038, 11009, 10980, 10951, 10923, 10894, 10866, 10838, + 10810, 10782, 10755, 10727, 10700, 10673, 10645, 10618, 10592, 10565, 10538, + 10512, 10486, 10460, 10434, 10408, 10382, 10356, 10331, 10305, 10280, 10255, + 10230, 10205, 
10180, 10156, 10131, 10107, 10082, 10058, 10034, 10010, 9986, + 9963, 9939, 9916, 9892, 9869, 9846, 9823, 9800, 9777, 9754, 9732, + 9709, 9687, 9664, 9642, 9620, 9598, 9576, 9554, 9533, 9511, 9489, + 9468, 9447, 9425, 9404, 9383, 9362, 9341, 9321, 9300, 9279, 9259, + 9239, 9218, 9198, 9178, 9158, 9138, 9118, 9098, 9079, 9059, 9039, + 9020, 9001, 8981, 8962, 8943, 8924, 8905, 8886, 8867, 8849, 8830, + 8812, 8793, 8775, 8756, 8738, 8720, 8702, 8684, 8666, 8648, 8630, + 8613, 8595, 8577, 8560, 8542, 8525, 8508, 8490, 8473, 8456, 8439, + 8422, 8405, 8389, 8372, 8355, 8339, 8322, 8306, 8289, 8273, 8257, + 8240, 8224, 8208, 8192, +}; + +static gint16 +av1_helper_resolve_divisor_32 (guint32 D, gint16 * shift) +{ + gint32 f; + gint32 e; + + *shift = av1_helper_msb (D); + // e is obtained from D after resetting the most significant 1 bit. + e = D - ((guint32) 1 << *shift); + // Get the most significant DIV_LUT_BITS (8) bits of e into f + if (*shift > GST_AV1_DIV_LUT_BITS) + f = av1_helper_round_power_of_two (e, *shift - GST_AV1_DIV_LUT_BITS); + else + f = e << (GST_AV1_DIV_LUT_BITS - *shift); + g_assert (f <= GST_AV1_DIV_LUT_NUM); + *shift += GST_AV1_DIV_LUT_PREC_BITS; + // Use f as lookup into the precomputed table of multipliers + return div_lut[f]; +} + /************************************* * * * Bitstream Functions * @@ -293,8 +375,8 @@ /* 4.10.7 * * Unsigned encoded integer with maximum number of values n */ -static guint8 -av1_bitstreamfn_ns (GstBitReader * br, guint8 n, GstAV1ParserResult * retval) +static guint32 +av1_bitstreamfn_ns (GstBitReader * br, guint32 n, GstAV1ParserResult * retval) { gint w, m, v; gint extra_bit; @@ -438,7 +520,6 @@ static void gst_av1_parse_reset_state (GstAV1Parser * parser, gboolean free_sps) { - parser->state.seen_frame_header = 0; parser->state.begin_first_frame = FALSE; parser->state.prev_frame_id = 0; @@ -487,29 +568,44 @@ { g_return_if_fail (parser != NULL); - if (parser->annex_b) { - g_assert (parser->temporal_unit_consumed <= 
parser->temporal_unit_size); - if (parser->temporal_unit_consumed < parser->temporal_unit_size) - GST_DEBUG ("temporal_unit_consumed: %d, temporal_unit_size:%d, " - "discard the left %d bytes for a temporal_unit.", - parser->temporal_unit_consumed, parser->temporal_unit_size, - parser->temporal_unit_size - parser->temporal_unit_consumed); - - g_assert (parser->frame_unit_consumed <= parser->frame_unit_size); - if (parser->frame_unit_consumed < parser->frame_unit_size) - GST_DEBUG (" frame_unit_consumed %d, frame_unit_size: %d " - "discard the left %d bytes for a frame_unit.", - parser->frame_unit_consumed, parser->frame_unit_size, - parser->frame_unit_size - parser->frame_unit_consumed); - } + parser->annex_b = annex_b; + if (parser->annex_b) + gst_av1_parser_reset_annex_b (parser); + + gst_av1_parse_reset_state (parser, TRUE); +} + +/** + * gst_av1_parser_reset_annex_b: + * @parser: the #GstAV1Parser + * + * Only reset the current #GstAV1Parser's annex b context. + * The other part of the state is kept. 
+ * + * Since: 1.20 + */ +void +gst_av1_parser_reset_annex_b (GstAV1Parser * parser) +{ + g_return_if_fail (parser != NULL); + g_return_if_fail (parser->annex_b); + + if (parser->temporal_unit_consumed < parser->temporal_unit_size) + GST_DEBUG ("temporal_unit_consumed: %d, temporal_unit_size:%d, " + "discard the left %d bytes for a temporal_unit.", + parser->temporal_unit_consumed, parser->temporal_unit_size, + parser->temporal_unit_size - parser->temporal_unit_consumed); + + if (parser->frame_unit_consumed < parser->frame_unit_size) + GST_DEBUG (" frame_unit_consumed %d, frame_unit_size: %d " + "discard the left %d bytes for a frame_unit.", + parser->frame_unit_consumed, parser->frame_unit_size, + parser->frame_unit_size - parser->frame_unit_consumed); parser->temporal_unit_consumed = 0; parser->temporal_unit_size = 0; parser->frame_unit_consumed = 0; parser->frame_unit_size = 0; - parser->annex_b = annex_b; - - gst_av1_parse_reset_state (parser, TRUE); } /* 5.3.2 */ @@ -615,14 +711,17 @@ annex_b_again: last_pos = 0; - g_assert (*consumed <= size); + if (*consumed > size) + goto error; if (*consumed == size) { ret = GST_AV1_PARSER_NO_MORE_DATA; goto error; } gst_bit_reader_init (&br, data + *consumed, size - *consumed); - g_assert (parser->temporal_unit_consumed <= parser->temporal_unit_size); + if (parser->temporal_unit_consumed > parser->temporal_unit_size) + goto error; + if (parser->temporal_unit_consumed && parser->temporal_unit_consumed == parser->temporal_unit_size) { GST_LOG ("Complete a temporal unit of size %d", @@ -647,7 +746,9 @@ } } - g_assert (parser->frame_unit_consumed <= parser->frame_unit_size); + if (parser->frame_unit_consumed > parser->frame_unit_size) + goto error; + if (parser->frame_unit_consumed && parser->frame_unit_consumed == parser->frame_unit_size) { GST_LOG ("Complete a frame unit of size %d", parser->frame_unit_size); @@ -707,7 +808,8 @@ } } - g_assert (*consumed <= size); + if (*consumed > size) + goto error; if (*consumed == size) 
{ ret = GST_AV1_PARSER_NO_MORE_DATA; goto error; @@ -722,12 +824,16 @@ GST_LOG ("identify obu type is %d", obu->obu_type); if (obu->header.obu_has_size_field) { + guint size_sz = gst_bit_reader_get_pos (&br) / 8; + obu->obu_size = av1_bitstreamfn_leb128 (&br, &ret); if (ret != GST_AV1_PARSER_OK) goto error; + size_sz = gst_bit_reader_get_pos (&br) / 8 - size_sz; if (obu_length - && obu_length - 1 - obu->header.obu_extention_flag != obu->obu_size) { + && obu_length - 1 - obu->header.obu_extention_flag - size_sz != + obu->obu_size) { /* If obu_size and obu_length are both present, but inconsistent, then the packed bitstream is deemed invalid. */ ret = GST_AV1_PARSER_BITSTREAM_ERROR; @@ -1138,7 +1244,7 @@ retval = GST_AV1_PARSER_BITSTREAM_ERROR; goto error; } - if (seq_header->operating_points[i].seq_level_idx > GST_AV1_SEQ_LEVEL_4_0) { + if (seq_header->operating_points[i].seq_level_idx > GST_AV1_SEQ_LEVEL_3_3) { seq_header->operating_points[i].seq_tier = AV1_READ_BIT (br); } else { seq_header->operating_points[i].seq_tier = 0; @@ -1187,7 +1293,8 @@ } } - /* Let user decide the operatingPoint, move it later + /* Let user decide the operatingPoint, + implemented by calling gst_av1_parser_set_operating_point() operatingPoint = choose_operating_point( ) operating_point_idc = operating_point_idc[ operatingPoint ] */ @@ -1349,7 +1456,7 @@ if (parser->state.operating_point < 0 || parser->state.operating_point > seq_header->operating_points_cnt_minus_1) { - GST_INFO ("Invalid operating_point %d set by user, just use 0", + GST_WARNING ("Invalid operating_point %d set by user, just use 0", parser->state.operating_point); parser->state.operating_point_idc = seq_header->operating_points[0].idc; } else { @@ -1414,7 +1521,7 @@ if (ret != GST_AV1_PARSER_OK) return ret; - if (itut_t35->itu_t_t35_country_code) { + if (itut_t35->itu_t_t35_country_code == 0xFF) { itut_t35->itu_t_t35_country_code_extention_byte = AV1_READ_BITS_CHECKED (br, 8, &ret); if (ret != GST_AV1_PARSER_OK) @@ 
-1512,7 +1619,7 @@ if (scalability->spatial_layer_description_present_flag) { for (i = 0; i <= scalability->spatial_layers_cnt_minus_1; i++) { - scalability->spatial_layer_ref_id[i] = AV1_READ_BIT_CHECKED (br, &ret); + scalability->spatial_layer_ref_id[i] = AV1_READ_UINT8_CHECKED (br, &ret); if (ret != GST_AV1_PARSER_OK) goto error; } @@ -1689,6 +1796,12 @@ goto error; retval = av1_skip_trailing_bits (parser, &bit_reader, obu); + if (retval != GST_AV1_PARSER_OK) { + GST_WARNING ("Metadata type %d may have wrong trailings.", + metadata->metadata_type); + retval = GST_AV1_PARSER_OK; + } + return retval; error: @@ -2005,7 +2118,8 @@ gint bits_to_read = segmentation_feature_bits[j]; gint limit = segmentation_feature_max[j]; if (segmentation_feature_signed[j]) { - feature_value = av1_bitstreamfn_su (br, bits_to_read, &retval); + feature_value = + av1_bitstreamfn_su (br, 1 + bits_to_read, &retval); if (retval != GST_AV1_PARSER_OK) goto error; @@ -2096,7 +2210,6 @@ gint max_width /* maxWidth */ , max_height /* maxHeight */ ; gint size_sb /* sizeSb */ ; gint widest_tile_sb /* widestTileSb */ ; - gint min_inner_tile_width = G_MAXINT /* min width of non-rightmost tile */ ; g_assert (parser->seq_header); seq_header = parser->seq_header; @@ -2143,8 +2256,13 @@ } parser->state.mi_col_starts[i] = parser->state.mi_cols; parser->state.tile_cols = i; - if (parser->state.tile_cols > 1) - min_inner_tile_width = tile_width_sb << sb_size; + + while (i >= 1) { + tile_info->width_in_sbs_minus_1[i - 1] = + ((parser->state.mi_col_starts[i] - parser->state.mi_col_starts[i - 1] + + ((1 << sb_shift) - 1)) >> sb_shift) - 1; + i--; + } min_log2_tile_rows = MAX (min_log2_tiles - parser->state.tile_cols_log2, 0); parser->state.tile_rows_log2 = min_log2_tile_rows; @@ -2167,6 +2285,12 @@ } parser->state.mi_row_starts[i] = parser->state.mi_rows; parser->state.tile_rows = i; + while (i >= 1) { + tile_info->height_in_sbs_minus_1[i - 1] = + ((parser->state.mi_row_starts[i] - 
parser->state.mi_row_starts[i - 1] + + ((1 << sb_shift) - 1)) >> sb_shift) - 1; + i--; + } } else { widest_tile_sb = 0; start_sb = 0; @@ -2181,8 +2305,6 @@ size_sb = tile_info->width_in_sbs_minus_1[i] + 1; widest_tile_sb = MAX (size_sb, widest_tile_sb); start_sb += size_sb; - if (i > 0 && ((size_sb << sb_size) < min_inner_tile_width)) - min_inner_tile_width = size_sb << sb_size; } parser->state.mi_col_starts[i] = parser->state.mi_cols; parser->state.tile_cols = i; @@ -2222,7 +2344,7 @@ if (retval != GST_AV1_PARSER_OK) goto error; - tile_info->tile_size_bytes_minus_1 = AV1_READ_BIT_CHECKED (br, &retval); + tile_info->tile_size_bytes_minus_1 = AV1_READ_BITS_CHECKED (br, 2, &retval); if (retval != GST_AV1_PARSER_OK) goto error; @@ -2231,13 +2353,6 @@ tile_info->context_update_tile_id = 0; } - if (min_inner_tile_width < (64 << (parser->state.upscaled_width != - parser->state.frame_width))) { - GST_INFO ("Minimum tile width requirement not satisfied"); - retval = GST_AV1_PARSER_BITSTREAM_ERROR; - goto error; - } - memcpy (tile_info->mi_col_starts, parser->state.mi_col_starts, sizeof (guint32) * (GST_AV1_MAX_TILE_COLS + 1)); memcpy (tile_info->mi_row_starts, parser->state.mi_row_starts, @@ -2270,13 +2385,8 @@ lf_params = &frame_header->loop_filter_params; if (frame_header->coded_lossless || frame_header->allow_intrabc) { - lf_params->loop_filter_delta_enabled = 0; - lf_params->loop_filter_delta_update = 0; - lf_params->loop_filter_sharpness = 0; lf_params->loop_filter_level[0] = 0; lf_params->loop_filter_level[1] = 0; - lf_params->loop_filter_level[2] = 0; - lf_params->loop_filter_level[3] = 0; lf_params->loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME] = 1; lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME] = 0; lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME] = 0; @@ -2291,58 +2401,6 @@ goto success; } - lf_params->loop_filter_delta_enabled = 0; - lf_params->loop_filter_delta_update = 0; - lf_params->loop_filter_sharpness = 0; - 
lf_params->loop_filter_level[0] = 0; - lf_params->loop_filter_level[1] = 0; - lf_params->loop_filter_level[2] = 0; - lf_params->loop_filter_level[3] = 0; - if (frame_header->primary_ref_frame != GST_AV1_PRIMARY_REF_NONE) { - /* Copy it from prime_ref */ - GstAV1LoopFilterParams *ref_lf_params = - &parser->state.ref_info.entry[frame_header-> - ref_frame_idx[frame_header->primary_ref_frame]].ref_lf_params; - - g_assert (parser->state.ref_info. - entry[frame_header->ref_frame_idx[frame_header->primary_ref_frame]]. - ref_valid); - lf_params->loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST3_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST3_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_BWDREF_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_BWDREF_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_GOLDEN_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_GOLDEN_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF2_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF2_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF_FRAME] = - ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF_FRAME]; - for (i = 0; i < 2; i++) - lf_params->loop_filter_mode_deltas[i] = - ref_lf_params->loop_filter_mode_deltas[i]; - } else { - /* Set default value */ - lf_params->loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME] = 1; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME] = 0; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME] = - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME]; - 
lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST3_FRAME] = - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_BWDREF_FRAME] = - lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME]; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_GOLDEN_FRAME] = -1; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF2_FRAME] = -1; - lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF_FRAME] = -1; - for (i = 0; i < 2; i++) - lf_params->loop_filter_mode_deltas[i] = 0; - } - if (AV1_REMAINING_BITS (br) < 6 + 6) { retval = GST_AV1_PARSER_NO_MORE_DATA; goto error; @@ -2386,8 +2444,7 @@ av1_bitstreamfn_su (br, 7, &retval); if (retval != GST_AV1_PARSER_OK) goto error; - } else - lf_params->loop_filter_ref_deltas[i] = 0; + } } for (i = 0; i < 2; i++) { update_mode_deltas = AV1_READ_BIT_CHECKED (br, &retval); @@ -2399,8 +2456,7 @@ av1_bitstreamfn_su (br, 7, &retval); if (retval != GST_AV1_PARSER_OK) goto error; - } else - lf_params->loop_filter_mode_deltas[i] = 0; + } } } } @@ -2546,9 +2602,9 @@ if (frame_header->all_lossless || frame_header->allow_intrabc || !seq_header->enable_restoration) { - lr_params->frame_restoration_type[0] = GST_AV1_FRAME_RESTORE_NONE; - lr_params->frame_restoration_type[0] = GST_AV1_FRAME_RESTORE_NONE; - lr_params->frame_restoration_type[0] = GST_AV1_FRAME_RESTORE_NONE; + for (i = 0; i < GST_AV1_MAX_NUM_PLANES; i++) + lr_params->frame_restoration_type[i] = GST_AV1_FRAME_RESTORE_NONE; + lr_params->uses_lr = 0; goto success; } @@ -2851,6 +2907,66 @@ return GST_AV1_PARSER_OK; } +static gboolean +gst_av1_parser_is_shear_params_valid (gint32 gm_params[6]) +{ + const gint32 *mat = gm_params; + gint16 alpha, beta, gamma, delta; + gint16 shift; + gint16 y; + gint16 v; + guint i; + gboolean default_warp_params; + + if (!(mat[2] > 0)) + return FALSE; + + default_warp_params = TRUE; + for (i = 0; i < 6; i++) { + if (gm_params[i] != ((i % 3 == 2) ? 
1 << GST_AV1_WARPEDMODEL_PREC_BITS : 0)) { + default_warp_params = FALSE; + break; + } + } + if (default_warp_params) + return TRUE; + + alpha = CLAMP (mat[2] - (1 << GST_AV1_WARPEDMODEL_PREC_BITS), + G_MININT16, G_MAXINT16); + beta = CLAMP (mat[3], G_MININT16, G_MAXINT16); + y = av1_helper_resolve_divisor_32 (ABS (mat[2]), &shift) + * (mat[2] < 0 ? -1 : 1); + v = ((gint64) mat[4] * (1 << GST_AV1_WARPEDMODEL_PREC_BITS)) * y; + gamma = + CLAMP ((gint) av1_helper_round_power_of_two_signed (v, shift), G_MININT16, + G_MAXINT16); + v = ((gint64) mat[3] * mat[4]) * y; + delta = + CLAMP (mat[5] - (gint) av1_helper_round_power_of_two_signed (v, + shift) - (1 << GST_AV1_WARPEDMODEL_PREC_BITS), G_MININT16, + G_MAXINT16); + + alpha = + av1_helper_round_power_of_two_signed (alpha, + GST_AV1_WARP_PARAM_REDUCE_BITS) * (1 << GST_AV1_WARP_PARAM_REDUCE_BITS); + beta = + av1_helper_round_power_of_two_signed (beta, + GST_AV1_WARP_PARAM_REDUCE_BITS) * (1 << GST_AV1_WARP_PARAM_REDUCE_BITS); + gamma = + av1_helper_round_power_of_two_signed (gamma, + GST_AV1_WARP_PARAM_REDUCE_BITS) * (1 << GST_AV1_WARP_PARAM_REDUCE_BITS); + delta = + av1_helper_round_power_of_two_signed (delta, + GST_AV1_WARP_PARAM_REDUCE_BITS) * (1 << GST_AV1_WARP_PARAM_REDUCE_BITS); + + if ((4 * ABS (alpha) + 7 * ABS (beta) >= (1 << GST_AV1_WARPEDMODEL_PREC_BITS)) + || (4 * ABS (gamma) + 4 * ABS (delta) >= + (1 << GST_AV1_WARPEDMODEL_PREC_BITS))) + return FALSE; + + return TRUE; +} + /* 5.9.24 */ static GstAV1ParserResult gst_av1_parse_global_motion_params (GstAV1Parser * parser, @@ -2865,6 +2981,7 @@ /* init value */ gm_params->gm_type[GST_AV1_REF_INTRA_FRAME] = GST_AV1_WARP_MODEL_IDENTITY; for (ref = GST_AV1_REF_LAST_FRAME; ref <= GST_AV1_REF_ALTREF_FRAME; ref++) { + gm_params->invalid[ref] = 0; gm_params->gm_type[ref] = GST_AV1_WARP_MODEL_IDENTITY; for (i = 0; i < 6; i++) { gm_params->gm_params[ref][i] = @@ -2956,6 +3073,10 @@ if (retval != GST_AV1_PARSER_OK) goto error; } + + if (type <= GST_AV1_WARP_MODEL_AFFINE) 
+ gm_params->invalid[ref] = + !gst_av1_parser_is_shear_params_valid (gm_params->gm_params[ref]); } success: @@ -3875,13 +3996,61 @@ goto error; } + if (frame_header->primary_ref_frame == GST_AV1_PRIMARY_REF_NONE) { + /* do something in setup_past_independence() of parser level */ + gint8 *loop_filter_ref_deltas = + frame_header->loop_filter_params.loop_filter_ref_deltas; + + frame_header->loop_filter_params.loop_filter_delta_enabled = 1; + loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME] = 1; + loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME] = 0; + loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME] = 0; + loop_filter_ref_deltas[GST_AV1_REF_LAST3_FRAME] = 0; + loop_filter_ref_deltas[GST_AV1_REF_BWDREF_FRAME] = 0; + loop_filter_ref_deltas[GST_AV1_REF_GOLDEN_FRAME] = -1; + loop_filter_ref_deltas[GST_AV1_REF_ALTREF_FRAME] = -1; + loop_filter_ref_deltas[GST_AV1_REF_ALTREF2_FRAME] = -1; + frame_header->loop_filter_params.loop_filter_mode_deltas[0] = 0; + frame_header->loop_filter_params.loop_filter_mode_deltas[1] = 0; + } else { + /* do something in load_previous() of parser level */ + /* load_loop_filter_params() */ + GstAV1LoopFilterParams *ref_lf_params = + &parser->state.ref_info.entry[frame_header-> + ref_frame_idx[frame_header->primary_ref_frame]].ref_lf_params; + gint8 *loop_filter_ref_deltas = + frame_header->loop_filter_params.loop_filter_ref_deltas; + + /* Copy all from prime_ref */ + g_assert (parser->state.ref_info. + entry[frame_header->ref_frame_idx[frame_header->primary_ref_frame]]. 
+ ref_valid); + loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_INTRA_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST2_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_LAST3_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_LAST3_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_BWDREF_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_BWDREF_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_GOLDEN_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_GOLDEN_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_ALTREF2_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF2_FRAME]; + loop_filter_ref_deltas[GST_AV1_REF_ALTREF_FRAME] = + ref_lf_params->loop_filter_ref_deltas[GST_AV1_REF_ALTREF_FRAME]; + for (i = 0; i < 2; i++) + frame_header->loop_filter_params.loop_filter_mode_deltas[i] = + ref_lf_params->loop_filter_mode_deltas[i]; + } + /* @TODO: if ( primary_ref_frame == PRIMARY_REF_NONE ) { init_non_coeff_cdfs( ) - setup_past_independence( ) } else { load_cdfs( ref_frame_idx[primary_ref_frame] ) - load_previous( ) } */ /* @TODO: @@ -4253,6 +4422,11 @@ goto error; } + if (tile_group->tg_end < tile_group->tg_start) { + retval = GST_AV1_PARSER_NO_MORE_DATA; + goto error; + } + if (!gst_bit_reader_skip_to_byte (br)) { retval = GST_AV1_PARSER_NO_MORE_DATA; goto error; @@ -4261,6 +4435,7 @@ end_bit_pos = gst_bit_reader_get_pos (br); header_bytes = (end_bit_pos - start_bitpos) / 8; sz -= header_bytes; + for (tile_num = tile_group->tg_start; tile_num <= tile_group->tg_end; tile_num++) { tile_row = tile_num / parser->state.tile_cols; @@ -4274,9 +4449,14 @@ if (retval != GST_AV1_PARSER_OK) goto error; tile_size = tile_size_minus_1 + 1; - sz -= tile_size - parser->state.tile_size_bytes; + sz -= (tile_size + 
parser->state.tile_size_bytes); } + tile_group->entry[tile_num].tile_size = tile_size; + tile_group->entry[tile_num].tile_offset = gst_bit_reader_get_pos (br) / 8; + tile_group->entry[tile_num].tile_row = tile_row; + tile_group->entry[tile_num].tile_col = tile_col; + tile_group->entry[tile_num].mi_row_start = parser->state.mi_row_starts[tile_row]; tile_group->entry[tile_num].mi_row_end = @@ -4292,20 +4472,22 @@ */ /* Skip the real data to the next one */ - if (!gst_bit_reader_skip (br, tile_size)) { + if (tile_num < tile_group->tg_end && + !gst_bit_reader_skip (br, tile_size * 8)) { retval = GST_AV1_PARSER_NO_MORE_DATA; goto error; } } - /* Not implement here, the real decoder process - if (tile_group->tg_end == tile_group->num_tiles - 1) { - if ( !disable_frame_end_update_cdf ) { - frame_end_update_cdf( ) - } - decode_frame_wrapup( ) - } - */ + if (tile_group->tg_end == tile_group->num_tiles - 1) { + /* Not implement here, the real decoder process + if ( !disable_frame_end_update_cdf ) { + frame_end_update_cdf( ) + } + decode_frame_wrapup( ) + */ + parser->state.seen_frame_header = 0; + } return GST_AV1_PARSER_OK; @@ -4355,9 +4537,14 @@ GstBitReader * bit_reader, GstAV1FrameHeaderOBU * frame_header) { GstAV1ParserResult ret; + guint i; memset (frame_header, 0, sizeof (*frame_header)); frame_header->frame_is_intra = 1; + frame_header->last_frame_idx = -1; + frame_header->gold_frame_idx = -1; + for (i = 0; i < GST_AV1_REFS_PER_FRAME; i++) + frame_header->ref_frame_idx[i] = -1; ret = gst_av1_parse_uncompressed_frame_header (parser, obu, bit_reader, frame_header); @@ -4462,11 +4649,37 @@ return GST_AV1_PARSER_NO_MORE_DATA; retval = gst_av1_parse_tile_group (parser, &bit_reader, &(frame->tile_group)); - parser->state.seen_frame_header = 0; return retval; } /** + * gst_av1_parser_set_operating_point: + * @parser: the #GstAV1Parser + * @operating_point: the operating point to set + * + * Set the operating point to filter OBUs. + * + * Returns: The #GstAV1ParserResult. 
+ * + * Since: 1.20 + */ +GstAV1ParserResult +gst_av1_parser_set_operating_point (GstAV1Parser * parser, + gint32 operating_point) +{ + g_return_val_if_fail (parser != NULL, GST_AV1_PARSER_INVALID_OPERATION); + g_return_val_if_fail (operating_point >= 0, GST_AV1_PARSER_INVALID_OPERATION); + + if (parser->seq_header && + operating_point > parser->seq_header->operating_points_cnt_minus_1) + return GST_AV1_PARSER_INVALID_OPERATION; + + /* Decide whether it is valid when sequence comes. */ + parser->state.operating_point = operating_point; + return GST_AV1_PARSER_OK; +} + +/** * gst_av1_parser_new: * * Allocates a new #GstAV1Parser,
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gstav1parser.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gstav1parser.h
Changed
@@ -52,7 +52,6 @@ #define GST_AV1_SUPERRES_DENOM_MIN 9 #define GST_AV1_SUPERRES_DENOM_BITS 3 #define GST_AV1_MAX_LOOP_FILTER 63 -#define GST_AV1_GM_ABS_ALPHA_BITS 12 #define GST_AV1_GM_ABS_TRANS_BITS 12 #define GST_AV1_GM_ABS_TRANS_ONLY_BITS 9 #define GST_AV1_GM_ABS_ALPHA_BITS 12 @@ -60,6 +59,7 @@ #define GST_AV1_GM_TRANS_PREC_BITS 6 #define GST_AV1_GM_TRANS_ONLY_PREC_BITS 3 #define GST_AV1_WARPEDMODEL_PREC_BITS 16 +#define GST_AV1_WARP_PARAM_REDUCE_BITS 6 #define GST_AV1_SELECT_SCREEN_CONTENT_TOOLS 2 #define GST_AV1_SELECT_INTEGER_MV 2 #define GST_AV1_RESTORATION_TILESIZE_MAX 256 @@ -80,6 +80,11 @@ #define GST_AV1_MAX_NUM_POS_LUMA 25 #define GST_AV1_MAX_NUM_PLANES 3 +#define GST_AV1_DIV_LUT_PREC_BITS 14 +#define GST_AV1_DIV_LUT_BITS 8 +#define GST_AV1_DIV_LUT_NUM (1 << GST_AV1_DIV_LUT_BITS) + + typedef struct _GstAV1Parser GstAV1Parser; typedef struct _GstAV1OBUHeader GstAV1OBUHeader; @@ -137,13 +142,22 @@ * @GST_AV1_PROFILE_0: 8-bit and 10-bit 4:2:0 and 4:0:0 only. * @GST_AV1_PROFILE_1: 8-bit and 10-bit 4:4:4. * @GST_AV1_PROFILE_2: 8-bit and 10-bit 4:2:2, 12-bit 4:0:0 4:2:2 and 4:4:4 + * @GST_AV1_PROFILE_UNDEFINED: unknow AV1 profile (Since: 1.20) * * Defines the AV1 profiles */ +/** + * GST_AV1_PROFILE_UNDEFINED: + * + * unknow AV1 profile + * + * Since: 1.20 + */ typedef enum { GST_AV1_PROFILE_0 = 0, GST_AV1_PROFILE_1 = 1, GST_AV1_PROFILE_2 = 2, + GST_AV1_PROFILE_UNDEFINED, } GstAV1Profile; /** @@ -1087,8 +1101,8 @@ gboolean loop_filter_delta_enabled; gboolean loop_filter_delta_update; - guint8 loop_filter_ref_deltas[GST_AV1_TOTAL_REFS_PER_FRAME]; - guint8 loop_filter_mode_deltas[2]; + gint8 loop_filter_ref_deltas[GST_AV1_TOTAL_REFS_PER_FRAME]; + gint8 loop_filter_mode_deltas[2]; gboolean delta_lf_present; guint8 delta_lf_res; @@ -1258,6 +1272,14 @@ * @gm_params: is set equal to SavedGmParams[ frame_to_show_map_idx ][ ref ][ j ] for * ref = LAST_FRAME..ALTREF_FRAME, for j = 0..5. * @gm_type: specifying the type of global motion. 
+ * @invalid: whether this global motion parameters is invalid. (Since: 1.20) + */ +/** + * _GstAV1GlobalMotionParams.invalid: + * + * whether this global motion parameters is invalid. + * + * Since: 1.20 */ struct _GstAV1GlobalMotionParams { gboolean is_global[GST_AV1_NUM_REF_FRAMES]; @@ -1266,6 +1288,7 @@ gint32 gm_params[GST_AV1_NUM_REF_FRAMES][6]; GstAV1WarpModelType gm_type[GST_AV1_NUM_REF_FRAMES]; /* GmType */ + gboolean invalid[GST_AV1_NUM_REF_FRAMES]; }; /** @@ -1516,7 +1539,7 @@ */ struct _GstAV1FrameHeaderOBU { gboolean show_existing_frame; - guint8 frame_to_show_map_idx; + gint8 frame_to_show_map_idx; guint32 frame_presentation_time; guint32 tu_presentation_delay; guint32 display_frame_id; @@ -1537,9 +1560,9 @@ guint32 ref_order_hint[GST_AV1_NUM_REF_FRAMES]; gboolean allow_intrabc; gboolean frame_refs_short_signaling; - guint8 last_frame_idx; - guint8 gold_frame_idx; - guint8 ref_frame_idx[GST_AV1_REFS_PER_FRAME]; + gint8 last_frame_idx; + gint8 gold_frame_idx; + gint8 ref_frame_idx[GST_AV1_REFS_PER_FRAME]; gboolean allow_high_precision_mv; gboolean is_motion_mode_switchable; gboolean use_ref_frame_mvs; @@ -1636,7 +1659,7 @@ guint8 output_frame_height_in_tiles_minus_1; guint16 tile_count_minus_1; struct { - guint8 anchor_frame_idx; + gint8 anchor_frame_idx; guint8 anchor_tile_row; guint8 anchor_tile_col; guint16 tile_data_size_minus_1; @@ -1659,6 +1682,10 @@ * It is a requirement of bitstream conformance that the value of tg_end is greater * than or equal to tg_start. It is a requirement of bitstream conformance that the * value of tg_end for the last tile group in each frame is equal to num_tiles-1. + * @tile_offset: Offset from the OBU data, the real data start of this tile. + * @tg_size: Data size of this tile. + * @tile_row: Tile index in row. + * @tile_col: Tile index in column. 
* @mi_row_start: start position in mi rows * @mi_row_end: end position in mi rows * @mi_col_start: start position in mi cols @@ -1670,6 +1697,10 @@ guint8 tg_start; guint8 tg_end; struct { + guint32 tile_offset; /* Tile data offset from the OBU data. */ + guint32 tile_size; /* Data size of this tile */ + guint32 tile_row; /* tileRow */ + guint32 tile_col; /* tileCol */ /* global varialbes */ guint32 mi_row_start; /* MiRowStart */ guint32 mi_row_end; /* MiRowEnd */ @@ -1746,6 +1777,10 @@ gst_av1_parser_reset (GstAV1Parser * parser, gboolean annex_b); GST_CODEC_PARSERS_API +void +gst_av1_parser_reset_annex_b (GstAV1Parser * parser); + +GST_CODEC_PARSERS_API GstAV1ParserResult gst_av1_parser_identify_one_obu (GstAV1Parser * parser, const guint8 * data, guint32 size, GstAV1OBU * obu, guint32 * consumed); @@ -1796,6 +1831,11 @@ GstAV1FrameHeaderOBU * frame_header); GST_CODEC_PARSERS_API +GstAV1ParserResult +gst_av1_parser_set_operating_point (GstAV1Parser * parser, + gint32 operating_point); + +GST_CODEC_PARSERS_API GstAV1Parser * gst_av1_parser_new (void); GST_CODEC_PARSERS_API
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gsth264parser.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gsth264parser.c
Changed
@@ -271,7 +271,7 @@ *dst_pps = *src_pps; if (src_pps->slice_group_id) - dst_pps->slice_group_id = g_memdup (src_pps->slice_group_id, + dst_pps->slice_group_id = g_memdup2 (src_pps->slice_group_id, src_pps->pic_size_in_map_units_minus1 + 1); return TRUE; @@ -704,11 +704,12 @@ GstH264NalUnit * nalu, NalReader * nr) { GstH264DecRefPicMarking *dec_ref_pic_m; - guint start_pos; + guint start_pos, start_epb; GST_DEBUG ("parsing \"Decoded reference picture marking\""); start_pos = nal_reader_get_pos (nr); + start_epb = nal_reader_get_epb_count (nr); dec_ref_pic_m = &slice->dec_ref_pic_marking; @@ -723,7 +724,7 @@ dec_ref_pic_m->n_ref_pic_marking = 0; while (1) { - READ_UE (nr, mem_mgmt_ctrl_op); + READ_UE_MAX (nr, mem_mgmt_ctrl_op, 6); if (mem_mgmt_ctrl_op == 0) break; @@ -753,7 +754,8 @@ } } - dec_ref_pic_m->bit_size = nal_reader_get_pos (nr) - start_pos; + dec_ref_pic_m->bit_size = (nal_reader_get_pos (nr) - start_pos) - + (8 * (nal_reader_get_epb_count (nr) - start_epb)); return TRUE; @@ -2228,7 +2230,7 @@ gint pps_id; GstH264PPS *pps; GstH264SPS *sps; - guint start_pos; + guint start_pos, start_epb; memset (slice, 0, sizeof (*slice)); @@ -2304,6 +2306,7 @@ READ_UE_MAX (&nr, slice->idr_pic_id, G_MAXUINT16); start_pos = nal_reader_get_pos (&nr); + start_epb = nal_reader_get_epb_count (&nr); if (sps->pic_order_cnt_type == 0) { READ_UINT16 (&nr, slice->pic_order_cnt_lsb, @@ -2319,7 +2322,8 @@ READ_SE (&nr, slice->delta_pic_order_cnt[1]); } - slice->pic_order_cnt_bit_size = nal_reader_get_pos (&nr) - start_pos; + slice->pic_order_cnt_bit_size = (nal_reader_get_pos (&nr) - start_pos) - + (8 * (nal_reader_get_epb_count (&nr) - start_epb)); if (pps->redundant_pic_cnt_present_flag) READ_UE_MAX (&nr, slice->redundant_pic_cnt, G_MAXINT8); @@ -3138,14 +3142,14 @@ /* write payload type bytes */ while (payload_type_data >= 0xff) { WRITE_UINT8 (&nw, 0xff, 8); - payload_type_data -= -0xff; + payload_type_data -= 0xff; } WRITE_UINT8 (&nw, payload_type_data, 8); /* write payload size 
bytes */ while (payload_size_data >= 0xff) { WRITE_UINT8 (&nw, 0xff, 8); - payload_size_data -= -0xff; + payload_size_data -= 0xff; } WRITE_UINT8 (&nw, payload_size_data, 8); @@ -3169,7 +3173,7 @@ have_written_data = TRUE; break; case GST_H264_SEI_MASTERING_DISPLAY_COLOUR_VOLUME: - GST_DEBUG ("Wrtiting \"Mastering display colour volume\""); + GST_DEBUG ("Writing \"Mastering display colour volume\""); if (!gst_h264_write_sei_mastering_display_colour_volume (&nw, &msg->payload.mastering_display_colour_volume)) { GST_WARNING ("Failed to write \"Mastering display colour volume\"");
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gsth265parser.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gsth265parser.c
Changed
@@ -940,7 +940,7 @@ return TRUE; error: - GST_WARNING ("error parsing \"Prediction weight table\""); + GST_WARNING ("error parsing \"Reference picture list modifications\""); return FALSE; } @@ -2563,7 +2563,8 @@ if (sps->sample_adaptive_offset_enabled_flag) { READ_UINT8 (&nr, slice->sao_luma_flag, 1); - READ_UINT8 (&nr, slice->sao_chroma_flag, 1); + if (sps->chroma_array_type) + READ_UINT8 (&nr, slice->sao_chroma_flag, 1); } if (GST_H265_IS_B_SLICE (slice) || GST_H265_IS_P_SLICE (slice)) { @@ -3376,7 +3377,7 @@ static GstH265Profile get_extension_profile (H265ExtensionProfile * profiles, guint num, - GstH265ProfileTierLevel * ptl) + const GstH265ProfileTierLevel * ptl) { GstH265Profile result = GST_H265_PROFILE_INVALID; guint i; @@ -3479,7 +3480,7 @@ } static GstH265Profile -get_format_range_extension_profile (GstH265ProfileTierLevel * ptl) +get_format_range_extension_profile (const GstH265ProfileTierLevel * ptl) { /* Profile idc: GST_H265_PROFILE_IDC_FORMAT_RANGE_EXTENSION See Table A.2 for the definition of those formats */ @@ -3532,7 +3533,7 @@ } static GstH265Profile -get_3d_profile (GstH265ProfileTierLevel * ptl) +get_3d_profile (const GstH265ProfileTierLevel * ptl) { /* profile idc: GST_H265_PROFILE_IDC_3D_MAIN */ static H265ExtensionProfile profiles[] = { @@ -3544,7 +3545,7 @@ } static GstH265Profile -get_multiview_profile (GstH265ProfileTierLevel * ptl) +get_multiview_profile (const GstH265ProfileTierLevel * ptl) { static H265ExtensionProfile profiles[] = { {GST_H265_PROFILE_MULTIVIEW_MAIN, @@ -3555,7 +3556,7 @@ } static GstH265Profile -get_scalable_profile (GstH265ProfileTierLevel * ptl) +get_scalable_profile (const GstH265ProfileTierLevel * ptl) { static H265ExtensionProfile profiles[] = { {GST_H265_PROFILE_SCALABLE_MAIN, @@ -3568,7 +3569,7 @@ } static GstH265Profile -get_high_throughput_profile (GstH265ProfileTierLevel * ptl) +get_high_throughput_profile (const GstH265ProfileTierLevel * ptl) { static H265ExtensionProfile profiles[] = { 
{GST_H265_PROFILE_HIGH_THROUGHPUT_444, @@ -3585,7 +3586,8 @@ } static GstH265Profile -get_screen_content_coding_extensions_profile (GstH265ProfileTierLevel * ptl) +get_screen_content_coding_extensions_profile (const GstH265ProfileTierLevel * + ptl) { static H265ExtensionProfile profiles[] = { {GST_H265_PROFILE_SCREEN_EXTENDED_MAIN, @@ -3596,21 +3598,14 @@ 1, 1, 1, 1, 0, 0, 0, 0, 0, TRUE, 2}, {GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444_10, 1, 1, 1, 0, 0, 0, 0, 0, 0, TRUE, 3}, - /* identical to screen-extended-main-444 */ - {GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444, - 1, 1, 1, 1, 0, 0, 0, 0, 0, TRUE, 4}, - /* identical to screen-extended-main-444-10 */ - {GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_10, - 1, 1, 1, 0, 0, 0, 0, 0, 0, TRUE, 5}, - {GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14, - 1, 0, 0, 0, 0, 0, 0, 0, 0, TRUE, 6}, }; return get_extension_profile (profiles, G_N_ELEMENTS (profiles), ptl); } static GstH265Profile -get_scalable_format_range_extensions_profile (GstH265ProfileTierLevel * ptl) +get_scalable_format_range_extensions_profile (const GstH265ProfileTierLevel * + ptl) { static H265ExtensionProfile profiles[] = { {GST_H265_PROFILE_SCALABLE_MONOCHROME, @@ -3626,6 +3621,99 @@ return get_extension_profile (profiles, G_N_ELEMENTS (profiles), ptl); } +static GstH265Profile + get_screen_content_coding_extensions_high_throughput_profile + (const GstH265ProfileTierLevel * ptl) +{ + static H265ExtensionProfile profiles[] = { + {GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444, + 1, 1, 1, 1, 0, 0, 0, 0, 0, TRUE, 0}, + {GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_10, + 1, 1, 1, 0, 0, 0, 0, 0, 0, TRUE, 1}, + {GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14, + 1, 0, 0, 0, 0, 0, 0, 0, 0, TRUE, 2}, + }; + + return get_extension_profile (profiles, G_N_ELEMENTS (profiles), ptl); +} + +static inline void +append_profile (GstH265Profile profiles[GST_H265_PROFILE_MAX], guint * idx, + GstH265Profile profile) +{ + if 
(profile == GST_H265_PROFILE_INVALID) + return; + profiles[*idx] = profile; + (*idx)++; +} + +/* *INDENT-OFF* */ +struct h265_profiles_map +{ + GstH265ProfileIDC profile_idc; + GstH265Profile (*get_profile) (const GstH265ProfileTierLevel *); + GstH265Profile profile; +}; +/* *INDENT-ON* */ + +static const struct h265_profiles_map profiles_map[] = { + /* keep profile check in asc order */ + {GST_H265_PROFILE_IDC_MAIN, NULL, GST_H265_PROFILE_MAIN}, + {GST_H265_PROFILE_IDC_MAIN_10, NULL, GST_H265_PROFILE_MAIN_10}, + {GST_H265_PROFILE_IDC_MAIN_STILL_PICTURE, NULL, + GST_H265_PROFILE_MAIN_STILL_PICTURE}, + {GST_H265_PROFILE_IDC_FORMAT_RANGE_EXTENSION, + get_format_range_extension_profile, GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_HIGH_THROUGHPUT, get_high_throughput_profile, + GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_MULTIVIEW_MAIN, get_multiview_profile, + GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_SCALABLE_MAIN, get_scalable_profile, + GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_3D_MAIN, get_3d_profile, GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_SCREEN_CONTENT_CODING, + get_screen_content_coding_extensions_profile, + GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_SCALABLE_FORMAT_RANGE_EXTENSION, + get_scalable_format_range_extensions_profile, + GST_H265_PROFILE_INVALID}, + {GST_H265_PROFILE_IDC_HIGH_THROUGHPUT_SCREEN_CONTENT_CODING_EXTENSION, + get_screen_content_coding_extensions_high_throughput_profile, + GST_H265_PROFILE_INVALID}, +}; + +static void +gst_h265_profile_tier_level_get_profiles (const GstH265ProfileTierLevel * ptl, + GstH265Profile profiles[GST_H265_PROFILE_MAX], guint * len) +{ + guint i = 0, j; + + /* First add profile idc */ + for (j = 0; j < G_N_ELEMENTS (profiles_map); j++) { + if (ptl->profile_idc == profiles_map[j].profile_idc) { + if (profiles_map[j].get_profile) + append_profile (profiles, &i, profiles_map[j].get_profile (ptl)); + else + profiles[i++] = profiles_map[j].profile; + break; + } + } + + 
/* Later add compatibility flags */ + for (j = 0; j < G_N_ELEMENTS (profiles_map); j++) { + if (i > 0 && ptl->profile_idc == profiles_map[j].profile_idc) + continue; + if (ptl->profile_compatibility_flag[profiles_map[j].profile_idc]) { + if (profiles_map[j].get_profile) + append_profile (profiles, &i, profiles_map[j].get_profile (ptl)); + else + profiles[i++] = profiles_map[j].profile; + } + } + + *len = i; +} + /** * gst_h265_profile_tier_level_get_profile: * @ptl: a #GstH265ProfileTierLevel @@ -3636,50 +3724,15 @@ * Since: 1.14 */ GstH265Profile -gst_h265_profile_tier_level_get_profile (GstH265ProfileTierLevel * ptl) +gst_h265_profile_tier_level_get_profile (const GstH265ProfileTierLevel * ptl) { - if (ptl->profile_idc == GST_H265_PROFILE_IDC_MAIN - || ptl->profile_compatibility_flag[1]) - return GST_H265_PROFILE_MAIN; - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_MAIN_10 - || ptl->profile_compatibility_flag[2]) - return GST_H265_PROFILE_MAIN_10; - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_MAIN_STILL_PICTURE - || ptl->profile_compatibility_flag[3]) - return GST_H265_PROFILE_MAIN_STILL_PICTURE; - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_FORMAT_RANGE_EXTENSION - || ptl->profile_compatibility_flag[4]) - return get_format_range_extension_profile (ptl); - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_HIGH_THROUGHPUT - || ptl->profile_compatibility_flag[5]) - return get_high_throughput_profile (ptl); - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_MULTIVIEW_MAIN - || ptl->profile_compatibility_flag[6]) - return get_multiview_profile (ptl); - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_SCALABLE_MAIN - || ptl->profile_compatibility_flag[7]) - return get_scalable_profile (ptl); - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_3D_MAIN - || ptl->profile_compatibility_flag[8]) - return get_3d_profile (ptl); - - if (ptl->profile_idc == GST_H265_PROFILE_IDC_SCREEN_CONTENT_CODING - || ptl->profile_compatibility_flag[9] - || ptl->profile_idc == - 
GST_H265_PROFILE_IDC_HIGH_THROUGHPUT_SCREEN_CONTENT_CODING_EXTENSION - || ptl->profile_compatibility_flag[11]) - return get_screen_content_coding_extensions_profile (ptl); + guint len = 0; + GstH265Profile profiles[GST_H265_PROFILE_MAX] = { GST_H265_PROFILE_INVALID, }; + + gst_h265_profile_tier_level_get_profiles (ptl, profiles, &len); - if (ptl->profile_idc == GST_H265_PROFILE_IDC_SCALABLE_FORMAT_RANGE_EXTENSION - || ptl->profile_compatibility_flag[10]) - return get_scalable_format_range_extensions_profile (ptl); + if (len > 0) + return profiles[0]; return GST_H265_PROFILE_INVALID; } @@ -3975,14 +4028,14 @@ /* write payload type bytes */ while (payload_type_data >= 0xff) { WRITE_UINT8 (&nw, 0xff, 8); - payload_type_data -= -0xff; + payload_type_data -= 0xff; } WRITE_UINT8 (&nw, payload_type_data, 8); /* write payload size bytes */ while (payload_size_data >= 0xff) { WRITE_UINT8 (&nw, 0xff, 8); - payload_size_data -= -0xff; + payload_size_data -= 0xff; } WRITE_UINT8 (&nw, payload_size_data, 8); @@ -4293,3 +4346,119 @@ return gst_h265_parser_insert_sei_internal (parser, nal_length_size, TRUE, au, sei); } + +/** + * gst_h265_get_profile_from_sps: + * @sps: a #GstH265SPS + * + * Return the H265 profile from @sps. 
+ * + * Returns: a #GstH265Profile + * Since: 1.20 + */ +GstH265Profile +gst_h265_get_profile_from_sps (GstH265SPS * sps) +{ + GstH265Profile profiles[GST_H265_PROFILE_MAX] = { GST_H265_PROFILE_INVALID, }; + GstH265ProfileTierLevel tmp_ptl; + guint i, len = 0; + guint chroma_format_idc, bit_depth_luma, bit_depth_chroma; + + g_return_val_if_fail (sps != NULL, GST_H265_PROFILE_INVALID); + + tmp_ptl = sps->profile_tier_level; + chroma_format_idc = sps->chroma_format_idc; + bit_depth_luma = sps->bit_depth_luma_minus8 + 8; + bit_depth_chroma = sps->bit_depth_chroma_minus8 + 8; + + gst_h265_profile_tier_level_get_profiles (&sps->profile_tier_level, profiles, + &len); + + for (i = 0; i < len && i < G_N_ELEMENTS (profiles); i++) { + switch (profiles[i]) { + case GST_H265_PROFILE_INVALID: + break; + case GST_H265_PROFILE_MAIN: + case GST_H265_PROFILE_MAIN_STILL_PICTURE: + /* A.3.2 or A.3.5 */ + if (chroma_format_idc == 1 + && bit_depth_luma == 8 && bit_depth_chroma == 8) + return profiles[i]; + break; + case GST_H265_PROFILE_MAIN_10: + /* A.3.3 */ + if (chroma_format_idc == 1 + && bit_depth_luma >= 8 && bit_depth_luma <= 10 + && bit_depth_chroma >= 8 && bit_depth_chroma <= 10) + return profiles[i]; + break; + default: + return profiles[i]; + } + } + + /* Invalid profile: */ + /* Set the conformance indicators based on chroma_format_idc / bit_depth */ + switch (chroma_format_idc) { + case 0: + tmp_ptl.max_monochrome_constraint_flag = 1; + tmp_ptl.max_420chroma_constraint_flag = 1; + tmp_ptl.max_422chroma_constraint_flag = 1; + break; + + case 1: + tmp_ptl.max_monochrome_constraint_flag = 0; + tmp_ptl.max_420chroma_constraint_flag = 1; + tmp_ptl.max_422chroma_constraint_flag = 1; + break; + + case 2: + tmp_ptl.max_monochrome_constraint_flag = 0; + tmp_ptl.max_420chroma_constraint_flag = 0; + tmp_ptl.max_422chroma_constraint_flag = 1; + break; + + case 3: + tmp_ptl.max_monochrome_constraint_flag = 0; + tmp_ptl.max_420chroma_constraint_flag = 0; + 
tmp_ptl.max_422chroma_constraint_flag = 0; + break; + + default: + g_assert_not_reached (); + break; + } + + tmp_ptl.max_8bit_constraint_flag = 1; + tmp_ptl.max_10bit_constraint_flag = 1; + tmp_ptl.max_12bit_constraint_flag = 1; + tmp_ptl.max_14bit_constraint_flag = 1; + + if (bit_depth_luma > 8 || bit_depth_chroma > 8) + tmp_ptl.max_8bit_constraint_flag = 0; + + if (bit_depth_luma > 10 || bit_depth_chroma > 10) + tmp_ptl.max_10bit_constraint_flag = 0; + + if (bit_depth_luma > 12 || bit_depth_chroma > 12) + tmp_ptl.max_12bit_constraint_flag = 0; + + if (tmp_ptl.profile_idc == GST_H265_PROFILE_IDC_HIGH_THROUGHPUT + || tmp_ptl.profile_idc == GST_H265_PROFILE_IDC_SCREEN_CONTENT_CODING + || tmp_ptl.profile_idc == + GST_H265_PROFILE_IDC_SCALABLE_FORMAT_RANGE_EXTENSION + || tmp_ptl.profile_idc == + GST_H265_PROFILE_IDC_HIGH_THROUGHPUT_SCREEN_CONTENT_CODING_EXTENSION + || tmp_ptl.profile_compatibility_flag[5] + || tmp_ptl.profile_compatibility_flag[9] + || tmp_ptl.profile_compatibility_flag[10] + || tmp_ptl.profile_compatibility_flag[11]) { + if (bit_depth_luma > 14 || bit_depth_chroma > 14) + tmp_ptl.max_14bit_constraint_flag = 0; + } else { + tmp_ptl.max_14bit_constraint_flag = 0; + } + + /* first profile of the synthetic ptl */ + return gst_h265_profile_tier_level_get_profile (&tmp_ptl); +}
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gsth265parser.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gsth265parser.h
Changed
@@ -1777,7 +1777,7 @@ gst_h265_quant_matrix_8x8_get_raster_from_uprightdiagonal GST_CODEC_PARSERS_API -GstH265Profile gst_h265_profile_tier_level_get_profile (GstH265ProfileTierLevel * ptl); +GstH265Profile gst_h265_profile_tier_level_get_profile (const GstH265ProfileTierLevel * ptl); GST_CODEC_PARSERS_API const gchar * gst_h265_profile_to_string (GstH265Profile profile); @@ -1808,5 +1808,8 @@ GstBuffer * au, GstMemory * sei); +GST_CODEC_PARSERS_API +GstH265Profile gst_h265_get_profile_from_sps (GstH265SPS * sps); + G_END_DECLS #endif
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gstjpeg2000sampling.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gstjpeg2000sampling.c
Changed
@@ -42,6 +42,7 @@ "YCbCr-4:1:0", "GRAYSCALE", "YCbCrA-4:4:4:4", + "YCbCr-4:1:1", }; /* convert string to GstJPEG2000Sampling enum */ @@ -86,8 +87,9 @@ return sampling == GST_JPEG2000_SAMPLING_YBRA4444_EXT || sampling == GST_JPEG2000_SAMPLING_YBR444 || sampling == GST_JPEG2000_SAMPLING_YBR422 || - sampling == GST_JPEG2000_SAMPLING_YBR420 - || sampling == GST_JPEG2000_SAMPLING_YBR410; + sampling == GST_JPEG2000_SAMPLING_YBR420 || + sampling == GST_JPEG2000_SAMPLING_YBR411 || + sampling == GST_JPEG2000_SAMPLING_YBR410; } /* check if @sampling is in GRAYSCALE color space */
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gstjpeg2000sampling.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gstjpeg2000sampling.h
Changed
@@ -30,17 +30,26 @@ * Note: sampling extensions that are not listed in the RFC are signified by an _EXT at the end of the enum * * @GST_JPEG2000_SAMPLING_NONE: no sampling - * @GST_JPEG2000_SAMPLING_RGB: standard Red, Green, Blue color space. - * @GST_JPEG2000_SAMPLING_BGR: standard Blue, Green, Red color space. - * @GST_JPEG2000_SAMPLING_RGBA: standard Red, Green, Blue, Alpha color space. - * @GST_JPEG2000_SAMPLING_BGRA: standard Blue, Green, Red, Alpha color space. - * @GST_JPEG2000_SAMPLING_YCbCr-4:4:4: standard YCbCr color space; no subsampling. - * @GST_JPEG2000_SAMPLING_YCbCr-4:2:2: standard YCbCr color space; Cb and Cr are subsampled horizontally by 1/2. - * @GST_JPEG2000_SAMPLING_YCbCr-4:2:0: standard YCbCr color space; Cb and Cr are subsampled horizontally and vertically by 1/2. - * @GST_JPEG2000_SAMPLING_YCbCr-4:1:1: standard YCbCr color space; Cb and Cr are subsampled vertically by 1/4. + * @GST_JPEG2000_SAMPLING_RGB: standard Red, Green, Blue color space. + * @GST_JPEG2000_SAMPLING_BGR: standard Blue, Green, Red color space. + * @GST_JPEG2000_SAMPLING_RGBA: standard Red, Green, Blue, Alpha color space. + * @GST_JPEG2000_SAMPLING_BGRA: standard Blue, Green, Red, Alpha color space. + * @GST_JPEG2000_SAMPLING_YBR444: standard YCbCr color space; no subsampling. + * @GST_JPEG2000_SAMPLING_YBR422: standard YCbCr color space; Cb and Cr are subsampled horizontally by 1/2. + * @GST_JPEG2000_SAMPLING_YBR420: standard YCbCr color space; Cb and Cr are subsampled horizontally and vertically by 1/2. + * @GST_JPEG2000_SAMPLING_YBR411: standard YCbCr color space; Cb and Cr are subsampled vertically by 1/4 (Since: 1.20). + * @GST_JPEG2000_SAMPLING_YBR410: standard YCbCr color space; Cb and Cr are subsampled vertically by 1/4 alternating the Cb and Cr component. * @GST_JPEG2000_SAMPLING_GRAYSCALE: basically, a single component image of just multilevels of grey. 
* @GST_JPEG2000_SAMPLING_YBRA4444_EXT: standard YCbCr color space, alpha channel, no subsampling, */ + +/** + * GST_JPEG2000_SAMPLING_YBR411: + * + * standard YCbCr color space; Cb and Cr are subsampled vertically by 1/4 + * + * Since: 1.20 + */ typedef enum { GST_JPEG2000_SAMPLING_NONE, @@ -53,11 +62,12 @@ GST_JPEG2000_SAMPLING_YBR420, GST_JPEG2000_SAMPLING_YBR410, GST_JPEG2000_SAMPLING_GRAYSCALE, - GST_JPEG2000_SAMPLING_YBRA4444_EXT + GST_JPEG2000_SAMPLING_YBRA4444_EXT, + GST_JPEG2000_SAMPLING_YBR411 } GstJPEG2000Sampling; /* GST_JPEG2000_SAMPLING_LIST: sampling strings in list form, for use in caps */ -#define GST_JPEG2000_SAMPLING_LIST "sampling = (string) {\"RGB\", \"BGR\", \"RGBA\", \"BGRA\", \"YCbCr-4:4:4\", \"YCbCr-4:2:2\", \"YCbCr-4:2:0\", \"YCbCr-4:1:1\", \"GRAYSCALE\" , \"YCbCrA-4:4:4:4\"}" +#define GST_JPEG2000_SAMPLING_LIST "sampling = (string) {\"RGB\", \"BGR\", \"RGBA\", \"BGRA\", \"YCbCr-4:4:4\", \"YCbCr-4:2:2\", \"YCbCr-4:2:0\", \"YCbCr-4:1:1\", \"YCbCr-4:1:0\", \"GRAYSCALE\" , \"YCbCrA-4:4:4:4\"}" GST_CODEC_PARSERS_API const gchar *gst_jpeg2000_sampling_to_string (GstJPEG2000Sampling sampling);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gstvp8parser.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gstvp8parser.c
Changed
@@ -535,6 +535,8 @@ g_return_val_if_fail (frame_hdr != NULL, GST_VP8_PARSER_ERROR); g_return_val_if_fail (parser != NULL, GST_VP8_PARSER_ERROR); + memset (frame_hdr, 0, sizeof (GstVp8FrameHdr)); + /* Uncompressed Data Chunk */ gst_byte_reader_init (&br, data, size);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/gstvp9parser.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/gstvp9parser.h
Changed
@@ -48,7 +48,6 @@ #define GST_VP9_FRAME_CONTEXTS_LOG2 2 -#define GST_VP9_MAX_LOOP_FILTER 63 #define GST_VP9_MAX_SHARPNESS 7 #define GST_VP9_MAX_REF_LF_DELTAS 4 @@ -461,7 +460,7 @@ */ struct _GstVp9Segmentation { - guint8 filter_level[4][2]; + guint8 filter_level[GST_VP9_MAX_REF_LF_DELTAS][GST_VP9_MAX_MODE_LF_DELTAS]; gint16 luma_ac_quant_scale; gint16 luma_dc_quant_scale; gint16 chroma_ac_quant_scale;
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecparsers/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecparsers/meson.build
Changed
@@ -37,6 +37,7 @@ cp_args = [ '-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_CODEC_PARSERS', + '-DG_LOG_DOMAIN="GStreamer-CodecParsers"', '-Dvp8_norm=gst_codecparsers_vp8_norm', '-Dvp8dx_start_decode=gst_codecparsers_vp8dx_start_decode', '-Dvp8dx_bool_decoder_fill=gst_codecparsers_vp8dx_bool_decoder_fill', @@ -53,6 +54,18 @@ dependencies : [gstbase_dep, libm], ) +pkg_name = 'gstreamer-codecparsers-1.0' +pkgconfig.generate(gstcodecparsers, + libraries : [gst_dep, gstbase_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'Bitstream parsers for GStreamer elements', +) + gstcodecparsers_dep = declare_dependency(link_with : gstcodecparsers, include_directories : [libsinc], dependencies : [gstbase_dep]) + +libraries += [[pkg_name, {'lib': gstcodecparsers}]] +meson.override_dependency(pkg_name, gstcodecparsers_dep)
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstav1decoder.c
Added
@@ -0,0 +1,657 @@ +/* GStreamer + * Copyright (C) 2020 He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstav1decoder + * @title: Gstav1Decoder + * @short_description: Base class to implement stateless AV1 decoders + * @sources: + * - gstav1picture.h + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstav1decoder.h" + +GST_DEBUG_CATEGORY (gst_av1_decoder_debug); +#define GST_CAT_DEFAULT gst_av1_decoder_debug + +struct _GstAV1DecoderPrivate +{ + gint max_width; + gint max_height; + GstAV1Profile profile; + GstAV1Parser *parser; + GstAV1Dpb *dpb; + GstAV1Picture *current_picture; + GstVideoCodecFrame *current_frame; +}; + +#define parent_class gst_av1_decoder_parent_class +G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstAV1Decoder, gst_av1_decoder, + GST_TYPE_VIDEO_DECODER, + G_ADD_PRIVATE (GstAV1Decoder); + GST_DEBUG_CATEGORY_INIT (gst_av1_decoder_debug, "av1decoder", 0, + "AV1 Video Decoder")); + +static gint +_floor_log2 (guint32 x) +{ + gint s = 0; + + while (x != 0) { + x = x >> 1; + s++; + } + return s - 1; +} + +static gboolean gst_av1_decoder_start (GstVideoDecoder * decoder); +static gboolean gst_av1_decoder_stop (GstVideoDecoder * decoder); +static gboolean gst_av1_decoder_set_format 
(GstVideoDecoder * decoder, + GstVideoCodecState * state); +static GstFlowReturn gst_av1_decoder_finish (GstVideoDecoder * decoder); +static gboolean gst_av1_decoder_flush (GstVideoDecoder * decoder); +static GstFlowReturn gst_av1_decoder_drain (GstVideoDecoder * decoder); +static GstFlowReturn gst_av1_decoder_handle_frame (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame); + +static GstAV1Picture *gst_av1_decoder_duplicate_picture_default (GstAV1Decoder * + decoder, GstAV1Picture * picture); + +static void +gst_av1_decoder_class_init (GstAV1DecoderClass * klass) +{ + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + + decoder_class->start = GST_DEBUG_FUNCPTR (gst_av1_decoder_start); + decoder_class->stop = GST_DEBUG_FUNCPTR (gst_av1_decoder_stop); + decoder_class->set_format = GST_DEBUG_FUNCPTR (gst_av1_decoder_set_format); + decoder_class->finish = GST_DEBUG_FUNCPTR (gst_av1_decoder_finish); + decoder_class->flush = GST_DEBUG_FUNCPTR (gst_av1_decoder_flush); + decoder_class->drain = GST_DEBUG_FUNCPTR (gst_av1_decoder_drain); + decoder_class->handle_frame = + GST_DEBUG_FUNCPTR (gst_av1_decoder_handle_frame); + + klass->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_av1_decoder_duplicate_picture_default); +} + +static void +gst_av1_decoder_init (GstAV1Decoder * self) +{ + gst_video_decoder_set_packetized (GST_VIDEO_DECODER (self), TRUE); + + self->priv = gst_av1_decoder_get_instance_private (self); +} + +static void +gst_av1_decoder_reset (GstAV1Decoder * self) +{ + GstAV1DecoderPrivate *priv = self->priv; + + priv->max_width = 0; + priv->max_height = 0; + gst_av1_picture_clear (&priv->current_picture); + priv->current_frame = NULL; + priv->profile = GST_AV1_PROFILE_UNDEFINED; + + if (priv->dpb) + gst_av1_dpb_clear (priv->dpb); + if (priv->parser) + gst_av1_parser_reset (priv->parser, FALSE); +} + +static gboolean +gst_av1_decoder_start (GstVideoDecoder * decoder) +{ + GstAV1Decoder *self = GST_AV1_DECODER (decoder); + 
GstAV1DecoderPrivate *priv = self->priv; + + priv->parser = gst_av1_parser_new (); + priv->dpb = gst_av1_dpb_new (); + + gst_av1_decoder_reset (self); + + return TRUE; +} + +static gboolean +gst_av1_decoder_stop (GstVideoDecoder * decoder) +{ + GstAV1Decoder *self = GST_AV1_DECODER (decoder); + GstAV1DecoderPrivate *priv = self->priv; + + gst_av1_decoder_reset (self); + + g_clear_pointer (&self->input_state, gst_video_codec_state_unref); + g_clear_pointer (&priv->parser, gst_av1_parser_free); + g_clear_pointer (&priv->dpb, gst_av1_dpb_free); + + return TRUE; +} + +static gboolean +gst_av1_decoder_set_format (GstVideoDecoder * decoder, + GstVideoCodecState * state) +{ + GstAV1Decoder *self = GST_AV1_DECODER (decoder); + GstAV1DecoderPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (decoder, "Set format"); + + if (self->input_state) + gst_video_codec_state_unref (self->input_state); + + self->input_state = gst_video_codec_state_ref (state); + + priv->max_width = GST_VIDEO_INFO_WIDTH (&state->info); + priv->max_height = GST_VIDEO_INFO_HEIGHT (&state->info); + + return TRUE; +} + +static GstFlowReturn +gst_av1_decoder_finish (GstVideoDecoder * decoder) +{ + GST_DEBUG_OBJECT (decoder, "finish"); + + gst_av1_decoder_reset (GST_AV1_DECODER (decoder)); + + return GST_FLOW_OK; +} + +static gboolean +gst_av1_decoder_flush (GstVideoDecoder * decoder) +{ + GST_DEBUG_OBJECT (decoder, "flush"); + + gst_av1_decoder_reset (GST_AV1_DECODER (decoder)); + + return TRUE; +} + +static GstFlowReturn +gst_av1_decoder_drain (GstVideoDecoder * decoder) +{ + GST_DEBUG_OBJECT (decoder, "drain"); + + gst_av1_decoder_reset (GST_AV1_DECODER (decoder)); + + return GST_FLOW_OK; +} + +static GstAV1Picture * +gst_av1_decoder_duplicate_picture_default (GstAV1Decoder * decoder, + GstAV1Picture * picture) +{ + GstAV1Picture *new_picture; + + new_picture = gst_av1_picture_new (); + + return new_picture; +} + +static const gchar * +get_obu_name (GstAV1OBUType type) +{ + switch (type) { + case 
GST_AV1_OBU_SEQUENCE_HEADER: + return "sequence header"; + case GST_AV1_OBU_TEMPORAL_DELIMITER: + return "temporal delimiter"; + case GST_AV1_OBU_FRAME_HEADER: + return "frame header"; + case GST_AV1_OBU_TILE_GROUP: + return "tile group"; + case GST_AV1_OBU_METADATA: + return "metadata"; + case GST_AV1_OBU_FRAME: + return "frame"; + case GST_AV1_OBU_REDUNDANT_FRAME_HEADER: + return "redundant frame header"; + case GST_AV1_OBU_TILE_LIST: + return "tile list"; + case GST_AV1_OBU_PADDING: + return "padding"; + default: + return "unknown"; + } + + return NULL; +} + +static const gchar * +gst_av1_decoder_profile_to_string (GstAV1Profile profile) +{ + switch (profile) { + case GST_AV1_PROFILE_0: + return "0"; + case GST_AV1_PROFILE_1: + return "1"; + case GST_AV1_PROFILE_2: + return "2"; + default: + break; + } + + return NULL; +} + +static GstFlowReturn +gst_av1_decoder_process_sequence (GstAV1Decoder * self, GstAV1OBU * obu) +{ + GstAV1ParserResult res; + GstAV1DecoderPrivate *priv = self->priv; + GstAV1SequenceHeaderOBU seq_header; + GstAV1SequenceHeaderOBU old_seq_header = { 0, }; + GstAV1DecoderClass *klass = GST_AV1_DECODER_GET_CLASS (self); + GstFlowReturn ret = GST_FLOW_OK; + + if (priv->parser->seq_header) + old_seq_header = *priv->parser->seq_header; + + res = gst_av1_parser_parse_sequence_header_obu (priv->parser, + obu, &seq_header); + if (res != GST_AV1_PARSER_OK) { + GST_WARNING_OBJECT (self, "Parsing sequence failed."); + return GST_FLOW_ERROR; + } + + if (!memcmp (&old_seq_header, &seq_header, sizeof (GstAV1SequenceHeaderOBU))) { + GST_DEBUG_OBJECT (self, "Get same sequence header."); + return GST_FLOW_OK; + } + + g_assert (klass->new_sequence); + + GST_DEBUG_OBJECT (self, + "Sequence updated, profile %s -> %s, max resolution: %dx%d -> %dx%d", + gst_av1_decoder_profile_to_string (priv->profile), + gst_av1_decoder_profile_to_string (seq_header.seq_profile), + priv->max_width, priv->max_height, seq_header.max_frame_width_minus_1 + 1, + 
seq_header.max_frame_height_minus_1 + 1); + + ret = klass->new_sequence (self, &seq_header); + if (ret != GST_FLOW_OK) { + GST_ERROR_OBJECT (self, "subclass does not want accept new sequence"); + return ret; + } + + priv->profile = seq_header.seq_profile; + priv->max_width = seq_header.max_frame_width_minus_1 + 1; + priv->max_height = seq_header.max_frame_height_minus_1 + 1; + gst_av1_dpb_clear (priv->dpb); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_av1_decoder_decode_tile_group (GstAV1Decoder * self, + GstAV1TileGroupOBU * tile_group, GstAV1OBU * obu) +{ + GstAV1DecoderPrivate *priv = self->priv; + GstAV1DecoderClass *klass = GST_AV1_DECODER_GET_CLASS (self); + GstAV1Picture *picture = priv->current_picture; + GstAV1Tile tile; + GstFlowReturn ret = GST_FLOW_OK; + + if (!picture) { + GST_ERROR_OBJECT (self, "No picture has created for current frame"); + return GST_FLOW_ERROR; + } + + if (picture->frame_hdr.show_existing_frame) { + GST_ERROR_OBJECT (self, "Current picture is showing the existing frame."); + return GST_FLOW_ERROR; + } + + tile.obu = *obu; + tile.tile_group = *tile_group; + + g_assert (klass->decode_tile); + ret = klass->decode_tile (self, picture, &tile); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Decode tile error"); + return ret; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_av1_decoder_decode_frame_header (GstAV1Decoder * self, + GstAV1FrameHeaderOBU * frame_header) +{ + GstAV1DecoderPrivate *priv = self->priv; + GstAV1DecoderClass *klass = GST_AV1_DECODER_GET_CLASS (self); + GstAV1Picture *picture = NULL; + GstFlowReturn ret = GST_FLOW_OK; + + g_assert (priv->current_frame); + + if (priv->current_picture != NULL) { + GST_ERROR_OBJECT (self, "Already have picture for current frame"); + return GST_FLOW_ERROR; + } + + if (frame_header->show_existing_frame) { + GstAV1Picture *ref_picture; + + ref_picture = priv->dpb->pic_list[frame_header->frame_to_show_map_idx]; + if (!ref_picture) { + GST_WARNING_OBJECT 
(self, "Failed to find the frame index %d to show.", + frame_header->frame_to_show_map_idx); + return GST_FLOW_ERROR; + } + + if (gst_av1_parser_reference_frame_loading (priv->parser, + &ref_picture->frame_hdr) != GST_AV1_PARSER_OK) { + GST_WARNING_OBJECT (self, "load the reference frame failed"); + return GST_FLOW_ERROR; + } + + /* FIXME: duplicate picture might be optional feature like that of VP9 + * decoder baseclass */ + g_assert (klass->duplicate_picture); + picture = klass->duplicate_picture (self, ref_picture); + if (!picture) { + GST_ERROR_OBJECT (self, "subclass didn't provide duplicated picture"); + return GST_FLOW_ERROR; + } + + picture->system_frame_number = priv->current_frame->system_frame_number; + picture->frame_hdr = *frame_header; + picture->frame_hdr.render_width = ref_picture->frame_hdr.render_width; + picture->frame_hdr.render_height = ref_picture->frame_hdr.render_height; + priv->current_picture = picture; + } else { + picture = gst_av1_picture_new (); + picture->frame_hdr = *frame_header; + picture->display_frame_id = frame_header->display_frame_id; + picture->show_frame = frame_header->show_frame; + picture->showable_frame = frame_header->showable_frame; + picture->apply_grain = frame_header->film_grain_params.apply_grain; + picture->system_frame_number = priv->current_frame->system_frame_number; + + if (!frame_header->show_frame && !frame_header->showable_frame) + GST_VIDEO_CODEC_FRAME_FLAG_SET (priv->current_frame, + GST_VIDEO_CODEC_FRAME_FLAG_DECODE_ONLY); + + if (klass->new_picture) { + ret = klass->new_picture (self, priv->current_frame, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "new picture error"); + return ret; + } + } + priv->current_picture = picture; + + if (klass->start_picture) { + ret = klass->start_picture (self, picture, priv->dpb); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "start picture error"); + return ret; + } + } + } + + g_assert (priv->current_picture != NULL); + + return 
GST_FLOW_OK; +} + +static GstFlowReturn +gst_av1_decoder_process_frame_header (GstAV1Decoder * self, GstAV1OBU * obu) +{ + GstAV1ParserResult res; + GstAV1DecoderPrivate *priv = self->priv; + GstAV1FrameHeaderOBU frame_header; + + res = gst_av1_parser_parse_frame_header_obu (priv->parser, obu, + &frame_header); + if (res != GST_AV1_PARSER_OK) { + GST_WARNING_OBJECT (self, "Parsing frame header failed."); + return GST_FLOW_ERROR; + } + + return gst_av1_decoder_decode_frame_header (self, &frame_header); +} + +static GstFlowReturn +gst_av1_decoder_process_tile_group (GstAV1Decoder * self, GstAV1OBU * obu) +{ + GstAV1ParserResult res; + GstAV1DecoderPrivate *priv = self->priv; + GstAV1TileGroupOBU tile_group; + + res = gst_av1_parser_parse_tile_group_obu (priv->parser, obu, &tile_group); + if (res != GST_AV1_PARSER_OK) { + GST_WARNING_OBJECT (self, "Parsing tile group failed."); + return GST_FLOW_ERROR; + } + + return gst_av1_decoder_decode_tile_group (self, &tile_group, obu); +} + +static GstFlowReturn +gst_av1_decoder_process_frame (GstAV1Decoder * self, GstAV1OBU * obu) +{ + GstAV1ParserResult res; + GstAV1DecoderPrivate *priv = self->priv; + GstAV1FrameOBU frame; + GstFlowReturn ret = GST_FLOW_OK; + + res = gst_av1_parser_parse_frame_obu (priv->parser, obu, &frame); + if (res != GST_AV1_PARSER_OK) { + GST_WARNING_OBJECT (self, "Parsing frame failed."); + return GST_FLOW_ERROR; + } + + ret = gst_av1_decoder_decode_frame_header (self, &frame.frame_header); + if (ret != GST_FLOW_OK) + return ret; + + return gst_av1_decoder_decode_tile_group (self, &frame.tile_group, obu); +} + +static GstFlowReturn +gst_av1_decoder_temporal_delimiter (GstAV1Decoder * self, GstAV1OBU * obu) +{ + GstAV1DecoderPrivate *priv = self->priv; + + if (gst_av1_parser_parse_temporal_delimiter_obu (priv->parser, obu) == + GST_AV1_PARSER_OK) { + return GST_FLOW_OK; + } + + return GST_FLOW_ERROR; +} + +static GstFlowReturn +gst_av1_decoder_decode_one_obu (GstAV1Decoder * self, GstAV1OBU * obu) +{ + 
GstFlowReturn ret = GST_FLOW_OK; + + GST_LOG_OBJECT (self, "Decode obu %s", get_obu_name (obu->obu_type)); + switch (obu->obu_type) { + case GST_AV1_OBU_SEQUENCE_HEADER: + ret = gst_av1_decoder_process_sequence (self, obu); + break; + case GST_AV1_OBU_FRAME_HEADER: + ret = gst_av1_decoder_process_frame_header (self, obu); + break; + case GST_AV1_OBU_FRAME: + ret = gst_av1_decoder_process_frame (self, obu); + break; + case GST_AV1_OBU_TILE_GROUP: + ret = gst_av1_decoder_process_tile_group (self, obu); + break; + case GST_AV1_OBU_TEMPORAL_DELIMITER: + ret = gst_av1_decoder_temporal_delimiter (self, obu); + break; + /* TODO: may need to handled. */ + case GST_AV1_OBU_METADATA: + case GST_AV1_OBU_REDUNDANT_FRAME_HEADER: + case GST_AV1_OBU_TILE_LIST: + case GST_AV1_OBU_PADDING: + break; + default: + GST_WARNING_OBJECT (self, "an unrecognized obu type %d", obu->obu_type); + break; + } + + if (ret != GST_FLOW_OK) + GST_WARNING_OBJECT (self, "Failed to handle %s OBU", + get_obu_name (obu->obu_type)); + + return ret; +} + +static void +gst_av1_decoder_update_state (GstAV1Decoder * self) +{ + GstAV1DecoderPrivate *priv = self->priv; + GstAV1Picture *picture = priv->current_picture; + GstAV1ParserResult res; + GstAV1FrameHeaderOBU *fh; + + g_assert (picture); + fh = &picture->frame_hdr; + + /* This is a show_existing_frame case, only update key frame */ + if (fh->show_existing_frame && fh->frame_type != GST_AV1_KEY_FRAME) + return; + + res = gst_av1_parser_reference_frame_update (priv->parser, fh); + if (res != GST_AV1_PARSER_OK) { + GST_ERROR_OBJECT (self, "failed to update the reference."); + return; + } + + gst_av1_dpb_add (priv->dpb, gst_av1_picture_ref (picture)); +} + +static GstFlowReturn +gst_av1_decoder_handle_frame (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame) +{ + GstAV1Decoder *self = GST_AV1_DECODER (decoder); + GstAV1DecoderPrivate *priv = self->priv; + GstAV1DecoderClass *klass = GST_AV1_DECODER_GET_CLASS (self); + GstBuffer *in_buf = 
frame->input_buffer; + GstMapInfo map; + GstFlowReturn ret = GST_FLOW_OK; + guint32 total_consumed, consumed; + GstAV1OBU obu; + GstAV1ParserResult res; + + GST_LOG_OBJECT (self, "handle frame id %d, buf %" GST_PTR_FORMAT, + frame->system_frame_number, in_buf); + + priv->current_frame = frame; + g_assert (!priv->current_picture); + + if (!gst_buffer_map (in_buf, &map, GST_MAP_READ)) { + priv->current_frame = NULL; + GST_ERROR_OBJECT (self, "can not map input buffer"); + + return GST_FLOW_ERROR; + } + + total_consumed = 0; + while (total_consumed < map.size) { + res = gst_av1_parser_identify_one_obu (priv->parser, + map.data + total_consumed, map.size, &obu, &consumed); + if (res != GST_AV1_PARSER_OK) { + ret = GST_FLOW_ERROR; + goto out; + } + + ret = gst_av1_decoder_decode_one_obu (self, &obu); + if (ret != GST_FLOW_OK) { + goto out; + } + + total_consumed += consumed; + } + + if (!priv->current_picture) { + GST_ERROR_OBJECT (self, "No valid picture after exhaust input frame"); + ret = GST_FLOW_ERROR; + goto out; + } + + if (!priv->current_picture->frame_hdr.show_existing_frame) { + if (klass->end_picture) { + ret = klass->end_picture (self, priv->current_picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "end picture error"); + goto out; + } + } + } + + gst_av1_decoder_update_state (self); + +out: + gst_buffer_unmap (in_buf, &map); + + if (ret == GST_FLOW_OK) { + if (priv->current_picture->frame_hdr.show_frame || + priv->current_picture->frame_hdr.show_existing_frame) { + /* Only output one frame with the highest spatial id from each TU + * when there are multiple spatial layers. 
+ */ + if (priv->parser->state.operating_point_idc && + obu.header.obu_spatial_id < + _floor_log2 (priv->parser->state.operating_point_idc >> 8)) { + gst_av1_picture_unref (priv->current_picture); + gst_video_decoder_release_frame (decoder, frame); + } else { + g_assert (klass->output_picture); + /* transfer ownership of frame and picture */ + ret = klass->output_picture (self, frame, priv->current_picture); + } + } else { + GST_LOG_OBJECT (self, "Decode only picture %p", priv->current_picture); + GST_VIDEO_CODEC_FRAME_SET_DECODE_ONLY (frame); + gst_av1_picture_unref (priv->current_picture); + ret = gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); + } + } else { + if (priv->current_picture) + gst_av1_picture_unref (priv->current_picture); + + gst_video_decoder_drop_frame (decoder, frame); + } + + priv->current_picture = NULL; + priv->current_frame = NULL; + + if (ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (decoder, 1, STREAM, DECODE, + ("Failed to handle the frame %d", frame->system_frame_number), + NULL, ret); + } + + return ret; +}
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstav1decoder.h
Added
@@ -0,0 +1,173 @@ +/* GStreamer + * Copyright (C) 2020 He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_AV1_DECODER_H__ +#define __GST_AV1_DECODER_H__ + +#include <gst/codecs/codecs-prelude.h> + +#include <gst/video/video.h> +#include <gst/codecparsers/gstav1parser.h> +#include <gst/codecs/gstav1picture.h> + +G_BEGIN_DECLS + +#define GST_TYPE_AV1_DECODER (gst_av1_decoder_get_type()) +#define GST_AV1_DECODER(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_AV1_DECODER,GstAV1Decoder)) +#define GST_AV1_DECODER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_AV1_DECODER,GstAV1DecoderClass)) +#define GST_AV1_DECODER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_AV1_DECODER,GstAV1DecoderClass)) +#define GST_IS_AV1_DECODER(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_AV1_DECODER)) +#define GST_IS_AV1_DECODER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_AV1_DECODER)) + +typedef struct _GstAV1Decoder GstAV1Decoder; +typedef struct _GstAV1DecoderClass GstAV1DecoderClass; +typedef struct _GstAV1DecoderPrivate GstAV1DecoderPrivate; + +/** + * GstAV1Decoder: + * + * The opaque #GstAV1Decoder data structure. 
+ * + * Since: 1.20 + */ +struct _GstAV1Decoder +{ + /*< private >*/ + GstVideoDecoder parent; + + /*< protected >*/ + GstVideoCodecState * input_state; + + /*< private >*/ + GstAV1DecoderPrivate *priv; + gpointer padding[GST_PADDING_LARGE]; +}; + +/** + * GstAV1DecoderClass: + */ +struct _GstAV1DecoderClass +{ + GstVideoDecoderClass parent_class; + + /** + * GstAV1DecoderClass::new_sequence: + * @decoder: a #GstAV1Decoder + * @seq_hdr: a #GstAV1SequenceHeaderOBU + * + * Notifies subclass of SPS update + * + * Since: 1.20 + */ + GstFlowReturn (*new_sequence) (GstAV1Decoder * decoder, + const GstAV1SequenceHeaderOBU * seq_hdr); + /** + * GstAV1DecoderClass::new_picture: + * @decoder: a #GstAV1Decoder + * @frame: (transfer none): a #GstVideoCodecFrame + * @picture: (transfer none): a #GstAV1Picture + * + * Optional. Called whenever new #GstAV1Picture is created. + * Subclass can set implementation specific user data + * on the #GstAV1Picture via gst_av1_picture_set_user_data() + * + * Since: 1.20 + */ + GstFlowReturn (*new_picture) (GstAV1Decoder * decoder, + GstVideoCodecFrame * frame, + GstAV1Picture * picture); + /** + * GstAV1DecoderClass::duplicate_picture: + * @decoder: a #GstAV1Decoder + * @picture: (transfer none): a #GstAV1Picture + * + * Optional. Called when need to duplicate an existing + * #GstAV1Picture. + * + * Since: 1.20 + */ + GstAV1Picture * (*duplicate_picture) (GstAV1Decoder * decoder, + GstAV1Picture * picture); + /** + * GstAV1DecoderClass::start_picture: + * @decoder: a #GstAV1Decoder + * @picture: (transfer none): a #GstAV1Picture + * @dpb: (transfer none): a #GstAV1Dpb + * + * Optional. 
Called once per #GstAV1Picture to notify the subclass to prepare the + decoding process for the #GstAV1Picture + * + * Since: 1.20 + */ + GstFlowReturn (*start_picture) (GstAV1Decoder * decoder, + GstAV1Picture * picture, + GstAV1Dpb * dpb); + /** + * GstAV1DecoderClass::decode_tile: + * @decoder: a #GstAV1Decoder + * @picture: (transfer none): a #GstAV1Picture + * @tile: (transfer none): a #GstAV1Tile + * + * Provides the tile data with the tile group header and the required raw + * bitstream for the subclass to decode. + * + * Since: 1.20 + */ + GstFlowReturn (*decode_tile) (GstAV1Decoder * decoder, + GstAV1Picture * picture, + GstAV1Tile * tile); + /** + * GstAV1DecoderClass::end_picture: + * @decoder: a #GstAV1Decoder + * @picture: (transfer none): a #GstAV1Picture + * + * Optional. Called once per #GstAV1Picture to notify the subclass to finish + * the decoding process for the #GstAV1Picture + * + * Since: 1.20 + */ + GstFlowReturn (*end_picture) (GstAV1Decoder * decoder, + GstAV1Picture * picture); + /** + * GstAV1DecoderClass::output_picture: + * @decoder: a #GstAV1Decoder + * @frame: (transfer full): a #GstVideoCodecFrame + * @picture: (transfer full): a #GstAV1Picture + * + * Called with a #GstAV1Picture that is required to be output. + * The #GstVideoCodecFrame must be consumed by the subclass. + * + * Since: 1.20 + */ + GstFlowReturn (*output_picture) (GstAV1Decoder * decoder, + GstVideoCodecFrame * frame, + GstAV1Picture * picture); + + /*< private >*/ + gpointer padding[GST_PADDING_LARGE]; +}; + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstAV1Decoder, gst_object_unref) + +GST_CODECS_API +GType gst_av1_decoder_get_type (void); + +G_END_DECLS + +#endif /* __GST_AV1_DECODER_H__ */
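The class vtable above gives the subclass one hook per stage of decoding a picture: start_picture, then decode_tile for each tile, then end_picture, and output_picture when the picture leaves the decoder. As a rough sketch (plain C, independent of GStreamer; the struct and driver below are illustrative stand-ins, not the real API, and in the real decoder output_picture may be deferred by DPB reordering rather than called immediately), a base class drives these hooks through function pointers:

```c
#include <assert.h>
#include <string.h>

/* Illustrative stand-in for a subset of GstAV1DecoderClass. */
typedef struct {
  void (*start_picture) (void *decoder);
  void (*decode_tile)   (void *decoder, int tile_idx);
  void (*end_picture)   (void *decoder);
  void (*output_picture)(void *decoder);
} DecoderClass;

typedef struct {
  DecoderClass *klass;
  char trace[64];               /* records the hook call order */
} Decoder;

static void log_call (Decoder *d, const char *tag)
{
  strcat (d->trace, tag);
}

/* Subclass implementations: each just records that it ran. */
static void my_start  (void *d)        { log_call (d, "S"); }
static void my_tile   (void *d, int i) { (void) i; log_call (d, "T"); }
static void my_end    (void *d)        { log_call (d, "E"); }
static void my_output (void *d)        { log_call (d, "O"); }

/* Base-class driver: decode one picture made of n_tiles tiles. */
static void decode_one_picture (Decoder *d, int n_tiles)
{
  int i;
  d->klass->start_picture (d);
  for (i = 0; i < n_tiles; i++)
    d->klass->decode_tile (d, i);
  d->klass->end_picture (d);
  d->klass->output_picture (d);
}
```

A three-tile picture would then produce the call trace `S T T T E O`, matching the per-picture ordering the vfunc documentation describes.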
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstav1picture.c
Added
@@ -0,0 +1,194 @@ +/* GStreamer + * Copyright (C) 2020 He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstav1picture.h" + +GST_DEBUG_CATEGORY_EXTERN (gst_av1_decoder_debug); +#define GST_CAT_DEFAULT gst_av1_decoder_debug + +GST_DEFINE_MINI_OBJECT_TYPE (GstAV1Picture, gst_av1_picture); + +static void +_gst_av1_picture_free (GstAV1Picture * picture) +{ + GST_TRACE ("Free picture %p", picture); + + if (picture->notify) + picture->notify (picture->user_data); + + g_free (picture); +} + +/** + * gst_av1_picture_new: + * + * Create new #GstAV1Picture + * + * Returns: a new #GstAV1Picture + * + * Since: 1.20 + */ +GstAV1Picture * +gst_av1_picture_new (void) +{ + GstAV1Picture *pic; + + pic = g_new0 (GstAV1Picture, 1); + + gst_mini_object_init (GST_MINI_OBJECT_CAST (pic), 0, + GST_TYPE_AV1_PICTURE, NULL, NULL, + (GstMiniObjectFreeFunction) _gst_av1_picture_free); + + GST_TRACE ("New picture %p", pic); + + return pic; +} + +/** + * gst_av1_picture_set_user_data: + * @picture: a #GstAV1Picture + * @user_data: private data + * @notify: (closure user_data): a #GDestroyNotify + * + * Sets @user_data on the picture and the #GDestroyNotify that will be called when + * the picture is 
freed. + * + * If a @user_data was previously set, then the previously set @notify will be called + * before the @user_data is replaced. + * + * Since: 1.20 + */ +void +gst_av1_picture_set_user_data (GstAV1Picture * picture, gpointer user_data, + GDestroyNotify notify) +{ + g_return_if_fail (GST_IS_AV1_PICTURE (picture)); + + if (picture->notify) + picture->notify (picture->user_data); + + picture->user_data = user_data; + picture->notify = notify; +} + +/** + * gst_av1_picture_get_user_data: + * @picture: a #GstAV1Picture + * + * Gets the private data previously set on the picture via + * gst_av1_picture_set_user_data(). + * + * Returns: (transfer none): The previously set user_data + * + * Since: 1.20 + */ +gpointer +gst_av1_picture_get_user_data (GstAV1Picture * picture) +{ + return picture->user_data; +} + +/** + * gst_av1_dpb_new: (skip) + * + * Create new #GstAV1Dpb + * + * Returns: a new #GstAV1Dpb + * + * Since: 1.20 + */ +GstAV1Dpb * +gst_av1_dpb_new (void) +{ + GstAV1Dpb *dpb; + + dpb = g_new0 (GstAV1Dpb, 1); + + return dpb; +} + +/** + * gst_av1_dpb_free: + * @dpb: a #GstAV1Dpb to free + * + * Free the @dpb + * + * Since: 1.20 + */ +void +gst_av1_dpb_free (GstAV1Dpb * dpb) +{ + g_return_if_fail (dpb != NULL); + + gst_av1_dpb_clear (dpb); + g_free (dpb); +} + +/** + * gst_av1_dpb_clear: + * @dpb: a #GstAV1Dpb + * + * Clear all stored #GstAV1Picture + * + * Since: 1.20 + */ +void +gst_av1_dpb_clear (GstAV1Dpb * dpb) +{ + gint i; + + g_return_if_fail (dpb != NULL); + + for (i = 0; i < GST_AV1_NUM_REF_FRAMES; i++) + gst_av1_picture_clear (&dpb->pic_list[i]); +} + +/** + * gst_av1_dpb_add: + * @dpb: a #GstAV1Dpb + * @picture: (transfer full): a #GstAV1Picture + * + * Store the @picture + * + * Since: 1.20 + */ +void +gst_av1_dpb_add (GstAV1Dpb * dpb, GstAV1Picture * picture) +{ + GstAV1FrameHeaderOBU *fh; + guint i; + + g_return_if_fail (dpb != NULL); + g_return_if_fail (GST_IS_AV1_PICTURE (picture)); + + fh = &picture->frame_hdr; + + for (i = 0; i < 
GST_AV1_NUM_REF_FRAMES; i++) { + if ((fh->refresh_frame_flags >> i) & 1) { + GST_TRACE ("reference frame %p to ref slot:%d", picture, i); + gst_av1_picture_replace (&dpb->pic_list[i], picture); + } + } + + gst_av1_picture_unref (picture); +}
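gst_av1_dpb_add() above stores the picture into every reference slot whose bit is set in the frame header's refresh_frame_flags. That slot-selection logic can be isolated as plain C (the helper name is made up for illustration; AV1 keeps 8 reference frame slots, and a key frame typically refreshes all of them with flags 0xFF):

```c
#include <assert.h>

#define NUM_REF_FRAMES 8  /* number of AV1 reference frame slots */

/* Store `pic` into every slot whose bit is set in refresh_frame_flags,
 * mirroring the loop in gst_av1_dpb_add(). Returns the number of slots
 * that were updated. */
static int
dpb_refresh_slots (const void *pic, const void *slots[NUM_REF_FRAMES],
    unsigned int refresh_frame_flags)
{
  int i, updated = 0;

  for (i = 0; i < NUM_REF_FRAMES; i++) {
    if ((refresh_frame_flags >> i) & 1) {
      slots[i] = pic;   /* the real code ref-replaces the GstAV1Picture */
      updated++;
    }
  }
  return updated;
}
```

With flags 0xFF every slot is replaced; with 0x05 only slots 0 and 2 are, which is how an inter frame can selectively refresh its references.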
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstav1picture.h
Added
@@ -0,0 +1,161 @@ +/* GStreamer + * Copyright (C) 2020 He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_AV1_PICTURE_H__ +#define __GST_AV1_PICTURE_H__ + +#include <gst/codecs/codecs-prelude.h> +#include <gst/codecparsers/gstav1parser.h> + +G_BEGIN_DECLS + +/** + * GST_TYPE_AV1_PICTURE: + * + * Since: 1.20 + */ +#define GST_TYPE_AV1_PICTURE (gst_av1_picture_get_type()) +/** + * GST_IS_AV1_PICTURE: + * + * Since: 1.20 + */ +#define GST_IS_AV1_PICTURE(obj) (GST_IS_MINI_OBJECT_TYPE(obj, GST_TYPE_AV1_PICTURE)) +/** + * GST_AV1_PICTURE: + * + * Since: 1.20 + */ +#define GST_AV1_PICTURE(obj) ((GstAV1Picture *)obj) + +typedef struct _GstAV1Picture GstAV1Picture; +typedef struct _GstAV1Tile GstAV1Tile; + +/** + * GstAV1Tile: + * + * Since: 1.20 + */ +struct _GstAV1Tile +{ + GstAV1TileGroupOBU tile_group; + /* raw data and size of tile group (does not have ownership) */ + GstAV1OBU obu; +}; + +/** + * GstAV1Picture: + * + * Since: 1.20 + */ +struct _GstAV1Picture +{ + GstMiniObject parent; + + /* From GstVideoCodecFrame */ + guint32 system_frame_number; + + GstAV1FrameHeaderOBU frame_hdr; + + /* copied from parser */ + guint32 display_frame_id; + gboolean show_frame; + gboolean showable_frame; + gboolean apply_grain; + + 
gpointer user_data; + GDestroyNotify notify; +}; + +GST_CODECS_API +GType gst_av1_picture_get_type (void); + +GST_CODECS_API +GstAV1Picture * gst_av1_picture_new (void); + +static inline GstAV1Picture * +gst_av1_picture_ref (GstAV1Picture * picture) +{ + return (GstAV1Picture *) gst_mini_object_ref (GST_MINI_OBJECT_CAST (picture)); +} + +static inline void +gst_av1_picture_unref (GstAV1Picture * picture) +{ + gst_mini_object_unref (GST_MINI_OBJECT_CAST (picture)); +} + +static inline gboolean +gst_av1_picture_replace (GstAV1Picture ** old_picture, + GstAV1Picture * new_picture) +{ + return gst_mini_object_replace ((GstMiniObject **) old_picture, + (GstMiniObject *) new_picture); +} + +static inline void +gst_av1_picture_clear (GstAV1Picture ** picture) +{ + if (picture && *picture) { + gst_av1_picture_unref (*picture); + *picture = NULL; + } +} + +GST_CODECS_API +void gst_av1_picture_set_user_data (GstAV1Picture * picture, + gpointer user_data, + GDestroyNotify notify); + +GST_CODECS_API +gpointer gst_av1_picture_get_user_data (GstAV1Picture * picture); + +/******************* + * GstAV1Dpb * + *******************/ +typedef struct _GstAV1Dpb GstAV1Dpb; + +/** + * GstAV1Dpb: + * + * Since: 1.20 + */ +struct _GstAV1Dpb +{ + GstAV1Picture *pic_list[GST_AV1_NUM_REF_FRAMES]; +}; + +GST_CODECS_API +GstAV1Dpb * gst_av1_dpb_new (void); + +GST_CODECS_API +void gst_av1_dpb_free (GstAV1Dpb * dpb); + +GST_CODECS_API +void gst_av1_dpb_clear (GstAV1Dpb * dpb); + +GST_CODECS_API +void gst_av1_dpb_add (GstAV1Dpb * dpb, + GstAV1Picture * picture); + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstAV1Picture, gst_av1_picture_unref) + +G_END_DECLS + +#endif /* __GST_AV1_PICTURE_H__ */
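The inline helpers above (gst_av1_picture_ref/unref/replace/clear) follow the usual GstMiniObject ownership idiom. A plain-C model with an explicit refcount (stand-in types, not the real mini-object machinery; the real gst_av1_picture_replace() also returns a gboolean) shows the semantics:

```c
#include <assert.h>
#include <stddef.h>

typedef struct { int refcount; } Picture;   /* stand-in for GstAV1Picture */

static Picture *picture_ref   (Picture *p) { p->refcount++; return p; }
static void     picture_unref (Picture *p) { p->refcount--; }

/* Like gst_av1_picture_replace(): take a ref on the new picture,
 * drop the ref on the old one, and update the holder pointer. */
static void
picture_replace (Picture **old_pic, Picture *new_pic)
{
  if (*old_pic == new_pic)
    return;
  if (new_pic)
    picture_ref (new_pic);
  if (*old_pic)
    picture_unref (*old_pic);
  *old_pic = new_pic;
}

/* Like gst_av1_picture_clear(): unref and NULL out the pointer. */
static void
picture_clear (Picture **pic)
{
  if (pic && *pic) {
    picture_unref (*pic);
    *pic = NULL;
  }
}
```

This is the pattern gst_av1_dpb_clear() relies on: clearing each pic_list slot both releases the reference and leaves the slot NULL, so a later replace or clear is safe.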
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth264decoder.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth264decoder.c
Changed
@@ -58,6 +58,7 @@ #include <config.h> #endif +#include <gst/base/base.h> #include "gsth264decoder.h" GST_DEBUG_CATEGORY (gst_h264_decoder_debug); @@ -79,12 +80,11 @@ struct _GstH264DecoderPrivate { + GstH264DecoderCompliance compliance; + + guint8 profile_idc; gint width, height; - gint fps_num, fps_den; - gint upstream_par_n, upstream_par_d; - gint parsed_par_n, parsed_par_d; - gint parsed_fps_n, parsed_fps_d; - GstVideoColorimetry parsed_colorimetry; + /* input codec_data, if any */ GstBuffer *codec_data; guint nal_length_size; @@ -94,7 +94,9 @@ GstH264DecoderAlign align; GstH264NalParser *parser; GstH264Dpb *dpb; - GstFlowReturn last_ret; + /* Cache last field which can not enter the DPB, should be a non ref */ + GstH264Picture *last_field; + /* used for low-latency vs. high throughput mode decision */ gboolean is_live; @@ -112,7 +114,6 @@ gint max_frame_num; gint max_pic_num; gint max_long_term_frame_idx; - gsize max_num_reorder_frames; gint prev_frame_num; gint prev_ref_frame_num; @@ -131,20 +132,42 @@ gint last_output_poc; gboolean process_ref_pic_lists; + guint preferred_output_delay; /* Reference picture lists, constructed for each frame */ GArray *ref_pic_list_p0; GArray *ref_pic_list_b0; GArray *ref_pic_list_b1; + /* Temporary picture list, for reference picture lists in fields, + * corresponding to 8.2.4.2.2 refFrameList0ShortTerm, refFrameList0LongTerm + * and 8.2.4.2.5 refFrameList1ShortTerm and refFrameListLongTerm */ + GArray *ref_frame_list_0_short_term; + GArray *ref_frame_list_1_short_term; + GArray *ref_frame_list_long_term; + /* Reference picture lists, constructed for each slice */ GArray *ref_pic_list0; GArray *ref_pic_list1; - /* Cached array to handle pictures to be outputed */ - GArray *to_output; + /* For delayed output */ + GstQueueArray *output_queue; }; +typedef struct +{ + /* Holds ref */ + GstVideoCodecFrame *frame; + GstH264Picture *picture; + /* Without ref */ + GstH264Decoder *self; +} GstH264DecoderOutputFrame; + +#define 
UPDATE_FLOW_RETURN(ret,new_ret) G_STMT_START { \ + if (*(ret) == GST_FLOW_OK) \ + *(ret) = new_ret; \ +} G_STMT_END + #define parent_class gst_h264_decoder_parent_class G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstH264Decoder, gst_h264_decoder, GST_TYPE_VIDEO_DECODER, @@ -164,25 +187,114 @@ static GstFlowReturn gst_h264_decoder_handle_frame (GstVideoDecoder * decoder, GstVideoCodecFrame * frame); -/* codec spcific functions */ -static gboolean gst_h264_decoder_process_sps (GstH264Decoder * self, +/* codec specific functions */ +static GstFlowReturn gst_h264_decoder_process_sps (GstH264Decoder * self, GstH264SPS * sps); -static gboolean gst_h264_decoder_decode_slice (GstH264Decoder * self); -static gboolean gst_h264_decoder_decode_nal (GstH264Decoder * self, - GstH264NalUnit * nalu, GstClockTime pts); +static GstFlowReturn gst_h264_decoder_decode_slice (GstH264Decoder * self); +static GstFlowReturn gst_h264_decoder_decode_nal (GstH264Decoder * self, + GstH264NalUnit * nalu); static gboolean gst_h264_decoder_fill_picture_from_slice (GstH264Decoder * self, const GstH264Slice * slice, GstH264Picture * picture); static gboolean gst_h264_decoder_calculate_poc (GstH264Decoder * self, GstH264Picture * picture); static gboolean gst_h264_decoder_init_gap_picture (GstH264Decoder * self, GstH264Picture * picture, gint frame_num); -static gboolean gst_h264_decoder_drain_internal (GstH264Decoder * self); -static gboolean gst_h264_decoder_finish_current_picture (GstH264Decoder * self); -static gboolean gst_h264_decoder_finish_picture (GstH264Decoder * self, - GstH264Picture * picture); -static void gst_h264_decoder_prepare_ref_pic_lists (GstH264Decoder * self); +static GstFlowReturn gst_h264_decoder_drain_internal (GstH264Decoder * self); +static void gst_h264_decoder_finish_current_picture (GstH264Decoder * self, + GstFlowReturn * ret); +static void gst_h264_decoder_finish_picture (GstH264Decoder * self, + GstH264Picture * picture, GstFlowReturn * ret); +static void 
gst_h264_decoder_prepare_ref_pic_lists (GstH264Decoder * self, + GstH264Picture * current_picture); static void gst_h264_decoder_clear_ref_pic_lists (GstH264Decoder * self); static gboolean gst_h264_decoder_modify_ref_pic_lists (GstH264Decoder * self); +static gboolean +gst_h264_decoder_sliding_window_picture_marking (GstH264Decoder * self, + GstH264Picture * picture); +static void gst_h264_decoder_do_output_picture (GstH264Decoder * self, + GstH264Picture * picture, GstFlowReturn * ret); +static GstH264Picture *gst_h264_decoder_new_field_picture (GstH264Decoder * + self, GstH264Picture * picture); +static void +gst_h264_decoder_clear_output_frame (GstH264DecoderOutputFrame * output_frame); + +enum +{ + PROP_0, + PROP_COMPLIANCE, +}; + +/** + * gst_h264_decoder_compliance_get_type: + * + * Get the compliance type of the h264 decoder. + * + * Since: 1.20 + */ +GType +gst_h264_decoder_compliance_get_type (void) +{ + static gsize h264_decoder_compliance_type = 0; + static const GEnumValue compliances[] = { + {GST_H264_DECODER_COMPLIANCE_AUTO, "GST_H264_DECODER_COMPLIANCE_AUTO", + "auto"}, + {GST_H264_DECODER_COMPLIANCE_STRICT, "GST_H264_DECODER_COMPLIANCE_STRICT", + "strict"}, + {GST_H264_DECODER_COMPLIANCE_NORMAL, "GST_H264_DECODER_COMPLIANCE_NORMAL", + "normal"}, + {GST_H264_DECODER_COMPLIANCE_FLEXIBLE, + "GST_H264_DECODER_COMPLIANCE_FLEXIBLE", "flexible"}, + {0, NULL, NULL}, + }; + + + if (g_once_init_enter (&h264_decoder_compliance_type)) { + GType _type; + + _type = g_enum_register_static ("GstH264DecoderCompliance", compliances); + g_once_init_leave (&h264_decoder_compliance_type, _type); + } + + return (GType) h264_decoder_compliance_type; +} + +static void +gst_h264_decoder_get_property (GObject * object, guint property_id, + GValue * value, GParamSpec * pspec) +{ + GstH264Decoder *self = GST_H264_DECODER (object); + GstH264DecoderPrivate *priv = self->priv; + + switch (property_id) { + case PROP_COMPLIANCE: + GST_OBJECT_LOCK (self); + g_value_set_enum (value, 
priv->compliance); + GST_OBJECT_UNLOCK (self); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec); + break; + } +} + +static void +gst_h264_decoder_set_property (GObject * object, guint property_id, + const GValue * value, GParamSpec * pspec) +{ + GstH264Decoder *self = GST_H264_DECODER (object); + GstH264DecoderPrivate *priv = self->priv; + + switch (property_id) { + case PROP_COMPLIANCE: + GST_OBJECT_LOCK (self); + priv->compliance = g_value_get_enum (value); + GST_OBJECT_UNLOCK (self); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec); + break; + } +} static void gst_h264_decoder_class_init (GstH264DecoderClass * klass) @@ -191,6 +303,8 @@ GObjectClass *object_class = G_OBJECT_CLASS (klass); object_class->finalize = GST_DEBUG_FUNCPTR (gst_h264_decoder_finalize); + object_class->get_property = gst_h264_decoder_get_property; + object_class->set_property = gst_h264_decoder_set_property; decoder_class->start = GST_DEBUG_FUNCPTR (gst_h264_decoder_start); decoder_class->stop = GST_DEBUG_FUNCPTR (gst_h264_decoder_stop); @@ -200,6 +314,22 @@ decoder_class->drain = GST_DEBUG_FUNCPTR (gst_h264_decoder_drain); decoder_class->handle_frame = GST_DEBUG_FUNCPTR (gst_h264_decoder_handle_frame); + + /** + * GstH264Decoder:compliance: + * + * The compliance controls the behavior of the decoder to handle some + * subtle cases and contexts, such as the low-latency DPB bumping or + * mapping the baseline profile as the constrained-baseline profile, + * etc. 
+ * + * Since: 1.20 + */ + g_object_class_install_property (object_class, PROP_COMPLIANCE, + g_param_spec_enum ("compliance", "Decoder Compliance", + "The decoder's behavior in compliance with the h264 spec.", + GST_TYPE_H264_DECODER_COMPLIANCE, GST_H264_DECODER_COMPLIANCE_AUTO, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT)); } static void @@ -211,6 +341,8 @@ self->priv = priv = gst_h264_decoder_get_instance_private (self); + priv->last_output_poc = G_MININT32; + priv->ref_pic_list_p0 = g_array_sized_new (FALSE, TRUE, sizeof (GstH264Picture *), 32); g_array_set_clear_func (priv->ref_pic_list_p0, @@ -226,15 +358,30 @@ g_array_set_clear_func (priv->ref_pic_list_b1, (GDestroyNotify) gst_h264_picture_clear); + priv->ref_frame_list_0_short_term = g_array_sized_new (FALSE, TRUE, + sizeof (GstH264Picture *), 32); + g_array_set_clear_func (priv->ref_frame_list_0_short_term, + (GDestroyNotify) gst_h264_picture_clear); + + priv->ref_frame_list_1_short_term = g_array_sized_new (FALSE, TRUE, + sizeof (GstH264Picture *), 32); + g_array_set_clear_func (priv->ref_frame_list_1_short_term, + (GDestroyNotify) gst_h264_picture_clear); + + priv->ref_frame_list_long_term = g_array_sized_new (FALSE, TRUE, + sizeof (GstH264Picture *), 32); + g_array_set_clear_func (priv->ref_frame_list_long_term, + (GDestroyNotify) gst_h264_picture_clear); + priv->ref_pic_list0 = g_array_sized_new (FALSE, TRUE, sizeof (GstH264Picture *), 32); priv->ref_pic_list1 = g_array_sized_new (FALSE, TRUE, sizeof (GstH264Picture *), 32); - priv->to_output = g_array_sized_new (FALSE, TRUE, - sizeof (GstH264Picture *), 16); - g_array_set_clear_func (priv->to_output, - (GDestroyNotify) gst_h264_picture_clear); + priv->output_queue = + gst_queue_array_new_for_struct (sizeof (GstH264DecoderOutputFrame), 1); + gst_queue_array_set_clear_func (priv->output_queue, + (GDestroyNotify) gst_h264_decoder_clear_output_frame); } static void @@ -246,19 +393,41 @@ g_array_unref (priv->ref_pic_list_p0); 
g_array_unref (priv->ref_pic_list_b0); g_array_unref (priv->ref_pic_list_b1); + g_array_unref (priv->ref_frame_list_0_short_term); + g_array_unref (priv->ref_frame_list_1_short_term); + g_array_unref (priv->ref_frame_list_long_term); g_array_unref (priv->ref_pic_list0); g_array_unref (priv->ref_pic_list1); - g_array_unref (priv->to_output); + gst_queue_array_free (priv->output_queue); G_OBJECT_CLASS (parent_class)->finalize (object); } +static void +gst_h264_decoder_reset (GstH264Decoder * self) +{ + GstH264DecoderPrivate *priv = self->priv; + + gst_clear_buffer (&priv->codec_data); + g_clear_pointer (&self->input_state, gst_video_codec_state_unref); + g_clear_pointer (&priv->parser, gst_h264_nal_parser_free); + g_clear_pointer (&priv->dpb, gst_h264_dpb_free); + gst_h264_picture_clear (&priv->last_field); + + priv->profile_idc = 0; + priv->width = 0; + priv->height = 0; + priv->nal_length_size = 4; +} + static gboolean gst_h264_decoder_start (GstVideoDecoder * decoder) { GstH264Decoder *self = GST_H264_DECODER (decoder); GstH264DecoderPrivate *priv = self->priv; + gst_h264_decoder_reset (self); + priv->parser = gst_h264_nal_parser_new (); priv->dpb = gst_h264_dpb_new (); @@ -269,36 +438,52 @@ gst_h264_decoder_stop (GstVideoDecoder * decoder) { GstH264Decoder *self = GST_H264_DECODER (decoder); - GstH264DecoderPrivate *priv = self->priv; - if (self->input_state) { - gst_video_codec_state_unref (self->input_state); - self->input_state = NULL; - } + gst_h264_decoder_reset (self); - gst_clear_buffer (&priv->codec_data); + return TRUE; +} - if (priv->parser) { - gst_h264_nal_parser_free (priv->parser); - priv->parser = NULL; - } +static void +gst_h264_decoder_clear_output_frame (GstH264DecoderOutputFrame * output_frame) +{ + if (!output_frame) + return; - if (priv->dpb) { - gst_h264_dpb_free (priv->dpb); - priv->dpb = NULL; + if (output_frame->frame) { + gst_video_decoder_release_frame (GST_VIDEO_DECODER (output_frame->self), + output_frame->frame); + 
output_frame->frame = NULL; } - return TRUE; + gst_h264_picture_clear (&output_frame->picture); } static void -gst_h264_decoder_clear_dpb (GstH264Decoder * self) +gst_h264_decoder_clear_dpb (GstH264Decoder * self, gboolean flush) { + GstVideoDecoder *decoder = GST_VIDEO_DECODER (self); GstH264DecoderPrivate *priv = self->priv; + GstH264Picture *picture; + + /* If we are not flushing now, videodecoder baseclass will hold + * GstVideoCodecFrame. Release frames manually */ + if (!flush) { + while ((picture = gst_h264_dpb_bump (priv->dpb, TRUE)) != NULL) { + GstVideoCodecFrame *frame = gst_video_decoder_get_frame (decoder, + picture->system_frame_number); + if (frame) + gst_video_decoder_release_frame (decoder, frame); + gst_h264_picture_unref (picture); + } + } + + gst_queue_array_clear (priv->output_queue); gst_h264_decoder_clear_ref_pic_lists (self); + gst_h264_picture_clear (&priv->last_field); gst_h264_dpb_clear (priv->dpb); - priv->last_output_poc = -1; + priv->last_output_poc = G_MININT32; } static gboolean @@ -306,7 +491,7 @@ { GstH264Decoder *self = GST_H264_DECODER (decoder); - gst_h264_decoder_clear_dpb (self); + gst_h264_decoder_clear_dpb (self, TRUE); return TRUE; } @@ -315,13 +500,9 @@ gst_h264_decoder_drain (GstVideoDecoder * decoder) { GstH264Decoder *self = GST_H264_DECODER (decoder); - GstH264DecoderPrivate *priv = self->priv; - priv->last_ret = GST_FLOW_OK; /* dpb will be cleared by this method */ - gst_h264_decoder_drain_internal (self); - - return priv->last_ret; + return gst_h264_decoder_drain_internal (self); } static GstFlowReturn @@ -340,7 +521,7 @@ GstH264NalUnit nalu; GstH264ParserResult pres; GstMapInfo map; - gboolean decode_ret = TRUE; + GstFlowReturn decode_ret = GST_FLOW_OK; GST_LOG_OBJECT (self, "handle frame, PTS: %" GST_TIME_FORMAT ", DTS: %" @@ -348,16 +529,14 @@ GST_TIME_ARGS (GST_BUFFER_DTS (in_buf))); priv->current_frame = frame; - priv->last_ret = GST_FLOW_OK; gst_buffer_map (in_buf, &map, GST_MAP_READ); if (priv->in_format == 
GST_H264_DECODER_FORMAT_AVC) { pres = gst_h264_parser_identify_nalu_avc (priv->parser, map.data, 0, map.size, priv->nal_length_size, &nalu); - while (pres == GST_H264_PARSER_OK && decode_ret) { - decode_ret = gst_h264_decoder_decode_nal (self, - &nalu, GST_BUFFER_PTS (in_buf)); + while (pres == GST_H264_PARSER_OK && decode_ret == GST_FLOW_OK) { + decode_ret = gst_h264_decoder_decode_nal (self, &nalu); pres = gst_h264_parser_identify_nalu_avc (priv->parser, map.data, nalu.offset + nalu.size, map.size, priv->nal_length_size, @@ -370,9 +549,8 @@ if (pres == GST_H264_PARSER_NO_NAL_END) pres = GST_H264_PARSER_OK; - while (pres == GST_H264_PARSER_OK && decode_ret) { - decode_ret = gst_h264_decoder_decode_nal (self, - &nalu, GST_BUFFER_PTS (in_buf)); + while (pres == GST_H264_PARSER_OK && decode_ret == GST_FLOW_OK) { + decode_ret = gst_h264_decoder_decode_nal (self, &nalu); pres = gst_h264_parser_identify_nalu (priv->parser, map.data, nalu.offset + nalu.size, map.size, &nalu); @@ -384,47 +562,54 @@ gst_buffer_unmap (in_buf, &map); - if (!decode_ret) { - GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, - ("Failed to decode data"), (NULL), priv->last_ret); - gst_video_decoder_drop_frame (decoder, frame); + if (decode_ret != GST_FLOW_OK) { + if (decode_ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), decode_ret); + } + gst_video_decoder_drop_frame (decoder, frame); gst_h264_picture_clear (&priv->current_picture); priv->current_frame = NULL; - return priv->last_ret; + return decode_ret; } - gst_h264_decoder_finish_current_picture (self); + gst_h264_decoder_finish_current_picture (self, &decode_ret); gst_video_codec_frame_unref (frame); priv->current_frame = NULL; - return priv->last_ret; + if (decode_ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), decode_ret); + } + + return decode_ret; } -static gboolean +static GstFlowReturn 
gst_h264_decoder_parse_sps (GstH264Decoder * self, GstH264NalUnit * nalu) { GstH264DecoderPrivate *priv = self->priv; GstH264SPS sps; GstH264ParserResult pres; - gboolean ret; + GstFlowReturn ret; pres = gst_h264_parse_sps (nalu, &sps); if (pres != GST_H264_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to parse SPS, result %d", pres); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "SPS parsed"); ret = gst_h264_decoder_process_sps (self, &sps); - if (!ret) { + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to process SPS"); } else if (gst_h264_parser_update_sps (priv->parser, &sps) != GST_H264_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to update SPS"); - ret = FALSE; + ret = GST_FLOW_ERROR; } gst_h264_sps_clear (&sps); @@ -432,29 +617,29 @@ return ret; } -static gboolean +static GstFlowReturn gst_h264_decoder_parse_pps (GstH264Decoder * self, GstH264NalUnit * nalu) { GstH264DecoderPrivate *priv = self->priv; GstH264PPS pps; GstH264ParserResult pres; - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; pres = gst_h264_parse_pps (priv->parser, nalu, &pps); if (pres != GST_H264_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to parse PPS, result %d", pres); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "PPS parsed"); if (pps.num_slice_groups_minus1 > 0) { GST_FIXME_OBJECT (self, "FMO is not supported"); - ret = FALSE; + ret = GST_FLOW_ERROR; } else if (gst_h264_parser_update_pps (priv->parser, &pps) != GST_H264_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to update PPS"); - ret = FALSE; + ret = GST_FLOW_ERROR; } gst_h264_pps_clear (&pps); @@ -462,7 +647,7 @@ return ret; } -static gboolean +static GstFlowReturn gst_h264_decoder_parse_codec_data (GstH264Decoder * self, const guint8 * data, gsize size) { @@ -472,18 +657,19 @@ gint i; GstH264ParserResult pres; GstH264NalUnit nalu; + GstFlowReturn ret = GST_FLOW_OK; #ifndef GST_DISABLE_GST_DEBUG guint profile; #endif /* parse the avcC data */ if (size < 7) { /* 
when numSPS==0 and numPPS==0, length is 7 bytes */ - return FALSE; + return GST_FLOW_ERROR; } /* parse the version, this must be 1 */ if (data[0] != 1) { - return FALSE; + return GST_FLOW_ERROR; } #ifndef GST_DISABLE_GST_DEBUG /* AVCProfileIndication */ @@ -506,19 +692,20 @@ data, off, size, 2, &nalu); if (pres != GST_H264_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to identify SPS nalu"); - return FALSE; + return GST_FLOW_ERROR; } - if (!gst_h264_decoder_parse_sps (self, &nalu)) { + ret = gst_h264_decoder_parse_sps (self, &nalu); + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to parse SPS"); - return FALSE; + return ret; } off = nalu.offset + nalu.size; } if (off >= size) { GST_WARNING_OBJECT (self, "Too small avcC"); - return FALSE; + return GST_FLOW_ERROR; } num_pps = data[off]; @@ -529,17 +716,18 @@ data, off, size, 2, &nalu); if (pres != GST_H264_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to identify PPS nalu"); - return FALSE; + return GST_FLOW_ERROR; } - if (!gst_h264_decoder_parse_pps (self, &nalu)) { + ret = gst_h264_decoder_parse_pps (self, &nalu); + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to parse PPS"); - return FALSE; + return ret; } off = nalu.offset + nalu.size; } - return TRUE; + return GST_FLOW_OK; } static gboolean @@ -553,23 +741,14 @@ slice->header.first_mb_in_slice); return FALSE; } - - /* If the new picture is an IDR, flush DPB */ - if (slice->nalu.idr_pic_flag) { - /* Output all remaining pictures, unless we are explicitly instructed - * not to do so */ - if (!slice->header.dec_ref_pic_marking.no_output_of_prior_pics_flag) - gst_h264_decoder_drain (GST_VIDEO_DECODER (self)); - - gst_h264_dpb_clear (priv->dpb); - } } return TRUE; } static void -gst_h264_decoder_update_pic_nums (GstH264Decoder * self, gint frame_num) +gst_h264_decoder_update_pic_nums (GstH264Decoder * self, + GstH264Picture * current_picture, gint frame_num) { GstH264DecoderPrivate *priv = self->priv; GArray *dpb = 
gst_h264_dpb_get_pictures_all (priv->dpb); @@ -578,30 +757,183 @@ for (i = 0; i < dpb->len; i++) { GstH264Picture *picture = g_array_index (dpb, GstH264Picture *, i); - if (picture->field != GST_H264_PICTURE_FIELD_FRAME) { - GST_FIXME_OBJECT (self, "Interlaced video not supported"); - continue; - } - - if (!picture->ref) + if (!GST_H264_PICTURE_IS_REF (picture)) continue; - if (picture->long_term) { - picture->long_term_pic_num = picture->long_term_frame_idx; + if (GST_H264_PICTURE_IS_LONG_TERM_REF (picture)) { + if (GST_H264_PICTURE_IS_FRAME (current_picture)) + picture->long_term_pic_num = picture->long_term_frame_idx; + else if (current_picture->field == picture->field) + picture->long_term_pic_num = 2 * picture->long_term_frame_idx + 1; + else + picture->long_term_pic_num = 2 * picture->long_term_frame_idx; } else { if (picture->frame_num > frame_num) picture->frame_num_wrap = picture->frame_num - priv->max_frame_num; else picture->frame_num_wrap = picture->frame_num; - picture->pic_num = picture->frame_num_wrap; + if (GST_H264_PICTURE_IS_FRAME (current_picture)) + picture->pic_num = picture->frame_num_wrap; + else if (picture->field == current_picture->field) + picture->pic_num = 2 * picture->frame_num_wrap + 1; + else + picture->pic_num = 2 * picture->frame_num_wrap; } } g_array_unref (dpb); } -static gboolean +static GstH264Picture * +gst_h264_decoder_split_frame (GstH264Decoder * self, GstH264Picture * picture) +{ + GstH264Picture *other_field; + + g_assert (GST_H264_PICTURE_IS_FRAME (picture)); + + other_field = gst_h264_decoder_new_field_picture (self, picture); + if (!other_field) { + GST_WARNING_OBJECT (self, + "Couldn't split frame into complementary field pair"); + return NULL; + } + + GST_LOG_OBJECT (self, "Split picture %p, poc %d, frame num %d", + picture, picture->pic_order_cnt, picture->frame_num); + + /* FIXME: enhance TFF decision by using picture timing SEI */ + if (picture->top_field_order_cnt < picture->bottom_field_order_cnt) { + 
picture->field = GST_H264_PICTURE_FIELD_TOP_FIELD; + picture->pic_order_cnt = picture->top_field_order_cnt; + + other_field->field = GST_H264_PICTURE_FIELD_BOTTOM_FIELD; + other_field->pic_order_cnt = picture->bottom_field_order_cnt; + } else { + picture->field = GST_H264_PICTURE_FIELD_BOTTOM_FIELD; + picture->pic_order_cnt = picture->bottom_field_order_cnt; + + other_field->field = GST_H264_PICTURE_FIELD_TOP_FIELD; + other_field->pic_order_cnt = picture->top_field_order_cnt; + } + + other_field->top_field_order_cnt = picture->top_field_order_cnt; + other_field->bottom_field_order_cnt = picture->bottom_field_order_cnt; + other_field->frame_num = picture->frame_num; + other_field->ref = picture->ref; + other_field->nonexisting = picture->nonexisting; + other_field->system_frame_number = picture->system_frame_number; + + return other_field; +} + +static void +output_picture_directly (GstH264Decoder * self, GstH264Picture * picture, + GstFlowReturn * ret) +{ + GstH264DecoderPrivate *priv = self->priv; + GstH264Picture *out_pic = NULL; + GstFlowReturn flow_ret = GST_FLOW_OK; + + g_assert (ret != NULL); + + if (GST_H264_PICTURE_IS_FRAME (picture)) { + g_assert (priv->last_field == NULL); + out_pic = g_steal_pointer (&picture); + goto output; + } + + if (priv->last_field == NULL) { + if (picture->second_field) { + GST_WARNING ("Set the last output %p poc:%d, without first field", + picture, picture->pic_order_cnt); + + flow_ret = GST_FLOW_ERROR; + goto output; + } + + /* Just cache the first field. 
*/ + priv->last_field = g_steal_pointer (&picture); + } else { + if (!picture->second_field || !picture->other_field + || picture->other_field != priv->last_field) { + GST_WARNING ("The last field %p poc:%d is not the pair of the " + "current field %p poc:%d", + priv->last_field, priv->last_field->pic_order_cnt, + picture, picture->pic_order_cnt); + + gst_h264_picture_clear (&priv->last_field); + flow_ret = GST_FLOW_ERROR; + goto output; + } + + GST_TRACE ("Pair the last field %p poc:%d and the current" + " field %p poc:%d", + priv->last_field, priv->last_field->pic_order_cnt, + picture, picture->pic_order_cnt); + + out_pic = priv->last_field; + priv->last_field = NULL; + /* Link each field. */ + out_pic->other_field = picture; + } + +output: + if (out_pic) { + gst_h264_dpb_set_last_output (priv->dpb, out_pic); + gst_h264_decoder_do_output_picture (self, out_pic, &flow_ret); + } + + gst_h264_picture_clear (&picture); + + UPDATE_FLOW_RETURN (ret, flow_ret); +} + +static void +add_picture_to_dpb (GstH264Decoder * self, GstH264Picture * picture) +{ + GstH264DecoderPrivate *priv = self->priv; + + if (!gst_h264_dpb_get_interlaced (priv->dpb)) { + g_assert (priv->last_field == NULL); + gst_h264_dpb_add (priv->dpb, picture); + return; + } + + /* The first field of the last picture may not be able to enter the + DPB if it is a non ref, but if the second field enters the DPB, we + need to add both of them. 
*/ + if (priv->last_field && picture->other_field == priv->last_field) { + gst_h264_dpb_add (priv->dpb, priv->last_field); + priv->last_field = NULL; + } + + gst_h264_dpb_add (priv->dpb, picture); +} + +static void +_bump_dpb (GstH264Decoder * self, GstH264DpbBumpMode bump_level, + GstH264Picture * current_picture, GstFlowReturn * ret) +{ + GstH264DecoderPrivate *priv = self->priv; + + g_assert (ret != NULL); + + while (gst_h264_dpb_needs_bump (priv->dpb, current_picture, bump_level)) { + GstH264Picture *to_output; + + to_output = gst_h264_dpb_bump (priv->dpb, FALSE); + + if (!to_output) { + GST_WARNING_OBJECT (self, "Bumping is needed but no picture to output"); + break; + } + + gst_h264_decoder_do_output_picture (self, to_output, ret); + } +} + +static GstFlowReturn gst_h264_decoder_handle_frame_num_gap (GstH264Decoder * self, gint frame_num) { GstH264DecoderPrivate *priv = self->priv; @@ -610,40 +942,82 @@ if (!sps) { GST_ERROR_OBJECT (self, "No active sps"); - return FALSE; + return GST_FLOW_ERROR; + } + + if (priv->prev_ref_frame_num == frame_num) { + GST_TRACE_OBJECT (self, + "frame_num == PrevRefFrameNum (%d), not a gap", frame_num); + return GST_FLOW_OK; + } + + if (((priv->prev_ref_frame_num + 1) % priv->max_frame_num) == frame_num) { + GST_TRACE_OBJECT (self, + "frame_num == (PrevRefFrameNum + 1) %% MaxFrameNum (%d), not a gap", + frame_num); + return GST_FLOW_OK; + } + + if (gst_h264_dpb_get_size (priv->dpb) == 0) { + GST_TRACE_OBJECT (self, "DPB is empty, not a gap"); + return GST_FLOW_OK; } if (!sps->gaps_in_frame_num_value_allowed_flag) { /* This is likely the case where some frames were dropped. 
* then we need to keep decoding without error out */ - GST_WARNING_OBJECT (self, "Invalid frame num %d", frame_num); + GST_WARNING_OBJECT (self, "Invalid frame num %d, maybe frame drop", + frame_num); + + return GST_FLOW_OK; } - GST_DEBUG_OBJECT (self, "Handling frame num gap %d -> %d", - priv->prev_ref_frame_num, frame_num); + GST_DEBUG_OBJECT (self, "Handling frame num gap %d -> %d (MaxFrameNum: %d)", + priv->prev_ref_frame_num, frame_num, priv->max_frame_num); /* 7.4.3/7-23 */ unused_short_term_frame_num = (priv->prev_ref_frame_num + 1) % priv->max_frame_num; while (unused_short_term_frame_num != frame_num) { GstH264Picture *picture = gst_h264_picture_new (); + GstFlowReturn ret = GST_FLOW_OK; if (!gst_h264_decoder_init_gap_picture (self, picture, unused_short_term_frame_num)) - return FALSE; + return GST_FLOW_ERROR; - gst_h264_decoder_update_pic_nums (self, unused_short_term_frame_num); + gst_h264_decoder_update_pic_nums (self, picture, + unused_short_term_frame_num); - if (!gst_h264_decoder_finish_picture (self, picture)) { - GST_WARNING ("Failed to finish picture %p", picture); - return FALSE; + /* C.2.1 */ + if (!gst_h264_decoder_sliding_window_picture_marking (self, picture)) { + GST_ERROR_OBJECT (self, + "Couldn't perform sliding window picture marking"); + return GST_FLOW_ERROR; + } + + gst_h264_dpb_delete_unused (priv->dpb); + + _bump_dpb (self, GST_H264_DPB_BUMP_NORMAL_LATENCY, picture, &ret); + if (ret != GST_FLOW_OK) + return ret; + + /* the picture is short term ref, add to DPB. 
*/ + if (gst_h264_dpb_get_interlaced (priv->dpb)) { + GstH264Picture *other_field = + gst_h264_decoder_split_frame (self, picture); + + add_picture_to_dpb (self, picture); + add_picture_to_dpb (self, other_field); + } else { + add_picture_to_dpb (self, picture); } unused_short_term_frame_num++; unused_short_term_frame_num %= priv->max_frame_num; } - return TRUE; + return GST_FLOW_OK; } static gboolean @@ -671,14 +1045,15 @@ return TRUE; } -static gboolean +static GstFlowReturn gst_h264_decoder_start_current_picture (GstH264Decoder * self) { GstH264DecoderClass *klass; GstH264DecoderPrivate *priv = self->priv; const GstH264SPS *sps; gint frame_num; - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; + GstH264Picture *current_picture; g_assert (priv->current_picture != NULL); g_assert (priv->active_sps != NULL); @@ -691,41 +1066,170 @@ if (priv->current_slice.nalu.idr_pic_flag) priv->prev_ref_frame_num = 0; - /* 7.4.3 */ - if (frame_num != priv->prev_ref_frame_num && - frame_num != (priv->prev_ref_frame_num + 1) % priv->max_frame_num && - gst_h264_dpb_get_size (priv->dpb) > 0) { - if (!gst_h264_decoder_handle_frame_num_gap (self, frame_num)) - return FALSE; - } + ret = gst_h264_decoder_handle_frame_num_gap (self, frame_num); + if (ret != GST_FLOW_OK) + return ret; if (!gst_h264_decoder_init_current_picture (self)) - return FALSE; + return GST_FLOW_ERROR; - gst_h264_decoder_update_pic_nums (self, frame_num); + current_picture = priv->current_picture; + + /* If the new picture is an IDR, flush DPB */ + if (current_picture->idr) { + if (!current_picture->dec_ref_pic_marking.no_output_of_prior_pics_flag) { + ret = gst_h264_decoder_drain_internal (self); + if (ret != GST_FLOW_OK) + return ret; + } else { + /* C.4.4 Removal of pictures from the DPB before possible insertion + * of the current picture + * + * If decoded picture is IDR and no_output_of_prior_pics_flag is equal to 1 + * or is inferred to be equal to 1, all frame buffers in the DPB + * are emptied 
without output of the pictures they contain, + * and DPB fullness is set to 0. + */ + gst_h264_decoder_clear_dpb (self, FALSE); + } + } + + gst_h264_decoder_update_pic_nums (self, current_picture, frame_num); if (priv->process_ref_pic_lists) - gst_h264_decoder_prepare_ref_pic_lists (self); + gst_h264_decoder_prepare_ref_pic_lists (self, current_picture); klass = GST_H264_DECODER_GET_CLASS (self); - if (klass->start_picture) + if (klass->start_picture) { ret = klass->start_picture (self, priv->current_picture, &priv->current_slice, priv->dpb); - if (!ret) { - GST_ERROR_OBJECT (self, "subclass does not want to start picture"); - return FALSE; + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass does not want to start picture"); + return ret; + } } - return TRUE; + return GST_FLOW_OK; +} + +static GstH264Picture * +gst_h264_decoder_new_field_picture (GstH264Decoder * self, + GstH264Picture * picture) +{ + GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); + GstH264Picture *new_picture; + + if (!klass->new_field_picture) { + GST_WARNING_OBJECT (self, "Subclass does not support interlaced stream"); + return NULL; + } + + new_picture = gst_h264_picture_new (); + /* don't confuse subclass by non-existing picture */ + if (!picture->nonexisting) { + GstFlowReturn ret; + + ret = klass->new_field_picture (self, picture, new_picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Subclass couldn't handle new field picture"); + gst_h264_picture_unref (new_picture); + + return NULL; + } + } + + new_picture->other_field = picture; + new_picture->second_field = TRUE; + + return new_picture; } static gboolean -gst_h264_decoder_parse_slice (GstH264Decoder * self, GstH264NalUnit * nalu, - GstClockTime pts) +gst_h264_decoder_find_first_field_picture (GstH264Decoder * self, + GstH264Slice * slice, GstH264Picture ** first_field) +{ + GstH264DecoderPrivate *priv = self->priv; + const GstH264SliceHdr *slice_hdr = &slice->header; + GstH264Picture 
*prev_field; + gboolean in_dpb; + + *first_field = NULL; + prev_field = NULL; + in_dpb = FALSE; + if (gst_h264_dpb_get_interlaced (priv->dpb)) { + if (priv->last_field) { + prev_field = priv->last_field; + in_dpb = FALSE; + } else if (gst_h264_dpb_get_size (priv->dpb) > 0) { + GstH264Picture *prev_picture; + GArray *pictures; + + pictures = gst_h264_dpb_get_pictures_all (priv->dpb); + prev_picture = + g_array_index (pictures, GstH264Picture *, pictures->len - 1); + g_array_unref (pictures); /* prev_picture should be held */ + + /* Previous picture was a field picture. */ + if (!GST_H264_PICTURE_IS_FRAME (prev_picture) + && !prev_picture->other_field) { + prev_field = prev_picture; + in_dpb = TRUE; + } + } + } else { + g_assert (priv->last_field == NULL); + } + + /* This is not a field picture */ + if (!slice_hdr->field_pic_flag) { + if (!prev_field) + return TRUE; + + GST_WARNING_OBJECT (self, "Previous picture %p (poc %d) is not complete", + prev_field, prev_field->pic_order_cnt); + goto error; + } + + /* OK, this is the first field. */ + if (!prev_field) + return TRUE; + + if (prev_field->frame_num != slice_hdr->frame_num) { + GST_WARNING_OBJECT (self, "Previous picture %p (poc %d) is not complete", + prev_field, prev_field->pic_order_cnt); + goto error; + } else { + GstH264PictureField current_field = slice_hdr->bottom_field_flag ? 
+ GST_H264_PICTURE_FIELD_BOTTOM_FIELD : GST_H264_PICTURE_FIELD_TOP_FIELD; + + if (current_field == prev_field->field) { + GST_WARNING_OBJECT (self, + "Current picture and previous picture have identical field %d", + current_field); + goto error; + } + } + + *first_field = gst_h264_picture_ref (prev_field); + return TRUE; + +error: + if (!in_dpb) { + gst_h264_picture_clear (&priv->last_field); + } else { + /* FIXME: implement fill gap field picture if it is already in DPB */ + } + + return FALSE; +} + +static GstFlowReturn +gst_h264_decoder_parse_slice (GstH264Decoder * self, GstH264NalUnit * nalu) { GstH264DecoderPrivate *priv = self->priv; GstH264ParserResult pres = GST_H264_PARSER_OK; + GstFlowReturn ret = GST_FLOW_OK; memset (&priv->current_slice, 0, sizeof (GstH264Slice)); @@ -736,54 +1240,94 @@ GST_ERROR_OBJECT (self, "Failed to parse slice header, ret %d", pres); memset (&priv->current_slice, 0, sizeof (GstH264Slice)); - return FALSE; + return GST_FLOW_ERROR; } priv->current_slice.nalu = *nalu; if (!gst_h264_decoder_preprocess_slice (self, &priv->current_slice)) - return FALSE; + return GST_FLOW_ERROR; priv->active_pps = priv->current_slice.header.pps; priv->active_sps = priv->active_pps->sequence; + /* Check whether there is a field picture boundary within the given codec + * frame. This might happen in case upstream sent a buffer per frame unit, + * not per picture unit (i.e., AU unit). + * If such a boundary is detected, finish the first field picture we + * decoded in this chain and start decoding a new field picture */ + if (gst_h264_dpb_get_interlaced (priv->dpb) && priv->current_picture && + !GST_H264_PICTURE_IS_FRAME (priv->current_picture) && + !priv->current_picture->second_field) { + GstH264PictureField prev_field = priv->current_picture->field; + GstH264PictureField cur_field = GST_H264_PICTURE_FIELD_FRAME; + if (priv->current_slice.header.field_pic_flag) + cur_field = priv->current_slice.header.bottom_field_flag ? 
+ GST_H264_PICTURE_FIELD_BOTTOM_FIELD : + GST_H264_PICTURE_FIELD_TOP_FIELD; + + if (cur_field != prev_field) { + GST_LOG_OBJECT (self, + "Found new field picture, finishing the first field picture"); + gst_h264_decoder_finish_current_picture (self, &ret); + } + } + if (!priv->current_picture) { GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); - GstH264Picture *picture; - gboolean ret = TRUE; + GstH264Picture *picture = NULL; + GstH264Picture *first_field = NULL; + GstFlowReturn ret = GST_FLOW_OK; - picture = gst_h264_picture_new (); - picture->pts = pts; - /* This allows accessing the frame from the picture. */ - picture->system_frame_number = priv->current_frame->system_frame_number; - - priv->current_picture = picture; g_assert (priv->current_frame); - if (klass->new_picture) - ret = klass->new_picture (self, priv->current_frame, picture); + if (!gst_h264_decoder_find_first_field_picture (self, + &priv->current_slice, &first_field)) { + GST_ERROR_OBJECT (self, "Couldn't find or determine first picture"); + return GST_FLOW_ERROR; + } - if (!ret) { - GST_ERROR_OBJECT (self, "subclass does not want accept new picture"); - priv->current_picture = NULL; - gst_h264_picture_unref (picture); - return FALSE; + if (first_field) { + picture = gst_h264_decoder_new_field_picture (self, first_field); + gst_h264_picture_unref (first_field); + + if (!picture) { + GST_ERROR_OBJECT (self, "Couldn't duplicate the first field picture"); + return GST_FLOW_ERROR; + } + } else { + picture = gst_h264_picture_new (); + + if (klass->new_picture) + ret = klass->new_picture (self, priv->current_frame, picture); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass does not want to accept new picture"); + priv->current_picture = NULL; + gst_h264_picture_unref (picture); + return ret; + } + } - if (!gst_h264_decoder_start_current_picture (self)) { - GST_ERROR_OBJECT (self, "start picture failed"); - return FALSE; + /* This allows accessing the frame from the picture. 
*/ + picture->system_frame_number = priv->current_frame->system_frame_number; + priv->current_picture = picture; + + ret = gst_h264_decoder_start_current_picture (self); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "start picture failed"); + return ret; } } return gst_h264_decoder_decode_slice (self); } -static gboolean -gst_h264_decoder_decode_nal (GstH264Decoder * self, GstH264NalUnit * nalu, - GstClockTime pts) +static GstFlowReturn +gst_h264_decoder_decode_nal (GstH264Decoder * self, GstH264NalUnit * nalu) { - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; GST_LOG_OBJECT (self, "Parsed nal type: %d, offset %d, size %d", nalu->type, nalu->offset, nalu->size); @@ -801,7 +1345,7 @@ case GST_H264_NAL_SLICE_DPC: case GST_H264_NAL_SLICE_IDR: case GST_H264_NAL_SLICE_EXT: - ret = gst_h264_decoder_parse_slice (self, nalu, pts); + ret = gst_h264_decoder_parse_slice (self, nalu); break; default: break; @@ -926,7 +1470,8 @@ GstMapInfo map; gst_buffer_map (priv->codec_data, &map, GST_MAP_READ); - if (!gst_h264_decoder_parse_codec_data (self, map.data, map.size)) { + if (gst_h264_decoder_parse_codec_data (self, map.data, map.size) != + GST_FLOW_OK) { /* keep going without error. 
* Probably inband SPS/PPS might be valid data */ GST_WARNING_OBJECT (self, "Failed to handle codec data"); @@ -951,6 +1496,7 @@ gst_h264_decoder_fill_picture_from_slice (GstH264Decoder * self, const GstH264Slice * slice, GstH264Picture * picture) { + GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); const GstH264SliceHdr *slice_hdr = &slice->header; const GstH264PPS *pps; const GstH264SPS *sps; @@ -979,16 +1525,23 @@ else picture->field = GST_H264_PICTURE_FIELD_FRAME; - if (picture->field != GST_H264_PICTURE_FIELD_FRAME) { - GST_FIXME ("Interlace video not supported"); + if (!GST_H264_PICTURE_IS_FRAME (picture) && !klass->new_field_picture) { + GST_FIXME_OBJECT (self, "Subclass doesn't support interlace stream"); return FALSE; } picture->nal_ref_idc = slice->nalu.ref_idc; - picture->ref = slice->nalu.ref_idc != 0; + if (slice->nalu.ref_idc != 0) + gst_h264_picture_set_reference (picture, + GST_H264_PICTURE_REF_SHORT_TERM, FALSE); + + picture->frame_num = slice_hdr->frame_num; - /* This assumes non-interlaced stream */ - picture->frame_num = picture->pic_num = slice_hdr->frame_num; + /* 7.4.3 */ + if (!slice_hdr->field_pic_flag) + picture->pic_num = slice_hdr->frame_num; + else + picture->pic_num = 2 * slice_hdr->frame_num + 1; picture->pic_order_cnt_type = sps->pic_order_cnt_type; switch (picture->pic_order_cnt_type) { @@ -1066,15 +1619,21 @@ picture->pic_order_cnt_msb + picture->pic_order_cnt_lsb; } - if (picture->field != GST_H264_PICTURE_FIELD_TOP_FIELD) { - if (picture->field == GST_H264_PICTURE_FIELD_FRAME) { - picture->bottom_field_order_cnt = - picture->top_field_order_cnt + + switch (picture->field) { + case GST_H264_PICTURE_FIELD_FRAME: + picture->top_field_order_cnt = picture->pic_order_cnt_msb + + picture->pic_order_cnt_lsb; + picture->bottom_field_order_cnt = picture->top_field_order_cnt + picture->delta_pic_order_cnt_bottom; - } else { - picture->bottom_field_order_cnt = - picture->pic_order_cnt_msb + picture->pic_order_cnt_lsb; - } + 
break; + case GST_H264_PICTURE_FIELD_TOP_FIELD: + picture->top_field_order_cnt = picture->pic_order_cnt_msb + + picture->pic_order_cnt_lsb; + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + picture->bottom_field_order_cnt = picture->pic_order_cnt_msb + + picture->pic_order_cnt_lsb; + break; } break; } @@ -1134,7 +1693,7 @@ if (!picture->nal_ref_idc) expected_pic_order_cnt += sps->offset_for_non_ref_pic; - if (picture->field == GST_H264_PICTURE_FIELD_FRAME) { + if (GST_H264_PICTURE_IS_FRAME (picture)) { picture->top_field_order_cnt = expected_pic_order_cnt + picture->delta_pic_order_cnt0; picture->bottom_field_order_cnt = picture->top_field_order_cnt + @@ -1174,7 +1733,7 @@ 2 * (picture->frame_num_offset + picture->frame_num); } - if (picture->field == GST_H264_PICTURE_FIELD_FRAME) { + if (GST_H264_PICTURE_IS_FRAME (picture)) { picture->top_field_order_cnt = temp_pic_order_cnt; picture->bottom_field_order_cnt = temp_pic_order_cnt; } else if (picture->field == GST_H264_PICTURE_FIELD_BOTTOM_FIELD) { @@ -1211,24 +1770,35 @@ } static void -gst_h264_decoder_do_output_picture (GstH264Decoder * self, - GstH264Picture * picture, gboolean clear_dpb) +gst_h264_decoder_drain_output_queue (GstH264Decoder * self, guint num, + GstFlowReturn * ret) { GstH264DecoderPrivate *priv = self->priv; - GstH264DecoderClass *klass; - GstVideoCodecFrame *frame = NULL; + GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); - picture->outputted = TRUE; + g_assert (klass->output_picture); + g_assert (ret != NULL); - if (clear_dpb && !picture->ref) - gst_h264_dpb_delete_by_poc (priv->dpb, picture->pic_order_cnt); + while (gst_queue_array_get_length (priv->output_queue) > num) { + GstH264DecoderOutputFrame *output_frame = (GstH264DecoderOutputFrame *) + gst_queue_array_pop_head_struct (priv->output_queue); + GstFlowReturn flow_ret = klass->output_picture (self, output_frame->frame, + output_frame->picture); - if (picture->nonexisting) { - GST_DEBUG_OBJECT (self, "Skipping output, 
non-existing frame_num %d", - picture->frame_num); - gst_h264_picture_unref (picture); - return; + UPDATE_FLOW_RETURN (ret, flow_ret); } +} + +static void +gst_h264_decoder_do_output_picture (GstH264Decoder * self, + GstH264Picture * picture, GstFlowReturn * ret) +{ + GstH264DecoderPrivate *priv = self->priv; + GstVideoCodecFrame *frame = NULL; + GstH264DecoderOutputFrame output_frame; + GstFlowReturn flow_ret = GST_FLOW_OK; + + g_assert (ret != NULL); GST_LOG_OBJECT (self, "Outputting picture %p (frame_num %d, poc %d)", picture, picture->frame_num, picture->pic_order_cnt); @@ -1248,32 +1818,39 @@ GST_ERROR_OBJECT (self, "No available codec frame with frame number %d", picture->system_frame_number); - priv->last_ret = GST_FLOW_ERROR; + UPDATE_FLOW_RETURN (ret, GST_FLOW_ERROR); + gst_h264_picture_unref (picture); return; } - klass = GST_H264_DECODER_GET_CLASS (self); - - g_assert (klass->output_picture); - priv->last_ret = klass->output_picture (self, frame, picture); + output_frame.frame = frame; + output_frame.picture = picture; + output_frame.self = self; + gst_queue_array_push_tail_struct (priv->output_queue, &output_frame); + + gst_h264_decoder_drain_output_queue (self, priv->preferred_output_delay, + &flow_ret); + UPDATE_FLOW_RETURN (ret, flow_ret); } -static gboolean -gst_h264_decoder_finish_current_picture (GstH264Decoder * self) +static void +gst_h264_decoder_finish_current_picture (GstH264Decoder * self, + GstFlowReturn * ret) { GstH264DecoderPrivate *priv = self->priv; GstH264DecoderClass *klass; - gboolean ret = TRUE; + GstFlowReturn flow_ret = GST_FLOW_OK; if (!priv->current_picture) - return TRUE; + return; klass = GST_H264_DECODER_GET_CLASS (self); if (klass->end_picture) { - if (!klass->end_picture (self, priv->current_picture)) { + flow_ret = klass->end_picture (self, priv->current_picture); + if (flow_ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "end picture failed, marking picture %p non-existing " "(frame_num %d, poc %d)", 
priv->current_picture, @@ -1291,15 +1868,10 @@ gst_h264_decoder_clear_ref_pic_lists (self); /* finish picture takes ownership of the picture */ - ret = gst_h264_decoder_finish_picture (self, priv->current_picture); + gst_h264_decoder_finish_picture (self, priv->current_picture, &flow_ret); priv->current_picture = NULL; - if (!ret) { - GST_ERROR_OBJECT (self, "Failed to finish picture"); - return FALSE; - } - - return TRUE; + UPDATE_FLOW_RETURN (ret, flow_ret); } static gint @@ -1314,36 +1886,24 @@ return (*b)->pic_order_cnt - (*a)->pic_order_cnt; } -static gboolean +static GstFlowReturn gst_h264_decoder_drain_internal (GstH264Decoder * self) { GstH264DecoderPrivate *priv = self->priv; - GArray *to_output = priv->to_output; - - /* We are around to drain, so we can get rist of everything that has been - * outputed already */ - gst_h264_dpb_delete_outputed (priv->dpb); - gst_h264_dpb_get_pictures_not_outputted (priv->dpb, to_output); - g_array_sort (to_output, (GCompareFunc) poc_asc_compare); - - while (to_output->len) { - GstH264Picture *picture = g_array_index (to_output, GstH264Picture *, 0); + GstH264Picture *picture; + GstFlowReturn ret = GST_FLOW_OK; - /* We want the last reference when outputing so take a ref and then remove - * from both arrays. 
*/ - gst_h264_picture_ref (picture); - g_array_remove_index (to_output, 0); - gst_h264_dpb_delete_by_poc (priv->dpb, picture->pic_order_cnt); - - GST_LOG_OBJECT (self, "Output picture %p (frame num %d, poc %d)", picture, - picture->frame_num, picture->pic_order_cnt); - gst_h264_decoder_do_output_picture (self, picture, FALSE); + while ((picture = gst_h264_dpb_bump (priv->dpb, TRUE)) != NULL) { + gst_h264_decoder_do_output_picture (self, picture, &ret); } - g_array_set_size (to_output, 0); + gst_h264_decoder_drain_output_queue (self, 0, &ret); + + gst_h264_picture_clear (&priv->last_field); gst_h264_dpb_clear (priv->dpb); - priv->last_output_poc = 0; - return TRUE; + priv->last_output_poc = G_MININT32; + + return ret; } static gboolean @@ -1351,151 +1911,87 @@ GstH264Picture * picture) { GstH264DecoderPrivate *priv = self->priv; - gint i, j; + gint i; for (i = 0; i < G_N_ELEMENTS (picture->dec_ref_pic_marking.ref_pic_marking); i++) { GstH264RefPicMarking *ref_pic_marking = &picture->dec_ref_pic_marking.ref_pic_marking[i]; - GstH264Picture *to_mark; - gint pic_num_x; + guint8 type = ref_pic_marking->memory_management_control_operation; - switch (ref_pic_marking->memory_management_control_operation) { - case 0: - /* Normal end of operations' specification */ - return TRUE; - case 1: - /* Mark a short term reference picture as unused so it can be removed - * if outputted */ - pic_num_x = - picture->pic_num - (ref_pic_marking->difference_of_pic_nums_minus1 + - 1); - to_mark = gst_h264_dpb_get_short_ref_by_pic_num (priv->dpb, pic_num_x); - if (to_mark) { - to_mark->ref = FALSE; - } else { - GST_WARNING_OBJECT (self, "Invalid short term ref pic num to unmark"); - return FALSE; - } - break; + GST_TRACE_OBJECT (self, "memory management operation %d, type %d", i, type); - case 2: - /* Mark a long term reference picture as unused so it can be removed - * if outputted */ - to_mark = gst_h264_dpb_get_long_ref_by_pic_num (priv->dpb, - ref_pic_marking->long_term_pic_num); - if 
(to_mark) { - to_mark->ref = FALSE; - } else { - GST_WARNING_OBJECT (self, "Invalid long term ref pic num to unmark"); - return FALSE; - } - break; + /* Normal end of operations' specification */ + if (type == 0) + return TRUE; - case 3: - /* Mark a short term reference picture as long term reference */ - pic_num_x = - picture->pic_num - (ref_pic_marking->difference_of_pic_nums_minus1 + - 1); - to_mark = gst_h264_dpb_get_short_ref_by_pic_num (priv->dpb, pic_num_x); - if (to_mark) { - to_mark->long_term = TRUE; - to_mark->long_term_frame_idx = ref_pic_marking->long_term_frame_idx; - } else { - GST_WARNING_OBJECT (self, - "Invalid short term ref pic num to mark as long ref"); - return FALSE; - } - break; - - case 4:{ - GArray *pictures = gst_h264_dpb_get_pictures_all (priv->dpb); - - /* Unmark all reference pictures with long_term_frame_idx over new max */ + switch (type) { + case 4: priv->max_long_term_frame_idx = ref_pic_marking->max_long_term_frame_idx_plus1 - 1; - - for (j = 0; j < pictures->len; j++) { - GstH264Picture *pic = g_array_index (pictures, GstH264Picture *, j); - if (pic->long_term && - pic->long_term_frame_idx > priv->max_long_term_frame_idx) - pic->ref = FALSE; - } - - g_array_unref (pictures); break; - } - case 5: - /* Unmark all reference pictures */ - gst_h264_dpb_mark_all_non_ref (priv->dpb); priv->max_long_term_frame_idx = -1; - picture->mem_mgmt_5 = TRUE; - break; - - case 6:{ - GArray *pictures = gst_h264_dpb_get_pictures_all (priv->dpb); - - /* Replace long term reference pictures with current picture. - * First unmark if any existing with this long_term_frame_idx... 
*/ - - for (j = 0; j < pictures->len; j++) { - GstH264Picture *pic = g_array_index (pictures, GstH264Picture *, j); - - if (pic->long_term && - pic->long_term_frame_idx == ref_pic_marking->long_term_frame_idx) - pic->ref = FALSE; - } - - g_array_unref (pictures); - - /* and mark the current one instead */ - picture->ref = TRUE; - picture->long_term = TRUE; - picture->long_term_frame_idx = ref_pic_marking->long_term_frame_idx; break; - } - default: - g_assert_not_reached (); break; } + + if (!gst_h264_dpb_perform_memory_management_control_operation (priv->dpb, + ref_pic_marking, picture)) { + GST_WARNING_OBJECT (self, "memory management operation type %d failed", + type); + /* Most likely our implementation fault, but let's just perform + * next MMCO if any */ + } } return TRUE; } static gboolean -gst_h264_decoder_sliding_window_picture_marking (GstH264Decoder * self) +gst_h264_decoder_sliding_window_picture_marking (GstH264Decoder * self, + GstH264Picture * picture) { GstH264DecoderPrivate *priv = self->priv; const GstH264SPS *sps = priv->active_sps; gint num_ref_pics; gint max_num_ref_frames; + /* Skip this for the second field */ + if (picture->second_field) + return TRUE; + if (!sps) { GST_ERROR_OBJECT (self, "No active sps"); return FALSE; } /* 8.2.5.3. Ensure the DPB doesn't overflow by discarding the oldest picture */ - num_ref_pics = gst_h264_dpb_num_ref_pictures (priv->dpb); + num_ref_pics = gst_h264_dpb_num_ref_frames (priv->dpb); max_num_ref_frames = MAX (1, sps->num_ref_frames); - if (num_ref_pics > max_num_ref_frames) { - GST_WARNING_OBJECT (self, - "num_ref_pics %d is larger than allowed maximum %d", - num_ref_pics, max_num_ref_frames); - return FALSE; - } + if (num_ref_pics < max_num_ref_frames) + return TRUE; - if (num_ref_pics == max_num_ref_frames) { + /* In theory, num_ref_pics shouldn't be larger than max_num_ref_frames + * but it could happen if our implementation is wrong somehow or so. 
+ * Just try to remove reference pictures as many as possible in order to + * avoid DPB overflow. + */ + while (num_ref_pics >= max_num_ref_frames) { /* Max number of reference pics reached, need to remove one of the short * term ones. Find smallest frame_num_wrap short reference picture and mark * it as unused */ GstH264Picture *to_unmark = gst_h264_dpb_get_lowest_frame_num_short_ref (priv->dpb); + if (num_ref_pics > max_num_ref_frames) { + GST_WARNING_OBJECT (self, + "num_ref_pics %d is larger than allowed maximum %d", + num_ref_pics, max_num_ref_frames); + } + if (!to_unmark) { GST_WARNING_OBJECT (self, "Could not find a short ref picture to unmark"); return FALSE; @@ -1505,8 +2001,10 @@ "Unmark reference flag of picture %p (frame_num %d, poc %d)", to_unmark, to_unmark->frame_num, to_unmark->pic_order_cnt); - to_unmark->ref = FALSE; + gst_h264_picture_set_reference (to_unmark, GST_H264_PICTURE_REF_NONE, TRUE); gst_h264_picture_unref (to_unmark); + + num_ref_pics--; } return TRUE; @@ -1528,11 +2026,13 @@ gst_h264_dpb_mark_all_non_ref (priv->dpb); if (picture->dec_ref_pic_marking.long_term_reference_flag) { - picture->long_term = TRUE; + gst_h264_picture_set_reference (picture, + GST_H264_PICTURE_REF_LONG_TERM, FALSE); picture->long_term_frame_idx = 0; priv->max_long_term_frame_idx = 0; } else { - picture->long_term = FALSE; + gst_h264_picture_set_reference (picture, + GST_H264_PICTURE_REF_SHORT_TERM, FALSE); priv->max_long_term_frame_idx = -1; } @@ -1553,19 +2053,47 @@ return gst_h264_decoder_handle_memory_management_opt (self, picture); } - return gst_h264_decoder_sliding_window_picture_marking (self); + return gst_h264_decoder_sliding_window_picture_marking (self, picture); } -static gboolean +static GstH264DpbBumpMode +get_bump_level (GstH264Decoder * self) +{ + GstH264DecoderPrivate *priv = self->priv; + + /* User set the mode explicitly. 
*/ + switch (priv->compliance) { + case GST_H264_DECODER_COMPLIANCE_STRICT: + return GST_H264_DPB_BUMP_NORMAL_LATENCY; + case GST_H264_DECODER_COMPLIANCE_NORMAL: + return GST_H264_DPB_BUMP_LOW_LATENCY; + case GST_H264_DECODER_COMPLIANCE_FLEXIBLE: + return GST_H264_DPB_BUMP_VERY_LOW_LATENCY; + default: + break; + } + + /* GST_H264_DECODER_COMPLIANCE_AUTO case. */ + + if (priv->is_live) { + /* The baseline and constrained-baseline profiles do not have B frames + and do not use the picture reorder, safe to use the higher bump level. */ + if (priv->profile_idc == GST_H264_PROFILE_BASELINE) + return GST_H264_DPB_BUMP_VERY_LOW_LATENCY; + + return GST_H264_DPB_BUMP_LOW_LATENCY; + } + + return GST_H264_DPB_BUMP_NORMAL_LATENCY; +} + +static void gst_h264_decoder_finish_picture (GstH264Decoder * self, - GstH264Picture * picture) + GstH264Picture * picture, GstFlowReturn * ret) { + GstVideoDecoder *decoder = GST_VIDEO_DECODER (self); GstH264DecoderPrivate *priv = self->priv; - GArray *not_outputted = priv->to_output; - guint num_remaining; -#ifndef GST_DISABLE_GST_DEBUG - gint i; -#endif + GstH264DpbBumpMode bump_level = get_bump_level (self); /* Finish processing the picture. 
* Start by storing previous picture data for later use */ @@ -1587,185 +2115,82 @@ * them as such */ gst_h264_dpb_delete_unused (priv->dpb); - GST_LOG_OBJECT (self, - "Finishing picture %p (frame_num %d, poc %d), entries in DPB %d", - picture, picture->frame_num, picture->pic_order_cnt, - gst_h264_dpb_get_size (priv->dpb)); - - /* The ownership of pic will either be transferred to DPB - if the picture is - * still needed (for output and/or reference) - or we will release it - * immediately if we manage to output it here and won't have to store it for - * future reference */ - - /* Get all pictures that haven't been outputted yet */ - gst_h264_dpb_get_pictures_not_outputted (priv->dpb, not_outputted); - /* Include the one we've just decoded */ - g_array_append_val (not_outputted, picture); + /* If field pictures belong to different codec frame, + * drop codec frame of the second field because we are consuming + * only the first codec frame via GstH264Decoder::output_picture() method */ + if (picture->second_field && picture->other_field && + picture->system_frame_number != + picture->other_field->system_frame_number) { + GstVideoCodecFrame *frame = gst_video_decoder_get_frame (decoder, + picture->system_frame_number); - /* for debugging */ -#ifndef GST_DISABLE_GST_DEBUG - if (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_TRACE) { - GST_TRACE_OBJECT (self, "Before sorting not outputted list"); - for (i = 0; i < not_outputted->len; i++) { - GstH264Picture *tmp = g_array_index (not_outputted, GstH264Picture *, i); - GST_TRACE_OBJECT (self, - "\t%dth picture %p (frame_num %d, poc %d)", i, tmp, - tmp->frame_num, tmp->pic_order_cnt); - } + gst_video_decoder_release_frame (decoder, frame); } -#endif - /* Sort in output order */ - g_array_sort (not_outputted, (GCompareFunc) poc_asc_compare); + /* C.4.4 */ + if (picture->mem_mgmt_5) { + GstFlowReturn drain_ret; + + GST_TRACE_OBJECT (self, "Memory management type 5, drain the DPB"); + + drain_ret = 
gst_h264_decoder_drain_internal (self); + UPDATE_FLOW_RETURN (ret, drain_ret); + } + + _bump_dpb (self, bump_level, picture, ret); + + /* Add a ref to avoid the case of directly outputed and destroyed. */ + gst_h264_picture_ref (picture); + + /* C.4.5.1, C.4.5.2 + - If the current decoded picture is the second field of a complementary + reference field pair, add to DPB. + C.4.5.1 + For A reference decoded picture, the "bumping" process is invoked + repeatedly until there is an empty frame buffer, then add to DPB: + C.4.5.2 + For a non-reference decoded picture, if there is empty frame buffer + after bumping the smaller POC, add to DPB. + Otherwise, output directly. */ + if ((picture->second_field && picture->other_field + && picture->other_field->ref) + || picture->ref || gst_h264_dpb_has_empty_frame_buffer (priv->dpb)) { + /* Split frame into top/bottom field pictures for reference picture marking + * process. Even if current picture has field_pic_flag equal to zero, + * if next picture is a field picture, complementary field pair of reference + * frame should have individual pic_num and long_term_pic_num. + */ + if (gst_h264_dpb_get_interlaced (priv->dpb) && + GST_H264_PICTURE_IS_FRAME (picture)) { + GstH264Picture *other_field = + gst_h264_decoder_split_frame (self, picture); -#ifndef GST_DISABLE_GST_DEBUG - if (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_TRACE) { - GST_TRACE_OBJECT (self, - "After sorting not outputted list in poc ascending order"); - for (i = 0; i < not_outputted->len; i++) { - GstH264Picture *tmp = g_array_index (not_outputted, GstH264Picture *, i); - GST_TRACE_OBJECT (self, - "\t%dth picture %p (frame_num %d, poc %d)", i, tmp, - tmp->frame_num, tmp->pic_order_cnt); - } - } -#endif - - /* Try to output as many pictures as we can. A picture can be output, - * if the number of decoded and not yet outputted pictures that would remain - * in DPB afterwards would at least be equal to max_num_reorder_frames. 
- * If the outputted picture is not a reference picture, it doesn't have - * to remain in the DPB and can be removed */ - num_remaining = not_outputted->len; - - while (num_remaining > priv->max_num_reorder_frames || - /* If the condition below is used, this is an invalid stream. We should - * not be forced to output beyond max_num_reorder_frames in order to - * make room in DPB to store the current picture (if we need to do so). - * However, if this happens, ignore max_num_reorder_frames and try - * to output more. This may cause out-of-order output, but is not - * fatal, and better than failing instead */ - ((gst_h264_dpb_is_full (priv->dpb) && (picture && (!picture->outputted - || picture->ref))) - && num_remaining)) { - gboolean clear_dpb = TRUE; - GstH264Picture *to_output = - g_array_index (not_outputted, GstH264Picture *, 0); - - gst_h264_picture_ref (to_output); - g_array_remove_index (not_outputted, 0); - - if (num_remaining <= priv->max_num_reorder_frames) { - GST_WARNING_OBJECT (self, - "Invalid stream, max_num_reorder_frames not preserved"); - } - - GST_LOG_OBJECT (self, - "Output picture %p (frame num %d)", to_output, to_output->frame_num); - - /* Current picture hasn't been inserted into DPB yet, so don't remove it - * if we managed to output it immediately */ - if (picture && to_output == picture) { - clear_dpb = FALSE; - - if (picture->ref) { - GST_TRACE_OBJECT (self, - "Put current picture %p (frame num %d, poc %d) to dpb", - picture, picture->frame_num, picture->pic_order_cnt); - gst_h264_dpb_add (priv->dpb, gst_h264_picture_ref (picture)); + add_picture_to_dpb (self, picture); + if (!other_field) { + GST_WARNING_OBJECT (self, + "Couldn't split frame into complementary field pair"); + /* Keep decoding anyway... 
*/ + } else { + add_picture_to_dpb (self, other_field); } - - /* and mark current picture is handled */ - picture = NULL; - } - - gst_h264_decoder_do_output_picture (self, to_output, clear_dpb); - - num_remaining--; - } - - /* If we haven't managed to output the picture that we just decoded, or if - * it's a reference picture, we have to store it in DPB */ - if (picture && (!picture->outputted || picture->ref)) { - if (gst_h264_dpb_is_full (priv->dpb)) { - /* If we haven't managed to output anything to free up space in DPB - * to store this picture, it's an error in the stream */ - GST_WARNING_OBJECT (self, "Could not free up space in DPB"); - - g_array_set_size (not_outputted, 0); - return FALSE; - } - - GST_TRACE_OBJECT (self, - "Put picture %p (outputted %d, ref %d, frame num %d, poc %d) to dpb", - picture, picture->outputted, picture->ref, picture->frame_num, - picture->pic_order_cnt); - gst_h264_dpb_add (priv->dpb, gst_h264_picture_ref (picture)); - } - - /* clear possible reference to the current picture. 
- * If *picture* is still non-null, it means that the current picture not - * outputted yet, and DPB may or may not hold the reference of the picture */ - if (picture) - gst_h264_picture_ref (picture); - - g_array_set_size (not_outputted, 0); - - /* C.4.5.3 "Bumping" process for non-DPB full case, DPB full cases should be - * covered above */ - /* FIXME: should cover interlaced streams */ - if (picture && !picture->outputted && - picture->field == GST_H264_PICTURE_FIELD_FRAME) { - gboolean do_output = TRUE; - if (picture->idr && - !picture->dec_ref_pic_marking.no_output_of_prior_pics_flag) { - /* The current picture is an IDR picture and no_output_of_prior_pics_flag - * is not equal to 1 and is not inferred to be equal to 1, as specified - * in clause C.4.4 */ - GST_TRACE_OBJECT (self, "Output IDR picture"); - } else if (picture->mem_mgmt_5) { - /* The current picture has memory_management_control_operation equal to 5, - * as specified in clause C.4.4 */ - GST_TRACE_OBJECT (self, "Output mem_mgmt_5 picture"); - } else if (priv->last_output_poc >= 0 && - picture->pic_order_cnt > priv->last_output_poc && - (picture->pic_order_cnt - priv->last_output_poc) <= 2 && - /* NOTE: this might have a negative effect on throughput performance - * depending on hardware implementation. - * TODO: Possible solution is threading but it would make decoding flow - * very complicated. */ - priv->is_live) { - /* NOTE: this condition is not specified by spec but we can output - * this picture based on calculated POC and last outputted POC */ - - /* NOTE: The assumption here is, every POC of frame will have step of two. - * however, if the assumption is wrong, (i.e., POC step is one, not two), - * this would break output order. If this assumption is wrong, - * please remove this condition. 
- */ - GST_LOG_OBJECT (self, - "Forcing output picture %p (frame num %d, poc %d, last poc %d)", - picture, picture->frame_num, picture->pic_order_cnt, - priv->last_output_poc); } else { - do_output = FALSE; - GST_TRACE_OBJECT (self, "Current picture %p (frame num %d, poc %d) " - "is not ready to be output picture", - picture, picture->frame_num, picture->pic_order_cnt); - } - - if (do_output) { - /* pass ownership of the current picture. At this point, - * dpb must be holding a reference of the current picture */ - gst_h264_decoder_do_output_picture (self, picture, TRUE); - picture = NULL; + add_picture_to_dpb (self, picture); } + } else { + output_picture_directly (self, picture, ret); } - if (picture) - gst_h264_picture_unref (picture); + GST_LOG_OBJECT (self, + "Finishing picture %p (frame_num %d, poc %d), entries in DPB %d", + picture, picture->frame_num, picture->pic_order_cnt, + gst_h264_dpb_get_size (priv->dpb)); - return TRUE; + gst_h264_picture_unref (picture); + + /* For the live mode, we try to bump here to avoid waiting + for another decoding circle. 
*/ + if (priv->is_live && priv->compliance != GST_H264_DECODER_COMPLIANCE_STRICT) + _bump_dpb (self, bump_level, NULL, ret); } static gboolean @@ -1773,27 +2198,39 @@ GstH264SPS * sps) { GstH264DecoderPrivate *priv = self->priv; + gsize max_num_reorder_frames = 0; if (sps->vui_parameters_present_flag && sps->vui_parameters.bitstream_restriction_flag) { - priv->max_num_reorder_frames = sps->vui_parameters.num_reorder_frames; - if (priv->max_num_reorder_frames > - gst_h264_dpb_get_max_num_pics (priv->dpb)) { + max_num_reorder_frames = sps->vui_parameters.num_reorder_frames; + if (max_num_reorder_frames > gst_h264_dpb_get_max_num_frames (priv->dpb)) { GST_WARNING ("max_num_reorder_frames present, but larger than MaxDpbFrames (%d > %d)", - (gint) priv->max_num_reorder_frames, - gst_h264_dpb_get_max_num_pics (priv->dpb)); + (gint) max_num_reorder_frames, + gst_h264_dpb_get_max_num_frames (priv->dpb)); - priv->max_num_reorder_frames = 0; + max_num_reorder_frames = 0; return FALSE; } + gst_h264_dpb_set_max_num_reorder_frames (priv->dpb, max_num_reorder_frames); + return TRUE; } - /* max_num_reorder_frames not present, infer from profile/constraints - * (see VUI semantics in spec) */ - if (sps->constraint_set3_flag) { + if (priv->compliance == GST_H264_DECODER_COMPLIANCE_STRICT) { + gst_h264_dpb_set_max_num_reorder_frames (priv->dpb, + gst_h264_dpb_get_max_num_frames (priv->dpb)); + return TRUE; + } + + /* max_num_reorder_frames not present, infer it from profile/constraints. */ + if (sps->profile_idc == 66 || sps->profile_idc == 83) { + /* baseline, constrained baseline and scalable-baseline profiles + only contain I/P frames. */ + max_num_reorder_frames = 0; + } else if (sps->constraint_set3_flag) { + /* constraint_set3_flag may mean the -intra only profile. 
*/ switch (sps->profile_idc) { case 44: case 86: @@ -1801,17 +2238,18 @@ case 110: case 122: case 244: - priv->max_num_reorder_frames = 0; + max_num_reorder_frames = 0; break; default: - priv->max_num_reorder_frames = - gst_h264_dpb_get_max_num_pics (priv->dpb); + max_num_reorder_frames = gst_h264_dpb_get_max_num_frames (priv->dpb); break; } } else { - priv->max_num_reorder_frames = gst_h264_dpb_get_max_num_pics (priv->dpb); + max_num_reorder_frames = gst_h264_dpb_get_max_num_frames (priv->dpb); } + gst_h264_dpb_set_max_num_reorder_frames (priv->dpb, max_num_reorder_frames); + return TRUE; } @@ -1884,9 +2322,74 @@ return 0; } -static gboolean +static void +gst_h264_decoder_set_latency (GstH264Decoder * self, const GstH264SPS * sps, + gint max_dpb_size) +{ + GstH264DecoderPrivate *priv = self->priv; + GstCaps *caps; + GstClockTime min, max; + GstStructure *structure; + gint fps_d = 1, fps_n = 0; + GstH264DpbBumpMode bump_level; + guint32 frames_delay; + + caps = gst_pad_get_current_caps (GST_VIDEO_DECODER_SRC_PAD (self)); + if (!caps) + return; + + structure = gst_caps_get_structure (caps, 0); + if (gst_structure_get_fraction (structure, "framerate", &fps_n, &fps_d)) { + if (fps_n == 0) { + /* variable framerate: see if we have a max-framerate */ + gst_structure_get_fraction (structure, "max-framerate", &fps_n, &fps_d); + } + } + gst_caps_unref (caps); + + /* if no fps or variable, then 25/1 */ + if (fps_n == 0) { + fps_n = 25; + fps_d = 1; + } + + bump_level = get_bump_level (self); + frames_delay = 0; + switch (bump_level) { + case GST_H264_DPB_BUMP_NORMAL_LATENCY: + /* We always wait the DPB full before bumping. */ + frames_delay = max_dpb_size; + break; + case GST_H264_DPB_BUMP_LOW_LATENCY: + /* We bump the IDR if the second frame is not a minus POC. */ + frames_delay = 1; + break; + case GST_H264_DPB_BUMP_VERY_LOW_LATENCY: + /* We bump the IDR immediately. 
*/ + frames_delay = 0; + break; + default: + g_assert_not_reached (); + break; + } + + /* Consider output delay wanted by subclass */ + frames_delay += priv->preferred_output_delay; + + min = gst_util_uint64_scale_int (frames_delay * GST_SECOND, fps_d, fps_n); + max = gst_util_uint64_scale_int ((max_dpb_size + priv->preferred_output_delay) + * GST_SECOND, fps_d, fps_n); + + GST_LOG_OBJECT (self, + "latency min %" G_GUINT64_FORMAT " max %" G_GUINT64_FORMAT, min, max); + + gst_video_decoder_set_latency (GST_VIDEO_DECODER (self), min, max); +} + +static GstFlowReturn gst_h264_decoder_process_sps (GstH264Decoder * self, GstH264SPS * sps) { + GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); GstH264DecoderPrivate *priv = self->priv; guint8 level; gint max_dpb_mbs; @@ -1894,12 +2397,27 @@ gint max_dpb_frames; gint max_dpb_size; gint prev_max_dpb_size; + gboolean prev_interlaced; + gboolean interlaced; + GstFlowReturn ret = GST_FLOW_OK; if (sps->frame_mbs_only_flag == 0) { - GST_FIXME_OBJECT (self, "frame_mbs_only_flag != 1 not supported"); - return FALSE; + if (!klass->new_field_picture) { + GST_FIXME_OBJECT (self, + "frame_mbs_only_flag != 1 not supported by subclass"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (sps->mb_adaptive_frame_field_flag) { + GST_LOG_OBJECT (self, + "mb_adaptive_frame_field_flag == 1, MBAFF sequence"); + } else { + GST_LOG_OBJECT (self, "mb_adaptive_frame_field_flag == 0, PAFF sequence"); + } } + interlaced = !sps->frame_mbs_only_flag; + /* Spec A.3.1 and A.3.2 * For Baseline, Constrained Baseline and Main profile, the indicated level is * Level 1b if level_idc is equal to 11 and constraint_set3_flag is equal to 1 @@ -1913,7 +2431,7 @@ max_dpb_mbs = h264_level_to_max_dpb_mbs ((GstH264DecoderLevel) level); if (!max_dpb_mbs) - return FALSE; + return GST_FLOW_ERROR; width_mb = sps->width / 16; height_mb = sps->height / 16; @@ -1939,37 +2457,53 @@ } /* Safety, so that subclass don't need bound checking */ - g_return_val_if_fail 
(max_dpb_size <= GST_H264_DPB_MAX_SIZE, FALSE); + g_return_val_if_fail (max_dpb_size <= GST_H264_DPB_MAX_SIZE, GST_FLOW_ERROR); - prev_max_dpb_size = gst_h264_dpb_get_max_num_pics (priv->dpb); + prev_max_dpb_size = gst_h264_dpb_get_max_num_frames (priv->dpb); + prev_interlaced = gst_h264_dpb_get_interlaced (priv->dpb); if (priv->width != sps->width || priv->height != sps->height || - prev_max_dpb_size != max_dpb_size) { + prev_max_dpb_size != max_dpb_size || prev_interlaced != interlaced) { GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); GST_DEBUG_OBJECT (self, - "SPS updated, resolution: %dx%d -> %dx%d, dpb size: %d -> %d", + "SPS updated, resolution: %dx%d -> %dx%d, dpb size: %d -> %d, " + "interlaced %d -> %d", priv->width, priv->height, sps->width, sps->height, - prev_max_dpb_size, max_dpb_size); + prev_max_dpb_size, max_dpb_size, prev_interlaced, interlaced); - if (gst_h264_decoder_drain (GST_VIDEO_DECODER (self)) != GST_FLOW_OK) - return FALSE; + ret = gst_h264_decoder_drain (GST_VIDEO_DECODER (self)); + if (ret != GST_FLOW_OK) + return ret; g_assert (klass->new_sequence); - if (!klass->new_sequence (self, sps, max_dpb_size)) { - GST_ERROR_OBJECT (self, "subclass does not want accept new sequence"); - return FALSE; + if (klass->get_preferred_output_delay) { + priv->preferred_output_delay = + klass->get_preferred_output_delay (self, priv->is_live); + } else { + priv->preferred_output_delay = 0; + } + + ret = klass->new_sequence (self, + sps, max_dpb_size + priv->preferred_output_delay); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass does not want accept new sequence"); + return ret; } + priv->profile_idc = sps->profile_idc; priv->width = sps->width; priv->height = sps->height; - gst_h264_dpb_set_max_num_pics (priv->dpb, max_dpb_size); + gst_h264_decoder_set_latency (self, sps, max_dpb_size); + gst_h264_dpb_set_max_num_frames (priv->dpb, max_dpb_size); + gst_h264_dpb_set_interlaced (priv->dpb, interlaced); } - GST_DEBUG_OBJECT 
(self, "Set DPB max size %d", max_dpb_size); + if (!gst_h264_decoder_update_max_num_reorder_frames (self, sps)) + return GST_FLOW_ERROR; - return gst_h264_decoder_update_max_num_reorder_frames (self, sps); + return GST_FLOW_OK; } static gboolean @@ -1980,14 +2514,15 @@ picture->nal_ref_idc = 1; picture->frame_num = picture->pic_num = frame_num; picture->dec_ref_pic_marking.adaptive_ref_pic_marking_mode_flag = FALSE; - picture->ref = TRUE; + picture->ref = GST_H264_PICTURE_REF_SHORT_TERM; + picture->ref_pic = TRUE; picture->dec_ref_pic_marking.long_term_reference_flag = FALSE; picture->field = GST_H264_PICTURE_FIELD_FRAME; return gst_h264_decoder_calculate_poc (self, picture); } -static gboolean +static GstFlowReturn gst_h264_decoder_decode_slice (GstH264Decoder * self) { GstH264DecoderClass *klass = GST_H264_DECODER_GET_CLASS (self); @@ -1996,11 +2531,11 @@ GstH264Picture *picture = priv->current_picture; GArray *ref_pic_list0 = NULL; GArray *ref_pic_list1 = NULL; - gboolean ret = FALSE; + GstFlowReturn ret = GST_FLOW_OK; if (!picture) { GST_ERROR_OBJECT (self, "No current picture"); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "Decode picture %p (frame_num %d, poc %d)", @@ -2009,8 +2544,10 @@ priv->max_pic_num = slice->header.max_pic_num; if (priv->process_ref_pic_lists) { - if (!gst_h264_decoder_modify_ref_pic_lists (self)) + if (!gst_h264_decoder_modify_ref_pic_lists (self)) { + ret = GST_FLOW_ERROR; goto beach; + } ref_pic_list0 = priv->ref_pic_list0; ref_pic_list1 = priv->ref_pic_list1; @@ -2020,7 +2557,7 @@ ret = klass->decode_slice (self, picture, slice, ref_pic_list0, ref_pic_list1); - if (!ret) { + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Subclass didn't want to decode picture %p (frame_num %d, poc %d)", picture, picture->frame_num, picture->pic_order_cnt); @@ -2047,7 +2584,8 @@ } static void -construct_ref_pic_lists_p (GstH264Decoder * self) +construct_ref_pic_lists_p (GstH264Decoder * self, + GstH264Picture * 
current_picture) { GstH264DecoderPrivate *priv = self->priv; gint pos; @@ -2058,11 +2596,13 @@ */ g_array_set_size (priv->ref_pic_list_p0, 0); - gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, priv->ref_pic_list_p0); + gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, + TRUE, FALSE, priv->ref_pic_list_p0); g_array_sort (priv->ref_pic_list_p0, (GCompareFunc) pic_num_desc_compare); pos = priv->ref_pic_list_p0->len; - gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, priv->ref_pic_list_p0); + gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, + FALSE, priv->ref_pic_list_p0); g_qsort_with_data (&g_array_index (priv->ref_pic_list_p0, gpointer, pos), priv->ref_pic_list_p0->len - pos, sizeof (gpointer), (GCompareDataFunc) long_term_pic_num_asc_compare, NULL); @@ -2073,7 +2613,7 @@ for (pos = 0; pos < priv->ref_pic_list_p0->len; pos++) { GstH264Picture *ref = g_array_index (priv->ref_pic_list_p0, GstH264Picture *, pos); - if (!ref->long_term) + if (!GST_H264_PICTURE_IS_LONG_TERM_REF (ref)) g_string_append_printf (str, "|%i", ref->pic_num); else g_string_append_printf (str, "|%is", ref->pic_num); @@ -2084,6 +2624,139 @@ #endif } +static gint +frame_num_wrap_desc_compare (const GstH264Picture ** a, + const GstH264Picture ** b) +{ + return (*b)->frame_num_wrap - (*a)->frame_num_wrap; +} + +static gint +long_term_frame_idx_asc_compare (const GstH264Picture ** a, + const GstH264Picture ** b) +{ + return (*a)->long_term_frame_idx - (*b)->long_term_frame_idx; +} + +/* init_picture_refs_fields_1 in gstvaapidecoder_h264.c */ +static void +init_picture_refs_fields_1 (GstH264Decoder * self, GstH264PictureField field, + GArray * ref_frame_list, GArray * ref_pic_list_x) +{ + guint i = 0, j = 0; + + do { + for (; i < ref_frame_list->len; i++) { + GstH264Picture *pic = g_array_index (ref_frame_list, GstH264Picture *, i); + if (pic->field == field) { + pic = gst_h264_picture_ref (pic); + g_array_append_val (ref_pic_list_x, pic); + i++; + break; + } + } + + for (; j < 
ref_frame_list->len; j++) { + GstH264Picture *pic = g_array_index (ref_frame_list, GstH264Picture *, j); + if (pic->field != field) { + pic = gst_h264_picture_ref (pic); + g_array_append_val (ref_pic_list_x, pic); + j++; + break; + } + } + } while (i < ref_frame_list->len || j < ref_frame_list->len); +} + +static void +construct_ref_field_pic_lists_p (GstH264Decoder * self, + GstH264Picture * current_picture) +{ + GstH264DecoderPrivate *priv = self->priv; + gint pos; + + g_array_set_size (priv->ref_pic_list_p0, 0); + g_array_set_size (priv->ref_frame_list_0_short_term, 0); + g_array_set_size (priv->ref_frame_list_long_term, 0); + + /* 8.2.4.2.2, 8.2.4.2.5 refFrameList0ShortTerm: + * short-term ref pictures sorted by descending frame_num_wrap. + */ + gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, + TRUE, TRUE, priv->ref_frame_list_0_short_term); + g_array_sort (priv->ref_frame_list_0_short_term, + (GCompareFunc) frame_num_wrap_desc_compare); + +#ifndef GST_DISABLE_GST_DEBUG + if (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_TRACE + && priv->ref_frame_list_0_short_term->len) { + GString *str = g_string_new (NULL); + for (pos = 0; pos < priv->ref_frame_list_0_short_term->len; pos++) { + GstH264Picture *ref = g_array_index (priv->ref_frame_list_0_short_term, + GstH264Picture *, pos); + g_string_append_printf (str, "|%i(%d)", ref->frame_num_wrap, ref->field); + } + GST_TRACE_OBJECT (self, "ref_frame_list_0_short_term (%d): %s|", + current_picture->field, str->str); + g_string_free (str, TRUE); + } +#endif + + /* 8.2.4.2.2 refFrameList0LongTerm,: + * long-term ref pictures sorted by ascending long_term_frame_idx. 
+ */ + gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, + TRUE, priv->ref_frame_list_long_term); + g_array_sort (priv->ref_frame_list_long_term, + (GCompareFunc) long_term_frame_idx_asc_compare); + +#ifndef GST_DISABLE_GST_DEBUG + if (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_TRACE + && priv->ref_frame_list_long_term->len) { + GString *str = g_string_new (NULL); + for (pos = 0; pos < priv->ref_frame_list_long_term->len; pos++) { + GstH264Picture *ref = g_array_index (priv->ref_frame_list_0_short_term, + GstH264Picture *, pos); + g_string_append_printf (str, "|%i(%d)", ref->long_term_frame_idx, + ref->field); + } + GST_TRACE_OBJECT (self, "ref_frame_list_0_long_term (%d): %s|", + current_picture->field, str->str); + g_string_free (str, TRUE); + } +#endif + + /* 8.2.4.2.5 */ + init_picture_refs_fields_1 (self, current_picture->field, + priv->ref_frame_list_0_short_term, priv->ref_pic_list_p0); + init_picture_refs_fields_1 (self, current_picture->field, + priv->ref_frame_list_long_term, priv->ref_pic_list_p0); + +#ifndef GST_DISABLE_GST_DEBUG + if (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_DEBUG + && priv->ref_pic_list_p0->len) { + GString *str = g_string_new (NULL); + for (pos = 0; pos < priv->ref_pic_list_p0->len; pos++) { + GstH264Picture *ref = + g_array_index (priv->ref_pic_list_p0, GstH264Picture *, pos); + if (!GST_H264_PICTURE_IS_LONG_TERM_REF (ref)) + g_string_append_printf (str, "|%i(%d)s", ref->frame_num_wrap, + ref->field); + else + g_string_append_printf (str, "|%i(%d)l", ref->long_term_frame_idx, + ref->field); + } + GST_DEBUG_OBJECT (self, "ref_pic_list_p0 (%d): %s|", current_picture->field, + str->str); + g_string_free (str, TRUE); + } +#endif + + /* Clear temporary lists, now pictures are owned by ref_pic_list_p0 */ + g_array_set_size (priv->ref_frame_list_0_short_term, 0); + g_array_set_size (priv->ref_frame_list_long_term, 0); +} + static gboolean lists_are_equal (GArray * l1, GArray * l2) { @@ -2115,7 
+2788,8 @@ } static void -print_ref_pic_list_b (GstH264Decoder * self, GArray * ref_list_b, gint index) +print_ref_pic_list_b (GstH264Decoder * self, GArray * ref_list_b, + const gchar * name) { #ifndef GST_DISABLE_GST_DEBUG GString *str; @@ -2129,20 +2803,21 @@ for (i = 0; i < ref_list_b->len; i++) { GstH264Picture *ref = g_array_index (ref_list_b, GstH264Picture *, i); - if (!ref->long_term) + if (!GST_H264_PICTURE_IS_LONG_TERM_REF (ref)) g_string_append_printf (str, "|%i", ref->pic_order_cnt); else g_string_append_printf (str, "|%il", ref->long_term_pic_num); } - GST_DEBUG_OBJECT (self, "ref_pic_list_b%i: %s| curr %i", index, str->str, + GST_DEBUG_OBJECT (self, "%s: %s| curr %i", name, str->str, self->priv->current_picture->pic_order_cnt); g_string_free (str, TRUE); #endif } static void -construct_ref_pic_lists_b (GstH264Decoder * self) +construct_ref_pic_lists_b (GstH264Decoder * self, + GstH264Picture * current_picture) { GstH264DecoderPrivate *priv = self->priv; gint pos; @@ -2154,13 +2829,20 @@ */ g_array_set_size (priv->ref_pic_list_b0, 0); g_array_set_size (priv->ref_pic_list_b1, 0); - gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, priv->ref_pic_list_b0); + + /* 8.2.4.2.3 + * When pic_order_cnt_type is equal to 0, reference pictures that are marked + * as "non-existing" as specified in clause 8.2.5.2 are not included in either + * RefPicList0 or RefPicList1 + */ + gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, + current_picture->pic_order_cnt_type != 0, FALSE, priv->ref_pic_list_b0); /* First sort ascending, this will put [1] in right place and finish * [2]. 
*/ - print_ref_pic_list_b (self, priv->ref_pic_list_b0, 0); + print_ref_pic_list_b (self, priv->ref_pic_list_b0, "ref_pic_list_b0"); g_array_sort (priv->ref_pic_list_b0, (GCompareFunc) poc_asc_compare); - print_ref_pic_list_b (self, priv->ref_pic_list_b0, 0); + print_ref_pic_list_b (self, priv->ref_pic_list_b0, "ref_pic_list_b0"); /* Find first with POC > current_picture's POC to get first element * in [2]... */ @@ -2175,7 +2857,8 @@ /* Now add [3] and sort by ascending long_term_pic_num. */ pos = priv->ref_pic_list_b0->len; - gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, priv->ref_pic_list_b0); + gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, + FALSE, priv->ref_pic_list_b0); g_qsort_with_data (&g_array_index (priv->ref_pic_list_b0, gpointer, pos), priv->ref_pic_list_b0->len - pos, sizeof (gpointer), (GCompareDataFunc) long_term_pic_num_asc_compare, NULL); @@ -2185,7 +2868,8 @@ * [2] shortterm ref pics with POC < curr_pic's POC by descending POC, * [3] longterm ref pics by ascending long_term_pic_num. */ - gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, priv->ref_pic_list_b1); + gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, + current_picture->pic_order_cnt_type != 0, FALSE, priv->ref_pic_list_b1); /* First sort by descending POC. 
*/ g_array_sort (priv->ref_pic_list_b1, (GCompareFunc) poc_desc_compare); @@ -2201,7 +2885,8 @@ /* Now add [3] and sort by ascending long_term_pic_num */ pos = priv->ref_pic_list_b1->len; - gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, priv->ref_pic_list_b1); + gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, + FALSE, priv->ref_pic_list_b1); g_qsort_with_data (&g_array_index (priv->ref_pic_list_b1, gpointer, pos), priv->ref_pic_list_b1->len - pos, sizeof (gpointer), (GCompareDataFunc) long_term_pic_num_asc_compare, NULL); @@ -2217,15 +2902,154 @@ list[1] = pic; } - print_ref_pic_list_b (self, priv->ref_pic_list_b0, 0); - print_ref_pic_list_b (self, priv->ref_pic_list_b1, 1); + print_ref_pic_list_b (self, priv->ref_pic_list_b0, "ref_pic_list_b0"); + print_ref_pic_list_b (self, priv->ref_pic_list_b1, "ref_pic_list_b1"); } static void -gst_h264_decoder_prepare_ref_pic_lists (GstH264Decoder * self) +construct_ref_field_pic_lists_b (GstH264Decoder * self, + GstH264Picture * current_picture) { - construct_ref_pic_lists_p (self); - construct_ref_pic_lists_b (self); + GstH264DecoderPrivate *priv = self->priv; + gint pos; + + /* refFrameList0ShortTerm (8.2.4.2.4) [[1] [2]], where: + * [1] shortterm ref pics with POC < current_picture's POC sorted by descending POC, + * [2] shortterm ref pics with POC > current_picture's POC by ascending POC, + */ + g_array_set_size (priv->ref_pic_list_b0, 0); + g_array_set_size (priv->ref_pic_list_b1, 0); + g_array_set_size (priv->ref_frame_list_0_short_term, 0); + g_array_set_size (priv->ref_frame_list_1_short_term, 0); + g_array_set_size (priv->ref_frame_list_long_term, 0); + + /* 8.2.4.2.4 + * When pic_order_cnt_type is equal to 0, reference pictures that are marked + * as "non-existing" as specified in clause 8.2.5.2 are not included in either + * RefPicList0 or RefPicList1 + */ + gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, + current_picture->pic_order_cnt_type != 0, TRUE, + priv->ref_frame_list_0_short_term); + + /* 
First sort ascending, this will put [1] in right place and finish + * [2]. */ + print_ref_pic_list_b (self, priv->ref_frame_list_0_short_term, + "ref_frame_list_0_short_term"); + g_array_sort (priv->ref_frame_list_0_short_term, + (GCompareFunc) poc_asc_compare); + print_ref_pic_list_b (self, priv->ref_frame_list_0_short_term, + "ref_frame_list_0_short_term"); + + /* Find first with POC > current_picture's POC to get first element + * in [2]... */ + pos = split_ref_pic_list_b (self, priv->ref_frame_list_0_short_term, + (GCompareFunc) poc_asc_compare); + + GST_DEBUG_OBJECT (self, "split point %i", pos); + + /* and sort [1] descending, thus finishing sequence [1] [2]. */ + g_qsort_with_data (priv->ref_frame_list_0_short_term->data, pos, + sizeof (gpointer), (GCompareDataFunc) poc_desc_compare, NULL); + + /* refFrameList1ShortTerm (8.2.4.2.4) [[1] [2]], where: + * [1] shortterm ref pics with POC > curr_pic's POC sorted by ascending POC, + * [2] shortterm ref pics with POC < curr_pic's POC by descending POC, + */ + gst_h264_dpb_get_pictures_short_term_ref (priv->dpb, + current_picture->pic_order_cnt_type != 0, TRUE, + priv->ref_frame_list_1_short_term); + + /* First sort by descending POC. */ + g_array_sort (priv->ref_frame_list_1_short_term, + (GCompareFunc) poc_desc_compare); + + /* Split at first with POC < current_picture's POC to get first element + * in [2]... */ + pos = split_ref_pic_list_b (self, priv->ref_frame_list_1_short_term, + (GCompareFunc) poc_desc_compare); + + /* and sort [1] ascending. */ + g_qsort_with_data (priv->ref_frame_list_1_short_term->data, pos, + sizeof (gpointer), (GCompareDataFunc) poc_asc_compare, NULL); + + /* 8.2.4.2.2 refFrameList0LongTerm,: + * long-term ref pictures sorted by ascending long_term_frame_idx. 
+ */ + gst_h264_dpb_get_pictures_long_term_ref (priv->dpb, + TRUE, priv->ref_frame_list_long_term); + g_array_sort (priv->ref_frame_list_long_term, + (GCompareFunc) long_term_frame_idx_asc_compare); + + /* 8.2.4.2.5 RefPicList0 */ + init_picture_refs_fields_1 (self, current_picture->field, + priv->ref_frame_list_0_short_term, priv->ref_pic_list_b0); + init_picture_refs_fields_1 (self, current_picture->field, + priv->ref_frame_list_long_term, priv->ref_pic_list_b0); + + /* 8.2.4.2.5 RefPicList1 */ + init_picture_refs_fields_1 (self, current_picture->field, + priv->ref_frame_list_1_short_term, priv->ref_pic_list_b1); + init_picture_refs_fields_1 (self, current_picture->field, + priv->ref_frame_list_long_term, priv->ref_pic_list_b1); + + /* If lists identical, swap first two entries in RefPicList1 (spec + * 8.2.4.2.5) */ + if (priv->ref_pic_list_b1->len > 1 + && lists_are_equal (priv->ref_pic_list_b0, priv->ref_pic_list_b1)) { + /* swap */ + GstH264Picture **list = (GstH264Picture **) priv->ref_pic_list_b1->data; + GstH264Picture *pic = list[0]; + list[0] = list[1]; + list[1] = pic; + } + + print_ref_pic_list_b (self, priv->ref_pic_list_b0, "ref_pic_list_b0"); + print_ref_pic_list_b (self, priv->ref_pic_list_b1, "ref_pic_list_b1"); + + /* Clear temporary lists, now pictures are owned by ref_pic_list_b0 + * and ref_pic_list_b1 */ + g_array_set_size (priv->ref_frame_list_0_short_term, 0); + g_array_set_size (priv->ref_frame_list_1_short_term, 0); + g_array_set_size (priv->ref_frame_list_long_term, 0); +} + +static void +gst_h264_decoder_prepare_ref_pic_lists (GstH264Decoder * self, + GstH264Picture * current_picture) +{ + GstH264DecoderPrivate *priv = self->priv; + gboolean construct_list = FALSE; + gint i; + GArray *dpb_array = gst_h264_dpb_get_pictures_all (priv->dpb); + + /* 8.2.4.2.1 ~ 8.2.4.2.4 + * When this process is invoked, there shall be at least one reference entry + * that is currently marked as "used for reference" + * (i.e., as "used for short-term 
reference" or "used for long-term reference") + * and is not marked as "non-existing" + */ + for (i = 0; i < dpb_array->len; i++) { + GstH264Picture *picture = g_array_index (dpb_array, GstH264Picture *, i); + if (GST_H264_PICTURE_IS_REF (picture) && !picture->nonexisting) { + construct_list = TRUE; + break; + } + } + g_array_unref (dpb_array); + + if (!construct_list) { + gst_h264_decoder_clear_ref_pic_lists (self); + return; + } + + if (GST_H264_PICTURE_IS_FRAME (current_picture)) { + construct_ref_pic_lists_p (self, current_picture); + construct_ref_pic_lists_b (self, current_picture); + } else { + construct_ref_field_pic_lists_p (self, current_picture); + construct_ref_field_pic_lists_b (self, current_picture); + } } static void @@ -2241,7 +3065,7 @@ static gint long_term_pic_num_f (GstH264Decoder * self, const GstH264Picture * picture) { - if (picture->ref && picture->long_term) + if (GST_H264_PICTURE_IS_LONG_TERM_REF (picture)) return picture->long_term_pic_num; return 2 * (self->priv->max_long_term_frame_idx + 1); } @@ -2249,7 +3073,7 @@ static gint pic_num_f (GstH264Decoder * self, const GstH264Picture * picture) { - if (!picture->long_term) + if (!GST_H264_PICTURE_IS_LONG_TERM_REF (picture)) return picture->pic_num; return self->priv->max_pic_num; } @@ -2360,7 +3184,7 @@ if (!pic) { GST_WARNING_OBJECT (self, "Malformed stream, no pic num %d", pic_num_lx); - return FALSE; + break; } shift_right_and_insert (ref_pic_listx, ref_idx_lx, num_ref_idx_lX_active_minus1, pic); @@ -2381,12 +3205,12 @@ case 2: /* (8-28) */ g_assert (num_ref_idx_lX_active_minus1 + 1 < 32); - pic = gst_h264_dpb_get_long_ref_by_pic_num (priv->dpb, + pic = gst_h264_dpb_get_long_ref_by_long_term_pic_num (priv->dpb, list_mod->value.long_term_pic_num); if (!pic) { GST_WARNING_OBJECT (self, "Malformed stream, no pic num %d", list_mod->value.long_term_pic_num); - return FALSE; + break; } shift_right_and_insert (ref_pic_listx, ref_idx_lx, num_ref_idx_lX_active_minus1, pic); @@ -2443,11 +3267,15 
@@ GstH264DecoderPrivate *priv = self->priv; GstH264SliceHdr *slice_hdr = &priv->current_slice.header; - /* fill reference picture lists for B and S/SP slices */ + g_array_set_size (priv->ref_pic_list0, 0); + g_array_set_size (priv->ref_pic_list1, 0); + if (GST_H264_IS_P_SLICE (slice_hdr) || GST_H264_IS_SP_SLICE (slice_hdr)) { + /* 8.2.4 fill reference picture list RefPicList0 for P or SP slice */ copy_pic_list_into (priv->ref_pic_list0, priv->ref_pic_list_p0); return modify_ref_pic_list (self, 0); - } else { + } else if (GST_H264_IS_B_SLICE (slice_hdr)) { + /* 8.2.4 fill reference picture list RefPicList0 and RefPicList1 for B slice */ copy_pic_list_into (priv->ref_pic_list0, priv->ref_pic_list_b0); copy_pic_list_into (priv->ref_pic_list1, priv->ref_pic_list_b1); return modify_ref_pic_list (self, 0)
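The modify_ref_pic_list() hunks above repeatedly shift list entries right and drop the located picture into the slot named by the modification command (clause 8.2.4.3). A simplified, self-contained model of that step, with plain ints standing in for GstH264Picture pointers (the function shape is illustrative, not the actual GStreamer helper):

```c
#include <assert.h>

/* Simplified model of the 8.2.4.3 "shift right and insert" step used when
 * applying ref_pic_list_modification: entries from ref_idx onward move one
 * slot to the right (bounded by num_ref_idx_lX_active_minus1 + 1 usable
 * slots) and the modified picture lands at ref_idx.  Plain ints stand in
 * for GstH264Picture pointers; this is a sketch, not the GStreamer code. */
static void
shift_right_and_insert (int *list, int len, int ref_idx,
    int num_ref_idx_active_minus1, int pic)
{
  int last = num_ref_idx_active_minus1;  /* highest writable slot */
  int i;

  if (last > len - 1)
    last = len - 1;

  /* shift everything from ref_idx onward one slot to the right */
  for (i = last; i > ref_idx; i--)
    list[i] = list[i - 1];

  /* place the picture selected by the modification command */
  list[ref_idx] = pic;
}
```

For example, inserting 99 at index 1 of {10, 20, 30, 40} with num_ref_idx_lX_active_minus1 = 3 yields {10, 99, 20, 30}; the entry shifted past the active range is discarded, which mirrors what the spec's re-indexing achieves.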
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth264decoder.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth264decoder.h
Changed
@@ -36,6 +36,49 @@ #define GST_IS_H264_DECODER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_H264_DECODER)) #define GST_H264_DECODER_CAST(obj) ((GstH264Decoder*)obj) +/** + * GstH264DecoderCompliance: + * @GST_H264_DECODER_COMPLIANCE_AUTO: The decoder behavior is + * automatically chosen. + * @GST_H264_DECODER_COMPLIANCE_STRICT: The decoder behavior strictly + * conforms to the SPEC. All decoder behaviors conform to the + * SPEC, excluding any nonstandard behavior that is not + * mentioned in the SPEC. + * @GST_H264_DECODER_COMPLIANCE_NORMAL: The decoder behavior normally + * conforms to the SPEC. Most behaviors conform to the SPEC, but + * some nonstandard features that are widely used in industry + * practice are included. This meets the requirements of + * real streams and usages, but may not 100% conform to the + * SPEC. It has very low risk. E.g., we will output pictures + * without waiting for the DPB to become full, for lower latency, which may + * cause B frame disorder when there are reference frames with + * smaller POC after it in decoding order. And the baseline profile + * may be mapped to the constrained-baseline profile, but it may + * have problems when a real baseline stream comes with FMO or + * ASO. + * @GST_H264_DECODER_COMPLIANCE_FLEXIBLE: The decoder behavior + * flexibly conforms to the SPEC. It uses the nonstandard features + * more aggressively in order to get better performance (for + * example, lower latency). It may change the result of the + * decoder and should be used carefully. Besides including all + * risks in *normal* mode, it has more risks, such as frame + * disorder when reference frames' POC decreases in decoding order. 
+ * + * Since: 1.20 + */ +typedef enum +{ + GST_H264_DECODER_COMPLIANCE_AUTO, + GST_H264_DECODER_COMPLIANCE_STRICT, + GST_H264_DECODER_COMPLIANCE_NORMAL, + GST_H264_DECODER_COMPLIANCE_FLEXIBLE +} GstH264DecoderCompliance; + +#define GST_TYPE_H264_DECODER_COMPLIANCE (gst_h264_decoder_compliance_get_type()) + +GST_CODECS_API +GType gst_h264_decoder_compliance_get_type (void); + typedef struct _GstH264Decoder GstH264Decoder; typedef struct _GstH264DecoderClass GstH264DecoderClass; typedef struct _GstH264DecoderPrivate GstH264DecoderPrivate; @@ -60,70 +103,137 @@ /** * GstH264DecoderClass: - * @new_sequence: Notifies subclass of SPS update - * @new_picture: Optional. - * Called whenever new #GstH264Picture is created. - * Subclass can set implementation specific user data - * on the #GstH264Picture via gst_h264_picture_set_user_data() - * @start_picture: Optional. - * Called per one #GstH264Picture to notify subclass to prepare - * decoding process for the #GstH264Picture - * @decode_slice: Provides per slice data with parsed slice header and - * required raw bitstream for subclass to decode it. - * if gst_h264_decoder_set_process_ref_pic_lists() is called - * with %TRUE by the subclass, @ref_pic_list0 and @ref_pic_list1 - * are non-%NULL. - * @end_picture: Optional. - * Called per one #GstH264Picture to notify subclass to finish - * decoding process for the #GstH264Picture - * @output_picture: Called with a #GstH264Picture which is required to be outputted. - * Subclass can retrieve parent #GstVideoCodecFrame by using - * gst_video_decoder_get_frame() with system_frame_number - * and the #GstVideoCodecFrame must be consumed by subclass via - * gst_video_decoder_{finish,drop,release}_frame(). + * + * The opaque #GstH264DecoderClass data structure. 
*/ struct _GstH264DecoderClass { + /*< private >*/ GstVideoDecoderClass parent_class; - gboolean (*new_sequence) (GstH264Decoder * decoder, + /** + * GstH264DecoderClass::new_sequence: + * @decoder: a #GstH264Decoder + * @sps: a #GstH264SPS + * @max_dpb_size: the size of the dpb, including the preferred output delay + * reported by the subclass via the get_preferred_output_delay method. + * + * Notifies subclass of SPS update + */ + GstFlowReturn (*new_sequence) (GstH264Decoder * decoder, const GstH264SPS * sps, gint max_dpb_size); /** - * GstH264Decoder:new_picture: + * GstH264DecoderClass::new_picture: * @decoder: a #GstH264Decoder * @frame: (transfer none): a #GstVideoCodecFrame * @picture: (transfer none): a #GstH264Picture + * + * Optional. Called whenever a new #GstH264Picture is created. + * Subclass can set implementation-specific user data + * on the #GstH264Picture via gst_h264_picture_set_user_data() */ - gboolean (*new_picture) (GstH264Decoder * decoder, + GstFlowReturn (*new_picture) (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture); - gboolean (*start_picture) (GstH264Decoder * decoder, + /** + * GstH264DecoderClass::new_field_picture: + * @decoder: a #GstH264Decoder + * @first_field: (transfer none): the first field #GstH264Picture already decoded + * @second_field: (transfer none): a #GstH264Picture for the second field + * + * Called when a new field picture is created for an interlaced field picture. + * Subclass can attach implementation-specific user data on @second_field via + * gst_h264_picture_set_user_data() + * + * Since: 1.20 + */ + GstFlowReturn (*new_field_picture) (GstH264Decoder * decoder, + const GstH264Picture * first_field, + GstH264Picture * second_field); + + /** + * GstH264DecoderClass::start_picture: + * @decoder: a #GstH264Decoder + * @picture: (transfer none): a #GstH264Picture + * @slice: (transfer none): a #GstH264Slice + * @dpb: (transfer none): a #GstH264Dpb + * + * Optional. 
Called per one #GstH264Picture to notify subclass to prepare + * decoding process for the #GstH264Picture + */ + GstFlowReturn (*start_picture) (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb); - gboolean (*decode_slice) (GstH264Decoder * decoder, + /** + * GstH264DecoderClass::decode_slice: + * @decoder: a #GstH264Decoder + * @picture: (transfer none): a #GstH264Picture + * @slice: (transfer none): a #GstH264Slice + * @ref_pic_list0: (element-type GstH264Picture) (transfer none): + * an array of #GstH264Picture pointers + * @ref_pic_list1: (element-type GstH264Picture) (transfer none): + * an array of #GstH264Picture pointers + * + * Provides per slice data with parsed slice header and required raw bitstream + * for subclass to decode it. If gst_h264_decoder_set_process_ref_pic_lists() + * is called with %TRUE by the subclass, @ref_pic_list0 and @ref_pic_list1 + * are non-%NULL. + * In the case of an interlaced stream, @ref_pic_list0 and @ref_pic_list1 will + * contain only the first field of a complementary reference field pair + * if the picture currently being decoded is a frame picture. Subclasses might + * need to retrieve the other field (i.e., the second field) of the picture. + */ + GstFlowReturn (*decode_slice) (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, GArray * ref_pic_list1); - gboolean (*end_picture) (GstH264Decoder * decoder, + /** + * GstH264DecoderClass::end_picture: + * @decoder: a #GstH264Decoder + * @picture: (transfer none): a #GstH264Picture + * + * Optional. 
Called per one #GstH264Picture to notify subclass to finish + * decoding process for the #GstH264Picture + */ + GstFlowReturn (*end_picture) (GstH264Decoder * decoder, GstH264Picture * picture); /** - * GstH264Decoder:output_picture: + * GstH264DecoderClass::output_picture: * @decoder: a #GstH264Decoder * @frame: (transfer full): a #GstVideoCodecFrame * @picture: (transfer full): a #GstH264Picture + * + * Called with a #GstH264Picture which is required to be outputted. + * The #GstVideoCodecFrame must be consumed by the subclass. */ GstFlowReturn (*output_picture) (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture); + /** + * GstH264DecoderClass::get_preferred_output_delay: + * @decoder: a #GstH264Decoder + * @live: whether upstream is live or not + * + * Optional. Called by the base class to query whether delaying output is + * preferred by the subclass or not. + * + * Returns: the number of preferred delayed output frames + * + * Since: 1.20 + */ + guint (*get_preferred_output_delay) (GstH264Decoder * decoder, + gboolean live); + /*< private >*/ gpointer padding[GST_PADDING_LARGE]; };
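The new @max_dpb_size argument of ::new_sequence folds the subclass's preferred output delay into the DPB size derived from the level limits (H.264 Annex A, A.3.1: MaxDpbFrames = Min(MaxDpbMbs / (PicWidthInMbs * FrameHeightInMbs), 16)). A standalone sketch of the frame-based part of that calculation, with a few MaxDpbMbs values taken from Table A-1 (the helper name is hypothetical, not GStreamer API):

```c
#include <assert.h>

/* MaxDpbFrames = min(MaxDpbMbs / (PicWidthInMbs * FrameHeightInMbs), 16),
 * per H.264 A.3.1.  The MaxDpbMbs values below come from Table A-1. */
static int
max_dpb_frames (int level_idc, int pic_width_in_mbs, int frame_height_in_mbs)
{
  int max_dpb_mbs;
  int frames;

  switch (level_idc) {
    case 30: max_dpb_mbs = 8100; break;    /* level 3.0 */
    case 31: max_dpb_mbs = 18000; break;   /* level 3.1 */
    case 40: max_dpb_mbs = 32768; break;   /* level 4.0 */
    case 51: max_dpb_mbs = 184320; break;  /* level 5.1 */
    default: return 16;                    /* unknown level: use the cap */
  }

  frames = max_dpb_mbs / (pic_width_in_mbs * frame_height_in_mbs);
  return frames < 16 ? frames : 16;
}
```

For 1920x1088 (120x68 macroblocks) at level 4.0 this gives 4 frames; per the @max_dpb_size description above, the base class would then add the value returned by ::get_preferred_output_delay() on top before reporting max_dpb_size to the subclass.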
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth264picture.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth264picture.c
Changed
@@ -22,6 +22,7 @@ #endif #include "gsth264picture.h" +#include <stdlib.h> GST_DEBUG_CATEGORY_EXTERN (gst_h264_decoder_debug); #define GST_CAT_DEFAULT gst_h264_decoder_debug @@ -51,7 +52,6 @@ pic = g_new0 (GstH264Picture, 1); - pic->pts = GST_CLOCK_TIME_NONE; pic->top_field_order_cnt = G_MAXINT32; pic->bottom_field_order_cnt = G_MAXINT32; pic->field = GST_H264_PICTURE_FIELD_FRAME; @@ -106,9 +106,23 @@ struct _GstH264Dpb { GArray *pic_list; - gint max_num_pics; + gint max_num_frames; + gint num_output_needed; + guint32 max_num_reorder_frames; + gint32 last_output_poc; + gboolean last_output_non_ref; + + gboolean interlaced; }; +static void +gst_h264_dpb_init (GstH264Dpb * dpb) +{ + dpb->num_output_needed = 0; + dpb->last_output_poc = G_MININT32; + dpb->last_output_non_ref = FALSE; +} + /** * gst_h264_dpb_new: (skip) * @@ -122,6 +136,7 @@ GstH264Dpb *dpb; dpb = g_new0 (GstH264Dpb, 1); + gst_h264_dpb_init (dpb); dpb->pic_list = g_array_sized_new (FALSE, TRUE, sizeof (GstH264Picture *), @@ -133,32 +148,67 @@ } /** - * gst_h264_dpb_set_max_num_pics: + * gst_h264_dpb_set_max_num_frames: * @dpb: a #GstH264Dpb - * @max_num_pics: the maximum number of picture + * @max_num_frames: the maximum number of picture + * + * Set the number of maximum allowed frames to store * - * Set the number of maximum allowed pictures to store + * Since: 1.20 */ void -gst_h264_dpb_set_max_num_pics (GstH264Dpb * dpb, gint max_num_pics) +gst_h264_dpb_set_max_num_frames (GstH264Dpb * dpb, gint max_num_frames) { g_return_if_fail (dpb != NULL); - dpb->max_num_pics = max_num_pics; + dpb->max_num_frames = max_num_frames; } /** - * gst_h264_dpb_get_max_num_pics: + * gst_h264_dpb_get_max_num_frames: * @dpb: a #GstH264Dpb * - * Returns: the number of maximum pictures + * Returns: the number of maximum frames + * + * Since: 1.20 */ gint -gst_h264_dpb_get_max_num_pics (GstH264Dpb * dpb) +gst_h264_dpb_get_max_num_frames (GstH264Dpb * dpb) { g_return_val_if_fail (dpb != NULL, 0); - return dpb->max_num_pics; + 
return dpb->max_num_frames; +} + +/** + * gst_h264_dpb_set_interlaced: + * @dpb: a #GstH264Dpb + * @interlaced: %TRUE if interlaced + * + * Since: 1.20 + */ +void +gst_h264_dpb_set_interlaced (GstH264Dpb * dpb, gboolean interlaced) +{ + g_return_if_fail (dpb != NULL); + + dpb->interlaced = interlaced; +} + +/** + * gst_h264_dpb_get_interlaced: + * @dpb: a #GstH264Dpb + * + * Returns: %TRUE if @dpb is configured for interlaced stream + * + * Since: 1.20 + */ +gboolean +gst_h264_dpb_get_interlaced (GstH264Dpb * dpb) +{ + g_return_val_if_fail (dpb != NULL, FALSE); + + return dpb->interlaced; } /** @@ -189,87 +239,86 @@ g_return_if_fail (dpb != NULL); g_array_set_size (dpb->pic_list, 0); + gst_h264_dpb_init (dpb); } /** - * gst_h264_dpb_add: + * gst_h264_dpb_set_max_num_reorder_frames: * @dpb: a #GstH264Dpb - * @picture: (transfer full): a #GstH264Picture + * @max_num_reorder_frames: the max number of reorder frames, which + * should not exceed the max size of DPB. * - * Store the @picture + * Since: 1.20 */ void -gst_h264_dpb_add (GstH264Dpb * dpb, GstH264Picture * picture) +gst_h264_dpb_set_max_num_reorder_frames (GstH264Dpb * dpb, + guint32 max_num_reorder_frames) { g_return_if_fail (dpb != NULL); - g_return_if_fail (GST_IS_H264_PICTURE (picture)); + g_return_if_fail (max_num_reorder_frames <= dpb->max_num_frames); - g_array_append_val (dpb->pic_list, picture); + dpb->max_num_reorder_frames = max_num_reorder_frames; } /** - * gst_h264_dpb_delete_unused: + * gst_h264_dpb_add: * @dpb: a #GstH264Dpb + * @picture: (transfer full): a #GstH264Picture * - * Delete already outputted and not referenced all pictures from dpb + * Store the @picture */ void -gst_h264_dpb_delete_unused (GstH264Dpb * dpb) +gst_h264_dpb_add (GstH264Dpb * dpb, GstH264Picture * picture) { - gint i; - g_return_if_fail (dpb != NULL); + g_return_if_fail (GST_IS_H264_PICTURE (picture)); - for (i = 0; i < dpb->pic_list->len; i++) { - GstH264Picture *picture = - g_array_index (dpb->pic_list, 
GstH264Picture *, i); - - if (picture->outputted && !picture->ref) { - GST_TRACE ("remove picture %p (frame num %d) from dpb", - picture, picture->frame_num); - g_array_remove_index_fast (dpb->pic_list, i); - i--; + /* C.4.2 Decoding of gaps in frame_num and storage of "non-existing" pictures + * + * The "non-existing" frame is stored in an empty frame buffer and is marked + * as "not needed for output", and the DPB fullness is incremented by one */ + if (!picture->nonexisting) { + picture->needed_for_output = TRUE; + + if (GST_H264_PICTURE_IS_FRAME (picture)) { + dpb->num_output_needed++; + } else { + /* We can do output only when field pair are complete */ + if (picture->second_field) { + dpb->num_output_needed++; + } } + } else { + picture->needed_for_output = FALSE; } -} -/** - * gst_h264_dpb_delete_outputed: - * @dpb: a #GstH264Dpb - * - * Delete already outputted picture, even if they are referenced. - * - * Since: 1.18 - */ -void -gst_h264_dpb_delete_outputed (GstH264Dpb * dpb) -{ - gint i; + /* Link each field */ + if (picture->second_field && picture->other_field) { + picture->other_field->other_field = picture; + } - g_return_if_fail (dpb != NULL); + g_array_append_val (dpb->pic_list, picture); - for (i = 0; i < dpb->pic_list->len; i++) { - GstH264Picture *picture = - g_array_index (dpb->pic_list, GstH264Picture *, i); + if (dpb->pic_list->len > dpb->max_num_frames * (dpb->interlaced + 1)) + GST_ERROR ("DPB size is %d, exceed the max size %d", + dpb->pic_list->len, dpb->max_num_frames * (dpb->interlaced + 1)); - if (picture->outputted) { - GST_TRACE ("remove picture %p (frame num %d) from dpb", - picture, picture->frame_num); - g_array_remove_index_fast (dpb->pic_list, i); - i--; - } + /* The IDR frame or mem_mgmt_5 */ + if (picture->pic_order_cnt == 0) { + GST_TRACE ("last_output_poc reset because of IDR or mem_mgmt_5"); + dpb->last_output_poc = G_MININT32; + dpb->last_output_non_ref = FALSE; } } /** - * gst_h264_dpb_delete_by_poc: + * 
gst_h264_dpb_delete_unused: * @dpb: a #GstH264Dpb - * @poc: a poc of #GstH264Picture to remove * - * Delete a #GstH264Dpb by @poc + * Delete all pictures from the dpb that are already outputted and not referenced */ void -gst_h264_dpb_delete_by_poc (GstH264Dpb * dpb, gint poc) +gst_h264_dpb_delete_unused (GstH264Dpb * dpb) { gint i; @@ -279,26 +328,28 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if (picture->pic_order_cnt == poc) { - GST_TRACE ("remove picture %p for poc %d (frame num %d) from dpb", - picture, poc, picture->frame_num); - - g_array_remove_index_fast (dpb->pic_list, i); - return; + /* NOTE: don't use g_array_remove_index_fast here since the last picture + * needs to be referenced for the bumping decision */ + if (!picture->needed_for_output && !GST_H264_PICTURE_IS_REF (picture)) { + GST_TRACE + ("remove picture %p (frame num: %d, poc: %d, field: %d) from dpb", + picture, picture->frame_num, picture->pic_order_cnt, picture->field); + g_array_remove_index (dpb->pic_list, i); + i--; + } } - - GST_WARNING ("Couldn't find picture with poc %d", poc); } /** - * gst_h264_dpb_num_ref_pictures: + * gst_h264_dpb_num_ref_frames: * @dpb: a #GstH264Dpb * - * Returns: The number of referenced pictures + * Returns: The number of referenced frames + * + * Since: 1.20 */ gint -gst_h264_dpb_num_ref_pictures (GstH264Dpb * dpb) +gst_h264_dpb_num_ref_frames (GstH264Dpb * dpb) { gint i; gint ret = 0; @@ -309,7 +360,11 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if (picture->ref) + /* Count frames, not field pictures */ + if (picture->second_field) + continue; + + if (GST_H264_PICTURE_IS_REF (picture)) ret++; } @@ -333,7 +388,7 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - picture->ref = FALSE; + gst_h264_picture_set_reference (picture, GST_H264_PICTURE_REF_NONE, FALSE); } } @@ -357,7 +412,8 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if 
(picture->ref && !picture->long_term && picture->pic_num == pic_num) + if (GST_H264_PICTURE_IS_SHORT_TERM_REF (picture) + && picture->pic_num == pic_num) return picture; } @@ -367,16 +423,19 @@ } /** - * gst_h264_dpb_get_long_ref_by_pic_num: + * gst_h264_dpb_get_long_ref_by_long_term_pic_num: * @dpb: a #GstH264Dpb - * @pic_num: a picture number + * @long_term_pic_num: a long term picture number * - * Find a long term reference picture which has matching picture number + * Find a long term reference picture which has matching long term picture number * * Returns: (nullable) (transfer none): a #GstH264Picture + * + * Since: 1.20 */ GstH264Picture * -gst_h264_dpb_get_long_ref_by_pic_num (GstH264Dpb * dpb, gint pic_num) +gst_h264_dpb_get_long_ref_by_long_term_pic_num (GstH264Dpb * dpb, + gint long_term_pic_num) { gint i; @@ -386,11 +445,12 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if (picture->ref && picture->long_term && picture->pic_num == pic_num) + if (GST_H264_PICTURE_IS_LONG_TERM_REF (picture) && + picture->long_term_pic_num == long_term_pic_num) return picture; } - GST_WARNING ("No long term reference picture for %d", pic_num); + GST_WARNING ("No long term reference picture for %d", long_term_pic_num); return NULL; } @@ -415,7 +475,7 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if (picture->ref && !picture->long_term && + if (GST_H264_PICTURE_IS_SHORT_TERM_REF (picture) && (!ret || picture->frame_num_wrap < ret->frame_num_wrap)) ret = picture; } @@ -427,43 +487,21 @@ } /** - * gst_h264_dpb_get_pictures_not_outputted: - * @dpb: a #GstH264Dpb - * @out: (out) (element-type GstH264Picture) (transfer full): an array - * of #GstH264Picture pointer - * - * Retrieve all not-outputted pictures from @dpb - */ -void -gst_h264_dpb_get_pictures_not_outputted (GstH264Dpb * dpb, GArray * out) -{ - gint i; - - g_return_if_fail (dpb != NULL); - g_return_if_fail (out != NULL); - - for (i = 0; i < 
dpb->pic_list->len; i++) { - GstH264Picture *picture = - g_array_index (dpb->pic_list, GstH264Picture *, i); - - if (!picture->outputted) { - gst_h264_picture_ref (picture); - g_array_append_val (out, picture); - } - } -} - -/** * gst_h264_dpb_get_pictures_short_term_ref: * @dpb: a #GstH264Dpb + * @include_non_existing: %TRUE if non-existing pictures need to be included + * @include_second_field: %TRUE if the second field pictures need to be included * @out: (out) (element-type GstH264Picture) (transfer full): an array * of #GstH264Picture pointers * * Retrieve all short-term reference pictures from @dpb. The picture will be * appended to the array. + * + * Since: 1.20 */ void -gst_h264_dpb_get_pictures_short_term_ref (GstH264Dpb * dpb, GArray * out) +gst_h264_dpb_get_pictures_short_term_ref (GstH264Dpb * dpb, + gboolean include_non_existing, gboolean include_second_field, GArray * out) { gint i; @@ -474,7 +512,12 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if (picture->ref && !picture->long_term) { + if (!include_second_field && picture->second_field) + continue; + + if (GST_H264_PICTURE_IS_SHORT_TERM_REF (picture) && + (include_non_existing || (!include_non_existing && + !picture->nonexisting))) { gst_h264_picture_ref (picture); g_array_append_val (out, picture); } @@ -484,14 +527,18 @@ /** * gst_h264_dpb_get_pictures_long_term_ref: * @dpb: a #GstH264Dpb - * @out: (out) (element-type GstH264Picture) (transfer full): an arrat + * @include_second_field: %TRUE if the second field pictures need to be included + * @out: (out) (element-type GstH264Picture) (transfer full): an array * of #GstH264Picture pointer * * Retrieve all long-term reference pictures from @dpb. The picture will be * appended to the array. 
+ * + * Since: 1.20 */ void -gst_h264_dpb_get_pictures_long_term_ref (GstH264Dpb * dpb, GArray * out) +gst_h264_dpb_get_pictures_long_term_ref (GstH264Dpb * dpb, + gboolean include_second_field, GArray * out) { gint i; @@ -502,7 +549,10 @@ GstH264Picture *picture = g_array_index (dpb->pic_list, GstH264Picture *, i); - if (picture->ref && picture->long_term) { + if (!include_second_field && picture->second_field) + continue; + + if (GST_H264_PICTURE_IS_LONG_TERM_REF (picture)) { gst_h264_picture_ref (picture); g_array_append_val (out, picture); } @@ -539,20 +589,6 @@ } /** - * gst_h264_dpb_is_full: - * @dpb: a #GstH264Dpb - * - * Return: %TRUE if @dpb is full - */ -gboolean -gst_h264_dpb_is_full (GstH264Dpb * dpb) -{ - g_return_val_if_fail (dpb != NULL, -1); - - return dpb->pic_list->len >= dpb->max_num_pics; -} - -/** * gst_h264_dpb_get_picture: * @dpb: a #GstH264Dpb * @system_frame_number The system frame number @@ -582,3 +618,603 @@ return NULL; } + +/** + * gst_h264_dpb_has_empty_frame_buffer: + * @dpb: a #GstH264Dpb + * + * Returns: %TRUE if @dpb still has empty frame buffers. 
+ * + * Since: 1.20 + */ +gboolean +gst_h264_dpb_has_empty_frame_buffer (GstH264Dpb * dpb) +{ + if (!dpb->interlaced) { + if (dpb->pic_list->len < dpb->max_num_frames) + return TRUE; + } else { + gint i; + gint count = 0; + /* Count the number of complementary field pairs */ + for (i = 0; i < dpb->pic_list->len; i++) { + GstH264Picture *picture = + g_array_index (dpb->pic_list, GstH264Picture *, i); + + if (picture->second_field) + continue; + + if (GST_H264_PICTURE_IS_FRAME (picture) || picture->other_field) + count++; + } + + if (count < dpb->max_num_frames) + return TRUE; + } + + return FALSE; +} + +static gint +gst_h264_dpb_get_lowest_output_needed_picture (GstH264Dpb * dpb, + GstH264Picture ** picture) +{ + gint i; + GstH264Picture *lowest = NULL; + gint index = -1; + + *picture = NULL; + + for (i = 0; i < dpb->pic_list->len; i++) { + GstH264Picture *picture = + g_array_index (dpb->pic_list, GstH264Picture *, i); + + if (!picture->needed_for_output) + continue; + + if (!GST_H264_PICTURE_IS_FRAME (picture) && + (!picture->other_field || picture->second_field)) + continue; + + if (!lowest) { + lowest = picture; + index = i; + continue; + } + + if (picture->pic_order_cnt < lowest->pic_order_cnt) { + lowest = picture; + index = i; + } + } + + if (lowest) + *picture = gst_h264_picture_ref (lowest); + + return index; +} + +/** + * gst_h264_dpb_needs_bump: + * @dpb: a #GstH264Dpb + * @to_insert: the current #GstH264Picture to insert to dpb. + * @latency_mode: The required #GstH264DpbBumpMode for bumping. 
+ * + * Returns: %TRUE if bumping is required + * + * Since: 1.20 + */ +gboolean +gst_h264_dpb_needs_bump (GstH264Dpb * dpb, GstH264Picture * to_insert, + GstH264DpbBumpMode latency_mode) +{ + GstH264Picture *picture = NULL; + gint32 lowest_poc; + gboolean is_ref_picture; + gint lowest_index; + + g_return_val_if_fail (dpb != NULL, FALSE); + g_assert (dpb->num_output_needed >= 0); + + lowest_poc = G_MAXINT32; + is_ref_picture = FALSE; + lowest_index = gst_h264_dpb_get_lowest_output_needed_picture (dpb, &picture); + if (lowest_index >= 0) { + lowest_poc = picture->pic_order_cnt; + is_ref_picture = picture->ref_pic; + gst_h264_picture_unref (picture); + } else { + goto normal_bump; + } + + if (latency_mode >= GST_H264_DPB_BUMP_LOW_LATENCY) { + /* In low-latency mode, we should not wait for the DPB to become full. + We try to bump the picture as soon as possible without causing + frame disorder. The policies below range from safe to somewhat risky. */ + + /* Do not support interlaced mode. */ + if (gst_h264_dpb_get_interlaced (dpb)) + goto normal_bump; + + /* Equal to normal bump. */ + if (!gst_h264_dpb_has_empty_frame_buffer (dpb)) + goto normal_bump; + + /* 7.4.1.2.2: The values of picture order count for the coded pictures + in consecutive access units in decoding order containing non-reference + pictures shall be non-decreasing. Safe. */ + if (dpb->last_output_non_ref && !is_ref_picture) { + g_assert (dpb->last_output_poc < G_MAXINT32); + GST_TRACE ("Continuous non-reference frame poc: %d -> %d," + " bumping for low-latency.", dpb->last_output_poc, lowest_poc); + return TRUE; + } + + /* num_reorder_frames indicates the maximum number of frames that + precede any frame in the coded video sequence in decoding order + and follow it in output order. Safe. 
*/ + if (lowest_index >= dpb->max_num_reorder_frames) { + guint i, need_output; + + need_output = 0; + for (i = 0; i < lowest_index; i++) { + GstH264Picture *p = g_array_index (dpb->pic_list, GstH264Picture *, i); + if (p->needed_for_output) + need_output++; + } + + if (need_output >= dpb->max_num_reorder_frames) { + GST_TRACE ("frame with lowest poc %d has %d precede frame, already" + " satisfy num_reorder_frames %d, bumping for low-latency.", + dpb->last_output_poc, lowest_index, dpb->max_num_reorder_frames); + return TRUE; + } + } + + /* Bump the leading picture with a negative POC once a positive POC has + been found; it is impossible to insert another negative POC after the + positive POCs. Almost safe. */ + if (to_insert && to_insert->pic_order_cnt > 0 && lowest_poc < 0) { + GST_TRACE ("The negative poc %d, bumping for low-latency.", lowest_poc); + return TRUE; + } + + /* There may be leading frames with negative POC following the IDR + frame in decoding order, so when an IDR comes, we need to check the + following pictures. In most cases, leading pictures are in increasing + POC order. Bump and should be safe. */ + if (lowest_poc == 0 && gst_h264_dpb_get_size (dpb) <= 1) { + if (to_insert && to_insert->pic_order_cnt > lowest_poc) { + GST_TRACE ("The IDR or mem_mgmt_5 frame, bumping for low-latency."); + return TRUE; + } + + GST_TRACE ("The IDR or mem_mgmt_5 frame is not the first frame."); + goto normal_bump; + } + + /* When a non-ref frame has the lowest POC, it is unlikely that another + ref frame with a very small POC will be inserted. Bump and should be safe. */ + if (!is_ref_picture) { + GST_TRACE ("non ref with lowest-poc: %d bumping for low-latency", + lowest_poc); + return TRUE; + } + + /* When inserting a non-ref frame with a bigger POC, it is unlikely that + another ref frame with a very small POC will be inserted. Bump and should be safe. 
*/ + if (to_insert && !to_insert->ref_pic + && lowest_poc < to_insert->pic_order_cnt) { + GST_TRACE ("lowest-poc: %d < to insert non ref pic: %d, bumping " + "for low-latency", lowest_poc, to_insert->pic_order_cnt); + return TRUE; + } + + if (latency_mode >= GST_H264_DPB_BUMP_VERY_LOW_LATENCY) { + /* PicOrderCnt increments by <= 2. Not all streams meet this, but in + practice this condition can be used. + For streams with a 2 poc increment like: + 0(IDR), 2(P), 4(P), 6(P), 12(P), 8(B), 10(B).... + This can work well, but for streams with a 1 poc increment like: + 0(IDR), 2(P), 4(P), 1(B), 3(B) ... + This can cause picture disorder. Most streams in practice have the + 2 poc increment, but this may be risky, so be careful. */ + if (lowest_poc > dpb->last_output_poc + && lowest_poc - dpb->last_output_poc <= 2) { + GST_TRACE ("lowest-poc: %d, last-output-poc: %d, diff <= 2, " + "bumping for very-low-latency", lowest_poc, dpb->last_output_poc); + return TRUE; + } + } + } + +normal_bump: + /* C.4.5.3: The "bumping" process is invoked in the following cases. + - There is no empty frame buffer and an empty frame buffer is needed + for storage of an inferred "non-existing" frame. + - There is no empty frame buffer and an empty frame buffer is needed + for storage of a decoded (non-IDR) reference picture. + - There is no empty frame buffer and the current picture is a non- + reference picture that is not the second field of a complementary + non-reference field pair and there are pictures in the DPB that are + marked as "needed for output" that precede the current non-reference + picture in output order. 
*/ + if (gst_h264_dpb_has_empty_frame_buffer (dpb)) { + GST_TRACE ("DPB has empty frame buffer, no need bumping."); + return FALSE; + } + + if (to_insert && to_insert->ref_pic) { + GST_TRACE ("No empty frame buffer for ref frame, need bumping."); + return TRUE; + } + + if (to_insert && to_insert->pic_order_cnt > lowest_poc) { + GST_TRACE ("No empty frame buffer, lowest poc %d < current poc %d," + " need bumping.", lowest_poc, to_insert->pic_order_cnt); + return TRUE; + } + + if (to_insert) { + GST_TRACE ("No empty frame buffer, but lowest poc %d > current poc %d," + " no need bumping.", lowest_poc, to_insert->pic_order_cnt); + } + + return FALSE; +} + +/** + * gst_h264_dpb_bump: + * @dpb: a #GstH264Dpb + * @drain: whether draining or not + * + * Perform the bumping process as defined in the C.4.5.3 "Bumping" process. + * If @drain is %TRUE, @dpb will remove a #GstH264Picture from the internal array + * so that the returned #GstH264Picture can hold the last reference to it + * + * Returns: (nullable) (transfer full): a #GstH264Picture which is needed to be + * outputted + * + * Since: 1.20 + */ +GstH264Picture * +gst_h264_dpb_bump (GstH264Dpb * dpb, gboolean drain) +{ + GstH264Picture *picture; + GstH264Picture *other_picture; + gint i; + gint index; + + g_return_val_if_fail (dpb != NULL, NULL); + + index = gst_h264_dpb_get_lowest_output_needed_picture (dpb, &picture); + + if (!picture || index < 0) + return NULL; + + picture->needed_for_output = FALSE; + + dpb->num_output_needed--; + g_assert (dpb->num_output_needed >= 0); + + /* NOTE: don't use g_array_remove_index_fast here since the last picture + * needs to be referenced for the bumping decision */ + if (!GST_H264_PICTURE_IS_REF (picture) || drain) + g_array_remove_index (dpb->pic_list, index); + + other_picture = picture->other_field; + if (other_picture) { + other_picture->needed_for_output = FALSE; + + /* At this moment, this picture should be interlaced */ + picture->buffer_flags |= GST_VIDEO_BUFFER_FLAG_INTERLACED; + + /* 
FIXME: need to check picture timing SEI for the case where top/bottom poc + * are identical */ + if (picture->pic_order_cnt < other_picture->pic_order_cnt) + picture->buffer_flags |= GST_VIDEO_BUFFER_FLAG_TFF; + + if (!other_picture->ref) { + for (i = 0; i < dpb->pic_list->len; i++) { + GstH264Picture *tmp = + g_array_index (dpb->pic_list, GstH264Picture *, i); + + if (tmp == other_picture) { + g_array_remove_index (dpb->pic_list, i); + break; + } + } + } + /* Now the other field may or may not exist */ + } + + dpb->last_output_poc = picture->pic_order_cnt; + dpb->last_output_non_ref = !picture->ref_pic; + + return picture; +} + +/** + * gst_h264_dpb_set_last_output: + * @dpb: a #GstH264Dpb + * @picture: a #GstH264Picture of the last output. + * + * Notify the DPB that @picture is output directly without being stored + * in the DPB. + * + * Since: 1.20 + */ +void +gst_h264_dpb_set_last_output (GstH264Dpb * dpb, GstH264Picture * picture) +{ + g_return_if_fail (dpb != NULL); + g_return_if_fail (GST_IS_H264_PICTURE (picture)); + + dpb->last_output_poc = picture->pic_order_cnt; + dpb->last_output_non_ref = !picture->ref_pic; +} + +static gint +get_picNumX (GstH264Picture * picture, GstH264RefPicMarking * ref_pic_marking) +{ + return picture->pic_num - + (ref_pic_marking->difference_of_pic_nums_minus1 + 1); +} + +/** + * gst_h264_dpb_perform_memory_management_control_operation: + * @dpb: a #GstH264Dpb + * @ref_pic_marking: a #GstH264RefPicMarking + * @picture: a #GstH264Picture + * + * Perform the "8.2.5.4 Adaptive memory control decoded reference picture marking process" + * + * Returns: %TRUE if successful + * + * Since: 1.20 + */ +gboolean +gst_h264_dpb_perform_memory_management_control_operation (GstH264Dpb * dpb, + GstH264RefPicMarking * ref_pic_marking, GstH264Picture * picture) +{ + guint8 type; + gint pic_num_x; + gint max_long_term_frame_idx; + GstH264Picture *other; + gint i; + + g_return_val_if_fail (dpb != NULL, FALSE); + g_return_val_if_fail (ref_pic_marking != NULL, 
FALSE); + g_return_val_if_fail (picture != NULL, FALSE); + + type = ref_pic_marking->memory_management_control_operation; + + switch (type) { + case 0: + /* Normal end of operations' specification */ + break; + case 1: + /* 8.2.5.4.1 Mark a short term reference picture as unused so it can be + * removed if outputted */ + pic_num_x = get_picNumX (picture, ref_pic_marking); + other = gst_h264_dpb_get_short_ref_by_pic_num (dpb, pic_num_x); + if (other) { + gst_h264_picture_set_reference (other, + GST_H264_PICTURE_REF_NONE, GST_H264_PICTURE_IS_FRAME (picture)); + GST_TRACE ("MMCO-1: unmark short-term ref picture %p, (poc %d)", + other, other->pic_order_cnt); + } else { + GST_WARNING ("Invalid picNumX %d for operation type 1", pic_num_x); + return FALSE; + } + break; + case 2: + /* 8.2.5.4.2 Mark a long term reference picture as unused so it can be + * removed if outputted */ + other = gst_h264_dpb_get_long_ref_by_long_term_pic_num (dpb, + ref_pic_marking->long_term_pic_num); + if (other) { + gst_h264_picture_set_reference (other, + GST_H264_PICTURE_REF_NONE, FALSE); + GST_TRACE ("MMCO-2: unmark long-term ref picture %p, (poc %d)", + other, other->pic_order_cnt); + } else { + GST_WARNING ("Invalid LongTermPicNum %d for operation type 2", + ref_pic_marking->long_term_pic_num); + return FALSE; + } + break; + case 3: + /* 8.2.5.4.3 Mark a short term reference picture as long term reference */ + + pic_num_x = get_picNumX (picture, ref_pic_marking); + + other = gst_h264_dpb_get_short_ref_by_pic_num (dpb, pic_num_x); + if (!other) { + GST_WARNING ("Invalid picNumX %d for operation type 3", pic_num_x); + return FALSE; + } + + /* If we have long-term ref picture for LongTermFrameIdx, + * mark the picture as non-reference */ + for (i = 0; i < dpb->pic_list->len; i++) { + GstH264Picture *tmp = + g_array_index (dpb->pic_list, GstH264Picture *, i); + + if (GST_H264_PICTURE_IS_LONG_TERM_REF (tmp) + && tmp->long_term_frame_idx == ref_pic_marking->long_term_frame_idx) { + if 
(GST_H264_PICTURE_IS_FRAME (tmp)) { + /* When long_term_frame_idx is already assigned to a long-term + * reference frame, that frame is marked as "unused for reference" + */ + gst_h264_picture_set_reference (tmp, + GST_H264_PICTURE_REF_NONE, TRUE); + GST_TRACE ("MMCO-3: unmark old long-term frame %p (poc %d)", + tmp, tmp->pic_order_cnt); + } else if (tmp->other_field && + GST_H264_PICTURE_IS_LONG_TERM_REF (tmp->other_field) && + tmp->other_field->long_term_frame_idx == + ref_pic_marking->long_term_frame_idx) { + /* When long_term_frame_idx is already assigned to a long-term + * reference field pair, that complementary field pair and both of + * its fields are marked as "unused for reference" + */ + gst_h264_picture_set_reference (tmp, + GST_H264_PICTURE_REF_NONE, TRUE); + GST_TRACE ("MMCO-3: unmark old long-term field-pair %p (poc %d)", + tmp, tmp->pic_order_cnt); + } else { + /* When long_term_frame_idx is already assigned to a reference field, + * and that reference field is not part of a complementary field + * pair that includes the picture specified by picNumX, + * that field is marked as "unused for reference" + */ + + /* Check "tmp" (a long-term ref pic) is part of + * "other" (a picture to be updated from short-term to long-term) + * complementary field pair */ + + /* NOTE: "other" here is short-ref, so "other" and "tmp" must not be + * identical picture */ + if (!tmp->other_field) { + gst_h264_picture_set_reference (tmp, + GST_H264_PICTURE_REF_NONE, FALSE); + GST_TRACE ("MMCO-3: unmark old long-term field %p (poc %d)", + tmp, tmp->pic_order_cnt); + } else if (tmp->other_field != other && + (!other->other_field || other->other_field != tmp)) { + gst_h264_picture_set_reference (tmp, + GST_H264_PICTURE_REF_NONE, FALSE); + GST_TRACE ("MMCO-3: unmark old long-term field %p (poc %d)", + tmp, tmp->pic_order_cnt); + } + } + break; + } + } + + gst_h264_picture_set_reference (other, + GST_H264_PICTURE_REF_LONG_TERM, GST_H264_PICTURE_IS_FRAME (picture)); + 
other->long_term_frame_idx = ref_pic_marking->long_term_frame_idx; + + GST_TRACE ("MMCO-3: mark long-term ref pic %p, index %d, (poc %d)", + other, other->long_term_frame_idx, other->pic_order_cnt); + + if (other->other_field && + GST_H264_PICTURE_IS_LONG_TERM_REF (other->other_field)) { + other->other_field->long_term_frame_idx = + ref_pic_marking->long_term_frame_idx; + } + break; + case 4: + /* 8.2.5.4.4 All pictures for which LongTermFrameIdx is greater than + * max_long_term_frame_idx_plus1 − 1 and that are marked as + * "used for long-term reference" are marked as "unused for reference */ + max_long_term_frame_idx = + ref_pic_marking->max_long_term_frame_idx_plus1 - 1; + + GST_TRACE ("MMCO-4: max_long_term_frame_idx %d", max_long_term_frame_idx); + + for (i = 0; i < dpb->pic_list->len; i++) { + other = g_array_index (dpb->pic_list, GstH264Picture *, i); + + if (GST_H264_PICTURE_IS_LONG_TERM_REF (other) && + other->long_term_frame_idx > max_long_term_frame_idx) { + gst_h264_picture_set_reference (other, + GST_H264_PICTURE_REF_NONE, FALSE); + GST_TRACE ("MMCO-4: unmark long-term ref pic %p, index %d, (poc %d)", + other, other->long_term_frame_idx, other->pic_order_cnt); + } + } + break; + case 5: + /* 8.2.5.4.5 Unmark all reference pictures */ + for (i = 0; i < dpb->pic_list->len; i++) { + other = g_array_index (dpb->pic_list, GstH264Picture *, i); + gst_h264_picture_set_reference (other, + GST_H264_PICTURE_REF_NONE, FALSE); + } + picture->mem_mgmt_5 = TRUE; + picture->frame_num = 0; + /* When the current picture includes a memory management control operation + equal to 5, after the decoding of the current picture, tempPicOrderCnt + is set equal to PicOrderCnt( CurrPic ), TopFieldOrderCnt of the current + picture (if any) is set equal to TopFieldOrderCnt - tempPicOrderCnt, + and BottomFieldOrderCnt of the current picture (if any) is set equal to + BottomFieldOrderCnt - tempPicOrderCnt. 
*/ + if (picture->field == GST_H264_PICTURE_FIELD_TOP_FIELD) { + picture->top_field_order_cnt = picture->pic_order_cnt = 0; + } else if (picture->field == GST_H264_PICTURE_FIELD_BOTTOM_FIELD) { + picture->bottom_field_order_cnt = picture->pic_order_cnt = 0; + } else { + picture->top_field_order_cnt -= picture->pic_order_cnt; + picture->bottom_field_order_cnt -= picture->pic_order_cnt; + picture->pic_order_cnt = MIN (picture->top_field_order_cnt, + picture->bottom_field_order_cnt); + } + break; + case 6: + /* 8.2.5.4.6 Replace long term reference pictures with current picture. + * First unmark if any existing with this long_term_frame_idx */ + + /* If we have long-term ref picture for LongTermFrameIdx, + * mark the picture as non-reference */ + for (i = 0; i < dpb->pic_list->len; i++) { + other = g_array_index (dpb->pic_list, GstH264Picture *, i); + + if (GST_H264_PICTURE_IS_LONG_TERM_REF (other) && + other->long_term_frame_idx == + ref_pic_marking->long_term_frame_idx) { + GST_TRACE ("MMCO-6: unmark old long-term ref pic %p (poc %d)", + other, other->pic_order_cnt); + gst_h264_picture_set_reference (other, + GST_H264_PICTURE_REF_NONE, TRUE); + break; + } + } + + gst_h264_picture_set_reference (picture, + GST_H264_PICTURE_REF_LONG_TERM, picture->second_field); + picture->long_term_frame_idx = ref_pic_marking->long_term_frame_idx; + if (picture->other_field && + GST_H264_PICTURE_IS_LONG_TERM_REF (picture->other_field)) { + picture->other_field->long_term_frame_idx = + ref_pic_marking->long_term_frame_idx; + } + break; + default: + g_assert_not_reached (); + return FALSE; + } + + return TRUE; +} + +/** + * gst_h264_picture_set_reference: + * @picture: a #GstH264Picture + * @reference: a GstH264PictureReference + * @other_field: %TRUE if @reference needs to be applied to the + * other field if any + * + * Update reference picture type of @picture with @reference + * + * Since: 1.20 + */ +void +gst_h264_picture_set_reference (GstH264Picture * picture, + 
GstH264PictureReference reference, gboolean other_field) +{ + g_return_if_fail (picture != NULL); + + picture->ref = reference; + if (reference > GST_H264_PICTURE_REF_NONE) + picture->ref_pic = TRUE; + + if (other_field && picture->other_field) { + picture->other_field->ref = reference; + + if (reference > GST_H264_PICTURE_REF_NONE) + picture->other_field->ref_pic = TRUE; + } +}
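The bumping decision in `gst_h264_dpb_needs_bump` above reduces to three checks: a free frame buffer means no bump; a reference frame that cannot be stored forces a bump; otherwise a bump is needed only when the incoming picture's POC is above the lowest output-needed POC. A standalone sketch of that decision — `Pic` and `needs_bump` are illustrative stand-ins, not the GStreamer API:

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* Minimal model of the needs-bump predicate shown in the hunk above. */
typedef struct
{
  int poc;                      /* picture order count */
  bool is_ref;                  /* reference picture? */
} Pic;

static bool
needs_bump (bool has_empty_frame_buffer, const Pic * to_insert, int lowest_poc)
{
  /* A free frame buffer means the picture can be stored directly. */
  if (has_empty_frame_buffer)
    return false;

  if (to_insert == NULL)
    return false;

  /* A reference frame must be kept in the DPB, so make room for it. */
  if (to_insert->is_ref)
    return true;

  /* Otherwise bump only when the lowest waiting POC is below the
   * incoming POC, i.e. the waiting picture can be output now. */
  return to_insert->poc > lowest_poc;
}
```

When this predicate holds, the bumping process itself (`gst_h264_dpb_bump`) pops the lowest output-needed picture, optionally removing it from the array when draining or when it is no longer a reference.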
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth264picture.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth264picture.h
Changed
@@ -21,8 +21,8 @@ #define __GST_H264_PICTURE_H__ #include <gst/codecs/codecs-prelude.h> - #include <gst/codecparsers/gsth264parser.h> +#include <gst/video/video.h> G_BEGIN_DECLS @@ -37,6 +37,50 @@ /* As specified in A.3.1 h) and A.3.2 f) */ #define GST_H264_DPB_MAX_SIZE 16 +/** + * GST_H264_PICTURE_IS_REF: + * @picture: a #GstH264Picture + * + * Check whether @picture is used for short-term or long-term reference + * + * Since: 1.20 + */ +#define GST_H264_PICTURE_IS_REF(picture) \ + ((picture)->ref != GST_H264_PICTURE_REF_NONE) + +/** + * GST_H264_PICTURE_IS_SHORT_TERM_REF: + * @picture: a #GstH264Picture + * + * Check whether @picture is used for short-term reference + * + * Since: 1.20 + */ +#define GST_H264_PICTURE_IS_SHORT_TERM_REF(picture) \ + ((picture)->ref == GST_H264_PICTURE_REF_SHORT_TERM) + +/** + * GST_H264_PICTURE_IS_LONG_TERM_REF: + * @picture: a #GstH264Picture + * + * Check whether @picture is used for long-term reference + * + * Since: 1.20 + */ +#define GST_H264_PICTURE_IS_LONG_TERM_REF(picture) \ + ((picture)->ref == GST_H264_PICTURE_REF_LONG_TERM) + +/** + * GST_H264_PICTURE_IS_FRAME: + * @picture: a #GstH264Picture + * + * Check whether @picture is a frame (not a field picture) + * + * Since: 1.20 + */ +#define GST_H264_PICTURE_IS_FRAME(picture) \ + ((picture)->field == GST_H264_PICTURE_FIELD_FRAME) + struct _GstH264Slice { GstH264SliceHdr header; @@ -52,13 +96,28 @@ GST_H264_PICTURE_FIELD_BOTTOM_FIELD, } GstH264PictureField; +/** + * GstH264PictureReference: + * @GST_H264_PICTURE_REF_NONE: Not used for reference picture + * @GST_H264_PICTURE_REF_SHORT_TERM: Used for short-term reference picture + * @GST_H264_PICTURE_REF_LONG_TERM: Used for long-term reference picture + * + * Since: 1.20 + */ +typedef enum +{ + GST_H264_PICTURE_REF_NONE = 0, + GST_H264_PICTURE_REF_SHORT_TERM, + GST_H264_PICTURE_REF_LONG_TERM, +} GstH264PictureReference; + struct _GstH264Picture { + /*< private >*/ GstMiniObject parent; GstH264SliceType type; - GstClockTime pts; 
/* From GstVideoCodecFrame */ guint32 system_frame_number; @@ -83,9 +142,10 @@ gint nal_ref_idc; gboolean idr; gint idr_pic_id; - gboolean ref; - gboolean long_term; - gboolean outputted; + GstH264PictureReference ref; + /* Whether a reference picture. */ + gboolean ref_pic; + gboolean needed_for_output; gboolean mem_mgmt_5; gboolean nonexisting; @@ -94,10 +154,31 @@ GstH264DecRefPicMarking dec_ref_pic_marking; + /* For interlaced decoding */ + gboolean second_field; + GstH264Picture * other_field; + + GstVideoBufferFlags buffer_flags; + gpointer user_data; GDestroyNotify notify; }; +/** + * GstH264DpbBumpMode: + * @GST_H264_DPB_BUMP_NORMAL_LATENCY: No latency requirement for DBP bumping. + * @GST_H264_DPB_BUMP_LOW_LATENCY: Low-latency requirement for DBP bumping. + * @GST_H264_DPB_BUMP_VERY_LOW_LATENCY: Very low-latency requirement for DBP bumping. + * + * Since: 1.20 + */ +typedef enum +{ + GST_H264_DPB_BUMP_NORMAL_LATENCY, + GST_H264_DPB_BUMP_LOW_LATENCY, + GST_H264_DPB_BUMP_VERY_LOW_LATENCY +} GstH264DpbBumpMode; + GST_CODECS_API GType gst_h264_picture_get_type (void); @@ -150,11 +231,22 @@ GstH264Dpb * gst_h264_dpb_new (void); GST_CODECS_API -void gst_h264_dpb_set_max_num_pics (GstH264Dpb * dpb, - gint max_num_pics); +void gst_h264_dpb_set_max_num_frames (GstH264Dpb * dpb, + gint max_num_frames); + +GST_CODECS_API +gint gst_h264_dpb_get_max_num_frames (GstH264Dpb * dpb); + +GST_CODECS_API +void gst_h264_dpb_set_interlaced (GstH264Dpb * dpb, + gboolean interlaced); GST_CODECS_API -gint gst_h264_dpb_get_max_num_pics (GstH264Dpb * dpb); +void gst_h264_dpb_set_max_num_reorder_frames (GstH264Dpb * dpb, + guint32 max_num_reorder_frames); + +GST_CODECS_API +gboolean gst_h264_dpb_get_interlaced (GstH264Dpb * dpb); GST_CODECS_API void gst_h264_dpb_free (GstH264Dpb * dpb); @@ -170,14 +262,7 @@ void gst_h264_dpb_delete_unused (GstH264Dpb * dpb); GST_CODECS_API -void gst_h264_dpb_delete_outputed (GstH264Dpb * dpb); - -GST_CODECS_API -void gst_h264_dpb_delete_by_poc 
(GstH264Dpb * dpb, - gint poc); - -GST_CODECS_API -gint gst_h264_dpb_num_ref_pictures (GstH264Dpb * dpb); +gint gst_h264_dpb_num_ref_frames (GstH264Dpb * dpb); GST_CODECS_API void gst_h264_dpb_mark_all_non_ref (GstH264Dpb * dpb); @@ -187,22 +272,21 @@ gint pic_num); GST_CODECS_API -GstH264Picture * gst_h264_dpb_get_long_ref_by_pic_num (GstH264Dpb * dpb, - gint pic_num); +GstH264Picture * gst_h264_dpb_get_long_ref_by_long_term_pic_num (GstH264Dpb * dpb, + gint long_term_pic_num); GST_CODECS_API GstH264Picture * gst_h264_dpb_get_lowest_frame_num_short_ref (GstH264Dpb * dpb); GST_CODECS_API -void gst_h264_dpb_get_pictures_not_outputted (GstH264Dpb * dpb, - GArray * out); - -GST_CODECS_API void gst_h264_dpb_get_pictures_short_term_ref (GstH264Dpb * dpb, + gboolean include_non_existing, + gboolean include_second_field, GArray * out); GST_CODECS_API void gst_h264_dpb_get_pictures_long_term_ref (GstH264Dpb * dpb, + gboolean include_second_field, GArray * out); GST_CODECS_API @@ -216,7 +300,30 @@ gint gst_h264_dpb_get_size (GstH264Dpb * dpb); GST_CODECS_API -gboolean gst_h264_dpb_is_full (GstH264Dpb * dpb); +gboolean gst_h264_dpb_has_empty_frame_buffer (GstH264Dpb * dpb); + +GST_CODECS_API +gboolean gst_h264_dpb_needs_bump (GstH264Dpb * dpb, + GstH264Picture * to_insert, + GstH264DpbBumpMode latency_mode); + +GST_CODECS_API +GstH264Picture * gst_h264_dpb_bump (GstH264Dpb * dpb, + gboolean drain); + +GST_CODECS_API +void gst_h264_dpb_set_last_output (GstH264Dpb * dpb, + GstH264Picture * picture); + +GST_CODECS_API +gboolean gst_h264_dpb_perform_memory_management_control_operation (GstH264Dpb * dpb, + GstH264RefPicMarking *ref_pic_marking, + GstH264Picture * picture); + +/* Internal methods */ +void gst_h264_picture_set_reference (GstH264Picture * picture, + GstH264PictureReference reference, + gboolean other_field); G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstH264Picture, gst_h264_picture_unref)
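A notable change in the header diff above: the 1.18 pair of booleans (`ref`, `long_term`) becomes a single three-state `GstH264PictureReference`, queried through the new `GST_H264_PICTURE_IS_*` macros. The same pattern in a self-contained sketch — the names mirror, but are not, the library's:

```c
#include <assert.h>

/* Standalone re-creation of the three-state reference marking introduced
 * in the header above; these are illustrative, not GStreamer types. */
typedef enum
{
  PICTURE_REF_NONE = 0,
  PICTURE_REF_SHORT_TERM,
  PICTURE_REF_LONG_TERM,
} PictureReference;

typedef enum
{
  PICTURE_FIELD_FRAME = 0,
  PICTURE_FIELD_TOP_FIELD,
  PICTURE_FIELD_BOTTOM_FIELD,
} PictureField;

typedef struct
{
  PictureReference ref;
  PictureField field;
} Picture;

/* Counterparts of GST_H264_PICTURE_IS_REF / _IS_SHORT_TERM_REF /
 * _IS_LONG_TERM_REF / _IS_FRAME */
#define PICTURE_IS_REF(p)            ((p)->ref != PICTURE_REF_NONE)
#define PICTURE_IS_SHORT_TERM_REF(p) ((p)->ref == PICTURE_REF_SHORT_TERM)
#define PICTURE_IS_LONG_TERM_REF(p)  ((p)->ref == PICTURE_REF_LONG_TERM)
#define PICTURE_IS_FRAME(p)          ((p)->field == PICTURE_FIELD_FRAME)
```

Folding the two flags into one enum makes the short-term and long-term states mutually exclusive by construction, which the old two-boolean layout could not guarantee.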
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth265decoder.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth265decoder.c
Changed
@@ -54,6 +54,12 @@ { gint width, height; + guint8 conformance_window_flag; + gint crop_rect_width; + gint crop_rect_height; + gint crop_rect_x; + gint crop_rect_y; + /* input codec_data, if any */ GstBuffer *codec_data; guint nal_length_size; @@ -63,7 +69,18 @@ GstH265DecoderAlign align; GstH265Parser *parser; GstH265Dpb *dpb; - GstFlowReturn last_ret; + + /* 0: frame or field-pair interlaced stream + * 1: alternating, single field interlaced stream. + * When equal to 1, picture timing SEI shall be present in every AU */ + guint8 field_seq_flag; + guint8 progressive_source_flag; + guint8 interlaced_source_flag; + + /* Updated/cleared per handle_frame() by using picture timeing SEI */ + GstH265SEIPicStructType cur_pic_struct; + guint8 cur_source_scan_type; + guint8 cur_duplicate_flag; /* vps/sps/pps of the current slice */ const GstH265VPS *active_vps; @@ -71,13 +88,12 @@ const GstH265PPS *active_pps; guint32 SpsMaxLatencyPictures; - gint32 WpOffsetHalfRangeC; /* Picture currently being processed/decoded */ GstH265Picture *current_picture; GstVideoCodecFrame *current_frame; - /* Slice (slice header + nalu) currently being processed/decodec */ + /* Slice (slice header + nalu) currently being processed/decoded */ GstH265Slice current_slice; GstH265Slice prev_slice; GstH265Slice prev_independent_slice; @@ -101,8 +117,19 @@ gboolean associated_irap_NoRaslOutputFlag; gboolean new_bitstream; gboolean prev_nal_is_eos; + + /* Reference picture lists, constructed for each slice */ + gboolean process_ref_pic_lists; + GArray *ref_pic_list_tmp; + GArray *ref_pic_list0; + GArray *ref_pic_list1; }; +#define UPDATE_FLOW_RETURN(ret,new_ret) G_STMT_START { \ + if (*(ret) == GST_FLOW_OK) \ + *(ret) = new_ret; \ +} G_STMT_END + #define parent_class gst_h265_decoder_parent_class G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstH265Decoder, gst_h265_decoder, GST_TYPE_VIDEO_DECODER, @@ -110,6 +137,8 @@ GST_DEBUG_CATEGORY_INIT (gst_h265_decoder_debug, "h265decoder", 0, "H.265 Video Decoder")); 
+static void gst_h265_decoder_finalize (GObject * object); + static gboolean gst_h265_decoder_start (GstVideoDecoder * decoder); static gboolean gst_h265_decoder_stop (GstVideoDecoder * decoder); static gboolean gst_h265_decoder_set_format (GstVideoDecoder * decoder, @@ -120,16 +149,21 @@ static GstFlowReturn gst_h265_decoder_handle_frame (GstVideoDecoder * decoder, GstVideoCodecFrame * frame); -static gboolean gst_h265_decoder_finish_current_picture (GstH265Decoder * self); -static void gst_h265_decoder_clear_dpb (GstH265Decoder * self); -static gboolean -gst_h265_decoder_output_all_remaining_pics (GstH265Decoder * self); -static gboolean gst_h265_decoder_start_current_picture (GstH265Decoder * self); +static void gst_h265_decoder_finish_current_picture (GstH265Decoder * self, + GstFlowReturn * ret); +static void gst_h265_decoder_clear_ref_pic_sets (GstH265Decoder * self); +static void gst_h265_decoder_clear_dpb (GstH265Decoder * self, gboolean flush); +static GstFlowReturn gst_h265_decoder_drain_internal (GstH265Decoder * self); +static GstFlowReturn +gst_h265_decoder_start_current_picture (GstH265Decoder * self); static void gst_h265_decoder_class_init (GstH265DecoderClass * klass) { GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GObjectClass *object_class = G_OBJECT_CLASS (klass); + + object_class->finalize = GST_DEBUG_FUNCPTR (gst_h265_decoder_finalize); decoder_class->start = GST_DEBUG_FUNCPTR (gst_h265_decoder_start); decoder_class->stop = GST_DEBUG_FUNCPTR (gst_h265_decoder_stop); @@ -144,9 +178,33 @@ static void gst_h265_decoder_init (GstH265Decoder * self) { + GstH265DecoderPrivate *priv; + gst_video_decoder_set_packetized (GST_VIDEO_DECODER (self), TRUE); - self->priv = gst_h265_decoder_get_instance_private (self); + self->priv = priv = gst_h265_decoder_get_instance_private (self); + + priv->last_output_poc = G_MININT32; + + priv->ref_pic_list_tmp = g_array_sized_new (FALSE, TRUE, + sizeof (GstH265Picture *), 32); + 
priv->ref_pic_list0 = g_array_sized_new (FALSE, TRUE, + sizeof (GstH265Picture *), 32); + priv->ref_pic_list1 = g_array_sized_new (FALSE, TRUE, + sizeof (GstH265Picture *), 32); +} + +static void +gst_h265_decoder_finalize (GObject * object) +{ + GstH265Decoder *self = GST_H265_DECODER (object); + GstH265DecoderPrivate *priv = self->priv; + + g_array_unref (priv->ref_pic_list_tmp); + g_array_unref (priv->ref_pic_list0); + g_array_unref (priv->ref_pic_list1); + + G_OBJECT_CLASS (parent_class)->finalize (object); } static gboolean @@ -186,29 +244,49 @@ priv->dpb = NULL; } + gst_h265_decoder_clear_ref_pic_sets (self); + return TRUE; } -static gboolean +static GstFlowReturn gst_h265_decoder_parse_vps (GstH265Decoder * self, GstH265NalUnit * nalu) { GstH265DecoderPrivate *priv = self->priv; GstH265VPS vps; GstH265ParserResult pres; - gboolean ret = TRUE; pres = gst_h265_parser_parse_vps (priv->parser, nalu, &vps); if (pres != GST_H265_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to parse VPS, result %d", pres); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "VPS parsed"); - return ret; + return GST_FLOW_OK; } static gboolean +gst_h265_decoder_is_crop_rect_changed (GstH265Decoder * self, GstH265SPS * sps) +{ + GstH265DecoderPrivate *priv = self->priv; + + if (priv->conformance_window_flag != sps->conformance_window_flag) + return TRUE; + if (priv->crop_rect_width != sps->crop_rect_width) + return TRUE; + if (priv->crop_rect_height != sps->crop_rect_height) + return TRUE; + if (priv->crop_rect_x != sps->crop_rect_x) + return TRUE; + if (priv->crop_rect_y != sps->crop_rect_y) + return TRUE; + + return FALSE; +} + +static GstFlowReturn gst_h265_decoder_process_sps (GstH265Decoder * self, GstH265SPS * sps) { GstH265DecoderPrivate *priv = self->priv; @@ -217,8 +295,10 @@ gint MaxLumaPS; const gint MaxDpbPicBuf = 6; gint PicSizeInSamplesY; - guint high_precision_offsets_enabled_flag = 0; - guint bitdepthC = 0; + guint8 field_seq_flag = 0; + guint8 
progressive_source_flag = 0; + guint8 interlaced_source_flag = 0; + GstFlowReturn ret = GST_FLOW_OK; /* A.4.1 */ MaxLumaPS = 35651584; @@ -234,25 +314,48 @@ max_dpb_size = MIN (max_dpb_size, 16); + if (sps->vui_parameters_present_flag) + field_seq_flag = sps->vui_params.field_seq_flag; + + progressive_source_flag = sps->profile_tier_level.progressive_source_flag; + interlaced_source_flag = sps->profile_tier_level.interlaced_source_flag; + prev_max_dpb_size = gst_h265_dpb_get_max_num_pics (priv->dpb); if (priv->width != sps->width || priv->height != sps->height || - prev_max_dpb_size != max_dpb_size) { + prev_max_dpb_size != max_dpb_size || + priv->field_seq_flag != field_seq_flag || + priv->progressive_source_flag != progressive_source_flag || + priv->interlaced_source_flag != interlaced_source_flag || + gst_h265_decoder_is_crop_rect_changed (self, sps)) { GstH265DecoderClass *klass = GST_H265_DECODER_GET_CLASS (self); GST_DEBUG_OBJECT (self, - "SPS updated, resolution: %dx%d -> %dx%d, dpb size: %d -> %d", + "SPS updated, resolution: %dx%d -> %dx%d, dpb size: %d -> %d, " + "field_seq_flag: %d -> %d, progressive_source_flag: %d -> %d, " + "interlaced_source_flag: %d -> %d", priv->width, priv->height, sps->width, sps->height, - prev_max_dpb_size, max_dpb_size); + prev_max_dpb_size, max_dpb_size, priv->field_seq_flag, field_seq_flag, + priv->progressive_source_flag, progressive_source_flag, + priv->interlaced_source_flag, interlaced_source_flag); g_assert (klass->new_sequence); - if (!klass->new_sequence (self, sps, max_dpb_size)) { - GST_ERROR_OBJECT (self, "subclass does not want accept new sequence"); - return FALSE; + ret = klass->new_sequence (self, sps, max_dpb_size); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass does not want accept new sequence"); + return ret; } priv->width = sps->width; priv->height = sps->height; + priv->conformance_window_flag = sps->conformance_window_flag; + priv->crop_rect_width = sps->crop_rect_width; + 
priv->crop_rect_height = sps->crop_rect_height; + priv->crop_rect_x = sps->crop_rect_x; + priv->crop_rect_y = sps->crop_rect_y; + priv->field_seq_flag = field_seq_flag; + priv->progressive_source_flag = progressive_source_flag; + priv->interlaced_source_flag = interlaced_source_flag; gst_h265_dpb_set_max_num_pics (priv->dpb, max_dpb_size); } @@ -263,47 +366,40 @@ sps->max_latency_increase_plus1[sps->max_sub_layers_minus1] - 1; } - /* Calculate WpOffsetHalfRangeC: (7-34) - * FIXME: We don't have parser API for sps_range_extension, so - * assuming high_precision_offsets_enabled_flag as zero */ - bitdepthC = sps->bit_depth_chroma_minus8 + 8; - priv->WpOffsetHalfRangeC = - 1 << (high_precision_offsets_enabled_flag ? (bitdepthC - 1) : 7); - GST_DEBUG_OBJECT (self, "Set DPB max size %d", max_dpb_size); - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_h265_decoder_parse_sps (GstH265Decoder * self, GstH265NalUnit * nalu) { GstH265DecoderPrivate *priv = self->priv; GstH265SPS sps; GstH265ParserResult pres; - gboolean ret; + GstFlowReturn ret = GST_FLOW_OK; pres = gst_h265_parse_sps (priv->parser, nalu, &sps, TRUE); if (pres != GST_H265_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to parse SPS, result %d", pres); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "SPS parsed"); ret = gst_h265_decoder_process_sps (self, &sps); - if (!ret) { + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to process SPS"); } else if (gst_h265_parser_update_sps (priv->parser, &sps) != GST_H265_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to update SPS"); - ret = FALSE; + ret = GST_FLOW_ERROR; } return ret; } -static gboolean +static GstFlowReturn gst_h265_decoder_parse_pps (GstH265Decoder * self, GstH265NalUnit * nalu) { GstH265DecoderPrivate *priv = self->priv; @@ -313,60 +409,234 @@ pres = gst_h265_parser_parse_pps (priv->parser, nalu, &pps); if (pres != GST_H265_PARSER_OK) { GST_WARNING_OBJECT (self, "Failed to parse PPS, 
result %d", pres); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "PPS parsed"); - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn +gst_h265_decoder_parse_sei (GstH265Decoder * self, GstH265NalUnit * nalu) +{ + GstH265DecoderPrivate *priv = self->priv; + GstH265ParserResult pres; + GArray *messages = NULL; + guint i; + + pres = gst_h265_parser_parse_sei (priv->parser, nalu, &messages); + if (pres != GST_H265_PARSER_OK) { + GST_WARNING_OBJECT (self, "Failed to parse SEI, result %d", pres); + + /* XXX: Ignore error from SEI parsing, it might be malformed bitstream, + * or our fault. But shouldn't be critical */ + g_clear_pointer (&messages, g_array_unref); + return GST_FLOW_OK; + } + + for (i = 0; i < messages->len; i++) { + GstH265SEIMessage *sei = &g_array_index (messages, GstH265SEIMessage, i); + + switch (sei->payloadType) { + case GST_H265_SEI_PIC_TIMING: + priv->cur_pic_struct = sei->payload.pic_timing.pic_struct; + priv->cur_source_scan_type = sei->payload.pic_timing.source_scan_type; + priv->cur_duplicate_flag = sei->payload.pic_timing.duplicate_flag; + + GST_TRACE_OBJECT (self, + "Picture Timing SEI, pic_struct: %d, source_scan_type: %d, " + "duplicate_flag: %d", priv->cur_pic_struct, + priv->cur_source_scan_type, priv->cur_duplicate_flag); + break; + default: + break; + } + } + + g_array_free (messages, TRUE); + GST_LOG_OBJECT (self, "SEI parsed"); + + return GST_FLOW_OK; +} + +static void +gst_h265_decoder_process_ref_pic_lists (GstH265Decoder * self, + GstH265Picture * curr_pic, GstH265Slice * slice, + GArray ** ref_pic_list0, GArray ** ref_pic_list1) +{ + GstH265DecoderPrivate *priv = self->priv; + GstH265RefPicListModification *ref_mod = + &slice->header.ref_pic_list_modification; + GstH265PPSSccExtensionParams *scc_ext = + &slice->header.pps->pps_scc_extension_params; + GArray *tmp_refs; + gint num_tmp_refs, i; + + *ref_pic_list0 = priv->ref_pic_list0; + *ref_pic_list1 = priv->ref_pic_list1; + + /* There is 
nothing to be done for I slices */ + if (GST_H265_IS_I_SLICE (&slice->header)) + return; + + /* Infinite loop prevention */ + if (self->NumPocStCurrBefore == 0 && self->NumPocStCurrAfter == 0 && + self->NumPocLtCurr == 0 && !scc_ext->pps_curr_pic_ref_enabled_flag) { + GST_WARNING_OBJECT (self, + "Expected references, got none, preventing infinite loop."); + return; + } + + /* 8.3.4 Deriving l0 */ + tmp_refs = priv->ref_pic_list_tmp; + + /* (8-8) + * Deriving l0 consists of appending in a loop RefPicSetStCurrBefore, + * RefPicSetStCurrAfter and RefPicSetLtCurr until NumRpsCurrTempList0 items + * have been reached. + */ + + /* NumRpsCurrTempList0 */ + num_tmp_refs = MAX (slice->header.num_ref_idx_l0_active_minus1 + 1, + self->NumPicTotalCurr); + + while (tmp_refs->len < num_tmp_refs) { + for (i = 0; i < self->NumPocStCurrBefore && tmp_refs->len < num_tmp_refs; + i++) + g_array_append_val (tmp_refs, self->RefPicSetStCurrBefore[i]); + for (i = 0; i < self->NumPocStCurrAfter && tmp_refs->len < num_tmp_refs; + i++) + g_array_append_val (tmp_refs, self->RefPicSetStCurrAfter[i]); + for (i = 0; i < self->NumPocLtCurr && tmp_refs->len < num_tmp_refs; i++) + g_array_append_val (tmp_refs, self->RefPicSetLtCurr[i]); + if (scc_ext->pps_curr_pic_ref_enabled_flag) + g_array_append_val (tmp_refs, curr_pic); + } + + /* (8-9) + * If needed, apply the modification based on the lookup table found in the + * slice header (list_entry_l0).
+ */ + for (i = 0; i <= slice->header.num_ref_idx_l0_active_minus1; i++) { + GstH265Picture **tmp = (GstH265Picture **) tmp_refs->data; + + if (ref_mod->ref_pic_list_modification_flag_l0) + g_array_append_val (*ref_pic_list0, tmp[ref_mod->list_entry_l0[i]]); + else + g_array_append_val (*ref_pic_list0, tmp[i]); + } + + if (scc_ext->pps_curr_pic_ref_enabled_flag && + !ref_mod->ref_pic_list_modification_flag_l0 && + num_tmp_refs > (slice->header.num_ref_idx_l0_active_minus1 + 1)) { + g_array_index (*ref_pic_list0, GstH265Picture *, + slice->header.num_ref_idx_l0_active_minus1) = curr_pic; + } + + g_array_set_size (tmp_refs, 0); + + /* For P slices we only need l0 */ + if (GST_H265_IS_P_SLICE (&slice->header)) + return; + + /* 8.3.4 Deriving l1 */ + /* (8-10) + * Deriving l1 consists of appending in a loop RefPicSetStCurrAfter, + * RefPicSetStCurrBefore and RefPicSetLtCurr until NumRpsCurrTempList1 items + * have been reached. + */ + + /* NumRpsCurrTempList1 */ + num_tmp_refs = MAX (slice->header.num_ref_idx_l1_active_minus1 + 1, + self->NumPicTotalCurr); + + while (tmp_refs->len < num_tmp_refs) { + for (i = 0; i < self->NumPocStCurrAfter && tmp_refs->len < num_tmp_refs; + i++) + g_array_append_val (tmp_refs, self->RefPicSetStCurrAfter[i]); + for (i = 0; i < self->NumPocStCurrBefore && tmp_refs->len < num_tmp_refs; + i++) + g_array_append_val (tmp_refs, self->RefPicSetStCurrBefore[i]); + for (i = 0; i < self->NumPocLtCurr && tmp_refs->len < num_tmp_refs; i++) + g_array_append_val (tmp_refs, self->RefPicSetLtCurr[i]); + if (scc_ext->pps_curr_pic_ref_enabled_flag) + g_array_append_val (tmp_refs, curr_pic); + } + + /* (8-11) + * If needed, apply the modification based on the lookup table found in the + * slice header (list_entry_l1).
+ */ + for (i = 0; i <= slice->header.num_ref_idx_l1_active_minus1; i++) { + GstH265Picture **tmp = (GstH265Picture **) tmp_refs->data; + + if (ref_mod->ref_pic_list_modification_flag_l1) + g_array_append_val (*ref_pic_list1, tmp[ref_mod->list_entry_l1[i]]); + else + g_array_append_val (*ref_pic_list1, tmp[i]); + } + + g_array_set_size (tmp_refs, 0); +} + +static GstFlowReturn gst_h265_decoder_decode_slice (GstH265Decoder * self) { GstH265DecoderClass *klass = GST_H265_DECODER_GET_CLASS (self); GstH265DecoderPrivate *priv = self->priv; GstH265Slice *slice = &priv->current_slice; GstH265Picture *picture = priv->current_picture; + GArray *l0 = NULL; + GArray *l1 = NULL; + GstFlowReturn ret = GST_FLOW_OK; if (!picture) { GST_ERROR_OBJECT (self, "No current picture"); - return FALSE; + return GST_FLOW_ERROR; } g_assert (klass->decode_slice); - return klass->decode_slice (self, picture, slice); + if (priv->process_ref_pic_lists) { + l0 = priv->ref_pic_list0; + l1 = priv->ref_pic_list1; + gst_h265_decoder_process_ref_pic_lists (self, picture, slice, &l0, &l1); + } + + ret = klass->decode_slice (self, picture, slice, l0, l1); + + if (priv->process_ref_pic_lists) { + g_array_set_size (l0, 0); + g_array_set_size (l1, 0); + } + + return ret; } -static gboolean +static GstFlowReturn gst_h265_decoder_preprocess_slice (GstH265Decoder * self, GstH265Slice * slice) { GstH265DecoderPrivate *priv = self->priv; const GstH265SliceHdr *slice_hdr = &slice->header; - const GstH265NalUnit *nalu = &slice->nalu; if (priv->current_picture && slice_hdr->first_slice_segment_in_pic_flag) { GST_WARNING_OBJECT (self, "Current picture is not finished but slice header has " "first_slice_segment_in_pic_flag"); - return FALSE; - } - - if (GST_H265_IS_NAL_TYPE_IDR (nalu->type)) { - GST_DEBUG_OBJECT (self, "IDR nalu, clear dpb"); - gst_h265_decoder_drain (GST_VIDEO_DECODER (self)); + return GST_FLOW_ERROR; } - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn 
gst_h265_decoder_parse_slice (GstH265Decoder * self, GstH265NalUnit * nalu, GstClockTime pts) { GstH265DecoderPrivate *priv = self->priv; GstH265ParserResult pres = GST_H265_PARSER_OK; + GstFlowReturn ret = GST_FLOW_OK; memset (&priv->current_slice, 0, sizeof (GstH265Slice)); @@ -377,7 +647,7 @@ GST_ERROR_OBJECT (self, "Failed to parse slice header, ret %d", pres); memset (&priv->current_slice, 0, sizeof (GstH265Slice)); - return FALSE; + return GST_FLOW_ERROR; } /* NOTE: gst_h265_parser_parse_slice_hdr() allocates array @@ -388,8 +658,21 @@ priv->current_slice.nalu = *nalu; - if (!gst_h265_decoder_preprocess_slice (self, &priv->current_slice)) - return FALSE; + if (priv->current_slice.header.dependent_slice_segment_flag) { + GstH265SliceHdr *slice_hdr = &priv->current_slice.header; + GstH265SliceHdr *indep_slice_hdr = &priv->prev_independent_slice.header; + + memcpy (&slice_hdr->type, &indep_slice_hdr->type, + G_STRUCT_OFFSET (GstH265SliceHdr, num_entry_point_offsets) - + G_STRUCT_OFFSET (GstH265SliceHdr, type)); + } else { + priv->prev_independent_slice = priv->current_slice; + memset (&priv->prev_independent_slice.nalu, 0, sizeof (GstH265NalUnit)); + } + + ret = gst_h265_decoder_preprocess_slice (self, &priv->current_slice); + if (ret != GST_FLOW_OK) + return ret; priv->active_pps = priv->current_slice.header.pps; priv->active_sps = priv->active_pps->sps; @@ -397,35 +680,36 @@ if (!priv->current_picture) { GstH265DecoderClass *klass = GST_H265_DECODER_GET_CLASS (self); GstH265Picture *picture; - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; + + g_assert (priv->current_frame); picture = gst_h265_picture_new (); picture->pts = pts; /* This allows accessing the frame from the picture. 
*/ picture->system_frame_number = priv->current_frame->system_frame_number; + priv->current_picture = picture; + if (klass->new_picture) - ret = klass->new_picture (self, picture); + ret = klass->new_picture (self, priv->current_frame, picture); - if (!ret) { - GST_ERROR_OBJECT (self, "subclass does not want accept new picture"); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass does not want accept new picture"); + priv->current_picture = NULL; gst_h265_picture_unref (picture); - return FALSE; + return ret; } - priv->current_picture = picture; - gst_video_codec_frame_set_user_data (priv->current_frame, - gst_h265_picture_ref (priv->current_picture), - (GDestroyNotify) gst_h265_picture_unref); - - if (!gst_h265_decoder_start_current_picture (self)) { - GST_ERROR_OBJECT (self, "start picture failed"); - return FALSE; + ret = gst_h265_decoder_start_current_picture (self); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "start picture failed"); + return ret; } /* this picture was dropped */ if (!priv->current_picture) - return TRUE; + return GST_FLOW_OK; } return gst_h265_decoder_decode_slice (self); @@ -436,7 +720,7 @@ GstClockTime pts) { GstH265DecoderPrivate *priv = self->priv; - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; GST_LOG_OBJECT (self, "Parsed nal type: %d, offset %d, size %d", nalu->type, nalu->offset, nalu->size); @@ -451,6 +735,10 @@ case GST_H265_NAL_PPS: ret = gst_h265_decoder_parse_pps (self, nalu); break; + case GST_H265_NAL_PREFIX_SEI: + case GST_H265_NAL_SUFFIX_SEI: + ret = gst_h265_decoder_parse_sei (self, nalu); + break; case GST_H265_NAL_SLICE_TRAIL_N: case GST_H265_NAL_SLICE_TRAIL_R: case GST_H265_NAL_SLICE_TSA_N: @@ -472,11 +760,9 @@ priv->prev_nal_is_eos = FALSE; break; case GST_H265_NAL_EOB: - gst_h265_decoder_drain (GST_VIDEO_DECODER (self)); priv->new_bitstream = TRUE; break; case GST_H265_NAL_EOS: - gst_h265_decoder_drain (GST_VIDEO_DECODER (self)); priv->prev_nal_is_eos = TRUE; break; default: @@ 
-529,7 +815,7 @@ } } -static gboolean +static GstFlowReturn gst_h265_decoder_parse_codec_data (GstH265Decoder * self, const guint8 * data, gsize size) { @@ -539,16 +825,17 @@ guint num_nals, i, j; GstH265ParserResult pres; GstH265NalUnit nalu; + GstFlowReturn ret = GST_FLOW_OK; /* parse the hvcC data */ if (size < 23) { GST_WARNING_OBJECT (self, "hvcC too small"); - return FALSE; + return GST_FLOW_ERROR; } /* wrong hvcC version */ if (data[0] != 0 && data[0] != 1) { - return FALSE; + return GST_FLOW_ERROR; } priv->nal_length_size = (data[21] & 0x03) + 1; @@ -560,7 +847,7 @@ for (i = 0; i < num_nal_arrays; i++) { if (off + 3 >= size) { GST_WARNING_OBJECT (self, "hvcC too small"); - return FALSE; + return GST_FLOW_ERROR; } num_nals = GST_READ_UINT16_BE (data + off + 1); @@ -571,26 +858,29 @@ if (pres != GST_H265_PARSER_OK) { GST_WARNING_OBJECT (self, "hvcC too small"); - return FALSE; + return GST_FLOW_ERROR; } switch (nalu.type) { case GST_H265_NAL_VPS: - if (!gst_h265_decoder_parse_vps (self, &nalu)) { + ret = gst_h265_decoder_parse_vps (self, &nalu); + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to parse VPS"); - return FALSE; + return ret; } break; case GST_H265_NAL_SPS: - if (!gst_h265_decoder_parse_sps (self, &nalu)) { + ret = gst_h265_decoder_parse_sps (self, &nalu); + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to parse SPS"); - return FALSE; + return ret; } break; case GST_H265_NAL_PPS: - if (!gst_h265_decoder_parse_pps (self, &nalu)) { + ret = gst_h265_decoder_parse_pps (self, &nalu); + if (ret != GST_FLOW_OK) { GST_WARNING_OBJECT (self, "Failed to parse PPS"); - return FALSE; + return ret; } break; default: @@ -601,7 +891,7 @@ } } - return TRUE; + return GST_FLOW_OK; } static gboolean @@ -678,7 +968,8 @@ GstMapInfo map; gst_buffer_map (priv->codec_data, &map, GST_MAP_READ); - if (!gst_h265_decoder_parse_codec_data (self, map.data, map.size)) { + if (gst_h265_decoder_parse_codec_data (self, map.data, map.size) != + 
GST_FLOW_OK) { /* keep going without error. * Probably inband SPS/PPS might be valid data */ GST_WARNING_OBJECT (self, "Failed to handle codec data"); @@ -694,7 +985,7 @@ { GstH265Decoder *self = GST_H265_DECODER (decoder); - gst_h265_decoder_clear_dpb (self); + gst_h265_decoder_clear_dpb (self, TRUE); return TRUE; } @@ -703,13 +994,9 @@ gst_h265_decoder_drain (GstVideoDecoder * decoder) { GstH265Decoder *self = GST_H265_DECODER (decoder); - GstH265DecoderPrivate *priv = self->priv; - - priv->last_ret = GST_FLOW_OK; - gst_h265_decoder_output_all_remaining_pics (self); - gst_h265_decoder_clear_dpb (self); - return priv->last_ret; + /* dpb will be cleared by this method */ + return gst_h265_decoder_drain_internal (self); } static GstFlowReturn @@ -730,9 +1017,6 @@ nalu->type <= GST_H265_NAL_SLICE_CRA_NUT) picture->RapPicFlag = TRUE; - /* FIXME: Use SEI header values */ - picture->field = GST_H265_PICTURE_FIELD_FRAME; - /* NoRaslOutputFlag == 1 if the current picture is * 1) an IDR picture * 2) a BLA picture @@ -860,6 +1144,61 @@ } static gboolean +gst_h265_decoder_set_buffer_flags (GstH265Decoder * self, + GstH265Picture * picture) +{ + GstH265DecoderPrivate *priv = self->priv; + + switch (picture->pic_struct) { + case GST_H265_SEI_PIC_STRUCT_FRAME: + break; + case GST_H265_SEI_PIC_STRUCT_TOP_FIELD: + case GST_H265_SEI_PIC_STRUCT_TOP_PAIRED_PREVIOUS_BOTTOM: + case GST_H265_SEI_PIC_STRUCT_TOP_PAIRED_NEXT_BOTTOM: + if (!priv->field_seq_flag) { + GST_FIXME_OBJECT (self, + "top-field with field_seq_flag == 0, what does it mean?"); + } else { + picture->buffer_flags = GST_VIDEO_BUFFER_FLAG_TOP_FIELD; + } + break; + case GST_H265_SEI_PIC_STRUCT_BOTTOM_FIELD: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_PAIRED_PREVIOUS_TOP: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_PAIRED_NEXT_TOP: + if (!priv->field_seq_flag) { + GST_FIXME_OBJECT (self, + "bottom-field with field_seq_flag == 0, what does it mean?"); + } else { + picture->buffer_flags = GST_VIDEO_BUFFER_FLAG_BOTTOM_FIELD; + } + break; 
+ case GST_H265_SEI_PIC_STRUCT_TOP_BOTTOM: + if (priv->field_seq_flag) { + GST_FIXME_OBJECT (self, + "TFF with field_seq_flag == 1, what does it mean?"); + } else { + picture->buffer_flags = + GST_VIDEO_BUFFER_FLAG_INTERLACED | GST_VIDEO_BUFFER_FLAG_TFF; + } + break; + case GST_H265_SEI_PIC_STRUCT_BOTTOM_TOP: + if (priv->field_seq_flag) { + GST_FIXME_OBJECT (self, + "BFF with field_seq_flag == 1, what does it mean?"); + } else { + picture->buffer_flags = GST_VIDEO_BUFFER_FLAG_INTERLACED; + } + break; + default: + GST_FIXME_OBJECT (self, "Unhandled picture time SEI pic_struct %d", + picture->pic_struct); + break; + } + + return TRUE; +} + +static gboolean gst_h265_decoder_init_current_picture (GstH265Decoder * self) { GstH265DecoderPrivate *priv = self->priv; @@ -873,6 +1212,12 @@ &priv->current_slice, priv->current_picture)) return FALSE; + /* Use picture struct parsed from picture timing SEI */ + priv->current_picture->pic_struct = priv->cur_pic_struct; + priv->current_picture->source_scan_type = priv->cur_source_scan_type; + priv->current_picture->duplicate_flag = priv->cur_duplicate_flag; + gst_h265_decoder_set_buffer_flags (self, priv->current_picture); + return TRUE; } @@ -893,13 +1238,9 @@ } static void -gst_h265_decoder_derive_and_mark_rps (GstH265Decoder * self, - GstH265Picture * picture, gint32 * CurrDeltaPocMsbPresentFlag, - gint32 * FollDeltaPocMsbPresentFlag) +gst_h265_decoder_clear_ref_pic_sets (GstH265Decoder * self) { - GstH265DecoderPrivate *priv = self->priv; guint i; - GArray *dpb_array; for (i = 0; i < 16; i++) { gst_h265_picture_replace (&self->RefPicSetLtCurr[i], NULL); @@ -908,6 +1249,18 @@ gst_h265_picture_replace (&self->RefPicSetStCurrAfter[i], NULL); gst_h265_picture_replace (&self->RefPicSetStFoll[i], NULL); } +} + +static void +gst_h265_decoder_derive_and_mark_rps (GstH265Decoder * self, + GstH265Picture * picture, gint32 * CurrDeltaPocMsbPresentFlag, + gint32 * FollDeltaPocMsbPresentFlag) +{ + GstH265DecoderPrivate *priv = self->priv; 
+ guint i; + GArray *dpb_array; + + gst_h265_decoder_clear_ref_pic_sets (self); /* (8-6) */ for (i = 0; i < self->NumPocLtCurr; i++) { @@ -1070,7 +1423,7 @@ numtotalcurr++; } - self->NumPocTotalCurr = numtotalcurr; + self->NumPicTotalCurr = numtotalcurr; /* The variable DeltaPocMsbCycleLt[i] is derived as follows: (7-38) */ for (i = 0; i < num_lt_pics; i++) { @@ -1107,7 +1460,7 @@ GST_LOG_OBJECT (self, "NumPocStFoll: %d", self->NumPocStFoll); GST_LOG_OBJECT (self, "NumPocLtCurr: %d", self->NumPocLtCurr); GST_LOG_OBJECT (self, "NumPocLtFoll: %d", self->NumPocLtFoll); - GST_LOG_OBJECT (self, "NumPocTotalCurr: %d", self->NumPocTotalCurr); + GST_LOG_OBJECT (self, "NumPicTotalCurr: %d", self->NumPicTotalCurr); /* the derivation process for the RPS and the picture marking */ gst_h265_decoder_derive_and_mark_rps (self, picture, @@ -1117,22 +1470,18 @@ } static void -gst_h265_decoder_clear_dpb (GstH265Decoder * self) -{ - GstH265DecoderPrivate *priv = self->priv; - - gst_h265_dpb_clear (priv->dpb); - priv->last_output_poc = -1; -} - -static void gst_h265_decoder_do_output_picture (GstH265Decoder * self, - GstH265Picture * picture) + GstH265Picture * picture, GstFlowReturn * ret) { GstH265DecoderPrivate *priv = self->priv; GstH265DecoderClass *klass; + GstVideoCodecFrame *frame = NULL; + GstFlowReturn flow_ret = GST_FLOW_OK; + + g_assert (ret != NULL); - picture->outputted = TRUE; + GST_LOG_OBJECT (self, "Output picture %p (poc %d)", picture, + picture->pic_order_cnt); if (picture->pic_order_cnt < priv->last_output_poc) { GST_WARNING_OBJECT (self, @@ -1142,66 +1491,80 @@ priv->last_output_poc = picture->pic_order_cnt; + frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), + picture->system_frame_number); + + if (!frame) { + GST_ERROR_OBJECT (self, + "No available codec frame with frame number %d", + picture->system_frame_number); + UPDATE_FLOW_RETURN (ret, GST_FLOW_ERROR); + + gst_h265_picture_unref (picture); + return; + } + klass = GST_H265_DECODER_GET_CLASS 
(self); g_assert (klass->output_picture); - priv->last_ret = klass->output_picture (self, picture); -} + flow_ret = klass->output_picture (self, frame, picture); -static gint -poc_asc_compare (const GstH265Picture * a, const GstH265Picture * b) -{ - return a->pic_order_cnt > b->pic_order_cnt; + UPDATE_FLOW_RETURN (ret, flow_ret); } -static gboolean -gst_h265_decoder_output_all_remaining_pics (GstH265Decoder * self) +static void +gst_h265_decoder_clear_dpb (GstH265Decoder * self, gboolean flush) { + GstVideoDecoder *decoder = GST_VIDEO_DECODER (self); GstH265DecoderPrivate *priv = self->priv; - GList *to_output = NULL; - GList *iter; - - gst_h265_dpb_get_pictures_not_outputted (priv->dpb, &to_output); + GstH265Picture *picture; - to_output = g_list_sort (to_output, (GCompareFunc) poc_asc_compare); + /* If we are not flushing now, videodecoder baseclass will hold + * GstVideoCodecFrame. Release frames manually */ + if (!flush) { + while ((picture = gst_h265_dpb_bump (priv->dpb, TRUE)) != NULL) { + GstVideoCodecFrame *frame = gst_video_decoder_get_frame (decoder, + picture->system_frame_number); - for (iter = to_output; iter; iter = g_list_next (iter)) { - GstH265Picture *picture = (GstH265Picture *) iter->data; - - GST_LOG_OBJECT (self, "Output picture %p (poc %d)", picture, - picture->pic_order_cnt); - gst_h265_decoder_do_output_picture (self, picture); + if (frame) + gst_video_decoder_release_frame (decoder, frame); + gst_h265_picture_unref (picture); + } } - if (to_output) - g_list_free_full (to_output, (GDestroyNotify) gst_h265_picture_unref); - - return TRUE; + gst_h265_dpb_clear (priv->dpb); + priv->last_output_poc = G_MININT32; } -static gboolean -gst_h265_decoder_check_latency_count (GList * list, guint32 max_latency) +static GstFlowReturn +gst_h265_decoder_drain_internal (GstH265Decoder * self) { - GList *iter; + GstH265DecoderPrivate *priv = self->priv; + GstH265Picture *picture; + GstFlowReturn ret = GST_FLOW_OK; - for (iter = list; iter; iter = 
g_list_next (iter)) { - GstH265Picture *pic = (GstH265Picture *) iter->data; - if (!pic->outputted && pic->pic_latency_cnt >= max_latency) - return TRUE; - } + while ((picture = gst_h265_dpb_bump (priv->dpb, TRUE)) != NULL) + gst_h265_decoder_do_output_picture (self, picture, &ret); - return FALSE; + gst_h265_dpb_clear (priv->dpb); + priv->last_output_poc = G_MININT32; + + return ret; } /* C.5.2.2 */ -static gboolean +static GstFlowReturn gst_h265_decoder_dpb_init (GstH265Decoder * self, const GstH265Slice * slice, GstH265Picture * picture) { GstH265DecoderPrivate *priv = self->priv; const GstH265SliceHdr *slice_hdr = &slice->header; const GstH265NalUnit *nalu = &slice->nalu; + const GstH265SPS *sps = priv->active_sps; + GstH265Picture *to_output; + GstFlowReturn ret = GST_FLOW_OK; + /* C 3.2 */ if (GST_H265_IS_NAL_TYPE_IRAP (nalu->type) && picture->NoRaslOutputFlag && !priv->new_bitstream) { if (nalu->type == GST_H265_NAL_SLICE_CRA_NUT) @@ -1212,29 +1575,55 @@ if (picture->NoOutputOfPriorPicsFlag) { GST_DEBUG_OBJECT (self, "Clear dpb"); - gst_h265_decoder_drain (GST_VIDEO_DECODER (self)); + gst_h265_decoder_clear_dpb (self, FALSE); + } else { + gst_h265_dpb_delete_unused (priv->dpb); + while ((to_output = gst_h265_dpb_bump (priv->dpb, FALSE)) != NULL) + gst_h265_decoder_do_output_picture (self, to_output, &ret); + + if (gst_h265_dpb_get_size (priv->dpb) > 0) { + GST_WARNING_OBJECT (self, "IDR or BLA frame failed to clear the dpb, " + "there are still %d pictures in the dpb, last output poc is %d", + gst_h265_dpb_get_size (priv->dpb), priv->last_output_poc); + } else { + priv->last_output_poc = G_MININT32; + } } } else { - /* C 3.2 */ gst_h265_dpb_delete_unused (priv->dpb); + while (gst_h265_dpb_needs_bump (priv->dpb, + sps->max_num_reorder_pics[sps->max_sub_layers_minus1], + priv->SpsMaxLatencyPictures, + sps->max_dec_pic_buffering_minus1[sps->max_sub_layers_minus1] + + 1)) { + to_output = gst_h265_dpb_bump (priv->dpb, FALSE); + + /* Something wrong... 
*/ + if (!to_output) { + GST_WARNING_OBJECT (self, "Bumping is needed but no picture to output"); + break; + } + + gst_h265_decoder_do_output_picture (self, to_output, &ret); + } } - return TRUE; + return ret; } -static gboolean +static GstFlowReturn gst_h265_decoder_start_current_picture (GstH265Decoder * self) { GstH265DecoderClass *klass; GstH265DecoderPrivate *priv = self->priv; - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; g_assert (priv->current_picture != NULL); g_assert (priv->active_sps != NULL); g_assert (priv->active_pps != NULL); if (!gst_h265_decoder_init_current_picture (self)) - return FALSE; + return GST_FLOW_ERROR; /* Drop all RASL pictures having NoRaslOutputFlag is TRUE for the * associated IRAP picture */ @@ -1242,168 +1631,120 @@ priv->associated_irap_NoRaslOutputFlag) { GST_DEBUG_OBJECT (self, "Drop current picture"); gst_h265_picture_replace (&priv->current_picture, NULL); - return TRUE; + return GST_FLOW_OK; } gst_h265_decoder_prepare_rps (self, &priv->current_slice, priv->current_picture); - gst_h265_decoder_dpb_init (self, &priv->current_slice, priv->current_picture); + ret = gst_h265_decoder_dpb_init (self, + &priv->current_slice, priv->current_picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Failed to init dpb"); + return ret; + } klass = GST_H265_DECODER_GET_CLASS (self); - if (klass->start_picture) + if (klass->start_picture) { ret = klass->start_picture (self, priv->current_picture, &priv->current_slice, priv->dpb); - if (!ret) { - GST_ERROR_OBJECT (self, "subclass does not want to start picture"); - return FALSE; + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass does not want to start picture"); + return ret; + } } - return TRUE; + return GST_FLOW_OK; } -static gboolean +static void gst_h265_decoder_finish_picture (GstH265Decoder * self, - GstH265Picture * picture) + GstH265Picture * picture, GstFlowReturn * ret) { + GstVideoDecoder *decoder = GST_VIDEO_DECODER (self); 
GstH265DecoderPrivate *priv = self->priv; const GstH265SPS *sps = priv->active_sps; - GList *not_outputted = NULL; - guint num_remaining; - GList *iter; -#ifndef GST_DISABLE_GST_DEBUG - gint i; -#endif + + g_assert (ret != NULL); GST_LOG_OBJECT (self, "Finishing picture %p (poc %d), entries in DPB %d", picture, picture->pic_order_cnt, gst_h265_dpb_get_size (priv->dpb)); - /* Get all pictures that haven't been outputted yet */ - gst_h265_dpb_get_pictures_not_outputted (priv->dpb, ¬_outputted); - - /* C.5.2.3 */ - if (picture->output_flag) { - for (iter = not_outputted; iter; iter = g_list_next (iter)) { - GstH265Picture *other = GST_H265_PICTURE (iter->data); + gst_h265_dpb_delete_unused (priv->dpb); - if (!other->outputted) - other->pic_latency_cnt++; - } + /* This picture is decode only, drop corresponding frame */ + if (!picture->output_flag) { + GstVideoCodecFrame *frame = gst_video_decoder_get_frame (decoder, + picture->system_frame_number); - picture->outputted = FALSE; - picture->pic_latency_cnt = 0; - } else { - picture->outputted = TRUE; + gst_video_decoder_release_frame (decoder, frame); } - /* set pic as short_term_ref */ - picture->ref = TRUE; - picture->long_term = FALSE; - - /* Include the one we've just decoded */ - if (picture->output_flag) { - not_outputted = - g_list_append (not_outputted, gst_h265_picture_ref (picture)); - } - - /* Add to dpb and transfer ownership */ + /* gst_h265_dpb_add() will take care of pic_latency_cnt increment and + * reference picture marking for this picture */ gst_h265_dpb_add (priv->dpb, picture); - /* for debugging */ -#ifndef GST_DISABLE_GST_DEBUG - GST_TRACE_OBJECT (self, "Before sorting not outputted list"); - i = 0; - for (iter = not_outputted; iter; iter = g_list_next (iter)) { - GstH265Picture *tmp = (GstH265Picture *) iter->data; - - GST_TRACE_OBJECT (self, - "\t%dth picture %p (poc %d)", i, tmp, tmp->pic_order_cnt); - i++; - } -#endif - - /* Sort in output order */ - not_outputted = g_list_sort (not_outputted, 
(GCompareFunc) poc_asc_compare); - -#ifndef GST_DISABLE_GST_DEBUG - GST_TRACE_OBJECT (self, - "After sorting not outputted list in poc ascending order"); - i = 0; - for (iter = not_outputted; iter; iter = g_list_next (iter)) { - GstH265Picture *tmp = (GstH265Picture *) iter->data; - - GST_TRACE_OBJECT (self, - "\t%dth picture %p (poc %d)", i, tmp, tmp->pic_order_cnt); - i++; - } -#endif - - /* Try to output as many pictures as we can. A picture can be output, - * if the number of decoded and not yet outputted pictures that would remain - * in DPB afterwards would at least be equal to max_num_reorder_frames. - * If the outputted picture is not a reference picture, it doesn't have - * to remain in the DPB and can be removed */ - iter = not_outputted; - num_remaining = g_list_length (not_outputted); - - while (num_remaining > sps->max_num_reorder_pics[sps->max_sub_layers_minus1] - || (num_remaining && - sps->max_latency_increase_plus1[sps->max_sub_layers_minus1] && - gst_h265_decoder_check_latency_count (iter, - priv->SpsMaxLatencyPictures))) { - GstH265Picture *to_output = GST_H265_PICTURE (iter->data); - - GST_LOG_OBJECT (self, - "Output picture %p (poc %d)", to_output, to_output->pic_order_cnt); - gst_h265_decoder_do_output_picture (self, to_output); - if (!to_output->ref) { - /* Current picture hasn't been inserted into DPB yet, so don't remove it - * if we managed to output it immediately */ - gint outputted_poc = to_output->pic_order_cnt; - if (outputted_poc != picture->pic_order_cnt) { - GST_LOG_OBJECT (self, "Delete picture %p (poc %d) from DPB", - to_output, to_output->pic_order_cnt); - gst_h265_dpb_delete_by_poc (priv->dpb, outputted_poc); - } + /* NOTE: As per C.5.2.2, bumping by sps_max_dec_pic_buffering_minus1 is + * applied only for the output and removal of pictures from the DPB before + * the decoding of the current picture. 
So pass zero here */ + while (gst_h265_dpb_needs_bump (priv->dpb, + sps->max_num_reorder_pics[sps->max_sub_layers_minus1], + priv->SpsMaxLatencyPictures, 0)) { + GstH265Picture *to_output = gst_h265_dpb_bump (priv->dpb, FALSE); + + /* Something wrong... */ + if (!to_output) { + GST_WARNING_OBJECT (self, "Bumping is needed but no picture to output"); + break; } - iter = g_list_next (iter); - num_remaining--; + gst_h265_decoder_do_output_picture (self, to_output, ret); } - - if (not_outputted) - g_list_free_full (not_outputted, (GDestroyNotify) gst_h265_picture_unref); - - return TRUE; } -static gboolean -gst_h265_decoder_finish_current_picture (GstH265Decoder * self) +static void +gst_h265_decoder_finish_current_picture (GstH265Decoder * self, + GstFlowReturn * ret) { GstH265DecoderPrivate *priv = self->priv; GstH265DecoderClass *klass; - gboolean ret = TRUE; + GstFlowReturn flow_ret = GST_FLOW_OK; + + g_assert (ret != NULL); if (!priv->current_picture) - return TRUE; + return; klass = GST_H265_DECODER_GET_CLASS (self); - if (klass->end_picture) - ret = klass->end_picture (self, priv->current_picture); + if (klass->end_picture) { + flow_ret = klass->end_picture (self, priv->current_picture); + if (flow_ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "End picture failed"); + + /* continue to empty dpb */ + UPDATE_FLOW_RETURN (ret, flow_ret); + } + } /* finish picture takes ownership of the picture */ - ret = gst_h265_decoder_finish_picture (self, priv->current_picture); + gst_h265_decoder_finish_picture (self, priv->current_picture, &flow_ret); priv->current_picture = NULL; - if (!ret) { - GST_ERROR_OBJECT (self, "Failed to finish picture"); - return FALSE; - } + UPDATE_FLOW_RETURN (ret, flow_ret); +} - return TRUE; +static void +gst_h265_decoder_reset_frame_state (GstH265Decoder * self) +{ + GstH265DecoderPrivate *priv = self->priv; + + /* Clear picture struct information */ + priv->cur_pic_struct = GST_H265_SEI_PIC_STRUCT_FRAME; + priv->cur_source_scan_type = 2; 
+ priv->cur_duplicate_flag = 0; } static GstFlowReturn @@ -1416,7 +1757,7 @@ GstH265NalUnit nalu; GstH265ParserResult pres; GstMapInfo map; - gboolean decode_ret = TRUE; + GstFlowReturn decode_ret = GST_FLOW_OK; GST_LOG_OBJECT (self, "handle frame, PTS: %" GST_TIME_FORMAT ", DTS: %" @@ -1424,15 +1765,21 @@ GST_TIME_ARGS (GST_BUFFER_DTS (in_buf))); priv->current_frame = frame; - priv->last_ret = GST_FLOW_OK; - gst_buffer_map (in_buf, &map, GST_MAP_READ); + gst_h265_decoder_reset_frame_state (self); + + if (!gst_buffer_map (in_buf, &map, GST_MAP_READ)) { + GST_ELEMENT_ERROR (self, RESOURCE, READ, + ("Failed to map memory for reading"), (NULL)); + return GST_FLOW_ERROR; + } + if (priv->in_format == GST_H265_DECODER_FORMAT_HVC1 || priv->in_format == GST_H265_DECODER_FORMAT_HEV1) { pres = gst_h265_parser_identify_nalu_hevc (priv->parser, map.data, 0, map.size, priv->nal_length_size, &nalu); - while (pres == GST_H265_PARSER_OK && decode_ret) { + while (pres == GST_H265_PARSER_OK && decode_ret == GST_FLOW_OK) { decode_ret = gst_h265_decoder_decode_nal (self, &nalu, GST_BUFFER_PTS (in_buf)); @@ -1447,7 +1794,7 @@ if (pres == GST_H265_PARSER_NO_NAL_END) pres = GST_H265_PARSER_OK; - while (pres == GST_H265_PARSER_OK && decode_ret) { + while (pres == GST_H265_PARSER_OK && decode_ret == GST_FLOW_OK) { decode_ret = gst_h265_decoder_decode_nal (self, &nalu, GST_BUFFER_PTS (in_buf)); @@ -1462,18 +1809,65 @@ gst_buffer_unmap (in_buf, &map); priv->current_frame = NULL; - if (!decode_ret) { - GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, - ("Failed to decode data"), (NULL), priv->last_ret); - gst_video_decoder_drop_frame (decoder, frame); + if (decode_ret != GST_FLOW_OK) { + if (decode_ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), decode_ret); + } + gst_video_decoder_drop_frame (decoder, frame); gst_h265_picture_clear (&priv->current_picture); - return priv->last_ret; + return decode_ret; + } + + if 
(priv->current_picture) { + gst_h265_decoder_finish_current_picture (self, &decode_ret); + gst_video_codec_frame_unref (frame); + } else { + /* This picture was dropped */ + gst_video_decoder_release_frame (decoder, frame); + } + + if (decode_ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), decode_ret); } - gst_h265_decoder_finish_current_picture (self); - gst_video_codec_frame_unref (frame); + return decode_ret; +} - return priv->last_ret; +/** + * gst_h265_decoder_set_process_ref_pic_lists: + * @decoder: a #GstH265Decoder + * @process: whether subclass is requiring reference picture modification process + * + * Called to en/disable reference picture modification process. + * + * Since: 1.20 + */ +void +gst_h265_decoder_set_process_ref_pic_lists (GstH265Decoder * decoder, + gboolean process) +{ + decoder->priv->process_ref_pic_lists = process; +} + +/** + * gst_h265_decoder_get_picture: + * @decoder: a #GstH265Decoder + * @system_frame_number: a target system frame number of #GstH265Picture + * + * Retrive DPB and return a #GstH265Picture corresponding to + * the @system_frame_number + * + * Returns: (transfer full): a #GstH265Picture if successful, or %NULL otherwise + * + * Since: 1.20 + */ +GstH265Picture * +gst_h265_decoder_get_picture (GstH265Decoder * decoder, + guint32 system_frame_number) +{ + return gst_h265_dpb_get_picture (decoder->priv->dpb, system_frame_number); }
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth265decoder.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth265decoder.h
Changed
@@ -64,7 +64,7 @@
   guint NumPocStFoll;
   guint NumPocLtCurr;
   guint NumPocLtFoll;
-  guint NumPocTotalCurr;
+  guint NumPicTotalCurr;
 
   /*< private >*/
   GstH265DecoderPrivate *priv;
@@ -87,35 +87,47 @@
  *                  Called per one #GstH265Picture to notify subclass to finish
  *                  decoding process for the #GstH265Picture
  * @output_picture: Called with a #GstH265Picture which is required to be outputted.
- *                  Subclass can retrieve parent #GstVideoCodecFrame by using
- *                  gst_video_decoder_get_frame() with system_frame_number
- *                  and the #GstVideoCodecFrame must be consumed by subclass via
+ *                  The #GstVideoCodecFrame must be consumed by subclass via
  *                  gst_video_decoder_{finish,drop,release}_frame().
  */
 struct _GstH265DecoderClass
 {
   GstVideoDecoderClass parent_class;
 
-  gboolean      (*new_sequence)   (GstH265Decoder * decoder,
+  GstFlowReturn (*new_sequence)   (GstH265Decoder * decoder,
                                    const GstH265SPS * sps,
                                    gint max_dpb_size);
-
-  gboolean      (*new_picture)    (GstH265Decoder * decoder,
+  /**
+   * GstH265Decoder:new_picture:
+   * @decoder: a #GstH265Decoder
+   * @frame: (transfer none): a #GstVideoCodecFrame
+   * @picture: (transfer none): a #GstH265Picture
+   */
+  GstFlowReturn (*new_picture)    (GstH265Decoder * decoder,
+                                   GstVideoCodecFrame * frame,
                                    GstH265Picture * picture);
 
-  gboolean      (*start_picture)  (GstH265Decoder * decoder,
+  GstFlowReturn (*start_picture)  (GstH265Decoder * decoder,
                                    GstH265Picture * picture,
                                    GstH265Slice * slice,
                                    GstH265Dpb * dpb);
 
-  gboolean      (*decode_slice)   (GstH265Decoder * decoder,
+  GstFlowReturn (*decode_slice)   (GstH265Decoder * decoder,
                                    GstH265Picture * picture,
-                                   GstH265Slice * slice);
+                                   GstH265Slice * slice,
+                                   GArray * ref_pic_list0,
+                                   GArray * ref_pic_list1);
 
-  gboolean      (*end_picture)    (GstH265Decoder * decoder,
+  GstFlowReturn (*end_picture)    (GstH265Decoder * decoder,
                                    GstH265Picture * picture);
-
+  /**
+   * GstH265Decoder:output_picture:
+   * @decoder: a #GstH265Decoder
+   * @frame: (transfer full): a #GstVideoCodecFrame
+   * @picture: (transfer full): a #GstH265Picture
+   */
   GstFlowReturn (*output_picture) (GstH265Decoder * decoder,
+                                   GstVideoCodecFrame * frame,
                                    GstH265Picture * picture);
 
   /*< private >*/
@@ -127,6 +139,14 @@
 GST_CODECS_API
 GType gst_h265_decoder_get_type (void);
 
+GST_CODECS_API
+void gst_h265_decoder_set_process_ref_pic_lists (GstH265Decoder * decoder,
+                                                 gboolean process);
+
+GST_CODECS_API
+GstH265Picture * gst_h265_decoder_get_picture   (GstH265Decoder * decoder,
+                                                 guint32 system_frame_number);
+
 G_END_DECLS
 
 #endif /* __GST_H265_DECODER_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth265picture.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth265picture.c
Changed
@@ -52,7 +52,11 @@ pic = g_new0 (GstH265Picture, 1); pic->pts = GST_CLOCK_TIME_NONE; - pic->field = GST_H265_PICTURE_FIELD_FRAME; + pic->pic_struct = GST_H265_SEI_PIC_STRUCT_FRAME; + /* 0: interlaced, 1: progressive, 2: unspecified, 3: reserved, can be + * interpreted as 2 */ + pic->source_scan_type = 2; + pic->duplicate_flag = 0; gst_mini_object_init (GST_MINI_OBJECT_CAST (pic), 0, GST_TYPE_H265_PICTURE, NULL, NULL, @@ -105,6 +109,7 @@ { GArray *pic_list; gint max_num_pics; + gint num_output_needed; }; /** @@ -187,6 +192,7 @@ g_return_if_fail (dpb != NULL); g_array_set_size (dpb->pic_list, 0); + dpb->num_output_needed = 0; } /** @@ -194,7 +200,8 @@ * @dpb: a #GstH265Dpb * @picture: (transfer full): a #GstH265Picture * - * Store the @picture + * Store the @picture and perform increase pic_latency_cnt as defined in + * "C.5.2.3 Additional bumping" process */ void gst_h265_dpb_add (GstH265Dpb * dpb, GstH265Picture * picture) @@ -202,6 +209,27 @@ g_return_if_fail (dpb != NULL); g_return_if_fail (GST_IS_H265_PICTURE (picture)); + if (picture->output_flag) { + gint i; + + for (i = 0; i < dpb->pic_list->len; i++) { + GstH265Picture *other = + g_array_index (dpb->pic_list, GstH265Picture *, i); + + if (other->needed_for_output) + other->pic_latency_cnt++; + } + + dpb->num_output_needed++; + picture->needed_for_output = TRUE; + } else { + picture->needed_for_output = FALSE; + } + + /* C.3.4 */ + picture->ref = TRUE; + picture->long_term = FALSE; + g_array_append_val (dpb->pic_list, picture); } @@ -209,7 +237,7 @@ * gst_h265_dpb_delete_unused: * @dpb: a #GstH265Dpb * - * Delete already outputted and not referenced all pictures from dpb + * Delete not needed for output and not referenced all pictures from dpb */ void gst_h265_dpb_delete_unused (GstH265Dpb * dpb) @@ -222,7 +250,7 @@ GstH265Picture *picture = g_array_index (dpb->pic_list, GstH265Picture *, i); - if (picture->outputted && !picture->ref) { + if (!picture->needed_for_output && !picture->ref) { GST_TRACE ("remove 
picture %p (poc %d) from dpb", picture, picture->pic_order_cnt); g_array_remove_index (dpb->pic_list, i); @@ -232,33 +260,6 @@ } /** - * gst_h265_dpb_delete_by_poc: - * @dpb: a #GstH265Dpb - * @poc: a poc of #GstH265Picture to remove - * - * Delete a #GstH265Dpb by @poc - */ -void -gst_h265_dpb_delete_by_poc (GstH265Dpb * dpb, gint poc) -{ - gint i; - - g_return_if_fail (dpb != NULL); - - for (i = 0; i < dpb->pic_list->len; i++) { - GstH265Picture *picture = - g_array_index (dpb->pic_list, GstH265Picture *, i); - - if (picture->pic_order_cnt == poc) { - g_array_remove_index (dpb->pic_list, i); - return; - } - } - - GST_WARNING ("Couldn't find picture with poc %d", poc); -} - -/** * gst_h265_dpb_num_ref_pictures: * @dpb: a #GstH265Dpb * @@ -421,31 +422,6 @@ } /** - * gst_h265_dpb_get_pictures_not_outputted: - * @dpb: a #GstH265Dpb - * @out: (out) (element-type GstH265Picture) (transfer full): a list - * of #GstH265Dpb - * - * Retrieve all not-outputted pictures from @dpb - */ -void -gst_h265_dpb_get_pictures_not_outputted (GstH265Dpb * dpb, GList ** out) -{ - gint i; - - g_return_if_fail (dpb != NULL); - g_return_if_fail (out != NULL); - - for (i = 0; i < dpb->pic_list->len; i++) { - GstH265Picture *picture = - g_array_index (dpb->pic_list, GstH265Picture *, i); - - if (!picture->outputted) - *out = g_list_append (*out, gst_h265_picture_ref (picture)); - } -} - -/** * gst_h265_dpb_get_pictures_all: * @dpb: a #GstH265Dpb * @@ -475,15 +451,178 @@ } /** - * gst_h265_dpb_is_full: + * gst_h265_dpb_get_picture: * @dpb: a #GstH265Dpb + * @system_frame_number The system frame number + * + * Returns: (transfer full): the picture identified with the specified + * @system_frame_number, or %NULL if DPB does not contain a #GstH265Picture + * corresponding to the @system_frame_number * - * Return: %TRUE if @dpb is full + * Since: 1.20 + */ +GstH265Picture * +gst_h265_dpb_get_picture (GstH265Dpb * dpb, guint32 system_frame_number) +{ + gint i; + + g_return_val_if_fail (dpb != 
NULL, NULL); + + for (i = 0; i < dpb->pic_list->len; i++) { + GstH265Picture *picture = + g_array_index (dpb->pic_list, GstH265Picture *, i); + + if (picture->system_frame_number == system_frame_number) { + gst_h265_picture_ref (picture); + return picture; + } + } + + return NULL; +} + +static gboolean +gst_h265_dpb_check_latency_count (GstH265Dpb * dpb, guint32 max_latency) +{ + gint i; + + for (i = 0; i < dpb->pic_list->len; i++) { + GstH265Picture *picture = + g_array_index (dpb->pic_list, GstH265Picture *, i); + + if (!picture->needed_for_output) + continue; + + if (picture->pic_latency_cnt >= max_latency) + return TRUE; + } + + return FALSE; +} + +/** + * gst_h265_dpb_needs_bump: + * @dpb: a #GstH265Dpb + * @max_num_reorder_pics: sps_max_num_reorder_pics[HighestTid] + * @max_latency_increase: SpsMaxLatencyPictures[HighestTid] + * @max_dec_pic_buffering: sps_max_dec_pic_buffering_minus1[HighestTid ] + 1 + * or zero if this shouldn't be used for bumping decision + * + * Returns: %TRUE if bumping is required + * + * Since: 1.20 */ gboolean -gst_h265_dpb_is_full (GstH265Dpb * dpb) +gst_h265_dpb_needs_bump (GstH265Dpb * dpb, guint max_num_reorder_pics, + guint max_latency_increase, guint max_dec_pic_buffering) { - g_return_val_if_fail (dpb != NULL, -1); + g_return_val_if_fail (dpb != NULL, FALSE); + g_assert (dpb->num_output_needed >= 0); + + /* If DPB is full and there is no empty space to store current picture, + * need bumping. 
+ * NOTE: current picture was added already by our decoding flow, so we + * need to do bumping until dpb->pic_list->len == dpb->max_num_pic + */ + if (dpb->pic_list->len > dpb->max_num_pics) { + GST_TRACE ("No empty frame buffer, need bumping"); + return TRUE; + } + + /* C.5.2.3 */ + if (dpb->num_output_needed > max_num_reorder_pics) { + GST_TRACE ("num_output_needed (%d) > max_num_reorder_pics (%d)", + dpb->num_output_needed, max_num_reorder_pics); + return TRUE; + } + + if (dpb->num_output_needed && max_latency_increase && + gst_h265_dpb_check_latency_count (dpb, max_latency_increase)) { + GST_TRACE ("has late picture, max_latency_increase: %d", + max_latency_increase); + return TRUE; + } + + /* C.5.2.2 */ + if (max_dec_pic_buffering && dpb->pic_list->len >= max_dec_pic_buffering) { + GST_TRACE ("dpb size (%d) >= max_dec_pic_buffering (%d)", + dpb->pic_list->len, max_dec_pic_buffering); + return TRUE; + } + + return FALSE; +} + +static gint +gst_h265_dpb_get_lowest_output_needed_picture (GstH265Dpb * dpb, + GstH265Picture ** picture) +{ + gint i; + GstH265Picture *lowest = NULL; + gint index = -1; + + *picture = NULL; + + for (i = 0; i < dpb->pic_list->len; i++) { + GstH265Picture *picture = + g_array_index (dpb->pic_list, GstH265Picture *, i); + + if (!picture->needed_for_output) + continue; + + if (!lowest) { + lowest = picture; + index = i; + continue; + } + + if (picture->pic_order_cnt < lowest->pic_order_cnt) { + lowest = picture; + index = i; + } + } + + if (lowest) + *picture = gst_h265_picture_ref (lowest); + + return index; +} + +/** + * gst_h265_dpb_bump: + * @dpb: a #GstH265Dpb + * @drain: whether draining or not + * + * Perform bumping process as defined in C.5.2.4 "Bumping" process. 
+ * If @drain is %TRUE, @dpb will remove a #GstH265Picture from internal array + * so that the returned #GstH265Picture can hold the last reference to it + * + * Returns: (nullable) (transfer full): a #GstH265Picture that needs to be + * output + * + * Since: 1.20 + */ +GstH265Picture * +gst_h265_dpb_bump (GstH265Dpb * dpb, gboolean drain) +{ + GstH265Picture *picture; + gint index; + + g_return_val_if_fail (dpb != NULL, NULL); + + /* C.5.2.4 "Bumping" process */ + index = gst_h265_dpb_get_lowest_output_needed_picture (dpb, &picture); + + if (!picture || index < 0) + return NULL; + + picture->needed_for_output = FALSE; + + dpb->num_output_needed--; + g_assert (dpb->num_output_needed >= 0); + + if (!picture->ref || drain) + g_array_remove_index_fast (dpb->pic_list, index); - return dpb->pic_list->len >= dpb->max_num_pics; + return picture; }
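Taken together, gst_h265_dpb_needs_bump() and gst_h265_dpb_bump() above implement the output side of the H.265 C.5.2 DPB process: after each picture is added, the decoder bumps (outputs the lowest-POC picture still needed for output) until no bumping condition holds. The following standalone sketch models that decision loop over a plain array; all names here are hypothetical and it only mirrors, not calls, the GStreamer API (and it omits the latency and max_dec_pic_buffering conditions for brevity):

```c
#include <assert.h>

/* Hypothetical, self-contained model of the C.5.2 bumping decision. */
typedef struct {
  int poc;               /* picture order count */
  int needed_for_output; /* like GstH265Picture.needed_for_output */
} Pic;

typedef struct {
  Pic pics[16];
  int len;
  int max_num_pics;
  int num_output_needed;
} Dpb;

static int
dpb_needs_bump (const Dpb * dpb, int max_num_reorder_pics)
{
  if (dpb->len > dpb->max_num_pics)   /* no empty frame buffer left */
    return 1;
  if (dpb->num_output_needed > max_num_reorder_pics)  /* C.5.2.3 */
    return 1;
  return 0;
}

/* Output the needed-for-output picture with the lowest POC;
 * returns its POC, or -1 when nothing is pending. */
static int
dpb_bump (Dpb * dpb)
{
  int i, lowest = -1;

  for (i = 0; i < dpb->len; i++) {
    if (!dpb->pics[i].needed_for_output)
      continue;
    if (lowest < 0 || dpb->pics[i].poc < dpb->pics[lowest].poc)
      lowest = i;
  }
  if (lowest < 0)
    return -1;

  dpb->pics[lowest].needed_for_output = 0;
  dpb->num_output_needed--;
  /* in the real DPB the entry is only removed if unreferenced or draining */
  return dpb->pics[lowest].poc;
}
```

Note that bumping always emits pictures in POC order regardless of decode order, which is exactly why the real implementation can drop the old not-outputted list in favour of the num_output_needed counter.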
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gsth265picture.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gsth265picture.h
Changed
@@ -27,8 +27,8 @@ #include <gst/gst.h> #include <gst/codecs/codecs-prelude.h> - #include <gst/codecparsers/gsth265parser.h> +#include <gst/video/video.h> G_BEGIN_DECLS @@ -50,15 +50,9 @@ GstH265NalUnit nalu; }; -typedef enum -{ - GST_H265_PICTURE_FIELD_FRAME, - GST_H265_PICTURE_FILED_TOP_FIELD, - GST_H265_PICTURE_FIELD_BOTTOM_FIELD, -} GstH265PictureField; - struct _GstH265Picture { + /*< private >*/ GstMiniObject parent; GstH265SliceType type; @@ -81,9 +75,14 @@ gboolean ref; gboolean long_term; - gboolean outputted; + gboolean needed_for_output; - GstH265PictureField field; + /* from picture timing SEI */ + GstH265SEIPicStructType pic_struct; + guint8 source_scan_type; + guint8 duplicate_flag; + + GstVideoBufferFlags buffer_flags; gpointer user_data; GDestroyNotify notify; @@ -161,10 +160,6 @@ void gst_h265_dpb_delete_unused (GstH265Dpb * dpb); GST_CODECS_API -void gst_h265_dpb_delete_by_poc (GstH265Dpb * dpb, - gint poc); - -GST_CODECS_API gint gst_h265_dpb_num_ref_pictures (GstH265Dpb * dpb); GST_CODECS_API @@ -187,17 +182,24 @@ gint poc); GST_CODECS_API -void gst_h265_dpb_get_pictures_not_outputted (GstH265Dpb * dpb, - GList ** out); +GArray * gst_h265_dpb_get_pictures_all (GstH265Dpb * dpb); GST_CODECS_API -GArray * gst_h265_dpb_get_pictures_all (GstH265Dpb * dpb); +GstH265Picture * gst_h265_dpb_get_picture (GstH265Dpb * dpb, + guint32 system_frame_number); GST_CODECS_API gint gst_h265_dpb_get_size (GstH265Dpb * dpb); GST_CODECS_API -gboolean gst_h265_dpb_is_full (GstH265Dpb * dpb); +gboolean gst_h265_dpb_needs_bump (GstH265Dpb * dpb, + guint max_num_reorder_pics, + guint max_latency_increase, + guint max_dec_pic_buffering); + +GST_CODECS_API +GstH265Picture * gst_h265_dpb_bump (GstH265Dpb * dpb, + gboolean drain); G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstH265Picture, gst_h265_picture_unref)
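The header diff above drops gst_h265_dpb_is_full() in favour of the needs_bump/bump pair. For a subclass tracking this API change, the old fullness check becomes an output loop; a rough caller-side sketch (only the two new functions come from the header — the surrounding variables and the output_picture() helper are hypothetical and this fragment is not compiled here):

```c
/* Hypothetical caller fragment: dpb, the three max_* values and
 * output_picture() would come from the subclass. */
while (gst_h265_dpb_needs_bump (dpb, max_num_reorder_pics,
        max_latency_increase, max_dec_pic_buffering)) {
  GstH265Picture *to_output = gst_h265_dpb_bump (dpb, FALSE);

  if (!to_output)
    break;
  output_picture (self, to_output);
}
```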
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstmpeg2decoder.c
Added
@@ -0,0 +1,1311 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +/** + * SECTION:gstmpeg2decoder + * @title: GstMpeg2Decoder + * @short_description: Base class to implement stateless MPEG2 decoders + * @sources: + * - gstmpeg2picture.h + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/base/base.h> +#include "gstmpeg2decoder.h" + +GST_DEBUG_CATEGORY (gst_mpeg2_decoder_debug); +#define GST_CAT_DEFAULT gst_mpeg2_decoder_debug + +/* ------------------------------------------------------------------------- */ +/* --- PTS Generator --- */ +/* ------------------------------------------------------------------------- */ +typedef struct _PTSGenerator PTSGenerator; +struct _PTSGenerator +{ + /* The current GOP PTS */ + GstClockTime gop_pts; + /* Max picture PTS */ + GstClockTime max_pts; + /* Absolute GOP TSN */ + guint gop_tsn; + /* Max picture TSN, relative to last GOP TSN */ + guint max_tsn; + /* How many times TSN overflowed since GOP */ + guint ovl_tsn; + /* Last picture TSN */ + guint lst_tsn; + guint fps_n; + guint fps_d; +}; + +static void +_pts_init (PTSGenerator * tsg) +{ + tsg->gop_pts = GST_CLOCK_TIME_NONE; + tsg->max_pts = 
GST_CLOCK_TIME_NONE; + tsg->gop_tsn = 0; + tsg->max_tsn = 0; + tsg->ovl_tsn = 0; + tsg->lst_tsn = 0; + tsg->fps_n = 0; + tsg->fps_d = 0; +} + +static inline GstClockTime +_pts_get_duration (PTSGenerator * tsg, guint num_frames) +{ + return gst_util_uint64_scale (num_frames, GST_SECOND * tsg->fps_d, + tsg->fps_n); +} + +static inline guint +_pts_get_poc (PTSGenerator * tsg) +{ + return tsg->gop_tsn + tsg->ovl_tsn * 1024 + tsg->lst_tsn; +} + +static void +_pts_set_framerate (PTSGenerator * tsg, guint fps_n, guint fps_d) +{ + tsg->fps_n = fps_n; + tsg->fps_d = fps_d; +} + +static void +_pts_sync (PTSGenerator * tsg, GstClockTime gop_pts) +{ + guint gop_tsn; + + if (!GST_CLOCK_TIME_IS_VALID (gop_pts) || + (GST_CLOCK_TIME_IS_VALID (tsg->max_pts) && tsg->max_pts >= gop_pts)) { + /* Invalid GOP PTS, interpolate from the last known picture PTS */ + if (GST_CLOCK_TIME_IS_VALID (tsg->max_pts)) { + gop_pts = tsg->max_pts + _pts_get_duration (tsg, 1); + gop_tsn = tsg->gop_tsn + tsg->ovl_tsn * 1024 + tsg->max_tsn + 1; + } else { + gop_pts = 0; + gop_tsn = 0; + } + } else { + /* Interpolate GOP TSN from this valid PTS */ + if (GST_CLOCK_TIME_IS_VALID (tsg->gop_pts)) + gop_tsn = tsg->gop_tsn + gst_util_uint64_scale (gop_pts - tsg->gop_pts + + _pts_get_duration (tsg, 1) - 1, tsg->fps_n, GST_SECOND * tsg->fps_d); + else + gop_tsn = 0; + } + + tsg->gop_pts = gop_pts; + tsg->gop_tsn = gop_tsn; + tsg->max_tsn = 0; + tsg->ovl_tsn = 0; + tsg->lst_tsn = 0; +} + +static GstClockTime +_pts_eval (PTSGenerator * tsg, GstClockTime pic_pts, guint pic_tsn) +{ + GstClockTime pts; + + if (!GST_CLOCK_TIME_IS_VALID (tsg->gop_pts)) + tsg->gop_pts = _pts_get_duration (tsg, pic_tsn); + + pts = pic_pts; + if (!GST_CLOCK_TIME_IS_VALID (pts)) + pts = tsg->gop_pts + _pts_get_duration (tsg, tsg->ovl_tsn * 1024 + pic_tsn); + else if (pts == tsg->gop_pts) { + /* The picture following the GOP header shall be an I-frame. 
+ So we can compensate for the GOP start time from here */ + tsg->gop_pts -= _pts_get_duration (tsg, pic_tsn); + } + + if (!GST_CLOCK_TIME_IS_VALID (tsg->max_pts) || tsg->max_pts < pts) + tsg->max_pts = pts; + + if (tsg->max_tsn < pic_tsn) + tsg->max_tsn = pic_tsn; + else if (tsg->max_tsn == 1023 && pic_tsn < tsg->lst_tsn) { /* TSN wrapped */ + tsg->max_tsn = pic_tsn; + tsg->ovl_tsn++; + } + tsg->lst_tsn = pic_tsn; + + return pts; +} + +static inline gboolean +_seq_hdr_is_valid (GstMpegVideoSequenceHdr * hdr) +{ + return hdr->width > 0 && hdr->height > 0; +} + +#define SEQ_HDR_INIT (GstMpegVideoSequenceHdr) { 0, } + +static inline gboolean +_seq_ext_is_valid (GstMpegVideoSequenceExt * ext) +{ + return ext->profile >= GST_MPEG_VIDEO_PROFILE_422 + && ext->profile <= GST_MPEG_VIDEO_PROFILE_SIMPLE; +} + +#define SEQ_EXT_INIT (GstMpegVideoSequenceExt) { 0xff, 0, } + +static inline gboolean +_seq_display_ext_is_valid (GstMpegVideoSequenceDisplayExt * ext) +{ + return ext->video_format != 0xff; +} + +#define SEQ_DISPLAY_EXT_INIT (GstMpegVideoSequenceDisplayExt) { 0xff, 0, } + +static inline gboolean +_seq_scalable_ext_is_valid (GstMpegVideoSequenceScalableExt * ext) +{ + return ext->scalable_mode != 0xff; +} + +#define SEQ_SCALABLE_EXT_INIT (GstMpegVideoSequenceScalableExt) { 0xff, 0, } + +static inline gboolean +_quant_matrix_ext_is_valid (GstMpegVideoQuantMatrixExt * ext) +{ + return ext->load_intra_quantiser_matrix != 0xff; +} + +#define QUANT_MATRIX_EXT_INIT (GstMpegVideoQuantMatrixExt) { 0xff, { 0, } } + +static inline gboolean +_pic_hdr_is_valid (GstMpegVideoPictureHdr * hdr) +{ + return hdr->tsn != 0xffff; +} + +#define PIC_HDR_INIT (GstMpegVideoPictureHdr) { 0xffff, 0, } + +static inline gboolean +_pic_hdr_ext_is_valid (GstMpegVideoPictureExt * ext) +{ + return ext->f_code[0][0] != 0xff; +} + +#define PIC_HDR_EXT_INIT \ + (GstMpegVideoPictureExt) { { { 0xff, 0, }, { 0, } }, 0, } + +typedef enum +{ + GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR = 1 << 0, + 
GST_MPEG2_DECODER_STATE_GOT_SEQ_EXT = 1 << 1, + GST_MPEG2_DECODER_STATE_GOT_PIC_HDR = 1 << 2, + GST_MPEG2_DECODER_STATE_GOT_PIC_EXT = 1 << 3, + GST_MPEG2_DECODER_STATE_GOT_SLICE = 1 << 4, + + GST_MPEG2_DECODER_STATE_VALID_SEQ_HEADERS = + (GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR | + GST_MPEG2_DECODER_STATE_GOT_SEQ_EXT), + GST_MPEG2_DECODER_STATE_VALID_PIC_HEADERS = + (GST_MPEG2_DECODER_STATE_GOT_PIC_HDR | + GST_MPEG2_DECODER_STATE_GOT_PIC_EXT), + GST_MPEG2_DECODER_STATE_VALID_PICTURE = + (GST_MPEG2_DECODER_STATE_VALID_SEQ_HEADERS | + GST_MPEG2_DECODER_STATE_VALID_PIC_HEADERS | + GST_MPEG2_DECODER_STATE_GOT_SLICE) +} GstMpeg2DecoderState; + +struct _GstMpeg2DecoderPrivate +{ + gint width; + gint height; + gint display_width; + gint display_height; + GstMpegVideoProfile profile; + gboolean progressive; + + GstMpegVideoSequenceHdr seq_hdr; + GstMpegVideoSequenceExt seq_ext; + GstMpegVideoSequenceDisplayExt seq_display_ext; + GstMpegVideoSequenceScalableExt seq_scalable_ext; + + /* some sequence info changed after last new_sequence () */ + gboolean seq_changed; + /* whether we need to drain before new_sequence () */ + gboolean need_to_drain; + GstMpegVideoGop gop; + GstMpegVideoQuantMatrixExt quant_matrix; + GstMpegVideoPictureHdr pic_hdr; + GstMpegVideoPictureExt pic_ext; + + GstMpeg2Dpb *dpb; + GstMpeg2DecoderState state; + PTSGenerator tsg; + GstClockTime current_pts; + + GstMpeg2Picture *current_picture; + GstVideoCodecFrame *current_frame; + GstMpeg2Picture *first_field; + + guint preferred_output_delay; + /* for delayed output */ + GstQueueArray *output_queue; + /* used for low-latency vs. 
high throughput mode decision */ + gboolean is_live; +}; + +#define UPDATE_FLOW_RETURN(ret,new_ret) G_STMT_START { \ + if (*(ret) == GST_FLOW_OK) \ + *(ret) = new_ret; \ +} G_STMT_END + +typedef struct +{ + GstVideoCodecFrame *frame; + GstMpeg2Picture *picture; + GstMpeg2Decoder *self; +} GstMpeg2DecoderOutputFrame; + + +#define parent_class gst_mpeg2_decoder_parent_class +G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstMpeg2Decoder, gst_mpeg2_decoder, + GST_TYPE_VIDEO_DECODER, + G_ADD_PRIVATE (GstMpeg2Decoder); + GST_DEBUG_CATEGORY_INIT (gst_mpeg2_decoder_debug, "mpeg2decoder", 0, + "MPEG2 Video Decoder")); + +static gboolean gst_mpeg2_decoder_start (GstVideoDecoder * decoder); +static gboolean gst_mpeg2_decoder_stop (GstVideoDecoder * decoder); +static gboolean gst_mpeg2_decoder_set_format (GstVideoDecoder * decoder, + GstVideoCodecState * state); +static GstFlowReturn gst_mpeg2_decoder_finish (GstVideoDecoder * decoder); +static gboolean gst_mpeg2_decoder_flush (GstVideoDecoder * decoder); +static GstFlowReturn gst_mpeg2_decoder_drain (GstVideoDecoder * decoder); +static GstFlowReturn gst_mpeg2_decoder_handle_frame (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame); +static void gst_mpeg2_decoder_do_output_picture (GstMpeg2Decoder * self, + GstMpeg2Picture * picture, GstFlowReturn * ret); +static void gst_mpeg2_decoder_clear_output_frame (GstMpeg2DecoderOutputFrame * + output_frame); +static void gst_mpeg2_decoder_drain_output_queue (GstMpeg2Decoder * + self, guint num, GstFlowReturn * ret); + + +static void +gst_mpeg2_decoder_class_init (GstMpeg2DecoderClass * klass) +{ + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + + decoder_class->start = GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_start); + decoder_class->stop = GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_stop); + decoder_class->set_format = GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_set_format); + decoder_class->finish = GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_finish); + decoder_class->flush = 
GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_flush); + decoder_class->drain = GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_drain); + decoder_class->handle_frame = + GST_DEBUG_FUNCPTR (gst_mpeg2_decoder_handle_frame); +} + +static void +gst_mpeg2_decoder_init (GstMpeg2Decoder * self) +{ + gst_video_decoder_set_packetized (GST_VIDEO_DECODER (self), TRUE); + + self->priv = gst_mpeg2_decoder_get_instance_private (self); + + self->priv->seq_hdr = SEQ_HDR_INIT; + self->priv->seq_ext = SEQ_EXT_INIT; + self->priv->seq_display_ext = SEQ_DISPLAY_EXT_INIT; + self->priv->seq_scalable_ext = SEQ_SCALABLE_EXT_INIT; + self->priv->quant_matrix = QUANT_MATRIX_EXT_INIT; + self->priv->pic_hdr = PIC_HDR_INIT; + self->priv->pic_ext = PIC_HDR_EXT_INIT; +} + +static gboolean +gst_mpeg2_decoder_start (GstVideoDecoder * decoder) +{ + GstMpeg2Decoder *self = GST_MPEG2_DECODER (decoder); + GstMpeg2DecoderPrivate *priv = self->priv; + + _pts_init (&priv->tsg); + priv->dpb = gst_mpeg2_dpb_new (); + priv->profile = -1; + priv->progressive = TRUE; + + priv->output_queue = + gst_queue_array_new_for_struct (sizeof (GstMpeg2DecoderOutputFrame), 1); + gst_queue_array_set_clear_func (priv->output_queue, + (GDestroyNotify) gst_mpeg2_decoder_clear_output_frame); + + return TRUE; +} + +static gboolean +gst_mpeg2_decoder_stop (GstVideoDecoder * decoder) +{ + GstMpeg2Decoder *self = GST_MPEG2_DECODER (decoder); + GstMpeg2DecoderPrivate *priv = self->priv; + + g_clear_pointer (&self->input_state, gst_video_codec_state_unref); + g_clear_pointer (&priv->dpb, gst_mpeg2_dpb_free); + gst_queue_array_free (priv->output_queue); + + return TRUE; +} + +static gboolean +gst_mpeg2_decoder_set_format (GstVideoDecoder * decoder, + GstVideoCodecState * state) +{ + GstMpeg2Decoder *self = GST_MPEG2_DECODER (decoder); + GstMpeg2DecoderPrivate *priv = self->priv; + GstQuery *query; + + GST_DEBUG_OBJECT (decoder, "Set format"); + + if (self->input_state) + gst_video_codec_state_unref (self->input_state); + + self->input_state = 
gst_video_codec_state_ref (state); + + priv->width = GST_VIDEO_INFO_WIDTH (&state->info); + priv->height = GST_VIDEO_INFO_HEIGHT (&state->info); + + query = gst_query_new_latency (); + if (gst_pad_peer_query (GST_VIDEO_DECODER_SINK_PAD (self), query)) + gst_query_parse_latency (query, &priv->is_live, NULL, NULL); + gst_query_unref (query); + + return TRUE; +} + +static GstFlowReturn +gst_mpeg2_decoder_drain (GstVideoDecoder * decoder) +{ + GstMpeg2Decoder *self = GST_MPEG2_DECODER (decoder); + GstMpeg2DecoderPrivate *priv = self->priv; + GstMpeg2Picture *picture; + GstFlowReturn ret = GST_FLOW_OK; + + while ((picture = gst_mpeg2_dpb_bump (priv->dpb)) != NULL) { + gst_mpeg2_decoder_do_output_picture (self, picture, &ret); + } + + gst_mpeg2_decoder_drain_output_queue (self, 0, &ret); + gst_queue_array_clear (priv->output_queue); + gst_mpeg2_dpb_clear (priv->dpb); + + return ret; +} + +static GstFlowReturn +gst_mpeg2_decoder_finish (GstVideoDecoder * decoder) +{ + return gst_mpeg2_decoder_drain (decoder); +} + +static gboolean +gst_mpeg2_decoder_flush (GstVideoDecoder * decoder) +{ + GstMpeg2Decoder *self = GST_MPEG2_DECODER (decoder); + GstMpeg2DecoderPrivate *priv = self->priv; + + gst_mpeg2_dpb_clear (priv->dpb); + gst_queue_array_clear (priv->output_queue); + priv->state &= GST_MPEG2_DECODER_STATE_VALID_SEQ_HEADERS; + priv->pic_hdr = PIC_HDR_INIT; + priv->pic_ext = PIC_HDR_EXT_INIT; + + return TRUE; +} + +static inline gboolean +_is_valid_state (GstMpeg2Decoder * decoder, GstMpeg2DecoderState state) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + + return (priv->state & state) == state; +} + +static void +gst_mpeg2_decoder_set_latency (GstMpeg2Decoder * decoder) +{ + GstCaps *caps; + GstClockTime min, max; + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstStructure *structure; + gint fps_d = 1, fps_n = 0; + + if (priv->tsg.fps_d > 0 && priv->tsg.fps_n > 0) { + fps_n = priv->tsg.fps_n; + fps_d = priv->tsg.fps_d; + } else { + caps = gst_pad_get_current_caps 
(GST_VIDEO_DECODER_SINK_PAD (decoder)); + if (caps) { + structure = gst_caps_get_structure (caps, 0); + if (gst_structure_get_fraction (structure, "framerate", &fps_n, &fps_d)) { + if (fps_n == 0) { + /* variable framerate: see if we have a max-framerate */ + gst_structure_get_fraction (structure, "max-framerate", &fps_n, + &fps_d); + } + } + gst_caps_unref (caps); + } + } + + /* if no fps or variable, then 25/1 */ + if (fps_n == 0) { + fps_n = 25; + fps_d = 1; + } + + max = gst_util_uint64_scale (2 * GST_SECOND, fps_d, fps_n); + min = gst_util_uint64_scale (1 * GST_SECOND, fps_d, fps_n); + + GST_LOG_OBJECT (decoder, + "latency min %" G_GUINT64_FORMAT " max %" G_GUINT64_FORMAT, min, max); + + gst_video_decoder_set_latency (GST_VIDEO_DECODER (decoder), min, max); +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_sequence (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoSequenceHdr seq_hdr = { 0, }; + + if (!gst_mpeg_video_packet_parse_sequence_header (packet, &seq_hdr)) { + GST_ERROR_OBJECT (decoder, "failed to parse sequence header"); + return GST_FLOW_ERROR; + } + + /* 6.1.1.6 Sequence header + The quantisation matrices may be redefined each time that a sequence + header occurs in the bitstream */ + priv->quant_matrix = QUANT_MATRIX_EXT_INIT; + + if (_seq_hdr_is_valid (&priv->seq_hdr) && + memcmp (&priv->seq_hdr, &seq_hdr, sizeof (seq_hdr)) == 0) + return GST_FLOW_OK; + + priv->seq_ext = SEQ_EXT_INIT; + priv->seq_display_ext = SEQ_DISPLAY_EXT_INIT; + priv->seq_scalable_ext = SEQ_SCALABLE_EXT_INIT; + priv->pic_ext = PIC_HDR_EXT_INIT; + + priv->seq_hdr = seq_hdr; + priv->seq_changed = TRUE; + + if (priv->width != seq_hdr.width || priv->height != seq_hdr.height) { + priv->need_to_drain = TRUE; + priv->width = seq_hdr.width; + priv->height = seq_hdr.height; + } + priv->display_width = priv->width; + priv->display_height = priv->height; + + _pts_set_framerate (&priv->tsg, seq_hdr.fps_n, 
seq_hdr.fps_d); + + gst_mpeg2_decoder_set_latency (decoder); + + priv->state = GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_sequence_ext (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoSequenceExt seq_ext = { 0, }; + guint width, height; + + if (!_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR)) { + GST_ERROR_OBJECT (decoder, "no sequence before parsing sequence-extension"); + return GST_FLOW_ERROR; + } + + if (!gst_mpeg_video_packet_parse_sequence_extension (packet, &seq_ext)) { + GST_ERROR_OBJECT (decoder, "failed to parse sequence-extension"); + return GST_FLOW_ERROR; + } + + if (_seq_ext_is_valid (&priv->seq_ext) && + memcmp (&priv->seq_ext, &seq_ext, sizeof (seq_ext)) == 0) + return GST_FLOW_OK; + + priv->seq_ext = seq_ext; + priv->seq_changed = TRUE; + + if (seq_ext.fps_n_ext && seq_ext.fps_d_ext) { + guint fps_n = priv->tsg.fps_n; + guint fps_d = priv->tsg.fps_d; + fps_n *= seq_ext.fps_n_ext + 1; + fps_d *= seq_ext.fps_d_ext + 1; + _pts_set_framerate (&priv->tsg, fps_n, fps_d); + gst_mpeg2_decoder_set_latency (decoder); + } + + width = (priv->width & 0x0fff) | ((guint32) seq_ext.horiz_size_ext << 12); + height = (priv->height & 0x0fff) | ((guint32) seq_ext.vert_size_ext << 12); + + if (priv->width != width || priv->height != height || + priv->profile != seq_ext.profile || + priv->progressive != seq_ext.progressive) { + priv->need_to_drain = TRUE; + priv->width = width; + priv->height = height; + priv->profile = seq_ext.profile; + priv->progressive = seq_ext.progressive; + + GST_DEBUG_OBJECT (decoder, "video resolution %ux%u, profile %d," + " progressive %d", priv->width, priv->height, priv->profile, + priv->progressive); + } + + priv->state |= GST_MPEG2_DECODER_STATE_GOT_SEQ_EXT; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_sequence_display_ext (GstMpeg2Decoder * 
decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoSequenceDisplayExt seq_display_ext = { 0, }; + + if (!_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR)) { + GST_ERROR_OBJECT (decoder, + "no sequence before parsing sequence-display-extension"); + return GST_FLOW_ERROR; + } + + if (!gst_mpeg_video_packet_parse_sequence_display_extension (packet, + &seq_display_ext)) { + GST_ERROR_OBJECT (decoder, "failed to parse sequence-display-extension"); + return GST_FLOW_ERROR; + } + + if (_seq_display_ext_is_valid (&priv->seq_display_ext) && + memcmp (&priv->seq_display_ext, &seq_display_ext, + sizeof (seq_display_ext)) == 0) + return GST_FLOW_OK; + + priv->seq_display_ext = seq_display_ext; + priv->seq_changed = TRUE; + + priv->display_width = seq_display_ext.display_horizontal_size; + priv->display_height = seq_display_ext.display_vertical_size; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_sequence_scalable_ext (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoSequenceScalableExt seq_scalable_ext = { 0, }; + + if (!_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR)) { + GST_ERROR_OBJECT (decoder, + "no sequence before parsing sequence-scalable-extension"); + return GST_FLOW_ERROR; + } + + if (!gst_mpeg_video_packet_parse_sequence_scalable_extension (packet, + &seq_scalable_ext)) { + GST_ERROR_OBJECT (decoder, "failed to parse sequence-scalable-extension"); + return GST_FLOW_ERROR; + } + + if (_seq_scalable_ext_is_valid (&priv->seq_scalable_ext) && + memcmp (&priv->seq_scalable_ext, &seq_scalable_ext, + sizeof (seq_scalable_ext)) == 0) + return GST_FLOW_OK; + + priv->seq_scalable_ext = seq_scalable_ext; + priv->seq_changed = TRUE; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_quant_matrix_ext (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + 
GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoQuantMatrixExt matrix_ext = { 0, }; + + if (!gst_mpeg_video_packet_parse_quant_matrix_extension (packet, &matrix_ext)) { + GST_ERROR_OBJECT (decoder, "failed to parse quant-matrix-extension"); + return GST_FLOW_ERROR; + } + + priv->quant_matrix = matrix_ext; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_picture_ext (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoPictureExt pic_ext = { {{0,},}, }; + + if (!_is_valid_state (decoder, + GST_MPEG2_DECODER_STATE_VALID_SEQ_HEADERS | + GST_MPEG2_DECODER_STATE_GOT_PIC_HDR)) { + GST_ERROR_OBJECT (decoder, + "no sequence or picture header before parsing picture-extension"); + return GST_FLOW_ERROR; + } + + if (!gst_mpeg_video_packet_parse_picture_extension (packet, &pic_ext)) { + GST_ERROR_OBJECT (decoder, "failed to parse picture-extension"); + return GST_FLOW_ERROR; + } + + if (priv->progressive && !pic_ext.progressive_frame) { + GST_WARNING_OBJECT (decoder, + "invalid interlaced frame in progressive sequence, fixing"); + pic_ext.progressive_frame = 1; + } + + if (pic_ext.picture_structure == 0 || + (pic_ext.progressive_frame && + pic_ext.picture_structure != + GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME)) { + GST_WARNING_OBJECT (decoder, + "invalid picture_structure %d, replacing with \"frame\"", + pic_ext.picture_structure); + pic_ext.picture_structure = GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME; + } + + priv->pic_ext = pic_ext; + + priv->state |= GST_MPEG2_DECODER_STATE_GOT_PIC_EXT; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_gop (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoGop gop = { 0, }; + + if (!gst_mpeg_video_packet_parse_gop (packet, &gop)) { + GST_ERROR_OBJECT (decoder, "failed to parse GOP"); + return GST_FLOW_ERROR; + } + + GST_DEBUG_OBJECT 
(decoder, + "GOP %02u:%02u:%02u:%02u (closed_gop %d, broken_link %d)", gop.hour, + gop.minute, gop.second, gop.frame, gop.closed_gop, gop.broken_link); + + priv->gop = gop; + + _pts_sync (&priv->tsg, priv->current_frame->pts); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_picture (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoPictureHdr pic_hdr = { 0, }; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (decoder); + + if (!_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_VALID_SEQ_HEADERS)) { + GST_ERROR_OBJECT (decoder, "no sequence before parsing picture header"); + return GST_FLOW_ERROR; + } + + /* If need_to_drain, we must have sequence changed. */ + g_assert (priv->need_to_drain ? priv->seq_changed : TRUE); + + /* 6.1.1.6: Conversely if no sequence_xxx_extension() occurs between + the first sequence_header() and the first picture_header() then + sequence_xxx_extension() shall not occur in the bitstream. */ + if (priv->seq_changed) { + GstFlowReturn ret; + + /* There is a lot of information in an MPEG-2 sequence (including the ext, + display_ext and scalable_ext). We need to notify the subclass about + any change, but not every change should trigger a drain(), which + may change the output picture order. */ + if (priv->need_to_drain) { + ret = gst_mpeg2_decoder_drain (GST_VIDEO_DECODER (decoder)); + if (ret != GST_FLOW_OK) + return ret; + + priv->need_to_drain = FALSE; + } + + if (klass->get_preferred_output_delay) + priv->preferred_output_delay = + klass->get_preferred_output_delay (decoder, priv->is_live); + + priv->seq_changed = FALSE; + + if (klass->new_sequence) { + ret = klass->new_sequence (decoder, &priv->seq_hdr, + _seq_ext_is_valid (&priv->seq_ext) ? &priv->seq_ext : NULL, + _seq_display_ext_is_valid (&priv->seq_display_ext) ? + &priv->seq_display_ext : NULL, + _seq_scalable_ext_is_valid (&priv->seq_scalable_ext) ? 
+ &priv->seq_scalable_ext : NULL); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, "new sequence error"); + return ret; + } + } + } + + priv->state &= (GST_MPEG2_DECODER_STATE_GOT_SEQ_HDR | + GST_MPEG2_DECODER_STATE_GOT_SEQ_EXT); + + if (!gst_mpeg_video_packet_parse_picture_header (packet, &pic_hdr)) { + GST_ERROR_OBJECT (decoder, "failed to parse picture header"); + return GST_FLOW_ERROR; + } + + priv->pic_hdr = pic_hdr; + + priv->state |= GST_MPEG2_DECODER_STATE_GOT_PIC_HDR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_start_current_picture (GstMpeg2Decoder * decoder, + GstMpeg2Slice * slice) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (decoder); + GstMpeg2Picture *prev_picture, *next_picture; + GstFlowReturn ret; + + if (!klass->start_picture) + return GST_FLOW_OK; + + gst_mpeg2_dpb_get_neighbours (priv->dpb, priv->current_picture, + &prev_picture, &next_picture); + + if (priv->current_picture->type == GST_MPEG_VIDEO_PICTURE_TYPE_B + && !prev_picture && !priv->gop.closed_gop) { + GST_VIDEO_CODEC_FRAME_FLAG_SET (priv->current_frame, + GST_VIDEO_CODEC_FRAME_FLAG_DECODE_ONLY); + } + + ret = klass->start_picture (decoder, priv->current_picture, slice, + prev_picture, next_picture); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, "subclass does not want to start picture"); + return ret; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_ensure_current_picture (GstMpeg2Decoder * decoder, + GstMpeg2Slice * slice) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (decoder); + GstMpeg2Picture *picture = NULL; + GstFlowReturn ret = GST_FLOW_OK; + + if (priv->current_picture) { + g_assert (_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_GOT_SLICE)); + return GST_FLOW_OK; + } + + if (priv->progressive || + priv->pic_ext.picture_structure == + 
GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME) { + g_assert (!_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_GOT_SLICE)); + + if (priv->first_field) { + GST_WARNING_OBJECT (decoder, "An unmatched first field"); + gst_mpeg2_picture_clear (&priv->first_field); + } + + picture = gst_mpeg2_picture_new (); + if (klass->new_picture) + ret = klass->new_picture (decoder, priv->current_frame, picture); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, "subclass does not want to accept new picture"); + gst_mpeg2_picture_unref (picture); + return ret; + } + + picture->structure = GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME; + } else { + if (!priv->first_field) { + picture = gst_mpeg2_picture_new (); + if (klass->new_picture) + ret = klass->new_picture (decoder, priv->current_frame, picture); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, + "subclass does not want to accept new picture"); + gst_mpeg2_picture_unref (picture); + return ret; + } + } else { + picture = gst_mpeg2_picture_new (); + + if (klass->new_field_picture) + ret = klass->new_field_picture (decoder, priv->first_field, picture); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, + "Subclass couldn't handle new field picture"); + gst_mpeg2_picture_unref (picture); + return ret; + } + + picture->first_field = gst_mpeg2_picture_ref (priv->first_field); + + /* At this moment, this picture should be interlaced */ + picture->buffer_flags |= GST_VIDEO_BUFFER_FLAG_INTERLACED; + if (priv->pic_ext.top_field_first) + picture->buffer_flags |= GST_VIDEO_BUFFER_FLAG_TFF; + } + + picture->structure = priv->pic_ext.picture_structure; + } + + picture->needed_for_output = TRUE; + /* This allows accessing the frame from the picture. 
*/ + picture->system_frame_number = priv->current_frame->system_frame_number; + picture->type = priv->pic_hdr.pic_type; + picture->tsn = priv->pic_hdr.tsn; + priv->current_pts = + _pts_eval (&priv->tsg, priv->current_frame->pts, picture->tsn); + picture->pic_order_cnt = _pts_get_poc (&priv->tsg); + + priv->current_picture = picture; + GST_LOG_OBJECT (decoder, + "Create new picture %p(%s), system number: %d, poc: %d," + " type: 0x%d, first field %p", + picture, + (picture->structure == GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME) ? + "frame" : "field", + picture->system_frame_number, picture->pic_order_cnt, picture->type, + picture->first_field); + + return gst_mpeg2_decoder_start_current_picture (decoder, slice); +} + +static GstFlowReturn +gst_mpeg2_decoder_finish_current_field (GstMpeg2Decoder * decoder) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (decoder); + GstFlowReturn ret; + + if (priv->current_picture == NULL) + return GST_FLOW_OK; + + ret = klass->end_picture (decoder, priv->current_picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, "subclass end_picture failed"); + return ret; + } + + if (priv->current_picture->structure != + GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME && + !priv->current_picture->first_field) { + priv->first_field = priv->current_picture; + priv->current_picture = NULL; + } else { + GST_WARNING_OBJECT (decoder, "The current picture %p is not %s, should not " + "begin another picture. Just discard this.", + priv->current_picture, priv->current_picture->structure == + GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME ? 
+ " a field" : "the first field"); + gst_mpeg2_picture_clear (&priv->current_picture); + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_finish_current_picture (GstMpeg2Decoder * decoder) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (decoder); + GstFlowReturn ret; + + g_assert (priv->current_picture != NULL); + + ret = klass->end_picture (decoder, priv->current_picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, "subclass end_picture failed"); + return ret; + } + + if (priv->current_picture->structure != + GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME && + !priv->current_picture->first_field) { + priv->first_field = priv->current_picture; + priv->current_picture = NULL; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_slice (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpegVideoSliceHdr slice_hdr; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (decoder); + GstMpeg2Slice slice; + GstFlowReturn ret; + + if (!_is_valid_state (decoder, GST_MPEG2_DECODER_STATE_VALID_PIC_HEADERS)) { + GST_ERROR_OBJECT (decoder, + "no sequence or picture header before parsing picture header"); + return GST_FLOW_ERROR; + } + + if (!gst_mpeg_video_packet_parse_slice_header (packet, &slice_hdr, + &priv->seq_hdr, + _seq_scalable_ext_is_valid (&priv->seq_scalable_ext) ? + &priv->seq_scalable_ext : NULL)) { + GST_ERROR_OBJECT (decoder, "failed to parse slice header"); + return GST_FLOW_ERROR; + } + + slice.header = slice_hdr; + slice.packet = *packet; + slice.quant_matrix = _quant_matrix_ext_is_valid (&priv->quant_matrix) ? + &priv->quant_matrix : NULL; + g_assert (_pic_hdr_is_valid (&priv->pic_hdr)); + slice.pic_hdr = &priv->pic_hdr; + slice.pic_ext = _pic_hdr_ext_is_valid (&priv->pic_ext) ? 
+ &priv->pic_ext : NULL; + slice.sc_offset = slice.packet.offset - 4; + slice.size = slice.packet.size + 4; + + ret = gst_mpeg2_decoder_ensure_current_picture (decoder, &slice); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, "failed to start current picture"); + return ret; + } + + g_assert (klass->decode_slice); + ret = klass->decode_slice (decoder, priv->current_picture, &slice); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (decoder, + "Subclass didn't want to decode picture %p (frame_num %d, poc %d)", + priv->current_picture, priv->current_picture->system_frame_number, + priv->current_picture->pic_order_cnt); + return ret; + } + + priv->state |= GST_MPEG2_DECODER_STATE_GOT_SLICE; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mpeg2_decoder_decode_packet (GstMpeg2Decoder * decoder, + GstMpegVideoPacket * packet) +{ + GstMpegVideoPacketExtensionCode ext_type; + GstFlowReturn ret = GST_FLOW_OK; + + GST_LOG_OBJECT (decoder, "Parsing the packet 0x%x, size %d", + packet->type, packet->size); + switch (packet->type) { + case GST_MPEG_VIDEO_PACKET_PICTURE:{ + ret = gst_mpeg2_decoder_finish_current_field (decoder); + if (ret != GST_FLOW_OK) + break; + + ret = gst_mpeg2_decoder_handle_picture (decoder, packet); + break; + } + case GST_MPEG_VIDEO_PACKET_SEQUENCE: + ret = gst_mpeg2_decoder_handle_sequence (decoder, packet); + break; + case GST_MPEG_VIDEO_PACKET_EXTENSION: + ext_type = packet->data[packet->offset] >> 4; + GST_LOG_OBJECT (decoder, " Parsing the ext packet 0x%x", ext_type); + switch (ext_type) { + case GST_MPEG_VIDEO_PACKET_EXT_SEQUENCE: + ret = gst_mpeg2_decoder_handle_sequence_ext (decoder, packet); + break; + case GST_MPEG_VIDEO_PACKET_EXT_SEQUENCE_DISPLAY: + ret = gst_mpeg2_decoder_handle_sequence_display_ext (decoder, packet); + break; + case GST_MPEG_VIDEO_PACKET_EXT_SEQUENCE_SCALABLE: + ret = + gst_mpeg2_decoder_handle_sequence_scalable_ext (decoder, packet); + break; + case GST_MPEG_VIDEO_PACKET_EXT_QUANT_MATRIX: + ret = 
gst_mpeg2_decoder_handle_quant_matrix_ext (decoder, packet); + break; + case GST_MPEG_VIDEO_PACKET_EXT_PICTURE: + ret = gst_mpeg2_decoder_handle_picture_ext (decoder, packet); + break; + default: + /* Ignore unknown start-code extensions */ + break; + } + break; + case GST_MPEG_VIDEO_PACKET_SEQUENCE_END: + break; + case GST_MPEG_VIDEO_PACKET_GOP: + ret = gst_mpeg2_decoder_handle_gop (decoder, packet); + break; + case GST_MPEG_VIDEO_PACKET_USER_DATA: + break; + default: + if (packet->type >= GST_MPEG_VIDEO_PACKET_SLICE_MIN && + packet->type <= GST_MPEG_VIDEO_PACKET_SLICE_MAX) { + ret = gst_mpeg2_decoder_handle_slice (decoder, packet); + break; + } + GST_WARNING_OBJECT (decoder, "unsupported packet type 0x%02x, ignore", + packet->type); + break; + } + + return ret; +} + +static void +gst_mpeg2_decoder_do_output_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * to_output, GstFlowReturn * ret) +{ + GstVideoCodecFrame *frame = NULL; + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpeg2DecoderOutputFrame output_frame; + + g_assert (ret != NULL); + + frame = + gst_video_decoder_get_frame (GST_VIDEO_DECODER (decoder), + to_output->system_frame_number); + + if (!frame) { + GST_ERROR_OBJECT (decoder, + "No available codec frame with frame number %d", + to_output->system_frame_number); + UPDATE_FLOW_RETURN (ret, GST_FLOW_ERROR); + + gst_mpeg2_picture_unref (to_output); + + return; + } + + output_frame.frame = frame; + output_frame.picture = to_output; + output_frame.self = decoder; + gst_queue_array_push_tail_struct (priv->output_queue, &output_frame); + gst_mpeg2_decoder_drain_output_queue (decoder, priv->preferred_output_delay, + ret); +} + +static GstFlowReturn +gst_mpeg2_decoder_output_current_picture (GstMpeg2Decoder * decoder) +{ + GstMpeg2DecoderPrivate *priv = decoder->priv; + GstMpeg2Picture *picture = priv->current_picture; + GstFlowReturn ret = GST_FLOW_OK; + + if (!picture && priv->first_field) { + GST_WARNING_OBJECT (decoder, "Missing the second 
field"); + picture = priv->first_field; + } + + g_assert (picture); + + /* Update the presentation time */ + priv->current_frame->pts = priv->current_pts; + + gst_mpeg2_dpb_add (priv->dpb, picture); + + GST_LOG_OBJECT (decoder, + "Add picture %p (frame_num %d, poc %d, type 0x%x), into DPB", picture, + picture->system_frame_number, picture->pic_order_cnt, picture->type); + + while (gst_mpeg2_dpb_need_bump (priv->dpb)) { + GstMpeg2Picture *to_output; + + to_output = gst_mpeg2_dpb_bump (priv->dpb); + g_assert (to_output); + + gst_mpeg2_decoder_do_output_picture (decoder, to_output, &ret); + if (ret != GST_FLOW_OK) + break; + } + + return ret; +} + +static void +gst_mpeg2_decoder_clear_output_frame (GstMpeg2DecoderOutputFrame * output_frame) +{ + if (!output_frame) + return; + + if (output_frame->frame) { + gst_video_decoder_release_frame (GST_VIDEO_DECODER (output_frame->self), + output_frame->frame); + output_frame->frame = NULL; + } + + gst_mpeg2_picture_clear (&output_frame->picture); +} + +static GstFlowReturn +gst_mpeg2_decoder_handle_frame (GstVideoDecoder * decoder, + GstVideoCodecFrame * frame) +{ + GstMpeg2Decoder *self = GST_MPEG2_DECODER (decoder); + GstMpeg2DecoderPrivate *priv = self->priv; + GstBuffer *in_buf = frame->input_buffer; + GstMapInfo map_info; + GstMpegVideoPacket packet; + GstFlowReturn ret = GST_FLOW_OK; + guint offset; + gboolean last_one; + + GST_LOG_OBJECT (self, "handle frame, PTS: %" GST_TIME_FORMAT + ", DTS: %" GST_TIME_FORMAT " system frame number is %d", + GST_TIME_ARGS (GST_BUFFER_PTS (in_buf)), + GST_TIME_ARGS (GST_BUFFER_DTS (in_buf)), frame->system_frame_number); + + priv->state &= ~GST_MPEG2_DECODER_STATE_GOT_SLICE; + + priv->current_frame = frame; + gst_buffer_map (in_buf, &map_info, GST_MAP_READ); + + offset = 0; + last_one = FALSE; + while (gst_mpeg_video_parse (&packet, map_info.data, map_info.size, offset)) { + /* The packet is the last one */ + if (packet.size == -1) { + if (packet.offset < map_info.size) { + packet.size = 
map_info.size - packet.offset; + last_one = TRUE; + } else { + GST_WARNING_OBJECT (decoder, "Get a packet with wrong size"); + break; + } + } + + ret = gst_mpeg2_decoder_decode_packet (self, &packet); + if (ret != GST_FLOW_OK) { + gst_buffer_unmap (in_buf, &map_info); + GST_WARNING_OBJECT (decoder, "failed to handle the packet type 0x%x", + packet.type); + goto failed; + } + + if (last_one) + break; + + offset = packet.offset; + } + + gst_buffer_unmap (in_buf, &map_info); + + if (!priv->current_picture) { + GST_ERROR_OBJECT (decoder, "no valid picture created"); + goto failed; + } + + ret = gst_mpeg2_decoder_finish_current_picture (self); + if (ret != GST_FLOW_OK) { + GST_ERROR_OBJECT (decoder, "failed to decode the current picture"); + goto failed; + } + + ret = gst_mpeg2_decoder_output_current_picture (self); + gst_mpeg2_picture_clear (&priv->current_picture); + gst_mpeg2_picture_clear (&priv->first_field); + gst_video_codec_frame_unref (priv->current_frame); + priv->current_frame = NULL; + return ret; + +failed: + { + if (ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (decoder, 1, STREAM, DECODE, + ("failed to handle the frame %d", frame->system_frame_number), (NULL), + ret); + } + + gst_video_decoder_drop_frame (decoder, frame); + gst_mpeg2_picture_clear (&priv->current_picture); + gst_mpeg2_picture_clear (&priv->first_field); + priv->current_frame = NULL; + + return ret; + } +} + +static void +gst_mpeg2_decoder_drain_output_queue (GstMpeg2Decoder * self, guint num, + GstFlowReturn * ret) +{ + GstMpeg2DecoderPrivate *priv = self->priv; + GstMpeg2DecoderClass *klass = GST_MPEG2_DECODER_GET_CLASS (self); + GstFlowReturn flow_ret; + + g_assert (klass->output_picture); + + while (gst_queue_array_get_length (priv->output_queue) > num) { + GstMpeg2DecoderOutputFrame *output_frame = (GstMpeg2DecoderOutputFrame *) + gst_queue_array_pop_head_struct (priv->output_queue); + GST_LOG_OBJECT (self, + "Output picture %p (frame_num %d, poc %d, pts: %" GST_TIME_FORMAT + "), 
from DPB", + output_frame->picture, output_frame->picture->system_frame_number, + output_frame->picture->pic_order_cnt, + GST_TIME_ARGS (output_frame->frame->pts)); + + flow_ret = + klass->output_picture (self, output_frame->frame, + output_frame->picture); + + UPDATE_FLOW_RETURN (ret, flow_ret); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstmpeg2decoder.h
Added
@@ -0,0 +1,207 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_MPEG2_DECODER_H__ +#define __GST_MPEG2_DECODER_H__ + +#include <gst/codecs/codecs-prelude.h> + +#include <gst/video/video.h> +#include <gst/codecparsers/gstmpegvideoparser.h> +#include <gst/codecs/gstmpeg2picture.h> + +G_BEGIN_DECLS + +#define GST_TYPE_MPEG2_DECODER (gst_mpeg2_decoder_get_type()) +#define GST_MPEG2_DECODER(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_MPEG2_DECODER,GstMpeg2Decoder)) +#define GST_MPEG2_DECODER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_MPEG2_DECODER,GstMpeg2DecoderClass)) +#define GST_MPEG2_DECODER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_MPEG2_DECODER,GstMpeg2DecoderClass)) +#define GST_IS_MPEG2_DECODER(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_MPEG2_DECODER)) +#define GST_IS_MPEG2_DECODER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_MPEG2_DECODER)) + +typedef struct _GstMpeg2Decoder GstMpeg2Decoder; +typedef struct _GstMpeg2DecoderClass GstMpeg2DecoderClass; +typedef struct _GstMpeg2DecoderPrivate GstMpeg2DecoderPrivate; + +/** + * GstMpeg2Decoder: + * + * The opaque #GstMpeg2Decoder data structure. 
+ * + * Since: 1.20 + */ +struct _GstMpeg2Decoder +{ + /*< private >*/ + GstVideoDecoder parent; + + /*< protected >*/ + GstVideoCodecState * input_state; + + /*< private >*/ + GstMpeg2DecoderPrivate *priv; + + gpointer padding[GST_PADDING_LARGE]; +}; + +/** + * GstMpeg2DecoderClass: + */ +struct _GstMpeg2DecoderClass +{ + GstVideoDecoderClass parent_class; + + /** + * GstMpeg2DecoderClass::new_sequence: + * @decoder: a #GstMpeg2Decoder + * @seq: a #GstMpegVideoSequenceHdr + * @seq_ext: a #GstMpegVideoSequenceExt + * + * Notifies subclass of SPS update + * + * Since: 1.20 + */ + GstFlowReturn (*new_sequence) (GstMpeg2Decoder * decoder, + const GstMpegVideoSequenceHdr * seq, + const GstMpegVideoSequenceExt * seq_ext, + const GstMpegVideoSequenceDisplayExt * seq_display_ext, + const GstMpegVideoSequenceScalableExt * seq_scalable_ext); + + /** + * GstMpeg2DecoderClass::new_picture: + * @decoder: a #GstMpeg2Decoder + * @frame: (transfer none): a #GstVideoCodecFrame + * @picture: (transfer none): a #GstMpeg2Picture + * + * Optional. Called whenever new #GstMpeg2Picture is created. + * Subclass can set implementation specific user data + * on the #GstMpeg2Picture via gst_mpeg2_picture_set_user_data() + * + * Since: 1.20 + */ + GstFlowReturn (*new_picture) (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, + GstMpeg2Picture * picture); + + /** + * GstMpeg2DecoderClass::new_field_picture: + * @decoder: a #GstMpeg2Decoder + * @first_field: (transfer none): the first field #GstMpeg2Picture already decoded + * @second_field: (transfer none): a #GstMpeg2Picture for the second field + * + * Called when a new field picture is created for interlaced field picture. 
+ * Subclass can attach implementation specific user data on @second_field via + * gst_mpeg2_picture_set_user_data() + * + * Since: 1.20 + */ + GstFlowReturn (*new_field_picture) (GstMpeg2Decoder * decoder, + const GstMpeg2Picture * first_field, + GstMpeg2Picture * second_field); + + /** + * GstMpeg2DecoderClass::start_picture: + * @decoder: a #GstMpeg2Decoder + * @picture: (transfer none): a #GstMpeg2Picture + * @slice: (transfer none): a #GstMpeg2Slice + * @prev_picture: (transfer none): a #GstMpeg2Picture + * @next_picture: (transfer none): a #GstMpeg2Picture + * + * Optional. Called per one #GstMpeg2Picture to notify subclass to prepare + * decoding process for the #GstMpeg2Picture + * + * Since: 1.20 + */ + GstFlowReturn (*start_picture) (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, + GstMpeg2Slice * slice, + GstMpeg2Picture * prev_picture, + GstMpeg2Picture * next_picture); + + /** + * GstMpeg2DecoderClass::decode_slice: + * @decoder: a #GstMpeg2Decoder + * @picture: (transfer none): a #GstMpeg2Picture + * @slice: (transfer none): a #GstMpeg2Slice + * + * Provides per slice data with parsed slice header and required raw bitstream + * for subclass to decode it. + * + * Since: 1.20 + */ + GstFlowReturn (*decode_slice) (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, + GstMpeg2Slice * slice); + + /** + * GstMpeg2DecoderClass::end_picture: + * @decoder: a #GstMpeg2Decoder + * @picture: (transfer none): a #GstMpeg2Picture + * + * Optional. Called per one #GstMpeg2Picture to notify subclass to finish + * decoding process for the #GstMpeg2Picture + * + * Since: 1.20 + */ + GstFlowReturn (*end_picture) (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture); + + /** + * GstMpeg2DecoderClass::output_picture: + * @decoder: a #GstMpeg2Decoder + * @frame: (transfer full): a #GstVideoCodecFrame + * @picture: (transfer full): a #GstMpeg2Picture + * + * Called with a #GstMpeg2Picture which is required to be outputted. 
+ * The #GstVideoCodecFrame must be consumed by the subclass. + * + * Since: 1.20 + */ + GstFlowReturn (*output_picture) (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, + GstMpeg2Picture * picture); + + /** + * GstMpeg2DecoderClass::get_preferred_output_delay: + * @decoder: a #GstMpeg2Decoder + * @is_live: whether upstream is live or not + * + * Optional. Called by the base class to query whether delaying output is + * preferred by the subclass or not. + * + * Returns: the number of preferred delayed output frames + * + * Since: 1.20 + */ + guint (*get_preferred_output_delay) (GstMpeg2Decoder * decoder, + gboolean is_live); + + /*< private >*/ + gpointer padding[GST_PADDING_LARGE]; +}; + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstMpeg2Decoder, gst_object_unref) + +GST_CODECS_API +GType gst_mpeg2_decoder_get_type (void); + +G_END_DECLS + +#endif /* __GST_MPEG2_DECODER_H__ */
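The vfuncs above only fire once the base class has collected the required headers; internally it tracks this with `GST_MPEG2_DECODER_STATE_*` bit flags that accumulate as headers are parsed and are masked back down to the sequence-level bits when a new picture header arrives. A hedged sketch of that bookkeeping, with illustrative flag values (the real ones live in gstmpeg2decoder.c):

```c
#include <assert.h>

/* Illustrative flag values mirroring the GST_MPEG2_DECODER_STATE_* bits the
 * base class keeps in priv->state; values here are assumptions. */
enum {
  STATE_GOT_SEQ_HDR = 1 << 0,
  STATE_GOT_SEQ_EXT = 1 << 1,
  STATE_GOT_PIC_HDR = 1 << 2,
  STATE_GOT_SLICE   = 1 << 3
};

/* Slices are only legal once sequence and picture headers were seen. */
#define STATE_VALID_PIC_HEADERS (STATE_GOT_SEQ_HDR | STATE_GOT_PIC_HDR)

/* Same shape as the _is_valid_state() helper in the .c file: all expected
 * bits must be present in the accumulated state. */
static int is_valid_state (unsigned state, unsigned expected)
{
  return (state & expected) == expected;
}
```

Masking with `STATE_GOT_SEQ_HDR | STATE_GOT_SEQ_EXT` before parsing a picture header, as the .c file does, keeps sequence-level knowledge while forcing a fresh picture header and slice per frame.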
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstmpeg2picture.c
Added
@@ -0,0 +1,348 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstmpeg2picture.h" + +GST_DEBUG_CATEGORY_EXTERN (gst_mpeg2_decoder_debug); +#define GST_CAT_DEFAULT gst_mpeg2_decoder_debug + +GST_DEFINE_MINI_OBJECT_TYPE (GstMpeg2Picture, gst_mpeg2_picture); + +static void +_gst_mpeg2_picture_free (GstMpeg2Picture * picture) +{ + GST_TRACE ("Free picture %p", picture); + + if (picture->first_field) + gst_mpeg2_picture_unref (picture->first_field); + + if (picture->notify) + picture->notify (picture->user_data); + + g_free (picture); +} + +/** + * gst_mpeg2_picture_new: + * + * Create new #GstMpeg2Picture + * + * Returns: a new #GstMpeg2Picture + * + * Since: 1.20 + */ +GstMpeg2Picture * +gst_mpeg2_picture_new (void) +{ + GstMpeg2Picture *pic; + + pic = g_new0 (GstMpeg2Picture, 1); + + pic->pic_order_cnt = G_MAXINT32; + pic->structure = GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME; + + gst_mini_object_init (GST_MINI_OBJECT_CAST (pic), 0, + GST_TYPE_MPEG2_PICTURE, NULL, NULL, + (GstMiniObjectFreeFunction) _gst_mpeg2_picture_free); + + GST_TRACE ("New picture %p", pic); + + return pic; +} + +/** + * 
gst_mpeg2_picture_set_user_data: + * @picture: a #GstMpeg2Picture + * @user_data: private data + * @notify: (closure user_data): a #GDestroyNotify + * + * Sets @user_data on the picture and the #GDestroyNotify that will be called when + * the picture is freed. + * + * If a @user_data was previously set, then the previous set @notify will be called + * before the @user_data is replaced. + * + * Since: 1.20 + */ +void +gst_mpeg2_picture_set_user_data (GstMpeg2Picture * picture, gpointer user_data, + GDestroyNotify notify) +{ + g_return_if_fail (GST_IS_MPEG2_PICTURE (picture)); + + if (picture->notify) + picture->notify (picture->user_data); + + picture->user_data = user_data; + picture->notify = notify; +} + +/** + * gst_mpeg2_picture_get_user_data: + * @picture: a #GstMpeg2Picture + * + * Gets private data set on the picture via + * gst_mpeg2_picture_set_user_data() previously. + * + * Returns: (transfer none): The previously set user_data + * + * Since: 1.20 + */ +gpointer +gst_mpeg2_picture_get_user_data (GstMpeg2Picture * picture) +{ + return picture->user_data; +} + +struct _GstMpeg2Dpb +{ + GstMpeg2Picture *ref_pic_list[2]; + guint num_ref_pictures; + /* last added picture */ + GstMpeg2Picture *new_pic; +}; + +/** + * gst_mpeg2_dpb_new: (skip) + * + * Create new #GstMpeg2Dpb + * + * Returns: a new #GstMpeg2Dpb + * + * Since: 1.20 + */ +GstMpeg2Dpb * +gst_mpeg2_dpb_new (void) +{ + return g_new0 (GstMpeg2Dpb, 1); +} + +/** + * gst_mpeg2_dpb_free: + * @dpb: a #GstMpeg2Dpb to free + * + * Free the @dpb + * + * Since: 1.20 + */ +void +gst_mpeg2_dpb_free (GstMpeg2Dpb * dpb) +{ + guint i; + + g_return_if_fail (dpb != NULL); + + gst_mpeg2_picture_clear (&dpb->new_pic); + + g_assert (dpb->num_ref_pictures <= 2); + for (i = 0; i < dpb->num_ref_pictures; i++) + gst_mpeg2_picture_clear (&dpb->ref_pic_list[i]); + + g_free (dpb); +} + +/** + * gst_mpeg2_dpb_clear: + * @dpb: a #GstMpeg2Dpb + * + * Clear all stored #GstMpeg2Picture + * + * Since: 1.20 + */ +void 
+gst_mpeg2_dpb_clear (GstMpeg2Dpb * dpb) +{ + guint i; + + g_return_if_fail (dpb != NULL); + + gst_mpeg2_picture_clear (&dpb->new_pic); + + g_assert (dpb->num_ref_pictures <= 2); + for (i = 0; i < dpb->num_ref_pictures; i++) + gst_mpeg2_picture_clear (&dpb->ref_pic_list[i]); + + dpb->num_ref_pictures = 0; +} + +static void +_dpb_add_to_reference (GstMpeg2Dpb * dpb, GstMpeg2Picture * pic) +{ + gint index = -1; + + if (G_LIKELY (dpb->num_ref_pictures == 2)) { + index = (dpb->ref_pic_list[0]->pic_order_cnt > + dpb->ref_pic_list[1]->pic_order_cnt); + + if (dpb->ref_pic_list[index]->pic_order_cnt > pic->pic_order_cnt) + return; + } + + if (index < 0) { + index = dpb->num_ref_pictures; + dpb->num_ref_pictures++; + } + + gst_mpeg2_picture_replace (&dpb->ref_pic_list[index], pic); +} + +/** + * gst_mpeg2_dpb_add: + * @dpb: a #GstMpeg2Dpb + * @picture: (transfer full): a #GstMpeg2Picture + * + * Store the @picture + * + * Since: 1.20 + */ +void +gst_mpeg2_dpb_add (GstMpeg2Dpb * dpb, GstMpeg2Picture * picture) +{ + g_return_if_fail (dpb != NULL); + g_return_if_fail (GST_IS_MPEG2_PICTURE (picture)); + + g_assert (dpb->num_ref_pictures <= 2); + + if (!GST_MPEG2_PICTURE_IS_REF (picture) || dpb->num_ref_pictures == 2) { + gst_mpeg2_picture_replace (&dpb->new_pic, picture); + } else { + _dpb_add_to_reference (dpb, picture); + } +} + +/** + * gst_mpeg2_dpb_need_bump: + * @dpb: a #GstMpeg2Dpb + * + * Checks if @dpb has a new picture.
+ * + * Returns: #TRUE if @dpb needs to be bumped; otherwise, #FALSE + * + * Since: 1.20 + */ +gboolean +gst_mpeg2_dpb_need_bump (GstMpeg2Dpb * dpb) +{ + g_return_val_if_fail (dpb != NULL, FALSE); + g_assert (dpb->num_ref_pictures <= 2); + + if (dpb->new_pic) + return TRUE; + + return FALSE; +} + +/** + * gst_mpeg2_dpb_bump: + * @dpb: a #GstMpeg2Dpb + * + * Returns: (nullable) (transfer full): a #GstMpeg2Picture that needs to be + * output + * + * Since: 1.20 + */ +GstMpeg2Picture * +gst_mpeg2_dpb_bump (GstMpeg2Dpb * dpb) +{ + GstMpeg2Picture *pic = NULL; + guint i; + + g_return_val_if_fail (dpb != NULL, NULL); + g_assert (dpb->num_ref_pictures <= 2); + + /* First, find the lowest poc. */ + for (i = 0; i < 2; i++) { + if (!dpb->ref_pic_list[i]) + continue; + + if (dpb->ref_pic_list[i]->needed_for_output) { + if (!pic || pic->pic_order_cnt > dpb->ref_pic_list[i]->pic_order_cnt) + gst_mpeg2_picture_replace (&pic, dpb->ref_pic_list[i]); + } + } + + if (dpb->new_pic && dpb->new_pic->needed_for_output && + (!pic || pic->pic_order_cnt > dpb->new_pic->pic_order_cnt)) + gst_mpeg2_picture_replace (&pic, dpb->new_pic); + + /* Then, replace the reference if needed. */ + if (dpb->new_pic && GST_MPEG2_PICTURE_IS_REF (dpb->new_pic)) { + _dpb_add_to_reference (dpb, dpb->new_pic); + gst_mpeg2_picture_clear (&dpb->new_pic); + } + + if (pic) { + pic->needed_for_output = FALSE; + if (pic == dpb->new_pic) + gst_mpeg2_picture_clear (&dpb->new_pic); + } + + return pic; +} + +/** + * gst_mpeg2_dpb_get_neighbours: + * @dpb: a #GstMpeg2Dpb + * @picture: current #GstMpeg2Picture + * @prev_picture_ptr: (transfer none) (out) (nullable): previous + * #GstMpeg2Picture in @dpb + * @next_picture_ptr: (transfer none) (out) (nullable): next + * #GstMpeg2Picture in @dpb + * + * Gets the neighbours #GstMpeg2Picture of @picture in @dpb.
+ * + * Since: 1.20 + */ +void +gst_mpeg2_dpb_get_neighbours (GstMpeg2Dpb * dpb, + GstMpeg2Picture * picture, GstMpeg2Picture ** prev_picture_ptr, + GstMpeg2Picture ** next_picture_ptr) +{ + GstMpeg2Picture *ref_picture, *ref_pictures[2]; + GstMpeg2Picture **picture_ptr; + guint i, index; + + g_return_if_fail (dpb != NULL); + g_return_if_fail (picture != NULL); + g_assert (dpb->num_ref_pictures <= 2); + + ref_pictures[0] = NULL; + ref_pictures[1] = NULL; + for (i = 0; i < 2; i++) { + ref_picture = dpb->ref_pic_list[i]; + if (!ref_picture) + continue; + + index = ref_picture->pic_order_cnt > picture->pic_order_cnt; + picture_ptr = &ref_pictures[index]; + if (!*picture_ptr || + ((*picture_ptr)->pic_order_cnt > ref_picture->pic_order_cnt) == index) + *picture_ptr = ref_picture; + } + + if (prev_picture_ptr) + *prev_picture_ptr = ref_pictures[0]; + if (next_picture_ptr) + *next_picture_ptr = ref_pictures[1]; +}
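Since the MPEG-2 DPB holds at most two reference pictures, `gst_mpeg2_dpb_get_neighbours()` reduces to picking, for each side of the current POC, the reference closest to it. A self-contained model of that selection with illustrative types (not the GStreamer ones), using the same index/comparison trick as the source:

```c
#include <assert.h>
#include <stddef.h>

/* Toy picture holding only a picture-order count; stands in for
 * GstMpeg2Picture. */
typedef struct { int poc; } Pic;

/* For each side of the current POC keep the closest reference, mirroring
 * the comparison in gst_mpeg2_dpb_get_neighbours(). */
static void dpb_get_neighbours (Pic *refs[2], const Pic *cur,
    const Pic **prev, const Pic **next)
{
  const Pic *cand[2] = { NULL, NULL };  /* [0] = before current, [1] = after */
  int i;

  for (i = 0; i < 2; i++) {
    const Pic *r = refs[i];
    int idx;

    if (!r)
      continue;
    idx = r->poc > cur->poc;
    /* Replace the candidate only when r is closer to the current POC. */
    if (!cand[idx] || ((cand[idx]->poc > r->poc) == idx))
      cand[idx] = r;
  }
  *prev = cand[0];
  *next = cand[1];
}
```

When both stored references precede the current picture (e.g. an open GOP's leading B-frames), the "next" output is NULL, which is why `start_picture()` receives nullable neighbours.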
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstmpeg2picture.h
Added
@@ -0,0 +1,192 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_MPEG2_PICTURE_H__ +#define __GST_MPEG2_PICTURE_H__ + +#include <gst/codecs/codecs-prelude.h> +#include <gst/codecparsers/gstmpegvideoparser.h> +#include <gst/video/video.h> + +G_BEGIN_DECLS + +/** + * GST_TYPE_MPEG2_PICTURE: + * + * Since: 1.20 + */ +#define GST_TYPE_MPEG2_PICTURE (gst_mpeg2_picture_get_type()) +/** + * GST_IS_MPEG2_PICTURE: + * + * Since: 1.20 + */ +#define GST_IS_MPEG2_PICTURE(obj) (GST_IS_MINI_OBJECT_TYPE(obj, GST_TYPE_MPEG2_PICTURE)) +/** + * GST_MPEG2_PICTURE: + * + * Since: 1.20 + */ +#define GST_MPEG2_PICTURE(obj) ((GstMpeg2Picture *)obj) + +typedef struct _GstMpeg2Slice GstMpeg2Slice; +typedef struct _GstMpeg2Picture GstMpeg2Picture; + +/** + * GstMpeg2Slice: + * + * Since: 1.20 + */ +struct _GstMpeg2Slice +{ + /*< private >*/ + GstMpegVideoQuantMatrixExt *quant_matrix; /* The parameter set */ + GstMpegVideoPictureHdr *pic_hdr; + GstMpegVideoPictureExt *pic_ext; + + GstMpegVideoSliceHdr header; + + /* parsed video packet (doesn't take ownership of raw data) */ + GstMpegVideoPacket packet; + /* offset of the start code for the slice */ + guint sc_offset; + /* 
size, including the start code */ + guint size; +}; + +/** + * GstMpeg2Picture: + * + * Since: 1.20 + */ +struct _GstMpeg2Picture +{ + /*< private >*/ + GstMiniObject parent; + + /* From GstVideoCodecFrame */ + guint32 system_frame_number; + gboolean needed_for_output; + /* For interlaced streams */ + GstMpeg2Picture *first_field; + + GstVideoBufferFlags buffer_flags; + + gint pic_order_cnt; + gint tsn; + GstMpegVideoPictureStructure structure; + GstMpegVideoPictureType type; + + gpointer user_data; + GDestroyNotify notify; +}; + +/** + * GST_MPEG2_PICTURE_IS_REF: + * @picture: a #GstMpeg2Picture + * + * Check whether @picture's type is I or P + * + * Since: 1.20 + */ +#define GST_MPEG2_PICTURE_IS_REF(picture) \ + (((GstMpeg2Picture *) picture)->type == GST_MPEG_VIDEO_PICTURE_TYPE_I || \ + ((GstMpeg2Picture *) picture)->type == GST_MPEG_VIDEO_PICTURE_TYPE_P) + +GST_CODECS_API +GType gst_mpeg2_picture_get_type (void); + +GST_CODECS_API +GstMpeg2Picture * gst_mpeg2_picture_new (void); + +static inline GstMpeg2Picture * +gst_mpeg2_picture_ref (GstMpeg2Picture * picture) +{ + return (GstMpeg2Picture *) gst_mini_object_ref (GST_MINI_OBJECT_CAST (picture)); +} + +static inline void +gst_mpeg2_picture_unref (GstMpeg2Picture * picture) +{ + gst_mini_object_unref (GST_MINI_OBJECT_CAST (picture)); +} + +static inline gboolean +gst_mpeg2_picture_replace (GstMpeg2Picture ** old_picture, + GstMpeg2Picture * new_picture) +{ + return gst_mini_object_replace ((GstMiniObject **) old_picture, + (GstMiniObject *) new_picture); +} + +static inline void +gst_mpeg2_picture_clear (GstMpeg2Picture ** picture) +{ + if (picture && *picture) { + gst_mpeg2_picture_unref (*picture); + *picture = NULL; + } +} + +GST_CODECS_API +void gst_mpeg2_picture_set_user_data (GstMpeg2Picture * picture, + gpointer user_data, + GDestroyNotify notify); + +GST_CODECS_API +gpointer gst_mpeg2_picture_get_user_data (GstMpeg2Picture * picture); + + +/** + * GstMpeg2Dpb: + * + * Since: 1.20 + */ +typedef struct 
_GstMpeg2Dpb GstMpeg2Dpb; + +GST_CODECS_API +GstMpeg2Dpb * gst_mpeg2_dpb_new (void); + +GST_CODECS_API +void gst_mpeg2_dpb_free (GstMpeg2Dpb * dpb); + +GST_CODECS_API +void gst_mpeg2_dpb_clear (GstMpeg2Dpb * dpb); + +GST_CODECS_API +void gst_mpeg2_dpb_add (GstMpeg2Dpb * dpb, + GstMpeg2Picture * picture); +GST_CODECS_API +GstMpeg2Picture * gst_mpeg2_dpb_bump (GstMpeg2Dpb * dpb); + +GST_CODECS_API +gboolean gst_mpeg2_dpb_need_bump (GstMpeg2Dpb * dpb); + +GST_CODECS_API +void gst_mpeg2_dpb_get_neighbours (GstMpeg2Dpb * dpb, + GstMpeg2Picture * picture, + GstMpeg2Picture ** prev_picture_ptr, + GstMpeg2Picture ** next_picture_ptr); + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstMpeg2Picture, gst_mpeg2_picture_unref) + +G_END_DECLS + +#endif /* __GST_MPEG2_PICTURE_H__ */
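The inline helpers above (`gst_mpeg2_picture_replace()`, `gst_mpeg2_picture_clear()`) follow the usual GstMiniObject refcounting contract: replace takes a reference on the incoming picture before dropping the old one, and clear unrefs and NULLs the slot. A toy refcount model of that contract, assuming a plain counter in place of GstMiniObject:

```c
#include <assert.h>
#include <stddef.h>

/* Toy refcounted picture; a plain counter stands in for GstMiniObject. */
typedef struct { int refcount; } Pic;

static Pic *pic_ref (Pic *p)
{
  if (p)
    p->refcount++;
  return p;
}

static void pic_unref (Pic *p)
{
  if (p)
    p->refcount--;              /* a real implementation frees at zero */
}

/* Mirror of gst_mpeg2_picture_replace(): ref the new picture first, then
 * drop the old reference, so replacing a slot with itself stays safe. */
static void pic_replace (Pic **slot, Pic *newp)
{
  Pic *old = *slot;
  *slot = pic_ref (newp);
  pic_unref (old);
}

/* Mirror of gst_mpeg2_picture_clear(): unref and NULL the slot. */
static void pic_clear (Pic **slot)
{
  if (slot && *slot) {
    pic_unref (*slot);
    *slot = NULL;
  }
}
```

Ref-before-unref ordering matters: unrefing first could free the object in the self-replace case before the new reference is taken.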
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gstvp8decoder.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp8decoder.c
Changed
@@ -19,7 +19,7 @@ /** * SECTION:gstvp8decoder - * @title: GstVP8Decoder + * @title: GstVp8Decoder * @short_description: Base class to implement stateless VP8 decoders * @sources: * - gstvp8picture.h @@ -29,6 +29,7 @@ #include <config.h> #endif +#include <gst/base/base.h> #include "gstvp8decoder.h" GST_DEBUG_CATEGORY (gst_vp8_decoder_debug); @@ -42,8 +43,19 @@ gboolean had_sequence; GstVp8Parser parser; gboolean wait_keyframe; + guint preferred_output_delay; + /* for delayed output */ + GstQueueArray *output_queue; + gboolean is_live; }; +typedef struct +{ + GstVideoCodecFrame *frame; + GstVp8Picture *picture; + GstVp8Decoder *self; +} GstVp8DecoderOutputFrame; + #define parent_class gst_vp8_decoder_parent_class G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstVp8Decoder, gst_vp8_decoder, GST_TYPE_VIDEO_DECODER, @@ -60,6 +72,12 @@ static GstFlowReturn gst_vp8_decoder_drain (GstVideoDecoder * decoder); static GstFlowReturn gst_vp8_decoder_handle_frame (GstVideoDecoder * decoder, GstVideoCodecFrame * frame); +static void gst_vp8_decoder_clear_output_frame (GstVp8DecoderOutputFrame * + output_frame); +static void gst_vp8_decoder_drain_output_queue (GstVp8Decoder * self, + guint num, GstFlowReturn * ret); +static GstFlowReturn gst_vp8_decoder_drain_internal (GstVp8Decoder * self, + gboolean wait_keyframe); static void gst_vp8_decoder_class_init (GstVp8DecoderClass * klass) @@ -93,6 +111,11 @@ gst_vp8_parser_init (&priv->parser); priv->wait_keyframe = TRUE; + priv->output_queue = + gst_queue_array_new_for_struct (sizeof (GstVp8DecoderOutputFrame), 1); + gst_queue_array_set_clear_func (priv->output_queue, + (GDestroyNotify) gst_vp8_decoder_clear_output_frame); + return TRUE; } @@ -106,12 +129,14 @@ gst_vp8_picture_clear (&self->alt_ref_picture); priv->wait_keyframe = TRUE; + gst_queue_array_clear (priv->output_queue); } static gboolean gst_vp8_decoder_stop (GstVideoDecoder * decoder) { GstVp8Decoder *self = GST_VP8_DECODER (decoder); + GstVp8DecoderPrivate *priv = self->priv; if 
(self->input_state) { gst_video_codec_state_unref (self->input_state); @@ -119,16 +144,17 @@ } gst_vp8_decoder_reset (self); + gst_queue_array_free (priv->output_queue); return TRUE; } -static gboolean +static GstFlowReturn gst_vp8_decoder_check_codec_change (GstVp8Decoder * self, const GstVp8FrameHdr * frame_hdr) { GstVp8DecoderPrivate *priv = self->priv; - gboolean ret = TRUE; + GstFlowReturn ret = GST_FLOW_OK; gboolean changed = FALSE; if (priv->width != frame_hdr->width || priv->height != frame_hdr->height) { @@ -142,8 +168,22 @@ if (changed || !priv->had_sequence) { GstVp8DecoderClass *klass = GST_VP8_DECODER_GET_CLASS (self); + /* Drain before new sequence */ + ret = gst_vp8_decoder_drain_internal (self, FALSE); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Failed to drain pending frames, returned %s", + gst_flow_get_name (ret)); + return ret; + } + priv->had_sequence = TRUE; + if (klass->get_preferred_output_delay) + priv->preferred_output_delay = + klass->get_preferred_output_delay (self, priv->is_live); + else + priv->preferred_output_delay = 0; + if (klass->new_sequence) ret = klass->new_sequence (self, frame_hdr); } @@ -157,6 +197,7 @@ { GstVp8Decoder *self = GST_VP8_DECODER (decoder); GstVp8DecoderPrivate *priv = self->priv; + GstQuery *query; GST_DEBUG_OBJECT (decoder, "Set format"); @@ -168,6 +209,11 @@ priv->width = GST_VIDEO_INFO_WIDTH (&state->info); priv->height = GST_VIDEO_INFO_HEIGHT (&state->info); + query = gst_query_new_latency (); + if (gst_pad_peer_query (GST_VIDEO_DECODER_SINK_PAD (self), query)) + gst_query_parse_latency (query, &priv->is_live, NULL, NULL); + gst_query_unref (query); + return TRUE; } @@ -236,15 +282,27 @@ } static GstFlowReturn -gst_vp8_decoder_finish (GstVideoDecoder * decoder) +gst_vp8_decoder_drain_internal (GstVp8Decoder * self, gboolean wait_keyframe) { - GstVp8Decoder *self = GST_VP8_DECODER (decoder); + GstFlowReturn ret = GST_FLOW_OK; + GstVp8DecoderPrivate *priv = self->priv; + + 
gst_vp8_decoder_drain_output_queue (self, 0, &ret); + gst_vp8_picture_clear (&self->last_picture); + gst_vp8_picture_clear (&self->golden_ref_picture); + gst_vp8_picture_clear (&self->alt_ref_picture); - GST_DEBUG_OBJECT (self, "finish"); + priv->wait_keyframe = wait_keyframe; - gst_vp8_decoder_reset (self); + return ret; +} + +static GstFlowReturn +gst_vp8_decoder_finish (GstVideoDecoder * decoder) +{ + GST_DEBUG_OBJECT (decoder, "finish"); - return GST_FLOW_OK; + return gst_vp8_decoder_drain_internal (GST_VP8_DECODER (decoder), TRUE); } static gboolean @@ -262,13 +320,24 @@ static GstFlowReturn gst_vp8_decoder_drain (GstVideoDecoder * decoder) { - GstVp8Decoder *self = GST_VP8_DECODER (decoder); + GST_DEBUG_OBJECT (decoder, "drain"); - GST_DEBUG_OBJECT (self, "drain"); + return gst_vp8_decoder_drain_internal (GST_VP8_DECODER (decoder), TRUE); +} - gst_vp8_decoder_reset (self); +static void +gst_vp8_decoder_clear_output_frame (GstVp8DecoderOutputFrame * output_frame) +{ + if (!output_frame) + return; + + if (output_frame->frame) { + gst_video_decoder_release_frame (GST_VIDEO_DECODER (output_frame->self), + output_frame->frame); + output_frame->frame = NULL; + } - return GST_FLOW_OK; + gst_vp8_picture_clear (&output_frame->picture); } static GstFlowReturn @@ -283,6 +352,8 @@ GstVp8FrameHdr frame_hdr; GstVp8ParserResult pres; GstVp8Picture *picture = NULL; + GstFlowReturn ret = GST_FLOW_OK; + GstVp8DecoderOutputFrame output_frame; GST_LOG_OBJECT (self, "handle frame, PTS: %" GST_TIME_FORMAT ", DTS: %" @@ -291,7 +362,7 @@ if (!gst_buffer_map (in_buf, &map, GST_MAP_READ)) { GST_ERROR_OBJECT (self, "Cannot map buffer"); - + ret = GST_FLOW_ERROR; goto error; } @@ -300,7 +371,7 @@ if (pres != GST_VP8_PARSER_OK) { GST_ERROR_OBJECT (self, "Cannot parser frame header"); - + ret = GST_FLOW_ERROR; goto unmap_and_error; } @@ -318,10 +389,12 @@ priv->wait_keyframe = FALSE; - if (frame_hdr.key_frame && - !gst_vp8_decoder_check_codec_change (self, &frame_hdr)) { - 
GST_ERROR_OBJECT (self, "Subclass cannot handle codec change"); - goto unmap_and_error; + if (frame_hdr.key_frame) { + ret = gst_vp8_decoder_check_codec_change (self, &frame_hdr); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Subclass cannot handle codec change"); + goto unmap_and_error; + } } picture = gst_vp8_picture_new (); @@ -332,29 +405,33 @@ picture->system_frame_number = frame->system_frame_number; if (klass->new_picture) { - if (!klass->new_picture (self, frame, picture)) { - GST_ERROR_OBJECT (self, "subclass cannot handle new picture"); + ret = klass->new_picture (self, frame, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to handle new picture"); goto unmap_and_error; } } if (klass->start_picture) { - if (!klass->start_picture (self, picture)) { - GST_ERROR_OBJECT (self, "subclass cannot handle start picture"); + ret = klass->start_picture (self, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to handle start picture"); goto unmap_and_error; } } if (klass->decode_picture) { - if (!klass->decode_picture (self, picture, &priv->parser)) { - GST_ERROR_OBJECT (self, "subclass cannot decode current picture"); + ret = klass->decode_picture (self, picture, &priv->parser); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to decode current picture"); goto unmap_and_error; } } if (klass->end_picture) { - if (!klass->end_picture (self, picture)) { - GST_ERROR_OBJECT (self, "subclass cannot handle end picture"); + ret = klass->end_picture (self, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to handle end picture"); goto unmap_and_error; } } @@ -363,8 +440,29 @@ gst_vp8_decoder_update_reference (self, gst_vp8_picture_ref (picture)); - g_assert (klass->output_picture); - return klass->output_picture (self, frame, picture); + if (!picture->frame_hdr.show_frame) { + GST_LOG_OBJECT (self, "Decode only picture %p", picture); + 
GST_VIDEO_CODEC_FRAME_SET_DECODE_ONLY (frame); + + gst_vp8_picture_unref (picture); + + ret = gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); + } else { + output_frame.frame = frame; + output_frame.picture = picture; + output_frame.self = self; + gst_queue_array_push_tail_struct (priv->output_queue, &output_frame); + } + + gst_vp8_decoder_drain_output_queue (self, priv->preferred_output_delay, &ret); + + if (ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), ret); + return ret; + } + + return ret; unmap_and_error: { @@ -374,15 +472,54 @@ error: { - GstFlowReturn ret; - if (picture) gst_vp8_picture_unref (picture); + if (ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), ret); + } + gst_video_decoder_drop_frame (decoder, frame); - GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, - ("Failed to decode data"), (NULL), ret); return ret; } } + +static void +gst_vp8_decoder_drain_output_queue (GstVp8Decoder * self, guint num, + GstFlowReturn * ret) +{ + GstVp8DecoderPrivate *priv = self->priv; + GstVp8DecoderClass *klass = GST_VP8_DECODER_GET_CLASS (self); + + g_assert (klass->output_picture); + + while (gst_queue_array_get_length (priv->output_queue) > num) { + GstVp8DecoderOutputFrame *output_frame = (GstVp8DecoderOutputFrame *) + gst_queue_array_pop_head_struct (priv->output_queue); + /* Output queued frames whatever the return value is, in order to empty + * the queue */ + GstFlowReturn flow_ret = klass->output_picture (self, + output_frame->frame, output_frame->picture); + + /* Then, update @ret with new flow return value only if @ret was + * GST_FLOW_OK. 
This is to avoid pattern such that + * ```c + * GstFlowReturn my_return = GST_FLOW_OK; + * do something + * + * if (my_return == GST_FLOW_OK) { + * my_return = gst_vp8_decoder_drain_output_queue (); + * } else { + * // Ignore flow return of this method, but current `my_return` error code + * gst_vp8_decoder_drain_output_queue (); + * } + * + * return my_return; + * ``` + */ + if (*ret == GST_FLOW_OK) + *ret = flow_ret; + } +}
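The new `gst_vp8_decoder_drain_output_queue ()` keeps emptying the queue after an error but preserves only the first non-OK flow return, exactly as the comment block above describes. A minimal plain-C sketch of that first-error-wins aggregation (illustrative names, not GStreamer API):

```c
/* Simplified stand-in for GstFlowReturn: 0 is OK, negative values are
 * errors/EOS, matching GStreamer's convention. */
typedef enum { FLOW_OK = 0, FLOW_EOS = -3, FLOW_ERROR = -5 } FlowReturn;

/* Consume every queued result (as the base class consumes every queued
 * output frame), but keep only the FIRST non-OK value: a later EOS or
 * error never overwrites an earlier one. */
static FlowReturn
drain_all (const FlowReturn * queued, int n)
{
  FlowReturn ret = FLOW_OK;
  int i;

  for (i = 0; i < n; i++) {
    FlowReturn flow_ret = queued[i];    /* imagine klass->output_picture () */
    if (ret == FLOW_OK)
      ret = flow_ret;                   /* only the first error sticks */
  }
  return ret;
}
```

This is why a caller no longer needs the awkward "ignore the drain's flow return if I already have an error" pattern shown in the comment: the aggregation happens inside the loop.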
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gstvp8decoder.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp8decoder.h
Changed
@@ -34,7 +34,7 @@ #define GST_VP8_DECODER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_VP8_DECODER,GstVp8DecoderClass)) #define GST_IS_VP8_DECODER(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_VP8_DECODER)) #define GST_IS_VP8_DECODER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_VP8_DECODER)) -#define GST_VP8_DECODER_CAST(obj) ((GstVP8Decoder*)obj) +#define GST_VP8_DECODER_CAST(obj) ((GstVp8Decoder*)obj) typedef struct _GstVp8Decoder GstVp8Decoder; typedef struct _GstVp8DecoderClass GstVp8DecoderClass; @@ -88,31 +88,31 @@ { GstVideoDecoderClass parent_class; - gboolean (*new_sequence) (GstVp8Decoder * decoder, + GstFlowReturn (*new_sequence) (GstVp8Decoder * decoder, const GstVp8FrameHdr * frame_hdr); /** - * GstVp8Decoder:new_picture: + * GstVp8DecoderClass:new_picture: * @decoder: a #GstVp8Decoder * @frame: (transfer none): a #GstVideoCodecFrame * @picture: (transfer none): a #GstVp8Picture */ - gboolean (*new_picture) (GstVp8Decoder * decoder, + GstFlowReturn (*new_picture) (GstVp8Decoder * decoder, GstVideoCodecFrame * frame, GstVp8Picture * picture); - gboolean (*start_picture) (GstVp8Decoder * decoder, + GstFlowReturn (*start_picture) (GstVp8Decoder * decoder, GstVp8Picture * picture); - gboolean (*decode_picture) (GstVp8Decoder * decoder, + GstFlowReturn (*decode_picture) (GstVp8Decoder * decoder, GstVp8Picture * picture, GstVp8Parser * parser); - gboolean (*end_picture) (GstVp8Decoder * decoder, + GstFlowReturn (*end_picture) (GstVp8Decoder * decoder, GstVp8Picture * picture); /** - * GstVp8Decoder:output_picture: + * GstVp8DecoderClass:output_picture: * @decoder: a #GstVp8Decoder * @frame: (transfer full): a #GstVideoCodecFrame * @picture: (transfer full): a #GstVp8Picture @@ -121,6 +121,21 @@ GstVideoCodecFrame * frame, GstVp8Picture * picture); + /** + * GstVp8DecoderClass::get_preferred_output_delay: + * @decoder: a #GstVp8Decoder + * @is_live: whether upstream is live or not + * + * Optional. 
Called by baseclass to query whether delaying output is + * preferred by subclass or not. + * + * Returns: the number of preferred delayed output frames + * + * Since: 1.20 + */ + guint (*get_preferred_output_delay) (GstVp8Decoder * decoder, + gboolean is_live); + /*< private >*/ + gpointer padding[GST_PADDING_LARGE]; };
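The `get_preferred_output_delay` slot added in 1.20 is optional: when a subclass leaves it unimplemented, the base class falls back to a delay of 0 (output immediately), and live pipelines are passed `is_live = TRUE` so a subclass can keep latency low. A hedged sketch of that dispatch pattern in plain C (the struct and function names here are illustrative, not the GStreamer types):

```c
#include <stddef.h>

/* Illustrative vtable: a NULL hook means "no preferred delay". */
typedef struct
{
  unsigned (*get_preferred_output_delay) (void *decoder, int is_live);
} DecoderClass;

/* Mirrors the base-class logic: call the hook if present, else use 0. */
static unsigned
query_output_delay (const DecoderClass * klass, void *decoder, int is_live)
{
  if (klass->get_preferred_output_delay)
    return klass->get_preferred_output_delay (decoder, is_live);
  return 0;                     /* default: no delayed output */
}

/* Example subclass policy: never delay when upstream is live, otherwise
 * buffer a few frames (the value 4 is an arbitrary illustration). */
static unsigned
my_delay (void *decoder, int is_live)
{
  (void) decoder;
  return is_live ? 0 : 4;
}
```

The base class re-queries this hook on every new sequence, so a subclass can adapt the delay to the negotiated stream.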
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gstvp9decoder.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp9decoder.c
Changed
@@ -58,6 +58,7 @@ #include <config.h> #endif +#include <gst/base/base.h> #include "gstvp9decoder.h" GST_DEBUG_CATEGORY (gst_vp9_decoder_debug); @@ -65,18 +66,33 @@ struct _GstVp9DecoderPrivate { - gint width; - gint height; + gint frame_width; + gint frame_height; + gint render_width; + gint render_height; GstVP9Profile profile; gboolean had_sequence; - GstVp9Parser *parser; + GstVp9StatefulParser *parser; GstVp9Dpb *dpb; + gboolean support_non_kf_change; + gboolean wait_keyframe; + /* controls how many frames to delay when calling output_picture() */ + guint preferred_output_delay; + GstQueueArray *output_queue; + gboolean is_live; }; +typedef struct +{ + GstVideoCodecFrame *frame; + GstVp9Picture *picture; + GstVp9Decoder *self; +} GstVp9DecoderOutputFrame; + #define parent_class gst_vp9_decoder_parent_class G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstVp9Decoder, gst_vp9_decoder, GST_TYPE_VIDEO_DECODER, @@ -94,8 +110,12 @@ static GstFlowReturn gst_vp9_decoder_handle_frame (GstVideoDecoder * decoder, GstVideoCodecFrame * frame); -static GstVp9Picture *gst_vp9_decoder_duplicate_picture_default (GstVp9Decoder * - decoder, GstVp9Picture * picture); +static void +gst_vp9_decoder_clear_output_frame (GstVp9DecoderOutputFrame * output_frame); +static void gst_vp9_decoder_drain_output_queue (GstVp9Decoder * self, + guint num, GstFlowReturn * ret); +static GstFlowReturn gst_vp9_decoder_drain_internal (GstVp9Decoder * self, + gboolean wait_keyframe); static void gst_vp9_decoder_class_init (GstVp9DecoderClass * klass) @@ -110,9 +130,6 @@ decoder_class->drain = GST_DEBUG_FUNCPTR (gst_vp9_decoder_drain); decoder_class->handle_frame = GST_DEBUG_FUNCPTR (gst_vp9_decoder_handle_frame); - - klass->duplicate_picture = - GST_DEBUG_FUNCPTR (gst_vp9_decoder_duplicate_picture_default); } static void @@ -121,6 +138,9 @@ gst_video_decoder_set_packetized (GST_VIDEO_DECODER (self), TRUE); self->priv = gst_vp9_decoder_get_instance_private (self); + + /* Assume subclass can support non-keyframe 
format change by default */ + self->priv->support_non_kf_change = TRUE; } static gboolean @@ -129,9 +149,19 @@ GstVp9Decoder *self = GST_VP9_DECODER (decoder); GstVp9DecoderPrivate *priv = self->priv; - priv->parser = gst_vp9_parser_new (); + priv->parser = gst_vp9_stateful_parser_new (); priv->dpb = gst_vp9_dpb_new (); priv->wait_keyframe = TRUE; + priv->profile = GST_VP9_PROFILE_UNDEFINED; + priv->frame_width = 0; + priv->frame_height = 0; + priv->render_width = 0; + priv->render_height = 0; + + priv->output_queue = + gst_queue_array_new_for_struct (sizeof (GstVp9DecoderOutputFrame), 1); + gst_queue_array_set_clear_func (priv->output_queue, + (GDestroyNotify) gst_vp9_decoder_clear_output_frame); return TRUE; } @@ -142,56 +172,83 @@ GstVp9Decoder *self = GST_VP9_DECODER (decoder); GstVp9DecoderPrivate *priv = self->priv; - if (self->input_state) { - gst_video_codec_state_unref (self->input_state); - self->input_state = NULL; - } - - if (priv->parser) { - gst_vp9_parser_free (priv->parser); - priv->parser = NULL; - } - - if (priv->dpb) { - gst_vp9_dpb_free (priv->dpb); - priv->dpb = NULL; - } + g_clear_pointer (&self->input_state, gst_video_codec_state_unref); + g_clear_pointer (&priv->parser, gst_vp9_stateful_parser_free); + g_clear_pointer (&priv->dpb, gst_vp9_dpb_free); + gst_queue_array_free (priv->output_queue); return TRUE; } static gboolean -gst_vp9_decoder_check_codec_change (GstVp9Decoder * self, - const GstVp9FrameHdr * frame_hdr) +gst_vp9_decoder_is_format_change (GstVp9Decoder * self, + const GstVp9FrameHeader * frame_hdr) { GstVp9DecoderPrivate *priv = self->priv; - gboolean ret = TRUE; - gboolean changed = FALSE; - if (priv->width != frame_hdr->width || priv->height != frame_hdr->height) { - GST_INFO_OBJECT (self, "resolution changed %dx%d", frame_hdr->width, + if (priv->frame_width != frame_hdr->width + || priv->frame_height != frame_hdr->height) { + GST_INFO_OBJECT (self, "frame resolution changed %dx%d", frame_hdr->width, frame_hdr->height); - 
priv->width = frame_hdr->width; - priv->height = frame_hdr->height; - changed = TRUE; + return TRUE; + } + + if (priv->render_width != frame_hdr->render_width + || priv->render_height != frame_hdr->render_height) { + GST_INFO_OBJECT (self, "render resolution changed %dx%d", + frame_hdr->render_width, frame_hdr->render_height); + return TRUE; } if (priv->profile != frame_hdr->profile) { GST_INFO_OBJECT (self, "profile changed %d", frame_hdr->profile); - priv->profile = frame_hdr->profile; - changed = TRUE; + return TRUE; + } + + return FALSE; +} + +static GstFlowReturn +gst_vp9_decoder_check_codec_change (GstVp9Decoder * self, + const GstVp9FrameHeader * frame_hdr) +{ + GstVp9DecoderPrivate *priv = self->priv; + GstVp9DecoderClass *klass = GST_VP9_DECODER_GET_CLASS (self); + GstFlowReturn ret = GST_FLOW_OK; + + if (priv->had_sequence && !gst_vp9_decoder_is_format_change (self, frame_hdr)) { + return GST_FLOW_OK; } - if (changed || !priv->had_sequence) { - GstVp9DecoderClass *klass = GST_VP9_DECODER_GET_CLASS (self); + priv->frame_width = frame_hdr->width; + priv->frame_height = frame_hdr->height; + priv->render_width = frame_hdr->render_width; + priv->render_height = frame_hdr->render_height; + priv->profile = frame_hdr->profile; + + /* Drain before new sequence */ + ret = gst_vp9_decoder_drain_internal (self, FALSE); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Failed to drain pending frames, returned %s", + gst_flow_get_name (ret)); + return ret; + } - priv->had_sequence = TRUE; - if (klass->new_sequence) - priv->had_sequence = klass->new_sequence (self, frame_hdr); + priv->had_sequence = TRUE; - ret = priv->had_sequence; + if (klass->get_preferred_output_delay) { + priv->preferred_output_delay = + klass->get_preferred_output_delay (self, priv->is_live); + } else { + priv->preferred_output_delay = 0; } + if (klass->new_sequence) + ret = klass->new_sequence (self, frame_hdr); + + if (ret != GST_FLOW_OK) + priv->had_sequence = FALSE; + return ret; } @@ 
-201,6 +258,7 @@ { GstVp9Decoder *self = GST_VP9_DECODER (decoder); GstVp9DecoderPrivate *priv = self->priv; + GstQuery *query; GST_DEBUG_OBJECT (decoder, "Set format"); @@ -209,8 +267,10 @@ self->input_state = gst_video_codec_state_ref (state); - priv->width = GST_VIDEO_INFO_WIDTH (&state->info); - priv->height = GST_VIDEO_INFO_HEIGHT (&state->info); + query = gst_query_new_latency (); + if (gst_pad_peer_query (GST_VIDEO_DECODER_SINK_PAD (self), query)) + gst_query_parse_latency (query, &priv->is_live, NULL, NULL); + gst_query_unref (query); return TRUE; } @@ -224,6 +284,22 @@ gst_vp9_dpb_clear (priv->dpb); priv->wait_keyframe = TRUE; + gst_queue_array_clear (priv->output_queue); +} + +static GstFlowReturn +gst_vp9_decoder_drain_internal (GstVp9Decoder * self, gboolean wait_keyframe) +{ + GstFlowReturn ret = GST_FLOW_OK; + GstVp9DecoderPrivate *priv = self->priv; + + gst_vp9_decoder_drain_output_queue (self, 0, &ret); + if (priv->dpb) + gst_vp9_dpb_clear (priv->dpb); + + priv->wait_keyframe = wait_keyframe; + + return ret; } static GstFlowReturn @@ -231,9 +307,7 @@ { GST_DEBUG_OBJECT (decoder, "finish"); - gst_vp9_decoder_reset (GST_VP9_DECODER (decoder)); - - return GST_FLOW_OK; + return gst_vp9_decoder_drain_internal (GST_VP9_DECODER (decoder), TRUE); } static gboolean @@ -251,21 +325,22 @@ { GST_DEBUG_OBJECT (decoder, "drain"); - gst_vp9_decoder_reset (GST_VP9_DECODER (decoder)); - - return GST_FLOW_OK; + return gst_vp9_decoder_drain_internal (GST_VP9_DECODER (decoder), TRUE); } -static GstVp9Picture * -gst_vp9_decoder_duplicate_picture_default (GstVp9Decoder * decoder, - GstVp9Picture * picture) +static void +gst_vp9_decoder_clear_output_frame (GstVp9DecoderOutputFrame * output_frame) { - GstVp9Picture *new_picture; + if (!output_frame) + return; - new_picture = gst_vp9_picture_new (); - new_picture->frame_hdr = picture->frame_hdr; + if (output_frame->frame) { + gst_video_decoder_release_frame (GST_VIDEO_DECODER (output_frame->self), + output_frame->frame); + 
output_frame->frame = NULL; + } - return new_picture; + gst_vp9_picture_clear (&output_frame->picture); } static GstFlowReturn @@ -276,81 +351,87 @@ GstVp9DecoderClass *klass = GST_VP9_DECODER_GET_CLASS (self); GstVp9DecoderPrivate *priv = self->priv; GstBuffer *in_buf = frame->input_buffer; - GstVp9FrameHdr frame_hdr[GST_VP9_MAX_FRAMES_IN_SUPERFRAME]; + GstVp9FrameHeader frame_hdr; GstVp9Picture *picture = NULL; - GstVp9FrameHdr *cur_hdr; GstVp9ParserResult pres; - GstVp9SuperframeInfo superframe_info; GstMapInfo map; GstFlowReturn ret = GST_FLOW_OK; - gint i; - gsize offset = 0; - gint frame_idx_to_consume = 0; + gboolean intra_only = FALSE; + gboolean check_codec_change = FALSE; + GstVp9DecoderOutputFrame output_frame; GST_LOG_OBJECT (self, "handle frame %" GST_PTR_FORMAT, in_buf); if (!gst_buffer_map (in_buf, &map, GST_MAP_READ)) { GST_ERROR_OBJECT (self, "Cannot map input buffer"); + ret = GST_FLOW_ERROR; goto error; } - pres = gst_vp9_parser_parse_superframe_info (priv->parser, - &superframe_info, map.data, map.size); + pres = + gst_vp9_stateful_parser_parse_uncompressed_frame_header (priv->parser, + &frame_hdr, map.data, map.size); + if (pres != GST_VP9_PARSER_OK) { - GST_ERROR_OBJECT (self, "Failed to parse superframe header"); + GST_ERROR_OBJECT (self, "Failed to parsing frame header"); + ret = GST_FLOW_ERROR; goto unmap_and_error; } - if (superframe_info.frames_in_superframe > 1) { - GST_LOG_OBJECT (self, - "Have %d frames in superframe", superframe_info.frames_in_superframe); - } - - for (i = 0; i < superframe_info.frames_in_superframe; i++) { - pres = gst_vp9_parser_parse_frame_header (priv->parser, &frame_hdr[i], - map.data + offset, superframe_info.frame_sizes[i]); + if (self->parse_compressed_headers && !frame_hdr.show_existing_frame) { + pres = + gst_vp9_stateful_parser_parse_compressed_frame_header (priv->parser, + &frame_hdr, map.data + frame_hdr.frame_header_length_in_bytes, + map.size); if (pres != GST_VP9_PARSER_OK) { - GST_ERROR_OBJECT (self, 
"Failed to parsing frame header %d", i); + GST_ERROR_OBJECT (self, "Failed to parse the compressed frame header"); goto unmap_and_error; } + } - offset += superframe_info.frame_sizes[i]; + if (frame_hdr.show_existing_frame) { + /* This is a non-intra, dummy frame */ + intra_only = FALSE; + } else if (frame_hdr.frame_type == GST_VP9_KEY_FRAME || frame_hdr.intra_only) { + intra_only = TRUE; } - /* if we have multiple frames in superframe here, - * decide which frame should consume given GstVideoCodecFrame. - * In practice, superframe consists of two frame, one is decode-only frame - * and the other is normal frame. If it's not the case, existing vp9 decoder - * implementations (nvdec, vp9dec, d3d11 and so on) would - * show mismatched number of input and output buffers. - * To handle it in generic manner, we need vp9parse element to - * split frames from superframe. */ - if (superframe_info.frames_in_superframe > 1) { - for (i = 0; i < superframe_info.frames_in_superframe; i++) { - if (frame_hdr[i].show_frame) { - frame_idx_to_consume = i; - break; - } + if (intra_only) { + if (frame_hdr.frame_type == GST_VP9_KEY_FRAME) { + /* Always check codec change per keyframe */ + check_codec_change = TRUE; + } else if (priv->wait_keyframe) { + /* Or, if we are waiting for leading keyframe, but this is intra-only, + * try decoding this frame, it's allowed as per spec */ + check_codec_change = TRUE; } - - /* if all frames are decode-only, choose the first one - * (seems to be no possibility) */ - if (i == superframe_info.frames_in_superframe) - frame_idx_to_consume = 0; } - if (priv->wait_keyframe && frame_hdr[0].frame_type != GST_VP9_KEY_FRAME) { + if (priv->wait_keyframe && !intra_only) { GST_DEBUG_OBJECT (self, "Drop frame before initial keyframe"); gst_buffer_unmap (in_buf, &map); - return gst_video_decoder_drop_frame (decoder, frame);; + gst_video_decoder_release_frame (decoder, frame);; + + return GST_FLOW_OK; } - if (frame_hdr[0].frame_type == GST_VP9_KEY_FRAME && - 
!gst_vp9_decoder_check_codec_change (self, &frame_hdr[0])) { - GST_ERROR_OBJECT (self, "codec change error"); - goto unmap_and_error; + if (check_codec_change) { + ret = gst_vp9_decoder_check_codec_change (self, &frame_hdr); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Subclass cannot handle codec change"); + goto unmap_and_error; + } + } else if (!frame_hdr.show_existing_frame && !priv->support_non_kf_change && + gst_vp9_decoder_is_format_change (self, &frame_hdr)) { + GST_DEBUG_OBJECT (self, "Drop frame on non-keyframe format change"); + + gst_buffer_unmap (in_buf, &map); + gst_video_decoder_release_frame (decoder, frame); + + /* Drains frames if any and waits for keyframe again */ + return gst_vp9_decoder_drain_internal (self, TRUE); } if (!priv->had_sequence) { @@ -360,105 +441,102 @@ priv->wait_keyframe = FALSE; - offset = 0; - for (i = 0; i < superframe_info.frames_in_superframe; i++) { - GstVideoCodecFrame *cur_frame = NULL; - cur_hdr = &frame_hdr[i]; - - if (cur_hdr->show_existing_frame) { - GstVp9Picture *pic_to_dup; - - if (cur_hdr->frame_to_show >= GST_VP9_REF_FRAMES || - !priv->dpb->pic_list[cur_hdr->frame_to_show]) { - GST_ERROR_OBJECT (self, "Invalid frame_to_show %d", - cur_hdr->frame_to_show); - goto unmap_and_error; - } - - g_assert (klass->duplicate_picture); - pic_to_dup = priv->dpb->pic_list[cur_hdr->frame_to_show]; - picture = klass->duplicate_picture (self, pic_to_dup); - - if (!picture) { - GST_ERROR_OBJECT (self, "subclass didn't provide duplicated picture"); - goto unmap_and_error; - } - - picture->pts = GST_BUFFER_PTS (in_buf); - picture->size = 0; - - if (i == frame_idx_to_consume) - cur_frame = gst_video_codec_frame_ref (frame); - - g_assert (klass->output_picture); + if (frame_hdr.show_existing_frame) { + GstVp9Picture *pic_to_dup; - /* transfer ownership of picture */ - ret = klass->output_picture (self, cur_frame, picture); - picture = NULL; - } else { - picture = gst_vp9_picture_new (); - picture->frame_hdr = *cur_hdr; - 
picture->pts = GST_BUFFER_PTS (in_buf); + if (frame_hdr.frame_to_show_map_idx >= GST_VP9_REF_FRAMES || + !priv->dpb->pic_list[frame_hdr.frame_to_show_map_idx]) { + GST_ERROR_OBJECT (self, "Invalid frame_to_show_map_idx %d", + frame_hdr.frame_to_show_map_idx); + goto unmap_and_error; + } - picture->data = map.data + offset; - picture->size = superframe_info.frame_sizes[i]; + /* If not implemented by subclass, we can just drop this picture + * since this frame header indicates the frame index to be duplicated + * and also this frame header doesn't affect reference management */ + if (!klass->duplicate_picture) { + gst_buffer_unmap (in_buf, &map); + GST_VIDEO_CODEC_FRAME_SET_DECODE_ONLY (frame); - picture->subsampling_x = priv->parser->subsampling_x; - picture->subsampling_y = priv->parser->subsampling_y; - picture->bit_depth = priv->parser->bit_depth; + gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); + } - if (i == frame_idx_to_consume) - cur_frame = gst_video_codec_frame_ref (frame); + pic_to_dup = priv->dpb->pic_list[frame_hdr.frame_to_show_map_idx]; + picture = klass->duplicate_picture (self, frame, pic_to_dup); - if (klass->new_picture) { - if (!klass->new_picture (self, cur_frame, picture)) { - GST_ERROR_OBJECT (self, "new picture error"); - goto unmap_and_error; - } + if (!picture) { + GST_ERROR_OBJECT (self, "subclass didn't provide duplicated picture"); + goto unmap_and_error; + } + } else { + picture = gst_vp9_picture_new (); + picture->frame_hdr = frame_hdr; + picture->system_frame_number = frame->system_frame_number; + picture->data = map.data; + picture->size = map.size; + + if (klass->new_picture) { + ret = klass->new_picture (self, frame, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to handle new picture"); + goto unmap_and_error; } + } - if (klass->start_picture) { - if (!klass->start_picture (self, picture)) { - GST_ERROR_OBJECT (self, "start picture error"); - goto unmap_and_error; - } + if 
(klass->start_picture) { + ret = klass->start_picture (self, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to handle start picture"); + goto unmap_and_error; } + } - if (klass->decode_picture) { - if (!klass->decode_picture (self, picture, priv->dpb)) { - GST_ERROR_OBJECT (self, "decode picture error"); - goto unmap_and_error; - } + if (klass->decode_picture) { + ret = klass->decode_picture (self, picture, priv->dpb); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to decode current picture"); + goto unmap_and_error; } + } - if (klass->end_picture) { - if (!klass->end_picture (self, picture)) { - GST_ERROR_OBJECT (self, "end picture error"); - goto unmap_and_error; - } + if (klass->end_picture) { + ret = klass->end_picture (self, picture); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "subclass failed to handle end picture"); + goto unmap_and_error; } + } - /* Just pass our picture to dpb object. - * Even if this picture does not need to be added to dpb - * (i.e., not a reference frame), gst_vp9_dpb_add() will take care of - * the case as well */ - gst_vp9_dpb_add (priv->dpb, gst_vp9_picture_ref (picture)); + /* Just pass our picture to dpb object. 
+ * Even if this picture does not need to be added to dpb + * (i.e., not a reference frame), gst_vp9_dpb_add() will take care of + * the case as well */ + gst_vp9_dpb_add (priv->dpb, gst_vp9_picture_ref (picture)); + } - g_assert (klass->output_picture); + gst_buffer_unmap (in_buf, &map); - /* transfer ownership of picture */ - ret = klass->output_picture (self, cur_frame, picture); - picture = NULL; - } + if (!frame_hdr.show_frame && !frame_hdr.show_existing_frame) { + GST_LOG_OBJECT (self, "Decode only picture %p", picture); + GST_VIDEO_CODEC_FRAME_SET_DECODE_ONLY (frame); - if (ret != GST_FLOW_OK) - break; + gst_vp9_picture_unref (picture); - offset += superframe_info.frame_sizes[i]; + ret = gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); + } else { + output_frame.frame = frame; + output_frame.picture = picture; + output_frame.self = self; + gst_queue_array_push_tail_struct (priv->output_queue, &output_frame); } - gst_buffer_unmap (in_buf, &map); - gst_video_codec_frame_unref (frame); + gst_vp9_decoder_drain_output_queue (self, priv->preferred_output_delay, &ret); + + if (ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), ret); + return ret; + } return ret; @@ -473,10 +551,69 @@ if (picture) gst_vp9_picture_unref (picture); + if (ret == GST_FLOW_ERROR) { + GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, + ("Failed to decode data"), (NULL), ret); + } + gst_video_decoder_drop_frame (decoder, frame); - GST_VIDEO_DECODER_ERROR (self, 1, STREAM, DECODE, - ("Failed to decode data"), (NULL), ret); return ret; } } + +static void +gst_vp9_decoder_drain_output_queue (GstVp9Decoder * self, guint num, + GstFlowReturn * ret) +{ + GstVp9DecoderPrivate *priv = self->priv; + GstVp9DecoderClass *klass = GST_VP9_DECODER_GET_CLASS (self); + + g_assert (klass->output_picture); + + while (gst_queue_array_get_length (priv->output_queue) > num) { + GstVp9DecoderOutputFrame *output_frame = 
(GstVp9DecoderOutputFrame *) + gst_queue_array_pop_head_struct (priv->output_queue); + /* Output queued frames whatever the return value is, in order to empty + * the queue */ + GstFlowReturn flow_ret = klass->output_picture (self, + output_frame->frame, output_frame->picture); + + /* Then, update @ret with new flow return value only if @ret was + * GST_FLOW_OK. This is to avoid pattern such that + * ```c + * GstFlowReturn my_return = GST_FLOW_OK; + * do something + * + * if (my_return == GST_FLOW_OK) { + * my_return = gst_vp9_decoder_drain_output_queue (); + * } else { + * // Ignore flow return of this method, but current `my_return` error code + * gst_vp9_decoder_drain_output_queue (); + * } + * + * return my_return; + * ``` + */ + if (*ret == GST_FLOW_OK) + *ret = flow_ret; + } +} + +/** + * gst_vp9_decoder_set_non_keyframe_format_change_support: + * @decoder: a #GstVp9Decoder + * @support: whether subclass can support non-keyframe format change + * + * Called to set non-keyframe format change awareness + * + * Since: 1.20 + */ +void +gst_vp9_decoder_set_non_keyframe_format_change_support (GstVp9Decoder * decoder, + gboolean support) +{ + g_return_if_fail (GST_IS_VP9_DECODER (decoder)); + + decoder->priv->support_non_kf_change = support; +}
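Both decoders now route displayable frames through a `GstQueueArray`-backed output queue: `handle_frame ()` drains it down to `preferred_output_delay` entries, while the drain/finish paths call it with `num = 0` to flush everything. A small self-contained sketch of those queue semantics (plain C array instead of GstQueueArray; sizes and names are illustrative):

```c
/* Fixed-size stand-in for the GstQueueArray of pending output frames. */
typedef struct
{
  int frames[64];
  int len;
} OutputQueue;

static void
queue_push (OutputQueue * q, int frame)
{
  q->frames[q->len++] = frame;
}

/* Pop from the head while more than `num` entries are queued; returns how
 * many frames were output. num = preferred delay on the decode path,
 * num = 0 on the drain/finish path. */
static int
drain_output_queue (OutputQueue * q, int num)
{
  int output = 0;
  int i;

  while (q->len > num) {
    /* the head entry would be handed to klass->output_picture () here */
    for (i = 1; i < q->len; i++)
      q->frames[i - 1] = q->frames[i];
    q->len--;
    output++;
  }
  return output;
}
```

Holding `num` frames back lets a hardware decoder overlap decode and output work; the trade-off is `num` frames of extra latency, which is why the delay is forced toward 0 for live pipelines.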
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gstvp9decoder.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp9decoder.h
Changed
@@ -23,7 +23,6 @@ #include <gst/codecs/codecs-prelude.h> #include <gst/video/video.h> -#include <gst/codecparsers/gstvp9parser.h> #include <gst/codecs/gstvp9picture.h> G_BEGIN_DECLS @@ -52,6 +51,7 @@ /*< protected >*/ GstVideoCodecState * input_state; + gboolean parse_compressed_headers; /*< private >*/ GstVp9DecoderPrivate *priv; @@ -60,72 +60,134 @@ /** * GstVp9DecoderClass: - * @new_sequence: Notifies subclass of SPS update - * @new_picture: Optional. - * Called whenever new #GstVp9Picture is created. - * Subclass can set implementation specific user data - * on the #GstVp9Picture via gst_vp9_picture_set_user_data() - * @duplicate_picture: Duplicate the #GstVp9Picture - * @start_picture: Optional. - * Called per one #GstVp9Picture to notify subclass to prepare - * decoding process for the #GstVp9Picture - * @decode_slice: Provides per slice data with parsed slice header and - * required raw bitstream for subclass to decode it - * @end_picture: Optional. - * Called per one #GstVp9Picture to notify subclass to finish - * decoding process for the #GstVp9Picture - * @output_picture: Called with a #GstVp9Picture which is required to be outputted. - * Subclass can retrieve parent #GstVideoCodecFrame by using - * gst_video_decoder_get_frame() with system_frame_number - * and the #GstVideoCodecFrame must be consumed by subclass via - * gst_video_decoder_{finish,drop,release}_frame(). */ struct _GstVp9DecoderClass { GstVideoDecoderClass parent_class; - gboolean (*new_sequence) (GstVp9Decoder * decoder, - const GstVp9FrameHdr * frame_hdr); + /** + * GstVp9DecoderClass::new_sequence: + * + * Notifies subclass of video sequence update such as resolution, bitdepth, + * profile. 
+ * + * Since: 1.18 + */ + GstFlowReturn (*new_sequence) (GstVp9Decoder * decoder, + const GstVp9FrameHeader *frame_hdr); /** - * GstVp9Decoder:new_picture: + * GstVp9DecoderClass::new_picture: * @decoder: a #GstVp9Decoder - * @frame: (nullable): (transfer none): a #GstVideoCodecFrame + * @frame: (transfer none): a #GstVideoCodecFrame * @picture: (transfer none): a #GstVp9Picture * - * FIXME 1.20: vp9parse element can splitting super frames, - * and then we can ensure non-null @frame + * Optional. Called whenever new #GstVp9Picture is created. + * Subclass can set implementation specific user data on the #GstVp9Picture + * via gst_vp9_picture_set_user_data() + * + * Since: 1.18 */ - gboolean (*new_picture) (GstVp9Decoder * decoder, + GstFlowReturn (*new_picture) (GstVp9Decoder * decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture); + /** + * GstVp9DecoderClass::duplicate_picture: + * @decoder: a #GstVp9Decoder + * @frame: (transfer none): a #GstVideoCodecFrame + * @picture: (transfer none): a #GstVp9Picture to be duplicated + * + * Optional. Called to duplicate @picture when show_existing_frame flag is set + * in the parsed vp9 frame header. Returned #GstVp9Picture from this method + * should hold already decoded picture data corresponding to the @picture, + * since the returned #GstVp9Picture from this method will be passed to + * the output_picture method immediately without additional decoding process. + * + * If this method is not implemented by subclass, baseclass will drop + * current #GstVideoCodecFrame without additional processing for the current + * frame. + * + * Returns: (transfer full): a #GstVp9Picture or %NULL if failed to duplicate + * @picture. 
+ * + * Since: 1.18 + */ GstVp9Picture * (*duplicate_picture) (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture); - gboolean (*start_picture) (GstVp9Decoder * decoder, + /** + * GstVp9DecoderClass::start_picture: + * @decoder: a #GstVp9Decoder + * @picture: (transfer none): a #GstVp9Picture + * + * Optional. Called to notify subclass to prepare decoding process for + * @picture + * + * Since: 1.18 + */ + GstFlowReturn (*start_picture) (GstVp9Decoder * decoder, GstVp9Picture * picture); - gboolean (*decode_picture) (GstVp9Decoder * decoder, + /** + * GstVp9DecoderClass::decode_picture: + * @decoder: a #GstVp9Decoder + * @picture: (transfer none): a #GstVp9Picture to decode + * @dpb: (transfer none): a #GstVp9Dpb + * + * Called to notify subclass to decode given @picture with + * given @dpb + * + * Since: 1.18 + */ + GstFlowReturn (*decode_picture) (GstVp9Decoder * decoder, + GstVp9Picture * picture, GstVp9Dpb * dpb); - gboolean (*end_picture) (GstVp9Decoder * decoder, + /** + * GstVp9DecoderClass::end_picture: + * @decoder: a #GstVp9Decoder + * @picture: (transfer none): a #GstVp9Picture + * + * Optional. Called per one #GstVp9Picture to notify subclass to finish + * decoding process for the #GstVp9Picture + * + * Since: 1.18 + */ + GstFlowReturn (*end_picture) (GstVp9Decoder * decoder, GstVp9Picture * picture); /** - * GstVp9Decoder:output_picture: + * GstVp9DecoderClass::output_picture: * @decoder: a #GstVp9Decoder - * @frame: (nullable): (transfer full): a #GstVideoCodecFrame + * @frame: (transfer full): a #GstVideoCodecFrame * @picture: (transfer full): a #GstVp9Picture * - * FIXME 1.20: vp9parse element can splitting super frames, - * and then we can ensure non-null @frame + * Called to notify @picture is ready to be output. 
+ * + * Since: 1.18 */ GstFlowReturn (*output_picture) (GstVp9Decoder * decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture); + /** + * GstVp9DecoderClass::get_preferred_output_delay: + * @decoder: a #GstVp9Decoder + * @is_live: whether upstream is live or not + * + * Optional. Retrieve the preferred output delay from child classes. + * It controls how many frames to delay when calling + * GstVp9DecoderClass::output_picture + * + * Returns: the number of preferred delayed output frames + * + * Since: 1.20 + */ + guint (*get_preferred_output_delay) (GstVp9Decoder * decoder, + gboolean is_live); + /*< private >*/ gpointer padding[GST_PADDING_LARGE]; }; @@ -135,6 +197,10 @@ GST_CODECS_API GType gst_vp9_decoder_get_type (void); +GST_CODECS_API +void gst_vp9_decoder_set_non_keyframe_format_change_support (GstVp9Decoder * decoder, + gboolean support); + G_END_DECLS #endif /* __GST_VP9_DECODER_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gstvp9picture.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp9picture.c
Changed
@@ -52,7 +52,6 @@ GstVp9Picture *pic; pic = g_new0 (GstVp9Picture, 1); - pic->pts = GST_CLOCK_TIME_NONE; gst_mini_object_init (GST_MINI_OBJECT_CAST (pic), 0, GST_TYPE_VP9_PICTURE, NULL, NULL,
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/gstvp9picture.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp9picture.h
Changed
@@ -21,7 +21,7 @@ #define __GST_VP9_PICTURE_H__ #include <gst/codecs/codecs-prelude.h> -#include <gst/codecparsers/gstvp9parser.h> +#include <gst/codecs/gstvp9statefulparser.h> G_BEGIN_DECLS @@ -34,18 +34,13 @@ struct _GstVp9Picture { + /*< private >*/ GstMiniObject parent; - GstClockTime pts; /* From GstVideoCodecFrame */ guint32 system_frame_number; - GstVp9FrameHdr frame_hdr; - - /* copied from parser */ - gint subsampling_x; - gint subsampling_y; - guint bit_depth; + GstVp9FrameHeader frame_hdr; /* raw data and size (does not have ownership) */ const guint8 * data;
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp9statefulparser.c
Added
@@ -0,0 +1,1802 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/* + * Copyright (c) 2010 The WebM project authors. All Rights Reserved. + * + * Redistribution and use in source and binary forms, with or without + * modification, are permitted provided that the following conditions are + * met: + * + * * Redistributions of source code must retain the above copyright + * notice, this list of conditions and the following disclaimer. + * + * * Redistributions in binary form must reproduce the above copyright + * notice, this list of conditions and the following disclaimer in + * the documentation and/or other materials provided with the + * distribution. + * + * * Neither the name of Google, nor the WebM Project, nor the names + * of its contributors may be used to endorse or promote products + * derived from this software without specific prior written + * permission. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR + * A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT + * HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, + * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT + * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, + * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY + * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT + * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE + * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. * + */ + +/** + * SECTION:gstvp9statefulparser + * @title: GstVp9StatefulParser + * @short_description: Convenience library for parsing vp9 video bitstream. + * + * This object is used to parse VP9 bitstream header. + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/base/gstbitreader.h> +#include "gstvp9statefulparser.h" +#include <string.h> + +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT ensure_debug_category() +static GstDebugCategory * +ensure_debug_category (void) +{ + static gsize cat_gonce = 0; + + if (g_once_init_enter (&cat_gonce)) { + gsize cat_done; + + cat_done = (gsize) _gst_debug_category_new ("codecparsers_vp9stateful", 0, + "VP9 parser library"); + + g_once_init_leave (&cat_gonce, cat_done); + } + + return (GstDebugCategory *) cat_gonce; +} +#else +#define ensure_debug_category() +#endif /* GST_DISABLE_GST_DEBUG */ + +#define VP9_READ_UINT8(val,nbits) G_STMT_START { \ + if (!gst_bit_reader_get_bits_uint8 (br, &val, nbits)) { \ + GST_ERROR ("failed to read uint8 for '" G_STRINGIFY (val) "', nbits: %d", nbits); \ + return GST_VP9_PARSER_BROKEN_DATA; \ + } \ +} G_STMT_END + +#define VP9_READ_UINT16(val,nbits) G_STMT_START { \ + if (!gst_bit_reader_get_bits_uint16 (br, &val, nbits)) { \ + GST_ERROR ("failed to read uint16 for '" G_STRINGIFY (val) "', nbits: %d", nbits); \ + return GST_VP9_PARSER_BROKEN_DATA; \ + } \ +} G_STMT_END + +#define 
VP9_READ_UINT32(val,nbits) G_STMT_START { \ + if (!gst_bit_reader_get_bits_uint32 (br, &val, nbits)) { \ + GST_ERROR ("failed to read uint32 for '" G_STRINGIFY (val) "', nbits: %d", nbits); \ + return GST_VP9_PARSER_BROKEN_DATA; \ + } \ +} G_STMT_END + +#define VP9_READ_BIT(val) VP9_READ_UINT8(val, 1) + +#define VP9_READ_SIGNED_8(val,nbits) G_STMT_START { \ + guint8 _value; \ + guint8 _negative; \ + VP9_READ_UINT8(_value, nbits); \ + VP9_READ_BIT(_negative); \ + if (_negative) { \ + val = (gint8) _value * -1; \ + } else { \ + val = _value; \ + } \ +} G_STMT_END + +#define VP9_READ_SIGNED_16(val,nbits) G_STMT_START { \ + guint16 _value; \ + guint8 _negative; \ + VP9_READ_UINT16(_value, nbits); \ + VP9_READ_BIT(_negative); \ + if (_negative) { \ + val = (gint16) _value * -1; \ + } else { \ + val = _value; \ + } \ +} G_STMT_END + +#define CHECK_ALLOWED_WITH_DEBUG(dbg, val, min, max) { \ + if (val < min || val > max) { \ + GST_WARNING ("value for '" dbg "' not in allowed range. value: %d, range %d-%d", \ + val, min, max); \ + return GST_VP9_PARSER_ERROR; \ + } \ +} + +#define CHECK_ALLOWED(val, min, max) \ + CHECK_ALLOWED_WITH_DEBUG (G_STRINGIFY (val), val, min, max) + +typedef struct _Vp9BoolDecoder +{ + guint64 value; + guint32 range; + guint32 bits_left; + gint count_to_fill; + GstBitReader *bit_reader; + gboolean out_of_bits; +} Vp9BoolDecoder; + +/* how much to shift to get range > 128 */ +const static guint8 bool_shift_table[256] = { + 0, 7, 6, 6, 5, 5, 5, 5, 4, 4, 4, 4, 4, 4, 4, 4, 3, 3, 3, 3, 3, 3, 3, 3, + 3, 3, 3, 3, 3, 3, 3, 3, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, + 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, +}; + +static const guint8 inv_map_table[255] = { + 7, 20, 33, 46, 59, 72, 85, 98, 111, 124, 137, 150, 163, 176, + 189, 202, 215, 228, 241, 254, 1, 2, 3, 4, 5, 6, 8, 9, + 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 21, 22, 23, 24, + 25, 26, 27, 28, 29, 30, 31, 32, 34, 35, 36, 37, 38, 39, + 40, 41, 42, 43, 44, 45, 47, 48, 49, 50, 51, 52, 53, 54, + 55, 56, 57, 58, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, + 70, 71, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, + 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 99, 100, + 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 112, 113, 114, 115, + 116, 117, 118, 119, 120, 121, 122, 123, 125, 126, 127, 128, 129, 130, + 131, 132, 133, 134, 135, 136, 138, 139, 140, 141, 142, 143, 144, 145, + 146, 147, 148, 149, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, + 161, 162, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, + 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 190, 191, + 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 203, 204, 205, 206, + 207, 208, 209, 210, 211, 212, 213, 214, 216, 217, 218, 219, 220, 221, + 222, 223, 224, 225, 226, 227, 229, 230, 231, 232, 233, 234, 235, 236, + 237, 238, 239, 240, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, + 252, 253, 253, +}; + +static void +fill_bool (Vp9BoolDecoder * bd) +{ + guint max_bits_to_read; + guint bits_to_read; + guint64 data; + + if (G_UNLIKELY (bd->bits_left < bd->count_to_fill)) { + GST_ERROR + ("Invalid VP9 bitstream: the boolean decoder ran out of bits to read"); + bd->out_of_bits = TRUE; + return; + } + + max_bits_to_read = + 8 * (sizeof (bd->value) - sizeof (guint8)) + bd->count_to_fill; + bits_to_read = MIN (max_bits_to_read, bd->bits_left); + + data = + 
gst_bit_reader_get_bits_uint64_unchecked (bd->bit_reader, bits_to_read); + + bd->value |= data << (max_bits_to_read - bits_to_read); + bd->count_to_fill -= bits_to_read; + bd->bits_left -= bits_to_read; +} + +static gboolean +read_bool (Vp9BoolDecoder * bd, guint8 probability) +{ + guint64 split; + guint64 big_split; + guint count; + gboolean bit; + + if (bd->count_to_fill > 0) + fill_bool (bd); + + split = 1 + (((bd->range - 1) * probability) >> 8); + big_split = split << 8 * (sizeof (bd->value) - sizeof (guint8)); + + if (bd->value < big_split) { + bd->range = split; + bit = FALSE; + } else { + bd->range -= split; + bd->value -= big_split; + bit = TRUE; + } + + count = bool_shift_table[bd->range]; + bd->range <<= count; + bd->value <<= count; + bd->count_to_fill += count; + + return bit; +} + +static guint +read_literal (Vp9BoolDecoder * bd, guint n) +{ + guint ret = 0; + guint i; + + for (i = 0; G_UNLIKELY (!bd->out_of_bits) && i < n; i++) { + ret = 2 * ret + read_bool (bd, 128); + } + + return ret; +} + +static GstVp9ParserResult +init_bool (Vp9BoolDecoder * bd, GstBitReader * br, guint size_in_bytes) +{ + gboolean marker_bit; + + if (size_in_bytes < 1) + GST_ERROR ("VP9 Boolean Decoder has no bits to read"); + + if ((gst_bit_reader_get_pos (br) % 8) != 0) + GST_ERROR ("VP9 Boolean Decoder was passed an unaligned buffer"); + + bd->value = 0; + bd->range = 255; + bd->bits_left = 8 * size_in_bytes; + bd->bit_reader = br; + bd->count_to_fill = 8; + bd->out_of_bits = FALSE; + + marker_bit = read_literal (bd, 1); + if (marker_bit != 0) { + GST_ERROR ("Marker bit should be zero was %d", marker_bit); + return GST_VP9_PARSER_BROKEN_DATA; + } + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +exit_bool (Vp9BoolDecoder * bd) +{ + guint8 padding; + guint8 bits = bd->bits_left; + guint8 n; + + while (bits) { + n = MIN (bits, 8); + padding = gst_bit_reader_get_bits_uint32_unchecked (bd->bit_reader, n); + if (padding != 0 || (n < 8 && (padding & 0xe0) == 0xc0)) { 
+ GST_ERROR + ("Invalid padding at end of frame. Total padding bits is %d and the wrong byte is: %x", + bd->bits_left, padding); + return GST_VP9_PARSER_BROKEN_DATA; + } + bits -= n; + } + + return GST_VP9_PARSER_OK; +} + +static guint +decode_term_subexp (Vp9BoolDecoder * bd) +{ + guint8 bit; + guint v; + /* only coded if update_prob is set */ + gboolean prob_is_coded_in_bitstream; + guint delta; + + prob_is_coded_in_bitstream = read_bool (bd, 252); + if (!prob_is_coded_in_bitstream) + return 0; + + bit = read_literal (bd, 1); + if (bit == 0) { + delta = read_literal (bd, 4); + goto end; + } + + bit = read_literal (bd, 1); + if (bit == 0) { + delta = read_literal (bd, 4) + 16; + goto end; + } + + bit = read_literal (bd, 1); + if (bit == 0) { + delta = read_literal (bd, 5) + 32; + goto end; + } + + v = read_literal (bd, 7); + if (v < 65) { + delta = v + 64; + goto end; + } + + bit = read_literal (bd, 1); + delta = (v << 1) - 1 + bit; +end: + return inv_map_table[delta]; +} + +static guint8 +read_mv_prob (Vp9BoolDecoder * bd) +{ + gboolean update_mv_prob; + guint8 mv_prob; + guint8 prob = 0; + + update_mv_prob = read_bool (bd, 252); + if (update_mv_prob) { + mv_prob = read_literal (bd, 7); + prob = (mv_prob << 1) | 1; + } + + return prob; +} + +static GstVp9ParserResult +parse_mv_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i, j, k; + + for (j = 0; j < GST_VP9_MV_JOINTS - 1; j++) + hdr->delta_probabilities.mv.joint[j] = read_mv_prob (bd); + + for (i = 0; i < 2; i++) { + hdr->delta_probabilities.mv.sign[i] = read_mv_prob (bd); + + for (j = 0; j < GST_VP9_MV_CLASSES - 1; j++) + hdr->delta_probabilities.mv.klass[i][j] = read_mv_prob (bd); + + hdr->delta_probabilities.mv.class0_bit[i] = read_mv_prob (bd); + + for (j = 0; j < GST_VP9_MV_OFFSET_BITS; j++) + hdr->delta_probabilities.mv.bits[i][j] = read_mv_prob (bd); + } + + for (i = 0; i < 2; i++) { + for (j = 0; j < GST_VP9_CLASS0_SIZE; j++) + for (k = 0; k < GST_VP9_MV_FR_SIZE - 1; k++) + 
hdr->delta_probabilities.mv.class0_fr[i][j][k] = read_mv_prob (bd); + + for (k = 0; k < GST_VP9_MV_FR_SIZE - 1; k++) + hdr->delta_probabilities.mv.fr[i][k] = read_mv_prob (bd); + } + + if (hdr->allow_high_precision_mv) { + for (i = 0; i < 2; i++) { + hdr->delta_probabilities.mv.class0_hp[i] = read_mv_prob (bd); + hdr->delta_probabilities.mv.hp[i] = read_mv_prob (bd); + } + + } + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_partition_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i, j; + + for (i = 0; i < GST_VP9_PARTITION_CONTEXTS; i++) + for (j = 0; j < GST_VP9_PARTITION_TYPES - 1; j++) + hdr->delta_probabilities.partition[i][j] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_y_mode_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i, j; + + for (i = 0; i < GST_VP9_BLOCK_SIZE_GROUPS; i++) + for (j = 0; j < GST_VP9_INTRA_MODES - 1; j++) + hdr->delta_probabilities.y_mode[i][j] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_frame_reference_mode_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i; + + if (hdr->reference_mode == GST_VP9_REFERENCE_MODE_SELECT) + for (i = 0; i < GST_VP9_COMP_MODE_CONTEXTS; i++) + hdr->delta_probabilities.comp_mode[i] = decode_term_subexp (bd); + + if (hdr->reference_mode != GST_VP9_REFERENCE_MODE_COMPOUND_REFERENCE) + for (i = 0; i < GST_VP9_REF_CONTEXTS; i++) { + hdr->delta_probabilities.single_ref[i][0] = decode_term_subexp (bd); + hdr->delta_probabilities.single_ref[i][1] = decode_term_subexp (bd); + } + + if (hdr->reference_mode != GST_VP9_REFERENCE_MODE_SINGLE_REFERENCE) + for (i = 0; i < GST_VP9_REF_CONTEXTS; i++) + hdr->delta_probabilities.comp_ref[i] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_frame_reference (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + gboolean compound_ref_allowed = FALSE; + guint8 
non_single_reference; + guint8 reference_select; + guint i; + + for (i = GST_VP9_REF_FRAME_LAST; i < GST_VP9_REFS_PER_FRAME; i++) + if (hdr->ref_frame_sign_bias[i + 1] != + hdr->ref_frame_sign_bias[GST_VP9_REF_FRAME_LAST]) + compound_ref_allowed = TRUE; + + if (compound_ref_allowed) { + non_single_reference = read_literal (bd, 1); + if (!non_single_reference) + hdr->reference_mode = GST_VP9_REFERENCE_MODE_SINGLE_REFERENCE; + else { + reference_select = read_literal (bd, 1); + if (!reference_select) + hdr->reference_mode = GST_VP9_REFERENCE_MODE_COMPOUND_REFERENCE; + else + hdr->reference_mode = GST_VP9_REFERENCE_MODE_SELECT; + } + } else + hdr->reference_mode = GST_VP9_REFERENCE_MODE_SINGLE_REFERENCE; + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_is_inter_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i; + + for (i = 0; i < GST_VP9_IS_INTER_CONTEXTS; i++) + hdr->delta_probabilities.is_inter[i] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_interp_filter_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i, j; + + for (i = 0; i < GST_VP9_INTERP_FILTER_CONTEXTS; i++) + for (j = 0; j < GST_VP9_SWITCHABLE_FILTERS - 1; j++) + hdr->delta_probabilities.interp_filter[i][j] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_inter_mode_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i, j; + + for (i = 0; i < GST_VP9_INTER_MODE_CONTEXTS; i++) + for (j = 0; j < GST_VP9_INTER_MODES - 1; j++) + hdr->delta_probabilities.inter_mode[i][j] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_skip_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i; + + for (i = 0; i < GST_VP9_SKIP_CONTEXTS; i++) + hdr->delta_probabilities.skip[i] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_coef_probs (GstVp9FrameHeader * hdr, 
Vp9BoolDecoder * bd) +{ + GstVp9TxSize tx_size, max_tx_size; + guint8 i, j, k, l, m; + guint8 update_probs; + + static const guint8 tx_mode_to_biggest_tx_size[GST_VP9_TX_MODES] = { + GST_VP9_TX_4x4, + GST_VP9_TX_8x8, + GST_VP9_TX_16x16, + GST_VP9_TX_32x32, + GST_VP9_TX_32x32, + }; + + max_tx_size = tx_mode_to_biggest_tx_size[hdr->tx_mode]; + for (tx_size = GST_VP9_TX_4x4; tx_size <= max_tx_size; tx_size++) { + update_probs = read_literal (bd, 1); + if (update_probs) { + for (i = 0; i < 2; i++) + for (j = 0; j < 2; j++) + for (k = 0; k < 6; k++) + for (l = 0; l < ((k == 0) ? 3 : 6); l++) + for (m = 0; m < 3; m++) + hdr->delta_probabilities.coef[tx_size][i][j][k][l][m] = + decode_term_subexp (bd); + } + } + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_tx_mode_probs (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint i, j; + + for (i = 0; i < GST_VP9_TX_SIZE_CONTEXTS; i++) + for (j = 0; j < GST_VP9_TX_SIZES - 3; j++) + hdr->delta_probabilities.tx_probs_8x8[i][j] = decode_term_subexp (bd); + + for (i = 0; i < GST_VP9_TX_SIZE_CONTEXTS; i++) + for (j = 0; j < GST_VP9_TX_SIZES - 2; j++) + hdr->delta_probabilities.tx_probs_16x16[i][j] = decode_term_subexp (bd); + + for (i = 0; i < GST_VP9_TX_SIZE_CONTEXTS; i++) + for (j = 0; j < GST_VP9_TX_SIZES - 1; j++) + hdr->delta_probabilities.tx_probs_32x32[i][j] = decode_term_subexp (bd); + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_tx_mode (GstVp9FrameHeader * hdr, Vp9BoolDecoder * bd) +{ + guint8 tx_mode; + guint8 tx_mode_select; + + if (hdr->lossless_flag) { + hdr->tx_mode = GST_VP9_TX_MODE_ONLY_4x4; + return GST_VP9_PARSER_OK; + } + + tx_mode = read_literal (bd, 2); + if (tx_mode == GST_VP9_TX_MODE_ALLOW_32x32) { + tx_mode_select = read_literal (bd, 1); + tx_mode += tx_mode_select; + } + + hdr->tx_mode = tx_mode; + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_compressed_header (GstVp9StatefulParser * self, GstVp9FrameHeader * hdr, + GstBitReader * br) +{ + 
GstVp9ParserResult rst; + gboolean frame_is_intra_only; + Vp9BoolDecoder bd; + + /* consume trailing bits */ + while (gst_bit_reader_get_pos (br) & 0x7) + gst_bit_reader_get_bits_uint8_unchecked (br, 1); + + rst = init_bool (&bd, br, hdr->header_size_in_bytes); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to init the boolean decoder."); + return rst; + } + + rst = parse_tx_mode (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + if (hdr->tx_mode == GST_VP9_TX_MODE_SELECT) { + rst = parse_tx_mode_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + } + + rst = parse_coef_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_skip_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + frame_is_intra_only = (hdr->frame_type == GST_VP9_KEY_FRAME + || hdr->intra_only); + + if (!frame_is_intra_only) { + rst = parse_inter_mode_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + if (hdr->interpolation_filter == GST_VP9_INTERPOLATION_FILTER_SWITCHABLE) { + rst = parse_interp_filter_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + } + + rst = parse_is_inter_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_frame_reference (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_frame_reference_mode_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_y_mode_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_partition_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_mv_probs (hdr, &bd); + if (rst != GST_VP9_PARSER_OK) + return rst; + } + + rst = exit_bool (&bd); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("The boolean decoder did not exit cleanly."); + return rst; + } + + return GST_VP9_PARSER_OK; +} + +static const gint16 dc_qlookup[256] = { + 4, 8, 8, 9, 10, 11, 12, 12, + 13, 14, 15, 16, 17, 18, 19, 19, + 20, 21, 22, 23, 24, 25, 26, 26, 
+ 27, 28, 29, 30, 31, 32, 32, 33, + 34, 35, 36, 37, 38, 38, 39, 40, + 41, 42, 43, 43, 44, 45, 46, 47, + 48, 48, 49, 50, 51, 52, 53, 53, + 54, 55, 56, 57, 57, 58, 59, 60, + 61, 62, 62, 63, 64, 65, 66, 66, + 67, 68, 69, 70, 70, 71, 72, 73, + 74, 74, 75, 76, 77, 78, 78, 79, + 80, 81, 81, 82, 83, 84, 85, 85, + 87, 88, 90, 92, 93, 95, 96, 98, + 99, 101, 102, 104, 105, 107, 108, 110, + 111, 113, 114, 116, 117, 118, 120, 121, + 123, 125, 127, 129, 131, 134, 136, 138, + 140, 142, 144, 146, 148, 150, 152, 154, + 156, 158, 161, 164, 166, 169, 172, 174, + 177, 180, 182, 185, 187, 190, 192, 195, + 199, 202, 205, 208, 211, 214, 217, 220, + 223, 226, 230, 233, 237, 240, 243, 247, + 250, 253, 257, 261, 265, 269, 272, 276, + 280, 284, 288, 292, 296, 300, 304, 309, + 313, 317, 322, 326, 330, 335, 340, 344, + 349, 354, 359, 364, 369, 374, 379, 384, + 389, 395, 400, 406, 411, 417, 423, 429, + 435, 441, 447, 454, 461, 467, 475, 482, + 489, 497, 505, 513, 522, 530, 539, 549, + 559, 569, 579, 590, 602, 614, 626, 640, + 654, 668, 684, 700, 717, 736, 755, 775, + 796, 819, 843, 869, 896, 925, 955, 988, + 1022, 1058, 1098, 1139, 1184, 1232, 1282, 1336, +}; + +static const gint16 dc_qlookup_10[256] = { + 4, 9, 10, 13, 15, 17, 20, 22, + 25, 28, 31, 34, 37, 40, 43, 47, + 50, 53, 57, 60, 64, 68, 71, 75, + 78, 82, 86, 90, 93, 97, 101, 105, + 109, 113, 116, 120, 124, 128, 132, 136, + 140, 143, 147, 151, 155, 159, 163, 166, + 170, 174, 178, 182, 185, 189, 193, 197, + 200, 204, 208, 212, 215, 219, 223, 226, + 230, 233, 237, 241, 244, 248, 251, 255, + 259, 262, 266, 269, 273, 276, 280, 283, + 287, 290, 293, 297, 300, 304, 307, 310, + 314, 317, 321, 324, 327, 331, 334, 337, + 343, 350, 356, 362, 369, 375, 381, 387, + 394, 400, 406, 412, 418, 424, 430, 436, + 442, 448, 454, 460, 466, 472, 478, 484, + 490, 499, 507, 516, 525, 533, 542, 550, + 559, 567, 576, 584, 592, 601, 609, 617, + 625, 634, 644, 655, 666, 676, 687, 698, + 708, 718, 729, 739, 749, 759, 770, 782, + 795, 807, 819, 831, 844, 856, 868, 
880, + 891, 906, 920, 933, 947, 961, 975, 988, + 1001, 1015, 1030, 1045, 1061, 1076, 1090, 1105, + 1120, 1137, 1153, 1170, 1186, 1202, 1218, 1236, + 1253, 1271, 1288, 1306, 1323, 1342, 1361, 1379, + 1398, 1416, 1436, 1456, 1476, 1496, 1516, 1537, + 1559, 1580, 1601, 1624, 1647, 1670, 1692, 1717, + 1741, 1766, 1791, 1817, 1844, 1871, 1900, 1929, + 1958, 1990, 2021, 2054, 2088, 2123, 2159, 2197, + 2236, 2276, 2319, 2363, 2410, 2458, 2508, 2561, + 2616, 2675, 2737, 2802, 2871, 2944, 3020, 3102, + 3188, 3280, 3375, 3478, 3586, 3702, 3823, 3953, + 4089, 4236, 4394, 4559, 4737, 4929, 5130, 5347, +}; + +static const gint16 dc_qlookup_12[256] = { + 4, 12, 18, 25, 33, 41, 50, 60, + 70, 80, 91, 103, 115, 127, 140, 153, + 166, 180, 194, 208, 222, 237, 251, 266, + 281, 296, 312, 327, 343, 358, 374, 390, + 405, 421, 437, 453, 469, 484, 500, 516, + 532, 548, 564, 580, 596, 611, 627, 643, + 659, 674, 690, 706, 721, 737, 752, 768, + 783, 798, 814, 829, 844, 859, 874, 889, + 904, 919, 934, 949, 964, 978, 993, 1008, + 1022, 1037, 1051, 1065, 1080, 1094, 1108, 1122, + 1136, 1151, 1165, 1179, 1192, 1206, 1220, 1234, + 1248, 1261, 1275, 1288, 1302, 1315, 1329, 1342, + 1368, 1393, 1419, 1444, 1469, 1494, 1519, 1544, + 1569, 1594, 1618, 1643, 1668, 1692, 1717, 1741, + 1765, 1789, 1814, 1838, 1862, 1885, 1909, 1933, + 1957, 1992, 2027, 2061, 2096, 2130, 2165, 2199, + 2233, 2267, 2300, 2334, 2367, 2400, 2434, 2467, + 2499, 2532, 2575, 2618, 2661, 2704, 2746, 2788, + 2830, 2872, 2913, 2954, 2995, 3036, 3076, 3127, + 3177, 3226, 3275, 3324, 3373, 3421, 3469, 3517, + 3565, 3621, 3677, 3733, 3788, 3843, 3897, 3951, + 4005, 4058, 4119, 4181, 4241, 4301, 4361, 4420, + 4479, 4546, 4612, 4677, 4742, 4807, 4871, 4942, + 5013, 5083, 5153, 5222, 5291, 5367, 5442, 5517, + 5591, 5665, 5745, 5825, 5905, 5984, 6063, 6149, + 6234, 6319, 6404, 6495, 6587, 6678, 6769, 6867, + 6966, 7064, 7163, 7269, 7376, 7483, 7599, 7715, + 7832, 7958, 8085, 8214, 8352, 8492, 8635, 8788, + 8945, 9104, 9275, 9450, 9639, 
9832, 10031, 10245, + 10465, 10702, 10946, 11210, 11482, 11776, 12081, 12409, + 12750, 13118, 13501, 13913, 14343, 14807, 15290, 15812, + 16356, 16943, 17575, 18237, 18949, 19718, 20521, 21387, +}; + +static const gint16 ac_qlookup[256] = { + 4, 8, 9, 10, 11, 12, 13, 14, + 15, 16, 17, 18, 19, 20, 21, 22, + 23, 24, 25, 26, 27, 28, 29, 30, + 31, 32, 33, 34, 35, 36, 37, 38, + 39, 40, 41, 42, 43, 44, 45, 46, + 47, 48, 49, 50, 51, 52, 53, 54, + 55, 56, 57, 58, 59, 60, 61, 62, + 63, 64, 65, 66, 67, 68, 69, 70, + 71, 72, 73, 74, 75, 76, 77, 78, + 79, 80, 81, 82, 83, 84, 85, 86, + 87, 88, 89, 90, 91, 92, 93, 94, + 95, 96, 97, 98, 99, 100, 101, 102, + 104, 106, 108, 110, 112, 114, 116, 118, + 120, 122, 124, 126, 128, 130, 132, 134, + 136, 138, 140, 142, 144, 146, 148, 150, + 152, 155, 158, 161, 164, 167, 170, 173, + 176, 179, 182, 185, 188, 191, 194, 197, + 200, 203, 207, 211, 215, 219, 223, 227, + 231, 235, 239, 243, 247, 251, 255, 260, + 265, 270, 275, 280, 285, 290, 295, 300, + 305, 311, 317, 323, 329, 335, 341, 347, + 353, 359, 366, 373, 380, 387, 394, 401, + 408, 416, 424, 432, 440, 448, 456, 465, + 474, 483, 492, 501, 510, 520, 530, 540, + 550, 560, 571, 582, 593, 604, 615, 627, + 639, 651, 663, 676, 689, 702, 715, 729, + 743, 757, 771, 786, 801, 816, 832, 848, + 864, 881, 898, 915, 933, 951, 969, 988, + 1007, 1026, 1046, 1066, 1087, 1108, 1129, 1151, + 1173, 1196, 1219, 1243, 1267, 1292, 1317, 1343, + 1369, 1396, 1423, 1451, 1479, 1508, 1537, 1567, + 1597, 1628, 1660, 1692, 1725, 1759, 1793, 1828, +}; + +static const gint16 ac_qlookup_10[256] = { + 4, 9, 11, 13, 16, 18, 21, 24, + 27, 30, 33, 37, 40, 44, 48, 51, + 55, 59, 63, 67, 71, 75, 79, 83, + 88, 92, 96, 100, 105, 109, 114, 118, + 122, 127, 131, 136, 140, 145, 149, 154, + 158, 163, 168, 172, 177, 181, 186, 190, + 195, 199, 204, 208, 213, 217, 222, 226, + 231, 235, 240, 244, 249, 253, 258, 262, + 267, 271, 275, 280, 284, 289, 293, 297, + 302, 306, 311, 315, 319, 324, 328, 332, + 337, 341, 345, 349, 354, 358, 362, 
367, + 371, 375, 379, 384, 388, 392, 396, 401, + 409, 417, 425, 433, 441, 449, 458, 466, + 474, 482, 490, 498, 506, 514, 523, 531, + 539, 547, 555, 563, 571, 579, 588, 596, + 604, 616, 628, 640, 652, 664, 676, 688, + 700, 713, 725, 737, 749, 761, 773, 785, + 797, 809, 825, 841, 857, 873, 889, 905, + 922, 938, 954, 970, 986, 1002, 1018, 1038, + 1058, 1078, 1098, 1118, 1138, 1158, 1178, 1198, + 1218, 1242, 1266, 1290, 1314, 1338, 1362, 1386, + 1411, 1435, 1463, 1491, 1519, 1547, 1575, 1603, + 1631, 1663, 1695, 1727, 1759, 1791, 1823, 1859, + 1895, 1931, 1967, 2003, 2039, 2079, 2119, 2159, + 2199, 2239, 2283, 2327, 2371, 2415, 2459, 2507, + 2555, 2603, 2651, 2703, 2755, 2807, 2859, 2915, + 2971, 3027, 3083, 3143, 3203, 3263, 3327, 3391, + 3455, 3523, 3591, 3659, 3731, 3803, 3876, 3952, + 4028, 4104, 4184, 4264, 4348, 4432, 4516, 4604, + 4692, 4784, 4876, 4972, 5068, 5168, 5268, 5372, + 5476, 5584, 5692, 5804, 5916, 6032, 6148, 6268, + 6388, 6512, 6640, 6768, 6900, 7036, 7172, 7312, +}; + +static const gint16 ac_qlookup_12[256] = { + 4, 13, 19, 27, 35, 44, 54, 64, + 75, 87, 99, 112, 126, 139, 154, 168, + 183, 199, 214, 230, 247, 263, 280, 297, + 314, 331, 349, 366, 384, 402, 420, 438, + 456, 475, 493, 511, 530, 548, 567, 586, + 604, 623, 642, 660, 679, 698, 716, 735, + 753, 772, 791, 809, 828, 846, 865, 884, + 902, 920, 939, 957, 976, 994, 1012, 1030, + 1049, 1067, 1085, 1103, 1121, 1139, 1157, 1175, + 1193, 1211, 1229, 1246, 1264, 1282, 1299, 1317, + 1335, 1352, 1370, 1387, 1405, 1422, 1440, 1457, + 1474, 1491, 1509, 1526, 1543, 1560, 1577, 1595, + 1627, 1660, 1693, 1725, 1758, 1791, 1824, 1856, + 1889, 1922, 1954, 1987, 2020, 2052, 2085, 2118, + 2150, 2183, 2216, 2248, 2281, 2313, 2346, 2378, + 2411, 2459, 2508, 2556, 2605, 2653, 2701, 2750, + 2798, 2847, 2895, 2943, 2992, 3040, 3088, 3137, + 3185, 3234, 3298, 3362, 3426, 3491, 3555, 3619, + 3684, 3748, 3812, 3876, 3941, 4005, 4069, 4149, + 4230, 4310, 4390, 4470, 4550, 4631, 4711, 4791, + 4871, 4967, 5064, 5160, 
5256, 5352, 5448, 5544, + 5641, 5737, 5849, 5961, 6073, 6185, 6297, 6410, + 6522, 6650, 6778, 6906, 7034, 7162, 7290, 7435, + 7579, 7723, 7867, 8011, 8155, 8315, 8475, 8635, + 8795, 8956, 9132, 9308, 9484, 9660, 9836, 10028, + 10220, 10412, 10604, 10812, 11020, 11228, 11437, 11661, + 11885, 12109, 12333, 12573, 12813, 13053, 13309, 13565, + 13821, 14093, 14365, 14637, 14925, 15213, 15502, 15806, + 16110, 16414, 16734, 17054, 17390, 17726, 18062, 18414, + 18766, 19134, 19502, 19886, 20270, 20670, 21070, 21486, + 21902, 22334, 22766, 23214, 23662, 24126, 24590, 25070, + 25551, 26047, 26559, 27071, 27599, 28143, 28687, 29247, +}; + +static GstVp9ParserResult +parse_frame_marker (GstBitReader * br) +{ + guint8 frame_marker; + + VP9_READ_UINT8 (frame_marker, 2); + + if (frame_marker != GST_VP9_FRAME_MARKER) { + GST_ERROR ("Invalid VP9 Frame Marker"); + return GST_VP9_PARSER_ERROR; + } + + return GST_VP9_PARSER_OK; +} + +static GstVp9ParserResult +parse_frame_sync_code (GstBitReader * br) +{ + guint32 code; + + VP9_READ_UINT32 (code, 24); + if (code != GST_VP9_SYNC_CODE) { + GST_ERROR ("%d is not VP9 sync code", code); + return GST_VP9_PARSER_ERROR; + } + + return GST_VP9_PARSER_OK; +} + +/* 6.2.2 Color config syntax */ +static GstVp9ParserResult +parse_color_config (GstVp9StatefulParser * self, GstBitReader * br, + GstVp9FrameHeader * header) +{ + guint8 bit = 0; + + if (header->profile >= GST_VP9_PROFILE_2) { + VP9_READ_BIT (bit); + if (bit) { + header->bit_depth = GST_VP9_BIT_DEPTH_12; + } else { + header->bit_depth = GST_VP9_BIT_DEPTH_10; + } + } else { + header->bit_depth = GST_VP9_BIT_DEPTH_8; + } + + VP9_READ_UINT8 (header->color_space, 3); + if (header->color_space != GST_VP9_CS_SRGB) { + VP9_READ_BIT (header->color_range); + + if (header->profile == GST_VP9_PROFILE_1 + || header->profile == GST_VP9_PROFILE_3) { + VP9_READ_BIT (header->subsampling_x); + VP9_READ_BIT (header->subsampling_y); + + if (header->subsampling_x == 1 && header->subsampling_y == 1) { + 
GST_ERROR + ("4:2:0 subsampling is not supported in profile_1 or profile_3"); + return GST_VP9_PARSER_ERROR; + } + + /* reserved bit */ + VP9_READ_BIT (bit); + } else { + header->subsampling_y = header->subsampling_x = 1; + } + } else { + header->color_range = GST_VP9_CR_FULL; + if (header->profile == GST_VP9_PROFILE_1 + || header->profile == GST_VP9_PROFILE_3) { + /* reserved bit */ + VP9_READ_BIT (bit); + } else { + GST_ERROR + ("4:4:4 subsampling is not supported in profile_0 and profile_2"); + return GST_VP9_PARSER_ERROR; + } + } + + self->bit_depth = header->bit_depth; + self->color_space = header->color_space; + self->subsampling_x = header->subsampling_x; + self->subsampling_y = header->subsampling_y; + self->color_range = header->color_range; + + return GST_VP9_PARSER_OK; +} + +/* 6.2 Uncompressed header syntax */ +static GstVp9ParserResult +parse_profile (GstBitReader * br, guint8 * profile) +{ + guint8 profile_low_bit, profile_high_bit, ret, bit; + + VP9_READ_BIT (profile_low_bit); + VP9_READ_BIT (profile_high_bit); + + ret = (profile_high_bit << 1) | profile_low_bit; + if (ret == 3) { + /* reserved bit */ + VP9_READ_BIT (bit); + } + + *profile = ret; + + return GST_VP9_PARSER_OK; +} + +/* 6.2.6 Compute image size syntax */ +static void +compute_image_size (GstVp9StatefulParser * self, guint32 width, guint32 height) +{ + self->mi_cols = (width + 7) >> 3; + self->mi_rows = (height + 7) >> 3; + self->sb64_cols = (self->mi_cols + 7) >> 3; + self->sb64_rows = (self->mi_rows + 7) >> 3; +} + +static GstVp9ParserResult +parse_frame_or_render_size (GstBitReader * br, + guint32 * width, guint32 * height) +{ + guint32 width_minus_1; + guint32 height_minus_1; + + VP9_READ_UINT32 (width_minus_1, 16); + VP9_READ_UINT32 (height_minus_1, 16); + + *width = width_minus_1 + 1; + *height = height_minus_1 + 1; + + return GST_VP9_PARSER_OK; +} + +/* 6.2.3 Frame size syntax */ +static GstVp9ParserResult +parse_frame_size (GstVp9StatefulParser * self, GstBitReader * br, + 
guint32 * width, guint32 * height) +{ + GstVp9ParserResult rst; + + rst = parse_frame_or_render_size (br, width, height); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse frame size"); + return rst; + } + + compute_image_size (self, *width, *height); + + return GST_VP9_PARSER_OK; +} + +/* 6.2.4 Render size syntax */ +static GstVp9ParserResult +parse_render_size (GstBitReader * br, GstVp9FrameHeader * header) +{ + VP9_READ_BIT (header->render_and_frame_size_different); + if (header->render_and_frame_size_different) { + return parse_frame_or_render_size (br, + &header->render_width, &header->render_height); + } else { + header->render_width = header->width; + header->render_height = header->height; + } + + return GST_VP9_PARSER_OK; +} + +/* 6.2.5 Frame size with refs syntax */ +static GstVp9ParserResult +parse_frame_size_with_refs (GstVp9StatefulParser * self, GstBitReader * br, + GstVp9FrameHeader * header) +{ + guint8 found_ref = 0; + guint i; + + for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) { + VP9_READ_BIT (found_ref); + + if (found_ref) { + guint8 idx = header->ref_frame_idx[i]; + + header->width = self->reference[idx].width; + header->height = self->reference[idx].height; + break; + } + } + + if (found_ref == 0) { + GstVp9ParserResult rst; + + rst = parse_frame_size (self, br, &header->width, &header->height); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse frame size without refs"); + return rst; + } + } else { + compute_image_size (self, header->width, header->height); + } + + return parse_render_size (br, header); +} + +/* 6.2.7 Interpolation filter syntax */ +static GstVp9ParserResult +read_interpolation_filter (GstBitReader * br, GstVp9FrameHeader * header) +{ + static const GstVp9InterpolationFilter filter_map[] = { + GST_VP9_INTERPOLATION_FILTER_EIGHTTAP_SMOOTH, + GST_VP9_INTERPOLATION_FILTER_EIGHTTAP, + GST_VP9_INTERPOLATION_FILTER_EIGHTTAP_SHARP, + GST_VP9_INTERPOLATION_FILTER_BILINEAR + }; + guint8 is_filter_switchable; 
+ + VP9_READ_BIT (is_filter_switchable); + if (is_filter_switchable) { + header->interpolation_filter = GST_VP9_INTERPOLATION_FILTER_SWITCHABLE; + } else { + guint8 map_val; + + VP9_READ_UINT8 (map_val, 2); + header->interpolation_filter = filter_map[map_val]; + } + + return GST_VP9_PARSER_OK; +} + +/* 6.2.8 Loop filter params syntax */ +static GstVp9ParserResult +parse_loop_filter_params (GstBitReader * br, GstVp9LoopFilterParams * params) +{ + VP9_READ_UINT8 (params->loop_filter_level, 6); + VP9_READ_UINT8 (params->loop_filter_sharpness, 3); + VP9_READ_BIT (params->loop_filter_delta_enabled); + + if (params->loop_filter_delta_enabled) { + VP9_READ_BIT (params->loop_filter_delta_update); + if (params->loop_filter_delta_update) { + guint i; + + for (i = 0; i < GST_VP9_MAX_REF_LF_DELTAS; i++) { + VP9_READ_BIT (params->update_ref_delta[i]); + if (params->update_ref_delta[i]) { + VP9_READ_SIGNED_8 (params->loop_filter_ref_deltas[i], 6); + } + } + + for (i = 0; i < GST_VP9_MAX_MODE_LF_DELTAS; i++) { + VP9_READ_BIT (params->update_mode_delta[i]); + if (params->update_mode_delta[i]) + VP9_READ_SIGNED_8 (params->loop_filter_mode_deltas[i], 6); + } + } + } + + return GST_VP9_PARSER_OK; +} + +/* 6.2.10 Delta quantizer syntax */ +static inline GstVp9ParserResult +parse_delta_q (GstBitReader * br, gint8 * value) +{ + guint8 read_signed; + gint8 delta_q; + + VP9_READ_BIT (read_signed); + if (!read_signed) { + *value = 0; + return GST_VP9_PARSER_OK; + } + + VP9_READ_SIGNED_8 (delta_q, 4); + *value = delta_q; + + return GST_VP9_PARSER_OK; +} + +/* 6.2.9 Quantization params syntax */ +static GstVp9ParserResult +parse_quantization_params (GstBitReader * br, GstVp9FrameHeader * header) +{ + GstVp9QuantizationParams *params = &header->quantization_params; + GstVp9ParserResult rst; + + VP9_READ_UINT8 (params->base_q_idx, 8); + rst = parse_delta_q (br, ¶ms->delta_q_y_dc); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_delta_q (br, ¶ms->delta_q_uv_dc); + if (rst != 
GST_VP9_PARSER_OK) + return rst; + + rst = parse_delta_q (br, ¶ms->delta_q_uv_ac); + if (rst != GST_VP9_PARSER_OK) + return rst; + + header->lossless_flag = params->base_q_idx == 0 && params->delta_q_y_dc == 0 + && params->delta_q_uv_dc == 0 && params->delta_q_uv_ac == 0; + + return GST_VP9_PARSER_OK; +} + +/* 6.2.12 Probability syntax */ +static GstVp9ParserResult +read_prob (GstBitReader * br, guint8 * val) +{ + guint8 prob = GST_VP9_MAX_PROB; + guint8 prob_coded; + + VP9_READ_BIT (prob_coded); + + if (prob_coded) + VP9_READ_UINT8 (prob, 8); + + *val = prob; + + return GST_VP9_PARSER_OK; +} + +/* 6.2.11 Segmentation params syntax */ +static GstVp9ParserResult +parse_segmentation_params (GstBitReader * br, GstVp9SegmentationParams * params) +{ + guint i; + GstVp9ParserResult rst; + + params->segmentation_update_map = 0; + params->segmentation_update_data = 0; + params->segmentation_temporal_update = 0; + + VP9_READ_BIT (params->segmentation_enabled); + if (!params->segmentation_enabled) + return GST_VP9_PARSER_OK; + + VP9_READ_BIT (params->segmentation_update_map); + if (params->segmentation_update_map) { + for (i = 0; i < GST_VP9_SEG_TREE_PROBS; i++) { + rst = read_prob (br, ¶ms->segmentation_tree_probs[i]); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to read segmentation_tree_probs[%d]", i); + return rst; + } + } + + VP9_READ_BIT (params->segmentation_temporal_update); + if (params->segmentation_temporal_update) { + for (i = 0; i < GST_VP9_PREDICTION_PROBS; i++) { + rst = read_prob (br, ¶ms->segmentation_pred_prob[i]); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to read segmentation_pred_prob[%d]", i); + return rst; + } + } + } else { + for (i = 0; i < GST_VP9_PREDICTION_PROBS; i++) + params->segmentation_pred_prob[i] = GST_VP9_MAX_PROB; + } + } + + VP9_READ_BIT (params->segmentation_update_data); + if (params->segmentation_update_data) { + VP9_READ_BIT (params->segmentation_abs_or_delta_update); + + for (i = 0; i < GST_VP9_MAX_SEGMENTS; 
i++) { + VP9_READ_BIT (params->feature_enabled[i][GST_VP9_SEG_LVL_ALT_Q]); + if (params->feature_enabled[i][GST_VP9_SEG_LVL_ALT_Q]) { + VP9_READ_SIGNED_16 (params->feature_data[i][GST_VP9_SEG_LVL_ALT_Q], 8); + } else { + params->feature_data[i][GST_VP9_SEG_LVL_ALT_Q] = 0; + } + + VP9_READ_BIT (params->feature_enabled[i][GST_VP9_SEG_LVL_ALT_L]); + if (params->feature_enabled[i][GST_VP9_SEG_LVL_ALT_L]) { + VP9_READ_SIGNED_8 (params->feature_data[i][GST_VP9_SEG_LVL_ALT_L], 6); + } else { + params->feature_data[i][GST_VP9_SEG_LVL_ALT_L] = 0; + } + + VP9_READ_BIT (params->feature_enabled[i][GST_VP9_SEG_LVL_REF_FRAME]); + if (params->feature_enabled[i][GST_VP9_SEG_LVL_REF_FRAME]) { + guint8 val; + + VP9_READ_UINT8 (val, 2); + params->feature_data[i][GST_VP9_SEG_LVL_REF_FRAME] = val; + } else { + params->feature_data[i][GST_VP9_SEG_LVL_REF_FRAME] = 0; + } + + VP9_READ_BIT (params->feature_enabled[i][GST_VP9_SEG_SEG_LVL_SKIP]); + } + } + return GST_VP9_PARSER_OK; +} + +/* 6.2.14 Tile size calculation */ +static guint +calc_min_log2_tile_cols (guint32 sb64_cols) +{ + guint minLog2 = 0; + static const guint MAX_TILE_WIDTH_B64 = 64; + + while ((MAX_TILE_WIDTH_B64 << minLog2) < sb64_cols) + minLog2++; + + return minLog2; +} + +static guint +calc_max_log2_tile_cols (guint32 sb64_cols) +{ + guint maxLog2 = 1; + static const guint MIN_TILE_WIDTH_B64 = 4; + + while ((sb64_cols >> maxLog2) >= MIN_TILE_WIDTH_B64) + maxLog2++; + + return maxLog2 - 1; +} + +/* 6.2.13 Tile info syntax */ +static GstVp9ParserResult +parse_tile_info (GstVp9StatefulParser * self, GstBitReader * br, + GstVp9FrameHeader * header) +{ + guint32 minLog2TileCols = calc_min_log2_tile_cols (self->sb64_cols); + guint32 maxLog2TileCols = calc_max_log2_tile_cols (self->sb64_cols); + + header->tile_cols_log2 = minLog2TileCols; + + while (header->tile_cols_log2 < maxLog2TileCols) { + guint8 increment_tile_cols_log2; + + VP9_READ_BIT (increment_tile_cols_log2); + if (increment_tile_cols_log2) + 
header->tile_cols_log2++; + else + break; + } + + if (header->tile_cols_log2 > 6) { + GST_ERROR ("Invalid number of tile columns"); + return GST_VP9_PARSER_ERROR; + } + + VP9_READ_BIT (header->tile_rows_log2); + if (header->tile_rows_log2) { + guint8 increment_tile_rows_log2; + + VP9_READ_BIT (increment_tile_rows_log2); + header->tile_rows_log2 += increment_tile_rows_log2; + } + + return GST_VP9_PARSER_OK; +} + +/* 7.2 Uncompressed header semantics */ +static void +setup_past_independence (GstVp9StatefulParser * self, + GstVp9FrameHeader * header) +{ + memset (self->segmentation_params.feature_enabled, + 0, sizeof (self->segmentation_params.feature_enabled)); + memset (self->segmentation_params.feature_data, + 0, sizeof (self->segmentation_params.feature_data)); + + self->segmentation_params.segmentation_abs_or_delta_update = 0; + + self->loop_filter_params.loop_filter_delta_enabled = 1; + self->loop_filter_params.loop_filter_ref_deltas[GST_VP9_REF_FRAME_INTRA] = 1; + self->loop_filter_params.loop_filter_ref_deltas[GST_VP9_REF_FRAME_LAST] = 0; + self->loop_filter_params.loop_filter_ref_deltas[GST_VP9_REF_FRAME_GOLDEN] = + -1; + self->loop_filter_params.loop_filter_ref_deltas[GST_VP9_REF_FRAME_ALTREF] = + -1; + + memset (self->loop_filter_params.loop_filter_mode_deltas, 0, + sizeof (self->loop_filter_params.loop_filter_mode_deltas)); + memset (header->ref_frame_sign_bias, 0, sizeof (header->ref_frame_sign_bias)); +} + +/** + * gst_vp9_stateful_parser_new: + * + * Creates a new #GstVp9StatefulParser. It should be freed with + * gst_vp9_stateful_parser_free() after use. + * + * Returns: a new #GstVp9StatefulParser + * + * Since: 1.20 + */ +GstVp9StatefulParser * +gst_vp9_stateful_parser_new (void) +{ + GstVp9StatefulParser *parser; + + parser = g_new0 (GstVp9StatefulParser, 1); + + return parser; +} + +/** + * gst_vp9_stateful_parser_free: + * @parser: the #GstVp9StatefulParser to free + * + * Frees @parser. 
+ * + * Since: 1.20 + */ +void +gst_vp9_stateful_parser_free (GstVp9StatefulParser * parser) +{ + g_free (parser); +} + +/** + * gst_vp9_stateful_parser_parse_compressed_frame_header: + * @parser: The #GstVp9StatefulParser + * @header: The #GstVp9FrameHeader to fill + * @data: The data to parse + * @size: The size of the @data to parse + * + * Parses the compressed information in the VP9 bitstream contained in @data, + * and fills in @header with the parsed values. + * The @size argument represents the whole frame size. + * + * Returns: a #GstVp9ParserResult + * + * Since: 1.20 + */ + +GstVp9ParserResult +gst_vp9_stateful_parser_parse_compressed_frame_header (GstVp9StatefulParser * + parser, GstVp9FrameHeader * header, const guint8 * data, gsize size) +{ + GstVp9ParserResult rst = GST_VP9_PARSER_OK; + GstBitReader bit_reader; + GstBitReader *br = &bit_reader; + + gst_bit_reader_init (br, data, size); + + rst = parse_compressed_header (parser, header, br); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse the compressed header"); + return GST_VP9_PARSER_ERROR; + } + + return rst; +} + +/** + * gst_vp9_stateful_parser_parse_uncompressed_frame_header: + * @parser: The #GstVp9StatefulParser + * @header: The #GstVp9FrameHeader to fill + * @data: The data to parse + * @size: The size of the @data to parse + * + * Parses the VP9 bitstream contained in @data, and fills in @header + * with the information. The @size argument represents the whole frame size. 
+ * + * Returns: a #GstVp9ParserResult + * + * Since: 1.20 + */ +GstVp9ParserResult +gst_vp9_stateful_parser_parse_uncompressed_frame_header (GstVp9StatefulParser * + parser, GstVp9FrameHeader * header, const guint8 * data, gsize size) +{ + GstBitReader bit_reader; + GstBitReader *br = &bit_reader; + gboolean frame_is_intra = FALSE; + GstVp9ParserResult rst = GST_VP9_PARSER_OK; + guint i; + + g_return_val_if_fail (parser, GST_VP9_PARSER_ERROR); + g_return_val_if_fail (header, GST_VP9_PARSER_ERROR); + g_return_val_if_fail (data, GST_VP9_PARSER_ERROR); + g_return_val_if_fail (size, GST_VP9_PARSER_ERROR); + + gst_bit_reader_init (br, data, size); + memset (header, 0, sizeof (GstVp9FrameHeader)); + + /* Parsing Uncompressed Data Chunk */ + rst = parse_frame_marker (br); + if (rst != GST_VP9_PARSER_OK) + return rst; + + rst = parse_profile (br, &header->profile); + if (rst != GST_VP9_PARSER_OK) + return rst; + + CHECK_ALLOWED (header->profile, GST_VP9_PROFILE_0, GST_VP9_PROFILE_3); + + VP9_READ_BIT (header->show_existing_frame); + if (header->show_existing_frame) { + VP9_READ_UINT8 (header->frame_to_show_map_idx, 3); + return GST_VP9_PARSER_OK; + } + + VP9_READ_BIT (header->frame_type); + VP9_READ_BIT (header->show_frame); + VP9_READ_BIT (header->error_resilient_mode); + + if (header->frame_type == GST_VP9_KEY_FRAME) { + rst = parse_frame_sync_code (br); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Invalid VP9 sync code in keyframe"); + return rst; + } + + rst = parse_color_config (parser, br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse color config of keyframe"); + return rst; + } + + rst = parse_frame_size (parser, br, &header->width, &header->height); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse frame size of keyframe"); + return rst; + } + + rst = parse_render_size (br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse render size of keyframe"); + return rst; + } + + 
header->refresh_frame_flags = 0xff; + frame_is_intra = TRUE; + } else { + if (header->show_frame == 0) + VP9_READ_BIT (header->intra_only); + + frame_is_intra = header->intra_only; + if (header->error_resilient_mode == 0) + VP9_READ_UINT8 (header->reset_frame_context, 2); + + if (header->intra_only) { + rst = parse_frame_sync_code (br); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Invalid VP9 sync code in intra-only frame"); + return rst; + } + + if (header->profile > GST_VP9_PROFILE_0) { + rst = parse_color_config (parser, br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse color config of intra-only frame"); + return rst; + } + } else { + parser->color_space = header->color_space = GST_VP9_CS_BT_601; + parser->color_range = header->color_range = GST_VP9_CR_LIMITED; + parser->subsampling_x = parser->subsampling_y = + header->subsampling_x = header->subsampling_y = 1; + parser->bit_depth = header->bit_depth = GST_VP9_BIT_DEPTH_8; + } + + VP9_READ_UINT8 (header->refresh_frame_flags, 8); + rst = parse_frame_size (parser, br, &header->width, &header->height); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse frame size of intra-only frame"); + return rst; + } + + rst = parse_render_size (br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse render size of intra-only frame"); + return rst; + } + } else { + /* copy color_config from previously parsed one */ + header->color_space = parser->color_space; + header->color_range = parser->color_range; + header->subsampling_x = parser->subsampling_x; + header->subsampling_y = parser->subsampling_y; + header->bit_depth = parser->bit_depth; + + VP9_READ_UINT8 (header->refresh_frame_flags, 8); + for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) { + VP9_READ_UINT8 (header->ref_frame_idx[i], 3); + VP9_READ_BIT (header->ref_frame_sign_bias[GST_VP9_REF_FRAME_LAST + i]); + } + + rst = parse_frame_size_with_refs (parser, br, header); + if (rst != GST_VP9_PARSER_OK) { + 
GST_ERROR ("Failed to parse frame size with refs"); + return rst; + } + + VP9_READ_BIT (header->allow_high_precision_mv); + rst = read_interpolation_filter (br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to read interpolation filter information"); + return rst; + } + } + } + + if (!header->error_resilient_mode) { + VP9_READ_BIT (header->refresh_frame_context); + VP9_READ_BIT (header->frame_parallel_decoding_mode); + } else { + header->refresh_frame_context = 0; + header->frame_parallel_decoding_mode = 1; + } + + VP9_READ_UINT8 (header->frame_context_idx, 2); + + if (frame_is_intra || header->error_resilient_mode) + setup_past_independence (parser, header); + + /* First update our own table, and we will copy to frame header later */ + rst = parse_loop_filter_params (br, &parser->loop_filter_params); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse loop filter params"); + return rst; + } + + rst = parse_quantization_params (br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse quantization params"); + return rst; + } + + /* Also update our own table, then it will be copied later */ + rst = parse_segmentation_params (br, &parser->segmentation_params); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse segmentation params"); + return rst; + } + + rst = parse_tile_info (parser, br, header); + if (rst != GST_VP9_PARSER_OK) { + GST_ERROR ("Failed to parse tile info"); + return rst; + } + + VP9_READ_UINT16 (header->header_size_in_bytes, 16); + if (!header->header_size_in_bytes) { + GST_ERROR ("Failed to parse header size in bytes"); + return GST_VP9_PARSER_ERROR; + } + + /* copy our values to header */ + memcpy (&header->loop_filter_params, &parser->loop_filter_params, + sizeof (GstVp9LoopFilterParams)); + memcpy (&header->segmentation_params, &parser->segmentation_params, + sizeof (GstVp9SegmentationParams)); + + /* And update reference frames */ + for (i = 0; i < GST_VP9_REF_FRAMES; i++) { + guint8 
flag = (1 << i); + if ((header->refresh_frame_flags & flag) != 0) { + parser->reference[i].width = header->width; + parser->reference[i].height = header->height; + } + } + + header->frame_header_length_in_bytes = (gst_bit_reader_get_pos (br) + 7) / 8; + + + return GST_VP9_PARSER_OK; +} + +/** + * gst_vp9_seg_feature_active: + * @params: a #GstVp9SegmentationParams + * @segment_id: a segment id + * @feature: a segmentation feature + * + * An implementation of "seg_feature_active" function specified in + * "6.4.9 Segmentation feature active syntax" + * + * Returns: %TRUE if feature is active + * + * Since: 1.20 + */ +gboolean +gst_vp9_seg_feature_active (const GstVp9SegmentationParams * params, + guint8 segment_id, guint8 feature) +{ + g_return_val_if_fail (params != NULL, FALSE); + g_return_val_if_fail (segment_id < GST_VP9_MAX_SEGMENTS, FALSE); + g_return_val_if_fail (feature < GST_VP9_SEG_LVL_MAX, FALSE); + + return params->segmentation_enabled && + params->feature_enabled[segment_id][feature]; +} + +/** + * gst_vp9_get_qindex: + * @segmentation_params: a #GstVp9SegmentationParams + * @quantization_params: a #GstVp9QuantizationParams + * @segment_id: a segment id + * + * An implementation of "get_qindex" function specified in + * "8.6.1 Dequantization functions" + * + * Returns: the quantizer index + * + * Since: 1.20 + */ +guint8 +gst_vp9_get_qindex (const GstVp9SegmentationParams * segmentation_params, + const GstVp9QuantizationParams * quantization_params, guint8 segment_id) +{ + guint8 base_q_index; + + g_return_val_if_fail (segmentation_params != NULL, 0); + g_return_val_if_fail (quantization_params != NULL, 0); + g_return_val_if_fail (segment_id < GST_VP9_MAX_SEGMENTS, 0); + + base_q_index = quantization_params->base_q_idx; + + if (gst_vp9_seg_feature_active (segmentation_params, segment_id, + GST_VP9_SEG_LVL_ALT_Q)) { + gint data = + segmentation_params->feature_data[segment_id][GST_VP9_SEG_LVL_ALT_Q]; + + if 
(!segmentation_params->segmentation_abs_or_delta_update) + data += base_q_index; + + return CLAMP (data, 0, 255); + } + + return base_q_index; +} + +/** + * gst_vp9_get_dc_quant: + * @qindex: the quantizer index + * @delta_q_dc: a delta_q_dc value + * @bit_depth: coded bit depth + * + * An implementation of "dc_q" function specified in + * "8.6.1 Dequantization functions" + * + * Returns: the quantizer value for the dc coefficient + * + * Since: 1.20 + */ +gint16 +gst_vp9_get_dc_quant (guint8 qindex, gint8 delta_q_dc, guint8 bit_depth) +{ + guint8 q_table_idx = CLAMP (qindex + delta_q_dc, 0, 255); + + switch (bit_depth) { + case 8: + return dc_qlookup[q_table_idx]; + case 10: + return dc_qlookup_10[q_table_idx]; + case 12: + return dc_qlookup_12[q_table_idx]; + default: + GST_WARNING ("Unhandled bitdepth %d", bit_depth); + break; + } + + return -1; +} + +/** + * gst_vp9_get_ac_quant: + * @qindex: the quantizer index + * @delta_q_ac: a delta_q_ac value + * @bit_depth: coded bit depth + * + * An implementation of "ac_q" function specified in + * "8.6.1 Dequantization functions" + * + * Returns: the quantizer value for the ac coefficient + * + * Since: 1.20 + */ +gint16 +gst_vp9_get_ac_quant (guint8 qindex, gint8 delta_q_ac, guint8 bit_depth) +{ + guint8 q_table_idx = CLAMP (qindex + delta_q_ac, 0, 255); + + switch (bit_depth) { + case 8: + return ac_qlookup[q_table_idx]; + case 10: + return ac_qlookup_10[q_table_idx]; + case 12: + return ac_qlookup_12[q_table_idx]; + default: + GST_WARNING ("Unhandled bitdepth %d", bit_depth); + break; + } + + return -1; +}
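The dequantization helpers above reduce to a clamp followed by a table lookup: the segment-adjusted quantizer index is clamped to 0..255 and used to index the per-bit-depth lookup table. A minimal standalone sketch of that arithmetic (this is not the GStreamer API; the `sketch_*` names are hypothetical, and the table below carries only the first 16 entries of the 8-bit `ac_qlookup` shown above, so it covers small qindex values only):

```c
#include <assert.h>
#include <stdint.h>

/* Head of the 8-bit ac_qlookup table (first 16 entries, as in the source). */
static const int16_t ac_q_8bit_head[16] = {
  4, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22,
};

static int
clamp_int (int v, int lo, int hi)
{
  return v < lo ? lo : (v > hi ? hi : v);
}

/* Mirrors the gst_vp9_get_qindex() logic for a segment with the ALT_Q
 * feature active: the feature data is either an absolute qindex
 * (abs_update set) or a delta added to base_q_idx, clamped to 0..255. */
static uint8_t
sketch_get_qindex (uint8_t base_q_idx, int seg_alt_q, int abs_update)
{
  int data = seg_alt_q;

  if (!abs_update)
    data += base_q_idx;

  return (uint8_t) clamp_int (data, 0, 255);
}

/* Mirrors gst_vp9_get_ac_quant() for 8-bit content: clamp qindex + delta,
 * then look the quantizer up in the table (sketch handles idx < 16 only). */
static int16_t
sketch_get_ac_quant (uint8_t qindex, int8_t delta_q_ac)
{
  int idx = clamp_int (qindex + delta_q_ac, 0, 255);

  assert (idx < 16);            /* this sketch only carries the table head */
  return ac_q_8bit_head[idx];
}
```

The same shape applies to the DC path and the 10/12-bit tables; only the lookup table changes.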
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/gstvp9statefulparser.h
Added
@@ -0,0 +1,694 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_VP9_STATEFUL_PARSER_H__ +#define __GST_VP9_STATEFUL_PARSER_H__ + +#include <gst/codecs/codecs-prelude.h> +#include <gst/codecparsers/gstvp9parser.h> + +G_BEGIN_DECLS + +typedef struct _GstVp9StatefulParser GstVp9StatefulParser; +typedef struct _GstVp9LoopFilterParams GstVp9LoopFilterParams; +typedef struct _GstVp9QuantizationParams GstVp9QuantizationParams; +typedef struct _GstVp9SegmentationParams GstVp9SegmentationParams; +typedef struct _GstVp9MvDeltaProbs GstVp9MvDeltaProbs; +typedef struct _GstVp9DeltaProbabilities GstVp9DeltaProbabilities; +typedef struct _GstVp9FrameHeader GstVp9FrameHeader; + +/** + * GST_VP9_SEG_LVL_ALT_Q: + * + * Index for quantizer segment feature + * + * Since: 1.20 + */ +#define GST_VP9_SEG_LVL_ALT_Q 0 + +/** + * GST_VP9_SEG_LVL_ALT_L: + * + * Index for loop filter segment feature + * + * Since: 1.20 + */ +#define GST_VP9_SEG_LVL_ALT_L 1 + +/** + * GST_VP9_SEG_LVL_REF_FRAME: + * + * Index for reference frame segment feature + * + * Since: 1.20 + */ +#define GST_VP9_SEG_LVL_REF_FRAME 2 + +/** + * GST_VP9_SEG_SEG_LVL_SKIP: + * + * Index for skip segment feature + * + * Since: 1.20 
+ */ +#define GST_VP9_SEG_SEG_LVL_SKIP 3 + +/** + * GST_VP9_SEG_LVL_MAX: + * + * Number of segment features + * + * Since: 1.20 + */ +#define GST_VP9_SEG_LVL_MAX 4 +/** + * GST_VP9_TX_SIZE_CONTEXTS: + * + * Number of contexts for transform size + * + * Since: 1.20 + */ +#define GST_VP9_TX_SIZE_CONTEXTS 2 + +/** + * GST_VP9_TX_SIZES: + * + * Number of values for tx_size + * + * Since: 1.20 + * + */ +#define GST_VP9_TX_SIZES 4 + +/** + * GST_VP9_SKIP_CONTEXTS: + * + * Number of contexts for decoding skip + * + * Since: 1.20 + * + */ +#define GST_VP9_SKIP_CONTEXTS 3 + +/** + * GST_VP9_INTER_MODE_CONTEXTS: + * + * Number of contexts for inter_mode + * + * Since: 1.20 + * + */ +#define GST_VP9_INTER_MODE_CONTEXTS 7 + +/** + * GST_VP9_INTER_MODES: + * + * Number of values for inter_mode + * + * Since: 1.20 + * + */ +#define GST_VP9_INTER_MODES 4 + +/** + * GST_VP9_INTERP_FILTER_CONTEXTS: + * + * Number of contexts for interp_filter + * + * Since: 1.20 + * + */ +#define GST_VP9_INTERP_FILTER_CONTEXTS 4 + +/** + * GST_VP9_SWITCHABLE_FILTERS: + * + * Number of values for interp_filter + * + * Since: 1.20 + * + */ +#define GST_VP9_SWITCHABLE_FILTERS 3 + + +/** + * GST_VP9_IS_INTER_CONTEXTS: + * + * Number of contexts for is_inter + * + * Since: 1.20 + * + */ +#define GST_VP9_IS_INTER_CONTEXTS 4 + +/** + * GST_VP9_COMP_MODE_CONTEXTS: + * + * Number of contexts for comp_mode + * + * Since: 1.20 + * + */ +#define GST_VP9_COMP_MODE_CONTEXTS 5 + +/** + * GST_VP9_REF_CONTEXTS: + * + * Number of contexts for single_ref and comp_ref + * + * Since: 1.20 + * + */ +#define GST_VP9_REF_CONTEXTS 5 + +/** + * GST_VP9_BLOCK_SIZE_GROUPS: + * + * Number of contexts when decoding intra_mode + * + * Since: 1.20 + * + */ +#define GST_VP9_BLOCK_SIZE_GROUPS 4 + +/** + * GST_VP9_INTRA_MODES: + * + * Number of values for intra_mode + * + * Since: 1.20 + * + */ +#define GST_VP9_INTRA_MODES 10 + +/** + * GST_VP9_PARTITION_CONTEXTS: + * + * Number of contexts when decoding partition + * + * 
Since: 1.20 + * + */ +#define GST_VP9_PARTITION_CONTEXTS 16 + +/** + * GST_VP9_PARTITION_TYPES: + * + * Number of values for partition + * + * Since: 1.20 + * + */ +#define GST_VP9_PARTITION_TYPES 4 + +/** + * GST_VP9_MV_JOINTS: + * + * Number of values for mv_joint + * + * Since: 1.20 + * + */ +#define GST_VP9_MV_JOINTS 4 + +/** + * GST_VP9_MV_CLASSES: + * + * Number of values for mv_class + * + * Since: 1.20 + * + */ +#define GST_VP9_MV_CLASSES 11 + +/** + * GST_VP9_MV_OFFSET_BITS: + * + * Maximum number of bits for decoding motion vectors + * + * Since: 1.20 + * + */ +#define GST_VP9_MV_OFFSET_BITS 10 + +/** + * GST_VP9_CLASS0_SIZE: + * + * Number of values for mv_class0_bit + * + * Since: 1.20 + * + */ +#define GST_VP9_CLASS0_SIZE 2 + +/** + * GST_VP9_MV_FR_SIZE: + * + * Number of values that can be decoded for mv_fr + * + * Since: 1.20 + * + */ +#define GST_VP9_MV_FR_SIZE 4 + +/** + * GST_VP9_TX_MODES: + * + * Number of values for tx_mode + * + * Since: 1.20 + * + */ +#define GST_VP9_TX_MODES 5 + +/** + * GstVp9TxMode: + * @GST_VP9_TX_MODE_ONLY_4x4: Only 4x4 + * @GST_VP9_TX_MODE_ALLOW_8x8: Allow 8x8 + * @GST_VP9_TX_MODE_ALLOW_16x16: Allow 16x16 + * @GST_VP9_TX_MODE_ALLOW_32x32: Allow 32x32 + * @GST_VP9_TX_MODE_SELECT: The choice is specified explicitly for each block + * + * TxMode: Specifies how the transform size is determined + * + * Since: 1.20 + */ +typedef enum +{ + GST_VP9_TX_MODE_ONLY_4x4 = 0, + GST_VP9_TX_MODE_ALLOW_8x8 = 1, + GST_VP9_TX_MODE_ALLOW_16x16 = 2, + GST_VP9_TX_MODE_ALLOW_32x32 = 3, + GST_VP9_TX_MODE_SELECT = 4, + +} GstVp9TxMode; + +/** + * GstVp9ReferenceMode: + * @GST_VP9_REFERENCE_MODE_SINGLE_REFERENCE: Indicates that all the inter blocks use only a single reference frame + * @GST_VP9_REFERENCE_MODE_COMPOUND_REFERENCE: Requires all the inter blocks to use compound mode + * @GST_VP9_REFERENCE_MODE_SELECT: Allows each individual inter block to select between single and compound prediction modes + * + * Reference modes: Specify the type 
of inter prediction to be used + * + * Since: 1.20 + */ +typedef enum +{ + GST_VP9_REFERENCE_MODE_SINGLE_REFERENCE = 0, + GST_VP9_REFERENCE_MODE_COMPOUND_REFERENCE = 1, + GST_VP9_REFERENCE_MODE_SELECT = 2, +} GstVp9ReferenceMode; + +/** + * GstVp9TxSize: + * @GST_VP9_TX_4x4: 4x4 + * @GST_VP9_TX_8x8: 8x8 + * @GST_VP9_TX_16x16: 16x16 + * @GST_VP9_TX_32x32: 32x32 + * + * TxSize: Specifies the transform size + * + * Since: 1.20 + */ +typedef enum +{ + GST_VP9_TX_4x4 = 0, + GST_VP9_TX_8x8 = 1, + GST_VP9_TX_16x16 = 2, + GST_VP9_TX_32x32 = 3, +} GstVp9TxSize; + +/** + * GstVp9LoopFilterParams: + * @loop_filter_level: indicates the loop filter strength + * @loop_filter_sharpness: indicates the sharpness level + * @loop_filter_delta_enabled: equal to 1 means that the filter level depends + * on the mode and reference frame used to predict a block + * @loop_filter_delta_update: equal to 1 means that the bitstream contains + * additional syntax elements that specify which mode and reference frame + * deltas are to be updated + * @update_ref_delta: equal to 1 means that the bitstream contains the syntax + * element loop_filter_ref_delta + * @loop_filter_ref_deltas: contains the adjustment needed for the filter level + * based on the chosen reference frame + * @update_mode_delta: equal to 1 means that the bitstream contains the syntax + * element loop_filter_mode_deltas + * @loop_filter_mode_deltas: contains the adjustment needed for the filter level + * based on the chosen mode + * + * Loop filter params. See "6.2.8 Loop filter params syntax" and + * "7.2.8 Loop filter semantics". + * + * If syntax elements for @update_ref_delta + * and/or @loop_filter_mode_deltas are not present in bitstream, + * parser will fill @loop_filter_ref_deltas and @loop_filter_mode_deltas values + * by using previously parsed values. 
+ * + * Since: 1.20 + */ +struct _GstVp9LoopFilterParams +{ + guint8 loop_filter_level; + guint8 loop_filter_sharpness; + guint8 loop_filter_delta_enabled; + guint8 loop_filter_delta_update; + + guint8 update_ref_delta[GST_VP9_MAX_REF_LF_DELTAS]; + gint8 loop_filter_ref_deltas[GST_VP9_MAX_REF_LF_DELTAS]; + + guint8 update_mode_delta[GST_VP9_MAX_MODE_LF_DELTAS]; + gint8 loop_filter_mode_deltas[GST_VP9_MAX_MODE_LF_DELTAS]; +}; + +/** + * GstVp9QuantizationParams: + * @base_q_idx: indicates the base frame qindex. This is used for Y AC + * coefficients and as the base value for the other quantizers + * @delta_q_y_dc: indicates the Y DC quantizer relative to base_q_idx + * @delta_q_uv_dc: indicates the UV DC quantizer relative to base_q_idx + * @delta_q_uv_ac: indicates the UV AC quantizer relative to base_q_idx + * + * Since: 1.20 + */ +struct _GstVp9QuantizationParams +{ + guint8 base_q_idx; + gint8 delta_q_y_dc; + gint8 delta_q_uv_dc; + gint8 delta_q_uv_ac; +}; + +/** + * GstVp9SegmentationParams: + * @segmentation_enabled: equal to 1 indicates that this frame makes use of the + * segmentation tool + * @segmentation_update_map: equal to 1 indicates that the segmentation map + * should be updated during the decoding of this frame + * @segmentation_tree_probs: specify the probability values to be used when + * decoding segment_id + * @segmentation_pred_prob: specify the probability values to be used when + * decoding seg_id_predicted + * @segmentation_temporal_update: equal to 1 indicates that the updates to + * the segmentation map are coded relative to the existing segmentation map + * @segmentation_update_data: equal to 1 indicates that new parameters are + * about to be specified for each segment + * @segmentation_abs_or_delta_update: equal to 0 indicates that the segmentation + * parameters represent adjustments relative to the standard values. 
+ * equal to 1 indicates that the segmentation parameters represent the actual + * values to be used + * @feature_enabled: indicates whether the feature is enabled or not + * @feature_data: segmentation feature data + * + * See "6.2.11 Segmentation params syntax" and + * "7.2.10 Segmentation params semantics". When @segmentation_update_data is equal + * to zero, parser will fill @feature_enabled and @feature_data + * using previously parsed values. + * + * Since: 1.20 + */ +struct _GstVp9SegmentationParams +{ + guint8 segmentation_enabled; + guint8 segmentation_update_map; + guint8 segmentation_tree_probs[GST_VP9_SEG_TREE_PROBS]; + guint8 segmentation_pred_prob[GST_VP9_PREDICTION_PROBS]; + guint8 segmentation_temporal_update; + + guint8 segmentation_update_data; + guint8 segmentation_abs_or_delta_update; + + guint8 feature_enabled[GST_VP9_MAX_SEGMENTS][GST_VP9_SEG_LVL_MAX]; + gint16 feature_data[GST_VP9_MAX_SEGMENTS][GST_VP9_SEG_LVL_MAX]; +}; + +/** + * GstVp9MvDeltaProbs: + * + * Stores motion vector probability updates. This is from the spec + * and can be used as a binary. + * + * Since: 1.20 + */ +struct _GstVp9MvDeltaProbs +{ + /*< private >*/ + guint8 joint[GST_VP9_MV_JOINTS - 1]; + guint8 sign[2]; + guint8 klass[2][GST_VP9_MV_CLASSES - 1]; + guint8 class0_bit[2]; + guint8 bits[2][GST_VP9_MV_OFFSET_BITS]; + guint8 class0_fr[2][GST_VP9_CLASS0_SIZE][GST_VP9_MV_FR_SIZE - 1]; + guint8 fr[2][GST_VP9_MV_FR_SIZE - 1]; + guint8 class0_hp[2]; + guint8 hp[2]; +}; + + +/** + * GstVp9DeltaProbabilities: + * + * Stores probability updates. This is from the spec + * and can be used as a binary. 
+ * + * Since: 1.20 + */ +struct _GstVp9DeltaProbabilities +{ + /*< private >*/ + guint8 tx_probs_8x8[GST_VP9_TX_SIZE_CONTEXTS][GST_VP9_TX_SIZES - 3]; + guint8 tx_probs_16x16[GST_VP9_TX_SIZE_CONTEXTS][GST_VP9_TX_SIZES - 2]; + guint8 tx_probs_32x32[GST_VP9_TX_SIZE_CONTEXTS][GST_VP9_TX_SIZES - 1]; + guint8 coef[4][2][2][6][6][3]; + guint8 skip[GST_VP9_SKIP_CONTEXTS]; + guint8 inter_mode[GST_VP9_INTER_MODE_CONTEXTS][GST_VP9_INTER_MODES - 1]; + guint8 + interp_filter[GST_VP9_INTERP_FILTER_CONTEXTS][GST_VP9_SWITCHABLE_FILTERS + - 1]; + guint8 is_inter[GST_VP9_IS_INTER_CONTEXTS]; + guint8 comp_mode[GST_VP9_COMP_MODE_CONTEXTS]; + guint8 single_ref[GST_VP9_REF_CONTEXTS][2]; + guint8 comp_ref[GST_VP9_REF_CONTEXTS]; + guint8 y_mode[GST_VP9_BLOCK_SIZE_GROUPS][GST_VP9_INTRA_MODES - 1]; + guint8 partition[GST_VP9_PARTITION_CONTEXTS][GST_VP9_PARTITION_TYPES - 1]; + GstVp9MvDeltaProbs mv; +}; + + +/** + * GstVp9FrameHeader: + * @profile: encoded profile + * @bit_depth: encoded bit depth + * @subsampling_x: specify the chroma subsampling format for x coordinate + * @subsampling_y: specify the chroma subsampling format for y coordinate + * @color_space: specifies the color space of the stream + * @color_range: specifies the black level and range of the luma and chroma + * signals + * @show_existing_frame: equal to 1, indicates the frame indexed by + * frame_to_show_map_idx is to be displayed + * @frame_to_show_map_idx: specifies the frame to be displayed. 
+ * It is only available if show_existing_frame is 1 + * @frame_type: equal to 0 indicates that the current frame is a key frame + * @show_frame: indicates whether it is a displayable frame or not + * @error_resilient_mode: equal to 1 indicates that error resilient mode is + * enabled + * @width: coded frame width + * @height: coded frame height + * @render_and_frame_size_different: equal to 0 means that the render width and + * height are inferred from the frame width and height + * @render_width: render width of the frame + * @render_height: render height of the frame + * @intra_only: equal to 1 indicates that the frame is an intra-only frame + * @reset_frame_context: specifies whether the frame context should be reset to + * default values + * @refresh_frame_flags: contains a bitmask that specifies which reference frame + * slots will be updated with the current frame after it is decoded + * @ref_frame_idx: specifies which reference frames are used by inter frames + * @ref_frame_sign_bias: specifies the intended direction of the motion vector + * in time for each reference frame. 
A sign bias equal to 0 indicates that + * the reference frame is a backwards reference + * @allow_high_precision_mv: equal to 0 specifies that motion vectors are + * specified to quarter pel precision + * @interpolation_filter: specifies the filter selection used for performing + * inter prediction + * @refresh_frame_context: equal to 1 indicates that the probabilities computed + * for this frame should be stored for reference by future frames + * @frame_parallel_decoding_mode: equal to 1 indicates that parallel decoding + * mode is enabled + * @frame_context_idx: indicates the frame context to use + * @loop_filter_params: a #GstVp9LoopFilterParams + * @quantization_params: a #GstVp9QuantizationParams + * @segmentation_params: a #GstVp9SegmentationParams + * @tile_cols_log2: specifies the base 2 logarithm of the width of each tile + * @tile_rows_log2: specifies the base 2 logarithm of the height of each tile + * @tx_mode: specifies how the transform size is determined + * @reference_mode: is a derived syntax element that specifies the type of + * inter prediction to be used + * @delta_probabilities: modification to the probabilities encoded in the + * bitstream + * @lossless_flag: lossless mode decode + * @frame_header_length_in_bytes: length of uncompressed header + * + * Since: 1.20 + */ +/** + * GstVp9FrameHeader.tx_mode: + * + * Specifies how the transform size is determined. + * + * Since: 1.20 + */ +/** + * GstVp9FrameHeader.reference_mode: + * + * Is a derived syntax element that specifies the type of + * inter prediction to be used. + * + * Since: 1.20 + */ +/** + * GstVp9FrameHeader.delta_probabilities: + * + * Modification to the probabilities encoded in the bitstream. 
+ * + * Since: 1.20 + */ +struct _GstVp9FrameHeader +{ + guint8 profile; + guint8 bit_depth; + guint8 subsampling_x; + guint8 subsampling_y; + guint8 color_space; + guint8 color_range; + guint8 show_existing_frame; + guint8 frame_to_show_map_idx; + guint8 frame_type; + guint8 show_frame; + guint8 error_resilient_mode; + guint32 width; + guint32 height; + guint8 render_and_frame_size_different; + guint32 render_width; + guint32 render_height; + guint8 intra_only; + guint8 reset_frame_context; + guint8 refresh_frame_flags; + guint8 ref_frame_idx[GST_VP9_REFS_PER_FRAME]; + guint8 ref_frame_sign_bias[4]; + guint8 allow_high_precision_mv; + guint8 interpolation_filter; + + guint8 refresh_frame_context; + guint8 frame_parallel_decoding_mode; + guint8 frame_context_idx; + + GstVp9LoopFilterParams loop_filter_params; + GstVp9QuantizationParams quantization_params; + GstVp9SegmentationParams segmentation_params; + + guint8 tile_cols_log2; + guint8 tile_rows_log2; + + guint16 header_size_in_bytes; + + /* compressed header */ + GstVp9TxMode tx_mode; + GstVp9ReferenceMode reference_mode; + GstVp9DeltaProbabilities delta_probabilities; + + /* calculated values */ + guint8 lossless_flag; + guint32 frame_header_length_in_bytes; +}; + +/** + * GstVp9StatefulParser: + * + * Opaque VP9 parser struct. 
+ * The size of this object and member variables are not API + * + * Since: 1.20 + */ +struct _GstVp9StatefulParser +{ + /*< private >*/ + guint8 bit_depth; + guint8 subsampling_x; + guint8 subsampling_y; + guint8 color_space; + guint8 color_range; + + guint mi_cols; + guint mi_rows; + guint sb64_cols; + guint sb64_rows; + + GstVp9LoopFilterParams loop_filter_params; + GstVp9SegmentationParams segmentation_params; + + struct { + guint32 width; + guint32 height; + } reference[GST_VP9_REF_FRAMES]; +}; + +GST_CODECS_API +GstVp9StatefulParser * gst_vp9_stateful_parser_new (void); + +GST_CODECS_API +void gst_vp9_stateful_parser_free (GstVp9StatefulParser * parser); + +GST_CODECS_API +GstVp9ParserResult gst_vp9_stateful_parser_parse_compressed_frame_header (GstVp9StatefulParser * parser, + GstVp9FrameHeader * header, + const guint8 * data, + gsize size); + +GST_CODECS_API +GstVp9ParserResult gst_vp9_stateful_parser_parse_uncompressed_frame_header (GstVp9StatefulParser * parser, + GstVp9FrameHeader * header, + const guint8 * data, + gsize size); + +/* Util methods */ +GST_CODECS_API +gboolean gst_vp9_seg_feature_active (const GstVp9SegmentationParams * params, + guint8 segment_id, + guint8 feature); + +GST_CODECS_API +guint8 gst_vp9_get_qindex (const GstVp9SegmentationParams * segmentation_params, + const GstVp9QuantizationParams * quantization_params, + guint8 segment_id); + + +GST_CODECS_API +gint16 gst_vp9_get_dc_quant (guint8 qindex, + gint8 delta_q_dc, + guint8 bit_depth); + +GST_CODECS_API +gint16 gst_vp9_get_ac_quant (guint8 qindex, + gint8 delta_q_ac, + guint8 bit_depth); + +G_END_DECLS + +#endif /* __GST_VP9_STATEFUL_PARSER_H__ */
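The per-segment q index lookup that gst_vp9_get_qindex exposes above (base_q_idx, optionally replaced or offset by the segment's SEG_LVL_ALT_Q feature data, depending on segmentation_abs_or_delta_update) can be sketched self-contained. Everything below is an illustrative assumption modelled on the VP9 specification's dequantization rules, not the gstcodecs implementation: the constants, the mock struct, and the clamping range are stand-ins for the real GstVp9SegmentationParams/GstVp9QuantizationParams types.

```cpp
#include <cassert>
#include <cstdint>

/* Hypothetical stand-ins mirroring the header's layout (assumption) */
#define SEG_LVL_ALT_Q 0
#define MAX_SEGMENTS 8
#define SEG_LVL_MAX 4

struct SegParams {
  uint8_t segmentation_enabled;
  uint8_t segmentation_abs_or_delta_update;
  uint8_t feature_enabled[MAX_SEGMENTS][SEG_LVL_MAX];
  int16_t feature_data[MAX_SEGMENTS][SEG_LVL_MAX];
};

static uint8_t
get_qindex (const SegParams * seg, uint8_t base_q_idx, uint8_t segment_id)
{
  int qindex = base_q_idx;

  if (seg->segmentation_enabled &&
      seg->feature_enabled[segment_id][SEG_LVL_ALT_Q]) {
    int data = seg->feature_data[segment_id][SEG_LVL_ALT_Q];

    if (seg->segmentation_abs_or_delta_update)
      qindex = data;            /* absolute q index for this segment */
    else
      qindex += data;           /* delta relative to base_q_idx */
  }

  /* clamp to the valid 8-bit q index range */
  if (qindex < 0)
    qindex = 0;
  if (qindex > 255)
    qindex = 255;
  return (uint8_t) qindex;
}
```

With segmentation disabled (or the ALT_Q feature off for the segment) the result is simply base_q_idx, which matches the documented fallback behaviour of the real helper.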
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/codecs/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/codecs/meson.build
Changed
@@ -1,4 +1,4 @@ -codecs_sources = files([ +codecs_sources = files( 'gsth264decoder.c', 'gsth264picture.c', 'gsth265decoder.c', @@ -7,9 +7,14 @@ 'gstvp9picture.c', 'gstvp8decoder.c', 'gstvp8picture.c', -]) + 'gstmpeg2decoder.c', + 'gstmpeg2picture.c', + 'gstav1decoder.c', + 'gstav1picture.c', + 'gstvp9statefulparser.c', +) -codecs_headers = [ +codecs_headers = files( 'gsth264decoder.h', 'gsth264picture.h', 'gsth265decoder.h', @@ -18,11 +23,17 @@ 'gstvp9picture.h', 'gstvp8decoder.h', 'gstvp8picture.h', -] + 'gstmpeg2decoder.h', + 'gstmpeg2picture.h', + 'gstav1decoder.h', + 'gstav1picture.h', + 'gstvp9statefulparser.h', +) cp_args = [ '-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_CODECS', + '-DG_LOG_DOMAIN="GStreamer-Codecs"' ] gstcodecs = library('gstcodecs-' + api_version, @@ -36,28 +47,38 @@ dependencies : [gstvideo_dep, gstcodecparsers_dep], ) +library_def = {'lib': gstcodecs} +pkg_name = 'gstreamer-codecs-1.0' gen_sources = [] if build_gir - codecs_gir = gnome.generate_gir(gstcodecs, - sources : codecs_sources + codecs_headers, - namespace : 'GstCodecs', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-codecs-1.0', - includes : ['Gst-1.0', 'GstVideo-1.0'], - install : true, - extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + + gir = { + 'sources' : codecs_sources + codecs_headers, + 'namespace' : 'GstCodecs', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0', 'GstVideo-1.0'], + 'install' : true, + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/codecs/gsth264decoder.h', '--c-include=gst/codecs/gsth265decoder.h', '--c-include=gst/codecs/gstvp9decoder.h', '--c-include=gst/codecs/gstvp8decoder.h', + '--c-include=gst/codecs/gstmpeg2decoder.h', ], - dependencies : [gstvideo_dep, gstcodecparsers_dep] - ) + 'dependencies' : [gstvideo_dep, gstcodecparsers_dep] + } + 
library_def += {'gir': [gir]} + if not static_build + codecs_gir = gnome.generate_gir(gstcodecs, kwargs: gir) + gen_sources += codecs_gir + endif endif +libraries += [[pkg_name, library_def]] gstcodecs_dep = declare_dependency(link_with : gstcodecs, include_directories : [libsinc], sources: gen_sources, dependencies : [gstvideo_dep, gstcodecparsers_dep]) +meson.override_dependency(pkg_name, gstcodecs_dep) \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/d3d11-prelude.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2020 GStreamer developers + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_PRELUDE_H__ +#define __GST_D3D11_PRELUDE_H__ + +#include <gst/gst.h> + +#ifndef GST_D3D11_API +# ifdef BUILDING_GST_D3D11 +# define GST_D3D11_API GST_API_EXPORT /* from config.h */ +# else +# define GST_D3D11_API GST_API_IMPORT +# endif +#endif + +#endif /* __GST_D3D11_PRELUDE_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11.h
Added
@@ -0,0 +1,37 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_H__ +#define __GST_D3D11_H__ + +#ifndef GST_USE_UNSTABLE_API +#pragma message ("The d3d11 library from gst-plugins-bad is unstable API and may change in future.") +#pragma message ("You can define GST_USE_UNSTABLE_API to avoid this warning.") +#endif + +#include <gst/gst.h> +#include <gst/d3d11/gstd3d11config.h> +#include <gst/d3d11/gstd3d11_fwd.h> +#include <gst/d3d11/gstd3d11device.h> +#include <gst/d3d11/gstd3d11memory.h> +#include <gst/d3d11/gstd3d11bufferpool.h> +#include <gst/d3d11/gstd3d11utils.h> +#include <gst/d3d11/gstd3d11format.h> + +#endif /* __GST_D3D11_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11_fwd.h
Added
@@ -0,0 +1,84 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_FWD_H__ +#define __GST_D3D11_FWD_H__ + +#include <gst/gst.h> +#include <gst/d3d11/gstd3d11config.h> +#include <gst/d3d11/d3d11-prelude.h> + +#ifndef INITGUID +#include <initguid.h> +#endif + +#if (GST_D3D11_HEADER_VERSION >= 4) +#include <d3d11_4.h> +#elif (GST_D3D11_HEADER_VERSION >= 3) +#include <d3d11_3.h> +#elif (GST_D3D11_HEADER_VERSION >= 2) +#include <d3d11_2.h> +#elif (GST_D3D11_HEADER_VERSION >= 1) +#include <d3d11_1.h> +#else +#include <d3d11.h> +#endif + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 6) +#include <dxgi1_6.h> +#elif (GST_D3D11_DXGI_HEADER_VERSION >= 5) +#include <dxgi1_5.h> +#elif (GST_D3D11_DXGI_HEADER_VERSION >= 4) +#include <dxgi1_4.h> +#elif (GST_D3D11_DXGI_HEADER_VERSION >= 3) +#include <dxgi1_3.h> +#elif (GST_D3D11_DXGI_HEADER_VERSION >= 2) +#include <dxgi1_2.h> +#else +#include <dxgi.h> +#endif + +G_BEGIN_DECLS + +typedef struct _GstD3D11Device GstD3D11Device; +typedef struct _GstD3D11DeviceClass GstD3D11DeviceClass; +typedef struct _GstD3D11DevicePrivate GstD3D11DevicePrivate; + +typedef struct 
_GstD3D11AllocationParams GstD3D11AllocationParams; +typedef struct _GstD3D11Memory GstD3D11Memory; +typedef struct _GstD3D11MemoryPrivate GstD3D11MemoryPrivate; + +typedef struct _GstD3D11Allocator GstD3D11Allocator; +typedef struct _GstD3D11AllocatorClass GstD3D11AllocatorClass; +typedef struct _GstD3D11AllocatorPrivate GstD3D11AllocatorPrivate; + +typedef struct _GstD3D11PoolAllocator GstD3D11PoolAllocator; +typedef struct _GstD3D11PoolAllocatorClass GstD3D11PoolAllocatorClass; +typedef struct _GstD3D11PoolAllocatorPrivate GstD3D11PoolAllocatorPrivate; + +typedef struct _GstD3D11BufferPool GstD3D11BufferPool; +typedef struct _GstD3D11BufferPoolClass GstD3D11BufferPoolClass; +typedef struct _GstD3D11BufferPoolPrivate GstD3D11BufferPoolPrivate; + +typedef struct _GstD3D11Format GstD3D11Format; + +G_END_DECLS + +#endif /* __GST_D3D11_FWD_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11_private.h
Added
@@ -0,0 +1,48 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_PRIVATE_H__ +#define __GST_D3D11_PRIVATE_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11_fwd.h> + +G_BEGIN_DECLS + +void gst_d3d11_device_d3d11_debug (GstD3D11Device * device, + const gchar * file, + const gchar * function, + gint line); + +void gst_d3d11_device_dxgi_debug (GstD3D11Device * device, + const gchar * file, + const gchar * function, + gint line); + +#define GST_D3D11_CLEAR_COM(obj) G_STMT_START { \ + if (obj) { \ + (obj)->Release (); \ + (obj) = NULL; \ + } \ + } G_STMT_END + +G_END_DECLS + +#endif /* __GST_D3D11_PRIVATE_H__ */
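The GST_D3D11_CLEAR_COM helper in gstd3d11_private.h above encodes the usual COM teardown idiom: release the reference, then null the pointer so a second clear is a safe no-op. A minimal self-contained sketch, with G_STMT_START/G_STMT_END spelled out as the do/while idiom they expand to; the MockCom type is an assumption for illustration only (real call sites pass ID3D11*/IDXGI* interface pointers):

```cpp
#include <cassert>
#include <cstddef>

// Mock stand-in for a COM interface; only Release() matters here (assumption).
struct MockCom {
  int refcount = 1;
  unsigned long Release () { return (unsigned long) --refcount; }
};

// Same shape as the macro above: release if non-null, then null the pointer.
#define GST_D3D11_CLEAR_COM(obj) do { \
    if (obj) {                        \
      (obj)->Release ();              \
      (obj) = NULL;                   \
    }                                 \
  } while (0)
```

Nulling the pointer inside the macro is the design point: callers can clear the same member from several error paths without risking a double Release on an already-freed interface.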
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11bufferpool.cpp
Added
@@ -0,0 +1,566 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11bufferpool.h" +#include "gstd3d11memory.h" +#include "gstd3d11device.h" +#include "gstd3d11utils.h" + +#include <string.h> + +/** + * SECTION:gstd3d11bufferpool + * @title: GstD3D11BufferPool + * @short_description: buffer pool for #GstD3D11Memory objects + * @see_also: #GstBufferPool, #GstD3D11Memory + * + * A #GstD3D11BufferPool is an object that allocates buffers with #GstD3D11Memory + * + * A #GstD3D11BufferPool is created with gst_d3d11_buffer_pool_new() + */ + +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_buffer_pool_debug); +#define GST_CAT_DEFAULT gst_d3d11_buffer_pool_debug + +struct _GstD3D11BufferPoolPrivate +{ + GstD3D11Allocator *alloc[GST_VIDEO_MAX_PLANES]; + + GstD3D11AllocationParams *d3d11_params; + gboolean texture_array_pool; + + gint stride[GST_VIDEO_MAX_PLANES]; + gsize offset[GST_VIDEO_MAX_PLANES]; +}; + +#define gst_d3d11_buffer_pool_parent_class parent_class +G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11BufferPool, + gst_d3d11_buffer_pool, GST_TYPE_BUFFER_POOL); + +static void 
gst_d3d11_buffer_pool_dispose (GObject * object); +static const gchar **gst_d3d11_buffer_pool_get_options (GstBufferPool * pool); +static gboolean gst_d3d11_buffer_pool_set_config (GstBufferPool * pool, + GstStructure * config); +static GstFlowReturn gst_d3d11_buffer_pool_alloc_buffer (GstBufferPool * pool, + GstBuffer ** buffer, GstBufferPoolAcquireParams * params); +static GstFlowReturn gst_d3d11_buffer_pool_acquire_buffer (GstBufferPool * pool, + GstBuffer ** buffer, GstBufferPoolAcquireParams * params); +static void gst_d3d11_buffer_pool_reset_buffer (GstBufferPool * pool, + GstBuffer * buffer); +static gboolean gst_d3d11_buffer_pool_start (GstBufferPool * pool); +static gboolean gst_d3d11_buffer_pool_stop (GstBufferPool * pool); + +static void +gst_d3d11_buffer_pool_class_init (GstD3D11BufferPoolClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstBufferPoolClass *bufferpool_class = GST_BUFFER_POOL_CLASS (klass); + + gobject_class->dispose = gst_d3d11_buffer_pool_dispose; + + bufferpool_class->get_options = gst_d3d11_buffer_pool_get_options; + bufferpool_class->set_config = gst_d3d11_buffer_pool_set_config; + bufferpool_class->alloc_buffer = gst_d3d11_buffer_pool_alloc_buffer; + bufferpool_class->acquire_buffer = gst_d3d11_buffer_pool_acquire_buffer; + bufferpool_class->reset_buffer = gst_d3d11_buffer_pool_reset_buffer; + bufferpool_class->start = gst_d3d11_buffer_pool_start; + bufferpool_class->stop = gst_d3d11_buffer_pool_stop; + + GST_DEBUG_CATEGORY_INIT (gst_d3d11_buffer_pool_debug, "d3d11bufferpool", 0, + "d3d11bufferpool object"); +} + +static void +gst_d3d11_buffer_pool_init (GstD3D11BufferPool * self) +{ + self->priv = (GstD3D11BufferPoolPrivate *) + gst_d3d11_buffer_pool_get_instance_private (self); +} + +static void +gst_d3d11_buffer_pool_clear_allocator (GstD3D11BufferPool * self) +{ + GstD3D11BufferPoolPrivate *priv = self->priv; + guint i; + + for (i = 0; i < G_N_ELEMENTS (priv->alloc); i++) { + if (priv->alloc[i]) { + 
gst_d3d11_allocator_set_active (priv->alloc[i], FALSE); + gst_clear_object (&priv->alloc[i]); + } + } +} + +static void +gst_d3d11_buffer_pool_dispose (GObject * object) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (object); + GstD3D11BufferPoolPrivate *priv = self->priv; + + g_clear_pointer (&priv->d3d11_params, gst_d3d11_allocation_params_free); + gst_clear_object (&self->device); + gst_d3d11_buffer_pool_clear_allocator (self); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static const gchar ** +gst_d3d11_buffer_pool_get_options (GstBufferPool * pool) +{ + /* NOTE: d3d11 memory does not support alignment */ + static const gchar *options[] = { GST_BUFFER_POOL_OPTION_VIDEO_META, NULL }; + + return options; +} + +static gboolean +gst_d3d11_buffer_pool_set_config (GstBufferPool * pool, GstStructure * config) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); + GstD3D11BufferPoolPrivate *priv = self->priv; + GstVideoInfo info; + GstCaps *caps = NULL; + guint min_buffers, max_buffers; + gboolean ret = TRUE; + D3D11_TEXTURE2D_DESC *desc; + const GstD3D11Format *format; + gsize offset = 0; + gint i; + + if (!gst_buffer_pool_config_get_params (config, &caps, NULL, &min_buffers, + &max_buffers)) + goto wrong_config; + + if (caps == NULL) + goto no_caps; + + /* now parse the caps from the config */ + if (!gst_video_info_from_caps (&info, caps)) + goto wrong_caps; + + GST_LOG_OBJECT (pool, "%dx%d, caps %" GST_PTR_FORMAT, info.width, info.height, + caps); + + gst_d3d11_buffer_pool_clear_allocator (self); + + memset (priv->stride, 0, sizeof (priv->stride)); + memset (priv->offset, 0, sizeof (priv->offset)); + + if (priv->d3d11_params) + gst_d3d11_allocation_params_free (priv->d3d11_params); + priv->d3d11_params = + gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!priv->d3d11_params) { + /* allocate memory with resource format by default */ + priv->d3d11_params = + gst_d3d11_allocation_params_new (self->device, + &info, 
(GstD3D11AllocationFlags) 0, 0); + } + + desc = priv->d3d11_params->desc; + + /* resolution of semi-planar formats must be multiple of 2 */ + if (desc[0].Format == DXGI_FORMAT_NV12 || desc[0].Format == DXGI_FORMAT_P010 + || desc[0].Format == DXGI_FORMAT_P016) { + if (desc[0].Width % 2 || desc[0].Height % 2) { + gint width, height; + GstVideoAlignment align; + + GST_WARNING_OBJECT (self, "Resolution %dx%d is not multiple of 2, fixing", + desc[0].Width, desc[0].Height); + + width = GST_ROUND_UP_2 (desc[0].Width); + height = GST_ROUND_UP_2 (desc[0].Height); + + gst_video_alignment_reset (&align); + align.padding_right = width - desc[0].Width; + align.padding_bottom = height - desc[0].Height; + + gst_d3d11_allocation_params_alignment (priv->d3d11_params, &align); + } + } +#ifndef GST_DISABLE_GST_DEBUG + { + GST_LOG_OBJECT (self, "Direct3D11 Allocation params"); + GST_LOG_OBJECT (self, "\tD3D11AllocationFlags: 0x%x", + priv->d3d11_params->flags); + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (desc[i].Format == DXGI_FORMAT_UNKNOWN) + break; + GST_LOG_OBJECT (self, "\t[plane %d] %dx%d, DXGI format %d", + i, desc[i].Width, desc[i].Height, desc[i].Format); + GST_LOG_OBJECT (self, "\t[plane %d] MipLevel %d, ArraySize %d", + i, desc[i].MipLevels, desc[i].ArraySize); + GST_LOG_OBJECT (self, + "\t[plane %d] SampleDesc.Count %d, SampleDesc.Quality %d", + i, desc[i].SampleDesc.Count, desc[i].SampleDesc.Quality); + GST_LOG_OBJECT (self, "\t[plane %d] Usage %d", i, desc[i].Usage); + GST_LOG_OBJECT (self, + "\t[plane %d] BindFlags 0x%x", i, desc[i].BindFlags); + GST_LOG_OBJECT (self, + "\t[plane %d] CPUAccessFlags 0x%x", i, desc[i].CPUAccessFlags); + GST_LOG_OBJECT (self, + "\t[plane %d] MiscFlags 0x%x", i, desc[i].MiscFlags); + } + } +#endif + + if ((priv->d3d11_params->flags & GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY)) { + guint max_array_size = 0; + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (desc[i].Format == DXGI_FORMAT_UNKNOWN) + break; + + if (desc[i].ArraySize > 
max_array_size) + max_array_size = desc[i].ArraySize; + } + + if (max_buffers == 0 || max_buffers > max_array_size) { + GST_WARNING_OBJECT (pool, + "Array pool is requested but allowed pool size %d > ArraySize %d", + max_buffers, max_array_size); + max_buffers = max_array_size; + } + + priv->texture_array_pool = TRUE; + } else { + priv->texture_array_pool = FALSE; + } + + offset = 0; + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GstD3D11Allocator *alloc; + GstD3D11PoolAllocator *pool_alloc; + GstFlowReturn flow_ret; + GstMemory *mem = NULL; + guint stride = 0; + + if (desc[i].Format == DXGI_FORMAT_UNKNOWN) + break; + + alloc = + (GstD3D11Allocator *) gst_d3d11_pool_allocator_new (self->device, + &desc[i]); + if (!gst_d3d11_allocator_set_active (alloc, TRUE)) { + GST_ERROR_OBJECT (self, "Failed to activate allocator"); + gst_object_unref (alloc); + return FALSE; + } + + pool_alloc = GST_D3D11_POOL_ALLOCATOR (alloc); + flow_ret = gst_d3d11_pool_allocator_acquire_memory (pool_alloc, &mem); + if (flow_ret != GST_FLOW_OK) { + GST_ERROR_OBJECT (self, "Failed to allocate initial memory"); + gst_d3d11_allocator_set_active (alloc, FALSE); + gst_object_unref (alloc); + return FALSE; + } + + if (!gst_d3d11_memory_get_texture_stride (GST_D3D11_MEMORY_CAST (mem), + &stride) || stride < desc[i].Width) { + GST_ERROR_OBJECT (self, "Failed to calculate stride"); + + gst_d3d11_allocator_set_active (alloc, FALSE); + gst_object_unref (alloc); + gst_memory_unref (mem); + + return FALSE; + } + + priv->stride[i] = stride; + priv->offset[i] = offset; + offset += mem->size; + + priv->alloc[i] = alloc; + + gst_memory_unref (mem); + } + + g_assert (priv->d3d11_params->d3d11_format != NULL); + format = priv->d3d11_params->d3d11_format; + /* single texture semi-planar formats */ + if (format->dxgi_format != DXGI_FORMAT_UNKNOWN && + GST_VIDEO_INFO_N_PLANES (&info) == 2) { + priv->stride[1] = priv->stride[0]; + priv->offset[1] = priv->stride[0] * desc[0].Height; + } + + 
gst_buffer_pool_config_set_params (config, + caps, offset, min_buffers, max_buffers); + + return GST_BUFFER_POOL_CLASS (parent_class)->set_config (pool, config) && ret; + + /* ERRORS */ +wrong_config: + { + GST_WARNING_OBJECT (pool, "invalid config"); + return FALSE; + } +no_caps: + { + GST_WARNING_OBJECT (pool, "no caps in config"); + return FALSE; + } +wrong_caps: + { + GST_WARNING_OBJECT (pool, + "failed getting geometry from caps %" GST_PTR_FORMAT, caps); + return FALSE; + } +} + +static GstFlowReturn +gst_d3d11_buffer_pool_fill_buffer (GstD3D11BufferPool * self, GstBuffer * buf) +{ + GstD3D11BufferPoolPrivate *priv = self->priv; + GstFlowReturn ret = GST_FLOW_OK; + guint i; + + for (i = 0; i < G_N_ELEMENTS (priv->alloc); i++) { + GstMemory *mem = NULL; + GstD3D11PoolAllocator *alloc = GST_D3D11_POOL_ALLOCATOR (priv->alloc[i]); + + if (!alloc) + break; + + ret = gst_d3d11_pool_allocator_acquire_memory (alloc, &mem); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, "Failed to acquire memory, ret %s", + gst_flow_get_name (ret)); + return ret; + } + + gst_buffer_append_memory (buf, mem); + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_buffer_pool_alloc_buffer (GstBufferPool * pool, GstBuffer ** buffer, + GstBufferPoolAcquireParams * params) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); + GstD3D11BufferPoolPrivate *priv = self->priv; + GstD3D11AllocationParams *d3d11_params = priv->d3d11_params; + GstVideoInfo *info = &d3d11_params->info; + GstBuffer *buf; + GstFlowReturn ret = GST_FLOW_OK; + + buf = gst_buffer_new (); + /* In case of texture-array, we are releasing memory objects in + * the GstBufferPool::reset_buffer() so that GstD3D11Memory objects can be + * returned to the GstD3D11PoolAllocator. So, underlying GstD3D11Memory + * will be filled in the later GstBufferPool::acquire_buffer() call. 
+ * Therefore, don't fill the memory here in the non-texture-array case */ + if (!priv->texture_array_pool) { + ret = gst_d3d11_buffer_pool_fill_buffer (self, buf); + if (ret != GST_FLOW_OK) { + gst_buffer_unref (buf); + return ret; + } + } + + gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE, + GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_WIDTH (info), + GST_VIDEO_INFO_HEIGHT (info), GST_VIDEO_INFO_N_PLANES (info), + priv->offset, priv->stride); + + *buffer = buf; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_buffer_pool_acquire_buffer (GstBufferPool * pool, + GstBuffer ** buffer, GstBufferPoolAcquireParams * params) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); + GstD3D11BufferPoolPrivate *priv = self->priv; + GstFlowReturn ret; + + ret = GST_BUFFER_POOL_CLASS (parent_class)->acquire_buffer (pool, + buffer, params); + + if (ret != GST_FLOW_OK) + return ret; + + /* No special handling is needed for the non-texture-array case */ + if (!priv->texture_array_pool) + return ret; + + /* The base class holds an empty buffer in this case; fill in the GstMemory */ + g_assert (gst_buffer_n_memory (*buffer) == 0); + + return gst_d3d11_buffer_pool_fill_buffer (self, *buffer); +} + +static void +gst_d3d11_buffer_pool_reset_buffer (GstBufferPool * pool, GstBuffer * buffer) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); + GstD3D11BufferPoolPrivate *priv = self->priv; + + /* If we are using a texture array, we should return the GstD3D11Memory + * to the GstD3D11PoolAllocator, so that the allocator can wake up + * if it's waiting for an available memory object */ + if (priv->texture_array_pool) { + GST_LOG_OBJECT (self, "Returning memory to allocator"); + gst_buffer_remove_all_memory (buffer); + } + + GST_BUFFER_POOL_CLASS (parent_class)->reset_buffer (pool, buffer); + GST_BUFFER_FLAGS (buffer) = 0; +} + +static gboolean +gst_d3d11_buffer_pool_start (GstBufferPool * pool) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); + 
GstD3D11BufferPoolPrivate *priv = self->priv; + guint i; + gboolean ret; + + GST_DEBUG_OBJECT (self, "Start"); + + for (i = 0; i < G_N_ELEMENTS (priv->alloc); i++) { + GstD3D11Allocator *alloc = priv->alloc[i]; + + if (!alloc) + break; + + if (!gst_d3d11_allocator_set_active (alloc, TRUE)) { + GST_ERROR_OBJECT (self, "Failed to activate allocator"); + return FALSE; + } + } + + ret = GST_BUFFER_POOL_CLASS (parent_class)->start (pool); + if (!ret) { + GST_ERROR_OBJECT (self, "Failed to start"); + + for (i = 0; i < G_N_ELEMENTS (priv->alloc); i++) { + GstD3D11Allocator *alloc = priv->alloc[i]; + + if (!alloc) + break; + + gst_d3d11_allocator_set_active (alloc, FALSE); + } + + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_buffer_pool_stop (GstBufferPool * pool) +{ + GstD3D11BufferPool *self = GST_D3D11_BUFFER_POOL (pool); + GstD3D11BufferPoolPrivate *priv = self->priv; + guint i; + + GST_DEBUG_OBJECT (self, "Stop"); + + for (i = 0; i < G_N_ELEMENTS (priv->alloc); i++) { + GstD3D11Allocator *alloc = priv->alloc[i]; + + if (!alloc) + break; + + if (!gst_d3d11_allocator_set_active (alloc, FALSE)) { + GST_ERROR_OBJECT (self, "Failed to deactivate allocator"); + return FALSE; + } + } + + return GST_BUFFER_POOL_CLASS (parent_class)->stop (pool); +} + +/** + * gst_d3d11_buffer_pool_new: + * @device: a #GstD3D11Device to use + * + * Returns: a #GstBufferPool that allocates buffers with #GstD3D11Memory + * + * Since: 1.20 + */ +GstBufferPool * +gst_d3d11_buffer_pool_new (GstD3D11Device * device) +{ + GstD3D11BufferPool *pool; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + pool = (GstD3D11BufferPool *) g_object_new (GST_TYPE_D3D11_BUFFER_POOL, NULL); + gst_object_ref_sink (pool); + + pool->device = (GstD3D11Device *) gst_object_ref (device); + + return GST_BUFFER_POOL_CAST (pool); +} + +/** + * gst_buffer_pool_config_get_d3d11_allocation_params: + * @config: a buffer pool config + * + * Returns: (transfer full) (nullable): the currently 
configured + * #GstD3D11AllocationParams on @config or %NULL if @config doesn't contain + * #GstD3D11AllocationParams + * + * Since: 1.20 + */ +GstD3D11AllocationParams * +gst_buffer_pool_config_get_d3d11_allocation_params (GstStructure * config) +{ + GstD3D11AllocationParams *ret; + + if (!gst_structure_get (config, "d3d11-allocation-params", + GST_TYPE_D3D11_ALLOCATION_PARAMS, &ret, NULL)) + ret = NULL; + + return ret; +} + +/** + * gst_buffer_pool_config_set_d3d11_allocation_params: + * @config: a buffer pool config + * @params: (transfer none): a #GstD3D11AllocationParams + * + * Sets @params on @config + * + * Since: 1.20 + */ +void +gst_buffer_pool_config_set_d3d11_allocation_params (GstStructure * config, + GstD3D11AllocationParams * params) +{ + g_return_if_fail (config != NULL); + g_return_if_fail (params != NULL); + + gst_structure_set (config, "d3d11-allocation-params", + GST_TYPE_D3D11_ALLOCATION_PARAMS, params, NULL); +}
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11bufferpool.h
Added
@@ -0,0 +1,73 @@ +/* + * GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_BUFFER_POOL_H__ +#define __GST_D3D11_BUFFER_POOL_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11_fwd.h> + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_BUFFER_POOL (gst_d3d11_buffer_pool_get_type()) +#define GST_D3D11_BUFFER_POOL(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_D3D11_BUFFER_POOL, GstD3D11BufferPool)) +#define GST_D3D11_BUFFER_POOL_CLASS(klass) (G_TYPE_CHECK_CLASS((klass), GST_TYPE_D3D11_BUFFER_POOL, GstD3D11BufferPoolClass)) +#define GST_IS_D3D11_BUFFER_POOL(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_D3D11_BUFFER_POOL)) +#define GST_IS_D3D11_BUFFER_POOL_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_BUFFER_POOL)) +#define GST_D3D11_BUFFER_POOL_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_BUFFER_POOL, GstD3D11BufferPoolClass)) + +struct _GstD3D11BufferPool +{ + GstBufferPool parent; + + GstD3D11Device *device; + + /*< private >*/ + GstD3D11BufferPoolPrivate *priv; + + gpointer _gst_reserved[GST_PADDING]; +}; + +struct _GstD3D11BufferPoolClass +{ 
+ GstBufferPoolClass bufferpool_class; + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING]; +}; + +GST_D3D11_API +GType gst_d3d11_buffer_pool_get_type (void); + +GST_D3D11_API +GstBufferPool * gst_d3d11_buffer_pool_new (GstD3D11Device * device); + +GST_D3D11_API +GstD3D11AllocationParams * gst_buffer_pool_config_get_d3d11_allocation_params (GstStructure * config); + +GST_D3D11_API +void gst_buffer_pool_config_set_d3d11_allocation_params (GstStructure * config, + GstD3D11AllocationParams * params); + +G_END_DECLS + +#endif /* __GST_D3D11_BUFFER_POOL_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11config.h.meson
Added
@@ -0,0 +1,17 @@ +/* gstd3d11config.h */ + +#ifndef __GST_D3D11_CONFIG_H__ +#define __GST_D3D11_CONFIG_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +#mesondefine GST_D3D11_DXGI_HEADER_VERSION +#mesondefine GST_D3D11_HEADER_VERSION +#mesondefine GST_D3D11_WINAPI_ONLY_APP +#mesondefine GST_D3D11_WINAPI_APP + +G_END_DECLS + +#endif /* __GST_D3D11_CONFIG_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11device.cpp
Added
@@ -0,0 +1,1448 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11device.h" +#include "gstd3d11utils.h" +#include "gstd3d11format.h" +#include "gstd3d11_private.h" +#include "gstd3d11memory.h" +#include <gmodule.h> +#include <wrl.h> + +#include <windows.h> +#include <versionhelpers.h> + +/** + * SECTION:gstd3d11device + * @short_description: Direct3D11 device abstraction + * @title: GstD3D11Device + * + * #GstD3D11Device wraps ID3D11Device and ID3D11DeviceContext so that GPU + * resources can be shared among various elements. Callers can get the native + * Direct3D11 handles via getter methods. + * Unlike an OpenGL context, the Direct3D11 API does not require a dedicated + * thread, and the ID3D11Device APIs are supposed to be thread-safe. + * However, concurrent calls into ID3D11DeviceContext and the DXGI API are not allowed. 
+ * To protect such object, callers need to make use of gst_d3d11_device_lock() + * and gst_d3d11_device_unlock() + */ + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +#if HAVE_D3D11SDKLAYERS_H +#include <d3d11sdklayers.h> +static GModule *d3d11_debug_module = NULL; + +/* mingw header does not define D3D11_RLDO_IGNORE_INTERNAL + * D3D11_RLDO_SUMMARY = 0x1, + D3D11_RLDO_DETAIL = 0x2, + * D3D11_RLDO_IGNORE_INTERNAL = 0x4 + */ +#define GST_D3D11_RLDO_FLAGS (0x2 | 0x4) +#endif + +#if HAVE_DXGIDEBUG_H +#include <dxgidebug.h> +typedef HRESULT (WINAPI * DXGIGetDebugInterface_t) (REFIID riid, + void **ppDebug); +static GModule *dxgi_debug_module = NULL; +static DXGIGetDebugInterface_t GstDXGIGetDebugInterface = NULL; + +#endif + +#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H) +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_debug_layer_debug); +#endif +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_device_debug); +#define GST_CAT_DEFAULT gst_d3d11_device_debug + +enum +{ + PROP_0, + PROP_ADAPTER, + PROP_DEVICE_ID, + PROP_VENDOR_ID, + PROP_HARDWARE, + PROP_DESCRIPTION, + PROP_CREATE_FLAGS, + PROP_ADAPTER_LUID, +}; + +#define DEFAULT_ADAPTER 0 +#define DEFAULT_CREATE_FLAGS 0 + +#define GST_D3D11_N_FORMATS 25 + +struct _GstD3D11DevicePrivate +{ + guint adapter; + guint device_id; + guint vendor_id; + gboolean hardware; + gchar *description; + guint create_flags; + gint64 adapter_luid; + + ID3D11Device *device; + ID3D11DeviceContext *device_context; + + ID3D11VideoDevice *video_device; + ID3D11VideoContext *video_context; + + IDXGIFactory1 *factory; + GstD3D11Format format_table[GST_D3D11_N_FORMATS]; + + GRecMutex extern_lock; + GMutex resource_lock; + +#if HAVE_D3D11SDKLAYERS_H + ID3D11Debug *d3d11_debug; + ID3D11InfoQueue *d3d11_info_queue; +#endif + +#if HAVE_DXGIDEBUG_H + IDXGIDebug *dxgi_debug; + IDXGIInfoQueue *dxgi_info_queue; +#endif +}; + +static void +debug_init_once (void) +{ + static gsize init_once = 0; + + if (g_once_init_enter (&init_once)) { + 
GST_DEBUG_CATEGORY_INIT (gst_d3d11_device_debug, + "d3d11device", 0, "d3d11 device object"); +#if defined(HAVE_D3D11SDKLAYERS_H) || defined(HAVE_DXGIDEBUG_H) + GST_DEBUG_CATEGORY_INIT (gst_d3d11_debug_layer_debug, + "d3d11debuglayer", 0, "native d3d11 and dxgi debug"); +#endif + g_once_init_leave (&init_once, 1); + } +} + +#define gst_d3d11_device_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstD3D11Device, gst_d3d11_device, GST_TYPE_OBJECT, + G_ADD_PRIVATE (GstD3D11Device); debug_init_once ()); + +static void gst_d3d11_device_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); +static void gst_d3d11_device_dispose (GObject * object); +static void gst_d3d11_device_finalize (GObject * object); + +#if HAVE_D3D11SDKLAYERS_H +static gboolean +gst_d3d11_device_enable_d3d11_debug (void) +{ + static gsize _init = 0; + + /* If all below libraries are unavailable, d3d11 device would fail with + * D3D11_CREATE_DEVICE_DEBUG flag */ + if (g_once_init_enter (&_init)) { + d3d11_debug_module = + g_module_open ("d3d11sdklayers.dll", G_MODULE_BIND_LAZY); + + if (!d3d11_debug_module) + d3d11_debug_module = + g_module_open ("d3d11_1sdklayers.dll", G_MODULE_BIND_LAZY); + if (!d3d11_debug_module) + d3d11_debug_module = + g_module_open ("d3d11_2sdklayers.dll", G_MODULE_BIND_LAZY); + if (!d3d11_debug_module) + d3d11_debug_module = + g_module_open ("d3d11_3sdklayers.dll", G_MODULE_BIND_LAZY); + + g_once_init_leave (&_init, 1); + } + + if (d3d11_debug_module) + return TRUE; + + return FALSE; +} + +static inline GstDebugLevel +d3d11_message_severity_to_gst (D3D11_MESSAGE_SEVERITY level) +{ + switch (level) { + case D3D11_MESSAGE_SEVERITY_CORRUPTION: + case D3D11_MESSAGE_SEVERITY_ERROR: + return GST_LEVEL_ERROR; + case D3D11_MESSAGE_SEVERITY_WARNING: + return GST_LEVEL_WARNING; + case D3D11_MESSAGE_SEVERITY_INFO: + return GST_LEVEL_INFO; + case D3D11_MESSAGE_SEVERITY_MESSAGE: + return GST_LEVEL_DEBUG; + default: + break; + } + + return GST_LEVEL_LOG; 
+} + +void +gst_d3d11_device_d3d11_debug (GstD3D11Device * device, + const gchar * file, const gchar * function, gint line) +{ + GstD3D11DevicePrivate *priv = device->priv; + D3D11_MESSAGE *msg; + SIZE_T msg_len = 0; + HRESULT hr; + UINT64 num_msg, i; + ID3D11InfoQueue *info_queue = priv->d3d11_info_queue; + + if (!info_queue) + return; + + num_msg = info_queue->GetNumStoredMessages (); + + for (i = 0; i < num_msg; i++) { + GstDebugLevel level; + + hr = info_queue->GetMessage (i, NULL, &msg_len); + + if (FAILED (hr) || msg_len == 0) { + return; + } + + msg = (D3D11_MESSAGE *) g_alloca (msg_len); + hr = info_queue->GetMessage (i, msg, &msg_len); + + level = d3d11_message_severity_to_gst (msg->Severity); + if (msg->Category == D3D11_MESSAGE_CATEGORY_STATE_CREATION && + level > GST_LEVEL_ERROR) { + /* Do not warn for live object, since there would be live object + * when ReportLiveDeviceObjects was called */ + level = GST_LEVEL_INFO; + } + + gst_debug_log (gst_d3d11_debug_layer_debug, level, file, function, line, + G_OBJECT (device), "D3D11InfoQueue: %s", msg->pDescription); + } + + info_queue->ClearStoredMessages (); + + return; +} +#else +void +gst_d3d11_device_d3d11_debug (GstD3D11Device * device, + const gchar * file, const gchar * function, gint line) +{ + /* do nothing */ + return; +} +#endif + +#if HAVE_DXGIDEBUG_H +static gboolean +gst_d3d11_device_enable_dxgi_debug (void) +{ + static gsize _init = 0; + gboolean ret = FALSE; + + /* If all below libraries are unavailable, d3d11 device would fail with + * D3D11_CREATE_DEVICE_DEBUG flag */ + if (g_once_init_enter (&_init)) { +#if (!GST_D3D11_WINAPI_ONLY_APP) + dxgi_debug_module = g_module_open ("dxgidebug.dll", G_MODULE_BIND_LAZY); + + if (dxgi_debug_module) + g_module_symbol (dxgi_debug_module, + "DXGIGetDebugInterface", (gpointer *) & GstDXGIGetDebugInterface); + if (GstDXGIGetDebugInterface) + ret = TRUE; +#elif (GST_D3D11_DXGI_HEADER_VERSION >= 3) + ret = TRUE; +#endif + g_once_init_leave (&_init, 1); + } + + 
return ret; +} + +static HRESULT +gst_d3d11_device_dxgi_get_device_interface (REFIID riid, void **debug) +{ +#if (!GST_D3D11_WINAPI_ONLY_APP) + if (GstDXGIGetDebugInterface) { + return GstDXGIGetDebugInterface (riid, debug); + } +#elif (GST_D3D11_DXGI_HEADER_VERSION >= 3) + return DXGIGetDebugInterface1 (0, riid, debug); +#endif + + return E_NOINTERFACE; +} + +static inline GstDebugLevel +dxgi_info_queue_message_severity_to_gst (DXGI_INFO_QUEUE_MESSAGE_SEVERITY level) +{ + switch (level) { + case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_CORRUPTION: + case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_ERROR: + return GST_LEVEL_ERROR; + case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_WARNING: + return GST_LEVEL_WARNING; + case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_INFO: + return GST_LEVEL_INFO; + case DXGI_INFO_QUEUE_MESSAGE_SEVERITY_MESSAGE: + return GST_LEVEL_DEBUG; + default: + break; + } + + return GST_LEVEL_LOG; +} + +void +gst_d3d11_device_dxgi_debug (GstD3D11Device * device, + const gchar * file, const gchar * function, gint line) +{ + GstD3D11DevicePrivate *priv = device->priv; + DXGI_INFO_QUEUE_MESSAGE *msg; + SIZE_T msg_len = 0; + HRESULT hr; + UINT64 num_msg, i; + IDXGIInfoQueue *info_queue = priv->dxgi_info_queue; + + if (!info_queue) + return; + + num_msg = info_queue->GetNumStoredMessages (DXGI_DEBUG_ALL); + + for (i = 0; i < num_msg; i++) { + GstDebugLevel level; + + hr = info_queue->GetMessage (DXGI_DEBUG_ALL, i, NULL, &msg_len); + + if (FAILED (hr) || msg_len == 0) { + return; + } + + msg = (DXGI_INFO_QUEUE_MESSAGE *) g_alloca (msg_len); + hr = info_queue->GetMessage (DXGI_DEBUG_ALL, i, msg, &msg_len); + + level = dxgi_info_queue_message_severity_to_gst (msg->Severity); + gst_debug_log (gst_d3d11_debug_layer_debug, level, file, function, line, + G_OBJECT (device), "DXGIInfoQueue: %s", msg->pDescription); + } + + info_queue->ClearStoredMessages (DXGI_DEBUG_ALL); + + return; +} +#else +void +gst_d3d11_device_dxgi_debug (GstD3D11Device * device, + const gchar * file, const gchar * 
function, gint line) +{ + /* do nothing */ + return; +} +#endif + +static void +gst_d3d11_device_class_init (GstD3D11DeviceClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GParamFlags readable_flags = + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + gobject_class->get_property = gst_d3d11_device_get_property; + gobject_class->dispose = gst_d3d11_device_dispose; + gobject_class->finalize = gst_d3d11_device_finalize; + + g_object_class_install_property (gobject_class, PROP_ADAPTER, + g_param_spec_uint ("adapter", "Adapter", + "DXGI Adapter index for creating device", + 0, G_MAXUINT32, DEFAULT_ADAPTER, readable_flags)); + + g_object_class_install_property (gobject_class, PROP_DEVICE_ID, + g_param_spec_uint ("device-id", "Device Id", + "DXGI Device ID", 0, G_MAXUINT32, 0, readable_flags)); + + g_object_class_install_property (gobject_class, PROP_VENDOR_ID, + g_param_spec_uint ("vendor-id", "Vendor Id", + "DXGI Vendor ID", 0, G_MAXUINT32, 0, readable_flags)); + + g_object_class_install_property (gobject_class, PROP_HARDWARE, + g_param_spec_boolean ("hardware", "Hardware", + "Whether hardware device or not", TRUE, readable_flags)); + + g_object_class_install_property (gobject_class, PROP_DESCRIPTION, + g_param_spec_string ("description", "Description", + "Human readable device description", NULL, readable_flags)); + + g_object_class_install_property (gobject_class, PROP_ADAPTER_LUID, + g_param_spec_int64 ("adapter-luid", "Adapter LUID", + "DXGI Adapter LUID (Locally Unique Identifier) of created device", + G_MININT64, G_MAXINT64, 0, readable_flags)); + + gst_d3d11_memory_init_once (); +} + +static void +gst_d3d11_device_init (GstD3D11Device * self) +{ + GstD3D11DevicePrivate *priv; + + priv = (GstD3D11DevicePrivate *) + gst_d3d11_device_get_instance_private (self); + priv->adapter = DEFAULT_ADAPTER; + + g_rec_mutex_init (&priv->extern_lock); + g_mutex_init (&priv->resource_lock); + + self->priv = priv; +} + +static gboolean 
+is_windows_8_or_greater (void) +{ + static gsize version_once = 0; + static gboolean ret = FALSE; + + if (g_once_init_enter (&version_once)) { +#if (!GST_D3D11_WINAPI_ONLY_APP) + if (IsWindows8OrGreater ()) + ret = TRUE; +#else + ret = TRUE; +#endif + + g_once_init_leave (&version_once, 1); + } + + return ret; +} + +inline D3D11_FORMAT_SUPPORT +operator | (D3D11_FORMAT_SUPPORT lhs, D3D11_FORMAT_SUPPORT rhs) +{ + return static_cast < D3D11_FORMAT_SUPPORT > (static_cast < UINT > + (lhs) | static_cast < UINT > (rhs)); +} + +inline D3D11_FORMAT_SUPPORT +operator |= (D3D11_FORMAT_SUPPORT lhs, D3D11_FORMAT_SUPPORT rhs) +{ + return lhs | rhs; +} + +static gboolean +can_support_format (GstD3D11Device * self, DXGI_FORMAT format, + D3D11_FORMAT_SUPPORT extra_flags) +{ + GstD3D11DevicePrivate *priv = self->priv; + ID3D11Device *handle = priv->device; + HRESULT hr; + UINT supported; + D3D11_FORMAT_SUPPORT flags = D3D11_FORMAT_SUPPORT_TEXTURE2D; + + flags |= extra_flags; + + if (!is_windows_8_or_greater ()) { + GST_INFO_OBJECT (self, "DXGI format %d needs Windows 8 or greater", + (guint) format); + return FALSE; + } + + hr = handle->CheckFormatSupport (format, &supported); + if (FAILED (hr)) { + GST_DEBUG_OBJECT (self, "DXGI format %d is not supported by device", + (guint) format); + return FALSE; + } + + if ((supported & flags) != flags) { + GST_DEBUG_OBJECT (self, + "DXGI format %d doesn't support flag 0x%x (supported flag 0x%x)", + (guint) format, (guint) supported, (guint) flags); + return FALSE; + } + + GST_INFO_OBJECT (self, "Device supports DXGI format %d", (guint) format); + + return TRUE; +} + +static void +gst_d3d11_device_setup_format_table (GstD3D11Device * self) +{ + GstD3D11DevicePrivate *priv = self->priv; + guint n_formats = 0; + + /* RGB formats */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_BGRA; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_B8G8R8A8_UNORM; + priv->format_table[n_formats].dxgi_format = 
DXGI_FORMAT_B8G8R8A8_UNORM; + n_formats++; + + /* Identical to BGRA, but alpha will be ignored */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_BGRx; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_B8G8R8A8_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_B8G8R8A8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_RGBA; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8G8B8A8_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R8G8B8A8_UNORM; + n_formats++; + + /* Identical to RGBA, but alpha will be ignored */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_RGBx; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8G8B8A8_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R8G8B8A8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_RGB10A2_LE; + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_R10G10B10A2_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R10G10B10A2_UNORM; + n_formats++; + + /* YUV packed */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_VUYA; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8G8B8A8_UNORM; + if (can_support_format (self, DXGI_FORMAT_AYUV, + D3D11_FORMAT_SUPPORT_RENDER_TARGET | + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_AYUV; + else + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + /* FIXME: the d3d11 sampler doesn't support packed-and-subsampled formats + * very well (and it's really poorly documented). + * As per observation, d3d11 samplers seem to drop the second + * Y component from a "Y0-U0-Y1-V0" pair, which results in worse visual quality + * than 4:2:0 subsampled formats. 
We should revisit this later */ + + /* TODO: The best approach would be using a d3d11 compute shader to handle these kinds of + * samples, but a compute shader is not implemented yet by us. + * + * Another simple approach is using the d3d11 video processor, + * but its capability is very device dependent because it relies on the + * GPU vendor's driver implementation; moreover, the software fallback does + * not support the d3d11 video processor. So it's not reliable in this case */ +#if 0 + /* NOTE: packed yuv 4:2:2 YUY2, UYVY, and VYUY formats are not natively + * supported render target view formats + * (i.e., cannot be output format of shader pipeline) */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_YUY2; + if (can_support_format (self, DXGI_FORMAT_YUY2, + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) { + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_R8G8B8A8_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_YUY2; + } else { + /* If the DXGI_FORMAT_YUY2 format is not supported, use this format, + * it's analogous to YUY2 */ + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_G8R8_G8B8_UNORM; + } + n_formats++; + + /* No native DXGI format available for UYVY */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_UYVY; + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_R8G8_B8G8_UNORM; + n_formats++; + + /* No native DXGI format available for VYUY */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_VYUY; + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_R8G8_B8G8_UNORM; + n_formats++; + + /* Y210 and Y410 formats cannot support rtv */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y210; + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_R16G16B16A16_UNORM; + if (can_support_format (self, DXGI_FORMAT_Y210, + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_Y210; + else + priv->format_table[n_formats].dxgi_format = 
DXGI_FORMAT_UNKNOWN; + n_formats++; +#endif + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y410; + priv->format_table[n_formats].resource_format[0] = + DXGI_FORMAT_R10G10B10A2_UNORM; + if (can_support_format (self, DXGI_FORMAT_Y410, + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_Y410; + else + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + /* YUV semi-planar */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_NV12; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8G8_UNORM; + if (can_support_format (self, DXGI_FORMAT_NV12, + D3D11_FORMAT_SUPPORT_RENDER_TARGET | + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_NV12; + else + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + /* no native format for NV21 */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_NV21; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8G8_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_P010_10LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16G16_UNORM; + if (can_support_format (self, DXGI_FORMAT_P010, + D3D11_FORMAT_SUPPORT_RENDER_TARGET | + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_P010; + else + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + /* P012 is identical to P016 from runtime point of view */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_P012_LE; + priv->format_table[n_formats].resource_format[0] = 
DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16G16_UNORM; + if (can_support_format (self, DXGI_FORMAT_P016, + D3D11_FORMAT_SUPPORT_RENDER_TARGET | + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_P016; + else + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_P016_LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16G16_UNORM; + if (can_support_format (self, DXGI_FORMAT_P016, + D3D11_FORMAT_SUPPORT_RENDER_TARGET | + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE)) + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_P016; + else + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_UNKNOWN; + n_formats++; + + /* YUV planar */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I420; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_YV12; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I420_10LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I420_12LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + 
priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y42B; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I422_10LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_I422_12LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y444; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y444_10LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y444_12LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = 
DXGI_FORMAT_R16_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_Y444_16LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[1] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].resource_format[2] = DXGI_FORMAT_R16_UNORM; + n_formats++; + + /* GRAY */ + /* NOTE: To support conversion by using video processor, + * mark DXGI_FORMAT_{R8,R16}_UNORM formats as known dxgi_format. + * Otherwise, d3d11 elements will not try to use video processor for + * those formats */ + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_GRAY8; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R8_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R8_UNORM; + n_formats++; + + priv->format_table[n_formats].format = GST_VIDEO_FORMAT_GRAY16_LE; + priv->format_table[n_formats].resource_format[0] = DXGI_FORMAT_R16_UNORM; + priv->format_table[n_formats].dxgi_format = DXGI_FORMAT_R16_UNORM; + n_formats++; + + g_assert (n_formats == GST_D3D11_N_FORMATS); +} + +static void +gst_d3d11_device_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11Device *self = GST_D3D11_DEVICE (object); + GstD3D11DevicePrivate *priv = self->priv; + + switch (prop_id) { + case PROP_ADAPTER: + g_value_set_uint (value, priv->adapter); + break; + case PROP_DEVICE_ID: + g_value_set_uint (value, priv->device_id); + break; + case PROP_VENDOR_ID: + g_value_set_uint (value, priv->vendor_id); + break; + case PROP_HARDWARE: + g_value_set_boolean (value, priv->hardware); + break; + case PROP_DESCRIPTION: + g_value_set_string (value, priv->description); + break; + case PROP_ADAPTER_LUID: + g_value_set_int64 (value, priv->adapter_luid); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_device_dispose (GObject * object) +{ + GstD3D11Device *self = GST_D3D11_DEVICE 
(object); + GstD3D11DevicePrivate *priv = self->priv; + + GST_LOG_OBJECT (self, "dispose"); + + GST_D3D11_CLEAR_COM (priv->video_device); + GST_D3D11_CLEAR_COM (priv->video_context); + GST_D3D11_CLEAR_COM (priv->device); + GST_D3D11_CLEAR_COM (priv->device_context); + GST_D3D11_CLEAR_COM (priv->factory); +#if HAVE_D3D11SDKLAYERS_H + if (priv->d3d11_debug) { + priv->d3d11_debug->ReportLiveDeviceObjects ((D3D11_RLDO_FLAGS) + GST_D3D11_RLDO_FLAGS); + } + GST_D3D11_CLEAR_COM (priv->d3d11_debug); + + if (priv->d3d11_info_queue) + gst_d3d11_device_d3d11_debug (self, __FILE__, GST_FUNCTION, __LINE__); + + GST_D3D11_CLEAR_COM (priv->d3d11_info_queue); +#endif + +#if HAVE_DXGIDEBUG_H + if (priv->dxgi_debug) { + priv->dxgi_debug->ReportLiveObjects (DXGI_DEBUG_ALL, + (DXGI_DEBUG_RLO_FLAGS) GST_D3D11_RLDO_FLAGS); + } + GST_D3D11_CLEAR_COM (priv->dxgi_debug); + + if (priv->dxgi_info_queue) + gst_d3d11_device_dxgi_debug (self, __FILE__, GST_FUNCTION, __LINE__); + + GST_D3D11_CLEAR_COM (priv->dxgi_info_queue); +#endif + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_d3d11_device_finalize (GObject * object) +{ + GstD3D11Device *self = GST_D3D11_DEVICE (object); + GstD3D11DevicePrivate *priv = self->priv; + + GST_LOG_OBJECT (self, "finalize"); + + g_rec_mutex_clear (&priv->extern_lock); + g_mutex_clear (&priv->resource_lock); + g_free (priv->description); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +typedef enum +{ + DEVICE_CONSTRUCT_FOR_ADAPTER_INDEX, + DEVICE_CONSTRUCT_FOR_ADAPTER_LUID, + DEVICE_CONSTRUCT_WRAPPED, +} GstD3D11DeviceConstructType; + +typedef struct _GstD3D11DeviceConstructData +{ + union + { + guint adapter_index; + gint64 adapter_luid; + ID3D11Device *device; + } data; + GstD3D11DeviceConstructType type; + UINT create_flags; +} GstD3D11DeviceConstructData; + +static HRESULT +_gst_d3d11_device_get_adapter (const GstD3D11DeviceConstructData * data, + IDXGIFactory1 * factory, guint * index, DXGI_ADAPTER_DESC * adapter_desc, + 
IDXGIAdapter1 ** dxgi_adapter) +{ + HRESULT hr = S_OK; + ComPtr < IDXGIAdapter1 > adapter1; + DXGI_ADAPTER_DESC desc; + + switch (data->type) { + case DEVICE_CONSTRUCT_FOR_ADAPTER_INDEX: + { + hr = factory->EnumAdapters1 (data->data.adapter_index, &adapter1); + if (FAILED (hr)) + return hr; + + hr = adapter1->GetDesc (&desc); + if (FAILED (hr)) + return hr; + + *index = data->data.adapter_index; + *adapter_desc = desc; + *dxgi_adapter = adapter1.Detach (); + + return S_OK; + } + case DEVICE_CONSTRUCT_FOR_ADAPTER_LUID: + { + for (guint i = 0;; i++) { + gint64 luid; + + adapter1 = nullptr; + + hr = factory->EnumAdapters1 (i, &adapter1); + if (FAILED (hr)) + return hr; + + hr = adapter1->GetDesc (&desc); + if (FAILED (hr)) + continue; + + luid = gst_d3d11_luid_to_int64 (&desc.AdapterLuid); + if (luid != data->data.adapter_luid) + continue; + + *index = i; + *adapter_desc = desc; + *dxgi_adapter = adapter1.Detach (); + + return S_OK; + } + + return E_FAIL; + } + case DEVICE_CONSTRUCT_WRAPPED: + { + ComPtr < IDXGIDevice > dxgi_device; + ComPtr < IDXGIAdapter > adapter; + ID3D11Device *device = data->data.device; + gint64 luid; + + hr = device->QueryInterface (IID_PPV_ARGS (&dxgi_device)); + if (FAILED (hr)) + return hr; + + hr = dxgi_device->GetAdapter (&adapter); + if (FAILED (hr)) + return hr; + + hr = adapter.As (&adapter1); + if (FAILED (hr)) + return hr; + + hr = adapter1->GetDesc (&desc); + if (FAILED (hr)) + return hr; + + luid = gst_d3d11_luid_to_int64 (&desc.AdapterLuid); + + for (guint i = 0;; i++) { + DXGI_ADAPTER_DESC tmp_desc; + ComPtr < IDXGIAdapter1 > tmp; + + hr = factory->EnumAdapters1 (i, &tmp); + if (FAILED (hr)) + return hr; + + hr = tmp->GetDesc (&tmp_desc); + if (FAILED (hr)) + continue; + + if (luid != gst_d3d11_luid_to_int64 (&tmp_desc.AdapterLuid)) + continue; + + *index = i; + *adapter_desc = desc; + *dxgi_adapter = adapter1.Detach (); + + return S_OK; + } + + return E_FAIL; + } + default: + g_assert_not_reached (); + break; + } + + return 
E_FAIL; +} + +static void +gst_d3d11_device_setup_debug_layer (GstD3D11Device * self) +{ +#if HAVE_DXGIDEBUG_H + if (gst_debug_category_get_threshold (gst_d3d11_debug_layer_debug) > + GST_LEVEL_ERROR) { + GstD3D11DevicePrivate *priv = self->priv; + + if (gst_d3d11_device_enable_dxgi_debug ()) { + IDXGIDebug *debug = nullptr; + IDXGIInfoQueue *info_queue = nullptr; + HRESULT hr; + + GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, + "dxgi debug library was loaded"); + hr = gst_d3d11_device_dxgi_get_device_interface (IID_PPV_ARGS (&debug)); + + if (SUCCEEDED (hr)) { + GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, + "IDXGIDebug interface available"); + priv->dxgi_debug = debug; + + hr = gst_d3d11_device_dxgi_get_device_interface (IID_PPV_ARGS + (&info_queue)); + if (SUCCEEDED (hr)) { + GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, + "IDXGIInfoQueue interface available"); + priv->dxgi_info_queue = info_queue; + } + } + } else { + GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, + "couldn't load dxgi debug library"); + } + } +#endif + +#if HAVE_D3D11SDKLAYERS_H + if ((self->priv->create_flags & D3D11_CREATE_DEVICE_DEBUG) != 0) { + GstD3D11DevicePrivate *priv = self->priv; + ID3D11Debug *debug; + ID3D11InfoQueue *info_queue; + HRESULT hr; + + hr = priv->device->QueryInterface (IID_PPV_ARGS (&debug)); + + if (SUCCEEDED (hr)) { + GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, + "D3D11Debug interface available"); + priv->d3d11_debug = debug; + + hr = priv->device->QueryInterface (IID_PPV_ARGS (&info_queue)); + if (SUCCEEDED (hr)) { + GST_CAT_INFO_OBJECT (gst_d3d11_debug_layer_debug, self, + "ID3D11InfoQueue interface available"); + priv->d3d11_info_queue = info_queue; + } + } + } +#endif +} + +static GstD3D11Device * +gst_d3d11_device_new_internal (const GstD3D11DeviceConstructData * data) +{ + ComPtr < IDXGIAdapter1 > adapter; + ComPtr < IDXGIFactory1 > factory; + ComPtr < ID3D11Device > device; + ComPtr < ID3D11DeviceContext > 
device_context; + HRESULT hr; + UINT create_flags; + guint adapter_index = 0; + DXGI_ADAPTER_DESC adapter_desc; + static const D3D_FEATURE_LEVEL feature_levels[] = { + D3D_FEATURE_LEVEL_11_1, + D3D_FEATURE_LEVEL_11_0, + D3D_FEATURE_LEVEL_10_1, + D3D_FEATURE_LEVEL_10_0, + D3D_FEATURE_LEVEL_9_3, + D3D_FEATURE_LEVEL_9_2, + D3D_FEATURE_LEVEL_9_1 + }; + D3D_FEATURE_LEVEL selected_level; + + debug_init_once (); + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (!gst_d3d11_result (hr, NULL)) { + GST_ERROR ("cannot create dxgi factory, hr: 0x%x", (guint) hr); + return nullptr; + } + + create_flags = 0; + if (data->type != DEVICE_CONSTRUCT_WRAPPED) { + create_flags = data->create_flags; +#if HAVE_D3D11SDKLAYERS_H + if (gst_debug_category_get_threshold (gst_d3d11_debug_layer_debug) > + GST_LEVEL_ERROR) { + /* DirectX SDK should be installed on system for this */ + if (gst_d3d11_device_enable_d3d11_debug ()) { + GST_CAT_INFO (gst_d3d11_debug_layer_debug, + "d3d11 debug library was loaded"); + create_flags |= D3D11_CREATE_DEVICE_DEBUG; + } else { + GST_CAT_INFO (gst_d3d11_debug_layer_debug, + "couldn't load d3d11 debug library"); + } + } +#endif + } + + /* Ensure valid device handle */ + if (data->type == DEVICE_CONSTRUCT_WRAPPED) { + ID3D11Device *external_device = data->data.device; + + hr = external_device->QueryInterface (IID_PPV_ARGS (&device)); + if (FAILED (hr)) { + GST_ERROR ("Not a valid external ID3D11Device handle"); + return nullptr; + } + + device->GetImmediateContext (&device_context); + } + + hr = _gst_d3d11_device_get_adapter (data, factory.Get (), &adapter_index, + &adapter_desc, &adapter); + if (FAILED (hr)) { + GST_INFO ("Failed to get DXGI adapter"); + return nullptr; + } + + if (data->type != DEVICE_CONSTRUCT_WRAPPED) { + hr = D3D11CreateDevice (adapter.Get (), D3D_DRIVER_TYPE_UNKNOWN, + NULL, create_flags, feature_levels, G_N_ELEMENTS (feature_levels), + D3D11_SDK_VERSION, &device, &selected_level, &device_context); + + if (FAILED (hr)) { + /* 
Retry if the system could not recognize D3D_FEATURE_LEVEL_11_1 */ + hr = D3D11CreateDevice (adapter.Get (), D3D_DRIVER_TYPE_UNKNOWN, + NULL, create_flags, &feature_levels[1], + G_N_ELEMENTS (feature_levels) - 1, D3D11_SDK_VERSION, &device, + &selected_level, &device_context); + } + + /* if D3D11_CREATE_DEVICE_DEBUG was enabled but couldn't create device, + * try it without the flag again */ + if (FAILED (hr) && (create_flags & D3D11_CREATE_DEVICE_DEBUG) != 0) { + create_flags &= ~D3D11_CREATE_DEVICE_DEBUG; + + hr = D3D11CreateDevice (adapter.Get (), D3D_DRIVER_TYPE_UNKNOWN, + NULL, create_flags, feature_levels, G_N_ELEMENTS (feature_levels), + D3D11_SDK_VERSION, &device, &selected_level, &device_context); + + if (FAILED (hr)) { + /* Retry if the system could not recognize D3D_FEATURE_LEVEL_11_1 */ + hr = D3D11CreateDevice (adapter.Get (), D3D_DRIVER_TYPE_UNKNOWN, + NULL, create_flags, &feature_levels[1], + G_N_ELEMENTS (feature_levels) - 1, D3D11_SDK_VERSION, &device, + &selected_level, &device_context); + } + } + } + + if (FAILED (hr)) { + switch (data->type) { + case DEVICE_CONSTRUCT_FOR_ADAPTER_INDEX: + { + GST_INFO ("Failed to create d3d11 device for adapter index %d" + " with flags 0x%x, hr: 0x%x", data->data.adapter_index, + create_flags, (guint) hr); + return nullptr; + } + case DEVICE_CONSTRUCT_FOR_ADAPTER_LUID: + { + GST_ERROR ("Failed to create d3d11 device for adapter luid %" + G_GINT64_FORMAT " with flags 0x%x, hr: 0x%x", + data->data.adapter_luid, create_flags, (guint) hr); + return nullptr; + } + default: + break; + } + + return nullptr; + } + + GstD3D11Device *self = nullptr; + GstD3D11DevicePrivate *priv; + + self = (GstD3D11Device *) g_object_new (GST_TYPE_D3D11_DEVICE, nullptr); + gst_object_ref_sink (self); + + priv = self->priv; + + priv->adapter = adapter_index; + priv->device = device.Detach (); + priv->device_context = device_context.Detach (); + priv->factory = factory.Detach (); + + priv->vendor_id = adapter_desc.VendorId; + priv->device_id 
= adapter_desc.DeviceId; + priv->description = g_utf16_to_utf8 ((gunichar2 *) adapter_desc.Description, + -1, nullptr, nullptr, nullptr); + priv->adapter_luid = gst_d3d11_luid_to_int64 (&adapter_desc.AdapterLuid); + + DXGI_ADAPTER_DESC1 desc1; + hr = adapter->GetDesc1 (&desc1); + + /* DXGI_ADAPTER_FLAG_SOFTWARE is missing in dxgi.h of mingw */ + if (SUCCEEDED (hr) && (desc1.Flags & 0x2) != 0x2) + priv->hardware = TRUE; + + priv->create_flags = create_flags; + gst_d3d11_device_setup_format_table (self); + gst_d3d11_device_setup_debug_layer (self); + + return self; +} + +/** + * gst_d3d11_device_new: + * @adapter_index: the index of adapter for creating d3d11 device + * @flags: a D3D11_CREATE_DEVICE_FLAG value used for creating d3d11 device + * + * Returns: (transfer full) (nullable): a new #GstD3D11Device for @adapter_index + * or %NULL when failed to create D3D11 device with given adapter index. + * + * Since: 1.20 + */ +GstD3D11Device * +gst_d3d11_device_new (guint adapter_index, guint flags) +{ + GstD3D11DeviceConstructData data; + + data.data.adapter_index = adapter_index; + data.type = DEVICE_CONSTRUCT_FOR_ADAPTER_INDEX; + data.create_flags = flags; + + return gst_d3d11_device_new_internal (&data); +} + +/** + * gst_d3d11_device_new_for_adapter_luid: + * @adapter_luid: an int64 representation of the DXGI adapter LUID + * @flags: a D3D11_CREATE_DEVICE_FLAG value used for creating d3d11 device + * + * Returns: (transfer full) (nullable): a new #GstD3D11Device for @adapter_luid + * or %NULL when failed to create D3D11 device with given adapter luid. 
+ * + * Since: 1.20 + */ +GstD3D11Device * +gst_d3d11_device_new_for_adapter_luid (gint64 adapter_luid, guint flags) +{ + GstD3D11DeviceConstructData data; + + data.data.adapter_luid = adapter_luid; + data.type = DEVICE_CONSTRUCT_FOR_ADAPTER_LUID; + data.create_flags = flags; + + return gst_d3d11_device_new_internal (&data); +} + +/** + * gst_d3d11_device_new_wrapped: + * @device: (transfer none): an existing ID3D11Device handle + * + * Returns: (transfer full) (nullable): a new #GstD3D11Device for @device + * or %NULL if an error occurred + * + * Since: 1.20 + */ +GstD3D11Device * +gst_d3d11_device_new_wrapped (ID3D11Device * device) +{ + GstD3D11DeviceConstructData data; + + g_return_val_if_fail (device != nullptr, nullptr); + + data.data.device = device; + data.type = DEVICE_CONSTRUCT_WRAPPED; + data.create_flags = 0; + + return gst_d3d11_device_new_internal (&data); +} + +/** + * gst_d3d11_device_get_device_handle: + * @device: a #GstD3D11Device + * + * Used for various D3D11 APIs directly. Caller must not destroy returned device + * object. + * + * Returns: (transfer none): the ID3D11Device handle + * + * Since: 1.20 + */ +ID3D11Device * +gst_d3d11_device_get_device_handle (GstD3D11Device * device) +{ + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + return device->priv->device; +} + +/** + * gst_d3d11_device_get_device_context_handle: + * @device: a #GstD3D11Device + * + * Used for various D3D11 APIs directly. Caller must not destroy returned device + * object. Any ID3D11DeviceContext call needs to be protected by + * gst_d3d11_device_lock() and gst_d3d11_device_unlock() method. 
+ * + * Returns: (transfer none): the immediate ID3D11DeviceContext handle + * + * Since: 1.20 + */ +ID3D11DeviceContext * +gst_d3d11_device_get_device_context_handle (GstD3D11Device * device) +{ + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + return device->priv->device_context; +} + +/** + * gst_d3d11_device_get_dxgi_factory_handle: + * @device: a #GstD3D11Device + * + * Used for various D3D11 APIs directly. Caller must not destroy returned device + * object. + * + * Returns: (transfer none): the IDXGIFactory1 handle + * + * Since: 1.20 + */ +IDXGIFactory1 * +gst_d3d11_device_get_dxgi_factory_handle (GstD3D11Device * device) +{ + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + return device->priv->factory; +} + +/** + * gst_d3d11_device_get_video_device_handle: + * @device: a #GstD3D11Device + * + * Used for various D3D11 APIs directly. Caller must not destroy returned device + * object. + * + * Returns: (nullable) (transfer none): the ID3D11VideoDevice handle or %NULL + * if ID3D11VideoDevice is unavailable. + * + * Since: 1.20 + */ +ID3D11VideoDevice * +gst_d3d11_device_get_video_device_handle (GstD3D11Device * device) +{ + GstD3D11DevicePrivate *priv; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + priv = device->priv; + g_mutex_lock (&priv->resource_lock); + if (!priv->video_device) { + HRESULT hr; + ID3D11VideoDevice *video_device = NULL; + + hr = priv->device->QueryInterface (IID_PPV_ARGS (&video_device)); + if (gst_d3d11_result (hr, device)) + priv->video_device = video_device; + } + g_mutex_unlock (&priv->resource_lock); + + return priv->video_device; +} + +/** + * gst_d3d11_device_get_video_context_handle: + * @device: a #GstD3D11Device + * + * Used for various D3D11 APIs directly. Caller must not destroy returned device + * object. + * + * Returns: (nullable) (transfer none): the ID3D11VideoContext handle or %NULL + * if ID3D11VideoContext is unavailable. 
+ * + * Since: 1.20 + */ +ID3D11VideoContext * +gst_d3d11_device_get_video_context_handle (GstD3D11Device * device) +{ + GstD3D11DevicePrivate *priv; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + priv = device->priv; + g_mutex_lock (&priv->resource_lock); + if (!priv->video_context) { + HRESULT hr; + ID3D11VideoContext *video_context = NULL; + + hr = priv->device_context->QueryInterface (IID_PPV_ARGS (&video_context)); + if (gst_d3d11_result (hr, device)) + priv->video_context = video_context; + } + g_mutex_unlock (&priv->resource_lock); + + return priv->video_context; +} + +/** + * gst_d3d11_device_lock: + * @device: a #GstD3D11Device + * + * Take lock for @device. Any thread-unsafe API call needs to be + * protected by this method. This call must be paired with + * gst_d3d11_device_unlock() + * + * Since: 1.20 + */ +void +gst_d3d11_device_lock (GstD3D11Device * device) +{ + GstD3D11DevicePrivate *priv; + + g_return_if_fail (GST_IS_D3D11_DEVICE (device)); + + priv = device->priv; + + GST_TRACE_OBJECT (device, "device locking"); + g_rec_mutex_lock (&priv->extern_lock); + GST_TRACE_OBJECT (device, "device locked"); +} + +/** + * gst_d3d11_device_unlock: + * @device: a #GstD3D11Device + * + * Release lock for @device. 
This call must be paired with + * gst_d3d11_device_lock() + * + * Since: 1.20 + */ +void +gst_d3d11_device_unlock (GstD3D11Device * device) +{ + GstD3D11DevicePrivate *priv; + + g_return_if_fail (GST_IS_D3D11_DEVICE (device)); + + priv = device->priv; + + g_rec_mutex_unlock (&priv->extern_lock); + GST_TRACE_OBJECT (device, "device unlocked"); +} + +/** + * gst_d3d11_device_format_from_gst: + * @device: a #GstD3D11Device + * @format: a #GstVideoFormat + * + * Returns: (transfer none) (nullable): a pointer to #GstD3D11Format + * or %NULL if @format is not supported by @device + * + * Since: 1.20 + */ +const GstD3D11Format * +gst_d3d11_device_format_from_gst (GstD3D11Device * device, + GstVideoFormat format) +{ + GstD3D11DevicePrivate *priv; + guint i; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + priv = device->priv; + + for (i = 0; i < G_N_ELEMENTS (priv->format_table); i++) { + if (priv->format_table[i].format == format) + return &priv->format_table[i]; + } + + return NULL; +}
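The adapter lookup in `_gst_d3d11_device_get_adapter` above matches adapters by collapsing the two 32-bit halves of a DXGI adapter LUID into a single `gint64` via `gst_d3d11_luid_to_int64()`, so candidates can be compared with one integer comparison. A minimal self-contained sketch of that packing — `luid_t` and `luid_to_int64` are hypothetical stand-ins for the Windows `LUID` struct and the GStreamer helper, and the exact upstream implementation may differ:

```c
#include <stdint.h>

/* Stand-in for the Windows LUID struct: an unsigned 32-bit low part
 * followed by a signed 32-bit high part, identifying one adapter. */
typedef struct
{
  uint32_t LowPart;
  int32_t HighPart;
} luid_t;

/* Pack both halves into one signed 64-bit key. Two adapters are the
 * same device exactly when their packed keys compare equal. */
static int64_t
luid_to_int64 (const luid_t * luid)
{
  return (((int64_t) luid->HighPart) << 32) | (int64_t) luid->LowPart;
}
```

With the key in hand, the `EnumAdapters1` loops only need `luid != wanted_luid` to skip non-matching adapters.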
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11device.h
Added
@@ -0,0 +1,99 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_DEVICE_H__ +#define __GST_D3D11_DEVICE_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11_fwd.h> + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_DEVICE (gst_d3d11_device_get_type()) +#define GST_D3D11_DEVICE(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_DEVICE,GstD3D11Device)) +#define GST_D3D11_DEVICE_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_D3D11_DEVICE,GstD3D11DeviceClass)) +#define GST_D3D11_DEVICE_GET_CLASS(obj) (GST_D3D11_DEVICE_CLASS(G_OBJECT_GET_CLASS(obj))) +#define GST_IS_D3D11_DEVICE(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_DEVICE)) +#define GST_IS_D3D11_DEVICE_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_DEVICE)) +#define GST_D3D11_DEVICE_CAST(obj) ((GstD3D11Device*)(obj)) + +#define GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE "gst.d3d11.device.handle" + +struct _GstD3D11Device +{ + GstObject parent; + + GstD3D11DevicePrivate *priv; + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING]; +}; + +struct _GstD3D11DeviceClass +{ + GstObjectClass 
parent_class; + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING]; +}; + +GST_D3D11_API +GType gst_d3d11_device_get_type (void); + +GST_D3D11_API +GstD3D11Device * gst_d3d11_device_new (guint adapter_index, + guint flags); + +GST_D3D11_API +GstD3D11Device * gst_d3d11_device_new_for_adapter_luid (gint64 adapter_luid, + guint flags); + +GST_D3D11_API +GstD3D11Device * gst_d3d11_device_new_wrapped (ID3D11Device * device); + +GST_D3D11_API +ID3D11Device * gst_d3d11_device_get_device_handle (GstD3D11Device * device); + +GST_D3D11_API +ID3D11DeviceContext * gst_d3d11_device_get_device_context_handle (GstD3D11Device * device); + +GST_D3D11_API +IDXGIFactory1 * gst_d3d11_device_get_dxgi_factory_handle (GstD3D11Device * device); + +GST_D3D11_API +ID3D11VideoDevice * gst_d3d11_device_get_video_device_handle (GstD3D11Device * device); + +GST_D3D11_API +ID3D11VideoContext * gst_d3d11_device_get_video_context_handle (GstD3D11Device * device); + +GST_D3D11_API +void gst_d3d11_device_lock (GstD3D11Device * device); + +GST_D3D11_API +void gst_d3d11_device_unlock (GstD3D11Device * device); + +GST_D3D11_API +const GstD3D11Format * gst_d3d11_device_format_from_gst (GstD3D11Device * device, + GstVideoFormat format); + +G_END_DECLS + +#endif /* __GST_D3D11_DEVICE_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11format.cpp
Added
@@ -0,0 +1,193 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11format.h" +#include "gstd3d11utils.h" +#include "gstd3d11device.h" +#include "gstd3d11memory.h" + +#include <string.h> + +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT ensure_debug_category() +static GstDebugCategory * +ensure_debug_category (void) +{ + static gsize cat_gonce = 0; + + if (g_once_init_enter (&cat_gonce)) { + gsize cat_done; + + cat_done = (gsize) _gst_debug_category_new ("d3d11format", 0, + "d3d11 specific formats"); + + g_once_init_leave (&cat_gonce, cat_done); + } + + return (GstDebugCategory *) cat_gonce; +} +#else +#define ensure_debug_category() /* NOOP */ +#endif /* GST_DISABLE_GST_DEBUG */ + +/** + * gst_d3d11_dxgi_format_n_planes: + * @format: a DXGI_FORMAT + * + * Returns: the number of planes for @format + * + * Since: 1.20 + */ +guint +gst_d3d11_dxgi_format_n_planes (DXGI_FORMAT format) +{ + switch (format) { + case DXGI_FORMAT_B8G8R8A8_UNORM: + case DXGI_FORMAT_R8G8B8A8_UNORM: + case DXGI_FORMAT_R10G10B10A2_UNORM: + case DXGI_FORMAT_AYUV: + case DXGI_FORMAT_YUY2: + case DXGI_FORMAT_R8_UNORM: + case 
DXGI_FORMAT_R8G8_UNORM: + case DXGI_FORMAT_R16_UNORM: + case DXGI_FORMAT_R16G16_UNORM: + case DXGI_FORMAT_G8R8_G8B8_UNORM: + case DXGI_FORMAT_R8G8_B8G8_UNORM: + case DXGI_FORMAT_Y210: + case DXGI_FORMAT_Y410: + case DXGI_FORMAT_R16G16B16A16_UNORM: + return 1; + case DXGI_FORMAT_NV12: + case DXGI_FORMAT_P010: + case DXGI_FORMAT_P016: + return 2; + default: + break; + } + + return 0; +} + +/** + * gst_d3d11_dxgi_format_get_size: + * @format: a DXGI_FORMAT + * @width: a texture width + * @height: a texture height + * @pitch: a pitch of texture + * @offset: offset for each plane + * @stride: stride for each plane + * @size: (out): required memory size for given format + * + * Calculate the required memory size and per-plane stride + * based on the given information + * + * Returns: %TRUE if @size can be calculated with given information + * + * Since: 1.20 + */ +gboolean +gst_d3d11_dxgi_format_get_size (DXGI_FORMAT format, guint width, guint height, + guint pitch, gsize offset[GST_VIDEO_MAX_PLANES], + gint stride[GST_VIDEO_MAX_PLANES], gsize * size) +{ + g_return_val_if_fail (format != DXGI_FORMAT_UNKNOWN, FALSE); + + switch (format) { + case DXGI_FORMAT_B8G8R8A8_UNORM: + case DXGI_FORMAT_R8G8B8A8_UNORM: + case DXGI_FORMAT_R10G10B10A2_UNORM: + case DXGI_FORMAT_AYUV: + case DXGI_FORMAT_YUY2: + case DXGI_FORMAT_R8_UNORM: + case DXGI_FORMAT_R8G8_UNORM: + case DXGI_FORMAT_R16_UNORM: + case DXGI_FORMAT_R16G16_UNORM: + case DXGI_FORMAT_G8R8_G8B8_UNORM: + case DXGI_FORMAT_R8G8_B8G8_UNORM: + case DXGI_FORMAT_Y210: + case DXGI_FORMAT_Y410: + case DXGI_FORMAT_R16G16B16A16_UNORM: + offset[0] = 0; + stride[0] = pitch; + *size = pitch * height; + break; + case DXGI_FORMAT_NV12: + case DXGI_FORMAT_P010: + case DXGI_FORMAT_P016: + offset[0] = 0; + stride[0] = pitch; + offset[1] = offset[0] + stride[0] * height; + stride[1] = pitch; + *size = offset[1] + stride[1] * GST_ROUND_UP_2 (height / 2); + break; + default: + return FALSE; + } + + GST_LOG ("Calculated buffer size: %" G_GSIZE_FORMAT + " 
(dxgi format:%d, %dx%d, Pitch %d)", + *size, format, width, height, pitch); + + return TRUE; +} + +/** + * gst_d3d11_dxgi_format_to_gst: + * @format: a DXGI_FORMAT + * + * Converts the @format to its #GstVideoFormat representation. + * + * Returns: a #GstVideoFormat equivalent to @format + * + * Since: 1.20 + */ +GstVideoFormat +gst_d3d11_dxgi_format_to_gst (DXGI_FORMAT format) +{ + switch (format) { + case DXGI_FORMAT_B8G8R8A8_UNORM: + return GST_VIDEO_FORMAT_BGRA; + case DXGI_FORMAT_R8G8B8A8_UNORM: + return GST_VIDEO_FORMAT_RGBA; + case DXGI_FORMAT_R10G10B10A2_UNORM: + return GST_VIDEO_FORMAT_RGB10A2_LE; + case DXGI_FORMAT_AYUV: + return GST_VIDEO_FORMAT_VUYA; + case DXGI_FORMAT_YUY2: + return GST_VIDEO_FORMAT_YUY2; + case DXGI_FORMAT_Y210: + return GST_VIDEO_FORMAT_Y210; + case DXGI_FORMAT_Y410: + return GST_VIDEO_FORMAT_Y410; + case DXGI_FORMAT_NV12: + return GST_VIDEO_FORMAT_NV12; + case DXGI_FORMAT_P010: + return GST_VIDEO_FORMAT_P010_10LE; + case DXGI_FORMAT_P016: + return GST_VIDEO_FORMAT_P016_LE; + default: + break; + } + + return GST_VIDEO_FORMAT_UNKNOWN; +}
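For the two-plane formats (NV12/P010/P016), `gst_d3d11_dxgi_format_get_size` above lays plane 0 at offset 0 with the texture pitch as stride and packs the half-height chroma plane directly after it. A reduced, self-contained sketch of just that arithmetic — `two_plane_size` and `ROUND_UP_2` are illustrative names, not part of the library API:

```c
#include <stddef.h>

/* Round up to the next multiple of 2, like GStreamer's GST_ROUND_UP_2. */
#define ROUND_UP_2(x) (((x) + 1) & ~1u)

/* Mirror of the NV12/P010/P016 branch: both planes use the texture
 * pitch as stride; the chroma plane follows the luma plane and covers
 * half the height (rounded up to an even row count). */
static size_t
two_plane_size (unsigned height, unsigned pitch,
    size_t offset[2], int stride[2])
{
  offset[0] = 0;
  stride[0] = (int) pitch;
  offset[1] = offset[0] + (size_t) stride[0] * height;
  stride[1] = (int) pitch;
  return offset[1] + (size_t) stride[1] * ROUND_UP_2 (height / 2);
}
```

For a 1920x1080 NV12 texture with a driver-reported pitch of 2048, this yields a luma plane of 2048x1080 bytes and a chroma plane of 2048x540 bytes, 3317760 bytes in total.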
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11format.h
Added
@@ -0,0 +1,78 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_FORMAT_H__ +#define __GST_D3D11_FORMAT_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11_fwd.h> + +G_BEGIN_DECLS + +#define GST_D3D11_COMMON_FORMATS \ + "BGRA, RGBA, RGB10A2_LE, BGRx, RGBx, VUYA, NV12, NV21, " \ + "P010_10LE, P012_LE, P016_LE, I420, YV12, I420_10LE, I420_12LE, " \ + "Y42B, I422_10LE, I422_12LE, Y444, Y444_10LE, Y444_12LE, Y444_16LE, " \ + "GRAY8, GRAY16_LE" + +#define GST_D3D11_EXTRA_IN_FORMATS \ + "Y410" + +#define GST_D3D11_SINK_FORMATS \ + "{ " GST_D3D11_COMMON_FORMATS " ," GST_D3D11_EXTRA_IN_FORMATS " }" + +#define GST_D3D11_SRC_FORMATS \ + "{ " GST_D3D11_COMMON_FORMATS " }" + +#define GST_D3D11_ALL_FORMATS \ + "{ " GST_D3D11_COMMON_FORMATS " ," GST_D3D11_EXTRA_IN_FORMATS " }" + +struct _GstD3D11Format +{ + GstVideoFormat format; + + /* direct mapping to dxgi format if applicable */ + DXGI_FORMAT dxgi_format; + + /* formats for texture processing */ + DXGI_FORMAT resource_format[GST_VIDEO_MAX_PLANES]; + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING]; +}; + +GST_D3D11_API +guint gst_d3d11_dxgi_format_n_planes (DXGI_FORMAT format); + 
+GST_D3D11_API +gboolean gst_d3d11_dxgi_format_get_size (DXGI_FORMAT format, + guint width, + guint height, + guint pitch, + gsize offset[GST_VIDEO_MAX_PLANES], + gint stride[GST_VIDEO_MAX_PLANES], + gsize *size); + +GST_D3D11_API +GstVideoFormat gst_d3d11_dxgi_format_to_gst (DXGI_FORMAT format); + +G_END_DECLS + +#endif /* __GST_D3D11_FORMAT_H__ */
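The format lists in this header are assembled by C string-literal concatenation, which is why `GST_D3D11_SINK_FORMATS` picks up the extra input formats while `GST_D3D11_SRC_FORMATS` does not. A reduced sketch of that mechanism, using shortened hypothetical lists in place of the real macros:

```c
/* Shortened stand-ins for the GST_D3D11_*_FORMATS macros above; the
 * preprocessor glues adjacent string literals into one caps string. */
#define COMMON_FORMATS   "BGRA, RGBA"
#define EXTRA_IN_FORMATS "Y410"
#define SINK_FORMATS "{ " COMMON_FORMATS " ," EXTRA_IN_FORMATS " }"
#define SRC_FORMATS  "{ " COMMON_FORMATS " }"
```

Each macro expands to a single string usable directly inside a `GST_STATIC_PAD_TEMPLATE` caps description, so sink and source templates can share the common list while diverging only in the extras.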
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11memory.cpp
Added
@@ -0,0 +1,2083 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <string.h> +#include "gstd3d11memory.h" +#include "gstd3d11device.h" +#include "gstd3d11utils.h" +#include "gstd3d11_private.h" + +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_allocator_debug); +#define GST_CAT_DEFAULT gst_d3d11_allocator_debug + +static GstAllocator *_d3d11_memory_allocator; + +/* GstD3D11AllocationParams */ +static void gst_d3d11_allocation_params_init (GType type); +G_DEFINE_BOXED_TYPE_WITH_CODE (GstD3D11AllocationParams, + gst_d3d11_allocation_params, + (GBoxedCopyFunc) gst_d3d11_allocation_params_copy, + (GBoxedFreeFunc) gst_d3d11_allocation_params_free, + gst_d3d11_allocation_params_init (g_define_type_id)); + +/** + * gst_d3d11_allocation_params_new: + * @device: a #GstD3D11Device + * @info: a #GstVideoInfo + * @flags: a #GstD3D11AllocationFlags + * @bind_flags: D3D11_BIND_FLAG value used for creating Direct3D11 texture + * + * Create #GstD3D11AllocationParams object which is used by #GstD3D11BufferPool + * and #GstD3D11Allocator in order to allocate new ID3D11Texture2D + * object 
with given configuration + * + * Returns: a #GstD3D11AllocationParams or %NULL if @info is not supported + * + * Since: 1.20 + */ +GstD3D11AllocationParams * +gst_d3d11_allocation_params_new (GstD3D11Device * device, GstVideoInfo * info, + GstD3D11AllocationFlags flags, guint bind_flags) +{ + GstD3D11AllocationParams *ret; + const GstD3D11Format *d3d11_format; + guint i; + + g_return_val_if_fail (info != NULL, NULL); + + d3d11_format = gst_d3d11_device_format_from_gst (device, + GST_VIDEO_INFO_FORMAT (info)); + if (!d3d11_format) { + GST_WARNING ("Couldn't get d3d11 format"); + return NULL; + } + + ret = g_new0 (GstD3D11AllocationParams, 1); + + ret->info = *info; + ret->aligned_info = *info; + ret->d3d11_format = d3d11_format; + + /* Usage Flag + * https://docs.microsoft.com/en-us/windows/win32/api/d3d11/ne-d3d11-d3d11_usage + * + * +----------------------------------------------------------+ + * | Resource Usage | Default | Dynamic | Immutable | Staging | + * +----------------+---------+---------+-----------+---------+ + * | GPU-Read | Yes | Yes | Yes | Yes | + * | GPU-Write | Yes | | | Yes | + * | CPU-Read | | | | Yes | + * | CPU-Write | | Yes | | Yes | + * +----------------------------------------------------------+ + */ + + /* If corresponding dxgi format is undefined, use resource format instead */ + if (d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { + g_assert (d3d11_format->resource_format[i] != DXGI_FORMAT_UNKNOWN); + + ret->desc[i].Width = GST_VIDEO_INFO_COMP_WIDTH (info, i); + ret->desc[i].Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); + ret->desc[i].MipLevels = 1; + ret->desc[i].ArraySize = 1; + ret->desc[i].Format = d3d11_format->resource_format[i]; + ret->desc[i].SampleDesc.Count = 1; + ret->desc[i].SampleDesc.Quality = 0; + ret->desc[i].Usage = D3D11_USAGE_DEFAULT; + ret->desc[i].BindFlags = bind_flags; + } + } else { + ret->desc[0].Width = GST_VIDEO_INFO_WIDTH (info); + 
ret->desc[0].Height = GST_VIDEO_INFO_HEIGHT (info); + ret->desc[0].MipLevels = 1; + ret->desc[0].ArraySize = 1; + ret->desc[0].Format = d3d11_format->dxgi_format; + ret->desc[0].SampleDesc.Count = 1; + ret->desc[0].SampleDesc.Quality = 0; + ret->desc[0].Usage = D3D11_USAGE_DEFAULT; + ret->desc[0].BindFlags = bind_flags; + } + + ret->flags = flags; + + return ret; +} + +/** + * gst_d3d11_allocation_params_alignment: + * @params: a #GstD3D11AllocationParams + * @align: a #GstVideoAlignment + * + * Adjust Width and Height fields of D3D11_TEXTURE2D_DESC with given + * @align + * + * Returns: %TRUE if alignment could be applied + * + * Since: 1.20 + */ +gboolean +gst_d3d11_allocation_params_alignment (GstD3D11AllocationParams * params, + GstVideoAlignment * align) +{ + guint i; + guint padding_width, padding_height; + GstVideoInfo *info; + GstVideoInfo new_info; + + g_return_val_if_fail (params != NULL, FALSE); + g_return_val_if_fail (align != NULL, FALSE); + + /* d3d11 does not support stride align. 
Consider padding only */ + padding_width = align->padding_left + align->padding_right; + padding_height = align->padding_top + align->padding_bottom; + + info = &params->info; + + if (!gst_video_info_set_format (&new_info, GST_VIDEO_INFO_FORMAT (info), + GST_VIDEO_INFO_WIDTH (info) + padding_width, + GST_VIDEO_INFO_HEIGHT (info) + padding_height)) { + GST_WARNING ("Failed to set video format"); + return FALSE; + } + + params->aligned_info = new_info; + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { + params->desc[i].Width = GST_VIDEO_INFO_COMP_WIDTH (&new_info, i); + params->desc[i].Height = GST_VIDEO_INFO_COMP_HEIGHT (&new_info, i); + } + + return TRUE; +} + +/** + * gst_d3d11_allocation_params_copy: + * @src: a #GstD3D11AllocationParams + * + * Returns: a copy of @src + * + * Since: 1.20 + */ +GstD3D11AllocationParams * +gst_d3d11_allocation_params_copy (GstD3D11AllocationParams * src) +{ + GstD3D11AllocationParams *dst; + + g_return_val_if_fail (src != NULL, NULL); + + dst = g_new0 (GstD3D11AllocationParams, 1); + memcpy (dst, src, sizeof (GstD3D11AllocationParams)); + + return dst; +} + +/** + * gst_d3d11_allocation_params_free: + * @params: a #GstD3D11AllocationParams + * + * Free @params + * + * Since: 1.20 + */ +void +gst_d3d11_allocation_params_free (GstD3D11AllocationParams * params) +{ + g_free (params); +} + +static gint +gst_d3d11_allocation_params_compare (const GstD3D11AllocationParams * p1, + const GstD3D11AllocationParams * p2) +{ + g_return_val_if_fail (p1 != NULL, -1); + g_return_val_if_fail (p2 != NULL, -1); + + if (p1 == p2) + return 0; + + return -1; +} + +static void +gst_d3d11_allocation_params_init (GType type) +{ + static GstValueTable table = { + 0, (GstValueCompareFunc) gst_d3d11_allocation_params_compare, + NULL, NULL + }; + + table.type = type; + gst_value_register (&table); +} + +/* GstD3D11Memory */ +#define GST_D3D11_MEMORY_GET_LOCK(m) (&(GST_D3D11_MEMORY_CAST(m)->priv->lock)) +#define GST_D3D11_MEMORY_LOCK(m) G_STMT_START { \ + 
GST_TRACE("Locking %p from thread %p", (m), g_thread_self()); \ + g_mutex_lock(GST_D3D11_MEMORY_GET_LOCK(m)); \ + GST_TRACE("Locked %p from thread %p", (m), g_thread_self()); \ +} G_STMT_END + +#define GST_D3D11_MEMORY_UNLOCK(m) G_STMT_START { \ + GST_TRACE("Unlocking %p from thread %p", (m), g_thread_self()); \ + g_mutex_unlock(GST_D3D11_MEMORY_GET_LOCK(m)); \ +} G_STMT_END + +struct _GstD3D11MemoryPrivate +{ + ID3D11Texture2D *texture; + ID3D11Texture2D *staging; + + D3D11_TEXTURE2D_DESC desc; + + guint subresource_index; + + ID3D11ShaderResourceView *shader_resource_view[GST_VIDEO_MAX_PLANES]; + guint num_shader_resource_views; + + ID3D11RenderTargetView *render_target_view[GST_VIDEO_MAX_PLANES]; + guint num_render_target_views; + + ID3D11VideoDecoderOutputView *decoder_output_view; + ID3D11VideoProcessorInputView *processor_input_view; + ID3D11VideoProcessorOutputView *processor_output_view; + + D3D11_MAPPED_SUBRESOURCE map; + + + GMutex lock; + gint cpu_map_count; +}; + +GST_DEFINE_MINI_OBJECT_TYPE (GstD3D11Memory, gst_d3d11_memory); + +static inline D3D11_MAP +gst_d3d11_map_flags_to_d3d11 (GstMapFlags flags) +{ + if ((flags & GST_MAP_READWRITE) == GST_MAP_READWRITE) + return D3D11_MAP_READ_WRITE; + else if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE) + return D3D11_MAP_WRITE; + else if ((flags & GST_MAP_READ) == GST_MAP_READ) + return D3D11_MAP_READ; + else + g_assert_not_reached (); + + return D3D11_MAP_READ; +} + +static ID3D11Texture2D * +gst_d3d11_allocate_staging_texture (GstD3D11Device * device, + const D3D11_TEXTURE2D_DESC * ref) +{ + D3D11_TEXTURE2D_DESC desc = { 0, }; + ID3D11Texture2D *texture = NULL; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + HRESULT hr; + + desc.Width = ref->Width; + desc.Height = ref->Height; + desc.MipLevels = 1; + desc.Format = ref->Format; + desc.SampleDesc.Count = 1; + desc.ArraySize = 1; + desc.Usage = D3D11_USAGE_STAGING; + desc.CPUAccessFlags = (D3D11_CPU_ACCESS_READ | 
D3D11_CPU_ACCESS_WRITE); + + hr = device_handle->CreateTexture2D (&desc, NULL, &texture); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (device, "Failed to create texture"); + return NULL; + } + + return texture; +} + +/* Must be called with d3d11 device lock */ +static gboolean +gst_d3d11_memory_map_cpu_access (GstD3D11Memory * dmem, D3D11_MAP map_type) +{ + GstD3D11MemoryPrivate *priv = dmem->priv; + HRESULT hr; + ID3D11Resource *staging = (ID3D11Resource *) priv->staging; + ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (dmem->device); + + hr = device_context->Map (staging, 0, map_type, 0, &priv->map); + + if (!gst_d3d11_result (hr, dmem->device)) { + GST_ERROR_OBJECT (GST_MEMORY_CAST (dmem)->allocator, + "Failed to map staging texture (0x%x)", (guint) hr); + return FALSE; + } + + return TRUE; +} + +/* Must be called with d3d11 device lock */ +static void +gst_d3d11_memory_upload (GstD3D11Memory * dmem) +{ + GstD3D11MemoryPrivate *priv = dmem->priv; + ID3D11DeviceContext *device_context; + + if (!priv->staging || priv->staging == priv->texture || + !GST_MEMORY_FLAG_IS_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD)) + return; + + device_context = gst_d3d11_device_get_device_context_handle (dmem->device); + device_context->CopySubresourceRegion (priv->texture, priv->subresource_index, + 0, 0, 0, priv->staging, 0, NULL); +} + +/* Must be called with d3d11 device lock */ +static void +gst_d3d11_memory_download (GstD3D11Memory * dmem) +{ + GstD3D11MemoryPrivate *priv = dmem->priv; + ID3D11DeviceContext *device_context; + + if (!priv->staging || priv->staging == priv->texture || + !GST_MEMORY_FLAG_IS_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD)) + return; + + device_context = gst_d3d11_device_get_device_context_handle (dmem->device); + device_context->CopySubresourceRegion (priv->staging, 0, 0, 0, 0, + priv->texture, priv->subresource_index, NULL); +} + +static gpointer +gst_d3d11_memory_map_full (GstMemory * 
mem, GstMapInfo * info, gsize maxsize) +{ + GstD3D11Memory *dmem = GST_D3D11_MEMORY_CAST (mem); + GstD3D11MemoryPrivate *priv = dmem->priv; + GstMapFlags flags = info->flags; + gpointer ret = NULL; + + gst_d3d11_device_lock (dmem->device); + GST_D3D11_MEMORY_LOCK (dmem); + + if ((flags & GST_MAP_D3D11) == GST_MAP_D3D11) { + gst_d3d11_memory_upload (dmem); + GST_MEMORY_FLAG_UNSET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD); + + if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE) + GST_MINI_OBJECT_FLAG_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + + g_assert (priv->texture != NULL); + ret = priv->texture; + goto out; + } + + if (priv->cpu_map_count == 0) { + D3D11_MAP map_type; + + /* Allocate staging texture for CPU access */ + if (!priv->staging) { + priv->staging = gst_d3d11_allocate_staging_texture (dmem->device, + &priv->desc); + if (!priv->staging) { + GST_ERROR_OBJECT (mem->allocator, "Couldn't create staging texture"); + goto out; + } + + /* first memory, always need download to staging */ + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + } + + gst_d3d11_memory_download (dmem); + map_type = gst_d3d11_map_flags_to_d3d11 (flags); + + if (!gst_d3d11_memory_map_cpu_access (dmem, map_type)) { + GST_ERROR_OBJECT (mem->allocator, "Couldn't map staging texture"); + goto out; + } + } + + if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE) { + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD); + } + + GST_MEMORY_FLAG_UNSET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + + priv->cpu_map_count++; + ret = dmem->priv->map.pData; + +out: + GST_D3D11_MEMORY_UNLOCK (dmem); + gst_d3d11_device_unlock (dmem->device); + + return ret; +} + +/* Must be called with d3d11 device lock */ +static void +gst_d3d11_memory_unmap_cpu_access (GstD3D11Memory * dmem) +{ + GstD3D11MemoryPrivate *priv = dmem->priv; + ID3D11Resource *staging = (ID3D11Resource *) priv->staging; + ID3D11DeviceContext *device_context = + 
gst_d3d11_device_get_device_context_handle (dmem->device); + + device_context->Unmap (staging, 0); +} + +static void +gst_d3d11_memory_unmap_full (GstMemory * mem, GstMapInfo * info) +{ + GstD3D11Memory *dmem = GST_D3D11_MEMORY_CAST (mem); + GstD3D11MemoryPrivate *priv = dmem->priv; + + gst_d3d11_device_lock (dmem->device); + GST_D3D11_MEMORY_LOCK (dmem); + + if ((info->flags & GST_MAP_D3D11) == GST_MAP_D3D11) { + if ((info->flags & GST_MAP_WRITE) == GST_MAP_WRITE) + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + + goto out; + } + + if ((info->flags & GST_MAP_WRITE) == GST_MAP_WRITE) + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD); + + priv->cpu_map_count--; + if (priv->cpu_map_count > 0) + goto out; + + gst_d3d11_memory_unmap_cpu_access (dmem); + +out: + GST_D3D11_MEMORY_UNLOCK (dmem); + gst_d3d11_device_unlock (dmem->device); +} + +static GstMemory * +gst_d3d11_memory_share (GstMemory * mem, gssize offset, gssize size) +{ + /* TODO: impl. 
*/ + return NULL; +} + +static gboolean +gst_d3d11_memory_update_size (GstMemory * mem) +{ + GstD3D11Memory *dmem = GST_D3D11_MEMORY_CAST (mem); + GstD3D11MemoryPrivate *priv = dmem->priv; + gsize offset[GST_VIDEO_MAX_PLANES]; + gint stride[GST_VIDEO_MAX_PLANES]; + gsize size; + D3D11_TEXTURE2D_DESC *desc = &priv->desc; + gboolean ret = FALSE; + + if (!priv->staging) { + priv->staging = gst_d3d11_allocate_staging_texture (dmem->device, + &priv->desc); + if (!priv->staging) { + GST_ERROR_OBJECT (mem->allocator, "Couldn't create staging texture"); + return FALSE; + } + + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + } + + gst_d3d11_device_lock (dmem->device); + if (!gst_d3d11_memory_map_cpu_access (dmem, D3D11_MAP_READ_WRITE)) { + GST_ERROR_OBJECT (mem->allocator, "Couldn't map staging texture"); + goto out; + } + + gst_d3d11_memory_unmap_cpu_access (dmem); + + if (!gst_d3d11_dxgi_format_get_size (desc->Format, desc->Width, desc->Height, + priv->map.RowPitch, offset, stride, &size)) { + GST_ERROR_OBJECT (mem->allocator, "Couldn't calculate memory size"); + goto out; + } + + mem->maxsize = mem->size = size; + ret = TRUE; + +out: + gst_d3d11_device_unlock (dmem->device); + return ret; +} + +/** + * gst_is_d3d11_memory: + * @mem: a #GstMemory + * + * Returns: whether @mem is a #GstD3D11Memory + * + * Since: 1.20 + */ +gboolean +gst_is_d3d11_memory (GstMemory * mem) +{ + return mem != NULL && mem->allocator != NULL && + (GST_IS_D3D11_ALLOCATOR (mem->allocator) || + GST_IS_D3D11_POOL_ALLOCATOR (mem->allocator)); +} + +/** + * gst_d3d11_memory_init_once: + * + * Initializes the Direct3D11 Texture allocator. It is safe to call + * this function multiple times. This must be called before any other + * GstD3D11Memory operation. 
+ * + * Since: 1.20 + */ +void +gst_d3d11_memory_init_once (void) +{ + static gsize _init = 0; + + if (g_once_init_enter (&_init)) { + + GST_DEBUG_CATEGORY_INIT (gst_d3d11_allocator_debug, "d3d11allocator", 0, + "Direct3D11 Texture Allocator"); + + _d3d11_memory_allocator = + (GstAllocator *) g_object_new (GST_TYPE_D3D11_ALLOCATOR, NULL); + gst_object_ref_sink (_d3d11_memory_allocator); + + gst_allocator_register (GST_D3D11_MEMORY_NAME, _d3d11_memory_allocator); + g_once_init_leave (&_init, 1); + } +} + +/** + * gst_d3d11_memory_get_texture_handle: + * @mem: a #GstD3D11Memory + * + * Returns: (transfer none): a ID3D11Texture2D handle. Caller must not release + * returned handle. + * + * Since: 1.20 + */ +ID3D11Texture2D * +gst_d3d11_memory_get_texture_handle (GstD3D11Memory * mem) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), NULL); + + return mem->priv->texture; +} + +/** + * gst_d3d11_memory_get_subresource_index: + * @mem: a #GstD3D11Memory + * + * Returns: subresource index corresponding to @mem. 
+ * + * Since: 1.20 + */ +guint +gst_d3d11_memory_get_subresource_index (GstD3D11Memory * mem) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), 0); + + return mem->priv->subresource_index; +} + +/** + * gst_d3d11_memory_get_texture_desc: + * @mem: a #GstD3D11Memory + * @desc: (out): a D3D11_TEXTURE2D_DESC + * + * Fill @desc with D3D11_TEXTURE2D_DESC for ID3D11Texture2D + * + * Returns: %TRUE if successful + * + * Since: 1.20 + */ +gboolean +gst_d3d11_memory_get_texture_desc (GstD3D11Memory * mem, + D3D11_TEXTURE2D_DESC * desc) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), FALSE); + g_return_val_if_fail (desc != NULL, FALSE); + + *desc = mem->priv->desc; + + return TRUE; +} + +gboolean +gst_d3d11_memory_get_texture_stride (GstD3D11Memory * mem, guint * stride) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), FALSE); + g_return_val_if_fail (stride != NULL, FALSE); + + *stride = mem->priv->map.RowPitch; + + return TRUE; +} + +static gboolean +create_shader_resource_views (GstD3D11Memory * mem) +{ + GstD3D11MemoryPrivate *priv = mem->priv; + guint i; + HRESULT hr; + guint num_views = 0; + ID3D11Device *device_handle; + D3D11_SHADER_RESOURCE_VIEW_DESC resource_desc; + DXGI_FORMAT formats[GST_VIDEO_MAX_PLANES] = { DXGI_FORMAT_UNKNOWN, }; + + memset (&resource_desc, 0, sizeof (D3D11_SHADER_RESOURCE_VIEW_DESC)); + + device_handle = gst_d3d11_device_get_device_handle (mem->device); + + switch (priv->desc.Format) { + case DXGI_FORMAT_B8G8R8A8_UNORM: + case DXGI_FORMAT_R8G8B8A8_UNORM: + case DXGI_FORMAT_R10G10B10A2_UNORM: + case DXGI_FORMAT_R8_UNORM: + case DXGI_FORMAT_R8G8_UNORM: + case DXGI_FORMAT_R16_UNORM: + case DXGI_FORMAT_R16G16_UNORM: + case DXGI_FORMAT_G8R8_G8B8_UNORM: + case DXGI_FORMAT_R8G8_B8G8_UNORM: + case DXGI_FORMAT_R16G16B16A16_UNORM: + num_views = 1; + formats[0] = priv->desc.Format; + break; + case DXGI_FORMAT_AYUV: + case DXGI_FORMAT_YUY2: + num_views = 1; + formats[0] = 
DXGI_FORMAT_R8G8B8A8_UNORM; + break; + case DXGI_FORMAT_NV12: + num_views = 2; + formats[0] = DXGI_FORMAT_R8_UNORM; + formats[1] = DXGI_FORMAT_R8G8_UNORM; + break; + case DXGI_FORMAT_P010: + case DXGI_FORMAT_P016: + num_views = 2; + formats[0] = DXGI_FORMAT_R16_UNORM; + formats[1] = DXGI_FORMAT_R16G16_UNORM; + break; + case DXGI_FORMAT_Y210: + num_views = 1; + formats[0] = DXGI_FORMAT_R16G16B16A16_UNORM; + break; + case DXGI_FORMAT_Y410: + num_views = 1; + formats[0] = DXGI_FORMAT_R10G10B10A2_UNORM; + break; + default: + g_assert_not_reached (); + return FALSE; + } + + if ((priv->desc.BindFlags & D3D11_BIND_SHADER_RESOURCE) == + D3D11_BIND_SHADER_RESOURCE) { + resource_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; + resource_desc.Texture2D.MipLevels = 1; + + for (i = 0; i < num_views; i++) { + resource_desc.Format = formats[i]; + hr = device_handle->CreateShaderResourceView (priv->texture, + &resource_desc, &priv->shader_resource_view[i]); + + if (!gst_d3d11_result (hr, mem->device)) { + GST_ERROR_OBJECT (GST_MEMORY_CAST (mem)->allocator, + "Failed to create %dth resource view (0x%x)", i, (guint) hr); + goto error; + } + } + + priv->num_shader_resource_views = num_views; + + return TRUE; + } + + return FALSE; + +error: + for (i = 0; i < num_views; i++) + GST_D3D11_CLEAR_COM (priv->shader_resource_view[i]); + + priv->num_shader_resource_views = 0; + + return FALSE; +} + +static gboolean +gst_d3d11_memory_ensure_shader_resource_view (GstD3D11Memory * mem) +{ + GstD3D11MemoryPrivate *priv = mem->priv; + gboolean ret = FALSE; + + if (!(priv->desc.BindFlags & D3D11_BIND_SHADER_RESOURCE)) { + GST_LOG_OBJECT (GST_MEMORY_CAST (mem)->allocator, + "Need BindFlags, current flag 0x%x", priv->desc.BindFlags); + return FALSE; + } + + GST_D3D11_MEMORY_LOCK (mem); + if (priv->num_shader_resource_views) { + ret = TRUE; + goto done; + } + + ret = create_shader_resource_views (mem); + +done: + GST_D3D11_MEMORY_UNLOCK (mem); + + return ret; +} + +/** + * 
gst_d3d11_memory_get_shader_resource_view_size: + * @mem: a #GstD3D11Memory + * + * Returns: the number of ID3D11ShaderResourceView that can be used + * for processing GPU operation with @mem + * + * Since: 1.20 + */ +guint +gst_d3d11_memory_get_shader_resource_view_size (GstD3D11Memory * mem) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), 0); + + if (!gst_d3d11_memory_ensure_shader_resource_view (mem)) + return 0; + + return mem->priv->num_shader_resource_views; +} + +/** + * gst_d3d11_memory_get_shader_resource_view: + * @mem: a #GstD3D11Memory + * @index: the index of the ID3D11ShaderResourceView + * + * Returns: (transfer none) (nullable): a pointer to the + * ID3D11ShaderResourceView or %NULL if ID3D11ShaderResourceView is unavailable + * for @index + * + * Since: 1.20 + */ +ID3D11ShaderResourceView * +gst_d3d11_memory_get_shader_resource_view (GstD3D11Memory * mem, guint index) +{ + GstD3D11MemoryPrivate *priv; + + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), NULL); + + if (!gst_d3d11_memory_ensure_shader_resource_view (mem)) + return NULL; + + priv = mem->priv; + + if (index >= priv->num_shader_resource_views) { + GST_ERROR ("Invalid SRV index %d", index); + return NULL; + } + + return priv->shader_resource_view[index]; +} + +static gboolean +create_render_target_views (GstD3D11Memory * mem) +{ + GstD3D11MemoryPrivate *priv = mem->priv; + guint i; + HRESULT hr; + guint num_views = 0; + ID3D11Device *device_handle; + D3D11_RENDER_TARGET_VIEW_DESC render_desc; + DXGI_FORMAT formats[GST_VIDEO_MAX_PLANES] = { DXGI_FORMAT_UNKNOWN, }; + + memset (&render_desc, 0, sizeof (D3D11_RENDER_TARGET_VIEW_DESC)); + + device_handle = gst_d3d11_device_get_device_handle (mem->device); + + switch (priv->desc.Format) { + case DXGI_FORMAT_B8G8R8A8_UNORM: + case DXGI_FORMAT_R8G8B8A8_UNORM: + case DXGI_FORMAT_R10G10B10A2_UNORM: + case DXGI_FORMAT_R8_UNORM: + case DXGI_FORMAT_R8G8_UNORM: + case DXGI_FORMAT_R16_UNORM: + case 
DXGI_FORMAT_R16G16_UNORM: + num_views = 1; + formats[0] = priv->desc.Format; + break; + case DXGI_FORMAT_AYUV: + num_views = 1; + formats[0] = DXGI_FORMAT_R8G8B8A8_UNORM; + break; + case DXGI_FORMAT_NV12: + num_views = 2; + formats[0] = DXGI_FORMAT_R8_UNORM; + formats[1] = DXGI_FORMAT_R8G8_UNORM; + break; + case DXGI_FORMAT_P010: + case DXGI_FORMAT_P016: + num_views = 2; + formats[0] = DXGI_FORMAT_R16_UNORM; + formats[1] = DXGI_FORMAT_R16G16_UNORM; + break; + default: + g_assert_not_reached (); + return FALSE; + } + + if ((priv->desc.BindFlags & D3D11_BIND_RENDER_TARGET) == + D3D11_BIND_RENDER_TARGET) { + render_desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; + render_desc.Texture2D.MipSlice = 0; + + for (i = 0; i < num_views; i++) { + render_desc.Format = formats[i]; + + hr = device_handle->CreateRenderTargetView (priv->texture, &render_desc, + &priv->render_target_view[i]); + if (!gst_d3d11_result (hr, mem->device)) { + GST_ERROR_OBJECT (GST_MEMORY_CAST (mem)->allocator, + "Failed to create %dth render target view (0x%x)", i, (guint) hr); + goto error; + } + } + + priv->num_render_target_views = num_views; + + return TRUE; + } + + return FALSE; + +error: + for (i = 0; i < num_views; i++) + GST_D3D11_CLEAR_COM (priv->render_target_view[i]); + + priv->num_render_target_views = 0; + + return FALSE; +} + +static gboolean +gst_d3d11_memory_ensure_render_target_view (GstD3D11Memory * mem) +{ + GstD3D11MemoryPrivate *priv = mem->priv; + gboolean ret = FALSE; + + if (!(priv->desc.BindFlags & D3D11_BIND_RENDER_TARGET)) { + GST_WARNING_OBJECT (GST_MEMORY_CAST (mem)->allocator, + "Need BindFlags, current flag 0x%x", priv->desc.BindFlags); + return FALSE; + } + + GST_D3D11_MEMORY_LOCK (mem); + if (priv->num_render_target_views) { + ret = TRUE; + goto done; + } + + ret = create_render_target_views (mem); + +done: + GST_D3D11_MEMORY_UNLOCK (mem); + + return ret; +} + +/** + * gst_d3d11_memory_get_render_target_view_size: + * @mem: a #GstD3D11Memory + * + * Returns: the 
number of ID3D11RenderTargetView that can be used + * for processing GPU operation with @mem + * + * Since: 1.20 + */ +guint +gst_d3d11_memory_get_render_target_view_size (GstD3D11Memory * mem) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), 0); + + if (!gst_d3d11_memory_ensure_render_target_view (mem)) + return 0; + + return mem->priv->num_render_target_views; +} + +/** + * gst_d3d11_memory_get_render_target_view: + * @mem: a #GstD3D11Memory + * @index: the index of the ID3D11RenderTargetView + * + * Returns: (transfer none) (nullable): a pointer to the + * ID3D11RenderTargetView or %NULL if ID3D11RenderTargetView is unavailable + * for @index + * + * Since: 1.20 + */ +ID3D11RenderTargetView * +gst_d3d11_memory_get_render_target_view (GstD3D11Memory * mem, guint index) +{ + GstD3D11MemoryPrivate *priv; + + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), NULL); + + if (!gst_d3d11_memory_ensure_render_target_view (mem)) + return NULL; + + priv = mem->priv; + + if (index >= priv->num_render_target_views) { + GST_ERROR ("Invalid RTV index %d", index); + return NULL; + } + + return priv->render_target_view[index]; +} + +static gboolean +gst_d3d11_memory_ensure_decoder_output_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, GUID * decoder_profile) +{ + GstD3D11MemoryPrivate *dmem_priv = mem->priv; + GstD3D11Allocator *allocator; + D3D11_VIDEO_DECODER_OUTPUT_VIEW_DESC desc; + HRESULT hr; + gboolean ret = FALSE; + + allocator = GST_D3D11_ALLOCATOR (GST_MEMORY_CAST (mem)->allocator); + + if (!(dmem_priv->desc.BindFlags & D3D11_BIND_DECODER)) { + GST_LOG_OBJECT (allocator, + "Need BindFlags, current flag 0x%x", dmem_priv->desc.BindFlags); + return FALSE; + } + + GST_D3D11_MEMORY_LOCK (mem); + if (dmem_priv->decoder_output_view) { + dmem_priv->decoder_output_view->GetDesc (&desc); + if (IsEqualGUID (desc.DecodeProfile, *decoder_profile)) { + goto succeeded; + } else { + /* Shouldn't happen, but try again anyway */ + 
GST_WARNING_OBJECT (allocator, + "Existing view has different decoder profile"); + GST_D3D11_CLEAR_COM (dmem_priv->decoder_output_view); + } + } + + if (dmem_priv->decoder_output_view) + goto succeeded; + + desc.DecodeProfile = *decoder_profile; + desc.ViewDimension = D3D11_VDOV_DIMENSION_TEXTURE2D; + desc.Texture2D.ArraySlice = dmem_priv->subresource_index; + + hr = video_device->CreateVideoDecoderOutputView (dmem_priv->texture, &desc, + &dmem_priv->decoder_output_view); + if (!gst_d3d11_result (hr, mem->device)) { + GST_ERROR_OBJECT (allocator, + "Could not create decoder output view, hr: 0x%x", (guint) hr); + goto done; + } + +succeeded: + ret = TRUE; + +done: + GST_D3D11_MEMORY_UNLOCK (mem); + + return ret; +} + +/** + * gst_d3d11_memory_get_decoder_output_view: + * @mem: a #GstD3D11Memory + * + * Returns: (transfer none) (nullable): a pointer to the + * ID3D11VideoDecoderOutputView or %NULL if ID3D11VideoDecoderOutputView is + * unavailable + * + * Since: 1.20 + */ +ID3D11VideoDecoderOutputView * +gst_d3d11_memory_get_decoder_output_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, GUID * decoder_profile) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), NULL); + g_return_val_if_fail (video_device != NULL, NULL); + g_return_val_if_fail (decoder_profile != NULL, NULL); + + if (!gst_d3d11_memory_ensure_decoder_output_view (mem, + video_device, decoder_profile)) + return NULL; + + return mem->priv->decoder_output_view; +} + +static gboolean +check_bind_flags_for_processor_input_view (guint bind_flags) +{ + static const guint compatible_flags = (D3D11_BIND_DECODER | + D3D11_BIND_VIDEO_ENCODER | D3D11_BIND_RENDER_TARGET | + D3D11_BIND_UNORDERED_ACCESS); + + if (bind_flags == 0) + return TRUE; + + if ((bind_flags & compatible_flags) != 0) + return TRUE; + + return FALSE; +} + +static gboolean +gst_d3d11_memory_ensure_processor_input_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + ID3D11VideoProcessorEnumerator * 
enumerator) +{ + GstD3D11MemoryPrivate *dmem_priv = mem->priv; + GstD3D11Allocator *allocator; + D3D11_VIDEO_PROCESSOR_INPUT_VIEW_DESC desc; + HRESULT hr; + gboolean ret = FALSE; + + allocator = GST_D3D11_ALLOCATOR (GST_MEMORY_CAST (mem)->allocator); + + if (!check_bind_flags_for_processor_input_view (dmem_priv->desc.BindFlags)) { + GST_LOG_OBJECT (allocator, + "Need BindFlags, current flag 0x%x", dmem_priv->desc.BindFlags); + return FALSE; + } + + GST_D3D11_MEMORY_LOCK (mem); + if (dmem_priv->processor_input_view) + goto succeeded; + + desc.FourCC = 0; + desc.ViewDimension = D3D11_VPIV_DIMENSION_TEXTURE2D; + desc.Texture2D.MipSlice = 0; + desc.Texture2D.ArraySlice = dmem_priv->subresource_index; + + hr = video_device->CreateVideoProcessorInputView (dmem_priv->texture, + enumerator, &desc, &dmem_priv->processor_input_view); + if (!gst_d3d11_result (hr, mem->device)) { + GST_ERROR_OBJECT (allocator, + "Could not create processor input view, hr: 0x%x", (guint) hr); + goto done; + } + +succeeded: + ret = TRUE; + +done: + GST_D3D11_MEMORY_UNLOCK (mem); + + return ret; +} + +/** + * gst_d3d11_memory_get_processor_input_view: + * @mem: a #GstD3D11Memory + * @video_device: a #ID3D11VideoDevice + * @enumerator: a #ID3D11VideoProcessorEnumerator + * + * Returns: (transfer none) (nullable): a pointer to the + * ID3D11VideoProcessorInputView or %NULL if ID3D11VideoProcessorInputView is + * unavailable + * + * Since: 1.20 + */ +ID3D11VideoProcessorInputView * +gst_d3d11_memory_get_processor_input_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + ID3D11VideoProcessorEnumerator * enumerator) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), NULL); + g_return_val_if_fail (video_device != NULL, NULL); + g_return_val_if_fail (enumerator != NULL, NULL); + + if (!gst_d3d11_memory_ensure_processor_input_view (mem, video_device, + enumerator)) + return NULL; + + return mem->priv->processor_input_view; +} + +static gboolean 
+gst_d3d11_memory_ensure_processor_output_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + ID3D11VideoProcessorEnumerator * enumerator) +{ + GstD3D11MemoryPrivate *priv = mem->priv; + GstD3D11Allocator *allocator; + D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC desc; + HRESULT hr; + gboolean ret = FALSE; + + memset (&desc, 0, sizeof (D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC)); + + allocator = GST_D3D11_ALLOCATOR (GST_MEMORY_CAST (mem)->allocator); + + if (!(priv->desc.BindFlags & D3D11_BIND_RENDER_TARGET)) { + GST_LOG_OBJECT (allocator, + "Need BindFlags, current flag 0x%x", priv->desc.BindFlags); + return FALSE; + } + + /* FIXME: texture array should be supported at some point */ + if (priv->subresource_index != 0) { + GST_FIXME_OBJECT (allocator, + "Texture array is not supported for processor output view"); + return FALSE; + } + + GST_D3D11_MEMORY_LOCK (mem); + if (priv->processor_output_view) + goto succeeded; + + desc.ViewDimension = D3D11_VPOV_DIMENSION_TEXTURE2D; + desc.Texture2D.MipSlice = 0; + + hr = video_device->CreateVideoProcessorOutputView (priv->texture, + enumerator, &desc, &priv->processor_output_view); + if (!gst_d3d11_result (hr, mem->device)) { + GST_ERROR_OBJECT (allocator, + "Could not create processor output view, hr: 0x%x", (guint) hr); + goto done; + } + +succeeded: + ret = TRUE; + +done: + GST_D3D11_MEMORY_UNLOCK (mem); + + return ret; +} + +/** + * gst_d3d11_memory_get_processor_output_view: + * @mem: a #GstD3D11Memory + * @video_device: a #ID3D11VideoDevice + * @enumerator: a #ID3D11VideoProcessorEnumerator + * + * Returns: (transfer none) (nullable): a pointer to the + * ID3D11VideoProcessorOutputView or %NULL if ID3D11VideoProcessorOutputView is + * unavailable + * + * Since: 1.20 + */ +ID3D11VideoProcessorOutputView * +gst_d3d11_memory_get_processor_output_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + ID3D11VideoProcessorEnumerator * enumerator) +{ + g_return_val_if_fail (gst_is_d3d11_memory (GST_MEMORY_CAST (mem)), 
NULL); + g_return_val_if_fail (video_device != NULL, NULL); + g_return_val_if_fail (enumerator != NULL, NULL); + + if (!gst_d3d11_memory_ensure_processor_output_view (mem, video_device, + enumerator)) + return NULL; + + return mem->priv->processor_output_view; +} + +/* GstD3D11Allocator */ +struct _GstD3D11AllocatorPrivate +{ + GstMemoryCopyFunction fallback_copy; +}; + +#define gst_d3d11_allocator_parent_class alloc_parent_class +G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11Allocator, + gst_d3d11_allocator, GST_TYPE_ALLOCATOR); + +static GstMemory *gst_d3d11_allocator_dummy_alloc (GstAllocator * allocator, + gsize size, GstAllocationParams * params); +static GstMemory *gst_d3d11_allocator_alloc_internal (GstD3D11Allocator * self, + GstD3D11Device * device, const D3D11_TEXTURE2D_DESC * desc); +static void gst_d3d11_allocator_free (GstAllocator * allocator, + GstMemory * mem); + +static void +gst_d3d11_allocator_class_init (GstD3D11AllocatorClass * klass) +{ + GstAllocatorClass *allocator_class = GST_ALLOCATOR_CLASS (klass); + + allocator_class->alloc = gst_d3d11_allocator_dummy_alloc; + allocator_class->free = gst_d3d11_allocator_free; +} + +static GstMemory * +gst_d3d11_memory_copy (GstMemory * mem, gssize offset, gssize size) +{ + GstD3D11Allocator *alloc = GST_D3D11_ALLOCATOR (mem->allocator); + GstD3D11AllocatorPrivate *priv = alloc->priv; + GstD3D11Memory *dmem = GST_D3D11_MEMORY_CAST (mem); + GstD3D11Memory *copy_dmem; + GstD3D11Device *device = dmem->device; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (device); + D3D11_TEXTURE2D_DESC dst_desc = { 0, }; + D3D11_TEXTURE2D_DESC src_desc = { 0, }; + GstMemory *copy = NULL; + GstMapInfo info; + HRESULT hr; + UINT bind_flags = 0; + UINT supported_flags = 0; + + /* non-zero offset or different size is not supported */ + if (offset != 0 || (size != -1 && (gsize) size != mem->size)) { + GST_DEBUG_OBJECT 
(alloc, "Different size/offset, try fallback copy"); + return priv->fallback_copy (mem, offset, size); + } + + gst_d3d11_device_lock (device); + if (!gst_memory_map (mem, &info, + (GstMapFlags) (GST_MAP_READ | GST_MAP_D3D11))) { + gst_d3d11_device_unlock (device); + + GST_WARNING_OBJECT (alloc, "Failed to map memory, try fallback copy"); + + return priv->fallback_copy (mem, offset, size); + } + + dmem->priv->texture->GetDesc (&src_desc); + dst_desc.Width = src_desc.Width; + dst_desc.Height = src_desc.Height; + dst_desc.MipLevels = 1; + dst_desc.Format = src_desc.Format; + dst_desc.SampleDesc.Count = 1; + dst_desc.ArraySize = 1; + dst_desc.Usage = D3D11_USAGE_DEFAULT; + + /* If supported, use bind flags for SRV/RTV */ + hr = device_handle->CheckFormatSupport (src_desc.Format, &supported_flags); + if (gst_d3d11_result (hr, device)) { + if ((supported_flags & D3D11_FORMAT_SUPPORT_SHADER_SAMPLE) == + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE) { + bind_flags |= D3D11_BIND_SHADER_RESOURCE; + } + + if ((supported_flags & D3D11_FORMAT_SUPPORT_RENDER_TARGET) == + D3D11_FORMAT_SUPPORT_RENDER_TARGET) { + bind_flags |= D3D11_BIND_RENDER_TARGET; + } + } + + copy = gst_d3d11_allocator_alloc_internal (alloc, device, &dst_desc); + if (!copy) { + gst_memory_unmap (mem, &info); + gst_d3d11_device_unlock (device); + + GST_WARNING_OBJECT (alloc, + "Failed to allocate new d3d11 map memory, try fallback copy"); + + return priv->fallback_copy (mem, offset, size); + } + + copy_dmem = GST_D3D11_MEMORY_CAST (copy); + device_context->CopySubresourceRegion (copy_dmem->priv->texture, 0, 0, 0, 0, + dmem->priv->texture, dmem->priv->subresource_index, NULL); + copy->maxsize = copy->size = mem->maxsize; + gst_memory_unmap (mem, &info); + gst_d3d11_device_unlock (device); + + /* newly allocated memory holds valid image data. 
We need to download this + * pixel data into staging memory for CPU access */ + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + + return copy; +} + +static void +gst_d3d11_allocator_init (GstD3D11Allocator * allocator) +{ + GstAllocator *alloc = GST_ALLOCATOR_CAST (allocator); + GstD3D11AllocatorPrivate *priv; + + priv = allocator->priv = (GstD3D11AllocatorPrivate *) + gst_d3d11_allocator_get_instance_private (allocator); + + alloc->mem_type = GST_D3D11_MEMORY_NAME; + alloc->mem_map_full = gst_d3d11_memory_map_full; + alloc->mem_unmap_full = gst_d3d11_memory_unmap_full; + alloc->mem_share = gst_d3d11_memory_share; + + /* Store pointer to default mem_copy method for fallback copy */ + priv->fallback_copy = alloc->mem_copy; + alloc->mem_copy = gst_d3d11_memory_copy; + + GST_OBJECT_FLAG_SET (alloc, GST_ALLOCATOR_FLAG_CUSTOM_ALLOC); +} + +static GstMemory * +gst_d3d11_allocator_dummy_alloc (GstAllocator * allocator, gsize size, + GstAllocationParams * params) +{ + g_return_val_if_reached (NULL); +} + +static void +gst_d3d11_allocator_free (GstAllocator * allocator, GstMemory * mem) +{ + GstD3D11Memory *dmem = GST_D3D11_MEMORY_CAST (mem); + GstD3D11MemoryPrivate *dmem_priv = dmem->priv; + gint i; + + GST_LOG_OBJECT (allocator, "Free memory %p", mem); + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (dmem_priv->render_target_view[i]); + GST_D3D11_CLEAR_COM (dmem_priv->shader_resource_view[i]); + } + + GST_D3D11_CLEAR_COM (dmem_priv->decoder_output_view); + GST_D3D11_CLEAR_COM (dmem_priv->processor_input_view); + GST_D3D11_CLEAR_COM (dmem_priv->processor_output_view); + GST_D3D11_CLEAR_COM (dmem_priv->texture); + GST_D3D11_CLEAR_COM (dmem_priv->staging); + + gst_clear_object (&dmem->device); + g_mutex_clear (&dmem_priv->lock); + g_free (dmem->priv); + g_free (dmem); +} + +static GstMemory * +gst_d3d11_allocator_alloc_wrapped (GstD3D11Allocator * self, + GstD3D11Device * device, const D3D11_TEXTURE2D_DESC * desc, + 
ID3D11Texture2D * texture) +{ + GstD3D11Memory *mem; + + mem = g_new0 (GstD3D11Memory, 1); + mem->priv = g_new0 (GstD3D11MemoryPrivate, 1); + + gst_memory_init (GST_MEMORY_CAST (mem), + (GstMemoryFlags) 0, GST_ALLOCATOR_CAST (self), NULL, 0, 0, 0, 0); + g_mutex_init (&mem->priv->lock); + mem->priv->texture = texture; + mem->priv->desc = *desc; + mem->device = (GstD3D11Device *) gst_object_ref (device); + + /* This is staging texture as well */ + if (desc->Usage == D3D11_USAGE_STAGING) { + mem->priv->staging = texture; + texture->AddRef (); + } + + return GST_MEMORY_CAST (mem); +} + +static GstMemory * +gst_d3d11_allocator_alloc_internal (GstD3D11Allocator * self, + GstD3D11Device * device, const D3D11_TEXTURE2D_DESC * desc) +{ + ID3D11Texture2D *texture = NULL; + ID3D11Device *device_handle; + HRESULT hr; + + device_handle = gst_d3d11_device_get_device_handle (device); + + hr = device_handle->CreateTexture2D (desc, NULL, &texture); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Couldn't create texture"); + return NULL; + } + + return gst_d3d11_allocator_alloc_wrapped (self, device, desc, texture); +} + +/** + * gst_d3d11_allocator_alloc: + * @allocator: a #GstD3D11Allocator + * @device: a #GstD3D11Device + * @desc: a D3D11_TEXTURE2D_DESC struct + * + * Returns: a newly allocated #GstD3D11Memory with given parameters. 
+ * + * Since: 1.20 + */ +GstMemory * +gst_d3d11_allocator_alloc (GstD3D11Allocator * allocator, + GstD3D11Device * device, const D3D11_TEXTURE2D_DESC * desc) +{ + GstMemory *mem; + + g_return_val_if_fail (GST_IS_D3D11_ALLOCATOR (allocator), NULL); + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (desc != NULL, NULL); + + mem = gst_d3d11_allocator_alloc_internal (allocator, device, desc); + if (!mem) + return NULL; + + if (!gst_d3d11_memory_update_size (mem)) { + GST_ERROR_OBJECT (allocator, "Failed to calculate size"); + gst_memory_unref (mem); + return NULL; + } + + return mem; +} + +gboolean +gst_d3d11_allocator_set_active (GstD3D11Allocator * allocator, gboolean active) +{ + GstD3D11AllocatorClass *klass; + + g_return_val_if_fail (GST_IS_D3D11_ALLOCATOR (allocator), FALSE); + + klass = GST_D3D11_ALLOCATOR_GET_CLASS (allocator); + if (klass->set_actvie) + return klass->set_actvie (allocator, active); + + return TRUE; +} + +/* GstD3D11PoolAllocator */ +#define GST_D3D11_POOL_ALLOCATOR_LOCK(alloc) (g_rec_mutex_lock(&alloc->priv->lock)) +#define GST_D3D11_POOL_ALLOCATOR_UNLOCK(alloc) (g_rec_mutex_unlock(&alloc->priv->lock)) +#define GST_D3D11_POOL_ALLOCATOR_IS_FLUSHING(alloc) (g_atomic_int_get (&alloc->priv->flushing)) + +struct _GstD3D11PoolAllocatorPrivate +{ + /* parent texture when array typed memory is used */ + ID3D11Texture2D *texture; + D3D11_TEXTURE2D_DESC desc; + + /* All below member variables are analogous to that of GstBufferPool */ + GstAtomicQueue *queue; + GstPoll *poll; + + /* This lock will protect all below variables apart from atomic ones + * (identical to GstBufferPool::priv::rec_lock) */ + GRecMutex lock; + gboolean started; + gboolean active; + + /* atomic */ + gint outstanding; + guint max_mems; + guint cur_mems; + gboolean flushing; + + /* Calculated memory size, based on Direct3D11 staging texture map. 
+ * Note that we cannot know the actual staging texture memory size prior + * to mapping the staging texture, because the driver will likely require padding */ + gsize mem_size; +}; + +static void gst_d3d11_pool_allocator_dispose (GObject * object); +static void gst_d3d11_pool_allocator_finalize (GObject * object); + +static gboolean +gst_d3d11_pool_allocator_set_active (GstD3D11Allocator * allocator, + gboolean active); + +static gboolean gst_d3d11_pool_allocator_start (GstD3D11PoolAllocator * self); +static gboolean gst_d3d11_pool_allocator_stop (GstD3D11PoolAllocator * self); +static gboolean gst_d3d11_memory_release (GstMiniObject * mini_object); + +#define gst_d3d11_pool_allocator_parent_class pool_alloc_parent_class +G_DEFINE_TYPE_WITH_PRIVATE (GstD3D11PoolAllocator, + gst_d3d11_pool_allocator, GST_TYPE_D3D11_ALLOCATOR); + +static void +gst_d3d11_pool_allocator_class_init (GstD3D11PoolAllocatorClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstD3D11AllocatorClass *d3d11alloc_class = GST_D3D11_ALLOCATOR_CLASS (klass); + + gobject_class->dispose = gst_d3d11_pool_allocator_dispose; + gobject_class->finalize = gst_d3d11_pool_allocator_finalize; + + d3d11alloc_class->set_actvie = gst_d3d11_pool_allocator_set_active; +} + +static void +gst_d3d11_pool_allocator_init (GstD3D11PoolAllocator * allocator) +{ + GstD3D11PoolAllocatorPrivate *priv; + + priv = allocator->priv = (GstD3D11PoolAllocatorPrivate *) + gst_d3d11_pool_allocator_get_instance_private (allocator); + g_rec_mutex_init (&priv->lock); + + priv->poll = gst_poll_new_timer (); + priv->queue = gst_atomic_queue_new (16); + priv->flushing = 1; + priv->active = FALSE; + priv->started = FALSE; + + /* 1 control write for flushing - the flush token */ + gst_poll_write_control (priv->poll); + /* 1 control write for marking that we are not waiting for poll - the wait token */ + gst_poll_write_control (priv->poll); +} + +static void +gst_d3d11_pool_allocator_dispose (GObject * object) +{ + 
GstD3D11PoolAllocator *self = GST_D3D11_POOL_ALLOCATOR (object); + + gst_clear_object (&self->device); + + G_OBJECT_CLASS (pool_alloc_parent_class)->dispose (object); +} + +static void +gst_d3d11_pool_allocator_finalize (GObject * object) +{ + GstD3D11PoolAllocator *self = GST_D3D11_POOL_ALLOCATOR (object); + GstD3D11PoolAllocatorPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (self, "Finalize"); + + gst_d3d11_pool_allocator_stop (self); + gst_atomic_queue_unref (priv->queue); + gst_poll_free (priv->poll); + g_rec_mutex_clear (&priv->lock); + + GST_D3D11_CLEAR_COM (priv->texture); + + G_OBJECT_CLASS (pool_alloc_parent_class)->finalize (object); +} + +static gboolean +gst_d3d11_pool_allocator_start (GstD3D11PoolAllocator * self) +{ + GstD3D11PoolAllocatorPrivate *priv = self->priv; + ID3D11Device *device_handle; + HRESULT hr; + guint i; + + if (priv->started) + return TRUE; + + /* Nothing to do */ + if (priv->desc.ArraySize == 1) { + priv->started = TRUE; + return TRUE; + } + + device_handle = gst_d3d11_device_get_device_handle (self->device); + + if (!priv->texture) { + hr = device_handle->CreateTexture2D (&priv->desc, NULL, &priv->texture); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Failed to allocate texture"); + return FALSE; + } + } + + /* Pre-allocate memory objects */ + for (i = 0; i < priv->desc.ArraySize; i++) { + GstMemory *mem; + + priv->texture->AddRef (); + mem = + gst_d3d11_allocator_alloc_wrapped (GST_D3D11_ALLOCATOR_CAST + (_d3d11_memory_allocator), self->device, &priv->desc, priv->texture); + + if (i == 0) { + if (!gst_d3d11_memory_update_size (mem)) { + GST_ERROR_OBJECT (self, "Failed to calculate memory size"); + gst_memory_unref (mem); + return FALSE; + } + + priv->mem_size = mem->size; + } else { + mem->size = mem->maxsize = priv->mem_size; + } + + GST_D3D11_MEMORY_CAST (mem)->priv->subresource_index = i; + + g_atomic_int_add (&priv->cur_mems, 1); + gst_atomic_queue_push (priv->queue, mem); + gst_poll_write_control 
(priv->poll); + } + + priv->started = TRUE; + + return TRUE; +} + +static void +gst_d3d11_pool_allocator_do_set_flushing (GstD3D11PoolAllocator * self, + gboolean flushing) +{ + GstD3D11PoolAllocatorPrivate *priv = self->priv; + + if (GST_D3D11_POOL_ALLOCATOR_IS_FLUSHING (self) == flushing) + return; + + if (flushing) { + g_atomic_int_set (&priv->flushing, 1); + /* Write the flush token to wake up any waiters */ + gst_poll_write_control (priv->poll); + } else { + while (!gst_poll_read_control (priv->poll)) { + if (errno == EWOULDBLOCK) { + /* This should not really happen unless flushing and unflushing + * happens on different threads. Let's wait a bit to get back flush + * token from the thread that was setting it to flushing */ + g_thread_yield (); + continue; + } else { + /* Critical error but GstPoll already complained */ + break; + } + } + + g_atomic_int_set (&priv->flushing, 0); + } +} + +static gboolean +gst_d3d11_pool_allocator_set_active (GstD3D11Allocator * allocator, + gboolean active) +{ + GstD3D11PoolAllocator *self = GST_D3D11_POOL_ALLOCATOR (allocator); + GstD3D11PoolAllocatorPrivate *priv = self->priv; + + GST_LOG_OBJECT (self, "active %d", active); + + GST_D3D11_POOL_ALLOCATOR_LOCK (self); + /* just return if we are already in the right state */ + if (priv->active == active) + goto was_ok; + + if (active) { + if (!gst_d3d11_pool_allocator_start (self)) + goto start_failed; + + /* flush_stop may release memory objects, setting to active to avoid running + * do_stop while activating the pool */ + priv->active = TRUE; + + gst_d3d11_pool_allocator_do_set_flushing (self, FALSE); + } else { + gint outstanding; + + /* set to flushing first */ + gst_d3d11_pool_allocator_do_set_flushing (self, TRUE); + + /* when all memory objects are in the pool, free them. 
Else they will be + * freed when they are released */ + outstanding = g_atomic_int_get (&priv->outstanding); + GST_LOG_OBJECT (self, "outstanding memories %d, (in queue %d)", + outstanding, gst_atomic_queue_length (priv->queue)); + if (outstanding == 0) { + if (!gst_d3d11_pool_allocator_stop (self)) + goto stop_failed; + } + + priv->active = FALSE; + } + + GST_D3D11_POOL_ALLOCATOR_UNLOCK (self); + + return TRUE; + +was_ok: + { + GST_DEBUG_OBJECT (self, "allocator was in the right state"); + GST_D3D11_POOL_ALLOCATOR_UNLOCK (self); + return TRUE; + } +start_failed: + { + GST_ERROR_OBJECT (self, "start failed"); + GST_D3D11_POOL_ALLOCATOR_UNLOCK (self); + return FALSE; + } +stop_failed: + { + GST_ERROR_OBJECT (self, "stop failed"); + GST_D3D11_POOL_ALLOCATOR_UNLOCK (self); + return FALSE; + } +} + +static void +gst_d3d11_pool_allocator_free_memory (GstD3D11PoolAllocator * self, + GstMemory * mem) +{ + GstD3D11PoolAllocatorPrivate *priv = self->priv; + + g_atomic_int_add (&priv->cur_mems, -1); + GST_LOG_OBJECT (self, "freeing memory %p (%u left)", mem, priv->cur_mems); + + GST_MINI_OBJECT_CAST (mem)->dispose = NULL; + gst_memory_unref (mem); +} + +/* must be called with the lock */ +static gboolean +gst_d3d11_pool_allocator_clear_queue (GstD3D11PoolAllocator * self) +{ + GstD3D11PoolAllocatorPrivate *priv = self->priv; + GstMemory *memory; + + GST_LOG_OBJECT (self, "Clearing queue"); + + /* clear the pool */ + while ((memory = (GstMemory *) gst_atomic_queue_pop (priv->queue))) { + while (!gst_poll_read_control (priv->poll)) { + if (errno == EWOULDBLOCK) { + /* We put the memory into the queue but did not finish writing control + * yet, let's wait a bit and retry */ + g_thread_yield (); + continue; + } else { + /* Critical error but GstPoll already complained */ + break; + } + } + gst_d3d11_pool_allocator_free_memory (self, memory); + } + + GST_LOG_OBJECT (self, "Clear done"); + + return priv->cur_mems == 0; +} + +/* must be called with the lock */ +static gboolean 
+gst_d3d11_pool_allocator_stop (GstD3D11PoolAllocator * self) +{ + GstD3D11PoolAllocatorPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (self, "Stop"); + + if (priv->started) { + if (!gst_d3d11_pool_allocator_clear_queue (self)) + return FALSE; + + priv->started = FALSE; + } else { + GST_DEBUG_OBJECT (self, "Wasn't started"); + } + + return TRUE; +} + +static inline void +dec_outstanding (GstD3D11PoolAllocator * self) +{ + if (g_atomic_int_dec_and_test (&self->priv->outstanding)) { + /* all memory objects are returned to the pool, see if we need to free them */ + if (GST_D3D11_POOL_ALLOCATOR_IS_FLUSHING (self)) { + /* take the lock so that set_active is not run concurrently */ + GST_D3D11_POOL_ALLOCATOR_LOCK (self); + /* now that we have the lock, check if we have been de-activated with + * outstanding buffers */ + if (!self->priv->active) + gst_d3d11_pool_allocator_stop (self); + + GST_D3D11_POOL_ALLOCATOR_UNLOCK (self); + } + } +} + +static void +gst_d3d11_pool_allocator_release_memory (GstD3D11PoolAllocator * self, + GstMemory * mem) +{ + GST_LOG_OBJECT (self, "Released memory %p", mem); + + GST_MINI_OBJECT_CAST (mem)->dispose = NULL; + mem->allocator = (GstAllocator *) gst_object_ref (_d3d11_memory_allocator); + gst_object_unref (self); + + /* keep it around in our queue */ + gst_atomic_queue_push (self->priv->queue, mem); + gst_poll_write_control (self->priv->poll); + dec_outstanding (self); +} + +static gboolean +gst_d3d11_memory_release (GstMiniObject * mini_object) +{ + GstMemory *mem = GST_MEMORY_CAST (mini_object); + GstD3D11PoolAllocator *alloc; + + g_assert (mem->allocator != NULL); + + if (!GST_IS_D3D11_POOL_ALLOCATOR (mem->allocator)) { + GST_LOG_OBJECT (mem->allocator, "Not our memory, free"); + return TRUE; + } + + alloc = GST_D3D11_POOL_ALLOCATOR (mem->allocator); + /* if flushing, free this memory */ + if (GST_D3D11_POOL_ALLOCATOR_IS_FLUSHING (alloc)) { + GST_LOG_OBJECT (alloc, "allocator is flushing, free %p", mem); + return TRUE; + } + + /* 
return the memory to the allocator */ + gst_memory_ref (mem); + gst_d3d11_pool_allocator_release_memory (alloc, mem); + + return FALSE; +} + +static GstFlowReturn +gst_d3d11_pool_allocator_alloc (GstD3D11PoolAllocator * self, GstMemory ** mem) +{ + GstD3D11PoolAllocatorPrivate *priv = self->priv; + GstMemory *new_mem; + + /* we allocate the texture array during start */ + if (priv->desc.ArraySize > 1) + return GST_FLOW_EOS; + + /* increment the allocation counter */ + g_atomic_int_add (&priv->cur_mems, 1); + new_mem = + gst_d3d11_allocator_alloc_internal (GST_D3D11_ALLOCATOR_CAST + (_d3d11_memory_allocator), self->device, &priv->desc); + if (!new_mem) { + GST_ERROR_OBJECT (self, "Failed to allocate new memory"); + g_atomic_int_add (&priv->cur_mems, -1); + return GST_FLOW_ERROR; + } + + if (!priv->mem_size) { + if (!gst_d3d11_memory_update_size (new_mem)) { + GST_ERROR_OBJECT (self, "Failed to calculate size"); + gst_memory_unref (new_mem); + g_atomic_int_add (&priv->cur_mems, -1); + + return GST_FLOW_ERROR; + } + + priv->mem_size = new_mem->size; + } + + new_mem->size = new_mem->maxsize = priv->mem_size; + + *mem = new_mem; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_pool_allocator_acquire_memory_internal (GstD3D11PoolAllocator * self, + GstMemory ** memory) +{ + GstFlowReturn result; + GstD3D11PoolAllocatorPrivate *priv = self->priv; + + while (TRUE) { + if (G_UNLIKELY (GST_D3D11_POOL_ALLOCATOR_IS_FLUSHING (self))) + goto flushing; + + /* try to get a memory from the queue */ + *memory = (GstMemory *) gst_atomic_queue_pop (priv->queue); + if (G_LIKELY (*memory)) { + while (!gst_poll_read_control (priv->poll)) { + if (errno == EWOULDBLOCK) { + /* We put the memory into the queue but did not finish writing control + * yet, let's wait a bit and retry */ + g_thread_yield (); + continue; + } else { + /* Critical error but GstPoll already complained */ + break; + } + } + result = GST_FLOW_OK; + GST_LOG_OBJECT (self, "acquired memory %p", *memory); + break; 
+ } + + /* no memory, try to allocate some more */ + GST_LOG_OBJECT (self, "no memory, trying to allocate"); + result = gst_d3d11_pool_allocator_alloc (self, memory); + if (G_LIKELY (result == GST_FLOW_OK)) + /* we have a memory, return it */ + break; + + if (G_UNLIKELY (result != GST_FLOW_EOS)) + /* something went wrong, return error */ + break; + + /* now we release the control socket, we wait for a memory release or + * flushing */ + if (!gst_poll_read_control (priv->poll)) { + if (errno == EWOULDBLOCK) { + /* This means that we have two threads trying to allocate memory + * already, and the other one already got the wait token. This + * means that we only have to wait for the poll now and not write the + * token afterwards: we will be woken up once the other thread is + * woken up and that one will write the wait token it removed */ + GST_LOG_OBJECT (self, "waiting for free memory or flushing"); + gst_poll_wait (priv->poll, GST_CLOCK_TIME_NONE); + } else { + /* This is a critical error, GstPoll already gave a warning */ + result = GST_FLOW_ERROR; + break; + } + } else { + /* We're the first thread waiting, we got the wait token and have to + * write it again later + * OR + * We're a second thread and just consumed the flush token and block all + * other threads, in which case we must not wait and give it back + * immediately */ + if (!GST_D3D11_POOL_ALLOCATOR_IS_FLUSHING (self)) { + GST_LOG_OBJECT (self, "waiting for free memory or flushing"); + gst_poll_wait (priv->poll, GST_CLOCK_TIME_NONE); + } + gst_poll_write_control (priv->poll); + } + } + + return result; + + /* ERRORS */ +flushing: + { + GST_DEBUG_OBJECT (self, "we are flushing"); + return GST_FLOW_FLUSHING; + } +} + +/** + * gst_d3d11_pool_allocator_new: + * @device: a #GstD3D11Device + * @desc: a D3D11_TEXTURE2D_DESC for texture allocation + * + * Creates a new #GstD3D11PoolAllocator instance. 
+ * + * Returns: (transfer full): a new #GstD3D11PoolAllocator instance + */ +GstD3D11PoolAllocator * +gst_d3d11_pool_allocator_new (GstD3D11Device * device, + const D3D11_TEXTURE2D_DESC * desc) +{ + GstD3D11PoolAllocator *self; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (desc != NULL, NULL); + + gst_d3d11_memory_init_once (); + + self = (GstD3D11PoolAllocator *) + g_object_new (GST_TYPE_D3D11_POOL_ALLOCATOR, NULL); + gst_object_ref_sink (self); + + self->device = (GstD3D11Device *) gst_object_ref (device); + self->priv->desc = *desc; + + return self; +} + +/** + * gst_d3d11_pool_allocator_acquire_memory: + * @allocator: a #GstD3D11PoolAllocator + * @memory: (transfer full): a #GstMemory + * + * Acquires a #GstMemory from @allocator. @memory should point to a memory + * location that can hold a pointer to the new #GstMemory. + * + * Returns: a #GstFlowReturn such as %GST_FLOW_FLUSHING when the allocator is + * inactive. + */ +GstFlowReturn +gst_d3d11_pool_allocator_acquire_memory (GstD3D11PoolAllocator * allocator, + GstMemory ** memory) +{ + GstD3D11PoolAllocatorPrivate *priv; + GstFlowReturn result; + + g_return_val_if_fail (GST_IS_D3D11_POOL_ALLOCATOR (allocator), + GST_FLOW_ERROR); + g_return_val_if_fail (memory != NULL, GST_FLOW_ERROR); + + priv = allocator->priv; + + /* assume we'll have one more outstanding buffer we need to do that so + * that concurrent set_active doesn't clear the buffers */ + g_atomic_int_inc (&priv->outstanding); + result = gst_d3d11_pool_allocator_acquire_memory_internal (allocator, memory); + + if (result == GST_FLOW_OK) { + GstMemory *mem = *memory; + /* Replace default allocator with ours */ + gst_object_unref (mem->allocator); + mem->allocator = (GstAllocator *) gst_object_ref (allocator); + GST_MINI_OBJECT_CAST (mem)->dispose = gst_d3d11_memory_release; + } else { + dec_outstanding (allocator); + } + + return result; +} + +/** + * gst_d3d11_pool_allocator_get_pool_size: + * @allocator: a 
#GstD3D11PoolAllocator + * @max_size: (out) (optional): the max size of pool + * @outstanding_size: (out) (optional): the number of outstanding memory + * + * Returns: %TRUE if the size of memory pool is known + * + * Since: 1.20 + */ +gboolean +gst_d3d11_pool_allocator_get_pool_size (GstD3D11PoolAllocator * allocator, + guint * max_size, guint * outstanding_size) +{ + GstD3D11PoolAllocatorPrivate *priv; + + g_return_val_if_fail (GST_IS_D3D11_POOL_ALLOCATOR (allocator), FALSE); + + priv = allocator->priv; + + if (max_size) { + if (priv->desc.ArraySize > 1) { + *max_size = priv->desc.ArraySize; + } else { + /* For non-texture-array memory, we don't have any limit yet */ + *max_size = 0; + } + } + + if (outstanding_size) + *outstanding_size = g_atomic_int_get (&priv->outstanding); + + return TRUE; +}
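The pool allocator's shutdown path is the subtle part of the code above: deactivation sets the flushing flag, releases during flush free the memory instead of re-queueing it, and the actual stop is deferred (via dec_outstanding) until the outstanding count reaches zero. The following is a minimal, single-threaded C model of just that bookkeeping; it is an illustration only, with invented `ToyPool`/`toy_pool_*` names, and omits the GstAtomicQueue/GstPoll token machinery and all thread safety:

```c
#include <assert.h>
#include <stdbool.h>

#define POOL_CAP 4

typedef struct {
  int queue[POOL_CAP];  /* available "memories", modelled as plain ids */
  int queued;           /* entries currently sitting in the queue */
  int outstanding;      /* memories handed out and not yet returned */
  int cur_mems;         /* memories alive in total (queued + outstanding) */
  bool flushing;        /* set while the pool is being deactivated */
} ToyPool;

static void toy_pool_start (ToyPool * pool)
{
  pool->queued = pool->cur_mems = POOL_CAP;
  pool->outstanding = 0;
  pool->flushing = false;
  for (int i = 0; i < POOL_CAP; i++)
    pool->queue[i] = i;
}

/* Acquire: refused while flushing; otherwise pop one and count it outstanding */
static bool toy_pool_acquire (ToyPool * pool, int *mem)
{
  if (pool->flushing || pool->queued == 0)
    return false;
  *mem = pool->queue[--pool->queued];
  pool->outstanding++;
  return true;
}

/* Release: while flushing the memory is freed instead of re-queued,
 * mirroring gst_d3d11_memory_release(); the last returned memory
 * completes the deferred stop, mirroring dec_outstanding() */
static void toy_pool_release (ToyPool * pool, int mem)
{
  pool->outstanding--;
  if (pool->flushing)
    pool->cur_mems--;                   /* freed for real */
  else
    pool->queue[pool->queued++] = mem;  /* back into the pool */
  if (pool->flushing && pool->outstanding == 0) {
    pool->cur_mems -= pool->queued;     /* drain whatever is still queued */
    pool->queued = 0;
  }
}

/* Deactivate: start flushing; stop immediately only if nothing is outstanding */
static void toy_pool_set_inactive (ToyPool * pool)
{
  pool->flushing = true;
  if (pool->outstanding == 0) {
    pool->cur_mems -= pool->queued;
    pool->queued = 0;
  }
}
```

With one memory still outstanding, toy_pool_set_inactive() leaves every memory alive; only when the last one is returned does the pool drain completely, which is the same deferred-stop behaviour the real allocator implements with its atomic outstanding counter.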
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11memory.h
Added
@@ -0,0 +1,279 @@ +/* + * GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_MEMORY_H__ +#define __GST_D3D11_MEMORY_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11_fwd.h> +#include <gst/d3d11/gstd3d11format.h> + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_ALLOCATION_PARAMS (gst_d3d11_allocation_params_get_type()) + +#define GST_TYPE_D3D11_MEMORY (gst_d3d11_memory_get_type()) +#define GST_D3D11_MEMORY_CAST(obj) ((GstD3D11Memory *)obj) + +#define GST_TYPE_D3D11_ALLOCATOR (gst_d3d11_allocator_get_type()) +#define GST_D3D11_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_D3D11_ALLOCATOR, GstD3D11Allocator)) +#define GST_D3D11_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_ALLOCATOR, GstD3D11AllocatorClass)) +#define GST_IS_D3D11_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_D3D11_ALLOCATOR)) +#define GST_IS_D3D11_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_ALLOCATOR)) +#define GST_D3D11_ALLOCATOR_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_ALLOCATOR, GstD3D11AllocatorClass)) 
+#define GST_D3D11_ALLOCATOR_CAST(obj) ((GstD3D11Allocator *)obj) + +#define GST_TYPE_D3D11_POOL_ALLOCATOR (gst_d3d11_pool_allocator_get_type()) +#define GST_D3D11_POOL_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_D3D11_POOL_ALLOCATOR, GstD3D11PoolAllocator)) +#define GST_D3D11_POOL_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_POOL_ALLOCATOR, GstD3D11PoolAllocatorClass)) +#define GST_IS_D3D11_POOL_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_D3D11_POOL_ALLOCATOR)) +#define GST_IS_D3D11_POOL_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_POOL_ALLOCATOR)) +#define GST_D3D11_POOL_ALLOCATOR_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_POOL_ALLOCATOR, GstD3D11PoolAllocatorClass)) +#define GST_D3D11_POOL_ALLOCATOR_CAST(obj) ((GstD3D11PoolAllocator *)obj) + +/** + * GST_D3D11_MEMORY_NAME: + * + * The name of the Direct3D11 memory + * + * Since: 1.20 + */ +#define GST_D3D11_MEMORY_NAME "D3D11Memory" + +/** + * GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY: + * + * Name of the caps feature for indicating the use of #GstD3D11Memory + * + * Since: 1.20 + */ +#define GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "memory:D3D11Memory" + +/** + * GST_MAP_D3D11: + * + * Flag indicating that we should map the D3D11 resource instead of mapping to system memory. + * + * Since: 1.20 + */ +#define GST_MAP_D3D11 (GST_MAP_FLAG_LAST << 1) + +/** + * GstD3D11AllocationFlags: + * @GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY: Indicates each allocated texture + * should be array type. This type of texture + * is used for D3D11/DXVA decoders + * in general. 
+ * + * Since: 1.20 + */ +typedef enum +{ + GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY = (1 << 0), +} GstD3D11AllocationFlags; + +/** + * GstD3D11MemoryTransfer: + * @GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD: the texture needs downloading + * to the staging texture memory + * @GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD: the staging texture needs uploading + * to the texture + * + * Since: 1.20 + */ +typedef enum +{ + GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD = (GST_MEMORY_FLAG_LAST << 0), + GST_D3D11_MEMORY_TRANSFER_NEED_UPLOAD = (GST_MEMORY_FLAG_LAST << 1) +} GstD3D11MemoryTransfer; + +struct _GstD3D11AllocationParams +{ + /* Texture description per plane */ + D3D11_TEXTURE2D_DESC desc[GST_VIDEO_MAX_PLANES]; + + GstVideoInfo info; + GstVideoInfo aligned_info; + const GstD3D11Format *d3d11_format; + + GstD3D11AllocationFlags flags; + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING_LARGE]; +}; + +GST_D3D11_API +GType gst_d3d11_allocation_params_get_type (void); + +GST_D3D11_API +GstD3D11AllocationParams * gst_d3d11_allocation_params_new (GstD3D11Device * device, + GstVideoInfo * info, + GstD3D11AllocationFlags flags, + guint bind_flags); + +GST_D3D11_API +GstD3D11AllocationParams * gst_d3d11_allocation_params_copy (GstD3D11AllocationParams * src); + +GST_D3D11_API +void gst_d3d11_allocation_params_free (GstD3D11AllocationParams * params); + +GST_D3D11_API +gboolean gst_d3d11_allocation_params_alignment (GstD3D11AllocationParams * parms, + GstVideoAlignment * align); + +struct _GstD3D11Memory +{ + GstMemory mem; + + /*< public >*/ + GstD3D11Device *device; + + /*< private >*/ + GstD3D11MemoryPrivate *priv; + gpointer _gst_reserved[GST_PADDING]; +}; + +GST_D3D11_API +GType gst_d3d11_memory_get_type (void); + +GST_D3D11_API +void gst_d3d11_memory_init_once (void); + +GST_D3D11_API +gboolean gst_is_d3d11_memory (GstMemory * mem); + +GST_D3D11_API +ID3D11Texture2D * gst_d3d11_memory_get_texture_handle (GstD3D11Memory * mem); + +GST_D3D11_API +gboolean 
gst_d3d11_memory_get_texture_desc (GstD3D11Memory * mem, + D3D11_TEXTURE2D_DESC * desc); + +GST_D3D11_API +gboolean gst_d3d11_memory_get_texture_stride (GstD3D11Memory * mem, + guint * stride); + +GST_D3D11_API +guint gst_d3d11_memory_get_subresource_index (GstD3D11Memory * mem); + +GST_D3D11_API +guint gst_d3d11_memory_get_shader_resource_view_size (GstD3D11Memory * mem); + +GST_D3D11_API +ID3D11ShaderResourceView * gst_d3d11_memory_get_shader_resource_view (GstD3D11Memory * mem, + guint index); + +GST_D3D11_API +guint gst_d3d11_memory_get_render_target_view_size (GstD3D11Memory * mem); + +GST_D3D11_API +ID3D11RenderTargetView * gst_d3d11_memory_get_render_target_view (GstD3D11Memory * mem, + guint index); + +GST_D3D11_API +ID3D11VideoDecoderOutputView * gst_d3d11_memory_get_decoder_output_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + GUID * decoder_profile); + +GST_D3D11_API +ID3D11VideoProcessorInputView * gst_d3d11_memory_get_processor_input_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + ID3D11VideoProcessorEnumerator * enumerator); + +GST_D3D11_API +ID3D11VideoProcessorOutputView * gst_d3d11_memory_get_processor_output_view (GstD3D11Memory * mem, + ID3D11VideoDevice * video_device, + ID3D11VideoProcessorEnumerator * enumerator); + +struct _GstD3D11Allocator +{ + GstAllocator allocator; + + /*< private >*/ + GstD3D11AllocatorPrivate *priv; + + gpointer _gst_reserved[GST_PADDING]; +}; + +struct _GstD3D11AllocatorClass +{ + GstAllocatorClass allocator_class; + + gboolean (*set_actvie) (GstD3D11Allocator * allocator, + gboolean active); + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING_LARGE]; +}; + +GST_D3D11_API +GType gst_d3d11_allocator_get_type (void); + +GST_D3D11_API +GstMemory * gst_d3d11_allocator_alloc (GstD3D11Allocator * allocator, + GstD3D11Device * device, + const D3D11_TEXTURE2D_DESC * desc); + +GST_D3D11_API +gboolean gst_d3d11_allocator_set_active (GstD3D11Allocator * allocator, + gboolean active); + 
+struct _GstD3D11PoolAllocator +{ + GstD3D11Allocator allocator; + + /*< public >*/ + GstD3D11Device *device; + + /*< private >*/ + GstD3D11PoolAllocatorPrivate *priv; + + gpointer _gst_reserved[GST_PADDING]; +}; + +struct _GstD3D11PoolAllocatorClass +{ + GstD3D11AllocatorClass allocator_class; + + /*< private >*/ + gpointer _gst_reserved[GST_PADDING]; +}; + +GST_D3D11_API +GType gst_d3d11_pool_allocator_get_type (void); + +GST_D3D11_API +GstD3D11PoolAllocator * gst_d3d11_pool_allocator_new (GstD3D11Device * device, + const D3D11_TEXTURE2D_DESC * desc); + +GST_D3D11_API +GstFlowReturn gst_d3d11_pool_allocator_acquire_memory (GstD3D11PoolAllocator * allocator, + GstMemory ** memory); + +GST_D3D11_API +gboolean gst_d3d11_pool_allocator_get_pool_size (GstD3D11PoolAllocator * allocator, + guint * max_size, + guint * outstanding_size); + +G_END_DECLS + +#endif /* __GST_D3D11_MEMORY_H__ */
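The header above follows a common GStreamer convention for extending core bit-flag ranges: GstD3D11MemoryTransfer flags start at GST_MEMORY_FLAG_LAST, and GST_MAP_D3D11 sits above GST_MAP_FLAG_LAST, so subsystem flags can never collide with the flags defined by GStreamer core. The snippet below reproduces that pattern with a stand-in base constant (`BASE_FLAG_LAST` is an invented value for illustration; the real constants come from gstmemory.h):

```c
#include <assert.h>

/* Stand-in for GST_MEMORY_FLAG_LAST / GST_MAP_FLAG_LAST; only the relative
 * bit layout matters for this illustration, not the concrete value. */
#define BASE_FLAG_LAST (1 << 16)

/* Same shape as GstD3D11MemoryTransfer: subsystem flags are allocated in
 * the bit range at and above the "last" core flag. */
typedef enum {
  TRANSFER_NEED_DOWNLOAD = (BASE_FLAG_LAST << 0),
  TRANSFER_NEED_UPLOAD   = (BASE_FLAG_LAST << 1)
} TransferFlags;

/* Same shape as GST_MAP_D3D11: one private map flag above the core ones */
#define MAP_D3D11 (BASE_FLAG_LAST << 1)

/* Core flags all live strictly below BASE_FLAG_LAST, so a well-formed
 * subsystem flag has no bits set in that region. */
static int collides_with_core (unsigned int flag)
{
  return (flag & (BASE_FLAG_LAST - 1)) != 0;
}
```

Because each subsystem shifts from the core's `_LAST` sentinel rather than hardcoding bit positions, new core flags can be added without renumbering every extension.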
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11utils.cpp
Added
@@ -0,0 +1,549 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11utils.h" +#include "gstd3d11device.h" +#include "gstd3d11_private.h" + +#include <windows.h> +#include <versionhelpers.h> + +GST_DEBUG_CATEGORY_STATIC (GST_CAT_CONTEXT); +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT ensure_debug_category() +static GstDebugCategory * +ensure_debug_category (void) +{ + static gsize cat_gonce = 0; + + if (g_once_init_enter (&cat_gonce)) { + gsize cat_done; + + cat_done = (gsize) _gst_debug_category_new ("d3d11utils", 0, + "d3d11 utility functions"); + + g_once_init_leave (&cat_gonce, cat_done); + } + + return (GstDebugCategory *) cat_gonce; +} +#else +#define ensure_debug_category() /* NOOP */ +#endif /* GST_DISABLE_GST_DEBUG */ + +static void +_init_context_debug (void) +{ + static gsize _init = 0; + + if (g_once_init_enter (&_init)) { + GST_DEBUG_CATEGORY_GET (GST_CAT_CONTEXT, "GST_CONTEXT"); + g_once_init_leave (&_init, 1); + } +} + +/** + * gst_d3d11_handle_set_context: + * @element: a #GstElement + * @context: a #GstContext + * @adapter_index: a DXGI adapter index + * @device: (inout) (transfer full): 
location of a #GstD3D11Device + * + * Helper function for implementing #GstElementClass.set_context() in + * D3D11 capable elements. + * + * Retrieves the #GstD3D11Device in @context and places the result in @device. + * @device is accepted if @adapter_index is equal to -1 (accept any device) + * or equal to that of @device + * + * Returns: whether the @device could be set successfully + * + * Since: 1.20 + */ +gboolean +gst_d3d11_handle_set_context (GstElement * element, GstContext * context, + gint adapter_index, GstD3D11Device ** device) +{ + const gchar *context_type; + + g_return_val_if_fail (GST_IS_ELEMENT (element), FALSE); + g_return_val_if_fail (device != NULL, FALSE); + + _init_context_debug (); + + if (!context) + return FALSE; + + context_type = gst_context_get_context_type (context); + if (g_strcmp0 (context_type, GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE) == 0) { + const GstStructure *str; + GstD3D11Device *other_device = NULL; + guint other_adapter_index = 0; + + /* If we had device already, will not replace it */ + if (*device) + return TRUE; + + str = gst_context_get_structure (context); + + if (gst_structure_get (str, "device", GST_TYPE_D3D11_DEVICE, + &other_device, "adapter", G_TYPE_UINT, &other_adapter_index, + NULL)) { + if (adapter_index == -1 || (guint) adapter_index == other_adapter_index) { + GST_CAT_DEBUG_OBJECT (GST_CAT_CONTEXT, + element, "Found D3D11 device context"); + *device = other_device; + + return TRUE; + } + + gst_object_unref (other_device); + } + } + + return FALSE; +} + +/** + * gst_d3d11_handle_set_context_for_adapter_luid: + * @element: a #GstElement + * @context: a #GstContext + * @adapter_luid: an int64 representation of DXGI adapter LUID + * @device: (inout) (transfer full): location of a #GstD3D11Device + * + * Helper function for implementing #GstElementClass.set_context() in + * D3D11 capable elements. + * + * Retrieves the #GstD3D11Device in @context and places the result in @device.
+ * @device is accepted only when @adapter_luid is equal to that of @device + * + * Returns: whether the @device could be set successfully + * + * Since: 1.20 + */ +gboolean +gst_d3d11_handle_set_context_for_adapter_luid (GstElement * element, + GstContext * context, gint64 adapter_luid, GstD3D11Device ** device) +{ + const gchar *context_type; + + g_return_val_if_fail (GST_IS_ELEMENT (element), FALSE); + g_return_val_if_fail (device != NULL, FALSE); + + _init_context_debug (); + + if (!context) + return FALSE; + + context_type = gst_context_get_context_type (context); + if (g_strcmp0 (context_type, GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE) == 0) { + const GstStructure *str; + GstD3D11Device *other_device = NULL; + gint64 other_adapter_luid = 0; + + /* If we had device already, will not replace it */ + if (*device) + return TRUE; + + str = gst_context_get_structure (context); + + if (gst_structure_get (str, "device", GST_TYPE_D3D11_DEVICE, + &other_device, "adapter-luid", G_TYPE_INT64, + &other_adapter_luid, NULL)) { + if (adapter_luid == other_adapter_luid) { + GST_CAT_DEBUG_OBJECT (GST_CAT_CONTEXT, + element, "Found D3D11 device context"); + *device = other_device; + + return TRUE; + } + + gst_object_unref (other_device); + } + } + + return FALSE; +} + +static void +context_set_d3d11_device (GstContext * context, GstD3D11Device * device) +{ + GstStructure *s; + guint adapter = 0; + guint device_id = 0; + guint vendor_id = 0; + gboolean hardware = FALSE; + gchar *desc = NULL; + gint64 adapter_luid = 0; + + g_return_if_fail (context != NULL); + + g_object_get (G_OBJECT (device), "adapter", &adapter, "device-id", &device_id, + "vendor-id", &vendor_id, "hardware", &hardware, "description", &desc, + "adapter-luid", &adapter_luid, NULL); + + GST_CAT_LOG (GST_CAT_CONTEXT, + "setting GstD3D11Device(%" GST_PTR_FORMAT + ") with adapter %d on context(%" GST_PTR_FORMAT ")", + device, adapter, context); + + s = gst_context_writable_structure (context); + gst_structure_set (s,
"device", GST_TYPE_D3D11_DEVICE, device, + "adapter", G_TYPE_UINT, adapter, + "adapter-luid", G_TYPE_INT64, adapter_luid, + "device-id", G_TYPE_UINT, device_id, + "vendor-id", G_TYPE_UINT, vendor_id, + "hardware", G_TYPE_BOOLEAN, hardware, + "description", G_TYPE_STRING, GST_STR_NULL (desc), NULL); + g_free (desc); +} + +/** + * gst_d3d11_handle_context_query: + * @element: a #GstElement + * @query: a #GstQuery of type %GST_QUERY_CONTEXT + * @device: (transfer none) (nullable): a #GstD3D11Device + * + * Returns: Whether the @query was successfully responded to from the passed + * @device. + * + * Since: 1.20 + */ +gboolean +gst_d3d11_handle_context_query (GstElement * element, GstQuery * query, + GstD3D11Device * device) +{ + const gchar *context_type; + GstContext *context, *old_context; + + g_return_val_if_fail (GST_IS_ELEMENT (element), FALSE); + g_return_val_if_fail (GST_IS_QUERY (query), FALSE); + + _init_context_debug (); + + GST_LOG_OBJECT (element, "handle context query %" GST_PTR_FORMAT, query); + + if (!device) + return FALSE; + + gst_query_parse_context_type (query, &context_type); + if (g_strcmp0 (context_type, GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE) != 0) + return FALSE; + + gst_query_parse_context (query, &old_context); + if (old_context) + context = gst_context_copy (old_context); + else + context = gst_context_new (GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE, TRUE); + + context_set_d3d11_device (context, device); + gst_query_set_context (query, context); + gst_context_unref (context); + + GST_DEBUG_OBJECT (element, "successfully set %" GST_PTR_FORMAT + " on %" GST_PTR_FORMAT, device, query); + + return TRUE; +} + +static gboolean +pad_query (const GValue * item, GValue * value, gpointer user_data) +{ + GstPad *pad = (GstPad *) g_value_get_object (item); + GstQuery *query = (GstQuery *) user_data; + gboolean res; + + res = gst_pad_peer_query (pad, query); + + if (res) { + g_value_set_boolean (value, TRUE); + return FALSE; + } + + GST_CAT_INFO_OBJECT 
(GST_CAT_CONTEXT, pad, "pad peer query failed"); + return TRUE; +} + +static gboolean +run_query (GstElement * element, GstQuery * query, GstPadDirection direction) +{ + GstIterator *it; + GstIteratorFoldFunction func = pad_query; + GValue res = { 0 }; + + g_value_init (&res, G_TYPE_BOOLEAN); + g_value_set_boolean (&res, FALSE); + + /* Ask neighbor */ + if (direction == GST_PAD_SRC) + it = gst_element_iterate_src_pads (element); + else + it = gst_element_iterate_sink_pads (element); + + while (gst_iterator_fold (it, func, &res, query) == GST_ITERATOR_RESYNC) + gst_iterator_resync (it); + + gst_iterator_free (it); + + return g_value_get_boolean (&res); +} + +static void +run_d3d11_context_query (GstElement * element, GstD3D11Device ** device) +{ + GstQuery *query; + GstContext *ctxt = NULL; + + /* 1) Query downstream with GST_QUERY_CONTEXT for the context and + * check if downstream already has a context of the specific type + */ + query = gst_query_new_context (GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE); + if (run_query (element, query, GST_PAD_SRC)) { + gst_query_parse_context (query, &ctxt); + if (ctxt) { + GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, + "found context (%" GST_PTR_FORMAT ") in downstream query", ctxt); + gst_element_set_context (element, ctxt); + } + } + + /* 2) although we found d3d11 device context above, the context might not be + * expected/wanted one by the element (e.g., belongs to the other GPU). + * Then try to find it from the other direction */ + if (*device == NULL && run_query (element, query, GST_PAD_SINK)) { + gst_query_parse_context (query, &ctxt); + if (ctxt) { + GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, + "found context (%" GST_PTR_FORMAT ") in upstream query", ctxt); + gst_element_set_context (element, ctxt); + } + } + + if (*device == NULL) { + /* 3) Post a GST_MESSAGE_NEED_CONTEXT message on the bus with + * the required context type and afterwards check if a + * usable context was set now as in 1). 
The message could + * be handled by the parent bins of the element and the + * application. + */ + GstMessage *msg; + + GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, + "posting need context message"); + msg = gst_message_new_need_context (GST_OBJECT_CAST (element), + GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE); + gst_element_post_message (element, msg); + } + + gst_query_unref (query); +} + +/** + * gst_d3d11_ensure_element_data: + * @element: the #GstElement running the query + * @adapter: preferred DXGI adapter index, pass adapter >=0 when + * the adapter is explicitly required. Otherwise, set -1. + * @device: (inout): the resulting #GstD3D11Device + * + * Perform the steps necessary for retrieving a #GstD3D11Device + * from the surrounding elements or from the application using the #GstContext mechanism. + * + * If the contents of @device is not %NULL, then no #GstContext query for + * #GstD3D11Device retrieval is performed. + * + * Returns: whether a #GstD3D11Device exists in @device + * + * Since: 1.20 + */ +gboolean +gst_d3d11_ensure_element_data (GstElement * element, gint adapter, + GstD3D11Device ** device) +{ + guint target_adapter = 0; + + g_return_val_if_fail (element != NULL, FALSE); + g_return_val_if_fail (device != NULL, FALSE); + + _init_context_debug (); + + if (*device) { + GST_LOG_OBJECT (element, "already have a device %" GST_PTR_FORMAT, *device); + return TRUE; + } + + run_d3d11_context_query (element, device); + if (*device) + return TRUE; + + if (adapter > 0) + target_adapter = adapter; + + /* Needs D3D11_CREATE_DEVICE_BGRA_SUPPORT flag for Direct2D interop */ + *device = gst_d3d11_device_new (target_adapter, + D3D11_CREATE_DEVICE_BGRA_SUPPORT); + + if (*device == NULL) { + GST_ERROR_OBJECT (element, + "Couldn't create new device with adapter index %d", target_adapter); + return FALSE; + } else { + GstContext *context; + GstMessage *msg; + + /* Propagate new D3D11 device context */ + + context = gst_context_new
(GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE, TRUE); + context_set_d3d11_device (context, *device); + + gst_element_set_context (element, context); + + GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, + "posting have context (%p) message with D3D11 device context (%p)", + context, *device); + msg = gst_message_new_have_context (GST_OBJECT_CAST (element), context); + gst_element_post_message (GST_ELEMENT_CAST (element), msg); + } + + return TRUE; +} + +/** + * gst_d3d11_ensure_element_data_for_adapter_luid: + * @element: the #GstElement running the query + * @adapter_luid: an int64 representation of DXGI adapter LUID + * @device: (inout): the resulting #GstD3D11Device + * + * Perform the steps necessary for retrieving a #GstD3D11Device + * from the surrounding elements or from the application using the #GstContext mechanism. + * + * If the contents of @device is not %NULL, then no #GstContext query for + * #GstD3D11Device retrieval is performed. + * + * Returns: whether a #GstD3D11Device exists in @device + * + * Since: 1.20 + */ +gboolean +gst_d3d11_ensure_element_data_for_adapter_luid (GstElement * element, + gint64 adapter_luid, GstD3D11Device ** device) +{ + g_return_val_if_fail (element != NULL, FALSE); + g_return_val_if_fail (device != NULL, FALSE); + + _init_context_debug (); + + if (*device) { + GST_LOG_OBJECT (element, "already have a device %" GST_PTR_FORMAT, *device); + return TRUE; + } + + run_d3d11_context_query (element, device); + if (*device) + return TRUE; + + /* Needs D3D11_CREATE_DEVICE_BGRA_SUPPORT flag for Direct2D interop */ + *device = gst_d3d11_device_new_for_adapter_luid (adapter_luid, + D3D11_CREATE_DEVICE_BGRA_SUPPORT); + + if (*device == NULL) { + GST_ERROR_OBJECT (element, + "Couldn't create new device with adapter luid %" G_GINT64_FORMAT, + adapter_luid); + return FALSE; + } else { + GstContext *context; + GstMessage *msg; + + /* Propagate new D3D11 device context */ + + context = gst_context_new
(GST_D3D11_DEVICE_HANDLE_CONTEXT_TYPE, TRUE); + context_set_d3d11_device (context, *device); + + gst_element_set_context (element, context); + + GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, + "posting have context (%p) message with D3D11 device context (%p)", + context, *device); + msg = gst_message_new_have_context (GST_OBJECT_CAST (element), context); + gst_element_post_message (GST_ELEMENT_CAST (element), msg); + } + + return TRUE; +} + +/** + * gst_d3d11_luid_to_int64: + * @luid: A pointer to LUID struct + * + * Converts from a LUID to a 64-bit signed integer. + * See also Int64FromLuid method defined in + * windows.devices.display.core.interop.h Windows SDK header + * + * Since: 1.20 + */ +gint64 +gst_d3d11_luid_to_int64 (const LUID * luid) +{ + LARGE_INTEGER val; + + g_return_val_if_fail (luid != nullptr, 0); + + val.LowPart = luid->LowPart; + val.HighPart = luid->HighPart; + + return val.QuadPart; +} + +gboolean +_gst_d3d11_result (HRESULT hr, GstD3D11Device * device, GstDebugCategory * cat, + const gchar * file, const gchar * function, gint line) +{ +#ifndef GST_DISABLE_GST_DEBUG + gboolean ret = TRUE; + + if (FAILED (hr)) { + gchar *error_text = NULL; + + error_text = g_win32_error_message ((guint) hr); + /* g_win32_error_message() doesn't cover all HRESULT return codes, + * so it could be an empty string, or null if there was an error + * in g_utf16_to_utf8() */ + gst_debug_log (cat, GST_LEVEL_WARNING, file, function, line, + NULL, "D3D11 call failed: 0x%x, %s", (guint) hr, + GST_STR_NULL (error_text)); + g_free (error_text); + + ret = FALSE; + } +#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H) + if (device) { + gst_d3d11_device_d3d11_debug (device, file, function, line); + gst_d3d11_device_dxgi_debug (device, file, function, line); + } +#endif + + return ret; +#else + return SUCCEEDED (hr); +#endif +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/gstd3d11utils.h
Added
@@ -0,0 +1,79 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_UTILS_H__ +#define __GST_D3D11_UTILS_H__ + +#include <gst/gst.h> +#include <gst/d3d11/gstd3d11_fwd.h> + +G_BEGIN_DECLS + +GST_D3D11_API +gboolean gst_d3d11_handle_set_context (GstElement * element, + GstContext * context, + gint adapter_index, + GstD3D11Device ** device); + +GST_D3D11_API +gboolean gst_d3d11_handle_set_context_for_adapter_luid (GstElement * element, + GstContext * context, + gint64 adapter_luid, + GstD3D11Device ** device); + +GST_D3D11_API +gboolean gst_d3d11_handle_context_query (GstElement * element, + GstQuery * query, + GstD3D11Device * device); + +GST_D3D11_API +gboolean gst_d3d11_ensure_element_data (GstElement * element, + gint adapter_index, + GstD3D11Device ** device); + +GST_D3D11_API +gboolean gst_d3d11_ensure_element_data_for_adapter_luid (GstElement * element, + gint64 adapter_luid, + GstD3D11Device ** device); + +GST_D3D11_API +gint64 gst_d3d11_luid_to_int64 (const LUID * luid); + +GST_D3D11_API +gboolean _gst_d3d11_result (HRESULT hr, + GstD3D11Device * device, + GstDebugCategory * cat, + const gchar * file, + const gchar * function, + gint line); +/** + * 
gst_d3d11_result: + * @result: HRESULT D3D11 API return code + * @device: (nullable): Associated #GstD3D11Device + * + * Returns: %TRUE if D3D11 API call result is SUCCESS + * + * Since: 1.20 + */ +#define gst_d3d11_result(result,device) \ + _gst_d3d11_result (result, device, GST_CAT_DEFAULT, __FILE__, GST_FUNCTION, __LINE__) + +G_END_DECLS + +#endif /* __GST_D3D11_UTILS_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/d3d11/meson.build
Added
@@ -0,0 +1,209 @@ +d3d11_sources = [ + 'gstd3d11bufferpool.cpp', + 'gstd3d11device.cpp', + 'gstd3d11format.cpp', + 'gstd3d11memory.cpp', + 'gstd3d11utils.cpp', +] + +dxgi_headers = [ + ['dxgi1_6.h', 6], + ['dxgi1_5.h', 5], + ['dxgi1_4.h', 4], + ['dxgi1_3.h', 3], + ['dxgi1_2.h', 2], + ['dxgi.h', 1] +] + +d3d11_headers = [ + ['d3d11_4.h', 4], + ['d3d11_3.h', 3], + ['d3d11_2.h', 2], + ['d3d11_1.h', 1], + ['d3d11.h', 0] +] + +gstd3d11_dep = dependency('', required : false) + +d3d11_option = get_option('d3d11') +if host_system != 'windows' or d3d11_option.disabled() + subdir_done() +endif + +have_d3d11 = false +extra_c_args = [ + '-DCOBJMACROS', +] +extra_comm_args = [ + '-DGST_USE_UNSTABLE_API', + '-DBUILDING_GST_D3D11', + '-DG_LOG_DOMAIN="GStreamer-D3D11"', +] + +have_dxgi_header = false +have_d3d11_header = false +have_d3d11sdk_h = false +have_dxgidebug_h = false +winapi_desktop = false +winapi_app = false +d3d11_conf = configuration_data() +d3d11_conf_options = [ + 'GST_D3D11_DXGI_HEADER_VERSION', + 'GST_D3D11_HEADER_VERSION', + 'GST_D3D11_WINAPI_ONLY_APP', +] + +foreach option : d3d11_conf_options + d3d11_conf.set10(option, false) +endforeach + +d3d11_lib = cc.find_library('d3d11', required : d3d11_option) +dxgi_lib = cc.find_library('dxgi', required : d3d11_option) +d3dcompiler_lib = cc.find_library('d3dcompiler', required: d3d11_option) +runtimeobject_lib = cc.find_library('runtimeobject', required : false) + +foreach dxgi_h: dxgi_headers + if not have_dxgi_header and cc.has_header(dxgi_h[0]) + d3d11_conf.set('GST_D3D11_DXGI_HEADER_VERSION', dxgi_h[1]) + have_dxgi_header = true + endif +endforeach + +foreach d3d11_h: d3d11_headers + if not have_d3d11_header and cc.has_header(d3d11_h[0]) + d3d11_conf.set('GST_D3D11_HEADER_VERSION', d3d11_h[1]) + have_d3d11_header = true + endif +endforeach + +have_d3d11 = d3d11_lib.found() and dxgi_lib.found() and have_d3d11_header and have_dxgi_header +if not have_d3d11 + if d3d11_option.enabled() + error('The d3d11 was enabled 
explicitly, but required dependencies were not found.') + endif + subdir_done() +endif + +d3d11_winapi_desktop = cxx.compiles('''#include <winapifamily.h> + #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP) + #error "not win32" + #endif''', + dependencies: [d3d11_lib, dxgi_lib], + name: 'building for Win32') + +if runtimeobject_lib.found() and d3dcompiler_lib.found() + d3d11_winapi_app = cxx.compiles('''#include <winapifamily.h> + #include <windows.applicationmodel.core.h> + #include <wrl.h> + #include <wrl/wrappers/corewrappers.h> + #include <d3d11.h> + #include <dxgi1_2.h> + #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_APP) + #error "not winrt" + #endif + #if (WINVER < 0x0A00) + #error "Windows 10 API is not guaranteed" + #endif''', + dependencies: [d3d11_lib, dxgi_lib, runtimeobject_lib], + name: 'building for WinRT') +endif + +if not d3d11_winapi_desktop and not d3d11_winapi_app + error('Neither Desktop partition nor App partition') +endif + +d3d11_winapi_only_app = d3d11_winapi_app and not d3d11_winapi_desktop +d3d11_conf.set10('GST_D3D11_WINAPI_ONLY_APP', d3d11_winapi_only_app) +d3d11_conf.set10('GST_D3D11_WINAPI_APP', d3d11_winapi_app) + +# for enabling debug layer +# NOTE: Disable d3d11/dxgi debug layer in case of [UWP build + release CRT] +# WACK (Windows App Certification Kit) doesn't seem to be happy with +# the DXGIGetDebugInterface1 symbol. + +# FIXME: Probably DXGIGetDebugInterface1 might be used on UWP app for development +# purpose. So, I suspect one possible reason why WACK is complaining about +# DXGIGetDebugInterface1 is that debugging APIs couldn't be used for +# Windows store app, but couldn't find any reference about that. +# +# [IDXGIDebug1] +# https://docs.microsoft.com/en-us/windows/win32/api/dxgidebug/nn-dxgidebug-idxgidebug1 +# is saying that the IDXGIDebug1 interface is available for both desktop app and +# UWP. And then the *DXGIGetDebugInterface1* method need to be called to obtain +# the IDXGIDebug1 interface. 
+# +# [DXGIGetDebugInterface1] +# https://docs.microsoft.com/en-us/windows/win32/api/dxgi1_3/nf-dxgi1_3-dxgigetdebuginterface1 +# is mentioning that DXGIGetDebugInterface1 is desktop app only. +# +# PLEASE LET US KNOW A CORRECT WAY TO OBTAIN IDXGIDebug1 ON UWP, MICROSOFT +if get_option('debug') and not (d3d11_winapi_only_app and get_option('b_vscrt') == 'md') + d3d11_debug_libs = [ + ['d3d11sdklayers.h', 'ID3D11Debug', 'ID3D11InfoQueue', 'have_d3d11sdk_h'], + ['dxgidebug.h', 'IDXGIDebug', 'IDXGIInfoQueue', 'have_dxgidebug_h'], + ] + + foreach f : d3d11_debug_libs + header = f.get(0) + debug_obj = f.get(1) + info_obj = f.get(2) + compile_code = ''' + #include <d3d11.h> + #include <dxgi.h> + #include <@0@> + int main(int arc, char ** argv) { + @1@ *debug = NULL; + @2@ *info_queue = NULL; + return 0; + }'''.format(header, debug_obj, info_obj) + if cc.compiles(compile_code, dependencies: [d3d11_lib, dxgi_lib], name: debug_obj) + set_variable(f.get(3), true) + endif + endforeach +else + message('Disable D3D11Debug and DXGIDebug layers') +endif + +# don't need to be defined in gstd3d11config.h since it's gstd3d11device internal +if have_d3d11sdk_h + extra_comm_args += ['-DHAVE_D3D11SDKLAYERS_H'] +endif + +if have_dxgidebug_h + extra_comm_args += ['-DHAVE_DXGIDEBUG_H'] +endif + +# MinGW 32bits compiler seems to be complaining about redundant-decls +# when ComPtr is in use. 
Let's just disable the warning +if cc.get_id() != 'msvc' + extra_args = cc.get_supported_arguments([ + '-Wno-redundant-decls', + ]) + + extra_comm_args += extra_args +endif + +configure_file( + output: 'gstd3d11config.h', + configuration: d3d11_conf, +) + +pkg_name = 'gstreamer-d3d11-' + api_version +gstd3d11 = library('gstd3d11-' + api_version, + d3d11_sources, + c_args : gst_plugins_bad_args + extra_c_args + extra_comm_args, + cpp_args : gst_plugins_bad_args + extra_comm_args, + include_directories : [configinc, libsinc], + version : libversion, + soversion : soversion, + install : true, + dependencies : [gstbase_dep, gstvideo_dep, gmodule_dep, d3d11_lib, dxgi_lib] +) + +library_def = {'lib': gstd3d11} +libraries += [[pkg_name, library_def]] + +# Still non-public api, should not install headers +gstd3d11_dep = declare_dependency(link_with : gstd3d11, + include_directories : [libsinc], + dependencies : [gstbase_dep, gstvideo_dep, gmodule_dep, d3d11_lib, dxgi_lib])
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/insertbin/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/insertbin/meson.build
Changed
@@ -1,10 +1,10 @@ -insert_sources = ['gstinsertbin.c'] -insert_headers = ['gstinsertbin.h'] +insert_sources = files('gstinsertbin.c') +insert_headers = files('gstinsertbin.h') install_headers(insert_headers, subdir : 'gstreamer-1.0/gst/insertbin') gstinsertbin = library('gstinsertbin-' + api_version, insert_sources, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_INSERT_BIN'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_INSERT_BIN', '-DG_LOG_DOMAIN="GStreamer-InsertBin"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -13,24 +13,41 @@ dependencies : [gst_dep], ) +library_def = {'lib': gstinsertbin} +pkg_name = 'gstreamer-insertbin-1.0' +pkgconfig.generate(gstinsertbin, + libraries : [gst_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'Bin to automatically and insertally link elements', +) + gen_sources = [] if build_gir - insertbin_gir = gnome.generate_gir(gstinsertbin, - sources : insert_sources + insert_headers, - namespace : 'GstInsertBin', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-insertbin-1.0', - includes : ['Gst-1.0'], - install : true, - extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/insertbin/gstinsertbin.h'], - dependencies : [gst_dep] - ) - gen_sources += insertbin_gir + gir = { + 'sources' : insert_sources + insert_headers, + 'namespace' : 'GstInsertBin', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0'], + 'install' : true, + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/insertbin/gstinsertbin.h'], + 'dependencies' : [gst_dep] + } + library_def += {'gir': [gir]} + if not static_build + insertbin_gir = gnome.generate_gir(gstinsertbin, kwargs: gir) + 
gen_sources += insertbin_gir + endif endif +libraries += [[pkg_name, library_def]] gstinsertbin_dep = declare_dependency(link_with : gstinsertbin, include_directories : [libsinc], sources: gen_sources, dependencies : [gst_dep]) + +meson.override_dependency(pkg_name, gstinsertbin_dep)
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/interfaces/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/interfaces/meson.build
Changed
@@ -13,9 +13,10 @@ photoenum_c = photo_enums[0] photoenum_h = photo_enums[1] +pkg_name = 'gstreamer-photography-1.0' gstphotography = library('gstphotography-' + api_version, photography_sources, photoenum_h, photoenum_c, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_PHOTOGRAPHY'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_PHOTOGRAPHY', '-DG_LOG_DOMAIN="GStreamer-Photography"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -24,7 +25,18 @@ dependencies : [gst_dep], ) +pkgconfig.generate(gstphotography, + libraries : [gst_dep, gstbase_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'High level API for transcoding using GStreamer', +) + gstphotography_dep = declare_dependency(link_with : gstphotography, include_directories : [libsinc], dependencies : [gst_dep], sources : [photoenum_h]) + +libraries += [[pkg_name, {'lib': gstphotography}]] +meson.override_dependency(pkg_name, gstphotography_dep)
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/isoff/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/isoff/meson.build
Changed
@@ -8,7 +8,7 @@ gstisoff = library('gstisoff-' + api_version, isoff_sources, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_ISOFF'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_ISOFF', '-DG_LOG_DOMAIN="GStreamer-ISOFF"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion,
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/meson.build
Changed
@@ -5,14 +5,18 @@ subdir('basecamerabinsrc') subdir('codecparsers') subdir('codecs') +subdir('d3d11') subdir('insertbin') subdir('interfaces') subdir('isoff') subdir('mpegts') subdir('opencv') +subdir('play') subdir('player') subdir('sctp') subdir('transcoder') +subdir('va') subdir('vulkan') subdir('wayland') subdir('webrtc') +subdir('winrt')
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-atsc-descriptor.h
Added
@@ -0,0 +1,87 @@ +/* + * gstmpegtsdescriptor.h - + * Copyright (C) 2020 Edward Hervey + * + * Authors: + * Edward Hervey <edward@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef GST_ATSC_DESCRIPTOR_H +#define GST_ATSC_DESCRIPTOR_H + +#include <gst/gst.h> +#include <gst/mpegts/mpegts-prelude.h> + +G_BEGIN_DECLS + +/** + * SECTION:gst-atsc-descriptor + * @title: ATSC variants of MPEG-TS descriptors + * @short_description: Descriptors for the various ATSC specifications + * @include: gst/mpegts/mpegts.h + * + * This contains the various descriptors defined by the ATSC specifications + */ + +/** + * GstMpegtsATSCDescriptorType: + * + * These values correspond to the registered descriptor type from + * the various ATSC specifications. + * + * Consult the relevant specifications for more details. 
+ */ +typedef enum { + /* ATSC A/65 2009 */ + GST_MTS_DESC_ATSC_STUFFING = 0x80, + GST_MTS_DESC_ATSC_AC3 = 0x81, + GST_MTS_DESC_ATSC_CAPTION_SERVICE = 0x86, + GST_MTS_DESC_ATSC_CONTENT_ADVISORY = 0x87, + GST_MTS_DESC_ATSC_EXTENDED_CHANNEL_NAME = 0xA0, + GST_MTS_DESC_ATSC_SERVICE_LOCATION = 0xA1, + GST_MTS_DESC_ATSC_TIME_SHIFTED_SERVICE = 0xA2, + GST_MTS_DESC_ATSC_COMPONENT_NAME = 0xA3, + GST_MTS_DESC_ATSC_DCC_DEPARTING_REQUEST = 0xA8, + GST_MTS_DESC_ATSC_DCC_ARRIVING_REQUEST = 0xA9, + GST_MTS_DESC_ATSC_REDISTRIBUTION_CONTROL = 0xAA, + GST_MTS_DESC_ATSC_GENRE = 0xAB, + GST_MTS_DESC_ATSC_PRIVATE_INFORMATION = 0xAD, + GST_MTS_DESC_ATSC_EAC3 = 0xCC, + + /* ATSC A/53:3 2009 */ + GST_MTS_DESC_ATSC_ENHANCED_SIGNALING = 0xB2, + + /* ATSC A/90 */ + GST_MTS_DESC_ATSC_DATA_SERVICE = 0xA4, + GST_MTS_DESC_ATSC_PID_COUNT = 0xA5, + GST_MTS_DESC_ATSC_DOWNLOAD_DESCRIPTOR = 0xA6, + GST_MTS_DESC_ATSC_MULTIPROTOCOL_ENCAPSULATION = 0xA7, + GST_MTS_DESC_ATSC_MODULE_LINK = 0xB4, + GST_MTS_DESC_ATSC_CRC32 = 0xB5, + GST_MTS_DESC_ATSC_GROUP_LINK = 0xB8, +} GstMpegtsATSCDescriptorType; + +/* For backwards compatibility */ +/** + * GST_MTS_DESC_AC3_AUDIO_STREAM: (skip) (attributes doc.skip=true) + */ +#define GST_MTS_DESC_AC3_AUDIO_STREAM GST_MTS_DESC_ATSC_AC3 + +G_END_DECLS + +#endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-atsc-section.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-atsc-section.c
Changed
@@ -35,6 +35,50 @@ * @short_description: Sections for the various ATSC specifications * @include: gst/mpegts/mpegts.h * + * The list of section types defined and used by the ATSC specifications can be + * seen in %GstMpegtsSectionATSCTableID. + * + * # Supported ATSC MPEG-TS sections + * These are the sections for which parsing and packetizing code exists. + * + * ## Master Guide Table (MGT) + * See: + * * gst_mpegts_section_get_atsc_mgt() + * * %GstMpegtsAtscMGT + * * %GstMpegtsAtscMGTTable + * * gst_mpegts_atsc_mgt_new() + * + * ## Terrestrial (TVCT) and Cable (CVCT) Virtual Channel Table + * See: + * * gst_mpegts_section_get_atsc_tvct() + * * gst_mpegts_section_get_atsc_cvct() + * * %GstMpegtsAtscVCT + * * %GstMpegtsAtscVCTSource + * + * ## Rating Region Table (RRT) + * See: + * * gst_mpegts_section_get_atsc_rrt() + * * %GstMpegtsAtscRRT + * * gst_mpegts_atsc_rrt_new() + * + * ## Event Information Table (EIT) + * See: + * * gst_mpegts_section_get_atsc_eit() + * * %GstMpegtsAtscEIT + * * %GstMpegtsAtscEITEvent + * + * ## Extended Text Table (ETT) + * See: + * * gst_mpegts_section_get_atsc_ett() + * * %GstMpegtsAtscETT + * + * ## System Time Table (STT) + * See: + * * gst_mpegts_section_get_atsc_stt() + * * %GstMpegtsAtscSTT + * * gst_mpegts_atsc_stt_new() + * + * # API */ /* Terrestrial/Cable Virtual Channel Table TVCT/CVCT */
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-atsc-section.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-atsc-section.h
Changed
@@ -32,16 +32,30 @@ /** * GstMpegtsSectionATSCTableID: + * @GST_MTS_TABLE_ID_ATSC_MASTER_GUIDE: Master Guide Table (MGT) + * @GST_MTS_TABLE_ID_ATSC_TERRESTRIAL_VIRTUAL_CHANNEL: Terrestrial Virtual Channel Table (TVCT) + * @GST_MTS_TABLE_ID_ATSC_CABLE_VIRTUAL_CHANNEL: Cable Virtual Channel Table (CVCT) + * @GST_MTS_TABLE_ID_ATSC_RATING_REGION: Rating Region Table (RRT) + * @GST_MTS_TABLE_ID_ATSC_EVENT_INFORMATION: Event Information Table (EIT) + * @GST_MTS_TABLE_ID_ATSC_CHANNEL_OR_EVENT_EXTENDED_TEXT: Extended Text Table (ETT) + * @GST_MTS_TABLE_ID_ATSC_SYSTEM_TIME: System Time Table (STT) + * @GST_MTS_TABLE_ID_ATSC_DATA_EVENT: A/90: Data Event Table (DET) + * @GST_MTS_TABLE_ID_ATSC_DATA_SERVICE: A/90: Data Service Table (DST) + * @GST_MTS_TABLE_ID_ATSC_NETWORK_RESOURCE: A/90: Network Resources Table (NRT) + * @GST_MTS_TABLE_ID_ATSC_LONG_TERM_SERVICE: A/90: Long Term Service Table (LTST) + * @GST_MTS_TABLE_ID_ATSC_DIRECTED_CHANNEL_CHANGE: Directed Channel Change Table (DCCT) + * @GST_MTS_TABLE_ID_ATSC_DIRECTED_CHANNEL_CHANGE_SECTION_CODE: Directed Channel Change Selection Code Table (DCCSCT) + * @GST_MTS_TABLE_ID_ATSC_SATELLITE_VIRTUAL_CHANNEL: A/81: Satellite Virtual Channel Table * * Values for a #GstMpegtsSection table_id. * - * These are the registered ATSC table_id variants. + * These are the registered ATSC section `table_id` variants. Unless specified + * otherwise, they are defined in the "ATSC A/65" specification. * - * see also: #GstMpegtsSectionTableID + * see also: #GstMpegtsSectionTableID and other variants. */ typedef enum { - /* ATSC (A/65) */ GST_MTS_TABLE_ID_ATSC_MASTER_GUIDE = 0xC7, GST_MTS_TABLE_ID_ATSC_TERRESTRIAL_VIRTUAL_CHANNEL = 0xC8, @@ -53,19 +67,57 @@ /* ATSC (A/90) */ GST_MTS_TABLE_ID_ATSC_DATA_EVENT = 0xCE, GST_MTS_TABLE_ID_ATSC_DATA_SERVICE = 0xCF, - /* 0xD0 ?? */ + + /* ATSC (A/57B) */ + /** + * GST_MTS_TABLE_ID_ATSC_PROGRAM_IDENTIFIER: + * + * A/57B: Program Identifier Table. 
+ * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_ATSC_PROGRAM_IDENTIFIER = 0xD0, + /* ATSC (A/90) */ GST_MTS_TABLE_ID_ATSC_NETWORK_RESOURCE = 0xD1, GST_MTS_TABLE_ID_ATSC_LONG_TERM_SERVICE = 0xD2, + /* ATSC (A/65) */ GST_MTS_TABLE_ID_ATSC_DIRECTED_CHANNEL_CHANGE = 0xD3, GST_MTS_TABLE_ID_ATSC_DIRECTED_CHANNEL_CHANGE_SECTION_CODE = 0xD4, - /* 0xD5 ?? */ + /* 0xD5-0xD9 covered in CEA/SCTE */ GST_MTS_TABLE_ID_ATSC_AGGREGATE_EVENT_INFORMATION = 0xD6, GST_MTS_TABLE_ID_ATSC_AGGREGATE_EXTENDED_TEXT = 0xD7, - /* 0xD8 ?? */ GST_MTS_TABLE_ID_ATSC_AGGREGATE_DATA_EVENT = 0xD9, + /* */ GST_MTS_TABLE_ID_ATSC_SATELLITE_VIRTUAL_CHANNEL = 0xDA, } GstMpegtsSectionATSCTableID; +/** + * GstMpegtsATSCStreamType: + * @GST_MPEGTS_STREAM_TYPE_ATSC_DCII_VIDEO: DigiCipher II video | Identical to ITU-T Rec. H.262 | ISO/IEC 13818-2 Video + * @GST_MPEGTS_STREAM_TYPE_ATSC_AUDIO_AC3: ATSC A/53 Audio | AC-3 + * @GST_MPEGTS_STREAM_TYPE_ATSC_SUBTITLING: SCTE-27 Subtitling + * @GST_MPEGTS_STREAM_TYPE_ATSC_ISOCH_DATA: SCTE-19 Isochronous data | Reserved + * @GST_MPEGTS_STREAM_TYPE_ATSC_SIT: SCTE-35 Splice Information Table + * @GST_MPEGTS_STREAM_TYPE_ATSC_AUDIO_EAC3: E-AC-3 A/52:2018 + * @GST_MPEGTS_STREAM_TYPE_ATSC_AUDIO_DTS_HD: E-AC-3 A/107 (ATSC 2.0) + * + * Type of mpeg-ts streams for ATSC, as defined by the ATSC Code Points + * Registry. For convenience, some stream types from %GstMpegtsScteStreamType + * are also included. 
+ * + * Since: 1.20 + */ +typedef enum { + GST_MPEGTS_STREAM_TYPE_ATSC_DCII_VIDEO = 0x80, + GST_MPEGTS_STREAM_TYPE_ATSC_AUDIO_AC3 = 0x81, + GST_MPEGTS_STREAM_TYPE_ATSC_SUBTITLING = 0x82, + GST_MPEGTS_STREAM_TYPE_ATSC_ISOCH_DATA = 0x83, + /* 0x84-0x85 : RESERVED */ + GST_MPEGTS_STREAM_TYPE_ATSC_SIT = 0x86, + GST_MPEGTS_STREAM_TYPE_ATSC_AUDIO_EAC3 = 0x87, + GST_MPEGTS_STREAM_TYPE_ATSC_AUDIO_DTS_HD = 0x88, +} GstMpegtsATSCStreamType; + /* TVCT/CVCT */ #define GST_TYPE_MPEGTS_ATSC_VCT (gst_mpegts_atsc_vct_get_type ()) #define GST_TYPE_MPEGTS_ATSC_VCT_SOURCE (gst_mpegts_atsc_vct_source_get_type ()) @@ -92,7 +144,7 @@ * @source_id: The source id * @descriptors: (element-type GstMpegtsDescriptor): an array of #GstMpegtsDescriptor * - * Source from a @GstMpegtsAtscVCT, can be used both for TVCT and CVCT tables + * Source from a %GstMpegtsAtscVCT, can be used both for TVCT and CVCT tables */ struct _GstMpegtsAtscVCTSource { @@ -214,7 +266,7 @@ GST_MPEGTS_API GstMpegtsAtscMGT * gst_mpegts_atsc_mgt_new (void); -/* Multiple string structure (used in ETT and EIT */ +/* Multiple string structure (used in ETT and EIT) */ #define GST_TYPE_MPEGTS_ATSC_STRING_SEGMENT (gst_mpegts_atsc_string_segment_get_type()) #define GST_TYPE_MPEGTS_ATSC_MULT_STRING (gst_mpegts_atsc_mult_string_get_type()) @@ -425,7 +477,7 @@ }; /** - * _GstMpegtsAtscRRTDimension: + * GstMpegtsAtscRRTDimension: * @names: (element-type GstMpegtsAtscMultString): the names * @graduated_scale: whether the ratings represent a graduated scale * @values_defined: the number of values defined for this dimension
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-dvb-descriptor.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-dvb-descriptor.c
Changed
@@ -204,7 +204,7 @@ data = (guint8 *) descriptor->data + 2; - *stuffing_bytes = g_memdup (data, descriptor->length); + *stuffing_bytes = g_memdup2 (data, descriptor->length); return TRUE; } @@ -600,7 +600,7 @@ break; } - copy->private_data_bytes = g_memdup (source->private_data_bytes, + copy->private_data_bytes = g_memdup2 (source->private_data_bytes, source->private_data_length); return copy; @@ -825,7 +825,7 @@ } res->private_data_length = end - data; - res->private_data_bytes = g_memdup (data, res->private_data_length); + res->private_data_bytes = g_memdup2 (data, res->private_data_length); *desc = res; @@ -2013,7 +2013,7 @@ if (length && private_data) { *length = descriptor->length - 4; - *private_data = g_memdup (data + 4, *length); + *private_data = g_memdup2 (data + 4, *length); } return TRUE; } @@ -2091,7 +2091,7 @@ copy = g_slice_dup (GstMpegtsDataBroadcastDescriptor, source); - copy->selector_bytes = g_memdup (source->selector_bytes, source->length); + copy->selector_bytes = g_memdup2 (source->selector_bytes, source->length); copy->language_code = g_strdup (source->language_code); copy->text = g_strdup (source->text); @@ -2145,7 +2145,7 @@ res->length = *data; data += 1; - res->selector_bytes = g_memdup (data, res->length); + res->selector_bytes = g_memdup2 (data, res->length); data += res->length; res->language_code = convert_lang_code (data); @@ -2220,7 +2220,7 @@ *len = descriptor->length - 2; - *id_selector_bytes = g_memdup (data, *len); + *id_selector_bytes = g_memdup2 (data, *len); return TRUE; } @@ -2460,3 +2460,117 @@ *desc = res; return TRUE; } + +/** + * gst_mpegts_descriptor_parse_audio_preselection_list: + * @descriptor: a %GST_MTS_DESC_EXT_DVB_AUDIO_PRESELECTION #GstMpegtsDescriptor + * @list: (out) (transfer full) (element-type GstMpegtsAudioPreselectionDescriptor): + * the list of audio preselection + * + * Parses out a list of audio preselection from the @descriptor. + * + * Returns: %TRUE if the parsing happened correctly, else %FALSE. 
+ * + * Since: 1.20 + */ +gboolean +gst_mpegts_descriptor_parse_audio_preselection_list (const GstMpegtsDescriptor + * descriptor, GPtrArray ** list) +{ + guint8 *data; + guint8 i, j, num_preselections, num_aux_components, future_extension_length; + GstMpegtsAudioPreselectionDescriptor *item; + + g_return_val_if_fail (descriptor != NULL && list != NULL, FALSE); + __common_desc_ext_check_base (descriptor, + GST_MTS_DESC_EXT_DVB_AUDIO_PRESELECTION, FALSE); + + *list = g_ptr_array_new_with_free_func ((GDestroyNotify) + gst_mpegts_descriptor_parse_audio_preselection_free); + + data = (guint8 *) descriptor->data + 3; + num_preselections = (guint8) ((*data & 0xF8) >> 3); + data += 1; + + for (i = 0; i < num_preselections; i++) { + item = g_slice_new0 (GstMpegtsAudioPreselectionDescriptor); + g_ptr_array_add (*list, item); + + item->preselection_id = (*data & 0xF8) >> 3; + item->audio_rendering_indication = *data & 0x7; + data += 1; + + item->audio_description = (*data & 0x80) >> 7; + item->spoken_subtitles = (*data & 0x40) >> 6; + item->dialogue_enhancement = (*data & 0x20) >> 5; + item->interactivity_enabled = (*data & 0x10) >> 4; + item->language_code_present = (*data & 0x08) >> 3; + item->text_label_present = (*data & 0x04) >> 2; + item->multi_stream_info_present = (*data & 0x02) >> 1; + item->future_extension = (*data) & 0x01; + data += 1; + + if (item->language_code_present == 1) { + item->language_code = convert_lang_code (data); + data += 3; + } + + if (item->text_label_present == 1) { + item->message_id = *data; + data += 1; + } + + if (item->multi_stream_info_present == 1) { + num_aux_components = (*data & 0xE0) >> 5; + data += 1; + /* use a separate counter so the outer preselection loop is not clobbered */ + for (j = 0; j < num_aux_components; j++) { + data += 1; + } + } + + if (item->future_extension == 1) { + future_extension_length = *data & 0x1F; + data += 1; + for (j = 0; j < future_extension_length; j++) { + data += 1; + } + } + } + + return TRUE; +} + +void gst_mpegts_descriptor_parse_audio_preselection_free + 
(GstMpegtsAudioPreselectionDescriptor * source) +{ + if (source->language_code_present == 1) { + g_free (source->language_code); + } + g_slice_free (GstMpegtsAudioPreselectionDescriptor, source); +} + +void gst_mpegts_descriptor_parse_audio_preselection_dump + (GstMpegtsAudioPreselectionDescriptor * source) +{ + GST_DEBUG ("[Audio Preselection Descriptor]"); + GST_DEBUG (" preselection_id: 0x%02x", source->preselection_id); + GST_DEBUG ("audio_rendering_indication: 0x%02x", + source->audio_rendering_indication); + GST_DEBUG (" audio_description: %d", source->audio_description); + GST_DEBUG (" spoken_subtitles: %d", source->spoken_subtitles); + GST_DEBUG (" dialogue_enhancement: %d", source->dialogue_enhancement); + GST_DEBUG (" interactivity_enabled: %d", source->interactivity_enabled); + GST_DEBUG (" language_code_present: %d", source->language_code_present); + GST_DEBUG (" text_label_present: %d", source->text_label_present); + GST_DEBUG (" multi_stream_info_present: %d", + source->multi_stream_info_present); + GST_DEBUG (" future_extension: %d", source->future_extension); + + if (source->language_code_present == 1) { + GST_DEBUG (" language_code: %s", source->language_code); + } + if (source->text_label_present == 1) { + GST_DEBUG (" message_id: 0x%02x", source->message_id); + } + GST_DEBUG ("-------------------------------"); +}
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-dvb-descriptor.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-dvb-descriptor.h
Changed
@@ -86,6 +86,13 @@ GST_MTS_DESC_DVB_SERVICE_MOVE = 0x60, GST_MTS_DESC_DVB_SHORT_SMOOTHING_BUFFER = 0x61, GST_MTS_DESC_DVB_FREQUENCY_LIST = 0x62, + /** + * GST_MTS_DESC_DVB_PARTIAL_TRANSPORT_STREAM: + * + * Partial Transport Stream descriptor. Only present in SIT Sections. + * + * See also: %GST_MPEGTS_SECTION_SIT, %GstMpegtsSIT + */ GST_MTS_DESC_DVB_PARTIAL_TRANSPORT_STREAM = 0x63, GST_MTS_DESC_DVB_DATA_BROADCAST = 0x64, GST_MTS_DESC_DVB_SCRAMBLING = 0x65, @@ -152,6 +159,14 @@ GST_MTS_DESC_EXT_DVB_T2MI = 0x11, GST_MTS_DESC_EXT_DVB_URI_LINKAGE = 0x13, GST_MTS_DESC_EXT_DVB_AC4 = 0x15, + /** + * GST_MTS_DESC_EXT_DVB_AUDIO_PRESELECTION: + * + * Provides all available audio programmes for user selection + * + * Since: 1.20 + */ + GST_MTS_DESC_EXT_DVB_AUDIO_PRESELECTION = 0x19 } GstMpegtsDVBExtendedDescriptorType; /* GST_MTS_DESC_DVB_CAROUSEL_IDENTIFIER (0x13) */ @@ -1055,6 +1070,70 @@ gboolean gst_mpegts_descriptor_parse_dvb_t2_delivery_system (const GstMpegtsDescriptor *descriptor, GstMpegtsT2DeliverySystemDescriptor ** res); +/** + * GstMpegtsAudioPreselectionDescriptor: + * @preselection_id: 5-bit field + * @audio_rendering_indication: 3-bit field + * @audio_description: visually impaired + * @spoken_subtitles: + * @dialogue_enhancement: + * @interactivity_enabled: + * @language_code_present: + * @text_label_present: + * @multi_stream_info_present: indicates if this PID conveys a complete audio programme + * @future_extension: + * @language_code: NULL terminated ISO 639 language code. 
+ * @message_id: + * @items: (element-type GstMpegtsExtendedEventItem): the #GstMpegtsExtendedEventItem + * @text: + * + * Table 110: Audio Preselection Descriptor (ETSI EN 300 468 v1.16.1) + * + * Since: 1.20 + */ +typedef struct _GstMpegtsAudioPreselectionDescriptor GstMpegtsAudioPreselectionDescriptor; +struct _GstMpegtsAudioPreselectionDescriptor +{ + guint8 preselection_id; + guint8 audio_rendering_indication; + gboolean audio_description; + gboolean spoken_subtitles; + gboolean dialogue_enhancement; + gboolean interactivity_enabled; + gboolean language_code_present; + gboolean text_label_present; + gboolean multi_stream_info_present; + gboolean future_extension; + gchar *language_code; + guint8 message_id; +}; + +GST_MPEGTS_API +gboolean +gst_mpegts_descriptor_parse_audio_preselection_list (const GstMpegtsDescriptor + * descriptor, GPtrArray ** list); + +/** + * gst_mpegts_descriptor_parse_audio_preselection_free: + * + * Since: 1.20 + */ +GST_MPEGTS_API +void +gst_mpegts_descriptor_parse_audio_preselection_free (GstMpegtsAudioPreselectionDescriptor + * source); + +/** + * gst_mpegts_descriptor_parse_audio_preselection_dump: + * + * Since: 1.20 + */ +GST_MPEGTS_API +void +gst_mpegts_descriptor_parse_audio_preselection_dump (GstMpegtsAudioPreselectionDescriptor + * source); + + G_END_DECLS #endif /* GST_MPEGTS_DESCRIPTOR_H */
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-dvb-section.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-dvb-section.c
Changed
@@ -44,6 +44,54 @@ * @short_description: Sections for the various DVB specifications * @include: gst/mpegts/mpegts.h * + * The list of section types defined and used by the DVB specifications can be + * seen in %GstMpegtsSectionDVBTableID. + * + * # Supported DVB MPEG-TS sections + * These are the sections for which parsing and packetizing code exists. + * + * ## Network Information Table (NIT) + * See: + * * gst_mpegts_section_get_nit() + * * %GstMpegtsNIT + * * %GstMpegtsNITStream + * * gst_mpegts_nit_new() + * + * ## Service Description Table (SDT) + * See: + * * gst_mpegts_section_get_sdt() + * * %GstMpegtsSDT + * * %GstMpegtsSDTService + * * gst_mpegts_sdt_new() + * + * ## Bouquet Association Table (BAT) + * See: + * * gst_mpegts_section_get_bat() + * * %GstMpegtsBAT + * * %GstMpegtsBATStream + * + * ## Event Information Table (EIT) + * See: + * * gst_mpegts_section_get_eit() + * * %GstMpegtsEIT + * * %GstMpegtsEITEvent + * + * ## Time Date Table (TDT) + * See: + * * gst_mpegts_section_get_tdt() + * + * ## Time Offset Table (TOT) + * See: + * * gst_mpegts_section_get_tot() + * * %GstMpegtsTOT + * + * ## Selection Information Table (SIT) + * See: + * * gst_mpegts_section_get_sit() + * * %GstMpegtsSIT + * * %GstMpegtsSITService + * + * # API */ @@ -1210,3 +1258,160 @@ return (const GstMpegtsTOT *) section->cached_parsed; } + + +/* Selection Information Table (SIT) */ + +static GstMpegtsSITService * +_gst_mpegts_sit_service_copy (GstMpegtsSITService * sit) +{ + GstMpegtsSITService *copy = g_slice_dup (GstMpegtsSITService, sit); + + copy->service_id = sit->service_id; + copy->running_status = sit->running_status; + copy->descriptors = g_ptr_array_ref (sit->descriptors); + + return copy; +} + +static void +_gst_mpegts_sit_service_free (GstMpegtsSITService * sit) +{ + if (sit->descriptors) + g_ptr_array_unref (sit->descriptors); + g_slice_free (GstMpegtsSITService, sit); +} + +G_DEFINE_BOXED_TYPE (GstMpegtsSITService, gst_mpegts_sit_service, + (GBoxedCopyFunc) 
_gst_mpegts_sit_service_copy, + (GFreeFunc) _gst_mpegts_sit_service_free); + +static GstMpegtsSIT * +_gst_mpegts_sit_copy (GstMpegtsSIT * sit) +{ + GstMpegtsSIT *copy = g_slice_dup (GstMpegtsSIT, sit); + + copy->services = g_ptr_array_ref (sit->services); + copy->descriptors = g_ptr_array_ref (sit->descriptors); + + return copy; +} + +static void +_gst_mpegts_sit_free (GstMpegtsSIT * sit) +{ + g_ptr_array_unref (sit->services); + g_ptr_array_unref (sit->descriptors); + g_slice_free (GstMpegtsSIT, sit); +} + +G_DEFINE_BOXED_TYPE (GstMpegtsSIT, gst_mpegts_sit, + (GBoxedCopyFunc) _gst_mpegts_sit_copy, (GFreeFunc) _gst_mpegts_sit_free); + + +static gpointer +_parse_sit (GstMpegtsSection * section) +{ + GstMpegtsSIT *sit = NULL; + guint i = 0, allocated_services = 8; + guint8 *data, *end, *entry_begin; + guint sit_info_length; + guint descriptors_loop_length; + + GST_DEBUG ("SIT"); + + sit = g_slice_new0 (GstMpegtsSIT); + + data = section->data; + end = data + section->section_length; + + /* Skip common fields */ + data += 8; + + descriptors_loop_length = GST_READ_UINT16_BE (data) & 0x0fff; + data += 2; + sit->descriptors = + gst_mpegts_parse_descriptors (data, descriptors_loop_length); + if (sit->descriptors == NULL) + goto error; + data += descriptors_loop_length; + + sit_info_length = end - data; + sit->services = g_ptr_array_new_full (allocated_services, + (GDestroyNotify) _gst_mpegts_sit_service_free); + + /* read up to the CRC */ + while (sit_info_length - 4 > 0) { + GstMpegtsSITService *service = g_slice_new0 (GstMpegtsSITService); + g_ptr_array_add (sit->services, service); + + entry_begin = data; + + if (sit_info_length - 4 < 4) { + /* each entry must be at least 4 bytes (+4 bytes for the CRC) */ + GST_WARNING ("PID %d invalid SIT entry size %d", + section->pid, sit_info_length); + goto error; + } + + service->service_id = GST_READ_UINT16_BE (data); + data += 2; + + service->running_status = (*data >> 5) & 0x07; + descriptors_loop_length = GST_READ_UINT16_BE 
(data) & 0x0fff; + data += 2; + + if (descriptors_loop_length && (data + descriptors_loop_length > end - 4)) { + GST_WARNING ("PID %d invalid SIT entry %d descriptors loop length %d", + section->pid, service->service_id, descriptors_loop_length); + goto error; + } + service->descriptors = + gst_mpegts_parse_descriptors (data, descriptors_loop_length); + if (!service->descriptors) + goto error; + data += descriptors_loop_length; + + sit_info_length -= data - entry_begin; + i += 1; + } + + if (data != end - 4) { + GST_WARNING ("PID %d invalid SIT parsed %d length %d", + section->pid, (gint) (data - section->data), section->section_length); + goto error; + } + + return sit; + +error: + if (sit) + _gst_mpegts_sit_free (sit); + + return NULL; +} + +/** + * gst_mpegts_section_get_sit: + * @section: a #GstMpegtsSection of type %GST_MPEGTS_SECTION_SIT + * + * Returns the #GstMpegtsSIT contained in the @section. + * + * Returns: The #GstMpegtsSIT contained in the section, or %NULL if an error + * happened. + * + * Since: 1.20 + */ +const GstMpegtsSIT * +gst_mpegts_section_get_sit (GstMpegtsSection * section) +{ + g_return_val_if_fail (section->section_type == GST_MPEGTS_SECTION_SIT, NULL); + g_return_val_if_fail (section->cached_parsed || section->data, NULL); + + if (!section->cached_parsed) + section->cached_parsed = + __common_section_checks (section, 18, _parse_sit, + (GDestroyNotify) _gst_mpegts_sit_free); + + return (const GstMpegtsSIT *) section->cached_parsed; +}
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-dvb-section.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-dvb-section.h
Changed
@@ -32,10 +32,39 @@ /** * GstMpegtsSectionDVBTableID: + * @GST_MTS_TABLE_ID_NETWORK_INFORMATION_ACTUAL_NETWORK: Network Information Table (NIT), Actual Network + * @GST_MTS_TABLE_ID_NETWORK_INFORMATION_OTHER_NETWORK: Network Information Table (NIT), Other Network + * @GST_MTS_TABLE_ID_SERVICE_DESCRIPTION_ACTUAL_TS: Service Description Table (SDT), Actual Transport Stream + * @GST_MTS_TABLE_ID_SERVICE_DESCRIPTION_OTHER_TS: Service Description Table (SDT), Other Transport Stream + * @GST_MTS_TABLE_ID_BOUQUET_ASSOCIATION: Bouquet Association Table (BAT) + * @GST_MTS_TABLE_ID_EVENT_INFORMATION_ACTUAL_TS_PRESENT: Event Information Table (EIT), Actual Transport Stream, present/following + * @GST_MTS_TABLE_ID_EVENT_INFORMATION_OTHER_TS_PRESENT: Event Information Table (EIT), Other Transport Stream, present/following + * @GST_MTS_TABLE_ID_EVENT_INFORMATION_ACTUAL_TS_SCHEDULE_1: Event Information Table (EIT), Actual Transport Stream, Schedule (first) + * @GST_MTS_TABLE_ID_EVENT_INFORMATION_ACTUAL_TS_SCHEDULE_N: Event Information Table (EIT), Actual Transport Stream, Schedule (last) + * @GST_MTS_TABLE_ID_EVENT_INFORMATION_OTHER_TS_SCHEDULE_1: Event Information Table (EIT), Other Transport Stream, Schedule (first) + * @GST_MTS_TABLE_ID_EVENT_INFORMATION_OTHER_TS_SCHEDULE_N: Event Information Table (EIT), Other Transport Stream, Schedule (last) + * @GST_MTS_TABLE_ID_TIME_DATE: Time Date Table (TDT) + * @GST_MTS_TABLE_ID_RUNNING_STATUS: Running Status Table (RST) + * @GST_MTS_TABLE_ID_STUFFING: Stuffing Table (ST) + * @GST_MTS_TABLE_ID_TIME_OFFSET: Time Offset Table (TOT) + * @GST_MTS_TABLE_ID_APPLICATION_INFORMATION_TABLE: ETSI TS 102 323: Application Information Table (AIT) + * @GST_MTS_TABLE_ID_CONTAINER: ETSI TS 102 323: Container Section + * @GST_MTS_TABLE_ID_RELATED_CONTENT: ETSI TS 102 323: Related Content Table (RCT) + * @GST_MTS_TABLE_ID_CONTENT_IDENTIFIER: ETSI TS 102 323: Content Identifier Table (CIT) + * @GST_MTS_TABLE_ID_MPE_FEC: ETSI TS 301 192: MPE-FEC Section + 
* @GST_MTS_TABLE_ID_RESOLUTION_NOTIFICATION: ETSI 103 323: Resolution Provider Notification Table (RNT) + * @GST_MTS_TABLE_ID_MPE_IFEC: ETSI TS 102 772: MPE-IFEC Section + * @GST_MTS_TABLE_ID_DISCONTINUITY_INFORMATION: Discontinuity Information Table (DIT) + * @GST_MTS_TABLE_ID_SELECTION_INFORMATION: Selection Information Table (SIT) + * @GST_MTS_TABLE_ID_CA_MESSAGE_ECM_0: ETSI TR 289: CA Message Table (CMT): ECM 0 + * @GST_MTS_TABLE_ID_CA_MESSAGE_ECM_1: ETSI TR 289: CA Message Table (CMT): ECM 1 + * @GST_MTS_TABLE_ID_CA_MESSAGE_SYSTEM_PRIVATE_1: ETSI TR 289: CA Message Table (CMT): CA System Private (First) + * @GST_MTS_TABLE_ID_CA_MESSAGE_SYSTEM_PRIVATE_N: ETSI TR 289: CA Message Table (CMT): CA System Private (Last) * * Values for a #GstMpegtsSection table_id. * - * These are the registered DVB table_id variants. + * These are the registered DVB table_id variants. Unless specified otherwise, + * they come from the DVB Specification for SI (ETSI EN 300 468). * * see also: #GstMpegtsSectionTableID */ @@ -46,6 +75,28 @@ GST_MTS_TABLE_ID_SERVICE_DESCRIPTION_ACTUAL_TS = 0x42, GST_MTS_TABLE_ID_SERVICE_DESCRIPTION_OTHER_TS = 0x46, GST_MTS_TABLE_ID_BOUQUET_ASSOCIATION = 0x4A, + + /* ETSI TS 102 006 */ + /** + * GST_MTS_TABLE_ID_UPDATE_NOTIFICATION: + * + * ETSI TS 102 006: Update Notification Table (UNT) + * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_UPDATE_NOTIFICATION = 0x4B, + + /* ETSI EN 303 560 */ + /** + * GST_MTS_TABLE_ID_DOWNLOADABLE_FONT_INFO: + * + * ETSI EN 303 560: Downloadable Font Info + * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_DOWNLOADABLE_FONT_INFO = 0x4C, + + /* EN 300 468 */ GST_MTS_TABLE_ID_EVENT_INFORMATION_ACTUAL_TS_PRESENT = 0x4E, GST_MTS_TABLE_ID_EVENT_INFORMATION_OTHER_TS_PRESENT = 0x4F, GST_MTS_TABLE_ID_EVENT_INFORMATION_ACTUAL_TS_SCHEDULE_1 = 0x50, @@ -74,6 +125,16 @@ /* TS 102 772 (DVB-SH Multi-Protocol Encapsulation) */ GST_MTS_TABLE_ID_MPE_IFEC = 0x7A, + /* TS 102 809 (DVB Hybrid Broadcast/Broadband) */ + /** + * 
GST_MTS_TABLE_ID_PROTECTION_MESSAGE: + * + * ETSI TS 102 809: Protection Message Section + * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_PROTECTION_MESSAGE = 0x7B, + /* EN 300 468 (DVB) v 1.12.1 */ GST_MTS_TABLE_ID_DISCONTINUITY_INFORMATION = 0x7E, GST_MTS_TABLE_ID_SELECTION_INFORMATION = 0x7F, @@ -87,6 +148,7 @@ /* ... */ /* EN 301 790 (DVB interaction channel for satellite distribution channels) */ + /* Note: Not 100% sure we want those exposed here ... */ GST_MTS_TABLE_ID_SCT = 0xA0, GST_MTS_TABLE_ID_FCT = 0xA1, GST_MTS_TABLE_ID_TCT = 0xA2, @@ -371,6 +433,68 @@ GST_MPEGTS_API const GstMpegtsTOT *gst_mpegts_section_get_tot (GstMpegtsSection *section); +/* SIT */ + +typedef struct _GstMpegtsSITService GstMpegtsSITService; +/** + * GST_TYPE_MPEGTS_SIT_SERVICE: + * + * Since: 1.20 + */ +#define GST_TYPE_MPEGTS_SIT_SERVICE (gst_mpegts_sit_service_get_type()) + +typedef struct _GstMpegtsSIT GstMpegtsSIT; +/** + * GST_TYPE_MPEGTS_SIT: + * + * Since: 1.20 + */ +#define GST_TYPE_MPEGTS_SIT (gst_mpegts_sit_get_type()) + +/** + * GstMpegtsSITService: + * @service_id: The Program number this table belongs to + * @running_status: Status of this service + * @descriptors: (element-type GstMpegtsDescriptor): List of descriptors + * + * SIT Service entry + * + * Since: 1.20 + */ +struct _GstMpegtsSITService +{ + guint16 service_id; + GstMpegtsRunningStatus running_status; + + GPtrArray *descriptors; +}; + +/** + * GstMpegtsSIT: + * @descriptors: (element-type GstMpegtsDescriptor): List of descriptors + * @services: (element-type GstMpegtsSITService): List of services + * + * Selection Information Table (EN 300 468) + * + * Since: 1.20 + */ +struct _GstMpegtsSIT +{ + GPtrArray *descriptors; + GPtrArray *services; +}; + + +GST_MPEGTS_API +GType gst_mpegts_sit_get_type (void); + +GST_MPEGTS_API +GType gst_mpegts_sit_service_get_type (void); + +GST_MPEGTS_API +const GstMpegtsSIT *gst_mpegts_section_get_sit (GstMpegtsSection *section); + + G_END_DECLS #endif /* GST_MPEGTS_SECTION_H */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-hdmv-section.h
Added
@@ -0,0 +1,66 @@ +/* + * gst-hdmv-section.h - + * Copyright (C) 2020, Centricular ltd + * + * Authors: + * Edward Hervey <edward@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef GST_HDMV_SECTION_H +#define GST_HDMV_SECTION_H + +#include <gst/gst.h> +#include <gst/mpegts/gstmpegtssection.h> +#include <gst/mpegts/gstmpegtsdescriptor.h> + +G_BEGIN_DECLS + +/** + * SECTION:gst-hdmv-section + * @title: HDMV variants of MPEG-TS (Bluray, AVCHD, ...) + * @short_description: Stream Types for the various Bluray specifications + * @include: gst/mpegts/mpegts.h + */ + +/** + * GstMpegtsHdmvStreamType: + * + * Type of mpeg-ts streams for Blu-ray formats. To be matched with the + * stream-type of a #GstMpegtsSection. 
+ * + * Since: 1.20 + */ +typedef enum { + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_LPCM = 0x80, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_AC3 = 0x81, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_DTS = 0x82, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_AC3_TRUE_HD = 0x83, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_AC3_PLUS = 0x84, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_DTS_HD = 0x85, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_DTS_HD_MASTER_AUDIO = 0x86, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_EAC3 = 0x87, + GST_MPEGTS_STREAM_TYPE_HDMV_SUBPICTURE_PGS = 0x90, + GST_MPEGTS_STREAM_TYPE_HDMV_IGS = 0x91, + GST_MPEGTS_STREAM_TYPE_HDMV_SUBTITLE = 0x92, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_AC3_PLUS_SECONDARY = 0xa1, + GST_MPEGTS_STREAM_TYPE_HDMV_AUDIO_DTS_HD_SECONDARY = 0xa2, +} GstMpegtsHdmvStreamType; + +G_END_DECLS + +#endif /* GST_HDMV_SECTION_H */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-isdb-descriptor.h
Added
@@ -0,0 +1,89 @@ +/* + * gst-isdb-descriptor.h - + * Copyright (C) 2020 Edward Hervey + * + * Authors: + * Edward Hervey <edward@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef GST_ISDB_DESCRIPTOR_H +#define GST_ISDB_DESCRIPTOR_H + +#include <gst/gst.h> +#include <gst/mpegts/mpegts-prelude.h> + +G_BEGIN_DECLS + +/** + * SECTION:gst-isdb-descriptor + * @title: ISDB variants of MPEG-TS descriptors + * @short_description: Descriptors for the various ISDB specifications + * @include: gst/mpegts/mpegts.h + * + * This contains the various descriptors defined by the ISDB specifications + */ + +/** + * GstMpegtsISDBDescriptorType: + * + * These values correspond to the registered descriptor type from + * the various ISDB specifications. + * + * Consult the relevant specifications for more details. 
+ */ +typedef enum { + /* ISDB ARIB B10 v4.6 */ + GST_MTS_DESC_ISDB_HIERARCHICAL_TRANSMISSION = 0xC0, + GST_MTS_DESC_ISDB_DIGITAL_COPY_CONTROL = 0xC1, + GST_MTS_DESC_ISDB_NETWORK_IDENTIFICATION = 0xC2, + GST_MTS_DESC_ISDB_PARTIAL_TS_TIME = 0xc3, + GST_MTS_DESC_ISDB_AUDIO_COMPONENT = 0xc4, + GST_MTS_DESC_ISDB_HYPERLINK = 0xc5, + GST_MTS_DESC_ISDB_TARGET_REGION = 0xc6, + GST_MTS_DESC_ISDB_DATA_CONTENT = 0xc7, + GST_MTS_DESC_ISDB_VIDEO_DECODE_CONTROL = 0xc8, + GST_MTS_DESC_ISDB_DOWNLOAD_CONTENT = 0xc9, + GST_MTS_DESC_ISDB_CA_EMM_TS = 0xca, + GST_MTS_DESC_ISDB_CA_CONTRACT_INFORMATION = 0xcb, + GST_MTS_DESC_ISDB_CA_SERVICE = 0xcc, + GST_MTS_DESC_ISDB_TS_INFORMATION = 0xcd, + GST_MTS_DESC_ISDB_EXTENDED_BROADCASTER = 0xce, + GST_MTS_DESC_ISDB_LOGO_TRANSMISSION = 0xcf, + GST_MTS_DESC_ISDB_BASIC_LOCAL_EVENT = 0xd0, + GST_MTS_DESC_ISDB_REFERENCE = 0xd1, + GST_MTS_DESC_ISDB_NODE_RELATION = 0xd2, + GST_MTS_DESC_ISDB_SHORT_NODE_INFORMATION = 0xd3, + GST_MTS_DESC_ISDB_STC_REFERENCE = 0xd4, + GST_MTS_DESC_ISDB_SERIES = 0xd5, + GST_MTS_DESC_ISDB_EVENT_GROUP = 0xd6, + GST_MTS_DESC_ISDB_SI_PARAMETER = 0xd7, + GST_MTS_DESC_ISDB_BROADCASTER_NAME = 0xd8, + GST_MTS_DESC_ISDB_COMPONENT_GROUP = 0xd9, + GST_MTS_DESC_ISDB_SI_PRIME_TS = 0xda, + GST_MTS_DESC_ISDB_BOARD_INFORMATION = 0xdb, + GST_MTS_DESC_ISDB_LDT_LINKAGE = 0xdc, + GST_MTS_DESC_ISDB_CONNECTED_TRANSMISSION = 0xdd, + GST_MTS_DESC_ISDB_CONTENT_AVAILABILITY = 0xde, + /* ... */ + GST_MTS_DESC_ISDB_SERVICE_GROUP = 0xe0 + +} GstMpegtsISDBDescriptorType; + +G_END_DECLS + +#endif
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-scte-section.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-scte-section.c
Changed
@@ -35,19 +35,28 @@ * @short_description: Sections for the various SCTE specifications * @include: gst/mpegts/mpegts.h * + * This contains the %GstMpegtsSection relevant to SCTE specifications. */ +/* TODO: port to gst_bit_reader / gst_bit_writer */ + /* Splice Information Table (SIT) */ static GstMpegtsSCTESpliceEvent * _gst_mpegts_scte_splice_event_copy (GstMpegtsSCTESpliceEvent * event) { - return g_slice_dup (GstMpegtsSCTESpliceEvent, event); + GstMpegtsSCTESpliceEvent *copy = + g_slice_dup (GstMpegtsSCTESpliceEvent, event); + + copy->components = g_ptr_array_ref (event->components); + + return copy; } static void _gst_mpegts_scte_splice_event_free (GstMpegtsSCTESpliceEvent * event) { + g_ptr_array_unref (event->components); g_slice_free (GstMpegtsSCTESpliceEvent, event); } @@ -55,8 +64,73 @@ (GBoxedCopyFunc) _gst_mpegts_scte_splice_event_copy, (GFreeFunc) _gst_mpegts_scte_splice_event_free); +static GstMpegtsSCTESpliceComponent * +_gst_mpegts_scte_splice_component_copy (GstMpegtsSCTESpliceComponent * + component) +{ + return g_slice_dup (GstMpegtsSCTESpliceComponent, component); +} + +static void +_gst_mpegts_scte_splice_component_free (GstMpegtsSCTESpliceComponent * + component) +{ + g_slice_free (GstMpegtsSCTESpliceComponent, component); +} + +G_DEFINE_BOXED_TYPE (GstMpegtsSCTESpliceComponent, + gst_mpegts_scte_splice_component, + (GBoxedCopyFunc) _gst_mpegts_scte_splice_component_copy, + (GFreeFunc) _gst_mpegts_scte_splice_component_free); + +static GstMpegtsSCTESpliceComponent * +_parse_splice_component (GstMpegtsSCTESpliceEvent * event, guint8 ** orig_data, + guint8 * end) +{ + GstMpegtsSCTESpliceComponent *component = + g_slice_new0 (GstMpegtsSCTESpliceComponent); + guint8 *data = *orig_data; + + if (data + 1 + 6 > end) + goto error; + + component->tag = *data; + data += 1; + + if (event->insert_event && event->splice_immediate_flag == 0) { + component->splice_time_specified = *data >> 7; + if (component->splice_time_specified) { + component->splice_time 
= ((guint64) (*data & 0x01)) << 32;
+      data += 1;
+      component->splice_time += GST_READ_UINT32_BE (data);
+      data += 4;
+      GST_LOG ("component %u splice_time %" G_GUINT64_FORMAT " (%"
+          GST_TIME_FORMAT ")", component->tag, component->splice_time,
+          GST_TIME_ARGS (MPEGTIME_TO_GSTTIME (component->splice_time)));
+    } else {
+      data += 1;
+    }
+  } else if (!event->insert_event) {
+    component->utc_splice_time = GST_READ_UINT32_BE (data);
+    GST_LOG ("component %u utc_splice_time %u", component->tag,
+        component->utc_splice_time);
+    data += 4;
+  }
+
+  *orig_data = data;
+
+  return component;
+
+error:
+  {
+    if (event)
+      _gst_mpegts_scte_splice_event_free (event);
+    return NULL;
+  }
+}
+
 static GstMpegtsSCTESpliceEvent *
-_parse_slice_event (guint8 ** orig_data, guint8 * end, gboolean insert_event)
+_parse_splice_event (guint8 ** orig_data, guint8 * end, gboolean insert_event)
 {
   GstMpegtsSCTESpliceEvent *event = g_slice_new0 (GstMpegtsSCTESpliceEvent);
   guint8 *data = *orig_data;
@@ -65,6 +139,9 @@
   if (data + 5 + 6 > end)
     goto error;
 
+  event->components = g_ptr_array_new_with_free_func ((GDestroyNotify)
+      _gst_mpegts_scte_splice_component_free);
+  event->insert_event = insert_event;
   event->splice_event_id = GST_READ_UINT32_BE (data);
   GST_LOG ("splice_event_id: 0x%08x", event->splice_event_id);
@@ -82,30 +159,50 @@
   event->out_of_network_indicator = *data >> 7;
   event->program_splice_flag = (*data >> 6) & 0x01;
   event->duration_flag = (*data >> 5) & 0x01;
-  event->splice_immediate_flag = (*data >> 4) & 0x01;
+
+  if (insert_event)
+    event->splice_immediate_flag = (*data >> 4) & 0x01;
+
   GST_LOG ("out_of_network_indicator:%d", event->out_of_network_indicator);
   GST_LOG ("program_splice_flag:%d", event->program_splice_flag);
   GST_LOG ("duration_flag:%d", event->duration_flag);
-  GST_LOG ("splice_immediate_flag:%d", event->splice_immediate_flag);
+
+  if (insert_event)
+    GST_LOG ("splice_immediate_flag:%d", event->splice_immediate_flag);
+
   data += 1;
 
   if (event->program_splice_flag == 0) {
-    GST_ERROR ("Component splice flag not supported !");
-    goto error;
-  }
+    guint component_count = *data;
+    guint i;
 
-  if (event->splice_immediate_flag == 0) {
-    event->program_splice_time_specified = *data >> 7;
-    if (event->program_splice_time_specified) {
-      event->program_splice_time = ((guint64) (*data & 0x01)) << 32;
-      data += 1;
-      event->program_splice_time += GST_READ_UINT32_BE (data);
+    data += 1;
+
+    for (i = 0; i < component_count; i++) {
+      GstMpegtsSCTESpliceComponent *component =
+          _parse_splice_component (event, &data, end);
+      if (component == NULL)
+        goto error;
+      g_ptr_array_add (event->components, component);
+    }
+  } else {
+    if (insert_event && event->splice_immediate_flag == 0) {
+      event->program_splice_time_specified = *data >> 7;
+      if (event->program_splice_time_specified) {
+        event->program_splice_time = ((guint64) (*data & 0x01)) << 32;
+        data += 1;
+        event->program_splice_time += GST_READ_UINT32_BE (data);
+        data += 4;
+        GST_LOG ("program_splice_time %" G_GUINT64_FORMAT " (%"
+            GST_TIME_FORMAT ")", event->program_splice_time,
+            GST_TIME_ARGS (MPEGTIME_TO_GSTTIME (event->program_splice_time)));
+      } else
+        data += 1;
+    } else if (!insert_event) {
+      event->utc_splice_time = GST_READ_UINT32_BE (data);
+      GST_LOG ("utc_splice_time %u", event->utc_splice_time);
       data += 4;
-      GST_LOG ("program_splice_time %" G_GUINT64_FORMAT " (%" GST_TIME_FORMAT
-          ")", event->program_splice_time,
-          GST_TIME_ARGS (MPEGTIME_TO_GSTTIME (event->program_splice_time)));
-    } else
-      data += 1;
+    }
   }
 
   if (event->duration_flag) {
@@ -183,6 +280,8 @@
 
   sit = g_slice_new0 (GstMpegtsSCTESIT);
 
+  sit->fully_parsed = FALSE;
+
   data = section->data;
   end = data + section->section_length;
 
@@ -219,16 +318,22 @@
   if (sit->splice_command_length == 0xfff)
     sit->splice_command_length = 0;
   GST_LOG ("command length %d", sit->splice_command_length);
-  sit->splice_command_type = *data;
-  data += 1;
+
+  if (sit->encrypted_packet) {
+    GST_LOG ("Encrypted SIT, parsed partially");
+    goto done;
+  }
 
   if (sit->splice_command_length
-      && (data + sit->splice_command_length > end - 4)) {
+      && (data + sit->splice_command_length > end - 5)) {
     GST_WARNING ("PID %d invalid SCTE SIT splice command length %d",
         section->pid, sit->splice_command_length);
     goto error;
   }
 
+  sit->splice_command_type = *data;
+  data += 1;
+
   sit->splices = g_ptr_array_new_with_free_func ((GDestroyNotify)
       _gst_mpegts_scte_splice_event_free);
   switch (sit->splice_command_type) {
@@ -248,9 +353,24 @@
         data += 1;
       }
       break;
+    case GST_MTS_SCTE_SPLICE_COMMAND_SCHEDULE:
+    {
+      guint i;
+      guint splice_count = *data;
+      data += 1;
+
+      for (i = 0; i < splice_count; i++) {
+        GstMpegtsSCTESpliceEvent *event =
+            _parse_splice_event (&data, end, FALSE);
+        if (event == NULL)
+          goto error;
+        g_ptr_array_add (sit->splices, event);
+      }
+    }
+      break;
     case GST_MTS_SCTE_SPLICE_COMMAND_INSERT:
     {
-      GstMpegtsSCTESpliceEvent *event = _parse_slice_event (&data, end, TRUE);
+      GstMpegtsSCTESpliceEvent *event = _parse_splice_event (&data, end, TRUE);
       if (event == NULL)
         goto error;
       g_ptr_array_add (sit->splices, event);
@@ -263,9 +383,9 @@
     }
       break;
     default:
-      GST_ERROR ("Unknown SCTE splice command type (0x%02x) !",
+      GST_WARNING ("Unknown SCTE splice command type (0x%02x) !",
          sit->splice_command_type);
-      goto error;
+      break;;
  }
 
  /* descriptors */
@@ -279,7 +399,6 @@
  }
 
  data += tmp;
-  GST_DEBUG ("%p - %p", data, end);
 
  if (data != end - 4) {
    GST_WARNING ("PID %d invalid SIT parsed %d length %d",
@@ -287,13 +406,18 @@
    goto error;
  }
 
+  sit->fully_parsed = TRUE;
+
+done:
  return sit;
 
 error:
-  if (sit)
+  if (sit) {
    _gst_mpegts_scte_sit_free (sit);
+    sit = NULL;
+  }
 
-  return NULL;
+  goto done;
 }
 
 /**
@@ -336,12 +460,15 @@
 
   /* Set all default values (which aren't already 0/NULL) */
   sit->tier = 0xfff;
+  sit->fully_parsed = TRUE;
 
   sit->splices = g_ptr_array_new_with_free_func ((GDestroyNotify)
       _gst_mpegts_scte_splice_event_free);
   sit->descriptors = g_ptr_array_new_with_free_func ((GDestroyNotify)
       gst_mpegts_descriptor_free);
 
+  sit->is_running_time = TRUE;
+
   return sit;
 }
 
@@ -358,6 +485,9 @@
   GstMpegtsSCTESIT *sit = gst_mpegts_scte_sit_new ();
 
   sit->splice_command_type = GST_MTS_SCTE_SPLICE_COMMAND_NULL;
+
+  sit->is_running_time = TRUE;
+
   return sit;
 }
 
@@ -381,13 +511,15 @@
   event->splice_event_cancel_indicator = TRUE;
   g_ptr_array_add (sit->splices, event);
 
+  sit->is_running_time = TRUE;
+
   return sit;
 }
 
 /**
  * gst_mpegts_scte_splice_in_new:
  * @event_id: The event ID.
- * @splice_time: The PCR time for the splice event
+ * @splice_time: The running time for the splice event
  *
  * Allocates and initializes a new "Splice In" INSERT command
  * #GstMpegtsSCTESIT for the given @event_id and @splice_time.
@@ -398,13 +530,14 @@
  * Returns: (transfer full): A newly allocated #GstMpegtsSCTESIT
  */
 GstMpegtsSCTESIT *
-gst_mpegts_scte_splice_in_new (guint32 event_id, guint64 splice_time)
+gst_mpegts_scte_splice_in_new (guint32 event_id, GstClockTime splice_time)
 {
   GstMpegtsSCTESIT *sit = gst_mpegts_scte_sit_new ();
   GstMpegtsSCTESpliceEvent *event = gst_mpegts_scte_splice_event_new ();
 
   sit->splice_command_type = GST_MTS_SCTE_SPLICE_COMMAND_INSERT;
   event->splice_event_id = event_id;
+  event->insert_event = TRUE;
   if (splice_time == G_MAXUINT64) {
     event->splice_immediate_flag = TRUE;
   } else {
@@ -413,18 +546,20 @@
   }
   g_ptr_array_add (sit->splices, event);
 
+  sit->is_running_time = TRUE;
+
   return sit;
 }
 
 /**
  * gst_mpegts_scte_splice_out_new:
  * @event_id: The event ID.
- * @splice_time: The PCR time for the splice event
+ * @splice_time: The running time for the splice event
  * @duration: The optional duration.
  *
  * Allocates and initializes a new "Splice Out" INSERT command
  * #GstMpegtsSCTESIT for the given @event_id, @splice_time and
- * duration.
+ * @duration.
  *
  * If the @splice_time is #G_MAXUINT64 then the event will be
  * immediate as opposed to for the target @splice_time.
@@ -434,8 +569,8 @@
  * Returns: (transfer full): A newly allocated #GstMpegtsSCTESIT
  */
 GstMpegtsSCTESIT *
-gst_mpegts_scte_splice_out_new (guint32 event_id, guint64 splice_time,
-    guint64 duration)
+gst_mpegts_scte_splice_out_new (guint32 event_id, GstClockTime splice_time,
+    GstClockTime duration)
 {
   GstMpegtsSCTESIT *sit = gst_mpegts_scte_sit_new ();
   GstMpegtsSCTESpliceEvent *event = gst_mpegts_scte_splice_event_new ();
@@ -443,6 +578,7 @@
   sit->splice_command_type = GST_MTS_SCTE_SPLICE_COMMAND_INSERT;
   event->splice_event_id = event_id;
   event->out_of_network_indicator = TRUE;
+  event->insert_event = TRUE;
   if (splice_time == G_MAXUINT64) {
     event->splice_immediate_flag = TRUE;
   } else {
@@ -455,6 +591,8 @@
   }
   g_ptr_array_add (sit->splices, event);
 
+  sit->is_running_time = TRUE;
+
   return sit;
 }
 
@@ -472,10 +610,33 @@
 
   /* Non-0 Default values */
   event->program_splice_flag = TRUE;
 
+  event->components = g_ptr_array_new_with_free_func ((GDestroyNotify)
+      _gst_mpegts_scte_splice_component_free);
+
   return event;
 }
 
+/**
+ * gst_mpegts_scte_splice_component_new:
+ * @tag: the elementary PID stream identifier
+ *
+ * Allocates and initializes a #GstMpegtsSCTESpliceComponent.
+ *
+ * Returns: (transfer full): A newly allocated #GstMpegtsSCTESpliceComponent
+ * Since: 1.20
+ */
+GstMpegtsSCTESpliceComponent *
+gst_mpegts_scte_splice_component_new (guint8 tag)
+{
+  GstMpegtsSCTESpliceComponent *component =
+      g_slice_new0 (GstMpegtsSCTESpliceComponent);
+
+  component->tag = tag;
+
+  return component;
+}
+
 static gboolean
 _packetize_sit (GstMpegtsSection * section)
 {
@@ -491,6 +652,11 @@
   if (sit == NULL)
     return FALSE;
 
+  if (sit->fully_parsed == FALSE) {
+    GST_WARNING ("Attempted to packetize an incompletely parsed SIT");
+    return FALSE;
+  }
+
   /* Skip cases we don't handle for now */
   if (sit->encrypted_packet) {
     GST_WARNING ("SCTE encrypted packet is not supported");
@@ -498,8 +664,6 @@
   }
 
   switch (sit->splice_command_type) {
-    case GST_MTS_SCTE_SPLICE_COMMAND_SCHEDULE:
-    case GST_MTS_SCTE_SPLICE_COMMAND_TIME:
     case GST_MTS_SCTE_SPLICE_COMMAND_PRIVATE:
       GST_WARNING ("SCTE command not supported");
       return FALSE;
@@ -521,22 +685,59 @@
       /* There is at least 5 bytes */
       command_length += 5;
       if (!event->splice_event_cancel_indicator) {
-        if (!event->program_splice_flag) {
-          GST_WARNING ("Only SCTE program splices are supported");
-          return FALSE;
-        }
         /* Add at least 5 bytes for common fields */
         command_length += 5;
-        if (!event->splice_immediate_flag) {
-          if (event->program_splice_time_specified)
-            command_length += 5;
-          else
+
+        if (event->program_splice_flag) {
+          if (event->insert_event) {
+            if (!event->splice_immediate_flag) {
+              if (event->program_splice_time_specified)
+                command_length += 5;
+              else
+                command_length += 1;
+            }
+          } else {
+            /* Schedule events, 4 bytes for utc_splice_time */
+            command_length += 4;
+          }
+        } else {
+          guint j;
+
+          /* component_count */
+          command_length += 1;
+
+          for (j = 0; j < event->components->len; j++) {
+            GstMpegtsSCTESpliceComponent *component =
+                g_ptr_array_index (event->components, j);
+
+            /* component_tag */
             command_length += 1;
+            if (event->insert_event) {
+              if (!event->splice_immediate_flag) {
+                if (component->splice_time_specified)
+                  command_length += 5;
+                else
+                  command_length += 1;
+              }
+            } else {
+              /* utc_splice_time */
+              command_length += 4;
+            }
+          }
         }
+
        if (event->duration_flag)
          command_length += 5;
      }
    }
+
+    if (sit->splice_command_type == GST_MTS_SCTE_SPLICE_COMMAND_TIME) {
+      if (sit->splice_time_specified)
+        command_length += 5;
+      else
+        command_length += 1;
+    }
+
    length += command_length;
 
  /* Calculate size of descriptors */
@@ -572,6 +773,16 @@
  GST_WRITE_UINT32_BE (data, tmp32);
  data += 4;
 
+  if (sit->splice_command_type == GST_MTS_SCTE_SPLICE_COMMAND_TIME) {
+    if (!sit->splice_time_specified) {
+      *data++ = 0x7f;
+    } else {
+      *data++ = 0xf2 | ((sit->splice_time >> 32) & 0x1);
+      GST_WRITE_UINT32_BE (data, sit->splice_time & 0xffffffff);
+      data += 4;
+    }
+  }
+
  /* Write the events */
  for (i = 0; i < sit->splices->len; i++) {
    GstMpegtsSCTESpliceEvent *event = g_ptr_array_index (sit->splices, i);
@@ -592,19 +803,58 @@
    *data++ = (event->out_of_network_indicator << 7) |
        (event->program_splice_flag << 6) | (event->duration_flag << 5) |
-        (event->splice_immediate_flag << 4) | 0x0f;
-    if (!event->splice_immediate_flag) {
-      /* program_splice_time_specified : 1bit
-       * reserved : 6/7 bit */
-      if (!event->program_splice_time_specified)
-        *data++ = 0x7f;
-      else {
-        /* time : 33bit */
-        *data++ = 0xf2 | ((event->program_splice_time >> 32) & 0x1);
-        GST_WRITE_UINT32_BE (data, event->program_splice_time & 0xffffffff);
+        (event->insert_event ? (event->splice_immediate_flag << 4) : 0) |
+        0x0f;
+    if (event->program_splice_flag) {
+      if (event->insert_event) {
+        if (!event->splice_immediate_flag) {
+          /* program_splice_time_specified : 1bit
+           * reserved : 6/7 bit */
+          if (!event->program_splice_time_specified)
+            *data++ = 0x7f;
+          else {
+            /* time : 33bit */
+            *data++ = 0xf2 | ((event->program_splice_time >> 32) & 0x1);
+            GST_WRITE_UINT32_BE (data,
+                event->program_splice_time & 0xffffffff);
+            data += 4;
+          }
+        }
+      } else {
+        GST_WRITE_UINT32_BE (data, event->utc_splice_time);
        data += 4;
      }
+    } else {
+      guint j;
+
+      *data++ = event->components->len & 0xff;
+
+      for (j = 0; j < event->components->len; j++) {
+        GstMpegtsSCTESpliceComponent *component =
+            g_ptr_array_index (event->components, j);
+
+        *data++ = component->tag;
+
+        if (event->insert_event) {
+          if (!event->splice_immediate_flag) {
+            /* program_splice_time_specified : 1bit
+             * reserved : 6/7 bit */
+            if (!component->splice_time_specified)
+              *data++ = 0x7f;
+            else {
+              /* time : 33bit */
+              *data++ = 0xf2 | ((component->splice_time >> 32) & 0x1);
+              GST_WRITE_UINT32_BE (data, component->splice_time & 0xffffffff);
+              data += 4;
+            }
+          }
+        } else {
+          GST_WRITE_UINT32_BE (data, component->utc_splice_time);
+          data += 4;
+        }
+      }
    }
+
    if (event->duration_flag) {
      *data = event->break_duration_auto_return ? 0xfe : 0x7e;
      *data++ |= (event->break_duration >> 32) & 0x1;
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gst-scte-section.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gst-scte-section.h
Changed
@@ -46,12 +46,13 @@
  * @GST_MPEGTS_STREAM_TYPE_SCTE_SYNC_DATA: SCTE-07 Synchronous data
  * @GST_MPEGTS_STREAM_TYPE_SCTE_ASYNC_DATA: SCTE-53 Asynchronous data
  *
- * Type of mpeg-ts streams for SCTE
+ * Type of mpeg-ts streams for SCTE. Most users would want to use the
+ * #GstMpegtsATSCStreamType instead since it also covers these stream types
  *
  */
 typedef enum {
-  /* 0x01 - 0x82 : defined in other specs */
+  /* 0x01 - 0x7f : defined in other specs */
   GST_MPEGTS_STREAM_TYPE_SCTE_SUBTITLING = 0x82,   /* Subtitling data */
   GST_MPEGTS_STREAM_TYPE_SCTE_ISOCH_DATA = 0x83,   /* Isochronous data */
   /* 0x84 - 0x85 : defined in other specs */
@@ -102,8 +103,31 @@
 } GstMpegtsSectionSCTETableID;
 
+#define GST_MPEGTS_TYPE_SCTE_SPLICE_COMPONENT (gst_mpegts_scte_splice_component_get_type())
+typedef struct _GstMpegtsSCTESpliceComponent GstMpegtsSCTESpliceComponent;
+
+/**
+ * GstMpegtsSCTESpliceComponent:
+ * @tag: the elementary PID stream containing the Splice Point
+ * @splice_time_specified: Whether @splice_time was specified
+ * @splice_time: the presentation time of the signaled splice event
+ * @utc_splice_time: The UTC time of the signaled splice event
+ *
+ * Per-PID splice information.
+ *
+ * Since: 1.20
+ */
+struct _GstMpegtsSCTESpliceComponent {
+  guint8 tag;
+
+  gboolean splice_time_specified; /* Only valid for insert_event */
+  guint64 splice_time; /* Only valid for insert_event */
+
+  guint32 utc_splice_time; /* Only valid for !insert_event (schedule) */
+};
+
 /* Splice Information Table (SIT) */
-#define GST_TYPE_MPEGTS_SCTE_SPLICE_EVENT (gst_mpegts_scte_splice_event_get_type);
+#define GST_TYPE_MPEGTS_SCTE_SPLICE_EVENT (gst_mpegts_scte_splice_event_get_type())
 typedef struct _GstMpegtsSCTESpliceEvent GstMpegtsSCTESpliceEvent;
 
 struct _GstMpegtsSCTESpliceEvent {
@@ -115,12 +139,31 @@
 
   /* If splice_event_cancel_indicator == 0 */
   gboolean out_of_network_indicator;
-  gboolean program_splice_flag; /* NOTE: Only program splice are supported */
+  gboolean program_splice_flag;
   gboolean duration_flag;
+
   gboolean splice_immediate_flag; /* Only valid for insert_event */
 
-  gboolean program_splice_time_specified;
-  guint64 program_splice_time;
+  gboolean program_splice_time_specified; /* Only valid for insert_event && program_splice */
+  guint64 program_splice_time; /* Only valid for insert_event && program_splice */
+
+  /**
+   * GstMpegtsSCTESpliceEvent.utc_splice_time:
+   *
+   * The UTC time of the signaled splice event
+   *
+   * Since: 1.20
+   */
+  guint32 utc_splice_time; /* Only valid for !insert_event (schedule) && program_splice */
+
+  /**
+   * GstMpegtsSCTESpliceEvent.components:
+   *
+   * Per-PID splice time information
+   *
+   * Since: 1.20
+   */
+  GPtrArray *components; /* Only valid for !program_splice */
 
   gboolean break_duration_auto_return;
   guint64 break_duration;
@@ -152,13 +195,12 @@
   GST_MTS_SCTE_SPLICE_COMMAND_PRIVATE  = 0xff
 } GstMpegtsSCTESpliceCommandType;
 
-#define GST_TYPE_MPEGTS_SCTE_SIT (gst_mpegts_scte_sit_get_type());
+#define GST_TYPE_MPEGTS_SCTE_SIT (gst_mpegts_scte_sit_get_type())
 typedef struct _GstMpegtsSCTESIT GstMpegtsSCTESIT;
 
 struct _GstMpegtsSCTESIT {
-  /* Encryption not supported for now */
   gboolean encrypted_packet;
   guint8   encryption_algorithm;
@@ -167,6 +209,7 @@
   guint16  tier;
 
   guint16  splice_command_length;
+
   GstMpegtsSCTESpliceCommandType splice_command_type;
 
   /* For time_signal commands */
@@ -176,6 +219,26 @@
 
   GPtrArray *splices;
   GPtrArray *descriptors;
+
+  /**
+   * GstMpegtsSCTESIT.fully_parsed:
+   *
+   * When encrypted, or when encountering an unknown command type,
+   * we may still want to pass the sit through.
+   *
+   * Since: 1.20
+   */
+  gboolean fully_parsed;
+
+  /**
+   * GstMpegtsSCTESIT.is_running_time:
+   *
+   * When the SIT was constructed by the application, splice times
+   * are in running_time and must be translated before packetizing.
+   *
+   * Since: 1.20
+   */
+  gboolean is_running_time;
 };
 
 GST_MPEGTS_API
@@ -192,12 +255,12 @@
 
 GST_MPEGTS_API
 GstMpegtsSCTESIT *gst_mpegts_scte_splice_in_new (guint32 event_id,
-                                                 guint64 splice_time);
+                                                 GstClockTime splice_time);
 
 GST_MPEGTS_API
 GstMpegtsSCTESIT *gst_mpegts_scte_splice_out_new (guint32 event_id,
-                                                  guint64 splice_time,
-                                                  guint64 duration);
+                                                  GstClockTime splice_time,
+                                                  GstClockTime duration);
 
 GST_MPEGTS_API
@@ -212,6 +275,12 @@
 GST_MPEGTS_API
 GstMpegtsSection *gst_mpegts_section_from_scte_sit (GstMpegtsSCTESIT * sit, guint16 pid);
 
+GST_MPEGTS_API
+GType gst_mpegts_scte_splice_component_get_type (void);
+
+GST_MPEGTS_API
+GstMpegtsSCTESpliceComponent *gst_mpegts_scte_splice_component_new (guint8 tag);
+
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gstmpegts-private.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gstmpegts-private.h
Changed
@@ -29,6 +29,7 @@
 GST_DEBUG_CATEGORY_EXTERN (mpegts_debug);
 #define GST_CAT_DEFAULT mpegts_debug
 
+G_GNUC_INTERNAL void __initialize_sections (void);
 G_GNUC_INTERNAL void __initialize_descriptors (void);
 G_GNUC_INTERNAL guint32 _calc_crc32 (const guint8 *data, guint datalen);
 G_GNUC_INTERNAL gchar *get_encoding_and_convert (const gchar *text, guint length);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gstmpegtsdescriptor.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gstmpegtsdescriptor.c
Changed
@@ -47,46 +47,6 @@
  * @title: Base MPEG-TS descriptors
  * @short_description: Descriptors for ITU H.222.0 | ISO/IEC 13818-1
  * @include: gst/mpegts/mpegts.h
- * @symbols:
- * - GstMpegtsDescriptor
- * - GstMpegtsDescriptorType
- * - GstMpegtsMiscDescriptorType
- * - gst_mpegts_find_descriptor
- * - gst_mpegts_parse_descriptors
- * - gst_mpegts_descriptor_from_custom
- * - gst_mpegts_descriptor_from_registration
- * - GstMpegtsISO639LanguageDescriptor
- * - GstMpegtsIso639AudioType
- * - gst_mpegts_descriptor_parse_iso_639_language
- * - gst_mpegts_descriptor_parse_iso_639_language_idx
- * - gst_mpegts_descriptor_parse_iso_639_language_nb
- * - gst_mpegts_iso_639_language_descriptor_free
- * - GstMpegtsLogicalChannel
- * - GstMpegtsLogicalChannelDescriptor
- * - gst_mpegts_descriptor_parse_logical_channel
- * - GST_TYPE_MPEGTS_DVB_CODE_RATE
- * - GST_TYPE_MPEGTS_CABLE_OUTER_FEC_SCHEME
- * - GST_TYPE_MPEGTS_MODULATION_TYPE
- * - GST_TYPE_MPEGTS_SATELLITE_POLARIZATION_TYPE
- * - GST_TYPE_MPEGTS_SATELLITE_ROLLOFF
- * - GST_TYPE_MPEGTS_ISO_639_LANGUAGE
- * - GST_TYPE_MPEGTS_DESCRIPTOR
- * - GST_TYPE_MPEGTS_DVB_SERVICE_TYPE
- * - GST_TYPE_MPEGTS_DESCRIPTOR_TYPE
- * - GST_TYPE_MPEGTS_ISO639_AUDIO_TYPE
- * - GST_TYPE_MPEGTS_DVB_DESCRIPTOR_TYPE
- * - GST_TYPE_MPEGTS_MISC_DESCRIPTOR_TYPE
- * - gst_mpegts_descriptor_get_type
- * - gst_mpegts_iso_639_language_get_type
- * - gst_mpegts_cable_outer_fec_scheme_get_type
- * - gst_mpegts_modulation_type_get_type
- * - gst_mpegts_satellite_polarization_type_get_type
- * - gst_mpegts_satellite_rolloff_get_type
- * - gst_mpegts_descriptor_type_get_type
- * - gst_mpegts_dvb_descriptor_type_get_type
- * - gst_mpegts_misc_descriptor_type_get_type
- * - gst_mpegts_iso639_audio_type_get_type
- * - gst_mpegts_dvb_service_type_get_type
  *
  * These are the base descriptor types and methods.
  *
@@ -94,119 +54,6 @@
  * and other specifications mentioned in the documentation.
  */
 
-/* FIXME : Move this to proper file once we have a C file for ATSC/ISDB descriptors */
-/**
- * SECTION:gst-atsc-descriptor
- * @title: ATSC variants of MPEG-TS descriptors
- * @short_description: Descriptors for the various ATSC specifications
- * @include: gst/mpegts/mpegts.h
- * @symbols:
- * - GstMpegtsATSCDescriptorType
- * - GST_TYPE_MPEGTS_ATSC_DESCRIPTOR_TYPE
- * - gst_mpegts_atsc_descriptor_type_get_type
- * - GstMpegtsDVBDescriptorType
- * - GstMpegtsDVBExtendedDescriptorType
- * - GstMpegtsContent
- * - gst_mpegts_descriptor_parse_dvb_content
- * - GstMpegtsComponentDescriptor
- * - gst_mpegts_dvb_component_descriptor_free
- * - gst_mpegts_descriptor_parse_dvb_component
- * - GstMpegtsExtendedEventItem
- * - GstMpegtsExtendedEventDescriptor
- * - gst_mpegts_extended_event_descriptor_free
- * - gst_mpegts_descriptor_parse_dvb_extended_event
- * - GstMpegtsSatelliteDeliverySystemDescriptor
- * - GstMpegtsDVBCodeRate
- * - GstMpegtsModulationType
- * - GstMpegtsSatellitePolarizationType
- * - GstMpegtsSatelliteRolloff
- * - gst_mpegts_descriptor_parse_satellite_delivery_system
- * - GstMpegtsCableDeliverySystemDescriptor
- * - GstMpegtsCableOuterFECScheme
- * - gst_mpegts_descriptor_parse_cable_delivery_system
- * - GstMpegtsTerrestrialDeliverySystemDescriptor
- * - GstMpegtsTerrestrialTransmissionMode
- * - GstMpegtsTerrestrialGuardInterval
- * - GstMpegtsTerrestrialHierarchy
- * - gst_mpegts_descriptor_parse_terrestrial_delivery_system
- * - GstMpegtsT2DeliverySystemCellExtension
- * - GstMpegtsT2DeliverySystemCell
- * - GstMpegtsT2DeliverySystemDescriptor
- * - gst_mpegts_t2_delivery_system_descriptor_free
- * - gst_mpegts_descriptor_parse_dvb_t2_delivery_system
- * - gst_mpegts_descriptor_parse_dvb_short_event
- * - gst_mpegts_descriptor_parse_dvb_network_name
- * - gst_mpegts_descriptor_from_dvb_network_name
- * - GstMpegtsDVBServiceType
- * - gst_mpegts_descriptor_parse_dvb_service
- * - gst_mpegts_descriptor_from_dvb_service
- * - GstMpegtsDVBTeletextType
- * - gst_mpegts_descriptor_parse_dvb_teletext_idx
- * - gst_mpegts_descriptor_parse_dvb_teletext_nb
- * - gst_mpegts_descriptor_parse_dvb_subtitling_idx
- * - gst_mpegts_descriptor_parse_dvb_subtitling_nb
- * - gst_mpegts_descriptor_from_dvb_subtitling
- * - GstMpegtsDVBLinkageType
- * - GstMpegtsDVBLinkageHandOverType
- * - GstMpegtsDVBLinkageMobileHandOver
- * - GstMpegtsDVBLinkageEvent
- * - GstMpegtsDVBLinkageExtendedEvent
- * - GstMpegtsDVBLinkageDescriptor
- * - gst_mpegts_dvb_linkage_descriptor_free
- * - gst_mpegts_dvb_linkage_descriptor_get_mobile_hand_over
- * - gst_mpegts_dvb_linkage_descriptor_get_event
- * - gst_mpegts_dvb_linkage_descriptor_get_extended_event
- * - gst_mpegts_descriptor_parse_dvb_linkage
- * - gst_mpegts_descriptor_parse_dvb_private_data_specifier
- * - gst_mpegts_descriptor_parse_dvb_frequency_list
- * - GstMpegtsDataBroadcastDescriptor
- * - gst_mpegts_dvb_data_broadcast_descriptor_free
- * - gst_mpegts_descriptor_parse_dvb_data_broadcast
- * - GstMpegtsDVBScramblingModeType
- * - gst_mpegts_descriptor_parse_dvb_scrambling
- * - gst_mpegts_descriptor_parse_dvb_data_broadcast_id
- * - GstMpegtsDVBParentalRatingItem
- * - gst_mpegts_descriptor_parse_dvb_parental_rating
- * - gst_mpegts_descriptor_parse_dvb_stream_identifier
- * - gst_mpegts_descriptor_parse_dvb_ca_identifier
- * - GstMpegtsDVBServiceListItem
- * - gst_mpegts_descriptor_parse_dvb_service_list
- * - gst_mpegts_descriptor_parse_dvb_stuffing
- * - gst_mpegts_descriptor_parse_dvb_bouquet_name
- * - GstMpegtsDvbMultilingualNetworkNameItem
- * - gst_mpegts_descriptor_parse_dvb_multilingual_network_name
- * - GstMpegtsDvbMultilingualBouquetNameItem
- * - gst_mpegts_descriptor_parse_dvb_multilingual_bouquet_name
- * - GstMpegtsDvbMultilingualServiceNameItem
- * - gst_mpegts_descriptor_parse_dvb_multilingual_service_name
- * - GstMpegtsDvbMultilingualComponentItem
- * - gst_mpegts_descriptor_parse_dvb_multilingual_component
- * - GST_TYPE_MPEGTS_COMPONENT_DESCRIPTOR
- * - GST_TYPE_MPEGTS_DVB_DATA_BROADCAST_DESCRIPTOR
- * - GST_TYPE_MPEGTS_DVB_LINKAGE_DESCRIPTOR
- * - GST_TYPE_MPEGTS_EXTENDED_EVENT_DESCRIPTOR
- * - GST_TYPE_MPEGTS_T2_DELIVERY_SYSTEM_DESCRIPTOR
- * - gst_mpegts_dvb_code_rate_get_type
- * - gst_mpegts_component_descriptor_get_type
- * - gst_mpegts_dvb_data_broadcast_descriptor_get_type
- * - gst_mpegts_dvb_linkage_descriptor_get_type
- * - gst_mpegts_extended_event_descriptor_get_type
- * - gst_mpegts_t2_delivery_system_descriptor_get_type
- *
- */
-
-/**
- * SECTION:gst-isdb-descriptor
- * @title: ISDB variants of MPEG-TS descriptors
- * @short_description: Descriptors for the various ISDB specifications
- * @include: gst/mpegts/mpegts.h
- * @symbols:
- * - GstMpegtsISDBDescriptorType
- * - GST_TYPE_MPEGTS_ISDB_DESCRIPTOR_TYPE
- * - gst_mpegts_isdb_descriptor_type_get_type
- */
-
-
 /*
  * TODO
  *
@@ -853,7 +700,7 @@
   GstMpegtsDescriptor *copy;
 
   copy = g_slice_dup (GstMpegtsDescriptor, desc);
-  copy->data = g_memdup (desc->data, desc->length + 2);
+  copy->data = g_memdup2 (desc->data, desc->length + 2);
 
   return copy;
 }
@@ -941,7 +788,7 @@
     desc->tag = *data++;
     desc->length = *data++;
     /* Copy the data now that we known the size */
-    desc->data = g_memdup (desc->data, desc->length + 2);
+    desc->data = g_memdup2 (desc->data, desc->length + 2);
     GST_LOG ("descriptor 0x%02x length:%d", desc->tag, desc->length);
     GST_MEMDUMP ("descriptor", desc->data + 2, desc->length);
     /* extended descriptors */
@@ -988,6 +835,38 @@
   return NULL;
 }
 
+/**
+ * gst_mpegts_find_descriptor_with_extension:
+ * @descriptors: (element-type GstMpegtsDescriptor) (transfer none): an array
+ * of #GstMpegtsDescriptor
+ * @tag: the tag to look for
+ *
+ * Finds the first descriptor of type @tag with @tag_extension in the array.
+ *
+ * Note: To look for descriptors that can be present more than once in an
+ * array of descriptors, iterate the #GArray manually.
+ *
+ * Returns: (transfer none): the first descriptor matchin @tag with @tag_extension, else %NULL.
+ *
+ * Since: 1.20
+ */
+const GstMpegtsDescriptor *
+gst_mpegts_find_descriptor_with_extension (GPtrArray * descriptors, guint8 tag,
+    guint8 tag_extension)
+{
+  guint i, nb_desc;
+
+  g_return_val_if_fail (descriptors != NULL, NULL);
+
+  nb_desc = descriptors->len;
+  for (i = 0; i < nb_desc; i++) {
+    GstMpegtsDescriptor *desc = g_ptr_array_index (descriptors, i);
+    if ((desc->tag == tag) && (desc->tag_extension == tag_extension))
+      return (const GstMpegtsDescriptor *) desc;
+  }
+  return NULL;
+}
+
 /* GST_MTS_DESC_REGISTRATION (0x05) */
 /**
  * gst_mpegts_descriptor_from_registration:
@@ -1018,6 +897,47 @@
   return descriptor;
 }
 
+/**
+ * gst_mpegts_descriptor_parse_registration:
+ * @descriptor: a %GST_MTS_DESC_REGISTRATION #GstMpegtsDescriptor
+ * @registration_id: (out): The registration ID (in host endiannes)
+ * @additional_info: (out) (allow-none) (array length=additional_info_length): The additional information
+ * @additional_info_length: (out) (allow-none): The size of @additional_info in bytes.
+ *
+ * Extracts the Registration information from @descriptor.
+ *
+ * Returns: %TRUE if parsing succeeded, else %FALSE.
+ *
+ * Since: 1.20
+ */
+
+gboolean
+gst_mpegts_descriptor_parse_registration (GstMpegtsDescriptor * descriptor,
+    guint32 * registration_id,
+    guint8 ** additional_info, gsize * additional_info_length)
+{
+  guint8 *data;
+
+  g_return_val_if_fail (descriptor != NULL && registration_id != NULL, FALSE);
+
+  /* The smallest registration is 4 bytes */
+  __common_desc_checks (descriptor, GST_MTS_DESC_REGISTRATION, 4, FALSE);
+
+  data = (guint8 *) descriptor->data + 2;
+  *registration_id = GST_READ_UINT32_BE (data);
+  data += 4;
+  if (additional_info && additional_info_length) {
+    *additional_info_length = descriptor->length - 4;
+    if (descriptor->length > 4) {
+      *additional_info = data;
+    } else {
+      *additional_info = NULL;
+    }
+  }
+
+  return TRUE;
+}
+
 /* GST_MTS_DESC_CA (0x09) */
 
 /**
@@ -1310,6 +1230,8 @@
  * Creates a #GstMpegtsDescriptor with custom @tag, @tag_extension and @data
  *
  * Returns: #GstMpegtsDescriptor
+ *
+ * Since: 1.20
  */
 GstMpegtsDescriptor *
 gst_mpegts_descriptor_from_custom_with_extension (guint8 tag,
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gstmpegtsdescriptor.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gstmpegtsdescriptor.h
Changed
@@ -148,93 +148,28 @@
  */
 typedef enum {
   /* 0x80 - 0xFE are user defined */
-  GST_MTS_DESC_AC3_AUDIO_STREAM                 = 0x81,
-  GST_MTS_DESC_DTG_LOGICAL_CHANNEL              = 0x83,    /* from DTG D-Book */
+  GST_MTS_DESC_DTG_LOGICAL_CHANNEL              = 0x83,    /* from DTG D-Book, only present in NIT */
 } GstMpegtsMiscDescriptorType;
 
 /**
- * GstMpegtsATSCDescriptorType:
+ * GstMpegtsSCTEDescriptorType:
  *
- * These values correspond to the registered descriptor type from
- * the various ATSC specifications.
+ * These values correspond to the ones defined by SCTE (amongst other in ANSI/SCTE 57)
  *
- * Consult the relevant specifications for more details.
+ * Since: 1.20
  */
 typedef enum {
-  /* ATSC A/65 2009 */
-  GST_MTS_DESC_ATSC_STUFFING                    = 0x80,
-  GST_MTS_DESC_ATSC_AC3                         = 0x83,
-  GST_MTS_DESC_ATSC_CAPTION_SERVICE             = 0x86,
-  GST_MTS_DESC_ATSC_CONTENT_ADVISORY            = 0x87,
-  GST_MTS_DESC_ATSC_EXTENDED_CHANNEL_NAME       = 0xA0,
-  GST_MTS_DESC_ATSC_SERVICE_LOCATION            = 0xA1,
-  GST_MTS_DESC_ATSC_TIME_SHIFTED_SERVICE        = 0xA2,
-  GST_MTS_DESC_ATSC_COMPONENT_NAME              = 0xA3,
-  GST_MTS_DESC_ATSC_DCC_DEPARTING_REQUEST       = 0xA8,
-  GST_MTS_DESC_ATSC_DCC_ARRIVING_REQUEST        = 0xA9,
-  GST_MTS_DESC_ATSC_REDISTRIBUTION_CONTROL      = 0xAA,
-  GST_MTS_DESC_ATSC_GENRE                       = 0xAB,
-  GST_MTS_DESC_ATSC_PRIVATE_INFORMATION         = 0xAD,
-  GST_MTS_DESC_ATSC_EAC3                        = 0xCC,
-
-  /* ATSC A/53:3 2009 */
-  GST_MTS_DESC_ATSC_ENHANCED_SIGNALING          = 0xB2,
-
-  /* ATSC A/90 */
-  GST_MTS_DESC_ATSC_DATA_SERVICE                = 0xA4,
-  GST_MTS_DESC_ATSC_PID_COUNT                   = 0xA5,
-  GST_MTS_DESC_ATSC_DOWNLOAD_DESCRIPTOR         = 0xA6,
-  GST_MTS_DESC_ATSC_MULTIPROTOCOL_ENCAPSULATION = 0xA7,
-  GST_MTS_DESC_ATSC_MODULE_LINK                 = 0xB4,
-  GST_MTS_DESC_ATSC_CRC32                       = 0xB5,
-  GST_MTS_DESC_ATSC_GROUP_LINK                  = 0xB8,
-} GstMpegtsATSCDescriptorType;
+  GST_MTS_DESC_SCTE_STUFFING                    = 0x80,
+  GST_MTS_DESC_SCTE_AC3                         = 0x81,
+  GST_MTS_DESC_SCTE_FRAME_RATE                  = 0x82,
+  GST_MTS_DESC_SCTE_EXTENDED_VIDEO              = 0x83,
+  GST_MTS_DESC_SCTE_COMPONENT_NAME              = 0x84,
+  GST_MTS_DESC_SCTE_FREQUENCY_SPEC              = 0x90,
+  GST_MTS_DESC_SCTE_MODULATION_PARAMS           = 0x91,
+  GST_MTS_DESC_SCTE_TRANSPORT_STREAM_ID         = 0x92
+} GstMpegtsSCTEDescriptorType;
+
 
-/**
- * GstMpegtsISDBDescriptorType:
- *
- * These values correspond to the registered descriptor type from
- * the various ISDB specifications.
- *
- * Consult the relevant specifications for more details.
- */
-typedef enum {
-  /* ISDB ARIB B10 v4.6 */
-  GST_MTS_DESC_ISDB_HIERARCHICAL_TRANSMISSION   = 0xC0,
-  GST_MTS_DESC_ISDB_DIGITAL_COPY_CONTROL        = 0xC1,
-  GST_MTS_DESC_ISDB_NETWORK_IDENTIFICATION      = 0xC2,
-  GST_MTS_DESC_ISDB_PARTIAL_TS_TIME             = 0xc3,
-  GST_MTS_DESC_ISDB_AUDIO_COMPONENT             = 0xc4,
-  GST_MTS_DESC_ISDB_HYPERLINK                   = 0xc5,
-  GST_MTS_DESC_ISDB_TARGET_REGION               = 0xc6,
-  GST_MTS_DESC_ISDB_DATA_CONTENT                = 0xc7,
-  GST_MTS_DESC_ISDB_VIDEO_DECODE_CONTROL        = 0xc8,
-  GST_MTS_DESC_ISDB_DOWNLOAD_CONTENT            = 0xc9,
-  GST_MTS_DESC_ISDB_CA_EMM_TS                   = 0xca,
-  GST_MTS_DESC_ISDB_CA_CONTRACT_INFORMATION     = 0xcb,
-  GST_MTS_DESC_ISDB_CA_SERVICE                  = 0xcc,
-  GST_MTS_DESC_ISDB_TS_INFORMATION              = 0xcd,
-  GST_MTS_DESC_ISDB_EXTENDED_BROADCASTER        = 0xce,
-  GST_MTS_DESC_ISDB_LOGO_TRANSMISSION           = 0xcf,
-  GST_MTS_DESC_ISDB_BASIC_LOCAL_EVENT           = 0xd0,
-  GST_MTS_DESC_ISDB_REFERENCE                   = 0xd1,
-  GST_MTS_DESC_ISDB_NODE_RELATION               = 0xd2,
-  GST_MTS_DESC_ISDB_SHORT_NODE_INFORMATION      = 0xd3,
-  GST_MTS_DESC_ISDB_STC_REFERENCE               = 0xd4,
-  GST_MTS_DESC_ISDB_SERIES                      = 0xd5,
-  GST_MTS_DESC_ISDB_EVENT_GROUP                 = 0xd6,
-  GST_MTS_DESC_ISDB_SI_PARAMETER                = 0xd7,
-  GST_MTS_DESC_ISDB_BROADCASTER_NAME            = 0xd8,
-  GST_MTS_DESC_ISDB_COMPONENT_GROUP             = 0xd9,
-  GST_MTS_DESC_ISDB_SI_PRIME_TS                 = 0xda,
-  GST_MTS_DESC_ISDB_BOARD_INFORMATION           = 0xdb,
-  GST_MTS_DESC_ISDB_LDT_LINKAGE                 = 0xdc,
-  GST_MTS_DESC_ISDB_CONNECTED_TRANSMISSION      = 0xdd,
-  GST_MTS_DESC_ISDB_CONTENT_AVAILABILITY        = 0xde,
-  /* ... */
-  GST_MTS_DESC_ISDB_SERVICE_GROUP               = 0xe0
-
-} GstMpegtsISDBDescriptorType;
 
 typedef struct _GstMpegtsDescriptor GstMpegtsDescriptor;
 
@@ -274,6 +209,73 @@
 const GstMpegtsDescriptor * gst_mpegts_find_descriptor (GPtrArray *descriptors,
                                                         guint8 tag);
+GST_MPEGTS_API
+const GstMpegtsDescriptor * gst_mpegts_find_descriptor_with_extension (GPtrArray *descriptors,
+                                                        guint8 tag, guint8 tag_extension);
+/**
+ * GstMpegtsRegistrationId:
+ * @GST_MTS_REGISTRATION_0: Undefined registration id
+ * @GST_MTS_REGISTRATION_AC_3: Audio AC-3, ATSC A/52
+ * @GST_MTS_REGISTRATION_AC_4: Audio AC-4, ETSI 103 190-2
+ * @GST_MTS_REGISTRATION_CUEI: SCTE 35, "Digital Program Insertion Cueing Message"
+ * @GST_MTS_REGISTRATION_drac: Dirac Video codec
+ * @GST_MTS_REGISTRATION_DTS1: DTS Audio
+ * @GST_MTS_REGISTRATION_DTS2: DTS Audio
+ * @GST_MTS_REGISTRATION_DTS3: DTS Audio
+ * @GST_MTS_REGISTRATION_EAC3: Enhanced AC-3 (i.e. EAC3)
+ * @GST_MTS_REGISTRATION_ETV1: Cablelabs ETV
+ * @GST_MTS_REGISTRATION_BSSD: SMPTE 302M, Mapping of AES3 Data in mpeg-ts
+ * @GST_MTS_REGISTRATION_GA94: ATSC A/53 compliant stream (i.e. ATSC)
+ * @GST_MTS_REGISTRATION_HDMV: Blu-ray, "System Description Blu-ray Disc
+ *             Read-Only Format part 3 Audio Visual Basic Specifications"
+ * @GST_MTS_REGISTRATION_KLVA: SMPTE RP217 : Non-synchronized Mapping of KLV
+ *             Packets in mpeg-ts
+ * @GST_MTS_REGISTRATION_OPUS: Opus Audio
+ * @GST_MTS_REGISTRATION_TSHV: HDV (Sony)
+ * @GST_MTS_REGISTRATION_VC_1: Video VC-1, SMPTE RP227 "VC-1 Bitstream Transport Encodings"
+ * @GST_MTS_REGISTRATION_OTHER_HEVC: HEVC / h265
+ *
+ * Well-known registration ids, expressed as native-endian 32bit integers. These
+ * are used in descriptors of type %GST_MTS_DESC_REGISTRATION. Unless specified
+ * otherwise (by use of the "OTHER" prefix), they are all registered by the
+ * [SMPTE Registration Authority](https://smpte-ra.org/) or specified in
+ * "official" documentation for the given format.
+ *
+ * Since: 1.20
+ */
+
+/**
+ * REG_TO_UINT32: (skip) (attributes doc.skip=true)
+ */
+#define REG_TO_UINT32(a,b,c,d)((a) << 24 | (b) << 16 | (c) << 8 | (d))
+
+typedef enum {
+  GST_MTS_REGISTRATION_0 = 0,
+
+  /* SMPTE-RA registered */
+  GST_MTS_REGISTRATION_AC_3 = REG_TO_UINT32 ('A', 'C', '-', '3'),
+  GST_MTS_REGISTRATION_CUEI = REG_TO_UINT32 ('C', 'U', 'E', 'I'),
+  GST_MTS_REGISTRATION_drac = REG_TO_UINT32 ('d', 'r', 'a', 'c'),
+  GST_MTS_REGISTRATION_DTS1 = REG_TO_UINT32 ('D', 'T', 'S', '1'),
+  GST_MTS_REGISTRATION_DTS2 = REG_TO_UINT32 ('D', 'T', 'S', '2'),
+  GST_MTS_REGISTRATION_DTS3 = REG_TO_UINT32 ('D', 'T', 'S', '3'),
+  GST_MTS_REGISTRATION_BSSD = REG_TO_UINT32 ('B', 'S', 'S', 'D'),
+  GST_MTS_REGISTRATION_EAC3 = REG_TO_UINT32 ('E', 'A', 'C', '3'),
+  GST_MTS_REGISTRATION_ETV1 = REG_TO_UINT32 ('E', 'T', 'V', '1'),
+  GST_MTS_REGISTRATION_GA94 = REG_TO_UINT32 ('G', 'A', '9', '4'),
+  GST_MTS_REGISTRATION_HDMV = REG_TO_UINT32 ('H', 'D', 'M', 'V'),
+  GST_MTS_REGISTRATION_KLVA = REG_TO_UINT32 ('K', 'L', 'V', 'A'),
+  GST_MTS_REGISTRATION_OPUS = REG_TO_UINT32 ('O', 'P', 'U', 'S'),
+  GST_MTS_REGISTRATION_TSHV = REG_TO_UINT32 ('T', 'S', 'H', 'V'),
+  GST_MTS_REGISTRATION_VC_1 = REG_TO_UINT32 ('V', 'C', '-', '1'),
+
+  /* Self-registered by formats, but not in SMPTE-RA registry */
+  GST_MTS_REGISTRATION_AC_4 = REG_TO_UINT32 ('A', 'C', '-', '4'),
+
+  /* Found elsewhere */
+  GST_MTS_REGISTRATION_OTHER_HEVC = REG_TO_UINT32 ('H', 'E', 'V', 'C')
+} GstMpegtsRegistrationId;
+
 /* GST_MTS_DESC_REGISTRATION (0x05) */
 
 GST_MPEGTS_API
@@ -281,6 +283,12 @@
                                                         const gchar *format_identifier,
                                                         guint8 *additional_info, gsize additional_info_length);
 
+GST_MPEGTS_API
+gboolean gst_mpegts_descriptor_parse_registration(GstMpegtsDescriptor *descriptor,
+                                                  guint32 *registration_id,
+                                                  guint8 **additional_info,
+                                                  gsize *additional_info_length);
+
 /* GST_MTS_DESC_CA (0x09) */
 
 GST_MPEGTS_API
@@ -375,6 +383,7 @@
 GstMpegtsDescriptor * gst_mpegts_descriptor_from_custom (guint8 tag,
                                                          const guint8 *data, gsize length);
+
 GST_MPEGTS_API
 GstMpegtsDescriptor * gst_mpegts_descriptor_from_custom_with_extension (guint8 tag,
                                                          guint8 tag_extension, const guint8 *data, gsize length);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gstmpegtssection.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gstmpegtssection.c
Changed
@@ -39,77 +39,57 @@ #include "gstmpegts-private.h" /** - * SECTION:gstmpegts - * @title: Mpeg-ts helper library - * @short_description: Mpeg-ts helper library for plugins and applications - * @include: gst/mpegts/mpegts.h - */ - -/** * SECTION:gstmpegtssection * @title: Base MPEG-TS sections * @short_description: Sections for ITU H.222.0 | ISO/IEC 13818-1 * @include: gst/mpegts/mpegts.h - * @symbols: - * - GST_MPEGTS_SECTION_TYPE - * - GstMpegtsSection - * - GstMpegtsSectionTableID - * - GstMpegtsSectionType - * - gst_message_new_mpegts_section - * - gst_message_parse_mpegts_section - * - gst_mpegts_section_send_event - * - gst_event_parse_mpegts_section - * - gst_mpegts_section_packetize - * - gst_mpegts_section_new - * - gst_mpegts_section_ref - * - gst_mpegts_section_unref - * - GstMpegtsPatProgram - * - gst_mpegts_section_get_pat - * - gst_mpegts_pat_new - * - gst_mpegts_pat_program_new - * - gst_mpegts_section_from_pat - * - GstMpegtsPMT - * - GstMpegtsPMTStream - * - GstMpegtsStreamType - * - gst_mpegts_section_get_pmt - * - gst_mpegts_pmt_new - * - gst_mpegts_pmt_stream_new - * - gst_mpegts_section_from_pmt - * - gst_mpegts_section_get_tsdt - * - gst_mpegts_section_get_cat - * - GST_TYPE_MPEGTS_SECTION_TABLE_ID - * - GST_TYPE_MPEGTS_SECTION_TYPE - * - GST_TYPE_MPEGTS_SECTION_DVB_TABLE_ID - * - GST_MPEGTS_SECTION - * - GST_TYPE_MPEGTS_STREAM_TYPE - * - GST_TYPE_MPEGTS_PMT - * - GST_TYPE_MPEGTS_PMT_STREAM - * - GST_TYPE_MPEGTS_SECTION - * - gst_mpegts_section_table_id_get_type - * - gst_mpegts_section_type_get_type - * - gst_mpegts_pmt_get_type - * - gst_mpegts_pmt_stream_get_type - * - gst_mpegts_section_get_type - * - gst_mpegts_stream_type_get_type + * + * ## Generic usage of sections with %GstMpegtsSection + * + * The %GstMpegtsSection object is the representation of MPEG-TS Section (SI or + * PSI). + * + * Various elements can post those on the bus via %GstMessage of type + * %GST_MESSAGE_ELEMENT. 
The gst_message_parse_mpegts_section() function + * provides access to the section. + * + * Applications (or other elements) can create them either by using one of the + * `gst_mpegts_section_from_*` functions, or by providing the raw SI data via + * gst_mpegts_section_new(). + * + * Elements outputting MPEG-TS streams can also create sections using the + * various convenience functions and then get the packetized data (to be + * inserted in MPEG-TS packets) using gst_mpegts_section_packetize(). * * For more details, refer to the ITU H.222.0 or ISO/IEC 13818-1 specifications * and other specifications mentioned in the documentation. - */ - -/* - * TODO * - * * Check minimum size for section parsing in the various - * gst_mpegts_section_get_<tabld>() methods + * # Supported base MPEG-TS sections + * These are the sections for which parsing and packetizing code exists. + * + * ## Program Association Table (PAT) + * See: + * * gst_mpegts_section_get_pat() + * * gst_mpegts_pat_program_new() + * * %GstMpegtsPatProgram * - * * Implement parsing code for - * * BAT - * * CAT - * * TSDT + * ## Conditional Access Table (CAT) + * See: + * * gst_mpegts_section_get_cat() + * + * ## Program Map Table (PMT) + * See: + * * %GstMpegtsPMT + * * gst_mpegts_section_get_pmt() + * * gst_mpegts_pmt_new() + * * %GstMpegtsPMTStream + * + * ## Transport Stream Description Table (TSDT) + * See: + * * gst_mpegts_section_get_tsdt() + * # API */ -GST_DEBUG_CATEGORY (mpegts_debug); - static GQuark QUARK_PAT; static GQuark QUARK_CAT; static GQuark QUARK_BAT; @@ -119,6 +99,7 @@ static GQuark QUARK_EIT; static GQuark QUARK_TDT; static GQuark QUARK_TOT; +static GQuark QUARK_SCTE_SIT; static GQuark QUARK_SECTION; static GType _gst_mpegts_section_type = 0; @@ -253,7 +234,7 @@ copy->last_section_number = section->last_section_number; copy->crc = section->crc; - copy->data = g_memdup (section->data, section->section_length); + copy->data = g_memdup2 (section->data, section->section_length); 
copy->section_length = section->section_length; /* Note: We do not copy the cached parsed item, it will be * reconstructed on that copy */ @@ -338,6 +319,9 @@ case GST_MPEGTS_SECTION_TOT: quark = QUARK_TOT; break; + case GST_MPEGTS_SECTION_SCTE_SIT: + quark = QUARK_SCTE_SIT; + break; default: GST_DEBUG ("Creating structure for unknown GstMpegtsSection"); quark = QUARK_SECTION; @@ -373,8 +357,17 @@ return msg; } -static GstEvent * -_mpegts_section_get_event (GstMpegtsSection * section) +/** + * gst_event_new_mpegts_section: + * @section: (transfer none): The #GstMpegtsSection to put in a message + * + * Creates a new #GstEvent for a #GstMpegtsSection. + * + * Returns: (transfer full): The new custom #GstEvent. + * Since: 1.20 + */ +GstEvent * +gst_event_new_mpegts_section (GstMpegtsSection * section) { GstStructure *structure; GstEvent *event; @@ -392,7 +385,8 @@ * * Extracts the #GstMpegtsSection contained in the @event #GstEvent * - * Returns: (transfer full): The extracted #GstMpegtsSection + * Returns: (transfer full): The extracted #GstMpegtsSection , or %NULL if the + * event did not contain a valid #GstMpegtsSection. */ GstMpegtsSection * gst_event_parse_mpegts_section (GstEvent * event) @@ -417,10 +411,10 @@ * @element: (transfer none): The #GstElement to send to section event to * @section: (transfer none): The #GstMpegtsSection to put in the event * - * Creates a custom #GstEvent with a @GstMpegtsSection. - * The #GstEvent is sent to the @element #GstElement. + * Creates a custom #GstEvent with a @GstMpegtsSection and send it the @element + * #GstElement. * - * Returns: %TRUE if the event is sent + * Returns: %TRUE if the event was sent to the element. 
*/ gboolean gst_mpegts_section_send_event (GstMpegtsSection * section, GstElement * element) @@ -430,7 +424,7 @@ g_return_val_if_fail (section != NULL, FALSE); g_return_val_if_fail (element != NULL, FALSE); - event = _mpegts_section_get_event (section); + event = gst_event_new_mpegts_section (section); if (!gst_element_send_event (element, event)) { gst_event_unref (event); @@ -509,12 +503,13 @@ * * Returns the array of #GstMpegtsPatProgram contained in the section. * - * Note: The PAT "transport_id" field corresponds to the "subtable_extension" - * field of the provided @section. + * Note: The PAT `transport_stream_id` field corresponds to the + * "subtable_extension" field of the provided @section. * * Returns: (transfer container) (element-type GstMpegtsPatProgram): The - * #GstMpegtsPatProgram contained in the section, or %NULL if an error - * happened. Release with #g_ptr_array_unref when done. + * #GstMpegtsPatProgram contained in the section, or %NULL if an error happened + * or the @section did not contain a valid PAT. Release with #g_ptr_array_unref + * when done. */ GPtrArray * gst_mpegts_section_get_pat (GstMpegtsSection * section) @@ -535,7 +530,8 @@ /** * gst_mpegts_pat_new: * - * Allocates a new #GPtrArray for #GstMpegtsPatProgram + * Allocates a new #GPtrArray for #GstMpegtsPatProgram. The array can be filled + * and then converted to a PAT section with gst_mpegts_section_from_pat(). * * Returns: (transfer full) (element-type GstMpegtsPatProgram): A newly allocated #GPtrArray */ @@ -782,7 +778,7 @@ * gst_mpegts_section_get_pmt: * @section: a #GstMpegtsSection of type %GST_MPEGTS_SECTION_PMT * - * Returns the #GstMpegtsPMT contained in the @section. + * Parses the Program Map Table contained in the @section. * * Returns: The #GstMpegtsPMT contained in the section, or %NULL if an error * happened. @@ -804,7 +800,9 @@ /** * gst_mpegts_pmt_new: * - * Allocates and initializes a new #GstMpegtsPMT. + * Allocates and initializes a new #GstMpegtsPMT. 
#GstMpegtsPMTStream can be + * added to the streams array, and global PMT #GstMpegtsDescriptor to the + * descriptors array. * * Returns: (transfer full): #GstMpegtsPMT */ @@ -982,11 +980,13 @@ * gst_mpegts_section_get_cat: * @section: a #GstMpegtsSection of type %GST_MPEGTS_SECTION_CAT * + * Parses a Conditional Access Table. + * * Returns the array of #GstMpegtsDescriptor contained in the Conditional * Access Table. * - * Returns: (transfer container) (element-type GstMpegtsDescriptor): The - * #GstMpegtsDescriptor contained in the section, or %NULL if an error + * Returns: (transfer container) (element-type GstMpegtsDescriptor): The array + * of #GstMpegtsDescriptor contained in the section, or %NULL if an error * happened. Release with #g_array_unref when done. */ GPtrArray * @@ -1010,10 +1010,12 @@ * gst_mpegts_section_get_tsdt: * @section: a #GstMpegtsSection of type %GST_MPEGTS_SECTION_TSDT * + * Parses a Transport Stream Description Table. + * * Returns the array of #GstMpegtsDescriptor contained in the section * - * Returns: (transfer container) (element-type GstMpegtsDescriptor): The - * #GstMpegtsDescriptor contained in the section, or %NULL if an error + * Returns: (transfer container) (element-type GstMpegtsDescriptor): The array + * of #GstMpegtsDescriptor contained in the section, or %NULL if an error * happened. Release with #g_array_unref when done. */ GPtrArray * @@ -1030,39 +1032,7 @@ } -/** - * gst_mpegts_initialize: - * - * Initializes the MPEG-TS helper library. Must be called before any - * usage. 
- */ -void -gst_mpegts_initialize (void) -{ - if (_gst_mpegts_section_type) - return; - - GST_DEBUG_CATEGORY_INIT (mpegts_debug, "mpegts", 0, "MPEG-TS helper library"); - - /* FIXME : Temporary hack to initialize section gtype */ - _gst_mpegts_section_type = gst_mpegts_section_get_type (); - - QUARK_PAT = g_quark_from_string ("pat"); - QUARK_CAT = g_quark_from_string ("cat"); - QUARK_PMT = g_quark_from_string ("pmt"); - QUARK_NIT = g_quark_from_string ("nit"); - QUARK_BAT = g_quark_from_string ("bat"); - QUARK_SDT = g_quark_from_string ("sdt"); - QUARK_EIT = g_quark_from_string ("eit"); - QUARK_TDT = g_quark_from_string ("tdt"); - QUARK_TOT = g_quark_from_string ("tot"); - QUARK_SECTION = g_quark_from_string ("section"); - - __initialize_descriptors (); -} -/* FIXME : Later on we might need to use more than just the table_id - * to figure out which type of section this is. */ static GstMpegtsSectionType _identify_section (guint16 pid, guint8 table_id) { @@ -1129,6 +1099,9 @@ case GST_MTS_TABLE_ID_SCTE_SPLICE: return GST_MPEGTS_SECTION_SCTE_SIT; break; + case GST_MTS_TABLE_ID_SELECTION_INFORMATION: + if (pid == 0x001f) + return GST_MPEGTS_SECTION_SIT; default: /* Handle ranges */ if (table_id >= GST_MTS_TABLE_ID_EVENT_INFORMATION_ACTUAL_TS_PRESENT && @@ -1220,8 +1193,8 @@ /** * gst_mpegts_section_new: * @pid: the PID to which this section belongs - * @data: (transfer full) (array length=data_size): a pointer to the beginning of the section (i.e. the first byte - * should contain the table_id field). + * @data: (transfer full) (array length=data_size): a pointer to the beginning of + * the section (i.e. the first byte should contain the `table_id` field). * @data_size: size of the @data argument. * * Creates a new #GstMpegtsSection from the provided @data. 
@@ -1320,10 +1293,11 @@ * @section: (transfer none): the #GstMpegtsSection that holds the data * @output_size: (out): #gsize to hold the size of the data * - * If the data in @section has already been packetized, the data pointer is returned - * immediately. Otherwise, the data field is allocated and populated. + * Packetize (i.e. serialize) the @section. If the data in @section has already + * been packetized, the data pointer is returned immediately. Otherwise, the + * data field is allocated and populated. * - * Returns: (transfer none): pointer to section data, or %NULL on fail + * Returns: (transfer none): pointer to section data, or %NULL on failure. */ guint8 * gst_mpegts_section_packetize (GstMpegtsSection * section, gsize * output_size) @@ -1331,7 +1305,6 @@ guint8 *crc; g_return_val_if_fail (section != NULL, NULL); g_return_val_if_fail (output_size != NULL, NULL); - g_return_val_if_fail (section->packetizer != NULL, NULL); /* Section data has already been packetized */ if (section->data) { @@ -1339,6 +1312,8 @@ return section->data; } + g_return_val_if_fail (section->packetizer != NULL, NULL); + if (!section->packetizer (section)) return NULL; @@ -1352,3 +1327,22 @@ return section->data; } + +void +__initialize_sections (void) +{ + /* FIXME : Temporary hack to initialize section gtype */ + _gst_mpegts_section_type = gst_mpegts_section_get_type (); + + QUARK_PAT = g_quark_from_string ("pat"); + QUARK_CAT = g_quark_from_string ("cat"); + QUARK_PMT = g_quark_from_string ("pmt"); + QUARK_NIT = g_quark_from_string ("nit"); + QUARK_BAT = g_quark_from_string ("bat"); + QUARK_SDT = g_quark_from_string ("sdt"); + QUARK_EIT = g_quark_from_string ("eit"); + QUARK_TDT = g_quark_from_string ("tdt"); + QUARK_TOT = g_quark_from_string ("tot"); + QUARK_SCTE_SIT = g_quark_from_string ("scte-sit"); + QUARK_SECTION = g_quark_from_string ("section"); +}
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/gstmpegtssection.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/gstmpegtssection.h
Changed
@@ -58,9 +58,12 @@ * @GST_MPEGTS_SECTION_ATSC_ETT: ATSC Extended Text Table (A65) * @GST_MPEGTS_SECTION_ATSC_EIT: ATSC Event Information Table (A65) * @GST_MPEGTS_SECTION_ATSC_STT: ATSC System Time Table (A65) + * @GST_MPEGTS_SECTION_ATSC_RRT: ATSC Rating Region Table (A65) * @GST_MPEGTS_SECTION_SCTE_SIT: SCTE Splice Information Table (SCTE-35) * - * Types of #GstMpegtsSection that the library handles. + * Types of #GstMpegtsSection that the library handles. This covers all the + * MPEG-TS and derivate specification that the library can properly identify and + * use. */ typedef enum { GST_MPEGTS_SECTION_UNKNOWN = 0, @@ -74,6 +77,14 @@ GST_MPEGTS_SECTION_SDT, GST_MPEGTS_SECTION_TDT, GST_MPEGTS_SECTION_TOT, + /** + * GST_MPEGTS_SECTION_SIT: + * + * Selection Information Table (EN 300 468) + * + * Since: 1.20 + */ + GST_MPEGTS_SECTION_SIT, GST_MPEGTS_SECTION_ATSC_TVCT, GST_MPEGTS_SECTION_ATSC_CVCT, GST_MPEGTS_SECTION_ATSC_MGT, @@ -86,6 +97,21 @@ /** * GstMpegtsSectionTableID: + * @GST_MTS_TABLE_ID_PROGRAM_ASSOCIATION: Program Association Table (PAT) + * @GST_MTS_TABLE_ID_CONDITIONAL_ACCESS: Conditional Access Table (CAT) + * @GST_MTS_TABLE_ID_TS_PROGRAM_MAP: Program Map Table (PMT) + * @GST_MTS_TABLE_ID_TS_DESCRIPTION: Transport Stream Description Table + * @GST_MTS_TABLE_ID_14496_SCENE_DESCRIPTION: ISO/IEC 14496 Scene Description Table + * @GST_MTS_TABLE_ID_14496_OBJET_DESCRIPTOR: ISO/IEC 14496 Object Descriptor Table + * @GST_MTS_TABLE_ID_METADATA: Metadata Section + * @GST_MTS_TABLE_ID_IPMP_CONTROL_INFORMATION: IPMP Control Information + * @GST_MTS_TABLE_ID_DSM_CC_MULTIPROTO_ENCAPSULATED_DATA: DSM-CC Multi-Protocol Encapsulated (MPE) Data + * @GST_MTS_TABLE_ID_DSM_CC_U_N_MESSAGES: DSM-CC U-N Messages + * @GST_MTS_TABLE_ID_DSM_CC_DOWNLOAD_DATA_MESSAGES: DSM-CC Download Data Messages + * @GST_MTS_TABLE_ID_DSM_CC_STREAM_DESCRIPTORS: DSM-CC Stream Descriptors + * @GST_MTS_TABLE_ID_DSM_CC_PRIVATE_DATA: DSM-CC Private Data + * 
@GST_MTS_TABLE_ID_DSM_CC_ADDRESSABLE_SECTIONS: DSM-CC Addressable Section + * @GST_MTS_TABLE_ID_UNSET: Unset section table_id (value is forbidden to use in actual sections) * * Values for a #GstMpegtsSection table_id * @@ -104,6 +130,32 @@ GST_MTS_TABLE_ID_14496_OBJET_DESCRIPTOR = 0x05, GST_MTS_TABLE_ID_METADATA = 0x06, GST_MTS_TABLE_ID_IPMP_CONTROL_INFORMATION = 0x07, + /** + * GST_MTS_TABLE_ID_14496_SECTION: + * + * ISO/IEC 14496 Section. + * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_14496_SECTION = 0x08, + + /** + * GST_MTS_TABLE_ID_23001_11_SECTION: + * + * ISO/IEC 23001-11 (Green Access Unit) Section. + * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_23001_11_SECTION = 0x09, + + /** + * GST_MTS_TABLE_ID_23001_10_SECTION: + * + * ISO/ISO 23001-10 (Quality Access Unit) Section. + * + * Since: 1.20 + */ + GST_MTS_TABLE_ID_23001_10_SECTION = 0x0A, /* 0x08 - 0x39 : ITU H.222.0 | ISO/IEC 13818-1 reserved */ @@ -120,22 +172,26 @@ } GstMpegtsSectionTableID; +/** + * GstMpegtsPacketizeFunc: (attributes doc.skip=true) + */ typedef gboolean (*GstMpegtsPacketizeFunc) (GstMpegtsSection *section); /** * GstMpegtsSection: - * @section_type: The type of section - * @pid: The pid on which this section was found - * @table_id: The table id of this section + * @section_type: The type of section. + * @pid: The PID on which this section was found or belongs to. + * @table_id: The table id of this section. See %GstMpegtsSectionTableID and + * derivates for more information. * @subtable_extension: This meaning differs per section. See the documentation - * of the parsed section type for the meaning of this field + * of the parsed section type for the meaning of this field * @version_number: Version of the section. 
* @current_next_indicator: Applies to current/next stream or not * @section_number: Number of the section (if multiple) * @last_section_number: Number of the last expected section (if multiple) - * @crc: CRC + * @crc: Checksum (if applicable) * - * Mpeg-TS Section Information (SI) (ISO/IEC 13818-1) + * Mpeg-TS Section Information (SI) (ISO/IEC 13818-1) object. */ struct _GstMpegtsSection { @@ -174,6 +230,7 @@ * FIXME : Maybe make public later on when allowing creation of * sections to that people can create private short sections ? */ gboolean short_section; + GstMpegtsPacketizeFunc packetizer; /* Padding for future extension */ @@ -230,9 +287,10 @@ /** * GstMpegtsStreamType: * @GST_MPEGTS_STREAM_TYPE_RESERVED_00: ITU-T | ISO/IEC Reserved - * @GST_MPEGTS_STREAM_TYPE_VIDEO_MPEG1: ISO/IEC 11172-2 Video + * @GST_MPEGTS_STREAM_TYPE_VIDEO_MPEG1: ISO/IEC 11172-2 Video (i.e. MPEG-1 Video) * @GST_MPEGTS_STREAM_TYPE_VIDEO_MPEG2: Rec. ITU-T H.262 | ISO/IEC 13818-2 - * Video or ISO/IEC 11172-2 constrained parameter video stream + * Video or ISO/IEC 11172-2 constrained parameter video stream (i.e. 
+ * MPEG-2 Video) * @GST_MPEGTS_STREAM_TYPE_AUDIO_MPEG1: ISO/IEC 11172-3 Audio * @GST_MPEGTS_STREAM_TYPE_AUDIO_MPEG2: ISO/IEC 13818-3 Audio * @GST_MPEGTS_STREAM_TYPE_PRIVATE_SECTIONS: private sections @@ -245,34 +303,34 @@ * @GST_MPEGTS_STREAM_TYPE_DSMCC_C: ISO/IEC 13818-6 type C * @GST_MPEGTS_STREAM_TYPE_DSMCC_D: ISO/IEC 13818-6 type D * @GST_MPEGTS_STREAM_TYPE_AUXILIARY: auxiliary streams - * @GST_MPEGTS_STREAM_TYPE_AUDIO_AAC_ADTS: ISO/IEC 13818-7 Audio with ADTS - * transport syntax - * @GST_MPEGTS_STREAM_TYPE_VIDEO_MPEG4: ISO/IEC 14496-2 Visual - * @GST_MPEGTS_STREAM_TYPE_AUDIO_AAC_LATM: ISO/IEC 14496-3 Audio with the LATM - * transport syntax as defined in ISO/IEC 14496-3 + * @GST_MPEGTS_STREAM_TYPE_AUDIO_AAC_ADTS: ISO/IEC 13818-7 Audio (AAC) with ADTS + * transport syntax + * @GST_MPEGTS_STREAM_TYPE_VIDEO_MPEG4: ISO/IEC 14496-2 Visual (MPEG-4 Video) + * @GST_MPEGTS_STREAM_TYPE_AUDIO_AAC_LATM: ISO/IEC 14496-3 Audio (AAC) with the LATM + * transport syntax as defined in ISO/IEC 14496-3 * @GST_MPEGTS_STREAM_TYPE_SL_FLEXMUX_PES_PACKETS: ISO/IEC 14496-1 - * SL-packetized stream or FlexMux stream carried in PES packets + * SL-packetized stream or FlexMux stream carried in PES packets * @GST_MPEGTS_STREAM_TYPE_SL_FLEXMUX_SECTIONS: ISO/IEC 14496-1 SL-packetized - * stream or FlexMux stream carried in ISO/IEC 14496_sections + * stream or FlexMux stream carried in ISO/IEC 14496_sections * @GST_MPEGTS_STREAM_TYPE_SYNCHRONIZED_DOWNLOAD: ISO/IEC 13818-6 Synchronized - * Download Protocol + * Download Protocol * @GST_MPEGTS_STREAM_TYPE_METADATA_PES_PACKETS: Metadata carried in PES packets * @GST_MPEGTS_STREAM_TYPE_METADATA_SECTIONS: Metadata carried in metadata_sections * @GST_MPEGTS_STREAM_TYPE_METADATA_DATA_CAROUSEL: Metadata carried in ISO/IEC - * 13818-6 Data Carousel + * 13818-6 Data Carousel * @GST_MPEGTS_STREAM_TYPE_METADATA_OBJECT_CAROUSEL: Metadata carried in - * ISO/IEC 13818-6 Object Carousel + * ISO/IEC 13818-6 Object Carousel * 
@GST_MPEGTS_STREAM_TYPE_METADATA_SYNCHRONIZED_DOWNLOAD: Metadata carried in - * ISO/IEC 13818-6 Synchronized Download Protocol + * ISO/IEC 13818-6 Synchronized Download Protocol * @GST_MPEGTS_STREAM_TYPE_MPEG2_IPMP: IPMP stream (defined in ISO/IEC 13818-11, - * MPEG-2 IPMP) + * MPEG-2 IPMP) * @GST_MPEGTS_STREAM_TYPE_VIDEO_H264: AVC video stream conforming to one or * more profiles defined in Annex A of Rec. ITU-T H.264 | ISO/IEC 14496-10 or * AVC video sub-bitstream of SVC as defined in 2.1.78 or MVC base view * sub-bitstream, as defined in 2.1.85, or AVC video sub-bitstream of MVC, as * defined in 2.1.88 - * @GST_MPEGTS_STREAM_TYPE_AUDIO_AAC_CLEAN: ISO/IEC 14496-3 Audio, without - * using any additional transport syntax, such as DST, ALS and SLS + * @GST_MPEGTS_STREAM_TYPE_AUDIO_AAC_CLEAN: ISO/IEC 14496-3 (AAC) Audio, without + * using any additional transport syntax, such as DST, ALS and SLS * @GST_MPEGTS_STREAM_TYPE_MPEG4_TIMED_TEXT: ISO/IEC 14496-17 Text * @GST_MPEGTS_STREAM_TYPE_VIDEO_RVC: Auxiliary video stream as defined in * ISO/IEC 23002-3 @@ -283,16 +341,18 @@ * of an AVC video stream conforming to one or more profiles defined in Annex H * of Rec. ITU-T H.264 | ISO/IEC 14496-10 * @GST_MPEGTS_STREAM_TYPE_VIDEO_JP2K: Video stream conforming to one or more - * profiles as defined in Rec. ITU-T T.800 | ISO/IEC 15444-1 + * profiles as defined in Rec. ITU-T T.800 | ISO/IEC 15444-1 (i.e. JPEG 2000) * @GST_MPEGTS_STREAM_TYPE_VIDEO_MPEG2_STEREO_ADDITIONAL_VIEW: Additional view * Rec. ITU-T H.262 | ISO/IEC 13818-2 video stream for service-compatible * stereoscopic 3D services * @GST_MPEGTS_STREAM_TYPE_VIDEO_H264_STEREO_ADDITIONAL_VIEW: Additional view * Rec. ITU-T H.264 | ISO/IEC 14496-10 video stream conforming to one or more * profiles defined in Annex A for service-compatible stereoscopic 3D services + * @GST_MPEGTS_STREAM_TYPE_VIDEO_HEVC: Rec. 
ITU-T H.265 | ISO/IEC 23008-2 video + * stream or an HEVC temporal video sub-bitstream * @GST_MPEGTS_STREAM_TYPE_IPMP_STREAM: IPMP stream * - * Type of mpeg-ts stream type. + * Type of MPEG-TS stream type. * * These values correspond to the base standard registered types. Depending * on the variant of mpeg-ts being used (Bluray, ATSC, DVB, ...), other @@ -339,8 +399,17 @@ GST_MPEGTS_STREAM_TYPE_VIDEO_H264_STEREO_ADDITIONAL_VIEW = 0x23, GST_MPEGTS_STREAM_TYPE_VIDEO_HEVC = 0x24, /* 0x24 - 0x7e : Rec. ITU-T H.222.0 | ISO/IEC 13818-1 Reserved */ - GST_MPEGTS_STREAM_TYPE_IPMP_STREAM = 0x7f + GST_MPEGTS_STREAM_TYPE_IPMP_STREAM = 0x7f, /* 0x80 - 0xff : User Private (or defined in other specs) */ + + /** + * GST_MPEGTS_STREAM_TYPE_USER_PRIVATE_EA: + * + * User Private stream id (used for VC-1) as defined by SMPTE RP227. + * + * Since: 1.20 + */ + GST_MPEGTS_STREAM_TYPE_USER_PRIVATE_EA = 0xea, } GstMpegtsStreamType; /** @@ -350,7 +419,7 @@ * @descriptors: (element-type GstMpegtsDescriptor): the descriptors of the * stream * - * An individual stream definition. + * An individual stream definition of a #GstMpegtsPMT. */ struct _GstMpegtsPMTStream { @@ -362,11 +431,13 @@ /** * GstMpegtsPMT: - * @pcr_pid: PID of the stream containing PCR - * @descriptors: (element-type GstMpegtsDescriptor): array of #GstMpegtsDescriptor + * @pcr_pid: PID of the stream containing the PCR for this program. + * @program_number: The program to which this PMT is applicable. + * @descriptors: (element-type GstMpegtsDescriptor): Array of #GstMpegtsDescriptor * @streams: (element-type GstMpegtsPMTStream): Array of #GstMpegtsPMTStream * - * Program Map Table (ISO/IEC 13818-1). + * Program Map Table (ISO/IEC 13818-1). Provides the mappings between program + * numbers and the program elements that comprise them. * * The program_number is contained in the subtable_extension field of the * container #GstMpegtsSection. 
@@ -413,6 +484,9 @@ GstMessage *gst_message_new_mpegts_section (GstObject *parent, GstMpegtsSection *section); GST_MPEGTS_API +GstEvent *gst_event_new_mpegts_section (GstMpegtsSection * section); + +GST_MPEGTS_API gboolean gst_mpegts_section_send_event (GstMpegtsSection * section, GstElement * element); GST_MPEGTS_API
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/meson.build
Changed
@@ -1,22 +1,26 @@ -mpegts_sources = [ +mpegts_sources = files( + 'mpegts.c', 'gstmpegtssection.c', 'gstmpegtsdescriptor.c', 'gst-dvb-descriptor.c', 'gst-dvb-section.c', 'gst-atsc-section.c', 'gst-scte-section.c', -] +) -mpegts_headers = [ +mpegts_headers = files( 'gstmpegtssection.h', 'gst-atsc-section.h', 'gst-dvb-section.h', 'gst-scte-section.h', + 'gst-hdmv-section.h', 'gstmpegtsdescriptor.h', + 'gst-atsc-descriptor.h', 'gst-dvb-descriptor.h', + 'gst-isdb-descriptor.h', 'mpegts-prelude.h', 'mpegts.h', -] +) install_headers(mpegts_headers, subdir : 'gstreamer-1.0/gst/mpegts') mpegts_enums = gnome.mkenums_simple('gstmpegts-enumtypes', @@ -33,7 +37,7 @@ gstmpegts = library('gstmpegts-' + api_version, mpegts_sources, mpegts_enums, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_MPEGTS'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_MPEGTS', '-DG_LOG_DOMAIN="GStreamer-MpegTS"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -41,24 +45,40 @@ install : true, dependencies : [gst_dep], ) + +library_def = {'lib': gstmpegts} +pkg_name = 'gstreamer-mpegts-1.0' +pkgconfig.generate(gstmpegts, + libraries : [gst_dep, gstbase_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'GStreamer MPEG-TS support', +) + if build_gir - mpegts_gir = gnome.generate_gir(gstmpegts, - sources : mpegts_sources + mpegts_headers, - namespace : 'GstMpegts', - nsversion : api_version, - identifier_prefix : 'GstMpegts', - symbol_prefix : ['gst_mpegts', 'gst'], - export_packages : 'gstreamer-mpegts-1.0', - includes : ['Gst-1.0'], - install : true, - extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/mpegts/mpegts.h'], - dependencies : [gst_dep] - ) - gen_sources += mpegts_gir + gir = { + 'sources' : mpegts_sources + mpegts_headers, + 'namespace' : 'GstMpegts', + 'nsversion' : api_version, + 'identifier_prefix' : 
'GstMpegts', + 'symbol_prefix' : ['gst_mpegts', 'gst'], + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0'], + 'install' : true, + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/mpegts/mpegts.h'], + 'dependencies' : [gst_dep] + } + library_def += {'gir': [gir]} + if not static_build + mpegts_gir = gnome.generate_gir(gstmpegts, kwargs: gir) + gen_sources += mpegts_gir + endif endif - +libraries += [[pkg_name, library_def]] gstmpegts_dep = declare_dependency(link_with : gstmpegts, include_directories : [libsinc], dependencies : [gst_dep], sources : gen_sources) +meson.override_dependency(pkg_name, gstmpegts_dep)
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/mpegts.c
Added
@@ -0,0 +1,62 @@ +/* + * gstmpegtssection.c - + * Copyright (C) 2013 Edward Hervey + * Copyright (C) 2011, Hewlett-Packard Development Company, L.P. + * Copyright (C) 2007 Alessandro Decina + * 2010 Edward Hervey + * Author: Youness Alaoui <youness.alaoui@collabora.co.uk>, Collabora Ltd. + * Author: Sebastian Dröge <sebastian.droege@collabora.co.uk>, Collabora Ltd. + * Author: Edward Hervey <bilboed@bilboed.com>, Collabora Ltd. + * + * Authors: + * Alessandro Decina <alessandro@nnva.org> + * Zaheer Abbas Merali <zaheerabbas at merali dot org> + * Edward Hervey <edward@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "mpegts.h" +#include "gstmpegts-private.h" + +/** + * SECTION:gstmpegts + * @title: Initialization + * @short_description: Initialization of the MPEG-TS helper library + * @include: gst/mpegts/mpegts.h + * + * Before any usage of this library, the initialization function should be called. + */ + +GST_DEBUG_CATEGORY (mpegts_debug); + +/** + * gst_mpegts_initialize: + * + * Initializes the MPEG-TS helper library. Must be called before any + * usage. 
+ */ +void +gst_mpegts_initialize (void) +{ + GST_DEBUG_CATEGORY_INIT (mpegts_debug, "mpegts", 0, "MPEG-TS helper library"); + + __initialize_sections (); + __initialize_descriptors (); +}
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/mpegts/mpegts.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/mpegts/mpegts.h
Changed
@@ -31,12 +31,15 @@ #include <gst/mpegts/mpegts-prelude.h> #include <gst/mpegts/gstmpegtsdescriptor.h> +#include <gst/mpegts/gst-atsc-descriptor.h> #include <gst/mpegts/gst-dvb-descriptor.h> +#include <gst/mpegts/gst-isdb-descriptor.h> #include <gst/mpegts/gstmpegtssection.h> #include <gst/mpegts/gst-atsc-section.h> #include <gst/mpegts/gst-dvb-section.h> #include <gst/mpegts/gst-scte-section.h> #include <gst/mpegts/gstmpegts-enumtypes.h> +#include <gst/mpegts/gst-hdmv-section.h> G_BEGIN_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/opencv/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/opencv/meson.build
Changed
@@ -25,22 +25,25 @@ # /usr/include/opencv4/opencv2/flann/logger.h:83:36: error: format string is not a string literal [-Werror,-Wformat-nonliteral] gstopencv_cargs = cxx.get_supported_arguments(['-Wno-missing-include-dirs', '-Wno-format-nonliteral']) + pkg_name = 'gstreamer-opencv-1.0' gstopencv = library('gstopencv-' + api_version, opencv_sources, c_args : gst_plugins_bad_args + ['-DBUILDING_GST_OPENCV'], - cpp_args : gst_plugins_bad_args + gstopencv_cargs + ['-DBUILDING_GST_OPENCV'], + cpp_args : gst_plugins_bad_args + gstopencv_cargs + ['-DBUILDING_GST_OPENCV', '-DG_LOG_DOMAIN="GStreamer-OpenCV"'], override_options : ['cpp_std=c++11'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, - darwin_versions : osxversion, + darwin_versions : osxversion, install : true, dependencies : [gstbase_dep, gstvideo_dep, opencv_dep], ) + libraries += [[pkg_name, {'lib': gstopencv}]] gstopencv_dep = declare_dependency(link_with: gstopencv, include_directories : [libsinc], dependencies : [gstvideo_dep, opencv_dep]) + meson.override_dependency(pkg_name, gstopencv_dep) install_headers(opencv_headers, subdir : 'gstreamer-1.0/gst/opencv') elif get_option('opencv').enabled()
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-media-info-private.h
Added
@@ -0,0 +1,126 @@ +/* GStreamer + * + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstplay-media-info.h" + +#ifndef __GST_PLAY_MEDIA_INFO_PRIVATE_H__ +#define __GST_PLAY_MEDIA_INFO_PRIVATE_H__ + +struct _GstPlayStreamInfo +{ + GObject parent; + + gchar *codec; + + GstCaps *caps; + gint stream_index; + GstTagList *tags; + gchar *stream_id; +}; + +struct _GstPlayStreamInfoClass +{ + GObjectClass parent_class; +}; + +struct _GstPlaySubtitleInfo +{ + GstPlayStreamInfo parent; + + gchar *language; +}; + +struct _GstPlaySubtitleInfoClass +{ + GstPlayStreamInfoClass parent_class; +}; + +struct _GstPlayAudioInfo +{ + GstPlayStreamInfo parent; + + gint channels; + gint sample_rate; + + guint bitrate; + guint max_bitrate; + + gchar *language; +}; + +struct _GstPlayAudioInfoClass +{ + GstPlayStreamInfoClass parent_class; +}; + +struct _GstPlayVideoInfo +{ + GstPlayStreamInfo parent; + + gint width; + gint height; + gint framerate_num; + gint framerate_denom; + gint par_num; + gint par_denom; + + guint bitrate; + guint max_bitrate; +}; + +struct _GstPlayVideoInfoClass +{ + GstPlayStreamInfoClass parent_class; +}; + +struct _GstPlayMediaInfo +{ + GObject parent; + + gchar *uri; + gchar *title; + 
gchar *container; + gboolean seekable, is_live; + GstTagList *tags; + GstSample *image_sample; + + GList *stream_list; + GList *audio_stream_list; + GList *video_stream_list; + GList *subtitle_stream_list; + + GstClockTime duration; +}; + +struct _GstPlayMediaInfoClass +{ + GObjectClass parent_class; +}; + +G_GNUC_INTERNAL GstPlayMediaInfo* gst_play_media_info_new + (const gchar *uri); +G_GNUC_INTERNAL GstPlayMediaInfo* gst_play_media_info_copy + (GstPlayMediaInfo *ref); +G_GNUC_INTERNAL GstPlayStreamInfo* gst_play_stream_info_new + (gint stream_index, GType type); +G_GNUC_INTERNAL GstPlayStreamInfo* gst_play_stream_info_copy + (GstPlayStreamInfo *ref); + +#endif /* __GST_PLAY_MEDIA_INFO_PRIVATE_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-media-info.c
Added
@@ -0,0 +1,935 @@ +/* GStreamer + * + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstplay-mediainfo + * @title: GstPlayMediaInfo + * @short_description: Play Media Information + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstplay-media-info.h" +#include "gstplay-media-info-private.h" + +/* Per-stream information */ +G_DEFINE_ABSTRACT_TYPE (GstPlayStreamInfo, gst_play_stream_info, G_TYPE_OBJECT); + +static void +gst_play_stream_info_init (GstPlayStreamInfo * sinfo) +{ + sinfo->stream_index = -1; +} + +static void +gst_play_stream_info_finalize (GObject * object) +{ + GstPlayStreamInfo *sinfo = GST_PLAY_STREAM_INFO (object); + + g_free (sinfo->codec); + g_free (sinfo->stream_id); + + if (sinfo->caps) + gst_caps_unref (sinfo->caps); + + if (sinfo->tags) + gst_tag_list_unref (sinfo->tags); + + G_OBJECT_CLASS (gst_play_stream_info_parent_class)->finalize (object); +} + +static void +gst_play_stream_info_class_init (GstPlayStreamInfoClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + + gobject_class->finalize = gst_play_stream_info_finalize; +} + +/** + * gst_play_stream_info_get_index: + * @info: a #GstPlayStreamInfo + * + * Function 
to get stream index from #GstPlayStreamInfo instance or -1 if + * unknown. + * + * Returns: the stream index of this stream. + * Since: 1.20 + */ +gint +gst_play_stream_info_get_index (const GstPlayStreamInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_STREAM_INFO (info), -1); + + return info->stream_index; +} + +/** + * gst_play_stream_info_get_stream_type: + * @info: a #GstPlayStreamInfo + * + * Function to return human readable name for the stream type + * of the given @info (ex: "audio", "video", "subtitle") + * + * Returns: a human readable name + * Since: 1.20 + */ +const gchar * +gst_play_stream_info_get_stream_type (const GstPlayStreamInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_STREAM_INFO (info), NULL); + + if (GST_IS_PLAY_VIDEO_INFO (info)) + return "video"; + else if (GST_IS_PLAY_AUDIO_INFO (info)) + return "audio"; + else + return "subtitle"; +} + +/** + * gst_play_stream_info_get_tags: + * @info: a #GstPlayStreamInfo + * + * Returns: (transfer none) (nullable): the tags contained in this stream. + * Since: 1.20 + */ +GstTagList * +gst_play_stream_info_get_tags (const GstPlayStreamInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_STREAM_INFO (info), NULL); + + return info->tags; +} + +/** + * gst_play_stream_info_get_codec: + * @info: a #GstPlayStreamInfo + * + * A string describing codec used in #GstPlayStreamInfo. + * + * Returns: (nullable): codec string or %NULL on unknown. + * Since: 1.20 + */ +const gchar * +gst_play_stream_info_get_codec (const GstPlayStreamInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_STREAM_INFO (info), NULL); + + return info->codec; +} + +/** + * gst_play_stream_info_get_caps: + * @info: a #GstPlayStreamInfo + * + * Returns: (nullable) (transfer none): the #GstCaps of the stream or %NULL if + * unknown. 
+ * Since: 1.20 + */ +GstCaps * +gst_play_stream_info_get_caps (const GstPlayStreamInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_STREAM_INFO (info), NULL); + + return info->caps; +} + +/* Video information */ +G_DEFINE_TYPE (GstPlayVideoInfo, gst_play_video_info, + GST_TYPE_PLAY_STREAM_INFO); + +static void +gst_play_video_info_init (GstPlayVideoInfo * info) +{ + info->width = -1; + info->height = -1; + info->framerate_num = 0; + info->framerate_denom = 1; + info->par_num = 1; + info->par_denom = 1; +} + +static void +gst_play_video_info_class_init (G_GNUC_UNUSED GstPlayVideoInfoClass * klass) +{ + /* nothing to do here */ +} + +/** + * gst_play_video_info_get_width: + * @info: a #GstPlayVideoInfo + * + * Returns: the width of video in #GstPlayVideoInfo or -1 if unknown. + * Since: 1.20 + */ +gint +gst_play_video_info_get_width (const GstPlayVideoInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_VIDEO_INFO (info), -1); + + return info->width; +} + +/** + * gst_play_video_info_get_height: + * @info: a #GstPlayVideoInfo + * + * Returns: the height of video in #GstPlayVideoInfo or -1 if unknown. 
+ * Since: 1.20 + */ +gint +gst_play_video_info_get_height (const GstPlayVideoInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_VIDEO_INFO (info), -1); + + return info->height; +} + +/** + * gst_play_video_info_get_framerate: + * @info: a #GstPlayVideoInfo + * @fps_n: (out): Numerator of frame rate + * @fps_d: (out): Denominator of frame rate + * + * Since: 1.20 + */ +void +gst_play_video_info_get_framerate (const GstPlayVideoInfo * info, + gint * fps_n, gint * fps_d) +{ + g_return_if_fail (GST_IS_PLAY_VIDEO_INFO (info)); + + *fps_n = info->framerate_num; + *fps_d = info->framerate_denom; +} + +/** + * gst_play_video_info_get_pixel_aspect_ratio: + * @info: a #GstPlayVideoInfo + * @par_n: (out): numerator + * @par_d: (out): denominator + * + * Returns the pixel aspect ratio in @par_n and @par_d + * + * Since: 1.20 + */ +void +gst_play_video_info_get_pixel_aspect_ratio (const GstPlayVideoInfo * info, + guint * par_n, guint * par_d) +{ + g_return_if_fail (GST_IS_PLAY_VIDEO_INFO (info)); + + *par_n = info->par_num; + *par_d = info->par_denom; +} + +/** + * gst_play_video_info_get_bitrate: + * @info: a #GstPlayVideoInfo + * + * Returns: the current bitrate of video in #GstPlayVideoInfo or -1 if unknown. + * Since: 1.20 + */ +gint +gst_play_video_info_get_bitrate (const GstPlayVideoInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_VIDEO_INFO (info), -1); + + return info->bitrate; +} + +/** + * gst_play_video_info_get_max_bitrate: + * @info: a #GstPlayVideoInfo + * + * Returns: the maximum bitrate of video in #GstPlayVideoInfo or -1 if unknown. 
+ * Since: 1.20 + */ +gint +gst_play_video_info_get_max_bitrate (const GstPlayVideoInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_VIDEO_INFO (info), -1); + + return info->max_bitrate; +} + +/* Audio information */ +G_DEFINE_TYPE (GstPlayAudioInfo, gst_play_audio_info, + GST_TYPE_PLAY_STREAM_INFO); + +static void +gst_play_audio_info_init (GstPlayAudioInfo * info) +{ + info->channels = 0; + info->sample_rate = 0; + info->bitrate = -1; + info->max_bitrate = -1; +} + +static void +gst_play_audio_info_finalize (GObject * object) +{ + GstPlayAudioInfo *info = GST_PLAY_AUDIO_INFO (object); + + g_free (info->language); + + G_OBJECT_CLASS (gst_play_audio_info_parent_class)->finalize (object); +} + +static void +gst_play_audio_info_class_init (GstPlayAudioInfoClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + + gobject_class->finalize = gst_play_audio_info_finalize; +} + +/** + * gst_play_audio_info_get_language: + * @info: a #GstPlayAudioInfo + * + * Returns: (nullable): the language of the stream, or %NULL if unknown. + * Since: 1.20 + */ +const gchar * +gst_play_audio_info_get_language (const GstPlayAudioInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_AUDIO_INFO (info), NULL); + + return info->language; +} + +/** + * gst_play_audio_info_get_channels: + * @info: a #GstPlayAudioInfo + * + * Returns: the number of audio channels in #GstPlayAudioInfo or 0 if unknown. + * Since: 1.20 + */ +gint +gst_play_audio_info_get_channels (const GstPlayAudioInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_AUDIO_INFO (info), 0); + + return info->channels; +} + +/** + * gst_play_audio_info_get_sample_rate: + * @info: a #GstPlayAudioInfo + * + * Returns: the audio sample rate in #GstPlayAudioInfo or 0 if unknown. 
+ * Since: 1.20 + */ +gint +gst_play_audio_info_get_sample_rate (const GstPlayAudioInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_AUDIO_INFO (info), 0); + + return info->sample_rate; +} + +/** + * gst_play_audio_info_get_bitrate: + * @info: a #GstPlayAudioInfo + * + * Returns: the audio bitrate in #GstPlayAudioInfo or -1 if unknown. + * Since: 1.20 + */ +gint +gst_play_audio_info_get_bitrate (const GstPlayAudioInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_AUDIO_INFO (info), -1); + + return info->bitrate; +} + +/** + * gst_play_audio_info_get_max_bitrate: + * @info: a #GstPlayAudioInfo + * + * Returns: the audio maximum bitrate in #GstPlayAudioInfo or -1 if unknown. + * Since: 1.20 + */ +gint +gst_play_audio_info_get_max_bitrate (const GstPlayAudioInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_AUDIO_INFO (info), -1); + + return info->max_bitrate; +} + +/* Subtitle information */ +G_DEFINE_TYPE (GstPlaySubtitleInfo, gst_play_subtitle_info, + GST_TYPE_PLAY_STREAM_INFO); + +static void +gst_play_subtitle_info_init (G_GNUC_UNUSED GstPlaySubtitleInfo * info) +{ + /* nothing to do */ +} + +static void +gst_play_subtitle_info_finalize (GObject * object) +{ + GstPlaySubtitleInfo *info = GST_PLAY_SUBTITLE_INFO (object); + + g_free (info->language); + + G_OBJECT_CLASS (gst_play_subtitle_info_parent_class)->finalize (object); +} + +static void +gst_play_subtitle_info_class_init (GstPlaySubtitleInfoClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + + gobject_class->finalize = gst_play_subtitle_info_finalize; +} + +/** + * gst_play_subtitle_info_get_language: + * @info: a #GstPlaySubtitleInfo + * + * Returns: (nullable): the language of the stream, or %NULL if unknown. 
+ * Since: 1.20 + */ +const gchar * +gst_play_subtitle_info_get_language (const GstPlaySubtitleInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_SUBTITLE_INFO (info), NULL); + + return info->language; +} + +/* Global media information */ +G_DEFINE_TYPE (GstPlayMediaInfo, gst_play_media_info, G_TYPE_OBJECT); + +static void +gst_play_media_info_init (GstPlayMediaInfo * info) +{ + info->duration = -1; + info->is_live = FALSE; + info->seekable = FALSE; +} + +static void +gst_play_media_info_finalize (GObject * object) +{ + GstPlayMediaInfo *info = GST_PLAY_MEDIA_INFO (object); + + g_free (info->uri); + + if (info->tags) + gst_tag_list_unref (info->tags); + + g_free (info->title); + + g_free (info->container); + + if (info->image_sample) + gst_sample_unref (info->image_sample); + + if (info->audio_stream_list) + g_list_free (info->audio_stream_list); + + if (info->video_stream_list) + g_list_free (info->video_stream_list); + + if (info->subtitle_stream_list) + g_list_free (info->subtitle_stream_list); + + if (info->stream_list) + g_list_free_full (info->stream_list, g_object_unref); + + G_OBJECT_CLASS (gst_play_media_info_parent_class)->finalize (object); +} + +static void +gst_play_media_info_class_init (GstPlayMediaInfoClass * klass) +{ + GObjectClass *oclass = (GObjectClass *) klass; + + oclass->finalize = gst_play_media_info_finalize; +} + +static GstPlayVideoInfo * +gst_play_video_info_new (void) +{ + return g_object_new (GST_TYPE_PLAY_VIDEO_INFO, NULL); +} + +static GstPlayAudioInfo * +gst_play_audio_info_new (void) +{ + return g_object_new (GST_TYPE_PLAY_AUDIO_INFO, NULL); +} + +static GstPlaySubtitleInfo * +gst_play_subtitle_info_new (void) +{ + return g_object_new (GST_TYPE_PLAY_SUBTITLE_INFO, NULL); +} + +static GstPlayStreamInfo * +gst_play_video_info_copy (GstPlayVideoInfo * ref) +{ + GstPlayVideoInfo *ret; + + ret = gst_play_video_info_new (); + + ret->width = ref->width; + ret->height = ref->height; + ret->framerate_num = ref->framerate_num; + 
ret->framerate_denom = ref->framerate_denom; + ret->par_num = ref->par_num; + ret->par_denom = ref->par_denom; + ret->bitrate = ref->bitrate; + ret->max_bitrate = ref->max_bitrate; + + return (GstPlayStreamInfo *) ret; +} + +static GstPlayStreamInfo * +gst_play_audio_info_copy (GstPlayAudioInfo * ref) +{ + GstPlayAudioInfo *ret; + + ret = gst_play_audio_info_new (); + + ret->sample_rate = ref->sample_rate; + ret->channels = ref->channels; + ret->bitrate = ref->bitrate; + ret->max_bitrate = ref->max_bitrate; + + if (ref->language) + ret->language = g_strdup (ref->language); + + return (GstPlayStreamInfo *) ret; +} + +static GstPlayStreamInfo * +gst_play_subtitle_info_copy (GstPlaySubtitleInfo * ref) +{ + GstPlaySubtitleInfo *ret; + + ret = gst_play_subtitle_info_new (); + if (ref->language) + ret->language = g_strdup (ref->language); + + return (GstPlayStreamInfo *) ret; +} + +GstPlayStreamInfo * +gst_play_stream_info_copy (GstPlayStreamInfo * ref) +{ + GstPlayStreamInfo *info = NULL; + + if (!ref) + return NULL; + + if (GST_IS_PLAY_VIDEO_INFO (ref)) + info = gst_play_video_info_copy ((GstPlayVideoInfo *) ref); + else if (GST_IS_PLAY_AUDIO_INFO (ref)) + info = gst_play_audio_info_copy ((GstPlayAudioInfo *) ref); + else + info = gst_play_subtitle_info_copy ((GstPlaySubtitleInfo *) ref); + + info->stream_index = ref->stream_index; + if (ref->tags) + info->tags = gst_tag_list_ref (ref->tags); + if (ref->caps) + info->caps = gst_caps_copy (ref->caps); + if (ref->codec) + info->codec = g_strdup (ref->codec); + if (ref->stream_id) + info->stream_id = g_strdup (ref->stream_id); + + return info; +} + +GstPlayMediaInfo * +gst_play_media_info_copy (GstPlayMediaInfo * ref) +{ + GList *l; + GstPlayMediaInfo *info; + + if (!ref) + return NULL; + + info = gst_play_media_info_new (ref->uri); + info->duration = ref->duration; + info->seekable = ref->seekable; + info->is_live = ref->is_live; + if (ref->tags) + info->tags = gst_tag_list_ref (ref->tags); + if (ref->title) + 
info->title = g_strdup (ref->title); + if (ref->container) + info->container = g_strdup (ref->container); + if (ref->image_sample) + info->image_sample = gst_sample_ref (ref->image_sample); + + for (l = ref->stream_list; l != NULL; l = l->next) { + GstPlayStreamInfo *s; + + s = gst_play_stream_info_copy ((GstPlayStreamInfo *) l->data); + info->stream_list = g_list_append (info->stream_list, s); + + if (GST_IS_PLAY_AUDIO_INFO (s)) + info->audio_stream_list = g_list_append (info->audio_stream_list, s); + else if (GST_IS_PLAY_VIDEO_INFO (s)) + info->video_stream_list = g_list_append (info->video_stream_list, s); + else + info->subtitle_stream_list = + g_list_append (info->subtitle_stream_list, s); + } + + return info; +} + +GstPlayStreamInfo * +gst_play_stream_info_new (gint stream_index, GType type) +{ + GstPlayStreamInfo *info = NULL; + + if (type == GST_TYPE_PLAY_AUDIO_INFO) + info = (GstPlayStreamInfo *) gst_play_audio_info_new (); + else if (type == GST_TYPE_PLAY_VIDEO_INFO) + info = (GstPlayStreamInfo *) gst_play_video_info_new (); + else + info = (GstPlayStreamInfo *) gst_play_subtitle_info_new (); + + info->stream_index = stream_index; + + return info; +} + +GstPlayMediaInfo * +gst_play_media_info_new (const gchar * uri) +{ + GstPlayMediaInfo *info; + + g_return_val_if_fail (uri != NULL, NULL); + + info = g_object_new (GST_TYPE_PLAY_MEDIA_INFO, NULL); + info->uri = g_strdup (uri); + + return info; +} + +/** + * gst_play_media_info_get_uri: + * @info: a #GstPlayMediaInfo + * + * Returns: the URI associated with #GstPlayMediaInfo. + * Since: 1.20 + */ +const gchar * +gst_play_media_info_get_uri (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->uri; +} + +/** + * gst_play_media_info_is_seekable: + * @info: a #GstPlayMediaInfo + * + * Returns: %TRUE if the media is seekable. 
+ * Since: 1.20 + */ +gboolean +gst_play_media_info_is_seekable (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), FALSE); + + return info->seekable; +} + +/** + * gst_play_media_info_is_live: + * @info: a #GstPlayMediaInfo + * + * Returns: %TRUE if the media is live. + * Since: 1.20 + */ +gboolean +gst_play_media_info_is_live (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), FALSE); + + return info->is_live; +} + +/** + * gst_play_media_info_get_stream_list: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlayStreamInfo): A #GList of + * matching #GstPlayStreamInfo. + * Since: 1.20 + */ +GList * +gst_play_media_info_get_stream_list (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->stream_list; +} + +/** + * gst_play_media_info_get_video_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlayVideoInfo): A #GList of + * matching #GstPlayVideoInfo. + * Since: 1.20 + */ +GList * +gst_play_media_info_get_video_streams (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->video_stream_list; +} + +/** + * gst_play_media_info_get_subtitle_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlaySubtitleInfo): A #GList of + * matching #GstPlaySubtitleInfo. + * Since: 1.20 + */ +GList * +gst_play_media_info_get_subtitle_streams (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->subtitle_stream_list; +} + +/** + * gst_play_media_info_get_audio_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlayAudioInfo): A #GList of + * matching #GstPlayAudioInfo. 
+ * Since: 1.20 + */ +GList * +gst_play_media_info_get_audio_streams (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->audio_stream_list; +} + +/** + * gst_play_media_info_get_duration: + * @info: a #GstPlayMediaInfo + * + * Returns: duration of the media. + * Since: 1.20 + */ +GstClockTime +gst_play_media_info_get_duration (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), -1); + + return info->duration; +} + +/** + * gst_play_media_info_get_tags: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (nullable): the tags contained in media info. + * Since: 1.20 + */ +GstTagList * +gst_play_media_info_get_tags (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->tags; +} + +/** + * gst_play_media_info_get_title: + * @info: a #GstPlayMediaInfo + * + * Returns: (nullable): the media title or %NULL if unknown. + * Since: 1.20 + */ +const gchar * +gst_play_media_info_get_title (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->title; +} + +/** + * gst_play_media_info_get_container_format: + * @info: a #GstPlayMediaInfo + * + * Returns: (nullable): the container format or %NULL if unknown. + * Since: 1.20 + */ +const gchar * +gst_play_media_info_get_container_format (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->container; +} + +/** + * gst_play_media_info_get_image_sample: + * @info: a #GstPlayMediaInfo + * + * Function to get the image (or preview-image) stored in taglist. + * Application can use `gst_sample_*_()` API's to get caps, buffer etc. + * + * Returns: (nullable) (transfer none): GstSample or %NULL. 
+ * Since: 1.20 + */ +GstSample * +gst_play_media_info_get_image_sample (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), NULL); + + return info->image_sample; +} + +/** + * gst_play_media_info_get_number_of_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: number of total streams. + * Since: 1.20 + */ +guint +gst_play_media_info_get_number_of_streams (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), 0); + + return g_list_length (info->stream_list); +} + +/** + * gst_play_media_info_get_number_of_video_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: number of video streams. + * Since: 1.20 + */ +guint +gst_play_media_info_get_number_of_video_streams (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), 0); + + return g_list_length (info->video_stream_list); +} + +/** + * gst_play_media_info_get_number_of_audio_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: number of audio streams. + * Since: 1.20 + */ +guint +gst_play_media_info_get_number_of_audio_streams (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), 0); + + return g_list_length (info->audio_stream_list); +} + +/** + * gst_play_media_info_get_number_of_subtitle_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: number of subtitle streams. + * Since: 1.20 + */ +guint gst_play_media_info_get_number_of_subtitle_streams + (const GstPlayMediaInfo * info) +{ + g_return_val_if_fail (GST_IS_PLAY_MEDIA_INFO (info), 0); + + return g_list_length (info->subtitle_stream_list); +} + +/** + * gst_play_get_video_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlayVideoInfo): A #GList of + * matching #GstPlayVideoInfo. 
+ * Since: 1.20 + */ +#ifndef GST_REMOVE_DEPRECATED +GList * +gst_play_get_video_streams (const GstPlayMediaInfo * info) +{ + return gst_play_media_info_get_video_streams (info); +} +#endif + +/** + * gst_play_get_audio_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlayAudioInfo): A #GList of + * matching #GstPlayAudioInfo. + * Since: 1.20 + */ +#ifndef GST_REMOVE_DEPRECATED +GList * +gst_play_get_audio_streams (const GstPlayMediaInfo * info) +{ + return gst_play_media_info_get_audio_streams (info); +} +#endif + +/** + * gst_play_get_subtitle_streams: + * @info: a #GstPlayMediaInfo + * + * Returns: (transfer none) (element-type GstPlaySubtitleInfo): A #GList of + * matching #GstPlaySubtitleInfo. + * Since: 1.20 + */ +#ifndef GST_REMOVE_DEPRECATED +GList * +gst_play_get_subtitle_streams (const GstPlayMediaInfo * info) +{ + return gst_play_media_info_get_subtitle_streams (info); +} +#endif
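The `gst_play_media_info_finalize ()` / `gst_play_media_info_copy ()` pair above uses a deliberate ownership split: `stream_list` owns the stream objects (freed with `g_list_free_full ()`), while the per-type `audio_stream_list`, `video_stream_list` and `subtitle_stream_list` only borrow the same objects (freed with plain `g_list_free ()`). A stdlib-only miniature of that owning-container-plus-views pattern, with illustrative names (no GLib required, not GStreamer code):

```c
#include <assert.h>
#include <stdlib.h>

/* One owning list of streams plus per-type "view" collections that
 * borrow the same objects, mirroring GstPlayMediaInfo's layout. */
typedef enum { STREAM_AUDIO, STREAM_VIDEO, STREAM_SUBTITLE } StreamType;

typedef struct Stream {
  StreamType type;
  struct Stream *next;
} Stream;

#define MAX_VIEW 16

typedef struct {
  Stream *all;                    /* owning list (cf. stream_list) */
  const Stream *video[MAX_VIEW];  /* borrowed (cf. video_stream_list) */
  size_t n_video;
  const Stream *audio[MAX_VIEW];  /* borrowed (cf. audio_stream_list) */
  size_t n_audio;
} MediaInfo;

static void
media_info_add (MediaInfo * info, StreamType type)
{
  Stream *s = malloc (sizeof *s);
  s->type = type;
  s->next = info->all;
  info->all = s;
  if (type == STREAM_VIDEO)
    info->video[info->n_video++] = s;   /* view borrows, never frees */
  else if (type == STREAM_AUDIO)
    info->audio[info->n_audio++] = s;
}

static MediaInfo *
media_info_copy (const MediaInfo * ref)
{
  /* Deep copy, as in gst_play_media_info_copy(): every stream is
   * duplicated and re-registered in the copy's own view lists. */
  MediaInfo *copy = calloc (1, sizeof *copy);
  const Stream *s;

  for (s = ref->all; s != NULL; s = s->next)
    media_info_add (copy, s->type);
  return copy;
}

static void
media_info_free (MediaInfo * info)
{
  /* Only the owning list releases payloads, mirroring
   * g_list_free_full() on stream_list vs. plain g_list_free()
   * on the typed views in gst_play_media_info_finalize(). */
  Stream *s = info->all;
  while (s != NULL) {
    Stream *next = s->next;
    free (s);
    s = next;
  }
  free (info);
}
```

Freeing a view list with the owning free would double-free; freeing the owning list with the non-owning free would leak, which is exactly why the upstream finalizer uses two different GLib calls.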
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-media-info.h
Added
@@ -0,0 +1,280 @@ +/* GStreamer + * + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_MEDIA_INFO_H__ +#define __GST_PLAY_MEDIA_INFO_H__ + +#include <gst/gst.h> +#include <gst/play/play-prelude.h> + +G_BEGIN_DECLS + +/** + * GST_TYPE_PLAY_STREAM_INFO: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_STREAM_INFO \ + (gst_play_stream_info_get_type ()) +#define GST_PLAY_STREAM_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_PLAY_STREAM_INFO,GstPlayStreamInfo)) +#define GST_PLAY_STREAM_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_PLAY_STREAM_INFO,GstPlayStreamInfo)) +#define GST_IS_PLAY_STREAM_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_PLAY_STREAM_INFO)) +#define GST_IS_PLAY_STREAM_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_PLAY_STREAM_INFO)) + +/** + * GstPlayStreamInfo: + * + * Base structure for information concerning a media stream. Depending on + * the stream type, one can find more media-specific information in + * #GstPlayVideoInfo, #GstPlayAudioInfo, #GstPlaySubtitleInfo. 
+ * Since: 1.20 + */ +typedef struct _GstPlayStreamInfo GstPlayStreamInfo; +typedef struct _GstPlayStreamInfoClass GstPlayStreamInfoClass; + +GST_PLAY_API +GType gst_play_stream_info_get_type (void); + +GST_PLAY_API +gint gst_play_stream_info_get_index (const GstPlayStreamInfo *info); + +GST_PLAY_API +const gchar* gst_play_stream_info_get_stream_type (const GstPlayStreamInfo *info); + +GST_PLAY_API +GstTagList* gst_play_stream_info_get_tags (const GstPlayStreamInfo *info); + +GST_PLAY_API +GstCaps* gst_play_stream_info_get_caps (const GstPlayStreamInfo *info); + +GST_PLAY_API +const gchar* gst_play_stream_info_get_codec (const GstPlayStreamInfo *info); + +/** + * GST_TYPE_PLAY_VIDEO_INFO: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_VIDEO_INFO \ + (gst_play_video_info_get_type ()) +#define GST_PLAY_VIDEO_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_PLAY_VIDEO_INFO, GstPlayVideoInfo)) +#define GST_PLAY_VIDEO_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((obj),GST_TYPE_PLAY_VIDEO_INFO, GstPlayVideoInfoClass)) +#define GST_IS_PLAY_VIDEO_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_PLAY_VIDEO_INFO)) +#define GST_IS_PLAY_VIDEO_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((obj),GST_TYPE_PLAY_VIDEO_INFO)) + +/** + * GstPlayVideoInfo: + * + * #GstPlayStreamInfo specific to video streams. 
+ * Since: 1.20 + */ +typedef struct _GstPlayVideoInfo GstPlayVideoInfo; +typedef struct _GstPlayVideoInfoClass GstPlayVideoInfoClass; + +GST_PLAY_API +GType gst_play_video_info_get_type (void); + +GST_PLAY_API +gint gst_play_video_info_get_bitrate (const GstPlayVideoInfo * info); + +GST_PLAY_API +gint gst_play_video_info_get_max_bitrate (const GstPlayVideoInfo * info); + +GST_PLAY_API +gint gst_play_video_info_get_width (const GstPlayVideoInfo * info); + +GST_PLAY_API +gint gst_play_video_info_get_height (const GstPlayVideoInfo * info); + +GST_PLAY_API +void gst_play_video_info_get_framerate (const GstPlayVideoInfo * info, + gint * fps_n, + gint * fps_d); + +GST_PLAY_API +void gst_play_video_info_get_pixel_aspect_ratio (const GstPlayVideoInfo * info, + guint * par_n, + guint * par_d); + +/** + * GST_TYPE_PLAY_AUDIO_INFO: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_AUDIO_INFO \ + (gst_play_audio_info_get_type ()) +#define GST_PLAY_AUDIO_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_PLAY_AUDIO_INFO, GstPlayAudioInfo)) +#define GST_PLAY_AUDIO_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_PLAY_AUDIO_INFO, GstPlayAudioInfoClass)) +#define GST_IS_PLAY_AUDIO_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_PLAY_AUDIO_INFO)) +#define GST_IS_PLAY_AUDIO_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_PLAY_AUDIO_INFO)) + +/** + * GstPlayAudioInfo: + * + * #GstPlayStreamInfo specific to audio streams. 
+ * Since: 1.20 + */ +typedef struct _GstPlayAudioInfo GstPlayAudioInfo; +typedef struct _GstPlayAudioInfoClass GstPlayAudioInfoClass; + +GST_PLAY_API +GType gst_play_audio_info_get_type (void); + +GST_PLAY_API +gint gst_play_audio_info_get_channels (const GstPlayAudioInfo* info); + +GST_PLAY_API +gint gst_play_audio_info_get_sample_rate (const GstPlayAudioInfo* info); + +GST_PLAY_API +gint gst_play_audio_info_get_bitrate (const GstPlayAudioInfo* info); + +GST_PLAY_API +gint gst_play_audio_info_get_max_bitrate (const GstPlayAudioInfo* info); + +GST_PLAY_API +const gchar* gst_play_audio_info_get_language (const GstPlayAudioInfo* info); + +/** + * GST_TYPE_PLAY_SUBTITLE_INFO: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_SUBTITLE_INFO \ + (gst_play_subtitle_info_get_type ()) +#define GST_PLAY_SUBTITLE_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_PLAY_SUBTITLE_INFO, GstPlaySubtitleInfo)) +#define GST_PLAY_SUBTITLE_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_PLAY_SUBTITLE_INFO,GstPlaySubtitleInfoClass)) +#define GST_IS_PLAY_SUBTITLE_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_PLAY_SUBTITLE_INFO)) +#define GST_IS_PLAY_SUBTITLE_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_PLAY_SUBTITLE_INFO)) + +/** + * GstPlaySubtitleInfo: + * + * #GstPlayStreamInfo specific to subtitle streams. 
+ * Since: 1.20 + */ +typedef struct _GstPlaySubtitleInfo GstPlaySubtitleInfo; +typedef struct _GstPlaySubtitleInfoClass GstPlaySubtitleInfoClass; + +GST_PLAY_API +GType gst_play_subtitle_info_get_type (void); + +GST_PLAY_API +const gchar * gst_play_subtitle_info_get_language (const GstPlaySubtitleInfo* info); + +/** + * GST_TYPE_PLAY_MEDIA_INFO: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_MEDIA_INFO \ + (gst_play_media_info_get_type()) +#define GST_PLAY_MEDIA_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_PLAY_MEDIA_INFO,GstPlayMediaInfo)) +#define GST_PLAY_MEDIA_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_PLAY_MEDIA_INFO,GstPlayMediaInfoClass)) +#define GST_IS_PLAY_MEDIA_INFO(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_PLAY_MEDIA_INFO)) +#define GST_IS_PLAY_MEDIA_INFO_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_PLAY_MEDIA_INFO)) + +/** + * GstPlayMediaInfo: + * + * Structure containing the media information of a URI. + * Since: 1.20 + */ +typedef struct _GstPlayMediaInfo GstPlayMediaInfo; +typedef struct _GstPlayMediaInfoClass GstPlayMediaInfoClass; + +GST_PLAY_API +GType gst_play_media_info_get_type (void); + +GST_PLAY_API +const gchar * gst_play_media_info_get_uri (const GstPlayMediaInfo *info); + +GST_PLAY_API +gboolean gst_play_media_info_is_seekable (const GstPlayMediaInfo *info); + +GST_PLAY_API +gboolean gst_play_media_info_is_live (const GstPlayMediaInfo *info); + +GST_PLAY_API +GstClockTime gst_play_media_info_get_duration (const GstPlayMediaInfo *info); + +GST_PLAY_API +GList* gst_play_media_info_get_stream_list (const GstPlayMediaInfo *info); + +GST_PLAY_API +guint gst_play_media_info_get_number_of_streams (const GstPlayMediaInfo *info); + +GST_PLAY_API +GList* gst_play_media_info_get_video_streams (const GstPlayMediaInfo *info); + +GST_PLAY_API +guint gst_play_media_info_get_number_of_video_streams (const GstPlayMediaInfo *info); + +GST_PLAY_API +GList* gst_play_media_info_get_audio_streams (const 
GstPlayMediaInfo *info); + +GST_PLAY_API +guint gst_play_media_info_get_number_of_audio_streams (const GstPlayMediaInfo *info); + +GST_PLAY_API +GList* gst_play_media_info_get_subtitle_streams (const GstPlayMediaInfo *info); + +GST_PLAY_API +guint gst_play_media_info_get_number_of_subtitle_streams (const GstPlayMediaInfo *info); + +GST_PLAY_API +GstTagList* gst_play_media_info_get_tags (const GstPlayMediaInfo *info); + +GST_PLAY_API +const gchar* gst_play_media_info_get_title (const GstPlayMediaInfo *info); + +GST_PLAY_API +const gchar* gst_play_media_info_get_container_format (const GstPlayMediaInfo *info); + +GST_PLAY_API +GstSample* gst_play_media_info_get_image_sample (const GstPlayMediaInfo *info); + +GST_PLAY_DEPRECATED_FOR(gst_play_media_info_get_video_streams) +GList* gst_play_get_video_streams (const GstPlayMediaInfo *info); + +GST_PLAY_DEPRECATED_FOR(gst_play_media_info_get_audio_streams) +GList* gst_play_get_audio_streams (const GstPlayMediaInfo *info); + +GST_PLAY_DEPRECATED_FOR(gst_play_media_info_get_subtitle_streams) +GList* gst_play_get_subtitle_streams (const GstPlayMediaInfo *info); + +G_END_DECLS + +#endif /* __GST_PLAY_MEDIA_INFO_H */
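The header above only declares getters; as a hypothetical usage sketch (not part of this package or diff), the media-info accessors might be combined like this, e.g. inside a "media-info-updated" handler. It assumes GStreamer >= 1.20 with the gstreamer-play-1.0 library and its umbrella header available; `print_media_info` is an illustrative name.

```c
/* Hypothetical sketch: dump a GstPlayMediaInfo. Assumes gstreamer-play-1.0
 * (GStreamer >= 1.20) is installed; not shipped in this package. */
#include <gst/play/play.h>

static void
print_media_info (const GstPlayMediaInfo * info)
{
  GList *l;

  g_print ("URI: %s (container: %s, seekable: %d)\n",
      gst_play_media_info_get_uri (info),
      GST_STR_NULL (gst_play_media_info_get_container_format (info)),
      gst_play_media_info_is_seekable (info));

  /* The returned list stays owned by the media info; each element is a
   * GstPlayStreamInfo subclass, here GstPlayAudioInfo. */
  for (l = gst_play_media_info_get_audio_streams (info); l != NULL; l = l->next) {
    GstPlayAudioInfo *audio = l->data;

    g_print ("  audio: %d ch, %d Hz, language %s\n",
        gst_play_audio_info_get_channels (audio),
        gst_play_audio_info_get_sample_rate (audio),
        GST_STR_NULL (gst_play_audio_info_get_language (audio)));
  }
}
```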
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-message-private.h
Added
@@ -0,0 +1,42 @@ +/* GStreamer + * + * Copyright (C) 2020 Stephan Hesse <stephan@emliri.com> + * Copyright (C) 2020 Philippe Normand <philn@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_MESSAGE_PRIVATE_H__ +#define __GST_PLAY_MESSAGE_PRIVATE_H__ + +#define GST_PLAY_MESSAGE_DATA "gst-play-message-data" +#define GST_PLAY_MESSAGE_DATA_TYPE "play-message-type" +#define GST_PLAY_MESSAGE_DATA_URI "uri" +#define GST_PLAY_MESSAGE_DATA_POSITION "position" +#define GST_PLAY_MESSAGE_DATA_DURATION "duration" +#define GST_PLAY_MESSAGE_DATA_PLAY_STATE "play-state" +#define GST_PLAY_MESSAGE_DATA_BUFFERING_PERCENT "bufferring-percent" +#define GST_PLAY_MESSAGE_DATA_ERROR "error" +#define GST_PLAY_MESSAGE_DATA_ERROR_DETAILS "error-details" +#define GST_PLAY_MESSAGE_DATA_WARNING "warning" +#define GST_PLAY_MESSAGE_DATA_WARNING_DETAILS "warning-details" +#define GST_PLAY_MESSAGE_DATA_VIDEO_WIDTH "video-width" +#define GST_PLAY_MESSAGE_DATA_VIDEO_HEIGHT "video-height" +#define GST_PLAY_MESSAGE_DATA_MEDIA_INFO "media-info" +#define GST_PLAY_MESSAGE_DATA_VOLUME "volume" +#define GST_PLAY_MESSAGE_DATA_IS_MUTED "is-muted" + +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-signal-adapter.c
Added
@@ -0,0 +1,459 @@ +/* GStreamer + * + * Copyright (C) 2019-2020 Stephan Hesse <stephan@emliri.com> + * Copyright (C) 2020 Philippe Normand <philn@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstplay.h" +#include "gstplay-signal-adapter.h" +#include "gstplay-message-private.h" + +GST_DEBUG_CATEGORY_STATIC (gst_play_signal_adapter_debug); +#define GST_CAT_DEFAULT gst_play_signal_adapter_debug + +enum +{ + SIGNAL_URI_LOADED, + SIGNAL_POSITION_UPDATED, + SIGNAL_DURATION_CHANGED, + SIGNAL_STATE_CHANGED, + SIGNAL_BUFFERING, + SIGNAL_END_OF_STREAM, + SIGNAL_ERROR, + SIGNAL_WARNING, + SIGNAL_VIDEO_DIMENSIONS_CHANGED, + SIGNAL_MEDIA_INFO_UPDATED, + SIGNAL_VOLUME_CHANGED, + SIGNAL_MUTE_CHANGED, + SIGNAL_SEEK_DONE, + SIGNAL_LAST +}; + +enum +{ + PROP_0, + PROP_PLAY, + PROP_LAST +}; + +static GParamSpec *param_specs[PROP_LAST] = { NULL, }; + +struct _GstPlaySignalAdapter +{ + GObject parent; + GstBus *bus; + GstPlay *play; + GSource *source; +}; + +struct _GstPlaySignalAdapterClass +{ + GObjectClass parent_class; +}; + +#define _do_init \ + GST_DEBUG_CATEGORY_INIT (gst_play_signal_adapter_debug, "gst-play-signal-adapter", \ + 0, "GstPlay signal adapter") + +#define parent_class 
gst_play_signal_adapter_parent_class +G_DEFINE_TYPE_WITH_CODE (GstPlaySignalAdapter, gst_play_signal_adapter, + G_TYPE_OBJECT, _do_init); + +static guint signals[SIGNAL_LAST] = { 0, }; + +static void +gst_play_signal_adapter_emit (GstPlaySignalAdapter * self, + const GstStructure * message_data) +{ + GstPlayMessage play_message_type; + g_return_if_fail (g_str_equal (gst_structure_get_name (message_data), + GST_PLAY_MESSAGE_DATA)); + + GST_LOG ("Emitting message %" GST_PTR_FORMAT, message_data); + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_TYPE, + GST_TYPE_PLAY_MESSAGE, &play_message_type, NULL); + + switch (play_message_type) { + case GST_PLAY_MESSAGE_URI_LOADED:{ + const gchar *uri = + gst_structure_get_string (message_data, GST_PLAY_MESSAGE_DATA_URI); + g_signal_emit (self, signals[SIGNAL_URI_LOADED], 0, uri); + break; + } + case GST_PLAY_MESSAGE_POSITION_UPDATED:{ + GstClockTime pos = GST_CLOCK_TIME_NONE; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_POSITION, + GST_TYPE_CLOCK_TIME, &pos, NULL); + g_signal_emit (self, signals[SIGNAL_POSITION_UPDATED], 0, pos); + break; + } + case GST_PLAY_MESSAGE_DURATION_CHANGED:{ + GstClockTime duration = GST_CLOCK_TIME_NONE; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_DURATION, + GST_TYPE_CLOCK_TIME, &duration, NULL); + g_signal_emit (self, signals[SIGNAL_DURATION_CHANGED], 0, duration); + break; + } + case GST_PLAY_MESSAGE_STATE_CHANGED:{ + GstPlayState state = 0; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_PLAY_STATE, + GST_TYPE_PLAY_STATE, &state, NULL); + g_signal_emit (self, signals[SIGNAL_STATE_CHANGED], 0, state); + break; + } + case GST_PLAY_MESSAGE_BUFFERING:{ + guint percent = 0; + gst_structure_get (message_data, + GST_PLAY_MESSAGE_DATA_BUFFERING_PERCENT, G_TYPE_UINT, &percent, NULL); + g_signal_emit (self, signals[SIGNAL_BUFFERING], 0, percent); + break; + } + case GST_PLAY_MESSAGE_END_OF_STREAM: + g_signal_emit (self, signals[SIGNAL_END_OF_STREAM], 0); + break; + 
case GST_PLAY_MESSAGE_ERROR:{ + GError *error = NULL; + GstStructure *details = NULL; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_ERROR, + G_TYPE_ERROR, &error, GST_PLAY_MESSAGE_DATA_ERROR_DETAILS, + GST_TYPE_STRUCTURE, &details, NULL); + g_signal_emit (self, signals[SIGNAL_ERROR], 0, error, details); + g_error_free (error); + if (details) + gst_structure_free (details); + break; + } + case GST_PLAY_MESSAGE_WARNING:{ + GError *error = NULL; + GstStructure *details = NULL; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_WARNING, + G_TYPE_ERROR, &error, GST_PLAY_MESSAGE_DATA_WARNING_DETAILS, + GST_TYPE_STRUCTURE, &details, NULL); + g_signal_emit (self, signals[SIGNAL_WARNING], 0, error, details); + g_error_free (error); + if (details) + gst_structure_free (details); + break; + } + case GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED:{ + guint width = 0; + guint height = 0; + gst_structure_get (message_data, + GST_PLAY_MESSAGE_DATA_VIDEO_WIDTH, G_TYPE_UINT, &width, + GST_PLAY_MESSAGE_DATA_VIDEO_HEIGHT, G_TYPE_UINT, &height, NULL); + g_signal_emit (self, signals[SIGNAL_VIDEO_DIMENSIONS_CHANGED], 0, + width, height); + break; + } + case GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED:{ + GstPlayMediaInfo *media_info; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_MEDIA_INFO, + GST_TYPE_PLAY_MEDIA_INFO, &media_info, NULL); + g_signal_emit (self, signals[SIGNAL_MEDIA_INFO_UPDATED], 0, media_info); + g_object_unref (media_info); + break; + } + case GST_PLAY_MESSAGE_VOLUME_CHANGED:{ + gdouble volume; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_VOLUME, + G_TYPE_DOUBLE, &volume, NULL); + g_signal_emit (self, signals[SIGNAL_VOLUME_CHANGED], 0, volume); + break; + } + case GST_PLAY_MESSAGE_MUTE_CHANGED:{ + gboolean is_muted; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_IS_MUTED, + G_TYPE_BOOLEAN, &is_muted, NULL); + g_signal_emit (self, signals[SIGNAL_MUTE_CHANGED], 0, is_muted); + break; + } + case GST_PLAY_MESSAGE_SEEK_DONE:{ + 
GstClockTime pos; + gst_structure_get (message_data, GST_PLAY_MESSAGE_DATA_POSITION, + GST_TYPE_CLOCK_TIME, &pos, NULL); + g_signal_emit (self, signals[SIGNAL_SEEK_DONE], 0, pos); + break; + } + default: + g_assert_not_reached (); + break; + } +} + +/* + * callback for the bus-message in-sync handling + */ +static GstBusSyncReply + gst_play_signal_adapter_bus_sync_handler + (GstBus * bus, GstMessage * message, gpointer user_data) +{ + GstPlaySignalAdapter *self = GST_PLAY_SIGNAL_ADAPTER (user_data); + const GstStructure *message_data = gst_message_get_structure (message); + gst_play_signal_adapter_emit (self, message_data); + gst_message_unref (message); + return GST_BUS_DROP; +} + +/* + * callback for the bus-watch + * pre: there is a message on the bus + */ +static gboolean +gst_play_signal_adapter_on_message (GstBus * bus, + GstMessage * message, gpointer user_data) +{ + GstPlaySignalAdapter *self = GST_PLAY_SIGNAL_ADAPTER (user_data); + const GstStructure *message_data = gst_message_get_structure (message); + gst_play_signal_adapter_emit (self, message_data); + return TRUE; +} + +/** + * gst_play_signal_adapter_new: + * @play: (transfer none): #GstPlay instance to emit signals for. + * + * A bus-watching #GSource will be created and attached to the + * thread-default #GMainContext. The attached callback will emit the + * corresponding signal for the message received. Matching signals for play + * messages from the bus will be emitted by it on the created adapter object. + * + * Returns: (transfer full): A new #GstPlaySignalAdapter to connect signal handlers to.
+ * + * Since: 1.20 + */ +GstPlaySignalAdapter * +gst_play_signal_adapter_new (GstPlay * play) +{ + GstPlaySignalAdapter *self = NULL; + GMainContext *context = NULL; + + g_return_val_if_fail (GST_IS_PLAY (play), NULL); + + self = g_object_new (GST_TYPE_PLAY_SIGNAL_ADAPTER, NULL); + self->play = play; + self->bus = gst_play_get_message_bus (play); + self->source = gst_bus_create_watch (self->bus); + + context = g_main_context_get_thread_default (); + g_source_attach (self->source, context); + g_source_set_callback (self->source, + (GSourceFunc) gst_play_signal_adapter_on_message, self, NULL); + return self; +} + +/** + * gst_play_signal_adapter_new_with_main_context: + * @play: (transfer none): #GstPlay instance to emit signals for. + * @context: A #GMainContext on which the main loop will process play bus messages. + * + * A bus-watching #GSource will be created and attached to the @context. The + * attached callback will emit the corresponding signal for the message + * received. Matching signals for play messages from the bus will be emitted by + * it on the created adapter object. + * + * Returns: (transfer full): A new #GstPlaySignalAdapter to connect signal handlers to. + * + * Since: 1.20 + */ +GstPlaySignalAdapter * +gst_play_signal_adapter_new_with_main_context (GstPlay * play, + GMainContext * context) +{ + GstPlaySignalAdapter *self = NULL; + + g_return_val_if_fail (GST_IS_PLAY (play), NULL); + g_return_val_if_fail (context != NULL, NULL); + + self = g_object_new (GST_TYPE_PLAY_SIGNAL_ADAPTER, NULL); + self->play = play; + self->bus = gst_play_get_message_bus (play); + self->source = gst_bus_create_watch (self->bus); + + g_source_attach (self->source, context); + g_source_set_callback (self->source, + (GSourceFunc) gst_play_signal_adapter_on_message, self, NULL); + return self; +} + +/** + * gst_play_signal_adapter_new_sync_emit: + * @play: (transfer none): #GstPlay instance to emit signals for.
+ * + * Create an adapter that synchronously emits its signals, from the thread in + * which the messages have been posted. + * + * Returns: (transfer full): A new #GstPlaySignalAdapter to connect signal handlers to. + * + * Since: 1.20 + */ +GstPlaySignalAdapter * +gst_play_signal_adapter_new_sync_emit (GstPlay * play) +{ + GstBus *bus = NULL; + GstPlaySignalAdapter *self = NULL; + + g_return_val_if_fail (GST_IS_PLAY (play), NULL); + + bus = gst_play_get_message_bus (play); + + self = g_object_new (GST_TYPE_PLAY_SIGNAL_ADAPTER, NULL); + self->play = play; + self->bus = bus; + gst_bus_set_sync_handler (self->bus, + gst_play_signal_adapter_bus_sync_handler, self, NULL); + return self; +} + + +/** + * gst_play_signal_adapter_get_play: + * @adapter: #GstPlaySignalAdapter instance + * + * Returns: (transfer none): The #GstPlay owning this signal adapter. + * + * Since: 1.20 + */ +GstPlay * +gst_play_signal_adapter_get_play (GstPlaySignalAdapter * adapter) +{ + g_return_val_if_fail (GST_IS_PLAY_SIGNAL_ADAPTER (adapter), NULL); + return adapter->play; +} + +static void +gst_play_signal_adapter_init (GstPlaySignalAdapter * self) +{ + self->source = NULL; +} + +static void +gst_play_signal_adapter_dispose (GObject * object) +{ + GstPlaySignalAdapter *self = GST_PLAY_SIGNAL_ADAPTER (object); + + if (self->source) { + g_source_destroy (self->source); + g_source_unref (self->source); + self->source = NULL; + } + + gst_clear_object (&self->bus); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_play_signal_adapter_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstPlaySignalAdapter *self = GST_PLAY_SIGNAL_ADAPTER (object); + + switch (prop_id) { + case PROP_PLAY: + g_value_set_object (value, self->play); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_play_signal_adapter_class_init (GstPlaySignalAdapterClass * klass) +{ + GObjectClass 
*gobject_class = (GObjectClass *) klass; + + gobject_class->dispose = gst_play_signal_adapter_dispose; + gobject_class->get_property = gst_play_signal_adapter_get_property; + + param_specs[PROP_PLAY] = + g_param_spec_object ("play", "Play", + "GstPlay owning this adapter", + GST_TYPE_PLAY, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + signals[SIGNAL_URI_LOADED] = + g_signal_new ("uri-loaded", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, G_TYPE_STRING); + + signals[SIGNAL_POSITION_UPDATED] = + g_signal_new ("position-updated", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); + + signals[SIGNAL_DURATION_CHANGED] = + g_signal_new ("duration-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); + + signals[SIGNAL_STATE_CHANGED] = + g_signal_new ("state-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_PLAY_STATE); + + signals[SIGNAL_BUFFERING] = + g_signal_new ("buffering", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, G_TYPE_INT); + + signals[SIGNAL_END_OF_STREAM] = + g_signal_new ("end-of-stream", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 0, G_TYPE_INVALID); + + signals[SIGNAL_ERROR] = + g_signal_new ("error", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 2, G_TYPE_ERROR, GST_TYPE_STRUCTURE); + + signals[SIGNAL_VIDEO_DIMENSIONS_CHANGED] = + g_signal_new ("video-dimensions-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | 
G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 2, G_TYPE_UINT, G_TYPE_UINT); + + signals[SIGNAL_MEDIA_INFO_UPDATED] = + g_signal_new ("media-info-updated", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_PLAY_MEDIA_INFO); + + signals[SIGNAL_VOLUME_CHANGED] = + g_signal_new ("volume-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, G_TYPE_DOUBLE); + + signals[SIGNAL_MUTE_CHANGED] = + g_signal_new ("mute-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, G_TYPE_BOOLEAN); + + signals[SIGNAL_WARNING] = + g_signal_new ("warning", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 2, G_TYPE_ERROR, GST_TYPE_STRUCTURE); + + signals[SIGNAL_SEEK_DONE] = + g_signal_new ("seek-done", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); + + g_object_class_install_properties (gobject_class, PROP_LAST, param_specs); +}
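As a hypothetical usage sketch (not part of this package or diff), the signal adapter added above might be driven like this: instead of polling the GstPlay message bus, an adapter re-emits each bus message as the matching GObject signal. It assumes GStreamer >= 1.20 with gstreamer-play-1.0 installed; the URI is a placeholder.

```c
/* Hypothetical sketch: quit a main loop on end-of-stream via the adapter.
 * Assumes gstreamer-play-1.0 (GStreamer >= 1.20); not shipped here. */
#include <gst/play/play.h>

int
main (int argc, char **argv)
{
  GstPlay *play;
  GstPlaySignalAdapter *adapter;
  GMainLoop *loop;

  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  play = gst_play_new (NULL);   /* NULL: default video renderer */
  gst_play_set_uri (play, "file:///tmp/example.ogg");  /* placeholder URI */

  /* Attaches a bus watch on the thread-default GMainContext and re-emits
   * play bus messages as signals such as "end-of-stream". */
  adapter = gst_play_signal_adapter_new (play);
  g_signal_connect_swapped (adapter, "end-of-stream",
      G_CALLBACK (g_main_loop_quit), loop);

  gst_play_play (play);
  g_main_loop_run (loop);

  g_object_unref (adapter);
  g_object_unref (play);
  g_main_loop_unref (loop);
  return 0;
}
```

`gst_play_signal_adapter_new_sync_emit()` would instead deliver the signals synchronously from the posting thread, so handlers must then be thread-safe.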
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-signal-adapter.h
Added
@@ -0,0 +1,59 @@ +/* GStreamer + * + * Copyright (C) 2019-2020 Stephan Hesse <stephan@emliri.com> + * Copyright (C) 2020 Philippe Normand <philn@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_SIGNAL_ADAPTER_H__ +#define __GST_PLAY_SIGNAL_ADAPTER_H__ + +#include <gst/play/gstplay-types.h> + +G_BEGIN_DECLS + +#define GST_TYPE_PLAY_SIGNAL_ADAPTER (gst_play_signal_adapter_get_type ()) +#define GST_IS_PLAY_SIGNAL_ADAPTER(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_PLAY_SIGNAL_ADAPTER)) +#define GST_IS_PLAY_SIGNAL_ADAPTER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_PLAY_SIGNAL_ADAPTER)) +#define GST_PLAY_SIGNAL_ADAPTER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_PLAY_SIGNAL_ADAPTER, GstPlaySignalAdapterClass)) +#define GST_PLAY_SIGNAL_ADAPTER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_PLAY_SIGNAL_ADAPTER, GstPlaySignalAdapter)) +#define GST_PLAY_SIGNAL_ADAPTER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_PLAY_SIGNAL_ADAPTER, GstPlaySignalAdapterClass)) + +/** + * GST_PLAY_SIGNAL_ADAPTER_CAST: + * Since: 1.20 + */ +#define GST_PLAY_SIGNAL_ADAPTER_CAST(obj) ((GstPlaySignalAdapter*)(obj)) + +GST_PLAY_API +GType gst_play_signal_adapter_get_type (void); + +GST_PLAY_API 
+GstPlaySignalAdapter * gst_play_signal_adapter_new (GstPlay * play); + +GST_PLAY_API +GstPlaySignalAdapter * gst_play_signal_adapter_new_with_main_context (GstPlay * play, GMainContext * context); + +GST_PLAY_API +GstPlaySignalAdapter * gst_play_signal_adapter_new_sync_emit (GstPlay * play); + +GST_PLAY_API +GstPlay * gst_play_signal_adapter_get_play (GstPlaySignalAdapter * adapter); + +G_END_DECLS + +#endif /* __GST_PLAY_SIGNAL_ADAPTER_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-types.h
Added
@@ -0,0 +1,47 @@ +/* GStreamer + * + * Copyright (C) 2015 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_TYPES_H__ +#define __GST_PLAY_TYPES_H__ + +#include <gst/gst.h> +#include <gst/play/play-prelude.h> + +G_BEGIN_DECLS + +/** + * GstPlay: + * Since: 1.20 + */ +typedef struct _GstPlay GstPlay; +typedef struct _GstPlayClass GstPlayClass; + +/** + * GstPlaySignalAdapter: + * Since: 1.20 + */ +typedef struct _GstPlaySignalAdapter GstPlaySignalAdapter; +typedef struct _GstPlaySignalAdapterClass GstPlaySignalAdapterClass; + +G_END_DECLS + +#endif /* __GST_PLAY_TYPES_H__ */ + +
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-video-overlay-video-renderer.c
Added
@@ -0,0 +1,351 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstplay-videooverlayvideorenderer + * @title: GstPlayVideoOverlayVideoRenderer + * @short_description: Play Video Overlay Video Renderer + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstplay-video-overlay-video-renderer.h" +#include "gstplay.h" + +#include <gst/video/video.h> + +struct _GstPlayVideoOverlayVideoRenderer +{ + GObject parent; + + GstVideoOverlay *video_overlay; + gpointer window_handle; + gint x, y, width, height; + + GstElement *video_sink; /* configured video sink, or NULL */ +}; + +struct _GstPlayVideoOverlayVideoRendererClass +{ + GObjectClass parent_class; +}; + +static void + gst_play_video_overlay_video_renderer_interface_init + (GstPlayVideoRendererInterface * iface); + +enum +{ + VIDEO_OVERLAY_VIDEO_RENDERER_PROP_0, + VIDEO_OVERLAY_VIDEO_RENDERER_PROP_WINDOW_HANDLE, + VIDEO_OVERLAY_VIDEO_RENDERER_PROP_VIDEO_SINK, + VIDEO_OVERLAY_VIDEO_RENDERER_PROP_LAST +}; + +G_DEFINE_TYPE_WITH_CODE (GstPlayVideoOverlayVideoRenderer, + gst_play_video_overlay_video_renderer, G_TYPE_OBJECT, + G_IMPLEMENT_INTERFACE (GST_TYPE_PLAY_VIDEO_RENDERER, + 
gst_play_video_overlay_video_renderer_interface_init)); + +static GParamSpec + * video_overlay_video_renderer_param_specs + [VIDEO_OVERLAY_VIDEO_RENDERER_PROP_LAST] = { NULL, }; + +static void +gst_play_video_overlay_video_renderer_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec) +{ + GstPlayVideoOverlayVideoRenderer *self = + GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (object); + + switch (prop_id) { + case VIDEO_OVERLAY_VIDEO_RENDERER_PROP_WINDOW_HANDLE: + self->window_handle = g_value_get_pointer (value); + if (self->video_overlay) + gst_video_overlay_set_window_handle (self->video_overlay, + (guintptr) self->window_handle); + break; + case VIDEO_OVERLAY_VIDEO_RENDERER_PROP_VIDEO_SINK: + self->video_sink = gst_object_ref_sink (g_value_get_object (value)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_play_video_overlay_video_renderer_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec) +{ + GstPlayVideoOverlayVideoRenderer *self = + GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (object); + + switch (prop_id) { + case VIDEO_OVERLAY_VIDEO_RENDERER_PROP_WINDOW_HANDLE: + g_value_set_pointer (value, self->window_handle); + break; + case VIDEO_OVERLAY_VIDEO_RENDERER_PROP_VIDEO_SINK: + g_value_set_object (value, self->video_sink); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_play_video_overlay_video_renderer_finalize (GObject * object) +{ + GstPlayVideoOverlayVideoRenderer *self = + GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (object); + + if (self->video_overlay) + gst_object_unref (self->video_overlay); + + if (self->video_sink) + gst_object_unref (self->video_sink); + + G_OBJECT_CLASS + (gst_play_video_overlay_video_renderer_parent_class)->finalize (object); +} + +static void + gst_play_video_overlay_video_renderer_class_init + (GstPlayVideoOverlayVideoRendererClass * 
klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->set_property = + gst_play_video_overlay_video_renderer_set_property; + gobject_class->get_property = + gst_play_video_overlay_video_renderer_get_property; + gobject_class->finalize = gst_play_video_overlay_video_renderer_finalize; + + video_overlay_video_renderer_param_specs + [VIDEO_OVERLAY_VIDEO_RENDERER_PROP_WINDOW_HANDLE] = + g_param_spec_pointer ("window-handle", "Window Handle", + "Window handle to embed the video into", + G_PARAM_READWRITE | G_PARAM_CONSTRUCT | G_PARAM_STATIC_STRINGS); + + video_overlay_video_renderer_param_specs + [VIDEO_OVERLAY_VIDEO_RENDERER_PROP_VIDEO_SINK] = + g_param_spec_object ("video-sink", "Video Sink", + "the video output element to use (NULL = default sink)", + GST_TYPE_ELEMENT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + g_object_class_install_properties (gobject_class, + VIDEO_OVERLAY_VIDEO_RENDERER_PROP_LAST, + video_overlay_video_renderer_param_specs); +} + +static void + gst_play_video_overlay_video_renderer_init + (GstPlayVideoOverlayVideoRenderer * self) +{ + self->x = self->y = self->width = self->height = -1; + self->video_sink = NULL; +} + +static GstElement *gst_play_video_overlay_video_renderer_create_video_sink + (GstPlayVideoRenderer * iface, GstPlay * play) +{ + GstElement *video_overlay; + GstPlayVideoOverlayVideoRenderer *self = + GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (iface); + + if (self->video_overlay) + gst_object_unref (self->video_overlay); + + video_overlay = gst_play_get_pipeline (play); + g_return_val_if_fail (GST_IS_VIDEO_OVERLAY (video_overlay), NULL); + + self->video_overlay = GST_VIDEO_OVERLAY (video_overlay); + + gst_video_overlay_set_window_handle (self->video_overlay, + (guintptr) self->window_handle); + if (self->width != -1 || self->height != -1) + gst_video_overlay_set_render_rectangle (self->video_overlay, self->x, + self->y, self->width, self->height); + + return self->video_sink; +} + +static void + 
gst_play_video_overlay_video_renderer_interface_init + (GstPlayVideoRendererInterface * iface) +{ + iface->create_video_sink = + gst_play_video_overlay_video_renderer_create_video_sink; +} + +/** + * gst_play_video_overlay_video_renderer_new: + * @window_handle: (allow-none): Window handle to use or %NULL + * + * Returns: (transfer full): + * Since: 1.20 + */ +GstPlayVideoRenderer * +gst_play_video_overlay_video_renderer_new (gpointer window_handle) +{ + return g_object_new (GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER, + "window-handle", window_handle, NULL); +} + +/** + * gst_play_video_overlay_video_renderer_new_with_sink: + * @window_handle: (allow-none): Window handle to use or %NULL + * @video_sink: (transfer floating): the custom video_sink element to be set for the video renderer + * + * Returns: (transfer full): + * + * Since: 1.20 + */ +GstPlayVideoRenderer * +gst_play_video_overlay_video_renderer_new_with_sink (gpointer window_handle, + GstElement * video_sink) +{ + return g_object_new (GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER, + "window-handle", window_handle, "video-sink", video_sink, NULL); +} + +/** + * gst_play_video_overlay_video_renderer_set_window_handle: + * @self: #GstPlayVideoRenderer instance + * @window_handle: handle referencing to the platform specific window + * + * Sets the platform specific window handle into which the video + * should be rendered + * Since: 1.20 + **/ +void gst_play_video_overlay_video_renderer_set_window_handle + (GstPlayVideoOverlayVideoRenderer * self, gpointer window_handle) +{ + g_return_if_fail (GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (self)); + + g_object_set (self, "window-handle", window_handle, NULL); +} + +/** + * gst_play_video_overlay_video_renderer_get_window_handle: + * @self: #GstPlayVideoRenderer instance + * + * Returns: (transfer none): The currently set, platform specific window + * handle + * Since: 1.20 + */ +gpointer + gst_play_video_overlay_video_renderer_get_window_handle + 
(GstPlayVideoOverlayVideoRenderer * self) { + gpointer window_handle; + + g_return_val_if_fail (GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (self), NULL); + + g_object_get (self, "window-handle", &window_handle, NULL); + + return window_handle; +} + +/** + * gst_play_video_overlay_video_renderer_expose: + * @self: a #GstPlayVideoOverlayVideoRenderer instance. + * + * Tell an overlay that it has been exposed. This will redraw the current frame + * in the drawable even if the pipeline is PAUSED. + * Since: 1.20 + */ +void gst_play_video_overlay_video_renderer_expose + (GstPlayVideoOverlayVideoRenderer * self) +{ + g_return_if_fail (GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (self)); + + if (self->video_overlay) + gst_video_overlay_expose (self->video_overlay); +} + +/** + * gst_play_video_overlay_video_renderer_set_render_rectangle: + * @self: a #GstPlayVideoOverlayVideoRenderer instance + * @x: the horizontal offset of the render area inside the window + * @y: the vertical offset of the render area inside the window + * @width: the width of the render area inside the window + * @height: the height of the render area inside the window + * + * Configure a subregion as a video target within the window set by + * gst_play_video_overlay_video_renderer_set_window_handle(). If this is not + * used or not supported the video will fill the area of the window set as the + * overlay to 100%. By specifying the rectangle, the video can be overlaid to + * a specific region of that window only. After setting the new rectangle one + * should call gst_play_video_overlay_video_renderer_expose() to force a + * redraw. To unset the region pass -1 for the @width and @height parameters. + * + * This method is needed for non fullscreen video overlay in UI toolkits that + * do not support subwindows. 
+ * + * Since: 1.20 + */ +void gst_play_video_overlay_video_renderer_set_render_rectangle + (GstPlayVideoOverlayVideoRenderer * self, gint x, gint y, gint width, + gint height) +{ + g_return_if_fail (GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (self)); + + self->x = x; + self->y = y; + self->width = width; + self->height = height; + + if (self->video_overlay) + gst_video_overlay_set_render_rectangle (self->video_overlay, + x, y, width, height); +} + +/** + * gst_play_video_overlay_video_renderer_get_render_rectangle: + * @self: a #GstPlayVideoOverlayVideoRenderer instance + * @x: (out) (allow-none): the horizontal offset of the render area inside the window + * @y: (out) (allow-none): the vertical offset of the render area inside the window + * @width: (out) (allow-none): the width of the render area inside the window + * @height: (out) (allow-none): the height of the render area inside the window + * + * Return the currently configured render rectangle. See gst_play_video_overlay_video_renderer_set_render_rectangle() + * for details. + * + * Since: 1.20 + */ +void gst_play_video_overlay_video_renderer_get_render_rectangle + (GstPlayVideoOverlayVideoRenderer * self, gint * x, gint * y, + gint * width, gint * height) +{ + g_return_if_fail (GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (self)); + + if (x) + *x = self->x; + if (y) + *y = self->y; + if (width) + *width = self->width; + if (height) + *height = self->height; +}
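As a hypothetical embedding sketch (not part of this package or diff), the overlay renderer added above might be used as follows. It assumes GStreamer >= 1.20 with gstreamer-play-1.0; `native_handle` stands in for a toolkit-specific window id (e.g. an X11 Window), and obtaining it is toolkit code not shown.

```c
/* Hypothetical sketch: embed GstPlay video into an existing native window.
 * Assumes gstreamer-play-1.0 (GStreamer >= 1.20); not shipped here. */
#include <gst/play/play.h>

static GstPlay *
create_embedded_player (gpointer native_handle)
{
  GstPlayVideoRenderer *renderer;
  GstPlay *play;

  renderer = gst_play_video_overlay_video_renderer_new (native_handle);
  play = gst_play_new (renderer);   /* gst_play_new takes the renderer */

  /* Confine the video to a 640x360 region at (10,10); passing -1 for
   * width/height later unsets the region so the video fills the window. */
  gst_play_video_overlay_video_renderer_set_render_rectangle (
      GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (renderer), 10, 10, 640, 360);

  return play;
}
```

After resizing the embedding window, calling `gst_play_video_overlay_video_renderer_expose()` forces a redraw of the current frame even while paused, as documented above.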
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-video-overlay-video-renderer.h
Added
@@ -0,0 +1,77 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_H__ +#define __GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_H__ + +#include <gst/play/gstplay-types.h> +#include <gst/play/gstplay-video-renderer.h> + +G_BEGIN_DECLS + +/** + * GstPlayVideoOverlayVideoRenderer: + * Since: 1.20 + */ +typedef struct _GstPlayVideoOverlayVideoRenderer + GstPlayVideoOverlayVideoRenderer; +typedef struct _GstPlayVideoOverlayVideoRendererClass + GstPlayVideoOverlayVideoRendererClass; + +#define GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER (gst_play_video_overlay_video_renderer_get_type ()) +#define GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER)) +#define GST_IS_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER)) +#define GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER, GstPlayVideoOverlayVideoRendererClass)) +#define GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), 
GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER, GstPlayVideoOverlayVideoRenderer)) +#define GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER, GstPlayVideoOverlayVideoRendererClass)) + +/** + * GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_CAST: + * Since: 1.20 + */ +#define GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_CAST(obj) ((GstPlayVideoOverlayVideoRenderer*)(obj)) + +GST_PLAY_API +GType gst_play_video_overlay_video_renderer_get_type (void); + +GST_PLAY_API +GstPlayVideoRenderer * gst_play_video_overlay_video_renderer_new (gpointer window_handle); + +GST_PLAY_API +GstPlayVideoRenderer * gst_play_video_overlay_video_renderer_new_with_sink (gpointer window_handle, GstElement * video_sink); + +GST_PLAY_API +void gst_play_video_overlay_video_renderer_set_window_handle (GstPlayVideoOverlayVideoRenderer * self, gpointer window_handle); + +GST_PLAY_API +gpointer gst_play_video_overlay_video_renderer_get_window_handle (GstPlayVideoOverlayVideoRenderer * self); + +GST_PLAY_API +void gst_play_video_overlay_video_renderer_expose (GstPlayVideoOverlayVideoRenderer * self); + +GST_PLAY_API +void gst_play_video_overlay_video_renderer_set_render_rectangle (GstPlayVideoOverlayVideoRenderer * self, gint x, gint y, gint width, gint height); + +GST_PLAY_API +void gst_play_video_overlay_video_renderer_get_render_rectangle (GstPlayVideoOverlayVideoRenderer * self, gint *x, gint *y, gint *width, gint *height); + +G_END_DECLS + +#endif /* __GST_PLAY_VIDEO_OVERLAY_VIDEO_RENDERER_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-video-renderer-private.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_VIDEO_RENDERER_PRIVATE_H__ +#define __GST_PLAY_VIDEO_RENDERER_PRIVATE_H__ + +#include <gst/play/gstplay-video-renderer.h> + +G_BEGIN_DECLS + +G_GNUC_INTERNAL GstElement * gst_play_video_renderer_create_video_sink (GstPlayVideoRenderer * + self, GstPlay * play); + +G_END_DECLS + +#endif /* __GST_PLAY_VIDEO_RENDERER_PRIVATE_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-video-renderer.c
Added
@@ -0,0 +1,49 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstplay-video-renderer.h" +#include "gstplay-video-renderer-private.h" + +G_DEFINE_INTERFACE (GstPlayVideoRenderer, gst_play_video_renderer, + G_TYPE_OBJECT); + +static void +gst_play_video_renderer_default_init (G_GNUC_UNUSED + GstPlayVideoRendererInterface * iface) +{ + +} + +GstElement * +gst_play_video_renderer_create_video_sink (GstPlayVideoRenderer * self, + GstPlay * play) +{ + GstPlayVideoRendererInterface *iface; + + g_return_val_if_fail (GST_IS_PLAY_VIDEO_RENDERER (self), NULL); + iface = GST_PLAY_VIDEO_RENDERER_GET_INTERFACE (self); + g_return_val_if_fail (iface->create_video_sink != NULL, NULL); + + return iface->create_video_sink (self, play); +}
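`gstplay-video-renderer.c` defines `GstPlayVideoRenderer` as a plain GObject interface with a single `create_video_sink()` vfunc, which `GstPlay` invokes when wiring up playbin's `video-sink`. A hedged sketch of a custom implementation (names `MyRenderer`/`my_renderer_*` are illustrative, not part of the library; assumes GStreamer ≥ 1.20):

```c
/* Sketch: a minimal custom GstPlayVideoRenderer that supplies its own
 * sink element. Assumes GStreamer >= 1.20; the MyRenderer type is a
 * hypothetical example, not part of the library. */
#include <gst/play/play.h>

typedef struct { GObject parent; } MyRenderer;
typedef struct { GObjectClass parent_class; } MyRendererClass;

static GstElement *
my_renderer_create_video_sink (GstPlayVideoRenderer * self, GstPlay * play)
{
  /* Called by GstPlay via the interface dispatcher shown above;
   * the returned element becomes playbin's video-sink. */
  return gst_element_factory_make ("autovideosink", NULL);
}

static void
my_renderer_iface_init (GstPlayVideoRendererInterface * iface)
{
  iface->create_video_sink = my_renderer_create_video_sink;
}

G_DEFINE_TYPE_WITH_CODE (MyRenderer, my_renderer, G_TYPE_OBJECT,
    G_IMPLEMENT_INTERFACE (GST_TYPE_PLAY_VIDEO_RENDERER,
        my_renderer_iface_init));

static void my_renderer_init (MyRenderer * self) { }
static void my_renderer_class_init (MyRendererClass * klass) { }
```

An instance of such a type can then be passed to `gst_play_new()` in place of the stock overlay renderer; `gst_play_video_renderer_create_video_sink()` guards against a missing vfunc with `g_return_val_if_fail`, so implementations must fill in `create_video_sink`.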
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-video-renderer.h
Added
@@ -0,0 +1,57 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_VIDEO_RENDERER_H__ +#define __GST_PLAY_VIDEO_RENDERER_H__ + +#include <gst/gst.h> +#include <gst/play/gstplay-types.h> + +G_BEGIN_DECLS + +/** + * GstPlayVideoRenderer: + * Since: 1.20 + */ +typedef struct _GstPlayVideoRenderer GstPlayVideoRenderer; +typedef struct _GstPlayVideoRendererInterface GstPlayVideoRendererInterface; + +#define GST_TYPE_PLAY_VIDEO_RENDERER (gst_play_video_renderer_get_type ()) +#define GST_PLAY_VIDEO_RENDERER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_PLAY_VIDEO_RENDERER, GstPlayVideoRenderer)) +#define GST_IS_PLAY_VIDEO_RENDERER(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_PLAY_VIDEO_RENDERER)) + +/** + * GST_PLAY_VIDEO_RENDERER_GET_INTERFACE: + * Since: 1.20 + */ +#define GST_PLAY_VIDEO_RENDERER_GET_INTERFACE(inst) (G_TYPE_INSTANCE_GET_INTERFACE ((inst), GST_TYPE_PLAY_VIDEO_RENDERER, GstPlayVideoRendererInterface)) + +struct _GstPlayVideoRendererInterface { + GTypeInterface parent_iface; + + GstElement * (*create_video_sink) (GstPlayVideoRenderer * self, GstPlay * play); +}; + +GST_PLAY_API +GType gst_play_video_renderer_get_type (void); + 
+G_END_DECLS + +#endif /* __GST_PLAY_VIDEO_RENDERER_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-visualization.c
Added
@@ -0,0 +1,183 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstplay-visualization + * @title: GstPlayVisualization + * @short_description: Play Visualization + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstplay-visualization.h" + +#include <string.h> + +static GMutex vis_lock; +static GQueue vis_list = G_QUEUE_INIT; +static guint32 vis_cookie; + +G_DEFINE_BOXED_TYPE (GstPlayVisualization, gst_play_visualization, + (GBoxedCopyFunc) gst_play_visualization_copy, + (GBoxedFreeFunc) gst_play_visualization_free); + +/** + * gst_play_visualization_free: + * @vis: #GstPlayVisualization instance + * + * Frees a #GstPlayVisualization. + * Since: 1.20 + */ +void +gst_play_visualization_free (GstPlayVisualization * vis) +{ + g_return_if_fail (vis != NULL); + + g_free (vis->name); + g_free (vis->description); + g_free (vis); +} + +/** + * gst_play_visualization_copy: + * @vis: #GstPlayVisualization instance + * + * Makes a copy of the #GstPlayVisualization. The result must be + * freed using gst_play_visualization_free(). 
+ * + * Returns: (transfer full): an allocated copy of @vis. + * Since: 1.20 + */ +GstPlayVisualization * +gst_play_visualization_copy (const GstPlayVisualization * vis) +{ + GstPlayVisualization *ret; + + g_return_val_if_fail (vis != NULL, NULL); + + ret = g_new0 (GstPlayVisualization, 1); + ret->name = vis->name ? g_strdup (vis->name) : NULL; + ret->description = vis->description ? g_strdup (vis->description) : NULL; + + return ret; +} + +/** + * gst_play_visualizations_free: + * @viss: a %NULL terminated array of #GstPlayVisualization to free + * + * Frees a %NULL terminated array of #GstPlayVisualization. + * Since: 1.20 + */ +void +gst_play_visualizations_free (GstPlayVisualization ** viss) +{ + GstPlayVisualization **p; + + g_return_if_fail (viss != NULL); + + p = viss; + while (*p) { + g_free ((*p)->name); + g_free ((*p)->description); + g_free (*p); + p++; + } + g_free (viss); +} + +static void +gst_play_update_visualization_list (void) +{ + GList *features; + GList *l; + guint32 cookie; + GstPlayVisualization *vis; + + g_mutex_lock (&vis_lock); + + /* check if we need to update the list */ + cookie = gst_registry_get_feature_list_cookie (gst_registry_get ()); + if (vis_cookie == cookie) { + g_mutex_unlock (&vis_lock); + return; + } + + /* if update is needed then first free the existing list */ + while ((vis = g_queue_pop_head (&vis_list))) + gst_play_visualization_free (vis); + + features = gst_registry_get_feature_list (gst_registry_get (), + GST_TYPE_ELEMENT_FACTORY); + + for (l = features; l; l = l->next) { + GstPluginFeature *feature = l->data; + const gchar *klass; + + klass = gst_element_factory_get_metadata (GST_ELEMENT_FACTORY (feature), + GST_ELEMENT_METADATA_KLASS); + + if (strstr (klass, "Visualization")) { + vis = g_new0 (GstPlayVisualization, 1); + + vis->name = g_strdup (gst_plugin_feature_get_name (feature)); + vis->description = + g_strdup (gst_element_factory_get_metadata (GST_ELEMENT_FACTORY + (feature), 
GST_ELEMENT_METADATA_DESCRIPTION)); + g_queue_push_tail (&vis_list, vis); + } + } + gst_plugin_feature_list_free (features); + + vis_cookie = cookie; + + g_mutex_unlock (&vis_lock); +} + +/** + * gst_play_visualizations_get: + * + * Returns: (transfer full) (array zero-terminated=1) (element-type GstPlayVisualization): + * a %NULL terminated array containing all available + * visualizations. Use gst_play_visualizations_free() after + * usage. + * Since: 1.20 + */ +GstPlayVisualization ** +gst_play_visualizations_get (void) +{ + gint i = 0; + GList *l; + GstPlayVisualization **ret; + + gst_play_update_visualization_list (); + + g_mutex_lock (&vis_lock); + ret = g_new0 (GstPlayVisualization *, g_queue_get_length (&vis_list) + 1); + for (l = vis_list.head; l; l = l->next) + ret[i++] = gst_play_visualization_copy (l->data); + g_mutex_unlock (&vis_lock); + + return ret; +}
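`gst_play_visualizations_get()` returns a %NULL-terminated array of copies, rebuilt lazily from the registry whenever the feature-list cookie changes, and the caller owns the result. A small sketch of the intended usage (assumes GStreamer ≥ 1.20 and an initialized GStreamer, i.e. `gst_init()` has been called):

```c
/* Sketch: enumerate the available visualization elements. Assumes
 * GStreamer >= 1.20 and that gst_init() has already run so the
 * registry is populated. */
#include <gst/play/play.h>

static void
list_visualizations (void)
{
  GstPlayVisualization **vis = gst_play_visualizations_get ();
  guint i;

  for (i = 0; vis[i] != NULL; i++)
    g_print ("%s: %s\n", vis[i]->name, vis[i]->description);

  /* The array and every element in it belong to the caller. */
  gst_play_visualizations_free (vis);
}
```

Note the matching free function releases each entry's `name` and `description` as well as the array itself, mirroring `gst_play_visualization_free()` for single entries.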
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay-visualization.h
Added
@@ -0,0 +1,61 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_VISUALIZATION_H__ +#define __GST_PLAY_VISUALIZATION_H__ + +#include <gst/gst.h> +#include <gst/play/play-prelude.h> + +G_BEGIN_DECLS + +typedef struct _GstPlayVisualization GstPlayVisualization; +/** + * GstPlayVisualization: + * @name: name of the visualization. + * @description: description of the visualization. + * + * A #GstPlayVisualization descriptor. + * Since: 1.20 + */ +struct _GstPlayVisualization { + gchar *name; + gchar *description; +}; + +GST_PLAY_API +GType gst_play_visualization_get_type (void); + +GST_PLAY_API +GstPlayVisualization * gst_play_visualization_copy (const GstPlayVisualization *vis); + +GST_PLAY_API +void gst_play_visualization_free (GstPlayVisualization *vis); + +GST_PLAY_API +GstPlayVisualization ** gst_play_visualizations_get (void); + +GST_PLAY_API +void gst_play_visualizations_free (GstPlayVisualization **viss); + +G_END_DECLS + +#endif /* __GST_PLAY_VISUALIZATION_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay.c
Added
@@ -0,0 +1,4759 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * Copyright (C) 2019-2020 Stephan Hesse <stephan@emliri.com> + * Copyright (C) 2020 Philippe Normand <philn@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstplay + * @title: GstPlay + * @short_description: Player + * @symbols: + * - GstPlay + * + * Since: 1.20 + */ + +/* TODO: + * + * - Equalizer + * - Gapless playback + * - Frame stepping + * - Subtitle font, connection speed + * - Deinterlacing + * - Buffering control (-> progressive downloading) + * - Playlist/queue object + * - Custom video sink (e.g. 
embed in GL scene) + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstplay.h" +#include "gstplay-video-renderer-private.h" +#include "gstplay-media-info-private.h" +#include "gstplay-message-private.h" + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/video/colorbalance.h> +#include <gst/tag/tag.h> +#include <gst/pbutils/descriptions.h> + +#include <string.h> + +GST_DEBUG_CATEGORY_STATIC (gst_play_debug); +#define GST_CAT_DEFAULT gst_play_debug + +#define DEFAULT_URI NULL +#define DEFAULT_POSITION GST_CLOCK_TIME_NONE +#define DEFAULT_DURATION GST_CLOCK_TIME_NONE +#define DEFAULT_VOLUME 1.0 +#define DEFAULT_MUTE FALSE +#define DEFAULT_RATE 1.0 +#define DEFAULT_POSITION_UPDATE_INTERVAL_MS 100 +#define DEFAULT_AUDIO_VIDEO_OFFSET 0 +#define DEFAULT_SUBTITLE_VIDEO_OFFSET 0 + +/** + * gst_play_error_quark: + * Since: 1.20 + */ +GQuark +gst_play_error_quark (void) +{ + return g_quark_from_static_string ("gst-play-error-quark"); +} + +static GQuark QUARK_CONFIG; + +/* Keep ConfigQuarkId and _config_quark_strings ordered and synced */ +typedef enum +{ + CONFIG_QUARK_USER_AGENT = 0, + CONFIG_QUARK_POSITION_INTERVAL_UPDATE, + CONFIG_QUARK_ACCURATE_SEEK, + + CONFIG_QUARK_MAX +} ConfigQuarkId; + +static const gchar *_config_quark_strings[] = { + "user-agent", + "position-interval-update", + "accurate-seek", +}; + +static GQuark _config_quark_table[CONFIG_QUARK_MAX]; + +#define CONFIG_QUARK(q) _config_quark_table[CONFIG_QUARK_##q] + +enum +{ + PROP_0, + PROP_VIDEO_RENDERER, + PROP_URI, + PROP_SUBURI, + PROP_POSITION, + PROP_DURATION, + PROP_MEDIA_INFO, + PROP_CURRENT_AUDIO_TRACK, + PROP_CURRENT_VIDEO_TRACK, + PROP_CURRENT_SUBTITLE_TRACK, + PROP_VOLUME, + PROP_MUTE, + PROP_RATE, + PROP_PIPELINE, + PROP_VIDEO_MULTIVIEW_MODE, + PROP_VIDEO_MULTIVIEW_FLAGS, + PROP_AUDIO_VIDEO_OFFSET, + PROP_SUBTITLE_VIDEO_OFFSET, + PROP_LAST +}; + +enum +{ + GST_PLAY_FLAG_VIDEO = (1 << 0), + GST_PLAY_FLAG_AUDIO = (1 << 1), + GST_PLAY_FLAG_SUBTITLE = (1 << 
2), + GST_PLAY_FLAG_VIS = (1 << 3) +}; + +struct _GstPlay +{ + GstObject parent; + + GstPlayVideoRenderer *video_renderer; + + gchar *uri; + gchar *redirect_uri; + gchar *suburi; + + GThread *thread; + GMutex lock; + GCond cond; + GMainContext *context; + GMainLoop *loop; + + GstBus *api_bus; + + GstElement *playbin; + GstBus *bus; + GstState target_state, current_state; + gboolean is_live, is_eos; + GSource *tick_source, *ready_timeout_source; + + GstClockTime cached_duration; + gint64 cached_position; + + gdouble rate; + + GstPlayState app_state; + + gint buffering_percent; + + GstTagList *global_tags; + GstPlayMediaInfo *media_info; + + GstElement *current_vis_element; + + GstStructure *config; + + /* Protected by lock */ + gboolean seek_pending; /* Only set from main context */ + GstClockTime last_seek_time; /* Only set from main context */ + GSource *seek_source; + GstClockTime seek_position; + + /* For playbin3 */ + gboolean use_playbin3; + GstStreamCollection *collection; + gchar *video_sid; + gchar *audio_sid; + gchar *subtitle_sid; + gulong stream_notify_id; +}; + +struct _GstPlayClass +{ + GstObjectClass parent_class; +}; + +#define parent_class gst_play_parent_class +G_DEFINE_TYPE (GstPlay, gst_play, GST_TYPE_OBJECT); + +static GParamSpec *param_specs[PROP_LAST] = { NULL, }; + +static void gst_play_dispose (GObject * object); +static void gst_play_finalize (GObject * object); +static void gst_play_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_play_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); +static void gst_play_constructed (GObject * object); + +static gpointer gst_play_main (gpointer data); + +static void gst_play_seek_internal_locked (GstPlay * self); +static void gst_play_stop_internal (GstPlay * self, gboolean transient); +static gboolean gst_play_pause_internal (gpointer user_data); +static gboolean gst_play_play_internal (gpointer user_data); 
+static gboolean gst_play_seek_internal (gpointer user_data); +static void gst_play_set_rate_internal (GstPlay * self); +static void change_state (GstPlay * self, GstPlayState state); + +static GstPlayMediaInfo *gst_play_media_info_create (GstPlay * self); + +static void gst_play_streams_info_create (GstPlay * self, + GstPlayMediaInfo * media_info, const gchar * prop, GType type); +static void gst_play_stream_info_update (GstPlay * self, GstPlayStreamInfo * s); +static void gst_play_stream_info_update_tags_and_caps (GstPlay * self, + GstPlayStreamInfo * s); +static GstPlayStreamInfo *gst_play_stream_info_find (GstPlayMediaInfo * + media_info, GType type, gint stream_index); +static GstPlayStreamInfo *gst_play_stream_info_get_current (GstPlay * + self, const gchar * prop, GType type); + +static void gst_play_video_info_update (GstPlay * self, + GstPlayStreamInfo * stream_info); +static void gst_play_audio_info_update (GstPlay * self, + GstPlayStreamInfo * stream_info); +static void gst_play_subtitle_info_update (GstPlay * self, + GstPlayStreamInfo * stream_info); + +/* For playbin3 */ +static void gst_play_streams_info_create_from_collection (GstPlay * self, + GstPlayMediaInfo * media_info, GstStreamCollection * collection); +static void gst_play_stream_info_update_from_stream (GstPlay * self, + GstPlayStreamInfo * s, GstStream * stream); +static GstPlayStreamInfo *gst_play_stream_info_find_from_stream_id + (GstPlayMediaInfo * media_info, const gchar * stream_id); +static GstPlayStreamInfo *gst_play_stream_info_get_current_from_stream_id + (GstPlay * self, const gchar * stream_id, GType type); +static void stream_notify_cb (GstStreamCollection * collection, + GstStream * stream, GParamSpec * pspec, GstPlay * self); + +static void on_media_info_updated (GstPlay * self); + +static void *get_title (GstTagList * tags); +static void *get_container_format (GstTagList * tags); +static void *get_from_tags (GstPlay * self, GstPlayMediaInfo * media_info, + void *(*func) 
(GstTagList *)); +static void *get_cover_sample (GstTagList * tags); + +static void remove_seek_source (GstPlay * self); + +static gboolean query_position (GstPlay * self, GstClockTime * position); + +static void +gst_play_init (GstPlay * self) +{ + GST_TRACE_OBJECT (self, "Initializing"); + + self = gst_play_get_instance_private (self); + + g_mutex_init (&self->lock); + g_cond_init (&self->cond); + + self->context = g_main_context_new (); + self->loop = g_main_loop_new (self->context, FALSE); + self->api_bus = gst_bus_new (); + + /* *INDENT-OFF* */ + self->config = gst_structure_new_id (QUARK_CONFIG, + CONFIG_QUARK (POSITION_INTERVAL_UPDATE), G_TYPE_UINT, DEFAULT_POSITION_UPDATE_INTERVAL_MS, + CONFIG_QUARK (ACCURATE_SEEK), G_TYPE_BOOLEAN, FALSE, + NULL); + /* *INDENT-ON* */ + + self->seek_pending = FALSE; + self->seek_position = GST_CLOCK_TIME_NONE; + self->last_seek_time = GST_CLOCK_TIME_NONE; + + self->cached_position = 0; + self->cached_duration = GST_CLOCK_TIME_NONE; + + GST_TRACE_OBJECT (self, "Initialized"); +} + +/* + * Works same as gst_structure_set to set field/type/value triplets on message data + */ +static void +api_bus_post_message (GstPlay * self, GstPlayMessage message_type, + const gchar * firstfield, ...) 
+{ + GstStructure *message_data = NULL; + GstMessage *msg = NULL; + va_list varargs; + + GST_INFO ("Posting API-bus message-type: %s", + gst_play_message_get_name (message_type)); + message_data = gst_structure_new (GST_PLAY_MESSAGE_DATA, + GST_PLAY_MESSAGE_DATA_TYPE, GST_TYPE_PLAY_MESSAGE, message_type, NULL); + + va_start (varargs, firstfield); + gst_structure_set_valist (message_data, firstfield, varargs); + va_end (varargs); + + msg = gst_message_new_custom (GST_MESSAGE_APPLICATION, + GST_OBJECT (self), message_data); + GST_DEBUG ("Created message with payload: [ %" GST_PTR_FORMAT " ]", + message_data); + gst_bus_post (self->api_bus, msg); +} + +static void +config_quark_initialize (void) +{ + gint i; + + QUARK_CONFIG = g_quark_from_static_string ("play-config"); + + if (G_N_ELEMENTS (_config_quark_strings) != CONFIG_QUARK_MAX) + g_warning ("the quark table is not consistent! %d != %d", + (int) G_N_ELEMENTS (_config_quark_strings), CONFIG_QUARK_MAX); + + for (i = 0; i < CONFIG_QUARK_MAX; i++) { + _config_quark_table[i] = + g_quark_from_static_string (_config_quark_strings[i]); + } +} + +static void +gst_play_class_init (GstPlayClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + + gobject_class->set_property = gst_play_set_property; + gobject_class->get_property = gst_play_get_property; + gobject_class->dispose = gst_play_dispose; + gobject_class->finalize = gst_play_finalize; + gobject_class->constructed = gst_play_constructed; + + param_specs[PROP_VIDEO_RENDERER] = + g_param_spec_object ("video-renderer", + "Video Renderer", "Video renderer to use for rendering videos", + GST_TYPE_PLAY_VIDEO_RENDERER, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_URI] = g_param_spec_string ("uri", "URI", "Current URI", + DEFAULT_URI, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_SUBURI] = g_param_spec_string ("suburi", "Subtitle URI", + "Current Subtitle URI", NULL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + 
param_specs[PROP_POSITION] = + g_param_spec_uint64 ("position", "Position", "Current Position", + 0, G_MAXUINT64, DEFAULT_POSITION, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_MEDIA_INFO] = + g_param_spec_object ("media-info", "Media Info", + "Current media information", GST_TYPE_PLAY_MEDIA_INFO, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_CURRENT_AUDIO_TRACK] = + g_param_spec_object ("current-audio-track", "Current Audio Track", + "Current audio track information", GST_TYPE_PLAY_AUDIO_INFO, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_CURRENT_VIDEO_TRACK] = + g_param_spec_object ("current-video-track", "Current Video Track", + "Current video track information", GST_TYPE_PLAY_VIDEO_INFO, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_CURRENT_SUBTITLE_TRACK] = + g_param_spec_object ("current-subtitle-track", "Current Subtitle Track", + "Current audio subtitle information", GST_TYPE_PLAY_SUBTITLE_INFO, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_DURATION] = + g_param_spec_uint64 ("duration", "Duration", "Duration", + 0, G_MAXUINT64, DEFAULT_DURATION, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_VOLUME] = + g_param_spec_double ("volume", "Volume", "Volume", + 0, 10.0, DEFAULT_VOLUME, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_MUTE] = + g_param_spec_boolean ("mute", "Mute", "Mute", + DEFAULT_MUTE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_PIPELINE] = + g_param_spec_object ("pipeline", "Pipeline", + "GStreamer pipeline that is used", + GST_TYPE_ELEMENT, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_RATE] = + g_param_spec_double ("rate", "rate", "Playback rate", + -64.0, 64.0, DEFAULT_RATE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_VIDEO_MULTIVIEW_MODE] = + g_param_spec_enum ("video-multiview-mode", + "Multiview Mode Override", + "Re-interpret a video 
stream as one of several frame-packed stereoscopic modes.", + GST_TYPE_VIDEO_MULTIVIEW_FRAME_PACKING, + GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_VIDEO_MULTIVIEW_FLAGS] = + g_param_spec_flags ("video-multiview-flags", + "Multiview Flags Override", + "Override details of the multiview frame layout", + GST_TYPE_VIDEO_MULTIVIEW_FLAGS, GST_VIDEO_MULTIVIEW_FLAGS_NONE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_AUDIO_VIDEO_OFFSET] = + g_param_spec_int64 ("audio-video-offset", "Audio Video Offset", + "The synchronisation offset between audio and video in nanoseconds", + G_MININT64, G_MAXINT64, 0, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + param_specs[PROP_SUBTITLE_VIDEO_OFFSET] = + g_param_spec_int64 ("subtitle-video-offset", "Text Video Offset", + "The synchronisation offset between text and video in nanoseconds", + G_MININT64, G_MAXINT64, 0, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + + g_object_class_install_properties (gobject_class, PROP_LAST, param_specs); + + config_quark_initialize (); +} + +static void +gst_play_dispose (GObject * object) +{ + GstPlay *self = GST_PLAY (object); + + GST_TRACE_OBJECT (self, "Stopping main thread"); + + gst_bus_set_flushing (self->api_bus, TRUE); + + if (self->loop) { + g_main_loop_quit (self->loop); + + if (self->thread != g_thread_self ()) + g_thread_join (self->thread); + else + g_thread_unref (self->thread); + self->thread = NULL; + + g_main_loop_unref (self->loop); + self->loop = NULL; + + g_main_context_unref (self->context); + self->context = NULL; + } + + gst_clear_object (&self->api_bus); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_play_finalize (GObject * object) +{ + GstPlay *self = GST_PLAY (object); + + GST_TRACE_OBJECT (self, "Finalizing"); + + g_free (self->uri); + g_free (self->redirect_uri); + g_free (self->suburi); + g_free (self->video_sid); + g_free (self->audio_sid); + g_free 
(self->subtitle_sid); + if (self->global_tags) + gst_tag_list_unref (self->global_tags); + if (self->video_renderer) + g_object_unref (self->video_renderer); + if (self->current_vis_element) + gst_object_unref (self->current_vis_element); + if (self->config) + gst_structure_free (self->config); + if (self->collection) + gst_object_unref (self->collection); + if (self->media_info) + g_object_unref (self->media_info); + g_mutex_clear (&self->lock); + g_cond_clear (&self->cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_play_constructed (GObject * object) +{ + GstPlay *self = GST_PLAY (object); + + GST_TRACE_OBJECT (self, "Constructed"); + + g_mutex_lock (&self->lock); + self->thread = g_thread_new ("GstPlay", gst_play_main, self); + while (!self->loop || !g_main_loop_is_running (self->loop)) + g_cond_wait (&self->cond, &self->lock); + g_mutex_unlock (&self->lock); + + G_OBJECT_CLASS (parent_class)->constructed (object); +} + +static gboolean +gst_play_set_uri_internal (gpointer user_data) +{ + GstPlay *self = user_data; + + gst_play_stop_internal (self, FALSE); + + g_mutex_lock (&self->lock); + + GST_DEBUG_OBJECT (self, "Changing URI to '%s'", GST_STR_NULL (self->uri)); + + g_object_set (self->playbin, "uri", self->uri, NULL); + + api_bus_post_message (self, GST_PLAY_MESSAGE_URI_LOADED, + GST_PLAY_MESSAGE_DATA_URI, G_TYPE_STRING, self->uri, NULL); + + g_object_set (self->playbin, "suburi", NULL, NULL); + + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +static gboolean +gst_play_set_suburi_internal (gpointer user_data) +{ + GstPlay *self = user_data; + GstClockTime position; + GstState target_state; + + /* save the state and position */ + target_state = self->target_state; + position = gst_play_get_position (self); + + gst_play_stop_internal (self, TRUE); + g_mutex_lock (&self->lock); + + GST_DEBUG_OBJECT (self, "Changing SUBURI to '%s'", + GST_STR_NULL (self->suburi)); + + g_object_set (self->playbin, "suburi", 
self->suburi, NULL);
+
+  g_mutex_unlock (&self->lock);
+
+  /* restore state and position */
+  if (position != GST_CLOCK_TIME_NONE)
+    gst_play_seek (self, position);
+  if (target_state == GST_STATE_PAUSED)
+    gst_play_pause_internal (self);
+  else if (target_state == GST_STATE_PLAYING)
+    gst_play_play_internal (self);
+
+  return G_SOURCE_REMOVE;
+}
+
+static void
+gst_play_set_rate_internal (GstPlay * self)
+{
+  self->seek_position = gst_play_get_position (self);
+
+  /* If there is no seek currently being dispatched to the main context, do
+   * that now; otherwise we have just updated the rate, and it will be picked
+   * up by the seek handler from the main context instead of the old one.
+   */
+  if (!self->seek_source) {
+    /* If no seek is pending then create a new seek source */
+    if (!self->seek_pending) {
+      self->seek_source = g_idle_source_new ();
+      g_source_set_callback (self->seek_source,
+          (GSourceFunc) gst_play_seek_internal, self, NULL);
+      g_source_attach (self->seek_source, self->context);
+    }
+  }
+}
+
+static void
+gst_play_set_playbin_video_sink (GstPlay * self)
+{
+  GstElement *video_sink = NULL;
+
+  if (self->video_renderer != NULL)
+    video_sink =
+        gst_play_video_renderer_create_video_sink (self->video_renderer, self);
+  if (video_sink)
+    g_object_set (self->playbin, "video-sink", video_sink, NULL);
+}
+
+static void
+gst_play_set_property (GObject * object, guint prop_id,
+    const GValue * value, GParamSpec * pspec)
+{
+  GstPlay *self = GST_PLAY (object);
+
+  switch (prop_id) {
+    case PROP_VIDEO_RENDERER:
+      g_mutex_lock (&self->lock);
+      g_clear_object (&self->video_renderer);
+      self->video_renderer = g_value_dup_object (value);
+      gst_play_set_playbin_video_sink (self);
+      g_mutex_unlock (&self->lock);
+      break;
+    case PROP_URI:{
+      g_mutex_lock (&self->lock);
+      g_free (self->uri);
+      g_free (self->redirect_uri);
+      self->redirect_uri = NULL;
+
+      g_free (self->suburi);
+      self->suburi = NULL;
+
+      self->uri = g_value_dup_string (value);
+      GST_DEBUG_OBJECT (self, "Set uri=%s",
GST_STR_NULL (self->uri)); + g_mutex_unlock (&self->lock); + + g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT, + gst_play_set_uri_internal, self, NULL); + break; + } + case PROP_SUBURI:{ + g_mutex_lock (&self->lock); + g_free (self->suburi); + + self->suburi = g_value_dup_string (value); + GST_DEBUG_OBJECT (self, "Set suburi=%s", self->suburi); + g_mutex_unlock (&self->lock); + + g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT, + gst_play_set_suburi_internal, self, NULL); + break; + } + case PROP_VOLUME: + GST_DEBUG_OBJECT (self, "Set volume=%lf", g_value_get_double (value)); + g_object_set_property (G_OBJECT (self->playbin), "volume", value); + break; + case PROP_RATE: + g_mutex_lock (&self->lock); + self->rate = g_value_get_double (value); + GST_DEBUG_OBJECT (self, "Set rate=%lf", g_value_get_double (value)); + gst_play_set_rate_internal (self); + g_mutex_unlock (&self->lock); + break; + case PROP_MUTE: + GST_DEBUG_OBJECT (self, "Set mute=%d", g_value_get_boolean (value)); + g_object_set_property (G_OBJECT (self->playbin), "mute", value); + break; + case PROP_VIDEO_MULTIVIEW_MODE: + GST_DEBUG_OBJECT (self, "Set multiview mode=%u", + g_value_get_enum (value)); + g_object_set_property (G_OBJECT (self->playbin), "video-multiview-mode", + value); + break; + case PROP_VIDEO_MULTIVIEW_FLAGS: + GST_DEBUG_OBJECT (self, "Set multiview flags=%x", + g_value_get_flags (value)); + g_object_set_property (G_OBJECT (self->playbin), "video-multiview-flags", + value); + break; + case PROP_AUDIO_VIDEO_OFFSET: + g_object_set_property (G_OBJECT (self->playbin), "av-offset", value); + break; + case PROP_SUBTITLE_VIDEO_OFFSET: + g_object_set_property (G_OBJECT (self->playbin), "text-offset", value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_play_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstPlay *self = GST_PLAY (object); + + switch 
(prop_id) { + case PROP_URI: + g_mutex_lock (&self->lock); + g_value_set_string (value, self->uri); + g_mutex_unlock (&self->lock); + break; + case PROP_SUBURI: + g_mutex_lock (&self->lock); + g_value_set_string (value, self->suburi); + g_mutex_unlock (&self->lock); + GST_DEBUG_OBJECT (self, "Returning suburi=%s", + g_value_get_string (value)); + break; + case PROP_POSITION:{ + GstClockTime position = GST_CLOCK_TIME_NONE; + query_position (self, &position); + g_value_set_uint64 (value, position); + GST_TRACE_OBJECT (self, "Returning position=%" GST_TIME_FORMAT, + GST_TIME_ARGS (g_value_get_uint64 (value))); + break; + } + case PROP_DURATION:{ + g_value_set_uint64 (value, self->cached_duration); + GST_TRACE_OBJECT (self, "Returning duration=%" GST_TIME_FORMAT, + GST_TIME_ARGS (g_value_get_uint64 (value))); + break; + } + case PROP_MEDIA_INFO:{ + GstPlayMediaInfo *media_info = gst_play_get_media_info (self); + g_value_take_object (value, media_info); + break; + } + case PROP_CURRENT_AUDIO_TRACK:{ + GstPlayAudioInfo *audio_info = gst_play_get_current_audio_track (self); + g_value_take_object (value, audio_info); + break; + } + case PROP_CURRENT_VIDEO_TRACK:{ + GstPlayVideoInfo *video_info = gst_play_get_current_video_track (self); + g_value_take_object (value, video_info); + break; + } + case PROP_CURRENT_SUBTITLE_TRACK:{ + GstPlaySubtitleInfo *subtitle_info = + gst_play_get_current_subtitle_track (self); + g_value_take_object (value, subtitle_info); + break; + } + case PROP_VOLUME: + g_object_get_property (G_OBJECT (self->playbin), "volume", value); + GST_TRACE_OBJECT (self, "Returning volume=%lf", + g_value_get_double (value)); + break; + case PROP_RATE: + g_mutex_lock (&self->lock); + g_value_set_double (value, self->rate); + g_mutex_unlock (&self->lock); + break; + case PROP_MUTE: + g_object_get_property (G_OBJECT (self->playbin), "mute", value); + GST_TRACE_OBJECT (self, "Returning mute=%d", g_value_get_boolean (value)); + break; + case PROP_PIPELINE: + 
g_value_set_object (value, self->playbin); + break; + case PROP_VIDEO_MULTIVIEW_MODE:{ + g_object_get_property (G_OBJECT (self->playbin), "video-multiview-mode", + value); + GST_TRACE_OBJECT (self, "Return multiview mode=%d", + g_value_get_enum (value)); + break; + } + case PROP_VIDEO_MULTIVIEW_FLAGS:{ + g_object_get_property (G_OBJECT (self->playbin), "video-multiview-flags", + value); + GST_TRACE_OBJECT (self, "Return multiview flags=%x", + g_value_get_flags (value)); + break; + } + case PROP_AUDIO_VIDEO_OFFSET: + g_object_get_property (G_OBJECT (self->playbin), "av-offset", value); + break; + case PROP_SUBTITLE_VIDEO_OFFSET: + g_object_get_property (G_OBJECT (self->playbin), "text-offset", value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean +main_loop_running_cb (gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + GST_TRACE_OBJECT (self, "Main loop running now"); + + g_mutex_lock (&self->lock); + g_cond_signal (&self->cond); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +static void +change_state (GstPlay * self, GstPlayState state) +{ + if (state == self->app_state) + return; + + GST_DEBUG_OBJECT (self, "Changing app state from %s to %s", + gst_play_state_get_name (self->app_state), + gst_play_state_get_name (state)); + + self->app_state = state; + + api_bus_post_message (self, GST_PLAY_MESSAGE_STATE_CHANGED, + GST_PLAY_MESSAGE_DATA_PLAY_STATE, GST_TYPE_PLAY_STATE, + self->app_state, NULL); +} + +static gboolean +tick_cb (gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstClockTime position; + if (query_position (self, &position)) { + api_bus_post_message (self, GST_PLAY_MESSAGE_POSITION_UPDATED, + GST_PLAY_MESSAGE_DATA_POSITION, GST_TYPE_CLOCK_TIME, position, NULL); + } + + return G_SOURCE_CONTINUE; +} + +/* + * Returns true when position is queried and differed from cached position. 
+ * Sets position to cached value, and to queried value if position can be
+ * queried and different.
+ */
+static gboolean
+query_position (GstPlay * self, GstClockTime * position)
+{
+  gint64 current_position;
+  *position = self->cached_position;
+  if (self->target_state >= GST_STATE_PAUSED
+      && gst_element_query_position (self->playbin, GST_FORMAT_TIME,
+          &current_position)) {
+    GST_LOG_OBJECT (self, "Queried position %" GST_TIME_FORMAT,
+        GST_TIME_ARGS (current_position));
+    if (self->cached_position != current_position) {
+      self->cached_position = current_position;
+      *position = (GstClockTime) current_position;
+      return TRUE;
+    }
+  }
+  return FALSE;
+}
+
+static void
+add_tick_source (GstPlay * self)
+{
+  guint position_update_interval_ms;
+
+  if (self->tick_source)
+    return;
+
+  position_update_interval_ms =
+      gst_play_config_get_position_update_interval (self->config);
+  if (!position_update_interval_ms)
+    return;
+
+  self->tick_source = g_timeout_source_new (position_update_interval_ms);
+  g_source_set_callback (self->tick_source, (GSourceFunc) tick_cb, self, NULL);
+  g_source_attach (self->tick_source, self->context);
+}
+
+static void
+remove_tick_source (GstPlay * self)
+{
+  if (!self->tick_source)
+    return;
+
+  g_source_destroy (self->tick_source);
+  g_source_unref (self->tick_source);
+  self->tick_source = NULL;
+}
+
+static gboolean
+ready_timeout_cb (gpointer user_data)
+{
+  GstPlay *self = user_data;
+
+  if (self->target_state <= GST_STATE_READY) {
+    GST_DEBUG_OBJECT (self, "Setting pipeline to NULL state");
+    self->target_state = GST_STATE_NULL;
+    self->current_state = GST_STATE_NULL;
+    gst_element_set_state (self->playbin, GST_STATE_NULL);
+  }
+
+  return G_SOURCE_REMOVE;
+}
+
+static void
+add_ready_timeout_source (GstPlay * self)
+{
+  if (self->ready_timeout_source)
+    return;
+
+  self->ready_timeout_source = g_timeout_source_new_seconds (60);
+  g_source_set_callback (self->ready_timeout_source,
+      (GSourceFunc) ready_timeout_cb, self, NULL);
+
g_source_attach (self->ready_timeout_source, self->context); +} + +static void +remove_ready_timeout_source (GstPlay * self) +{ + if (!self->ready_timeout_source) + return; + + g_source_destroy (self->ready_timeout_source); + g_source_unref (self->ready_timeout_source); + self->ready_timeout_source = NULL; +} + + +static void +on_error (GstPlay * self, GError * err, const GstStructure * details) +{ + GST_ERROR_OBJECT (self, "Error: %s (%s, %d)", err->message, + g_quark_to_string (err->domain), err->code); + + api_bus_post_message (self, GST_PLAY_MESSAGE_ERROR, + GST_PLAY_MESSAGE_DATA_ERROR, G_TYPE_ERROR, err, + GST_PLAY_MESSAGE_DATA_ERROR_DETAILS, GST_TYPE_STRUCTURE, details, NULL); + + g_error_free (err); + + remove_tick_source (self); + remove_ready_timeout_source (self); + + self->target_state = GST_STATE_NULL; + self->current_state = GST_STATE_NULL; + self->is_live = FALSE; + self->is_eos = FALSE; + gst_element_set_state (self->playbin, GST_STATE_NULL); + change_state (self, GST_PLAY_STATE_STOPPED); + self->buffering_percent = 100; + + g_mutex_lock (&self->lock); + if (self->media_info) { + g_object_unref (self->media_info); + self->media_info = NULL; + } + + if (self->global_tags) { + gst_tag_list_unref (self->global_tags); + self->global_tags = NULL; + } + + self->seek_pending = FALSE; + remove_seek_source (self); + self->seek_position = GST_CLOCK_TIME_NONE; + self->last_seek_time = GST_CLOCK_TIME_NONE; + g_mutex_unlock (&self->lock); +} + +static void +dump_dot_file (GstPlay * self, const gchar * name) +{ + gchar *full_name; + + full_name = g_strdup_printf ("gst-play.%p.%s", self, name); + + GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (self->playbin), + GST_DEBUG_GRAPH_SHOW_ALL, full_name); + + g_free (full_name); +} + +static void +error_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GError *err, *play_err; + gchar *name, *debug, *message, *full_message; + const GstStructure *details = NULL; 
+ + dump_dot_file (self, "error"); + + gst_message_parse_error (msg, &err, &debug); + gst_message_parse_error_details (msg, &details); + + name = gst_object_get_path_string (msg->src); + message = gst_error_get_message (err->domain, err->code); + + if (debug) + full_message = + g_strdup_printf ("Error from element %s: %s\n%s\n%s", name, message, + err->message, debug); + else + full_message = + g_strdup_printf ("Error from element %s: %s\n%s", name, message, + err->message); + + GST_ERROR_OBJECT (self, "ERROR: from element %s: %s", name, err->message); + if (debug != NULL) + GST_ERROR_OBJECT (self, "Additional debug info: %s", debug); + + play_err = + g_error_new_literal (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, full_message); + on_error (self, play_err, details); + + g_clear_error (&err); + g_free (debug); + g_free (name); + g_free (full_message); + g_free (message); +} + +static void +warning_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GError *err, *play_err; + gchar *name, *debug, *message, *full_message; + const GstStructure *details = NULL; + + dump_dot_file (self, "warning"); + + gst_message_parse_warning (msg, &err, &debug); + gst_message_parse_warning_details (msg, &details); + + name = gst_object_get_path_string (msg->src); + message = gst_error_get_message (err->domain, err->code); + + if (debug) + full_message = + g_strdup_printf ("Warning from element %s: %s\n%s\n%s", name, message, + err->message, debug); + else + full_message = + g_strdup_printf ("Warning from element %s: %s\n%s", name, message, + err->message); + + GST_WARNING_OBJECT (self, "WARNING: from element %s: %s", name, err->message); + if (debug != NULL) + GST_WARNING_OBJECT (self, "Additional debug info: %s", debug); + + play_err = + g_error_new_literal (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, full_message); + + GST_ERROR_OBJECT (self, "Warning: %s (%s, %d)", err->message, + g_quark_to_string (err->domain), err->code); + + 
api_bus_post_message (self, GST_PLAY_MESSAGE_WARNING, + GST_PLAY_MESSAGE_DATA_WARNING, G_TYPE_ERROR, play_err, + GST_PLAY_MESSAGE_DATA_WARNING_DETAILS, GST_TYPE_STRUCTURE, details, NULL); + + g_clear_error (&play_err); + g_clear_error (&err); + g_free (debug); + g_free (name); + g_free (full_message); + g_free (message); +} + +static void +eos_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + GST_DEBUG_OBJECT (self, "End of stream"); + + tick_cb (self); + remove_tick_source (self); + + api_bus_post_message (self, GST_PLAY_MESSAGE_END_OF_STREAM, NULL); + + change_state (self, GST_PLAY_STATE_STOPPED); + self->buffering_percent = 100; + self->is_eos = TRUE; +} + +static void +buffering_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + gint percent; + + if (self->target_state < GST_STATE_PAUSED) + return; + if (self->is_live) + return; + + gst_message_parse_buffering (msg, &percent); + GST_LOG_OBJECT (self, "Buffering %d%%", percent); + + if (percent < 100 && self->target_state >= GST_STATE_PAUSED) { + GstStateChangeReturn state_ret; + + GST_DEBUG_OBJECT (self, "Waiting for buffering to finish"); + state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED); + + if (state_ret == GST_STATE_CHANGE_FAILURE) { + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to handle buffering"), NULL); + return; + } + + change_state (self, GST_PLAY_STATE_BUFFERING); + } + + if (self->buffering_percent != percent) { + self->buffering_percent = percent; + + api_bus_post_message (self, GST_PLAY_MESSAGE_BUFFERING, + GST_PLAY_MESSAGE_DATA_BUFFERING_PERCENT, G_TYPE_UINT, percent, NULL); + } + + g_mutex_lock (&self->lock); + if (percent == 100 && (self->seek_position != GST_CLOCK_TIME_NONE || + self->seek_pending)) { + g_mutex_unlock (&self->lock); + + GST_DEBUG_OBJECT (self, "Buffering finished - seek 
pending"); + } else if (percent == 100 && self->target_state >= GST_STATE_PLAYING + && self->current_state >= GST_STATE_PAUSED) { + GstStateChangeReturn state_ret; + + g_mutex_unlock (&self->lock); + + GST_DEBUG_OBJECT (self, "Buffering finished - going to PLAYING"); + state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING); + /* Application state change is happening when the state change happened */ + if (state_ret == GST_STATE_CHANGE_FAILURE) + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to handle buffering"), NULL); + } else if (percent == 100 && self->target_state >= GST_STATE_PAUSED) { + g_mutex_unlock (&self->lock); + + GST_DEBUG_OBJECT (self, "Buffering finished - staying PAUSED"); + change_state (self, GST_PLAY_STATE_PAUSED); + } else { + g_mutex_unlock (&self->lock); + } +} + +static void +clock_lost_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstStateChangeReturn state_ret; + + GST_DEBUG_OBJECT (self, "Clock lost"); + if (self->target_state >= GST_STATE_PLAYING) { + state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED); + if (state_ret != GST_STATE_CHANGE_FAILURE) + state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING); + + if (state_ret == GST_STATE_CHANGE_FAILURE) + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to handle clock loss"), NULL); + } +} + + +static void +check_video_dimensions_changed (GstPlay * self) +{ + GstElement *video_sink; + GstPad *video_sink_pad; + GstCaps *caps; + GstVideoInfo info; + guint width = 0, height = 0; + + g_object_get (self->playbin, "video-sink", &video_sink, NULL); + if (!video_sink) + goto out; + + video_sink_pad = gst_element_get_static_pad (video_sink, "sink"); + if (!video_sink_pad) { + gst_object_unref (video_sink); + goto out; + } + + caps = gst_pad_get_current_caps (video_sink_pad); + + if (caps) { + if 
(gst_video_info_from_caps (&info, caps)) { + info.width = info.width * info.par_n / info.par_d; + + GST_DEBUG_OBJECT (self, "Video dimensions changed: %dx%d", info.width, + info.height); + width = info.width; + height = info.height; + } + + gst_caps_unref (caps); + } + gst_object_unref (video_sink_pad); + gst_object_unref (video_sink); + +out: + api_bus_post_message (self, GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED, + GST_PLAY_MESSAGE_DATA_VIDEO_WIDTH, G_TYPE_UINT, width, + GST_PLAY_MESSAGE_DATA_VIDEO_HEIGHT, G_TYPE_UINT, height, NULL); +} + +static void +notify_caps_cb (G_GNUC_UNUSED GObject * object, + G_GNUC_UNUSED GParamSpec * pspec, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + check_video_dimensions_changed (self); +} + +static void +on_duration_changed (GstPlay * self, GstClockTime duration) +{ + gboolean updated = FALSE; + + if (self->cached_duration == duration) + return; + + GST_DEBUG_OBJECT (self, "Duration changed %" GST_TIME_FORMAT, + GST_TIME_ARGS (duration)); + + g_mutex_lock (&self->lock); + self->cached_duration = duration; + if (self->media_info) { + self->media_info->duration = duration; + updated = TRUE; + } + g_mutex_unlock (&self->lock); + + api_bus_post_message (self, GST_PLAY_MESSAGE_DURATION_CHANGED, + GST_PLAY_MESSAGE_DATA_DURATION, GST_TYPE_CLOCK_TIME, + gst_play_get_duration (self), NULL); + + if (updated) { + on_media_info_updated (self); + } +} + +static void +on_seek_done (GstPlay * self) +{ + api_bus_post_message (self, GST_PLAY_MESSAGE_SEEK_DONE, + GST_PLAY_MESSAGE_DATA_POSITION, GST_TYPE_CLOCK_TIME, + gst_play_get_position (self), NULL); +} + +static void +state_changed_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstState old_state, new_state, pending_state; + + gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state); + + if (GST_MESSAGE_SRC (msg) == GST_OBJECT (self->playbin)) { + gchar *transition_name; + + 
GST_DEBUG_OBJECT (self, "Changed state old: %s new: %s pending: %s", + gst_element_state_get_name (old_state), + gst_element_state_get_name (new_state), + gst_element_state_get_name (pending_state)); + + transition_name = g_strdup_printf ("%s_%s", + gst_element_state_get_name (old_state), + gst_element_state_get_name (new_state)); + dump_dot_file (self, transition_name); + g_free (transition_name); + + self->current_state = new_state; + + if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED + && pending_state == GST_STATE_VOID_PENDING) { + GstElement *video_sink; + GstPad *video_sink_pad; + gint64 duration = -1; + + GST_DEBUG_OBJECT (self, "Initial PAUSED - pre-rolled"); + + g_mutex_lock (&self->lock); + if (self->media_info) + g_object_unref (self->media_info); + self->media_info = gst_play_media_info_create (self); + g_mutex_unlock (&self->lock); + on_media_info_updated (self); + + g_object_get (self->playbin, "video-sink", &video_sink, NULL); + + if (video_sink) { + video_sink_pad = gst_element_get_static_pad (video_sink, "sink"); + + if (video_sink_pad) { + g_signal_connect (video_sink_pad, "notify::caps", + (GCallback) notify_caps_cb, self); + gst_object_unref (video_sink_pad); + } + gst_object_unref (video_sink); + } + + check_video_dimensions_changed (self); + if (gst_element_query_duration (self->playbin, GST_FORMAT_TIME, + &duration)) { + on_duration_changed (self, duration); + } else { + self->cached_duration = GST_CLOCK_TIME_NONE; + } + } + + if (new_state == GST_STATE_PAUSED + && pending_state == GST_STATE_VOID_PENDING) { + remove_tick_source (self); + + g_mutex_lock (&self->lock); + if (self->seek_pending) { + self->seek_pending = FALSE; + + if (!self->media_info->seekable) { + GST_DEBUG_OBJECT (self, "Media is not seekable"); + remove_seek_source (self); + self->seek_position = GST_CLOCK_TIME_NONE; + self->last_seek_time = GST_CLOCK_TIME_NONE; + } else if (self->seek_source) { + GST_DEBUG_OBJECT (self, "Seek finished but new seek is 
pending"); + gst_play_seek_internal_locked (self); + } else { + GST_DEBUG_OBJECT (self, "Seek finished"); + on_seek_done (self); + } + } + + if (self->seek_position != GST_CLOCK_TIME_NONE) { + GST_DEBUG_OBJECT (self, "Seeking now that we reached PAUSED state"); + gst_play_seek_internal_locked (self); + g_mutex_unlock (&self->lock); + } else if (!self->seek_pending) { + g_mutex_unlock (&self->lock); + + tick_cb (self); + + if (self->target_state >= GST_STATE_PLAYING + && self->buffering_percent == 100) { + GstStateChangeReturn state_ret; + + state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING); + if (state_ret == GST_STATE_CHANGE_FAILURE) + on_error (self, g_error_new (GST_PLAY_ERROR, + GST_PLAY_ERROR_FAILED, "Failed to play"), NULL); + } else if (self->buffering_percent == 100) { + change_state (self, GST_PLAY_STATE_PAUSED); + } + } else { + g_mutex_unlock (&self->lock); + } + } else if (new_state == GST_STATE_PLAYING + && pending_state == GST_STATE_VOID_PENDING) { + /* api_bus_post_message (self, GST_PLAY_MESSAGE_POSITION_UPDATED, */ + /* GST_PLAY_MESSAGE_DATA_POSITION, GST_TYPE_CLOCK_TIME, 0, NULL); */ + + /* If no seek is currently pending, add the tick source. This can happen + * if we seeked already but the state-change message was still queued up */ + if (!self->seek_pending) { + add_tick_source (self); + change_state (self, GST_PLAY_STATE_PLAYING); + } + } else if (new_state == GST_STATE_READY && old_state > GST_STATE_READY) { + change_state (self, GST_PLAY_STATE_STOPPED); + } else { + /* Otherwise we neither reached PLAYING nor PAUSED, so must + * wait for something to happen... i.e. 
are BUFFERING now */ + change_state (self, GST_PLAY_STATE_BUFFERING); + } + } +} + +static void +duration_changed_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + gint64 duration = GST_CLOCK_TIME_NONE; + + if (gst_element_query_duration (self->playbin, GST_FORMAT_TIME, &duration)) { + on_duration_changed (self, duration); + } +} + +static void +latency_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + GST_DEBUG_OBJECT (self, "Latency changed"); + + gst_bin_recalculate_latency (GST_BIN (self->playbin)); +} + +static void +request_state_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstState state; + GstStateChangeReturn state_ret; + + gst_message_parse_request_state (msg, &state); + + GST_DEBUG_OBJECT (self, "State %s requested", + gst_element_state_get_name (state)); + + self->target_state = state; + state_ret = gst_element_set_state (self->playbin, state); + if (state_ret == GST_STATE_CHANGE_FAILURE) + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to change to requested state %s", + gst_element_state_get_name (state)), NULL); +} + +static void +media_info_update (GstPlay * self, GstPlayMediaInfo * info) +{ + g_free (info->title); + info->title = get_from_tags (self, info, get_title); + + g_free (info->container); + info->container = get_from_tags (self, info, get_container_format); + + if (info->image_sample) + gst_sample_unref (info->image_sample); + info->image_sample = get_from_tags (self, info, get_cover_sample); + + GST_DEBUG_OBJECT (self, "title: %s, container: %s " + "image_sample: %p", info->title, info->container, info->image_sample); +} + +static void +tags_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstTagList *tags = 
NULL; + + gst_message_parse_tag (msg, &tags); + + GST_DEBUG_OBJECT (self, "received %s tags", + gst_tag_list_get_scope (tags) == + GST_TAG_SCOPE_GLOBAL ? "global" : "stream"); + + if (gst_tag_list_get_scope (tags) == GST_TAG_SCOPE_GLOBAL) { + g_mutex_lock (&self->lock); + if (self->media_info) { + if (self->media_info->tags) + gst_tag_list_unref (self->media_info->tags); + self->media_info->tags = gst_tag_list_ref (tags); + media_info_update (self, self->media_info); + g_mutex_unlock (&self->lock); + on_media_info_updated (self); + } else { + if (self->global_tags) + gst_tag_list_unref (self->global_tags); + self->global_tags = gst_tag_list_ref (tags); + g_mutex_unlock (&self->lock); + } + } + + gst_tag_list_unref (tags); +} + +static void +element_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + const GstStructure *s; + + s = gst_message_get_structure (msg); + if (gst_structure_has_name (s, "redirect")) { + const gchar *new_location; + + new_location = gst_structure_get_string (s, "new-location"); + if (!new_location) { + const GValue *locations_list, *location_val; + guint i, size; + + locations_list = gst_structure_get_value (s, "locations"); + size = gst_value_list_get_size (locations_list); + for (i = 0; i < size; ++i) { + const GstStructure *location_s; + + location_val = gst_value_list_get_value (locations_list, i); + if (!GST_VALUE_HOLDS_STRUCTURE (location_val)) + continue; + + location_s = (const GstStructure *) g_value_get_boxed (location_val); + if (!gst_structure_has_name (location_s, "redirect")) + continue; + + new_location = gst_structure_get_string (location_s, "new-location"); + if (new_location) + break; + } + } + + if (new_location) { + GstState target_state; + + GST_DEBUG_OBJECT (self, "Redirect to '%s'", new_location); + + /* Remember target state and restore after setting the URI */ + target_state = self->target_state; + + gst_play_stop_internal (self, TRUE); + + g_mutex_lock 
(&self->lock); + g_free (self->redirect_uri); + self->redirect_uri = g_strdup (new_location); + g_object_set (self->playbin, "uri", self->redirect_uri, NULL); + g_mutex_unlock (&self->lock); + + if (target_state == GST_STATE_PAUSED) + gst_play_pause_internal (self); + else if (target_state == GST_STATE_PLAYING) + gst_play_play_internal (self); + } + } +} + +/* Must be called with lock */ +static gboolean +update_stream_collection (GstPlay * self, GstStreamCollection * collection) +{ + if (self->collection && self->collection == collection) + return FALSE; + + if (self->collection && self->stream_notify_id) + g_signal_handler_disconnect (self->collection, self->stream_notify_id); + + gst_object_replace ((GstObject **) & self->collection, + (GstObject *) collection); + if (self->media_info) { + gst_object_unref (self->media_info); + self->media_info = gst_play_media_info_create (self); + } + + self->stream_notify_id = + g_signal_connect (self->collection, "stream-notify", + G_CALLBACK (stream_notify_cb), self); + + return TRUE; +} + +static void +stream_collection_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstStreamCollection *collection = NULL; + gboolean updated = FALSE; + + gst_message_parse_stream_collection (msg, &collection); + + if (!collection) + return; + + g_mutex_lock (&self->lock); + updated = update_stream_collection (self, collection); + gst_object_unref (collection); + g_mutex_unlock (&self->lock); + + if (self->media_info && updated) + on_media_info_updated (self); +} + +static void +streams_selected_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, + gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstStreamCollection *collection = NULL; + gboolean updated = FALSE; + guint i, len; + + gst_message_parse_streams_selected (msg, &collection); + + if (!collection) + return; + + g_mutex_lock (&self->lock); + updated = update_stream_collection (self, collection); + 
gst_object_unref (collection); + + g_free (self->video_sid); + g_free (self->audio_sid); + g_free (self->subtitle_sid); + self->video_sid = NULL; + self->audio_sid = NULL; + self->subtitle_sid = NULL; + + len = gst_message_streams_selected_get_size (msg); + for (i = 0; i < len; i++) { + GstStream *stream; + GstStreamType stream_type; + const gchar *stream_id; + gchar **current_sid; + stream = gst_message_streams_selected_get_stream (msg, i); + stream_type = gst_stream_get_stream_type (stream); + stream_id = gst_stream_get_stream_id (stream); + if (stream_type & GST_STREAM_TYPE_AUDIO) + current_sid = &self->audio_sid; + else if (stream_type & GST_STREAM_TYPE_VIDEO) + current_sid = &self->video_sid; + else if (stream_type & GST_STREAM_TYPE_TEXT) + current_sid = &self->subtitle_sid; + else { + GST_WARNING_OBJECT (self, + "Unknown stream-id %s with type 0x%x", stream_id, stream_type); + continue; + } + + if (G_UNLIKELY (*current_sid)) { + GST_FIXME_OBJECT (self, + "Multiple streams are selected for type %s, choose the first one", + gst_stream_type_get_name (stream_type)); + continue; + } + + *current_sid = g_strdup (stream_id); + } + g_mutex_unlock (&self->lock); + + if (self->media_info && updated) + on_media_info_updated (self); +} + +static void +play_set_flag (GstPlay * self, gint pos) +{ + gint flags; + + g_object_get (self->playbin, "flags", &flags, NULL); + flags |= pos; + g_object_set (self->playbin, "flags", flags, NULL); + + GST_DEBUG_OBJECT (self, "setting flags=%#x", flags); +} + +static void +play_clear_flag (GstPlay * self, gint pos) +{ + gint flags; + + g_object_get (self->playbin, "flags", &flags, NULL); + flags &= ~pos; + g_object_set (self->playbin, "flags", flags, NULL); + + GST_DEBUG_OBJECT (self, "setting flags=%#x", flags); +} + +/* + * on_media_info_updated: + * + * create a new copy of self->media_info object and post it to the user + * application. 
+ */ +static void +on_media_info_updated (GstPlay * self) +{ + GstPlayMediaInfo *media_info_copy; + + g_mutex_lock (&self->lock); + media_info_copy = gst_play_media_info_copy (self->media_info); + g_mutex_unlock (&self->lock); + + api_bus_post_message (self, GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED, + GST_PLAY_MESSAGE_DATA_MEDIA_INFO, GST_TYPE_PLAY_MEDIA_INFO, + media_info_copy, NULL); + g_object_unref (media_info_copy); +} + +static GstCaps * +get_caps (GstPlay * self, gint stream_index, GType type) +{ + GstPad *pad = NULL; + GstCaps *caps = NULL; + + if (type == GST_TYPE_PLAY_VIDEO_INFO) + g_signal_emit_by_name (G_OBJECT (self->playbin), + "get-video-pad", stream_index, &pad); + else if (type == GST_TYPE_PLAY_AUDIO_INFO) + g_signal_emit_by_name (G_OBJECT (self->playbin), + "get-audio-pad", stream_index, &pad); + else + g_signal_emit_by_name (G_OBJECT (self->playbin), + "get-text-pad", stream_index, &pad); + + if (pad) { + caps = gst_pad_get_current_caps (pad); + gst_object_unref (pad); + } + + return caps; +} + +static void +gst_play_subtitle_info_update (GstPlay * self, GstPlayStreamInfo * stream_info) +{ + GstPlaySubtitleInfo *info = (GstPlaySubtitleInfo *) stream_info; + + if (stream_info->tags) { + + /* free the old language info */ + g_free (info->language); + info->language = NULL; + + /* First try to get the language full name from tag, if name is not + * available then try language code. If we find the language code + * then use gstreamer api to translate code to full name. 
*/ + gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_NAME, + &info->language); + if (!info->language) { + gchar *lang_code = NULL; + + gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_CODE, + &lang_code); + if (lang_code) { + info->language = g_strdup (gst_tag_get_language_name (lang_code)); + g_free (lang_code); + } + } + + /* If we still failed to find the language name, check whether an external + * subtitle is loaded, compare the current subtitle stream index with our + * stream index, and if they match declare it an external subtitle and + * use its filename. + */ + if (!info->language) { + gint text_index = -1; + gchar *suburi = NULL; + + g_object_get (G_OBJECT (self->playbin), "current-suburi", &suburi, NULL); + if (suburi) { + if (self->use_playbin3) { + if (g_str_equal (self->subtitle_sid, stream_info->stream_id)) + info->language = g_path_get_basename (suburi); + } else { + g_object_get (G_OBJECT (self->playbin), "current-text", &text_index, + NULL); + if (text_index == gst_play_stream_info_get_index (stream_info)) + info->language = g_path_get_basename (suburi); + } + g_free (suburi); + } + } + + } else { + g_free (info->language); + info->language = NULL; + } + + GST_DEBUG_OBJECT (self, "language=%s", info->language); +} + +static void +gst_play_video_info_update (GstPlay * self, GstPlayStreamInfo * stream_info) +{ + GstPlayVideoInfo *info = (GstPlayVideoInfo *) stream_info; + + if (stream_info->caps) { + GstStructure *s; + + s = gst_caps_get_structure (stream_info->caps, 0); + if (s) { + gint width, height; + gint fps_n, fps_d; + gint par_n, par_d; + + if (gst_structure_get_int (s, "width", &width)) + info->width = width; + else + info->width = -1; + + if (gst_structure_get_int (s, "height", &height)) + info->height = height; + else + info->height = -1; + + if (gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d)) { + info->framerate_num = fps_n; + info->framerate_denom = fps_d; + } else { + 
info->framerate_num = 0; + info->framerate_denom = 1; + } + + + if (gst_structure_get_fraction (s, "pixel-aspect-ratio", &par_n, &par_d)) { + info->par_num = par_n; + info->par_denom = par_d; + } else { + info->par_num = 1; + info->par_denom = 1; + } + } + } else { + info->width = info->height = -1; + info->par_num = info->par_denom = 1; + info->framerate_num = 0; + info->framerate_denom = 1; + } + + if (stream_info->tags) { + guint bitrate, max_bitrate; + + if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_BITRATE, &bitrate)) + info->bitrate = bitrate; + else + info->bitrate = -1; + + if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_MAXIMUM_BITRATE, + &max_bitrate) || gst_tag_list_get_uint (stream_info->tags, + GST_TAG_NOMINAL_BITRATE, &max_bitrate)) + info->max_bitrate = max_bitrate; + else + info->max_bitrate = -1; + } else { + info->bitrate = info->max_bitrate = -1; + } + + GST_DEBUG_OBJECT (self, "width=%d height=%d fps=%.2f par=%d:%d " + "bitrate=%d max_bitrate=%d", info->width, info->height, + (gdouble) info->framerate_num / info->framerate_denom, + info->par_num, info->par_denom, info->bitrate, info->max_bitrate); +} + +static void +gst_play_audio_info_update (GstPlay * self, GstPlayStreamInfo * stream_info) +{ + GstPlayAudioInfo *info = (GstPlayAudioInfo *) stream_info; + + if (stream_info->caps) { + GstStructure *s; + + s = gst_caps_get_structure (stream_info->caps, 0); + if (s) { + gint rate, channels; + + if (gst_structure_get_int (s, "rate", &rate)) + info->sample_rate = rate; + else + info->sample_rate = -1; + + if (gst_structure_get_int (s, "channels", &channels)) + info->channels = channels; + else + info->channels = 0; + } + } else { + info->sample_rate = -1; + info->channels = 0; + } + + if (stream_info->tags) { + guint bitrate, max_bitrate; + + if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_BITRATE, &bitrate)) + info->bitrate = bitrate; + else + info->bitrate = -1; + + if (gst_tag_list_get_uint (stream_info->tags, 
GST_TAG_MAXIMUM_BITRATE, + &max_bitrate) || gst_tag_list_get_uint (stream_info->tags, + GST_TAG_NOMINAL_BITRATE, &max_bitrate)) + info->max_bitrate = max_bitrate; + else + info->max_bitrate = -1; + + /* if we have an old language then free it */ + g_free (info->language); + info->language = NULL; + + /* First try to get the language full name from tag, if name is not + * available then try language code. If we find the language code + * then use gstreamer api to translate code to full name. + */ + gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_NAME, + &info->language); + if (!info->language) { + gchar *lang_code = NULL; + + gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_CODE, + &lang_code); + if (lang_code) { + info->language = g_strdup (gst_tag_get_language_name (lang_code)); + g_free (lang_code); + } + } + } else { + g_free (info->language); + info->language = NULL; + info->max_bitrate = info->bitrate = -1; + } + + GST_DEBUG_OBJECT (self, "language=%s rate=%d channels=%d bitrate=%d " + "max_bitrate=%d", info->language, info->sample_rate, info->channels, + info->bitrate, info->max_bitrate); +} + +static GstPlayStreamInfo * +gst_play_stream_info_find (GstPlayMediaInfo * media_info, + GType type, gint stream_index) +{ + GList *list, *l; + GstPlayStreamInfo *info = NULL; + + if (!media_info) + return NULL; + + list = gst_play_media_info_get_stream_list (media_info); + for (l = list; l != NULL; l = l->next) { + info = (GstPlayStreamInfo *) l->data; + if ((G_OBJECT_TYPE (info) == type) && (info->stream_index == stream_index)) { + return info; + } + } + + return NULL; +} + +static GstPlayStreamInfo * +gst_play_stream_info_find_from_stream_id (GstPlayMediaInfo * media_info, + const gchar * stream_id) +{ + GList *list, *l; + GstPlayStreamInfo *info = NULL; + + if (!media_info) + return NULL; + + list = gst_play_media_info_get_stream_list (media_info); + for (l = list; l != NULL; l = l->next) { + info = (GstPlayStreamInfo *) l->data; + if 
(g_str_equal (info->stream_id, stream_id)) { + return info; + } + } + + return NULL; +} + +static gboolean +is_track_enabled (GstPlay * self, gint pos) +{ + gint flags; + + g_object_get (G_OBJECT (self->playbin), "flags", &flags, NULL); + + if ((flags & pos)) + return TRUE; + + return FALSE; +} + +static GstPlayStreamInfo * +gst_play_stream_info_get_current (GstPlay * self, const gchar * prop, + GType type) +{ + gint current; + GstPlayStreamInfo *info; + + if (!self->media_info) + return NULL; + + g_object_get (G_OBJECT (self->playbin), prop, &current, NULL); + g_mutex_lock (&self->lock); + info = gst_play_stream_info_find (self->media_info, type, current); + if (info) + info = gst_play_stream_info_copy (info); + g_mutex_unlock (&self->lock); + + return info; +} + +static GstPlayStreamInfo * +gst_play_stream_info_get_current_from_stream_id (GstPlay * self, + const gchar * stream_id, GType type) +{ + GstPlayStreamInfo *info; + + if (!self->media_info || !stream_id) + return NULL; + + g_mutex_lock (&self->lock); + info = gst_play_stream_info_find_from_stream_id (self->media_info, stream_id); + if (info && G_OBJECT_TYPE (info) == type) + info = gst_play_stream_info_copy (info); + else + info = NULL; + g_mutex_unlock (&self->lock); + + return info; +} + +static void +stream_notify_cb (GstStreamCollection * collection, GstStream * stream, + GParamSpec * pspec, GstPlay * self) +{ + GstPlayStreamInfo *info; + const gchar *stream_id; + gboolean emit_signal = FALSE; + + if (!self->media_info) + return; + + if (G_PARAM_SPEC_VALUE_TYPE (pspec) != GST_TYPE_CAPS && + G_PARAM_SPEC_VALUE_TYPE (pspec) != GST_TYPE_TAG_LIST) + return; + + stream_id = gst_stream_get_stream_id (stream); + g_mutex_lock (&self->lock); + info = gst_play_stream_info_find_from_stream_id (self->media_info, stream_id); + if (info) { + gst_play_stream_info_update_from_stream (self, info, stream); + emit_signal = TRUE; + } + g_mutex_unlock (&self->lock); + + if (emit_signal) + on_media_info_updated (self); +} + 
+static void +gst_play_stream_info_update (GstPlay * self, GstPlayStreamInfo * s) +{ + if (GST_IS_PLAY_VIDEO_INFO (s)) + gst_play_video_info_update (self, s); + else if (GST_IS_PLAY_AUDIO_INFO (s)) + gst_play_audio_info_update (self, s); + else + gst_play_subtitle_info_update (self, s); +} + +static gchar * +stream_info_get_codec (GstPlayStreamInfo * s) +{ + const gchar *type; + GstTagList *tags; + gchar *codec = NULL; + + if (GST_IS_PLAY_VIDEO_INFO (s)) + type = GST_TAG_VIDEO_CODEC; + else if (GST_IS_PLAY_AUDIO_INFO (s)) + type = GST_TAG_AUDIO_CODEC; + else + type = GST_TAG_SUBTITLE_CODEC; + + tags = gst_play_stream_info_get_tags (s); + if (tags) { + gst_tag_list_get_string (tags, type, &codec); + if (!codec) + gst_tag_list_get_string (tags, GST_TAG_CODEC, &codec); + } + + if (!codec) { + GstCaps *caps; + caps = gst_play_stream_info_get_caps (s); + if (caps) { + codec = gst_pb_utils_get_codec_description (caps); + } + } + + return codec; +} + +static void +gst_play_stream_info_update_tags_and_caps (GstPlay * self, + GstPlayStreamInfo * s) +{ + GstTagList *tags; + gint stream_index; + + stream_index = gst_play_stream_info_get_index (s); + + if (GST_IS_PLAY_VIDEO_INFO (s)) + g_signal_emit_by_name (self->playbin, "get-video-tags", + stream_index, &tags); + else if (GST_IS_PLAY_AUDIO_INFO (s)) + g_signal_emit_by_name (self->playbin, "get-audio-tags", + stream_index, &tags); + else + g_signal_emit_by_name (self->playbin, "get-text-tags", stream_index, &tags); + + if (s->tags) + gst_tag_list_unref (s->tags); + s->tags = tags; + + if (s->caps) + gst_caps_unref (s->caps); + s->caps = get_caps (self, stream_index, G_OBJECT_TYPE (s)); + + g_free (s->codec); + s->codec = stream_info_get_codec (s); + + GST_DEBUG_OBJECT (self, "%s index: %d tags: %p caps: %p", + gst_play_stream_info_get_stream_type (s), stream_index, s->tags, s->caps); + + gst_play_stream_info_update (self, s); +} + +static void +gst_play_streams_info_create (GstPlay * self, + GstPlayMediaInfo * media_info, 
const gchar * prop, GType type) +{ + gint i; + gint total = -1; + GstPlayStreamInfo *s; + + if (!media_info) + return; + + g_object_get (G_OBJECT (self->playbin), prop, &total, NULL); + + GST_DEBUG_OBJECT (self, "%s: %d", prop, total); + + for (i = 0; i < total; i++) { + /* check if stream already exist in the list */ + s = gst_play_stream_info_find (media_info, type, i); + + if (!s) { + /* create a new stream info instance */ + s = gst_play_stream_info_new (i, type); + + /* add the object in stream list */ + media_info->stream_list = g_list_append (media_info->stream_list, s); + + /* based on type, add the object in its corresponding stream_ list */ + if (GST_IS_PLAY_AUDIO_INFO (s)) + media_info->audio_stream_list = g_list_append + (media_info->audio_stream_list, s); + else if (GST_IS_PLAY_VIDEO_INFO (s)) + media_info->video_stream_list = g_list_append + (media_info->video_stream_list, s); + else + media_info->subtitle_stream_list = g_list_append + (media_info->subtitle_stream_list, s); + + GST_DEBUG_OBJECT (self, "create %s stream stream_index: %d", + gst_play_stream_info_get_stream_type (s), i); + } + + gst_play_stream_info_update_tags_and_caps (self, s); + } +} + +static void +gst_play_stream_info_update_from_stream (GstPlay * self, + GstPlayStreamInfo * s, GstStream * stream) +{ + if (s->tags) + gst_tag_list_unref (s->tags); + s->tags = gst_stream_get_tags (stream); + + if (s->caps) + gst_caps_unref (s->caps); + s->caps = gst_stream_get_caps (stream); + + g_free (s->codec); + s->codec = stream_info_get_codec (s); + + GST_DEBUG_OBJECT (self, "%s index: %d tags: %p caps: %p", + gst_play_stream_info_get_stream_type (s), s->stream_index, + s->tags, s->caps); + + gst_play_stream_info_update (self, s); +} + +static void +gst_play_streams_info_create_from_collection (GstPlay * self, + GstPlayMediaInfo * media_info, GstStreamCollection * collection) +{ + guint i; + guint total; + GstPlayStreamInfo *s; + guint n_audio = 0; + guint n_video = 0; + guint n_text = 0; + + 
if (!media_info || !collection) + return; + + total = gst_stream_collection_get_size (collection); + + for (i = 0; i < total; i++) { + GstStream *stream = gst_stream_collection_get_stream (collection, i); + GstStreamType stream_type = gst_stream_get_stream_type (stream); + const gchar *stream_id = gst_stream_get_stream_id (stream); + + if (stream_type & GST_STREAM_TYPE_AUDIO) { + s = gst_play_stream_info_new (n_audio, GST_TYPE_PLAY_AUDIO_INFO); + n_audio++; + } else if (stream_type & GST_STREAM_TYPE_VIDEO) { + s = gst_play_stream_info_new (n_video, GST_TYPE_PLAY_VIDEO_INFO); + n_video++; + } else if (stream_type & GST_STREAM_TYPE_TEXT) { + s = gst_play_stream_info_new (n_text, GST_TYPE_PLAY_SUBTITLE_INFO); + n_text++; + } else { + GST_DEBUG_OBJECT (self, "Unknown type stream %d", i); + continue; + } + + s->stream_id = g_strdup (stream_id); + + /* add the object in stream list */ + media_info->stream_list = g_list_append (media_info->stream_list, s); + + /* based on type, add the object in its corresponding stream_ list */ + if (GST_IS_PLAY_AUDIO_INFO (s)) + media_info->audio_stream_list = g_list_append + (media_info->audio_stream_list, s); + else if (GST_IS_PLAY_VIDEO_INFO (s)) + media_info->video_stream_list = g_list_append + (media_info->video_stream_list, s); + else + media_info->subtitle_stream_list = g_list_append + (media_info->subtitle_stream_list, s); + + GST_DEBUG_OBJECT (self, "create %s stream stream_index: %d", + gst_play_stream_info_get_stream_type (s), s->stream_index); + + gst_play_stream_info_update_from_stream (self, s, stream); + } +} + +static void +video_changed_cb (G_GNUC_UNUSED GObject * object, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + g_mutex_lock (&self->lock); + gst_play_streams_info_create (self, self->media_info, + "n-video", GST_TYPE_PLAY_VIDEO_INFO); + g_mutex_unlock (&self->lock); +} + +static void +audio_changed_cb (G_GNUC_UNUSED GObject * object, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); 
+ + g_mutex_lock (&self->lock); + gst_play_streams_info_create (self, self->media_info, + "n-audio", GST_TYPE_PLAY_AUDIO_INFO); + g_mutex_unlock (&self->lock); +} + +static void +subtitle_changed_cb (G_GNUC_UNUSED GObject * object, gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + g_mutex_lock (&self->lock); + gst_play_streams_info_create (self, self->media_info, + "n-text", GST_TYPE_PLAY_SUBTITLE_INFO); + g_mutex_unlock (&self->lock); +} + +static void * +get_title (GstTagList * tags) +{ + gchar *title = NULL; + + gst_tag_list_get_string (tags, GST_TAG_TITLE, &title); + if (!title) + gst_tag_list_get_string (tags, GST_TAG_TITLE_SORTNAME, &title); + + return title; +} + +static void * +get_container_format (GstTagList * tags) +{ + gchar *container = NULL; + + gst_tag_list_get_string (tags, GST_TAG_CONTAINER_FORMAT, &container); + + /* TODO: If container is not available then maybe consider + * parsing caps or file extension to guess the container format. + */ + + return container; +} + +static void * +get_from_tags (GstPlay * self, GstPlayMediaInfo * media_info, + void *(*func) (GstTagList *)) +{ + GList *l; + void *ret = NULL; + + if (media_info->tags) { + ret = func (media_info->tags); + if (ret) + return ret; + } + + /* if the global tag does not exist then try video and audio streams */ + GST_DEBUG_OBJECT (self, "trying video tags"); + for (l = gst_play_media_info_get_video_streams (media_info); l != NULL; + l = l->next) { + GstTagList *tags; + + tags = gst_play_stream_info_get_tags ((GstPlayStreamInfo *) l->data); + if (tags) + ret = func (tags); + + if (ret) + return ret; + } + + GST_DEBUG_OBJECT (self, "trying audio tags"); + for (l = gst_play_media_info_get_audio_streams (media_info); l != NULL; + l = l->next) { + GstTagList *tags; + + tags = gst_play_stream_info_get_tags ((GstPlayStreamInfo *) l->data); + if (tags) + ret = func (tags); + + if (ret) + return ret; + } + + GST_DEBUG_OBJECT (self, "failed to get the information from tags"); + 
return NULL; +} + +static void * +get_cover_sample (GstTagList * tags) +{ + GstSample *cover_sample = NULL; + + gst_tag_list_get_sample (tags, GST_TAG_IMAGE, &cover_sample); + if (!cover_sample) + gst_tag_list_get_sample (tags, GST_TAG_PREVIEW_IMAGE, &cover_sample); + + return cover_sample; +} + +static GstPlayMediaInfo * +gst_play_media_info_create (GstPlay * self) +{ + GstPlayMediaInfo *media_info; + GstQuery *query; + + GST_DEBUG_OBJECT (self, "begin"); + media_info = gst_play_media_info_new (self->uri); + media_info->duration = gst_play_get_duration (self); + media_info->tags = self->global_tags; + media_info->is_live = self->is_live; + self->global_tags = NULL; + + query = gst_query_new_seeking (GST_FORMAT_TIME); + if (gst_element_query (self->playbin, query)) + gst_query_parse_seeking (query, NULL, &media_info->seekable, NULL, NULL); + gst_query_unref (query); + + if (self->use_playbin3 && self->collection) { + gst_play_streams_info_create_from_collection (self, media_info, + self->collection); + } else { + /* create audio/video/sub streams */ + gst_play_streams_info_create (self, media_info, "n-video", + GST_TYPE_PLAY_VIDEO_INFO); + gst_play_streams_info_create (self, media_info, "n-audio", + GST_TYPE_PLAY_AUDIO_INFO); + gst_play_streams_info_create (self, media_info, "n-text", + GST_TYPE_PLAY_SUBTITLE_INFO); + } + + media_info->title = get_from_tags (self, media_info, get_title); + media_info->container = + get_from_tags (self, media_info, get_container_format); + media_info->image_sample = get_from_tags (self, media_info, get_cover_sample); + + GST_DEBUG_OBJECT (self, "uri: %s title: %s duration: %" GST_TIME_FORMAT + " seekable: %s live: %s container: %s image_sample %p", + media_info->uri, media_info->title, GST_TIME_ARGS (media_info->duration), + media_info->seekable ? "yes" : "no", media_info->is_live ? 
"yes" : "no", + media_info->container, media_info->image_sample); + + GST_DEBUG_OBJECT (self, "end"); + return media_info; +} + +static void +tags_changed_cb (GstPlay * self, gint stream_index, GType type) +{ + GstPlayStreamInfo *s; + + if (!self->media_info) + return; + + /* update the stream information */ + g_mutex_lock (&self->lock); + s = gst_play_stream_info_find (self->media_info, type, stream_index); + gst_play_stream_info_update_tags_and_caps (self, s); + g_mutex_unlock (&self->lock); + + on_media_info_updated (self); +} + +static void +video_tags_changed_cb (G_GNUC_UNUSED GstElement * playbin, gint stream_index, + gpointer user_data) +{ + tags_changed_cb (GST_PLAY (user_data), stream_index, + GST_TYPE_PLAY_VIDEO_INFO); +} + +static void +audio_tags_changed_cb (G_GNUC_UNUSED GstElement * playbin, gint stream_index, + gpointer user_data) +{ + tags_changed_cb (GST_PLAY (user_data), stream_index, + GST_TYPE_PLAY_AUDIO_INFO); +} + +static void +subtitle_tags_changed_cb (G_GNUC_UNUSED GstElement * playbin, gint stream_index, + gpointer user_data) +{ + tags_changed_cb (GST_PLAY (user_data), stream_index, + GST_TYPE_PLAY_SUBTITLE_INFO); +} + +static void +volume_notify_cb (G_GNUC_UNUSED GObject * obj, G_GNUC_UNUSED GParamSpec * pspec, + GstPlay * self) +{ + api_bus_post_message (self, GST_PLAY_MESSAGE_VOLUME_CHANGED, + GST_PLAY_MESSAGE_DATA_VOLUME, G_TYPE_DOUBLE, + gst_play_get_volume (self), NULL); +} + +static void +mute_notify_cb (G_GNUC_UNUSED GObject * obj, G_GNUC_UNUSED GParamSpec * pspec, + GstPlay * self) +{ + + api_bus_post_message (self, GST_PLAY_MESSAGE_MUTE_CHANGED, + GST_PLAY_MESSAGE_DATA_IS_MUTED, G_TYPE_BOOLEAN, + gst_play_get_mute (self), NULL); +} + +static void +source_setup_cb (GstElement * playbin, GstElement * source, GstPlay * self) +{ + gchar *user_agent; + + user_agent = gst_play_config_get_user_agent (self->config); + if (user_agent) { + GParamSpec *prop; + + prop = g_object_class_find_property (G_OBJECT_GET_CLASS (source), + 
"user-agent"); + if (prop && prop->value_type == G_TYPE_STRING) { + GST_INFO_OBJECT (self, "Setting source user-agent: %s", user_agent); + g_object_set (source, "user-agent", user_agent, NULL); + } + + g_free (user_agent); + } +} + +static gpointer +gst_play_main (gpointer data) +{ + GstPlay *self = GST_PLAY (data); + GstBus *bus; + GSource *source; + GstElement *scaletempo; + const gchar *env; + + GST_TRACE_OBJECT (self, "Starting main thread"); + + g_main_context_push_thread_default (self->context); + + source = g_idle_source_new (); + g_source_set_callback (source, (GSourceFunc) main_loop_running_cb, self, + NULL); + g_source_attach (source, self->context); + g_source_unref (source); + + env = g_getenv ("GST_PLAY_USE_PLAYBIN3"); + if (env && g_str_has_prefix (env, "1")) + self->use_playbin3 = TRUE; + + if (self->use_playbin3) { + GST_DEBUG_OBJECT (self, "playbin3 enabled"); + self->playbin = gst_element_factory_make ("playbin3", "playbin3"); + } else { + self->playbin = gst_element_factory_make ("playbin", "playbin"); + } + + if (!self->playbin) { + g_error ("GstPlay: 'playbin' element not found, please check your setup"); + g_assert_not_reached (); + } + + gst_object_ref_sink (self->playbin); + + if (self->video_renderer) { + gst_play_set_playbin_video_sink (self); + } + + scaletempo = gst_element_factory_make ("scaletempo", NULL); + if (scaletempo) { + g_object_set (self->playbin, "audio-filter", scaletempo, NULL); + } else { + g_warning ("GstPlay: scaletempo element not available. 
Audio pitch " + "will not be preserved during trick modes"); + } + + self->bus = bus = gst_element_get_bus (self->playbin); + gst_bus_add_signal_watch (bus); + + g_signal_connect (G_OBJECT (bus), "message::error", G_CALLBACK (error_cb), + self); + g_signal_connect (G_OBJECT (bus), "message::warning", G_CALLBACK (warning_cb), + self); + g_signal_connect (G_OBJECT (bus), "message::eos", G_CALLBACK (eos_cb), self); + g_signal_connect (G_OBJECT (bus), "message::state-changed", + G_CALLBACK (state_changed_cb), self); + g_signal_connect (G_OBJECT (bus), "message::buffering", + G_CALLBACK (buffering_cb), self); + g_signal_connect (G_OBJECT (bus), "message::clock-lost", + G_CALLBACK (clock_lost_cb), self); + g_signal_connect (G_OBJECT (bus), "message::duration-changed", + G_CALLBACK (duration_changed_cb), self); + g_signal_connect (G_OBJECT (bus), "message::latency", + G_CALLBACK (latency_cb), self); + g_signal_connect (G_OBJECT (bus), "message::request-state", + G_CALLBACK (request_state_cb), self); + g_signal_connect (G_OBJECT (bus), "message::element", + G_CALLBACK (element_cb), self); + g_signal_connect (G_OBJECT (bus), "message::tag", G_CALLBACK (tags_cb), self); + + if (self->use_playbin3) { + g_signal_connect (G_OBJECT (bus), "message::stream-collection", + G_CALLBACK (stream_collection_cb), self); + g_signal_connect (G_OBJECT (bus), "message::streams-selected", + G_CALLBACK (streams_selected_cb), self); + } else { + g_signal_connect (self->playbin, "video-changed", + G_CALLBACK (video_changed_cb), self); + g_signal_connect (self->playbin, "audio-changed", + G_CALLBACK (audio_changed_cb), self); + g_signal_connect (self->playbin, "text-changed", + G_CALLBACK (subtitle_changed_cb), self); + + g_signal_connect (self->playbin, "video-tags-changed", + G_CALLBACK (video_tags_changed_cb), self); + g_signal_connect (self->playbin, "audio-tags-changed", + G_CALLBACK (audio_tags_changed_cb), self); + g_signal_connect (self->playbin, "text-tags-changed", + G_CALLBACK 
(subtitle_tags_changed_cb), self); + } + + g_signal_connect (self->playbin, "notify::volume", + G_CALLBACK (volume_notify_cb), self); + g_signal_connect (self->playbin, "notify::mute", + G_CALLBACK (mute_notify_cb), self); + g_signal_connect (self->playbin, "source-setup", + G_CALLBACK (source_setup_cb), self); + + self->target_state = GST_STATE_NULL; + self->current_state = GST_STATE_NULL; + change_state (self, GST_PLAY_STATE_STOPPED); + self->buffering_percent = 100; + self->is_eos = FALSE; + self->is_live = FALSE; + self->rate = 1.0; + + GST_TRACE_OBJECT (self, "Starting main loop"); + g_main_loop_run (self->loop); + GST_TRACE_OBJECT (self, "Stopped main loop"); + + gst_bus_remove_signal_watch (bus); + gst_object_unref (bus); + + remove_tick_source (self); + remove_ready_timeout_source (self); + + g_mutex_lock (&self->lock); + if (self->media_info) { + g_object_unref (self->media_info); + self->media_info = NULL; + } + + remove_seek_source (self); + g_mutex_unlock (&self->lock); + + g_main_context_pop_thread_default (self->context); + + self->target_state = GST_STATE_NULL; + self->current_state = GST_STATE_NULL; + if (self->playbin) { + gst_element_set_state (self->playbin, GST_STATE_NULL); + gst_object_unref (self->playbin); + self->playbin = NULL; + } + + GST_TRACE_OBJECT (self, "Stopped main thread"); + + return NULL; +} + +static gpointer +gst_play_init_once (G_GNUC_UNUSED gpointer user_data) +{ + gst_init (NULL, NULL); + + GST_DEBUG_CATEGORY_INIT (gst_play_debug, "gst-play", 0, "GstPlay"); + gst_play_error_quark (); + + return NULL; +} + +/** + * gst_play_new: + * @video_renderer: (transfer full) (allow-none): GstPlayVideoRenderer to use + * + * Creates a new #GstPlay instance. + * + * Video is going to be rendered by @video_renderer, or if %NULL is provided + * no special video set up will be done and some default handling will be + * performed. 
+ * + * Returns: (transfer full): a new #GstPlay instance + * Since: 1.20 + */ +GstPlay * +gst_play_new (GstPlayVideoRenderer * video_renderer) +{ + static GOnce once = G_ONCE_INIT; + GstPlay *self; + + g_once (&once, gst_play_init_once, NULL); + + self = g_object_new (GST_TYPE_PLAY, NULL); + + // When the video_renderer is a GstPlayerWrappedVideoRenderer it cannot be set + // at construction time because it requires a valid pipeline which is created + // only after GstPlay has been constructed. That is why the video renderer is + // set *after* GstPlay has been constructed. + if (video_renderer != NULL) { + g_object_set (self, "video-renderer", video_renderer, NULL); + } + gst_object_ref_sink (self); + + if (video_renderer) + g_object_unref (video_renderer); + + return self; +} + +/** + * gst_play_get_message_bus: + * @play: #GstPlay instance + * + * GstPlay API exposes a #GstBus instance which purpose is to provide data + * structures representing play-internal events in form of #GstMessage<!-- -->s of + * type GST_MESSAGE_APPLICATION. + * + * Each message carries a "play-message" field of type #GstPlayMessage. + * Further fields of the message data are specific to each possible value of + * that enumeration. + * + * Applications can consume the messages asynchronously within their own + * event-loop / UI-thread etc. Note that in case the application does not + * consume the messages, the bus will accumulate these internally and eventually + * fill memory. To avoid that, the bus has to be set "flushing". 
+ * + * Returns: (transfer full): The play message bus instance + * + * Since: 1.20 + */ +GstBus * +gst_play_get_message_bus (GstPlay * self) +{ + return g_object_ref (self->api_bus); +} + +static gboolean +gst_play_play_internal (gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstStateChangeReturn state_ret; + + GST_DEBUG_OBJECT (self, "Play"); + + g_mutex_lock (&self->lock); + if (!self->uri) { + g_mutex_unlock (&self->lock); + return G_SOURCE_REMOVE; + } + g_mutex_unlock (&self->lock); + + remove_ready_timeout_source (self); + self->target_state = GST_STATE_PLAYING; + + if (self->current_state < GST_STATE_PAUSED) + change_state (self, GST_PLAY_STATE_BUFFERING); + + if (self->current_state >= GST_STATE_PAUSED && !self->is_eos + && self->buffering_percent >= 100 + && !(self->seek_position != GST_CLOCK_TIME_NONE || self->seek_pending)) { + state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING); + } else { + state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED); + } + + if (state_ret == GST_STATE_CHANGE_FAILURE) { + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to play"), NULL); + return G_SOURCE_REMOVE; + } else if (state_ret == GST_STATE_CHANGE_NO_PREROLL) { + self->is_live = TRUE; + GST_DEBUG_OBJECT (self, "Pipeline is live"); + } + + if (self->is_eos) { + gboolean ret; + + GST_DEBUG_OBJECT (self, "Was EOS, seeking to beginning"); + self->is_eos = FALSE; + ret = + gst_element_seek_simple (self->playbin, GST_FORMAT_TIME, + GST_SEEK_FLAG_FLUSH, 0); + if (!ret) { + GST_ERROR_OBJECT (self, "Seek to beginning failed"); + gst_play_stop_internal (self, TRUE); + gst_play_play_internal (self); + } + } + + return G_SOURCE_REMOVE; +} + +/** + * gst_play_play: + * @play: #GstPlay instance + * + * Request to play the loaded stream. 
+ * Since: 1.20 + */ +void +gst_play_play (GstPlay * self) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT, + gst_play_play_internal, self, NULL); +} + +static gboolean +gst_play_pause_internal (gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + GstStateChangeReturn state_ret; + + GST_DEBUG_OBJECT (self, "Pause"); + + g_mutex_lock (&self->lock); + if (!self->uri) { + g_mutex_unlock (&self->lock); + return G_SOURCE_REMOVE; + } + g_mutex_unlock (&self->lock); + + tick_cb (self); + remove_tick_source (self); + remove_ready_timeout_source (self); + + self->target_state = GST_STATE_PAUSED; + + if (self->current_state < GST_STATE_PAUSED) + change_state (self, GST_PLAY_STATE_BUFFERING); + + state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED); + if (state_ret == GST_STATE_CHANGE_FAILURE) { + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to pause"), NULL); + return G_SOURCE_REMOVE; + } else if (state_ret == GST_STATE_CHANGE_NO_PREROLL) { + self->is_live = TRUE; + GST_DEBUG_OBJECT (self, "Pipeline is live"); + } + + if (self->is_eos) { + gboolean ret; + + GST_DEBUG_OBJECT (self, "Was EOS, seeking to beginning"); + self->is_eos = FALSE; + ret = + gst_element_seek_simple (self->playbin, GST_FORMAT_TIME, + GST_SEEK_FLAG_FLUSH, 0); + if (!ret) { + GST_ERROR_OBJECT (self, "Seek to beginning failed"); + gst_play_stop_internal (self, TRUE); + gst_play_pause_internal (self); + } + } + + return G_SOURCE_REMOVE; +} + +/** + * gst_play_pause: + * @play: #GstPlay instance + * + * Pauses the current stream. 
+ * Since: 1.20 + */ +void +gst_play_pause (GstPlay * self) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT, + gst_play_pause_internal, self, NULL); +} + +static void +gst_play_stop_internal (GstPlay * self, gboolean transient) +{ + /* directly return if we're already stopped */ + if (self->current_state <= GST_STATE_READY && + self->target_state <= GST_STATE_READY) + return; + + GST_DEBUG_OBJECT (self, "Stop (transient %d)", transient); + + tick_cb (self); + remove_tick_source (self); + + add_ready_timeout_source (self); + + self->target_state = GST_STATE_NULL; + self->current_state = GST_STATE_READY; + self->is_live = FALSE; + self->is_eos = FALSE; + gst_bus_set_flushing (self->bus, TRUE); + gst_element_set_state (self->playbin, GST_STATE_READY); + gst_bus_set_flushing (self->bus, FALSE); + change_state (self, transient + && self->app_state != + GST_PLAY_STATE_STOPPED ? GST_PLAY_STATE_BUFFERING : + GST_PLAY_STATE_STOPPED); + self->buffering_percent = 100; + self->cached_duration = GST_CLOCK_TIME_NONE; + g_mutex_lock (&self->lock); + if (self->media_info) { + g_object_unref (self->media_info); + self->media_info = NULL; + } + if (self->global_tags) { + gst_tag_list_unref (self->global_tags); + self->global_tags = NULL; + } + self->seek_pending = FALSE; + remove_seek_source (self); + self->seek_position = GST_CLOCK_TIME_NONE; + self->last_seek_time = GST_CLOCK_TIME_NONE; + self->rate = 1.0; + if (self->collection) { + if (self->stream_notify_id) + g_signal_handler_disconnect (self->collection, self->stream_notify_id); + self->stream_notify_id = 0; + gst_object_unref (self->collection); + self->collection = NULL; + } + g_free (self->video_sid); + g_free (self->audio_sid); + g_free (self->subtitle_sid); + self->video_sid = NULL; + self->audio_sid = NULL; + self->subtitle_sid = NULL; + g_mutex_unlock (&self->lock); +} + +static gboolean +gst_play_stop_internal_dispatch (gpointer user_data) +{ + GstPlay 
*self = GST_PLAY (user_data); + + gst_play_stop_internal (self, FALSE); + + return G_SOURCE_REMOVE; +} + + +/** + * gst_play_stop: + * @play: #GstPlay instance + * + * Stops playing the current stream and resets to the first position + * in the stream. + * Since: 1.20 + */ +void +gst_play_stop (GstPlay * self) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT, + gst_play_stop_internal_dispatch, self, NULL); +} + +/* Must be called with lock from main context, releases lock! */ +static void +gst_play_seek_internal_locked (GstPlay * self) +{ + gboolean ret; + GstClockTime position; + gdouble rate; + GstStateChangeReturn state_ret; + GstEvent *s_event; + GstSeekFlags flags = 0; + gboolean accurate = FALSE; + + remove_seek_source (self); + + /* Only seek in PAUSED */ + if (self->current_state < GST_STATE_PAUSED) { + return; + } else if (self->current_state != GST_STATE_PAUSED) { + g_mutex_unlock (&self->lock); + state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED); + if (state_ret == GST_STATE_CHANGE_FAILURE) { + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to seek"), NULL); + g_mutex_lock (&self->lock); + return; + } + g_mutex_lock (&self->lock); + return; + } + + self->last_seek_time = gst_util_get_timestamp (); + position = self->seek_position; + self->seek_position = GST_CLOCK_TIME_NONE; + self->seek_pending = TRUE; + rate = self->rate; + g_mutex_unlock (&self->lock); + + remove_tick_source (self); + self->is_eos = FALSE; + + flags |= GST_SEEK_FLAG_FLUSH; + + accurate = gst_play_config_get_seek_accurate (self->config); + + if (accurate) { + flags |= GST_SEEK_FLAG_ACCURATE; + } else { + flags &= ~GST_SEEK_FLAG_ACCURATE; + } + + if (rate != 1.0) { + flags |= GST_SEEK_FLAG_TRICKMODE; + } + + if (rate >= 0.0) { + s_event = gst_event_new_seek (rate, GST_FORMAT_TIME, flags, + GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_SET, GST_CLOCK_TIME_NONE); + } else { + s_event 
= gst_event_new_seek (rate, GST_FORMAT_TIME, flags, + GST_SEEK_TYPE_SET, G_GINT64_CONSTANT (0), GST_SEEK_TYPE_SET, position); + } + + GST_DEBUG_OBJECT (self, "Seek with rate %.2lf to %" GST_TIME_FORMAT, + rate, GST_TIME_ARGS (position)); + + ret = gst_element_send_event (self->playbin, s_event); + if (!ret) + on_error (self, g_error_new (GST_PLAY_ERROR, GST_PLAY_ERROR_FAILED, + "Failed to seek to %" GST_TIME_FORMAT, GST_TIME_ARGS (position)), + NULL); + + g_mutex_lock (&self->lock); +} + +static gboolean +gst_play_seek_internal (gpointer user_data) +{ + GstPlay *self = GST_PLAY (user_data); + + g_mutex_lock (&self->lock); + gst_play_seek_internal_locked (self); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +/** + * gst_play_set_rate: + * @play: #GstPlay instance + * @rate: playback rate + * + * Playback at specified rate + * Since: 1.20 + */ +void +gst_play_set_rate (GstPlay * self, gdouble rate) +{ + g_return_if_fail (GST_IS_PLAY (self)); + g_return_if_fail (rate != 0.0); + + g_object_set (self, "rate", rate, NULL); +} + +/** + * gst_play_get_rate: + * @play: #GstPlay instance + * + * Returns: current playback rate + * Since: 1.20 + */ +gdouble +gst_play_get_rate (GstPlay * self) +{ + gdouble val; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_RATE); + + g_object_get (self, "rate", &val, NULL); + + return val; +} + +/** + * gst_play_seek: + * @play: #GstPlay instance + * @position: position to seek in nanoseconds + * + * Seeks the currently-playing stream to the absolute @position time + * in nanoseconds. 
+ * Since: 1.20 + */ +void +gst_play_seek (GstPlay * self, GstClockTime position) +{ + g_return_if_fail (GST_IS_PLAY (self)); + g_return_if_fail (GST_CLOCK_TIME_IS_VALID (position)); + + g_mutex_lock (&self->lock); + if (self->media_info && !self->media_info->seekable) { + GST_DEBUG_OBJECT (self, "Media is not seekable"); + g_mutex_unlock (&self->lock); + return; + } + + self->seek_position = position; + + /* If there is no seek currently being dispatched to the main context, do + * that; otherwise we just update the seek position so that it will be picked + * up by the seek handler from the main context instead of the old one. + */ + if (!self->seek_source) { + GstClockTime now = gst_util_get_timestamp (); + + /* If no seek is pending, or it was started more than 250 milliseconds ago, + * seek immediately; otherwise wait until the 250 milliseconds have passed */ + if (!self->seek_pending || (now - self->last_seek_time > 250 * GST_MSECOND)) { + self->seek_source = g_idle_source_new (); + g_source_set_callback (self->seek_source, + (GSourceFunc) gst_play_seek_internal, self, NULL); + GST_TRACE_OBJECT (self, "Dispatching seek to position %" GST_TIME_FORMAT, + GST_TIME_ARGS (position)); + g_source_attach (self->seek_source, self->context); + } else { + guint delay = 250000 - (now - self->last_seek_time) / 1000; + + /* Note that last_seek_time must be set to something at this point and + * it must be less than 250 milliseconds in the past */ + self->seek_source = g_timeout_source_new (delay); + g_source_set_callback (self->seek_source, + (GSourceFunc) gst_play_seek_internal, self, NULL); + + GST_TRACE_OBJECT (self, + "Delaying seek to position %" GST_TIME_FORMAT " by %u us", + GST_TIME_ARGS (position), delay); + g_source_attach (self->seek_source, self->context); + } + } + g_mutex_unlock (&self->lock); +} + +static void +remove_seek_source (GstPlay * self) +{ + if (!self->seek_source) + return; + + g_source_destroy (self->seek_source); + g_source_unref (self->seek_source); + self->seek_source = 
NULL; +} + +/** + * gst_play_get_uri: + * @play: #GstPlay instance + * + * Gets the URI of the currently-playing stream. + * + * Returns: (transfer full) (nullable): a string containing the URI of the + * currently-playing stream. g_free() after usage. + * Since: 1.20 + */ +gchar * +gst_play_get_uri (GstPlay * self) +{ + gchar *val; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_URI); + + g_object_get (self, "uri", &val, NULL); + + return val; +} + +/** + * gst_play_set_uri: + * @play: #GstPlay instance + * @uri: (nullable): next URI to play. + * + * Sets the next URI to play. + * Since: 1.20 + */ +void +gst_play_set_uri (GstPlay * self, const gchar * val) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "uri", val, NULL); +} + +/** + * gst_play_set_subtitle_uri: + * @play: #GstPlay instance + * @uri: (nullable): subtitle URI + * + * Sets the external subtitle URI. This should be combined with a call to + * gst_play_set_subtitle_track_enabled(@play, TRUE) so the subtitles are actually + * rendered. + * Since: 1.20 + */ +void +gst_play_set_subtitle_uri (GstPlay * self, const gchar * suburi) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "suburi", suburi, NULL); +} + +/** + * gst_play_get_subtitle_uri: + * @play: #GstPlay instance + * + * current subtitle URI + * + * Returns: (transfer full) (nullable): URI of the current external subtitle. + * g_free() after usage. + * Since: 1.20 + */ +gchar * +gst_play_get_subtitle_uri (GstPlay * self) +{ + gchar *val = NULL; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + g_object_get (self, "suburi", &val, NULL); + + return val; +} + +/** + * gst_play_get_position: + * @play: #GstPlay instance + * + * Returns: the absolute position time, in nanoseconds, of the + * currently-playing stream. 
+ * Since: 1.20 + */ +GstClockTime +gst_play_get_position (GstPlay * self) +{ + GstClockTime val; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_POSITION); + + g_object_get (self, "position", &val, NULL); + + return val; +} + +/** + * gst_play_get_duration: + * @play: #GstPlay instance + * + * Retrieves the duration of the media stream that @play represents. + * + * Returns: the duration of the currently-playing media stream, in + * nanoseconds. + * Since: 1.20 + */ +GstClockTime +gst_play_get_duration (GstPlay * self) +{ + GstClockTime val; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_DURATION); + + g_object_get (self, "duration", &val, NULL); + + return val; +} + +/** + * gst_play_get_volume: + * @play: #GstPlay instance + * + * Returns the current volume level, as a percentage between 0 and 1. + * + * Returns: the volume as percentage between 0 and 1. + * Since: 1.20 + */ +gdouble +gst_play_get_volume (GstPlay * self) +{ + gdouble val; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_VOLUME); + + g_object_get (self, "volume", &val, NULL); + + return val; +} + +/** + * gst_play_set_volume: + * @play: #GstPlay instance + * @val: the new volume level, as a percentage between 0 and 1 + * + * Sets the volume level of the stream as a percentage between 0 and 1. + * Since: 1.20 + */ +void +gst_play_set_volume (GstPlay * self, gdouble val) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "volume", val, NULL); +} + +/** + * gst_play_get_mute: + * @play: #GstPlay instance + * + * Returns: %TRUE if the currently-playing stream is muted. + * Since: 1.20 + */ +gboolean +gst_play_get_mute (GstPlay * self) +{ + gboolean val; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_MUTE); + + g_object_get (self, "mute", &val, NULL); + + return val; +} + +/** + * gst_play_set_mute: + * @play: #GstPlay instance + * @val: the mute state that should be set + * + * Sets whether the currently-playing stream should be muted. 
+ * Since: 1.20 + */ +void +gst_play_set_mute (GstPlay * self, gboolean val) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "mute", val, NULL); +} + +/** + * gst_play_get_pipeline: + * @play: #GstPlay instance + * + * Returns: (transfer full): The internal playbin instance. + * + * The caller should free it with g_object_unref() + * Since: 1.20 + */ +GstElement * +gst_play_get_pipeline (GstPlay * self) +{ + GstElement *val; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + g_object_get (self, "pipeline", &val, NULL); + + return val; +} + +/** + * gst_play_get_media_info: + * @play: #GstPlay instance + * + * A Function to get the current media info #GstPlayMediaInfo instance. + * + * Returns: (transfer full) (nullable): media info instance. + * + * The caller should free it with g_object_unref() + * Since: 1.20 + */ +GstPlayMediaInfo * +gst_play_get_media_info (GstPlay * self) +{ + GstPlayMediaInfo *info; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + if (!self->media_info) + return NULL; + + g_mutex_lock (&self->lock); + info = gst_play_media_info_copy (self->media_info); + g_mutex_unlock (&self->lock); + + return info; +} + +/** + * gst_play_get_current_audio_track: + * @play: #GstPlay instance + * + * A Function to get current audio #GstPlayAudioInfo instance. + * + * Returns: (transfer full) (nullable): current audio track. 
+ * + * The caller should free it with g_object_unref() + * Since: 1.20 + */ +GstPlayAudioInfo * +gst_play_get_current_audio_track (GstPlay * self) +{ + GstPlayAudioInfo *info; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + if (!is_track_enabled (self, GST_PLAY_FLAG_AUDIO)) + return NULL; + + if (self->use_playbin3) { + info = (GstPlayAudioInfo *) + gst_play_stream_info_get_current_from_stream_id (self, + self->audio_sid, GST_TYPE_PLAY_AUDIO_INFO); + } else { + info = (GstPlayAudioInfo *) gst_play_stream_info_get_current (self, + "current-audio", GST_TYPE_PLAY_AUDIO_INFO); + } + + return info; +} + +/** + * gst_play_get_current_video_track: + * @play: #GstPlay instance + * + * A Function to get current video #GstPlayVideoInfo instance. + * + * Returns: (transfer full) (nullable): current video track. + * + * The caller should free it with g_object_unref() + * Since: 1.20 + */ +GstPlayVideoInfo * +gst_play_get_current_video_track (GstPlay * self) +{ + GstPlayVideoInfo *info; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + if (!is_track_enabled (self, GST_PLAY_FLAG_VIDEO)) + return NULL; + + if (self->use_playbin3) { + info = (GstPlayVideoInfo *) + gst_play_stream_info_get_current_from_stream_id (self, + self->video_sid, GST_TYPE_PLAY_VIDEO_INFO); + } else { + info = (GstPlayVideoInfo *) gst_play_stream_info_get_current (self, + "current-video", GST_TYPE_PLAY_VIDEO_INFO); + } + + return info; +} + +/** + * gst_play_get_current_subtitle_track: + * @play: #GstPlay instance + * + * A Function to get current subtitle #GstPlaySubtitleInfo instance. + * + * Returns: (transfer full) (nullable): current subtitle track. 
+ * + * The caller should free it with g_object_unref() + * Since: 1.20 + */ +GstPlaySubtitleInfo * +gst_play_get_current_subtitle_track (GstPlay * self) +{ + GstPlaySubtitleInfo *info; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + if (!is_track_enabled (self, GST_PLAY_FLAG_SUBTITLE)) + return NULL; + + if (self->use_playbin3) { + info = (GstPlaySubtitleInfo *) + gst_play_stream_info_get_current_from_stream_id (self, + self->subtitle_sid, GST_TYPE_PLAY_SUBTITLE_INFO); + } else { + info = (GstPlaySubtitleInfo *) gst_play_stream_info_get_current (self, + "current-text", GST_TYPE_PLAY_SUBTITLE_INFO); + } + + return info; +} + +/* Must be called with lock */ +static gboolean +gst_play_select_streams (GstPlay * self) +{ + GList *stream_list = NULL; + gboolean ret = FALSE; + + if (self->audio_sid) + stream_list = g_list_append (stream_list, g_strdup (self->audio_sid)); + if (self->video_sid) + stream_list = g_list_append (stream_list, g_strdup (self->video_sid)); + if (self->subtitle_sid) + stream_list = g_list_append (stream_list, g_strdup (self->subtitle_sid)); + + g_mutex_unlock (&self->lock); + if (stream_list) { + ret = gst_element_send_event (self->playbin, + gst_event_new_select_streams (stream_list)); + g_list_free_full (stream_list, g_free); + } else { + GST_ERROR_OBJECT (self, "No available streams for select-streams"); + } + g_mutex_lock (&self->lock); + + return ret; +} + +/** + * gst_play_set_audio_track: + * @play: #GstPlay instance + * @stream_index: stream index + * + * Returns: %TRUE or %FALSE + * + * Sets the audio track @stream_index. 
+ * Since: 1.20 + */ +gboolean +gst_play_set_audio_track (GstPlay * self, gint stream_index) +{ + GstPlayStreamInfo *info; + gboolean ret = TRUE; + + g_return_val_if_fail (GST_IS_PLAY (self), 0); + + g_mutex_lock (&self->lock); + info = gst_play_stream_info_find (self->media_info, + GST_TYPE_PLAY_AUDIO_INFO, stream_index); + g_mutex_unlock (&self->lock); + if (!info) { + GST_ERROR_OBJECT (self, "invalid audio stream index %d", stream_index); + return FALSE; + } + + if (self->use_playbin3) { + g_mutex_lock (&self->lock); + g_free (self->audio_sid); + self->audio_sid = g_strdup (info->stream_id); + ret = gst_play_select_streams (self); + g_mutex_unlock (&self->lock); + } else { + g_object_set (G_OBJECT (self->playbin), "current-audio", stream_index, + NULL); + } + + GST_DEBUG_OBJECT (self, "set stream index '%d'", stream_index); + return ret; +} + +/** + * gst_play_set_video_track: + * @play: #GstPlay instance + * @stream_index: stream index + * + * Returns: %TRUE or %FALSE + * + * Sets the video track @stream_index. 
+ * Since: 1.20 + */ +gboolean +gst_play_set_video_track (GstPlay * self, gint stream_index) +{ + GstPlayStreamInfo *info; + gboolean ret = TRUE; + + g_return_val_if_fail (GST_IS_PLAY (self), 0); + + /* check if stream_index exists in our internal media_info list */ + g_mutex_lock (&self->lock); + info = gst_play_stream_info_find (self->media_info, + GST_TYPE_PLAY_VIDEO_INFO, stream_index); + g_mutex_unlock (&self->lock); + if (!info) { + GST_ERROR_OBJECT (self, "invalid video stream index %d", stream_index); + return FALSE; + } + + if (self->use_playbin3) { + g_mutex_lock (&self->lock); + g_free (self->video_sid); + self->video_sid = g_strdup (info->stream_id); + ret = gst_play_select_streams (self); + g_mutex_unlock (&self->lock); + } else { + g_object_set (G_OBJECT (self->playbin), "current-video", stream_index, + NULL); + } + + GST_DEBUG_OBJECT (self, "set stream index '%d'", stream_index); + return ret; +} + +/** + * gst_play_set_subtitle_track: + * @play: #GstPlay instance + * @stream_index: stream index + * + * Returns: %TRUE on success, %FALSE otherwise + * + * Sets the subtitle track @stream_index. 
+ * Since: 1.20 + */ +gboolean +gst_play_set_subtitle_track (GstPlay * self, gint stream_index) +{ + GstPlayStreamInfo *info; + gboolean ret = TRUE; + + g_return_val_if_fail (GST_IS_PLAY (self), 0); + + g_mutex_lock (&self->lock); + info = gst_play_stream_info_find (self->media_info, + GST_TYPE_PLAY_SUBTITLE_INFO, stream_index); + g_mutex_unlock (&self->lock); + if (!info) { + GST_ERROR_OBJECT (self, "invalid subtitle stream index %d", stream_index); + return FALSE; + } + + if (self->use_playbin3) { + g_mutex_lock (&self->lock); + g_free (self->subtitle_sid); + self->subtitle_sid = g_strdup (info->stream_id); + ret = gst_play_select_streams (self); + g_mutex_unlock (&self->lock); + } else { + g_object_set (G_OBJECT (self->playbin), "current-text", stream_index, NULL); + } + + GST_DEBUG_OBJECT (self, "set stream index '%d'", stream_index); + return ret; +} + +/** + * gst_play_set_audio_track_enabled: + * @play: #GstPlay instance + * @enabled: TRUE or FALSE + * + * Enable or disable the current audio track. + * Since: 1.20 + */ +void +gst_play_set_audio_track_enabled (GstPlay * self, gboolean enabled) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + if (enabled) + play_set_flag (self, GST_PLAY_FLAG_AUDIO); + else + play_clear_flag (self, GST_PLAY_FLAG_AUDIO); + + GST_DEBUG_OBJECT (self, "track is '%s'", enabled ? "Enabled" : "Disabled"); +} + +/** + * gst_play_set_video_track_enabled: + * @play: #GstPlay instance + * @enabled: TRUE or FALSE + * + * Enable or disable the current video track. + * Since: 1.20 + */ +void +gst_play_set_video_track_enabled (GstPlay * self, gboolean enabled) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + if (enabled) + play_set_flag (self, GST_PLAY_FLAG_VIDEO); + else + play_clear_flag (self, GST_PLAY_FLAG_VIDEO); + + GST_DEBUG_OBJECT (self, "track is '%s'", enabled ? 
"Enabled" : "Disabled"); +} + +/** + * gst_play_set_subtitle_track_enabled: + * @play: #GstPlay instance + * @enabled: TRUE or FALSE + * + * Enable or disable the current subtitle track. + * Since: 1.20 + */ +void +gst_play_set_subtitle_track_enabled (GstPlay * self, gboolean enabled) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + if (enabled) + play_set_flag (self, GST_PLAY_FLAG_SUBTITLE); + else + play_clear_flag (self, GST_PLAY_FLAG_SUBTITLE); + + GST_DEBUG_OBJECT (self, "track is '%s'", enabled ? "Enabled" : "Disabled"); +} + +/** + * gst_play_set_visualization: + * @play: #GstPlay instance + * @name: (nullable): visualization element obtained from + * #gst_play_visualizations_get() + * + * Returns: %TRUE if the visualizations was set correctly. Otherwise, + * %FALSE. + * Since: 1.20 + */ +gboolean +gst_play_set_visualization (GstPlay * self, const gchar * name) +{ + g_return_val_if_fail (GST_IS_PLAY (self), FALSE); + + g_mutex_lock (&self->lock); + if (self->current_vis_element) { + gst_object_unref (self->current_vis_element); + self->current_vis_element = NULL; + } + + if (name) { + self->current_vis_element = gst_element_factory_make (name, NULL); + if (!self->current_vis_element) + goto error_no_element; + gst_object_ref_sink (self->current_vis_element); + } + g_object_set (self->playbin, "vis-plugin", self->current_vis_element, NULL); + + g_mutex_unlock (&self->lock); + GST_DEBUG_OBJECT (self, "set vis-plugin to '%s'", name); + + return TRUE; + +error_no_element: + g_mutex_unlock (&self->lock); + GST_WARNING_OBJECT (self, "could not find visualization '%s'", name); + return FALSE; +} + +/** + * gst_play_get_current_visualization: + * @play: #GstPlay instance + * + * Returns: (transfer full) (nullable): Name of the currently enabled + * visualization. + * g_free() after usage. 
+ * Since: 1.20 + */ +gchar * +gst_play_get_current_visualization (GstPlay * self) +{ + gchar *name = NULL; + GstElement *vis_plugin = NULL; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + if (!is_track_enabled (self, GST_PLAY_FLAG_VIS)) + return NULL; + + g_object_get (self->playbin, "vis-plugin", &vis_plugin, NULL); + + if (vis_plugin) { + GstElementFactory *factory = gst_element_get_factory (vis_plugin); + if (factory) + name = g_strdup (gst_plugin_feature_get_name (factory)); + gst_object_unref (vis_plugin); + } + + GST_DEBUG_OBJECT (self, "vis-plugin '%s' %p", name, vis_plugin); + + return name; +} + +/** + * gst_play_set_visualization_enabled: + * @play: #GstPlay instance + * @enabled: TRUE or FALSE + * + * Enable or disable the visualization. + * Since: 1.20 + */ +void +gst_play_set_visualization_enabled (GstPlay * self, gboolean enabled) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + if (enabled) + play_set_flag (self, GST_PLAY_FLAG_VIS); + else + play_clear_flag (self, GST_PLAY_FLAG_VIS); + + GST_DEBUG_OBJECT (self, "visualization is '%s'", + enabled ? 
"Enabled" : "Disabled"); +} + +struct CBChannelMap +{ + const gchar *label; /* channel label name */ + const gchar *name; /* get_name () */ +}; + +static const struct CBChannelMap cb_channel_map[] = { + /* GST_PLAY_COLOR_BALANCE_BRIGHTNESS */ {"BRIGHTNESS", "brightness"}, + /* GST_PLAY_COLOR_BALANCE_CONTRAST */ {"CONTRAST", "contrast"}, + /* GST_PLAY_COLOR_BALANCE_SATURATION */ {"SATURATION", "saturation"}, + /* GST_PLAY_COLOR_BALANCE_HUE */ {"HUE", "hue"}, +}; + +static GstColorBalanceChannel * +gst_play_color_balance_find_channel (GstPlay * self, + GstPlayColorBalanceType type) +{ + GstColorBalanceChannel *channel; + const GList *l, *channels; + + if (type < GST_PLAY_COLOR_BALANCE_BRIGHTNESS || + type > GST_PLAY_COLOR_BALANCE_HUE) + return NULL; + + channels = + gst_color_balance_list_channels (GST_COLOR_BALANCE (self->playbin)); + for (l = channels; l; l = l->next) { + channel = l->data; + if (g_strrstr (channel->label, cb_channel_map[type].label)) + return channel; + } + + return NULL; +} + +/** + * gst_play_has_color_balance: + * @play:#GstPlay instance + * + * Checks whether the @play has color balance support available. + * + * Returns: %TRUE if @play has color balance support. Otherwise, + * %FALSE. + * Since: 1.20 + */ +gboolean +gst_play_has_color_balance (GstPlay * self) +{ + const GList *channels; + + g_return_val_if_fail (GST_IS_PLAY (self), FALSE); + + if (!GST_IS_COLOR_BALANCE (self->playbin)) + return FALSE; + + channels = + gst_color_balance_list_channels (GST_COLOR_BALANCE (self->playbin)); + return (channels != NULL); +} + +/** + * gst_play_set_color_balance: + * @play: #GstPlay instance + * @type: #GstPlayColorBalanceType + * @value: The new value for the @type, ranged [0,1] + * + * Sets the current value of the indicated channel @type to the passed + * value. 
+ * Since: 1.20 + */ +void +gst_play_set_color_balance (GstPlay * self, GstPlayColorBalanceType type, + gdouble value) +{ + GstColorBalanceChannel *channel; + gdouble new_val; + + g_return_if_fail (GST_IS_PLAY (self)); + g_return_if_fail (value >= 0.0 && value <= 1.0); + + if (!GST_IS_COLOR_BALANCE (self->playbin)) + return; + + channel = gst_play_color_balance_find_channel (self, type); + if (!channel) + return; + + value = CLAMP (value, 0.0, 1.0); + + /* Convert to channel range */ + new_val = channel->min_value + value * ((gdouble) channel->max_value - + (gdouble) channel->min_value); + + gst_color_balance_set_value (GST_COLOR_BALANCE (self->playbin), channel, + new_val); +} + +/** + * gst_play_get_color_balance: + * @play: #GstPlay instance + * @type: #GstPlayColorBalanceType + * + * Retrieve the current value of the indicated @type. + * + * Returns: The current value of @type, between [0,1]. In case of + * error -1 is returned. + * Since: 1.20 + */ +gdouble +gst_play_get_color_balance (GstPlay * self, GstPlayColorBalanceType type) +{ + GstColorBalanceChannel *channel; + gint value; + + g_return_val_if_fail (GST_IS_PLAY (self), -1); + + if (!GST_IS_COLOR_BALANCE (self->playbin)) + return -1; + + channel = gst_play_color_balance_find_channel (self, type); + if (!channel) + return -1; + + value = gst_color_balance_get_value (GST_COLOR_BALANCE (self->playbin), + channel); + + return ((gdouble) value - + (gdouble) channel->min_value) / ((gdouble) channel->max_value - + (gdouble) channel->min_value); +} + +/** + * gst_play_get_multiview_mode: + * @play: #GstPlay instance + * + * Retrieve the current value of the indicated @type. 
+ * + * Returns: The current multiview frame packing mode. Default: -1 "none" + * + * Since: 1.20 + */ +GstVideoMultiviewFramePacking +gst_play_get_multiview_mode (GstPlay * self) +{ + GstVideoMultiviewFramePacking val = GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE; + + g_return_val_if_fail (GST_IS_PLAY (self), + GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE); + + g_object_get (self, "video-multiview-mode", &val, NULL); + + return val; +} + +/** + * gst_play_set_multiview_mode: + * @play: #GstPlay instance + * @mode: The new multiview mode + * + * Sets the video multiview mode to the passed @mode value. + * + * Since: 1.20 + */ +void +gst_play_set_multiview_mode (GstPlay * self, GstVideoMultiviewFramePacking mode) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "video-multiview-mode", mode, NULL); +} + +/** + * gst_play_get_multiview_flags: + * @play: #GstPlay instance + * + * Retrieve the current multiview flags. + * + * Returns: The current multiview flags. Default: 0x00000000 "none" + * + * Since: 1.20 + */ +GstVideoMultiviewFlags +gst_play_get_multiview_flags (GstPlay * self) +{ + GstVideoMultiviewFlags val = GST_VIDEO_MULTIVIEW_FLAGS_NONE; + + g_return_val_if_fail (GST_IS_PLAY (self), val); + + g_object_get (self, "video-multiview-flags", &val, NULL); + + return val; +} + +/** + * gst_play_set_multiview_flags: + * @play: #GstPlay instance + * @flags: The new multiview flags + * + * Sets the video multiview flags to the passed @flags + * value. 
+ * + * Since: 1.20 + */ +void +gst_play_set_multiview_flags (GstPlay * self, GstVideoMultiviewFlags flags) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "video-multiview-flags", flags, NULL); +} + +/** + * gst_play_get_audio_video_offset: + * @play: #GstPlay instance + * + * Retrieve the current value of audio-video-offset property + * + * Returns: The current value of audio-video-offset in nanoseconds + * + * Since: 1.20 + */ +gint64 +gst_play_get_audio_video_offset (GstPlay * self) +{ + gint64 val = 0; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_AUDIO_VIDEO_OFFSET); + + g_object_get (self, "audio-video-offset", &val, NULL); + + return val; +} + +/** + * gst_play_set_audio_video_offset: + * @play: #GstPlay instance + * @offset: #gint64 in nanoseconds + * + * Sets audio-video-offset property by value of @offset + * + * Since: 1.20 + */ +void +gst_play_set_audio_video_offset (GstPlay * self, gint64 offset) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "audio-video-offset", offset, NULL); +} + +/** + * gst_play_get_subtitle_video_offset: + * @play: #GstPlay instance + * + * Retrieve the current value of subtitle-video-offset property + * + * Returns: The current value of subtitle-video-offset in nanoseconds + * + * Since: 1.20 + */ +gint64 +gst_play_get_subtitle_video_offset (GstPlay * self) +{ + gint64 val = 0; + + g_return_val_if_fail (GST_IS_PLAY (self), DEFAULT_SUBTITLE_VIDEO_OFFSET); + + g_object_get (self, "subtitle-video-offset", &val, NULL); + + return val; +} + +/** + * gst_play_set_subtitle_video_offset: + * @play: #GstPlay instance + * @offset: #gint64 in nanoseconds + * + * Sets subtitle-video-offset property by value of @offset + * + * Since: 1.20 + */ +void +gst_play_set_subtitle_video_offset (GstPlay * self, gint64 offset) +{ + g_return_if_fail (GST_IS_PLAY (self)); + + g_object_set (self, "subtitle-video-offset", offset, NULL); +} + + +#define C_ENUM(v) ((gint) v) +#define C_FLAGS(v) ((guint) v) + 
+GType +gst_play_color_balance_type_get_type (void) +{ + static gsize id = 0; + static const GEnumValue values[] = { + {C_ENUM (GST_PLAY_COLOR_BALANCE_HUE), "GST_PLAY_COLOR_BALANCE_HUE", + "hue"}, + {C_ENUM (GST_PLAY_COLOR_BALANCE_BRIGHTNESS), + "GST_PLAY_COLOR_BALANCE_BRIGHTNESS", "brightness"}, + {C_ENUM (GST_PLAY_COLOR_BALANCE_SATURATION), + "GST_PLAY_COLOR_BALANCE_SATURATION", "saturation"}, + {C_ENUM (GST_PLAY_COLOR_BALANCE_CONTRAST), + "GST_PLAY_COLOR_BALANCE_CONTRAST", "contrast"}, + {0, NULL, NULL} + }; + + if (g_once_init_enter (&id)) { + GType tmp = g_enum_register_static ("GstPlayColorBalanceType", values); + g_once_init_leave (&id, tmp); + } + + return (GType) id; +} + +/** + * gst_play_color_balance_type_get_name: + * @type: a #GstPlayColorBalanceType + * + * Gets a string representing the given color balance type. + * + * Returns: (transfer none): a string with the name of the color + * balance type. + * Since: 1.20 + */ +const gchar * +gst_play_color_balance_type_get_name (GstPlayColorBalanceType type) +{ + g_return_val_if_fail (type >= GST_PLAY_COLOR_BALANCE_BRIGHTNESS && + type <= GST_PLAY_COLOR_BALANCE_HUE, NULL); + + return cb_channel_map[type].name; +} + +GType +gst_play_state_get_type (void) +{ + static gsize id = 0; + static const GEnumValue values[] = { + {C_ENUM (GST_PLAY_STATE_STOPPED), "GST_PLAY_STATE_STOPPED", "stopped"}, + {C_ENUM (GST_PLAY_STATE_BUFFERING), "GST_PLAY_STATE_BUFFERING", + "buffering"}, + {C_ENUM (GST_PLAY_STATE_PAUSED), "GST_PLAY_STATE_PAUSED", "paused"}, + {C_ENUM (GST_PLAY_STATE_PLAYING), "GST_PLAY_STATE_PLAYING", "playing"}, + {0, NULL, NULL} + }; + + if (g_once_init_enter (&id)) { + GType tmp = g_enum_register_static ("GstPlayState", values); + g_once_init_leave (&id, tmp); + } + + return (GType) id; +} + +GType +gst_play_message_get_type (void) +{ + static gsize id = 0; + static const GEnumValue values[] = { + {C_ENUM (GST_PLAY_MESSAGE_URI_LOADED), "GST_PLAY_MESSAGE_URI_LOADED", + "uri-loaded"}, + {C_ENUM 
(GST_PLAY_MESSAGE_POSITION_UPDATED), + "GST_PLAY_MESSAGE_POSITION_UPDATED", "position-updated"}, + {C_ENUM (GST_PLAY_MESSAGE_DURATION_CHANGED), + "GST_PLAY_MESSAGE_DURATION_CHANGED", "duration-changed"}, + {C_ENUM (GST_PLAY_MESSAGE_STATE_CHANGED), + "GST_PLAY_MESSAGE_STATE_CHANGED", "state-changed"}, + {C_ENUM (GST_PLAY_MESSAGE_BUFFERING), "GST_PLAY_MESSAGE_BUFFERING", + "buffering"}, + {C_ENUM (GST_PLAY_MESSAGE_END_OF_STREAM), + "GST_PLAY_MESSAGE_END_OF_STREAM", "end-of-stream"}, + {C_ENUM (GST_PLAY_MESSAGE_ERROR), "GST_PLAY_MESSAGE_ERROR", "error"}, + {C_ENUM (GST_PLAY_MESSAGE_WARNING), "GST_PLAY_MESSAGE_WARNING", + "warning"}, + {C_ENUM (GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED), + "GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED", + "video-dimensions-changed"}, + {C_ENUM (GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED), + "GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED", "media-info-updated"}, + {C_ENUM (GST_PLAY_MESSAGE_VOLUME_CHANGED), + "GST_PLAY_MESSAGE_VOLUME_CHANGED", "volume-changed"}, + {C_ENUM (GST_PLAY_MESSAGE_MUTE_CHANGED), + "GST_PLAY_MESSAGE_MUTE_CHANGED", "mute-changed"}, + {C_ENUM (GST_PLAY_MESSAGE_SEEK_DONE), "GST_PLAY_MESSAGE_SEEK_DONE", + "seek-done"}, + {0, NULL, NULL} + }; + + if (g_once_init_enter (&id)) { + GType tmp = g_enum_register_static ("GstPlayMessage", values); + g_once_init_leave (&id, tmp); + } + + return (GType) id; +} + +/** + * gst_play_state_get_name: + * @state: a #GstPlayState + * + * Gets a string representing the given state. + * + * Returns: (transfer none): a string with the name of the state. 
+ * Since: 1.20 + */ +const gchar * +gst_play_state_get_name (GstPlayState state) +{ + switch (state) { + case GST_PLAY_STATE_STOPPED: + return "stopped"; + case GST_PLAY_STATE_BUFFERING: + return "buffering"; + case GST_PLAY_STATE_PAUSED: + return "paused"; + case GST_PLAY_STATE_PLAYING: + return "playing"; + } + + g_assert_not_reached (); + return NULL; +} + +/** + * gst_play_message_get_name: + * @message_type: a #GstPlayMessage + * + * Returns: (transfer none): a string with the name of the message. + * Since: 1.20 + */ +const gchar * +gst_play_message_get_name (GstPlayMessage message_type) +{ + GEnumClass *enum_class; + GEnumValue *enum_value; + enum_class = g_type_class_ref (GST_TYPE_PLAY_MESSAGE); + enum_value = g_enum_get_value (enum_class, message_type); + g_assert (enum_value != NULL); + g_type_class_unref (enum_class); + return enum_value->value_name; +} + +GType +gst_play_error_get_type (void) +{ + static gsize id = 0; + static const GEnumValue values[] = { + {C_ENUM (GST_PLAY_ERROR_FAILED), "GST_PLAY_ERROR_FAILED", "failed"}, + {0, NULL, NULL} + }; + + if (g_once_init_enter (&id)) { + GType tmp = g_enum_register_static ("GstPlayError", values); + g_once_init_leave (&id, tmp); + } + + return (GType) id; +} + +/** + * gst_play_error_get_name: + * @error: a #GstPlayError + * + * Gets a string representing the given error. + * + * Returns: (transfer none): a string with the given error. + * Since: 1.20 + */ +const gchar * +gst_play_error_get_name (GstPlayError error) +{ + switch (error) { + case GST_PLAY_ERROR_FAILED: + return "failed"; + } + + g_assert_not_reached (); + return NULL; +} + +/** + * gst_play_set_config: + * @play: #GstPlay instance + * @config: (transfer full): a #GstStructure + * + * Set the configuration of the play. If the play is already configured, and + * the configuration haven't change, this function will return %TRUE. 
If the + * play is not in the GST_PLAY_STATE_STOPPED state, this method will return %FALSE + * and the active configuration will remain in effect. + * + * @config is a #GstStructure that contains the configuration parameters for + * the play. + * + * This function takes ownership of @config. + * + * Returns: %TRUE when the configuration could be set. + * Since: 1.20 + */ +gboolean +gst_play_set_config (GstPlay * self, GstStructure * config) +{ + g_return_val_if_fail (GST_IS_PLAY (self), FALSE); + g_return_val_if_fail (config != NULL, FALSE); + + g_mutex_lock (&self->lock); + + if (self->app_state != GST_PLAY_STATE_STOPPED) { + GST_INFO_OBJECT (self, "can't change config while play is %s", + gst_play_state_get_name (self->app_state)); + g_mutex_unlock (&self->lock); + return FALSE; + } + + if (self->config) + gst_structure_free (self->config); + self->config = config; + g_mutex_unlock (&self->lock); + + return TRUE; +} + +/** + * gst_play_get_config: + * @play: #GstPlay instance + * + * Get a copy of the current configuration of the play. This configuration + * can either be modified and used for the gst_play_set_config() call + * or it must be freed after usage. + * + * Returns: (transfer full): a copy of the current configuration of @play. Use + * gst_structure_free() after usage or gst_play_set_config(). + * + * Since: 1.20 + */ +GstStructure * +gst_play_get_config (GstPlay * self) +{ + GstStructure *ret; + + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + g_mutex_lock (&self->lock); + ret = gst_structure_copy (self->config); + g_mutex_unlock (&self->lock); + + return ret; +} + +/** + * gst_play_config_set_user_agent: + * @config: a #GstPlay configuration + * @agent: (nullable): the string to use as user agent + * + * Set the user agent to pass to the server if @play needs to connect + * to a server during playback. This is typically used when playing HTTP + * or RTSP streams.
+ * + * Since: 1.20 + */ +void +gst_play_config_set_user_agent (GstStructure * config, const gchar * agent) +{ + g_return_if_fail (config != NULL); + g_return_if_fail (agent != NULL); + + gst_structure_id_set (config, + CONFIG_QUARK (USER_AGENT), G_TYPE_STRING, agent, NULL); +} + +/** + * gst_play_config_get_user_agent: + * @config: a #GstPlay configuration + * + * Return the user agent which has been configured using + * gst_play_config_set_user_agent() if any. + * + * Returns: (transfer full) (nullable): the configured agent, or %NULL + * Since: 1.20 + */ +gchar * +gst_play_config_get_user_agent (const GstStructure * config) +{ + gchar *agent = NULL; + + g_return_val_if_fail (config != NULL, NULL); + + gst_structure_id_get (config, + CONFIG_QUARK (USER_AGENT), G_TYPE_STRING, &agent, NULL); + + return agent; +} + +/** + * gst_play_config_set_position_update_interval: + * @config: a #GstPlay configuration + * @interval: interval in ms + * + * Set the desired interval in milliseconds between two position-updated messages. + * Pass 0 to stop updating the position.
+ * Since: 1.20 + */ +void +gst_play_config_set_position_update_interval (GstStructure * config, + guint interval) +{ + g_return_if_fail (config != NULL); + g_return_if_fail (interval <= 10000); + + gst_structure_id_set (config, + CONFIG_QUARK (POSITION_INTERVAL_UPDATE), G_TYPE_UINT, interval, NULL); +} + +/** + * gst_play_config_get_position_update_interval: + * @config: a #GstPlay configuration + * + * Returns: current position update interval in milliseconds + * + * Since: 1.20 + */ +guint +gst_play_config_get_position_update_interval (const GstStructure * config) +{ + guint interval = DEFAULT_POSITION_UPDATE_INTERVAL_MS; + + g_return_val_if_fail (config != NULL, DEFAULT_POSITION_UPDATE_INTERVAL_MS); + + gst_structure_id_get (config, + CONFIG_QUARK (POSITION_INTERVAL_UPDATE), G_TYPE_UINT, &interval, NULL); + + return interval; +} + +/** + * gst_play_config_set_seek_accurate: + * @config: a #GstPlay configuration + * @accurate: accurate seek or not + * + * Enable or disable accurate seeking. When enabled, elements will try harder + * to seek as accurately as possible to the requested seek position. Generally + * it will be slower, especially for formats that don't have any indexes or + * timestamp markers in the stream. + * + * If accurate seeking is disabled, elements will seek as close as possible to + * the requested position without slowing down seeking too much. + * + * Accurate seeking is disabled by default.
+ * + * Since: 1.20 + */ +void +gst_play_config_set_seek_accurate (GstStructure * config, gboolean accurate) +{ + g_return_if_fail (config != NULL); + + gst_structure_id_set (config, + CONFIG_QUARK (ACCURATE_SEEK), G_TYPE_BOOLEAN, accurate, NULL); +} + +/** + * gst_play_config_get_seek_accurate: + * @config: a #GstPlay configuration + * + * Returns: %TRUE if accurate seeking is enabled + * + * Since: 1.20 + */ +gboolean +gst_play_config_get_seek_accurate (const GstStructure * config) +{ + gboolean accurate = FALSE; + + g_return_val_if_fail (config != NULL, FALSE); + + gst_structure_id_get (config, + CONFIG_QUARK (ACCURATE_SEEK), G_TYPE_BOOLEAN, &accurate, NULL); + + return accurate; +} + +/** + * gst_play_get_video_snapshot: + * @play: #GstPlay instance + * @format: output format of the video snapshot + * @config: (allow-none): Additional configuration + * + * Get a snapshot of the currently selected video stream, if any. The format can be + * selected with @format and optional configuration is possible with @config. + * Currently supported settings are: + * - width, height of type G_TYPE_INT + * - pixel-aspect-ratio of type GST_TYPE_FRACTION + * Except for the GST_PLAY_THUMBNAIL_RAW_NATIVE format, if no config is set, the pixel-aspect-ratio defaults to 1/1 + * + * Returns: (transfer full) (nullable): Current video snapshot sample or %NULL on failure + * + * Since: 1.20 + */ +GstSample * +gst_play_get_video_snapshot (GstPlay * self, + GstPlaySnapshotFormat format, const GstStructure * config) +{ + gint video_tracks = 0; + GstSample *sample = NULL; + GstCaps *caps = NULL; + gint width = -1; + gint height = -1; + gint par_n = 1; + gint par_d = 1; + g_return_val_if_fail (GST_IS_PLAY (self), NULL); + + g_object_get (self->playbin, "n-video", &video_tracks, NULL); + if (video_tracks == 0) { + GST_DEBUG_OBJECT (self, "total video track num is 0"); + return NULL; + } + + switch (format) { + case GST_PLAY_THUMBNAIL_RAW_xRGB: + caps = gst_caps_new_simple ("video/x-raw", + "format",
G_TYPE_STRING, "xRGB", NULL); + break; + case GST_PLAY_THUMBNAIL_RAW_BGRx: + caps = gst_caps_new_simple ("video/x-raw", + "format", G_TYPE_STRING, "BGRx", NULL); + break; + case GST_PLAY_THUMBNAIL_JPG: + caps = gst_caps_new_empty_simple ("image/jpeg"); + break; + case GST_PLAY_THUMBNAIL_PNG: + caps = gst_caps_new_empty_simple ("image/png"); + break; + case GST_PLAY_THUMBNAIL_RAW_NATIVE: + default: + caps = gst_caps_new_empty_simple ("video/x-raw"); + break; + } + + if (NULL != config) { + if (!gst_structure_get_int (config, "width", &width)) + width = -1; + if (!gst_structure_get_int (config, "height", &height)) + height = -1; + if (!gst_structure_get_fraction (config, "pixel-aspect-ratio", &par_n, + &par_d)) { + if (format != GST_PLAY_THUMBNAIL_RAW_NATIVE) { + par_n = 1; + par_d = 1; + } else { + par_n = 0; + par_d = 0; + } + } + } + + if (width > 0 && height > 0) { + gst_caps_set_simple (caps, "width", G_TYPE_INT, width, + "height", G_TYPE_INT, height, NULL); + } + + if (format != GST_PLAY_THUMBNAIL_RAW_NATIVE) { + gst_caps_set_simple (caps, "pixel-aspect-ratio", GST_TYPE_FRACTION, + par_n, par_d, NULL); + } else if (NULL != config && par_n != 0 && par_d != 0) { + gst_caps_set_simple (caps, "pixel-aspect-ratio", GST_TYPE_FRACTION, + par_n, par_d, NULL); + } + + g_signal_emit_by_name (self->playbin, "convert-sample", caps, &sample); + gst_caps_unref (caps); + if (!sample) { + GST_WARNING_OBJECT (self, "Failed to retrieve or convert video frame"); + return NULL; + } + + return sample; +} + +/** + * gst_play_is_play_message: + * @msg: A #GstMessage + * + * Returns: A #gboolean indicating whether the passed message represents a #GstPlay message or not.
+ * + * Since: 1.20 + */ +gboolean +gst_play_is_play_message (GstMessage * msg) +{ + const GstStructure *data = NULL; + g_return_val_if_fail (GST_IS_MESSAGE (msg), FALSE); + + data = gst_message_get_structure (msg); + g_return_val_if_fail (data, FALSE); + + return g_str_equal (gst_structure_get_name (data), GST_PLAY_MESSAGE_DATA); +} + +#define PARSE_MESSAGE_FIELD(msg, field, value_type, value) G_STMT_START { \ + const GstStructure *data = NULL; \ + g_return_if_fail (gst_play_is_play_message (msg)); \ + data = gst_message_get_structure (msg); \ + gst_structure_get (data, field, value_type, value, NULL); \ +} G_STMT_END + +/** + * gst_play_message_parse_type: + * @msg: A #GstMessage + * @type: (out) (optional): the resulting message type + * + * Parse the given @msg and extract its #GstPlayMessage type. + * + * Since: 1.20 + */ +void +gst_play_message_parse_type (GstMessage * msg, GstPlayMessage * type) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_TYPE, + GST_TYPE_PLAY_MESSAGE, type); +} + +/** + * gst_play_message_parse_duration_updated: + * @msg: A #GstMessage + * @duration: (out) (optional): the resulting duration + * + * Parse the given duration @msg and extract the corresponding #GstClockTime + * + * Since: 1.20 + */ +void +gst_play_message_parse_duration_updated (GstMessage * msg, + GstClockTime * duration) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_DURATION, + GST_TYPE_CLOCK_TIME, duration); +} + +/** + * gst_play_message_parse_position_updated: + * @msg: A #GstMessage + * @position: (out) (optional): the resulting position + * + * Parse the given position @msg and extract the corresponding #GstClockTime + * + * Since: 1.20 + */ +void +gst_play_message_parse_position_updated (GstMessage * msg, + GstClockTime * position) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_POSITION, + GST_TYPE_CLOCK_TIME, position); +} + +/** + * gst_play_message_parse_state_changed: + * @msg: A #GstMessage + * @state: (out) (optional): the resulting play 
state + * + * Parse the given state @msg and extract the corresponding #GstPlayState + * + * Since: 1.20 + */ +void +gst_play_message_parse_state_changed (GstMessage * msg, GstPlayState * state) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_PLAY_STATE, + GST_TYPE_PLAY_STATE, state); +} + +/** + * gst_play_message_parse_buffering_percent: + * @msg: A #GstMessage + * @percent: (out) (optional): the resulting buffering percent + * + * Parse the given buffering-percent @msg and extract the corresponding value + * + * Since: 1.20 + */ +void +gst_play_message_parse_buffering_percent (GstMessage * msg, guint * percent) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_BUFFERING_PERCENT, + G_TYPE_UINT, percent); +} + +/** + * gst_play_message_parse_error: + * @msg: A #GstMessage + * @error: (out) (optional) (transfer full): the resulting error + * @details: (out) (optional) (nullable) (transfer full): A #GstStructure containing additional details about the error + * + * Parse the given error @msg and extract the corresponding #GError + * + * Since: 1.20 + */ +void +gst_play_message_parse_error (GstMessage * msg, GError ** error, + GstStructure ** details) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_ERROR, G_TYPE_ERROR, error); + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_ERROR, GST_TYPE_STRUCTURE, + details); +} + +/** + * gst_play_message_parse_warning: + * @msg: A #GstMessage + * @error: (out) (optional) (transfer full): the resulting warning + * @details: (out) (optional) (nullable) (transfer full): A #GstStructure containing additional details about the warning + * + * Parse the given error @msg and extract the corresponding #GError warning + * + * Since: 1.20 + */ +void +gst_play_message_parse_warning (GstMessage * msg, GError ** error, + GstStructure ** details) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_WARNING, G_TYPE_ERROR, error); + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_WARNING, GST_TYPE_STRUCTURE, + details); +} + +/** 
+ * gst_play_message_parse_video_dimensions_changed: + * @msg: A #GstMessage + * @width: (out) (optional): the resulting video width + * @height: (out) (optional): the resulting video height + * + * Parse the given @msg and extract the corresponding video dimensions + * + * Since: 1.20 + */ +void +gst_play_message_parse_video_dimensions_changed (GstMessage * msg, + guint * width, guint * height) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_VIDEO_WIDTH, + G_TYPE_UINT, width); + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_VIDEO_HEIGHT, + G_TYPE_UINT, height); +} + +/** + * gst_play_message_parse_media_info_updated: + * @msg: A #GstMessage + * @info: (out) (optional) (transfer full): the resulting media info + * + * Parse the given @msg and extract the corresponding media information + * + * Since: 1.20 + */ +void +gst_play_message_parse_media_info_updated (GstMessage * msg, + GstPlayMediaInfo ** info) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_MEDIA_INFO, + GST_TYPE_PLAY_MEDIA_INFO, info); +} + +/** + * gst_play_message_parse_volume_changed: + * @msg: A #GstMessage + * @volume: (out) (optional): the resulting audio volume + * + * Parse the given @msg and extract the corresponding audio volume + * + * Since: 1.20 + */ +void +gst_play_message_parse_volume_changed (GstMessage * msg, gdouble * volume) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_VOLUME, G_TYPE_DOUBLE, + volume); +} + +/** + * gst_play_message_parse_muted_changed: + * @msg: A #GstMessage + * @muted: (out) (optional): the resulting audio muted state + * + * Parse the given @msg and extract the corresponding audio muted state + * + * Since: 1.20 + */ +void +gst_play_message_parse_muted_changed (GstMessage * msg, gboolean * muted) +{ + PARSE_MESSAGE_FIELD (msg, GST_PLAY_MESSAGE_DATA_IS_MUTED, G_TYPE_BOOLEAN, + muted); +}
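The PARSE_MESSAGE_FIELD helper above wraps several statements in G_STMT_START/G_STMT_END, GLib's portable spelling of the classic do { ... } while (0) idiom. A minimal standalone sketch of why that wrapper matters (plain C, no GLib or GStreamer needed; STMT_START, STMT_END and COPY_FIELD are illustrative names, not real API):

```c
#include <assert.h>
#include <string.h>

/* do { ... } while (0) makes a multi-statement macro expand as a single
 * statement, so it stays safe inside un-braced if/else branches -- the
 * same guarantee G_STMT_START/G_STMT_END give PARSE_MESSAGE_FIELD. */
#define STMT_START do
#define STMT_END   while (0)

#define COPY_FIELD(dst, src, n) STMT_START {        \
    if ((dst) != NULL && (src) != NULL)             \
      memcpy ((dst), (src), (n));                   \
  } STMT_END
```

Without the wrapper, `if (cond) COPY_FIELD (a, b, n); else ...` would either fail to compile or silently attach the `else` to the wrong `if`.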
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/gstplay.h
Added
@@ -0,0 +1,442 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * Copyright (C) 2019-2020 Stephan Hesse <stephan@emliri.com> + * Copyright (C) 2020 Philippe Normand <philn@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_H__ +#define __GST_PLAY_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/play/play-prelude.h> +#include <gst/play/gstplay-types.h> +#include <gst/play/gstplay-video-renderer.h> +#include <gst/play/gstplay-media-info.h> + +G_BEGIN_DECLS + +GST_PLAY_API +GType gst_play_state_get_type (void); + +/** + * GST_TYPE_PLAY_STATE: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_STATE (gst_play_state_get_type ()) + +GST_PLAY_API +GType gst_play_message_get_type (void); + +/** + * GST_TYPE_PLAY_MESSAGE: + * Since: 1.20 + */ +#define GST_TYPE_PLAY_MESSAGE (gst_play_message_get_type ()) + +/** + * GstPlayState: + * @GST_PLAY_STATE_STOPPED: the play is stopped. + * @GST_PLAY_STATE_BUFFERING: the play is buffering. + * @GST_PLAY_STATE_PAUSED: the play is paused. + * @GST_PLAY_STATE_PLAYING: the play is currently playing a + * stream. 
+ * + * Since: 1.20 + */ +typedef enum +{ + GST_PLAY_STATE_STOPPED, + GST_PLAY_STATE_BUFFERING, + GST_PLAY_STATE_PAUSED, + GST_PLAY_STATE_PLAYING +} GstPlayState; + +/** + * GstPlayMessage: + * @GST_PLAY_MESSAGE_URI_LOADED: Source element was initialized for the set URI + * @GST_PLAY_MESSAGE_POSITION_UPDATED: Sink position changed + * @GST_PLAY_MESSAGE_DURATION_CHANGED: Duration of stream changed + * @GST_PLAY_MESSAGE_STATE_CHANGED: State changed, see #GstPlayState + * @GST_PLAY_MESSAGE_BUFFERING: Pipeline is in buffering state, message contains the percentage value of the decoding buffer + * @GST_PLAY_MESSAGE_END_OF_STREAM: Sink has received EOS + * @GST_PLAY_MESSAGE_ERROR: Message contains an error + * @GST_PLAY_MESSAGE_WARNING: Message contains a warning + * @GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED: Video sink received format in different dimensions than before + * @GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED: A media-info property has changed, message contains current #GstPlayMediaInfo + * @GST_PLAY_MESSAGE_VOLUME_CHANGED: The volume of the audio output has changed + * @GST_PLAY_MESSAGE_MUTE_CHANGED: Audio muting flag has been toggled + * @GST_PLAY_MESSAGE_SEEK_DONE: Any pending seeking operation has been completed + * + * Since: 1.20 + * + * Types of messages that will be posted on the play API bus.
+ * + * See also #gst_play_get_message_bus() + * + */ +typedef enum +{ + GST_PLAY_MESSAGE_URI_LOADED, + GST_PLAY_MESSAGE_POSITION_UPDATED, + GST_PLAY_MESSAGE_DURATION_CHANGED, + GST_PLAY_MESSAGE_STATE_CHANGED, + GST_PLAY_MESSAGE_BUFFERING, + GST_PLAY_MESSAGE_END_OF_STREAM, + GST_PLAY_MESSAGE_ERROR, + GST_PLAY_MESSAGE_WARNING, + GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED, + GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED, + GST_PLAY_MESSAGE_VOLUME_CHANGED, + GST_PLAY_MESSAGE_MUTE_CHANGED, + GST_PLAY_MESSAGE_SEEK_DONE +} GstPlayMessage; + +GST_PLAY_API +const gchar *gst_play_state_get_name (GstPlayState state); + +GST_PLAY_API +const gchar *gst_play_message_get_name (GstPlayMessage message_type); + +GST_PLAY_API +GQuark gst_play_error_quark (void); + +GST_PLAY_API +GType gst_play_error_get_type (void); + +/** + * GST_PLAY_ERROR: + * + * Since: 1.20 + */ +#define GST_PLAY_ERROR (gst_play_error_quark ()) + +/** + * GST_TYPE_PLAY_ERROR: + * + * Since: 1.20 + */ +#define GST_TYPE_PLAY_ERROR (gst_play_error_get_type ()) + +/** + * GstPlayError: + * @GST_PLAY_ERROR_FAILED: generic error. + * + * Since: 1.20 + */ +typedef enum { + GST_PLAY_ERROR_FAILED = 0 +} GstPlayError; + +GST_PLAY_API +const gchar *gst_play_error_get_name (GstPlayError error); + +GST_PLAY_API +GType gst_play_color_balance_type_get_type (void); + +/** + * GST_TYPE_PLAY_COLOR_BALANCE_TYPE: + * + * Since: 1.20 + */ +#define GST_TYPE_PLAY_COLOR_BALANCE_TYPE (gst_play_color_balance_type_get_type ()) + +/** + * GstPlayColorBalanceType: + * @GST_PLAY_COLOR_BALANCE_BRIGHTNESS: brightness or black level. + * @GST_PLAY_COLOR_BALANCE_CONTRAST: contrast or luma gain. + * @GST_PLAY_COLOR_BALANCE_SATURATION: color saturation or chroma + * gain. + * @GST_PLAY_COLOR_BALANCE_HUE: hue or color balance. 
+ * + * Since: 1.20 + */ +typedef enum +{ + GST_PLAY_COLOR_BALANCE_BRIGHTNESS, + GST_PLAY_COLOR_BALANCE_CONTRAST, + GST_PLAY_COLOR_BALANCE_SATURATION, + GST_PLAY_COLOR_BALANCE_HUE, +} GstPlayColorBalanceType; + +GST_PLAY_API +const gchar *gst_play_color_balance_type_get_name (GstPlayColorBalanceType type); + +#define GST_TYPE_PLAY (gst_play_get_type ()) +#define GST_IS_PLAY(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_PLAY)) +#define GST_IS_PLAY_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_PLAY)) +#define GST_PLAY_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_PLAY, GstPlayClass)) +#define GST_PLAY(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_PLAY, GstPlay)) +#define GST_PLAY_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_PLAY, GstPlayClass)) + +/** + * GST_PLAY_CAST: + * Since: 1.20 + */ +#define GST_PLAY_CAST(obj) ((GstPlay*)(obj)) + +#ifdef G_DEFINE_AUTOPTR_CLEANUP_FUNC +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstPlay, gst_object_unref) +#endif + +GST_PLAY_API +GType gst_play_get_type (void); + +GST_PLAY_API +GstPlay * gst_play_new (GstPlayVideoRenderer * video_renderer); + +GST_PLAY_API +GstBus * gst_play_get_message_bus (GstPlay * play); + +GST_PLAY_API +void gst_play_play (GstPlay * play); + +GST_PLAY_API +void gst_play_pause (GstPlay * play); + +GST_PLAY_API +void gst_play_stop (GstPlay * play); + +GST_PLAY_API +void gst_play_seek (GstPlay * play, + GstClockTime position); + +GST_PLAY_API +void gst_play_set_rate (GstPlay * play, + gdouble rate); + +GST_PLAY_API +gdouble gst_play_get_rate (GstPlay * play); + +GST_PLAY_API +gchar * gst_play_get_uri (GstPlay * play); + +GST_PLAY_API +void gst_play_set_uri (GstPlay * play, + const gchar * uri); + +GST_PLAY_API +gchar * gst_play_get_subtitle_uri (GstPlay * play); + +GST_PLAY_API +void gst_play_set_subtitle_uri (GstPlay * play, + const gchar *uri); + +GST_PLAY_API +GstClockTime gst_play_get_position (GstPlay * play); + +GST_PLAY_API +GstClockTime gst_play_get_duration (GstPlay * 
play); + +GST_PLAY_API +gdouble gst_play_get_volume (GstPlay * play); + +GST_PLAY_API +void gst_play_set_volume (GstPlay * play, + gdouble val); + +GST_PLAY_API +gboolean gst_play_get_mute (GstPlay * play); + +GST_PLAY_API +void gst_play_set_mute (GstPlay * play, + gboolean val); + +GST_PLAY_API +GstElement * gst_play_get_pipeline (GstPlay * play); + +GST_PLAY_API +void gst_play_set_video_track_enabled (GstPlay * play, + gboolean enabled); + +GST_PLAY_API +void gst_play_set_audio_track_enabled (GstPlay * play, + gboolean enabled); + +GST_PLAY_API +void gst_play_set_subtitle_track_enabled (GstPlay * play, + gboolean enabled); + +GST_PLAY_API +gboolean gst_play_set_audio_track (GstPlay *play, + gint stream_index); + +GST_PLAY_API +gboolean gst_play_set_video_track (GstPlay *play, + gint stream_index); + +GST_PLAY_API +gboolean gst_play_set_subtitle_track (GstPlay *play, + gint stream_index); + +GST_PLAY_API +GstPlayMediaInfo * gst_play_get_media_info (GstPlay * play); + +GST_PLAY_API +GstPlayAudioInfo * gst_play_get_current_audio_track (GstPlay * play); + +GST_PLAY_API +GstPlayVideoInfo * gst_play_get_current_video_track (GstPlay * play); + +GST_PLAY_API +GstPlaySubtitleInfo * gst_play_get_current_subtitle_track (GstPlay * play); + +GST_PLAY_API +gboolean gst_play_set_visualization (GstPlay * play, + const gchar *name); + +GST_PLAY_API +void gst_play_set_visualization_enabled (GstPlay * play, + gboolean enabled); + +GST_PLAY_API +gchar * gst_play_get_current_visualization (GstPlay * play); + +GST_PLAY_API +gboolean gst_play_has_color_balance (GstPlay * play); + +GST_PLAY_API +void gst_play_set_color_balance (GstPlay * play, + GstPlayColorBalanceType type, + gdouble value); + +GST_PLAY_API +gdouble gst_play_get_color_balance (GstPlay * play, + GstPlayColorBalanceType type); + + +GST_PLAY_API +GstVideoMultiviewFramePacking gst_play_get_multiview_mode (GstPlay * play); + +GST_PLAY_API +void gst_play_set_multiview_mode (GstPlay * play, + GstVideoMultiviewFramePacking 
mode); + +GST_PLAY_API +GstVideoMultiviewFlags gst_play_get_multiview_flags (GstPlay * play); + +GST_PLAY_API +void gst_play_set_multiview_flags (GstPlay * play, + GstVideoMultiviewFlags flags); + +GST_PLAY_API +gint64 gst_play_get_audio_video_offset (GstPlay * play); + +GST_PLAY_API +void gst_play_set_audio_video_offset (GstPlay * play, + gint64 offset); + +GST_PLAY_API +gint64 gst_play_get_subtitle_video_offset (GstPlay * play); + +GST_PLAY_API +void gst_play_set_subtitle_video_offset (GstPlay * play, + gint64 offset); + +GST_PLAY_API +gboolean gst_play_set_config (GstPlay * play, + GstStructure * config); + +GST_PLAY_API +GstStructure * gst_play_get_config (GstPlay * play); + +/* helpers for configuring the config structure */ + +GST_PLAY_API +void gst_play_config_set_user_agent (GstStructure * config, + const gchar * agent); + +GST_PLAY_API +gchar * gst_play_config_get_user_agent (const GstStructure * config); + +GST_PLAY_API +void gst_play_config_set_position_update_interval (GstStructure * config, + guint interval); + +GST_PLAY_API +guint gst_play_config_get_position_update_interval (const GstStructure * config); + +GST_PLAY_API +void gst_play_config_set_seek_accurate (GstStructure * config, gboolean accurate); + +GST_PLAY_API +gboolean gst_play_config_get_seek_accurate (const GstStructure * config); + +/** + * GstPlaySnapshotFormat: + * @GST_PLAY_THUMBNAIL_RAW_NATIVE: raw native format. + * @GST_PLAY_THUMBNAIL_RAW_xRGB: raw xRGB format. + * @GST_PLAY_THUMBNAIL_RAW_BGRx: raw BGRx format. + * @GST_PLAY_THUMBNAIL_JPG: jpeg format. + * @GST_PLAY_THUMBNAIL_PNG: png format. 
+ * + * Since: 1.20 + */ +typedef enum +{ + GST_PLAY_THUMBNAIL_RAW_NATIVE = 0, + GST_PLAY_THUMBNAIL_RAW_xRGB, + GST_PLAY_THUMBNAIL_RAW_BGRx, + GST_PLAY_THUMBNAIL_JPG, + GST_PLAY_THUMBNAIL_PNG +} GstPlaySnapshotFormat; + +GST_PLAY_API +GstSample * gst_play_get_video_snapshot (GstPlay * play, + GstPlaySnapshotFormat format, const GstStructure * config); + +GST_PLAY_API +gboolean gst_play_is_play_message (GstMessage *msg); + +GST_PLAY_API +void gst_play_message_parse_type (GstMessage *msg, GstPlayMessage *type); + +GST_PLAY_API +void gst_play_message_parse_duration_updated (GstMessage *msg, GstClockTime *duration); + +GST_PLAY_API +void gst_play_message_parse_position_updated (GstMessage *msg, GstClockTime *position); + +GST_PLAY_API +void gst_play_message_parse_state_changed (GstMessage *msg, GstPlayState *state); + +GST_PLAY_API +void gst_play_message_parse_buffering_percent (GstMessage *msg, guint *percent); + +GST_PLAY_API +void gst_play_message_parse_error (GstMessage *msg, GError **error, GstStructure **details); + +GST_PLAY_API +void gst_play_message_parse_warning (GstMessage *msg, GError **error, GstStructure **details); + +GST_PLAY_API +void gst_play_message_parse_video_dimensions_changed (GstMessage *msg, guint *width, guint *height); + +GST_PLAY_API +void gst_play_message_parse_media_info_updated (GstMessage *msg, GstPlayMediaInfo **info); + +GST_PLAY_API +void gst_play_message_parse_volume_changed (GstMessage *msg, gdouble *volume); + +GST_PLAY_API +void gst_play_message_parse_muted_changed (GstMessage *msg, gboolean *muted); + +G_END_DECLS + +#endif /* __GST_PLAY_H__ */
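The gst_play_*_get_type() functions declared above register their enum types lazily and thread-safely via g_once_init_enter()/g_once_init_leave(). A standalone sketch of the same pattern using POSIX pthread_once (all names here are hypothetical stand-ins, not GStreamer code):

```c
#include <pthread.h>

/* Lazy, thread-safe one-time registration: the expensive step runs
 * exactly once, and every later caller gets the cached value -- the
 * same shape as gst_play_error_get_type()'s g_once_init_enter/leave. */
static pthread_once_t once = PTHREAD_ONCE_INIT;
static unsigned long type_id = 0;
static int register_calls = 0;

static void
register_type (void)
{
  register_calls++;   /* stands in for g_enum_register_static() */
  type_id = 42;       /* hypothetical type handle */
}

static unsigned long
demo_get_type (void)
{
  pthread_once (&once, register_type);
  return type_id;
}
```

The GLib variant additionally lets the first caller compute the value inline; pthread_once is the closest portable equivalent when only a fixed init function is needed.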
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/meson.build
Added
@@ -0,0 +1,76 @@ +gstplay_sources = files( + 'gstplay.c', + 'gstplay-signal-adapter.c', + 'gstplay-video-renderer.c', + 'gstplay-media-info.c', + 'gstplay-visualization.c', + 'gstplay-video-overlay-video-renderer.c', +) + +gstplay_headers = files( + 'play.h', + 'play-prelude.h', + 'gstplay.h', + 'gstplay-types.h', + 'gstplay-signal-adapter.h', + 'gstplay-video-renderer.h', + 'gstplay-media-info.h', + 'gstplay-video-overlay-video-renderer.h', + 'gstplay-visualization.h', +) + +install_headers(gstplay_headers, subdir : 'gstreamer-' + api_version + '/gst/play/') + +gstplay = library('gstplay-' + api_version, + gstplay_sources, + c_args : gst_plugins_bad_args + ['-DBUILDING_GST_PLAY', '-DG_LOG_DOMAIN="GStreamer-Play"'], + include_directories : [configinc, libsinc], + version : libversion, + soversion : soversion, + darwin_versions : osxversion, + install : true, + dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep, + gsttag_dep, gstpbutils_dep], +) + +pkg_name = 'gstreamer-play-1.0' +pkgconfig.generate(gstplay, + libraries : [gst_dep, gstvideo_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'GStreamer Player convenience library', +) + +library_def = {'lib': gstplay} +gen_sources = [] +if build_gir + gir = { + 'sources' : gstplay_sources + gstplay_headers, + 'namespace' : 'GstPlay', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0', 'GstPbutils-1.0', 'GstBase-1.0', 'GstVideo-1.0', + 'GstAudio-1.0', 'GstTag-1.0'], + 'install' : true, + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/play/play.h'], + 'dependencies' : [gstbase_dep, gstvideo_dep, gstaudio_dep, + gsttag_dep, gstpbutils_dep] + } + library_def += {'gir': [gir]} + if not static_build + play_gir = gnome.generate_gir(gstplay, kwargs: gir) + gen_sources += play_gir + endif +endif +libraries += [[pkg_name, library_def]] + 
+gstplay_dep = declare_dependency(link_with : gstplay, + include_directories : [libsinc], + sources: gen_sources, + dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep, + gsttag_dep, gstpbutils_dep]) + +meson.override_dependency(pkg_name, gstplay_dep)
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/play-prelude.h
Added
@@ -0,0 +1,43 @@ +/* GStreamer Play Library + * Copyright (C) 2018 GStreamer developers + * + * play-prelude.h: prelude include header for gst-play library + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_PLAY_PRELUDE_H__ +#define __GST_PLAY_PRELUDE_H__ + +#include <gst/gst.h> + +#ifndef GST_PLAY_API +# ifdef BUILDING_GST_PLAY +# define GST_PLAY_API GST_API_EXPORT /* from config.h */ +# else +# define GST_PLAY_API GST_API_IMPORT +# endif +#endif + +#ifndef GST_DISABLE_DEPRECATED +#define GST_PLAY_DEPRECATED GST_PLAY_API +#define GST_PLAY_DEPRECATED_FOR(f) GST_PLAY_API +#else +#define GST_PLAY_DEPRECATED G_DEPRECATED GST_PLAY_API +#define GST_PLAY_DEPRECATED_FOR(f) G_DEPRECATED_FOR(f) GST_PLAY_API +#endif + +#endif /* __GST_PLAY_PRELUDE_H__ */
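play-prelude.h resolves GST_PLAY_API to GST_API_EXPORT while the library itself is being compiled (BUILDING_GST_PLAY defined by meson, as seen in the build file above) and to GST_API_IMPORT for consumers. A minimal standalone sketch of that export/import switch, using hypothetical MYLIB_* names rather than the real GStreamer macros:

```c
/* The build system defines BUILDING_MYLIB only when compiling the
 * library itself, so public symbols are exported there and imported
 * (or left plain, on ELF platforms) everywhere else. */
#if defined(_WIN32)
# ifdef BUILDING_MYLIB
#  define MYLIB_API __declspec(dllexport)
# else
#  define MYLIB_API __declspec(dllimport)
# endif
#else
# ifdef BUILDING_MYLIB
#  define MYLIB_API __attribute__((visibility("default")))
# else
#  define MYLIB_API
# endif
#endif

MYLIB_API int
mylib_answer (void)
{
  return 7;
}
```

Combined with `-fvisibility=hidden`, this keeps everything private by default and exposes only the annotated symbols, which is the purpose GST_PLAY_API serves for the headers that follow.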
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/play/play.h
Added
@@ -0,0 +1,31 @@ +/* GStreamer + * + * Copyright (C) 2014 Sebastian Dröge <sebastian@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __PLAY_H__ +#define __PLAY_H__ + +#include <gst/play/play-prelude.h> +#include <gst/play/gstplay.h> +#include <gst/play/gstplay-media-info.h> +#include <gst/play/gstplay-video-overlay-video-renderer.h> +#include <gst/play/gstplay-visualization.h> +#include <gst/play/gstplay-signal-adapter.h> + +#endif /* __PLAY_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/player/gstplayer-media-info-private.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/player/gstplayer-media-info-private.h
Changed
@@ -23,16 +23,14 @@
 #ifndef __GST_PLAYER_MEDIA_INFO_PRIVATE_H__
 #define __GST_PLAYER_MEDIA_INFO_PRIVATE_H__
 
+#include <gst/play/gstplay-media-info.h>
+
 struct _GstPlayerStreamInfo
 {
   GObject parent;
 
-  gchar *codec;
-
-  GstCaps *caps;
   gint stream_index;
-  GstTagList *tags;
-  gchar *stream_id;
+  GstPlayStreamInfo *info;
 };
 
 struct _GstPlayerStreamInfoClass
@@ -42,9 +40,9 @@
 
 struct _GstPlayerSubtitleInfo
 {
-  GstPlayerStreamInfo parent;
+  GstPlayerStreamInfo parent;
 
-  gchar *language;
+  GstPlaySubtitleInfo *info;
 };
 
 struct _GstPlayerSubtitleInfoClass
@@ -54,15 +52,9 @@
 
 struct _GstPlayerAudioInfo
 {
-  GstPlayerStreamInfo parent;
-
-  gint channels;
-  gint sample_rate;
-
-  guint bitrate;
-  guint max_bitrate;
+  GstPlayerStreamInfo parent;
 
-  gchar *language;
+  GstPlayAudioInfo *info;
 };
 
 struct _GstPlayerAudioInfoClass
@@ -72,17 +64,9 @@
 
 struct _GstPlayerVideoInfo
 {
-  GstPlayerStreamInfo parent;
+  GstPlayerStreamInfo parent;
 
-  gint width;
-  gint height;
-  gint framerate_num;
-  gint framerate_denom;
-  gint par_num;
-  gint par_denom;
-
-  guint bitrate;
-  guint max_bitrate;
+  GstPlayVideoInfo *info;
 };
 
 struct _GstPlayerVideoInfoClass
@@ -94,19 +78,11 @@
 {
   GObject parent;
 
-  gchar *uri;
-  gchar *title;
-  gchar *container;
-  gboolean seekable, is_live;
-  GstTagList *tags;
-  GstSample *image_sample;
-
   GList *stream_list;
   GList *audio_stream_list;
   GList *video_stream_list;
   GList *subtitle_stream_list;
-
-  GstClockTime duration;
+  GstPlayMediaInfo *info;
 };
 
 struct _GstPlayerMediaInfoClass
@@ -115,12 +91,23 @@
 };
 
 G_GNUC_INTERNAL GstPlayerMediaInfo*    gst_player_media_info_new
-                                       (const gchar *uri);
+                                       (void);
 G_GNUC_INTERNAL GstPlayerMediaInfo*    gst_player_media_info_copy
                                        (GstPlayerMediaInfo *ref);
 G_GNUC_INTERNAL GstPlayerStreamInfo*   gst_player_stream_info_new
                                        (gint stream_index, GType type);
+G_GNUC_INTERNAL GstPlayerStreamInfo*   gst_player_stream_info_wrapped
+                                       (GstPlayStreamInfo * info);
 G_GNUC_INTERNAL GstPlayerStreamInfo*   gst_player_stream_info_copy
                                        (GstPlayerStreamInfo *ref);
+G_GNUC_INTERNAL GstPlayerMediaInfo*    gst_player_media_info_wrapped
+                                       (GstPlayMediaInfo *info);
+G_GNUC_INTERNAL GstPlayerAudioInfo*    gst_player_audio_info_wrapped
+                                       (GstPlayAudioInfo *info);
+G_GNUC_INTERNAL GstPlayerVideoInfo*    gst_player_video_info_wrapped
+                                       (GstPlayVideoInfo *info);
+G_GNUC_INTERNAL GstPlayerSubtitleInfo* gst_player_subtitle_info_wrapped
+                                       (GstPlaySubtitleInfo *info);
+
 #endif /* __GST_PLAYER_MEDIA_INFO_PRIVATE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/player/gstplayer-media-info.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/player/gstplayer-media-info.c
Changed
@@ -47,14 +47,7 @@ { GstPlayerStreamInfo *sinfo = GST_PLAYER_STREAM_INFO (object); - g_free (sinfo->codec); - g_free (sinfo->stream_id); - - if (sinfo->caps) - gst_caps_unref (sinfo->caps); - - if (sinfo->tags) - gst_tag_list_unref (sinfo->tags); + g_clear_object (&sinfo->info); G_OBJECT_CLASS (gst_player_stream_info_parent_class)->finalize (object); } @@ -71,7 +64,8 @@ * gst_player_stream_info_get_index: * @info: a #GstPlayerStreamInfo * - * Function to get stream index from #GstPlayerStreamInfo instance. + * Function to get stream index from #GstPlayerStreamInfo instance or -1 if + * unknown. * * Returns: the stream index of this stream. */ @@ -97,26 +91,21 @@ { g_return_val_if_fail (GST_IS_PLAYER_STREAM_INFO (info), NULL); - if (GST_IS_PLAYER_VIDEO_INFO (info)) - return "video"; - else if (GST_IS_PLAYER_AUDIO_INFO (info)) - return "audio"; - else - return "subtitle"; + return gst_play_stream_info_get_stream_type (info->info); } /** * gst_player_stream_info_get_tags: * @info: a #GstPlayerStreamInfo * - * Returns: (transfer none): the tags contained in this stream. + * Returns: (transfer none) (nullable): the tags contained in this stream. */ GstTagList * gst_player_stream_info_get_tags (const GstPlayerStreamInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_STREAM_INFO (info), NULL); - return info->tags; + return gst_play_stream_info_get_tags (info->info); } /** @@ -125,28 +114,28 @@ * * A string describing codec used in #GstPlayerStreamInfo. * - * Returns: codec string or NULL on unknown. + * Returns: (nullable): codec string or %NULL on unknown. */ const gchar * gst_player_stream_info_get_codec (const GstPlayerStreamInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_STREAM_INFO (info), NULL); - return info->codec; + return gst_play_stream_info_get_codec (info->info); } /** * gst_player_stream_info_get_caps: * @info: a #GstPlayerStreamInfo * - * Returns: (transfer none): the #GstCaps of the stream. 
+ * Returns: (transfer none) (nullable): the #GstCaps of the stream. */ GstCaps * gst_player_stream_info_get_caps (const GstPlayerStreamInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_STREAM_INFO (info), NULL); - return info->caps; + return gst_play_stream_info_get_caps (info->info); } /* Video information */ @@ -154,14 +143,9 @@ GST_TYPE_PLAYER_STREAM_INFO); static void -gst_player_video_info_init (GstPlayerVideoInfo * info) +gst_player_video_info_init (G_GNUC_UNUSED GstPlayerVideoInfo * info) { - info->width = -1; - info->height = -1; - info->framerate_num = 0; - info->framerate_denom = 1; - info->par_num = 1; - info->par_denom = 1; + } static void @@ -174,28 +158,28 @@ * gst_player_video_info_get_width: * @info: a #GstPlayerVideoInfo * - * Returns: the width of video in #GstPlayerVideoInfo. + * Returns: the width of video in #GstPlayerVideoInfo or -1 if unknown. */ gint gst_player_video_info_get_width (const GstPlayerVideoInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_VIDEO_INFO (info), -1); - return info->width; + return gst_play_video_info_get_width (info->info); } /** * gst_player_video_info_get_height: * @info: a #GstPlayerVideoInfo * - * Returns: the height of video in #GstPlayerVideoInfo. + * Returns: the height of video in #GstPlayerVideoInfo or -1 if unknown. 
*/ gint gst_player_video_info_get_height (const GstPlayerVideoInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_VIDEO_INFO (info), -1); - return info->height; + return gst_play_video_info_get_height (info->info); } /** @@ -211,8 +195,7 @@ { g_return_if_fail (GST_IS_PLAYER_VIDEO_INFO (info)); - *fps_n = info->framerate_num; - *fps_d = info->framerate_denom; + gst_play_video_info_get_framerate (info->info, fps_n, fps_d); } /** @@ -230,36 +213,37 @@ { g_return_if_fail (GST_IS_PLAYER_VIDEO_INFO (info)); - *par_n = info->par_num; - *par_d = info->par_denom; + gst_play_video_info_get_pixel_aspect_ratio (info->info, par_n, par_d); } /** * gst_player_video_info_get_bitrate: * @info: a #GstPlayerVideoInfo * - * Returns: the current bitrate of video in #GstPlayerVideoInfo. + * Returns: the current bitrate of video in #GstPlayerVideoInfo or -1 if + * unknown. */ gint gst_player_video_info_get_bitrate (const GstPlayerVideoInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_VIDEO_INFO (info), -1); - return info->bitrate; + return gst_play_video_info_get_bitrate (info->info); } /** * gst_player_video_info_get_max_bitrate: * @info: a #GstPlayerVideoInfo * - * Returns: the maximum bitrate of video in #GstPlayerVideoInfo. + * Returns: the maximum bitrate of video in #GstPlayerVideoInfo or -1 if + * unknown. 
*/ gint gst_player_video_info_get_max_bitrate (const GstPlayerVideoInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_VIDEO_INFO (info), -1); - return info->max_bitrate; + return gst_play_video_info_get_max_bitrate (info->info); } /* Audio information */ @@ -267,12 +251,9 @@ GST_TYPE_PLAYER_STREAM_INFO); static void -gst_player_audio_info_init (GstPlayerAudioInfo * info) +gst_player_audio_info_init (G_GNUC_UNUSED GstPlayerAudioInfo * info) { - info->channels = 0; - info->sample_rate = 0; - info->bitrate = -1; - info->max_bitrate = -1; + } static void @@ -280,7 +261,7 @@ { GstPlayerAudioInfo *info = GST_PLAYER_AUDIO_INFO (object); - g_free (info->language); + g_clear_object (&info->info); G_OBJECT_CLASS (gst_player_audio_info_parent_class)->finalize (object); } @@ -297,70 +278,71 @@ * gst_player_audio_info_get_language: * @info: a #GstPlayerAudioInfo * - * Returns: the language of the stream, or NULL if unknown. + * Returns: (nullable): the language of the stream, or NULL if unknown. */ const gchar * gst_player_audio_info_get_language (const GstPlayerAudioInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_AUDIO_INFO (info), NULL); - return info->language; + return gst_play_audio_info_get_language (info->info); } /** * gst_player_audio_info_get_channels: * @info: a #GstPlayerAudioInfo * - * Returns: the number of audio channels in #GstPlayerAudioInfo. + * Returns: the number of audio channels in #GstPlayerAudioInfo or 0 if + * unknown. */ gint gst_player_audio_info_get_channels (const GstPlayerAudioInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_AUDIO_INFO (info), 0); - return info->channels; + return gst_play_audio_info_get_channels (info->info); } /** * gst_player_audio_info_get_sample_rate: * @info: a #GstPlayerAudioInfo * - * Returns: the audio sample rate in #GstPlayerAudioInfo. + * Returns: the audio sample rate in #GstPlayerAudioInfo or 0 if unknown. 
*/ gint gst_player_audio_info_get_sample_rate (const GstPlayerAudioInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_AUDIO_INFO (info), 0); - return info->sample_rate; + return gst_play_audio_info_get_sample_rate (info->info); } /** * gst_player_audio_info_get_bitrate: * @info: a #GstPlayerAudioInfo * - * Returns: the audio bitrate in #GstPlayerAudioInfo. + * Returns: the audio bitrate in #GstPlayerAudioInfo or -1 if unknown. */ gint gst_player_audio_info_get_bitrate (const GstPlayerAudioInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_AUDIO_INFO (info), -1); - return info->bitrate; + return gst_play_audio_info_get_bitrate (info->info); } /** * gst_player_audio_info_get_max_bitrate: * @info: a #GstPlayerAudioInfo * - * Returns: the audio maximum bitrate in #GstPlayerAudioInfo. + * Returns: the audio maximum bitrate in #GstPlayerAudioInfo or -1 if unknown. */ gint gst_player_audio_info_get_max_bitrate (const GstPlayerAudioInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_AUDIO_INFO (info), -1); - return info->max_bitrate; + return gst_play_audio_info_get_max_bitrate (info->info); } /* Subtitle information */ @@ -378,7 +360,7 @@ { GstPlayerSubtitleInfo *info = GST_PLAYER_SUBTITLE_INFO (object); - g_free (info->language); + g_clear_object (&info->info); G_OBJECT_CLASS (gst_player_subtitle_info_parent_class)->finalize (object); } @@ -395,25 +377,23 @@ * gst_player_subtitle_info_get_language: * @info: a #GstPlayerSubtitleInfo * - * Returns: the language of the stream, or NULL if unknown. + * Returns: (nullable): the language of the stream, or %NULL if unknown. 
*/ const gchar * gst_player_subtitle_info_get_language (const GstPlayerSubtitleInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_SUBTITLE_INFO (info), NULL); - return info->language; + return gst_play_subtitle_info_get_language (info->info); } /* Global media information */ G_DEFINE_TYPE (GstPlayerMediaInfo, gst_player_media_info, G_TYPE_OBJECT); static void -gst_player_media_info_init (GstPlayerMediaInfo * info) +gst_player_media_info_init (G_GNUC_UNUSED GstPlayerMediaInfo * info) { - info->duration = -1; - info->is_live = FALSE; - info->seekable = FALSE; + } static void @@ -421,18 +401,6 @@ { GstPlayerMediaInfo *info = GST_PLAYER_MEDIA_INFO (object); - g_free (info->uri); - - if (info->tags) - gst_tag_list_unref (info->tags); - - g_free (info->title); - - g_free (info->container); - - if (info->image_sample) - gst_sample_unref (info->image_sample); - if (info->audio_stream_list) g_list_free (info->audio_stream_list); @@ -444,6 +412,7 @@ if (info->stream_list) g_list_free_full (info->stream_list, g_object_unref); + g_clear_object (&info->info); G_OBJECT_CLASS (gst_player_media_info_parent_class)->finalize (object); } @@ -480,15 +449,7 @@ GstPlayerVideoInfo *ret; ret = gst_player_video_info_new (); - - ret->width = ref->width; - ret->height = ref->height; - ret->framerate_num = ref->framerate_num; - ret->framerate_denom = ref->framerate_denom; - ret->par_num = ref->par_num; - ret->par_denom = ref->par_denom; - ret->bitrate = ref->bitrate; - ret->max_bitrate = ref->max_bitrate; + ret->info = g_object_ref (ref->info); return (GstPlayerStreamInfo *) ret; } @@ -499,14 +460,7 @@ GstPlayerAudioInfo *ret; ret = gst_player_audio_info_new (); - - ret->sample_rate = ref->sample_rate; - ret->channels = ref->channels; - ret->bitrate = ref->bitrate; - ret->max_bitrate = ref->max_bitrate; - - if (ref->language) - ret->language = g_strdup (ref->language); + ret->info = g_object_ref (ref->info); return (GstPlayerStreamInfo *) ret; } @@ -517,8 +471,7 @@ GstPlayerSubtitleInfo *ret; 
ret = gst_player_subtitle_info_new (); - if (ref->language) - ret->language = g_strdup (ref->language); + ret->info = g_object_ref (ref->info); return (GstPlayerStreamInfo *) ret; } @@ -539,14 +492,6 @@ info = gst_player_subtitle_info_copy ((GstPlayerSubtitleInfo *) ref); info->stream_index = ref->stream_index; - if (ref->tags) - info->tags = gst_tag_list_ref (ref->tags); - if (ref->caps) - info->caps = gst_caps_copy (ref->caps); - if (ref->codec) - info->codec = g_strdup (ref->codec); - if (ref->stream_id) - info->stream_id = g_strdup (ref->stream_id); return info; } @@ -560,20 +505,9 @@ if (!ref) return NULL; - info = gst_player_media_info_new (ref->uri); - info->duration = ref->duration; - info->seekable = ref->seekable; - info->is_live = ref->is_live; - if (ref->tags) - info->tags = gst_tag_list_ref (ref->tags); - if (ref->title) - info->title = g_strdup (ref->title); - if (ref->container) - info->container = g_strdup (ref->container); - if (ref->image_sample) - info->image_sample = gst_sample_ref (ref->image_sample); - - for (l = ref->stream_list; l != NULL; l = l->next) { + info = gst_player_media_info_new (); + + for (l = gst_player_media_info_get_stream_list (ref); l != NULL; l = l->next) { GstPlayerStreamInfo *s; s = gst_player_stream_info_copy ((GstPlayerStreamInfo *) l->data); @@ -588,6 +522,8 @@ g_list_append (info->subtitle_stream_list, s); } + info->info = g_object_ref (ref->info); + return info; } @@ -608,17 +544,99 @@ return info; } +GstPlayerStreamInfo * +gst_player_stream_info_wrapped (GstPlayStreamInfo * info) +{ + GstPlayerStreamInfo *ret; + GType type; + + if (GST_IS_PLAY_AUDIO_INFO (info)) { + type = GST_TYPE_PLAYER_AUDIO_INFO; + } else if (GST_IS_PLAY_VIDEO_INFO (info)) { + type = GST_TYPE_PLAYER_VIDEO_INFO; + } else { + type = GST_TYPE_PLAYER_SUBTITLE_INFO; + } + + ret = + gst_player_stream_info_new (gst_play_stream_info_get_index (info), type); + ret->info = g_object_ref (info); + return ret; +} + GstPlayerMediaInfo * 
-gst_player_media_info_new (const gchar * uri) +gst_player_media_info_new (void) { - GstPlayerMediaInfo *info; + return g_object_new (GST_TYPE_PLAYER_MEDIA_INFO, NULL); +} - g_return_val_if_fail (uri != NULL, NULL); +GstPlayerMediaInfo * +gst_player_media_info_wrapped (GstPlayMediaInfo * info) +{ + GstPlayerMediaInfo *ret; + GList *l; - info = g_object_new (GST_TYPE_PLAYER_MEDIA_INFO, NULL); - info->uri = g_strdup (uri); + ret = gst_player_media_info_new (); + ret->info = g_object_ref (info); - return info; + for (l = gst_play_media_info_get_stream_list (info); l != NULL; l = l->next) { + GstPlayerStreamInfo *s; + + s = gst_player_stream_info_wrapped ((GstPlayStreamInfo *) l->data); + ret->stream_list = g_list_append (ret->stream_list, s); + + if (GST_IS_PLAYER_AUDIO_INFO (s)) { + GstPlayerAudioInfo *i = GST_PLAYER_AUDIO_INFO (s); + i->info = g_object_ref (GST_PLAY_AUDIO_INFO (l->data)); + ret->audio_stream_list = g_list_append (ret->audio_stream_list, i); + } else if (GST_IS_PLAYER_VIDEO_INFO (s)) { + GstPlayerVideoInfo *i = GST_PLAYER_VIDEO_INFO (s); + i->info = g_object_ref (GST_PLAY_VIDEO_INFO (l->data)); + ret->video_stream_list = g_list_append (ret->video_stream_list, i); + } else { + GstPlayerSubtitleInfo *i = GST_PLAYER_SUBTITLE_INFO (s); + i->info = g_object_ref (GST_PLAY_SUBTITLE_INFO (l->data)); + ret->subtitle_stream_list = g_list_append (ret->subtitle_stream_list, i); + } + } + + return ret; +} + +GstPlayerAudioInfo * +gst_player_audio_info_wrapped (GstPlayAudioInfo * info) +{ + GstPlayerStreamInfo *s; + GstPlayerAudioInfo *i; + + s = gst_player_stream_info_wrapped ((GstPlayStreamInfo *) info); + i = GST_PLAYER_AUDIO_INFO (s); + i->info = g_object_ref (info); + return i; +} + +GstPlayerVideoInfo * +gst_player_video_info_wrapped (GstPlayVideoInfo * info) +{ + GstPlayerStreamInfo *s; + GstPlayerVideoInfo *i; + + s = gst_player_stream_info_wrapped ((GstPlayStreamInfo *) info); + i = GST_PLAYER_VIDEO_INFO (s); + i->info = g_object_ref (info); + return i; 
+} + +GstPlayerSubtitleInfo * +gst_player_subtitle_info_wrapped (GstPlaySubtitleInfo * info) +{ + GstPlayerStreamInfo *s; + GstPlayerSubtitleInfo *i; + + s = gst_player_stream_info_wrapped ((GstPlayStreamInfo *) info); + i = GST_PLAYER_SUBTITLE_INFO (s); + i->info = g_object_ref (info); + return i; } /** @@ -632,7 +650,7 @@ { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), NULL); - return info->uri; + return gst_play_media_info_get_uri (info->info); } /** @@ -646,7 +664,7 @@ { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), FALSE); - return info->seekable; + return gst_play_media_info_is_seekable (info->info); } /** @@ -660,7 +678,7 @@ { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), FALSE); - return info->is_live; + return gst_play_media_info_is_live (info->info); } /** @@ -727,56 +745,56 @@ * gst_player_media_info_get_duration: * @info: a #GstPlayerMediaInfo * - * Returns: duration of the media. + * Returns: duration of the media or %GST_CLOCK_TIME_NONE if unknown. */ GstClockTime gst_player_media_info_get_duration (const GstPlayerMediaInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), -1); - return info->duration; + return gst_play_media_info_get_duration (info->info); } /** * gst_player_media_info_get_tags: * @info: a #GstPlayerMediaInfo * - * Returns: (transfer none): the tags contained in media info. + * Returns: (transfer none) (nullable): the tags contained in media info. */ GstTagList * gst_player_media_info_get_tags (const GstPlayerMediaInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), NULL); - return info->tags; + return gst_play_media_info_get_tags (info->info); } /** * gst_player_media_info_get_title: * @info: a #GstPlayerMediaInfo * - * Returns: the media title. + * Returns: (nullable): the media title or %NULL if unknown. 
*/ const gchar * gst_player_media_info_get_title (const GstPlayerMediaInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), NULL); - return info->title; + return gst_play_media_info_get_title (info->info); } /** * gst_player_media_info_get_container_format: * @info: a #GstPlayerMediaInfo * - * Returns: the container format. + * Returns: (nullable): the container format or %NULL if unknown. */ const gchar * gst_player_media_info_get_container_format (const GstPlayerMediaInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), NULL); - return info->container; + return gst_play_media_info_get_container_format (info->info); } /** @@ -786,21 +804,22 @@ * Function to get the image (or preview-image) stored in taglist. * Application can use `gst_sample_*_()` API's to get caps, buffer etc. * - * Returns: (transfer none): GstSample or NULL. + * Returns: (transfer none) (nullable): GstSample or %NULL. */ GstSample * gst_player_media_info_get_image_sample (const GstPlayerMediaInfo * info) { g_return_val_if_fail (GST_IS_PLAYER_MEDIA_INFO (info), NULL); - return info->image_sample; + return gst_play_media_info_get_image_sample (info->info); } /** * gst_player_media_info_get_number_of_streams: * @info: a #GstPlayerMediaInfo * - * Returns: number of total streams. + * Returns: number of total streams or 0 if unknown. + * * Since: 1.12 */ guint @@ -815,7 +834,8 @@ * gst_player_media_info_get_number_of_video_streams: * @info: a #GstPlayerMediaInfo * - * Returns: number of video streams. + * Returns: number of video streams or 0 if unknown. + * * Since: 1.12 */ guint @@ -831,7 +851,8 @@ * gst_player_media_info_get_number_of_audio_streams: * @info: a #GstPlayerMediaInfo * - * Returns: number of audio streams. + * Returns: number of audio streams or 0 if unknown. + * * Since: 1.12 */ guint @@ -847,7 +868,8 @@ * gst_player_media_info_get_number_of_subtitle_streams: * @info: a #GstPlayerMediaInfo * - * Returns: number of subtitle streams. 
+ * Returns: number of subtitle streams or 0 if unknown. + * * Since: 1.12 */ guint gst_player_media_info_get_number_of_subtitle_streams
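The pattern running through the gstplayer-media-info.c diff is uniform: each GstPlayer struct loses its duplicated fields and keeps only a pointer to the corresponding GstPlay object, and each getter becomes a one-line delegation with the documented "unknown" fallback. A minimal sketch of that delegation shape, in plain C with hypothetical names (no GObject machinery, only the structure mirrors the patch):

```c
#include <assert.h>
#include <stddef.h>

/* Stand-in for the new backend type (GstPlayAudioInfo in the diff). */
typedef struct
{
  int channels;
  int sample_rate;
} PlayAudioInfo;

/* Legacy wrapper: after the rewrite it holds only a backend pointer,
 * like GstPlayerAudioInfo keeping a single GstPlayAudioInfo *info. */
typedef struct
{
  PlayAudioInfo *info;
} PlayerAudioInfo;

/* Legacy getter delegating to the backend; 0 means "unknown",
 * matching the updated documentation in the diff. */
int
player_audio_info_get_channels (const PlayerAudioInfo * self)
{
  if (self == NULL || self->info == NULL)
    return 0;
  return self->info->channels;
}

/* Small driver wiring a wrapper to a backend object. */
int
demo (void)
{
  static PlayAudioInfo backend = { 2, 48000 };
  static PlayerAudioInfo wrapper = { &backend };
  return player_audio_info_get_channels (&wrapper);
}
```

The benefit of this shape, visible throughout the diff, is that the legacy object can no longer drift out of sync with the new API: there is exactly one copy of each piece of stream metadata.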
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/player/gstplayer-wrapped-video-renderer-private.h
Added
@@ -0,0 +1,49 @@
+/* GStreamer
+ *
+ * Copyright (C) 2020 Philippe Normand <philn@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_PLAYER_WRAPPED_VIDEO_RENDERER_H__
+#define __GST_PLAYER_WRAPPED_VIDEO_RENDERER_H__
+
+#include <gst/player/gstplayer-types.h>
+#include <gst/player/gstplayer-video-renderer.h>
+
+G_BEGIN_DECLS
+
+typedef struct _GstPlayerWrappedVideoRenderer
+    GstPlayerWrappedVideoRenderer;
+typedef struct _GstPlayerWrappedVideoRendererClass
+    GstPlayerWrappedVideoRendererClass;
+
+#define GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER (gst_player_wrapped_video_renderer_get_type ())
+#define GST_IS_PLAYER_WRAPPED_VIDEO_RENDERER(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER))
+#define GST_IS_PLAYER_WRAPPED_VIDEO_RENDERER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER))
+#define GST_PLAYER_WRAPPED_VIDEO_RENDERER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER, GstPlayerWrappedVideoRendererClass))
+#define GST_PLAYER_WRAPPED_VIDEO_RENDERER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER, GstPlayerWrappedVideoRenderer))
+#define GST_PLAYER_WRAPPED_VIDEO_RENDERER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER, GstPlayerWrappedVideoRendererClass))
+#define GST_PLAYER_WRAPPED_VIDEO_RENDERER_CAST(obj) ((GstPlayerWrappedVideoRenderer*)(obj))
+
+GType gst_player_wrapped_video_renderer_get_type (void);
+
+GstPlayerVideoRenderer * gst_player_wrapped_video_renderer_new (GstPlayerVideoRenderer * renderer, GstPlayer * player);
+
+
+G_END_DECLS
+
+#endif /* __GST_PLAYER_WRAPPED_VIDEO_RENDERER_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/player/gstplayer-wrapped-video-renderer.c
Added
@@ -0,0 +1,114 @@
+/* GStreamer
+ *
+ * Copyright (C) 2020 Philippe Normand <philn@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/play/gstplay-video-renderer.h>
+
+#include "gstplayer-wrapped-video-renderer-private.h"
+#include "gstplayer.h"
+#include "gstplayer-video-renderer-private.h"
+
+/*
+ * This object is an internal wrapper created by the GstPlayer, implementing the
+ * new GstPlayVideoRenderer interface and acting as a bridge from the legacy
+ * GstPlayerVideoRenderer interface.
+ */
+
+struct _GstPlayerWrappedVideoRenderer
+{
+  GObject parent;
+
+  GstPlayerVideoRenderer *renderer;
+  GstPlayer *player;
+};
+
+struct _GstPlayerWrappedVideoRendererClass
+{
+  GObjectClass parent_class;
+};
+
+static void
+    gst_player_wrapped_video_renderer_interface_init
+    (GstPlayVideoRendererInterface * iface);
+
+G_DEFINE_TYPE_WITH_CODE (GstPlayerWrappedVideoRenderer,
+    gst_player_wrapped_video_renderer, G_TYPE_OBJECT,
+    G_IMPLEMENT_INTERFACE (GST_TYPE_PLAY_VIDEO_RENDERER,
+        gst_player_wrapped_video_renderer_interface_init));
+
+static void
+gst_player_wrapped_video_renderer_finalize (GObject * object)
+{
+  GstPlayerWrappedVideoRenderer *self =
+      GST_PLAYER_WRAPPED_VIDEO_RENDERER (object);
+
+  g_clear_object (&self->renderer);
+
+  G_OBJECT_CLASS
+      (gst_player_wrapped_video_renderer_parent_class)->finalize (object);
+}
+
+static void
+gst_player_wrapped_video_renderer_class_init (GstPlayerWrappedVideoRendererClass
+    * klass)
+{
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+
+  gobject_class->finalize = gst_player_wrapped_video_renderer_finalize;
+}
+
+static void
+gst_player_wrapped_video_renderer_init (GstPlayerWrappedVideoRenderer * self)
+{
+}
+
+static GstElement *
+gst_player_wrapped_video_renderer_create_video_sink (GstPlayVideoRenderer *
+    iface, GstPlay * player)
+{
+  GstPlayerWrappedVideoRenderer *self =
+      GST_PLAYER_WRAPPED_VIDEO_RENDERER (iface);
+
+  return gst_player_video_renderer_create_video_sink (self->renderer,
+      self->player);
+}
+
+static void
+gst_player_wrapped_video_renderer_interface_init (GstPlayVideoRendererInterface
+    * iface)
+{
+  iface->create_video_sink =
+      gst_player_wrapped_video_renderer_create_video_sink;
+}
+
+GstPlayerVideoRenderer *
+gst_player_wrapped_video_renderer_new (GstPlayerVideoRenderer * renderer,
+    GstPlayer * player)
+{
+  GstPlayerWrappedVideoRenderer *self =
+      g_object_new (GST_TYPE_PLAYER_WRAPPED_VIDEO_RENDERER,
+      NULL);
+  self->renderer = g_object_ref (renderer);
+  self->player = player;
+  return (GstPlayerVideoRenderer *) self;
+}
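GstPlayerWrappedVideoRenderer is a classic adapter: it implements the new interface's single vtable entry by forwarding to the legacy renderer it owns. Stripped of the GObject boilerplate, the shape is roughly the following plain-C sketch (all names hypothetical; only the forwarding structure mirrors the file above):

```c
#include <assert.h>
#include <string.h>

/* Legacy renderer: in this sketch just a callback returning a sink name. */
typedef const char *(*LegacyCreateSink) (void);

/* New-style interface vtable (stands in for GstPlayVideoRendererInterface). */
typedef struct
{
  const char *(*create_video_sink) (void *self);
} NewRendererIface;

/* Adapter: implements the new interface by delegating to the legacy
 * callback, as gst_player_wrapped_video_renderer_create_video_sink does. */
typedef struct
{
  const NewRendererIface *iface;
  LegacyCreateSink legacy;
} WrappedRenderer;

static const char *
wrapped_create_sink (void *self)
{
  WrappedRenderer *w = self;
  return w->legacy ();          /* forward to the legacy implementation */
}

static const NewRendererIface wrapped_iface = { wrapped_create_sink };

static const char *
legacy_sink (void)
{
  return "autovideosink";
}

/* Constructor mirroring gst_player_wrapped_video_renderer_new (). */
WrappedRenderer
wrapped_renderer_new (LegacyCreateSink legacy)
{
  WrappedRenderer w = { &wrapped_iface, legacy };
  return w;
}

/* Call through the new interface; the legacy callback answers. */
int
demo_matches (void)
{
  WrappedRenderer w = wrapped_renderer_new (legacy_sink);
  return strcmp (w.iface->create_video_sink (&w), "autovideosink") == 0;
}
```

This is why the deprecation can be gradual: callers keep handing GstPlayer a legacy GstPlayerVideoRenderer, while internally GstPlay only ever sees the adapter.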
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/player/gstplayer.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/player/gstplayer.c
Changed
@@ -26,6 +26,8 @@ * @symbols: * - GstPlayer * + * Starting from GStreamer 1.20, application developers are strongly advised to migrate to #GstPlay. + * #GstPlayer will be deprecated in 1.20 and most likely removed by 1.24. */ /* TODO: @@ -49,8 +51,10 @@ #include "gstplayer-signal-dispatcher-private.h" #include "gstplayer-video-renderer-private.h" #include "gstplayer-media-info-private.h" +#include "gstplayer-wrapped-video-renderer-private.h" #include <gst/gst.h> +#include <gst/play/play.h> #include <gst/video/video.h> #include <gst/video/colorbalance.h> #include <gst/tag/tag.h> @@ -98,7 +102,7 @@ "accurate-seek", }; -GQuark _config_quark_table[CONFIG_QUARK_MAX]; +static GQuark _config_quark_table[CONFIG_QUARK_MAX]; #define CONFIG_QUARK(q) _config_quark_table[CONFIG_QUARK_##q] @@ -144,67 +148,15 @@ SIGNAL_LAST }; -enum -{ - GST_PLAY_FLAG_VIDEO = (1 << 0), - GST_PLAY_FLAG_AUDIO = (1 << 1), - GST_PLAY_FLAG_SUBTITLE = (1 << 2), - GST_PLAY_FLAG_VIS = (1 << 3) -}; - struct _GstPlayer { GstObject parent; - GstPlayerVideoRenderer *video_renderer; - GstPlayerSignalDispatcher *signal_dispatcher; + GstPlay *play; + GstPlaySignalAdapter *signal_adapter; - gchar *uri; - gchar *redirect_uri; - gchar *suburi; - - GThread *thread; - GMutex lock; - GCond cond; - GMainContext *context; - GMainLoop *loop; - - GstElement *playbin; - GstBus *bus; - GstState target_state, current_state; - gboolean is_live, is_eos; - GSource *tick_source, *ready_timeout_source; - GstClockTime cached_duration; - - gdouble rate; - - GstPlayerState app_state; - gint buffering; - - GstTagList *global_tags; - GstPlayerMediaInfo *media_info; - - GstElement *current_vis_element; - - GstStructure *config; - - /* Protected by lock */ - gboolean seek_pending; /* Only set from main context */ - GstClockTime last_seek_time; /* Only set from main context */ - GSource *seek_source; - GstClockTime seek_position; - /* If TRUE, all signals are inhibited except the - * state-changed:GST_PLAYER_STATE_STOPPED/PAUSED. 
This ensures that no signal - * is emitted after gst_player_stop/pause() has been called by the user. */ - gboolean inhibit_sigs; - - /* For playbin3 */ - gboolean use_playbin3; - GstStreamCollection *collection; - gchar *video_sid; - gchar *audio_sid; - gchar *subtitle_sid; - gulong stream_notify_id; + /* legacy */ + GstPlayerSignalDispatcher *signal_dispatcher; }; struct _GstPlayerClass @@ -218,92 +170,16 @@ static guint signals[SIGNAL_LAST] = { 0, }; static GParamSpec *param_specs[PROP_LAST] = { NULL, }; -static void gst_player_dispose (GObject * object); static void gst_player_finalize (GObject * object); static void gst_player_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_player_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); -static void gst_player_constructed (GObject * object); - -static gpointer gst_player_main (gpointer data); - -static void gst_player_seek_internal_locked (GstPlayer * self); -static void gst_player_stop_internal (GstPlayer * self, gboolean transient); -static gboolean gst_player_pause_internal (gpointer user_data); -static gboolean gst_player_play_internal (gpointer user_data); -static gboolean gst_player_seek_internal (gpointer user_data); -static void gst_player_set_rate_internal (GstPlayer * self); -static void change_state (GstPlayer * self, GstPlayerState state); - -static GstPlayerMediaInfo *gst_player_media_info_create (GstPlayer * self); - -static void gst_player_streams_info_create (GstPlayer * self, - GstPlayerMediaInfo * media_info, const gchar * prop, GType type); -static void gst_player_stream_info_update (GstPlayer * self, - GstPlayerStreamInfo * s); -static void gst_player_stream_info_update_tags_and_caps (GstPlayer * self, - GstPlayerStreamInfo * s); -static GstPlayerStreamInfo *gst_player_stream_info_find (GstPlayerMediaInfo * - media_info, GType type, gint stream_index); -static GstPlayerStreamInfo 
*gst_player_stream_info_get_current (GstPlayer *
-    self, const gchar * prop, GType type);
-
-static void gst_player_video_info_update (GstPlayer * self,
-    GstPlayerStreamInfo * stream_info);
-static void gst_player_audio_info_update (GstPlayer * self,
-    GstPlayerStreamInfo * stream_info);
-static void gst_player_subtitle_info_update (GstPlayer * self,
-    GstPlayerStreamInfo * stream_info);
-
-/* For playbin3 */
-static void gst_player_streams_info_create_from_collection (GstPlayer * self,
-    GstPlayerMediaInfo * media_info, GstStreamCollection * collection);
-static void gst_player_stream_info_update_from_stream (GstPlayer * self,
-    GstPlayerStreamInfo * s, GstStream * stream);
-static GstPlayerStreamInfo *gst_player_stream_info_find_from_stream_id
-    (GstPlayerMediaInfo * media_info, const gchar * stream_id);
-static GstPlayerStreamInfo *gst_player_stream_info_get_current_from_stream_id
-    (GstPlayer * self, const gchar * stream_id, GType type);
-static void stream_notify_cb (GstStreamCollection * collection,
-    GstStream * stream, GParamSpec * pspec, GstPlayer * self);
-
-static void emit_media_info_updated_signal (GstPlayer * self);
-
-static void *get_title (GstTagList * tags);
-static void *get_container_format (GstTagList * tags);
-static void *get_from_tags (GstPlayer * self, GstPlayerMediaInfo * media_info,
-    void *(*func) (GstTagList *));
-static void *get_cover_sample (GstTagList * tags);
-
-static void remove_seek_source (GstPlayer * self);
 static void
-gst_player_init (GstPlayer * self)
+gst_player_init (G_GNUC_UNUSED GstPlayer * self)
 {
-  GST_TRACE_OBJECT (self, "Initializing");
-
-  self = gst_player_get_instance_private (self);
-
-  g_mutex_init (&self->lock);
-  g_cond_init (&self->cond);
-
-  self->context = g_main_context_new ();
-  self->loop = g_main_loop_new (self->context, FALSE);
-
-  /* *INDENT-OFF* */
-  self->config = gst_structure_new_id (QUARK_CONFIG,
-      CONFIG_QUARK (POSITION_INTERVAL_UPDATE), G_TYPE_UINT, DEFAULT_POSITION_UPDATE_INTERVAL_MS,
-      CONFIG_QUARK (ACCURATE_SEEK), G_TYPE_BOOLEAN, FALSE,
-      NULL);
-  /* *INDENT-ON* */
-  self->seek_pending = FALSE;
-  self->seek_position = GST_CLOCK_TIME_NONE;
-  self->last_seek_time = GST_CLOCK_TIME_NONE;
-  self->inhibit_sigs = FALSE;
-
-  GST_TRACE_OBJECT (self, "Initialized");
 }
@@ -330,15 +206,13 @@
   gobject_class->set_property = gst_player_set_property;
   gobject_class->get_property = gst_player_get_property;
-  gobject_class->dispose = gst_player_dispose;
   gobject_class->finalize = gst_player_finalize;
-  gobject_class->constructed = gst_player_constructed;
 
   param_specs[PROP_VIDEO_RENDERER] =
       g_param_spec_object ("video-renderer", "Video Renderer",
       "Video renderer to use for rendering videos",
       GST_TYPE_PLAYER_VIDEO_RENDERER,
-      G_PARAM_WRITABLE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS);
+      G_PARAM_READABLE | G_PARAM_STATIC_STRINGS);
 
   param_specs[PROP_SIGNAL_DISPATCHER] =
       g_param_spec_object ("signal-dispatcher",
@@ -495,260 +369,33 @@
 }
 
 static void
-gst_player_dispose (GObject * object)
-{
-  GstPlayer *self = GST_PLAYER (object);
-
-  GST_TRACE_OBJECT (self, "Stopping main thread");
-
-  if (self->loop) {
-    g_main_loop_quit (self->loop);
-
-    if (self->thread != g_thread_self ())
-      g_thread_join (self->thread);
-    else
-      g_thread_unref (self->thread);
-    self->thread = NULL;
-
-    g_main_loop_unref (self->loop);
-    self->loop = NULL;
-
-    g_main_context_unref (self->context);
-    self->context = NULL;
-  }
-
-  G_OBJECT_CLASS (parent_class)->dispose (object);
-}
-
-static void
 gst_player_finalize (GObject * object)
 {
   GstPlayer *self = GST_PLAYER (object);
 
   GST_TRACE_OBJECT (self, "Finalizing");
 
-  g_free (self->uri);
-  g_free (self->redirect_uri);
-  g_free (self->suburi);
-  g_free (self->video_sid);
-  g_free (self->audio_sid);
-  g_free (self->subtitle_sid);
-  if (self->global_tags)
-    gst_tag_list_unref (self->global_tags);
-  if (self->video_renderer)
-    g_object_unref (self->video_renderer);
   if (self->signal_dispatcher)
     g_object_unref (self->signal_dispatcher);
-  if (self->current_vis_element)
-    gst_object_unref (self->current_vis_element);
-  if (self->config)
-    gst_structure_free (self->config);
-  if (self->collection)
-    gst_object_unref (self->collection);
-  g_mutex_clear (&self->lock);
-  g_cond_clear (&self->cond);
+  if (self->play)
+    gst_object_unref (self->play);
 
   G_OBJECT_CLASS (parent_class)->finalize (object);
 }
 
 static void
-gst_player_constructed (GObject * object)
-{
-  GstPlayer *self = GST_PLAYER (object);
-
-  GST_TRACE_OBJECT (self, "Constructed");
-
-  g_mutex_lock (&self->lock);
-  self->thread = g_thread_new ("GstPlayer", gst_player_main, self);
-  while (!self->loop || !g_main_loop_is_running (self->loop))
-    g_cond_wait (&self->cond, &self->lock);
-  g_mutex_unlock (&self->lock);
-
-  G_OBJECT_CLASS (parent_class)->constructed (object);
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  gchar *uri;
-} UriLoadedSignalData;
-
-static void
-uri_loaded_dispatch (gpointer user_data)
-{
-  UriLoadedSignalData *data = user_data;
-
-  g_signal_emit (data->player, signals[SIGNAL_URI_LOADED], 0, data->uri);
-}
-
-static void
-uri_loaded_signal_data_free (UriLoadedSignalData * data)
-{
-  g_object_unref (data->player);
-  g_free (data->uri);
-  g_free (data);
-}
-
-static gboolean
-gst_player_set_uri_internal (gpointer user_data)
-{
-  GstPlayer *self = user_data;
-
-  gst_player_stop_internal (self, FALSE);
-
-  g_mutex_lock (&self->lock);
-
-  GST_DEBUG_OBJECT (self, "Changing URI to '%s'", GST_STR_NULL (self->uri));
-
-  g_object_set (self->playbin, "uri", self->uri, NULL);
-
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_URI_LOADED], 0, NULL, NULL, NULL) != 0) {
-    UriLoadedSignalData *data = g_new (UriLoadedSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->uri = g_strdup (self->uri);
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        uri_loaded_dispatch, data,
-        (GDestroyNotify) uri_loaded_signal_data_free);
-  }
-
-  g_object_set (self->playbin, "suburi", NULL, NULL);
-
-  g_mutex_unlock (&self->lock);
-
-  return G_SOURCE_REMOVE;
-}
-
-static gboolean
-gst_player_set_suburi_internal (gpointer user_data)
-{
-  GstPlayer *self = user_data;
-  GstClockTime position;
-  GstState target_state;
-
-  /* save the state and position */
-  target_state = self->target_state;
-  position = gst_player_get_position (self);
-
-  gst_player_stop_internal (self, TRUE);
-  g_mutex_lock (&self->lock);
-
-  GST_DEBUG_OBJECT (self, "Changing SUBURI to '%s'",
-      GST_STR_NULL (self->suburi));
-
-  g_object_set (self->playbin, "suburi", self->suburi, NULL);
-
-  g_mutex_unlock (&self->lock);
-
-  /* restore state and position */
-  if (position != GST_CLOCK_TIME_NONE)
-    gst_player_seek (self, position);
-  if (target_state == GST_STATE_PAUSED)
-    gst_player_pause_internal (self);
-  else if (target_state == GST_STATE_PLAYING)
-    gst_player_play_internal (self);
-
-  return G_SOURCE_REMOVE;
-}
-
-static void
-gst_player_set_rate_internal (GstPlayer * self)
-{
-  self->seek_position = gst_player_get_position (self);
-
-  /* If there is no seek being dispatch to the main context currently do that,
-   * otherwise we just updated the rate so that it will be taken by
-   * the seek handler from the main context instead of the old one.
-   */
-  if (!self->seek_source) {
-    /* If no seek is pending then create new seek source */
-    if (!self->seek_pending) {
-      self->seek_source = g_idle_source_new ();
-      g_source_set_callback (self->seek_source,
-          (GSourceFunc) gst_player_seek_internal, self, NULL);
-      g_source_attach (self->seek_source, self->context);
-    }
-  }
-}
-
-static void
 gst_player_set_property (GObject * object, guint prop_id,
     const GValue * value, GParamSpec * pspec)
 {
   GstPlayer *self = GST_PLAYER (object);
 
   switch (prop_id) {
-    case PROP_VIDEO_RENDERER:
-      self->video_renderer = g_value_dup_object (value);
-      break;
     case PROP_SIGNAL_DISPATCHER:
       self->signal_dispatcher = g_value_dup_object (value);
       break;
-    case PROP_URI:{
-      g_mutex_lock (&self->lock);
-      g_free (self->uri);
-      g_free (self->redirect_uri);
-      self->redirect_uri = NULL;
-
-      g_free (self->suburi);
-      self->suburi = NULL;
-
-      self->uri = g_value_dup_string (value);
-      GST_DEBUG_OBJECT (self, "Set uri=%s", self->uri);
-      g_mutex_unlock (&self->lock);
-
-      g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT,
-          gst_player_set_uri_internal, self, NULL);
-      break;
-    }
-    case PROP_SUBURI:{
-      g_mutex_lock (&self->lock);
-      g_free (self->suburi);
-
-      self->suburi = g_value_dup_string (value);
-      GST_DEBUG_OBJECT (self, "Set suburi=%s", self->suburi);
-      g_mutex_unlock (&self->lock);
-
-      g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT,
-          gst_player_set_suburi_internal, self, NULL);
-      break;
-    }
-    case PROP_VOLUME:
-      GST_DEBUG_OBJECT (self, "Set volume=%lf", g_value_get_double (value));
-      g_object_set_property (G_OBJECT (self->playbin), "volume", value);
-      break;
-    case PROP_RATE:
-      g_mutex_lock (&self->lock);
-      self->rate = g_value_get_double (value);
-      GST_DEBUG_OBJECT (self, "Set rate=%lf", g_value_get_double (value));
-      gst_player_set_rate_internal (self);
-      g_mutex_unlock (&self->lock);
-      break;
-    case PROP_MUTE:
-      GST_DEBUG_OBJECT (self, "Set mute=%d", g_value_get_boolean (value));
-      g_object_set_property (G_OBJECT (self->playbin), "mute", value);
-      break;
-    case PROP_VIDEO_MULTIVIEW_MODE:
-      GST_DEBUG_OBJECT (self, "Set multiview mode=%u",
-          g_value_get_enum (value));
-      g_object_set_property (G_OBJECT (self->playbin), "video-multiview-mode",
-          value);
-      break;
-    case PROP_VIDEO_MULTIVIEW_FLAGS:
-      GST_DEBUG_OBJECT (self, "Set multiview flags=%x",
-          g_value_get_flags (value));
-      g_object_set_property (G_OBJECT (self->playbin), "video-multiview-flags",
-          value);
-      break;
-    case PROP_AUDIO_VIDEO_OFFSET:
-      g_object_set_property (G_OBJECT (self->playbin), "av-offset", value);
-      break;
-    case PROP_SUBTITLE_VIDEO_OFFSET:
-      g_object_set_property (G_OBJECT (self->playbin), "text-offset", value);
-      break;
     default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      g_object_set_property (G_OBJECT (self->play),
+          g_param_spec_get_name (pspec), value);
      break;
  }
 }
@@ -760,2279 +407,141 @@
   GstPlayer *self = GST_PLAYER (object);
 
   switch (prop_id) {
-    case PROP_URI:
-      g_mutex_lock (&self->lock);
-      g_value_set_string (value, self->uri);
-      g_mutex_unlock (&self->lock);
-      break;
-    case PROP_SUBURI:
-      g_mutex_lock (&self->lock);
-      g_value_set_string (value, self->suburi);
-      g_mutex_unlock (&self->lock);
-      GST_DEBUG_OBJECT (self, "Returning suburi=%s",
-          g_value_get_string (value));
-      break;
-    case PROP_POSITION:{
-      gint64 position = GST_CLOCK_TIME_NONE;
-
-      gst_element_query_position (self->playbin, GST_FORMAT_TIME, &position);
-      g_value_set_uint64 (value, position);
-      GST_TRACE_OBJECT (self, "Returning position=%" GST_TIME_FORMAT,
-          GST_TIME_ARGS (g_value_get_uint64 (value)));
-      break;
-    }
-    case PROP_DURATION:{
-      g_value_set_uint64 (value, self->cached_duration);
-      GST_TRACE_OBJECT (self, "Returning duration=%" GST_TIME_FORMAT,
-          GST_TIME_ARGS (g_value_get_uint64 (value)));
-      break;
-    }
-    case PROP_MEDIA_INFO:{
-      GstPlayerMediaInfo *media_info = gst_player_get_media_info (self);
-      g_value_take_object (value, media_info);
-      break;
-    }
-    case PROP_CURRENT_AUDIO_TRACK:{
-      GstPlayerAudioInfo *audio_info =
-          gst_player_get_current_audio_track (self);
-      g_value_take_object (value, audio_info);
-      break;
-    }
-    case PROP_CURRENT_VIDEO_TRACK:{
-      GstPlayerVideoInfo *video_info =
-          gst_player_get_current_video_track (self);
-      g_value_take_object (value, video_info);
-      break;
-    }
-    case PROP_CURRENT_SUBTITLE_TRACK:{
-      GstPlayerSubtitleInfo *subtitle_info =
-          gst_player_get_current_subtitle_track (self);
-      g_value_take_object (value, subtitle_info);
+    case PROP_MEDIA_INFO:
+      g_value_take_object (value, gst_player_get_media_info (self));
       break;
-    }
-    case PROP_VOLUME:
-      g_object_get_property (G_OBJECT (self->playbin), "volume", value);
-      GST_TRACE_OBJECT (self, "Returning volume=%lf",
-          g_value_get_double (value));
+    case PROP_CURRENT_AUDIO_TRACK:
+      g_value_take_object (value, gst_player_get_current_audio_track (self));
       break;
-    case PROP_RATE:
-      g_mutex_lock (&self->lock);
-      g_value_set_double (value, self->rate);
-      g_mutex_unlock (&self->lock);
+    case PROP_CURRENT_VIDEO_TRACK:
+      g_value_take_object (value, gst_player_get_current_video_track (self));
       break;
-    case PROP_MUTE:
-      g_object_get_property (G_OBJECT (self->playbin), "mute", value);
-      GST_TRACE_OBJECT (self, "Returning mute=%d", g_value_get_boolean (value));
-      break;
-    case PROP_PIPELINE:
-      g_value_set_object (value, self->playbin);
-      break;
-    case PROP_VIDEO_MULTIVIEW_MODE:{
-      g_object_get_property (G_OBJECT (self->playbin), "video-multiview-mode",
-          value);
-      GST_TRACE_OBJECT (self, "Return multiview mode=%d",
-          g_value_get_enum (value));
-      break;
-    }
-    case PROP_VIDEO_MULTIVIEW_FLAGS:{
-      g_object_get_property (G_OBJECT (self->playbin), "video-multiview-flags",
-          value);
-      GST_TRACE_OBJECT (self, "Return multiview flags=%x",
-          g_value_get_flags (value));
-      break;
-    }
-    case PROP_AUDIO_VIDEO_OFFSET:
-      g_object_get_property (G_OBJECT (self->playbin), "av-offset", value);
-      break;
-    case PROP_SUBTITLE_VIDEO_OFFSET:
-      g_object_get_property (G_OBJECT (self->playbin), "text-offset", value);
+    case PROP_CURRENT_SUBTITLE_TRACK:
+      g_value_take_object (value, gst_player_get_current_subtitle_track (self));
       break;
     default:
-      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      g_object_get_property (G_OBJECT (self->play),
+          g_param_spec_get_name (pspec), value);
       break;
   }
 }
 
-static gboolean
-main_loop_running_cb (gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-
-  GST_TRACE_OBJECT (self, "Main loop running now");
-
-  g_mutex_lock (&self->lock);
-  g_cond_signal (&self->cond);
-  g_mutex_unlock (&self->lock);
-
-  return G_SOURCE_REMOVE;
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  GstPlayerState state;
-} StateChangedSignalData;
-
-static void
-state_changed_dispatch (gpointer user_data)
-{
-  StateChangedSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs && data->state != GST_PLAYER_STATE_STOPPED
-      && data->state != GST_PLAYER_STATE_PAUSED)
-    return;
-
-  g_signal_emit (data->player, signals[SIGNAL_STATE_CHANGED], 0, data->state);
-}
-
-static void
-state_changed_signal_data_free (StateChangedSignalData * data)
-{
-  g_object_unref (data->player);
-  g_free (data);
-}
-
-static void
-change_state (GstPlayer * self, GstPlayerState state)
-{
-  if (state == self->app_state)
-    return;
-
-  GST_DEBUG_OBJECT (self, "Changing app state from %s to %s",
-      gst_player_state_get_name (self->app_state),
-      gst_player_state_get_name (state));
-  self->app_state = state;
-
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_STATE_CHANGED], 0, NULL, NULL, NULL) != 0) {
-    StateChangedSignalData *data = g_new (StateChangedSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->state = state;
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        state_changed_dispatch, data,
-        (GDestroyNotify) state_changed_signal_data_free);
-  }
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  GstClockTime position;
-} PositionUpdatedSignalData;
-
-static void
-position_updated_dispatch (gpointer user_data)
-{
-  PositionUpdatedSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  if (data->player->target_state >= GST_STATE_PAUSED) {
-    g_signal_emit (data->player, signals[SIGNAL_POSITION_UPDATED], 0,
-        data->position);
-    g_object_notify_by_pspec (G_OBJECT (data->player),
-        param_specs[PROP_POSITION]);
-  }
-}
-
-static void
-position_updated_signal_data_free (PositionUpdatedSignalData * data)
-{
-  g_object_unref (data->player);
-  g_free (data);
-}
-
-static gboolean
-tick_cb (gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  gint64 position;
-
-  if (self->target_state >= GST_STATE_PAUSED
-      && gst_element_query_position (self->playbin, GST_FORMAT_TIME,
-          &position)) {
-    GST_LOG_OBJECT (self, "Position %" GST_TIME_FORMAT,
-        GST_TIME_ARGS (position));
-
-    if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-            signals[SIGNAL_POSITION_UPDATED], 0, NULL, NULL, NULL) != 0) {
-      PositionUpdatedSignalData *data = g_new (PositionUpdatedSignalData, 1);
-
-      data->player = g_object_ref (self);
-      data->position = position;
-      gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-          position_updated_dispatch, data,
-          (GDestroyNotify) position_updated_signal_data_free);
-    }
-  }
-
-  return G_SOURCE_CONTINUE;
-}
-
-static void
-add_tick_source (GstPlayer * self)
-{
-  guint position_update_interval_ms;
-
-  if (self->tick_source)
-    return;
-
-  position_update_interval_ms =
-      gst_player_config_get_position_update_interval (self->config);
-  if (!position_update_interval_ms)
-    return;
-
-  self->tick_source = g_timeout_source_new (position_update_interval_ms);
-  g_source_set_callback (self->tick_source, (GSourceFunc) tick_cb, self, NULL);
-  g_source_attach (self->tick_source, self->context);
-}
-
-static void
-remove_tick_source (GstPlayer * self)
-{
-  if (!self->tick_source)
-    return;
-
-  g_source_destroy (self->tick_source);
-  g_source_unref (self->tick_source);
-  self->tick_source = NULL;
-}
-
-static gboolean
-ready_timeout_cb (gpointer user_data)
-{
-  GstPlayer *self = user_data;
-
-  if (self->target_state <= GST_STATE_READY) {
-    GST_DEBUG_OBJECT (self, "Setting pipeline to NULL state");
-    self->target_state = GST_STATE_NULL;
-    self->current_state = GST_STATE_NULL;
-    gst_element_set_state (self->playbin, GST_STATE_NULL);
-  }
-
-  return G_SOURCE_REMOVE;
-}
-
-static void
-add_ready_timeout_source (GstPlayer * self)
-{
-  if (self->ready_timeout_source)
-    return;
-
-  self->ready_timeout_source = g_timeout_source_new_seconds (60);
-  g_source_set_callback (self->ready_timeout_source,
-      (GSourceFunc) ready_timeout_cb, self, NULL);
-  g_source_attach (self->ready_timeout_source, self->context);
-}
-
-static void
-remove_ready_timeout_source (GstPlayer * self)
-{
-  if (!self->ready_timeout_source)
-    return;
-
-  g_source_destroy (self->ready_timeout_source);
-  g_source_unref (self->ready_timeout_source);
-  self->ready_timeout_source = NULL;
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  GError *err;
-} ErrorSignalData;
-
-static void
-error_dispatch (gpointer user_data)
-{
-  ErrorSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  g_signal_emit (data->player, signals[SIGNAL_ERROR], 0, data->err);
-}
-
-static void
-free_error_signal_data (ErrorSignalData * data)
-{
-  g_object_unref (data->player);
-  g_clear_error (&data->err);
-  g_free (data);
-}
-
-static void
-emit_error (GstPlayer * self, GError * err)
-{
-  GST_ERROR_OBJECT (self, "Error: %s (%s, %d)", err->message,
-      g_quark_to_string (err->domain), err->code);
-
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_ERROR], 0, NULL, NULL, NULL) != 0) {
-    ErrorSignalData *data = g_new (ErrorSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->err = g_error_copy (err);
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        error_dispatch, data, (GDestroyNotify) free_error_signal_data);
-  }
-
-  g_error_free (err);
-
-  remove_tick_source (self);
-  remove_ready_timeout_source (self);
-
-  self->target_state = GST_STATE_NULL;
-  self->current_state = GST_STATE_NULL;
-  self->is_live = FALSE;
-  self->is_eos = FALSE;
-  gst_element_set_state (self->playbin, GST_STATE_NULL);
-  change_state (self, GST_PLAYER_STATE_STOPPED);
-  self->buffering = 100;
-
-  g_mutex_lock (&self->lock);
-  if (self->media_info) {
-    g_object_unref (self->media_info);
-    self->media_info = NULL;
-  }
-
-  if (self->global_tags) {
-    gst_tag_list_unref (self->global_tags);
-    self->global_tags = NULL;
-  }
-
-  self->seek_pending = FALSE;
-  remove_seek_source (self);
-  self->seek_position = GST_CLOCK_TIME_NONE;
-  self->last_seek_time = GST_CLOCK_TIME_NONE;
-  g_mutex_unlock (&self->lock);
-}
-
-static void
-dump_dot_file (GstPlayer * self, const gchar * name)
-{
-  gchar *full_name;
-
-  full_name = g_strdup_printf ("gst-player.%p.%s", self, name);
-
-  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (self->playbin),
-      GST_DEBUG_GRAPH_SHOW_ALL, full_name);
-
-  g_free (full_name);
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  GError *err;
-} WarningSignalData;
-
-static void
-warning_dispatch (gpointer user_data)
-{
-  WarningSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  g_signal_emit (data->player, signals[SIGNAL_WARNING], 0, data->err);
-}
-
-static void
-free_warning_signal_data (WarningSignalData * data)
-{
-  g_object_unref (data->player);
-  g_clear_error (&data->err);
-  g_free (data);
-}
-
-static void
-emit_warning (GstPlayer * self, GError * err)
-{
-  GST_ERROR_OBJECT (self, "Warning: %s (%s, %d)", err->message,
-      g_quark_to_string (err->domain), err->code);
-
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_WARNING], 0, NULL, NULL, NULL) != 0) {
-    WarningSignalData *data = g_new (WarningSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->err = g_error_copy (err);
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        warning_dispatch, data, (GDestroyNotify) free_warning_signal_data);
-  }
-
-  g_error_free (err);
-}
-
-static void
-error_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GError *err, *player_err;
-  gchar *name, *debug, *message, *full_message;
-
-  dump_dot_file (self, "error");
-
-  gst_message_parse_error (msg, &err, &debug);
-
-  name = gst_object_get_path_string (msg->src);
-  message = gst_error_get_message (err->domain, err->code);
-
-  if (debug)
-    full_message =
-        g_strdup_printf ("Error from element %s: %s\n%s\n%s", name, message,
-        err->message, debug);
-  else
-    full_message =
-        g_strdup_printf ("Error from element %s: %s\n%s", name, message,
-        err->message);
-
-  GST_ERROR_OBJECT (self, "ERROR: from element %s: %s", name, err->message);
-  if (debug != NULL)
-    GST_ERROR_OBJECT (self, "Additional debug info: %s", debug);
-
-  player_err =
-      g_error_new_literal (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-      full_message);
-  emit_error (self, player_err);
-
-  g_clear_error (&err);
-  g_free (debug);
-  g_free (name);
-  g_free (full_message);
-  g_free (message);
-}
-
-static void
-warning_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GError *err, *player_err;
-  gchar *name, *debug, *message, *full_message;
-
-  dump_dot_file (self, "warning");
-
-  gst_message_parse_warning (msg, &err, &debug);
-
-  name = gst_object_get_path_string (msg->src);
-  message = gst_error_get_message (err->domain, err->code);
-
-  if (debug)
-    full_message =
-        g_strdup_printf ("Warning from element %s: %s\n%s\n%s", name, message,
-        err->message, debug);
-  else
-    full_message =
-        g_strdup_printf ("Warning from element %s: %s\n%s", name, message,
-        err->message);
-
-  GST_WARNING_OBJECT (self, "WARNING: from element %s: %s", name, err->message);
-  if (debug != NULL)
-    GST_WARNING_OBJECT (self, "Additional debug info: %s", debug);
-
-  player_err =
-      g_error_new_literal (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-      full_message);
-  emit_warning (self, player_err);
-
-  g_clear_error (&err);
-  g_free (debug);
-  g_free (name);
-  g_free (full_message);
-  g_free (message);
-}
-
-static void
-eos_dispatch (gpointer user_data)
-{
-  GstPlayer *player = user_data;
-
-  if (player->inhibit_sigs)
-    return;
-
-  g_signal_emit (player, signals[SIGNAL_END_OF_STREAM], 0);
-}
-
-static void
-eos_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg,
-    gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-
-  GST_DEBUG_OBJECT (self, "End of stream");
-
-  tick_cb (self);
-  remove_tick_source (self);
-
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_END_OF_STREAM], 0, NULL, NULL, NULL) != 0) {
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        eos_dispatch, g_object_ref (self), (GDestroyNotify) g_object_unref);
-  }
-  change_state (self, GST_PLAYER_STATE_STOPPED);
-  self->buffering = 100;
-  self->is_eos = TRUE;
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  gint percent;
-} BufferingSignalData;
-
-static void
-buffering_dispatch (gpointer user_data)
-{
-  BufferingSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  if (data->player->target_state >= GST_STATE_PAUSED) {
-    g_signal_emit (data->player, signals[SIGNAL_BUFFERING], 0, data->percent);
-  }
-}
-
-static void
-buffering_signal_data_free (BufferingSignalData * data)
-{
-  g_object_unref (data->player);
-  g_free (data);
-}
-
-static void
-buffering_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  gint percent;
-
-  if (self->target_state < GST_STATE_PAUSED)
-    return;
-  if (self->is_live)
-    return;
-
-  gst_message_parse_buffering (msg, &percent);
-  GST_LOG_OBJECT (self, "Buffering %d%%", percent);
-
-  if (percent < 100 && self->target_state >= GST_STATE_PAUSED) {
-    GstStateChangeReturn state_ret;
-
-    GST_DEBUG_OBJECT (self, "Waiting for buffering to finish");
-    state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED);
-
-    if (state_ret == GST_STATE_CHANGE_FAILURE) {
-      emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-              "Failed to handle buffering"));
-      return;
-    }
-
-    change_state (self, GST_PLAYER_STATE_BUFFERING);
-  }
-
-  if (self->buffering != percent) {
-    if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-            signals[SIGNAL_BUFFERING], 0, NULL, NULL, NULL) != 0) {
-      BufferingSignalData *data = g_new (BufferingSignalData, 1);
-
-      data->player = g_object_ref (self);
-      data->percent = percent;
-      gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-          buffering_dispatch, data,
-          (GDestroyNotify) buffering_signal_data_free);
-    }
-
-    self->buffering = percent;
-  }
-
-
-  g_mutex_lock (&self->lock);
-  if (percent == 100 && (self->seek_position != GST_CLOCK_TIME_NONE ||
-          self->seek_pending)) {
-    g_mutex_unlock (&self->lock);
-
-    GST_DEBUG_OBJECT (self, "Buffering finished - seek pending");
-  } else if (percent == 100 && self->target_state >= GST_STATE_PLAYING
-      && self->current_state >= GST_STATE_PAUSED) {
-    GstStateChangeReturn state_ret;
-
-    g_mutex_unlock (&self->lock);
-
-    GST_DEBUG_OBJECT (self, "Buffering finished - going to PLAYING");
-    state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING);
-    /* Application state change is happening when the state change happened */
-    if (state_ret == GST_STATE_CHANGE_FAILURE)
-      emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-              "Failed to handle buffering"));
-  } else if (percent == 100 && self->target_state >= GST_STATE_PAUSED) {
-    g_mutex_unlock (&self->lock);
-
-    GST_DEBUG_OBJECT (self, "Buffering finished - staying PAUSED");
-    change_state (self, GST_PLAYER_STATE_PAUSED);
-  } else {
-    g_mutex_unlock (&self->lock);
-  }
-}
-
-static void
-clock_lost_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg,
-    gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GstStateChangeReturn state_ret;
-
-  GST_DEBUG_OBJECT (self, "Clock lost");
-  if (self->target_state >= GST_STATE_PLAYING) {
-    state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED);
-    if (state_ret != GST_STATE_CHANGE_FAILURE)
-      state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING);
-
-    if (state_ret == GST_STATE_CHANGE_FAILURE)
-      emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-              "Failed to handle clock loss"));
-  }
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  gint width, height;
-} VideoDimensionsChangedSignalData;
-
-static void
-video_dimensions_changed_dispatch (gpointer user_data)
-{
-  VideoDimensionsChangedSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  if (data->player->target_state >= GST_STATE_PAUSED) {
-    g_signal_emit (data->player, signals[SIGNAL_VIDEO_DIMENSIONS_CHANGED], 0,
-        data->width, data->height);
-  }
-}
-
-static void
-video_dimensions_changed_signal_data_free (VideoDimensionsChangedSignalData *
-    data)
-{
-  g_object_unref (data->player);
-  g_free (data);
-}
-
-static void
-check_video_dimensions_changed (GstPlayer * self)
-{
-  GstElement *video_sink;
-  GstPad *video_sink_pad;
-  GstCaps *caps;
-  GstVideoInfo info;
-  gint width = 0, height = 0;
-
-  g_object_get (self->playbin, "video-sink", &video_sink, NULL);
-  if (!video_sink)
-    goto out;
-
-  video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
-  if (!video_sink_pad) {
-    gst_object_unref (video_sink);
-    goto out;
-  }
-
-  caps = gst_pad_get_current_caps (video_sink_pad);
-
-  if (caps) {
-    if (gst_video_info_from_caps (&info, caps)) {
-      info.width = info.width * info.par_n / info.par_d;
-
-      GST_DEBUG_OBJECT (self, "Video dimensions changed: %dx%d", info.width,
-          info.height);
-      width = info.width;
-      height = info.height;
-    }
-
-    gst_caps_unref (caps);
-  }
-  gst_object_unref (video_sink_pad);
-  gst_object_unref (video_sink);
-
-out:
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_VIDEO_DIMENSIONS_CHANGED], 0, NULL, NULL, NULL) != 0) {
-    VideoDimensionsChangedSignalData *data =
-        g_new (VideoDimensionsChangedSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->width = width;
-    data->height = height;
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        video_dimensions_changed_dispatch, data,
-        (GDestroyNotify) video_dimensions_changed_signal_data_free);
-  }
-}
-
-static void
-notify_caps_cb (G_GNUC_UNUSED GObject * object,
-    G_GNUC_UNUSED GParamSpec * pspec, gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-
-  check_video_dimensions_changed (self);
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  GstClockTime duration;
-} DurationChangedSignalData;
-
-static void
-duration_changed_dispatch (gpointer user_data)
-{
-  DurationChangedSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  if (data->player->target_state >= GST_STATE_PAUSED) {
-    g_signal_emit (data->player, signals[SIGNAL_DURATION_CHANGED], 0,
-        data->duration);
-    g_object_notify_by_pspec (G_OBJECT (data->player),
-        param_specs[PROP_DURATION]);
-  }
-}
-
-static void
-duration_changed_signal_data_free (DurationChangedSignalData * data)
-{
-  g_object_unref (data->player);
-  g_free (data);
-}
-
-static void
-emit_duration_changed (GstPlayer * self, GstClockTime duration)
-{
-  gboolean updated = FALSE;
-
-  if (self->cached_duration == duration)
-    return;
-
-  GST_DEBUG_OBJECT (self, "Duration changed %" GST_TIME_FORMAT,
-      GST_TIME_ARGS (duration));
-
-  self->cached_duration = duration;
-  g_mutex_lock (&self->lock);
-  if (self->media_info) {
-    self->media_info->duration = duration;
-    updated = TRUE;
-  }
-  g_mutex_unlock (&self->lock);
-  if (updated) {
-    emit_media_info_updated_signal (self);
-  }
-
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_DURATION_CHANGED], 0, NULL, NULL, NULL) != 0) {
-    DurationChangedSignalData *data = g_new (DurationChangedSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->duration = duration;
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        duration_changed_dispatch, data,
-        (GDestroyNotify) duration_changed_signal_data_free);
-  }
-}
-
-typedef struct
-{
-  GstPlayer *player;
-  GstClockTime position;
-} SeekDoneSignalData;
-
-static void
-seek_done_dispatch (gpointer user_data)
-{
-  SeekDoneSignalData *data = user_data;
-
-  if (data->player->inhibit_sigs)
-    return;
-
-  g_signal_emit (data->player, signals[SIGNAL_SEEK_DONE], 0, data->position);
-}
-
-static void
-seek_done_signal_data_free (SeekDoneSignalData * data)
-{
-  g_object_unref (data->player);
-  g_free (data);
-}
-
-static void
-emit_seek_done (GstPlayer * self)
-{
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_SEEK_DONE], 0, NULL, NULL, NULL) != 0) {
-    SeekDoneSignalData *data = g_new (SeekDoneSignalData, 1);
-
-    data->player = g_object_ref (self);
-    data->position = gst_player_get_position (self);
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        seek_done_dispatch, data, (GDestroyNotify) seek_done_signal_data_free);
-  }
-}
-
-static void
-state_changed_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg,
-    gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GstState old_state, new_state, pending_state;
-
-  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
-
-  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (self->playbin)) {
-    gchar *transition_name;
-
-    GST_DEBUG_OBJECT (self, "Changed state old: %s new: %s pending: %s",
-        gst_element_state_get_name (old_state),
-        gst_element_state_get_name (new_state),
-        gst_element_state_get_name (pending_state));
-
-    transition_name = g_strdup_printf ("%s_%s",
-        gst_element_state_get_name (old_state),
-        gst_element_state_get_name (new_state));
-    dump_dot_file (self, transition_name);
-    g_free (transition_name);
-
-    self->current_state = new_state;
-
-    if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED
-        && pending_state == GST_STATE_VOID_PENDING) {
-      GstElement *video_sink;
-      GstPad *video_sink_pad;
-      gint64 duration = -1;
-
-      GST_DEBUG_OBJECT (self, "Initial PAUSED - pre-rolled");
-
-      g_mutex_lock (&self->lock);
-      if (self->media_info)
-        g_object_unref (self->media_info);
-      self->media_info = gst_player_media_info_create (self);
-      g_mutex_unlock (&self->lock);
-      emit_media_info_updated_signal (self);
-
-      g_object_get (self->playbin, "video-sink", &video_sink, NULL);
-
-      if (video_sink) {
-        video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
-
-        if (video_sink_pad) {
-          g_signal_connect (video_sink_pad, "notify::caps",
-              (GCallback) notify_caps_cb, self);
-          gst_object_unref (video_sink_pad);
-        }
-        gst_object_unref (video_sink);
-      }
-
-      check_video_dimensions_changed (self);
-      if (gst_element_query_duration (self->playbin, GST_FORMAT_TIME,
-              &duration)) {
-        emit_duration_changed (self, duration);
-      } else {
-        self->cached_duration = GST_CLOCK_TIME_NONE;
-      }
-    }
-
-    if (new_state == GST_STATE_PAUSED
-        && pending_state == GST_STATE_VOID_PENDING) {
-      remove_tick_source (self);
-
-      g_mutex_lock (&self->lock);
-      if (self->seek_pending) {
-        self->seek_pending = FALSE;
-
-        if (!self->media_info->seekable) {
-          GST_DEBUG_OBJECT (self, "Media is not seekable");
-          remove_seek_source (self);
-          self->seek_position = GST_CLOCK_TIME_NONE;
-          self->last_seek_time = GST_CLOCK_TIME_NONE;
-        } else if (self->seek_source) {
-          GST_DEBUG_OBJECT (self, "Seek finished but new seek is pending");
-          gst_player_seek_internal_locked (self);
-        } else {
-          GST_DEBUG_OBJECT (self, "Seek finished");
-          emit_seek_done (self);
-        }
-      }
-
-      if (self->seek_position != GST_CLOCK_TIME_NONE) {
-        GST_DEBUG_OBJECT (self, "Seeking now that we reached PAUSED state");
-        gst_player_seek_internal_locked (self);
-        g_mutex_unlock (&self->lock);
-      } else if (!self->seek_pending) {
-        g_mutex_unlock (&self->lock);
-
-        tick_cb (self);
-
-        if (self->target_state >= GST_STATE_PLAYING && self->buffering == 100) {
-          GstStateChangeReturn state_ret;
-
-          state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING);
-          if (state_ret == GST_STATE_CHANGE_FAILURE)
-            emit_error (self, g_error_new (GST_PLAYER_ERROR,
-                    GST_PLAYER_ERROR_FAILED, "Failed to play"));
-        } else if (self->buffering == 100) {
-          change_state (self, GST_PLAYER_STATE_PAUSED);
-        }
-      } else {
-        g_mutex_unlock (&self->lock);
-      }
-    } else if (new_state == GST_STATE_PLAYING
-        && pending_state == GST_STATE_VOID_PENDING) {
-
-      /* If no seek is currently pending, add the tick source. This can happen
-       * if we seeked already but the state-change message was still queued up */
-      if (!self->seek_pending) {
-        add_tick_source (self);
-        change_state (self, GST_PLAYER_STATE_PLAYING);
-      }
-    } else if (new_state == GST_STATE_READY && old_state > GST_STATE_READY) {
-      change_state (self, GST_PLAYER_STATE_STOPPED);
-    } else {
-      /* Otherwise we neither reached PLAYING nor PAUSED, so must
-       * wait for something to happen... i.e. are BUFFERING now */
-      change_state (self, GST_PLAYER_STATE_BUFFERING);
-    }
-  }
-}
-
-static void
-duration_changed_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg,
-    gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  gint64 duration = GST_CLOCK_TIME_NONE;
-
-  if (gst_element_query_duration (self->playbin, GST_FORMAT_TIME, &duration)) {
-    emit_duration_changed (self, duration);
-  }
-}
-
-static void
-latency_cb (G_GNUC_UNUSED GstBus * bus, G_GNUC_UNUSED GstMessage * msg,
-    gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-
-  GST_DEBUG_OBJECT (self, "Latency changed");
-
-  gst_bin_recalculate_latency (GST_BIN (self->playbin));
-}
-
-static void
-request_state_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg,
-    gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GstState state;
-  GstStateChangeReturn state_ret;
-
-  gst_message_parse_request_state (msg, &state);
-
-  GST_DEBUG_OBJECT (self, "State %s requested",
-      gst_element_state_get_name (state));
-
-  self->target_state = state;
-  state_ret = gst_element_set_state (self->playbin, state);
-  if (state_ret == GST_STATE_CHANGE_FAILURE)
-    emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-            "Failed to change to requested state %s",
-            gst_element_state_get_name (state)));
-}
-
-static void
-media_info_update (GstPlayer * self, GstPlayerMediaInfo * info)
-{
-  g_free (info->title);
-  info->title = get_from_tags (self, info, get_title);
-
-  g_free (info->container);
-  info->container = get_from_tags (self, info, get_container_format);
-
-  if (info->image_sample)
-    gst_sample_unref (info->image_sample);
-  info->image_sample = get_from_tags (self, info, get_cover_sample);
-
-  GST_DEBUG_OBJECT (self, "title: %s, container: %s "
-      "image_sample: %p", info->title, info->container, info->image_sample);
-}
-
-static void
-tags_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
- GstTagList *tags = NULL; - - gst_message_parse_tag (msg, &tags); - - GST_DEBUG_OBJECT (self, "received %s tags", - gst_tag_list_get_scope (tags) == - GST_TAG_SCOPE_GLOBAL ? "global" : "stream"); - - if (gst_tag_list_get_scope (tags) == GST_TAG_SCOPE_GLOBAL) { - g_mutex_lock (&self->lock); - if (self->media_info) { - if (self->media_info->tags) - gst_tag_list_unref (self->media_info->tags); - self->media_info->tags = gst_tag_list_ref (tags); - media_info_update (self, self->media_info); - g_mutex_unlock (&self->lock); - emit_media_info_updated_signal (self); - } else { - if (self->global_tags) - gst_tag_list_unref (self->global_tags); - self->global_tags = gst_tag_list_ref (tags); - g_mutex_unlock (&self->lock); - } - } - - gst_tag_list_unref (tags); -} - -static void -element_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) -{ - GstPlayer *self = GST_PLAYER (user_data); - const GstStructure *s; - - s = gst_message_get_structure (msg); - if (gst_structure_has_name (s, "redirect")) { - const gchar *new_location; - - new_location = gst_structure_get_string (s, "new-location"); - if (!new_location) { - const GValue *locations_list, *location_val; - guint i, size; - - locations_list = gst_structure_get_value (s, "locations"); - size = gst_value_list_get_size (locations_list); - for (i = 0; i < size; ++i) { - const GstStructure *location_s; - - location_val = gst_value_list_get_value (locations_list, i); - if (!GST_VALUE_HOLDS_STRUCTURE (location_val)) - continue; - - location_s = (const GstStructure *) g_value_get_boxed (location_val); - if (!gst_structure_has_name (location_s, "redirect")) - continue; - - new_location = gst_structure_get_string (location_s, "new-location"); - if (new_location) - break; - } - } - - if (new_location) { - GstState target_state; - - GST_DEBUG_OBJECT (self, "Redirect to '%s'", new_location); - - /* Remember target state and restore after setting the URI */ - target_state = self->target_state; - - 
gst_player_stop_internal (self, TRUE); - - g_mutex_lock (&self->lock); - g_free (self->redirect_uri); - self->redirect_uri = g_strdup (new_location); - g_object_set (self->playbin, "uri", self->redirect_uri, NULL); - g_mutex_unlock (&self->lock); - - if (target_state == GST_STATE_PAUSED) - gst_player_pause_internal (self); - else if (target_state == GST_STATE_PLAYING) - gst_player_play_internal (self); - } - } -} - -/* Must be called with lock */ -static gboolean -update_stream_collection (GstPlayer * self, GstStreamCollection * collection) -{ - if (self->collection && self->collection == collection) - return FALSE; - - if (self->collection && self->stream_notify_id) - g_signal_handler_disconnect (self->collection, self->stream_notify_id); - - gst_object_replace ((GstObject **) & self->collection, - (GstObject *) collection); - if (self->media_info) { - gst_object_unref (self->media_info); - self->media_info = gst_player_media_info_create (self); - } - - self->stream_notify_id = - g_signal_connect (self->collection, "stream-notify", - G_CALLBACK (stream_notify_cb), self); - - return TRUE; -} - -static void -stream_collection_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, - gpointer user_data) -{ - GstPlayer *self = GST_PLAYER (user_data); - GstStreamCollection *collection = NULL; - gboolean updated = FALSE; - - gst_message_parse_stream_collection (msg, &collection); - - if (!collection) - return; - - g_mutex_lock (&self->lock); - updated = update_stream_collection (self, collection); - gst_object_unref (collection); - g_mutex_unlock (&self->lock); - - if (self->media_info && updated) - emit_media_info_updated_signal (self); -} - -static void -streams_selected_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, - gpointer user_data) -{ - GstPlayer *self = GST_PLAYER (user_data); - GstStreamCollection *collection = NULL; - gboolean updated = FALSE; - guint i, len; - - gst_message_parse_streams_selected (msg, &collection); - - if (!collection) - return; - - 
g_mutex_lock (&self->lock); - updated = update_stream_collection (self, collection); - gst_object_unref (collection); - - g_free (self->video_sid); - g_free (self->audio_sid); - g_free (self->subtitle_sid); - self->video_sid = NULL; - self->audio_sid = NULL; - self->subtitle_sid = NULL; - - len = gst_message_streams_selected_get_size (msg); - for (i = 0; i < len; i++) { - GstStream *stream; - GstStreamType stream_type; - const gchar *stream_id; - gchar **current_sid; - stream = gst_message_streams_selected_get_stream (msg, i); - stream_type = gst_stream_get_stream_type (stream); - stream_id = gst_stream_get_stream_id (stream); - if (stream_type & GST_STREAM_TYPE_AUDIO) - current_sid = &self->audio_sid; - else if (stream_type & GST_STREAM_TYPE_VIDEO) - current_sid = &self->video_sid; - else if (stream_type & GST_STREAM_TYPE_TEXT) - current_sid = &self->subtitle_sid; - else { - GST_WARNING_OBJECT (self, - "Unknown stream-id %s with type 0x%x", stream_id, stream_type); - continue; - } - - if (G_UNLIKELY (*current_sid)) { - GST_FIXME_OBJECT (self, - "Multiple streams are selected for type %s, choose the first one", - gst_stream_type_get_name (stream_type)); - continue; - } - - *current_sid = g_strdup (stream_id); - } - g_mutex_unlock (&self->lock); - - if (self->media_info && updated) - emit_media_info_updated_signal (self); -} - -static void -player_set_flag (GstPlayer * self, gint pos) -{ - gint flags; - - g_object_get (self->playbin, "flags", &flags, NULL); - flags |= pos; - g_object_set (self->playbin, "flags", flags, NULL); - - GST_DEBUG_OBJECT (self, "setting flags=%#x", flags); -} - -static void -player_clear_flag (GstPlayer * self, gint pos) -{ - gint flags; - - g_object_get (self->playbin, "flags", &flags, NULL); - flags &= ~pos; - g_object_set (self->playbin, "flags", flags, NULL); - - GST_DEBUG_OBJECT (self, "setting flags=%#x", flags); -} - -typedef struct -{ - GstPlayer *player; - GstPlayerMediaInfo *info; -} MediaInfoUpdatedSignalData; - -static void 
-media_info_updated_dispatch (gpointer user_data) -{ - MediaInfoUpdatedSignalData *data = user_data; - - if (data->player->inhibit_sigs) - return; - - if (data->player->target_state >= GST_STATE_PAUSED) { - g_signal_emit (data->player, signals[SIGNAL_MEDIA_INFO_UPDATED], 0, - data->info); - } -} - -static void -free_media_info_updated_signal_data (MediaInfoUpdatedSignalData * data) -{ - g_object_unref (data->player); - g_object_unref (data->info); - g_free (data); -} - -/* - * emit_media_info_updated_signal: - * - * create a new copy of self->media_info object and emits the newly created - * copy to user application. The newly created media_info will be unref'ed - * as part of signal finalize method. - */ -static void -emit_media_info_updated_signal (GstPlayer * self) -{ - MediaInfoUpdatedSignalData *data = g_new (MediaInfoUpdatedSignalData, 1); - data->player = g_object_ref (self); - g_mutex_lock (&self->lock); - data->info = gst_player_media_info_copy (self->media_info); - g_mutex_unlock (&self->lock); - - gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self, - media_info_updated_dispatch, data, - (GDestroyNotify) free_media_info_updated_signal_data); -} - -static GstCaps * -get_caps (GstPlayer * self, gint stream_index, GType type) -{ - GstPad *pad = NULL; - GstCaps *caps = NULL; - - if (type == GST_TYPE_PLAYER_VIDEO_INFO) - g_signal_emit_by_name (G_OBJECT (self->playbin), - "get-video-pad", stream_index, &pad); - else if (type == GST_TYPE_PLAYER_AUDIO_INFO) - g_signal_emit_by_name (G_OBJECT (self->playbin), - "get-audio-pad", stream_index, &pad); - else - g_signal_emit_by_name (G_OBJECT (self->playbin), - "get-text-pad", stream_index, &pad); - - if (pad) { - caps = gst_pad_get_current_caps (pad); - gst_object_unref (pad); - } - - return caps; -} - -static void -gst_player_subtitle_info_update (GstPlayer * self, - GstPlayerStreamInfo * stream_info) -{ - GstPlayerSubtitleInfo *info = (GstPlayerSubtitleInfo *) stream_info; - - if 
(stream_info->tags) { - - /* free the old language info */ - g_free (info->language); - info->language = NULL; - - /* First try to get the language full name from tag, if name is not - * available then try language code. If we find the language code - * then use gstreamer api to translate code to full name. - */ - gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_NAME, - &info->language); - if (!info->language) { - gchar *lang_code = NULL; - - gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_CODE, - &lang_code); - if (lang_code) { - info->language = g_strdup (gst_tag_get_language_name (lang_code)); - g_free (lang_code); - } - } - - /* If we are still failed to find language name then check if external - * subtitle is loaded and compare the stream index between current sub - * stream index with our stream index and if matches then declare it as - * external subtitle and use the filename. - */ - if (!info->language) { - gint text_index = -1; - gchar *suburi = NULL; - - g_object_get (G_OBJECT (self->playbin), "current-suburi", &suburi, NULL); - if (suburi) { - if (self->use_playbin3) { - if (g_str_equal (self->subtitle_sid, stream_info->stream_id)) - info->language = g_path_get_basename (suburi); - } else { - g_object_get (G_OBJECT (self->playbin), "current-text", &text_index, - NULL); - if (text_index == gst_player_stream_info_get_index (stream_info)) - info->language = g_path_get_basename (suburi); - } - g_free (suburi); - } - } - - } else { - g_free (info->language); - info->language = NULL; - } - - GST_DEBUG_OBJECT (self, "language=%s", info->language); -} - -static void -gst_player_video_info_update (GstPlayer * self, - GstPlayerStreamInfo * stream_info) -{ - GstPlayerVideoInfo *info = (GstPlayerVideoInfo *) stream_info; - - if (stream_info->caps) { - GstStructure *s; - - s = gst_caps_get_structure (stream_info->caps, 0); - if (s) { - gint width, height; - gint fps_n, fps_d; - gint par_n, par_d; - - if (gst_structure_get_int (s, "width", 
&width)) - info->width = width; - else - info->width = -1; - - if (gst_structure_get_int (s, "height", &height)) - info->height = height; - else - info->height = -1; - - if (gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d)) { - info->framerate_num = fps_n; - info->framerate_denom = fps_d; - } else { - info->framerate_num = 0; - info->framerate_denom = 1; - } - - - if (gst_structure_get_fraction (s, "pixel-aspect-ratio", &par_n, &par_d)) { - info->par_num = par_n; - info->par_denom = par_d; - } else { - info->par_num = 1; - info->par_denom = 1; - } - } - } else { - info->width = info->height = -1; - info->par_num = info->par_denom = 1; - info->framerate_num = 0; - info->framerate_denom = 1; - } - - if (stream_info->tags) { - guint bitrate, max_bitrate; - - if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_BITRATE, &bitrate)) - info->bitrate = bitrate; - else - info->bitrate = -1; - - if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_MAXIMUM_BITRATE, - &max_bitrate) || gst_tag_list_get_uint (stream_info->tags, - GST_TAG_NOMINAL_BITRATE, &max_bitrate)) - info->max_bitrate = max_bitrate; - else - info->max_bitrate = -1; - } else { - info->bitrate = info->max_bitrate = -1; - } - - GST_DEBUG_OBJECT (self, "width=%d height=%d fps=%.2f par=%d:%d " - "bitrate=%d max_bitrate=%d", info->width, info->height, - (gdouble) info->framerate_num / info->framerate_denom, - info->par_num, info->par_denom, info->bitrate, info->max_bitrate); -} - -static void -gst_player_audio_info_update (GstPlayer * self, - GstPlayerStreamInfo * stream_info) -{ - GstPlayerAudioInfo *info = (GstPlayerAudioInfo *) stream_info; - - if (stream_info->caps) { - GstStructure *s; - - s = gst_caps_get_structure (stream_info->caps, 0); - if (s) { - gint rate, channels; - - if (gst_structure_get_int (s, "rate", &rate)) - info->sample_rate = rate; - else - info->sample_rate = -1; - - if (gst_structure_get_int (s, "channels", &channels)) - info->channels = channels; - else - info->channels = 0; 
- } - } else { - info->sample_rate = -1; - info->channels = 0; - } - - if (stream_info->tags) { - guint bitrate, max_bitrate; - - if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_BITRATE, &bitrate)) - info->bitrate = bitrate; - else - info->bitrate = -1; - - if (gst_tag_list_get_uint (stream_info->tags, GST_TAG_MAXIMUM_BITRATE, - &max_bitrate) || gst_tag_list_get_uint (stream_info->tags, - GST_TAG_NOMINAL_BITRATE, &max_bitrate)) - info->max_bitrate = max_bitrate; - else - info->max_bitrate = -1; - - /* if we have old language the free it */ - g_free (info->language); - info->language = NULL; - - /* First try to get the language full name from tag, if name is not - * available then try language code. If we find the language code - * then use gstreamer api to translate code to full name. - */ - gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_NAME, - &info->language); - if (!info->language) { - gchar *lang_code = NULL; - - gst_tag_list_get_string (stream_info->tags, GST_TAG_LANGUAGE_CODE, - &lang_code); - if (lang_code) { - info->language = g_strdup (gst_tag_get_language_name (lang_code)); - g_free (lang_code); - } - } - } else { - g_free (info->language); - info->language = NULL; - info->max_bitrate = info->bitrate = -1; - } - - GST_DEBUG_OBJECT (self, "language=%s rate=%d channels=%d bitrate=%d " - "max_bitrate=%d", info->language, info->sample_rate, info->channels, - info->bitrate, info->max_bitrate); -} - -static GstPlayerStreamInfo * -gst_player_stream_info_find (GstPlayerMediaInfo * media_info, - GType type, gint stream_index) -{ - GList *list, *l; - GstPlayerStreamInfo *info = NULL; - - if (!media_info) - return NULL; - - list = gst_player_media_info_get_stream_list (media_info); - for (l = list; l != NULL; l = l->next) { - info = (GstPlayerStreamInfo *) l->data; - if ((G_OBJECT_TYPE (info) == type) && (info->stream_index == stream_index)) { - return info; - } - } - - return NULL; -} - -static GstPlayerStreamInfo * 
-gst_player_stream_info_find_from_stream_id (GstPlayerMediaInfo * media_info, - const gchar * stream_id) +static gpointer +gst_player_init_once (G_GNUC_UNUSED gpointer user_data) { - GList *list, *l; - GstPlayerStreamInfo *info = NULL; - - if (!media_info) - return NULL; + gst_init (NULL, NULL); - list = gst_player_media_info_get_stream_list (media_info); - for (l = list; l != NULL; l = l->next) { - info = (GstPlayerStreamInfo *) l->data; - if (g_str_equal (info->stream_id, stream_id)) { - return info; - } - } + GST_DEBUG_CATEGORY_INIT (gst_player_debug, "gst-player", 0, "GstPlayer"); + gst_player_error_quark (); return NULL; } -static gboolean -is_track_enabled (GstPlayer * self, gint pos) -{ - gint flags; - - g_object_get (G_OBJECT (self->playbin), "flags", &flags, NULL); - - if ((flags & pos)) - return TRUE; - - return FALSE; -} - -static GstPlayerStreamInfo * -gst_player_stream_info_get_current (GstPlayer * self, const gchar * prop, - GType type) -{ - gint current; - GstPlayerStreamInfo *info; - - if (!self->media_info) - return NULL; - - g_object_get (G_OBJECT (self->playbin), prop, &current, NULL); - g_mutex_lock (&self->lock); - info = gst_player_stream_info_find (self->media_info, type, current); - if (info) - info = gst_player_stream_info_copy (info); - g_mutex_unlock (&self->lock); - - return info; -} - -static GstPlayerStreamInfo * -gst_player_stream_info_get_current_from_stream_id (GstPlayer * self, - const gchar * stream_id, GType type) -{ - GstPlayerStreamInfo *info; - - if (!self->media_info || !stream_id) - return NULL; - - g_mutex_lock (&self->lock); - info = - gst_player_stream_info_find_from_stream_id (self->media_info, stream_id); - if (info && G_OBJECT_TYPE (info) == type) - info = gst_player_stream_info_copy (info); - else - info = NULL; - g_mutex_unlock (&self->lock); - - return info; -} - static void -stream_notify_cb (GstStreamCollection * collection, GstStream * stream, - GParamSpec * pspec, GstPlayer * self) +uri_loaded_cb (GstPlaySignalAdapter
* adapter, const gchar * uri, + GstPlayer * self) { - GstPlayerStreamInfo *info; - const gchar *stream_id; - gboolean emit_signal = FALSE; - - if (!self->media_info) - return; - - if (G_PARAM_SPEC_VALUE_TYPE (pspec) != GST_TYPE_CAPS && - G_PARAM_SPEC_VALUE_TYPE (pspec) != GST_TYPE_TAG_LIST) - return; - - stream_id = gst_stream_get_stream_id (stream); - g_mutex_lock (&self->lock); - info = - gst_player_stream_info_find_from_stream_id (self->media_info, stream_id); - if (info) { - gst_player_stream_info_update_from_stream (self, info, stream); - emit_signal = TRUE; - } - g_mutex_unlock (&self->lock); - - if (emit_signal) - emit_media_info_updated_signal (self); + g_signal_emit (self, signals[SIGNAL_URI_LOADED], 0, uri); } static void -gst_player_stream_info_update (GstPlayer * self, GstPlayerStreamInfo * s) +position_updated_cb (GstPlaySignalAdapter * adapter, GstClockTime position, + GstPlayer * self) { - if (GST_IS_PLAYER_VIDEO_INFO (s)) - gst_player_video_info_update (self, s); - else if (GST_IS_PLAYER_AUDIO_INFO (s)) - gst_player_audio_info_update (self, s); - else - gst_player_subtitle_info_update (self, s); -} - -static gchar * -stream_info_get_codec (GstPlayerStreamInfo * s) -{ - const gchar *type; - GstTagList *tags; - gchar *codec = NULL; - - if (GST_IS_PLAYER_VIDEO_INFO (s)) - type = GST_TAG_VIDEO_CODEC; - else if (GST_IS_PLAYER_AUDIO_INFO (s)) - type = GST_TAG_AUDIO_CODEC; - else - type = GST_TAG_SUBTITLE_CODEC; - - tags = gst_player_stream_info_get_tags (s); - if (tags) { - gst_tag_list_get_string (tags, type, &codec); - if (!codec) - gst_tag_list_get_string (tags, GST_TAG_CODEC, &codec); - } - - if (!codec) { - GstCaps *caps; - caps = gst_player_stream_info_get_caps (s); - if (caps) { - codec = gst_pb_utils_get_codec_description (caps); - } - } - - return codec; + g_signal_emit (self, signals[SIGNAL_POSITION_UPDATED], 0, position); } static void -gst_player_stream_info_update_tags_and_caps (GstPlayer * self, - GstPlayerStreamInfo * s) -{ - GstTagList 
*tags; - gint stream_index; - - stream_index = gst_player_stream_info_get_index (s); - - if (GST_IS_PLAYER_VIDEO_INFO (s)) - g_signal_emit_by_name (self->playbin, "get-video-tags", - stream_index, &tags); - else if (GST_IS_PLAYER_AUDIO_INFO (s)) - g_signal_emit_by_name (self->playbin, "get-audio-tags", - stream_index, &tags); - else - g_signal_emit_by_name (self->playbin, "get-text-tags", stream_index, &tags); - - if (s->tags) - gst_tag_list_unref (s->tags); - s->tags = tags; - - if (s->caps) - gst_caps_unref (s->caps); - s->caps = get_caps (self, stream_index, G_OBJECT_TYPE (s)); - - g_free (s->codec); - s->codec = stream_info_get_codec (s); - - GST_DEBUG_OBJECT (self, "%s index: %d tags: %p caps: %p", - gst_player_stream_info_get_stream_type (s), stream_index, - s->tags, s->caps); - - gst_player_stream_info_update (self, s); -} - -static void -gst_player_streams_info_create (GstPlayer * self, - GstPlayerMediaInfo * media_info, const gchar * prop, GType type) +duration_changed_cb (GstPlaySignalAdapter * adapter, GstClockTime duraton, + GstPlayer * self) { - gint i; - gint total = -1; - GstPlayerStreamInfo *s; - - if (!media_info) - return; - - g_object_get (G_OBJECT (self->playbin), prop, &total, NULL); - - GST_DEBUG_OBJECT (self, "%s: %d", prop, total); - - for (i = 0; i < total; i++) { - /* check if stream already exist in the list */ - s = gst_player_stream_info_find (media_info, type, i); - - if (!s) { - /* create a new stream info instance */ - s = gst_player_stream_info_new (i, type); - - /* add the object in stream list */ - media_info->stream_list = g_list_append (media_info->stream_list, s); - - /* based on type, add the object in its corresponding stream_ list */ - if (GST_IS_PLAYER_AUDIO_INFO (s)) - media_info->audio_stream_list = g_list_append - (media_info->audio_stream_list, s); - else if (GST_IS_PLAYER_VIDEO_INFO (s)) - media_info->video_stream_list = g_list_append - (media_info->video_stream_list, s); - else - media_info->subtitle_stream_list = 
g_list_append - (media_info->subtitle_stream_list, s); - - GST_DEBUG_OBJECT (self, "create %s stream stream_index: %d", - gst_player_stream_info_get_stream_type (s), i); - } - - gst_player_stream_info_update_tags_and_caps (self, s); - } + g_signal_emit (self, signals[SIGNAL_DURATION_CHANGED], 0, duraton); } static void -gst_player_stream_info_update_from_stream (GstPlayer * self, - GstPlayerStreamInfo * s, GstStream * stream) -{ - if (s->tags) - gst_tag_list_unref (s->tags); - s->tags = gst_stream_get_tags (stream); - - if (s->caps) - gst_caps_unref (s->caps); - s->caps = gst_stream_get_caps (stream); - - g_free (s->codec); - s->codec = stream_info_get_codec (s); - - GST_DEBUG_OBJECT (self, "%s index: %d tags: %p caps: %p", - gst_player_stream_info_get_stream_type (s), s->stream_index, - s->tags, s->caps); - - gst_player_stream_info_update (self, s); -} - -static void -gst_player_streams_info_create_from_collection (GstPlayer * self, - GstPlayerMediaInfo * media_info, GstStreamCollection * collection) +state_changed_cb (GstPlaySignalAdapter * adapter, GstPlayState state, + GstPlayer * self) { - guint i; - guint total; - GstPlayerStreamInfo *s; - guint n_audio = 0; - guint n_video = 0; - guint n_text = 0; - - if (!media_info || !collection) - return; - - total = gst_stream_collection_get_size (collection); - - for (i = 0; i < total; i++) { - GstStream *stream = gst_stream_collection_get_stream (collection, i); - GstStreamType stream_type = gst_stream_get_stream_type (stream); - const gchar *stream_id = gst_stream_get_stream_id (stream); - - if (stream_type & GST_STREAM_TYPE_AUDIO) { - s = gst_player_stream_info_new (n_audio, GST_TYPE_PLAYER_AUDIO_INFO); - n_audio++; - } else if (stream_type & GST_STREAM_TYPE_VIDEO) { - s = gst_player_stream_info_new (n_video, GST_TYPE_PLAYER_VIDEO_INFO); - n_video++; - } else if (stream_type & GST_STREAM_TYPE_TEXT) { - s = gst_player_stream_info_new (n_text, GST_TYPE_PLAYER_SUBTITLE_INFO); - n_text++; - } else { - GST_DEBUG_OBJECT 
(self, "Unknown type stream %d", i); - continue; - } - - s->stream_id = g_strdup (stream_id); - - /* add the object in stream list */ - media_info->stream_list = g_list_append (media_info->stream_list, s); - - /* based on type, add the object in its corresponding stream_ list */ - if (GST_IS_PLAYER_AUDIO_INFO (s)) - media_info->audio_stream_list = g_list_append - (media_info->audio_stream_list, s); - else if (GST_IS_PLAYER_VIDEO_INFO (s)) - media_info->video_stream_list = g_list_append - (media_info->video_stream_list, s); - else - media_info->subtitle_stream_list = g_list_append - (media_info->subtitle_stream_list, s); - - GST_DEBUG_OBJECT (self, "create %s stream stream_index: %d", - gst_player_stream_info_get_stream_type (s), s->stream_index); - - gst_player_stream_info_update_from_stream (self, s, stream); + GstPlayerState s = GST_PLAYER_STATE_BUFFERING; + switch (state) { + case GST_PLAY_STATE_BUFFERING: + s = GST_PLAYER_STATE_BUFFERING; + break; + case GST_PLAY_STATE_PAUSED: + s = GST_PLAYER_STATE_PAUSED; + break; + case GST_PLAY_STATE_PLAYING: + s = GST_PLAYER_STATE_PLAYING; + break; + case GST_PLAY_STATE_STOPPED: + s = GST_PLAYER_STATE_STOPPED; + break; } + g_signal_emit (self, signals[SIGNAL_STATE_CHANGED], 0, s); } static void -video_changed_cb (G_GNUC_UNUSED GObject * object, gpointer user_data) +buffering_cb (GstPlaySignalAdapter * adapter, gint buffering_percent, + GstPlayer * self) { - GstPlayer *self = GST_PLAYER (user_data); - - g_mutex_lock (&self->lock); - gst_player_streams_info_create (self, self->media_info, - "n-video", GST_TYPE_PLAYER_VIDEO_INFO); - g_mutex_unlock (&self->lock); + g_signal_emit (self, signals[SIGNAL_BUFFERING], 0, buffering_percent); } static void -audio_changed_cb (G_GNUC_UNUSED GObject * object, gpointer user_data) +end_of_stream_cb (GstPlaySignalAdapter * adapter, GstPlayer * self) { - GstPlayer *self = GST_PLAYER (user_data); - - g_mutex_lock (&self->lock); - gst_player_streams_info_create (self, self->media_info, - 
"n-audio", GST_TYPE_PLAYER_AUDIO_INFO); - g_mutex_unlock (&self->lock); + g_signal_emit (self, signals[SIGNAL_END_OF_STREAM], 0, NULL); } static void -subtitle_changed_cb (G_GNUC_UNUSED GObject * object, gpointer user_data) -{ - GstPlayer *self = GST_PLAYER (user_data); - - g_mutex_lock (&self->lock); - gst_player_streams_info_create (self, self->media_info, - "n-text", GST_TYPE_PLAYER_SUBTITLE_INFO); - g_mutex_unlock (&self->lock); -} - -static void * -get_title (GstTagList * tags) -{ - gchar *title = NULL; - - gst_tag_list_get_string (tags, GST_TAG_TITLE, &title); - if (!title) - gst_tag_list_get_string (tags, GST_TAG_TITLE_SORTNAME, &title); - - return title; -} - -static void * -get_container_format (GstTagList * tags) +error_cb (GstPlaySignalAdapter * adapter, GError * error, + GstStructure * details, GstPlayer * self) { - gchar *container = NULL; - - gst_tag_list_get_string (tags, GST_TAG_CONTAINER_FORMAT, &container); - - /* TODO: If container is not available then maybe consider - * parsing caps or file extension to guess the container format. 
- */ - - return container; -} - -static void * -get_from_tags (GstPlayer * self, GstPlayerMediaInfo * media_info, - void *(*func) (GstTagList *)) -{ - GList *l; - void *ret = NULL; - - if (media_info->tags) { - ret = func (media_info->tags); - if (ret) - return ret; - } - - /* if global tag does not exit then try video and audio streams */ - GST_DEBUG_OBJECT (self, "trying video tags"); - for (l = gst_player_media_info_get_video_streams (media_info); l != NULL; - l = l->next) { - GstTagList *tags; - - tags = gst_player_stream_info_get_tags ((GstPlayerStreamInfo *) l->data); - if (tags) - ret = func (tags); - - if (ret) - return ret; - } - - GST_DEBUG_OBJECT (self, "trying audio tags"); - for (l = gst_player_media_info_get_audio_streams (media_info); l != NULL; - l = l->next) { - GstTagList *tags; - - tags = gst_player_stream_info_get_tags ((GstPlayerStreamInfo *) l->data); - if (tags) - ret = func (tags); - - if (ret) - return ret; - } - - GST_DEBUG_OBJECT (self, "failed to get the information from tags"); - return NULL; -} - -static void * -get_cover_sample (GstTagList * tags) -{ - GstSample *cover_sample = NULL; - - gst_tag_list_get_sample (tags, GST_TAG_IMAGE, &cover_sample); - if (!cover_sample) - gst_tag_list_get_sample (tags, GST_TAG_PREVIEW_IMAGE, &cover_sample); - - return cover_sample; -} - -static GstPlayerMediaInfo * -gst_player_media_info_create (GstPlayer * self) -{ - GstPlayerMediaInfo *media_info; - GstQuery *query; - - GST_DEBUG_OBJECT (self, "begin"); - media_info = gst_player_media_info_new (self->uri); - media_info->duration = gst_player_get_duration (self); - media_info->tags = self->global_tags; - media_info->is_live = self->is_live; - self->global_tags = NULL; - - query = gst_query_new_seeking (GST_FORMAT_TIME); - if (gst_element_query (self->playbin, query)) - gst_query_parse_seeking (query, NULL, &media_info->seekable, NULL, NULL); - gst_query_unref (query); - - if (self->use_playbin3 && self->collection) { - 
gst_player_streams_info_create_from_collection (self, media_info,
-      self->collection);
-  } else {
-    /* create audio/video/sub streams */
-    gst_player_streams_info_create (self, media_info, "n-video",
-        GST_TYPE_PLAYER_VIDEO_INFO);
-    gst_player_streams_info_create (self, media_info, "n-audio",
-        GST_TYPE_PLAYER_AUDIO_INFO);
-    gst_player_streams_info_create (self, media_info, "n-text",
-        GST_TYPE_PLAYER_SUBTITLE_INFO);
-  }
-
-  media_info->title = get_from_tags (self, media_info, get_title);
-  media_info->container =
-      get_from_tags (self, media_info, get_container_format);
-  media_info->image_sample = get_from_tags (self, media_info, get_cover_sample);
-
-  GST_DEBUG_OBJECT (self, "uri: %s title: %s duration: %" GST_TIME_FORMAT
-      " seekable: %s live: %s container: %s image_sample %p",
-      media_info->uri, media_info->title, GST_TIME_ARGS (media_info->duration),
-      media_info->seekable ? "yes" : "no", media_info->is_live ? "yes" : "no",
-      media_info->container, media_info->image_sample);
-
-  GST_DEBUG_OBJECT (self, "end");
-  return media_info;
+  g_signal_emit (self, signals[SIGNAL_ERROR], 0, error);
 }
 
 static void
-tags_changed_cb (GstPlayer * self, gint stream_index, GType type)
+dimensions_changed_cb (GstPlaySignalAdapter * adapter, guint width,
+    guint height, GstPlayer * self)
 {
-  GstPlayerStreamInfo *s;
-
-  if (!self->media_info)
-    return;
-
-  /* update the stream information */
-  g_mutex_lock (&self->lock);
-  s = gst_player_stream_info_find (self->media_info, type, stream_index);
-  gst_player_stream_info_update_tags_and_caps (self, s);
-  g_mutex_unlock (&self->lock);
-
-  emit_media_info_updated_signal (self);
+  g_signal_emit (self, signals[SIGNAL_VIDEO_DIMENSIONS_CHANGED], 0, width,
+      height);
 }
 
 static void
-video_tags_changed_cb (G_GNUC_UNUSED GstElement * playbin, gint stream_index,
-    gpointer user_data)
+media_info_cb (GstPlaySignalAdapter * adapter, GstPlayMediaInfo * info,
+    GstPlayer * self)
 {
-  tags_changed_cb (GST_PLAYER (user_data), stream_index,
-      GST_TYPE_PLAYER_VIDEO_INFO);
+  GstPlayerMediaInfo *i = gst_player_media_info_wrapped (info);
+  g_signal_emit (self, signals[SIGNAL_MEDIA_INFO_UPDATED], 0, i);
+  g_object_unref (i);
 }
 
 static void
-audio_tags_changed_cb (G_GNUC_UNUSED GstElement * playbin, gint stream_index,
-    gpointer user_data)
+volume_cb (GstPlaySignalAdapter * adapter, gdouble volume, GstPlayer * self)
 {
-  tags_changed_cb (GST_PLAYER (user_data), stream_index,
-      GST_TYPE_PLAYER_AUDIO_INFO);
+  g_signal_emit (self, signals[SIGNAL_VOLUME_CHANGED], 0, NULL);
 }
 
-static void
-subtitle_tags_changed_cb (G_GNUC_UNUSED GstElement * playbin, gint stream_index,
-    gpointer user_data)
-{
-  tags_changed_cb (GST_PLAYER (user_data), stream_index,
-      GST_TYPE_PLAYER_SUBTITLE_INFO);
-}
 
 static void
-volume_changed_dispatch (gpointer user_data)
+mute_cb (GstPlaySignalAdapter * adapter, gboolean muted, GstPlayer * self)
 {
-  GstPlayer *player = user_data;
-
-  if (player->inhibit_sigs)
-    return;
-
-  g_signal_emit (player, signals[SIGNAL_VOLUME_CHANGED], 0);
-  g_object_notify_by_pspec (G_OBJECT (player), param_specs[PROP_VOLUME]);
+  g_signal_emit (self, signals[SIGNAL_MUTE_CHANGED], 0, NULL);
 }
 
 static void
-volume_notify_cb (G_GNUC_UNUSED GObject * obj, G_GNUC_UNUSED GParamSpec * pspec,
-    GstPlayer * self)
+warning_cb (GstPlaySignalAdapter * adapter, GError * warning,
+    GstStructure * details, GstPlayer * self)
 {
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_VOLUME_CHANGED], 0, NULL, NULL, NULL) != 0) {
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        volume_changed_dispatch, g_object_ref (self),
-        (GDestroyNotify) g_object_unref);
-  }
+  g_signal_emit (self, signals[SIGNAL_WARNING], 0, warning);
 }
 
 static void
-mute_changed_dispatch (gpointer user_data)
-{
-  GstPlayer *player = user_data;
-
-  if (player->inhibit_sigs)
-    return;
-
-  g_signal_emit (player, signals[SIGNAL_MUTE_CHANGED], 0);
-  g_object_notify_by_pspec (G_OBJECT (player), param_specs[PROP_MUTE]);
-}
-
-static void
-mute_notify_cb (G_GNUC_UNUSED GObject * obj, G_GNUC_UNUSED GParamSpec * pspec,
+seek_done_cb (GstPlaySignalAdapter * adapter, GstClockTime time,
     GstPlayer * self)
 {
-  if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID,
-          signals[SIGNAL_MUTE_CHANGED], 0, NULL, NULL, NULL) != 0) {
-    gst_player_signal_dispatcher_dispatch (self->signal_dispatcher, self,
-        mute_changed_dispatch, g_object_ref (self),
-        (GDestroyNotify) g_object_unref);
-  }
-}
-
-static void
-source_setup_cb (GstElement * playbin, GstElement * source, GstPlayer * self)
-{
-  gchar *user_agent;
-
-  user_agent = gst_player_config_get_user_agent (self->config);
-  if (user_agent) {
-    GParamSpec *prop;
-
-    prop = g_object_class_find_property (G_OBJECT_GET_CLASS (source),
-        "user-agent");
-    if (prop && prop->value_type == G_TYPE_STRING) {
-      GST_INFO_OBJECT (self, "Setting source user-agent: %s", user_agent);
-      g_object_set (source, "user-agent", user_agent, NULL);
-    }
-
-    g_free (user_agent);
-  }
-}
-
-static gpointer
-gst_player_main (gpointer data)
-{
-  GstPlayer *self = GST_PLAYER (data);
-  GstBus *bus;
-  GSource *source;
-  GSource *bus_source;
-  GstElement *scaletempo;
-  const gchar *env;
-
-  GST_TRACE_OBJECT (self, "Starting main thread");
-
-  g_main_context_push_thread_default (self->context);
-
-  source = g_idle_source_new ();
-  g_source_set_callback (source, (GSourceFunc) main_loop_running_cb, self,
-      NULL);
-  g_source_attach (source, self->context);
-  g_source_unref (source);
-
-  env = g_getenv ("GST_PLAYER_USE_PLAYBIN3");
-  if (env && g_str_has_prefix (env, "1"))
-    self->use_playbin3 = TRUE;
-
-  if (self->use_playbin3) {
-    GST_DEBUG_OBJECT (self, "playbin3 enabled");
-    self->playbin = gst_element_factory_make ("playbin3", "playbin3");
-  } else {
-    self->playbin = gst_element_factory_make ("playbin", "playbin");
-  }
-
-  if (!self->playbin) {
-    g_error ("GstPlayer: 'playbin' element not found, please check your setup");
-    g_assert_not_reached ();
-  }
-
-  gst_object_ref_sink (self->playbin);
-
-  if (self->video_renderer) {
-    GstElement *video_sink =
-        gst_player_video_renderer_create_video_sink (self->video_renderer,
-        self);
-
-    if (video_sink)
-      g_object_set (self->playbin, "video-sink", video_sink, NULL);
-  }
-
-  scaletempo = gst_element_factory_make ("scaletempo", NULL);
-  if (scaletempo) {
-    g_object_set (self->playbin, "audio-filter", scaletempo, NULL);
-  } else {
-    g_warning ("GstPlayer: scaletempo element not available. Audio pitch "
-        "will not be preserved during trick modes");
-  }
-
-  self->bus = bus = gst_element_get_bus (self->playbin);
-  bus_source = gst_bus_create_watch (bus);
-  g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func,
-      NULL, NULL);
-  g_source_attach (bus_source, self->context);
-
-  g_signal_connect (G_OBJECT (bus), "message::error", G_CALLBACK (error_cb),
-      self);
-  g_signal_connect (G_OBJECT (bus), "message::warning", G_CALLBACK (warning_cb),
-      self);
-  g_signal_connect (G_OBJECT (bus), "message::eos", G_CALLBACK (eos_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::state-changed",
-      G_CALLBACK (state_changed_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::buffering",
-      G_CALLBACK (buffering_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::clock-lost",
-      G_CALLBACK (clock_lost_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::duration-changed",
-      G_CALLBACK (duration_changed_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::latency",
-      G_CALLBACK (latency_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::request-state",
-      G_CALLBACK (request_state_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::element",
-      G_CALLBACK (element_cb), self);
-  g_signal_connect (G_OBJECT (bus), "message::tag", G_CALLBACK (tags_cb), self);
-
-  if (self->use_playbin3) {
-    g_signal_connect (G_OBJECT (bus), "message::stream-collection",
-        G_CALLBACK (stream_collection_cb), self);
-    g_signal_connect (G_OBJECT (bus), "message::streams-selected",
-        G_CALLBACK (streams_selected_cb), self);
-  } else {
-    g_signal_connect (self->playbin, "video-changed",
-        G_CALLBACK (video_changed_cb), self);
-    g_signal_connect (self->playbin, "audio-changed",
-        G_CALLBACK (audio_changed_cb), self);
-    g_signal_connect (self->playbin, "text-changed",
-        G_CALLBACK (subtitle_changed_cb), self);
-
-    g_signal_connect (self->playbin, "video-tags-changed",
-        G_CALLBACK (video_tags_changed_cb), self);
-    g_signal_connect (self->playbin, "audio-tags-changed",
-        G_CALLBACK (audio_tags_changed_cb), self);
-    g_signal_connect (self->playbin, "text-tags-changed",
-        G_CALLBACK (subtitle_tags_changed_cb), self);
-  }
-
-  g_signal_connect (self->playbin, "notify::volume",
-      G_CALLBACK (volume_notify_cb), self);
-  g_signal_connect (self->playbin, "notify::mute",
-      G_CALLBACK (mute_notify_cb), self);
-  g_signal_connect (self->playbin, "source-setup",
-      G_CALLBACK (source_setup_cb), self);
-
-  self->target_state = GST_STATE_NULL;
-  self->current_state = GST_STATE_NULL;
-  change_state (self, GST_PLAYER_STATE_STOPPED);
-  self->buffering = 100;
-  self->is_eos = FALSE;
-  self->is_live = FALSE;
-  self->rate = 1.0;
-
-  GST_TRACE_OBJECT (self, "Starting main loop");
-  g_main_loop_run (self->loop);
-  GST_TRACE_OBJECT (self, "Stopped main loop");
-
-  g_source_destroy (bus_source);
-  g_source_unref (bus_source);
-  gst_object_unref (bus);
-
-  remove_tick_source (self);
-  remove_ready_timeout_source (self);
-
-  g_mutex_lock (&self->lock);
-  if (self->media_info) {
-    g_object_unref (self->media_info);
-    self->media_info = NULL;
-  }
-
-  remove_seek_source (self);
-  g_mutex_unlock (&self->lock);
-
-  g_main_context_pop_thread_default (self->context);
-
-  self->target_state = GST_STATE_NULL;
-  self->current_state = GST_STATE_NULL;
-  if (self->playbin) {
-    gst_element_set_state (self->playbin, GST_STATE_NULL);
-    gst_object_unref (self->playbin);
-    self->playbin = NULL;
-  }
-
-  GST_TRACE_OBJECT (self, "Stopped main thread");
-
-  return NULL;
-}
-
-static gpointer
-gst_player_init_once (G_GNUC_UNUSED gpointer user_data)
-{
-  gst_init (NULL, NULL);
-
-  GST_DEBUG_CATEGORY_INIT (gst_player_debug, "gst-player", 0, "GstPlayer");
-  gst_player_error_quark ();
-
-  return NULL;
+  g_signal_emit (self, signals[SIGNAL_SEEK_DONE], 0, time);
 }
 
 /**
@@ -3056,76 +565,67 @@
 {
   static GOnce once = G_ONCE_INIT;
   GstPlayer *self;
+  GstPlayerVideoRenderer *renderer = NULL;
 
   g_once (&once, gst_player_init_once, NULL);
 
   self =
-      g_object_new (GST_TYPE_PLAYER, "video-renderer", video_renderer,
-      "signal-dispatcher", signal_dispatcher, NULL);
-  gst_object_ref_sink (self);
-
-  if (video_renderer)
-    g_object_unref (video_renderer);
-  if (signal_dispatcher)
-    g_object_unref (signal_dispatcher);
-
-  return self;
-}
-
-static gboolean
-gst_player_play_internal (gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GstStateChangeReturn state_ret;
+      g_object_new (GST_TYPE_PLAYER, "signal-dispatcher", signal_dispatcher,
+      NULL);
 
-  GST_DEBUG_OBJECT (self, "Play");
+  self->play = gst_play_new (NULL);
 
-  g_mutex_lock (&self->lock);
-  if (!self->uri) {
-    g_mutex_unlock (&self->lock);
-    return G_SOURCE_REMOVE;
+  if (video_renderer != NULL) {
+    renderer = gst_player_wrapped_video_renderer_new (video_renderer, self);
+    g_object_set (self->play, "video-renderer",
+        GST_PLAY_VIDEO_RENDERER (renderer), NULL);
   }
-  g_mutex_unlock (&self->lock);
 
-  remove_ready_timeout_source (self);
-  self->target_state = GST_STATE_PLAYING;
+  if (signal_dispatcher != NULL) {
+    GMainContext *context = NULL;
 
-  if (self->current_state < GST_STATE_PAUSED)
-    change_state (self, GST_PLAYER_STATE_BUFFERING);
-
-  if (self->current_state >= GST_STATE_PAUSED && !self->is_eos
-      && self->buffering >= 100 && !(self->seek_position != GST_CLOCK_TIME_NONE
-          || self->seek_pending)) {
-    state_ret = gst_element_set_state (self->playbin, GST_STATE_PLAYING);
+    g_object_get (signal_dispatcher, "application-context", &context, NULL);
+    self->signal_adapter =
+        gst_play_signal_adapter_new_with_main_context (self->play, context);
+    g_main_context_unref (context);
   } else {
-    state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED);
+    self->signal_adapter = gst_play_signal_adapter_new (self->play);
   }
 
-  if (state_ret == GST_STATE_CHANGE_FAILURE) {
-    emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-            "Failed to play"));
-    return G_SOURCE_REMOVE;
-  } else if (state_ret == GST_STATE_CHANGE_NO_PREROLL) {
-    self->is_live = TRUE;
-    GST_DEBUG_OBJECT (self, "Pipeline is live");
-  }
+  gst_object_ref_sink (self);
 
-  if (self->is_eos) {
-    gboolean ret;
+  g_signal_connect (self->signal_adapter, "uri-loaded",
+      G_CALLBACK (uri_loaded_cb), self);
+  g_signal_connect (self->signal_adapter, "position-updated",
+      G_CALLBACK (position_updated_cb), self);
+  g_signal_connect (self->signal_adapter, "duration-changed",
+      G_CALLBACK (duration_changed_cb), self);
+  g_signal_connect (self->signal_adapter, "state-changed",
+      G_CALLBACK (state_changed_cb), self);
+  g_signal_connect (self->signal_adapter, "buffering",
+      G_CALLBACK (buffering_cb), self);
+  g_signal_connect (self->signal_adapter, "end-of-stream",
+      G_CALLBACK (end_of_stream_cb), self);
+  g_signal_connect (self->signal_adapter, "error", G_CALLBACK (error_cb), self);
+  g_signal_connect (self->signal_adapter, "video-dimensions-changed",
+      G_CALLBACK (dimensions_changed_cb), self);
+  g_signal_connect (self->signal_adapter, "media-info-updated",
+      G_CALLBACK (media_info_cb), self);
+  g_signal_connect (self->signal_adapter, "volume-changed",
+      G_CALLBACK (volume_cb), self);
+  g_signal_connect (self->signal_adapter, "mute-changed", G_CALLBACK (mute_cb),
+      self);
+  g_signal_connect (self->signal_adapter, "warning", G_CALLBACK (warning_cb),
+      self);
+  g_signal_connect (self->signal_adapter, "seek-done",
+      G_CALLBACK (seek_done_cb), self);
 
-    GST_DEBUG_OBJECT (self, "Was EOS, seeking to beginning");
-    self->is_eos = FALSE;
-    ret =
-        gst_element_seek_simple (self->playbin, GST_FORMAT_TIME,
-        GST_SEEK_FLAG_FLUSH, 0);
-    if (!ret) {
-      GST_ERROR_OBJECT (self, "Seek to beginning failed");
-      gst_player_stop_internal (self, TRUE);
-      gst_player_play_internal (self);
-    }
-  }
+  if (video_renderer)
+    g_object_unref (video_renderer);
+  if (signal_dispatcher)
+    g_object_unref (signal_dispatcher);
 
-  return G_SOURCE_REMOVE;
+  return self;
 }
 
 /**
@@ -3139,64 +639,7 @@
 {
   g_return_if_fail (GST_IS_PLAYER (self));
 
-  g_mutex_lock (&self->lock);
-  self->inhibit_sigs = FALSE;
-  g_mutex_unlock (&self->lock);
-
-  g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT,
-      gst_player_play_internal, self, NULL);
-}
-
-static gboolean
-gst_player_pause_internal (gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-  GstStateChangeReturn state_ret;
-
-  GST_DEBUG_OBJECT (self, "Pause");
-
-  g_mutex_lock (&self->lock);
-  if (!self->uri) {
-    g_mutex_unlock (&self->lock);
-    return G_SOURCE_REMOVE;
-  }
-  g_mutex_unlock (&self->lock);
-
-  tick_cb (self);
-  remove_tick_source (self);
-  remove_ready_timeout_source (self);
-
-  self->target_state = GST_STATE_PAUSED;
-
-  if (self->current_state < GST_STATE_PAUSED)
-    change_state (self, GST_PLAYER_STATE_BUFFERING);
-
-  state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED);
-  if (state_ret == GST_STATE_CHANGE_FAILURE) {
-    emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-            "Failed to pause"));
-    return G_SOURCE_REMOVE;
-  } else if (state_ret == GST_STATE_CHANGE_NO_PREROLL) {
-    self->is_live = TRUE;
-    GST_DEBUG_OBJECT (self, "Pipeline is live");
-  }
-
-  if (self->is_eos) {
-    gboolean ret;
-
-    GST_DEBUG_OBJECT (self, "Was EOS, seeking to beginning");
-    self->is_eos = FALSE;
-    ret =
-        gst_element_seek_simple (self->playbin, GST_FORMAT_TIME,
-        GST_SEEK_FLAG_FLUSH, 0);
-    if (!ret) {
-      GST_ERROR_OBJECT (self, "Seek to beginning failed");
-      gst_player_stop_internal (self, TRUE);
-      gst_player_pause_internal (self);
-    }
-  }
-
-  return G_SOURCE_REMOVE;
+  gst_play_play (self->play);
 }
 
 /**
@@ -3210,83 +653,9 @@
 {
   g_return_if_fail (GST_IS_PLAYER (self));
 
-  g_mutex_lock (&self->lock);
-  self->inhibit_sigs = FALSE;
-  g_mutex_unlock (&self->lock);
-
-  g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT,
-      gst_player_pause_internal, self, NULL);
-}
-
-static void
-gst_player_stop_internal (GstPlayer * self, gboolean transient)
-{
-  /* directly return if we're already stopped */
-  if (self->current_state <= GST_STATE_READY &&
-      self->target_state <= GST_STATE_READY)
-    return;
-
-  GST_DEBUG_OBJECT (self, "Stop (transient %d)", transient);
-
-  tick_cb (self);
-  remove_tick_source (self);
-
-  add_ready_timeout_source (self);
-
-  self->target_state = GST_STATE_NULL;
-  self->current_state = GST_STATE_READY;
-  self->is_live = FALSE;
-  self->is_eos = FALSE;
-  gst_bus_set_flushing (self->bus, TRUE);
-  gst_element_set_state (self->playbin, GST_STATE_READY);
-  gst_bus_set_flushing (self->bus, FALSE);
-  change_state (self, transient
-      && self->app_state !=
-      GST_PLAYER_STATE_STOPPED ? GST_PLAYER_STATE_BUFFERING :
-      GST_PLAYER_STATE_STOPPED);
-  self->buffering = 100;
-  self->cached_duration = GST_CLOCK_TIME_NONE;
-  g_mutex_lock (&self->lock);
-  if (self->media_info) {
-    g_object_unref (self->media_info);
-    self->media_info = NULL;
-  }
-  if (self->global_tags) {
-    gst_tag_list_unref (self->global_tags);
-    self->global_tags = NULL;
-  }
-  self->seek_pending = FALSE;
-  remove_seek_source (self);
-  self->seek_position = GST_CLOCK_TIME_NONE;
-  self->last_seek_time = GST_CLOCK_TIME_NONE;
-  self->rate = 1.0;
-  if (self->collection) {
-    if (self->stream_notify_id)
-      g_signal_handler_disconnect (self->collection, self->stream_notify_id);
-    self->stream_notify_id = 0;
-    gst_object_unref (self->collection);
-    self->collection = NULL;
-  }
-  g_free (self->video_sid);
-  g_free (self->audio_sid);
-  g_free (self->subtitle_sid);
-  self->video_sid = NULL;
-  self->audio_sid = NULL;
-  self->subtitle_sid = NULL;
-  g_mutex_unlock (&self->lock);
-}
-
-static gboolean
-gst_player_stop_internal_dispatch (gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-
-  gst_player_stop_internal (self, FALSE);
-
-  return G_SOURCE_REMOVE;
+  gst_play_pause (self->play);
 }
-
 /**
  * gst_player_stop:
  * @player: #GstPlayer instance
@@ -3299,97 +668,7 @@
 {
   g_return_if_fail (GST_IS_PLAYER (self));
 
-  g_mutex_lock (&self->lock);
-  self->inhibit_sigs = TRUE;
-  g_mutex_unlock (&self->lock);
-
-  g_main_context_invoke_full (self->context, G_PRIORITY_DEFAULT,
-      gst_player_stop_internal_dispatch, self, NULL);
-}
-
-/* Must be called with lock from main context, releases lock! */
-static void
-gst_player_seek_internal_locked (GstPlayer * self)
-{
-  gboolean ret;
-  GstClockTime position;
-  gdouble rate;
-  GstStateChangeReturn state_ret;
-  GstEvent *s_event;
-  GstSeekFlags flags = 0;
-  gboolean accurate = FALSE;
-
-  remove_seek_source (self);
-
-  /* Only seek in PAUSED */
-  if (self->current_state < GST_STATE_PAUSED) {
-    return;
-  } else if (self->current_state != GST_STATE_PAUSED) {
-    g_mutex_unlock (&self->lock);
-    state_ret = gst_element_set_state (self->playbin, GST_STATE_PAUSED);
-    if (state_ret == GST_STATE_CHANGE_FAILURE) {
-      emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-              "Failed to seek"));
-      g_mutex_lock (&self->lock);
-      return;
-    }
-    g_mutex_lock (&self->lock);
-    return;
-  }
-
-  self->last_seek_time = gst_util_get_timestamp ();
-  position = self->seek_position;
-  self->seek_position = GST_CLOCK_TIME_NONE;
-  self->seek_pending = TRUE;
-  rate = self->rate;
-  g_mutex_unlock (&self->lock);
-
-  remove_tick_source (self);
-  self->is_eos = FALSE;
-
-  flags |= GST_SEEK_FLAG_FLUSH;
-
-  accurate = gst_player_config_get_seek_accurate (self->config);
-
-  if (accurate) {
-    flags |= GST_SEEK_FLAG_ACCURATE;
-  } else {
-    flags &= ~GST_SEEK_FLAG_ACCURATE;
-  }
-
-  if (rate != 1.0) {
-    flags |= GST_SEEK_FLAG_TRICKMODE;
-  }
-
-  if (rate >= 0.0) {
-    s_event = gst_event_new_seek (rate, GST_FORMAT_TIME, flags,
-        GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_SET, GST_CLOCK_TIME_NONE);
-  } else {
-    s_event = gst_event_new_seek (rate, GST_FORMAT_TIME, flags,
-        GST_SEEK_TYPE_SET, G_GINT64_CONSTANT (0), GST_SEEK_TYPE_SET, position);
-  }
-
-  GST_DEBUG_OBJECT (self, "Seek with rate %.2lf to %" GST_TIME_FORMAT,
-      rate, GST_TIME_ARGS (position));
-
-  ret = gst_element_send_event (self->playbin, s_event);
-  if (!ret)
-    emit_error (self, g_error_new (GST_PLAYER_ERROR, GST_PLAYER_ERROR_FAILED,
-            "Failed to seek to %" GST_TIME_FORMAT, GST_TIME_ARGS (position)));
-
-  g_mutex_lock (&self->lock);
-}
-
-static gboolean
-gst_player_seek_internal (gpointer user_data)
-{
-  GstPlayer *self = GST_PLAYER (user_data);
-
-  g_mutex_lock (&self->lock);
-  gst_player_seek_internal_locked (self);
-  g_mutex_unlock (&self->lock);
-
-  return G_SOURCE_REMOVE;
+  gst_play_stop (self->play);
 }
 
 /**
@@ -3440,58 +719,7 @@
   g_return_if_fail (GST_IS_PLAYER (self));
   g_return_if_fail (GST_CLOCK_TIME_IS_VALID (position));
 
-  g_mutex_lock (&self->lock);
-  if (self->media_info && !self->media_info->seekable) {
-    GST_DEBUG_OBJECT (self, "Media is not seekable");
-    g_mutex_unlock (&self->lock);
-    return;
-  }
-
-  self->seek_position = position;
-
-  /* If there is no seek being dispatch to the main context currently do that,
-   * otherwise we just updated the seek position so that it will be taken by
-   * the seek handler from the main context instead of the old one.
-   */
-  if (!self->seek_source) {
-    GstClockTime now = gst_util_get_timestamp ();
-
-    /* If no seek is pending or it was started more than 250 mseconds ago seek
-     * immediately, otherwise wait until the 250 mseconds have passed */
-    if (!self->seek_pending || (now - self->last_seek_time > 250 * GST_MSECOND)) {
-      self->seek_source = g_idle_source_new ();
-      g_source_set_callback (self->seek_source,
-          (GSourceFunc) gst_player_seek_internal, self, NULL);
-      GST_TRACE_OBJECT (self, "Dispatching seek to position %" GST_TIME_FORMAT,
-          GST_TIME_ARGS (position));
-      g_source_attach (self->seek_source, self->context);
-    } else {
-      guint delay = 250000 - (now - self->last_seek_time) / 1000;
-
-      /* Note that last_seek_time must be set to something at this point and
-       * it must be smaller than 250 mseconds */
-      self->seek_source = g_timeout_source_new (delay);
-      g_source_set_callback (self->seek_source,
-          (GSourceFunc) gst_player_seek_internal, self, NULL);
-
-      GST_TRACE_OBJECT (self,
-          "Delaying seek to position %" GST_TIME_FORMAT " by %u us",
-          GST_TIME_ARGS (position), delay);
-      g_source_attach (self->seek_source, self->context);
-    }
-  }
-  g_mutex_unlock (&self->lock);
-}
-
-static void
-remove_seek_source (GstPlayer * self)
-{
-  if (!self->seek_source)
-    return;
-
-  g_source_destroy (self->seek_source);
-  g_source_unref (self->seek_source);
-  self->seek_source = NULL;
+  gst_play_seek (self->play, position);
 }
 
 /**
@@ -3500,7 +728,7 @@
  *
  * Gets the URI of the currently-playing stream.
  *
- * Returns: (transfer full): a string containing the URI of the
+ * Returns: (transfer full) (nullable): a string containing the URI of the
  * currently-playing stream. g_free() after usage.
  */
 gchar *
@@ -3518,7 +746,7 @@
 /**
  * gst_player_set_uri:
  * @player: #GstPlayer instance
- * @uri: next URI to play.
+ * @uri: (nullable): next URI to play.
  *
 * Sets the next URI to play.
 */
@@ -3533,7 +761,7 @@
 /**
  * gst_player_set_subtitle_uri:
  * @player: #GstPlayer instance
- * @uri: subtitle URI
+ * @uri: (nullable): subtitle URI
 *
 * Sets the external subtitle URI. This should be combined with a call to
 * gst_player_set_subtitle_track_enabled(@player, TRUE) so the subtitles are actually
@@ -3553,7 +781,7 @@
 *
 * current subtitle URI
 *
- * Returns: (transfer full): URI of the current external subtitle.
+ * Returns: (transfer full) (nullable): URI of the current external subtitle.
 * g_free() after usage.
 */
gchar *
@@ -3634,6 +862,10 @@
 * @val: the new volume level, as a percentage between 0 and 1
 *
 * Sets the volume level of the stream as a percentage between 0 and 1.
+ *
+ * This volume is a linear factor. For showing the volume in a GUI it
+ * might make sense to first convert from a different format. Volume sliders
+ * should usually use a cubic volume. See gst_stream_volume_convert_volume().
 */
void
gst_player_set_volume (GstPlayer * self, gdouble val)
@@ -3680,7 +912,9 @@
 * gst_player_get_pipeline:
 * @player: #GstPlayer instance
 *
- * Returns: (transfer full): The internal playbin instance
+ * Returns: (transfer full): The internal playbin instance.
+ *
+ * The caller should free it with g_object_unref()
 */
GstElement *
gst_player_get_pipeline (GstPlayer * self)
@@ -3700,25 +934,25 @@
 *
 * A Function to get the current media info #GstPlayerMediaInfo instance.
 *
- * Returns: (transfer full): media info instance.
+ * Returns: (transfer full) (nullable): media info instance.
 *
 * The caller should free it with g_object_unref()
 */
GstPlayerMediaInfo *
gst_player_get_media_info (GstPlayer * self)
{
-  GstPlayerMediaInfo *info;
+  GstPlayMediaInfo *info;
+  GstPlayerMediaInfo *ret;

  g_return_val_if_fail (GST_IS_PLAYER (self), NULL);

-  if (!self->media_info)
+  info = gst_play_get_media_info (self->play);
+  if (!info)
    return NULL;

-  g_mutex_lock (&self->lock);
-  info = gst_player_media_info_copy (self->media_info);
-  g_mutex_unlock (&self->lock);
-
-  return info;
+  ret = gst_player_media_info_wrapped (info);
+  g_object_unref (info);
+  return ret;
}

/**
@@ -3727,30 +961,24 @@
 *
 * A Function to get current audio #GstPlayerAudioInfo instance.
 *
- * Returns: (transfer full): current audio track.
+ * Returns: (transfer full) (nullable): current audio track.
 *
 * The caller should free it with g_object_unref()
 */
GstPlayerAudioInfo *
gst_player_get_current_audio_track (GstPlayer * self)
{
-  GstPlayerAudioInfo *info;
+  GstPlayAudioInfo *info;
+  GstPlayerAudioInfo *ret = NULL;

  g_return_val_if_fail (GST_IS_PLAYER (self), NULL);

-  if (!is_track_enabled (self, GST_PLAY_FLAG_AUDIO))
-    return NULL;
-
-  if (self->use_playbin3) {
-    info = (GstPlayerAudioInfo *)
-        gst_player_stream_info_get_current_from_stream_id (self,
-        self->audio_sid, GST_TYPE_PLAYER_AUDIO_INFO);
-  } else {
-    info = (GstPlayerAudioInfo *) gst_player_stream_info_get_current (self,
-        "current-audio", GST_TYPE_PLAYER_AUDIO_INFO);
+  info = gst_play_get_current_audio_track (self->play);
+  if (info != NULL) {
+    ret = gst_player_audio_info_wrapped (info);
+    g_object_unref (info);
  }
-
-  return info;
+  return ret;
}

/**
@@ -3759,30 +987,24 @@
 *
 * A Function to get current video #GstPlayerVideoInfo instance.
 *
- * Returns: (transfer full): current video track.
+ * Returns: (transfer full) (nullable): current video track.
 *
 * The caller should free it with g_object_unref()
 */
GstPlayerVideoInfo *
gst_player_get_current_video_track (GstPlayer * self)
{
-  GstPlayerVideoInfo *info;
+  GstPlayVideoInfo *info;
+  GstPlayerVideoInfo *ret = NULL;

  g_return_val_if_fail (GST_IS_PLAYER (self), NULL);

-  if (!is_track_enabled (self, GST_PLAY_FLAG_VIDEO))
-    return NULL;
-
-  if (self->use_playbin3) {
-    info = (GstPlayerVideoInfo *)
-        gst_player_stream_info_get_current_from_stream_id (self,
-        self->video_sid, GST_TYPE_PLAYER_VIDEO_INFO);
-  } else {
-    info = (GstPlayerVideoInfo *) gst_player_stream_info_get_current (self,
-        "current-video", GST_TYPE_PLAYER_VIDEO_INFO);
+  info = gst_play_get_current_video_track (self->play);
+  if (info != NULL) {
+    ret = gst_player_video_info_wrapped (info);
+    g_object_unref (info);
  }
-
-  return info;
+  return ret;
}

/**
@@ -3791,56 +1013,23 @@
 *
 * A Function to get current subtitle #GstPlayerSubtitleInfo instance.
 *
- * Returns: (transfer full): current subtitle track.
+ * Returns: (transfer full) (nullable): current subtitle track.
 *
 * The caller should free it with g_object_unref()
 */
GstPlayerSubtitleInfo *
gst_player_get_current_subtitle_track (GstPlayer * self)
{
-  GstPlayerSubtitleInfo *info;
+  GstPlaySubtitleInfo *info;
+  GstPlayerSubtitleInfo *ret = NULL;

  g_return_val_if_fail (GST_IS_PLAYER (self), NULL);

-  if (!is_track_enabled (self, GST_PLAY_FLAG_SUBTITLE))
-    return NULL;
-
-  if (self->use_playbin3) {
-    info = (GstPlayerSubtitleInfo *)
-        gst_player_stream_info_get_current_from_stream_id (self,
-        self->subtitle_sid, GST_TYPE_PLAYER_SUBTITLE_INFO);
-  } else {
-    info = (GstPlayerSubtitleInfo *) gst_player_stream_info_get_current (self,
-        "current-text", GST_TYPE_PLAYER_SUBTITLE_INFO);
-  }
-
-  return info;
-}
-
-/* Must be called with lock */
-static gboolean
-gst_player_select_streams (GstPlayer * self)
-{
-  GList *stream_list = NULL;
-  gboolean ret = FALSE;
-
-  if (self->audio_sid)
-    stream_list = g_list_append (stream_list, g_strdup (self->audio_sid));
-  if (self->video_sid)
-    stream_list = g_list_append (stream_list, g_strdup (self->video_sid));
-  if (self->subtitle_sid)
-    stream_list = g_list_append (stream_list, g_strdup (self->subtitle_sid));
-
-  g_mutex_unlock (&self->lock);
-  if (stream_list) {
-    ret = gst_element_send_event (self->playbin,
-        gst_event_new_select_streams (stream_list));
-    g_list_free_full (stream_list, g_free);
-  } else {
-    GST_ERROR_OBJECT (self, "No available streams for select-streams");
+  info = gst_play_get_current_subtitle_track (self->play);
+  if (info != NULL) {
+    ret = gst_player_subtitle_info_wrapped (info);
+    g_object_unref (info);
  }
-  g_mutex_lock (&self->lock);
-
  return ret;
}

@@ -3856,33 +1045,9 @@
gboolean
gst_player_set_audio_track (GstPlayer * self, gint stream_index)
{
-  GstPlayerStreamInfo *info;
-  gboolean ret = TRUE;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), 0);

-  g_mutex_lock (&self->lock);
-  info = gst_player_stream_info_find (self->media_info,
-      GST_TYPE_PLAYER_AUDIO_INFO, stream_index);
-  g_mutex_unlock (&self->lock);
-  if (!info) {
-    GST_ERROR_OBJECT (self, "invalid audio stream index %d", stream_index);
-    return FALSE;
-  }
-
-  if (self->use_playbin3) {
-    g_mutex_lock (&self->lock);
-    g_free (self->audio_sid);
-    self->audio_sid = g_strdup (info->stream_id);
-    ret = gst_player_select_streams (self);
-    g_mutex_unlock (&self->lock);
-  } else {
-    g_object_set (G_OBJECT (self->playbin), "current-audio", stream_index,
-        NULL);
-  }
-
-  GST_DEBUG_OBJECT (self, "set stream index '%d'", stream_index);
-  return ret;
+  return gst_play_set_audio_track (self->play, stream_index);
}

/**
@@ -3897,34 +1062,9 @@
gboolean
gst_player_set_video_track (GstPlayer * self, gint stream_index)
{
-  GstPlayerStreamInfo *info;
-  gboolean ret = TRUE;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), 0);

-  /* check if stream_index exist in our internal media_info list */
-  g_mutex_lock (&self->lock);
-  info = gst_player_stream_info_find (self->media_info,
-      GST_TYPE_PLAYER_VIDEO_INFO, stream_index);
-  g_mutex_unlock (&self->lock);
-  if (!info) {
-    GST_ERROR_OBJECT (self, "invalid video stream index %d", stream_index);
-    return FALSE;
-  }
-
-  if (self->use_playbin3) {
-    g_mutex_lock (&self->lock);
-    g_free (self->video_sid);
-    self->video_sid = g_strdup (info->stream_id);
-    ret = gst_player_select_streams (self);
-    g_mutex_unlock (&self->lock);
-  } else {
-    g_object_set (G_OBJECT (self->playbin), "current-video", stream_index,
-        NULL);
-  }
-
-  GST_DEBUG_OBJECT (self, "set stream index '%d'", stream_index);
-  return ret;
+  return gst_play_set_video_track (self->play, stream_index);
}

/**
@@ -3939,32 +1079,9 @@
gboolean
gst_player_set_subtitle_track (GstPlayer * self, gint stream_index)
{
-  GstPlayerStreamInfo *info;
-  gboolean ret = TRUE;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), 0);

-  g_mutex_lock (&self->lock);
-  info = gst_player_stream_info_find (self->media_info,
-      GST_TYPE_PLAYER_SUBTITLE_INFO, stream_index);
-  g_mutex_unlock (&self->lock);
-  if (!info) {
-    GST_ERROR_OBJECT (self, "invalid subtitle stream index %d", stream_index);
-    return FALSE;
-  }
-
-  if (self->use_playbin3) {
-    g_mutex_lock (&self->lock);
-    g_free (self->subtitle_sid);
-    self->subtitle_sid = g_strdup (info->stream_id);
-    ret = gst_player_select_streams (self);
-    g_mutex_unlock (&self->lock);
-  } else {
-    g_object_set (G_OBJECT (self->playbin), "current-text", stream_index, NULL);
-  }
-
-  GST_DEBUG_OBJECT (self, "set stream index '%d'", stream_index);
-  return ret;
+  return gst_play_set_subtitle_track (self->play, stream_index);
}

/**
@@ -3979,12 +1096,7 @@
{
  g_return_if_fail (GST_IS_PLAYER (self));

-  if (enabled)
-    player_set_flag (self, GST_PLAY_FLAG_AUDIO);
-  else
-    player_clear_flag (self, GST_PLAY_FLAG_AUDIO);
-
-  GST_DEBUG_OBJECT (self, "track is '%s'", enabled ? "Enabled" : "Disabled");
+  gst_play_set_audio_track_enabled (self->play, enabled);
}

/**
@@ -3999,12 +1111,7 @@
{
  g_return_if_fail (GST_IS_PLAYER (self));

-  if (enabled)
-    player_set_flag (self, GST_PLAY_FLAG_VIDEO);
-  else
-    player_clear_flag (self, GST_PLAY_FLAG_VIDEO);
-
-  GST_DEBUG_OBJECT (self, "track is '%s'", enabled ? "Enabled" : "Disabled");
+  gst_play_set_video_track_enabled (self->play, enabled);
}

/**
@@ -4019,18 +1126,13 @@
{
  g_return_if_fail (GST_IS_PLAYER (self));

-  if (enabled)
-    player_set_flag (self, GST_PLAY_FLAG_SUBTITLE);
-  else
-    player_clear_flag (self, GST_PLAY_FLAG_SUBTITLE);
-
-  GST_DEBUG_OBJECT (self, "track is '%s'", enabled ? "Enabled" : "Disabled");
+  gst_play_set_subtitle_track_enabled (self->play, enabled);
}

/**
 * gst_player_set_visualization:
 * @player: #GstPlayer instance
- * @name: visualization element obtained from
+ * @name: (nullable): visualization element obtained from
 *   #gst_player_visualizations_get()
 *
 * Returns: %TRUE if the visualizations was set correctly. Otherwise,
@@ -4041,61 +1143,23 @@
{
  g_return_val_if_fail (GST_IS_PLAYER (self), FALSE);

-  g_mutex_lock (&self->lock);
-  if (self->current_vis_element) {
-    gst_object_unref (self->current_vis_element);
-    self->current_vis_element = NULL;
-  }
-
-  if (name) {
-    self->current_vis_element = gst_element_factory_make (name, NULL);
-    if (!self->current_vis_element)
-      goto error_no_element;
-    gst_object_ref_sink (self->current_vis_element);
-  }
-  g_object_set (self->playbin, "vis-plugin", self->current_vis_element, NULL);
-
-  g_mutex_unlock (&self->lock);
-  GST_DEBUG_OBJECT (self, "set vis-plugin to '%s'", name);
-
-  return TRUE;
-
-error_no_element:
-  g_mutex_unlock (&self->lock);
-  GST_WARNING_OBJECT (self, "could not find visualization '%s'", name);
-  return FALSE;
+  return gst_play_set_visualization (self->play, name);
}

/**
 * gst_player_get_current_visualization:
 * @player: #GstPlayer instance
 *
- * Returns: (transfer full): Name of the currently enabled visualization.
+ * Returns: (transfer full) (nullable): Name of the currently enabled
+ * visualization.
 * g_free() after usage.
 */
gchar *
gst_player_get_current_visualization (GstPlayer * self)
{
-  gchar *name = NULL;
-  GstElement *vis_plugin = NULL;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), NULL);

-  if (!is_track_enabled (self, GST_PLAY_FLAG_VIS))
-    return NULL;
-
-  g_object_get (self->playbin, "vis-plugin", &vis_plugin, NULL);
-
-  if (vis_plugin) {
-    GstElementFactory *factory = gst_element_get_factory (vis_plugin);
-    if (factory)
-      name = g_strdup (gst_plugin_feature_get_name (factory));
-    gst_object_unref (vis_plugin);
-  }
-
-  GST_DEBUG_OBJECT (self, "vis-plugin '%s' %p", name, vis_plugin);
-
-  return name;
+  return gst_play_get_current_visualization (self->play);
}

/**
@@ -4110,13 +1174,7 @@
{
  g_return_if_fail (GST_IS_PLAYER (self));

-  if (enabled)
-    player_set_flag (self, GST_PLAY_FLAG_VIS);
-  else
-    player_clear_flag (self, GST_PLAY_FLAG_VIS);
-
-  GST_DEBUG_OBJECT (self, "visualization is '%s'",
-      enabled ? "Enabled" : "Disabled");
+  gst_play_set_visualization_enabled (self->play, enabled);
}

struct CBChannelMap
@@ -4132,28 +1190,6 @@
  /* GST_PLAYER_COLOR_BALANCE_HUE */ {"HUE", "hue"},
};

-static GstColorBalanceChannel *
-gst_player_color_balance_find_channel (GstPlayer * self,
-    GstPlayerColorBalanceType type)
-{
-  GstColorBalanceChannel *channel;
-  const GList *l, *channels;
-
-  if (type < GST_PLAYER_COLOR_BALANCE_BRIGHTNESS ||
-      type > GST_PLAYER_COLOR_BALANCE_HUE)
-    return NULL;
-
-  channels =
-      gst_color_balance_list_channels (GST_COLOR_BALANCE (self->playbin));
-  for (l = channels; l; l = l->next) {
-    channel = l->data;
-    if (g_strrstr (channel->label, cb_channel_map[type].label))
-      return channel;
-  }
-
-  return NULL;
-}
-
/**
 * gst_player_has_color_balance:
 * @player:#GstPlayer instance
@@ -4166,16 +1202,9 @@
gboolean
gst_player_has_color_balance (GstPlayer * self)
{
-  const GList *channels;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), FALSE);

-  if (!GST_IS_COLOR_BALANCE (self->playbin))
-    return FALSE;
-
-  channels =
-      gst_color_balance_list_channels (GST_COLOR_BALANCE (self->playbin));
-  return (channels != NULL);
+  return gst_play_has_color_balance (self->play);
}

/**
@@ -4191,27 +1220,11 @@
gst_player_set_color_balance (GstPlayer * self, GstPlayerColorBalanceType type,
    gdouble value)
{
-  GstColorBalanceChannel *channel;
-  gdouble new_val;
-
  g_return_if_fail (GST_IS_PLAYER (self));
  g_return_if_fail (value >= 0.0 && value <= 1.0);

-  if (!GST_IS_COLOR_BALANCE (self->playbin))
-    return;
-
-  channel = gst_player_color_balance_find_channel (self, type);
-  if (!channel)
-    return;
-
-  value = CLAMP (value, 0.0, 1.0);
-
-  /* Convert to channel range */
-  new_val = channel->min_value + value * ((gdouble) channel->max_value -
-      (gdouble) channel->min_value);
-
-  gst_color_balance_set_value (GST_COLOR_BALANCE (self->playbin), channel,
-      new_val);
+  gst_play_set_color_balance (self->play, (GstPlayColorBalanceType) type,
+      value);
}

/**
@@ -4227,24 +1240,10 @@
gdouble
gst_player_get_color_balance (GstPlayer * self, GstPlayerColorBalanceType type)
{
-  GstColorBalanceChannel *channel;
-  gint value;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), -1);

-  if (!GST_IS_COLOR_BALANCE (self->playbin))
-    return -1;
-
-  channel = gst_player_color_balance_find_channel (self, type);
-  if (!channel)
-    return -1;
-
-  value = gst_color_balance_get_value (GST_COLOR_BALANCE (self->playbin),
-      channel);
-
-  return ((gdouble) value -
-      (gdouble) channel->min_value) / ((gdouble) channel->max_value -
-      (gdouble) channel->min_value);
+  return gst_play_get_color_balance (self->play,
+      (GstPlayColorBalanceType) type);
}

/**
@@ -4561,21 +1560,7 @@
  g_return_val_if_fail (GST_IS_PLAYER (self), FALSE);
  g_return_val_if_fail (config != NULL, FALSE);

-  g_mutex_lock (&self->lock);
-
-  if (self->app_state != GST_PLAYER_STATE_STOPPED) {
-    GST_INFO_OBJECT (self, "can't change config while player is %s",
-        gst_player_state_get_name (self->app_state));
-    g_mutex_unlock (&self->lock);
-    return FALSE;
-  }
-
-  if (self->config)
-    gst_structure_free (self->config);
-  self->config = config;
-  g_mutex_unlock (&self->lock);
-
-  return TRUE;
+  return gst_play_set_config (self->play, config);
}

/**
@@ -4594,21 +1579,15 @@
GstStructure *
gst_player_get_config (GstPlayer * self)
{
-  GstStructure *ret;
-
  g_return_val_if_fail (GST_IS_PLAYER (self), NULL);

-  g_mutex_lock (&self->lock);
-  ret = gst_structure_copy (self->config);
-  g_mutex_unlock (&self->lock);
-
-  return ret;
+  return gst_play_get_config (self->play);
}

/**
 * gst_player_config_set_user_agent:
 * @config: a #GstPlayer configuration
- * @agent: the string to use as user agent
+ * @agent: (nullable): the string to use as user agent
 *
 * Set the user agent to pass to the server if @player needs to connect
 * to a server during playback. This is typically used when playing HTTP
@@ -4633,7 +1612,8 @@
 * Return the user agent which has been configured using
 * gst_player_config_set_user_agent() if any.
 *
- * Returns: (transfer full): the configured agent, or %NULL
+ * Returns: (transfer full) (nullable): the configured agent, or %NULL
+ *
 * Since: 1.10
 */
gchar *
@@ -4656,6 +1636,7 @@
 *
 * set interval in milliseconds between two position-updated signals.
 * pass 0 to stop updating the position.
+ * * Since: 1.10 */ void @@ -4750,7 +1731,7 @@ * - pixel-aspect-ratio of type GST_TYPE_FRACTION * Except for GST_PLAYER_THUMBNAIL_RAW_NATIVE format, if no config is set, pixel-aspect-ratio would be 1/1 * - * Returns: (transfer full): Current video snapshot sample or %NULL on failure + * Returns: (transfer full) (nullable): Current video snapshot sample or %NULL on failure * * Since: 1.12 */ @@ -4758,78 +1739,8 @@ gst_player_get_video_snapshot (GstPlayer * self, GstPlayerSnapshotFormat format, const GstStructure * config) { - gint video_tracks = 0; - GstSample *sample = NULL; - GstCaps *caps = NULL; - gint width = -1; - gint height = -1; - gint par_n = 1; - gint par_d = 1; g_return_val_if_fail (GST_IS_PLAYER (self), NULL); - g_object_get (self->playbin, "n-video", &video_tracks, NULL); - if (video_tracks == 0) { - GST_DEBUG_OBJECT (self, "total video track num is 0"); - return NULL; - } - - switch (format) { - case GST_PLAYER_THUMBNAIL_RAW_xRGB: - caps = gst_caps_new_simple ("video/x-raw", - "format", G_TYPE_STRING, "xRGB", NULL); - break; - case GST_PLAYER_THUMBNAIL_RAW_BGRx: - caps = gst_caps_new_simple ("video/x-raw", - "format", G_TYPE_STRING, "BGRx", NULL); - break; - case GST_PLAYER_THUMBNAIL_JPG: - caps = gst_caps_new_empty_simple ("image/jpeg"); - break; - case GST_PLAYER_THUMBNAIL_PNG: - caps = gst_caps_new_empty_simple ("image/png"); - break; - case GST_PLAYER_THUMBNAIL_RAW_NATIVE: - default: - caps = gst_caps_new_empty_simple ("video/x-raw"); - break; - } - - if (NULL != config) { - if (!gst_structure_get_int (config, "width", &width)) - width = -1; - if (!gst_structure_get_int (config, "height", &height)) - height = -1; - if (!gst_structure_get_fraction (config, "pixel-aspect-ratio", &par_n, - &par_d)) { - if (format != GST_PLAYER_THUMBNAIL_RAW_NATIVE) { - par_n = 1; - par_d = 1; - } else { - par_n = 0; - par_d = 0; - } - } - } - - if (width > 0 && height > 0) { - gst_caps_set_simple (caps, "width", G_TYPE_INT, width, - "height", G_TYPE_INT, height, 
NULL); - } - - if (format != GST_PLAYER_THUMBNAIL_RAW_NATIVE) { - gst_caps_set_simple (caps, "pixel-aspect-ratio", GST_TYPE_FRACTION, - par_n, par_d, NULL); - } else if (NULL != config && par_n != 0 && par_d != 0) { - gst_caps_set_simple (caps, "pixel-aspect-ratio", GST_TYPE_FRACTION, - par_n, par_d, NULL); - } - - g_signal_emit_by_name (self->playbin, "convert-sample", caps, &sample); - gst_caps_unref (caps); - if (!sample) { - GST_WARNING_OBJECT (self, "Failed to retrieve or convert video frame"); - return NULL; - } - - return sample; + return gst_play_get_video_snapshot (self->play, + (GstPlaySnapshotFormat) format, config); }
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/player/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/player/meson.build
Changed
@@ -1,14 +1,15 @@
-gstplayer_sources = [
+gstplayer_sources = files([
   'gstplayer.c',
   'gstplayer-signal-dispatcher.c',
   'gstplayer-video-renderer.c',
   'gstplayer-media-info.c',
   'gstplayer-g-main-context-signal-dispatcher.c',
   'gstplayer-video-overlay-video-renderer.c',
+  'gstplayer-wrapped-video-renderer.c',
   'gstplayer-visualization.c',
-]
+])
 
-gstplayer_headers = [
+gstplayer_headers = files([
   'player.h',
   'player-prelude.h',
   'gstplayer.h',
@@ -19,43 +20,60 @@
   'gstplayer-g-main-context-signal-dispatcher.h',
   'gstplayer-video-overlay-video-renderer.h',
   'gstplayer-visualization.h',
-]
+])
 
 install_headers(gstplayer_headers, subdir : 'gstreamer-' + api_version + '/gst/player/')
 
 gstplayer = library('gstplayer-' + api_version,
   gstplayer_sources,
-  c_args : gst_plugins_bad_args + ['-DBUILDING_GST_PLAYER'],
+  c_args : gst_plugins_bad_args + ['-DBUILDING_GST_PLAYER', '-DG_LOG_DOMAIN="GStreamer-Player"'],
   include_directories : [configinc, libsinc],
   version : libversion,
   soversion : soversion,
   darwin_versions : osxversion,
   install : true,
-  dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep,
+  dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep, gstplay_dep,
     gsttag_dep, gstpbutils_dep],
 )
 
+library_def = {'lib': gstplayer}
+pkg_name = 'gstreamer-player-1.0'
+pkgconfig.generate(gstplayer,
+  libraries : [gst_dep, gstvideo_dep],
+  variables : pkgconfig_variables,
+  subdirs : pkgconfig_subdirs,
+  name : 'gstreamer-player-1.0',
+  description : 'GStreamer Player convenience library',
+)
+
 gen_sources = []
 if build_gir
-  player_gir = gnome.generate_gir(gstplayer,
-    sources : gstplayer_sources + gstplayer_headers,
-    namespace : 'GstPlayer',
-    nsversion : api_version,
-    identifier_prefix : 'Gst',
-    symbol_prefix : 'gst',
-    export_packages : 'gstreamer-player-1.0',
-    includes : ['Gst-1.0', 'GstPbutils-1.0', 'GstBase-1.0', 'GstVideo-1.0',
+  gir = {
+    'sources' : gstplayer_sources + gstplayer_headers,
+    'namespace' : 'GstPlayer',
+    'nsversion' : api_version,
+    'identifier_prefix' : 'Gst',
+    'symbol_prefix' : 'gst',
+    'export_packages' : pkg_name,
+    'includes' : ['Gst-1.0', 'GstPbutils-1.0', 'GstBase-1.0', 'GstVideo-1.0',
       'GstAudio-1.0', 'GstTag-1.0'],
-    install : true,
-    extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/player/player.h'],
-    dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep,
-      gsttag_dep, gstpbutils_dep]
-  )
-  gen_sources += player_gir
+    'install' : true,
+    'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/player/player.h'],
+    'dependencies' : [gstbase_dep, gstvideo_dep, gstaudio_dep, gstplay_dep,
+      gsttag_dep, gstpbutils_dep]
+  }
+  library_def = {'lib': library_def['lib'], 'gir': [gir]}
+  if not static_build
+    player_gir = gnome.generate_gir(gstplayer, kwargs: gir)
+    gen_sources += player_gir
+  endif
 endif
 
+libraries += [[pkg_name, library_def]]
 gstplayer_dep = declare_dependency(link_with : gstplayer,
   include_directories : [libsinc],
   sources: gen_sources,
-  dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep,
+  dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep, gstplay_dep,
     gsttag_dep, gstpbutils_dep])
+
+meson.override_dependency(pkg_name, gstplayer_dep)
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/sctp/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/sctp/meson.build
Changed
@@ -13,7 +13,7 @@
 libgstsctp = library('gstsctp-' + api_version,
   sctp_sources,
-  c_args : gst_plugins_bad_args + ['-DBUILDING_GST_SCTP'],
+  c_args : gst_plugins_bad_args + ['-DBUILDING_GST_SCTP', '-DG_LOG_DOMAIN="GStreamer-SCTP"'],
   include_directories : [configinc, libsinc],
   version : libversion,
   soversion : soversion,
@@ -21,6 +21,18 @@
   dependencies : [gstbase_dep],
 )
 
+pkg_name = 'gstreamer-sctp-1.0'
+libraries += [[pkg_name, {'lib': libgstsctp}]]
+pkgconfig.generate(libgstsctp,
+  libraries : [gst_dep],
+  variables : pkgconfig_variables,
+  subdirs : pkgconfig_subdirs,
+  name : pkg_name,
+  description : 'SCTP helper functions',
+)
+
 gstsctp_dep = declare_dependency(link_with : libgstsctp,
   include_directories : [libsinc],
   dependencies : [gstbase_dep])
+
+meson.override_dependency(pkg_name, gstsctp_dep)
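Each library touched by this release gains the same three additions: registration in the global `libraries` list, a generated pkg-config file, and a `meson.override_dependency()` call so that consumers building GStreamer as a subproject resolve the in-tree dependency. A generic sketch of the pattern (the `foo`/`libgstfoo` names are placeholders, not real targets):

```meson
# Placeholder names; substitute the real library target and its deps.
pkg_name = 'gstreamer-foo-1.0'
libraries += [[pkg_name, {'lib': libgstfoo}]]
pkgconfig.generate(libgstfoo,
  libraries : [gst_dep],
  variables : pkgconfig_variables,
  subdirs : pkgconfig_subdirs,
  name : pkg_name,
  description : 'Foo helper functions',
)

gstfoo_dep = declare_dependency(link_with : libgstfoo,
  include_directories : [libsinc])

# Subproject consumers asking for pkg_name get this dependency object
# instead of a system lookup.
meson.override_dependency(pkg_name, gstfoo_dep)
```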
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/gsttranscoder-private.h
Added
@@ -0,0 +1,44 @@
+/* GStreamer
+ *
+ * Copyright (C) 2020 Stephan Hesse <stephan@emliri.com>
+ * Copyright (C) 2020 Thibault Saunier <tsaunier@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#define GST_TRANSCODER_MESSAGE_DATA "gst-transcoder-message-data"
+#define GST_TRANSCODER_MESSAGE_DATA_TYPE "transcoder-message-type"
+#define GST_TRANSCODER_MESSAGE_DATA_POSITION "position"
+#define GST_TRANSCODER_MESSAGE_DATA_DURATION "duration"
+#define GST_TRANSCODER_MESSAGE_DATA_STATE "state"
+#define GST_TRANSCODER_MESSAGE_DATA_ERROR "error"
+#define GST_TRANSCODER_MESSAGE_DATA_WARNING "warning"
+#define GST_TRANSCODER_MESSAGE_DATA_ISSUE_DETAILS "issue-details"
+
+struct _GstTranscoderSignalAdapter
+{
+  GObject parent;
+  GstBus *bus;
+  GSource *source;
+
+  GWeakRef transcoder;
+};
+
+
+GstTranscoderSignalAdapter * gst_transcoder_signal_adapter_new_sync_emit (GstTranscoder * transcoder);
+GstTranscoderSignalAdapter * gst_transcoder_signal_adapter_new (GstTranscoder * transcoder, GMainContext * context);
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/gsttranscoder-signal-adapter.c
Added
@@ -0,0 +1,342 @@ +/* GStreamer + * + * Copyright (C) 2019-2020 Stephan Hesse <stephan@emliri.com> + * Copyright (C) 2020 Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gsttranscoder.h" +#include "gsttranscoder-signal-adapter.h" + +#include "gsttranscoder-private.h" + +#include <gst/gst.h> + +GST_DEBUG_CATEGORY_STATIC (gst_transcoder_signal_adapter_debug); +#define GST_CAT_DEFAULT gst_transcoder_signal_adapter_debug + +enum +{ + SIGNAL_POSITION_UPDATED, + SIGNAL_DURATION_CHANGED, + SIGNAL_STATE_CHANGED, + SIGNAL_DONE, + SIGNAL_ERROR, + SIGNAL_WARNING, + SIGNAL_LAST +}; + +enum +{ + PROP_0, + PROP_TRANSCODER, + PROP_LAST +}; + +static GParamSpec *param_specs[PROP_LAST] = { NULL, }; + +struct _GstTranscoderSignalAdapterClass +{ + GObjectClass parent_class; +}; + +#define _do_init \ + GST_DEBUG_CATEGORY_INIT (gst_transcoder_signal_adapter_debug, "gst-transcoder-signaladapter", \ + 0, "GstTranscoder signal adapter") + +#define parent_class gst_transcoder_signal_adapter_parent_class +G_DEFINE_TYPE_WITH_CODE (GstTranscoderSignalAdapter, + gst_transcoder_signal_adapter, G_TYPE_OBJECT, _do_init); + +static guint signals[SIGNAL_LAST] = { 0, }; + +static void 
+gst_transcoder_signal_adapter_emit (GstTranscoderSignalAdapter * self, + const GstStructure * message_data) +{ + GstTranscoderMessage transcoder_message_type; + g_return_if_fail (g_str_equal (gst_structure_get_name (message_data), + GST_TRANSCODER_MESSAGE_DATA)); + + GST_LOG ("Emitting message %" GST_PTR_FORMAT, message_data); + gst_structure_get (message_data, GST_TRANSCODER_MESSAGE_DATA_TYPE, + GST_TYPE_TRANSCODER_MESSAGE, &transcoder_message_type, NULL); + + switch (transcoder_message_type) { + case GST_TRANSCODER_MESSAGE_POSITION_UPDATED:{ + GstClockTime pos = GST_CLOCK_TIME_NONE; + gst_structure_get (message_data, GST_TRANSCODER_MESSAGE_DATA_POSITION, + GST_TYPE_CLOCK_TIME, &pos, NULL); + g_signal_emit (self, signals[SIGNAL_POSITION_UPDATED], 0, pos); + break; + } + case GST_TRANSCODER_MESSAGE_DURATION_CHANGED:{ + GstClockTime duration = GST_CLOCK_TIME_NONE; + gst_structure_get (message_data, GST_TRANSCODER_MESSAGE_DATA_DURATION, + GST_TYPE_CLOCK_TIME, &duration, NULL); + g_signal_emit (self, signals[SIGNAL_DURATION_CHANGED], 0, duration); + break; + } + case GST_TRANSCODER_MESSAGE_STATE_CHANGED:{ + GstTranscoderState state; + gst_structure_get (message_data, GST_TRANSCODER_MESSAGE_DATA_STATE, + GST_TYPE_TRANSCODER_STATE, &state, NULL); + g_signal_emit (self, signals[SIGNAL_STATE_CHANGED], 0, state); + break; + } + case GST_TRANSCODER_MESSAGE_DONE: + g_signal_emit (self, signals[SIGNAL_DONE], 0); + break; + case GST_TRANSCODER_MESSAGE_ERROR:{ + GError *error = NULL; + GstStructure *details = NULL; + + gst_structure_get (message_data, GST_TRANSCODER_MESSAGE_DATA_ERROR, + G_TYPE_ERROR, &error, GST_TYPE_STRUCTURE, &details, NULL); + g_signal_emit (self, signals[SIGNAL_ERROR], 0, error, details); + g_error_free (error); + if (details) + gst_structure_free (details); + break; + } + case GST_TRANSCODER_MESSAGE_WARNING:{ + GstStructure *details = NULL; + GError *error = NULL; + + gst_structure_get (message_data, GST_TRANSCODER_MESSAGE_DATA_WARNING, + G_TYPE_ERROR, 
&error, GST_TYPE_STRUCTURE, &details, NULL); + g_signal_emit (self, signals[SIGNAL_WARNING], 0, error, details); + g_error_free (error); + if (details) + gst_structure_free (details); + break; + } + default: + g_assert_not_reached (); + break; + } +} + +/* + * callback for the bus-message in-sync handling + */ +static GstBusSyncReply + gst_transcoder_signal_adapter_bus_sync_handler + (GstBus * bus, GstMessage * message, gpointer user_data) +{ + GstTranscoderSignalAdapter *self = GST_TRANSCODER_SIGNAL_ADAPTER (user_data); + const GstStructure *message_data = gst_message_get_structure (message); + gst_transcoder_signal_adapter_emit (self, message_data); + gst_message_unref (message); + return GST_BUS_DROP; +} + +/* + * callback for the bus-watch + * pre: there is a message on the bus + */ +static gboolean +gst_transcoder_signal_adapter_on_message (GstBus * bus, + GstMessage * message, gpointer user_data) +{ + GstTranscoderSignalAdapter *self = GST_TRANSCODER_SIGNAL_ADAPTER (user_data); + const GstStructure *message_data = gst_message_get_structure (message); + gst_transcoder_signal_adapter_emit (self, message_data); + return TRUE; +} + +/** + * gst_transcoder_signal_adapter_new: + * @transcoder: (transfer none): #GstTranscoder instance to emit signals for. + * @context: (nullable): A #GMainContext on which the main-loop will process + * transcoder bus messages on. Can be NULL (thread-default + * context will be used then). + * + * A bus-watching #GSource will be created and attached to the context. The + * attached callback will emit the corresponding signal for the message + * received. Matching signals for transcoder messages from the bus will be + * emitted by it on the created adapter object. + * + * Returns: (transfer full)(nullable): A new #GstTranscoderSignalAdapter to + * connect signal handlers to. 
+ * + * Since: 1.20 + */ +GstTranscoderSignalAdapter * +gst_transcoder_signal_adapter_new (GstTranscoder * transcoder, + GMainContext * context) +{ + GstTranscoderSignalAdapter *self = NULL; + + g_return_val_if_fail (GST_IS_TRANSCODER (transcoder), NULL); + + self = g_object_new (GST_TYPE_TRANSCODER_SIGNAL_ADAPTER, NULL); + self->bus = gst_transcoder_get_message_bus (transcoder); + self->source = gst_bus_create_watch (self->bus); + + if (!self->source) { + GST_ERROR_OBJECT (transcoder, "Could not create watch."); + + gst_object_unref (self); + + return NULL; + } + + g_weak_ref_set (&self->transcoder, transcoder); + g_source_attach (self->source, context); + g_source_set_callback (self->source, + (GSourceFunc) gst_transcoder_signal_adapter_on_message, self, NULL); + return self; +} + +/** + * gst_transcoder_signal_adapter_new_sync_emit: + * @transcoder: (transfer none): #GstTranscoder instance to emit signals + * synchronously for. + * + * Returns: (transfer full): A new #GstTranscoderSignalAdapter to connect signal + * handlers to. 
+ * + * Since: 1.20 + */ +GstTranscoderSignalAdapter * +gst_transcoder_signal_adapter_new_sync_emit (GstTranscoder * transcoder) +{ + GstBus *bus = NULL; + GstTranscoderSignalAdapter *self = NULL; + + g_return_val_if_fail (GST_IS_TRANSCODER (transcoder), NULL); + + bus = gst_transcoder_get_message_bus (transcoder); + + self = g_object_new (GST_TYPE_TRANSCODER_SIGNAL_ADAPTER, NULL); + self->bus = bus; + gst_bus_set_sync_handler (self->bus, + gst_transcoder_signal_adapter_bus_sync_handler, self, NULL); + return self; +} + +static void +gst_transcoder_signal_adapter_init (GstTranscoderSignalAdapter * self) +{ + self->source = NULL; +} + +static void +gst_transcoder_signal_adapter_dispose (GObject * object) +{ + GstTranscoderSignalAdapter *self = GST_TRANSCODER_SIGNAL_ADAPTER (object); + + if (self->source) { + g_source_destroy (self->source); + g_source_unref (self->source); + self->source = NULL; + } + + gst_clear_object (&self->bus); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_transcoder_signal_adapter_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstTranscoderSignalAdapter *self = GST_TRANSCODER_SIGNAL_ADAPTER (object); + + switch (prop_id) { + case PROP_TRANSCODER: + g_value_take_object (value, g_weak_ref_get (&self->transcoder)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_transcoder_signal_adapter_class_init (GstTranscoderSignalAdapterClass * + klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + + gobject_class->dispose = gst_transcoder_signal_adapter_dispose; + gobject_class->get_property = gst_transcoder_signal_adapter_get_property; + + signals[SIGNAL_POSITION_UPDATED] = + g_signal_new ("position-updated", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); + + signals[SIGNAL_DURATION_CHANGED] = + 
g_signal_new ("duration-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); + + signals[SIGNAL_DONE] = + g_signal_new ("done", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 0, G_TYPE_INVALID); + + signals[SIGNAL_ERROR] = + g_signal_new ("error", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 2, G_TYPE_ERROR, GST_TYPE_STRUCTURE); + + signals[SIGNAL_WARNING] = + g_signal_new ("warning", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 2, G_TYPE_ERROR, GST_TYPE_STRUCTURE); + + signals[SIGNAL_STATE_CHANGED] = + g_signal_new ("state-changed", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, + NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_TRANSCODER_STATE); + + /** + * GstTranscoderSignalAdapter:transcoder: + * + * The #GstTranscoder tracked by the adapter. + * + * Since: 1.20 + */ + param_specs[PROP_TRANSCODER] = + g_param_spec_object ("transcoder", "Transcoder", + "The GstTranscoder @self is tracking", GST_TYPE_TRANSCODER, + G_PARAM_READABLE); + + g_object_class_install_properties (gobject_class, PROP_LAST, param_specs); +} + + +/** + * gst_transcoder_signal_adapter_get_transcoder: + * @self: The #GstTranscoderSignalAdapter + * + * Returns: (transfer full)(nullable): The #GstTranscoder @self is tracking + * + * Since: 1.20 + */ +GstTranscoder * +gst_transcoder_signal_adapter_get_transcoder (GstTranscoderSignalAdapter * self) +{ + return g_weak_ref_get (&self->transcoder); +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/gsttranscoder-signal-adapter.h
Added
@@ -0,0 +1,55 @@
+/* GStreamer
+ *
+ * Copyright (C) 2019-2020 Stephan Hesse <stephan@emliri.com>
+ * Copyright (C) 2020 Thibault Saunier <tsaunier@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+#pragma once
+
+#include <gst/gst.h>
+#include <gst/transcoder/gsttranscoder.h>
+
+G_BEGIN_DECLS
+
+/**
+ * GstTranscoderSignalAdapter:
+ *
+ * Transforms #GstTranscoder bus messages to signals from the adapter object.
+ *
+ * Since: 1.20
+ */
+
+/**
+ * GST_TYPE_TRANSCODER_SIGNAL_ADAPTER:
+ *
+ * Since: 1.20
+ */
+#define GST_TYPE_TRANSCODER_SIGNAL_ADAPTER (gst_transcoder_signal_adapter_get_type ())
+GST_TRANSCODER_API
+
+/**
+ * GstTranscoderSignalAdapterClass:
+ *
+ * Since: 1.20
+ */
+G_DECLARE_FINAL_TYPE(GstTranscoderSignalAdapter, gst_transcoder_signal_adapter, GST, TRANSCODER_SIGNAL_ADAPTER, GObject)
+
+GST_TRANSCODER_API
+GstTranscoder * gst_transcoder_signal_adapter_get_transcoder (GstTranscoderSignalAdapter * self);
+
+
+G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/transcoder/gsttranscoder.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/gsttranscoder.c
Changed
@@ -28,6 +28,9 @@ */ #include "gsttranscoder.h" +#include "gsttranscoder-private.h" + +static GOnce once = G_ONCE_INIT; GST_DEBUG_CATEGORY_STATIC (gst_transcoder_debug); #define GST_CAT_DEFAULT gst_transcoder_debug @@ -52,7 +55,6 @@ enum { PROP_0, - PROP_SIGNAL_DISPATCHER, PROP_SRC_URI, PROP_DEST_URI, PROP_PROFILE, @@ -64,22 +66,10 @@ PROP_LAST }; -enum -{ - SIGNAL_POSITION_UPDATED, - SIGNAL_DURATION_CHANGED, - SIGNAL_DONE, - SIGNAL_ERROR, - SIGNAL_WARNING, - SIGNAL_LAST -}; - struct _GstTranscoder { GstObject parent; - GstTranscoderSignalDispatcher *signal_dispatcher; - GstEncodingProfile *profile; gchar *source_uri; gchar *dest_uri; @@ -99,6 +89,12 @@ gint wanted_cpu_usage; GstClockTime last_duration; + + GstTranscoderState app_state; + + GstBus *api_bus; + GstTranscoderSignalAdapter *signal_adapter; + GstTranscoderSignalAdapter *sync_signal_adapter; }; struct _GstTranscoderClass @@ -106,15 +102,9 @@ GstObjectClass parent_class; }; -static void -gst_transcoder_signal_dispatcher_dispatch (GstTranscoderSignalDispatcher * self, - GstTranscoder * transcoder, void (*emitter) (gpointer data), gpointer data, - GDestroyNotify destroy); - #define parent_class gst_transcoder_parent_class G_DEFINE_TYPE (GstTranscoder, gst_transcoder, GST_TYPE_OBJECT); -static guint signals[SIGNAL_LAST] = { 0, }; static GParamSpec *param_specs[PROP_LAST] = { NULL, }; static void gst_transcoder_dispose (GObject * object); @@ -162,6 +152,7 @@ self->context = g_main_context_new (); self->loop = g_main_loop_new (self->context, FALSE); + self->api_bus = gst_bus_new (); self->wanted_cpu_usage = 100; self->position_update_interval_ms = DEFAULT_POSITION_UPDATE_INTERVAL_MS; @@ -180,12 +171,6 @@ gobject_class->finalize = gst_transcoder_finalize; gobject_class->constructed = gst_transcoder_constructed; - param_specs[PROP_SIGNAL_DISPATCHER] = - g_param_spec_object ("signal-dispatcher", - "Signal Dispatcher", "Dispatcher for the signals to e.g. 
event loops", - GST_TYPE_TRANSCODER_SIGNAL_DISPATCHER, - G_PARAM_WRITABLE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS); - param_specs[PROP_SRC_URI] = g_param_spec_string ("src-uri", "URI", "Source URI", DEFAULT_URI, G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS); @@ -232,31 +217,6 @@ DEFAULT_AVOID_REENCODING, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); g_object_class_install_properties (gobject_class, PROP_LAST, param_specs); - - signals[SIGNAL_POSITION_UPDATED] = - g_signal_new ("position-updated", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, - NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); - - signals[SIGNAL_DURATION_CHANGED] = - g_signal_new ("duration-changed", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, - NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_CLOCK_TIME); - - signals[SIGNAL_DONE] = - g_signal_new ("done", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, - NULL, NULL, G_TYPE_NONE, 0, G_TYPE_INVALID); - - signals[SIGNAL_ERROR] = - g_signal_new ("error", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, - NULL, NULL, G_TYPE_NONE, 2, G_TYPE_ERROR, GST_TYPE_STRUCTURE); - - signals[SIGNAL_WARNING] = - g_signal_new ("warning", G_TYPE_FROM_CLASS (klass), - G_SIGNAL_RUN_LAST | G_SIGNAL_NO_RECURSE | G_SIGNAL_NO_HOOKS, 0, NULL, - NULL, NULL, G_TYPE_NONE, 2, G_TYPE_ERROR, GST_TYPE_STRUCTURE); } static void @@ -266,10 +226,14 @@ GST_TRACE_OBJECT (self, "Stopping main thread"); + GST_OBJECT_LOCK (self); if (self->loop) { g_main_loop_quit (self->loop); + GST_OBJECT_UNLOCK (self); g_thread_join (self->thread); + + GST_OBJECT_LOCK (self); self->thread = NULL; g_main_loop_unref (self->loop); @@ -278,6 +242,11 @@ g_main_context_unref (self->context); self->context = NULL; + gst_clear_object (&self->signal_adapter); + gst_clear_object 
(&self->sync_signal_adapter); + GST_OBJECT_UNLOCK (self); + } else { + GST_OBJECT_UNLOCK (self); } G_OBJECT_CLASS (parent_class)->dispose (object); @@ -292,8 +261,6 @@ g_free (self->source_uri); g_free (self->dest_uri); - if (self->signal_dispatcher) - g_object_unref (self->signal_dispatcher); g_cond_clear (&self->cond); G_OBJECT_CLASS (parent_class)->finalize (object); @@ -329,9 +296,6 @@ GstTranscoder *self = GST_TRANSCODER (object); switch (prop_id) { - case PROP_SIGNAL_DISPATCHER: - self->signal_dispatcher = g_value_dup_object (value); - break; case PROP_SRC_URI:{ GST_OBJECT_LOCK (self); g_free (self->source_uri); @@ -441,6 +405,34 @@ } } +/* + * Works same as gst_structure_set to set field/type/value triplets on message data + */ +static void +api_bus_post_message (GstTranscoder * self, GstTranscoderMessage message_type, + const gchar * firstfield, ...) +{ + GstStructure *message_data = NULL; + GstMessage *msg = NULL; + va_list varargs; + + GST_INFO ("Posting API-bus message-type: %s", + gst_transcoder_message_get_name (message_type)); + message_data = gst_structure_new (GST_TRANSCODER_MESSAGE_DATA, + GST_TRANSCODER_MESSAGE_DATA_TYPE, GST_TYPE_TRANSCODER_MESSAGE, + message_type, NULL); + + va_start (varargs, firstfield); + gst_structure_set_valist (message_data, firstfield, varargs); + va_end (varargs); + + msg = gst_message_new_custom (GST_MESSAGE_APPLICATION, + GST_OBJECT (self), message_data); + GST_DEBUG ("Created message with payload: [ %" GST_PTR_FORMAT " ]", + message_data); + gst_bus_post (self->api_bus, msg); +} + static gboolean main_loop_running_cb (gpointer user_data) { @@ -455,56 +447,27 @@ return G_SOURCE_REMOVE; } -typedef struct -{ - GstTranscoder *transcoder; - GstClockTime position; -} PositionUpdatedSignalData; - -static void -position_updated_dispatch (gpointer user_data) -{ - PositionUpdatedSignalData *data = user_data; - - if (data->transcoder->target_state >= GST_STATE_PAUSED) { - g_signal_emit (data->transcoder, 
signals[SIGNAL_POSITION_UPDATED], 0, - data->position); - g_object_notify_by_pspec (G_OBJECT (data->transcoder), - param_specs[PROP_POSITION]); - } -} - -static void -position_updated_signal_data_free (PositionUpdatedSignalData * data) -{ - g_object_unref (data->transcoder); - g_free (data); -} - static gboolean tick_cb (gpointer user_data) { GstTranscoder *self = GST_TRANSCODER (user_data); gint64 position; - if (self->target_state >= GST_STATE_PAUSED - && gst_element_query_position (self->transcodebin, GST_FORMAT_TIME, + if (self->target_state < GST_STATE_PAUSED) + return G_SOURCE_CONTINUE; + + if (!gst_element_query_position (self->transcodebin, GST_FORMAT_TIME, &position)) { - GST_LOG_OBJECT (self, "Position %" GST_TIME_FORMAT, - GST_TIME_ARGS (position)); - - if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID, - signals[SIGNAL_POSITION_UPDATED], 0, NULL, NULL, NULL) != 0) { - PositionUpdatedSignalData *data = g_new0 (PositionUpdatedSignalData, 1); - - data->transcoder = g_object_ref (self); - data->position = position; - gst_transcoder_signal_dispatcher_dispatch (self->signal_dispatcher, self, - position_updated_dispatch, data, - (GDestroyNotify) position_updated_signal_data_free); - } + GST_LOG_OBJECT (self, "Could not query position"); + return G_SOURCE_CONTINUE; } + GST_LOG_OBJECT (self, "Position %" GST_TIME_FORMAT, GST_TIME_ARGS (position)); + + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_POSITION_UPDATED, + GST_TRANSCODER_MESSAGE_DATA_POSITION, GST_TYPE_CLOCK_TIME, position, + NULL); + return G_SOURCE_CONTINUE; } @@ -533,58 +496,6 @@ self->tick_source = NULL; } -typedef struct -{ - GstTranscoder *transcoder; - GError *err; - GstStructure *details; -} IssueSignalData; - -static void -error_dispatch (gpointer user_data) -{ - IssueSignalData *data = user_data; - - g_signal_emit (data->transcoder, signals[SIGNAL_ERROR], 0, data->err, - data->details); -} - -static void -free_issue_signal_data (IssueSignalData * data) -{ - g_object_unref 
(data->transcoder); - if (data->details) - gst_structure_free (data->details); - g_clear_error (&data->err); - g_free (data); -} - -static void -emit_error (GstTranscoder * self, GError * err, const GstStructure * details) -{ - if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID, - signals[SIGNAL_ERROR], 0, NULL, NULL, NULL) != 0) { - IssueSignalData *data = g_new0 (IssueSignalData, 1); - - data->transcoder = g_object_ref (self); - data->err = g_error_copy (err); - if (details) - data->details = gst_structure_copy (details); - gst_transcoder_signal_dispatcher_dispatch (self->signal_dispatcher, self, - error_dispatch, data, (GDestroyNotify) free_issue_signal_data); - } - - g_error_free (err); - - remove_tick_source (self); - - self->target_state = GST_STATE_NULL; - self->current_state = GST_STATE_NULL; - self->is_live = FALSE; - self->is_eos = FALSE; - gst_element_set_state (self->transcodebin, GST_STATE_NULL); -} - static void dump_dot_file (GstTranscoder * self, const gchar * name) { @@ -593,39 +504,12 @@ full_name = g_strdup_printf ("gst-transcoder.%p.%s", self, name); GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (self->transcodebin), - GST_DEBUG_GRAPH_SHOW_VERBOSE, full_name); + GST_DEBUG_GRAPH_SHOW_ALL, full_name); g_free (full_name); } static void -warning_dispatch (gpointer user_data) -{ - IssueSignalData *data = user_data; - - g_signal_emit (data->transcoder, signals[SIGNAL_WARNING], 0, data->err, - data->details); -} - -static void -emit_warning (GstTranscoder * self, GError * err, const GstStructure * details) -{ - if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID, - signals[SIGNAL_WARNING], 0, NULL, NULL, NULL) != 0) { - IssueSignalData *data = g_new0 (IssueSignalData, 1); - - data->transcoder = g_object_ref (self); - data->err = g_error_copy (err); - if (details) - data->details = gst_structure_copy (details); - gst_transcoder_signal_dispatcher_dispatch (self->signal_dispatcher, self, - warning_dispatch, data, (GDestroyNotify) free_issue_signal_data); - } 
- - g_error_free (err); -} - -static void error_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg, gpointer user_data) { GError *err; @@ -650,7 +534,11 @@ "msg-source-element-name", G_TYPE_STRING, "name", "msg-source-type", G_TYPE_GTYPE, G_OBJECT_TYPE (msg->src), "msg-error", G_TYPE_STRING, message, NULL); - emit_error (self, g_error_copy (err), details); + + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_ERROR, + GST_TRANSCODER_MESSAGE_DATA_ERROR, G_TYPE_ERROR, err, + GST_TRANSCODER_MESSAGE_DATA_ISSUE_DETAILS, GST_TYPE_STRUCTURE, details, + NULL); gst_structure_free (details); g_clear_error (&err); @@ -691,8 +579,13 @@ transcoder_err = g_error_new_literal (GST_TRANSCODER_ERROR, GST_TRANSCODER_ERROR_FAILED, full_message); - emit_warning (self, transcoder_err, details); + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_WARNING, + GST_TRANSCODER_MESSAGE_DATA_WARNING, G_TYPE_ERROR, transcoder_err, + GST_TRANSCODER_MESSAGE_DATA_ISSUE_DETAILS, GST_TYPE_STRUCTURE, details, + NULL); + + g_clear_error (&transcoder_err); g_clear_error (&err); g_free (debug); g_free (name); @@ -701,9 +594,17 @@ } static void -eos_dispatch (gpointer user_data) +notify_state_changed (GstTranscoder * self, GstTranscoderState new_state) { - g_signal_emit (user_data, signals[SIGNAL_DONE], 0); + if (new_state == self->app_state) + return; + + GST_DEBUG_OBJECT (self, "Notifying new state: %s", + gst_transcoder_state_get_name (new_state)); + self->app_state = new_state; + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_STATE_CHANGED, + GST_TRANSCODER_MESSAGE_DATA_STATE, GST_TYPE_TRANSCODER_STATE, new_state, + NULL); } static void @@ -719,11 +620,8 @@ tick_cb (self); remove_tick_source (self); - if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID, - signals[SIGNAL_DONE], 0, NULL, NULL, NULL) != 0) { - gst_transcoder_signal_dispatcher_dispatch (self->signal_dispatcher, self, - eos_dispatch, g_object_ref (self), (GDestroyNotify) g_object_unref); - } + notify_state_changed (self, 
GST_TRANSCODER_STATE_STOPPED); + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_DONE, NULL, NULL); self->is_eos = TRUE; } @@ -740,54 +638,13 @@ if (state_ret != GST_STATE_CHANGE_FAILURE) state_ret = gst_element_set_state (self->transcodebin, GST_STATE_PLAYING); - if (state_ret == GST_STATE_CHANGE_FAILURE) - emit_error (self, g_error_new (GST_TRANSCODER_ERROR, - GST_TRANSCODER_ERROR_FAILED, "Failed to handle clock loss"), - NULL); - } -} - -typedef struct -{ - GstTranscoder *transcoder; - GstClockTime duration; -} DurationChangedSignalData; - -static void -duration_changed_dispatch (gpointer user_data) -{ - DurationChangedSignalData *data = user_data; - - if (data->transcoder->target_state >= GST_STATE_PAUSED) { - g_signal_emit (data->transcoder, signals[SIGNAL_DURATION_CHANGED], 0, - data->duration); - g_object_notify_by_pspec (G_OBJECT (data->transcoder), - param_specs[PROP_DURATION]); - } -} - -static void -duration_changed_signal_data_free (DurationChangedSignalData * data) -{ - g_object_unref (data->transcoder); - g_free (data); -} - -static void -emit_duration_changed (GstTranscoder * self, GstClockTime duration) -{ - GST_DEBUG_OBJECT (self, "Duration changed %" GST_TIME_FORMAT, - GST_TIME_ARGS (duration)); - - if (g_signal_handler_find (self, G_SIGNAL_MATCH_ID, - signals[SIGNAL_DURATION_CHANGED], 0, NULL, NULL, NULL) != 0) { - DurationChangedSignalData *data = g_new0 (DurationChangedSignalData, 1); - - data->transcoder = g_object_ref (self); - data->duration = duration; - gst_transcoder_signal_dispatcher_dispatch (self->signal_dispatcher, self, - duration_changed_dispatch, data, - (GDestroyNotify) duration_changed_signal_data_free); + if (state_ret == GST_STATE_CHANGE_FAILURE) { + GError *err = g_error_new (GST_TRANSCODER_ERROR, + GST_TRANSCODER_ERROR_FAILED, "Failed to handle clock loss"); + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_ERROR, + GST_TRANSCODER_MESSAGE_DATA_ERROR, G_TYPE_ERROR, err, NULL); + g_error_free (err); + } } } @@ -816,9 
+673,16 @@ self->current_state = new_state; + if (new_state == GST_STATE_PAUSED + && pending_state == GST_STATE_VOID_PENDING) { + remove_tick_source (self); + notify_state_changed (self, GST_TRANSCODER_STATE_PAUSED); + } + if (new_state == GST_STATE_PLAYING && pending_state == GST_STATE_VOID_PENDING) { add_tick_source (self); + notify_state_changed (self, GST_TRANSCODER_STATE_PLAYING); } } } @@ -832,7 +696,9 @@ if (gst_element_query_duration (self->transcodebin, GST_FORMAT_TIME, &duration)) { - emit_duration_changed (self, duration); + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_DURATION_CHANGED, + GST_TRANSCODER_MESSAGE_DATA_DURATION, GST_TYPE_CLOCK_TIME, + duration, NULL); } } @@ -862,11 +728,16 @@ self->target_state = state; state_ret = gst_element_set_state (self->transcodebin, state); - if (state_ret == GST_STATE_CHANGE_FAILURE) - emit_error (self, g_error_new (GST_TRANSCODER_ERROR, - GST_TRANSCODER_ERROR_FAILED, - "Failed to change to requested state %s", - gst_element_state_get_name (state)), NULL); + if (state_ret == GST_STATE_CHANGE_FAILURE) { + GError *err = g_error_new (GST_TRANSCODER_ERROR, + GST_TRANSCODER_ERROR_FAILED, + "Failed to change to requested state %s", + gst_element_state_get_name (state)); + + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_ERROR, + GST_TRANSCODER_MESSAGE_DATA_ERROR, G_TYPE_ERROR, err, NULL); + g_error_free (err); + } } static void @@ -916,7 +787,6 @@ GstTranscoder *self = GST_TRANSCODER (data); GstBus *bus; GSource *source; - GSource *bus_source; GST_TRACE_OBJECT (self, "Starting main thread"); @@ -929,10 +799,7 @@ g_source_unref (source); self->bus = bus = gst_element_get_bus (self->transcodebin); - bus_source = gst_bus_create_watch (bus); - g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, - NULL, NULL); - g_source_attach (bus_source, self->context); + gst_bus_add_signal_watch (bus); g_signal_connect (G_OBJECT (bus), "message::error", G_CALLBACK (error_cb), self); @@ -956,13 +823,13 @@ 
self->current_state = GST_STATE_NULL; self->is_eos = FALSE; self->is_live = FALSE; + self->app_state = GST_TRANSCODER_STATE_STOPPED; GST_TRACE_OBJECT (self, "Starting main loop"); g_main_loop_run (self->loop); GST_TRACE_OBJECT (self, "Stopped main loop"); - g_source_destroy (bus_source); - g_source_unref (bus_source); + gst_bus_remove_signal_watch (bus); gst_object_unref (bus); remove_tick_source (self); @@ -984,8 +851,6 @@ static gpointer gst_transcoder_init_once (G_GNUC_UNUSED gpointer user_data) { - gst_init (NULL, NULL); - GST_DEBUG_CATEGORY_INIT (gst_transcoder_debug, "gst-transcoder", 0, "GstTranscoder"); gst_transcoder_error_quark (); @@ -1029,9 +894,15 @@ { GstEncodingProfile *profile; + g_once (&once, gst_transcoder_init_once, NULL); + + g_return_val_if_fail (source_uri, NULL); + g_return_val_if_fail (dest_uri, NULL); + g_return_val_if_fail (encoding_profile, NULL); + profile = create_encoding_profile (encoding_profile); - return gst_transcoder_new_full (source_uri, dest_uri, profile, NULL); + return gst_transcoder_new_full (source_uri, dest_uri, profile); } /** @@ -1041,57 +912,47 @@ * @profile: The #GstEncodingProfile defining the output format * have a look at the #GstEncodingProfile documentation to find more * about the serialization format. - * @signal_dispatcher: The #GstTranscoderSignalDispatcher to be used - * to dispatch the various signals. 
* * Returns: a new #GstTranscoder instance */ GstTranscoder * gst_transcoder_new_full (const gchar * source_uri, - const gchar * dest_uri, GstEncodingProfile * profile, - GstTranscoderSignalDispatcher * signal_dispatcher) + const gchar * dest_uri, GstEncodingProfile * profile) { - static GOnce once = G_ONCE_INIT; - g_once (&once, gst_transcoder_init_once, NULL); g_return_val_if_fail (source_uri, NULL); g_return_val_if_fail (dest_uri, NULL); return g_object_new (GST_TYPE_TRANSCODER, "src-uri", source_uri, - "dest-uri", dest_uri, "profile", profile, - "signal-dispatcher", signal_dispatcher, NULL); + "dest-uri", dest_uri, "profile", profile, NULL); } typedef struct { - GError **user_error; - GMutex m; - GCond cond; - - gboolean done; - + GError *error; + GMainLoop *loop; } RunSyncData; static void -_error_cb (GstTranscoder * self, GError * error, GstStructure * details, - RunSyncData * data) +_error_cb (RunSyncData * data, GError * error, GstStructure * details) { - g_mutex_lock (&data->m); - data->done = TRUE; - if (data->user_error && (*data->user_error) == NULL) - g_propagate_error (data->user_error, error); - g_cond_broadcast (&data->cond); - g_mutex_unlock (&data->m); + if (data->error == NULL) + data->error = g_error_copy (error); + + if (data->loop) { + g_main_loop_quit (data->loop); + data->loop = NULL; + } } static void -_done_cb (GstTranscoder * self, RunSyncData * data) +_done_cb (RunSyncData * data) { - g_mutex_lock (&data->m); - data->done = TRUE; - g_cond_broadcast (&data->cond); - g_mutex_unlock (&data->m); + if (data->loop) { + g_main_loop_quit (data->loop); + data->loop = NULL; + } } /** @@ -1107,22 +968,28 @@ gst_transcoder_run (GstTranscoder * self, GError ** error) { RunSyncData data = { 0, }; + GstTranscoderSignalAdapter *signal_adapter; + + g_return_val_if_fail (GST_IS_TRANSCODER (self), FALSE); - g_mutex_init (&data.m); - g_cond_init (&data.cond); + signal_adapter = gst_transcoder_get_signal_adapter (self, NULL); - g_signal_connect (self, 
"error", G_CALLBACK (_error_cb), &data); - g_signal_connect (self, "done", G_CALLBACK (_done_cb), &data); + data.loop = g_main_loop_new (NULL, FALSE); + g_signal_connect_swapped (signal_adapter, "error", G_CALLBACK (_error_cb), + &data); + g_signal_connect_swapped (signal_adapter, "done", G_CALLBACK (_done_cb), + &data); gst_transcoder_run_async (self); - g_mutex_lock (&data.m); - while (!data.done) { - g_cond_wait (&data.cond, &data.m); - } - g_mutex_unlock (&data.m); + if (!data.error) + g_main_loop_run (data.loop); + + gst_element_set_state (self->transcodebin, GST_STATE_NULL); + g_object_unref (signal_adapter); - if (data.user_error) { - g_propagate_error (error, *data.user_error); + if (data.error) { + if (error) + g_propagate_error (error, data.error); return FALSE; } @@ -1144,11 +1011,17 @@ { GstStateChangeReturn state_ret; + g_return_if_fail (GST_IS_TRANSCODER (self)); + GST_DEBUG_OBJECT (self, "Play"); if (!self->profile) { - emit_error (self, g_error_new (GST_TRANSCODER_ERROR, - GST_TRANSCODER_ERROR_FAILED, "No \"profile\" provided"), NULL); + GError *err = g_error_new (GST_TRANSCODER_ERROR, + GST_TRANSCODER_ERROR_FAILED, "No \"profile\" provided"); + + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_ERROR, + GST_TRANSCODER_MESSAGE_DATA_ERROR, G_TYPE_ERROR, err, NULL); + g_error_free (err); return; } @@ -1157,8 +1030,12 @@ state_ret = gst_element_set_state (self->transcodebin, GST_STATE_PLAYING); if (state_ret == GST_STATE_CHANGE_FAILURE) { - emit_error (self, g_error_new (GST_TRANSCODER_ERROR, - GST_TRANSCODER_ERROR_FAILED, "Could not start transcoding"), NULL); + GError *err = g_error_new (GST_TRANSCODER_ERROR, + GST_TRANSCODER_ERROR_FAILED, "Could not start transcoding"); + api_bus_post_message (self, GST_TRANSCODER_MESSAGE_ERROR, + GST_TRANSCODER_MESSAGE_DATA_ERROR, G_TYPE_ERROR, err, NULL); + g_error_free (err); + return; } else if (state_ret == GST_STATE_CHANGE_NO_PREROLL) { self->is_live = TRUE; @@ -1356,27 +1233,6 @@ g_object_set 
(self->transcodebin, "avoid-reencoding", avoid_reencoding, NULL); } -#define C_ENUM(v) ((gint) v) -#define C_FLAGS(v) ((guint) v) - -GType -gst_transcoder_error_get_type (void) -{ - static gsize id = 0; - static const GEnumValue values[] = { - {C_ENUM (GST_TRANSCODER_ERROR_FAILED), "GST_TRANSCODER_ERROR_FAILED", - "failed"}, - {0, NULL, NULL} - }; - - if (g_once_init_enter (&id)) { - GType tmp = g_enum_register_static ("GstTranscoderError", values); - g_once_init_leave (&id, tmp); - } - - return (GType) id; -} - /** * gst_transcoder_error_get_name: * @error: a #GstTranscoderError @@ -1397,212 +1253,269 @@ return NULL; } -G_DEFINE_INTERFACE (GstTranscoderSignalDispatcher, - gst_transcoder_signal_dispatcher, G_TYPE_OBJECT); - -static void -gst_transcoder_signal_dispatcher_default_init (G_GNUC_UNUSED - GstTranscoderSignalDispatcherInterface * iface) +/** + * gst_transcoder_get_message_bus: + * @transcoder: #GstTranscoder instance + * + * The GstTranscoder API exposes a #GstBus instance whose purpose is to provide data + * structures representing transcoder-internal events in the form of #GstMessage-s of + * type GST_MESSAGE_APPLICATION. + * + * Each message carries a "transcoder-message" field of type #GstTranscoderMessage. + * Further fields of the message data are specific to each possible value of + * that enumeration. + * + * Applications can consume the messages asynchronously within their own + * event-loop / UI-thread etc. Note that if the application does not + * consume the messages, the bus will accumulate them internally and eventually + * fill memory. To avoid that, the bus has to be set "flushing". 
+ * + * Returns: (transfer full): The transcoder message bus instance + * + * Since: 1.20 + */ +GstBus * +gst_transcoder_get_message_bus (GstTranscoder * self) { + g_return_val_if_fail (GST_IS_TRANSCODER (self), NULL); + return g_object_ref (self->api_bus); } -static void -gst_transcoder_signal_dispatcher_dispatch (GstTranscoderSignalDispatcher * self, - GstTranscoder * transcoder, void (*emitter) (gpointer data), gpointer data, - GDestroyNotify destroy) +/** + * gst_transcoder_get_sync_signal_adapter: + * @self: (transfer none): #GstTranscoder instance to emit signals synchronously + * for. + * + * Gets the #GstTranscoderSignalAdapter attached to @self to emit signals from + * its thread of emission. + * + * Returns: (transfer full): The #GstTranscoderSignalAdapter to connect signal + * handlers to. + * + * Since: 1.20 + */ +GstTranscoderSignalAdapter * +gst_transcoder_get_sync_signal_adapter (GstTranscoder * self) { - GstTranscoderSignalDispatcherInterface *iface; - - if (!self) { - emitter (data); - if (destroy) - destroy (data); - return; - } + g_return_val_if_fail (GST_IS_TRANSCODER (self), NULL); - g_return_if_fail (GST_IS_TRANSCODER_SIGNAL_DISPATCHER (self)); - iface = GST_TRANSCODER_SIGNAL_DISPATCHER_GET_INTERFACE (self); - g_return_if_fail (iface->dispatch != NULL); + GST_OBJECT_LOCK (self); + if (!self->sync_signal_adapter) + self->sync_signal_adapter = + gst_transcoder_signal_adapter_new_sync_emit (self); + GST_OBJECT_UNLOCK (self); - iface->dispatch (self, transcoder, emitter, data, destroy); + return g_object_ref (self->sync_signal_adapter); } -struct _GstTranscoderGMainContextSignalDispatcher -{ - GObject parent; - GMainContext *application_context; -}; - -struct _GstTranscoderGMainContextSignalDispatcherClass -{ - GObjectClass parent_class; -}; - -static void - gst_transcoder_g_main_context_signal_dispatcher_interface_init - (GstTranscoderSignalDispatcherInterface * iface); - -enum +/** + * gst_transcoder_get_signal_adapter: + * @self: (transfer 
none): #GstTranscoder instance to emit signals for. + * @context: (nullable): A #GMainContext on which the main-loop will process + * transcoder bus messages. Can be NULL (thread-default + * context will be used then). + * + * Gets the #GstTranscoderSignalAdapter attached to @self if it is attached to + * the right #GMainContext. If no #GstTranscoderSignalAdapter has been created + * yet, it will be created and returned; other calls will return that same + * adapter until it is destroyed, at which point a new one can be attached the + * same way. + * + * Returns: (transfer full) (nullable): The #GstTranscoderSignalAdapter to + * connect signal handlers to. + * + * Since: 1.20 + */ +GstTranscoderSignalAdapter * +gst_transcoder_get_signal_adapter (GstTranscoder * self, GMainContext * context) { - G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_0, - G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_APPLICATION_CONTEXT, - G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_LAST -}; - -G_DEFINE_TYPE_WITH_CODE (GstTranscoderGMainContextSignalDispatcher, - gst_transcoder_g_main_context_signal_dispatcher, G_TYPE_OBJECT, - G_IMPLEMENT_INTERFACE (GST_TYPE_TRANSCODER_SIGNAL_DISPATCHER, - gst_transcoder_g_main_context_signal_dispatcher_interface_init)); + g_return_val_if_fail (GST_IS_TRANSCODER (self), NULL); -static GParamSpec - * g_main_context_signal_dispatcher_param_specs - [G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_LAST] = { NULL, }; + if (!context) + context = g_main_context_get_thread_default (); + if (!context) + context = g_main_context_default (); -static void -gst_transcoder_g_main_context_signal_dispatcher_finalize (GObject * object) -{ - GstTranscoderGMainContextSignalDispatcher *self = - GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER (object); + GST_OBJECT_LOCK (self); + if (!self->signal_adapter) { + self->signal_adapter = gst_transcoder_signal_adapter_new (self, context); + } else if (g_source_get_context (self->signal_adapter->source) != context) { + GST_WARNING_OBJECT (self, "Trying to get an 
adapter for a different " + "GMainContext than the one attached, this is not possible"); + GST_OBJECT_UNLOCK (self); - if (self->application_context) - g_main_context_unref (self->application_context); + return NULL; + } + GST_OBJECT_UNLOCK (self); - G_OBJECT_CLASS - (gst_transcoder_g_main_context_signal_dispatcher_parent_class)->finalize - (object); + return g_object_ref (self->signal_adapter); } -static void -gst_transcoder_g_main_context_signal_dispatcher_set_property (GObject * object, - guint prop_id, const GValue * value, GParamSpec * pspec) +/** + * gst_transcoder_message_get_name: + * @message: a #GstTranscoderMessage + * + * Returns (transfer none): The message name + * + * Since: 1.20 + */ +const gchar * +gst_transcoder_message_get_name (GstTranscoderMessage message) { - GstTranscoderGMainContextSignalDispatcher *self = - GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER (object); - - switch (prop_id) { - case G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_APPLICATION_CONTEXT: - self->application_context = g_value_dup_boxed (value); - if (!self->application_context) - self->application_context = g_main_context_ref_thread_default (); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); - break; - } + GEnumClass *enum_class; + GEnumValue *enum_value; + enum_class = g_type_class_ref (GST_TYPE_TRANSCODER_MESSAGE); + enum_value = g_enum_get_value (enum_class, message); + g_assert (enum_value != NULL); + g_type_class_unref (enum_class); + return enum_value->value_name; } -static void -gst_transcoder_g_main_context_signal_dispatcher_get_property (GObject * object, - guint prop_id, GValue * value, GParamSpec * pspec) -{ - GstTranscoderGMainContextSignalDispatcher *self = - GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER (object); - switch (prop_id) { - case G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_APPLICATION_CONTEXT: - g_value_set_boxed (value, self->application_context); - break; - default: - G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, 
pspec); - break; - } -} +#define PARSE_MESSAGE_FIELD(msg, field, value_type, value) G_STMT_START { \ + const GstStructure *data = NULL; \ + g_return_if_fail (gst_transcoder_is_transcoder_message (msg)); \ + data = gst_message_get_structure (msg); \ + if (!gst_structure_get (data, field, value_type, value, NULL)) { \ + g_error ("Could not parse field from structure: %s", field); \ + } \ +} G_STMT_END -static void - gst_transcoder_g_main_context_signal_dispatcher_class_init - (GstTranscoderGMainContextSignalDispatcherClass * klass) +/** + * gst_transcoder_is_transcoder_message: + * @msg: A #GstMessage + * + * Returns: A #gboolean indicating whether the passed message represents a #GstTranscoder message or not. + * + * Since: 1.20 + */ +gboolean +gst_transcoder_is_transcoder_message (GstMessage * msg) { - GObjectClass *gobject_class = G_OBJECT_CLASS (klass); - - gobject_class->finalize = - gst_transcoder_g_main_context_signal_dispatcher_finalize; - gobject_class->set_property = - gst_transcoder_g_main_context_signal_dispatcher_set_property; - gobject_class->get_property = - gst_transcoder_g_main_context_signal_dispatcher_get_property; - - g_main_context_signal_dispatcher_param_specs - [G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_APPLICATION_CONTEXT] = - g_param_spec_boxed ("application-context", "Application Context", - "Application GMainContext to dispatch signals to", G_TYPE_MAIN_CONTEXT, - G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS); + const GstStructure *data = NULL; + g_return_val_if_fail (GST_IS_MESSAGE (msg), FALSE); - g_object_class_install_properties (gobject_class, - G_MAIN_CONTEXT_SIGNAL_DISPATCHER_PROP_LAST, - g_main_context_signal_dispatcher_param_specs); -} + data = gst_message_get_structure (msg); + g_return_val_if_fail (data, FALSE); -static void - gst_transcoder_g_main_context_signal_dispatcher_init - (G_GNUC_UNUSED GstTranscoderGMainContextSignalDispatcher * self) -{ + return g_str_equal (gst_structure_get_name (data),
GST_TRANSCODER_MESSAGE_DATA); } -typedef struct +/** + * gst_transcoder_message_parse_duration: + * @msg: A #GstMessage + * @duration: (out): the resulting duration + * + * Parse the given duration @msg and extract the corresponding #GstClockTime + * + * Since: 1.20 + */ +void +gst_transcoder_message_parse_duration (GstMessage * msg, + GstClockTime * duration) { - void (*emitter) (gpointer data); - gpointer data; - GDestroyNotify destroy; -} GMainContextSignalDispatcherData; + PARSE_MESSAGE_FIELD (msg, GST_TRANSCODER_MESSAGE_DATA_DURATION, + GST_TYPE_CLOCK_TIME, duration); +} -static gboolean -g_main_context_signal_dispatcher_dispatch_gsourcefunc (gpointer user_data) +/** + * gst_transcoder_message_parse_position: + * @msg: A #GstMessage + * @position: (out): the resulting position + * + * Parse the given position @msg and extract the corresponding #GstClockTime + * + * Since: 1.20 + */ +void +gst_transcoder_message_parse_position (GstMessage * msg, + GstClockTime * position) { - GMainContextSignalDispatcherData *data = user_data; - - data->emitter (data->data); - - return G_SOURCE_REMOVE; + PARSE_MESSAGE_FIELD (msg, GST_TRANSCODER_MESSAGE_DATA_POSITION, + GST_TYPE_CLOCK_TIME, position); } -static void -g_main_context_signal_dispatcher_dispatch_destroy (gpointer user_data) +/** + * gst_transcoder_message_parse_state: + * @msg: A #GstMessage + * @state: (out): the resulting state + * + * Parse the given state @msg and extract the corresponding #GstTranscoderState + * + * Since: 1.20 + */ +void +gst_transcoder_message_parse_state (GstMessage * msg, + GstTranscoderState * state) { - GMainContextSignalDispatcherData *data = user_data; - - if (data->destroy) - data->destroy (data->data); - g_free (data); + PARSE_MESSAGE_FIELD (msg, GST_TRANSCODER_MESSAGE_DATA_STATE, + GST_TYPE_TRANSCODER_STATE, state); } -/* *INDENT-OFF* */ -static void -gst_transcoder_g_main_context_signal_dispatcher_dispatch (GstTranscoderSignalDispatcher * iface, - G_GNUC_UNUSED GstTranscoder * 
transcoder, void (*emitter) (gpointer data), - gpointer data, GDestroyNotify destroy) +/** + * gst_transcoder_message_parse_error: + * @msg: A #GstMessage + * @error: (out): the resulting error + * @details: (out): (transfer none): A GstStructure containing extra details about the error + * + * Parse the given error @msg and extract the corresponding #GError + * + * Since: 1.20 + */ +void +gst_transcoder_message_parse_error (GstMessage * msg, GError * error, + GstStructure ** details) { - GstTranscoderGMainContextSignalDispatcher *self = - GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER (iface); - GMainContextSignalDispatcherData *gsourcefunc_data = - g_new0 (GMainContextSignalDispatcherData, 1); - - gsourcefunc_data->emitter = emitter; - gsourcefunc_data->data = data; - gsourcefunc_data->destroy = destroy; - - g_main_context_invoke_full (self->application_context, - G_PRIORITY_DEFAULT, g_main_context_signal_dispatcher_dispatch_gsourcefunc, - gsourcefunc_data, g_main_context_signal_dispatcher_dispatch_destroy); + PARSE_MESSAGE_FIELD (msg, GST_TRANSCODER_MESSAGE_DATA_ERROR, G_TYPE_ERROR, + error); + PARSE_MESSAGE_FIELD (msg, GST_TRANSCODER_MESSAGE_DATA_ISSUE_DETAILS, + GST_TYPE_STRUCTURE, details); } -static void -gst_transcoder_g_main_context_signal_dispatcher_interface_init (GstTranscoderSignalDispatcherInterface * iface) +/** + * gst_transcoder_message_parse_warning: + * @msg: A #GstMessage + * @error: (out): the resulting warning + * @details: (out): (transfer none): A GstStructure containing extra details about the warning + * + * Parse the given error @msg and extract the corresponding #GError warning + * + * Since: 1.20 + */ +void +gst_transcoder_message_parse_warning (GstMessage * msg, GError * error, + GstStructure ** details) { - iface->dispatch = gst_transcoder_g_main_context_signal_dispatcher_dispatch; + PARSE_MESSAGE_FIELD (msg, GST_TRANSCODER_MESSAGE_DATA_WARNING, G_TYPE_ERROR, + error); + PARSE_MESSAGE_FIELD (msg, 
GST_TRANSCODER_MESSAGE_DATA_ISSUE_DETAILS, + GST_TYPE_STRUCTURE, details); } -/* *INDENT-ON* */ /** - * gst_transcoder_g_main_context_signal_dispatcher_new: - * @application_context: (allow-none): GMainContext to use or %NULL + * gst_transcoder_state_get_name: + * @state: a #GstTranscoderState * - * Returns: (transfer full): + * Gets a string representing the given state. + * + * Returns: (transfer none): a string with the name of the state. + * + * Since: 1.20 */ -GstTranscoderSignalDispatcher * -gst_transcoder_g_main_context_signal_dispatcher_new (GMainContext * - application_context) -{ - return g_object_new (GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER, - "application-context", application_context, NULL); +const gchar * +gst_transcoder_state_get_name (GstTranscoderState state) +{ + switch (state) { + case GST_TRANSCODER_STATE_STOPPED: + return "stopped"; + case GST_TRANSCODER_STATE_PAUSED: + return "paused"; + case GST_TRANSCODER_STATE_PLAYING: + return "playing"; + } + + g_assert_not_reached (); + return NULL; }
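The hunks above replace the old signal-dispatcher machinery with an application-visible message bus (`gst_transcoder_get_message_bus`). As a rough sketch of how a client of the new 1.20 API might drain that bus (not part of this patch; the URIs, encoding-profile string, and the `"transcoder-message"` field name taken from the doc comment above are illustrative):

```c
/* Sketch of a GstTranscoder message-bus consumer (GStreamer >= 1.20).
 * URIs and the serialized encoding profile below are placeholders. */
#include <gst/transcoder/gsttranscoder.h>

int
main (int argc, char **argv)
{
  GstTranscoder *transcoder;
  GstBus *bus;
  gboolean done = FALSE;

  gst_init (&argc, &argv);

  transcoder = gst_transcoder_new ("file:///tmp/input.mp4",
      "file:///tmp/output.ogv",
      "application/ogg:video/x-theora:audio/x-vorbis");
  bus = gst_transcoder_get_message_bus (transcoder);

  gst_transcoder_run_async (transcoder);

  while (!done) {
    GstMessage *msg = gst_bus_timed_pop (bus, GST_CLOCK_TIME_NONE);
    GstTranscoderMessage type;

    if (!msg)
      continue;
    if (!gst_transcoder_is_transcoder_message (msg)) {
      gst_message_unref (msg);
      continue;
    }

    /* Each API-bus message carries a "transcoder-message" field
     * identifying its type (see the gst_transcoder_get_message_bus
     * documentation in the diff above). */
    gst_structure_get (gst_message_get_structure (msg),
        "transcoder-message", GST_TYPE_TRANSCODER_MESSAGE, &type, NULL);

    switch (type) {
      case GST_TRANSCODER_MESSAGE_POSITION_UPDATED:{
        GstClockTime position;

        gst_transcoder_message_parse_position (msg, &position);
        g_print ("position: %" GST_TIME_FORMAT "\n",
            GST_TIME_ARGS (position));
        break;
      }
      case GST_TRANSCODER_MESSAGE_DONE:
      case GST_TRANSCODER_MESSAGE_ERROR:
        g_print ("finished with %s\n",
            gst_transcoder_message_get_name (type));
        done = TRUE;
        break;
      default:
        break;
    }
    gst_message_unref (msg);
  }

  g_object_unref (bus);
  g_object_unref (transcoder);
  return 0;
}
```

This mirrors what `gst_transcoder_run` now does internally, except the loop here polls the API bus directly instead of going through a `GstTranscoderSignalAdapter`.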
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/transcoder/gsttranscoder.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/gsttranscoder.h
Changed
@@ -7,16 +7,13 @@ #include <gst/gst.h> #include <gst/pbutils/pbutils.h> -#include "transcoder-prelude.h" +#include <gst/transcoder/transcoder-prelude.h> +#include <gst/transcoder/transcoder-enumtypes.h> G_BEGIN_DECLS -typedef struct _GstTranscoderSignalDispatcher GstTranscoderSignalDispatcher; -typedef struct _GstTranscoderSignalDispatcherInterface GstTranscoderSignalDispatcherInterface; - /*********** Error definitions ************/ #define GST_TRANSCODER_ERROR (gst_transcoder_error_quark ()) -#define GST_TYPE_TRANSCODER_ERROR (gst_transcoder_error_get_type ()) /** * GstTranscoderError: @@ -29,24 +26,90 @@ GST_TRANSCODER_API GQuark gst_transcoder_error_quark (void); GST_TRANSCODER_API -GType gst_transcoder_error_get_type (void); -GST_TRANSCODER_API const gchar * gst_transcoder_error_get_name (GstTranscoderError error); +/*********** State definition ************/ + +/** + * GstTranscoderState: + * @GST_TRANSCODER_STATE_STOPPED: the transcoder is stopped. + * @GST_TRANSCODER_STATE_PAUSED: the transcoder is paused. + * @GST_TRANSCODER_STATE_PLAYING: the transcoder is currently transcoding a + * stream. + * + * High level representation of the transcoder pipeline state. + * + * Since: 1.20 + */ +typedef enum { + GST_TRANSCODER_STATE_STOPPED, + GST_TRANSCODER_STATE_PAUSED, + GST_TRANSCODER_STATE_PLAYING +} GstTranscoderState; + +GST_TRANSCODER_API +const gchar * gst_transcoder_state_get_name (GstTranscoderState state); + +/*********** Messages types definitions ************/ + +/** + * GstTranscoderMessage: + * @GST_TRANSCODER_MESSAGE_POSITION_UPDATED: Sink position changed + * @GST_TRANSCODER_MESSAGE_DURATION_CHANGED: Duration of stream changed + * @GST_TRANSCODER_MESSAGE_STATE_CHANGED: Pipeline state changed + * @GST_TRANSCODER_MESSAGE_DONE: Transcoding is done + * @GST_TRANSCODER_MESSAGE_ERROR: Message contains an error + * @GST_TRANSCODER_MESSAGE_WARNING: Message contains a warning + * + * Types of messages that will be posted on the transcoder API bus. 
+ * + * See also #gst_transcoder_get_message_bus() + * + * Since: 1.20 + */ +typedef enum +{ + GST_TRANSCODER_MESSAGE_POSITION_UPDATED, + GST_TRANSCODER_MESSAGE_DURATION_CHANGED, + GST_TRANSCODER_MESSAGE_STATE_CHANGED, + GST_TRANSCODER_MESSAGE_DONE, + GST_TRANSCODER_MESSAGE_ERROR, + GST_TRANSCODER_MESSAGE_WARNING, +} GstTranscoderMessage; + +GST_TRANSCODER_API +gboolean gst_transcoder_is_transcoder_message (GstMessage * msg); + +GST_TRANSCODER_API +const gchar * gst_transcoder_message_get_name (GstTranscoderMessage message); + +GST_TRANSCODER_API +void gst_transcoder_message_parse_position (GstMessage * msg, GstClockTime * position); + +GST_TRANSCODER_API +void gst_transcoder_message_parse_duration (GstMessage * msg, GstClockTime * duration); + +GST_TRANSCODER_API +void gst_transcoder_message_parse_state (GstMessage * msg, GstTranscoderState * state); + +GST_TRANSCODER_API +void gst_transcoder_message_parse_error (GstMessage * msg, GError * error, GstStructure ** details); + +GST_TRANSCODER_API +void gst_transcoder_message_parse_warning (GstMessage * msg, GError * error, GstStructure ** details); + + + /*********** GstTranscoder definition ************/ #define GST_TYPE_TRANSCODER (gst_transcoder_get_type ()) -#define GST_TRANSCODER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_TRANSCODER, GstTranscoder)) -#define GST_TRANSCODER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_TRANSCODER, GstTranscoderClass)) -#define GST_IS_TRANSCODER(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_TRANSCODER)) -#define GST_IS_TRANSCODER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_TRANSCODER)) -#define GST_TRANSCODER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_TRANSCODER, GstTranscoderClass)) - -typedef struct _GstTranscoder GstTranscoder; -typedef struct _GstTranscoderClass GstTranscoderClass; -typedef struct _GstTranscoderPrivate GstTranscoderPrivate; +/** + * GstTranscoderClass.parent_class: + * + * Since: 1.20 + */ GST_TRANSCODER_API 
-GType gst_transcoder_get_type (void); +G_DECLARE_FINAL_TYPE (GstTranscoder, gst_transcoder, GST, TRANSCODER, GstObject) GST_TRANSCODER_API GstTranscoder * gst_transcoder_new (const gchar * source_uri, @@ -56,22 +119,24 @@ GST_TRANSCODER_API GstTranscoder * gst_transcoder_new_full (const gchar * source_uri, const gchar * dest_uri, - GstEncodingProfile *profile, - GstTranscoderSignalDispatcher *signal_dispatcher); + GstEncodingProfile * profile); GST_TRANSCODER_API -gboolean gst_transcoder_run (GstTranscoder *self, +gboolean gst_transcoder_run (GstTranscoder * self, GError ** error); GST_TRANSCODER_API -void gst_transcoder_set_cpu_usage (GstTranscoder *self, +GstBus * gst_transcoder_get_message_bus (GstTranscoder * transcoder); + +GST_TRANSCODER_API +void gst_transcoder_set_cpu_usage (GstTranscoder * self, gint cpu_usage); GST_TRANSCODER_API -void gst_transcoder_run_async (GstTranscoder *self); +void gst_transcoder_run_async (GstTranscoder * self); GST_TRANSCODER_API -void gst_transcoder_set_position_update_interval (GstTranscoder *self, +void gst_transcoder_set_position_update_interval (GstTranscoder * self, guint interval); GST_TRANSCODER_API @@ -81,7 +146,7 @@ gchar * gst_transcoder_get_dest_uri (GstTranscoder * self); GST_TRANSCODER_API -guint gst_transcoder_get_position_update_interval (GstTranscoder *self); +guint gst_transcoder_get_position_update_interval (GstTranscoder * self); GST_TRANSCODER_API GstClockTime gst_transcoder_get_position (GstTranscoder * self); @@ -98,43 +163,15 @@ void gst_transcoder_set_avoid_reencoding (GstTranscoder * self, gboolean avoid_reencoding); - -/****************** Signal dispatcher *******************************/ - -#define GST_TYPE_TRANSCODER_SIGNAL_DISPATCHER (gst_transcoder_signal_dispatcher_get_type ()) -#define GST_TRANSCODER_SIGNAL_DISPATCHER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_TRANSCODER_SIGNAL_DISPATCHER, GstTranscoderSignalDispatcher)) -#define GST_IS_TRANSCODER_SIGNAL_DISPATCHER(obj) 
(G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_TRANSCODER_SIGNAL_DISPATCHER)) -#define GST_TRANSCODER_SIGNAL_DISPATCHER_GET_INTERFACE(inst) (G_TYPE_INSTANCE_GET_INTERFACE ((inst), GST_TYPE_TRANSCODER_SIGNAL_DISPATCHER, GstTranscoderSignalDispatcherInterface)) - -struct _GstTranscoderSignalDispatcherInterface { - GTypeInterface parent_iface; - - void (*dispatch) (GstTranscoderSignalDispatcher * self, - GstTranscoder * transcoder, - void (*emitter) (gpointer data), - gpointer data, - GDestroyNotify destroy); -}; - -typedef struct _GstTranscoderGMainContextSignalDispatcher GstTranscoderGMainContextSignalDispatcher; -typedef struct _GstTranscoderGMainContextSignalDispatcherClass GstTranscoderGMainContextSignalDispatcherClass; +#include "gsttranscoder-signal-adapter.h" GST_TRANSCODER_API -GType gst_transcoder_signal_dispatcher_get_type (void); - -#define GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER (gst_transcoder_g_main_context_signal_dispatcher_get_type ()) -#define GST_IS_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER)) -#define GST_IS_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER)) -#define GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER, GstTranscoderGMainContextSignalDispatcherClass)) -#define GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER, GstTranscoderGMainContextSignalDispatcher)) -#define GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER, GstTranscoderGMainContextSignalDispatcherClass)) -#define GST_TRANSCODER_G_MAIN_CONTEXT_SIGNAL_DISPATCHER_CAST(obj) 
((GstTranscoderGMainContextSignalDispatcher*)(obj)) - -GST_TRANSCODER_API -GType gst_transcoder_g_main_context_signal_dispatcher_get_type (void); - +GstTranscoderSignalAdapter* +gst_transcoder_get_signal_adapter (GstTranscoder * self, + GMainContext *context); GST_TRANSCODER_API -GstTranscoderSignalDispatcher * gst_transcoder_g_main_context_signal_dispatcher_new (GMainContext * application_context); +GstTranscoderSignalAdapter* +gst_transcoder_get_sync_signal_adapter (GstTranscoder * self); G_END_DECLS
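The hunk above removes the 1.18 signal-dispatcher machinery and replaces it with a message-bus / signal-adapter API. A minimal synchronous sketch using only functions declared in this header might look as follows; the URIs and the serialized encoding-profile string are placeholders, not values taken from this package.

```c
/* Sketch: synchronous transcode with the 1.20 GstTranscoder API.
 * Input/output URIs and the profile string are illustrative only. */
#include <gst/gst.h>
#include <gst/transcoder/gsttranscoder.h>

int
main (int argc, char **argv)
{
  GstTranscoder *transcoder;
  GError *error = NULL;

  gst_init (&argc, &argv);

  transcoder = gst_transcoder_new ("file:///tmp/input.mp4",
      "file:///tmp/output.ogv",
      "application/ogg:video/x-theora:audio/x-vorbis");
  gst_transcoder_set_avoid_reencoding (transcoder, TRUE);

  /* gst_transcoder_run() blocks until the transcode finishes or fails;
   * gst_transcoder_run_async() plus gst_transcoder_get_message_bus()
   * is the non-blocking alternative declared above. */
  if (!gst_transcoder_run (transcoder, &error)) {
    g_printerr ("transcoding failed: %s\n",
        error ? error->message : "unknown error");
    g_clear_error (&error);
  }

  gst_object_unref (transcoder);
  return 0;
}
```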

gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/transcoder/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/meson.build
Changed
@@ -1,33 +1,66 @@ -sources = files(['gsttranscoder.c']) -headers = files(['gsttranscoder.h', 'transcoder-prelude.h']) +sources = files(['gsttranscoder.c', 'gsttranscoder-signal-adapter.c']) +headers = files(['gsttranscoder.h', 'transcoder-prelude.h', 'gsttranscoder-signal-adapter.h']) install_headers(headers, subdir : 'gstreamer-' + api_version + '/gst/transcoder') +transcoder_enums = gnome.mkenums_simple('transcoder-enumtypes', + sources : headers, + body_prefix : '#ifdef HAVE_CONFIG_H\n#include "config.h"\n#endif', + header_prefix : '#include <gst/transcoder/transcoder-prelude.h>', + decorator: 'GST_TRANSCODER_API', + install_header: true, + install_dir : join_paths(get_option('includedir'), 'gstreamer-1.0/gst/transcoder')) + +gsttranscoder_c = transcoder_enums[0] +gsttranscoder_h = transcoder_enums[1] + +transcoder_gen_sources = [gsttranscoder_h] + gst_transcoder = library('gsttranscoder-' + api_version, - sources, + sources + [gsttranscoder_c] + transcoder_gen_sources, install: true, include_directories : [configinc, libsinc], dependencies: [gst_dep, gstpbutils_dep], - c_args: gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_TRANSCODER'], + c_args: gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_TRANSCODER', '-DG_LOG_DOMAIN="GStreamer-Transcoder"'], soversion : soversion, ) + +library_def = {'lib': gst_transcoder} +pkg_name = 'gstreamer-transcoder-1.0' +pkgconfig.generate(gst_transcoder, + libraries : [gst_dep, gstbase_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'High level API for transcoding using GStreamer', +) + if build_gir - transcoder_gir = gnome.generate_gir(gst_transcoder, - sources : sources + headers, - nsversion : api_version, - namespace : 'GstTranscoder', - identifier_prefix : 'Gst', - symbol_prefix : 'gst_', - includes : ['GObject-2.0', + gir = { + 'sources' : sources + headers + [gsttranscoder_h], + 'nsversion' : api_version, + 'namespace' : 
'GstTranscoder', + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst_', + 'includes' : ['GObject-2.0', 'Gst-' + api_version, 'GstPbutils-' + api_version], - dependencies: [gst_dep, gstpbutils_dep], - install : true, - extra_args : gir_init_section - ) + 'dependencies' : [gst_dep, gstpbutils_dep], + 'export_packages' : pkg_name, + 'install' : true, + 'extra_args' : gir_init_section + } + library_def += { 'gir': [gir]} + if not static_build + transcoder_gir = gnome.generate_gir(gst_transcoder, kwargs: gir) + transcoder_gen_sources += transcoder_gir + endif endif +libraries += [[pkg_name, library_def]] gst_transcoder_dep = declare_dependency(link_with: gst_transcoder, dependencies : [gst_dep, gstpbutils_dep], + sources: transcoder_gen_sources, include_directories : [libsinc] -) \ No newline at end of file +) +meson.override_dependency(pkg_name, gst_transcoder_dep)
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/transcoder/transcoder-prelude.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/transcoder/transcoder-prelude.h
Changed
@@ -33,4 +33,6 @@ # endif #endif +#include <gst/transcoder/transcoder-enumtypes.h> + #endif /* __GST_TRANSCODER_PRELUDE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/uridownloader/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/uridownloader/meson.build
Changed
@@ -10,9 +10,10 @@ ] install_headers(urid_headers, subdir : 'gstreamer-1.0/gst/uridownloader') +pkg_name = 'gstreamer-downloader-1.0' gsturidownloader = library('gsturidownloader-' + api_version, urid_sources, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_URI_DOWNLOADER'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_URI_DOWNLOADER', '-DG_LOG_DOMAIN="GStreamer-UriDownloader"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -24,3 +25,6 @@ gsturidownloader_dep = declare_dependency(link_with : gsturidownloader, include_directories : [libsinc], dependencies : [gstbase_dep]) + +libraries += [[pkg_name, {'lib': gsturidownloader}]] +meson.override_dependency(pkg_name, gsturidownloader_dep) \ No newline at end of file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va
Added
+(directory)
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/gstvadisplay.c
Added
@@ -0,0 +1,402 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstvadisplay + * @title: GstVaDisplay + * @short_description: Generic VADisplay wrapper. + * @sources: + * - gstvadisplay.h + * + * It is a generic wrapper for VADisplay. To create new instances + * subclasses are required, depending on the display type to use + * (v.gr. DRM, X11, Wayland, etc.). + * + * The purpose of this class is to be shared among pipelines via + * #GstContext so all the VA processing elements will use the same + * display entry. Application developers can create their own + * subclass, based on their display, and shared it via the synced bus + * message for the application. 
+ */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvadisplay.h" +#include <va/va.h> + +GST_DEBUG_CATEGORY (gst_va_display_debug); +#define GST_CAT_DEFAULT gst_va_display_debug + +typedef struct _GstVaDisplayPrivate GstVaDisplayPrivate; +struct _GstVaDisplayPrivate +{ + GRecMutex lock; + VADisplay display; + + gboolean foreign; + gboolean init; + GstVaImplementation impl; +}; + +#define gst_va_display_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstVaDisplay, gst_va_display, GST_TYPE_OBJECT, + G_ADD_PRIVATE (GstVaDisplay); + GST_DEBUG_CATEGORY_INIT (gst_va_display_debug, "vadisplay", 0, + "VA Display")); +enum +{ + PROP_VA_DISPLAY = 1, + N_PROPERTIES +}; + +static GParamSpec *g_properties[N_PROPERTIES]; + +#define GET_PRIV(obj) gst_va_display_get_instance_private (GST_VA_DISPLAY (obj)) + +static GstVaImplementation +_get_implementation (const char *vendor) +{ + if (g_str_has_prefix (vendor, "Mesa Gallium driver")) + return GST_VA_IMPLEMENTATION_MESA_GALLIUM; + else if (g_str_has_prefix (vendor, "Intel i965 driver")) + return GST_VA_IMPLEMENTATION_INTEL_I965; + else if (g_str_has_prefix (vendor, "Intel iHD driver")) + return GST_VA_IMPLEMENTATION_INTEL_IHD; + + return GST_VA_IMPLEMENTATION_OTHER; +} + +static gboolean +_gst_va_display_filter_driver (GstVaDisplay * self, gpointer foreign_display) +{ + GstVaDisplayPrivate *priv = GET_PRIV (self); + VADisplay dpy; + const char *vendor; + + g_assert ((foreign_display != NULL) ^ (priv->display != NULL)); + dpy = foreign_display ? 
foreign_display : priv->display; + + vendor = vaQueryVendorString (dpy); + GST_INFO ("VA-API driver vendor: %s", vendor); + + /* XXX(victor): driver allow list */ + + if (foreign_display) { + priv->display = foreign_display; + priv->foreign = TRUE; + } + priv->impl = _get_implementation (vendor); + + return TRUE; +} + +static void +gst_va_display_set_display (GstVaDisplay * self, gpointer display) +{ + GstVaDisplayPrivate *priv = GET_PRIV (self); + + if (!display) + return; + + if (vaDisplayIsValid (display) == 0) { + GST_WARNING_OBJECT (self, + "User's VA display is invalid. An internal one will be tried."); + return; + } + + /* assume driver is already initialized */ + priv->init = TRUE; + + _gst_va_display_filter_driver (self, display); +} + +static void +gst_va_display_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstVaDisplay *self = GST_VA_DISPLAY (object); + + switch (prop_id) { + case PROP_VA_DISPLAY:{ + gpointer display = g_value_get_pointer (value); + gst_va_display_set_display (self, display); + break; + } + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_va_display_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstVaDisplayPrivate *priv = GET_PRIV (object); + + switch (prop_id) { + case PROP_VA_DISPLAY: + g_value_set_pointer (value, priv->display); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_va_display_constructed (GObject * object) +{ + GstVaDisplay *self = GST_VA_DISPLAY (object); + GstVaDisplayPrivate *priv = GET_PRIV (object); + GstVaDisplayClass *klass = GST_VA_DISPLAY_GET_CLASS (object); + + if (!priv->display && klass->create_va_display) + priv->display = klass->create_va_display (self); + + G_OBJECT_CLASS (parent_class)->constructed (object); +} + +static void +gst_va_display_dispose (GObject * object) +{ + 
GstVaDisplayPrivate *priv = GET_PRIV (object); + + if (priv->display && !priv->foreign) + vaTerminate (priv->display); + priv->display = NULL; + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_display_finalize (GObject * object) +{ + GstVaDisplayPrivate *priv = GET_PRIV (object); + + g_rec_mutex_clear (&priv->lock); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_va_display_class_init (GstVaDisplayClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->set_property = gst_va_display_set_property; + gobject_class->get_property = gst_va_display_get_property; + gobject_class->constructed = gst_va_display_constructed; + gobject_class->dispose = gst_va_display_dispose; + gobject_class->finalize = gst_va_display_finalize; + + g_properties[PROP_VA_DISPLAY] = + g_param_spec_pointer ("va-display", "VADisplay", "VA Display handler", + G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS); + + g_object_class_install_properties (gobject_class, N_PROPERTIES, g_properties); +} + +static void +gst_va_display_init (GstVaDisplay * self) +{ + GstVaDisplayPrivate *priv = GET_PRIV (self); + + g_rec_mutex_init (&priv->lock); + priv->impl = GST_VA_IMPLEMENTATION_INVALID; +} + +/** + * gst_va_display_lock: + * @self: a #GstVaDisplay + * + * Lock the display. It will be used before we call the + * VA API functions to serialize the VA commands. + * + * Since: 1.20 + **/ +void +gst_va_display_lock (GstVaDisplay * self) +{ + GstVaDisplayPrivate *priv; + + g_return_if_fail (GST_IS_VA_DISPLAY (self)); + + priv = GET_PRIV (self); + + g_rec_mutex_lock (&priv->lock); +} + +/** + * gst_va_display_unlock: + * @self: a #GstVaDisplay + * + * Unlock the display. It will be used after we call the + * VA API functions. 
+ * + * Since: 1.20 + **/ +void +gst_va_display_unlock (GstVaDisplay * self) +{ + GstVaDisplayPrivate *priv; + + g_return_if_fail (GST_IS_VA_DISPLAY (self)); + + priv = GET_PRIV (self); + + g_rec_mutex_unlock (&priv->lock); +} + +#ifndef GST_DISABLE_GST_DEBUG +static gchar * +_strip_msg (const char *message) +{ + gchar *msg = g_strdup (message); + if (!msg) + return NULL; + return g_strstrip (msg); +} + +static void +_va_warning (gpointer object, const char *message) +{ + GstVaDisplay *self = GST_VA_DISPLAY (object); + gchar *msg; + + if ((msg = _strip_msg (message))) { + GST_WARNING_OBJECT (self, "VA error: %s", msg); + g_free (msg); + } +} + +static void +_va_info (gpointer object, const char *message) +{ + GstVaDisplay *self = GST_VA_DISPLAY (object); + gchar *msg; + + if ((msg = _strip_msg (message))) { + GST_INFO_OBJECT (self, "VA info: %s", msg); + g_free (msg); + } +} +#endif + +/** + * gst_va_display_initialize: + * @self: a #GstVaDisplay + * + * If the display is set by the user (foreign) it is assumed that the + * driver is already initialized, thus this function is noop. + * + * If the display is opened internally, this function will initialize + * the driver and it will set driver's message callbacks. + * + * NOTE: this function is supposed to be private, only used by + * GstVaDisplay descendants. 
+ * + * Returns: %TRUE if the VA driver can be initialized; %FALSE + * otherwise + * + * Since: 1.20 + **/ +gboolean +gst_va_display_initialize (GstVaDisplay * self) +{ + GstVaDisplayPrivate *priv; + VAStatus status; + int major_version = -1, minor_version = -1; + + g_return_val_if_fail (GST_IS_VA_DISPLAY (self), FALSE); + + priv = GET_PRIV (self); + + if (priv->init) + return TRUE; + + if (!priv->display) + return FALSE; + +#ifndef GST_DISABLE_GST_DEBUG + vaSetErrorCallback (priv->display, _va_warning, self); + vaSetInfoCallback (priv->display, _va_info, self); +#endif + + status = vaInitialize (priv->display, &major_version, &minor_version); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING_OBJECT (self, "vaInitialize: %s", vaErrorStr (status)); + return FALSE; + } + + GST_INFO_OBJECT (self, "VA-API version %d.%d", major_version, minor_version); + + priv->init = TRUE; + + if (!_gst_va_display_filter_driver (self, NULL)) + return FALSE; + + return TRUE; +} + +/** + * gst_va_display_get_va_dpy: + * @self: a #GstVaDisplay type display. + * + * Get the VA display handle of the @self. + * + * Returns: the VA display handle. + * + * Since: 1.20 + */ +gpointer +gst_va_display_get_va_dpy (GstVaDisplay * self) +{ + VADisplay dpy; + + g_return_val_if_fail (GST_IS_VA_DISPLAY (self), NULL); + + g_object_get (self, "va-display", &dpy, NULL); + return dpy; +} + +/** + * gst_va_display_get_implementation: + * @self: a #GstVaDisplay type display. + * + * Get the the #GstVaImplementation type of @self. + * + * Returns: #GstVaImplementation. + * + * Since: 1.20 + */ +GstVaImplementation +gst_va_display_get_implementation (GstVaDisplay * self) +{ + GstVaDisplayPrivate *priv; + + g_return_val_if_fail (GST_IS_VA_DISPLAY (self), + GST_VA_IMPLEMENTATION_INVALID); + + priv = GET_PRIV (self); + return priv->impl; +}
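The lock/unlock pair added above exists to serialize direct libva calls made against the wrapped display. A short sketch of the intended usage pattern, assuming a caller already holds a `GstVaDisplay` (the `vaMaxNumProfiles()` call is just an arbitrary libva query for illustration):

```c
/* Sketch: serialize a direct libva call through the display lock. */
#include <va/va.h>
#include <gst/va/gstvadisplay.h>

static int
query_max_profiles (GstVaDisplay * display)
{
  VADisplay dpy = gst_va_display_get_va_dpy (display);
  int max;

  gst_va_display_lock (display);
  max = vaMaxNumProfiles (dpy);
  gst_va_display_unlock (display);

  return max;
}
```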
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/gstvadisplay.h
Added
@@ -0,0 +1,152 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/va/va_fwd.h> +#include <gst/va/va-prelude.h> +#include <gst/gst.h> + +G_BEGIN_DECLS + +/** + * GstVaImplementation: + * @GST_VA_IMPLEMENTATION_MESA_GALLIUM: The mesa gallium implementation. + * @GST_VA_IMPLEMENTATION_INTEL_I965: The legacy i965 intel implementation. + * @GST_VA_IMPLEMENTATION_INTEL_IHD: The iHD intel implementation. + * @GST_VA_IMPLEMENTATION_OTHER: Other implementation. + * @GST_VA_IMPLEMENTATION_INVALID: Invalid implementation. + * + * Types of different VA API implemented drivers. These are the typical and + * the most widely used VA drivers. 
+ * + * Since: 1.20 + */ +typedef enum +{ + GST_VA_IMPLEMENTATION_MESA_GALLIUM, + GST_VA_IMPLEMENTATION_INTEL_I965, + GST_VA_IMPLEMENTATION_INTEL_IHD, + GST_VA_IMPLEMENTATION_OTHER, + GST_VA_IMPLEMENTATION_INVALID, +} GstVaImplementation; + +/** + * GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR: + * + * Since: 1.20 + */ +#define GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR "gst.va.display.handle" + +/** + * GST_CAPS_FEATURE_MEMORY_VA: + * + * Since: 1.20 + */ +#define GST_CAPS_FEATURE_MEMORY_VA "memory:VAMemory" + +/** + * GST_VA_DISPLAY_IS_IMPLEMENTATION: (skip) + * + * Check whether the display is the implementation of the specified + * #GstVaImplementation type. + * + * Since: 1.20 + */ +#define GST_VA_DISPLAY_IS_IMPLEMENTATION(display, impl) \ + (gst_va_display_is_implementation (display, G_PASTE (GST_VA_IMPLEMENTATION_, impl))) + +#define GST_TYPE_VA_DISPLAY (gst_va_display_get_type()) +#define GST_VA_DISPLAY(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_VA_DISPLAY, GstVaDisplay)) +#define GST_VA_DISPLAY_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_VA_DISPLAY, GstVaDisplayClass)) +#define GST_IS_VA_DISPLAY(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_VA_DISPLAY)) +#define GST_IS_VA_DISPLAY_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_VA_DISPLAY)) +#define GST_VA_DISPLAY_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_VA_DISPLAY, GstVaDisplayClass)) + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVaDisplay, gst_object_unref) + +/** + * GstVaDisplay: + * @parent: parent #GstObject + * + * The common VA display object structure. + * + * Since: 1.20 + */ +struct _GstVaDisplay +{ + GstObject parent; +}; + +/** + * GstVaDisplayClass: + * @parent_class: parent #GstObjectClass + * + * The common VA display object class structure. 
+ * + * Since: 1.20 + */ +struct _GstVaDisplayClass +{ + GstObjectClass parent_class; + + /** + * GstVaDisplayClass::create_va_display: + * @self: a #GstVaDisplay instance + * + * This is called when the subclass has to create the internal + * VADisplay. + * + * Returns: The created VADisplay + */ + gpointer (*create_va_display) (GstVaDisplay * self); +}; + +GST_VA_API +GType gst_va_display_get_type (void); +GST_VA_API +void gst_va_display_lock (GstVaDisplay * self); +GST_VA_API +void gst_va_display_unlock (GstVaDisplay * self); +GST_VA_API +gboolean gst_va_display_initialize (GstVaDisplay * self); +GST_VA_API +gpointer gst_va_display_get_va_dpy (GstVaDisplay * self); +GST_VA_API +GstVaImplementation gst_va_display_get_implementation (GstVaDisplay * self); + +/** + * gst_va_display_is_implementation: + * @display: the #GstVaDisplay to check. + * @impl: the specified #GstVaImplementation. + * + * Check whether the @display is the implementation of the @impl type. + * + * Returns: %TRUE if the @display is the implementation of the @impl type. + * + * Since: 1.20 + */ +static inline gboolean +gst_va_display_is_implementation (GstVaDisplay * display, GstVaImplementation impl) +{ + return (gst_va_display_get_implementation (display) == impl); +} + +G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/gstvadisplay_drm.c
Added
@@ -0,0 +1,217 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:gstvadisplaydrm + * @title: GstVaDisplayDrm + * @short_description: VADisplay from a DRM device + * @sources: + * - gstvadisplay_drm.h + * + * This is a #GstVaDisplay subclass to instantiate with DRM devices. 
+ */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvadisplay_drm.h" +#include <fcntl.h> +#include <unistd.h> +#include <sys/types.h> +#include <sys/stat.h> +#include <va/va_drm.h> + +#if HAVE_LIBDRM +#include <xf86drm.h> +#endif + +/** + * GstVaDisplayDrm: + * @parent: parent #GstVaDisplay + * + * Since: 1.20 + */ +struct _GstVaDisplayDrm +{ + GstVaDisplay parent; + + /* <private> */ + gchar *path; + gint fd; +}; + +/** + * GstVaDisplayDrmClass: + * @parent_class: parent #GstVaDisplayClass + * + * Since: 1.20 + */ +struct _GstVaDisplayDrmClass +{ + GstVaDisplayClass parent_class; +}; + +GST_DEBUG_CATEGORY_EXTERN (gst_va_display_debug); +#define GST_CAT_DEFAULT gst_va_display_debug + +#define gst_va_display_drm_parent_class parent_class +G_DEFINE_TYPE (GstVaDisplayDrm, gst_va_display_drm, GST_TYPE_VA_DISPLAY); + +enum +{ + PROP_PATH = 1, + N_PROPERTIES +}; + +static GParamSpec *g_properties[N_PROPERTIES]; + +#define MAX_DEVICES 8 + +static void +gst_va_display_drm_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (object); + + switch (prop_id) { + case PROP_PATH: + self->path = g_value_dup_string (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_va_display_drm_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (object); + + switch (prop_id) { + case PROP_PATH: + g_value_set_string (value, self->path); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_va_display_drm_finalize (GObject * object) +{ + GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (object); + + g_free (self->path); + if (self->fd > -1) + close (self->fd); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static gpointer +gst_va_display_drm_create_va_display 
(GstVaDisplay * display) +{ + int fd, saved_errno = 0; + GstVaDisplayDrm *self = GST_VA_DISPLAY_DRM (display); + + fd = open (self->path, O_RDWR); + saved_errno = errno; + if (fd < 0) { + GST_WARNING_OBJECT (self, "Failed to open %s: %s", self->path, + g_strerror (saved_errno)); + return 0; + } +#if HAVE_LIBDRM + { + drmVersion *version; + + version = drmGetVersion (fd); + if (!version) { + GST_ERROR_OBJECT (self, "Device %s is not a DRM render node", self->path); + return 0; + } + GST_INFO_OBJECT (self, "DRM render node with kernel driver %s", + version->name); + drmFreeVersion (version); + } +#endif + + self->fd = fd; + return vaGetDisplayDRM (self->fd); +} + +static void +gst_va_display_drm_class_init (GstVaDisplayDrmClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstVaDisplayClass *vadisplay_class = GST_VA_DISPLAY_CLASS (klass); + + gobject_class->set_property = gst_va_display_drm_set_property; + gobject_class->get_property = gst_va_display_drm_get_property; + gobject_class->finalize = gst_va_display_drm_finalize; + + vadisplay_class->create_va_display = gst_va_display_drm_create_va_display; + + g_properties[PROP_PATH] = + g_param_spec_string ("path", "render-path", "The path of DRM device", + "/dev/dri/renderD128", + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT_ONLY); + + g_object_class_install_properties (gobject_class, N_PROPERTIES, g_properties); +} + +static void +gst_va_display_drm_init (GstVaDisplayDrm * self) +{ + self->fd = -1; +} + +/** + * gst_va_display_drm_new_from_path: + * @path: the path to the DRM device + * + * Creates a new #GstVaDisplay from a DRM device . It will try to open + * and operate the device in @path. + * + * Returns: (transfer full): a newly allocated #GstVaDisplay if the + * specified DRM render device could be opened and initialized; + * otherwise %NULL is returned. 
+ * + * Since: 1.20 + **/ +GstVaDisplay * +gst_va_display_drm_new_from_path (const gchar * path) +{ + GstVaDisplay *dpy; + + g_return_val_if_fail (path, NULL); + + dpy = g_object_new (GST_TYPE_VA_DISPLAY_DRM, "path", path, NULL); + if (!gst_va_display_initialize (dpy)) { + gst_object_unref (dpy); + return NULL; + } + + return gst_object_ref_sink (dpy); +}
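Putting the DRM subclass together with the `GST_VA_DISPLAY_IS_IMPLEMENTATION` macro from gstvadisplay.h, opening and probing a render node could be sketched like this; `/dev/dri/renderD128` is the documented default path, but any render node works.

```c
/* Sketch: open a DRM render node and report the VA driver in use. */
#include <gst/gst.h>
#include <gst/va/gstvadisplay_drm.h>

int
main (int argc, char **argv)
{
  GstVaDisplay *dpy;

  gst_init (&argc, &argv);

  dpy = gst_va_display_drm_new_from_path ("/dev/dri/renderD128");
  if (!dpy)
    return 1;  /* no usable render node or VA driver */

  if (GST_VA_DISPLAY_IS_IMPLEMENTATION (dpy, MESA_GALLIUM))
    g_print ("Mesa Gallium VA driver\n");
  else if (GST_VA_DISPLAY_IS_IMPLEMENTATION (dpy, INTEL_IHD))
    g_print ("Intel iHD VA driver\n");

  gst_object_unref (dpy);
  return 0;
}
```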
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/gstvadisplay_drm.h
Added
@@ -0,0 +1,41 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadisplay.h" + +G_BEGIN_DECLS + +#define GST_TYPE_VA_DISPLAY_DRM (gst_va_display_drm_get_type()) +#define GST_VA_DISPLAY_DRM(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_VA_DISPLAY_DRM, GstVaDisplayDrm)) +#define GST_VA_DISPLAY_DRM_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_VA_DISPLAY_DRM, GstVaDisplayDrmClass)) +#define GST_IS_VA_DISPLAY_DRM(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_VA_DISPLAY_DRM)) +#define GST_IS_VA_DISPLAY_DRM_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_VA_DISPLAY_DRM)) +#define GST_VA_DISPLAY_DRM_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_VA_DISPLAY_DRM, GstVaDisplayDrmClass)) + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVaDisplayDrm, gst_object_unref) + +GST_VA_API +GType gst_va_display_drm_get_type (void); +GST_VA_API +GstVaDisplay * gst_va_display_drm_new_from_path (const gchar * path); + +G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/gstvadisplay_wrapped.c
Added
@@ -0,0 +1,104 @@
+/* GStreamer
+ * Copyright (C) 2020 Igalia, S.L.
+ *     Author: Víctor Jáquez <vjaquez@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+/**
+ * SECTION:gstvadisplaywrapped
+ * @title: GstVaDisplayWrapped
+ * @short_description: User's custom VADisplay
+ * @sources:
+ * - gstvadisplay_wrapped.h
+ *
+ * This is a #GstVaDisplay instantiation subclass for a custom-created
+ * VADisplay, such as X11 or Wayland, wrapping it.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstvadisplay_wrapped.h"
+
+/**
+ * GstVaDisplayWrapped:
+ * @parent: parent #GstVaDisplay
+ *
+ * Since: 1.20
+ */
+struct _GstVaDisplayWrapped
+{
+  GstVaDisplay parent;
+};
+
+/**
+ * GstVaDisplayWrappedClass:
+ * @parent_class: parent #GstVaDisplayClass
+ *
+ * Since: 1.20
+ */
+struct _GstVaDisplayWrappedClass
+{
+  GstVaDisplayClass parent_class;
+};
+
+#define gst_va_display_wrapped_parent_class parent_class
+G_DEFINE_TYPE (GstVaDisplayWrapped, gst_va_display_wrapped,
+    GST_TYPE_VA_DISPLAY);
+
+static void
+gst_va_display_wrapped_class_init (GstVaDisplayWrappedClass * klass)
+{
+}
+
+static void
+gst_va_display_wrapped_init (GstVaDisplayWrapped * self)
+{
+}
+
+/**
+ * gst_va_display_wrapped_new:
+ * @handle: a VADisplay to wrap
+ *
+ * Creates a #GstVaDisplay wrapping an already created and initialized
+ * VADisplay.
+ *
+ * The lifetime of @handle must be held by the provider while the
+ * pipeline is instantiated. Do not call vaTerminate on it while the
+ * pipeline is not in NULL state.
+ *
+ * Returns: (transfer full): a new #GstVaDisplay if @handle is valid,
+ * otherwise %NULL.
+ *
+ * Since: 1.20
+ **/
+GstVaDisplay *
+gst_va_display_wrapped_new (gpointer handle)
+{
+  GstVaDisplay *dpy;
+
+  g_return_val_if_fail (handle, NULL);
+
+  dpy = g_object_new (GST_TYPE_VA_DISPLAY_WRAPPED, "va-display", handle, NULL);
+  if (!gst_va_display_initialize (dpy)) {
+    gst_object_unref (dpy);
+    return NULL;
+  }
+
+  return gst_object_ref_sink (dpy);
+}
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/gstvadisplay_wrapped.h
Added
@@ -0,0 +1,41 @@
+/* GStreamer
+ * Copyright (C) 2020 Igalia, S.L.
+ *     Author: Víctor Jáquez <vjaquez@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#include "gstvadisplay.h"
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_VA_DISPLAY_WRAPPED (gst_va_display_wrapped_get_type())
+#define GST_VA_DISPLAY_WRAPPED(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_VA_DISPLAY_WRAPPED, GstVaDisplayWrapped))
+#define GST_VA_DISPLAY_WRAPPED_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_VA_DISPLAY_WRAPPED, GstVaDisplayWrappedClass))
+#define GST_IS_VA_DISPLAY_WRAPPED(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_VA_DISPLAY_WRAPPED))
+#define GST_IS_VA_DISPLAY_WRAPPED_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_VA_DISPLAY_WRAPPED))
+#define GST_VA_DISPLAY_WRAPPED_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_VA_DISPLAY_WRAPPED, GstVaDisplayWrappedClass))
+
+G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVaDisplayWrapped, gst_object_unref)
+
+GST_VA_API
+GType           gst_va_display_wrapped_get_type (void);
+GST_VA_API
+GstVaDisplay *  gst_va_display_wrapped_new (gpointer handle);
+
+G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/meson.build
Added
@@ -0,0 +1,49 @@
+va_sources = [
+  'gstvadisplay.c',
+  'gstvadisplay_drm.c',
+  'gstvadisplay_wrapped.c',
+]
+
+va_headers = [
+  'gstvadisplay.h',
+  'gstvadisplay_drm.h',
+  'gstvadisplay_wrapped.h',
+  'va_fwd.h',
+  'va-prelude.h',
+]
+
+gstva_dep = dependency('', required : false)
+
+va_option = get_option('va')
+if va_option.disabled() or host_system != 'linux'
+  subdir_done()
+endif
+
+libva_req = ['>= 1.6']
+libva_dep = dependency('libva', version: libva_req, required: va_option)
+libva_drm_dep = dependency('libva-drm', version: libva_req, required: va_option)
+
+if not (libva_dep.found() and libva_drm_dep.found())
+  subdir_done()
+endif
+
+libdrm_dep = dependency('libdrm', required: false, fallback: ['libdrm', 'ext_libdrm'])
+cdata.set10('HAVE_LIBDRM', libdrm_dep.found())
+
+gstva = library('gstva-' + api_version,
+  va_sources,
+  c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_VA', '-DG_LOG_DOMAIN="GStreamer-VA"'],
+  include_directories : [configinc, libsinc],
+  version : libversion,
+  soversion : soversion,
+  install : true,
+  dependencies : [gst_dep, libva_dep, libva_drm_dep, libdrm_dep],
+)
+
+pkg_name = 'gstreamer-va-' + api_version
+libraries += [[pkg_name, {'lib': gstva}]]
+
+gstva_dep = declare_dependency(link_with : gstva,
+  include_directories : [libsinc],
+  dependencies : [gst_dep, libva_dep, libva_drm_dep, libdrm_dep])
+meson.override_dependency(pkg_name, gstva_dep)
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/va-prelude.h
Added
@@ -0,0 +1,30 @@
+/* GStreamer
+ * Copyright (C) 2021 GStreamer developers
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#include <gst/gst.h>
+
+#ifndef GST_VA_API
+# ifdef BUILDING_GST_VA
+#  define GST_VA_API GST_API_EXPORT     /* from config.h */
+# else
+#  define GST_VA_API GST_API_IMPORT
+# endif
+#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/va/va_fwd.h
Added
@@ -0,0 +1,35 @@
+/* GStreamer
+ * Copyright (C) 2021 GStreamer developers
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#include <gst/gst.h>
+
+G_BEGIN_DECLS
+
+typedef struct _GstVaDisplay GstVaDisplay;
+typedef struct _GstVaDisplayClass GstVaDisplayClass;
+
+typedef struct _GstVaDisplayDrm GstVaDisplayDrm;
+typedef struct _GstVaDisplayDrmClass GstVaDisplayDrmClass;
+
+typedef struct _GstVaDisplayWrapped GstVaDisplayWrapped;
+typedef struct _GstVaDisplayWrappedClass GstVaDisplayWrappedClass;
+
+G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/android/gstvkwindow_android.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/android/gstvkwindow_android.c
Changed
@@ -163,14 +163,14 @@
   if (!window_android->CreateAndroidSurface) {
     g_set_error_literal (error, GST_VULKAN_ERROR, VK_ERROR_FEATURE_NOT_PRESENT,
         "Could not retrieve \"vkCreateAndroidSurfaceKHR\" function pointer");
-    return 0;
+    return VK_NULL_HANDLE;
   }
 
   err =
       window_android->CreateAndroidSurface (window->display->instance->instance,
      &info, NULL, &ret);
   if (gst_vulkan_error_to_g_error (err, error, "vkCreateAndroidSurfaceKHR") < 0)
-    return 0;
+    return VK_NULL_HANDLE;
 
   return ret;
 }
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/cocoa/gstvkwindow_cocoa.m -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/cocoa/gstvkwindow_cocoa.m
Changed
@@ -226,14 +226,14 @@
   if (!window_cocoa->CreateMacOSSurface) {
     g_set_error_literal (error, GST_VULKAN_ERROR, VK_ERROR_FEATURE_NOT_PRESENT,
         "Could not retrieve \"vkCreateMacOSSurfaceMVK\" function pointer");
-    return NULL;
+    return VK_NULL_HANDLE;
   }
 
   err =
       window_cocoa->CreateMacOSSurface (window->display->instance->instance,
      &info, NULL, &ret);
   if (gst_vulkan_error_to_g_error (err, error, "vkCreateMacOSSurfaceMVK") < 0)
-    return NULL;
+    return VK_NULL_HANDLE;
 
   return ret;
 }
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkapi.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkapi.h
Changed
@@ -31,6 +31,24 @@
 #include <gst/vulkan/vulkan_fwd.h>
 #include <gst/vulkan/vulkan-enumtypes.h>
 
+/**
+ * VK_DEFINE_NON_DISPATCHABLE_HANDLE:
+ *
+ * Allow applications to override the VK_DEFINE_NON_DISPATCHABLE_HANDLE
+ * but provide our own version otherwise. The default vulkan define
+ * provides a different symbol type depending on the architecture and
+ * this causes multilib problems because the generated .gir files are
+ * different.
+ *
+ * Also make sure to provide a suitable GST_VULKAN_NON_DISPATCHABLE_HANDLE_FORMAT
+ * implementation when redefining VK_DEFINE_NON_DISPATCHABLE_HANDLE.
+ *
+ * Since: 1.20
+ */
+#if !defined(VK_DEFINE_NON_DISPATCHABLE_HANDLE)
+#define VK_DEFINE_NON_DISPATCHABLE_HANDLE(object) typedef uint64_t object;
+#endif
+
 #include <vulkan/vulkan_core.h>
 
 #endif /* __GST_VULKAN_API_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkcommandpool.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkcommandpool.h
Changed
@@ -66,6 +66,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanCommandPool, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanQueue *    gst_vulkan_command_pool_get_queue     (GstVulkanCommandPool * pool);
 
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkdebug.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkdebug.c
Changed
@@ -163,3 +163,27 @@
       return "unknown";
   }
 }
+
+/**
+ * gst_vulkan_present_mode_to_string:
+ * @present_mode: a `VkPresentModeKHR`
+ *
+ * Returns: name of @present_mode
+ *
+ * Since: 1.20
+ */
+const gchar *
+gst_vulkan_present_mode_to_string (VkPresentModeKHR present_mode)
+{
+  switch (present_mode) {
+    case VK_PRESENT_MODE_FIFO_KHR:
+      return "FIFO";
+    case VK_PRESENT_MODE_IMMEDIATE_KHR:
+      return "immediate";
+    case VK_PRESENT_MODE_MAILBOX_KHR:
+      return "mailbox";
+    /* XXX: add other values as necessary */
+    default:
+      return "unknown";
+  }
+}
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkdebug.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkdebug.h
Changed
@@ -66,6 +66,8 @@
 gchar *         gst_vulkan_queue_flags_to_string (VkQueueFlags queue_bits);
 GST_VULKAN_API
 gchar *         gst_vulkan_sample_count_flags_to_string (VkSampleCountFlags sample_count_bits);
+GST_VULKAN_API
+const gchar *   gst_vulkan_present_mode_to_string (VkPresentModeKHR present_mode);
 
 G_END_DECLS
 
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkdescriptorcache.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkdescriptorcache.h
Changed
@@ -35,7 +35,7 @@
 
 /**
  * GstVulkanDescriptorCache:
- * @parent: the parent #GstObject
+ * @parent: the parent #GstVulkanHandlePool
  * @pool: the #GstVulkanDescriptorPool to cache descriptor sets for
  *
  * Since: 1.18
@@ -64,6 +64,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanDescriptorCache, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanDescriptorCache *  gst_vulkan_descriptor_cache_new (GstVulkanDescriptorPool * pool,
                                                              guint n_layouts,
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkdevice.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkdevice.h
Changed
@@ -84,6 +84,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanDevice, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanDevice *   gst_vulkan_device_new (GstVulkanPhysicalDevice * physical_device);
 GST_VULKAN_API
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkdisplay.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkdisplay.h
Changed
@@ -125,6 +125,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanDisplay, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanDisplay *  gst_vulkan_display_new (GstVulkanInstance *instance);
 GST_VULKAN_API
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkfullscreenquad.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkfullscreenquad.h
Changed
@@ -99,6 +99,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanFullScreenQuad, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanFullScreenQuad *   gst_vulkan_full_screen_quad_new (GstVulkanQueue * queue);
 
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkhandle.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkhandle.h
Changed
@@ -54,11 +54,12 @@
  *
  * The printf format specifier for raw Vulkan non dispatchable handles.
  *
+ * When redefining VK_DEFINE_NON_DISPATCHABLE_HANDLE, also make sure
+ * to redefine a suitable printf format specifier.
+ *
  * Since: 1.18
  */
-#if GLIB_SIZEOF_VOID_P == 8
-# define GST_VULKAN_NON_DISPATCHABLE_HANDLE_FORMAT "p"
-#else
+#if !defined(GST_VULKAN_NON_DISPATCHABLE_HANDLE_FORMAT)
 # define GST_VULKAN_NON_DISPATCHABLE_HANDLE_FORMAT G_GUINT64_FORMAT
 #endif
 
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkhandlepool.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkhandlepool.h
Changed
@@ -89,6 +89,8 @@
   gpointer _padding[GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanHandlePool, gst_object_unref)
+
 GST_VULKAN_API
 gpointer    gst_vulkan_handle_pool_alloc (GstVulkanHandlePool * pool, GError ** error);
 GST_VULKAN_API
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkinstance.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkinstance.c
Changed
@@ -975,7 +975,15 @@
   if (gst_vulkan_error_to_g_error (err, error,
           "vkEnumeratePhysicalDevices") < 0)
     goto error;
-  g_assert (instance->n_physical_devices > 0);
+
+  if (instance->n_physical_devices == 0) {
+    GST_WARNING_OBJECT (instance, "No available physical device");
+    g_set_error_literal (error,
+        GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_NOT_FOUND,
+        "No available physical device");
+    goto error;
+  }
+
   instance->physical_devices =
       g_new0 (VkPhysicalDevice, instance->n_physical_devices);
   err =
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkinstance.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkinstance.h
Changed
@@ -76,6 +76,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanInstance, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanInstance * gst_vulkan_instance_new (void);
 GST_VULKAN_API
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkphysicaldevice.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkphysicaldevice.h
Changed
@@ -83,6 +83,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanPhysicalDevice, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanPhysicalDevice *   gst_vulkan_physical_device_new (GstVulkanInstance * instance,
                                                             guint device_index);
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkqueue.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkqueue.h
Changed
@@ -78,7 +78,9 @@
   gpointer _reserved        [GST_PADDING];
 };
 
-GST_VULKAN_API
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanQueue, gst_object_unref)
+
+GST_VULKAN_API
 GstVulkanDevice *   gst_vulkan_queue_get_device (GstVulkanQueue * queue);
 
 GST_VULKAN_API
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkswapper.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkswapper.c
Changed
@@ -753,10 +753,15 @@
     swapchain_dims.width = width;
     swapchain_dims.height = height;
     priv->any_current_extent = TRUE;
+    GST_DEBUG_OBJECT (swapper, "using requested swapchain dimensions %ux%u "
+        "from window", width, height);
   } else {
     /* If the surface size is defined, the swap chain size must match */
     swapchain_dims = priv->surf_props.currentExtent;
     priv->any_current_extent = FALSE;
+    GST_DEBUG_OBJECT (swapper, "using current swapchain dimensions %ux%u",
+        priv->surf_props.currentExtent.width,
+        priv->surf_props.currentExtent.height);
   }
   priv->surface_location.w = swapchain_dims.width;
   priv->surface_location.h = swapchain_dims.height;
@@ -767,6 +772,12 @@
    * always available. */
   present_mode = VK_PRESENT_MODE_FIFO_KHR;
   for (i = 0; i < priv->n_surf_present_modes; i++) {
+    GST_TRACE_OBJECT (swapper,
+        "surface %" GST_VULKAN_NON_DISPATCHABLE_HANDLE_FORMAT
+        " has present mode \'%s\' (0x%x)", priv->surface,
+        gst_vulkan_present_mode_to_string (priv->surf_present_modes[i]),
+        priv->surf_present_modes[i]);
+
     if (priv->surf_present_modes[i] == VK_PRESENT_MODE_MAILBOX_KHR) {
       present_mode = VK_PRESENT_MODE_MAILBOX_KHR;
       break;
@@ -776,6 +787,8 @@
       present_mode = VK_PRESENT_MODE_IMMEDIATE_KHR;
     }
   }
+  GST_DEBUG_OBJECT (swapper, "using present mode \'%s\'",
+      gst_vulkan_present_mode_to_string (present_mode));
 
   /* Determine the number of VkImage's to use in the swap chain (we desire to
    * own only 1 image at a time, besides the images being displayed and
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkswapper.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkswapper.h
Changed
@@ -84,6 +84,8 @@
   gpointer _reserved        [GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanSwapper, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanSwapper *  gst_vulkan_swapper_new (GstVulkanDevice * device,
                                             GstVulkanWindow * window);
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvktrash.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvktrash.h
Changed
@@ -96,9 +96,7 @@
   gst_mini_object_unref (GST_MINI_OBJECT_CAST (trash));
 }
 
-#ifdef G_DEFINE_AUTOPTR_CLEANUP_FUNC
 G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVulkanTrash, gst_vulkan_trash_unref)
-#endif
 
 GST_VULKAN_API
 GstVulkanTrash *    gst_vulkan_trash_new (GstVulkanFence * fence,
@@ -163,9 +161,7 @@
 #define GST_IS_VULKAN_TRASH_LIST(obj)          (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_VULKAN_TRASH_LIST))
 #define GST_IS_VULKAN_TRASH_LIST_CLASS(klass)  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_VULKAN_TRASH_LIST))
 
-#ifdef G_DEFINE_AUTOPTR_CLEANUP_FUNC
 G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVulkanTrashList, gst_object_unref)
-#endif
 
 /**
  * GstVulkanTrashList:
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkutils.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkutils.c
Changed
@@ -447,8 +447,6 @@
         info->subresourceRange.baseMipLevel
         && view->create_info.subresourceRange.levelCount ==
         info->subresourceRange.levelCount
-        && view->create_info.subresourceRange.levelCount ==
-        info->subresourceRange.levelCount
         && view->create_info.subresourceRange.baseArrayLayer ==
         info->subresourceRange.baseArrayLayer
         && view->create_info.subresourceRange.layerCount ==
@@ -499,8 +497,8 @@
  * Since: 1.18
  */
 GstVulkanHandle *
-gst_vulkan_create_shader (GstVulkanDevice * device, gchar * code, gsize size,
-    GError ** error)
+gst_vulkan_create_shader (GstVulkanDevice * device, const gchar * code,
+    gsize size, GError ** error)
 {
   VkShaderModule shader;
   VkResult res;
@@ -511,7 +509,7 @@
     .pNext = NULL,
     .flags = 0,
     .codeSize = size,
-    .pCode = (guint32 *) code
+    .pCode = (const guint32 *) code
   };
   /* *INDENT-ON* */
   guint32 first_word;
@@ -525,7 +523,7 @@
       || first_word == SPIRV_MAGIC_NUMBER_OE, VK_NULL_HANDLE);
   if (first_word == SPIRV_MAGIC_NUMBER_OE) {
     /* endianness swap... */
-    guint32 *old_code = (guint32 *) code;
+    const guint32 *old_code = (const guint32 *) code;
     gsize i;
 
     GST_DEBUG ("performing endianness conversion on spirv shader of size %"
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkutils.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkutils.h
Changed
@@ -57,7 +57,7 @@
 
 GST_VULKAN_API
 GstVulkanHandle *   gst_vulkan_create_shader (GstVulkanDevice * device,
-                                              gchar * code,
+                                              const gchar * code,
                                               gsize size,
                                               GError ** error);
 
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkvideofilter.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkvideofilter.c
Changed
@@ -161,7 +161,7 @@
 
 struct choose_data
 {
-  GstVulkanVideoFilter *upload;
+  GstVulkanVideoFilter *filter;
   GstVulkanQueue *queue;
 };
 
@@ -183,14 +183,14 @@
 }
 
 static GstVulkanQueue *
-_find_graphics_queue (GstVulkanVideoFilter * upload)
+_find_graphics_queue (GstVulkanVideoFilter * filter)
 {
   struct choose_data data;
 
-  data.upload = upload;
+  data.filter = filter;
   data.queue = NULL;
 
-  gst_vulkan_device_foreach_queue (upload->device,
+  gst_vulkan_device_foreach_queue (filter->device,
       (GstVulkanDeviceForEachQueueFunc) _choose_queue, &data);
 
   return data.queue;
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/gstvkwindow.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/gstvkwindow.h
Changed
@@ -140,6 +140,8 @@
   gpointer _reserved[GST_PADDING];
 };
 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVulkanWindow, gst_object_unref)
+
 GST_VULKAN_API
 GstVulkanWindow *   gst_vulkan_window_new (GstVulkanDisplay *display);
 
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/ios/gstvkwindow_ios.m -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/ios/gstvkwindow_ios.m
Changed
@@ -203,7 +203,7 @@
     g_set_error_literal (error, GST_VULKAN_ERROR,
         VK_ERROR_INITIALIZATION_FAILED,
         "No layer to retrieve surface for. Has create_window() been called?");
-    return 0;
+    return VK_NULL_HANDLE;
   }
 
   info.sType = VK_STRUCTURE_TYPE_IOS_SURFACE_CREATE_INFO_MVK;
@@ -218,14 +218,14 @@
   if (!window_ios->CreateIOSSurface) {
     g_set_error_literal (error, GST_VULKAN_ERROR, VK_ERROR_FEATURE_NOT_PRESENT,
         "Could not retrieve \"vkCreateIOSSurfaceMVK\" function pointer");
-    return 0;
+    return VK_NULL_HANDLE;
   }
 
   err =
       window_ios->CreateIOSSurface (window->display->instance->instance,
      &info, NULL, &ret);
   if (gst_vulkan_error_to_g_error (err, error, "vkCreateIOSSurfaceMVK") < 0)
-    return 0;
+    return VK_NULL_HANDLE;
 
   return ret;
 }
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/meson.build
Changed
@@ -5,7 +5,7 @@ subdir_done() endif -vulkan_sources = [ +vulkan_sources = files( 'gstvkbuffermemory.c', 'gstvkbufferpool.c', 'gstvkcommandbuffer.c', @@ -34,9 +34,9 @@ 'gstvkvideofilter.c', 'gstvkutils.c', 'gstvkwindow.c', -] +) -vulkan_headers = [ +vulkan_headers = files( 'gstvkapi.h', 'gstvkbarrier.h', 'gstvkbuffermemory.h', @@ -70,7 +70,7 @@ 'vulkan-prelude.h', 'vulkan_fwd.h', 'vulkan.h', -] +) vulkan_priv_sources = [] vulkan_xcb_sources = [] @@ -110,7 +110,7 @@ # https://github.com/KhronosGroup/MoltenVK/issues/492 vulkan_dep = cc.find_library('MoltenVK', required : get_option('vulkan')) elif host_system == 'windows' - vulkan_root = run_command(python3, '-c', 'import os; print(os.environ.get("VK_SDK_PATH"))').stdout().strip() + vulkan_root = run_command(python3, '-c', 'import os; print(os.environ.get("VK_SDK_PATH"))', check: false).stdout().strip() if vulkan_root != '' and vulkan_root != 'None' vulkan_lib_dir = '' if build_machine.cpu_family() == 'x86_64' @@ -154,17 +154,17 @@ xkbcommon_x11_dep = dependency('xkbcommon-x11', required : get_option('x11')) if xcb_dep.found() and xkbcommon_dep.found() and xkbcommon_x11_dep.found() and cc.has_header('vulkan/vulkan_xcb.h', dependencies : vulkan_dep) - vulkan_priv_sources += [ + vulkan_priv_sources += files( 'xcb/gstvkwindow_xcb.c', 'xcb/xcb_event_source.c', - ] - vulkan_xcb_sources += [ + ) + vulkan_xcb_sources += files( 'xcb/gstvkdisplay_xcb.c', - ] - vulkan_xcb_headers += [ + ) + vulkan_xcb_headers += files( 'xcb/xcb.h', 'xcb/gstvkdisplay_xcb.h' - ] + ) optional_deps += [xcb_dep, xkbcommon_dep, xkbcommon_x11_dep] vulkan_windowing = true @@ -174,18 +174,18 @@ wayland_client_dep = dependency('wayland-client', version : '>=1.4', required : get_option('wayland')) if wayland_client_dep.found() and cc.has_header('vulkan/vulkan_wayland.h', dependencies : vulkan_dep) - vulkan_priv_sources += [ + vulkan_priv_sources += files( 'wayland/gstvkdisplay_wayland.c', 'wayland/gstvkwindow_wayland.c', 'wayland/wayland_event_source.c', 
- ] - vulkan_wayland_sources += [ + ) + vulkan_wayland_sources += files( 'wayland/gstvkdisplay_wayland.c', - ] - vulkan_wayland_headers += [ + ) + vulkan_wayland_headers += files( 'wayland/wayland.h', 'wayland/gstvkdisplay_wayland.h' - ] + ) optional_deps += wayland_client_dep vulkan_windowing = true @@ -213,10 +213,10 @@ cocoa_dep = dependency('appleframeworks', modules : ['Cocoa'], required : get_option('vulkan')) if cocoa_dep.found() and cc.has_header('vulkan/vulkan_macos.h', dependencies : vulkan_dep) - vulkan_priv_sources += [ + vulkan_priv_sources += files( 'cocoa/gstvkdisplay_cocoa.m', 'cocoa/gstvkwindow_cocoa.m', - ] + ) optional_deps += [cocoa_dep] vulkan_windowing = true vulkan_conf.set10('GST_VULKAN_HAVE_WINDOW_COCOA', 1) @@ -228,10 +228,10 @@ uikit_dep = dependency('appleframeworks', modules : ['UIKit'], required : get_option('vulkan')) if uikit_dep.found() and cc.has_header('vulkan/vulkan_ios.h', dependencies : vulkan_dep) - vulkan_priv_sources += [ + vulkan_priv_sources += files( 'ios/gstvkdisplay_ios.m', 'ios/gstvkwindow_ios.m', - ] + ) optional_deps += [uikit_dep] vulkan_windowing = true vulkan_conf.set10('GST_VULKAN_HAVE_WINDOW_IOS', 1) @@ -254,10 +254,10 @@ if host_system == 'android' if cc.has_header('vulkan/vulkan_android.h', dependencies : vulkan_dep) - vulkan_priv_sources += [ + vulkan_priv_sources += files( 'android/gstvkdisplay_android.c', 'android/gstvkwindow_android.c', - ] + ) vulkan_windowing = true vulkan_conf.set10('GST_VULKAN_HAVE_WINDOW_ANDROID', 1) enabled_vulkan_winsys += ['android'] @@ -265,7 +265,11 @@ endif if not vulkan_windowing - warning('No Windowing system found. vulkansink will not work') + if get_option('vulkan').enabled() + error('No Windowing system found. vulkansink will not work') + else + message('No Windowing system found. 
vulkansink will not work') + endif endif # Only needed for the vulkan plugin, but doesn't make sense to build @@ -307,8 +311,8 @@ gstvulkan = library('gstvulkan-' + api_version, vulkan_sources, vulkan_priv_sources, vulkan_wayland_sources, vulkan_xcb_sources, vulkan_enumtypes_c, vulkan_enumtypes_h, - c_args : gst_plugins_bad_args + vulkan_defines + ['-DBUILDING_GST_VULKAN'], - objc_args : gst_plugins_bad_args + vulkan_defines + vulkan_objc_args + ['-DBUILDING_GST_VULKAN'], + c_args : gst_plugins_bad_args + vulkan_defines + ['-DBUILDING_GST_VULKAN', '-DG_LOG_DOMAIN="GStreamer-Vulkan"'], + objc_args : gst_plugins_bad_args + vulkan_defines + vulkan_objc_args + ['-DBUILDING_GST_VULKAN', '-DG_LOG_DOMAIN="GStreamer-Vulkan"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -318,6 +322,16 @@ # don't confuse gst/vulkan/xcb/xcb.h with xcb/xcb.h implicit_include_directories : false) +library_def = {'lib': gstvulkan} +pkg_name = 'gstreamer-vulkan-1.0' +pkgconfig.generate(gstvulkan, + libraries : [gst_dep, gstbase_dep, gstvideo_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'GStreamer Vulkan support', +) + if build_gir extra_gir_includes = [] gobject_introspection_dep = dependency('gobject-introspection-1.0') @@ -326,64 +340,101 @@ extra_gir_includes += ['Vulkan-1.0'] endif - vulkan_gir = gnome.generate_gir(gstvulkan, - sources : vulkan_sources + vulkan_headers + [vulkan_enumtypes_h, vulkan_enumtypes_c], - namespace : 'GstVulkan', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-vulkan-1.0', - includes : ['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0'] + extra_gir_includes, - install : true, - extra_args : gir_init_section + ['--c-include=gst/vulkan/vulkan.h'], - dependencies : [gstvideo_dep, gst_dep, gstbase_dep] + optional_deps - ) - gen_sources += vulkan_gir + gir = { + 'sources' : vulkan_sources + vulkan_headers + 
[vulkan_enumtypes_h, vulkan_enumtypes_c], + 'namespace' : 'GstVulkan', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0'] + extra_gir_includes, + 'install' : true, + 'extra_args' : gir_init_section + ['--c-include=gst/vulkan/vulkan.h'], + 'dependencies' : [gstvideo_dep, gst_dep, gstbase_dep] + optional_deps + } + + library_def += {'gir': [gir]} + if not static_build + vulkan_gir = gnome.generate_gir(gstvulkan, kwargs: gir) + gen_sources += vulkan_gir + endif endif +libraries += [[pkg_name, library_def]] gstvulkan_dep = declare_dependency(link_with : gstvulkan, include_directories : [libsinc], sources: gen_sources, dependencies : [gstvideo_dep, gstbase_dep, vulkan_dep] + optional_deps) +meson.override_dependency(pkg_name, gstvulkan_dep) + if enabled_vulkan_winsys.contains('xcb') install_headers(vulkan_xcb_headers, subdir : 'gstreamer-1.0/gst/vulkan/xcb') + pkgconfig.generate( + libraries : [gstvulkan], + requires : ['xcb'], + subdirs : pkgconfig_subdirs, + name : 'gstreamer-vulkan-xcb-1.0', + description : 'GStreamer Vulkan support (XCB Specifics)', + ) vulkan_xcb_gir = [] if build_gir - vulkan_xcb_gir = gnome.generate_gir(gstvulkan, - sources : vulkan_xcb_sources + vulkan_xcb_headers, - namespace : 'GstVulkanXCB', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-vulkan-xcb-1.0', - includes : ['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0', vulkan_gir[0]] + extra_gir_includes, - install : true, - extra_args : gir_init_section + ['--c-include=gst/vulkan/xcb/xcb.h'], - dependencies : [gstvideo_dep, gst_dep, gstbase_dep] + optional_deps - ) + gir = { + 'sources' : vulkan_xcb_sources + vulkan_xcb_headers, + 'namespace' : 'GstVulkanXCB', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : 'gstreamer-vulkan-xcb-1.0', + 'install' : true, 
+ 'extra_args' : gir_init_section + ['--c-include=gst/vulkan/xcb/xcb.h'], + 'dependencies' : [gstvideo_dep, gst_dep, gstbase_dep] + optional_deps + } + + if not static_build + gir += {'includes' : ['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0', vulkan_gir[0]] + extra_gir_includes} + vulkan_xcb_gir = gnome.generate_gir(gstvulkan, kwargs: gir) + endif + + gir += {'includes' :['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0', 'GstVulkan-1.0'] + extra_gir_includes} + library_def += {'gir': library_def['gir'] + [gir]} endif gstvulkanxcb_dep = declare_dependency(dependencies : [gstvulkan_dep], sources : vulkan_xcb_gir) + meson.override_dependency('gstreamer-vulkan-xcb-1.0', gstvulkanxcb_dep) endif if enabled_vulkan_winsys.contains('wayland') install_headers(vulkan_wayland_headers, subdir : 'gstreamer-1.0/gst/vulkan/wayland') + pkgconfig.generate( + libraries : [gstvulkan], + requires : ['wayland-client'], + subdirs : pkgconfig_subdirs, + name : 'gstreamer-vulkan-wayland-1.0', + description : 'GStreamer Vulkan support (Wayland Specifics)', + ) vulkan_wayland_gir = [] if build_gir - vulkan_wayland_gir = gnome.generate_gir(gstvulkan, - sources : vulkan_wayland_sources + vulkan_wayland_headers, - namespace : 'GstVulkanWayland', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-vulkan-wayland-1.0', - includes : ['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0', vulkan_gir[0]] + extra_gir_includes, - install : true, - extra_args : gir_init_section + ['--c-include=gst/vulkan/wayland/wayland.h'], - dependencies : [gstvideo_dep, gst_dep, gstbase_dep] + optional_deps - ) + gir = { + 'sources' : vulkan_wayland_sources + vulkan_wayland_headers, + 'namespace' : 'GstVulkanWayland', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : 'gstreamer-vulkan-wayland-1.0', + 'install' : true, + 'extra_args' : gir_init_section + ['--c-include=gst/vulkan/wayland/wayland.h'], + 'dependencies' : 
[gstvideo_dep, gst_dep, gstbase_dep] + optional_deps + } + if not static_build + gir += {'includes' : ['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0', vulkan_gir[0]] + extra_gir_includes} + vulkan_wayland_gir += gnome.generate_gir(gstvulkan, kwargs: gir) + endif + gir += {'includes' :['Gst-1.0', 'GstBase-1.0', 'GstVideo-1.0', 'GstVulkan-1.0'] + extra_gir_includes} + library_def += {'gir': library_def['gir'] + [gir]} endif gstvulkanwayland_dep = declare_dependency(dependencies : [gstvulkan_dep], sources : vulkan_wayland_gir) + meson.override_dependency('gstreamer-vulkan-wayland-1.0', gstvulkanwayland_dep) endif +
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/wayland/gstvkwindow_wayland.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/wayland/gstvkwindow_wayland.c
Changed
@@ -268,14 +268,14 @@ if (!window_wl->CreateWaylandSurface) { g_set_error_literal (error, GST_VULKAN_ERROR, VK_ERROR_FEATURE_NOT_PRESENT, "Could not retrieve \"vkCreateWaylandSurfaceKHR\" function pointer"); - return NULL; + return VK_NULL_HANDLE; } err = window_wl->CreateWaylandSurface (window->display->instance->instance, &info, NULL, &ret); if (gst_vulkan_error_to_g_error (err, error, "vkCreateWaylandSurfaceKHR") < 0) - return NULL; + return VK_NULL_HANDLE; return ret; }
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/win32/gstvkwindow_win32.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/win32/gstvkwindow_win32.c
Changed
@@ -393,7 +393,7 @@ if (!window_win32->CreateWin32Surface) { g_set_error_literal (error, GST_VULKAN_ERROR, VK_ERROR_FEATURE_NOT_PRESENT, "Could not retrieve \"vkCreateWin32SurfaceKHR\" function pointer"); - return NULL; + return VK_NULL_HANDLE; } err = @@ -401,7 +401,7 @@ &info, NULL, &ret); if (gst_vulkan_error_to_g_error (err, error, "vkCreateWin32SurfaceKHR") < 0) - return NULL; + return VK_NULL_HANDLE; return ret; }
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/vulkan/xcb/gstvkwindow_xcb.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/vulkan/xcb/gstvkwindow_xcb.c
Changed
@@ -283,14 +283,14 @@ if (!window_xcb->CreateXcbSurface) { g_set_error_literal (error, GST_VULKAN_ERROR, VK_ERROR_FEATURE_NOT_PRESENT, "Could not retrieve \"vkCreateXcbSurfaceKHR\" function pointer"); - return NULL; + return VK_NULL_HANDLE; } err = window_xcb->CreateXcbSurface (window->display->instance->instance, &info, NULL, &ret); if (gst_vulkan_error_to_g_error (err, error, "vkCreateXcbSurfaceKHR") < 0) - return NULL; + return VK_NULL_HANDLE; return ret; }
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/wayland/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/wayland/meson.build
Changed
@@ -9,16 +9,29 @@ if use_wayland gstwayland = library('gstwayland-' + api_version, 'wayland.c', - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_WAYLAND'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_WAYLAND', '-DG_LOG_DOMAIN="GStreamer-Wayland"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, - darwin_versions : osxversion, + darwin_versions : osxversion, install : true, dependencies : [gst_dep, gstvideo_dep, wl_client_dep] ) + pkg_name = 'gstreamer-wayland-1.0' + libraries += [[pkg_name, {'lib': gstwayland}]] + pkgconfig.generate(gstwayland, + libraries : [gst_dep, gstvideo_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'GStreamer Wayland support', + ) + gstwayland_dep = declare_dependency(link_with : gstwayland, include_directories : [libsinc], dependencies : [gst_dep, gstvideo_dep]) + + install_headers('wayland.h', subdir: 'gstreamer-1.0/gst/wayland') + meson.override_dependency(pkg_name, gstwayland_dep) endif
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/wayland/wayland.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/wayland/wayland.c
Changed
@@ -46,7 +46,7 @@ GstContext *context = gst_context_new (GST_WAYLAND_DISPLAY_HANDLE_CONTEXT_TYPE, TRUE); gst_structure_set (gst_context_writable_structure (context), - "handle", G_TYPE_POINTER, display, NULL); + "display", G_TYPE_POINTER, display, NULL); return context; } @@ -59,8 +59,11 @@ g_return_val_if_fail (GST_IS_CONTEXT (context), NULL); s = gst_context_get_structure (context); - gst_structure_get (s, "handle", G_TYPE_POINTER, &display, NULL); - return display; + if (gst_structure_get (s, "display", G_TYPE_POINTER, &display, NULL)) + return display; + if (gst_structure_get (s, "handle", G_TYPE_POINTER, &display, NULL)) + return display; + return NULL; }
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/datachannel.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/datachannel.c
Changed
@@ -33,6 +33,7 @@ #endif #include "datachannel.h" +#include "webrtc-priv.h" #define GST_CAT_DEFAULT gst_webrtc_data_channel_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/datachannel.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/datachannel.h
Changed
@@ -36,69 +36,6 @@ #define GST_IS_WEBRTC_DATA_CHANNEL_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_DATA_CHANNEL)) #define GST_WEBRTC_DATA_CHANNEL_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_DATA_CHANNEL,GstWebRTCDataChannelClass)) -#define GST_WEBRTC_DATA_CHANNEL_LOCK(channel) g_mutex_lock(&((GstWebRTCDataChannel *)(channel))->lock) -#define GST_WEBRTC_DATA_CHANNEL_UNLOCK(channel) g_mutex_unlock(&((GstWebRTCDataChannel *)(channel))->lock) - -/** - * GstWebRTCDataChannel: - * - * Since: 1.18 - */ -struct _GstWebRTCDataChannel -{ - GObject parent; - - GMutex lock; - - gchar *label; - gboolean ordered; - guint max_packet_lifetime; - guint max_retransmits; - gchar *protocol; - gboolean negotiated; - gint id; - GstWebRTCPriorityType priority; - GstWebRTCDataChannelState ready_state; - guint64 buffered_amount; - guint64 buffered_amount_low_threshold; - - gpointer _padding[GST_PADDING]; -}; - -/** - * GstWebRTCDataChannelClass: - * - * Since: 1.18 - */ -struct _GstWebRTCDataChannelClass -{ - GObjectClass parent_class; - - void (*send_data) (GstWebRTCDataChannel * channel, GBytes *data); - void (*send_string) (GstWebRTCDataChannel * channel, const gchar *str); - void (*close) (GstWebRTCDataChannel * channel); - - gpointer _padding[GST_PADDING]; -}; - -GST_WEBRTC_API -void gst_webrtc_data_channel_on_open (GstWebRTCDataChannel * channel); - -GST_WEBRTC_API -void gst_webrtc_data_channel_on_close (GstWebRTCDataChannel * channel); - -GST_WEBRTC_API -void gst_webrtc_data_channel_on_error (GstWebRTCDataChannel * channel, GError * error); - -GST_WEBRTC_API -void gst_webrtc_data_channel_on_message_data (GstWebRTCDataChannel * channel, GBytes * data); - -GST_WEBRTC_API -void gst_webrtc_data_channel_on_message_string (GstWebRTCDataChannel * channel, const gchar * str); - -GST_WEBRTC_API -void gst_webrtc_data_channel_on_buffered_amount_low (GstWebRTCDataChannel * channel); - GST_WEBRTC_API void gst_webrtc_data_channel_send_data 
(GstWebRTCDataChannel * channel, GBytes * data);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/dtlstransport.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/dtlstransport.c
Changed
@@ -32,6 +32,8 @@ #include "dtlstransport.h" +#include "webrtc-priv.h" + #define GST_CAT_DEFAULT gst_webrtc_dtls_transport_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -55,20 +57,26 @@ PROP_STATE, PROP_CLIENT, PROP_CERTIFICATE, - PROP_REMOTE_CERTIFICATE, - PROP_RTCP, + PROP_REMOTE_CERTIFICATE }; void gst_webrtc_dtls_transport_set_transport (GstWebRTCDTLSTransport * transport, GstWebRTCICETransport * ice) { + gboolean notify = FALSE; + g_return_if_fail (GST_IS_WEBRTC_DTLS_TRANSPORT (transport)); g_return_if_fail (GST_IS_WEBRTC_ICE_TRANSPORT (ice)); GST_OBJECT_LOCK (transport); - gst_object_replace ((GstObject **) & transport->transport, GST_OBJECT (ice)); + notify = + gst_object_replace ((GstObject **) & transport->transport, + GST_OBJECT (ice)); GST_OBJECT_UNLOCK (transport); + + if (notify) + g_object_notify (G_OBJECT (transport), "transport"); } static void @@ -88,9 +96,6 @@ case PROP_CERTIFICATE: g_object_set_property (G_OBJECT (webrtc->dtlssrtpdec), "pem", value); break; - case PROP_RTCP: - webrtc->is_rtcp = g_value_get_boolean (value); - break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -123,9 +128,6 @@ case PROP_REMOTE_CERTIFICATE: g_object_get_property (G_OBJECT (webrtc->dtlssrtpdec), "peer-pem", value); break; - case PROP_RTCP: - g_value_set_boolean (value, webrtc->is_rtcp); - break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -184,12 +186,12 @@ /* XXX: this may collide with another connection_id however this is only a * problem if multiple dtls element sets are being used within the same * process */ - connection_id = g_strdup_printf ("%s_%u_%u", webrtc->is_rtcp ? 
"rtcp" : "rtp", - webrtc->session_id, g_random_int ()); + connection_id = g_strdup_printf ("rtp_%u_%u", webrtc->session_id, + g_random_int ()); webrtc->dtlssrtpenc = gst_element_factory_make ("dtlssrtpenc", NULL); g_object_set (webrtc->dtlssrtpenc, "connection-id", connection_id, - "is-client", webrtc->client, "rtp-sync", TRUE, NULL); + "is-client", webrtc->client, "rtp-sync", FALSE, NULL); webrtc->dtlssrtpdec = gst_element_factory_make ("dtlssrtpdec", NULL); g_object_set (webrtc->dtlssrtpdec, "connection-id", connection_id, NULL); @@ -249,12 +251,6 @@ g_param_spec_string ("remote-certificate", "Remote DTLS certificate", "Remote DTLS certificate", NULL, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); - - g_object_class_install_property (gobject_class, - PROP_RTCP, - g_param_spec_boolean ("rtcp", "RTCP", - "The transport is being used solely for RTCP", FALSE, - G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); } static void @@ -263,8 +259,8 @@ } GstWebRTCDTLSTransport * -gst_webrtc_dtls_transport_new (guint session_id, gboolean is_rtcp) +gst_webrtc_dtls_transport_new (guint session_id) { return g_object_new (GST_TYPE_WEBRTC_DTLS_TRANSPORT, "session-id", session_id, - "rtcp", is_rtcp, NULL); + NULL); }
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/dtlstransport.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/dtlstransport.h
Changed
@@ -35,39 +35,6 @@ #define GST_IS_WEBRTC_DTLS_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_DTLS_TRANSPORT)) #define GST_WEBRTC_DTLS_TRANSPORT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_DTLS_TRANSPORT,GstWebRTCDTLSTransportClass)) -/** - * GstWebRTCDTLSTransport: - */ -struct _GstWebRTCDTLSTransport -{ - GstObject parent; - - GstWebRTCICETransport *transport; - GstWebRTCDTLSTransportState state; - - gboolean is_rtcp; - gboolean client; - guint session_id; - GstElement *dtlssrtpenc; - GstElement *dtlssrtpdec; - - gpointer _padding[GST_PADDING]; -}; - -struct _GstWebRTCDTLSTransportClass -{ - GstObjectClass parent_class; - - gpointer _padding[GST_PADDING]; -}; - -GST_WEBRTC_API -GstWebRTCDTLSTransport * gst_webrtc_dtls_transport_new (guint session_id, gboolean rtcp); - -GST_WEBRTC_API -void gst_webrtc_dtls_transport_set_transport (GstWebRTCDTLSTransport * transport, - GstWebRTCICETransport * ice); - G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstWebRTCDTLSTransport, gst_object_unref) G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/icetransport.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/icetransport.c
Changed
@@ -33,6 +33,8 @@ #include "icetransport.h" #include "webrtc-enumtypes.h" +#include "webrtc-priv.h" + #define GST_CAT_DEFAULT gst_webrtc_ice_transport_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT);
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/icetransport.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/icetransport.h
Changed
@@ -34,46 +34,6 @@ #define GST_IS_WEBRTC_ICE_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_ICE_TRANSPORT)) #define GST_WEBRTC_ICE_TRANSPORT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_ICE_TRANSPORT,GstWebRTCICETransportClass)) -/** - * GstWebRTCICETransport: - */ -struct _GstWebRTCICETransport -{ - GstObject parent; - - GstWebRTCICERole role; - GstWebRTCICEComponent component; - - GstWebRTCICEConnectionState state; - GstWebRTCICEGatheringState gathering_state; - - /* Filled by subclasses */ - GstElement *src; - GstElement *sink; - - gpointer _padding[GST_PADDING]; -}; - -struct _GstWebRTCICETransportClass -{ - GstObjectClass parent_class; - - gboolean (*gather_candidates) (GstWebRTCICETransport * transport); - - gpointer _padding[GST_PADDING]; -}; - -GST_WEBRTC_API -void gst_webrtc_ice_transport_connection_state_change (GstWebRTCICETransport * ice, - GstWebRTCICEConnectionState new_state); -GST_WEBRTC_API -void gst_webrtc_ice_transport_gathering_state_change (GstWebRTCICETransport * ice, - GstWebRTCICEGatheringState new_state); -GST_WEBRTC_API -void gst_webrtc_ice_transport_selected_pair_change (GstWebRTCICETransport * ice); -GST_WEBRTC_API -void gst_webrtc_ice_transport_new_candidate (GstWebRTCICETransport * ice, guint stream_id, GstWebRTCICEComponent component, gchar * attr); - G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstWebRTCICETransport, gst_object_unref) G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/meson.build
Changed
@@ -1,4 +1,4 @@ -webrtc_sources = [ +webrtc_sources = files([ 'dtlstransport.c', 'icetransport.c', 'rtcsessiondescription.c', @@ -6,9 +6,11 @@ 'rtpsender.c', 'rtptransceiver.c', 'datachannel.c', -] + 'sctptransport.c', + 'webrtc.c', +]) -webrtc_headers = [ +webrtc_headers = files([ 'dtlstransport.h', 'icetransport.h', 'rtcsessiondescription.h', @@ -18,14 +20,15 @@ 'datachannel.h', 'webrtc_fwd.h', 'webrtc.h', -] + 'sctptransport.h', +]) -webrtc_enumtypes_headers = [ +webrtc_enumtypes_headers = files([ 'dtlstransport.h', 'icetransport.h', 'rtptransceiver.h', 'webrtc_fwd.h', -] +]) webrtc_enums = gnome.mkenums_simple('webrtc-enumtypes', sources : webrtc_enumtypes_headers, @@ -44,7 +47,7 @@ gstwebrtc = library('gstwebrtc-' + api_version, webrtc_sources, gstwebrtc_c, gstwebrtc_h, - c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_WEBRTC'], + c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API', '-DBUILDING_GST_WEBRTC', '-DG_LOG_DOMAIN="GStreamer-WebRTC"'], include_directories : [configinc, libsinc], version : libversion, soversion : soversion, @@ -53,21 +56,36 @@ dependencies : gstwebrtc_dependencies, ) +library_def = {'lib': gstwebrtc} +pkg_name = 'gstreamer-webrtc-1.0' +pkgconfig.generate(gstwebrtc, + libraries : [gst_dep, gstbase_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : pkg_name, + description : 'GStreamer WebRTC support', +) + if build_gir - webrtc_gir = gnome.generate_gir(gstwebrtc, - sources : webrtc_sources + webrtc_headers + [gstwebrtc_h], - namespace : 'GstWebRTC', - nsversion : api_version, - identifier_prefix : 'Gst', - symbol_prefix : 'gst', - export_packages : 'gstreamer-webrtc-1.0', - includes : ['Gst-1.0', 'GstSdp-1.0'], - install : true, - extra_args : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/webrtc/webrtc.h'], - dependencies : [gstbase_dep, gstsdp_dep] - ) - webrtc_gen_sources += webrtc_gir + gir = { + 'sources' : webrtc_sources + webrtc_headers + 
[gstwebrtc_h], + 'namespace' : 'GstWebRTC', + 'nsversion' : api_version, + 'identifier_prefix' : 'Gst', + 'symbol_prefix' : 'gst', + 'export_packages' : pkg_name, + 'includes' : ['Gst-1.0', 'GstSdp-1.0'], + 'install' : true, + 'extra_args' : gir_init_section + ['-DGST_USE_UNSTABLE_API'] + ['--c-include=gst/webrtc/webrtc.h'], + 'dependencies' : gstwebrtc_dependencies, + } + library_def = {'lib': library_def['lib'], 'gir': [gir]} + if not static_build + webrtc_gir = gnome.generate_gir(gstwebrtc, kwargs: gir) + webrtc_gen_sources += webrtc_gir + endif endif +libraries += [[pkg_name, library_def]] install_headers(webrtc_headers, subdir : 'gstreamer-1.0/gst/webrtc') @@ -75,3 +93,5 @@ include_directories : libsinc, sources: webrtc_gen_sources, dependencies: gstwebrtc_dependencies) + +meson.override_dependency(pkg_name, gstwebrtc_dep)
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/rtpreceiver.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/rtpreceiver.c
Changed
@@ -31,6 +31,7 @@ #endif #include "rtpreceiver.h" +#include "webrtc-priv.h" #define GST_CAT_DEFAULT gst_webrtc_rtp_receiver_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -49,36 +50,11 @@ enum { PROP_0, + PROP_TRANSPORT, }; //static guint gst_webrtc_rtp_receiver_signals[LAST_SIGNAL] = { 0 }; -void -gst_webrtc_rtp_receiver_set_transport (GstWebRTCRTPReceiver * receiver, - GstWebRTCDTLSTransport * transport) -{ - g_return_if_fail (GST_IS_WEBRTC_RTP_RECEIVER (receiver)); - g_return_if_fail (GST_IS_WEBRTC_DTLS_TRANSPORT (transport)); - - GST_OBJECT_LOCK (receiver); - gst_object_replace ((GstObject **) & receiver->transport, - GST_OBJECT (transport)); - GST_OBJECT_UNLOCK (receiver); -} - -void -gst_webrtc_rtp_receiver_set_rtcp_transport (GstWebRTCRTPReceiver * receiver, - GstWebRTCDTLSTransport * transport) -{ - g_return_if_fail (GST_IS_WEBRTC_RTP_RECEIVER (receiver)); - g_return_if_fail (GST_IS_WEBRTC_DTLS_TRANSPORT (transport)); - - GST_OBJECT_LOCK (receiver); - gst_object_replace ((GstObject **) & receiver->rtcp_transport, - GST_OBJECT (transport)); - GST_OBJECT_UNLOCK (receiver); -} - static void gst_webrtc_rtp_receiver_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) @@ -94,7 +70,13 @@ gst_webrtc_rtp_receiver_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { + GstWebRTCRTPReceiver *receiver = GST_WEBRTC_RTP_RECEIVER (object); switch (prop_id) { + case PROP_TRANSPORT: + GST_OBJECT_LOCK (receiver); + g_value_set_object (value, receiver->transport); + GST_OBJECT_UNLOCK (receiver); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -110,10 +92,6 @@ gst_object_unref (webrtc->transport); webrtc->transport = NULL; - if (webrtc->rtcp_transport) - gst_object_unref (webrtc->rtcp_transport); - webrtc->rtcp_transport = NULL; - G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -125,6 +103,20 @@ gobject_class->get_property = 
gst_webrtc_rtp_receiver_get_property; gobject_class->set_property = gst_webrtc_rtp_receiver_set_property; gobject_class->finalize = gst_webrtc_rtp_receiver_finalize; + + /** + * GstWebRTCRTPReceiver:transport: + * + * The DTLS transport for this receiver + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_TRANSPORT, + g_param_spec_object ("transport", "Transport", + "The DTLS transport for this receiver", + GST_TYPE_WEBRTC_DTLS_TRANSPORT, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); } static void
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/rtpreceiver.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/rtpreceiver.h
Changed
@@ -35,36 +35,6 @@ #define GST_IS_WEBRTC_RTP_RECEIVER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_RTP_RECEIVER)) #define GST_WEBRTC_RTP_RECEIVER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_RTP_RECEIVER,GstWebRTCRTPReceiverClass)) -/** - * GstWebRTCRTPReceiver: - */ -struct _GstWebRTCRTPReceiver -{ - GstObject parent; - - /* The MediStreamTrack is represented by the stream and is output into @transport/@rtcp_transport as necessary */ - GstWebRTCDTLSTransport *transport; - GstWebRTCDTLSTransport *rtcp_transport; - - gpointer _padding[GST_PADDING]; -}; - -struct _GstWebRTCRTPReceiverClass -{ - GstObjectClass parent_class; - - gpointer _padding[GST_PADDING]; -}; - -GST_WEBRTC_API -GstWebRTCRTPReceiver * gst_webrtc_rtp_receiver_new (void); -GST_WEBRTC_API -void gst_webrtc_rtp_receiver_set_transport (GstWebRTCRTPReceiver * receiver, - GstWebRTCDTLSTransport * transport); -GST_WEBRTC_API -void gst_webrtc_rtp_receiver_set_rtcp_transport (GstWebRTCRTPReceiver * receiver, - GstWebRTCDTLSTransport * transport); - G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstWebRTCRTPReceiver, gst_object_unref) G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/rtpsender.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/rtpsender.c
Changed
@@ -32,6 +32,7 @@ #include "rtpsender.h" #include "rtptransceiver.h" +#include "webrtc-priv.h" #define GST_CAT_DEFAULT gst_webrtc_rtp_sender_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -51,45 +52,44 @@ enum { PROP_0, - PROP_MID, - PROP_SENDER, - PROP_STOPPED, - PROP_DIRECTION, + PROP_PRIORITY, + PROP_TRANSPORT, }; //static guint gst_webrtc_rtp_sender_signals[LAST_SIGNAL] = { 0 }; -void -gst_webrtc_rtp_sender_set_transport (GstWebRTCRTPSender * sender, - GstWebRTCDTLSTransport * transport) -{ - g_return_if_fail (GST_IS_WEBRTC_RTP_SENDER (sender)); - g_return_if_fail (GST_IS_WEBRTC_DTLS_TRANSPORT (transport)); - - GST_OBJECT_LOCK (sender); - gst_object_replace ((GstObject **) & sender->transport, - GST_OBJECT (transport)); - GST_OBJECT_UNLOCK (sender); -} +/** + * gst_webrtc_rtp_sender_set_priority: + * @sender: a #GstWebRTCRTPSender + * @priority: The priority of this sender + * + * Sets the content of the IPv4 Type of Service (ToS), also known as DSCP + * (Differentiated Services Code Point). + * This also sets the Traffic Class field of IPv6. 
+ * + * Since: 1.20 + */ void -gst_webrtc_rtp_sender_set_rtcp_transport (GstWebRTCRTPSender * sender, - GstWebRTCDTLSTransport * transport) +gst_webrtc_rtp_sender_set_priority (GstWebRTCRTPSender * sender, + GstWebRTCPriorityType priority) { - g_return_if_fail (GST_IS_WEBRTC_RTP_SENDER (sender)); - g_return_if_fail (GST_IS_WEBRTC_DTLS_TRANSPORT (transport)); - GST_OBJECT_LOCK (sender); - gst_object_replace ((GstObject **) & sender->rtcp_transport, - GST_OBJECT (transport)); + sender->priority = priority; GST_OBJECT_UNLOCK (sender); + g_object_notify (G_OBJECT (sender), "priority"); } static void gst_webrtc_rtp_sender_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { + GstWebRTCRTPSender *sender = GST_WEBRTC_RTP_SENDER (object); + switch (prop_id) { + case PROP_PRIORITY: + gst_webrtc_rtp_sender_set_priority (sender, g_value_get_uint (value)); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -100,7 +100,19 @@ gst_webrtc_rtp_sender_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { + GstWebRTCRTPSender *sender = GST_WEBRTC_RTP_SENDER (object); + switch (prop_id) { + case PROP_PRIORITY: + GST_OBJECT_LOCK (sender); + g_value_set_uint (value, sender->priority); + GST_OBJECT_UNLOCK (sender); + break; + case PROP_TRANSPORT: + GST_OBJECT_LOCK (sender); + g_value_set_object (value, sender->transport); + GST_OBJECT_UNLOCK (sender); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -110,15 +122,11 @@ static void gst_webrtc_rtp_sender_finalize (GObject * object) { - GstWebRTCRTPSender *webrtc = GST_WEBRTC_RTP_SENDER (object); + GstWebRTCRTPSender *sender = GST_WEBRTC_RTP_SENDER (object); - if (webrtc->transport) - gst_object_unref (webrtc->transport); - webrtc->transport = NULL; - - if (webrtc->rtcp_transport) - gst_object_unref (webrtc->rtcp_transport); - webrtc->rtcp_transport = NULL; + if (sender->transport) + 
gst_object_unref (sender->transport); + sender->transport = NULL; G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -131,6 +139,35 @@ gobject_class->get_property = gst_webrtc_rtp_sender_get_property; gobject_class->set_property = gst_webrtc_rtp_sender_set_property; gobject_class->finalize = gst_webrtc_rtp_sender_finalize; + + /** + * GstWebRTCRTPSender:priority: + * + * The priority from which to set the DSCP field on packets + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_PRIORITY, + g_param_spec_enum ("priority", + "Priority", + "The priority from which to set the DSCP field on packets", + GST_TYPE_WEBRTC_PRIORITY_TYPE, GST_WEBRTC_PRIORITY_TYPE_LOW, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCRTPSender:transport: + * + * The DTLS transport for this sender + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_TRANSPORT, + g_param_spec_object ("transport", "Transport", + "The DTLS transport for this sender", + GST_TYPE_WEBRTC_DTLS_TRANSPORT, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); } static void
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/rtpsender.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/rtpsender.h
Changed
@@ -35,39 +35,9 @@ #define GST_IS_WEBRTC_RTP_SENDER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_RTP_SENDER)) #define GST_WEBRTC_RTP_SENDER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_RTP_SENDER,GstWebRTCRTPSenderClass)) -/** - * GstWebRTCRTPSender: - */ -struct _GstWebRTCRTPSender -{ - GstObject parent; - - /* The MediStreamTrack is represented by the stream and is output into @transport/@rtcp_transport as necessary */ - GstWebRTCDTLSTransport *transport; - GstWebRTCDTLSTransport *rtcp_transport; - - GArray *send_encodings; - - gpointer _padding[GST_PADDING]; -}; - -struct _GstWebRTCRTPSenderClass -{ - GstObjectClass parent_class; - - gpointer _padding[GST_PADDING]; -}; - -GST_WEBRTC_API -GstWebRTCRTPSender * gst_webrtc_rtp_sender_new (void); - GST_WEBRTC_API -void gst_webrtc_rtp_sender_set_transport (GstWebRTCRTPSender * sender, - GstWebRTCDTLSTransport * transport); -GST_WEBRTC_API -void gst_webrtc_rtp_sender_set_rtcp_transport (GstWebRTCRTPSender * sender, - GstWebRTCDTLSTransport * transport); - +void gst_webrtc_rtp_sender_set_priority (GstWebRTCRTPSender *sender, + GstWebRTCPriorityType priority); G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstWebRTCRTPSender, gst_object_unref)
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/rtptransceiver.c -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/rtptransceiver.c
Changed
@@ -32,6 +32,8 @@ #include "rtptransceiver.h" +#include "webrtc-priv.h" + #define GST_CAT_DEFAULT gst_webrtc_rtp_transceiver_debug GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); @@ -51,11 +53,14 @@ enum { PROP_0, - PROP_MID, PROP_SENDER, PROP_RECEIVER, PROP_DIRECTION, PROP_MLINE, + PROP_MID, + PROP_CURRENT_DIRECTION, + PROP_KIND, + PROP_CODEC_PREFERENCES, PROP_STOPPED, // FIXME }; @@ -78,7 +83,14 @@ webrtc->mline = g_value_get_uint (value); break; case PROP_DIRECTION: + GST_OBJECT_LOCK (webrtc); webrtc->direction = g_value_get_enum (value); + GST_OBJECT_UNLOCK (webrtc); + break; + case PROP_CODEC_PREFERENCES: + GST_OBJECT_LOCK (webrtc); + gst_caps_replace (&webrtc->codec_preferences, g_value_get_boxed (value)); + GST_OBJECT_UNLOCK (webrtc); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); @@ -93,6 +105,9 @@ GstWebRTCRTPTransceiver *webrtc = GST_WEBRTC_RTP_TRANSCEIVER (object); switch (prop_id) { + case PROP_MID: + g_value_set_string (value, webrtc->mid); + break; case PROP_SENDER: g_value_set_object (value, webrtc->sender); break; @@ -103,7 +118,20 @@ g_value_set_uint (value, webrtc->mline); break; case PROP_DIRECTION: + GST_OBJECT_LOCK (webrtc); g_value_set_enum (value, webrtc->direction); + GST_OBJECT_UNLOCK (webrtc); + break; + case PROP_CURRENT_DIRECTION: + g_value_set_enum (value, webrtc->current_direction); + break; + case PROP_KIND: + g_value_set_enum (value, webrtc->kind); + break; + case PROP_CODEC_PREFERENCES: + GST_OBJECT_LOCK (webrtc); + gst_value_set_caps (value, webrtc->codec_preferences); + GST_OBJECT_UNLOCK (webrtc); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); @@ -199,6 +227,73 @@ GST_TYPE_WEBRTC_RTP_TRANSCEIVER_DIRECTION, GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCRTPTransceiver:mid: + * + * The media ID of the m-line associated with this transceiver. 
This + * association is established, when possible, whenever either a + * local or remote description is applied. This field is null if + * neither a local or remote description has been applied, or if its + * associated m-line is rejected by either a remote offer or any + * answer. + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, + PROP_MID, + g_param_spec_string ("mid", "Media ID", + "The media ID of the m-line associated with this transceiver. This " + " association is established, when possible, whenever either a local" + " or remote description is applied. This field is null if neither a" + " local or remote description has been applied, or if its associated" + " m-line is rejected by either a remote offer or any answer.", + NULL, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCRTPTransceiver:current-direction: + * + * The transceiver's current directionality, or none if the + * transceiver is stopped or has never participated in an exchange + * of offers and answers. To change the transceiver's + * directionality, set the value of the direction property. + * + * Since: 1.20 + **/ + g_object_class_install_property (gobject_class, + PROP_DIRECTION, + g_param_spec_enum ("current-direction", "Current Direction", + "Transceiver current direction", + GST_TYPE_WEBRTC_RTP_TRANSCEIVER_DIRECTION, + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_NONE, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCRTPTransceiver:kind: + * + * The kind of media this transceiver transports + * + * Since: 1.20 + **/ + g_object_class_install_property (gobject_class, + PROP_KIND, + g_param_spec_enum ("kind", "Media Kind", + "Kind of media this transceiver transports", + GST_TYPE_WEBRTC_KIND, GST_WEBRTC_KIND_UNKNOWN, + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); + + /** + * GstWebRTCRTPTransceiver:codec-preferences: + * + * Caps representing the codec preferences. 
+ * + * Since: 1.20 + **/ + g_object_class_install_property (gobject_class, + PROP_CODEC_PREFERENCES, + g_param_spec_boxed ("codec-preferences", "Codec Preferences", + "Caps representing the codec preferences.", + GST_TYPE_CAPS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/rtptransceiver.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/rtptransceiver.h
Changed
@@ -22,8 +22,6 @@ #include <gst/gst.h> #include <gst/webrtc/webrtc_fwd.h> -#include <gst/webrtc/rtpsender.h> -#include <gst/webrtc/rtpreceiver.h> G_BEGIN_DECLS @@ -36,35 +34,6 @@ #define GST_IS_WEBRTC_RTP_TRANSCEIVER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_RTP_TRANSCEIVER)) #define GST_WEBRTC_RTP_TRANSCEIVER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_RTP_TRANSCEIVER,GstWebRTCRTPTransceiverClass)) -/** - * GstWebRTCRTPTransceiver: - */ -struct _GstWebRTCRTPTransceiver -{ - GstObject parent; - guint mline; - gchar *mid; - gboolean stopped; - - GstWebRTCRTPSender *sender; - GstWebRTCRTPReceiver *receiver; - - GstWebRTCRTPTransceiverDirection direction; - GstWebRTCRTPTransceiverDirection current_direction; - - GstCaps *codec_preferences; - - gpointer _padding[GST_PADDING]; -}; - -struct _GstWebRTCRTPTransceiverClass -{ - GstObjectClass parent_class; - - /* FIXME; reset */ - gpointer _padding[GST_PADDING]; -}; - G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstWebRTCRTPTransceiver, gst_object_unref) G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/sctptransport.c
Added
@@ -0,0 +1,79 @@
+/* GStreamer
+ * Copyright (C) 2018 Matthew Waters <matthew@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+# include "config.h"
+#endif
+
+#include "sctptransport.h"
+#include "webrtc-priv.h"
+
+G_DEFINE_ABSTRACT_TYPE (GstWebRTCSCTPTransport, gst_webrtc_sctp_transport,
+    GST_TYPE_OBJECT);
+
+static void
+gst_webrtc_sctp_transport_get_property (GObject * object, guint prop_id,
+    GValue * value, GParamSpec * pspec)
+{
+  /* all properties should be handled by the plugin class */
+  g_assert_not_reached ();
+}
+
+static void
+gst_webrtc_sctp_transport_class_init (GstWebRTCSCTPTransportClass * klass)
+{
+  GObjectClass *gobject_class = (GObjectClass *) klass;
+  guint property_id_dummy = 0;
+
+  gobject_class->get_property = gst_webrtc_sctp_transport_get_property;
+
+  g_object_class_install_property (gobject_class,
+      ++property_id_dummy,
+      g_param_spec_object ("transport",
+          "WebRTC DTLS Transport",
+          "DTLS transport used for this SCTP transport",
+          GST_TYPE_WEBRTC_DTLS_TRANSPORT,
+          G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
+
+  g_object_class_install_property (gobject_class,
+      ++property_id_dummy,
+      g_param_spec_enum ("state",
+          "WebRTC SCTP Transport state", "WebRTC SCTP Transport state",
+          GST_TYPE_WEBRTC_SCTP_TRANSPORT_STATE,
+          GST_WEBRTC_SCTP_TRANSPORT_STATE_NEW,
+          G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
+
+  g_object_class_install_property (gobject_class,
+      ++property_id_dummy,
+      g_param_spec_uint64 ("max-message-size",
+          "Maximum message size",
+          "Maximum message size as reported by the transport", 0, G_MAXUINT64,
+          0, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
+
+  g_object_class_install_property (gobject_class,
+      ++property_id_dummy,
+      g_param_spec_uint ("max-channels",
+          "Maximum number of channels", "Maximum number of channels",
+          0, G_MAXUINT16, 0, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS));
+}
+
+static void
+gst_webrtc_sctp_transport_init (GstWebRTCSCTPTransport * nice)
+{
+}
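The class_init above registers each read-only property with a pre-incremented `property_id_dummy` counter rather than a hand-maintained enum. A minimal sketch of why that is valid (GObject property IDs only need to be unique and greater than zero; the helper name here is illustrative, not GStreamer API):

```c
#include <assert.h>

/* Sketch of the ++property_id_dummy idiom used in the class_init above:
 * pre-incrementing a counter that starts at 0 yields 1, 2, 3, ...,
 * which satisfies GObject's requirement that property IDs be non-zero
 * and unique, without numbering an enum by hand. */
static unsigned int
next_property_id (unsigned int *counter)
{
  return ++*counter;
}
```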
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/sctptransport.h
Added
@@ -0,0 +1,42 @@
+/* GStreamer
+ * Copyright (C) 2018 Matthew Waters <matthew@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_WEBRTC_SCTP_TRANSPORT_H__
+#define __GST_WEBRTC_SCTP_TRANSPORT_H__
+
+#include <gst/gst.h>
+#include <gst/webrtc/webrtc_fwd.h>
+
+G_BEGIN_DECLS
+
+GST_WEBRTC_API
+GType gst_webrtc_sctp_transport_get_type(void);
+
+#define GST_TYPE_WEBRTC_SCTP_TRANSPORT (gst_webrtc_sctp_transport_get_type())
+#define GST_WEBRTC_SCTP_TRANSPORT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_WEBRTC_SCTP_TRANSPORT,GstWebRTCSCTPTransport))
+#define GST_IS_WEBRTC_SCTP_TRANSPORT(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_WEBRTC_SCTP_TRANSPORT))
+#define GST_WEBRTC_SCTP_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass) ,GST_TYPE_WEBRTC_SCTP_TRANSPORT,GstWebRTCSCTPTransportClass))
+#define GST_IS_WEBRTC_SCTP_TRANSPORT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_WEBRTC_SCTP_TRANSPORT))
+#define GST_WEBRTC_SCTP_TRANSPORT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_WEBRTC_SCTP_TRANSPORT,GstWebRTCSCTPTransportClass))
+
+G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstWebRTCSCTPTransport, gst_object_unref)
+
+G_END_DECLS
+
+#endif /* __GST_WEBRTC_SCTP_TRANSPORT_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/webrtc-priv.h
Added
@@ -0,0 +1,315 @@
+/* GStreamer
+ * Copyright (C) 2017 Matthew Waters <matthew@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_WEBRTC_PRIV_H__
+#define __GST_WEBRTC_PRIV_H__
+
+#include <gst/gst.h>
+#include <gst/webrtc/webrtc_fwd.h>
+#include <gst/webrtc/rtpsender.h>
+#include <gst/webrtc/rtpreceiver.h>
+
+G_BEGIN_DECLS
+
+/**
+ * GstWebRTCRTPTransceiver:
+ * @mline: the mline number this transceiver corresponds to
+ * @mid: The media ID of the m-line associated with this
+ * transceiver. This association is established, when possible,
+ * whenever either a local or remote description is applied. This
+ * field is NULL if neither a local or remote description has been
+ * applied, or if its associated m-line is rejected by either a remote
+ * offer or any answer.
+ * @stopped: Indicates whether or not sending and receiving using the paired
+ * #GstWebRTCRTPSender and #GstWebRTCRTPReceiver has been permanently disabled,
+ * either due to SDP offer/answer
+ * @sender: The #GstWebRTCRTPSender object responsible for sending data to the
+ * remote peer
+ * @receiver: The #GstWebRTCRTPReceiver object responsible for receiving data from
+ * the remote peer.
+ * @direction: The transceiver's desired direction.
+ * @current_direction: The transceiver's current direction (read-only)
+ * @codec_preferences: A caps representing the codec preferences (read-only)
+ * @kind: Type of media (Since: 1.20)
+ *
+ * Mostly matches the WebRTC RTCRtpTransceiver interface.
+ */
+/**
+ * GstWebRTCRTPTransceiver.kind:
+ *
+ * Type of media
+ *
+ * Since: 1.20
+ */
+struct _GstWebRTCRTPTransceiver
+{
+  GstObject parent;
+  guint mline;
+  gchar *mid;
+  gboolean stopped;
+
+  GstWebRTCRTPSender *sender;
+  GstWebRTCRTPReceiver *receiver;
+
+  GstWebRTCRTPTransceiverDirection direction;
+  GstWebRTCRTPTransceiverDirection current_direction;
+
+  GstCaps *codec_preferences;
+  GstWebRTCKind kind;
+
+  gpointer _padding[GST_PADDING];
+};
+
+struct _GstWebRTCRTPTransceiverClass
+{
+  GstObjectClass parent_class;
+
+  /* FIXME; reset */
+  gpointer _padding[GST_PADDING];
+};
+
+/**
+ * GstWebRTCRTPSender:
+ * @transport: The transport for RTP packets
+ * @send_encodings: Unused
+ * @priority: The priority of the stream (Since: 1.20)
+ *
+ * An object to track the sending aspect of the stream
+ *
+ * Mostly matches the WebRTC RTCRtpSender interface.
+ */
+/**
+ * GstWebRTCRTPSender.priority:
+ *
+ * The priority of the stream
+ *
+ * Since: 1.20
+ */
+struct _GstWebRTCRTPSender
+{
+  GstObject parent;
+
+  /* The MediaStreamTrack is represented by the stream and is output into @transport as necessary */
+  GstWebRTCDTLSTransport *transport;
+
+  GArray *send_encodings;
+  GstWebRTCPriorityType priority;
+
+  gpointer _padding[GST_PADDING];
+};
+
+struct _GstWebRTCRTPSenderClass
+{
+  GstObjectClass parent_class;
+
+  gpointer _padding[GST_PADDING];
+};
+
+GST_WEBRTC_API
+GstWebRTCRTPSender * gst_webrtc_rtp_sender_new (void);
+
+/**
+ * GstWebRTCRTPReceiver:
+ * @transport: The transport for RTP packets
+ *
+ * An object to track the receiving aspect of the stream
+ *
+ * Mostly matches the WebRTC RTCRtpReceiver interface.
+ */
+struct _GstWebRTCRTPReceiver
+{
+  GstObject parent;
+
+  /* The MediaStreamTrack is represented by the stream and is output into @transport as necessary */
+  GstWebRTCDTLSTransport *transport;
+
+  gpointer _padding[GST_PADDING];
+};
+
+struct _GstWebRTCRTPReceiverClass
+{
+  GstObjectClass parent_class;
+
+  gpointer _padding[GST_PADDING];
+};
+
+GST_WEBRTC_API
+GstWebRTCRTPReceiver * gst_webrtc_rtp_receiver_new (void);
+
+
+/**
+ * GstWebRTCICETransport:
+ */
+struct _GstWebRTCICETransport
+{
+  GstObject parent;
+
+  GstWebRTCICERole role;
+  GstWebRTCICEComponent component;
+
+  GstWebRTCICEConnectionState state;
+  GstWebRTCICEGatheringState gathering_state;
+
+  /* Filled by subclasses */
+  GstElement *src;
+  GstElement *sink;
+
+  gpointer _padding[GST_PADDING];
+};
+
+struct _GstWebRTCICETransportClass
+{
+  GstObjectClass parent_class;
+
+  gboolean (*gather_candidates) (GstWebRTCICETransport * transport);
+
+  gpointer _padding[GST_PADDING];
+};
+
+GST_WEBRTC_API
+void gst_webrtc_ice_transport_connection_state_change (GstWebRTCICETransport * ice,
+    GstWebRTCICEConnectionState new_state);
+GST_WEBRTC_API
+void gst_webrtc_ice_transport_gathering_state_change (GstWebRTCICETransport * ice,
+    GstWebRTCICEGatheringState new_state);
+GST_WEBRTC_API
+void gst_webrtc_ice_transport_selected_pair_change (GstWebRTCICETransport * ice);
+GST_WEBRTC_API
+void gst_webrtc_ice_transport_new_candidate (GstWebRTCICETransport * ice, guint stream_id, GstWebRTCICEComponent component, gchar * attr);
+
+/**
+ * GstWebRTCDTLSTransport:
+ */
+struct _GstWebRTCDTLSTransport
+{
+  GstObject parent;
+
+  GstWebRTCICETransport *transport;
+  GstWebRTCDTLSTransportState state;
+
+  gboolean client;
+  guint session_id;
+  GstElement *dtlssrtpenc;
+  GstElement *dtlssrtpdec;
+
+  gpointer _padding[GST_PADDING];
+};
+
+struct _GstWebRTCDTLSTransportClass
+{
+  GstObjectClass parent_class;
+
+  gpointer _padding[GST_PADDING];
+};
+
+GST_WEBRTC_API
+GstWebRTCDTLSTransport * gst_webrtc_dtls_transport_new (guint session_id);
+
+GST_WEBRTC_API
+void gst_webrtc_dtls_transport_set_transport (GstWebRTCDTLSTransport * transport,
+    GstWebRTCICETransport * ice);
+
+#define GST_WEBRTC_DATA_CHANNEL_LOCK(channel) g_mutex_lock(&((GstWebRTCDataChannel *)(channel))->lock)
+#define GST_WEBRTC_DATA_CHANNEL_UNLOCK(channel) g_mutex_unlock(&((GstWebRTCDataChannel *)(channel))->lock)
+
+/**
+ * GstWebRTCDataChannel:
+ *
+ * Since: 1.18
+ */
+struct _GstWebRTCDataChannel
+{
+  GObject parent;
+
+  GMutex lock;
+
+  gchar *label;
+  gboolean ordered;
+  guint max_packet_lifetime;
+  guint max_retransmits;
+  gchar *protocol;
+  gboolean negotiated;
+  gint id;
+  GstWebRTCPriorityType priority;
+  GstWebRTCDataChannelState ready_state;
+  guint64 buffered_amount;
+  guint64 buffered_amount_low_threshold;
+
+  gpointer _padding[GST_PADDING];
+};
+
+/**
+ * GstWebRTCDataChannelClass:
+ *
+ * Since: 1.18
+ */
+struct _GstWebRTCDataChannelClass
+{
+  GObjectClass parent_class;
+
+  void (*send_data) (GstWebRTCDataChannel * channel, GBytes *data);
+  void (*send_string) (GstWebRTCDataChannel * channel, const gchar *str);
+  void (*close) (GstWebRTCDataChannel * channel);
+
+  gpointer _padding[GST_PADDING];
+};
+
+GST_WEBRTC_API
+void gst_webrtc_data_channel_on_open (GstWebRTCDataChannel * channel);
+
+GST_WEBRTC_API
+void gst_webrtc_data_channel_on_close (GstWebRTCDataChannel * channel);
+
+GST_WEBRTC_API
+void gst_webrtc_data_channel_on_error (GstWebRTCDataChannel * channel, GError * error);
+
+GST_WEBRTC_API
+void gst_webrtc_data_channel_on_message_data (GstWebRTCDataChannel * channel, GBytes * data);
+
+GST_WEBRTC_API
+void gst_webrtc_data_channel_on_message_string (GstWebRTCDataChannel * channel, const gchar * str);
+
+GST_WEBRTC_API
+void gst_webrtc_data_channel_on_buffered_amount_low (GstWebRTCDataChannel * channel);
+
+
+/**
+ * GstWebRTCSCTPTransport:
+ *
+ * Since: 1.20
+ */
+struct _GstWebRTCSCTPTransport
+{
+  GstObject parent;
+};
+
+/**
+ * GstWebRTCSCTPTransportClass:
+ *
+ * Since: 1.20
+ */
+struct _GstWebRTCSCTPTransportClass
+{
+  GstObjectClass parent_class;
+};
+
+
+G_END_DECLS
+
+#endif /* __GST_WEBRTC_PRIV_H__ */
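Every public struct in this header reserves `gpointer _padding[GST_PADDING]` (GStreamer defines GST_PADDING as 4) so that fields such as `kind` or `priority` can be added in later releases without changing the struct size and breaking ABI. A self-contained sketch of the idea, with illustrative names:

```c
#include <assert.h>
#include <stddef.h>

#define PADDING 4               /* stands in for GStreamer's GST_PADDING */

/* Original layout: one real field plus reserved padding slots. */
typedef struct
{
  void *transport;
  void *_padding[PADDING];
} TransportV1;

/* A later version adds a field by consuming one reserved slot,
 * keeping the overall size (and thus the ABI) unchanged. */
typedef struct
{
  void *transport;
  void *priority;
  void *_padding[PADDING - 1];
} TransportV2;
```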
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/webrtc.c
Added
@@ -0,0 +1,35 @@
+/* GStreamer
+ * Copyright (C) 2017 Matthew Waters <matthew@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/webrtc/webrtc_fwd.h>
+
+/**
+ * gst_webrtc_error_quark:
+ *
+ * Since: 1.20
+ */
+GQuark
+gst_webrtc_error_quark (void)
+{
+  return g_quark_from_static_string ("gst-webrtc-error-quark");
+}
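`gst_webrtc_error_quark()` interns the static string "gst-webrtc-error-quark" into a GQuark: a stable, non-zero integer identifying the error domain, where interning the same string always yields the same integer. The toy table below illustrates that contract only; it is not GLib's actual implementation, and the names are made up for the sketch:

```c
#include <string.h>

/* Illustrative sketch of the GQuark contract: interning a string yields
 * a stable non-zero integer, and equal strings map to the same integer.
 * This is NOT how GLib implements g_quark_from_static_string(). */
#define MAX_QUARKS 16

static const char *quark_table[MAX_QUARKS];
static int quark_count = 0;

static unsigned int
quark_from_string (const char *str)
{
  int i;

  for (i = 0; i < quark_count; i++)
    if (strcmp (quark_table[i], str) == 0)
      return (unsigned int) (i + 1);    /* quarks are non-zero */

  quark_table[quark_count++] = str;
  return (unsigned int) quark_count;
}
```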
View file
gst-plugins-bad-1.18.6.tar.xz/gst-libs/gst/webrtc/webrtc_fwd.h -> gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/webrtc/webrtc_fwd.h
Changed
@@ -42,26 +42,59 @@
 
 #include <gst/webrtc/webrtc-enumtypes.h>
 
+/**
+ * GstWebRTCDTLSTransport:
+ */
 typedef struct _GstWebRTCDTLSTransport GstWebRTCDTLSTransport;
 typedef struct _GstWebRTCDTLSTransportClass GstWebRTCDTLSTransportClass;
 
+/**
+ * GstWebRTCICETransport:
+ */
 typedef struct _GstWebRTCICETransport GstWebRTCICETransport;
 typedef struct _GstWebRTCICETransportClass GstWebRTCICETransportClass;
 
+/**
+ * GstWebRTCRTPReceiver:
+ *
+ * An object to track the receiving aspect of the stream
+ *
+ * Mostly matches the WebRTC RTCRtpReceiver interface.
+ */
 typedef struct _GstWebRTCRTPReceiver GstWebRTCRTPReceiver;
 typedef struct _GstWebRTCRTPReceiverClass GstWebRTCRTPReceiverClass;
 
+/**
+ * GstWebRTCRTPSender:
+ *
+ * An object to track the sending aspect of the stream
+ *
+ * Mostly matches the WebRTC RTCRtpSender interface.
+ */
 typedef struct _GstWebRTCRTPSender GstWebRTCRTPSender;
 typedef struct _GstWebRTCRTPSenderClass GstWebRTCRTPSenderClass;
 
 typedef struct _GstWebRTCSessionDescription GstWebRTCSessionDescription;
 
+/**
+ * GstWebRTCRTPTransceiver:
+ *
+ * Mostly matches the WebRTC RTCRtpTransceiver interface.
+ */
 typedef struct _GstWebRTCRTPTransceiver GstWebRTCRTPTransceiver;
 typedef struct _GstWebRTCRTPTransceiverClass GstWebRTCRTPTransceiverClass;
 
+/**
+ * GstWebRTCDataChannel:
+ *
+ * Since: 1.18
+ */
 typedef struct _GstWebRTCDataChannel GstWebRTCDataChannel;
 typedef struct _GstWebRTCDataChannelClass GstWebRTCDataChannelClass;
 
+typedef struct _GstWebRTCSCTPTransport GstWebRTCSCTPTransport;
+typedef struct _GstWebRTCSCTPTransportClass GstWebRTCSCTPTransportClass;
+
 /**
  * GstWebRTCDTLSTransportState:
  * @GST_WEBRTC_DTLS_TRANSPORT_STATE_NEW: new
@@ -280,10 +313,10 @@
 
 /**
  * GstWebRTCSCTPTransportState:
- * GST_WEBRTC_SCTP_TRANSPORT_STATE_NEW: new
- * GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTING: connecting
- * GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED: connected
- * GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED: closed
+ * @GST_WEBRTC_SCTP_TRANSPORT_STATE_NEW: new
+ * @GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTING: connecting
+ * @GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED: connected
+ * @GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED: closed
  *
  * See <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate>
  *
@@ -299,10 +332,10 @@
 
 /**
  * GstWebRTCPriorityType:
- * GST_WEBRTC_PRIORITY_TYPE_VERY_LOW: very-low
- * GST_WEBRTC_PRIORITY_TYPE_LOW: low
- * GST_WEBRTC_PRIORITY_TYPE_MEDIUM: medium
- * GST_WEBRTC_PRIORITY_TYPE_HIGH: high
+ * @GST_WEBRTC_PRIORITY_TYPE_VERY_LOW: very-low
+ * @GST_WEBRTC_PRIORITY_TYPE_LOW: low
+ * @GST_WEBRTC_PRIORITY_TYPE_MEDIUM: medium
+ * @GST_WEBRTC_PRIORITY_TYPE_HIGH: high
  *
  * See <http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype>
  *
@@ -318,11 +351,11 @@
 
 /**
  * GstWebRTCDataChannelState:
- * GST_WEBRTC_DATA_CHANNEL_STATE_NEW: new
- * GST_WEBRTC_DATA_CHANNEL_STATE_CONNECTING: connection
- * GST_WEBRTC_DATA_CHANNEL_STATE_OPEN: open
- * GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING: closing
- * GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED: closed
+ * @GST_WEBRTC_DATA_CHANNEL_STATE_NEW: new
+ * @GST_WEBRTC_DATA_CHANNEL_STATE_CONNECTING: connecting
+ * @GST_WEBRTC_DATA_CHANNEL_STATE_OPEN: open
+ * @GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING: closing
+ * @GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED: closed
  *
  * See <http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate>
  *
@@ -339,10 +372,10 @@
 
 /**
  * GstWebRTCBundlePolicy:
- * GST_WEBRTC_BUNDLE_POLICY_NONE: none
- * GST_WEBRTC_BUNDLE_POLICY_BALANCED: balanced
- * GST_WEBRTC_BUNDLE_POLICY_MAX_COMPAT: max-compat
- * GST_WEBRTC_BUNDLE_POLICY_MAX_BUNDLE: max-bundle
+ * @GST_WEBRTC_BUNDLE_POLICY_NONE: none
+ * @GST_WEBRTC_BUNDLE_POLICY_BALANCED: balanced
+ * @GST_WEBRTC_BUNDLE_POLICY_MAX_COMPAT: max-compat
+ * @GST_WEBRTC_BUNDLE_POLICY_MAX_BUNDLE: max-bundle
  *
  * See https://tools.ietf.org/html/draft-ietf-rtcweb-jsep-24#section-4.1.1
  * for more information.
@@ -359,8 +392,8 @@
 
 /**
  * GstWebRTCICETransportPolicy:
- * GST_WEBRTC_ICE_TRANSPORT_POLICY_ALL: all
- * GST_WEBRTC_ICE_TRANSPORT_POLICY_RELAY: relay
+ * @GST_WEBRTC_ICE_TRANSPORT_POLICY_ALL: all
+ * @GST_WEBRTC_ICE_TRANSPORT_POLICY_RELAY: relay
  *
  * See https://tools.ietf.org/html/draft-ietf-rtcweb-jsep-24#section-4.1.1
  * for more information.
@@ -373,4 +406,62 @@
   GST_WEBRTC_ICE_TRANSPORT_POLICY_RELAY,
 } GstWebRTCICETransportPolicy;
 
+/**
+ * GstWebRTCKind:
+ * @GST_WEBRTC_KIND_UNKNOWN: Kind has not yet been set
+ * @GST_WEBRTC_KIND_AUDIO: Kind is audio
+ * @GST_WEBRTC_KIND_VIDEO: Kind is video
+ *
+ * https://w3c.github.io/mediacapture-main/#dom-mediastreamtrack-kind
+ *
+ * Since: 1.20
+ */
+typedef enum /*<underscore_name=gst_webrtc_kind>*/
+{
+  GST_WEBRTC_KIND_UNKNOWN,
+  GST_WEBRTC_KIND_AUDIO,
+  GST_WEBRTC_KIND_VIDEO,
+} GstWebRTCKind;
+
+
+GST_WEBRTC_API
+GQuark gst_webrtc_error_quark (void);
+
+/**
+ * GST_WEBRTC_ERROR:
+ *
+ * Since: 1.20
+ */
+#define GST_WEBRTC_ERROR gst_webrtc_error_quark ()
+
+/**
+ * GstWebRTCError:
+ * @GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE: data-channel-failure
+ * @GST_WEBRTC_ERROR_DTLS_FAILURE: dtls-failure
+ * @GST_WEBRTC_ERROR_FINGERPRINT_FAILURE: fingerprint-failure
+ * @GST_WEBRTC_ERROR_SCTP_FAILURE: sctp-failure
+ * @GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR: sdp-syntax-error
+ * @GST_WEBRTC_ERROR_HARDWARE_ENCODER_NOT_AVAILABLE: hardware-encoder-not-available
+ * @GST_WEBRTC_ERROR_ENCODER_ERROR: encoder-error
+ * @GST_WEBRTC_ERROR_INVALID_STATE: invalid-state (part of WebIDL specification)
+ * @GST_WEBRTC_ERROR_INTERNAL_FAILURE: GStreamer-specific failure, not matching any other value from the specification
+ *
+ * See <https://www.w3.org/TR/webrtc/#dom-rtcerrordetailtype> for more information.
+ *
+ * Since: 1.20
+ */
+typedef enum /*<underscore_name=gst_webrtc_error>*/
+{
+  GST_WEBRTC_ERROR_DATA_CHANNEL_FAILURE,
+  GST_WEBRTC_ERROR_DTLS_FAILURE,
+  GST_WEBRTC_ERROR_FINGERPRINT_FAILURE,
+  GST_WEBRTC_ERROR_SCTP_FAILURE,
+  GST_WEBRTC_ERROR_SDP_SYNTAX_ERROR,
+  GST_WEBRTC_ERROR_HARDWARE_ENCODER_NOT_AVAILABLE,
+  GST_WEBRTC_ERROR_ENCODER_ERROR,
+  GST_WEBRTC_ERROR_INVALID_STATE,
+  GST_WEBRTC_ERROR_INTERNAL_FAILURE
+} GstWebRTCError;
+
+
 #endif /* __GST_WEBRTC_FWD_H__ */
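The new GstWebRTCError values map one-to-one onto the kebab-case nicks listed in the doc comment above. A self-contained mirror of that mapping (the enum, helper, and the nick for INTERNAL_FAILURE are illustrative stand-ins, not the GStreamer API):

```c
#include <string.h>

/* Mirrors the GstWebRTCError values and the nicks from the doc comment
 * above. Names here are illustrative; the "internal-failure" nick is
 * inferred from the naming pattern, not taken from the source. */
typedef enum
{
  WEBRTC_ERROR_DATA_CHANNEL_FAILURE,
  WEBRTC_ERROR_DTLS_FAILURE,
  WEBRTC_ERROR_FINGERPRINT_FAILURE,
  WEBRTC_ERROR_SCTP_FAILURE,
  WEBRTC_ERROR_SDP_SYNTAX_ERROR,
  WEBRTC_ERROR_HARDWARE_ENCODER_NOT_AVAILABLE,
  WEBRTC_ERROR_ENCODER_ERROR,
  WEBRTC_ERROR_INVALID_STATE,
  WEBRTC_ERROR_INTERNAL_FAILURE
} WebRTCError;

static const char *
webrtc_error_nick (WebRTCError err)
{
  static const char *nicks[] = {
    "data-channel-failure", "dtls-failure", "fingerprint-failure",
    "sctp-failure", "sdp-syntax-error", "hardware-encoder-not-available",
    "encoder-error", "invalid-state", "internal-failure"
  };
  return nicks[err];
}
```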
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/winrt
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/winrt/gstwinrt.h
Added
@@ -0,0 +1,31 @@
+/* GStreamer
+ * Copyright (C) 2021 Seungha Yang <seungha@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_WINRT_H__
+#define __GST_WINRT_H__
+
+#ifndef GST_USE_UNSTABLE_API
+#pragma message ("The winrt library from gst-plugins-bad is unstable API and may change in future.")
+#pragma message ("You can define GST_USE_UNSTABLE_API to avoid this warning.")
+#endif
+
+#include <gst/gst.h>
+#include <gst/winrt/gstwinrtdevicewatcher.h>
+
+#endif /* __GST_WINRT_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/winrt/gstwinrtdevicewatcher.cpp
Added
@@ -0,0 +1,723 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstwinrtdevicewatcher.h" + +/* workaround for GetCurrentTime collision */ +#ifdef GetCurrentTime +#undef GetCurrentTime +#endif + +#include <windows.foundation.h> +#include <wrl.h> +#include <wrl/wrappers/corewrappers.h> + +/* *INDENT-OFF* */ +typedef __FITypedEventHandler_2_Windows__CDevices__CEnumeration__CDeviceWatcher_Windows__CDevices__CEnumeration__CDeviceInformation IAddedHandler; +typedef __FITypedEventHandler_2_Windows__CDevices__CEnumeration__CDeviceWatcher_Windows__CDevices__CEnumeration__CDeviceInformationUpdate IUpdatedHandler; +typedef __FITypedEventHandler_2_Windows__CDevices__CEnumeration__CDeviceWatcher_Windows__CDevices__CEnumeration__CDeviceInformationUpdate IRemovedHandler; +typedef __FITypedEventHandler_2_Windows__CDevices__CEnumeration__CDeviceWatcher_IInspectable IEnumerationCompletedHandler; +typedef __FITypedEventHandler_2_Windows__CDevices__CEnumeration__CDeviceWatcher_IInspectable IStoppedHandler; + +using namespace Microsoft::WRL; +using namespace Microsoft::WRL::Wrappers; +using namespace ABI::Windows::Foundation; +using namespace 
ABI::Windows::Devices; +using namespace ABI::Windows::Devices::Enumeration; + +GST_DEBUG_CATEGORY_STATIC (gst_winrt_device_watcher_debug); +#define GST_CAT_DEFAULT gst_winrt_device_watcher_debug + +static void +gst_winrt_device_watcher_device_added (GstWinRTDeviceWatcher * self, + IDeviceInformation * info); +static void +gst_winrt_device_watcher_device_updated (GstWinRTDeviceWatcher * self, + IDeviceInformationUpdate * info_update); +static void +gst_winrt_device_watcher_device_removed (GstWinRTDeviceWatcher * self, + IDeviceInformationUpdate * info_update); +static void +gst_winrt_device_watcher_device_enumeration_completed (GstWinRTDeviceWatcher * + self); +static void +gst_winrt_device_watcher_device_enumeration_stopped (GstWinRTDeviceWatcher * + self); + +class AddedHandler + : public RuntimeClass<RuntimeClassFlags<ClassicCom>, IAddedHandler> +{ +public: + AddedHandler () {} + HRESULT RuntimeClassInitialize (GstWinRTDeviceWatcher * listenr) + { + if (!listenr) + return E_INVALIDARG; + + listener_ = listenr; + return S_OK; + } + + IFACEMETHOD(Invoke) + (IDeviceWatcher* sender, IDeviceInformation * arg) + { + gst_winrt_device_watcher_device_added (listener_, arg); + + return S_OK; + } + +private: + GstWinRTDeviceWatcher * listener_; +}; + +class UpdatedHandler + : public RuntimeClass<RuntimeClassFlags<ClassicCom>, IUpdatedHandler> +{ +public: + UpdatedHandler () {} + HRESULT RuntimeClassInitialize (GstWinRTDeviceWatcher * listenr) + { + if (!listenr) + return E_INVALIDARG; + + listener_ = listenr; + return S_OK; + } + + IFACEMETHOD(Invoke) + (IDeviceWatcher* sender, IDeviceInformationUpdate * arg) + { + gst_winrt_device_watcher_device_updated (listener_, arg); + + return S_OK; + } + +private: + GstWinRTDeviceWatcher * listener_; +}; + +class RemovedHandler + : public RuntimeClass<RuntimeClassFlags<ClassicCom>, IRemovedHandler> +{ +public: + RemovedHandler () {} + HRESULT RuntimeClassInitialize (GstWinRTDeviceWatcher * listenr) + { + if (!listenr) + return 
E_INVALIDARG; + + listener_ = listenr; + return S_OK; + } + + IFACEMETHOD(Invoke) + (IDeviceWatcher* sender, IDeviceInformationUpdate * arg) + { + gst_winrt_device_watcher_device_removed (listener_, arg); + + return S_OK; + } + +private: + GstWinRTDeviceWatcher * listener_; +}; + +class EnumerationCompletedHandler + : public RuntimeClass<RuntimeClassFlags<ClassicCom>, IEnumerationCompletedHandler> +{ +public: + EnumerationCompletedHandler () {} + HRESULT RuntimeClassInitialize (GstWinRTDeviceWatcher * listenr) + { + if (!listenr) + return E_INVALIDARG; + + listener_ = listenr; + return S_OK; + } + + IFACEMETHOD(Invoke) + (IDeviceWatcher* sender, IInspectable * arg) + { + gst_winrt_device_watcher_device_enumeration_completed (listener_); + + return S_OK; + } + +private: + GstWinRTDeviceWatcher * listener_; +}; + +class StoppedHandler + : public RuntimeClass<RuntimeClassFlags<ClassicCom>, IStoppedHandler> +{ +public: + StoppedHandler () {} + HRESULT RuntimeClassInitialize (GstWinRTDeviceWatcher * listenr) + { + if (!listenr) + return E_INVALIDARG; + + listener_ = listenr; + return S_OK; + } + + IFACEMETHOD(Invoke) + (IDeviceWatcher* sender, IInspectable * arg) + { + gst_winrt_device_watcher_device_enumeration_stopped (listener_); + + return S_OK; + } + +private: + GstWinRTDeviceWatcher * listener_; +}; +/* *INDENT-ON* */ + +typedef struct +{ + ComPtr < IDeviceWatcher > watcher; + + EventRegistrationToken added_token; + EventRegistrationToken updated_token; + EventRegistrationToken removed_token; + EventRegistrationToken enum_completed_token; + EventRegistrationToken stopped_token; +} GstWinRTDeviceWatcherInner; + +enum +{ + PROP_0, + PROP_DEVICE_CLASS, +}; + +#define DEFAULT_DEVICE_CLASS GST_WINRT_DEVICE_CLASS_ALL + +struct _GstWinRTDeviceWatcherPrivate +{ + GMutex lock; + GCond cond; + + GThread *thread; + GMainContext *context; + GMainLoop *loop; + + gboolean running; + + GstWinRTDeviceWatcherCallbacks callbacks; + gpointer user_data; + + GstWinRTDeviceWatcherInner 
*inner; + + GstWinRTDeviceClass device_class; +}; + +GType +gst_winrt_device_class_get_type (void) +{ + static gsize device_class_type = 0; + + if (g_once_init_enter (&device_class_type)) { + static const GEnumValue classes[] = { + {GST_WINRT_DEVICE_CLASS_ALL, "All", "all"}, + {GST_WINRT_DEVICE_CLASS_AUDIO_CAPTURE, "AudioCapture", "audio-capture"}, + {GST_WINRT_DEVICE_CLASS_AUDIO_RENDER, "AudioRender", "audio-render"}, + {GST_WINRT_DEVICE_CLASS_PORTABLE_STORAGE_DEVICE, + "PortableStorageDevice", "portable-storage-device"}, + {GST_WINRT_DEVICE_CLASS_VIDEO_CAPTURE, + "VideoCapture", "video-capture"}, + {0, nullptr, nullptr}, + }; + GType tmp = g_enum_register_static ("GstWinRTDeviceClass", classes); + g_once_init_leave (&device_class_type, tmp); + } + + return (GType) device_class_type; +} + +static void gst_winrt_device_watcher_constructed (GObject * object); +static void gst_winrt_device_watcher_finalize (GObject * object); +static void gst_winrt_device_watcher_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_winrt_device_watcher_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); + +static gpointer +gst_winrt_device_watcher_thread_func (GstWinRTDeviceWatcher * self); + +#define gst_winrt_device_watcher_parent_class parent_class +G_DEFINE_TYPE_WITH_PRIVATE (GstWinRTDeviceWatcher, gst_winrt_device_watcher, + GST_TYPE_OBJECT); + +static void +gst_winrt_device_watcher_class_init (GstWinRTDeviceWatcherClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->constructed = gst_winrt_device_watcher_constructed; + gobject_class->finalize = gst_winrt_device_watcher_finalize; + gobject_class->set_property = gst_winrt_device_watcher_set_property; + gobject_class->get_property = gst_winrt_device_watcher_get_property; + + g_object_class_install_property (gobject_class, PROP_DEVICE_CLASS, + g_param_spec_enum ("device-class", "Device Class", + "Device 
class to watch", GST_TYPE_WINRT_DEVICE_CLASS, + DEFAULT_DEVICE_CLASS, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | + G_PARAM_STATIC_STRINGS))); + + GST_DEBUG_CATEGORY_INIT (gst_winrt_device_watcher_debug, + "winrtdevicewatcher", 0, "winrtdevicewatcher"); +} + +static void +gst_winrt_device_watcher_init (GstWinRTDeviceWatcher * self) +{ + GstWinRTDeviceWatcherPrivate *priv; + + self->priv = priv = (GstWinRTDeviceWatcherPrivate *) + gst_winrt_device_watcher_get_instance_private (self); + + g_mutex_init (&priv->lock); + g_cond_init (&priv->cond); + priv->context = g_main_context_new (); + priv->loop = g_main_loop_new (priv->context, FALSE); +} + +static void +gst_winrt_device_watcher_constructed (GObject * object) +{ + GstWinRTDeviceWatcher *self = GST_WINRT_DEVICE_WATCHER (object); + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + g_mutex_lock (&priv->lock); + priv->thread = g_thread_new ("GstWinRTDeviceWatcher", + (GThreadFunc) gst_winrt_device_watcher_thread_func, self); + while (!g_main_loop_is_running (priv->loop)) + g_cond_wait (&priv->cond, &priv->lock); + g_mutex_unlock (&priv->lock); +} + +static void +gst_winrt_device_watcher_finalize (GObject * object) +{ + GstWinRTDeviceWatcher *self = GST_WINRT_DEVICE_WATCHER (object); + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + g_main_loop_quit (priv->loop); + if (g_thread_self () != priv->thread) { + g_thread_join (priv->thread); + g_main_loop_unref (priv->loop); + g_main_context_unref (priv->context); + } else { + g_warning ("Trying join from self-thread"); + } + + g_mutex_clear (&priv->lock); + g_cond_clear (&priv->cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_winrt_device_watcher_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstWinRTDeviceWatcher *self = GST_WINRT_DEVICE_WATCHER (object); + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + switch (prop_id) { + case PROP_DEVICE_CLASS: + 
priv->device_class = (GstWinRTDeviceClass) g_value_get_enum (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_winrt_device_watcher_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstWinRTDeviceWatcher *self = GST_WINRT_DEVICE_WATCHER (object); + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + switch (prop_id) { + case PROP_DEVICE_CLASS: + g_value_set_enum (value, priv->device_class); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean +loop_running_cb (GstWinRTDeviceWatcher * self) +{ + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + g_mutex_lock (&priv->lock); + g_cond_signal (&priv->cond); + g_mutex_unlock (&priv->lock); + + return G_SOURCE_REMOVE; +} + +static void +gst_winrt_device_watcher_thread_func_inner (GstWinRTDeviceWatcher * self) +{ + GstWinRTDeviceWatcherPrivate *priv = self->priv; + GSource *idle_source; + HRESULT hr; + GstWinRTDeviceWatcherInner *inner = nullptr; + ComPtr < IDeviceInformationStatics > factory; + ComPtr < IDeviceWatcher > watcher; + ComPtr < IAddedHandler > added_handler; + ComPtr < IUpdatedHandler > updated_handler; + ComPtr < IRemovedHandler > removed_handler; + ComPtr < IEnumerationCompletedHandler > enum_completed_handler; + ComPtr < IStoppedHandler > stopped_handler; + + g_main_context_push_thread_default (priv->context); + + idle_source = g_idle_source_new (); + g_source_set_callback (idle_source, + (GSourceFunc) loop_running_cb, self, nullptr); + g_source_attach (idle_source, priv->context); + g_source_unref (idle_source); + + hr = GetActivationFactory (HStringReference + (RuntimeClass_Windows_Devices_Enumeration_DeviceInformation).Get (), + &factory); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, + "Failed to get IDeviceInformationStatics, hr: 0x%x", (guint) hr); + goto run_loop; + } + + hr = factory->CreateWatcherDeviceClass 
((DeviceClass) priv->device_class, + &watcher); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, + "Failed to create IDeviceWatcher, hr: 0x%x", (guint) hr); + goto run_loop; + } + + hr = MakeAndInitialize < AddedHandler > (&added_handler, self); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to create added handler, hr: 0x%x", hr); + goto run_loop; + } + + hr = MakeAndInitialize < UpdatedHandler > (&updated_handler, self); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to create updated handler, hr: 0x%x", hr); + goto run_loop; + } + + hr = MakeAndInitialize < RemovedHandler > (&removed_handler, self); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to create removed handler, hr: 0x%x", hr); + goto run_loop; + } + + hr = MakeAndInitialize < EnumerationCompletedHandler > + (&enum_completed_handler, self); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, + "Failed to create enumeration completed handler, hr: 0x%x", hr); + goto run_loop; + } + + hr = MakeAndInitialize < StoppedHandler > (&stopped_handler, self); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to create stopped handler, hr: 0x%x", hr); + goto run_loop; + } + + inner = new GstWinRTDeviceWatcherInner (); + hr = watcher->add_Added (added_handler.Get (), &inner->added_token); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to register added handler, hr: 0x%x", hr); + delete inner; + inner = nullptr; + + goto run_loop; + } + + hr = watcher->add_Updated (updated_handler.Get (), &inner->updated_token); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to register updated handler, hr: 0x%x", hr); + delete inner; + inner = nullptr; + + goto run_loop; + } + + hr = watcher->add_Removed (removed_handler.Get (), &inner->removed_token); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to register removed handler, hr: 0x%x", hr); + delete inner; + inner = nullptr; + + goto run_loop; + } + + hr = watcher->add_EnumerationCompleted (enum_completed_handler.Get (), + 
&inner->enum_completed_token); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, + "Failed to register enumeration completed handler, hr: 0x%x", hr); + delete inner; + inner = nullptr; + + goto run_loop; + } + + hr = watcher->add_Stopped (stopped_handler.Get (), &inner->stopped_token); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to register stopped handler, hr: 0x%x", hr); + delete inner; + inner = nullptr; + + goto run_loop; + } + + inner->watcher = watcher; + priv->inner = inner; + +run_loop: + GST_INFO_OBJECT (self, "Starting loop"); + g_main_loop_run (priv->loop); + GST_INFO_OBJECT (self, "Stopped loop"); + + if (inner) { + if (priv->running) + watcher->Stop (); + + watcher->remove_Added (inner->added_token); + watcher->remove_Updated (inner->updated_token); + watcher->remove_Removed (inner->removed_token); + watcher->remove_EnumerationCompleted (inner->enum_completed_token); + watcher->remove_Stopped (inner->stopped_token); + + delete inner; + } + + g_main_context_pop_thread_default (priv->context); +} + +static gpointer +gst_winrt_device_watcher_thread_func (GstWinRTDeviceWatcher * self) +{ + RoInitializeWrapper initialize (RO_INIT_MULTITHREADED); + + /* wrap with another function so that everything can happen + * before RoInitializeWrapper is destructed */ + gst_winrt_device_watcher_thread_func_inner (self); + + return nullptr; +} + +static void +gst_winrt_device_watcher_device_added (GstWinRTDeviceWatcher * self, + IDeviceInformation * info) +{ + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (self, "Device added"); + + if (priv->callbacks.added) + priv->callbacks.added (self, info, priv->user_data); +} + +static void +gst_winrt_device_watcher_device_updated (GstWinRTDeviceWatcher * self, + IDeviceInformationUpdate * info_update) +{ + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (self, "Device updated"); + + if (priv->callbacks.updated) + priv->callbacks.updated (self, info_update, priv->user_data); +} 
+ +static void +gst_winrt_device_watcher_device_removed (GstWinRTDeviceWatcher * self, + IDeviceInformationUpdate * info_update) +{ + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (self, "Device removed"); + + if (priv->callbacks.removed) + priv->callbacks.removed (self, info_update, priv->user_data); +} + +static void +gst_winrt_device_watcher_device_enumeration_completed (GstWinRTDeviceWatcher * + self) +{ + GstWinRTDeviceWatcherPrivate *priv = self->priv; + + GST_DEBUG_OBJECT (self, "Enumeration completed"); + + if (priv->callbacks.enumeration_completed) + priv->callbacks.enumeration_completed (self, priv->user_data); +} + +static void +gst_winrt_device_watcher_device_enumeration_stopped (GstWinRTDeviceWatcher * + self) +{ + GST_DEBUG_OBJECT (self, "Stopped"); +} + +/** + * gst_winrt_device_watcher_new: + * @device_class: a #GstWinRTDeviceClass to watch + * @callbacks: a pointer to #GstWinRTDeviceWatcherCallbacks + * @user_data: a user_data argument for the callbacks + * + * Constructs a new #GstWinRTDeviceWatcher object for watching device updates + * of @device_class + * + * Returns: (transfer full) (nullable): a new #GstWinRTDeviceWatcher + * or %NULL if creating the #GstWinRTDeviceWatcher object failed + * + * Since: 1.20 + */ +GstWinRTDeviceWatcher * +gst_winrt_device_watcher_new (GstWinRTDeviceClass device_class, + const GstWinRTDeviceWatcherCallbacks * callbacks, gpointer user_data) +{ + GstWinRTDeviceWatcher *self; + GstWinRTDeviceWatcherPrivate *priv; + + g_return_val_if_fail (callbacks != nullptr, nullptr); + + self = (GstWinRTDeviceWatcher *) + g_object_new (GST_TYPE_WINRT_DEVICE_WATCHER, "device-class", device_class, + nullptr); + + priv = self->priv; + if (!priv->inner) { + gst_object_unref (self); + return nullptr; + } + + priv->callbacks = *callbacks; + priv->user_data = user_data; + + gst_object_ref_sink (self); + + return self; +} + +/** + * gst_winrt_device_watcher_start: + * @watcher: a #GstWinRTDeviceWatcher
+ * + * Starts watching for device updates. + * + * Returns: %TRUE if successful + * + * Since: 1.20 + */ +gboolean +gst_winrt_device_watcher_start (GstWinRTDeviceWatcher * watcher) +{ + GstWinRTDeviceWatcherPrivate *priv; + GstWinRTDeviceWatcherInner *inner; + HRESULT hr; + + g_return_val_if_fail (GST_IS_WINRT_DEVICE_WATCHER (watcher), FALSE); + + priv = watcher->priv; + inner = priv->inner; + + GST_DEBUG_OBJECT (watcher, "Start"); + + g_mutex_lock (&priv->lock); + if (priv->running) { + GST_DEBUG_OBJECT (watcher, "Already running"); + g_mutex_unlock (&priv->lock); + + return TRUE; + } + + hr = inner->watcher->Start (); + if (FAILED (hr)) { + GST_ERROR_OBJECT (watcher, "Failed to start watcher, hr: 0x%x", (guint) hr); + g_mutex_unlock (&priv->lock); + + return FALSE; + } + + priv->running = TRUE; + g_mutex_unlock (&priv->lock); + + return TRUE; +} + +/** + * gst_winrt_device_watcher_stop: + * @watcher: a #GstWinRTDeviceWatcher + * + * Stops watching for device updates. + * + * Since: 1.20 + */ +void +gst_winrt_device_watcher_stop (GstWinRTDeviceWatcher * watcher) +{ + GstWinRTDeviceWatcherPrivate *priv; + GstWinRTDeviceWatcherInner *inner; + HRESULT hr; + + g_return_if_fail (GST_IS_WINRT_DEVICE_WATCHER (watcher)); + + GST_DEBUG_OBJECT (watcher, "Stop"); + + priv = watcher->priv; + inner = priv->inner; + + g_mutex_lock (&priv->lock); + if (!priv->running) { + g_mutex_unlock (&priv->lock); + + return; + } + + priv->running = FALSE; + hr = inner->watcher->Stop (); + if (FAILED (hr)) { + GST_WARNING_OBJECT (watcher, + "Failed to stop watcher, hr: 0x%x", (guint) hr); + } + g_mutex_unlock (&priv->lock); +}
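The start/stop pair above guards the underlying IDeviceWatcher with a mutex and a `running` flag so that repeated calls are harmless and Start()/Stop() only run on real state transitions. A minimal Python sketch of that guard pattern (class and parameter names are ours, not part of GStreamer or WinRT):

```python
import threading


class WatcherGuard:
    """Start/stop guard, analogous to the priv->lock / priv->running pair."""

    def __init__(self, backend_start, backend_stop):
        self._lock = threading.Lock()        # priv->lock
        self._running = False                # priv->running
        self._backend_start = backend_start  # stands in for IDeviceWatcher::Start
        self._backend_stop = backend_stop    # stands in for IDeviceWatcher::Stop

    def start(self):
        with self._lock:
            if self._running:
                return True       # "Already running": succeed without restarting
            if not self._backend_start():
                return False      # backend failed; stay in the stopped state
            self._running = True
            return True

    def stop(self):
        with self._lock:
            if not self._running:
                return            # never started, or already stopped
            self._running = False
            self._backend_stop()
```

Calling start() or stop() twice in a row is safe; the backend is touched only when `running` actually flips, just as the element calls Start()/Stop() only on state changes.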
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/winrt/gstwinrtdevicewatcher.h
Added
@@ -0,0 +1,136 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_WINRT_DEVICE_WATCHER_H__ +#define __GST_WINRT_DEVICE_WATCHER_H__ + +#include <gst/gst.h> +#include <gst/winrt/winrt-prelude.h> +#include <windows.devices.enumeration.h> + +G_BEGIN_DECLS + +#define GST_TYPE_WINRT_DEVICE_WATCHER (gst_winrt_device_watcher_get_type()) +#define GST_WINRT_DEVICE_WATCHER(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_WINRT_DEVICE_WATCHER, GstWinRTDeviceWatcher)) +#define GST_WINRT_DEVICE_WATCHER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_WINRT_DEVICE_WATCHER, GstWinRTDeviceWatcherClass)) +#define GST_IS_WINRT_DEVICE_WATCHER(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_WINRT_DEVICE_WATCHER)) +#define GST_IS_WINRT_DEVICE_WATCHER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_WINRT_DEVICE_WATCHER)) +#define GST_WINRT_DEVICE_WATCHER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_WINRT_DEVICE_WATCHER, GstWinRTDeviceWatcherClass)) +#define GST_WINRT_DEVICE_WATCHER_CAST(obj) ((GstWinRTDeviceWatcher *)obj) + +typedef struct _GstWinRTDeviceWatcher GstWinRTDeviceWatcher; +typedef struct _GstWinRTDeviceWatcherClass GstWinRTDeviceWatcherClass; +typedef 
struct _GstWinRTDeviceWatcherPrivate GstWinRTDeviceWatcherPrivate; + +/* ABI::Windows::Devices::Enumeration::DeviceClass */ +#define GST_TYPE_WINRT_DEVICE_CLASS (gst_winrt_device_class_get_type ()) +typedef enum +{ + GST_WINRT_DEVICE_CLASS_ALL = 0, + GST_WINRT_DEVICE_CLASS_AUDIO_CAPTURE = 1, + GST_WINRT_DEVICE_CLASS_AUDIO_RENDER = 2, + GST_WINRT_DEVICE_CLASS_PORTABLE_STORAGE_DEVICE = 3, + GST_WINRT_DEVICE_CLASS_VIDEO_CAPTURE = 4, +} GstWinRTDeviceClass; + +typedef struct +{ + /** + * GstWinRTDeviceWatcherCallbacks::added: + * @watcher: a #GstWinRTDeviceWatcher + * @info: (transfer none): an IDeviceInformation interface handle + * @user_data: a user_data + * + * Called when a device is added to the collection enumerated by the DeviceWatcher + */ + void (*added) (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformation * info, + gpointer user_data); + + /** + * GstWinRTDeviceWatcherCallbacks::updated: + * @watcher: a #GstWinRTDeviceWatcher + * @info_update: (transfer none): an IDeviceInformationUpdate interface handle + * @user_data: a user_data + * + * Called when a device is updated in the collection of enumerated devices + */ + void (*updated) (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * info_update, + gpointer user_data); + + /** + * GstWinRTDeviceWatcherCallbacks::removed: + * @watcher: a #GstWinRTDeviceWatcher + * @info_update: (transfer none): an IDeviceInformationUpdate interface handle + * @user_data: a user_data + * + * Called when a device is removed from the collection of enumerated devices + */ + void (*removed) (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * info_update, + gpointer user_data); + + /** + * GstWinRTDeviceWatcherCallbacks::enumeration_completed: + * @watcher: a #GstWinRTDeviceWatcher + * @user_data: a user_data + * + * Called when the enumeration of devices completes + */ + void (*enumeration_completed)
(GstWinRTDeviceWatcher * watcher, + gpointer user_data); +} GstWinRTDeviceWatcherCallbacks; + +struct _GstWinRTDeviceWatcher +{ + GstObject parent; + + GstWinRTDeviceWatcherPrivate *priv; + + gpointer _gst_reserved[GST_PADDING]; +}; + +struct _GstWinRTDeviceWatcherClass +{ + GstObjectClass parent_class; + + gpointer _gst_reserved[GST_PADDING_LARGE]; +}; + +GST_WINRT_API +GType gst_winrt_device_class_get_type (void); + +GST_WINRT_API +GType gst_winrt_device_watcher_get_type (void); + +GST_WINRT_API +GstWinRTDeviceWatcher * gst_winrt_device_watcher_new (GstWinRTDeviceClass device_class, + const GstWinRTDeviceWatcherCallbacks * callbacks, + gpointer user_data); + +GST_WINRT_API +gboolean gst_winrt_device_watcher_start (GstWinRTDeviceWatcher * watcher); + +GST_WINRT_API +void gst_winrt_device_watcher_stop (GstWinRTDeviceWatcher * watcher); + +G_END_DECLS + +#endif /* __GST_WINRT_DEVICE_WATCHER_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/winrt/meson.build
Added
@@ -0,0 +1,87 @@ +winrt_sources = [ + 'gstwinrtdevicewatcher.cpp', +] + +gstwinrt_dep = dependency('', required : false) + +extra_c_args = [ + '-DCOBJMACROS', +] +extra_comm_args = [ + '-DGST_USE_UNSTABLE_API', + '-DBUILDING_GST_WINRT', + '-DG_LOG_DOMAIN="GStreamer-WinRT"' +] + +if host_system != 'windows' + subdir_done() +endif + +# TODO: Need to bump mingw tool chain +if cxx.get_id() != 'msvc' + subdir_done() +endif + +runtimeobject_lib = cc.find_library('runtimeobject', required : false) +if not runtimeobject_lib.found() + subdir_done() +endif + +winapi_app = cxx.compiles('''#include <winapifamily.h> + #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_APP) + #error "not winrt" + #endif + int main (int argc, char ** argv) { + return 0; + } ''', + dependencies: runtimeobject_lib, + name: 'building for WinRT') + +if not winapi_app + subdir_done() +endif + +win10_sdk = cxx.compiles('''#include <windows.h> + #ifndef WDK_NTDDI_VERSION + #error "unknown Windows SDK version" + #endif + #if (WDK_NTDDI_VERSION < 0x0A000000) + #error "Not a Windows 10 SDK" + #endif + ''', + name: 'building with Windows 10 SDK') + +if not win10_sdk + subdir_done() +endif + +building_for_win10 = cxx.compiles('''#include <windows.h> + #ifndef WINVER + #error "unknown minimum supported OS version" + #endif + #if (WINVER < 0x0A00) + #error "Windows 10 API is not guaranteed" + #endif + ''', + name: 'building for Windows 10') + +if not building_for_win10 + message('Bumping target Windows version to Windows 10 for building gstwinrt library') + extra_comm_args += ['-DWINVER=0x0A00', '-D_WIN32_WINNT=0x0A00', '-DNTDDI_VERSION=WDK_NTDDI_VERSION'] +endif + +gstwinrt = library('gstwinrt-' + api_version, + winrt_sources, + c_args : gst_plugins_bad_args + extra_c_args + extra_comm_args, + cpp_args : gst_plugins_bad_args + extra_comm_args, + include_directories : [configinc, libsinc], + version : libversion, + soversion : soversion, + install : true, + dependencies : [gstbase_dep, runtimeobject_lib] +) + +# 
Still non-public api, should not install headers +gstwinrt_dep = declare_dependency(link_with : gstwinrt, + include_directories : [libsinc], + dependencies : [gstbase_dep, runtimeobject_lib])
gst-plugins-bad-1.20.1.tar.xz/gst-libs/gst/winrt/winrt-prelude.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2021 GStreamer developers + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_WINRT_PRELUDE_H__ +#define __GST_WINRT_PRELUDE_H__ + +#include <gst/gst.h> + +#ifndef GST_WINRT_API +# ifdef BUILDING_GST_WINRT +# define GST_WINRT_API GST_API_EXPORT /* from config.h */ +# else +# define GST_WINRT_API GST_API_IMPORT +# endif +#endif + +#endif /* __GST_WINRT_PRELUDE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst-plugins-bad.doap -> gst-plugins-bad-1.20.1.tar.xz/gst-plugins-bad.doap
Changed
@@ -35,61 +35,61 @@ <release> <Version> - <revision>1.18.6</revision> - <branch>1.18</branch> + <revision>1.20.1</revision> + <branch>1.20</branch> <name></name> - <created>2022-02-02</created> - <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.6.tar.xz" /> + <created>2022-03-14</created> + <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.20.1.tar.xz" /> </Version> </release> <release> <Version> - <revision>1.18.5</revision> - <branch>1.18</branch> + <revision>1.20.0</revision> + <branch>main</branch> <name></name> - <created>2021-09-08</created> - <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.5.tar.xz" /> + <created>2022-02-03</created> + <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.20.0.tar.xz" /> </Version> </release> <release> <Version> - <revision>1.18.4</revision> - <branch>1.18</branch> + <revision>1.19.90</revision> + <branch>main</branch> <name></name> - <created>2021-03-15</created> - <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.4.tar.xz" /> + <created>2022-01-28</created> + <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.19.90.tar.xz" /> </Version> </release> <release> <Version> - <revision>1.18.3</revision> - <branch>1.18</branch> + <revision>1.19.3</revision> + <branch>main</branch> <name></name> - <created>2021-01-13</created> - <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.3.tar.xz" /> + <created>2021-11-03</created> + <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.19.3.tar.xz" /> </Version> </release> <release> <Version> - <revision>1.18.2</revision> - <branch>1.18</branch> + <revision>1.19.2</revision> + 
<branch>master</branch> <name></name> - <created>2020-12-06</created> - <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.2.tar.xz" /> + <created>2021-09-23</created> + <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.19.2.tar.xz" /> </Version> </release> <release> <Version> - <revision>1.18.1</revision> - <branch>1.18</branch> + <revision>1.19.1</revision> + <branch>master</branch> <name></name> - <created>2020-10-26</created> - <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.1.tar.xz" /> + <created>2021-06-01</created> + <file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.19.1.tar.xz" /> </Version> </release>
gst-plugins-bad-1.18.6.tar.xz/gst/accurip/gstaccurip.c -> gst-plugins-bad-1.20.1.tar.xz/gst/accurip/gstaccurip.c
Changed
@@ -72,8 +72,6 @@ PROP_LAST_TRACK }; -#define parent_class gst_accurip_parent_class -G_DEFINE_TYPE (GstAccurip, gst_accurip, GST_TYPE_AUDIO_FILTER); @@ -86,6 +84,11 @@ GstBuffer * buf); static gboolean gst_accurip_sink_event (GstBaseTransform * trans, GstEvent * event); +static gboolean accurip_element_init (GstPlugin * plugin); + +#define parent_class gst_accurip_parent_class +G_DEFINE_TYPE (GstAccurip, gst_accurip, GST_TYPE_AUDIO_FILTER); +GST_ELEMENT_REGISTER_DEFINE_CUSTOM (accurip, accurip_element_init); static void gst_accurip_class_init (GstAccuripClass * klass) @@ -342,7 +345,7 @@ } static gboolean -plugin_init (GstPlugin * plugin) +accurip_element_init (GstPlugin * plugin) { gboolean ret; @@ -362,6 +365,12 @@ return ret; } +static gboolean +plugin_init (GstPlugin * plugin) +{ + return GST_ELEMENT_REGISTER (accurip, plugin); +} + GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, accurip,
gst-plugins-bad-1.18.6.tar.xz/gst/accurip/gstaccurip.h -> gst-plugins-bad-1.20.1.tar.xz/gst/accurip/gstaccurip.h
Changed
@@ -80,6 +80,8 @@ GType gst_accurip_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (accurip); + G_END_DECLS #endif /* __GST_ACCURIP_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/adpcmdec/adpcmdec.c -> gst-plugins-bad-1.20.1.tar.xz/gst/adpcmdec/adpcmdec.c
Changed
@@ -81,7 +81,12 @@ } ADPCMDec; GType adpcmdec_get_type (void); -G_DEFINE_TYPE (ADPCMDec, adpcmdec, GST_TYPE_AUDIO_DECODER); +GST_ELEMENT_REGISTER_DECLARE (adpcmdec); +G_DEFINE_TYPE_WITH_CODE (ADPCMDec, adpcmdec, GST_TYPE_AUDIO_DECODER, + GST_DEBUG_CATEGORY_INIT (adpcmdec_debug, "adpcmdec", 0, "ADPCM Decoders"); + ); +GST_ELEMENT_REGISTER_DEFINE (adpcmdec, "adpcmdec", GST_RANK_PRIMARY, + GST_TYPE_ADPCM_DEC); static gboolean adpcmdec_set_format (GstAudioDecoder * bdec, GstCaps * in_caps) @@ -485,12 +490,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (adpcmdec_debug, "adpcmdec", 0, "ADPCM Decoders"); - if (!gst_element_register (plugin, "adpcmdec", GST_RANK_PRIMARY, - GST_TYPE_ADPCM_DEC)) { - return FALSE; - } - return TRUE; + return GST_ELEMENT_REGISTER (adpcmdec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, adpcmdec,
gst-plugins-bad-1.18.6.tar.xz/gst/adpcmenc/adpcmenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/adpcmenc/adpcmenc.c
Changed
@@ -132,7 +132,12 @@ } ADPCMEnc; GType adpcmenc_get_type (void); -G_DEFINE_TYPE (ADPCMEnc, adpcmenc, GST_TYPE_AUDIO_ENCODER); +GST_ELEMENT_REGISTER_DECLARE (adpcmenc); +G_DEFINE_TYPE_WITH_CODE (ADPCMEnc, adpcmenc, GST_TYPE_AUDIO_ENCODER, + GST_DEBUG_CATEGORY_INIT (adpcmenc_debug, "adpcmenc", 0, "ADPCM Encoders"); + ); +GST_ELEMENT_REGISTER_DEFINE (adpcmenc, "adpcmenc", GST_RANK_PRIMARY, + GST_TYPE_ADPCM_ENC); static gboolean adpcmenc_setup (ADPCMEnc * enc) @@ -470,12 +475,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (adpcmenc_debug, "adpcmenc", 0, "ADPCM Encoders"); - if (!gst_element_register (plugin, "adpcmenc", GST_RANK_PRIMARY, - GST_TYPE_ADPCM_ENC)) { - return FALSE; - } - return TRUE; + return GST_ELEMENT_REGISTER (adpcmenc, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, adpcmenc,
gst-plugins-bad-1.18.6.tar.xz/gst/aiff/aiff.c -> gst-plugins-bad-1.20.1.tar.xz/gst/aiff/aiff.c
Changed
@@ -22,41 +22,16 @@ #include "config.h" #endif -#include <gst/tag/tag.h> +#include "aiffelements.h" -#include <gst/gst-i18n-plugin.h> - -#include "aiffparse.h" -#include "aiffmux.h" - -GST_DEBUG_CATEGORY_STATIC (aiff_debug); -#define GST_CAT_DEFAULT (aiff_debug) - -GST_DEBUG_CATEGORY_EXTERN (aiffparse_debug); -GST_DEBUG_CATEGORY_EXTERN (aiffmux_debug); static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret; - - GST_DEBUG_CATEGORY_INIT (aiff_debug, "aiff", 0, "AIFF plugin"); - GST_DEBUG_CATEGORY_INIT (aiffparse_debug, "aiffparse", 0, "AIFF parser"); - GST_DEBUG_CATEGORY_INIT (aiffmux_debug, "aiffmux", 0, "AIFF muxer"); - -#ifdef ENABLE_NLS - GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, - LOCALEDIR); - bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); - bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); -#endif - - ret = gst_element_register (plugin, "aiffparse", GST_RANK_PRIMARY, - GST_TYPE_AIFF_PARSE); - ret &= gst_element_register (plugin, "aiffmux", GST_RANK_PRIMARY, - GST_TYPE_AIFF_MUX); + gboolean ret = FALSE; - gst_tag_register_musicbrainz_tags (); + ret |= GST_ELEMENT_REGISTER (aiffparse, plugin); + ret |= GST_ELEMENT_REGISTER (aiffmux, plugin); return ret; }
gst-plugins-bad-1.20.1.tar.xz/gst/aiff/aiffelements.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2020> The GStreamer Contributors. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_AIFF_ELEMENTS_H__ +#define __GST_AIFF_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +G_GNUC_INTERNAL void aiff_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (aiffmux); +GST_ELEMENT_REGISTER_DECLARE (aiffparse); + +#endif /* __GST_AIFF_ELEMENTS_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/aiff/aiffmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/aiff/aiffmux.c
Changed
@@ -57,6 +57,7 @@ #include <gst/gst.h> #include <gst/base/gstbytewriter.h> +#include "aiffelements.h" #include "aiffmux.h" GST_DEBUG_CATEGORY (aiffmux_debug); @@ -77,7 +78,10 @@ ); #define gst_aiff_mux_parent_class parent_class -G_DEFINE_TYPE (GstAiffMux, gst_aiff_mux, GST_TYPE_ELEMENT); +G_DEFINE_TYPE_WITH_CODE (GstAiffMux, gst_aiff_mux, GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (aiffmux_debug, "aiffmux", 0, "AIFF muxer")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (aiffmux, "aiffmux", GST_RANK_PRIMARY, + GST_TYPE_AIFF_MUX, aiff_element_init (plugin)); static GstStateChangeReturn gst_aiff_mux_change_state (GstElement * element, GstStateChange transition)
gst-plugins-bad-1.18.6.tar.xz/gst/aiff/aiffparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/aiff/aiffparse.c
Changed
@@ -52,6 +52,7 @@ #include <string.h> #include <math.h> +#include "aiffelements.h" #include "aiffparse.h" #include <gst/audio/audio.h> #include <gst/tag/tag.h> @@ -104,7 +105,10 @@ #define MAX_BUFFER_SIZE 4096 #define gst_aiff_parse_parent_class parent_class -G_DEFINE_TYPE (GstAiffParse, gst_aiff_parse, GST_TYPE_ELEMENT); +G_DEFINE_TYPE_WITH_CODE (GstAiffParse, gst_aiff_parse, GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (aiffparse_debug, "aiffparse", 0, "AIFF parser")); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (aiffparse, "aiffparse", GST_RANK_PRIMARY, + GST_TYPE_AIFF_PARSE, aiff_element_init (plugin)); static void gst_aiff_parse_class_init (GstAiffParseClass * klass) @@ -824,7 +828,7 @@ { GstCaps *caps = NULL; const gchar *format = NULL; - guint64 channel_mask; + guint64 channel_mask = 0; if (aiff->floating_point) { if (aiff->endianness == G_BIG_ENDIAN) { @@ -862,42 +866,37 @@ "rate", G_TYPE_INT, aiff->rate, NULL); } - if (aiff->channels > 2) { - GST_FIXME_OBJECT (aiff, "using fallback channel layout for %d channels", - aiff->channels); - - /* based on AIFF-1.3.pdf */ - switch (aiff->channels) { - case 1: - channel_mask = 0; - break; - case 2: - channel_mask = _P (FRONT_LEFT) | _P (FRONT_RIGHT); - break; - case 3: - channel_mask = _P (FRONT_LEFT) | _P (FRONT_RIGHT) | _P (FRONT_CENTER); - break; - case 4: - /* lists both this and 'quad' but doesn't say how to distinguish the two */ - channel_mask = - _P (FRONT_LEFT) | _P (FRONT_RIGHT) | _P (REAR_LEFT) | - _P (REAR_RIGHT); - break; - case 6: - channel_mask = - _P (FRONT_LEFT) | _P (FRONT_LEFT_OF_CENTER) | _P (FRONT_CENTER) | - _P (FRONT_RIGHT) | _P (FRONT_RIGHT_OF_CENTER) | _P (LFE1); - break; - default: - channel_mask = gst_audio_channel_get_fallback_mask (aiff->channels); - break; - } - + /* based on AIFF-1.3.pdf */ + switch (aiff->channels) { + case 1: + channel_mask = 0; + break; + case 2: + channel_mask = _P (FRONT_LEFT) | _P (FRONT_RIGHT); + break; + case 3: + channel_mask = _P (FRONT_LEFT) | _P (FRONT_RIGHT) | 
_P (FRONT_CENTER); + break; + case 4: + /* lists both this and 'quad' but doesn't say how to distinguish the two */ + channel_mask = + _P (FRONT_LEFT) | _P (FRONT_RIGHT) | _P (REAR_LEFT) | _P (REAR_RIGHT); + break; + case 6: + channel_mask = + _P (FRONT_LEFT) | _P (FRONT_LEFT_OF_CENTER) | _P (FRONT_CENTER) | + _P (FRONT_RIGHT) | _P (FRONT_RIGHT_OF_CENTER) | _P (LFE1); + break; + default: + GST_FIXME_OBJECT (aiff, "using fallback channel layout for %d channels", + aiff->channels); + channel_mask = gst_audio_channel_get_fallback_mask (aiff->channels); + break; + } - if (channel_mask != 0) { - gst_caps_set_simple (caps, "channel-mask", GST_TYPE_BITMASK, channel_mask, - NULL); - } + if (channel_mask != 0) { + gst_caps_set_simple (caps, "channel-mask", GST_TYPE_BITMASK, channel_mask, + NULL); } GST_DEBUG_OBJECT (aiff, "Created caps: %" GST_PTR_FORMAT, caps);
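The reworked channel-mask fallback in aiffparse.c above maps channel counts to fixed layouts from AIFF-1.3.pdf. A small Python model of that switch follows; the bit positions are illustrative placeholders, not the real `GstAudioChannelPosition` values behind the `_P()` macro:

```python
# Illustrative bit positions only; the real masks use GstAudioChannelPosition
# values via the _P() macro, which this sketch does not reproduce.
FRONT_LEFT, FRONT_RIGHT, FRONT_CENTER, LFE1 = 0, 1, 2, 3
REAR_LEFT, REAR_RIGHT = 4, 5
FRONT_LEFT_OF_CENTER, FRONT_RIGHT_OF_CENTER = 6, 7


def P(pos):
    # one bit per channel position, like _P() in aiffparse.c
    return 1 << pos


def aiff_channel_mask(channels):
    """Fallback layouts from AIFF-1.3.pdf, following the switch above."""
    if channels == 1:
        return 0  # mono carries no positioned layout
    if channels == 2:
        return P(FRONT_LEFT) | P(FRONT_RIGHT)
    if channels == 3:
        return P(FRONT_LEFT) | P(FRONT_RIGHT) | P(FRONT_CENTER)
    if channels == 4:
        # the spec lists this and 'quad' without saying how to tell them apart
        return P(FRONT_LEFT) | P(FRONT_RIGHT) | P(REAR_LEFT) | P(REAR_RIGHT)
    if channels == 6:
        return (P(FRONT_LEFT) | P(FRONT_LEFT_OF_CENTER) | P(FRONT_CENTER)
                | P(FRONT_RIGHT) | P(FRONT_RIGHT_OF_CENTER) | P(LFE1))
    # other counts fall back to gst_audio_channel_get_fallback_mask()
    raise NotImplementedError(channels)
```

A non-zero mask is then set on the caps as `channel-mask`, exactly as the C code does with `gst_caps_set_simple()`.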
gst-plugins-bad-1.20.1.tar.xz/gst/aiff/gstaiffelement.c
Added
@@ -0,0 +1,50 @@ +/* -*- Mode: C; tab-width: 2; indent-tabs-mode: t; c-basic-offset: 2 -*- */ +/* GStreamer AIFF plugin initialisation + * Copyright (C) <2008> Pioneers of the Inevitable <songbird@songbirdnest.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/tag/tag.h> + +#include <gst/gst-i18n-plugin.h> + +#include "aiffelements.h" + + +GST_DEBUG_CATEGORY_STATIC (aiff_debug); +#define GST_CAT_DEFAULT (aiff_debug) + +void +aiff_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + if (g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (aiff_debug, "aiff", 0, "AIFF plugin"); +#ifdef ENABLE_NLS + GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, + LOCALEDIR); + bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); + bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); +#endif + gst_tag_register_musicbrainz_tags (); + g_once_init_leave (&res, TRUE); + } +}
gst-plugins-bad-1.18.6.tar.xz/gst/aiff/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/aiff/meson.build
Changed
@@ -1,5 +1,5 @@ aiff_sources = [ - 'aiff.c', 'aiffmux.c', 'aiffparse.c', + 'aiff.c', 'aiffmux.c', 'aiffparse.c', 'gstaiffelement.c' ] gstaiff = library('gstaiff',
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstasf.c -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstasf.c
Changed
@@ -32,16 +32,13 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_asf_mux_plugin_init (plugin)) { - return FALSE; - } - if (!gst_rtp_asf_pay_plugin_init (plugin)) { - return FALSE; - } - if (!gst_asf_parse_plugin_init (plugin)) { - return FALSE; - } - return TRUE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (asfmux, plugin); + ret |= GST_ELEMENT_REGISTER (rtpasfpay, plugin); + ret |= GST_ELEMENT_REGISTER (asfparse, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstasfmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstasfmux.c
Changed
@@ -173,6 +173,8 @@ G_DEFINE_TYPE_WITH_CODE (GstAsfMux, gst_asf_mux, GST_TYPE_ELEMENT, G_IMPLEMENT_INTERFACE (GST_TYPE_TAG_SETTER, NULL)); +GST_ELEMENT_REGISTER_DEFINE (asfmux, "asfmux", + GST_RANK_PRIMARY, GST_TYPE_ASF_MUX); static void gst_asf_mux_reset (GstAsfMux * asfmux) @@ -2455,10 +2457,3 @@ done: return ret; } - -gboolean -gst_asf_mux_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "asfmux", - GST_RANK_PRIMARY, GST_TYPE_ASF_MUX); -}
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstasfmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstasfmux.h
Changed
@@ -152,7 +152,7 @@ }; GType gst_asf_mux_get_type (void); -gboolean gst_asf_mux_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (asfmux); G_END_DECLS #endif /* __GST_ASF_MUX_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstasfparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstasfparse.c
Changed
@@ -44,6 +44,8 @@ #define gst_asf_parse_parent_class parent_class G_DEFINE_TYPE (GstAsfParse, gst_asf_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE (asfparse, "asfparse", + GST_RANK_NONE, GST_TYPE_ASF_PARSE); static gboolean gst_asf_parse_start (GstBaseParse * parse) @@ -416,10 +418,3 @@ asfparse->asfinfo = gst_asf_file_info_new (); asfparse->packetinfo = g_new0 (GstAsfPacketInfo, 1); } - -gboolean -gst_asf_parse_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "asfparse", - GST_RANK_NONE, GST_TYPE_ASF_PARSE); -}
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstasfparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstasfparse.h
Changed
@@ -71,7 +71,7 @@ }; GType gst_asf_parse_get_type(void); -gboolean gst_asf_parse_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (asfparse); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstrtpasfpay.c -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstrtpasfpay.c
Changed
@@ -57,6 +57,8 @@ #define gst_rtp_asf_pay_parent_class parent_class G_DEFINE_TYPE (GstRtpAsfPay, gst_rtp_asf_pay, GST_TYPE_RTP_BASE_PAYLOAD); +GST_ELEMENT_REGISTER_DEFINE (rtpasfpay, "rtpasfpay", + GST_RANK_NONE, GST_TYPE_RTP_ASF_PAY); static void gst_rtp_asf_pay_init (GstRtpAsfPay * rtpasfpay) @@ -460,10 +462,3 @@ gst_buffer_unref (buffer); return GST_FLOW_OK; } - -gboolean -gst_rtp_asf_pay_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "rtpasfpay", - GST_RANK_NONE, GST_TYPE_RTP_ASF_PAY); -}
gst-plugins-bad-1.18.6.tar.xz/gst/asfmux/gstrtpasfpay.h -> gst-plugins-bad-1.20.1.tar.xz/gst/asfmux/gstrtpasfpay.h
Changed
@@ -81,7 +81,8 @@ }; GType gst_rtp_asf_pay_get_type (void); -gboolean gst_rtp_asf_pay_plugin_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (rtpasfpay); G_END_DECLS #endif /* __GST_RTP_ASF_PAY_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/audiobuffersplit/gstaudiobuffersplit.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiobuffersplit/gstaudiobuffersplit.c
Changed
@@ -45,6 +45,7 @@ { PROP_0, PROP_OUTPUT_BUFFER_DURATION, + PROP_OUTPUT_BUFFER_SIZE, PROP_ALIGNMENT_THRESHOLD, PROP_DISCONT_WAIT, PROP_STRICT_BUFFER_SIZE, @@ -62,7 +63,11 @@ #define DEFAULT_MAX_SILENCE_TIME (0) #define parent_class gst_audio_buffer_split_parent_class -G_DEFINE_TYPE (GstAudioBufferSplit, gst_audio_buffer_split, GST_TYPE_ELEMENT); +G_DEFINE_TYPE_WITH_CODE (GstAudioBufferSplit, gst_audio_buffer_split, + GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_audio_buffer_split_debug, + "audiobuffersplit", 0, "Audio buffer splitter");); +GST_ELEMENT_REGISTER_DEFINE (audiobuffersplit, "audiobuffersplit", + GST_RANK_NONE, GST_TYPE_AUDIO_BUFFER_SPLIT); static GstFlowReturn gst_audio_buffer_split_sink_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer); @@ -98,6 +103,22 @@ G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | GST_PARAM_MUTABLE_READY)); + /** + * GstAudioBufferSplit:output-buffer-size + * + * Allow specifying a buffer size for splitting. Zero by default. + * Takes precedence over output-buffer-duration when set to a + * non zero value else will not be in effect. + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_OUTPUT_BUFFER_SIZE, + g_param_spec_uint ("output-buffer-size", "Output buffer size", + "Output block size in bytes, takes precedence over " + "buffer duration when set to non zero", 0, G_MAXINT, 0, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY)); + g_object_class_install_property (gobject_class, PROP_ALIGNMENT_THRESHOLD, g_param_spec_uint64 ("alignment-threshold", "Alignment Threshold", "Timestamp alignment threshold in nanoseconds", 0, @@ -171,6 +192,7 @@ self->output_buffer_duration_d = DEFAULT_OUTPUT_BUFFER_DURATION_D; self->strict_buffer_size = DEFAULT_STRICT_BUFFER_SIZE; self->gapless = DEFAULT_GAPLESS; + self->output_buffer_size = 0; self->adapter = gst_adapter_new (); @@ -211,6 +233,12 @@ goto out; } + if (self->output_buffer_size) { + self->output_buffer_duration_n = + self->output_buffer_size / GST_AUDIO_INFO_BPF (&self->info); + self->output_buffer_duration_d = GST_AUDIO_INFO_RATE (&self->info); + } + self->samples_per_buffer = (((guint64) GST_AUDIO_INFO_RATE (&self->info)) * self->output_buffer_duration_n) / self->output_buffer_duration_d; @@ -248,6 +276,10 @@ gst_value_get_fraction_denominator (value); gst_audio_buffer_split_update_samples_per_buffer (self); break; + case PROP_OUTPUT_BUFFER_SIZE: + self->output_buffer_size = g_value_get_uint (value); + gst_audio_buffer_split_update_samples_per_buffer (self); + break; case PROP_ALIGNMENT_THRESHOLD: GST_OBJECT_LOCK (self); gst_audio_stream_align_set_alignment_threshold (self->stream_align, @@ -286,6 +318,9 @@ gst_value_set_fraction (value, self->output_buffer_duration_n, self->output_buffer_duration_d); break; + case PROP_OUTPUT_BUFFER_SIZE: + g_value_set_uint (value, self->output_buffer_size); + break; case PROP_ALIGNMENT_THRESHOLD: GST_OBJECT_LOCK (self); g_value_set_uint64 (value, @@ -558,7 +593,7 @@ silence = gst_buffer_new_and_alloc (n_samples * bpf); GST_BUFFER_FLAG_SET (silence, GST_BUFFER_FLAG_GAP); gst_buffer_map (silence, &map, GST_MAP_WRITE); - gst_audio_format_fill_silence (info, map.data, map.size); + gst_audio_format_info_fill_silence (info, map.data, map.size); gst_buffer_unmap (silence, &map); gst_adapter_push (self->adapter, silence); @@ -628,13 +663,6 @@ } static GstBuffer * -gst_audio_buffer_split_clip_buffer (GstAudioBufferSplit * self, - GstBuffer * buffer, const GstSegment * segment, gint rate, gint bpf) -{ - return gst_audio_buffer_clip (buffer, segment, rate, bpf); -} - -static GstBuffer * gst_audio_buffer_split_clip_buffer_start_for_gapless (GstAudioBufferSplit * self, GstBuffer * buffer, gint rate, gint bpf) { @@ -702,9 +730,7 @@ return GST_FLOW_NOT_NEGOTIATED; } - buffer = - gst_audio_buffer_split_clip_buffer (self, buffer, &self->in_segment, rate, - bpf); + buffer = gst_audio_buffer_clip (buffer, &self->in_segment, rate, bpf); if (!buffer) return GST_FLOW_OK; @@ -892,13 +918,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_audio_buffer_split_debug, "audiobuffersplit", - 0, "Audio buffer splitter"); - - gst_element_register (plugin, "audiobuffersplit", GST_RANK_NONE, - GST_TYPE_AUDIO_BUFFER_SPLIT); - - return TRUE; + return GST_ELEMENT_REGISTER (audiobuffersplit, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/audiobuffersplit/gstaudiobuffersplit.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiobuffersplit/gstaudiobuffersplit.h
Changed
@@ -45,6 +45,7 @@ /* Properties */ gint output_buffer_duration_n; gint output_buffer_duration_d; + guint output_buffer_size; /* State */ GstSegment in_segment, out_segment; @@ -73,6 +74,7 @@ }; GType gst_audio_buffer_split_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (audiobuffersplit); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/audiofxbad/gstaudiochannelmix.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiofxbad/gstaudiochannelmix.c
Changed
@@ -91,6 +91,8 @@ GST_TYPE_AUDIO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_audio_channel_mix_debug_category, "audiochannelmix", 0, "debug category for audiochannelmix element")); +GST_ELEMENT_REGISTER_DEFINE (audiochannelmix, "audiochannelmix", GST_RANK_NONE, + GST_TYPE_AUDIO_CHANNEL_MIX); static void gst_audio_channel_mix_class_init (GstAudioChannelMixClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/audiofxbad/gstaudiochannelmix.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiofxbad/gstaudiochannelmix.h
Changed
@@ -50,6 +50,8 @@ GType gst_audio_channel_mix_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (audiochannelmix); + G_END_DECLS #endif
gst-plugins-bad-1.18.6.tar.xz/gst/audiofxbad/gstaudiofxbad.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiofxbad/gstaudiofxbad.c
Changed
@@ -26,8 +26,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "audiochannelmix", GST_RANK_NONE, - GST_TYPE_AUDIO_CHANNEL_MIX); + return GST_ELEMENT_REGISTER (audiochannelmix, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/audiolatency/gstaudiolatency.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiolatency/gstaudiolatency.c
Changed
@@ -79,15 +79,22 @@ ); #define gst_audiolatency_parent_class parent_class -G_DEFINE_TYPE (GstAudioLatency, gst_audiolatency, GST_TYPE_BIN); +G_DEFINE_TYPE_WITH_CODE (GstAudioLatency, gst_audiolatency, GST_TYPE_BIN, + GST_DEBUG_CATEGORY_INIT (gst_audiolatency_debug, "audiolatency", 0, + "audiolatency");); +GST_ELEMENT_REGISTER_DEFINE (audiolatency, "audiolatency", GST_RANK_PRIMARY, + GST_TYPE_AUDIOLATENCY); #define DEFAULT_PRINT_LATENCY FALSE +#define DEFAULT_SAMPLES_PER_BUFFER 240 + enum { PROP_0, PROP_PRINT_LATENCY, PROP_LAST_LATENCY, - PROP_AVERAGE_LATENCY + PROP_AVERAGE_LATENCY, + PROP_SAMPLES_PER_BUFFER, }; static gint64 gst_audiolatency_get_latency (GstAudioLatency * self); @@ -115,6 +122,9 @@ case PROP_AVERAGE_LATENCY: g_value_set_int64 (value, gst_audiolatency_get_average_latency (self)); break; + case PROP_SAMPLES_PER_BUFFER: + g_value_set_int (value, self->samples_per_buffer); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -131,6 +141,11 @@ case PROP_PRINT_LATENCY: self->print_latency = g_value_get_boolean (value); break; + case PROP_SAMPLES_PER_BUFFER: + self->samples_per_buffer = g_value_get_int (value); + g_object_set (self->audiosrc, + "samplesperbuffer", self->samples_per_buffer, NULL); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -161,6 +176,20 @@ "The running average latency, in microseconds", 0, G_USEC_PER_SEC, 0, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); + /** + * GstAudioLatency:samplesperbuffer: + * + * The number of audio samples in each outgoing buffer. + * See also #GstAudioTestSrc:samplesperbuffer + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_SAMPLES_PER_BUFFER, + g_param_spec_int ("samplesperbuffer", "Samples per buffer", + "Number of samples in each outgoing buffer", + 1, G_MAXINT, DEFAULT_SAMPLES_PER_BUFFER, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + gst_element_class_add_static_pad_template (gstelement_class, &src_template); gst_element_class_add_static_pad_template (gstelement_class, &sink_template); @@ -179,6 +208,7 @@ self->send_pts = 0; self->recv_pts = 0; self->print_latency = DEFAULT_PRINT_LATENCY; + self->samples_per_buffer = DEFAULT_SAMPLES_PER_BUFFER; /* Setup sinkpad */ self->sinkpad = gst_pad_new_from_static_template (&sink_template, "sink"); @@ -191,8 +221,8 @@ /* Setup srcpad */ self->audiosrc = gst_element_factory_make ("audiotestsrc", NULL); - g_object_set (self->audiosrc, "wave", 8, "samplesperbuffer", 240, - "is-live", TRUE, NULL); + g_object_set (self->audiosrc, "wave", 8, "samplesperbuffer", + DEFAULT_SAMPLES_PER_BUFFER, "is-live", TRUE, NULL); gst_bin_add (GST_BIN (self), self->audiosrc); templ = gst_static_pad_template_get (&src_template); @@ -454,11 +484,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_audiolatency_debug, "audiolatency", 0, - "audiolatency"); - - return gst_element_register (plugin, "audiolatency", GST_RANK_PRIMARY, - GST_TYPE_AUDIOLATENCY); + return GST_ELEMENT_REGISTER (audiolatency, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/audiolatency/gstaudiolatency.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiolatency/gstaudiolatency.h
Changed
@@ -56,6 +56,7 @@ /* properties */ gboolean print_latency; + gint samples_per_buffer; }; struct _GstAudioLatencyClass @@ -65,5 +66,7 @@ GType gst_audiolatency_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (audiolatency); + G_END_DECLS #endif /* __GST_AUDIOLATENCY_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/audiomixmatrix/gstaudiomixmatrix.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiomixmatrix/gstaudiomixmatrix.c
Changed
@@ -150,6 +150,8 @@ G_DEFINE_TYPE (GstAudioMixMatrix, gst_audio_mix_matrix, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (audiomixmatrix, "audiomixmatrix", GST_RANK_NONE, + GST_TYPE_AUDIO_MIX_MATRIX); static void gst_audio_mix_matrix_class_init (GstAudioMixMatrixClass * klass) @@ -743,8 +745,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "audiomixmatrix", GST_RANK_NONE, - GST_TYPE_AUDIO_MIX_MATRIX); + return GST_ELEMENT_REGISTER (audiomixmatrix, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/audiomixmatrix/gstaudiomixmatrix.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiomixmatrix/gstaudiomixmatrix.h
Changed
@@ -72,6 +72,8 @@ GType gst_audio_mix_matrix_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (audiomixmatrix); + GType gst_audio_mix_matrix_mode_get_type (void); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstspacescope.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstspacescope.c
Changed
@@ -120,7 +120,11 @@ GstBuffer * audio, GstVideoFrame * video); -G_DEFINE_TYPE (GstSpaceScope, gst_space_scope, GST_TYPE_AUDIO_VISUALIZER); +G_DEFINE_TYPE_WITH_CODE (GstSpaceScope, gst_space_scope, + GST_TYPE_AUDIO_VISUALIZER, GST_DEBUG_CATEGORY_INIT (space_scope_debug, + "spacescope", 0, "spacescope");); +GST_ELEMENT_REGISTER_DEFINE (spacescope, "spacescope", GST_RANK_NONE, + GST_TYPE_SPACE_SCOPE); static void gst_space_scope_class_init (GstSpaceScopeClass * g_class) @@ -445,12 +449,3 @@ gst_buffer_unmap (audio, &amap); return TRUE; } - -gboolean -gst_space_scope_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (space_scope_debug, "spacescope", 0, "spacescope"); - - return gst_element_register (plugin, "spacescope", GST_RANK_NONE, - GST_TYPE_SPACE_SCOPE); -}
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstspacescope.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstspacescope.h
Changed
@@ -56,7 +56,8 @@ }; GType gst_space_scope_get_type (void); -gboolean gst_space_scope_plugin_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (spacescope); G_END_DECLS #endif /* __GST_SPACE_SCOPE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstspectrascope.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstspectrascope.c
Changed
@@ -75,7 +75,11 @@ GstBuffer * audio, GstVideoFrame * video); -G_DEFINE_TYPE (GstSpectraScope, gst_spectra_scope, GST_TYPE_AUDIO_VISUALIZER); +G_DEFINE_TYPE_WITH_CODE (GstSpectraScope, gst_spectra_scope, + GST_TYPE_AUDIO_VISUALIZER, GST_DEBUG_CATEGORY_INIT (spectra_scope_debug, + "spectrascope", 0, "spectrascope");); +GST_ELEMENT_REGISTER_DEFINE (spectrascope, "spectrascope", GST_RANK_NONE, + GST_TYPE_SPECTRA_SCOPE); static void gst_spectra_scope_class_init (GstSpectraScopeClass * g_class) @@ -184,7 +188,7 @@ channels = GST_AUDIO_INFO_CHANNELS (&bscope->ainfo); - mono_adata = (gint16 *) g_memdup (amap.data, amap.size); + mono_adata = g_memdup2 (amap.data, amap.size); if (channels > 1) { guint ch = channels; @@ -228,13 +232,3 @@ gst_buffer_unmap (audio, &amap); return TRUE; } - -gboolean -gst_spectra_scope_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (spectra_scope_debug, "spectrascope", 0, - "spectrascope"); - - return gst_element_register (plugin, "spectrascope", GST_RANK_NONE, - GST_TYPE_SPECTRA_SCOPE); -}
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstspectrascope.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstspectrascope.h
Changed
@@ -48,7 +48,8 @@ }; GType gst_spectra_scope_get_type (void); -gboolean gst_spectra_scope_plugin_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (spectrascope); G_END_DECLS #endif /* __GST_SPECTRA_SCOPE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstsynaescope.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstsynaescope.c
Changed
@@ -73,7 +73,11 @@ GstBuffer * audio, GstVideoFrame * video); -G_DEFINE_TYPE (GstSynaeScope, gst_synae_scope, GST_TYPE_AUDIO_VISUALIZER); +G_DEFINE_TYPE_WITH_CODE (GstSynaeScope, gst_synae_scope, + GST_TYPE_AUDIO_VISUALIZER, GST_DEBUG_CATEGORY_INIT (synae_scope_debug, + "synaescope", 0, "synaescope");); +GST_ELEMENT_REGISTER_DEFINE (synaescope, "synaescope", GST_RANK_NONE, + GST_TYPE_SYNAE_SCOPE); static void gst_synae_scope_class_init (GstSynaeScopeClass * g_class) @@ -309,12 +313,3 @@ return TRUE; } - -gboolean -gst_synae_scope_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (synae_scope_debug, "synaescope", 0, "synaescope"); - - return gst_element_register (plugin, "synaescope", GST_RANK_NONE, - GST_TYPE_SYNAE_SCOPE); -}
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstsynaescope.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstsynaescope.h
Changed
@@ -52,7 +52,8 @@ }; GType gst_synae_scope_get_type (void); -gboolean gst_synae_scope_plugin_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (synaescope); G_END_DECLS #endif /* __GST_SYNAE_SCOPE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstwavescope.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstwavescope.c
Changed
@@ -122,7 +122,11 @@ GstBuffer * audio, GstVideoFrame * video); #define gst_wave_scope_parent_class parent_class -G_DEFINE_TYPE (GstWaveScope, gst_wave_scope, GST_TYPE_AUDIO_VISUALIZER); +G_DEFINE_TYPE_WITH_CODE (GstWaveScope, gst_wave_scope, + GST_TYPE_AUDIO_VISUALIZER, GST_DEBUG_CATEGORY_INIT (wave_scope_debug, + "wavescope", 0, "wavescope");); +GST_ELEMENT_REGISTER_DEFINE (wavescope, "wavescope", GST_RANK_NONE, + GST_TYPE_WAVE_SCOPE); static void gst_wave_scope_class_init (GstWaveScopeClass * g_class) @@ -422,12 +426,3 @@ return TRUE; } - -gboolean -gst_wave_scope_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (wave_scope_debug, "wavescope", 0, "wavescope"); - - return gst_element_register (plugin, "wavescope", GST_RANK_NONE, - GST_TYPE_WAVE_SCOPE); -}
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/gstwavescope.h -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/gstwavescope.h
Changed
@@ -53,7 +53,8 @@ }; GType gst_wave_scope_get_type (void); -gboolean gst_wave_scope_plugin_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (wavescope); G_END_DECLS #endif /* __GST_WAVE_SCOPE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/audiovisualizers/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/audiovisualizers/plugin.c
Changed
@@ -32,13 +32,14 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean res = TRUE; + gboolean ret = FALSE; - res &= gst_space_scope_plugin_init (plugin); - res &= gst_spectra_scope_plugin_init (plugin); - res &= gst_synae_scope_plugin_init (plugin); - res &= gst_wave_scope_plugin_init (plugin); - return res; + ret |= GST_ELEMENT_REGISTER (spacescope, plugin); + ret |= GST_ELEMENT_REGISTER (spectrascope, plugin); + ret |= GST_ELEMENT_REGISTER (synaescope, plugin); + ret |= GST_ELEMENT_REGISTER (wavescope, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/autoconvert/gstautoconvert.c -> gst-plugins-bad-1.20.1.tar.xz/gst/autoconvert/gstautoconvert.c
Changed
@@ -145,6 +145,8 @@ static GQuark parent_quark = 0; G_DEFINE_TYPE (GstAutoConvert, gst_auto_convert, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (autoconvert, "autoconvert", + GST_RANK_NONE, GST_TYPE_AUTO_CONVERT); static void gst_auto_convert_class_init (GstAutoConvertClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/autoconvert/gstautoconvert.h -> gst-plugins-bad-1.20.1.tar.xz/gst/autoconvert/gstautoconvert.h
Changed
@@ -61,6 +61,7 @@ }; GType gst_auto_convert_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (autoconvert); G_END_DECLS #endif /* __GST_AUTO_CONVERT_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/autoconvert/gstautovideoconvert.c -> gst-plugins-bad-1.20.1.tar.xz/gst/autoconvert/gstautovideoconvert.c
Changed
@@ -131,6 +131,8 @@ } G_DEFINE_TYPE (GstAutoVideoConvert, gst_auto_video_convert, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (autovideoconvert, "autovideoconvert", + GST_RANK_NONE, GST_TYPE_AUTO_VIDEO_CONVERT); static void gst_auto_video_convert_class_init (GstAutoVideoConvertClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/autoconvert/gstautovideoconvert.h -> gst-plugins-bad-1.20.1.tar.xz/gst/autoconvert/gstautovideoconvert.h
Changed
@@ -50,6 +50,7 @@ }; GType gst_auto_video_convert_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (autovideoconvert); G_END_DECLS #endif /* __GST_AUTO_VIDEO_CONVERT_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/autoconvert/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/autoconvert/plugin.c
Changed
@@ -28,13 +28,10 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret; + gboolean ret = FALSE; - ret = gst_element_register (plugin, "autoconvert", - GST_RANK_NONE, GST_TYPE_AUTO_CONVERT); - - ret &= gst_element_register (plugin, "autovideoconvert", - GST_RANK_NONE, GST_TYPE_AUTO_VIDEO_CONVERT); + ret |= GST_ELEMENT_REGISTER (autoconvert, plugin); + ret |= GST_ELEMENT_REGISTER (autovideoconvert, plugin); return ret; }
gst-plugins-bad-1.18.6.tar.xz/gst/bayer/gstbayer.c -> gst-plugins-bad-1.20.1.tar.xz/gst/bayer/gstbayer.c
Changed
@@ -23,19 +23,17 @@ #include <gst/gst.h> - -GType gst_bayer2rgb_get_type (void); -GType gst_rgb2bayer_get_type (void); +#include "gstbayerelements.h" static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "bayer2rgb", GST_RANK_NONE, - gst_bayer2rgb_get_type ()); - gst_element_register (plugin, "rgb2bayer", GST_RANK_NONE, - gst_rgb2bayer_get_type ()); + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (bayer2rgb, plugin); + ret |= GST_ELEMENT_REGISTER (rgb2bayer, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/bayer/gstbayer2rgb.c -> gst-plugins-bad-1.20.1.tar.xz/gst/bayer/gstbayer2rgb.c
Changed
@@ -86,6 +86,7 @@ #include <stdint.h> #endif +#include "gstbayerelements.h" #include "gstbayerorc.h" #define GST_CAT_DEFAULT gst_bayer2rgb_debug @@ -145,6 +146,8 @@ #define gst_bayer2rgb_parent_class parent_class G_DEFINE_TYPE (GstBayer2RGB, gst_bayer2rgb, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (bayer2rgb, "bayer2rgb", GST_RANK_NONE, + gst_bayer2rgb_get_type ()); static void gst_bayer2rgb_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
gst-plugins-bad-1.20.1.tar.xz/gst/bayer/gstbayerelements.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) <2020> Julian Bouzas <julian.bouzas@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_BAYER_ELEMENT_H__ +#define __GST_BAYER_ELEMENT_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +GST_ELEMENT_REGISTER_DECLARE (bayer2rgb); +GST_ELEMENT_REGISTER_DECLARE (rgb2bayer); + +#endif /* __GST_BAYER_ELEMENT_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/bayer/gstrgb2bayer.c -> gst-plugins-bad-1.20.1.tar.xz/gst/bayer/gstrgb2bayer.c
Changed
@@ -25,6 +25,7 @@ #include <gst/gst.h> #include <gst/base/gstbasetransform.h> #include <gst/video/video.h> +#include "gstbayerelements.h" #include "gstrgb2bayer.h" #define GST_CAT_DEFAULT gst_rgb2bayer_debug @@ -76,6 +77,8 @@ #define gst_rgb2bayer_parent_class parent_class G_DEFINE_TYPE (GstRGB2Bayer, gst_rgb2bayer, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (rgb2bayer, "rgb2bayer", GST_RANK_NONE, + gst_rgb2bayer_get_type ()); static void gst_rgb2bayer_class_init (GstRGB2BayerClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstcamerabin2.c -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstcamerabin2.c
Changed
@@ -311,6 +311,11 @@ return gst_camera_bin_type; } +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (camerabin, "camerabin", GST_RANK_NONE, + gst_camera_bin2_get_type (), GST_DEBUG_CATEGORY_INIT (gst_camera_bin_debug, + "camerabin", 0, "CameraBin"); + ); + /* GObject class functions */ static void gst_camera_bin_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -2390,12 +2395,3 @@ break; } } - -gboolean -gst_camera_bin2_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_camera_bin_debug, "camerabin", 0, "CameraBin"); - - return gst_element_register (plugin, "camerabin", GST_RANK_NONE, - gst_camera_bin2_get_type ()); -}
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstcamerabin2.h -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstcamerabin2.h
Changed
@@ -161,7 +161,7 @@ }; GType gst_camera_bin2_get_type (void); -gboolean gst_camera_bin2_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (camerabin); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstplugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstplugin.c
Changed
@@ -30,14 +30,13 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_viewfinder_bin_plugin_init (plugin)) - return FALSE; - if (!gst_wrapper_camera_bin_src_plugin_init (plugin)) - return FALSE; - if (!gst_camera_bin2_plugin_init (plugin)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (viewfinderbin, plugin); + ret |= GST_ELEMENT_REGISTER (wrappercamerabinsrc, plugin); + ret |= GST_ELEMENT_REGISTER (camerabin, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstviewfinderbin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstviewfinderbin.c
Changed
@@ -63,6 +63,11 @@ /* class initialization */ #define gst_viewfinder_bin_parent_class parent_class G_DEFINE_TYPE (GstViewfinderBin, gst_viewfinder_bin, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (viewfinderbin, "viewfinderbin", + GST_RANK_NONE, gst_viewfinder_bin_get_type (), + GST_DEBUG_CATEGORY_INIT (gst_viewfinder_bin_debug, "viewfinderbin", 0, + "ViewFinderBin"); + ); static void gst_viewfinder_bin_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * spec); @@ -358,12 +363,3 @@ break; } } - -gboolean -gst_viewfinder_bin_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_viewfinder_bin_debug, "viewfinderbin", 0, - "ViewFinderBin"); - return gst_element_register (plugin, "viewfinderbin", GST_RANK_NONE, - gst_viewfinder_bin_get_type ()); -}
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstviewfinderbin.h -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstviewfinderbin.h
Changed
@@ -53,7 +53,7 @@ }; GType gst_viewfinder_bin_get_type (void); -gboolean gst_viewfinder_bin_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (viewfinderbin); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstwrappercamerabinsrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstwrappercamerabinsrc.c
Changed
@@ -52,6 +52,8 @@ #define gst_wrapper_camera_bin_src_parent_class parent_class G_DEFINE_TYPE (GstWrapperCameraBinSrc, gst_wrapper_camera_bin_src, GST_TYPE_BASE_CAMERA_SRC); +GST_ELEMENT_REGISTER_DEFINE (wrappercamerabinsrc, "wrappercamerabinsrc", + GST_RANK_NONE, gst_wrapper_camera_bin_src_get_type ()); static GstStaticPadTemplate vfsrc_template = GST_STATIC_PAD_TEMPLATE (GST_BASE_CAMERA_SRC_VIEWFINDER_PAD_NAME, @@ -596,10 +598,10 @@ video_recording_tee = gst_element_factory_make ("tee", "video_rec_tee"); gst_bin_add (GST_BIN_CAST (self), video_recording_tee); /* TODO check returns */ self->video_tee_vf_pad = - gst_element_get_request_pad (video_recording_tee, "src_%u"); + gst_element_request_pad_simple (video_recording_tee, "src_%u"); self->video_tee_sink = gst_element_get_static_pad (video_recording_tee, "sink"); - tee_pad = gst_element_get_request_pad (video_recording_tee, "src_%u"); + tee_pad = gst_element_request_pad_simple (video_recording_tee, "src_%u"); gst_ghost_pad_set_target (GST_GHOST_PAD (self->vidsrc), tee_pad); gst_object_unref (tee_pad); @@ -1169,10 +1171,3 @@ self->mode = GST_BASE_CAMERA_SRC_CAST (self)->mode; self->app_vid_filter = NULL; } - -gboolean -gst_wrapper_camera_bin_src_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "wrappercamerabinsrc", GST_RANK_NONE, - gst_wrapper_camera_bin_src_get_type ()); -}
gst-plugins-bad-1.18.6.tar.xz/gst/camerabin2/gstwrappercamerabinsrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/camerabin2/gstwrappercamerabinsrc.h
Changed
@@ -124,7 +124,7 @@ GstBaseCameraSrcClass parent; }; -gboolean gst_wrapper_camera_bin_src_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (wrappercamerabinsrc); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha
Added
+(directory)
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstalphacombine.c
Added
@@ -0,0 +1,673 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-alphacombine + * @title: Alpha Combiner + * + * This element can combine a Luma plane from one stream as being the alpha + * plane of another stream. This element can only work with planar formats + * that have an equivalent format with an alpha plane. This is notably used to + * combine VP8/VP9 alpha streams from WebM container. + * + * ## Example launch line + * |[ + * gst-launch-1.0 -v videotestsrc ! c. videotestsrc pattern=ball ! c. + * alphacombine name=c ! compositor ! videoconvert ! autovideosink + * ]| This pipeline uses luma of a ball test pattern as alpha, combined with + * default test pattern and renders the resulting moving ball on a checker + * board. 
+ * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/video/video.h> + +#include "gstalphacombine.h" + + +#define SUPPORTED_SINK_FORMATS "{ I420, NV12 }" +#define SUPPORTED_ALPHA_FORMATS "{ GRAY8, I420, NV12 }" +#define SUPPORTED_SRC_FORMATS "{ A420, AV12 }" + +/* *INDENT-OFF* */ +struct { + GstVideoFormat sink; + GstVideoFormat alpha; + GstVideoFormat src; +} format_map[] = { + { + .sink = GST_VIDEO_FORMAT_I420, + .alpha = GST_VIDEO_FORMAT_I420, + .src = GST_VIDEO_FORMAT_A420 + },{ + .sink = GST_VIDEO_FORMAT_I420, + .alpha = GST_VIDEO_FORMAT_GRAY8, + .src = GST_VIDEO_FORMAT_A420 + },{ + .sink = GST_VIDEO_FORMAT_I420, + .alpha = GST_VIDEO_FORMAT_NV12, + .src = GST_VIDEO_FORMAT_A420 + }, { + .sink = GST_VIDEO_FORMAT_NV12, + .alpha = GST_VIDEO_FORMAT_NV12, + .src = GST_VIDEO_FORMAT_AV12, + }, { + .sink = GST_VIDEO_FORMAT_NV12, + .alpha = GST_VIDEO_FORMAT_GRAY8, + .src = GST_VIDEO_FORMAT_AV12 + },{ + .sink = GST_VIDEO_FORMAT_NV12, + .alpha = GST_VIDEO_FORMAT_I420, + .src = GST_VIDEO_FORMAT_AV12 + }, +}; +/* *INDENT-ON* */ + +GST_DEBUG_CATEGORY_STATIC (alphacombine_debug); +#define GST_CAT_DEFAULT (alphacombine_debug) + +struct _GstAlphaCombine +{ + GstElement parent; + + GstPad *sink_pad; + GstPad *alpha_pad; + GstPad *src_pad; + + /* protected by sink_pad stream lock */ + GstBuffer *last_alpha_buffer; + GstFlowReturn last_flow_ret; + + GMutex buffer_lock; + GCond buffer_cond; + GstBuffer *alpha_buffer; + /* Ref-counted flushing state */ + guint flushing; + + GstVideoInfo sink_vinfo; + GstVideoInfo alpha_vinfo; + GstVideoFormat src_format; + + guint sink_format_cookie; + guint alpha_format_cookie; +}; + +#define gst_alpha_combine_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstAlphaCombine, gst_alpha_combine, + GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (alphacombine_debug, "alphacombine", 0, + "Alpha Combiner")); + +GST_ELEMENT_REGISTER_DEFINE (alpha_combine, "alphacombine", + GST_RANK_NONE, GST_TYPE_ALPHA_COMBINE); + 
+static GstStaticPadTemplate gst_alpha_combine_sink_template = +GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (SUPPORTED_SINK_FORMATS)) + ); + +static GstStaticPadTemplate gst_alpha_combine_alpha_template = +GST_STATIC_PAD_TEMPLATE ("alpha", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (SUPPORTED_ALPHA_FORMATS)) + ); + +static GstStaticPadTemplate gst_alpha_combine_src_template = +GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (SUPPORTED_SRC_FORMATS)) + ); + +static void +gst_alpha_combine_unlock (GstAlphaCombine * self) +{ + g_mutex_lock (&self->buffer_lock); + self->flushing++; + g_cond_broadcast (&self->buffer_cond); + g_mutex_unlock (&self->buffer_lock); +} + +static void +gst_alpha_combine_unlock_stop (GstAlphaCombine * self) +{ + g_mutex_lock (&self->buffer_lock); + g_assert (self->flushing); + self->flushing--; + + /* Reset the format cookies to ensure they are equal */ + if (!self->flushing) { + self->sink_format_cookie = 0; + self->alpha_format_cookie = 0; + } + + g_mutex_unlock (&self->buffer_lock); +} + +static void +gst_alpha_combine_reset (GstAlphaCombine * self) +{ + gst_buffer_replace (&self->alpha_buffer, NULL); + gst_buffer_replace (&self->last_alpha_buffer, NULL); + self->last_flow_ret = GST_FLOW_OK; +} + +/* + * gst_alpha_combine_negotiate: + * @self: #GstAlphaCombine pointer + * + * Verify that the stream and alpha stream format are compatible and fail + * otherwise. There is no effort in helping upstream to dynamically negotiate + * a valid combination to keep the complexity low, and because this would be a + * very atypical usage. 
+ */ +static gboolean +gst_alpha_combine_negotiate (GstAlphaCombine * self) +{ + gint i; + GstVideoFormat src_format = GST_VIDEO_FORMAT_UNKNOWN; + GstVideoFormat sink_format = GST_VIDEO_INFO_FORMAT (&self->sink_vinfo); + GstVideoFormat alpha_format = GST_VIDEO_INFO_FORMAT (&self->alpha_vinfo); + + if (self->src_format != GST_VIDEO_FORMAT_UNKNOWN) + return TRUE; + + for (i = 0; i < G_N_ELEMENTS (format_map); i++) { + if (format_map[i].sink == sink_format + && format_map[i].alpha == alpha_format) { + src_format = format_map[i].src; + break; + } + } + + if (src_format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ELEMENT_ERROR (self, STREAM, FORMAT, ("Unsupported formats."), + ("Cannot combine '%s' and '%s' into any supported transparent format", + gst_video_format_to_string (sink_format), + gst_video_format_to_string (alpha_format))); + return FALSE; + } + + if (GST_VIDEO_INFO_COLORIMETRY (&self->sink_vinfo).range != + GST_VIDEO_INFO_COLORIMETRY (&self->alpha_vinfo).range) { + GST_ELEMENT_ERROR (self, STREAM, FORMAT, ("Color range mismatch"), + ("We can only combine buffers if they have the same color range.")); + return FALSE; + } + + self->src_format = src_format; + return TRUE; +} + +static GstFlowReturn +gst_alpha_combine_peek_alpha_buffer (GstAlphaCombine * self, + GstBuffer ** alpha_buffer) +{ + g_mutex_lock (&self->buffer_lock); + + while (!self->alpha_buffer && !self->flushing) + g_cond_wait (&self->buffer_cond, &self->buffer_lock); + + if (self->flushing) { + g_mutex_unlock (&self->buffer_lock); + return GST_FLOW_FLUSHING; + } + + /* Now is a good time to validate the formats, as the alpha_vinfo won't be + * updated until we signal this alpha_buffer as being consumed */ + if (!gst_alpha_combine_negotiate (self)) { + g_mutex_unlock (&self->buffer_lock); + return GST_FLOW_NOT_NEGOTIATED; + } + + *alpha_buffer = gst_buffer_ref (self->alpha_buffer); + g_mutex_unlock (&self->buffer_lock); + + if (GST_BUFFER_FLAG_IS_SET (*alpha_buffer, GST_BUFFER_FLAG_GAP)) { + if 
(!self->last_alpha_buffer) { + GST_ELEMENT_ERROR (self, STREAM, WRONG_TYPE, + ("Cannot handle streams without an initial alpha buffer."), (NULL)); + gst_clear_buffer (alpha_buffer); + return GST_FLOW_ERROR; + } + + /* Re-use the last alpha buffer if one has gone missing */ + gst_buffer_replace (alpha_buffer, self->last_alpha_buffer); + } + + return GST_FLOW_OK; +} + +static void +gst_alpha_combine_pop_alpha_buffer (GstAlphaCombine * self, + GstFlowReturn flow_ret) +{ + g_mutex_lock (&self->buffer_lock); + self->last_flow_ret = flow_ret; + gst_clear_buffer (&self->alpha_buffer); + g_cond_broadcast (&self->buffer_cond); + g_mutex_unlock (&self->buffer_lock); +} + +static GstFlowReturn +gst_alpha_combine_push_alpha_buffer (GstAlphaCombine * self, GstBuffer * buffer) +{ + GstFlowReturn ret; + + g_mutex_lock (&self->buffer_lock); + + /* We wait for the alpha_buffer to be consumed and store the buffer for the + * sink_chain to pick it up */ + while (self->alpha_buffer && !self->flushing) + g_cond_wait (&self->buffer_cond, &self->buffer_lock); + + if (self->flushing) { + gst_buffer_unref (buffer); + g_mutex_unlock (&self->buffer_lock); + return GST_FLOW_FLUSHING; + } + + self->alpha_buffer = buffer; + GST_DEBUG_OBJECT (self, "Stored pending alpha buffer %p", buffer); + g_cond_signal (&self->buffer_cond); + ret = self->last_flow_ret; + g_mutex_unlock (&self->buffer_lock); + + return ret; +} + +static GstFlowReturn +gst_alpha_combine_sink_chain (GstPad * pad, GstObject * object, + GstBuffer * src_buffer) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (object); + GstFlowReturn ret; + GstVideoMeta *vmeta; + GstBuffer *alpha_buffer; + GstMemory *alpha_mem = NULL; + gsize alpha_skip = 0; + gint alpha_stride; + GstBuffer *buffer; + guint alpha_plane_idx; + + ret = gst_alpha_combine_peek_alpha_buffer (self, &alpha_buffer); + if (ret != GST_FLOW_OK) + return ret; + + GST_DEBUG_OBJECT (self, "Combining buffer %p with alpha buffer %p", + src_buffer, alpha_buffer); + + vmeta = 
gst_buffer_get_video_meta (alpha_buffer); + if (vmeta) { + guint idx, length; + if (gst_buffer_find_memory (alpha_buffer, vmeta->offset[GST_VIDEO_COMP_Y], + 1, &idx, &length, &alpha_skip)) { + alpha_mem = gst_buffer_get_memory (alpha_buffer, idx); + } + + alpha_stride = vmeta->stride[GST_VIDEO_COMP_Y]; + } else { + alpha_mem = gst_buffer_get_memory (alpha_buffer, 0); + alpha_stride = self->alpha_vinfo.stride[GST_VIDEO_COMP_Y]; + } + + if (!alpha_mem) { + gst_buffer_unref (alpha_buffer); + gst_buffer_unref (src_buffer); + GST_ELEMENT_ERROR (self, STREAM, WRONG_TYPE, + ("Invalid alpha video frame."), ("Could not find the plane")); + return GST_FLOW_ERROR; + } + + /* FIXME use some GstBuffer cache to reduce run-time allocation */ + buffer = gst_buffer_copy (src_buffer); + vmeta = gst_buffer_get_video_meta (buffer); + if (!vmeta) + vmeta = gst_buffer_add_video_meta (buffer, 0, + GST_VIDEO_INFO_FORMAT (&self->sink_vinfo), + GST_VIDEO_INFO_WIDTH (&self->sink_vinfo), + GST_VIDEO_INFO_HEIGHT (&self->sink_vinfo)); + + alpha_skip += gst_buffer_get_size (buffer); + gst_buffer_append_memory (buffer, alpha_mem); + + alpha_plane_idx = GST_VIDEO_INFO_N_PLANES (&self->sink_vinfo); + vmeta->offset[alpha_plane_idx] = alpha_skip; + vmeta->stride[alpha_plane_idx] = alpha_stride; + + vmeta->format = self->src_format; + vmeta->n_planes = alpha_plane_idx + 1; + + /* Keep the original GstBuffer alive to make this buffer pool friendly */ + gst_buffer_add_parent_buffer_meta (buffer, src_buffer); + gst_buffer_add_parent_buffer_meta (buffer, alpha_buffer); + + gst_buffer_replace (&self->last_alpha_buffer, alpha_buffer); + gst_buffer_unref (src_buffer); + gst_buffer_unref (alpha_buffer); + + ret = gst_pad_push (self->src_pad, buffer); + gst_alpha_combine_pop_alpha_buffer (self, ret); + + return ret; +} + +static GstFlowReturn +gst_alpha_combine_alpha_chain (GstPad * pad, GstObject * object, + GstBuffer * buffer) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (object); + + return 
gst_alpha_combine_push_alpha_buffer (self, buffer); +} + +static gboolean +gst_alpha_combine_set_sink_format (GstAlphaCombine * self, GstCaps * caps) +{ + GstVideoFormat sink_format, src_format = GST_VIDEO_FORMAT_UNKNOWN; + GstEvent *event; + gint i; + gboolean ret; + + if (!gst_video_info_from_caps (&self->sink_vinfo, caps)) { + GST_ELEMENT_ERROR (self, STREAM, FORMAT, ("Invalid video format"), (NULL)); + return FALSE; + } + + sink_format = GST_VIDEO_INFO_FORMAT (&self->sink_vinfo); + + /* The sink format determines the src format, though we cannot fully validate + * the negotiation here, since we don't have the alpha format yet. */ + for (i = 0; i < G_N_ELEMENTS (format_map); i++) { + if (format_map[i].sink == sink_format) { + src_format = format_map[i].src; + break; + } + } + + if (src_format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ELEMENT_ERROR (self, STREAM, FORMAT, ("Unsupported formats."), + ("Sink format '%s' not supported.", + gst_video_format_to_string (sink_format))); + return FALSE; + } + + caps = gst_caps_copy (caps); + gst_caps_set_simple (caps, "format", G_TYPE_STRING, + gst_video_format_to_string (src_format), NULL); + event = gst_event_new_caps (caps); + gst_caps_unref (caps); + + ret = gst_pad_push_event (self->src_pad, event); + + /* signal the sink format change */ + g_mutex_lock (&self->buffer_lock); + self->sink_format_cookie++; + g_cond_signal (&self->buffer_cond); + g_mutex_unlock (&self->buffer_lock); + + return ret; +} + +static gboolean +gst_alpha_combine_set_alpha_format (GstAlphaCombine * self, GstCaps * caps) +{ + /* We wait for any pending alpha_buffer to be consumed, so that we don't + * pick the caps too soon */ + g_mutex_lock (&self->buffer_lock); + + while (self->alpha_buffer && !self->flushing) + g_cond_wait (&self->buffer_cond, &self->buffer_lock); + + if (self->flushing) { + g_mutex_unlock (&self->buffer_lock); + return 
FALSE; + } + + if (!gst_video_info_from_caps (&self->alpha_vinfo, caps)) { + g_mutex_unlock (&self->buffer_lock); + GST_ELEMENT_ERROR (self, STREAM, FORMAT, ("Invalid video format"), (NULL)); + return FALSE; + } + + self->alpha_format_cookie++; + + /* wait for the matching format change on the sink pad */ + while (self->alpha_format_cookie != self->sink_format_cookie && + !self->flushing) + g_cond_wait (&self->buffer_cond, &self->buffer_lock); + + g_mutex_unlock (&self->buffer_lock); + + return TRUE; +} + +static gboolean +gst_alpha_combine_sink_event (GstPad * pad, GstObject * object, + GstEvent * event) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (object); + + switch (event->type) { + case GST_EVENT_FLUSH_START: + gst_alpha_combine_unlock (self); + break; + case GST_EVENT_FLUSH_STOP: + gst_alpha_combine_unlock_stop (self); + break; + case GST_EVENT_CAPS: + { + GstCaps *caps; + gboolean ret; + + gst_event_parse_caps (event, &caps); + ret = gst_alpha_combine_set_sink_format (self, caps); + gst_event_unref (event); + + return ret; + } + default: + break; + } + + return gst_pad_event_default (pad, object, event); +} + +static gboolean +gst_alpha_combine_alpha_event (GstPad * pad, GstObject * object, + GstEvent * event) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (object); + + switch (event->type) { + case GST_EVENT_FLUSH_START: + gst_alpha_combine_unlock (self); + break; + case GST_EVENT_FLUSH_STOP: + gst_alpha_combine_unlock_stop (self); + gst_alpha_combine_reset (self); + break; + case GST_EVENT_CAPS: + { + GstCaps *caps; + gst_event_parse_caps (event, &caps); + gst_alpha_combine_set_alpha_format (self, caps); + } + default: + break; + } + + /* Events are duplicated over both branches, so just drop this secondary + * stream's copy and use the one from the main stream. 
*/ + gst_event_unref (event); + return TRUE; +} + +static gboolean +gst_alpha_combine_sink_query (GstPad * pad, GstObject * object, + GstQuery * query) +{ + switch (query->type) { + case GST_QUERY_ALLOCATION: + { + int i; + + if (!gst_pad_query_default (pad, object, query)) + return FALSE; + + /* Ensure NULL pool because it cannot be shared between the 2 decoders. + * Ideally, we should cache the downstream query and use it for both + * decoders, but it is hard to know when we should refresh it */ + for (i = 0; i < gst_query_get_n_allocation_pools (query); i++) { + guint size = 0, min = 0, max = 0; + gst_query_parse_nth_allocation_pool (query, i, NULL, &size, &min, &max); + gst_query_set_nth_allocation_pool (query, i, NULL, size, min, max); + } + + return TRUE; + break; + } + default: + break; + } + + return gst_pad_query_default (pad, object, query); +} + +static GstStateChangeReturn +gst_alpha_combine_change_state (GstElement * element, GstStateChange transition) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (element); + GstStateChangeReturn ret; + + switch (transition) { + case GST_STATE_CHANGE_READY_TO_PAUSED: + gst_alpha_combine_unlock_stop (self); + break; + case GST_STATE_CHANGE_PAUSED_TO_READY: + gst_alpha_combine_unlock (self); + break; + default: + break; + } + + ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); + + switch (transition) { + case GST_STATE_CHANGE_PAUSED_TO_READY: + gst_alpha_combine_reset (self); + self->src_format = GST_VIDEO_FORMAT_UNKNOWN; + gst_video_info_init (&self->sink_vinfo); + gst_video_info_init (&self->alpha_vinfo); + self->sink_format_cookie = 0; + self->alpha_format_cookie = 0; + break; + default: + break; + } + + return ret; +} + +static void +gst_alpha_combine_dispose (GObject * object) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (object); + + g_clear_object (&self->sink_pad); + g_clear_object (&self->alpha_pad); + g_clear_object (&self->src_pad); + + G_OBJECT_CLASS (parent_class)->dispose 
(object); +} + +static void +gst_alpha_combine_finalize (GObject * object) +{ + GstAlphaCombine *self = GST_ALPHA_COMBINE (object); + + g_mutex_clear (&self->buffer_lock); + g_cond_clear (&self->buffer_cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_alpha_combine_class_init (GstAlphaCombineClass * klass) +{ + GstElementClass *element_class = (GstElementClass *) klass; + GObjectClass *object_class = (GObjectClass *) klass; + + gst_element_class_set_static_metadata (element_class, + "Alpha Combiner", "Codec/Demuxer", + "Use luma from an opaque stream as alpha plane on another", + "Nicolas Dufresne <nicolas.dufresne@collabora.com>"); + + gst_element_class_add_static_pad_template (element_class, + &gst_alpha_combine_sink_template); + gst_element_class_add_static_pad_template (element_class, + &gst_alpha_combine_alpha_template); + gst_element_class_add_static_pad_template (element_class, + &gst_alpha_combine_src_template); + + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_alpha_combine_change_state); + + object_class->dispose = GST_DEBUG_FUNCPTR (gst_alpha_combine_dispose); + object_class->finalize = GST_DEBUG_FUNCPTR (gst_alpha_combine_finalize); +} + +static void +gst_alpha_combine_init (GstAlphaCombine * self) +{ + gst_element_create_all_pads (GST_ELEMENT (self)); + self->sink_pad = gst_element_get_static_pad (GST_ELEMENT (self), "sink"); + self->alpha_pad = gst_element_get_static_pad (GST_ELEMENT (self), "alpha"); + self->src_pad = gst_element_get_static_pad (GST_ELEMENT (self), "src"); + self->flushing = 1; + + g_mutex_init (&self->buffer_lock); + g_cond_init (&self->buffer_cond); + + GST_PAD_SET_PROXY_SCHEDULING (self->sink_pad); + GST_PAD_SET_PROXY_SCHEDULING (self->src_pad); + + GST_PAD_SET_PROXY_ALLOCATION (self->sink_pad); + GST_PAD_SET_PROXY_ALLOCATION (self->alpha_pad); + + gst_pad_set_chain_function (self->sink_pad, gst_alpha_combine_sink_chain); + gst_pad_set_chain_function (self->alpha_pad, 
gst_alpha_combine_alpha_chain); + + gst_pad_set_event_function (self->sink_pad, gst_alpha_combine_sink_event); + gst_pad_set_event_function (self->alpha_pad, gst_alpha_combine_alpha_event); + + gst_pad_set_query_function (self->sink_pad, gst_alpha_combine_sink_query); + gst_pad_set_query_function (self->alpha_pad, gst_alpha_combine_sink_query); +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstalphacombine.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ALPHA_COMBINE_H__ +#define __GST_ALPHA_COMBINE_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +#define GST_TYPE_ALPHA_COMBINE (gst_alpha_combine_get_type()) +G_DECLARE_FINAL_TYPE (GstAlphaCombine, + gst_alpha_combine, GST, ALPHA_COMBINE, GstElement); + +GST_ELEMENT_REGISTER_DECLARE (alpha_combine); + +G_END_DECLS +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstalphadecodebin.c
Added
@@ -0,0 +1,214 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/pbutils/pbutils.h> + +#include "gstalphadecodebin.h" + +GST_DEBUG_CATEGORY_STATIC (alphadecodebin_debug); +#define GST_CAT_DEFAULT (alphadecodebin_debug) + +typedef struct +{ + GstBin parent; + + gboolean constructed; + const gchar *missing_element; +} GstAlphaDecodeBinPrivate; + +#define gst_alpha_decode_bin_parent_class parent_class +G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstAlphaDecodeBin, gst_alpha_decode_bin, + GST_TYPE_BIN, + G_ADD_PRIVATE (GstAlphaDecodeBin); + GST_DEBUG_CATEGORY_INIT (alphadecodebin_debug, "alphadecodebin", 0, + "alphadecodebin")); + +static GstStaticPadTemplate gst_alpha_decode_bin_src_template = +GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static gboolean +gst_alpha_decode_bin_open (GstAlphaDecodeBin * self) +{ + GstAlphaDecodeBinPrivate *priv = + gst_alpha_decode_bin_get_instance_private (self); + + if (priv->missing_element) { + gst_element_post_message (GST_ELEMENT (self), + gst_missing_element_message_new (GST_ELEMENT (self), + 
priv->missing_element)); + } else if (!priv->constructed) { + GST_ELEMENT_ERROR (self, CORE, FAILED, + ("Failed to construct alpha decoder pipeline."), (NULL)); + } + + return priv->constructed; +} + +static GstStateChangeReturn +gst_alpha_decode_bin_change_state (GstElement * element, + GstStateChange transition) +{ + GstAlphaDecodeBin *self = GST_ALPHA_DECODE_BIN (element); + + switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + if (!gst_alpha_decode_bin_open (self)) + return GST_STATE_CHANGE_FAILURE; + break; + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + +static void +gst_alpha_decode_bin_constructed (GObject * obj) +{ + GstAlphaDecodeBin *self = GST_ALPHA_DECODE_BIN (obj); + GstAlphaDecodeBinPrivate *priv = + gst_alpha_decode_bin_get_instance_private (self); + GstAlphaDecodeBinClass *klass = GST_ALPHA_DECODE_BIN_GET_CLASS (self); + GstPad *src_gpad, *sink_gpad; + GstPad *src_pad = NULL, *sink_pad = NULL; + GstElement *alphademux = NULL; + GstElement *queue = NULL; + GstElement *alpha_queue = NULL; + GstElement *decoder = NULL; + GstElement *alpha_decoder = NULL; + GstElement *alphacombine = NULL; + + /* setup ghost pads */ + sink_gpad = gst_ghost_pad_new_no_target_from_template ("sink", + gst_element_class_get_pad_template (GST_ELEMENT_CLASS (klass), "sink")); + gst_element_add_pad (GST_ELEMENT (self), sink_gpad); + + src_gpad = gst_ghost_pad_new_no_target_from_template ("src", + gst_element_class_get_pad_template (GST_ELEMENT_CLASS (klass), "src")); + gst_element_add_pad (GST_ELEMENT (self), src_gpad); + + /* create elements */ + alphademux = gst_element_factory_make ("codecalphademux", NULL); + if (!alphademux) { + priv->missing_element = "codecalphademux"; + goto cleanup; + } + + queue = gst_element_factory_make ("queue", NULL); + alpha_queue = gst_element_factory_make ("queue", NULL); + if (!queue || !alpha_queue) { + priv->missing_element = "queue"; + goto cleanup; + } + + decoder = 
gst_element_factory_make (klass->decoder_name, "maindec"); + if (!decoder) { + priv->missing_element = klass->decoder_name; + goto cleanup; + } + + alpha_decoder = gst_element_factory_make (klass->decoder_name, "alphadec"); + if (!alpha_decoder) { + priv->missing_element = klass->decoder_name; + goto cleanup; + } + + /* We disable QoS on decoders because we need to maintain frame pairing in + * order for alphacombine to work. */ + g_object_set (decoder, "qos", FALSE, NULL); + g_object_set (alpha_decoder, "qos", FALSE, NULL); + + alphacombine = gst_element_factory_make ("alphacombine", NULL); + if (!alphacombine) { + priv->missing_element = "alphacombine"; + goto cleanup; + } + + gst_bin_add_many (GST_BIN (self), alphademux, queue, alpha_queue, decoder, + alpha_decoder, alphacombine, NULL); + + /* link elements */ + sink_pad = gst_element_get_static_pad (alphademux, "sink"); + gst_ghost_pad_set_target (GST_GHOST_PAD (sink_gpad), sink_pad); + gst_clear_object (&sink_pad); + + gst_element_link_pads (alphademux, "src", queue, "sink"); + gst_element_link_pads (queue, "src", decoder, "sink"); + gst_element_link_pads (decoder, "src", alphacombine, "sink"); + + gst_element_link_pads (alphademux, "alpha", alpha_queue, "sink"); + gst_element_link_pads (alpha_queue, "src", alpha_decoder, "sink"); + gst_element_link_pads (alpha_decoder, "src", alphacombine, "alpha"); + + src_pad = gst_element_get_static_pad (alphacombine, "src"); + gst_ghost_pad_set_target (GST_GHOST_PAD (src_gpad), src_pad); + gst_object_unref (src_pad); + + g_object_set (queue, "max-size-bytes", 0, "max-size-time", 0, + "max-size-buffers", 1, NULL); + g_object_set (alpha_queue, "max-size-bytes", 0, "max-size-time", 0, + "max-size-buffers", 1, NULL); + + /* signal success, we will handle this in NULL->READY transition */ + priv->constructed = TRUE; + return; + +cleanup: + gst_clear_object (&alphademux); + gst_clear_object (&queue); + gst_clear_object (&alpha_queue); + gst_clear_object (&decoder); + 
gst_clear_object (&alpha_decoder); + gst_clear_object (&alphacombine); + + G_OBJECT_CLASS (parent_class)->constructed (obj); +} + +static void +gst_alpha_decode_bin_class_init (GstAlphaDecodeBinClass * klass) +{ + GstElementClass *element_class = (GstElementClass *) klass; + GObjectClass *obj_class = (GObjectClass *) klass; + + /* This is needed to access the subclass's class structure; otherwise we + * cannot read the class parameters */ + obj_class->constructed = gst_alpha_decode_bin_constructed; + + gst_element_class_add_static_pad_template (element_class, + &gst_alpha_decode_bin_src_template); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_alpha_decode_bin_change_state); + + /* let's make the doc generator happy */ + gst_type_mark_as_plugin_api (GST_TYPE_ALPHA_DECODE_BIN, 0); +} + +static void +gst_alpha_decode_bin_init (GstAlphaDecodeBin * self) +{ +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstalphadecodebin.h
Added
@@ -0,0 +1,48 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ALPHA_DECODE_BIN_H__ +#define __GST_ALPHA_DECODE_BIN_H__ + +#include <gst/gst.h> + +/* When wrapping, use the original rank plus this offset. The ad-hoc rules is + * that hardware implementation will use PRIMARY+1 or +2 to override the + * software decoder, so the offset must be large enough to jump over those. + * This should also be small enough so that a marginal (64) or secondary + * wrapper does not cross the PRIMARY line. + */ +#define GST_ALPHA_DECODE_BIN_RANK_OFFSET 10 + +G_BEGIN_DECLS + +#define GST_TYPE_ALPHA_DECODE_BIN (gst_alpha_decode_bin_get_type()) +G_DECLARE_DERIVABLE_TYPE (GstAlphaDecodeBin, + gst_alpha_decode_bin, GST, ALPHA_DECODE_BIN, GstBin); + +struct _GstAlphaDecodeBinClass +{ + GstBinClass parent_class; + + const gchar *decoder_name; +}; + +G_END_DECLS +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstcodecalphademux.c
Added
@@ -0,0 +1,316 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-codecalphademux + * @title: CODEC Alpha Demuxer + * + * Extracts the CODEC (typically VP8/VP9) alpha stream stored as meta and + * exposes it as a stream. This element allows using single-stream VP8/VP9 + * decoders to decode both streams. + * + * ## Example launch line + * |[ + * gst-launch-1.0 -v filesrc location=transparency.webm ! matroskademux ! + * codecalphademux name=d + * d.src ! queue ! vp9dec ! autovideosink + * d.alpha ! queue ! vp9dec ! autovideosink + * ]| This pipeline splits and decodes the video and the alpha stream, showing + * the results in separate windows. 
+ * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/video/video.h> +#include <gst/base/gstflowcombiner.h> + +#include "gstcodecalphademux.h" + +GST_DEBUG_CATEGORY_STATIC (codecalphademux_debug); +#define GST_CAT_DEFAULT (codecalphademux_debug) + +struct _GstCodecAlphaDemux +{ + GstElement parent; + + GstPad *sink_pad; + GstPad *src_pad; + GstPad *alpha_pad; + + GstFlowCombiner *flow_combiner; +}; + +#define gst_codec_alpha_demux_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstCodecAlphaDemux, gst_codec_alpha_demux, + GST_TYPE_ELEMENT, + GST_DEBUG_CATEGORY_INIT (codecalphademux_debug, "codecalphademux", 0, + "codecalphademux")); + +GST_ELEMENT_REGISTER_DEFINE (codec_alpha_demux, "codecalphademux", + GST_RANK_NONE, GST_TYPE_CODEC_ALPHA_DEMUX); + +static GstStaticPadTemplate gst_codec_alpha_demux_sink_template = +GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static GstStaticPadTemplate gst_codec_alpha_demux_src_template = +GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static GstStaticPadTemplate gst_codec_alpha_demux_alpha_template = +GST_STATIC_PAD_TEMPLATE ("alpha", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static GstFlowReturn +gst_codec_alpha_demux_chain (GstPad * pad, GstObject * object, + GstBuffer * buffer) +{ + GstCodecAlphaDemux *self = GST_CODEC_ALPHA_DEMUX (object); + GstVideoCodecAlphaMeta *alpha_meta = + gst_buffer_get_video_codec_alpha_meta (buffer); + GstBuffer *alpha_buffer = NULL; + GstClockTime pts = GST_BUFFER_PTS (buffer); + GstClockTime duration = GST_BUFFER_DURATION (buffer); + GstFlowReturn ret = GST_FLOW_EOS; + + if (alpha_meta) + alpha_buffer = gst_buffer_ref (alpha_meta->buffer); + + ret = gst_pad_push (self->src_pad, buffer); + ret = gst_flow_combiner_update_flow (self->flow_combiner, ret); + + /* we lost ownership here */ + buffer = NULL; + alpha_meta = NULL; + 
+ if (alpha_buffer) { + ret = gst_pad_push (self->alpha_pad, alpha_buffer); + } else { + gst_pad_push_event (self->alpha_pad, gst_event_new_gap (pts, duration)); + ret = GST_PAD_LAST_FLOW_RETURN (self->alpha_pad); + } + ret = gst_flow_combiner_update_flow (self->flow_combiner, ret); + + return ret; +} + +static GstCaps * +gst_codec_alpha_demux_transform_caps (GstCaps * caps, gboolean codec_alpha) +{ + if (!caps) + return NULL; + + caps = gst_caps_copy (caps); + gst_caps_set_simple (caps, "codec-alpha", G_TYPE_BOOLEAN, codec_alpha, NULL); + + return caps; +} + +static GstEvent * +gst_codec_alpha_demux_transform_caps_event (GstEvent * src_event) +{ + GstEvent *dst_event; + GstCaps *caps; + + gst_event_parse_caps (src_event, &caps); + + caps = gst_codec_alpha_demux_transform_caps (caps, FALSE); + dst_event = gst_event_new_caps (caps); + gst_event_set_seqnum (dst_event, gst_event_get_seqnum (src_event)); + + gst_caps_unref (caps); + gst_event_unref (src_event); + return dst_event; +} + +static gboolean +gst_codec_alpha_demux_sink_event (GstPad * sink_pad, GstObject * parent, + GstEvent * event) +{ + GstCodecAlphaDemux *self = GST_CODEC_ALPHA_DEMUX (parent); + + switch (event->type) { + case GST_EVENT_FLUSH_STOP: + gst_flow_combiner_reset (self->flow_combiner); + break; + case GST_EVENT_CAPS: + event = gst_codec_alpha_demux_transform_caps_event (event); + break; + default: + break; + } + + return gst_pad_event_default (sink_pad, parent, event); +} + +static gboolean +gst_codec_alpha_demux_sink_query (GstPad * sink_pad, GstObject * parent, + GstQuery * query) +{ + GstQuery *peer_query; + GstCaps *caps; + gboolean ret; + + switch (query->type) { + case GST_QUERY_CAPS: + gst_query_parse_caps (query, &caps); + caps = gst_codec_alpha_demux_transform_caps (caps, FALSE); + peer_query = gst_query_new_caps (caps); + gst_clear_caps (&caps); + break; + case GST_QUERY_ACCEPT_CAPS: + gst_query_parse_accept_caps (query, &caps); + caps = gst_codec_alpha_demux_transform_caps (caps, 
FALSE); + peer_query = gst_query_new_accept_caps (caps); + gst_clear_caps (&caps); + break; + default: + peer_query = query; + break; + } + + ret = gst_pad_query_default (sink_pad, parent, peer_query); + if (!ret) { + if (peer_query != query) + gst_query_unref (peer_query); + return FALSE; + } + + switch (query->type) { + case GST_QUERY_CAPS: + gst_query_parse_caps_result (peer_query, &caps); + caps = gst_caps_copy (caps); + caps = gst_codec_alpha_demux_transform_caps (caps, TRUE); + gst_query_set_caps_result (query, caps); + gst_caps_unref (caps); + gst_query_unref (peer_query); + break; + case GST_QUERY_ACCEPT_CAPS: + { + gboolean result; + gst_query_parse_accept_caps_result (peer_query, &result); + gst_query_set_accept_caps_result (query, result); + gst_query_unref (peer_query); + break; + } + default: + break; + } + + + return ret; +} + +static void +gst_codec_alpha_demux_start (GstCodecAlphaDemux * self) +{ + gst_flow_combiner_reset (self->flow_combiner); +} + +static GstStateChangeReturn +gst_codec_alpha_demux_change_state (GstElement * element, + GstStateChange transition) +{ + GstCodecAlphaDemux *self = GST_CODEC_ALPHA_DEMUX (element); + + switch (transition) { + case GST_STATE_CHANGE_READY_TO_PAUSED: + gst_codec_alpha_demux_start (self); + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + +static void +gst_codec_alpha_demux_dispose (GObject * object) +{ + GstCodecAlphaDemux *self = GST_CODEC_ALPHA_DEMUX (object); + + g_clear_object (&self->sink_pad); + g_clear_object (&self->src_pad); + g_clear_object (&self->alpha_pad); + g_clear_pointer (&self->flow_combiner, gst_flow_combiner_unref); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_codec_alpha_demux_class_init (GstCodecAlphaDemuxClass * klass) +{ + GstElementClass *element_class = (GstElementClass *) klass; + GObjectClass *object_class = (GObjectClass *) klass; + + gst_element_class_set_static_metadata (element_class, + 
"CODEC Alpha Demuxer", "Codec/Demuxer", + "Extract and expose as a stream the CODEC alpha.", + "Nicolas Dufresne <nicolas.dufresne@collabora.com>"); + + gst_element_class_add_static_pad_template (element_class, + &gst_codec_alpha_demux_sink_template); + gst_element_class_add_static_pad_template (element_class, + &gst_codec_alpha_demux_src_template); + gst_element_class_add_static_pad_template (element_class, + &gst_codec_alpha_demux_alpha_template); + + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_codec_alpha_demux_change_state); + + object_class->dispose = GST_DEBUG_FUNCPTR (gst_codec_alpha_demux_dispose); +} + +static void +gst_codec_alpha_demux_init (GstCodecAlphaDemux * self) +{ + gst_element_create_all_pads (GST_ELEMENT (self)); + self->sink_pad = gst_element_get_static_pad (GST_ELEMENT (self), "sink"); + self->src_pad = gst_element_get_static_pad (GST_ELEMENT (self), "src"); + self->alpha_pad = gst_element_get_static_pad (GST_ELEMENT (self), "alpha"); + + self->flow_combiner = gst_flow_combiner_new (); + gst_flow_combiner_add_pad (self->flow_combiner, self->src_pad); + gst_flow_combiner_add_pad (self->flow_combiner, self->alpha_pad); + + GST_PAD_SET_PROXY_CAPS (self->sink_pad); + GST_PAD_SET_PROXY_CAPS (self->src_pad); + GST_PAD_SET_PROXY_CAPS (self->alpha_pad); + + GST_PAD_SET_PROXY_SCHEDULING (self->sink_pad); + GST_PAD_SET_PROXY_SCHEDULING (self->src_pad); + GST_PAD_SET_PROXY_SCHEDULING (self->alpha_pad); + + gst_pad_set_chain_function (self->sink_pad, gst_codec_alpha_demux_chain); + gst_pad_set_event_function (self->sink_pad, gst_codec_alpha_demux_sink_event); + gst_pad_set_query_function (self->sink_pad, gst_codec_alpha_demux_sink_query); +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstcodecalphademux.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CODEC_ALPHA_DEMUX_H__ +#define __GST_CODEC_ALPHA_DEMUX_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +#define GST_TYPE_CODEC_ALPHA_DEMUX (gst_codec_alpha_demux_get_type()) +G_DECLARE_FINAL_TYPE (GstCodecAlphaDemux, + gst_codec_alpha_demux, GST, CODEC_ALPHA_DEMUX, GstElement); + +GST_ELEMENT_REGISTER_DECLARE (codec_alpha_demux); + +G_END_DECLS +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstplugin.c
Added
@@ -0,0 +1,76 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:plugin-codecalpha + * + * This plugin contains a set of utilities that helps handling alpha encoded + * streams as produced by some WebM streams using VP8/VP9. The elements are + * meant to be used in decoder wrappers which allows playbin to automatically + * handle these streams. + * + * `codecalphademux` will produce two streams out of a stream of buffers holding + * the #GstVideoCodecAlphaMeta. The presence of the meta is indicated by the + * usage of the field `codec-alpha=(boolean)true` in the caps. This is only + * applicable to VP8 and VP9 for now. + * + * Wrappers for vp8dec and vp9dec are available, allowing seamless support for + * these streams inside playbin (which is used by WebKit GTK and WPE). + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include <gst/gst.h> + +#include "gstcodecalphademux.h" +#include "gstalphacombine.h" +#include "gstvp8alphadecodebin.h" +#include "gstvp9alphadecodebin.h" + +/* When wrapping, use the original rank plus this offset. 
The ad-hoc rule is + * that hardware implementations will use PRIMARY+1 or +2 to override the + * software decoder, so the offset must be large enough to jump over those. + * This should also be small enough so that a marginal (64) or secondary + * wrapper does not cross the PRIMARY line. + */ +#define RANK_OFFSET 10 + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (codec_alpha_demux, plugin); + ret |= GST_ELEMENT_REGISTER (alpha_combine, plugin); + ret |= GST_ELEMENT_REGISTER (vp8_alpha_decode_bin, plugin); + ret |= GST_ELEMENT_REGISTER (vp9_alpha_decode_bin, plugin); + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + codecalpha, + "CODEC Alpha Utilities", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
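The RANK_OFFSET comment above can be checked numerically against the documented values of GStreamer's `GstRank` enum (NONE = 0, MARGINAL = 64, SECONDARY = 128, PRIMARY = 256); a small sketch of the rank arithmetic:

```python
# Rank constants from the GstRank enum (values per the GStreamer docs).
GST_RANK_NONE = 0
GST_RANK_MARGINAL = 64
GST_RANK_SECONDARY = 128
GST_RANK_PRIMARY = 256

RANK_OFFSET = 10  # the offset defined above

def wrapper_rank(decoder_rank):
    """Rank given to an alpha wrapper built around a decoder of decoder_rank."""
    return decoder_rank + RANK_OFFSET

# A wrapper around a PRIMARY software decoder must jump over hardware
# decoders registered at PRIMARY + 1 or PRIMARY + 2...
assert wrapper_rank(GST_RANK_PRIMARY) > GST_RANK_PRIMARY + 2
# ...while wrappers around MARGINAL or SECONDARY decoders must not cross
# the PRIMARY line.
assert wrapper_rank(GST_RANK_MARGINAL) < GST_RANK_PRIMARY
assert wrapper_rank(GST_RANK_SECONDARY) < GST_RANK_PRIMARY
```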
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstvp8alphadecodebin.c
Added
@@ -0,0 +1,73 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vp8alphadecodebin + * @title: Wrapper to decode VP8 alpha using vp8dec + * + * Use two `vp8dec` instances in order to decode the VP8 alpha channel. 
+ * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvp8alphadecodebin.h" + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp8, codec-alpha = (boolean) true") + ); + +struct _GstVp8AlphaDecodeBin +{ + GstAlphaDecodeBin parent; +}; + +#define gst_vp8_alpha_decode_bin_parent_class parent_class +G_DEFINE_TYPE (GstVp8AlphaDecodeBin, gst_vp8_alpha_decode_bin, + GST_TYPE_ALPHA_DECODE_BIN); + +GST_ELEMENT_REGISTER_DEFINE (vp8_alpha_decode_bin, "vp8alphadecodebin", + GST_RANK_PRIMARY + GST_ALPHA_DECODE_BIN_RANK_OFFSET, + GST_TYPE_VP8_ALPHA_DECODE_BIN); + +static void +gst_vp8_alpha_decode_bin_class_init (GstVp8AlphaDecodeBinClass * klass) +{ + GstAlphaDecodeBinClass *adbin_class = (GstAlphaDecodeBinClass *) klass; + GstElementClass *element_class = (GstElementClass *) klass; + + adbin_class->decoder_name = "vp8dec"; + gst_element_class_add_static_pad_template (element_class, &sink_template); + + gst_element_class_set_static_metadata (element_class, + "VP8 Alpha Decoder", "Codec/Decoder/Video", + "Wrapper bin to decode VP8 with alpha stream.", + "Nicolas Dufresne <nicolas.dufresne@collabora.com>"); +} + +static void +gst_vp8_alpha_decode_bin_init (GstVp8AlphaDecodeBin * self) +{ +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstvp8alphadecodebin.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_VP8_ALPHA_DECODE_BIN_H__ +#define __GST_VP8_ALPHA_DECODE_BIN_H__ + +#include "gstalphadecodebin.h" + +G_BEGIN_DECLS + +#define GST_TYPE_VP8_ALPHA_DECODE_BIN (gst_vp8_alpha_decode_bin_get_type()) +G_DECLARE_FINAL_TYPE (GstVp8AlphaDecodeBin, gst_vp8_alpha_decode_bin, + GST, VP8_ALPHA_DECODE_BIN, GstAlphaDecodeBin); + +GST_ELEMENT_REGISTER_DECLARE (vp8_alpha_decode_bin); + +G_END_DECLS +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstvp9alphadecodebin.c
Added
@@ -0,0 +1,74 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vp9alphadecodebin + * @title: Wrapper to decode VP9 alpha using vp9dec + * + * Use two `vp9dec` instances in order to decode the VP9 alpha channel. 
+ * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvp9alphadecodebin.h" + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp9, codec-alpha = (boolean) true, " + "alignment = super-frame") + ); + +struct _GstVp9AlphaDecodeBin +{ + GstAlphaDecodeBin parent; +}; + +#define gst_vp9_alpha_decode_bin_parent_class parent_class +G_DEFINE_TYPE (GstVp9AlphaDecodeBin, gst_vp9_alpha_decode_bin, + GST_TYPE_ALPHA_DECODE_BIN); + +GST_ELEMENT_REGISTER_DEFINE (vp9_alpha_decode_bin, "vp9alphadecodebin", + GST_RANK_PRIMARY + GST_ALPHA_DECODE_BIN_RANK_OFFSET, + GST_TYPE_VP9_ALPHA_DECODE_BIN); + +static void +gst_vp9_alpha_decode_bin_class_init (GstVp9AlphaDecodeBinClass * klass) +{ + GstAlphaDecodeBinClass *adbin_class = (GstAlphaDecodeBinClass *) klass; + GstElementClass *element_class = (GstElementClass *) klass; + + adbin_class->decoder_name = "vp9dec"; + gst_element_class_add_static_pad_template (element_class, &sink_template); + + gst_element_class_set_static_metadata (element_class, + "VP9 Alpha Decoder", "Codec/Decoder/Video", + "Wrapper bin to decode VP9 with alpha stream.", + "Nicolas Dufresne <nicolas.dufresne@collabora.com>"); +} + +static void +gst_vp9_alpha_decode_bin_init (GstVp9AlphaDecodeBin * self) +{ +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/gstvp9alphadecodebin.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_VP9_ALPHA_DECODE_BIN_H__ +#define __GST_VP9_ALPHA_DECODE_BIN_H__ + +#include "gstalphadecodebin.h" + +G_BEGIN_DECLS + +#define GST_TYPE_VP9_ALPHA_DECODE_BIN (gst_vp9_alpha_decode_bin_get_type()) +G_DECLARE_FINAL_TYPE (GstVp9AlphaDecodeBin, + gst_vp9_alpha_decode_bin, GST, VP9_ALPHA_DECODE_BIN, GstAlphaDecodeBin); + +GST_ELEMENT_REGISTER_DECLARE (vp9_alpha_decode_bin); + +G_END_DECLS +#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst/codecalpha/meson.build
Added
@@ -0,0 +1,19 @@ +codecalpha_sources = [ + 'gstplugin.c', + 'gstalphacombine.c', + 'gstalphadecodebin.c', + 'gstvp8alphadecodebin.c', + 'gstvp9alphadecodebin.c', + 'gstcodecalphademux.c', +] + +gstcodecalpha = library('gstcodecalpha', + codecalpha_sources, + c_args : gst_plugins_bad_args, + include_directories : [configinc], + dependencies : [gstvideo_dep, gstpbutils_dep], + install : true, + install_dir : plugins_install_dir, +) +pkgconfig.generate(gstcodecalpha, install_dir : plugins_pkgconfig_install_dir) +plugins += [gstcodecalpha]
View file
gst-plugins-bad-1.18.6.tar.xz/gst/coloreffects/gstchromahold.c -> gst-plugins-bad-1.20.1.tar.xz/gst/coloreffects/gstchromahold.c
Changed
@@ -111,6 +111,8 @@ #define gst_chroma_hold_parent_class parent_class G_DEFINE_TYPE (GstChromaHold, gst_chroma_hold, GST_TYPE_VIDEO_FILTER); +GST_ELEMENT_REGISTER_DEFINE (chromahold, "chromahold", + GST_RANK_NONE, gst_chroma_hold_get_type ()); static void gst_chroma_hold_class_init (GstChromaHoldClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/coloreffects/gstchromahold.h -> gst-plugins-bad-1.20.1.tar.xz/gst/coloreffects/gstchromahold.h
Changed
@@ -74,6 +74,7 @@ }; GType gst_chroma_hold_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (chromahold); G_END_DECLS #endif /* __GST_CHROMA_HOLD_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/coloreffects/gstcoloreffects.c -> gst-plugins-bad-1.20.1.tar.xz/gst/coloreffects/gstcoloreffects.c
Changed
@@ -51,6 +51,8 @@ #define gst_color_effects_parent_class parent_class G_DEFINE_TYPE (GstColorEffects, gst_color_effects, GST_TYPE_VIDEO_FILTER); +GST_ELEMENT_REGISTER_DEFINE (coloreffects, "coloreffects", + GST_RANK_NONE, gst_color_effects_get_type ()); #define CAPS_STR GST_VIDEO_CAPS_MAKE ("{ " \ "ARGB, BGRA, ABGR, RGBA, xRGB, BGRx, xBGR, RGBx, RGB, BGR, AYUV }")
View file
gst-plugins-bad-1.18.6.tar.xz/gst/coloreffects/gstcoloreffects.h -> gst-plugins-bad-1.20.1.tar.xz/gst/coloreffects/gstcoloreffects.h
Changed
@@ -87,6 +87,7 @@ }; GType gst_color_effects_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (coloreffects); G_END_DECLS #endif /* __GST_COLOR_EFFECTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/coloreffects/gstplugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/coloreffects/gstplugin.c
Changed
@@ -26,31 +26,16 @@ #include "gstcoloreffects.h" #include "gstchromahold.h" -struct _elements_entry -{ - const gchar *name; - GType (*type) (void); -}; - -static const struct _elements_entry _elements[] = { - {"coloreffects", gst_color_effects_get_type}, - {"chromahold", gst_chroma_hold_get_type}, - {NULL, 0}, -}; static gboolean plugin_init (GstPlugin * plugin) { - gint i = 0; + gboolean ret = FALSE; - while (_elements[i].name) { - if (!gst_element_register (plugin, _elements[i].name, - GST_RANK_NONE, (_elements[i].type) ())) - return FALSE; - i++; - } + ret |= GST_ELEMENT_REGISTER (coloreffects, plugin); + ret |= GST_ELEMENT_REGISTER (chromahold, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
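The refactor above replaces the table-driven registration loop, which failed the whole plugin on the first element that did not register, with `ret |=` accumulation, which reports success if at least one element registered. That behavioural difference can be sketched in Python (boolean lists stand in for the per-element registration results):

```python
def old_plugin_init(results):
    # pre-1.20 pattern: the whole plugin fails on the first failed element
    for ok in results:
        if not ok:
            return False
    return True

def new_plugin_init(results):
    # 1.20 pattern: plugin_init succeeds if at least one element registered
    ret = False
    for ok in results:
        ret |= ok
    return ret

assert old_plugin_init([True, True]) and new_plugin_init([True, True])
assert not old_plugin_init([True, False])  # old: all-or-nothing
assert new_plugin_init([True, False])      # new: partial success is success
```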
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/debugutilsbad.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/debugutilsbad.c
Changed
@@ -23,42 +23,28 @@ #include <gst/gst.h> -GType gst_checksum_sink_get_type (void); -GType fps_display_sink_get_type (void); -GType gst_chop_my_data_get_type (void); -GType gst_compare_get_type (void); -GType gst_debug_spy_get_type (void); -GType gst_error_ignore_get_type (void); -GType gst_watchdog_get_type (void); -GType gst_fake_video_sink_get_type (void); -GType gst_test_src_bin_get_type (void); -GType gst_clock_select_get_type (void); + +#include "gstdebugutilsbadelements.h" static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "checksumsink", GST_RANK_NONE, - gst_checksum_sink_get_type ()); - gst_element_register (plugin, "fpsdisplaysink", GST_RANK_NONE, - fps_display_sink_get_type ()); - gst_element_register (plugin, "chopmydata", GST_RANK_NONE, - gst_chop_my_data_get_type ()); - gst_element_register (plugin, "compare", GST_RANK_NONE, - gst_compare_get_type ()); - gst_element_register (plugin, "debugspy", GST_RANK_NONE, - gst_debug_spy_get_type ()); - gst_element_register (plugin, "watchdog", GST_RANK_NONE, - gst_watchdog_get_type ()); - gst_element_register (plugin, "errorignore", GST_RANK_NONE, - gst_error_ignore_get_type ()); - gst_element_register (plugin, "fakevideosink", GST_RANK_NONE, - gst_fake_video_sink_get_type ()); - gst_element_register (plugin, "testsrcbin", GST_RANK_NONE, - gst_test_src_bin_get_type ()); - gst_element_register (plugin, "clockselect", GST_RANK_NONE, - gst_clock_select_get_type ()); + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (checksumsink, plugin); + ret |= GST_ELEMENT_REGISTER (chopmydata, plugin); + ret |= GST_ELEMENT_REGISTER (clockselect, plugin); + ret |= GST_ELEMENT_REGISTER (compare, plugin); + ret |= GST_ELEMENT_REGISTER (debugspy, plugin); + ret |= GST_ELEMENT_REGISTER (errorignore, plugin); + ret |= GST_ELEMENT_REGISTER (fakeaudiosink, plugin); + ret |= GST_ELEMENT_REGISTER (fakevideosink, plugin); + ret |= GST_ELEMENT_REGISTER (fpsdisplaysink, plugin); + ret |= 
GST_ELEMENT_REGISTER (testsrcbin, plugin); + ret |= GST_ELEMENT_REGISTER (videocodectestsink, plugin); + ret |= GST_ELEMENT_REGISTER (watchdog, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/fpsdisplaysink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/fpsdisplaysink.c
Changed
@@ -47,6 +47,7 @@ #include "config.h" #endif +#include "gstdebugutilsbadelements.h" #include "fpsdisplaysink.h" #define DEFAULT_SIGNAL_FPS_MEASUREMENTS FALSE @@ -723,3 +724,6 @@ return fps_display_sink_type; } + +GST_ELEMENT_REGISTER_DEFINE (fpsdisplaysink, "fpsdisplaysink", + GST_RANK_NONE, fps_display_sink_get_type ());
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstchecksumsink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstchecksumsink.c
Changed
@@ -23,6 +23,7 @@ #include <gst/gst.h> #include <gst/base/gstbasesink.h> +#include "gstdebugutilsbadelements.h" #include "gstchecksumsink.h" static void gst_checksum_sink_set_property (GObject * object, guint prop_id, @@ -73,6 +74,8 @@ #define gst_checksum_sink_parent_class parent_class G_DEFINE_TYPE (GstChecksumSink, gst_checksum_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE (checksumsink, "checksumsink", + GST_RANK_NONE, gst_checksum_sink_get_type ()); static void gst_checksum_sink_class_init (GstChecksumSinkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstchopmydata.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstchopmydata.c
Changed
@@ -45,6 +45,7 @@ #include <gst/gst.h> #include <gst/gst.h> +#include "gstdebugutilsbadelements.h" #include "gstchopmydata.h" /* prototypes */ @@ -95,6 +96,8 @@ #define gst_chop_my_data_parent_class parent_class G_DEFINE_TYPE (GstChopMyData, gst_chop_my_data, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (chopmydata, "chopmydata", + GST_RANK_NONE, gst_chop_my_data_get_type ()); static void gst_chop_my_data_class_init (GstChopMyDataClass * klass) @@ -125,8 +128,9 @@ gst_element_class_add_static_pad_template (element_class, &gst_chop_my_data_sink_template); - gst_element_class_set_static_metadata (element_class, "FIXME", - "Generic", "FIXME", "David Schleef <ds@schleef.org>"); + gst_element_class_set_static_metadata (element_class, "Chop my data", + "Generic", "Split up a stream into randomly-sized buffers", + "David Schleef <ds@schleef.org>"); } static void
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstclockselect.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstclockselect.c
Changed
@@ -39,6 +39,7 @@ #include <gst/gst.h> #include <gst/net/net.h> +#include "gstdebugutilsbadelements.h" #include "gstclockselect.h" GST_DEBUG_CATEGORY_STATIC (gst_clock_select_debug_category); @@ -94,6 +95,8 @@ G_DEFINE_TYPE_WITH_CODE (GstClockSelect, gst_clock_select, GST_TYPE_PIPELINE, GST_DEBUG_CATEGORY_INIT (gst_clock_select_debug_category, "clockselect", 0, "debug category for clockselect element")); +GST_ELEMENT_REGISTER_DEFINE (clockselect, "clockselect", + GST_RANK_NONE, gst_clock_select_get_type ()); static void gst_clock_select_class_init (GstClockSelectClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstcompare.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstcompare.c
Changed
@@ -29,6 +29,7 @@ #include <gst/base/gstcollectpads.h> #include <gst/video/video.h> +#include "gstdebugutilsbadelements.h" #include "gstcompare.h" GST_DEBUG_CATEGORY_STATIC (compare_debug); @@ -117,6 +118,8 @@ #define gst_compare_parent_class parent_class G_DEFINE_TYPE (GstCompare, gst_compare, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (compare, "compare", + GST_RANK_NONE, gst_compare_get_type ()); static void gst_compare_finalize (GObject * object)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstdebugspy.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstdebugspy.c
Changed
@@ -40,6 +40,7 @@ #include <gst/gst.h> +#include "gstdebugutilsbadelements.h" #include "gstdebugspy.h" GST_DEBUG_CATEGORY_STATIC (gst_debug_spy_debug); @@ -96,6 +97,8 @@ ); G_DEFINE_TYPE (GstDebugSpy, gst_debug_spy, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (debugspy, "debugspy", + GST_RANK_NONE, gst_debug_spy_get_type ()); static void gst_debug_spy_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstdebugutilsbadelements.h
Added
@@ -0,0 +1,44 @@ +/* GStreamer + * Copyright (C) <2020> Julian Bouzas <julian.bouzas@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_DEBUGUTILSBAD_ELEMENT_H__ +#define __GST_DEBUGUTILSBAD_ELEMENT_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +GST_ELEMENT_REGISTER_DECLARE (checksumsink); +GST_ELEMENT_REGISTER_DECLARE (chopmydata); +GST_ELEMENT_REGISTER_DECLARE (clockselect); +GST_ELEMENT_REGISTER_DECLARE (compare); +GST_ELEMENT_REGISTER_DECLARE (debugspy); +GST_ELEMENT_REGISTER_DECLARE (errorignore); +GST_ELEMENT_REGISTER_DECLARE (fakeaudiosink); +GST_ELEMENT_REGISTER_DECLARE (fakevideosink); +GST_ELEMENT_REGISTER_DECLARE (fpsdisplaysink); +GST_ELEMENT_REGISTER_DECLARE (testsrcbin); +GST_ELEMENT_REGISTER_DECLARE (videocodectestsink); +GST_ELEMENT_REGISTER_DECLARE (watchdog); + + +#endif /* __GST_DEBUGUTILSBAD_PLUGIN_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gsterrorignore.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gsterrorignore.c
Changed
@@ -38,6 +38,7 @@ #include "config.h" #endif +#include "gstdebugutilsbadelements.h" #include "gsterrorignore.h" #define GST_CAT_DEFAULT gst_error_ignore_debug @@ -49,6 +50,7 @@ PROP_IGNORE_ERROR, PROP_IGNORE_NOTLINKED, PROP_IGNORE_NOTNEGOTIATED, + PROP_IGNORE_EOS, PROP_CONVERT_TO }; @@ -69,6 +71,8 @@ #define parent_class gst_error_ignore_parent_class G_DEFINE_TYPE (GstErrorIgnore, gst_error_ignore, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (errorignore, "errorignore", + GST_RANK_NONE, gst_error_ignore_get_type ()); static GstFlowReturn gst_error_ignore_sink_chain (GstPad * pad, GstObject * parent, GstBuffer * inbuf); @@ -119,6 +123,16 @@ "Whether to ignore GST_FLOW_NOT_NEGOTIATED", TRUE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + /** + * GstErrorIgnore:ignore-eos: + * + * Since: 1.20 + */ + g_object_class_install_property (object_class, PROP_IGNORE_EOS, + g_param_spec_boolean ("ignore-eos", + "Ignore GST_FLOW_EOS", "Whether to ignore GST_FLOW_EOS", + FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + g_object_class_install_property (object_class, PROP_CONVERT_TO, g_param_spec_enum ("convert-to", "GstFlowReturn to convert to", "Which GstFlowReturn value we should convert to when ignoring", @@ -150,6 +164,7 @@ self->ignore_error = TRUE; self->ignore_notlinked = FALSE; self->ignore_notnegotiated = TRUE; + self->ignore_eos = FALSE; self->convert_to = GST_FLOW_NOT_LINKED; } @@ -169,6 +184,9 @@ case PROP_IGNORE_NOTNEGOTIATED: self->ignore_notnegotiated = g_value_get_boolean (value); break; + case PROP_IGNORE_EOS: + self->ignore_eos = g_value_get_boolean (value); + break; case PROP_CONVERT_TO: self->convert_to = g_value_get_enum (value); break; @@ -194,6 +212,9 @@ case PROP_IGNORE_NOTNEGOTIATED: g_value_set_boolean (value, self->ignore_notnegotiated); break; + case PROP_IGNORE_EOS: + g_value_set_boolean (value, self->ignore_eos); + break; case PROP_CONVERT_TO: g_value_set_enum (value, self->convert_to); break; @@ -243,6 +264,7 @@ if ((ret == GST_FLOW_ERROR 
&& self->ignore_error) || (ret == GST_FLOW_NOT_LINKED && self->ignore_notlinked) || + (ret == GST_FLOW_EOS && self->ignore_eos) || (ret == GST_FLOW_NOT_NEGOTIATED && self->ignore_notnegotiated)) return self->convert_to; else
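The new `ignore-eos` property extends the set of flow returns that `errorignore` can swallow. A Python toy model of the conversion logic at the end of the chain function, using the defaults set in `gst_error_ignore_init` (string tags stand in for `GstFlowReturn` values):

```python
class ErrorIgnore:
    """Toy model of the flow-return filtering in gst_error_ignore_sink_chain."""

    def __init__(self):
        # defaults from gst_error_ignore_init
        self.ignore_error = True
        self.ignore_notlinked = False
        self.ignore_notnegotiated = True
        self.ignore_eos = False        # new property, defaults to FALSE
        self.convert_to = "not-linked"

    def filter(self, ret):
        # mirrors the condition at the end of the chain function
        if ((ret == "error" and self.ignore_error)
                or (ret == "not-linked" and self.ignore_notlinked)
                or (ret == "eos" and self.ignore_eos)
                or (ret == "not-negotiated" and self.ignore_notnegotiated)):
            return self.convert_to
        return ret

ei = ErrorIgnore()
assert ei.filter("error") == "not-linked"  # ignored by default
assert ei.filter("eos") == "eos"           # ignore-eos is off by default
ei.ignore_eos = True
assert ei.filter("eos") == "not-linked"    # now converted like other errors
```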
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gsterrorignore.h -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gsterrorignore.h
Changed
@@ -44,6 +44,7 @@ gboolean ignore_error; gboolean ignore_notlinked; gboolean ignore_notnegotiated; + gboolean ignore_eos; GstFlowReturn convert_to; };
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstfakeaudiosink.c
Added
@@ -0,0 +1,187 @@ +/* + * GStreamer + * Copyright (C) 2017 Collabora Inc. + * Copyright (C) 2021 Igalia S.L. + * Author: Philippe Normand <philn@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-fakeaudiosink + * @title: fakeaudiosink + * + * This element is the same as fakesink but will pretend to act as an audio sink + * supporting the `GstStreamVolume` interface. This is useful for throughput + * testing while creating a new pipeline or for CI purposes on machines not + * running a real audio daemon. + * + * ## Example launch lines + * |[ + * gst-launch-1.0 audiotestsrc ! 
fakeaudiosink + * ]| + * + * Since: 1.20 + */ + +#include "gstdebugutilsbadelements.h" +#include "gstfakeaudiosink.h" +#include "gstfakesinkutils.h" + +#include <gst/audio/audio.h> + +enum +{ + PROP_0, + PROP_VOLUME, + PROP_MUTE, + PROP_LAST +}; + + +static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_AUDIO_CAPS_MAKE (GST_AUDIO_FORMATS_ALL))); + +G_DEFINE_TYPE_WITH_CODE (GstFakeAudioSink, gst_fake_audio_sink, GST_TYPE_BIN, + G_IMPLEMENT_INTERFACE (GST_TYPE_STREAM_VOLUME, NULL);); +GST_ELEMENT_REGISTER_DEFINE (fakeaudiosink, "fakeaudiosink", + GST_RANK_NONE, gst_fake_audio_sink_get_type ()); + +static void +gst_fake_audio_sink_proxy_properties (GstFakeAudioSink * self, + GstElement * child) +{ + static gsize initialized = 0; + + if (g_once_init_enter (&initialized)) { + gst_fake_sink_proxy_properties (GST_ELEMENT_CAST (self), child, PROP_LAST); + g_once_init_leave (&initialized, 1); + } +} + +static void +gst_fake_audio_sink_init (GstFakeAudioSink * self) +{ + GstElement *child; + GstPadTemplate *template = gst_static_pad_template_get (&sink_factory); + + self->volume = 1.0; + self->mute = FALSE; + + child = gst_element_factory_make ("fakesink", "sink"); + + if (child) { + GstPad *sink_pad = gst_element_get_static_pad (child, "sink"); + GstPad *ghost_pad; + + /* mimic GstAudioSink base class */ + g_object_set (child, "qos", TRUE, "sync", TRUE, NULL); + + gst_bin_add (GST_BIN_CAST (self), child); + + ghost_pad = gst_ghost_pad_new_from_template ("sink", sink_pad, template); + gst_object_unref (template); + gst_element_add_pad (GST_ELEMENT_CAST (self), ghost_pad); + gst_object_unref (sink_pad); + + self->child = child; + + gst_fake_audio_sink_proxy_properties (self, child); + } else { + g_warning ("Check your GStreamer installation, " + "core element 'fakesink' is missing."); + } +} + +static void +gst_fake_audio_sink_get_property (GObject * object, guint property_id, + GValue * value, 
GParamSpec * pspec) +{ + GstFakeAudioSink *self = GST_FAKE_AUDIO_SINK (object); + + switch (property_id) { + case PROP_VOLUME: + g_value_set_double (value, self->volume); + break; + case PROP_MUTE: + g_value_set_boolean (value, self->mute); + break; + default: + g_object_get_property (G_OBJECT (self->child), pspec->name, value); + break; + } +} + +static void +gst_fake_audio_sink_set_property (GObject * object, guint property_id, + const GValue * value, GParamSpec * pspec) +{ + GstFakeAudioSink *self = GST_FAKE_AUDIO_SINK (object); + + switch (property_id) { + case PROP_VOLUME: + self->volume = g_value_get_double (value); + break; + case PROP_MUTE: + self->mute = g_value_get_boolean (value); + break; + default: + g_object_set_property (G_OBJECT (self->child), pspec->name, value); + break; + } +} + +static void +gst_fake_audio_sink_class_init (GstFakeAudioSinkClass * klass) +{ + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GObjectClass *object_class = G_OBJECT_CLASS (klass); + + object_class->get_property = gst_fake_audio_sink_get_property; + object_class->set_property = gst_fake_audio_sink_set_property; + + + /** + * GstFakeAudioSink:volume + * + * Control the audio volume + * + * Since: 1.20 + */ + g_object_class_install_property (object_class, PROP_VOLUME, + g_param_spec_double ("volume", "Volume", "The audio volume, 1.0=100%", + 0, 10, 1, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstFakeAudioSink:mute + * + * Control the mute state + * + * Since: 1.20 + */ + g_object_class_install_property (object_class, PROP_MUTE, + g_param_spec_boolean ("mute", "Mute", + "Mute the audio channel without changing the volume", FALSE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + + gst_element_class_add_static_pad_template (element_class, &sink_factory); + gst_element_class_set_static_metadata (element_class, "Fake Audio Sink", + "Audio/Sink", "Fake audio renderer", + "Philippe Normand <philn@igalia.com>"); +}
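As the property handlers above show, `fakeaudiosink` handles `volume` and `mute` itself and forwards every other property to the wrapped `fakesink` child. A Python sketch of that dispatch (dicts stand in for the child's GObject properties):

```python
class FakeAudioSink:
    """Toy model of the property dispatch in gst_fake_audio_sink_set_property."""

    def __init__(self):
        self.volume = 1.0   # defaults from gst_fake_audio_sink_init
        self.mute = False
        # the wrapped fakesink; the C code mimics the GstAudioSink base
        # class by setting qos=TRUE and sync=TRUE on it
        self.child = {"qos": True, "sync": True}

    def set_property(self, name, value):
        if name == "volume":
            self.volume = value       # handled locally, never reaches the child
        elif name == "mute":
            self.mute = value
        else:
            self.child[name] = value  # everything else is forwarded

sink = FakeAudioSink()
sink.set_property("volume", 0.5)
sink.set_property("sync", False)
assert sink.volume == 0.5
assert "volume" not in sink.child     # volume stayed on the wrapper
assert sink.child["sync"] is False    # sync was proxied to the fakesink
```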
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstfakeaudiosink.h
Added
@@ -0,0 +1,63 @@
+/*
+ * GStreamer
+ * Copyright (C) 2017 Collabora Inc.
+ * Copyright (C) 2021 Igalia S.L.
+ * Author: Philippe Normand <philn@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef _GST_FAKE_AUDIO_SINK_H_
+#define _GST_FAKE_AUDIO_SINK_H_
+
+#include <gst/gst.h>
+
+#define GST_TYPE_FAKE_AUDIO_SINK \
+  (gst_fake_audio_sink_get_type())
+#define GST_FAKE_AUDIO_SINK(obj) \
+  (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_FAKE_AUDIO_SINK, GstFakeAudioSink))
+#define GST_FAKE_AUDIO_SINK_CLASS(klass) \
+  (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_FAKE_AUDIO_SINK, GstFakeAudioSinkClass))
+#define GST_FAKE_AUDIO_SINK_GET_CLASS(obj) \
+  (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_FAKE_AUDIO_SINK, GstFakeAudioSinkClass))
+#define GST_IS_FAKE_AUDIO_SINK(obj) \
+  (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_FAKE_AUDIO_SINK))
+#define GST_IS_FAKE_AUDIO_SINK_CLASS(klass) \
+  (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_FAKE_AUDIO_SINK))
+
+G_BEGIN_DECLS
+
+typedef struct _GstFakeAudioSink GstFakeAudioSink;
+typedef struct _GstFakeAudioSinkClass GstFakeAudioSinkClass;
+
+struct _GstFakeAudioSink
+{
+  GstBin parent;
+  GstElement *child;
+  gdouble volume;
+  gboolean mute;
+};
+
+struct _GstFakeAudioSinkClass
+{
+  GstBinClass parent;
+};
+
+GType gst_fake_audio_sink_get_type (void);
+
+G_END_DECLS
+
+#endif
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstfakesinkutils.c
Added
@@ -0,0 +1,109 @@ +/* + * GStreamer + * Copyright (C) 2017 Collabora Inc. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstfakesinkutils.h" +#include <gst/base/gstbasesink.h> + +/* TODO complete the types */ +void +gst_fake_sink_proxy_properties (GstElement * self, + GstElement * child, guint property_id_offset) +{ + GObjectClass *object_class; + GParamSpec **properties; + guint n_properties, i; + + object_class = G_OBJECT_CLASS (GST_ELEMENT_GET_CLASS (self)); + properties = g_object_class_list_properties (G_OBJECT_GET_CLASS (child), + &n_properties); + + for (i = 0; i < n_properties; i++) { + guint property_id = i + property_id_offset; + + if (properties[i]->owner_type != G_OBJECT_TYPE (child) && + properties[i]->owner_type != GST_TYPE_BASE_SINK) + continue; + + if (G_IS_PARAM_SPEC_BOOLEAN (properties[i])) { + GParamSpecBoolean *prop = G_PARAM_SPEC_BOOLEAN (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_boolean (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + prop->default_value, properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_INT (properties[i])) { + 
GParamSpecInt *prop = G_PARAM_SPEC_INT (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_int (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + prop->minimum, prop->maximum, prop->default_value, + properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_UINT (properties[i])) { + GParamSpecUInt *prop = G_PARAM_SPEC_UINT (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_uint (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + prop->minimum, prop->maximum, prop->default_value, + properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_INT64 (properties[i])) { + GParamSpecInt64 *prop = G_PARAM_SPEC_INT64 (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_int64 (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + prop->minimum, prop->maximum, prop->default_value, + properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_UINT64 (properties[i])) { + GParamSpecUInt64 *prop = G_PARAM_SPEC_UINT64 (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_uint64 (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + prop->minimum, prop->maximum, prop->default_value, + properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_ENUM (properties[i])) { + GParamSpecEnum *prop = G_PARAM_SPEC_ENUM (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_enum (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + properties[i]->value_type, prop->default_value, + properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_STRING (properties[i])) { + 
GParamSpecString *prop = G_PARAM_SPEC_STRING (properties[i]); + g_object_class_install_property (object_class, property_id, + g_param_spec_string (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + prop->default_value, properties[i]->flags)); + } else if (G_IS_PARAM_SPEC_BOXED (properties[i])) { + g_object_class_install_property (object_class, property_id, + g_param_spec_boxed (g_param_spec_get_name (properties[i]), + g_param_spec_get_nick (properties[i]), + g_param_spec_get_blurb (properties[i]), + properties[i]->value_type, properties[i]->flags)); + } + } + + g_free (properties); +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstfakesinkutils.h
Added
@@ -0,0 +1,33 @@
+/*
+ * GStreamer
+ * Copyright (C) 2017 Collabora Inc.
+ * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_FAKE_SINK_UTILS_H__
+#define __GST_FAKE_SINK_UTILS_H__
+
+#include <gst/gst.h>
+
+G_BEGIN_DECLS
+
+void gst_fake_sink_proxy_properties (GstElement * self, GstElement * child, guint property_id_offset);
+
+G_END_DECLS
+
+#endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstfakevideosink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstfakevideosink.c
Changed
@@ -37,7 +37,9 @@ * Since 1.14 */ +#include "gstdebugutilsbadelements.h" #include "gstfakevideosink.h" +#include "gstfakesinkutils.h" #include <gst/video/video.h> @@ -84,6 +86,8 @@ GST_VIDEO_FORMATS_ALL))); G_DEFINE_TYPE (GstFakeVideoSink, gst_fake_video_sink, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (fakevideosink, "fakevideosink", + GST_RANK_NONE, gst_fake_video_sink_get_type ()); static gboolean gst_fake_video_sink_query (GstPad * pad, GstObject * parent, GstQuery * query) @@ -123,7 +127,6 @@ return TRUE; } -/* TODO complete the types and make this an utility */ static void gst_fake_video_sink_proxy_properties (GstFakeVideoSink * self, GstElement * child) @@ -131,100 +134,7 @@ static gsize initialized = 0; if (g_once_init_enter (&initialized)) { - GObjectClass *object_class; - GParamSpec **properties; - guint n_properties, i; - - object_class = G_OBJECT_CLASS (GST_FAKE_VIDEO_SINK_GET_CLASS (self)); - properties = g_object_class_list_properties (G_OBJECT_GET_CLASS (child), - &n_properties); - - /** - * GstFakeVideoSink:allocation-meta-flags - * - * Control the behaviour of the sink allocation query handler. 
- * - * Since: 1.18 - */ - g_object_class_install_property (object_class, PROP_ALLOCATION_META_FLAGS, - g_param_spec_flags ("allocation-meta-flags", "Flags", - "Flags to control behaviour", - GST_TYPE_FAKE_VIDEO_SINK_ALLOCATION_META_FLAGS, - ALLOCATION_META_DEFAULT_FLAGS, - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); - - - for (i = 0; i < n_properties; i++) { - guint property_id = i + PROP_LAST; - - if (properties[i]->owner_type != G_OBJECT_TYPE (child) && - properties[i]->owner_type != GST_TYPE_BASE_SINK) - continue; - - if (G_IS_PARAM_SPEC_BOOLEAN (properties[i])) { - GParamSpecBoolean *prop = G_PARAM_SPEC_BOOLEAN (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_boolean (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - prop->default_value, properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_INT (properties[i])) { - GParamSpecInt *prop = G_PARAM_SPEC_INT (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_int (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - prop->minimum, prop->maximum, prop->default_value, - properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_UINT (properties[i])) { - GParamSpecUInt *prop = G_PARAM_SPEC_UINT (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_uint (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - prop->minimum, prop->maximum, prop->default_value, - properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_INT64 (properties[i])) { - GParamSpecInt64 *prop = G_PARAM_SPEC_INT64 (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_int64 (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb 
(properties[i]), - prop->minimum, prop->maximum, prop->default_value, - properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_UINT64 (properties[i])) { - GParamSpecUInt64 *prop = G_PARAM_SPEC_UINT64 (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_uint64 (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - prop->minimum, prop->maximum, prop->default_value, - properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_ENUM (properties[i])) { - GParamSpecEnum *prop = G_PARAM_SPEC_ENUM (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_enum (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - properties[i]->value_type, prop->default_value, - properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_STRING (properties[i])) { - GParamSpecString *prop = G_PARAM_SPEC_STRING (properties[i]); - g_object_class_install_property (object_class, property_id, - g_param_spec_string (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - prop->default_value, properties[i]->flags)); - } else if (G_IS_PARAM_SPEC_BOXED (properties[i])) { - g_object_class_install_property (object_class, property_id, - g_param_spec_boxed (g_param_spec_get_name (properties[i]), - g_param_spec_get_nick (properties[i]), - g_param_spec_get_blurb (properties[i]), - properties[i]->value_type, properties[i]->flags)); - } - } - - g_free (properties); + gst_fake_sink_proxy_properties (GST_ELEMENT_CAST (self), child, PROP_LAST); g_once_init_leave (&initialized, 1); } } @@ -316,6 +226,20 @@ "Video/Sink", "Fake video display that allows zero-copy", "Nicolas Dufresne <nicolas.dufresne@collabora.com>"); + /** + * GstFakeVideoSink:allocation-meta-flags + * + * Control the behaviour of the sink allocation query handler. 
+ * + * Since: 1.18 + */ + g_object_class_install_property (object_class, PROP_ALLOCATION_META_FLAGS, + g_param_spec_flags ("allocation-meta-flags", "Flags", + "Flags to control behaviour", + GST_TYPE_FAKE_VIDEO_SINK_ALLOCATION_META_FLAGS, + ALLOCATION_META_DEFAULT_FLAGS, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + gst_type_mark_as_plugin_api (GST_TYPE_FAKE_VIDEO_SINK_ALLOCATION_META_FLAGS, 0); }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gsttestsrcbin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gsttestsrcbin.c
Changed
@@ -39,6 +39,7 @@ #include <gst/gst.h> #include <gst/base/gstflowcombiner.h> #include <gst/app/gstappsink.h> +#include "gstdebugutilsbadelements.h" static GstStaticPadTemplate video_src_template = GST_STATIC_PAD_TEMPLATE ("video_src_%u", @@ -75,6 +76,7 @@ GstStreamCollection *collection; gint group_id; GstFlowCombiner *flow_combiner; + GstCaps *streams_def; }; enum @@ -124,7 +126,7 @@ { ProbeData *data = g_malloc0 (sizeof (ProbeData)); - data->stream_start = gst_event_ref (stream_start); + data->stream_start = stream_start; data->collection = gst_object_ref (collection); return data; @@ -195,6 +197,9 @@ gst_test_src_bin_set_element_property (GQuark property_id, const GValue * value, GObject * element) { + if (property_id == g_quark_from_static_string ("__streamobj__")) + return TRUE; + if (G_VALUE_HOLDS_STRING (value)) gst_util_set_object_arg (element, g_quark_to_string (property_id), g_value_get_string (value)); @@ -265,11 +270,13 @@ gst_event_set_stream (stream_start, stream); gst_event_set_group_id (stream_start, self->group_id); + gst_structure_set (props, "__streamobj__", GST_TYPE_STREAM, stream, NULL); + gst_stream_collection_add_stream (collection, stream); + gst_pad_add_probe (pad, (GstPadProbeType) GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, (GstPadProbeCallback) src_pad_probe_cb, _probe_data_new (stream_start, collection), (GDestroyNotify) _probe_data_free); - gst_stream_collection_add_stream (collection, stream); g_free (stream_id); gst_bin_add (GST_BIN (self), src); @@ -288,14 +295,58 @@ gst_object_unref (pad); gst_element_sync_state_with_parent (src); *n_stream += 1; + + gst_structure_set (props, "__src__", GST_TYPE_OBJECT, src, NULL); } static void -gst_test_src_bin_remove_child (GValue * val, GstBin * self) +gst_test_src_bin_remove_child (GstElement * self, GstElement * child) { - GstElement *child = g_value_get_object (val); + GstPad *pad = gst_element_get_static_pad (child, "src"); + GstPad *ghost = + GST_PAD (gst_proxy_pad_get_internal (GST_PROXY_PAD 
(gst_pad_get_peer + (pad)))); + + + gst_element_set_locked_state (child, FALSE); + gst_element_set_state (child, GST_STATE_NULL); + gst_bin_remove (GST_BIN (self), child); + gst_element_remove_pad (self, ghost); +} + +static GstStream * +gst_test_check_prev_stream_def (GstTestSrcBin * self, GstCaps * prev_streams, + GstStructure * stream_def) +{ + gint i; + + if (!prev_streams) + return NULL; + + for (i = 0; i < gst_caps_get_size (prev_streams); i++) { + GstStructure *prev_stream = gst_caps_get_structure (prev_streams, i); + GstElement *e = NULL; + GstStream *stream = NULL; + + gst_structure_get (prev_stream, "__src__", GST_TYPE_OBJECT, &e, + "__streamobj__", GST_TYPE_STREAM, &stream, NULL); + gst_structure_remove_fields (prev_stream, "__src__", "__streamobj__", NULL); + if (gst_structure_is_equal (prev_stream, stream_def)) { + g_assert (stream); - gst_bin_remove (self, child); + gst_caps_remove_structure (prev_streams, i); + gst_structure_set (stream_def, "__src__", GST_TYPE_OBJECT, e, + "__streamobj__", GST_TYPE_STREAM, stream, NULL); + + g_assert (stream); + return stream; + } + + gst_structure_set (stream_def, "__src__", GST_TYPE_OBJECT, e, + "__streamobj__", GST_TYPE_STREAM, stream, NULL); + } + + return NULL; } static gboolean @@ -306,31 +357,34 @@ gchar *tmp, *location = gst_uri_get_location (uri); gint i, n_audio = 0, n_video = 0; GstStreamCollection *collection = gst_stream_collection_new (NULL); - GstIterator *it; - GstCaps *streams_defs; + GstCaps *streams_def, *prev_streams = self->streams_def; for (tmp = location; *tmp != '\0'; tmp++) if (*tmp == '+') *tmp = ';'; - streams_defs = gst_caps_from_string (location); + streams_def = gst_caps_from_string (location); g_free (location); - if (!streams_defs) + if (!streams_def) goto failed; - /* Clear us up */ - it = gst_bin_iterate_elements (GST_BIN (self)); - while (gst_iterator_foreach (it, - (GstIteratorForeachFunction) gst_test_src_bin_remove_child, - self) == GST_ITERATOR_RESYNC) - gst_iterator_resync 
(it); - - gst_iterator_free (it); - self->group_id = gst_util_group_id_next (); - for (i = 0; i < gst_caps_get_size (streams_defs); i++) { - GstStructure *stream_def = gst_caps_get_structure (streams_defs, i); + for (i = 0; i < gst_caps_get_size (streams_def); i++) { + GstStream *stream; + GstStructure *stream_def = gst_caps_get_structure (streams_def, i); + + if ((stream = + gst_test_check_prev_stream_def (self, prev_streams, stream_def))) { + GST_INFO_OBJECT (self, + "Reusing already existing stream: %" GST_PTR_FORMAT, stream_def); + gst_stream_collection_add_stream (collection, stream); + if (gst_structure_has_name (stream_def, "video")) + n_video++; + else + n_audio++; + continue; + } if (gst_structure_has_name (stream_def, "video")) gst_test_src_bin_setup_src (self, "videotestsrc", &video_src_template, @@ -342,6 +396,18 @@ GST_ERROR_OBJECT (self, "Unknown type %s", gst_structure_get_name (stream_def)); } + self->streams_def = streams_def; + + if (prev_streams) { + for (i = 0; i < gst_caps_get_size (prev_streams); i++) { + GstStructure *prev_stream = gst_caps_get_structure (prev_streams, i); + GstElement *child; + + gst_structure_get (prev_stream, "__src__", GST_TYPE_OBJECT, &child, NULL); + gst_test_src_bin_remove_child (GST_ELEMENT (self), child); + } + gst_clear_caps (&prev_streams); + } if (!n_video && !n_audio) goto failed; @@ -377,6 +443,9 @@ G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_test_src_bin_uri_handler_init)) /* *INDENT-ON* */ +GST_ELEMENT_REGISTER_DEFINE (testsrcbin, "testsrcbin", + GST_RANK_NONE, gst_test_src_bin_get_type ()); + static void gst_test_src_bin_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) @@ -450,6 +519,7 @@ GstTestSrcBin *self = GST_TEST_SRC_BIN (object); g_free (self->uri); + gst_clear_caps (&self->streams_def); gst_flow_combiner_free (self->flow_combiner); }
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstvideocodectestsink.c
Added
@@ -0,0 +1,426 @@ +/* GStreamer + * Copyright (C) 2021 Collabora Ltd. + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gio/gio.h> + +#include "gstdebugutilsbadelements.h" +#include "gstvideocodectestsink.h" + +/** + * SECTION:videocodectestsink + * + * An element that computes the checksum of a video stream and/or writes back its + * raw I420 data ignoring the padding introduced by GStreamer. This element is + * meant to be used for CODEC conformance testing. It also supports producing an I420 + * checksum and and can write out a file in I420 layout directly from NV12 input + * data. + * + * The checksum is communicated back to the application just before EOS + * message with an element message of type `conformance/checksum` with the + * following fields: + * + * * "checksum-type" G_TYPE_STRING The checksum type (only MD5 is supported) + * * "checksum" G_TYPE_STRING The checksum as a string + * + * ## Example launch lines + * |[ + * gst-launch-1.0 videotestsrc num-buffers=2 ! 
videocodectestsink location=true-raw.yuv -m + * ]| + * + * Since: 1.20 + */ + +enum +{ + PROP_0, + PROP_LOCATION, +}; + +struct _GstVideoCodecTestSink +{ + GstBaseSink parent; + GChecksumType hash; + + /* protect with stream lock */ + GstVideoInfo vinfo; + GstFlowReturn (*process) (GstVideoCodecTestSink * self, + GstVideoFrame * frame); + GOutputStream *ostream; + GChecksum *checksum; + + /* protect with object lock */ + gchar *location; +}; + +static GstStaticPadTemplate gst_video_codec_test_sink_template = +GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw, format = { I420, I420_10LE, NV12 }")); + +#define gst_video_codec_test_sink_parent_class parent_class +G_DEFINE_TYPE (GstVideoCodecTestSink, gst_video_codec_test_sink, + GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE (videocodectestsink, "videocodectestsink", + GST_RANK_NONE, gst_video_codec_test_sink_get_type ()); + +static void +gst_video_codec_test_sink_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (object); + + GST_OBJECT_LOCK (self); + + switch (prop_id) { + case PROP_LOCATION: + g_free (self->location); + self->location = g_value_dup_string (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + + GST_OBJECT_UNLOCK (self); +} + +static void +gst_video_codec_test_sink_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (object); + + GST_OBJECT_LOCK (self); + + switch (prop_id) { + case PROP_LOCATION: + g_value_set_string (value, self->location); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + + GST_OBJECT_UNLOCK (self); +} + +static gboolean +gst_video_codec_test_sink_start (GstBaseSink * sink) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK 
(sink); + GError *error = NULL; + GFile *file = NULL; + gboolean ret = TRUE; + + GST_OBJECT_LOCK (self); + + self->checksum = g_checksum_new (self->hash); + if (self->location) + file = g_file_new_for_path (self->location); + + GST_OBJECT_UNLOCK (self); + + if (file) { + self->ostream = G_OUTPUT_STREAM (g_file_replace (file, NULL, FALSE, + G_FILE_CREATE_REPLACE_DESTINATION, NULL, &error)); + if (!self->ostream) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, + ("Failed to open '%s' for writing.", self->location), + ("Open failed failed: %s", error->message)); + g_error_free (error); + ret = FALSE; + } + + g_object_unref (file); + } + + return ret; +} + +static gboolean +gst_video_codec_test_sink_stop (GstBaseSink * sink) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (sink); + + g_checksum_free (self->checksum); + self->checksum = NULL; + + if (self->ostream) { + GError *error = NULL; + + if (!g_output_stream_close (self->ostream, NULL, &error)) { + GST_ELEMENT_WARNING (self, RESOURCE, CLOSE, + ("Did not close '%s' properly", self->location), + ("Failed to close stream: %s", error->message)); + } + + g_clear_object (&self->ostream); + } + + return TRUE; +} + +static GstFlowReturn +gst_video_codec_test_sink_process_data (GstVideoCodecTestSink * self, + const guchar * data, gssize length) +{ + GError *error = NULL; + + g_checksum_update (self->checksum, data, length); + + if (!self->ostream) + return GST_FLOW_OK; + + if (!g_output_stream_write_all (self->ostream, data, length, NULL, NULL, + &error)) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, + ("Failed to write video data into '%s'", self->location), + ("Writing %" G_GSIZE_FORMAT " bytes failed: %s", length, + error->message)); + g_error_free (error); + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_video_codec_test_sink_process_i420 (GstVideoCodecTestSink * self, + GstVideoFrame * frame) +{ + guint plane; + + for (plane = 0; plane < 3; plane++) { + gint y; + guint 
stride; + const guchar *data; + + stride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, plane); + data = GST_VIDEO_FRAME_PLANE_DATA (frame, plane); + + for (y = 0; y < GST_VIDEO_INFO_COMP_HEIGHT (&self->vinfo, plane); y++) { + gsize length = GST_VIDEO_INFO_COMP_WIDTH (&self->vinfo, plane) * + GST_VIDEO_INFO_COMP_PSTRIDE (&self->vinfo, plane); + GstFlowReturn ret; + + ret = gst_video_codec_test_sink_process_data (self, data, length); + if (ret != GST_FLOW_OK) + return ret; + + data += stride; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_video_codec_test_sink_process_nv12 (GstVideoCodecTestSink * self, + GstVideoFrame * frame) +{ + gint x, y, comp; + guint stride; + const guchar *data; + + stride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 0); + data = GST_VIDEO_FRAME_PLANE_DATA (frame, 0); + + for (y = 0; y < GST_VIDEO_INFO_HEIGHT (&self->vinfo); y++) { + gsize length = GST_VIDEO_INFO_WIDTH (&self->vinfo); + GstFlowReturn ret; + + ret = gst_video_codec_test_sink_process_data (self, data, length); + if (ret != GST_FLOW_OK) + return ret; + + data += stride; + } + + /* Deinterleave the UV plane */ + stride = GST_VIDEO_FRAME_PLANE_STRIDE (frame, 1); + + for (comp = 0; comp < 2; comp++) { + data = GST_VIDEO_FRAME_PLANE_DATA (frame, 1); + + for (y = 0; y < GST_VIDEO_INFO_COMP_HEIGHT (&self->vinfo, 1); y++) { + guint width = GST_ROUND_UP_2 (GST_VIDEO_INFO_WIDTH (&self->vinfo)) / 2; + + for (x = 0; x < width; x++) { + GstFlowReturn ret; + + ret = gst_video_codec_test_sink_process_data (self, + data + 2 * x + comp, 1); + + if (ret != GST_FLOW_OK) + return ret; + } + + data += stride; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_video_codec_test_sink_render (GstBaseSink * sink, GstBuffer * buffer) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (sink); + GstVideoFrame frame; + + if (!gst_video_frame_map (&frame, &self->vinfo, buffer, GST_MAP_READ)) + return GST_FLOW_ERROR; + + self->process (self, &frame); + + gst_video_frame_unmap 
(&frame); + return GST_FLOW_OK; +} + +static gboolean +gst_video_codec_test_sink_set_caps (GstBaseSink * sink, GstCaps * caps) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (sink); + + if (!gst_video_info_from_caps (&self->vinfo, caps)) + return FALSE; + + switch (GST_VIDEO_INFO_FORMAT (&self->vinfo)) { + case GST_VIDEO_FORMAT_I420: + case GST_VIDEO_FORMAT_I420_10LE: + self->process = gst_video_codec_test_sink_process_i420; + break; + case GST_VIDEO_FORMAT_NV12: + self->process = gst_video_codec_test_sink_process_nv12; + break; + default: + g_assert_not_reached (); + break; + } + + return TRUE; +} + +static gboolean +gst_video_codec_test_sink_propose_allocation (GstBaseSink * sink, + GstQuery * query) +{ + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + return TRUE; +} + +static gboolean +gst_video_codec_test_sink_event (GstBaseSink * sink, GstEvent * event) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (sink); + + if (event->type == GST_EVENT_EOS) { + const gchar *checksum_type = "UNKNOWN"; + + switch (self->hash) { + case G_CHECKSUM_MD5: + checksum_type = "MD5"; + break; + case G_CHECKSUM_SHA1: + checksum_type = "SHA1"; + break; + case G_CHECKSUM_SHA256: + checksum_type = "SHA256"; + break; + case G_CHECKSUM_SHA512: + checksum_type = "SHA512"; + break; + case G_CHECKSUM_SHA384: + checksum_type = "SHA384"; + break; + default: + g_assert_not_reached (); + break; + } + + gst_element_post_message (GST_ELEMENT (self), + gst_message_new_element (GST_OBJECT (self), + gst_structure_new ("conformance/checksum", "checksum-type", + G_TYPE_STRING, checksum_type, "checksum", G_TYPE_STRING, + g_checksum_get_string (self->checksum), NULL))); + g_checksum_reset (self->checksum); + } + + return GST_BASE_SINK_CLASS (parent_class)->event (sink, event); +} + +static void +gst_video_codec_test_sink_init (GstVideoCodecTestSink * sink) +{ + gst_base_sink_set_sync (GST_BASE_SINK (sink), FALSE); + sink->hash = G_CHECKSUM_MD5; +} + 
+static void +gst_video_codec_test_sink_finalize (GObject * object) +{ + GstVideoCodecTestSink *self = GST_VIDEO_CODEC_TEST_SINK (object); + + g_free (self->location); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_video_codec_test_sink_class_init (GstVideoCodecTestSinkClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseSinkClass *base_sink_class = GST_BASE_SINK_CLASS (klass); + + gobject_class->set_property = gst_video_codec_test_sink_set_property; + gobject_class->get_property = gst_video_codec_test_sink_get_property; + gobject_class->finalize = gst_video_codec_test_sink_finalize; + + base_sink_class->start = GST_DEBUG_FUNCPTR (gst_video_codec_test_sink_start); + base_sink_class->stop = GST_DEBUG_FUNCPTR (gst_video_codec_test_sink_stop); + base_sink_class->render = + GST_DEBUG_FUNCPTR (gst_video_codec_test_sink_render); + base_sink_class->set_caps = + GST_DEBUG_FUNCPTR (gst_video_codec_test_sink_set_caps); + base_sink_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_video_codec_test_sink_propose_allocation); + base_sink_class->event = GST_DEBUG_FUNCPTR (gst_video_codec_test_sink_event); + + gst_element_class_add_static_pad_template (element_class, + &gst_video_codec_test_sink_template); + + g_object_class_install_property (gobject_class, PROP_LOCATION, + g_param_spec_string ("location", "Location", + "File path to store non-padded I420 stream (optional).", NULL, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + gst_element_class_set_static_metadata (element_class, + "Video CODEC Test Sink", "Debug/video/Sink", + "Sink to test video CODEC conformance", + "Nicolas Dufresne <nicolas.dufresne@collabora.com"); +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstvideocodectestsink.h
Added
@@ -0,0 +1,32 @@
+/* GStreamer
+ * Copyright (C) 2021 Collabora Ltd.
+ * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#include <gst/gst.h>
+#include <gst/base/gstbasesink.h>
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_VIDEO_CODEC_TEST_SINK gst_video_codec_test_sink_get_type ()
+G_DECLARE_FINAL_TYPE (GstVideoCodecTestSink, gst_video_codec_test_sink, GST,
+    VIDEO_CODEC_TEST_SINK, GstBaseSink);
+
+G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/gstwatchdog.c -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/gstwatchdog.c
Changed
@@ -46,6 +46,7 @@
 
 #include <gst/gst.h>
 #include <gst/base/gstbasetransform.h>
+#include "gstdebugutilsbadelements.h"
 #include "gstwatchdog.h"
 
 GST_DEBUG_CATEGORY_STATIC (gst_watchdog_debug_category);
@@ -83,6 +84,8 @@
 G_DEFINE_TYPE_WITH_CODE (GstWatchdog, gst_watchdog, GST_TYPE_BASE_TRANSFORM,
     GST_DEBUG_CATEGORY_INIT (gst_watchdog_debug_category, "watchdog", 0,
         "debug category for watchdog element"));
+GST_ELEMENT_REGISTER_DEFINE (watchdog, "watchdog", GST_RANK_NONE,
+    gst_watchdog_get_type ());
 
 static void
 gst_watchdog_class_init (GstWatchdogClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/debugutils/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/debugutils/meson.build
Changed
@@ -1,22 +1,25 @@
 debugutilsbad_sources = [
-  'gstdebugspy.c',
-  'gsterrorignore.c',
   'debugutilsbad.c',
   'fpsdisplaysink.c',
   'gstchecksumsink.c',
   'gstchopmydata.c',
+  'gstclockselect.c',
   'gstcompare.c',
+  'gstdebugspy.c',
+  'gsterrorignore.c',
+  'gstfakeaudiosink.c',
+  'gstfakesinkutils.c',
   'gstfakevideosink.c',
-  'gstwatchdog.c',
   'gsttestsrcbin.c',
-  'gstclockselect.c',
+  'gstvideocodectestsink.c',
+  'gstwatchdog.c',
 ]
 
 gstdebugutilsbad = library('gstdebugutilsbad',
   debugutilsbad_sources,
   c_args : gst_plugins_bad_args,
   include_directories : [configinc],
-  dependencies : [gstbase_dep, gstvideo_dep, gstnet_dep],
+  dependencies : [gstbase_dep, gstvideo_dep, gstnet_dep, gstaudio_dep, gio_dep],
   install : true,
   install_dir : plugins_install_dir,
 )
View file
gst-plugins-bad-1.18.6.tar.xz/gst/dvbsubenc/gstdvbsubenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/dvbsubenc/gstdvbsubenc.c
Changed
@@ -51,6 +51,10 @@
 #define gst_dvb_sub_enc_parent_class parent_class
 G_DEFINE_TYPE (GstDvbSubEnc, gst_dvb_sub_enc, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dvbsubenc, "dvbsubenc", GST_RANK_NONE,
+    GST_TYPE_DVB_SUB_ENC, GST_DEBUG_CATEGORY_INIT (gst_dvb_sub_enc_debug,
+        "dvbsubenc", 0, "DVB subtitle encoder");
+    );
 
 static void gst_dvb_sub_enc_get_property (GObject * object, guint prop_id,
     GValue * value, GParamSpec * pspec);
@@ -591,15 +595,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  if (!gst_element_register (plugin, "dvbsubenc", GST_RANK_NONE,
-          GST_TYPE_DVB_SUB_ENC)) {
-    return FALSE;
-  }
-
-  GST_DEBUG_CATEGORY_INIT (gst_dvb_sub_enc_debug, "dvbsubenc", 0,
-      "DVB subtitle encoder");
-
-  return TRUE;
+  return GST_ELEMENT_REGISTER (dvbsubenc, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/dvbsubenc/gstdvbsubenc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/dvbsubenc/gstdvbsubenc.h
Changed
@@ -64,6 +64,7 @@
 };
 
 GType gst_dvb_sub_enc_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (dvbsubenc);
 
 gboolean gst_dvbsubenc_ayuv_to_ayuv8p (GstVideoFrame * src,
     GstVideoFrame * dest, int max_colours, guint32 *out_num_colours);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/dvbsuboverlay/gstdvbsuboverlay.c -> gst-plugins-bad-1.20.1.tar.xz/gst/dvbsuboverlay/gstdvbsuboverlay.c
Changed
@@ -107,6 +107,11 @@
 
 #define gst_dvbsub_overlay_parent_class parent_class
 G_DEFINE_TYPE (GstDVBSubOverlay, gst_dvbsub_overlay, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dvbsuboverlay, "dvbsuboverlay",
+    GST_RANK_PRIMARY, GST_TYPE_DVBSUB_OVERLAY,
+    GST_DEBUG_CATEGORY_INIT (gst_dvbsub_overlay_debug, "dvbsuboverlay", 0,
+        "DVB subtitle overlay");
+    );
 
 static GstCaps *gst_dvbsub_overlay_get_videosink_caps (GstDVBSubOverlay *
     render, GstPad * pad, GstCaps * filter);
@@ -1303,11 +1308,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (gst_dvbsub_overlay_debug, "dvbsuboverlay",
-      0, "DVB subtitle overlay");
-
-  return gst_element_register (plugin, "dvbsuboverlay",
-      GST_RANK_PRIMARY, GST_TYPE_DVBSUB_OVERLAY);
+  return GST_ELEMENT_REGISTER (dvbsuboverlay, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/dvbsuboverlay/gstdvbsuboverlay.h -> gst-plugins-bad-1.20.1.tar.xz/gst/dvbsuboverlay/gstdvbsuboverlay.h
Changed
@@ -76,6 +76,7 @@
 };
 
 GType gst_dvbsub_overlay_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (dvbsuboverlay);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/dvdspu/gstdvdspu.c -> gst-plugins-bad-1.20.1.tar.xz/gst/dvdspu/gstdvdspu.c
Changed
@@ -75,8 +75,11 @@
     GST_STATIC_CAPS ("subpicture/x-dvd; subpicture/x-pgs")
     );
 
+static gboolean dvd_spu_element_init (GstPlugin * plugin);
+
 #define gst_dvd_spu_parent_class parent_class
 G_DEFINE_TYPE (GstDVDSpu, gst_dvd_spu, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE_CUSTOM (dvdspu, dvd_spu_element_init);
 
 static void gst_dvd_spu_dispose (GObject * object);
 static void gst_dvd_spu_finalize (GObject * object);
@@ -1238,7 +1241,7 @@
 }
 
 static gboolean
-gst_dvd_spu_plugin_init (GstPlugin * plugin)
+dvd_spu_element_init (GstPlugin * plugin)
 {
   const gchar *env;
 
@@ -1260,6 +1263,12 @@
       GST_RANK_PRIMARY, GST_TYPE_DVD_SPU);
 }
 
+static gboolean
+gst_dvd_spu_plugin_init (GstPlugin * plugin)
+{
+  return GST_ELEMENT_REGISTER (dvdspu, plugin);
+}
+
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
     GST_VERSION_MINOR,
     dvdspu,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/dvdspu/gstdvdspu.h -> gst-plugins-bad-1.20.1.tar.xz/gst/dvdspu/gstdvdspu.h
Changed
@@ -123,6 +123,7 @@
 };
 
 GType gst_dvd_spu_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (dvdspu);
 
 typedef enum {
   GST_DVD_SPU_DEBUG_RENDER_RECTANGLE = (1 << 0),
View file
gst-plugins-bad-1.18.6.tar.xz/gst/faceoverlay/gstfaceoverlay.c -> gst-plugins-bad-1.20.1.tar.xz/gst/faceoverlay/gstfaceoverlay.c
Changed
@@ -89,6 +89,11 @@
 
 #define gst_face_overlay_parent_class parent_class
 G_DEFINE_TYPE (GstFaceOverlay, gst_face_overlay, GST_TYPE_BIN);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (faceoverlay, "faceoverlay",
+    GST_RANK_NONE, GST_TYPE_FACEOVERLAY,
+    GST_DEBUG_CATEGORY_INIT (gst_face_overlay_debug, "faceoverlay", 0,
+        "SVG Face Overlay");
+    );
 
 static void gst_face_overlay_set_property (GObject * object, guint prop_id,
     const GValue * value, GParamSpec * pspec);
@@ -430,13 +435,9 @@
 }
 
 static gboolean
-faceoverlay_init (GstPlugin * faceoverlay)
+faceoverlay_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (gst_face_overlay_debug, "faceoverlay",
-      0, "SVG Face Overlay");
-
-  return gst_element_register (faceoverlay, "faceoverlay", GST_RANK_NONE,
-      GST_TYPE_FACEOVERLAY);
+  return GST_ELEMENT_REGISTER (faceoverlay, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/faceoverlay/gstfaceoverlay.h -> gst-plugins-bad-1.20.1.tar.xz/gst/faceoverlay/gstfaceoverlay.h
Changed
@@ -88,6 +88,7 @@
 };
 
 GType gst_face_overlay_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (faceoverlay);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/festival/gstfestival.c -> gst-plugins-bad-1.20.1.tar.xz/gst/festival/gstfestival.c
Changed
@@ -150,9 +150,14 @@
 
 /*static guint gst_festival_signals[LAST_SIGNAL] = { 0 }; */
 
-G_DEFINE_TYPE (GstFestival, gst_festival, GST_TYPE_ELEMENT)
+G_DEFINE_TYPE (GstFestival, gst_festival, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (festival, "festival", GST_RANK_NONE,
+    GST_TYPE_FESTIVAL, GST_DEBUG_CATEGORY_INIT (festival_debug, "festival",
+        0, "Festival text-to-speech synthesizer");
+    );
 
-     static void gst_festival_class_init (GstFestivalClass * klass)
+static void
+gst_festival_class_init (GstFestivalClass * klass)
 {
   GObjectClass *gobject_class;
   GstElementClass *gstelement_class;
@@ -524,14 +529,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (festival_debug, "festival",
-      0, "Festival text-to-speech synthesizer");
-
-  if (!gst_element_register (plugin, "festival", GST_RANK_NONE,
-          GST_TYPE_FESTIVAL))
-    return FALSE;
-
-  return TRUE;
+  return GST_ELEMENT_REGISTER (festival, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/festival/gstfestival.h -> gst-plugins-bad-1.20.1.tar.xz/gst/festival/gstfestival.h
Changed
@@ -116,6 +116,7 @@
 };
 
 GType gst_festival_get_type(void);
+GST_ELEMENT_REGISTER_DECLARE (festival);
 
 #ifdef __cplusplus
 }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/festival/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/festival/meson.build
Changed
@@ -6,7 +6,7 @@
   festival_sources,
   c_args : gst_plugins_bad_args,
   include_directories : [configinc],
-  dependencies : [gstbase_dep, gstaudio_dep] + winsock2,
+  dependencies : [gstbase_dep, gstaudio_dep] + winsock2 + network_deps,
   install : true,
   install_dir : plugins_install_dir,
 )
View file
gst-plugins-bad-1.18.6.tar.xz/gst/fieldanalysis/gstfieldanalysis.c -> gst-plugins-bad-1.20.1.tar.xz/gst/fieldanalysis/gstfieldanalysis.c
Changed
@@ -108,6 +108,12 @@
         GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{YUY2,UYVY,Y42B,I420,YV12}")));
 
 G_DEFINE_TYPE (GstFieldAnalysis, gst_field_analysis, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (fieldanalysis, "fieldanalysis",
+    GST_RANK_NONE, GST_TYPE_FIELDANALYSIS,
+    GST_DEBUG_CATEGORY_INIT (gst_field_analysis_debug, "fieldanalysis", 0,
+        "Video field analysis");
+    );
+
 #define parent_class gst_field_analysis_parent_class
 
 static void gst_field_analysis_set_property (GObject * object, guint prop_id,
@@ -1840,13 +1846,9 @@
 
 static gboolean
-fieldanalysis_init (GstPlugin * fieldanalysis)
+fieldanalysis_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (gst_field_analysis_debug, "fieldanalysis",
-      0, "Video field analysis");
-
-  return gst_element_register (fieldanalysis, "fieldanalysis", GST_RANK_NONE,
-      GST_TYPE_FIELDANALYSIS);
+  return GST_ELEMENT_REGISTER (fieldanalysis, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/fieldanalysis/gstfieldanalysis.h -> gst-plugins-bad-1.20.1.tar.xz/gst/fieldanalysis/gstfieldanalysis.h
Changed
@@ -143,6 +143,7 @@
 };
 
 GType gst_field_analysis_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (fieldanalysis);
 
 G_END_DECLS
 
 #endif /* __GST_FIELDANALYSIS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/freeverb/gstfreeverb.c -> gst-plugins-bad-1.20.1.tar.xz/gst/freeverb/gstfreeverb.c
Changed
@@ -347,6 +347,8 @@
 G_DEFINE_TYPE_WITH_CODE (GstFreeverb, gst_freeverb, GST_TYPE_BASE_TRANSFORM,
     G_ADD_PRIVATE (GstFreeverb)
     G_IMPLEMENT_INTERFACE (GST_TYPE_PRESET, NULL));
+GST_ELEMENT_REGISTER_DEFINE (freeverb, "freeverb",
+    GST_RANK_NONE, GST_TYPE_FREEVERB);
 
 static void
 freeverb_revmodel_init (GstFreeverb * filter)
@@ -928,8 +930,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  return gst_element_register (plugin, "freeverb",
-      GST_RANK_NONE, GST_TYPE_FREEVERB);
+  return GST_ELEMENT_REGISTER (freeverb, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/freeverb/gstfreeverb.h -> gst-plugins-bad-1.20.1.tar.xz/gst/freeverb/gstfreeverb.h
Changed
@@ -62,6 +62,7 @@
 };
 
 GType gst_freeverb_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (freeverb);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstburn.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstburn.c
Changed
@@ -64,17 +64,19 @@
 #include <gst/gst.h>
 #include <math.h>
 
-#include "gstplugin.h"
 #include "gstburn.h"
 #include "gstgaudieffectsorc.h"
 
-#define gst_burn_parent_class parent_class
-G_DEFINE_TYPE (GstBurn, gst_burn, GST_TYPE_VIDEO_FILTER);
-
 GST_DEBUG_CATEGORY_STATIC (gst_burn_debug);
 #define GST_CAT_DEFAULT gst_burn_debug
 
+#define gst_burn_parent_class parent_class
+G_DEFINE_TYPE (GstBurn, gst_burn, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (burn, "burn", GST_RANK_NONE,
+    GST_TYPE_BURN, GST_DEBUG_CATEGORY_INIT (gst_burn_debug, "burn", 0,
+        "Template burn"));
+
 #if G_BYTE_ORDER == G_LITTLE_ENDIAN
 #define CAPS_STR GST_VIDEO_CAPS_MAKE ("{ BGRx, RGBx }")
 #else
@@ -246,14 +248,3 @@
 
   return GST_FLOW_OK;
 }
-
-/* Entry point to initialize the plug-in.
- * Register the element factories and other features. */
-gboolean
-gst_burn_plugin_init (GstPlugin * burn)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_burn_debug, "burn", 0, "Template burn");
-
-  return gst_element_register (burn, "burn", GST_RANK_NONE, GST_TYPE_BURN);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstburn.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstburn.h
Changed
@@ -81,6 +81,7 @@
 };
 
 GType gst_burn_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (burn);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstchromium.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstchromium.c
Changed
@@ -64,15 +64,17 @@
 #include <math.h>
 #include <gst/gst.h>
 
-#include "gstplugin.h"
 #include "gstchromium.h"
 
-#define gst_chromium_parent_class parent_class
-G_DEFINE_TYPE (GstChromium, gst_chromium, GST_TYPE_VIDEO_FILTER);
-
 GST_DEBUG_CATEGORY_STATIC (gst_chromium_debug);
 #define GST_CAT_DEFAULT gst_chromium_debug
 
+#define gst_chromium_parent_class parent_class
+G_DEFINE_TYPE (GstChromium, gst_chromium, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (chromium, "chromium", GST_RANK_NONE,
+    GST_TYPE_CHROMIUM, GST_DEBUG_CATEGORY_INIT (gst_chromium_debug, "chromium",
+        0, "Template chromium"));
+
 #if G_BYTE_ORDER == G_LITTLE_ENDIAN
 #define CAPS_STR GST_VIDEO_CAPS_MAKE ("{ BGRx, RGBx }")
 #else
@@ -275,19 +277,6 @@
   return GST_FLOW_OK;
 }
 
-/* Entry point to initialize the plug-in.
- * Register the element factories and other features. */
-gboolean
-gst_chromium_plugin_init (GstPlugin * chromium)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_chromium_debug, "chromium", 0,
-      "Template chromium");
-
-  return gst_element_register (chromium, "chromium", GST_RANK_NONE,
-      GST_TYPE_CHROMIUM);
-}
-
 /*** Now the image processing work.... ***/
 /* Set up the cosine table. */
 void
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstchromium.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstchromium.h
Changed
@@ -79,6 +79,7 @@
 };
 
 GType gst_chromium_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (chromium);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstdilate.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstdilate.c
Changed
@@ -64,7 +64,6 @@
 #include <gst/gst.h>
 #include <math.h>
 
-#include "gstplugin.h"
 #include "gstdilate.h"
 
 GST_DEBUG_CATEGORY_STATIC (gst_dilate_debug);
@@ -78,6 +77,9 @@
 
 #define gst_dilate_parent_class parent_class
 G_DEFINE_TYPE (GstDilate, gst_dilate, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dilate, "dilate", GST_RANK_NONE,
+    GST_TYPE_DILATE, GST_DEBUG_CATEGORY_INIT (gst_dilate_debug, "dilate", 0,
+        "Template dilate"));
 
 /* Filter signals and args. */
 enum
@@ -250,18 +252,6 @@
   return GST_FLOW_OK;
 }
 
-/* Entry point to initialize the plug-in.
- * Register the element factories and other features. */
-gboolean
-gst_dilate_plugin_init (GstPlugin * dilate)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_dilate_debug, "dilate", 0, "Template dilate");
-
-  return gst_element_register (dilate, "dilate", GST_RANK_NONE,
-      GST_TYPE_DILATE);
-}
-
 /*** Now the image processing work.... ***/
 
 /* Return luminance of the color */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstdilate.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstdilate.h
Changed
@@ -81,6 +81,7 @@
 };
 
 GType gst_dilate_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (dilate);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstdodge.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstdodge.c
Changed
@@ -64,7 +64,6 @@
 #include <gst/gst.h>
 #include <math.h>
 
-#include "gstplugin.h"
 #include "gstdodge.h"
 
 GST_DEBUG_CATEGORY_STATIC (gst_dodge_debug);
@@ -78,6 +77,9 @@
 
 #define gst_dodge_parent_class parent_class
 G_DEFINE_TYPE (GstDodge, gst_dodge, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dodge, "dodge", GST_RANK_NONE,
+    GST_TYPE_DODGE, GST_DEBUG_CATEGORY_INIT (gst_dodge_debug, "dodge", 0,
+        "Template dodge"));
 
 /* Filter signals and args. */
 enum
@@ -223,17 +225,6 @@
   return GST_FLOW_OK;
 }
 
-/* Entry point to initialize the plug-in.
- * Register the element factories and other features. */
-gboolean
-gst_dodge_plugin_init (GstPlugin * dodge)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_dodge_debug, "dodge", 0, "Template dodge");
-
-  return gst_element_register (dodge, "dodge", GST_RANK_NONE, GST_TYPE_DODGE);
-}
-
 /*** Now the image processing work.... ***/
 
 /* Transform processes each frame. */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstdodge.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstdodge.h
Changed
@@ -78,6 +78,7 @@
 };
 
 GType gst_dodge_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (dodge);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstexclusion.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstexclusion.c
Changed
@@ -64,7 +64,6 @@
 #include <gst/gst.h>
 #include <math.h>
 
-#include "gstplugin.h"
 #include "gstexclusion.h"
 
 #include <gst/video/video.h>
@@ -115,6 +114,9 @@
 
 #define gst_exclusion_parent_class parent_class
 G_DEFINE_TYPE (GstExclusion, gst_exclusion, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (exclusion, "exclusion", GST_RANK_NONE,
+    GST_TYPE_EXCLUSION, GST_DEBUG_CATEGORY_INIT (gst_exclusion_debug,
+        "exclusion", 0, "Template exclusion"));
 
 static void gst_exclusion_set_property (GObject * object, guint prop_id,
     const GValue * value, GParamSpec * pspec);
@@ -248,19 +250,6 @@
   return GST_FLOW_OK;
 }
 
-/* Entry point to initialize the plug-in.
- * Register the element factories and other features. */
-gboolean
-gst_exclusion_plugin_init (GstPlugin * exclusion)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_exclusion_debug, "exclusion",
-      0, "Template exclusion");
-
-  return gst_element_register (exclusion, "exclusion", GST_RANK_NONE,
-      GST_TYPE_EXCLUSION);
-}
-
 /*** Now the image processing work.... ***/
 
 /* Transform processes each frame. */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstexclusion.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstexclusion.h
Changed
@@ -81,6 +81,7 @@
 };
 
 GType gst_exclusion_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (exclusion);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstgaussblur.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstgaussblur.c
Changed
@@ -66,7 +66,6 @@
 #include <math.h>
 
 #include <gst/gst.h>
-#include "gstplugin.h"
 #include "gstgaussblur.h"
 
 static void gst_gaussianblur_finalize (GObject * object);
@@ -120,7 +119,10 @@
 
 #define gst_gaussianblur_parent_class parent_class
 G_DEFINE_TYPE (GstGaussianBlur, gst_gaussianblur, GST_TYPE_VIDEO_FILTER);
-
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (gaussianblur, "gaussianblur",
+    GST_RANK_NONE, GST_TYPE_GAUSSIANBLUR,
+    GST_DEBUG_CATEGORY_INIT (gst_gauss_blur_debug, "gaussianblur", 0,
+        "Gaussian Blur video effect"));
 #define DEFAULT_SIGMA 1.2
 
 /* Initialize the gaussianblur's class. */
@@ -452,15 +454,3 @@
       break;
   }
 }
-
-/* Register the element factories and other features. */
-gboolean
-gst_gauss_blur_plugin_init (GstPlugin * plugin)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_gauss_blur_debug, "gaussianblur",
-      0, "Gaussian Blur video effect");
-
-  return gst_element_register (plugin, "gaussianblur", GST_RANK_NONE,
-      GST_TYPE_GAUSSIANBLUR);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstgaussblur.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstgaussblur.h
Changed
@@ -81,6 +81,7 @@
 };
 
 GType gst_gaussianblur_get_type(void);
+GST_ELEMENT_REGISTER_DECLARE (gaussianblur);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstplugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstplugin.c
Changed
@@ -49,7 +49,14 @@
 
 #include <gst/gst.h>
 
-#include "gstplugin.h"
+#include "gstburn.h"
+#include "gstchromium.h"
+#include "gstdilate.h"
+#include "gstdodge.h"
+#include "gstexclusion.h"
+#include "gstgaussblur.h"
+#include "gstsolarize.h"
+
 
 /* PACKAGE: this is usually set by autotools depending on some _INIT macro
  * in configure.ac and then written into and defined in config.h, but we can
@@ -63,15 +70,15 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gboolean ret = TRUE;
+  gboolean ret = FALSE;
 
-  ret &= gst_burn_plugin_init (plugin);
-  ret &= gst_chromium_plugin_init (plugin);
-  ret &= gst_dilate_plugin_init (plugin);
-  ret &= gst_dodge_plugin_init (plugin);
-  ret &= gst_exclusion_plugin_init (plugin);
-  ret &= gst_solarize_plugin_init (plugin);
-  ret &= gst_gauss_blur_plugin_init (plugin);
+  ret |= GST_ELEMENT_REGISTER (burn, plugin);
+  ret |= GST_ELEMENT_REGISTER (chromium, plugin);
+  ret |= GST_ELEMENT_REGISTER (dilate, plugin);
+  ret |= GST_ELEMENT_REGISTER (dodge, plugin);
+  ret |= GST_ELEMENT_REGISTER (exclusion, plugin);
+  ret |= GST_ELEMENT_REGISTER (solarize, plugin);
+  ret |= GST_ELEMENT_REGISTER (gaussianblur, plugin);
 
   return ret;
 }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstsolarize.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstsolarize.c
Changed
@@ -64,7 +64,6 @@
 #include <gst/gst.h>
 #include <math.h>
 
-#include "gstplugin.h"
 #include "gstsolarize.h"
 
 GST_DEBUG_CATEGORY_STATIC (gst_solarize_debug);
@@ -117,6 +116,9 @@
 
 #define gst_solarize_parent_class parent_class
 G_DEFINE_TYPE (GstSolarize, gst_solarize, GST_TYPE_VIDEO_FILTER);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (solarize, "solarize", GST_RANK_NONE,
+    GST_TYPE_SOLARIZE, GST_DEBUG_CATEGORY_INIT (gst_solarize_debug, "solarize",
+        0, "Template solarize"));
 
 static void gst_solarize_set_property (GObject * object, guint prop_id,
     const GValue * value, GParamSpec * pspec);
@@ -278,19 +280,6 @@
   return GST_FLOW_OK;
 }
 
-/* Entry point to initialize the plug-in.
- * Register the element factories and other features. */
-gboolean
-gst_solarize_plugin_init (GstPlugin * solarize)
-{
-  /* debug category for fltering log messages */
-  GST_DEBUG_CATEGORY_INIT (gst_solarize_debug, "solarize",
-      0, "Template solarize");
-
-  return gst_element_register (solarize, "solarize", GST_RANK_NONE,
-      GST_TYPE_SOLARIZE);
-}
-
 /*** Now the image processing work.... ***/
 
 /* Transform processes each frame. */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gaudieffects/gstsolarize.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gaudieffects/gstsolarize.h
Changed
@@ -81,6 +81,7 @@
 };
 
 GType gst_solarize_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (solarize);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gdp/gstgdp.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdp.c
Changed
@@ -23,21 +23,17 @@
 
 #include "dataprotocol.h"
 
-#include "gstgdppay.h"
-#include "gstgdpdepay.h"
+#include "gstgdpelements.h"
 
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gst_dp_init ();
+  gboolean ret = FALSE;
 
-  if (!gst_gdp_depay_plugin_init (plugin))
-    return FALSE;
+  ret |= GST_ELEMENT_REGISTER (gdpdepay, plugin);
+  ret |= GST_ELEMENT_REGISTER (gdppay, plugin);
 
-  if (!gst_gdp_pay_plugin_init (plugin))
-    return FALSE;
-
-  return TRUE;
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gdp/gstgdpdepay.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdpdepay.c
Changed
@@ -40,6 +40,7 @@
 
 #include "dataprotocol.h"
 
+#include "gstgdpelements.h"
 #include "gstgdpdepay.h"
 
 enum
@@ -69,6 +70,8 @@
 #define gst_gdp_depay_parent_class parent_class
 G_DEFINE_TYPE_WITH_CODE (GstGDPDepay, gst_gdp_depay, GST_TYPE_ELEMENT,
     _do_init);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (gdpdepay, "gdpdepay", GST_RANK_NONE,
+    GST_TYPE_GDP_DEPAY, gdp_element_init (plugin));
 
 static gboolean gst_gdp_depay_sink_event (GstPad * pad, GstObject * parent,
     GstEvent * event);
@@ -586,13 +589,3 @@
   gst_caps_unref (caps);
   gst_query_unref (query);
 }
-
-gboolean
-gst_gdp_depay_plugin_init (GstPlugin * plugin)
-{
-  if (!gst_element_register (plugin, "gdpdepay", GST_RANK_NONE,
-          GST_TYPE_GDP_DEPAY))
-    return FALSE;
-
-  return TRUE;
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gdp/gstgdpdepay.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdpdepay.h
Changed
@@ -78,8 +78,6 @@
   GstElementClass parent_class;
 };
 
-gboolean gst_gdp_depay_plugin_init (GstPlugin * plugin);
-
 GType gst_gdp_depay_get_type (void);
 
 G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdpelement.c
Added
@@ -0,0 +1,37 @@
+/* GStreamer
+ * Copyright (C) 2006 Thomas Vander Stichele <thomas at apestaart dot org>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+# include <config.h>
+#endif
+
+#include "dataprotocol.h"
+#include "gstgdpelements.h"
+
+
+void
+gdp_element_init (GstPlugin * plugin)
+{
+  static gsize res = FALSE;
+
+  if (g_once_init_enter (&res)) {
+    gst_dp_init ();
+    g_once_init_leave (&res, TRUE);
+  }
+}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdpelements.h
Added
@@ -0,0 +1,35 @@
+/* GStreamer
+ * Copyright (C) <2020> Julian Bouzas <julian.bouzas@collabora.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+
+#ifndef __GST_GDP_ELEMENT_H__
+#define __GST_GDP_ELEMENT_H__
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include <gst/gst.h>
+
+void gdp_element_init (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (gdppay);
+GST_ELEMENT_REGISTER_DECLARE (gdpdepay);
+
+#endif /* __GST_GDP_ELEMENT_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gdp/gstgdppay.c -> gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdppay.c
Changed
@@ -39,6 +39,7 @@
 
 #include "dataprotocol.h"
 
+#include "gstgdpelements.h"
 #include "gstgdppay.h"
 
 static GstStaticPadTemplate gdp_pay_sink_template =
@@ -71,6 +72,8 @@
     "GDP payloader");
 #define gst_gdp_pay_parent_class parent_class
 G_DEFINE_TYPE_WITH_CODE (GstGDPPay, gst_gdp_pay, GST_TYPE_ELEMENT, _do_init);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (gdppay, "gdppay", GST_RANK_NONE,
+    GST_TYPE_GDP_PAY, gdp_element_init (plugin));
 
 static void gst_gdp_pay_reset (GstGDPPay * this);
 
@@ -708,12 +711,3 @@
 
   return ret;
 }
-
-gboolean
-gst_gdp_pay_plugin_init (GstPlugin * plugin)
-{
-  if (!gst_element_register (plugin, "gdppay", GST_RANK_NONE, GST_TYPE_GDP_PAY))
-    return FALSE;
-
-  return TRUE;
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gdp/gstgdppay.h -> gst-plugins-bad-1.20.1.tar.xz/gst/gdp/gstgdppay.h
Changed
@@ -72,8 +72,6 @@
   GstElementClass parent_class;
 };
 
-gboolean gst_gdp_pay_plugin_init (GstPlugin * plugin);
-
 GType gst_gdp_pay_get_type (void);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/gdp/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/gdp/meson.build
Changed
@@ -1,5 +1,6 @@
 gdp_sources = [
   'dataprotocol.c',
+  'gstgdpelement.c',
   'gstgdp.c',
   'gstgdppay.c',
   'gstgdpdepay.c',
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstbulge.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstbulge.c
Changed
@@ -79,6 +79,9 @@
 
 #define gst_bulge_parent_class parent_class
 G_DEFINE_TYPE (GstBulge, gst_bulge, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (bulge, "bulge", GST_RANK_NONE,
+    GST_TYPE_BULGE, GST_DEBUG_CATEGORY_INIT (gst_bulge_debug, "bulge", 0,
+        "bulge"));
 
 static void
 gst_bulge_set_property (GObject * object, guint prop_id, const GValue * value,
@@ -208,11 +211,3 @@
   filter->zoom = DEFAULT_ZOOM;
   gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP;
 }
-
-gboolean
-gst_bulge_plugin_init (GstPlugin * plugin)
-{
-  GST_DEBUG_CATEGORY_INIT (gst_bulge_debug, "bulge", 0, "bulge");
-
-  return gst_element_register (plugin, "bulge", GST_RANK_NONE, GST_TYPE_BULGE);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstbulge.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstbulge.h
Changed
@@ -78,7 +78,7 @@
 
 GType gst_bulge_get_type (void);
 
-gboolean gst_bulge_plugin_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (bulge);
 
 G_END_DECLS
 #endif /* __GST_BULGE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstcircle.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstcircle.c
Changed
@@ -90,6 +90,9 @@
 
 #define gst_circle_parent_class parent_class
 G_DEFINE_TYPE (GstCircle, gst_circle, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (circle, "circle", GST_RANK_NONE,
+    GST_TYPE_CIRCLE, GST_DEBUG_CATEGORY_INIT (gst_circle_debug, "circle", 0,
+        "circle"));
 
 static void
 gst_circle_set_property (GObject * object, guint prop_id, const GValue * value,
@@ -231,12 +234,3 @@
   filter->spread_angle = DEFAULT_SPREAD_ANGLE;
   filter->height = DEFAULT_HEIGHT;
 }
-
-gboolean
-gst_circle_plugin_init (GstPlugin * plugin)
-{
-  GST_DEBUG_CATEGORY_INIT (gst_circle_debug, "circle", 0, "circle");
-
-  return gst_element_register (plugin, "circle", GST_RANK_NONE,
-      GST_TYPE_CIRCLE);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstcircle.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstcircle.h
Changed
@@ -82,7 +82,7 @@
 
 GType gst_circle_get_type (void);
 
-gboolean gst_circle_plugin_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (circle);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstdiffuse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstdiffuse.c
Changed
@@ -84,6 +84,9 @@
 
 #define gst_diffuse_parent_class parent_class
 G_DEFINE_TYPE (GstDiffuse, gst_diffuse, GST_TYPE_GEOMETRIC_TRANSFORM);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (diffuse, "diffuse", GST_RANK_NONE,
+    GST_TYPE_DIFFUSE, GST_DEBUG_CATEGORY_INIT (gst_diffuse_debug, "diffuse", 0,
+        "diffuse"));
 
 static void
 gst_diffuse_set_property (GObject * object, guint prop_id, const GValue * value,
@@ -226,12 +229,3 @@
   gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP;
   filter->scale = DEFAULT_SCALE;
 }
-
-gboolean
-gst_diffuse_plugin_init (GstPlugin * plugin)
-{
-  GST_DEBUG_CATEGORY_INIT (gst_diffuse_debug, "diffuse", 0, "diffuse");
-
-  return gst_element_register (plugin, "diffuse", GST_RANK_NONE,
-      GST_TYPE_DIFFUSE);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstdiffuse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstdiffuse.h
Changed
@@ -83,7 +83,7 @@
 
 GType gst_diffuse_get_type (void);
 
-gboolean gst_diffuse_plugin_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (diffuse);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstfisheye.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstfisheye.c
Changed
@@ -70,6 +70,9 @@
 
 #define gst_fisheye_parent_class parent_class
 G_DEFINE_TYPE (GstFisheye, gst_fisheye, GST_TYPE_GEOMETRIC_TRANSFORM);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (fisheye, "fisheye", GST_RANK_NONE,
+    GST_TYPE_FISHEYE, GST_DEBUG_CATEGORY_INIT (gst_fisheye_debug, "fisheye", 0,
+        "fisheye"));
 
 static gboolean
 fisheye_map (GstGeometricTransform * gt, gint x, gint y, gdouble * in_x,
@@ -146,12 +149,3 @@
 
   gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP;
 }
-
-gboolean
-gst_fisheye_plugin_init (GstPlugin * plugin)
-{
-  GST_DEBUG_CATEGORY_INIT (gst_fisheye_debug, "fisheye", 0, "fisheye");
-
-  return gst_element_register (plugin, "fisheye", GST_RANK_NONE,
-      GST_TYPE_FISHEYE);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstfisheye.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstfisheye.h
Changed
@@ -77,7 +77,7 @@
 
 GType gst_fisheye_get_type (void);
 
-gboolean gst_fisheye_plugin_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (fisheye);
 
 G_END_DECLS
 #endif /* __GST_FISHEYE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstkaleidoscope.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstkaleidoscope.c
Changed
@@ -90,6 +90,10 @@
 
 #define gst_kaleidoscope_parent_class parent_class
 G_DEFINE_TYPE (GstKaleidoscope, gst_kaleidoscope,
     GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (kaleidoscope, "kaleidoscope",
+    GST_RANK_NONE, GST_TYPE_KALEIDOSCOPE,
+    GST_DEBUG_CATEGORY_INIT (gst_kaleidoscope_debug, "kaleidoscope", 0,
+        "kaleidoscope"));
 
 static void
 gst_kaleidoscope_set_property (GObject * object, guint prop_id,
@@ -238,13 +242,3 @@
   filter->angle2 = DEFAULT_ANGLE2;
   filter->sides = DEFAULT_SIDES;
 }
-
-gboolean
-gst_kaleidoscope_plugin_init (GstPlugin * plugin)
-{
-  GST_DEBUG_CATEGORY_INIT (gst_kaleidoscope_debug, "kaleidoscope", 0,
-      "kaleidoscope");
-
-  return gst_element_register (plugin, "kaleidoscope", GST_RANK_NONE,
-      GST_TYPE_KALEIDOSCOPE);
-}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstkaleidoscope.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstkaleidoscope.h
Changed
@@ -82,7 +82,7 @@ GType gst_kaleidoscope_get_type (void); -gboolean gst_kaleidoscope_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (kaleidoscope); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstmarble.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstmarble.c
Changed
@@ -90,7 +90,9 @@ #define gst_marble_parent_class parent_class G_DEFINE_TYPE (GstMarble, gst_marble, GST_TYPE_GEOMETRIC_TRANSFORM); - +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (marble, "marble", GST_RANK_NONE, + GST_TYPE_MARBLE, GST_DEBUG_CATEGORY_INIT (gst_marble_debug, "marble", 0, + "marble")); static void gst_marble_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) @@ -282,12 +284,3 @@ filter->amount = DEFAULT_AMOUNT; filter->turbulence = DEFAULT_TURBULENCE; } - -gboolean -gst_marble_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_marble_debug, "marble", 0, "marble"); - - return gst_element_register (plugin, "marble", GST_RANK_NONE, - GST_TYPE_MARBLE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstmarble.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstmarble.h
Changed
@@ -88,7 +88,7 @@ GType gst_marble_get_type (void); -gboolean gst_marble_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (marble); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstmirror.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstmirror.c
Changed
@@ -78,6 +78,9 @@ #define gst_mirror_parent_class parent_class G_DEFINE_TYPE (GstMirror, gst_mirror, GST_TYPE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mirror, "mirror", GST_RANK_NONE, + GST_TYPE_MIRROR, GST_DEBUG_CATEGORY_INIT (gst_mirror_debug, "mirror", 0, + "mirror")); #define GST_TYPE_MIRROR_MODE (gst_mirror_mode_get_type()) static GType @@ -238,12 +241,3 @@ filter->mode = DEFAULT_PROP_MODE; gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; } - -gboolean -gst_mirror_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_mirror_debug, "mirror", 0, "mirror"); - - return gst_element_register (plugin, "mirror", GST_RANK_NONE, - GST_TYPE_MIRROR); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstmirror.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstmirror.h
Changed
@@ -96,7 +96,7 @@ GType gst_mirror_get_type (void); -gboolean gst_mirror_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (mirror); G_END_DECLS #endif /* __GST_MIRROR_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstperspective.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstperspective.c
Changed
@@ -88,6 +88,10 @@ #define gst_perspective_parent_class parent_class G_DEFINE_TYPE (GstPerspective, gst_perspective, GST_TYPE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (perspective, "perspective", + GST_RANK_NONE, GST_TYPE_PERSPECTIVE, + GST_DEBUG_CATEGORY_INIT (gst_perspective_debug, "perspective", 0, + "perspective")); static GValueArray * get_array_from_matrix (GstPerspective * self) @@ -255,13 +259,3 @@ filter->matrix[7] = 0; filter->matrix[8] = 1; } - -gboolean -gst_perspective_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_perspective_debug, "perspective", 0, - "perspective"); - - return gst_element_register (plugin, "perspective", GST_RANK_NONE, - GST_TYPE_PERSPECTIVE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstperspective.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstperspective.h
Changed
@@ -81,7 +81,7 @@ GType gst_perspective_get_type (void); -gboolean gst_perspective_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (perspective); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstpinch.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstpinch.c
Changed
@@ -83,6 +83,9 @@ #define gst_pinch_parent_class parent_class G_DEFINE_TYPE (GstPinch, gst_pinch, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (pinch, "pinch", GST_RANK_NONE, + GST_TYPE_PINCH, GST_DEBUG_CATEGORY_INIT (gst_pinch_debug, "pinch", 0, + "pinch")); static void gst_pinch_set_property (GObject * object, guint prop_id, const GValue * value, @@ -207,11 +210,3 @@ gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; filter->intensity = DEFAULT_INTENSITY; } - -gboolean -gst_pinch_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_pinch_debug, "pinch", 0, "pinch"); - - return gst_element_register (plugin, "pinch", GST_RANK_NONE, GST_TYPE_PINCH); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstpinch.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstpinch.h
Changed
@@ -80,7 +80,7 @@ GType gst_pinch_get_type (void); -gboolean gst_pinch_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (pinch); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstrotate.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstrotate.c
Changed
@@ -84,6 +84,9 @@ #define gst_rotate_parent_class parent_class G_DEFINE_TYPE (GstRotate, gst_rotate, GST_TYPE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (rotate, "rotate", GST_RANK_NONE, + GST_TYPE_ROTATE, GST_DEBUG_CATEGORY_INIT (gst_rotate_debug, "rotate", 0, + "rotate")); static void gst_rotate_set_property (GObject * object, guint prop_id, const GValue * value, @@ -212,12 +215,3 @@ { filter->angle = DEFAULT_ANGLE; } - -gboolean -gst_rotate_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_rotate_debug, "rotate", 0, "rotate"); - - return gst_element_register (plugin, "rotate", GST_RANK_NONE, - GST_TYPE_ROTATE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstrotate.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstrotate.h
Changed
@@ -80,7 +80,7 @@ GType gst_rotate_get_type (void); -gboolean gst_rotate_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (rotate); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstsphere.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstsphere.c
Changed
@@ -83,6 +83,9 @@ #define gst_sphere_parent_class parent_class G_DEFINE_TYPE (GstSphere, gst_sphere, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (sphere, "sphere", GST_RANK_NONE, + GST_TYPE_SPHERE, GST_DEBUG_CATEGORY_INIT (gst_sphere_debug, "sphere", 0, + "sphere")); static void gst_sphere_set_property (GObject * object, guint prop_id, const GValue * value, @@ -219,12 +222,3 @@ gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; filter->refraction = DEFAULT_REFRACTION; } - -gboolean -gst_sphere_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_sphere_debug, "sphere", 0, "sphere"); - - return gst_element_register (plugin, "sphere", GST_RANK_NONE, - GST_TYPE_SPHERE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstsphere.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstsphere.h
Changed
@@ -80,7 +80,7 @@ GType gst_sphere_get_type (void); -gboolean gst_sphere_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (sphere); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstsquare.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstsquare.c
Changed
@@ -81,6 +81,9 @@ #define gst_square_parent_class parent_class G_DEFINE_TYPE (GstSquare, gst_square, GST_TYPE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (square, "square", GST_RANK_NONE, + GST_TYPE_SQUARE, GST_DEBUG_CATEGORY_INIT (gst_square_debug, "square", 0, + "square")); /* GObject vmethod implementations */ @@ -237,12 +240,3 @@ filter->zoom = DEFAULT_ZOOM; gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; } - -gboolean -gst_square_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_square_debug, "square", 0, "square"); - - return gst_element_register (plugin, "square", GST_RANK_NONE, - GST_TYPE_SQUARE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstsquare.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstsquare.h
Changed
@@ -80,7 +80,7 @@ GType gst_square_get_type (void); -gboolean gst_square_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (square); G_END_DECLS #endif /* __GST_SQUARE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gststretch.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gststretch.c
Changed
@@ -79,6 +79,9 @@ #define gst_stretch_parent_class parent_class G_DEFINE_TYPE (GstStretch, gst_stretch, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (stretch, "stretch", GST_RANK_NONE, + GST_TYPE_STRETCH, GST_DEBUG_CATEGORY_INIT (gst_stretch_debug, "stretch", 0, + "stretch")); static void gst_stretch_set_property (GObject * object, guint prop_id, const GValue * value, @@ -209,12 +212,3 @@ filter->intensity = DEFAULT_INTENSITY; gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; } - -gboolean -gst_stretch_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_stretch_debug, "stretch", 0, "stretch"); - - return gst_element_register (plugin, "stretch", GST_RANK_NONE, - GST_TYPE_STRETCH); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gststretch.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gststretch.h
Changed
@@ -78,7 +78,7 @@ GType gst_stretch_get_type (void); -gboolean gst_stretch_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (stretch); G_END_DECLS #endif /* __GST_STRETCH_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gsttunnel.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gsttunnel.c
Changed
@@ -71,6 +71,9 @@ #define gst_tunnel_parent_class parent_class G_DEFINE_TYPE (GstTunnel, gst_tunnel, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (tunnel, "tunnel", GST_RANK_NONE, + GST_TYPE_TUNNEL, GST_DEBUG_CATEGORY_INIT (gst_tunnel_debug, "tunnel", 0, + "tunnel")); static gboolean tunnel_map (GstGeometricTransform * gt, gint x, gint y, gdouble * in_x, @@ -135,12 +138,3 @@ gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; } - -gboolean -gst_tunnel_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_tunnel_debug, "tunnel", 0, "tunnel"); - - return gst_element_register (plugin, "tunnel", GST_RANK_NONE, - GST_TYPE_TUNNEL); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gsttunnel.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gsttunnel.h
Changed
@@ -76,7 +76,7 @@ GType gst_tunnel_get_type (void); -gboolean gst_tunnel_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (tunnel); G_END_DECLS #endif /* __GST_TUNNEL_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gsttwirl.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gsttwirl.c
Changed
@@ -83,6 +83,9 @@ #define gst_twirl_parent_class parent_class G_DEFINE_TYPE (GstTwirl, gst_twirl, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (twirl, "twirl", GST_RANK_NONE, + GST_TYPE_TWIRL, GST_DEBUG_CATEGORY_INIT (gst_twirl_debug, "twirl", 0, + "twirl")); static void gst_twirl_set_property (GObject * object, guint prop_id, const GValue * value, @@ -198,11 +201,3 @@ gt->off_edge_pixels = GST_GT_OFF_EDGES_PIXELS_CLAMP; filter->angle = DEFAULT_ANGLE; } - -gboolean -gst_twirl_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_twirl_debug, "twirl", 0, "twirl"); - - return gst_element_register (plugin, "twirl", GST_RANK_NONE, GST_TYPE_TWIRL); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gsttwirl.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gsttwirl.h
Changed
@@ -80,7 +80,7 @@ GType gst_twirl_get_type (void); -gboolean gst_twirl_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (twirl); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstwaterripple.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstwaterripple.c
Changed
@@ -88,6 +88,10 @@ #define gst_water_ripple_parent_class parent_class G_DEFINE_TYPE (GstWaterRipple, gst_water_ripple, GST_TYPE_CIRCLE_GEOMETRIC_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (waterripple, "waterripple", + GST_RANK_NONE, GST_TYPE_WATER_RIPPLE, + GST_DEBUG_CATEGORY_INIT (gst_water_ripple_debug, "waterripple", 0, + "waterripple")); static void gst_water_ripple_set_property (GObject * object, guint prop_id, @@ -236,13 +240,3 @@ filter->phase = DEFAULT_PHASE; filter->wavelength = DEFAULT_WAVELENGTH; } - -gboolean -gst_water_ripple_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gst_water_ripple_debug, "waterripple", 0, - "waterripple"); - - return gst_element_register (plugin, "waterripple", GST_RANK_NONE, - GST_TYPE_WATER_RIPPLE); -}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/gstwaterripple.h -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/gstwaterripple.h
Changed
@@ -82,7 +82,7 @@ GType gst_water_ripple_get_type (void); -gboolean gst_water_ripple_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (waterripple); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/geometrictransform/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/geometrictransform/plugin.c
Changed
@@ -41,55 +41,26 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_circle_plugin_init (plugin)) - return FALSE; - - if (!gst_diffuse_plugin_init (plugin)) - return FALSE; - - if (!gst_kaleidoscope_plugin_init (plugin)) - return FALSE; - - if (!gst_marble_plugin_init (plugin)) - return FALSE; - - if (!gst_pinch_plugin_init (plugin)) - return FALSE; - - if (!gst_rotate_plugin_init (plugin)) - return FALSE; - - if (!gst_sphere_plugin_init (plugin)) - return FALSE; - - if (!gst_twirl_plugin_init (plugin)) - return FALSE; - - if (!gst_water_ripple_plugin_init (plugin)) - return FALSE; - - if (!gst_stretch_plugin_init (plugin)) - return FALSE; - - if (!gst_bulge_plugin_init (plugin)) - return FALSE; - - if (!gst_tunnel_plugin_init (plugin)) - return FALSE; - - if (!gst_square_plugin_init (plugin)) - return FALSE; - - if (!gst_mirror_plugin_init (plugin)) - return FALSE; - - if (!gst_fisheye_plugin_init (plugin)) - return FALSE; - - if (!gst_perspective_plugin_init (plugin)) - return FALSE; - - return TRUE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (circle, plugin); + ret |= GST_ELEMENT_REGISTER (diffuse, plugin); + ret |= GST_ELEMENT_REGISTER (kaleidoscope, plugin); + ret |= GST_ELEMENT_REGISTER (marble, plugin); + ret |= GST_ELEMENT_REGISTER (pinch, plugin); + ret |= GST_ELEMENT_REGISTER (rotate, plugin); + ret |= GST_ELEMENT_REGISTER (sphere, plugin); + ret |= GST_ELEMENT_REGISTER (twirl, plugin); + ret |= GST_ELEMENT_REGISTER (waterripple, plugin); + ret |= GST_ELEMENT_REGISTER (stretch, plugin); + ret |= GST_ELEMENT_REGISTER (bulge, plugin); + ret |= GST_ELEMENT_REGISTER (tunnel, plugin); + ret |= GST_ELEMENT_REGISTER (square, plugin); + ret |= GST_ELEMENT_REGISTER (mirror, plugin); + ret |= GST_ELEMENT_REGISTER (fisheye, plugin); + ret |= GST_ELEMENT_REGISTER (perspective, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
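The pattern repeated across the files above is GStreamer 1.20's per-element registration macros, which replace the old per-element `*_plugin_init` functions. A minimal sketch of how one element uses them (the `foo` element name, `GST_TYPE_FOO`, and `gst_foo_debug` are illustrative placeholders, not from this changeset; this fragment needs the GStreamer plugin headers and is not compilable on its own):

```c
/* foo.h -- declare the per-element register function.
 * GST_ELEMENT_REGISTER_DECLARE (foo) expands to a prototype like
 * gboolean gst_element_register_foo (GstPlugin * plugin); */
GST_ELEMENT_REGISTER_DECLARE (foo);

/* foo.c -- define it, running extra code (here: debug-category init)
 * on first registration. This replaces the old pair of
 * gst_foo_plugin_init() + gst_element_register(). */
GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (foo, "foo", GST_RANK_NONE,
    GST_TYPE_FOO, GST_DEBUG_CATEGORY_INIT (gst_foo_debug, "foo", 0, "foo"));

/* plugin.c -- register each element from the plugin's init function;
 * succeed if at least one element registered. */
static gboolean
plugin_init (GstPlugin * plugin)
{
  gboolean ret = FALSE;

  ret |= GST_ELEMENT_REGISTER (foo, plugin);

  return ret;
}
```

Elements without extra init code use the plain `GST_ELEMENT_REGISTER_DEFINE` form instead, as the `gst/inter/` diffs below show.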
View file
gst-plugins-bad-1.18.6.tar.xz/gst/id3tag/gstid3mux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/id3tag/gstid3mux.c
Changed
@@ -81,7 +81,12 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS ("application/x-id3")); -G_DEFINE_TYPE (GstId3Mux, gst_id3_mux, GST_TYPE_TAG_MUX); +G_DEFINE_TYPE_WITH_CODE (GstId3Mux, gst_id3_mux, GST_TYPE_TAG_MUX, + GST_DEBUG_CATEGORY_INIT (gst_id3_mux_debug, "id3mux", 0, + "ID3 v1 and v2 tag muxer"); + ); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (id3mux, "id3mux", GST_RANK_PRIMARY, + GST_TYPE_ID3_MUX, gst_tag_register_musicbrainz_tags ()); static GstBuffer *gst_id3_mux_render_v2_tag (GstTagMux * mux, const GstTagList * taglist); @@ -211,16 +216,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_id3_mux_debug, "id3mux", 0, - "ID3 v1 and v2 tag muxer"); - - if (!gst_element_register (plugin, "id3mux", GST_RANK_PRIMARY, - GST_TYPE_ID3_MUX)) - return FALSE; - - gst_tag_register_musicbrainz_tags (); - - return TRUE; + return GST_ELEMENT_REGISTER (id3mux, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/id3tag/gstid3mux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/id3tag/gstid3mux.h
Changed
@@ -56,7 +56,7 @@ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_ID3_MUX)) GType gst_id3_mux_get_type (void); - +GST_ELEMENT_REGISTER_DECLARE (id3mux); G_END_DECLS #endif /* GST_ID3_MUX_H */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/id3tag/id3tag.c -> gst-plugins-bad-1.20.1.tar.xz/gst/id3tag/id3tag.c
Changed
@@ -708,7 +708,7 @@ val = g_strdup (s); } - /* If we don't have a valid language, match what taglib does for + /* If we don't have a valid language, match what taglib does for unknown languages */ if (!lang || strlen (lang) < 3) lang = g_strdup ("XXX"); @@ -841,9 +841,9 @@ { static const struct { - const gchar gst_tag[28]; - const gchar spec_id[28]; - const gchar realworld_id[28]; + const gchar gst_tag[32]; + const gchar spec_id[32]; + const gchar realworld_id[32]; } mb_ids[] = { { GST_TAG_MUSICBRAINZ_ARTISTID, "MusicBrainz Artist Id", @@ -851,6 +851,8 @@ GST_TAG_MUSICBRAINZ_ALBUMID, "MusicBrainz Album Id", "musicbrainz_albumid"}, { GST_TAG_MUSICBRAINZ_ALBUMARTISTID, "MusicBrainz Album Artist Id", "musicbrainz_albumartistid"}, { + GST_TAG_MUSICBRAINZ_RELEASEGROUPID, "MusicBrainz Release Group Id", + "musicbrainz_releasegroupid"}, { GST_TAG_MUSICBRAINZ_TRMID, "MusicBrainz TRM Id", "musicbrainz_trmid"}, { GST_TAG_CDDA_MUSICBRAINZ_DISCID, "MusicBrainz DiscID", "musicbrainz_discid"}, { @@ -858,7 +860,9 @@ * evidence that any popular application is actually putting this info * into TXXX frames; the first one comes from a musicbrainz wiki 'proposed * tags' page, the second one is analogue to the vorbis/ape/flac tag. */ - GST_TAG_CDDA_CDDB_DISCID, "CDDB DiscID", "discid"} + GST_TAG_CDDA_CDDB_DISCID, "CDDB DiscID", "discid"}, { + GST_TAG_MUSICBRAINZ_RELEASETRACKID, "MusicBrainz Track Id", + "musicbrainz_trackid"} }; guint i, idx; @@ -1170,9 +1174,11 @@ GST_TAG_MUSICBRAINZ_ARTISTID, add_musicbrainz_tag, "\000"}, { GST_TAG_MUSICBRAINZ_ALBUMID, add_musicbrainz_tag, "\001"}, { GST_TAG_MUSICBRAINZ_ALBUMARTISTID, add_musicbrainz_tag, "\002"}, { - GST_TAG_MUSICBRAINZ_TRMID, add_musicbrainz_tag, "\003"}, { - GST_TAG_CDDA_MUSICBRAINZ_DISCID, add_musicbrainz_tag, "\004"}, { - GST_TAG_CDDA_CDDB_DISCID, add_musicbrainz_tag, "\005"}, { + GST_TAG_MUSICBRAINZ_RELEASEGROUPID, add_musicbrainz_tag, "\003"}, { + GST_TAG_MUSICBRAINZ_TRMID, add_musicbrainz_tag, "\004"}, { + GST_TAG_CDDA_MUSICBRAINZ_DISCID, add_musicbrainz_tag, "\005"}, { + GST_TAG_CDDA_CDDB_DISCID, add_musicbrainz_tag, "\006"}, { + GST_TAG_MUSICBRAINZ_RELEASETRACKID, add_musicbrainz_tag, "\007"}, { GST_TAG_MUSICBRAINZ_TRACKID, add_unique_file_id_tag, NULL}, { /* Info about encoder */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstinter.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstinter.c
Changed
@@ -32,20 +32,16 @@ static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "interaudiosrc", GST_RANK_NONE, - GST_TYPE_INTER_AUDIO_SRC); - gst_element_register (plugin, "interaudiosink", GST_RANK_NONE, - GST_TYPE_INTER_AUDIO_SINK); - gst_element_register (plugin, "intersubsrc", GST_RANK_NONE, - GST_TYPE_INTER_SUB_SRC); - gst_element_register (plugin, "intersubsink", GST_RANK_NONE, - GST_TYPE_INTER_SUB_SINK); - gst_element_register (plugin, "intervideosrc", GST_RANK_NONE, - GST_TYPE_INTER_VIDEO_SRC); - gst_element_register (plugin, "intervideosink", GST_RANK_NONE, - GST_TYPE_INTER_VIDEO_SINK); + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (interaudiosrc, plugin); + ret |= GST_ELEMENT_REGISTER (interaudiosink, plugin); + ret |= GST_ELEMENT_REGISTER (intersubsrc, plugin); + ret |= GST_ELEMENT_REGISTER (intersubsink, plugin); + ret |= GST_ELEMENT_REGISTER (intervideosrc, plugin); + ret |= GST_ELEMENT_REGISTER (intervideosink, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstinteraudiosink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstinteraudiosink.c
Changed
@@ -89,6 +89,8 @@ /* class initialization */ #define parent_class gst_inter_audio_sink_parent_class G_DEFINE_TYPE (GstInterAudioSink, gst_inter_audio_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE (interaudiosink, "interaudiosink", + GST_RANK_NONE, GST_TYPE_INTER_AUDIO_SINK); static void gst_inter_audio_sink_class_init (GstInterAudioSinkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstinteraudiosink.h -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstinteraudiosink.h
Changed
@@ -51,6 +51,7 @@ }; GType gst_inter_audio_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (interaudiosink); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstinteraudiosrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstinteraudiosrc.c
Changed
@@ -95,6 +95,8 @@ /* class initialization */ #define parent_class gst_inter_audio_src_parent_class G_DEFINE_TYPE (GstInterAudioSrc, gst_inter_audio_src, GST_TYPE_BASE_SRC); +GST_ELEMENT_REGISTER_DEFINE (interaudiosrc, "interaudiosrc", + GST_RANK_NONE, GST_TYPE_INTER_AUDIO_SRC); static void gst_inter_audio_src_class_init (GstInterAudioSrcClass * klass) @@ -402,7 +404,7 @@ period_samples - n); mem = gst_allocator_alloc (NULL, (period_samples - n) * bpf, NULL); if (gst_memory_map (mem, &map, GST_MAP_WRITE)) { - gst_audio_format_fill_silence (interaudiosrc->info.finfo, map.data, + gst_audio_format_info_fill_silence (interaudiosrc->info.finfo, map.data, map.size); gst_memory_unmap (mem, &map); }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstinteraudiosrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstinteraudiosrc.h
Changed
@@ -54,7 +54,7 @@ }; GType gst_inter_audio_src_get_type (void); - +GST_ELEMENT_REGISTER_DECLARE (interaudiosrc); G_END_DECLS #endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintersubsink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintersubsink.c
Changed
@@ -80,6 +80,8 @@ /* class initialization */ #define parent_class gst_inter_sub_sink_parent_class G_DEFINE_TYPE (GstInterSubSink, gst_inter_sub_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE (intersubsink, "intersubsink", GST_RANK_NONE, + GST_TYPE_INTER_SUB_SINK); static void gst_inter_sub_sink_class_init (GstInterSubSinkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintersubsink.h -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintersubsink.h
Changed
@@ -52,6 +52,7 @@ }; GType gst_inter_sub_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (intersubsink); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintersubsrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintersubsrc.c
Changed
@@ -82,6 +82,8 @@ /* class initialization */ #define parent_class gst_inter_sub_src_parent_class G_DEFINE_TYPE (GstInterSubSrc, gst_inter_sub_src, GST_TYPE_BASE_SRC); +GST_ELEMENT_REGISTER_DEFINE (intersubsrc, "intersubsrc", GST_RANK_NONE, + GST_TYPE_INTER_SUB_SRC); static void gst_inter_sub_src_class_init (GstInterSubSrcClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintersubsrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintersubsrc.h
Changed
@@ -51,6 +51,7 @@ }; GType gst_inter_sub_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (intersubsrc); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintervideosink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintervideosink.c
Changed
@@ -82,6 +82,8 @@ /* class initialization */ G_DEFINE_TYPE (GstInterVideoSink, gst_inter_video_sink, GST_TYPE_VIDEO_SINK); +GST_ELEMENT_REGISTER_DEFINE (intervideosink, "intervideosink", + GST_RANK_NONE, GST_TYPE_INTER_VIDEO_SINK); static void gst_inter_video_sink_class_init (GstInterVideoSinkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintervideosink.h -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintervideosink.h
Changed
@@ -50,6 +50,7 @@ }; GType gst_inter_video_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (intervideosink); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintervideosrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintervideosrc.c
Changed
@@ -89,6 +89,8 @@ /* class initialization */ #define parent_class gst_inter_video_src_parent_class G_DEFINE_TYPE (GstInterVideoSrc, gst_inter_video_src, GST_TYPE_BASE_SRC); +GST_ELEMENT_REGISTER_DEFINE (intervideosrc, "intervideosrc", + GST_RANK_NONE, GST_TYPE_INTER_VIDEO_SRC); static void gst_inter_video_src_class_init (GstInterVideoSrcClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/inter/gstintervideosrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/inter/gstintervideosrc.h
Changed
@@ -56,6 +56,7 @@ }; GType gst_inter_video_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (intervideosrc); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/interlace/gstinterlace.c -> gst-plugins-bad-1.20.1.tar.xz/gst/interlace/gstinterlace.c
Changed
@@ -98,7 +98,6 @@ int src_fps_n; int src_fps_d; - GMutex lock; gint new_pattern; GstBuffer *stored_frame; guint stored_fields; @@ -172,7 +171,32 @@ return interlace_pattern_type; } -#define VIDEO_FORMATS "{AYUV,YUY2,UYVY,I420,YV12,Y42B,Y444,NV12,NV21}" +/* We can support all planar and packed YUV formats, but not tiled formats. + * We don't advertise RGB formats because interlaced video is usually YUV. */ +#define VIDEO_FORMATS \ + "{" \ + "AYUV64, " /* 16-bit 4:4:4:4 */ \ + "Y412_BE, Y412_LE, " /* 12-bit 4:4:4:4 */ \ + "A444_10BE,A444_10LE, " /* 10-bit 4:4:4:4 */ \ + "AYUV, VUYA, " /* 8-bit 4:4:4:4 */ \ + "A422_10BE, A422_10LE, " /* 10-bit 4:4:2:2 */ \ + "A420_10BE, A420_10LE, " /* 10-bit 4:4:2:0 */ \ + "A420, " /* 8-bit 4:4:2:0 */ \ + "Y444_16BE, Y444_16LE, " /* 16-bit 4:4:4 */ \ + "Y444_12BE, Y444_12LE, " /* 12-bit 4:4:4 */ \ + "Y410, Y444_10BE, Y444_10LE, " /* 10-bit 4:4:4 */ \ + "v308, IYU2, Y444, NV24, " /* 8-bit 4:4:4 */ \ + "v216, I422_12BE, I422_12LE, " /* 16-bit 4:2:2 */ \ + "Y212_BE, Y212_LE, " /* 12-bit 4:2:2 */ \ + "UYVP, Y210, NV16_10LE32, v210, I422_10BE, I422_10LE, " /* 10-bit 4:2:2 */ \ + "YUY2, UYVY, VYUY, YVYU, Y42B, NV16, NV61, " /* 8-bit 4:2:2 */ \ + "P016_BE, P016_LE, " /* 16-bit 4:2:0 */ \ + "I420_12BE, I420_12LE, P012_BE, P012_LE, " /* 12-bit 4:2:0 */ \ + "NV12_10LE40, NV12_10LE32, I420_10BE, I420_10LE, P010_10BE, P010_10LE, " /* 10-bit 4:2:0 */ \ + "I420, YV12, NV12, NV21, " /* 8-bit 4:2:0 */ \ + "IYU1, Y41B, " /* 8-bit 4:1:1 */ \ + "YUV9, YVU9, " /* 8-bit 4:1:0 */ \ + "}" static GstStaticPadTemplate gst_interlace_src_template = GST_STATIC_PAD_TEMPLATE ("src", @@ -222,9 +246,13 @@ static GstCaps *gst_interlace_caps_double_framerate (GstCaps * caps, gboolean half, gboolean skip_progressive); +GST_ELEMENT_REGISTER_DECLARE (interlace); + #define gst_interlace_parent_class parent_class G_DEFINE_TYPE (GstInterlace, gst_interlace, GST_TYPE_ELEMENT); - +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (interlace, "interlace", GST_RANK_NONE, + GST_TYPE_INTERLACE, GST_DEBUG_CATEGORY_INIT (gst_interlace_debug, + "interlace", 0, "interlace element")); static void gst_interlace_class_init (GstInterlaceClass * klass) { @@ -276,15 +304,16 @@ static void gst_interlace_finalize (GObject * obj) { - GstInterlace *interlace = GST_INTERLACE (obj); - g_mutex_clear (&interlace->lock); G_OBJECT_CLASS (parent_class)->finalize (obj); } static void gst_interlace_reset (GstInterlace * interlace) { + GST_OBJECT_LOCK (interlace); interlace->phase_index = interlace->pattern_offset; + GST_OBJECT_UNLOCK (interlace); + interlace->timebase = GST_CLOCK_TIME_NONE; interlace->field_index = 0; interlace->passthrough = FALSE; @@ -292,6 +321,7 @@ if (interlace->stored_frame) { gst_buffer_unref (interlace->stored_frame); interlace->stored_frame = NULL; + interlace->stored_fields = 0; } } @@ -317,7 +347,7 @@ interlace->new_pattern = GST_INTERLACE_PATTERN_2_3; interlace->pattern_offset = 0; interlace->src_fps_n = 0; - g_mutex_init (&interlace->lock); + interlace->src_fps_d = 1; gst_interlace_reset (interlace); } @@ -360,10 +390,12 @@ int n_fields) { gint src_fps_n, src_fps_d; - g_mutex_lock (&interlace->lock); + + GST_OBJECT_LOCK (interlace); src_fps_n = interlace->src_fps_n; src_fps_d = interlace->src_fps_d; - g_mutex_unlock (&interlace->lock); + GST_OBJECT_UNLOCK (interlace); + /* field duration = src_fps_d / (2 * src_fps_n) */ if (src_fps_n == 0) { /* If we don't know the fps, we can't generate timestamps/durations */ @@ -384,6 +416,12 @@ gst_interlace_decorate_buffer (GstInterlace * interlace, GstBuffer * buf, int n_fields, gboolean interlaced) { + GstInterlacePattern pattern; + + GST_OBJECT_LOCK (interlace); + pattern = interlace->pattern; + GST_OBJECT_UNLOCK (interlace); + gst_interlace_decorate_buffer_ts (interlace, buf, n_fields); if (interlace->field_index == 0) { @@ -395,21 +433,20 @@ if (n_fields == 1) { GST_BUFFER_FLAG_SET (buf, GST_VIDEO_BUFFER_FLAG_ONEFIELD); } - g_mutex_lock (&interlace->lock); - if (interlace->pattern > GST_INTERLACE_PATTERN_2_2 && n_fields == 2 - && interlaced) { + if (pattern > GST_INTERLACE_PATTERN_2_2 && n_fields == 2 && interlaced) { GST_BUFFER_FLAG_SET (buf, GST_VIDEO_BUFFER_FLAG_INTERLACED); } - g_mutex_unlock (&interlace->lock); } static const gchar * interlace_mode_from_pattern (GstInterlace * interlace) { GstInterlacePattern pattern; - g_mutex_lock (&interlace->lock); + + GST_OBJECT_LOCK (interlace); pattern = interlace->pattern; - g_mutex_unlock (&interlace->lock); + GST_OBJECT_UNLOCK (interlace); + if (pattern > GST_INTERLACE_PATTERN_2_2) return "mixed"; else @@ -439,7 +476,7 @@ GstVideoInfo info, out_info; GstCaps *othercaps, *src_peer_caps; const PulldownFormat *pdformat; - gboolean alternate; + gboolean top_field_first, alternate; int i; int src_fps_n, src_fps_d; GstInterlacePattern pattern; @@ -447,10 +484,11 @@ if (!gst_video_info_from_caps (&info, caps)) goto caps_error; - g_mutex_lock (&interlace->lock); + GST_OBJECT_LOCK (interlace); interlace->pattern = interlace->new_pattern; pattern = interlace->pattern; - g_mutex_unlock (&interlace->lock); + top_field_first = interlace->top_field_first; + GST_OBJECT_UNLOCK (interlace); /* Check if downstream prefers alternate mode */ othercaps = gst_caps_copy (caps); @@ -497,14 +535,15 @@ pdformat = &formats[pattern]; - interlace->phase_index = interlace->pattern_offset; - src_fps_n = info.fps_n * pdformat->ratio_n; src_fps_d = info.fps_d * pdformat->ratio_d; - g_mutex_lock (&interlace->lock); + + GST_OBJECT_LOCK (interlace); + interlace->phase_index = interlace->pattern_offset; interlace->src_fps_n = src_fps_n; interlace->src_fps_d = src_fps_d; - g_mutex_unlock (&interlace->lock); + GST_OBJECT_UNLOCK (interlace); + GST_DEBUG_OBJECT (interlace, "new framerate %d/%d", src_fps_n, src_fps_d); if (alternate) { @@ -561,8 +600,7 @@ src_fps_d, NULL); if (pattern <= GST_INTERLACE_PATTERN_2_2 || alternate) { gst_caps_set_simple (othercaps, "field-order", G_TYPE_STRING, - interlace->top_field_first ? "top-field-first" : "bottom-field-first", - NULL); + top_field_first ? "top-field-first" : "bottom-field-first", NULL); } /* outcaps changed, regenerate out_info */ gst_video_info_from_caps (&out_info, othercaps); @@ -656,6 +694,7 @@ if (interlace->stored_frame) { gst_buffer_unref (interlace->stored_frame); interlace->stored_frame = NULL; + interlace->stored_fields = 0; } ret = gst_pad_push_event (interlace->srcpad, event); break; @@ -842,13 +881,15 @@ const char *mode; guint i; gint pattern; + gboolean top_field_first; otherpad = (pad == interlace->srcpad) ? interlace->sinkpad : interlace->srcpad; - g_mutex_lock (&interlace->lock); + GST_OBJECT_LOCK (interlace); pattern = interlace->new_pattern; - g_mutex_unlock (&interlace->lock); + top_field_first = interlace->top_field_first; + GST_OBJECT_UNLOCK (interlace); if (filter != NULL) { clean_filter = gst_caps_copy (filter); @@ -897,8 +938,7 @@ if (pad == interlace->srcpad) { gst_structure_set (s, "field-order", G_TYPE_STRING, - interlace->top_field_first ? "top-field-first" : - "bottom-field-first", NULL); + top_field_first ? "top-field-first" : "bottom-field-first", NULL); } else { gst_structure_remove_field (s, "field-order"); } @@ -1183,10 +1223,10 @@ GstInterlace *interlace = GST_INTERLACE (parent); GstFlowReturn ret = GST_FLOW_OK; gint num_fields = 0; - guint current_fields; + guint current_fields, pattern_offset; const PulldownFormat *format; GstClockTime timestamp; - gboolean alternate; + gboolean allow_rff, top_field_first, alternate; timestamp = GST_BUFFER_TIMESTAMP (buffer); @@ -1204,16 +1244,23 @@ return gst_pad_push (interlace->srcpad, buffer); } + GST_OBJECT_LOCK (interlace); + format = &formats[interlace->pattern]; + allow_rff = interlace->allow_rff; + pattern_offset = interlace->pattern_offset; + top_field_first = interlace->top_field_first; + GST_OBJECT_UNLOCK (interlace); + if (GST_BUFFER_FLAGS (buffer) & GST_BUFFER_FLAG_DISCONT) { GST_DEBUG ("discont"); if (interlace->stored_frame) { gst_buffer_unref (interlace->stored_frame); + interlace->stored_frame = NULL; + interlace->stored_fields = 0; } - interlace->stored_frame = NULL; - interlace->stored_fields = 0; - if (interlace->top_field_first) { + if (top_field_first) { interlace->field_index = 0; } else { interlace->field_index = 1; @@ -1225,12 +1272,8 @@ interlace->timebase = timestamp; } - g_mutex_lock (&interlace->lock); - format = &formats[interlace->pattern]; - g_mutex_unlock (&interlace->lock); - if (interlace->stored_fields == 0 - && interlace->phase_index == interlace->pattern_offset + && interlace->phase_index == pattern_offset && GST_CLOCK_TIME_IS_VALID (timestamp)) { interlace->timebase = timestamp; interlace->fields_since_timebase = 0; @@ -1338,7 +1381,7 @@ gst_video_frame_unmap (&sframe); } - if (num_fields >= 3 && interlace->allow_rff) { + if (num_fields >= 3 && allow_rff) { GST_DEBUG ("3 fields from current"); /* take both fields from incoming buffer */ current_fields -= 3; @@ -1427,26 +1470,35 @@ switch (prop_id) { case PROP_TOP_FIELD_FIRST: + GST_OBJECT_LOCK (interlace); interlace->top_field_first = g_value_get_boolean (value); + GST_OBJECT_UNLOCK (interlace); break; case PROP_PATTERN:{ gint pattern = g_value_get_enum (value); - g_mutex_lock (&interlace->lock); + gboolean reconfigure = FALSE; + + GST_OBJECT_LOCK (interlace); interlace->new_pattern = pattern; - if (pattern == interlace->pattern || interlace->src_fps_n == 0) { + if (interlace->src_fps_n == 0 || interlace->pattern == pattern) interlace->pattern = pattern; - g_mutex_unlock (&interlace->lock); - } else { - g_mutex_unlock (&interlace->lock); - gst_pad_push_event (interlace->srcpad, gst_event_new_reconfigure ()); - } + else + reconfigure = TRUE; + GST_OBJECT_UNLOCK (interlace); + + if (reconfigure) + gst_pad_push_event (interlace->sinkpad, gst_event_new_reconfigure ()); break; } case PROP_PATTERN_OFFSET: + GST_OBJECT_LOCK (interlace); interlace->pattern_offset = g_value_get_uint (value); + GST_OBJECT_UNLOCK (interlace); break; case PROP_ALLOW_RFF: + GST_OBJECT_LOCK (interlace); interlace->allow_rff = g_value_get_boolean (value); + GST_OBJECT_UNLOCK (interlace); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); @@ -1462,18 +1514,24 @@ switch (prop_id) { case PROP_TOP_FIELD_FIRST: + GST_OBJECT_LOCK (interlace); g_value_set_boolean (value, interlace->top_field_first); + GST_OBJECT_UNLOCK (interlace); break; case PROP_PATTERN: - g_mutex_lock (&interlace->lock); + GST_OBJECT_LOCK (interlace); g_value_set_enum (value, interlace->new_pattern); - g_mutex_unlock (&interlace->lock); + GST_OBJECT_UNLOCK (interlace); break; case PROP_PATTERN_OFFSET: + GST_OBJECT_LOCK (interlace); g_value_set_uint (value, interlace->pattern_offset); + GST_OBJECT_UNLOCK (interlace); break; case PROP_ALLOW_RFF: + GST_OBJECT_LOCK (interlace); g_value_set_boolean (value, interlace->allow_rff); + GST_OBJECT_UNLOCK (interlace); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); @@ -1491,9 +1549,10 @@ switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_READY: - g_mutex_lock 
(&interlace->lock); + GST_OBJECT_LOCK (interlace); interlace->src_fps_n = 0; - g_mutex_unlock (&interlace->lock); + interlace->src_fps_d = 1; + GST_OBJECT_UNLOCK (interlace); gst_interlace_reset (interlace); break; @@ -1507,11 +1566,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_interlace_debug, "interlace", 0, - "interlace element"); - - return gst_element_register (plugin, "interlace", GST_RANK_NONE, - GST_TYPE_INTERLACE); + return GST_ELEMENT_REGISTER (interlace, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/ivfparse/gstivfparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/ivfparse/gstivfparse.c
@@ -72,6 +72,8 @@
 #define gst_ivf_parse_parent_class parent_class
 G_DEFINE_TYPE (GstIvfParse, gst_ivf_parse, GST_TYPE_BASE_PARSE);
+GST_ELEMENT_REGISTER_DEFINE (ivfparse, "ivfparse", GST_RANK_PRIMARY,
+    GST_TYPE_IVF_PARSE);
 
 static void gst_ivf_parse_finalize (GObject * object);
 static gboolean gst_ivf_parse_start (GstBaseParse * parse);
@@ -223,8 +225,11 @@
   media_type = fourcc_to_media_type (ivf->fourcc);
 
   /* Create src pad caps */
-  caps = gst_caps_new_simple (media_type, "width", G_TYPE_INT, ivf->width,
-      "height", G_TYPE_INT, ivf->height, NULL);
+  caps = gst_caps_new_empty_simple (media_type);
+  if (ivf->width > 0 && ivf->height > 0) {
+    gst_caps_set_simple (caps, "width", G_TYPE_INT, ivf->width,
+        "height", G_TYPE_INT, ivf->height, NULL);
+  }
 
   if (ivf->fps_n > 0 && ivf->fps_d > 0) {
     gst_base_parse_set_frame_rate (GST_BASE_PARSE_CAST (ivf),
@@ -397,14 +402,9 @@
 /* entry point to initialize the plug-in */
 static gboolean
-ivfparse_init (GstPlugin * ivfparse)
+ivfparse_init (GstPlugin * plugin)
 {
-  /* register parser element */
-  if (!gst_element_register (ivfparse, "ivfparse", GST_RANK_PRIMARY,
-          GST_TYPE_IVF_PARSE))
-    return FALSE;
-
-  return TRUE;
+  return GST_ELEMENT_REGISTER (ivfparse, plugin);
 }
 
 /* gstreamer looks for this structure to register plugins */
gst-plugins-bad-1.18.6.tar.xz/gst/ivfparse/gstivfparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/ivfparse/gstivfparse.h
@@ -66,6 +66,7 @@
 };
 
 GType gst_ivf_parse_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (ivfparse);
 
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/ivtc/gstcombdetect.c -> gst-plugins-bad-1.20.1.tar.xz/gst/ivtc/gstcombdetect.c
@@ -93,6 +93,8 @@
 G_DEFINE_TYPE_WITH_CODE (GstCombDetect, gst_comb_detect, GST_TYPE_VIDEO_FILTER,
     GST_DEBUG_CATEGORY_INIT (gst_comb_detect_debug_category, "combdetect", 0,
         "debug category for combdetect element"));
+GST_ELEMENT_REGISTER_DEFINE (combdetect, "combdetect", GST_RANK_NONE,
+    GST_TYPE_COMB_DETECT);
 
 static void
 gst_comb_detect_class_init (GstCombDetectClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/ivtc/gstcombdetect.h -> gst-plugins-bad-1.20.1.tar.xz/gst/ivtc/gstcombdetect.h
@@ -48,6 +48,8 @@
 GType gst_comb_detect_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (combdetect);
+
 G_END_DECLS
 
 #endif
gst-plugins-bad-1.18.6.tar.xz/gst/ivtc/gstivtc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/ivtc/gstivtc.c
@@ -110,6 +110,7 @@
 G_DEFINE_TYPE_WITH_CODE (GstIvtc, gst_ivtc, GST_TYPE_BASE_TRANSFORM,
     GST_DEBUG_CATEGORY_INIT (gst_ivtc_debug_category, "ivtc", 0,
         "debug category for ivtc element"));
+GST_ELEMENT_REGISTER_DEFINE (ivtc, "ivtc", GST_RANK_NONE, GST_TYPE_IVTC);
 
 static void
 gst_ivtc_class_init (GstIvtcClass * klass)
@@ -681,9 +682,8 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gst_element_register (plugin, "ivtc", GST_RANK_NONE, GST_TYPE_IVTC);
-  gst_element_register (plugin, "combdetect", GST_RANK_NONE,
-      GST_TYPE_COMB_DETECT);
+  GST_ELEMENT_REGISTER (ivtc, plugin);
+  GST_ELEMENT_REGISTER (combdetect, plugin);
 
   return TRUE;
 }
gst-plugins-bad-1.18.6.tar.xz/gst/ivtc/gstivtc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/ivtc/gstivtc.h
@@ -25,6 +25,8 @@
 
 G_BEGIN_DECLS
 
+GST_ELEMENT_REGISTER_DECLARE (ivtc);
+
 #define GST_TYPE_IVTC (gst_ivtc_get_type())
 #define GST_IVTC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_IVTC,GstIvtc))
 #define GST_IVTC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_IVTC,GstIvtcClass))
gst-plugins-bad-1.18.6.tar.xz/gst/jp2kdecimator/gstjp2kdecimator.c -> gst-plugins-bad-1.20.1.tar.xz/gst/jp2kdecimator/gstjp2kdecimator.c
@@ -79,7 +79,10 @@
 #define GST_CAT_DEFAULT gst_jp2k_decimator_debug
 
 G_DEFINE_TYPE (GstJP2kDecimator, gst_jp2k_decimator, GST_TYPE_ELEMENT);
-
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (jp2kdecimator, "jp2kdecimator",
+    GST_RANK_NONE, GST_TYPE_JP2K_DECIMATOR,
+    GST_DEBUG_CATEGORY_INIT (gst_jp2k_decimator_debug, "jp2kdecimator", 0,
+        "JPEG2000 decimator"));
 static void
 gst_jp2k_decimator_class_init (GstJP2kDecimatorClass * klass)
 {
@@ -258,13 +261,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (gst_jp2k_decimator_debug, "jp2kdecimator", 0,
-      "JPEG2000 decimator");
-
-  gst_element_register (plugin, "jp2kdecimator", GST_RANK_NONE,
-      GST_TYPE_JP2K_DECIMATOR);
-
-  return TRUE;
+  return GST_ELEMENT_REGISTER (jp2kdecimator, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/jp2kdecimator/gstjp2kdecimator.h -> gst-plugins-bad-1.20.1.tar.xz/gst/jp2kdecimator/gstjp2kdecimator.h
@@ -51,6 +51,7 @@
 };
 
 GType gst_jp2k_decimator_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (jp2kdecimator);
 
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/jpegformat/gstjifmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/jpegformat/gstjifmux.c
@@ -110,6 +110,8 @@
 G_DEFINE_TYPE_WITH_CODE (GstJifMux, gst_jif_mux, GST_TYPE_ELEMENT,
     G_IMPLEMENT_INTERFACE (GST_TYPE_TAG_SETTER, NULL);
     G_IMPLEMENT_INTERFACE (GST_TYPE_TAG_XMP_WRITER, NULL));
+GST_ELEMENT_REGISTER_DEFINE (jifmux, "jifmux", GST_RANK_SECONDARY,
+    GST_TYPE_JIF_MUX);
 
 static void
 gst_jif_mux_class_init (GstJifMuxClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/jpegformat/gstjifmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/jpegformat/gstjifmux.h
@@ -61,6 +61,8 @@
 GType gst_jif_mux_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (jifmux);
+
 G_END_DECLS
 
 #endif /* __GST_JFIF_MUX_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/jpegformat/gstjpegformat.c -> gst-plugins-bad-1.20.1.tar.xz/gst/jpegformat/gstjpegformat.c
@@ -30,14 +30,12 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  if (!gst_element_register (plugin, "jpegparse", GST_RANK_NONE,
-          GST_TYPE_JPEG_PARSE))
-    return FALSE;
-  if (!gst_element_register (plugin, "jifmux", GST_RANK_SECONDARY,
-          GST_TYPE_JIF_MUX))
-    return FALSE;
+  gboolean ret = FALSE;
 
-  return TRUE;
+  ret |= GST_ELEMENT_REGISTER (jpegparse, plugin);
+  ret |= GST_ELEMENT_REGISTER (jifmux, plugin);
+
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/jpegformat/gstjpegparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/jpegformat/gstjpegparse.c
@@ -89,6 +89,8 @@
 #define gst_jpeg_parse_parent_class parent_class
 G_DEFINE_TYPE (GstJpegParse, gst_jpeg_parse, GST_TYPE_BASE_PARSE);
+GST_ELEMENT_REGISTER_DEFINE (jpegparse, "jpegparse", GST_RANK_NONE,
+    GST_TYPE_JPEG_PARSE);
 
 static void
 gst_jpeg_parse_class_init (GstJpegParseClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/jpegformat/gstjpegparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/jpegformat/gstjpegparse.h
@@ -87,6 +87,8 @@
 GType gst_jpeg_parse_get_type (void);
 
+GST_ELEMENT_REGISTER_DECLARE (jpegparse);
+
 G_END_DECLS
 
 #endif /* __GST_JPEG_PARSE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/librfb/gstrfbsrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/librfb/gstrfbsrc.c
@@ -82,6 +82,7 @@
 #define gst_rfb_src_parent_class parent_class
 G_DEFINE_TYPE (GstRfbSrc, gst_rfb_src, GST_TYPE_PUSH_SRC);
+GST_ELEMENT_REGISTER_DEFINE (rfbsrc, "rfbsrc", GST_RANK_NONE, GST_TYPE_RFB_SRC);
 
 static void
 gst_rfb_src_class_init (GstRfbSrcClass * klass)
@@ -632,8 +633,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  return gst_element_register (plugin, "rfbsrc", GST_RANK_NONE,
-      GST_TYPE_RFB_SRC);
+  return GST_ELEMENT_REGISTER (rfbsrc, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/librfb/gstrfbsrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/librfb/gstrfbsrc.h
@@ -68,6 +68,7 @@
 };
 
 GType gst_rfb_src_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (rfbsrc);
 
 G_END_DECLS
 
 #endif
gst-plugins-bad-1.18.6.tar.xz/gst/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/meson.build
@@ -1,7 +1,7 @@
 foreach plugin : ['accurip', 'adpcmdec', 'adpcmenc', 'aiff', 'asfmux',
                   'audiobuffersplit', 'audiofxbad', 'audiomixmatrix',
                   'audiolatency', 'audiovisualizers', 'autoconvert', 'bayer',
-                  'camerabin2', 'coloreffects', 'debugutils', 'dvbsubenc',
+                  'camerabin2', 'codecalpha', 'coloreffects', 'debugutils', 'dvbsubenc',
                   'dvbsuboverlay', 'dvdspu', 'faceoverlay', 'festival',
                   'fieldanalysis', 'freeverb', 'frei0r', 'gaudieffects', 'gdp',
                   'geometrictransform', 'id3tag', 'inter', 'interlace',
gst-plugins-bad-1.18.6.tar.xz/gst/midi/midi.c -> gst-plugins-bad-1.20.1.tar.xz/gst/midi/midi.c
@@ -28,16 +28,9 @@
 
 #include "midiparse.h"
 
-GST_DEBUG_CATEGORY_STATIC (midi_debug);
-#define GST_CAT_DEFAULT (midi_debug)
-
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gboolean ret;
-
-  GST_DEBUG_CATEGORY_INIT (midi_debug, "midi", 0, "MIDI plugin");
-
 #ifdef ENABLE_NLS
   GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE,
       LOCALEDIR);
@@ -45,10 +38,7 @@
   bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8");
 #endif
 
-  ret = gst_element_register (plugin, "midiparse", GST_RANK_PRIMARY,
-      GST_TYPE_MIDI_PARSE);
-
-  return ret;
+  return GST_ELEMENT_REGISTER (midiparse, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/midi/midiparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/midi/midiparse.c
@@ -118,6 +118,8 @@
 #define parent_class gst_midi_parse_parent_class
 G_DEFINE_TYPE (GstMidiParse, gst_midi_parse, GST_TYPE_ELEMENT);
+GST_ELEMENT_REGISTER_DEFINE (midiparse, "midiparse", GST_RANK_PRIMARY,
+    GST_TYPE_MIDI_PARSE);
 
 /* initialize the plugin's class */
 static void
gst-plugins-bad-1.18.6.tar.xz/gst/midi/midiparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/midi/midiparse.h
@@ -87,6 +87,7 @@
 };
 
 GType gst_midi_parse_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (midiparse);
 
 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/mpegdemux/gstmpegdemux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegdemux/gstmpegdemux.c
@@ -81,6 +81,8 @@ GST_DEBUG_CATEGORY_STATIC (gstflupsdemux_debug); #define GST_CAT_DEFAULT (gstflupsdemux_debug) +GST_DEBUG_CATEGORY_EXTERN (mpegpspesfilter_debug); + /* MPEG2Demux signals and args */ enum { @@ -218,6 +220,11 @@ return ps_demux_type; } +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mpegpsdemux, "mpegpsdemux", + GST_RANK_PRIMARY, GST_TYPE_PS_DEMUX, + GST_DEBUG_CATEGORY_INIT (mpegpspesfilter_debug, "mpegpspesfilter", 0, + "MPEG-PS PES filter")); + static void gst_ps_demux_base_init (GstPsDemuxClass * klass) { @@ -393,7 +400,6 @@ demux->next_pts = G_MAXUINT64; demux->next_dts = G_MAXUINT64; demux->need_no_more_pads = TRUE; - demux->adjust_segment = TRUE; gst_ps_demux_reset_psm (demux); gst_segment_init (&demux->sink_segment, GST_FORMAT_UNDEFINED); gst_segment_init (&demux->src_segment, GST_FORMAT_TIME); @@ -646,6 +652,7 @@ /* discont */ if (G_UNLIKELY (stream->need_segment)) { GstSegment segment; + GstEvent *segment_event; GST_DEBUG ("PTS timestamp:%" GST_TIME_FORMAT " base_time %" GST_TIME_FORMAT " src_segment.start:%" GST_TIME_FORMAT " .stop:%" GST_TIME_FORMAT, @@ -653,20 +660,6 @@ GST_TIME_ARGS (demux->src_segment.start), GST_TIME_ARGS (demux->src_segment.stop)); - /* adjust segment start if estimating a seek was off quite a bit, - * make sure to do for all streams though to preserve a/v sync */ - /* FIXME such adjustment tends to be frowned upon */ - if (pts != GST_CLOCK_TIME_NONE && demux->adjust_segment) { - if (demux->src_segment.rate > 0) { - if (GST_CLOCK_DIFF (demux->src_segment.start, pts) > GST_SECOND) - demux->src_segment.start = pts - demux->base_time; - } else { - if (GST_CLOCK_DIFF (demux->src_segment.stop, pts) > GST_SECOND) - demux->src_segment.stop = pts - demux->base_time; - } - } - demux->adjust_segment = FALSE; - /* we should be in sync with downstream, so start from our segment notion, * which also includes proper base_time etc, tweak it a bit and send */ gst_segment_copy_into (&demux->src_segment, &segment); @@ -678,10 +671,15 @@ 
segment.time = segment.start - demux->base_time; } + segment_event = gst_event_new_segment (&segment); + if (demux->segment_seqnum) + gst_event_set_seqnum (segment_event, demux->segment_seqnum); + else + demux->segment_seqnum = gst_event_get_seqnum (segment_event); GST_INFO_OBJECT (demux, "sending segment event %" GST_SEGMENT_FORMAT " to pad %" GST_PTR_FORMAT, &segment, stream->pad); - gst_pad_push_event (stream->pad, gst_event_new_segment (&segment)); + gst_pad_push_event (stream->pad, segment_event); stream->need_segment = FALSE; } @@ -784,7 +782,8 @@ if (G_LIKELY (stream)) { stream->discont |= discont; stream->need_segment |= need_segment; - demux->adjust_segment |= need_segment; + if (need_segment) + demux->segment_seqnum = 0; GST_DEBUG_OBJECT (demux, "marked stream as discont %d, need_segment %d", stream->discont, stream->need_segment); } @@ -1084,8 +1083,6 @@ /* we expect our timeline (SCR, PTS) to match the one from upstream, * if not, will adjust with offset later on */ gst_segment_copy_into (segment, &demux->src_segment); - /* accept upstream segment without adjusting */ - demux->adjust_segment = FALSE; } gst_event_unref (event); @@ -1289,9 +1286,10 @@ GstSeekType start_type, stop_type; gint64 start, stop; gdouble rate; - gboolean update, flush; + gboolean update, flush, accurate; GstSegment seeksegment; GstClockTime first_pts = MPEGTIME_TO_GSTTIME (demux->first_pts); + guint32 seek_seqnum = gst_event_get_seqnum (event); gst_event_parse_seek (event, &rate, &format, &flags, &start_type, &start, &stop_type, &stop); @@ -1307,13 +1305,17 @@ goto no_scr_rate; flush = flags & GST_SEEK_FLAG_FLUSH; + accurate = flags & GST_SEEK_FLAG_ACCURATE; + /* keyframe = flags & GST_SEEK_FLAG_KEY_UNIT; *//* FIXME */ if (flush) { + GstEvent *event = gst_event_new_flush_start (); + gst_event_set_seqnum (event, seek_seqnum); /* Flush start up and downstream to make sure data flow and loops are idle */ demux->flushing = TRUE; - gst_ps_demux_send_event (demux, 
gst_event_new_flush_start ()); + gst_ps_demux_send_event (demux, event); gst_pad_push_event (demux->sinkpad, gst_event_new_flush_start ()); } else { /* Pause the pulling task */ @@ -1351,11 +1353,11 @@ } /* check the limits */ - if (seeksegment.rate > 0.0 && first_pts != G_MAXUINT64) { - if (seeksegment.start < first_pts - demux->base_time) { - seeksegment.start = first_pts - demux->base_time; - seeksegment.position = seeksegment.start; - } + if (seeksegment.rate > 0.0 && first_pts != G_MAXUINT64 + && seeksegment.start < first_pts - demux->base_time) { + seeksegment.position = first_pts - demux->base_time; + if (!accurate) + seeksegment.start = seeksegment.position; } /* update the rate in our src segment */ @@ -1365,8 +1367,10 @@ &seeksegment); if (flush) { + GstEvent *event = gst_event_new_flush_stop (TRUE); /* Stop flushing, the sinks are at time 0 now */ - gst_ps_demux_send_event (demux, gst_event_new_flush_stop (TRUE)); + gst_event_set_seqnum (event, seek_seqnum); + gst_ps_demux_send_event (demux, event); } if (flush || seeksegment.position != demux->src_segment.position) { @@ -1386,6 +1390,9 @@ /* Tell all the stream a new segment is needed */ gst_ps_demux_mark_discont (demux, TRUE, TRUE); + /* Update the segment_seqnum with the seek event seqnum */ + demux->segment_seqnum = seek_seqnum; + gst_pad_start_task (demux->sinkpad, (GstTaskFunction) gst_ps_demux_loop, demux->sinkpad, NULL); @@ -2980,9 +2987,11 @@ offset += size; gst_segment_set_position (&demux->sink_segment, GST_FORMAT_BYTES, offset); /* check EOS condition */ + /* FIXME: The src_segment.stop is not including the SCR after seek(set) */ if ((demux->sink_segment.position >= demux->sink_segment.stop) || (demux->src_segment.stop != (guint64) - 1 && - demux->src_segment.position >= demux->src_segment.stop)) { + demux->src_segment.position >= + demux->src_segment.stop + demux->base_time)) { GST_DEBUG_OBJECT (demux, "forward mode using segment reached end of " "segment pos %" GST_TIME_FORMAT " stop %" 
GST_TIME_FORMAT " pos in bytes %" @@ -3052,10 +3061,14 @@ demux->src_segment.start)); } } else { + GstEvent *event; /* normal playback, send EOS to all linked pads */ gst_element_no_more_pads (GST_ELEMENT (demux)); GST_LOG_OBJECT (demux, "Sending EOS, at end of stream"); - if (!gst_ps_demux_send_event (demux, gst_event_new_eos ()) + event = gst_event_new_eos (); + if (demux->segment_seqnum) + gst_event_set_seqnum (event, demux->segment_seqnum); + if (!gst_ps_demux_send_event (demux, event) && !have_open_streams (demux)) { GST_WARNING_OBJECT (demux, "EOS and no streams open"); GST_ELEMENT_ERROR (demux, STREAM, FAILED, @@ -3063,8 +3076,12 @@ } } } else if (ret == GST_FLOW_NOT_LINKED || ret < GST_FLOW_EOS) { + GstEvent *event; GST_ELEMENT_FLOW_ERROR (demux, ret); - gst_ps_demux_send_event (demux, gst_event_new_eos ()); + event = gst_event_new_eos (); + if (demux->segment_seqnum) + gst_event_set_seqnum (event, demux->segment_seqnum); + gst_ps_demux_send_event (demux, event); } gst_object_unref (demux);
gst-plugins-bad-1.18.6.tar.xz/gst/mpegdemux/gstmpegdemux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegdemux/gstmpegdemux.h
@@ -141,7 +141,7 @@
   GstSegment sink_segment;
   GstSegment src_segment;
-  gboolean adjust_segment;
+  guint32 segment_seqnum;
 
   /* stream output */
   GstPsStream *current_stream;
@@ -173,6 +173,7 @@
 };
 
 GType gst_ps_demux_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (mpegpsdemux);
 
 G_END_DECLS
 #endif /* __GST_PS_DEMUX_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/mpegdemux/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegdemux/plugin.c
@@ -47,19 +47,10 @@
 
 #include "gstmpegdemux.h"
 
-GST_DEBUG_CATEGORY_EXTERN (mpegpspesfilter_debug);
-
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  GST_DEBUG_CATEGORY_INIT (mpegpspesfilter_debug, "mpegpspesfilter", 0,
-      "MPEG-PS PES filter");
-
-  if (!gst_element_register (plugin, "mpegpsdemux", GST_RANK_PRIMARY,
-          GST_TYPE_PS_DEMUX))
-    return FALSE;
-
-  return TRUE;
+  return GST_ELEMENT_REGISTER (mpegpsdemux, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/mpegpsmux/mpegpsmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegpsmux/mpegpsmux.c
@@ -108,7 +108,9 @@
 #define parent_class mpegpsmux_parent_class
 G_DEFINE_TYPE (MpegPsMux, mpegpsmux, GST_TYPE_ELEMENT);
-
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mpegpsmux, "mpegpsmux", GST_RANK_PRIMARY,
+    mpegpsmux_get_type (), GST_DEBUG_CATEGORY_INIT (mpegpsmux_debug,
+        "mpegpsmux", 0, "MPEG Program Stream muxer"));
 static void
 mpegpsmux_class_init (MpegPsMuxClass * klass)
 {
@@ -671,7 +673,7 @@
 
   GST_LOG_OBJECT (mux, "Outputting a packet of length %d", len);
 
-  data = g_memdup (data, len);
+  data = g_memdup2 (data, len);
   buf = gst_buffer_new_wrapped (data, len);
 
   GST_BUFFER_TIMESTAMP (buf) = mux->last_ts;
@@ -774,14 +776,7 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  if (!gst_element_register (plugin, "mpegpsmux", GST_RANK_PRIMARY,
-          mpegpsmux_get_type ()))
-    return FALSE;
-
-  GST_DEBUG_CATEGORY_INIT (mpegpsmux_debug, "mpegpsmux", 0,
-      "MPEG Program Stream muxer");
-
-  return TRUE;
+  return GST_ELEMENT_REGISTER (mpegpsmux, plugin);
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
gst-plugins-bad-1.18.6.tar.xz/gst/mpegpsmux/mpegpsmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegpsmux/mpegpsmux.h
@@ -113,6 +113,7 @@
 };
 
 GType mpegpsmux_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (mpegpsmux);
 
 #define CLOCK_BASE 9LL
 #define CLOCK_FREQ (CLOCK_BASE * 10000)
gst-plugins-bad-1.18.6.tar.xz/gst/mpegpsmux/mpegpsmux_aac.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegpsmux/mpegpsmux_aac.c
@@ -6,10 +6,9 @@
  *
  * Copyright 2008 Lin YANG <oxcsnicho@gmail.com>
  *
- * This library is licensed under 4 different licenses and you
+ * This library is licensed under 3 different licenses and you
  * can choose to use it under the terms of any one of them. The
- * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT
- * license.
+ * three licenses are the MPL 1.1, the LGPL and the MIT license.
  *
  * MPL:
  *
@@ -40,22 +39,6 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  *
- * GPL:
- *
- * This program is free software; you can redistribute it and/or modify
- * it under the terms of the GNU General Public License as published by
- * the Free Software Foundation; either version 2 of the License, or
- * (at your option) any later version.
- *
- * This program is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- * GNU General Public License for more details.
- *
- * You should have received a copy of the GNU General Public License
- * along with this program; if not, write to the Free Software
- * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
- *
  * MIT:
  *
  * Unless otherwise indicated, Source Code is licensed under MIT license.
@@ -80,6 +63,7 @@
  * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
  * SOFTWARE.
  *
+ * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later
  */
 
 #ifdef HAVE_CONFIG_H
gst-plugins-bad-1.18.6.tar.xz/gst/mpegpsmux/mpegpsmux_aac.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegpsmux/mpegpsmux_aac.h
@@ -6,10 +6,9 @@
  *
  * Copyright 2008 Lin YANG <oxcsnicho@gmail.com>
  *
- * This library is licensed under 4 different licenses and you
+ * This library is licensed under three different licenses and you
  * can choose to use it under the terms of any one of them. The
- * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT
- * license.
+ * three licenses are the MPL 1.1, the LGPL and the MIT license.
  *
  * MPL:
  *
@@ -40,22 +39,6 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  *
- * GPL:
- *
- * This program is free software; you can redistribute it and/or modify
- * it under the terms of the GNU General Public License as published by
- * the Free Software Foundation; either version 2 of the License, or
- * (at your option) any later version.
- *
- * This program is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- * GNU General Public License for more details.
- *
- * You should have received a copy of the GNU General Public License
- * along with this program; if not, write to the Free Software
- * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
- *
  * MIT:
  *
  * Unless otherwise indicated, Source Code is licensed under MIT license.
@@ -80,6 +63,7 @@
  * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
  * SOFTWARE.
  *
+ * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later
  */
 
 #ifndef __MPEGPSMUX_AAC_H__
gst-plugins-bad-1.18.6.tar.xz/gst/mpegpsmux/mpegpsmux_h264.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegpsmux/mpegpsmux_h264.c
@@ -6,10 +6,9 @@
  *
  * Copyright 2008 Lin YANG <oxcsnicho@gmail.com>
  *
- * This library is licensed under 4 different licenses and you
+ * This library is licensed under 3 different licenses and you
  * can choose to use it under the terms of any one of them. The
- * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT
- * license.
+ * three licenses are the MPL 1.1, the LGPL and the MIT license.
  *
  * MPL:
  *
@@ -40,22 +39,6 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  *
- * GPL:
- *
- * This program is free software; you can redistribute it and/or modify
- * it under the terms of the GNU General Public License as published by
- * the Free Software Foundation; either version 2 of the License, or
- * (at your option) any later version.
- *
- * This program is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- * GNU General Public License for more details.
- *
- * You should have received a copy of the GNU General Public License
- * along with this program; if not, write to the Free Software
- * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
- *
  * MIT:
  *
  * Unless otherwise indicated, Source Code is licensed under MIT license.
@@ -80,6 +63,7 @@
  * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
  * SOFTWARE.
  *
+ * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later
  */
 
 #ifdef HAVE_CONFIG_H
gst-plugins-bad-1.18.6.tar.xz/gst/mpegpsmux/mpegpsmux_h264.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegpsmux/mpegpsmux_h264.h
@@ -6,10 +6,9 @@
  *
  * Copyright 2008 Lin YANG <oxcsnicho@gmail.com>
  *
- * This library is licensed under 4 different licenses and you
+ * This library is licensed under 3 different licenses and you
  * can choose to use it under the terms of any one of them. The
- * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT
- * license.
+ * three licenses are the MPL 1.1, the LGPL and the MIT license.
  *
  * MPL:
  *
@@ -40,22 +39,6 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  *
- * GPL:
- *
- * This program is free software; you can redistribute it and/or modify
- * it under the terms of the GNU General Public License as published by
- * the Free Software Foundation; either version 2 of the License, or
- * (at your option) any later version.
- *
- * This program is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- * GNU General Public License for more details.
- *
- * You should have received a copy of the GNU General Public License
- * along with this program; if not, write to the Free Software
- * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
- *
  * MIT:
  *
  * Unless otherwise indicated, Source Code is licensed under MIT license.
@@ -80,6 +63,7 @@
  * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
  * SOFTWARE.
  *
+ * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later
  */
 
 #ifndef __MPEGPSMUX_H264_H__
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/gsttsdemux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/gsttsdemux.c
@@ -31,14 +31,12 @@
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gst_mpegts_initialize ();
-  if (!gst_mpegtsbase_plugin_init (plugin))
-    return FALSE;
-  if (!gst_mpegtsparse_plugin_init (plugin))
-    return FALSE;
-  if (!gst_ts_demux_plugin_init (plugin))
-    return FALSE;
-  return TRUE;
+  gboolean ret = FALSE;
+
+  ret |= GST_ELEMENT_REGISTER (tsparse, plugin);
+  ret |= GST_ELEMENT_REGISTER (tsdemux, plugin);
+
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/mpegtsbase.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/mpegtsbase.c
@@ -96,7 +96,7 @@ GstMpegtsSection * section); static gboolean mpegts_base_parse_atsc_mgt (MpegTSBase * base, GstMpegtsSection * section); -static gboolean remove_each_program (gpointer key, MpegTSBaseProgram * program, +static void remove_each_program (MpegTSBaseProgram * program, MpegTSBase * base); static void @@ -108,6 +108,9 @@ QUARK_PCR_PID = g_quark_from_string ("pcr-pid"); QUARK_STREAMS = g_quark_from_string ("streams"); QUARK_STREAM_TYPE = g_quark_from_string ("stream-type"); + GST_DEBUG_CATEGORY_INIT (mpegts_base_debug, "mpegtsbase", 0, + "MPEG transport stream base class"); + gst_mpegts_initialize (); } #define mpegts_base_parent_class parent_class @@ -159,6 +162,7 @@ G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); klass->sink_query = GST_DEBUG_FUNCPTR (mpegts_base_default_sink_query); + klass->handle_psi = NULL; gst_type_mark_as_plugin_api (GST_TYPE_MPEGTS_BASE, 0); } @@ -237,14 +241,16 @@ base->seen_pat = FALSE; base->seek_offset = -1; - g_hash_table_foreach_remove (base->programs, (GHRFunc) remove_each_program, - base); + g_ptr_array_foreach (base->programs, (GFunc) remove_each_program, base); + g_ptr_array_remove_range (base->programs, 0, base->programs->len); base->streams_aware = GST_OBJECT_PARENT (base) && GST_OBJECT_FLAG_IS_SET (GST_OBJECT_PARENT (base), GST_BIN_FLAG_STREAMS_AWARE); GST_DEBUG_OBJECT (base, "Streams aware : %d", base->streams_aware); + gst_event_replace (&base->seek_event, NULL); + if (klass->reset) klass->reset (base); } @@ -263,8 +269,8 @@ base->disposed = FALSE; base->packetizer = mpegts_packetizer_new (); - base->programs = g_hash_table_new_full (g_direct_hash, g_direct_equal, - NULL, (GDestroyNotify) mpegts_base_free_program); + base->programs = + g_ptr_array_new_full (16, (GDestroyNotify) mpegts_base_free_program); base->parse_private_sections = FALSE; base->is_pes = g_new0 (guint8, 1024); @@ -304,7 +310,9 @@ g_ptr_array_unref (base->pat); base->pat = NULL; } - g_hash_table_destroy (base->programs); + g_ptr_array_free 
(base->programs, TRUE); + + gst_event_replace (&base->seek_event, NULL); if (G_OBJECT_CLASS (parent_class)->finalize) G_OBJECT_CLASS (parent_class)->finalize (object); @@ -325,6 +333,20 @@ return gst_mpegts_find_descriptor (pmt->descriptors, tag); } +const GstMpegtsDescriptor * +mpegts_get_descriptor_from_stream_with_extension (MpegTSBaseStream * stream, + guint8 tag, guint8 tag_extension) +{ + GstMpegtsPMTStream *pmt = stream->stream; + + GST_DEBUG ("Searching for tag 0x%02x tag_extension 0x%02x " + "in stream 0x%04x (stream_type 0x%02x)", + tag, tag_extension, stream->pid, stream->stream_type); + + return gst_mpegts_find_descriptor_with_extension (pmt->descriptors, tag, + tag_extension); +} + typedef struct { gboolean res; @@ -332,8 +354,7 @@ } PIDLookup; static void -foreach_pid_in_program (gpointer key, MpegTSBaseProgram * program, - PIDLookup * lookup) +foreach_pid_in_program (MpegTSBaseProgram * program, PIDLookup * lookup) { if (!program->active) return; @@ -348,8 +369,7 @@ lookup.res = FALSE; lookup.pid = pid; - g_hash_table_foreach (base->programs, (GHFunc) foreach_pid_in_program, - &lookup); + g_ptr_array_foreach (base->programs, (GFunc) foreach_pid_in_program, &lookup); return lookup.res; } @@ -412,7 +432,7 @@ MpegTSBaseProgram *program; gchar *upstream_id, *stream_id; - GST_DEBUG_OBJECT (base, "program_number : %d, pmt_pid : %d", + GST_DEBUG_OBJECT (base, "program_number : %d, pmt_pid : 0x%04x", program_number, pmt_pid); program = g_malloc0 (base->program_size); @@ -437,7 +457,7 @@ { MpegTSBaseProgram *program; - GST_DEBUG_OBJECT (base, "program_number : %d, pmt_pid : %d", + GST_DEBUG_OBJECT (base, "program_number : %d, pmt_pid : 0x%04x", program_number, pmt_pid); program = mpegts_base_new_program (base, program_number, pmt_pid); @@ -449,8 +469,14 @@ } MPEGTS_BIT_SET (base->known_psi, pmt_pid); - g_hash_table_insert (base->programs, - GINT_TO_POINTER (program_number), program); + /* Ensure the PMT PID was not used by some PES stream */ + if (G_UNLIKELY 
(MPEGTS_BIT_IS_SET (base->is_pes, pmt_pid))) { + GST_DEBUG ("New program PMT PID was previously used by a PES stream"); + MPEGTS_BIT_UNSET (base->is_pes, pmt_pid); + } + + + g_ptr_array_add (base->programs, program); return program; } @@ -458,27 +484,35 @@ MpegTSBaseProgram * mpegts_base_get_program (MpegTSBase * base, gint program_number) { - MpegTSBaseProgram *program; - - program = (MpegTSBaseProgram *) g_hash_table_lookup (base->programs, - GINT_TO_POINTER ((gint) program_number)); + guint i; - return program; + for (i = 0; i < base->programs->len; i++) { + MpegTSBaseProgram *program = g_ptr_array_index (base->programs, i); + if (program->program_number == program_number) + return program; + } + return NULL; } static MpegTSBaseProgram * mpegts_base_steal_program (MpegTSBase * base, gint program_number) { - MpegTSBaseProgram *program; - - program = (MpegTSBaseProgram *) g_hash_table_lookup (base->programs, - GINT_TO_POINTER ((gint) program_number)); + guint i; - if (program) - g_hash_table_steal (base->programs, - GINT_TO_POINTER ((gint) program_number)); + for (i = 0; i < base->programs->len; i++) { + MpegTSBaseProgram *program = g_ptr_array_index (base->programs, i); + if (program->program_number == program_number) { +#if GLIB_CHECK_VERSION(2, 58, 0) + return g_ptr_array_steal_index (base->programs, i); +#else + program->recycle = TRUE; + g_ptr_array_remove_index (base->programs, i); + return program; +#endif + } + } - return program; + return NULL; } static void @@ -496,6 +530,11 @@ { GList *tmp; + if (program->recycle) { + program->recycle = FALSE; + return; + } + if (program->pmt) { gst_mpegts_section_unref (program->section); program->pmt = NULL; @@ -529,11 +568,11 @@ } static void -mpegts_base_remove_program (MpegTSBase * base, gint program_number) +mpegts_base_remove_program (MpegTSBase * base, MpegTSBaseProgram * program) { - GST_DEBUG_OBJECT (base, "program_number : %d", program_number); + GST_DEBUG_OBJECT (base, "program_number : %d", 
program->program_number); - g_hash_table_remove (base->programs, GINT_TO_POINTER (program_number)); + g_ptr_array_remove (base->programs, program); } static guint32 @@ -553,6 +592,28 @@ return 0; } +static gboolean +find_registration_in_descriptors (GPtrArray * descriptors, + guint32 registration_id) +{ + + guint i, nb_desc; + + if (!descriptors) + return FALSE; + + nb_desc = descriptors->len; + for (i = 0; i < nb_desc; i++) { + GstMpegtsDescriptor *desc = g_ptr_array_index (descriptors, i); + if (desc->tag == GST_MTS_DESC_REGISTRATION) { + guint32 reg_desc = GST_READ_UINT32_BE (desc->data + 2); + if (reg_desc == registration_id) + return TRUE; + } + } + return FALSE; +} + static MpegTSBaseStream * mpegts_base_program_add_stream (MpegTSBase * base, MpegTSBaseProgram * program, guint16 pid, guint8 stream_type, @@ -593,9 +654,11 @@ program->stream_list = g_list_append (program->stream_list, bstream); if (klass->stream_added) - if (klass->stream_added (base, bstream, program)) + if (klass->stream_added (base, bstream, program)) { gst_stream_collection_add_stream (program->collection, (GstStream *) gst_object_ref (bstream->stream_object)); + bstream->in_collection = TRUE; + } return bstream; @@ -678,7 +741,7 @@ /* Copy over gststream that still exist into the collection */ for (tmp = program->stream_list; tmp; tmp = tmp->next) { MpegTSBaseStream *stream = (MpegTSBaseStream *) tmp->data; - if (_stream_in_pmt (pmt, stream)) { + if (_stream_in_pmt (pmt, stream) && stream->in_collection) { gst_stream_collection_add_stream (program->collection, gst_object_ref (stream->stream_object)); } @@ -739,10 +802,8 @@ return TRUE; case GST_MPEGTS_STREAM_TYPE_SCTE_SIT: { - guint32 registration_id = - get_registration_from_descriptors (pmt->descriptors); /* Not a private section stream */ - if (registration_id != DRF_ID_CUEI) + if (!find_registration_in_descriptors (pmt->descriptors, DRF_ID_CUEI)) return FALSE; return TRUE; } @@ -1010,10 +1071,21 @@ for (i = 0; i < pat->len; ++i) { 
GstMpegtsPatProgram *patp = g_ptr_array_index (pat, i); + GST_LOG ("Looking for program %d / 0x%04x", patp->program_number, + patp->network_or_program_map_PID); program = mpegts_base_get_program (base, patp->program_number); if (program) { - /* IF the program already existed, just check if the PMT PID changed */ - if (program->pmt_pid != patp->network_or_program_map_PID) { + GST_LOG ("Program exists on pid 0x%04x", program->pmt_pid); + /* If the new PMT PID clashes with an existing known PES stream, we know + * it is not an update */ + if (MPEGTS_BIT_IS_SET (base->is_pes, patp->network_or_program_map_PID)) { + GST_LOG ("Program is not an update"); + program = + mpegts_base_add_program (base, patp->program_number, + patp->network_or_program_map_PID); + } else if (program->pmt_pid != patp->network_or_program_map_PID) { + /* IF the program already existed, just check if the PMT PID changed */ + GST_LOG ("PMT is on a different PID"); if (program->pmt_pid != G_MAXUINT16) { /* pmt pid changed */ /* FIXME: when this happens it may still be pmt pid of another @@ -1028,6 +1100,8 @@ ("Refcounting issue. 
Setting twice a PMT PID (0x%04x) as know PSI", program->pmt_pid); MPEGTS_BIT_SET (base->known_psi, patp->network_or_program_map_PID); + } else { + GST_LOG ("Regular program update"); } } else { /* Create a new program */ @@ -1054,21 +1128,25 @@ continue; } - if (--program->patcount > 0) + GST_LOG ("Deactivating program %d / 0x%04x", patp->program_number, + patp->network_or_program_map_PID); + + if (--program->patcount > 0) { + GST_LOG ("Referenced by new program, keeping"); /* the program has been referenced by the new pat, keep it */ continue; + } GST_INFO_OBJECT (base, "PAT removing program 0x%04x 0x%04x", patp->program_number, patp->network_or_program_map_PID); if (klass->can_remove_program (base, program)) { mpegts_base_deactivate_program (base, program); - mpegts_base_remove_program (base, patp->program_number); + mpegts_base_remove_program (base, program); } else { /* sub-class now owns the program and must call * mpegts_base_deactivate_and_free_program later */ - g_hash_table_steal (base->programs, - GINT_TO_POINTER ((gint) patp->program_number)); + mpegts_base_steal_program (base, patp->program_number); } /* FIXME: when this happens it may still be pmt pid of another * program, so setting to False may make it go through expensive @@ -1153,12 +1231,10 @@ } else { /* sub-class now owns the program and must call * mpegts_base_deactivate_and_free_program later */ - g_hash_table_steal (base->programs, - GINT_TO_POINTER ((gint) old_program->program_number)); + mpegts_base_steal_program (base, old_program->program_number); } /* Add new program to the programs we track */ - g_hash_table_insert (base->programs, - GINT_TO_POINTER (program_number), program); + g_ptr_array_add (base->programs, program); initial_program = FALSE; } else { GST_DEBUG ("Program update, re-using same program"); @@ -1219,6 +1295,10 @@ break; } + /* Give the subclass a chance to look at the section */ + if (GST_MPEGTS_BASE_GET_CLASS (base)->handle_psi) + GST_MPEGTS_BASE_GET_CLASS 
(base)->handle_psi (base, section); + /* Finally post message (if it wasn't corrupted) */ if (post_message) gst_element_post_message (GST_ELEMENT_CAST (base), @@ -1314,14 +1394,11 @@ return TRUE; } -static gboolean -remove_each_program (gpointer key, MpegTSBaseProgram * program, - MpegTSBase * base) +static void +remove_each_program (MpegTSBaseProgram * program, MpegTSBase * base) { /* First deactivate it */ mpegts_base_deactivate_program (base, program); - - return TRUE; } static inline GstFlowReturn
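The mpegtsbase.c hunks above migrate program tracking from a GHashTable keyed by program number to a GPtrArray, falling back to a `recycle` flag to emulate `g_ptr_array_steal_index()` on GLib older than 2.58. As a rough illustration of the steal-index idea (removing an element and handing ownership to the caller without running its destroy notify), here is a minimal plain-C sketch; the `PtrArray` type and function names are hypothetical stand-ins, not GLib API:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical minimal pointer array; stands in for GPtrArray. */
typedef struct {
    void **items;
    size_t len;
} PtrArray;

/* Remove items[idx] and hand ownership to the caller WITHOUT
 * running any element destructor -- the behaviour the diff gets
 * from g_ptr_array_steal_index() on GLib >= 2.58 and emulates
 * with a "recycle" flag on older GLib. */
static void *ptr_array_steal_index(PtrArray *arr, size_t idx)
{
    void *stolen = arr->items[idx];

    /* Close the gap left by the stolen element. */
    memmove(&arr->items[idx], &arr->items[idx + 1],
            (arr->len - idx - 1) * sizeof(void *));
    arr->len--;
    return stolen;
}
```

Lookup by program number becomes a linear scan instead of a hash lookup; for the handful of programs in a typical transport stream that is a reasonable trade for simpler ordered iteration.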
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/mpegtsbase.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/mpegtsbase.h
Changed
@@ -65,6 +65,7 @@ GstMpegtsPMTStream *stream; GstStream *stream_object; + gboolean in_collection; gchar *stream_id; }; @@ -94,6 +95,9 @@ gboolean active; /* TRUE if this is the first program created */ gboolean initial_program; + + /* TRUE if the program shouldn't be freed */ + gboolean recycle; }; typedef enum { @@ -122,7 +126,7 @@ /* the following vars must be protected with the OBJECT_LOCK as they can be * accessed from the application thread and the streaming thread */ - GHashTable *programs; + GPtrArray *programs; GPtrArray *pat; MpegTSPacketizer2 *packetizer; @@ -169,6 +173,9 @@ /* Do not use the PCR stream for timestamp calculation. Useful for * streams with broken/invalid PCR streams. */ gboolean ignore_pcr; + + /* Used for delayed seek events */ + GstEvent *seek_event; }; struct _MpegTSBaseClass { @@ -180,6 +187,7 @@ void (*inspect_packet) (MpegTSBase *base, MpegTSPacketizerPacket *packet); /* takes ownership of @event */ gboolean (*push_event) (MpegTSBase *base, GstEvent * event); + void (*handle_psi) (MpegTSBase *base, GstMpegtsSection * section); /* program_started gets called when program's pmt arrives for first time */ void (*program_started) (MpegTSBase *base, MpegTSBaseProgram *program); @@ -234,6 +242,8 @@ G_GNUC_INTERNAL MpegTSBaseProgram *mpegts_base_add_program (MpegTSBase * base, gint program_number, guint16 pmt_pid); G_GNUC_INTERNAL const GstMpegtsDescriptor *mpegts_get_descriptor_from_stream (MpegTSBaseStream * stream, guint8 tag); +G_GNUC_INTERNAL const GstMpegtsDescriptor *mpegts_get_descriptor_from_stream_with_extension (MpegTSBaseStream * stream, + guint8 tag, guint8 tag_extension); G_GNUC_INTERNAL const GstMpegtsDescriptor *mpegts_get_descriptor_from_program (MpegTSBaseProgram * program, guint8 tag); G_GNUC_INTERNAL gboolean
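The new `handle_psi` entry added to `MpegTSBaseClass` above is an optional vtable hook: the base class initializes it to NULL and only dispatches to it if a subclass filled it in. A minimal sketch of that dispatch pattern, with simplified stand-in types rather than the real GObject class structures:

```c
#include <assert.h>
#include <stddef.h>

typedef struct Base Base;

/* Simplified stand-in for MpegTSBaseClass. */
typedef struct {
    /* Called for each PSI section, if (and only if) set by a subclass. */
    void (*handle_psi)(Base *base, int section_id);
} BaseClass;

struct Base {
    BaseClass *klass;
    int last_section_seen;
};

/* The base class checks for NULL before calling, which is what
 * makes the hook optional for subclasses. */
static void dispatch_psi(Base *base, int section_id)
{
    if (base->klass->handle_psi)
        base->klass->handle_psi(base, section_id);
}

/* A sample subclass hook: records the last section id it saw. */
static void sample_handle_psi(Base *base, int section_id)
{
    base->last_section_seen = section_id;
}
```

In the diff, tsdemux is the subclass that installs the hook (to inspect SCTE-35 sections); tsparse leaves it NULL.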
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/mpegtspacketizer.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/mpegtspacketizer.c
Changed
@@ -21,6 +21,9 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif #include <string.h> #include <stdlib.h> @@ -282,6 +285,7 @@ packetizer->pcr_discont_threshold = GST_SECOND; packetizer->last_pts = GST_CLOCK_TIME_NONE; packetizer->last_dts = GST_CLOCK_TIME_NONE; + packetizer->extra_shift = 0; } static void @@ -1123,7 +1127,7 @@ /* Only do fast-path if we have enough byte */ if (data + section_length <= packet->data_end) { if ((section = - gst_mpegts_section_new (packet->pid, g_memdup (data, + gst_mpegts_section_new (packet->pid, g_memdup2 (data, section_length), section_length))) { GST_DEBUG ("PID 0x%04x Short section complete !", packet->pid); section->offset = packet->offset; @@ -2274,23 +2278,24 @@ GST_TIME_ARGS (pcrtable->base_pcrtime), GST_TIME_ARGS (pcrtable->base_time), GST_TIME_ARGS (pcrtable->pcroffset)); - res = pts + pcrtable->pcroffset; + res = pts + pcrtable->pcroffset + packetizer->extra_shift; /* Don't return anything if we differ too much against last seen PCR */ - /* FIXME : Ideally we want to figure out whether we have a wraparound or - * a reset so we can provide actual values. 
- * That being said, this will only happen for the small interval of time - * where PTS/DTS are wrapping just before we see the first reset/wrap PCR - */ if (G_UNLIKELY (pcr_pid != 0x1fff && ABSDIFF (res, pcrtable->last_pcrtime) > 15 * GST_SECOND)) res = GST_CLOCK_TIME_NONE; else { GstClockTime tmp = pcrtable->base_time + pcrtable->skew; - if (tmp + res >= pcrtable->base_pcrtime) + if (tmp + res >= pcrtable->base_pcrtime) { res += tmp - pcrtable->base_pcrtime; - else + } else if (ABSDIFF (tmp + res + PCR_GST_MAX_VALUE, + pcrtable->base_pcrtime) < PCR_GST_MAX_VALUE / 2) { + /* Handle wrapover */ + res += tmp + PCR_GST_MAX_VALUE - pcrtable->base_pcrtime; + } else { + /* Fallback for values that differ way too much */ res = GST_CLOCK_TIME_NONE; + } } } else if (packetizer->calculate_offset && pcrtable->groups) { gint64 refpcr = G_MAXINT64, refpcroffset;
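The mpegtspacketizer.c hunk above replaces the old FIXME with actual wraparound handling: when a converted timestamp lands before the base PCR time, it checks whether adding one full wrap period (`PCR_GST_MAX_VALUE`, the 33-bit MPEG clock range) brings it close enough to the base to be a genuine wrap, and only then shifts it; otherwise the value is dropped as a discontinuity. A simplified sketch of that test working in 90 kHz ticks (the names and exact threshold are illustrative, not the GStreamer code):

```c
#include <assert.h>
#include <stdint.h>

/* 33-bit MPEG PTS/PCR values wrap every 2^33 ticks of the 90 kHz
 * clock (roughly 26.5 hours). */
#define PTS_TICKS_MAX (1ULL << 33)   /* one wrap period, in ticks */

static uint64_t absdiff(uint64_t a, uint64_t b)
{
    return a > b ? a - b : b - a;
}

/* Return the unwrapped timestamp, or UINT64_MAX if the value is
 * too far from base to be explained by a single wraparound --
 * mirroring the shape of the new else-if branch in the diff. */
static uint64_t unwrap_pts(uint64_t pts, uint64_t base)
{
    if (pts >= base)
        return pts;
    if (absdiff(pts + PTS_TICKS_MAX, base) < PTS_TICKS_MAX / 2)
        return pts + PTS_TICKS_MAX;  /* genuine wraparound */
    return UINT64_MAX;               /* discontinuity: give up */
}
```

The half-period threshold keeps a small pre-wrap/post-wrap straddle recoverable while still rejecting timestamps that jumped by an unrelated amount.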
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/mpegtspacketizer.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/mpegtspacketizer.h
Changed
@@ -287,6 +287,10 @@ /* PTS/DTS of last buffer */ GstClockTime last_pts; GstClockTime last_dts; + + /* Extra time offset to handle values before initial PCR. + * This will be added to all converted timestamps */ + GstClockTime extra_shift; }; struct _MpegTSPacketizer2Class {
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/mpegtsparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/mpegtsparse.c
Changed
@@ -125,6 +125,11 @@ #define mpegts_parse_parent_class parent_class G_DEFINE_TYPE (MpegTSParse2, mpegts_parse, GST_TYPE_MPEGTS_BASE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (tsparse, "tsparse", + GST_RANK_NONE, GST_TYPE_MPEGTS_PARSE, + GST_DEBUG_CATEGORY_INIT (mpegts_parse_debug, "tsparse", 0, + "MPEG transport stream parser")); + static void mpegts_parse_reset (MpegTSBase * base); static GstFlowReturn mpegts_parse_input_done (MpegTSBase * base); static GstFlowReturn @@ -418,9 +423,12 @@ prepare_src_pad (base, parse); } if (G_UNLIKELY (GST_EVENT_TYPE (event) == GST_EVENT_EOS)) { + gsize packet_size = base->packetizer->packet_size; + parse->is_eos = TRUE; - if (parse->alignment > 0 && parse->ts_adapter.packets_in_adapter > 0 + if (packet_size > 0 && parse->alignment > 0 && + parse->ts_adapter.packets_in_adapter > 0 && parse->ts_adapter.packets_in_adapter < parse->alignment) { GstBuffer *buf; GstMapInfo map; @@ -428,7 +436,6 @@ gint missing_packets = parse->alignment - parse->ts_adapter.packets_in_adapter; gint i = missing_packets; - gsize packet_size = base->packetizer->packet_size; GST_DEBUG_OBJECT (parse, "Adding %d dummy packets", missing_packets); @@ -436,8 +443,6 @@ gst_buffer_map (buf, &map, GST_MAP_READWRITE); data = map.data; - g_assert (packet_size > 0); - for (; i > 0; i--) { gint offset; @@ -643,7 +648,7 @@ pts = gst_adapter_prev_pts_at_offset (adapter, offset, &pts_dist); dts = gst_adapter_prev_dts_at_offset (adapter, offset, &dts_dist); - GST_LOG_OBJECT (ts_adapter, + GST_LOG_OBJECT (pad, "prev pts:%" GST_TIME_FORMAT " (dist:%" G_GUINT64_FORMAT ") dts:%" GST_TIME_FORMAT " (dist:%" G_GUINT64_FORMAT ")", GST_TIME_ARGS (pts), pts_dist, GST_TIME_ARGS (dts), dts_dist); @@ -1206,13 +1211,3 @@ } return res; } - -gboolean -gst_mpegtsparse_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (mpegts_parse_debug, "tsparse", 0, - "MPEG transport stream parser"); - - return gst_element_register (plugin, "tsparse", - GST_RANK_NONE, GST_TYPE_MPEGTS_PARSE); -}
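mpegtsparse.c drops the old `gst_mpegtsparse_plugin_init()` in favour of GStreamer 1.20's `GST_ELEMENT_REGISTER_DEFINE_WITH_CODE`, which bundles the one-time setup code (here the debug-category init) with the element registration function. The core idea is a once-guard around the extra init code; below is a rough plain-C sketch of that shape, where all names are hypothetical stand-ins and not the actual macro expansion:

```c
#include <assert.h>

/* Counters standing in for observable side effects. */
static int debug_inits;    /* how many times the init code ran */
static int registrations;  /* how many times registration ran  */

/* Sketch: the _WITH_CODE body runs exactly once, however many
 * times callers request registration of the element. */
static int register_tsparse_element(void)
{
    static int once_done;

    if (!once_done) {
        once_done = 1;
        debug_inits++;     /* GST_DEBUG_CATEGORY_INIT stand-in */
    }
    registrations++;       /* gst_element_register() stand-in */
    return 1;              /* TRUE: registration succeeded */
}
```

The matching header change (further below) swaps the `gst_mpegtsparse_plugin_init` prototype for `GST_ELEMENT_REGISTER_DECLARE (tsparse)`, so other translation units can trigger registration without knowing about the plugin entry point.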
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/mpegtsparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/mpegtsparse.h
Changed
@@ -95,8 +95,7 @@ }; G_GNUC_INTERNAL GType mpegts_parse_get_type(void); - -G_GNUC_INTERNAL gboolean gst_mpegtsparse_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (tsparse); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/tsdemux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/tsdemux.c
Changed
@@ -172,6 +172,10 @@ /* Output data */ PendingPacketState state; + /* PES header being reconstructed (optional, allocated) */ + guint8 *pending_header_data; + guint pending_header_size; + /* Data being reconstructed (allocated) */ guint8 *data; @@ -297,6 +301,7 @@ PROP_PROGRAM_NUMBER, PROP_EMIT_STATS, PROP_LATENCY, + PROP_SEND_SCTE35_EVENTS, /* FILL ME */ }; @@ -340,6 +345,7 @@ static gboolean sink_query (MpegTSBase * base, GstQuery * query); static void gst_ts_demux_check_and_sync_streams (GstTSDemux * demux, GstClockTime time); +static void handle_psi (MpegTSBase * base, GstMpegtsSection * section); static void _extra_init (void) @@ -356,6 +362,12 @@ #define gst_ts_demux_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstTSDemux, gst_ts_demux, GST_TYPE_MPEGTS_BASE, _extra_init ()); +#define _do_element_init \ + GST_DEBUG_CATEGORY_INIT (ts_demux_debug, "tsdemux", 0, \ + "MPEG transport stream demuxer");\ + init_pes_parser (); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (tsdemux, "tsdemux", + GST_RANK_PRIMARY, GST_TYPE_TS_DEMUX, _do_element_init); static void gst_ts_demux_dispose (GObject * object) @@ -368,6 +380,17 @@ } static void +gst_ts_demux_finalize (GObject * object) +{ + GstTSDemux *demux = GST_TS_DEMUX_CAST (object); + + gst_event_replace (&demux->segment_event, NULL); + g_mutex_clear (&demux->lock); + + GST_CALL_PARENT (G_OBJECT_CLASS, finalize, (object)); +} + +static void gst_ts_demux_class_init (GstTSDemuxClass * klass) { GObjectClass *gobject_class; @@ -378,6 +401,7 @@ gobject_class->set_property = gst_ts_demux_set_property; gobject_class->get_property = gst_ts_demux_get_property; gobject_class->dispose = gst_ts_demux_dispose; + gobject_class->finalize = gst_ts_demux_finalize; g_object_class_install_property (gobject_class, PROP_PROGRAM_NUMBER, g_param_spec_int ("program-number", "Program number", @@ -389,6 +413,22 @@ "Emit messages for every pcr/opcr/pts/dts", FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + /** + * tsdemux:send-scte35-events: + 
* + * Whether SCTE 35 sections should be forwarded as events. + * + * When forwarding those, potential splice times are converted + * to running time, and can be used by a downstream muxer to reinject + * the sections. + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_SEND_SCTE35_EVENTS, + g_param_spec_boolean ("send-scte35-events", "Send SCTE 35 events", + "Whether SCTE 35 sections should be forwarded as events", FALSE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + g_object_class_install_property (gobject_class, PROP_LATENCY, g_param_spec_int ("latency", "Latency", "Latency to add for smooth demuxing (in ms)", -1, @@ -416,6 +456,7 @@ ts_class->reset = GST_DEBUG_FUNCPTR (gst_ts_demux_reset); ts_class->push = GST_DEBUG_FUNCPTR (gst_ts_demux_push); ts_class->push_event = GST_DEBUG_FUNCPTR (push_event); + ts_class->handle_psi = GST_DEBUG_FUNCPTR (handle_psi); ts_class->sink_query = GST_DEBUG_FUNCPTR (sink_query); ts_class->program_started = GST_DEBUG_FUNCPTR (gst_ts_demux_program_started); ts_class->program_stopped = GST_DEBUG_FUNCPTR (gst_ts_demux_program_stopped); @@ -434,10 +475,9 @@ GstTSDemux *demux = (GstTSDemux *) base; demux->rate = 1.0; - if (demux->segment_event) { - gst_event_unref (demux->segment_event); - demux->segment_event = NULL; - } + g_mutex_lock (&demux->lock); + gst_event_replace (&demux->segment_event, NULL); + g_mutex_unlock (&demux->lock); if (demux->global_tags) { gst_tag_list_unref (demux->global_tags); @@ -454,6 +494,8 @@ demux->last_seek_offset = -1; demux->program_generation = 0; + + demux->mpeg_pts_offset = 0; } static void @@ -471,6 +513,8 @@ demux->program_number = -1; demux->latency = DEFAULT_LATENCY; gst_ts_demux_reset (base); + + g_mutex_init (&demux->lock); } @@ -489,6 +533,9 @@ case PROP_EMIT_STATS: demux->emit_statistics = g_value_get_boolean (value); break; + case PROP_SEND_SCTE35_EVENTS: + demux->send_scte35_events = g_value_get_boolean (value); + break; case PROP_LATENCY: demux->latency = 
g_value_get_int (value); break; @@ -510,6 +557,9 @@ case PROP_EMIT_STATS: g_value_set_boolean (value, demux->emit_statistics); break; + case PROP_SEND_SCTE35_EVENTS: + g_value_set_boolean (value, demux->send_scte35_events); + break; case PROP_LATENCY: g_value_set_int (value, demux->latency); break; @@ -556,7 +606,7 @@ switch (GST_QUERY_TYPE (query)) { case GST_QUERY_DURATION: { - GST_DEBUG ("query duration"); + GST_DEBUG_OBJECT (pad, "query duration"); gst_query_parse_duration (query, &format, NULL); if (format == GST_FORMAT_TIME) { if (!gst_pad_peer_query (base->sinkpad, query)) { @@ -574,7 +624,7 @@ } case GST_QUERY_LATENCY: { - GST_DEBUG ("query latency"); + GST_DEBUG_OBJECT (pad, "query latency"); res = gst_pad_peer_query (base->sinkpad, query); if (res) { GstClockTime min_lat, max_lat; @@ -601,9 +651,10 @@ } case GST_QUERY_SEEKING: { - GST_DEBUG ("query seeking"); + GST_DEBUG_OBJECT (pad, "query seeking"); gst_query_parse_seeking (query, &format, NULL, NULL, NULL); - GST_DEBUG ("asked for format %s", gst_format_get_name (format)); + GST_DEBUG_OBJECT (pad, "asked for format %s", + gst_format_get_name (format)); if (format == GST_FORMAT_TIME) { gboolean seekable = FALSE; @@ -616,7 +667,8 @@ GstClockTime dur; if (gst_ts_demux_get_duration (demux, &dur)) { gst_query_set_seeking (query, GST_FORMAT_TIME, TRUE, 0, dur); - GST_DEBUG ("Gave duration: %" GST_TIME_FORMAT, GST_TIME_ARGS (dur)); + GST_DEBUG_OBJECT (pad, "Gave duration: %" GST_TIME_FORMAT, + GST_TIME_ARGS (dur)); } } } else { @@ -703,9 +755,10 @@ if (gst_byte_writer_put_data (h264infos->sei, unit.data + unit.sc_offset, unit.size + unit.offset - unit.sc_offset)) { - GST_DEBUG ("adding SEI %u", unit.size + unit.offset - unit.sc_offset); + GST_DEBUG_OBJECT (stream->pad, "adding SEI %u", + unit.size + unit.offset - unit.sc_offset); } else { - GST_WARNING ("Could not write SEI"); + GST_WARNING_OBJECT (stream->pad, "Could not write SEI"); } break; case GST_H264_NAL_PPS: @@ -715,9 +768,10 @@ if 
(gst_byte_writer_put_data (h264infos->pps, unit.data + unit.sc_offset, unit.size + unit.offset - unit.sc_offset)) { - GST_DEBUG ("adding PPS %u", unit.size + unit.offset - unit.sc_offset); + GST_DEBUG_OBJECT (stream->pad, "adding PPS %u", + unit.size + unit.offset - unit.sc_offset); } else { - GST_WARNING ("Could not write PPS"); + GST_WARNING_OBJECT (stream->pad, "Could not write PPS"); } break; case GST_H264_NAL_SPS: @@ -727,9 +781,10 @@ if (gst_byte_writer_put_data (h264infos->sps, unit.data + unit.sc_offset, unit.size + unit.offset - unit.sc_offset)) { - GST_DEBUG ("adding SPS %u", unit.size + unit.offset - unit.sc_offset); + GST_DEBUG_OBJECT (stream->pad, "adding SPS %u", + unit.size + unit.offset - unit.sc_offset); } else { - GST_WARNING ("Could not write SPS"); + GST_WARNING_OBJECT (stream->pad, "Could not write SPS"); } break; /* these units are considered keyframes in h264parse */ @@ -786,19 +841,19 @@ tmpsize = gst_byte_writer_get_size (h264infos->sei); if (tmpsize) { - GST_DEBUG ("Adding SEI"); + GST_DEBUG_OBJECT (stream->pad, "Adding SEI"); data = gst_byte_writer_reset_and_get_data (h264infos->sei); gst_byte_writer_put_data (h264infos->sps, data, tmpsize); g_free (data); } if (frame_unit.size) { /* We found the everything in one go! 
*/ - GST_DEBUG ("Adding Keyframe"); + GST_DEBUG_OBJECT (stream->pad, "Adding Keyframe"); gst_byte_writer_put_data (h264infos->sps, frame_unit.data + frame_unit.sc_offset, stream->current_size - frame_unit.sc_offset); } else { - GST_DEBUG ("Adding Keyframe"); + GST_DEBUG_OBJECT (stream->pad, "Adding Keyframe"); gst_byte_writer_put_data (h264infos->sps, h264infos->framedata.data, h264infos->framedata.size); clear_simple_buffer (&h264infos->framedata); @@ -819,7 +874,7 @@ " we will push later"); h264infos->framedata.data = - g_memdup (frame_unit.data + frame_unit.sc_offset, + g_memdup2 (frame_unit.data + frame_unit.sc_offset, stream->current_size - frame_unit.sc_offset); h264infos->framedata.size = stream->current_size - frame_unit.sc_offset; } @@ -870,18 +925,26 @@ gboolean update = FALSE; GstSegment seeksegment; - GST_DEBUG ("seek event, %" GST_PTR_FORMAT, event); + GST_DEBUG_OBJECT (demux, "seek event, %" GST_PTR_FORMAT, event); + + if (base->out_segment.format == GST_FORMAT_UNDEFINED) { + GST_DEBUG_OBJECT (demux, "Cannot process seek event now, delaying"); + gst_event_replace (&base->seek_event, event); + res = GST_FLOW_OK; + goto done; + } gst_event_parse_seek (event, &rate, &format, &flags, &start_type, &start, &stop_type, &stop); if (rate <= 0.0) { - GST_WARNING ("Negative rate not supported"); + GST_WARNING_OBJECT (demux, "Negative rate not supported"); goto done; } if (flags & (GST_SEEK_FLAG_SEGMENT)) { - GST_WARNING ("seek flags 0x%x are not supported", (int) flags); + GST_WARNING_OBJECT (demux, "seek flags 0x%x are not supported", + (int) flags); goto done; } @@ -903,6 +966,7 @@ &seeksegment); /* If the position actually changed, update == TRUE */ + g_mutex_lock (&demux->lock); if (update) { GstClockTime target = seeksegment.start; if (target >= SEEK_TIMESTAMP_OFFSET) @@ -914,7 +978,9 @@ mpegts_packetizer_ts_to_offset (base->packetizer, target, demux->program->pcr_pid); if (G_UNLIKELY (start_offset == -1)) { - GST_WARNING ("Couldn't convert start position 
to an offset"); + GST_WARNING_OBJECT (demux, + "Couldn't convert start position to an offset"); + g_mutex_unlock (&demux->lock); goto done; } @@ -940,8 +1006,8 @@ } else { /* Position didn't change, just update the output segment based on * our new one */ - gst_event_replace (&demux->segment_event, NULL); - demux->segment_event = gst_event_new_segment (&seeksegment); + gst_event_take (&demux->segment_event, + gst_event_new_segment (&seeksegment)); if (base->last_seek_seqnum) gst_event_set_seqnum (demux->segment_event, base->last_seek_seqnum); for (tmp = demux->program->stream_list; tmp; tmp = tmp->next) { @@ -949,6 +1015,7 @@ stream->need_newsegment = TRUE; } } + g_mutex_unlock (&demux->lock); /* Commit the new segment */ memcpy (&base->out_segment, &seeksegment, sizeof (GstSegment)); @@ -971,7 +1038,7 @@ case GST_EVENT_SEEK: res = mpegts_base_handle_seek_event ((MpegTSBase *) demux, pad, event); if (!res) - GST_WARNING ("seeking failed"); + GST_WARNING_OBJECT (pad, "seeking failed"); gst_event_unref (event); break; default: @@ -996,6 +1063,16 @@ gboolean early_ret = FALSE; if (GST_EVENT_TYPE (event) == GST_EVENT_SEGMENT) { + if (base->segment.format == GST_FORMAT_TIME && base->ignore_pcr) { + /* Shift start/stop values by 2s */ + base->packetizer->extra_shift = 2 * GST_SECOND; + if (GST_CLOCK_TIME_IS_VALID (base->segment.start)) + base->segment.start += 2 * GST_SECOND; + if (GST_CLOCK_TIME_IS_VALID (base->segment.stop)) + base->segment.stop += 2 * GST_SECOND; + if (GST_CLOCK_TIME_IS_VALID (base->segment.position)) + base->segment.position += 2 * GST_SECOND; + } GST_DEBUG_OBJECT (base, "Ignoring segment event (recreated later)"); gst_event_unref (event); return TRUE; @@ -1049,6 +1126,87 @@ return TRUE; } +static void +handle_psi (MpegTSBase * base, GstMpegtsSection * section) +{ + GstTSDemux *demux = (GstTSDemux *) base; + + if (section->section_type == GST_MPEGTS_SECTION_SCTE_SIT) { + GList *tmp; + gboolean forward = FALSE; + + if (demux->send_scte35_events) { + 
for (tmp = demux->program->stream_list; tmp; tmp = tmp->next) { + TSDemuxStream *stream = (TSDemuxStream *) tmp->data; + + if (stream->stream.pid == section->pid) { + forward = TRUE; + break; + } + } + } + + /* Create a new section to travel through the pipeline, with splice + * times translated from local time to running time */ + if (forward) { + GstEvent *event; + GstStructure *s; + GstStructure *rtime_map; + GstClockTime pts; + guint i = 0; + GstMpegtsSection *new_section = + (GstMpegtsSection *) gst_mini_object_copy ((GstMiniObject *) section); + GstMpegtsSCTESIT *sit = + (GstMpegtsSCTESIT *) gst_mpegts_section_get_scte_sit (new_section); + + rtime_map = gst_structure_new_empty ("running-time-map"); + + if (sit->fully_parsed) { + if (sit->splice_time_specified) { + pts = + mpegts_packetizer_pts_to_ts (base->packetizer, + MPEGTIME_TO_GSTTIME (sit->splice_time + + sit->pts_adjustment), demux->program->pcr_pid); + gst_structure_set (rtime_map, "splice-time", G_TYPE_UINT64, + gst_segment_to_running_time (&base->out_segment, GST_FORMAT_TIME, + pts), NULL); + } + + for (i = 0; i < sit->splices->len; i++) { + gchar *field_name; + GstMpegtsSCTESpliceEvent *sevent = + g_ptr_array_index (sit->splices, i); + + if (sevent->program_splice_time_specified) { + pts = + mpegts_packetizer_pts_to_ts (base->packetizer, + MPEGTIME_TO_GSTTIME (sevent->program_splice_time + + sit->pts_adjustment), demux->program->pcr_pid); + field_name = + g_strdup_printf ("event-%u-splice-time", + sevent->splice_event_id); + gst_structure_set (rtime_map, field_name, G_TYPE_UINT64, + gst_segment_to_running_time (&base->out_segment, + GST_FORMAT_TIME, pts), NULL); + g_free (field_name); + } + } + } + + event = gst_event_new_mpegts_section (new_section); + gst_mpegts_section_unref (new_section); + + s = gst_event_writable_structure (event); + gst_structure_set (s, "mpeg-pts-offset", G_TYPE_UINT64, + demux->mpeg_pts_offset, "running-time-map", GST_TYPE_STRUCTURE, + rtime_map, NULL); + 
gst_structure_free (rtime_map); + + push_event (base, event); + } + } +} + static gboolean sink_query (MpegTSBase * base, GstQuery * query) { @@ -1089,7 +1247,7 @@ { const gchar *lc; - GST_LOG ("Add language code for stream: '%s'", lang_code); + GST_LOG_OBJECT (stream->pad, "Add language code for stream: '%s'", lang_code); if (!stream->taglist) stream->taglist = gst_tag_list_new_empty (); @@ -1118,7 +1276,7 @@ nb = gst_mpegts_descriptor_parse_iso_639_language_nb (desc); - GST_DEBUG ("Found ISO 639 descriptor (%d entries)", nb); + GST_DEBUG_OBJECT (stream->pad, "Found ISO 639 descriptor (%d entries)", nb); for (i = 0; i < nb; i++) if (gst_mpegts_descriptor_parse_iso_639_language_idx (desc, i, &lang_code, @@ -1138,7 +1296,8 @@ nb = gst_mpegts_descriptor_parse_dvb_subtitling_nb (desc); - GST_DEBUG ("Found SUBTITLING descriptor (%d entries)", nb); + GST_DEBUG_OBJECT (stream->pad, "Found SUBTITLING descriptor (%d entries)", + nb); for (i = 0; i < nb; i++) if (gst_mpegts_descriptor_parse_dvb_subtitling_idx (desc, i, &lang_code, @@ -1147,6 +1306,32 @@ g_free (lang_code); } } + + if (bstream->stream_type == GST_MPEGTS_STREAM_TYPE_PRIVATE_PES_PACKETS) { + desc = mpegts_get_descriptor_from_stream_with_extension (bstream, + GST_MTS_DESC_DVB_EXTENSION, GST_MTS_DESC_EXT_DVB_AUDIO_PRESELECTION); + + if (desc) { + GPtrArray *list; + GstMpegtsAudioPreselectionDescriptor *item; + + if (gst_mpegts_descriptor_parse_audio_preselection_list (desc, &list)) { + GST_DEBUG ("Found AUDIO PRESELECTION descriptor (%d entries)", + list->len); + + for (i = 0; i < list->len; i++) { + item = g_ptr_array_index (list, i); + gst_mpegts_descriptor_parse_audio_preselection_dump (item); + + if (item->language_code_present) { + add_iso639_language_to_tags (stream, item->language_code); + break; + } + } + g_ptr_array_unref (list); + } + } + } } static GstPad * @@ -1166,7 +1351,8 @@ gst_ts_demux_create_tags (stream); - GST_LOG ("Attempting to create pad for stream 0x%04x with stream_type %d", + 
GST_LOG_OBJECT (demux, + "Attempting to create pad for stream 0x%04x with stream_type %d", bstream->pid, bstream->stream_type); /* First handle BluRay-specific stream types since there is some overlap @@ -1182,7 +1368,7 @@ mpegts_get_descriptor_from_stream (bstream, GST_MTS_DESC_AC3_AUDIO_STREAM); if (ac3_desc && DESC_AC_AUDIO_STREAM_bsid (ac3_desc->data) != 16) { - GST_LOG ("ac3 audio"); + GST_LOG_OBJECT (demux, "ac3 audio"); is_audio = TRUE; caps = gst_caps_new_empty_simple ("audio/x-ac3"); } else { @@ -1241,7 +1427,7 @@ * * frame_rate * * profile_and_level */ - GST_LOG ("mpeg video"); + GST_LOG_OBJECT (demux, "mpeg video"); is_video = TRUE; caps = gst_caps_new_simple ("video/mpeg", "mpegversion", G_TYPE_INT, @@ -1251,7 +1437,7 @@ break; case GST_MPEGTS_STREAM_TYPE_AUDIO_MPEG1: case GST_MPEGTS_STREAM_TYPE_AUDIO_MPEG2: - GST_LOG ("mpeg audio"); + GST_LOG_OBJECT (demux, "mpeg audio"); is_audio = TRUE; caps = gst_caps_new_simple ("audio/mpeg", "mpegversion", G_TYPE_INT, 1, @@ -1261,16 +1447,15 @@ gst_caps_set_simple (caps, "layer", G_TYPE_INT, 2, NULL); break; case GST_MPEGTS_STREAM_TYPE_PRIVATE_PES_PACKETS: - GST_LOG ("private data"); + GST_LOG_OBJECT (demux, "private data"); /* FIXME: Move all of this into a common method (there might be other * types also, depending on registratino descriptors also */ - desc = - mpegts_get_descriptor_from_stream (bstream, - GST_MTS_DESC_DVB_EXTENSION); - if (desc != NULL && desc->tag_extension == GST_MTS_DESC_EXT_DVB_AC4) { - GST_LOG ("ac4 audio"); + desc = mpegts_get_descriptor_from_stream_with_extension (bstream, + GST_MTS_DESC_DVB_EXTENSION, GST_MTS_DESC_EXT_DVB_AC4); + if (desc) { + GST_LOG_OBJECT (demux, "ac4 audio"); is_audio = TRUE; caps = gst_caps_new_empty_simple ("audio/x-ac4"); break; @@ -1278,7 +1463,7 @@ desc = mpegts_get_descriptor_from_stream (bstream, GST_MTS_DESC_DVB_AC3); if (desc) { - GST_LOG ("ac3 audio"); + GST_LOG_OBJECT (demux, "ac3 audio"); is_audio = TRUE; caps = gst_caps_new_empty_simple ("audio/x-ac3"); 
break; @@ -1288,7 +1473,7 @@ mpegts_get_descriptor_from_stream (bstream, GST_MTS_DESC_DVB_ENHANCED_AC3); if (desc) { - GST_LOG ("ac3 audio"); + GST_LOG_OBJECT (demux, "ac3 audio"); is_audio = TRUE; caps = gst_caps_new_empty_simple ("audio/x-eac3"); break; @@ -1297,7 +1482,7 @@ mpegts_get_descriptor_from_stream (bstream, GST_MTS_DESC_DVB_TELETEXT); if (desc) { - GST_LOG ("teletext"); + GST_LOG_OBJECT (demux, "teletext"); is_private = TRUE; caps = gst_caps_new_empty_simple ("application/x-teletext"); sparse = TRUE; @@ -1307,7 +1492,7 @@ mpegts_get_descriptor_from_stream (bstream, GST_MTS_DESC_DVB_SUBTITLING); if (desc) { - GST_LOG ("subtitling"); + GST_LOG_OBJECT (demux, "subtitling"); is_subpicture = TRUE; caps = gst_caps_new_empty_simple ("subpicture/x-dvb"); sparse = TRUE; @@ -1592,8 +1777,8 @@ const guint desc_min_length = 24; if (desc->length < desc_min_length) { - GST_ERROR - ("GST_MPEGTS_STREAM_TYPE_VIDEO_JP2K: descriptor length %d too short", + GST_ERROR_OBJECT (demux, + "GST_MPEGTS_STREAM_TYPE_VIDEO_JP2K: descriptor length %d too short", desc->length); return NULL; } @@ -1613,8 +1798,8 @@ interlaced_video = remaining_8b & 0x40; /* we don't support demuxing interlaced at the moment */ if (interlaced_video) { - GST_ERROR - ("GST_MPEGTS_STREAM_TYPE_VIDEO_JP2K: interlaced video not supported"); + GST_ERROR_OBJECT (demux, + "GST_MPEGTS_STREAM_TYPE_VIDEO_JP2K: interlaced video not supported"); return NULL; } else { interlace_mode = "progressive"; @@ -1646,7 +1831,7 @@ break; case ST_VIDEO_DIRAC: if (bstream->registration_id == 0x64726163) { - GST_LOG ("dirac"); + GST_LOG_OBJECT (demux, "dirac"); /* dirac in hex */ is_video = TRUE; caps = gst_caps_new_empty_simple ("video/x-dirac"); @@ -1662,8 +1847,8 @@ if (bstream->registration_id == DRF_ID_VC1) is_vc1 = TRUE; if (!is_vc1) { - GST_WARNING ("0xea private stream type found but no descriptor " - "for VC1. 
Assuming plain VC1."); + GST_WARNING_OBJECT (demux, "0xea private stream type found but " + "no descriptor for VC1. Assuming plain VC1."); } is_video = TRUE; @@ -1695,7 +1880,8 @@ break; } - GST_WARNING ("AC3 stream type found but no guaranteed " + GST_WARNING_OBJECT (demux, + "AC3 stream type found but no guaranteed " "way found to differentiate between AC3 and EAC3. " "Assuming plain AC3."); is_audio = TRUE; @@ -1744,7 +1930,8 @@ caps = gst_caps_new_empty_simple ("video/x-cavs"); break; default: - GST_DEBUG ("Non-media stream (stream_type:0x%x). Not creating pad", + GST_DEBUG_OBJECT (demux, + "Non-media stream (stream_type:0x%x). Not creating pad", bstream->stream_type); break; } @@ -1785,7 +1972,8 @@ GstEvent *event; const gchar *stream_id; - GST_LOG ("stream:%p creating pad with name %s and caps %" GST_PTR_FORMAT, + GST_LOG_OBJECT (demux, + "stream:%p creating pad with name %s and caps %" GST_PTR_FORMAT, stream, name, caps); pad = gst_pad_new_from_template (template, name); gst_pad_set_active (pad, TRUE); @@ -1960,6 +2148,9 @@ g_free (stream->data); stream->data = NULL; + g_free (stream->pending_header_data); + stream->pending_header_data = NULL; + stream->pending_header_size = 0; stream->state = PENDING_PACKET_EMPTY; stream->expected_size = 0; stream->allocated_size = 0; @@ -2079,10 +2270,9 @@ /* If this is not the initial program, we need to calculate * a new segment */ - if (demux->segment_event) { - gst_event_unref (demux->segment_event); - demux->segment_event = NULL; - } + g_mutex_lock (&demux->lock); + gst_event_replace (&demux->segment_event, NULL); + g_mutex_unlock (&demux->lock); /* DRAIN ALL STREAMS FIRST ! 
*/ if (demux->previous_program) { @@ -2158,6 +2348,7 @@ guint64 pts, guint64 offset) { MpegTSBaseStream *bs = (MpegTSBaseStream *) stream; + MpegTSBase *base = GST_MPEGTS_BASE (demux); stream->raw_pts = pts; if (pts == -1) { @@ -2173,6 +2364,24 @@ mpegts_packetizer_pts_to_ts (MPEG_TS_BASE_PACKETIZER (demux), MPEGTIME_TO_GSTTIME (pts), demux->program->pcr_pid); + if (base->out_segment.format == GST_FORMAT_TIME) + demux->mpeg_pts_offset = + (GSTTIME_TO_MPEGTIME (gst_segment_to_running_time (&base->out_segment, + GST_FORMAT_TIME, stream->pts)) - pts) & 0x1ffffffff; + + /* Sanity check, some stream have completely different PTS vs DTS. If that + * happens we only keep the DTS */ + if (GST_CLOCK_TIME_IS_VALID (stream->pts) && + GST_CLOCK_TIME_IS_VALID (stream->dts) && + ABSDIFF (stream->pts, stream->dts) > 5 * GST_SECOND) { + GST_WARNING ("pid 0x%04x PTS %" GST_TIME_FORMAT + " differs too much against DTS %" GST_TIME_FORMAT ", discarding it", + bs->pid, GST_TIME_ARGS (stream->pts), GST_TIME_ARGS (stream->dts)); + stream->raw_pts = stream->raw_dts; + stream->pts = stream->dts; + return; + } + GST_LOG ("pid 0x%04x Stored PTS %" G_GUINT64_FORMAT, bs->pid, stream->pts); if (G_UNLIKELY (demux->emit_statistics)) { @@ -2230,6 +2439,7 @@ guint64 offset = 0; GList *tmp; gboolean have_only_sparse = TRUE; + gboolean exceeded_threshold = FALSE; /* 0. Do we only have sparse stream */ for (tmp = demux->program->stream_list; tmp; tmp = tmp->next) { @@ -2251,12 +2461,39 @@ have_observation = TRUE; break; } + /* 1.2 Check if we exceeded the maximum threshold of pending data */ + if (tmpstream->pending && (tmpstream->raw_dts != -1 + || tmpstream->raw_pts != -1)) { + PendingBuffer *pend = tmpstream->pending->data; + guint64 lastval = + tmpstream->raw_dts != -1 ? tmpstream->raw_dts : tmpstream->raw_pts; + guint64 firstval = pend->dts != -1 ? 
pend->dts : pend->pts; + GstClockTime dur; + g_assert (firstval != -1); + dur = MPEGTIME_TO_GSTTIME (lastval - firstval); + GST_DEBUG_OBJECT (tmpstream->pad, + "Pending content duration: %" GST_TIME_FORMAT, GST_TIME_ARGS (dur)); + if (dur > 500 * GST_MSECOND) { + exceeded_threshold = TRUE; + break; + } + } } } - /* 2. If we don't have a valid value yet, break out */ - if (have_observation == FALSE) - return FALSE; + if (have_observation == FALSE) { + /* 2. If we don't have a valid value yet, break out */ + if (!exceeded_threshold) + return FALSE; + + /* Except if we've exceed the maximum amount of pending buffers, in which + * case we ignore PCR from now on */ + GST_DEBUG_OBJECT (demux, + "Saw more than 500ms of data without PCR. Ignoring PCR from now on"); + GST_MPEGTS_BASE (demux)->ignore_pcr = TRUE; + demux->program->pcr_pid = 0x1fff; + g_object_notify (G_OBJECT (demux), "ignore-pcr"); + } /* 3. Go over all streams that have current/pending data */ for (tmp = demux->program->stream_list; tmp; tmp = tmp->next) { @@ -2368,9 +2605,26 @@ GST_MEMDUMP ("Header buffer", data, MIN (length, 32)); + if (G_UNLIKELY (stream->pending_header_data)) { + /* Accumulate with previous header if present */ + stream->pending_header_data = + g_realloc (stream->pending_header_data, + stream->pending_header_size + length); + memcpy (stream->pending_header_data + stream->pending_header_size, data, + length); + data = stream->pending_header_data; + length = stream->pending_header_size + length; + } + parseres = mpegts_parse_pes_header (data, length, &header); - if (G_UNLIKELY (parseres == PES_PARSING_NEED_MORE)) - goto discont; + + if (G_UNLIKELY (parseres == PES_PARSING_NEED_MORE)) { + /* This can happen if PES header is bigger than a packet. */ + if (!stream->pending_header_data) + stream->pending_header_data = g_memdup2 (data, length); + stream->pending_header_size = length; + return; + } if (G_UNLIKELY (parseres == PES_PARSING_BAD)) { GST_WARNING ("Error parsing PES header. 
pid: 0x%x stream_type: 0x%x", stream->stream.pid, stream->stream.stream_type); @@ -2430,9 +2684,20 @@ stream->state = PENDING_PACKET_BUFFER; + if (stream->pending_header_data) { + g_free (stream->pending_header_data); + stream->pending_header_data = NULL; + stream->pending_header_size = 0; + } + return; discont: + if (stream->pending_header_data) { + g_free (stream->pending_header_data); + stream->pending_header_data = NULL; + stream->pending_header_size = 0; + } stream->state = PENDING_PACKET_DISCONT; return; } @@ -2449,16 +2714,17 @@ guint size; guint8 cc = FLAGS_CONTINUITY_COUNTER (packet->scram_afc_cc); - GST_LOG ("pid: 0x%04x state:%d", stream->stream.pid, stream->state); + GST_LOG_OBJECT (demux, "pid: 0x%04x state:%d", stream->stream.pid, + stream->state); size = packet->data_end - packet->payload; data = packet->payload; if (stream->continuity_counter == CONTINUITY_UNSET) { - GST_DEBUG ("CONTINUITY: Initialize to %d", cc); + GST_DEBUG_OBJECT (demux, "CONTINUITY: Initialize to %d", cc); } else if ((cc == stream->continuity_counter + 1 || (stream->continuity_counter == MAX_CONTINUITY && cc == 0))) { - GST_LOG ("CONTINUITY: Got expected %d", cc); + GST_LOG_OBJECT (demux, "CONTINUITY: Got expected %d", cc); } else { if (stream->state != PENDING_PACKET_EMPTY) { if (packet->payload_unit_start_indicator) { @@ -2468,10 +2734,19 @@ g_free (stream->data); stream->data = NULL; } + if (G_UNLIKELY (stream->pending_header_data)) { + g_free (stream->pending_header_data); + stream->pending_header_data = NULL; + } stream->state = PENDING_PACKET_HEADER; } else { - GST_WARNING ("CONTINUITY: Mismatch packet %d, stream %d", - cc, stream->continuity_counter); + GST_ELEMENT_WARNING_WITH_DETAILS (demux, STREAM, DEMUX, + ("CONTINUITY: Mismatch packet %d, stream %d (pid 0x%04x)", cc, + stream->continuity_counter, stream->stream.pid), (NULL), + ("warning-type", G_TYPE_STRING, "continuity-mismatch", + "packet", G_TYPE_INT, cc, + "stream", G_TYPE_INT, stream->continuity_counter, + 
"pid", G_TYPE_UINT, stream->stream.pid, NULL)); stream->state = PENDING_PACKET_DISCONT; } } @@ -2481,9 +2756,9 @@ if (stream->state == PENDING_PACKET_EMPTY) { if (G_UNLIKELY (!packet->payload_unit_start_indicator)) { stream->state = PENDING_PACKET_DISCONT; - GST_DEBUG ("Didn't get the first packet of this PES"); + GST_DEBUG_OBJECT (demux, "Didn't get the first packet of this PES"); } else { - GST_LOG ("EMPTY=>HEADER"); + GST_LOG_OBJECT (demux, "EMPTY=>HEADER"); stream->state = PENDING_PACKET_HEADER; } } @@ -2491,7 +2766,7 @@ switch (stream->state) { case PENDING_PACKET_HEADER: { - GST_LOG ("HEADER: Parsing PES header"); + GST_LOG_OBJECT (demux, "HEADER: Parsing PES header"); /* parse the header */ gst_ts_demux_parse_pes_header (demux, stream, data, size, packet->offset); @@ -2499,9 +2774,9 @@ } case PENDING_PACKET_BUFFER: { - GST_LOG ("BUFFER: appending data"); + GST_LOG_OBJECT (demux, "BUFFER: appending data"); if (G_UNLIKELY (stream->current_size + size > stream->allocated_size)) { - GST_LOG ("resizing buffer"); + GST_LOG_OBJECT (demux, "resizing buffer"); do { stream->allocated_size = MAX (8192, 2 * stream->allocated_size); } while (stream->current_size + size > stream->allocated_size); @@ -2513,11 +2788,15 @@ } case PENDING_PACKET_DISCONT: { - GST_LOG ("DISCONT: not storing/pushing"); + GST_LOG_OBJECT (demux, "DISCONT: not storing/pushing"); if (G_UNLIKELY (stream->data)) { g_free (stream->data); stream->data = NULL; } + if (G_UNLIKELY (stream->pending_header_data)) { + g_free (stream->pending_header_data); + stream->pending_header_data = NULL; + } stream->continuity_counter = CONTINUITY_UNSET; break; } @@ -2537,14 +2816,18 @@ GstClockTime firstts = 0; GList *tmp; - GST_DEBUG ("Creating new newsegment for stream %p", stream); + GST_DEBUG_OBJECT (demux, "Creating new newsegment for stream %p", stream); if (target_program == NULL) target_program = demux->program; /* Speedup : if we don't need to calculate anything, go straight to pushing */ - if 
(demux->segment_event) + g_mutex_lock (&demux->lock); + if (demux->segment_event) { + g_mutex_unlock (&demux->lock); goto push_new_segment; + } + g_mutex_unlock (&demux->lock); /* Calculate the 'new_start' value, used for newsegment */ for (tmp = target_program->stream_list; tmp; tmp = tmp->next) { @@ -2558,12 +2841,12 @@ } if (GST_CLOCK_TIME_IS_VALID (lowest_pts)) firstts = lowest_pts; - GST_DEBUG ("lowest_pts %" G_GUINT64_FORMAT " => clocktime %" GST_TIME_FORMAT, - lowest_pts, GST_TIME_ARGS (firstts)); + GST_DEBUG_OBJECT (demux, "lowest_pts %" G_GUINT64_FORMAT " => clocktime %" + GST_TIME_FORMAT, lowest_pts, GST_TIME_ARGS (firstts)); if (base->out_segment.format != GST_FORMAT_TIME || demux->reset_segment) { /* It will happen only if it's first program or after flushes. */ - GST_DEBUG ("Calculating actual segment"); + GST_DEBUG_OBJECT (demux, "Calculating actual segment"); if (base->segment.format == GST_FORMAT_TIME) { /* Try to recover segment info from base if it's in TIME format */ base->out_segment = base->segment; @@ -2595,12 +2878,15 @@ GST_LOG_OBJECT (demux, "Output segment now %" GST_SEGMENT_FORMAT, &base->out_segment); + g_mutex_lock (&demux->lock); if (!demux->segment_event) { - demux->segment_event = gst_event_new_segment (&base->out_segment); + gst_event_take (&demux->segment_event, + gst_event_new_segment (&base->out_segment)); if (base->last_seek_seqnum != GST_SEQNUM_INVALID) gst_event_set_seqnum (demux->segment_event, base->last_seek_seqnum); } + g_mutex_unlock (&demux->lock); push_new_segment: for (tmp = target_program->stream_list; tmp; tmp = tmp->next) { @@ -2608,11 +2894,15 @@ if (stream->pad == NULL) continue; + g_mutex_lock (&demux->lock); if (demux->segment_event) { + GstEvent *evt = gst_event_ref (demux->segment_event); GST_DEBUG_OBJECT (stream->pad, "Pushing newsegment event"); - gst_event_ref (demux->segment_event); - gst_pad_push_event (stream->pad, demux->segment_event); + g_mutex_unlock (&demux->lock); + gst_pad_push_event (stream->pad, 
evt); + } else { + g_mutex_unlock (&demux->lock); } if (demux->global_tags) { @@ -2630,6 +2920,11 @@ stream->need_newsegment = FALSE; } + if (base->seek_event) { + g_assert (base->out_segment.format != GST_FORMAT_UNDEFINED); + gst_ts_demux_do_seek (base, base->seek_event); + gst_event_replace (&base->seek_event, NULL); + } } static void @@ -2654,9 +2949,10 @@ * This means we can detect buffers passing without PTSes fine and still generate * gaps. * - * If there haven't been any buffers pushed on this stream since the last - * gap check, push a gap event updating to the indicated input PCR time - * and update the pad's tracking. + * If there haven't been any buffers pushed on this stream since the last gap + * check *AND* there is no pending data (stream->current_size), push a gap + * event updating to the indicated input PCR time and update the pad's + * tracking. * * If there have been buffers pushed, update the reference buffer count * and but don't push a gap event @@ -2665,15 +2961,16 @@ TSDemuxStream *ps = (TSDemuxStream *) tmp->data; GST_DEBUG_OBJECT (ps->pad, "0x%04x, PTS:%" GST_TIME_FORMAT " REFPTS:%" GST_TIME_FORMAT " Gap:%" - GST_TIME_FORMAT " nb_buffers: %d (ref:%d)", + GST_TIME_FORMAT " nb_buffers: %d (ref:%d) pending_data size %u", ((MpegTSBaseStream *) ps)->pid, GST_TIME_ARGS (ps->pts), GST_TIME_ARGS (ps->gap_ref_pts), GST_TIME_ARGS (ps->pts - ps->gap_ref_pts), ps->nb_out_buffers, - ps->gap_ref_buffers); + ps->gap_ref_buffers, ps->current_size); if (ps->pad == NULL) continue; - if (ps->nb_out_buffers == ps->gap_ref_buffers && ps->gap_ref_pts != ps->pts) { + if (ps->nb_out_buffers == ps->gap_ref_buffers && ps->current_size == 0 + && ps->gap_ref_pts != ps->pts) { /* Do initial setup of pad if needed - segment etc */ GST_DEBUG_OBJECT (ps->pad, "Stream needs update. 
Pushing GAP event to TS %" GST_TIME_FORMAT, @@ -3035,17 +3332,17 @@ bs->stream_type, stream->state); if (G_UNLIKELY (stream->data == NULL)) { - GST_LOG ("stream->data == NULL"); + GST_LOG_OBJECT (stream->pad, "stream->data == NULL"); goto beach; } if (G_UNLIKELY (stream->state == PENDING_PACKET_EMPTY)) { - GST_LOG ("EMPTY: returning"); + GST_LOG_OBJECT (stream->pad, "EMPTY: returning"); goto beach; } if (G_UNLIKELY (stream->state != PENDING_PACKET_BUFFER)) { - GST_LOG ("state:%d, returning", stream->state); + GST_LOG_OBJECT (stream->pad, "state:%d, returning", stream->state); goto beach; } @@ -3169,7 +3466,8 @@ } gst_buffer_list_unref (buffer_list); } - GST_DEBUG ("Not enough information to push buffers yet, storing buffer"); + GST_DEBUG_OBJECT (demux, + "Not enough information to push buffers yet, storing buffer"); goto beach; } } @@ -3292,7 +3590,8 @@ beach: /* Reset the PES payload collection, but don't clear the state, * we might want to keep collecting this PES */ - GST_LOG ("Cleared PES data. returning %s", gst_flow_get_name (res)); + GST_LOG_OBJECT (demux, "Cleared PES data. 
returning %s", + gst_flow_get_name (res)); if (stream->expected_size) { if (stream->current_size > stream->expected_size) stream->expected_size = 0; @@ -3312,8 +3611,9 @@ { GstFlowReturn res = GST_FLOW_OK; - GST_LOG ("pid 0x%04x pusi:%d, afc:%d, cont:%d, payload:%p", packet->pid, - packet->payload_unit_start_indicator, packet->scram_afc_cc & 0x30, + GST_LOG_OBJECT (demux, "pid 0x%04x pusi:%d, afc:%d, cont:%d, payload:%p", + packet->pid, packet->payload_unit_start_indicator, + packet->scram_afc_cc & 0x30, FLAGS_CONTINUITY_COUNTER (packet->scram_afc_cc), packet->payload); if (G_UNLIKELY (packet->payload_unit_start_indicator) && @@ -3330,13 +3630,13 @@ if (packet->payload && (res == GST_FLOW_OK || res == GST_FLOW_NOT_LINKED) && stream->pad) { gst_ts_demux_queue_data (demux, stream, packet); - GST_LOG ("current_size:%d, expected_size:%d", + GST_LOG_OBJECT (demux, "current_size:%d, expected_size:%d", stream->current_size, stream->expected_size); /* Finally check if the data we queued completes a packet, or got too * large and needs output now */ if ((stream->expected_size && stream->current_size >= stream->expected_size) || (stream->current_size >= MAX_PES_PAYLOAD)) { - GST_LOG ("pushing packet of size %u", stream->current_size); + GST_LOG_OBJECT (demux, "pushing packet of size %u", stream->current_size); res = gst_ts_demux_push_pending_data (demux, stream, NULL); } } @@ -3357,10 +3657,9 @@ gst_ts_demux_flush_streams (demux, hard); - if (demux->segment_event) { - gst_event_unref (demux->segment_event); - demux->segment_event = NULL; - } + g_mutex_lock (&demux->lock); + gst_event_replace (&demux->segment_event, NULL); + g_mutex_unlock (&demux->lock); if (demux->global_tags) { gst_tag_list_unref (demux->global_tags); demux->global_tags = NULL; @@ -3411,14 +3710,3 @@ } return res; } - -gboolean -gst_ts_demux_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (ts_demux_debug, "tsdemux", 0, - "MPEG transport stream demuxer"); - init_pes_parser (); - - return 
gst_element_register (plugin, "tsdemux", - GST_RANK_PRIMARY, GST_TYPE_TS_DEMUX); -}
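The pending_header_data handling introduced in the tsdemux.c diff above copes with PES headers that span more than one 188-byte transport packet: a PES_PARSING_NEED_MORE result stashes the bytes seen so far, and the next packet's payload is appended before re-parsing. A minimal, GStreamer-free sketch of that accumulation follows; DemoStream, accumulate_header and clear_pending_header are illustrative stand-ins, with realloc/memcpy standing in for g_realloc and g_memdup2:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Illustrative stand-in for the demuxer stream's new fields */
typedef struct {
  unsigned char *pending_header_data;   /* bytes stashed so far, or NULL */
  size_t pending_header_size;
} DemoStream;

/* Append one packet's payload to whatever header bytes are already
 * pending, mirroring the realloc/memcpy path the diff adds before
 * mpegts_parse_pes_header() is retried. Returns the accumulated size. */
static size_t
accumulate_header (DemoStream *stream, const unsigned char *data, size_t len)
{
  stream->pending_header_data =
      realloc (stream->pending_header_data, stream->pending_header_size + len);
  memcpy (stream->pending_header_data + stream->pending_header_size, data, len);
  stream->pending_header_size += len;
  return stream->pending_header_size;
}

/* Once the header finally parses, or on a discontinuity, drop the stash */
static void
clear_pending_header (DemoStream *stream)
{
  free (stream->pending_header_data);
  stream->pending_header_data = NULL;
  stream->pending_header_size = 0;
}
```

This also explains why the diff frees pending_header_data in several PENDING_PACKET_* reset branches: every path that abandons the current PES must drop the stash, or stale header bytes would be prepended to the next packet.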
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsdemux/tsdemux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsdemux/tsdemux.h
Changed
@@ -76,6 +76,7 @@
   gint requested_program_number; /* Required program number (ignore:-1) */
   guint program_number;
   gboolean emit_statistics;
+  gboolean send_scte35_events;
   gint latency; /* latency in ms */
   /*< private >*/
@@ -101,6 +102,14 @@
   /* Used when seeking for a keyframe to go backward in the stream */
   guint64 last_seek_offset;
+
+  /* The current difference between PES PTSs and our output running times,
+   * in the MPEG time domain. This is used for potentially updating
+   * SCTE 35 sections' pts_adjustment further down the line (eg mpegtsmux) */
+  guint64 mpeg_pts_offset;
+
+  /* This is to protect demux->segment_event */
+  GMutex lock;
 };

 struct _GstTSDemuxClass
@@ -109,8 +118,7 @@
 };

 G_GNUC_INTERNAL GType gst_ts_demux_get_type (void);
-
-G_GNUC_INTERNAL gboolean gst_ts_demux_plugin_init (GstPlugin * plugin);
+GST_ELEMENT_REGISTER_DECLARE (tsdemux);

 G_END_DECLS

 #endif /* GST_TS_DEMUX_H */
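The mpeg_pts_offset field added to the header above is filled in tsdemux.c as the difference between a PTS's output running time (converted back to 90 kHz MPEG units) and the raw PTS, masked to 33 bits because MPEG-TS timestamps wrap at 2^33. A self-contained sketch of just that arithmetic, with an illustrative function name:

```c
#include <assert.h>
#include <stdint.h>

/* MPEG-TS PTS/DTS values are 33-bit counters in 90 kHz units, so the
 * offset kept in mpeg_pts_offset must wrap the same way. */
#define MPEG_TS_MASK 0x1ffffffffULL   /* 2^33 - 1 */

static uint64_t
compute_mpeg_pts_offset (uint64_t running_time_90khz, uint64_t raw_pts)
{
  /* unsigned subtraction, then reduce into the 33-bit timestamp domain,
   * matching the "& 0x1ffffffff" in the tsdemux.c diff */
  return (running_time_90khz - raw_pts) & MPEG_TS_MASK;
}
```

As the field's comment says, a downstream muxer such as mpegtsmux can then fold this offset into a SCTE-35 section's pts_adjustment so splice times stay aligned with the retimestamped output.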
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstatscmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstatscmux.c
Changed
@@ -17,7 +17,12 @@
  * License along with this library; if not, write to the
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
+ *
+ * SPDX-License-Identifier: LGPL-2.0-or-later
  */
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
 #include "gstatscmux.h"
@@ -25,6 +30,8 @@
 #define GST_CAT_DEFAULT gst_atsc_mux_debug
 G_DEFINE_TYPE (GstATSCMux, gst_atsc_mux, GST_TYPE_BASE_TS_MUX);
+GST_ELEMENT_REGISTER_DEFINE (atscmux, "atscmux", GST_RANK_PRIMARY,
+    gst_atsc_mux_get_type ());
 #define parent_class gst_atsc_mux_parent_class
 #define ATSCMUX_ST_PS_AUDIO_EAC3 0x87
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstatscmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstatscmux.h
Changed
@@ -17,6 +17,8 @@
  * License along with this library; if not, write to the
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
+ *
+ * SPDX-License-Identifier: LGPL-2.0-or-later
  */
 #ifndef __ATSCMUX_H__
@@ -40,6 +42,7 @@
 };

 GType gst_atsc_mux_get_type (void);
+GST_ELEMENT_REGISTER_DECLARE (atscmux);

 G_END_DECLS
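The GST_ELEMENT_REGISTER_DECLARE/DEFINE pairs appearing in these diffs are the 1.20 per-element registration pattern that replaces the per-plugin *_plugin_init() functions the 1.18 sources used (the deleted gst_ts_demux_plugin_init above is the same migration). Roughly, GST_ELEMENT_REGISTER_DEFINE (atscmux, "atscmux", GST_RANK_PRIMARY, type) generates a gst_element_register_atscmux() wrapper around gst_element_register(). The GStreamer-free sketch below only mirrors the shape of that generated wrapper; DemoRegistry and all names are illustrative stand-ins, and the real macros live in GStreamer's gstelement.h:

```c
#include <assert.h>
#include <stdbool.h>
#include <string.h>

/* Stand-in for the plugin registry that gst_element_register() updates */
typedef struct {
  const char *name;
  int rank;
} DemoRegistry;

/* Stand-in for gst_element_register (plugin, name, rank, type) */
static bool
demo_element_register (DemoRegistry *reg, const char *name, int rank)
{
  reg->name = name;
  reg->rank = rank;
  return true;
}

/* What GST_ELEMENT_REGISTER_DEFINE (atscmux, ...) conceptually generates:
 * a one-call wrapper callers reach via GST_ELEMENT_REGISTER (atscmux, ...) */
static bool
gst_element_register_atscmux_demo (DemoRegistry *reg)
{
  return demo_element_register (reg, "atscmux", 256 /* GST_RANK_PRIMARY */);
}
```

The matching GST_ELEMENT_REGISTER_DECLARE in the header simply exposes the wrapper's prototype, which is why both the .c and .h diffs change together.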
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmux.c
Changed
@@ -6,10 +6,9 @@ * * Copyright (C) 2011 Jan Schmidt <thaytan@noraisin.net> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -40,22 +39,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -80,7 +63,11 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. 
* + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif #include <stdio.h> #include <string.h> @@ -142,6 +129,7 @@ /* Send initial segments again after a flush-stop, and also resend the * header sections */ + g_mutex_lock (&mux->lock); mux->first = TRUE; /* output PAT, SI tables */ @@ -154,6 +142,7 @@ tsmux_resend_pmt (program); } + g_mutex_unlock (&mux->lock); return GST_FLOW_OK; } @@ -210,6 +199,7 @@ #define CLOCK_BASE 9LL #define CLOCK_FREQ (CLOCK_BASE * 10000) /* 90 kHz PTS clock */ #define CLOCK_FREQ_SCR (CLOCK_FREQ * 300) /* 27 MHz SCR clock */ +#define TS_MUX_CLOCK_BASE (TSMUX_CLOCK_FREQ * 10 * 360) #define GSTTIME_TO_MPEGTIME(time) \ (((time) > 0 ? (gint64) 1 : (gint64) -1) * \ @@ -236,7 +226,8 @@ GstBuffer *buffer; } StreamData; -G_DEFINE_TYPE (GstBaseTsMux, gst_base_ts_mux, GST_TYPE_AGGREGATOR); +G_DEFINE_TYPE_WITH_CODE (GstBaseTsMux, gst_base_ts_mux, GST_TYPE_AGGREGATOR, + gst_mpegts_initialize ()); /* Internals */ @@ -272,9 +263,13 @@ GValue value = { 0 }; GstCaps *caps; - caps = - gst_caps_make_writable (gst_pad_get_current_caps (GST_AGGREGATOR_SRC_PAD - (mux))); + caps = gst_pad_get_current_caps (GST_AGGREGATOR_SRC_PAD (mux)); + + /* If we have no caps, we are possibly shutting down */ + if (!caps) + return; + + caps = gst_caps_make_writable (caps); structure = gst_caps_get_structure (caps, 0); g_value_init (&array, GST_TYPE_ARRAY); @@ -304,6 +299,7 @@ return TRUE; } +/* Must be called with mux->lock held */ static void gst_base_ts_mux_reset (GstBaseTsMux * mux, gboolean alloc) { @@ -314,7 +310,7 @@ mux->first = TRUE; mux->last_flow_ret = GST_FLOW_OK; - mux->last_ts = 0; + mux->last_ts = GST_CLOCK_TIME_NONE; mux->is_delta = TRUE; mux->is_header = FALSE; @@ -324,6 +320,7 @@ if (mux->out_adapter) gst_adapter_clear (mux->out_adapter); + mux->output_ts_offset = GST_CLOCK_STIME_NONE; if (mux->tsmux) { if (mux->tsmux->si_sections) @@ -366,6 +363,8 @@ if (si_sections) g_hash_table_unref 
(si_sections); + mux->last_scte35_event_seqnum = GST_SEQNUM_INVALID; + if (klass->reset) klass->reset (mux); } @@ -376,13 +375,12 @@ stream_data_free ((StreamData *) user_data); } +/* Must be called with mux->lock held */ static GstFlowReturn -gst_base_ts_mux_create_stream (GstBaseTsMux * mux, GstBaseTsMuxPad * ts_pad) +gst_base_ts_mux_create_or_update_stream (GstBaseTsMux * mux, + GstBaseTsMuxPad * ts_pad, GstCaps * caps) { - GstFlowReturn ret = GST_FLOW_ERROR; - GstCaps *caps; GstStructure *s; - GstPad *pad; guint st = TSMUX_ST_RESERVED; const gchar *mt; const GValue *value = NULL; @@ -392,16 +390,13 @@ guint8 main_level = 0; guint32 max_rate = 0; guint8 color_spec = 0; - j2k_private_data *private_data = NULL; const gchar *stream_format = NULL; + const char *interlace_mode = NULL; + gchar *pmt_name; - pad = GST_PAD (ts_pad); - caps = gst_pad_get_current_caps (pad); - if (caps == NULL) - goto not_negotiated; - - GST_DEBUG_OBJECT (pad, "Creating stream with PID 0x%04x for caps %" - GST_PTR_FORMAT, ts_pad->pid, caps); + GST_DEBUG_OBJECT (ts_pad, + "%s stream with PID 0x%04x for caps %" GST_PTR_FORMAT, + ts_pad->stream ? "Recreating" : "Creating", ts_pad->pid, caps); s = gst_caps_get_structure (caps, 0); @@ -410,6 +405,9 @@ if (value != NULL) codec_data = gst_value_get_buffer (value); + g_clear_pointer (&ts_pad->codec_data, gst_buffer_unref); + ts_pad->prepare_func = NULL; + stream_format = gst_structure_get_string (s, "stream-format"); if (strcmp (mt, "video/x-dirac") == 0) { @@ -428,7 +426,7 @@ gint mpegversion; if (!gst_structure_get_int (s, "mpegversion", &mpegversion)) { - GST_ERROR_OBJECT (pad, "caps missing mpegversion"); + GST_ERROR_OBJECT (ts_pad, "caps missing mpegversion"); goto not_negotiated; } @@ -467,7 +465,7 @@ /* Check the stream format. 
We need codec_data with RAW streams and mpegversion=4 */ if (g_strcmp0 (stream_format, "raw") == 0) { if (codec_data) { - GST_DEBUG_OBJECT (pad, + GST_DEBUG_OBJECT (ts_pad, "we have additional codec data (%" G_GSIZE_FORMAT " bytes)", gst_buffer_get_size (codec_data)); ts_pad->codec_data = gst_buffer_ref (codec_data); @@ -477,18 +475,22 @@ GST_ERROR_OBJECT (mux, "Need codec_data for raw MPEG-4 AAC"); goto not_negotiated; } + } else if (codec_data) { + ts_pad->codec_data = gst_buffer_ref (codec_data); + } else { + ts_pad->codec_data = NULL; } break; } default: - GST_WARNING_OBJECT (pad, "unsupported mpegversion %d", mpegversion); + GST_WARNING_OBJECT (ts_pad, "unsupported mpegversion %d", mpegversion); goto not_negotiated; } } else if (strcmp (mt, "video/mpeg") == 0) { gint mpegversion; if (!gst_structure_get_int (s, "mpegversion", &mpegversion)) { - GST_ERROR_OBJECT (pad, "caps missing mpegversion"); + GST_ERROR_OBJECT (ts_pad, "caps missing mpegversion"); goto not_negotiated; } @@ -503,7 +505,7 @@ st = TSMUX_ST_VIDEO_MPEG4; break; default: - GST_WARNING_OBJECT (pad, "unsupported mpegversion %d", mpegversion); + GST_WARNING_OBJECT (ts_pad, "unsupported mpegversion %d", mpegversion); goto not_negotiated; } } else if (strcmp (mt, "subpicture/x-dvb") == 0) { @@ -518,7 +520,7 @@ if (!gst_codec_utils_opus_parse_caps (caps, NULL, &channels, &mapping_family, &stream_count, &coupled_count, channel_mapping)) { - GST_ERROR_OBJECT (pad, "Incomplete Opus caps"); + GST_ERROR_OBJECT (ts_pad, "Incomplete Opus caps"); goto not_negotiated; } @@ -565,7 +567,7 @@ channels) == 0) { opus_channel_config_code = channels | 0x80; } else { - GST_FIXME_OBJECT (pad, "Opus channel mapping not handled"); + GST_FIXME_OBJECT (ts_pad, "Opus channel mapping not handled"); goto not_negotiated; } } @@ -587,21 +589,22 @@ const GValue *vMainlevel = gst_structure_get_value (s, "main-level"); const GValue *vFramerate = gst_structure_get_value (s, "framerate"); const GValue *vColorimetry = 
gst_structure_get_value (s, "colorimetry"); - private_data = g_new0 (j2k_private_data, 1); + j2k_private_data *private_data; + /* for now, we relax the condition that profile must exist and equal * GST_JPEG2000_PARSE_PROFILE_BC_SINGLE */ if (vProfile) { profile = g_value_get_int (vProfile); if (profile != GST_JPEG2000_PARSE_PROFILE_BC_SINGLE) { - GST_LOG_OBJECT (pad, "Invalid JPEG 2000 profile %d", profile); - /*goto not_negotiated; */ + GST_LOG_OBJECT (ts_pad, "Invalid JPEG 2000 profile %d", profile); + /* goto not_negotiated; */ } } /* for now, we will relax the condition that the main level must be present */ if (vMainlevel) { main_level = g_value_get_uint (vMainlevel); if (main_level > 11) { - GST_ERROR_OBJECT (pad, "Invalid main level %d", main_level); + GST_ERROR_OBJECT (ts_pad, "Invalid main level %d", main_level); goto not_negotiated; } if (main_level >= 6) { @@ -625,10 +628,12 @@ } } } else { - /*GST_ERROR_OBJECT (pad, "Missing main level"); - goto not_negotiated; */ + /* GST_ERROR_OBJECT (ts_pad, "Missing main level"); + * goto not_negotiated; */ } + /* We always mux video in J2K-over-MPEG-TS non-interlaced mode */ + private_data = g_new0 (j2k_private_data, 1); private_data->interlace = FALSE; private_data->den = 0; private_data->num = 0; @@ -658,7 +663,8 @@ } private_data->color_spec = color_spec; } else { - GST_ERROR_OBJECT (pad, "Colorimetry not present in caps"); + GST_ERROR_OBJECT (ts_pad, "Colorimetry not present in caps"); + g_free (private_data); goto not_negotiated; } st = TSMUX_ST_VIDEO_JP2K; @@ -673,64 +679,106 @@ } } + if (st == TSMUX_ST_RESERVED) { + GST_ERROR_OBJECT (ts_pad, "Failed to determine stream type"); + goto error; + } - if (st != TSMUX_ST_RESERVED) { - ts_pad->stream = tsmux_create_stream (mux->tsmux, st, ts_pad->pid, - ts_pad->language); - } else { - GST_DEBUG_OBJECT (pad, "Failed to determine stream type"); + if (ts_pad->stream && st != ts_pad->stream->stream_type) { + GST_ELEMENT_ERROR (mux, STREAM, MUX, + ("Stream type change 
from %02x to %02x not supported", + ts_pad->stream->stream_type, st), NULL); + goto error; } - if (ts_pad->stream != NULL) { - const char *interlace_mode = gst_structure_get_string (s, "interlace-mode"); - gst_structure_get_int (s, "rate", &ts_pad->stream->audio_sampling); - gst_structure_get_int (s, "channels", &ts_pad->stream->audio_channels); - gst_structure_get_int (s, "bitrate", &ts_pad->stream->audio_bitrate); + if (ts_pad->stream == NULL) { + ts_pad->stream = + tsmux_create_stream (mux->tsmux, st, ts_pad->pid, ts_pad->language); + if (ts_pad->stream == NULL) + goto error; + } - /* frame rate */ - gst_structure_get_fraction (s, "framerate", &ts_pad->stream->num, - &ts_pad->stream->den); + pmt_name = g_strdup_printf ("PMT_%d", ts_pad->pid); + if (mux->prog_map && gst_structure_has_field (mux->prog_map, pmt_name)) { + gst_structure_get_int (mux->prog_map, pmt_name, &ts_pad->stream->pmt_index); + } + g_free (pmt_name); + + interlace_mode = gst_structure_get_string (s, "interlace-mode"); + gst_structure_get_int (s, "rate", &ts_pad->stream->audio_sampling); + gst_structure_get_int (s, "channels", &ts_pad->stream->audio_channels); + gst_structure_get_int (s, "bitrate", &ts_pad->stream->audio_bitrate); + + /* frame rate */ + gst_structure_get_fraction (s, "framerate", &ts_pad->stream->num, + &ts_pad->stream->den); + + /* Interlace mode */ + ts_pad->stream->interlace_mode = FALSE; + if (interlace_mode) { + ts_pad->stream->interlace_mode = + g_str_equal (interlace_mode, "interleaved"); + } - /* Interlace mode */ - ts_pad->stream->interlace_mode = FALSE; - if (interlace_mode) { - ts_pad->stream->interlace_mode = - g_str_equal (interlace_mode, "interleaved"); - } - /* Width and Height */ - gst_structure_get_int (s, "width", &ts_pad->stream->horizontal_size); - gst_structure_get_int (s, "height", &ts_pad->stream->vertical_size); + /* Width and Height */ + gst_structure_get_int (s, "width", &ts_pad->stream->horizontal_size); + gst_structure_get_int (s, "height", 
&ts_pad->stream->vertical_size); - ts_pad->stream->color_spec = color_spec; - ts_pad->stream->max_bitrate = max_rate; - ts_pad->stream->profile_and_level = profile | main_level; + ts_pad->stream->color_spec = color_spec; + ts_pad->stream->max_bitrate = max_rate; + ts_pad->stream->profile_and_level = profile | main_level; - ts_pad->stream->opus_channel_config_code = opus_channel_config_code; + ts_pad->stream->opus_channel_config_code = opus_channel_config_code; - tsmux_stream_set_buffer_release_func (ts_pad->stream, release_buffer_cb); - tsmux_program_add_stream (ts_pad->prog, ts_pad->stream); + tsmux_stream_set_buffer_release_func (ts_pad->stream, release_buffer_cb); + + return GST_FLOW_OK; - ret = GST_FLOW_OK; - } - gst_caps_unref (caps); - return ret; /* ERRORS */ not_negotiated: - { - g_free (private_data); - GST_DEBUG_OBJECT (pad, "Sink pad caps were not set before pushing"); - if (caps) - gst_caps_unref (caps); + return GST_FLOW_NOT_NEGOTIATED; + +error: + return GST_FLOW_ERROR; +} + +static gboolean +is_valid_pmt_pid (guint16 pmt_pid) +{ + if (pmt_pid < 0x0010 || pmt_pid > 0x1ffe) + return FALSE; + return TRUE; +} + +/* Must be called with mux->lock held */ +static GstFlowReturn +gst_base_ts_mux_create_stream (GstBaseTsMux * mux, GstBaseTsMuxPad * ts_pad) +{ + GstCaps *caps = gst_pad_get_current_caps (GST_PAD (ts_pad)); + GstFlowReturn ret; + + if (caps == NULL) { + GST_DEBUG_OBJECT (ts_pad, "Sink pad caps were not set before pushing"); return GST_FLOW_NOT_NEGOTIATED; } + + ret = gst_base_ts_mux_create_or_update_stream (mux, ts_pad, caps); + gst_caps_unref (caps); + + if (ret == GST_FLOW_OK) { + tsmux_program_add_stream (ts_pad->prog, ts_pad->stream); + } + + return ret; } +/* Must be called with mux->lock held */ static GstFlowReturn gst_base_ts_mux_create_pad_stream (GstBaseTsMux * mux, GstPad * pad) { GstBaseTsMuxPad *ts_pad = GST_BASE_TS_MUX_PAD (pad); gchar *name = NULL; - gchar *pcr_name; + gchar *prop_name; GstFlowReturn ret = GST_FLOW_OK; if 
(ts_pad->prog_id == -1) { @@ -767,6 +815,25 @@ tsmux_program_set_scte35_interval (ts_pad->prog, mux->scte35_null_interval); g_hash_table_insert (mux->programs, GINT_TO_POINTER (ts_pad->prog_id), ts_pad->prog); + + /* Check for user-specified PMT PID */ + prop_name = g_strdup_printf ("PMT_%d", ts_pad->prog->pgm_number); + if (mux->prog_map && gst_structure_has_field (mux->prog_map, prop_name)) { + guint pmt_pid; + + if (gst_structure_get_uint (mux->prog_map, prop_name, &pmt_pid)) { + if (is_valid_pmt_pid (pmt_pid)) { + GST_DEBUG_OBJECT (mux, "User specified pid=%u as PMT for " + "program (prog_id = %d)", pmt_pid, ts_pad->prog->pgm_number); + tsmux_program_set_pmt_pid (ts_pad->prog, pmt_pid); + } else { + GST_ELEMENT_WARNING (mux, LIBRARY, SETTINGS, + ("User specified PMT pid %u for program %d is not valid.", + pmt_pid, ts_pad->prog->pgm_number), (NULL)); + } + } + } + g_free (prop_name); } if (ts_pad->stream == NULL) { @@ -785,9 +852,10 @@ } /* Check for user-specified PCR PID */ - pcr_name = g_strdup_printf ("PCR_%d", ts_pad->prog->pgm_number); - if (mux->prog_map && gst_structure_has_field (mux->prog_map, pcr_name)) { - const gchar *sink_name = gst_structure_get_string (mux->prog_map, pcr_name); + prop_name = g_strdup_printf ("PCR_%d", ts_pad->prog->pgm_number); + if (mux->prog_map && gst_structure_has_field (mux->prog_map, prop_name)) { + const gchar *sink_name = + gst_structure_get_string (mux->prog_map, prop_name); if (!g_strcmp0 (name, sink_name)) { GST_DEBUG_OBJECT (mux, "User specified stream (pid=%d) as PCR for " @@ -795,7 +863,7 @@ tsmux_program_set_pcr_stream (ts_pad->prog, ts_pad->stream); } } - g_free (pcr_name); + g_free (prop_name); return ret; @@ -814,6 +882,7 @@ } } +/* Must be called with mux->lock held */ static gboolean gst_base_ts_mux_create_pad_stream_func (GstElement * element, GstPad * pad, gpointer user_data) @@ -825,6 +894,7 @@ return *ret == GST_FLOW_OK; } +/* Must be called with mux->lock held */ static GstFlowReturn 
gst_base_ts_mux_create_streams (GstBaseTsMux * mux) { @@ -1048,15 +1118,44 @@ new_packet_cb (GstBuffer * buf, void *user_data, gint64 new_pcr) { GstBaseTsMux *mux = (GstBaseTsMux *) user_data; + GstAggregator *agg = GST_AGGREGATOR (mux); GstBaseTsMuxClass *klass = GST_BASE_TS_MUX_GET_CLASS (mux); GstMapInfo map; + GstSegment *agg_segment = &GST_AGGREGATOR_PAD (agg->srcpad)->segment; g_assert (klass->output_packet); gst_buffer_map (buf, &map, GST_MAP_READWRITE); - if (!GST_CLOCK_TIME_IS_VALID (GST_BUFFER_PTS (buf))) + if (!GST_CLOCK_TIME_IS_VALID (GST_BUFFER_PTS (buf))) { + /* tsmux isn't generating timestamps. Use the input times */ GST_BUFFER_PTS (buf) = mux->last_ts; + } + + if (GST_CLOCK_TIME_IS_VALID (GST_BUFFER_PTS (buf))) { + if (!GST_CLOCK_STIME_IS_VALID (mux->output_ts_offset)) { + GstClockTime output_start_time = agg_segment->position; + if (agg_segment->position == -1 + || agg_segment->position < agg_segment->start) { + output_start_time = agg_segment->start; + } + + mux->output_ts_offset = + GST_CLOCK_DIFF (GST_BUFFER_PTS (buf), output_start_time); + + GST_DEBUG_OBJECT (mux, "New output ts offset %" GST_STIME_FORMAT, + GST_STIME_ARGS (mux->output_ts_offset)); + } + + GST_BUFFER_PTS (buf) += mux->output_ts_offset; + + agg_segment->position = GST_BUFFER_PTS (buf); + } else if (agg_segment->position == -1 + || agg_segment->position < agg_segment->start) { + GST_BUFFER_PTS (buf) = agg_segment->start; + } else { + GST_BUFFER_PTS (buf) = agg_segment->position; + } /* do common init (flags and streamheaders) */ new_packet_common_init (mux, buf, map.data, map.size); @@ -1099,11 +1198,13 @@ return GST_FLOW_OK; } + g_mutex_lock (&mux->lock); if (G_UNLIKELY (mux->first)) { ret = gst_base_ts_mux_create_streams (mux); if (G_UNLIKELY (ret != GST_FLOW_OK)) { if (buf) gst_buffer_unref (buf); + g_mutex_unlock (&mux->lock); return ret; } @@ -1142,6 +1243,7 @@ if (mux->force_key_unit_event != NULL && best->stream->is_video_stream) { GstEvent *event; + g_mutex_unlock 
(&mux->lock); event = check_pending_key_unit_event (mux->force_key_unit_event, &agg_pad->segment, GST_BUFFER_PTS (buf), GST_BUFFER_FLAGS (buf), mux->pending_key_unit_ts); @@ -1161,6 +1263,7 @@ GST_TIME_ARGS (running_time), count); gst_pad_push_event (GST_AGGREGATOR_SRC_PAD (mux), event); + g_mutex_lock (&mux->lock); /* output PAT, SI tables */ tsmux_resend_pat (mux->tsmux); tsmux_resend_si (mux->tsmux); @@ -1171,6 +1274,8 @@ tsmux_resend_pmt (program); } + } else { + g_mutex_lock (&mux->lock); } } @@ -1199,7 +1304,9 @@ if (GST_CLOCK_TIME_IS_VALID (GST_BUFFER_PTS (buf))) { pts = GSTTIME_TO_MPEGTIME (GST_BUFFER_PTS (buf)); GST_DEBUG_OBJECT (mux, "Buffer has PTS %" GST_TIME_FORMAT " pts %" - G_GINT64_FORMAT, GST_TIME_ARGS (GST_BUFFER_PTS (buf)), pts); + G_GINT64_FORMAT "%s", GST_TIME_ARGS (GST_BUFFER_PTS (buf)), pts, + !GST_BUFFER_FLAG_IS_SET (buf, + GST_BUFFER_FLAG_DELTA_UNIT) ? " (keyframe)" : ""); } if (GST_CLOCK_STIME_IS_VALID (best->dts)) { @@ -1223,14 +1330,17 @@ GST_WARNING_OBJECT (mux, "KLV meta unit too big, splitting not supported"); gst_buffer_unref (buf); + g_mutex_unlock (&mux->lock); return GST_FLOW_OK; } GST_DEBUG_OBJECT (mux, "delta: %d", delta); - stream_data = stream_data_new (buf); - tsmux_stream_add_data (best->stream, stream_data->map_info.data, - stream_data->map_info.size, stream_data, pts, dts, !delta); + if (gst_buffer_get_size (buf) > 0) { + stream_data = stream_data_new (buf); + tsmux_stream_add_data (best->stream, stream_data->map_info.data, + stream_data->map_info.size, stream_data, pts, dts, !delta); + } /* outgoing ts follows ts of PCR program stream */ if (prog->pcr_stream == best->stream) { @@ -1252,6 +1362,7 @@ goto write_fail; } } + g_mutex_unlock (&mux->lock); /* flush packet cache */ return gst_base_ts_mux_push_packets (mux, FALSE); @@ -1293,9 +1404,12 @@ GstPad *pad = NULL; gchar *free_name = NULL; + g_mutex_lock (&mux->lock); if (name != NULL && sscanf (name, "sink_%d", &pid) == 1) { - if (tsmux_find_stream (mux->tsmux, pid)) + 
if (tsmux_find_stream (mux->tsmux, pid)) { + g_mutex_unlock (&mux->lock); goto stream_exists; + } /* Make sure we don't use reserved PID. * FIXME : This should be extended to other variants (ex: ATSC) reserved PID */ if (pid < TSMUX_START_ES_PID) @@ -1308,6 +1422,7 @@ /* Name the pad correctly after the selected pid */ name = free_name = g_strdup_printf ("sink_%d", pid); } + g_mutex_unlock (&mux->lock); pad = (GstPad *) GST_ELEMENT_CLASS (parent_class)->request_new_pad (element, @@ -1341,6 +1456,7 @@ { GstBaseTsMux *mux = GST_BASE_TS_MUX (element); + g_mutex_lock (&mux->lock); if (mux->tsmux) { GList *cur; GstBaseTsMuxPad *ts_pad = GST_BASE_TS_MUX_PAD (pad); @@ -1365,10 +1481,411 @@ tsmux_resend_pmt (program); } } + g_mutex_unlock (&mux->lock); GST_ELEMENT_CLASS (parent_class)->release_pad (element, pad); } +/* GstAggregator implementation */ + +static void +request_keyframe (GstBaseTsMux * mux, GstClockTime running_time) +{ + GList *l; + GST_OBJECT_LOCK (mux); + + for (l = GST_ELEMENT_CAST (mux)->sinkpads; l; l = l->next) { + gst_pad_push_event (GST_PAD (l->data), + gst_video_event_new_upstream_force_key_unit (running_time, TRUE, 0)); + } + + GST_OBJECT_UNLOCK (mux); +} + +static const guint32 crc_tab[256] = { + 0x00000000, 0x04c11db7, 0x09823b6e, 0x0d4326d9, 0x130476dc, 0x17c56b6b, + 0x1a864db2, 0x1e475005, 0x2608edb8, 0x22c9f00f, 0x2f8ad6d6, 0x2b4bcb61, + 0x350c9b64, 0x31cd86d3, 0x3c8ea00a, 0x384fbdbd, 0x4c11db70, 0x48d0c6c7, + 0x4593e01e, 0x4152fda9, 0x5f15adac, 0x5bd4b01b, 0x569796c2, 0x52568b75, + 0x6a1936c8, 0x6ed82b7f, 0x639b0da6, 0x675a1011, 0x791d4014, 0x7ddc5da3, + 0x709f7b7a, 0x745e66cd, 0x9823b6e0, 0x9ce2ab57, 0x91a18d8e, 0x95609039, + 0x8b27c03c, 0x8fe6dd8b, 0x82a5fb52, 0x8664e6e5, 0xbe2b5b58, 0xbaea46ef, + 0xb7a96036, 0xb3687d81, 0xad2f2d84, 0xa9ee3033, 0xa4ad16ea, 0xa06c0b5d, + 0xd4326d90, 0xd0f37027, 0xddb056fe, 0xd9714b49, 0xc7361b4c, 0xc3f706fb, + 0xceb42022, 0xca753d95, 0xf23a8028, 0xf6fb9d9f, 0xfbb8bb46, 0xff79a6f1, + 0xe13ef6f4, 0xe5ffeb43, 
0xe8bccd9a, 0xec7dd02d, 0x34867077, 0x30476dc0, + 0x3d044b19, 0x39c556ae, 0x278206ab, 0x23431b1c, 0x2e003dc5, 0x2ac12072, + 0x128e9dcf, 0x164f8078, 0x1b0ca6a1, 0x1fcdbb16, 0x018aeb13, 0x054bf6a4, + 0x0808d07d, 0x0cc9cdca, 0x7897ab07, 0x7c56b6b0, 0x71159069, 0x75d48dde, + 0x6b93dddb, 0x6f52c06c, 0x6211e6b5, 0x66d0fb02, 0x5e9f46bf, 0x5a5e5b08, + 0x571d7dd1, 0x53dc6066, 0x4d9b3063, 0x495a2dd4, 0x44190b0d, 0x40d816ba, + 0xaca5c697, 0xa864db20, 0xa527fdf9, 0xa1e6e04e, 0xbfa1b04b, 0xbb60adfc, + 0xb6238b25, 0xb2e29692, 0x8aad2b2f, 0x8e6c3698, 0x832f1041, 0x87ee0df6, + 0x99a95df3, 0x9d684044, 0x902b669d, 0x94ea7b2a, 0xe0b41de7, 0xe4750050, + 0xe9362689, 0xedf73b3e, 0xf3b06b3b, 0xf771768c, 0xfa325055, 0xfef34de2, + 0xc6bcf05f, 0xc27dede8, 0xcf3ecb31, 0xcbffd686, 0xd5b88683, 0xd1799b34, + 0xdc3abded, 0xd8fba05a, 0x690ce0ee, 0x6dcdfd59, 0x608edb80, 0x644fc637, + 0x7a089632, 0x7ec98b85, 0x738aad5c, 0x774bb0eb, 0x4f040d56, 0x4bc510e1, + 0x46863638, 0x42472b8f, 0x5c007b8a, 0x58c1663d, 0x558240e4, 0x51435d53, + 0x251d3b9e, 0x21dc2629, 0x2c9f00f0, 0x285e1d47, 0x36194d42, 0x32d850f5, + 0x3f9b762c, 0x3b5a6b9b, 0x0315d626, 0x07d4cb91, 0x0a97ed48, 0x0e56f0ff, + 0x1011a0fa, 0x14d0bd4d, 0x19939b94, 0x1d528623, 0xf12f560e, 0xf5ee4bb9, + 0xf8ad6d60, 0xfc6c70d7, 0xe22b20d2, 0xe6ea3d65, 0xeba91bbc, 0xef68060b, + 0xd727bbb6, 0xd3e6a601, 0xdea580d8, 0xda649d6f, 0xc423cd6a, 0xc0e2d0dd, + 0xcda1f604, 0xc960ebb3, 0xbd3e8d7e, 0xb9ff90c9, 0xb4bcb610, 0xb07daba7, + 0xae3afba2, 0xaafbe615, 0xa7b8c0cc, 0xa379dd7b, 0x9b3660c6, 0x9ff77d71, + 0x92b45ba8, 0x9675461f, 0x8832161a, 0x8cf30bad, 0x81b02d74, 0x857130c3, + 0x5d8a9099, 0x594b8d2e, 0x5408abf7, 0x50c9b640, 0x4e8ee645, 0x4a4ffbf2, + 0x470cdd2b, 0x43cdc09c, 0x7b827d21, 0x7f436096, 0x7200464f, 0x76c15bf8, + 0x68860bfd, 0x6c47164a, 0x61043093, 0x65c52d24, 0x119b4be9, 0x155a565e, + 0x18197087, 0x1cd86d30, 0x029f3d35, 0x065e2082, 0x0b1d065b, 0x0fdc1bec, + 0x3793a651, 0x3352bbe6, 0x3e119d3f, 0x3ad08088, 0x2497d08d, 0x2056cd3a, + 0x2d15ebe3, 0x29d4f654, 
0xc5a92679, 0xc1683bce, 0xcc2b1d17, 0xc8ea00a0, + 0xd6ad50a5, 0xd26c4d12, 0xdf2f6bcb, 0xdbee767c, 0xe3a1cbc1, 0xe760d676, + 0xea23f0af, 0xeee2ed18, 0xf0a5bd1d, 0xf464a0aa, 0xf9278673, 0xfde69bc4, + 0x89b8fd09, 0x8d79e0be, 0x803ac667, 0x84fbdbd0, 0x9abc8bd5, 0x9e7d9662, + 0x933eb0bb, 0x97ffad0c, 0xafb010b1, 0xab710d06, 0xa6322bdf, 0xa2f33668, + 0xbcb4666d, 0xb8757bda, 0xb5365d03, 0xb1f740b4 +}; + +static guint32 +_calc_crc32 (const guint8 * data, guint datalen) +{ + gint i; + guint32 crc = 0xffffffff; + + for (i = 0; i < datalen; i++) { + crc = (crc << 8) ^ crc_tab[((crc >> 24) ^ *data++) & 0xff]; + } + return crc; +} + +#define MPEGTIME_TO_GSTTIME(t) ((t) * (guint64)100000 / 9) + +static GstMpegtsSCTESpliceEvent * +copy_splice (GstMpegtsSCTESpliceEvent * splice) +{ + return g_boxed_copy (GST_TYPE_MPEGTS_SCTE_SPLICE_EVENT, splice); +} + +static void +free_splice (GstMpegtsSCTESpliceEvent * splice) +{ + g_boxed_free (GST_TYPE_MPEGTS_SCTE_SPLICE_EVENT, splice); +} + +/* FIXME: get rid of this when depending on glib >= 2.62 */ + +static GPtrArray * +_g_ptr_array_copy (GPtrArray * array, + GCopyFunc func, GFreeFunc free_func, gpointer user_data) +{ + GPtrArray *new_array; + + g_return_val_if_fail (array != NULL, NULL); + + new_array = g_ptr_array_new_with_free_func (free_func); + + g_ptr_array_set_size (new_array, array->len); + + if (func != NULL) { + guint i; + + for (i = 0; i < array->len; i++) + new_array->pdata[i] = func (array->pdata[i], user_data); + } else if (array->len > 0) { + memcpy (new_array->pdata, array->pdata, + array->len * sizeof (*array->pdata)); + } + + new_array->len = array->len; + + return new_array; +} + +static GstMpegtsSCTESIT * +deep_copy_sit (const GstMpegtsSCTESIT * sit) +{ + GstMpegtsSCTESIT *sit_copy = g_boxed_copy (GST_TYPE_MPEGTS_SCTE_SIT, sit); + GPtrArray *splices_copy = + _g_ptr_array_copy (sit_copy->splices, (GCopyFunc) copy_splice, + (GFreeFunc) free_splice, NULL); + + g_ptr_array_unref (sit_copy->splices); + sit_copy->splices = 
splices_copy; + + return sit_copy; +} + +/* Takes ownership of @section. + * + * This function is a bit complex because the SCTE sections can + * have various origins: + * + * * Sections created by the application with the gst_mpegts_scte_*_new() + * API. The splice times / durations contained by these are expressed + * in the GStreamer running time domain, and must be translated to + * our local PES time domain. In this case, we will packetize the section + * ourselves. + * + * * Sections passed through from tsdemux: this case is complicated as + * splice times in the incoming stream may be encrypted, with pts_adjustment + * being the only timing field guaranteed *not* to be encrypted. In this + * case, the original binary data (section->data) will be reinjected as is + * in the output stream, with pts_adjustment adjusted. tsdemux provides us + * with the pts_offset it introduces, the difference between the original + * PES PTSs and the running times it outputs. + * + * Additionally, in either of these cases when the splice times aren't encrypted + * we want to make use of those to request keyframes. For the passthrough case, + * as the splice times are left untouched tsdemux provides us with the running + * times the section originally referred to. We cannot calculate it locally + * because we would need to have access to the information that the timestamps + * in the original PES domain have wrapped around, and how many times they have + * done so. While we could probably make educated guesses, tsdemux (more specifically + * mpegtspacketizer) already keeps track of that, and it seemed more logical to + * perform the calculation there and forward it alongside the downstream events. 
+ * + * Finally, while we can't request keyframes at splice points in the encrypted + * case, if the input stream was compliant in that regard and no reencoding took + * place the splice times will still match with valid splice points, it is up + * to the application to ensure that that is the case. + */ +static void +handle_scte35_section (GstBaseTsMux * mux, GstEvent * event, + GstMpegtsSection * section, guint64 mpeg_pts_offset, + GstStructure * rtime_map) +{ + GstMpegtsSCTESIT *sit; + guint i; + gboolean forward = TRUE; + guint64 pts_adjust; + guint8 *section_data; + guint8 *crc; + gboolean translate = FALSE; + + sit = (GstMpegtsSCTESIT *) gst_mpegts_section_get_scte_sit (section); + + /* When the application injects manually constructed splice events, + * their time domain is the GStreamer running time, we receive them + * unpacketized and translate the fields in the SIT to local PTS. + * + * We make a copy of the SIT in order to make sure we can rewrite it. + */ + if (sit->is_running_time) { + sit = deep_copy_sit (sit); + translate = TRUE; + } + + switch (sit->splice_command_type) { + case GST_MTS_SCTE_SPLICE_COMMAND_NULL: + /* We implement heartbeating ourselves */ + forward = FALSE; + break; + case GST_MTS_SCTE_SPLICE_COMMAND_SCHEDULE: + /* No need to request keyframes at this point, splice_insert + * messages will precede the future splice points and we + * can request keyframes then. Only translate if needed. 
+ */ + if (translate) { + for (i = 0; i < sit->splices->len; i++) { + GstMpegtsSCTESpliceEvent *sevent = + g_ptr_array_index (sit->splices, i); + + if (sevent->program_splice_time_specified) + sevent->program_splice_time = + GSTTIME_TO_MPEGTIME (sevent->program_splice_time) + + TS_MUX_CLOCK_BASE; + + if (sevent->duration_flag) + sevent->break_duration = + GSTTIME_TO_MPEGTIME (sevent->break_duration); + } + } + break; + case GST_MTS_SCTE_SPLICE_COMMAND_INSERT: + /* We want keyframes at splice points */ + if (sit->fully_parsed && (rtime_map || translate)) { + + for (i = 0; i < sit->splices->len; i++) { + guint64 running_time = GST_CLOCK_TIME_NONE; + + GstMpegtsSCTESpliceEvent *sevent = + g_ptr_array_index (sit->splices, i); + if (sevent->program_splice_time_specified) { + if (rtime_map) { + gchar *field_name = g_strdup_printf ("event-%u-splice-time", + sevent->splice_event_id); + if (gst_structure_get_uint64 (rtime_map, field_name, + &running_time)) { + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for splice point at %" GST_TIME_FORMAT, + GST_TIME_ARGS (running_time)); + request_keyframe (mux, running_time); + } + g_free (field_name); + } else { + g_assert (translate == TRUE); + running_time = sevent->program_splice_time; + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for splice point at %" GST_TIME_FORMAT, + GST_TIME_ARGS (running_time)); + request_keyframe (mux, running_time); + sevent->program_splice_time = + GSTTIME_TO_MPEGTIME (running_time) + TS_MUX_CLOCK_BASE; + } + } else { + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for immediate splice point"); + request_keyframe (mux, GST_CLOCK_TIME_NONE); + } + + if (sevent->duration_flag) { + if (translate) { + sevent->break_duration = + GSTTIME_TO_MPEGTIME (sevent->break_duration); + } + + /* Even if auto_return is FALSE, when a break_duration is specified it + * is intended as a redundancy mechanism in case the follow-up + * splice insert goes missing. 
+ * + * Schedule a keyframe at that point (if we can calculate its position + * accurately). + */ + if (GST_CLOCK_TIME_IS_VALID (running_time)) { + running_time += MPEGTIME_TO_GSTTIME (sevent->break_duration); + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for end of break at %" GST_TIME_FORMAT, + GST_TIME_ARGS (running_time)); + request_keyframe (mux, running_time); + } + } + } + } + break; + case GST_MTS_SCTE_SPLICE_COMMAND_TIME:{ + /* Adjust timestamps and potentially request keyframes */ + gboolean do_request_keyframes = FALSE; + + /* TODO: we can probably be a little more fine-tuned about determining + * whether a keyframe is actually needed, but this at least takes care + * of the requirement in 10.3.4 that a keyframe should not be created + * when the signal contains only a time_descriptor. + */ + if (sit->fully_parsed && (rtime_map || translate)) { + for (i = 0; i < sit->descriptors->len; i++) { + GstMpegtsDescriptor *descriptor = + g_ptr_array_index (sit->descriptors, i); + + switch (descriptor->tag) { + case GST_MTS_SCTE_DESC_AVAIL: + case GST_MTS_SCTE_DESC_DTMF: + case GST_MTS_SCTE_DESC_SEGMENTATION: + do_request_keyframes = TRUE; + break; + case GST_MTS_SCTE_DESC_TIME: + case GST_MTS_SCTE_DESC_AUDIO: + break; + } + + if (do_request_keyframes) + break; + } + + if (sit->splice_time_specified) { + GstClockTime running_time = GST_CLOCK_TIME_NONE; + + if (rtime_map) { + if (do_request_keyframes + && gst_structure_get_uint64 (rtime_map, "splice-time", + &running_time)) { + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for time signal at %" GST_TIME_FORMAT, + GST_TIME_ARGS (running_time)); + request_keyframe (mux, running_time); + } + } else { + g_assert (translate); + running_time = sit->splice_time; + sit->splice_time = + GSTTIME_TO_MPEGTIME (running_time) + TS_MUX_CLOCK_BASE; + if (do_request_keyframes) { + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for time signal at %" GST_TIME_FORMAT, + GST_TIME_ARGS (running_time)); + request_keyframe (mux, 
running_time); + } + } + } else if (do_request_keyframes) { + GST_DEBUG_OBJECT (mux, + "Requesting keyframe for immediate time signal"); + request_keyframe (mux, GST_CLOCK_TIME_NONE); + } + } + break; + } + case GST_MTS_SCTE_SPLICE_COMMAND_BANDWIDTH: + case GST_MTS_SCTE_SPLICE_COMMAND_PRIVATE: + /* Just let those go through untouched, none of our business */ + break; + default: + break; + } + + if (!forward) { + gst_mpegts_section_unref (section); + return; + } + + if (!translate) { + g_assert (section->data); + /* Calculate the final adjustment, as a sum of: + * - The adjustment in the original packet + * - The offset introduced between the original local PTS + * and the GStreamer PTS output by tsdemux + * - Our own 1-hour offset + */ + pts_adjust = sit->pts_adjustment + mpeg_pts_offset + TS_MUX_CLOCK_BASE; + + /* Account for offsets potentially introduced between the demuxer and us */ + pts_adjust += + GSTTIME_TO_MPEGTIME (gst_event_get_running_time_offset (event)); + + pts_adjust &= 0x1ffffffff; + section_data = g_memdup2 (section->data, section->section_length); + section_data[4] |= pts_adjust >> 32; + section_data[5] = pts_adjust >> 24; + section_data[6] = pts_adjust >> 16; + section_data[7] = pts_adjust >> 8; + section_data[8] = pts_adjust; + + /* Now rewrite our checksum */ + crc = section_data + section->section_length - 4; + GST_WRITE_UINT32_BE (crc, _calc_crc32 (section_data, crc - section_data)); + + GST_OBJECT_LOCK (mux); + GST_DEBUG_OBJECT (mux, "Storing SCTE section"); + if (mux->pending_scte35_section) + gst_mpegts_section_unref (mux->pending_scte35_section); + mux->pending_scte35_section = + gst_mpegts_section_new (mux->scte35_pid, section_data, + section->section_length); + GST_OBJECT_UNLOCK (mux); + + gst_mpegts_section_unref (section); + } else { + GST_OBJECT_LOCK (mux); + GST_DEBUG_OBJECT (mux, "Storing SCTE section"); + gst_mpegts_section_unref (section); + if (mux->pending_scte35_section) + gst_mpegts_section_unref 
(mux->pending_scte35_section); + mux->pending_scte35_section = + gst_mpegts_section_from_scte_sit (sit, mux->scte35_pid);; + GST_OBJECT_UNLOCK (mux); + } +} + static gboolean gst_base_ts_mux_send_event (GstElement * element, GstEvent * event) { @@ -1381,16 +1898,12 @@ GST_DEBUG ("Received event with mpegts section"); if (section->section_type == GST_MPEGTS_SECTION_SCTE_SIT) { - /* Will be sent from the streaming threads */ - GST_DEBUG_OBJECT (mux, "Storing SCTE event"); - GST_OBJECT_LOCK (element); - if (mux->pending_scte35_section) - gst_mpegts_section_unref (mux->pending_scte35_section); - mux->pending_scte35_section = section; - GST_OBJECT_UNLOCK (element); + handle_scte35_section (mux, event, section, 0, NULL); } else { + g_mutex_lock (&mux->lock); /* TODO: Check that the section type is supported */ tsmux_add_mpegts_si_section (mux->tsmux, section); + g_mutex_unlock (&mux->lock); } gst_event_unref (event); @@ -1414,11 +1927,95 @@ gboolean forward = TRUE; switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_CAPS: + { + GstCaps *caps; + GstFlowReturn ret; + GList *cur; + + g_mutex_lock (&mux->lock); + if (ts_pad->stream == NULL) { + g_mutex_unlock (&mux->lock); + break; + } + + forward = FALSE; + + gst_event_parse_caps (event, &caps); + if (!caps || !gst_caps_is_fixed (caps)) { + g_mutex_unlock (&mux->lock); + break; + } + + ret = gst_base_ts_mux_create_or_update_stream (mux, ts_pad, caps); + if (ret != GST_FLOW_OK) { + g_mutex_unlock (&mux->lock); + break; + } + + mux->tsmux->pat_changed = TRUE; + mux->tsmux->si_changed = TRUE; + tsmux_resend_pat (mux->tsmux); + tsmux_resend_si (mux->tsmux); + + /* output PMT for each program */ + for (cur = mux->tsmux->programs; cur; cur = cur->next) { + TsMuxProgram *program = (TsMuxProgram *) cur->data; + + program->pmt_changed = TRUE; + tsmux_resend_pmt (program); + } + g_mutex_unlock (&mux->lock); + + res = TRUE; + break; + } case GST_EVENT_CUSTOM_DOWNSTREAM: { GstClockTime timestamp, stream_time, running_time; gboolean 
all_headers; guint count; + const GstStructure *s; + + s = gst_event_get_structure (event); + + if (gst_structure_has_name (s, "scte-sit") && mux->scte35_pid != 0) { + + /* When operating downstream of tsdemux, tsdemux will send out events + * on all its source pads for each splice table it encounters. If we + * are remuxing multiple streams it has demuxed, this means we could + * unnecessarily repeat the same table multiple times, we avoid that + * by deduplicating thanks to the event seqnum + */ + if (gst_event_get_seqnum (event) != mux->last_scte35_event_seqnum) { + GstMpegtsSection *section; + + gst_structure_get (s, "section", GST_TYPE_MPEGTS_SECTION, &section, + NULL); + if (section) { + guint64 mpeg_pts_offset = 0; + GstStructure *rtime_map = NULL; + + gst_structure_get (s, "running-time-map", GST_TYPE_STRUCTURE, + &rtime_map, NULL); + gst_structure_get_uint64 (s, "mpeg-pts-offset", &mpeg_pts_offset); + + handle_scte35_section (mux, event, section, mpeg_pts_offset, + rtime_map); + if (rtime_map) + gst_structure_free (rtime_map); + mux->last_scte35_event_seqnum = gst_event_get_seqnum (event); + } else { + GST_WARNING_OBJECT (ts_pad, + "Ignoring scte-sit event without a section"); + } + } else { + GST_DEBUG_OBJECT (ts_pad, "Ignoring duplicate scte-sit event"); + } + res = TRUE; + forward = FALSE; + goto out; + } if (!gst_video_event_is_force_key_unit (event)) goto out; @@ -1742,6 +2339,10 @@ GstBuffer *buffer; buffer = gst_aggregator_pad_pop_buffer (GST_AGGREGATOR_PAD (best)); + if (!buffer) { + /* We might have gotten a flush event after we picked the pad */ + goto done; + } ret = gst_base_ts_mux_aggregate_buffer (GST_BASE_TS_MUX (agg),
gboolean gst_base_ts_mux_stop (GstAggregator * agg) { + GstBaseTsMux *mux = GST_BASE_TS_MUX (agg); + + g_mutex_lock (&mux->lock); gst_base_ts_mux_reset (GST_BASE_TS_MUX (agg), TRUE); + g_mutex_unlock (&mux->lock); return TRUE; } @@ -1790,6 +2399,7 @@ { GstBaseTsMux *mux = GST_BASE_TS_MUX (object); + g_mutex_lock (&mux->lock); gst_base_ts_mux_reset (mux, FALSE); if (mux->out_adapter) { @@ -1804,16 +2414,28 @@ g_hash_table_destroy (mux->programs); mux->programs = NULL; } + g_mutex_unlock (&mux->lock); GST_CALL_PARENT (G_OBJECT_CLASS, dispose, (object)); } static void +gst_base_ts_mux_finalize (GObject * object) +{ + GstBaseTsMux *mux = GST_BASE_TS_MUX (object); + + g_mutex_clear (&mux->lock); + GST_CALL_PARENT (G_OBJECT_CLASS, finalize, (object)); +} + +static void gst_base_ts_mux_constructed (GObject * object) { GstBaseTsMux *mux = GST_BASE_TS_MUX (object); /* initial state */ + g_mutex_lock (&mux->lock); gst_base_ts_mux_reset (mux, TRUE); + g_mutex_unlock (&mux->lock); } static void @@ -1838,8 +2460,10 @@ } case PROP_PAT_INTERVAL: mux->pat_interval = g_value_get_uint (value); + g_mutex_lock (&mux->lock); if (mux->tsmux) tsmux_set_pat_interval (mux->tsmux, mux->pat_interval); + g_mutex_unlock (&mux->lock); break; case PROP_PMT_INTERVAL: mux->pmt_interval = g_value_get_uint (value); @@ -1847,7 +2471,9 @@ for (l = GST_ELEMENT_CAST (mux)->sinkpads; l; l = l->next) { GstBaseTsMuxPad *ts_pad = GST_BASE_TS_MUX_PAD (l->data); + g_mutex_lock (&mux->lock); tsmux_set_pmt_interval (ts_pad->prog, mux->pmt_interval); + g_mutex_unlock (&mux->lock); } GST_OBJECT_UNLOCK (mux); break; @@ -1856,17 +2482,23 @@ break; case PROP_SI_INTERVAL: mux->si_interval = g_value_get_uint (value); + g_mutex_lock (&mux->lock); tsmux_set_si_interval (mux->tsmux, mux->si_interval); + g_mutex_unlock (&mux->lock); break; case PROP_BITRATE: mux->bitrate = g_value_get_uint64 (value); + g_mutex_lock (&mux->lock); if (mux->tsmux) tsmux_set_bitrate (mux->tsmux, mux->bitrate); + g_mutex_unlock (&mux->lock); 
break; case PROP_PCR_INTERVAL: mux->pcr_interval = g_value_get_uint (value); + g_mutex_lock (&mux->lock); if (mux->tsmux) tsmux_set_pcr_interval (mux->tsmux, mux->pcr_interval); + g_mutex_unlock (&mux->lock); break; case PROP_SCTE_35_PID: mux->scte35_pid = g_value_get_uint (value); @@ -1990,6 +2622,7 @@ gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_base_ts_mux_get_property); gobject_class->dispose = gst_base_ts_mux_dispose; + gobject_class->finalize = gst_base_ts_mux_finalize; gobject_class->constructed = gst_base_ts_mux_constructed; gstelement_class->request_new_pad = gst_base_ts_mux_request_new_pad; @@ -2090,4 +2723,6 @@ mux->packet_size = GST_BASE_TS_MUX_NORMAL_PACKET_LENGTH; mux->automatic_alignment = 0; + + g_mutex_init (&mux->lock); }
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmux.h
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __BASETSMUX_H__ @@ -176,6 +160,7 @@ guint pcr_interval; guint scte35_pid; guint scte35_null_interval; + guint32 last_scte35_event_seqnum; /* state */ gboolean first; @@ -197,6 +182,10 @@ /* output buffer aggregation */ GstAdapter *out_adapter; GstBuffer *out_buffer; + GstClockTimeDiff output_ts_offset; + + /* protects the tsmux object, the programs hash table, and pad streams */ + GMutex lock; }; /**
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxaac.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxaac.c
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifdef HAVE_CONFIG_H
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxaac.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxaac.h
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __BASETSMUX_AAC_H__
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxjpeg2000.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxjpeg2000.c
Changed
@@ -20,6 +20,8 @@
  * License along with this library; if not, write to the
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
+ *
+ * SPDX-License-Identifier: LGPL-2.0-or-later
  */
 
 #ifdef HAVE_CONFIG_H
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxjpeg2000.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxjpeg2000.h
Changed
@@ -20,6 +20,8 @@
  * License along with this library; if not, write to the
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
+ *
+ * SPDX-License-Identifier: LGPL-2.0-or-later
  */
 
 #ifndef __BASETSMUX_JPEG2000_H__
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxopus.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxopus.c
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifdef HAVE_CONFIG_H
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxopus.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxopus.h
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __BASETSMUX_OPUS_H__
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxttxt.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxttxt.c
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifdef HAVE_CONFIG_H
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstbasetsmuxttxt.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstbasetsmuxttxt.h
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __BASETSMUX_TTXT_H__
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstmpegtsmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstmpegtsmux.c
Changed
@@ -6,10 +6,9 @@
  *
  * Copyright (C) 2011 Jan Schmidt <thaytan@noraisin.net>
  *
- * This library is licensed under 4 different licenses and you
+ * This library is licensed under 3 different licenses and you
  * can choose to use it under the terms of any one of them. The
- * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT
- * license.
+ * three licenses are the MPL 1.1, the LGPL and the MIT license.
  *
  * MPL:
  *
@@ -40,22 +39,6 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  *
- * GPL:
- *
- * This program is free software; you can redistribute it and/or modify
- * it under the terms of the GNU General Public License as published by
- * the Free Software Foundation; either version 2 of the License, or
- * (at your option) any later version.
- *
- * This program is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- * GNU General Public License for more details.
- *
- * You should have received a copy of the GNU General Public License
- * along with this program; if not, write to the Free Software
- * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
- *
  * MIT:
  *
  * Unless otherwise indicated, Source Code is licensed under MIT license.
@@ -80,6 +63,7 @@
  * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
  * SOFTWARE.
  *
+ * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later
  */
 
 /**
@@ -92,6 +76,9 @@
  *
  * {{ tests/examples/mpegts/ts-section-writer.c }}
  */
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
 
 #include "gstmpegtsmux.h"
 
 #include <string.h>
@@ -115,7 +102,7 @@
         "mpegversion = (int) { 1, 2, 4 }, "
         "systemstream = (boolean) false; "
         "video/x-dirac;"
-        "image/x-jpc;"
+        "image/x-jpc, alignment = (string) frame;"
         "video/x-h264,stream-format=(string)byte-stream,"
         "alignment=(string){au, nal}; "
         "video/x-h265,stream-format=(string)byte-stream,"
@@ -139,7 +126,7 @@
         "channels = (int) [1, 8], "
         "channel-mapping-family = (int) {0, 1};"
         "subpicture/x-dvb; application/x-teletext; meta/x-klv, parsed=true;"
-        "image/x-jpc, profile = (int)[0, 49151];"));
+        "image/x-jpc, alignment = (string) frame, profile = (int)[0, 49151];"));
 
 static GstStaticPadTemplate gst_mpeg_ts_mux_src_factory =
 GST_STATIC_PAD_TEMPLATE ("src",
@@ -152,8 +139,10 @@
 GST_DEBUG_CATEGORY (gst_mpeg_ts_mux_debug);
 #define GST_CAT_DEFAULT gst_mpeg_ts_mux_debug
 
-G_DEFINE_TYPE (GstMpegTsMux, gst_mpeg_ts_mux, GST_TYPE_BASE_TS_MUX);
 #define parent_class gst_mpeg_ts_mux_parent_class
+G_DEFINE_TYPE (GstMpegTsMux, gst_mpeg_ts_mux, GST_TYPE_BASE_TS_MUX);
+GST_ELEMENT_REGISTER_DEFINE (mpegtsmux, "mpegtsmux", GST_RANK_PRIMARY,
+    gst_mpeg_ts_mux_get_type ());
 
 /* Internals */
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstmpegtsmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstmpegtsmux.h
Changed
@@ -4,10 +4,9 @@ * Kapil Agrawal <kapil@fluendo.com> * Julien Moutte <julien@fluendo.com> * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL and the MIT license. * * MPL: * @@ -38,22 +37,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -78,6 +61,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __MPEGTSMUX_H__ @@ -118,6 +102,7 @@ }; GType gst_mpeg_ts_mux_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (mpegtsmux); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/gstmpegtsmuxplugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/gstmpegtsmuxplugin.c
Changed
@@ -1,24 +1,21 @@
+/* SPDX-License-Identifier: LGPL-2.0-or-later */
+
 #ifdef HAVE_CONFIG_H
 #include "config.h"
 #endif
 
-#include "gstmpegtsmux.h"
 #include "gstatscmux.h"
+#include "gstmpegtsmux.h"
 
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  gst_mpegts_initialize ();
-
-  if (!gst_element_register (plugin, "mpegtsmux", GST_RANK_PRIMARY,
-          gst_mpeg_ts_mux_get_type ()))
-    return FALSE;
+  gboolean ret = FALSE;
 
-  if (!gst_element_register (plugin, "atscmux", GST_RANK_PRIMARY,
-          gst_atsc_mux_get_type ()))
-    return FALSE;
+  ret |= GST_ELEMENT_REGISTER (mpegtsmux, plugin);
+  ret |= GST_ELEMENT_REGISTER (atscmux, plugin);
 
-  return TRUE;
+  return ret;
 }
 
 GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/tsmux/tsmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/tsmux/tsmux.c
Changed
@@ -1,10 +1,9 @@ /* * Copyright 2006 BBC and Fluendo S.A. * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL, and the MIT license. * * MPL: * @@ -35,22 +34,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -75,6 +58,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifdef HAVE_CONFIG_H @@ -101,6 +85,10 @@ #define TSMUX_DEFAULT_NETWORK_ID 0x0001 #define TSMUX_DEFAULT_TS_ID 0x0001 +/* The last byte of the PCR in the header defines the byte position + * at which PCR should be calculated */ +#define PCR_BYTE_OFFSET 11 + /* HACK: We use a fixed buffering offset for the PCR at the moment - * this is the amount 'in advance' of the stream that the PCR sits. 
* 1/8 second atm */ @@ -113,6 +101,14 @@ static gboolean tsmux_write_pat (TsMux * mux); static gboolean tsmux_write_pmt (TsMux * mux, TsMuxProgram * program); static gboolean tsmux_write_scte_null (TsMux * mux, TsMuxProgram * program); +static gint64 get_next_pcr (TsMux * mux, gint64 cur_ts); +static gint64 get_current_pcr (TsMux * mux, gint64 cur_ts); +static gint64 write_new_pcr (TsMux * mux, TsMuxStream * stream, gint64 cur_pcr, + gint64 next_pcr); +static gboolean tsmux_write_ts_header (TsMux * mux, guint8 * buf, + TsMuxPacketInfo * pi, guint * payload_len_out, guint * payload_offset_out, + guint stream_avail); + static void tsmux_section_free (TsMuxSection * section) { @@ -460,7 +456,8 @@ program->scte35_null_interval = TSMUX_DEFAULT_SCTE_35_NULL_INTERVAL; program->next_scte35_pcr = -1; - program->streams = g_array_sized_new (FALSE, TRUE, sizeof (TsMuxStream *), 1); + /* mux->streams owns the streams */ + program->streams = g_ptr_array_new_full (1, NULL); mux->programs = g_list_prepend (mux->programs, program); mux->nb_programs++; @@ -612,12 +609,45 @@ void tsmux_program_add_stream (TsMuxProgram * program, TsMuxStream * stream) { + GPtrArray *streams; + guint i; + gint pmt_index, array_index = -1 /* append */ ; + guint16 pid; + g_return_if_fail (program != NULL); g_return_if_fail (stream != NULL); - stream->program_array_index = program->streams->len; + streams = program->streams; + pmt_index = stream->pmt_index; + pid = tsmux_stream_get_pid (stream); + + if (pmt_index >= 0) { + /* Insert into streams with known indices */ + for (i = 0; i < streams->len; i++) { + TsMuxStream *s = g_ptr_array_index (streams, i); + + if (s->pmt_index < 0 || pmt_index < s->pmt_index) { + array_index = i; + GST_DEBUG ("PID 0x%04x: Using known-order index %d/%u", + pid, array_index, streams->len); + break; + } + } + } else { + /* Insert after streams with known indices, sorted by PID */ + for (i = 0; i < streams->len; i++) { + TsMuxStream *s = g_ptr_array_index (streams, i); + + if 
(s->pmt_index < 0 && pid < tsmux_stream_get_pid (s)) { + array_index = i; + GST_DEBUG ("PID 0x%04x: Using PID-order index %d/%u", + pid, array_index, streams->len); + break; + } + } + } - g_array_append_val (program->streams, stream); + g_ptr_array_insert (streams, array_index, stream); program->pmt_changed = TRUE; } @@ -745,6 +775,20 @@ return found; } +static gboolean +tsmux_program_remove_stream (TsMuxProgram * program, TsMuxStream * stream) +{ + GPtrArray *streams = program->streams; + + if (!g_ptr_array_remove (streams, stream)) { + g_warn_if_reached (); + return FALSE; + } + + return streams->len == 0; +} + + gboolean tsmux_remove_stream (TsMux * mux, guint16 pid, TsMuxProgram * program) { @@ -757,20 +801,16 @@ TsMuxStream *stream = (TsMuxStream *) cur->data; if (tsmux_stream_get_pid (stream) == pid) { - if (program->streams->len == 1) { - tsmux_program_delete (mux, program); - ret = TRUE; - } else { - program->streams = - g_array_remove_index (program->streams, - stream->program_array_index); - } - + ret = tsmux_program_remove_stream (program, stream); mux->streams = g_list_remove (mux->streams, stream); tsmux_stream_free (stream); - return ret; + break; } } + + if (ret) + tsmux_program_delete (mux, program); + return ret; } @@ -800,13 +840,56 @@ return TRUE; } - if (mux->bitrate) + if (mux->bitrate) { GST_BUFFER_PTS (buf) = gst_util_uint64_scale (mux->n_bytes * 8, GST_SECOND, mux->bitrate); + /* Check and insert a PCR observation for each program if needed, + * but only for programs that have written their SI at least once, + * so the stream starts with PAT/PMT */ + if (mux->first_pcr_ts != G_MININT64) { + GList *cur; + + for (cur = mux->programs; cur; cur = cur->next) { + TsMuxProgram *program = (TsMuxProgram *) cur->data; + TsMuxStream *stream = program->pcr_stream; + gint64 cur_pcr, next_pcr, new_pcr; + + if (!program->wrote_si) + continue; + + cur_pcr = get_current_pcr (mux, 0); + next_pcr = get_next_pcr (mux, 0); + new_pcr = write_new_pcr (mux, stream, 
cur_pcr, next_pcr); + + if (new_pcr != -1) { + GstBuffer *buf = NULL; + GstMapInfo map; + guint payload_len, payload_offs; + + if (!tsmux_get_buffer (mux, &buf)) { + goto error; + } + + gst_buffer_map (buf, &map, GST_MAP_READ); + tsmux_write_ts_header (mux, map.data, &stream->pi, &payload_len, + &payload_offs, 0); + gst_buffer_unmap (buf, &map); + + stream->pi.flags &= TSMUX_PACKET_FLAG_PES_FULL_HEADER; + if (!tsmux_packet_out (mux, buf, new_pcr)) + goto error; + } + } + } + } + mux->n_bytes += gst_buffer_get_size (buf); return mux->write_func (buf, mux->write_func_data, pcr); + +error: + return FALSE; } /* @@ -1248,6 +1331,7 @@ return (ts - TSMUX_PCR_OFFSET) * (TSMUX_SYS_CLOCK_FREQ / TSMUX_CLOCK_FREQ); } +/* Calculate the PCR to write into the current packet */ static gint64 get_current_pcr (TsMux * mux, gint64 cur_ts) { @@ -1257,25 +1341,48 @@ if (mux->first_pcr_ts == G_MININT64) { g_assert (cur_ts != G_MININT64); mux->first_pcr_ts = cur_ts; + GST_DEBUG ("First PCR offset is %" G_GUINT64_FORMAT, cur_ts); } return ts_to_pcr (mux->first_pcr_ts) + - gst_util_uint64_scale (mux->n_bytes * 8, TSMUX_SYS_CLOCK_FREQ, - mux->bitrate); + gst_util_uint64_scale ((mux->n_bytes + PCR_BYTE_OFFSET) * 8, + TSMUX_SYS_CLOCK_FREQ, mux->bitrate); } +/* Predict the PCR at the next packet if possible */ static gint64 -write_new_pcr (TsMux * mux, TsMuxStream * stream, gint64 cur_pcr) +get_next_pcr (TsMux * mux, gint64 cur_ts) { - if (stream->next_pcr == -1 || cur_pcr > stream->next_pcr) { + if (!mux->bitrate) + return ts_to_pcr (cur_ts); + + if (mux->first_pcr_ts == G_MININT64) { + g_assert (cur_ts != G_MININT64); + mux->first_pcr_ts = cur_ts; + GST_DEBUG ("First PCR offset is %" G_GUINT64_FORMAT, cur_ts); + } + + return ts_to_pcr (mux->first_pcr_ts) + + gst_util_uint64_scale ((mux->n_bytes + TSMUX_PACKET_LENGTH + + PCR_BYTE_OFFSET) * 8, TSMUX_SYS_CLOCK_FREQ, mux->bitrate); +} + +static gint64 +write_new_pcr (TsMux * mux, TsMuxStream * stream, gint64 cur_pcr, + gint64 next_pcr) +{ + if 
(stream->next_pcr == -1 || next_pcr > stream->next_pcr) { stream->pi.flags |= TSMUX_PACKET_FLAG_ADAPTATION | TSMUX_PACKET_FLAG_WRITE_PCR; stream->pi.pcr = cur_pcr; - if (stream->next_pcr == -1) - stream->next_pcr = cur_pcr + mux->pcr_interval * 300; - else - stream->next_pcr += mux->pcr_interval * 300; + if (mux->bitrate && stream->next_pcr != -1 && cur_pcr >= stream->next_pcr) { + GST_WARNING ("Writing PCR %" G_GUINT64_FORMAT " missed the target %" + G_GUINT64_FORMAT " by %f ms", cur_pcr, stream->next_pcr, + (double) (cur_pcr - stream->next_pcr) / 27000.0); + } + /* Next PCR deadline is now plus the scheduled interval */ + stream->next_pcr = cur_pcr + mux->pcr_interval * 300; } else { cur_pcr = -1; } @@ -1289,48 +1396,48 @@ gboolean write_pat; gboolean write_si; GList *cur; - gint64 cur_pcr; + gint64 next_pcr; - cur_pcr = get_current_pcr (mux, cur_ts); + next_pcr = get_next_pcr (mux, cur_ts); /* check if we need to rewrite pat */ if (mux->next_pat_pcr == -1 || mux->pat_changed) write_pat = TRUE; - else if (cur_pcr > mux->next_pat_pcr) + else if (next_pcr > mux->next_pat_pcr) write_pat = TRUE; else write_pat = FALSE; if (write_pat) { if (mux->next_pat_pcr == -1) - mux->next_pat_pcr = cur_pcr + mux->pat_interval * 300; + mux->next_pat_pcr = next_pcr + mux->pat_interval * 300; else mux->next_pat_pcr += mux->pat_interval * 300; if (!tsmux_write_pat (mux)) return FALSE; - cur_pcr = get_current_pcr (mux, cur_ts); + next_pcr = get_next_pcr (mux, cur_ts); } /* check if we need to rewrite sit */ if (mux->next_si_pcr == -1 || mux->si_changed) write_si = TRUE; - else if (cur_pcr > mux->next_si_pcr) + else if (next_pcr > mux->next_si_pcr) write_si = TRUE; else write_si = FALSE; if (write_si) { if (mux->next_si_pcr == -1) - mux->next_si_pcr = cur_pcr + mux->si_interval * 300; + mux->next_si_pcr = next_pcr + mux->si_interval * 300; else mux->next_si_pcr += mux->si_interval * 300; if (!tsmux_write_si (mux)) return FALSE; - cur_pcr = get_current_pcr (mux, cur_ts); + next_pcr = 
get_current_pcr (mux, cur_ts); } /* check if we need to rewrite any of the current pmts */ @@ -1340,28 +1447,28 @@ if (program->next_pmt_pcr == -1 || program->pmt_changed) write_pmt = TRUE; - else if (cur_pcr > program->next_pmt_pcr) + else if (next_pcr > program->next_pmt_pcr) write_pmt = TRUE; else write_pmt = FALSE; if (write_pmt) { if (program->next_pmt_pcr == -1) - program->next_pmt_pcr = cur_pcr + program->pmt_interval * 300; + program->next_pmt_pcr = next_pcr + program->pmt_interval * 300; else program->next_pmt_pcr += program->pmt_interval * 300; if (!tsmux_write_pmt (mux, program)) return FALSE; - cur_pcr = get_current_pcr (mux, cur_ts); + next_pcr = get_current_pcr (mux, cur_ts); } if (program->scte35_pid != 0) { gboolean write_scte_null = FALSE; if (program->next_scte35_pcr == -1) write_scte_null = TRUE; - else if (cur_pcr > program->next_scte35_pcr) + else if (next_pcr > program->next_scte35_pcr) write_scte_null = TRUE; if (write_scte_null) { @@ -1369,7 +1476,7 @@ program->next_scte35_pcr); if (program->next_scte35_pcr == -1) program->next_scte35_pcr = - cur_pcr + program->scte35_null_interval * 300; + next_pcr + program->scte35_null_interval * 300; else program->next_scte35_pcr += program->scte35_null_interval * 300; GST_DEBUG ("next scte35 NOW pcr %" G_GINT64_FORMAT, @@ -1378,9 +1485,11 @@ if (!tsmux_write_scte_null (mux, program)) return FALSE; - cur_pcr = get_current_pcr (mux, cur_ts); + next_pcr = get_current_pcr (mux, cur_ts); } } + + program->wrote_si = TRUE; } return TRUE; @@ -1393,66 +1502,73 @@ GstBuffer *buf = NULL; GstMapInfo map; gboolean ret = TRUE; + GstClockTimeDiff diff; + guint64 start_n_bytes; if (!mux->bitrate) goto done; - do { - if (GST_CLOCK_STIME_IS_VALID (cur_ts)) { - GstClockTimeDiff diff; - - if (!GST_CLOCK_STIME_IS_VALID (stream->first_ts)) - stream->first_ts = cur_ts; - - diff = GST_CLOCK_DIFF (stream->first_ts, cur_ts); - - if (diff) { - bitrate = - gst_util_uint64_scale (mux->n_bytes * 8, TSMUX_CLOCK_FREQ, diff); - - 
GST_LOG ("Transport stream bitrate: %" G_GUINT64_FORMAT, bitrate); - - if (bitrate < mux->bitrate) { - gint64 new_pcr; - guint payload_len, payload_offs; + if (!GST_CLOCK_STIME_IS_VALID (cur_ts)) + goto done; - GST_LOG ("Padding transport stream"); + if (!GST_CLOCK_STIME_IS_VALID (stream->first_ts)) + stream->first_ts = cur_ts; - if (!rewrite_si (mux, cur_ts)) { - ret = FALSE; - goto done; - } + diff = GST_CLOCK_DIFF (stream->first_ts, cur_ts); + if (diff == 0) + goto done; - if (!tsmux_get_buffer (mux, &buf)) { - ret = FALSE; - goto done; - } + start_n_bytes = mux->n_bytes; + do { + GST_LOG ("Transport stream bitrate: %" G_GUINT64_FORMAT " over %" + G_GUINT64_FORMAT " bytes, duration %" GST_TIME_FORMAT, + gst_util_uint64_scale (mux->n_bytes * 8, TSMUX_CLOCK_FREQ, diff), + mux->n_bytes, GST_TIME_ARGS (diff * GST_SECOND / TSMUX_CLOCK_FREQ)); + + /* calculate what the overall bitrate will be if we add 1 more packet */ + bitrate = + gst_util_uint64_scale ((mux->n_bytes + TSMUX_PACKET_LENGTH) * 8, + TSMUX_CLOCK_FREQ, diff); + + if (bitrate <= mux->bitrate) { + gint64 new_pcr; + guint payload_len, payload_offs; + + if (!tsmux_get_buffer (mux, &buf)) { + ret = FALSE; + goto done; + } - gst_buffer_map (buf, &map, GST_MAP_READ); + gst_buffer_map (buf, &map, GST_MAP_READ); - if ((new_pcr = - write_new_pcr (mux, stream, get_current_pcr (mux, - cur_ts)) != -1)) - tsmux_write_ts_header (mux, map.data, &stream->pi, &payload_len, - &payload_offs, 0); - else - tsmux_write_null_ts_header (map.data); + if ((new_pcr = + write_new_pcr (mux, stream, get_current_pcr (mux, + cur_ts), get_next_pcr (mux, cur_ts)) != -1)) { + GST_LOG ("Writing PCR-only packet on PID 0x%04x", stream->pi.pid); + tsmux_write_ts_header (mux, map.data, &stream->pi, &payload_len, + &payload_offs, 0); + } else { + GST_LOG ("Writing null stuffing packet"); + if (!rewrite_si (mux, cur_ts)) { + ret = FALSE; + goto done; + } + tsmux_write_null_ts_header (map.data); + } - gst_buffer_unmap (buf, &map); + 
gst_buffer_unmap (buf, &map); - stream->pi.flags &= TSMUX_PACKET_FLAG_PES_FULL_HEADER; + stream->pi.flags &= TSMUX_PACKET_FLAG_PES_FULL_HEADER; - if (!(ret = tsmux_packet_out (mux, buf, new_pcr))) - goto done; - } - } else { - break; - } - } else { - break; + if (!(ret = tsmux_packet_out (mux, buf, new_pcr))) + goto done; } } while (bitrate < mux->bitrate); + if (mux->n_bytes != start_n_bytes) { + GST_LOG ("Finished padding the mux"); + } + done: return ret; } @@ -1481,7 +1597,6 @@ if (tsmux_stream_is_pcr (stream)) { gint64 cur_ts = CLOCK_BASE; - if (tsmux_stream_get_dts (stream) != G_MININT64) cur_ts += tsmux_stream_get_dts (stream); else @@ -1493,7 +1608,9 @@ if (!pad_stream (mux, stream, cur_ts)) goto fail; - new_pcr = write_new_pcr (mux, stream, get_current_pcr (mux, cur_ts)); + new_pcr = + write_new_pcr (mux, stream, get_current_pcr (mux, cur_ts), + get_next_pcr (mux, cur_ts)); } pi->packet_start_unit_indicator = tsmux_stream_at_pes_start (stream); @@ -1559,10 +1676,31 @@ if (program->scte35_null_section) tsmux_section_free (program->scte35_null_section); - g_array_free (program->streams, TRUE); + g_ptr_array_free (program->streams, TRUE); g_slice_free (TsMuxProgram, program); } +/** + * tsmux_program_set_pmt_pid: + * @program: A #TsmuxProgram + * @pmt_pid: PID to write PMT for this program + */ +void +tsmux_program_set_pmt_pid (TsMuxProgram * program, guint16 pmt_pid) +{ + program->pmt_pid = pmt_pid; +} + +static gint +compare_program_number (gconstpointer a, gconstpointer b) +{ + const GstMpegtsPatProgram *pgm1 = *(const GstMpegtsPatProgram * const *) a; + const GstMpegtsPatProgram *pgm2 = *(const GstMpegtsPatProgram * const *) b; + gint num1 = pgm1->program_number, num2 = pgm2->program_number; + + return num1 - num2; +} + static gboolean tsmux_write_pat (TsMux * mux) { @@ -1592,6 +1730,8 @@ g_ptr_array_add (pat, pat_pgm); } + g_ptr_array_sort (pat, compare_program_number); + if (mux->pat.section) gst_mpegts_section_unref (mux->pat.section); @@ -1667,7 
+1807,7 @@ /* Write out the entries */ for (i = 0; i < program->streams->len; i++) { GstMpegtsPMTStream *pmt_stream; - TsMuxStream *stream = g_array_index (program->streams, TsMuxStream *, i); + TsMuxStream *stream = g_ptr_array_index (program->streams, i); pmt_stream = gst_mpegts_pmt_stream_new ();
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/tsmux/tsmux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/tsmux/tsmux.h
Changed
@@ -1,10 +1,9 @@ /* * Copyright 2006 BBC and Fluendo S.A. * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL, and the MIT license. * * MPL: * @@ -35,22 +34,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -75,6 +58,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. 
* + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __TSMUX_H__ @@ -111,6 +95,9 @@ /* Information for the streams associated with one program */ struct TsMuxProgram { + /* TRUE if the SI has been written at least once */ + gboolean wrote_si; + TsMuxSection pmt; /* PMT version */ guint8 pmt_version; @@ -140,7 +127,7 @@ TsMuxStream *pcr_stream; /* programs TsMuxStream's */ - GArray *streams; + GPtrArray *streams; }; struct TsMux { @@ -221,6 +208,7 @@ /* pid/program management */ TsMuxProgram * tsmux_program_new (TsMux *mux, gint prog_id); void tsmux_program_free (TsMuxProgram *program); +void tsmux_program_set_pmt_pid (TsMuxProgram *program, guint16 pmt_pid); void tsmux_set_pmt_interval (TsMuxProgram *program, guint interval); guint tsmux_get_pmt_interval (TsMuxProgram *program); void tsmux_resend_pmt (TsMuxProgram *program);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/tsmux/tsmuxcommon.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/tsmux/tsmuxcommon.h
Changed
@@ -1,10 +1,9 @@ /* * Copyright 2006 BBC and Fluendo S.A. * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL, and the MIT license. * * MPL: * @@ -35,22 +34,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -75,6 +58,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __TSMUX_COMMON_H__
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/tsmux/tsmuxstream.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/tsmux/tsmuxstream.c
Changed
@@ -1,10 +1,9 @@ /* * Copyright 2006 BBC and Fluendo S.A. * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL, and the MIT license. * * MPL: * @@ -35,22 +34,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -75,6 +58,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifdef HAVE_CONFIG_H @@ -133,7 +117,7 @@ stream->pes_payload_size = 0; stream->cur_pes_payload_size = 0; stream->pes_bytes_written = 0; - stream->program_array_index = -1; + stream->pmt_index = -1; switch (stream_type) { case TSMUX_ST_VIDEO_MPEG1: @@ -719,7 +703,7 @@ * * Submit @len bytes of @data into @stream. @pts and @dts can be set to the * timestamp (against a 90Hz clock) of the first access unit in @data. 
A - * timestamp of GST_CLOCK_STIME_NNOE for @pts or @dts means unknown. + * timestamp of GST_CLOCK_STIME_NONE for @pts or @dts means unknown. * * @user_data will be passed to the release function as set with * tsmux_stream_set_buffer_release_func() when @data can be freed.
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mpegtsmux/tsmux/tsmuxstream.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mpegtsmux/tsmux/tsmuxstream.h
Changed
@@ -1,10 +1,9 @@ /* * Copyright 2006 BBC and Fluendo S.A. * - * This library is licensed under 4 different licenses and you + * This library is licensed under 3 different licenses and you * can choose to use it under the terms of any one of them. The - * four licenses are the MPL 1.1, the LGPL, the GPL and the MIT - * license. + * three licenses are the MPL 1.1, the LGPL, and the MIT license. * * MPL: * @@ -35,22 +34,6 @@ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, * Boston, MA 02110-1301, USA. * - * GPL: - * - * This program is free software; you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation; either version 2 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA. - * * MIT: * * Unless otherwise indicated, Source Code is licensed under MIT license. @@ -75,6 +58,7 @@ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. * + * SPDX-License-Identifier: MPL-1.1 OR MIT OR LGPL-2.0-or-later */ #ifndef __TSMUXSTREAM_H__ @@ -166,8 +150,8 @@ guint8 id; /* extended stream id (13818-1 Amdt 2) */ guint8 id_extended; - /* array index in program array */ - gint program_array_index; + /* requested index in the PMT */ + gint pmt_index; gboolean is_video_stream;
View file
gst-plugins-bad-1.20.1.tar.xz/gst/mxf/gstmxfelement.c
Added
@@ -0,0 +1,82 @@ + +/* GStreamer + * Copyright (C) <2008> Sebastian Dröge <sebastian.droege@collabora.co.uk> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +#include "gstmxfelements.h" +#include "mxfquark.h" +#include "mxfdemux.h" +#include "mxfmux.h" +/*#include "mxfdms1.h"*/ +#include "mxfaes-bwf.h" +#include "mxfalaw.h" +#include "mxfd10.h" +#include "mxfdv-dif.h" +#include "mxfjpeg2000.h" +#include "mxfmpeg.h" +#include "mxfup.h" +#include "mxfvc3.h" +#include "mxfprores.h" +#include "mxfvanc.h" + +GST_DEBUG_CATEGORY (mxf_debug); +#define GST_CAT_DEFAULT mxf_debug + +static void +mxf_init (void) +{ + gst_tag_register (GST_TAG_MXF_UMID, GST_TAG_FLAG_META, + G_TYPE_STRING, "UMID", "Unique Material Identifier", NULL); + gst_tag_register (GST_TAG_MXF_STRUCTURE, GST_TAG_FLAG_META, + GST_TYPE_STRUCTURE, "Structure", "Structural metadata of " + "the MXF file", NULL); + gst_tag_register (GST_TAG_MXF_DESCRIPTIVE_METADATA_FRAMEWORK, + GST_TAG_FLAG_META, GST_TYPE_STRUCTURE, "DM Framework", + "Descriptive metadata framework", NULL); +} + +void +mxf_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (mxf_debug, "mxf", 0, 
"MXF"); + + mxf_init (); + mxf_quark_initialize (); + mxf_metadata_init_types (); + /* mxf_dms1_initialize (); */ + mxf_aes_bwf_init (); + mxf_alaw_init (); + mxf_d10_init (); + mxf_dv_dif_init (); + mxf_jpeg2000_init (); + mxf_mpeg_init (); + mxf_up_init (); + mxf_vc3_init (); + mxf_prores_init (); + mxf_vanc_init (); + g_once_init_leave (&res, TRUE); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/mxf/gstmxfelements.h
Added
@@ -0,0 +1,36 @@ +/* GStreamer + * Copyright (C) 2020 Huawei Technologies Co., Ltd. + * @Author: Stéphane Cerveau <stephane.cerveau@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_MXF_ELEMENTS_H__ +#define __GST_MXF_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void mxf_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (mxfdemux); +GST_ELEMENT_REGISTER_DECLARE (mxfmux); + +#endif /* __GST_MXF_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/meson.build
Changed
@@ -1,5 +1,6 @@ mxf_sources = [ 'mxf.c', + 'gstmxfelement.c', 'mxful.c', 'mxftypes.c', 'mxfmetadata.c',
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxf.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxf.c
Changed
@@ -1,3 +1,4 @@ + /* GStreamer * Copyright (C) <2008> Sebastian Dröge <sebastian.droege@collabora.co.uk> * @@ -20,67 +21,19 @@ #include <config.h> #endif -#include <gst/gst.h> - -#include "mxfquark.h" -#include "mxfdemux.h" -#include "mxfmux.h" -/*#include "mxfdms1.h"*/ -#include "mxfaes-bwf.h" -#include "mxfalaw.h" -#include "mxfd10.h" -#include "mxfdv-dif.h" -#include "mxfjpeg2000.h" -#include "mxfmpeg.h" -#include "mxfup.h" -#include "mxfvc3.h" -#include "mxfprores.h" -#include "mxfvanc.h" - -GST_DEBUG_CATEGORY (mxf_debug); -#define GST_CAT_DEFAULT mxf_debug +#include "gstmxfelements.h" -static void -mxf_init (void) -{ - gst_tag_register (GST_TAG_MXF_UMID, GST_TAG_FLAG_META, - G_TYPE_STRING, "UMID", "Unique Material Identifier", NULL); - gst_tag_register (GST_TAG_MXF_STRUCTURE, GST_TAG_FLAG_META, - GST_TYPE_STRUCTURE, "Structure", "Structural metadata of " - "the MXF file", NULL); - gst_tag_register (GST_TAG_MXF_DESCRIPTIVE_METADATA_FRAMEWORK, - GST_TAG_FLAG_META, GST_TYPE_STRUCTURE, "DM Framework", - "Descriptive metadata framework", NULL); -} static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (mxf_debug, "mxf", 0, "MXF"); - - mxf_init (); - mxf_quark_initialize (); - mxf_metadata_init_types (); -/* mxf_dms1_initialize ();*/ - mxf_aes_bwf_init (); - mxf_alaw_init (); - mxf_d10_init (); - mxf_dv_dif_init (); - mxf_jpeg2000_init (); - mxf_mpeg_init (); - mxf_up_init (); - mxf_vc3_init (); - mxf_prores_init (); - mxf_vanc_init (); + gboolean ret = FALSE; /* mxfmux is disabled for now - it compiles but is completely untested */ - if (!gst_element_register (plugin, "mxfdemux", GST_RANK_PRIMARY, - GST_TYPE_MXF_DEMUX) - || !gst_element_register (plugin, "mxfmux", GST_RANK_PRIMARY, - GST_TYPE_MXF_MUX)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (mxfdemux, plugin); + ret |= GST_ELEMENT_REGISTER (mxfmux, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfaes-bwf.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfaes-bwf.c
Changed
@@ -211,7 +211,7 @@ mxf_timestamp_to_string (&self->peak_envelope_timestamp, str)); break; case 0x3d31: - self->peak_envelope_data = g_memdup (tag_data, tag_size); + self->peak_envelope_data = g_memdup2 (tag_data, tag_size); self->peak_envelope_data_length = tag_size; GST_DEBUG (" peak evelope data size = %u", self->peak_envelope_data_length); @@ -507,7 +507,7 @@ t = g_slice_new0 (MXFLocalTag); memcpy (&t->ul, &peak_envelope_data_ul, 16); t->size = self->peak_envelope_data_length; - t->data = g_memdup (self->peak_envelope_data, t->size); + t->data = g_memdup2 (self->peak_envelope_data, t->size); mxf_primer_pack_add_mapping (primer, 0x3d31, &t->ul); ret = g_list_prepend (ret, t); } @@ -1122,7 +1122,8 @@ (key->u[14] == 0x01 || key->u[14] == 0x02 || key->u[14] == 0x03 || - key->u[14] == 0x04 || key->u[14] == 0x08 || key->u[14] == 0x09)) + key->u[14] == 0x04 || key->u[14] == 0x08 || key->u[14] == 0x09 || + key->u[14] == 0x0a || key->u[14] == 0x0b)) return TRUE; } @@ -1160,8 +1161,12 @@ break; case 0x08: case 0x09: - default: + case 0x0a: + case 0x0b: return MXF_ESSENCE_WRAPPING_CUSTOM_WRAPPING; + default: + GST_WARNING ("Unknown frame wrapping"); + return MXF_ESSENCE_WRAPPING_UNKNOWN_WRAPPING; break; } } @@ -1437,7 +1442,8 @@ descriptor[i]) && (track->parent.descriptor[i]->essence_container.u[14] == 0x01 || track->parent.descriptor[i]->essence_container.u[14] == 0x02 - || track->parent.descriptor[i]->essence_container.u[14] == 0x08)) { + || track->parent.descriptor[i]->essence_container.u[14] == 0x08 + || track->parent.descriptor[i]->essence_container.u[14] == 0x0a)) { s = (MXFMetadataGenericSoundEssenceDescriptor *) track->parent. 
descriptor[i]; bwf = TRUE; @@ -1447,7 +1453,8 @@ descriptor[i]) && (track->parent.descriptor[i]->essence_container.u[14] == 0x03 || track->parent.descriptor[i]->essence_container.u[14] == 0x04 - || track->parent.descriptor[i]->essence_container.u[14] == 0x09)) { + || track->parent.descriptor[i]->essence_container.u[14] == 0x09 + || track->parent.descriptor[i]->essence_container.u[14] == 0x0b)) { s = (MXFMetadataGenericSoundEssenceDescriptor *) track->parent. descriptor[i];
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfd10.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfd10.c
Changed
@@ -64,7 +64,7 @@ if (mxf_is_generic_container_essence_container_label (key) && key->u[12] == 0x02 && key->u[13] == 0x01 && (key->u[14] >= 0x01 && key->u[14] <= 0x06) && - (key->u[15] == 0x01 || key->u[15] == 0x02)) + (key->u[15] == 0x01 || key->u[15] == 0x02 || key->u[15] == 0x7f)) return TRUE; }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfdemux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfdemux.c
Changed
@@ -34,7 +34,6 @@ * - Handle timecode tracks correctly (where is this documented?) * - Handle drop-frame field of timecode tracks * - Handle Generic container system items - * - Implement correct support for clip-wrapped essence elements. * - Post structural metadata and descriptive metadata trees as a message on the bus * and send them downstream as event. * - Multichannel audio needs channel layouts, define them (SMPTE S320M?). @@ -51,6 +50,7 @@ #include "config.h" #endif +#include "gstmxfelements.h" #include "mxfdemux.h" #include "mxfessence.h" @@ -71,14 +71,26 @@ GST_DEBUG_CATEGORY_STATIC (mxfdemux_debug); #define GST_CAT_DEFAULT mxfdemux_debug +/* Fill klv for the given offset, does not download the data */ static GstFlowReturn -gst_mxf_demux_pull_klv_packet (GstMXFDemux * demux, guint64 offset, MXFUL * key, - GstBuffer ** outbuf, guint * read); +gst_mxf_demux_peek_klv_packet (GstMXFDemux * demux, guint64 offset, + GstMXFKLV * klv); + +/* Ensures the klv data is present. Pulls it if needed */ +static GstFlowReturn +gst_mxf_demux_fill_klv (GstMXFDemux * demux, GstMXFKLV * klv); + +/* Call when done with a klv. 
Will release the buffer (if any) and will update + * the demuxer offset position */ +static void gst_mxf_demux_consume_klv (GstMXFDemux * demux, GstMXFKLV * klv); + static GstFlowReturn -gst_mxf_demux_handle_index_table_segment (GstMXFDemux * demux, - const MXFUL * key, GstBuffer * buffer, guint64 offset); +gst_mxf_demux_handle_index_table_segment (GstMXFDemux * demux, GstMXFKLV * klv); static void collect_index_table_segments (GstMXFDemux * demux); +static gboolean find_entry_for_offset (GstMXFDemux * demux, + GstMXFDemuxEssenceTrack * etrack, guint64 offset, + GstMXFDemuxIndex * retentry); GType gst_mxf_demux_pad_get_type (void); G_DEFINE_TYPE (GstMXFDemuxPad, gst_mxf_demux_pad, GST_TYPE_PAD); @@ -111,6 +123,8 @@ pad->current_material_track_position = 0; } +#define DEFAULT_MAX_DRIFT 100 * GST_MSECOND + enum { PROP_0, @@ -128,6 +142,8 @@ #define gst_mxf_demux_parent_class parent_class G_DEFINE_TYPE (GstMXFDemux, gst_mxf_demux, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mxfdemux, "mxfdemux", GST_RANK_PRIMARY, + GST_TYPE_MXF_DEMUX, mxf_element_init (plugin)); static void gst_mxf_demux_remove_pad (GstMXFDemuxPad * pad, GstMXFDemux * demux) @@ -203,6 +219,7 @@ &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); track->source_package = NULL; + track->delta_id = -1; track->source_track = NULL; } @@ -243,6 +260,8 @@ demux->flushing = FALSE; + demux->state = GST_MXF_DEMUX_STATE_UNKNOWN; + demux->footer_partition_pack_offset = 0; demux->offset = 0; @@ -285,7 +304,8 @@ for (l = demux->index_tables; l; l = l->next) { GstMXFDemuxIndexTable *t = l->data; - g_array_free (t->offsets, TRUE); + g_array_free (t->segments, TRUE); + g_array_free (t->reverse_temporal_offsets, TRUE); g_free (t); } g_list_free (demux->index_tables); @@ -383,19 +403,79 @@ return 0; } +/* Final checks and variable calculation for tracks and partition. This function + * can be called repeatedly without any side-effect. 
+ */ +static void +gst_mxf_demux_partition_postcheck (GstMXFDemux * demux, + GstMXFDemuxPartition * partition) +{ + guint i; + GstMXFDemuxPartition *old_partition = demux->current_partition; + + /* If we already handled this partition or it doesn't contain any essence, skip */ + if (partition->single_track || !partition->partition.body_sid) + return; + + for (i = 0; i < demux->essence_tracks->len; i++) { + GstMXFDemuxEssenceTrack *cand = + &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); + + if (cand->body_sid != partition->partition.body_sid) + continue; + + if (!cand->source_package->is_interleaved) { + GST_DEBUG_OBJECT (demux, + "Assigning single track %d (0x%08x) to partition at offset %" + G_GUINT64_FORMAT, cand->track_id, cand->track_number, + partition->partition.this_partition); + + partition->single_track = cand; + + if (partition->essence_container_offset != 0 + && cand->wrapping != MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + GstMXFKLV essence_klv; + GstMXFDemuxIndex entry; + /* Update the essence container offset for the fact that the index + * stream offset is relative to the essence position and not to the + * KLV position */ + if (gst_mxf_demux_peek_klv_packet (demux, + partition->partition.this_partition + + partition->essence_container_offset, + &essence_klv) == GST_FLOW_OK) { + partition->essence_container_offset += essence_klv.data_offset; + /* And keep a copy of the clip/custom klv for this partition */ + partition->clip_klv = essence_klv; + GST_DEBUG_OBJECT (demux, + "Non-frame wrapping, updated essence_container_offset to %" + G_GUINT64_FORMAT, partition->essence_container_offset); + + /* And match it against index table, this will also update the track delta_id (if needed) */ + demux->current_partition = partition; + find_entry_for_offset (demux, cand, + essence_klv.offset + essence_klv.data_offset, &entry); + demux->current_partition = old_partition; + } + } + + break; + } + } +} + static GstFlowReturn 
-gst_mxf_demux_handle_partition_pack (GstMXFDemux * demux, const MXFUL * key, - GstBuffer * buffer) +gst_mxf_demux_handle_partition_pack (GstMXFDemux * demux, GstMXFKLV * klv) { MXFPartitionPack partition; GList *l; GstMXFDemuxPartition *p = NULL; GstMapInfo map; gboolean ret; + GstFlowReturn flowret; GST_DEBUG_OBJECT (demux, "Handling partition pack of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT, gst_buffer_get_size (buffer), demux->offset); + G_GUINT64_FORMAT, klv->length, klv->offset); for (l = demux->partitions; l; l = l->next) { GstMXFDemuxPartition *tmp = l->data; @@ -408,17 +488,23 @@ } } - - gst_buffer_map (buffer, &map, GST_MAP_READ); - ret = mxf_partition_pack_parse (key, &partition, map.data, map.size); - gst_buffer_unmap (buffer, &map); + flowret = gst_mxf_demux_fill_klv (demux, klv); + if (flowret != GST_FLOW_OK) + return flowret; + + gst_buffer_map (klv->data, &map, GST_MAP_READ); + ret = mxf_partition_pack_parse (&klv->key, &partition, map.data, map.size); + gst_buffer_unmap (klv->data, &map); if (!ret) { GST_ERROR_OBJECT (demux, "Parsing partition pack failed"); return GST_FLOW_ERROR; } if (partition.this_partition != demux->offset + demux->run_in) { - GST_WARNING_OBJECT (demux, "Partition with incorrect offset"); + GST_WARNING_OBJECT (demux, + "Partition with incorrect offset (this %" G_GUINT64_FORMAT + " demux offset %" G_GUINT64_FORMAT " run_in:%" G_GUINT64_FORMAT ")", + partition.this_partition, demux->offset, demux->run_in); partition.this_partition = demux->offset + demux->run_in; } @@ -445,6 +531,8 @@ (GCompareFunc) gst_mxf_demux_partition_compare); } + gst_mxf_demux_partition_postcheck (demux, p); + for (l = demux->partitions; l; l = l->next) { GstMXFDemuxPartition *a, *b; @@ -458,21 +546,25 @@ } out: + GST_DEBUG_OBJECT (demux, + "Current partition now %p (body_sid:%d index_sid:%d this_partition:%" + G_GUINT64_FORMAT ")", p, p->partition.body_sid, p->partition.index_sid, + p->partition.this_partition); demux->current_partition = 
p; return GST_FLOW_OK; } static GstFlowReturn -gst_mxf_demux_handle_primer_pack (GstMXFDemux * demux, const MXFUL * key, - GstBuffer * buffer) +gst_mxf_demux_handle_primer_pack (GstMXFDemux * demux, GstMXFKLV * klv) { GstMapInfo map; gboolean ret; + GstFlowReturn flowret; GST_DEBUG_OBJECT (demux, "Handling primer pack of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT, gst_buffer_get_size (buffer), demux->offset); + G_GUINT64_FORMAT, klv->length, klv->offset); if (G_UNLIKELY (!demux->current_partition)) { GST_ERROR_OBJECT (demux, "Primer pack before partition pack"); @@ -484,10 +576,14 @@ return GST_FLOW_OK; } - gst_buffer_map (buffer, &map, GST_MAP_READ); - ret = mxf_primer_pack_parse (key, &demux->current_partition->primer, + flowret = gst_mxf_demux_fill_klv (demux, klv); + if (flowret != GST_FLOW_OK) + return flowret; + + gst_buffer_map (klv->data, &map, GST_MAP_READ); + ret = mxf_primer_pack_parse (&klv->key, &demux->current_partition->primer, map.data, map.size); - gst_buffer_unmap (buffer, &map); + gst_buffer_unmap (klv->data, &map); if (!ret) { GST_ERROR_OBJECT (demux, "Parsing primer pack failed"); return GST_FLOW_ERROR; @@ -505,6 +601,7 @@ GHashTableIter iter; MXFMetadataBase *m = NULL; GstStructure *structure; + guint i; g_rw_lock_writer_lock (&demux->metadata_lock); @@ -548,6 +645,23 @@ gst_structure_free (structure); + /* Check for quirks */ + for (i = 0; i < demux->preface->n_identifications; i++) { + MXFMetadataIdentification *identification = + demux->preface->identifications[i]; + + GST_DEBUG_OBJECT (demux, "product:'%s' company:'%s'", + identification->product_name, identification->company_name); + if (!g_strcmp0 (identification->product_name, "MXFTk Advanced") && + !g_strcmp0 (identification->company_name, "OpenCube") && + identification->product_version.major <= 2 && + identification->product_version.minor <= 0) { + GST_WARNING_OBJECT (demux, + "Setting up quirk for misuse of temporal_order field"); + demux->temporal_order_misuse = TRUE; 
+ } + } + g_rw_lock_writer_unlock (&demux->metadata_lock); return ret; @@ -674,6 +788,7 @@ i++) { MXFMetadataEssenceContainerData *edata; MXFMetadataSourcePackage *package; + MXFFraction common_rate = { 0, 0 }; if (demux->preface->content_storage->essence_container_data[i] == NULL) continue; @@ -699,18 +814,56 @@ gboolean new = FALSE; if (!package->parent.tracks[j] - || !MXF_IS_METADATA_TIMELINE_TRACK (package->parent.tracks[j])) + || !MXF_IS_METADATA_TIMELINE_TRACK (package->parent.tracks[j])) { + GST_DEBUG_OBJECT (demux, + "Skipping non-timeline track (id:%d number:0x%08x)", + package->parent.tracks[j]->track_id, + package->parent.tracks[j]->track_number); continue; + } track = MXF_METADATA_TIMELINE_TRACK (package->parent.tracks[j]); - if ((track->parent.type & 0xf0) != 0x30) + if ((track->parent.type & 0xf0) != 0x30) { + GST_DEBUG_OBJECT (demux, + "Skipping track of type 0x%02x (id:%d number:0x%08x)", + track->parent.type, track->parent.track_id, + track->parent.track_number); continue; + } if (track->edit_rate.n <= 0 || track->edit_rate.d <= 0) { GST_WARNING_OBJECT (demux, "Invalid edit rate"); continue; } + if (package->is_interleaved) { + /* + * S377-1:2019 "9.4.2 The MXF timing model" + * + * The value of Edit Rate shall be identical for every timeline Essence + * Track of the Top-Level File Package. + * + * The value of Edit Rate of the timeline Essence Tracks of one + * Top-Level File Package need not match the Edit Rate of the Essence + * Tracks of the other Top-Level File Packages. + * + * S377-1:2019 "9.5.5 Top-Level File Packages" + * + *12. All Essence Tracks of a Top-Level File Package **shall** have the + * same value of Edit Rate. All other Tracks of a Top-Level File + * Package **should** have the same value of Edit Rate as the + * Essence Tracks. 
+ */ + if (common_rate.n == 0 && common_rate.d == 0) { + common_rate = track->edit_rate; + } else if (common_rate.n * track->edit_rate.d != + common_rate.d * track->edit_rate.n) { + GST_ELEMENT_ERROR (demux, STREAM, WRONG_TYPE, (NULL), + ("Interleaved File Package doesn't have identical edit rate on all tracks.")); + return GST_FLOW_ERROR; + } + } + for (k = 0; k < demux->essence_tracks->len; k++) { GstMXFDemuxEssenceTrack *tmp = &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, @@ -756,6 +909,7 @@ etrack->source_package = NULL; etrack->source_track = NULL; + etrack->delta_id = -1; if (!track->parent.sequence) { GST_WARNING_OBJECT (demux, "Source track has no sequence"); @@ -855,21 +1009,41 @@ caps = NULL; } - if (etrack->handler != NULL) { - MXFEssenceWrapping track_wrapping; + etrack->min_edit_units = 1; + /* Ensure we don't output one buffer per sample for audio */ + if (gst_util_uint64_scale (GST_SECOND, track->edit_rate.d, + track->edit_rate.n) < 10 * GST_MSECOND) { + GstStructure *s = gst_caps_get_structure (etrack->caps, 0); + const gchar *name = gst_structure_get_name (s); + if (g_str_has_prefix (name, "audio/x-raw")) { + etrack->min_edit_units = + gst_util_uint64_scale (25 * GST_MSECOND, track->edit_rate.n, + track->edit_rate.d * GST_SECOND); + GST_DEBUG_OBJECT (demux, "Seting miminum number of edit units to %u", + etrack->min_edit_units); + } + } - track_wrapping = etrack->handler->get_track_wrapping (track); - if (track_wrapping == MXF_ESSENCE_WRAPPING_CLIP_WRAPPING) { - GST_ELEMENT_ERROR (demux, STREAM, NOT_IMPLEMENTED, (NULL), - ("Clip essence wrapping is not implemented yet.")); - return GST_FLOW_ERROR; - } else if (track_wrapping == MXF_ESSENCE_WRAPPING_CUSTOM_WRAPPING) { - GST_ELEMENT_ERROR (demux, STREAM, NOT_IMPLEMENTED, (NULL), - ("Custom essence wrappings are not supported.")); + /* FIXME : We really should just abort/ignore the stream completely if we + * don't have a handler for it */ + if (etrack->handler != NULL) + 
etrack->wrapping = etrack->handler->get_track_wrapping (track); + else + etrack->wrapping = MXF_ESSENCE_WRAPPING_UNKNOWN_WRAPPING; + + if (package->is_interleaved) { + GST_DEBUG_OBJECT (demux, + "track comes from interleaved source package with %d track(s), setting delta_id to -1", + package->parent.n_tracks); + if (etrack->wrapping != MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + GST_ELEMENT_ERROR (demux, STREAM, WRONG_TYPE, (NULL), + ("Non-frame-wrapping is not allowed in interleaved File Package.")); return GST_FLOW_ERROR; } + etrack->delta_id = MXF_INDEX_DELTA_ID_UNKNOWN; + } else { + etrack->delta_id = MXF_INDEX_DELTA_ID_UNKNOWN; } - etrack->source_package = package; etrack->source_track = track; continue; @@ -907,6 +1081,148 @@ return GST_FLOW_OK; } +static MXFMetadataEssenceContainerData * +essence_container_for_source_package (MXFMetadataContentStorage * storage, + MXFMetadataSourcePackage * package) +{ + guint i; + + for (i = 0; i < storage->n_essence_container_data; i++) { + MXFMetadataEssenceContainerData *cont = storage->essence_container_data[i]; + if (cont && cont->linked_package == package) + return cont; + } + + return NULL; +} + +static void +gst_mxf_demux_show_topology (GstMXFDemux * demux) +{ + GList *material_packages = NULL; + GList *file_packages = NULL; + GList *tmp; + MXFMetadataContentStorage *storage = demux->preface->content_storage; + guint i; + gchar str[96]; + + /* Show the topology starting from the preface */ + GST_DEBUG_OBJECT (demux, "Topology"); + + for (i = 0; i < storage->n_packages; i++) { + MXFMetadataGenericPackage *pack = storage->packages[i]; + if (MXF_IS_METADATA_MATERIAL_PACKAGE (pack)) + material_packages = g_list_append (material_packages, pack); + else if (MXF_IS_METADATA_SOURCE_PACKAGE (pack)) + file_packages = g_list_append (file_packages, pack); + else + GST_DEBUG_OBJECT (demux, "Unknown package type"); + } + + GST_DEBUG_OBJECT (demux, "Number of Material Package (i.e. 
output) : %d", + g_list_length (material_packages)); + for (tmp = material_packages; tmp; tmp = tmp->next) { + MXFMetadataMaterialPackage *pack = (MXFMetadataMaterialPackage *) tmp->data; + GST_DEBUG_OBJECT (demux, " Package with %d tracks , UID:%s", + pack->n_tracks, mxf_umid_to_string (&pack->package_uid, str)); + for (i = 0; i < pack->n_tracks; i++) { + MXFMetadataTrack *track = pack->tracks[i]; + if (track == NULL) { + GST_DEBUG_OBJECT (demux, " Unknown/Unhandled track UUID %s", + mxf_uuid_to_string (&pack->tracks_uids[i], str)); + } else if (MXF_IS_METADATA_TIMELINE_TRACK (track)) { + MXFMetadataTimelineTrack *mtrack = (MXFMetadataTimelineTrack *) track; + GST_DEBUG_OBJECT (demux, + " Timeline Track id:%d number:0x%08x name:`%s` edit_rate:%d/%d origin:%" + G_GINT64_FORMAT, track->track_id, track->track_number, + track->track_name, mtrack->edit_rate.n, mtrack->edit_rate.d, + mtrack->origin); + } else { + GST_DEBUG_OBJECT (demux, + " Non-Timeline-Track id:%d number:0x%08x name:`%s`", + track->track_id, track->track_number, track->track_name); + } + if (track) { + MXFMetadataSequence *sequence = track->sequence; + guint si; + GST_DEBUG_OBJECT (demux, + " Sequence duration:%" G_GINT64_FORMAT + " n_structural_components:%d", sequence->duration, + sequence->n_structural_components); + for (si = 0; si < sequence->n_structural_components; si++) { + MXFMetadataStructuralComponent *comp = + sequence->structural_components[si]; + GST_DEBUG_OBJECT (demux, + " Component #%d duration:%" G_GINT64_FORMAT, si, + comp->duration); + if (MXF_IS_METADATA_SOURCE_CLIP (comp)) { + MXFMetadataSourceClip *clip = (MXFMetadataSourceClip *) comp; + GST_DEBUG_OBJECT (demux, + " Clip start_position:%" G_GINT64_FORMAT + " source_track_id:%d source_package_id:%s", + clip->start_position, clip->source_track_id, + mxf_umid_to_string (&clip->source_package_id, str)); + } + } + + } + } + } + + GST_DEBUG_OBJECT (demux, "Number of File Packages (i.e. 
input) : %d", + g_list_length (file_packages)); + for (tmp = file_packages; tmp; tmp = tmp->next) { + MXFMetadataMaterialPackage *pack = (MXFMetadataMaterialPackage *) tmp->data; + MXFMetadataSourcePackage *src = (MXFMetadataSourcePackage *) pack; + MXFMetadataEssenceContainerData *econt = + essence_container_for_source_package (storage, src); + GST_DEBUG_OBJECT (demux, + " Package (body_sid:%d index_sid:%d top_level:%d) with %d tracks , UID:%s", + econt->body_sid, econt->index_sid, src->top_level, pack->n_tracks, + mxf_umid_to_string (&pack->package_uid, str)); + GST_DEBUG_OBJECT (demux, " Package descriptor : %s", + g_type_name (G_OBJECT_TYPE (src->descriptor))); + for (i = 0; i < pack->n_tracks; i++) { + MXFMetadataTrack *track = pack->tracks[i]; + MXFMetadataSequence *sequence = track->sequence; + guint di, si; + if (MXF_IS_METADATA_TIMELINE_TRACK (track)) { + MXFMetadataTimelineTrack *mtrack = (MXFMetadataTimelineTrack *) track; + GST_DEBUG_OBJECT (demux, + " Timeline Track id:%d number:0x%08x name:`%s` edit_rate:%d/%d origin:%" + G_GINT64_FORMAT, track->track_id, track->track_number, + track->track_name, mtrack->edit_rate.n, mtrack->edit_rate.d, + mtrack->origin); + } else { + GST_DEBUG_OBJECT (demux, + " Non-Timeline-Track id:%d number:0x%08x name:`%s` type:0x%x", + track->track_id, track->track_number, track->track_name, + track->type); + } + for (di = 0; di < track->n_descriptor; di++) { + MXFMetadataFileDescriptor *desc = track->descriptor[di]; + GST_DEBUG_OBJECT (demux, " Descriptor %s %s", + g_type_name (G_OBJECT_TYPE (desc)), + mxf_ul_to_string (&desc->essence_container, str)); + } + GST_DEBUG_OBJECT (demux, + " Sequence duration:%" G_GINT64_FORMAT + " n_structural_components:%d", sequence->duration, + sequence->n_structural_components); + for (si = 0; si < sequence->n_structural_components; si++) { + MXFMetadataStructuralComponent *comp = + sequence->structural_components[si]; + GST_DEBUG_OBJECT (demux, + " Component #%d duration:%" G_GINT64_FORMAT, 
si, + comp->duration); + } + } + } + + g_list_free (material_packages); + g_list_free (file_packages); +} + static GstFlowReturn gst_mxf_demux_update_tracks (GstMXFDemux * demux) { @@ -921,6 +1237,8 @@ g_rw_lock_writer_lock (&demux->metadata_lock); GST_DEBUG_OBJECT (demux, "Updating tracks"); + gst_mxf_demux_show_topology (demux); + if ((ret = gst_mxf_demux_update_essence_tracks (demux)) != GST_FLOW_OK) { goto error; } @@ -959,7 +1277,7 @@ } if (!MXF_IS_METADATA_TIMELINE_TRACK (current_package->tracks[i])) { - GST_DEBUG_OBJECT (demux, "No timeline track"); + GST_DEBUG_OBJECT (demux, "Skipping Non-timeline track"); continue; } @@ -1094,7 +1412,10 @@ } if (track->parent.type && (track->parent.type & 0xf0) != 0x30) { - GST_DEBUG_OBJECT (demux, "No essence track"); + GST_DEBUG_OBJECT (demux, + "No essence track. type:0x%02x track_id:%d track_number:0x%08x", + track->parent.type, track->parent.track_id, + track->parent.track_number); if (!pad) { continue; } else { @@ -1332,6 +1653,11 @@ if (first_run) gst_element_no_more_pads (GST_ELEMENT_CAST (demux)); + /* Re-check all existing partitions for source package linking in case the + * header partition contains data (allowed in early MXF versions) */ + for (l = demux->partitions; l; l = l->next) + gst_mxf_demux_partition_postcheck (demux, (GstMXFDemuxPartition *) l->data); + return GST_FLOW_OK; error: @@ -1340,20 +1666,18 @@ } static GstFlowReturn -gst_mxf_demux_handle_metadata (GstMXFDemux * demux, const MXFUL * key, - GstBuffer * buffer) +gst_mxf_demux_handle_metadata (GstMXFDemux * demux, GstMXFKLV * klv) { guint16 type; MXFMetadata *metadata = NULL, *old = NULL; GstMapInfo map; GstFlowReturn ret = GST_FLOW_OK; - type = GST_READ_UINT16_BE (key->u + 13); + type = GST_READ_UINT16_BE (&klv->key.u[13]); GST_DEBUG_OBJECT (demux, "Handling metadata of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT " of type 0x%04x", gst_buffer_get_size (buffer), - demux->offset, type); + G_GUINT64_FORMAT " of type 0x%04x", 
klv->length, klv->offset, type); if (G_UNLIKELY (!demux->current_partition)) { GST_ERROR_OBJECT (demux, "Partition pack doesn't exist"); @@ -1370,14 +1694,17 @@ return GST_FLOW_OK; } - if (gst_buffer_get_size (buffer) == 0) + if (klv->length == 0) return GST_FLOW_OK; + ret = gst_mxf_demux_fill_klv (demux, klv); + if (ret != GST_FLOW_OK) + return ret; - gst_buffer_map (buffer, &map, GST_MAP_READ); + gst_buffer_map (klv->data, &map, GST_MAP_READ); metadata = mxf_metadata_new (type, &demux->current_partition->primer, demux->offset, map.data, map.size); - gst_buffer_unmap (buffer, &map); + gst_buffer_unmap (klv->data, &map); if (!metadata) { GST_WARNING_OBJECT (demux, @@ -1433,8 +1760,7 @@ } static GstFlowReturn -gst_mxf_demux_handle_descriptive_metadata (GstMXFDemux * demux, - const MXFUL * key, GstBuffer * buffer) +gst_mxf_demux_handle_descriptive_metadata (GstMXFDemux * demux, GstMXFKLV * klv) { guint32 type; guint8 scheme; @@ -1442,13 +1768,13 @@ GstFlowReturn ret = GST_FLOW_OK; MXFDescriptiveMetadata *m = NULL, *old = NULL; - scheme = GST_READ_UINT8 (key->u + 12); - type = GST_READ_UINT24_BE (key->u + 13); + scheme = GST_READ_UINT8 (&klv->key.u[12]); + type = GST_READ_UINT24_BE (&klv->key.u[13]); GST_DEBUG_OBJECT (demux, "Handling descriptive metadata of size %" G_GSIZE_FORMAT " at offset %" G_GUINT64_FORMAT " with scheme 0x%02x and type 0x%06x", - gst_buffer_get_size (buffer), demux->offset, scheme, type); + klv->length, klv->offset, scheme, type); if (G_UNLIKELY (!demux->current_partition)) { GST_ERROR_OBJECT (demux, "Partition pack doesn't exist"); @@ -1465,10 +1791,14 @@ return GST_FLOW_OK; } - gst_buffer_map (buffer, &map, GST_MAP_READ); + ret = gst_mxf_demux_fill_klv (demux, klv); + if (ret != GST_FLOW_OK) + return ret; + + gst_buffer_map (klv->data, &map, GST_MAP_READ); m = mxf_descriptive_metadata_new (scheme, type, &demux->current_partition->primer, demux->offset, map.data, map.size); - gst_buffer_unmap (buffer, &map); + gst_buffer_unmap (klv->data, 
&map); if (!m) { GST_WARNING_OBJECT (demux, @@ -1522,12 +1852,11 @@ static GstFlowReturn gst_mxf_demux_handle_generic_container_system_item (GstMXFDemux * demux, - const MXFUL * key, GstBuffer * buffer) + GstMXFKLV * klv) { GST_DEBUG_OBJECT (demux, "Handling generic container system item of size %" G_GSIZE_FORMAT - " at offset %" G_GUINT64_FORMAT, gst_buffer_get_size (buffer), - demux->offset); + " at offset %" G_GUINT64_FORMAT, klv->length, klv->offset); if (demux->current_partition->essence_container_offset == 0) demux->current_partition->essence_container_offset = @@ -1683,9 +2012,644 @@ return ret; } +/* + * Find the partition containing the stream offset of the given track + * */ +static GstMXFDemuxPartition * +get_partition_for_stream_offset (GstMXFDemux * demux, + GstMXFDemuxEssenceTrack * etrack, guint64 stream_offset) +{ + GList *tmp; + GstMXFDemuxPartition *offset_partition = NULL, *next_partition = NULL; + + for (tmp = demux->partitions; tmp; tmp = tmp->next) { + GstMXFDemuxPartition *partition = tmp->data; + + if (!next_partition && offset_partition) + next_partition = partition; + + if (partition->partition.body_sid != etrack->body_sid) + continue; + if (partition->partition.body_offset > stream_offset) + break; + + offset_partition = partition; + next_partition = NULL; + } + + if (offset_partition + && stream_offset < offset_partition->partition.body_offset) + return NULL; + + GST_DEBUG_OBJECT (demux, + "Found this_partition:%" G_GUINT64_FORMAT " body_offset:%" + G_GUINT64_FORMAT, offset_partition->partition.this_partition, + offset_partition->partition.body_offset); + + /* Are we overriding into the next partition ? 
*/ + if (next_partition) { + guint64 partition_essence_size = + next_partition->partition.this_partition - + offset_partition->partition.this_partition + + offset_partition->essence_container_offset; + guint64 in_partition = + stream_offset - offset_partition->partition.body_offset; + GST_DEBUG_OBJECT (demux, + "Followed by this_partition:%" G_GUINT64_FORMAT " body_offset:%" + G_GUINT64_FORMAT, next_partition->partition.this_partition, + next_partition->partition.body_offset); + + if (in_partition >= partition_essence_size) { + GST_WARNING_OBJECT (demux, + "stream_offset %" G_GUINT64_FORMAT + " in track body_sid:%d index_sid:%d leaks into next unrelated partition (body_sid:%d / index_sid:%d)", + stream_offset, etrack->body_sid, etrack->index_sid, + next_partition->partition.body_sid, + next_partition->partition.index_sid); + return NULL; + } + } + return offset_partition; +} + +static GstMXFDemuxIndexTable * +get_track_index_table (GstMXFDemux * demux, GstMXFDemuxEssenceTrack * etrack) +{ + GList *l; + + /* Look in the indextables */ + for (l = demux->index_tables; l; l = l->next) { + GstMXFDemuxIndexTable *tmp = l->data; + + if (tmp->body_sid == etrack->body_sid + && tmp->index_sid == etrack->index_sid) { + return tmp; + } + } + + return NULL; +} + +static guint32 +get_track_max_temporal_offset (GstMXFDemux * demux, + GstMXFDemuxEssenceTrack * etrack) +{ + GstMXFDemuxIndexTable *table; + + if (etrack->intra_only) + return 0; + + table = get_track_index_table (demux, etrack); + + if (table) + return table->max_temporal_offset; + return 0; +} + +static guint64 +find_offset (GArray * offsets, gint64 * position, gboolean keyframe) +{ + GstMXFDemuxIndex *idx; + guint64 current_offset = -1; + gint64 current_position = *position; + + if (!offsets || offsets->len <= *position) + return -1; + + idx = &g_array_index (offsets, GstMXFDemuxIndex, *position); + if (idx->offset != 0 && (!keyframe || idx->keyframe)) { + current_offset = idx->offset; + } else if (idx->offset != 0)
{ + current_position--; + while (current_position >= 0) { + GST_LOG ("current_position %" G_GINT64_FORMAT, current_position); + idx = &g_array_index (offsets, GstMXFDemuxIndex, current_position); + if (idx->offset == 0) { + GST_LOG ("breaking offset 0"); + break; + } else if (!idx->keyframe) { + current_position--; + continue; + } else { + GST_LOG ("Breaking found offset"); + current_offset = idx->offset; + break; + } + } + } + + if (current_offset == -1) + return -1; + + *position = current_position; + return current_offset; +} + +/** + * find_edit_entry: + * @demux: The demuxer + * @etrack: The target essence track + * @position: An edit unit position + * @keyframe: if TRUE search for supporting keyframe + * @entry: (out): Will be filled with the matching entry information + * + * Finds the edit entry of @etrack for the given edit unit @position and fill + * @entry with the information about that edit entry. If @keyframe is TRUE, the + * supporting entry (i.e. keyframe) for the given position will be searched for. + * + * For frame-wrapped contents, the returned offset will be the position of the + * KLV of the content. For clip-wrapped content, the returned offset will be the + * position of the essence (i.e. without KLV header) and the entry will specify + * the size (in bytes). + * + * The returned entry will also specify the duration (in edit units) of the + * content, which can be different from 1 for special cases (such as raw audio + * where multiple samples could be aggregated). + * + * Returns: TRUE if the entry was found and @entry was properly filled, else + * FALSE. 
+ */ +static gboolean +find_edit_entry (GstMXFDemux * demux, GstMXFDemuxEssenceTrack * etrack, + gint64 position, gboolean keyframe, GstMXFDemuxIndex * entry) +{ + GstMXFDemuxIndexTable *index_table = NULL; + guint i; + MXFIndexTableSegment *segment = NULL; + GstMXFDemuxPartition *offset_partition = NULL; + guint64 stream_offset = G_MAXUINT64, absolute_offset; + + GST_DEBUG_OBJECT (demux, + "track %d body_sid:%d index_sid:%d delta_id:%d position:%" G_GINT64_FORMAT + " keyframe:%d", etrack->track_id, etrack->body_sid, + etrack->index_sid, etrack->delta_id, position, keyframe); + + /* Default values */ + entry->duration = 1; + /* By default every entry is a keyframe unless specified otherwise */ + entry->keyframe = TRUE; + + /* Look in the track offsets */ + if (etrack->offsets && etrack->offsets->len > position) { + if (find_offset (etrack->offsets, &position, keyframe) != -1) { + *entry = g_array_index (etrack->offsets, GstMXFDemuxIndex, position); + GST_LOG_OBJECT (demux, "Found entry in track offsets"); + return TRUE; + } else + GST_LOG_OBJECT (demux, "Didn't find entry in track offsets"); + } + + /* Look in the indextables */ + index_table = get_track_index_table (demux, etrack); + + if (!index_table) { + GST_DEBUG_OBJECT (demux, + "Couldn't find index table for body_sid:%d index_sid:%d", + etrack->body_sid, etrack->index_sid); + return FALSE; + } + + GST_DEBUG_OBJECT (demux, + "Looking for position %" G_GINT64_FORMAT + " in index table (max temporal offset %u)", + etrack->position, index_table->max_temporal_offset); + + /* Searching for a position in index tables works in 3 steps: + * + * 1. Figure out the table segment containing that position + * 2. Figure out the "stream offset" (and additional flags/timing) of that + * position from the table segment. + * 3. 
Figure out the "absolute offset" of that "stream offset" using partitions + */ + +search_in_segment: + + /* Find matching index segment */ + GST_DEBUG_OBJECT (demux, "Look for entry in %d segments", + index_table->segments->len); + for (i = 0; i < index_table->segments->len; i++) { + MXFIndexTableSegment *cand = + &g_array_index (index_table->segments, MXFIndexTableSegment, i); + if (position >= cand->index_start_position && (cand->index_duration == 0 + || position < + (cand->index_start_position + cand->index_duration))) { + GST_DEBUG_OBJECT (demux, + "Entry is in Segment #%d , start: %" G_GINT64_FORMAT " , duration: %" + G_GINT64_FORMAT, i, cand->index_start_position, cand->index_duration); + segment = cand; + break; + } + } + if (!segment) { + GST_DEBUG_OBJECT (demux, + "Didn't find index table segment for position %" G_GINT64_FORMAT, + position); + return FALSE; + } + + /* Were we asked for a keyframe ? */ + if (keyframe) { + if (segment->edit_unit_byte_count && !segment->n_index_entries) { + GST_LOG_OBJECT (demux, + "Index table without entries, directly using requested position for keyframe search"); + } else { + gint64 candidate; + GST_LOG_OBJECT (demux, "keyframe search"); + /* Search backwards for keyframe */ + for (candidate = position; candidate >= segment->index_start_position; + candidate--) { + MXFIndexEntry *segment_index_entry = + &segment->index_entries[candidate - segment->index_start_position]; + + /* Match */ + if (segment_index_entry->flags & 0x80) { + GST_LOG_OBJECT (demux, "Found keyframe at position %" G_GINT64_FORMAT, + candidate); + position = candidate; + break; + } + + /* If a keyframe offset is specified and valid, use that */ + if (segment_index_entry->key_frame_offset + && !(segment_index_entry->flags & 0x08)) { + GST_DEBUG_OBJECT (demux, "Using keyframe offset %d", + segment_index_entry->key_frame_offset); + position = candidate + segment_index_entry->key_frame_offset; + if (position < segment->index_start_position) { + 
GST_DEBUG_OBJECT (demux, "keyframe info is in previous segment"); + goto search_in_segment; + } + break; + } + + /* If we reached the beginning, use that */ + if (candidate == 0) { + GST_LOG_OBJECT (demux, + "Reached position 0 while searching for keyframe"); + position = 0; + break; + } + + /* If we looped past the beginning of this segment, go to the previous one */ + if (candidate == segment->index_start_position) { + position = candidate - 1; + GST_LOG_OBJECT (demux, "Looping with new position %" G_GINT64_FORMAT, + position); + goto search_in_segment; + } + + /* loop back to check previous entry */ + } + } + } + + /* Figure out the stream offset (also called "body offset" in specification) */ + if (segment->edit_unit_byte_count && !segment->n_index_entries) { + /* Constant entry table. */ + stream_offset = position * segment->edit_unit_byte_count; + if (etrack->delta_id >= 0) { + MXFDeltaEntry *delta_entry = &segment->delta_entries[etrack->delta_id]; + GST_LOG_OBJECT (demux, + "Using delta %d pos_table_index:%d slice:%u element_delta:%u", + etrack->delta_id, delta_entry->pos_table_index, delta_entry->slice, + delta_entry->element_delta); + stream_offset += delta_entry->element_delta; + } else if (etrack->min_edit_units != 1) { + GST_LOG_OBJECT (demux, "Handling minimum edit unit %u", + etrack->min_edit_units); + entry->duration = + MIN (etrack->min_edit_units, + (segment->index_start_position + segment->index_duration) - position); + entry->size = segment->edit_unit_byte_count * entry->duration; + } else { + entry->size = segment->edit_unit_byte_count; + } + } else if (segment->n_index_entries) { + MXFIndexEntry *segment_index_entry; + MXFDeltaEntry *delta_entry = NULL; + g_assert (position <= + segment->index_start_position + segment->n_index_entries); + segment_index_entry = + &segment->index_entries[position - segment->index_start_position]; + stream_offset = segment_index_entry->stream_offset; + + if (segment->n_delta_entries > 0) + delta_entry = 
&segment->delta_entries[etrack->delta_id]; + + if (delta_entry) { + GST_LOG_OBJECT (demux, + "Using delta %d pos_table_index:%d slice:%u element_delta:%u", + etrack->delta_id, delta_entry->pos_table_index, delta_entry->slice, + delta_entry->element_delta); + + /* Apply offset from slice/delta if needed */ + if (delta_entry->slice) + stream_offset += + segment_index_entry->slice_offset[delta_entry->slice - 1]; + stream_offset += delta_entry->element_delta; + if (delta_entry->pos_table_index == -1) { + entry->keyframe = (segment_index_entry->flags & 0x80) == 0x80; + } + /* FIXME : Handle fractional offset position (delta_entry->pos_table_offset > 0) */ + } + + /* Apply reverse temporal reordering if present */ + if (index_table->reordered_delta_entry == etrack->delta_id) { + if (position >= index_table->reverse_temporal_offsets->len) { + GST_WARNING_OBJECT (demux, + "Can't apply temporal offset for position %" G_GINT64_FORMAT + " (max:%d)", position, index_table->reverse_temporal_offsets->len); + } + if (demux->temporal_order_misuse) { + GST_DEBUG_OBJECT (demux, "Handling temporal order misuse"); + entry->pts = position + segment_index_entry->temporal_offset; + } else { + entry->pts = + position + g_array_index (index_table->reverse_temporal_offsets, + gint8, position); + GST_LOG_OBJECT (demux, + "Applied temporal offset. 
dts:%" G_GINT64_FORMAT " pts:%" + G_GINT64_FORMAT, position, entry->pts); + } + } else + entry->pts = position; + } else { + /* Note : This should have been handled in the parser */ + GST_WARNING_OBJECT (demux, + "Can't handle index tables without entries nor constant edit unit byte count"); + return FALSE; + } + + /* Find the partition containing the stream offset for this track */ + offset_partition = + get_partition_for_stream_offset (demux, etrack, stream_offset); + + if (!offset_partition) { + GST_WARNING_OBJECT (demux, + "Couldn't find matching partition for stream offset %" G_GUINT64_FORMAT, + stream_offset); + return FALSE; + } else { + GST_DEBUG_OBJECT (demux, "Entry is in partition %" G_GUINT64_FORMAT, + offset_partition->partition.this_partition); + } + + /* Convert stream offset to absolute offset using matching partition */ + absolute_offset = + offset_partition->partition.this_partition + + offset_partition->essence_container_offset + (stream_offset - + offset_partition->partition.body_offset); + + GST_LOG_OBJECT (demux, + "track %d position:%" G_GINT64_FORMAT " stream_offset %" G_GUINT64_FORMAT + " matches to absolute offset %" G_GUINT64_FORMAT, etrack->track_id, + position, stream_offset, absolute_offset); + entry->initialized = TRUE; + entry->offset = absolute_offset; + entry->dts = position; + + return TRUE; +} + +/** + * find_entry_for_offset: + * @demux: The demuxer + * @etrack: The target essence track + * @offset: An absolute byte offset (excluding run_in) + * @entry: (out): Will be filled with the matching entry information + * + * Find the entry located at the given absolute byte offset. + * + * Note: the offset requested should be in the current partition ! + * + * Returns: TRUE if the entry was found and @entry was properly filled, else + * FALSE. 
+ */ +static gboolean +find_entry_for_offset (GstMXFDemux * demux, GstMXFDemuxEssenceTrack * etrack, + guint64 offset, GstMXFDemuxIndex * retentry) +{ + GstMXFDemuxIndexTable *index_table = get_track_index_table (demux, etrack); + guint i; + MXFIndexTableSegment *index_segment = NULL; + GstMXFDemuxPartition *partition = demux->current_partition; + guint64 original_offset = offset; + guint64 cp_offset = 0; /* Offset in Content Package */ + MXFIndexEntry *index_entry = NULL; + MXFDeltaEntry *delta_entry = NULL; + gint64 position = 0; + + GST_DEBUG_OBJECT (demux, + "track %d body_sid:%d index_sid:%d offset:%" G_GUINT64_FORMAT, + etrack->track_id, etrack->body_sid, etrack->index_sid, offset); + + /* Default value */ + retentry->duration = 1; + retentry->keyframe = TRUE; + + /* Index-less search */ + if (etrack->offsets) { + for (i = 0; i < etrack->offsets->len; i++) { + GstMXFDemuxIndex *idx = + &g_array_index (etrack->offsets, GstMXFDemuxIndex, i); + + if (idx->initialized && idx->offset != 0 && idx->offset == offset) { + *retentry = *idx; + GST_DEBUG_OBJECT (demux, + "Found in track index. Position:%" G_GINT64_FORMAT, idx->dts); + return TRUE; + } + } + } + + /* Actual index search */ + if (!index_table || !index_table->segments->len) { + GST_WARNING_OBJECT (demux, "No index table or entries to search in"); + return FALSE; + } + + if (!partition) { + GST_WARNING_OBJECT (demux, "No current partition for search"); + return FALSE; + } + + /* Searching for a stream position from an absolute offset works in 3 steps: + * + * 1. Convert the absolute offset to a "stream offset" based on the partition + * information. + * 2. Find the segment for that "stream offset" + * 3. 
Match the entry within that segment + */ + + /* Convert to stream offset */ + GST_LOG_OBJECT (demux, + "offset %" G_GUINT64_FORMAT " this_partition:%" G_GUINT64_FORMAT + " essence_container_offset:%" G_GINT64_FORMAT " partition body offset %" + G_GINT64_FORMAT, offset, partition->partition.this_partition, + partition->essence_container_offset, partition->partition.body_offset); + offset = + offset - partition->partition.this_partition - + partition->essence_container_offset + partition->partition.body_offset; + + GST_LOG_OBJECT (demux, "stream offset %" G_GUINT64_FORMAT, offset); + + /* Find the segment that covers the given stream offset (the highest one that + * covers that offset) */ + for (i = index_table->segments->len - 1; i >= 0; i--) { + index_segment = + &g_array_index (index_table->segments, MXFIndexTableSegment, i); + GST_DEBUG_OBJECT (demux, + "Checking segment #%d (essence_offset %" G_GUINT64_FORMAT ")", i, + index_segment->segment_start_offset); + /* Not in the right segment yet */ + if (offset >= index_segment->segment_start_offset) { + GST_LOG_OBJECT (demux, "Found"); + break; + } + } + if (!index_segment) { + GST_WARNING_OBJECT (demux, + "Couldn't find index table segment for given offset"); + return FALSE; + } + + /* In the right segment, figure out: + * * the offset in the content package, + * * the position in edit units + * * the matching entry (if the table has entries) + */ + if (index_segment->edit_unit_byte_count) { + cp_offset = offset % index_segment->edit_unit_byte_count; + position = offset / index_segment->edit_unit_byte_count; + /* Boundary check */ + if ((position < index_segment->index_start_position) + || (index_segment->index_duration + && position > + (index_segment->index_start_position + + index_segment->index_duration))) { + GST_WARNING_OBJECT (demux, + "Invalid offset, exceeds table segment limits"); + return FALSE; + } + if (etrack->min_edit_units != 1) { + retentry->duration = MIN (etrack->min_edit_units, + 
(index_segment->index_start_position + + index_segment->index_duration) - position); + retentry->size = index_segment->edit_unit_byte_count * retentry->duration; + } else { + retentry->size = index_segment->edit_unit_byte_count; + } + } else { + /* Find the content package entry containing this offset */ + guint cpidx; + for (cpidx = 0; cpidx < index_segment->n_index_entries; cpidx++) { + index_entry = &index_segment->index_entries[cpidx]; + GST_DEBUG_OBJECT (demux, + "entry #%u offset:%" G_GUINT64_FORMAT " stream_offset:%" + G_GUINT64_FORMAT, cpidx, offset, index_entry->stream_offset); + if (index_entry->stream_offset == offset) { + index_entry = &index_segment->index_entries[cpidx]; + /* exactly on the entry */ + cp_offset = offset - index_entry->stream_offset; + position = index_segment->index_start_position + cpidx; + break; + } + if (index_entry->stream_offset > offset && cpidx > 0) { + index_entry = &index_segment->index_entries[cpidx - 1]; + /* One too far, result is in previous entry */ + cp_offset = offset - index_entry->stream_offset; + position = index_segment->index_start_position + cpidx - 1; + break; + } + } + if (cpidx == index_segment->n_index_entries) { + GST_WARNING_OBJECT (demux, + "offset exceeds maximum number of entries in table segment"); + return FALSE; + } + } + + /* If the track comes from an interleaved essence container and doesn't have a + * delta_id set, figure it out now */ + if (G_UNLIKELY (etrack->delta_id == MXF_INDEX_DELTA_ID_UNKNOWN)) { + guint delta; + GST_DEBUG_OBJECT (demux, + "Unknown delta_id for track. 
Attempting to resolve it"); + + if (index_segment->n_delta_entries == 0) { + /* No delta entries, nothing we can do about this */ + GST_DEBUG_OBJECT (demux, "Index table has no delta entries, ignoring"); + etrack->delta_id = MXF_INDEX_DELTA_ID_IGNORE; + } else if (!index_entry) { + for (delta = 0; delta < index_segment->n_delta_entries; delta++) { + /* No entry, therefore no slices */ + GST_LOG_OBJECT (demux, + "delta #%d offset %" G_GUINT64_FORMAT " cp_offs:%" G_GUINT64_FORMAT + " element_delta:%u", delta, offset, cp_offset, + index_segment->delta_entries[delta].element_delta); + if (cp_offset == index_segment->delta_entries[delta].element_delta) { + GST_DEBUG_OBJECT (demux, "Matched to delta %d", delta); + etrack->delta_id = delta; + delta_entry = &index_segment->delta_entries[delta]; + break; + } + } + } else { + for (delta = 0; delta < index_segment->n_delta_entries; delta++) { + guint64 delta_offs = 0; + /* If we are not in the first slice, take that offset into account */ + if (index_segment->delta_entries[delta].slice) + delta_offs = + index_entry->slice_offset[index_segment-> + delta_entries[delta].slice - 1]; + /* Add the offset for this delta */ + delta_offs += index_segment->delta_entries[delta].element_delta; + if (cp_offset == delta_offs) { + GST_DEBUG_OBJECT (demux, "Matched to delta %d", delta); + etrack->delta_id = delta; + delta_entry = &index_segment->delta_entries[delta]; + break; + } + } + + } + /* If we didn't managed to match, ignore it from now on */ + if (etrack->delta_id == MXF_INDEX_DELTA_ID_UNKNOWN) { + GST_WARNING_OBJECT (demux, + "Couldn't match delta id, ignoring it from now on"); + etrack->delta_id = MXF_INDEX_DELTA_ID_IGNORE; + } + } else if (index_segment->n_delta_entries > 0) { + delta_entry = &index_segment->delta_entries[etrack->delta_id]; + } + + if (index_entry && delta_entry && delta_entry->pos_table_index == -1) { + retentry->keyframe = (index_entry->flags & 0x80) == 0x80; + if (!demux->temporal_order_misuse) + retentry->pts 
= + position + g_array_index (index_table->reverse_temporal_offsets, + gint8, position); + else + retentry->pts = position + index_entry->temporal_offset; + GST_LOG_OBJECT (demux, + "Applied temporal offset. dts:%" G_GINT64_FORMAT " pts:%" + G_GINT64_FORMAT, position, retentry->pts); + } else + retentry->pts = position; + + /* FIXME : check if position and cp_offs matches the table */ + GST_LOG_OBJECT (demux, "Found in index table. position:%" G_GINT64_FORMAT, + position); + retentry->initialized = TRUE; + retentry->offset = original_offset; + retentry->dts = position; + + return TRUE; +} + static GstFlowReturn gst_mxf_demux_handle_generic_container_essence_element (GstMXFDemux * demux, - const MXFUL * key, GstBuffer * buffer, gboolean peek) + GstMXFKLV * klv, gboolean peek) { GstFlowReturn ret = GST_FLOW_OK; guint32 track_number; @@ -1693,24 +2657,38 @@ GstBuffer *inbuf = NULL; GstBuffer *outbuf = NULL; GstMXFDemuxEssenceTrack *etrack = NULL; - gboolean keyframe = TRUE; /* As in GstMXFDemuxIndex */ - guint64 pts = G_MAXUINT64, dts = G_MAXUINT64; + guint64 pts = G_MAXUINT64; + gint32 max_temporal_offset = 0; + GstMXFDemuxIndex index_entry = { 0, }; + guint64 offset; GST_DEBUG_OBJECT (demux, "Handling generic container essence element of size %" G_GSIZE_FORMAT - " at offset %" G_GUINT64_FORMAT, gst_buffer_get_size (buffer), - demux->offset); + " at offset %" G_GUINT64_FORMAT, klv->length, + klv->offset + klv->consumed); - GST_DEBUG_OBJECT (demux, " type = 0x%02x", key->u[12]); - GST_DEBUG_OBJECT (demux, " essence element count = 0x%02x", key->u[13]); - GST_DEBUG_OBJECT (demux, " essence element type = 0x%02x", key->u[14]); - GST_DEBUG_OBJECT (demux, " essence element number = 0x%02x", key->u[15]); + GST_DEBUG_OBJECT (demux, " type = 0x%02x", klv->key.u[12]); + GST_DEBUG_OBJECT (demux, " essence element count = 0x%02x", klv->key.u[13]); + GST_DEBUG_OBJECT (demux, " essence element type = 0x%02x", klv->key.u[14]); + GST_DEBUG_OBJECT (demux, " essence element number = 
0x%02x", klv->key.u[15]); - if (demux->current_partition->essence_container_offset == 0) + if (demux->current_partition->essence_container_offset == 0) { demux->current_partition->essence_container_offset = demux->offset - demux->current_partition->partition.this_partition - demux->run_in; + if (demux->current_partition->single_track + && demux->current_partition->single_track->wrapping != + MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + demux->current_partition->essence_container_offset += klv->data_offset; + demux->current_partition->clip_klv = *klv; + /* "consume" the initial bytes of the KLV */ + klv->consumed = klv->data_offset; + GST_DEBUG_OBJECT (demux, + "Non-frame wrapping, updated essence_container_offset to %" + G_GUINT64_FORMAT, demux->current_partition->essence_container_offset); + } + } if (!demux->current_package) { GST_ERROR_OBJECT (demux, "No package selected yet"); @@ -1727,70 +2705,157 @@ return GST_FLOW_ERROR; } - track_number = GST_READ_UINT32_BE (&key->u[12]); + /* Identify and fetch the essence track */ + track_number = GST_READ_UINT32_BE (&klv->key.u[12]); - for (i = 0; i < demux->essence_tracks->len; i++) { - GstMXFDemuxEssenceTrack *tmp = - &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); + etrack = demux->current_partition->single_track; + if (!etrack) { + for (i = 0; i < demux->essence_tracks->len; i++) { + GstMXFDemuxEssenceTrack *tmp = + &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); - if (tmp->body_sid == demux->current_partition->partition.body_sid && - (tmp->track_number == track_number || tmp->track_number == 0)) { - etrack = tmp; - break; + if (tmp->body_sid == demux->current_partition->partition.body_sid && + (tmp->track_number == track_number || tmp->track_number == 0)) { + etrack = tmp; + break; + } } - } - if (!etrack) { - GST_WARNING_OBJECT (demux, - "No essence track for this essence element found"); - return GST_FLOW_OK; + if (!etrack) { + GST_DEBUG_OBJECT (demux, + "No essence track for 
this essence element found"); + return GST_FLOW_OK; + } } + GST_DEBUG_OBJECT (demux, + "Handling generic container essence (track %d , position:%" + G_GINT64_FORMAT ", number: 0x%08x , frame-wrapped:%d)", etrack->track_id, + etrack->position, track_number, + etrack->wrapping == MXF_ESSENCE_WRAPPING_FRAME_WRAPPING); + + /* Fetch the current entry. + * + * 1. If we don't have a current position, use find_entry_for_offset() + * 2. If we do have a position, use find_edit_entry() + * + * 3. If we are dealing with frame-wrapped content, pull the corresponding + * data from upstream (because it wasn't provided). If we didn't find an + * entry, error out because we can't deal with a frame-wrapped stream + * without index. + */ + + offset = klv->offset + klv->consumed; + + /* Update the track position (in case of resyncs) */ if (etrack->position == -1) { GST_DEBUG_OBJECT (demux, "Unknown essence track position, looking into index"); - if (etrack->offsets) { - for (i = 0; i < etrack->offsets->len; i++) { - GstMXFDemuxIndex *idx = - &g_array_index (etrack->offsets, GstMXFDemuxIndex, i); - - if (idx->initialized && idx->offset != 0 - && idx->offset == demux->offset - demux->run_in) { - etrack->position = i; - break; - } + if (!find_entry_for_offset (demux, etrack, offset - demux->run_in, + &index_entry)) { + GST_WARNING_OBJECT (demux, "Essence track position not in index"); + return GST_FLOW_OK; + } + /* Update track position */ + etrack->position = index_entry.dts; + } else if (etrack->delta_id == MXF_INDEX_DELTA_ID_UNKNOWN) { + GST_DEBUG_OBJECT (demux, + "Unknown essence track delta_id, looking into index"); + if (!find_entry_for_offset (demux, etrack, offset - demux->run_in, + &index_entry)) { + /* Non-fatal, fallback to legacy mode */ + GST_WARNING_OBJECT (demux, "Essence track position not in index"); + } else if (etrack->position != index_entry.dts) { + GST_ERROR_OBJECT (demux, + "track position doesn't match %" G_GINT64_FORMAT " entry dts %" + G_GINT64_FORMAT, 
etrack->position, index_entry.dts); + return GST_FLOW_ERROR; + } + } else { + if (!find_edit_entry (demux, etrack, etrack->position, FALSE, &index_entry)) { + GST_DEBUG_OBJECT (demux, "Couldn't find entry"); + } else if (etrack->wrapping == MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + if (etrack->delta_id != MXF_INDEX_DELTA_ID_IGNORE + && index_entry.offset != offset) { + GST_ERROR_OBJECT (demux, + "demux offset doesn't match %" G_GINT64_FORMAT " entry offset %" + G_GUINT64_FORMAT, offset, index_entry.offset); + return GST_FLOW_ERROR; + } + } else if (index_entry.offset != klv->offset + klv->consumed && + index_entry.offset != klv->offset + klv->data_offset) { + GST_ERROR_OBJECT (demux, + "KLV offset doesn't match %" G_GINT64_FORMAT " entry offset %" + G_GUINT64_FORMAT, klv->offset + klv->consumed, index_entry.offset); + return GST_FLOW_ERROR; + } + } - if (etrack->position == -1) { - GST_WARNING_OBJECT (demux, "Essence track position not in index"); - return GST_FLOW_OK; + if (etrack->wrapping != MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + /* We need entry information to deal with non-frame-wrapped content */ + if (!index_entry.initialized) { + GST_ELEMENT_ERROR (demux, STREAM, WRONG_TYPE, (NULL), + ("Essence with non-frame-wrapping requires an index table to be present")); + return GST_FLOW_ERROR; + } + /* We cannot deal with non-frame-wrapping in push mode for now */ + if (!demux->random_access) { + GST_ELEMENT_ERROR (demux, STREAM, WRONG_TYPE, (NULL), + ("Non-frame-wrapping is not supported in push mode")); + return GST_FLOW_ERROR; + } } - if (etrack->offsets && etrack->offsets->len > etrack->position) { - GstMXFDemuxIndex *index = - &g_array_index (etrack->offsets, GstMXFDemuxIndex, etrack->position); - if (index->initialized && index->offset != 0) - keyframe = index->keyframe; - if (index->initialized && index->pts != G_MAXUINT64) - pts = index->pts; - if (index->initialized && index->dts != G_MAXUINT64) - dts = index->dts; - } - - /* Create subbuffer to be able to change
metadata */ - inbuf = - gst_buffer_copy_region (buffer, GST_BUFFER_COPY_ALL, 0, - gst_buffer_get_size (buffer)); + /* FIXME : If we're peeking and don't need to actually parse the data, we + * should avoid pulling the content from upstream */ + if (etrack->wrapping != MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + g_assert (index_entry.size); + GST_DEBUG_OBJECT (demux, "Should only grab %" G_GUINT64_FORMAT " bytes", + index_entry.size); + ret = + gst_mxf_demux_pull_range (demux, index_entry.offset, index_entry.size, + &inbuf); + if (ret != GST_FLOW_OK) + return ret; + if (klv->consumed == 0) + klv->consumed = klv->data_offset + index_entry.size; + else + klv->consumed += index_entry.size; + if (klv != &demux->current_partition->clip_klv) + demux->current_partition->clip_klv = *klv; + GST_LOG_OBJECT (demux, + "klv data_offset:%" G_GUINT64_FORMAT " length:%" G_GSIZE_FORMAT + " consumed:%" G_GUINT64_FORMAT, klv->data_offset, klv->length, + klv->consumed); + /* Switch back to KLV mode if we're done with this one */ + if (klv->length + klv->data_offset == klv->consumed) + demux->state = GST_MXF_DEMUX_STATE_KLV; + else + demux->state = GST_MXF_DEMUX_STATE_ESSENCE; + } else { + + ret = gst_mxf_demux_fill_klv (demux, klv); + if (ret != GST_FLOW_OK) + return ret; - if (!keyframe) + /* Create subbuffer to be able to change metadata */ + inbuf = + gst_buffer_copy_region (klv->data, GST_BUFFER_COPY_ALL, 0, + gst_buffer_get_size (klv->data)); + + } + + if (index_entry.initialized) { + GST_DEBUG_OBJECT (demux, "Got entry dts:%" G_GINT64_FORMAT " keyframe:%d", + index_entry.dts, index_entry.keyframe); + } + if (index_entry.initialized && !index_entry.keyframe) GST_BUFFER_FLAG_SET (inbuf, GST_BUFFER_FLAG_DELTA_UNIT); if (etrack->handle_func) { /* Takes ownership of inbuf */ ret = - etrack->handle_func (key, inbuf, etrack->caps, + etrack->handle_func (&klv->key, inbuf, etrack->caps, etrack->source_track, etrack->mapping_data, &outbuf); inbuf = NULL; } else { @@ -1808,71 +2873,27 @@ 
return ret; } - if (outbuf) - keyframe = !GST_BUFFER_FLAG_IS_SET (outbuf, GST_BUFFER_FLAG_DELTA_UNIT); - - /* Prefer keyframe information from index tables over everything else */ - if (demux->index_tables) { - GList *l; - GstMXFDemuxIndexTable *index_table = NULL; - - for (l = demux->index_tables; l; l = l->next) { - GstMXFDemuxIndexTable *tmp = l->data; - - if (tmp->body_sid == etrack->body_sid - && tmp->index_sid == etrack->index_sid) { - index_table = tmp; - break; - } - } - - if (index_table && index_table->offsets->len > etrack->position) { - GstMXFDemuxIndex *index = - &g_array_index (index_table->offsets, GstMXFDemuxIndex, - etrack->position); - if (index->initialized && index->offset != 0) { - keyframe = index->keyframe; - - if (outbuf) { - if (keyframe) - GST_BUFFER_FLAG_UNSET (outbuf, GST_BUFFER_FLAG_DELTA_UNIT); - else - GST_BUFFER_FLAG_SET (outbuf, GST_BUFFER_FLAG_DELTA_UNIT); - } - } - - if (index->initialized && index->pts != G_MAXUINT64) - pts = index->pts; - if (index->initialized && index->dts != G_MAXUINT64) - dts = index->dts; - } - } - - if (!etrack->offsets) - etrack->offsets = g_array_new (FALSE, TRUE, sizeof (GstMXFDemuxIndex)); - - { - if (etrack->offsets->len > etrack->position) { - GstMXFDemuxIndex *index = - &g_array_index (etrack->offsets, GstMXFDemuxIndex, etrack->position); - - index->offset = demux->offset - demux->run_in; - index->initialized = TRUE; - index->pts = pts; - index->dts = dts; - index->keyframe = keyframe; - } else if (etrack->position < G_MAXINT) { - GstMXFDemuxIndex index; - - index.offset = demux->offset - demux->run_in; - index.initialized = TRUE; - index.pts = pts; - index.dts = dts; - index.keyframe = keyframe; - if (etrack->offsets->len < etrack->position) - g_array_set_size (etrack->offsets, etrack->position + 1); - g_array_insert_val (etrack->offsets, etrack->position, index); - } + if (!index_entry.initialized) { + /* This can happen when doing scanning without entry tables */ + index_entry.duration = 1; + 
index_entry.offset = demux->offset - demux->run_in; + index_entry.dts = etrack->position; + index_entry.pts = etrack->intra_only ? etrack->position : G_MAXUINT64; + index_entry.keyframe = + !GST_BUFFER_FLAG_IS_SET (outbuf, GST_BUFFER_FLAG_DELTA_UNIT); + index_entry.initialized = TRUE; + GST_DEBUG_OBJECT (demux, + "Storing newly discovered information on track %d. dts: %" + G_GINT64_FORMAT " offset:%" G_GUINT64_FORMAT " keyframe:%d", + etrack->track_id, index_entry.dts, index_entry.offset, + index_entry.keyframe); + + if (!etrack->offsets) + etrack->offsets = g_array_new (FALSE, TRUE, sizeof (GstMXFDemuxIndex)); + + /* We only ever append to the track offset entry. */ + g_assert (etrack->position <= etrack->offsets->len); + g_array_insert_val (etrack->offsets, etrack->position, index_entry); } if (peek) @@ -1886,6 +2907,8 @@ inbuf = outbuf; outbuf = NULL; + max_temporal_offset = get_track_max_temporal_offset (demux, etrack); + for (i = 0; i < demux->src->len; i++) { GstMXFDemuxPad *pad = g_ptr_array_index (demux->src, i); @@ -1893,12 +2916,15 @@ continue; if (pad->eos) { - GST_DEBUG_OBJECT (demux, "Pad is already EOS"); + GST_DEBUG_OBJECT (pad, "Pad is already EOS"); continue; } - if (etrack->position != pad->current_essence_track_position) { - GST_DEBUG_OBJECT (demux, "Not at current component's position"); + if (etrack->position < pad->current_essence_track_position) { + GST_DEBUG_OBJECT (pad, + "Not at current component's position (track:%" G_GINT64_FORMAT + " essence:%" G_GINT64_FORMAT ")", etrack->position, + pad->current_essence_track_position); continue; } @@ -1907,7 +2933,10 @@ if (earliest && earliest != pad && earliest->position < pad->position && pad->position - earliest->position > demux->max_drift) { - GST_DEBUG_OBJECT (demux, "Pad is too far ahead of time"); + GST_DEBUG_OBJECT (earliest, + "Pad is too far ahead of time (%" GST_TIME_FORMAT " vs earliest:%" + GST_TIME_FORMAT ")", GST_TIME_ARGS (earliest->position), + GST_TIME_ARGS (pad->position)); 
continue; } } @@ -1917,6 +2946,8 @@ gst_buffer_copy_region (inbuf, GST_BUFFER_COPY_ALL, 0, gst_buffer_get_size (inbuf)); + pts = index_entry.pts; + GST_BUFFER_DTS (outbuf) = pad->position; if (etrack->intra_only) { GST_BUFFER_PTS (outbuf) = pad->position; @@ -1928,12 +2959,21 @@ gst_util_uint64_scale (pad->current_component_start_position * GST_SECOND, pad->material_track->edit_rate.d, pad->material_track->edit_rate.n); + /* We are dealing with reordered data, the PTS is shifted forward by the + * maximum temporal reordering (the DTS remain as-is). */ + if (max_temporal_offset > 0) + GST_BUFFER_PTS (outbuf) += + gst_util_uint64_scale (max_temporal_offset * GST_SECOND, + pad->current_essence_track->source_track->edit_rate.d, + pad->current_essence_track->source_track->edit_rate.n); + } else { GST_BUFFER_PTS (outbuf) = GST_CLOCK_TIME_NONE; } GST_BUFFER_DURATION (outbuf) = gst_util_uint64_scale (GST_SECOND, + index_entry.duration * pad->current_essence_track->source_track->edit_rate.d, pad->current_essence_track->source_track->edit_rate.n); GST_BUFFER_OFFSET (outbuf) = GST_BUFFER_OFFSET_NONE; @@ -1985,7 +3025,27 @@ gst_pad_push_event (GST_PAD_CAST (pad), gst_event_ref (demux->close_seg_event)); - e = gst_event_new_segment (&demux->segment); + if (max_temporal_offset > 0) { + GstSegment shift_segment; + /* Handle maximum temporal offset. We are shifting all output PTS for + * this stream by the greatest temporal reordering that can occur. 
In + * order not to change the stream/running time we shift the segment + * start and stop values accordingly */ + gst_segment_copy_into (&demux->segment, &shift_segment); + if (GST_CLOCK_TIME_IS_VALID (shift_segment.start)) + shift_segment.start += + gst_util_uint64_scale (max_temporal_offset * GST_SECOND, + pad->current_essence_track->source_track->edit_rate.d, + pad->current_essence_track->source_track->edit_rate.n); + if (GST_CLOCK_TIME_IS_VALID (shift_segment.stop)) + shift_segment.stop += + gst_util_uint64_scale (max_temporal_offset * GST_SECOND, + pad->current_essence_track->source_track->edit_rate.d, + pad->current_essence_track->source_track->edit_rate.n); + e = gst_event_new_segment (&shift_segment); + } else + e = gst_event_new_segment (&demux->segment); + GST_DEBUG_OBJECT (pad, "Sending segment %" GST_PTR_FORMAT, e); gst_event_set_seqnum (e, demux->seqnum); gst_pad_push_event (GST_PAD_CAST (pad), e); pad->need_segment = FALSE; @@ -1997,17 +3057,7 @@ } pad->position += GST_BUFFER_DURATION (outbuf); - pad->current_material_track_position++; - - GST_DEBUG_OBJECT (demux, - "Pushing buffer of size %" G_GSIZE_FORMAT " for track %u: pts %" - GST_TIME_FORMAT " dts %" GST_TIME_FORMAT " duration %" GST_TIME_FORMAT - " position %" G_GUINT64_FORMAT, gst_buffer_get_size (outbuf), - pad->material_track->parent.track_id, - GST_TIME_ARGS (GST_BUFFER_PTS (outbuf)), - GST_TIME_ARGS (GST_BUFFER_DTS (outbuf)), - GST_TIME_ARGS (GST_BUFFER_DURATION (outbuf)), - pad->current_essence_track_position); + pad->current_material_track_position += index_entry.duration; if (pad->discont) { GST_BUFFER_FLAG_SET (outbuf, GST_BUFFER_FLAG_DISCONT); @@ -2026,11 +3076,21 @@ "Replacing empty gap buffer with gap event %" GST_PTR_FORMAT, gap); gst_pad_push_event (GST_PAD_CAST (pad), gap); } else { + GST_DEBUG_OBJECT (pad, + "Pushing buffer of size %" G_GSIZE_FORMAT " for track %u: pts %" + GST_TIME_FORMAT " dts %" GST_TIME_FORMAT " duration %" GST_TIME_FORMAT + " position %" G_GUINT64_FORMAT, 
gst_buffer_get_size (outbuf), + pad->material_track->parent.track_id, + GST_TIME_ARGS (GST_BUFFER_PTS (outbuf)), + GST_TIME_ARGS (GST_BUFFER_DTS (outbuf)), + GST_TIME_ARGS (GST_BUFFER_DURATION (outbuf)), + pad->current_essence_track_position); + ret = gst_pad_push (GST_PAD_CAST (pad), outbuf); } outbuf = NULL; ret = gst_flow_combiner_update_flow (demux->flowcombiner, ret); - GST_LOG_OBJECT (demux, "combined return %s", gst_flow_get_name (ret)); + GST_LOG_OBJECT (pad, "combined return %s", gst_flow_get_name (ret)); if (pad->position > demux->segment.position) demux->segment.position = pad->position; @@ -2038,7 +3098,7 @@ if (ret != GST_FLOW_OK) goto out; - pad->current_essence_track_position++; + pad->current_essence_track_position += index_entry.duration; if (pad->current_component) { if (pad->current_component_duration > 0 && @@ -2051,6 +3111,9 @@ pad->current_component_index + 1); if (ret != GST_FLOW_OK && ret != GST_FLOW_EOS) { GST_ERROR_OBJECT (demux, "Switching component failed"); + } else { + pad->current_essence_track->position = + pad->current_essence_track_position; } } else if (etrack->duration > 0 && pad->current_essence_track_position >= etrack->duration) { @@ -2067,7 +3130,7 @@ if (ret == GST_FLOW_EOS) { GstEvent *e; - GST_DEBUG_OBJECT (demux, "EOS for track"); + GST_DEBUG_OBJECT (pad, "EOS for track"); pad->eos = TRUE; e = gst_event_new_eos (); gst_event_set_seqnum (e, demux->seqnum); @@ -2086,126 +3149,123 @@ if (outbuf) gst_buffer_unref (outbuf); - etrack->position++; + etrack->position += index_entry.duration; return ret; } +/* + * Called when analyzing the (RIP) Random Index Pack. + * + * FIXME : If a file doesn't have a RIP, we should iterate the partition headers + * to collect as much information as possible. 
+ * + * This function collects as much information as possible from the partition headers: + * * Store partition information in the list of partitions + * * Handle any index table segment present + */ static void read_partition_header (GstMXFDemux * demux) { - GstBuffer *buf; - MXFUL key; - guint read; + GstMXFKLV klv; - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, &read) - != GST_FLOW_OK) - return; - - if (!mxf_is_partition_pack (&key)) { - gst_buffer_unref (buf); + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv) != GST_FLOW_OK + || !mxf_is_partition_pack (&klv.key)) { return; } - if (gst_mxf_demux_handle_partition_pack (demux, &key, buf) != GST_FLOW_OK) { - gst_buffer_unref (buf); + if (gst_mxf_demux_handle_partition_pack (demux, &klv) != GST_FLOW_OK) { + if (klv.data) + gst_buffer_unref (klv.data); return; } - demux->offset += read; - gst_buffer_unref (buf); + gst_mxf_demux_consume_klv (demux, &klv); - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, &read) - != GST_FLOW_OK) + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv) != GST_FLOW_OK) return; - while (mxf_is_fill (&key)) { - demux->offset += read; - gst_buffer_unref (buf); - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, &read) - != GST_FLOW_OK) + while (mxf_is_fill (&klv.key)) { + gst_mxf_demux_consume_klv (demux, &klv); + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, + &klv) != GST_FLOW_OK) return; } - if (!mxf_is_index_table_segment (&key) + if (!mxf_is_index_table_segment (&klv.key) && demux->current_partition->partition.header_byte_count) { - gst_buffer_unref (buf); demux->offset += demux->current_partition->partition.header_byte_count; - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, &read) - != GST_FLOW_OK) + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, + &klv) != GST_FLOW_OK) return; } - while (mxf_is_fill (&key)) { - demux->offset += read; - gst_buffer_unref 
(buf); - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, &read) - != GST_FLOW_OK) + while (mxf_is_fill (&klv.key)) { + gst_mxf_demux_consume_klv (demux, &klv); + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, + &klv) != GST_FLOW_OK) return; } if (demux->current_partition->partition.index_byte_count - && mxf_is_index_table_segment (&key)) { + && mxf_is_index_table_segment (&klv.key)) { guint64 index_end_offset = demux->offset + demux->current_partition->partition.index_byte_count; while (demux->offset < index_end_offset) { - if (mxf_is_index_table_segment (&key)) { - gst_mxf_demux_handle_index_table_segment (demux, &key, buf, - demux->offset); - } - demux->offset += read; + if (mxf_is_index_table_segment (&klv.key)) + gst_mxf_demux_handle_index_table_segment (demux, &klv); + gst_mxf_demux_consume_klv (demux, &klv); - gst_buffer_unref (buf); - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, - &read) - != GST_FLOW_OK) + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, + &klv) != GST_FLOW_OK) return; } } - while (mxf_is_fill (&key)) { - demux->offset += read; - gst_buffer_unref (buf); - if (gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buf, &read) - != GST_FLOW_OK) + while (mxf_is_fill (&klv.key)) { + gst_mxf_demux_consume_klv (demux, &klv); + if (gst_mxf_demux_peek_klv_packet (demux, demux->offset, + &klv) != GST_FLOW_OK) return; } - if (mxf_is_generic_container_system_item (&key) || - mxf_is_generic_container_essence_element (&key) || - mxf_is_avid_essence_container_essence_element (&key)) { + if (mxf_is_generic_container_system_item (&klv.key) || + mxf_is_generic_container_essence_element (&klv.key) || + mxf_is_avid_essence_container_essence_element (&klv.key)) { if (demux->current_partition->essence_container_offset == 0) demux->current_partition->essence_container_offset = demux->offset - demux->current_partition->partition.this_partition - demux->run_in; } - - gst_buffer_unref (buf); } 
static GstFlowReturn -gst_mxf_demux_handle_random_index_pack (GstMXFDemux * demux, const MXFUL * key, - GstBuffer * buffer) +gst_mxf_demux_handle_random_index_pack (GstMXFDemux * demux, GstMXFKLV * klv) { guint i; GList *l; GstMapInfo map; gboolean ret; + GstFlowReturn flowret; GST_DEBUG_OBJECT (demux, "Handling random index pack of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT, gst_buffer_get_size (buffer), demux->offset); + G_GUINT64_FORMAT, klv->length, klv->offset); if (demux->random_index_pack) { GST_DEBUG_OBJECT (demux, "Already parsed random index pack"); return GST_FLOW_OK; } - gst_buffer_map (buffer, &map, GST_MAP_READ); + flowret = gst_mxf_demux_fill_klv (demux, klv); + if (flowret != GST_FLOW_OK) + return flowret; + + gst_buffer_map (klv->data, &map, GST_MAP_READ); ret = - mxf_random_index_pack_parse (key, map.data, map.size, + mxf_random_index_pack_parse (&klv->key, map.data, map.size, &demux->random_index_pack); - gst_buffer_unmap (buffer, &map); + gst_buffer_unmap (klv->data, &map); if (!ret) { GST_ERROR_OBJECT (demux, "Parsing random index pack failed"); @@ -2256,23 +3316,59 @@ return GST_FLOW_OK; } +static gint +compare_index_table_segment (MXFIndexTableSegment * sa, + MXFIndexTableSegment * sb) +{ + if (mxf_uuid_is_equal (&sa->instance_id, &sb->instance_id)) + return 0; + if (sa->body_sid != sb->body_sid) + return (sa->body_sid < sb->body_sid) ? -1 : 1; + if (sa->index_sid != sb->index_sid) + return (sa->index_sid < sb->index_sid) ? 
-1 : 1; + /* Finally sort by index start position */ + if (sa->index_start_position < sb->index_start_position) + return -1; + return (sa->index_start_position != sb->index_start_position); +} + +#if !GLIB_CHECK_VERSION(2, 62, 0) +static gboolean +has_table_segment (GArray * segments, MXFIndexTableSegment * target) +{ + guint i; + for (i = 0; i < segments->len; i++) { + MXFIndexTableSegment *cand = + &g_array_index (segments, MXFIndexTableSegment, i); + if (mxf_uuid_is_equal (&cand->instance_id, &target->instance_id)) + return TRUE; + } + return FALSE; +} +#endif + static GstFlowReturn -gst_mxf_demux_handle_index_table_segment (GstMXFDemux * demux, - const MXFUL * key, GstBuffer * buffer, guint64 offset) +gst_mxf_demux_handle_index_table_segment (GstMXFDemux * demux, GstMXFKLV * klv) { MXFIndexTableSegment *segment; GstMapInfo map; gboolean ret; + GList *tmp; + GstFlowReturn flowret; + + flowret = gst_mxf_demux_fill_klv (demux, klv); + if (flowret != GST_FLOW_OK) + return flowret; GST_DEBUG_OBJECT (demux, "Handling index table segment of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT, gst_buffer_get_size (buffer), offset); + G_GUINT64_FORMAT, klv->length, klv->offset); segment = g_new0 (MXFIndexTableSegment, 1); - gst_buffer_map (buffer, &map, GST_MAP_READ); - ret = mxf_index_table_segment_parse (key, segment, map.data, map.size); - gst_buffer_unmap (buffer, &map); + gst_buffer_map (klv->data, &map, GST_MAP_READ); + ret = mxf_index_table_segment_parse (&klv->key, segment, map.data, map.size); + gst_buffer_unmap (klv->data, &map); if (!ret) { GST_ERROR_OBJECT (demux, "Parsing index table segment failed"); @@ -2280,27 +3376,49 @@ return GST_FLOW_ERROR; } + /* Drop it if we already saw it. 
Ideally we should be able to do this before + parsing (by checking instance UID) */ + if (g_list_find_custom (demux->pending_index_table_segments, segment, + (GCompareFunc) compare_index_table_segment)) { + GST_DEBUG_OBJECT (demux, "Already in pending list"); + g_free (segment); + return GST_FLOW_OK; + } + for (tmp = demux->index_tables; tmp; tmp = tmp->next) { + GstMXFDemuxIndexTable *table = (GstMXFDemuxIndexTable *) tmp->data; +#if !GLIB_CHECK_VERSION (2, 62, 0) + if (has_table_segment (table->segments, segment)) { +#else + if (g_array_binary_search (table->segments, segment, + (GCompareFunc) compare_index_table_segment, NULL)) { +#endif + GST_DEBUG_OBJECT (demux, "Already handled"); + g_free (segment); + return GST_FLOW_OK; + } + } + demux->pending_index_table_segments = - g_list_prepend (demux->pending_index_table_segments, segment); + g_list_insert_sorted (demux->pending_index_table_segments, segment, + (GCompareFunc) compare_index_table_segment); return GST_FLOW_OK; } static GstFlowReturn -gst_mxf_demux_pull_klv_packet (GstMXFDemux * demux, guint64 offset, MXFUL * key, - GstBuffer ** outbuf, guint * read) +gst_mxf_demux_peek_klv_packet (GstMXFDemux * demux, guint64 offset, + GstMXFKLV * klv) { GstBuffer *buffer = NULL; const guint8 *data; - guint64 data_offset = 0; - guint64 length; GstFlowReturn ret = GST_FLOW_OK; GstMapInfo map; #ifndef GST_DISABLE_GST_DEBUG gchar str[48]; #endif - memset (key, 0, sizeof (MXFUL)); + memset (klv, 0, sizeof (GstMXFKLV)); + klv->offset = offset; /* Pull 16 byte key and first byte of BER encoded length */ if ((ret = @@ -2309,19 +3427,16 @@ gst_buffer_map (buffer, &map, GST_MAP_READ); - memcpy (key, map.data, 16); - - GST_DEBUG_OBJECT (demux, "Got KLV packet with key %s", mxf_ul_to_string (key, - str)); + memcpy (&klv->key, map.data, 16); /* Decode BER encoded packet length */ if ((map.data[16] & 0x80) == 0) { - length = map.data[16]; - data_offset = 17; + klv->length = map.data[16]; + klv->data_offset = 17; } else { guint slen 
= map.data[16] & 0x7f; - data_offset = 16 + 1 + slen; + klv->data_offset = 16 + 1 + slen; gst_buffer_unmap (buffer, &map); gst_buffer_unref (buffer); @@ -2342,9 +3457,9 @@ gst_buffer_map (buffer, &map, GST_MAP_READ); data = map.data; - length = 0; + klv->length = 0; while (slen) { - length = (length << 8) | *data; + klv->length = (klv->length << 8) | *data; data++; slen--; } @@ -2356,25 +3471,17 @@ /* GStreamer's buffer sizes are stored in a guint so we * limit ourself to G_MAXUINT large buffers */ - if (length > G_MAXUINT) { + if (klv->length > G_MAXUINT) { GST_ERROR_OBJECT (demux, - "Unsupported KLV packet length: %" G_GUINT64_FORMAT, length); + "Unsupported KLV packet length: %" G_GSIZE_FORMAT, klv->length); ret = GST_FLOW_ERROR; goto beach; } - GST_DEBUG_OBJECT (demux, "KLV packet with key %s has length " - "%" G_GUINT64_FORMAT, mxf_ul_to_string (key, str), length); - - /* Pull the complete KLV packet */ - if ((ret = gst_mxf_demux_pull_range (demux, offset + data_offset, length, - &buffer)) != GST_FLOW_OK) - goto beach; - - *outbuf = buffer; - buffer = NULL; - if (read) - *read = data_offset + length; + GST_DEBUG_OBJECT (demux, + "Found KLV packet at offset %" G_GUINT64_FORMAT " with key %s and length " + "%" G_GSIZE_FORMAT, offset, mxf_ul_to_string (&klv->key, str), + klv->length); beach: if (buffer) @@ -2383,6 +3490,39 @@ return ret; } +static GstFlowReturn +gst_mxf_demux_fill_klv (GstMXFDemux * demux, GstMXFKLV * klv) +{ + if (klv->data) + return GST_FLOW_OK; + GST_DEBUG_OBJECT (demux, + "Pulling %" G_GSIZE_FORMAT " bytes from offset %" G_GUINT64_FORMAT, + klv->length, klv->offset + klv->data_offset); + return gst_mxf_demux_pull_range (demux, klv->offset + klv->data_offset, + klv->length, &klv->data); +} + +/* Call when done with a klv. Will release the buffer (if any) and will update + * the demuxer offset position. 
Do *NOT* call if you do not want the demuxer + * offset to be updated */ +static void +gst_mxf_demux_consume_klv (GstMXFDemux * demux, GstMXFKLV * klv) +{ + if (klv->data) { + gst_buffer_unref (klv->data); + klv->data = NULL; + } + GST_DEBUG_OBJECT (demux, + "Consuming KLV offset:%" G_GUINT64_FORMAT " data_offset:%" + G_GUINT64_FORMAT " length:%" G_GSIZE_FORMAT " consumed:%" + G_GUINT64_FORMAT, klv->offset, klv->data_offset, klv->length, + klv->consumed); + if (klv->consumed) + demux->offset = klv->offset + klv->consumed; + else + demux->offset += klv->data_offset + klv->length; +} + static void gst_mxf_demux_pull_random_index_pack (GstMXFDemux * demux) { @@ -2391,9 +3531,9 @@ GstFormat fmt = GST_FORMAT_BYTES; guint32 pack_size; guint64 old_offset = demux->offset; - MXFUL key; GstMapInfo map; GstFlowReturn flow_ret; + GstMXFKLV klv; if (!gst_pad_peer_query_duration (demux->sinkpad, fmt, &filesize) || fmt != GST_FORMAT_BYTES || filesize == -1) { @@ -2423,32 +3563,22 @@ return; } - buffer = NULL; - if (gst_mxf_demux_pull_range (demux, filesize - pack_size, 16, - &buffer) != GST_FLOW_OK) { + /* Peek for klv at filesize - pack_size */ + if (gst_mxf_demux_peek_klv_packet (demux, filesize - pack_size, + &klv) != GST_FLOW_OK) { GST_DEBUG_OBJECT (demux, "Failed pulling random index pack key"); return; } - gst_buffer_map (buffer, &map, GST_MAP_READ); - memcpy (&key, map.data, 16); - gst_buffer_unmap (buffer, &map); - gst_buffer_unref (buffer); - - if (!mxf_is_random_index_pack (&key)) { + if (!mxf_is_random_index_pack (&klv.key)) { GST_DEBUG_OBJECT (demux, "No random index pack"); return; } demux->offset = filesize - pack_size; - if (gst_mxf_demux_pull_klv_packet (demux, filesize - pack_size, &key, - &buffer, NULL) != GST_FLOW_OK) { - GST_DEBUG_OBJECT (demux, "Failed pulling random index pack"); - return; - } - - flow_ret = gst_mxf_demux_handle_random_index_pack (demux, &key, buffer); - gst_buffer_unref (buffer); + flow_ret = gst_mxf_demux_handle_random_index_pack (demux, 
&klv); + if (klv.data) + gst_buffer_unref (klv.data); demux->offset = old_offset; if (flow_ret == GST_FLOW_OK && !demux->index_table_segments_collected) { @@ -2461,12 +3591,12 @@ gst_mxf_demux_parse_footer_metadata (GstMXFDemux * demux) { guint64 old_offset = demux->offset; - MXFUL key; - GstBuffer *buffer = NULL; - guint read = 0; + GstMXFKLV klv; GstFlowReturn flow = GST_FLOW_OK; GstMXFDemuxPartition *old_partition = demux->current_partition; + GST_DEBUG_OBJECT (demux, "Parsing footer metadata"); + demux->current_partition = NULL; gst_mxf_demux_reset_metadata (demux); @@ -2481,23 +3611,29 @@ } next_try: - flow = - gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buffer, - &read); + GST_LOG_OBJECT (demux, "Peeking partition pack at offset %" G_GUINT64_FORMAT, + demux->offset); + + /* Process Partition Pack */ + flow = gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv); if (G_UNLIKELY (flow != GST_FLOW_OK)) goto out; - if (!mxf_is_partition_pack (&key)) + if (!mxf_is_partition_pack (&klv.key)) goto out; - if (gst_mxf_demux_handle_partition_pack (demux, &key, buffer) != GST_FLOW_OK) + if (gst_mxf_demux_handle_partition_pack (demux, &klv) != GST_FLOW_OK) { + if (klv.data) + gst_buffer_unref (klv.data); goto out; + } - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; + gst_mxf_demux_consume_klv (demux, &klv); + /* If there's no Header Metadata in this partition, jump to the previous + * one */ if (demux->current_partition->partition.header_byte_count == 0) { + /* Reached the first partition, bail out */ if (demux->current_partition->partition.this_partition == 0) goto out; @@ -2506,11 +3642,11 @@ goto next_try; } + /* Next up should be an optional fill pack followed by a primer pack */ while (TRUE) { - flow = - gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buffer, - &read); + flow = gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv); if (G_UNLIKELY (flow != GST_FLOW_OK)) { + /* If ever we can't get the next 
KLV, jump to the previous partition */ if (!demux->current_partition->partition.prev_partition) goto out; demux->offset = @@ -2518,17 +3654,13 @@ goto next_try; } - if (mxf_is_fill (&key)) { - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; - } else if (mxf_is_primer_pack (&key)) { + if (mxf_is_fill (&klv.key)) { + gst_mxf_demux_consume_klv (demux, &klv); + } else if (mxf_is_primer_pack (&klv.key)) { + /* Update primer mapping if present (jump to previous if it failed) */ if (!demux->current_partition->primer.mappings) { - if (gst_mxf_demux_handle_primer_pack (demux, &key, - buffer) != GST_FLOW_OK) { - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; + if (gst_mxf_demux_handle_primer_pack (demux, &klv) != GST_FLOW_OK) { + gst_mxf_demux_consume_klv (demux, &klv); if (!demux->current_partition->partition.prev_partition) goto out; demux->offset = @@ -2537,13 +3669,9 @@ goto next_try; } } - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; + gst_mxf_demux_consume_klv (demux, &klv); break; } else { - gst_buffer_unref (buffer); - buffer = NULL; if (!demux->current_partition->partition.prev_partition) goto out; demux->offset = @@ -2552,13 +3680,11 @@ } } - /* parse metadata */ + /* parse metadata for this partition */ while (demux->offset < demux->run_in + demux->current_partition->primer.offset + demux->current_partition->partition.header_byte_count) { - flow = - gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buffer, - &read); + flow = gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv); if (G_UNLIKELY (flow != GST_FLOW_OK)) { if (!demux->current_partition->partition.prev_partition) goto out; @@ -2567,11 +3693,9 @@ goto next_try; } - if (mxf_is_metadata (&key)) { - flow = gst_mxf_demux_handle_metadata (demux, &key, buffer); - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; + if (mxf_is_metadata (&klv.key)) { + flow = gst_mxf_demux_handle_metadata (demux, &klv); + 
gst_mxf_demux_consume_klv (demux, &klv); if (G_UNLIKELY (flow != GST_FLOW_OK)) { gst_mxf_demux_reset_metadata (demux); @@ -2581,33 +3705,20 @@ demux->run_in + demux->current_partition->partition.prev_partition; goto next_try; } - } else if (mxf_is_descriptive_metadata (&key)) { - gst_mxf_demux_handle_descriptive_metadata (demux, &key, buffer); - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; - } else if (mxf_is_fill (&key)) { - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; - } else if (mxf_is_generic_container_system_item (&key) || - mxf_is_generic_container_essence_element (&key) || - mxf_is_avid_essence_container_essence_element (&key)) { - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; - break; + } else if (mxf_is_descriptive_metadata (&klv.key)) { + gst_mxf_demux_handle_descriptive_metadata (demux, &klv); + gst_mxf_demux_consume_klv (demux, &klv); } else { - demux->offset += read; - gst_buffer_unref (buffer); - buffer = NULL; + gst_mxf_demux_consume_klv (demux, &klv); } } /* resolve references etc */ if (!demux->preface || gst_mxf_demux_resolve_references (demux) != GST_FLOW_OK || gst_mxf_demux_update_tracks (demux) != GST_FLOW_OK) { + /* Don't attempt to parse metadata from this partition again */ demux->current_partition->parsed_metadata = TRUE; + /* Skip to previous partition or bail out */ if (!demux->current_partition->partition.prev_partition) goto out; demux->offset = @@ -2616,17 +3727,15 @@ } out: - if (buffer) - gst_buffer_unref (buffer); - demux->offset = old_offset; demux->current_partition = old_partition; } static GstFlowReturn -gst_mxf_demux_handle_klv_packet (GstMXFDemux * demux, const MXFUL * key, - GstBuffer * buffer, gboolean peek) +gst_mxf_demux_handle_klv_packet (GstMXFDemux * demux, GstMXFKLV * klv, + gboolean peek) { + MXFUL *key = &klv->key; #ifndef GST_DISABLE_GST_DEBUG gchar key_str[48]; #endif @@ -2654,45 +3763,29 @@ if (!mxf_is_mxf_packet (key)) { GST_WARNING_OBJECT 
(demux, "Skipping non-MXF packet of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT ", key: %s", gst_buffer_get_size (buffer), + G_GUINT64_FORMAT ", key: %s", klv->length, demux->offset, mxf_ul_to_string (key, key_str)); } else if (mxf_is_partition_pack (key)) { - ret = gst_mxf_demux_handle_partition_pack (demux, key, buffer); - - /* If this partition contains the start of an essence container - * set the positions of all essence streams to 0 - */ - if (ret == GST_FLOW_OK && demux->current_partition - && demux->current_partition->partition.body_sid != 0 - && demux->current_partition->partition.body_offset == 0) { - guint i; - - for (i = 0; i < demux->essence_tracks->len; i++) { - GstMXFDemuxEssenceTrack *etrack = - &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); - - if (etrack->body_sid != demux->current_partition->partition.body_sid) - continue; - - etrack->position = 0; - } - } + ret = gst_mxf_demux_handle_partition_pack (demux, klv); } else if (mxf_is_primer_pack (key)) { - ret = gst_mxf_demux_handle_primer_pack (demux, key, buffer); + ret = gst_mxf_demux_handle_primer_pack (demux, klv); } else if (mxf_is_metadata (key)) { - ret = gst_mxf_demux_handle_metadata (demux, key, buffer); + ret = gst_mxf_demux_handle_metadata (demux, klv); } else if (mxf_is_descriptive_metadata (key)) { - ret = gst_mxf_demux_handle_descriptive_metadata (demux, key, buffer); + ret = gst_mxf_demux_handle_descriptive_metadata (demux, klv); } else if (mxf_is_generic_container_system_item (key)) { - ret = - gst_mxf_demux_handle_generic_container_system_item (demux, key, buffer); + if (demux->pending_index_table_segments) + collect_index_table_segments (demux); + ret = gst_mxf_demux_handle_generic_container_system_item (demux, klv); } else if (mxf_is_generic_container_essence_element (key) || mxf_is_avid_essence_container_essence_element (key)) { + if (demux->pending_index_table_segments) + collect_index_table_segments (demux); ret = - 
gst_mxf_demux_handle_generic_container_essence_element (demux, key, - buffer, peek); + gst_mxf_demux_handle_generic_container_essence_element (demux, klv, + peek); } else if (mxf_is_random_index_pack (key)) { - ret = gst_mxf_demux_handle_random_index_pack (demux, key, buffer); + ret = gst_mxf_demux_handle_random_index_pack (demux, klv); if (ret == GST_FLOW_OK && demux->random_access && !demux->index_table_segments_collected) { @@ -2700,48 +3793,18 @@ demux->index_table_segments_collected = TRUE; } } else if (mxf_is_index_table_segment (key)) { - ret = - gst_mxf_demux_handle_index_table_segment (demux, key, buffer, - demux->offset); + ret = gst_mxf_demux_handle_index_table_segment (demux, klv); } else if (mxf_is_fill (key)) { GST_DEBUG_OBJECT (demux, "Skipping filler packet of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT, gst_buffer_get_size (buffer), demux->offset); + G_GUINT64_FORMAT, klv->length, demux->offset); } else { GST_DEBUG_OBJECT (demux, "Skipping unknown packet of size %" G_GSIZE_FORMAT " at offset %" - G_GUINT64_FORMAT ", key: %s", gst_buffer_get_size (buffer), + G_GUINT64_FORMAT ", key: %s", klv->length, demux->offset, mxf_ul_to_string (key, key_str)); } - /* In pull mode try to get the last metadata */ - if (mxf_is_partition_pack (key) && ret == GST_FLOW_OK - && demux->pull_footer_metadata - && demux->random_access && demux->current_partition - && demux->current_partition->partition.type == MXF_PARTITION_PACK_HEADER - && (!demux->current_partition->partition.closed - || !demux->current_partition->partition.complete) - && (demux->footer_partition_pack_offset != 0 || demux->random_index_pack)) { - GST_DEBUG_OBJECT (demux, - "Open or incomplete header partition, trying to get final metadata from the last partitions"); - gst_mxf_demux_parse_footer_metadata (demux); - demux->pull_footer_metadata = FALSE; - - if (demux->current_partition->partition.body_sid != 0 && - demux->current_partition->partition.body_offset == 0) { - guint i; - for (i = 
0; i < demux->essence_tracks->len; i++) { - GstMXFDemuxEssenceTrack *etrack = - &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); - - if (etrack->body_sid != demux->current_partition->partition.body_sid) - continue; - - etrack->position = 0; - } - } - } - beach: return ret; } @@ -2751,6 +3814,8 @@ { GList *l; + GST_LOG_OBJECT (demux, "offset %" G_GUINT64_FORMAT, offset); + /* This partition will already be parsed, otherwise * the position wouldn't be in the index */ for (l = demux->partitions; l; l = l->next) { @@ -2759,42 +3824,15 @@ if (p->partition.this_partition + demux->run_in <= offset) demux->current_partition = p; } -} - -static guint64 -find_offset (GArray * offsets, gint64 * position, gboolean keyframe) -{ - GstMXFDemuxIndex *idx; - guint64 current_offset = -1; - gint64 current_position = *position; - - if (!offsets || offsets->len <= *position) - return -1; - - idx = &g_array_index (offsets, GstMXFDemuxIndex, *position); - if (idx->offset != 0 && (!keyframe || idx->keyframe)) { - current_offset = idx->offset; - } else if (idx->offset != 0) { - current_position--; - while (current_position >= 0) { - idx = &g_array_index (offsets, GstMXFDemuxIndex, current_position); - if (idx->offset == 0) { - break; - } else if (!idx->keyframe) { - current_position--; - continue; - } else { - current_offset = idx->offset; - break; - } - } - } - - if (current_offset == -1) - return -1; - - *position = current_position; - return current_offset; + if (demux->current_partition) + GST_DEBUG_OBJECT (demux, + "Current partition now %p (body_sid:%d index_sid:%d this_partition:%" + G_GUINT64_FORMAT ")", demux->current_partition, + demux->current_partition->partition.body_sid, + demux->current_partition->partition.index_sid, + demux->current_partition->partition.this_partition); + else + GST_DEBUG_OBJECT (demux, "Haven't found partition for offset yet"); } static guint64 @@ -2833,176 +3871,153 @@ GstMXFDemuxPartition *old_partition = demux->current_partition; 
gint i; guint64 offset; - gint64 requested_position = *position; - GstMXFDemuxIndexTable *index_table = NULL; + gint64 requested_position = *position, index_start_position; + GstMXFDemuxIndex index_entry = { 0, }; GST_DEBUG_OBJECT (demux, "Trying to find essence element %" G_GINT64_FORMAT - " of track %u with body_sid %u (keyframe %d)", *position, + " of track 0x%08x with body_sid %u (keyframe %d)", *position, etrack->track_number, etrack->body_sid, keyframe); - if (demux->index_tables) { - GList *l; - - for (l = demux->index_tables; l; l = l->next) { - GstMXFDemuxIndexTable *tmp = l->data; - - if (tmp->body_sid == etrack->body_sid - && tmp->index_sid == etrack->index_sid) { - index_table = tmp; - break; - } - } + /* Get entry from index table if present */ + if (find_edit_entry (demux, etrack, *position, keyframe, &index_entry)) { + GST_DEBUG_OBJECT (demux, + "Got position %" G_GINT64_FORMAT " at offset %" G_GUINT64_FORMAT, + index_entry.dts, index_entry.offset); + *position = index_entry.dts; + return index_entry.offset; } -from_index: - - if (etrack->duration > 0 && *position >= etrack->duration) { - GST_WARNING_OBJECT (demux, "Position after end of essence track"); - return -1; - } + GST_DEBUG_OBJECT (demux, "Not found in index table"); - /* First try to find an offset in our index */ - offset = find_offset (etrack->offsets, position, keyframe); - if (offset != -1) { - GST_DEBUG_OBJECT (demux, - "Found edit unit %" G_GINT64_FORMAT " for %" G_GINT64_FORMAT - " in generated index at offset %" G_GUINT64_FORMAT, *position, - requested_position, offset); - return offset; - } + /* Fallback to track offsets */ - GST_DEBUG_OBJECT (demux, "Not found in index"); if (!demux->random_access) { + /* Best effort for push mode */ offset = find_closest_offset (etrack->offsets, position, keyframe); - if (offset != -1) { + if (offset != -1) GST_DEBUG_OBJECT (demux, "Starting with edit unit %" G_GINT64_FORMAT " for %" G_GINT64_FORMAT " in generated index at offset %" 
G_GUINT64_FORMAT, *position, requested_position, offset); - return offset; - } + return offset; + } - if (index_table) { - offset = find_closest_offset (index_table->offsets, position, keyframe); - if (offset != -1) { - GST_DEBUG_OBJECT (demux, - "Starting with edit unit %" G_GINT64_FORMAT " for %" G_GINT64_FORMAT - " in index at offset %" G_GUINT64_FORMAT, *position, - requested_position, offset); - return offset; - } - } - } else if (demux->random_access) { - gint64 index_start_position = *position; + if (etrack->duration > 0 && *position >= etrack->duration) { + GST_WARNING_OBJECT (demux, "Position after end of essence track"); + return -1; + } - demux->offset = demux->run_in; +from_track_offset: - offset = - find_closest_offset (etrack->offsets, &index_start_position, FALSE); - if (offset != -1) { - demux->offset = offset + demux->run_in; - GST_DEBUG_OBJECT (demux, - "Starting with edit unit %" G_GINT64_FORMAT " for %" G_GINT64_FORMAT - " in generated index at offset %" G_GUINT64_FORMAT, - index_start_position, requested_position, offset); - } else { - index_start_position = -1; - } + index_start_position = *position; - if (index_table) { - gint64 tmp_position = *position; + demux->offset = demux->run_in; - offset = find_closest_offset (index_table->offsets, &tmp_position, TRUE); - if (offset != -1 && tmp_position > index_start_position) { - demux->offset = offset + demux->run_in; - index_start_position = tmp_position; - GST_DEBUG_OBJECT (demux, - "Starting with edit unit %" G_GINT64_FORMAT " for %" G_GINT64_FORMAT - " in index at offset %" G_GUINT64_FORMAT, index_start_position, - requested_position, offset); - } - } + offset = find_closest_offset (etrack->offsets, &index_start_position, FALSE); + if (offset != -1) { + demux->offset = offset + demux->run_in; + GST_DEBUG_OBJECT (demux, + "Starting with edit unit %" G_GINT64_FORMAT " for %" G_GINT64_FORMAT + " in generated index at offset %" G_GUINT64_FORMAT, + index_start_position, requested_position, offset); 
+ } else { + index_start_position = -1; + } - gst_mxf_demux_set_partition_for_offset (demux, demux->offset); + gst_mxf_demux_set_partition_for_offset (demux, demux->offset); - for (i = 0; i < demux->essence_tracks->len; i++) { - GstMXFDemuxEssenceTrack *t = - &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); + for (i = 0; i < demux->essence_tracks->len; i++) { + GstMXFDemuxEssenceTrack *t = + &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); - if (index_start_position != -1 && t == etrack) - t->position = index_start_position; - else - t->position = (demux->offset == demux->run_in) ? 0 : -1; - } + if (index_start_position != -1 && t == etrack) + t->position = index_start_position; + else + t->position = (demux->offset == demux->run_in) ? 0 : -1; + GST_LOG_OBJECT (demux, "Setting track %d position to %" G_GINT64_FORMAT, + t->track_id, t->position); + } - /* Else peek at all essence elements and complete our - * index until we find the requested element - */ - while (ret == GST_FLOW_OK) { - GstBuffer *buffer = NULL; - MXFUL key; - guint read = 0; + /* Else peek at all essence elements and complete our + * index until we find the requested element + */ + while (ret == GST_FLOW_OK) { + GstMXFKLV klv; - ret = - gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buffer, - &read); + GST_LOG_OBJECT (demux, "Pulling from offset %" G_GINT64_FORMAT, + demux->offset); + ret = gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv); + + if (ret == GST_FLOW_EOS) { + /* Handle EOS */ + for (i = 0; i < demux->essence_tracks->len; i++) { + GstMXFDemuxEssenceTrack *t = + &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, + i); - if (ret == GST_FLOW_EOS) { - for (i = 0; i < demux->essence_tracks->len; i++) { - GstMXFDemuxEssenceTrack *t = - &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, - i); + if (t->position > 0) + t->duration = t->position; + } + /* For the searched track this is really our position */ + 
etrack->duration = etrack->position; - if (t->position > 0) - t->duration = t->position; - } - /* For the searched track this is really our position */ - etrack->duration = etrack->position; + for (i = 0; i < demux->src->len; i++) { + GstMXFDemuxPad *p = g_ptr_array_index (demux->src, i); - for (i = 0; i < demux->src->len; i++) { - GstMXFDemuxPad *p = g_ptr_array_index (demux->src, i); + if (!p->eos + && p->current_essence_track_position >= + p->current_essence_track->duration) { + GstEvent *e; - if (!p->eos - && p->current_essence_track_position >= - p->current_essence_track->duration) { - GstEvent *e; - - p->eos = TRUE; - e = gst_event_new_eos (); - gst_event_set_seqnum (e, demux->seqnum); - gst_pad_push_event (GST_PAD_CAST (p), e); - } + p->eos = TRUE; + e = gst_event_new_eos (); + gst_event_set_seqnum (e, demux->seqnum); + gst_pad_push_event (GST_PAD_CAST (p), e); } } + } - if (G_UNLIKELY (ret != GST_FLOW_OK) && etrack->position <= *position) { - demux->offset = old_offset; - demux->current_partition = old_partition; - break; - } else if (G_UNLIKELY (ret == GST_FLOW_OK)) { - ret = gst_mxf_demux_handle_klv_packet (demux, &key, buffer, TRUE); - gst_buffer_unref (buffer); - } - - /* If we found the position read it from the index again */ - if (((ret == GST_FLOW_OK && etrack->position == *position + 2) || - (ret == GST_FLOW_EOS && etrack->position == *position + 1)) - && etrack->offsets && etrack->offsets->len > *position - && g_array_index (etrack->offsets, GstMXFDemuxIndex, - *position).offset != 0) { - GST_DEBUG_OBJECT (demux, "Found at offset %" G_GUINT64_FORMAT, - demux->offset); - demux->offset = old_offset; - demux->current_partition = old_partition; - goto from_index; + GST_LOG_OBJECT (demux, + "pulling gave flow:%s track->position:%" G_GINT64_FORMAT, + gst_flow_get_name (ret), etrack->position); + if (G_UNLIKELY (ret != GST_FLOW_OK) && etrack->position <= *position) { + demux->offset = old_offset; + demux->current_partition = old_partition; + break; + } 
else if (G_UNLIKELY (ret == GST_FLOW_OK)) { + ret = gst_mxf_demux_handle_klv_packet (demux, &klv, TRUE); + gst_mxf_demux_consume_klv (demux, &klv); + } + + GST_LOG_OBJECT (demux, + "Handling gave flow:%s track->position:%" G_GINT64_FORMAT + " looking for %" G_GINT64_FORMAT, gst_flow_get_name (ret), + etrack->position, *position); + + /* If we found the position read it from the index again */ + if (((ret == GST_FLOW_OK && etrack->position == *position + 1) || + (ret == GST_FLOW_EOS && etrack->position == *position + 1)) + && etrack->offsets && etrack->offsets->len > *position + && g_array_index (etrack->offsets, GstMXFDemuxIndex, + *position).offset != 0) { + GST_DEBUG_OBJECT (demux, "Found at offset %" G_GUINT64_FORMAT, + demux->offset); + demux->offset = old_offset; + demux->current_partition = old_partition; + if (find_edit_entry (demux, etrack, *position, keyframe, &index_entry)) { + GST_DEBUG_OBJECT (demux, + "Got position %" G_GINT64_FORMAT " at offset %" G_GUINT64_FORMAT, + index_entry.dts, index_entry.offset); + *position = index_entry.dts; + return index_entry.offset; } - demux->offset += read; + goto from_track_offset; } - demux->offset = old_offset; - demux->current_partition = old_partition; - - GST_DEBUG_OBJECT (demux, "Not found in this file"); } + demux->offset = old_offset; + demux->current_partition = old_partition; + + GST_DEBUG_OBJECT (demux, "Not found in this file"); return -1; } @@ -3010,10 +4025,9 @@ static GstFlowReturn gst_mxf_demux_pull_and_handle_klv_packet (GstMXFDemux * demux) { - GstBuffer *buffer = NULL; - MXFUL key; + GstMXFKLV klv; GstFlowReturn ret = GST_FLOW_OK; - guint read = 0; + gboolean force_switch = FALSE; if (demux->src->len > 0) { if (!gst_mxf_demux_get_earliest_pad (demux)) { @@ -3023,83 +4037,216 @@ } } - ret = - gst_mxf_demux_pull_klv_packet (demux, demux->offset, &key, &buffer, - &read); + if (demux->state == GST_MXF_DEMUX_STATE_ESSENCE) { + g_assert (demux->current_partition->single_track + && 
demux->current_partition->single_track->wrapping != + MXF_ESSENCE_WRAPPING_FRAME_WRAPPING); + /* Feeding essence directly (i.e. in the middle of a custom/clip KLV) */ + ret = + gst_mxf_demux_handle_generic_container_essence_element (demux, + &demux->current_partition->clip_klv, FALSE); + gst_mxf_demux_consume_klv (demux, &demux->current_partition->clip_klv); + if (ret == GST_FLOW_OK + && demux->current_partition->single_track->position >= + demux->current_partition->single_track->duration) { + /* We are done with the contents of this clip/custom wrapping, force the + * switch to the next non-EOS track */ + GST_DEBUG_OBJECT (demux, "Single track EOS, switch"); + force_switch = TRUE; + } - if (ret == GST_FLOW_EOS && demux->src->len > 0) { - guint i; - GstMXFDemuxPad *p = NULL; + } else { - for (i = 0; i < demux->essence_tracks->len; i++) { - GstMXFDemuxEssenceTrack *t = - &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); + ret = gst_mxf_demux_peek_klv_packet (demux, demux->offset, &klv); - if (t->position > 0) - t->duration = t->position; - } + /* FIXME + * + * Move this EOS handling to a separate function + */ + if (ret == GST_FLOW_EOS && demux->src->len > 0) { + guint i; + GstMXFDemuxPad *p = NULL; - for (i = 0; i < demux->src->len; i++) { - GstMXFDemuxPad *p = g_ptr_array_index (demux->src, i); + GST_DEBUG_OBJECT (demux, "EOS HANDLING"); - if (!p->eos - && p->current_essence_track_position >= - p->current_essence_track->duration) { - GstEvent *e; + for (i = 0; i < demux->src->len; i++) { + GstMXFDemuxPad *p = g_ptr_array_index (demux->src, i); - p->eos = TRUE; - e = gst_event_new_eos (); - gst_event_set_seqnum (e, demux->seqnum); - gst_pad_push_event (GST_PAD_CAST (p), e); + GST_DEBUG_OBJECT (p, + "eos:%d current_essence_track_position:%" G_GINT64_FORMAT + " position:%" G_GINT64_FORMAT " duration:%" G_GINT64_FORMAT, p->eos, + p->current_essence_track_position, + p->current_essence_track->position, + p->current_essence_track->duration); + if 
(!p->eos + && p->current_essence_track->position >= + p->current_essence_track->duration) { + GstEvent *e; + + p->eos = TRUE; + e = gst_event_new_eos (); + gst_event_set_seqnum (e, demux->seqnum); + gst_pad_push_event (GST_PAD_CAST (p), e); + } } - } - while ((p = gst_mxf_demux_get_earliest_pad (demux))) { - guint64 offset; - gint64 position; + while ((p = gst_mxf_demux_get_earliest_pad (demux))) { + guint64 offset; + gint64 position; - position = p->current_essence_track_position; + GST_DEBUG_OBJECT (p, "Trying on earliest"); - offset = - gst_mxf_demux_find_essence_element (demux, p->current_essence_track, - &position, FALSE); - if (offset == -1) { - GstEvent *e; + position = p->current_essence_track_position; - GST_ERROR_OBJECT (demux, "Failed to find offset for essence track"); - p->eos = TRUE; - e = gst_event_new_eos (); - gst_event_set_seqnum (e, demux->seqnum); - gst_pad_push_event (GST_PAD_CAST (p), e); - continue; - } + offset = + gst_mxf_demux_find_essence_element (demux, p->current_essence_track, + &position, FALSE); + if (offset == -1) { + GstEvent *e; - demux->offset = offset + demux->run_in; - gst_mxf_demux_set_partition_for_offset (demux, demux->offset); + GST_ERROR_OBJECT (demux, "Failed to find offset for essence track"); + p->eos = TRUE; + e = gst_event_new_eos (); + gst_event_set_seqnum (e, demux->seqnum); + gst_pad_push_event (GST_PAD_CAST (p), e); + continue; + } - p->current_essence_track->position = position; + demux->offset = offset + demux->run_in; + gst_mxf_demux_set_partition_for_offset (demux, demux->offset); + if (p->current_essence_track->wrapping != + MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + demux->state = GST_MXF_DEMUX_STATE_ESSENCE; + demux->current_partition->clip_klv.consumed = + offset - demux->current_partition->clip_klv.offset; + } else + demux->state = GST_MXF_DEMUX_STATE_KLV; + p->current_essence_track->position = position; - ret = GST_FLOW_OK; - goto beach; + ret = GST_FLOW_OK; + goto beach; + } } - } + if (G_UNLIKELY (ret != 
GST_FLOW_OK)) + goto beach; - if (G_UNLIKELY (ret != GST_FLOW_OK)) - goto beach; + ret = gst_mxf_demux_handle_klv_packet (demux, &klv, FALSE); + gst_mxf_demux_consume_klv (demux, &klv); + + /* We entered a new partition */ + if (ret == GST_FLOW_OK && mxf_is_partition_pack (&klv.key)) { + GstMXFDemuxPartition *partition = demux->current_partition; + + /* Grab footer metadata if needed */ + if (demux->pull_footer_metadata + && partition->partition.type == MXF_PARTITION_PACK_HEADER + && (!partition->partition.closed || !partition->partition.complete) + && (demux->footer_partition_pack_offset != 0 + || demux->random_index_pack)) { + GST_DEBUG_OBJECT (demux, + "Open or incomplete header partition, trying to get final metadata from the last partitions"); + gst_mxf_demux_parse_footer_metadata (demux); + demux->pull_footer_metadata = FALSE; + } + + /* If the partition has some content, do post-checks */ + if (partition->partition.body_sid != 0) { + guint64 lowest_offset = G_MAXUINT64; + GST_DEBUG_OBJECT (demux, + "Entered partition (body_sid:%d index_sid:%d body_offset:%" + G_GUINT64_FORMAT "), checking positions", + partition->partition.body_sid, partition->partition.index_sid, + partition->partition.body_offset); + + if (partition->single_track) { + /* Fast-path for single track partition */ + if (partition->single_track->position == -1 + && partition->partition.body_offset == 0) { + GST_DEBUG_OBJECT (demux, + "First time in partition, setting track position to 0"); + partition->single_track->position = 0; + } else if (partition->single_track->position == -1) { + GST_ERROR_OBJECT (demux, + "Unknown track position, consuming data from first partition entry"); + lowest_offset = + partition->partition.this_partition + + partition->essence_container_offset; + partition->clip_klv.consumed = 0; + } else if (partition->single_track->position != 0) { + GstMXFDemuxIndex entry; + GST_DEBUG_OBJECT (demux, + "Track already at another position : %" G_GINT64_FORMAT, + 
partition->single_track->position); + if (find_edit_entry (demux, partition->single_track, + partition->single_track->position, FALSE, &entry)) + lowest_offset = entry.offset; + } + } else { + guint i; + for (i = 0; i < demux->essence_tracks->len; i++) { + GstMXFDemuxEssenceTrack *etrack = + &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, + i); + + if (etrack->body_sid != partition->partition.body_sid) + continue; + if (etrack->position == -1 && partition->partition.body_offset == 0) { + GST_DEBUG_OBJECT (demux, "Resetting track %d to position 0", + etrack->track_id); + + etrack->position = 0; + } else if (etrack->position != 0) { + GstMXFDemuxIndex entry; + if (find_edit_entry (demux, etrack, + etrack->position, FALSE, &entry)) { + if (lowest_offset == G_MAXUINT64 + || entry.offset < lowest_offset) + lowest_offset = entry.offset; + } + } + } + } - ret = gst_mxf_demux_handle_klv_packet (demux, &key, buffer, FALSE); - demux->offset += read; + if (lowest_offset != G_MAXUINT64) { + GstMXFDemuxPartition *next_partition = NULL; + GList *cur_part = g_list_find (demux->partitions, partition); + if (cur_part && cur_part->next) + next_partition = (GstMXFDemuxPartition *) cur_part->next->data; + + /* If we have completely processed this partition, skip to next partition */ + if (lowest_offset > next_partition->partition.this_partition) { + GST_DEBUG_OBJECT (demux, + "Partition entirely processed, skipping to next one"); + demux->offset = next_partition->partition.this_partition; + } else { + GST_DEBUG_OBJECT (demux, + "Skipping to demuxer offset %" G_GUINT64_FORMAT " (from %" + G_GUINT64_FORMAT ")", lowest_offset, demux->offset); + demux->offset = lowest_offset; + if (partition->single_track + && partition->single_track->wrapping != + MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + demux->state = GST_MXF_DEMUX_STATE_ESSENCE; + demux->current_partition->clip_klv.consumed = + demux->offset - demux->current_partition->clip_klv.offset; + } + } + } + } + } + } if (ret == 
GST_FLOW_OK && demux->src->len > 0 && demux->essence_tracks->len > 0) { GstMXFDemuxPad *earliest = NULL; /* We allow time drifts of at most 500ms */ - while ((earliest = gst_mxf_demux_get_earliest_pad (demux)) && - demux->segment.position - earliest->position > demux->max_drift) { + while ((earliest = gst_mxf_demux_get_earliest_pad (demux)) && (force_switch + || demux->segment.position - earliest->position > + demux->max_drift)) { guint64 offset; gint64 position; - GST_WARNING_OBJECT (demux, + GST_DEBUG_OBJECT (demux, "Found synchronization issue -- trying to solve"); position = earliest->current_essence_track_position; @@ -3127,16 +4274,28 @@ demux->offset = offset + demux->run_in; gst_mxf_demux_set_partition_for_offset (demux, demux->offset); + GST_DEBUG_OBJECT (demux, + "Switching to offset %" G_GUINT64_FORMAT " for position %" + G_GINT64_FORMAT " on track %d (body_sid:%d index_sid:%d)", + demux->offset, position, earliest->current_essence_track->track_id, + earliest->current_essence_track->body_sid, + earliest->current_essence_track->index_sid); + if (demux->current_partition->single_track + && demux->current_partition->single_track->wrapping != + MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) { + demux->state = GST_MXF_DEMUX_STATE_ESSENCE; + demux->current_partition->clip_klv.consumed = + offset - demux->current_partition->clip_klv.offset; + } else + demux->state = GST_MXF_DEMUX_STATE_KLV; earliest->current_essence_track->position = position; + GST_DEBUG_OBJECT (earliest, "Switching to this pad"); break; } } beach: - if (buffer) - gst_buffer_unref (buffer); - return ret; } @@ -3145,49 +4304,38 @@ { GstMXFDemux *demux = NULL; GstFlowReturn flow = GST_FLOW_OK; - GstMapInfo map; - gboolean res; demux = GST_MXF_DEMUX (gst_pad_get_parent (pad)); - if (demux->run_in == -1) { + if (demux->state == GST_MXF_DEMUX_STATE_UNKNOWN) { + GstMXFKLV klv; + /* Skip run-in, which is at most 64K and is finished * by a header partition pack */ while (demux->offset < 64 * 1024) { - GstBuffer 
*buffer = NULL; - if ((flow = - gst_mxf_demux_pull_range (demux, demux->offset, 16, - &buffer)) != GST_FLOW_OK) - break; - - gst_buffer_map (buffer, &map, GST_MAP_READ); - res = mxf_is_header_partition_pack ((const MXFUL *) map.data); - gst_buffer_unmap (buffer, &map); + gst_mxf_demux_peek_klv_packet (demux, demux->offset, + &klv)) != GST_FLOW_OK) + goto pause; - if (res) { + if (mxf_is_header_partition_pack (&klv.key)) { GST_DEBUG_OBJECT (demux, "Found header partition pack at offset %" G_GUINT64_FORMAT, demux->offset); + demux->state = GST_MXF_DEMUX_STATE_KLV; demux->run_in = demux->offset; - gst_buffer_unref (buffer); break; } - demux->offset++; - gst_buffer_unref (buffer); } - if (G_UNLIKELY (flow != GST_FLOW_OK)) - goto pause; - if (G_UNLIKELY (demux->run_in == -1)) { GST_ERROR_OBJECT (demux, "No valid header partition pack found"); flow = GST_FLOW_ERROR; goto pause; } - /* First of all pull&parse the random index pack at EOF */ + /* Grab the RIP at the end of the file (if present) */ gst_mxf_demux_pull_random_index_pack (demux); } @@ -3199,8 +4347,7 @@ goto pause; /* check EOS condition */ - if ((demux->segment.flags & GST_SEEK_FLAG_SEGMENT) && - (demux->segment.stop != -1) && + if ((demux->segment.stop != -1) && (demux->segment.position >= demux->segment.stop)) { guint i; gboolean eos = TRUE; @@ -3284,12 +4431,9 @@ { GstFlowReturn ret = GST_FLOW_OK; GstMXFDemux *demux = NULL; - MXFUL key; const guint8 *data = NULL; - guint64 length = 0; - guint64 offset = 0; - GstBuffer *buffer = NULL; gboolean res; + GstMXFKLV klv; #ifndef GST_DISABLE_GST_DEBUG gchar str[48]; #endif @@ -3312,6 +4456,7 @@ GST_DEBUG_OBJECT (demux, "beginning of file, expect header"); demux->run_in = -1; demux->offset = 0; + demux->state = GST_MXF_DEMUX_STATE_UNKNOWN; } if (G_UNLIKELY (demux->offset == 0 && GST_BUFFER_OFFSET (inbuf) != 0)) { @@ -3336,7 +4481,7 @@ if (gst_adapter_available (demux->adapter) < 16) break; - if (demux->run_in == -1) { + if (demux->state == 
GST_MXF_DEMUX_STATE_UNKNOWN) { /* Skip run-in, which is at most 64K and is finished * by a header partition pack */ @@ -3351,6 +4496,7 @@ "Found header partition pack at offset %" G_GUINT64_FORMAT, demux->offset); demux->run_in = demux->offset; + demux->state = GST_MXF_DEMUX_STATE_KLV; break; } gst_adapter_flush (demux->adapter, 1); @@ -3364,11 +4510,11 @@ continue; } - /* Need more data */ - if (demux->run_in == -1 && demux->offset < 64 * 1024) - break; + if (demux->state == GST_MXF_DEMUX_STATE_UNKNOWN) { + /* Need more data */ + if (demux->offset < 64 * 1024) + break; - if (G_UNLIKELY (demux->run_in == -1)) { GST_ERROR_OBJECT (demux, "No valid header partition pack found"); ret = GST_FLOW_ERROR; break; @@ -3377,25 +4523,28 @@ if (gst_adapter_available (demux->adapter) < 17) break; + /* FIXME : Handle non-klv state */ + g_assert (demux->state == GST_MXF_DEMUX_STATE_KLV); + /* Now actually do something */ - memset (&key, 0, sizeof (MXFUL)); + memset (&klv, 0, sizeof (GstMXFKLV)); /* Pull 16 byte key and first byte of BER encoded length */ data = gst_adapter_map (demux->adapter, 17); - memcpy (&key, data, 16); + memcpy (&klv.key, data, 16); GST_DEBUG_OBJECT (demux, "Got KLV packet with key %s", - mxf_ul_to_string (&key, str)); + mxf_ul_to_string (&klv.key, str)); /* Decode BER encoded packet length */ if ((data[16] & 0x80) == 0) { - length = data[16]; - offset = 17; + klv.length = data[16]; + klv.data_offset = 17; } else { guint slen = data[16] & 0x7f; - offset = 16 + 1 + slen; + klv.data_offset = 16 + 1 + slen; gst_adapter_unmap (demux->adapter); @@ -3413,9 +4562,9 @@ data = gst_adapter_map (demux->adapter, 17 + slen); data += 17; - length = 0; + klv.length = 0; while (slen) { - length = (length << 8) | *data; + klv.length = (klv.length << 8) | *data; data++; slen--; } @@ -3425,34 +4574,152 @@ /* GStreamer's buffer sizes are stored in a guint so we * limit ourself to G_MAXUINT large buffers */ - if (length > G_MAXUINT) { + if (klv.length > G_MAXUINT) { 
GST_ERROR_OBJECT (demux, - "Unsupported KLV packet length: %" G_GUINT64_FORMAT, length); + "Unsupported KLV packet length: %" G_GSIZE_FORMAT, klv.length); ret = GST_FLOW_ERROR; break; } GST_DEBUG_OBJECT (demux, "KLV packet with key %s has length " - "%" G_GUINT64_FORMAT, mxf_ul_to_string (&key, str), length); + "%" G_GSIZE_FORMAT, mxf_ul_to_string (&klv.key, str), klv.length); - if (gst_adapter_available (demux->adapter) < offset + length) + if (gst_adapter_available (demux->adapter) < klv.data_offset + klv.length) break; - gst_adapter_flush (demux->adapter, offset); + gst_adapter_flush (demux->adapter, klv.data_offset); - if (length > 0) { - buffer = gst_adapter_take_buffer (demux->adapter, length); + if (klv.length > 0) { + klv.data = gst_adapter_take_buffer (demux->adapter, klv.length); - ret = gst_mxf_demux_handle_klv_packet (demux, &key, buffer, FALSE); - gst_buffer_unref (buffer); + ret = gst_mxf_demux_handle_klv_packet (demux, &klv, FALSE); } - - demux->offset += offset + length; + gst_mxf_demux_consume_klv (demux, &klv); } return ret; } +/* Given a stream time for an output pad, figure out: + * * The Essence track for that stream time + * * The position on that track + */ +static gboolean +gst_mxf_demux_pad_to_track_and_position (GstMXFDemux * demux, + GstMXFDemuxPad * pad, GstClockTime streamtime, + GstMXFDemuxEssenceTrack ** etrack, gint64 * position) +{ + gint64 material_position; + guint64 sum = 0; + guint i; + MXFMetadataSourceClip *clip = NULL; + gchar str[96]; + + /* Convert to material position */ + material_position = + gst_util_uint64_scale (streamtime, pad->material_track->edit_rate.n, + pad->material_track->edit_rate.d * GST_SECOND); + + GST_DEBUG_OBJECT (pad, + "streamtime %" GST_TIME_FORMAT " position %" G_GINT64_FORMAT, + GST_TIME_ARGS (streamtime), material_position); + + + /* Find sequence component covering that position */ + for (i = 0; i < pad->material_track->parent.sequence->n_structural_components; + i++) { + clip = + 
MXF_METADATA_SOURCE_CLIP (pad->material_track->parent.sequence-> + structural_components[i]); + GST_LOG_OBJECT (pad, + "clip %d start_position:%" G_GINT64_FORMAT " duration %" + G_GINT64_FORMAT, clip->source_track_id, clip->start_position, + clip->parent.duration); + if (clip->parent.duration <= 0) + break; + if ((sum + clip->parent.duration) > material_position) + break; + sum += clip->parent.duration; + } + + if (i == pad->material_track->parent.sequence->n_structural_components) { + GST_WARNING_OBJECT (pad, "Requested position beyond the last clip"); + /* Outside of current components. Setting to the end of the last clip */ + material_position = sum; + sum -= clip->parent.duration; + } + + GST_DEBUG_OBJECT (pad, "Looking for essence track for track_id:%d umid:%s", + clip->source_track_id, mxf_umid_to_string (&clip->source_package_id, + str)); + + /* Get the corresponding essence track for the given source package and stream id */ + for (i = 0; i < demux->essence_tracks->len; i++) { + GstMXFDemuxEssenceTrack *track = + &g_array_index (demux->essence_tracks, GstMXFDemuxEssenceTrack, i); + GST_LOG_OBJECT (pad, + "Looking at essence track body_sid:%d index_sid:%d", + track->body_sid, track->index_sid); + if (clip->source_track_id == 0 || (track->track_id == clip->source_track_id + && mxf_umid_is_equal (&clip->source_package_id, + &track->source_package_uid))) { + GST_DEBUG_OBJECT (pad, + "Found matching essence track body_sid:%d index_sid:%d", + track->body_sid, track->index_sid); + *etrack = track; + *position = material_position - sum; + return TRUE; + } + } + + return FALSE; +} + +/* Given a track+position for a given pad, figure out the resulting stream time */ +static gboolean +gst_mxf_demux_pad_get_stream_time (GstMXFDemux * demux, + GstMXFDemuxPad * pad, GstMXFDemuxEssenceTrack * etrack, + gint64 position, GstClockTime * stream_time) +{ + guint i; + guint64 sum = 0; + MXFMetadataSourceClip *clip = NULL; + + /* Find the component for that */ + /* Find sequence 
component covering that position */ + for (i = 0; i < pad->material_track->parent.sequence->n_structural_components; + i++) { + clip = + MXF_METADATA_SOURCE_CLIP (pad->material_track->parent.sequence-> + structural_components[i]); + GST_LOG_OBJECT (pad, + "clip %d start_position:%" G_GINT64_FORMAT " duration %" + G_GINT64_FORMAT, clip->source_track_id, clip->start_position, + clip->parent.duration); + if (etrack->track_id == clip->source_track_id + && mxf_umid_is_equal (&clip->source_package_id, + &etrack->source_package_uid)) { + /* This is the clip */ + break; + } + /* Fetch in the next one */ + sum += clip->parent.duration; + } + + /* Theoretically impossible */ + if (i == pad->material_track->parent.sequence->n_structural_components) { + /* Outside of current components ?? */ + return FALSE; + } + + *stream_time = + gst_util_uint64_scale (position + sum, + pad->material_track->edit_rate.d * GST_SECOND, + pad->material_track->edit_rate.n); + + return TRUE; +} + static void gst_mxf_demux_pad_set_position (GstMXFDemux * demux, GstMXFDemuxPad * p, GstClockTime start) @@ -3686,30 +4953,46 @@ guint64 old_offset = demux->offset; GstMXFDemuxPartition *old_partition = demux->current_partition; - if (!demux->random_index_pack) - return; + /* This function can also be called when a RIP is not present. 
This can happen + * if index table segments were discovered while scanning the file */ + if (demux->random_index_pack) { + for (i = 0; i < demux->random_index_pack->len; i++) { + MXFRandomIndexPackEntry *e = + &g_array_index (demux->random_index_pack, MXFRandomIndexPackEntry, i); - for (i = 0; i < demux->random_index_pack->len; i++) { - MXFRandomIndexPackEntry *e = - &g_array_index (demux->random_index_pack, MXFRandomIndexPackEntry, i); + if (e->offset < demux->run_in) { + GST_ERROR_OBJECT (demux, "Invalid random index pack entry"); + return; + } - if (e->offset < demux->run_in) { - GST_ERROR_OBJECT (demux, "Invalid random index pack entry"); - return; + demux->offset = e->offset; + read_partition_header (demux); } - demux->offset = e->offset; - read_partition_header (demux); + demux->offset = old_offset; + demux->current_partition = old_partition; } - demux->offset = old_offset; - demux->current_partition = old_partition; + if (demux->pending_index_table_segments == NULL) { + GST_DEBUG_OBJECT (demux, "No pending index table segments to collect"); + return; + } + + GST_LOG_OBJECT (demux, "Collecting pending index table segments"); for (l = demux->pending_index_table_segments; l; l = l->next) { MXFIndexTableSegment *segment = l->data; GstMXFDemuxIndexTable *t = NULL; GList *k; - guint64 start, end; + guint didx; +#ifndef GST_DISABLE_GST_DEBUG + gchar str[48]; +#endif + + GST_LOG_OBJECT (demux, + "Collecting from segment bodySID:%d indexSID:%d instance_id: %s", + segment->body_sid, segment->index_sid, + mxf_uuid_to_string (&segment->instance_id, str)); for (k = demux->index_tables; k; k = k->next) { GstMXFDemuxIndexTable *tmp = k->data; @@ -3725,102 +5008,92 @@ t = g_new0 (GstMXFDemuxIndexTable, 1); t->body_sid = segment->body_sid; t->index_sid = segment->index_sid; - t->offsets = g_array_new (FALSE, TRUE, sizeof (GstMXFDemuxIndex)); + t->max_temporal_offset = 0; + t->segments = g_array_new (FALSE, TRUE, sizeof (MXFIndexTableSegment)); + g_array_set_clear_func 
(t->segments, + (GDestroyNotify) mxf_index_table_segment_reset); + t->reordered_delta_entry = -1; + t->reverse_temporal_offsets = g_array_new (FALSE, TRUE, 1); demux->index_tables = g_list_prepend (demux->index_tables, t); } - start = segment->index_start_position; - end = start + segment->index_duration; - if (end > G_MAXINT / sizeof (GstMXFDemuxIndex)) { - demux->index_tables = g_list_remove (demux->index_tables, t); - g_array_free (t->offsets, TRUE); - g_free (t); - continue; - } - - if (t->offsets->len < end) - g_array_set_size (t->offsets, end); + /* Store index segment */ + g_array_append_val (t->segments, *segment); - for (i = 0; i < segment->n_index_entries && start + i < t->offsets->len; - i++) { - guint64 offset = segment->index_entries[i].stream_offset; - GList *m; - GstMXFDemuxPartition *offset_partition = NULL, *next_partition = NULL; - - for (m = demux->partitions; m; m = m->next) { - GstMXFDemuxPartition *partition = m->data; + /* Check if temporal reordering tables should be pre-calculated */ + for (didx = 0; didx < segment->n_delta_entries; didx++) { + MXFDeltaEntry *delta = &segment->delta_entries[didx]; + if (delta->pos_table_index == -1) { + if (t->reordered_delta_entry != -1 && didx != t->reordered_delta_entry) + GST_WARNING_OBJECT (demux, + "Index Table specifies more than one stream using temporal reordering (%d and %d)", + didx, t->reordered_delta_entry); + else + t->reordered_delta_entry = didx; + } else if (delta->pos_table_index > 0) + GST_WARNING_OBJECT (delta, + "Index Table uses fractional offset, please file a bug"); + } - if (!next_partition && offset_partition) - next_partition = partition; + } - if (partition->partition.body_sid != t->body_sid) - continue; - if (partition->partition.body_offset > offset) - break; + /* Handle temporal offset if present and needed */ + for (l = demux->index_tables; l; l = l->next) { + GstMXFDemuxIndexTable *table = l->data; + guint segidx; - offset_partition = partition; - next_partition = NULL; - } 
+ /* No reordered entries, skip */ + if (table->reordered_delta_entry == -1) + continue; - if (offset_partition && offset >= offset_partition->partition.body_offset) { - offset = - offset_partition->partition.this_partition + - offset_partition->essence_container_offset + (offset - - offset_partition->partition.body_offset); + GST_DEBUG_OBJECT (demux, + "bodySID:%d indexSID:%d Calculating reverse temporal offset table", + table->body_sid, table->index_sid); - if (next_partition - && offset >= next_partition->partition.this_partition) { + for (segidx = 0; segidx < table->segments->len; segidx++) { + MXFIndexTableSegment *s = + &g_array_index (table->segments, MXFIndexTableSegment, segidx); + guint start = s->index_start_position; + guint stop = + s->index_duration ? start + s->index_duration : start + + s->n_index_entries; + guint entidx = 0; + + if (stop > table->reverse_temporal_offsets->len) + g_array_set_size (table->reverse_temporal_offsets, stop); + + for (entidx = 0; entidx < s->n_index_entries; entidx++) { + MXFIndexEntry *entry = &s->index_entries[entidx]; + gint8 offs = -entry->temporal_offset; + /* Check we don't exceed boundaries */ + if ((start + entidx + entry->temporal_offset) < 0 || + (start + entidx + entry->temporal_offset) > + table->reverse_temporal_offsets->len) { GST_ERROR_OBJECT (demux, - "Invalid index table segment going into next unrelated partition"); + "Temporal offset exceeds boundaries. 
entry:%d offset:%d max:%d", + start + entidx, entry->temporal_offset, + table->reverse_temporal_offsets->len); } else { - GstMXFDemuxIndex *index; - gint8 temporal_offset = segment->index_entries[i].temporal_offset; - guint64 pts_i = G_MAXUINT64; - - if (temporal_offset > 0 || - (temporal_offset < 0 && start + i >= -(gint) temporal_offset)) { - pts_i = start + i + temporal_offset; - - if (t->offsets->len < pts_i) - g_array_set_size (t->offsets, pts_i + 1); - - index = &g_array_index (t->offsets, GstMXFDemuxIndex, pts_i); - if (!index->initialized) { - index->initialized = TRUE; - index->offset = 0; - index->pts = G_MAXUINT64; - index->dts = G_MAXUINT64; - index->keyframe = FALSE; - } - - index->pts = start + i; - } - - index = &g_array_index (t->offsets, GstMXFDemuxIndex, start + i); - if (!index->initialized) { - index->initialized = TRUE; - index->offset = 0; - index->pts = G_MAXUINT64; - index->dts = G_MAXUINT64; - index->keyframe = FALSE; + /* Applying the temporal offset gives us the entry that should contain this PTS. + * We store the reverse temporal offset on that entry, i.e. the value it should apply + * to go from DTS to PTS. (i.e. entry.pts = entry.dts + rto[idx]) */ + g_array_index (table->reverse_temporal_offsets, gint8, + start + entidx + entry->temporal_offset) = offs; + if (entry->temporal_offset > (gint) table->max_temporal_offset) { + GST_LOG_OBJECT (demux, + "Updating max temporal offset to %d (was %d)", + entry->temporal_offset, table->max_temporal_offset); + table->max_temporal_offset = entry->temporal_offset; } - - index->offset = offset; - index->keyframe = ! 
!(segment->index_entries[i].flags & 0x80) - || (segment->index_entries[i].key_frame_offset == 0); - index->dts = pts_i; } } } } - for (l = demux->pending_index_table_segments; l; l = l->next) { - MXFIndexTableSegment *s = l->data; - mxf_index_table_segment_reset (s); - g_free (s); - } - g_list_free (demux->pending_index_table_segments); + g_list_free_full (demux->pending_index_table_segments, g_free); demux->pending_index_table_segments = NULL; + + GST_DEBUG_OBJECT (demux, "Done collecting segments"); } static gboolean @@ -3842,6 +5115,13 @@ &start_type, &start, &stop_type, &stop); seqnum = gst_event_get_seqnum (event); + if (seqnum == demux->seqnum) { + GST_DEBUG_OBJECT (demux, "Already handled requested seek"); + return TRUE; + } + + GST_DEBUG_OBJECT (demux, "Seek %" GST_PTR_FORMAT, event); + if (format != GST_FORMAT_TIME) goto wrong_format; @@ -3851,8 +5131,6 @@ flush = ! !(flags & GST_SEEK_FLAG_FLUSH); keyframe = ! !(flags & GST_SEEK_FLAG_KEY_UNIT); - keyunit_ts = start; - if (!demux->index_table_segments_collected) { collect_index_table_segments (demux); demux->index_table_segments_collected = TRUE; @@ -3894,12 +5172,12 @@ gst_segment_do_seek (&seeksegment, rate, format, flags, start_type, start, stop_type, stop, &update); - GST_DEBUG_OBJECT (demux, "segment configured %" GST_SEGMENT_FORMAT, - &seeksegment); + GST_DEBUG_OBJECT (demux, + "segment initially configured to %" GST_SEGMENT_FORMAT, &seeksegment); + /* Initialize and reset ourselves if needed */ if (flush || seeksegment.position != demux->segment.position) { - guint64 new_offset = -1; - + GList *tmp; if (!demux->metadata_resolved || demux->update_metadata) { if (gst_mxf_demux_resolve_references (demux) != GST_FLOW_OK || gst_mxf_demux_update_tracks (demux) != GST_FLOW_OK) { @@ -3907,19 +5185,77 @@ } } + /* Reset all single-track KLV tracking */ + for (tmp = demux->partitions; tmp; tmp = tmp->next) { + GstMXFDemuxPartition *partition = (GstMXFDemuxPartition *) tmp->data; + if (partition->single_track) { 
+ partition->clip_klv.consumed = 0; + } + } + } + + keyunit_ts = seeksegment.position; + + /* Do a first round without changing positions. This is needed to figure out + * the supporting keyframe position (if any) */ + for (i = 0; i < demux->src->len; i++) { + GstMXFDemuxPad *p = g_ptr_array_index (demux->src, i); + GstMXFDemuxEssenceTrack *etrack; + gint64 track_pos, seeked_pos; + + /* Get track and track position for requested time, handles out of bound internally */ + if (!gst_mxf_demux_pad_to_track_and_position (demux, p, + seeksegment.position, &etrack, &track_pos)) + goto invalid_position; + + GST_LOG_OBJECT (p, + "track %d (body_sid:%d index_sid:%d), position %" G_GINT64_FORMAT, + etrack->track_id, etrack->body_sid, etrack->index_sid, track_pos); + + /* Find supporting keyframe entry */ + seeked_pos = track_pos; + if (gst_mxf_demux_find_essence_element (demux, etrack, &seeked_pos, + TRUE) == -1) { + /* Couldn't find entry, ignore */ + break; + } + + GST_LOG_OBJECT (p, + "track %d (body_sid:%d index_sid:%d), position %" G_GINT64_FORMAT + " entry position %" G_GINT64_FORMAT, etrack->track_id, etrack->body_sid, + etrack->index_sid, track_pos, seeked_pos); + + if (seeked_pos != track_pos) { + GstClockTime stream_time; + if (!gst_mxf_demux_pad_get_stream_time (demux, p, etrack, seeked_pos, + &stream_time)) + goto invalid_position; + GST_LOG_OBJECT (p, "Need to seek to stream time %" GST_TIME_FORMAT, + GST_TIME_ARGS (stream_time)); + keyunit_ts = MIN (seeksegment.position, stream_time); + } + } + + if (keyframe && keyunit_ts != seeksegment.position) { + GST_INFO_OBJECT (demux, "key unit seek, adjusting segment start to " + "%" GST_TIME_FORMAT, GST_TIME_ARGS (keyunit_ts)); + gst_segment_do_seek (&seeksegment, rate, format, flags, + start_type, keyunit_ts, stop_type, stop, &update); + } + + /* Finally set the position to the calculated position */ + if (flush || keyunit_ts != demux->segment.position) { + guint64 new_offset = -1; + /* Do the actual seeking */ for (i 
= 0; i < demux->src->len; i++) { - MXFMetadataTrackType track_type = MXF_METADATA_TRACK_UNKNOWN; GstMXFDemuxPad *p = g_ptr_array_index (demux->src, i); gint64 position; guint64 off; - if (p->material_track != NULL) - track_type = p->material_track->parent.type; - /* Reset EOS flag on all pads */ p->eos = FALSE; - gst_mxf_demux_pad_set_position (demux, p, start); + gst_mxf_demux_pad_set_position (demux, p, seeksegment.position); /* we always want to send data starting with a key unit */ position = p->current_essence_track_position; @@ -3948,12 +5284,8 @@ p->current_essence_track->source_track->edit_rate.n); } p->current_essence_track_position = position; - - /* FIXME: what about DV + MPEG-TS container essence tracks? */ - if (track_type == MXF_METADATA_TRACK_PICTURE_ESSENCE) { - keyunit_ts = MIN (p->position, keyunit_ts); - } } + p->current_essence_track->position = p->current_essence_track_position; p->discont = TRUE; } gst_flow_combiner_reset (demux->flowcombiner); @@ -3964,6 +5296,13 @@ demux->offset = new_offset + demux->run_in; } gst_mxf_demux_set_partition_for_offset (demux, demux->offset); + /* Reset the state accordingly */ + if (demux->current_partition->single_track + && demux->current_partition->single_track->wrapping != + MXF_ESSENCE_WRAPPING_FRAME_WRAPPING) + demux->state = GST_MXF_DEMUX_STATE_ESSENCE; + else + demux->state = GST_MXF_DEMUX_STATE_KLV; } if (G_UNLIKELY (demux->close_seg_event)) { @@ -3987,13 +5326,6 @@ gst_event_set_seqnum (demux->close_seg_event, demux->seqnum); } - if (keyframe && keyunit_ts != start) { - GST_INFO_OBJECT (demux, "key unit seek, adjusting segment start to " - "%" GST_TIME_FORMAT, GST_TIME_ARGS (keyunit_ts)); - gst_segment_do_seek (&seeksegment, rate, format, flags, - start_type, keyunit_ts, stop_type, stop, &update); - } - /* Ok seek succeeded, take the newly configured segment */ memcpy (&demux->segment, &seeksegment, sizeof (GstSegment)); @@ -4047,6 +5379,23 @@ GST_WARNING_OBJECT (demux, "metadata can't be resolved"); 
return FALSE; } + +invalid_position: + { + if (flush) { + GstEvent *e; + + /* Stop flushing, the sinks are at time 0 now */ + e = gst_event_new_flush_stop (TRUE); + gst_event_set_seqnum (e, seqnum); + gst_mxf_demux_push_src_event (demux, e); + } + gst_pad_start_task (demux->sinkpad, + (GstTaskFunction) gst_mxf_demux_loop, demux->sinkpad, NULL); + GST_PAD_STREAM_UNLOCK (demux->sinkpad); + GST_WARNING_OBJECT (demux, "Requested seek position is not valid"); + return FALSE; + } } static gboolean @@ -4669,7 +6018,7 @@ g_object_class_install_property (gobject_class, PROP_MAX_DRIFT, g_param_spec_uint64 ("max-drift", "Maximum drift", "Maximum number of nanoseconds by which tracks can differ", - 100 * GST_MSECOND, G_MAXUINT64, 500 * GST_MSECOND, + 100 * GST_MSECOND, G_MAXUINT64, DEFAULT_MAX_DRIFT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_STRUCTURE, @@ -4707,7 +6056,7 @@ gst_element_add_pad (GST_ELEMENT (demux), demux->sinkpad); - demux->max_drift = 500 * GST_MSECOND; + demux->max_drift = DEFAULT_MAX_DRIFT; demux->adapter = gst_adapter_new (); demux->flowcombiner = gst_flow_combiner_new ();
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfdemux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfdemux.h
Changed
@@ -49,23 +49,76 @@ typedef struct _GstMXFDemuxPad GstMXFDemuxPad; typedef struct _GstMXFDemuxPadClass GstMXFDemuxPadClass; -typedef struct + +/* + * GstMXFKLV is used to pass around information about a KLV. + * + * It optionally contains the content of the klv (data field). + */ +typedef struct { + MXFUL key; + guint64 offset; /* absolute offset of K */ + gsize length; /* Size of data (i.e. V) */ + guint64 data_offset; /* relative offset of data (i.e. size of 'KL') */ + GstBuffer *data; /* Can be NULL in pull-mode. */ + + /* For partial reads (ex: clip/custom wrapping essence), the amount of data + * already consumed within. If 0, all of length+data_offset was consumed */ + guint64 consumed; +} GstMXFKLV; + + +typedef enum { + GST_MXF_DEMUX_STATE_UNKNOWN, /* Still looking for run-in/klv */ + GST_MXF_DEMUX_STATE_KLV, /* Next read/fetch is a KLV */ + GST_MXF_DEMUX_STATE_ESSENCE /* Next read/fetch is within a KLV (i.e. non-frame-wrapped) */ +} GstMXFDemuxState; + +typedef struct _GstMXFDemuxPartition GstMXFDemuxPartition; +typedef struct _GstMXFDemuxEssenceTrack GstMXFDemuxEssenceTrack; + +struct _GstMXFDemuxPartition { MXFPartitionPack partition; MXFPrimerPack primer; gboolean parsed_metadata; + + /* Relative offset at which essence starts within this partition. 
+ * + * For Frame wrapping, the position of the first KLV + * For Clip/Custom wrapping, the position of the first byte of essence in the KLV + **/ guint64 essence_container_offset; -} GstMXFDemuxPartition; -typedef struct + /* If the partition contains a single essence track, point to it */ + GstMXFDemuxEssenceTrack *single_track; + + /* For clip-based wrapping, the essence KLV */ + GstMXFKLV clip_klv; +}; + +#define MXF_INDEX_DELTA_ID_UNKNOWN -1 +#define MXF_INDEX_DELTA_ID_IGNORE -2 + +struct _GstMXFDemuxEssenceTrack { guint32 body_sid; guint32 index_sid; guint32 track_number; + /* delta id, the position of this track in the container package delta table + * (if the track is in an interleaved essence container) + * + * Special values: + * * -1 Not discovered yet + * * -2 Ignore delta entry (if index table is not present or not complete) + */ + gint32 delta_id; + guint32 track_id; MXFUMID source_package_uid; + /* Position and duration in edit units */ gint64 position; gint64 duration; @@ -82,11 +135,18 @@ GstCaps *caps; gboolean intra_only; -} GstMXFDemuxEssenceTrack; + + MXFEssenceWrapping wrapping; + + /* Minimum number of edit unit to send in one go. + * Default : 1 + * Used for raw audio track */ + guint min_edit_units; +}; typedef struct { - /* 0 if uninitialized */ + /* absolute byte offset excluding run_in, 0 if uninitialized */ guint64 offset; /* PTS edit unit number or G_MAXUINT64 */ @@ -95,8 +155,14 @@ /* DTS edit unit number if we got here via PTS */ guint64 dts; + /* Duration in edit units */ + guint64 duration; + gboolean keyframe; gboolean initialized; + + /* Size, used for non-frame-wrapped content */ + guint64 size; } GstMXFDemuxIndex; typedef struct @@ -104,8 +170,23 @@ guint32 body_sid; guint32 index_sid; - /* offsets indexed by DTS */ - GArray *offsets; + /* Array of MXFIndexTableSegment, sorted by DTS + * Note: Can be empty and can be sparse (i.e. 
not cover every edit unit) */ + GArray *segments; + + /* Delta entry to which reordering should be applied (-1 == no reordering) */ + gint reordered_delta_entry; + + /* Array of gint8 reverse temporal offsets. + * Contains the shift to apply to an entry DTS to get the PTS + * + * Can be NULL if the content doesn't have temporal shifts (i.e. all present + * entries have a temporal offset of 0) */ + GArray *reverse_temporal_offsets; + + /* Greatest temporal offset value contained within offsets. + * Unsigned because the smallest value is 0 (no reordering) */ + guint max_temporal_offset; } GstMXFDemuxIndexTable; struct _GstMXFDemuxPad @@ -117,7 +198,7 @@ GstClockTime position; gdouble position_accumulated_error; - /* Current position in the material track */ + /* Current position in the material track (in edit units) */ gint64 current_material_track_position; gboolean eos, discont; @@ -139,6 +220,7 @@ gint64 current_component_start; gint64 current_component_duration; + /* Current essence track and position (in edit units) */ GstMXFDemuxEssenceTrack *current_essence_track; gint64 current_essence_track_position; }; @@ -156,6 +238,8 @@ GPtrArray *src; /* < private > */ + GstMXFDemuxState state; + gboolean have_group_id; guint group_id; @@ -199,6 +283,7 @@ MXFMetadataPreface *preface; GHashTable *metadata; + /* Current Material Package */ MXFUMID current_package_uid; MXFMetadataGenericPackage *current_package; gchar *current_package_string; @@ -208,6 +293,9 @@ /* Properties */ gchar *requested_package_string; GstClockTime max_drift; + + /* Quirks */ + gboolean temporal_order_misuse; }; struct _GstMXFDemuxClass
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfdms1.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfdms1.c
Changed
@@ -1656,7 +1656,7 @@ memcpy (self->identifier_kind, tag_data, tag_size); GST_DEBUG (" identifier kind = %s", self->identifier_kind); } else if (memcmp (tag_ul, &identifier_value_ul, 16) == 0) { - self->identifier_value = g_memdup (tag_data, tag_size); + self->identifier_value = g_memdup2 (tag_data, tag_size); self->identifier_value_length = tag_size; GST_DEBUG (" identifier value length = %u", tag_size); } else if (memcmp (tag_ul, &identification_locator_ul, 16) == 0) {
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfessence.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfessence.h
Changed
@@ -29,7 +29,8 @@ typedef enum { MXF_ESSENCE_WRAPPING_FRAME_WRAPPING, MXF_ESSENCE_WRAPPING_CLIP_WRAPPING, - MXF_ESSENCE_WRAPPING_CUSTOM_WRAPPING + MXF_ESSENCE_WRAPPING_CUSTOM_WRAPPING, + MXF_ESSENCE_WRAPPING_UNKNOWN_WRAPPING } MXFEssenceWrapping; typedef GstFlowReturn (*MXFEssenceElementHandleFunc) (const MXFUL *key, GstBuffer *buffer, GstCaps *caps, MXFMetadataTimelineTrack *track, gpointer mapping_data, GstBuffer **outbuf);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfmetadata.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfmetadata.c
Changed
@@ -252,7 +252,7 @@ mxf_primer_pack_add_mapping (primer, 0x0000, &t->ul); memcpy (tmp->data, t->data, t->size); } else { - tmp->data = g_memdup (t->data, t->size); + tmp->data = g_memdup2 (t->data, t->size); } tags = g_list_prepend (tags, tmp); } @@ -2247,6 +2247,8 @@ d = MXF_METADATA_FILE_DESCRIPTOR (current); + self->is_interleaved = MXF_IS_METADATA_MULTIPLE_DESCRIPTOR (self->descriptor); + for (i = 0; i < package->n_tracks; i++) { if (!package->tracks[i]) continue; @@ -3400,6 +3402,16 @@ gchar str[96]; #endif + if (mxf_umid_is_zero (&self->source_package_id)) { + /* S377-1:2019 B.10 Source Clip. + * + * SourcePackageID: The value shall be 32 zero valued bytes to terminate the + * source reference chain. */ + GST_LOG ("Skipping termination source package for source clip %s", + mxf_uuid_to_string (&MXF_METADATA_BASE (self)->instance_uid, str)); + goto chain_up; + } + g_hash_table_iter_init (&iter, metadata); while (g_hash_table_iter_next (&iter, NULL, (gpointer) & current)) { @@ -3418,6 +3430,7 @@ mxf_umid_to_string (&self->source_package_id, str)); } +chain_up: return MXF_METADATA_BASE_CLASS (mxf_metadata_source_clip_parent_class)->resolve (m, metadata);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfmetadata.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfmetadata.h
Changed
@@ -509,6 +509,9 @@ MXFMetadataGenericDescriptor *descriptor; gboolean top_level; + + /* TRUE if descriptor is multi-descriptor, i.e. content is interleaved */ + gboolean is_interleaved; }; typedef enum {
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfmpeg.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfmpeg.c
Changed
@@ -590,6 +590,11 @@ /* RP224 */ +static const MXFUL video_mpeg_compression = { { + 0x06, 0x0E, 0x2B, 0x34, 0x04, 0x01, 0x01, 0x01, 0x04, 0x01, 0x02, 0x02, + 0x01,} +}; + static const MXFUL sound_essence_compression_ac3 = { { 0x06, 0x0E, 0x2B, 0x34, 0x04, 0x01, 0x01, 0x01, 0x04, 0x02, 0x02, 0x02, 0x03, 0x02, 0x01, 0x00} @@ -634,13 +639,16 @@ GstCaps *caps = NULL; const gchar *codec_name = NULL; MXFMPEGEssenceType t, *mdata; + gchar str[48]; *mapping_data = g_malloc (sizeof (MXFMPEGEssenceType)); mdata = (MXFMPEGEssenceType *) * mapping_data; /* SMPTE RP224 */ if (p) { - if (mxf_ul_is_zero (&p->picture_essence_coding)) { + if (mxf_ul_is_zero (&p->picture_essence_coding) || + mxf_ul_is_equal (&p->picture_essence_coding, + &p->parent.essence_container)) { GST_WARNING ("No picture essence coding defined, assuming MPEG2"); caps = gst_caps_new_simple ("video/mpeg", "mpegversion", G_TYPE_INT, 2, @@ -649,19 +657,10 @@ t = MXF_MPEG_ESSENCE_TYPE_VIDEO_MPEG2; memcpy (mdata, &t, sizeof (MXFMPEGEssenceType)); *intra_only = FALSE; - } else if (p->picture_essence_coding.u[0] != 0x06 - || p->picture_essence_coding.u[1] != 0x0e - || p->picture_essence_coding.u[2] != 0x2b - || p->picture_essence_coding.u[3] != 0x34 - || p->picture_essence_coding.u[4] != 0x04 - || p->picture_essence_coding.u[5] != 0x01 - || p->picture_essence_coding.u[6] != 0x01 - || p->picture_essence_coding.u[8] != 0x04 - || p->picture_essence_coding.u[9] != 0x01 - || p->picture_essence_coding.u[10] != 0x02 - || p->picture_essence_coding.u[11] != 0x02 - || p->picture_essence_coding.u[12] != 0x01) { - GST_ERROR ("No MPEG picture essence coding"); + } else if (!mxf_ul_is_subclass (&video_mpeg_compression, + &p->picture_essence_coding)) { + GST_ERROR ("Not MPEG picture essence coding %s", + mxf_ul_to_string (&p->picture_essence_coding, str)); caps = NULL; } else if (p->picture_essence_coding.u[13] >= 0x01 && p->picture_essence_coding.u[13] <= 0x08) { @@ -1322,7 +1321,7 @@ codec_data = gst_value_get_buffer (v); 
gst_buffer_map ((GstBuffer *) codec_data, &map, GST_MAP_READ); t->size = map.size; - t->data = g_memdup (map.data, map.size); + t->data = g_memdup2 (map.data, map.size); gst_buffer_unmap ((GstBuffer *) codec_data, &map); memcpy (&t->ul, &sony_mpeg4_extradata, 16); mxf_local_tag_insert (t, &MXF_METADATA_BASE (ret)->other_tags);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfmux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfmux.c
Changed
@@ -37,6 +37,7 @@ #include <math.h> #include <string.h> +#include "gstmxfelements.h" #include "mxfmux.h" #ifdef HAVE_SYS_UTSNAME_H @@ -120,6 +121,8 @@ #define gst_mxf_mux_parent_class parent_class G_DEFINE_TYPE (GstMXFMux, gst_mxf_mux, GST_TYPE_AGGREGATOR); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mxfmux, "mxfmux", GST_RANK_PRIMARY, + GST_TYPE_MXF_MUX, mxf_element_init (plugin)); static void gst_mxf_mux_finalize (GObject * object);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxftypes.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxftypes.c
Changed
@@ -1184,19 +1184,19 @@ segment->delta_entries[i].pos_table_index = GST_READ_UINT8 (tag_data); tag_data += 1; tag_size -= 1; - GST_DEBUG (" pos table index = %d", + GST_DEBUG (" pos table index = %d", segment->delta_entries[i].pos_table_index); segment->delta_entries[i].slice = GST_READ_UINT8 (tag_data); tag_data += 1; tag_size -= 1; - GST_DEBUG (" slice = %u", segment->delta_entries[i].slice); + GST_DEBUG (" slice = %u", segment->delta_entries[i].slice); segment->delta_entries[i].element_delta = GST_READ_UINT32_BE (tag_data); tag_data += 4; tag_size -= 4; - GST_DEBUG (" element delta = %u", + GST_DEBUG (" element delta = %u", segment->delta_entries[i].element_delta); } break; @@ -1236,22 +1236,26 @@ entry->temporal_offset = GST_READ_UINT8 (tag_data); tag_data += 1; tag_size -= 1; - GST_DEBUG (" temporal offset = %d", entry->temporal_offset); + GST_DEBUG (" temporal offset = %d", entry->temporal_offset); entry->key_frame_offset = GST_READ_UINT8 (tag_data); tag_data += 1; tag_size -= 1; - GST_DEBUG (" keyframe offset = %d", entry->key_frame_offset); + GST_DEBUG (" keyframe offset = %d", entry->key_frame_offset); entry->flags = GST_READ_UINT8 (tag_data); tag_data += 1; tag_size -= 1; - GST_DEBUG (" flags = 0x%02x", entry->flags); + GST_DEBUG (" flags = 0x%02x (%s%s%s%s)", entry->flags, + entry->flags & 0x80 ? "Random-Access " : "", + entry->flags & 0x40 ? "Sequence-Header " : "", + entry->flags & 0x20 ? "Forward-Prediction " : "", + entry->flags & 0x10 ? 
"Backward-Prediction " : ""); entry->stream_offset = GST_READ_UINT64_BE (tag_data); tag_data += 8; tag_size -= 8; - GST_DEBUG (" stream offset = %" G_GUINT64_FORMAT, + GST_DEBUG (" stream offset = %" G_GUINT64_FORMAT, entry->stream_offset); entry->slice_offset = g_new0 (guint32, segment->slice_count); @@ -1259,7 +1263,7 @@ entry->slice_offset[j] = GST_READ_UINT32_BE (tag_data); tag_data += 4; tag_size -= 4; - GST_DEBUG (" slice %u offset = %u", j, entry->slice_offset[j]); + GST_DEBUG (" slice %u offset = %u", j, entry->slice_offset[j]); } entry->pos_table = g_new0 (MXFFraction, segment->pos_table_count); @@ -1268,7 +1272,7 @@ goto error; tag_data += 8; tag_size -= 8; - GST_DEBUG (" pos table %u = %d/%d", j, entry->pos_table[j].n, + GST_DEBUG (" pos table %u = %d/%d", j, entry->pos_table[j].n, entry->pos_table[j].d); } } @@ -1281,6 +1285,21 @@ break; } } + + /* If edit unit byte count is 0 there *must* be entries */ + if (segment->edit_unit_byte_count == 0 && segment->n_index_entries == 0) { + GST_WARNING + ("Invalid IndexTableSegment, No entries and no specified edit unit byte count"); + goto error; + } + + /* Compute initial essence offset */ + if (segment->edit_unit_byte_count) + segment->segment_start_offset = + segment->index_start_position * segment->edit_unit_byte_count; + else + segment->segment_start_offset = segment->index_entries[0].stream_offset; + return TRUE; error: @@ -1687,7 +1706,7 @@ local_tag = g_slice_new0 (MXFLocalTag); memcpy (&local_tag->ul, ul, sizeof (MXFUL)); local_tag->size = tag_size; - local_tag->data = tag_size == 0 ? NULL : g_memdup (tag_data, tag_size); + local_tag->data = tag_size == 0 ? NULL : g_memdup2 (tag_data, tag_size); local_tag->g_slice = FALSE; g_hash_table_insert (*hash_table, &local_tag->ul, local_tag);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxftypes.h -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxftypes.h
Changed
@@ -169,6 +169,11 @@ guint32 n_index_entries; MXFIndexEntry *index_entries; + + /* Computed fields (i.e. not present in file) */ + + /* Initial essence offset covered by this segment */ + guint64 segment_start_offset; } MXFIndexTableSegment; #define GST_TAG_MXF_UMID "mxf-umid"
View file
gst-plugins-bad-1.18.6.tar.xz/gst/mxf/mxfvc3.c -> gst-plugins-bad-1.20.1.tar.xz/gst/mxf/mxfvc3.c
Changed
@@ -89,8 +89,8 @@ *outbuf = buffer; /* SMPTE 2019-4 6.1 */ - if (key->u[12] != 0x15 || (key->u[14] != 0x05 && key->u[14] != 0x0C - && key->u[14] != 0x0D)) { + if (key->u[12] != 0x15 || (key->u[14] != 0x05 && key->u[14] != 0x06 + && key->u[14] != 0x0C && key->u[14] != 0x0D)) { GST_ERROR ("Invalid VC-3 essence element"); return GST_FLOW_ERROR; }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/netsim/gstnetsim.c -> gst-plugins-bad-1.20.1.tar.xz/gst/netsim/gstnetsim.c
Changed
@@ -93,6 +93,8 @@ GST_STATIC_CAPS_ANY); G_DEFINE_TYPE (GstNetSim, gst_net_sim, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (netsim, "netsim", + GST_RANK_MARGINAL, GST_TYPE_NET_SIM); static gboolean gst_net_sim_source_dispatch (GSource * source, @@ -763,8 +765,7 @@ static gboolean gst_net_sim_plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "netsim", - GST_RANK_MARGINAL, GST_TYPE_NET_SIM); + return GST_ELEMENT_REGISTER (netsim, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/netsim/gstnetsim.h -> gst-plugins-bad-1.20.1.tar.xz/gst/netsim/gstnetsim.h
Changed
@@ -96,6 +96,7 @@ }; GType gst_net_sim_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (netsim); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/onvif/gstrtponvif.c -> gst-plugins-bad-1.20.1.tar.xz/gst/onvif/gstrtponvif.c
Changed
@@ -30,14 +30,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "rtponviftimestamp", GST_RANK_NONE, - GST_TYPE_RTP_ONVIF_TIMESTAMP)) - return FALSE; - if (!gst_element_register (plugin, "rtponvifparse", GST_RANK_NONE, - GST_TYPE_RTP_ONVIF_PARSE)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (rtponviftimestamp, plugin); + ret |= GST_ELEMENT_REGISTER (rtponvifparse, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/onvif/gstrtponvifparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/onvif/gstrtponvifparse.c
Changed
@@ -48,6 +48,8 @@ ); G_DEFINE_TYPE (GstRtpOnvifParse, gst_rtp_onvif_parse, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (rtponvifparse, "rtponvifparse", + GST_RANK_NONE, GST_TYPE_RTP_ONVIF_PARSE); static void gst_rtp_onvif_parse_class_init (GstRtpOnvifParseClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/onvif/gstrtponvifparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/onvif/gstrtponvifparse.h
Changed
@@ -54,6 +54,7 @@ }; GType gst_rtp_onvif_parse_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (rtponvifparse); #ifdef __cplusplus }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/onvif/gstrtponviftimestamp.c -> gst-plugins-bad-1.20.1.tar.xz/gst/onvif/gstrtponviftimestamp.c
Changed
@@ -78,6 +78,8 @@ /*static guint gst_rtp_onvif_timestamp_signals[LAST_SIGNAL] = { 0 }; */ G_DEFINE_TYPE (GstRtpOnvifTimestamp, gst_rtp_onvif_timestamp, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (rtponviftimestamp, "rtponviftimestamp", + GST_RANK_NONE, GST_TYPE_RTP_ONVIF_TIMESTAMP); static void gst_rtp_onvif_timestamp_get_property (GObject * object,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/onvif/gstrtponviftimestamp.h -> gst-plugins-bad-1.20.1.tar.xz/gst/onvif/gstrtponviftimestamp.h
Changed
@@ -75,6 +75,7 @@ }; GType gst_rtp_onvif_timestamp_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (rtponviftimestamp); #ifdef __cplusplus }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pcapparse/gstirtspparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/pcapparse/gstirtspparse.c
Changed
@@ -80,6 +80,8 @@ #define parent_class gst_irtsp_parse_parent_class G_DEFINE_TYPE (GstIRTSPParse, gst_irtsp_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE (irtspparse, "irtspparse", GST_RANK_NONE, + GST_TYPE_IRTSP_PARSE); static void gst_irtsp_parse_class_init (GstIRTSPParseClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pcapparse/gstirtspparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/pcapparse/gstirtspparse.h
Changed
@@ -76,6 +76,7 @@ }; GType gst_irtsp_parse_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (irtspparse); G_END_DECLS #endif /* __GST_IRTSP_PARSE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pcapparse/gstpcapparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/pcapparse/gstpcapparse.c
Changed
@@ -107,6 +107,8 @@ #define parent_class gst_pcap_parse_parent_class G_DEFINE_TYPE (GstPcapParse, gst_pcap_parse, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (pcapparse, "pcapparse", GST_RANK_NONE, + GST_TYPE_PCAP_PARSE); static void gst_pcap_parse_class_init (GstPcapParseClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pcapparse/gstpcapparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/pcapparse/gstpcapparse.h
Changed
@@ -94,6 +94,7 @@ }; GType gst_pcap_parse_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (pcapparse); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pcapparse/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/pcapparse/plugin.c
Changed
@@ -27,12 +27,10 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret; + gboolean ret = FALSE; - ret = gst_element_register (plugin, "pcapparse", - GST_RANK_NONE, GST_TYPE_PCAP_PARSE); - ret &= gst_element_register (plugin, "irtspparse", - GST_RANK_NONE, GST_TYPE_IRTSP_PARSE); + ret |= GST_ELEMENT_REGISTER (pcapparse, plugin); + ret |= GST_ELEMENT_REGISTER (irtspparse, plugin); return ret; }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pnm/gstpnm.c -> gst-plugins-bad-1.20.1.tar.xz/gst/pnm/gstpnm.c
Changed
@@ -23,21 +23,16 @@ #include "gstpnmdec.h" #include "gstpnmenc.h" -#include <gst/gst.h> - -#include <string.h> static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "pnmdec", GST_RANK_PRIMARY, - GST_TYPE_PNMDEC)) - return FALSE; - if (!gst_element_register (plugin, "pnmenc", GST_RANK_PRIMARY, - GST_TYPE_PNMENC)) - return FALSE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (pnmdec, plugin); + ret |= GST_ELEMENT_REGISTER (pnmenc, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, pnm,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pnm/gstpnmdec.c -> gst-plugins-bad-1.20.1.tar.xz/gst/pnm/gstpnmdec.c
Changed
@@ -56,6 +56,8 @@ gst_pnmdec_parse_ascii (GstPnmdec * s, const guint8 * b, guint bs); G_DEFINE_TYPE (GstPnmdec, gst_pnmdec, GST_TYPE_VIDEO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (pnmdec, "pnmdec", GST_RANK_PRIMARY, + GST_TYPE_PNMDEC); static GstStaticPadTemplate gst_pnmdec_src_pad_template = GST_STATIC_PAD_TEMPLATE ("src",
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pnm/gstpnmdec.h -> gst-plugins-bad-1.20.1.tar.xz/gst/pnm/gstpnmdec.h
Changed
@@ -54,7 +54,7 @@ }; GType gst_pnmdec_get_type (void) G_GNUC_CONST; - +GST_ELEMENT_REGISTER_DECLARE (pnmdec); G_END_DECLS #endif /* __GST_PNMDEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pnm/gstpnmenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/pnm/gstpnmenc.c
Changed
@@ -65,8 +65,11 @@ GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS (MIME_ALL)); -G_DEFINE_TYPE (GstPnmenc, gst_pnmenc, GST_TYPE_VIDEO_ENCODER); #define parent_class gst_pnmenc_parent_class +G_DEFINE_TYPE (GstPnmenc, gst_pnmenc, GST_TYPE_VIDEO_ENCODER); +GST_ELEMENT_REGISTER_DEFINE (pnmenc, "pnmenc", GST_RANK_PRIMARY, + GST_TYPE_PNMENC); + static GstFlowReturn gst_pnmenc_handle_frame (GstVideoEncoder * encoder, GstVideoCodecFrame * frame);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/pnm/gstpnmenc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/pnm/gstpnmenc.h
Changed
@@ -51,6 +51,7 @@ }; GType gst_pnmenc_get_type (void) G_GNUC_CONST; +GST_ELEMENT_REGISTER_DECLARE (pnmenc); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/proxy/gstproxy.c -> gst-plugins-bad-1.20.1.tar.xz/gst/proxy/gstproxy.c
Changed
@@ -29,11 +29,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "proxysrc", GST_RANK_NONE, GST_TYPE_PROXY_SRC); - gst_element_register (plugin, "proxysink", GST_RANK_NONE, - GST_TYPE_PROXY_SINK); + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (proxysrc, plugin); + ret |= GST_ELEMENT_REGISTER (proxysink, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/proxy/gstproxysink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/proxy/gstproxysink.c
Changed
@@ -54,6 +54,8 @@ /* Unlink proxysrc, we don't contain any elements so our parent is GstElement */ #define parent_class gst_proxy_sink_parent_class G_DEFINE_TYPE (GstProxySink, gst_proxy_sink, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (proxysink, "proxysink", GST_RANK_NONE, + GST_TYPE_PROXY_SINK); static gboolean gst_proxy_sink_sink_query (GstPad * pad, GstObject * parent, GstQuery * query);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/proxy/gstproxysink.h -> gst-plugins-bad-1.20.1.tar.xz/gst/proxy/gstproxysink.h
Changed
@@ -55,6 +55,7 @@ }; GType gst_proxy_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (proxysink); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/proxy/gstproxysrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/proxy/gstproxysrc.c
Changed
@@ -100,6 +100,8 @@ * element */ #define parent_class gst_proxy_src_parent_class G_DEFINE_TYPE (GstProxySrc, gst_proxy_src, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (proxysrc, "proxysrc", GST_RANK_NONE, + GST_TYPE_PROXY_SRC); static gboolean gst_proxy_src_internal_src_query (GstPad * pad, GstObject * parent, GstQuery * query);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/proxy/gstproxysrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/proxy/gstproxysrc.h
Changed
@@ -63,6 +63,7 @@ }; GType gst_proxy_src_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (proxysrc); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rawparse/gstaudioparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rawparse/gstaudioparse.c
Changed
@@ -122,6 +122,8 @@ #define gst_audio_parse_parent_class parent_class G_DEFINE_TYPE (GstAudioParse, gst_audio_parse, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (audioparse, "audioparse", GST_RANK_NONE, + gst_audio_parse_get_type ()); static void gst_audio_parse_class_init (GstAudioParseClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rawparse/gstaudioparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/rawparse/gstaudioparse.h
Changed
@@ -50,5 +50,6 @@ }; GType gst_audio_parse_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (audioparse); #endif /* __GST_AUDIO_PARSE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rawparse/gstvideoparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rawparse/gstvideoparse.c
Changed
@@ -82,6 +82,8 @@ #define gst_video_parse_parent_class parent_class G_DEFINE_TYPE (GstVideoParse, gst_video_parse, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (videoparse, "videoparse", GST_RANK_NONE, + gst_video_parse_get_type ()); static void gst_video_parse_class_init (GstVideoParseClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rawparse/gstvideoparse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/rawparse/gstvideoparse.h
Changed
@@ -51,5 +51,6 @@ }; GType gst_video_parse_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (videoparse); #endif /* __GST_VIDEO_PARSE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rawparse/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rawparse/plugin.c
Changed
@@ -9,12 +9,10 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret; + gboolean ret = FALSE; - ret = gst_element_register (plugin, "videoparse", GST_RANK_NONE, - gst_video_parse_get_type ()); - ret &= gst_element_register (plugin, "audioparse", GST_RANK_NONE, - gst_audio_parse_get_type ()); + ret |= GST_ELEMENT_REGISTER (videoparse, plugin); + ret |= GST_ELEMENT_REGISTER (audioparse, plugin); return ret; }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/removesilence/gstremovesilence.c -> gst-plugins-bad-1.20.1.tar.xz/gst/removesilence/gstremovesilence.c
Changed
@@ -104,6 +104,8 @@ #define gst_remove_silence_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstRemoveSilence, gst_remove_silence, GST_TYPE_BASE_TRANSFORM, DEBUG_INIT (0)); +GST_ELEMENT_REGISTER_DEFINE (removesilence, "removesilence", GST_RANK_NONE, + gst_remove_silence_get_type ()); static void gst_remove_silence_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); @@ -427,8 +429,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "removesilence", GST_RANK_NONE, - gst_remove_silence_get_type ()); + return GST_ELEMENT_REGISTER (removesilence, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/removesilence/gstremovesilence.h -> gst-plugins-bad-1.20.1.tar.xz/gst/removesilence/gstremovesilence.h
Changed
@@ -60,6 +60,7 @@ } GstRemoveSilenceClass; GType gst_remove_silence_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (removesilence); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstrist.c
Added
@@ -0,0 +1,93 @@ +/* GStreamer RIST plugin + * Copyright (C) 2019 Net Insight AB + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstrist.h" +#include "gstroundrobin.h" + +/* + * rtp_ext_seq: + * @extseq: (inout): a previous extended seqs + * @seq: a new seq + * + * Update the @extseq field with the extended seq of @seq + * For the first call of the method, @extseq should point to a location + * with a value of -1. + * + * This function is able to handle both forward and backward seqs taking + * into account: + * - seq wraparound making sure that the returned value is properly increased. + * - seq unwraparound making sure that the returned value is properly decreased. + * + * Returns: The extended seq of @seq or 0 if the result can't go anywhere backwards. + * + * NOTE: This is a calque of gst_rtp_buffer_ext_timestamp() but with + * s/32/16/ and s/64/32/ and s/0xffffffff/0xffff/ and s/timestamp/seqnum/. + */ +guint32 +gst_rist_rtp_ext_seq (guint32 * extseqnum, guint16 seqnum) +{ + guint32 result, ext; + + g_return_val_if_fail (extseqnum != NULL, -1); + + ext = *extseqnum; + + if (ext == -1) { + result = seqnum; + } else { + /* pick wraparound counter from previous seqnum and add to new seqnum */ + result = seqnum + (ext & ~(0xffff)); + + /* check for seqnum wraparound */ + if (result < ext) { + guint32 diff = ext - result; + + if (diff > G_MAXINT16) { + /* seqnum went backwards more than allowed, we wrap around and get + * updated extended seqnum. */ + result += (1 << 16); + } + } else { + guint32 diff = result - ext; + + if (diff > G_MAXINT16) { + if (result < (1 << 16)) { + GST_WARNING + ("Cannot unwrap, any wrapping took place yet. Returning 0 without updating extended seqnum."); + return 0; + } else { + /* seqnum went forwards more than allowed, we unwrap around and get + * updated extended seqnum. */ + result -= (1 << 16); + /* We don't want the extended seqnum storage to go back, ever */ + return result; + } + } + } + } + + *extseqnum = result; + + return result; +}
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstrist.h -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstrist.h
Changed
@@ -30,6 +30,7 @@ GstElementClass parent_class; } GstRistRtxReceiveClass; GType gst_rist_rtx_receive_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ristrtxreceive); #define GST_TYPE_RIST_RTX_SEND (gst_rist_rtx_send_get_type()) #define GST_RIST_RTX_SEND(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_RIST_RTX_SEND, GstRistRtxSend)) @@ -38,6 +39,7 @@ GstElementClass parent_class; } GstRistRtxSendClass; GType gst_rist_rtx_send_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ristrtxsend); #define GST_TYPE_RIST_SRC (gst_rist_src_get_type()) #define GST_RIST_SRC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_RIST_SRC,GstRistSrc)) @@ -46,7 +48,7 @@ GstBinClass parent; } GstRistSrcClass; GType gst_rist_src_get_type (void); - +GST_ELEMENT_REGISTER_DECLARE (ristsrc); #define GST_TYPE_RIST_SINK (gst_rist_sink_get_type()) #define GST_RIST_SINK(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_RIST_SINK,GstRistSink)) @@ -55,6 +57,7 @@ GstBinClass parent; } GstRistSinkClass; GType gst_rist_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ristsink); #define GST_TYPE_RIST_RTP_EXT (gst_rist_rtp_ext_get_type()) #define GST_RIST_RTP_EXT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_RIST_RTP_EXT,GstRistRtpExt)) @@ -63,6 +66,7 @@ GstElementClass parent; } GstRistRtpExtClass; GType gst_rist_rtp_ext_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ristrtpext); #define GST_TYPE_RIST_RTP_DEEXT (gst_rist_rtp_deext_get_type()) #define GST_RIST_RTP_DEEXT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_RIST_RTP_DEEXT,GstRistRtpDeext)) @@ -71,6 +75,7 @@ GstElementClass parent; } GstRistRtpDeextClass; GType gst_rist_rtp_deext_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (ristrtpdeext); guint32 gst_rist_rtp_ext_seq (guint32 * extseqnum, guint16 seqnum);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristplugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristplugin.c
Changed
@@ -25,99 +25,20 @@ #include "gstrist.h" #include "gstroundrobin.h" -/* - * rtp_ext_seq: - * @extseq: (inout): a previous extended seqs - * @seq: a new seq - * - * Update the @extseq field with the extended seq of @seq - * For the first call of the method, @extseq should point to a location - * with a value of -1. - * - * This function is able to handle both forward and backward seqs taking - * into account: - * - seq wraparound making sure that the returned value is properly increased. - * - seq unwraparound making sure that the returned value is properly decreased. - * - * Returns: The extended seq of @seq or 0 if the result can't go anywhere backwards. - * - * NOTE: This is a calque of gst_rtp_buffer_ext_timestamp() but with - * s/32/16/ and s/64/32/ and s/0xffffffff/0xffff/ and s/timestamp/seqnum/. - */ -guint32 -gst_rist_rtp_ext_seq (guint32 * extseqnum, guint16 seqnum) -{ - guint32 result, ext; - - g_return_val_if_fail (extseqnum != NULL, -1); - - ext = *extseqnum; - - if (ext == -1) { - result = seqnum; - } else { - /* pick wraparound counter from previous seqnum and add to new seqnum */ - result = seqnum + (ext & ~(0xffff)); - - /* check for seqnum wraparound */ - if (result < ext) { - guint32 diff = ext - result; - - if (diff > G_MAXINT16) { - /* seqnum went backwards more than allowed, we wrap around and get - * updated extended seqnum. */ - result += (1 << 16); - } - } else { - guint32 diff = result - ext; - - if (diff > G_MAXINT16) { - if (result < (1 << 16)) { - GST_WARNING - ("Cannot unwrap, any wrapping took place yet. Returning 0 without updating extended seqnum."); - return 0; - } else { - /* seqnum went forwards more than allowed, we unwrap around and get - * updated extended seqnum. */ - result -= (1 << 16); - /* We don't want the extended seqnum storage to go back, ever */ - return result; - } - } - } - } - - *extseqnum = result; - - return result; -} - static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "ristsrc", GST_RANK_PRIMARY, - GST_TYPE_RIST_SRC)) - return FALSE; - if (!gst_element_register (plugin, "ristsink", GST_RANK_PRIMARY, - GST_TYPE_RIST_SINK)) - return FALSE; - if (!gst_element_register (plugin, "ristrtxsend", GST_RANK_NONE, - GST_TYPE_RIST_RTX_SEND)) - return FALSE; - if (!gst_element_register (plugin, "ristrtxreceive", GST_RANK_NONE, - GST_TYPE_RIST_RTX_RECEIVE)) - return FALSE; - if (!gst_element_register (plugin, "roundrobin", GST_RANK_NONE, - GST_TYPE_ROUND_ROBIN)) - return FALSE; - if (!gst_element_register (plugin, "ristrtpext", GST_RANK_NONE, - GST_TYPE_RIST_RTP_EXT)) - return FALSE; - if (!gst_element_register (plugin, "ristrtpdeext", GST_RANK_NONE, - GST_TYPE_RIST_RTP_DEEXT)) - return FALSE; + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (ristsrc, plugin); + ret |= GST_ELEMENT_REGISTER (ristsink, plugin); + ret |= GST_ELEMENT_REGISTER (ristrtxsend, plugin); + ret |= GST_ELEMENT_REGISTER (ristrtxreceive, plugin); + ret |= GST_ELEMENT_REGISTER (roundrobin, plugin); + ret |= GST_ELEMENT_REGISTER (ristrtpext, plugin); + ret |= GST_ELEMENT_REGISTER (ristrtpdeext, plugin); - return TRUE; + return ret; }
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristrtpdeext.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristrtpdeext.c
Changed
@@ -76,6 +76,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRistRtpDeext, gst_rist_rtp_deext, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_rist_rtp_deext_debug, "ristrtpdeext", 0, "RIST RTP De-extension")); +GST_ELEMENT_REGISTER_DEFINE (ristrtpdeext, "ristrtpdeext", GST_RANK_NONE, + GST_TYPE_RIST_RTP_DEEXT); static guint8 bit_count (guint8 value)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristrtpext.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristrtpext.c
Changed
@@ -85,7 +85,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRistRtpExt, gst_rist_rtp_ext, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_rist_rtp_ext_debug, "ristrtpext", 0, "RIST RTP Extension")); - +GST_ELEMENT_REGISTER_DEFINE (ristrtpext, "ristrtpext", GST_RANK_NONE, + GST_TYPE_RIST_RTP_EXT); static GstFlowReturn gst_rist_rtp_ext_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristrtxreceive.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristrtxreceive.c
Changed
@@ -91,6 +91,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRistRtxReceive, gst_rist_rtx_receive, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_rist_rtx_receive_debug, "ristrtxreceive", 0, "RIST retransmission receiver")); +GST_ELEMENT_REGISTER_DEFINE (ristrtxreceive, "ristrtxreceive", + GST_RANK_NONE, GST_TYPE_RIST_RTX_RECEIVE); static void gst_rist_rtx_receive_class_init (GstRistRtxReceiveClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristrtxsend.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristrtxsend.c
Changed
@@ -122,6 +122,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRistRtxSend, gst_rist_rtx_send, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_rist_rtx_send_debug, "ristrtxsend", 0, "RIST retransmission sender")); +GST_ELEMENT_REGISTER_DEFINE (ristrtxsend, "ristrtxsend", GST_RANK_NONE, + GST_TYPE_RIST_RTX_SEND); typedef struct {
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristsink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristsink.c
Changed
@@ -188,6 +188,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRistSink, gst_rist_sink, GST_TYPE_BIN, GST_DEBUG_CATEGORY_INIT (gst_rist_sink_debug, "ristsink", 0, "RIST Sink")); +GST_ELEMENT_REGISTER_DEFINE (ristsink, "ristsink", GST_RANK_PRIMARY, + GST_TYPE_RIST_SINK); GQuark session_id_quark = 0; @@ -257,7 +259,7 @@ gst_element_add_pad (sink->rtxbin, pad); g_snprintf (name, 32, "send_rtp_sink_%u", bond->session); - pad = gst_element_get_request_pad (sink->rtpbin, name); + pad = gst_element_request_pad_simple (sink->rtpbin, name); gst_object_unref (pad); } @@ -693,6 +695,7 @@ "close-socket", FALSE, NULL); g_object_unref (socket); + g_object_set (bond->rtcp_sink, "sync", FALSE, "async", FALSE, NULL); gst_element_set_locked_state (bond->rtcp_sink, FALSE); gst_element_sync_state_with_parent (bond->rtcp_sink); @@ -707,10 +710,37 @@ } static GstStateChangeReturn +gst_rist_sink_reuse_socket (GstRistSink * sink) +{ + gint i; + + for (i = 0; i < sink->bonds->len; i++) { + RistSenderBond *bond = g_ptr_array_index (sink->bonds, i); + GObject *session = NULL; + GstPad *pad; + gchar name[32]; + + g_signal_emit_by_name (sink->rtpbin, "get-session", i, &session); + g_object_set (session, "rtcp-min-interval", sink->min_rtcp_interval, + "rtcp-fraction", sink->max_rtcp_bandwidth, NULL); + g_object_unref (session); + + g_snprintf (name, 32, "src_%u", bond->session); + pad = gst_element_request_pad_simple (sink->dispatcher, name); + gst_element_link_pads (sink->dispatcher, name, bond->rtx_queue, "sink"); + gst_object_unref (pad); + + if (!gst_rist_sink_setup_rtcp_socket (sink, bond)) + return GST_STATE_CHANGE_FAILURE; + } + + return GST_STATE_CHANGE_SUCCESS; +} + +static GstStateChangeReturn gst_rist_sink_start (GstRistSink * sink) { GstPad *rtxbin_gpad, *rtpext_sinkpad; - gint i; /* Unless a custom dispatcher was provided, use the specified bonding method * to create one */ @@ -747,26 +777,6 @@ gst_bin_add (GST_BIN (sink->rtxbin), sink->dispatcher); gst_element_link (sink->rtpext, sink->dispatcher); - for (i = 0; i < sink->bonds->len; i++) { - RistSenderBond *bond = g_ptr_array_index (sink->bonds, i); - GObject *session = NULL; - GstPad *pad; - gchar name[32]; - - g_signal_emit_by_name (sink->rtpbin, "get-session", i, &session); - g_object_set (session, "rtcp-min-interval", sink->min_rtcp_interval, - "rtcp-fraction", sink->max_rtcp_bandwidth, NULL); - g_object_unref (session); - - g_snprintf (name, 32, "src_%u", bond->session); - pad = gst_element_get_request_pad (sink->dispatcher, name); - gst_element_link_pads (sink->dispatcher, name, bond->rtx_queue, "sink"); - gst_object_unref (pad); - - if (!gst_rist_sink_setup_rtcp_socket (sink, bond)) - return GST_STATE_CHANGE_FAILURE; - } - return GST_STATE_CHANGE_SUCCESS; } @@ -894,6 +904,11 @@ GstStateChangeReturn ret; switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + /* Set the properties to the child elements to avoid binding to + * a NULL interface on a network without a default gateway */ + if (gst_rist_sink_start (sink) == GST_STATE_CHANGE_FAILURE) + return GST_STATE_CHANGE_FAILURE; case GST_STATE_CHANGE_PAUSED_TO_READY: gst_rist_sink_disable_stats_interval (sink); break; @@ -906,7 +921,7 @@ switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: - ret = gst_rist_sink_start (sink); + ret = gst_rist_sink_reuse_socket (sink); break; case GST_STATE_CHANGE_READY_TO_PAUSED: gst_rist_sink_enable_stats_interval (sink);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstristsrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstristsrc.c
Changed
@@ -163,6 +163,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRistSrc, gst_rist_src, GST_TYPE_BIN, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_rist_src_uri_init); GST_DEBUG_CATEGORY_INIT (gst_rist_src_debug, "ristsrc", 0, "RIST Source")); +GST_ELEMENT_REGISTER_DEFINE (ristsrc, "ristsrc", GST_RANK_PRIMARY, + GST_TYPE_RIST_SRC); /* called with bonds lock */ static RistReceiverBond *
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstroundrobin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstroundrobin.c
Changed
@@ -53,6 +53,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRoundRobin, gst_round_robin, GST_TYPE_ELEMENT, GST_DEBUG_CATEGORY_INIT (gst_round_robin_debug, "roundrobin", 0, "Round Robin")); +GST_ELEMENT_REGISTER_DEFINE (roundrobin, "roundrobin", GST_RANK_NONE, + GST_TYPE_ROUND_ROBIN); static GstFlowReturn gst_round_robin_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/gstroundrobin.h -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/gstroundrobin.h
Changed
@@ -30,5 +30,6 @@ GstElementClass parent; } GstRoundRobinClass; GType gst_round_robin_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (roundrobin); #endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rist/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/rist/meson.build
Changed
@@ -4,6 +4,7 @@ 'gstristrtxreceive.c', 'gstristsrc.c', 'gstristsink.c', + 'gstrist.c', 'gstristplugin.c', 'gstristrtpext.c', 'gstristrtpdeext.c'
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rtmp2/gstrtmp2.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/gstrtmp2.c
Changed
@@ -23,26 +23,18 @@ #include "config.h" #endif -#include "gstrtmp2src.h" -#include "gstrtmp2sink.h" +#include "gstrtmp2elements.h" -#include "rtmp/rtmpclient.h" static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "rtmp2src", GST_RANK_PRIMARY + 1, - GST_TYPE_RTMP2_SRC); - gst_element_register (plugin, "rtmp2sink", GST_RANK_PRIMARY + 1, - GST_TYPE_RTMP2_SINK); + gboolean ret = FALSE; - gst_type_mark_as_plugin_api (GST_TYPE_RTMP_SCHEME, 0); - gst_type_mark_as_plugin_api (GST_TYPE_RTMP_AUTHMOD, 0); -#if 0 - gst_type_mark_as_plugin_api (GST_TYPE_RTMP_STOP_COMMANDS, 0); -#endif + ret |= GST_ELEMENT_REGISTER (rtmp2src, plugin); + ret |= GST_ELEMENT_REGISTER (rtmp2sink, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/gstrtmp2element.c
Added
@@ -0,0 +1,41 @@ +/* GStreamer + * Copyright (C) 2014 David Schleef <ds@schleef.org> + * Copyright (C) 2017 Make.TV, Inc. <info@make.tv> + * Contact: Jan Alexander Steffens (heftig) <jsteffens@make.tv> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Suite 500, + * Boston, MA 02110-1335, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstrtmp2elements.h" + +#include "rtmp/rtmpclient.h" + +void +rtmp2_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { + gst_type_mark_as_plugin_api (GST_TYPE_RTMP_SCHEME, 0); + gst_type_mark_as_plugin_api (GST_TYPE_RTMP_AUTHMOD, 0); + gst_type_mark_as_plugin_api (GST_TYPE_RTMP_STOP_COMMANDS, 0); + g_once_init_leave (&res, TRUE); + } +}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/gstrtmp2elements.h
Added
@@ -0,0 +1,36 @@ +/* GStreamer + * Copyright (C) 2020 Huawei Technologies Co., Ltd. + * @Author: Stéphane Cerveau <stephane.cerveau@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_RTMP2_ELEMENTS_H__ +#define __GST_RTMP2_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void rtmp2_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (rtmp2sink); +GST_ELEMENT_REGISTER_DECLARE (rtmp2src); + +#endif /* __GST_RTMP2_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rtmp2/gstrtmp2sink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/gstrtmp2sink.c
Changed
@@ -38,6 +38,7 @@ #include "config.h" #endif +#include "gstrtmp2elements.h" #include "gstrtmp2sink.h" #include "gstrtmp2locationhandler.h" @@ -149,9 +150,7 @@ PROP_PEAK_KBPS, PROP_CHUNK_SIZE, PROP_STATS, -#if 0 PROP_STOP_COMMANDS, -#endif }; /* pad templates */ @@ -169,6 +168,8 @@ G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_rtmp2_sink_uri_handler_init); G_IMPLEMENT_INTERFACE (GST_TYPE_RTMP_LOCATION_HANDLER, NULL)); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (rtmp2sink, "rtmp2sink", + GST_RANK_PRIMARY + 1, GST_TYPE_RTMP2_SINK, rtmp2_element_init (plugin)); static void gst_rtmp2_sink_class_init (GstRtmp2SinkClass * klass) @@ -233,20 +234,18 @@ g_param_spec_boxed ("stats", "Stats", "Retrieve a statistics structure", GST_TYPE_STRUCTURE, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); -#if 0 - /* + /** * GstRtmp2Sink:stop-commands: * * Which commands (if any) to send on EOS event before closing connection * - * Since: 1.18 + * Since: 1.20 */ g_object_class_install_property (gobject_class, PROP_STOP_COMMANDS, g_param_spec_flags ("stop-commands", "Stop commands", "RTMP commands to send on EOS event before closing connection", GST_TYPE_RTMP_STOP_COMMANDS, GST_RTMP_DEFAULT_STOP_COMMANDS, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); -#endif gst_type_mark_as_plugin_api (GST_TYPE_RTMP_LOCATION_HANDLER, 0); GST_DEBUG_CATEGORY_INIT (gst_rtmp2_sink_debug_category, "rtmp2sink", 0, @@ -382,13 +381,11 @@ set_chunk_size (self); g_mutex_unlock (&self->lock); break; -#if 0 case PROP_STOP_COMMANDS: GST_OBJECT_LOCK (self); self->stop_commands = g_value_get_flags (value); GST_OBJECT_UNLOCK (self); break; -#endif default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec); break; break; @@ -486,13 +483,11 @@ case PROP_STATS: g_value_take_boxed (value, gst_rtmp2_sink_get_stats (self)); break; -#if 0 case PROP_STOP_COMMANDS: GST_OBJECT_LOCK (self); g_value_set_flags (value, self->stop_commands); GST_OBJECT_UNLOCK (self); break; -#endif default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec); break;
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rtmp2/gstrtmp2src.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/gstrtmp2src.c
Changed
@@ -36,6 +36,7 @@ #include "config.h" #endif +#include "gstrtmp2elements.h" #include "gstrtmp2src.h" #include "gstrtmp2locationhandler.h" @@ -157,6 +158,8 @@ G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_rtmp2_src_uri_handler_init); G_IMPLEMENT_INTERFACE (GST_TYPE_RTMP_LOCATION_HANDLER, NULL)); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (rtmp2src, "rtmp2src", + GST_RANK_PRIMARY + 1, GST_TYPE_RTMP2_SRC, rtmp2_element_init (plugin)); static void gst_rtmp2_src_class_init (GstRtmp2SrcClass * klass) @@ -680,6 +683,8 @@ self->sent_header = TRUE; } + GST_BUFFER_DTS (buffer) = self->last_ts; + *outbuf = buffer; gst_buffer_unref (message);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rtmp2/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/meson.build
Changed
@@ -1,5 +1,6 @@ rtmp2_sources = [ 'gstrtmp2.c', + 'gstrtmp2element.c', 'gstrtmp2locationhandler.c', 'gstrtmp2sink.c', 'gstrtmp2src.c',
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rtmp2/rtmp/amf.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/rtmp/amf.c
Changed
@@ -329,7 +329,7 @@ if (out_size) { *out_size = size; - return g_memdup (data, size); + return g_memdup2 (data, size); } else { return g_strndup (data, size); } @@ -444,9 +444,9 @@ if (size < 0) { size = strlen (value); - copy = g_memdup (value, size + 1); + copy = g_memdup2 (value, size + 1); } else { - copy = g_memdup (value, size); + copy = g_memdup2 (value, size); } gst_amf_node_take_string (node, copy, size);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/rtmp2/rtmp/rtmpclient.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtmp2/rtmp/rtmpclient.c
Changed
@@ -1308,6 +1308,7 @@ } } else { if (g_strcmp0 (code, "NetStream.Play.Start") == 0 || + g_strcmp0 (code, "NetStream.Play.PublishNotify") == 0 || g_strcmp0 (code, "NetStream.Play.Reset") == 0) { GST_INFO ("play success: %s", info_dump->str); g_task_return_boolean (task, TRUE);
gst-plugins-bad-1.18.6.tar.xz/gst/rtp/gstrtpsink.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtp/gstrtpsink.c
Changed
@@ -81,6 +81,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRtpSink, gst_rtp_sink, GST_TYPE_BIN, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_rtp_sink_uri_handler_init); GST_DEBUG_CATEGORY_INIT (gst_rtp_sink_debug, "rtpsink", 0, "RTP Sink")); +GST_ELEMENT_REGISTER_DEFINE (rtpsink, "rtpsink", GST_RANK_PRIMARY + 1, + GST_TYPE_RTP_SINK); #define GST_RTP_SINK_GET_LOCK(obj) (&((GstRtpSink*)(obj))->lock) #define GST_RTP_SINK_LOCK(obj) (g_mutex_lock (GST_RTP_SINK_GET_LOCK(obj))) @@ -251,7 +253,7 @@ return NULL; GST_RTP_SINK_LOCK (self); - rpad = gst_element_get_request_pad (self->rtpbin, "send_rtp_sink_%u"); + rpad = gst_element_request_pad_simple (self->rtpbin, "send_rtp_sink_%u"); if (rpad) { pad = gst_ghost_pad_new (GST_PAD_NAME (rpad), rpad); gst_element_add_pad (element, pad); @@ -442,9 +444,30 @@ } static gboolean -gst_rtp_sink_start (GstRtpSink * self) +gst_rtp_sink_reuse_socket (GstRtpSink * self) { GSocket *socket = NULL; + + gst_element_set_locked_state (self->rtcp_src, FALSE); + gst_element_sync_state_with_parent (self->rtcp_src); + + /* share the socket created by the sink */ + g_object_get (self->rtcp_src, "used-socket", &socket, NULL); + g_object_set (self->rtcp_sink, "socket", socket, "auto-multicast", FALSE, + "close-socket", FALSE, NULL); + g_object_unref (socket); + + g_object_set (self->rtcp_sink, "sync", FALSE, "async", FALSE, NULL); + gst_element_set_locked_state (self->rtcp_sink, FALSE); + gst_element_sync_state_with_parent (self->rtcp_sink); + + return TRUE; + +} + +static gboolean +gst_rtp_sink_start (GstRtpSink * self) +{ GInetAddress *iaddr = NULL; gchar *remote_addr = NULL; GError *error = NULL; @@ -495,18 +518,6 @@ g_free (remote_addr); g_object_unref (iaddr); - gst_element_set_locked_state (self->rtcp_src, FALSE); - gst_element_sync_state_with_parent (self->rtcp_src); - - /* share the socket created by the sink */ - g_object_get (self->rtcp_src, "used-socket", &socket, NULL); - g_object_set (self->rtcp_sink, "socket", socket, "auto-multicast", FALSE, - 
"close-socket", FALSE, NULL); - g_object_unref (socket); - - gst_element_set_locked_state (self->rtcp_sink, FALSE); - gst_element_sync_state_with_parent (self->rtcp_sink); - return TRUE; dns_resolve_failed: @@ -529,6 +540,10 @@ switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: + /* Set the properties to the child elements to avoid binding to + * a NULL interface on a network without a default gateway */ + if (gst_rtp_sink_start (self) == FALSE) + return GST_STATE_CHANGE_FAILURE; break; case GST_STATE_CHANGE_READY_TO_PAUSED: break; @@ -542,7 +557,8 @@ switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: - if (gst_rtp_sink_start (self) == FALSE) + /* re-use the sockets after they have been initialised */ + if (gst_rtp_sink_reuse_socket (self) == FALSE) return GST_STATE_CHANGE_FAILURE; break; case GST_STATE_CHANGE_READY_TO_PAUSED:
gst-plugins-bad-1.18.6.tar.xz/gst/rtp/gstrtpsink.h -> gst-plugins-bad-1.20.1.tar.xz/gst/rtp/gstrtpsink.h
Changed
@@ -43,8 +43,6 @@ { GstBin parent; - GstBin parent_instance; - /* Properties */ GstUri *uri; gint ttl; @@ -68,6 +66,7 @@ }; GType gst_rtp_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (rtpsink); G_END_DECLS #endif /* __GST_RTP_SINK_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/rtp/gstrtpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtp/gstrtpsrc.c
Changed
@@ -45,6 +45,8 @@ #include <config.h> #endif +#include <stdio.h> + #include <gst/net/net.h> #include <gst/rtp/gstrtppayloads.h> @@ -57,6 +59,7 @@ #define DEFAULT_PROP_TTL 64 #define DEFAULT_PROP_TTL_MC 1 #define DEFAULT_PROP_ENCODING_NAME NULL +#define DEFAULT_PROP_CAPS NULL #define DEFAULT_PROP_LATENCY 200 #define DEFAULT_PROP_ADDRESS "0.0.0.0" @@ -76,6 +79,7 @@ PROP_ENCODING_NAME, PROP_LATENCY, PROP_MULTICAST_IFACE, + PROP_CAPS, PROP_LAST }; @@ -87,6 +91,8 @@ G_DEFINE_TYPE_WITH_CODE (GstRtpSrc, gst_rtp_src, GST_TYPE_BIN, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_rtp_src_uri_handler_init); GST_DEBUG_CATEGORY_INIT (gst_rtp_src_debug, "rtpsrc", 0, "RTP Source")); +GST_ELEMENT_REGISTER_DEFINE (rtpsrc, "rtpsrc", GST_RANK_PRIMARY + 1, + GST_TYPE_RTP_SRC); #define GST_RTP_SRC_GET_LOCK(obj) (&((GstRtpSrc*)(obj))->lock) #define GST_RTP_SRC_LOCK(obj) (g_mutex_lock (GST_RTP_SRC_GET_LOCK(obj))) @@ -119,6 +125,12 @@ GST_DEBUG_OBJECT (self, "Requesting caps for session-id 0x%x and pt %u.", session_id, pt); + if (G_UNLIKELY (self->caps)) { + GST_DEBUG_OBJECT (self, + "Full caps were set, no need for lookup %" GST_PTR_FORMAT, self->caps); + return gst_caps_copy (self->caps); + } + /* the encoding-name has more relevant information */ if (self->encoding_name != NULL) { /* Unfortunately, the media needs to be passed in the function. 
Since @@ -242,6 +254,22 @@ else self->multi_iface = g_value_dup_string (value); break; + case PROP_CAPS: + { + const GstCaps *new_caps_val = gst_value_get_caps (value); + GstCaps *new_caps = NULL; + GstCaps *old_caps = self->caps; + + if (new_caps_val != NULL) { + new_caps = gst_caps_copy (new_caps_val); + } + + self->caps = new_caps; + + if (old_caps) + gst_caps_unref (old_caps); + break; + } default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -284,6 +312,9 @@ case PROP_MULTICAST_IFACE: g_value_set_string (value, self->multi_iface); break; + case PROP_CAPS: + gst_value_set_caps (value, self->caps); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -301,6 +332,9 @@ g_free (self->multi_iface); + if (self->caps) + gst_caps_unref (self->caps); + g_clear_object (&self->rtcp_send_addr); g_mutex_clear (&self->lock); @@ -426,6 +460,18 @@ DEFAULT_PROP_MULTICAST_IFACE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + /** + * GstRtpSrc:caps: + * + * The RTP caps of the incoming stream. 
+ * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_CAPS, + g_param_spec_boxed ("caps", "Caps", + "The caps of the incoming stream", GST_TYPE_CAPS, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&src_template)); @@ -436,12 +482,34 @@ } static void +clear_ssrc (GstElement * rtpbin, GstPad * gpad) +{ + GstPad *pad; + gint pt; + guint ssrc; + + pad = gst_ghost_pad_get_target (GST_GHOST_PAD (gpad)); + if (!pad) + return; + + if (sscanf (GST_PAD_NAME (pad), "recv_rtp_src_0_%u_%d", &ssrc, &pt) != 2) { + gst_object_unref (pad); + return; + } + gst_object_unref (pad); + + g_signal_emit_by_name (rtpbin, "clear-ssrc", 0, ssrc); +} + +static void gst_rtp_src_rtpbin_pad_added_cb (GstElement * element, GstPad * pad, gpointer data) { GstRtpSrc *self = GST_RTP_SRC (data); GstCaps *caps = gst_pad_query_caps (pad, NULL); - GstPad *upad; + const GstStructure *s; + GstPad *upad = NULL; + gint pt = -1; gchar name[48]; /* Expose RTP data pad only */ @@ -473,15 +541,27 @@ return; } + + s = gst_caps_get_structure (caps, 0); + gst_structure_get_int (s, "payload", &pt); gst_caps_unref (caps); GST_RTP_SRC_LOCK (self); - g_snprintf (name, 48, "src_%u", GST_ELEMENT (self)->numpads); - upad = gst_ghost_pad_new (name, pad); + g_snprintf (name, 48, "src_%u", pt); + upad = gst_element_get_static_pad (GST_ELEMENT (self), name); - gst_pad_set_active (upad, TRUE); - gst_element_add_pad (GST_ELEMENT (self), upad); + if (!upad) { + GST_DEBUG_OBJECT (self, "Adding new pad: %s", name); + upad = gst_ghost_pad_new (name, pad); + gst_pad_set_active (upad, TRUE); + gst_element_add_pad (GST_ELEMENT (self), upad); + } else { + GST_DEBUG_OBJECT (self, "Re-using existing pad: %s", GST_PAD_NAME (upad)); + clear_ssrc (element, upad); + gst_ghost_pad_set_target (GST_GHOST_PAD (upad), pad); + gst_object_unref (upad); + } GST_RTP_SRC_UNLOCK (self); } @@ -623,7 +703,6 @@ /* set multicast-iface on the 
udpsrc and udpsink elements */ g_object_set (self->rtcp_src, "multicast-iface", self->multi_iface, NULL); - g_object_set (self->rtcp_sink, "multicast-iface", self->multi_iface, NULL); g_object_set (self->rtp_src, "multicast-iface", self->multi_iface, NULL); } else { /* In unicast, send RTCP to the detected sender address */ @@ -732,6 +811,7 @@ self->ttl = DEFAULT_PROP_TTL; self->ttl_mc = DEFAULT_PROP_TTL_MC; self->encoding_name = DEFAULT_PROP_ENCODING_NAME; + self->caps = DEFAULT_PROP_CAPS; GST_OBJECT_FLAG_SET (GST_OBJECT (self), GST_ELEMENT_FLAG_SOURCE); gst_bin_set_suppressed_flags (GST_BIN (self), @@ -754,6 +834,7 @@ missing_plugin = "rtpmanager"; goto missing_plugin; } + g_object_set (self->rtpbin, "autoremove", TRUE, NULL); gst_bin_add (GST_BIN (self), self->rtpbin);
gst-plugins-bad-1.18.6.tar.xz/gst/rtp/gstrtpsrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/rtp/gstrtpsrc.h
Changed
@@ -51,6 +51,7 @@ gint ttl_mc; gchar *encoding_name; gchar *multi_iface; + GstCaps *caps; /* Internal elements */ GstElement *rtpbin; @@ -71,6 +72,7 @@ }; GType gst_rtp_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (rtpsrc); G_END_DECLS #endif /* __GST_RTP_SRC_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/rtp/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/rtp/plugin.c
Changed
@@ -9,14 +9,10 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret = FALSE; - ret |= gst_element_register (plugin, "rtpsrc", - GST_RANK_PRIMARY + 1, GST_TYPE_RTP_SRC); - - ret |= gst_element_register (plugin, "rtpsink", - GST_RANK_PRIMARY + 1, GST_TYPE_RTP_SINK); + ret |= GST_ELEMENT_REGISTER (rtpsrc, plugin); + ret |= GST_ELEMENT_REGISTER (rtpsink, plugin); return ret; }
gst-plugins-bad-1.18.6.tar.xz/gst/sdp/gstsdpdemux.c -> gst-plugins-bad-1.20.1.tar.xz/gst/sdp/gstsdpdemux.c
Changed
@@ -110,6 +110,8 @@ #define gst_sdp_demux_parent_class parent_class G_DEFINE_TYPE (GstSDPDemux, gst_sdp_demux, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE (sdpdemux, "sdpdemux", GST_RANK_NONE, + GST_TYPE_SDP_DEMUX); static void gst_sdp_demux_class_init (GstSDPDemuxClass * klass) @@ -400,6 +402,8 @@ s = gst_caps_get_structure (stream->caps, 0); gst_structure_set_name (s, "application/x-rtp"); + gst_sdp_media_attributes_to_caps (media, stream->caps); + if (stream->pt >= 96) { /* If we have a dynamic payload type, see if we have a stream with the * same payload number. If there is one, they are part of the same @@ -782,7 +786,8 @@ pad = gst_element_get_static_pad (stream->udpsrc[0], "src"); name = g_strdup_printf ("recv_rtp_sink_%u", stream->id); - stream->channelpad[0] = gst_element_get_request_pad (demux->session, name); + stream->channelpad[0] = + gst_element_request_pad_simple (demux->session, name); g_free (name); GST_DEBUG_OBJECT (demux, "connecting RTP source 0 to manager"); @@ -812,7 +817,8 @@ GST_DEBUG_OBJECT (demux, "connecting RTCP source to manager"); name = g_strdup_printf ("recv_rtcp_sink_%u", stream->id); - stream->channelpad[1] = gst_element_get_request_pad (demux->session, name); + stream->channelpad[1] = + gst_element_request_pad_simple (demux->session, name); g_free (name); pad = gst_element_get_static_pad (stream->udpsrc[1], "src"); @@ -888,7 +894,7 @@ /* get session RTCP pad */ name = g_strdup_printf ("send_rtcp_src_%u", stream->id); - pad = gst_element_get_request_pad (demux->session, name); + pad = gst_element_request_pad_simple (demux->session, name); g_free (name); /* and link */
gst-plugins-bad-1.18.6.tar.xz/gst/sdp/gstsdpdemux.h -> gst-plugins-bad-1.20.1.tar.xz/gst/sdp/gstsdpdemux.h
Changed
@@ -113,6 +113,7 @@ }; GType gst_sdp_demux_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (sdpdemux); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/sdp/gstsdpelem.c -> gst-plugins-bad-1.20.1.tar.xz/gst/sdp/gstsdpelem.c
Changed
@@ -27,13 +27,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "sdpdemux", GST_RANK_NONE, - GST_TYPE_SDP_DEMUX)) - return FALSE; - if (!gst_element_register (plugin, "sdpsrc", GST_RANK_NONE, GST_TYPE_SDP_SRC)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (sdpdemux, plugin); + ret |= GST_ELEMENT_REGISTER (sdpsrc, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/sdp/gstsdpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/sdp/gstsdpsrc.c
Changed
@@ -45,6 +45,7 @@ #define gst_sdp_src_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstSdpSrc, gst_sdp_src, GST_TYPE_BIN, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, gst_sdp_src_handler_init)); +GST_ELEMENT_REGISTER_DEFINE (sdpsrc, "sdpsrc", GST_RANK_NONE, GST_TYPE_SDP_SRC); static void gst_sdp_src_finalize (GObject * object)
gst-plugins-bad-1.18.6.tar.xz/gst/sdp/gstsdpsrc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/sdp/gstsdpsrc.h
Changed
@@ -57,6 +57,7 @@ }; GType gst_sdp_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (sdpsrc); G_END_DECLS #endif /* __GST_SDP_SRC_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/segmentclip/gstaudiosegmentclip.c -> gst-plugins-bad-1.20.1.tar.xz/gst/segmentclip/gstaudiosegmentclip.c
Changed
@@ -45,6 +45,8 @@ G_DEFINE_TYPE (GstAudioSegmentClip, gst_audio_segment_clip, GST_TYPE_SEGMENT_CLIP); +GST_ELEMENT_REGISTER_DEFINE (audiosegmentclip, "audiosegmentclip", + GST_RANK_NONE, GST_TYPE_AUDIO_SEGMENT_CLIP); static void gst_audio_segment_clip_class_init (GstAudioSegmentClipClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/segmentclip/gstaudiosegmentclip.h -> gst-plugins-bad-1.20.1.tar.xz/gst/segmentclip/gstaudiosegmentclip.h
Changed
@@ -55,6 +55,7 @@ }; GType gst_audio_segment_clip_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (audiosegmentclip); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/segmentclip/gstvideosegmentclip.c -> gst-plugins-bad-1.20.1.tar.xz/gst/segmentclip/gstvideosegmentclip.c
Changed
@@ -45,6 +45,8 @@ G_DEFINE_TYPE (GstVideoSegmentClip, gst_video_segment_clip, GST_TYPE_SEGMENT_CLIP); +GST_ELEMENT_REGISTER_DEFINE (videosegmentclip, "videosegmentclip", + GST_RANK_NONE, GST_TYPE_VIDEO_SEGMENT_CLIP); static void gst_video_segment_clip_class_init (GstVideoSegmentClipClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/segmentclip/gstvideosegmentclip.h -> gst-plugins-bad-1.20.1.tar.xz/gst/segmentclip/gstvideosegmentclip.h
Changed
@@ -54,6 +54,7 @@ }; GType gst_video_segment_clip_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (videosegmentclip); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/segmentclip/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/segmentclip/plugin.c
Changed
@@ -27,13 +27,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "audiosegmentclip", GST_RANK_NONE, - GST_TYPE_AUDIO_SEGMENT_CLIP) || - !gst_element_register (plugin, "videosegmentclip", GST_RANK_NONE, - GST_TYPE_VIDEO_SEGMENT_CLIP)) - return FALSE; + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (audiosegmentclip, plugin); + ret |= GST_ELEMENT_REGISTER (videosegmentclip, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/siren/gstsiren.c -> gst-plugins-bad-1.20.1.tar.xz/gst/siren/gstsiren.c
Changed
@@ -24,20 +24,19 @@ #include "config.h" #endif +#include "gstsiren.h" #include "gstsirendec.h" #include "gstsirenenc.h" - static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_siren_dec_plugin_init (plugin)) - return FALSE; + gboolean ret = FALSE; - if (!gst_siren_enc_plugin_init (plugin)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (sirendec, plugin); + ret |= GST_ELEMENT_REGISTER (sirenenc, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/siren/gstsirendec.c -> gst-plugins-bad-1.20.1.tar.xz/gst/siren/gstsirendec.c
Changed
@@ -65,6 +65,8 @@ G_DEFINE_TYPE (GstSirenDec, gst_siren_dec, GST_TYPE_AUDIO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (sirendec, "sirendec", + GST_RANK_MARGINAL, GST_TYPE_SIREN_DEC); static void gst_siren_dec_class_init (GstSirenDecClass * klass) @@ -248,10 +250,3 @@ goto done; } } - -gboolean -gst_siren_dec_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "sirendec", - GST_RANK_MARGINAL, GST_TYPE_SIREN_DEC); -}
gst-plugins-bad-1.18.6.tar.xz/gst/siren/gstsirendec.h -> gst-plugins-bad-1.20.1.tar.xz/gst/siren/gstsirendec.h
Changed
@@ -60,8 +60,7 @@ }; GType gst_siren_dec_get_type (void); - -gboolean gst_siren_dec_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (sirendec); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/siren/gstsirenenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/siren/gstsirenenc.c
Changed
@@ -62,7 +62,8 @@ GstBuffer * in_buf); G_DEFINE_TYPE (GstSirenEnc, gst_siren_enc, GST_TYPE_AUDIO_ENCODER); - +GST_ELEMENT_REGISTER_DEFINE (sirenenc, "sirenenc", + GST_RANK_MARGINAL, GST_TYPE_SIREN_ENC); static void gst_siren_enc_class_init (GstSirenEncClass * klass) @@ -228,10 +229,3 @@ goto done; } } - -gboolean -gst_siren_enc_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "sirenenc", - GST_RANK_MARGINAL, GST_TYPE_SIREN_ENC); -}
gst-plugins-bad-1.18.6.tar.xz/gst/siren/gstsirenenc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/siren/gstsirenenc.h
Changed
@@ -60,8 +60,7 @@ }; GType gst_siren_enc_get_type (void); - -gboolean gst_siren_enc_plugin_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (sirenenc); G_END_DECLS #endif /* __GST_SIREN_ENC_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/smooth/gstsmooth.c -> gst-plugins-bad-1.20.1.tar.xz/gst/smooth/gstsmooth.c
Changed
@@ -64,6 +64,7 @@ GValue * value, GParamSpec * pspec); G_DEFINE_TYPE (GstSmooth, gst_smooth, GST_TYPE_VIDEO_FILTER); +GST_ELEMENT_REGISTER_DEFINE (smooth, "smooth", GST_RANK_NONE, GST_TYPE_SMOOTH); static void gst_smooth_class_init (GstSmoothClass * klass) @@ -277,8 +278,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "smooth", - GST_RANK_NONE, GST_TYPE_SMOOTH); + return GST_ELEMENT_REGISTER (smooth, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/smooth/gstsmooth.h -> gst-plugins-bad-1.20.1.tar.xz/gst/smooth/gstsmooth.h
Changed
@@ -65,7 +65,7 @@ }; GType gst_smooth_get_type(void); - +GST_ELEMENT_REGISTER_DECLARE (smooth); #ifdef __cplusplus }
gst-plugins-bad-1.18.6.tar.xz/gst/speed/gstspeed.c -> gst-plugins-bad-1.20.1.tar.xz/gst/speed/gstspeed.c
Changed
@@ -95,6 +95,10 @@ GstEvent * event); G_DEFINE_TYPE (GstSpeed, gst_speed, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (speed, "speed", GST_RANK_NONE, + GST_TYPE_SPEED, GST_DEBUG_CATEGORY_INIT (speed_debug, "speed", 0, + "speed element"); + ); static gboolean speed_setcaps (GstPad * pad, GstCaps * caps) @@ -690,9 +694,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (speed_debug, "speed", 0, "speed element"); - - return gst_element_register (plugin, "speed", GST_RANK_NONE, GST_TYPE_SPEED); + return GST_ELEMENT_REGISTER (speed, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/speed/gstspeed.h -> gst-plugins-bad-1.20.1.tar.xz/gst/speed/gstspeed.h
Changed
@@ -61,6 +61,7 @@ }; GType gst_speed_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (speed); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/subenc/gstsrtenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/subenc/gstsrtenc.c
Changed
@@ -57,6 +57,7 @@ #define parent_class gst_srt_enc_parent_class G_DEFINE_TYPE (GstSrtEnc, gst_srt_enc, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (srtenc, "srtenc", GST_RANK_NONE, GST_TYPE_SRT_ENC); static void gst_srt_enc_append_timestamp_to_string (GstClockTime timestamp, GString * str)
gst-plugins-bad-1.18.6.tar.xz/gst/subenc/gstsrtenc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/subenc/gstsrtenc.h
Changed
@@ -58,6 +58,7 @@ }; GType gst_srt_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (srtenc); G_END_DECLS #endif
gst-plugins-bad-1.18.6.tar.xz/gst/subenc/gstsubenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/subenc/gstsubenc.c
Changed
@@ -27,11 +27,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - gst_element_register (plugin, "srtenc", GST_RANK_NONE, GST_TYPE_SRT_ENC); - gst_element_register (plugin, "webvttenc", GST_RANK_NONE, - GST_TYPE_WEBVTT_ENC); + gboolean ret = FALSE; - return TRUE; + ret |= GST_ELEMENT_REGISTER (srtenc, plugin); + ret |= GST_ELEMENT_REGISTER (webvttenc, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/subenc/gstwebvttenc.c -> gst-plugins-bad-1.20.1.tar.xz/gst/subenc/gstwebvttenc.c
Changed
@@ -58,6 +58,8 @@ #define parent_class gst_webvtt_enc_parent_class G_DEFINE_TYPE (GstWebvttEnc, gst_webvtt_enc, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (webvttenc, "webvttenc", GST_RANK_NONE, + GST_TYPE_WEBVTT_ENC); static void gst_webvtt_enc_append_timestamp_to_string (GstClockTime timestamp,
gst-plugins-bad-1.18.6.tar.xz/gst/subenc/gstwebvttenc.h -> gst-plugins-bad-1.20.1.tar.xz/gst/subenc/gstwebvttenc.h
Changed
@@ -60,6 +60,7 @@ }; GType gst_webvtt_enc_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (webvttenc); G_END_DECLS #endif
gst-plugins-bad-1.18.6.tar.xz/gst/switchbin/gstswitchbin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/switchbin/gstswitchbin.c
Changed
@@ -111,6 +111,8 @@ G_IMPLEMENT_INTERFACE (GST_TYPE_CHILD_PROXY, gst_switch_bin_child_proxy_iface_init) ); +GST_ELEMENT_REGISTER_DEFINE (switchbin, "switchbin", GST_RANK_NONE, + gst_switch_bin_get_type ()); static void gst_switch_bin_unlock_paths_and_notify (GstSwitchBin * switchbin); @@ -143,7 +145,7 @@ GstPadProbeInfo * info, gpointer user_data); static GstCaps *gst_switch_bin_get_allowed_caps (GstSwitchBin * switch_bin, - gchar const *pad_name, GstCaps * filter); + GstPad * switch_bin_pad, gchar const *pad_name, GstCaps * filter); static gboolean gst_switch_bin_are_caps_acceptable (GstSwitchBin * switch_bin, GstCaps const *caps); @@ -461,7 +463,8 @@ || (switch_bin->current_path->element == NULL)) { /* Paths exist, but there is no current path (or the path is a dropping path, * so no element exists) - just return all allowed caps */ - caps = gst_switch_bin_get_allowed_caps (switch_bin, pad_name, filter); + caps = + gst_switch_bin_get_allowed_caps (switch_bin, pad, pad_name, filter); } else { /* Paths exist and there is a current path * Forward the query to its element */ @@ -815,12 +818,14 @@ static GstCaps * gst_switch_bin_get_allowed_caps (GstSwitchBin * switch_bin, - G_GNUC_UNUSED gchar const *pad_name, GstCaps * filter) + GstPad * switch_bin_pad, gchar const *pad_name, GstCaps * filter) { /* must be called with path lock held */ guint i; GstCaps *total_path_caps; + gboolean is_sink_pad = + (gst_pad_get_direction (switch_bin_pad) == GST_PAD_SINK); /* The allowed caps are a combination of the caps of all paths, the * filter caps, and the allowed caps as indicated by the result @@ -847,7 +852,7 @@ for (i = 0; i < switch_bin->num_paths; ++i) { GstSwitchBinPath *path = switch_bin->paths[i]; - if ((path->element != NULL) && (path == switch_bin->current_path)) { + if (path->element != NULL) { GstPad *pad; GstCaps *caps, *intersected_caps; GstQuery *caps_query = NULL; @@ -856,22 +861,32 @@ caps_query = gst_query_new_caps (NULL); /* Query the path element for allowed 
caps. If this is - * successful, intersect the returned caps with the path caps, - * and append the result of the intersection to the total_path_caps. */ + * successful, intersect the returned caps with the path caps for the sink pad, + * and append the result of the intersection to the total_path_caps, + * or just append the result to the total_path_caps if collecting srcpad caps. */ if (gst_pad_query (pad, caps_query)) { gst_query_parse_caps_result (caps_query, &caps); - intersected_caps = gst_caps_intersect (caps, path->caps); + if (is_sink_pad) { + intersected_caps = gst_caps_intersect (caps, path->caps); + } else { + intersected_caps = gst_caps_copy (caps); + } gst_caps_append (total_path_caps, intersected_caps); - } else + } else if (is_sink_pad) { + /* Just assume the sink pad has the path caps if the query failed */ gst_caps_append (total_path_caps, gst_caps_ref (path->caps)); + } gst_object_unref (GST_OBJECT (pad)); gst_query_unref (caps_query); } else { - /* Either this is the current path and it has no element (= is a dropping path), - * or it is not the current path. In both cases, no caps query can be performed. - * Just append the path caps then. */ - gst_caps_append (total_path_caps, gst_caps_ref (path->caps)); + /* This is a path with no element (= is a dropping path), + * If querying the sink caps, append the path + * input caps, otherwise the output caps can be ANY */ + if (is_sink_pad) + gst_caps_append (total_path_caps, gst_caps_ref (path->caps)); + else + gst_caps_append (total_path_caps, gst_caps_new_any ()); } }
gst-plugins-bad-1.18.6.tar.xz/gst/switchbin/gstswitchbin.h -> gst-plugins-bad-1.20.1.tar.xz/gst/switchbin/gstswitchbin.h
Changed
@@ -81,7 +81,7 @@ GType gst_switch_bin_get_type(void); GType gst_switch_bin_path_get_type(void); - +GST_ELEMENT_REGISTER_DECLARE (switchbin); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/switchbin/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/switchbin/plugin.c
Changed
@@ -28,14 +28,9 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret = TRUE; - ret = ret - && gst_element_register (plugin, "switchbin", GST_RANK_NONE, - gst_switch_bin_get_type ()); - return ret; + return GST_ELEMENT_REGISTER (switchbin, plugin); } - GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, switchbin,
gst-plugins-bad-1.18.6.tar.xz/gst/timecode/gstavwait.c -> gst-plugins-bad-1.20.1.tar.xz/gst/timecode/gstavwait.c
Changed
@@ -84,6 +84,7 @@ #define parent_class gst_avwait_parent_class G_DEFINE_TYPE (GstAvWait, gst_avwait, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (avwait, "avwait", GST_RANK_NONE, GST_TYPE_AVWAIT); enum { @@ -616,26 +617,33 @@ switch (GST_EVENT_TYPE (event)) { case GST_EVENT_SEGMENT:{ + GstSegment segment; gboolean send_message = FALSE; + gboolean segment_changed; g_mutex_lock (&self->mutex); - gst_event_copy_segment (event, &self->vsegment); + gst_event_copy_segment (event, &segment); + segment.position = self->vsegment.position; + segment_changed = !gst_segment_is_equal (&segment, &self->vsegment); + self->vsegment = segment; if (self->vsegment.format != GST_FORMAT_TIME) { GST_ERROR_OBJECT (self, "Invalid segment format"); g_mutex_unlock (&self->mutex); gst_event_unref (event); return FALSE; } - GST_DEBUG_OBJECT (self, "First time reset in video segment"); - self->running_time_to_wait_for = GST_CLOCK_TIME_NONE; - self->running_time_to_end_at = GST_CLOCK_TIME_NONE; - self->audio_running_time_to_wait_for = GST_CLOCK_TIME_NONE; - self->audio_running_time_to_end_at = GST_CLOCK_TIME_NONE; - if (!self->dropping) { - self->dropping = TRUE; - send_message = TRUE; + if (segment_changed) { + GST_DEBUG_OBJECT (self, "First time reset in video segment"); + self->running_time_to_wait_for = GST_CLOCK_TIME_NONE; + self->running_time_to_end_at = GST_CLOCK_TIME_NONE; + self->audio_running_time_to_wait_for = GST_CLOCK_TIME_NONE; + self->audio_running_time_to_end_at = GST_CLOCK_TIME_NONE; + if (!self->dropping) { + self->dropping = TRUE; + send_message = TRUE; + } + self->vsegment.position = GST_CLOCK_TIME_NONE; } - self->vsegment.position = GST_CLOCK_TIME_NONE; g_mutex_unlock (&self->mutex); if (send_message) @@ -750,18 +758,29 @@ GST_LOG_OBJECT (pad, "Got %s event", GST_EVENT_TYPE_NAME (event)); switch (GST_EVENT_TYPE (event)) { - case GST_EVENT_SEGMENT: + case GST_EVENT_SEGMENT:{ + GstSegment segment; + gboolean segment_changed; + g_mutex_lock (&self->mutex); - 
gst_event_copy_segment (event, &self->asegment); + gst_event_copy_segment (event, &segment); + segment.position = self->asegment.position; + segment_changed = !gst_segment_is_equal (&segment, &self->asegment); + self->asegment = segment; + if (self->asegment.format != GST_FORMAT_TIME) { GST_ERROR_OBJECT (self, "Invalid segment format"); g_mutex_unlock (&self->mutex); gst_event_unref (event); return FALSE; } - self->asegment.position = GST_CLOCK_TIME_NONE; + + if (segment_changed) { + self->asegment.position = GST_CLOCK_TIME_NONE; + } g_mutex_unlock (&self->mutex); break; + } case GST_EVENT_FLUSH_START: g_mutex_lock (&self->mutex); self->audio_flush_flag = TRUE;
gst-plugins-bad-1.18.6.tar.xz/gst/timecode/gstavwait.h -> gst-plugins-bad-1.20.1.tar.xz/gst/timecode/gstavwait.h
Changed
@@ -98,6 +98,7 @@ }; GType gst_avwait_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (avwait); G_END_DECLS #endif /* __GST_AVWAIT_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/timecode/gsttimecodestamper.c -> gst-plugins-bad-1.20.1.tar.xz/gst/timecode/gsttimecodestamper.c
Changed
@@ -92,17 +92,21 @@ #define DEFAULT_LTC_QUEUE 100 static GstStaticPadTemplate gst_timecodestamper_src_template = -GST_STATIC_PAD_TEMPLATE ("src", + GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, - GST_STATIC_CAPS ("video/x-raw, framerate=[1/2147483647, 2147483647/1]") + GST_STATIC_CAPS ("video/x-raw, framerate=[1/2147483647, 2147483647/1]; " + "closedcaption/x-cea-608, framerate=[1/2147483647, 2147483647/1]; " + "closedcaption/x-cea-708, framerate=[1/2147483647, 2147483647/1]; ") ); static GstStaticPadTemplate gst_timecodestamper_sink_template = -GST_STATIC_PAD_TEMPLATE ("sink", + GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, - GST_STATIC_CAPS ("video/x-raw, framerate=[1/2147483647, 2147483647/1]") + GST_STATIC_CAPS ("video/x-raw, framerate=[1/2147483647, 2147483647/1]; " + "closedcaption/x-cea-608, framerate=[1/2147483647, 2147483647/1]; " + "closedcaption/x-cea-708, framerate=[1/2147483647, 2147483647/1]; ") ); static GstStaticPadTemplate gst_timecodestamper_ltc_template = @@ -161,6 +165,8 @@ G_DEFINE_TYPE (GstTimeCodeStamper, gst_timecodestamper, GST_TYPE_BASE_TRANSFORM); +GST_ELEMENT_REGISTER_DEFINE (timecodestamper, "timecodestamper", + GST_RANK_NONE, GST_TYPE_TIME_CODE_STAMPER); GType gst_timecodestamper_source_get_type (void) @@ -505,8 +511,8 @@ if (timecodestamper->ltc_internal_tc) { if (timecodestamper->ltc_internal_tc->config.latest_daily_jam) { - g_date_time_unref (timecodestamper->ltc_internal_tc->config. 
- latest_daily_jam); + g_date_time_unref (timecodestamper->ltc_internal_tc-> + config.latest_daily_jam); } timecodestamper->ltc_internal_tc->config.latest_daily_jam = g_date_time_ref (timecodestamper->ltc_daily_jam); @@ -630,7 +636,9 @@ g_mutex_unlock (&timecodestamper->mutex); #endif - gst_video_info_init (&timecodestamper->vinfo); + timecodestamper->interlace_mode = GST_VIDEO_INTERLACE_MODE_PROGRESSIVE; + timecodestamper->fps_n = 0; + timecodestamper->fps_d = 1; if (timecodestamper->internal_tc != NULL) { gst_video_time_code_free (timecodestamper->internal_tc); @@ -689,15 +697,19 @@ static gboolean gst_timecodestamper_start (GstBaseTransform * trans) { -#if HAVE_LTC GstTimeCodeStamper *timecodestamper = GST_TIME_CODE_STAMPER (trans); +#if HAVE_LTC g_mutex_lock (&timecodestamper->mutex); timecodestamper->video_flushing = FALSE; timecodestamper->video_eos = FALSE; g_mutex_unlock (&timecodestamper->mutex); #endif + timecodestamper->interlace_mode = GST_VIDEO_INTERLACE_MODE_PROGRESSIVE; + timecodestamper->fps_n = 0; + timecodestamper->fps_d = 1; + return TRUE; } @@ -705,9 +717,8 @@ static void gst_timecodestamper_update_drop_frame (GstTimeCodeStamper * timecodestamper) { - if (timecodestamper->drop_frame && timecodestamper->vinfo.fps_d == 1001 && - (timecodestamper->vinfo.fps_n == 30000 || - timecodestamper->vinfo.fps_n == 60000)) { + if (timecodestamper->drop_frame && timecodestamper->fps_d == 1001 && + (timecodestamper->fps_n == 30000 || timecodestamper->fps_n == 60000)) { if (timecodestamper->internal_tc) timecodestamper->internal_tc->config.flags |= GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME; @@ -754,7 +765,7 @@ static void gst_timecodestamper_update_timecode_framerate (GstTimeCodeStamper * - timecodestamper, const GstVideoInfo * vinfo, GstVideoTimeCode * timecode, + timecodestamper, gint fps_n, gint fps_d, GstVideoTimeCode * timecode, gboolean is_ltc) { guint64 nframes; @@ -765,13 +776,11 @@ if (!timecode) return; - if (timecodestamper->vinfo.interlace_mode != - 
GST_VIDEO_INTERLACE_MODE_PROGRESSIVE) + if (timecodestamper->interlace_mode != GST_VIDEO_INTERLACE_MODE_PROGRESSIVE) tc_flags |= GST_VIDEO_TIME_CODE_FLAGS_INTERLACED; - if (timecodestamper->drop_frame && timecodestamper->vinfo.fps_d == 1001 && - (timecodestamper->vinfo.fps_n == 30000 || - timecodestamper->vinfo.fps_n == 60000)) + if (timecodestamper->drop_frame && timecodestamper->fps_d == 1001 && + (timecodestamper->fps_n == 30000 || timecodestamper->fps_n == 60000)) tc_flags |= GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME; /* If this is an LTC timecode and we have no framerate yet in there then @@ -780,19 +789,17 @@ nframes = gst_video_time_code_frames_since_daily_jam (timecode); time = gst_util_uint64_scale (nframes, - GST_SECOND * timecodestamper->vinfo.fps_d, - timecodestamper->vinfo.fps_n); + GST_SECOND * timecodestamper->fps_d, timecodestamper->fps_n); jam = - timecode->config.latest_daily_jam ? g_date_time_ref (timecode-> - config.latest_daily_jam) : NULL; + timecode->config.latest_daily_jam ? g_date_time_ref (timecode->config. 
+ latest_daily_jam) : NULL; gst_video_time_code_clear (timecode); - gst_video_time_code_init (timecode, timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d, jam, tc_flags, 0, 0, 0, 0, 0); + gst_video_time_code_init (timecode, timecodestamper->fps_n, + timecodestamper->fps_d, jam, tc_flags, 0, 0, 0, 0, 0); if (jam) g_date_time_unref (jam); - nframes = - gst_util_uint64_scale (time, vinfo->fps_n, GST_SECOND * vinfo->fps_d); + nframes = gst_util_uint64_scale (time, fps_n, GST_SECOND * fps_d); gst_video_time_code_add_frames (timecode, nframes); } } @@ -800,18 +807,17 @@ /* Must be called with object lock */ static gboolean gst_timecodestamper_update_framerate (GstTimeCodeStamper * timecodestamper, - const GstVideoInfo * vinfo) + gint fps_n, gint fps_d) { /* Nothing changed */ - if (vinfo->fps_n == timecodestamper->vinfo.fps_n && - vinfo->fps_d == timecodestamper->vinfo.fps_d) + if (fps_n == timecodestamper->fps_n && fps_d == timecodestamper->fps_d) return FALSE; - gst_timecodestamper_update_timecode_framerate (timecodestamper, vinfo, + gst_timecodestamper_update_timecode_framerate (timecodestamper, fps_n, fps_d, timecodestamper->internal_tc, FALSE); - gst_timecodestamper_update_timecode_framerate (timecodestamper, vinfo, + gst_timecodestamper_update_timecode_framerate (timecodestamper, fps_n, fps_d, timecodestamper->last_tc, FALSE); - gst_timecodestamper_update_timecode_framerate (timecodestamper, vinfo, + gst_timecodestamper_update_timecode_framerate (timecodestamper, fps_n, fps_d, timecodestamper->rtc_tc, FALSE); #if HAVE_LTC @@ -821,11 +827,11 @@ for (l = timecodestamper->ltc_current_tcs.head; l; l = l->next) { TimestampedTimecode *tc = l->data; - gst_timecodestamper_update_timecode_framerate (timecodestamper, vinfo, - &tc->timecode, TRUE); + gst_timecodestamper_update_timecode_framerate (timecodestamper, fps_n, + fps_d, &tc->timecode, TRUE); } } - gst_timecodestamper_update_timecode_framerate (timecodestamper, vinfo, + 
gst_timecodestamper_update_timecode_framerate (timecodestamper, fps_n, fps_d, timecodestamper->ltc_internal_tc, FALSE); #endif @@ -864,27 +870,42 @@ case GST_EVENT_CAPS: { GstCaps *caps; - GstVideoInfo info; gboolean latency_changed; + const gchar *interlace_mode; + GstStructure *s; + gint fps_n, fps_d; GST_OBJECT_LOCK (timecodestamper); gst_event_parse_caps (event, &caps); - if (!gst_video_info_from_caps (&info, caps)) { + + s = gst_caps_get_structure (caps, 0); + + if (!gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d)) { + GST_ERROR_OBJECT (timecodestamper, "Expected framerate in caps"); GST_OBJECT_UNLOCK (timecodestamper); gst_event_unref (event); return FALSE; } - if (info.fps_n == 0) { - GST_WARNING_OBJECT (timecodestamper, + + if (fps_n == 0) { + GST_ERROR_OBJECT (timecodestamper, "Non-constant frame rate found. Refusing to create a timecode"); GST_OBJECT_UNLOCK (timecodestamper); gst_event_unref (event); return FALSE; } + if ((interlace_mode = gst_structure_get_string (s, "interlace-mode"))) { + timecodestamper->interlace_mode = + gst_video_interlace_mode_from_string (interlace_mode); + } + latency_changed = - gst_timecodestamper_update_framerate (timecodestamper, &info); - timecodestamper->vinfo = info; + gst_timecodestamper_update_framerate (timecodestamper, fps_n, fps_d); + + timecodestamper->fps_n = fps_n; + timecodestamper->fps_d = fps_d; + GST_OBJECT_UNLOCK (timecodestamper); if (latency_changed) @@ -953,11 +974,10 @@ } GST_OBJECT_LOCK (timecodestamper); - if (timecodestamper->vinfo.fps_d && timecodestamper->vinfo.fps_n) { + if (timecodestamper->fps_d && timecodestamper->fps_n) { timecodestamper->prev_seek_seqnum = GST_EVENT_SEQNUM (event); timecodestamper->seeked_frames = gst_util_uint64_scale (start, - timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d * GST_SECOND); + timecodestamper->fps_n, timecodestamper->fps_d * GST_SECOND); } GST_OBJECT_UNLOCK (timecodestamper); break; @@ -994,19 +1014,21 @@ gst_pad_query_default 
(GST_BASE_TRANSFORM_SRC_PAD (trans), GST_OBJECT_CAST (trans), query); g_mutex_lock (&timecodestamper->mutex); - if (res && timecodestamper->vinfo.fps_n && timecodestamper->vinfo.fps_d) { + if (res && timecodestamper->fps_n && timecodestamper->fps_d) { gst_query_parse_latency (query, &live, &min_latency, &max_latency); if (live && timecodestamper->ltcpad) { /* Introduce additional LTC for waiting for LTC timecodes. The * LTC library introduces some as well as the encoding of the LTC * signal. */ - latency = timecodestamper->ltc_extra_latency * - gst_util_uint64_scale_int_ceil (GST_SECOND, - timecodestamper->vinfo.fps_d, timecodestamper->vinfo.fps_n); + latency = timecodestamper->ltc_extra_latency; min_latency += latency; if (max_latency != GST_CLOCK_TIME_NONE) max_latency += latency; timecodestamper->latency = min_latency; + GST_DEBUG_OBJECT (timecodestamper, + "Reporting latency min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT + " ours %" GST_TIME_FORMAT, GST_TIME_ARGS (min_latency), + GST_TIME_ARGS (max_latency), GST_TIME_ARGS (latency)); gst_query_set_latency (query, live, min_latency, max_latency); } else { timecodestamper->latency = 0; @@ -1082,7 +1104,7 @@ GstFlowReturn flow_ret = GST_FLOW_OK; GstVideoTimeCodeFlags tc_flags = 0; - if (timecodestamper->vinfo.fps_n == 0 || timecodestamper->vinfo.fps_d == 0 + if (timecodestamper->fps_n == 0 || timecodestamper->fps_d == 0 || !GST_BUFFER_PTS_IS_VALID (buffer)) { gst_buffer_unref (buffer); return GST_FLOW_NOT_NEGOTIATED; @@ -1136,13 +1158,11 @@ /* Update all our internal timecodes as needed */ GST_OBJECT_LOCK (timecodestamper); - if (timecodestamper->vinfo.interlace_mode != - GST_VIDEO_INTERLACE_MODE_PROGRESSIVE) + if (timecodestamper->interlace_mode != GST_VIDEO_INTERLACE_MODE_PROGRESSIVE) tc_flags |= GST_VIDEO_TIME_CODE_FLAGS_INTERLACED; - if (timecodestamper->drop_frame && timecodestamper->vinfo.fps_d == 1001 && - (timecodestamper->vinfo.fps_n == 30000 || - timecodestamper->vinfo.fps_n == 60000)) + if 
(timecodestamper->drop_frame && timecodestamper->fps_d == 1001 && + (timecodestamper->fps_n == 30000 || timecodestamper->fps_n == 60000)) tc_flags |= GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME; /* If we don't have an internal timecode yet then either a new one was just @@ -1158,8 +1178,8 @@ timecodestamper->reset_internal_tc_from_seek = FALSE; if (timecodestamper->set_internal_tc) { timecodestamper->internal_tc = - gst_video_time_code_new (timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d, + gst_video_time_code_new (timecodestamper->fps_n, + timecodestamper->fps_d, timecodestamper->set_internal_tc->config.latest_daily_jam, tc_flags, timecodestamper->set_internal_tc->hours, timecodestamper->set_internal_tc->minutes, @@ -1168,8 +1188,8 @@ timecodestamper->set_internal_tc->field_count); } else { timecodestamper->internal_tc = - gst_video_time_code_new (timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d, dt_frame, tc_flags, 0, 0, 0, 0, 0); + gst_video_time_code_new (timecodestamper->fps_n, + timecodestamper->fps_d, dt_frame, tc_flags, 0, 0, 0, 0, 0); if (timecodestamper->seeked_frames > 0) { GST_DEBUG_OBJECT (timecodestamper, "Adding %" G_GINT64_FORMAT " frames that were seeked", @@ -1245,8 +1265,7 @@ /* Create timecode for the current frame time */ memset (&rtc_timecode_now, 0, sizeof (rtc_timecode_now)); gst_video_time_code_init_from_date_time_full (&rtc_timecode_now, - timecodestamper->vinfo.fps_n, timecodestamper->vinfo.fps_d, dt_frame, - tc_flags, 0); + timecodestamper->fps_n, timecodestamper->fps_d, dt_frame, tc_flags, 0); tc_str = gst_video_time_code_to_string (&rtc_timecode_now); dt_str = g_date_time_format (dt_frame, "%F %R %z"); @@ -1314,7 +1333,7 @@ gboolean updated_internal = FALSE; frame_duration = gst_util_uint64_scale_int_ceil (GST_SECOND, - timecodestamper->vinfo.fps_d, timecodestamper->vinfo.fps_n); + timecodestamper->fps_d, timecodestamper->fps_n); g_mutex_lock (&timecodestamper->mutex); @@ -1396,13 +1415,12 @@ * done yet */ if 
(ltc_tc->timecode.config.fps_d == 0) { gint fps_n_div = - ((gdouble) timecodestamper->vinfo.fps_n) / - timecodestamper->vinfo.fps_d > 30 ? 2 : 1; + ((gdouble) timecodestamper->fps_n) / + timecodestamper->fps_d > 30 ? 2 : 1; ltc_tc->timecode.config.flags = tc_flags; - ltc_tc->timecode.config.fps_n = - timecodestamper->vinfo.fps_n / fps_n_div; - ltc_tc->timecode.config.fps_d = timecodestamper->vinfo.fps_d; + ltc_tc->timecode.config.fps_n = timecodestamper->fps_n / fps_n_div; + ltc_tc->timecode.config.fps_d = timecodestamper->fps_d; } tc_str = gst_video_time_code_to_string (<c_tc->timecode); @@ -1508,8 +1526,8 @@ tc = timecodestamper->internal_tc; break; case GST_TIME_CODE_STAMPER_SOURCE_ZERO: - tc = gst_video_time_code_new (timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d, NULL, tc_flags, 0, 0, 0, 0, 0); + tc = gst_video_time_code_new (timecodestamper->fps_n, + timecodestamper->fps_d, NULL, tc_flags, 0, 0, 0, 0, 0); free_tc = TRUE; break; case GST_TIME_CODE_STAMPER_SOURCE_LAST_KNOWN: @@ -1520,8 +1538,8 @@ case GST_TIME_CODE_STAMPER_SOURCE_LAST_KNOWN_OR_ZERO: tc = timecodestamper->last_tc; if (!tc) { - tc = gst_video_time_code_new (timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d, NULL, tc_flags, 0, 0, 0, 0, 0); + tc = gst_video_time_code_new (timecodestamper->fps_n, + timecodestamper->fps_d, NULL, tc_flags, 0, 0, 0, 0, 0); free_tc = TRUE; } break; @@ -1531,8 +1549,8 @@ tc = timecodestamper->ltc_internal_tc; #endif if (!tc) { - tc = gst_video_time_code_new (timecodestamper->vinfo.fps_n, - timecodestamper->vinfo.fps_d, NULL, tc_flags, 0, 0, 0, 0, 0); + tc = gst_video_time_code_new (timecodestamper->fps_n, + timecodestamper->fps_d, NULL, tc_flags, 0, 0, 0, 0, 0); free_tc = TRUE; } break; @@ -1599,8 +1617,8 @@ gst_segment_to_stream_time (&vfilter->segment, GST_FORMAT_TIME, GST_BUFFER_PTS (buffer)); duration = - gst_util_uint64_scale_int (GST_SECOND, timecodestamper->vinfo.fps_d, - timecodestamper->vinfo.fps_n); + gst_util_uint64_scale_int 
(GST_SECOND, timecodestamper->fps_d, + timecodestamper->fps_n); s = gst_structure_new ("timecodestamper", "timestamp", G_TYPE_UINT64, GST_BUFFER_PTS (buffer), "stream-time", G_TYPE_UINT64, stream_time, "running-time", G_TYPE_UINT64, running_time, "duration", @@ -1793,9 +1811,9 @@ GST_OBJECT_LOCK (timecodestamper); /* This is only for initialization and needs to be somewhat close to the * real value. It will be tracked automatically afterwards */ - if (timecodestamper->vinfo.fps_n) { + if (timecodestamper->fps_n) { samples_per_frame = timecodestamper->ainfo.rate * - timecodestamper->vinfo.fps_d / timecodestamper->vinfo.fps_n; + timecodestamper->fps_d / timecodestamper->fps_n; } GST_OBJECT_UNLOCK (timecodestamper);

gst-plugins-bad-1.18.6.tar.xz/gst/timecode/gsttimecodestamper.h -> gst-plugins-bad-1.20.1.tar.xz/gst/timecode/gsttimecodestamper.h
Changed
@@ -96,8 +96,10 @@ GstClockTime last_tc_running_time; GstVideoTimeCode *rtc_tc; - /* Internal state */ - GstVideoInfo vinfo; /* protected by object lock, changed only from video streaming thread */ + /* Internal state, protected by object lock, changed only from video streaming thread */ + gint fps_n; + gint fps_d; + GstVideoInterlaceMode interlace_mode; /* Seek handling, protected by the object lock */ guint32 prev_seek_seqnum; @@ -161,6 +163,7 @@ }; GType gst_timecodestamper_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (timecodestamper); GType gst_timecodestamper_source_get_type (void); GType gst_timecodestamper_set_get_type (void);
gst-plugins-bad-1.18.6.tar.xz/gst/timecode/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/timecode/plugin.c
Changed
@@ -30,13 +30,10 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean ret; + gboolean ret = FALSE; - ret = gst_element_register (plugin, "timecodestamper", GST_RANK_NONE, - GST_TYPE_TIME_CODE_STAMPER); - - ret &= gst_element_register (plugin, "avwait", GST_RANK_NONE, - GST_TYPE_AVWAIT); + ret |= GST_ELEMENT_REGISTER (timecodestamper, plugin); + ret |= GST_ELEMENT_REGISTER (avwait, plugin); return ret; }
gst-plugins-bad-1.18.6.tar.xz/gst/transcode/gsttranscodebin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/transcode/gsttranscodebin.c
Changed
@@ -23,13 +23,26 @@ #endif #include "gsttranscoding.h" +#include "gsttranscodeelements.h" #include <gst/gst-i18n-plugin.h> #include <gst/pbutils/pbutils.h> #include <gst/pbutils/missing-plugins.h> -GST_DEBUG_CATEGORY_STATIC (gst_transcodebin_debug); -#define GST_CAT_DEFAULT gst_transcodebin_debug + + +/** + * GstTranscodeBin!sink_%u: + * + * Extra sinkpads for the parallel transcoding of auxiliary streams. + * + * Since: 1.20 + */ +static GstStaticPadTemplate transcode_bin_sinks_template = +GST_STATIC_PAD_TEMPLATE ("sink_%u", + GST_PAD_SINK, + GST_PAD_REQUEST, + GST_STATIC_CAPS_ANY); static GstStaticPadTemplate transcode_bin_sink_template = GST_STATIC_PAD_TEMPLATE ("sink", @@ -37,14 +50,52 @@ GST_PAD_ALWAYS, GST_STATIC_CAPS_ANY); +/** + * GstTranscodeBin!src_%u: + * + * The sometimes source pad, it will be exposed depending on the + * #transcodebin:profile in use. + * + * Note: in GStreamer 1.18 it was a static + * srcpad but in the the 1.20 cycle it was decided that we should make it a + * sometimes pad as part of the development of #encodebin2. 
+ * + * Since: 1.20 + */ static GstStaticPadTemplate transcode_bin_src_template = -GST_STATIC_PAD_TEMPLATE ("src", +GST_STATIC_PAD_TEMPLATE ("src_%u", GST_PAD_SRC, - GST_PAD_ALWAYS, + GST_PAD_SOMETIMES, GST_STATIC_CAPS_ANY); typedef struct { + const gchar *stream_id; + GstStream *stream; + GstPad *encodebin_pad; +} TranscodingStream; + +static TranscodingStream * +transcoding_stream_new (GstStream * stream, GstPad * encodebin_pad) +{ + TranscodingStream *tstream = g_new0 (TranscodingStream, 1); + + tstream->stream_id = gst_stream_get_stream_id (stream); + tstream->stream = gst_object_ref (stream); + tstream->encodebin_pad = encodebin_pad; + + return tstream; +} + +static void +transcoding_stream_free (TranscodingStream * tstream) +{ + gst_object_unref (tstream->stream); + gst_object_unref (tstream->encodebin_pad); +} + +typedef struct +{ GstBin parent; GstElement *decodebin; @@ -53,10 +104,11 @@ GstEncodingProfile *profile; gboolean avoid_reencoding; GstPad *sinkpad; - GstPad *srcpad; GstElement *audio_filter; GstElement *video_filter; + + GPtrArray *transcoding_streams; } GstTranscodeBin; typedef struct @@ -68,14 +120,13 @@ /* *INDENT-OFF* */ #define GST_TYPE_TRANSCODE_BIN (gst_transcode_bin_get_type ()) #define GST_TRANSCODE_BIN(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_TRANSCODE_BIN, GstTranscodeBin)) -#define GST_TRANSCODE_BIN_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TRANSCODE_BIN_TYPE, GstTranscodeBinClass)) -#define GST_IS_TRANSCODE_BIN(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TRANSCODE_BIN_TYPE)) -#define GST_IS_TRANSCODE_BIN_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TRANSCODE_BIN_TYPE)) -#define GST_TRANSCODE_BIN_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TRANSCODE_BIN_TYPE, GstTranscodeBinClass)) #define DEFAULT_AVOID_REENCODING FALSE -G_DEFINE_TYPE (GstTranscodeBin, gst_transcode_bin, GST_TYPE_BIN) +G_DEFINE_TYPE (GstTranscodeBin, gst_transcode_bin, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE 
(transcodebin, "transcodebin", GST_RANK_NONE, + GST_TYPE_TRANSCODE_BIN, transcodebin_element_init (plugin)); + enum { PROP_0, @@ -100,6 +151,26 @@ } /* *INDENT-ON* */ +static gboolean +filter_handles_any (GstElement * filter) +{ + GList *tmp; + + for (tmp = gst_element_get_pad_template_list (filter); tmp; tmp = tmp->next) { + GstPadTemplate *tmpl = tmp->data; + GstCaps *caps = gst_pad_template_get_caps (tmpl); + + if (!gst_caps_is_any (caps)) { + gst_caps_unref (caps); + return FALSE; + } + + gst_caps_unref (caps); + } + + return gst_element_get_pad_template_list (filter) != NULL; +} + static GstPad * _insert_filter (GstTranscodeBin * self, GstPad * sinkpad, GstPad * pad, GstCaps * caps) @@ -115,14 +186,16 @@ if (self->video_filter && g_str_has_prefix (media_type, "video")) { audio = FALSE; - if (!g_strcmp0 (media_type, "video/x-raw")) + if (!g_strcmp0 (media_type, "video/x-raw") + || filter_handles_any (self->video_filter)) filter = self->video_filter; else GST_ERROR_OBJECT (pad, "decodebin pad does not produce raw data (%" GST_PTR_FORMAT "), cannot add video filter '%s'", caps, GST_ELEMENT_NAME (self->video_filter)); } else if (self->audio_filter && g_str_has_prefix (media_type, "audio")) { - if (!g_strcmp0 (media_type, "audio/x-raw")) + if (!g_strcmp0 (media_type, "audio/x-raw") + || filter_handles_any (self->audio_filter)) filter = self->audio_filter; else GST_ERROR_OBJECT (pad, "decodebin pad does not produce raw data (%" @@ -133,7 +206,8 @@ if (!filter) return pad; - if ((filter_parent = gst_object_get_parent (GST_OBJECT (filter)))) { + filter_parent = gst_object_get_parent (GST_OBJECT (filter)); + if (filter_parent != GST_OBJECT_CAST (self)) { GST_WARNING_OBJECT (self, "Filter already in use (inside %" GST_PTR_FORMAT ").", filter_parent); GST_FIXME_OBJECT (self, @@ -143,13 +217,16 @@ return pad; } + gst_object_unref (filter_parent); /* We are guaranteed filters only have 1 unique sinkpad and srcpad */ GST_OBJECT_LOCK (filter); filter_sink = 
filter->sinkpads->data; filter_src = filter->srcpads->data; GST_OBJECT_UNLOCK (filter); - if (audio) + if (filter_handles_any (filter)) + convert = gst_element_factory_make ("identity", NULL); + else if (audio) convert = gst_element_factory_make ("audioconvert", NULL); else convert = gst_element_factory_make ("videoconvert", NULL); @@ -163,7 +240,7 @@ return pad; } - gst_bin_add_many (GST_BIN (self), convert, gst_object_ref (filter), NULL); + gst_bin_add_many (GST_BIN (self), convert, NULL); convert_sink = gst_element_get_static_pad (convert, "sink"); g_assert (convert_sink); @@ -214,61 +291,73 @@ return filter_src; } -static void -pad_added_cb (GstElement * decodebin, GstPad * pad, GstTranscodeBin * self) +static TranscodingStream * +find_stream (GstTranscodeBin * self, const gchar * stream_id, GstPad * pad) { - GstCaps *caps; - GstPad *sinkpad = NULL; - GstPadLinkReturn lret; - - caps = gst_pad_query_caps (pad, NULL); + gint i; + TranscodingStream *res = NULL; - GST_DEBUG_OBJECT (decodebin, "Pad added, caps: %" GST_PTR_FORMAT, caps); + GST_OBJECT_LOCK (self); + for (i = 0; i < self->transcoding_streams->len; i = i + 1) { + TranscodingStream *s = self->transcoding_streams->pdata[i]; + + if (stream_id && !g_strcmp0 (s->stream_id, stream_id)) { + res = s; + goto done; + } else if (pad && s->encodebin_pad == pad) { + res = s; + goto done; + } + } - g_signal_emit_by_name (self->encodebin, "request-pad", caps, &sinkpad); +done: + GST_OBJECT_UNLOCK (self); - if (sinkpad == NULL) { - gchar *stream_id = gst_pad_get_stream_id (pad); + return res; +} - GST_ELEMENT_WARNING_WITH_DETAILS (self, STREAM, FORMAT, - (NULL), ("Stream with caps: %" GST_PTR_FORMAT " can not be" - " encoded in the defined encoding formats", - caps), - ("can-t-encode-stream", G_TYPE_BOOLEAN, TRUE, - "stream-caps", GST_TYPE_CAPS, caps, - "stream-id", G_TYPE_STRING, stream_id, NULL)); +static void +gst_transcode_bin_link_encodebin_pad (GstTranscodeBin * self, GstPad * pad, + const gchar * stream_id) +{ + 
GstCaps *caps; + GstPadLinkReturn lret; + TranscodingStream *stream = find_stream (self, stream_id, NULL); - g_free (stream_id); + if (!stream) { + GST_ERROR_OBJECT (self, "%s -> Got not stream, decodebin3 bug?", stream_id); return; } - if (caps) - gst_caps_unref (caps); - - pad = _insert_filter (self, sinkpad, pad, caps); - lret = gst_pad_link (pad, sinkpad); + caps = gst_pad_query_caps (pad, NULL); + pad = _insert_filter (self, stream->encodebin_pad, pad, caps); + lret = gst_pad_link (pad, stream->encodebin_pad); switch (lret) { case GST_PAD_LINK_OK: break; case GST_PAD_LINK_WAS_LINKED: GST_FIXME_OBJECT (self, "Pad %" GST_PTR_FORMAT " was already linked", - sinkpad); + stream->encodebin_pad); break; default: { - GstCaps *othercaps = gst_pad_query_caps (sinkpad, NULL); + GstCaps *othercaps = gst_pad_query_caps (stream->encodebin_pad, NULL); caps = gst_pad_get_current_caps (pad); + if (!caps) + caps = gst_pad_query_caps (pad, NULL); + GST_ELEMENT_ERROR_WITH_DETAILS (self, CORE, PAD, (NULL), ("Couldn't link pads:\n %" GST_PTR_FORMAT ": %" GST_PTR_FORMAT "\nand:\n" - " %" GST_PTR_FORMAT ": %" GST_PTR_FORMAT "\n\n", - pad, caps, sinkpad, othercaps), + " %" GST_PTR_FORMAT ": %" GST_PTR_FORMAT "\n\n Error: %s\n", + pad, caps, stream->encodebin_pad, othercaps, + gst_pad_link_get_name (lret)), ("linking-error", GST_TYPE_PAD_LINK_RETURN, lret, "source-pad", GST_TYPE_PAD, pad, "source-caps", GST_TYPE_CAPS, caps, - "sink-pad", GST_TYPE_PAD, sinkpad, + "sink-pad", GST_TYPE_PAD, stream->encodebin_pad, "sink-caps", GST_TYPE_CAPS, othercaps, NULL)); gst_clear_caps (&caps); @@ -276,36 +365,89 @@ gst_caps_unref (othercaps); } } +} + +static GstPadProbeReturn +wait_stream_start_probe (GstPad * pad, + GstPadProbeInfo * info, GstTranscodeBin * self) +{ + const gchar *stream_id; + + if (GST_EVENT_TYPE (info->data) != GST_EVENT_STREAM_START) + return GST_PAD_PROBE_OK; + + gst_event_parse_stream_start (info->data, &stream_id); + GST_INFO_OBJECT (self, "Got pad %" GST_PTR_FORMAT " with 
stream ID: %s", + pad, stream_id); + gst_transcode_bin_link_encodebin_pad (self, pad, stream_id); + + return GST_PAD_PROBE_REMOVE; +} + +static void +decodebin_pad_added_cb (GstElement * decodebin, GstPad * pad, + GstTranscodeBin * self) +{ + const gchar *stream_id; + GstEvent *sstart_event; + + if (GST_PAD_IS_SINK (pad)) + return; + + sstart_event = gst_pad_get_sticky_event (pad, GST_EVENT_STREAM_START, -1); + if (sstart_event) { + gst_event_parse_stream_start (sstart_event, &stream_id); + GST_INFO_OBJECT (self, "Got pad %" GST_PTR_FORMAT " with stream ID: %s", + pad, stream_id); + gst_transcode_bin_link_encodebin_pad (self, pad, stream_id); + return; + } + + GST_INFO_OBJECT (self, "Waiting for stream ID for pad %" GST_PTR_FORMAT, pad); + gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, + (GstPadProbeCallback) wait_stream_start_probe, self, NULL); +} + +static void +encodebin_pad_added_cb (GstElement * encodebin, GstPad * pad, GstElement * self) +{ + GstPadTemplate *template; + GstPad *new_pad; + gchar *name; + + if (!GST_PAD_IS_SRC (pad)) + return; + + template = gst_element_get_pad_template (self, "src_%u"); - gst_object_unref (sinkpad); + GST_OBJECT_LOCK (self); + name = g_strdup_printf ("src_%u", GST_ELEMENT (self)->numsrcpads); + GST_OBJECT_UNLOCK (self); + new_pad = gst_ghost_pad_new_from_template (name, pad, template); + g_free (name); + GST_DEBUG_OBJECT (self, "Encodebin exposed srcpad: %" GST_PTR_FORMAT, pad); + + gst_element_add_pad (self, new_pad); } static gboolean make_encodebin (GstTranscodeBin * self) { - GstPad *pad; GST_INFO_OBJECT (self, "making new encodebin"); if (!self->profile) goto no_profile; - self->encodebin = gst_element_factory_make ("encodebin", NULL); + self->encodebin = gst_element_factory_make ("encodebin2", NULL); if (!self->encodebin) goto no_encodebin; gst_bin_add (GST_BIN (self), self->encodebin); - g_object_set (self->encodebin, "profile", self->profile, NULL); - pad = gst_element_get_static_pad (self->encodebin, 
"src"); - if (!gst_ghost_pad_set_target (GST_GHOST_PAD_CAST (self->srcpad), pad)) { + g_signal_connect (self->encodebin, "pad-added", + G_CALLBACK (encodebin_pad_added_cb), self); - gst_object_unref (pad); - GST_ERROR_OBJECT (self, "Could not ghost %" GST_PTR_FORMAT " srcpad", - self->encodebin); - - return FALSE; - } - gst_object_unref (pad); + g_object_set (self->encodebin, "profile", self->profile, NULL); return gst_element_sync_state_with_parent (self->encodebin); @@ -329,63 +471,234 @@ } } -static gboolean -make_decodebin (GstTranscodeBin * self) +static GstPad * +get_encodebin_pad_for_caps (GstTranscodeBin * self, GstCaps * srccaps) { - GstPad *pad; - GST_INFO_OBJECT (self, "making new decodebin"); + GstPad *res = NULL; + GstIterator *pads; + gboolean done = FALSE; + GValue paditem = { 0, }; + + if (G_UNLIKELY (srccaps == NULL)) + goto no_caps; + + pads = gst_element_iterate_sink_pads (self->encodebin); + + GST_DEBUG_OBJECT (self, "srccaps %" GST_PTR_FORMAT, srccaps); - self->decodebin = gst_element_factory_make ("decodebin", NULL); - - if (!self->decodebin) - goto no_decodebin; - - if (self->avoid_reencoding) { - GstCaps *decodecaps; - - g_object_get (self->decodebin, "caps", &decodecaps, NULL); - if (GST_IS_ENCODING_CONTAINER_PROFILE (self->profile)) { - GList *tmp; - - decodecaps = gst_caps_make_writable (decodecaps); - for (tmp = (GList *) - gst_encoding_container_profile_get_profiles - (GST_ENCODING_CONTAINER_PROFILE (self->profile)); tmp; - tmp = tmp->next) { - GstEncodingProfile *profile = tmp->data; - GstCaps *restrictions; - - restrictions = gst_encoding_profile_get_restriction (profile); - - if (!restrictions || gst_caps_is_any (restrictions)) { - GstCaps *encodecaps = gst_encoding_profile_get_format (profile); - GstElement *filter = NULL; - - /* Filter operates on raw data so don't allow decodebin to produce - * encoded data if one is defined. 
*/ - if (GST_IS_ENCODING_VIDEO_PROFILE (profile) && self->video_filter) - filter = self->video_filter; - else if (GST_IS_ENCODING_AUDIO_PROFILE (profile) - && self->audio_filter) - filter = self->audio_filter; - - if (!filter) { - GST_DEBUG_OBJECT (self, - "adding %" GST_PTR_FORMAT " as output caps to decodebin", - encodecaps); - gst_caps_append (decodecaps, encodecaps); + while (!done) { + switch (gst_iterator_next (pads, &paditem)) { + case GST_ITERATOR_OK: + { + GstPad *testpad = g_value_get_object (&paditem); + + if (!gst_pad_is_linked (testpad) && !find_stream (self, NULL, testpad)) { + GstCaps *sinkcaps = gst_pad_query_caps (testpad, NULL); + + GST_DEBUG_OBJECT (self, "sinkccaps %" GST_PTR_FORMAT, sinkcaps); + + if (gst_caps_can_intersect (srccaps, sinkcaps)) { + res = gst_object_ref (testpad); + done = TRUE; } - } else { - gst_caps_unref (restrictions); + gst_caps_unref (sinkcaps); } + g_value_reset (&paditem); } + break; + case GST_ITERATOR_DONE: + case GST_ITERATOR_ERROR: + done = TRUE; + break; + case GST_ITERATOR_RESYNC: + gst_iterator_resync (pads); + break; } - g_object_set (self->decodebin, "caps", decodecaps, NULL); - gst_caps_unref (decodecaps); } + g_value_reset (&paditem); + gst_iterator_free (pads); + + if (!res) + g_signal_emit_by_name (self->encodebin, "request-pad", srccaps, &res); + + return res; + +no_caps: + { + GST_DEBUG_OBJECT (self, "No caps, can't do anything"); + return NULL; + } +} - g_signal_connect (self->decodebin, "pad-added", G_CALLBACK (pad_added_cb), - self); +static gboolean +caps_is_raw (GstCaps * caps, GstStreamType stype) +{ + const gchar *media_type; + + if (!caps || !gst_caps_get_size (caps)) + return FALSE; + + media_type = gst_structure_get_name (gst_caps_get_structure (caps, 0)); + if (stype == GST_STREAM_TYPE_VIDEO) + return !g_strcmp0 (media_type, "video/x-raw"); + else if (stype == GST_STREAM_TYPE_AUDIO) + return !g_strcmp0 (media_type, "audio/x-raw"); + /* FIXME: Handle more types ? 
*/ + + return FALSE; +} + +static GstPad * +get_encodebin_pad_from_stream (GstTranscodeBin * self, + GstEncodingProfile * profile, GstStream * stream) +{ + GstCaps *caps = gst_stream_get_caps (stream); + GstPad *sinkpad = get_encodebin_pad_for_caps (self, caps); + + if (!sinkpad && !caps_is_raw (caps, gst_stream_get_stream_type (stream))) { + gst_clear_caps (&caps); + switch (gst_stream_get_stream_type (stream)) { + case GST_STREAM_TYPE_AUDIO: + caps = gst_caps_from_string ("audio/x-raw"); + break; + case GST_STREAM_TYPE_VIDEO: + caps = gst_caps_from_string ("video/x-raw"); + break; + default: + GST_INFO_OBJECT (self, "Unsupported stream type: %" GST_PTR_FORMAT, + stream); + return NULL; + } + sinkpad = get_encodebin_pad_for_caps (self, caps); + } + + return sinkpad; +} + +static gint +select_stream_cb (GstElement * decodebin, + GstStreamCollection * collection, GstStream * stream, + GstTranscodeBin * self) +{ + gint i; + gboolean transcode_stream = FALSE; + guint len = 0; + + GST_OBJECT_LOCK (self); + len = self->transcoding_streams->len; + GST_OBJECT_UNLOCK (self); + + if (len) { + transcode_stream = + find_stream (self, gst_stream_get_stream_id (stream), NULL) != NULL; + if (transcode_stream) + goto done; + } + + for (i = 0; i < gst_stream_collection_get_size (collection); i++) { + GstStream *tmpstream = gst_stream_collection_get_stream (collection, i); + GstPad *encodebin_pad = + get_encodebin_pad_from_stream (self, self->profile, tmpstream); + + if (encodebin_pad) { + if (stream == tmpstream) + transcode_stream = TRUE; + + GST_INFO_OBJECT (self, + "Going to transcode stream %s (encodebin pad: %" GST_PTR_FORMAT, + gst_stream_get_stream_id (tmpstream), encodebin_pad); + + GST_OBJECT_LOCK (self); + g_ptr_array_add (self->transcoding_streams, + transcoding_stream_new (tmpstream, encodebin_pad)); + GST_OBJECT_UNLOCK (self); + } + } + + GST_OBJECT_LOCK (self); + len = self->transcoding_streams->len; + GST_OBJECT_UNLOCK (self); + + if (len) { + transcode_stream = + 
find_stream (self, gst_stream_get_stream_id (stream), NULL) != NULL; + } + +done: + if (!transcode_stream) + GST_INFO_OBJECT (self, "Discarding stream: %" GST_PTR_FORMAT, stream); + + return transcode_stream; +} + +/* Called with OBJECT_LOCK */ +static void +_setup_avoid_reencoding (GstTranscodeBin * self) +{ + const GList *tmp; + GstCaps *decodecaps; + + if (!self->avoid_reencoding) + return; + + if (!GST_IS_ENCODING_CONTAINER_PROFILE (self->profile)) + return; + + g_object_get (self->decodebin, "caps", &decodecaps, NULL); + decodecaps = gst_caps_make_writable (decodecaps); + tmp = + gst_encoding_container_profile_get_profiles + (GST_ENCODING_CONTAINER_PROFILE (self->profile)); + for (; tmp; tmp = tmp->next) { + GstEncodingProfile *profile = tmp->data; + GstCaps *restrictions, *encodecaps; + GstElement *filter = NULL; + + restrictions = gst_encoding_profile_get_restriction (profile); + + if (restrictions && gst_caps_is_any (restrictions)) { + gst_caps_unref (restrictions); + continue; + } + + encodecaps = gst_encoding_profile_get_format (profile); + filter = NULL; + + /* Filter operates on raw data so don't allow decodebin to produce + * encoded data if one is defined. 
*/ + if (GST_IS_ENCODING_VIDEO_PROFILE (profile) && self->video_filter) + filter = self->video_filter; + else if (GST_IS_ENCODING_AUDIO_PROFILE (profile) + && self->audio_filter) + filter = self->audio_filter; + + if (!filter || filter_handles_any (filter)) { + GST_DEBUG_OBJECT (self, + "adding %" GST_PTR_FORMAT " as output caps to decodebin", encodecaps); + gst_caps_append (decodecaps, encodecaps); + } + } + + GST_OBJECT_UNLOCK (self); + + g_object_set (self->decodebin, "caps", decodecaps, NULL); + gst_caps_unref (decodecaps); + + GST_OBJECT_LOCK (self); +} + +static gboolean +make_decodebin (GstTranscodeBin * self) +{ + GstPad *pad; + GST_INFO_OBJECT (self, "making new decodebin"); + + self->decodebin = gst_element_factory_make ("decodebin3", NULL); + + g_signal_connect (self->decodebin, "pad-added", + G_CALLBACK (decodebin_pad_added_cb), self); + g_signal_connect (self->decodebin, "select-stream", + G_CALLBACK (select_stream_cb), self); gst_bin_add (GST_BIN (self), self->decodebin); pad = gst_element_get_static_pad (self->decodebin, "sink"); @@ -402,14 +715,6 @@ return TRUE; /* ERRORS */ -no_decodebin: - { - post_missing_plugin_error (GST_ELEMENT_CAST (self), "decodebin"); - GST_ELEMENT_ERROR (self, CORE, MISSING_PLUGIN, (NULL), - ("No decodebin element, check your installation")); - - return FALSE; - } } static void @@ -430,12 +735,6 @@ gst_element_set_state (self->audio_filter, GST_STATE_NULL); gst_bin_remove (GST_BIN (self), self->audio_filter); } - - if (self->decodebin) { - gst_element_set_state (self->decodebin, GST_STATE_NULL); - gst_bin_remove (GST_BIN (self), self->decodebin); - self->decodebin = NULL; - } } static GstStateChangeReturn @@ -447,10 +746,15 @@ switch (transition) { case GST_STATE_CHANGE_READY_TO_PAUSED: - if (!make_encodebin (self)) + if (!self->decodebin) { + post_missing_plugin_error (GST_ELEMENT_CAST (self), "decodebin3"); + GST_ELEMENT_ERROR (self, CORE, MISSING_PLUGIN, (NULL), + ("No decodebin element, check your installation")); + 
goto setup_failed; + } - if (!make_decodebin (self)) + if (!make_encodebin (self)) goto setup_failed; break; @@ -466,6 +770,13 @@ switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_READY: + GST_OBJECT_LOCK (self); + g_ptr_array_remove_range (self->transcoding_streams, 0, + self->transcoding_streams->len); + GST_OBJECT_UNLOCK (self); + + g_signal_handlers_disconnect_by_data (self->decodebin, self); + remove_all_children (self); break; default: @@ -487,6 +798,7 @@ g_clear_object (&self->video_filter); g_clear_object (&self->audio_filter); + g_clear_pointer (&self->transcoding_streams, g_ptr_array_unref); G_OBJECT_CLASS (gst_transcode_bin_parent_class)->dispose (object); } @@ -523,6 +835,28 @@ } } +static GstPad * +gst_transcode_bin_request_pad (GstElement * element, GstPadTemplate * temp, + const gchar * name, const GstCaps * caps) +{ + GstTranscodeBin *self = (GstTranscodeBin *) element; + GstPad *gpad, *decodebin_pad = + gst_element_request_pad_simple (self->decodebin, "sink_%u"); + + if (!decodebin_pad) { + GST_ERROR_OBJECT (element, + "Could not request decodebin3 pad for %" GST_PTR_FORMAT, caps); + + return NULL; + } + + gpad = gst_ghost_pad_new_from_template (name, decodebin_pad, temp); + gst_element_add_pad (element, GST_PAD (gpad)); + gst_object_unref (decodebin_pad); + + return gpad; +} + static void _set_filter (GstTranscodeBin * self, GstElement * filter, GstElement ** mfilter) { @@ -539,6 +873,8 @@ goto bail_out; } GST_OBJECT_UNLOCK (filter); + + gst_bin_add (GST_BIN (self), gst_object_ref (filter)); } GST_OBJECT_LOCK (self); @@ -552,6 +888,15 @@ } static void +_set_profile (GstTranscodeBin * self, GstEncodingProfile * profile) +{ + GST_OBJECT_LOCK (self); + self->profile = profile; + _setup_avoid_reencoding (self); + GST_OBJECT_UNLOCK (self); +} + +static void gst_transcode_bin_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { @@ -559,13 +904,12 @@ switch (prop_id) { case PROP_PROFILE: - GST_OBJECT_LOCK (self); 
- self->profile = g_value_dup_object (value); - GST_OBJECT_UNLOCK (self); + _set_profile (self, g_value_dup_object (value)); break; case PROP_AVOID_REENCODING: GST_OBJECT_LOCK (self); self->avoid_reencoding = g_value_get_boolean (value); + _setup_avoid_reencoding (self); GST_OBJECT_UNLOCK (self); break; case PROP_AUDIO_FILTER: @@ -592,10 +936,14 @@ gstelement_klass = (GstElementClass *) klass; gstelement_klass->change_state = GST_DEBUG_FUNCPTR (gst_transcode_bin_change_state); + gstelement_klass->request_new_pad = + GST_DEBUG_FUNCPTR (gst_transcode_bin_request_pad); gst_element_class_add_pad_template (gstelement_klass, gst_static_pad_template_get (&transcode_bin_sink_template)); gst_element_class_add_pad_template (gstelement_klass, + gst_static_pad_template_get (&transcode_bin_sinks_template)); + gst_element_class_add_pad_template (gstelement_klass, gst_static_pad_template_get (&transcode_bin_src_template)); gst_element_class_set_static_metadata (gstelement_klass, @@ -661,35 +1009,8 @@ gst_object_unref (pad_tmpl); - pad_tmpl = gst_static_pad_template_get (&transcode_bin_src_template); - - self->srcpad = gst_ghost_pad_new_no_target_from_template ("src", pad_tmpl); - gst_pad_set_active (self->srcpad, TRUE); - gst_element_add_pad (GST_ELEMENT (self), self->srcpad); + self->transcoding_streams = + g_ptr_array_new_with_free_func ((GDestroyNotify) transcoding_stream_free); - gst_object_unref (pad_tmpl); + make_decodebin (self); } - -static gboolean -plugin_init (GstPlugin * plugin) -{ - gboolean res = TRUE; - gst_pb_utils_init (); - - GST_DEBUG_CATEGORY_INIT (gst_transcodebin_debug, "transcodebin", 0, - "Transcodebin element"); - - res &= gst_element_register (plugin, "transcodebin", GST_RANK_NONE, - GST_TYPE_TRANSCODE_BIN); - - res &= gst_element_register (plugin, "uritranscodebin", GST_RANK_NONE, - gst_uri_transcode_bin_get_type ()); - - return res; -} - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - transcode, - "A plugin containing elements for 
transcoding", plugin_init, VERSION, - GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.20.1.tar.xz/gst/transcode/gsttranscodeelement.c
Added
@@ -0,0 +1,46 @@ + +/* GStreamer + * Copyright (C) 2019 Thibault Saunier <tsaunier@igalia.com> + * + * gsttranscodebin.c: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +# include "config.h" +#endif + +#include "gsttranscodeelements.h" +#include <gst/gst-i18n-plugin.h> +#include <gst/pbutils/pbutils.h> + +#include <gst/pbutils/missing-plugins.h> + +GST_DEBUG_CATEGORY_STATIC (gst_transcodebin_debug); +#define GST_CAT_DEFAULT gst_transcodebin_debug + +void +transcodebin_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + + if (g_once_init_enter (&res)) { + gst_pb_utils_init (); + GST_DEBUG_CATEGORY_INIT (gst_transcodebin_debug, "transcodebin", 0, + "Transcodebin element"); + g_once_init_leave (&res, TRUE); + } +}
gst-plugins-bad-1.20.1.tar.xz/gst/transcode/gsttranscodeelements.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2015 Thibault Saunier <tsaunier@gnome.org> + * + * gsttranscodebin.c: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_TRANSCODE_ELEMENTS_H__ +#define __GST_TRANSCODE_ELEMENTS_H__ + + +#include <gst/gst.h> + +void transcodebin_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (transcodebin); +GST_ELEMENT_REGISTER_DECLARE (uritranscodebin); + +#endif /* __GST_TRANSCODE_ELEMENTS_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst/transcode/gsttranscodeplugin.c
Added
@@ -0,0 +1,43 @@ +/* GStreamer + * Copyright (C) 2019 Thibault Saunier <tsaunier@igalia.com> + * + * gsttranscodebin.c: + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +# include "config.h" +#endif + +#include "gsttranscodeelements.h" + + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean res = FALSE; + + res |= GST_ELEMENT_REGISTER (transcodebin, plugin); + res |= GST_ELEMENT_REGISTER (uritranscodebin, plugin); + + return res; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + transcode, + "A plugin containing elements for transcoding", plugin_init, VERSION, + GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.18.6.tar.xz/gst/transcode/gsturitranscodebin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/transcode/gsturitranscodebin.c
Changed
@@ -23,6 +23,7 @@ #endif #include "gsttranscoding.h" +#include "gsttranscodeelements.h" #if HAVE_GETRUSAGE #include "gst-cpu-throttling-clock.h" #endif @@ -73,7 +74,10 @@ #define DEFAULT_AVOID_REENCODING FALSE -G_DEFINE_TYPE (GstUriTranscodeBin, gst_uri_transcode_bin, GST_TYPE_PIPELINE) +G_DEFINE_TYPE (GstUriTranscodeBin, gst_uri_transcode_bin, GST_TYPE_PIPELINE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (uritranscodebin, "uritranscodebin", GST_RANK_NONE, + gst_uri_transcode_bin_get_type (), transcodebin_element_init (plugin)); + enum { PROP_0, @@ -89,6 +93,15 @@ LAST_PROP }; +/* signals */ +enum +{ + SIGNAL_SOURCE_SETUP, + SIGNAL_ELEMENT_SETUP, + LAST_SIGNAL +}; +static guint signals[LAST_SIGNAL] = { 0 }; + static void post_missing_plugin_error (GstElement * dec, const gchar * element_name) { @@ -104,65 +117,58 @@ /* *INDENT-ON* */ static gboolean -make_transcodebin (GstUriTranscodeBin * self) +make_dest (GstUriTranscodeBin * self) { - GST_INFO_OBJECT (self, "making new transcodebin"); - - self->transcodebin = gst_element_factory_make ("transcodebin", NULL); - if (!self->transcodebin) - goto no_decodebin; - - g_object_set (self->transcodebin, "profile", self->profile, - "video-filter", self->video_filter, - "audio-filter", self->audio_filter, - "avoid-reencoding", self->avoid_reencoding, NULL); - - gst_bin_add (GST_BIN (self), self->transcodebin); - if (!gst_element_link (self->transcodebin, self->sink)) - return FALSE; - - return TRUE; - - /* ERRORS */ -no_decodebin: - { - post_missing_plugin_error (GST_ELEMENT_CAST (self), "transcodebin"); - - GST_ELEMENT_ERROR (self, CORE, MISSING_PLUGIN, (NULL), - ("No transcodebin element, check your installation")); + GError *err = NULL; - return FALSE; + GST_OBJECT_LOCK (self); + if (self->sink) { + GST_INFO_OBJECT (self, "Sink already set: %" GST_PTR_FORMAT, self->sink); + goto ok_unlock; + } -} -static gboolean -make_dest (GstUriTranscodeBin * self) -{ - GError *err = NULL; + if (!self->dest_uri) + goto ok_unlock; if
(!gst_uri_is_valid (self->dest_uri)) - goto invalid_uri; + goto invalid_uri_unlock; self->sink = gst_element_make_from_uri (GST_URI_SINK, self->dest_uri, "sink", &err); if (!self->sink) - goto no_sink; + goto no_sink_unlock; + GST_OBJECT_UNLOCK (self); gst_bin_add (GST_BIN (self), self->sink); g_object_set (self->sink, "sync", TRUE, "max-lateness", GST_CLOCK_TIME_NONE, NULL); + return TRUE; -invalid_uri: +ok_unlock: + GST_OBJECT_UNLOCK (self); + return TRUE; + +invalid_uri_unlock: { + GST_OBJECT_UNLOCK (self); GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, ("Invalid URI \"%s\".", self->dest_uri), (NULL)); g_clear_error (&err); return FALSE; } -no_sink: +invalid_uri: + { + GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, + ("Invalid URI \"%s\".", self->source_uri), (NULL)); + g_clear_error (&err); + return FALSE; + } + +no_sink_unlock: { + GST_OBJECT_UNLOCK (self); /* whoops, could not create the source element, dig a little deeper to * figure out what might be wrong. */ if (err != NULL && err->code == GST_URI_ERROR_UNSUPPORTED_PROTOCOL) { @@ -191,6 +197,116 @@ } } +static void +transcodebin_pad_added_cb (GstElement * transcodebin, GstPad * pad, + GstUriTranscodeBin * self) +{ + + GstPad *sinkpad; + + if (GST_PAD_IS_SINK (pad)) + return; + + make_dest (self); + if (!self->sink) { + GST_ELEMENT_ERROR (self, CORE, FAILED, (NULL), ("No sink configured.")); + return; + } + + sinkpad = gst_element_get_static_pad (self->sink, "sink"); + if (!sinkpad) { + GST_ELEMENT_ERROR (self, CORE, FAILED, (NULL), ("Sink has not sinkpad?!")); + return; + } + + if (gst_pad_link (pad, sinkpad) != GST_PAD_LINK_OK) { + GST_ERROR_OBJECT (self, + "Could not link %" GST_PTR_FORMAT " and %" GST_PTR_FORMAT, pad, + sinkpad); + /* Let `pad unlinked` error pop up later */ + } +} + +static gboolean +make_transcodebin (GstUriTranscodeBin * self) +{ + GST_INFO_OBJECT (self, "making new transcodebin"); + + self->transcodebin = gst_element_factory_make ("transcodebin", NULL); + if (!self->transcodebin) + 
goto no_transcodebin; + + g_signal_connect (self->transcodebin, "pad-added", + G_CALLBACK (transcodebin_pad_added_cb), self); + + g_object_set (self->transcodebin, "profile", self->profile, + "video-filter", self->video_filter, + "audio-filter", self->audio_filter, + "avoid-reencoding", self->avoid_reencoding, NULL); + + gst_bin_add (GST_BIN (self), self->transcodebin); + + return TRUE; + + /* ERRORS */ +no_transcodebin: + { + post_missing_plugin_error (GST_ELEMENT_CAST (self), "transcodebin"); + + GST_ELEMENT_ERROR (self, CORE, MISSING_PLUGIN, (NULL), + ("No transcodebin element, check your installation")); + + return FALSE; + } +} + +static void +src_pad_added_cb (GstElement * src, GstPad * pad, GstUriTranscodeBin * self) +{ + GstPad *sinkpad = NULL; + GstPadLinkReturn res; + + GST_DEBUG_OBJECT (self, + "New pad %" GST_PTR_FORMAT " from source %" GST_PTR_FORMAT, pad, src); + + sinkpad = gst_element_get_static_pad (self->transcodebin, "sink"); + + if (gst_pad_is_linked (sinkpad)) + sinkpad = gst_element_request_pad_simple (self->transcodebin, "sink_%u"); + + if (sinkpad) { + GST_DEBUG_OBJECT (self, + "Linking %" GST_PTR_FORMAT " to %" GST_PTR_FORMAT, pad, sinkpad); + res = gst_pad_link (pad, sinkpad); + gst_object_unref (sinkpad); + if (GST_PAD_LINK_FAILED (res)) + goto link_failed; + } + return; + +link_failed: + { + GST_ERROR_OBJECT (self, + "failed to link pad %s:%s to decodebin, reason %s (%d)", + GST_DEBUG_PAD_NAME (pad), gst_pad_link_get_name (res), res); + return; + } +} + +static void +src_pad_removed_cb (GstElement * element, GstPad * pad, + GstUriTranscodeBin * self) +{ + /* FIXME : IMPLEMENT */ +} + +static void +source_setup_cb (GstElement * element, GstElement * source, + GstUriTranscodeBin * self) +{ + g_signal_emit (self, signals[SIGNAL_SOURCE_SETUP], 0, source); +} + static gboolean make_source (GstUriTranscodeBin * self) { @@ -199,15 +315,19 @@ if (!gst_uri_is_valid (self->source_uri)) goto invalid_uri; - self->src = gst_element_make_from_uri 
(GST_URI_SRC, self->source_uri, - "src", &err); + self->src = gst_element_factory_make ("urisourcebin", NULL); if (!self->src) - goto no_sink; + goto no_urisourcebin; gst_bin_add (GST_BIN (self), self->src); - if (!gst_element_link (self->src, self->transcodebin)) - return FALSE; + g_object_set (self->src, "uri", self->source_uri, NULL); + + g_signal_connect (self->src, "pad-added", (GCallback) src_pad_added_cb, self); + g_signal_connect (self->src, "pad-removed", + (GCallback) src_pad_removed_cb, self); + g_signal_connect (self->src, "source-setup", + G_CALLBACK (source_setup_cb), self); return TRUE; @@ -219,34 +339,16 @@ return FALSE; } -no_sink: +no_urisourcebin: { - /* whoops, could not create the source element, dig a little deeper to - * figure out what might be wrong. */ - if (err != NULL && err->code == GST_URI_ERROR_UNSUPPORTED_PROTOCOL) { - gchar *prot; - - prot = gst_uri_get_protocol (self->source_uri); - if (prot == NULL) - goto invalid_uri; - - gst_element_post_message (GST_ELEMENT_CAST (self), - gst_missing_uri_source_message_new (GST_ELEMENT (self), prot)); - - GST_ELEMENT_ERROR (self, CORE, MISSING_PLUGIN, - ("No URI handler implemented for \"%s\".", prot), (NULL)); - - g_free (prot); - } else { - GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, - ("%s", (err) ? err->message : "URI was not accepted by any element"), - ("No element accepted URI '%s'", self->dest_uri)); - } + post_missing_plugin_error (GST_ELEMENT_CAST (self), "urisourcebin"); - g_clear_error (&err); + GST_ELEMENT_ERROR (self, CORE, MISSING_PLUGIN, (NULL), + ("No urisourcebin element, check your installation")); return FALSE; } + } static void @@ -271,6 +373,53 @@ } } +static void +set_location_on_muxer_if_sink (GstUriTranscodeBin * self, GstElement * child) +{ + GstElementFactory *factory = gst_element_get_factory (child); + + if (!factory) + return; + + if (!self->dest_uri) + return; + + /* Set out dest URI as location for muxer sinks. 
*/ + if (!gst_element_factory_list_is_type (factory, + GST_ELEMENT_FACTORY_TYPE_MUXER) || + !gst_element_factory_list_is_type (factory, + GST_ELEMENT_FACTORY_TYPE_SINK)) { + + return; + } + + if (!g_object_class_find_property (G_OBJECT_GET_CLASS (child), "location")) + return; + + if (!gst_uri_has_protocol (self->dest_uri, "file")) { + GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, + ("Trying to use a not local file with a muxing sink which is not" + " supported."), (NULL)); + return; + } + + GST_OBJECT_FLAG_SET (self->transcodebin, GST_ELEMENT_FLAG_SINK); + g_object_set (child, "location", &self->dest_uri[strlen ("file://")], NULL); + GST_DEBUG_OBJECT (self, "Setting location: %s", + &self->dest_uri[strlen ("file://")]); +} + +static void +deep_element_added (GstBin * bin, GstBin * sub_bin, GstElement * child) +{ + GstUriTranscodeBin *self = GST_URI_TRANSCODE_BIN (bin); + + set_location_on_muxer_if_sink (self, child); + g_signal_emit (bin, signals[SIGNAL_ELEMENT_SETUP], 0, child); + + GST_BIN_CLASS (parent_class)->deep_element_added (bin, sub_bin, child); +} + static GstStateChangeReturn gst_uri_transcode_bin_change_state (GstElement * element, GstStateChange transition) @@ -281,16 +430,13 @@ switch (transition) { case GST_STATE_CHANGE_READY_TO_PAUSED: - if (!make_dest (self)) - goto setup_failed; - if (!make_transcodebin (self)) goto setup_failed; if (!make_source (self)) goto setup_failed; - if (gst_element_set_state (self->sink, + if (self->sink && gst_element_set_state (self->sink, GST_STATE_PAUSED) == GST_STATE_CHANGE_FAILURE) { GST_ERROR_OBJECT (self, "Could not set %" GST_PTR_FORMAT " state to PAUSED", self->sink); @@ -471,6 +617,7 @@ { GObjectClass *object_class = G_OBJECT_CLASS (klass); GstElementClass *gstelement_klass; + GstBinClass *gstbin_klass; object_class->get_property = gst_uri_transcode_bin_get_property; object_class->set_property = gst_uri_transcode_bin_set_property; @@ -481,6 +628,9 @@ gstelement_klass->change_state = GST_DEBUG_FUNCPTR 
(gst_uri_transcode_bin_change_state); + gstbin_klass = (GstBinClass *) klass; + gstbin_klass->deep_element_added = GST_DEBUG_FUNCPTR (deep_element_added); + GST_DEBUG_CATEGORY_INIT (gst_uri_transcodebin_debug, "uritranscodebin", 0, "UriTranscodebin element"); @@ -553,6 +703,46 @@ g_param_spec_object ("audio-filter", "Audio filter", "the audio filter(s) to apply, if possible", GST_TYPE_ELEMENT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + /** + * GstUriTranscodeBin::source-setup: + * @uritranscodebin: a #GstUriTranscodeBin + * @source: source element + * + * This signal is emitted after the source element has been created, so + * it can be configured by setting additional properties (e.g. set a + * proxy server for an http source, or set the device and read speed for + * an audio cd source). This is functionally equivalent to connecting to + * the notify::source signal, but more convenient. + * + * This signal is usually emitted from the context of a GStreamer streaming + * thread. + * + * Since: 1.20 + */ + signals[SIGNAL_SOURCE_SETUP] = + g_signal_new ("source-setup", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_ELEMENT); + + /** + * GstUriTranscodeBin::element-setup: + * @uritranscodebin: a #GstUriTranscodeBin + * @element: an element that was added to the uritranscodebin hierarchy + * + * This signal is emitted when a new element is added to uritranscodebin or any of + * its sub-bins. This signal can be used to configure elements, e.g. to set + * properties on decoders. This is functionally equivalent to connecting to + * the deep-element-added signal, but more convenient. + * + * This signal is usually emitted from the context of a GStreamer streaming + * thread, so might be called at the same time as code running in the main + * application thread. 
+ * + * Since: 1.20 + */ + signals[SIGNAL_ELEMENT_SETUP] = + g_signal_new ("element-setup", G_TYPE_FROM_CLASS (klass), + G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_ELEMENT); } static void
gst-plugins-bad-1.18.6.tar.xz/gst/transcode/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/transcode/meson.build
Changed
@@ -1,4 +1,6 @@ gsttranscoder_plugin = library('gsttranscode', + 'gsttranscodeelement.c', + 'gsttranscodeplugin.c', 'gsttranscodebin.c', 'gst-cpu-throttling-clock.c', 'gsturitranscodebin.c',
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstscenechange.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstscenechange.c
Changed
@@ -112,6 +112,8 @@ GST_TYPE_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_scene_change_debug_category, "scenechange", 0, "debug category for scenechange element")); +GST_ELEMENT_REGISTER_DEFINE (scenechange, "scenechange", + GST_RANK_NONE, gst_scene_change_get_type ()); static void gst_scene_change_class_init (GstSceneChangeClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstscenechange.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstscenechange.h
Changed
@@ -53,6 +53,7 @@ }; GType gst_scene_change_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (scenechange); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstvideodiff.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstvideodiff.c
Changed
@@ -57,6 +57,8 @@ G_DEFINE_TYPE_WITH_CODE (GstVideoDiff, gst_video_diff, GST_TYPE_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_video_diff_debug_category, "videodiff", 0, "debug category for videodiff element")); +GST_ELEMENT_REGISTER_DEFINE (videodiff, "videodiff", + GST_RANK_NONE, GST_TYPE_VIDEO_DIFF); static void gst_video_diff_class_init (GstVideoDiffClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstvideodiff.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstvideodiff.h
Changed
@@ -52,6 +52,7 @@ }; GType gst_video_diff_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (videodiff); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstvideofiltersbad.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstvideofiltersbad.c
Changed
@@ -31,15 +31,13 @@ static gboolean plugin_init (GstPlugin * plugin) { + gboolean ret = FALSE; - gst_element_register (plugin, "scenechange", GST_RANK_NONE, - gst_scene_change_get_type ()); - gst_element_register (plugin, "zebrastripe", GST_RANK_NONE, - gst_zebra_stripe_get_type ()); - return gst_element_register (plugin, "videodiff", GST_RANK_NONE, - GST_TYPE_VIDEO_DIFF); + ret |= GST_ELEMENT_REGISTER (scenechange, plugin); + ret |= GST_ELEMENT_REGISTER (zebrastripe, plugin); + ret |= GST_ELEMENT_REGISTER (videodiff, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstzebrastripe.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstzebrastripe.c
Changed
@@ -88,6 +88,8 @@ GST_TYPE_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_zebra_stripe_debug_category, "zebrastripe", 0, "debug category for zebrastripe element")); +GST_ELEMENT_REGISTER_DEFINE (zebrastripe, "zebrastripe", + GST_RANK_NONE, gst_zebra_stripe_get_type ()); static void gst_zebra_stripe_class_init (GstZebraStripeClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/videofilters/gstzebrastripe.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videofilters/gstzebrastripe.h
Changed
@@ -52,6 +52,7 @@ }; GType gst_zebra_stripe_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (zebrastripe); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/gst/videoframe_audiolevel/gstvideoframe-audiolevel.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoframe_audiolevel/gstvideoframe-audiolevel.c
Changed
@@ -86,6 +86,8 @@ #define parent_class gst_videoframe_audiolevel_parent_class G_DEFINE_TYPE (GstVideoFrameAudioLevel, gst_videoframe_audiolevel, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (videoframe_audiolevel, "videoframe-audiolevel", + GST_RANK_NONE, GST_TYPE_VIDEOFRAME_AUDIOLEVEL); static GstFlowReturn gst_videoframe_audiolevel_asink_chain (GstPad * pad, GstObject * parent, GstBuffer * inbuf); @@ -777,8 +779,7 @@ static gboolean gst_videoframe_audiolevel_plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "videoframe-audiolevel", - GST_RANK_NONE, GST_TYPE_VIDEOFRAME_AUDIOLEVEL); + return GST_ELEMENT_REGISTER (videoframe_audiolevel, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/gst/videoframe_audiolevel/gstvideoframe-audiolevel.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videoframe_audiolevel/gstvideoframe-audiolevel.h
Changed
@@ -68,6 +68,7 @@ }; GType gst_videoframe_audiolevel_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (videoframe_audiolevel); G_END_DECLS #endif /* __GST_VIDEOFRAME_AUDIOLEVEL_H__ */
gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstav1parse.c
Added
@@ -0,0 +1,1974 @@ +/* GStreamer + * Copyright (C) 2020 He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/* + * SECTION:element-av1parse + * @title: av1parse + * @short_description: An AV1 stream parse. + * + * The minimal unit should be the BYTE. + * There are four types of AV1 alignment in the AV1 stream. + * + * alignment: byte, obu, frame, tu + * + * 1. Aligned to byte. The basic and default one for input. + * 2. Aligned to obu(Open Bitstream Units). + * 3. Aligned to frame. The default one for output. This ensures that + * each buffer contains only one frame or frame header with the + * show_existing flag for the base or sub layer. It is useful for + * the decoder. + * 4. Aligned to tu(Temporal Unit). A temporal unit consists of all the + * OBUs that are associated with a specific, distinct time instant. + * When scalability is disabled, it contains just exact one showing + * frame(may contain several unshowing frames). When scalability is + * enabled, it contains frames depending on the layer number. It should + * begin with a temporal delimiter obu. It may be useful for mux/demux + * to index the data of some timestamp. + * + * The annex B define a special format for the temporal unit. 
The size of + * each temporal unit is extract out to the header of the buffer, and no + * size field inside the each obu. There are two stream formats: + * + * stream-format: obu-stream, annexb + * + * 1. obu-stream. The basic and default one. + * 2. annexb. A special stream of temporal unit. It also implies that the + * alignment should be TU. + * + * This AV1 parse implements the conversion between the alignments and the + * stream-formats. If the input and output have the same alignment and the + * same stream-format, it will check and bypass the data. + * + * ## Example launch line to generate annex B format AV1 stream: + * ``` + * gst-launch-1.0 filesrc location=sample.av1 ! ivfparse ! av1parse ! \ + * video/x-av1,alignment=\(string\)tu,stream-format=\(string\)annexb ! \ + * matroskamux ! filesink location=trans.mkv + * ``` + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/base/gstbitreader.h> +#include <gst/base/gstbitwriter.h> +#include <gst/codecparsers/gstav1parser.h> +#include <gst/video/video.h> +#include "gstvideoparserselements.h" +#include "gstav1parse.h" + +#include <string.h> + +#define GST_AV1_MAX_LEB_128_SIZE 8 + +GST_DEBUG_CATEGORY (av1_parse_debug); +#define GST_CAT_DEFAULT av1_parse_debug + +/* We combine the stream format and the alignment + together. When stream format is annexb, the + alignment must be TU.
*/ +typedef enum +{ + GST_AV1_PARSE_ALIGN_ERROR = -1, + GST_AV1_PARSE_ALIGN_NONE = 0, + GST_AV1_PARSE_ALIGN_BYTE, + GST_AV1_PARSE_ALIGN_OBU, + GST_AV1_PARSE_ALIGN_FRAME, + GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT, + GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B, +} GstAV1ParseAligment; + +struct _GstAV1Parse +{ + GstBaseParse parent; + + gint width; + gint height; + gint subsampling_x; + gint subsampling_y; + gboolean mono_chrome; + guint8 bit_depth; + gchar *colorimetry; + GstAV1Profile profile; + + GstAV1ParseAligment in_align; + GstAV1ParseAligment align; + + GstAV1Parser *parser; + GstAdapter *cache_out; + guint last_parsed_offset; + GstAdapter *frame_cache; + guint highest_spatial_id; + gint last_shown_frame_temporal_id; + gint last_shown_frame_spatial_id; + gboolean within_one_frame; + gboolean update_caps; + gboolean discont; + gboolean header; + gboolean keyframe; + gboolean show_frame; +}; + +static GstStaticPadTemplate sinktemplate = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-av1")); + +static GstStaticPadTemplate srctemplate = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-av1, parsed = (boolean) true, " + "stream-format=(string) { obu-stream, annexb }, " + "alignment=(string) { obu, tu, frame }")); + +#define parent_class gst_av1_parse_parent_class +G_DEFINE_TYPE (GstAV1Parse, gst_av1_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (av1parse, "av1parse", GST_RANK_SECONDARY, + GST_TYPE_AV1_PARSE, videoparsers_element_init (plugin)); + +static void +remove_fields (GstCaps * caps, gboolean all) +{ + guint i, n; + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + GstStructure *s = gst_caps_get_structure (caps, i); + + if (all) { + gst_structure_remove_field (s, "alignment"); + gst_structure_remove_field (s, "stream-format"); + } + gst_structure_remove_field (s, "parsed"); + } +} + +static const gchar * +_obu_name (GstAV1OBUType type) +{ + 
switch (type) { + case GST_AV1_OBU_SEQUENCE_HEADER: + return "sequence header"; + case GST_AV1_OBU_TEMPORAL_DELIMITER: + return "temporal delimiter"; + case GST_AV1_OBU_FRAME_HEADER: + return "frame header"; + case GST_AV1_OBU_TILE_GROUP: + return "tile group"; + case GST_AV1_OBU_METADATA: + return "metadata"; + case GST_AV1_OBU_FRAME: + return "frame"; + case GST_AV1_OBU_REDUNDANT_FRAME_HEADER: + return "redundant frame header"; + case GST_AV1_OBU_TILE_LIST: + return "tile list"; + case GST_AV1_OBU_PADDING: + return "padding"; + default: + return "unknown"; + } + + return NULL; +} + +static guint32 +_read_leb128 (guint8 * data, GstAV1ParserResult * retval, guint32 * comsumed) +{ + guint8 leb128_byte = 0; + guint64 value = 0; + gint i; + gboolean result; + GstBitReader br; + guint32 cur_pos; + + gst_bit_reader_init (&br, data, 8); + + cur_pos = gst_bit_reader_get_pos (&br); + for (i = 0; i < 8; i++) { + leb128_byte = 0; + result = gst_bit_reader_get_bits_uint8 (&br, &leb128_byte, 8); + if (result == FALSE) { + *retval = GST_AV1_PARSER_BITSTREAM_ERROR; + return 0; + } + + value |= (((gint) leb128_byte & 0x7f) << (i * 7)); + if (!(leb128_byte & 0x80)) + break; + } + + *comsumed = (gst_bit_reader_get_pos (&br) - cur_pos) / 8; + /* check for bitstream conformance see chapter4.10.5 */ + if (value < G_MAXUINT32) { + *retval = GST_AV1_PARSER_OK; + return (guint32) value; + } else { + GST_WARNING ("invalid leb128"); + *retval = GST_AV1_PARSER_BITSTREAM_ERROR; + return 0; + } +} + +static gsize +_leb_size_in_bytes (guint64 value) +{ + gsize size = 0; + do { + ++size; + } while ((value >>= 7) != 0); + + return size; +} + +static gboolean +_write_leb128 (guint8 * data, guint * len, guint64 value) +{ + guint leb_size = _leb_size_in_bytes (value); + guint i; + + if (value > G_MAXUINT32 || leb_size > GST_AV1_MAX_LEB_128_SIZE) + return FALSE; + + for (i = 0; i < leb_size; ++i) { + guint8 byte = value & 0x7f; + value >>= 7; + + /* Signal that more bytes follow. 
*/ + if (value != 0) + byte |= 0x80; + + *(data + i) = byte; + } + + *len = leb_size; + return TRUE; +} + +static gboolean gst_av1_parse_start (GstBaseParse * parse); +static gboolean gst_av1_parse_stop (GstBaseParse * parse); +static GstFlowReturn gst_av1_parse_handle_frame (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize); +static gboolean gst_av1_parse_set_sink_caps (GstBaseParse * parse, + GstCaps * caps); +static GstCaps *gst_av1_parse_get_sink_caps (GstBaseParse * parse, + GstCaps * filter); + +/* Clear the parse state related to data kind OBUs. */ +static void +gst_av1_parse_reset_obu_data_state (GstAV1Parse * self) +{ + self->last_shown_frame_temporal_id = -1; + self->last_shown_frame_spatial_id = -1; + self->within_one_frame = FALSE; +} + +static void +gst_av1_parse_reset (GstAV1Parse * self) +{ + self->width = 0; + self->height = 0; + self->subsampling_x = -1; + self->subsampling_y = -1; + self->mono_chrome = FALSE; + self->profile = GST_AV1_PROFILE_UNDEFINED; + self->bit_depth = 0; + self->align = GST_AV1_PARSE_ALIGN_NONE; + self->in_align = GST_AV1_PARSE_ALIGN_NONE; + self->discont = TRUE; + self->header = FALSE; + self->keyframe = FALSE; + self->show_frame = FALSE; + self->last_parsed_offset = 0; + self->highest_spatial_id = 0; + gst_av1_parse_reset_obu_data_state (self); + g_clear_pointer (&self->colorimetry, g_free); + g_clear_pointer (&self->parser, gst_av1_parser_free); + gst_adapter_clear (self->cache_out); + gst_adapter_clear (self->frame_cache); +} + +static void +gst_av1_parse_init (GstAV1Parse * self) +{ + gst_base_parse_set_pts_interpolation (GST_BASE_PARSE (self), FALSE); + gst_base_parse_set_infer_ts (GST_BASE_PARSE (self), FALSE); + + GST_PAD_SET_ACCEPT_INTERSECT (GST_BASE_PARSE_SINK_PAD (self)); + GST_PAD_SET_ACCEPT_TEMPLATE (GST_BASE_PARSE_SINK_PAD (self)); + + self->cache_out = gst_adapter_new (); + self->frame_cache = gst_adapter_new (); +} + +static void +gst_av1_parse_finalize (GObject * object) +{ + GstAV1Parse 
*self = GST_AV1_PARSE (object); + + gst_av1_parse_reset (self); + g_object_unref (self->cache_out); + g_object_unref (self->frame_cache); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_av1_parse_class_init (GstAV1ParseClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + GstBaseParseClass *parse_class = GST_BASE_PARSE_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + + gobject_class->finalize = gst_av1_parse_finalize; + parse_class->start = GST_DEBUG_FUNCPTR (gst_av1_parse_start); + parse_class->stop = GST_DEBUG_FUNCPTR (gst_av1_parse_stop); + parse_class->handle_frame = GST_DEBUG_FUNCPTR (gst_av1_parse_handle_frame); + parse_class->set_sink_caps = GST_DEBUG_FUNCPTR (gst_av1_parse_set_sink_caps); + parse_class->get_sink_caps = GST_DEBUG_FUNCPTR (gst_av1_parse_get_sink_caps); + + gst_element_class_add_static_pad_template (element_class, &srctemplate); + gst_element_class_add_static_pad_template (element_class, &sinktemplate); + + gst_element_class_set_static_metadata (element_class, "AV1 parser", + "Codec/Parser/Converter/Video", + "Parses AV1 streams", "He Junyan <junyan.he@intel.com>"); + + GST_DEBUG_CATEGORY_INIT (av1_parse_debug, "av1parse", 0, "av1 parser"); +} + +static gboolean +gst_av1_parse_start (GstBaseParse * parse) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + + GST_DEBUG_OBJECT (self, "start"); + + gst_av1_parse_reset (self); + self->parser = gst_av1_parser_new (); + + /* At least the OBU header. 
*/ + gst_base_parse_set_min_frame_size (parse, 1); + + return TRUE; +} + +static gboolean +gst_av1_parse_stop (GstBaseParse * parse) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + + GST_DEBUG_OBJECT (self, "stop"); + g_clear_pointer (&self->parser, gst_av1_parser_free); + + return TRUE; +} + +static const gchar * +gst_av1_parse_profile_to_string (GstAV1Profile profile) +{ + switch (profile) { + case GST_AV1_PROFILE_0: + return "main"; + case GST_AV1_PROFILE_1: + return "high"; + case GST_AV1_PROFILE_2: + return "professional"; + default: + break; + } + + return NULL; +} + +static GstAV1Profile +gst_av1_parse_profile_from_string (const gchar * profile) +{ + if (!profile) + return GST_AV1_PROFILE_UNDEFINED; + + if (g_strcmp0 (profile, "main") == 0) + return GST_AV1_PROFILE_0; + else if (g_strcmp0 (profile, "high") == 0) + return GST_AV1_PROFILE_1; + else if (g_strcmp0 (profile, "professional") == 0) + return GST_AV1_PROFILE_2; + + return GST_AV1_PROFILE_UNDEFINED; +} + +static const gchar * +gst_av1_parse_alignment_to_steam_format_string (GstAV1ParseAligment align) +{ + switch (align) { + case GST_AV1_PARSE_ALIGN_BYTE: + return "obu-stream"; + case GST_AV1_PARSE_ALIGN_OBU: + case GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT: + case GST_AV1_PARSE_ALIGN_FRAME: + return "obu-stream"; + case GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B: + return "annexb"; + default: + GST_WARNING ("Unrecognized stream format"); + break; + } + + return NULL; +} + +static const gchar * +gst_av1_parse_alignment_to_string (GstAV1ParseAligment align) +{ + switch (align) { + case GST_AV1_PARSE_ALIGN_BYTE: + return "byte"; + case GST_AV1_PARSE_ALIGN_OBU: + return "obu"; + case GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT: + case GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B: + return "tu"; + case GST_AV1_PARSE_ALIGN_FRAME: + return "frame"; + default: + GST_WARNING ("Unrecognized alignment"); + break; + } + + return NULL; +} + +static GstAV1ParseAligment +gst_av1_parse_alignment_from_string (const gchar * align, + const 
gchar * stream_format) +{ + if (!align && !stream_format) + return GST_AV1_PARSE_ALIGN_NONE; + + if (stream_format) { + if (g_strcmp0 (stream_format, "annexb") == 0) { + if (align && g_strcmp0 (align, "tu") != 0) { + /* annex b stream must align to TU. */ + return GST_AV1_PARSE_ALIGN_ERROR; + } else { + return GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B; + } + } else if (g_strcmp0 (stream_format, "obu-stream") != 0) { + /* unrecognized */ + return GST_AV1_PARSE_ALIGN_NONE; + } + + /* stream-format is obu-stream, depends on align */ + } + + if (align) { + if (g_strcmp0 (align, "byte") == 0) { + return GST_AV1_PARSE_ALIGN_BYTE; + } else if (g_strcmp0 (align, "obu") == 0) { + return GST_AV1_PARSE_ALIGN_OBU; + } else if (g_strcmp0 (align, "tu") == 0) { + return GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT; + } else if (g_strcmp0 (align, "frame") == 0) { + return GST_AV1_PARSE_ALIGN_FRAME; + } else { + /* unrecognized */ + return GST_AV1_PARSE_ALIGN_NONE; + } + } + + return GST_AV1_PARSE_ALIGN_NONE; +} + +static gboolean +gst_av1_parse_caps_has_alignment (GstCaps * caps, GstAV1ParseAligment alignment) +{ + guint i, j, caps_size; + const gchar *cmp_align_str = NULL; + const gchar *cmp_stream_str = NULL; + + GST_DEBUG ("Try to find alignment %d in caps: %" GST_PTR_FORMAT, + alignment, caps); + + caps_size = gst_caps_get_size (caps); + if (caps_size == 0) + return FALSE; + + switch (alignment) { + case GST_AV1_PARSE_ALIGN_BYTE: + cmp_align_str = "byte"; + cmp_stream_str = "obu-stream"; + break; + case GST_AV1_PARSE_ALIGN_OBU: + cmp_align_str = "obu"; + cmp_stream_str = "obu-stream"; + break; + case GST_AV1_PARSE_ALIGN_FRAME: + cmp_align_str = "frame"; + cmp_stream_str = "obu-stream"; + break; + case GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT: + cmp_align_str = "tu"; + cmp_stream_str = "obu-stream"; + break; + case GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B: + cmp_align_str = "tu"; + cmp_stream_str = "annexb"; + break; + default: + return FALSE; + } + + for (i = 0; i < caps_size; i++) { + 
GstStructure *s = gst_caps_get_structure (caps, i); + const GValue *alignment_value = gst_structure_get_value (s, "alignment"); + const GValue *stream_value = gst_structure_get_value (s, "stream-format"); + + if (!alignment_value || !stream_value) + continue; + + if (G_VALUE_HOLDS_STRING (alignment_value)) { + const gchar *align_str = g_value_get_string (alignment_value); + + if (g_strcmp0 (align_str, cmp_align_str) != 0) + continue; + } else if (GST_VALUE_HOLDS_LIST (alignment_value)) { + guint num_values = gst_value_list_get_size (alignment_value); + + for (j = 0; j < num_values; j++) { + const GValue *v = gst_value_list_get_value (alignment_value, j); + const gchar *align_str = g_value_get_string (v); + + if (g_strcmp0 (align_str, cmp_align_str) == 0) + break; + } + + if (j == num_values) + continue; + } + + if (G_VALUE_HOLDS_STRING (stream_value)) { + const gchar *stream_str = g_value_get_string (stream_value); + + if (g_strcmp0 (stream_str, cmp_stream_str) != 0) + continue; + } else if (GST_VALUE_HOLDS_LIST (stream_value)) { + guint num_values = gst_value_list_get_size (stream_value); + + for (j = 0; j < num_values; j++) { + const GValue *v = gst_value_list_get_value (stream_value, j); + const gchar *stream_str = g_value_get_string (v); + + if (g_strcmp0 (stream_str, cmp_stream_str) == 0) + break; + } + + if (j == num_values) + continue; + } + + return TRUE; + } + + return FALSE; +} + +static GstAV1ParseAligment +gst_av1_parse_alignment_from_caps (GstCaps * caps) +{ + GstAV1ParseAligment align; + + align = GST_AV1_PARSE_ALIGN_NONE; + + GST_DEBUG ("parsing caps: %" GST_PTR_FORMAT, caps); + + if (caps && gst_caps_get_size (caps) > 0) { + GstStructure *s = gst_caps_get_structure (caps, 0); + const gchar *str_align = NULL; + const gchar *str_stream = NULL; + + str_align = gst_structure_get_string (s, "alignment"); + str_stream = gst_structure_get_string (s, "stream-format"); + + if (str_align || str_stream) + align = gst_av1_parse_alignment_from_string (str_align, 
str_stream); + } + + return align; +} + +static void +gst_av1_parse_update_src_caps (GstAV1Parse * self, GstCaps * caps) +{ + GstCaps *sink_caps, *src_caps; + GstCaps *final_caps = NULL; + GstStructure *s = NULL; + gint width, height; + gint par_n = 0, par_d = 0; + gint fps_n = 0, fps_d = 0; + const gchar *profile = NULL; + + if (G_UNLIKELY (!gst_pad_has_current_caps (GST_BASE_PARSE_SRC_PAD (self)))) + self->update_caps = TRUE; + + if (!self->update_caps) + return; + + /* if this is being called from the first _setcaps call, caps on the sinkpad + * aren't set yet and so they need to be passed as an argument */ + if (caps) + sink_caps = gst_caps_ref (caps); + else + sink_caps = gst_pad_get_current_caps (GST_BASE_PARSE_SINK_PAD (self)); + + /* carry over input caps as much as possible; override with our own stuff */ + if (!sink_caps) + sink_caps = gst_caps_new_empty_simple ("video/x-av1"); + else + s = gst_caps_get_structure (sink_caps, 0); + + final_caps = gst_caps_copy (sink_caps); + + if (s && gst_structure_has_field (s, "width") && + gst_structure_has_field (s, "height")) { + gst_structure_get_int (s, "width", &width); + gst_structure_get_int (s, "height", &height); + } else { + width = self->width; + height = self->height; + } + + if (width > 0 && height > 0) + gst_caps_set_simple (final_caps, "width", G_TYPE_INT, width, + "height", G_TYPE_INT, height, NULL); + + if (s && gst_structure_get_fraction (s, "pixel-aspect-ratio", &par_n, &par_d)) { + if (par_n != 0 && par_d != 0) { + gst_caps_set_simple (final_caps, "pixel-aspect-ratio", + GST_TYPE_FRACTION, par_n, par_d, NULL); + } + } + + if (s && gst_structure_has_field (s, "framerate")) { + gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d); + } + + if (fps_n > 0 && fps_d > 0) { + gst_caps_set_simple (final_caps, "framerate", + GST_TYPE_FRACTION, fps_n, fps_d, NULL); + gst_base_parse_set_frame_rate (GST_BASE_PARSE (self), fps_n, fps_d, 0, 0); + } + + /* When not RGB, the chroma format is needed. 
*/ + if (self->colorimetry == NULL || + (g_strcmp0 (self->colorimetry, GST_VIDEO_COLORIMETRY_SRGB) != 0)) { + const gchar *chroma_format = NULL; + + if (self->subsampling_x == 1 && self->subsampling_y == 1) { + if (!self->mono_chrome) { + chroma_format = "4:2:0"; + } else { + chroma_format = "4:0:0"; + } + } else if (self->subsampling_x == 1 && self->subsampling_y == 0) { + chroma_format = "4:2:2"; + } else if (self->subsampling_x == 0 && self->subsampling_y == 0) { + chroma_format = "4:4:4"; + } + + if (chroma_format) + gst_caps_set_simple (final_caps, + "chroma-format", G_TYPE_STRING, chroma_format, NULL); + } + + if (self->bit_depth) + gst_caps_set_simple (final_caps, + "bit-depth-luma", G_TYPE_UINT, self->bit_depth, + "bit-depth-chroma", G_TYPE_UINT, self->bit_depth, NULL); + + if (self->colorimetry && (!s || !gst_structure_has_field (s, "colorimetry"))) + gst_caps_set_simple (final_caps, + "colorimetry", G_TYPE_STRING, self->colorimetry, NULL); + + g_assert (self->align > GST_AV1_PARSE_ALIGN_NONE); + gst_caps_set_simple (final_caps, "parsed", G_TYPE_BOOLEAN, TRUE, + "stream-format", G_TYPE_STRING, + gst_av1_parse_alignment_to_steam_format_string (self->align), + "alignment", G_TYPE_STRING, + gst_av1_parse_alignment_to_string (self->align), NULL); + + profile = gst_av1_parse_profile_to_string (self->profile); + if (profile) + gst_caps_set_simple (final_caps, "profile", G_TYPE_STRING, profile, NULL); + + src_caps = gst_pad_get_current_caps (GST_BASE_PARSE_SRC_PAD (self)); + + if (!(src_caps && gst_caps_is_strictly_equal (src_caps, final_caps))) { + GST_DEBUG_OBJECT (self, "Update src caps %" GST_PTR_FORMAT, final_caps); + gst_pad_set_caps (GST_BASE_PARSE_SRC_PAD (self), final_caps); + } + + gst_clear_caps (&src_caps); + gst_caps_unref (final_caps); + gst_caps_unref (sink_caps); + + self->update_caps = FALSE; +} + +/* check downstream caps to configure format and alignment */ +static void +gst_av1_parse_negotiate (GstAV1Parse * self, GstCaps * in_caps) +{ + 
GstCaps *caps; + GstAV1ParseAligment align = GST_AV1_PARSE_ALIGN_NONE; + + caps = gst_pad_get_allowed_caps (GST_BASE_PARSE_SRC_PAD (self)); + GST_DEBUG_OBJECT (self, "allowed caps: %" GST_PTR_FORMAT, caps); + + /* concentrate on leading structure, since decodebin parser + * capsfilter always includes parser template caps */ + if (caps) { + caps = gst_caps_truncate (caps); + GST_DEBUG_OBJECT (self, "negotiating with caps: %" GST_PTR_FORMAT, caps); + } + + /* prefer TU as default */ + if (gst_av1_parse_caps_has_alignment (caps, + GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT)) { + align = GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT; + goto done; + } + + /* Both upstream and downstream support it, best case */ + if (in_caps && caps) { + if (gst_caps_can_intersect (in_caps, caps)) { + GST_DEBUG_OBJECT (self, "downstream accepts upstream caps"); + align = gst_av1_parse_alignment_from_caps (in_caps); + gst_clear_caps (&caps); + } + } + if (align != GST_AV1_PARSE_ALIGN_NONE) + goto done; + + /* Select the first one downstream supports */ + if (caps && !gst_caps_is_empty (caps)) { + /* fixate to avoid ambiguity with lists when parsing */ + caps = gst_caps_fixate (caps); + align = gst_av1_parse_alignment_from_caps (caps); + } + if (align != GST_AV1_PARSE_ALIGN_NONE) + goto done; + + /* default */ + if (align == GST_AV1_PARSE_ALIGN_NONE) + align = GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT; + +done: + self->align = align; + GST_INFO_OBJECT (self, "selected alignment %s", + gst_av1_parse_alignment_to_string (align)); + + gst_clear_caps (&caps); +} + +static GstCaps * +gst_av1_parse_get_sink_caps (GstBaseParse * parse, GstCaps * filter) +{ + GstCaps *peercaps, *templ; + GstCaps *res, *tmp, *pcopy; + + templ = gst_pad_get_pad_template_caps (GST_BASE_PARSE_SINK_PAD (parse)); + if (filter) { + GstCaps *fcopy = gst_caps_copy (filter); + /* Remove the fields we convert */ + remove_fields (fcopy, TRUE); + peercaps = gst_pad_peer_query_caps (GST_BASE_PARSE_SRC_PAD (parse), fcopy); + gst_caps_unref (fcopy); + } else { + 
peercaps = gst_pad_peer_query_caps (GST_BASE_PARSE_SRC_PAD (parse), NULL); + } + + pcopy = gst_caps_copy (peercaps); + remove_fields (pcopy, TRUE); + + res = gst_caps_intersect_full (pcopy, templ, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (pcopy); + gst_caps_unref (templ); + + if (filter) { + GstCaps *tmp = gst_caps_intersect_full (res, filter, + GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (res); + res = tmp; + } + + /* Try if we can put the downstream caps first */ + pcopy = gst_caps_copy (peercaps); + remove_fields (pcopy, FALSE); + tmp = gst_caps_intersect_full (pcopy, res, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (pcopy); + if (!gst_caps_is_empty (tmp)) + res = gst_caps_merge (tmp, res); + else + gst_caps_unref (tmp); + + gst_caps_unref (peercaps); + + return res; +} + +static gboolean +gst_av1_parse_set_sink_caps (GstBaseParse * parse, GstCaps * caps) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + GstStructure *str; + GstAV1ParseAligment align; + GstCaps *in_caps = NULL; + const gchar *profile; + + str = gst_caps_get_structure (caps, 0); + + /* accept upstream info if provided */ + gst_structure_get_int (str, "width", &self->width); + gst_structure_get_int (str, "height", &self->height); + profile = gst_structure_get_string (str, "profile"); + if (profile) + self->profile = gst_av1_parse_profile_from_string (profile); + + /* get upstream align from caps */ + align = gst_av1_parse_alignment_from_caps (caps); + if (align == GST_AV1_PARSE_ALIGN_ERROR) { + GST_ERROR_OBJECT (self, "Sink caps %" GST_PTR_FORMAT " set stream-format" + " and alignment conflict.", caps); + return FALSE; + } + + in_caps = gst_caps_copy (caps); + /* default */ + if (align == GST_AV1_PARSE_ALIGN_NONE) + gst_caps_set_simple (in_caps, "alignment", G_TYPE_STRING, + gst_av1_parse_alignment_to_string (GST_AV1_PARSE_ALIGN_BYTE), NULL); + + /* negotiate with downstream, set output align */ + gst_av1_parse_negotiate (self, in_caps); + + self->update_caps = TRUE; + + /* if all of decoder's 
capability-related values are provided + * by upstream, update src caps now */ + if (self->width > 0 && self->height > 0 && profile) + gst_av1_parse_update_src_caps (self, in_caps); + + gst_caps_unref (in_caps); + + self->in_align = align; + + if (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) { + gst_av1_parser_reset (self->parser, TRUE); + } else { + gst_av1_parser_reset (self->parser, FALSE); + } + + return TRUE; +} + +static GstFlowReturn +gst_av1_parse_push_data (GstAV1Parse * self, GstBaseParseFrame * frame, + guint32 finish_sz, gboolean frame_finished) +{ + gsize sz; + GstBuffer *buf, *header_buf; + GstBuffer *buffer = frame->buffer; + GstFlowReturn ret = GST_FLOW_OK; + + /* Need to generate the final TU annex-b format */ + if (self->align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) { + guint8 size_data[GST_AV1_MAX_LEB_128_SIZE]; + guint size_len = 0; + guint len; + + /* When pushing a TU, it must also be a frame end. */ + g_assert (frame_finished); + + /* Still some left in the frame cache */ + len = gst_adapter_available (self->frame_cache); + if (len) { + buf = gst_adapter_take_buffer (self->frame_cache, len); + + /* frame_unit_size */ + _write_leb128 (size_data, &size_len, len); + header_buf = gst_buffer_new_memdup (size_data, size_len); + GST_BUFFER_PTS (header_buf) = GST_BUFFER_PTS (buf); + GST_BUFFER_DTS (header_buf) = GST_BUFFER_DTS (buf); + GST_BUFFER_DURATION (header_buf) = GST_BUFFER_DURATION (buf); + + gst_adapter_push (self->cache_out, header_buf); + gst_adapter_push (self->cache_out, buf); + } + + len = gst_adapter_available (self->cache_out); + if (len) { + buf = gst_adapter_take_buffer (self->cache_out, len); + + /* temporal_unit_size */ + _write_leb128 (size_data, &size_len, len); + header_buf = gst_buffer_new_memdup (size_data, size_len); + GST_BUFFER_PTS (header_buf) = GST_BUFFER_PTS (buf); + GST_BUFFER_DTS (header_buf) = GST_BUFFER_DTS (buf); + GST_BUFFER_DURATION (header_buf) = GST_BUFFER_DURATION (buf); + + gst_adapter_push 
(self->cache_out, header_buf); + gst_adapter_push (self->cache_out, buf); + } + } + + sz = gst_adapter_available (self->cache_out); + if (sz) { + buf = gst_adapter_take_buffer (self->cache_out, sz); + gst_buffer_copy_into (buf, buffer, GST_BUFFER_COPY_METADATA, 0, -1); + if (self->discont) { + GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_DISCONT); + self->discont = FALSE; + } + if (self->header) { + GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_HEADER); + self->header = FALSE; + } + if (self->keyframe) { + GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); + self->keyframe = FALSE; + } else { + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); + } + + if (frame_finished) + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_MARKER); + + if (self->align == GST_AV1_PARSE_ALIGN_FRAME) { + if (!self->show_frame) { + GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_DECODE_ONLY); + } else { + GST_BUFFER_FLAG_UNSET (buf, GST_BUFFER_FLAG_DECODE_ONLY); + } + } + + gst_buffer_replace (&frame->out_buffer, buf); + gst_buffer_unref (buf); + + gst_av1_parse_update_src_caps (self, NULL); + GST_LOG_OBJECT (self, "consumed %d, output one buffer with size %" + G_GSSIZE_FORMAT, finish_sz, sz); + ret = gst_base_parse_finish_frame (GST_BASE_PARSE (self), frame, finish_sz); + } + + return ret; +} + +static void +gst_av1_parse_convert_to_annexb (GstAV1Parse * self, GstBuffer * buffer, + GstAV1OBU * obu, gboolean frame_complete) +{ + guint8 size_data[GST_AV1_MAX_LEB_128_SIZE]; + guint size_len = 0; + GstBitWriter bs; + GstBuffer *buf, *buf2; + guint8 *data; + guint len, len2, offset; + + /* obu_length */ + _write_leb128 (size_data, &size_len, + obu->obu_size + 1 + obu->header.obu_extention_flag); + + gst_bit_writer_init_with_size (&bs, 128, FALSE); + /* obu_forbidden_bit */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 1); + /* obu_type */ + gst_bit_writer_put_bits_uint8 (&bs, obu->obu_type, 4); + /* obu_extension_flag */ + gst_bit_writer_put_bits_uint8 (&bs, obu->header.obu_extention_flag, 1); + 
/* obu_has_size_field */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 1); + /* obu_reserved_1bit */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 1); + if (obu->header.obu_extention_flag) { + /* temporal_id */ + gst_bit_writer_put_bits_uint8 (&bs, obu->header.obu_temporal_id, 3); + /* spatial_id */ + gst_bit_writer_put_bits_uint8 (&bs, obu->header.obu_spatial_id, 2); + /* extension_header_reserved_3bits */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 3); + } + g_assert (GST_BIT_WRITER_BIT_SIZE (&bs) % 8 == 0); + + len = size_len; + len += GST_BIT_WRITER_BIT_SIZE (&bs) / 8; + len += obu->obu_size; + + data = g_malloc (len); + offset = 0; + + memcpy (data + offset, size_data, size_len); + offset += size_len; + + memcpy (data + offset, GST_BIT_WRITER_DATA (&bs), + GST_BIT_WRITER_BIT_SIZE (&bs) / 8); + offset += GST_BIT_WRITER_BIT_SIZE (&bs) / 8; + + memcpy (data + offset, obu->data, obu->obu_size); + + /* The buf of this OBU */ + buf = gst_buffer_new_wrapped (data, len); + GST_BUFFER_PTS (buf) = GST_BUFFER_PTS (buffer); + GST_BUFFER_DTS (buf) = GST_BUFFER_DTS (buffer); + GST_BUFFER_DURATION (buf) = GST_BUFFER_DURATION (buffer); + + gst_adapter_push (self->frame_cache, buf); + + if (frame_complete) { + len2 = gst_adapter_available (self->frame_cache); + buf2 = gst_adapter_take_buffer (self->frame_cache, len2); + + /* frame_unit_size */ + _write_leb128 (size_data, &size_len, len2); + buf = gst_buffer_new_memdup (size_data, size_len); + GST_BUFFER_PTS (buf) = GST_BUFFER_PTS (buf2); + GST_BUFFER_DTS (buf) = GST_BUFFER_DTS (buf2); + GST_BUFFER_DURATION (buf) = GST_BUFFER_DURATION (buf2); + + gst_adapter_push (self->cache_out, buf); + gst_adapter_push (self->cache_out, buf2); + } + + gst_bit_writer_reset (&bs); +} + +static void +gst_av1_parse_convert_from_annexb (GstAV1Parse * self, GstBuffer * buffer, + GstAV1OBU * obu) +{ + guint8 size_data[GST_AV1_MAX_LEB_128_SIZE]; + guint size_len = 0; + GstBuffer *buf; + guint len, offset; + guint8 *data; + GstBitWriter bs; + + _write_leb128 
(size_data, &size_len, obu->obu_size); + + /* obu_header */ + len = 1; + if (obu->header.obu_extention_flag) + len += 1; + len += size_len; + len += obu->obu_size; + + gst_bit_writer_init_with_size (&bs, 128, FALSE); + /* obu_forbidden_bit */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 1); + /* obu_type */ + gst_bit_writer_put_bits_uint8 (&bs, obu->obu_type, 4); + /* obu_extension_flag */ + gst_bit_writer_put_bits_uint8 (&bs, obu->header.obu_extention_flag, 1); + /* obu_has_size_field */ + gst_bit_writer_put_bits_uint8 (&bs, 1, 1); + /* obu_reserved_1bit */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 1); + if (obu->header.obu_extention_flag) { + /* temporal_id */ + gst_bit_writer_put_bits_uint8 (&bs, obu->header.obu_temporal_id, 3); + /* spatial_id */ + gst_bit_writer_put_bits_uint8 (&bs, obu->header.obu_spatial_id, 2); + /* extension_header_reserved_3bits */ + gst_bit_writer_put_bits_uint8 (&bs, 0, 3); + } + g_assert (GST_BIT_WRITER_BIT_SIZE (&bs) % 8 == 0); + + data = g_malloc (len); + offset = 0; + memcpy (data + offset, GST_BIT_WRITER_DATA (&bs), + GST_BIT_WRITER_BIT_SIZE (&bs) / 8); + offset += GST_BIT_WRITER_BIT_SIZE (&bs) / 8; + + memcpy (data + offset, size_data, size_len); + offset += size_len; + + memcpy (data + offset, obu->data, obu->obu_size); + + buf = gst_buffer_new_wrapped (data, len); + GST_BUFFER_PTS (buf) = GST_BUFFER_PTS (buffer); + GST_BUFFER_DTS (buf) = GST_BUFFER_DTS (buffer); + GST_BUFFER_DURATION (buf) = GST_BUFFER_DURATION (buffer); + + gst_adapter_push (self->cache_out, buf); + + gst_bit_writer_reset (&bs); +} + +static void +gst_av1_parse_cache_one_obu (GstAV1Parse * self, GstBuffer * buffer, + GstAV1OBU * obu, guint8 * data, guint32 size, gboolean frame_complete) +{ + gboolean need_convert = FALSE; + GstBuffer *buf; + + if (self->in_align != self->align + && (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B + || self->align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B)) + need_convert = TRUE; + + if (need_convert) { + if 
(self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) { + gst_av1_parse_convert_from_annexb (self, buffer, obu); + } else { + gst_av1_parse_convert_to_annexb (self, buffer, obu, frame_complete); + } + } else if (self->align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) { + g_assert (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B); + gst_av1_parse_convert_to_annexb (self, buffer, obu, frame_complete); + } else { + buf = gst_buffer_new_memdup (data, size); + GST_BUFFER_PTS (buf) = GST_BUFFER_PTS (buffer); + GST_BUFFER_DTS (buf) = GST_BUFFER_DTS (buffer); + GST_BUFFER_DURATION (buf) = GST_BUFFER_DURATION (buffer); + + gst_adapter_push (self->cache_out, buf); + } +} + +static GstAV1ParserResult +gst_av1_parse_handle_sequence_obu (GstAV1Parse * self, GstAV1OBU * obu) +{ + GstAV1SequenceHeaderOBU seq_header; + GstAV1ParserResult res; + guint i; + guint val; + + res = gst_av1_parser_parse_sequence_header_obu (self->parser, + obu, &seq_header); + if (res != GST_AV1_PARSER_OK) + return res; + + if (self->width != seq_header.max_frame_width_minus_1 + 1) { + self->width = seq_header.max_frame_width_minus_1 + 1; + self->update_caps = TRUE; + } + if (self->height != seq_header.max_frame_height_minus_1 + 1) { + self->height = seq_header.max_frame_height_minus_1 + 1; + self->update_caps = TRUE; + } + + if (seq_header.color_config.color_description_present_flag) { + GstVideoColorimetry cinfo; + gboolean have_cinfo = TRUE; + gchar *colorimetry = NULL; + + if (have_cinfo) { + if (seq_header.color_config.color_range) + cinfo.range = GST_VIDEO_COLOR_RANGE_16_235; + else + cinfo.range = GST_VIDEO_COLOR_RANGE_0_255; + + cinfo.matrix = gst_video_color_matrix_from_iso + (seq_header.color_config.matrix_coefficients); + cinfo.transfer = gst_video_transfer_function_from_iso + (seq_header.color_config.transfer_characteristics); + cinfo.primaries = gst_video_color_primaries_from_iso + (seq_header.color_config.color_primaries); + colorimetry = 
gst_video_colorimetry_to_string (&cinfo); + } + + if (g_strcmp0 (colorimetry, self->colorimetry)) { + g_clear_pointer (&self->colorimetry, g_free); + self->colorimetry = colorimetry; + self->update_caps = TRUE; + } + } + + if (self->subsampling_x != seq_header.color_config.subsampling_x) { + self->subsampling_x = seq_header.color_config.subsampling_x; + self->update_caps = TRUE; + } + + if (self->subsampling_y != seq_header.color_config.subsampling_y) { + self->subsampling_y = seq_header.color_config.subsampling_y; + self->update_caps = TRUE; + } + + if (self->mono_chrome != seq_header.color_config.mono_chrome) { + self->mono_chrome = seq_header.color_config.mono_chrome; + self->update_caps = TRUE; + } + + if (self->bit_depth != seq_header.bit_depth) { + self->bit_depth = seq_header.bit_depth; + self->update_caps = TRUE; + } + + if (self->profile != seq_header.seq_profile) { + self->profile = seq_header.seq_profile; + self->update_caps = TRUE; + } + + val = (self->parser->state.operating_point_idc >> 8) & 0x0f; + for (i = 0; i < (1 << GST_AV1_MAX_SPATIAL_LAYERS); i++) { + if (val & (1 << i)) + self->highest_spatial_id = i; + } + + return GST_AV1_PARSER_OK; +} + +/* Check whether the frame starts a new TU. + The OBU here should be a shown frame or a frame header. */ +static gboolean +gst_av1_parse_frame_start_new_temporal_unit (GstAV1Parse * self, + GstAV1OBU * obu) +{ + gboolean ret = FALSE; + + g_assert (obu->obu_type == GST_AV1_OBU_FRAME_HEADER + || obu->obu_type == GST_AV1_OBU_FRAME); + + /* 7.5. Ordering of OBUs: The value of temporal_id must be the same in all + OBU extension headers that are contained in the same temporal unit. */ + if (self->last_shown_frame_temporal_id >= 0 && + obu->header.obu_temporal_id != self->last_shown_frame_temporal_id) { + ret = TRUE; + goto new_tu; + } + + /* If scalability is not being used, there is only one shown frame for each + temporal unit. So the new frame belongs to a new temporal unit. 
*/ + if (!self->within_one_frame && self->last_shown_frame_temporal_id >= 0 && + self->parser->state.operating_point_idc == 0) { + ret = TRUE; + goto new_tu; + } + + /* The new frame has the same layer IDs as the last shown frame, + so it should belong to a new temporal unit. */ + if (!self->within_one_frame && + obu->header.obu_temporal_id == self->last_shown_frame_temporal_id && + obu->header.obu_spatial_id == self->last_shown_frame_spatial_id) { + ret = TRUE; + goto new_tu; + } + +new_tu: + if (ret) { + if (self->within_one_frame) + GST_WARNING_OBJECT (self, + "Start a new temporal unit with an incomplete frame."); + + gst_av1_parse_reset_obu_data_state (self); + } + + return ret; +} + +/* frame_complete will be set to TRUE if this is the frame edge. */ +static GstAV1ParserResult +gst_av1_parse_handle_one_obu (GstAV1Parse * self, GstAV1OBU * obu, + gboolean * frame_complete, gboolean * check_new_tu) +{ + GstAV1ParserResult res = GST_AV1_PARSER_OK; + GstAV1MetadataOBU metadata; + GstAV1FrameHeaderOBU frame_header; + GstAV1TileListOBU tile_list; + GstAV1TileGroupOBU tile_group; + GstAV1FrameOBU frame; + + *frame_complete = FALSE; + + switch (obu->obu_type) { + case GST_AV1_OBU_TEMPORAL_DELIMITER: + res = gst_av1_parser_parse_temporal_delimiter_obu (self->parser, obu); + break; + case GST_AV1_OBU_SEQUENCE_HEADER: + res = gst_av1_parse_handle_sequence_obu (self, obu); + break; + case GST_AV1_OBU_REDUNDANT_FRAME_HEADER: + res = gst_av1_parser_parse_frame_header_obu (self->parser, obu, + &frame_header); + break; + case GST_AV1_OBU_FRAME_HEADER: + res = gst_av1_parser_parse_frame_header_obu (self->parser, obu, + &frame_header); + break; + case GST_AV1_OBU_FRAME: + res = gst_av1_parser_parse_frame_obu (self->parser, obu, &frame); + break; + case GST_AV1_OBU_METADATA: + res = gst_av1_parser_parse_metadata_obu (self->parser, obu, &metadata); + break; + case GST_AV1_OBU_TILE_GROUP: + res = + gst_av1_parser_parse_tile_group_obu (self->parser, obu, &tile_group); + break; + case 
GST_AV1_OBU_TILE_LIST: + res = gst_av1_parser_parse_tile_list_obu (self->parser, obu, &tile_list); + break; + case GST_AV1_OBU_PADDING: + break; + default: + GST_WARNING_OBJECT (self, "an unrecognized obu type %d", obu->obu_type); + res = GST_AV1_PARSER_BITSTREAM_ERROR; + break; + } + + GST_LOG_OBJECT (self, "parsing the obu %s, result is %d", + _obu_name (obu->obu_type), res); + if (res != GST_AV1_PARSER_OK) + goto out; + + /* 7.5: + All OBU extension headers that are contained in the same temporal + unit and have the same spatial_id value must have the same temporal_id + value. + And + OBUs with spatial level IDs (spatial_id) greater than 0 must + appear within a temporal unit in increasing order of the spatial + level ID values. */ + if (obu->header.obu_spatial_id > self->highest_spatial_id) { + GST_WARNING_OBJECT (self, + "spatial_id %d is bigger than highest_spatial_id %d", + obu->header.obu_spatial_id, self->highest_spatial_id); + res = GST_AV1_PARSER_BITSTREAM_ERROR; + goto out; + } + + /* If checking whether a new temporal unit starts, return early. + In 7.5. Ordering of OBUs: Sequence header OBUs may appear in any order + within a coded video sequence. So it is allowed to repeat the sequence + header within one temporal unit, and a sequence header does not necessarily + start a TU. We only check the TD here. 
*/ + if (obu->obu_type == GST_AV1_OBU_TEMPORAL_DELIMITER) { + gst_av1_parse_reset_obu_data_state (self); + + if (check_new_tu) { + *check_new_tu = TRUE; + res = GST_AV1_PARSER_OK; + goto out; + } + } + + if (obu->obu_type == GST_AV1_OBU_SEQUENCE_HEADER) + self->header = TRUE; + + if (obu->obu_type == GST_AV1_OBU_FRAME_HEADER + || obu->obu_type == GST_AV1_OBU_FRAME + || obu->obu_type == GST_AV1_OBU_REDUNDANT_FRAME_HEADER) { + GstAV1FrameHeaderOBU *fh = &frame_header; + + if (obu->obu_type == GST_AV1_OBU_FRAME) + fh = &frame.frame_header; + + self->show_frame = fh->show_frame || fh->show_existing_frame; + if (self->show_frame) { + /* Check whether a new temporal starts, and return early. */ + if (check_new_tu && obu->obu_type != GST_AV1_OBU_REDUNDANT_FRAME_HEADER + && gst_av1_parse_frame_start_new_temporal_unit (self, obu)) { + *check_new_tu = TRUE; + res = GST_AV1_PARSER_OK; + goto out; + } + + self->last_shown_frame_temporal_id = obu->header.obu_temporal_id; + self->last_shown_frame_spatial_id = obu->header.obu_spatial_id; + } + + self->within_one_frame = TRUE; + + /* if a show_existing_frame case, only update key frame. + otherwise, update all type of frame. 
*/ + if (!fh->show_existing_frame || fh->frame_type == GST_AV1_KEY_FRAME) + res = gst_av1_parser_reference_frame_update (self->parser, fh); + + if (res != GST_AV1_PARSER_OK) + GST_WARNING_OBJECT (self, "update frame get result %d", res); + + if (fh->show_existing_frame) { + *frame_complete = TRUE; + self->within_one_frame = FALSE; + } + + if (fh->frame_type == GST_AV1_KEY_FRAME) + self->keyframe = TRUE; + } + + if (obu->obu_type == GST_AV1_OBU_TILE_GROUP + || obu->obu_type == GST_AV1_OBU_FRAME) { + GstAV1TileGroupOBU *tg = &tile_group; + + self->within_one_frame = TRUE; + + if (obu->obu_type == GST_AV1_OBU_FRAME) + tg = &frame.tile_group; + + if (tg->tg_end == tg->num_tiles - 1) { + *frame_complete = TRUE; + self->within_one_frame = FALSE; + } + } + +out: + if (res != GST_AV1_PARSER_OK) { + /* Some verbose OBU can be skip */ + if (obu->obu_type == GST_AV1_OBU_REDUNDANT_FRAME_HEADER) { + GST_WARNING_OBJECT (self, "Ignore a verbose %s OBU parsing error", + _obu_name (obu->obu_type)); + gst_av1_parse_reset_obu_data_state (self); + res = GST_AV1_PARSER_OK; + } + } + + return res; +} + +static GstFlowReturn +gst_av1_parse_handle_obu_to_obu (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + GstMapInfo map_info; + GstAV1OBU obu; + GstFlowReturn ret = GST_FLOW_OK; + GstAV1ParserResult res; + GstBuffer *buffer = gst_buffer_ref (frame->buffer); + guint32 consumed; + gboolean frame_complete; + + if (!gst_buffer_map (buffer, &map_info, GST_MAP_READ)) { + *skipsize = 0; + GST_ERROR_OBJECT (parse, "Couldn't map incoming buffer"); + return GST_FLOW_ERROR; + } + + consumed = 0; + frame_complete = FALSE; + res = gst_av1_parser_identify_one_obu (self->parser, map_info.data, + map_info.size, &obu, &consumed); + if (res == GST_AV1_PARSER_OK) + res = gst_av1_parse_handle_one_obu (self, &obu, &frame_complete, NULL); + + g_assert (consumed <= map_info.size); + + if (res == GST_AV1_PARSER_BITSTREAM_ERROR || + res == 
GST_AV1_PARSER_MISSING_OBU_REFERENCE) { + if (consumed) { + *skipsize = consumed; + } else { + *skipsize = map_info.size; + } + GST_WARNING_OBJECT (parse, "Parse obu error, discard %d.", *skipsize); + gst_av1_parse_reset_obu_data_state (self); + ret = GST_FLOW_OK; + goto out; + } else if (res == GST_AV1_PARSER_NO_MORE_DATA) { + *skipsize = 0; + + if (self->in_align == GST_AV1_PARSE_ALIGN_OBU) { + /* The buffer is already aligned to OBU, should not happen. */ + if (consumed) { + *skipsize = consumed; + } else { + *skipsize = map_info.size; + } + GST_WARNING_OBJECT (parse, "Parse obu need more data, discard %d.", + *skipsize); + gst_av1_parse_reset_obu_data_state (self); + } + ret = GST_FLOW_OK; + goto out; + } else if (res == GST_AV1_PARSER_DROP) { + GST_DEBUG_OBJECT (parse, "Drop %d data", consumed); + *skipsize = consumed; + gst_av1_parse_reset_obu_data_state (self); + ret = GST_FLOW_OK; + goto out; + } else if (res != GST_AV1_PARSER_OK) { + GST_ERROR_OBJECT (parse, "Parse obu get unexpect error %d", res); + *skipsize = 0; + ret = GST_FLOW_ERROR; + goto out; + } + + g_assert (consumed); + + gst_av1_parse_update_src_caps (self, NULL); + if (self->discont) { + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DISCONT); + self->discont = FALSE; + } + if (self->header) { + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_HEADER); + self->header = FALSE; + } + /* happen to be a frame boundary */ + if (frame_complete) + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_MARKER); + + GST_LOG_OBJECT (self, "Output one buffer with size %d", consumed); + ret = gst_base_parse_finish_frame (parse, frame, consumed); + *skipsize = 0; + +out: + gst_buffer_unmap (buffer, &map_info); + gst_buffer_unref (buffer); + return ret; +} + +static GstFlowReturn +gst_av1_parse_handle_to_small_and_equal_align (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + GstMapInfo map_info; + GstAV1OBU obu; + GstFlowReturn ret = GST_FLOW_OK; + 
GstAV1ParserResult res; + GstBuffer *buffer = gst_buffer_ref (frame->buffer); + guint32 total_consumed, consumed; + gboolean frame_complete; + + if (!gst_buffer_map (buffer, &map_info, GST_MAP_READ)) { + GST_ERROR_OBJECT (parse, "Couldn't map incoming buffer"); + return GST_FLOW_ERROR; + } + + total_consumed = 0; + frame_complete = FALSE; +again: + while (total_consumed < map_info.size) { + res = gst_av1_parser_identify_one_obu (self->parser, + map_info.data + total_consumed, map_info.size - total_consumed, + &obu, &consumed); + if (res == GST_AV1_PARSER_OK) + res = gst_av1_parse_handle_one_obu (self, &obu, &frame_complete, NULL); + if (res != GST_AV1_PARSER_OK) + break; + + if (obu.obu_type == GST_AV1_OBU_TEMPORAL_DELIMITER && total_consumed) { + GST_DEBUG_OBJECT (self, "Encounter TD inside one %s aligned" + " buffer, should not happen normally.", + gst_av1_parse_alignment_to_string (self->in_align)); + frame_complete = TRUE; + if (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) + gst_av1_parser_reset_annex_b (self->parser); + /* Not include this TD obu, it should belong to the next TU or frame */ + break; + } + + gst_av1_parse_cache_one_obu (self, buffer, &obu, + map_info.data + total_consumed, consumed, frame_complete); + + total_consumed += consumed; + + if (self->align == GST_AV1_PARSE_ALIGN_OBU) + break; + + if (self->align == GST_AV1_PARSE_ALIGN_FRAME && frame_complete) + break; + } + + if (res == GST_AV1_PARSER_BITSTREAM_ERROR || + res == GST_AV1_PARSER_MISSING_OBU_REFERENCE) { + /* Discard the whole frame */ + *skipsize = map_info.size; + GST_WARNING_OBJECT (parse, "Parse obu error, discard %d", *skipsize); + if (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) + gst_av1_parser_reset_annex_b (self->parser); + gst_av1_parse_reset_obu_data_state (self); + ret = GST_FLOW_OK; + goto out; + } else if (res == GST_AV1_PARSER_NO_MORE_DATA) { + /* Discard the whole buffer */ + *skipsize = map_info.size; + GST_WARNING_OBJECT (parse, "Parse 
obu need more data, discard %d.", + *skipsize); + if (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) + gst_av1_parser_reset_annex_b (self->parser); + + gst_av1_parse_reset_obu_data_state (self); + ret = GST_FLOW_OK; + goto out; + } else if (res == GST_AV1_PARSER_DROP) { + GST_DEBUG_OBJECT (parse, "Drop %d data", consumed); + total_consumed += consumed; + gst_av1_parse_reset_obu_data_state (self); + res = GST_AV1_PARSER_OK; + goto again; + } else if (res != GST_AV1_PARSER_OK) { + GST_ERROR_OBJECT (parse, "Parse obu get unexpect error %d", res); + *skipsize = 0; + ret = GST_FLOW_ERROR; + goto out; + } + + g_assert (total_consumed >= map_info.size || frame_complete + || self->align == GST_AV1_PARSE_ALIGN_OBU); + + if (total_consumed >= map_info.size && !frame_complete + && self->align == GST_AV1_PARSE_ALIGN_FRAME) { + /* Warning and still consider this frame as complete */ + GST_WARNING_OBJECT (self, "Exhaust the buffer but still incomplete frame," + " should not happend in %s alignment", + gst_av1_parse_alignment_to_string (self->in_align)); + } + + ret = gst_av1_parse_push_data (self, frame, total_consumed, frame_complete); + +out: + gst_buffer_unmap (buffer, &map_info); + gst_buffer_unref (buffer); + return ret; +} + +static GstFlowReturn +gst_av1_parse_handle_to_big_align (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + GstMapInfo map_info; + GstAV1OBU obu; + GstFlowReturn ret = GST_FLOW_OK; + GstAV1ParserResult res = GST_AV1_PARSER_OK; + GstBuffer *buffer = gst_buffer_ref (frame->buffer); + guint32 consumed; + gboolean frame_complete; + gboolean check_new_tu; + gboolean complete; + + g_assert (self->in_align <= GST_AV1_PARSE_ALIGN_FRAME); + + if (!gst_buffer_map (buffer, &map_info, GST_MAP_READ)) { + *skipsize = 0; + GST_ERROR_OBJECT (parse, "Couldn't map incoming buffer"); + return GST_FLOW_ERROR; + } + + complete = FALSE; +again: + while (self->last_parsed_offset < 
map_info.size) { + res = gst_av1_parser_identify_one_obu (self->parser, + map_info.data + self->last_parsed_offset, + map_info.size - self->last_parsed_offset, &obu, &consumed); + if (res != GST_AV1_PARSER_OK) + break; + + check_new_tu = FALSE; + res = gst_av1_parse_handle_one_obu (self, &obu, &frame_complete, + &check_new_tu); + if (res != GST_AV1_PARSER_OK) + break; + + if (check_new_tu && (gst_adapter_available (self->cache_out) || + gst_adapter_available (self->frame_cache))) { + complete = TRUE; + break; + } + + if (self->align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT || + self->align == GST_AV1_PARSE_ALIGN_FRAME) { + GstBuffer *buf = gst_buffer_copy_region (buffer, GST_BUFFER_COPY_ALL, + self->last_parsed_offset, consumed); + gst_adapter_push (self->cache_out, buf); + } else if (self->align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) { + gst_av1_parse_convert_to_annexb (self, buffer, &obu, frame_complete); + } else { + g_assert_not_reached (); + } + self->last_parsed_offset += consumed; + + if (self->align == GST_AV1_PARSE_ALIGN_FRAME && frame_complete) + complete = TRUE; + + if (complete) + break; + } + + /* Finish a complete frame anyway */ + if (complete || GST_BASE_PARSE_DRAINING (parse)) { + *skipsize = 0; + + /* push the left anyway if no error */ + if (res == GST_AV1_PARSER_OK) + ret = gst_av1_parse_push_data (self, frame, + self->last_parsed_offset, TRUE); + + self->last_parsed_offset = 0; + + goto out; + } + + if (res == GST_AV1_PARSER_BITSTREAM_ERROR || + res == GST_AV1_PARSER_MISSING_OBU_REFERENCE) { + *skipsize = map_info.size; + GST_WARNING_OBJECT (parse, "Parse obu error, discard whole buffer %d.", + *skipsize); + /* The adapter will be cleared in next loop because of + GST_BASE_PARSE_FRAME_FLAG_NEW_FRAME flag */ + gst_av1_parse_reset_obu_data_state (self); + ret = GST_FLOW_OK; + } else if (res == GST_AV1_PARSER_NO_MORE_DATA) { + *skipsize = 0; + + if (self->in_align >= GST_AV1_PARSE_ALIGN_OBU) { + /* The buffer is already aligned to OBU, should 
not happen. + The adapter will be cleared in next loop because of + GST_BASE_PARSE_FRAME_FLAG_NEW_FRAME flag */ + *skipsize = map_info.size; + gst_av1_parse_reset_obu_data_state (self); + GST_WARNING_OBJECT (parse, + "Parse obu need more data, discard whole buffer %d.", *skipsize); + } + ret = GST_FLOW_OK; + } else if (res == GST_AV1_PARSER_DROP) { + GST_DEBUG_OBJECT (parse, "Drop %d data", consumed); + self->last_parsed_offset += consumed; + gst_av1_parse_reset_obu_data_state (self); + res = GST_AV1_PARSER_OK; + goto again; + } else if (res == GST_AV1_PARSER_OK) { + /* Everything is correct but still not get a frame or tu, + need more data */ + GST_DEBUG_OBJECT (parse, "Need more data"); + *skipsize = 0; + ret = GST_FLOW_OK; + } else { + GST_ERROR_OBJECT (parse, "Parse obu get unexpect error %d", res); + *skipsize = 0; + ret = GST_FLOW_ERROR; + } + +out: + gst_buffer_unmap (buffer, &map_info); + gst_buffer_unref (buffer); + return ret; +} + +/* Try to recognize whether the input is annex-b format. 
*/ +static GstFlowReturn +gst_av1_parse_detect_alignment (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize, guint32 * total_consumed) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + GstMapInfo map_info; + GstAV1OBU obu; + GstAV1ParserResult res; + GstBuffer *buffer = gst_buffer_ref (frame->buffer); + gboolean got_seq, got_frame; + gboolean frame_complete; + guint32 consumed; + guint32 frame_sz; + GstFlowReturn ret = GST_FLOW_OK; + + if (!gst_buffer_map (buffer, &map_info, GST_MAP_READ)) { + *skipsize = 0; + GST_ERROR_OBJECT (parse, "Couldn't map incoming buffer"); + return GST_FLOW_ERROR; + } + + gst_av1_parser_reset (self->parser, FALSE); + + /* Detect the alignment obu first */ + got_seq = FALSE; + got_frame = FALSE; + *total_consumed = 0; +again: + while (*total_consumed < map_info.size) { + res = gst_av1_parser_identify_one_obu (self->parser, + map_info.data + *total_consumed, map_info.size - *total_consumed, + &obu, &consumed); + if (res == GST_AV1_PARSER_OK) { + *total_consumed += consumed; + res = gst_av1_parse_handle_one_obu (self, &obu, &frame_complete, NULL); + } + + if (res != GST_AV1_PARSER_OK) + break; + + if (obu.obu_type == GST_AV1_OBU_SEQUENCE_HEADER) + got_seq = TRUE; + if (obu.obu_type == GST_AV1_OBU_REDUNDANT_FRAME_HEADER || + obu.obu_type == GST_AV1_OBU_FRAME || + obu.obu_type == GST_AV1_OBU_FRAME_HEADER) + got_frame = TRUE; + + if (got_seq || got_frame) + break; + } + + gst_av1_parser_reset (self->parser, FALSE); + + if (res == GST_AV1_PARSER_OK || res == GST_AV1_PARSER_NO_MORE_DATA) { + *skipsize = 0; + + /* If succeed recognize seq or frame, we can decide, + otherwise, just skipsize to 0 and get more data. */ + if (got_seq || got_frame) + self->in_align = GST_AV1_PARSE_ALIGN_BYTE; + + ret = GST_FLOW_OK; + goto out; + } else if (res == GST_AV1_PARSER_DROP) { + *total_consumed += consumed; + res = GST_AV1_PARSER_OK; + gst_av1_parse_reset_obu_data_state (self); + goto again; + } + + /* Try the annexb. 
The buffer should hold the whole frame, and + the buffer start with the frame size in leb128() format. */ + if (map_info.size < 8) { + /* Get more data. */ + *skipsize = 0; + ret = GST_FLOW_OK; + goto out; + } + + frame_sz = _read_leb128 (map_info.data, &res, &consumed); + if (frame_sz == 0 || res != GST_AV1_PARSER_OK) { + /* Both modes does not match, we can decide a error */ + ret = GST_FLOW_ERROR; + goto out; + } + + if (frame_sz + consumed != map_info.size) { + GST_DEBUG_OBJECT (self, "Buffer size %" G_GSSIZE_FORMAT ", frame size %d," + " consumed %d, does not match annex b format.", + map_info.size, frame_sz, consumed); + /* Both modes does not match, we can decide a error */ + ret = GST_FLOW_ERROR; + goto out; + } + + self->in_align = GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B; + gst_av1_parser_reset (self->parser, TRUE); + ret = GST_FLOW_OK; + +out: + gst_av1_parse_reset_obu_data_state (self); + gst_buffer_unmap (buffer, &map_info); + gst_buffer_unref (buffer); + return ret; +} + +static GstFlowReturn +gst_av1_parse_handle_frame (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize) +{ + GstAV1Parse *self = GST_AV1_PARSE (parse); + GstFlowReturn ret = GST_FLOW_OK; + guint in_level, out_level; + + if (GST_BUFFER_FLAG_IS_SET (frame->buffer, GST_BUFFER_FLAG_DISCONT)) { + self->discont = TRUE; + + if (frame->flags & GST_BASE_PARSE_FRAME_FLAG_NEW_FRAME) + gst_av1_parse_reset_obu_data_state (self); + } else { + self->discont = FALSE; + } + + GST_LOG_OBJECT (self, "Input frame size %" G_GSSIZE_FORMAT, + gst_buffer_get_size (frame->buffer)); + + /* avoid stale cached parsing state */ + if (frame->flags & GST_BASE_PARSE_FRAME_FLAG_NEW_FRAME) { + GST_LOG_OBJECT (self, "parsing new frame"); + gst_adapter_clear (self->cache_out); + gst_adapter_clear (self->frame_cache); + self->last_parsed_offset = 0; + self->header = FALSE; + self->keyframe = FALSE; + self->show_frame = FALSE; + } else { + GST_LOG_OBJECT (self, "resuming frame parsing"); + } + + /* When in 
pull mode, the sink pad has no caps, we may get the + caps by query the upstream element */ + if (self->in_align == GST_AV1_PARSE_ALIGN_NONE) { + GstCaps *upstream_caps; + + upstream_caps = + gst_pad_peer_query_caps (GST_BASE_PARSE_SINK_PAD (self), NULL); + if (upstream_caps) { + if (!gst_caps_is_empty (upstream_caps) + && !gst_caps_is_any (upstream_caps)) { + GST_LOG_OBJECT (self, "upstream caps: %" GST_PTR_FORMAT, upstream_caps); + /* fixate to avoid ambiguity with lists when parsing */ + upstream_caps = gst_caps_fixate (upstream_caps); + self->in_align = gst_av1_parse_alignment_from_caps (upstream_caps); + } + + gst_caps_unref (upstream_caps); + + gst_av1_parser_reset (self->parser, + self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B); + } + + if (self->in_align != GST_AV1_PARSE_ALIGN_NONE) + GST_LOG_OBJECT (self, "Query the upstream get the alignment %d", + self->in_align); + } + + if (self->in_align == GST_AV1_PARSE_ALIGN_NONE) { + guint32 consumed = 0; + + /* Only happend at the first time of handle_frame, and the + alignment in the sink caps is unset. Try the default and + if error, try the annex B. */ + ret = gst_av1_parse_detect_alignment (parse, frame, skipsize, &consumed); + if (ret == GST_FLOW_OK && self->in_align != GST_AV1_PARSE_ALIGN_NONE) { + GST_INFO_OBJECT (self, "Detect the input alignment %d", self->in_align); + } else { + *skipsize = consumed > 0 ? 
consumed : gst_buffer_get_size (frame->buffer); + GST_WARNING_OBJECT (self, "Fail to detect the alignment, skip %d", + *skipsize); + return GST_FLOW_OK; + } + } + + /* We may in pull mode and no caps is set */ + if (self->align == GST_AV1_PARSE_ALIGN_NONE) + gst_av1_parse_negotiate (self, NULL); + + in_level = self->in_align; + if (self->in_align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) + in_level = GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT; + out_level = self->align; + if (self->align == GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT_ANNEX_B) + out_level = GST_AV1_PARSE_ALIGN_TEMPORAL_UNIT; + + if (self->in_align <= GST_AV1_PARSE_ALIGN_OBU + && self->align == GST_AV1_PARSE_ALIGN_OBU) { + ret = gst_av1_parse_handle_obu_to_obu (parse, frame, skipsize); + } else if (in_level < out_level) { + ret = gst_av1_parse_handle_to_big_align (parse, frame, skipsize); + } else { + ret = gst_av1_parse_handle_to_small_and_equal_align (parse, + frame, skipsize); + } + + return ret; +}
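The annex-b alignment detection in `gst_av1_parse_detect_alignment` above hinges on reading the temporal-unit size prefix with `_read_leb128`. As a hedged illustration of that encoding (AV1 spec section 4.10.5, leb128()), and not the actual GStreamer `_read_leb128` implementation, a minimal decoder looks like this; `leb128_decode` is a hypothetical helper name:

```c
#include <stddef.h>
#include <stdint.h>

/* Decode an AV1 leb128() value: up to 8 bytes, 7 payload bits per byte,
 * least-significant group first, high bit set on all but the last byte.
 * Stores the number of bytes consumed; *ok is 0 on a truncated buffer
 * or when the continuation bit is still set after 8 bytes. */
static uint64_t
leb128_decode (const uint8_t * data, size_t size, uint32_t * consumed,
    int *ok)
{
  uint64_t value = 0;
  uint32_t i;

  *ok = 0;
  *consumed = 0;

  for (i = 0; i < 8 && i < size; i++) {
    uint8_t byte = data[i];

    value |= ((uint64_t) (byte & 0x7f)) << (i * 7);
    (*consumed)++;

    if (!(byte & 0x80)) {
      *ok = 1;
      return value;
    }
  }

  /* Ran out of input, or an over-long encoding. */
  return 0;
}
```

This matches the check performed above: in annex-b mode the buffer must start with a valid size whose value plus the prefix length equals the whole buffer size, otherwise the byte-stream path is assumed.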
gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstav1parse.h
Added
@@ -0,0 +1,34 @@ +/* GStreamer + * Copyright (C) 2020 He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_AV1_PARSE_H__ +#define __GST_AV1_PARSE_H__ + +#include <gst/gst.h> +#include <gst/base/gstbaseparse.h> + +G_BEGIN_DECLS + +#define GST_TYPE_AV1_PARSE (gst_av1_parse_get_type()) +G_DECLARE_FINAL_TYPE (GstAV1Parse, + gst_av1_parse, GST, AV1_PARSE, GstBaseParse); + +G_END_DECLS + +#endif /* __GST_AV1_PARSE_H__ */
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gstdiracparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstdiracparse.c
Changed
@@ -38,6 +38,7 @@ #include <gst/base/base.h> #include <gst/pbutils/pbutils.h> #include <string.h> +#include "gstvideoparserselements.h" #include "gstdiracparse.h" #include "dirac_parse.h" @@ -96,6 +97,8 @@ #define parent_class gst_dirac_parse_parent_class G_DEFINE_TYPE (GstDiracParse, gst_dirac_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (diracparse, "diracparse", GST_RANK_NONE, + GST_TYPE_DIRAC_PARSE, videoparsers_element_init (plugin)); static void gst_dirac_parse_class_init (GstDiracParseClass * klass)
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gsth263parse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gsth263parse.c
Changed
@@ -31,6 +31,7 @@ #include <gst/base/base.h> #include <gst/pbutils/pbutils.h> +#include "gstvideoparserselements.h" #include "gsth263parse.h" #include <string.h> @@ -53,6 +54,9 @@ #define parent_class gst_h263_parse_parent_class G_DEFINE_TYPE (GstH263Parse, gst_h263_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (h263parse, "h263parse", + GST_RANK_PRIMARY + 1, GST_TYPE_H263_PARSE, + videoparsers_element_init (plugin)); static gboolean gst_h263_parse_start (GstBaseParse * parse); static gboolean gst_h263_parse_stop (GstBaseParse * parse);
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gsth264parse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gsth264parse.c
Changed
@@ -29,6 +29,7 @@ #include <gst/base/base.h> #include <gst/pbutils/pbutils.h> #include <gst/video/video.h> +#include "gstvideoparserselements.h" #include "gsth264parse.h" #include <string.h> @@ -98,6 +99,9 @@ #define parent_class gst_h264_parse_parent_class G_DEFINE_TYPE (GstH264Parse, gst_h264_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (h264parse, "h264parse", + GST_RANK_PRIMARY + 1, GST_TYPE_H264_PARSE, + videoparsers_element_init (plugin)); static void gst_h264_parse_finalize (GObject * object); @@ -167,7 +171,7 @@ "is attached to incoming buffer and also Picture Timing SEI exists " "in the bitstream. To make this property work, SPS must contain " "VUI and pic_struct_present_flag of VUI must be non-zero", - DEFAULT_CONFIG_INTERVAL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + DEFAULT_UPDATE_TIMECODE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); /* Override BaseParse vfuncs */ parse_class->start = GST_DEBUG_FUNCPTR (gst_h264_parse_start); @@ -200,7 +204,7 @@ h264parse->aud_needed = TRUE; h264parse->aud_insert = TRUE; - h264parse->update_timecode = FALSE; + h264parse->update_timecode = DEFAULT_UPDATE_TIMECODE; } static void @@ -233,6 +237,7 @@ h264parse->frame_start = FALSE; h264parse->have_sps_in_frame = FALSE; h264parse->have_pps_in_frame = FALSE; + h264parse->have_aud_in_frame = FALSE; gst_adapter_clear (h264parse->frame_out); } @@ -1126,6 +1131,7 @@ if (pres != GST_H264_PARSER_OK) return FALSE; h264parse->aud_needed = FALSE; + h264parse->have_aud_in_frame = TRUE; break; default: /* drop anything before the initial SPS */ @@ -1227,8 +1233,9 @@ parse_res = gst_h264_parser_identify_nalu_avc (h264parse->nalparser, map.data, 0, map.size, nl, &nalu); - /* there is no AUD in AVC, always enable insertion, the pre_push function - * will only add it once, and will only add it for byte-stream output. */ + /* Always enable AUD insertion per frame here. 
The pre_push function + * will only add it once, and will only add it for byte-stream output + * if AUD doesn't exist in the current frame */ h264parse->aud_insert = TRUE; while (parse_res == GST_H264_PARSER_OK) { @@ -2213,8 +2220,6 @@ s2 = gst_caps_get_structure (caps, 0); gst_structure_get_fraction (s2, "framerate", &h264parse->parsed_fps_n, &h264parse->parsed_fps_d); - gst_base_parse_set_frame_rate (GST_BASE_PARSE (h264parse), fps_num, - fps_den, 0, 0); /* If we know the frame duration, and if we are not in one of the zero * latency pattern, add one frame of latency */ @@ -2401,6 +2406,96 @@ gst_buffer_unref (buf); } +static GstClockTime +gst_h264_parse_get_duration (GstH264Parse * h264parse, gboolean frame) +{ + GstClockTime ret = GST_CLOCK_TIME_NONE; + GstH264SPS *sps = h264parse->nalparser->last_sps; + gint duration = 1; + + if (!frame) { + GST_LOG_OBJECT (h264parse, "no frame data -> 0 duration"); + ret = 0; + goto done; + } + + if (!sps) { + GST_DEBUG_OBJECT (h264parse, "referred SPS invalid"); + goto fps_duration; + } else if (!sps->vui_parameters_present_flag) { + GST_DEBUG_OBJECT (h264parse, "unable to compute duration: VUI not present"); + goto fps_duration; + } else if (!sps->vui_parameters.timing_info_present_flag) { + GST_DEBUG_OBJECT (h264parse, + "unable to compute duration: timing info not present"); + goto fps_duration; + } else if (sps->vui_parameters.time_scale == 0) { + GST_DEBUG_OBJECT (h264parse, + "unable to compute duration: time_scale = 0 " + "(this is forbidden in spec; bitstream probably contains error)"); + goto fps_duration; + } + + if (h264parse->sei_pic_struct_pres_flag && + h264parse->sei_pic_struct != (guint8) - 1) { + /* Note that when h264parse->sei_pic_struct == -1 (unspecified), there + * are ways to infer its value. This is related to computing the + * TopFieldOrderCnt and BottomFieldOrderCnt, which looks + * complicated and thus not implemented for the time being. 
Yet + * the value we have here is correct for many applications + */ + switch (h264parse->sei_pic_struct) { + case GST_H264_SEI_PIC_STRUCT_TOP_FIELD: + case GST_H264_SEI_PIC_STRUCT_BOTTOM_FIELD: + duration = 1; + break; + case GST_H264_SEI_PIC_STRUCT_FRAME: + case GST_H264_SEI_PIC_STRUCT_TOP_BOTTOM: + case GST_H264_SEI_PIC_STRUCT_BOTTOM_TOP: + duration = 2; + break; + case GST_H264_SEI_PIC_STRUCT_TOP_BOTTOM_TOP: + case GST_H264_SEI_PIC_STRUCT_BOTTOM_TOP_BOTTOM: + duration = 3; + break; + case GST_H264_SEI_PIC_STRUCT_FRAME_DOUBLING: + duration = 4; + break; + case GST_H264_SEI_PIC_STRUCT_FRAME_TRIPLING: + duration = 6; + break; + default: + GST_DEBUG_OBJECT (h264parse, + "h264parse->sei_pic_struct of unknown value %d. Not parsed", + h264parse->sei_pic_struct); + break; + } + } else { + duration = h264parse->field_pic_flag ? 1 : 2; + } + + GST_LOG_OBJECT (h264parse, "frame tick duration %d", duration); + + ret = gst_util_uint64_scale (duration * GST_SECOND, + sps->vui_parameters.num_units_in_tick, sps->vui_parameters.time_scale); + /* sanity check */ + if (ret < GST_MSECOND) { + GST_DEBUG_OBJECT (h264parse, "discarding dur %" GST_TIME_FORMAT, + GST_TIME_ARGS (ret)); + goto fps_duration; + } + +done: + return ret; + +fps_duration: + if (h264parse->parsed_fps_d > 0 && h264parse->parsed_fps_n > 0) + ret = + gst_util_uint64_scale (GST_SECOND, h264parse->parsed_fps_d, + h264parse->parsed_fps_n); + goto done; +} + static void gst_h264_parse_get_timestamp (GstH264Parse * h264parse, GstClockTime * out_ts, GstClockTime * out_dur, gboolean frame) @@ -2547,10 +2642,19 @@ /* don't mess with timestamps if provided by upstream, * particularly since our ts not that good they handle seeking etc */ - if (h264parse->do_ts) + if (h264parse->do_ts) { gst_h264_parse_get_timestamp (h264parse, &GST_BUFFER_DTS (buffer), &GST_BUFFER_DURATION (buffer), h264parse->frame_start); + } + + /* We don't want to let baseparse select a duration itself based + * solely on the framerate, as we have more 
per-frame information + * available */ + if (!GST_CLOCK_TIME_IS_VALID (GST_BUFFER_DURATION (buffer))) { + GST_BUFFER_DURATION (buffer) = + gst_h264_parse_get_duration (h264parse, h264parse->frame_start); + } if (h264parse->keyframe) GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); @@ -2841,16 +2945,16 @@ num_clock_ts = num_clock_ts_table[h264parse->sei_pic_struct]; - if (num_meta != num_clock_ts) { + if (num_meta > num_clock_ts) { GST_LOG_OBJECT (h264parse, - "The number of timecode meta %d is not equal to required %d", + "The number of timecode meta %d is superior to required %d", num_meta, num_clock_ts); return NULL; } GST_LOG_OBJECT (h264parse, - "The number of timecode meta %d is equal", num_meta); + "The number of timecode meta %d is compatible", num_meta); memset (&sei, 0, sizeof (GstH264SEIMessage)); sei.payloadType = GST_H264_SEI_PIC_TIMING; @@ -3014,7 +3118,8 @@ /* In case of byte-stream, insert au delimiter by default * if it doesn't exist */ - if (h264parse->aud_insert && h264parse->format == GST_H264_PARSE_FORMAT_BYTE) { + if (h264parse->aud_insert && !h264parse->have_aud_in_frame && + h264parse->format == GST_H264_PARSE_FORMAT_BYTE) { GST_DEBUG_OBJECT (h264parse, "Inserting AUD into the stream."); if (h264parse->align == GST_H264_PARSE_ALIGN_AU) { GstMemory *mem = @@ -3134,7 +3239,14 @@ } #endif - if (!gst_buffer_get_video_time_code_meta (buffer)) { + if (frame->out_buffer) { + parse_buffer = frame->out_buffer = + gst_buffer_make_writable (frame->out_buffer); + } else { + parse_buffer = frame->buffer = gst_buffer_make_writable (frame->buffer); + } + + if (!gst_buffer_get_video_time_code_meta (parse_buffer)) { guint i = 0; for (i = 0; i < 3 && h264parse->num_clock_timestamp; i++) { @@ -3197,7 +3309,7 @@ "Add time code meta %02u:%02u:%02u:%02u", tim->hours_value, tim->minutes_value, tim->seconds_value, n_frames); - gst_buffer_add_video_time_code_meta_full (buffer, + gst_buffer_add_video_time_code_meta_full (parse_buffer, 
h264parse->parsed_fps_n, h264parse->parsed_fps_d, NULL, @@ -3210,13 +3322,6 @@ h264parse->num_clock_timestamp = 0; } - if (frame->out_buffer) { - parse_buffer = frame->out_buffer = - gst_buffer_make_writable (frame->out_buffer); - } else { - parse_buffer = frame->buffer = gst_buffer_make_writable (frame->buffer); - } - if (is_interlaced) { GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED); if (h264parse->sei_pic_struct == GST_H264_SEI_PIC_STRUCT_TOP_FIELD)
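The new `gst_h264_parse_get_duration` in the diff above scales a tick count (derived from the pic_struct SEI, or 2 ticks for a progressive frame) by the VUI timing parameters: each tick lasts num_units_in_tick / time_scale seconds. The arithmetic can be sketched in isolation like this; `frame_duration_ns` and `NSEC_PER_SEC` are illustrative names, not GStreamer API, and the real code uses `gst_util_uint64_scale` plus a sanity check against sub-millisecond results:

```c
#include <stdint.h>

#define NSEC_PER_SEC 1000000000ull

/* Duration in nanoseconds of `ticks` VUI clock ticks. Returns 0 when
 * time_scale is 0, which the H.264 spec forbids but broken streams
 * may contain (the parser above falls back to the caps framerate). */
static uint64_t
frame_duration_ns (unsigned ticks, uint32_t num_units_in_tick,
    uint32_t time_scale)
{
  if (time_scale == 0)
    return 0;
  return (uint64_t) ticks * NSEC_PER_SEC * num_units_in_tick / time_scale;
}
```

For example, a 25 fps progressive stream typically signals time_scale = 50 and num_units_in_tick = 1, so a 2-tick frame yields 40 ms, which is why the parser can attach a per-frame duration instead of letting baseparse derive one purely from the framerate.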
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gsth264parse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gsth264parse.h
Changed
@@ -92,6 +92,16 @@ gboolean have_sps_in_frame; gboolean have_pps_in_frame; + /* per frame AU Delimiter check used when in_format == avc or avc3 */ + gboolean have_aud_in_frame; + + /* tracing state whether h264parse needs to insert AUD or not. + * Used when in_format == byte-stream */ + gboolean aud_needed; + + /* For insertion of AU Delimiter */ + gboolean aud_insert; + gboolean first_frame; /* collected SPS and PPS NALUs */ @@ -146,10 +156,6 @@ GstVideoMultiviewFlags multiview_flags; gboolean first_in_bundle; - /* For insertion of AU Delimiter */ - gboolean aud_needed; - gboolean aud_insert; - GstVideoParseUserData user_data; GstVideoMasteringDisplayInfo mastering_display_info;
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gsth265parse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gsth265parse.c
Changed
@@ -24,6 +24,7 @@ #include <gst/base/base.h> #include <gst/pbutils/pbutils.h> +#include "gstvideoparserselements.h" #include "gsth265parse.h" #include <string.h> @@ -91,6 +92,9 @@ #define parent_class gst_h265_parse_parent_class G_DEFINE_TYPE (GstH265Parse, gst_h265_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (h265parse, "h265parse", + GST_RANK_SECONDARY, GST_TYPE_H265_PARSE, + videoparsers_element_init (plugin)); static void gst_h265_parse_finalize (GObject * object); @@ -1846,6 +1850,106 @@ (GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14); break; } + /* All the -intra profiles can map to non-intra profiles, except + the monochrome case for main and main-10. */ + case GST_H265_PROFILE_MAIN_INTRA: + { + if (sps->chroma_format_idc == 1) { + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN); + + /* Add all main compatible profiles without monochrome. */ + /* A.3.3 */ + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_10); + + /* A.3.5 */ + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444_10); + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444_12); + + /* A.3.7 */ + profiles |= profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN); + profiles |= profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_10); + profiles |= + profile_to_flag + (GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444); + profiles |= + profile_to_flag + (GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_10); + profiles |= + profile_to_flag + (GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14); + + /* G.11.1.1 */ + profiles |= profile_to_flag (GST_H265_PROFILE_MULTIVIEW_MAIN); + + /* H.11.1.1 */ + profiles |= profile_to_flag (GST_H265_PROFILE_SCALABLE_MAIN); + profiles |= profile_to_flag (GST_H265_PROFILE_SCALABLE_MAIN_10); + + /* I.11.1.1 */ + profiles |= profile_to_flag (GST_H265_PROFILE_3D_MAIN); + } + + /* Add all main compatible profiles with monochrome. 
*/ + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_12); + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_422_10); + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_422_12); + profiles |= profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444); + profiles |= + profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444_10); + break; + } + case GST_H265_PROFILE_MAIN_10_INTRA: + { + if (sps->chroma_format_idc == 1) { + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_10); + + /* Add all main-10 compatible profiles without monochrome. */ + /* A.3.5 */ + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444_10); + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444_12); + + /* A.3.7 */ + profiles |= profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_10); + + /* H.11.1.1 */ + profiles |= profile_to_flag (GST_H265_PROFILE_SCALABLE_MAIN_10); + } + + /* Add all main-10 compatible profiles with monochrome. */ + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_12); + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_422_10); + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_422_12); + break; + } + case GST_H265_PROFILE_MAIN_12_INTRA: + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_12); + break; + case GST_H265_PROFILE_MAIN_422_10_INTRA: + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_422_10); + break; + case GST_H265_PROFILE_MAIN_422_12_INTRA: + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_422_12); + break; + case GST_H265_PROFILE_MAIN_444_INTRA: + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444); + + /* Add all main444 compatible profiles. */ + /* A.3.7 */ + profiles |= profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444); + profiles |= + profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444_10); + break; + case GST_H265_PROFILE_MAIN_444_10_INTRA: + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444_10); + + /* Add all main444-10 compatible profiles. 
*/ + /* A.3.7 */ + profiles |= + profile_to_flag (GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444_10); + break; + case GST_H265_PROFILE_MAIN_444_12_INTRA: + profiles |= profile_to_flag (GST_H265_PROFILE_MAIN_444_12); + break; default: break; } @@ -1878,6 +1982,22 @@ return caps; } +static void +fix_invalid_profile (GstH265Parse * h265parse, GstCaps * caps, GstH265SPS * sps) +{ + /* HACK: This is a work-around to identify some main profile streams + * having wrong profile_idc. There are some wrongly encoded main profile + * streams which doesn't have any of the profile_idc values mentioned in + * Annex-A. Just assuming them as MAIN profile for now if they meet the + * A.3.2 requirement. */ + if (sps->chroma_format_idc == 1 && sps->bit_depth_luma_minus8 == 0 && + sps->bit_depth_chroma_minus8 == 0 && sps->sps_extension_flag == 0) { + gst_caps_set_simple (caps, "profile", G_TYPE_STRING, "main", NULL); + GST_WARNING_OBJECT (h265parse, + "Wrong profile_idc = 0, setting it as main profile !!"); + } +} + /* if downstream didn't support the exact profile indicated in sps header, * check for the compatible profiles also */ static void @@ -1886,6 +2006,9 @@ { GstCaps *peer_caps, *compat_caps; + if (profile == GST_H265_PROFILE_INVALID) + fix_invalid_profile (h265parse, caps, sps); + peer_caps = gst_pad_get_current_caps (GST_BASE_PARSE_SRC_PAD (h265parse)); if (!peer_caps || !gst_caps_can_intersect (caps, peer_caps)) { GstCaps *filter_caps = gst_caps_new_empty_simple ("video/x-h265"); @@ -1930,6 +2053,31 @@ gst_caps_unref (peer_caps); } +static gboolean +gst_h265_parse_is_field_interlaced (GstH265Parse * h265parse) +{ + /* FIXME: The SEI is optional, so theoretically there could be files with + * the interlaced_source_flag set to TRUE but no SEI present, or SEI present + * but no pic_struct. 
Haven't seen any such files in practice, and we don't + * know how to interpret the data without the pic_struct, so we'll treat + * them as progressive */ + + switch (h265parse->sei_pic_struct) { + case GST_H265_SEI_PIC_STRUCT_TOP_FIELD: + case GST_H265_SEI_PIC_STRUCT_TOP_PAIRED_PREVIOUS_BOTTOM: + case GST_H265_SEI_PIC_STRUCT_TOP_PAIRED_NEXT_BOTTOM: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_FIELD: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_PAIRED_PREVIOUS_TOP: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_PAIRED_NEXT_TOP: + return TRUE; + break; + default: + break; + } + + return FALSE; +} + static void gst_h265_parse_update_src_caps (GstH265Parse * h265parse, GstCaps * caps) { @@ -2002,12 +2150,14 @@ crop_width = sps->width; crop_height = sps->height; } + if (gst_h265_parse_is_field_interlaced (h265parse)) { + crop_height *= 2; + } if (G_UNLIKELY (h265parse->width != crop_width || h265parse->height != crop_height)) { h265parse->width = crop_width; - h265parse->height = sps->profile_tier_level.interlaced_source_flag ? 
- crop_height * 2 : crop_height; + h265parse->height = crop_height; GST_INFO_OBJECT (h265parse, "resolution changed %dx%d", h265parse->width, h265parse->height); modified = TRUE; @@ -2025,8 +2175,16 @@ fps_num = sps->vui_params.time_scale; fps_den = sps->vui_params.num_units_in_tick; - if (sps->profile_tier_level.interlaced_source_flag) - fps_num /= 2; + if (gst_h265_parse_is_field_interlaced (h265parse) + && h265parse->parsed_framerate) { + gint new_fps_num, new_fps_den; + + gst_util_fraction_multiply (fps_num, fps_den, 1, 2, &new_fps_num, + &new_fps_den); + fps_num = new_fps_num; + fps_den = new_fps_den; + h265parse->parsed_framerate = FALSE; + } } if (G_UNLIKELY (h265parse->fps_num != fps_num @@ -2103,6 +2261,7 @@ gst_caps_set_simple (caps, "width", G_TYPE_INT, width, "height", G_TYPE_INT, height, NULL); + h265parse->parsed_framerate = FALSE; /* upstream overrides */ if (s && gst_structure_has_field (s, "framerate")) gst_structure_get_fraction (s, "framerate", &fps_num, &fps_den); @@ -2122,6 +2281,7 @@ fps_num, fps_den, 0, 0); val = sps->profile_tier_level.interlaced_source_flag ? GST_SECOND / 2 : GST_SECOND; + h265parse->parsed_framerate = TRUE; /* If we know the frame duration, and if we are not in one of the zero * latency pattern, add one frame of latency */ @@ -2195,15 +2355,28 @@ const gchar *profile, *tier, *level; GstH265Profile p; - p = gst_h265_profile_tier_level_get_profile (&sps->profile_tier_level); + p = gst_h265_get_profile_from_sps (sps); profile = gst_h265_profile_to_string (p); + + if (s && gst_structure_has_field (s, "profile")) { + const gchar *profile_sink = gst_structure_get_string (s, "profile"); + GstH265Profile p_sink = gst_h265_profile_from_string (profile_sink); + + if (p != p_sink) { + const gchar *profile_src; + + p = MAX (p, p_sink); + profile_src = (p == p_sink) ? profile_sink : profile; + GST_INFO_OBJECT (h265parse, + "Upstream profile (%s) is different than in SPS (%s). 
" + "Using %s.", profile_sink, profile, profile_src); + profile = profile_src; + } + } + if (profile != NULL) gst_caps_set_simple (caps, "profile", G_TYPE_STRING, profile, NULL); - if (sps->profile_tier_level.interlaced_source_flag) - gst_caps_set_simple (caps, "interlace-mode", G_TYPE_STRING, - "interleaved", NULL); - tier = get_tier_string (sps->profile_tier_level.tier_flag); if (tier != NULL) gst_caps_set_simple (caps, "tier", G_TYPE_STRING, tier, NULL); @@ -2342,11 +2515,10 @@ gst_h265_parse_update_src_caps (h265parse, NULL); if (h265parse->fps_num > 0 && h265parse->fps_den > 0) { - GstH265SPS *sps = h265parse->nalparser->last_sps; - GstClockTime val; + GstClockTime val = + gst_h265_parse_is_field_interlaced (h265parse) ? GST_SECOND / + 2 : GST_SECOND; - val = (sps != NULL && sps->profile_tier_level.interlaced_source_flag) ? - GST_SECOND / 2 : GST_SECOND; GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (val, h265parse->fps_den, h265parse->fps_num); } @@ -2748,6 +2920,37 @@ } } + if (frame->out_buffer) { + parse_buffer = frame->out_buffer = + gst_buffer_make_writable (frame->out_buffer); + } else { + parse_buffer = frame->buffer = gst_buffer_make_writable (frame->buffer); + } + + /* see section D.3.3 of the spec */ + switch (h265parse->sei_pic_struct) { + case GST_H265_SEI_PIC_STRUCT_TOP_BOTTOM: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_TOP: + case GST_H265_SEI_PIC_STRUCT_TOP_BOTTOM_TOP: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_TOP_BOTTOM: + GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED); + break; + case GST_H265_SEI_PIC_STRUCT_TOP_FIELD: + case GST_H265_SEI_PIC_STRUCT_TOP_PAIRED_NEXT_BOTTOM: + case GST_H265_SEI_PIC_STRUCT_TOP_PAIRED_PREVIOUS_BOTTOM: + GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED); + GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_TOP_FIELD); + break; + case GST_H265_SEI_PIC_STRUCT_BOTTOM_FIELD: + case GST_H265_SEI_PIC_STRUCT_BOTTOM_PAIRED_PREVIOUS_TOP: + case 
GST_H265_SEI_PIC_STRUCT_BOTTOM_PAIRED_NEXT_TOP: + GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED); + GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_BOTTOM_FIELD); + break; + default: + break; + } + { guint i = 0; @@ -2809,7 +3012,7 @@ gst_util_uint64_scale_int (h265parse->time_code.n_frames[i], 1, 2 - h265parse->time_code.units_field_based_flag[i]); - gst_buffer_add_video_time_code_meta_full (buffer, + gst_buffer_add_video_time_code_meta_full (parse_buffer, h265parse->parsed_fps_n, h265parse->parsed_fps_d, NULL, @@ -2823,19 +3026,6 @@ } } - if (frame->out_buffer) { - parse_buffer = frame->out_buffer = - gst_buffer_make_writable (frame->out_buffer); - } else { - parse_buffer = frame->buffer = gst_buffer_make_writable (frame->buffer); - } - - if (h265parse->sei_pic_struct != GST_H265_SEI_PIC_STRUCT_FRAME) { - GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED); - if (h265parse->sei_pic_struct == GST_H265_SEI_PIC_STRUCT_TOP_FIELD) - GST_BUFFER_FLAG_SET (parse_buffer, GST_VIDEO_BUFFER_FLAG_TFF); - } - gst_video_push_user_data ((GstElement *) h265parse, &h265parse->user_data, parse_buffer);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gsth265parse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gsth265parse.h
Changed
@@ -110,6 +110,7 @@
   gboolean predicted;
   gboolean bidirectional;
   gboolean header;
+  gboolean parsed_framerate;
 
   /* AU state */
   gboolean picture_start;
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gstjpeg2000parse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstjpeg2000parse.c
Changed
@@ -22,6 +22,7 @@ # include "config.h" #endif +#include "gstvideoparserselements.h" #include "gstjpeg2000parse.h" #include <gst/base/base.h> @@ -57,19 +58,24 @@ static void -gst_jpeg2000_parse_get_subsampling (GstJPEG2000Sampling sampling, guint8 * dx, - guint8 * dy) +gst_jpeg2000_parse_get_subsampling (guint16 compno, + GstJPEG2000Sampling sampling, guint8 * dx, guint8 * dy) { *dx = 1; *dy = 1; - if (sampling == GST_JPEG2000_SAMPLING_YBR422) { - *dx = 2; - } else if (sampling == GST_JPEG2000_SAMPLING_YBR420) { - *dx = 2; - *dy = 2; - } else if (sampling == GST_JPEG2000_SAMPLING_YBR410) { - *dx = 4; - *dy = 2; + if (compno == 1 || compno == 2) { + if (sampling == GST_JPEG2000_SAMPLING_YBR422) { + *dx = 2; + } else if (sampling == GST_JPEG2000_SAMPLING_YBR420) { + *dx = 2; + *dy = 2; + } else if (sampling == GST_JPEG2000_SAMPLING_YBR411) { + *dx = 4; + *dy = 1; + } else if (sampling == GST_JPEG2000_SAMPLING_YBR410) { + *dx = 4; + *dy = 4; + } } } @@ -103,16 +109,27 @@ " width = (int)[1, MAX], height = (int)[1, MAX]," GST_JPEG2000_SAMPLING_LIST "," GST_JPEG2000_COLORSPACE_LIST "," - " profile = (int)[0, 49151]," " parsed = (boolean) true") + " profile = (int)[0, 49151]," + " parsed = (boolean) true ; " + "image/x-jpc-striped," + " width = (int)[1, MAX], height = (int)[1, MAX]," + GST_JPEG2000_SAMPLING_LIST "," + GST_JPEG2000_COLORSPACE_LIST "," + " profile = (int)[0, 49151]," + " num-stripes = [ 2, MAX ], parsed = (boolean) true;") ); static GstStaticPadTemplate sinktemplate = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, - GST_STATIC_CAPS ("image/jp2;image/x-jpc;image/x-j2c")); + GST_STATIC_CAPS ("image/jp2; image/x-jpc; image/x-j2c; " + "image/x-jpc-striped")); #define parent_class gst_jpeg2000_parse_parent_class G_DEFINE_TYPE (GstJPEG2000Parse, gst_jpeg2000_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (jpeg2000parse, "jpeg2000parse", + GST_RANK_PRIMARY, GST_TYPE_JPEG2000_PARSE, + videoparsers_element_init (plugin)); static 
gboolean gst_jpeg2000_parse_start (GstBaseParse * parse); static gboolean gst_jpeg2000_parse_event (GstBaseParse * parse, @@ -323,6 +340,8 @@ guint num_prefix_bytes = 0; /* number of bytes to skip before actual code stream */ GstCaps *src_caps = NULL; guint eoc_frame_size = 0; + gint num_stripes = 1; + gint stripe_height = 0; for (i = 0; i < GST_JPEG2000_PARSE_MAX_SUPPORTED_COMPONENTS; ++i) { dx[i] = 1; @@ -532,7 +551,9 @@ if (sink_sampling_string) sink_sampling = gst_jpeg2000_sampling_from_string (sink_sampling_string); - } else { + } + + if (colorspace == GST_JPEG2000_COLORSPACE_NONE) { /* guess color space based on number of components */ if (numcomps == 0 || numcomps > 4) { GST_ERROR_OBJECT (jpeg2000parse, @@ -588,7 +609,8 @@ } if (sink_sampling != GST_JPEG2000_SAMPLING_NONE) { guint8 dx_caps, dy_caps; - gst_jpeg2000_parse_get_subsampling (sink_sampling, &dx_caps, &dy_caps); + gst_jpeg2000_parse_get_subsampling (compno, sink_sampling, &dx_caps, + &dy_caps); if (dx_caps != dx[compno] || dy_caps != dy[compno]) { GstJPEG2000Colorspace inferred_colorspace = GST_JPEG2000_COLORSPACE_NONE; @@ -637,7 +659,9 @@ parsed_sampling = GST_JPEG2000_SAMPLING_YBR444; } else if (dx[1] == 2 && dy[1] == 2) { parsed_sampling = GST_JPEG2000_SAMPLING_YBR420; - } else if (dx[1] == 4 && dy[1] == 2) { + } else if (dx[1] == 4 && dy[1] == 1) { + parsed_sampling = GST_JPEG2000_SAMPLING_YBR411; + } else if (dx[1] == 4 && dy[1] == 4) { parsed_sampling = GST_JPEG2000_SAMPLING_YBR410; } else if (dx[1] == 2 && dy[1] == 1) { parsed_sampling = GST_JPEG2000_SAMPLING_YBR422; @@ -665,6 +689,35 @@ colorspace = GST_JPEG2000_COLORSPACE_YUV; } } + + /* use caps height if in sub-frame mode, as encoded frame height will be + * strictly less than full frame height */ + if (current_caps_struct && + gst_structure_has_name (current_caps_struct, "image/x-jpc-striped")) { + gint h; + + if (!gst_structure_get_int (current_caps_struct, "num-stripes", + &num_stripes) || num_stripes < 2) { + GST_ELEMENT_ERROR 
(parse, STREAM, FORMAT, (NULL), + ("Striped JPEG 2000 is missing the stripe count")); + ret = GST_FLOW_ERROR; + goto beach; + } + + if (!gst_structure_get_int (current_caps_struct, "stripe-height", + &stripe_height)) { + stripe_height = height; + } else if (stripe_height != height && + !GST_BUFFER_FLAG_IS_SET (frame->buffer, GST_BUFFER_FLAG_MARKER)) { + GST_WARNING_OBJECT (parse, + "Only the last stripe is expected to be different" + " from the stripe height (%d != %u)", height, stripe_height); + } + + gst_structure_get_int (current_caps_struct, "height", &h); + height = h; + } + /* now we can set the source caps, if something has changed */ source_sampling = sink_sampling != @@ -675,8 +728,8 @@ gint fr_num = 0, fr_denom = 0; src_caps = - gst_caps_new_simple (media_type_from_codec_format - (jpeg2000parse->src_codec_format), + gst_caps_new_simple (num_stripes > 1 ? "image/x-jpc-striped" : + media_type_from_codec_format (jpeg2000parse->src_codec_format), "width", G_TYPE_INT, width, "height", G_TYPE_INT, height, "colorspace", G_TYPE_STRING, @@ -684,6 +737,10 @@ G_TYPE_STRING, gst_jpeg2000_sampling_to_string (source_sampling), "profile", G_TYPE_INT, profile, "parsed", G_TYPE_BOOLEAN, TRUE, NULL); + if (num_stripes > 1) + gst_caps_set_simple (src_caps, "num-stripes", G_TYPE_INT, num_stripes, + "stripe_height", G_TYPE_INT, stripe_height, NULL); + if (gst_jpeg2000_parse_is_broadcast (capabilities) || gst_jpeg2000_parse_is_imf (capabilities)) { gst_caps_set_simple (src_caps, "main-level", G_TYPE_INT, main_level, @@ -695,7 +752,9 @@ } if (current_caps_struct) { - const gchar *caps_string = gst_structure_get_string + const gchar *caps_string; + + caps_string = gst_structure_get_string (current_caps_struct, "colorimetry"); if (caps_string) { gst_caps_set_simple (src_caps, "colorimetry", G_TYPE_STRING,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gstmpeg4videoparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstmpeg4videoparse.c
Changed
@@ -33,6 +33,7 @@
 #include <gst/pbutils/pbutils.h>
 #include <gst/video/video.h>
 
+#include "gstvideoparserselements.h"
 #include "gstmpeg4videoparse.h"
 
 GST_DEBUG_CATEGORY (mpeg4v_parse_debug);
@@ -71,6 +72,9 @@
 #define gst_mpeg4vparse_parent_class parent_class
 G_DEFINE_TYPE (GstMpeg4VParse, gst_mpeg4vparse, GST_TYPE_BASE_PARSE);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mpeg4videoparse, "mpeg4videoparse",
+    GST_RANK_PRIMARY + 1, GST_TYPE_MPEG4VIDEO_PARSE,
+    videoparsers_element_init (plugin));
 
 static gboolean gst_mpeg4vparse_start (GstBaseParse * parse);
 static gboolean gst_mpeg4vparse_stop (GstBaseParse * parse);
@@ -300,7 +304,7 @@
   if (mp4vparse->config != NULL)
     gst_buffer_unref (mp4vparse->config);
 
-  mp4vparse->config = gst_buffer_new_wrapped (g_memdup (data, size), size);
+  mp4vparse->config = gst_buffer_new_memdup (data, size);
 
   /* trigger src caps update */
   mp4vparse->update_caps = TRUE;
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gstmpegvideoparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstmpegvideoparse.c
Changed
@@ -31,6 +31,7 @@
 #include <gst/pbutils/pbutils.h>
 #include <gst/codecparsers/gstmpegvideometa.h>
 
+#include "gstvideoparserselements.h"
 #include "gstmpegvideoparse.h"
 
 GST_DEBUG_CATEGORY (mpegv_parse_debug);
@@ -64,6 +65,9 @@
 #define parent_class gst_mpegv_parse_parent_class
 G_DEFINE_TYPE (GstMpegvParse, gst_mpegv_parse, GST_TYPE_BASE_PARSE);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mpegvideoparse, "mpegvideoparse",
+    GST_RANK_PRIMARY + 1, GST_TYPE_MPEGVIDEO_PARSE,
+    videoparsers_element_init (plugin));
 
 static gboolean gst_mpegv_parse_start (GstBaseParse * parse);
 static gboolean gst_mpegv_parse_stop (GstBaseParse * parse);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gstpngparse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstpngparse.c
Changed
@@ -22,6 +22,7 @@
 # include "config.h"
 #endif
 
+#include "gstvideoparserselements.h"
 #include "gstpngparse.h"
 
 #include <gst/base/base.h>
@@ -47,6 +48,8 @@
 #define parent_class gst_png_parse_parent_class
 G_DEFINE_TYPE (GstPngParse, gst_png_parse, GST_TYPE_BASE_PARSE);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (pngparse, "pngparse", GST_RANK_PRIMARY,
+    GST_TYPE_PNG_PARSE, videoparsers_element_init (plugin));
 
 static gboolean gst_png_parse_start (GstBaseParse * parse);
 static gboolean gst_png_parse_event (GstBaseParse * parse, GstEvent * event);
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/gstvc1parse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstvc1parse.c
Changed
@@ -79,6 +79,7 @@
 #include "config.h"
 #endif
 
+#include "gstvideoparserselements.h"
 #include "gstvc1parse.h"
 
 #include <gst/base/base.h>
@@ -186,6 +187,8 @@
 #define parent_class gst_vc1_parse_parent_class
 G_DEFINE_TYPE (GstVC1Parse, gst_vc1_parse, GST_TYPE_BASE_PARSE);
+GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vc1parse, "vc1parse", GST_RANK_NONE,
+    GST_TYPE_VC1_PARSE, videoparsers_element_init (plugin));
 
 static void gst_vc1_parse_finalize (GObject * object);
View file
gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstvideoparserselement.c
Added
@@ -0,0 +1,39 @@
+/* GStreamer video parsers
+ * Copyright (C) 2011 Mark Nauwelaerts <mark.nauwelaerts@collabora.co.uk>
+ * Copyright (C) 2009 Tim-Philipp Müller <tim centricular net>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstvideoparserselements.h"
+
+GST_DEBUG_CATEGORY (videoparseutils_debug);
+
+void
+videoparsers_element_init (GstPlugin * plugin)
+{
+  static gsize res = FALSE;
+
+  if (g_once_init_enter (&res)) {
+    GST_DEBUG_CATEGORY_INIT (videoparseutils_debug, "videoparseutils", 0,
+        "video parse utilities");
+    g_once_init_leave (&res, TRUE);
+  }
+}
View file
gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstvideoparserselements.h
Added
@@ -0,0 +1,46 @@
+/* GStreamer
+ * Copyright (C) 2020 Huawei Technologies Co., Ltd.
+ * @Author: Stéphane Cerveau <stephane.cerveau@collabora.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+
+#ifndef __GST_VIDEOPARSERS_ELEMENTS_H__
+#define __GST_VIDEOPARSERS_ELEMENTS_H__
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include <gst/gst.h>
+
+
+void videoparsers_element_init (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (av1parse);
+GST_ELEMENT_REGISTER_DECLARE (diracparse);
+GST_ELEMENT_REGISTER_DECLARE (h263parse);
+GST_ELEMENT_REGISTER_DECLARE (h264parse);
+GST_ELEMENT_REGISTER_DECLARE (h265parse);
+GST_ELEMENT_REGISTER_DECLARE (jpeg2000parse);
+GST_ELEMENT_REGISTER_DECLARE (mpeg4videoparse);
+GST_ELEMENT_REGISTER_DECLARE (mpegvideoparse);
+GST_ELEMENT_REGISTER_DECLARE (pngparse);
+GST_ELEMENT_REGISTER_DECLARE (vc1parse);
+GST_ELEMENT_REGISTER_DECLARE (vp9parse);
+
+#endif /* __GST_VIDEOPARSERS_ELEMENTS_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstvp9parse.c
Added
@@ -0,0 +1,843 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/codecparsers/gstvp9parser.h> +#include <gst/video/video.h> +#include "gstvideoparserselements.h" +#include "gstvp9parse.h" + +#include <string.h> + +GST_DEBUG_CATEGORY (vp9_parse_debug); +#define GST_CAT_DEFAULT vp9_parse_debug + +typedef enum +{ + GST_VP9_PARSE_ALIGN_NONE = 0, + GST_VP9_PARSE_ALIGN_SUPER_FRAME, + GST_VP9_PARSE_ALIGN_FRAME, +} GstVp9ParseAligment; + +struct _GstVp9Parse +{ + GstBaseParse parent; + + /* parsed from the last keyframe */ + gint width; + gint height; + gint subsampling_x; + gint subsampling_y; + GstVp9ColorSpace color_space; + GstVp9ColorRange color_range; + GstVP9Profile profile; + GstVp9BitDepth bit_depth; + gboolean codec_alpha; + + GstVp9ParseAligment in_align; + GstVp9ParseAligment align; + + GstVp9Parser *parser; + gboolean update_caps; + + /* per frame status */ + gboolean discont; +}; + +static GstStaticPadTemplate sinktemplate = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp9")); + +static GstStaticPadTemplate srctemplate = GST_STATIC_PAD_TEMPLATE ("src", + 
GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp9, parsed = (boolean) true, " + "alignment=(string) { super-frame, frame }")); + +#define parent_class gst_vp9_parse_parent_class +G_DEFINE_TYPE (GstVp9Parse, gst_vp9_parse, GST_TYPE_BASE_PARSE); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (vp9parse, "vp9parse", GST_RANK_SECONDARY, + GST_TYPE_VP9_PARSE, videoparsers_element_init (plugin)); + +static gboolean gst_vp9_parse_start (GstBaseParse * parse); +static gboolean gst_vp9_parse_stop (GstBaseParse * parse); +static GstFlowReturn gst_vp9_parse_handle_frame (GstBaseParse * parse, + GstBaseParseFrame * frame, gint * skipsize); +static gboolean gst_vp9_parse_set_sink_caps (GstBaseParse * parse, + GstCaps * caps); +static GstCaps *gst_vp9_parse_get_sink_caps (GstBaseParse * parse, + GstCaps * filter); +static void gst_vp9_parse_update_src_caps (GstVp9Parse * self, GstCaps * caps); +static GstFlowReturn gst_vp9_parse_parse_frame (GstVp9Parse * self, + GstBaseParseFrame * frame, GstVp9FrameHdr * frame_hdr); + +static void +gst_vp9_parse_class_init (GstVp9ParseClass * klass) +{ + GstBaseParseClass *parse_class = GST_BASE_PARSE_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + + parse_class->start = GST_DEBUG_FUNCPTR (gst_vp9_parse_start); + parse_class->stop = GST_DEBUG_FUNCPTR (gst_vp9_parse_stop); + parse_class->handle_frame = GST_DEBUG_FUNCPTR (gst_vp9_parse_handle_frame); + parse_class->set_sink_caps = GST_DEBUG_FUNCPTR (gst_vp9_parse_set_sink_caps); + parse_class->get_sink_caps = GST_DEBUG_FUNCPTR (gst_vp9_parse_get_sink_caps); + + gst_element_class_add_static_pad_template (element_class, &srctemplate); + gst_element_class_add_static_pad_template (element_class, &sinktemplate); + + gst_element_class_set_static_metadata (element_class, "VP9 parser", + "Codec/Parser/Converter/Video", + "Parses VP9 streams", "Seungha Yang <seungha@centricular.com>"); + + GST_DEBUG_CATEGORY_INIT (vp9_parse_debug, "vp9parse", 0, "vp9 parser"); +} + 
+static void +gst_vp9_parse_init (GstVp9Parse * self) +{ + gst_base_parse_set_pts_interpolation (GST_BASE_PARSE (self), FALSE); + gst_base_parse_set_infer_ts (GST_BASE_PARSE (self), FALSE); + + GST_PAD_SET_ACCEPT_INTERSECT (GST_BASE_PARSE_SINK_PAD (self)); + GST_PAD_SET_ACCEPT_TEMPLATE (GST_BASE_PARSE_SINK_PAD (self)); +} + +static void +gst_vp9_parse_reset (GstVp9Parse * self) +{ + self->width = 0; + self->height = 0; + self->subsampling_x = -1; + self->subsampling_y = -1; + self->color_space = GST_VP9_CS_UNKNOWN; + self->color_range = GST_VP9_CR_LIMITED; + self->profile = GST_VP9_PROFILE_UNDEFINED; + self->bit_depth = (GstVp9BitDepth) 0; + self->codec_alpha = FALSE; +} + +static gboolean +gst_vp9_parse_start (GstBaseParse * parse) +{ + GstVp9Parse *self = GST_VP9_PARSE (parse); + + GST_DEBUG_OBJECT (self, "start"); + + self->parser = gst_vp9_parser_new (); + gst_vp9_parse_reset (self); + + /* short frame header with one byte */ + gst_base_parse_set_min_frame_size (parse, 1); + + return TRUE; +} + +static gboolean +gst_vp9_parse_stop (GstBaseParse * parse) +{ + GstVp9Parse *self = GST_VP9_PARSE (parse); + + GST_DEBUG_OBJECT (self, "stop"); + g_clear_pointer (&self->parser, gst_vp9_parser_free); + + return TRUE; +} + +static const gchar * +gst_vp9_parse_profile_to_string (GstVP9Profile profile) +{ + switch (profile) { + case GST_VP9_PROFILE_0: + return "0"; + case GST_VP9_PROFILE_1: + return "1"; + case GST_VP9_PROFILE_2: + return "2"; + case GST_VP9_PROFILE_3: + return "3"; + default: + break; + } + + return NULL; +} + +static GstVP9Profile +gst_vp9_parse_profile_from_string (const gchar * profile) +{ + if (!profile) + return GST_VP9_PROFILE_UNDEFINED; + + if (g_strcmp0 (profile, "0") == 0) + return GST_VP9_PROFILE_0; + else if (g_strcmp0 (profile, "1") == 0) + return GST_VP9_PROFILE_1; + else if (g_strcmp0 (profile, "2") == 0) + return GST_VP9_PROFILE_2; + else if (g_strcmp0 (profile, "3") == 0) + return GST_VP9_PROFILE_3; + + return GST_VP9_PROFILE_UNDEFINED; +} 
+ +static const gchar * +gst_vp9_parse_alignment_to_string (GstVp9ParseAligment align) +{ + switch (align) { + case GST_VP9_PARSE_ALIGN_SUPER_FRAME: + return "super-frame"; + case GST_VP9_PARSE_ALIGN_FRAME: + return "frame"; + default: + break; + } + + return NULL; +} + +static GstVp9ParseAligment +gst_vp9_parse_alignment_from_string (const gchar * align) +{ + if (!align) + return GST_VP9_PARSE_ALIGN_NONE; + + if (g_strcmp0 (align, "super-frame") == 0) + return GST_VP9_PARSE_ALIGN_SUPER_FRAME; + else if (g_strcmp0 (align, "frame") == 0) + return GST_VP9_PARSE_ALIGN_FRAME; + + return GST_VP9_PARSE_ALIGN_NONE; +} + +static void +gst_vp9_parse_alignment_from_caps (GstCaps * caps, GstVp9ParseAligment * align) +{ + *align = GST_VP9_PARSE_ALIGN_NONE; + + GST_DEBUG ("parsing caps: %" GST_PTR_FORMAT, caps); + + if (caps && gst_caps_get_size (caps) > 0) { + GstStructure *s = gst_caps_get_structure (caps, 0); + const gchar *str = NULL; + + if ((str = gst_structure_get_string (s, "alignment"))) { + *align = gst_vp9_parse_alignment_from_string (str); + } + } +} + +/* implement custom semantic for codec-alpha */ +static gboolean +gst_vp9_parse_check_codec_alpha (GstStructure * s, gboolean codec_alpha) +{ + gboolean value; + + if (gst_structure_get_boolean (s, "codec-alpha", &value)) + return value == codec_alpha; + + return codec_alpha == FALSE; +} + +/* check downstream caps to configure format and alignment */ +static void +gst_vp9_parse_negotiate (GstVp9Parse * self, GstVp9ParseAligment in_align, + GstCaps * in_caps) +{ + GstCaps *caps; + GstVp9ParseAligment align = self->align; + + caps = gst_pad_get_allowed_caps (GST_BASE_PARSE_SRC_PAD (self)); + GST_DEBUG_OBJECT (self, "allowed caps: %" GST_PTR_FORMAT, caps); + + /* concentrate on leading structure, since decodebin parser + * capsfilter always includes parser template caps */ + if (caps) { + while (gst_caps_get_size (caps) > 0) { + GstStructure *s = gst_caps_get_structure (caps, 0); + + if (gst_vp9_parse_check_codec_alpha 
(s, self->codec_alpha)) + break; + + gst_caps_remove_structure (caps, 0); + } + + /* this may happen if there is simply no codec alpha decoder in the + * gstreamer installation, in this case, pick the first non-alpha decoder. + */ + if (gst_caps_is_empty (caps)) { + gst_caps_unref (caps); + caps = gst_pad_get_allowed_caps (GST_BASE_PARSE_SRC_PAD (self)); + } + + caps = gst_caps_truncate (caps); + GST_DEBUG_OBJECT (self, "negotiating with caps: %" GST_PTR_FORMAT, caps); + } + + if (in_caps && caps) { + if (gst_caps_can_intersect (in_caps, caps)) { + GST_DEBUG_OBJECT (self, "downstream accepts upstream caps"); + gst_vp9_parse_alignment_from_caps (in_caps, &align); + gst_clear_caps (&caps); + } + } + + /* FIXME We could fail the negotiation immediately if caps are empty */ + if (caps && !gst_caps_is_empty (caps)) { + /* fixate to avoid ambiguity with lists when parsing */ + caps = gst_caps_fixate (caps); + gst_vp9_parse_alignment_from_caps (caps, &align); + } + + /* default */ + if (align == GST_VP9_PARSE_ALIGN_NONE) + align = GST_VP9_PARSE_ALIGN_SUPER_FRAME; + + GST_DEBUG_OBJECT (self, "selected alignment %s", + gst_vp9_parse_alignment_to_string (align)); + + self->align = align; + + gst_clear_caps (&caps); +} + +static gboolean +gst_vp9_parse_is_info_valid (GstVp9Parse * self) +{ + if (self->width <= 0 || self->height <= 0) + return FALSE; + + if (self->subsampling_x < 0 || self->subsampling_y < 0) + return FALSE; + + if (self->profile == GST_VP9_PROFILE_UNDEFINED) + return FALSE; + + if (self->bit_depth < (GstVp9BitDepth) GST_VP9_BIT_DEPTH_8) + return FALSE; + + return TRUE; +} + +static gboolean +gst_vp9_parse_process_frame (GstVp9Parse * self, GstVp9FrameHdr * frame_hdr) +{ + GstVp9Parser *parser = self->parser; + gint width, height; + + /* the resolution might be varying. Update our status per key frame */ + if (frame_hdr->frame_type != GST_VP9_KEY_FRAME || + frame_hdr->show_existing_frame) { + /* Need to continue to get some valid info. 
*/ + if (gst_vp9_parse_is_info_valid (self)) + return TRUE; + } + + width = frame_hdr->width; + height = frame_hdr->height; + if (frame_hdr->display_size_enabled && + frame_hdr->display_width > 0 && frame_hdr->display_height) { + width = frame_hdr->display_width; + height = frame_hdr->display_height; + } + + if (width != self->width || height != self->height) { + GST_DEBUG_OBJECT (self, "resolution change from %dx%d to %dx%d", + self->width, self->height, width, height); + self->width = width; + self->height = height; + self->update_caps = TRUE; + } + + if (self->subsampling_x != parser->subsampling_x || + self->subsampling_y != parser->subsampling_y) { + GST_DEBUG_OBJECT (self, + "subsampling changed from x: %d, y: %d to x: %d, y: %d", + self->subsampling_x, self->subsampling_y, + parser->subsampling_x, parser->subsampling_y); + self->subsampling_x = parser->subsampling_x; + self->subsampling_y = parser->subsampling_y; + self->update_caps = TRUE; + } + + if (parser->color_space != GST_VP9_CS_UNKNOWN && + parser->color_space != GST_VP9_CS_RESERVED_2 && + parser->color_space != self->color_space) { + GST_DEBUG_OBJECT (self, "colorspace changed from %d to %d", + self->color_space, parser->color_space); + self->color_space = parser->color_space; + self->update_caps = TRUE; + } + + if (parser->color_range != self->color_range) { + GST_DEBUG_OBJECT (self, "color range changed from %d to %d", + self->color_range, parser->color_range); + self->color_range = parser->color_range; + self->update_caps = TRUE; + } + + if (frame_hdr->profile != GST_VP9_PROFILE_UNDEFINED && + frame_hdr->profile != self->profile) { + GST_DEBUG_OBJECT (self, "profile changed from %d to %d", self->profile, + frame_hdr->profile); + self->profile = frame_hdr->profile; + self->update_caps = TRUE; + } + + if (parser->bit_depth != self->bit_depth) { + GST_DEBUG_OBJECT (self, "bit-depth changed from %d to %d", + self->bit_depth, parser->bit_depth); + self->bit_depth = parser->bit_depth; + 
self->update_caps = TRUE; + } + + return TRUE; +} + +static GstFlowReturn +gst_vp9_parse_handle_frame (GstBaseParse * parse, GstBaseParseFrame * frame, + gint * skipsize) +{ + GstVp9Parse *self = GST_VP9_PARSE (parse); + GstBuffer *buffer = frame->buffer; + GstFlowReturn ret = GST_FLOW_OK; + GstVp9ParserResult parse_res = GST_VP9_PARSER_ERROR; + GstMapInfo map; + gsize offset = 0; + GstVp9SuperframeInfo superframe_info; + guint i; + GstVp9FrameHdr frame_hdr; + + if (GST_BUFFER_FLAG_IS_SET (frame->buffer, GST_BUFFER_FLAG_DISCONT)) + self->discont = TRUE; + else + self->discont = FALSE; + + /* need to save buffer from invalidation upon _finish_frame */ + if (self->align == GST_VP9_PARSE_ALIGN_FRAME) + buffer = gst_buffer_copy (frame->buffer); + + if (!gst_buffer_map (buffer, &map, GST_MAP_READ)) { + GST_ELEMENT_ERROR (parse, CORE, NOT_IMPLEMENTED, (NULL), + ("Couldn't map incoming buffer")); + + return GST_FLOW_ERROR; + } + + GST_TRACE_OBJECT (self, "processing buffer of size %" G_GSIZE_FORMAT, + map.size); + + /* superframe_info will be zero initialized by GstVp9Parser */ + parse_res = gst_vp9_parser_parse_superframe_info (self->parser, + &superframe_info, map.data, map.size); + + if (parse_res != GST_VP9_PARSER_OK) { + /* just finish this frame anyway, so that we don't too strict + * regarding parsing vp9 stream. 
+ * Downstream might be able to handle this stream even though + * it's very unlikely */ + GST_WARNING_OBJECT (self, "Couldn't parse superframe res: %d", parse_res); + goto done; + } + + for (i = 0; i < superframe_info.frames_in_superframe; i++) { + guint32 frame_size; + + frame_size = superframe_info.frame_sizes[i]; + parse_res = gst_vp9_parser_parse_frame_header (self->parser, + &frame_hdr, map.data + offset, frame_size); + + if (parse_res != GST_VP9_PARSER_OK) { + GST_WARNING_OBJECT (self, "Parsing error %d", parse_res); + break; + } + + gst_vp9_parse_process_frame (self, &frame_hdr); + + if (self->align == GST_VP9_PARSE_ALIGN_FRAME) { + GstBaseParseFrame subframe; + + gst_base_parse_frame_init (&subframe); + subframe.flags |= frame->flags; + subframe.offset = frame->offset; + subframe.overhead = frame->overhead; + subframe.buffer = gst_buffer_copy_region (buffer, GST_BUFFER_COPY_ALL, + offset, frame_size); + + /* note we don't need to come up with a sub-buffer, since + * subsequent code only considers input buffer's metadata. + * Real data is either taken from input by baseclass or + * a replacement output buffer is provided anyway. */ + gst_vp9_parse_parse_frame (self, &subframe, &frame_hdr); + ret = gst_base_parse_finish_frame (parse, &subframe, frame_size); + } else { + /* FIXME: need to parse all frames belong to this superframe? 
*/ + break; + } + + offset += frame_size; + } + +done: + gst_buffer_unmap (buffer, &map); + + if (self->align != GST_VP9_PARSE_ALIGN_FRAME) { + if (parse_res == GST_VP9_PARSER_OK) + gst_vp9_parse_parse_frame (self, frame, &frame_hdr); + ret = gst_base_parse_finish_frame (parse, frame, map.size); + } else { + gst_buffer_unref (buffer); + if (offset != map.size) { + gsize left = map.size - offset; + if (left != superframe_info.superframe_index_size) { + GST_WARNING_OBJECT (parse, + "Skipping leftover frame data %" G_GSIZE_FORMAT, left); + } + frame->flags |= GST_BASE_PARSE_FRAME_FLAG_DROP; + ret = gst_base_parse_finish_frame (parse, frame, left); + } + } + + return ret; +} + +static void +gst_vp9_parse_update_src_caps (GstVp9Parse * self, GstCaps * caps) +{ + GstCaps *sink_caps, *src_caps; + GstCaps *final_caps = NULL; + GstStructure *s = NULL; + gint width, height; + gint par_n = 0, par_d = 0; + gint fps_n = 0, fps_d = 0; + gint bitdepth = 0; + gchar *colorimetry = NULL; + const gchar *chroma_format = NULL; + const gchar *profile = NULL; + + if (!self->update_caps) + return; + + /* if this is being called from the first _setcaps call, caps on the sinkpad + * aren't set yet and so they need to be passed as an argument */ + if (caps) + sink_caps = gst_caps_ref (caps); + else + sink_caps = gst_pad_get_current_caps (GST_BASE_PARSE_SINK_PAD (self)); + + /* carry over input caps as much as possible; override with our own stuff */ + if (!sink_caps) + sink_caps = gst_caps_new_empty_simple ("video/x-vp9"); + else + s = gst_caps_get_structure (sink_caps, 0); + + final_caps = gst_caps_copy (sink_caps); + + /* frame header should give this but upstream overrides */ + if (s && gst_structure_has_field (s, "width") && + gst_structure_has_field (s, "height")) { + gst_structure_get_int (s, "width", &width); + gst_structure_get_int (s, "height", &height); + } else { + width = self->width; + height = self->height; + } + + if (width > 0 && height > 0) + gst_caps_set_simple (final_caps, 
"width", G_TYPE_INT, width, + "height", G_TYPE_INT, height, NULL); + + if (s && gst_structure_get_fraction (s, "pixel-aspect-ratio", &par_n, &par_d)) { + if (par_n != 0 && par_d != 0) { + gst_caps_set_simple (final_caps, "pixel-aspect-ratio", + GST_TYPE_FRACTION, par_n, par_d, NULL); + } + } + + if (s && gst_structure_has_field (s, "framerate")) { + gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d); + } + + if (fps_n > 0 && fps_d > 0) { + gst_caps_set_simple (final_caps, "framerate", + GST_TYPE_FRACTION, fps_n, fps_d, NULL); + gst_base_parse_set_frame_rate (GST_BASE_PARSE (self), fps_n, fps_d, 0, 0); + } + + if (self->color_space != GST_VP9_CS_UNKNOWN && + self->color_space != GST_VP9_CS_RESERVED_2) { + GstVideoColorimetry cinfo; + gboolean have_cinfo = TRUE; + + memset (&cinfo, 0, sizeof (GstVideoColorimetry)); + + switch (self->parser->color_space) { + case GST_VP9_CS_BT_601: + gst_video_colorimetry_from_string (&cinfo, GST_VIDEO_COLORIMETRY_BT601); + break; + case GST_VP9_CS_BT_709: + gst_video_colorimetry_from_string (&cinfo, GST_VIDEO_COLORIMETRY_BT709); + break; + case GST_VP9_CS_SMPTE_170: + gst_video_colorimetry_from_string (&cinfo, GST_VIDEO_COLORIMETRY_BT601); + break; + case GST_VP9_CS_SMPTE_240: + gst_video_colorimetry_from_string (&cinfo, + GST_VIDEO_COLORIMETRY_SMPTE240M); + break; + case GST_VP9_CS_BT_2020: + if (self->parser->bit_depth == GST_VP9_BIT_DEPTH_12) { + gst_video_colorimetry_from_string (&cinfo, + GST_VIDEO_COLORIMETRY_BT2020); + } else { + gst_video_colorimetry_from_string (&cinfo, + GST_VIDEO_COLORIMETRY_BT2020_10); + } + break; + case GST_VP9_CS_SRGB: + gst_video_colorimetry_from_string (&cinfo, GST_VIDEO_COLORIMETRY_SRGB); + break; + default: + have_cinfo = FALSE; + break; + } + + if (have_cinfo) { + if (self->parser->color_range == GST_VP9_CR_LIMITED) + cinfo.range = GST_VIDEO_COLOR_RANGE_16_235; + else + cinfo.range = GST_VIDEO_COLOR_RANGE_0_255; + + colorimetry = gst_video_colorimetry_to_string (&cinfo); + } + } + + if 
(self->color_space != GST_VP9_CS_SRGB) { + if (self->parser->subsampling_x == 1 && self->parser->subsampling_y == 1) + chroma_format = "4:2:0"; + else if (self->parser->subsampling_x == 1 && + self->parser->subsampling_y == 0) + chroma_format = "4:2:2"; + else if (self->parser->subsampling_x == 0 && + self->parser->subsampling_y == 1) + chroma_format = "4:4:0"; + else if (self->parser->subsampling_x == 0 && + self->parser->subsampling_y == 0) + chroma_format = "4:4:4"; + + if (chroma_format) + gst_caps_set_simple (final_caps, + "chroma-format", G_TYPE_STRING, chroma_format, NULL); + } + + switch (self->bit_depth) { + case GST_VP9_BIT_DEPTH_8: + bitdepth = 8; + break; + case GST_VP9_BIT_DEPTH_10: + bitdepth = 10; + break; + case GST_VP9_BIT_DEPTH_12: + bitdepth = 12; + break; + default: + break; + } + + if (bitdepth) { + gst_caps_set_simple (final_caps, + "bit-depth-luma", G_TYPE_UINT, bitdepth, + "bit-depth-chroma", G_TYPE_UINT, bitdepth, NULL); + } + + if (colorimetry && (!s || !gst_structure_has_field (s, "colorimetry"))) { + gst_caps_set_simple (final_caps, + "colorimetry", G_TYPE_STRING, colorimetry, NULL); + } + + g_free (colorimetry); + + gst_caps_set_simple (final_caps, "parsed", G_TYPE_BOOLEAN, TRUE, + "alignment", G_TYPE_STRING, + gst_vp9_parse_alignment_to_string (self->align), NULL); + + profile = gst_vp9_parse_profile_to_string (self->profile); + if (profile) + gst_caps_set_simple (final_caps, "profile", G_TYPE_STRING, profile, NULL); + + gst_caps_set_simple (final_caps, "codec-alpha", G_TYPE_BOOLEAN, + self->codec_alpha, NULL); + + src_caps = gst_pad_get_current_caps (GST_BASE_PARSE_SRC_PAD (self)); + + if (!(src_caps && gst_caps_is_strictly_equal (src_caps, final_caps))) { + GST_DEBUG_OBJECT (self, "Update src caps %" GST_PTR_FORMAT, final_caps); + gst_pad_set_caps (GST_BASE_PARSE_SRC_PAD (self), final_caps); + } + + gst_clear_caps (&src_caps); + gst_caps_unref (final_caps); + gst_caps_unref (sink_caps); + + self->update_caps = FALSE; +} + +static 
GstFlowReturn +gst_vp9_parse_parse_frame (GstVp9Parse * self, GstBaseParseFrame * frame, + GstVp9FrameHdr * frame_hdr) +{ + GstBuffer *buffer; + + buffer = frame->buffer; + + gst_vp9_parse_update_src_caps (self, NULL); + + if (frame_hdr->frame_type == GST_VP9_KEY_FRAME) + GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); + else + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); + + if (self->align == GST_VP9_PARSE_ALIGN_FRAME) { + if (!frame_hdr->show_frame) + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DECODE_ONLY); + else + GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DECODE_ONLY); + } + + if (self->discont) { + GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DISCONT); + self->discont = FALSE; + } + + return GST_FLOW_OK; +} + +static gboolean +gst_vp9_parse_set_sink_caps (GstBaseParse * parse, GstCaps * caps) +{ + GstVp9Parse *self = GST_VP9_PARSE (parse); + GstStructure *str; + GstVp9ParseAligment align; + GstCaps *in_caps = NULL; + const gchar *profile; + + str = gst_caps_get_structure (caps, 0); + + /* accept upstream info if provided */ + gst_structure_get_int (str, "width", &self->width); + gst_structure_get_int (str, "height", &self->height); + profile = gst_structure_get_string (str, "profile"); + if (profile) + self->profile = gst_vp9_parse_profile_from_string (profile); + gst_structure_get_boolean (str, "codec-alpha", &self->codec_alpha); + + /* get upstream align from caps */ + gst_vp9_parse_alignment_from_caps (caps, &align); + + /* default */ + if (align == GST_VP9_PARSE_ALIGN_NONE) + align = GST_VP9_PARSE_ALIGN_SUPER_FRAME; + + /* prefer alignment type determined above */ + in_caps = gst_caps_copy (caps); + gst_caps_set_simple (in_caps, "alignment", G_TYPE_STRING, + gst_vp9_parse_alignment_to_string (align), NULL); + + /* negotiate with downstream, set output align */ + gst_vp9_parse_negotiate (self, align, in_caps); + + self->update_caps = TRUE; + + /* if all of decoder's capability related values are provided + * by upstream, 
update src caps now */ + if (self->width > 0 && self->height > 0 && profile) + gst_vp9_parse_update_src_caps (self, in_caps); + + gst_caps_unref (in_caps); + + self->in_align = align; + + return TRUE; +} + +static void +remove_fields (GstCaps * caps, gboolean all) +{ + guint i, n; + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + GstStructure *s = gst_caps_get_structure (caps, i); + + if (all) { + gst_structure_remove_field (s, "alignment"); + } + gst_structure_remove_field (s, "parsed"); + } +} + +static GstCaps * +gst_vp9_parse_get_sink_caps (GstBaseParse * parse, GstCaps * filter) +{ + GstCaps *peercaps, *templ; + GstCaps *res, *tmp, *pcopy; + + templ = gst_pad_get_pad_template_caps (GST_BASE_PARSE_SINK_PAD (parse)); + if (filter) { + GstCaps *fcopy = gst_caps_copy (filter); + /* Remove the fields we convert */ + remove_fields (fcopy, TRUE); + peercaps = gst_pad_peer_query_caps (GST_BASE_PARSE_SRC_PAD (parse), fcopy); + gst_caps_unref (fcopy); + } else { + peercaps = gst_pad_peer_query_caps (GST_BASE_PARSE_SRC_PAD (parse), NULL); + } + + pcopy = gst_caps_copy (peercaps); + remove_fields (pcopy, TRUE); + + res = gst_caps_intersect_full (pcopy, templ, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (pcopy); + gst_caps_unref (templ); + + if (filter) { + GstCaps *tmp = gst_caps_intersect_full (res, filter, + GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (res); + res = tmp; + } + + /* Try if we can put the downstream caps first */ + pcopy = gst_caps_copy (peercaps); + remove_fields (pcopy, FALSE); + tmp = gst_caps_intersect_full (pcopy, res, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (pcopy); + if (!gst_caps_is_empty (tmp)) + res = gst_caps_merge (tmp, res); + else + gst_caps_unref (tmp); + + gst_caps_unref (peercaps); + + return res; +}
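The alignment helpers at the top of the gstvp9parse.c diff reduce to a small string↔enum mapping plus a super-frame default applied during negotiation. Below is a standalone sketch of that logic in plain C, with `strcmp` standing in for `g_strcmp0`; the type `Vp9Alignment` and the helper `alignment_with_default` are illustrative names, not part of the GStreamer API.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Illustrative stand-ins for the GstVp9ParseAligment values used in the
 * diff above (the real enum lives in gstvp9parse.c; these are assumptions). */
typedef enum {
  ALIGN_NONE,
  ALIGN_SUPER_FRAME,
  ALIGN_FRAME
} Vp9Alignment;

/* Mirrors gst_vp9_parse_alignment_to_string(): NULL for unknown values. */
static const char *
alignment_to_string (Vp9Alignment align)
{
  switch (align) {
    case ALIGN_SUPER_FRAME:
      return "super-frame";
    case ALIGN_FRAME:
      return "frame";
    default:
      return NULL;
  }
}

/* Mirrors gst_vp9_parse_alignment_from_string(), with plain strcmp()
 * in place of g_strcmp0(); a NULL string maps to ALIGN_NONE. */
static Vp9Alignment
alignment_from_string (const char *align)
{
  if (!align)
    return ALIGN_NONE;
  if (strcmp (align, "super-frame") == 0)
    return ALIGN_SUPER_FRAME;
  if (strcmp (align, "frame") == 0)
    return ALIGN_FRAME;
  return ALIGN_NONE;
}

/* The negotiation code falls back to super-frame alignment whenever the
 * caps yield ALIGN_NONE; the same default is collapsed into one helper. */
static Vp9Alignment
alignment_with_default (const char *align)
{
  Vp9Alignment a = alignment_from_string (align);
  return (a == ALIGN_NONE) ? ALIGN_SUPER_FRAME : a;
}
```

The round-trip property (`alignment_from_string (alignment_to_string (a)) == a` for the two valid values) is what lets the parser carry the alignment through caps strings losslessly.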
View file
gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/gstvp9parse.h
Added
@@ -0,0 +1,34 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_VP9_PARSE_H__ +#define __GST_VP9_PARSE_H__ + +#include <gst/gst.h> +#include <gst/base/gstbaseparse.h> + +G_BEGIN_DECLS + +#define GST_TYPE_VP9_PARSE (gst_vp9_parse_get_type()) +G_DECLARE_FINAL_TYPE (GstVp9Parse, + gst_vp9_parse, GST, VP9_PARSE, GstBaseParse); + +G_END_DECLS + +#endif /* __GST_VP9_PARSE_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/meson.build -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/meson.build
Changed
@@ -1,5 +1,6 @@ vparse_sources = [ 'plugin.c', + 'gstvideoparserselement.c', 'h263parse.c', 'gsth263parse.c', 'gstdiracparse.c', @@ -12,6 +13,8 @@ 'gsth265parse.c', 'gstvideoparseutils.c', 'gstjpeg2000parse.c', + 'gstvp9parse.c', + 'gstav1parse.c', ] gstvideoparsersbad = library('gstvideoparsersbad',
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videoparsers/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videoparsers/plugin.c
Changed
@@ -22,44 +22,34 @@ #include "config.h" #endif -#include "gsth263parse.h" -#include "gsth264parse.h" -#include "gstdiracparse.h" -#include "gstmpegvideoparse.h" -#include "gstmpeg4videoparse.h" -#include "gstpngparse.h" -#include "gstjpeg2000parse.h" -#include "gstvc1parse.h" -#include "gsth265parse.h" - -GST_DEBUG_CATEGORY (videoparseutils_debug); +#include "gstvideoparserselements.h" static gboolean plugin_init (GstPlugin * plugin) { gboolean ret = FALSE; - GST_DEBUG_CATEGORY_INIT (videoparseutils_debug, "videoparseutils", 0, - "video parse utilities"); - - ret |= gst_element_register (plugin, "h263parse", - GST_RANK_PRIMARY + 1, GST_TYPE_H263_PARSE); - ret |= gst_element_register (plugin, "h264parse", - GST_RANK_PRIMARY + 1, GST_TYPE_H264_PARSE); - ret |= gst_element_register (plugin, "diracparse", - GST_RANK_NONE, GST_TYPE_DIRAC_PARSE); - ret |= gst_element_register (plugin, "mpegvideoparse", - GST_RANK_PRIMARY + 1, GST_TYPE_MPEGVIDEO_PARSE); - ret |= gst_element_register (plugin, "mpeg4videoparse", - GST_RANK_PRIMARY + 1, GST_TYPE_MPEG4VIDEO_PARSE); - ret |= gst_element_register (plugin, "pngparse", - GST_RANK_PRIMARY, GST_TYPE_PNG_PARSE); - ret |= gst_element_register (plugin, "jpeg2000parse", - GST_RANK_PRIMARY, GST_TYPE_JPEG2000_PARSE); - ret |= gst_element_register (plugin, "h265parse", - GST_RANK_SECONDARY, GST_TYPE_H265_PARSE); - ret |= gst_element_register (plugin, "vc1parse", - GST_RANK_NONE, GST_TYPE_VC1_PARSE); + ret |= GST_ELEMENT_REGISTER (h263parse, plugin); + ret |= GST_ELEMENT_REGISTER (h264parse, plugin); + ret |= GST_ELEMENT_REGISTER (diracparse, plugin); + ret |= GST_ELEMENT_REGISTER (mpegvideoparse, plugin); + ret |= GST_ELEMENT_REGISTER (mpeg4videoparse, plugin); + ret |= GST_ELEMENT_REGISTER (pngparse, plugin); + ret |= GST_ELEMENT_REGISTER (jpeg2000parse, plugin); + ret |= GST_ELEMENT_REGISTER (h265parse, plugin); + ret |= GST_ELEMENT_REGISTER (vc1parse, plugin); + /** + * element-vp9parse: + * + * Since: 1.20 + */ + ret |= 
GST_ELEMENT_REGISTER (vp9parse, plugin); + /** + * element-av1parse: + * + * Since: 1.20 + */ + ret |= GST_ELEMENT_REGISTER (av1parse, plugin); return ret; }
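The plugin.c hunk above shows the recurring 1.18 → 1.20 migration in this diff: explicit `gst_element_register()` calls in `plugin_init()` are replaced by per-element `GST_ELEMENT_REGISTER_DEFINE` / `GST_ELEMENT_REGISTER` macro pairs. The following is an illustrative plain-C analogue of the naming pattern only — it is not the real GStreamer macro expansion, and `ELEMENT_REGISTER_DEFINE` / `register_##e` are hypothetical names used to show the shape.

```c
#include <assert.h>

/* Hypothetical analogue of GST_ELEMENT_REGISTER_DEFINE: emit one
 * registration function per element. The real macro calls
 * gst_element_register() with the element's name, rank, and GType. */
#define ELEMENT_REGISTER_DEFINE(e) \
  static int register_##e (void) { return 1; /* registration succeeded */ }

/* Hypothetical analogue of GST_ELEMENT_REGISTER: invoke that function. */
#define ELEMENT_REGISTER(e) register_##e ()

/* One definition per element, typically next to its type definition... */
ELEMENT_REGISTER_DEFINE (vp9parse)
ELEMENT_REGISTER_DEFINE (av1parse)

/* ...and plugin_init() then ORs the per-element results together,
 * matching the `ret |= GST_ELEMENT_REGISTER (...)` pattern above. */
static int
plugin_init (void)
{
  int ret = 0;
  ret |= ELEMENT_REGISTER (vp9parse);
  ret |= ELEMENT_REGISTER (av1parse);
  return ret;
}
```

The point of the pattern is that each element owns its registration (declared in its header via `GST_ELEMENT_REGISTER_DECLARE`, defined next to its type), so static builds can register individual elements without pulling in the whole plugin's `plugin_init()`.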
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstsimplevideomark.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstsimplevideomark.c
Changed
@@ -106,6 +106,8 @@ GST_TYPE_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_video_mark_debug_category, "simplevideomark", 0, "debug category for simplevideomark element")); +GST_ELEMENT_REGISTER_DEFINE (simplevideomark, "simplevideomark", + GST_RANK_NONE, GST_TYPE_SIMPLE_VIDEO_MARK); static void gst_video_mark_class_init (GstSimpleVideoMarkClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstsimplevideomark.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstsimplevideomark.h
Changed
@@ -56,6 +56,8 @@ GType gst_video_mark_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (simplevideomark); + G_END_DECLS #endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstsimplevideomarkdetect.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstsimplevideomarkdetect.c
Changed
@@ -128,6 +128,8 @@ GST_DEBUG_CATEGORY_INIT (gst_video_detect_debug_category, "simplevideomarkdetect", 0, "debug category for simplevideomarkdetect element")); +GST_ELEMENT_REGISTER_DEFINE (simplevideomarkdetect, + "simplevideomarkdetect", GST_RANK_NONE, GST_TYPE_SIMPLE_VIDEO_MARK_DETECT); static void gst_video_detect_class_init (GstSimpleVideoMarkDetectClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstsimplevideomarkdetect.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstsimplevideomarkdetect.h
Changed
@@ -58,6 +58,8 @@ GType gst_video_detect_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (simplevideomarkdetect); + G_END_DECLS #endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstvideoanalyse.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstvideoanalyse.c
Changed
@@ -87,6 +87,8 @@ GST_TYPE_VIDEO_FILTER, GST_DEBUG_CATEGORY_INIT (gst_video_analyse_debug_category, "videoanalyse", 0, "debug category for videoanalyse element")); +GST_ELEMENT_REGISTER_DEFINE (videoanalyse, "videoanalyse", + GST_RANK_NONE, GST_TYPE_VIDEO_ANALYSE); static void gst_video_analyse_class_init (GstVideoAnalyseClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstvideoanalyse.h -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstvideoanalyse.h
Changed
@@ -52,6 +52,8 @@ GType gst_video_analyse_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (videoanalyse); + G_END_DECLS #endif
View file
gst-plugins-bad-1.18.6.tar.xz/gst/videosignal/gstvideosignal.c -> gst-plugins-bad-1.20.1.tar.xz/gst/videosignal/gstvideosignal.c
Changed
@@ -28,22 +28,16 @@ static gboolean plugin_init (GstPlugin * plugin) { - gboolean res; - - res = gst_element_register (plugin, "videoanalyse", GST_RANK_NONE, - GST_TYPE_VIDEO_ANALYSE); + gboolean ret = FALSE; + ret |= GST_ELEMENT_REGISTER (videoanalyse, plugin); /* FIXME under no circumstances is anyone allowed to revive the * element formerly known as simplevideomarkdetect without changing the name * first. XOXO --ds */ + ret |= GST_ELEMENT_REGISTER (simplevideomarkdetect, plugin); + ret |= GST_ELEMENT_REGISTER (simplevideomark, plugin); - res &= gst_element_register (plugin, "simplevideomarkdetect", GST_RANK_NONE, - GST_TYPE_SIMPLE_VIDEO_MARK_DETECT); - - res &= gst_element_register (plugin, "simplevideomark", GST_RANK_NONE, - GST_TYPE_SIMPLE_VIDEO_MARK); - - return res; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/vmnc/vmncdec.c -> gst-plugins-bad-1.20.1.tar.xz/gst/vmnc/vmncdec.c
Changed
@@ -79,6 +79,8 @@ ); G_DEFINE_TYPE (GstVMncDec, gst_vmnc_dec, GST_TYPE_VIDEO_DECODER); +GST_ELEMENT_REGISTER_DEFINE (vmncdec, "vmncdec", GST_RANK_PRIMARY, + GST_TYPE_VMNC_DEC); static void gst_vmnc_dec_class_init (GstVMncDecClass * klass) @@ -957,10 +959,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "vmncdec", GST_RANK_PRIMARY, - GST_TYPE_VMNC_DEC)) - return FALSE; - return TRUE; + return GST_ELEMENT_REGISTER (vmncdec, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/vmnc/vmncdec.h -> gst-plugins-bad-1.20.1.tar.xz/gst/vmnc/vmncdec.h
Changed
@@ -104,7 +104,7 @@ } GstVMncDecClass; GType gst_vmnc_dec_get_type (void); - +GST_ELEMENT_REGISTER_DECLARE (vmncdec); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/gst/y4m/gsty4mdec.c -> gst-plugins-bad-1.20.1.tar.xz/gst/y4m/gsty4mdec.c
Changed
@@ -92,7 +92,9 @@ /* class initialization */ #define gst_y4m_dec_parent_class parent_class G_DEFINE_TYPE (GstY4mDec, gst_y4m_dec, GST_TYPE_ELEMENT); - +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (y4mdec, "y4mdec", GST_RANK_SECONDARY, + gst_y4m_dec_get_type (), GST_DEBUG_CATEGORY_INIT (y4mdec_debug, "y4mdec", 0, + "y4mdec element")); static void gst_y4m_dec_class_init (GstY4mDecClass * klass) { @@ -894,16 +896,9 @@ static gboolean plugin_init (GstPlugin * plugin) { - - gst_element_register (plugin, "y4mdec", GST_RANK_SECONDARY, - gst_y4m_dec_get_type ()); - - GST_DEBUG_CATEGORY_INIT (y4mdec_debug, "y4mdec", 0, "y4mdec element"); - - return TRUE; + return GST_ELEMENT_REGISTER (y4mdec, plugin); } - GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, y4mdec,
View file
gst-plugins-bad-1.18.6.tar.xz/gst/y4m/gsty4mdec.h -> gst-plugins-bad-1.20.1.tar.xz/gst/y4m/gsty4mdec.h
Changed
@@ -62,6 +62,7 @@ }; GType gst_y4m_dec_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (y4mdec); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/meson.build -> gst-plugins-bad-1.20.1.tar.xz/meson.build
Changed
@@ -1,6 +1,6 @@ project('gst-plugins-bad', 'c', 'cpp', - version : '1.18.6', - meson_version : '>= 0.49', + version : '1.20.1', + meson_version : '>= 0.59', default_options : [ 'warning_level=1', 'buildtype=debugoptimized' ]) @@ -14,11 +14,17 @@ else gst_version_nano = 0 endif -gst_version_is_dev = gst_version_minor % 2 == 1 and gst_version_micro < 90 +gst_version_is_stable = gst_version_minor.is_even() +gst_version_is_dev = gst_version_minor.is_odd() and gst_version_micro < 90 -glib_req = '>= 2.44.0' +glib_req = '>= 2.56.0' orc_req = '>= 0.4.17' -gst_req = '>= @0@.@1@.0'.format(gst_version_major, gst_version_minor) + +if gst_version_is_stable + gst_req = '>= @0@.@1@.0'.format(gst_version_major, gst_version_minor) +else + gst_req = '>= ' + gst_version +endif api_version = '1.0' soversion = 0 @@ -29,7 +35,9 @@ osxversion = curversion + 1 plugins_install_dir = join_paths(get_option('libdir'), 'gstreamer-1.0') +static_build = get_option('default_library') == 'static' plugins = [] +libraries = [] cc = meson.get_compiler('c') cxx = meson.get_compiler('cpp') @@ -46,19 +54,23 @@ cdata = configuration_data() if cc.get_id() == 'msvc' - # Ignore several spurious warnings for things gstreamer does very commonly - # If a warning is completely useless and spammy, use '/wdXXXX' to suppress it - # If a warning is harmless but hard to fix, use '/woXXXX' so it's shown once - # NOTE: Only add warnings here if you are sure they're spurious msvc_args = [ + # Ignore several spurious warnings for things gstreamer does very commonly + # If a warning is completely useless and spammy, use '/wdXXXX' to suppress it + # If a warning is harmless but hard to fix, use '/woXXXX' so it's shown once + # NOTE: Only add warnings here if you are sure they're spurious '/wd4018', # implicit signed/unsigned conversion '/wd4146', # unary minus on unsigned (beware INT_MIN) '/wd4244', # lossy type conversion (e.g. double -> int) '/wd4305', # truncating type conversion (e.g. 
double -> float) cc.get_supported_arguments(['/utf-8']), # set the input encoding to utf-8 + + # Enable some warnings on MSVC to match GCC/Clang behaviour + '/w14062', # enumerator 'identifier' in switch of enum 'enumeration' is not handled + '/w14101', # 'identifier' : unreferenced local variable + '/w14189', # 'identifier' : local variable is initialized but not referenced ] - add_project_arguments(msvc_args, language : 'c') - add_project_arguments(msvc_args, language : 'cpp') + add_project_arguments(msvc_args, language: ['c', 'cpp']) # Disable SAFESEH with MSVC for plugins and libs that use external deps that # are built with MinGW noseh_link_args = ['/SAFESEH:NO'] @@ -158,6 +170,7 @@ ['HAVE_DCGETTEXT', 'dcgettext'], ['HAVE_GETPAGESIZE', 'getpagesize'], ['HAVE_GMTIME_R', 'gmtime_r'], + ['HAVE_MEMFD_CREATE', 'memfd_create'], ['HAVE_MMAP', 'mmap'], ['HAVE_PIPE2', 'pipe2'], ['HAVE_GETRUSAGE', 'getrusage', '#include<sys/resource.h>'], @@ -308,9 +321,9 @@ if gstgl_dep.found() if gstgl_dep.type_name() == 'pkgconfig' - gst_gl_apis = gstgl_dep.get_pkgconfig_variable('gl_apis').split() - gst_gl_winsys = gstgl_dep.get_pkgconfig_variable('gl_winsys').split() - gst_gl_platforms = gstgl_dep.get_pkgconfig_variable('gl_platforms').split() + gst_gl_apis = gstgl_dep.get_variable('gl_apis').split() + gst_gl_winsys = gstgl_dep.get_variable('gl_winsys').split() + gst_gl_platforms = gstgl_dep.get_variable('gl_platforms').split() else gstbase = subproject('gst-plugins-base') gst_gl_apis = gstbase.get_variable('enabled_gl_apis') @@ -364,6 +377,20 @@ cdata.set('HAVE_X11', 1) endif +# +# Solaris and Illumos distros split a lot of networking-related code +# into '-lsocket -lnsl'. Anything that calls socketpair(), getifaddr(), +# etc. 
probably needs to include network_deps +# +if host_machine.system() == 'sunos' + network_deps = [ + cc.find_library('socket', required: false), + cc.find_library('nsl', required: false) + ] +else + network_deps = [] +endif + if host_machine.system() == 'windows' winsock2 = [cc.find_library('ws2_32')] else @@ -461,6 +488,26 @@ plugins_pkgconfig_install_dir = disabler() endif +pkgconfig_variables = ['exec_prefix=${prefix}', + 'toolsdir=${exec_prefix}/bin', + 'pluginsdir=${libdir}/gstreamer-1.0', + 'datarootdir=${prefix}/share', + 'datadir=${datarootdir}', + 'girdir=${datadir}/gir-1.0', + 'typelibdir=${libdir}/girepository-1.0'] + +pkgconfig_subdirs = ['gstreamer-1.0'] + +pkgconfig.generate( + libraries : [gst_dep], + variables : pkgconfig_variables, + subdirs : pkgconfig_subdirs, + name : 'gstreamer-plugins-bad-1.0', + description : 'Streaming media framework, bad plugins libraries', +) + +gpl_allowed = get_option('gpl').allowed() + subdir('gst-libs') subdir('gst') subdir('sys') @@ -468,7 +515,6 @@ subdir('tests') subdir('data') subdir('tools') -subdir('pkgconfig') if have_orcc update_orc_dist_files = find_program('scripts/update-orc-dist-files.py') @@ -488,7 +534,7 @@ ] endforeach - if meson.version().version_compare('>= 0.52') and orc_update_targets.length() > 0 + if orc_update_targets.length() > 0 update_orc_dist_target = alias_target('update-orc-dist', orc_update_targets) endif endif @@ -504,34 +550,30 @@ # Set release date if gst_version_nano == 0 extract_release_date = find_program('scripts/extract-release-date-from-doap-file.py') - run_result = run_command(extract_release_date, gst_version, files('gst-plugins-bad.doap')) - if run_result.returncode() == 0 - release_date = run_result.stdout().strip() - cdata.set_quoted('GST_PACKAGE_RELEASE_DATETIME', release_date) - message('Package release date: ' + release_date) - else - # Error out if our release can't be found in the .doap file - error(run_result.stderr()) - endif + run_result = 
run_command(extract_release_date, gst_version, files('gst-plugins-bad.doap'), check: true) + release_date = run_result.stdout().strip() + cdata.set_quoted('GST_PACKAGE_RELEASE_DATETIME', release_date) + message('Package release date: ' + release_date) endif -configure_file(output : 'config.h', configuration : cdata) +if glib_dep.version().version_compare('< 2.67.4') + cdata.set('g_memdup2(ptr,sz)', '(G_LIKELY(((guint64)(sz)) < G_MAXUINT)) ? g_memdup(ptr,sz) : (g_abort(),NULL)') +endif -run_command(python3, '-c', 'import shutil; shutil.copy("hooks/pre-commit.hook", ".git/hooks/pre-commit")') +configure_file(output : 'config.h', configuration : cdata) subdir('docs') -if meson.version().version_compare('>= 0.54') - plugin_names = [] - foreach plugin: plugins - # FIXME: Use str.subtring() when we can depend on Meson 0.56 - split = plugin.name().split('gst') - if split.length() == 2 - plugin_names += [split[1]] - else - warning('Need substring API in meson >= 0.56 to properly parse plugin name: ' + plugin.name()) - plugin_names += [plugin.name()] - endif - endforeach - summary({'Plugins':plugin_names}, list_sep: ', ') -endif +plugin_names = [] +foreach plugin: plugins + if plugin.name().startswith('gst') + plugin_names += [plugin.name().substring(3)] + else + plugin_names += [plugin.name()] + endif +endforeach + +summary({ + 'Plugins': plugin_names, + '(A)GPL license allowed': gpl_allowed, +}, list_sep: ', ')
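The meson.build hunk above defines a `g_memdup2` fallback for GLib older than 2.67.4, guarding against the size truncation that motivated `g_memdup2` in the first place: `g_memdup()` takes a `guint` size, so a 64-bit length silently truncates. The sketch below shows the same overflow guard in plain C; `memdup2_checked` is an illustrative helper, not a GLib API, and it returns NULL where the meson fallback calls `g_abort()`.

```c
#include <assert.h>
#include <limits.h>
#include <stdlib.h>
#include <string.h>

/* Bounds-checked duplicate of a memory block. Mirrors the
 * `((guint64)(sz)) < G_MAXUINT` guard from the meson fallback above
 * before handing the size to an API limited to unsigned int. */
static void *
memdup2_checked (const void *mem, size_t byte_size)
{
  void *copy;

  /* g_memdup()/g_memdup2() return NULL for NULL input or zero size */
  if (mem == NULL || byte_size == 0)
    return NULL;

  /* reject sizes that would truncate in a guint-sized parameter
   * (the real fallback aborts here instead of returning NULL) */
  if (byte_size > UINT_MAX)
    return NULL;

  copy = malloc (byte_size);
  if (copy)
    memcpy (copy, mem, byte_size);
  return copy;
}
```

With the guard in place, a caller holding a `gsize` length can never feed a silently wrapped size into the copy, which is the CVE-class bug `g_memdup2` was introduced to close.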
View file
gst-plugins-bad-1.18.6.tar.xz/meson_options.txt -> gst-plugins-bad-1.20.1.tar.xz/meson_options.txt
Changed
@@ -1,5 +1,5 @@
-option('gst_player_tests', type: 'boolean', value: false,
-       description: 'Enable GstPlayer tests that need network access')
+option('gst_play_tests', type: 'boolean', value: false,
+       description: 'Enable GstPlay tests that need network access')
# Feature options for plugins without external deps
option('accurip', type : 'feature', value : 'auto')
@@ -15,6 +15,7 @@
option('autoconvert', type : 'feature', value : 'auto')
option('bayer', type : 'feature', value : 'auto')
option('camerabin2', type : 'feature', value : 'auto')
+option('codecalpha', type : 'feature', value : 'auto')
option('coloreffects', type : 'feature', value : 'auto')
option('debugutils', type : 'feature', value : 'auto')
option('dvbsubenc', type : 'feature', value : 'auto')
@@ -75,10 +76,13 @@
option('x11', type : 'feature', value : 'auto',
       description : 'X11 support in Vulkan, GL and rfb plugins')
# Feature options for plugins that need external deps
+option('aes', type : 'feature', value : 'auto', description : 'AES encryption/decryption plugin')
option('aom', type : 'feature', value : 'auto', description : 'AOM AV1 video codec plugin')
option('avtp', type : 'feature', value : 'auto', description : 'Audio/Video Transport Protocol (AVTP) plugin')
option('androidmedia', type : 'feature', value : 'auto', description : 'Video capture and codec plugins for Android')
option('applemedia', type : 'feature', value : 'auto', description : 'Video capture and codec access plugins for macOS and iOS')
+option('asio', type : 'feature', value : 'auto', description : 'Steinberg Audio Streaming Input Output (ASIO) plugin')
+option('asio-sdk-path', type : 'string', value : '', description : 'Full path to Steinberg Audio Streaming Input Output (ASIO) SDK')
option('assrender', type : 'feature', value : 'auto', description : 'ASS/SSA subtitle renderer plugin')
option('bluez', type : 'feature', value : 'auto', description : 'Bluetooth audio A2DP/AVDTP sink, AVDTP source plugin')
option('bs2b', type : 'feature', value : 'auto', description : 'Bauer stereophonic-to-binaural audio plugin')
@@ -96,35 +100,37 @@
option('directfb', type : 'feature', value : 'auto', description : 'DirectFB video sink plugin')
option('directsound', type : 'feature', value : 'auto', description : 'Directsound audio source plugin')
option('dtls', type : 'feature', value : 'auto', description : 'DTLS encoder and decoder plugin')
-option('dts', type : 'feature', value : 'auto', description : 'DTS audio decoder plugin')
+option('dts', type : 'feature', value : 'auto', description : 'DTS audio decoder plugin (GPL - only built if gpl option is also enabled!)')
option('dvb', type : 'feature', value : 'auto', description : 'DVB video bin and source plugin')
option('faac', type : 'feature', value : 'auto', description : 'Free AAC audio encoder plugin')
-option('faad', type : 'feature', value : 'auto', description : 'Free AAC audio decoder plugin')
+option('faad', type : 'feature', value : 'auto', description : 'Free AAC audio decoder plugin (GPL - only built if gpl option is also enabled!)')
option('fbdev', type : 'feature', value : 'auto', description : 'Framebuffer video sink plugin')
option('fdkaac', type : 'feature', value : 'auto', description : 'Fraunhofer AAC audio codec plugin')
option('flite', type : 'feature', value : 'auto', description : 'Flite speech synthesizer source plugin')
option('fluidsynth', type : 'feature', value : 'auto', description : 'Fluidsynth MIDI decoder plugin')
option('gl', type : 'feature', value : 'auto', description : 'GStreamer OpenGL integration support (used by various plugins)')
option('gme', type : 'feature', value : 'auto', description : 'libgme gaming console music file decoder plugin')
+option('gs', type : 'feature', value : 'auto', description : 'Google Cloud Storage source and sink plugin')
option('gsm', type : 'feature', value : 'auto', description : 'GSM encoder/decoder plugin')
option('ipcpipeline', type : 'feature', value : 'auto', description : 'Inter-process communication plugin')
-option('iqa', type : 'feature', value : 'auto', description : 'Image quality assessment plugin')
+option('iqa', type : 'feature', value : 'auto', description : 'Image quality assessment plugin (AGPL - only built if gpl option is also enabled!)')
option('kate', type : 'feature', value : 'auto', description : 'Kate subtitle parser, tagger, and codec plugin')
option('kms', type : 'feature', value : 'auto', description : 'KMS video sink plugin')
option('ladspa', type : 'feature', value : 'auto', description : 'LADSPA plugin bridge')
+option('ldac', type : 'feature', value : 'auto', description : 'LDAC bluetooth audio codec plugin')
option('libde265', type : 'feature', value : 'auto', description : 'HEVC/H.265 video decoder plugin')
-option('libmms', type : 'feature', value : 'auto', description : 'Microsoft multimedia server network source plugin')
+option('openaptx', type : 'feature', value : 'auto', description : 'Open Source implementation of Audio Processing Technology codec (aptX) plugin')
option('lv2', type : 'feature', value : 'auto', description : 'LV2 audio plugin bridge')
option('mediafoundation', type : 'feature', value : 'auto', description : 'Microsoft Media Foundation plugin')
option('microdns', type : 'feature', value : 'auto', description : 'libmicrodns-based device provider')
option('modplug', type : 'feature', value : 'auto', description : 'ModPlug audio decoder plugin')
-option('mpeg2enc', type : 'feature', value : 'auto', description : 'mpeg2enc video encoder plugin')
-option('mplex', type : 'feature', value : 'auto', description : 'mplex audio/video multiplexer plugin')
+option('mpeg2enc', type : 'feature', value : 'auto', description : 'mpeg2enc video encoder plugin (GPL - only built if gpl option is also enabled!)')
+option('mplex', type : 'feature', value : 'auto', description : 'mplex audio/video multiplexer plugin (GPL - only built if gpl option is also enabled!)')
option('msdk', type : 'feature', value : 'auto', description : 'Intel Media SDK video encoder/decoder plugin')
option('musepack', type : 'feature', value : 'auto', description : 'libmpcdec Musepack decoder plugin')
option('neon', type : 'feature', value : 'auto', description : 'NEON HTTP source plugin')
option('nvcodec', type : 'feature', value : 'auto', description : 'NVIDIA GPU codec plugin')
-option('ofa', type : 'feature', value : 'auto', description : 'Open Fingerprint Architecture library plugin')
+option('onnx', type : 'feature', value : 'auto', description : 'ONNX neural network plugin')
option('openal', type : 'feature', value : 'auto', description : 'OpenAL plugin')
option('openexr', type : 'feature', value : 'auto', description : 'OpenEXR plugin')
option('openh264', type : 'feature', value : 'auto', description : 'H.264 video codec plugin')
@@ -133,7 +139,8 @@
option('openni2', type : 'feature', value : 'auto', description : 'OpenNI2 library plugin')
option('opensles', type : 'feature', value : 'auto', description : 'OpenSL ES audio source/sink plugin')
option('opus', type : 'feature', value : 'auto', description : 'OPUS audio parser plugin')
-option('resindvd', type : 'feature', value : 'auto', description : 'Resin DVD playback plugin')
+option('qroverlay', type : 'feature', value : 'auto', description : 'Element to set random data on a qroverlay')
+option('resindvd', type : 'feature', value : 'auto', description : 'Resin DVD playback plugin (GPL - only built if gpl option is also enabled!)')
option('rsvg', type : 'feature', value : 'auto', description : 'SVG overlayer and image decoder plugin')
option('rtmp', type : 'feature', value : 'auto', description : 'RTMP video network source and sink plugin')
option('sbc', type : 'feature', value : 'auto', description : 'SBC bluetooth audio codec plugin')
@@ -163,12 +170,13 @@
option('wildmidi', type : 'feature', value : 'auto', description : 'WildMidi midi soft synth plugin')
option('winks', type : 'feature', value : 'auto', description : 'Windows Kernel Streaming video source plugin')
option('winscreencap', type : 'feature', value : 'auto', description : 'Windows Screen Capture video source plugin')
-option('x265', type : 'feature', value : 'auto', description : 'HEVC/H.265 video encoder plugin')
+option('x265', type : 'feature', value : 'auto', description : 'HEVC/H.265 video encoder plugin (GPL - only built if gpl option is also enabled!)')
option('zbar', type : 'feature', value : 'auto', description : 'Barcode image scanner plugin using zbar library')
option('zxing', type : 'feature', value : 'auto', description : 'Barcode image scanner plugin using zxing-cpp library')
option('wpe', type : 'feature', value : 'auto', description : 'WPE Web browser plugin')
option('magicleap', type : 'feature', value : 'auto', description : 'Magic Leap platform support')
option('v4l2codecs', type : 'feature', value : 'auto', description : 'Video4Linux Stateless CODECs support')
+option('isac', type : 'feature', value : 'auto', description : 'iSAC plugin')
# HLS plugin options
option('hls', type : 'feature', value : 'auto', description : 'HTTP Live Streaming plugin')
@@ -179,6 +187,14 @@
option('sctp-internal-usrsctp', type: 'feature', value : 'enabled',
       description: 'Whether to use the bundled usrsctp library or the system one')
+# MSDK plugin options
+option('mfx_api', type : 'combo', choices : ['MSDK', 'oneVPL', 'auto'], value : 'auto',
+       description : 'Select MFX API to build against')
+
+# License-related feature options
+option('gpl', type: 'feature', value: 'disabled', yield: true,
+       description: 'Allow build plugins that have (A)GPL-licensed dependencies')
+
# Common feature options
option('examples', type : 'feature', value : 'auto', yield : true)
option('tests', type : 'feature', value : 'auto', yield : true)
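The key change in this revision is the new `gpl` option: starting with the 1.20 series, plugins with (A)GPL-licensed dependencies (dts, faad, iqa, mpeg2enc, mplex, resindvd, x265) are only built when `gpl` is also enabled, which is why the spec file above now passes `-Dgpl=enabled`. A minimal configure sketch (the build directory name and the exact plugin selection are illustrative, not taken from the package):

```shell
# Configure a gst-plugins-bad 1.20.x source checkout so that the
# GPL-gated codec plugins are actually built; mirrors the flags the
# spec file passes (-Dgpl=enabled, -Ddts=enabled, -Dfaac=enabled, ...).
meson setup builddir \
    -Dgpl=enabled \
    -Ddts=enabled \
    -Dfaac=enabled \
    -Dlibde265=enabled
meson compile -C builddir
```

With `gpl` left at its default of `disabled`, the GPL-gated plugins are not built even if their individual feature options request them, so the added flag is what keeps the codec set of the earlier 1.18 packages.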
View file
gst-plugins-bad-1.18.6.tar.xz/po/LINGUAS -> gst-plugins-bad-1.20.1.tar.xz/po/LINGUAS
Changed
@@ -1,1 +1,1 @@
-af ast az bg ca cs da de el en_GB eo es eu fi fr fur gl hr hu id it ja ky lt lv mt nb nl or pl pt_BR ro ru sk sl sq sr sv tr uk vi zh_CN
+af ast az bg ca cs da de el en_GB eo es eu fi fr fur gl hr hu id it ja ky lt lv mt nb nl or pl pt_BR ro ru sk sl sq sr sv tr uk vi zh_CN zh_TW
View file
gst-plugins-bad-1.18.6.tar.xz/po/af.po -> gst-plugins-bad-1.20.1.tar.xz/po/af.po
Changed
@@ -6,8 +6,8 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins 0.7.6\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2004-03-18 14:16+0200\n"
"Last-Translator: Petri Jooste <rkwjpj@puk.ac.za>\n"
"Language-Team: Afrikaans <i18n@af.org.za>\n"
@@ -15,6 +15,7 @@
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
+"X-Bugs: Report translation errors to the Language-Team address.\n"
msgid "No URL set."
msgstr ""
@@ -40,7 +41,7 @@
#, fuzzy
msgid "Could not read DVD."
-msgstr "Kon nie skryf na lêer \"%s\" nie."
+msgstr "Kon nie skryf na toestel \"%s\" nie."
msgid "This file contains no playable streams."
msgstr ""
@@ -68,6 +69,7 @@
msgid "Failed to get fragment URL."
msgstr ""
+#, c-format
msgid "Couldn't download fragments"
msgstr ""
@@ -94,6 +96,7 @@
msgid "Could not open file \"%s\" for reading."
msgstr "Kon nie lêer \"%s\" oopmaak om te lees nie."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr ""
@@ -109,50 +112,44 @@
msgid "No properties for channel '%s'"
msgstr ""
-#, fuzzy, c-format
+#, c-format
msgid "Failed to set properties for channel '%s'"
-msgstr "Kon nie beheertoestel \"%s\" toemaak nie."
+msgstr ""
#, c-format
msgid "Couldn't find channel configuration file: '%s'"
msgstr ""
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr ""
#, fuzzy
-#~ msgid "Could not configure sndio"
-#~ msgstr "Kon nie oudio-toestel \"%s\" konfigureer nie."
-
-#, fuzzy
-#~ msgid "Could not start sndio"
-#~ msgstr "Kon nie skryf na lêer \"%s\" nie."
-
-#, fuzzy
#~ msgid "No file name specified for writing."
#~ msgstr "Geen lêernaam gespesifiseer."
+#, c-format
#~ msgid "Could not open file \"%s\" for writing."
#~ msgstr "Kon nie lêer \"%s\" oopmaak om in te skryf nie."
-#~ msgid "Could not write to file \"%s\"."
-#~ msgstr "Kon nie skryf na lêer \"%s\" nie."
-
#~ msgid "Could not open device \"%s\" for reading and writing."
#~ msgstr "Kon nie toestel \"%s\" oopmaak vir lees en skryf nie."
#~ msgid "Device \"%s\" is not a capture device."
#~ msgstr "Toestel \"%s\" is nie 'n vasleggingtoestel nie."
-#~ msgid "Could not write to device \"%s\"."
-#~ msgstr "Kon nie skryf na toestel \"%s\" nie."
-
#~ msgid "Could not get buffers from device \"%s\"."
#~ msgstr "Kon nie buffers vanaf toestel \"%s\" verkry nie."
+#~ msgid "Could not open audio device \"%s\" for writing."
+#~ msgstr "Kon nie oudio-toestel \"%s\" oopmaak vir skryf nie."
+
#~ msgid "Could not open control device \"%s\" for writing."
#~ msgstr "Kon nie beheertoestel \"%s\" oopmaak vir skryf nie."
+#~ msgid "Could not configure audio device \"%s\"."
+#~ msgstr "Kon nie oudio-toestel \"%s\" konfigureer nie."
+
#~ msgid "Could not set audio device \"%s\" to %d Hz."
#~ msgstr "Kon nie klanktoestel \"%s\" verstel na %d Hz nie."
@@ -183,9 +180,30 @@
#~ msgid "Could not open device \"%s\" for reading."
#~ msgstr "Kon nie toestel \"%s\" oopmaak vir lees nie."
+#~ msgid "Volume"
+#~ msgstr "Volume"
+
+#~ msgid "Bass"
+#~ msgstr "Bas"
+
+#~ msgid "Treble"
+#~ msgstr "Treble"
+
#~ msgid "Synth"
#~ msgstr "Sintetiseerder"
+#~ msgid "PCM"
+#~ msgstr "PCM"
+
+#~ msgid "Speaker"
+#~ msgstr "Luidspreker"
+
+#~ msgid "Line-in"
+#~ msgstr "Lyn-in"
+
+#~ msgid "Microphone"
+#~ msgstr "Mikrofoon"
+
#~ msgid "CD"
#~ msgstr "CD"
@@ -195,9 +213,15 @@
#~ msgid "PCM-2"
#~ msgstr "PCM-2"
+#~ msgid "Record"
+#~ msgstr "Neem op"
+
#~ msgid "In-gain"
#~ msgstr "In-versterking"
+#~ msgid "Out-gain"
+#~ msgstr "Uit-versterking"
+
#~ msgid "Line-1"
#~ msgstr "Lyn-1"
@@ -207,6 +231,9 @@
#~ msgid "Line-3"
#~ msgstr "Lyn-3"
+#~ msgid "Digital-1"
+#~ msgstr "Digitaal-1"
+
#~ msgid "Digital-2"
#~ msgstr "Digitaal-2"
@@ -219,9 +246,19 @@
#~ msgid "Phone-out"
#~ msgstr "Telefoon-uit"
+#~ msgid "Video"
+#~ msgstr "Video"
+
#~ msgid "Radio"
#~ msgstr "Radio"
+#~ msgid "Monitor"
+#~ msgstr "Monitor"
+
+#, fuzzy
+#~ msgid "PC Speaker"
+#~ msgstr "Luidspreker"
+
#~ msgid "Could not open CD device for reading."
#~ msgstr "Kon nie CD-toestel oopmaak om te lees nie."
View file
gst-plugins-bad-1.18.6.tar.xz/po/ast.po -> gst-plugins-bad-1.20.1.tar.xz/po/ast.po
Changed
@@ -5,154 +5,134 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins-bad 1.12.0\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2017-05-04 15:09+0300\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2018-07-16 17:14+0100\n"
"Last-Translator: enolp <enolp@softastur.org>\n"
"Language-Team: Asturian <ubuntu-l10n-ast@lists.ubuntu.com>\n"
"Language: ast\n"
-"X-Bugs: Report translation errors to the Language-Team address.\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
+"X-Bugs: Report translation errors to the Language-Team address.\n"
"Plural-Forms: nplurals=2; plural=n != 1;\n"
"X-Generator: Lokalize 2.0\n"
-#: ext/gl/gstgltestsrc.c:489
-msgid "failed to draw pattern"
-msgstr "fallu al dibuxar el patrón"
-
-#: ext/gl/gstgltestsrc.c:490
-msgid "A GL error occured"
-msgstr "Asocedió un fallu de GL"
-
-#: ext/gl/gstgltestsrc.c:496
-msgid "format wasn't negotiated before get function"
-msgstr "el formatu nun se negoció enantes de la función get"
+msgid "No URL set."
+msgstr ""
-#: ext/opencv/gsttemplatematch.cpp:186
msgid "OpenCV failed to load template image"
msgstr "OpenCV falló al cargar la imaxe de la plantía"
-#: ext/resindvd/resindvdsrc.c:361
msgid "Could not read title information for DVD."
msgstr "Nun pudo lleese la información del títulu del DVD."
-#: ext/resindvd/resindvdsrc.c:367
#, c-format
msgid "Failed to open DVD device '%s'."
msgstr "Fallu al abrir el preséu de DVDs «%s»."
-#: ext/resindvd/resindvdsrc.c:373
msgid "Failed to set PGC based seeking."
msgstr "Fallu al afitar la gueta basada en PGC."
-#: ext/resindvd/resindvdsrc.c:1164
-msgid "Could not read DVD. This may be because the DVD is encrypted and a DVD decryption library is not installed."
-msgstr "Nun pudo lleese'l DVD. Esto podría ser porque'l DVD ta cifráu y nun s'instaló una biblioteca de descifráu de DVDs."
+msgid ""
+"Could not read DVD. This may be because the DVD is encrypted and a DVD "
+"decryption library is not installed."
+msgstr ""
+"Nun pudo lleese'l DVD. Esto podría ser porque'l DVD ta cifráu y nun "
+"s'instaló una biblioteca de descifráu de DVDs."
-#: ext/resindvd/resindvdsrc.c:1169 ext/resindvd/resindvdsrc.c:1178
msgid "Could not read DVD."
msgstr "Nun pudo lleese'l DVD."
-#: ext/smoothstreaming/gstmssdemux.c:421
-#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:722
msgid "This file contains no playable streams."
msgstr "El ficheru nun contién dengún fluxu reproducible."
-#: ext/sndfile/gstsfdec.c:769
msgid "Could not open sndfile stream for reading."
msgstr "Nun pudo abrise'l fluxu sndfile pa la llectura."
-#: gst/asfmux/gstasfmux.c:1832
msgid "Generated file has a larger preroll time than its streams duration"
-msgstr "El ficheru xeneráu tien un tiempu de prellanzamientu mayor que la duración de los sos fluxos"
+msgstr ""
+"El ficheru xeneráu tien un tiempu de prellanzamientu mayor que la duración "
+"de los sos fluxos"
-#: gst/camerabin2/camerabingeneral.c:167 gst/camerabin2/gstcamerabin2.c:1859
-#: gst/camerabin2/gstdigitalzoom.c:283 gst/camerabin2/gstviewfinderbin.c:270
#, c-format
msgid "Missing element '%s' - check your GStreamer installation."
msgstr "Falta l'elementu «%s», comprueba la instalación de GStreamer"
-#: gst/camerabin2/gstcamerabin2.c:347
msgid "File location is set to NULL, please set it to a valid filename"
-msgstr "L'allugamientu del ficheru ta afitáu a NULL, especifica un nome validu de ficheru"
+msgstr ""
+"L'allugamientu del ficheru ta afitáu a NULL, especifica un nome validu de "
+"ficheru"
-#: gst/camerabin2/gstwrappercamerabinsrc.c:585
msgid "Digitalzoom element couldn't be created"
msgstr "Nun pudo crease l'elementu digitalzoom"
-#: gst/dvdspu/gstdvdspu.c:1041
msgid "Subpicture format was not configured before data flow"
msgstr "El formatu de soimaxe nun se configuró enantes de fluxu de datos"
-#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3352
msgid "Failed to get fragment URL."
msgstr "Fallu al consiguir la URL del fragmentu."
-#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3724
+#, c-format
msgid "Couldn't download fragments"
msgstr "Nun pudieron baxase fragamentos"
-#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3813
-#: gst/mpegtsdemux/mpegtsbase.c:1610
msgid "Internal data stream error."
msgstr "Fallu internu del fluxu de datos."
-#: sys/dvb/gstdvbsrc.c:1579 sys/dvb/gstdvbsrc.c:1793
#, c-format
msgid "Device \"%s\" does not exist."
msgstr "El preséu «%s» nun esiste."
-#: sys/dvb/gstdvbsrc.c:1583
#, c-format
msgid "Could not open frontend device \"%s\"."
msgstr "Nun pudo abrise'l preséu frontal «%s»."
-#: sys/dvb/gstdvbsrc.c:1602
#, c-format
msgid "Could not get settings from frontend device \"%s\"."
msgstr "Nun pudieron consiguise los axustes del preséu frontal «%s»."
-#: sys/dvb/gstdvbsrc.c:1619
#, c-format
msgid "Cannot enumerate delivery systems from frontend device \"%s\"."
msgstr "Nun puen numberase los sitemes d'entrega del preséu frontal «%s»."
-#: sys/dvb/gstdvbsrc.c:1797
#, c-format
msgid "Could not open file \"%s\" for reading."
msgstr "Nun pudo abrise'l ficheru «%s» pa la llectura."
-#: sys/dvb/parsechannels.c:410
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr "Nun pudo alcontrase'l ficheru de configuración de canales"
-#: sys/dvb/parsechannels.c:413 sys/dvb/parsechannels.c:563
#, c-format
msgid "Couldn't load channel configuration file: '%s'"
msgstr "Nun pudo cargase'l ficheru de configuración de canales: «%s»"
-#: sys/dvb/parsechannels.c:421 sys/dvb/parsechannels.c:846
#, c-format
msgid "Couldn't find details for channel '%s'"
msgstr "Nun pudieron alcontrase los detalles pa la canal «%s»"
-#: sys/dvb/parsechannels.c:430
#, c-format
msgid "No properties for channel '%s'"
msgstr "Nun hai propiedaes pa la canal «%s»"
-#: sys/dvb/parsechannels.c:439
#, c-format
msgid "Failed to set properties for channel '%s'"
msgstr "Fallu al afitar les propiedaes pa la canal «%s»"
-#: sys/dvb/parsechannels.c:560
#, c-format
msgid "Couldn't find channel configuration file: '%s'"
msgstr "Nun pudo alcontrase'l ficheru de configuración de canales: %s"
-#: sys/dvb/parsechannels.c:570
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr "El ficheru de configuración de canales nun contién denguna canal"
+
+#~ msgid "failed to draw pattern"
+#~ msgstr "fallu al dibuxar el patrón"
+
+#~ msgid "A GL error occured"
+#~ msgstr "Asocedió un fallu de GL"
+
+#~ msgid "format wasn't negotiated before get function"
+#~ msgstr "el formatu nun se negoció enantes de la función get"
View file
gst-plugins-bad-1.18.6.tar.xz/po/az.po -> gst-plugins-bad-1.20.1.tar.xz/po/az.po
Changed
@@ -6,8 +6,8 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins-0.8.0\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2004-03-19 18:29+0200\n"
"Last-Translator: Metin Amiroff <metin@karegen.com>\n"
"Language-Team: Azerbaijani <translation-team-az@lists.sourceforge.net>\n"
@@ -15,6 +15,7 @@
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
+"X-Bugs: Report translation errors to the Language-Team address.\n"
"X-Generator: KBabel 1.0.2\n"
msgid "No URL set."
@@ -41,7 +42,7 @@
#, fuzzy
msgid "Could not read DVD."
-msgstr "\"%s\" faylına yazıla bilmədi."
+msgstr "\"%s\" avadanlığına yazıla bilmədi."
msgid "This file contains no playable streams."
msgstr ""
@@ -69,6 +70,7 @@
msgid "Failed to get fragment URL."
msgstr ""
+#, c-format
msgid "Couldn't download fragments"
msgstr ""
@@ -95,6 +97,7 @@
msgid "Could not open file \"%s\" for reading."
msgstr "\"%s\" faylı oxuma üçün açıla bilmədi."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr ""
@@ -110,50 +113,44 @@
msgid "No properties for channel '%s'"
msgstr ""
-#, fuzzy, c-format
+#, c-format
msgid "Failed to set properties for channel '%s'"
-msgstr "\"%s\" idarə avadanlığı bağlana bilmədi."
+msgstr ""
#, c-format
msgid "Couldn't find channel configuration file: '%s'"
msgstr ""
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr ""
#, fuzzy
-#~ msgid "Could not configure sndio"
-#~ msgstr "\"%s\" audio avadanlığı quraşdırıla bilmədi."
-
-#, fuzzy
-#~ msgid "Could not start sndio"
-#~ msgstr "\"%s\" faylına yazıla bilmədi."
-
-#, fuzzy
#~ msgid "No file name specified for writing."
#~ msgstr "Fayl adı verilməyib."
+#, c-format
#~ msgid "Could not open file \"%s\" for writing."
#~ msgstr "\"%s\" faylı yazma üçün açıla bilmədi."
-#~ msgid "Could not write to file \"%s\"."
-#~ msgstr "\"%s\" faylına yazıla bilmədi."
-
#~ msgid "Could not open device \"%s\" for reading and writing."
#~ msgstr "\"%s\" avadanlığı oxuma və yazma üçün açıla bilmədi."
#~ msgid "Device \"%s\" is not a capture device."
#~ msgstr "\"%s\" avadanlığı capture avadanlığı deyil."
-#~ msgid "Could not write to device \"%s\"."
-#~ msgstr "\"%s\" avadanlığına yazıla bilmədi."
-
#~ msgid "Could not get buffers from device \"%s\"."
#~ msgstr "\"%s\" avadanlığından bufferlər alına bilmədi."
+#~ msgid "Could not open audio device \"%s\" for writing."
+#~ msgstr "\"%s\" audio avadanlığı yazma üçün açıla bilmədi."
+
#~ msgid "Could not open control device \"%s\" for writing."
#~ msgstr "\"%s\" idarə avadanlığı yazma üçün açıla bilmədi."
+#~ msgid "Could not configure audio device \"%s\"."
+#~ msgstr "\"%s\" audio avadanlığı quraşdırıla bilmədi."
+
#~ msgid "Could not set audio device \"%s\" to %d Hz."
#~ msgstr "\"%s\" audio avadanlığı %d Hz-ə keçirilə bilmədi."
@@ -184,9 +181,30 @@
#~ msgid "Could not open device \"%s\" for reading."
#~ msgstr "\"%s\" avadanlığı oxuma üçün açıla bilmədi."
+#~ msgid "Volume"
+#~ msgstr "Səs"
+
+#~ msgid "Bass"
+#~ msgstr "Bas"
+
+#~ msgid "Treble"
+#~ msgstr "İncə"
+
#~ msgid "Synth"
#~ msgstr "Sint"
+#~ msgid "PCM"
+#~ msgstr "PCM"
+
+#~ msgid "Speaker"
+#~ msgstr "Spiker"
+
+#~ msgid "Line-in"
+#~ msgstr "Xətd-giriş"
+
+#~ msgid "Microphone"
+#~ msgstr "Mikrofon"
+
#~ msgid "CD"
#~ msgstr "CD"
@@ -196,9 +214,15 @@
#~ msgid "PCM-2"
#~ msgstr "PCM-2"
+#~ msgid "Record"
+#~ msgstr "Qeyd"
+
#~ msgid "In-gain"
#~ msgstr "Giriş-gain"
+#~ msgid "Out-gain"
+#~ msgstr "Çıxış-gain"
+
#~ msgid "Line-1"
#~ msgstr "Xətd-1"
@@ -208,6 +232,9 @@
#~ msgid "Line-3"
#~ msgstr "Xətd-3"
+#~ msgid "Digital-1"
+#~ msgstr "Dijital-1"
+
#~ msgid "Digital-2"
#~ msgstr "Dijital-2"
@@ -220,9 +247,19 @@
#~ msgid "Phone-out"
#~ msgstr "Telefon-çıxışı"
+#~ msgid "Video"
+#~ msgstr "Video"
+
#~ msgid "Radio"
#~ msgstr "Radio"
+#~ msgid "Monitor"
+#~ msgstr "Monitor"
+
+#, fuzzy
+#~ msgid "PC Speaker"
+#~ msgstr "Spiker"
+
#~ msgid "Could not open CD device for reading."
#~ msgstr "CD avadanlığı oxuma üçün açıla bilmədi."
View file
gst-plugins-bad-1.18.6.tar.xz/po/bg.po -> gst-plugins-bad-1.20.1.tar.xz/po/bg.po
Changed
@@ -1,16 +1,17 @@
# Bulgarian translation of gst-plugins-bad.
# Copyright (C) 2007, 2008, 2009, 2010, 2011, 2016 Free Software Fondation, Inc.
# Copyright (C) 2017 Free Software Fondation, Inc.
+# Copyright (C) 2021 Alexander Shopov.
# This file is distributed under the same license as the gst-plugins-bad package.
# Alexander Shopov <ash@kambanaria.org>, 2007, 2008, 2009, 2010, 2011, 2016.
-# Alexander Shopov <ash@kambanaria.org>, 2017.
+# Alexander Shopov <ash@kambanaria.org>, 2017, 2021.
#
msgid ""
msgstr ""
-"Project-Id-Version: gst-plugins-bad 1.12.0\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
-"PO-Revision-Date: 2017-05-08 11:49+0200\n"
+"Project-Id-Version: gst-plugins-bad 1.19.2\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
+"PO-Revision-Date: 2021-10-02 20:07+0200\n"
"Last-Translator: Alexander Shopov <ash@kambanaria.org>\n"
"Language-Team: Bulgarian <dict@ludost.net>\n"
"Language: bg\n"
@@ -21,7 +22,7 @@
"Plural-Forms: nplurals=2; plural=n != 1;\n"
msgid "No URL set."
-msgstr ""
+msgstr "Не е зададен адрес."
msgid "OpenCV failed to load template image"
msgstr "OpenCV не успя да зареди изображението-шаблон"
@@ -73,6 +74,7 @@
msgid "Failed to get fragment URL."
msgstr "Неуспешно получаване на адреса на фрагмента."
+#, c-format
msgid "Couldn't download fragments"
msgstr "Фрагментите не могат да бъдат свалени"
@@ -89,16 +91,17 @@
#, c-format
msgid "Could not get settings from frontend device \"%s\"."
-msgstr "Настройките на устройството „%s“ не могат да бъдат получени."
+msgstr "Настройките на устройството „%s“ не може да бъдат получени."
#, c-format
msgid "Cannot enumerate delivery systems from frontend device \"%s\"."
-msgstr "Системите за доставка от устройството „%s“ не могат да бъдат изброени."
+msgstr "Системите за доставка от устройството „%s“ не може да бъдат изброени."
#, c-format
msgid "Could not open file \"%s\" for reading."
msgstr "Файлът „%s“ не може да се отвори за четене."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr "Файлът с настройките на каналите не може да бъде открит"
@@ -116,21 +119,12 @@
#, c-format
msgid "Failed to set properties for channel '%s'"
-msgstr "Свойствата на канала „%s“ не могат да бъдат зададени"
+msgstr "Свойствата на канала „%s“ не може да бъдат зададени"
#, c-format
msgid "Couldn't find channel configuration file: '%s'"
msgstr "Файлът с настройките на каналите не може да бъде открит: „%s“"
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr "Файлът с настройките на каналите не съдържа никакви канали"
-
-#~ msgid "failed to draw pattern"
-#~ msgstr "шарката не може да се изобрази"
-
-#~ msgid "A GL error occured"
-#~ msgstr "Грешка от GL"
-
-#~ msgid "format wasn't negotiated before get function"
-#~ msgstr ""
-#~ "форматът не е бил уточнен преди извикване на функцията за получаване"
View file
gst-plugins-bad-1.18.6.tar.xz/po/ca.po -> gst-plugins-bad-1.20.1.tar.xz/po/ca.po
Changed
@@ -8,8 +8,8 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins-bad 0.10.21.2\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2012-01-01 14:19+0100\n"
"Last-Translator: Gil Forcada <gforcada@gnome.org>\n"
"Language-Team: Catalan <ca@dodds.net>\n"
@@ -17,6 +17,7 @@
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
+"X-Bugs: Report translation errors to the Language-Team address.\n"
"Plural-Forms: nplurals=2; plural=n != 1;\n"
msgid "No URL set."
@@ -71,6 +72,7 @@
msgid "Failed to get fragment URL."
msgstr ""
+#, c-format
msgid "Couldn't download fragments"
msgstr ""
@@ -97,6 +99,7 @@
msgid "Could not open file \"%s\" for reading."
msgstr "No s'ha pogut obrir el fitxer «%s» per a la lectura."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr ""
@@ -120,25 +123,20 @@
msgid "Couldn't find channel configuration file: '%s'"
msgstr ""
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr ""
-#, fuzzy
-#~ msgid "Could not configure sndio"
-#~ msgstr "No s'ha pogut obrir el fitxer «%s» per a la lectura."
-
-#, fuzzy
-#~ msgid "Could not start sndio"
-#~ msgstr "No s'ha pogut llegir el DVD."
-
#~ msgid "No file name specified for writing."
#~ msgstr "No s'ha especificat cap nom de fitxer per a l'escriptura."
+#, c-format
#~ msgid "Could not open file \"%s\" for writing."
#~ msgstr "No s'ha pogut obrir el fitxer «%s» per a l'escriptura."
-#~ msgid "Internal data flow error."
-#~ msgstr "S'ha produït un error intern de flux de dades."
-
+#, c-format
#~ msgid "Could not write to file \"%s\"."
#~ msgstr "No s'ha pogut escriure al fitxer «%s»."
+
+#~ msgid "Internal data flow error."
+#~ msgstr "S'ha produït un error intern de flux de dades."
View file
gst-plugins-bad-1.18.6.tar.xz/po/cs.po -> gst-plugins-bad-1.20.1.tar.xz/po/cs.po
Changed
@@ -10,8 +10,8 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins-bad 1.12.0\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2017-09-13 14:36+0200\n"
"Last-Translator: Marek Černocký <marek@manet.cz>\n"
"Language-Team: Czech <translation-team-cs@lists.sourceforge.net>\n"
@@ -78,6 +78,7 @@
msgid "Failed to get fragment URL."
msgstr "Selhalo získání adresy URL fragmentu."
+#, c-format
msgid "Couldn't download fragments"
msgstr "Nelze stáhnout fragmenty"
@@ -105,6 +106,7 @@
msgid "Could not open file \"%s\" for reading."
msgstr "Nezdařilo se otevření souboru „%s“ ke čtení."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr "Nezdařilo se najít soubor s nastavením kanálů"
@@ -128,6 +130,7 @@
msgid "Couldn't find channel configuration file: '%s'"
msgstr "Nezdařilo se najít soubor s nastavením kanálu: „%s“"
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr "Soubor s nastavením kanálů neobsahuje žádné kanály"
View file
gst-plugins-bad-1.18.6.tar.xz/po/da.po -> gst-plugins-bad-1.20.1.tar.xz/po/da.po
Changed
@@ -7,8 +7,8 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins-bad-1.15.1\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2019-02-05 11:28+0200\n"
"Last-Translator: Joe Hansen <joedalton2@yahoo.dk>\n"
"Language-Team: Danish <dansk@dansk-gruppen.dk>\n"
@@ -71,6 +71,7 @@
msgid "Failed to get fragment URL."
msgstr "Kunne ikke indhente fragmentadresse."
+#, c-format
msgid "Couldn't download fragments"
msgstr "Kunne ikke hente fragmenter"
@@ -97,6 +98,7 @@
msgid "Could not open file \"%s\" for reading."
msgstr "Kunne ikke åbne filen »%s« for læsning."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr "Kunne ikke finde kanalkonfigurationsfil"
@@ -121,6 +123,7 @@
msgid "Couldn't find channel configuration file: '%s'"
msgstr "Kunne ikke finde kanalkonfigurationsfilen: »%s«"
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr "Kanalkonfigurationsfilen indeholder ikke nogen kanaler"
View file
gst-plugins-bad-1.18.6.tar.xz/po/de.po -> gst-plugins-bad-1.20.1.tar.xz/po/de.po
Changed
@@ -7,8 +7,8 @@
msgid ""
msgstr ""
"Project-Id-Version: gst-plugins-bad 1.15.1\n"
-"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n"
-"POT-Creation-Date: 2019-02-26 11:53+0000\n"
+"Report-Msgid-Bugs-To: \n"
+"POT-Creation-Date: 2021-10-25 01:02+0100\n"
"PO-Revision-Date: 2019-02-09 20:47+0100\n"
"Last-Translator: Christian Kirbach <christian.kirbach@gmail.com>\n"
"Language-Team: German <translation-team-de@lists.sourceforge.net>\n"
@@ -78,6 +78,7 @@
msgid "Failed to get fragment URL."
msgstr "Ermitteln der Fragment-Adresse schlug fehl."
+#, c-format
msgid "Couldn't download fragments"
msgstr "Fragmente konnten nicht heruntergeladen werden"
@@ -106,6 +107,7 @@
msgid "Could not open file \"%s\" for reading."
msgstr "Datei »%s« konnte nicht zum Lesen geöffnet werden."
+#, c-format
msgid "Couldn't find channel configuration file"
msgstr "Kanal-Konfigurationsdatei konnte nicht gefunden werden"
@@ -129,6 +131,7 @@
msgid "Couldn't find channel configuration file: '%s'"
msgstr "Kanal-Konfigurationsdatei konnte nicht gefunden werden: »%s«"
+#, c-format
msgid "Channel configuration file doesn't contain any channels"
msgstr "Kanal-Konfigurationsdatei enthält keine Kanäle"
View file
gst-plugins-bad-1.18.6.tar.xz/po/el.po -> gst-plugins-bad-1.20.1.tar.xz/po/el.po
Changed
@@ -8,8 +8,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.21.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2012-05-05 19:17+0100\n" "Last-Translator: Savvas Radevic <vicedar@gmail.com>\n" "Language-Team: Greek <team@lists.gnome.gr>\n" @@ -17,6 +17,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=2; plural=n>1;\n" msgid "No URL set." @@ -72,6 +73,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -98,6 +100,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Αδυναμία ανοίγματος αρχείου \"%s\" για ανάγνωση." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -121,25 +124,20 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Αδυναμία ανοίγματος αρχείου \"%s\" για ανάγνωση." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Αδυναμία ανάγνωσης δίσκου DVD." - #~ msgid "No file name specified for writing." #~ msgstr "Δεν έχει καθορισθεί όνομα αρχείου για εγγραφή." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Αδυναμία ανοίγματος αρχείου \"%s\" για εγγραφή." -#~ msgid "Internal data flow error." -#~ msgstr "Εσωτερικό σφάλμα ροής δεδομένων." - +#, c-format #~ msgid "Could not write to file \"%s\"." #~ msgstr "Αδυναμία εγγραφής στο αρχείο \"%s\"." + +#~ msgid "Internal data flow error." +#~ msgstr "Εσωτερικό σφάλμα ροής δεδομένων."
View file
gst-plugins-bad-1.18.6.tar.xz/po/en_GB.po -> gst-plugins-bad-1.20.1.tar.xz/po/en_GB.po
Changed
@@ -5,8 +5,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins 0.8.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2004-04-26 10:41-0400\n" "Last-Translator: Gareth Owen <gowen72@yahoo.com>\n" "Language-Team: English (British) <en_gb@li.org>\n" @@ -14,6 +14,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" msgid "No URL set." msgstr "" @@ -39,7 +40,7 @@ #, fuzzy msgid "Could not read DVD." -msgstr "Could not write to file \"%s\"." +msgstr "Could not write to device \"%s\"." msgid "This file contains no playable streams." msgstr "" @@ -67,6 +68,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -93,6 +95,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Could not open file \"%s\" for reading." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -108,50 +111,44 @@ msgid "No properties for channel '%s'" msgstr "" -#, fuzzy, c-format +#, c-format msgid "Failed to set properties for channel '%s'" -msgstr "Could not close control device \"%s\"." +msgstr "" #, c-format msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" #, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Could not configure audio device \"%s\"." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Could not write to file \"%s\"." - -#, fuzzy #~ msgid "No file name specified for writing." #~ msgstr "No filename specified." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Could not open file \"%s\" for writing." -#~ msgid "Could not write to file \"%s\"." -#~ msgstr "Could not write to file \"%s\"." 
- #~ msgid "Could not open device \"%s\" for reading and writing." #~ msgstr "Could not open device \"%s\" for reading and writing." #~ msgid "Device \"%s\" is not a capture device." #~ msgstr "Device \"%s\" is not a capture device." -#~ msgid "Could not write to device \"%s\"." -#~ msgstr "Could not write to device \"%s\"." - #~ msgid "Could not get buffers from device \"%s\"." #~ msgstr "Could not get buffers from device \"%s\"." +#~ msgid "Could not open audio device \"%s\" for writing." +#~ msgstr "Could not open audio device \"%s\" for writing." + #~ msgid "Could not open control device \"%s\" for writing." #~ msgstr "Could not open control device \"%s\" for writing." +#~ msgid "Could not configure audio device \"%s\"." +#~ msgstr "Could not configure audio device \"%s\"." + #~ msgid "Could not set audio device \"%s\" to %d Hz." #~ msgstr "Could not set audio device \"%s\" to %d Hz." @@ -186,9 +183,30 @@ #~ msgid "Your OSS device could not be probed correctly" #~ msgstr "Your oss device could not be probed correctly" +#~ msgid "Volume" +#~ msgstr "Volume" + +#~ msgid "Bass" +#~ msgstr "Bass" + +#~ msgid "Treble" +#~ msgstr "Treble" + #~ msgid "Synth" #~ msgstr "Synth" +#~ msgid "PCM" +#~ msgstr "PCM" + +#~ msgid "Speaker" +#~ msgstr "Speaker" + +#~ msgid "Line-in" +#~ msgstr "Line-in" + +#~ msgid "Microphone" +#~ msgstr "Microphone" + #~ msgid "CD" #~ msgstr "CD" @@ -198,9 +216,15 @@ #~ msgid "PCM-2" #~ msgstr "PCM-2" +#~ msgid "Record" +#~ msgstr "Record" + #~ msgid "In-gain" #~ msgstr "In-gain" +#~ msgid "Out-gain" +#~ msgstr "Out-gain" + #~ msgid "Line-1" #~ msgstr "Line-1" @@ -210,6 +234,9 @@ #~ msgid "Line-3" #~ msgstr "Line-3" +#~ msgid "Digital-1" +#~ msgstr "Digital-1" + #~ msgid "Digital-2" #~ msgstr "Digital-2" @@ -222,9 +249,19 @@ #~ msgid "Phone-out" #~ msgstr "Phone-out" +#~ msgid "Video" +#~ msgstr "Video" + #~ msgid "Radio" #~ msgstr "Radio" +#~ msgid "Monitor" +#~ msgstr "Monitor" + +#, fuzzy +#~ msgid "PC Speaker" +#~ msgstr "Speaker" + #~ 
msgid "Could not open CD device for reading." #~ msgstr "Could not open CD device for reading."
View file
gst-plugins-bad-1.18.6.tar.xz/po/eo.po -> gst-plugins-bad-1.20.1.tar.xz/po/eo.po
Changed
@@ -1,133 +1,132 @@ # Esperanto translation for gst-plugins-bad. -# Copyright (C) 2011 Free Software Foundation, Inc. +# Copyright (C) 2011, 2021 Free Software Foundation, Inc. # This file is distributed under the same license as the gst-plugins-bad package. # Kristjan SCHMIDT <kristjan.schmidt@googlemail.com>, 2011. +# Felipe CASTRO <fefcas@gmail.com>, 2021. # msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 0.10.21.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2011-06-04 22:18+0200\n" -"Last-Translator: Kristjan SCHMIDT <kristjan.schmidt@googlemail.com>\n" +"Project-Id-Version: gst-plugins-bad 1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2021-03-07 11:10-0300\n" +"Last-Translator: Felipe Castro <fefcas@gmail.com>\n" "Language-Team: Esperanto <translation-team-eo@lists.sourceforge.net>\n" "Language: eo\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" -"Plural-Forms: nplurals=2; plural=(n != 1)\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" +"Plural-Forms: nplurals=2; plural=(n != 1);\n" +"X-Generator: Poedit 2.4.2\n" msgid "No URL set." -msgstr "" +msgstr "Neniu URL estas difinita." msgid "OpenCV failed to load template image" -msgstr "" +msgstr "OpenCV malsukcesis ŝargi je ŝablona bildo" msgid "Could not read title information for DVD." -msgstr "Ne eblis legi la titol-informojn de la DVD." +msgstr "Ne eblis legi la titol-informojn por DVD." #, c-format msgid "Failed to open DVD device '%s'." -msgstr "Fiaskis malfermi la DVD-aparaton \"%s\"." +msgstr "Malsukcesis malfermi la DVD-aparaton '%s'." msgid "Failed to set PGC based seeking." -msgstr "PGC-bazita serĉo fiaskis." +msgstr "Difino de PGC-bazita serĉo fiaskis." msgid "" "Could not read DVD. This may be because the DVD is encrypted and a DVD " "decryption library is not installed." 
msgstr "" "Ne eblis legi la DVD-n. Eble la DVD estas ĉifrita sed biblioteko por " -"malĉifrado ne estas instalite." +"malĉifrado ne estas instalita." msgid "Could not read DVD." msgstr "Ne eblis legi la DVD-n." msgid "This file contains no playable streams." -msgstr "" +msgstr "Tiu ĉi dosiero enhavas neniun ludeblan fluon." -#, fuzzy msgid "Could not open sndfile stream for reading." -msgstr "Ne eblis malfermi la dosieron \"%s\" por legi." +msgstr "Ne eblis malfermi la fluon sndfile por legi." msgid "Generated file has a larger preroll time than its streams duration" -msgstr "" +msgstr "Generita dosiero havas pli grandan tempon preroll ol ĝia flua daŭro" #, c-format msgid "Missing element '%s' - check your GStreamer installation." -msgstr "" +msgstr "Mankas elementos '%s' - kontrolu vian instalon de GStreamer." msgid "File location is set to NULL, please set it to a valid filename" msgstr "" +"Dosiera loko estas difinita kiel NULL, bonvolu difinu ĝin kiel valida " +"dosiernomo" msgid "Digitalzoom element couldn't be created" -msgstr "" +msgstr "Elemento Digitalzoom ne povis esti kreata" msgid "Subpicture format was not configured before data flow" -msgstr "" +msgstr "Formo Subpicture ne estis agordita antaŭ ol la datumara fluo" msgid "Failed to get fragment URL." -msgstr "" +msgstr "Malsukcesis havigi fragmentan URL-n." +#, c-format msgid "Couldn't download fragments" -msgstr "" +msgstr "Ne eblis elŝuti fragmentojn" msgid "Internal data stream error." -msgstr "Interna datum-flu-eraro." +msgstr "Interna eraro pri datumara fluo." #, c-format msgid "Device \"%s\" does not exist." msgstr "Aparato \"%s\" ne ekzistas." -#, fuzzy, c-format +#, c-format msgid "Could not open frontend device \"%s\"." -msgstr "Ne eblis malfermi la \"Frontend\"-aparaton \"%s\"." +msgstr "Ne eblis malfermi la fasadan aparaton \"%s\"." -#, fuzzy, c-format +#, c-format msgid "Could not get settings from frontend device \"%s\"." 
-msgstr "Ne eblis akiri la agordojn de la \"Frontend\"-aparato \"%s\"." +msgstr "Ne eblis havigi la agordojn el la fasada aparato \"%s\"." -#, fuzzy, c-format +#, c-format msgid "Cannot enumerate delivery systems from frontend device \"%s\"." -msgstr "Ne eblis akiri la agordojn de la \"Frontend\"-aparato \"%s\"." +msgstr "Ne eblis nombrigi la liverajn sistemojn el la fasada aparato \"%s\"." #, c-format msgid "Could not open file \"%s\" for reading." msgstr "Ne eblis malfermi la dosieron \"%s\" por legi." +#, c-format msgid "Couldn't find channel configuration file" -msgstr "" +msgstr "Ne eblis trovi kanal-agordan dosieron" #, c-format msgid "Couldn't load channel configuration file: '%s'" -msgstr "" +msgstr "Ne eblis ŝargi je la kanal-agorda dosiero: '%s'" #, c-format msgid "Couldn't find details for channel '%s'" -msgstr "" +msgstr "Ne eblis trovi detalojn pri la kanalo '%s'" #, c-format msgid "No properties for channel '%s'" -msgstr "" +msgstr "Neniu atributo por la kanalo '%s'" -#, fuzzy, c-format +#, c-format msgid "Failed to set properties for channel '%s'" -msgstr "Fiaskis malfermi la DVD-aparaton \"%s\"." +msgstr "Malsukcesis difini atributojn por la kanalo '%s'" #, c-format msgid "Couldn't find channel configuration file: '%s'" -msgstr "" +msgstr "Ne eblis trovi kanal-agordan dosieron: '%s'" +#, c-format msgid "Channel configuration file doesn't contain any channels" -msgstr "" - -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Ne eblis malfermi la dosieron \"%s\" por legi." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Ne eblis legi la DVD-n." +msgstr "Kanal-agorda dosiero ne enhavas iun ajn kanalon" #~ msgid "No file name specified for writing." #~ msgstr "Neniu dosiernomo estas specifite por skribi." @@ -135,8 +134,8 @@ #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Ne eblis malfermi la dosieron \"%s\" por skribi." -#~ msgid "Internal data flow error." -#~ msgstr "Interna datum-flu-eraro." 
- #~ msgid "Could not write to file \"%s\"." #~ msgstr "Ne eblis skribi al dosiero \"%s\"." + +#~ msgid "Internal data flow error." +#~ msgstr "Interna datum-flu-eraro."
View file
gst-plugins-bad-1.18.6.tar.xz/po/es.po -> gst-plugins-bad-1.20.1.tar.xz/po/es.po
Changed
@@ -6,8 +6,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.21.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2011-10-02 15:47+0200\n" "Last-Translator: Jorge González González <aloriel@gmail.com>\n" "Language-Team: Spanish <es@li.org>\n" @@ -15,6 +15,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=2; plural=(n!=1);\n" msgid "No URL set." @@ -69,6 +70,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -95,7 +97,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "No se pudo abrir el archivo «%s» para leer." -#, fuzzy +#, fuzzy, c-format msgid "Couldn't find channel configuration file" msgstr "Configuración del canal del mezclador virtual" @@ -119,29 +121,24 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Configuración del canal del mezclador virtual" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "No se pudo abrir el archivo «%s» para leer." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "No se pudo leer el DVD." - #~ msgid "No file name specified for writing." #~ msgstr "No se especificó un nombre de archivo para su escritura." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "No se pudo abrir el archivo «%s» para escribir." -#~ msgid "Internal data flow error." -#~ msgstr "Error en el flujo de datos interno." - +#, c-format #~ msgid "Could not write to file \"%s\"." #~ msgstr "No se pudo escribir en el archivo «%s»." +#~ msgid "Internal data flow error." +#~ msgstr "Error en el flujo de datos interno." 
+ #~ msgid "Internal clock error." #~ msgstr "Error en el reloj interno."
View file
gst-plugins-bad-1.18.6.tar.xz/po/eu.po -> gst-plugins-bad-1.20.1.tar.xz/po/eu.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad-0.10.17.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2010-03-25 12:30+0100\n" "Last-Translator: Mikel Olasagasti Uranga <hey_neken@mundurat.net>\n" "Language-Team: Basque <translation-team-eu@lists.sourceforge.net>\n" @@ -16,6 +16,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "X-Generator: KBabel 1.11.4\n" "Plural-Forms: nplurals=2; plural=(n != 1);\n" @@ -70,6 +71,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -96,6 +98,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Ezin izan da \"%s\" fitxategia ireki irakurtzeko." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -119,29 +122,24 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Ezin izan da \"%s\" fitxategia ireki irakurtzeko." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Ezin izan da DVDaren tituluaren informazioa irakurri." - #~ msgid "No file name specified for writing." #~ msgstr "Ez da fitxategi-izenik zehaztu idazteko." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Ezin izan da \"%s\" fitxategia ireki idazteko." -#~ msgid "Internal data flow error." -#~ msgstr "Datu-korrontearen barne-errorea." - +#, c-format #~ msgid "Could not write to file \"%s\"." #~ msgstr "Ezin izan da idatzi \"%s\" fitxategian." +#~ msgid "Internal data flow error." +#~ msgstr "Datu-korrontearen barne-errorea." + #~ msgid "Could not open audio device for mixer control handling." 
#~ msgstr "" #~ "Ezin izan da audioko gailua ireki nahastailearen kontrola kudeatzeko." @@ -450,9 +448,11 @@ #~ msgid "Virtual Mixer Channels" #~ msgstr "Nahastaile birtualaren kanalak" +#, c-format #~ msgid "%s Function" #~ msgstr "%s funtzioa" +#, c-format #~ msgid "%s %d" #~ msgstr "%s %d"
View file
gst-plugins-bad-1.18.6.tar.xz/po/fi.po -> gst-plugins-bad-1.20.1.tar.xz/po/fi.po
Changed
@@ -10,8 +10,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.13.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2009-08-12 22:13+0300\n" "Last-Translator: Tommi Vainikainen <Tommi.Vainikainen@iki.fi>\n" "Language-Team: Finnish <translation-team-fi@lists.sourceforge.net>\n" @@ -19,6 +19,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=2; plural=(n != 1);\n" "X-Generator: KBabel 1.11.2\n" @@ -73,6 +74,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -99,7 +101,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Tiedostoa \"%s\" ei voi avata luettavaksi." -#, fuzzy +#, fuzzy, c-format msgid "Couldn't find channel configuration file" msgstr "Näennäinen mikserikanava-asetus" @@ -123,34 +125,377 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Näennäinen mikserikanava-asetus" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Tiedostoa \"%s\" ei voi avata luettavaksi." +#~ msgid "Internal clock error." +#~ msgstr "Sisäinen kellovirhe." -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "DVD:n otsikkotietoja ei voitu lukea." +#~ msgid "Internal data flow error." +#~ msgstr "Sisäinen tietovirtavirhe." #~ msgid "No file name specified for writing." #~ msgstr "Kirjoitettavaa tiedostonimeä ei annettu." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Tiedostoa \"%s\" ei voi avata kirjoitettavaksi." -#~ msgid "Internal data flow error." -#~ msgstr "Sisäinen tietovirtavirhe." - +#, c-format #~ msgid "Could not write to file \"%s\"." 
#~ msgstr "Tiedostoon \"%s\" ei voitu kirjoittaa." +#~ msgid "Could not open audio device for mixer control handling." +#~ msgstr "Äänilaitetta ei voitu avata mikserinhallinnan käsiteltäväksi." + +#~ msgid "" +#~ "Could not open audio device for mixer control handling. This version of " +#~ "the Open Sound System is not supported by this element." +#~ msgstr "" +#~ "Äänilaitetta ei voitu avata mikserinhallinnan käsiteltäväksi. Tämä " +#~ "elementti ei tue tätä versiota Open Sound Systemistä." + +#~ msgid "Volume" +#~ msgstr "Äänenvoimakkuus" + +#~ msgid "Master" +#~ msgstr "Pää" + +#~ msgid "Front" +#~ msgstr "Etu" + +#~ msgid "Rear" +#~ msgstr "Taka" + +#~ msgid "Headphones" +#~ msgstr "Kuulokkeet" + +#~ msgid "Center" +#~ msgstr "Keski" + +#~ msgid "LFE" +#~ msgstr "LFE" + +#~ msgid "Surround" +#~ msgstr "Surround" + +#~ msgid "Side" +#~ msgstr "Sivu" + +#~ msgid "Built-in Speaker" +#~ msgstr "Sisäänrakennettu kaiutin" + +#~ msgid "AUX 1 Out" +#~ msgstr "AUX 1 ulos" + +#~ msgid "AUX 2 Out" +#~ msgstr "AUX 2 ulos" + +#~ msgid "AUX Out" +#~ msgstr "AUX ulos" + +#~ msgid "Bass" +#~ msgstr "Basso" + +#~ msgid "Treble" +#~ msgstr "Diskantit" + +#~ msgid "3D Depth" +#~ msgstr "3D-syvyys" + +#~ msgid "3D Center" +#~ msgstr "3D-keski" + +#~ msgid "3D Enhance" +#~ msgstr "3D-parannus" + +#~ msgid "Telephone" +#~ msgstr "Puhelin" + +#~ msgid "Microphone" +#~ msgstr "Mikrofoni" + +#~ msgid "Line Out" +#~ msgstr "Linja ulos" + +#~ msgid "Line In" +#~ msgstr "Linja sisään" + +#~ msgid "Internal CD" +#~ msgstr "Sisäinen CD" + +#~ msgid "Video In" +#~ msgstr "Video sisään" + +#~ msgid "AUX 1 In" +#~ msgstr "AUX 1 sisään" + +#~ msgid "AUX 2 In" +#~ msgstr "AUX 2 sisään" + +#~ msgid "AUX In" +#~ msgstr "AUX sisään" + +#~ msgid "PCM" +#~ msgstr "PCM" + +#~ msgid "Record Gain" +#~ msgstr "Nauhoitusvahvistus" + +#~ msgid "Output Gain" +#~ msgstr "Ulostulovahvistus" + +#~ msgid "Microphone Boost" +#~ msgstr "Mikrofoni-lisä" + +#~ msgid "Loopback" +#~ msgstr "Silmukka" + +#~ msgid 
"Diagnostic" +#~ msgstr "Diagnosointi" + +#~ msgid "Bass Boost" +#~ msgstr "Basso-lisä" + +#~ msgid "Playback Ports" +#~ msgstr "Toistoportit" + +#~ msgid "Input" +#~ msgstr "Sisääntulo" + +#~ msgid "Record Source" +#~ msgstr "Nauhoituslähde" + +#~ msgid "Monitor Source" +#~ msgstr "Monitorointilähde" + +#~ msgid "Keyboard Beep" +#~ msgstr "Näppäimistöpiippaus" + +#~ msgid "Monitor" +#~ msgstr "Monitori" + +#~ msgid "Simulate Stereo" +#~ msgstr "Simuloitu stereo" + +#~ msgid "Stereo" +#~ msgstr "Stereo" + +#~ msgid "Surround Sound" +#~ msgstr "Surround-ääni" + +#~ msgid "Microphone Gain" +#~ msgstr "Mikrofonivahvistus" + +#~ msgid "Speaker Source" +#~ msgstr "Kaiutinlähde" + +#~ msgid "Microphone Source" +#~ msgstr "Mikrofonilähde" + +#~ msgid "Jack" +#~ msgstr "Jakki" + +#~ msgid "Center / LFE" +#~ msgstr "Keski / LFE" + +#~ msgid "Stereo Mix" +#~ msgstr "Stereomiksaus" + +#~ msgid "Mono Mix" +#~ msgstr "Monomiksaus" + +#~ msgid "Input Mix" +#~ msgstr "Sisääntulomiksaus" + +#~ msgid "SPDIF In" +#~ msgstr "SPDIF sisään" + +#~ msgid "SPDIF Out" +#~ msgstr "SPDIF ulos" + +#~ msgid "Microphone 1" +#~ msgstr "Mikrofoni 1" + +#~ msgid "Microphone 2" +#~ msgstr "Mikrofoni 2" + +#~ msgid "Digital Out" +#~ msgstr "Digitaalinen ulos" + +#~ msgid "Digital In" +#~ msgstr "Digitaalinen sisään" + +#~ msgid "HDMI" +#~ msgstr "HDMI" + +#~ msgid "Modem" +#~ msgstr "Modeemi" + +#~ msgid "Handset" +#~ msgstr "Luuri" + +#~ msgid "Other" +#~ msgstr "Muu" + +#~ msgid "None" +#~ msgstr "Ei mikään" + +#~ msgid "On" +#~ msgstr "Päällä" + +#~ msgid "Off" +#~ msgstr "Poissa" + +#~ msgid "Mute" +#~ msgstr "Vaimenna" + +#~ msgid "Fast" +#~ msgstr "Nopea" + +#~ msgid "Very Low" +#~ msgstr "Erittäin matala" + +#~ msgid "Low" +#~ msgstr "Matala" + +#~ msgid "Medium" +#~ msgstr "Keskitaso" + +#~ msgid "High" +#~ msgstr "Korkea" + +#~ msgid "Very High" +#~ msgstr "Erittäin korkea" + +#~ msgid "Production" +#~ msgstr "Tuotanto" + +#~ msgid "Front Panel Microphone" +#~ msgstr "Etupaneelin mikrofoni" 
+ +#~ msgid "Front Panel Line In" +#~ msgstr "Etupaneelin linjasisääntulo" + +#~ msgid "Front Panel Headphones" +#~ msgstr "Etupaneelin kuulokkeet" + +#~ msgid "Front Panel Line Out" +#~ msgstr "Etupaneelin linja ulos" + +#~ msgid "Green Connector" +#~ msgstr "Vihreä liitin" + +#~ msgid "Pink Connector" +#~ msgstr "Violetti liitin" + +#~ msgid "Blue Connector" +#~ msgstr "Sininen liitin" + +#~ msgid "White Connector" +#~ msgstr "Valkoinen liitin" + +#~ msgid "Black Connector" +#~ msgstr "Musta liitin" + +#~ msgid "Gray Connector" +#~ msgstr "Harmaa liitin" + +#~ msgid "Orange Connector" +#~ msgstr "Oranssi liitin" + +#~ msgid "Red Connector" +#~ msgstr "Punainen liitin" + +#~ msgid "Yellow Connector" +#~ msgstr "Keltainen liitin" + +#~ msgid "Green Front Panel Connector" +#~ msgstr "Vihreä etupaneelin liitin" + +#~ msgid "Pink Front Panel Connector" +#~ msgstr "Violetti etupaneelin liitin" + +#~ msgid "Blue Front Panel Connector" +#~ msgstr "Sininen etupaneelin liitin" + +#~ msgid "White Front Panel Connector" +#~ msgstr "Valkoinen etupaneelin liitin" + +#~ msgid "Black Front Panel Connector" +#~ msgstr "Musta etupaneelin liitin" + +#~ msgid "Gray Front Panel Connector" +#~ msgstr "Harmaa etupaneelin liitin" + +#~ msgid "Orange Front Panel Connector" +#~ msgstr "Oranssi etupaneelin liitin" + +#~ msgid "Red Front Panel Connector" +#~ msgstr "Punainen etupaneelin liitin" + +#~ msgid "Yellow Front Panel Connector" +#~ msgstr "Keltainen etupaneelin liitin" + +#~ msgid "Spread Output" +#~ msgstr "Leviävä ulostulo" + +#~ msgid "Downmix" +#~ msgstr "Alasmiksaus" + +#~ msgid "Virtual Mixer Input" +#~ msgstr "Näennäinen mikserisisääntulo" + +#~ msgid "Virtual Mixer Output" +#~ msgstr "Näennäinen mikseriulostulo" + +#~ msgid "Virtual Mixer Channels" +#~ msgstr "Näennäiset mikserikanavat" + +#, c-format +#~ msgid "%s Function" +#~ msgstr "%s-toiminto" + +#, c-format #~ msgid "%s %d" #~ msgstr "%s %d" -#~ msgid "Internal clock error." -#~ msgstr "Sisäinen kellovirhe." 
+#~ msgid "" +#~ "Could not open audio device for playback. Device is being used by another " +#~ "application." +#~ msgstr "" +#~ "Äänialaitetta ei voitu avata toistoa varten. Laite on toisen sovelluksen " +#~ "käytössä." + +#~ msgid "" +#~ "Could not open audio device for playback. You don't have permission to " +#~ "open the device." +#~ msgstr "" +#~ "Äänialaitetta ei voitu avata toistoa varten. Sinulla ei ole oikeuksia " +#~ "avata laitetta." + +#~ msgid "Could not open audio device for playback." +#~ msgstr "Äänialaitetta ei voitu avata toistoa varten." + +#~ msgid "" +#~ "Could not open audio device for playback. This version of the Open Sound " +#~ "System is not supported by this element." +#~ msgstr "" +#~ "Äänialaitetta ei voitu avata toistoa varten. Tämä elementti ei tue tätä " +#~ "versiota Open Sound Systemistä." + +#~ msgid "Playback is not supported by this audio device." +#~ msgstr "Tämä äänilaite ei tue toistamista." + +#~ msgid "Audio playback error." +#~ msgstr "Äänentoistovirhe." + +#~ msgid "Recording is not supported by this audio device." +#~ msgstr "Tämä äänilaite ei tue nauhoitusta." + +#~ msgid "Error recording from audio device." +#~ msgstr "Virhe nauhoitettaessa äänilaitteelta." #~ msgid "PCM 1" #~ msgstr "PCM 1"
View file
gst-plugins-bad-1.18.6.tar.xz/po/fr.po -> gst-plugins-bad-1.20.1.tar.xz/po/fr.po
Changed
@@ -3,14 +3,14 @@ # This file is distributed under the same license as the gst-plugins-bad package. # # Claude Paroz <claude@2xlibre.net>, 2008-2011. -# Stéphane Aulery <lkppo@free.fr>, 2015-2016. +# Stéphane Aulery <lkppo@free.fr>, 2015-2016, 2019. # msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2016-12-23 20:45+0100\n" +"Project-Id-Version: gst-plugins-bad 1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2019-05-12 03:46+0200\n" "Last-Translator: Stéphane Aulery <lkppo@free.fr>\n" "Language-Team: French <traduc@traduc.org>\n" "Language: fr\n" @@ -20,7 +20,7 @@ "X-Bugs: Report translation errors to the Language-Team address.\n" msgid "No URL set." -msgstr "" +msgstr "Aucune URL définie." msgid "OpenCV failed to load template image" msgstr "OpenCV n’a pas pu charger l’image modèle" @@ -73,6 +73,7 @@ msgid "Failed to get fragment URL." msgstr "Échec de la récupération de l’URL de fragment." +#, c-format msgid "Couldn't download fragments" msgstr "Impossible de télécharger les fragments" @@ -100,6 +101,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Impossible d’ouvrir le fichier « %s » en lecture." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Impossible de trouver le fichier de configuration de canal" @@ -123,6 +125,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Impossible de trouver le fichier de configuration du canal « %s »" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Le fichier de configuration ne contient aucun canal"
View file
gst-plugins-bad-1.18.6.tar.xz/po/fur.po -> gst-plugins-bad-1.20.1.tar.xz/po/fur.po
Changed
@@ -5,8 +5,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2016-12-30 13:28+0100\n" "Last-Translator: Fabio Tomat <f.t.public@gmail.com>\n" "Language-Team: Friulian <f.t.public@gmail.com>\n" @@ -72,6 +72,7 @@ msgid "Failed to get fragment URL." msgstr "No si è rivâts a otignî il URL dal frament" +#, c-format msgid "Couldn't download fragments" msgstr "Impussibil discjariâ i framents" @@ -99,6 +100,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Impussibil vierzi il file \"%s\" pe leture." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Impussibil cjatâ il file di configurazion canâl" @@ -122,6 +124,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Impussibil cjatâ il file di configurazion canâl: '%s'" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Il file di configurazion canâl nol à nissun canâl" @@ -149,11 +152,13 @@ #~ msgid "No file name specified for writing." #~ msgstr "Nissun non di file specificât pe scriture." +#, c-format #~ msgid "" #~ "Given file name \"%s\" can't be converted to local file name encoding." #~ msgstr "" #~ "Il non di file furnît \"%s\" nol pues jessi convertît te codifiche non " #~ "file locâl." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Impussibil vierzi il file \"%s\" pe scriture."
View file
gst-plugins-bad-1.18.6.tar.xz/po/gl.po -> gst-plugins-bad-1.20.1.tar.xz/po/gl.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.21.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2011-09-05 12:50+0200\n" "Last-Translator: Fran Dieguez <frandieguez@ubuntu.com>\n" "Language-Team: Galician <proxecto@trasno.net>\n" @@ -16,6 +16,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=2; plural=(n!=1);\n" "X-Poedit-Language: galego\n" @@ -71,6 +72,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -97,6 +99,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Non foi posíbel abrir o ficheiro «%s» para ler." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -120,29 +123,24 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Non foi posíbel abrir o ficheiro «%s» para ler." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Non foi posíbel ler o DVD." - #~ msgid "No file name specified for writing." #~ msgstr "Non se especificou ningún nome de ficheiro para a súa escritura." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Non foi posíbel abrir o ficheiro «%s» para escribir." -#~ msgid "Internal data flow error." -#~ msgstr "Produciuse un erro interno no fluxo de datos." - +#, c-format #~ msgid "Could not write to file \"%s\"." #~ msgstr "Non foi posíbel escribir no ficheiro «%s»." +#~ msgid "Internal data flow error." +#~ msgstr "Produciuse un erro interno no fluxo de datos." + #~ msgid "Could not open audio device for mixer control handling." 
#~ msgstr "" #~ "Non foi posíbel abrir o dispositivo de son para manexar o control do "
View file
gst-plugins-bad-1.18.6.tar.xz/po/gst-plugins-bad-1.0.pot -> gst-plugins-bad-1.20.1.tar.xz/po/gst-plugins-bad-1.0.pot
Changed
@@ -8,7 +8,7 @@ msgstr "" "Project-Id-Version: gst-plugins-bad-1.0\n" "Report-Msgid-Bugs-To: \n" -"POT-Creation-Date: 2022-02-02 15:07+0000\n" +"POT-Creation-Date: 2022-03-14 11:40+0000\n" "PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n" "Last-Translator: FULL NAME <EMAIL@ADDRESS>\n" "Language-Team: LANGUAGE <LL@li.org>\n" @@ -17,11 +17,11 @@ "Content-Type: text/plain; charset=CHARSET\n" "Content-Transfer-Encoding: 8bit\n" -#: ext/curl/gstcurlhttpsrc.c:1436 +#: ext/curl/gstcurlhttpsrc.c:1439 msgid "No URL set." msgstr "" -#: ext/opencv/gsttemplatematch.cpp:184 +#: ext/opencv/gsttemplatematch.cpp:189 msgid "OpenCV failed to load template image" msgstr "" @@ -48,34 +48,34 @@ msgid "Could not read DVD." msgstr "" -#: ext/smoothstreaming/gstmssdemux.c:429 +#: ext/smoothstreaming/gstmssdemux.c:430 #: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:735 msgid "This file contains no playable streams." msgstr "" -#: ext/sndfile/gstsfdec.c:769 +#: ext/sndfile/gstsfdec.c:771 msgid "Could not open sndfile stream for reading." msgstr "" -#: gst/asfmux/gstasfmux.c:1832 +#: gst/asfmux/gstasfmux.c:1834 msgid "Generated file has a larger preroll time than its streams duration" msgstr "" -#: gst/camerabin2/camerabingeneral.c:167 gst/camerabin2/gstcamerabin2.c:1861 -#: gst/camerabin2/gstdigitalzoom.c:283 gst/camerabin2/gstviewfinderbin.c:270 +#: gst/camerabin2/camerabingeneral.c:167 gst/camerabin2/gstcamerabin2.c:1866 +#: gst/camerabin2/gstdigitalzoom.c:283 gst/camerabin2/gstviewfinderbin.c:275 #, c-format msgid "Missing element '%s' - check your GStreamer installation." 
msgstr "" -#: gst/camerabin2/gstcamerabin2.c:347 +#: gst/camerabin2/gstcamerabin2.c:352 msgid "File location is set to NULL, please set it to a valid filename" msgstr "" -#: gst/camerabin2/gstwrappercamerabinsrc.c:585 +#: gst/camerabin2/gstwrappercamerabinsrc.c:587 msgid "Digitalzoom element couldn't be created" msgstr "" -#: gst/dvdspu/gstdvdspu.c:1041 +#: gst/dvdspu/gstdvdspu.c:1044 msgid "Subpicture format was not configured before data flow" msgstr "" @@ -89,31 +89,31 @@ msgstr "" #: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:4102 -#: gst/mpegtsdemux/mpegtsbase.c:1674 +#: gst/mpegtsdemux/mpegtsbase.c:1751 msgid "Internal data stream error." msgstr "" -#: sys/dvb/gstdvbsrc.c:1591 sys/dvb/gstdvbsrc.c:1805 +#: sys/dvb/gstdvbsrc.c:1597 sys/dvb/gstdvbsrc.c:1811 #, c-format msgid "Device \"%s\" does not exist." msgstr "" -#: sys/dvb/gstdvbsrc.c:1595 +#: sys/dvb/gstdvbsrc.c:1601 #, c-format msgid "Could not open frontend device \"%s\"." msgstr "" -#: sys/dvb/gstdvbsrc.c:1614 +#: sys/dvb/gstdvbsrc.c:1620 #, c-format msgid "Could not get settings from frontend device \"%s\"." msgstr "" -#: sys/dvb/gstdvbsrc.c:1631 +#: sys/dvb/gstdvbsrc.c:1637 #, c-format msgid "Cannot enumerate delivery systems from frontend device \"%s\"." msgstr "" -#: sys/dvb/gstdvbsrc.c:1809 +#: sys/dvb/gstdvbsrc.c:1815 #, c-format msgid "Could not open file \"%s\" for reading." msgstr ""
View file
gst-plugins-bad-1.18.6.tar.xz/po/hr.po -> gst-plugins-bad-1.20.1.tar.xz/po/hr.po
Changed
@@ -8,8 +8,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad-1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-02-03 13:38-0800\n" "Last-Translator: Božidar Putanec <bozidarp@yahoo.com>\n" "Language-Team: Croatian <lokalizacija@linux.hr>\n" @@ -76,6 +76,7 @@ msgid "Failed to get fragment URL." msgstr "Nije uspjelo dobiti fragment URL adrese." +#, c-format msgid "Couldn't download fragments" msgstr "Nije moguće preuzeti fragmente" @@ -105,6 +106,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Nije moguće otvoriti datoteku „%s“ za čitanje." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Nije moguće naći datoteku s postavkama za kanal" @@ -128,6 +130,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Nije moguće naći datoteku s postavkama za kanal: „%s“" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Datoteka s postavkama za kanal ne sadrži nijedan kanal"
View file
gst-plugins-bad-1.18.6.tar.xz/po/hu.po -> gst-plugins-bad-1.20.1.tar.xz/po/hu.po
Changed
@@ -1,27 +1,27 @@ # Hungarian translation for gst-plugins-bad. -# Copyright (C) 2007, 2008, 2009, 2012, 2014, 2015, 2017 Free Software Foundation, Inc. +# Copyright (C) 2007, 2008, 2009, 2012, 2014, 2015, 2017, 2019 Free Software Foundation, Inc. # This file is distributed under the same license as the gst-plugins-bad package. # # Gabor Kelemen <kelemeng@gnome.hu>, 2007, 2008, 2009, 2012. -# Balázs Úr <urbalazs@gmail.com>, 2014, 2015, 2017. +# Balázs Úr <ur.balazs@fsf.hu>, 2014, 2015, 2017, 2019. msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2017-03-19 00:53+0100\n" -"Last-Translator: Balázs Úr <urbalazs@gmail.com>\n" +"Project-Id-Version: gst-plugins-bad 1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2019-11-23 21:49+0100\n" +"Last-Translator: Balázs Úr <ur.balazs@fsf.hu>\n" "Language-Team: Hungarian <translation-team-hu@lists.sourceforge.net>\n" "Language: hu\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "X-Bugs: Report translation errors to the Language-Team address.\n" -"X-Generator: Lokalize 1.2\n" +"X-Generator: Lokalize 19.04.3\n" "Plural-Forms: nplurals=2; plural=(n != 1);\n" msgid "No URL set." -msgstr "" +msgstr "Nincs URL beállítva." msgid "OpenCV failed to load template image" msgstr "Az OpenCV-nek nem sikerült betöltenie a sablonképet" @@ -75,6 +75,7 @@ msgid "Failed to get fragment URL." msgstr "Nem sikerült lekérni a töredék URL-t." +#, c-format msgid "Couldn't download fragments" msgstr "Nem sikerült letölteni a töredékeket" @@ -102,6 +103,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "A fájl („%s”) nem nyitható meg olvasásra." 
+#, c-format msgid "Couldn't find channel configuration file" msgstr "Nem található csatorna beállítófájl" @@ -125,6 +127,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Nem található csatorna beállítófájl: „%s”" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "A csatorna beállítófájl nem tartalmaz egyetlen csatornát sem"
View file
gst-plugins-bad-1.18.6.tar.xz/po/id.po -> gst-plugins-bad-1.20.1.tar.xz/po/id.po
Changed
@@ -1,24 +1,25 @@ # Indonesian translations for gst-plugins-bad package. # This file is put in the public domain. -# Andika Triwidada <andika@gmail.com>, 2013. +# Andika Triwidada <andika@gmail.com>, 2013, 2021. # Andhika Padmawan <andhika.padmawan@gmail.com>, 2009-2016. # msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2016-11-14 20:38+0700\n" -"Last-Translator: Andhika Padmawan <andhika.padmawan@gmail.com>\n" +"Project-Id-Version: gst-plugins-bad 1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2021-09-07 19:48+0700\n" +"Last-Translator: Andika Triwidada <andika@gmail.com>\n" "Language-Team: Indonesian <translation-team-id@lists.sourceforge.net>\n" "Language: id\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "X-Bugs: Report translation errors to the Language-Team address.\n" +"X-Generator: Poedit 2.4.3\n" msgid "No URL set." -msgstr "" +msgstr "Tidak ada URL yang ditata." msgid "OpenCV failed to load template image" msgstr "OpenCV gagal memuat templat gambar" @@ -70,8 +71,9 @@ msgid "Failed to get fragment URL." msgstr "Gagal mendapat URL fragmen." +#, c-format msgid "Couldn't download fragments" -msgstr "Tak bisa mengunduh fragmen." +msgstr "Tak bisa mengunduh fragmen" msgid "Internal data stream error." msgstr "Galat arus data internal." @@ -97,6 +99,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Tak dapat membuka berkas \"%s\" untuk dibaca." 
+#, c-format msgid "Couldn't find channel configuration file" msgstr "Tak bisa temukan berkas konfigurasi kanal" @@ -120,47 +123,6 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Tak bisa temukan berkas konfigurasi kanal: ‘%s’" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Berkas konfigurasi kanal tak memuat kanal apapun" - -#~ msgid "failed to draw pattern" -#~ msgstr "gagal menggambar pola" - -#~ msgid "A GL error occured" -#~ msgstr "Sebuah galat GL terjadi" - -#~ msgid "format wasn't negotiated before get function" -#~ msgstr "format tidak dinegosiasikan sebelum mendapat fungsi" - -#~ msgid "Could not establish connection to sndio" -#~ msgstr "Tak bisa menjalin koneksi ke sndio" - -#~ msgid "Failed to query sndio capabilities" -#~ msgstr "Gagal mengkuiri kapabilitas sndio" - -#~ msgid "Could not configure sndio" -#~ msgstr "Tak dapat menata sndio" - -#~ msgid "Could not start sndio" -#~ msgstr "Tak dapat memulai sndio" - -#~ msgid "No file name specified for writing." -#~ msgstr "Tak ada nama berkas yang ditentukan untuk ditulis." - -#~ msgid "" -#~ "Given file name \"%s\" can't be converted to local file name encoding." -#~ msgstr "" -#~ "Nama berkas \"%s\" yang diberikan tak bisa dikonversi ke enkoding nama " -#~ "berkas lokal." - -#~ msgid "Could not open file \"%s\" for writing." -#~ msgstr "Tak dapat membuka berkas \"%s\" untuk ditulis." - -#~ msgid "Couldn't get the Manifest's URI" -#~ msgstr "Tak bisa memperoleh URI Manifes" - -#~ msgid "Could not write to file \"%s\"." -#~ msgstr "Tak dapat menulis ke berkas \"%s\"." - -#~ msgid "Internal data flow error." -#~ msgstr "Galat arus data internal."
View file
gst-plugins-bad-1.18.6.tar.xz/po/it.po -> gst-plugins-bad-1.20.1.tar.xz/po/it.po
Changed
@@ -9,8 +9,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad-1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-01-25 11:57+0100\n" "Last-Translator: Milo Casagrande <milo@milo.name>\n" "Language-Team: Italian <tp@lists.linux.it>\n" @@ -74,6 +74,7 @@ msgid "Failed to get fragment URL." msgstr "Impossibile ottenere l'URL del frammento." +#, c-format msgid "Couldn't download fragments" msgstr "Impossibile scaricare frammenti" @@ -103,6 +104,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Impossibile aprire il file «%s» in lettura." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Impossibile trovare il file di configurazione del canale" @@ -126,6 +128,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Impossibile trovare il file di configurazione del canale: «%s»" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Il file di configurazione del canale non contiene alcun canale"
View file
gst-plugins-bad-1.18.6.tar.xz/po/ja.po -> gst-plugins-bad-1.20.1.tar.xz/po/ja.po
Changed
@@ -5,8 +5,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.21.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2011-04-26 19:38+0900\n" "Last-Translator: Makoto Kato <makoto.kt@gmail.com>\n" "Language-Team: Japanese <translation-team-ja@lists.sourceforge.net>\n" @@ -14,6 +14,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "X-Generator: Lokalize 0.3\n" "Plural-Forms: nplurals=1; plural=0;\n" @@ -69,6 +70,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -95,7 +97,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "読み込み用にファイル \"%s\" を開くことができません。" -#, fuzzy +#, fuzzy, c-format msgid "Couldn't find channel configuration file" msgstr "仮想ミックスチャンネル設定" @@ -119,29 +121,24 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "仮想ミックスチャンネル設定" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "読み込み用にファイル \"%s\" を開くことができません。" - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "DVDを読み込むことができませんでした。" - #~ msgid "No file name specified for writing." #~ msgstr "書き込み用にファイル名が指定されていません。" +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "書き込み用にファイル \"%s\" を開くことができませんでした。" -#~ msgid "Internal data flow error." -#~ msgstr "内部データフローエラー。" - +#, c-format #~ msgid "Could not write to file \"%s\"." #~ msgstr "ファイル \"%s\" へ書き込むことができませんでした。" +#~ msgid "Internal data flow error." +#~ msgstr "内部データフローエラー。" + #~ msgid "Internal clock error." #~ msgstr "内部クロックエラー"
View file
gst-plugins-bad-1.18.6.tar.xz/po/ky.po -> gst-plugins-bad-1.20.1.tar.xz/po/ky.po
Changed
@@ -6,33 +6,119 @@ msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.5\n" "Report-Msgid-Bugs-To: \n" -"POT-Creation-Date: 2007-06-13 12:47+0200\n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2007-11-13 17:16+0600\n" "Last-Translator: Ilyas Bakirov <just_ilyas@yahoo.com>\n" "Language-Team: Kirghiz <i18n-team-ky-kyrgyz@lists.sourceforge.net>\n" -"X-Bugs: Report translation errors to the Language-Team address.\n" +"Language: ky\n" "MIME-Version: 1.0\n" -"Content-Type: text/plain; charset=utf-8\n" +"Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "X-Poedit-Language: Kyrgyz\n" "X-Poedit-Country: KYRGYZSTAN\n" -#: sys/dvb/gstdvbsrc.c:658 sys/dvb/gstdvbsrc.c:713 +msgid "No URL set." +msgstr "" + +msgid "OpenCV failed to load template image" +msgstr "" + +msgid "Could not read title information for DVD." +msgstr "" + +#, fuzzy, c-format +msgid "Failed to open DVD device '%s'." +msgstr "Алдын \"%s\" жабдыкты ачалган жок." + +msgid "Failed to set PGC based seeking." +msgstr "" + +msgid "" +"Could not read DVD. This may be because the DVD is encrypted and a DVD " +"decryption library is not installed." +msgstr "" + +msgid "Could not read DVD." +msgstr "" + +msgid "This file contains no playable streams." +msgstr "" + +#, fuzzy +msgid "Could not open sndfile stream for reading." +msgstr "\"%s\" файлы окууга ачылган жок." + +msgid "Generated file has a larger preroll time than its streams duration" +msgstr "" + +#, c-format +msgid "Missing element '%s' - check your GStreamer installation." +msgstr "" + +msgid "File location is set to NULL, please set it to a valid filename" +msgstr "" + +msgid "Digitalzoom element couldn't be created" +msgstr "" + +msgid "Subpicture format was not configured before data flow" +msgstr "" + +msgid "Failed to get fragment URL." 
+msgstr "" + +#, c-format +msgid "Couldn't download fragments" +msgstr "" + +msgid "Internal data stream error." +msgstr "" + #, c-format msgid "Device \"%s\" does not exist." msgstr "\"%s\" мындай жабдык жок." -#: sys/dvb/gstdvbsrc.c:662 #, c-format msgid "Could not open frontend device \"%s\"." msgstr "Алдын \"%s\" жабдыкты ачалган жок." -#: sys/dvb/gstdvbsrc.c:674 #, c-format msgid "Could not get settings from frontend device \"%s\"." msgstr "Алдын \"%s\" жабдыктан ырастоолор алынган жок." -#: sys/dvb/gstdvbsrc.c:717 +#, fuzzy, c-format +msgid "Cannot enumerate delivery systems from frontend device \"%s\"." +msgstr "Алдын \"%s\" жабдыктан ырастоолор алынган жок." + #, c-format msgid "Could not open file \"%s\" for reading." msgstr "\"%s\" файлы окууга ачылган жок." + +#, c-format +msgid "Couldn't find channel configuration file" +msgstr "" + +#, c-format +msgid "Couldn't load channel configuration file: '%s'" +msgstr "" + +#, c-format +msgid "Couldn't find details for channel '%s'" +msgstr "" + +#, c-format +msgid "No properties for channel '%s'" +msgstr "" + +#, c-format +msgid "Failed to set properties for channel '%s'" +msgstr "" + +#, c-format +msgid "Couldn't find channel configuration file: '%s'" +msgstr "" + +#, c-format +msgid "Channel configuration file doesn't contain any channels" +msgstr ""
View file
gst-plugins-bad-1.18.6.tar.xz/po/lt.po -> gst-plugins-bad-1.20.1.tar.xz/po/lt.po
Changed
@@ -5,8 +5,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad-0.10.6.3\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2008-05-14 02:13+0300\n" "Last-Translator: Gintautas Miliauskas <gintas@akl.lt>\n" "Language-Team: Lithuanian <komp_lt@konferencijos.lt>\n" @@ -14,6 +14,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "X-Generator: KBabel 1.11.4\n" "Plural-Forms: nplurals=3; plural=(n%10==1 && n%100!=11 ? 0 : n%10>=2 && (n" "%100<10 || n%100>=20) ? 1 : 2);\n" @@ -40,9 +41,8 @@ "decryption library is not installed." msgstr "" -#, fuzzy msgid "Could not read DVD." -msgstr "Nepavyko rašyti į failą „%s“." +msgstr "" msgid "This file contains no playable streams." msgstr "" @@ -70,6 +70,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -96,6 +97,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Nepavyko atverti failo „%s“ skaitymui." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -111,38 +113,21 @@ msgid "No properties for channel '%s'" msgstr "" -#, fuzzy, c-format +#, c-format msgid "Failed to set properties for channel '%s'" -msgstr "Nepavyko atverti išorinės pusės įrenginio „%s“." +msgstr "" #, c-format msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Nepavyko atverti failo „%s“ skaitymui." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Nepavyko rašyti į failą „%s“." - #~ msgid "No file name specified for writing." #~ msgstr "Nenurodytas failo rašymui pavadinimas." 
+#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Nepavyko atverti failo „%s“ rašymui." - -#, fuzzy -#~ msgid "Internal data flow error." -#~ msgstr "Vidinė duomenų srauto klaida." - -#~ msgid "Could not write to file \"%s\"." -#~ msgstr "Nepavyko rašyti į failą „%s“." - -#, fuzzy -#~ msgid "Internal clock error." -#~ msgstr "Vidinė duomenų srauto klaida."
View file
gst-plugins-bad-1.18.6.tar.xz/po/lv.po -> gst-plugins-bad-1.20.1.tar.xz/po/lv.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.2.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2014-04-20 15:52+0300\n" "Last-Translator: Rihards Prieditis <rprieditis@gmail.com>\n" "Language-Team: Latvian <translation-team-lv@lists.sourceforge.net>\n" @@ -16,6 +16,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "X-Poedit-Language: Latvian\n" "X-Poedit-Country: LATVIA\n" "Plural-Forms: nplurals=3; plural=(n%10==1 && n%100!=11 ? 0 : n != 0 ? 1 : " @@ -76,6 +77,7 @@ msgid "Failed to get fragment URL." msgstr "Neizdevās saņemt fragmenta URL." +#, c-format msgid "Couldn't download fragments" msgstr "Nevarēja lejupielādēt fragmentus" @@ -102,7 +104,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Nevarēja atvērt datni “%s” lasīšanai." -#, fuzzy +#, fuzzy, c-format msgid "Couldn't find channel configuration file" msgstr "Nevarēja atrast DVB kanāla konfigurācijas datni" @@ -120,16 +122,30 @@ #, fuzzy, c-format msgid "Failed to set properties for channel '%s'" -msgstr "Nevarēja atrast sīkāku informāciju par DVB kanālu %s" +msgstr "Neizdevās atvērt DVD ierīci “%s”." #, fuzzy, c-format msgid "Couldn't find channel configuration file: '%s'" msgstr "Nevarēja atrast DVB kanāla konfigurācijas datni" -#, fuzzy +#, fuzzy, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "DVB kanāla konfigurācijas datne nesatur nevienu kanālu" +#~ msgid "Couldn't get the Manifest's URI" +#~ msgstr "Nevarēja saņemt manifesta URI" + +#~ msgid "No file name specified for writing." +#~ msgstr "Ierakstīšanai nav norādīts neviens datnes nosaukums." + +#, c-format +#~ msgid "Could not open file \"%s\" for writing." 
+#~ msgstr "Nav iespējams atvērt datni “%s” ierakstīšanai." + +#, c-format +#~ msgid "Could not write to file \"%s\"." +#~ msgstr "Nevarēja ierakstīt datnē “%s”." + #~ msgid "Could not establish connection to sndio" #~ msgstr "Nevarēja izveidot savienojumu ar sndio" @@ -142,27 +158,16 @@ #~ msgid "Could not start sndio" #~ msgstr "Nevarēja palaist sndio" -#~ msgid "No file name specified for writing." -#~ msgstr "Ierakstīšanai nav norādīts neviens datnes nosaukums." +#~ msgid "Internal data flow error." +#~ msgstr "Iekšēja datu plūsmas kļūda." +#, c-format #~ msgid "" #~ "Given file name \"%s\" can't be converted to local file name encoding." #~ msgstr "" #~ "Doto datnes nosaukumu “%s” nevar pārveidot uz lokālo datnes nosaukuma " #~ "kodējumu." -#~ msgid "Could not open file \"%s\" for writing." -#~ msgstr "Nav iespējams atvērt datni “%s” ierakstīšanai." - -#~ msgid "Internal data flow error." -#~ msgstr "Iekšēja datu plūsmas kļūda." - -#~ msgid "Couldn't get the Manifest's URI" -#~ msgstr "Nevarēja saņemt manifesta URI" - -#~ msgid "Could not write to file \"%s\"." -#~ msgstr "Nevarēja ierakstīt datnē “%s”." - #~ msgid "Internal clock error." #~ msgstr "Iekšējā pulksteņa kļūda."
View file
gst-plugins-bad-1.18.6.tar.xz/po/mt.po -> gst-plugins-bad-1.20.1.tar.xz/po/mt.po
Changed
@@ -4,8 +4,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad-0.10.8.3\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2008-10-26 20:27+0100\n" "Last-Translator: Michel Bugeja <michelbugeja@rabatmalta.com>\n" "Language-Team: Maltese <translation-team-mt@lists.sourceforge.net>\n" @@ -13,6 +13,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "X-Poedit-Language: Maltese\n" "X-Poedit-Country: MALTA\n" "X-Poedit-SourceCharset: utf-8\n" @@ -69,6 +70,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -95,7 +97,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Ma nistax naqra mill-fajl \"%s\"." -#, fuzzy +#, fuzzy, c-format msgid "Couldn't find channel configuration file" msgstr "Virtual mixer channel configuration" @@ -119,37 +121,33 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Virtual mixer channel configuration" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Ma nistax naqra mill-fajl \"%s\"." +#~ msgid "Internal clock error." +#~ msgstr "Internal clock error." -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Ma stajtx naqra informazzjoni fuq it-titlu tad-DVD." +#~ msgid "Internal data flow error." +#~ msgstr "Internal data flow error." #~ msgid "No file name specified for writing." #~ msgstr "L-ebda isem speċifikat biex nikteb." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Ma nistax niftaħ fajl \"%s\" biex nikteb." -#~ msgid "Internal data flow error." -#~ msgstr "Internal data flow error." - +#, c-format #~ msgid "Could not write to file \"%s\"." 
#~ msgstr "Ma nistax nikteb fil-fajl \"%s\". " -#~ msgid "Internal clock error." -#~ msgstr "Internal clock error." - #~ msgid "Failed to configure TwoLAME encoder. Check your encoding parameters." #~ msgstr "" #~ "Ma nistax naghmel il-konfigurazzjoni ta' TwoLAME encoder. Iċċekkja " #~ "parametri tal-encoding." +#, c-format #~ msgid "" #~ "The requested bitrate %d kbit/s for property '%s' is not allowed. The " #~ "bitrate was changed to %d kbit/s." @@ -157,6 +155,73 @@ #~ "Il-bitrate rikjesta ta' %d kbit/s għal-property '%s' mhux permessa. Il-" #~ "bitrate ġiet mibdula għal %d kbit/s. " +#~ msgid "Could not open audio device for mixer control handling." +#~ msgstr "Ma nistax niftaħ apparat tal-awdjo għal mixer control handling." + +#~ msgid "" +#~ "Could not open audio device for mixer control handling. This version of " +#~ "the Open Sound System is not supported by this element." +#~ msgstr "" +#~ "Ma nistax niftaħ apparat tal-awdjo għal mixer control handling. Din il-" +#~ "verzjoni ta' Open Sound System mhux issapportjata minn din l-element." 
+ +#~ msgid "Fast" +#~ msgstr "Fast" + +#~ msgid "Low" +#~ msgstr "Low" + +#~ msgid "Medium" +#~ msgstr "Medium" + +#~ msgid "High" +#~ msgstr "High" + +#~ msgid "Very high" +#~ msgstr "Very high" + +#~ msgid "Production" +#~ msgstr "Production" + +#~ msgid "Off" +#~ msgstr "Off" + +#~ msgid "On" +#~ msgstr "On" + +#~ msgid "Stereo" +#~ msgstr "Stereo" + +#~ msgid "Surround sound" +#~ msgstr "Surround sound" + +#~ msgid "Input mix" +#~ msgstr "Input mix" + +#~ msgid "Front" +#~ msgstr "Quddiem" + +#~ msgid "Rear" +#~ msgstr "Wara" + +#~ msgid "Side" +#~ msgstr "Ġenb" + +#~ msgid "Center / LFE" +#~ msgstr "Center / LFE" + +#~ msgid "Microphone" +#~ msgstr "Mikrofonu" + +#~ msgid "Front panel microphone" +#~ msgstr "Mikrofonu tal-panella ta' quddiem" + +#~ msgid "Input" +#~ msgstr "Input" + +#~ msgid "Line-in" +#~ msgstr "Line-in" + #~ msgid "PCM 1" #~ msgstr "PCM 1" @@ -169,6 +234,60 @@ #~ msgid "PCM 4" #~ msgstr "PCM 4" +#~ msgid "Green connector" +#~ msgstr "Connector aħdar" + +#~ msgid "Green front panel connector" +#~ msgstr "Front panel connector aħdar" + +#~ msgid "Pink connector" +#~ msgstr "Connector roża" + +#~ msgid "Pink front panel connector" +#~ msgstr "Front panel connector roża" + +#~ msgid "Blue connector" +#~ msgstr "Connector Blu" + +#~ msgid "Blue front panel connector" +#~ msgstr "Front panel connector blu" + +#~ msgid "Orange connector" +#~ msgstr "Connector oranġjo" + +#~ msgid "Orange front panel connector" +#~ msgstr "Front Panel connector oranġjo" + +#~ msgid "Black connector" +#~ msgstr "Connector iswed" + +#~ msgid "Black front panel connector" +#~ msgstr "Front panel connector iswed" + +#~ msgid "Gray connector" +#~ msgstr "Connector Griż" + +#~ msgid "Gray front panel connector" +#~ msgstr "Front panel connector Griż" + +#~ msgid "White connector" +#~ msgstr "Connector abjad" + +#~ msgid "White front panel connector" +#~ msgstr "Front panel connector abjad" + +#~ msgid "Red connector" +#~ msgstr "Connector aħmar" + +#~ msgid "Red front 
panel connector" +#~ msgstr "Front panel connector aħmar" + +#~ msgid "Yellow connector" +#~ msgstr "Connector isfar" + +#~ msgid "Yellow front panel connector" +#~ msgstr "Front panel connector isfar" + #~ msgid "Green connector function" #~ msgstr "Connector function aħdar" @@ -222,3 +341,57 @@ #~ msgid "Yellow front panel connector function" #~ msgstr "Front panel connector function isfar" + +#~ msgid "Front panel line-in" +#~ msgstr "Front panel line-in" + +#~ msgid "Headphones" +#~ msgstr "Headphones" + +#~ msgid "Front panel headphones" +#~ msgstr "Front panel headphones" + +#~ msgid "PCM" +#~ msgstr "PCM" + +#~ msgid "Virtual mixer input" +#~ msgstr "Virtual mixer input" + +#~ msgid "Virtual mixer output" +#~ msgstr "Virtual mixer output" + +#~ msgid "" +#~ "Could not open audio device for playback. Device is being used by another " +#~ "application." +#~ msgstr "" +#~ "Ma nistax niftaħ apparat tal-awdjo biex indoqq. Apparat qed jintuża minn " +#~ "programm ieħor." + +#~ msgid "" +#~ "Could not open audio device for playback. You don't have permission to " +#~ "open the device." +#~ msgstr "" +#~ "Ma nistax niftaħ apparat tal-awdjo biex indoqq. M'għandekx aċċess għall-" +#~ "apparat." + +#~ msgid "Could not open audio device for playback." +#~ msgstr "Ma nistax niftaħ apparat tal-awdjo biex indoqq." + +#~ msgid "" +#~ "Could not open audio device for playback. This version of the Open Sound " +#~ "System is not supported by this element." +#~ msgstr "" +#~ "Ma nistax niftaħ apparat tal-awdjo biex indoqq. Dil il-verżjoni ta' Open " +#~ "Sound System mhux issapportjatha minn dan l-element." + +#~ msgid "Playback is not supported by this audio device." +#~ msgstr "Id-daqq mhux issappartjat minn dan l-apparat tal-awdjo." + +#~ msgid "Audio playback error." +#~ msgstr "Żball fiid-daqq tal-awdjo." + +#~ msgid "Recording is not supported by this audio device." +#~ msgstr "Irrekordjar mhux issapportjat minn dan l-apparat tal-awdjo." 
+ +#~ msgid "Error recording from audio device." +#~ msgstr "Żball fl-irrekordjar mill-apparat tal-awdjo."
View file
gst-plugins-bad-1.18.6.tar.xz/po/nb.po -> gst-plugins-bad-1.20.1.tar.xz/po/nb.po
Changed
@@ -2,14 +2,14 @@ # This file is put in the public domain. # # Kjartan Maraas <kmaraas@gnome.org>, 2004-2007. -# Johnny A. Solbu <johnny@solbu.net>, 2012-2017 +# Johnny A. Solbu <johnny@solbu.net>, 2012-2019 # msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2017-01-05 01:34+0100\n" +"Project-Id-Version: gst-plugins-bad 1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2019-09-03 09:33+0200\n" "Last-Translator: Johnny A. Solbu <johnny@solbu.net>\n" "Language-Team: Norwegian Bokmaal <i18n-nb@lister.ping.uio.no>\n" "Language: nb_NO\n" @@ -20,7 +20,7 @@ "X-Generator: Poedit 1.8.7.1\n" msgid "No URL set." -msgstr "" +msgstr "Ingen URL satt." msgid "OpenCV failed to load template image" msgstr "OpenCV kunne ikke laste mal-bilde" @@ -71,6 +71,7 @@ msgid "Failed to get fragment URL." msgstr "Klarte ikke å hente fragment-nettadresse." +#, c-format msgid "Couldn't download fragments" msgstr "Kunne ikke laste ned fragmenter" @@ -97,6 +98,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Kunne ikke åpne filen «%s» for lesing." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Kunne ikke finne kanal-konfigurasjonsfil" @@ -120,6 +122,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Kunne ikke finne kanal-konfigurasjonsfil: «%s»" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Kanalkonfigurasjonsfilen inneholder ingen kanaler"
View file
gst-plugins-bad-1.18.6.tar.xz/po/nl.po -> gst-plugins-bad-1.20.1.tar.xz/po/nl.po
Changed
@@ -6,8 +6,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.12.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2017-10-24 22:43+0100\n" "Last-Translator: Freek de Kruijf <f.de.kruijf@gmail.com>\n" "Language-Team: Dutch <vertaling@vrijschrift.org>\n" @@ -74,6 +74,7 @@ msgid "Failed to get fragment URL." msgstr "Verkrijgen van de URL van het fragment is mislukt." +#, c-format msgid "Couldn't download fragments" msgstr "Kan fragmenten niet downloaden" @@ -100,6 +101,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Kan bestand \"%s\" niet openen om te lezen." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Kon het configuratiebestand van het kanaal niet vinden" @@ -123,6 +125,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Kon het configuratiebestand van het kanaal niet vinden: '%s'" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Het configuratiebestand bevat geen enkel kanaal"
gst-plugins-bad-1.18.6.tar.xz/po/or.po -> gst-plugins-bad-1.20.1.tar.xz/po/or.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-0.8.3\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2004-09-27 13:32+0530\n" "Last-Translator: Gora Mohanty <gora_mohanty@yahoo.co.in>\n" "Language-Team: Oriya <gora_mohanty@yahoo.co.in>\n" @@ -69,6 +69,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -95,6 +96,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "\"%s\" ଫାଇଲ ପଢ଼ିବା ପାଇଁ ଖୋଲିହେଲା ନାହିଁ." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -118,6 +120,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr ""
gst-plugins-bad-1.18.6.tar.xz/po/pl.po -> gst-plugins-bad-1.20.1.tar.xz/po/pl.po
Changed
@@ -5,8 +5,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-01-25 05:41+0100\n" "Last-Translator: Jakub Bogusz <qboosh@pld-linux.org>\n" "Language-Team: Polish <translation-team-pl@lists.sourceforge.net>\n" @@ -69,6 +69,7 @@ msgid "Failed to get fragment URL." msgstr "Nie udało się uzyskać URL-a fragmentu." +#, c-format msgid "Couldn't download fragments" msgstr "Nie udało się pobrać fragmentów" @@ -96,6 +97,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Nie udało się otworzyć pliku \"%s\" do odczytu." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Nie udało się odnaleźć pliku konfiguracyjnego kanałów" @@ -119,5 +121,6 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Nie udało się odnaleźć pliku konfiguracyjnego kanałów: '%s'" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Plik konfiguracyjny kanałów nie zawiera żadnych kanałów"
gst-plugins-bad-1.18.6.tar.xz/po/pt_BR.po -> gst-plugins-bad-1.20.1.tar.xz/po/pt_BR.po
Changed
@@ -1,15 +1,15 @@ # Brazilian Portuguese translation of gst-plugins-bad. # This file is distributed under the same license as the gst-plugins-bad package. -# Copyright (C) 2007-2016 Free Software Foundation, Inc. +# Copyright (C) 2007-2021 Free Software Foundation, Inc. # Raphael Higino <In memorian>, 2007. -# Fabrício Godoy <skarllot@gmail.com>, 2008-2018. +# Fabrício Godoy <skarllot@gmail.com>, 2008-2021. # msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad-1.12.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2018-10-20 15:25-0300\n" +"Project-Id-Version: gst-plugins-bad-1.19.2\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2021-09-26 12:26-0300\n" "Last-Translator: Fabrício Godoy <skarllot@gmail.com>\n" "Language-Team: Brazilian Portuguese <ldpbr-translation@lists.sourceforge." "net>\n" @@ -19,9 +19,10 @@ "Content-Transfer-Encoding: 8bit\n" "X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=2; plural=(n > 1);\n" +"X-Generator: Poedit 3.0\n" msgid "No URL set." -msgstr "" +msgstr "Nenhum URL definido." msgid "OpenCV failed to load template image" msgstr "O OpenCV falhou ao carregar a imagem modelo" @@ -69,11 +70,12 @@ msgstr "O elemento Digitalzoom não pôde ser criado" msgid "Subpicture format was not configured before data flow" -msgstr "O formato de subimagem não foi configurado antes do fluxo de dados" +msgstr "O formato de sub-imagem não foi configurado antes do fluxo de dados" msgid "Failed to get fragment URL." msgstr "Falha ao obter um fragmento de URL." +#, c-format msgid "Couldn't download fragments" msgstr "Não foi possível baixar os fragmentos" @@ -103,6 +105,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Não foi possível abrir o arquivo \"%s\" para leitura." 
+#, c-format msgid "Couldn't find channel configuration file" msgstr "Não foi possível encontrar o arquivo de configuração de canal" @@ -112,7 +115,7 @@ #, c-format msgid "Couldn't find details for channel '%s'" -msgstr "Não foi possível encontrar detalhes para o canal DVB \"%s\"" +msgstr "Não foi possível encontrar detalhes para o canal \"%s\"" #, c-format msgid "No properties for channel '%s'" @@ -126,6 +129,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Não foi possível encontrar o arquivo de configuração de canal: \"%s\"" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Nenhum canal foi encontrado no arquivo de configuração de canal" @@ -136,7 +140,7 @@ #~ msgstr "Um erro ocorreu no GL" #~ msgid "format wasn't negotiated before get function" -#~ msgstr "O formato não foi negociado antes da chamada da função" +#~ msgstr "o formato não foi negociado antes da chamada da função" #~ msgid "Could not establish connection to sndio" #~ msgstr "Não foi possível estabelecer a conexão para sndio"
gst-plugins-bad-1.18.6.tar.xz/po/ro.po -> gst-plugins-bad-1.20.1.tar.xz/po/ro.po
Changed
@@ -1,27 +1,35 @@ # Romanian translation for gst-plugins-bad # This file is distributed under the same license as the gst-plugins-bad package. +# # Lucian Adrian Grijincu <lucian.grijincu@gmail.com>, 2010. +# Florentina Mușat <florentina.musat.28@gmail.com>, 2020. +# Actualizare a mesajelor, de la fișierul „gst-plugins-bad-1.19.2.pot”. +# Eliminare a mesajelor ce-au dispărut în ultima versiune. +# S-au efectuat mici corecții +# Actualizări realizate de Remus-Gabriel Chelu <remusgabriel.chelu@disroot.org>, 2022. +# msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 0.10.18.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2010-08-16 03:11+0300\n" -"Last-Translator: Lucian Adrian Grijincu <lucian.grijincu@gmail.com>\n" +"Project-Id-Version: gst-plugins-bad 1.19.2\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-09-23 01:34+0100\n" +"PO-Revision-Date: 2022-01-17 11:49+0100\n" +"Last-Translator: Remus-Gabriel Chelu <remusgabriel.chelu@disroot.org>\n" "Language-Team: Romanian <translation-team-ro@lists.sourceforge.net>\n" "Language: ro\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=3; plural=(n==1 ? 0 : (n==0 || (n%100 > 0 && n%100 < " "20)) ? 1 : 2);;\n" -"X-Generator: Virtaal 0.6.1\n" +"X-Generator: Poedit 2.3.1\n" msgid "No URL set." -msgstr "" +msgstr "Niciun URL stabilit." msgid "OpenCV failed to load template image" -msgstr "" +msgstr "OpenCV a eșuat să încarce imaginea șablon" msgid "Could not read title information for DVD." msgstr "Nu s-au putut citi informațiile despre titlu pentru DVD." @@ -37,39 +45,47 @@ "Could not read DVD. This may be because the DVD is encrypted and a DVD " "decryption library is not installed." msgstr "" +"Nu s-a putut citi DVD-ul. 
Aceasta poate fi pentru că DVD-ul este criptat și " +"o bibliotecă de decriptare a DVD-urilor nu este instalată." -#, fuzzy msgid "Could not read DVD." -msgstr "Nu s-au putut citi informațiile despre titlu pentru DVD." +msgstr "Nu s-a putut citi DVD-ul." msgid "This file contains no playable streams." -msgstr "" +msgstr "Fișierul nu conține fluxuri de redat." -#, fuzzy msgid "Could not open sndfile stream for reading." -msgstr "Nu s-a putut deschide fișierul „%s” pentru citire." +msgstr "Nu s-a putut deschide fluxul sndfile pentru citire." msgid "Generated file has a larger preroll time than its streams duration" msgstr "" +"Fișierul generat are un timp de pre-rulare mai mare decât durata fluxurilor " +"acestuia" #, c-format msgid "Missing element '%s' - check your GStreamer installation." -msgstr "" +msgstr "Lipsește elementul „%s” - verificați instalarea GStreamer." msgid "File location is set to NULL, please set it to a valid filename" msgstr "" +"Locația fișierului este stabilită la NULL, stabiliți-o la un nume de fișier " +"valid" msgid "Digitalzoom element couldn't be created" -msgstr "" +msgstr "Elementul digitalzoom nu a putut fi creat" +# R-GC, scrie: +# modificat de la: +# „Formatul de subpictură nu a fost configurat înainte de fluxul de date” msgid "Subpicture format was not configured before data flow" -msgstr "" +msgstr "Formatul de sub-imagine nu a fost configurat înainte de fluxul de date" msgid "Failed to get fragment URL." -msgstr "" +msgstr "Nu s-a putut obține URL-ul fragment." +#, c-format msgid "Couldn't download fragments" -msgstr "" +msgstr "Nu s-au putut descărca fragmentele" msgid "Internal data stream error." msgstr "Eroare internă a fluxului de date." @@ -78,64 +94,56 @@ msgid "Device \"%s\" does not exist." msgstr "Dispozitivul „%s” nu există." +# R-GC, scrie: +# modificat de la: +# „Nu s-a putut deschide dispozitivul frontend „%s”.” #, c-format msgid "Could not open frontend device \"%s\"." 
-msgstr "Nu s-a putut deschide dispozitivul frontend „%s”." +msgstr "Nu s-a putut deschide dispozitivul frontal „%s”." +# R-GC, scrie: +# modificat de la: +# „Nu s-au putut obține configurările de la dispozitivul frontend „%s”.” #, c-format msgid "Could not get settings from frontend device \"%s\"." -msgstr "Nu s-au putut obține configurările de la dispozitivul frontend „%s”." +msgstr "Nu s-au putut obține configurările de la dispozitivul frontal „%s”." -#, fuzzy, c-format +# R-GC, scrie: +# modificat de la: +# „Nu s-au putut enumera sistemele de livrare de la dispozitivul de interfață „%s”.” +#, c-format msgid "Cannot enumerate delivery systems from frontend device \"%s\"." -msgstr "Nu s-au putut obține configurările de la dispozitivul frontend „%s”." +msgstr "" +"Nu s-au putut enumera sistemele de livrare de la dispozitivul frontal „%s”." #, c-format msgid "Could not open file \"%s\" for reading." msgstr "Nu s-a putut deschide fișierul „%s” pentru citire." +#, c-format msgid "Couldn't find channel configuration file" -msgstr "" +msgstr "Nu s-a putut găsi fișierul de configurare a canalului" #, c-format msgid "Couldn't load channel configuration file: '%s'" -msgstr "" +msgstr "Nu s-a putut încărca fișierul de configurare a canalului: „%s”" #, c-format msgid "Couldn't find details for channel '%s'" -msgstr "" +msgstr "Nu s-au putut găsi detalii pentru canalul „%s”" #, c-format msgid "No properties for channel '%s'" -msgstr "" +msgstr "Nu sunt proprietăți pentru canalul „%s”" -#, fuzzy, c-format +#, c-format msgid "Failed to set properties for channel '%s'" -msgstr "Nu s-a putut deschide dispozitivul DVD „%s”." 
+msgstr "Nu s-au putut stabili proprietățile pentru canalul „%s”" #, c-format msgid "Couldn't find channel configuration file: '%s'" -msgstr "" +msgstr "Nu s-a putut găsi fișierul de configurare al canalului: „%s”" +#, c-format msgid "Channel configuration file doesn't contain any channels" -msgstr "" - -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Nu s-a putut deschide fișierul „%s” pentru citire." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "Nu s-au putut citi informațiile despre titlu pentru DVD." - -#~ msgid "No file name specified for writing." -#~ msgstr "Niciun nume de fișier specificat pentru scriere." - -#~ msgid "Could not open file \"%s\" for writing." -#~ msgstr "Nu s-a putut deschide fișierul „%s” pentru scriere." - -#~ msgid "Internal data flow error." -#~ msgstr "Eroare internă de flux al datelor." - -#~ msgid "Could not write to file \"%s\"." -#~ msgstr "Nu s-a putut scrie în fișierul „%s”." +msgstr "Fișierul de configurare al canalului nu conține niciun canal"
gst-plugins-bad-1.18.6.tar.xz/po/ru.po -> gst-plugins-bad-1.20.1.tar.xz/po/ru.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-02-02 09:10+0300\n" "Last-Translator: Yuri Kozlov <yuray@komyakino.ru>\n" "Language-Team: Russian <gnu@d07.ru>\n" @@ -74,6 +74,7 @@ msgid "Failed to get fragment URL." msgstr "Ошибка при получении URL фрагмента." +#, c-format msgid "Couldn't download fragments" msgstr "Не удалось скачать фрагменты" @@ -100,6 +101,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Не удалось открыть для чтения файл «%s»." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Не удалось найти файл настройки каналов" @@ -123,6 +125,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Не удалось найти файл настройки каналов: «%s»" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Файл настройки каналов не содержит каналов"
gst-plugins-bad-1.18.6.tar.xz/po/sk.po -> gst-plugins-bad-1.20.1.tar.xz/po/sk.po
Changed
@@ -2,26 +2,27 @@ # Czech translations of gst-plugins. # Copyright (C) 2004 gst-plugins' COPYRIGHT HOLDER # This file is put in the public domain. -# Peter Tuhársky <tuharsky@misbb.sk>, 2007, 2008, 2009, 2010, 2014, 2016. +# Peter Tuhársky <tuharsky@misbb.sk>, 2007, 2008, 2009, 2010, 2014, 2016, 2020. # msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad 1.7.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2016-05-20 12:33+0100\n" +"Project-Id-Version: gst-plugins-bad 1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2020-06-11 09:09+0200\n" "Last-Translator: Peter Tuhársky <tuharsky@misbb.sk>\n" "Language-Team: Slovak <sk-i18n@lists.linux.sk>\n" "Language: sk\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=3; plural=(n%10==1 && n%100!=11 ? 0 : n%10>=2 && n" "%10<=4 && (n%100<10 || n%100>=20) ? 1 : 2);\n" -"X-Generator: Poedit 1.6.10\n" +"X-Generator: Poedit 2.2.1\n" msgid "No URL set." -msgstr "" +msgstr "Nebola nastavená URL adresa." msgid "OpenCV failed to load template image" msgstr "OpenCV nedokázalo načítať obrazovú šablónu" @@ -74,6 +75,7 @@ msgid "Failed to get fragment URL." msgstr "Nepodarilo sa získať URL fragmentu." +#, c-format msgid "Couldn't download fragments" msgstr "Nepodarilo sa získať fragmenty" @@ -100,33 +102,33 @@ msgid "Could not open file \"%s\" for reading." msgstr "Nepodarilo sa otvoriť súbor \"%s\" na čítanie." 
-#, fuzzy +#, c-format msgid "Couldn't find channel configuration file" -msgstr "Nepodarilo sa nájsť konfiguračný súbor DVB kanálu" +msgstr "Nepodarilo sa nájsť konfiguračný súbor kanálu" -#, fuzzy, c-format +#, c-format msgid "Couldn't load channel configuration file: '%s'" -msgstr "Nepodarilo sa načítať konfiguračný súbor DVB kanálu: %s" +msgstr "Nepodarilo sa načítať konfiguračný súbor kanálu: %s" -#, fuzzy, c-format +#, c-format msgid "Couldn't find details for channel '%s'" -msgstr "Nepodarilo sa nájsť podrobnosti o DVB kanáli %s" +msgstr "Nepodarilo sa nájsť podrobnosti o kanáli %s" -#, fuzzy, c-format +#, c-format msgid "No properties for channel '%s'" -msgstr "Nepodarilo sa nájsť podrobnosti o DVB kanáli %s" +msgstr "Nepodarilo sa nájsť podrobnosti o kanáli %s" -#, fuzzy, c-format +#, c-format msgid "Failed to set properties for channel '%s'" -msgstr "Nepodarilo sa nájsť podrobnosti o DVB kanáli %s" +msgstr "Nepodarilo sa nastaviť vlastnosti pre kanál '%s'" -#, fuzzy, c-format +#, c-format msgid "Couldn't find channel configuration file: '%s'" -msgstr "Nepodarilo sa nájsť konfiguračný súbor DVB kanálu" +msgstr "Nepodarilo sa nájsť konfiguračný súbor kanálu: '%s'" -#, fuzzy +#, c-format msgid "Channel configuration file doesn't contain any channels" -msgstr "Konfiguračný súbor DVB kanála neobsahuje žiadne kanály" +msgstr "Konfiguračný súbor kanálov neobsahuje žiadne kanály" #~ msgid "format wasn't negotiated before get function" #~ msgstr "formát nebol dohodnutý pred funkciou stiahnutia" @@ -143,6 +145,9 @@ #~ msgid "Could not start sndio" #~ msgstr "Nepodarilo sa spustiť sndio" +#~ msgid "Internal data flow error." +#~ msgstr "Vnútorná chyba prúdu údajov." + #~ msgid "No file name specified for writing." #~ msgstr "Nebolo zadané žiadne meno súboru pre zápis." @@ -155,9 +160,6 @@ #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Nepodarilo sa otvoriť súbor \"%s\" pre zápis." -#~ msgid "Internal data flow error." 
-#~ msgstr "Vnútorná chyba prúdu údajov." - #~ msgid "Couldn't get the Manifest's URI" #~ msgstr "Nepodarilo sa získať URI Manifestu"
gst-plugins-bad-1.18.6.tar.xz/po/sl.po -> gst-plugins-bad-1.20.1.tar.xz/po/sl.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.21.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2011-04-26 15:21+0100\n" "Last-Translator: Klemen Košir <klemen.kosir@gmx.com>\n" "Language-Team: Slovenian <translation-team-sl@lists.sourceforge.net>\n" @@ -16,6 +16,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" "Plural-Forms: nplurals=4; plural=(n%100==1 ? 1 : n%100==2 ? 2 : n%100==3 || n" "%100==4 ? 3 : 0);\n" "X-Poedit-Language: Slovenian\n" @@ -74,6 +75,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -100,6 +102,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Datoteke \"%s\" ni mogoče odpreti za branje." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -123,29 +126,24 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "Datoteke \"%s\" ni mogoče odpreti za branje." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "DVD-ja ni mogoče prebrati." - #~ msgid "No file name specified for writing." #~ msgstr "Ni navedenega imena datoteke za pisanje." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Datoteke \"%s\" ni mogoče odpreti za pisanje." -#~ msgid "Internal data flow error." -#~ msgstr "Notranja napaka pretočnosti podatkov." - +#, c-format #~ msgid "Could not write to file \"%s\"." #~ msgstr "V datoteko \"%s\" ni mogoče pisati." +#~ msgid "Internal data flow error." +#~ msgstr "Notranja napaka pretočnosti podatkov." + #~ msgid "Volume" #~ msgstr "Glasnost"
gst-plugins-bad-1.18.6.tar.xz/po/sq.po -> gst-plugins-bad-1.20.1.tar.xz/po/sq.po
Changed
@@ -5,8 +5,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 0.10.7.2\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2008-08-15 16:07+0200\n" "Last-Translator: Laurent Dhima <laurenti@alblinux.net>\n" "Language-Team: Albanian <translation-team-sq@lists.sourceforge.net>\n" @@ -14,6 +14,7 @@ "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" msgid "No URL set." msgstr "" @@ -39,7 +40,7 @@ #, fuzzy msgid "Could not read DVD." -msgstr "I pamundur shkrimi tek file \"%s\"." +msgstr "I pamundur shkrimi në dispozitivin \"%s\"." msgid "This file contains no playable streams." msgstr "" @@ -67,6 +68,7 @@ msgid "Failed to get fragment URL." msgstr "" +#, c-format msgid "Couldn't download fragments" msgstr "" @@ -93,6 +95,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "E pamundur hapja e file \"%s\" për lexim." +#, c-format msgid "Couldn't find channel configuration file" msgstr "" @@ -108,57 +111,43 @@ msgid "No properties for channel '%s'" msgstr "" -#, fuzzy, c-format +#, c-format msgid "Failed to set properties for channel '%s'" -msgstr "E pamundur hapja e dispozitivit frontend \"%s\"." +msgstr "" #, c-format msgid "Couldn't find channel configuration file: '%s'" msgstr "" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "" -#, fuzzy -#~ msgid "Could not configure sndio" -#~ msgstr "I pamundur konfigurimi i dispozitivit të zërit \"%s\"." - -#, fuzzy -#~ msgid "Could not start sndio" -#~ msgstr "I pamundur shkrimi tek file \"%s\"." - #~ msgid "No file name specified for writing." #~ msgstr "Asnjë emër file specifikuar për shkrim." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "E pamundur hapja e file \"%s\" në shkrim." 
-#, fuzzy -#~ msgid "Internal data flow error." -#~ msgstr "Gabim i brendshëm tek stream i të dhënave." - -#~ msgid "Could not write to file \"%s\"." -#~ msgstr "I pamundur shkrimi tek file \"%s\"." - -#, fuzzy -#~ msgid "Internal clock error." -#~ msgstr "Gabim i brendshëm tek stream i të dhënave." - #~ msgid "Could not open device \"%s\" for reading and writing." #~ msgstr "E pamundur hapja e dispozitivit \"%s\" për lexim dhe shkrim." #~ msgid "Device \"%s\" is not a capture device." #~ msgstr "Dispozitivi \"%s\" nuk është një dispozitiv marrje." -#~ msgid "Could not write to device \"%s\"." -#~ msgstr "I pamundur shkrimi në dispozitivin \"%s\"." - #~ msgid "Could not get buffers from device \"%s\"." #~ msgstr "E pamundur marrja e buffers nga dispozitivi \"%s\"." +#~ msgid "Could not open audio device \"%s\" for writing." +#~ msgstr "E pamundur hapja e dispozitivit të zërit \"%s\" për shkrim." + #~ msgid "Could not open control device \"%s\" for writing." #~ msgstr "E pamundur hapja e dispozitivit të kontrollit \"%s\" për shkrim." +#~ msgid "Could not configure audio device \"%s\"." +#~ msgstr "I pamundur konfigurimi i dispozitivit të zërit \"%s\"." + #~ msgid "Could not set audio device \"%s\" to %d Hz." #~ msgstr "I pamundur rregullimi i dispozitivit audio \"%s\" në %d Hz." 
@@ -194,9 +183,30 @@ #~ msgid "Your OSS device could not be probed correctly" #~ msgstr "Dispozitivi juaj OSS mund të mos provohet korrektësisht" +#~ msgid "Volume" +#~ msgstr "Volumi" + +#~ msgid "Bass" +#~ msgstr "Bas" + +#~ msgid "Treble" +#~ msgstr "Treble" + #~ msgid "Synth" #~ msgstr "Sintetizuesi" +#~ msgid "PCM" +#~ msgstr "PCM" + +#~ msgid "Speaker" +#~ msgstr "Zë folës" + +#~ msgid "Line-in" +#~ msgstr "Linja-hyrje" + +#~ msgid "Microphone" +#~ msgstr "Mikrofoni" + #~ msgid "CD" #~ msgstr "CD" @@ -206,9 +216,15 @@ #~ msgid "PCM-2" #~ msgstr "PCM-2" +#~ msgid "Record" +#~ msgstr "Regjistrimi" + #~ msgid "In-gain" #~ msgstr "In-gain" +#~ msgid "Out-gain" +#~ msgstr "Out-gain" + #~ msgid "Line-1" #~ msgstr "Linja-1" @@ -218,6 +234,9 @@ #~ msgid "Line-3" #~ msgstr "Linja-3" +#~ msgid "Digital-1" +#~ msgstr "Dixhitale-1" + #~ msgid "Digital-2" #~ msgstr "Dixhitale-2" @@ -230,9 +249,19 @@ #~ msgid "Phone-out" #~ msgstr "Phone-dalja" +#~ msgid "Video" +#~ msgstr "Video" + #~ msgid "Radio" #~ msgstr "Radio" +#~ msgid "Monitor" +#~ msgstr "Ekrani" + +#, fuzzy +#~ msgid "PC Speaker" +#~ msgstr "Zë folës" + #~ msgid "Could not open CD device for reading." #~ msgstr "I pamundur hapja e dispozitivit CD për lexim."
gst-plugins-bad-1.18.6.tar.xz/po/sr.po -> gst-plugins-bad-1.20.1.tar.xz/po/sr.po
Changed
@@ -1,14 +1,14 @@ # Serbian translation of gst-plugins -# Copyright (C) 2014 Free Software Foundation, Inc. +# Copyright © 2020 Free Software Foundation, Inc. # This file is distributed under the same license as the gst-plugins-bad package. # Danilo Segan <dsegan@gmx.net>, 2004. -# Мирослав Николић <miroslavnikolic@rocketmail.com>, 2011—2015, 2016. +# Мирослав Николић <miroslavnikolic@rocketmail.com>, 2011—2020. msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad-1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2016-12-04 14:22+0200\n" +"Project-Id-Version: gst-plugins-bad-1.15.1\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2020-04-01 13:24+0200\n" "Last-Translator: Мирослав Николић <miroslavnikolic@rocketmail.com>\n" "Language-Team: Serbian <(nothing)>\n" "Language: sr\n" @@ -17,11 +17,12 @@ "Content-Transfer-Encoding: 8bit\n" "Plural-Forms: nplurals=3; plural=(n%10==1 && n%100!=11 ? 0 : n%10>=2 && n" "%10<=4 && (n%100<10 || n%100>=20) ? 1 : 2);\n" +"X-Generator: Virtaal 0.7.1\n" "X-Bugs: Report translation errors to the Language-Team address.\n" "X-Project-Style: gnome\n" msgid "No URL set." -msgstr "" +msgstr "Није подешена адреса." msgid "OpenCV failed to load template image" msgstr "ОтворениЦВ није успео да учита слику шаблона" @@ -72,6 +73,7 @@ msgid "Failed to get fragment URL." msgstr "Нисам успео да добијем адресу одломка." +#, c-format msgid "Couldn't download fragments" msgstr "Не могу да преузмем одломке" @@ -98,6 +100,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Не могу да отворим датотеку „%s“ за читање." 
+#, c-format msgid "Couldn't find channel configuration file" msgstr "Не могу да нађем датотеку подешавања канала" @@ -121,6 +124,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Не могу да нађем датотеку подешавања канала: %s" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Датотека подешавања канала не садржи ниједан канал"
gst-plugins-bad-1.18.6.tar.xz/po/sv.po -> gst-plugins-bad-1.20.1.tar.xz/po/sv.po
Changed
@@ -7,146 +7,122 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-01-17 02:16+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-02-05 18:24+0100\n" "Last-Translator: Sebastian Rasmussen <sebras@gmail.com>\n" "Language-Team: Swedish <tp-sv@listor.tp-sv.se>\n" "Language: sv\n" "MIME-Version: 1.0\n" -"Content-Type: text/plain; charset=utf-8\n" +"Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "X-Bugs: Report translation errors to the Language-Team address.\n" "X-Generator: Poedit 2.2.1\n" -#: ext/curl/gstcurlhttpsrc.c:1265 msgid "No URL set." msgstr "Ingen URL inställd." -#: ext/opencv/gsttemplatematch.cpp:202 msgid "OpenCV failed to load template image" msgstr "OpenCV misslyckades med att läsa in mallbild" -#: ext/resindvd/resindvdsrc.c:361 msgid "Could not read title information for DVD." msgstr "Kunde inte läsa titelinformation för dvd." -#: ext/resindvd/resindvdsrc.c:367 #, c-format msgid "Failed to open DVD device '%s'." msgstr "Misslyckades med att öppna dvd-enheten ”%s”." -#: ext/resindvd/resindvdsrc.c:373 msgid "Failed to set PGC based seeking." msgstr "Misslyckades med att ställa in PGC-baserad spolning." -#: ext/resindvd/resindvdsrc.c:1164 -msgid "Could not read DVD. This may be because the DVD is encrypted and a DVD decryption library is not installed." -msgstr "Kunde ej läsa dvd. Detta kan vara på grund av att dvd:n är krypterad och ett DVD avkrypteringsbibliotek inte är installerat." +msgid "" +"Could not read DVD. This may be because the DVD is encrypted and a DVD " +"decryption library is not installed." +msgstr "" +"Kunde ej läsa dvd. Detta kan vara på grund av att dvd:n är krypterad och ett " +"DVD avkrypteringsbibliotek inte är installerat." -#: ext/resindvd/resindvdsrc.c:1169 ext/resindvd/resindvdsrc.c:1178 msgid "Could not read DVD." msgstr "Kunde inte läsa dvd." 
-#: ext/smoothstreaming/gstmssdemux.c:421 -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:731 msgid "This file contains no playable streams." msgstr "Den här filen innehåller inga spelbara strömmar." -#: ext/sndfile/gstsfdec.c:769 msgid "Could not open sndfile stream for reading." msgstr "Kunde inte öppna sndfile-ström för läsning." -#: gst/asfmux/gstasfmux.c:1832 msgid "Generated file has a larger preroll time than its streams duration" msgstr "Den genererade filen har en längre förrullningstid än dess strömlängd" -#: gst/camerabin2/camerabingeneral.c:167 gst/camerabin2/gstcamerabin2.c:1859 -#: gst/camerabin2/gstdigitalzoom.c:283 gst/camerabin2/gstviewfinderbin.c:270 #, c-format msgid "Missing element '%s' - check your GStreamer installation." msgstr "Saknar element ”%s” - kontrollera din GStreamer-installation." -#: gst/camerabin2/gstcamerabin2.c:347 msgid "File location is set to NULL, please set it to a valid filename" -msgstr "Filposition är satt till NULL, vänligen sätt den till ett giltigt filnamn" +msgstr "" +"Filposition är satt till NULL, vänligen sätt den till ett giltigt filnamn" -#: gst/camerabin2/gstwrappercamerabinsrc.c:585 msgid "Digitalzoom element couldn't be created" msgstr "Elementet Digitalzoom kunde inte skapas" -#: gst/dvdspu/gstdvdspu.c:1041 msgid "Subpicture format was not configured before data flow" msgstr "Delbildsformat var inte konfigurerat före dataflöde" -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3564 msgid "Failed to get fragment URL." msgstr "Misslyckades att hämta fragment-URL" -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3951 +#, c-format msgid "Couldn't download fragments" msgstr "Kunde ej ladda ner fragment" -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:4040 -#: gst/mpegtsdemux/mpegtsbase.c:1640 msgid "Internal data stream error." msgstr "Internt fel i dataström." -#: sys/dvb/gstdvbsrc.c:1580 sys/dvb/gstdvbsrc.c:1794 #, c-format msgid "Device \"%s\" does not exist." msgstr "Enheten ”%s” finns inte." 
-#: sys/dvb/gstdvbsrc.c:1584 #, c-format msgid "Could not open frontend device \"%s\"." msgstr "Kunde inte öppna framändsenheten ”%s”." -#: sys/dvb/gstdvbsrc.c:1603 #, c-format msgid "Could not get settings from frontend device \"%s\"." msgstr "Kunde inte få inställningar från framändsenheten ”%s”." -#: sys/dvb/gstdvbsrc.c:1620 #, c-format msgid "Cannot enumerate delivery systems from frontend device \"%s\"." msgstr "Kan ej lista leveranssystem från framändsenheten ”%s”." -#: sys/dvb/gstdvbsrc.c:1798 #, c-format msgid "Could not open file \"%s\" for reading." msgstr "Kunde inte öppna filen ”%s” för läsning." -#: sys/dvb/parsechannels.c:410 +#, c-format msgid "Couldn't find channel configuration file" msgstr "Kunde ej hitta konfigurationsfil för kanal" -#: sys/dvb/parsechannels.c:413 sys/dvb/parsechannels.c:563 #, c-format msgid "Couldn't load channel configuration file: '%s'" msgstr "Kunde ej läsa in konfigurationsfil för kanal: ”%s”" -#: sys/dvb/parsechannels.c:421 sys/dvb/parsechannels.c:846 #, c-format msgid "Couldn't find details for channel '%s'" msgstr "Kunde ej hitta detaljer för kanal ”%s”" -#: sys/dvb/parsechannels.c:430 #, c-format msgid "No properties for channel '%s'" msgstr "Inga egenskaper för kanal ”%s”" -#: sys/dvb/parsechannels.c:439 #, c-format msgid "Failed to set properties for channel '%s'" msgstr "Misslyckades med att ställa in egenskaper för kanal ”%s”" -#: sys/dvb/parsechannels.c:560 #, c-format msgid "Couldn't find channel configuration file: '%s'" msgstr "Kunde ej hitta konfigurationsfil för kanal: ”%s”" -#: sys/dvb/parsechannels.c:570 +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Konfigurationsfil för kanal innehåller inga kanaler" @@ -162,8 +138,14 @@ #~ msgid "default GStreamer sound events audiosink" #~ msgstr "standard GStreamer-ljudutgång för ljudhändelser" -#~ msgid "GStreamer can play audio using any number of output elements. Some possible choices are osssink, pulsesink and alsasink. 
The audiosink can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer kan spela upp ljud via ett antal utgångselement. Några möjliga val är osssink, pulsesink och alsasink. Ljudutgången kan vara en delrörledning istället för bara ett element." +#~ msgid "" +#~ "GStreamer can play audio using any number of output elements. Some " +#~ "possible choices are osssink, pulsesink and alsasink. The audiosink can " +#~ "be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer kan spela upp ljud via ett antal utgångselement. Några möjliga " +#~ "val är osssink, pulsesink och alsasink. Ljudutgången kan vara en " +#~ "delrörledning istället för bara ett element." #~ msgid "description for default GStreamer sound events audiosink" #~ msgstr "beskrivning för standard GStreamer-ljudutgång för ljudhändelser" @@ -174,8 +156,10 @@ #~ msgid "default GStreamer audiosink for Audio/Video Conferencing" #~ msgstr "standard GStreamer-ljudutgång för ljud/video-konferenser" -#~ msgid "description for default GStreamer audiosink for Audio/Video Conferencing" -#~ msgstr "beskrivning för standard GStreamer-ljudutgång för ljud/video-konferenser" +#~ msgid "" +#~ "description for default GStreamer audiosink for Audio/Video Conferencing" +#~ msgstr "" +#~ "beskrivning för standard GStreamer-ljudutgång för ljud/video-konferenser" #~ msgid "default GStreamer audiosink for Music and Movies" #~ msgstr "standard GStreamer-ljudutgång för musik och filmer" @@ -186,8 +170,14 @@ #~ msgid "default GStreamer videosink" #~ msgstr "standard GStreamer-videoutgång" -#~ msgid "GStreamer can play video using any number of output elements. Some possible choices are xvimagesink, ximagesink, sdlvideosink and aasink. The videosink can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer kan spela upp video via ett antal utgångselement. Några möjliga val är xvimagesink, ximagesink, sdlvideosink och aasink. 
Videoutgången kan vara en delrörledning istället för bara ett element." +#~ msgid "" +#~ "GStreamer can play video using any number of output elements. Some " +#~ "possible choices are xvimagesink, ximagesink, sdlvideosink and aasink. " +#~ "The videosink can be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer kan spela upp video via ett antal utgångselement. Några möjliga " +#~ "val är xvimagesink, ximagesink, sdlvideosink och aasink. Videoutgången " +#~ "kan vara en delrörledning istället för bara ett element." #~ msgid "description for default GStreamer videosink" #~ msgstr "beskrivning för standard GStreamer-videoutgång" @@ -198,8 +188,14 @@ #~ msgid "default GStreamer audiosrc" #~ msgstr "standard GStreamer-ljudkälla" -#~ msgid "GStreamer can record audio using any number of input elements. Some possible choices are osssrc, pulsesrc and alsasrc. The audio source can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer kan spela in ljud från ett antal ingångselement. Några möjliga val är osssrc, pulsesrc och alsasrc. Ljudkällan kan vara en delrörledning istället för bara ett element." +#~ msgid "" +#~ "GStreamer can record audio using any number of input elements. Some " +#~ "possible choices are osssrc, pulsesrc and alsasrc. The audio source can " +#~ "be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer kan spela in ljud från ett antal ingångselement. Några möjliga " +#~ "val är osssrc, pulsesrc och alsasrc. Ljudkällan kan vara en delrörledning " +#~ "istället för bara ett element." #~ msgid "description for default GStreamer audiosrc" #~ msgstr "beskrivning för standard GStreamer-ljudkälla" @@ -210,8 +206,14 @@ #~ msgid "default GStreamer videosrc" #~ msgstr "standard GStreamer-videokälla" -#~ msgid "GStreamer can record video from any number of input elements. Some possible choices are v4lsrc, v4l2src and videotestsrc. 
The video source can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer kan spela in video från ett antal ingångselement. Några möjliga val är v4lsrc, v4l2src och videotestsrc. Videokällan kan vara en delrörledning istället för bara ett element." +#~ msgid "" +#~ "GStreamer can record video from any number of input elements. Some " +#~ "possible choices are v4lsrc, v4l2src and videotestsrc. The video source " +#~ "can be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer kan spela in video från ett antal ingångselement. Några möjliga " +#~ "val är v4lsrc, v4l2src och videotestsrc. Videokällan kan vara en " +#~ "delrörledning istället för bara ett element." #~ msgid "description for default GStreamer videosrc" #~ msgstr "beskrivning för standard GStreamer-videokälla" @@ -222,8 +224,16 @@ #~ msgid "default GStreamer visualization" #~ msgstr "standard GStreamer-visualisering" -#~ msgid "GStreamer can put visualization plugins in a pipeline to transform audio streams in video frames. Some possible choices are goom, goom2k1 and synaesthesia. The visualization plugin can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer kan lägga in insticksmoduler för visualisering i en pipeline för att transformera ljudströmmar till videobilder. Några möjliga val är goom, goom2k1 och synaesthesia. Visualiseringsinsticksmodulen kan vara en delrörledning istället för bara ett element." +#~ msgid "" +#~ "GStreamer can put visualization plugins in a pipeline to transform audio " +#~ "streams in video frames. Some possible choices are goom, goom2k1 and " +#~ "synaesthesia. The visualization plugin can be a partial pipeline instead " +#~ "of just one element." +#~ msgstr "" +#~ "GStreamer kan lägga in insticksmoduler för visualisering i en pipeline " +#~ "för att transformera ljudströmmar till videobilder. Några möjliga val är " +#~ "goom, goom2k1 och synaesthesia. 
Visualiseringsinsticksmodulen kan vara en " +#~ "delrörledning istället för bara ett element." #~ msgid "description for default GStreamer visualization" #~ msgstr "beskrivning för standard GStreamer-visualisering" @@ -249,8 +259,10 @@ #~ msgid "No file name specified for writing." #~ msgstr "Inget filnamn angavs för skrivning." -#~ msgid "Given file name \"%s\" can't be converted to local file name encoding." -#~ msgstr "Angivet filnamn \"%s\" kan ej konverteras till lokal filnamnskodning." +#~ msgid "" +#~ "Given file name \"%s\" can't be converted to local file name encoding." +#~ msgstr "" +#~ "Angivet filnamn \"%s\" kan ej konverteras till lokal filnamnskodning." #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Kunde inte öppna filen \"%s\" för skrivning."
gst-plugins-bad-1.18.6.tar.xz/po/tr.po -> gst-plugins-bad-1.20.1.tar.xz/po/tr.po
Changed
@@ -2,21 +2,21 @@ # This file is put in the public domain. # This file is distributed under the same license as the gst-plugins-bad package. # Server Acim <serveracim@gmail.com>, 2009, 2015. -# Mehmet Kececi <mkececi@mehmetkececi.com>, 2017, 2019. +# Mehmet Kececi <mkececi@mehmetkececi.com>, 2017, 2019, 2021. msgid "" msgstr "" -"Project-Id-Version: gst-plugins-bad-1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" -"PO-Revision-Date: 2019-01-25 12:21+0300\n" +"Project-Id-Version: gst-plugins-bad-1.19.2\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" +"PO-Revision-Date: 2021-10-04 14:09+0300\n" "Last-Translator: Mehmet Kececi <mkececi@mehmetkececi.com>\n" -"Language-Team: Turkish <gnu-tr-u12a@lists.sourceforge.net>\n" +"Language-Team: Turkish <gnome-turk@gnome.org>\n" "Language: tr\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "Plural-Forms: nplurals=1; plural=0;\n" -"X-Generator: Virtaal 0.7.1\n" +"X-Generator: Poedit 3.0\n" "X-Bugs: Report translation errors to the Language-Team address.\n" "X-Project-Style: gnome\n" @@ -74,6 +74,7 @@ msgid "Failed to get fragment URL." msgstr "Parça adresi alınamadı." +#, c-format msgid "Couldn't download fragments" msgstr "Parçalar indirilemedi" @@ -94,12 +95,14 @@ #, c-format msgid "Cannot enumerate delivery systems from frontend device \"%s\"." -msgstr "\"%s\" ön yüz aygıtından iletim sistemleri numaralandırılanamıyor" +msgstr "" +"\"%s\" ön arayüz aygıtından teslim sistemleri numaralandırma yapılamıyor." #, c-format msgid "Could not open file \"%s\" for reading." msgstr "Dosyayı \"%s\" okumak için açamıyor." 
+#, c-format msgid "Couldn't find channel configuration file" msgstr "Kanal yapılandırma dosyası bulamadım" @@ -123,6 +126,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Kanal yapılandırma dosyası bulamadı: '%s'" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Kanal yapılandırma dosyası herhangi bir kanal içermez"
gst-plugins-bad-1.18.6.tar.xz/po/uk.po -> gst-plugins-bad-1.20.1.tar.xz/po/uk.po
Changed
@@ -7,8 +7,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-01-24 22:16+0200\n" "Last-Translator: Yuri Chornoivan <yurchor@ukr.net>\n" "Language-Team: Ukrainian <translation-team-uk@lists.sourceforge.net>\n" @@ -77,6 +77,7 @@ msgid "Failed to get fragment URL." msgstr "Не вдалося отримати адреси фрагмента." +#, c-format msgid "Couldn't download fragments" msgstr "Не вдалося отримати фрагменти" @@ -105,6 +106,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Не вдалося відкрити файл «%s» для читання." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Не вдалося знайти файл налаштувань каналів" @@ -128,5 +130,6 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Не вдалося знайти файл налаштувань каналів DVB: «%s»" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "У файлі налаштувань каналів не міститься даних щодо жодного каналу"
gst-plugins-bad-1.18.6.tar.xz/po/vi.po -> gst-plugins-bad-1.20.1.tar.xz/po/vi.po
Changed
@@ -8,8 +8,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.10.0\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-02-26 11:53+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2016-11-02 13:38+0700\n" "Last-Translator: Trần Ngọc Quân <vnwildman@gmail.com>\n" "Language-Team: Vietnamese <translation-team-vi@lists.sourceforge.net>\n" @@ -75,6 +75,7 @@ msgid "Failed to get fragment URL." msgstr "Gặp lỗi khi lấy URL phân mảnh." +#, c-format msgid "Couldn't download fragments" msgstr "Không thể tải về các phân mảnh" @@ -103,6 +104,7 @@ msgid "Could not open file \"%s\" for reading." msgstr "Không thể mở tập tin “%s” để đọc." +#, c-format msgid "Couldn't find channel configuration file" msgstr "Không thể tìm thấy tập tin cấu hình kênh" @@ -126,6 +128,7 @@ msgid "Couldn't find channel configuration file: '%s'" msgstr "Không thể tìm thấy tập tin cấu hình kênh: “%s”" +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "Tập tin cấu hình kênh chẳng chứa kênh nào cả" @@ -153,11 +156,13 @@ #~ msgid "No file name specified for writing." #~ msgstr "Chưa chỉ định tên tập tin để ghi vào." +#, c-format #~ msgid "" #~ "Given file name \"%s\" can't be converted to local file name encoding." #~ msgstr "" #~ "Tên tập tin đã cho “%s” không thể chuyển đổi bảng mã tên tập tin nội bộ." +#, c-format #~ msgid "Could not open file \"%s\" for writing." #~ msgstr "Không thể mở tập tin “%s” để ghi."
gst-plugins-bad-1.18.6.tar.xz/po/zh_CN.po -> gst-plugins-bad-1.20.1.tar.xz/po/zh_CN.po
Changed
@@ -8,8 +8,8 @@ msgid "" msgstr "" "Project-Id-Version: gst-plugins-bad 1.15.1\n" -"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" -"POT-Creation-Date: 2019-01-17 02:16+0000\n" +"Report-Msgid-Bugs-To: \n" +"POT-Creation-Date: 2021-10-25 01:02+0100\n" "PO-Revision-Date: 2019-03-05 16:31+0100\n" "Last-Translator: Tianze Wang <zwpwjwtz@126.com>\n" "Language-Team: Chinese (simplified) <i18n-zh@googlegroups.com>\n" @@ -20,134 +20,107 @@ "X-Bugs: Report translation errors to the Language-Team address.\n" "X-Generator: Poedit 2.0.7\n" -#: ext/curl/gstcurlhttpsrc.c:1265 msgid "No URL set." msgstr "未设置 URL。" -#: ext/opencv/gsttemplatematch.cpp:202 msgid "OpenCV failed to load template image" msgstr "OpenCV加载模版图片失败" -#: ext/resindvd/resindvdsrc.c:361 msgid "Could not read title information for DVD." msgstr "无法读取DVD的标题信息。" -#: ext/resindvd/resindvdsrc.c:367 #, c-format msgid "Failed to open DVD device '%s'." msgstr "无法打开DVD设备“%s”。" -#: ext/resindvd/resindvdsrc.c:373 msgid "Failed to set PGC based seeking." msgstr "无法设置基于PGC的检索。" -#: ext/resindvd/resindvdsrc.c:1164 -msgid "Could not read DVD. This may be because the DVD is encrypted and a DVD decryption library is not installed." +msgid "" +"Could not read DVD. This may be because the DVD is encrypted and a DVD " +"decryption library is not installed." msgstr "无法读取DVD。这可能是由于DVD已加密,且没有安装解密所需的库。" -#: ext/resindvd/resindvdsrc.c:1169 ext/resindvd/resindvdsrc.c:1178 msgid "Could not read DVD." msgstr "无法读取DVD。" -#: ext/smoothstreaming/gstmssdemux.c:421 -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:731 msgid "This file contains no playable streams." msgstr "该文件含有无法播放的流。" -#: ext/sndfile/gstsfdec.c:769 msgid "Could not open sndfile stream for reading." 
msgstr "无法打开并读取sndfile流。" -#: gst/asfmux/gstasfmux.c:1832 msgid "Generated file has a larger preroll time than its streams duration" msgstr "生成文件的预告片段超过其自身流长度。" -#: gst/camerabin2/camerabingeneral.c:167 gst/camerabin2/gstcamerabin2.c:1859 -#: gst/camerabin2/gstdigitalzoom.c:283 gst/camerabin2/gstviewfinderbin.c:270 #, c-format msgid "Missing element '%s' - check your GStreamer installation." msgstr "缺少“%s”组件 - 请检查你的GStreamer安装。" -#: gst/camerabin2/gstcamerabin2.c:347 msgid "File location is set to NULL, please set it to a valid filename" msgstr "文件位置为NULL,请将其设置为有效的文件名" -#: gst/camerabin2/gstwrappercamerabinsrc.c:585 msgid "Digitalzoom element couldn't be created" msgstr "无法创建Digitalzoom组件" -#: gst/dvdspu/gstdvdspu.c:1041 msgid "Subpicture format was not configured before data flow" msgstr "子画面格式未在数据流前配置" -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3564 msgid "Failed to get fragment URL." msgstr "无法获取片段URL。" -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:3951 +#, c-format msgid "Couldn't download fragments" msgstr "无法下载片段。" -#: gst-libs/gst/adaptivedemux/gstadaptivedemux.c:4040 -#: gst/mpegtsdemux/mpegtsbase.c:1640 msgid "Internal data stream error." msgstr "内部数据流错误。" -#: sys/dvb/gstdvbsrc.c:1580 sys/dvb/gstdvbsrc.c:1794 #, c-format msgid "Device \"%s\" does not exist." msgstr "不存在设备“%s”。" -#: sys/dvb/gstdvbsrc.c:1584 #, c-format msgid "Could not open frontend device \"%s\"." msgstr "无法打开前端设备“%s”。" -#: sys/dvb/gstdvbsrc.c:1603 #, c-format msgid "Could not get settings from frontend device \"%s\"." msgstr "无法从前端设备“%s”获取设置。" -#: sys/dvb/gstdvbsrc.c:1620 #, c-format msgid "Cannot enumerate delivery systems from frontend device \"%s\"." msgstr "无法从终端设备“%s”中枚举传输系统。" -#: sys/dvb/gstdvbsrc.c:1798 #, c-format msgid "Could not open file \"%s\" for reading." 
msgstr "无法以读方式打开文件“%s”。" -#: sys/dvb/parsechannels.c:410 +#, c-format msgid "Couldn't find channel configuration file" msgstr "无法找到通道配置文件" -#: sys/dvb/parsechannels.c:413 sys/dvb/parsechannels.c:563 #, c-format msgid "Couldn't load channel configuration file: '%s'" msgstr "无法加载通道配置文件:“%s”" -#: sys/dvb/parsechannels.c:421 sys/dvb/parsechannels.c:846 #, c-format msgid "Couldn't find details for channel '%s'" msgstr "无法获取通道“%s”的详细信息" -#: sys/dvb/parsechannels.c:430 #, c-format msgid "No properties for channel '%s'" msgstr "通道“%s”没有属性" -#: sys/dvb/parsechannels.c:439 #, c-format msgid "Failed to set properties for channel '%s'" msgstr "无法设置通道“%s”的属性" -#: sys/dvb/parsechannels.c:560 #, c-format msgid "Couldn't find channel configuration file: '%s'" msgstr "无法找到通道配置文件:“%s”" -#: sys/dvb/parsechannels.c:570 +#, c-format msgid "Channel configuration file doesn't contain any channels" msgstr "通道配置文件中不包含任何通道" @@ -175,7 +148,8 @@ #~ msgid "No file name specified for writing." #~ msgstr "未指定写入文件名。" -#~ msgid "Given file name \"%s\" can't be converted to local file name encoding." +#~ msgid "" +#~ "Given file name \"%s\" can't be converted to local file name encoding." #~ msgstr "给定的文件名“%s”无法被转换为本地文件名编码。" #~ msgid "Could not open file \"%s\" for writing." @@ -184,8 +158,13 @@ #~ msgid "default GStreamer sound events audiosink" #~ msgstr "默认GStreamer声音事件的音频汇" -#~ msgid "GStreamer can play audio using any number of output elements. Some possible choices are osssink, pulsesink and alsasink. The audiosink can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer可以使用任意数量的输出组件播放音频。可能的选择有osssink、pulsesink和alsasink。音频汇可以是管道的一部分,而不仅仅为一个组件。" +#~ msgid "" +#~ "GStreamer can play audio using any number of output elements. Some " +#~ "possible choices are osssink, pulsesink and alsasink. The audiosink can " +#~ "be a partial pipeline instead of just one element." 
+#~ msgstr "" +#~ "GStreamer可以使用任意数量的输出组件播放音频。可能的选择有osssink、" +#~ "pulsesink和alsasink。音频汇可以是管道的一部分,而不仅仅为一个组件。" #~ msgid "description for default GStreamer sound events audiosink" #~ msgstr "有关默认GStreamer声音事件的音频汇的说明" @@ -196,7 +175,8 @@ #~ msgid "default GStreamer audiosink for Audio/Video Conferencing" #~ msgstr "用于音/视频会议的默认Gstreamer音频汇" -#~ msgid "description for default GStreamer audiosink for Audio/Video Conferencing" +#~ msgid "" +#~ "description for default GStreamer audiosink for Audio/Video Conferencing" #~ msgstr "有关用于音/视频会议的默认Gstreamer音频汇的描述" #~ msgid "default GStreamer audiosink for Music and Movies" @@ -208,8 +188,14 @@ #~ msgid "default GStreamer videosink" #~ msgstr "默认GStreamer的视频汇" -#~ msgid "GStreamer can play video using any number of output elements. Some possible choices are xvimagesink, ximagesink, sdlvideosink and aasink. The videosink can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer可以使用任意数量的输出组件播放视频。可能的选择有xvimagesink、ximagesink、sdlvideosink和aasink。视频汇可以是管道的一部分,而不仅仅为一个组件。" +#~ msgid "" +#~ "GStreamer can play video using any number of output elements. Some " +#~ "possible choices are xvimagesink, ximagesink, sdlvideosink and aasink. " +#~ "The videosink can be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer可以使用任意数量的输出组件播放视频。可能的选择有xvimagesink、" +#~ "ximagesink、sdlvideosink和aasink。视频汇可以是管道的一部分,而不仅仅为一个" +#~ "组件。" #~ msgid "description for default GStreamer videosink" #~ msgstr "有关默认GStreamer视频汇的描述" @@ -220,8 +206,13 @@ #~ msgid "default GStreamer audiosrc" #~ msgstr "默认GStreamer音频源" -#~ msgid "GStreamer can record audio using any number of input elements. Some possible choices are osssrc, pulsesrc and alsasrc. The audio source can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer可以使用任意数量的输入组件来记录音频。可能的选择有osssrc、pulsesrc和alsasrc。音频源可以是管道的一部分,而不仅仅为一个组件。" +#~ msgid "" +#~ "GStreamer can record audio using any number of input elements. 
Some " +#~ "possible choices are osssrc, pulsesrc and alsasrc. The audio source can " +#~ "be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer可以使用任意数量的输入组件来记录音频。可能的选择有osssrc、" +#~ "pulsesrc和alsasrc。音频源可以是管道的一部分,而不仅仅为一个组件。" #~ msgid "description for default GStreamer audiosrc" #~ msgstr "有关默认GStreamer音频源的描述" @@ -232,8 +223,13 @@ #~ msgid "default GStreamer videosrc" #~ msgstr "默认GStreamer视频源" -#~ msgid "GStreamer can record video from any number of input elements. Some possible choices are v4lsrc, v4l2src and videotestsrc. The video source can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer可以使用任意数量的输出组件播放视频。可能的选择有v4lsrc、v4l2src和videotestsrc。视频源可以是管道的一部分,而不仅仅为一个组件。" +#~ msgid "" +#~ "GStreamer can record video from any number of input elements. Some " +#~ "possible choices are v4lsrc, v4l2src and videotestsrc. The video source " +#~ "can be a partial pipeline instead of just one element." +#~ msgstr "" +#~ "GStreamer可以使用任意数量的输出组件播放视频。可能的选择有v4lsrc、v4l2src和" +#~ "videotestsrc。视频源可以是管道的一部分,而不仅仅为一个组件。" #~ msgid "description for default GStreamer videosrc" #~ msgstr "有关默认GStreamer视频源的描述" @@ -244,8 +240,15 @@ #~ msgid "default GStreamer visualization" #~ msgstr "默认Gstreamer可视化组件" -#~ msgid "GStreamer can put visualization plugins in a pipeline to transform audio streams in video frames. Some possible choices are goom, goom2k1 and synaesthesia. The visualization plugin can be a partial pipeline instead of just one element." -#~ msgstr "GStreamer可以将可视化插件加载于管道中,并在视频帧中传递音频流。可能的选择有goom、goom2k1和synaesthesia。可视化插件可以是管道的一部分,而不仅仅为一个组件。" +#~ msgid "" +#~ "GStreamer can put visualization plugins in a pipeline to transform audio " +#~ "streams in video frames. Some possible choices are goom, goom2k1 and " +#~ "synaesthesia. The visualization plugin can be a partial pipeline instead " +#~ "of just one element." 
+#~ msgstr "" +#~ "GStreamer可以将可视化插件加载于管道中,并在视频帧中传递音频流。可能的选择" +#~ "有goom、goom2k1和synaesthesia。可视化插件可以是管道的一部分,而不仅仅为一" +#~ "个组件。" #~ msgid "description for default GStreamer visualization" #~ msgstr "有关默认Gstreamer可视化组件的描述" @@ -265,8 +268,11 @@ #~ msgid "Could not open audio device for mixer control handling." #~ msgstr "无法打开音频文件进行混音控制操作。" -#~ msgid "Could not open audio device for mixer control handling. This version of the Open Sound System is not supported by this element." -#~ msgstr "无法打开音频设备进行音量控制操作。此部件不支持开放声音系统(OSS)的版本。" +#~ msgid "" +#~ "Could not open audio device for mixer control handling. This version of " +#~ "the Open Sound System is not supported by this element." +#~ msgstr "" +#~ "无法打开音频设备进行音量控制操作。此部件不支持开放声音系统(OSS)的版本。" #~ msgid "Volume" #~ msgstr "音量" @@ -591,16 +597,22 @@ #~ msgid "%s %d" #~ msgstr "%s %d" -#~ msgid "Could not open audio device for playback. Device is being used by another application." +#~ msgid "" +#~ "Could not open audio device for playback. Device is being used by another " +#~ "application." #~ msgstr "无法打开音频设备播放音频。设备正由另一程序使用。" -#~ msgid "Could not open audio device for playback. You don't have permission to open the device." +#~ msgid "" +#~ "Could not open audio device for playback. You don't have permission to " +#~ "open the device." #~ msgstr "无法打开音频设备播放音频。您无权打开此设备" #~ msgid "Could not open audio device for playback." #~ msgstr "无法打开音频设备播放音频。" -#~ msgid "Could not open audio device for playback. This version of the Open Sound System is not supported by this element." +#~ msgid "" +#~ "Could not open audio device for playback. This version of the Open Sound " +#~ "System is not supported by this element." #~ msgstr "无法打开音频设备播放音频。此组件不支持开放声音系统(OSS)版本。" #~ msgid "Playback is not supported by this audio device." @@ -618,7 +630,9 @@ #~ msgid "Failed to configure TwoLAME encoder. Check your encoding parameters." #~ msgstr "无法配置 TwoLAME 编码器。请检查您的编码参数。" -#~ msgid "The requested bitrate %d kbit/s for property '%s' is not allowed. 
The bitrate was changed to %d kbit/s." +#~ msgid "" +#~ "The requested bitrate %d kbit/s for property '%s' is not allowed. The " +#~ "bitrate was changed to %d kbit/s." #~ msgstr "不允许使用“%2$s”所请求的码率 %1$d kbit/s。码率改为 %3$d kbit/s。" #~ msgid "PCM 1"
gst-plugins-bad-1.20.1.tar.xz/po/zh_TW.po
Added
@@ -0,0 +1,120 @@ +# Chinese (traditional) translation of gst-plugins-bad. +# This file is put in the public domain. +# Yi-Jyun Pan <pan93412@gmail.com>, 2020, 2021. +# +msgid "" +msgstr "" +"Project-Id-Version: gst-plugins-bad-1.15.1\n" +"Report-Msgid-Bugs-To: http://bugzilla.gnome.org/\n" +"POT-Creation-Date: 2019-01-17 02:16+0000\n" +"PO-Revision-Date: 2021-05-09 21:17+0800\n" +"Last-Translator: Yi-Jyun Pan <pan93412@gmail.com>\n" +"Language-Team: Chinese (traditional) <zh-l10n@lists.linux.org.tw>\n" +"Language: zh_TW\n" +"MIME-Version: 1.0\n" +"Content-Type: text/plain; charset=UTF-8\n" +"Content-Transfer-Encoding: 8bit\n" +"X-Bugs: Report translation errors to the Language-Team address.\n" +"X-Generator: Poedit 2.4.3\n" +"Plural-Forms: nplurals=1; plural=0;\n" + +msgid "No URL set." +msgstr "未設定 URL。" + +msgid "OpenCV failed to load template image" +msgstr "OpenCV 無法載入模本影像" + +msgid "Could not read title information for DVD." +msgstr "無法讀取 DVD 的標題資訊。" + +#, c-format +msgid "Failed to open DVD device '%s'." +msgstr "無法開啟 DVD 裝置 '%s'。" + +msgid "Failed to set PGC based seeking." +msgstr "無法設定以 PGC 為基礎的搜尋。" + +msgid "" +"Could not read DVD. This may be because the DVD is encrypted and a DVD " +"decryption library is not installed." +msgstr "無法讀取 DVD。這可能是因為 DVD 已受加密,且未安裝 DVD 解密函式庫。" + +msgid "Could not read DVD." +msgstr "無法讀取 DVD。" + +msgid "This file contains no playable streams." +msgstr "此檔案不包含可播放的串流。" + +msgid "Could not open sndfile stream for reading." +msgstr "無法開啟 sndfile 串流以讀取。" + +msgid "Generated file has a larger preroll time than its streams duration" +msgstr "產出檔案的預載 (preroll) 時間比串流時長還長。" + +#, c-format +msgid "Missing element '%s' - check your GStreamer installation." 
+msgstr "缺少元件 '%s' - 請檢查您的 GStreamer 安裝。" + +msgid "File location is set to NULL, please set it to a valid filename" +msgstr "檔案位置被設為 NULL,請將其設定為有效的檔案名稱" + +msgid "Digitalzoom element couldn't be created" +msgstr "無法建立數位縮放 (Digitalzoom) 元素" + +msgid "Subpicture format was not configured before data flow" +msgstr "未在資料串流前設定子影像 (Subpicture) 格式" + +msgid "Failed to get fragment URL." +msgstr "無法取得片段 URL。" + +msgid "Couldn't download fragments" +msgstr "無法下載片段" + +msgid "Internal data stream error." +msgstr "內部資料串流錯誤。" + +#, c-format +msgid "Device \"%s\" does not exist." +msgstr "\"%s\" 裝置不存在。" + +#, c-format +msgid "Could not open frontend device \"%s\"." +msgstr "無法開啟前端裝置 \"%s\"。" + +#, c-format +msgid "Could not get settings from frontend device \"%s\"." +msgstr "無法從前端裝置 \"%s\" 取得設定。" + +#, c-format +msgid "Cannot enumerate delivery systems from frontend device \"%s\"." +msgstr "無法在前端裝置「%s」枚舉遞送系統。" + +#, c-format +msgid "Could not open file \"%s\" for reading." +msgstr "無法開啟 \"%s\" 檔案以讀取。" + +msgid "Couldn't find channel configuration file" +msgstr "找不到頻道設定檔" + +#, c-format +msgid "Couldn't load channel configuration file: '%s'" +msgstr "無法載入頻道設定檔:'%s'" + +#, c-format +msgid "Couldn't find details for channel '%s'" +msgstr "無法尋找 '%s' 頻道的詳細資訊" + +#, c-format +msgid "No properties for channel '%s'" +msgstr "沒有 '%s' 頻道的屬性" + +#, c-format +msgid "Failed to set properties for channel '%s'" +msgstr "無法設定 '%s' 頻道的屬性" + +#, c-format +msgid "Couldn't find channel configuration file: '%s'" +msgstr "找不到頻道設定檔:'%s'" + +msgid "Channel configuration file doesn't contain any channels" +msgstr "頻道設定檔內完全沒有頻道"
gst-plugins-bad-1.18.6.tar.xz/sys/androidmedia/gstamcaudiodec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/androidmedia/gstamcaudiodec.c
Changed
@@ -914,7 +914,7 @@ guint8 *data; gst_buffer_map (codec_data, &minfo, GST_MAP_READ); - data = g_memdup (minfo.data, minfo.size); + data = g_memdup2 (minfo.data, minfo.size); self->codec_datas = g_list_prepend (self->codec_datas, data); gst_amc_format_set_buffer (format, "csd-0", data, minfo.size, &err); if (err) @@ -946,7 +946,7 @@ fname = g_strdup_printf ("csd-%d", j); gst_buffer_map (buf, &minfo, GST_MAP_READ); - data = g_memdup (minfo.data, minfo.size); + data = g_memdup2 (minfo.data, minfo.size); self->codec_datas = g_list_prepend (self->codec_datas, data); gst_amc_format_set_buffer (format, fname, data, minfo.size, &err); if (err)
gst-plugins-bad-1.18.6.tar.xz/sys/androidmedia/gstamcvideodec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/androidmedia/gstamcvideodec.c
Changed
@@ -1811,7 +1811,7 @@ GstMapInfo cminfo; gst_buffer_map (state->codec_data, &cminfo, GST_MAP_READ); - codec_data = g_memdup (cminfo.data, cminfo.size); + codec_data = g_memdup2 (cminfo.data, cminfo.size); codec_data_size = cminfo.size; is_format_change |= (!self->codec_data
gst-plugins-bad-1.18.6.tar.xz/sys/androidmedia/jni/gstamc-codeclist-jni.c -> gst-plugins-bad-1.20.1.tar.xz/sys/androidmedia/jni/gstamc-codeclist-jni.c
Changed
@@ -438,7 +438,7 @@ goto done; } - ret = g_memdup (elems, sizeof (jint) * len); + ret = g_memdup2 (elems, sizeof (jint) * len); *length = len; done:
gst-plugins-bad-1.18.6.tar.xz/sys/androidmedia/jni/gstamc-format-jni.c -> gst-plugins-bad-1.20.1.tar.xz/sys/androidmedia/jni/gstamc-format-jni.c
Changed
@@ -475,7 +475,7 @@ gst_amc_buffer_get_position_and_limit (&buf, NULL, &position, &limit); *size = limit; - *data = g_memdup (*data + position, limit); + *data = g_memdup2 (*data + position, limit); ret = TRUE;
gst-plugins-bad-1.18.6.tar.xz/sys/androidmedia/magicleap/gstamc-format-ml.c -> gst-plugins-bad-1.20.1.tar.xz/sys/androidmedia/magicleap/gstamc-format-ml.c
Changed
@@ -251,7 +251,7 @@ } *size = buffer.length; - *data = (guint8 *) g_memdup (buffer.ptr, buffer.length); + *data = (guint8 *) g_memdup2 (buffer.ptr, buffer.length); MLMediaFormatKeyByteBufferRelease (format->handle, &buffer); return TRUE;
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/avfassetsrc.m -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/avfassetsrc.m
Changed
@@ -22,7 +22,7 @@ */ /** - * SECTION:element-plugin + * SECTION:element-avfassetsrc * * Read and decode samples from AVFoundation assets using the AVFAssetReader API * @@ -149,12 +149,6 @@ gobject_class->get_property = gst_avf_asset_src_get_property; gobject_class->dispose = gst_avf_asset_src_dispose; - /** - * GstAVFAssetSrc:uri - * - * URI of the asset to read - * - **/ g_object_class_install_property (gobject_class, PROP_URI, g_param_spec_string ("uri", "Asset URI", "URI of the asset to read", NULL,
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/avfvideosrc.m -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/avfvideosrc.m
Changed
@@ -1353,6 +1353,10 @@ GST_DEBUG_CATEGORY_INIT (gst_avf_video_src_debug, "avfvideosrc", 0, "iOS AVFoundation video source"); + + gst_type_mark_as_plugin_api (GST_TYPE_AVF_VIDEO_SOURCE_POSITION, 0); + gst_type_mark_as_plugin_api (GST_TYPE_AVF_VIDEO_SOURCE_ORIENTATION, 0); + gst_type_mark_as_plugin_api (GST_TYPE_AVF_VIDEO_SOURCE_DEVICE_TYPE, 0); } static void
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/coremediabuffer.h -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/coremediabuffer.h
Changed
@@ -24,7 +24,7 @@ #include <gst/video/gstvideometa.h> #include "videotexturecache.h" -#include "CoreMedia/CoreMedia.h" +#include <CoreMedia/CoreMedia.h> G_BEGIN_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/iosassetsrc.m -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/iosassetsrc.m
Changed
@@ -21,7 +21,7 @@ * Boston, MA 02111-1307, USA. */ /** - * SECTION:element-ios_assetsrc + * SECTION:element-iosassetsrc * @see_also: #GstIOSAssetSrc * * Read data from an iOS asset from the media library.
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/meson.build
Changed
@@ -31,6 +31,8 @@ endif applemedia_objc_args += ['-fobjc-arc'] + + objcpp = meson.get_compiler('objcpp') endif applemedia_frameworks = [] @@ -87,6 +89,7 @@ endif endforeach +applemedia_objcpp_args = [] if gstvulkan_dep.found() and have_objcpp moltenvk_dep = cc.find_library('MoltenVK', required : false) metal_dep = dependency('appleframeworks', modules : ['Metal'], required : false) @@ -97,6 +100,10 @@ 'iosurfacevulkanmemory.c', ] applemedia_args += ['-DAPPLEMEDIA_MOLTENVK'] + # override_options : ['cpp_std=c++11'] doesn't seem to work for objcpp + applemedia_objcpp_args += objcpp.get_supported_arguments([ + '-std=c++11', + ]) endif endif @@ -105,10 +112,11 @@ applemedia_sources, c_args : gst_plugins_bad_args + applemedia_args, objc_args : gst_plugins_bad_args + applemedia_args + applemedia_objc_args, - objcpp_args : gst_plugins_bad_args + applemedia_args + applemedia_objc_args, + objcpp_args : gst_plugins_bad_args + applemedia_args + applemedia_objc_args + applemedia_objcpp_args, link_args : noseh_link_args, include_directories : [configinc, libsinc], dependencies : [gstvideo_dep, gstaudio_dep, gstpbutils_dep, gst_dep, gstbase_dep, gstgl_dep, gstglproto_dep] + applemedia_frameworks, + override_options : ['cpp_std=c++11'], install : true, install_dir : plugins_install_dir, )
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/plugin.m -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/plugin.m
Changed
@@ -102,5 +102,5 @@ GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, applemedia, - "Elements for capture and codec access on Apple OS X and iOS", + "Elements for capture and codec access on Apple macOS and iOS", plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/videotexturecache-vulkan.mm -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/videotexturecache-vulkan.mm
Changed
@@ -32,7 +32,16 @@ #undef VK_USE_PLATFORM_MACOS_MVK #undef VK_USE_PLATFORM_IOS_MVK #include <MoltenVK/vk_mvk_moltenvk.h> +/* MoltenVK uses some enums/typedefs that are only available in newer macOS/iOS + * versions. At time of writing: + * - MTLTextureSwizzle + * - MTLTextureSwizzleChannels + * - MTLMultisampleDepthResolveFilter + */ +#pragma clang diagnostic push +#pragma clang diagnostic warning "-Wunguarded-availability-new" #include <MoltenVK/mvk_datatypes.h> +#pragma clang diagnostic pop /* silence macro redefinition warnings */ #undef VK_USE_PLATFORM_MACOS_MVK #undef VK_USE_PLATFORM_IOS_MVK @@ -210,7 +219,7 @@ IOSurfaceRef surface = CVPixelBufferGetIOSurface (pixel_buf); GstVideoTextureCacheVulkan *cache_vulkan = GST_VIDEO_TEXTURE_CACHE_VULKAN (cache); - MTLPixelFormat fmt = video_info_to_metal_format (info, plane); + MTLPixelFormat fmt = (MTLPixelFormat) video_info_to_metal_format (info, plane); CFRetain (pixel_buf); mem = gst_io_surface_vulkan_memory_wrapped (cache_vulkan->device, @@ -292,8 +301,8 @@ texture_data->texture = (__bridge_retained gpointer) texture; VkResult err = vkSetMTLTextureMVK (memory->vulkan_mem.image, texture); - GST_DEBUG ("bound texture %p to image %p: 0x%x", texture, memory->vulkan_mem.image, - err); + GST_DEBUG ("bound texture %p to image %" GST_VULKAN_NON_DISPATCHABLE_HANDLE_FORMAT ": 0x%x", + texture, memory->vulkan_mem.image, err); vk_mem->user_data = texture_data; vk_mem->notify = (GDestroyNotify) free_texture_wrapper;
View file
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/vtdec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/vtdec.c
Changed
@@ -19,9 +19,10 @@
  */
 /**
  * SECTION:element-vtdec
- * @title: gstvtdec
+ * @title: vtdec
  *
- * Apple VideoToolbox based decoder.
+ * Apple VideoToolbox based decoder which might use a HW or a SW
+ * implementation depending on the device.
  *
  * ## Example launch line
  * |[
@@ -31,6 +32,20 @@
  *
  */
 
+/**
+ * SECTION:element-vtdec_hw
+ * @title: vtdec_hw
+ *
+ * Apple VideoToolbox based HW-only decoder.
+ *
+ * ## Example launch line
+ * |[
+ * gst-launch-1.0 -v filesrc location=file.mov ! qtdemux ! queue ! h264parse ! vtdec_hw ! videoconvert ! autovideosink
+ * ]|
+ * Decode h264 video from a mov file.
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 #include "config.h"
 #endif
@@ -57,6 +72,7 @@
   /* leave some headroom for new GstVideoCodecFrameFlags flags */
   VTDEC_FRAME_FLAG_SKIP = (1 << 10),
   VTDEC_FRAME_FLAG_DROP = (1 << 11),
+  VTDEC_FRAME_FLAG_ERROR = (1 << 12),
 };
 
 static void gst_vtdec_finalize (GObject * object);
@@ -101,7 +117,9 @@
     GST_STATIC_CAPS ("video/x-h264, stream-format=avc, alignment=au,"
         " width=(int)[1, MAX], height=(int)[1, MAX];"
        "video/mpeg, mpegversion=2, systemstream=false, parsed=true;"
-        "image/jpeg")
+        "image/jpeg;"
+        "video/x-prores, variant = { (string)standard, (string)hq, (string)lt,"
+        " (string)proxy, (string)4444, (string)4444xq };")
     );
 
 /* define EnableHardwareAcceleratedVideoDecoder in < 10.9 */
@@ -114,20 +132,20 @@
     CFSTR ("RequireHardwareAcceleratedVideoDecoder");
 #endif
 
-#if defined(APPLEMEDIA_MOLTENVK)
-#define VIDEO_SRC_CAPS \
-    GST_VIDEO_CAPS_MAKE("NV12") ";" \
-    GST_VIDEO_CAPS_MAKE_WITH_FEATURES(GST_CAPS_FEATURE_MEMORY_GL_MEMORY,\
-        "NV12") ", " \
-    "texture-target = (string) rectangle ; " \
-    GST_VIDEO_CAPS_MAKE_WITH_FEATURES(GST_CAPS_FEATURE_MEMORY_VULKAN_IMAGE,\
-        "NV12")
-#else
-#define VIDEO_SRC_CAPS \
-    GST_VIDEO_CAPS_MAKE("NV12") ";" \
+#define VIDEO_SRC_CAPS_FORMATS "{ NV12, AYUV64, ARGB64_BE }"
+
+#define VIDEO_SRC_CAPS_NATIVE \
+    GST_VIDEO_CAPS_MAKE(VIDEO_SRC_CAPS_FORMATS) ";" \
     GST_VIDEO_CAPS_MAKE_WITH_FEATURES(GST_CAPS_FEATURE_MEMORY_GL_MEMORY,\
-        "NV12") ", " \
+        VIDEO_SRC_CAPS_FORMATS) ", " \
     "texture-target = (string) rectangle "
+
+#if defined(APPLEMEDIA_MOLTENVK)
+#define VIDEO_SRC_CAPS VIDEO_SRC_CAPS_NATIVE "; " \
+    GST_VIDEO_CAPS_MAKE_WITH_FEATURES(GST_CAPS_FEATURE_MEMORY_VULKAN_IMAGE, \
+        VIDEO_SRC_CAPS_FORMATS)
+#else
+#define VIDEO_SRC_CAPS VIDEO_SRC_CAPS_NATIVE
 #endif
 
 G_DEFINE_TYPE (GstVtdec, gst_vtdec, GST_TYPE_VIDEO_DECODER);
@@ -143,9 +161,15 @@
      base_class_init if you intend to subclass this class. */
   gst_element_class_add_static_pad_template (element_class,
       &gst_vtdec_sink_template);
-  gst_element_class_add_pad_template (element_class,
-      gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
-          gst_caps_from_string (VIDEO_SRC_CAPS)));
+
+  {
+    GstCaps *caps = gst_caps_from_string (VIDEO_SRC_CAPS);
+    /* RGBA64_LE is kCVPixelFormatType_64RGBALE, only available on macOS 11.3+ */
+    if (GST_VTUTIL_HAVE_64ARGBALE)
+      caps = gst_vtutil_caps_append_video_format (caps, "RGBA64_LE");
+    gst_element_class_add_pad_template (element_class,
+        gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, caps));
+  }
 
   gst_element_class_set_static_metadata (element_class,
       "Apple VideoToolbox decoder",
@@ -234,24 +258,64 @@
 }
 
 static void
-setup_texture_cache (GstVtdec * vtdec)
+setup_texture_cache (GstVtdec * vtdec, GstVideoFormat format)
 {
   GstVideoCodecState *output_state;
 
+  GST_INFO_OBJECT (vtdec, "setting up texture cache");
   output_state = gst_video_decoder_get_output_state (GST_VIDEO_DECODER (vtdec));
-  gst_video_texture_cache_set_format (vtdec->texture_cache,
-      GST_VIDEO_FORMAT_NV12, output_state->caps);
+  gst_video_texture_cache_set_format (vtdec->texture_cache, format,
+      output_state->caps);
   gst_video_codec_state_unref (output_state);
 }
 
+/*
+ * Unconditionally output a high bit-depth + alpha format when decoding Apple
+ * ProRes video if downstream supports it.
+ * TODO: read src_pix_fmt to get the preferred output format
+ * https://wiki.multimedia.cx/index.php/Apple_ProRes#Frame_header
+ */
+static GstVideoFormat
+get_preferred_video_format (GstStructure * s, gboolean prores)
+{
+  const GValue *list = gst_structure_get_value (s, "format");
+  guint i, size = gst_value_list_get_size (list);
+  for (i = 0; i < size; i++) {
+    const GValue *value = gst_value_list_get_value (list, i);
+    const char *fmt = g_value_get_string (value);
+    GstVideoFormat vfmt = gst_video_format_from_string (fmt);
+    switch (vfmt) {
+      case GST_VIDEO_FORMAT_NV12:
+        if (!prores)
+          return vfmt;
+        break;
+      case GST_VIDEO_FORMAT_AYUV64:
+      case GST_VIDEO_FORMAT_ARGB64_BE:
+        if (prores)
+          return vfmt;
+        break;
+      case GST_VIDEO_FORMAT_RGBA64_LE:
+        if (GST_VTUTIL_HAVE_64ARGBALE) {
+          if (prores)
+            return vfmt;
+        } else {
+          /* Codepath will never be hit on macOS older than Big Sur (11.3) */
+          g_warn_if_reached ();
+        }
+        break;
+      default:
+        break;
+    }
+  }
+  return GST_VIDEO_FORMAT_UNKNOWN;
+}
+
 static gboolean
 gst_vtdec_negotiate (GstVideoDecoder * decoder)
 {
   GstVideoCodecState *output_state = NULL;
   GstCaps *peercaps = NULL, *caps = NULL, *templcaps = NULL, *prevcaps = NULL;
-  GstVideoFormat format;
-  GstStructure *structure;
-  const gchar *s;
+  GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN;
   GstVtdec *vtdec;
   OSStatus err = noErr;
   GstCapsFeatures *features = NULL;
@@ -290,9 +354,30 @@
   gst_caps_unref (peercaps);
 
   caps = gst_caps_truncate (gst_caps_make_writable (caps));
-  structure = gst_caps_get_structure (caps, 0);
-  s = gst_structure_get_string (structure, "format");
-  format = gst_video_format_from_string (s);
+
+  /* Try to use whatever video format downstream prefers */
+  {
+    GstStructure *s = gst_caps_get_structure (caps, 0);
+
+    if (gst_structure_has_field_typed (s, "format", GST_TYPE_LIST)) {
+      GstStructure *is = gst_caps_get_structure (vtdec->input_state->caps, 0);
+      const char *name = gst_structure_get_name (is);
+      format = get_preferred_video_format (s,
+          g_strcmp0 (name, "video/x-prores") == 0);
+    }
+
+    if (format == GST_VIDEO_FORMAT_UNKNOWN) {
+      const char *fmt;
+      gst_structure_fixate_field (s, "format");
+      fmt = gst_structure_get_string (s, "format");
+      if (fmt)
+        format = gst_video_format_from_string (fmt);
+      else
+        /* If all fails, just use NV12 */
+        format = GST_VIDEO_FORMAT_NV12;
+    }
+  }
+
   features = gst_caps_get_features (caps, 0);
   if (features)
     features = gst_caps_features_copy (features);
@@ -383,7 +468,7 @@
     if (!vtdec->texture_cache) {
       vtdec->texture_cache =
           gst_video_texture_cache_gl_new (vtdec->ctxh->context);
-      setup_texture_cache (vtdec);
+      setup_texture_cache (vtdec, format);
     }
   }
 #if defined(APPLEMEDIA_MOLTENVK)
@@ -420,7 +505,7 @@
     if (!vtdec->texture_cache) {
       vtdec->texture_cache =
          gst_video_texture_cache_vulkan_new (vtdec->device);
-      setup_texture_cache (vtdec);
+      setup_texture_cache (vtdec, format);
     }
   }
 #endif
@@ -454,6 +539,16 @@
     cm_format = kCMVideoCodecType_MPEG2Video;
   } else if (!strcmp (caps_name, "image/jpeg")) {
     cm_format = kCMVideoCodecType_JPEG;
+  } else if (!strcmp (caps_name, "video/x-prores")) {
+    const char *variant = gst_structure_get_string (structure, "variant");
+
+    if (variant)
+      cm_format = gst_vtutil_codec_type_from_prores_variant (variant);
+
+    if (cm_format == GST_kCMVideoCodecType_Some_AppleProRes) {
+      GST_ERROR_OBJECT (vtdec, "Invalid ProRes variant %s", variant);
+      return FALSE;
+    }
   }
 
   if (cm_format == kCMVideoCodecType_H264 && state->codec_data == NULL) {
@@ -584,11 +679,22 @@
     case GST_VIDEO_FORMAT_NV12:
       cv_format = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;
       break;
-    case GST_VIDEO_FORMAT_UYVY:
-      cv_format = kCVPixelFormatType_422YpCbCr8;
+    case GST_VIDEO_FORMAT_AYUV64:
+/* This is fine for now because Apple only ships LE devices */
+#if G_BYTE_ORDER != G_LITTLE_ENDIAN
+#error "AYUV64 is NE but kCVPixelFormatType_4444AYpCbCr16 is LE"
+#endif
+      cv_format = kCVPixelFormatType_4444AYpCbCr16;
       break;
-    case GST_VIDEO_FORMAT_RGBA:
-      cv_format = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;
+    case GST_VIDEO_FORMAT_ARGB64_BE:
+      cv_format = kCVPixelFormatType_64ARGB;
+      break;
+    case GST_VIDEO_FORMAT_RGBA64_LE:
+      if (GST_VTUTIL_HAVE_64ARGBALE)
+        cv_format = kCVPixelFormatType_64RGBALE;
+      else
+        /* Codepath will never be hit on macOS older than Big Sur (11.3) */
+        g_warn_if_reached ();
+      break;
     default:
       g_warn_if_reached ();
@@ -929,12 +1035,16 @@
    * example) or we're draining/flushing
    */
   if (frame) {
-    if (flush || frame->flags & VTDEC_FRAME_FLAG_SKIP)
+    if (frame->flags & VTDEC_FRAME_FLAG_ERROR) {
+      gst_video_decoder_release_frame (decoder, frame);
+      ret = GST_FLOW_ERROR;
+    } else if (flush || frame->flags & VTDEC_FRAME_FLAG_SKIP) {
       gst_video_decoder_release_frame (decoder, frame);
-    else if (frame->flags & VTDEC_FRAME_FLAG_DROP)
+    } else if (frame->flags & VTDEC_FRAME_FLAG_DROP) {
       gst_video_decoder_drop_frame (decoder, frame);
-    else
+    } else {
       ret = gst_video_decoder_finish_frame (decoder, frame);
+    }
   }
 
   if (!frame || ret != GST_FLOW_OK)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/vtenc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/vtenc.c
Changed
@@ -17,6 +17,49 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  */
+
+/**
+ * SECTION:element-vtenc_h264
+ * @title: vtenc_h264
+ *
+ * Apple VideoToolbox H264 encoder, which can either use HW or a SW
+ * implementation depending on the device.
+ *
+ * ## Example pipeline
+ * |[
+ * gst-launch-1.0 -v videotestsrc ! vtenc_h264 ! qtmux ! filesink location=out.mov
+ * ]| Encode a test video pattern and save it as an MOV file
+ *
+ */
+
+/**
+ * SECTION:element-vtenc_h264_hw
+ * @title: vtenc_h264_hw
+ *
+ * Apple VideoToolbox H264 HW-only encoder (only available on macOS at
+ * present).
+ *
+ * ## Example pipeline
+ * |[
+ * gst-launch-1.0 -v videotestsrc ! vtenc_h264_hw ! qtmux ! filesink location=out.mov
+ * ]| Encode a test video pattern and save it as an MOV file
+ *
+ */
+
+/**
+ * SECTION:element-vtenc_prores
+ * @title: vtenc_prores
+ *
+ * Apple VideoToolbox ProRes encoder
+ *
+ * ## Example pipeline
+ * |[
+ * gst-launch-1.0 -v videotestsrc ! vtenc_prores ! qtmux ! filesink location=out.mov
+ * ]| Encode a test video pattern and save it as an MOV file
+ *
+ * Since: 1.20
+ */
+
 #ifdef HAVE_CONFIG_H
 #include "config.h"
 #endif
@@ -35,6 +78,7 @@
 #define VTENC_DEFAULT_QUALITY 0.5
 #define VTENC_DEFAULT_MAX_KEYFRAME_INTERVAL 0
 #define VTENC_DEFAULT_MAX_KEYFRAME_INTERVAL_DURATION 0
+#define VTENC_DEFAULT_PRESERVE_ALPHA TRUE
 
 GST_DEBUG_CATEGORY (gst_vtenc_debug);
 #define GST_CAT_DEFAULT (gst_vtenc_debug)
@@ -66,6 +110,12 @@
     __attribute__ ((weak_import));
 #endif
 
+/* This property key is currently completely undocumented. The only way you can
+ * know about its existence is if Apple tells you. It allows you to tell the
+ * encoder to not preserve alpha even when outputting alpha formats.
+ */
+const CFStringRef gstVTCodecPropertyKey_PreserveAlphaChannel =
+CFSTR ("kVTCodecPropertyKey_PreserveAlphaChannel");
+
 enum
 {
   PROP_0,
@@ -75,7 +125,8 @@
   PROP_REALTIME,
   PROP_QUALITY,
   PROP_MAX_KEYFRAME_INTERVAL,
-  PROP_MAX_KEYFRAME_INTERVAL_DURATION
+  PROP_MAX_KEYFRAME_INTERVAL_DURATION,
+  PROP_PRESERVE_ALPHA,
 };
 
 typedef struct _GstVTEncFrame GstVTEncFrame;
@@ -151,9 +202,11 @@
     GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ NV12, I420 }"));
 #else
 static GstStaticCaps sink_caps =
-GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ UYVY, NV12, I420 }"));
+GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE
+    ("{ AYUV64, UYVY, NV12, I420, ARGB64_BE }"));
 #endif
 
+
 static void
 gst_vtenc_base_init (GstVTEncClass * klass)
 {
@@ -164,7 +217,6 @@
   const int min_height = 1, max_height = G_MAXINT;
   const int min_fps_n = 0, max_fps_n = G_MAXINT;
   const int min_fps_d = 1, max_fps_d = 1;
-  GstPadTemplate *sink_template, *src_template;
   GstCaps *src_caps;
   gchar *longname, *description;
 
@@ -178,23 +230,77 @@
   g_free (longname);
   g_free (description);
 
-  sink_template = gst_pad_template_new ("sink",
-      GST_PAD_SINK, GST_PAD_ALWAYS, gst_static_caps_get (&sink_caps));
-  gst_element_class_add_pad_template (element_class, sink_template);
+  {
+    GstCaps *caps = gst_static_caps_get (&sink_caps);
+    /* RGBA64_LE is kCVPixelFormatType_64RGBALE, only available on macOS 11.3+ */
+    if (GST_VTUTIL_HAVE_64ARGBALE)
+      caps = gst_vtutil_caps_append_video_format (caps, "RGBA64_LE");
+    gst_element_class_add_pad_template (element_class,
+        gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, caps));
+  }
+
   src_caps = gst_caps_new_simple (codec_details->mimetype,
       "width", GST_TYPE_INT_RANGE, min_width, max_width,
       "height", GST_TYPE_INT_RANGE, min_height, max_height,
       "framerate", GST_TYPE_FRACTION_RANGE,
       min_fps_n, min_fps_d, max_fps_n, max_fps_d, NULL);
-  if (codec_details->format_id == kCMVideoCodecType_H264) {
-    gst_structure_set (gst_caps_get_structure (src_caps, 0),
-        "stream-format", G_TYPE_STRING, "avc",
-        "alignment", G_TYPE_STRING, "au", NULL);
+
+  /* Signal our limited interlace support */
+  {
+    G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
+    GValueArray *arr = g_value_array_new (2);
+    GValue val = G_VALUE_INIT;
+
+    g_value_init (&val, G_TYPE_STRING);
+    g_value_set_string (&val, "progressive");
+    arr = g_value_array_append (arr, &val);
+    g_value_set_string (&val, "interleaved");
+    arr = g_value_array_append (arr, &val);
+    G_GNUC_END_IGNORE_DEPRECATIONS;
+    gst_structure_set_list (gst_caps_get_structure (src_caps, 0),
+        "interlace-mode", arr);
+  }
+
+  switch (codec_details->format_id) {
+    case kCMVideoCodecType_H264:
+      gst_structure_set (gst_caps_get_structure (src_caps, 0),
+          "stream-format", G_TYPE_STRING, "avc",
+          "alignment", G_TYPE_STRING, "au", NULL);
+      break;
+    case GST_kCMVideoCodecType_Some_AppleProRes:
+      if (g_strcmp0 (codec_details->mimetype, "video/x-prores") == 0) {
+        G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
+        GValueArray *arr = g_value_array_new (6);
+        GValue val = G_VALUE_INIT;
+
+        g_value_init (&val, G_TYPE_STRING);
+        g_value_set_string (&val, "standard");
+        arr = g_value_array_append (arr, &val);
+        g_value_set_string (&val, "4444xq");
+        arr = g_value_array_append (arr, &val);
+        g_value_set_string (&val, "4444");
+        arr = g_value_array_append (arr, &val);
+        g_value_set_string (&val, "hq");
+        arr = g_value_array_append (arr, &val);
+        g_value_set_string (&val, "lt");
+        arr = g_value_array_append (arr, &val);
+        g_value_set_string (&val, "proxy");
+        arr = g_value_array_append (arr, &val);
+        gst_structure_set_list (gst_caps_get_structure (src_caps, 0),
+            "variant", arr);
+        g_value_array_free (arr);
+        g_value_unset (&val);
+        G_GNUC_END_IGNORE_DEPRECATIONS;
+        break;
+      }
+      /* fall through */
+    default:
+      g_assert_not_reached ();
   }
-  src_template = gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
-      src_caps);
-  gst_element_class_add_pad_template (element_class, src_template);
+
+  gst_element_class_add_pad_template (element_class,
+      gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, src_caps));
   gst_caps_unref (src_caps);
 }
 
@@ -257,6 +363,25 @@
           "Maximum number of nanoseconds between keyframes (0 = no limit)",
           0, G_MAXUINT64, VTENC_DEFAULT_MAX_KEYFRAME_INTERVAL_DURATION,
           G_PARAM_READWRITE | G_PARAM_CONSTRUCT | G_PARAM_STATIC_STRINGS));
+
+  /*
+   * H264 doesn't support alpha components, so only add the property for prores
+   */
+  if (g_strcmp0 (G_OBJECT_CLASS_NAME (klass), "vtenc_prores") == 0) {
+    /**
+     * vtenc_prores:preserve-alpha
+     *
+     * Preserve non-opaque video alpha values from the input video when
+     * compressing, else treat all alpha component as opaque.
+     *
+     * Since: 1.20
+     */
+    g_object_class_install_property (gobject_class, PROP_PRESERVE_ALPHA,
+        g_param_spec_boolean ("preserve-alpha", "Preserve Video Alpha Values",
+            "Video alpha values (non opaque) need to be preserved",
+            VTENC_DEFAULT_PRESERVE_ALPHA,
+            G_PARAM_READWRITE | G_PARAM_CONSTRUCT | G_PARAM_STATIC_STRINGS));
+  }
 }
 
 static void
@@ -274,6 +399,7 @@
   self->latency_frames = -1;
   self->session = NULL;
   self->profile_level = NULL;
+  self->have_field_order = TRUE;
 
   self->keyframe_props =
       CFDictionaryCreate (NULL, (const void **) keyframe_props_keys,
@@ -463,6 +589,9 @@
       g_value_set_uint64 (value,
          gst_vtenc_get_max_keyframe_interval_duration (self));
       break;
+    case PROP_PRESERVE_ALPHA:
+      g_value_set_boolean (value, self->preserve_alpha);
+      break;
     default:
       G_OBJECT_WARN_INVALID_PROPERTY_ID (obj, prop_id, pspec);
       break;
@@ -495,6 +624,9 @@
       gst_vtenc_set_max_keyframe_interval_duration (self,
          g_value_get_uint64 (value));
       break;
+    case PROP_PRESERVE_ALPHA:
+      self->preserve_alpha = g_value_get_boolean (value);
+      break;
     default:
       G_OBJECT_WARN_INVALID_PROPERTY_ID (obj, prop_id, pspec);
       break;
@@ -609,7 +741,8 @@
   } else if (!strcmp (profile, "main")) {
     profile = "Main";
   } else {
-    g_assert_not_reached ();
+    GST_ERROR_OBJECT (self, "invalid profile: %s", profile);
+    return ret;
   }
 
   if (strlen (level) == 1) {
@@ -631,13 +764,45 @@
 }
 
 static gboolean
-gst_vtenc_negotiate_profile_and_level (GstVideoEncoder * enc)
+gst_vtenc_negotiate_profile_and_level (GstVTEnc * self, GstStructure * s)
+{
+  const gchar *profile = gst_structure_get_string (s, "profile");
+  const gchar *level = gst_structure_get_string (s, "level");
+
+  if (self->profile_level)
+    CFRelease (self->profile_level);
+  self->profile_level = gst_vtenc_profile_level_key (self, profile, level);
+  if (self->profile_level == NULL) {
+    GST_ERROR_OBJECT (self, "unsupported h264 profile '%s' or level '%s'",
+        profile, level);
+    return FALSE;
+  }
+
+  return TRUE;
+}
+
+static gboolean
+gst_vtenc_negotiate_prores_variant (GstVTEnc * self, GstStructure * s)
+{
+  const char *variant = gst_structure_get_string (s, "variant");
+  CMVideoCodecType codec_type =
+      gst_vtutil_codec_type_from_prores_variant (variant);
+
+  if (codec_type == GST_kCMVideoCodecType_Some_AppleProRes) {
+    GST_ERROR_OBJECT (self, "unsupported prores variant: %s", variant);
+    return FALSE;
+  }
+
+  self->specific_format_id = codec_type;
+  return TRUE;
+}
+
+static gboolean
+gst_vtenc_negotiate_specific_format_details (GstVideoEncoder * enc)
 {
   GstVTEnc *self = GST_VTENC_CAST (enc);
   GstCaps *allowed_caps = NULL;
   gboolean ret = TRUE;
-  const gchar *profile = NULL;
-  const gchar *level = NULL;
 
   allowed_caps = gst_pad_get_allowed_caps (GST_VIDEO_ENCODER_SRC_PAD (enc));
   if (allowed_caps) {
@@ -651,17 +816,24 @@
     allowed_caps = gst_caps_make_writable (allowed_caps);
     allowed_caps = gst_caps_fixate (allowed_caps);
     s = gst_caps_get_structure (allowed_caps, 0);
-
-    profile = gst_structure_get_string (s, "profile");
-    level = gst_structure_get_string (s, "level");
-  }
-
-  if (self->profile_level)
-    CFRelease (self->profile_level);
-  self->profile_level = gst_vtenc_profile_level_key (self, profile, level);
-  if (self->profile_level == NULL) {
-    GST_ERROR_OBJECT (enc, "invalid profile and level");
-    goto fail;
+    switch (self->details->format_id) {
+      case kCMVideoCodecType_H264:
+        self->specific_format_id = kCMVideoCodecType_H264;
+        if (!gst_vtenc_negotiate_profile_and_level (self, s))
+          goto fail;
+        break;
+      case GST_kCMVideoCodecType_Some_AppleProRes:
+        if (g_strcmp0 (self->details->mimetype, "video/x-prores") != 0) {
+          GST_ERROR_OBJECT (self, "format_id == %i mimetype must be Apple "
+              "ProRes", GST_kCMVideoCodecType_Some_AppleProRes);
+          goto fail;
+        }
+        if (!gst_vtenc_negotiate_prores_variant (self, s))
+          goto fail;
+        break;
+      default:
+        g_assert_not_reached ();
+    }
   }
 
 out:
@@ -695,7 +867,7 @@
   gst_vtenc_destroy_session (self, &self->session);
   GST_OBJECT_UNLOCK (self);
 
-  gst_vtenc_negotiate_profile_and_level (enc);
+  gst_vtenc_negotiate_specific_format_details (enc);
 
   session = gst_vtenc_create_session (self);
   GST_OBJECT_LOCK (self);
@@ -711,6 +883,28 @@
   return self->negotiated_width != 0;
 }
 
+/*
+ * When the image is opaque but the output ProRes format has an alpha
+ * component (4 component, 32 bits per pixel), Apple requires that we signal
+ * that it should be ignored by setting the depth to 24 bits per pixel. Not
+ * doing so causes the encoded files to fail validation.
+ *
+ * So we set that in the caps and qtmux sets the depth value in the container,
+ * which will be read by demuxers so that decoders can skip those bytes
+ * entirely. qtdemux does this, but vtdec does not use this information at
+ * present.
+ */
+static gboolean
+gst_vtenc_signal_ignored_alpha_component (GstVTEnc * self)
+{
+  if (self->preserve_alpha)
+    return FALSE;
+  if (self->specific_format_id == kCMVideoCodecType_AppleProRes4444XQ ||
+      self->specific_format_id == kCMVideoCodecType_AppleProRes4444)
+    return TRUE;
+  return FALSE;
+}
+
 static gboolean
 gst_vtenc_negotiate_downstream (GstVTEnc * self, CMSampleBufferRef sbuf)
 {
@@ -735,36 +929,50 @@
       "framerate", GST_TYPE_FRACTION,
       self->negotiated_fps_n, self->negotiated_fps_d, NULL);
-  if (self->details->format_id == kCMVideoCodecType_H264) {
-    CMFormatDescriptionRef fmt;
-    CFDictionaryRef atoms;
-    CFStringRef avccKey;
-    CFDataRef avcc;
-    guint8 *codec_data;
-    gsize codec_data_size;
-    GstBuffer *codec_data_buf;
-    guint8 sps[3];
-
-    fmt = CMSampleBufferGetFormatDescription (sbuf);
-    atoms = CMFormatDescriptionGetExtension (fmt,
-        kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms);
-    avccKey = CFStringCreateWithCString (NULL, "avcC", kCFStringEncodingUTF8);
-    avcc = CFDictionaryGetValue (atoms, avccKey);
-    CFRelease (avccKey);
-    codec_data_size = CFDataGetLength (avcc);
-    codec_data = g_malloc (codec_data_size);
-    CFDataGetBytes (avcc, CFRangeMake (0, codec_data_size), codec_data);
-    codec_data_buf = gst_buffer_new_wrapped (codec_data, codec_data_size);
-
-    gst_structure_set (s, "codec_data", GST_TYPE_BUFFER, codec_data_buf, NULL);
-
-    sps[0] = codec_data[1];
-    sps[1] = codec_data[2] & ~0xDF;
-    sps[2] = codec_data[3];
-
-    gst_codec_utils_h264_caps_set_level_and_profile (caps, sps, 3);
-
-    gst_buffer_unref (codec_data_buf);
+  switch (self->details->format_id) {
+    case kCMVideoCodecType_H264:
+    {
+      CMFormatDescriptionRef fmt;
+      CFDictionaryRef atoms;
+      CFStringRef avccKey;
+      CFDataRef avcc;
+      guint8 *codec_data;
+      gsize codec_data_size;
+      GstBuffer *codec_data_buf;
+      guint8 sps[3];
+
+      fmt = CMSampleBufferGetFormatDescription (sbuf);
+      atoms = CMFormatDescriptionGetExtension (fmt,
+          kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms);
+      avccKey = CFStringCreateWithCString (NULL, "avcC", kCFStringEncodingUTF8);
+      avcc = CFDictionaryGetValue (atoms, avccKey);
+      CFRelease (avccKey);
+      codec_data_size = CFDataGetLength (avcc);
+      codec_data = g_malloc (codec_data_size);
+      CFDataGetBytes (avcc, CFRangeMake (0, codec_data_size), codec_data);
+      codec_data_buf = gst_buffer_new_wrapped (codec_data, codec_data_size);
+
+      gst_structure_set (s, "codec_data", GST_TYPE_BUFFER, codec_data_buf,
+          NULL);
+
+      sps[0] = codec_data[1];
+      sps[1] = codec_data[2] & ~0xDF;
+      sps[2] = codec_data[3];
+
+      gst_codec_utils_h264_caps_set_level_and_profile (caps, sps, 3);
+
+      gst_buffer_unref (codec_data_buf);
+    }
+      break;
+    case GST_kCMVideoCodecType_Some_AppleProRes:
+      gst_structure_set (s, "variant", G_TYPE_STRING,
+          gst_vtutil_codec_type_to_prores_variant (self->specific_format_id),
+          NULL);
+      if (gst_vtenc_signal_ignored_alpha_component (self))
+        gst_structure_set (s, "depth", G_TYPE_INT, 24, NULL);
+      break;
+    default:
+      g_assert_not_reached ();
   }
 
   state =
@@ -821,11 +1029,120 @@
   return (ret == GST_FLOW_OK);
 }
 
+static void
+gst_vtenc_set_colorimetry (GstVTEnc * self, VTCompressionSessionRef session)
+{
+  OSStatus status;
+  CFStringRef primaries = NULL, transfer = NULL, matrix = NULL;
+  GstVideoColorimetry cm = GST_VIDEO_INFO_COLORIMETRY (&self->video_info);
+
+  /*
+   * https://developer.apple.com/documentation/corevideo/cvimagebuffer/image_buffer_ycbcr_matrix_constants
+   */
+  switch (cm.matrix) {
+    case GST_VIDEO_COLOR_MATRIX_BT709:
+      matrix = kCVImageBufferYCbCrMatrix_ITU_R_709_2;
+      break;
+    case GST_VIDEO_COLOR_MATRIX_BT601:
+      matrix = kCVImageBufferYCbCrMatrix_ITU_R_601_4;
+      break;
+    case GST_VIDEO_COLOR_MATRIX_SMPTE240M:
+      matrix = kCVImageBufferYCbCrMatrix_SMPTE_240M_1995;
+      break;
+    case GST_VIDEO_COLOR_MATRIX_BT2020:
+      matrix = kCVImageBufferYCbCrMatrix_ITU_R_2020;
+      break;
+    default:
+      GST_WARNING_OBJECT (self, "Unsupported color matrix %u", cm.matrix);
+  }
+
+  /*
+   * https://developer.apple.com/documentation/corevideo/cvimagebuffer/image_buffer_transfer_function_constants
+   */
+  switch (cm.transfer) {
+    case GST_VIDEO_TRANSFER_BT709:
+    case GST_VIDEO_TRANSFER_BT601:
+    case GST_VIDEO_TRANSFER_UNKNOWN:
+      transfer = kCVImageBufferTransferFunction_ITU_R_709_2;
+      break;
+    case GST_VIDEO_TRANSFER_SMPTE240M:
+      transfer = kCVImageBufferTransferFunction_SMPTE_240M_1995;
+      break;
+    case GST_VIDEO_TRANSFER_BT2020_12:
+      transfer = kCVImageBufferTransferFunction_ITU_R_2020;
+      break;
+    case GST_VIDEO_TRANSFER_SRGB:
+      if (__builtin_available (macOS 10.13, *))
+        transfer = kCVImageBufferTransferFunction_sRGB;
+      else
+        GST_WARNING_OBJECT (self, "macOS version is too old, the sRGB transfer "
+            "function is not available");
+      break;
+    case GST_VIDEO_TRANSFER_SMPTE2084:
+      if (__builtin_available (macOS 10.13, *))
+        transfer = kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ;
+      else
+        GST_WARNING_OBJECT (self, "macOS version is too old, the SMPTE2084 "
+            "transfer function is not available");
+      break;
+    default:
+      GST_WARNING_OBJECT (self, "Unsupported color transfer %u", cm.transfer);
+  }
+
+  /*
+   * https://developer.apple.com/documentation/corevideo/cvimagebuffer/image_buffer_color_primaries_constants
+   */
+  switch (cm.primaries) {
+    case GST_VIDEO_COLOR_PRIMARIES_BT709:
+      primaries = kCVImageBufferColorPrimaries_ITU_R_709_2;
+      break;
+    case GST_VIDEO_COLOR_PRIMARIES_SMPTE170M:
+    case GST_VIDEO_COLOR_PRIMARIES_SMPTE240M:
+      primaries = kCVImageBufferColorPrimaries_SMPTE_C;
+      break;
+    case GST_VIDEO_COLOR_PRIMARIES_BT2020:
+      primaries = kCVImageBufferColorPrimaries_ITU_R_2020;
+      break;
+    case GST_VIDEO_COLOR_PRIMARIES_SMPTERP431:
+      primaries = kCVImageBufferColorPrimaries_DCI_P3;
+      break;
+    case GST_VIDEO_COLOR_PRIMARIES_SMPTEEG432:
+      primaries = kCVImageBufferColorPrimaries_P3_D65;
+      break;
+    case GST_VIDEO_COLOR_PRIMARIES_EBU3213:
+      primaries = kCVImageBufferColorPrimaries_EBU_3213;
+      break;
+    default:
+      GST_WARNING_OBJECT (self, "Unsupported color primaries %u", cm.primaries);
+  }
+
+  if (primaries) {
+    status = VTSessionSetProperty (session,
+        kVTCompressionPropertyKey_ColorPrimaries, primaries);
+    GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_ColorPrimaries =>"
+        "%d", status);
+  }
+
+  if (transfer) {
+    status = VTSessionSetProperty (session,
+        kVTCompressionPropertyKey_TransferFunction, transfer);
+    GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_TransferFunction =>"
+        "%d", status);
+  }
+
+  if (matrix) {
+    status = VTSessionSetProperty (session,
+        kVTCompressionPropertyKey_YCbCrMatrix, matrix);
+    GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_YCbCrMatrix => %d",
+        status);
+  }
+}
+
 static VTCompressionSessionRef
 gst_vtenc_create_session (GstVTEnc * self)
 {
   VTCompressionSessionRef session = NULL;
-  CFMutableDictionaryRef encoder_spec = NULL, pb_attrs;
+  CFMutableDictionaryRef encoder_spec = NULL, pb_attrs = NULL;
   OSStatus status;
 
 #if !HAVE_IOS
@@ -843,16 +1160,21 @@
       TRUE);
 #endif
 
-  pb_attrs = CFDictionaryCreateMutable (NULL, 0, &kCFTypeDictionaryKeyCallBacks,
-      &kCFTypeDictionaryValueCallBacks);
-  gst_vtutil_dict_set_i32 (pb_attrs, kCVPixelBufferWidthKey,
-      self->negotiated_width);
-  gst_vtutil_dict_set_i32 (pb_attrs, kCVPixelBufferHeightKey,
-      self->negotiated_height);
+  if (self->profile_level) {
+    pb_attrs = CFDictionaryCreateMutable (NULL, 0,
+        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
+    gst_vtutil_dict_set_i32 (pb_attrs, kCVPixelBufferWidthKey,
+        self->negotiated_width);
+    gst_vtutil_dict_set_i32 (pb_attrs, kCVPixelBufferHeightKey,
+        self->negotiated_height);
+  }
+
+  /* This was set in gst_vtenc_negotiate_specific_format_details() */
+  g_assert_cmpint (self->specific_format_id, !=, 0);
 
   status = VTCompressionSessionCreate (NULL,
       self->negotiated_width, self->negotiated_height,
-      self->details->format_id, encoder_spec, pb_attrs, NULL,
+      self->specific_format_id, encoder_spec, pb_attrs, NULL,
      gst_vtenc_enqueue_buffer, self, &session);
   GST_INFO_OBJECT (self, "VTCompressionSessionCreate for %d x %d => %d",
      self->negotiated_width, self->negotiated_height, (int) status);
@@ -862,26 +1184,81 @@
     goto beach;
   }
 
-  gst_vtenc_session_configure_expected_framerate (self, session,
-      (gdouble) self->negotiated_fps_n / (gdouble) self->negotiated_fps_d);
+  if (self->profile_level) {
+    gst_vtenc_session_configure_expected_framerate (self, session,
+        (gdouble) self->negotiated_fps_n / (gdouble) self->negotiated_fps_d);
 
-  status = VTSessionSetProperty (session,
-      kVTCompressionPropertyKey_ProfileLevel, self->profile_level);
-  GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_ProfileLevel => %d",
-      (int) status);
+    /*
+     * https://developer.apple.com/documentation/videotoolbox/vtcompressionsession/compression_properties/profile_and_level_constants
+     */
+    status = VTSessionSetProperty (session,
+        kVTCompressionPropertyKey_ProfileLevel, self->profile_level);
+    GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_ProfileLevel => %d",
+        (int) status);
 
-  status = VTSessionSetProperty (session,
-      kVTCompressionPropertyKey_AllowTemporalCompression, kCFBooleanTrue);
-  GST_DEBUG_OBJECT (self,
-      "kVTCompressionPropertyKey_AllowTemporalCompression => %d", (int) status);
+    status = VTSessionSetProperty (session,
+        kVTCompressionPropertyKey_AllowTemporalCompression, kCFBooleanTrue);
+    GST_DEBUG_OBJECT (self,
+        "kVTCompressionPropertyKey_AllowTemporalCompression => %d",
+        (int) status);
+
+    gst_vtenc_session_configure_max_keyframe_interval (self, session,
+        self->max_keyframe_interval);
+    gst_vtenc_session_configure_max_keyframe_interval_duration (self, session,
+        self->max_keyframe_interval_duration / ((gdouble) GST_SECOND));
 
-  gst_vtenc_session_configure_max_keyframe_interval (self, session,
-      self->max_keyframe_interval);
-  gst_vtenc_session_configure_max_keyframe_interval_duration (self, session,
-      self->max_keyframe_interval_duration / ((gdouble) GST_SECOND));
+    gst_vtenc_session_configure_bitrate (self, session,
+        gst_vtenc_get_bitrate (self));
+  }
+
+  /* Force encoder to not preserve alpha with 4444(XQ) ProRes formats if
+   * requested */
+  if (!self->preserve_alpha &&
+      (self->specific_format_id == kCMVideoCodecType_AppleProRes4444XQ ||
+          self->specific_format_id == kCMVideoCodecType_AppleProRes4444)) {
+    status = VTSessionSetProperty (session,
+        gstVTCodecPropertyKey_PreserveAlphaChannel, CFSTR ("NO"));
+    GST_DEBUG_OBJECT (self, "kVTCodecPropertyKey_PreserveAlphaChannel => %d",
+        (int) status);
+  }
+
+  gst_vtenc_set_colorimetry (self, session);
+
+  /* Interlacing */
+  switch (GST_VIDEO_INFO_INTERLACE_MODE (&self->video_info)) {
+    case GST_VIDEO_INTERLACE_MODE_PROGRESSIVE:
+      gst_vtenc_session_configure_property_int (self, session,
+          kVTCompressionPropertyKey_FieldCount, 1);
+      break;
+    case GST_VIDEO_INTERLACE_MODE_INTERLEAVED:
+      gst_vtenc_session_configure_property_int (self, session,
+          kVTCompressionPropertyKey_FieldCount, 2);
+      switch (GST_VIDEO_INFO_FIELD_ORDER (&self->video_info)) {
+        case GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST:
+          status = VTSessionSetProperty (session,
+              kVTCompressionPropertyKey_FieldDetail,
+              kCMFormatDescriptionFieldDetail_TemporalTopFirst);
+          GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_FieldDetail "
+              "TemporalTopFirst => %d", (int) status);
+          break;
+        case GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST:
+          status = VTSessionSetProperty (session,
+              kVTCompressionPropertyKey_FieldDetail,
+              kCMFormatDescriptionFieldDetail_TemporalBottomFirst);
+          GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_FieldDetail "
+              "TemporalBottomFirst => %d", (int) status);
+          break;
+        case GST_VIDEO_FIELD_ORDER_UNKNOWN:
+          GST_INFO_OBJECT (self, "Unknown field order for interleaved content, "
+              "will check first buffer");
+          self->have_field_order = FALSE;
+      }
+      break;
+    default:
+      /* Caps negotiation should prevent this */
+      g_assert_not_reached ();
+  }
 
-  gst_vtenc_session_configure_bitrate (self, session,
-      gst_vtenc_get_bitrate (self));
   gst_vtenc_session_configure_realtime (self, session,
       gst_vtenc_get_realtime (self));
   gst_vtenc_session_configure_allow_frame_reordering (self, session,
@@ -906,7 +1283,8 @@
 beach:
   if (encoder_spec)
     CFRelease (encoder_spec);
-  CFRelease (pb_attrs);
+  if (pb_attrs)
+    CFRelease (pb_attrs);
 
   return session;
 }
@@ -1133,6 +1511,33 @@
   else
     duration = kCMTimeInvalid;
 
+  /* If we don't have field order, we need to pick it up from the first buffer
+   * that has that information. The encoder session also cannot be reconfigured
+   * with a new field detail after it has been set, so we encode mixed streams
+   * with whatever the first buffer's field order is. */
+  if (!self->have_field_order) {
+    CFStringRef field_detail = NULL;
+
+    if (GST_VIDEO_BUFFER_IS_TOP_FIELD (frame->input_buffer))
+      field_detail = kCMFormatDescriptionFieldDetail_TemporalTopFirst;
+    else if (GST_VIDEO_BUFFER_IS_BOTTOM_FIELD (frame->input_buffer))
+      field_detail = kCMFormatDescriptionFieldDetail_TemporalBottomFirst;
+
+    if (field_detail) {
+      vt_status = VTSessionSetProperty (self->session,
+          kVTCompressionPropertyKey_FieldDetail, field_detail);
+      GST_DEBUG_OBJECT (self, "kVTCompressionPropertyKey_FieldDetail => %d",
+          (int) vt_status);
+    } else {
+      GST_WARNING_OBJECT (self, "have interlaced content, but don't know field "
+          "order yet, skipping buffer");
+      gst_video_codec_frame_unref (frame);
+      return GST_FLOW_OK;
+    }
+
+    self->have_field_order = TRUE;
+  }
+
   meta = gst_buffer_get_core_media_meta (frame->input_buffer);
   if (meta != NULL) {
     pbuf = gst_core_media_buffer_get_pixel_buffer (frame->input_buffer);
@@ -1157,18 +1562,21 @@
         pixel_format_type = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;
         break;
       default:
-        goto cv_error;
+        g_assert_not_reached ();
     }
 
     if (!gst_video_frame_map (&inframe, &self->video_info, frame->input_buffer,
-            GST_MAP_READ))
+            GST_MAP_READ)) {
+      GST_ERROR_OBJECT (self, "failed to map input buffer");
       goto cv_error;
+    }
 
     cv_ret =
        CVPixelBufferCreate (NULL, self->negotiated_width,
        self->negotiated_height, pixel_format_type, NULL, &pbuf);
 
    if (cv_ret != kCVReturnSuccess) {
+      GST_ERROR_OBJECT (self, "CVPixelBufferCreate failed: %i", cv_ret);
       gst_video_frame_unmap (&inframe);
       goto cv_error;
     }
@@ -1177,6 +1585,7 @@
         gst_core_video_buffer_new ((CVBufferRef) pbuf, &self->video_info, NULL);
     if (!gst_video_frame_map (&outframe, &self->video_info, outbuf,
             GST_MAP_WRITE)) {
+      GST_ERROR_OBJECT (self, "Failed to map output buffer");
       gst_video_frame_unmap (&inframe);
       gst_buffer_unref (outbuf);
       CVPixelBufferRelease (pbuf);
@@ -1184,6 +1593,7 @@
     }
 
     if (!gst_video_frame_copy (&outframe, &inframe)) {
+      GST_ERROR_OBJECT (self, "Failed to copy output frame");
       gst_video_frame_unmap (&inframe);
       gst_buffer_unref (outbuf);
       CVPixelBufferRelease (pbuf);
@@ -1200,8 +1610,10 @@
     CVReturn cv_ret;
 
     vframe = gst_vtenc_frame_new (frame->input_buffer, &self->video_info);
-    if (!vframe)
+    if (!vframe) {
+      GST_ERROR_OBJECT (self, "Failed to create a new input frame");
       goto cv_error;
+    }
 
     {
       const size_t num_planes = GST_VIDEO_FRAME_N_PLANES (&vframe->videoframe);
@@ -1224,6 +1636,23 @@
     }
 
     switch (GST_VIDEO_INFO_FORMAT (&self->video_info)) {
+      case GST_VIDEO_FORMAT_ARGB64_BE:
+        pixel_format_type = kCVPixelFormatType_64ARGB;
+        break;
+      case GST_VIDEO_FORMAT_AYUV64:
+/* This is fine for now because Apple only ships LE devices */
+#if G_BYTE_ORDER != G_LITTLE_ENDIAN
+#error "AYUV64 is NE but kCVPixelFormatType_4444AYpCbCr16 is LE"
+#endif
+        pixel_format_type = kCVPixelFormatType_4444AYpCbCr16;
+        break;
+      case GST_VIDEO_FORMAT_RGBA64_LE:
+        if (GST_VTUTIL_HAVE_64ARGBALE)
+          pixel_format_type = kCVPixelFormatType_64RGBALE;
+        else
+          /* Codepath will never be hit on macOS older than Big Sur (11.3) */
+          g_assert_not_reached ();
+        break;
       case GST_VIDEO_FORMAT_I420:
         pixel_format_type = kCVPixelFormatType_420YpCbCr8Planar;
         break;
@@ -1234,8 +1663,7 @@
         pixel_format_type = kCVPixelFormatType_422YpCbCr8;
         break;
       default:
-        gst_vtenc_frame_free (vframe);
-        goto cv_error;
+        g_assert_not_reached ();
     }
cv_ret = CVPixelBufferCreateWithPlanarBytes (NULL, @@ -1250,6 +1678,8 @@ plane_bytes_per_row, gst_pixel_buffer_release_cb, vframe, NULL, &pbuf); if (cv_ret != kCVReturnSuccess) { + GST_ERROR_OBJECT (self, "CVPixelBufferCreateWithPlanarBytes failed: %i", + cv_ret); gst_vtenc_frame_free (vframe); goto cv_error; } @@ -1458,6 +1888,8 @@ #ifndef HAVE_IOS {"H.264 (HW only)", "h264_hw", "video/x-h264", kCMVideoCodecType_H264, TRUE}, #endif + {"Apple ProRes", "prores", "video/x-prores", + GST_kCMVideoCodecType_Some_AppleProRes, FALSE}, }; void
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/vtenc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/vtenc.h
Changed
@@ -58,6 +58,7 @@
   const GstVTEncoderDetails * details;
 
+  CMVideoCodecType specific_format_id;
   CFStringRef profile_level;
   guint bitrate;
   gboolean allow_frame_reordering;
@@ -66,6 +67,7 @@
   gint max_keyframe_interval;
   GstClockTime max_keyframe_interval_duration;
   gint latency_frames;
+  gboolean preserve_alpha;
 
   gboolean dump_properties;
   gboolean dump_attributes;
@@ -74,6 +76,7 @@
   gint negotiated_fps_n, negotiated_fps_d;
   gint caps_width, caps_height;
   gint caps_fps_n, caps_fps_d;
+  gboolean have_field_order;
   GstVideoCodecState *input_state;
   GstVideoInfo video_info;
   VTCompressionSessionRef session;
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/vtutil.c -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/vtutil.c
Changed
@@ -96,3 +96,66 @@
   CFDictionarySetValue (dict, key, value);
   CFRelease (value);
 }
+
+CMVideoCodecType
+gst_vtutil_codec_type_from_prores_variant (const char *variant)
+{
+  if (g_strcmp0 (variant, "standard") == 0)
+    return kCMVideoCodecType_AppleProRes422;
+  else if (g_strcmp0 (variant, "4444xq") == 0)
+    return kCMVideoCodecType_AppleProRes4444XQ;
+  else if (g_strcmp0 (variant, "4444") == 0)
+    return kCMVideoCodecType_AppleProRes4444;
+  else if (g_strcmp0 (variant, "hq") == 0)
+    return kCMVideoCodecType_AppleProRes422HQ;
+  else if (g_strcmp0 (variant, "lt") == 0)
+    return kCMVideoCodecType_AppleProRes422LT;
+  else if (g_strcmp0 (variant, "proxy") == 0)
+    return kCMVideoCodecType_AppleProRes422Proxy;
+  return GST_kCMVideoCodecType_Some_AppleProRes;
+}
+
+const char *
+gst_vtutil_codec_type_to_prores_variant (CMVideoCodecType codec_type)
+{
+  switch (codec_type) {
+    case kCMVideoCodecType_AppleProRes422:
+      return "standard";
+    case kCMVideoCodecType_AppleProRes4444XQ:
+      return "4444xq";
+    case kCMVideoCodecType_AppleProRes4444:
+      return "4444";
+    case kCMVideoCodecType_AppleProRes422HQ:
+      return "hq";
+    case kCMVideoCodecType_AppleProRes422LT:
+      return "lt";
+    case kCMVideoCodecType_AppleProRes422Proxy:
+      return "proxy";
+    default:
+      g_assert_not_reached ();
+  }
+}
+
+GstCaps *
+gst_vtutil_caps_append_video_format (GstCaps * caps, const char *vfmt)
+{
+  GstStructure *s;
+  GValueArray *arr;
+  GValue val = G_VALUE_INIT;
+
+  caps = gst_caps_make_writable (caps);
+  s = gst_caps_get_structure (caps, 0);
+  gst_structure_get_list (s, "format", &arr);
+
+  g_value_init (&val, G_TYPE_STRING);
+
+  g_value_set_string (&val, vfmt);
+  G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
+  arr = g_value_array_append (arr, &val);
+  G_GNUC_END_IGNORE_DEPRECATIONS;
+
+  g_value_unset (&val);
+
+  gst_structure_set_list (s, "format", arr);
+  return caps;
+}
gst-plugins-bad-1.18.6.tar.xz/sys/applemedia/vtutil.h -> gst-plugins-bad-1.20.1.tar.xz/sys/applemedia/vtutil.h
Changed
@@ -21,7 +21,21 @@
 #define __GST_VTUTIL_H__
 
 #include <glib.h>
+#include <gst/gst.h>
 #include <CoreFoundation/CoreFoundation.h>
+#include <CoreMedia/CoreMedia.h>
+
+/* Some formats such as Apple ProRes have separate codec type mappings for all
+ * variants / profiles, and we don't want to instantiate separate elements for
+ * each variant, so we use a dummy type for details->format_id */
+#define GST_kCMVideoCodecType_Some_AppleProRes 1
+
+// kCVPixelFormatType_64RGBALE is only available on macOS 11.3+.
+// See https://developer.apple.com/documentation/corevideo/1563591-pixel_format_identifiers/kcvpixelformattype_64rgbale
+#if defined(MAC_OS_X_VERSION_MAX_ALLOWED) && MAC_OS_X_VERSION_MAX_ALLOWED < 110300
+#define kCVPixelFormatType_64RGBALE 'l64r'
+#endif
+#define GST_VTUTIL_HAVE_64ARGBALE __builtin_available(macOS 11.3, *)
 
 G_BEGIN_DECLS
 
@@ -38,6 +52,12 @@
 void gst_vtutil_dict_set_object (CFMutableDictionaryRef dict,
     CFStringRef key, CFTypeRef * value);
 
+CMVideoCodecType gst_vtutil_codec_type_from_prores_variant (const char * variant);
+const char * gst_vtutil_codec_type_to_prores_variant (CMVideoCodecType codec_type);
+
+GstCaps * gst_vtutil_caps_append_video_format (GstCaps * caps,
+    const char * vfmt);
+
 G_END_DECLS
 
 #endif /* __GST_VTUTIL_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/asio
Added
+(directory)
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasiodeviceprovider.cpp
Added
@@ -0,0 +1,273 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstasiodeviceprovider.h" +#include "gstasioutils.h" +#include "gstasioobject.h" +#include <atlconv.h> + +enum +{ + PROP_0, + PROP_DEVICE_CLSID, +}; + +struct _GstAsioDevice +{ + GstDevice parent; + + gchar *device_clsid; + const gchar *factory_name; +}; + +G_DEFINE_TYPE (GstAsioDevice, gst_asio_device, GST_TYPE_DEVICE); + +static void gst_asio_device_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_asio_device_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_asio_device_finalize (GObject * object); +static GstElement *gst_asio_device_create_element (GstDevice * device, + const gchar * name); + +static void +gst_asio_device_class_init (GstAsioDeviceClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstDeviceClass *dev_class = GST_DEVICE_CLASS (klass); + + dev_class->create_element = gst_asio_device_create_element; + + gobject_class->get_property = gst_asio_device_get_property; + gobject_class->set_property = 
gst_asio_device_set_property; + gobject_class->finalize = gst_asio_device_finalize; + + g_object_class_install_property (gobject_class, PROP_DEVICE_CLSID, + g_param_spec_string ("device-clsid", "Device CLSID", + "ASIO device CLSID as string including curly brackets", NULL, + (GParamFlags) (G_PARAM_READWRITE | + G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS))); +} + +static void +gst_asio_device_init (GstAsioDevice * self) +{ +} + +static void +gst_asio_device_finalize (GObject * object) +{ + GstAsioDevice *self = GST_ASIO_DEVICE (object); + + g_free (self->device_clsid); + + G_OBJECT_CLASS (gst_asio_device_parent_class)->finalize (object); +} + +static GstElement * +gst_asio_device_create_element (GstDevice * device, const gchar * name) +{ + GstAsioDevice *self = GST_ASIO_DEVICE (device); + GstElement *elem; + + elem = gst_element_factory_make (self->factory_name, name); + + g_object_set (elem, "device-clsid", self->device_clsid, NULL); + + return elem; +} + +static void +gst_asio_device_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstAsioDevice *self = GST_ASIO_DEVICE (object); + + switch (prop_id) { + case PROP_DEVICE_CLSID: + g_value_set_string (value, self->device_clsid); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_asio_device_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstAsioDevice *self = GST_ASIO_DEVICE (object); + + switch (prop_id) { + case PROP_DEVICE_CLSID: + g_free (self->device_clsid); + self->device_clsid = g_value_dup_string (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +struct _GstAsioDeviceProvider +{ + GstDeviceProvider parent; +}; + +G_DEFINE_TYPE (GstAsioDeviceProvider, gst_asio_device_provider, + GST_TYPE_DEVICE_PROVIDER); + +static GList *gst_asio_device_provider_probe (GstDeviceProvider * provider); + +static 
void +gst_asio_device_provider_class_init (GstAsioDeviceProviderClass * klass) +{ + GstDeviceProviderClass *provider_class = GST_DEVICE_PROVIDER_CLASS (klass); + + provider_class->probe = GST_DEBUG_FUNCPTR (gst_asio_device_provider_probe); + + gst_device_provider_class_set_static_metadata (provider_class, + "ASIO Device Provider", + "Source/Sink/Audio", "List ASIO source and sink devices", + "Seungha Yang <seungha@centricular.com>"); +} + +static void +gst_asio_device_provider_init (GstAsioDeviceProvider * provider) +{ +} + +static void +gst_asio_device_provider_probe_internal (GstAsioDeviceProvider * self, + gboolean is_src, GList * asio_device_list, GList ** devices) +{ + const gchar *device_class, *factory_name; + GList *iter; + + USES_CONVERSION; + + if (is_src) { + device_class = "Audio/Source"; + factory_name = "asiosrc"; + } else { + device_class = "Audio/Sink"; + factory_name = "asiosink"; + } + + for (iter = asio_device_list; iter; iter = g_list_next (iter)) { + GstDevice *device; + GstAsioDeviceInfo *info = (GstAsioDeviceInfo *) iter->data; + GstAsioObject *obj; + GstCaps *caps = nullptr; + GstStructure *props = nullptr; + long max_in_ch = 0; + long max_out_ch = 0; + HRESULT hr; + LPOLESTR clsid_str = nullptr; + glong min_buf_size = 0; + glong max_buf_size = 0; + glong preferred_buf_size = 0; + glong buf_size_granularity = 0; + + obj = gst_asio_object_new (info, FALSE); + if (!obj) + continue; + + if (!gst_asio_object_get_max_num_channels (obj, &max_in_ch, &max_out_ch)) + goto done; + + if (is_src && max_in_ch <= 0) + goto done; + else if (!is_src && max_out_ch <= 0) + goto done; + + if (is_src) { + caps = gst_asio_object_get_caps (obj, + GST_ASIO_DEVICE_CLASS_CAPTURE, 1, max_in_ch); + } else { + caps = gst_asio_object_get_caps (obj, + GST_ASIO_DEVICE_CLASS_RENDER, 1, max_out_ch); + } + if (!caps) + goto done; + + hr = StringFromIID (info->clsid, &clsid_str); + if (FAILED (hr)) + goto done; + + if (!gst_asio_object_get_buffer_size (obj, &min_buf_size, 
&max_buf_size, + &preferred_buf_size, &buf_size_granularity)) + goto done; + + props = gst_structure_new ("asio-proplist", + "device.api", G_TYPE_STRING, "asio", + "device.clsid", G_TYPE_STRING, OLE2A (clsid_str), + "asio.device.description", G_TYPE_STRING, info->driver_desc, + "asio.device.min-buf-size", G_TYPE_LONG, min_buf_size, + "asio.device.max-buf-size", G_TYPE_LONG, max_buf_size, + "asio.device.preferred-buf-size", G_TYPE_LONG, preferred_buf_size, + "asio.device.buf-size-granularity", G_TYPE_LONG, buf_size_granularity, + nullptr); + + device = (GstDevice *) g_object_new (GST_TYPE_ASIO_DEVICE, + "device-clsid", OLE2A (clsid_str), + "display-name", info->driver_desc, "caps", caps, + "device-class", device_class, "properties", props, nullptr); + GST_ASIO_DEVICE (device)->factory_name = factory_name; + + *devices = g_list_append (*devices, device); + + done: + gst_clear_caps (&caps); + gst_clear_object (&obj); + if (props) + gst_structure_free (props); + } + + return; +} + +static GList * +gst_asio_device_provider_probe (GstDeviceProvider * provider) +{ + GstAsioDeviceProvider *self = GST_ASIO_DEVICE_PROVIDER (provider); + GList *devices = nullptr; + guint num_device; + GList *asio_device_list = nullptr; + + num_device = gst_asio_enum (&asio_device_list); + + if (num_device == 0) + return nullptr; + + gst_asio_device_provider_probe_internal (self, + TRUE, asio_device_list, &devices); + gst_asio_device_provider_probe_internal (self, + FALSE, asio_device_list, &devices); + + g_list_free_full (asio_device_list, + (GDestroyNotify) gst_asio_device_info_free); + + return devices; +}
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasiodeviceprovider.h
Added
@@ -0,0 +1,37 @@
+/* GStreamer
+ * Copyright (C) 2021 Seungha Yang <seungha@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_ASIO_DEVICE_PROVIDER_H__
+#define __GST_ASIO_DEVICE_PROVIDER_H__
+
+#include <gst/gst.h>
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_ASIO_DEVICE (gst_asio_device_get_type())
+#define GST_TYPE_ASIO_DEVICE_PROVIDER (gst_asio_device_provider_get_type())
+
+G_DECLARE_FINAL_TYPE (GstAsioDevice, gst_asio_device,
+    GST, ASIO_DEVICE, GstDevice);
+G_DECLARE_FINAL_TYPE (GstAsioDeviceProvider, gst_asio_device_provider,
+    GST, ASIO_DEVICE_PROVIDER, GstDeviceProvider);
+
+G_END_DECLS
+
+#endif /* __GST_ASIO_DEVICE_PROVIDER_H__ */
\ No newline at end of file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasioobject.cpp
Added
@@ -0,0 +1,1875 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstasioobject.h" +#include <string.h> +#include <avrt.h> +#include <string> +#include <functional> +#include <vector> +#include <mutex> +#include <iasiodrv.h> + +GST_DEBUG_CATEGORY_STATIC (gst_asio_object_debug); +#define GST_CAT_DEFAULT gst_asio_object_debug + +/* List of GstAsioObject */ +static GList *asio_object_list = nullptr; + +/* *INDENT-OFF* */ +/* Protect asio_object_list and other global values */ +std::mutex global_lock; + +/* Protect callback slots */ +std::mutex slot_lock; +/* *INDENT-ON* */ + +static void gst_asio_object_buffer_switch (GstAsioObject * self, + glong index, ASIOBool process_now); +static void gst_asio_object_sample_rate_changed (GstAsioObject * self, + ASIOSampleRate rate); +static glong gst_asio_object_messages (GstAsioObject * self, glong selector, + glong value, gpointer message, gdouble * opt); +static ASIOTime *gst_asio_object_buffer_switch_time_info (GstAsioObject * self, + ASIOTime * time_info, glong index, ASIOBool process_now); + +/* *INDENT-OFF* */ +/* Object to delegate ASIO callbacks to dedicated GstAsioObject */ 
+class GstAsioCallbacks +{ +public: + GstAsioCallbacks (GstAsioObject * object) + { + g_weak_ref_init (&object_, object); + } + + virtual ~GstAsioCallbacks () + { + g_weak_ref_clear (&object_); + } + + void BufferSwitch (glong index, ASIOBool process_now) + { + GstAsioObject *obj = (GstAsioObject *) g_weak_ref_get (&object_); + if (!obj) + return; + + gst_asio_object_buffer_switch (obj, index, process_now); + gst_object_unref (obj); + } + + void SampleRateChanged (ASIOSampleRate rate) + { + GstAsioObject *obj = (GstAsioObject *) g_weak_ref_get (&object_); + if (!obj) + return; + + gst_asio_object_sample_rate_changed (obj, rate); + gst_object_unref (obj); + } + + glong Messages (glong selector, glong value, gpointer message, gdouble *opt) + { + GstAsioObject *obj = (GstAsioObject *) g_weak_ref_get (&object_); + if (!obj) + return 0; + + glong ret = gst_asio_object_messages (obj, selector, value, message, opt); + gst_object_unref (obj); + + return ret; + } + + ASIOTime * BufferSwitchTimeInfo (ASIOTime * time_info, + glong index, ASIOBool process_now) + { + GstAsioObject *obj = (GstAsioObject *) g_weak_ref_get (&object_); + if (!obj) + return nullptr; + + ASIOTime * ret = gst_asio_object_buffer_switch_time_info (obj, + time_info, index, process_now); + gst_object_unref (obj); + + return ret; + } + +private: + GWeakRef object_; +}; + +template <int instance_id> +class GstAsioCallbacksSlot +{ +public: + static void + BufferSwitchStatic(glong index, ASIOBool process_now) + { + buffer_switch(index, process_now); + } + + static void + SampleRateChangedStatic (ASIOSampleRate rate) + { + sample_rate_changed(rate); + } + + static glong + MessagesStatic(glong selector, glong value, gpointer message, gdouble *opt) + { + return messages(selector, value, message, opt); + } + + static ASIOTime * + BufferSwitchTimeInfoStatic(ASIOTime * time_info, glong index, + ASIOBool process_now) + { + return buffer_switch_time_info(time_info, index, process_now); + } + + static 
std::function<void(glong, ASIOBool)> buffer_switch; + static std::function<void(ASIOSampleRate)> sample_rate_changed; + static std::function<glong(glong, glong, gpointer, gdouble *)> messages; + static std::function<ASIOTime *(ASIOTime *, glong, ASIOBool)> buffer_switch_time_info; + + static bool bound; + + static void Init () + { + buffer_switch = nullptr; + sample_rate_changed = nullptr; + messages = nullptr; + buffer_switch_time_info = nullptr; + bound = false; + } + + static bool IsBound () + { + return bound; + } + + static void Bind (GstAsioCallbacks * cb, ASIOCallbacks * driver_cb) + { + buffer_switch = std::bind(&GstAsioCallbacks::BufferSwitch, cb, + std::placeholders::_1, std::placeholders::_2); + sample_rate_changed = std::bind(&GstAsioCallbacks::SampleRateChanged, cb, + std::placeholders::_1); + messages = std::bind(&GstAsioCallbacks::Messages, cb, + std::placeholders::_1, std::placeholders::_2, std::placeholders::_3, + std::placeholders::_4); + buffer_switch_time_info = std::bind(&GstAsioCallbacks::BufferSwitchTimeInfo, + cb, std::placeholders::_1, std::placeholders::_2, std::placeholders::_3); + + driver_cb->bufferSwitch = BufferSwitchStatic; + driver_cb->sampleRateDidChange = SampleRateChangedStatic; + driver_cb->asioMessage = MessagesStatic; + driver_cb->bufferSwitchTimeInfo = BufferSwitchTimeInfoStatic; + + bound = true; + } +}; + +template <int instance_id> +std::function<void(glong, ASIOBool)> GstAsioCallbacksSlot<instance_id>::buffer_switch; +template <int instance_id> +std::function<void(ASIOSampleRate)> GstAsioCallbacksSlot<instance_id>::sample_rate_changed; +template <int instance_id> +std::function<glong(glong, glong, gpointer, gdouble *)> GstAsioCallbacksSlot<instance_id>::messages; +template <int instance_id> +std::function<ASIOTime *(ASIOTime *, glong, ASIOBool)> GstAsioCallbacksSlot<instance_id>::buffer_switch_time_info; +template <int instance_id> +bool GstAsioCallbacksSlot<instance_id>::bound; + +/* XXX: Create global slot objects, + * 
because ASIO callback doesn't support user data, hum.... */ +GstAsioCallbacksSlot<0> cb_slot_0; +GstAsioCallbacksSlot<1> cb_slot_1; +GstAsioCallbacksSlot<2> cb_slot_2; +GstAsioCallbacksSlot<3> cb_slot_3; +GstAsioCallbacksSlot<4> cb_slot_4; +GstAsioCallbacksSlot<5> cb_slot_5; +GstAsioCallbacksSlot<6> cb_slot_6; +GstAsioCallbacksSlot<7> cb_slot_7; + +/* *INDENT-ON* */ + +typedef struct +{ + GstAsioObjectCallbacks callbacks; + guint64 callback_id; +} GstAsioObjectCallbacksPrivate; + +enum +{ + PROP_0, + PROP_DEVICE_INFO, +}; + +typedef enum +{ + GST_ASIO_OBJECT_STATE_LOADED, + GST_ASIO_OBJECT_STATE_INITIALIZED, + GST_ASIO_OBJECT_STATE_PREPARED, + GST_ASIO_OBJECT_STATE_RUNNING, +} GstAsioObjectState; + +/* Protect singleton object */ +struct _GstAsioObject +{ + GstObject parent; + + GstAsioDeviceInfo *device_info; + + GstAsioObjectState state; + + IASIO *asio_handle; + + GThread *thread; + GMutex lock; + GCond cond; + GMainContext *context; + GMainLoop *loop; + + GMutex thread_lock; + GCond thread_cond; + + GMutex api_lock; + + /* called after init() done */ + glong max_num_input_channels; + glong max_num_output_channels; + + glong min_buffer_size; + glong max_buffer_size; + glong preferred_buffer_size; + glong buffer_size_granularity; + + glong selected_buffer_size; + + /* List of supported sample rates */ + GArray *supported_sample_rates; + + /* List of ASIOChannelInfo */ + ASIOChannelInfo *input_channel_infos; + ASIOChannelInfo *output_channel_infos; + + /* Selected sample rate */ + ASIOSampleRate sample_rate; + + /* Input/Output buffer infos */ + ASIOBufferInfo *buffer_infos; + + /* Store requested channels before createbuffer */ + gboolean *input_channel_requested; + gboolean *output_channel_requested; + + glong num_requested_input_channels; + glong num_requested_output_channels; + guint num_allocated_buffers; + + GList *src_client_callbacks; + GList *sink_client_callbacks; + GList *loopback_client_callbacks; + guint64 next_callback_id; + + GstAsioCallbacks 
*callbacks; + ASIOCallbacks driver_callbacks; + int slot_id; + + gboolean occupy_all_channels; +}; + +static void gst_asio_object_constructed (GObject * object); +static void gst_asio_object_finalize (GObject * object); +static void gst_asio_object_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); + +static gpointer gst_asio_object_thread_func (GstAsioObject * self); + +#define gst_asio_object_parent_class parent_class +G_DEFINE_TYPE (GstAsioObject, gst_asio_object, GST_TYPE_OBJECT); + +static void +gst_asio_object_class_init (GstAsioObjectClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->constructed = gst_asio_object_constructed; + gobject_class->finalize = gst_asio_object_finalize; + gobject_class->set_property = gst_asio_object_set_property; + + g_object_class_install_property (gobject_class, PROP_DEVICE_INFO, + g_param_spec_pointer ("device-info", "Device Info", + "A pointer to GstAsioDeviceInfo struct", + (GParamFlags) (G_PARAM_WRITABLE | G_PARAM_CONSTRUCT_ONLY | + G_PARAM_STATIC_STRINGS))); + + GST_DEBUG_CATEGORY_INIT (gst_asio_object_debug, + "asioobject", 0, "asioobject"); +} + +static void +gst_asio_object_init (GstAsioObject * self) +{ + g_mutex_init (&self->lock); + g_cond_init (&self->cond); + + g_mutex_init (&self->thread_lock); + g_cond_init (&self->thread_cond); + + g_mutex_init (&self->api_lock); + + self->supported_sample_rates = g_array_new (FALSE, + FALSE, sizeof (ASIOSampleRate)); + + self->slot_id = -1; +} + +static void +gst_asio_object_constructed (GObject * object) +{ + GstAsioObject *self = GST_ASIO_OBJECT (object); + + if (!self->device_info) { + GST_ERROR_OBJECT (self, "Device info was not configured"); + return; + } + + self->context = g_main_context_new (); + self->loop = g_main_loop_new (self->context, FALSE); + + g_mutex_lock (&self->lock); + self->thread = g_thread_new ("GstAsioObject", + (GThreadFunc) gst_asio_object_thread_func, self); + while 
(!g_main_loop_is_running (self->loop)) + g_cond_wait (&self->cond, &self->lock); + g_mutex_unlock (&self->lock); +} + +static void +gst_asio_object_finalize (GObject * object) +{ + GstAsioObject *self = GST_ASIO_OBJECT (object); + + if (self->loop) { + g_main_loop_quit (self->loop); + g_thread_join (self->thread); + g_main_loop_unref (self->loop); + g_main_context_unref (self->context); + } + + g_mutex_clear (&self->lock); + g_cond_clear (&self->cond); + + g_mutex_clear (&self->thread_lock); + g_cond_clear (&self->thread_cond); + + g_mutex_clear (&self->api_lock); + + g_array_unref (self->supported_sample_rates); + + gst_asio_device_info_free (self->device_info); + g_free (self->input_channel_infos); + g_free (self->output_channel_infos); + g_free (self->input_channel_requested); + g_free (self->output_channel_requested); + + if (self->src_client_callbacks) + g_list_free_full (self->src_client_callbacks, (GDestroyNotify) g_free); + if (self->sink_client_callbacks) + g_list_free_full (self->sink_client_callbacks, (GDestroyNotify) g_free); + if (self->loopback_client_callbacks) + g_list_free_full (self->loopback_client_callbacks, (GDestroyNotify) g_free); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_asio_object_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstAsioObject *self = GST_ASIO_OBJECT (object); + + switch (prop_id) { + case PROP_DEVICE_INFO: + g_clear_pointer (&self->device_info, gst_asio_device_info_free); + self->device_info = gst_asio_device_info_copy ((GstAsioDeviceInfo *) + g_value_get_pointer (value)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static HWND +gst_asio_object_create_internal_hwnd (GstAsioObject * self) +{ + WNDCLASSEXW wc; + ATOM atom = 0; + HINSTANCE hinstance = GetModuleHandle (NULL); + + atom = GetClassInfoExW (hinstance, L"GstAsioInternalWindow", &wc); + if (atom == 0) { + GST_LOG_OBJECT (self, 
"Register internal window class"); + ZeroMemory (&wc, sizeof (WNDCLASSEX)); + + wc.cbSize = sizeof (WNDCLASSEX); + wc.lpfnWndProc = DefWindowProc; + wc.hInstance = GetModuleHandle (nullptr); + wc.style = CS_OWNDC; + wc.lpszClassName = L"GstAsioInternalWindow"; + + atom = RegisterClassExW (&wc); + + if (atom == 0) { + GST_ERROR_OBJECT (self, "Failed to register window class 0x%x", + (unsigned int) GetLastError ()); + return nullptr; + } + } + + return CreateWindowExW (0, L"GstAsioInternalWindow", L"GstAsioInternal", + WS_POPUP, 0, 0, 1, 1, nullptr, nullptr, GetModuleHandle (nullptr), + nullptr); +} + +static gboolean +hwnd_msg_cb (GIOChannel * source, GIOCondition condition, gpointer data) +{ + MSG msg; + + if (!PeekMessage (&msg, NULL, 0, 0, PM_REMOVE)) + return G_SOURCE_CONTINUE; + + TranslateMessage (&msg); + DispatchMessage (&msg); + + return G_SOURCE_CONTINUE; +} + +static gboolean +gst_asio_object_main_loop_running_cb (GstAsioObject * self) +{ + GST_INFO_OBJECT (self, "Main loop running now"); + + g_mutex_lock (&self->lock); + g_cond_signal (&self->cond); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +static gboolean +gst_asio_object_bind_callbacks (GstAsioObject * self) +{ + std::lock_guard < std::mutex > lk (slot_lock); + gboolean ret = TRUE; + + if (!cb_slot_0.IsBound ()) { + cb_slot_0.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 0; + } else if (!cb_slot_1.IsBound ()) { + cb_slot_1.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 1; + } else if (!cb_slot_2.IsBound ()) { + cb_slot_2.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 2; + } else if (!cb_slot_3.IsBound ()) { + cb_slot_3.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 3; + } else if (!cb_slot_4.IsBound ()) { + cb_slot_4.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 4; + } else if (!cb_slot_5.IsBound ()) { + cb_slot_5.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id 
= 5; + } else if (!cb_slot_6.IsBound ()) { + cb_slot_6.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 6; + } else if (!cb_slot_7.IsBound ()) { + cb_slot_7.Bind (self->callbacks, &self->driver_callbacks); + self->slot_id = 7; + } else { + self->slot_id = -1; + ret = FALSE; + } + + return ret; +} + +static void +gst_asio_object_unbind_callbacks (GstAsioObject * self) +{ + std::lock_guard < std::mutex > lk (slot_lock); + + if (!self->callbacks || self->slot_id < 0) + return; + + switch (self->slot_id) { + case 0: + cb_slot_0.Init (); + break; + case 1: + cb_slot_1.Init (); + break; + case 2: + cb_slot_2.Init (); + break; + case 3: + cb_slot_3.Init (); + break; + case 4: + cb_slot_4.Init (); + break; + case 5: + cb_slot_5.Init (); + break; + case 6: + cb_slot_6.Init (); + break; + case 7: + cb_slot_7.Init (); + break; + default: + g_assert_not_reached (); + break; + } + + return; +} + +static gpointer +gst_asio_object_thread_func (GstAsioObject * self) +{ + HANDLE avrt_handle = nullptr; + static DWORD task_idx = 0; + HWND hwnd; + GSource *source = nullptr; + GSource *hwnd_msg_source = nullptr; + GIOChannel *msg_io_channel = nullptr; + HRESULT hr; + ASIOError asio_rst; + IASIO *asio_handle = nullptr; + GstAsioDeviceInfo *device_info = self->device_info; + /* FIXME: check more sample rate */ + static ASIOSampleRate sample_rate_to_check[] = { + 48000.0, 44100.0, 192000.0, 96000.0, 88200.0, + }; + + g_assert (device_info); + + GST_INFO_OBJECT (self, + "Enter loop, ThreadingModel: %s, driver-name: %s, driver-desc: %s", + device_info->sta_model ? "STA" : "MTA", + GST_STR_NULL (device_info->driver_name), + GST_STR_NULL (device_info->driver_desc)); + + if (device_info->sta_model) + CoInitializeEx (NULL, COINIT_APARTMENTTHREADED); + else + CoInitializeEx (NULL, COINIT_MULTITHREADED); + + /* Our thread is unlikely different from driver's working thread though, + * let's do this. 
It should not cause any problem */ + avrt_handle = AvSetMmThreadCharacteristicsW (L"Pro Audio", &task_idx); + g_main_context_push_thread_default (self->context); + + source = g_idle_source_new (); + g_source_set_callback (source, + (GSourceFunc) gst_asio_object_main_loop_running_cb, self, nullptr); + g_source_attach (source, self->context); + g_source_unref (source); + + /* XXX: not sure why ASIO API wants Windows handle for init(). + * Possibly it might be used for STA COM threading + * but it's undocumented... */ + hwnd = gst_asio_object_create_internal_hwnd (self); + if (!hwnd) + goto run_loop; + + hr = CoCreateInstance (device_info->clsid, nullptr, CLSCTX_INPROC_SERVER, + device_info->clsid, (gpointer *) & asio_handle); + if (FAILED (hr)) { + GST_WARNING_OBJECT (self, "Failed to create IASIO instance, hr: 0x%x", + (guint) hr); + goto run_loop; + } + + if (!asio_handle->init (hwnd)) { + GST_WARNING_OBJECT (self, "Failed to init IASIO instance"); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + /* Query device information */ + asio_rst = asio_handle->getChannels (&self->max_num_input_channels, + &self->max_num_output_channels); + if (asio_rst != 0) { + GST_WARNING_OBJECT (self, "Failed to query in/out channels, ret %ld", + asio_rst); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + GST_INFO_OBJECT (self, "Input/Output channels: %ld/%ld", + self->max_num_input_channels, self->max_num_output_channels); + + asio_rst = asio_handle->getBufferSize (&self->min_buffer_size, + &self->max_buffer_size, &self->preferred_buffer_size, + &self->buffer_size_granularity); + if (asio_rst != 0) { + GST_WARNING_OBJECT (self, "Failed to get buffer size, ret %ld", asio_rst); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + /* Use preferred buffer size by default */ + self->selected_buffer_size = self->preferred_buffer_size; + + GST_INFO_OBJECT (self, "min-buffer-size %ld, max-buffer-size %ld, " + 
"preferred-buffer-size %ld, buffer-size-granularity %ld", + self->min_buffer_size, self->max_buffer_size, + self->preferred_buffer_size, self->buffer_size_granularity); + + for (guint i = 0; i < G_N_ELEMENTS (sample_rate_to_check); i++) { + asio_rst = asio_handle->canSampleRate (sample_rate_to_check[i]); + if (asio_rst != 0) + continue; + + GST_INFO_OBJECT (self, "SampleRate %.1lf is supported", + sample_rate_to_check[i]); + g_array_append_val (self->supported_sample_rates, sample_rate_to_check[i]); + } + + if (self->supported_sample_rates->len == 0) { + GST_WARNING_OBJECT (self, "Failed to query supported sample rate"); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + /* Pick the first supported samplerate */ + self->sample_rate = + g_array_index (self->supported_sample_rates, ASIOSampleRate, 0); + if (asio_handle->setSampleRate (self->sample_rate) != 0) { + GST_WARNING_OBJECT (self, "Failed to set samplerate %.1lf", + self->sample_rate); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + if (self->max_num_input_channels > 0) { + self->input_channel_infos = g_new0 (ASIOChannelInfo, + self->max_num_input_channels); + for (glong i = 0; i < self->max_num_input_channels; i++) { + ASIOChannelInfo *info = &self->input_channel_infos[i]; + info->channel = i; + info->isInput = TRUE; + + asio_rst = asio_handle->getChannelInfo (info); + if (asio_rst != 0) { + GST_WARNING_OBJECT (self, "Failed to %ld input channel info, ret %ld", + i, asio_rst); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + GST_INFO_OBJECT (self, + "InputChannelInfo %ld: isActive %s, channelGroup %ld, " + "ASIOSampleType %ld, name %s", i, info->isActive ? 
"true" : "false", + info->channelGroup, info->type, GST_STR_NULL (info->name)); + } + + self->input_channel_requested = + g_new0 (gboolean, self->max_num_input_channels); + } + + if (self->max_num_output_channels > 0) { + self->output_channel_infos = g_new0 (ASIOChannelInfo, + self->max_num_output_channels); + for (glong i = 0; i < self->max_num_output_channels; i++) { + ASIOChannelInfo *info = &self->output_channel_infos[i]; + info->channel = i; + info->isInput = FALSE; + + asio_rst = asio_handle->getChannelInfo (info); + if (asio_rst != 0) { + GST_WARNING_OBJECT (self, "Failed to %ld output channel info, ret %ld", + i, asio_rst); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + GST_INFO_OBJECT (self, + "OutputChannelInfo %ld: isActive %s, channelGroup %ld, " + "ASIOSampleType %ld, name %s", i, info->isActive ? "true" : "false", + info->channelGroup, info->type, GST_STR_NULL (info->name)); + } + + self->output_channel_requested = + g_new0 (gboolean, self->max_num_input_channels); + } + + asio_rst = asio_handle->getSampleRate (&self->sample_rate); + if (asio_rst != 0) { + GST_WARNING_OBJECT (self, + "Failed to get current samplerate, ret %ld", asio_rst); + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + GST_INFO_OBJECT (self, "Current samplerate %.1lf", self->sample_rate); + + self->callbacks = new GstAsioCallbacks (self); + if (!gst_asio_object_bind_callbacks (self)) { + GST_ERROR_OBJECT (self, "Failed to bind callback to slot"); + delete self->callbacks; + self->callbacks = nullptr; + + asio_handle->Release (); + asio_handle = nullptr; + goto run_loop; + } + + msg_io_channel = g_io_channel_win32_new_messages ((guintptr) hwnd); + hwnd_msg_source = g_io_create_watch (msg_io_channel, G_IO_IN); + g_source_set_callback (hwnd_msg_source, (GSourceFunc) hwnd_msg_cb, + self->context, nullptr); + g_source_attach (hwnd_msg_source, self->context); + + self->state = GST_ASIO_OBJECT_STATE_INITIALIZED; + self->asio_handle = 
asio_handle; + +run_loop: + g_main_loop_run (self->loop); + + if (self->asio_handle) { + if (self->state > GST_ASIO_OBJECT_STATE_PREPARED) + self->asio_handle->stop (); + + if (self->state > GST_ASIO_OBJECT_STATE_INITIALIZED) + self->asio_handle->disposeBuffers (); + } + + gst_asio_object_unbind_callbacks (self); + if (self->callbacks) { + delete self->callbacks; + self->callbacks = nullptr; + } + + if (hwnd_msg_source) { + g_source_destroy (hwnd_msg_source); + g_source_unref (hwnd_msg_source); + } + + if (msg_io_channel) + g_io_channel_unref (msg_io_channel); + + if (hwnd) + DestroyWindow (hwnd); + + g_main_context_pop_thread_default (self->context); + + if (avrt_handle) + AvRevertMmThreadCharacteristics (avrt_handle); + + if (asio_handle) { + asio_handle->Release (); + asio_handle = nullptr; + } + + CoUninitialize (); + + GST_INFO_OBJECT (self, "Exit loop"); + + return nullptr; +} + +static void +gst_asio_object_weak_ref_notify (gpointer data, GstAsioObject * object) +{ + std::lock_guard < std::mutex > lk (global_lock); + asio_object_list = g_list_remove (asio_object_list, object); +} + +GstAsioObject * +gst_asio_object_new (const GstAsioDeviceInfo * info, + gboolean occupy_all_channels) +{ + GstAsioObject *self = nullptr; + GList *iter; + std::lock_guard < std::mutex > lk (global_lock); + + g_return_val_if_fail (info != nullptr, nullptr); + + /* Check if we have object corresponding to CLSID, and if so return + * already existing object instead of allocating new one */ + for (iter = asio_object_list; iter; iter = g_list_next (iter)) { + GstAsioObject *object = (GstAsioObject *) iter->data; + + if (object->device_info->clsid == info->clsid) { + GST_DEBUG_OBJECT (object, "Found configured ASIO object"); + self = (GstAsioObject *) gst_object_ref (object); + break; + } + } + + if (self) + return self; + + self = (GstAsioObject *) g_object_new (GST_TYPE_ASIO_OBJECT, + "device-info", info, nullptr); + + if (!self->asio_handle) { + GST_WARNING_OBJECT (self, "ASIO 
handle is not available"); + gst_object_unref (self); + + return nullptr; + } + + self->occupy_all_channels = occupy_all_channels; + + gst_object_ref_sink (self); + + g_object_weak_ref (G_OBJECT (self), + (GWeakNotify) gst_asio_object_weak_ref_notify, nullptr); + asio_object_list = g_list_append (asio_object_list, self); + + return self; +} + +static GstCaps * +gst_asio_object_create_caps_from_channel_info (GstAsioObject * self, + ASIOChannelInfo * info, guint min_num_channels, guint max_num_channels) +{ + GstCaps *caps; + std::string caps_str; + GstAudioFormat fmt; + const gchar *fmt_str; + + g_assert (info); + g_assert (max_num_channels >= min_num_channels); + + fmt = gst_asio_sample_type_to_gst (info->type); + if (fmt == GST_AUDIO_FORMAT_UNKNOWN) { + GST_ERROR_OBJECT (self, "Unknown format"); + return nullptr; + } + + fmt_str = gst_audio_format_to_string (fmt); + + /* Actually we are non-interleaved, but element will interleave data */ + caps_str = "audio/x-raw, layout = (string) interleaved, "; + caps_str += "format = (string) " + std::string (fmt_str) + ", "; + /* use fixated sample rate, otherwise get_caps/set_sample_rate() might + * be racy in case that multiple sink/src are used */ + caps_str += + "rate = (int) " + std::to_string ((gint) self->sample_rate) + ", "; + + if (max_num_channels == min_num_channels) + caps_str += "channels = (int) " + std::to_string (max_num_channels); + else + caps_str += "channels = (int) [ " + std::to_string (min_num_channels) + + ", " + std::to_string (max_num_channels) + " ]"; + + caps = gst_caps_from_string (caps_str.c_str ()); + if (!caps) { + GST_ERROR_OBJECT (self, "Failed to create caps"); + return nullptr; + } + + GST_DEBUG_OBJECT (self, "Create caps %" GST_PTR_FORMAT, caps); + + return caps; +} + +/* FIXME: assuming all channels have the same format but it might not be true? 
*/ +GstCaps * +gst_asio_object_get_caps (GstAsioObject * obj, GstAsioDeviceClassType type, + guint min_num_channels, guint max_num_channels) +{ + ASIOChannelInfo *infos; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), nullptr); + + if (type == GST_ASIO_DEVICE_CLASS_CAPTURE) { + if (obj->max_num_input_channels == 0) { + GST_WARNING_OBJECT (obj, "Device doesn't support input"); + return nullptr; + } + + /* max_num_channels == 0 means [1, max-allowed-channels] */ + if (max_num_channels > 0) { + if (max_num_channels > obj->max_num_input_channels) { + GST_WARNING_OBJECT (obj, "Too many max channels"); + return nullptr; + } + } else { + max_num_channels = obj->max_num_input_channels; + } + + if (min_num_channels > 0) { + if (min_num_channels > obj->max_num_input_channels) { + GST_WARNING_OBJECT (obj, "Too many min channels"); + return nullptr; + } + } else { + min_num_channels = 1; + } + + infos = obj->input_channel_infos; + } else { + if (obj->max_num_output_channels == 0) { + GST_WARNING_OBJECT (obj, "Device doesn't support output"); + return nullptr; + } + + /* max_num_channels == 0 means [1, max-allowed-channels] */ + if (max_num_channels > 0) { + if (max_num_channels > obj->max_num_output_channels) { + GST_WARNING_OBJECT (obj, "Too many max channels"); + return nullptr; + } + } else { + max_num_channels = obj->max_num_output_channels; + } + + if (min_num_channels > 0) { + if (min_num_channels > obj->max_num_output_channels) { + GST_WARNING_OBJECT (obj, "Too many min channels"); + return nullptr; + } + } else { + min_num_channels = 1; + } + + infos = obj->output_channel_infos; + } + + return gst_asio_object_create_caps_from_channel_info (obj, + infos, min_num_channels, max_num_channels); +} + +gboolean +gst_asio_object_get_max_num_channels (GstAsioObject * obj, glong * num_input_ch, + glong * num_output_ch) +{ + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + + if (num_input_ch) + *num_input_ch = obj->max_num_input_channels; + if (num_output_ch) + 
*num_output_ch = obj->max_num_output_channels; + + return TRUE; +} + +gboolean +gst_asio_object_get_buffer_size (GstAsioObject * obj, glong * min_size, + glong * max_size, glong * preferred_size, glong * granularity) +{ + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + + if (min_size) + *min_size = obj->min_buffer_size; + if (max_size) + *max_size = obj->max_buffer_size; + if (preferred_size) + *preferred_size = obj->preferred_buffer_size; + if (granularity) + *granularity = obj->buffer_size_granularity; + + return TRUE; +} + +typedef void (*GstAsioObjectThreadFunc) (GstAsioObject * obj, gpointer data); + +typedef struct +{ + GstAsioObject *self; + GstAsioObjectThreadFunc func; + gpointer data; + gboolean fired; +} GstAsioObjectThreadRunData; + +static gboolean +gst_asio_object_thread_run_func (GstAsioObjectThreadRunData * data) +{ + GstAsioObject *self = data->self; + + if (data->func) + data->func (self, data->data); + + g_mutex_lock (&self->thread_lock); + data->fired = TRUE; + g_cond_broadcast (&self->thread_cond); + g_mutex_unlock (&self->thread_lock); + + return G_SOURCE_REMOVE; +} + +static void +gst_asio_object_thread_add (GstAsioObject * self, GstAsioObjectThreadFunc func, + gpointer data) +{ + GstAsioObjectThreadRunData thread_data; + + g_return_if_fail (GST_IS_ASIO_OBJECT (self)); + + thread_data.self = self; + thread_data.func = func; + thread_data.data = data; + thread_data.fired = FALSE; + + g_main_context_invoke (self->context, + (GSourceFunc) gst_asio_object_thread_run_func, &thread_data); + + g_mutex_lock (&self->thread_lock); + while (!thread_data.fired) + g_cond_wait (&self->thread_cond, &self->thread_lock); + g_mutex_unlock (&self->thread_lock); +} + +static gboolean +gst_asio_object_validate_channels (GstAsioObject * self, gboolean is_input, + guint * channel_indices, guint num_channels) +{ + if (is_input) { + if (self->max_num_input_channels < num_channels) { + GST_WARNING_OBJECT (self, "%d exceeds max input channels %ld", + 
num_channels, self->max_num_input_channels); + return FALSE; + } + + for (guint i = 0; i < num_channels; i++) { + guint ch = channel_indices[i]; + if (self->max_num_input_channels <= ch) { + GST_WARNING_OBJECT (self, "%d exceeds max input channels %ld", + ch, self->max_num_input_channels); + + return FALSE; + } + } + } else { + if (self->max_num_output_channels < num_channels) { + GST_WARNING_OBJECT (self, "%d exceeds max output channels %ld", + num_channels, self->max_num_output_channels); + + return FALSE; + } + + for (guint i = 0; i < num_channels; i++) { + guint ch = channel_indices[i]; + if (self->max_num_output_channels <= ch) { + GST_WARNING_OBJECT (self, "%d exceeds max output channels %ld", + ch, self->max_num_output_channels); + + return FALSE; + } + } + } + + return TRUE; +} + +static gboolean +gst_asio_object_check_buffer_reuse (GstAsioObject * self, ASIOBool is_input, + guint * channel_indices, guint num_channels) +{ + guint num_found = 0; + + g_assert (self->buffer_infos); + g_assert (self->num_allocated_buffers > 0); + + for (guint i = 0; i < self->num_allocated_buffers; i++) { + ASIOBufferInfo *info = &self->buffer_infos[i]; + + if (info->isInput != is_input) + continue; + + for (guint j = 0; j < num_channels; j++) { + if (info->channelNum == channel_indices[j]) { + num_found++; + + break; + } + } + } + + return num_found == num_channels; +} + +static void +gst_asio_object_dispose_buffers_async (GstAsioObject * self, ASIOError * rst) +{ + g_assert (self->asio_handle); + g_assert (rst); + + *rst = self->asio_handle->disposeBuffers (); +} + +static gboolean +gst_asio_object_dispose_buffers (GstAsioObject * self) +{ + ASIOError rst; + g_assert (self->asio_handle); + + if (!self->buffer_infos) + return TRUE; + + if (!self->device_info->sta_model) { + rst = self->asio_handle->disposeBuffers (); + } else { + gst_asio_object_thread_add (self, + (GstAsioObjectThreadFunc) gst_asio_object_dispose_buffers_async, &rst); + } + + g_clear_pointer 
(&self->buffer_infos, g_free); + self->num_allocated_buffers = 0; + + return rst == 0; +} + +static ASIOError +gst_asio_object_create_buffers_real (GstAsioObject * self, glong * buffer_size) +{ + ASIOError err; + + g_assert (buffer_size); + + err = self->asio_handle->createBuffers (self->buffer_infos, + self->num_requested_input_channels + self->num_requested_output_channels, + *buffer_size, &self->driver_callbacks); + + /* It failed and buffer size is not equal to preferred size, + * try again with preferred size */ + if (err != 0 && *buffer_size != self->preferred_buffer_size) { + GST_WARNING_OBJECT (self, + "Failed to create buffer with buffer size %ld, try again with %ld", + *buffer_size, self->preferred_buffer_size); + + err = self->asio_handle->createBuffers (self->buffer_infos, + self->num_requested_input_channels + + self->num_requested_output_channels, self->preferred_buffer_size, + &self->driver_callbacks); + + if (!err) { + *buffer_size = self->preferred_buffer_size; + } + } + + return err; +} + +typedef struct +{ + glong buffer_size; + ASIOError err; +} CreateBuffersAsyncData; + +static void +gst_asio_object_create_buffers_async (GstAsioObject * self, + CreateBuffersAsyncData * data) +{ + data->err = gst_asio_object_create_buffers_real (self, &data->buffer_size); +} + +static gboolean +gst_asio_object_create_buffers_internal (GstAsioObject * self, + glong * buffer_size) +{ + ASIOError err; + g_assert (self->asio_handle); + + if (!self->device_info->sta_model) { + err = gst_asio_object_create_buffers_real (self, buffer_size); + } else { + CreateBuffersAsyncData data; + data.buffer_size = *buffer_size; + + gst_asio_object_thread_add (self, + (GstAsioObjectThreadFunc) gst_asio_object_create_buffers_async, &data); + + err = data.err; + *buffer_size = data.buffer_size; + } + + return !err; +} + +gboolean +gst_asio_object_create_buffers (GstAsioObject * obj, + GstAsioDeviceClassType type, + guint * channel_indices, guint num_channels, guint * buffer_size) +{ 
+ gboolean can_reuse = FALSE; + guint i, j; + glong buf_size; + glong prev_buf_size = 0; + gboolean is_src; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + g_return_val_if_fail (channel_indices != nullptr, FALSE); + g_return_val_if_fail (num_channels > 0, FALSE); + + GST_DEBUG_OBJECT (obj, "Create buffers"); + + if (type == GST_ASIO_DEVICE_CLASS_CAPTURE) + is_src = TRUE; + else + is_src = FALSE; + + g_mutex_lock (&obj->api_lock); + if (!gst_asio_object_validate_channels (obj, is_src, channel_indices, + num_channels)) { + GST_ERROR_OBJECT (obj, "Invalid request"); + g_mutex_unlock (&obj->api_lock); + + return FALSE; + } + + if (obj->buffer_infos) { + GST_DEBUG_OBJECT (obj, + "Have configured buffer infos, checking whether we can reuse them"); + can_reuse = gst_asio_object_check_buffer_reuse (obj, + is_src ? TRUE : FALSE, channel_indices, num_channels); + } + + if (can_reuse) { + GST_DEBUG_OBJECT (obj, "We can reuse already allocated buffers"); + if (buffer_size) + *buffer_size = obj->selected_buffer_size; + + g_mutex_unlock (&obj->api_lock); + + return TRUE; + } + + /* Cannot re-allocate buffers once started... 
*/ + if (obj->state > GST_ASIO_OBJECT_STATE_PREPARED) { + GST_WARNING_OBJECT (obj, "We are running already"); + g_mutex_unlock (&obj->api_lock); + + return FALSE; + } + + /* Use already configured buffer size */ + if (obj->buffer_infos) + prev_buf_size = obj->selected_buffer_size; + + /* If we have configured buffers, dispose and re-allocate */ + if (!gst_asio_object_dispose_buffers (obj)) { + GST_ERROR_OBJECT (obj, "Failed to dispose buffers"); + + obj->state = GST_ASIO_OBJECT_STATE_INITIALIZED; + + g_mutex_unlock (&obj->api_lock); + return FALSE; + } + + if (obj->occupy_all_channels) { + GST_INFO_OBJECT (obj, + "occupy-all-channels mode, will allocate buffers for all channels"); + /* In this case, we will allocate buffer for all available input/output + * channels, regardless of what was requested here */ + for (guint i = 0; i < (guint) obj->max_num_input_channels; i++) + obj->input_channel_requested[i] = TRUE; + for (guint i = 0; i < (guint) obj->max_num_output_channels; i++) + obj->output_channel_requested[i] = TRUE; + + obj->num_requested_input_channels = obj->max_num_input_channels; + obj->num_requested_output_channels = obj->max_num_output_channels; + } else { + if (is_src) { + for (guint i = 0; i < num_channels; i++) { + guint ch = channel_indices[i]; + + obj->input_channel_requested[ch] = TRUE; + } + + obj->num_requested_input_channels = 0; + for (guint i = 0; i < obj->max_num_input_channels; i++) { + if (obj->input_channel_requested[i]) + obj->num_requested_input_channels++; + } + } else { + for (guint i = 0; i < num_channels; i++) { + guint ch = channel_indices[i]; + + obj->output_channel_requested[ch] = TRUE; + } + + obj->num_requested_output_channels = 0; + for (guint i = 0; i < obj->max_num_output_channels; i++) { + if (obj->output_channel_requested[i]) + obj->num_requested_output_channels++; + } + } + } + + obj->num_allocated_buffers = obj->num_requested_input_channels + + obj->num_requested_output_channels; + + obj->buffer_infos = g_new0 
(ASIOBufferInfo, obj->num_allocated_buffers); + for (i = 0, j = 0; i < obj->num_requested_input_channels; i++) { + ASIOBufferInfo *info = &obj->buffer_infos[i]; + + info->isInput = TRUE; + while (!obj->input_channel_requested[j]) + j++; + + info->channelNum = j; + j++; + } + + for (i = obj->num_requested_input_channels, j = 0; + i < + obj->num_requested_input_channels + obj->num_requested_output_channels; + i++) { + ASIOBufferInfo *info = &obj->buffer_infos[i]; + + info->isInput = FALSE; + while (!obj->output_channel_requested[j]) + j++; + + info->channelNum = j; + j++; + } + + if (prev_buf_size > 0) { + buf_size = prev_buf_size; + } else if (buffer_size && *buffer_size > 0) { + buf_size = *buffer_size; + } else { + buf_size = obj->preferred_buffer_size; + } + + GST_INFO_OBJECT (obj, "Creating buffer with size %ld", buf_size); + + if (!gst_asio_object_create_buffers_internal (obj, &buf_size)) { + GST_ERROR_OBJECT (obj, "Failed to create buffers"); + g_clear_pointer (&obj->buffer_infos, g_free); + obj->num_allocated_buffers = 0; + + obj->state = GST_ASIO_OBJECT_STATE_INITIALIZED; + + g_mutex_unlock (&obj->api_lock); + + return FALSE; + } + + GST_INFO_OBJECT (obj, "Selected buffer size %ld", buf_size); + + obj->selected_buffer_size = buf_size; + if (buffer_size) + *buffer_size = buf_size; + + obj->state = GST_ASIO_OBJECT_STATE_PREPARED; + + g_mutex_unlock (&obj->api_lock); + + return TRUE; +} + +typedef struct +{ + glong arg[4]; + ASIOError ret; +} RunAsyncData; + +static void +gst_asio_object_get_latencies_async (GstAsioObject * self, RunAsyncData * data) +{ + data->ret = self->asio_handle->getLatencies (&data->arg[0], &data->arg[1]); +} + +gboolean +gst_asio_object_get_latencies (GstAsioObject * obj, glong * input_latency, + glong * output_latency) +{ + RunAsyncData data = { 0 }; + ASIOError err; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + g_assert (obj->asio_handle); + + if (!obj->device_info->sta_model) { + err = obj->asio_handle->getLatencies 
(input_latency, output_latency); + } else { + gst_asio_object_thread_add (obj, + (GstAsioObjectThreadFunc) gst_asio_object_get_latencies_async, &data); + + *input_latency = data.arg[0]; + *output_latency = data.arg[1]; + err = data.ret; + } + + return !err; +} + +typedef struct +{ + ASIOSampleRate sample_rate; + ASIOError err; +} SampleRateAsyncData; + +static void +gst_asio_object_can_sample_rate_async (GstAsioObject * self, + SampleRateAsyncData * data) +{ + data->err = self->asio_handle->canSampleRate (data->sample_rate); +} + +gboolean +gst_asio_object_can_sample_rate (GstAsioObject * obj, + ASIOSampleRate sample_rate) +{ + SampleRateAsyncData data = { 0 }; + ASIOError err = 0; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + g_assert (obj->asio_handle); + + g_mutex_lock (&obj->api_lock); + for (guint i = 0; i < obj->supported_sample_rates->len; i++) { + ASIOSampleRate val = g_array_index (obj->supported_sample_rates, + ASIOSampleRate, i); + if (val == sample_rate) { + g_mutex_unlock (&obj->api_lock); + return TRUE; + } + } + + if (!obj->device_info->sta_model) { + err = obj->asio_handle->canSampleRate (sample_rate); + + if (!err) + g_array_append_val (obj->supported_sample_rates, sample_rate); + + g_mutex_unlock (&obj->api_lock); + return !err; + } + + data.sample_rate = sample_rate; + gst_asio_object_thread_add (obj, + (GstAsioObjectThreadFunc) gst_asio_object_can_sample_rate_async, &data); + + if (!data.err) + g_array_append_val (obj->supported_sample_rates, sample_rate); + + g_mutex_unlock (&obj->api_lock); + + return !data.err; +} + +gboolean +gst_asio_object_get_sample_rate (GstAsioObject * obj, + ASIOSampleRate * sample_rate) +{ + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + + *sample_rate = obj->sample_rate; + + return TRUE; +} + +static void +gst_asio_object_set_sample_rate_async (GstAsioObject * self, + SampleRateAsyncData * data) +{ + data->err = self->asio_handle->setSampleRate (data->sample_rate); + if (!data->err) + 
self->sample_rate = data->sample_rate; +} + +gboolean +gst_asio_object_set_sample_rate (GstAsioObject * obj, + ASIOSampleRate sample_rate) +{ + SampleRateAsyncData data = { 0 }; + ASIOError err = 0; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + g_assert (obj->asio_handle); + + g_mutex_lock (&obj->api_lock); + if (sample_rate == obj->sample_rate) { + g_mutex_unlock (&obj->api_lock); + return TRUE; + } + + if (!obj->device_info->sta_model) { + err = obj->asio_handle->setSampleRate (sample_rate); + if (!err) + obj->sample_rate = sample_rate; + + g_mutex_unlock (&obj->api_lock); + return !err; + } + + data.sample_rate = sample_rate; + gst_asio_object_thread_add (obj, + (GstAsioObjectThreadFunc) gst_asio_object_set_sample_rate_async, &data); + g_mutex_unlock (&obj->api_lock); + + return !data.err; +} + +static void +gst_asio_object_buffer_switch (GstAsioObject * self, + glong index, ASIOBool process_now) +{ + ASIOTime time_info; + ASIOTime *our_time_info = nullptr; + ASIOError err = 0; + + memset (&time_info, 0, sizeof (ASIOTime)); + + err = + self->asio_handle->getSamplePosition (&time_info.timeInfo.samplePosition, + &time_info.timeInfo.systemTime); + if (!err) + our_time_info = &time_info; + + gst_asio_object_buffer_switch_time_info (self, + our_time_info, index, process_now); +} + +static void +gst_asio_object_sample_rate_changed (GstAsioObject * self, ASIOSampleRate rate) +{ + GST_INFO_OBJECT (self, "SampleRate changed to %lf", rate); +} + +static glong +gst_asio_object_messages (GstAsioObject * self, + glong selector, glong value, gpointer message, gdouble * opt) +{ + GST_DEBUG_OBJECT (self, "ASIO message: %ld, %ld", selector, value); + + switch (selector) { + case kAsioSelectorSupported: + if (value == kAsioResetRequest || value == kAsioEngineVersion || + value == kAsioResyncRequest || value == kAsioLatenciesChanged || + value == kAsioSupportsTimeCode || value == kAsioSupportsInputMonitor) + return 0; + else if (value == kAsioSupportsTimeInfo) + 
return 1; + GST_WARNING_OBJECT (self, "Unsupported ASIO selector: %li", value); + break; + case kAsioBufferSizeChange: + GST_WARNING_OBJECT (self, + "Unsupported ASIO message: kAsioBufferSizeChange"); + break; + case kAsioResetRequest: + GST_WARNING_OBJECT (self, "Unsupported ASIO message: kAsioResetRequest"); + break; + case kAsioResyncRequest: + GST_WARNING_OBJECT (self, "Unsupported ASIO message: kAsioResyncRequest"); + break; + case kAsioLatenciesChanged: + GST_WARNING_OBJECT (self, + "Unsupported ASIO message: kAsioLatenciesChanged"); + break; + case kAsioEngineVersion: + /* We target the ASIO v2 API, which includes ASIOOutputReady() */ + return 2; + case kAsioSupportsTimeInfo: + /* We use the new time info buffer switch callback */ + return 1; + case kAsioSupportsTimeCode: + /* We don't use the time code info right now */ + return 0; + default: + GST_WARNING_OBJECT (self, "Unsupported ASIO message: %li, %li", selector, + value); + break; + } + + return 0; +} + +#define PACK_ASIO_64(v) ((v).lo | ((guint64)((v).hi) << 32)) + +static ASIOTime * +gst_asio_object_buffer_switch_time_info (GstAsioObject * self, + ASIOTime * time_info, glong index, ASIOBool process_now) +{ + GList *iter; + + if (time_info) { + guint64 pos; + guint64 system_time; + + pos = PACK_ASIO_64 (time_info->timeInfo.samplePosition); + system_time = PACK_ASIO_64 (time_info->timeInfo.systemTime); + + GST_TRACE_OBJECT (self, "Sample Position: %" G_GUINT64_FORMAT + ", System Time: %" GST_TIME_FORMAT, pos, GST_TIME_ARGS (system_time)); + } + + g_mutex_lock (&self->api_lock); + if (!self->src_client_callbacks && !self->sink_client_callbacks && + !self->loopback_client_callbacks) { + GST_WARNING_OBJECT (self, "No installed client callback"); + goto out; + } + + for (iter = self->src_client_callbacks; iter;) { + GstAsioObjectCallbacksPrivate *cb = + (GstAsioObjectCallbacksPrivate *) iter->data; + gboolean ret; + + ret = cb->callbacks.buffer_switch (self, index, self->buffer_infos, + 
self->num_allocated_buffers, self->input_channel_infos, + self->output_channel_infos, self->sample_rate, + self->selected_buffer_size, time_info, cb->callbacks.user_data); + if (!ret) { + GST_INFO_OBJECT (self, "Remove callback for id %" G_GUINT64_FORMAT, + cb->callback_id); + GList *to_remove = iter; + iter = g_list_next (iter); + + self->src_client_callbacks = + g_list_remove_link (self->src_client_callbacks, to_remove); + g_free (to_remove->data); + g_list_free (to_remove); + continue; + } + + iter = g_list_next (iter); + } + + for (iter = self->sink_client_callbacks; iter;) { + GstAsioObjectCallbacksPrivate *cb = + (GstAsioObjectCallbacksPrivate *) iter->data; + gboolean ret; + + ret = cb->callbacks.buffer_switch (self, index, self->buffer_infos, + self->num_allocated_buffers, self->input_channel_infos, + self->output_channel_infos, self->sample_rate, + self->selected_buffer_size, time_info, cb->callbacks.user_data); + if (!ret) { + GST_INFO_OBJECT (self, "Remove callback for id %" G_GUINT64_FORMAT, + cb->callback_id); + GList *to_remove = iter; + iter = g_list_next (iter); + + self->sink_client_callbacks = + g_list_remove_link (self->sink_client_callbacks, to_remove); + g_free (to_remove->data); + g_list_free (to_remove); + continue; + } + + iter = g_list_next (iter); + } + + for (iter = self->loopback_client_callbacks; iter;) { + GstAsioObjectCallbacksPrivate *cb = + (GstAsioObjectCallbacksPrivate *) iter->data; + gboolean ret; + + ret = cb->callbacks.buffer_switch (self, index, self->buffer_infos, + self->num_allocated_buffers, self->input_channel_infos, + self->output_channel_infos, self->sample_rate, + self->selected_buffer_size, time_info, cb->callbacks.user_data); + if (!ret) { + GST_INFO_OBJECT (self, "Remove callback for id %" G_GUINT64_FORMAT, + cb->callback_id); + GList *to_remove = iter; + iter = g_list_next (iter); + + self->loopback_client_callbacks = + g_list_remove_link (self->loopback_client_callbacks, to_remove); + g_free (to_remove->data); + g_list_free (to_remove); + continue; + } + + iter = g_list_next (iter); + } + + self->asio_handle->outputReady (); + +out: + g_mutex_unlock (&self->api_lock); + + return nullptr; +} + +static void +gst_asio_object_start_async (GstAsioObject * self, ASIOError * rst) +{ + *rst = self->asio_handle->start (); +} + +gboolean 
+gst_asio_object_start (GstAsioObject * obj) +{ + ASIOError ret; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + + g_mutex_lock (&obj->api_lock); + if (obj->state > GST_ASIO_OBJECT_STATE_PREPARED) { + GST_DEBUG_OBJECT (obj, "We are running already"); + g_mutex_unlock (&obj->api_lock); + + return TRUE; + } else if (obj->state < GST_ASIO_OBJECT_STATE_PREPARED) { + GST_ERROR_OBJECT (obj, "We are not prepared"); + g_mutex_unlock (&obj->api_lock); + + return FALSE; + } + + /* Then start */ + if (!obj->device_info->sta_model) { + ret = obj->asio_handle->start (); + } else { + gst_asio_object_thread_add (obj, + (GstAsioObjectThreadFunc) gst_asio_object_start_async, &ret); + } + + if (ret != 0) { + GST_ERROR_OBJECT (obj, "Failed to start object"); + g_mutex_unlock (&obj->api_lock); + + return FALSE; + } + + obj->state = GST_ASIO_OBJECT_STATE_RUNNING; + g_mutex_unlock (&obj->api_lock); + + return TRUE; +} + +gboolean +gst_asio_object_install_callback (GstAsioObject * obj, + GstAsioDeviceClassType type, + GstAsioObjectCallbacks * callbacks, guint64 * callback_id) +{ + GstAsioObjectCallbacksPrivate *cb; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (obj), FALSE); + g_return_val_if_fail (callbacks != nullptr, FALSE); + g_return_val_if_fail (callback_id != nullptr, FALSE); + + g_mutex_lock (&obj->api_lock); + cb = g_new0 (GstAsioObjectCallbacksPrivate, 1); + cb->callbacks = *callbacks; + cb->callback_id = obj->next_callback_id; + + switch (type) { + case GST_ASIO_DEVICE_CLASS_CAPTURE: + obj->src_client_callbacks = g_list_append (obj->src_client_callbacks, cb); + break; + case GST_ASIO_DEVICE_CLASS_RENDER: + obj->sink_client_callbacks = + g_list_append (obj->sink_client_callbacks, cb); + break; + case GST_ASIO_DEVICE_CLASS_LOOPBACK_CAPTURE: + obj->loopback_client_callbacks = + g_list_append (obj->loopback_client_callbacks, cb); + break; + default: + g_assert_not_reached (); + g_free (cb); + return FALSE; + } + + *callback_id = cb->callback_id; + g_mutex_unlock 
(&obj->api_lock); + + return TRUE; +} + +void +gst_asio_object_uninstall_callback (GstAsioObject * obj, guint64 callback_id) +{ + GList *iter; + + g_return_if_fail (GST_IS_ASIO_OBJECT (obj)); + + g_mutex_lock (&obj->api_lock); + + GST_DEBUG_OBJECT (obj, "Removing callback id %" G_GUINT64_FORMAT, + callback_id); + + for (iter = obj->src_client_callbacks; iter; iter = g_list_next (iter)) { + GstAsioObjectCallbacksPrivate *cb = + (GstAsioObjectCallbacksPrivate *) iter->data; + + if (cb->callback_id != callback_id) + continue; + + GST_DEBUG_OBJECT (obj, "Found src callback for id %" G_GUINT64_FORMAT, + callback_id); + + obj->src_client_callbacks = + g_list_remove_link (obj->src_client_callbacks, iter); + g_free (iter->data); + g_list_free (iter); + g_mutex_unlock (&obj->api_lock); + + return; + } + + for (iter = obj->sink_client_callbacks; iter; iter = g_list_next (iter)) { + GstAsioObjectCallbacksPrivate *cb = + (GstAsioObjectCallbacksPrivate *) iter->data; + + if (cb->callback_id != callback_id) + continue; + + GST_DEBUG_OBJECT (obj, "Found sink callback for id %" G_GUINT64_FORMAT, + callback_id); + + obj->sink_client_callbacks = + g_list_remove_link (obj->sink_client_callbacks, iter); + g_free (iter->data); + g_list_free (iter); + g_mutex_unlock (&obj->api_lock); + + return; + } + + for (iter = obj->loopback_client_callbacks; iter; iter = g_list_next (iter)) { + GstAsioObjectCallbacksPrivate *cb = + (GstAsioObjectCallbacksPrivate *) iter->data; + + if (cb->callback_id != callback_id) + continue; + + GST_DEBUG_OBJECT (obj, "Found loopback callback for id %" G_GUINT64_FORMAT, + callback_id); + + obj->loopback_client_callbacks = + g_list_remove_link (obj->loopback_client_callbacks, iter); + g_free (iter->data); + g_list_free (iter); + break; + } + + g_mutex_unlock (&obj->api_lock); +}
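The callback-registry code in gstasioobject.cpp above, notably gst_asio_object_uninstall_callback, hinges on one idiom: unlink the matching node from the list before freeing it, so the rest of the list stays intact. A minimal, GLib-free sketch of that idiom (the Node type and function names are illustrative, not part of the GStreamer API; a plain C singly linked list stands in for GList):

```c
#include <stdint.h>
#include <stdlib.h>
#include <stddef.h>

/* Minimal stand-in for the callback registry: a singly linked list of
 * callback-id nodes. */
typedef struct Node {
  uint64_t callback_id;
  struct Node *next;
} Node;

static Node *node_prepend (Node *head, uint64_t id)
{
  Node *n = (Node *) malloc (sizeof (Node));
  n->callback_id = id;
  n->next = head;
  return n;
}

/* Remove the node carrying @id, returning the (possibly new) head.
 * Unlink first, free afterwards: freeing a still-linked node would
 * leave a dangling pointer in the list. */
static Node *list_remove_id (Node *head, uint64_t id)
{
  Node **link = &head;

  while (*link) {
    if ((*link)->callback_id == id) {
      Node *to_remove = *link;
      *link = to_remove->next;  /* unlink */
      free (to_remove);         /* then free */
      break;
    }
    link = &(*link)->next;
  }
  return head;
}

static size_t list_length (const Node *head)
{
  size_t n = 0;
  for (; head; head = head->next)
    n++;
  return n;
}
```

The pointer-to-pointer walk removes the head and interior nodes with the same code path, which is why no special "found in src list, return early" casing is needed in the sketch.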
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasioobject.h
Added
@@ -0,0 +1,104 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ASIO_OBJECT_H__ +#define __GST_ASIO_OBJECT_H__ + +#include <gst/gst.h> +#include <windows.h> +#include "gstasioutils.h" + +G_BEGIN_DECLS + +#define GST_TYPE_ASIO_OBJECT (gst_asio_object_get_type()) +G_DECLARE_FINAL_TYPE (GstAsioObject, gst_asio_object, + GST, ASIO_OBJECT, GstObject); + +typedef struct { + gboolean (*buffer_switch) (GstAsioObject * obj, + glong index, + ASIOBufferInfo * infos, + guint num_infos, + ASIOChannelInfo * input_channel_infos, + ASIOChannelInfo * output_channel_infos, + ASIOSampleRate sample_rate, + glong buffer_size, + ASIOTime * time_info, + gpointer user_data); + + gpointer user_data; +} GstAsioObjectCallbacks; + +typedef enum +{ + GST_ASIO_DEVICE_CLASS_CAPTURE, + GST_ASIO_DEVICE_CLASS_RENDER, + GST_ASIO_DEVICE_CLASS_LOOPBACK_CAPTURE, +} GstAsioDeviceClassType; + +GstAsioObject * gst_asio_object_new (const GstAsioDeviceInfo * info, + gboolean occupy_all_channels); + +GstCaps * gst_asio_object_get_caps (GstAsioObject * obj, + GstAsioDeviceClassType type, + guint num_min_channels, + guint num_max_channels); + +gboolean gst_asio_object_create_buffers (GstAsioObject * obj, + 
GstAsioDeviceClassType type, + guint * channel_indices, + guint num_channels, + guint * buffer_size); + +gboolean gst_asio_object_start (GstAsioObject * obj); + +gboolean gst_asio_object_install_callback (GstAsioObject * obj, + GstAsioDeviceClassType type, + GstAsioObjectCallbacks * callbacks, + guint64 * callback_id); + +void gst_asio_object_uninstall_callback (GstAsioObject * obj, + guint64 callback_id); + +gboolean gst_asio_object_get_max_num_channels (GstAsioObject * obj, + glong * num_input_ch, + glong * num_output_ch); + +gboolean gst_asio_object_get_buffer_size (GstAsioObject * obj, + glong * min_size, + glong * max_size, + glong * preferred_size, + glong * granularity); + +gboolean gst_asio_object_get_latencies (GstAsioObject * obj, + glong * input_latency, + glong * output_latency); + +gboolean gst_asio_object_can_sample_rate (GstAsioObject * obj, + ASIOSampleRate sample_rate); + +gboolean gst_asio_object_get_sample_rate (GstAsioObject * obj, + ASIOSampleRate * sample_rate); + +gboolean gst_asio_object_set_sample_rate (GstAsioObject * obj, + ASIOSampleRate sample_rate); + +G_END_DECLS + +#endif /* __GST_ASIO_OBJECT_H__ */ \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasioringbuffer.cpp
Added
@@ -0,0 +1,473 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + + +#include "gstasioringbuffer.h" +#include <string.h> +#include "gstasioutils.h" +#include "gstasioobject.h" + +GST_DEBUG_CATEGORY_STATIC (gst_asio_ring_buffer_debug); +#define GST_CAT_DEFAULT gst_asio_ring_buffer_debug + +struct _GstAsioRingBuffer +{ + GstAudioRingBuffer parent; + + GstAsioDeviceClassType type; + + GstAsioObject *asio_object; + guint *channel_indices; + guint num_channels; + ASIOBufferInfo **infos; + + guint64 callback_id; + gboolean callback_installed; + + gboolean running; + guint buffer_size; + + /* Used to detect sample gap */ + gboolean is_first; + guint64 expected_sample_position; + gboolean trace_sample_position; +}; + +enum +{ + PROP_0, + PROP_DEVICE_INFO, +}; + +static void gst_asio_ring_buffer_dispose (GObject * object); + +static gboolean gst_asio_ring_buffer_open_device (GstAudioRingBuffer * buf); +static gboolean gst_asio_ring_buffer_close_device (GstAudioRingBuffer * buf); +static gboolean gst_asio_ring_buffer_acquire (GstAudioRingBuffer * buf, + GstAudioRingBufferSpec * spec); +static gboolean gst_asio_ring_buffer_release 
(GstAudioRingBuffer * buf); +static gboolean gst_asio_ring_buffer_start (GstAudioRingBuffer * buf); +static gboolean gst_asio_ring_buffer_stop (GstAudioRingBuffer * buf); +static guint gst_asio_ring_buffer_delay (GstAudioRingBuffer * buf); + +static gboolean gst_asio_buffer_switch_cb (GstAsioObject * obj, + glong index, ASIOBufferInfo * infos, guint num_infos, + ASIOChannelInfo * input_channel_infos, + ASIOChannelInfo * output_channel_infos, + ASIOSampleRate sample_rate, glong buffer_size, + ASIOTime * time_info, gpointer user_data); + +#define gst_asio_ring_buffer_parent_class parent_class +G_DEFINE_TYPE (GstAsioRingBuffer, gst_asio_ring_buffer, + GST_TYPE_AUDIO_RING_BUFFER); + +static void +gst_asio_ring_buffer_class_init (GstAsioRingBufferClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstAudioRingBufferClass *ring_buffer_class = + GST_AUDIO_RING_BUFFER_CLASS (klass); + + gobject_class->dispose = gst_asio_ring_buffer_dispose; + + ring_buffer_class->open_device = + GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_open_device); + ring_buffer_class->close_device = + GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_close_device); + ring_buffer_class->acquire = GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_acquire); + ring_buffer_class->release = GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_release); + ring_buffer_class->start = GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_start); + ring_buffer_class->resume = GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_start); + ring_buffer_class->stop = GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_stop); + ring_buffer_class->delay = GST_DEBUG_FUNCPTR (gst_asio_ring_buffer_delay); + + GST_DEBUG_CATEGORY_INIT (gst_asio_ring_buffer_debug, + "asioringbuffer", 0, "asioringbuffer"); +} + +static void +gst_asio_ring_buffer_init (GstAsioRingBuffer * self) +{ +} + +static void +gst_asio_ring_buffer_dispose (GObject * object) +{ + GstAsioRingBuffer *self = GST_ASIO_RING_BUFFER (object); + + gst_clear_object (&self->asio_object); + g_clear_pointer (&self->channel_indices, g_free); 
+ + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static gboolean +gst_asio_ring_buffer_open_device (GstAudioRingBuffer * buf) +{ + GstAsioRingBuffer *self = GST_ASIO_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (self, "Open"); + + return TRUE; +} + +static gboolean +gst_asio_ring_buffer_close_device (GstAudioRingBuffer * buf) +{ + GstAsioRingBuffer *self = GST_ASIO_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (self, "Close"); + + return TRUE; +} + +#define PACK_ASIO_64(v) ((v).lo | ((guint64)((v).hi) << 32)) + +static gboolean +gst_asio_buffer_switch_cb (GstAsioObject * obj, glong index, + ASIOBufferInfo * infos, guint num_infos, + ASIOChannelInfo * input_channel_infos, + ASIOChannelInfo * output_channel_infos, + ASIOSampleRate sample_rate, glong buffer_size, + ASIOTime * time_info, gpointer user_data) +{ + GstAsioRingBuffer *self = (GstAsioRingBuffer *) user_data; + GstAudioRingBuffer *ringbuffer = GST_AUDIO_RING_BUFFER_CAST (self); + gint segment; + guint8 *readptr; + gint len; + guint i, j; + guint num_channels = 0; + guint bps = GST_AUDIO_INFO_WIDTH (&ringbuffer->spec.info) >> 3; + + g_assert (index == 0 || index == 1); + g_assert (num_infos >= self->num_channels); + + GST_TRACE_OBJECT (self, "Buffer Switch callback, index %ld", index); + + if (!gst_audio_ring_buffer_prepare_read (ringbuffer, + &segment, &readptr, &len)) { + GST_WARNING_OBJECT (self, "No segment available"); + return TRUE; + } + + GST_TRACE_OBJECT (self, "segment %d, length %d", segment, len); + + /* Check missing frames */ + if (self->type == GST_ASIO_DEVICE_CLASS_CAPTURE) { + if (self->is_first) { + if (time_info) { + self->expected_sample_position = + PACK_ASIO_64 (time_info->timeInfo.samplePosition) + buffer_size; + self->trace_sample_position = TRUE; + } else { + GST_WARNING_OBJECT (self, "ASIOTime is not available"); + self->trace_sample_position = FALSE; + } + + self->is_first = FALSE; + } else if (self->trace_sample_position) { + if (!time_info) { + GST_WARNING_OBJECT (self, "ASIOTime 
is not available"); + self->trace_sample_position = FALSE; + } else { + guint64 sample_position = + PACK_ASIO_64 (time_info->timeInfo.samplePosition); + if (self->expected_sample_position < sample_position) { + guint64 gap_frames = sample_position - self->expected_sample_position; + gint gap_size = gap_frames * bps; + + GST_WARNING_OBJECT (self, "%" G_GUINT64_FORMAT " frames are missing", + gap_frames); + + while (gap_size >= len) { + gst_audio_format_info_fill_silence (ringbuffer->spec.info.finfo, + readptr, len); + gst_audio_ring_buffer_advance (ringbuffer, 1); + + gst_audio_ring_buffer_prepare_read (ringbuffer, + &segment, &readptr, &len); + + gap_size -= len; + } + } + + self->expected_sample_position = sample_position + buffer_size; + GST_TRACE_OBJECT (self, "Sample Position %" G_GUINT64_FORMAT + ", next: %" G_GUINT64_FORMAT, sample_position, + self->expected_sample_position); + } + } + } + + /* Given @infos might contain more channel data, pick the channels we want + * to read */ + for (i = 0; i < num_infos; i++) { + ASIOBufferInfo *info = &infos[i]; + + if (self->type == GST_ASIO_DEVICE_CLASS_CAPTURE) { + if (!info->isInput) + continue; + } else { + if (info->isInput) + continue; + } + + for (j = 0; j < self->num_channels; j++) { + if (self->channel_indices[j] != info->channelNum) + continue; + + g_assert (num_channels < self->num_channels); + self->infos[num_channels++] = info; + break; + } + } + + if (num_channels < self->num_channels) { + GST_ERROR_OBJECT (self, "Too few channels %d (expected %d)", + num_channels, self->num_channels); + } else { + if (self->type == GST_ASIO_DEVICE_CLASS_CAPTURE || + self->type == GST_ASIO_DEVICE_CLASS_LOOPBACK_CAPTURE) { + if (num_channels == 1) { + memcpy (readptr, self->infos[0]->buffers[index], len); + } else { + guint gst_offset = 0, asio_offset = 0; + + /* Interleaves audio */ + while (gst_offset < len) { + for (i = 0; i < num_channels; i++) { + ASIOBufferInfo *info = self->infos[i]; + + memcpy (readptr + gst_offset, + 
((guint8 *) info->buffers[index]) + asio_offset, bps); + + gst_offset += bps; + } + asio_offset += bps; + } + } + } else { + if (num_channels == 1) { + memcpy (self->infos[0]->buffers[index], readptr, len); + } else { + guint gst_offset = 0, asio_offset = 0; + + /* Interleaves audio */ + while (gst_offset < len) { + for (i = 0; i < num_channels; i++) { + ASIOBufferInfo *info = self->infos[i]; + + memcpy (((guint8 *) info->buffers[index]) + asio_offset, + readptr + gst_offset, bps); + + gst_offset += bps; + } + asio_offset += bps; + } + } + } + } + + if (self->type == GST_ASIO_DEVICE_CLASS_RENDER) + gst_audio_ring_buffer_clear (ringbuffer, segment); + gst_audio_ring_buffer_advance (ringbuffer, 1); + + return TRUE; +} + +static gboolean +gst_asio_ring_buffer_acquire (GstAudioRingBuffer * buf, + GstAudioRingBufferSpec * spec) +{ + GstAsioRingBuffer *self = GST_ASIO_RING_BUFFER (buf); + + if (!self->asio_object) { + GST_ERROR_OBJECT (self, "No configured ASIO object"); + return FALSE; + } + + if (!self->channel_indices || self->num_channels == 0) { + GST_ERROR_OBJECT (self, "No configured channels"); + return FALSE; + } + + if (!gst_asio_object_set_sample_rate (self->asio_object, + GST_AUDIO_INFO_RATE (&spec->info))) { + GST_ERROR_OBJECT (self, "Failed to set sample rate"); + return FALSE; + } + + spec->segsize = self->buffer_size * + (GST_AUDIO_INFO_WIDTH (&spec->info) >> 3) * + GST_AUDIO_INFO_CHANNELS (&spec->info); + spec->segtotal = 2; + + buf->size = spec->segtotal * spec->segsize; + buf->memory = (guint8 *) g_malloc (buf->size); + gst_audio_format_info_fill_silence (buf->spec.info.finfo, + buf->memory, buf->size); + + return TRUE; +} + +static gboolean +gst_asio_ring_buffer_release (GstAudioRingBuffer * buf) +{ + GST_DEBUG_OBJECT (buf, "Release"); + + g_clear_pointer (&buf->memory, g_free); + + return TRUE; +} + +static gboolean +gst_asio_ring_buffer_start (GstAudioRingBuffer * buf) +{ + GstAsioRingBuffer *self = GST_ASIO_RING_BUFFER (buf); + 
GstAsioObjectCallbacks callbacks; + + GST_DEBUG_OBJECT (buf, "Start"); + + callbacks.buffer_switch = gst_asio_buffer_switch_cb; + callbacks.user_data = self; + + self->is_first = TRUE; + self->expected_sample_position = 0; + + if (!gst_asio_object_install_callback (self->asio_object, self->type, + &callbacks, &self->callback_id)) { + GST_ERROR_OBJECT (self, "Failed to install callback"); + return FALSE; + } + + self->callback_installed = TRUE; + + if (!gst_asio_object_start (self->asio_object)) { + GST_ERROR_OBJECT (self, "Failed to start"); + + gst_asio_ring_buffer_stop (buf); + + return FALSE; + } + + self->running = TRUE; + + return TRUE; +} + +static gboolean +gst_asio_ring_buffer_stop (GstAudioRingBuffer * buf) +{ + GstAsioRingBuffer *self = GST_ASIO_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (buf, "Stop"); + + self->running = FALSE; + + if (!self->asio_object) + return TRUE; + + if (self->callback_installed) + gst_asio_object_uninstall_callback (self->asio_object, self->callback_id); + + self->callback_installed = FALSE; + self->callback_id = 0; + self->is_first = TRUE; + self->expected_sample_position = 0; + + return TRUE; +} + +static guint +gst_asio_ring_buffer_delay (GstAudioRingBuffer * buf) +{ + /* FIXME: impl. 
*/ + + return 0; +} + +GstAsioRingBuffer * +gst_asio_ring_buffer_new (GstAsioObject * object, GstAsioDeviceClassType type, + const gchar * name) +{ + GstAsioRingBuffer *self; + + g_return_val_if_fail (GST_IS_ASIO_OBJECT (object), nullptr); + + self = + (GstAsioRingBuffer *) g_object_new (GST_TYPE_ASIO_RING_BUFFER, + "name", name, nullptr); + g_assert (self); + + self->type = type; + self->asio_object = (GstAsioObject *) gst_object_ref (object); + + return self; +} + +gboolean +gst_asio_ring_buffer_configure (GstAsioRingBuffer * buf, + guint * channel_indices, guint num_channels, guint preferred_buffer_size) +{ + g_return_val_if_fail (GST_IS_ASIO_RING_BUFFER (buf), FALSE); + g_return_val_if_fail (buf->asio_object != nullptr, FALSE); + g_return_val_if_fail (num_channels > 0, FALSE); + + GST_DEBUG_OBJECT (buf, "Configure"); + + buf->buffer_size = preferred_buffer_size; + + if (!gst_asio_object_create_buffers (buf->asio_object, buf->type, + channel_indices, num_channels, &buf->buffer_size)) { + GST_ERROR_OBJECT (buf, "Failed to configure"); + + g_clear_pointer (&buf->channel_indices, g_free); + buf->num_channels = 0; + + return FALSE; + } + + GST_DEBUG_OBJECT (buf, "configured buffer size: %d", buf->buffer_size); + + g_free (buf->channel_indices); + buf->channel_indices = g_new0 (guint, num_channels); + + for (guint i = 0; i < num_channels; i++) + buf->channel_indices[i] = channel_indices[i]; + + buf->num_channels = num_channels; + + g_clear_pointer (&buf->infos, g_free); + buf->infos = g_new0 (ASIOBufferInfo *, num_channels); + + return TRUE; +} + +GstCaps * +gst_asio_ring_buffer_get_caps (GstAsioRingBuffer * buf) +{ + g_return_val_if_fail (GST_IS_ASIO_RING_BUFFER (buf), nullptr); + g_assert (buf->asio_object != nullptr); + + return gst_asio_object_get_caps (buf->asio_object, + buf->type, buf->num_channels, buf->num_channels); +}
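The multi-channel capture path in gst_asio_buffer_switch_cb converts ASIO's planar per-channel buffers into the interleaved frames a GstAudioRingBuffer segment stores. A self-contained sketch of that inner loop (function name and signature are illustrative; `bps` is bytes per sample, and `dst_len` is assumed to be a whole number of frames, i.e. a multiple of `num_channels * bps`):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Interleave planar channel buffers into one frame-interleaved buffer:
 * output layout is ch0[0] ch1[0] ... chN[0] ch0[1] ch1[1] ... */
static void
interleave_channels (uint8_t *dst, size_t dst_len,
    const uint8_t **planar, size_t num_channels, size_t bps)
{
  size_t gst_offset = 0;   /* write position in the interleaved buffer */
  size_t asio_offset = 0;  /* read position within each planar buffer */

  while (gst_offset < dst_len) {
    for (size_t ch = 0; ch < num_channels; ch++) {
      memcpy (dst + gst_offset, planar[ch] + asio_offset, bps);
      gst_offset += bps;
    }
    asio_offset += bps;
  }
}
```

The render path in the same callback is the mirror image: it walks the interleaved segment and scatters samples back into the per-channel ASIO buffers.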
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasioringbuffer.h
Added
@@ -0,0 +1,47 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ASIO_RING_BUFFER_H__ +#define __GST_ASIO_RING_BUFFER_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> +#include "gstasioutils.h" +#include "gstasioobject.h" + +G_BEGIN_DECLS + +#define GST_TYPE_ASIO_RING_BUFFER (gst_asio_ring_buffer_get_type()) +G_DECLARE_FINAL_TYPE (GstAsioRingBuffer, gst_asio_ring_buffer, + GST, ASIO_RING_BUFFER, GstAudioRingBuffer); + +GstAsioRingBuffer * gst_asio_ring_buffer_new (GstAsioObject * object, + GstAsioDeviceClassType type, + const gchar * name); + +gboolean gst_asio_ring_buffer_configure (GstAsioRingBuffer * buf, + guint * channel_indices, + guint num_channels, + guint preferred_buffer_size); + +GstCaps * gst_asio_ring_buffer_get_caps (GstAsioRingBuffer * buf); + +G_END_DECLS + +#endif /* __GST_ASIO_RING_BUFFER_H__ */ \ No newline at end of file
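The sample-gap detection in gstasioringbuffer.cpp rests on two small pieces of arithmetic: PACK_ASIO_64, which joins the lo/hi 32-bit halves of an ASIOSamples-style value into one 64-bit sample position, and the comparison of the driver's reported position against the expected one (previous position plus buffer size). A standalone sketch, with illustrative type and function names standing in for the ASIO SDK types:

```c
#include <stdint.h>

/* Stand-in for an ASIOSamples value split into 32-bit halves. */
typedef struct {
  uint32_t lo;
  uint32_t hi;
} asio_samples;

/* Combine the halves into one 64-bit sample position, as the
 * PACK_ASIO_64 macro does. */
static uint64_t pack_asio_64 (asio_samples v)
{
  return (uint64_t) v.lo | ((uint64_t) v.hi << 32);
}

/* If the reported position has run past what we expected, the
 * difference is the number of frames the driver dropped; those frames
 * must be filled with silence in the ring buffer. */
static uint64_t missing_frames (uint64_t expected, uint64_t reported)
{
  return reported > expected ? reported - expected : 0;
}
```

With a 48 kHz stream and a 480-frame buffer, an expected position of 4800 against a reported position of 4896 indicates 96 dropped frames to back-fill with silence.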
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasiosink.cpp
Added
@@ -0,0 +1,361 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstasiosink.h" +#include "gstasioobject.h" +#include "gstasioringbuffer.h" +#include <atlconv.h> +#include <string.h> +#include <set> + +GST_DEBUG_CATEGORY_STATIC (gst_asio_sink_debug); +#define GST_CAT_DEFAULT gst_asio_sink_debug + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_ASIO_STATIC_CAPS)); + +enum +{ + PROP_0, + PROP_DEVICE_CLSID, + PROP_OUTPUT_CHANNELS, + PROP_BUFFER_SIZE, + PROP_OCCUPY_ALL_CHANNELS, +}; + +#define DEFAULT_BUFFER_SIZE 0 +#define DEFAULT_OCCUPY_ALL_CHANNELS TRUE + +struct _GstAsioSink +{ + GstAudioSink parent; + + /* properties */ + gchar *device_clsid; + gchar *output_channels; + guint buffer_size; + gboolean occupy_all_channels; +}; + +static void gst_asio_sink_finalize (GObject * object); +static void gst_asio_sink_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_asio_sink_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); + +static GstCaps 
*gst_asio_sink_get_caps (GstBaseSink * sink, GstCaps * filter); + +static GstAudioRingBuffer *gst_asio_sink_create_ringbuffer (GstAudioBaseSink * + sink); + +#define gst_asio_sink_parent_class parent_class +G_DEFINE_TYPE (GstAsioSink, gst_asio_sink, GST_TYPE_AUDIO_BASE_SINK); + +static void +gst_asio_sink_class_init (GstAsioSinkClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseSinkClass *basesink_class = GST_BASE_SINK_CLASS (klass); + GstAudioBaseSinkClass *audiobasesink_class = + GST_AUDIO_BASE_SINK_CLASS (klass); + + gobject_class->finalize = gst_asio_sink_finalize; + gobject_class->set_property = gst_asio_sink_set_property; + gobject_class->get_property = gst_asio_sink_get_property; + + g_object_class_install_property (gobject_class, PROP_DEVICE_CLSID, + g_param_spec_string ("device-clsid", "Device CLSID", + "ASIO device CLSID as a string", NULL, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_OUTPUT_CHANNELS, + g_param_spec_string ("output-channels", "Output Channels", + "Comma-separated list of ASIO channels to output", NULL, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_BUFFER_SIZE, + g_param_spec_uint ("buffer-size", "Buffer Size", + "Preferred buffer size (0 for default)", + 0, G_MAXINT32, DEFAULT_BUFFER_SIZE, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_OCCUPY_ALL_CHANNELS, + g_param_spec_boolean ("occupy-all-channels", + "Occupy All Channels", + "When enabled, the ASIO device will allocate resources for all " + "input/output channels", + DEFAULT_OCCUPY_ALL_CHANNELS, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + + 
gst_element_class_add_static_pad_template (element_class, &sink_template); + gst_element_class_set_static_metadata (element_class, "AsioSink", + "Sink/Audio/Hardware", + "Stream audio to an audio device through ASIO", + "Seungha Yang <seungha@centricular.com>"); + + basesink_class->get_caps = GST_DEBUG_FUNCPTR (gst_asio_sink_get_caps); + + audiobasesink_class->create_ringbuffer = + GST_DEBUG_FUNCPTR (gst_asio_sink_create_ringbuffer); + + GST_DEBUG_CATEGORY_INIT (gst_asio_sink_debug, "asiosink", 0, "asiosink"); +} + +static void +gst_asio_sink_init (GstAsioSink * self) +{ + self->buffer_size = DEFAULT_BUFFER_SIZE; + self->occupy_all_channels = DEFAULT_OCCUPY_ALL_CHANNELS; +} + +static void +gst_asio_sink_finalize (GObject * object) +{ + GstAsioSink *self = GST_ASIO_SINK (object); + + g_free (self->device_clsid); + g_free (self->output_channels); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_asio_sink_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstAsioSink *self = GST_ASIO_SINK (object); + + switch (prop_id) { + case PROP_DEVICE_CLSID: + g_free (self->device_clsid); + self->device_clsid = g_value_dup_string (value); + break; + case PROP_OUTPUT_CHANNELS: + g_free (self->output_channels); + self->output_channels = g_value_dup_string (value); + break; + case PROP_BUFFER_SIZE: + self->buffer_size = g_value_get_uint (value); + break; + case PROP_OCCUPY_ALL_CHANNELS: + self->occupy_all_channels = g_value_get_boolean (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_asio_sink_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstAsioSink *self = GST_ASIO_SINK (object); + + switch (prop_id) { + case PROP_DEVICE_CLSID: + g_value_set_string (value, self->device_clsid); + break; + 
case PROP_BUFFER_SIZE: + g_value_set_uint (value, self->buffer_size); + break; + case PROP_OCCUPY_ALL_CHANNELS: + g_value_set_boolean (value, self->occupy_all_channels); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static GstCaps * +gst_asio_sink_get_caps (GstBaseSink * sink, GstCaps * filter) +{ + GstAudioBaseSink *asink = GST_AUDIO_BASE_SINK (sink); + GstAsioSink *self = GST_ASIO_SINK (sink); + GstCaps *caps = nullptr; + + if (asink->ringbuffer) + caps = + gst_asio_ring_buffer_get_caps (GST_ASIO_RING_BUFFER + (asink->ringbuffer)); + + if (!caps) + caps = gst_pad_get_pad_template_caps (GST_BASE_SINK_PAD (sink)); + + if (filter) { + GstCaps *filtered = + gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = filtered; + } + + GST_DEBUG_OBJECT (self, "returning caps %" GST_PTR_FORMAT, caps); + + return caps; +} + +static GstAudioRingBuffer * +gst_asio_sink_create_ringbuffer (GstAudioBaseSink * sink) +{ + GstAsioSink *self = GST_ASIO_SINK (sink); + GstAsioRingBuffer *ringbuffer = nullptr; + HRESULT hr; + CLSID clsid = GUID_NULL; + GList *device_infos = nullptr; + GstAsioDeviceInfo *info = nullptr; + GstAsioObject *asio_object = nullptr; + glong max_input_ch = 0; + glong max_output_ch = 0; + guint *channel_indices = nullptr; + guint num_capture_channels = 0; + std::set < guint > channel_list; + guint i; + gchar *ringbuffer_name; + + USES_CONVERSION; + + GST_DEBUG_OBJECT (self, "Create ringbuffer"); + + if (gst_asio_enum (&device_infos) == 0) { + GST_WARNING_OBJECT (self, "No available ASIO devices"); + return nullptr; + } + + if (self->device_clsid) { + hr = CLSIDFromString (A2COLE (self->device_clsid), &clsid); + if (FAILED (hr)) { + GST_WARNING_OBJECT (self, "Failed to convert %s to CLSID", + self->device_clsid); + clsid = GUID_NULL; + } + } + + /* Pick the first device */ + if (clsid == GUID_NULL) { + info = (GstAsioDeviceInfo *) device_infos->data; + } else { + /* 
Find matching device */ + GList *iter; + for (iter = device_infos; iter; iter = g_list_next (iter)) { + GstAsioDeviceInfo *tmp = (GstAsioDeviceInfo *) iter->data; + if (tmp->clsid == clsid) { + info = tmp; + break; + } + } + } + + if (!info) { + GST_WARNING_OBJECT (self, "Failed to find matching device"); + goto out; + } + + asio_object = gst_asio_object_new (info, self->occupy_all_channels); + if (!asio_object) { + GST_WARNING_OBJECT (self, "Failed to create ASIO object"); + goto out; + } + + /* Configure channels to use */ + if (!gst_asio_object_get_max_num_channels (asio_object, &max_input_ch, + &max_output_ch) || max_input_ch <= 0) { + GST_WARNING_OBJECT (self, "No available input channels"); + goto out; + } + + /* Check if user requested specific channel(s) */ + if (self->output_channels) { + gchar **ch; + + ch = g_strsplit (self->output_channels, ",", 0); + + num_capture_channels = g_strv_length (ch); + if (num_capture_channels > max_input_ch) { + GST_WARNING_OBJECT (self, "Too many channels %d were requested", + num_capture_channels); + } else { + for (i = 0; i < num_capture_channels; i++) { + guint64 c = g_ascii_strtoull (ch[i], nullptr, 0); + if (c >= (guint64) max_input_ch) { + GST_WARNING_OBJECT (self, "Invalid channel index"); + num_capture_channels = 0; + break; + } + + channel_list.insert ((guint) c); + } + } + + g_strfreev (ch); + } + + channel_indices = (guint *) g_alloca (sizeof (guint) * max_input_ch); + if (channel_list.size () == 0) { + for (i = 0; i < max_input_ch; i++) + channel_indices[i] = i; + + num_capture_channels = max_input_ch; + } else { + num_capture_channels = (guint) channel_list.size (); + i = 0; + for (auto iter:channel_list) { + channel_indices[i++] = iter; + } + } + + ringbuffer_name = g_strdup_printf ("%s-asioringbuffer", + GST_OBJECT_NAME (sink)); + + ringbuffer = + (GstAsioRingBuffer *) gst_asio_ring_buffer_new (asio_object, + GST_ASIO_DEVICE_CLASS_RENDER, ringbuffer_name); + g_free (ringbuffer_name); + + if (!ringbuffer) { + 
GST_WARNING_OBJECT (self, "Couldn't create ringbuffer object"); + goto out; + } + + if (!gst_asio_ring_buffer_configure (ringbuffer, channel_indices, + num_capture_channels, self->buffer_size)) { + GST_WARNING_OBJECT (self, "Failed to configure ringbuffer"); + gst_clear_object (&ringbuffer); + goto out; + } + +out: + if (device_infos) + g_list_free_full (device_infos, (GDestroyNotify) gst_asio_device_info_free); + + gst_clear_object (&asio_object); + + return GST_AUDIO_RING_BUFFER_CAST (ringbuffer); +}
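create_ringbuffer parses the comma-separated channel property with g_strsplit and de-duplicates the indices through a std::set before handing them to the ring buffer. The same validation can be sketched in plain C (function name and signature are illustrative; like the code above, one out-of-range index rejects the whole list, which then falls back to all channels; unlike std::set, this sketch de-duplicates but does not sort):

```c
#include <stddef.h>
#include <stdlib.h>
#include <string.h>

/* Turn a comma-separated channel list such as "0,2,2,3" into unique,
 * validated indices.  Returns the number of indices written to @out,
 * or 0 if any entry is >= @max_ch (reject the whole list). */
static size_t
parse_channel_list (const char *prop, unsigned max_ch, unsigned *out)
{
  char buf[256];
  size_t count = 0;

  strncpy (buf, prop, sizeof (buf) - 1);
  buf[sizeof (buf) - 1] = '\0';

  for (char *tok = strtok (buf, ","); tok; tok = strtok (NULL, ",")) {
    unsigned long c = strtoul (tok, NULL, 0);
    size_t i;

    if (c >= max_ch)
      return 0;                 /* invalid index: reject everything */

    for (i = 0; i < count; i++) /* keep indices unique */
      if (out[i] == (unsigned) c)
        break;
    if (i == count)
      out[count++] = (unsigned) c;
  }
  return count;
}
```

A caller would treat a return of 0 the way the element does: ignore the property value and select every available channel instead.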
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasiosink.h
Added
@@ -0,0 +1,34 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ASIO_SINK_H__ +#define __GST_ASIO_SINK_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> + +G_BEGIN_DECLS + +#define GST_TYPE_ASIO_SINK (gst_asio_sink_get_type ()) +G_DECLARE_FINAL_TYPE (GstAsioSink, + gst_asio_sink, GST, ASIO_SINK, GstAudioBaseSink); + +G_END_DECLS + +#endif /* __GST_ASIO_SINK_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasiosrc.cpp
Added
@@ -0,0 +1,375 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstasiosrc.h" +#include "gstasioobject.h" +#include "gstasioringbuffer.h" +#include <atlconv.h> +#include <string.h> +#include <set> + +GST_DEBUG_CATEGORY_STATIC (gst_asio_src_debug); +#define GST_CAT_DEFAULT gst_asio_src_debug + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_ASIO_STATIC_CAPS)); + +enum +{ + PROP_0, + PROP_DEVICE_CLSID, + PROP_CAPTURE_CHANNELS, + PROP_BUFFER_SIZE, + PROP_OCCUPY_ALL_CHANNELS, + PROP_LOOPBACK, +}; + +#define DEFAULT_BUFFER_SIZE 0 +#define DEFAULT_OCCUPY_ALL_CHANNELS TRUE +#define DEFAULT_LOOPBACK FALSE + +struct _GstAsioSrc +{ + GstAudioSrc parent; + + /* properties */ + gchar *device_clsid; + gchar *capture_channles; + guint buffer_size; + gboolean occupy_all_channels; + gboolean loopback; +}; + +static void gst_asio_src_finalize (GObject * object); +static void gst_asio_src_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_asio_src_get_property (GObject * object, guint prop_id, + 
GValue * value, GParamSpec * pspec); + +static GstCaps *gst_asio_src_get_caps (GstBaseSrc * src, GstCaps * filter); + +static GstAudioRingBuffer *gst_asio_src_create_ringbuffer (GstAudioBaseSrc * + src); + +#define gst_asio_src_parent_class parent_class +G_DEFINE_TYPE (GstAsioSrc, gst_asio_src, GST_TYPE_AUDIO_BASE_SRC); + +static void +gst_asio_src_class_init (GstAsioSrcClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseSrcClass *basesrc_class = GST_BASE_SRC_CLASS (klass); + GstAudioBaseSrcClass *audiobasesrc_class = GST_AUDIO_BASE_SRC_CLASS (klass); + + gobject_class->finalize = gst_asio_src_finalize; + gobject_class->set_property = gst_asio_src_set_property; + gobject_class->get_property = gst_asio_src_get_property; + + g_object_class_install_property (gobject_class, PROP_DEVICE_CLSID, + g_param_spec_string ("device-clsid", "Device CLSID", + "ASIO device CLSID as a string", NULL, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_CAPTURE_CHANNELS, + g_param_spec_string ("input-channels", "Input Channels", + "Comma-separated list of ASIO channels to capture", NULL, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_BUFFER_SIZE, + g_param_spec_uint ("buffer-size", "Buffer Size", + "Preferred buffer size (0 for default)", + 0, G_MAXINT32, DEFAULT_BUFFER_SIZE, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_OCCUPY_ALL_CHANNELS, + g_param_spec_boolean ("occupy-all-channels", + "Occupy All Channles", + "When enabled, ASIO device will allocate resources for all in/output " + "channles", + DEFAULT_OCCUPY_ALL_CHANNELS, + (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + 
G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_LOOPBACK, + g_param_spec_boolean ("loopback", "Loopback recording", + "Open the sink device for loopback recording", + DEFAULT_LOOPBACK, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_element_class_add_static_pad_template (element_class, &src_template); + gst_element_class_set_static_metadata (element_class, "AsioSrc", + "Source/Audio/Hardware", + "Stream audio from an audio capture device through ASIO", + "Seungha Yang <seungha@centricular.com>"); + + basesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_asio_src_get_caps); + + audiobasesrc_class->create_ringbuffer = + GST_DEBUG_FUNCPTR (gst_asio_src_create_ringbuffer); + + GST_DEBUG_CATEGORY_INIT (gst_asio_src_debug, "asiosrc", 0, "asiosrc"); +} + +static void +gst_asio_src_init (GstAsioSrc * self) +{ + self->buffer_size = DEFAULT_BUFFER_SIZE; + self->occupy_all_channels = DEFAULT_OCCUPY_ALL_CHANNELS; + self->loopback = DEFAULT_LOOPBACK; +} + +static void +gst_asio_src_finalize (GObject * object) +{ + GstAsioSrc *self = GST_ASIO_SRC (object); + + g_free (self->device_clsid); + g_free (self->capture_channles); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_asio_src_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstAsioSrc *self = GST_ASIO_SRC (object); + + switch (prop_id) { + case PROP_DEVICE_CLSID: + g_free (self->device_clsid); + self->device_clsid = g_value_dup_string (value); + break; + case PROP_CAPTURE_CHANNELS: + g_free (self->capture_channles); + self->capture_channles = g_value_dup_string (value); + break; + case PROP_BUFFER_SIZE: + self->buffer_size = g_value_get_uint (value); + break; + case PROP_OCCUPY_ALL_CHANNELS: + self->occupy_all_channels = g_value_get_boolean (value); + break; + case PROP_LOOPBACK: + self->loopback = g_value_get_boolean (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, 
pspec); + break; + } +} + +static void +gst_asio_src_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstAsioSrc *self = GST_ASIO_SRC (object); + + switch (prop_id) { + case PROP_DEVICE_CLSID: + g_value_set_string (value, self->device_clsid); + break; + case PROP_CAPTURE_CHANNELS: + g_value_set_string (value, self->capture_channles); + break; + case PROP_BUFFER_SIZE: + g_value_set_uint (value, self->buffer_size); + break; + case PROP_OCCUPY_ALL_CHANNELS: + g_value_set_boolean (value, self->occupy_all_channels); + break; + case PROP_LOOPBACK: + g_value_set_boolean (value, self->loopback); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static GstCaps * +gst_asio_src_get_caps (GstBaseSrc * src, GstCaps * filter) +{ + GstAudioBaseSrc *asrc = GST_AUDIO_BASE_SRC (src); + GstAsioSrc *self = GST_ASIO_SRC (src); + GstCaps *caps = nullptr; + + if (asrc->ringbuffer) + caps = + gst_asio_ring_buffer_get_caps (GST_ASIO_RING_BUFFER (asrc->ringbuffer)); + + if (!caps) + caps = gst_pad_get_pad_template_caps (GST_BASE_SRC_PAD (src)); + + if (filter) { + GstCaps *filtered = + gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = filtered; + } + + GST_DEBUG_OBJECT (self, "returning caps %" GST_PTR_FORMAT, caps); + + return caps; +} + +static GstAudioRingBuffer * +gst_asio_src_create_ringbuffer (GstAudioBaseSrc * src) +{ + GstAsioSrc *self = GST_ASIO_SRC (src); + GstAsioRingBuffer *ringbuffer = nullptr; + HRESULT hr; + CLSID clsid = GUID_NULL; + GList *device_infos = nullptr; + GstAsioDeviceInfo *info = nullptr; + GstAsioObject *asio_object = nullptr; + glong max_input_ch = 0; + glong max_output_ch = 0; + guint *channel_indices = nullptr; + guint num_capture_channels = 0; + std::set < guint > channel_list; + guint i; + gchar *ringbuffer_name; + + USES_CONVERSION; + + GST_DEBUG_OBJECT (self, "Create ringbuffer"); + + if (gst_asio_enum 
(&device_infos) == 0) { + GST_WARNING_OBJECT (self, "No available ASIO devices"); + return nullptr; + } + + if (self->device_clsid) { + hr = CLSIDFromString (A2COLE (self->device_clsid), &clsid); + if (FAILED (hr)) { + GST_WARNING_OBJECT (self, "Failed to convert %s to CLSID", + self->device_clsid); + clsid = GUID_NULL; + } + } + + /* Pick the first device */ + if (clsid == GUID_NULL) { + info = (GstAsioDeviceInfo *) device_infos->data; + } else { + /* Find matching device */ + GList *iter; + for (iter = device_infos; iter; iter = g_list_next (iter)) { + GstAsioDeviceInfo *tmp = (GstAsioDeviceInfo *) iter->data; + if (tmp->clsid == clsid) { + info = tmp; + break; + } + } + } + + if (!info) { + GST_WARNING_OBJECT (self, "Failed to find matching device"); + goto out; + } + + asio_object = gst_asio_object_new (info, self->occupy_all_channels); + if (!asio_object) { + GST_WARNING_OBJECT (self, "Failed to create ASIO object"); + goto out; + } + + /* Configure channels to use */ + if (!gst_asio_object_get_max_num_channels (asio_object, &max_input_ch, + &max_output_ch) || max_input_ch <= 0) { + GST_WARNING_OBJECT (self, "No available input channels"); + goto out; + } + + /* Check if user requested specific channel(s) */ + if (self->capture_channles) { + gchar **ch; + + ch = g_strsplit (self->capture_channles, ",", 0); + + num_capture_channels = g_strv_length (ch); + if (num_capture_channels > max_input_ch) { + GST_WARNING_OBJECT (self, "To many channels %d were requested", + num_capture_channels); + } else { + for (i = 0; i < num_capture_channels; i++) { + guint64 c = g_ascii_strtoull (ch[i], nullptr, 0); + if (c >= (guint64) max_input_ch) { + GST_WARNING_OBJECT (self, "Invalid channel index"); + num_capture_channels = 0; + break; + } + + channel_list.insert ((guint) c); + } + } + + g_strfreev (ch); + } + + channel_indices = (guint *) g_alloca (sizeof (guint) * max_input_ch); + if (channel_list.size () == 0) { + for (i = 0; i < max_input_ch; i++) + channel_indices[i] = i; 
+ + num_capture_channels = max_input_ch; + } else { + num_capture_channels = (guint) channel_list.size (); + i = 0; + for (auto iter:channel_list) { + channel_indices[i++] = iter; + } + } + + ringbuffer_name = g_strdup_printf ("%s-asioringbuffer", + GST_OBJECT_NAME (src)); + + ringbuffer = + (GstAsioRingBuffer *) gst_asio_ring_buffer_new (asio_object, + self->loopback ? GST_ASIO_DEVICE_CLASS_LOOPBACK_CAPTURE : + GST_ASIO_DEVICE_CLASS_CAPTURE, ringbuffer_name); + g_free (ringbuffer_name); + + if (!ringbuffer) { + GST_WARNING_OBJECT (self, "Couldn't create ringbuffer object"); + goto out; + } + + if (!gst_asio_ring_buffer_configure (ringbuffer, channel_indices, + num_capture_channels, self->buffer_size)) { + GST_WARNING_OBJECT (self, "Failed to configure ringbuffer"); + gst_clear_object (&ringbuffer); + goto out; + } + +out: + if (device_infos) + g_list_free_full (device_infos, (GDestroyNotify) gst_asio_device_info_free); + + gst_clear_object (&asio_object); + + return GST_AUDIO_RING_BUFFER_CAST (ringbuffer); +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasiosrc.h
Added
@@ -0,0 +1,34 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ASIO_SRC_H__ +#define __GST_ASIO_SRC_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> + +G_BEGIN_DECLS + +#define GST_TYPE_ASIO_SRC (gst_asio_src_get_type ()) +G_DECLARE_FINAL_TYPE (GstAsioSrc, + gst_asio_src, GST, ASIO_SRC, GstAudioBaseSrc); + +G_END_DECLS + +#endif /* __GST_ASIO_SRC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasioutils.cpp
Added
@@ -0,0 +1,282 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstasioutils.h" +#include <windows.h> +#include <string.h> +#include <atlconv.h> + +static gboolean +gst_asio_enum_check_class_root (GstAsioDeviceInfo * info, LPCWSTR clsid) +{ + LSTATUS status; + HKEY root_key = nullptr; + HKEY device_key = nullptr; + HKEY proc_server_key = nullptr; + DWORD type = REG_SZ; + CHAR data[256]; + DWORD size = sizeof (data); + gboolean ret = FALSE; + + status = RegOpenKeyExW (HKEY_CLASSES_ROOT, L"clsid", 0, KEY_READ, &root_key); + if (status != ERROR_SUCCESS) + return FALSE; + + /* Read registry HKEY_CLASS_ROOT/CLSID/{device-clsid} */ + status = RegOpenKeyExW (root_key, clsid, 0, KEY_READ, &device_key); + if (status != ERROR_SUCCESS) + goto done; + + /* ThreadingModel describes COM apartment */ + status = RegOpenKeyExW (device_key, + L"InprocServer32", 0, KEY_READ, &proc_server_key); + if (status != ERROR_SUCCESS) + goto done; + + status = RegQueryValueExA (proc_server_key, + "ThreadingModel", nullptr, &type, (LPBYTE) data, &size); + if (status != ERROR_SUCCESS) + goto done; + + if (g_ascii_strcasecmp (data, "Both") == 0 || + 
g_ascii_strcasecmp (data, "Free") == 0) { + info->sta_model = FALSE; + } else { + info->sta_model = TRUE; + } + + ret = TRUE; + +done: + if (proc_server_key) + RegCloseKey (proc_server_key); + + if (device_key) + RegCloseKey (device_key); + + if (root_key) + RegCloseKey (root_key); + + return ret; +} + +static GstAsioDeviceInfo * +gst_asio_enum_new_device_info_from_reg (HKEY reg_key, LPWSTR key_name) +{ + LSTATUS status; + HKEY sub_key = nullptr; + WCHAR clsid_data[256]; + WCHAR desc_data[256]; + DWORD type = REG_SZ; + DWORD size = sizeof (clsid_data); + GstAsioDeviceInfo *ret = nullptr; + CLSID id; + HRESULT hr; + + USES_CONVERSION; + + status = RegOpenKeyExW (reg_key, key_name, 0, KEY_READ, &sub_key); + if (status != ERROR_SUCCESS) + return nullptr; + + /* find CLSID value, used for CoCreateInstance */ + status = RegQueryValueExW (sub_key, + L"clsid", 0, &type, (LPBYTE) clsid_data, &size); + if (status != ERROR_SUCCESS) + goto done; + + hr = CLSIDFromString (W2COLE (clsid_data), &id); + if (FAILED (hr)) + goto done; + + ret = g_new0 (GstAsioDeviceInfo, 1); + ret->clsid = id; + ret->driver_name = g_utf16_to_utf8 ((gunichar2 *) key_name, -1, + nullptr, nullptr, nullptr); + + /* human readable device description */ + status = RegQueryValueExW (sub_key, + L"description", 0, &type, (LPBYTE) desc_data, &size); + if (status != ERROR_SUCCESS) { + GST_WARNING ("no description"); + ret->driver_desc = g_strdup (ret->driver_name); + } else { + ret->driver_desc = g_utf16_to_utf8 ((gunichar2 *) desc_data, -1, + nullptr, nullptr, nullptr); + } + + /* Check COM threading model */ + if (!gst_asio_enum_check_class_root (ret, clsid_data)) { + gst_asio_device_info_free (ret); + ret = nullptr; + } + +done: + if (sub_key) + RegCloseKey (sub_key); + + return ret; +} + +guint +gst_asio_enum (GList ** infos) +{ + GList *info_list = nullptr; + DWORD index = 0; + guint num_device = 0; + LSTATUS status; + HKEY reg_key = nullptr; + WCHAR key_name[512]; + + g_return_val_if_fail (infos != 
nullptr, 0); + + status = RegOpenKeyExW (HKEY_LOCAL_MACHINE, L"software\\asio", 0, + KEY_READ, &reg_key); + while (status == ERROR_SUCCESS) { + GstAsioDeviceInfo *info; + + status = RegEnumKeyW (reg_key, index, key_name, 512); + if (status != ERROR_SUCCESS) + break; + + index++; + info = gst_asio_enum_new_device_info_from_reg (reg_key, key_name); + if (!info) + continue; + + info_list = g_list_append (info_list, info); + num_device++; + } + + if (reg_key) + RegCloseKey (reg_key); + + *infos = info_list; + + return num_device; +} + +GstAsioDeviceInfo * +gst_asio_device_info_copy (const GstAsioDeviceInfo * info) +{ + GstAsioDeviceInfo *new_info; + + if (!info) + return nullptr; + + new_info = g_new0 (GstAsioDeviceInfo, 1); + + new_info->clsid = info->clsid; + new_info->sta_model = info->sta_model; + new_info->driver_name = g_strdup (info->driver_name); + new_info->driver_desc = g_strdup (info->driver_desc); + + return new_info; +} + +void +gst_asio_device_info_free (GstAsioDeviceInfo * info) +{ + if (!info) + return; + + g_free (info->driver_name); + g_free (info->driver_desc); + + g_free (info); +} + +GstAudioFormat +gst_asio_sample_type_to_gst (ASIOSampleType type) +{ + GstAudioFormat fmt; + + switch (type) { + /*~~ MSB means big endian ~~ */ + case ASIOSTInt16MSB: + fmt = GST_AUDIO_FORMAT_S16BE; + break; + /* FIXME: also used for 20 bits packed in 24 bits, how do we detect that? 
*/ + case ASIOSTInt24MSB: + fmt = GST_AUDIO_FORMAT_S24BE; + break; + case ASIOSTInt32MSB: + fmt = GST_AUDIO_FORMAT_S32BE; + break; + case ASIOSTFloat32MSB: + fmt = GST_AUDIO_FORMAT_F32BE; + break; + case ASIOSTFloat64MSB: + fmt = GST_AUDIO_FORMAT_F64BE; + break; + /* All these are aligned to a different boundary than the packing, not sure + * how to handle it, let's try the normal S32BE format */ + case ASIOSTInt32MSB16: + case ASIOSTInt32MSB18: + case ASIOSTInt32MSB20: + case ASIOSTInt32MSB24: + fmt = GST_AUDIO_FORMAT_S32BE; + break; + + /*~~ LSB means little endian ~~ */ + case ASIOSTInt16LSB: + fmt = GST_AUDIO_FORMAT_S16LE; + break; + /* FIXME: also used for 20 bits packed in 24 bits, how do we detect that? */ + case ASIOSTInt24LSB: + fmt = GST_AUDIO_FORMAT_S24LE; + break; + case ASIOSTInt32LSB: + fmt = GST_AUDIO_FORMAT_S32LE; + break; + case ASIOSTFloat32LSB: + fmt = GST_AUDIO_FORMAT_F32LE; + break; + case ASIOSTFloat64LSB: + fmt = GST_AUDIO_FORMAT_F64LE; + break; + /* All these are aligned to a different boundary than the packing, not sure + * how to handle it, let's try the normal S32LE format */ + case ASIOSTInt32LSB16: + case ASIOSTInt32LSB18: + case ASIOSTInt32LSB20: + case ASIOSTInt32LSB24: + GST_WARNING ("weird alignment %ld, trying S32LE", type); + fmt = GST_AUDIO_FORMAT_S32LE; + break; + + /*~~ ASIO DSD formats are don't have gstreamer mappings ~~ */ + case ASIOSTDSDInt8LSB1: + case ASIOSTDSDInt8MSB1: + case ASIOSTDSDInt8NER8: + GST_ERROR ("ASIO DSD formats are not supported"); + fmt = GST_AUDIO_FORMAT_UNKNOWN; + break; + default: + GST_ERROR ("Unknown asio sample type %ld", type); + fmt = GST_AUDIO_FORMAT_UNKNOWN; + break; + } + + return fmt; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/gstasioutils.h
Added
@@ -0,0 +1,55 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_ASIO_DEVICE_ENUM_H__ +#define __GST_ASIO_DEVICE_ENUM_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> +#include <windows.h> +#include <asiosys.h> +#include <asio.h> + +G_BEGIN_DECLS + +#define GST_ASIO_STATIC_CAPS "audio/x-raw, " \ + "format = (string) " GST_AUDIO_FORMATS_ALL ", " \ + "layout = (string) interleaved, " \ + "rate = " GST_AUDIO_RATE_RANGE ", " \ + "channels = " GST_AUDIO_CHANNELS_RANGE + +typedef struct +{ + CLSID clsid; + gboolean sta_model; + gchar *driver_name; + gchar *driver_desc; +} GstAsioDeviceInfo; + +guint gst_asio_enum (GList ** infos); + +GstAsioDeviceInfo * gst_asio_device_info_copy (const GstAsioDeviceInfo * info); + +void gst_asio_device_info_free (GstAsioDeviceInfo * info); + +GstAudioFormat gst_asio_sample_type_to_gst (ASIOSampleType type); + +G_END_DECLS + +#endif /* __GST_ASIO_DEVICE_ENUM_H__ */ \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/meson.build
Added
@@ -0,0 +1,84 @@ +asio_sources = [ + 'gstasiodeviceprovider.cpp', + 'gstasioobject.cpp', + 'gstasioringbuffer.cpp', + 'gstasiosink.cpp', + 'gstasiosrc.cpp', + 'gstasioutils.cpp', + 'plugin.c', +] + +asio_option = get_option('asio') +if asio_option.disabled() or host_system != 'windows' + subdir_done() +endif + +# FIXME: non-msvc is not tested, and unlikely supported yet because of +# tool-chain issue +if cxx.get_id() != 'msvc' + if asio_option.enabled() + error('asio plugin can only be built with MSVC') + else + subdir_done () + endif +endif + +winapi_desktop = cxx.compiles('''#include <winapifamily.h> + #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP) + #error "not win32" + #endif''', + name: 'building for Win32') + +if not winapi_desktop + if asio_option.enabled() + error('asio plugin requires WINAPI_PARTITION_DESKTOP') + else + subdir_done () + endif +endif + +avrt_lib = cc.find_library('avrt', required: asio_option) +if not avrt_lib.found() + subdir_done () +endif + +winmm_lib = cc.find_library('winmm', required: asio_option) +if not winmm_lib.found() + subdir_done () +endif + +# Checking SDK headers. 
User should install ASIO sdk on system, and +# this plugin requires asio.h, asiosys.h and iasiodrv.h headers +asio_sdk_root = get_option ('asio-sdk-path') +if asio_sdk_root == '' + if asio_option.enabled() + error('asio sdk path is needed, pass with -Dasio-sdk-path=C:/path/to/sdk') + else + subdir_done () + endif +endif + +asio_inc_dir = include_directories(join_paths(asio_sdk_root, 'common'), is_system : true) +has_asio_header = cxx.has_header('asio.h', include_directories: asio_inc_dir) +has_asiosys_header = cxx.has_header('asiosys.h', include_directories: asio_inc_dir) +has_iasiodrv_header = cxx.has_header('iasiodrv.h', include_directories: asio_inc_dir) +if not has_asio_header or not has_asiosys_header or not has_iasiodrv_header + if asio_option.enabled() + error('Failed to find required SDK header(s)') + else + subdir_done () + endif +endif + +asio_deps = [gstaudio_dep, avrt_lib, winmm_lib] + +gstasio = library('gstasio', + asio_sources, + include_directories : [configinc, asio_inc_dir], + dependencies : asio_deps, + c_args : gst_plugins_bad_args, + cpp_args : gst_plugins_bad_args, + install : true, + install_dir : plugins_install_dir) +pkgconfig.generate(gstasio, install_dir : plugins_pkgconfig_install_dir) +plugins += [gstasio] \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/sys/asio/plugin.c
Added
@@ -0,0 +1,48 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstasiodeviceprovider.h" +#include "gstasiosrc.h" +#include "gstasiosink.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + GstRank rank = GST_RANK_SECONDARY; + + if (!gst_element_register (plugin, "asiosrc", rank, GST_TYPE_ASIO_SRC)) + return FALSE; + if (!gst_element_register (plugin, "asiosink", rank, GST_TYPE_ASIO_SINK)) + return FALSE; + if (!gst_device_provider_register (plugin, "asiodeviceprovider", + rank, GST_TYPE_ASIO_DEVICE_PROVIDER)) + return FALSE; + + return TRUE; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + asio, + "Steinberg ASIO plugin", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/a2dp-codecs.h -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/a2dp-codecs.h
Changed
@@ -111,6 +111,26 @@ #define AAC_CHANNELS_1 0x02 #define AAC_CHANNELS_2 0x01 +#define SONY_VENDOR_ID 0x0000012d +#define LDAC_CODEC_ID 0x00aa + +#define LDAC_SAMPLING_FREQ_44100 0x20 +#define LDAC_SAMPLING_FREQ_48000 0x10 +#define LDAC_SAMPLING_FREQ_88200 0x08 +#define LDAC_SAMPLING_FREQ_96000 0x04 + +#define LDAC_CHANNEL_MODE_MONO 0x04 +#define LDAC_CHANNEL_MODE_DUAL 0x02 +#define LDAC_CHANNEL_MODE_STEREO 0x01 + +#define A2DP_GET_VENDOR_ID(a) ( \ + (((uint32_t)(a).vendor_id[0]) << 0) | \ + (((uint32_t)(a).vendor_id[1]) << 8) | \ + (((uint32_t)(a).vendor_id[2]) << 16) | \ + (((uint32_t)(a).vendor_id[3]) << 24) \ + ) +#define A2DP_GET_CODEC_ID(a) ((a).codec_id[0] | (((uint16_t)(a).codec_id[1]) << 8)) + #if G_BYTE_ORDER == G_LITTLE_ENDIAN typedef struct { @@ -182,4 +202,10 @@ uint8_t codec_id[2]; } __attribute__ ((packed)) a2dp_vendor_codec_t; +typedef struct { + a2dp_vendor_codec_t info; + uint8_t frequency; + uint8_t channel_mode; +} __attribute__ ((packed)) a2dp_ldac_t; + #endif /* #define __GST_BLUEZ_A2DP_CODECS_H_INCLUDED__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/bluez-plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/bluez-plugin.c
Changed
@@ -22,26 +22,23 @@ #include <config.h> #endif +#include "gstbluezelements.h" #include "gsta2dpsink.h" #include "gstavdtpsink.h" #include "gstavdtpsrc.h" #include <string.h> -GST_DEBUG_CATEGORY (avdtp_debug); static gboolean plugin_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (avdtp_debug, "avdtp", 0, "avdtp utils"); + gboolean ret = FALSE; - gst_element_register (plugin, "a2dpsink", GST_RANK_NONE, GST_TYPE_A2DP_SINK); + ret |= GST_ELEMENT_REGISTER (a2dpsink, plugin); + ret |= GST_ELEMENT_REGISTER (avdtpsink, plugin); + ret |= GST_ELEMENT_REGISTER (avdtpsrc, plugin); - gst_element_register (plugin, "avdtpsink", - GST_RANK_NONE, GST_TYPE_AVDTP_SINK); - - gst_element_register (plugin, "avdtpsrc", GST_RANK_NONE, GST_TYPE_AVDTP_SRC); - - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gsta2dpsink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gsta2dpsink.c
Changed
@@ -27,6 +27,7 @@ #include <unistd.h> +#include "gstbluezelements.h" #include "gsta2dpsink.h" #include <gst/rtp/gstrtpbasepayload.h> @@ -48,6 +49,8 @@ #define parent_class gst_a2dp_sink_parent_class G_DEFINE_TYPE (GstA2dpSink, gst_a2dp_sink, GST_TYPE_BIN); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (a2dpsink, "a2dpsink", GST_RANK_NONE, + GST_TYPE_A2DP_SINK, bluez_element_init (plugin)); static GstStaticPadTemplate gst_a2dp_sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, @@ -58,7 +61,8 @@ "blocks = (int) { 4, 8, 12, 16 }, " "subbands = (int) { 4, 8 }, " "allocation-method = (string) { snr, loudness }, " - "bitpool = (int) [ 2, " TEMPLATE_MAX_BITPOOL_STR " ]; " "audio/mpeg")); + "bitpool = (int) [ 2, " TEMPLATE_MAX_BITPOOL_STR " ]; " + "audio/mpeg; " "audio/x-ldac")); static gboolean gst_a2dp_sink_handle_event (GstPad * pad, GstObject * pad_parent, GstEvent * event); @@ -95,6 +99,7 @@ GST_ERROR_OBJECT (self, "Failed to set target for ghost pad"); goto remove_element_and_fail; } + gst_object_unref (sinkpad); if (!gst_element_sync_state_with_parent (element)) { GST_DEBUG_OBJECT (self, "%s failed to go to playing", elementname); @@ -104,6 +109,7 @@ return element; remove_element_and_fail: + gst_object_unref (sinkpad); gst_element_set_state (element, GST_STATE_NULL); gst_bin_remove (GST_BIN (self), element); return NULL; @@ -316,8 +322,8 @@ GstCaps *caps = NULL; if (self->sink != NULL) { - GST_LOG_OBJECT (self, "Getting device caps"); caps = gst_a2dp_sink_get_device_caps (self); + GST_LOG_OBJECT (self, "Got device caps %" GST_PTR_FORMAT, caps); } if (!caps) @@ -405,12 +411,35 @@ } static gboolean +gst_a2dp_sink_init_rtp_ldac_element (GstA2dpSink * self) +{ + GstElement *rtppay; + + /* check if we don't need a new rtp */ + if (self->rtp) + return TRUE; + + GST_LOG_OBJECT (self, "Initializing rtp ldac element"); + + rtppay = gst_a2dp_sink_init_element (self, "rtpldacpay", "rtp"); + if (rtppay == NULL) + return FALSE; + + self->rtp = rtppay; + + 
gst_element_set_state (rtppay, GST_STATE_PAUSED); + + return TRUE; +} + +static gboolean gst_a2dp_sink_init_dynamic_elements (GstA2dpSink * self, GstCaps * caps) { GstStructure *structure; GstEvent *event; gboolean crc; gchar *mode = NULL; + guint mtu; structure = gst_caps_get_structure (caps, 0); @@ -423,13 +452,17 @@ GST_LOG_OBJECT (self, "mp3 media received"); if (!gst_a2dp_sink_init_rtp_mpeg_element (self)) return FALSE; + } else if (gst_structure_has_name (structure, "audio/x-ldac")) { + GST_LOG_OBJECT (self, "ldac media received"); + if (!gst_a2dp_sink_init_rtp_ldac_element (self)) + return FALSE; } else { GST_ERROR_OBJECT (self, "Unexpected media type"); return FALSE; } if (!gst_element_link (GST_ELEMENT (self->rtp), GST_ELEMENT (self->sink))) { - GST_ERROR_OBJECT (self, "couldn't link rtpsbcpay " "to avdtpsink"); + GST_ERROR_OBJECT (self, "couldn't link rtp payloader to avdtpsink"); return FALSE; } @@ -452,8 +485,9 @@ g_free (mode); } - g_object_set (self->rtp, "mtu", - gst_avdtp_sink_get_link_mtu (self->sink), NULL); + mtu = gst_avdtp_sink_get_link_mtu (self->sink); + GST_INFO_OBJECT (self, "Setting MTU to %u", mtu); + g_object_set (self->rtp, "mtu", mtu, NULL); return TRUE; }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gsta2dpsink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gsta2dpsink.h
Changed
@@ -66,8 +66,6 @@ GType gst_a2dp_sink_get_type (void); -gboolean gst_a2dp_sink_plugin_init (GstPlugin * plugin); - GstCaps *gst_a2dp_sink_get_device_caps (GstA2dpSink * self); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gstavdtpsink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstavdtpsink.c
Changed
@@ -36,6 +36,7 @@ #include "a2dp-codecs.h" +#include "gstbluezelements.h" #include "gstavdtpsink.h" #include <gst/rtp/rtp.h> @@ -69,26 +70,29 @@ #define parent_class gst_avdtp_sink_parent_class G_DEFINE_TYPE (GstAvdtpSink, gst_avdtp_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (avdtpsink, "avdtpsink", GST_RANK_NONE, + GST_TYPE_AVDTP_SINK, bluez_element_init (plugin)); static GstStaticPadTemplate avdtp_sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("application/x-rtp, " "media = (string) \"audio\"," - "payload = (int) " - GST_RTP_PAYLOAD_DYNAMIC_STRING ", " - "clock-rate = (int) { 16000, 32000, " - "44100, 48000 }, " + "payload = (int) " GST_RTP_PAYLOAD_DYNAMIC_STRING ", " + "clock-rate = (int) { 16000, 32000, 44100, 48000 }, " "encoding-name = (string) \"SBC\"; " "application/x-rtp, " "media = (string) \"audio\", " - "payload = (int) " - GST_RTP_PAYLOAD_MPA_STRING ", " + "payload = (int) " GST_RTP_PAYLOAD_MPA_STRING ", " "clock-rate = (int) 90000; " "application/x-rtp, " "media = (string) \"audio\", " - "payload = (int) " - GST_RTP_PAYLOAD_DYNAMIC_STRING ", " - "clock-rate = (int) 90000, " "encoding-name = (string) \"MPA\"")); + "payload = (int) " GST_RTP_PAYLOAD_DYNAMIC_STRING ", " + "clock-rate = (int) 90000, " "encoding-name = (string) \"MPA\"; " + "application/x-rtp, " + "media = (string) \"audio\", " + "payload = (int) " GST_RTP_PAYLOAD_DYNAMIC_STRING ", " + "clock-rate = (int) { 44100, 48000, 88200, 96000 }, " + "encoding-name = (string) \"X-GST-LDAC\"")); static gboolean gst_avdtp_sink_stop (GstBaseSink * basesink) @@ -409,14 +413,6 @@ */ } -gboolean -gst_avdtp_sink_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avdtpsink", GST_RANK_NONE, - GST_TYPE_AVDTP_SINK); -} - - /* public functions */ GstCaps * gst_avdtp_sink_get_device_caps (GstAvdtpSink * sink)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gstavdtpsink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstavdtpsink.h
Changed
@@ -86,8 +86,6 @@ gchar *gst_avdtp_sink_get_transport (GstAvdtpSink * sink); -gboolean gst_avdtp_sink_plugin_init (GstPlugin * plugin); - void gst_avdtp_sink_set_crc (GstAvdtpSink * self, gboolean crc); void gst_avdtp_sink_set_channel_mode (GstAvdtpSink * self, const gchar * mode);
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gstavdtpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstavdtpsrc.c
Changed
@@ -31,6 +31,7 @@ #include <poll.h> #include <gst/rtp/gstrtppayloads.h> +#include "gstbluezelements.h" #include "gstavdtpsrc.h" GST_DEBUG_CATEGORY_STATIC (avdtpsrc_debug); @@ -47,6 +48,8 @@ #define parent_class gst_avdtp_src_parent_class G_DEFINE_TYPE (GstAvdtpSrc, gst_avdtp_src, GST_TYPE_BASE_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (avdtpsrc, "avdtpsrc", GST_RANK_NONE, + GST_TYPE_AVDTP_SRC, bluez_element_init (plugin)); static GstStaticPadTemplate gst_avdtp_src_template = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, @@ -525,10 +528,3 @@ return TRUE; } - -gboolean -gst_avdtp_src_plugin_init (GstPlugin * plugin) -{ - return gst_element_register (plugin, "avdtpsrc", GST_RANK_NONE, - GST_TYPE_AVDTP_SRC); -}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gstavdtpsrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstavdtpsrc.h
Changed
@@ -68,7 +68,5 @@
GType gst_avdtp_src_get_type (void);

-gboolean gst_avdtp_src_plugin_init (GstPlugin * plugin);
-
G_END_DECLS

#endif
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/gstavdtputil.c -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstavdtputil.c
Changed
@@ -690,6 +690,112 @@
  return structure;
}

+static GstStructure *
+gst_avdtp_util_parse_ldac_raw (void *config)
+{
+  /* We assume the vendor/codec ID have been verified already */
+  a2dp_ldac_t *ldac = (a2dp_ldac_t *) config;
+  GstStructure *structure;
+  GValue value = G_VALUE_INIT;
+  GValue list = G_VALUE_INIT;
+  gboolean mono, stereo;
+
+  structure = gst_structure_new_empty ("audio/x-ldac");
+
+  g_value_init (&list, GST_TYPE_LIST);
+  g_value_init (&value, G_TYPE_INT);
+
+  /* rate */
+  if (ldac->frequency & LDAC_SAMPLING_FREQ_44100) {
+    g_value_set_int (&value, 44100);
+    gst_value_list_prepend_value (&list, &value);
+  }
+  if (ldac->frequency & LDAC_SAMPLING_FREQ_48000) {
+    g_value_set_int (&value, 48000);
+    gst_value_list_prepend_value (&list, &value);
+  }
+  if (ldac->frequency & LDAC_SAMPLING_FREQ_88200) {
+    g_value_set_int (&value, 88200);
+    gst_value_list_prepend_value (&list, &value);
+  }
+  if (ldac->frequency & LDAC_SAMPLING_FREQ_96000) {
+    g_value_set_int (&value, 96000);
+    gst_value_list_prepend_value (&list, &value);
+  }
+
+  if (gst_value_list_get_size (&list) == 1)
+    gst_structure_set_value (structure, "rate", &value);
+  else
+    gst_structure_set_value (structure, "rate", &list);
+
+  g_value_unset (&value);
+  g_value_reset (&list);
+
+  /* channels */
+  mono = FALSE;
+  stereo = FALSE;
+  if (ldac->channel_mode & LDAC_CHANNEL_MODE_MONO)
+    mono = TRUE;
+  if ((ldac->channel_mode & LDAC_CHANNEL_MODE_STEREO) ||
+      (ldac->channel_mode & LDAC_CHANNEL_MODE_DUAL))
+    stereo = TRUE;
+
+  if (mono && stereo) {
+    g_value_init (&value, GST_TYPE_INT_RANGE);
+    gst_value_set_int_range (&value, 1, 2);
+  } else {
+    g_value_init (&value, G_TYPE_INT);
+    if (mono)
+      g_value_set_int (&value, 1);
+    else if (stereo)
+      g_value_set_int (&value, 2);
+    else {
+      GST_ERROR ("Unexpected number of channels");
+      g_value_set_int (&value, 0);
+    }
+  }
+  gst_structure_set_value (structure, "channels", &value);
+
+  g_value_unset (&value);
+  g_value_init (&value, G_TYPE_STRING);
+
+  /* channel mode */
+  if (ldac->channel_mode & LDAC_CHANNEL_MODE_MONO) {
+    g_value_set_static_string (&value, "mono");
+    gst_value_list_prepend_value (&list, &value);
+  }
+  if (ldac->channel_mode & LDAC_CHANNEL_MODE_STEREO) {
+    g_value_set_static_string (&value, "stereo");
+    gst_value_list_prepend_value (&list, &value);
+  }
+  if (ldac->channel_mode & LDAC_CHANNEL_MODE_DUAL) {
+    g_value_set_static_string (&value, "dual");
+    gst_value_list_prepend_value (&list, &value);
+  }
+
+  if (gst_value_list_get_size (&list) == 1)
+    gst_structure_set_value (structure, "channel-mode", &value);
+  else
+    gst_structure_take_value (structure, "channel-mode", &list);
+
+  g_value_unset (&value);
+  g_value_unset (&list);
+
+  return structure;
+}
+
+static GstStructure *
+gst_avdtp_util_parse_vendor_raw (void *config)
+{
+  a2dp_vendor_codec_t *vendor = (a2dp_vendor_codec_t *) config;
+
+  if (A2DP_GET_VENDOR_ID (*vendor) == SONY_VENDOR_ID &&
+      A2DP_GET_CODEC_ID (*vendor) == LDAC_CODEC_ID)
+    return gst_avdtp_util_parse_ldac_raw (config);
+  else
+    return NULL;
+}
+
GstCaps *
gst_avdtp_connection_get_caps (GstAvdtpConnection * conn)
{
@@ -709,6 +815,9 @@
    case A2DP_CODEC_MPEG24:
      structure = gst_avdtp_util_parse_aac_raw (conn->data.config);
      break;
+    case A2DP_CODEC_VENDOR:
+      structure = gst_avdtp_util_parse_vendor_raw (conn->data.config);
+      break;
    default:
      GST_ERROR ("Unsupported configuration");
      return NULL;
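The core of gst_avdtp_util_parse_ldac_raw() is expanding the peer's LDAC frequency capability bitmask into a list of sample rates for the caps. The same per-flag expansion can be sketched standalone; the bit values below are illustrative placeholders (the real constants live in a2dp-codecs.h and may differ), and the helper is not a GStreamer function:

```c
#include <stddef.h>

/* Illustrative placeholder bit layout for the LDAC frequency mask;
 * the real values are defined in a2dp-codecs.h and may differ. */
#define LDAC_SAMPLING_FREQ_44100 (1u << 0)
#define LDAC_SAMPLING_FREQ_48000 (1u << 1)
#define LDAC_SAMPLING_FREQ_88200 (1u << 2)
#define LDAC_SAMPLING_FREQ_96000 (1u << 3)

/* Expand an LDAC frequency capability mask into sample rates, mirroring
 * the per-flag checks in gst_avdtp_util_parse_ldac_raw().
 * Returns the number of rates written into 'rates'. */
static size_t ldac_freq_mask_to_rates (unsigned mask, int *rates, size_t max)
{
  static const struct { unsigned flag; int rate; } table[] = {
    { LDAC_SAMPLING_FREQ_44100, 44100 },
    { LDAC_SAMPLING_FREQ_48000, 48000 },
    { LDAC_SAMPLING_FREQ_88200, 88200 },
    { LDAC_SAMPLING_FREQ_96000, 96000 },
  };
  size_t i, n = 0;

  for (i = 0; i < sizeof (table) / sizeof (table[0]) && n < max; i++) {
    if (mask & table[i].flag)
      rates[n++] = table[i].rate;
  }
  return n;
}
```

The real function additionally distinguishes the single-rate case (a plain G_TYPE_INT value) from the multi-rate case (a GST_TYPE_LIST), which is why it tracks the list size before setting the "rate" field.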
View file
gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstbluezelement.c
Added
@@ -0,0 +1,38 @@
+/* GStreamer bluez plugin
+ *
+ * Copyright (C) 2013 Collabora Ltd. <tim.muller@collabora.co.uk>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include "gstbluezelements.h"
+
+
+GST_DEBUG_CATEGORY (avdtp_debug);
+
+void
+bluez_element_init (GstPlugin * plugin)
+{
+  static gsize res = FALSE;
+  if (g_once_init_enter (&res)) {
+    GST_DEBUG_CATEGORY_INIT (avdtp_debug, "avdtp", 0, "avdtp utils");
+    g_once_init_leave (&res, TRUE);
+  }
+}
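bluez_element_init() wraps its debug-category setup in g_once_init_enter()/g_once_init_leave() so the work runs exactly once no matter how many of the bluez elements get registered. The guard logic can be sketched without GLib as follows; this mock is deliberately not thread-safe (the real code relies on GLib's atomic once-init primitives), and the function/counter names are illustrative:

```c
/* Counts how many times the one-time body actually ran; stands in for
 * GST_DEBUG_CATEGORY_INIT in the real bluez_element_init(). */
static int category_inits = 0;

/* Non-thread-safe sketch of the g_once_init_enter/leave guard used by
 * bluez_element_init(); illustrative only. */
static void bluez_element_init_sketch (void)
{
  static int initialized = 0;   /* real code: static gsize + GLib atomics */

  if (!initialized) {
    category_inits++;           /* one-time work: debug category setup */
    initialized = 1;
  }
}
```

This is why each GST_ELEMENT_REGISTER_DEFINE_WITH_CODE in the diff can safely pass bluez_element_init (plugin) as its extra code: whichever element registers first pays for the setup, and later registrations are no-ops.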
View file
gst-plugins-bad-1.20.1.tar.xz/sys/bluez/gstbluezelements.h
Added
@@ -0,0 +1,36 @@
+/* GStreamer
+ * Copyright (C) <2020> Julian Bouzas <julian.bouzas@collabora.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+
+#ifndef __GST_BLUEZ_ELEMENTS_H__
+#define __GST_BLUEZ_ELEMENTS_H__
+
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include <gst/gst.h>
+
+void bluez_element_init (GstPlugin * plugin);
+
+GST_ELEMENT_REGISTER_DECLARE (a2dpsink);
+GST_ELEMENT_REGISTER_DECLARE (avdtpsink);
+GST_ELEMENT_REGISTER_DECLARE (avdtpsrc);
+
+#endif /* __GST_BLUEZ_ELEMENTS_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/bluez/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/bluez/meson.build
Changed
@@ -1,5 +1,6 @@
bluez_sources = [
  'bluez-plugin.c',
+  'gstbluezelement.c',
  'gsta2dpsink.c',
  'gstavdtpsink.c',
  'gstavdtpsrc.c',
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11av1dec.cpp
Added
@@ -0,0 +1,1381 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11av1dec + * @title: d3d11av1dec + * + * A Direct3D11/DXVA based AV1 video decoder + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/av1/file ! parsebin ! d3d11av1dec ! 
d3d11videosink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11av1dec.h" + +#include <gst/codecs/gstav1decoder.h> +#include <string.h> +#include <vector> + +/* HACK: to expose dxva data structure on UWP */ +#ifdef WINAPI_PARTITION_DESKTOP +#undef WINAPI_PARTITION_DESKTOP +#endif +#define WINAPI_PARTITION_DESKTOP 1 +#include <d3d9.h> +#include <dxva.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_av1_dec_debug); +#define GST_CAT_DEFAULT gst_d3d11_av1_dec_debug + +/* Might not be defined in dxva.h, copied from DXVA AV1 spec available at + * https://www.microsoft.com/en-us/download/confirmation.aspx?id=101577 + * and modified with "GST_" prefix + */ +#pragma pack(push, 1) + +/* AV1 picture entry data structure */ +typedef struct _GST_DXVA_PicEntry_AV1 +{ + UINT width; + UINT height; + + // Global motion parameters + INT wmmat[6]; + union + { + struct + { + UCHAR wminvalid:1; + UCHAR wmtype:2; + UCHAR Reserved:5; + }; + UCHAR GlobalMotionFlags; + } DUMMYUNIONNAME; + + UCHAR Index; + UINT16 Reserved16Bits; + +} GST_DXVA_PicEntry_AV1; + +/* AV1 picture parameters structure */ +typedef struct _GST_DXVA_PicParams_AV1 +{ + UINT width; + UINT height; + + UINT max_width; + UINT max_height; + + UCHAR CurrPicTextureIndex; + UCHAR superres_denom; + UCHAR bitdepth; + UCHAR seq_profile; + + // Tiles: + struct + { + UCHAR cols; + UCHAR rows; + USHORT context_update_id; + USHORT widths[64]; + USHORT heights[64]; + } tiles; + + // Coding Tools + union + { + struct + { + UINT use_128x128_superblock:1; + UINT intra_edge_filter:1; + UINT interintra_compound:1; + UINT masked_compound:1; + UINT warped_motion:1; + UINT dual_filter:1; + UINT jnt_comp:1; + UINT screen_content_tools:1; + UINT integer_mv:1; + UINT cdef:1; + UINT restoration:1; + UINT film_grain:1; + UINT intrabc:1; + UINT high_precision_mv:1; + UINT switchable_motion_mode:1; + UINT filter_intra:1; + UINT disable_frame_end_update_cdf:1; + UINT disable_cdf_update:1; + 
UINT reference_mode:1; + UINT skip_mode:1; + UINT reduced_tx_set:1; + UINT superres:1; + UINT tx_mode:2; + UINT use_ref_frame_mvs:1; + UINT enable_ref_frame_mvs:1; + UINT reference_frame_update:1; + UINT Reserved:5; + }; + UINT32 CodingParamToolFlags; + } coding; + + // Format & Picture Info flags + union + { + struct + { + UCHAR frame_type:2; + UCHAR show_frame:1; + UCHAR showable_frame:1; + UCHAR subsampling_x:1; + UCHAR subsampling_y:1; + UCHAR mono_chrome:1; + UCHAR Reserved:1; + }; + UCHAR FormatAndPictureInfoFlags; + } format; + + // References + UCHAR primary_ref_frame; + UCHAR order_hint; + UCHAR order_hint_bits; + + GST_DXVA_PicEntry_AV1 frame_refs[7]; + UCHAR RefFrameMapTextureIndex[8]; + + // Loop filter parameters + struct + { + UCHAR filter_level[2]; + UCHAR filter_level_u; + UCHAR filter_level_v; + + UCHAR sharpness_level; + union + { + struct + { + UCHAR mode_ref_delta_enabled:1; + UCHAR mode_ref_delta_update:1; + UCHAR delta_lf_multi:1; + UCHAR delta_lf_present:1; + UCHAR Reserved:4; + }; + UCHAR ControlFlags; + } DUMMYUNIONNAME; + CHAR ref_deltas[8]; + CHAR mode_deltas[2]; + UCHAR delta_lf_res; + UCHAR frame_restoration_type[3]; + USHORT log2_restoration_unit_size[3]; + UINT16 Reserved16Bits; + } loop_filter; + + // Quantization + struct + { + union + { + struct + { + UCHAR delta_q_present:1; + UCHAR delta_q_res:2; + UCHAR Reserved:5; + }; + UCHAR ControlFlags; + } DUMMYUNIONNAME; + + UCHAR base_qindex; + CHAR y_dc_delta_q; + CHAR u_dc_delta_q; + CHAR v_dc_delta_q; + CHAR u_ac_delta_q; + CHAR v_ac_delta_q; + // using_qmatrix: + UCHAR qm_y; + UCHAR qm_u; + UCHAR qm_v; + UINT16 Reserved16Bits; + } quantization; + + // Cdef parameters + struct + { + union + { + struct + { + UCHAR damping:2; + UCHAR bits:2; + UCHAR Reserved:4; + }; + UCHAR ControlFlags; + } DUMMYUNIONNAME; + + union + { + struct + { + UCHAR primary:6; + UCHAR secondary:2; + }; + UCHAR combined; + } y_strengths[8]; + + union + { + struct + { + UCHAR primary:6; + UCHAR secondary:2; + }; 
+ UCHAR combined; + } uv_strengths[8]; + + } cdef; + + UCHAR interp_filter; + + // Segmentation + struct + { + union + { + struct + { + UCHAR enabled:1; + UCHAR update_map:1; + UCHAR update_data:1; + UCHAR temporal_update:1; + UCHAR Reserved:4; + }; + UCHAR ControlFlags; + } DUMMYUNIONNAME; + UCHAR Reserved24Bits[3]; + + union + { + struct + { + UCHAR alt_q:1; + UCHAR alt_lf_y_v:1; + UCHAR alt_lf_y_h:1; + UCHAR alt_lf_u:1; + UCHAR alt_lf_v:1; + UCHAR ref_frame:1; + UCHAR skip:1; + UCHAR globalmv:1; + }; + UCHAR mask; + } feature_mask[8]; + + SHORT feature_data[8][8]; + + } segmentation; + + struct + { + union + { + struct + { + USHORT apply_grain:1; + USHORT scaling_shift_minus8:2; + USHORT chroma_scaling_from_luma:1; + USHORT ar_coeff_lag:2; + USHORT ar_coeff_shift_minus6:2; + USHORT grain_scale_shift:2; + USHORT overlap_flag:1; + USHORT clip_to_restricted_range:1; + USHORT matrix_coeff_is_identity:1; + USHORT Reserved:3; + }; + USHORT ControlFlags; + } DUMMYUNIONNAME; + + USHORT grain_seed; + UCHAR scaling_points_y[14][2]; + UCHAR num_y_points; + UCHAR scaling_points_cb[10][2]; + UCHAR num_cb_points; + UCHAR scaling_points_cr[10][2]; + UCHAR num_cr_points; + UCHAR ar_coeffs_y[24]; + UCHAR ar_coeffs_cb[25]; + UCHAR ar_coeffs_cr[25]; + UCHAR cb_mult; + UCHAR cb_luma_mult; + UCHAR cr_mult; + UCHAR cr_luma_mult; + UCHAR Reserved8Bits; + SHORT cb_offset; + SHORT cr_offset; + } film_grain; + + UINT Reserved32Bits; + UINT StatusReportFeedbackNumber; +} GST_DXVA_PicParams_AV1; + +/* AV1 tile structure */ +typedef struct _GST_DXVA_Tile_AV1 +{ + UINT DataOffset; + UINT DataSize; + USHORT row; + USHORT column; + UINT16 Reserved16Bits; + UCHAR anchor_frame; + UCHAR Reserved8Bits; +} GST_DXVA_Tile_AV1; + +/* AV1 status reporting data structure */ +typedef struct _GST_DXVA_Status_AV1 +{ + UINT StatusReportFeedbackNumber; + GST_DXVA_PicEntry_AV1 CurrPic; + UCHAR BufType; + UCHAR Status; + UCHAR Reserved8Bits; + USHORT NumMbsAffected; +} GST_DXVA_Status_AV1; + +#pragma pack(pop) 
+ +/* reference list 8 + 4 margin */ +#define NUM_OUTPUT_VIEW 12 + +/* *INDENT-OFF* */ +typedef struct _GstD3D11AV1DecInner +{ + GstD3D11Device *device = nullptr; + GstD3D11Decoder *d3d11_decoder = nullptr; + + GstAV1SequenceHeaderOBU seq_hdr; + GST_DXVA_PicParams_AV1 pic_params; + + std::vector<GST_DXVA_Tile_AV1> tile_list; + std::vector<guint8> bitstream_buffer; + + guint max_width = 0; + guint max_height = 0; + guint bitdepth = 0; +} GstD3D11AV1DecInner; +/* *INDENT-ON* */ + +typedef struct _GstD3D11AV1Dec +{ + GstAV1Decoder parent; + GstD3D11AV1DecInner *inner; +} GstD3D11AV1Dec; + +typedef struct _GstD3D11AV1DecClass +{ + GstAV1DecoderClass parent_class; + GstD3D11DecoderSubClassData class_data; +} GstD3D11AV1DecClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_AV1_DEC(object) ((GstD3D11AV1Dec *) (object)) +#define GST_D3D11_AV1_DEC_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11AV1DecClass)) + +static void gst_d3d11_av1_dec_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_av1_dec_finalize (GObject * object); + +static void gst_d3d11_av1_dec_set_context (GstElement * element, + GstContext * context); + +static gboolean gst_d3d11_av1_dec_open (GstVideoDecoder * decoder); +static gboolean gst_d3d11_av1_dec_close (GstVideoDecoder * decoder); +static gboolean gst_d3d11_av1_dec_negotiate (GstVideoDecoder * decoder); +static gboolean gst_d3d11_av1_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_d3d11_av1_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); +static gboolean gst_d3d11_av1_dec_sink_event (GstVideoDecoder * decoder, + GstEvent * event); + +/* GstAV1Decoder */ +static GstFlowReturn gst_d3d11_av1_dec_new_sequence (GstAV1Decoder * decoder, + const GstAV1SequenceHeaderOBU * seq_hdr); +static GstFlowReturn gst_d3d11_av1_dec_new_picture (GstAV1Decoder * decoder, + 
GstVideoCodecFrame * frame, GstAV1Picture * picture); +static GstAV1Picture *gst_d3d11_av1_dec_duplicate_picture (GstAV1Decoder * + decoder, GstAV1Picture * picture); +static GstFlowReturn gst_d3d11_av1_dec_start_picture (GstAV1Decoder * decoder, + GstAV1Picture * picture, GstAV1Dpb * dpb); +static GstFlowReturn gst_d3d11_av1_dec_decode_tile (GstAV1Decoder * decoder, + GstAV1Picture * picture, GstAV1Tile * tile); +static GstFlowReturn gst_d3d11_av1_dec_end_picture (GstAV1Decoder * decoder, + GstAV1Picture * picture); +static GstFlowReturn gst_d3d11_av1_dec_output_picture (GstAV1Decoder * + decoder, GstVideoCodecFrame * frame, GstAV1Picture * picture); + +static void +gst_d3d11_av1_dec_class_init (GstD3D11AV1DecClass * klass, gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstAV1DecoderClass *av1decoder_class = GST_AV1_DECODER_CLASS (klass); + GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; + + gobject_class->get_property = gst_d3d11_av1_dec_get_property; + gobject_class->finalize = gst_d3d11_av1_dec_finalize; + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_set_context); + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + gst_d3d11_decoder_class_data_fill_subclass_data (cdata, &klass->class_data); + + /** + * GstD3D11AV1Dec:adapter-luid: + * + * DXGI Adapter LUID for this element + * + * Since: 1.20 + */ + gst_d3d11_decoder_proxy_class_init (element_class, cdata, + "Seungha Yang <seungha@centricular.com>"); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_decide_allocation); + 
decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_src_query); + decoder_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_sink_event); + + av1decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_new_sequence); + av1decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_new_picture); + av1decoder_class->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_duplicate_picture); + av1decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_start_picture); + av1decoder_class->decode_tile = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_decode_tile); + av1decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_end_picture); + av1decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_av1_dec_output_picture); +} + +static void +gst_d3d11_av1_dec_init (GstD3D11AV1Dec * self) +{ + self->inner = new GstD3D11AV1DecInner (); +} + +static void +gst_d3d11_av1_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11AV1DecClass *klass = GST_D3D11_AV1_DEC_GET_CLASS (object); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_decoder_proxy_get_property (object, prop_id, value, pspec, cdata); +} + +static void +gst_d3d11_av1_dec_finalize (GObject * object) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (object); + + delete self->inner; + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_av1_dec_set_context (GstElement * element, GstContext * context) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (element); + GstD3D11AV1DecInner *inner = self->inner; + GstD3D11AV1DecClass *klass = GST_D3D11_AV1_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, cdata->adapter_luid, &inner->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_av1_dec_open (GstVideoDecoder 
* decoder) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + GstD3D11AV1DecClass *klass = GST_D3D11_AV1_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + if (!gst_d3d11_decoder_proxy_open (decoder, + cdata, &inner->device, &inner->d3d11_decoder)) { + GST_ERROR_OBJECT (self, "Failed to open decoder"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_av1_dec_close (GstVideoDecoder * decoder) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + + gst_clear_object (&inner->d3d11_decoder); + gst_clear_object (&inner->device); + + return TRUE; +} + +static gboolean +gst_d3d11_av1_dec_negotiate (GstVideoDecoder * decoder) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_negotiate (inner->d3d11_decoder, decoder)) + return FALSE; + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_d3d11_av1_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_decide_allocation (inner->d3d11_decoder, + decoder, query)) { + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_d3d11_av1_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), + query, inner->device)) { + return TRUE; + } + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static gboolean +gst_d3d11_av1_dec_sink_event (GstVideoDecoder * 
decoder, GstEvent * event) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, TRUE); + break; + case GST_EVENT_FLUSH_STOP: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, FALSE); + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstFlowReturn +gst_d3d11_av1_dec_new_sequence (GstAV1Decoder * decoder, + const GstAV1SequenceHeaderOBU * seq_hdr) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + gboolean modified = FALSE; + guint max_width, max_height; + + GST_LOG_OBJECT (self, "new sequence"); + + if (seq_hdr->seq_profile != GST_AV1_PROFILE_0) { + GST_WARNING_OBJECT (self, "Unsupported profile %d", seq_hdr->seq_profile); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (seq_hdr->num_planes != 3) { + GST_WARNING_OBJECT (self, "Monochrome is not supported"); + return GST_FLOW_NOT_NEGOTIATED; + } + + inner->seq_hdr = *seq_hdr; + + if (inner->bitdepth != seq_hdr->bit_depth) { + GST_INFO_OBJECT (self, "Bitdepth changed %d -> %d", inner->bitdepth, + seq_hdr->bit_depth); + inner->bitdepth = seq_hdr->bit_depth; + modified = TRUE; + } + + max_width = seq_hdr->max_frame_width_minus_1 + 1; + max_height = seq_hdr->max_frame_height_minus_1 + 1; + + if (inner->max_width != max_width || inner->max_height != max_height) { + GST_INFO_OBJECT (self, "Resolution changed %dx%d -> %dx%d", + inner->max_width, inner->max_height, max_width, max_height); + inner->max_width = max_width; + inner->max_height = max_height; + modified = TRUE; + } + + if (modified || !gst_d3d11_decoder_is_configured (inner->d3d11_decoder)) { + GstVideoInfo info; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; + + if (inner->bitdepth == 
8) { + out_format = GST_VIDEO_FORMAT_NV12; + } else if (inner->bitdepth == 10) { + out_format = GST_VIDEO_FORMAT_P010_10LE; + } else { + GST_WARNING_OBJECT (self, "Invalid bit-depth %d", seq_hdr->bit_depth); + return GST_FLOW_NOT_NEGOTIATED; + } + + gst_video_info_set_format (&info, + out_format, inner->max_width, inner->max_height); + + if (!gst_d3d11_decoder_configure (inner->d3d11_decoder, + decoder->input_state, &info, (gint) inner->max_width, + (gint) inner->max_height, NUM_OUTPUT_VIEW)) { + GST_ERROR_OBJECT (self, "Failed to create decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_av1_dec_new_picture (GstAV1Decoder * decoder, + GstVideoCodecFrame * frame, GstAV1Picture * picture) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + GstBuffer *view_buffer; + + view_buffer = gst_d3d11_decoder_get_output_view_buffer (inner->d3d11_decoder, + GST_VIDEO_DECODER (decoder)); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "No available output view buffer"); + return GST_FLOW_FLUSHING; + } + + GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT, view_buffer); + + gst_av1_picture_set_user_data (picture, + view_buffer, (GDestroyNotify) gst_buffer_unref); + + GST_LOG_OBJECT (self, "New AV1 picture %p", picture); + + return GST_FLOW_OK; +} + +static GstAV1Picture * +gst_d3d11_av1_dec_duplicate_picture (GstAV1Decoder * decoder, + GstAV1Picture * picture) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstBuffer *view_buffer; + GstAV1Picture *new_picture; + + view_buffer = (GstBuffer *) gst_av1_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Parent picture does not have output view buffer"); + return NULL; + } + + new_picture = 
gst_av1_picture_new (); + + GST_LOG_OBJECT (self, "Duplicate output with buffer %" GST_PTR_FORMAT, + view_buffer); + + gst_av1_picture_set_user_data (new_picture, + gst_buffer_ref (view_buffer), (GDestroyNotify) gst_buffer_unref); + + return new_picture; +} + +static ID3D11VideoDecoderOutputView * +gst_d3d11_av1_dec_get_output_view_from_picture (GstD3D11AV1Dec * self, + GstAV1Picture * picture, guint8 * view_id) +{ + GstD3D11AV1DecInner *inner = self->inner; + GstBuffer *view_buffer; + ID3D11VideoDecoderOutputView *view; + + view_buffer = (GstBuffer *) gst_av1_picture_get_user_data (picture); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); + return NULL; + } + + view = + gst_d3d11_decoder_get_output_view_from_buffer (inner->d3d11_decoder, + view_buffer, view_id); + if (!view) { + GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); + return NULL; + } + + return view; +} + +static GstFlowReturn +gst_d3d11_av1_dec_start_picture (GstAV1Decoder * decoder, + GstAV1Picture * picture, GstAV1Dpb * dpb) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + const GstAV1SequenceHeaderOBU *seq_hdr = &inner->seq_hdr; + const GstAV1FrameHeaderOBU *frame_hdr = &picture->frame_hdr; + ID3D11VideoDecoderOutputView *view; + GST_DXVA_PicParams_AV1 *pic_params = &inner->pic_params; + guint8 view_id = 0xff; + guint i, j; + + view = gst_d3d11_av1_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_OK; + } + + memset (pic_params, 0, sizeof (GST_DXVA_PicParams_AV1)); + + pic_params->width = frame_hdr->frame_width; + pic_params->height = frame_hdr->frame_height; + + pic_params->max_width = seq_hdr->max_frame_width_minus_1 + 1; + pic_params->max_height = seq_hdr->max_frame_height_minus_1 + 1; + + pic_params->CurrPicTextureIndex = view_id; + 
pic_params->superres_denom = frame_hdr->superres_denom; + pic_params->bitdepth = seq_hdr->bit_depth; + pic_params->seq_profile = seq_hdr->seq_profile; + + /* TILES */ + pic_params->tiles.cols = frame_hdr->tile_info.tile_cols; + pic_params->tiles.rows = frame_hdr->tile_info.tile_rows; + pic_params->tiles.context_update_id = + frame_hdr->tile_info.context_update_tile_id; + + for (i = 0; i < pic_params->tiles.cols; i++) { + pic_params->tiles.widths[i] = + frame_hdr->tile_info.width_in_sbs_minus_1[i] + 1; + } + + for (i = 0; i < pic_params->tiles.rows; i++) { + pic_params->tiles.heights[i] = + frame_hdr->tile_info.height_in_sbs_minus_1[i] + 1; + } + + /* CODING TOOLS */ + pic_params->coding.use_128x128_superblock = seq_hdr->use_128x128_superblock; + pic_params->coding.intra_edge_filter = seq_hdr->enable_filter_intra; + pic_params->coding.interintra_compound = seq_hdr->enable_interintra_compound; + pic_params->coding.masked_compound = seq_hdr->enable_masked_compound; + pic_params->coding.warped_motion = frame_hdr->allow_warped_motion; + pic_params->coding.dual_filter = seq_hdr->enable_dual_filter; + pic_params->coding.jnt_comp = seq_hdr->enable_jnt_comp; + pic_params->coding.screen_content_tools = + frame_hdr->allow_screen_content_tools; + pic_params->coding.integer_mv = frame_hdr->force_integer_mv; + pic_params->coding.cdef = seq_hdr->enable_cdef; + pic_params->coding.restoration = seq_hdr->enable_restoration; + pic_params->coding.film_grain = seq_hdr->film_grain_params_present; + pic_params->coding.intrabc = frame_hdr->allow_intrabc; + pic_params->coding.high_precision_mv = frame_hdr->allow_high_precision_mv; + pic_params->coding.switchable_motion_mode = + frame_hdr->is_motion_mode_switchable; + pic_params->coding.filter_intra = seq_hdr->enable_filter_intra; + pic_params->coding.disable_frame_end_update_cdf = + frame_hdr->disable_frame_end_update_cdf; + pic_params->coding.disable_cdf_update = frame_hdr->disable_cdf_update; + pic_params->coding.reference_mode = 
frame_hdr->reference_select; + pic_params->coding.skip_mode = frame_hdr->skip_mode_present; + pic_params->coding.reduced_tx_set = frame_hdr->reduced_tx_set; + pic_params->coding.superres = frame_hdr->use_superres; + pic_params->coding.tx_mode = frame_hdr->tx_mode; + pic_params->coding.use_ref_frame_mvs = frame_hdr->use_ref_frame_mvs; + pic_params->coding.enable_ref_frame_mvs = seq_hdr->enable_ref_frame_mvs; + pic_params->coding.reference_frame_update = 1; + + /* FORMAT */ + pic_params->format.frame_type = frame_hdr->frame_type; + pic_params->format.show_frame = frame_hdr->show_frame; + pic_params->format.showable_frame = frame_hdr->showable_frame; + pic_params->format.subsampling_x = seq_hdr->color_config.subsampling_x; + pic_params->format.subsampling_y = seq_hdr->color_config.subsampling_y; + pic_params->format.mono_chrome = seq_hdr->color_config.mono_chrome; + + /* REFERENCES */ + pic_params->primary_ref_frame = frame_hdr->primary_ref_frame; + pic_params->order_hint = frame_hdr->order_hint; + if (seq_hdr->enable_order_hint) { + pic_params->order_hint_bits = seq_hdr->order_hint_bits_minus_1 + 1; + } else { + pic_params->order_hint_bits = 0; + } + + for (i = 0; i < GST_AV1_REFS_PER_FRAME; i++) { + if (dpb->pic_list[i]) { + GstAV1Picture *other_pic = dpb->pic_list[i]; + const GstAV1GlobalMotionParams *gmp = &frame_hdr->global_motion_params; + + pic_params->frame_refs[i].width = other_pic->frame_hdr.frame_width; + pic_params->frame_refs[i].height = other_pic->frame_hdr.frame_height; + for (j = 0; j < 6; j++) { + pic_params->frame_refs[i].wmmat[j] = + gmp->gm_params[GST_AV1_REF_LAST_FRAME + i][j]; + } + pic_params->frame_refs[i].wminvalid = + (gmp->gm_type[GST_AV1_REF_LAST_FRAME + i] == + GST_AV1_WARP_MODEL_IDENTITY); + pic_params->frame_refs[i].wmtype = + gmp->gm_type[GST_AV1_REF_LAST_FRAME + i]; + pic_params->frame_refs[i].Index = frame_hdr->ref_frame_idx[i]; + } else { + pic_params->frame_refs[i].Index = 0xff; + } + } + + for (i = 0; i < GST_AV1_NUM_REF_FRAMES; 
i++) { + if (dpb->pic_list[i]) { + GstAV1Picture *other_pic = dpb->pic_list[i]; + ID3D11VideoDecoderOutputView *other_view; + guint8 other_view_id = 0xff; + + other_view = gst_d3d11_av1_dec_get_output_view_from_picture (self, + other_pic, &other_view_id); + if (!other_view) { + GST_ERROR_OBJECT (self, + "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + pic_params->RefFrameMapTextureIndex[i] = other_view_id; + } else { + pic_params->RefFrameMapTextureIndex[i] = 0xff; + } + } + + /* LOOP FILTER PARAMS */ + pic_params->loop_filter.filter_level[0] = + frame_hdr->loop_filter_params.loop_filter_level[0]; + pic_params->loop_filter.filter_level[1] = + frame_hdr->loop_filter_params.loop_filter_level[1]; + pic_params->loop_filter.filter_level_u = + frame_hdr->loop_filter_params.loop_filter_level[2]; + pic_params->loop_filter.filter_level_v = + frame_hdr->loop_filter_params.loop_filter_level[3]; + pic_params->loop_filter.sharpness_level = + frame_hdr->loop_filter_params.loop_filter_sharpness; + pic_params->loop_filter.mode_ref_delta_enabled = + frame_hdr->loop_filter_params.loop_filter_delta_enabled; + pic_params->loop_filter.mode_ref_delta_update = + frame_hdr->loop_filter_params.loop_filter_delta_update; + pic_params->loop_filter.delta_lf_multi = + frame_hdr->loop_filter_params.delta_lf_multi; + pic_params->loop_filter.delta_lf_present = + frame_hdr->loop_filter_params.delta_lf_present; + + for (i = 0; i < GST_AV1_TOTAL_REFS_PER_FRAME; i++) { + pic_params->loop_filter.ref_deltas[i] = + frame_hdr->loop_filter_params.loop_filter_ref_deltas[i]; + } + + for (i = 0; i < 2; i++) { + pic_params->loop_filter.mode_deltas[i] = + frame_hdr->loop_filter_params.loop_filter_mode_deltas[i]; + } + + pic_params->loop_filter.delta_lf_res = + frame_hdr->loop_filter_params.delta_lf_res; + + for (i = 0; i < GST_AV1_MAX_NUM_PLANES; i++) { + pic_params->loop_filter.frame_restoration_type[i] = + frame_hdr->loop_restoration_params.frame_restoration_type[i]; + } + 
+ if (frame_hdr->loop_restoration_params.uses_lr) { + pic_params->loop_filter.log2_restoration_unit_size[0] = + (6 + frame_hdr->loop_restoration_params.lr_unit_shift); + pic_params->loop_filter.log2_restoration_unit_size[1] = + pic_params->loop_filter.log2_restoration_unit_size[2] = + (6 + frame_hdr->loop_restoration_params.lr_unit_shift - + frame_hdr->loop_restoration_params.lr_uv_shift); + } else { + pic_params->loop_filter.log2_restoration_unit_size[0] = + pic_params->loop_filter.log2_restoration_unit_size[1] = + pic_params->loop_filter.log2_restoration_unit_size[2] = 8; + } + + /* QUANTIZATION */ + pic_params->quantization.delta_q_present = + frame_hdr->quantization_params.delta_q_present; + pic_params->quantization.delta_q_res = + frame_hdr->quantization_params.delta_q_res; + pic_params->quantization.base_qindex = + frame_hdr->quantization_params.base_q_idx; + pic_params->quantization.y_dc_delta_q = + frame_hdr->quantization_params.delta_q_y_dc; + pic_params->quantization.u_dc_delta_q = + frame_hdr->quantization_params.delta_q_u_dc; + pic_params->quantization.v_dc_delta_q = + frame_hdr->quantization_params.delta_q_v_dc; + pic_params->quantization.u_ac_delta_q = + frame_hdr->quantization_params.delta_q_u_ac; + pic_params->quantization.v_ac_delta_q = + frame_hdr->quantization_params.delta_q_v_ac; + if (frame_hdr->quantization_params.using_qmatrix) { + pic_params->quantization.qm_y = frame_hdr->quantization_params.qm_y; + pic_params->quantization.qm_u = frame_hdr->quantization_params.qm_u; + pic_params->quantization.qm_v = frame_hdr->quantization_params.qm_v; + } else { + pic_params->quantization.qm_y = 0xff; + pic_params->quantization.qm_u = 0xff; + pic_params->quantization.qm_v = 0xff; + } + + /* Cdef params */ + pic_params->cdef.damping = frame_hdr->cdef_params.cdef_damping - 3; + pic_params->cdef.bits = frame_hdr->cdef_params.cdef_bits; + + for (i = 0; i < GST_AV1_CDEF_MAX; i++) { + guint8 secondary; + + pic_params->cdef.y_strengths[i].primary = + 
frame_hdr->cdef_params.cdef_y_pri_strength[i]; + secondary = frame_hdr->cdef_params.cdef_y_sec_strength[i]; + if (secondary == 4) + secondary--; + pic_params->cdef.y_strengths[i].secondary = secondary; + + pic_params->cdef.uv_strengths[i].primary = + frame_hdr->cdef_params.cdef_uv_pri_strength[i]; + secondary = frame_hdr->cdef_params.cdef_uv_sec_strength[i]; + if (secondary == 4) + secondary--; + pic_params->cdef.uv_strengths[i].secondary = secondary; + } + + pic_params->interp_filter = frame_hdr->interpolation_filter; + + /* SEGMENTATION */ + pic_params->segmentation.enabled = + frame_hdr->segmentation_params.segmentation_enabled; + pic_params->segmentation.update_map = + frame_hdr->segmentation_params.segmentation_update_map; + pic_params->segmentation.update_data = + frame_hdr->segmentation_params.segmentation_update_data; + pic_params->segmentation.temporal_update = + frame_hdr->segmentation_params.segmentation_temporal_update; + + for (i = 0; i < GST_AV1_MAX_SEGMENTS; i++) { + for (j = 0; j < GST_AV1_SEG_LVL_MAX; j++) { + pic_params->segmentation.feature_mask[i].mask |= + (frame_hdr->segmentation_params.feature_enabled[i][j] << j); + pic_params->segmentation.feature_data[i][j] = + frame_hdr->segmentation_params.feature_data[i][j]; + } + } + + /* FILM GRAIN */ + if (frame_hdr->film_grain_params.apply_grain) { + pic_params->film_grain.apply_grain = 1; + pic_params->film_grain.scaling_shift_minus8 = + frame_hdr->film_grain_params.grain_scaling_minus_8; + pic_params->film_grain.chroma_scaling_from_luma = + frame_hdr->film_grain_params.chroma_scaling_from_luma; + pic_params->film_grain.ar_coeff_lag = + frame_hdr->film_grain_params.ar_coeff_lag; + pic_params->film_grain.ar_coeff_shift_minus6 = + frame_hdr->film_grain_params.ar_coeff_shift_minus_6; + pic_params->film_grain.grain_scale_shift = + frame_hdr->film_grain_params.grain_scale_shift; + pic_params->film_grain.overlap_flag = + frame_hdr->film_grain_params.overlap_flag; + 
pic_params->film_grain.clip_to_restricted_range = + frame_hdr->film_grain_params.clip_to_restricted_range; + pic_params->film_grain.matrix_coeff_is_identity = + (seq_hdr->color_config.matrix_coefficients == GST_AV1_MC_IDENTITY); + pic_params->film_grain.grain_seed = frame_hdr->film_grain_params.grain_seed; + for (i = 0; i < frame_hdr->film_grain_params.num_y_points && i < 14; i++) { + pic_params->film_grain.scaling_points_y[i][0] = + frame_hdr->film_grain_params.point_y_value[i]; + pic_params->film_grain.scaling_points_y[i][1] = + frame_hdr->film_grain_params.point_y_scaling[i]; + } + pic_params->film_grain.num_y_points = + frame_hdr->film_grain_params.num_y_points; + + for (i = 0; i < frame_hdr->film_grain_params.num_cb_points && i < 10; i++) { + pic_params->film_grain.scaling_points_cb[i][0] = + frame_hdr->film_grain_params.point_cb_value[i]; + pic_params->film_grain.scaling_points_cb[i][1] = + frame_hdr->film_grain_params.point_cb_scaling[i]; + } + pic_params->film_grain.num_cb_points = + frame_hdr->film_grain_params.num_cb_points; + + for (i = 0; i < frame_hdr->film_grain_params.num_cr_points && i < 10; i++) { + pic_params->film_grain.scaling_points_cr[i][0] = + frame_hdr->film_grain_params.point_cr_value[i]; + pic_params->film_grain.scaling_points_cr[i][1] = + frame_hdr->film_grain_params.point_cr_scaling[i]; + } + pic_params->film_grain.num_cr_points = + frame_hdr->film_grain_params.num_cr_points; + + for (i = 0; i < 24; i++) { + pic_params->film_grain.ar_coeffs_y[i] = + frame_hdr->film_grain_params.ar_coeffs_y_plus_128[i]; + } + + for (i = 0; i < 25; i++) { + pic_params->film_grain.ar_coeffs_cb[i] = + frame_hdr->film_grain_params.ar_coeffs_cb_plus_128[i]; + pic_params->film_grain.ar_coeffs_cr[i] = + frame_hdr->film_grain_params.ar_coeffs_cr_plus_128[i]; + } + + pic_params->film_grain.cb_mult = frame_hdr->film_grain_params.cb_mult; + pic_params->film_grain.cb_luma_mult = + frame_hdr->film_grain_params.cb_luma_mult; + pic_params->film_grain.cr_mult = 
frame_hdr->film_grain_params.cr_mult; + pic_params->film_grain.cr_luma_mult = + frame_hdr->film_grain_params.cr_luma_mult; + pic_params->film_grain.cb_offset = frame_hdr->film_grain_params.cb_offset; + pic_params->film_grain.cr_offset = frame_hdr->film_grain_params.cr_offset; + } + + inner->bitstream_buffer.resize (0); + inner->tile_list.resize (0); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_av1_dec_decode_tile (GstAV1Decoder * decoder, + GstAV1Picture * picture, GstAV1Tile * tile) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + GstAV1TileGroupOBU *tile_group = &tile->tile_group; + + if (tile_group->num_tiles > inner->tile_list.size ()) + inner->tile_list.resize (tile_group->num_tiles); + + g_assert (tile_group->tg_end < inner->tile_list.size ()); + + GST_LOG_OBJECT (self, "Decode tile, tile count %d (start: %d - end: %d)", + tile_group->num_tiles, tile_group->tg_start, tile_group->tg_end); + + for (guint i = tile_group->tg_start; i <= tile_group->tg_end; i++) { + GST_DXVA_Tile_AV1 *dxva_tile = &inner->tile_list[i]; + + GST_TRACE_OBJECT (self, + "Tile offset %d, size %d, row %d, col %d", + tile_group->entry[i].tile_offset, tile_group->entry[i].tile_size, + tile_group->entry[i].tile_row, tile_group->entry[i].tile_col); + + dxva_tile->DataOffset = inner->bitstream_buffer.size () + + tile_group->entry[i].tile_offset; + dxva_tile->DataSize = tile_group->entry[i].tile_size; + dxva_tile->row = tile_group->entry[i].tile_row; + dxva_tile->column = tile_group->entry[i].tile_col; + /* TODO: used for tile list OBU */ + dxva_tile->anchor_frame = 0xff; + } + + GST_TRACE_OBJECT (self, "OBU size %d", tile->obu.obu_size); + + size_t pos = inner->bitstream_buffer.size (); + inner->bitstream_buffer.resize (pos + tile->obu.obu_size); + + memcpy (&inner->bitstream_buffer[0] + pos, + tile->obu.data, tile->obu.obu_size); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_av1_dec_end_picture 
(GstAV1Decoder * decoder, GstAV1Picture * picture) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + size_t bitstream_buffer_size; + size_t bitstream_pos; + GstD3D11DecodeInputStreamArgs input_args; + + if (inner->bitstream_buffer.empty () || inner->tile_list.empty ()) { + GST_ERROR_OBJECT (self, "No bitstream buffer to submit"); + return GST_FLOW_ERROR; + } + + view = gst_d3d11_av1_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (&input_args, 0, sizeof (GstD3D11DecodeInputStreamArgs)); + + bitstream_pos = inner->bitstream_buffer.size (); + bitstream_buffer_size = GST_ROUND_UP_128 (bitstream_pos); + + if (bitstream_buffer_size > bitstream_pos) { + size_t padding = bitstream_buffer_size - bitstream_pos; + + /* As per DXVA spec, total amount of bitstream buffer size should be + * 128 bytes aligned. 
If actual data is not multiple of 128 bytes, + * the last slice data needs to be zero-padded */ + inner->bitstream_buffer.resize (bitstream_buffer_size, 0); + + GST_DXVA_Tile_AV1 & tile = inner->tile_list.back (); + tile.DataSize += padding; + } + + input_args.picture_params = &inner->pic_params; + input_args.picture_params_size = sizeof (GST_DXVA_PicParams_AV1); + input_args.slice_control = &inner->tile_list[0]; + input_args.slice_control_size = + sizeof (GST_DXVA_Tile_AV1) * inner->tile_list.size (); + input_args.bitstream = &inner->bitstream_buffer[0]; + input_args.bitstream_size = inner->bitstream_buffer.size (); + + if (!gst_d3d11_decoder_decode_frame (inner->d3d11_decoder, view, &input_args)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_av1_dec_output_picture (GstAV1Decoder * decoder, + GstVideoCodecFrame * frame, GstAV1Picture * picture) +{ + GstD3D11AV1Dec *self = GST_D3D11_AV1_DEC (decoder); + GstD3D11AV1DecInner *inner = self->inner; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstBuffer *view_buffer; + + GST_LOG_OBJECT (self, "Outputting picture %p, %dx%d", picture, + picture->frame_hdr.render_width, picture->frame_hdr.render_height); + + view_buffer = (GstBuffer *) gst_av1_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Could not get output view"); + goto error; + } + + if (!gst_d3d11_decoder_process_output (inner->d3d11_decoder, vdec, + picture->frame_hdr.render_width, picture->frame_hdr.render_height, + view_buffer, &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to copy buffer"); + goto error; + } + + gst_av1_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_av1_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); + + return GST_FLOW_ERROR; +} + +void +gst_d3d11_av1_dec_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank) +{ + GType type; + gchar *type_name; + gchar 
*feature_name; + guint index = 0; + guint i; + GTypeInfo type_info = { + sizeof (GstD3D11AV1DecClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_av1_dec_class_init, + NULL, + NULL, + sizeof (GstD3D11AV1Dec), + 0, + (GInstanceInitFunc) gst_d3d11_av1_dec_init, + }; + const GUID *profile_guid = NULL; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + guint max_width = 0; + guint max_height = 0; + guint resolution; + gboolean have_p010 = FALSE; + gboolean have_gray = FALSE; + gboolean have_gray10 = FALSE; + + if (!gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_AV1, GST_VIDEO_FORMAT_NV12, &profile_guid)) { + GST_INFO_OBJECT (device, "device does not support AV1 decoding"); + return; + } + + have_p010 = gst_d3d11_decoder_supports_format (device, + profile_guid, DXGI_FORMAT_P010); + have_gray = gst_d3d11_decoder_supports_format (device, + profile_guid, DXGI_FORMAT_R8_UNORM); + have_gray10 = gst_d3d11_decoder_supports_format (device, + profile_guid, DXGI_FORMAT_R16_UNORM); + + GST_INFO_OBJECT (device, "Decoder support P010: %d, R8: %d, R16: %d", + have_p010, have_gray, have_gray10); + + /* TODO: add test monochrome formats */ + for (i = 0; i < G_N_ELEMENTS (gst_dxva_resolutions); i++) { + if (gst_d3d11_decoder_supports_resolution (device, profile_guid, + DXGI_FORMAT_NV12, gst_dxva_resolutions[i].width, + gst_dxva_resolutions[i].height)) { + max_width = gst_dxva_resolutions[i].width; + max_height = gst_dxva_resolutions[i].height; + + GST_DEBUG_OBJECT (device, + "device support resolution %dx%d", max_width, max_height); + } else { + break; + } + } + + if (max_width == 0 || max_height == 0) { + GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); + return; + } + + sink_caps = + gst_caps_from_string ("video/x-av1, " + "alignment = (string) frame, profile = (string) main"); + src_caps = gst_caps_from_string ("video/x-raw(" + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "); video/x-raw"); + + if (have_p010) { + GValue format_list = 
G_VALUE_INIT; + GValue format_value = G_VALUE_INIT; + + g_value_init (&format_list, GST_TYPE_LIST); + + g_value_init (&format_value, G_TYPE_STRING); + g_value_set_string (&format_value, "NV12"); + gst_value_list_append_and_take_value (&format_list, &format_value); + + g_value_init (&format_value, G_TYPE_STRING); + g_value_set_string (&format_value, "P010_10LE"); + gst_value_list_append_and_take_value (&format_list, &format_value); + + gst_caps_set_value (src_caps, "format", &format_list); + g_value_unset (&format_list); + } else { + gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); + } + + /* To cover both landscape and portrait, select max value */ + resolution = MAX (max_width, max_height); + gst_caps_set_simple (sink_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + + type_info.class_data = + gst_d3d11_decoder_class_data_new (device, GST_DXVA_CODEC_AV1, + sink_caps, src_caps); + + type_name = g_strdup ("GstD3D11AV1Dec"); + feature_name = g_strdup ("d3d11av1dec"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11AV1Device%dDec", index); + feature_name = g_strdup_printf ("d3d11av1device%ddec", index); + } + + type = g_type_register_static (GST_TYPE_AV1_DECODER, + type_name, &type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11av1dec.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_AV1_DEC_H__ +#define __GST_D3D11_AV1_DEC_H__ + +#include "gstd3d11decoder.h" + +G_BEGIN_DECLS + +void gst_d3d11_av1_dec_register (GstPlugin * plugin, + GstD3D11Device * device, + guint rank); + +G_END_DECLS + +#endif /* __GST_D3D11_AV1_DEC_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11basefilter.cpp
Added
@@ -0,0 +1,335 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstd3d11basefilter.h" + +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_base_filter_debug); +#define GST_CAT_DEFAULT gst_d3d11_base_filter_debug + +enum +{ + PROP_0, + PROP_ADAPTER, +}; + +#define DEFAULT_ADAPTER -1 + +#define gst_d3d11_base_filter_parent_class parent_class +G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstD3D11BaseFilter, gst_d3d11_base_filter, + GST_TYPE_BASE_TRANSFORM, GST_DEBUG_CATEGORY_INIT (GST_CAT_DEFAULT, + "d3d11basefilter", 0, "d3d11 basefilter")); + +static void gst_d3d11_base_filter_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_d3d11_base_filter_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); +static void gst_d3d11_base_filter_dispose (GObject * object); +static void gst_d3d11_base_filter_set_context (GstElement * element, + GstContext * context); +static gboolean gst_d3d11_base_filter_start (GstBaseTransform * trans); +static gboolean gst_d3d11_base_filter_stop (GstBaseTransform * trans); +static gboolean gst_d3d11_base_filter_set_caps 
(GstBaseTransform * trans, + GstCaps * incaps, GstCaps * outcaps); +static gboolean gst_d3d11_base_filter_get_unit_size (GstBaseTransform * trans, + GstCaps * caps, gsize * size); +static gboolean +gst_d3d11_base_filter_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query); +static void gst_d3d11_base_filter_before_transform (GstBaseTransform * trans, + GstBuffer * buffer); + +static void +gst_d3d11_base_filter_class_init (GstD3D11BaseFilterClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + + gobject_class->set_property = gst_d3d11_base_filter_set_property; + gobject_class->get_property = gst_d3d11_base_filter_get_property; + gobject_class->dispose = gst_d3d11_base_filter_dispose; + + /** + * GstD3D11BaseFilter:adapter: + * + * Adapter index for creating device (-1 for default) + * + * Since: 1.18 + */ + g_object_class_install_property (gobject_class, PROP_ADAPTER, + g_param_spec_int ("adapter", "Adapter", + "Adapter index for creating device (-1 for default)", + -1, G_MAXINT32, DEFAULT_ADAPTER, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS))); + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_set_context); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_start); + trans_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_stop); + trans_class->set_caps = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_set_caps); + trans_class->get_unit_size = + GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_get_unit_size); + trans_class->query = GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_query); + trans_class->before_transform = + GST_DEBUG_FUNCPTR (gst_d3d11_base_filter_before_transform); + + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_BASE_FILTER, + (GstPluginAPIFlags) 0); +} 
+ +static void +gst_d3d11_base_filter_init (GstD3D11BaseFilter * filter) +{ + filter->adapter = DEFAULT_ADAPTER; +} + +static void +gst_d3d11_base_filter_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (object); + + switch (prop_id) { + case PROP_ADAPTER: + filter->adapter = g_value_get_int (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_base_filter_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (object); + + switch (prop_id) { + case PROP_ADAPTER: + g_value_set_int (value, filter->adapter); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_base_filter_dispose (GObject * object) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (object); + + gst_clear_object (&filter->device); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_d3d11_base_filter_set_context (GstElement * element, GstContext * context) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (element); + + gst_d3d11_handle_set_context (element, + context, filter->adapter, &filter->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_base_filter_start (GstBaseTransform * trans) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + + if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (filter), + filter->adapter, &filter->device)) { + GST_ERROR_OBJECT (filter, "Failed to get D3D11 device"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_base_filter_stop (GstBaseTransform * trans) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + + gst_clear_object (&filter->device); + + return TRUE; +} + +static gboolean 
+gst_d3d11_base_filter_set_caps (GstBaseTransform * trans, GstCaps * incaps, + GstCaps * outcaps) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstVideoInfo in_info, out_info; + GstD3D11BaseFilterClass *klass; + gboolean res; + + if (!filter->device) { + GST_ERROR_OBJECT (filter, "No available D3D11 device"); + return FALSE; + } + + /* input caps */ + if (!gst_video_info_from_caps (&in_info, incaps)) + goto invalid_caps; + + /* output caps */ + if (!gst_video_info_from_caps (&out_info, outcaps)) + goto invalid_caps; + + klass = GST_D3D11_BASE_FILTER_GET_CLASS (filter); + if (klass->set_info) + res = klass->set_info (filter, incaps, &in_info, outcaps, &out_info); + else + res = TRUE; + + if (res) { + filter->in_info = in_info; + filter->out_info = out_info; + } + + return res; + + /* ERRORS */ +invalid_caps: + { + GST_ERROR_OBJECT (filter, "invalid caps"); + return FALSE; + } +} + +static gboolean +gst_d3d11_base_filter_get_unit_size (GstBaseTransform * trans, GstCaps * caps, + gsize * size) +{ + gboolean ret = FALSE; + GstVideoInfo info; + + ret = gst_video_info_from_caps (&info, caps); + if (ret) + *size = GST_VIDEO_INFO_SIZE (&info); + + return TRUE; +} + +static gboolean +gst_d3d11_base_filter_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + { + gboolean ret; + ret = gst_d3d11_handle_context_query (GST_ELEMENT (filter), query, + filter->device); + if (ret) + return TRUE; + break; + } + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->query (trans, direction, + query); +} + +static void +gst_d3d11_base_filter_before_transform (GstBaseTransform * trans, + GstBuffer * buffer) +{ + GstD3D11BaseFilter *self = GST_D3D11_BASE_FILTER (trans); + GstD3D11Memory *dmem; + GstMemory *mem; + gboolean update_device = FALSE; + GstCaps *in_caps = NULL; + GstCaps *out_caps = 
NULL; + + mem = gst_buffer_peek_memory (buffer, 0); + /* Can happen (e.g., d3d11upload) */ + if (!gst_is_d3d11_memory (mem)) + return; + + dmem = GST_D3D11_MEMORY_CAST (mem); + /* Same device, nothing to do */ + if (dmem->device == self->device) + return; + + /* Can accept any device, update */ + if (self->adapter < 0) { + update_device = TRUE; + } else { + guint adapter = 0; + + g_object_get (dmem->device, "adapter", &adapter, NULL); + /* The same GPU as what user wanted, update */ + if (adapter == (guint) self->adapter) + update_device = TRUE; + } + + if (!update_device) + return; + + GST_INFO_OBJECT (self, "Updating device %" GST_PTR_FORMAT " -> %" + GST_PTR_FORMAT, self->device, dmem->device); + + gst_object_unref (self->device); + self->device = (GstD3D11Device *) gst_object_ref (dmem->device); + + in_caps = gst_pad_get_current_caps (GST_BASE_TRANSFORM_SINK_PAD (trans)); + if (!in_caps) { + GST_WARNING_OBJECT (self, "sinkpad has null caps"); + goto out; + } + + out_caps = gst_pad_get_current_caps (GST_BASE_TRANSFORM_SRC_PAD (trans)); + if (!out_caps) { + GST_WARNING_OBJECT (self, "Has no configured output caps"); + goto out; + } + + /* subclass will update internal object. + * Note that gst_base_transform_reconfigure() might not trigger this + * unless caps were changed meanwhile */ + gst_d3d11_base_filter_set_caps (trans, in_caps, out_caps); + + /* Mark reconfigure so that we can update pool */ + gst_base_transform_reconfigure_src (trans); + +out: + gst_clear_caps (&in_caps); + gst_clear_caps (&out_caps); + + return; +}
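The before_transform handler above decides whether to migrate the filter onto the incoming buffer's device: never when the buffer is already on the filter's device, always when the filter accepts any adapter (adapter &lt; 0), and otherwise only when the buffer's adapter matches the one the user configured. That decision, stripped of the GStreamer object handling, can be sketched as a pure predicate:

```c
#include <stdbool.h>

/* Device-update decision mirrored from
 * gst_d3d11_base_filter_before_transform():
 *   - same_device:          the buffer already lives on our device
 *   - configured_adapter:   the "adapter" property (-1 = any)
 *   - buffer_adapter:       adapter index of the buffer's device */
static bool
should_update_device (int configured_adapter, unsigned buffer_adapter,
    bool same_device)
{
  if (same_device)
    return false;               /* nothing to do */
  if (configured_adapter < 0)
    return true;                /* any device is acceptable */
  /* switch only onto the GPU the user actually asked for */
  return buffer_adapter == (unsigned) configured_adapter;
}
```

When the predicate is true, the element swaps its device reference, re-runs set_caps() with the current caps, and marks the source pad for reconfiguration so the buffer pool is rebuilt.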
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11basefilter.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11basefilter.h
Changed
@@ -23,8 +23,8 @@ #include <gst/gst.h> #include <gst/video/video.h> #include <gst/base/gstbasetransform.h> - -#include "gstd3d11_fwd.h" +#include <gst/d3d11/gstd3d11.h> +#include "gstd3d11pluginutils.h" G_BEGIN_DECLS @@ -36,6 +36,9 @@ #define GST_IS_D3D11_BASE_FILTER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_BASE_FILTER)) #define GST_D3D11_BASE_FILTER_CAST(obj) ((GstD3D11BaseFilter*)(obj)) +typedef struct _GstD3D11BaseFilter GstD3D11BaseFilter; +typedef struct _GstD3D11BaseFilterClass GstD3D11BaseFilterClass; + struct _GstD3D11BaseFilter { GstBaseTransform parent; @@ -60,6 +63,8 @@ GType gst_d3d11_base_filter_get_type (void); +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstD3D11BaseFilter, gst_object_unref) + G_END_DECLS #endif /* __GST_D3D11_BASE_FILTER_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11compositor.cpp
Added
@@ -0,0 +1,2521 @@ +/* + * GStreamer + * Copyright (C) 2004, 2008 Wim Taymans <wim@fluendo.com> + * Copyright (C) 2010 Sebastian Dröge <sebastian.droege@collabora.co.uk> + * Copyright (C) 2014 Mathieu Duponchelle <mathieu.duponchelle@opencreed.com> + * Copyright (C) 2014 Thibault Saunier <tsaunier@gnome.org> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11compositorelement + * @title: d3d11compositorelement + * + * A Direct3D11 based video compositing element. 
+ * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11compositor.h" +#include "gstd3d11converter.h" +#include "gstd3d11shader.h" +#include "gstd3d11pluginutils.h" +#include <string.h> +#include <wrl.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_compositor_debug); +#define GST_CAT_DEFAULT gst_d3d11_compositor_debug + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +/** + * GstD3D11CompositorBlendOperation: + * @GST_D3D11_COMPOSITOR_BLEND_OP_ADD: + * Add source 1 and source 2 + * @GST_D3D11_COMPOSITOR_BLEND_OP_SUBTRACT: + * Subtract source 1 from source 2 + * @GST_D3D11_COMPOSITOR_BLEND_OP_REV_SUBTRACT: + * Subtract source 2 from source 1 + * @GST_D3D11_COMPOSITOR_BLEND_OP_MIN: + * Find the minimum of source 1 and source 2 + * @GST_D3D11_COMPOSITOR_BLEND_OP_MAX: + * Find the maximum of source 1 and source 2 + * + * Since: 1.20 + */ +GType +gst_d3d11_compositor_blend_operation_get_type (void) +{ + static GType blend_operation_type = 0; + + static const GEnumValue blend_operator[] = { + {GST_D3D11_COMPOSITOR_BLEND_OP_ADD, "Add source and background", + "add"}, + {GST_D3D11_COMPOSITOR_BLEND_OP_SUBTRACT, + "Subtract source from background", + "subtract"}, + {GST_D3D11_COMPOSITOR_BLEND_OP_REV_SUBTRACT, + "Subtract background from source", + "rev-subtract"}, + {GST_D3D11_COMPOSITOR_BLEND_OP_MIN, + "Minimum of source and background", "min"}, + {GST_D3D11_COMPOSITOR_BLEND_OP_MAX, + "Maximum of source and background", "max"}, + {0, NULL, NULL}, + }; + + if (!blend_operation_type) { + blend_operation_type = + g_enum_register_static ("GstD3D11CompositorBlendOperation", + blend_operator); + } + return blend_operation_type; +} + +static GstD3D11CompositorBlendOperation +gst_d3d11_compositor_blend_operation_from_native (D3D11_BLEND_OP blend_op) +{ + switch (blend_op) { + case D3D11_BLEND_OP_ADD: + return GST_D3D11_COMPOSITOR_BLEND_OP_ADD; + case D3D11_BLEND_OP_SUBTRACT: + return 
GST_D3D11_COMPOSITOR_BLEND_OP_SUBTRACT; + case D3D11_BLEND_OP_REV_SUBTRACT: + return GST_D3D11_COMPOSITOR_BLEND_OP_REV_SUBTRACT; + case D3D11_BLEND_OP_MIN: + return GST_D3D11_COMPOSITOR_BLEND_OP_MIN; + case D3D11_BLEND_OP_MAX: + return GST_D3D11_COMPOSITOR_BLEND_OP_MAX; + default: + g_assert_not_reached (); + break; + } + + return GST_D3D11_COMPOSITOR_BLEND_OP_ADD; +} + +static D3D11_BLEND_OP +gst_d3d11_compositor_blend_operation_to_native (GstD3D11CompositorBlendOperation + op) +{ + switch (op) { + case GST_D3D11_COMPOSITOR_BLEND_OP_ADD: + return D3D11_BLEND_OP_ADD; + case GST_D3D11_COMPOSITOR_BLEND_OP_SUBTRACT: + return D3D11_BLEND_OP_SUBTRACT; + case GST_D3D11_COMPOSITOR_BLEND_OP_REV_SUBTRACT: + return D3D11_BLEND_OP_REV_SUBTRACT; + case GST_D3D11_COMPOSITOR_BLEND_OP_MIN: + return D3D11_BLEND_OP_MIN; + case GST_D3D11_COMPOSITOR_BLEND_OP_MAX: + return D3D11_BLEND_OP_MAX; + default: + g_assert_not_reached (); + break; + } + + return D3D11_BLEND_OP_ADD; +} + +/** + * GstD3D11CompositorBlend: + * @GST_D3D11_COMPOSITOR_BLEND_ZERO: + * The blend factor is (0, 0, 0, 0). No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_ONE: + * The blend factor is (1, 1, 1, 1). No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR: + * The blend factor is (Rs, Gs, Bs, As), + * that is color data (RGB) from a pixel shader. No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR: + * The blend factor is (1 - Rs, 1 - Gs, 1 - Bs, 1 - As), + * that is color data (RGB) from a pixel shader. + * The pre-blend operation inverts the data, generating 1 - RGB. + * @GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA: + * The blend factor is (As, As, As, As), + * that is alpha data (A) from a pixel shader. No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA: + * The blend factor is ( 1 - As, 1 - As, 1 - As, 1 - As), + * that is alpha data (A) from a pixel shader. + * The pre-blend operation inverts the data, generating 1 - A. 
+ * @GST_D3D11_COMPOSITOR_BLEND_DEST_ALPHA: + * The blend factor is (Ad, Ad, Ad, Ad), + * that is alpha data from a render target. No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_INV_DEST_ALPHA: + * The blend factor is (1 - Ad, 1 - Ad, 1 - Ad, 1 - Ad), + * that is alpha data from a render target. + * The pre-blend operation inverts the data, generating 1 - A. + * @GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR: + * The blend factor is (Rd, Gd, Bd, Ad), + * that is color data from a render target. No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR: + * The blend factor is (1 - Rd, 1 - Gd, 1 - Bd, 1 - Ad), + * that is color data from a render target. + * The pre-blend operation inverts the data, generating 1 - RGB. + * @GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA_SAT: + * The blend factor is (f, f, f, 1); where f = min(As, 1 - Ad). + * The pre-blend operation clamps the data to 1 or less. + * @GST_D3D11_COMPOSITOR_BLEND_BLEND_FACTOR: + * The blend factor is the blend factor set with + * ID3D11DeviceContext::OMSetBlendState. No pre-blend operation. + * @GST_D3D11_COMPOSITOR_BLEND_INV_BLEND_FACTOR: + * The blend factor is the blend factor set with + * ID3D11DeviceContext::OMSetBlendState. + * The pre-blend operation inverts the blend factor, + * generating 1 - blend_factor. 
+ *
+ * Since: 1.20
+ */
+GType
+gst_d3d11_compositor_blend_get_type (void)
+{
+  static GType blend_type = 0;
+
+  static const GEnumValue blend[] = {
+    {GST_D3D11_COMPOSITOR_BLEND_ZERO,
+        "The blend factor is (0, 0, 0, 0)", "zero"},
+    {GST_D3D11_COMPOSITOR_BLEND_ONE,
+        "The blend factor is (1, 1, 1, 1)", "one"},
+    {GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR,
+        "The blend factor is (Rs, Gs, Bs, As)", "src-color"},
+    {GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR,
+        "The blend factor is (1 - Rs, 1 - Gs, 1 - Bs, 1 - As)",
+        "inv-src-color"},
+    {GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA,
+        "The blend factor is (As, As, As, As)", "src-alpha"},
+    {GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA,
+        "The blend factor is (1 - As, 1 - As, 1 - As, 1 - As)",
+        "inv-src-alpha"},
+    {GST_D3D11_COMPOSITOR_BLEND_DEST_ALPHA,
+        "The blend factor is (Ad, Ad, Ad, Ad)", "dest-alpha"},
+    {GST_D3D11_COMPOSITOR_BLEND_INV_DEST_ALPHA,
+        "The blend factor is (1 - Ad, 1 - Ad, 1 - Ad, 1 - Ad)",
+        "inv-dest-alpha"},
+    {GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR,
+        "The blend factor is (Rd, Gd, Bd, Ad)", "dest-color"},
+    {GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR,
+        "The blend factor is (1 - Rd, 1 - Gd, 1 - Bd, 1 - Ad)",
+        "inv-dest-color"},
+    {GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA_SAT,
+        "The blend factor is (f, f, f, 1); where f = min(As, 1 - Ad)",
+        "src-alpha-sat"},
+    {GST_D3D11_COMPOSITOR_BLEND_BLEND_FACTOR,
+        "User defined blend factor", "blend-factor"},
+    {GST_D3D11_COMPOSITOR_BLEND_INV_BLEND_FACTOR,
+        "Inverse of user defined blend factor", "inv-blend-factor"},
+    {0, NULL, NULL},
+  };
+
+  if (!blend_type) {
+    blend_type = g_enum_register_static ("GstD3D11CompositorBlend", blend);
+  }
+  return blend_type;
+}
+
+static GstD3D11CompositorBlend
+gst_d3d11_compositor_blend_from_native (D3D11_BLEND blend)
+{
+  switch (blend) {
+    case D3D11_BLEND_ZERO:
+      return GST_D3D11_COMPOSITOR_BLEND_ZERO;
+    case D3D11_BLEND_ONE:
+      return GST_D3D11_COMPOSITOR_BLEND_ONE;
+    case D3D11_BLEND_SRC_COLOR:
+      return
GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR;
+    case D3D11_BLEND_INV_SRC_COLOR:
+      return GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR;
+    case D3D11_BLEND_SRC_ALPHA:
+      return GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA;
+    case D3D11_BLEND_INV_SRC_ALPHA:
+      return GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA;
+    case D3D11_BLEND_DEST_ALPHA:
+      return GST_D3D11_COMPOSITOR_BLEND_DEST_ALPHA;
+    case D3D11_BLEND_INV_DEST_ALPHA:
+      return GST_D3D11_COMPOSITOR_BLEND_INV_DEST_ALPHA;
+    case D3D11_BLEND_DEST_COLOR:
+      return GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR;
+    case D3D11_BLEND_INV_DEST_COLOR:
+      return GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR;
+    case D3D11_BLEND_SRC_ALPHA_SAT:
+      return GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA_SAT;
+    case D3D11_BLEND_BLEND_FACTOR:
+      return GST_D3D11_COMPOSITOR_BLEND_BLEND_FACTOR;
+    case D3D11_BLEND_INV_BLEND_FACTOR:
+      return GST_D3D11_COMPOSITOR_BLEND_INV_BLEND_FACTOR;
+    default:
+      g_assert_not_reached ();
+      break;
+  }
+
+  return GST_D3D11_COMPOSITOR_BLEND_ZERO;
+}
+
+static D3D11_BLEND
+gst_d3d11_compositor_blend_to_native (GstD3D11CompositorBlend blend)
+{
+  switch (blend) {
+    case GST_D3D11_COMPOSITOR_BLEND_ZERO:
+      return D3D11_BLEND_ZERO;
+    case GST_D3D11_COMPOSITOR_BLEND_ONE:
+      return D3D11_BLEND_ONE;
+    case GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR:
+      return D3D11_BLEND_SRC_COLOR;
+    case GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR:
+      return D3D11_BLEND_INV_SRC_COLOR;
+    case GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA:
+      return D3D11_BLEND_SRC_ALPHA;
+    case GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA:
+      return D3D11_BLEND_INV_SRC_ALPHA;
+    case GST_D3D11_COMPOSITOR_BLEND_DEST_ALPHA:
+      return D3D11_BLEND_DEST_ALPHA;
+    case GST_D3D11_COMPOSITOR_BLEND_INV_DEST_ALPHA:
+      return D3D11_BLEND_INV_DEST_ALPHA;
+    case GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR:
+      return D3D11_BLEND_DEST_COLOR;
+    case GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR:
+      return D3D11_BLEND_INV_DEST_COLOR;
+    case GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA_SAT:
+      return D3D11_BLEND_SRC_ALPHA_SAT;
+    case
GST_D3D11_COMPOSITOR_BLEND_BLEND_FACTOR:
+      return D3D11_BLEND_BLEND_FACTOR;
+    case GST_D3D11_COMPOSITOR_BLEND_INV_BLEND_FACTOR:
+      return D3D11_BLEND_INV_BLEND_FACTOR;
+    default:
+      g_assert_not_reached ();
+      break;
+  }
+
+  return D3D11_BLEND_ZERO;
+}
+
+/**
+ * GstD3D11CompositorBackground:
+ *
+ * Background mode
+ *
+ * Since: 1.20
+ */
+GType
+gst_d3d11_compositor_background_get_type (void)
+{
+  static GType compositor_background_type = 0;
+
+  static const GEnumValue compositor_background[] = {
+    {GST_D3D11_COMPOSITOR_BACKGROUND_CHECKER, "Checker pattern", "checker"},
+    {GST_D3D11_COMPOSITOR_BACKGROUND_BLACK, "Black", "black"},
+    {GST_D3D11_COMPOSITOR_BACKGROUND_WHITE, "White", "white"},
+    {GST_D3D11_COMPOSITOR_BACKGROUND_TRANSPARENT,
+        "Transparent Background to enable further compositing", "transparent"},
+    {0, NULL, NULL},
+  };
+
+  if (!compositor_background_type) {
+    compositor_background_type =
+        g_enum_register_static ("GstD3D11CompositorBackground",
+        compositor_background);
+  }
+  return compositor_background_type;
+}
+
+/**
+ * GstD3D11CompositorSizingPolicy:
+ *
+ * Sizing policy
+ *
+ * Since: 1.20
+ */
+GType
+gst_d3d11_compositor_sizing_policy_get_type (void)
+{
+  static GType sizing_policy_type = 0;
+
+  static const GEnumValue sizing_polices[] = {
+    {GST_D3D11_COMPOSITOR_SIZING_POLICY_NONE,
+        "None: Image is scaled to fill configured destination rectangle without "
+        "padding or keeping the aspect ratio", "none"},
+    {GST_D3D11_COMPOSITOR_SIZING_POLICY_KEEP_ASPECT_RATIO,
+        "Keep Aspect Ratio: Image is scaled to fit destination rectangle "
+        "specified by GstCompositorPad:{xpos, ypos, width, height} "
+        "with preserved aspect ratio. Resulting image will be centered in "
+        "the destination rectangle with padding if necessary",
+        "keep-aspect-ratio"},
+    {0, NULL, NULL},
+  };
+
+  if (!sizing_policy_type) {
+    sizing_policy_type =
+        g_enum_register_static ("GstD3D11CompositorSizingPolicy",
+        sizing_polices);
+  }
+  return sizing_policy_type;
+}
+
+/* *INDENT-OFF* */
+static const gchar checker_vs_src[] =
+    "struct VS_INPUT\n"
+    "{\n"
+    "  float4 Position : POSITION;\n"
+    "};\n"
+    "\n"
+    "struct VS_OUTPUT\n"
+    "{\n"
+    "  float4 Position: SV_POSITION;\n"
+    "};\n"
+    "\n"
+    "VS_OUTPUT main(VS_INPUT input)\n"
+    "{\n"
+    "  return input;\n"
+    "}\n";
+
+static const gchar checker_ps_src[] =
+    "static const float blocksize = 8.0;\n"
+    "static const float4 high = float4(0.667, 0.667, 0.667, 1.0);\n"
+    "static const float4 low = float4(0.333, 0.333, 0.333, 1.0);\n"
+    "struct PS_INPUT\n"
+    "{\n"
+    "  float4 Position: SV_POSITION;\n"
+    "};\n"
+    "struct PS_OUTPUT\n"
+    "{\n"
+    "  float4 Plane: SV_TARGET;\n"
+    "};\n"
+    "PS_OUTPUT main(PS_INPUT input)\n"
+    "{\n"
+    "  PS_OUTPUT output;\n"
+    "  if ((input.Position.x % (blocksize * 2.0)) >= blocksize) {\n"
+    "    if ((input.Position.y % (blocksize * 2.0)) >= blocksize)\n"
+    "      output.Plane = low;\n"
+    "    else\n"
+    "      output.Plane = high;\n"
+    "  } else {\n"
+    "    if ((input.Position.y % (blocksize * 2.0)) < blocksize)\n"
+    "      output.Plane = low;\n"
+    "    else\n"
+    "      output.Plane = high;\n"
+    "  }\n"
+    "  return output;\n"
+    "}\n";
+/* *INDENT-ON* */
+
+/**
+ * GstD3D11CompositorPad:
+ *
+ * Since: 1.20
+ */
+struct _GstD3D11CompositorPad
+{
+  GstVideoAggregatorConvertPad parent;
+
+  GstD3D11Converter *convert;
+
+  GstBufferPool *fallback_pool;
+  GstBuffer *fallback_buf;
+
+  gboolean position_updated;
+  gboolean alpha_updated;
+  gboolean blend_desc_updated;
+  ID3D11BlendState *blend;
+
+  /* properties */
+  gint xpos;
+  gint ypos;
+  gint width;
+  gint height;
+  gdouble alpha;
+  D3D11_RENDER_TARGET_BLEND_DESC desc;
+  gfloat blend_factor[4];
+  GstD3D11CompositorSizingPolicy sizing_policy;
+};
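[Editor's note] The checker background pixel shader above selects one of two gray levels per 8x8-pixel block. As an illustration only (not part of the element), the same selection logic can be expressed in plain C; `checker_is_high` is a hypothetical helper name mirroring the HLSL `blocksize` constant and branch structure:

```c
#include <stdbool.h>

/* Mirror of the HLSL checker pattern: 8x8 pixel blocks alternating
 * between a light ("high") and a dark ("low") gray. Returns true when
 * the pixel at (x, y) falls on a light block. */
static bool
checker_is_high (int x, int y)
{
  const int blocksize = 8;
  bool x_odd = (x % (blocksize * 2)) >= blocksize;
  bool y_odd = (y % (blocksize * 2)) >= blocksize;

  /* Same branch structure as the shader: the block is light when
   * exactly one of the two coordinates lies in the second half of
   * a 16-pixel period, i.e. an XOR of the two parities. */
  return x_odd != y_odd;
}
```

For example, pixel (0, 0) lands on a dark block while (8, 0) and (0, 8) land on light blocks, matching the shader's output.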
+
+struct _GstD3D11Compositor
+{
+  GstVideoAggregator parent;
+
+  GstD3D11Device *device;
+
+  GstBufferPool *fallback_pool;
+  GstBuffer *fallback_buf;
+
+  GstD3D11Quad *checker_background;
+  D3D11_VIEWPORT viewport;
+
+  gboolean reconfigured;
+
+  /* properties */
+  gint adapter;
+  GstD3D11CompositorBackground background;
+};
+
+enum
+{
+  PROP_PAD_0,
+  PROP_PAD_XPOS,
+  PROP_PAD_YPOS,
+  PROP_PAD_WIDTH,
+  PROP_PAD_HEIGHT,
+  PROP_PAD_ALPHA,
+  PROP_PAD_BLEND_OP_RGB,
+  PROP_PAD_BLEND_OP_ALPHA,
+  PROP_PAD_BLEND_SRC_RGB,
+  PROP_PAD_BLEND_SRC_ALPHA,
+  PROP_PAD_BLEND_DEST_RGB,
+  PROP_PAD_BLEND_DEST_ALPHA,
+  PROP_PAD_BLEND_FACTOR_RED,
+  PROP_PAD_BLEND_FACTOR_GREEN,
+  PROP_PAD_BLEND_FACTOR_BLUE,
+  PROP_PAD_BLEND_FACTOR_ALPHA,
+  PROP_PAD_SIZING_POLICY,
+};
+
+#define DEFAULT_PAD_XPOS 0
+#define DEFAULT_PAD_YPOS 0
+#define DEFAULT_PAD_WIDTH 0
+#define DEFAULT_PAD_HEIGHT 0
+#define DEFAULT_PAD_ALPHA 1.0
+#define DEFAULT_PAD_BLEND_OP_RGB GST_D3D11_COMPOSITOR_BLEND_OP_ADD
+#define DEFAULT_PAD_BLEND_OP_ALPHA GST_D3D11_COMPOSITOR_BLEND_OP_ADD
+#define DEFAULT_PAD_BLEND_SRC_RGB GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA
+#define DEFAULT_PAD_BLEND_SRC_ALPHA GST_D3D11_COMPOSITOR_BLEND_ONE
+#define DEFAULT_PAD_BLEND_DEST_RGB GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA
+#define DEFAULT_PAD_BLEND_DEST_ALPHA GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA
+#define DEFAULT_PAD_SIZING_POLICY GST_D3D11_COMPOSITOR_SIZING_POLICY_NONE
+
+static void gst_d3d11_compositor_pad_set_property (GObject * object,
+    guint prop_id, const GValue * value, GParamSpec * pspec);
+static void gst_d3d11_compositor_pad_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec);
+static gboolean
+gst_d3d11_compositor_pad_prepare_frame (GstVideoAggregatorPad * pad,
+    GstVideoAggregator * vagg, GstBuffer * buffer,
+    GstVideoFrame * prepared_frame);
+static void
+gst_d3d11_compositor_pad_clean_frame (GstVideoAggregatorPad * pad,
+    GstVideoAggregator * vagg, GstVideoFrame * prepared_frame);
+static void
+gst_d3d11_compositor_pad_init_blend_options (GstD3D11CompositorPad * pad);
+
+#define gst_d3d11_compositor_pad_parent_class parent_pad_class
+G_DEFINE_TYPE (GstD3D11CompositorPad, gst_d3d11_compositor_pad,
+    GST_TYPE_VIDEO_AGGREGATOR_PAD);
+
+static void
+gst_d3d11_compositor_pad_class_init (GstD3D11CompositorPadClass * klass)
+{
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+  GstVideoAggregatorPadClass *vaggpadclass =
+      GST_VIDEO_AGGREGATOR_PAD_CLASS (klass);
+
+  gobject_class->set_property = gst_d3d11_compositor_pad_set_property;
+  gobject_class->get_property = gst_d3d11_compositor_pad_get_property;
+
+  g_object_class_install_property (gobject_class, PROP_PAD_XPOS,
+      g_param_spec_int ("xpos", "X Position", "X position of the picture",
+          G_MININT, G_MAXINT, DEFAULT_PAD_XPOS,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_YPOS,
+      g_param_spec_int ("ypos", "Y Position", "Y position of the picture",
+          G_MININT, G_MAXINT, DEFAULT_PAD_YPOS,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_WIDTH,
+      g_param_spec_int ("width", "Width", "Width of the picture",
+          G_MININT, G_MAXINT, DEFAULT_PAD_WIDTH,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_HEIGHT,
+      g_param_spec_int ("height", "Height", "Height of the picture",
+          G_MININT, G_MAXINT, DEFAULT_PAD_HEIGHT,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_ALPHA,
+      g_param_spec_double ("alpha", "Alpha", "Alpha of the picture", 0.0, 1.0,
+          DEFAULT_PAD_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
PROP_PAD_BLEND_OP_RGB,
+      g_param_spec_enum ("blend-op-rgb", "Blend Operation RGB",
+          "Blend equation for RGB", GST_TYPE_D3D11_COMPOSITOR_BLEND_OPERATION,
+          DEFAULT_PAD_BLEND_OP_RGB,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_BLEND_OP_ALPHA,
+      g_param_spec_enum ("blend-op-alpha", "Blend Operation Alpha",
+          "Blend equation for alpha", GST_TYPE_D3D11_COMPOSITOR_BLEND_OPERATION,
+          DEFAULT_PAD_BLEND_OP_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_PAD_BLEND_SRC_RGB,
+      g_param_spec_enum ("blend-src-rgb", "Blend Source RGB",
+          "Blend factor for source RGB",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_PAD_BLEND_SRC_RGB,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_PAD_BLEND_SRC_ALPHA,
+      g_param_spec_enum ("blend-src-alpha",
+          "Blend Source Alpha",
+          "Blend factor for source alpha, \"*-color\" values are not allowed",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_PAD_BLEND_SRC_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_PAD_BLEND_DEST_RGB,
+      g_param_spec_enum ("blend-dest-rgb",
+          "Blend Destination RGB",
+          "Blend factor for destination RGB",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_PAD_BLEND_DEST_RGB,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_PAD_BLEND_DEST_ALPHA,
+      g_param_spec_enum ("blend-dest-alpha",
+          "Blend Destination Alpha",
+          "Blend factor for destination alpha, "
+          "\"*-color\" values are not allowed",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_PAD_BLEND_DEST_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_BLEND_FACTOR_RED,
+      g_param_spec_float ("blend-factor-red", "Blend Factor Red",
+          "Blend factor for red component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_BLEND_FACTOR_GREEN,
+      g_param_spec_float ("blend-factor-green", "Blend Factor Green",
+          "Blend factor for green component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_BLEND_FACTOR_BLUE,
+      g_param_spec_float ("blend-factor-blue", "Blend Factor Blue",
+          "Blend factor for blue component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_BLEND_FACTOR_ALPHA,
+      g_param_spec_float ("blend-factor-alpha", "Blend Factor Alpha",
+          "Blend factor for alpha component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_PAD_SIZING_POLICY,
+      g_param_spec_enum ("sizing-policy", "Sizing policy",
+          "Sizing policy to use for image scaling",
+          GST_TYPE_D3D11_COMPOSITOR_SIZING_POLICY, DEFAULT_PAD_SIZING_POLICY,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  vaggpadclass->prepare_frame =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_pad_prepare_frame);
+  vaggpadclass->clean_frame =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_pad_clean_frame);
+
gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_BLEND,
+      (GstPluginAPIFlags) 0);
+  gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_BLEND_OPERATION,
+      (GstPluginAPIFlags) 0);
+  gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_SIZING_POLICY,
+      (GstPluginAPIFlags) 0);
+}
+
+static void
+gst_d3d11_compositor_pad_init (GstD3D11CompositorPad * pad)
+{
+  pad->xpos = DEFAULT_PAD_XPOS;
+  pad->ypos = DEFAULT_PAD_YPOS;
+  pad->width = DEFAULT_PAD_WIDTH;
+  pad->height = DEFAULT_PAD_HEIGHT;
+  pad->alpha = DEFAULT_PAD_ALPHA;
+  pad->sizing_policy = DEFAULT_PAD_SIZING_POLICY;
+
+  gst_d3d11_compositor_pad_init_blend_options (pad);
+}
+
+static void
+gst_d3d11_compositor_pad_update_blend_function (GstD3D11CompositorPad * pad,
+    D3D11_BLEND * value, GstD3D11CompositorBlend new_value)
+{
+  D3D11_BLEND temp = gst_d3d11_compositor_blend_to_native (new_value);
+
+  if (temp == *value)
+    return;
+
+  *value = temp;
+  pad->blend_desc_updated = TRUE;
+}
+
+static void
+gst_d3d11_compositor_pad_update_blend_equation (GstD3D11CompositorPad * pad,
+    D3D11_BLEND_OP * value, GstD3D11CompositorBlendOperation new_value)
+{
+  D3D11_BLEND_OP temp =
+      gst_d3d11_compositor_blend_operation_to_native (new_value);
+
+  if (temp == *value)
+    return;
+
+  *value = temp;
+  pad->blend_desc_updated = TRUE;
+}
+
+static void
+gst_d3d11_compositor_pad_set_property (GObject * object, guint prop_id,
+    const GValue * value, GParamSpec * pspec)
+{
+  GstD3D11CompositorPad *pad = GST_D3D11_COMPOSITOR_PAD (object);
+
+  switch (prop_id) {
+    case PROP_PAD_XPOS:
+      pad->xpos = g_value_get_int (value);
+      pad->position_updated = TRUE;
+      break;
+    case PROP_PAD_YPOS:
+      pad->ypos = g_value_get_int (value);
+      pad->position_updated = TRUE;
+      break;
+    case PROP_PAD_WIDTH:
+      pad->width = g_value_get_int (value);
+      pad->position_updated = TRUE;
+      break;
+    case PROP_PAD_HEIGHT:
+      pad->height = g_value_get_int (value);
+      pad->position_updated = TRUE;
+      break;
+    case PROP_PAD_ALPHA:
+    {
+      gdouble alpha = g_value_get_double (value);
+      if (pad->alpha != alpha) {
+        pad->alpha_updated = TRUE;
+        pad->alpha = alpha;
+      }
+      break;
+    }
+    case PROP_PAD_BLEND_OP_RGB:
+      gst_d3d11_compositor_pad_update_blend_equation (pad, &pad->desc.BlendOp,
+          (GstD3D11CompositorBlendOperation) g_value_get_enum (value));
+      break;
+    case PROP_PAD_BLEND_OP_ALPHA:
+      gst_d3d11_compositor_pad_update_blend_equation (pad,
+          &pad->desc.BlendOpAlpha,
+          (GstD3D11CompositorBlendOperation) g_value_get_enum (value));
+      break;
+    case PROP_PAD_BLEND_SRC_RGB:
+      gst_d3d11_compositor_pad_update_blend_function (pad, &pad->desc.SrcBlend,
+          (GstD3D11CompositorBlend) g_value_get_enum (value));
+      break;
+    case PROP_PAD_BLEND_SRC_ALPHA:
+    {
+      GstD3D11CompositorBlend blend =
+          (GstD3D11CompositorBlend) g_value_get_enum (value);
+      if (blend == GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR ||
+          blend == GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR ||
+          blend == GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR ||
+          blend == GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR) {
+        g_warning ("%d is not allowed for %s", blend, pspec->name);
+      } else {
+        gst_d3d11_compositor_pad_update_blend_function (pad,
+            &pad->desc.SrcBlendAlpha, blend);
+      }
+      break;
+    }
+    case PROP_PAD_BLEND_DEST_RGB:
+      gst_d3d11_compositor_pad_update_blend_function (pad,
+          &pad->desc.DestBlend, (GstD3D11CompositorBlend) g_value_get_enum (value));
+      break;
+    case PROP_PAD_BLEND_DEST_ALPHA:
+    {
+      GstD3D11CompositorBlend blend =
+          (GstD3D11CompositorBlend) g_value_get_enum (value);
+      if (blend == GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR ||
+          blend == GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR ||
+          blend == GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR ||
+          blend == GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR) {
+        g_warning ("%d is not allowed for %s", blend, pspec->name);
+      } else {
+        gst_d3d11_compositor_pad_update_blend_function (pad,
+            &pad->desc.DestBlendAlpha, blend);
+      }
+      break;
+    }
+    case PROP_PAD_BLEND_FACTOR_RED:
+      pad->blend_factor[0] = g_value_get_float (value);
+      break;
+    case
PROP_PAD_BLEND_FACTOR_GREEN:
+      pad->blend_factor[1] = g_value_get_float (value);
+      break;
+    case PROP_PAD_BLEND_FACTOR_BLUE:
+      pad->blend_factor[2] = g_value_get_float (value);
+      break;
+    case PROP_PAD_BLEND_FACTOR_ALPHA:
+      pad->blend_factor[3] = g_value_get_float (value);
+      break;
+    case PROP_PAD_SIZING_POLICY:
+      pad->sizing_policy =
+          (GstD3D11CompositorSizingPolicy) g_value_get_enum (value);
+      pad->position_updated = TRUE;
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+gst_d3d11_compositor_pad_get_property (GObject * object, guint prop_id,
+    GValue * value, GParamSpec * pspec)
+{
+  GstD3D11CompositorPad *pad = GST_D3D11_COMPOSITOR_PAD (object);
+
+  switch (prop_id) {
+    case PROP_PAD_XPOS:
+      g_value_set_int (value, pad->xpos);
+      break;
+    case PROP_PAD_YPOS:
+      g_value_set_int (value, pad->ypos);
+      break;
+    case PROP_PAD_WIDTH:
+      g_value_set_int (value, pad->width);
+      break;
+    case PROP_PAD_HEIGHT:
+      g_value_set_int (value, pad->height);
+      break;
+    case PROP_PAD_ALPHA:
+      g_value_set_double (value, pad->alpha);
+      break;
+    case PROP_PAD_BLEND_OP_RGB:
+      g_value_set_enum (value,
+          gst_d3d11_compositor_blend_operation_from_native (pad->desc.BlendOp));
+      break;
+    case PROP_PAD_BLEND_OP_ALPHA:
+      g_value_set_enum (value,
+          gst_d3d11_compositor_blend_operation_from_native (pad->
+              desc.BlendOpAlpha));
+      break;
+    case PROP_PAD_BLEND_SRC_RGB:
+      g_value_set_enum (value,
+          gst_d3d11_compositor_blend_from_native (pad->desc.SrcBlend));
+      break;
+    case PROP_PAD_BLEND_SRC_ALPHA:
+      g_value_set_enum (value,
+          gst_d3d11_compositor_blend_from_native (pad->desc.SrcBlendAlpha));
+      break;
+    case PROP_PAD_BLEND_DEST_RGB:
+      g_value_set_enum (value,
+          gst_d3d11_compositor_blend_from_native (pad->desc.DestBlend));
+      break;
+    case PROP_PAD_BLEND_DEST_ALPHA:
+      g_value_set_enum (value,
+          gst_d3d11_compositor_blend_from_native (pad->desc.DestBlendAlpha));
+      break;
+    case PROP_PAD_BLEND_FACTOR_RED:
+      g_value_set_float (value, pad->blend_factor[0]);
+      break;
+    case PROP_PAD_BLEND_FACTOR_GREEN:
+      g_value_set_float (value, pad->blend_factor[1]);
+      break;
+    case PROP_PAD_BLEND_FACTOR_BLUE:
+      g_value_set_float (value, pad->blend_factor[2]);
+      break;
+    case PROP_PAD_BLEND_FACTOR_ALPHA:
+      g_value_set_float (value, pad->blend_factor[3]);
+      break;
+    case PROP_PAD_SIZING_POLICY:
+      g_value_set_enum (value, pad->sizing_policy);
+      break;
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+gst_d3d11_compositor_pad_init_blend_options (GstD3D11CompositorPad * pad)
+{
+  guint i;
+
+  pad->desc.BlendEnable = TRUE;
+  pad->desc.SrcBlend =
+      gst_d3d11_compositor_blend_to_native (DEFAULT_PAD_BLEND_SRC_RGB);
+  pad->desc.DestBlend =
+      gst_d3d11_compositor_blend_to_native (DEFAULT_PAD_BLEND_DEST_RGB);
+  pad->desc.BlendOp =
+      gst_d3d11_compositor_blend_operation_to_native (DEFAULT_PAD_BLEND_OP_RGB);
+  pad->desc.SrcBlendAlpha =
+      gst_d3d11_compositor_blend_to_native (DEFAULT_PAD_BLEND_SRC_ALPHA);
+  pad->desc.DestBlendAlpha =
+      gst_d3d11_compositor_blend_to_native (DEFAULT_PAD_BLEND_DEST_ALPHA);
+  pad->desc.BlendOpAlpha =
+      gst_d3d11_compositor_blend_operation_to_native
+      (DEFAULT_PAD_BLEND_OP_ALPHA);
+  pad->desc.RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
+
+  for (i = 0; i < G_N_ELEMENTS (pad->blend_factor); i++)
+    pad->blend_factor[i] = 1.0f;
+}
+
+static gboolean
+gst_d3d11_compositor_configure_fallback_pool (GstD3D11Compositor * self,
+    GstVideoInfo * info, gint bind_flags, GstBufferPool ** pool)
+{
+  GstD3D11AllocationParams *d3d11_params;
+  GstBufferPool *new_pool;
+  GstCaps *caps;
+
+  if (*pool) {
+    gst_buffer_pool_set_active (*pool, FALSE);
+    gst_clear_object (pool);
+  }
+
+  caps = gst_video_info_to_caps (info);
+  if (!caps) {
+    GST_ERROR_OBJECT (self, "Couldn't create caps from info");
+    return FALSE;
+  }
+
+  d3d11_params = gst_d3d11_allocation_params_new (self->device,
+      info, (GstD3D11AllocationFlags) 0, bind_flags);
+
+  new_pool =
gst_d3d11_buffer_pool_new_with_options (self->device,
+      caps, d3d11_params, 0, 0);
+  gst_caps_unref (caps);
+  gst_d3d11_allocation_params_free (d3d11_params);
+
+  if (!new_pool) {
+    GST_ERROR_OBJECT (self, "Failed to configure fallback pool");
+    return FALSE;
+  }
+
+  gst_buffer_pool_set_active (new_pool, TRUE);
+  *pool = new_pool;
+
+  return TRUE;
+}
+
+static gboolean
+gst_d3d11_compsitor_prepare_fallback_buffer (GstD3D11Compositor * self,
+    GstVideoInfo * info, gboolean is_input, GstBufferPool ** pool,
+    GstBuffer ** fallback_buffer)
+{
+  GstBuffer *new_buf = NULL;
+  gint bind_flags = D3D11_BIND_SHADER_RESOURCE;
+  guint i;
+
+  gst_clear_buffer (fallback_buffer);
+
+  if (!is_input)
+    bind_flags = D3D11_BIND_RENDER_TARGET;
+
+  if (*pool == NULL &&
+      !gst_d3d11_compositor_configure_fallback_pool (self, info,
+          bind_flags, pool)) {
+    GST_ERROR_OBJECT (self, "Couldn't configure fallback buffer pool");
+    return FALSE;
+  }
+
+  if (gst_buffer_pool_acquire_buffer (*pool, &new_buf, NULL) != GST_FLOW_OK) {
+    GST_ERROR_OBJECT (self, "Couldn't get fallback buffer from pool");
+    return FALSE;
+  }
+
+  for (i = 0; i < gst_buffer_n_memory (new_buf); i++) {
+    GstD3D11Memory *new_mem =
+        (GstD3D11Memory *) gst_buffer_peek_memory (new_buf, i);
+
+    if (is_input && !gst_d3d11_memory_get_shader_resource_view_size (new_mem)) {
+      GST_ERROR_OBJECT (self, "Couldn't prepare shader resource view");
+      gst_buffer_unref (new_buf);
+      return FALSE;
+    } else if (!is_input &&
+        !gst_d3d11_memory_get_render_target_view_size (new_mem)) {
+      GST_ERROR_OBJECT (self, "Couldn't prepare render target view");
+      gst_buffer_unref (new_buf);
+      return FALSE;
+    }
+  }
+
+  *fallback_buffer = new_buf;
+
+  return TRUE;
+}
+
+static gboolean
+gst_d3d11_compositor_copy_buffer (GstD3D11Compositor * self,
+    GstVideoInfo * info, GstBuffer * src_buf, GstBuffer * dest_buf,
+    gboolean do_device_copy)
+{
+  guint i;
+
+  if (do_device_copy) {
+    return gst_d3d11_buffer_copy_into (dest_buf, src_buf, info);
+  } else {
+    GstVideoFrame src_frame, dest_frame;
+
+    if (!gst_video_frame_map (&src_frame, info, src_buf,
+            (GstMapFlags) (GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) {
+      GST_ERROR_OBJECT (self, "Couldn't map input buffer");
+      return FALSE;
+    }
+
+    if (!gst_video_frame_map (&dest_frame, info, dest_buf,
+            (GstMapFlags) (GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) {
+      GST_ERROR_OBJECT (self, "Couldn't map fallback buffer");
+      gst_video_frame_unmap (&src_frame);
+      return FALSE;
+    }
+
+    for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&src_frame); i++) {
+      if (!gst_video_frame_copy_plane (&dest_frame, &src_frame, i)) {
+        GST_ERROR_OBJECT (self, "Couldn't copy %dth plane", i);
+
+        gst_video_frame_unmap (&dest_frame);
+        gst_video_frame_unmap (&src_frame);
+
+        return FALSE;
+      }
+    }
+
+    gst_video_frame_unmap (&dest_frame);
+    gst_video_frame_unmap (&src_frame);
+  }
+
+  return TRUE;
+}
+
+static gboolean
+gst_d3d11_compositor_check_d3d11_memory (GstD3D11Compositor * self,
+    GstBuffer * buffer, gboolean is_input, gboolean * view_available)
+{
+  guint i;
+  gboolean ret = TRUE;
+
+  *view_available = TRUE;
+
+  for (i = 0; i < gst_buffer_n_memory (buffer); i++) {
+    GstMemory *mem = gst_buffer_peek_memory (buffer, i);
+    GstD3D11Memory *dmem;
+
+    if (!gst_is_d3d11_memory (mem)) {
+      ret = FALSE;
+      goto done;
+    }
+
+    dmem = (GstD3D11Memory *) mem;
+    if (dmem->device != self->device) {
+      ret = FALSE;
+      goto done;
+    }
+
+    if (is_input) {
+      if (!gst_d3d11_memory_get_shader_resource_view_size (dmem))
+        *view_available = FALSE;
+    } else {
+      if (!gst_d3d11_memory_get_render_target_view_size (dmem))
+        *view_available = FALSE;
+    }
+  }
+
+done:
+  if (!ret)
+    *view_available = FALSE;
+
+  return ret;
+}
+
+static void
+gst_d3d11_compositor_pad_get_output_size (GstD3D11CompositorPad * comp_pad,
+    gint out_par_n, gint out_par_d, gint * width, gint * height,
+    gint * x_offset, gint * y_offset)
+{
+  GstVideoAggregatorPad *vagg_pad = GST_VIDEO_AGGREGATOR_PAD (comp_pad);
+  gint pad_width, pad_height;
+  guint
dar_n, dar_d;
+
+  *x_offset = 0;
+  *y_offset = 0;
+  *width = 0;
+  *height = 0;
+
+  /* FIXME: Anything better we can do here? */
+  if (!vagg_pad->info.finfo
+      || vagg_pad->info.finfo->format == GST_VIDEO_FORMAT_UNKNOWN) {
+    GST_DEBUG_OBJECT (comp_pad, "Have no caps yet");
+    return;
+  }
+
+  pad_width = comp_pad->width <= 0 ?
+      GST_VIDEO_INFO_WIDTH (&vagg_pad->info) : comp_pad->width;
+  pad_height = comp_pad->height <= 0 ?
+      GST_VIDEO_INFO_HEIGHT (&vagg_pad->info) : comp_pad->height;
+
+  if (pad_width == 0 || pad_height == 0)
+    return;
+
+  if (!gst_video_calculate_display_ratio (&dar_n, &dar_d, pad_width, pad_height,
+          GST_VIDEO_INFO_PAR_N (&vagg_pad->info),
+          GST_VIDEO_INFO_PAR_D (&vagg_pad->info), out_par_n, out_par_d)) {
+    GST_WARNING_OBJECT (comp_pad, "Cannot calculate display aspect ratio");
+    return;
+  }
+
+  GST_TRACE_OBJECT (comp_pad, "scaling %ux%u by %u/%u (%u/%u / %u/%u)",
+      pad_width, pad_height, dar_n, dar_d,
+      GST_VIDEO_INFO_PAR_N (&vagg_pad->info),
+      GST_VIDEO_INFO_PAR_D (&vagg_pad->info), out_par_n, out_par_d);
+
+  switch (comp_pad->sizing_policy) {
+    case GST_D3D11_COMPOSITOR_SIZING_POLICY_NONE:
+      /* Pick either height or width, whichever is an integer multiple of the
+       * display aspect ratio. However, prefer preserving the height to account
+       * for interlaced video. */
+      if (pad_height % dar_n == 0) {
+        pad_width = gst_util_uint64_scale_int (pad_height, dar_n, dar_d);
+      } else if (pad_width % dar_d == 0) {
+        pad_height = gst_util_uint64_scale_int (pad_width, dar_d, dar_n);
+      } else {
+        pad_width = gst_util_uint64_scale_int (pad_height, dar_n, dar_d);
+      }
+      break;
+    case GST_D3D11_COMPOSITOR_SIZING_POLICY_KEEP_ASPECT_RATIO:
+    {
+      gint from_dar_n, from_dar_d, to_dar_n, to_dar_d, num, den;
+
+      /* Calculate DAR again with actual video size */
+      if (!gst_util_fraction_multiply (GST_VIDEO_INFO_WIDTH (&vagg_pad->info),
+              GST_VIDEO_INFO_HEIGHT (&vagg_pad->info),
+              GST_VIDEO_INFO_PAR_N (&vagg_pad->info),
+              GST_VIDEO_INFO_PAR_D (&vagg_pad->info), &from_dar_n,
+              &from_dar_d)) {
+        from_dar_n = from_dar_d = -1;
+      }
+
+      if (!gst_util_fraction_multiply (pad_width, pad_height,
+              out_par_n, out_par_d, &to_dar_n, &to_dar_d)) {
+        to_dar_n = to_dar_d = -1;
+      }
+
+      if (from_dar_n != to_dar_n || from_dar_d != to_dar_d) {
+        /* Calculate new output resolution */
+        if (from_dar_n != -1 && from_dar_d != -1
+            && gst_util_fraction_multiply (from_dar_n, from_dar_d,
+                out_par_d, out_par_n, &num, &den)) {
+          GstVideoRectangle src_rect, dst_rect, rst_rect;
+
+          src_rect.h = gst_util_uint64_scale_int (pad_width, den, num);
+          if (src_rect.h == 0) {
+            pad_width = 0;
+            pad_height = 0;
+            break;
+          }
+
+          src_rect.x = src_rect.y = 0;
+          src_rect.w = pad_width;
+
+          dst_rect.x = dst_rect.y = 0;
+          dst_rect.w = pad_width;
+          dst_rect.h = pad_height;
+
+          /* Scale rect to be centered in destination rect */
+          gst_video_center_rect (&src_rect, &dst_rect, &rst_rect, TRUE);
+
+          GST_LOG_OBJECT (comp_pad,
+              "Re-calculated size %dx%d -> %dx%d (x-offset %d, y-offset %d)",
+              pad_width, pad_height, rst_rect.w, rst_rect.h, rst_rect.x,
+              rst_rect.y);
+
+          *x_offset = rst_rect.x;
+          *y_offset = rst_rect.y;
+          pad_width = rst_rect.w;
+          pad_height = rst_rect.h;
+        } else {
+          GST_WARNING_OBJECT (comp_pad, "Failed to calculate output size");
+
+          *x_offset = 0;
+          *y_offset = 0;
+          pad_width = 0;
pad_height = 0;
+        }
+      }
+      break;
+    }
+  }
+
+  *width = pad_width;
+  *height = pad_height;
+}
+
+static GstVideoRectangle
+clamp_rectangle (gint x, gint y, gint w, gint h, gint outer_width,
+    gint outer_height)
+{
+  gint x2 = x + w;
+  gint y2 = y + h;
+  GstVideoRectangle clamped;
+
+  /* Clamp the x/y coordinates of this frame to the output boundaries to cover
+   * the case where (say, with negative xpos/ypos or w/h greater than the output
+   * size) the non-obscured portion of the frame could be outside the bounds of
+   * the video itself and hence not visible at all */
+  clamped.x = CLAMP (x, 0, outer_width);
+  clamped.y = CLAMP (y, 0, outer_height);
+  clamped.w = CLAMP (x2, 0, outer_width) - clamped.x;
+  clamped.h = CLAMP (y2, 0, outer_height) - clamped.y;
+
+  return clamped;
+}
+
+static gboolean
+gst_d3d11_compositor_pad_check_frame_obscured (GstVideoAggregatorPad * pad,
+    GstVideoAggregator * vagg)
+{
+  GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (pad);
+  gint width, height;
+  GstVideoInfo *info = &vagg->info;
+  /* The rectangle representing this frame, clamped to the video's boundaries.
+   * Due to the clamping, this is different from the frame width/height above. */
+  GstVideoRectangle frame_rect;
+  gint x_offset, y_offset;
+
+  /* There are two types of width/height here:
+   * 1. GST_VIDEO_FRAME_WIDTH/HEIGHT:
+   *     The frame width/height (same as pad->info.height/width;
+   *     see gst_video_frame_map())
+   * 2. cpad->width/height:
+   *     The optional pad property for scaling the frame (if zero, the video is
+   *     left unscaled)
+   */
+
+  gst_d3d11_compositor_pad_get_output_size (cpad, GST_VIDEO_INFO_PAR_N (info),
+      GST_VIDEO_INFO_PAR_D (info), &width, &height, &x_offset, &y_offset);
+
+  frame_rect = clamp_rectangle (cpad->xpos + x_offset, cpad->ypos + y_offset,
+      width, height, GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info));
+
+  if (frame_rect.w == 0 || frame_rect.h == 0) {
+    GST_DEBUG_OBJECT (pad, "Resulting frame is zero-width or zero-height "
+        "(w: %i, h: %i), skipping", frame_rect.w, frame_rect.h);
+    return TRUE;
+  }
+
+  return FALSE;
+}
+
+static gboolean
+gst_d3d11_compositor_pad_prepare_frame (GstVideoAggregatorPad * pad,
+    GstVideoAggregator * vagg, GstBuffer * buffer,
+    GstVideoFrame * prepared_frame)
+{
+  GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (vagg);
+  GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (pad);
+  GstBuffer *target_buf = buffer;
+  gboolean do_device_copy = FALSE;
+
+  /* Skip this frame */
+  if (gst_d3d11_compositor_pad_check_frame_obscured (pad, vagg))
+    return TRUE;
+
+  /* Use a fallback buffer when the input buffer is:
+   * - non-d3d11 memory,
+   * - or from a different d3d11 device,
+   * - or not bound as a shader resource
+   */
+  if (!gst_d3d11_compositor_check_d3d11_memory (self,
+          buffer, TRUE, &do_device_copy) || !do_device_copy) {
+    if (!gst_d3d11_compsitor_prepare_fallback_buffer (self, &pad->info, TRUE,
+            &cpad->fallback_pool, &cpad->fallback_buf)) {
+      GST_ERROR_OBJECT (self, "Couldn't prepare fallback buffer");
+      return FALSE;
+    }
+
+    if (!gst_d3d11_compositor_copy_buffer (self, &pad->info, buffer,
+            cpad->fallback_buf, do_device_copy)) {
+      GST_ERROR_OBJECT (self, "Couldn't copy input buffer to fallback buffer");
+      gst_clear_buffer (&cpad->fallback_buf);
+      return FALSE;
+    }
+
+    target_buf = cpad->fallback_buf;
+  }
+
+  if (!gst_video_frame_map (prepared_frame, &pad->info, target_buf,
+          (GstMapFlags) (GST_MAP_READ | GST_MAP_D3D11)))
{ + GST_WARNING_OBJECT (pad, "Couldn't map input buffer"); + return FALSE; + } + + return TRUE; +} + +static void +gst_d3d11_compositor_pad_clean_frame (GstVideoAggregatorPad * pad, + GstVideoAggregator * vagg, GstVideoFrame * prepared_frame) +{ + GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (pad); + + GST_VIDEO_AGGREGATOR_PAD_CLASS (parent_pad_class)->clean_frame (pad, + vagg, prepared_frame); + + gst_clear_buffer (&cpad->fallback_buf); +} + +static gboolean +gst_d3d11_compositor_pad_setup_converter (GstVideoAggregatorPad * pad, + GstVideoAggregator * vagg) +{ + GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (pad); + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (vagg); + RECT rect; + gint width, height; + GstVideoInfo *info = &vagg->info; + GstVideoRectangle frame_rect; + gboolean is_first = FALSE; + gint x_offset, y_offset; +#ifndef GST_DISABLE_GST_DEBUG + guint zorder = 0; +#endif + + if (!cpad->convert || self->reconfigured) { + GstStructure *config; + + if (cpad->convert) + gst_d3d11_converter_free (cpad->convert); + + config = gst_structure_new_empty ("config"); + if (cpad->alpha <= 1.0) { + gst_structure_set (config, GST_D3D11_CONVERTER_OPT_ALPHA_VALUE, + G_TYPE_DOUBLE, cpad->alpha, nullptr); + } + + cpad->convert = + gst_d3d11_converter_new (self->device, &pad->info, &vagg->info, config); + + if (!cpad->convert) { + GST_ERROR_OBJECT (pad, "Couldn't create converter"); + return FALSE; + } + + is_first = TRUE; + } else if (cpad->alpha_updated) { + GstStructure *config; + + config = gst_structure_new_empty ("config"); + if (cpad->alpha <= 1.0) { + gst_structure_set (config, GST_D3D11_CONVERTER_OPT_ALPHA_VALUE, + G_TYPE_DOUBLE, cpad->alpha, nullptr); + } + + gst_d3d11_converter_update_config (cpad->convert, config); + cpad->alpha_updated = FALSE; + } + + if (!cpad->blend || cpad->blend_desc_updated) { + HRESULT hr; + D3D11_BLEND_DESC desc = { 0, }; + ID3D11BlendState *blend = NULL; + ID3D11Device *device_handle = + 
gst_d3d11_device_get_device_handle (self->device); + + GST_D3D11_CLEAR_COM (cpad->blend); + + desc.AlphaToCoverageEnable = FALSE; + desc.IndependentBlendEnable = FALSE; + desc.RenderTarget[0] = cpad->desc; + + hr = device_handle->CreateBlendState (&desc, &blend); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (pad, "Couldn't create blend state, hr: 0x%x", + (guint) hr); + return FALSE; + } + + cpad->blend = blend; + } + + if (!is_first && !cpad->position_updated) + return TRUE; + + gst_d3d11_compositor_pad_get_output_size (cpad, GST_VIDEO_INFO_PAR_N (info), + GST_VIDEO_INFO_PAR_D (info), &width, &height, &x_offset, &y_offset); + + frame_rect = clamp_rectangle (cpad->xpos + x_offset, cpad->ypos + y_offset, + width, height, GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info)); + + rect.left = frame_rect.x; + rect.top = frame_rect.y; + rect.right = frame_rect.x + frame_rect.w; + rect.bottom = frame_rect.y + frame_rect.h; + +#ifndef GST_DISABLE_GST_DEBUG + g_object_get (pad, "zorder", &zorder, NULL); + + GST_LOG_OBJECT (pad, "Update position, pad-xpos %d, pad-ypos %d, " + "pad-zorder %d, pad-width %d, pad-height %d, in-resolution %dx%d, " + "out-resolution %dx%d, dst-{left,top,right,bottom} %d-%d-%d-%d", + cpad->xpos, cpad->ypos, zorder, cpad->width, cpad->height, + GST_VIDEO_INFO_WIDTH (&pad->info), GST_VIDEO_INFO_HEIGHT (&pad->info), + GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info), + (gint) rect.left, (gint) rect.top, (gint) rect.right, (gint) rect.bottom); +#endif + + cpad->position_updated = FALSE; + + return gst_d3d11_converter_update_dest_rect (cpad->convert, &rect); +} + +static GstStaticCaps pad_template_caps = +GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, "{ RGBA, BGRA }")); + +enum +{ + PROP_0, + PROP_ADAPTER, + PROP_BACKGROUND, +}; + +#define DEFAULT_ADAPTER -1 +#define DEFAULT_BACKGROUND GST_D3D11_COMPOSITOR_BACKGROUND_CHECKER + +static void 
gst_d3d11_compositor_child_proxy_init (gpointer g_iface, + gpointer iface_data); +static void gst_d3d11_compositor_dispose (GObject * object); +static void gst_d3d11_compositor_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_d3d11_compositor_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); + +static GstPad *gst_d3d11_compositor_request_new_pad (GstElement * element, + GstPadTemplate * templ, const gchar * name, const GstCaps * caps); +static void gst_d3d11_compositor_release_pad (GstElement * element, + GstPad * pad); +static void gst_d3d11_compositor_set_context (GstElement * element, + GstContext * context); + +static gboolean gst_d3d11_compositor_start (GstAggregator * aggregator); +static gboolean gst_d3d11_compositor_stop (GstAggregator * aggregator); +static gboolean gst_d3d11_compositor_sink_query (GstAggregator * aggregator, + GstAggregatorPad * pad, GstQuery * query); +static gboolean gst_d3d11_compositor_src_query (GstAggregator * aggregator, + GstQuery * query); +static GstCaps *gst_d3d11_compositor_fixate_src_caps (GstAggregator * + aggregator, GstCaps * caps); +static gboolean gst_d3d11_compositor_propose_allocation (GstAggregator * + aggregator, GstAggregatorPad * pad, GstQuery * decide_query, + GstQuery * query); +static gboolean gst_d3d11_compositor_decide_allocation (GstAggregator * + aggregator, GstQuery * query); +static GstFlowReturn +gst_d3d11_compositor_aggregate_frames (GstVideoAggregator * vagg, + GstBuffer * outbuf); +static GstFlowReturn +gst_d3d11_compositor_create_output_buffer (GstVideoAggregator * vagg, + GstBuffer ** outbuffer); + +#define gst_d3d11_compositor_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstD3D11Compositor, gst_d3d11_compositor, + GST_TYPE_VIDEO_AGGREGATOR, G_IMPLEMENT_INTERFACE (GST_TYPE_CHILD_PROXY, + gst_d3d11_compositor_child_proxy_init)); + +static void +gst_d3d11_compositor_class_init (GstD3D11CompositorClass 
* klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstAggregatorClass *aggregator_class = GST_AGGREGATOR_CLASS (klass); + GstVideoAggregatorClass *vagg_class = GST_VIDEO_AGGREGATOR_CLASS (klass); + GstCaps *caps; + + gobject_class->dispose = gst_d3d11_compositor_dispose; + gobject_class->set_property = gst_d3d11_compositor_set_property; + gobject_class->get_property = gst_d3d11_compositor_get_property; + + g_object_class_install_property (gobject_class, PROP_ADAPTER, + g_param_spec_int ("adapter", "Adapter", + "Adapter index for creating device (-1 for default)", + -1, G_MAXINT32, DEFAULT_ADAPTER, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_BACKGROUND, + g_param_spec_enum ("background", "Background", "Background type", + GST_TYPE_D3D11_COMPOSITOR_BACKGROUND, + DEFAULT_BACKGROUND, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + element_class->request_new_pad = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_request_new_pad); + element_class->release_pad = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_release_pad); + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_set_context); + + aggregator_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_compositor_start); + aggregator_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_compositor_stop); + aggregator_class->sink_query = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_sink_query); + aggregator_class->src_query = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_src_query); + aggregator_class->fixate_src_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_fixate_src_caps); + aggregator_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_propose_allocation); + aggregator_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_decide_allocation); + + vagg_class->aggregate_frames = + GST_DEBUG_FUNCPTR 
(gst_d3d11_compositor_aggregate_frames); + vagg_class->create_output_buffer = + GST_DEBUG_FUNCPTR (gst_d3d11_compositor_create_output_buffer); + + caps = gst_d3d11_get_updated_template_caps (&pad_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new_with_gtype ("sink_%u", GST_PAD_SINK, GST_PAD_REQUEST, + caps, GST_TYPE_D3D11_COMPOSITOR_PAD)); + + gst_element_class_add_pad_template (element_class, + gst_pad_template_new_with_gtype ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + caps, GST_TYPE_AGGREGATOR_PAD)); + gst_caps_unref (caps); + + gst_element_class_set_static_metadata (element_class, "Direct3D11 Compositor", + "Filter/Editor/Video/Compositor", + "A Direct3D11 compositor", "Seungha Yang <seungha@centricular.com>"); + + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_BACKGROUND, + (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_PAD, + (GstPluginAPIFlags) 0); +} + +static void +gst_d3d11_compositor_init (GstD3D11Compositor * self) +{ + self->adapter = DEFAULT_ADAPTER; + self->background = DEFAULT_BACKGROUND; +} + +static void +gst_d3d11_compositor_dispose (GObject * object) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (object); + + gst_clear_object (&self->device); + gst_clear_buffer (&self->fallback_buf); + gst_clear_object (&self->fallback_pool); + g_clear_pointer (&self->checker_background, gst_d3d11_quad_free); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_d3d11_compositor_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (object); + + switch (prop_id) { + case PROP_ADAPTER: + self->adapter = g_value_get_int (value); + break; + case PROP_BACKGROUND: + self->background = + (GstD3D11CompositorBackground) g_value_get_enum (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void 
+gst_d3d11_compositor_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (object); + + switch (prop_id) { + case PROP_ADAPTER: + g_value_set_int (value, self->adapter); + break; + case PROP_BACKGROUND: + g_value_set_enum (value, self->background); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static GObject * +gst_d3d11_compositor_child_proxy_get_child_by_index (GstChildProxy * proxy, + guint index) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (proxy); + GObject *obj = NULL; + + GST_OBJECT_LOCK (self); + obj = (GObject *) g_list_nth_data (GST_ELEMENT_CAST (self)->sinkpads, index); + if (obj) + gst_object_ref (obj); + GST_OBJECT_UNLOCK (self); + + return obj; +} + +static guint +gst_d3d11_compositor_child_proxy_get_children_count (GstChildProxy * proxy) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (proxy); + guint count = 0; + + GST_OBJECT_LOCK (self); + count = GST_ELEMENT_CAST (self)->numsinkpads; + GST_OBJECT_UNLOCK (self); + GST_INFO_OBJECT (self, "Children Count: %d", count); + + return count; +} + +static void +gst_d3d11_compositor_child_proxy_init (gpointer g_iface, gpointer iface_data) +{ + GstChildProxyInterface *iface = (GstChildProxyInterface *) g_iface; + + iface->get_child_by_index = + gst_d3d11_compositor_child_proxy_get_child_by_index; + iface->get_children_count = + gst_d3d11_compositor_child_proxy_get_children_count; +} + +static GstPad * +gst_d3d11_compositor_request_new_pad (GstElement * element, + GstPadTemplate * templ, const gchar * name, const GstCaps * caps) +{ + GstPad *pad; + + pad = GST_ELEMENT_CLASS (parent_class)->request_new_pad (element, + templ, name, caps); + + if (pad == NULL) + goto could_not_create; + + gst_child_proxy_child_added (GST_CHILD_PROXY (element), G_OBJECT (pad), + GST_OBJECT_NAME (pad)); + + GST_DEBUG_OBJECT (element, "Created new pad %s:%s", GST_DEBUG_PAD_NAME 
(pad)); + + return pad; + +could_not_create: + { + GST_DEBUG_OBJECT (element, "could not create/add pad"); + return NULL; + } +} + +static gboolean +gst_d3d11_compositor_pad_clear_resource (GstD3D11Compositor * self, + GstD3D11CompositorPad * cpad, gpointer user_data) +{ + gst_clear_buffer (&cpad->fallback_buf); + if (cpad->fallback_pool) { + gst_buffer_pool_set_active (cpad->fallback_pool, FALSE); + gst_clear_object (&cpad->fallback_pool); + } + g_clear_pointer (&cpad->convert, gst_d3d11_converter_free); + GST_D3D11_CLEAR_COM (cpad->blend); + + return TRUE; +} + +static void +gst_d3d11_compositor_release_pad (GstElement * element, GstPad * pad) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (element); + GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (pad); + + GST_DEBUG_OBJECT (self, "Releasing pad %s:%s", GST_DEBUG_PAD_NAME (pad)); + + gst_child_proxy_child_removed (GST_CHILD_PROXY (self), G_OBJECT (pad), + GST_OBJECT_NAME (pad)); + + gst_d3d11_compositor_pad_clear_resource (self, cpad, NULL); + + GST_ELEMENT_CLASS (parent_class)->release_pad (element, pad); +} + +static void +gst_d3d11_compositor_set_context (GstElement * element, GstContext * context) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (element); + + gst_d3d11_handle_set_context (element, context, self->adapter, &self->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_compositor_start (GstAggregator * aggregator) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (aggregator); + + if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), + self->adapter, &self->device)) { + GST_ERROR_OBJECT (self, "Failed to get D3D11 device"); + return FALSE; + } + + return GST_AGGREGATOR_CLASS (parent_class)->start (aggregator); +} + +static gboolean +gst_d3d11_compositor_stop (GstAggregator * aggregator) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (aggregator); + + g_clear_pointer (&self->checker_background, 
gst_d3d11_quad_free); + gst_clear_object (&self->device); + + return GST_AGGREGATOR_CLASS (parent_class)->stop (aggregator); +} + +static GstCaps * +gst_d3d11_compositor_sink_getcaps (GstPad * pad, GstCaps * filter) +{ + GstCaps *sinkcaps; + GstCaps *template_caps; + GstCaps *filtered_caps; + GstCaps *returned_caps; + + template_caps = gst_pad_get_pad_template_caps (pad); + + sinkcaps = gst_pad_get_current_caps (pad); + if (sinkcaps == NULL) { + sinkcaps = gst_caps_ref (template_caps); + } else { + sinkcaps = gst_caps_merge (sinkcaps, gst_caps_ref (template_caps)); + } + + if (filter) { + filtered_caps = gst_caps_intersect (sinkcaps, filter); + gst_caps_unref (sinkcaps); + } else { + filtered_caps = sinkcaps; /* pass ownership */ + } + + returned_caps = gst_caps_intersect (filtered_caps, template_caps); + + gst_caps_unref (template_caps); + gst_caps_unref (filtered_caps); + + GST_DEBUG_OBJECT (pad, "returning %" GST_PTR_FORMAT, returned_caps); + + return returned_caps; +} + +static gboolean +gst_d3d11_compositor_sink_acceptcaps (GstPad * pad, GstCaps * caps) +{ + gboolean ret; + GstCaps *template_caps; + + GST_DEBUG_OBJECT (pad, "try accept caps of %" GST_PTR_FORMAT, caps); + + template_caps = gst_pad_get_pad_template_caps (pad); + template_caps = gst_caps_make_writable (template_caps); + + ret = gst_caps_can_intersect (caps, template_caps); + GST_DEBUG_OBJECT (pad, "%saccepted caps %" GST_PTR_FORMAT, + (ret ? 
"" : "not "), caps); + gst_caps_unref (template_caps); + + return ret; +} + +static gboolean +gst_d3d11_compositor_sink_query (GstAggregator * aggregator, + GstAggregatorPad * pad, GstQuery * query) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (aggregator); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + { + gboolean ret; + ret = gst_d3d11_handle_context_query (GST_ELEMENT (aggregator), query, + self->device); + if (ret) + return TRUE; + break; + } + case GST_QUERY_CAPS: + { + GstCaps *filter, *caps; + + gst_query_parse_caps (query, &filter); + caps = gst_d3d11_compositor_sink_getcaps (GST_PAD (pad), filter); + gst_query_set_caps_result (query, caps); + gst_caps_unref (caps); + return TRUE; + } + case GST_QUERY_ACCEPT_CAPS: + { + GstCaps *caps; + gboolean ret; + + gst_query_parse_accept_caps (query, &caps); + ret = gst_d3d11_compositor_sink_acceptcaps (GST_PAD (pad), caps); + gst_query_set_accept_caps_result (query, ret); + return TRUE; + } + default: + break; + } + + return GST_AGGREGATOR_CLASS (parent_class)->sink_query (aggregator, + pad, query); +} + +static gboolean +gst_d3d11_compositor_src_query (GstAggregator * aggregator, GstQuery * query) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (aggregator); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + { + gboolean ret; + ret = gst_d3d11_handle_context_query (GST_ELEMENT (aggregator), query, + self->device); + if (ret) + return TRUE; + break; + } + default: + break; + } + + return GST_AGGREGATOR_CLASS (parent_class)->src_query (aggregator, query); +} + +static GstCaps * +gst_d3d11_compositor_fixate_src_caps (GstAggregator * aggregator, + GstCaps * caps) +{ + GstVideoAggregator *vagg = GST_VIDEO_AGGREGATOR (aggregator); + GList *l; + gint best_width = -1, best_height = -1; + gint best_fps_n = -1, best_fps_d = -1; + gint par_n, par_d; + gdouble best_fps = 0.; + GstCaps *ret = NULL; + GstStructure *s; + + ret = gst_caps_make_writable (caps); + + /* we need this to 
calculate how large to make the output frame */ + s = gst_caps_get_structure (ret, 0); + if (gst_structure_has_field (s, "pixel-aspect-ratio")) { + gst_structure_fixate_field_nearest_fraction (s, "pixel-aspect-ratio", 1, 1); + gst_structure_get_fraction (s, "pixel-aspect-ratio", &par_n, &par_d); + } else { + par_n = par_d = 1; + } + + GST_OBJECT_LOCK (vagg); + for (l = GST_ELEMENT (vagg)->sinkpads; l; l = l->next) { + GstVideoAggregatorPad *vaggpad = GST_VIDEO_AGGREGATOR_PAD (l->data); + GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (vaggpad); + gint this_width, this_height; + gint width, height; + gint fps_n, fps_d; + gdouble cur_fps; + gint x_offset; + gint y_offset; + + fps_n = GST_VIDEO_INFO_FPS_N (&vaggpad->info); + fps_d = GST_VIDEO_INFO_FPS_D (&vaggpad->info); + gst_d3d11_compositor_pad_get_output_size (cpad, + par_n, par_d, &width, &height, &x_offset, &y_offset); + + if (width == 0 || height == 0) + continue; + + /* {x,y}_offset represent padding size of each top and left area. 
+ * To calculate total resolution, count bottom and right padding area + * as well here */ + this_width = width + MAX (cpad->xpos + 2 * x_offset, 0); + this_height = height + MAX (cpad->ypos + 2 * y_offset, 0); + + if (best_width < this_width) + best_width = this_width; + if (best_height < this_height) + best_height = this_height; + + if (fps_d == 0) + cur_fps = 0.0; + else + gst_util_fraction_to_double (fps_n, fps_d, &cur_fps); + + if (best_fps < cur_fps) { + best_fps = cur_fps; + best_fps_n = fps_n; + best_fps_d = fps_d; + } + } + GST_OBJECT_UNLOCK (vagg); + + if (best_fps_n <= 0 || best_fps_d <= 0 || best_fps == 0.0) { + best_fps_n = 25; + best_fps_d = 1; + best_fps = 25.0; + } + + gst_structure_fixate_field_nearest_int (s, "width", best_width); + gst_structure_fixate_field_nearest_int (s, "height", best_height); + gst_structure_fixate_field_nearest_fraction (s, "framerate", best_fps_n, + best_fps_d); + ret = gst_caps_fixate (ret); + + GST_LOG_OBJECT (aggregator, "Fixated caps %" GST_PTR_FORMAT, ret); + + return ret; +} + +static gboolean +gst_d3d11_compositor_propose_allocation (GstAggregator * aggregator, + GstAggregatorPad * pad, GstQuery * decide_query, GstQuery * query) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (aggregator); + GstVideoInfo info; + GstBufferPool *pool; + GstCaps *caps; + guint size; + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE; + + if (!gst_video_info_from_caps (&info, caps)) + return FALSE; + + if (gst_query_get_n_allocation_pools (query) == 0) { + GstD3D11AllocationParams *d3d11_params; + GstStructure *config; + + d3d11_params = gst_d3d11_allocation_params_new (self->device, &info, + (GstD3D11AllocationFlags) 0, D3D11_BIND_SHADER_RESOURCE); + + pool = gst_d3d11_buffer_pool_new_with_options (self->device, + caps, d3d11_params, 0, 0); + gst_d3d11_allocation_params_free (d3d11_params); + + if (!pool) { + GST_ERROR_OBJECT (self, "Failed to create buffer pool"); + return FALSE; + } + + /* 
d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, + nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + + gst_query_add_allocation_pool (query, pool, size, 0, 0); + gst_object_unref (pool); + } + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + + return TRUE; +} + +static gboolean +gst_d3d11_compositor_decide_allocation (GstAggregator * aggregator, + GstQuery * query) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (aggregator); + GstCaps *caps; + GstBufferPool *pool = NULL; + guint n, size, min, max; + GstVideoInfo info; + GstStructure *config; + GstD3D11AllocationParams *d3d11_params; + + gst_query_parse_allocation (query, &caps, NULL); + + if (!caps) { + GST_DEBUG_OBJECT (self, "No output caps"); + return FALSE; + } + + gst_video_info_from_caps (&info, caps); + n = gst_query_get_n_allocation_pools (query); + if (n > 0) + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + + /* create our own pool */ + if (pool) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != self->device) + gst_clear_object (&pool); + } + } + + if (!pool) { + pool = gst_d3d11_buffer_pool_new (self->device); + + min = max = 0; + size = (guint) info.size; + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, caps, size, min, max); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (self->device, + &info, (GstD3D11AllocationFlags) 0, D3D11_BIND_RENDER_TARGET); + } else { + guint i; + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { + 
d3d11_params->desc[i].BindFlags |= D3D11_BIND_RENDER_TARGET; + } + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + gst_buffer_pool_set_config (pool, config); + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + + if (n > 0) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + gst_object_unref (pool); + + self->reconfigured = TRUE; + + return TRUE; +} + +typedef struct +{ + struct + { + FLOAT x; + FLOAT y; + FLOAT z; + } position; + struct + { + FLOAT u; + FLOAT v; + } texture; +} VertexData; + +static GstD3D11Quad * +gst_d3d11_compositor_create_checker_quad (GstD3D11Compositor * self) +{ + GstD3D11Quad *quad = NULL; + VertexData *vertex_data; + WORD *indices; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + D3D11_MAPPED_SUBRESOURCE map; + D3D11_INPUT_ELEMENT_DESC input_desc; + D3D11_BUFFER_DESC buffer_desc; + /* *INDENT-OFF* */ + ComPtr<ID3D11Buffer> vertex_buffer; + ComPtr<ID3D11Buffer> index_buffer; + ComPtr<ID3D11PixelShader> ps; + ComPtr<ID3D11VertexShader> vs; + ComPtr<ID3D11InputLayout> layout; + /* *INDENT-ON* */ + HRESULT hr; + + device_handle = gst_d3d11_device_get_device_handle (self->device); + context_handle = gst_d3d11_device_get_device_context_handle (self->device); + + if (!gst_d3d11_create_pixel_shader (self->device, checker_ps_src, &ps)) { + GST_ERROR_OBJECT (self, "Couldn't setup pixel shader"); + return NULL; + } + + memset (&input_desc, 0, sizeof (D3D11_INPUT_ELEMENT_DESC)); + input_desc.SemanticName = "POSITION"; + input_desc.SemanticIndex = 0; + input_desc.Format = DXGI_FORMAT_R32G32B32_FLOAT; + input_desc.InputSlot = 0; + 
input_desc.AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; + input_desc.InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; + input_desc.InstanceDataStepRate = 0; + + if (!gst_d3d11_create_vertex_shader (self->device, checker_vs_src, + &input_desc, 1, &vs, &layout)) { + GST_ERROR_OBJECT (self, "Couldn't setup vertex shader"); + return NULL; + } + + memset (&buffer_desc, 0, sizeof (D3D11_BUFFER_DESC)); + buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (VertexData) * 4; + buffer_desc.BindFlags = D3D11_BIND_VERTEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + hr = device_handle->CreateBuffer (&buffer_desc, NULL, &vertex_buffer); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, + "Couldn't create vertex buffer, hr: 0x%x", (guint) hr); + return NULL; + } + + hr = context_handle->Map (vertex_buffer.Get (), + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Couldn't map vertex buffer, hr: 0x%x", (guint) hr); + return NULL; + } + + vertex_data = (VertexData *) map.pData; + /* bottom left */ + vertex_data[0].position.x = -1.0f; + vertex_data[0].position.y = -1.0f; + vertex_data[0].position.z = 0.0f; + vertex_data[0].texture.u = 0.0f; + vertex_data[0].texture.v = 1.0f; + + /* top left */ + vertex_data[1].position.x = -1.0f; + vertex_data[1].position.y = 1.0f; + vertex_data[1].position.z = 0.0f; + vertex_data[1].texture.u = 0.0f; + vertex_data[1].texture.v = 0.0f; + + /* top right */ + vertex_data[2].position.x = 1.0f; + vertex_data[2].position.y = 1.0f; + vertex_data[2].position.z = 0.0f; + vertex_data[2].texture.u = 1.0f; + vertex_data[2].texture.v = 0.0f; + + /* bottom right */ + vertex_data[3].position.x = 1.0f; + vertex_data[3].position.y = -1.0f; + vertex_data[3].position.z = 0.0f; + vertex_data[3].texture.u = 1.0f; + vertex_data[3].texture.v = 1.0f; + + context_handle->Unmap (vertex_buffer.Get (), 0); + 
buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (WORD) * 6; + buffer_desc.BindFlags = D3D11_BIND_INDEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + hr = device_handle->CreateBuffer (&buffer_desc, NULL, &index_buffer); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, + "Couldn't create index buffer, hr: 0x%x", (guint) hr); + return NULL; + } + + hr = context_handle->Map (index_buffer.Get (), + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Couldn't map index buffer, hr: 0x%x", (guint) hr); + return NULL; + } + + indices = (WORD *) map.pData; + + /* clockwise indexing */ + indices[0] = 0; /* bottom left */ + indices[1] = 1; /* top left */ + indices[2] = 2; /* top right */ + + indices[3] = 3; /* bottom right */ + indices[4] = 0; /* bottom left */ + indices[5] = 2; /* top right */ + + context_handle->Unmap (index_buffer.Get (), 0); + + quad = gst_d3d11_quad_new (self->device, + ps.Get (), vs.Get (), layout.Get (), nullptr, 0, + vertex_buffer.Get (), sizeof (VertexData), index_buffer.Get (), + DXGI_FORMAT_R16_UINT, 6); + if (!quad) { + GST_ERROR_OBJECT (self, "Couldn't setup quad"); + return NULL; + } + + return quad; +} + +static gboolean +gst_d3d11_compositor_draw_background_checker (GstD3D11Compositor * self, + ID3D11RenderTargetView * rtv) +{ + if (!self->checker_background) { + GstVideoInfo *info = &GST_VIDEO_AGGREGATOR_CAST (self)->info; + + self->checker_background = gst_d3d11_compositor_create_checker_quad (self); + + if (!self->checker_background) + return FALSE; + + self->viewport.TopLeftX = 0; + self->viewport.TopLeftY = 0; + self->viewport.Width = GST_VIDEO_INFO_WIDTH (info); + self->viewport.Height = GST_VIDEO_INFO_HEIGHT (info); + self->viewport.MinDepth = 0.0f; + self->viewport.MaxDepth = 1.0f; + } + + return gst_d3d11_draw_quad_unlocked (self->checker_background, + &self->viewport, 1, NULL, 0, &rtv, 1, NULL, NULL, NULL, 
0); +} + +/* Must be called with d3d11 device lock */ +static gboolean +gst_d3d11_compositor_draw_background (GstD3D11Compositor * self, + ID3D11RenderTargetView * rtv) +{ + ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (self->device); + FLOAT rgba[4] = { 0.0f, 0.0f, 0.0f, 1.0f }; + + switch (self->background) { + case GST_D3D11_COMPOSITOR_BACKGROUND_CHECKER: + return gst_d3d11_compositor_draw_background_checker (self, rtv); + case GST_D3D11_COMPOSITOR_BACKGROUND_BLACK: + /* {0, 0, 0, 1} */ + break; + case GST_D3D11_COMPOSITOR_BACKGROUND_WHITE: + rgba[0] = 1.0f; + rgba[1] = 1.0f; + rgba[2] = 1.0f; + break; + case GST_D3D11_COMPOSITOR_BACKGROUND_TRANSPARENT: + rgba[3] = 0.0f; + break; + default: + g_assert_not_reached (); + return FALSE; + } + + device_context->ClearRenderTargetView (rtv, rgba); + + return TRUE; +} + +static GstFlowReturn +gst_d3d11_compositor_aggregate_frames (GstVideoAggregator * vagg, + GstBuffer * outbuf) +{ + GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (vagg); + GList *iter; + GstBuffer *target_buf = outbuf; + gboolean need_copy = FALSE; + gboolean do_device_copy = FALSE; + GstFlowReturn ret = GST_FLOW_OK; + ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES] = { NULL, }; + guint i, j; + gint view_idx; + + /* Use fallback buffer when output buffer is: + * - non-d3d11 memory + * - or, from different d3d11 device + * - or not bound to render target + */ + if (!gst_d3d11_compositor_check_d3d11_memory (self, + outbuf, FALSE, &do_device_copy) || !do_device_copy) { + if (!gst_d3d11_compsitor_prepare_fallback_buffer (self, &vagg->info, FALSE, + &self->fallback_pool, &self->fallback_buf)) { + GST_ERROR_OBJECT (self, "Couldn't prepare fallback buffer"); + return GST_FLOW_ERROR; + } + + GST_TRACE_OBJECT (self, "Will draw on fallback texture"); + + need_copy = TRUE; + target_buf = self->fallback_buf; + } + + view_idx = 0; + for (i = 0; i < gst_buffer_n_memory (target_buf); i++) { + GstMemory *mem = 
gst_buffer_peek_memory (target_buf, i);
+    GstD3D11Memory *dmem;
+    guint rtv_size;
+
+    if (!gst_is_d3d11_memory (mem)) {
+      GST_ERROR_OBJECT (self, "Invalid output memory");
+      return GST_FLOW_ERROR;
+    }
+
+    dmem = (GstD3D11Memory *) mem;
+    rtv_size = gst_d3d11_memory_get_render_target_view_size (dmem);
+    if (!rtv_size) {
+      GST_ERROR_OBJECT (self, "Render target view is unavailable");
+      return GST_FLOW_ERROR;
+    }
+
+    for (j = 0; j < rtv_size; j++) {
+      g_assert (view_idx < GST_VIDEO_MAX_PLANES);
+
+      rtv[view_idx] = gst_d3d11_memory_get_render_target_view (dmem, j);
+      view_idx++;
+    }
+
+    /* Mark need-download for the fallback buffer use case */
+    GST_MINI_OBJECT_FLAG_SET (dmem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD);
+  }
+
+  gst_d3d11_device_lock (self->device);
+  /* XXX: the number of render target views must be one here, since we support
+   * only RGBA or BGRA */
+  if (!gst_d3d11_compositor_draw_background (self, rtv[0])) {
+    GST_ERROR_OBJECT (self, "Couldn't draw background");
+    gst_d3d11_device_unlock (self->device);
+    ret = GST_FLOW_ERROR;
+    goto done;
+  }
+
+  GST_OBJECT_LOCK (self);
+
+  for (iter = GST_ELEMENT (vagg)->sinkpads; iter; iter = g_list_next (iter)) {
+    GstVideoAggregatorPad *pad = GST_VIDEO_AGGREGATOR_PAD (iter->data);
+    GstD3D11CompositorPad *cpad = GST_D3D11_COMPOSITOR_PAD (pad);
+    GstVideoFrame *prepared_frame =
+        gst_video_aggregator_pad_get_prepared_frame (pad);
+    ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES] = { NULL, };
+    GstBuffer *buffer;
+
+    if (!prepared_frame)
+      continue;
+
+    if (!gst_d3d11_compositor_pad_setup_converter (pad, vagg)) {
+      GST_ERROR_OBJECT (self, "Couldn't setup converter");
+      ret = GST_FLOW_ERROR;
+      break;
+    }
+
+    buffer = prepared_frame->buffer;
+
+    view_idx = 0;
+    for (i = 0; i < gst_buffer_n_memory (buffer); i++) {
+      GstD3D11Memory *dmem =
+          (GstD3D11Memory *) gst_buffer_peek_memory (buffer, i);
+      guint srv_size = gst_d3d11_memory_get_shader_resource_view_size (dmem);
+
+      for (j = 0; j < srv_size; j++) {
+        g_assert
(view_idx < GST_VIDEO_MAX_PLANES);
+
+        srv[view_idx] = gst_d3d11_memory_get_shader_resource_view (dmem, j);
+        view_idx++;
+      }
+    }
+
+    if (!gst_d3d11_converter_convert_unlocked (cpad->convert, srv, rtv,
+            cpad->blend, cpad->blend_factor)) {
+      GST_ERROR_OBJECT (self, "Couldn't convert frame");
+      ret = GST_FLOW_ERROR;
+      break;
+    }
+  }
+
+  self->reconfigured = FALSE;
+  GST_OBJECT_UNLOCK (self);
+  gst_d3d11_device_unlock (self->device);
+
+  if (ret != GST_FLOW_OK)
+    goto done;
+
+  if (need_copy && !gst_d3d11_compositor_copy_buffer (self, &vagg->info,
+          target_buf, outbuf, do_device_copy)) {
+    GST_ERROR_OBJECT (self, "Couldn't copy input buffer to fallback buffer");
+    ret = GST_FLOW_ERROR;
+  }
+
+done:
+  gst_clear_buffer (&self->fallback_buf);
+
+  return ret;
+}
+
+typedef struct
+{
+  /* without holding ref */
+  GstD3D11Device *other_device;
+  gboolean have_same_device;
+} DeviceCheckData;
+
+static gboolean
+gst_d3d11_compositor_check_device_update (GstElement * agg,
+    GstVideoAggregatorPad * vpad, DeviceCheckData * data)
+{
+  GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (agg);
+  GstBuffer *buf;
+  GstMemory *mem;
+  GstD3D11Memory *dmem;
+  gboolean update_device = FALSE;
+
+  buf = gst_video_aggregator_pad_get_current_buffer (vpad);
+  if (!buf)
+    return TRUE;
+
+  /* Ignore gap buffer */
+  if (GST_BUFFER_FLAG_IS_SET (buf, GST_BUFFER_FLAG_GAP) ||
+      gst_buffer_get_size (buf) == 0) {
+    return TRUE;
+  }
+
+  mem = gst_buffer_peek_memory (buf, 0);
+  /* FIXME: we should be able to accept non-d3d11 memory later once
+   * we remove intermediate elements (d3d11upload and d3d11colorconvert)
+   */
+  if (!gst_is_d3d11_memory (mem)) {
+    GST_ELEMENT_ERROR (agg, CORE, FAILED, (NULL), ("Invalid memory"));
+    return FALSE;
+  }
+
+  dmem = GST_D3D11_MEMORY_CAST (mem);
+
+  /* We can use existing device */
+  if (dmem->device == self->device) {
+    data->have_same_device = TRUE;
+    return FALSE;
+  }
+
+  if (self->adapter < 0) {
+    update_device = TRUE;
+  } else {
+    guint adapter = 0;
+
g_object_get (dmem->device, "adapter", &adapter, NULL);
+    /* The same GPU as what the user wanted, update */
+    if (adapter == (guint) self->adapter)
+      update_device = TRUE;
+  }
+
+  if (!update_device)
+    return TRUE;
+
+  data->other_device = dmem->device;
+
+  /* Keep iterating, since there might be a buffer which holds the same device
+   * as ours */
+  return TRUE;
+}
+
+static GstFlowReturn
+gst_d3d11_compositor_create_output_buffer (GstVideoAggregator * vagg,
+    GstBuffer ** outbuffer)
+{
+  GstD3D11Compositor *self = GST_D3D11_COMPOSITOR (vagg);
+  DeviceCheckData data;
+
+  /* Check whether there is at least one sinkpad which holds a d3d11 buffer
+   * with a compatible device, and if not, update our device */
+  data.other_device = NULL;
+  data.have_same_device = FALSE;
+
+  gst_element_foreach_sink_pad (GST_ELEMENT_CAST (vagg),
+      (GstElementForeachPadFunc) gst_d3d11_compositor_check_device_update,
+      &data);
+
+  if (data.have_same_device || !data.other_device)
+    goto done;
+
+  /* Clear all device dependent resources */
+  gst_element_foreach_sink_pad (GST_ELEMENT_CAST (vagg),
+      (GstElementForeachPadFunc) gst_d3d11_compositor_pad_clear_resource, NULL);
+
+  gst_clear_buffer (&self->fallback_buf);
+  if (self->fallback_pool) {
+    gst_buffer_pool_set_active (self->fallback_pool, FALSE);
+    gst_clear_object (&self->fallback_pool);
+  }
+  g_clear_pointer (&self->checker_background, gst_d3d11_quad_free);
+
+  GST_INFO_OBJECT (self, "Updating device %" GST_PTR_FORMAT " -> %"
+      GST_PTR_FORMAT, self->device, data.other_device);
+  gst_object_unref (self->device);
+  self->device = (GstD3D11Device *) gst_object_ref (data.other_device);
+
+  /* We cannot call gst_aggregator_negotiate() here, since GstVideoAggregator
+   * is holding GST_VIDEO_AGGREGATOR_LOCK() already.
+   * Mark reconfigure and do reconfigure later */
+  gst_pad_mark_reconfigure (GST_AGGREGATOR_SRC_PAD (vagg));
+
+  return GST_AGGREGATOR_FLOW_NEED_DATA;
+
+done:
+  return GST_VIDEO_AGGREGATOR_CLASS (parent_class)->create_output_buffer (vagg,
+      outbuffer);
+}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11compositor.h
Added
@@ -0,0 +1,93 @@
+/*
+ * GStreamer
+ * Copyright (C) 2020 Seungha Yang <seungha@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_D3D11_COMPOSITOR_H__
+#define __GST_D3D11_COMPOSITOR_H__
+
+#include <gst/gst.h>
+#include <gst/video/video.h>
+#include <gst/video/gstvideoaggregator.h>
+#include <gst/d3d11/gstd3d11.h>
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_D3D11_COMPOSITOR_PAD (gst_d3d11_compositor_pad_get_type())
+G_DECLARE_FINAL_TYPE (GstD3D11CompositorPad, gst_d3d11_compositor_pad,
+    GST, D3D11_COMPOSITOR_PAD, GstVideoAggregatorPad)
+
+#define GST_TYPE_D3D11_COMPOSITOR (gst_d3d11_compositor_get_type())
+G_DECLARE_FINAL_TYPE (GstD3D11Compositor, gst_d3d11_compositor,
+    GST, D3D11_COMPOSITOR, GstVideoAggregator)
+
+typedef enum
+{
+  GST_D3D11_COMPOSITOR_BLEND_OP_ADD,
+  GST_D3D11_COMPOSITOR_BLEND_OP_SUBTRACT,
+  GST_D3D11_COMPOSITOR_BLEND_OP_REV_SUBTRACT,
+  GST_D3D11_COMPOSITOR_BLEND_OP_MIN,
+  GST_D3D11_COMPOSITOR_BLEND_OP_MAX
+} GstD3D11CompositorBlendOperation;
+
+#define GST_TYPE_D3D11_COMPOSITOR_BLEND_OPERATION (gst_d3d11_compositor_blend_operation_get_type())
+GType gst_d3d11_compositor_blend_operation_get_type (void);
+
+typedef enum
+{
+  GST_D3D11_COMPOSITOR_BLEND_ZERO,
+  GST_D3D11_COMPOSITOR_BLEND_ONE,
GST_D3D11_COMPOSITOR_BLEND_SRC_COLOR,
+  GST_D3D11_COMPOSITOR_BLEND_INV_SRC_COLOR,
+  GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA,
+  GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA,
+  GST_D3D11_COMPOSITOR_BLEND_DEST_ALPHA,
+  GST_D3D11_COMPOSITOR_BLEND_INV_DEST_ALPHA,
+  GST_D3D11_COMPOSITOR_BLEND_DEST_COLOR,
+  GST_D3D11_COMPOSITOR_BLEND_INV_DEST_COLOR,
+  GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA_SAT,
+  GST_D3D11_COMPOSITOR_BLEND_BLEND_FACTOR,
+  GST_D3D11_COMPOSITOR_BLEND_INV_BLEND_FACTOR,
+} GstD3D11CompositorBlend;
+
+#define GST_TYPE_D3D11_COMPOSITOR_BLEND (gst_d3d11_compositor_blend_get_type())
+GType gst_d3d11_compositor_blend_get_type (void);
+
+typedef enum
+{
+  GST_D3D11_COMPOSITOR_BACKGROUND_CHECKER,
+  GST_D3D11_COMPOSITOR_BACKGROUND_BLACK,
+  GST_D3D11_COMPOSITOR_BACKGROUND_WHITE,
+  GST_D3D11_COMPOSITOR_BACKGROUND_TRANSPARENT,
+} GstD3D11CompositorBackground;
+
+#define GST_TYPE_D3D11_COMPOSITOR_BACKGROUND (gst_d3d11_compositor_background_get_type())
+GType gst_d3d11_compositor_background_get_type (void);
+
+typedef enum
+{
+  GST_D3D11_COMPOSITOR_SIZING_POLICY_NONE,
+  GST_D3D11_COMPOSITOR_SIZING_POLICY_KEEP_ASPECT_RATIO,
+} GstD3D11CompositorSizingPolicy;
+
+#define GST_TYPE_D3D11_COMPOSITOR_SIZING_POLICY (gst_d3d11_compositor_sizing_policy_get_type())
+GType gst_d3d11_compositor_sizing_policy_get_type (void);
+
+G_END_DECLS
+
+#endif /* __GST_D3D11_COMPOSITOR_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11compositorbin.cpp
Added
@@ -0,0 +1,1011 @@
+/*
+ * GStreamer
+ * Copyright (C) 2015 Matthew Waters <matthew@centricular.com>
+ * Copyright (C) 2020 Seungha Yang <seungha@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+/**
+ * SECTION:element-d3d11compositor
+ * @title: d3d11compositor
+ *
+ * A convenient bin which wraps #d3d11compositorelement for video composition
+ * with other helper elements to handle color conversion and memory transfer
+ * between Direct3D11 and system memory space.
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 d3d11compositor name=c ! d3d11videosink \
+ *   videotestsrc ! video/x-raw,width=320,height=240 ! c. \
+ *   videotestsrc pattern=ball ! video/x-raw,width=100,height=100 ! c.
+ * ```
+ *
+ * Since: 1.20
+ *
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/controller/gstproxycontrolbinding.h>
+#include "gstd3d11compositorbin.h"
+#include "gstd3d11compositor.h"
+#include "gstd3d11pluginutils.h"
+
+GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_compositor_debug);
+#define GST_CAT_DEFAULT gst_d3d11_compositor_debug
+
+/****************************
+ * GstD3D11CompositorBinPad *
+ ****************************/
+
+enum
+{
+  PROP_PAD_0,
+  /* GstAggregatorPad */
+  PROP_PAD_EMIT_SIGNALS,
+};
+
+/* GstAggregatorPad */
+#define DEFAULT_PAD_EMIT_SIGNALS FALSE
+
+enum
+{
+  /* GstAggregatorPad */
+  SIGNAL_PAD_BUFFER_CONSUMED = 0,
+  SIGNAL_PAD_LAST,
+};
+
+static guint gst_d3d11_compositor_bin_pad_signals[SIGNAL_PAD_LAST] = { 0 };
+
+/**
+ * GstD3D11CompositorBinPad:
+ *
+ * Since: 1.20
+ */
+struct _GstD3D11CompositorBinPad
+{
+  GstGhostPad parent;
+
+  /* Holds ref */
+  GstPad *target;
+  gulong sig_id;
+};
+
+static void gst_d3d11_compositor_bin_pad_dispose (GObject * object);
+static void gst_d3d11_compositor_bin_pad_set_property (GObject * object,
+    guint prop_id, const GValue * value, GParamSpec * pspec);
+static void gst_d3d11_compositor_bin_pad_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec);
+static void
+gst_d3d11_compositor_bin_pad_set_target_default (GstD3D11CompositorBinPad * pad,
+    GstPad * target);
+
+G_DEFINE_TYPE (GstD3D11CompositorBinPad, gst_d3d11_compositor_bin_pad,
+    GST_TYPE_GHOST_PAD);
+
+static void
+gst_d3d11_compositor_bin_pad_class_init (GstD3D11CompositorBinPadClass * klass)
+{
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+
+  gobject_class->dispose = gst_d3d11_compositor_bin_pad_dispose;
+  gobject_class->set_property = gst_d3d11_compositor_bin_pad_set_property;
+  gobject_class->get_property = gst_d3d11_compositor_bin_pad_get_property;
+
+  /* GstAggregatorPad */
+  g_object_class_install_property (gobject_class, PROP_PAD_EMIT_SIGNALS,
+      g_param_spec_boolean
("emit-signals", "Emit signals",
+          "Send signals to signal data consumption",
+          DEFAULT_PAD_EMIT_SIGNALS,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  gst_d3d11_compositor_bin_pad_signals[SIGNAL_PAD_BUFFER_CONSUMED] =
+      g_signal_new ("buffer-consumed", G_TYPE_FROM_CLASS (klass),
+      G_SIGNAL_RUN_FIRST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, GST_TYPE_BUFFER);
+
+  klass->set_target =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_bin_pad_set_target_default);
+}
+
+static void
+gst_d3d11_compositor_bin_pad_init (GstD3D11CompositorBinPad * self)
+{
+}
+
+static void
+gst_d3d11_compositor_bin_pad_dispose (GObject * object)
+{
+  GstD3D11CompositorBinPad *self = GST_D3D11_COMPOSITOR_BIN_PAD (object);
+
+  gst_clear_object (&self->target);
+
+  G_OBJECT_CLASS (gst_d3d11_compositor_bin_pad_parent_class)->dispose (object);
+}
+
+static void
+gst_d3d11_compositor_bin_pad_set_property (GObject * object,
+    guint prop_id, const GValue * value, GParamSpec * pspec)
+{
+  GstD3D11CompositorBinPad *self = GST_D3D11_COMPOSITOR_BIN_PAD (object);
+
+  if (self->target)
+    g_object_set_property (G_OBJECT (self->target), pspec->name, value);
+}
+
+static void
+gst_d3d11_compositor_bin_pad_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec)
+{
+  GstD3D11CompositorBinPad *self = GST_D3D11_COMPOSITOR_BIN_PAD (object);
+
+  if (self->target)
+    g_object_get_property (G_OBJECT (self->target), pspec->name, value);
+}
+
+static void
+gst_d3d11_compositor_bin_pad_on_buffer_consumed (GstAggregatorPad * pad,
+    GstBuffer * buffer, GstD3D11CompositorBinPad * self)
+{
+  g_signal_emit (self,
+      gst_d3d11_compositor_bin_pad_signals[SIGNAL_PAD_BUFFER_CONSUMED],
+      0, buffer);
+}
+
+/**
+ * gst_d3d11_compositor_bin_pad_set_target:
+ * @self: a #GstD3D11CompositorBinPad
+ * @target: (transfer full): a #GstAggregatorPad
+ */
+static void
+gst_d3d11_compositor_bin_pad_set_target (GstD3D11CompositorBinPad * pad,
+    GstPad * target)
+{
+  GstD3D11CompositorBinPadClass
*klass =
+      GST_D3D11_COMPOSITOR_BIN_PAD_GET_CLASS (pad);
+
+  klass->set_target (pad, target);
+}
+
+static void
+gst_d3d11_compositor_bin_pad_set_target_default (GstD3D11CompositorBinPad * pad,
+    GstPad * target)
+{
+  pad->target = target;
+  pad->sig_id = g_signal_connect (target, "buffer-consumed",
+      G_CALLBACK (gst_d3d11_compositor_bin_pad_on_buffer_consumed), pad);
+}
+
+static void
+gst_d3d11_compositor_bin_pad_unset_target (GstD3D11CompositorBinPad * self)
+{
+  if (!self->target)
+    return;
+
+  if (self->sig_id)
+    g_signal_handler_disconnect (self->target, self->sig_id);
+  self->sig_id = 0;
+  gst_clear_object (&self->target);
+}
+
+/******************************
+ * GstD3D11CompositorBinInput *
+ ******************************/
+
+enum
+{
+  PROP_INPUT_0,
+  /* GstVideoAggregatorPad */
+  PROP_INPUT_ZORDER,
+  PROP_INPUT_REPEAT_AFTER_EOS,
+  PROP_INPUT_MAX_LAST_BUFFER_REPEAT,
+  /* GstD3D11CompositorPad */
+  PROP_INPUT_XPOS,
+  PROP_INPUT_YPOS,
+  PROP_INPUT_WIDTH,
+  PROP_INPUT_HEIGHT,
+  PROP_INPUT_ALPHA,
+  PROP_INPUT_BLEND_OP_RGB,
+  PROP_INPUT_BLEND_OP_ALPHA,
+  PROP_INPUT_BLEND_SRC_RGB,
+  PROP_INPUT_BLEND_SRC_ALPHA,
+  PROP_INPUT_BLEND_DEST_RGB,
+  PROP_INPUT_BLEND_DEST_ALPHA,
+  PROP_INPUT_BLEND_FACTOR_RED,
+  PROP_INPUT_BLEND_FACTOR_GREEN,
+  PROP_INPUT_BLEND_FACTOR_BLUE,
+  PROP_INPUT_BLEND_FACTOR_ALPHA,
+  PROP_INPUT_SIZING_POLICY,
+};
+
+/* GstVideoAggregatorPad */
+#define DEFAULT_INPUT_ZORDER 0
+#define DEFAULT_INPUT_REPEAT_AFTER_EOS FALSE
+#define DEFAULT_INPUT_MAX_LAST_BUFFER_REPEAT GST_CLOCK_TIME_NONE
+/* GstD3D11CompositorPad */
+#define DEFAULT_INPUT_XPOS 0
+#define DEFAULT_INPUT_YPOS 0
+#define DEFAULT_INPUT_WIDTH 0
+#define DEFAULT_INPUT_HEIGHT 0
+#define DEFAULT_INPUT_ALPHA 1.0
+#define DEFAULT_INPUT_BLEND_OP_RGB GST_D3D11_COMPOSITOR_BLEND_OP_ADD
+#define DEFAULT_INPUT_BLEND_OP_ALPHA GST_D3D11_COMPOSITOR_BLEND_OP_ADD
+#define DEFAULT_INPUT_BLEND_SRC_RGB GST_D3D11_COMPOSITOR_BLEND_SRC_ALPHA
+#define DEFAULT_INPUT_BLEND_SRC_ALPHA GST_D3D11_COMPOSITOR_BLEND_ONE
+#define DEFAULT_INPUT_BLEND_DEST_RGB GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA
+#define DEFAULT_INPUT_BLEND_DEST_ALPHA GST_D3D11_COMPOSITOR_BLEND_INV_SRC_ALPHA
+#define DEFAULT_INPUT_SIZING_POLICY GST_D3D11_COMPOSITOR_SIZING_POLICY_NONE
+
+/**
+ * GstD3D11CompositorBinInput:
+ *
+ * Since: 1.20
+ */
+struct _GstD3D11CompositorBinInput
+{
+  GstD3D11CompositorBinPad parent;
+};
+
+static void gst_d3d11_compositor_bin_input_set_property (GObject * object,
+    guint prop_id, const GValue * value, GParamSpec * pspec);
+static void gst_d3d11_compositor_bin_input_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec);
+static void
+gst_d3d11_compositor_bin_input_set_target (GstD3D11CompositorBinPad * pad,
+    GstPad * target);
+
+#define gst_d3d11_compositor_bin_input_parent_class input_parent_class
+G_DEFINE_TYPE (GstD3D11CompositorBinInput, gst_d3d11_compositor_bin_input,
+    GST_TYPE_D3D11_COMPOSITOR_BIN_PAD);
+
+static void
+gst_d3d11_compositor_bin_input_class_init (GstD3D11CompositorBinInputClass *
+    klass)
+{
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+  GstD3D11CompositorBinPadClass *pad_class =
+      GST_D3D11_COMPOSITOR_BIN_PAD_CLASS (klass);
+
+  gobject_class->set_property = gst_d3d11_compositor_bin_input_set_property;
+  gobject_class->get_property = gst_d3d11_compositor_bin_input_get_property;
+
+  /* GstVideoAggregatorPad */
+  g_object_class_install_property (gobject_class, PROP_INPUT_ZORDER,
+      g_param_spec_uint ("zorder", "Z-Order", "Z Order of the picture",
+          0, G_MAXUINT, DEFAULT_INPUT_ZORDER,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_REPEAT_AFTER_EOS,
+      g_param_spec_boolean ("repeat-after-eos", "Repeat After EOS",
+          "Repeat the " "last frame after EOS until all pads are EOS",
+          DEFAULT_INPUT_REPEAT_AFTER_EOS,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
g_object_class_install_property (gobject_class,
+      PROP_INPUT_MAX_LAST_BUFFER_REPEAT,
+      g_param_spec_uint64 ("max-last-buffer-repeat", "Max Last Buffer Repeat",
+          "Repeat last buffer for time (in ns, -1=until EOS), "
+          "behaviour on EOS is not affected", 0, G_MAXUINT64,
+          DEFAULT_INPUT_MAX_LAST_BUFFER_REPEAT,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_PLAYING |
+              G_PARAM_STATIC_STRINGS)));
+
+  /* GstD3D11CompositorPad */
+  g_object_class_install_property (gobject_class, PROP_INPUT_XPOS,
+      g_param_spec_int ("xpos", "X Position", "X position of the picture",
+          G_MININT, G_MAXINT, DEFAULT_INPUT_XPOS,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_YPOS,
+      g_param_spec_int ("ypos", "Y Position", "Y position of the picture",
+          G_MININT, G_MAXINT, DEFAULT_INPUT_YPOS,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_WIDTH,
+      g_param_spec_int ("width", "Width", "Width of the picture",
+          G_MININT, G_MAXINT, DEFAULT_INPUT_WIDTH,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_HEIGHT,
+      g_param_spec_int ("height", "Height", "Height of the picture",
+          G_MININT, G_MAXINT, DEFAULT_INPUT_HEIGHT,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_ALPHA,
+      g_param_spec_double ("alpha", "Alpha", "Alpha of the picture", 0.0, 1.0,
+          DEFAULT_INPUT_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_BLEND_OP_RGB,
+      g_param_spec_enum ("blend-op-rgb", "Blend Operation RGB",
+          "Blend equation for RGB", GST_TYPE_D3D11_COMPOSITOR_BLEND_OPERATION,
+
DEFAULT_INPUT_BLEND_OP_RGB,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_BLEND_OP_ALPHA,
+      g_param_spec_enum ("blend-op-alpha", "Blend Operation Alpha",
+          "Blend equation for alpha", GST_TYPE_D3D11_COMPOSITOR_BLEND_OPERATION,
+          DEFAULT_INPUT_BLEND_OP_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_INPUT_BLEND_SRC_RGB,
+      g_param_spec_enum ("blend-src-rgb", "Blend Source RGB",
+          "Blend factor for source RGB",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_INPUT_BLEND_SRC_RGB,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_INPUT_BLEND_SRC_ALPHA,
+      g_param_spec_enum ("blend-src-alpha",
+          "Blend Source Alpha",
+          "Blend factor for source alpha, \"*-color\" values are not allowed",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_INPUT_BLEND_SRC_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_INPUT_BLEND_DEST_RGB,
+      g_param_spec_enum ("blend-dest-rgb",
+          "Blend Destination RGB",
+          "Blend factor for destination RGB",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_INPUT_BLEND_DEST_RGB,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class,
+      PROP_INPUT_BLEND_DEST_ALPHA,
+      g_param_spec_enum ("blend-dest-alpha",
+          "Blend Destination Alpha",
+          "Blend factor for destination alpha, "
+          "\"*-color\" values are not allowed",
+          GST_TYPE_D3D11_COMPOSITOR_BLEND,
+          DEFAULT_INPUT_BLEND_DEST_ALPHA,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_BLEND_FACTOR_RED,
+
g_param_spec_float ("blend-factor-red", "Blend Factor Red",
+          "Blend factor for red component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_BLEND_FACTOR_GREEN,
+      g_param_spec_float ("blend-factor-green", "Blend Factor Green",
+          "Blend factor for green component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_BLEND_FACTOR_BLUE,
+      g_param_spec_float ("blend-factor-blue", "Blend Factor Blue",
+          "Blend factor for blue component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_BLEND_FACTOR_ALPHA,
+      g_param_spec_float ("blend-factor-alpha", "Blend Factor Alpha",
+          "Blend factor for alpha component "
+          "when blend type is \"blend-factor\" or \"inv-blend-factor\"",
+          0.0, 1.0, 1.0,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_INPUT_SIZING_POLICY,
+      g_param_spec_enum ("sizing-policy", "Sizing policy",
+          "Sizing policy to use for image scaling",
+          GST_TYPE_D3D11_COMPOSITOR_SIZING_POLICY, DEFAULT_INPUT_SIZING_POLICY,
+          (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE |
+              G_PARAM_STATIC_STRINGS)));
+
+  pad_class->set_target =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_bin_input_set_target);
+}
+
+static void
+gst_d3d11_compositor_bin_input_init (GstD3D11CompositorBinInput * self)
+{
+}
+
+static void
+gst_d3d11_compositor_bin_input_set_property (GObject * object,
+    guint prop_id, const GValue * value,
GParamSpec * pspec)
+{
+  GstD3D11CompositorBinPad *pad = GST_D3D11_COMPOSITOR_BIN_PAD (object);
+
+  if (pad->target)
+    g_object_set_property (G_OBJECT (pad->target), pspec->name, value);
+}
+
+static void
+gst_d3d11_compositor_bin_input_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec)
+{
+  GstD3D11CompositorBinPad *pad = GST_D3D11_COMPOSITOR_BIN_PAD (object);
+
+  if (pad->target)
+    g_object_get_property (G_OBJECT (pad->target), pspec->name, value);
+}
+
+static void
+gst_d3d11_compositor_bin_input_set_target (GstD3D11CompositorBinPad * pad,
+    GstPad * target)
+{
+  GST_D3D11_COMPOSITOR_BIN_PAD_CLASS (input_parent_class)->set_target (pad,
+      target);
+
+#define ADD_BINDING(obj,ref,prop) \
+  gst_object_add_control_binding (GST_OBJECT (obj), \
+      gst_proxy_control_binding_new (GST_OBJECT (obj), prop, \
+          GST_OBJECT (ref), prop));
+  /* GstVideoAggregatorPad */
+  ADD_BINDING (target, pad, "zorder");
+  ADD_BINDING (target, pad, "repeat-after-eos");
+  /* GstD3D11CompositorPad */
+  ADD_BINDING (target, pad, "xpos");
+  ADD_BINDING (target, pad, "ypos");
+  ADD_BINDING (target, pad, "width");
+  ADD_BINDING (target, pad, "height");
+  ADD_BINDING (target, pad, "alpha");
+  ADD_BINDING (target, pad, "blend-op-rgb");
+  ADD_BINDING (target, pad, "blend-op-alpha");
+  ADD_BINDING (target, pad, "blend-src-rgb");
+  ADD_BINDING (target, pad, "blend-src-alpha");
+  ADD_BINDING (target, pad, "blend-dest-rgb");
+  ADD_BINDING (target, pad, "blend-dest-alpha");
+  ADD_BINDING (target, pad, "blend-factor-red");
+  ADD_BINDING (target, pad, "blend-factor-green");
+  ADD_BINDING (target, pad, "blend-factor-blue");
+  ADD_BINDING (target, pad, "blend-factor-alpha");
+  ADD_BINDING (target, pad, "sizing-policy");
+#undef ADD_BINDING
+}
+
+/*************************
+ * GstD3D11CompositorBin *
+ *************************/
+
+static GstStaticCaps sink_template_caps =
+    GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES
+    (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY,
GST_D3D11_SINK_FORMATS) ";"
+    GST_VIDEO_CAPS_MAKE (GST_D3D11_SINK_FORMATS));
+
+static GstStaticCaps src_template_caps =
+    GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES
+    (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_SRC_FORMATS) ";"
+    GST_VIDEO_CAPS_MAKE (GST_D3D11_SRC_FORMATS));
+
+enum
+{
+  PROP_0,
+  PROP_MIXER,
+  /* GstAggregator */
+  PROP_LATENCY,
+  PROP_MIN_UPSTREAM_LATENCY,
+  PROP_START_TIME_SELECTION,
+  PROP_START_TIME,
+  PROP_EMIT_SIGNALS,
+  /* GstD3D11Compositor */
+  PROP_ADAPTER,
+  PROP_BACKGROUND,
+  PROP_LAST
+};
+
+/* GstAggregator */
+#define DEFAULT_LATENCY 0
+#define DEFAULT_MIN_UPSTREAM_LATENCY 0
+#define DEFAULT_START_TIME_SELECTION GST_AGGREGATOR_START_TIME_SELECTION_ZERO
+#define DEFAULT_START_TIME (-1)
+#define DEFAULT_EMIT_SIGNALS FALSE
+
+/* GstD3D11Compositor */
+#define DEFAULT_ADAPTER -1
+#define DEFAULT_BACKGROUND GST_D3D11_COMPOSITOR_BACKGROUND_CHECKER
+
+typedef struct _GstD3D11CompositorBinChain
+{
+  /* without ref */
+  GstD3D11CompositorBin *self;
+  GstD3D11CompositorBinPad *ghost_pad;
+  GstElement *upload;
+  GstElement *convert;
+
+  gulong probe_id;
+} GstD3D11CompositorBinChain;
+
+struct _GstD3D11CompositorBin
+{
+  GstBin parent;
+
+  GstElement *compositor;
+
+  GList *input_chains;
+  gboolean running;
+
+  gint adapter;
+};
+
+static void gst_d3d11_compositor_bin_child_proxy_init (gpointer g_iface,
+    gpointer iface_data);
+static void gst_d3d11_compositor_bin_dispose (GObject * object);
+static void gst_d3d11_compositor_bin_set_property (GObject * object,
+    guint prop_id, const GValue * value, GParamSpec * pspec);
+static void gst_d3d11_compositor_bin_get_property (GObject * object,
+    guint prop_id, GValue * value, GParamSpec * pspec);
+
+static GstStateChangeReturn
+gst_d3d11_compositor_bin_change_state (GstElement * element,
+    GstStateChange transition);
+static GstPad *gst_d3d11_compositor_bin_request_new_pad (GstElement * element,
+    GstPadTemplate * templ, const gchar * name, const GstCaps * caps);
+static void
gst_d3d11_compositor_bin_release_pad (GstElement * element,
+    GstPad * pad);
+
+#define gst_d3d11_compositor_bin_parent_class parent_class
+G_DEFINE_TYPE_WITH_CODE (GstD3D11CompositorBin, gst_d3d11_compositor_bin,
+    GST_TYPE_BIN, G_IMPLEMENT_INTERFACE (GST_TYPE_CHILD_PROXY,
+        gst_d3d11_compositor_bin_child_proxy_init));
+
+static void
+gst_d3d11_compositor_bin_class_init (GstD3D11CompositorBinClass * klass)
+{
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
+  GstCaps *caps;
+
+  gobject_class->dispose = gst_d3d11_compositor_bin_dispose;
+  gobject_class->set_property = gst_d3d11_compositor_bin_set_property;
+  gobject_class->get_property = gst_d3d11_compositor_bin_get_property;
+
+  element_class->change_state =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_bin_change_state);
+  element_class->request_new_pad =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_bin_request_new_pad);
+  element_class->release_pad =
+      GST_DEBUG_FUNCPTR (gst_d3d11_compositor_bin_release_pad);
+
+  gst_element_class_set_static_metadata (element_class,
+      "Direct3D11 Compositor Bin",
+      "Filter/Editor/Video/Compositor",
+      "A Direct3D11 compositor bin", "Seungha Yang <seungha@centricular.com>");
+
+  caps = gst_d3d11_get_updated_template_caps (&sink_template_caps);
+  gst_element_class_add_pad_template (element_class,
+      gst_pad_template_new_with_gtype ("sink_%u", GST_PAD_SINK, GST_PAD_REQUEST,
+          caps, GST_TYPE_D3D11_COMPOSITOR_BIN_INPUT));
+  gst_caps_unref (caps);
+
+  caps = gst_d3d11_get_updated_template_caps (&src_template_caps);
+  gst_element_class_add_pad_template (element_class,
+      gst_pad_template_new_with_gtype ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
+          caps, GST_TYPE_D3D11_COMPOSITOR_BIN_PAD));
+  gst_caps_unref (caps);
+
+  g_object_class_install_property (gobject_class, PROP_MIXER,
+      g_param_spec_object ("mixer", "D3D11 mixer element",
+          "The d3d11 mixer chain to use",
+          GST_TYPE_ELEMENT,
+          (GParamFlags) (G_PARAM_READABLE |
G_PARAM_STATIC_STRINGS)));
+
+  /* GstAggregator */
+  g_object_class_install_property (gobject_class, PROP_LATENCY,
+      g_param_spec_uint64 ("latency", "Buffer latency",
+          "Additional latency in live mode to allow upstream "
+          "to take longer to produce buffers for the current "
+          "position (in nanoseconds)", 0, G_MAXUINT64,
+          DEFAULT_LATENCY,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_MIN_UPSTREAM_LATENCY,
+      g_param_spec_uint64 ("min-upstream-latency", "Buffer latency",
+          "When sources with a higher latency are expected to be plugged "
+          "in dynamically after the aggregator has started playing, "
+          "this allows overriding the minimum latency reported by the "
+          "initial source(s). This is only taken into account when larger "
+          "than the actually reported minimum latency. (nanoseconds)",
+          0, G_MAXUINT64,
+          DEFAULT_LATENCY,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_START_TIME_SELECTION,
+      g_param_spec_enum ("start-time-selection", "Start Time Selection",
+          "Decides which start time is output",
+          gst_aggregator_start_time_selection_get_type (),
+          DEFAULT_START_TIME_SELECTION,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_START_TIME,
+      g_param_spec_uint64 ("start-time", "Start Time",
+          "Start time to use if start-time-selection=set", 0,
+          G_MAXUINT64,
+          DEFAULT_START_TIME,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  g_object_class_install_property (gobject_class, PROP_EMIT_SIGNALS,
+      g_param_spec_boolean ("emit-signals", "Emit signals",
+          "Send signals", DEFAULT_EMIT_SIGNALS,
+          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
+
+  /* GstD3D11Compositor */
+  g_object_class_install_property (gobject_class, PROP_ADAPTER,
+      g_param_spec_int ("adapter", "Adapter",
+          "Adapter index for creating device (-1 for
default)", + -1, G_MAXINT32, DEFAULT_ADAPTER, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_BACKGROUND, + g_param_spec_enum ("background", "Background", "Background type", + GST_TYPE_D3D11_COMPOSITOR_BACKGROUND, + DEFAULT_BACKGROUND, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_BIN_PAD, + (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_COMPOSITOR_BIN_INPUT, + (GstPluginAPIFlags) 0); +} + +static void +gst_d3d11_compositor_bin_init (GstD3D11CompositorBin * self) +{ + GstPad *pad; + GstPad *gpad; + GstElement *out_convert, *download; + + self->compositor = gst_element_factory_make ("d3d11compositorelement", NULL); + out_convert = gst_element_factory_make ("d3d11colorconvert", NULL); + download = gst_element_factory_make ("d3d11download", NULL); + + gst_bin_add_many (GST_BIN (self), + self->compositor, out_convert, download, NULL); + gst_element_link_many (self->compositor, out_convert, download, NULL); + + gpad = (GstPad *) g_object_new (GST_TYPE_D3D11_COMPOSITOR_BIN_PAD, + "name", "src", "direction", GST_PAD_SRC, NULL); + pad = gst_element_get_static_pad (self->compositor, "src"); + /* GstD3D11CompositorBinPad will hold reference of this compositor srcpad */ + gst_d3d11_compositor_bin_pad_set_target ((GstD3D11CompositorBinPad *) gpad, + pad); + + pad = gst_element_get_static_pad (download, "src"); + gst_ghost_pad_set_target (GST_GHOST_PAD_CAST (gpad), pad); + gst_object_unref (pad); + + gst_element_add_pad (GST_ELEMENT_CAST (self), gpad); +} + +static void +gst_d3d11_compositor_bin_dispose (GObject * object) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (object); + GList *iter; + + for (iter = self->input_chains; iter; iter = g_list_next (iter)) { + GstD3D11CompositorBinChain *chain = + (GstD3D11CompositorBinChain *) iter->data; + + if (self->compositor 
&& chain->ghost_pad && chain->ghost_pad->target) { + gst_element_release_request_pad (GST_ELEMENT_CAST (self->compositor), + chain->ghost_pad->target); + gst_d3d11_compositor_bin_pad_unset_target (chain->ghost_pad); + } + } + + if (self->input_chains) + g_list_free_full (self->input_chains, (GDestroyNotify) g_free); + self->input_chains = NULL; + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_d3d11_compositor_bin_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (object); + + switch (prop_id) { + case PROP_ADAPTER: + self->adapter = g_value_get_int (value); + /* fallthrough */ + default: + g_object_set_property (G_OBJECT (self->compositor), pspec->name, value); + break; + } +} + +static void +gst_d3d11_compositor_bin_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (object); + + switch (prop_id) { + case PROP_MIXER: + g_value_set_object (value, self->compositor); + break; + case PROP_ADAPTER: + g_value_set_int (value, self->adapter); + break; + default: + g_object_get_property (G_OBJECT (self->compositor), pspec->name, value); + break; + } +} + +static GstStateChangeReturn +gst_d3d11_compositor_bin_change_state (GstElement * element, + GstStateChange transition) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (element); + GstStateChangeReturn ret; + + switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + GST_OBJECT_LOCK (element); + self->running = TRUE; + GST_OBJECT_UNLOCK (element); + break; + default: + break; + } + + ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); + if (ret == GST_STATE_CHANGE_FAILURE) + return ret; + + switch (transition) { + case GST_STATE_CHANGE_READY_TO_NULL: + GST_OBJECT_LOCK (self); + self->running = FALSE; + GST_OBJECT_UNLOCK (self); + default: + break; + } + + return 
ret; +} + +static GstD3D11CompositorBinChain * +gst_d3d11_compositor_bin_input_chain_new (GstD3D11CompositorBin * self, + GstPad * compositor_pad) +{ + GstD3D11CompositorBinChain *chain; + GstPad *pad; + + chain = g_new0 (GstD3D11CompositorBinChain, 1); + + chain->self = self; + + chain->upload = gst_element_factory_make ("d3d11upload", NULL); + chain->convert = gst_element_factory_make ("d3d11colorconvert", NULL); + + /* 1. Create child elements and link */ + gst_bin_add_many (GST_BIN (self), chain->upload, chain->convert, NULL); + + gst_element_link (chain->upload, chain->convert); + pad = gst_element_get_static_pad (chain->convert, "src"); + gst_pad_link (pad, compositor_pad); + gst_object_unref (pad); + + chain->ghost_pad = (GstD3D11CompositorBinPad *) + g_object_new (GST_TYPE_D3D11_COMPOSITOR_BIN_INPUT, "name", + GST_OBJECT_NAME (compositor_pad), "direction", GST_PAD_SINK, NULL); + + /* transfer ownership of compositor pad */ + gst_d3d11_compositor_bin_pad_set_target (chain->ghost_pad, compositor_pad); + + pad = gst_element_get_static_pad (chain->upload, "sink"); + gst_ghost_pad_set_target (GST_GHOST_PAD_CAST (chain->ghost_pad), pad); + gst_object_unref (pad); + + GST_OBJECT_LOCK (self); + if (self->running) + gst_pad_set_active (GST_PAD (chain->ghost_pad), TRUE); + GST_OBJECT_UNLOCK (self); + + gst_element_add_pad (GST_ELEMENT_CAST (self), + GST_PAD_CAST (chain->ghost_pad)); + + gst_element_sync_state_with_parent (chain->upload); + gst_element_sync_state_with_parent (chain->convert); + + return chain; +} + +static void +gst_d3d11_compositor_bin_input_chain_free (GstD3D11CompositorBinChain * chain) +{ + if (!chain) + return; + + if (chain->ghost_pad && chain->probe_id) { + gst_pad_remove_probe (GST_PAD_CAST (chain->ghost_pad), chain->probe_id); + chain->probe_id = 0; + } + + if (chain->upload) { + gst_element_set_state (chain->upload, GST_STATE_NULL); + gst_bin_remove (GST_BIN_CAST (chain->self), chain->upload); + } + + if (chain->convert) { + 
gst_element_set_state (chain->convert, GST_STATE_NULL); + gst_bin_remove (GST_BIN_CAST (chain->self), chain->convert); + } + + if (chain->ghost_pad && chain->ghost_pad->target) { + gst_element_release_request_pad (chain->self->compositor, + chain->ghost_pad->target); + gst_d3d11_compositor_bin_pad_unset_target (chain->ghost_pad); + } + + g_free (chain); +} + +static GstPad * +gst_d3d11_compositor_bin_request_new_pad (GstElement * element, + GstPadTemplate * templ, const gchar * name, const GstCaps * caps) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (element); + GstElementClass *compositor_class = GST_ELEMENT_GET_CLASS (self->compositor); + GstPad *compositor_pad; + GstD3D11CompositorBinChain *chain; + GstPadTemplate *compositor_templ = NULL; + GList *templ_list; + GList *iter; + + templ_list = gst_element_class_get_pad_template_list (compositor_class); + for (iter = templ_list; iter; iter = g_list_next (iter)) { + GstPadTemplate *t = (GstPadTemplate *) iter->data; + if (GST_PAD_TEMPLATE_DIRECTION (t) != GST_PAD_SINK || + GST_PAD_TEMPLATE_PRESENCE (t) != GST_PAD_REQUEST) + continue; + + compositor_templ = t; + break; + } + + g_assert (compositor_templ); + + compositor_pad = + gst_element_request_pad (self->compositor, compositor_templ, name, caps); + if (!compositor_pad) { + GST_WARNING_OBJECT (self, "Failed to request pad"); + return NULL; + } + + chain = gst_d3d11_compositor_bin_input_chain_new (self, compositor_pad); + g_assert (chain); + + GST_OBJECT_LOCK (self); + self->input_chains = g_list_append (self->input_chains, chain); + GST_OBJECT_UNLOCK (self); + + gst_child_proxy_child_added (GST_CHILD_PROXY (self), + G_OBJECT (chain->ghost_pad), GST_OBJECT_NAME (chain->ghost_pad)); + + GST_DEBUG_OBJECT (element, "Created new pad %s:%s", + GST_DEBUG_PAD_NAME (chain->ghost_pad)); + + return GST_PAD_CAST (chain->ghost_pad); +} + +static void +gst_d3d11_compositor_bin_release_pad (GstElement * element, GstPad * pad) +{ + GstD3D11CompositorBin *self = 
GST_D3D11_COMPOSITOR_BIN (element); + GList *iter; + gboolean found = FALSE; + + GST_DEBUG_OBJECT (self, "Releasing pad %s:%s", GST_DEBUG_PAD_NAME (pad)); + + GST_OBJECT_LOCK (self); + for (iter = self->input_chains; iter; iter = g_list_next (iter)) { + GstD3D11CompositorBinChain *chain = + (GstD3D11CompositorBinChain *) iter->data; + + if (pad == GST_PAD_CAST (chain->ghost_pad)) { + self->input_chains = g_list_delete_link (self->input_chains, iter); + GST_OBJECT_UNLOCK (self); + + gst_d3d11_compositor_bin_input_chain_free (chain); + found = TRUE; + break; + } + } + + if (!found) { + GST_OBJECT_UNLOCK (self); + GST_WARNING_OBJECT (self, "Unknown pad to release %s:%s", + GST_DEBUG_PAD_NAME (pad)); + } + + gst_element_remove_pad (element, pad); +} + +static GObject * +gst_d3d11_compositor_bin_child_proxy_get_child_by_index (GstChildProxy * proxy, + guint index) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (proxy); + GstBin *bin = GST_BIN_CAST (proxy); + GObject *res = NULL; + + GST_OBJECT_LOCK (self); + /* XXX: not exactly thread safe with ordering */ + if (index < (guint) bin->numchildren) { + if ((res = (GObject *) g_list_nth_data (bin->children, index))) + gst_object_ref (res); + } else { + GstD3D11CompositorBinChain *chain; + if ((chain = + (GstD3D11CompositorBinChain *) g_list_nth_data (self->input_chains, + index - bin->numchildren))) { + res = (GObject *) gst_object_ref (chain->ghost_pad); + } + } + GST_OBJECT_UNLOCK (self); + + return res; +} + +static guint +gst_d3d11_compositor_bin_child_proxy_get_children_count (GstChildProxy * proxy) +{ + GstD3D11CompositorBin *self = GST_D3D11_COMPOSITOR_BIN (proxy); + guint count = 0; + + GST_OBJECT_LOCK (self); + count = GST_BIN_CAST (self)->numchildren + g_list_length (self->input_chains); + GST_OBJECT_UNLOCK (self); + GST_INFO_OBJECT (self, "Children Count: %d", count); + + return count; +} + +static void +gst_d3d11_compositor_bin_child_proxy_init (gpointer g_iface, + gpointer iface_data) +{ + 
GstChildProxyInterface *iface = (GstChildProxyInterface *) g_iface; + + iface->get_child_by_index = + gst_d3d11_compositor_bin_child_proxy_get_child_by_index; + iface->get_children_count = + gst_d3d11_compositor_bin_child_proxy_get_children_count; +}
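The GstChildProxy implementation above exposes one flat index space over two lists: indices below the bin's numchildren resolve to regular bin children, and anything beyond that indexes into input_chains (the per-input ghost pads). A minimal standalone sketch of that partitioning, in plain C with a hypothetical name (toy_get_child) and no GStreamer types:

```c
#include <assert.h>
#include <stddef.h>

/* Toy model of gst_d3d11_compositor_bin_child_proxy_get_child_by_index:
 * one flat index first walks the bin's own children, then the ghost pads
 * kept in input_chains. Returns NULL when the index is out of range,
 * mirroring the NULL GObject * return in the real code. */
static const char *
toy_get_child (const char **children, size_t n_children,
    const char **pads, size_t n_pads, size_t index)
{
  if (index < n_children)
    return children[index];
  if (index - n_children < n_pads)
    return pads[index - n_children];
  return NULL;
}
```

The real implementation additionally takes the object lock and refs the returned object before handing it out, which this toy version omits.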
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11compositorbin.h
Added
@@ -0,0 +1,66 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_COMPOSITOR_BIN_H__ +#define __GST_D3D11_COMPOSITOR_BIN_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/video/gstvideoaggregator.h> +#include <gst/d3d11/gstd3d11.h> + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_COMPOSITOR_BIN_PAD (gst_d3d11_compositor_bin_pad_get_type()) +#define GST_D3D11_COMPOSITOR_BIN_PAD(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_COMPOSITOR_BIN_PAD, GstD3D11CompositorBinPad)) +#define GST_D3D11_COMPOSITOR_BIN_PAD_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_D3D11_COMPOSITOR_BIN_PAD, GstD3D11CompositorBinPadClass)) +#define GST_IS_D3D11_COMPOSITOR_BIN_PAD(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_COMPOSITOR_BIN_PAD)) +#define GST_IS_D3D11_COMPOSITOR_BIN_PAD_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_COMPOSITOR_BIN_PAD)) +#define GST_D3D11_COMPOSITOR_BIN_PAD_GET_CLASS(obj) \ + (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_D3D11_COMPOSITOR_BIN_PAD,GstD3D11CompositorBinPadClass)) + +typedef struct _GstD3D11CompositorBinPad GstD3D11CompositorBinPad; +typedef struct _GstD3D11CompositorBinPadClass 
GstD3D11CompositorBinPadClass; + +struct _GstD3D11CompositorBinPadClass +{ + GstGhostPadClass parent_class; + + void (*set_target) (GstD3D11CompositorBinPad * pad, GstPad * target); +}; + +GType gst_d3d11_compositor_bin_pad_get_type (void); +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstD3D11CompositorBinPad, gst_object_unref) + +#define GST_TYPE_D3D11_COMPOSITOR_BIN_INPUT (gst_d3d11_compositor_bin_input_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11CompositorBinInput, gst_d3d11_compositor_bin_input, + GST, D3D11_COMPOSITOR_BIN_INPUT, GstD3D11CompositorBinPad); + +#define GST_TYPE_D3D11_COMPOSITOR_BIN (gst_d3d11_compositor_bin_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11CompositorBin, gst_d3d11_compositor_bin, + GST, D3D11_COMPOSITOR_BIN, GstBin) + +G_END_DECLS + +#endif /* __GST_D3D11_COMPOSITOR_BIN_H__ */
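Taken together, the bin and pad types above wrap the mixer element with per-input upload/convert chains and an output download chain. Assuming the plugin registers the bin under the name d3d11compositor (the pipeline below is an unverified sketch and requires a Windows host with a D3D11-capable GPU), composing two test sources through the request pads sink_%u might look like:

```shell
# Hypothetical usage sketch; element name and pad names taken from the
# code above, not verified against an installed plugin.
gst-launch-1.0 d3d11compositor name=c ! videoconvert ! autovideosink \
  videotestsrc ! c.sink_0 \
  videotestsrc pattern=ball ! c.sink_1
```

Because the bin already ends in d3d11download, the output is system memory and can feed any ordinary video sink.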
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11convert.cpp
Added
@@ -0,0 +1,2575 @@ +/* GStreamer + * Copyright (C) <1999> Erik Walthinsen <omega@cse.ogi.edu> + * Copyright (C) 2005-2012 David Schleef <ds@schleef.org> + * Copyright (C) 2012-2014 Matthew Waters <ystree00@gmail.com> + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) <2019> Jeongki Kim <jeongki.kim@jeongki.kim> + * Copyright (C) 2020 Thibault Saunier <tsaunier@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. 
+ */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstd3d11convert.h" +#include "gstd3d11converter.h" +#include "gstd3d11videoprocessor.h" +#include "gstd3d11pluginutils.h" + +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_convert_debug); +#define GST_CAT_DEFAULT gst_d3d11_convert_debug + +static GstStaticCaps sink_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_SINK_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SINK_FORMATS)); + +static GstStaticCaps src_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_SRC_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SRC_FORMATS)); + +#define DEFAULT_ADD_BORDERS TRUE + +struct _GstD3D11BaseConvert +{ + GstD3D11BaseFilter parent; + + const GstD3D11Format *in_d3d11_format; + const GstD3D11Format *out_d3d11_format; + + ID3D11Texture2D *in_texture[GST_VIDEO_MAX_PLANES]; + ID3D11ShaderResourceView *shader_resource_view[GST_VIDEO_MAX_PLANES]; + guint num_input_view; + + ID3D11Texture2D *out_texture[GST_VIDEO_MAX_PLANES]; + ID3D11RenderTargetView *render_target_view[GST_VIDEO_MAX_PLANES]; + guint num_output_view; + + GstD3D11Converter *converter; + GstD3D11VideoProcessor *processor; + gboolean processor_in_use; + + /* used for border rendering */ + RECT in_rect; + RECT out_rect; + + gint borders_h; + gint borders_w; + + /* Updated by subclass */ + gboolean add_borders; +}; + +/** + * GstD3D11BaseConvert: + * + * A baseclass implementation for d3d11 convert elements + * + * Since: 1.20 + */ +#define gst_d3d11_base_convert_parent_class parent_class +G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstD3D11BaseConvert, gst_d3d11_base_convert, + GST_TYPE_D3D11_BASE_FILTER, + 
GST_DEBUG_CATEGORY_INIT (gst_d3d11_convert_debug, "d3d11convert", 0, + "d3d11convert")); + +static void gst_d3d11_base_convert_dispose (GObject * object); +static GstCaps *gst_d3d11_base_convert_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter); +static GstCaps *gst_d3d11_base_convert_fixate_caps (GstBaseTransform * + base, GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); +static gboolean gst_d3d11_base_convert_filter_meta (GstBaseTransform * trans, + GstQuery * query, GType api, const GstStructure * params); +static gboolean +gst_d3d11_base_convert_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query); +static gboolean +gst_d3d11_base_convert_decide_allocation (GstBaseTransform * trans, + GstQuery * query); + +static GstFlowReturn gst_d3d11_base_convert_transform (GstBaseTransform * + trans, GstBuffer * inbuf, GstBuffer * outbuf); +static gboolean gst_d3d11_base_convert_set_info (GstD3D11BaseFilter * filter, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info); + +/* copies the given caps */ +static GstCaps * +gst_d3d11_base_convert_caps_remove_format_info (GstCaps * caps) +{ + GstStructure *st; + GstCapsFeatures *f; + gint i, n; + GstCaps *res; + GstCapsFeatures *feature = + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); + + res = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + st = gst_caps_get_structure (caps, i); + f = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && gst_caps_is_subset_structure_full (res, st, f)) + continue; + + st = gst_structure_copy (st); + /* Only remove format info for the cases when we can actually convert */ + if (!gst_caps_features_is_any (f) + && gst_caps_features_is_equal (f, feature)) { + gst_structure_remove_fields (st, "format", "colorimetry", 
"chroma-site", + NULL); + } + + gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); + } + gst_caps_features_free (feature); + + return res; +} + +static GstCaps * +gst_d3d11_base_convert_caps_rangify_size_info (GstCaps * caps) +{ + GstStructure *st; + GstCapsFeatures *f; + gint i, n; + GstCaps *res; + GstCapsFeatures *feature = + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); + + res = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + st = gst_caps_get_structure (caps, i); + f = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && gst_caps_is_subset_structure_full (res, st, f)) + continue; + + st = gst_structure_copy (st); + /* Only remove format info for the cases when we can actually convert */ + if (!gst_caps_features_is_any (f) + && gst_caps_features_is_equal (f, feature)) { + gst_structure_set (st, "width", GST_TYPE_INT_RANGE, 1, G_MAXINT, + "height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL); + + /* if pixel aspect ratio, make a range of it */ + if (gst_structure_has_field (st, "pixel-aspect-ratio")) { + gst_structure_set (st, "pixel-aspect-ratio", + GST_TYPE_FRACTION_RANGE, 1, G_MAXINT, G_MAXINT, 1, NULL); + } + } + + gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); + } + gst_caps_features_free (feature); + + return res; +} + +static GstCaps * +gst_d3d11_base_convert_caps_remove_format_and_rangify_size_info (GstCaps * caps) +{ + GstStructure *st; + GstCapsFeatures *f; + gint i, n; + GstCaps *res; + GstCapsFeatures *feature = + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); + + res = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + st = gst_caps_get_structure (caps, i); + f = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && 
gst_caps_is_subset_structure_full (res, st, f)) + continue; + + st = gst_structure_copy (st); + /* Only remove format info for the cases when we can actually convert */ + if (!gst_caps_features_is_any (f) + && gst_caps_features_is_equal (f, feature)) { + gst_structure_set (st, "width", GST_TYPE_INT_RANGE, 1, G_MAXINT, + "height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL); + /* if pixel aspect ratio, make a range of it */ + if (gst_structure_has_field (st, "pixel-aspect-ratio")) { + gst_structure_set (st, "pixel-aspect-ratio", + GST_TYPE_FRACTION_RANGE, 1, G_MAXINT, G_MAXINT, 1, NULL); + } + gst_structure_remove_fields (st, "format", "colorimetry", "chroma-site", + NULL); + } + + gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); + } + gst_caps_features_free (feature); + + return res; +} + +static void +gst_d3d11_base_convert_class_init (GstD3D11BaseConvertClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + GstD3D11BaseFilterClass *bfilter_class = GST_D3D11_BASE_FILTER_CLASS (klass); + GstCaps *caps; + + gobject_class->dispose = gst_d3d11_base_convert_dispose; + + caps = gst_d3d11_get_updated_template_caps (&sink_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + caps = gst_d3d11_get_updated_template_caps (&src_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_base_convert_transform_caps); + trans_class->fixate_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_base_convert_fixate_caps); + trans_class->filter_meta = + GST_DEBUG_FUNCPTR 
(gst_d3d11_base_convert_filter_meta); + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_base_convert_propose_allocation); + trans_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_base_convert_decide_allocation); + trans_class->transform = GST_DEBUG_FUNCPTR (gst_d3d11_base_convert_transform); + + bfilter_class->set_info = GST_DEBUG_FUNCPTR (gst_d3d11_base_convert_set_info); + + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_BASE_CONVERT, + (GstPluginAPIFlags) 0); +} + +static void +gst_d3d11_base_convert_init (GstD3D11BaseConvert * self) +{ + self->add_borders = DEFAULT_ADD_BORDERS; +} + +static void +gst_d3d11_base_convert_clear_shader_resource (GstD3D11BaseConvert * self) +{ + gint i; + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (self->shader_resource_view[i]); + GST_D3D11_CLEAR_COM (self->render_target_view[i]); + } + + self->num_input_view = 0; + self->num_output_view = 0; + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (self->in_texture[i]); + GST_D3D11_CLEAR_COM (self->out_texture[i]); + } + + g_clear_pointer (&self->converter, gst_d3d11_converter_free); + g_clear_pointer (&self->processor, gst_d3d11_video_processor_free); + + self->processor_in_use = FALSE; +} + +static void +gst_d3d11_base_convert_dispose (GObject * object) +{ + GstD3D11BaseConvert *self = GST_D3D11_BASE_CONVERT (object); + + gst_d3d11_base_convert_clear_shader_resource (self); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static GstCaps * +gst_d3d11_base_convert_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *tmp, *tmp2; + GstCaps *result; + + /* Get all possible caps that we can transform to */ + tmp = gst_d3d11_base_convert_caps_remove_format_and_rangify_size_info (caps); + + if (filter) { + tmp2 = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + tmp = tmp2; + } + + result = tmp; + + 
GST_DEBUG_OBJECT (trans, "transformed %" GST_PTR_FORMAT " into %" + GST_PTR_FORMAT, caps, result); + + return result; +} + +/* + * This is an incomplete matrix of in formats and a score for the preferred output + * format. + * + * out: RGB24 RGB16 ARGB AYUV YUV444 YUV422 YUV420 YUV411 YUV410 PAL GRAY + * in + * RGB24 0 2 1 2 2 3 4 5 6 7 8 + * RGB16 1 0 1 2 2 3 4 5 6 7 8 + * ARGB 2 3 0 1 4 5 6 7 8 9 10 + * AYUV 3 4 1 0 2 5 6 7 8 9 10 + * YUV444 2 4 3 1 0 5 6 7 8 9 10 + * YUV422 3 5 4 2 1 0 6 7 8 9 10 + * YUV420 4 6 5 3 2 1 0 7 8 9 10 + * YUV411 4 6 5 3 2 1 7 0 8 9 10 + * YUV410 6 8 7 5 4 3 2 1 0 9 10 + * PAL 1 3 2 6 4 6 7 8 9 0 10 + * GRAY 1 4 3 2 1 5 6 7 8 9 0 + * + * PAL or GRAY are never preferred, if we can we would convert to PAL instead + * of GRAY, though + * less subsampling is preferred and if any, preferably horizontal + * We would like to keep the alpha, even if we would need to do colorspace conversion + * or lose depth. + */ +#define SCORE_FORMAT_CHANGE 1 +#define SCORE_DEPTH_CHANGE 1 +#define SCORE_ALPHA_CHANGE 1 +#define SCORE_CHROMA_W_CHANGE 1 +#define SCORE_CHROMA_H_CHANGE 1 +#define SCORE_PALETTE_CHANGE 1 + +#define SCORE_COLORSPACE_LOSS 2 /* RGB <-> YUV */ +#define SCORE_DEPTH_LOSS 4 /* change bit depth */ +#define SCORE_ALPHA_LOSS 8 /* lose the alpha channel */ +#define SCORE_CHROMA_W_LOSS 16 /* horizontal subsample */ +#define SCORE_CHROMA_H_LOSS 32 /* vertical subsample */ +#define SCORE_PALETTE_LOSS 64 /* convert to palette format */ +#define SCORE_COLOR_LOSS 128 /* convert to GRAY */ + +#define COLORSPACE_MASK (GST_VIDEO_FORMAT_FLAG_YUV | \ + GST_VIDEO_FORMAT_FLAG_RGB | GST_VIDEO_FORMAT_FLAG_GRAY) +#define ALPHA_MASK (GST_VIDEO_FORMAT_FLAG_ALPHA) +#define PALETTE_MASK (GST_VIDEO_FORMAT_FLAG_PALETTE) + +/* calculate how much loss a conversion would be */ +static void +score_value (GstBaseTransform * base, const GstVideoFormatInfo * in_info, + const GValue * val, gint * min_loss, const GstVideoFormatInfo ** out_info) +{ + const gchar *fname; + 
const GstVideoFormatInfo *t_info; + guint in_flags, t_flags; + gint loss; + + fname = g_value_get_string (val); + t_info = gst_video_format_get_info (gst_video_format_from_string (fname)); + if (!t_info || t_info->format == GST_VIDEO_FORMAT_UNKNOWN) + return; + + /* accept input format immediately without loss */ + if (in_info == t_info) { + *min_loss = 0; + *out_info = t_info; + return; + } + + loss = SCORE_FORMAT_CHANGE; + + in_flags = GST_VIDEO_FORMAT_INFO_FLAGS (in_info); + in_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; + in_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; + in_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; + + t_flags = GST_VIDEO_FORMAT_INFO_FLAGS (t_info); + t_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; + t_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; + t_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; + + if ((t_flags & PALETTE_MASK) != (in_flags & PALETTE_MASK)) { + loss += SCORE_PALETTE_CHANGE; + if (t_flags & PALETTE_MASK) + loss += SCORE_PALETTE_LOSS; + } + + if ((t_flags & COLORSPACE_MASK) != (in_flags & COLORSPACE_MASK)) { + loss += SCORE_COLORSPACE_LOSS; + if (t_flags & GST_VIDEO_FORMAT_FLAG_GRAY) + loss += SCORE_COLOR_LOSS; + } + + if ((t_flags & ALPHA_MASK) != (in_flags & ALPHA_MASK)) { + loss += SCORE_ALPHA_CHANGE; + if (in_flags & ALPHA_MASK) + loss += SCORE_ALPHA_LOSS; + } + + if ((in_info->h_sub[1]) != (t_info->h_sub[1])) { + loss += SCORE_CHROMA_H_CHANGE; + if ((in_info->h_sub[1]) < (t_info->h_sub[1])) + loss += SCORE_CHROMA_H_LOSS; + } + if ((in_info->w_sub[1]) != (t_info->w_sub[1])) { + loss += SCORE_CHROMA_W_CHANGE; + if ((in_info->w_sub[1]) < (t_info->w_sub[1])) + loss += SCORE_CHROMA_W_LOSS; + } + + if ((in_info->bits) != (t_info->bits)) { + loss += SCORE_DEPTH_CHANGE; + if ((in_info->bits) > (t_info->bits)) + loss += SCORE_DEPTH_LOSS + (in_info->bits - t_info->bits); + } + + GST_DEBUG_OBJECT (base, "score %s -> %s = %d", + GST_VIDEO_FORMAT_INFO_NAME (in_info), + GST_VIDEO_FORMAT_INFO_NAME (t_info), loss); + + if (loss < *min_loss) { + GST_DEBUG_OBJECT (base, 
"found new best %d", loss); + *out_info = t_info; + *min_loss = loss; + } +} + +static void +gst_d3d11_base_convert_fixate_format (GstBaseTransform * trans, + GstCaps * caps, GstCaps * result) +{ + GstStructure *ins, *outs; + const gchar *in_format; + const GstVideoFormatInfo *in_info, *out_info = NULL; + gint min_loss = G_MAXINT; + guint i, capslen; + + ins = gst_caps_get_structure (caps, 0); + in_format = gst_structure_get_string (ins, "format"); + if (!in_format) { + return; + } + + GST_DEBUG_OBJECT (trans, "source format %s", in_format); + + in_info = + gst_video_format_get_info (gst_video_format_from_string (in_format)); + if (!in_info) + return; + + outs = gst_caps_get_structure (result, 0); + + capslen = gst_caps_get_size (result); + GST_DEBUG ("iterate %d structures", capslen); + for (i = 0; i < capslen; i++) { + GstStructure *tests; + const GValue *format; + + tests = gst_caps_get_structure (result, i); + format = gst_structure_get_value (tests, "format"); + + /* should not happen */ + if (format == NULL) + continue; + + if (GST_VALUE_HOLDS_LIST (format)) { + gint j, len; + + len = gst_value_list_get_size (format); + GST_DEBUG_OBJECT (trans, "have %d formats", len); + for (j = 0; j < len; j++) { + const GValue *val; + + val = gst_value_list_get_value (format, j); + if (G_VALUE_HOLDS_STRING (val)) { + score_value (trans, in_info, val, &min_loss, &out_info); + if (min_loss == 0) + break; + } + } + } else if (G_VALUE_HOLDS_STRING (format)) { + score_value (trans, in_info, format, &min_loss, &out_info); + } + } + if (out_info) + gst_structure_set (outs, "format", G_TYPE_STRING, + GST_VIDEO_FORMAT_INFO_NAME (out_info), NULL); +} + +static gboolean +subsampling_unchanged (GstVideoInfo * in_info, GstVideoInfo * out_info) +{ + guint i; + const GstVideoFormatInfo *in_format, *out_format; + + if (GST_VIDEO_INFO_N_COMPONENTS (in_info) != + GST_VIDEO_INFO_N_COMPONENTS (out_info)) + return FALSE; + + in_format = in_info->finfo; + out_format = out_info->finfo; + + for 
(i = 0; i < GST_VIDEO_INFO_N_COMPONENTS (in_info); i++) { + if (GST_VIDEO_FORMAT_INFO_W_SUB (in_format, + i) != GST_VIDEO_FORMAT_INFO_W_SUB (out_format, i)) + return FALSE; + if (GST_VIDEO_FORMAT_INFO_H_SUB (in_format, + i) != GST_VIDEO_FORMAT_INFO_H_SUB (out_format, i)) + return FALSE; + } + + return TRUE; +} + +static void +transfer_colorimetry_from_input (GstBaseTransform * trans, GstCaps * in_caps, + GstCaps * out_caps) +{ + GstStructure *out_caps_s = gst_caps_get_structure (out_caps, 0); + GstStructure *in_caps_s = gst_caps_get_structure (in_caps, 0); + gboolean have_colorimetry = + gst_structure_has_field (out_caps_s, "colorimetry"); + gboolean have_chroma_site = + gst_structure_has_field (out_caps_s, "chroma-site"); + + /* If the output already has colorimetry and chroma-site, stop, + * otherwise try and transfer what we can from the input caps */ + if (have_colorimetry && have_chroma_site) + return; + + { + GstVideoInfo in_info, out_info; + const GValue *in_colorimetry = + gst_structure_get_value (in_caps_s, "colorimetry"); + + if (!gst_video_info_from_caps (&in_info, in_caps)) { + GST_WARNING_OBJECT (trans, + "Failed to convert sink pad caps to video info"); + return; + } + if (!gst_video_info_from_caps (&out_info, out_caps)) { + GST_WARNING_OBJECT (trans, + "Failed to convert src pad caps to video info"); + return; + } + + if (!have_colorimetry && in_colorimetry != NULL) { + if ((GST_VIDEO_INFO_IS_YUV (&out_info) + && GST_VIDEO_INFO_IS_YUV (&in_info)) + || (GST_VIDEO_INFO_IS_RGB (&out_info) + && GST_VIDEO_INFO_IS_RGB (&in_info)) + || (GST_VIDEO_INFO_IS_GRAY (&out_info) + && GST_VIDEO_INFO_IS_GRAY (&in_info))) { + /* Can transfer the colorimetry intact from the input if it has it */ + gst_structure_set_value (out_caps_s, "colorimetry", in_colorimetry); + } else { + gchar *colorimetry_str; + + /* Changing between YUV/RGB - forward primaries and transfer function, but use + * default range and matrix. 
+ * the primaries are used for conversion between RGB and XYZ (CIE 1931 coordinates). + * the transfer function could be another reference (e.g., HDR) + */ + out_info.colorimetry.primaries = in_info.colorimetry.primaries; + out_info.colorimetry.transfer = in_info.colorimetry.transfer; + + colorimetry_str = + gst_video_colorimetry_to_string (&out_info.colorimetry); + gst_caps_set_simple (out_caps, "colorimetry", G_TYPE_STRING, + colorimetry_str, NULL); + g_free (colorimetry_str); + } + } + + /* Only YUV output needs chroma-site. If the input was also YUV and had the same chroma + * subsampling, transfer the siting. If the sub-sampling is changing, then the planes get + * scaled anyway so there's no real reason to prefer the input siting. */ + if (!have_chroma_site && GST_VIDEO_INFO_IS_YUV (&out_info)) { + if (GST_VIDEO_INFO_IS_YUV (&in_info)) { + const GValue *in_chroma_site = + gst_structure_get_value (in_caps_s, "chroma-site"); + if (in_chroma_site != NULL + && subsampling_unchanged (&in_info, &out_info)) + gst_structure_set_value (out_caps_s, "chroma-site", in_chroma_site); + } + } + } +} + +static GstCaps * +gst_d3d11_base_convert_get_fixed_format (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GstCaps *result; + + result = gst_caps_intersect (othercaps, caps); + if (gst_caps_is_empty (result)) { + gst_caps_unref (result); + result = gst_caps_copy (othercaps); + } + + gst_d3d11_base_convert_fixate_format (trans, caps, result); + + /* fixate remaining fields */ + result = gst_caps_fixate (result); + + if (direction == GST_PAD_SINK) { + if (gst_caps_is_subset (caps, result)) { + gst_caps_replace (&result, caps); + } else { + /* Try and preserve input colorimetry / chroma information */ + transfer_colorimetry_from_input (trans, caps, result); + } + } + + return result; +} + +static GstCaps * +gst_d3d11_base_convert_fixate_size (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) 
+{ + GstStructure *ins, *outs; + const GValue *from_par, *to_par; + GValue fpar = G_VALUE_INIT, tpar = G_VALUE_INIT; + + othercaps = gst_caps_truncate (othercaps); + othercaps = gst_caps_make_writable (othercaps); + ins = gst_caps_get_structure (caps, 0); + outs = gst_caps_get_structure (othercaps, 0); + + from_par = gst_structure_get_value (ins, "pixel-aspect-ratio"); + to_par = gst_structure_get_value (outs, "pixel-aspect-ratio"); + + /* If we're fixating from the sinkpad we always set the PAR and + * assume that missing PAR on the sinkpad means 1/1 and + * missing PAR on the srcpad means undefined + */ + if (direction == GST_PAD_SINK) { + if (!from_par) { + g_value_init (&fpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&fpar, 1, 1); + from_par = &fpar; + } + if (!to_par) { + g_value_init (&tpar, GST_TYPE_FRACTION_RANGE); + gst_value_set_fraction_range_full (&tpar, 1, G_MAXINT, G_MAXINT, 1); + to_par = &tpar; + } + } else { + if (!to_par) { + g_value_init (&tpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&tpar, 1, 1); + to_par = &tpar; + + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, + NULL); + } + if (!from_par) { + g_value_init (&fpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&fpar, 1, 1); + from_par = &fpar; + } + } + + /* we have both PAR but they might not be fixated */ + { + gint from_w, from_h, from_par_n, from_par_d, to_par_n, to_par_d; + gint w = 0, h = 0; + gint from_dar_n, from_dar_d; + gint num, den; + + /* from_par should be fixed */ + g_return_val_if_fail (gst_value_is_fixed (from_par), othercaps); + + from_par_n = gst_value_get_fraction_numerator (from_par); + from_par_d = gst_value_get_fraction_denominator (from_par); + + gst_structure_get_int (ins, "width", &from_w); + gst_structure_get_int (ins, "height", &from_h); + + gst_structure_get_int (outs, "width", &w); + gst_structure_get_int (outs, "height", &h); + + /* if both width and height are already fixed, we can't do anything + * about it anymore */ + if 
(w && h) { + guint n, d; + + GST_DEBUG_OBJECT (base, "dimensions already set to %dx%d, not fixating", + w, h); + if (!gst_value_is_fixed (to_par)) { + if (gst_video_calculate_display_ratio (&n, &d, from_w, from_h, + from_par_n, from_par_d, w, h)) { + GST_DEBUG_OBJECT (base, "fixating to_par to %dx%d", n, d); + if (gst_structure_has_field (outs, "pixel-aspect-ratio")) + gst_structure_fixate_field_nearest_fraction (outs, + "pixel-aspect-ratio", n, d); + else if (n != d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + n, d, NULL); + } + } + goto done; + } + + /* Calculate input DAR */ + if (!gst_util_fraction_multiply (from_w, from_h, from_par_n, from_par_d, + &from_dar_n, &from_dar_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + GST_DEBUG_OBJECT (base, "Input DAR is %d/%d", from_dar_n, from_dar_d); + + /* If either width or height is fixed there's not much we + * can do either except choosing a height or width and PAR + * that matches the DAR as well as possible + */ + if (h) { + GstStructure *tmp; + gint set_w, set_par_n, set_par_d; + + GST_DEBUG_OBJECT (base, "height is fixed (%d)", h); + + /* If the PAR is fixed too, there's not much to do + * except choosing the width that is nearest to the + * width with the same DAR */ + if (gst_value_is_fixed (to_par)) { + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + GST_DEBUG_OBJECT (base, "PAR is fixed %d/%d", to_par_n, to_par_d); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, + to_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (h, num, den); + gst_structure_fixate_field_nearest_int (outs, "width", w); + + goto done; + } + + /* The PAR is not fixed 
and it's quite likely that we can set + * an arbitrary PAR. */ + + /* Check if we can keep the input width */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + /* Might have failed but try to keep the DAR nonetheless by + * adjusting the PAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, h, set_w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, + &set_par_d); + gst_structure_free (tmp); + + /* Check if the adjusted PAR is accepted */ + if (set_par_n == to_par_n && set_par_d == to_par_d) { + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "width", G_TYPE_INT, set_w, + "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, + NULL); + goto done; + } + + /* Otherwise scale the width to the new PAR and check if the + * adjusted width is accepted. 
If all that fails we can't keep + * the DAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (h, num, den); + gst_structure_fixate_field_nearest_int (outs, "width", w); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + + goto done; + } else if (w) { + GstStructure *tmp; + gint set_h, set_par_n, set_par_d; + + GST_DEBUG_OBJECT (base, "width is fixed (%d)", w); + + /* If the PAR is fixed too, there's not much to do + * except choosing the height that is nearest to the + * height with the same DAR */ + if (gst_value_is_fixed (to_par)) { + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + GST_DEBUG_OBJECT (base, "PAR is fixed %d/%d", to_par_n, to_par_d); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, + to_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + h = (guint) gst_util_uint64_scale_int_round (w, den, num); + gst_structure_fixate_field_nearest_int (outs, "height", h); + + goto done; + } + + /* The PAR is not fixed and it's quite likely that we can set + * an arbitrary PAR. 
*/ + + /* Check if we can keep the input height */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + + /* Might have failed but try to keep the DAR nonetheless by + * adjusting the PAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, + &set_par_d); + gst_structure_free (tmp); + + /* Check if the adjusted PAR is accepted */ + if (set_par_n == to_par_n && set_par_d == to_par_d) { + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "height", G_TYPE_INT, set_h, + "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, + NULL); + goto done; + } + + /* Otherwise scale the height to the new PAR and check if the + * adjusted height is accepted. 
If all that fails we can't keep + * the DAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + h = (guint) gst_util_uint64_scale_int_round (w, den, num); + gst_structure_fixate_field_nearest_int (outs, "height", h); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + + goto done; + } else if (gst_value_is_fixed (to_par)) { + GstStructure *tmp; + gint set_h, set_w, f_h, f_w; + + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + /* Calculate scale factor for the PAR change */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_n, + to_par_d, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + /* Try to keep the input height (because of interlacing) */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + + /* This might have failed but try to scale the width + * to keep the DAR nonetheless */ + w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); + gst_structure_fixate_field_nearest_int (tmp, "width", w); + gst_structure_get_int (tmp, "width", &set_w); + gst_structure_free (tmp); + + /* We kept the DAR and the height is nearest to the original height */ + if (set_w == w) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + goto done; + } + + f_h = set_h; + f_w = set_w; + + /* If the former failed, try to keep the input width at least */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int 
(tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + /* This might have failed but try to scale the height + * to keep the DAR nonetheless */ + h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); + gst_structure_fixate_field_nearest_int (tmp, "height", h); + gst_structure_get_int (tmp, "height", &set_h); + gst_structure_free (tmp); + + /* We kept the DAR and the width is nearest to the original width */ + if (set_h == h) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + goto done; + } + + /* If all this failed, keep the dimensions with the DAR that was closest + * to the correct DAR. This changes the DAR but there's not much else to + * do here. + */ + if (set_w * ABS (set_h - h) < ABS (f_w - w) * f_h) { + f_h = set_h; + f_w = set_w; + } + gst_structure_set (outs, "width", G_TYPE_INT, f_w, "height", G_TYPE_INT, + f_h, NULL); + goto done; + } else { + GstStructure *tmp; + gint set_h, set_w, set_par_n, set_par_d, tmp2; + + /* width, height and PAR are not fixed but passthrough is not possible */ + + /* First try to keep the height and width as well as possible + * and scale PAR */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + gst_structure_fixate_field_nearest_int (tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, set_w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, 
+ &set_par_d); + gst_structure_free (tmp); + + if (set_par_n == to_par_n && set_par_d == to_par_d) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* Otherwise try to scale width to keep the DAR with the set + * PAR and height */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "width", w); + gst_structure_get_int (tmp, "width", &tmp2); + gst_structure_free (tmp); + + if (tmp2 == w) { + gst_structure_set (outs, "width", G_TYPE_INT, tmp2, "height", + G_TYPE_INT, set_h, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* ... 
or try the same with the height */ + h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", h); + gst_structure_get_int (tmp, "height", &tmp2); + gst_structure_free (tmp); + + if (tmp2 == h) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, tmp2, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* If all fails we can't keep the DAR and take the nearest values + * for everything from the first try */ + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + } + } + +done: + if (from_par == &fpar) + g_value_unset (&fpar); + if (to_par == &tpar) + g_value_unset (&tpar); + + return othercaps; +} + +static GstCaps * +gst_d3d11_base_convert_fixate_caps (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GstCaps *format = NULL; + + GST_DEBUG_OBJECT (base, + "trying to fixate othercaps %" GST_PTR_FORMAT " based on caps %" + GST_PTR_FORMAT, othercaps, caps); + + format = gst_d3d11_base_convert_get_fixed_format (base, direction, caps, + othercaps); + + if (gst_caps_is_empty (format)) { + GST_ERROR_OBJECT (base, "Could not convert formats"); + return format; + } + + /* convert mode is "all" or "size" here */ + othercaps = + gst_d3d11_base_convert_fixate_size (base, direction, caps, othercaps); + + if (gst_caps_get_size (othercaps) == 1) { + guint i; + const gchar *format_fields[] = { "format", "colorimetry", "chroma-site" }; + GstStructure *format_struct = gst_caps_get_structure (format, 0); + GstStructure 
*fixated_struct; + + othercaps = gst_caps_make_writable (othercaps); + fixated_struct = gst_caps_get_structure (othercaps, 0); + + for (i = 0; i < G_N_ELEMENTS (format_fields); i++) { + if (gst_structure_has_field (format_struct, format_fields[i])) { + gst_structure_set (fixated_struct, format_fields[i], G_TYPE_STRING, + gst_structure_get_string (format_struct, format_fields[i]), NULL); + } else { + gst_structure_remove_field (fixated_struct, format_fields[i]); + } + } + } + gst_caps_unref (format); + + GST_DEBUG_OBJECT (base, "fixated othercaps to %" GST_PTR_FORMAT, othercaps); + + return othercaps; +} + +static gboolean +gst_d3d11_base_convert_filter_meta (GstBaseTransform * trans, + GstQuery * query, GType api, const GstStructure * params) +{ + /* This element cannot passthrough the crop meta, because it would convert the + * wrong sub-region of the image, and worse, our output image may not be large + * enough for the crop to be applied later */ + if (api == GST_VIDEO_CROP_META_API_TYPE) + return FALSE; + + /* propose all other metadata upstream */ + return TRUE; +} + +static gboolean +gst_d3d11_base_convert_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstVideoInfo info; + GstBufferPool *pool = NULL; + GstCaps *caps; + guint n_pools, i; + GstStructure *config; + guint size; + GstD3D11AllocationParams *d3d11_params; + const GstD3D11Format *d3d11_format; + guint bind_flags = D3D11_BIND_SHADER_RESOURCE; + DXGI_FORMAT dxgi_format = DXGI_FORMAT_UNKNOWN; + UINT supported = 0; + HRESULT hr; + ID3D11Device *device_handle; + + if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, + decide_query, query)) + return FALSE; + + /* passthrough, we're done */ + if (decide_query == NULL) + return TRUE; + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE; + + if (!gst_video_info_from_caps (&info, caps)) { + 
GST_ERROR_OBJECT (filter, "Invalid caps %" GST_PTR_FORMAT, caps); + return FALSE; + } + + d3d11_format = gst_d3d11_device_format_from_gst (filter->device, + GST_VIDEO_INFO_FORMAT (&info)); + if (!d3d11_format) { + GST_ERROR_OBJECT (filter, "Unknown format caps %" GST_PTR_FORMAT, caps); + return FALSE; + } + + if (d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + dxgi_format = d3d11_format->resource_format[0]; + } else { + dxgi_format = d3d11_format->dxgi_format; + } + + device_handle = gst_d3d11_device_get_device_handle (filter->device); + hr = device_handle->CheckFormatSupport (dxgi_format, &supported); + if (gst_d3d11_result (hr, filter->device) && + (supported & D3D11_FORMAT_SUPPORT_RENDER_TARGET) == + D3D11_FORMAT_SUPPORT_RENDER_TARGET) { + bind_flags |= D3D11_BIND_RENDER_TARGET; + } + + n_pools = gst_query_get_n_allocation_pools (query); + for (i = 0; i < n_pools; i++) { + gst_query_parse_nth_allocation_pool (query, i, &pool, NULL, NULL, NULL); + if (pool) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != filter->device) + gst_clear_object (&pool); + } + } + } + + if (!pool) + pool = gst_d3d11_buffer_pool_new (filter->device); + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (filter->device, &info, + (GstD3D11AllocationFlags) 0, bind_flags); + } else { + /* Set bind flag */ + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { + d3d11_params->desc[i].BindFlags |= bind_flags; + } + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + /* size will be updated by d3d11 buffer pool */ + gst_buffer_pool_config_set_params (config, caps, 
0, 0, 0); + + if (!gst_buffer_pool_set_config (pool, config)) + goto config_failed; + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + gst_query_add_allocation_meta (query, + GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL); + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + + gst_query_add_allocation_pool (query, pool, size, 0, 0); + + gst_object_unref (pool); + + return TRUE; + + /* ERRORS */ +config_failed: + { + GST_ERROR_OBJECT (filter, "failed to set config"); + gst_object_unref (pool); + return FALSE; + } +} + +static gboolean +gst_d3d11_base_convert_decide_allocation (GstBaseTransform * trans, + GstQuery * query) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstCaps *outcaps = NULL; + GstBufferPool *pool = NULL; + guint size, min = 0, max = 0; + GstStructure *config; + GstD3D11AllocationParams *d3d11_params; + gboolean update_pool = FALSE; + GstVideoInfo info; + guint i; + const GstD3D11Format *d3d11_format; + guint bind_flags = D3D11_BIND_RENDER_TARGET; + DXGI_FORMAT dxgi_format = DXGI_FORMAT_UNKNOWN; + UINT supported = 0; + HRESULT hr; + ID3D11Device *device_handle; + + gst_query_parse_allocation (query, &outcaps, NULL); + + if (!outcaps) + return FALSE; + + if (!gst_video_info_from_caps (&info, outcaps)) { + GST_ERROR_OBJECT (filter, "Invalid caps %" GST_PTR_FORMAT, outcaps); + return FALSE; + } + + d3d11_format = gst_d3d11_device_format_from_gst (filter->device, + GST_VIDEO_INFO_FORMAT (&info)); + if (!d3d11_format) { + GST_ERROR_OBJECT (filter, "Unknown format caps %" GST_PTR_FORMAT, outcaps); + return FALSE; + } + + if (d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + dxgi_format = d3d11_format->resource_format[0]; + } else { + dxgi_format = d3d11_format->dxgi_format; + } + + 
device_handle = gst_d3d11_device_get_device_handle (filter->device); + hr = device_handle->CheckFormatSupport (dxgi_format, &supported); + if (gst_d3d11_result (hr, filter->device) && + (supported & D3D11_FORMAT_SUPPORT_SHADER_SAMPLE) == + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE) { + bind_flags |= D3D11_BIND_SHADER_RESOURCE; + } + + size = GST_VIDEO_INFO_SIZE (&info); + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (pool) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != filter->device) + gst_clear_object (&pool); + } + } + + update_pool = TRUE; + } + + if (!pool) + pool = gst_d3d11_buffer_pool_new (filter->device); + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (filter->device, &info, + (GstD3D11AllocationFlags) 0, bind_flags); + } else { + /* Set bind flag */ + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { + d3d11_params->desc[i].BindFlags |= bind_flags; + } + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_set_config (pool, config); + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + 
+ gst_object_unref (pool); + + return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, + query); +} + +static gboolean +create_shader_input_resource (GstD3D11BaseConvert * self, + GstD3D11Device * device, const GstD3D11Format * format, GstVideoInfo * info) +{ + D3D11_TEXTURE2D_DESC texture_desc; + D3D11_SHADER_RESOURCE_VIEW_DESC view_desc; + HRESULT hr; + ID3D11Device *device_handle; + ID3D11Texture2D *tex[GST_VIDEO_MAX_PLANES] = { NULL, }; + ID3D11ShaderResourceView *view[GST_VIDEO_MAX_PLANES] = { NULL, }; + gint i; + + if (self->num_input_view) + return TRUE; + + memset (&texture_desc, 0, sizeof (texture_desc)); + memset (&view_desc, 0, sizeof (view_desc)); + + device_handle = gst_d3d11_device_get_device_handle (device); + + texture_desc.MipLevels = 1; + texture_desc.ArraySize = 1; + texture_desc.SampleDesc.Count = 1; + texture_desc.SampleDesc.Quality = 0; + texture_desc.Usage = D3D11_USAGE_DEFAULT; + texture_desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; + + if (format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) + break; + + texture_desc.Width = GST_VIDEO_INFO_COMP_WIDTH (info, i); + texture_desc.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); + texture_desc.Format = format->resource_format[i]; + + hr = device_handle->CreateTexture2D (&texture_desc, NULL, &tex[i]); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); + goto error; + } + } + } else { + gboolean is_semiplanar = FALSE; + + if (format->dxgi_format == DXGI_FORMAT_NV12 || + format->dxgi_format == DXGI_FORMAT_P010 || + format->dxgi_format == DXGI_FORMAT_P016) + is_semiplanar = TRUE; + + texture_desc.Width = GST_VIDEO_INFO_WIDTH (info); + texture_desc.Height = GST_VIDEO_INFO_HEIGHT (info); + texture_desc.Format = format->dxgi_format; + + /* the resolution of semiplanar formats should be an even number */ + if (is_semiplanar) { + 
texture_desc.Width = GST_ROUND_UP_2 (texture_desc.Width); + texture_desc.Height = GST_ROUND_UP_2 (texture_desc.Height); + } + + hr = device_handle->CreateTexture2D (&texture_desc, NULL, &tex[0]); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); + goto error; + } + + if (is_semiplanar) { + tex[0]->AddRef (); + tex[1] = tex[0]; + } + } + + view_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; + view_desc.Texture2D.MipLevels = 1; + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) + break; + + view_desc.Format = format->resource_format[i]; + hr = device_handle->CreateShaderResourceView (tex[i], &view_desc, &view[i]); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, + "Failed to create resource view (0x%x)", (guint) hr); + goto error; + } + } + + self->num_input_view = i; + + GST_DEBUG_OBJECT (self, + "%d shader resource view created", self->num_input_view); + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + self->in_texture[i] = tex[i]; + self->shader_resource_view[i] = view[i]; + } + + return TRUE; + +error: + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (view[i]); + } + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (tex[i]); + } + + return FALSE; +} + +/* 16.0 / 255.0 ~= 0.062745 */ +static const float luma_black_level_limited = 0.062745f; + +static inline void +clear_rtv_color_rgb (GstD3D11BaseConvert * self, + ID3D11DeviceContext * context_handle, ID3D11RenderTargetView * rtv, + gboolean full_range) +{ + const FLOAT clear_color_full[4] = { 0.0f, 0.0f, 0.0f, 1.0f }; + const FLOAT clear_color_limited[4] = + { luma_black_level_limited, luma_black_level_limited, + luma_black_level_limited, 1.0f + }; + const FLOAT *target; + + if (full_range) + target = clear_color_full; + else + target = clear_color_limited; + + context_handle->ClearRenderTargetView (rtv, target); +} + +static inline 
void +clear_rtv_color_vuya (GstD3D11BaseConvert * self, + ID3D11DeviceContext * context_handle, ID3D11RenderTargetView * rtv, + gboolean full_range) +{ + const FLOAT clear_color_full[4] = { 0.5f, 0.5f, 0.0f, 1.0f }; + const FLOAT clear_color_limited[4] = + { 0.5f, 0.5f, luma_black_level_limited, 1.0f }; + const FLOAT *target; + + if (full_range) + target = clear_color_full; + else + target = clear_color_limited; + + context_handle->ClearRenderTargetView (rtv, target); +} + +static inline void +clear_rtv_color_luma (GstD3D11BaseConvert * self, + ID3D11DeviceContext * context_handle, ID3D11RenderTargetView * rtv, + gboolean full_range) +{ + const FLOAT clear_color_full[4] = { 0.0f, 0.0f, 0.0f, 1.0f }; + const FLOAT clear_color_limited[4] = + { luma_black_level_limited, luma_black_level_limited, + luma_black_level_limited, 1.0f + }; + const FLOAT *target; + + if (full_range) + target = clear_color_full; + else + target = clear_color_limited; + + context_handle->ClearRenderTargetView (rtv, target); +} + +static inline void +clear_rtv_color_chroma (GstD3D11BaseConvert * self, + ID3D11DeviceContext * context_handle, ID3D11RenderTargetView * rtv) +{ + const FLOAT clear_color[4] = { 0.5f, 0.5f, 0.5f, 1.0f }; + + context_handle->ClearRenderTargetView (rtv, clear_color); +} + +static void +clear_rtv_color_all (GstD3D11BaseConvert * self, GstVideoInfo * info, + ID3D11DeviceContext * context_handle, + ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) +{ + gint i; + gboolean full_range = info->colorimetry.range == GST_VIDEO_COLOR_RANGE_0_255; + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (!rtv[i]) + break; + + if (GST_VIDEO_INFO_IS_RGB (info)) { + clear_rtv_color_rgb (self, context_handle, rtv[i], full_range); + } else { + if (GST_VIDEO_INFO_N_PLANES (info) == 1) { + clear_rtv_color_vuya (self, context_handle, rtv[i], full_range); + } else { + if (i == 0) + clear_rtv_color_luma (self, context_handle, rtv[i], full_range); + else + clear_rtv_color_chroma (self, 
context_handle, rtv[i]); + } + } + } +} + +static gboolean +create_shader_output_resource (GstD3D11BaseConvert * self, + GstD3D11Device * device, const GstD3D11Format * format, GstVideoInfo * info) +{ + D3D11_TEXTURE2D_DESC texture_desc; + D3D11_RENDER_TARGET_VIEW_DESC view_desc; + HRESULT hr; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + ID3D11Texture2D *tex[GST_VIDEO_MAX_PLANES] = { NULL, }; + ID3D11RenderTargetView *view[GST_VIDEO_MAX_PLANES] = { NULL, }; + gint i; + + if (self->num_output_view) + return TRUE; + + memset (&texture_desc, 0, sizeof (texture_desc)); + memset (&view_desc, 0, sizeof (view_desc)); + + device_handle = gst_d3d11_device_get_device_handle (device); + context_handle = gst_d3d11_device_get_device_context_handle (device); + + texture_desc.MipLevels = 1; + texture_desc.ArraySize = 1; + texture_desc.SampleDesc.Count = 1; + texture_desc.SampleDesc.Quality = 0; + texture_desc.Usage = D3D11_USAGE_DEFAULT; + texture_desc.BindFlags = + D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE; + + if (format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) + break; + + texture_desc.Width = GST_VIDEO_INFO_COMP_WIDTH (info, i); + texture_desc.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); + texture_desc.Format = format->resource_format[i]; + + hr = device_handle->CreateTexture2D (&texture_desc, NULL, &tex[i]); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); + goto error; + } + } + } else { + gboolean is_semiplanar = FALSE; + + if (format->dxgi_format == DXGI_FORMAT_NV12 || + format->dxgi_format == DXGI_FORMAT_P010 || + format->dxgi_format == DXGI_FORMAT_P016) + is_semiplanar = TRUE; + + texture_desc.Width = GST_VIDEO_INFO_WIDTH (info); + texture_desc.Height = GST_VIDEO_INFO_HEIGHT (info); + texture_desc.Format = format->dxgi_format; + + /* the resolution of semiplanar formats
should be an even number */ + if (is_semiplanar) { + texture_desc.Width = GST_ROUND_UP_2 (texture_desc.Width); + texture_desc.Height = GST_ROUND_UP_2 (texture_desc.Height); + } + + hr = device_handle->CreateTexture2D (&texture_desc, NULL, &tex[0]); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Failed to create texture (0x%x)", (guint) hr); + goto error; + } + + if (is_semiplanar) { + tex[0]->AddRef (); + tex[1] = tex[0]; + } + } + + view_desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; + view_desc.Texture2D.MipSlice = 0; + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (format->resource_format[i] == DXGI_FORMAT_UNKNOWN) + break; + + view_desc.Format = format->resource_format[i]; + hr = device_handle->CreateRenderTargetView (tex[i], &view_desc, &view[i]); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, + "Failed to create %dth render target view (0x%x)", i, (guint) hr); + goto error; + } + } + + gst_d3d11_device_lock (device); + clear_rtv_color_all (self, info, context_handle, view); + gst_d3d11_device_unlock (device); + + self->num_output_view = i; + + GST_DEBUG_OBJECT (self, "%d render view created", self->num_output_view); + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + self->out_texture[i] = tex[i]; + self->render_target_view[i] = view[i]; + } + + return TRUE; + +error: + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (view[i]); + } + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + GST_D3D11_CLEAR_COM (tex[i]); + } + + return FALSE; +} + +static gboolean +gst_d3d11_base_convert_set_info (GstD3D11BaseFilter * filter, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info) +{ + GstD3D11BaseConvert *self = GST_D3D11_BASE_CONVERT (filter); + const GstVideoInfo *unknown_info; + gint from_dar_n, from_dar_d, to_dar_n, to_dar_d; + D3D11_VIEWPORT view_port; + gint border_offset_x = 0; + gint border_offset_y = 0; + + if (gst_base_transform_is_passthrough (GST_BASE_TRANSFORM
(filter))) + return TRUE; + + if (!gst_util_fraction_multiply (in_info->width, + in_info->height, in_info->par_n, in_info->par_d, &from_dar_n, + &from_dar_d)) { + from_dar_n = from_dar_d = -1; + } + + if (!gst_util_fraction_multiply (out_info->width, + out_info->height, out_info->par_n, out_info->par_d, &to_dar_n, + &to_dar_d)) { + to_dar_n = to_dar_d = -1; + } + + self->borders_w = self->borders_h = 0; + if (to_dar_n != from_dar_n || to_dar_d != from_dar_d) { + if (self->add_borders) { + gint n, d, to_h, to_w; + + if (from_dar_n != -1 && from_dar_d != -1 + && gst_util_fraction_multiply (from_dar_n, from_dar_d, + out_info->par_d, out_info->par_n, &n, &d)) { + to_h = gst_util_uint64_scale_int (out_info->width, d, n); + if (to_h <= out_info->height) { + self->borders_h = out_info->height - to_h; + self->borders_w = 0; + } else { + to_w = gst_util_uint64_scale_int (out_info->height, n, d); + g_assert (to_w <= out_info->width); + self->borders_h = 0; + self->borders_w = out_info->width - to_w; + } + } else { + GST_WARNING_OBJECT (self, "Can't calculate borders"); + } + } else { + GST_INFO_OBJECT (self, "Display aspect ratio update %d/%d -> %d/%d", + from_dar_n, from_dar_d, to_dar_n, to_dar_d); + } + } + + gst_d3d11_base_convert_clear_shader_resource (self); + + GST_DEBUG_OBJECT (self, "Setup convert with format %s -> %s", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)), + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); + + /* if present, these must match */ + if (in_info->interlace_mode != out_info->interlace_mode) + goto format_mismatch; + + if (in_info->width == out_info->width && in_info->height == out_info->height + && in_info->finfo == out_info->finfo && self->borders_w == 0 && + self->borders_h == 0) { + gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (filter), TRUE); + return TRUE; + } else { + gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (filter), FALSE); + } + + self->in_d3d11_format = + 
gst_d3d11_device_format_from_gst (filter->device, + GST_VIDEO_INFO_FORMAT (in_info)); + if (!self->in_d3d11_format) { + unknown_info = in_info; + goto format_unknown; + } + + self->out_d3d11_format = + gst_d3d11_device_format_from_gst (filter->device, + GST_VIDEO_INFO_FORMAT (out_info)); + if (!self->out_d3d11_format) { + unknown_info = out_info; + goto format_unknown; + } + + self->converter = + gst_d3d11_converter_new (filter->device, in_info, out_info, nullptr); + + if (!self->converter) { + GST_ERROR_OBJECT (self, "couldn't set converter"); + return FALSE; + } +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) + /* If both input and output formats are native DXGI format */ + if (self->in_d3d11_format->dxgi_format != DXGI_FORMAT_UNKNOWN && + self->out_d3d11_format->dxgi_format != DXGI_FORMAT_UNKNOWN) { + gboolean hardware = FALSE; + GstD3D11VideoProcessor *processor = NULL; + + gst_d3d11_device_lock (filter->device); + g_object_get (filter->device, "hardware", &hardware, NULL); + if (hardware) { + processor = gst_d3d11_video_processor_new (filter->device, + in_info->width, in_info->height, out_info->width, out_info->height); + } + + if (processor) { + const GstDxgiColorSpace *in_color_space; + const GstDxgiColorSpace *out_color_space; + + in_color_space = gst_d3d11_video_info_to_dxgi_color_space (in_info); + out_color_space = gst_d3d11_video_info_to_dxgi_color_space (out_info); + + if (in_color_space && out_color_space) { + DXGI_FORMAT in_dxgi_format = self->in_d3d11_format->dxgi_format; + DXGI_FORMAT out_dxgi_format = self->out_d3d11_format->dxgi_format; + DXGI_COLOR_SPACE_TYPE in_dxgi_color_space = + (DXGI_COLOR_SPACE_TYPE) in_color_space->dxgi_color_space_type; + DXGI_COLOR_SPACE_TYPE out_dxgi_color_space = + (DXGI_COLOR_SPACE_TYPE) out_color_space->dxgi_color_space_type; + + if (!gst_d3d11_video_processor_check_format_conversion (processor, + in_dxgi_format, in_dxgi_color_space, out_dxgi_format, + out_dxgi_color_space)) { + GST_DEBUG_OBJECT (self, "Conversion is not 
supported by device"); + gst_d3d11_video_processor_free (processor); + processor = NULL; + } else { + GST_DEBUG_OBJECT (self, "video processor supports conversion"); + gst_d3d11_video_processor_set_input_dxgi_color_space (processor, + in_dxgi_color_space); + gst_d3d11_video_processor_set_output_dxgi_color_space (processor, + out_dxgi_color_space); + } + } else { + GST_WARNING_OBJECT (self, + "Couldn't determine input and/or output dxgi colorspace"); + gst_d3d11_video_processor_free (processor); + processor = NULL; + } + } + + self->processor = processor; + gst_d3d11_device_unlock (filter->device); + } +#endif + + GST_DEBUG_OBJECT (self, "from=%dx%d (par=%d/%d dar=%d/%d), size %" + G_GSIZE_FORMAT " -> to=%dx%d (par=%d/%d dar=%d/%d borders=%d:%d), " + "size %" G_GSIZE_FORMAT, + in_info->width, in_info->height, in_info->par_n, in_info->par_d, + from_dar_n, from_dar_d, in_info->size, out_info->width, + out_info->height, out_info->par_n, out_info->par_d, to_dar_n, to_dar_d, + self->borders_w, self->borders_h, out_info->size); + + self->in_rect.left = 0; + self->in_rect.top = 0; + self->in_rect.right = GST_VIDEO_INFO_WIDTH (in_info); + self->in_rect.bottom = GST_VIDEO_INFO_HEIGHT (in_info); + + if (self->borders_w) { + border_offset_x = self->borders_w / 2; + self->out_rect.left = border_offset_x; + self->out_rect.right = GST_VIDEO_INFO_WIDTH (out_info) - border_offset_x; + } else { + self->out_rect.left = 0; + self->out_rect.right = GST_VIDEO_INFO_WIDTH (out_info); + } + + if (self->borders_h) { + border_offset_y = self->borders_h / 2; + self->out_rect.top = border_offset_y; + self->out_rect.bottom = GST_VIDEO_INFO_HEIGHT (out_info) - border_offset_y; + } else { + self->out_rect.top = 0; + self->out_rect.bottom = GST_VIDEO_INFO_HEIGHT (out_info); + } + + view_port.TopLeftX = border_offset_x; + view_port.TopLeftY = border_offset_y; + view_port.Width = GST_VIDEO_INFO_WIDTH (out_info) - self->borders_w; + view_port.Height = GST_VIDEO_INFO_HEIGHT (out_info) - 
self->borders_h; + view_port.MinDepth = 0.0f; + view_port.MaxDepth = 1.0f; + + gst_d3d11_converter_update_viewport (self->converter, &view_port); + + return TRUE; + + /* ERRORS */ +format_mismatch: + { + GST_ERROR_OBJECT (self, "input and output formats do not match"); + return FALSE; + } +format_unknown: + { + GST_ERROR_OBJECT (self, + "%s couldn't be converted to d3d11 format", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (unknown_info))); + return FALSE; + } +} + +static gboolean +gst_d3d11_base_convert_prefer_video_processor (GstD3D11BaseConvert * self, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (self); + GstMemory *mem; + GstD3D11Memory *dmem; + + if (!self->processor) { + GST_TRACE_OBJECT (self, "Processor is unavailable"); + return FALSE; + } + + if (gst_buffer_n_memory (inbuf) != 1 || gst_buffer_n_memory (outbuf) != 1) { + GST_TRACE_OBJECT (self, "Num memory objects is mismatched, in: %d, out: %d", + gst_buffer_n_memory (inbuf), gst_buffer_n_memory (outbuf)); + return FALSE; + } + + mem = gst_buffer_peek_memory (inbuf, 0); + g_assert (gst_is_d3d11_memory (mem)); + + dmem = (GstD3D11Memory *) mem; + if (dmem->device != filter->device) { + GST_TRACE_OBJECT (self, "Input memory belongs to different device"); + return FALSE; + } + + /* If we can use shader, and video processor was not used previously, + * we prefer to use shader instead of video processor + * because video processor implementation is vendor dependent + * and not flexible */ + if (!self->processor_in_use && + gst_d3d11_memory_get_shader_resource_view_size (dmem)) { + GST_TRACE_OBJECT (self, "SRV is available"); + return FALSE; + } + + if (!gst_d3d11_video_processor_get_input_view (self->processor, dmem)) { + GST_TRACE_OBJECT (self, "PIV is unavailable"); + return FALSE; + } + + mem = gst_buffer_peek_memory (outbuf, 0); + g_assert (gst_is_d3d11_memory (mem)); + + dmem = (GstD3D11Memory *) mem; + if (dmem->device != filter->device) { + 
GST_TRACE_OBJECT (self, "Output memory belongs to different device"); + return FALSE; + } + + if (!gst_d3d11_video_processor_get_output_view (self->processor, dmem)) { + GST_TRACE_OBJECT (self, "POV is unavailable"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_base_convert_transform_using_processor (GstD3D11BaseConvert * self, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GstD3D11Memory *in_mem, *out_mem; + ID3D11VideoProcessorInputView *piv; + ID3D11VideoProcessorOutputView *pov; + + in_mem = (GstD3D11Memory *) gst_buffer_peek_memory (inbuf, 0); + out_mem = (GstD3D11Memory *) gst_buffer_peek_memory (outbuf, 0); + + piv = gst_d3d11_video_processor_get_input_view (self->processor, in_mem); + if (!piv) { + GST_ERROR_OBJECT (self, "ID3D11VideoProcessorInputView is unavailable"); + return FALSE; + } + + pov = gst_d3d11_video_processor_get_output_view (self->processor, out_mem); + if (!pov) { + GST_ERROR_OBJECT (self, "ID3D11VideoProcessorOutputView is unavailable"); + return FALSE; + } + + /* Clear background color with black */ + if (self->borders_w || self->borders_h) { + GstD3D11BaseFilter *bfilter = GST_D3D11_BASE_FILTER_CAST (self); + ID3D11DeviceContext *context_handle = + gst_d3d11_device_get_device_context_handle (bfilter->device); + ID3D11RenderTargetView *render_view[GST_VIDEO_MAX_PLANES] = { NULL, }; + + if (!gst_d3d11_buffer_get_render_target_view (outbuf, render_view)) { + GST_ERROR_OBJECT (self, "ID3D11RenderTargetView is unavailable"); + return FALSE; + } + + gst_d3d11_device_lock (bfilter->device); + clear_rtv_color_all (self, &bfilter->out_info, context_handle, render_view); + gst_d3d11_device_unlock (bfilter->device); + } + + return gst_d3d11_video_processor_render (self->processor, + &self->in_rect, piv, &self->out_rect, pov); +} + +static GstFlowReturn +gst_d3d11_base_convert_transform (GstBaseTransform * trans, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + 
GstD3D11BaseConvert *self = GST_D3D11_BASE_CONVERT (trans); + GstD3D11Device *device = filter->device; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + ID3D11ShaderResourceView *resource_view[GST_VIDEO_MAX_PLANES] = { NULL, }; + ID3D11RenderTargetView *render_view[GST_VIDEO_MAX_PLANES] = { NULL, }; + ID3D11RenderTargetView **target_rtv; + guint i; + gboolean copy_input = FALSE; + gboolean copy_output = FALSE; + GstMapInfo in_map[GST_VIDEO_MAX_PLANES]; + GstMapInfo out_map[GST_VIDEO_MAX_PLANES]; + + device_handle = gst_d3d11_device_get_device_handle (device); + context_handle = gst_d3d11_device_get_device_context_handle (device); + + if (!gst_d3d11_buffer_map (inbuf, device_handle, in_map, GST_MAP_READ)) { + GST_ERROR_OBJECT (self, "Couldn't map input buffer"); + goto invalid_memory; + } + + if (!gst_d3d11_buffer_map (outbuf, device_handle, out_map, GST_MAP_WRITE)) { + GST_ERROR_OBJECT (self, "Couldn't map output buffer"); + gst_d3d11_buffer_unmap (inbuf, in_map); + goto invalid_memory; + } + + if (gst_d3d11_base_convert_prefer_video_processor (self, inbuf, outbuf)) { + gboolean ret = + gst_d3d11_base_convert_transform_using_processor (self, inbuf, outbuf); + + if (!ret) { + GST_ERROR_OBJECT (self, "Couldn't convert using video processor"); + goto conversion_failed; + } + + self->processor_in_use = TRUE; + + GST_TRACE_OBJECT (self, "Conversion done by video processor"); + + gst_d3d11_buffer_unmap (inbuf, in_map); + gst_d3d11_buffer_unmap (outbuf, out_map); + + return GST_FLOW_OK; + } + + /* Ensure shader resource views */ + if (!gst_d3d11_buffer_get_shader_resource_view (inbuf, resource_view)) { + if (!create_shader_input_resource (self, device, + self->in_d3d11_format, &filter->in_info)) { + GST_ERROR_OBJECT (self, "Failed to configure fallback input texture"); + goto fallback_failed; + } + + copy_input = TRUE; + gst_d3d11_device_lock (device); + for (i = 0; i < gst_buffer_n_memory (inbuf); i++) { + GstD3D11Memory *mem = + (GstD3D11Memory *) 
gst_buffer_peek_memory (inbuf, i); + guint subidx; + D3D11_BOX src_box = { 0, }; + D3D11_TEXTURE2D_DESC src_desc; + D3D11_TEXTURE2D_DESC dst_desc; + + subidx = gst_d3d11_memory_get_subresource_index (mem); + gst_d3d11_memory_get_texture_desc (mem, &src_desc); + + self->in_texture[i]->GetDesc (&dst_desc); + + src_box.left = 0; + src_box.top = 0; + src_box.front = 0; + src_box.back = 1; + src_box.right = MIN (src_desc.Width, dst_desc.Width); + src_box.bottom = MIN (src_desc.Height, dst_desc.Height); + + context_handle->CopySubresourceRegion (self->in_texture[i], 0, 0, 0, 0, + (ID3D11Resource *) in_map[i].data, subidx, &src_box); + } + gst_d3d11_device_unlock (device); + } + + /* Ensure render target views */ + if (!gst_d3d11_buffer_get_render_target_view (outbuf, render_view)) { + if (!create_shader_output_resource (self, device, + self->out_d3d11_format, &filter->out_info)) { + GST_ERROR_OBJECT (self, "Failed to configure fallback output texture"); + goto fallback_failed; + } + + copy_output = TRUE; + } + + /* If we need border, clear render target view first */ + if (copy_output) { + target_rtv = self->render_target_view; + } else { + target_rtv = render_view; + } + + /* We need to clear background color as our shader wouldn't touch border + * area. Likely output texture was initialized with zeros which is fine for + * RGB, but it's not black color in case of YUV */ + if (self->borders_w || self->borders_h) { + gst_d3d11_device_lock (device); + clear_rtv_color_all (self, &filter->out_info, context_handle, target_rtv); + gst_d3d11_device_unlock (device); + } + + if (!gst_d3d11_converter_convert (self->converter, + copy_input ? 
self->shader_resource_view : resource_view, + target_rtv, NULL, NULL)) { + goto conversion_failed; + } + + if (copy_output) { + gst_d3d11_device_lock (device); + for (i = 0; i < gst_buffer_n_memory (outbuf); i++) { + GstD3D11Memory *mem = + (GstD3D11Memory *) gst_buffer_peek_memory (outbuf, i); + guint subidx; + D3D11_BOX src_box = { 0, }; + D3D11_TEXTURE2D_DESC src_desc; + D3D11_TEXTURE2D_DESC dst_desc; + + self->out_texture[i]->GetDesc (&src_desc); + subidx = gst_d3d11_memory_get_subresource_index (mem); + gst_d3d11_memory_get_texture_desc (mem, &dst_desc); + + src_box.left = 0; + src_box.top = 0; + src_box.front = 0; + src_box.back = 1; + src_box.right = MIN (src_desc.Width, dst_desc.Width); + src_box.bottom = MIN (src_desc.Height, dst_desc.Height); + + context_handle->CopySubresourceRegion ((ID3D11Resource *) out_map[i].data, + subidx, 0, 0, 0, self->out_texture[i], 0, &src_box); + } + gst_d3d11_device_unlock (device); + } + + gst_d3d11_buffer_unmap (inbuf, in_map); + gst_d3d11_buffer_unmap (outbuf, out_map); + + return GST_FLOW_OK; + +invalid_memory: + { + GST_ELEMENT_ERROR (self, CORE, FAILED, (NULL), ("Invalid memory")); + return GST_FLOW_ERROR; + } +fallback_failed: + { + GST_ELEMENT_ERROR (self, CORE, FAILED, (NULL), + ("Couldn't prepare fallback memory")); + gst_d3d11_buffer_unmap (inbuf, in_map); + gst_d3d11_buffer_unmap (outbuf, out_map); + + return GST_FLOW_ERROR; + } +conversion_failed: + { + GST_ELEMENT_ERROR (self, CORE, FAILED, (NULL), + ("Couldn't convert texture")); + gst_d3d11_buffer_unmap (inbuf, in_map); + gst_d3d11_buffer_unmap (outbuf, out_map); + + return GST_FLOW_ERROR; + } +} + +static void +gst_d3d11_base_convert_set_add_border (GstD3D11BaseConvert * self, + gboolean add_border) +{ + gboolean prev = self->add_borders; + + self->add_borders = add_border; + if (prev != self->add_borders) + gst_base_transform_reconfigure_src (GST_BASE_TRANSFORM_CAST (self)); +} + +/** + * SECTION:element-d3d11convert + * @title: d3d11convert + * 
@short_description: A Direct3D11 based color conversion and video resizing element + * + * This element resizes video frames and changes color space. + * By default the element will try to negotiate to the same size on the source + * and sinkpad so that no scaling is needed. + * It is therefore safe to insert this element in a pipeline to + * get more robust behaviour without any cost if no scaling is needed. + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! video/x-raw,format=NV12 ! d3d11upload ! d3d11convert ! d3d11videosink + * ``` + * This will output a test video (generated in NV12 format) in a video + * window. If the video sink selected does not support NV12 + * d3d11convert will automatically convert the video to a format understood + * by the video sink. + * + * Since: 1.18 + * + */ + +enum +{ + PROP_CONVERT_0, + PROP_CONVERT_ADD_BORDERS, +}; + +struct _GstD3D11Convert +{ + GstD3D11BaseConvert parent; +}; + +G_DEFINE_TYPE (GstD3D11Convert, gst_d3d11_convert, GST_TYPE_D3D11_BASE_CONVERT); + +static void gst_d3d11_convert_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_d3d11_convert_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); + +static void +gst_d3d11_convert_class_init (GstD3D11ConvertClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + + gobject_class->set_property = gst_d3d11_convert_set_property; + gobject_class->get_property = gst_d3d11_convert_get_property; + + /** + * GstD3D11Convert:add-borders: + * + * Add black borders if necessary to keep the display aspect ratio + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_CONVERT_ADD_BORDERS, + g_param_spec_boolean ("add-borders", "Add Borders", + "Add black borders if necessary to keep the display aspect ratio", + DEFAULT_ADD_BORDERS, (GParamFlags)
(GST_PARAM_MUTABLE_PLAYING | + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 colorspace converter and scaler", + "Filter/Converter/Scaler/Video/Hardware", + "Resizes video and allow color conversion using Direct3D11", + "Seungha Yang <seungha.yang@navercorp.com>, " + "Jeongki Kim <jeongki.kim@jeongki.kim>"); +} + +static void +gst_d3d11_convert_init (GstD3D11Convert * self) +{ +} + +static void +gst_d3d11_convert_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11BaseConvert *base = GST_D3D11_BASE_CONVERT (object); + + switch (prop_id) { + case PROP_CONVERT_ADD_BORDERS: + gst_d3d11_base_convert_set_add_border (base, g_value_get_boolean (value)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_convert_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11BaseConvert *base = GST_D3D11_BASE_CONVERT (object); + + switch (prop_id) { + case PROP_CONVERT_ADD_BORDERS: + g_value_set_boolean (value, base->add_borders); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +/** + * SECTION:element-d3d11colorconvert + * @title: d3d11colorconvert + * + * A Direct3D11 based color conversion element + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! video/x-raw,format=NV12 ! d3d11upload ! d3d11colorconvert ! d3d11download ! video/x-raw,format=RGBA ! fakesink + * ``` + * This will upload a test video (generated in NV12 format) to Direct3D11 + * memory space and convert it to RGBA format. Then a converted Direct3D11 + * frame will be downloaded to system memory space. 
+ * + * Since: 1.20 + * + */ +struct _GstD3D11ColorConvert +{ + GstD3D11BaseConvert parent; +}; + +static GstCaps *gst_d3d11_color_convert_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter); +static GstCaps *gst_d3d11_color_convert_fixate_caps (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); + +G_DEFINE_TYPE (GstD3D11ColorConvert, gst_d3d11_color_convert, + GST_TYPE_D3D11_BASE_CONVERT); + +static void +gst_d3d11_color_convert_class_init (GstD3D11ColorConvertClass * klass) +{ + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 colorspace converter", + "Filter/Converter/Video/Hardware", + "Color conversion using Direct3D11", + "Seungha Yang <seungha@centricular.com>"); + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_transform_caps); + trans_class->fixate_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_color_convert_fixate_caps); +} + +static void +gst_d3d11_color_convert_init (GstD3D11ColorConvert * self) +{ +} + +static GstCaps * +gst_d3d11_color_convert_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *tmp, *tmp2; + GstCaps *result; + + /* Get all possible caps that we can transform to */ + tmp = gst_d3d11_base_convert_caps_remove_format_info (caps); + + if (filter) { + tmp2 = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + tmp = tmp2; + } + + result = tmp; + + GST_DEBUG_OBJECT (trans, "transformed %" GST_PTR_FORMAT " into %" + GST_PTR_FORMAT, caps, result); + + return result; +} + +static GstCaps * +gst_d3d11_color_convert_fixate_caps (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GstCaps *format = NULL; + + GST_DEBUG_OBJECT (base, + 
"trying to fixate othercaps %" GST_PTR_FORMAT " based on caps %" + GST_PTR_FORMAT, othercaps, caps); + + format = gst_d3d11_base_convert_get_fixed_format (base, direction, caps, + othercaps); + gst_caps_unref (othercaps); + + if (gst_caps_is_empty (format)) { + GST_ERROR_OBJECT (base, "Could not convert formats"); + } else { + GST_DEBUG_OBJECT (base, "fixated othercaps to %" GST_PTR_FORMAT, format); + } + + return format; +} + +/** + * SECTION:element-d3d11scale + * @title: d3d11scale + * + * A Direct3D11 based video resizing element + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480 ! d3d11upload ! d3d11scale ! d3d11download ! video/x-raw,width=1280,height=720 ! fakesink + * ``` + * This will upload a 640x480 resolution test video to Direct3D11 + * memory space and resize it to 1280x720 resolution. Then a resized Direct3D11 + * frame will be downloaded to system memory space. + * + * Since: 1.20 + * + */ + +enum +{ + PROP_SCALE_0, + PROP_SCALE_ADD_BORDERS, +}; + +struct _GstD3D11Scale +{ + GstD3D11BaseConvert parent; +}; + +static void gst_d3d11_scale_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_d3d11_scale_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); +static GstCaps *gst_d3d11_scale_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter); +static GstCaps *gst_d3d11_scale_fixate_caps (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); + +G_DEFINE_TYPE (GstD3D11Scale, gst_d3d11_scale, GST_TYPE_D3D11_BASE_CONVERT); + +static void +gst_d3d11_scale_class_init (GstD3D11ScaleClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + + gobject_class->set_property = 
gst_d3d11_scale_set_property; + gobject_class->get_property = gst_d3d11_scale_get_property; + + /** + * GstD3D11Scale:add-borders: + * + * Add black borders if necessary to keep the display aspect ratio + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_SCALE_ADD_BORDERS, + g_param_spec_boolean ("add-borders", "Add Borders", + "Add black borders if necessary to keep the display aspect ratio", + DEFAULT_ADD_BORDERS, (GParamFlags) (GST_PARAM_MUTABLE_PLAYING | + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 scaler", + "Filter/Converter/Video/Scaler/Hardware", + "Resizes video using Direct3D11", + "Seungha Yang <seungha@centricular.com>"); + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_scale_transform_caps); + trans_class->fixate_caps = GST_DEBUG_FUNCPTR (gst_d3d11_scale_fixate_caps); +} + +static void +gst_d3d11_scale_init (GstD3D11Scale * self) +{ +} + +static void +gst_d3d11_scale_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11BaseConvert *base = GST_D3D11_BASE_CONVERT (object); + + switch (prop_id) { + case PROP_CONVERT_ADD_BORDERS: + gst_d3d11_base_convert_set_add_border (base, g_value_get_boolean (value)); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_scale_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11BaseConvert *base = GST_D3D11_BASE_CONVERT (object); + + switch (prop_id) { + case PROP_CONVERT_ADD_BORDERS: + g_value_set_boolean (value, base->add_borders); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static GstCaps * +gst_d3d11_scale_transform_caps (GstBaseTransform * + trans, GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *tmp, *tmp2; + GstCaps *result; + + /* Get all possible 
caps that we can transform to */ + tmp = gst_d3d11_base_convert_caps_rangify_size_info (caps); + + if (filter) { + tmp2 = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + tmp = tmp2; + } + + result = tmp; + + GST_DEBUG_OBJECT (trans, "transformed %" GST_PTR_FORMAT " into %" + GST_PTR_FORMAT, caps, result); + + return result; +} + +static GstCaps * +gst_d3d11_scale_fixate_caps (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GST_DEBUG_OBJECT (base, + "trying to fixate othercaps %" GST_PTR_FORMAT " based on caps %" + GST_PTR_FORMAT, othercaps, caps); + + othercaps = + gst_d3d11_base_convert_fixate_size (base, direction, caps, othercaps); + + GST_DEBUG_OBJECT (base, "fixated othercaps to %" GST_PTR_FORMAT, othercaps); + + return othercaps; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11convert.h
Added
@@ -0,0 +1,61 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_COLOR_CONVERT_H__ +#define __GST_D3D11_COLOR_CONVERT_H__ + +#include <gst/gst.h> + +#include "gstd3d11basefilter.h" + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_BASE_CONVERT (gst_d3d11_base_convert_get_type()) +#define GST_D3D11_BASE_CONVERT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_BASE_CONVERT,GstD3D11BaseConvert)) +#define GST_D3D11_BASE_CONVERT_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_BASE_CONVERT,GstD3D11BaseConvertClass)) +#define GST_D3D11_BASE_CONVERT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_BASE_CONVERT,GstD3D11BaseConvertClass)) +#define GST_IS_D3D11_BASE_CONVERT(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_BASE_CONVERT)) +#define GST_IS_D3D11_BASE_CONVERT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_BASE_CONVERT)) + +typedef struct _GstD3D11BaseConvert GstD3D11BaseConvert; +typedef struct _GstD3D11BaseConvertClass GstD3D11BaseConvertClass; + +struct _GstD3D11BaseConvertClass +{ + GstD3D11BaseFilterClass parent_class; +}; + +GType gst_d3d11_base_convert_get_type (void); 
+G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstD3D11BaseConvert, gst_object_unref) + +#define GST_TYPE_D3D11_CONVERT (gst_d3d11_convert_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11Convert, gst_d3d11_convert, + GST, D3D11_CONVERT, GstD3D11BaseConvert) + +#define GST_TYPE_D3D11_COLOR_CONVERT (gst_d3d11_color_convert_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11ColorConvert, gst_d3d11_color_convert, + GST, D3D11_COLOR_CONVERT, GstD3D11BaseConvert) + +#define GST_TYPE_D3D11_SCALE (gst_d3d11_scale_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11Scale, gst_d3d11_scale, + GST, D3D11_SCALE, GstD3D11BaseConvert) + +G_END_DECLS + +#endif /* __GST_D3D11_COLOR_CONVERT_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11converter.cpp
Added
@@ -0,0 +1,2301 @@
+/* GStreamer
+ * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com>
+ * Copyright (C) <2019> Jeongki Kim <jeongki.kim@jeongki.kim>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+# include <config.h>
+#endif
+
+#include "gstd3d11converter.h"
+#include "gstd3d11shader.h"
+#include "gstd3d11pluginutils.h"
+#include <wrl.h>
+#include <string.h>
+
+GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_converter_debug);
+#define GST_CAT_DEFAULT gst_d3d11_converter_debug
+
+/* *INDENT-OFF* */
+using namespace Microsoft::WRL;
+/* *INDENT-ON* */
+
+#define CONVERTER_MAX_QUADS 2
+
+/* *INDENT-OFF* */
+typedef struct
+{
+  FLOAT trans_matrix[12];
+  FLOAT padding[4];
+} PixelShaderColorTransform;
+
+typedef struct
+{
+  FLOAT alpha_mul;
+  FLOAT padding[3];
+} AlphaConstBuffer;
+
+typedef struct
+{
+  struct {
+    FLOAT x;
+    FLOAT y;
+    FLOAT z;
+  } position;
+  struct {
+    FLOAT x;
+    FLOAT y;
+  } texture;
+} VertexData;
+
+typedef struct
+{
+  gboolean has_transform;
+  gboolean has_alpha;
+  const gchar *func;
+} PixelShaderTemplate;
+
+static const gchar templ_color_transform_const_buffer[] =
+    "cbuffer PixelShaderColorTransform : register(b%u)\n"
+    "{\n"
+    " float3x4 trans_matrix;\n"
+    " float3 padding;\n"
+    "};";
+
+static const gchar templ_alpha_const_buffer[] =
+    "cbuffer AlphaConstBuffer : register(b%u)\n"
+    "{\n"
+    " float alpha_mul;\n"
+    " float3 padding;\n"
+    "};";
+
+#define HLSL_FUNC_YUV_TO_RGB \
+    "float3 yuv_to_rgb (float3 yuv)\n" \
+    "{\n" \
+    " yuv += float3(-0.062745f, -0.501960f, -0.501960f);\n" \
+    " yuv = mul(yuv, trans_matrix);\n" \
+    " return saturate(yuv);\n" \
+    "}\n"
+
+#define HLSL_FUNC_RGB_TO_YUV \
+    "float3 rgb_to_yuv (float3 rgb)\n" \
+    "{\n" \
+    " float3 yuv;\n" \
+    " yuv = mul(rgb, trans_matrix);\n" \
+    " yuv += float3(0.062745f, 0.501960f, 0.501960f);\n" \
+    " return saturate(yuv);\n" \
+    "}\n"
+
+#define HLSL_PS_OUTPUT_ONE_PLANE_BODY \
+    " float4 Plane_0: SV_TARGET0;"
+
+#define HLSL_PS_OUTPUT_TWO_PLANES_BODY \
+    " float4 Plane_0: SV_TARGET0;\n" \
+    " float4 Plane_1: SV_TARGET1;"
+
+static const PixelShaderTemplate templ_REORDER =
+    { FALSE, TRUE, NULL };
+
+static const PixelShaderTemplate templ_REORDER_NO_ALPHA =
+    { FALSE, FALSE, NULL };
+
+static const PixelShaderTemplate templ_YUV_to_RGB =
+    { TRUE, FALSE, HLSL_FUNC_YUV_TO_RGB };
+
+static const PixelShaderTemplate templ_RGB_to_YUV =
+    { TRUE, FALSE, HLSL_FUNC_RGB_TO_YUV };
+
+static const gchar templ_REORDER_BODY[] =
+    " float4 xyza;\n"
+    " xyza.xyz = shaderTexture[0].Sample(samplerState, input.Texture).xyz;\n"
+    " xyza.a = shaderTexture[0].Sample(samplerState, input.Texture).a * alpha_mul;\n"
+    " output.Plane_0 = xyza;\n";
+
+static const gchar templ_VUYA_to_RGB_BODY[] =
+    " float4 sample, rgba;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).z;\n"
+    " sample.y = shaderTexture[0].Sample(samplerState, input.Texture).y;\n"
+    " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).x;\n"
+    " sample.a = shaderTexture[0].Sample(samplerState, input.Texture).a;\n"
+    " rgba.rgb = yuv_to_rgb (sample.xyz);\n"
+    " rgba.a = sample.a;\n"
+    " output.Plane_0 = rgba;\n";
+
+static const gchar templ_RGB_to_VUYA_BODY[] =
+    " float4 sample, vuya;\n"
+    " sample = shaderTexture[0].Sample(samplerState, input.Texture);\n"
+    " vuya.zyx = rgb_to_yuv (sample.rgb);\n"
+    " vuya.a = sample.a;\n"
+    " output.Plane_0 = vuya;\n";
+
+static const gchar templ_PACKED_YUV_to_RGB_BODY[] =
+    " float4 sample, rgba;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.y = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " rgba.rgb = yuv_to_rgb (sample.xyz);\n"
+    " rgba.a = 1;\n"
+    " output.Plane_0 = rgba;\n";
+
+/* YUV to RGB conversion */
+static const gchar templ_PLANAR_YUV_to_RGB_BODY[] =
+    " float4 sample, rgba;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x * %u;\n"
+    " sample.%c = shaderTexture[1].Sample(samplerState, input.Texture).x * %u;\n"
+    " sample.%c = shaderTexture[2].Sample(samplerState, input.Texture).x * %u;\n"
+    " rgba.rgb = yuv_to_rgb (sample.xyz);\n"
+    " rgba.a = 1.0;\n"
+    " output.Plane_0 = rgba;\n";
+
+static const gchar templ_SEMI_PLANAR_to_RGB_BODY[] =
+    " float4 sample, rgba;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x;\n"
+    " sample.yz = shaderTexture[1].Sample(samplerState, input.Texture).%c%c;\n"
+    " rgba.rgb = yuv_to_rgb (sample.xyz);\n"
+    " rgba.a = 1.0;\n"
+    " output.Plane_0 = rgba;\n";
+
+/* RGB to YUV conversion */
+static const gchar templ_RGB_to_LUMA_BODY[] =
+    " float4 sample, rgba;\n"
+    " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n"
+    " sample.xyz = rgb_to_yuv (rgba.rgb);\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " sample.x = sample.x / %d;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_RGB_to_SEMI_PLANAR_CHROMA_BODY[] =
+    " float4 sample, rgba;\n"
+    " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n"
+    " sample.xyz = rgb_to_yuv (rgba.rgb);\n"
+    " sample.%c = sample.y;\n"
+    " sample.%c = sample.z;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_RGB_to_PLANAR_CHROMA_BODY[] =
+    " float4 sample, rgba;\n"
+    " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n"
+    " sample.xyz = rgb_to_yuv (rgba.rgb);\n"
+    " output.Plane_0 = float4(sample.%c / %u, 0.0, 0.0, 0.0);\n"
+    " output.Plane_1 = float4(sample.%c / %u, 0.0, 0.0, 0.0);\n";
+
+/* YUV to YUV conversion */
+static const gchar templ_LUMA_to_LUMA_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x * %u;\n"
+    " output.Plane_0 = float4(sample.x / %u, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY[] =
+    " float4 in_sample;\n"
+    " float4 out_sample;\n"
+    " in_sample.%c = shaderTexture[1].Sample(samplerState, input.Texture).x * %u;\n"
+    " in_sample.%c = shaderTexture[2].Sample(samplerState, input.Texture).x * %u;\n"
+    " out_sample.xy = in_sample.yz;\n"
+    " output.Plane_0 = float4(out_sample.%c%c, 0.0, 0.0);\n";
+
+static const gchar templ_SEMI_PLANAR_TO_PLANAR_CHROMA_BODY[] =
+    " float4 sample;\n"
+    " sample.yz = shaderTexture[1].Sample(samplerState, input.Texture).%c%c;\n"
+    " output.Plane_0 = float4(sample.%c / %d, 0.0, 0.0, 0.0);\n"
+    " output.Plane_1 = float4(sample.%c / %d, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_SEMI_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY[] =
+    " float4 sample;\n"
+    " sample.xy = shaderTexture[1].Sample(samplerState, input.Texture).%c%c;\n"
+    " output.Plane_0 = float4(sample.%c%c, 0.0, 0.0);\n";
+
+static const gchar templ_PLANAR_TO_PLANAR_CHROMA_BODY[] =
+    " float4 sample;\n"
+    " sample.%c = shaderTexture[1].Sample(samplerState, input.Texture).x * %u;\n"
+    " sample.%c = shaderTexture[2].Sample(samplerState, input.Texture).x * %u;\n"
+    " output.Plane_0 = float4(sample.%c / %u, 0.0, 0.0, 0.0);\n"
+    " output.Plane_1 = float4(sample.%c / %u, 0.0, 0.0, 0.0);\n";
+
+/* VUYA to YUV */
+static const gchar templ_VUYA_to_LUMA_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).z;\n"
+    " output.Plane_0 = float4(sample.x / %u, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_VUYA_TO_PLANAR_CHROMA_BODY[] =
+    " float4 sample;\n"
+    " sample.yz = shaderTexture[0].Sample(samplerState, input.Texture).yx;\n"
+    " output.Plane_0 = float4(sample.%c / %d, 0.0, 0.0, 0.0);\n"
+    " output.Plane_1 = float4(sample.%c / %d, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_VUYA_TO_SEMI_PLANAR_CHROMA_BODY[] =
+    " float2 sample;\n"
+    " sample.xy = shaderTexture[0].Sample(samplerState, input.Texture).%c%c;\n"
+    " output.Plane_0 = float4(sample.xy, 0.0, 0.0);\n";
+
+/* YUV to VUYA */
+static const gchar templ_PLANAR_to_VUYA_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x * %u;\n"
+    " sample.%c = shaderTexture[1].Sample(samplerState, input.Texture).x * %u;\n"
+    " sample.%c = shaderTexture[2].Sample(samplerState, input.Texture).x * %u;\n"
+    " output.Plane_0 = float4(sample.zyx, 1.0f);\n";
+
+static const gchar templ_SEMI_PLANAR_to_VUYA_BODY[] =
+    " float4 sample;\n"
+    " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).x;\n"
+    " sample.xy = shaderTexture[1].Sample(samplerState, input.Texture).%c%c;\n"
+    " output.Plane_0 = float4(sample.xyz, 1.0f);\n";
+
+static const gchar templ_PACKED_YUV_to_VUYA_BODY[] =
+    " float4 sample;\n"
+    " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.y = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " output.Plane_0 = float4(sample.xyz, 1.0f);\n";
+
+/* packed YUV to (semi) planar YUV */
+static const gchar templ_PACKED_YUV_to_LUMA_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " output.Plane_0 = float4(sample.x / %u, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_PACKED_YUV_TO_PLANAR_CHROMA_BODY[] =
+    " float4 sample;\n"
+    " sample.y = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " output.Plane_0 = float4(sample.%c / %u, 0.0, 0.0, 0.0);\n"
+    " output.Plane_1 = float4(sample.%c / %u, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_PACKED_YUV_TO_SEMI_PLANAR_CHROMA_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.y = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " output.Plane_0 = float4(sample.%c%c, 0.0, 0.0);\n";
+
+/* to GRAY */
+static const gchar templ_RGB_to_GRAY_BODY[] =
+    " float4 sample, rgba;\n"
+    " rgba.rgb = shaderTexture[0].Sample(samplerState, input.Texture).rgb;\n"
+    " sample.x = rgb_to_yuv (rgba.rgb).x;\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_VUYA_to_GRAY_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).z;\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_YUV_to_GRAY_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x * %u;\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_PACKED_YUV_TO_GRAY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).%c;\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_GRAY_to_GRAY_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x;\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+/* from GRAY */
+static const gchar templ_GRAY_to_RGB_BODY[] =
+    " float4 sample, rgba;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x;\n"
+    " sample.y = sample.x;\n"
+    " sample.z = sample.x;\n"
+    " sample.a = 1.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_GRAY_to_VUYA_BODY[] =
+    " float4 sample;\n"
+    " sample.z = shaderTexture[0].Sample(samplerState, input.Texture).x;\n"
+    " sample.x = 0.5;\n"
+    " sample.y = 0.5;\n"
+    " sample.a = 1.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_GRAY_to_LUMA_BODY[] =
+    " float4 sample;\n"
+    " sample.x = shaderTexture[0].Sample(samplerState, input.Texture).x / %u;\n"
+    " sample.y = 0.0;\n"
+    " sample.z = 0.0;\n"
+    " sample.a = 0.0;\n"
+    " output.Plane_0 = sample;\n";
+
+static const gchar templ_GRAY_to_PLANAR_CHROMA_BODY[] =
+    " float val = 0.5 / %u;\n"
+    " output.Plane_0 = float4(val, 0.0, 0.0, 0.0);\n"
+    " output.Plane_1 = float4(val, 0.0, 0.0, 0.0);\n";
+
+static const gchar templ_GRAY_to_SEMI_PLANAR_CHROMA_BODY[] =
+    " output.Plane_0 = float4(0.5, 0.5, 0.0, 0.0);\n";
+
+static const gchar templ_pixel_shader[] =
+    /* constant buffer */
+    "%s\n"
+    "%s\n"
+    "Texture2D shaderTexture[4];\n"
+    "SamplerState samplerState;\n"
+    "\n"
+    "struct PS_INPUT\n"
+    "{\n"
+    " float4 Position: SV_POSITION;\n"
+    " float3 Texture: TEXCOORD0;\n"
+    "};\n"
+    "\n"
+    "struct PS_OUTPUT\n"
+    "{\n"
+    " %s\n"
+    "};\n"
+    "\n"
+    /* rgb <-> yuv function */
+    "%s\n"
+    "PS_OUTPUT main(PS_INPUT input)\n"
+    "{\n"
+    " PS_OUTPUT output;\n"
+    "%s"
+    " return output;\n"
+    "}\n";
+
+static const gchar templ_vertex_shader[] =
+    "struct VS_INPUT\n"
+    "{\n"
+    " float4 Position : POSITION;\n"
+    " float4 Texture : TEXCOORD0;\n"
+    "};\n"
+    "\n"
+    "struct VS_OUTPUT\n"
+    "{\n"
+    " float4 Position: SV_POSITION;\n"
+    " float4 Texture: TEXCOORD0;\n"
+    "};\n"
+    "\n"
+    "VS_OUTPUT main(VS_INPUT input)\n"
+    "{\n"
+    " return input;\n"
+    "}\n";
+
+/* *INDENT-ON* */
+
+typedef struct
+{
+  const PixelShaderTemplate *templ;
+  gchar *ps_body[CONVERTER_MAX_QUADS];
+  const gchar *ps_output[CONVERTER_MAX_QUADS];
+  PixelShaderColorTransform transform;
+} ConvertInfo;
+
+struct _GstD3D11Converter
+{
+  GstD3D11Device *device;
+  GstVideoInfo in_info;
+  GstVideoInfo out_info;
+  gdouble alpha;
+
+  const GstD3D11Format *in_d3d11_format;
+  const GstD3D11Format *out_d3d11_format;
+
+  guint num_input_view;
+  guint num_output_view;
+
+  GstD3D11Quad *quad[CONVERTER_MAX_QUADS];
+
+  D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES];
+
+  RECT src_rect;
+  RECT dest_rect;
+  gint input_texture_width;
+  gint input_texture_height;
+  ID3D11Buffer *vertex_buffer;
+  ID3D11Buffer *alpha_const_buffer;
+  gboolean update_vertex;
+  gboolean update_alpha;
+
+  ID3D11SamplerState *linear_sampler;
+
+  ConvertInfo convert_info;
+
+  GstStructure *config;
+};
+
+static gdouble
+get_opt_double (GstD3D11Converter * self, const gchar * opt, gdouble def)
+{
+  gdouble res;
+  if (!gst_structure_get_double (self->config, opt, &res))
+    res = def;
+
+  return res;
+}
+
+#define DEFAULT_OPT_ALPHA_VALUE 1.0
+
+#define GET_OPT_ALPHA_VALUE(c) get_opt_double(c, \
+    GST_D3D11_CONVERTER_OPT_ALPHA_VALUE, DEFAULT_OPT_ALPHA_VALUE);
+
+/* from video-converter.c */
+typedef struct
+{
+  gfloat dm[4][4];
+} MatrixData;
+
+static void
+color_matrix_set_identity (MatrixData * m)
+{
+  gint i, j;
+
+  for (i = 0; i < 4; i++) {
+    for (j = 0; j < 4; j++) {
+      m->dm[i][j] = (i == j);
+    }
+  }
+}
+
+static void
+color_matrix_copy (MatrixData * d, const MatrixData * s)
+{
+  gint i, j;
+
+  for (i = 0; i < 4; i++)
+    for (j = 0; j < 4; j++)
+      d->dm[i][j] = s->dm[i][j];
+}
+
+/* Perform 4x4 matrix multiplication:
+ * - @dst@ = @a@ * @b@
+ * - @dst@ may be a pointer to @a@ and/or @b@
+ */
+static void
+color_matrix_multiply (MatrixData * dst, MatrixData * a, MatrixData * b)
+{
+  MatrixData tmp;
+  gint i, j, k;
+
+  for (i = 0; i < 4; i++) {
+    for (j = 0; j < 4; j++) {
+      gfloat x = 0;
+      for (k = 0; k < 4; k++) {
+        x += a->dm[i][k] * b->dm[k][j];
+      }
+      tmp.dm[i][j] = x;
+    }
+  }
+  color_matrix_copy (dst, &tmp);
+}
+
+static void
+color_matrix_offset_components (MatrixData * m, gfloat a1, gfloat a2, gfloat a3)
+{
+  MatrixData a;
+
+  color_matrix_set_identity (&a);
+  a.dm[0][3] = a1;
+  a.dm[1][3] = a2;
+  a.dm[2][3] = a3;
+  color_matrix_multiply (m, &a, m);
+}
+
+static void
+color_matrix_scale_components (MatrixData * m, gfloat a1, gfloat a2, gfloat a3)
+{
+  MatrixData a;
+
+  color_matrix_set_identity (&a);
+  a.dm[0][0] = a1;
+  a.dm[1][1] = a2;
+  a.dm[2][2] = a3;
+  color_matrix_multiply (m, &a, m);
+}
+
+static void
+color_matrix_debug (GstD3D11Converter * self, const MatrixData * s)
+{
+  GST_DEBUG ("[%f %f %f %f]",
+      s->dm[0][0], s->dm[0][1], s->dm[0][2], s->dm[0][3]);
+  GST_DEBUG ("[%f %f %f %f]",
+      s->dm[1][0], s->dm[1][1], s->dm[1][2], s->dm[1][3]);
+  GST_DEBUG ("[%f %f %f %f]",
+      s->dm[2][0], s->dm[2][1], s->dm[2][2], s->dm[2][3]);
+  GST_DEBUG ("[%f %f %f %f]",
+      s->dm[3][0], s->dm[3][1], s->dm[3][2], s->dm[3][3]);
+}
+
+static void
+color_matrix_YCbCr_to_RGB (MatrixData * m, gfloat Kr, gfloat Kb)
+{
+  gfloat Kg = 1.0 - Kr - Kb;
+  MatrixData k = {
+    {
+      {1., 0., 2 * (1 - Kr), 0.},
+      {1., -2 * Kb * (1 - Kb) / Kg, -2 * Kr * (1 - Kr) / Kg, 0.},
+      {1., 2 * (1 - Kb), 0., 0.},
+      {0., 0., 0., 1.},
+    }
+  };
+
+  color_matrix_multiply (m, &k, m);
+}
+
+static void
+color_matrix_RGB_to_YCbCr (MatrixData * m, gfloat Kr, gfloat Kb)
+{
+  gfloat Kg = 1.0 - Kr - Kb;
+  MatrixData k;
+  gfloat x;
+
+  k.dm[0][0] = Kr;
+  k.dm[0][1] = Kg;
+  k.dm[0][2] = Kb;
+  k.dm[0][3] = 0;
+
+  x = 1 / (2 * (1 - Kb));
+  k.dm[1][0] = -x * Kr;
+  k.dm[1][1] = -x * Kg;
+  k.dm[1][2] = x * (1 - Kb);
+  k.dm[1][3] = 0;
+
+  x = 1 / (2 * (1 - Kr));
+  k.dm[2][0] = x * (1 - Kr);
+  k.dm[2][1] = -x * Kg;
+  k.dm[2][2] = -x * Kb;
+  k.dm[2][3] = 0;
+
+  k.dm[3][0] = 0;
+  k.dm[3][1] = 0;
+  k.dm[3][2] = 0;
+  k.dm[3][3] = 1;
+
+  color_matrix_multiply (m, &k, m);
+}
+
+static void
+compute_matrix_to_RGB (GstD3D11Converter * self, MatrixData * data,
+    GstVideoInfo * info)
+{
+  gdouble Kr = 0, Kb = 0;
+  gint offset[4], scale[4];
+
+  /* bring color components to [0..1.0] range */
+  gst_video_color_range_offsets (info->colorimetry.range, info->finfo, offset,
+      scale);
+
+  color_matrix_offset_components (data, -offset[0], -offset[1], -offset[2]);
+  color_matrix_scale_components (data, 1 / ((float) scale[0]),
+      1 / ((float) scale[1]), 1 / ((float) scale[2]));
+
+  if (!GST_VIDEO_INFO_IS_RGB (info)) {
+    /* bring components to R'G'B' space */
+    if (gst_video_color_matrix_get_Kr_Kb (info->colorimetry.matrix, &Kr, &Kb))
+      color_matrix_YCbCr_to_RGB (data, Kr, Kb);
+  }
+  color_matrix_debug (self, data);
+}
+
+static void
+compute_matrix_to_YUV (GstD3D11Converter * self, MatrixData * data,
+    GstVideoInfo * info)
+{
+  gdouble Kr = 0, Kb = 0;
+  gint offset[4], scale[4];
+
+  if (!GST_VIDEO_INFO_IS_RGB (info)) {
+    /* bring components to YCbCr space */
+    if (gst_video_color_matrix_get_Kr_Kb (info->colorimetry.matrix, &Kr, &Kb))
+      color_matrix_RGB_to_YCbCr (data, Kr, Kb);
+  }
+
+  /* bring color components to nominal range */
+  gst_video_color_range_offsets (info->colorimetry.range, info->finfo, offset,
+      scale);
+
+  color_matrix_scale_components (data, (float) scale[0], (float) scale[1],
+      (float) scale[2]);
+  color_matrix_offset_components (data, offset[0], offset[1], offset[2]);
+
+  color_matrix_debug (self, data);
+}
+
+static gboolean
+converter_get_matrix (GstD3D11Converter * self, MatrixData * matrix,
+    GstVideoInfo * in_info, GstVideoInfo * out_info)
+{
+  gboolean same_matrix;
+  guint in_bits, out_bits;
+
+  in_bits = GST_VIDEO_INFO_COMP_DEPTH (in_info, 0);
+  out_bits = GST_VIDEO_INFO_COMP_DEPTH (out_info, 0);
+
+  same_matrix = in_info->colorimetry.matrix == out_info->colorimetry.matrix;
+
+  GST_DEBUG ("matrix %d -> %d (%d)", in_info->colorimetry.matrix,
+      out_info->colorimetry.matrix, same_matrix);
+
+  color_matrix_set_identity (matrix);
+
+  if (same_matrix) {
+    GST_DEBUG ("conversion matrix is not required");
+    return FALSE;
+  }
+
+  if (in_bits < out_bits) {
+    gint scale = 1 << (out_bits - in_bits);
+    color_matrix_scale_components (matrix,
+        1 / (float) scale, 1 / (float) scale, 1 / (float) scale);
+  }
+
+  GST_DEBUG ("to RGB matrix");
+  compute_matrix_to_RGB (self, matrix, in_info);
+  GST_DEBUG ("current matrix");
+  color_matrix_debug (self, matrix);
+
+  GST_DEBUG ("to YUV matrix");
+  compute_matrix_to_YUV (self, matrix, out_info);
+  GST_DEBUG ("current matrix");
+  color_matrix_debug (self, matrix);
+
+  if (in_bits > out_bits) {
+    gint scale = 1 << (in_bits - out_bits);
+    color_matrix_scale_components (matrix,
+        (float) scale, (float) scale, (float) scale);
+  }
+
+  GST_DEBUG ("final matrix");
+  color_matrix_debug (self, matrix);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_rgb_to_rgb (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *convert_info = &self->convert_info;
+
+  convert_info->templ = &templ_REORDER;
+  convert_info->ps_body[0] = g_strdup_printf (templ_REORDER_BODY);
+  convert_info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  return TRUE;
+}
+
+static gboolean
+get_packed_yuv_components (GstD3D11Converter * self, GstVideoFormat
+    format, gchar * y, gchar * u, gchar * v)
+{
+  switch (format) {
+    case GST_VIDEO_FORMAT_YUY2:
+    {
+      const GstD3D11Format *d3d11_format =
+          gst_d3d11_device_format_from_gst (self->device,
+          GST_VIDEO_FORMAT_YUY2);
+
+      g_assert (d3d11_format != NULL);
+
+      if (d3d11_format->resource_format[0] == DXGI_FORMAT_R8G8B8A8_UNORM) {
+        *y = 'x';
+        *u = 'y';
+        *v = 'a';
+      } else if (d3d11_format->resource_format[0] ==
+          DXGI_FORMAT_G8R8_G8B8_UNORM) {
+        *y = 'y';
+        *u = 'x';
+        *v = 'z';
+      } else {
+        g_assert_not_reached ();
+        return FALSE;
+      }
+      break;
+    }
+    case GST_VIDEO_FORMAT_UYVY:
+      *y = 'y';
+      *u = 'x';
+      *v = 'z';
+      break;
+    case GST_VIDEO_FORMAT_VYUY:
+      *y = 'y';
+      *u = 'z';
+      *v = 'x';
+      break;
+    case GST_VIDEO_FORMAT_Y210:
+      *y = 'r';
+      *u = 'g';
+      *v = 'a';
+      break;
+    case GST_VIDEO_FORMAT_Y410:
+      *y = 'g';
+      *u = 'r';
+      *v = 'b';
+      break;
+    default:
+      g_assert_not_reached ();
+      return FALSE;
+  }
+
+  return TRUE;
+}
+
+static void
+get_planar_component (const GstVideoInfo * info, gchar * u, gchar * v,
+    guint * scale)
+{
+  switch (GST_VIDEO_INFO_FORMAT (info)) {
+    case GST_VIDEO_FORMAT_I420_10LE:
+    case GST_VIDEO_FORMAT_I422_10LE:
+    case GST_VIDEO_FORMAT_Y444_10LE:
+      *scale = (1 << 6);
+      break;
+    case GST_VIDEO_FORMAT_I420_12LE:
+    case GST_VIDEO_FORMAT_I422_12LE:
+    case GST_VIDEO_FORMAT_Y444_12LE:
+      *scale = (1 << 4);
+      break;
+    default:
+      *scale = 1;
+      break;
+  }
+
+  if (GST_VIDEO_INFO_FORMAT (info) == GST_VIDEO_FORMAT_YV12) {
+    *u = 'z';
+    *v = 'y';
+  } else {
+    *u = 'y';
+    *v = 'z';
+  }
+}
+
+static void
+get_semi_planar_component (const GstVideoInfo * info, gchar * u, gchar * v)
+{
+  if (GST_VIDEO_INFO_FORMAT (info) == GST_VIDEO_FORMAT_NV21) {
+    *u = 'y';
+    *v = 'x';
+  } else {
+    *u = 'x';
+    *v = 'y';
+  }
+}
+
+static gboolean
+setup_convert_info_yuv_to_rgb (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_YUV_to_RGB;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  switch (GST_VIDEO_INFO_FORMAT (in_info)) {
+    case GST_VIDEO_FORMAT_VUYA:
+      info->ps_body[0] = g_strdup_printf (templ_VUYA_to_RGB_BODY);
+      break;
+    case GST_VIDEO_FORMAT_YUY2:
+    case GST_VIDEO_FORMAT_UYVY:
+    case GST_VIDEO_FORMAT_VYUY:
+    case GST_VIDEO_FORMAT_Y210:
+    case GST_VIDEO_FORMAT_Y410:
+    {
+      gchar y, u, v;
+      if (!get_packed_yuv_components (self, GST_VIDEO_INFO_FORMAT (in_info),
+              &y, &u, &v)) {
+        return FALSE;
+      }
+
+      info->ps_body[0] =
+          g_strdup_printf (templ_PACKED_YUV_to_RGB_BODY, y, u, v);
+      break;
+    }
+    case GST_VIDEO_FORMAT_I420:
+    case GST_VIDEO_FORMAT_YV12:
+    case GST_VIDEO_FORMAT_I420_10LE:
+    case GST_VIDEO_FORMAT_I420_12LE:
+    case GST_VIDEO_FORMAT_Y42B:
+    case GST_VIDEO_FORMAT_I422_10LE:
+    case GST_VIDEO_FORMAT_I422_12LE:
+    case GST_VIDEO_FORMAT_Y444:
+    case GST_VIDEO_FORMAT_Y444_10LE:
+    case GST_VIDEO_FORMAT_Y444_12LE:
+    case GST_VIDEO_FORMAT_Y444_16LE:
+    {
+      guint mul;
+      gchar u, v;
+
+      get_planar_component (in_info, &u, &v, &mul);
+
+      info->ps_body[0] =
+          g_strdup_printf (templ_PLANAR_YUV_to_RGB_BODY, mul, u, mul, v, mul);
+      break;
+    }
+    case GST_VIDEO_FORMAT_NV12:
+    case GST_VIDEO_FORMAT_NV21:
+    case GST_VIDEO_FORMAT_P010_10LE:
+    case GST_VIDEO_FORMAT_P012_LE:
+    case GST_VIDEO_FORMAT_P016_LE:
+    {
+      gchar u, v;
+
+      get_semi_planar_component (in_info, &u, &v);
+
+      info->ps_body[0] = g_strdup_printf (templ_SEMI_PLANAR_to_RGB_BODY, u, v);
+      break;
+    }
+    default:
+      GST_ERROR ("Unhandled input format %s",
+          gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)));
+      return FALSE;
+  }
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_rgb_to_yuv (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_RGB_to_YUV;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  switch (GST_VIDEO_INFO_FORMAT (out_info)) {
+    case GST_VIDEO_FORMAT_VUYA:
+      info->ps_body[0] = g_strdup_printf (templ_RGB_to_VUYA_BODY);
+      break;
+    case GST_VIDEO_FORMAT_NV12:
+    case GST_VIDEO_FORMAT_NV21:
+    case GST_VIDEO_FORMAT_P010_10LE:
+    case GST_VIDEO_FORMAT_P012_LE:
+    case GST_VIDEO_FORMAT_P016_LE:
+    {
+      gchar u, v;
+
+      get_semi_planar_component (out_info, &u, &v);
+
+      info->ps_body[0] = g_strdup_printf (templ_RGB_to_LUMA_BODY, 1);
+      info->ps_body[1] = g_strdup_printf (templ_RGB_to_SEMI_PLANAR_CHROMA_BODY,
+          u, v);
+      info->ps_output[1] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+      break;
+    }
+    case GST_VIDEO_FORMAT_I420:
+    case GST_VIDEO_FORMAT_YV12:
+    case GST_VIDEO_FORMAT_I420_10LE:
+    case GST_VIDEO_FORMAT_I420_12LE:
+    case GST_VIDEO_FORMAT_Y42B:
+    case GST_VIDEO_FORMAT_I422_10LE:
+    case GST_VIDEO_FORMAT_I422_12LE:
+    case GST_VIDEO_FORMAT_Y444:
+    case GST_VIDEO_FORMAT_Y444_10LE:
+    case GST_VIDEO_FORMAT_Y444_12LE:
+    case GST_VIDEO_FORMAT_Y444_16LE:
+    {
+      guint div;
+      gchar u, v;
+
+      get_planar_component (out_info, &u, &v, &div);
+
+      info->ps_body[0] = g_strdup_printf (templ_RGB_to_LUMA_BODY, div);
+      info->ps_body[1] =
+          g_strdup_printf (templ_RGB_to_PLANAR_CHROMA_BODY, u, div, v, div);
+      info->ps_output[1] = HLSL_PS_OUTPUT_TWO_PLANES_BODY;
+      break;
+    }
+    default:
+      GST_ERROR ("Unhandled output format %s",
+          gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info)));
+      return FALSE;
+  }
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_planar_to_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  guint in_scale, out_scale;
+  gchar in_u, in_v, out_u, out_v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_TWO_PLANES_BODY;
+
+  get_planar_component (in_info, &in_u, &in_v, &in_scale);
+  get_planar_component (out_info, &out_u, &out_v, &out_scale);
+
+  info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY,
+      in_scale, out_scale);
+  info->ps_body[1] =
+      g_strdup_printf (templ_PLANAR_TO_PLANAR_CHROMA_BODY,
+      in_u, in_scale, in_v, in_scale, out_u, out_scale, out_v, out_scale);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_planar_to_semi_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  guint in_scale;
+  gchar in_u, in_v, out_u, out_v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  get_planar_component (in_info, &in_u, &in_v, &in_scale);
+  get_semi_planar_component (out_info, &out_u, &out_v);
+
+  info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, in_scale, 1);
+  info->ps_body[1] =
+      g_strdup_printf (templ_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY,
+      in_u, in_scale, in_v, in_scale, out_u, out_v);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_semi_planar_to_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  gchar in_u, in_v, out_u, out_v;
+  guint div = 1;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_TWO_PLANES_BODY;
+
+  get_semi_planar_component (in_info, &in_u, &in_v);
+  get_planar_component (out_info, &out_u, &out_v, &div);
+
+  info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, 1, div);
+  info->ps_body[1] =
+      g_strdup_printf (templ_SEMI_PLANAR_TO_PLANAR_CHROMA_BODY,
+      in_u, in_v, out_u, div, out_v, div);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_semi_planar_to_semi_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  gchar in_u, in_v, out_u, out_v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  get_semi_planar_component (in_info, &in_u, &in_v);
+  get_semi_planar_component (out_info, &out_u, &out_v);
+
+  info->ps_body[0] = g_strdup_printf (templ_LUMA_to_LUMA_BODY, 1, 1);
+  info->ps_body[1] =
+      g_strdup_printf (templ_SEMI_PLANAR_TO_SEMI_PLANAR_CHROMA_BODY,
+      in_u, in_v, out_u, out_v);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_vuya_to_vuya (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  info->ps_body[0] = g_strdup_printf (templ_REORDER_BODY);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_vuya_to_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  guint div;
+  gchar u, v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_TWO_PLANES_BODY;
+
+  get_planar_component (out_info, &u, &v, &div);
+
+  info->ps_body[0] = g_strdup_printf (templ_VUYA_to_LUMA_BODY, div);
+  info->ps_body[1] =
+      g_strdup_printf (templ_VUYA_TO_PLANAR_CHROMA_BODY, u, div, v, div);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_vuya_to_semi_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  guint div = 1;
+  gchar u, v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  get_semi_planar_component (out_info, &u, &v);
+
+  info->ps_body[0] = g_strdup_printf (templ_VUYA_to_LUMA_BODY, div);
+  info->ps_body[1] =
+      g_strdup_printf (templ_VUYA_TO_SEMI_PLANAR_CHROMA_BODY, v, u);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_planar_to_vuya (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  guint mul;
+  gchar u, v;
+
+  get_planar_component (in_info, &u, &v, &mul);
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  info->ps_body[0] =
+      g_strdup_printf (templ_PLANAR_to_VUYA_BODY, mul, u, mul, v, mul);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_packed_yuv_to_vuya (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  gchar y, u, v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  if (!get_packed_yuv_components (self, GST_VIDEO_INFO_FORMAT (in_info),
+          &y, &u, &v)) {
+    return FALSE;
+  }
+
+  info->ps_body[0] = g_strdup_printf (templ_PACKED_YUV_to_VUYA_BODY, y, u, v);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_semi_planar_to_vuya (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  gchar u, v;
+
+  get_semi_planar_component (in_info, &u, &v);
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  info->ps_body[0] = g_strdup_printf (templ_SEMI_PLANAR_to_VUYA_BODY, v, u);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_packed_yuv_to_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  gchar in_y, in_u, in_v;
+  gchar out_u, out_v;
+  guint out_scale;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_TWO_PLANES_BODY;
+
+  if (!get_packed_yuv_components (self, GST_VIDEO_INFO_FORMAT (in_info),
+          &in_y, &in_u, &in_v)) {
+    return FALSE;
+  }
+
+  get_planar_component (out_info, &out_u, &out_v, &out_scale);
+
+  info->ps_body[0] =
+      g_strdup_printf (templ_PACKED_YUV_to_LUMA_BODY, in_y, out_scale);
+  info->ps_body[1] =
+      g_strdup_printf (templ_PACKED_YUV_TO_PLANAR_CHROMA_BODY, in_u, in_v,
+      out_u, out_scale, out_v, out_scale);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_packed_yuv_to_semi_planar (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+  gchar in_y, in_u, in_v;
+  gchar out_u, out_v;
+
+  info->templ = &templ_REORDER;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+  info->ps_output[1] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  if (!get_packed_yuv_components (self, GST_VIDEO_INFO_FORMAT (in_info),
+          &in_y, &in_u, &in_v)) {
+    return FALSE;
+  }
+
+  get_semi_planar_component (out_info, &out_u, &out_v);
+
+  info->ps_body[0] = g_strdup_printf (templ_PACKED_YUV_to_LUMA_BODY, in_y, 1);
+  info->ps_body[1] =
+      g_strdup_printf (templ_PACKED_YUV_TO_SEMI_PLANAR_CHROMA_BODY,
+      in_u, in_v, out_u, out_v);
+
+  return TRUE;
+}
+
+static gboolean
+is_planar_format (const GstVideoInfo * info)
+{
+  switch (GST_VIDEO_INFO_FORMAT (info)) {
+    case GST_VIDEO_FORMAT_I420:
+    case GST_VIDEO_FORMAT_YV12:
+    case GST_VIDEO_FORMAT_I420_10LE:
+    case GST_VIDEO_FORMAT_I420_12LE:
+    case GST_VIDEO_FORMAT_Y42B:
+    case GST_VIDEO_FORMAT_I422_10LE:
+    case GST_VIDEO_FORMAT_I422_12LE:
+    case GST_VIDEO_FORMAT_Y444:
+    case GST_VIDEO_FORMAT_Y444_10LE:
+    case GST_VIDEO_FORMAT_Y444_12LE:
+    case GST_VIDEO_FORMAT_Y444_16LE:
+      return TRUE;
+    default:
+      break;
+  }
+
+  return FALSE;
+}
+
+static gboolean
+setup_convert_info_yuv_to_yuv (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  gboolean in_planar, out_planar;
+  gboolean in_vuya, out_vuya;
+  gboolean in_packed;
+
+  in_vuya = GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_VUYA;
+  out_vuya = GST_VIDEO_INFO_FORMAT (out_info) == GST_VIDEO_FORMAT_VUYA;
+  in_planar = is_planar_format (in_info);
+  in_packed = (GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_YUY2 ||
+      GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_UYVY ||
+      GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_VYUY ||
+      GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_Y210 ||
+      GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_FORMAT_Y410);
+  out_planar = is_planar_format (out_info);
+
+  /* From/to VUYA */
+  if (in_vuya && out_vuya) {
+    return setup_convert_info_vuya_to_vuya (self, in_info, out_info);
+  } else if (in_vuya) {
+    if (out_planar)
+      return setup_convert_info_vuya_to_planar (self, in_info, out_info);
+    else
+      return setup_convert_info_vuya_to_semi_planar (self, in_info, out_info);
+  } else if (out_vuya) {
+    if (in_planar)
+      return setup_convert_info_planar_to_vuya (self, in_info, out_info);
+    else if (in_packed)
+      return setup_convert_info_packed_yuv_to_vuya (self, in_info, out_info);
+    else
+      return setup_convert_info_semi_planar_to_vuya (self, in_info, out_info);
+  }
+
+  if (in_planar) {
+    if (out_planar)
+      return setup_convert_info_planar_to_planar (self, in_info, out_info);
+    else
+      return setup_convert_info_planar_to_semi_planar (self, in_info, out_info);
+  } else if (in_packed) {
+    if (out_planar)
+      return setup_convert_info_packed_yuv_to_planar (self, in_info, out_info);
+    else
+      return setup_convert_info_packed_yuv_to_semi_planar (self, in_info,
+          out_info);
+  } else {
+    if (out_planar)
+      return setup_convert_info_semi_planar_to_planar (self, in_info, out_info);
+    else
+      return setup_convert_info_semi_planar_to_semi_planar (self, in_info,
+          out_info);
+  }
+
+  return FALSE;
+}
+
+static gboolean
+setup_convert_info_rgb_to_gray (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_RGB_to_YUV;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  switch (GST_VIDEO_INFO_FORMAT (out_info)) {
+    case GST_VIDEO_FORMAT_GRAY8:
+    case GST_VIDEO_FORMAT_GRAY16_LE:
+      info->ps_body[0] = g_strdup_printf (templ_RGB_to_GRAY_BODY);
+      break;
+    default:
+      GST_ERROR ("Unhandled output format %s",
+          gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info)));
+      return FALSE;
+  }
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_yuv_to_gray (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_REORDER_NO_ALPHA;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  if (GST_VIDEO_INFO_FORMAT (out_info) != GST_VIDEO_FORMAT_GRAY8 &&
+      GST_VIDEO_INFO_FORMAT (out_info) != GST_VIDEO_FORMAT_GRAY16_LE)
+    return FALSE;
+
+  switch (GST_VIDEO_INFO_FORMAT (in_info)) {
+    case GST_VIDEO_FORMAT_VUYA:
+      info->ps_body[0] = g_strdup_printf (templ_VUYA_to_GRAY_BODY);
+      break;
+    case GST_VIDEO_FORMAT_I420:
+    case GST_VIDEO_FORMAT_YV12:
+    case GST_VIDEO_FORMAT_I420_10LE:
+    case GST_VIDEO_FORMAT_I420_12LE:
+    case GST_VIDEO_FORMAT_Y42B:
+    case GST_VIDEO_FORMAT_I422_10LE:
+    case GST_VIDEO_FORMAT_I422_12LE:
+    case GST_VIDEO_FORMAT_Y444:
+    case GST_VIDEO_FORMAT_Y444_10LE:
+    case GST_VIDEO_FORMAT_Y444_12LE:
+    case GST_VIDEO_FORMAT_Y444_16LE:
+    case GST_VIDEO_FORMAT_NV12:
+    case GST_VIDEO_FORMAT_NV21:
+    case GST_VIDEO_FORMAT_P010_10LE:
+    case GST_VIDEO_FORMAT_P012_LE:
+    case GST_VIDEO_FORMAT_P016_LE:
+    {
+      gchar u, v;
+      guint scale;
+
+      get_planar_component (in_info, &u, &v, &scale);
+
+      info->ps_body[0] = g_strdup_printf (templ_YUV_to_GRAY_BODY, scale);
+      break;
+    }
+    case GST_VIDEO_FORMAT_Y410:
+    {
+      gchar y, u, v;
+      get_packed_yuv_components (self, GST_VIDEO_FORMAT_Y410, &y, &u, &v);
+
+      info->ps_body[0] = g_strdup_printf (templ_PACKED_YUV_TO_GRAY, y);
+      break;
+    }
+    default:
+      GST_ERROR ("Unhandled input format %s",
+          gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)));
+      return FALSE;
+  }
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_gray_to_gray (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_REORDER_NO_ALPHA;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  if (GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_FORMAT_GRAY8 &&
+      GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_FORMAT_GRAY16_LE)
+    return FALSE;
+
+  if (GST_VIDEO_INFO_FORMAT (out_info) != GST_VIDEO_FORMAT_GRAY8 &&
+      GST_VIDEO_INFO_FORMAT (out_info) != GST_VIDEO_FORMAT_GRAY16_LE)
+    return FALSE;
+
+  info->ps_body[0] = g_strdup_printf (templ_GRAY_to_GRAY_BODY);
+
+  return TRUE;
+}
+
+static gboolean
+setup_convert_info_gray_to_rgb (GstD3D11Converter * self,
+    const GstVideoInfo * in_info, const GstVideoInfo * out_info)
+{
+  ConvertInfo *info = &self->convert_info;
+
+  info->templ = &templ_REORDER_NO_ALPHA;
+  info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY;
+
+  if (GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_FORMAT_GRAY8 &&
+      GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_FORMAT_GRAY16_LE)
+    return FALSE;
+
+  info->ps_body[0] =
g_strdup_printf (templ_GRAY_to_RGB_BODY); + + return TRUE; +} + +static gboolean +setup_convert_info_gray_to_yuv (GstD3D11Converter * self, + const GstVideoInfo * in_info, const GstVideoInfo * out_info) +{ + ConvertInfo *info = &self->convert_info; + + info->templ = &templ_REORDER_NO_ALPHA; + info->ps_output[0] = HLSL_PS_OUTPUT_ONE_PLANE_BODY; + + if (GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_FORMAT_GRAY8 && + GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_FORMAT_GRAY16_LE) + return FALSE; + + switch (GST_VIDEO_INFO_FORMAT (out_info)) { + case GST_VIDEO_FORMAT_VUYA: + info->ps_body[0] = g_strdup_printf (templ_GRAY_to_VUYA_BODY); + break; + case GST_VIDEO_FORMAT_I420: + case GST_VIDEO_FORMAT_YV12: + case GST_VIDEO_FORMAT_I420_10LE: + case GST_VIDEO_FORMAT_I420_12LE: + case GST_VIDEO_FORMAT_Y42B: + case GST_VIDEO_FORMAT_I422_10LE: + case GST_VIDEO_FORMAT_I422_12LE: + case GST_VIDEO_FORMAT_Y444: + case GST_VIDEO_FORMAT_Y444_10LE: + case GST_VIDEO_FORMAT_Y444_12LE: + case GST_VIDEO_FORMAT_Y444_16LE: + { + gchar u, v; + guint div; + + get_planar_component (out_info, &u, &v, &div); + info->ps_body[0] = g_strdup_printf (templ_GRAY_to_LUMA_BODY, div); + info->ps_body[1] = + g_strdup_printf (templ_GRAY_to_PLANAR_CHROMA_BODY, div); + info->ps_output[1] = HLSL_PS_OUTPUT_TWO_PLANES_BODY; + break; + } + case GST_VIDEO_FORMAT_NV12: + case GST_VIDEO_FORMAT_NV21: + case GST_VIDEO_FORMAT_P010_10LE: + case GST_VIDEO_FORMAT_P012_LE: + case GST_VIDEO_FORMAT_P016_LE: + info->ps_body[0] = g_strdup_printf (templ_GRAY_to_LUMA_BODY, 1); + info->ps_body[1] = + g_strdup_printf (templ_GRAY_to_SEMI_PLANAR_CHROMA_BODY); + info->ps_output[1] = HLSL_PS_OUTPUT_ONE_PLANE_BODY; + break; + default: + GST_ERROR ("Unhandled output format %s", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_color_convert_setup_shader (GstD3D11Converter * self, + GstD3D11Device * device, GstVideoInfo * in_info, GstVideoInfo * 
out_info) +{ + HRESULT hr; + D3D11_SAMPLER_DESC sampler_desc; + D3D11_INPUT_ELEMENT_DESC input_desc[2]; + D3D11_BUFFER_DESC buffer_desc; + D3D11_MAPPED_SUBRESOURCE map; + VertexData *vertex_data; + WORD *indices; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + ConvertInfo *convert_info = &self->convert_info; + /* *INDENT-OFF* */ + ComPtr<ID3D11PixelShader> ps[CONVERTER_MAX_QUADS]; + ComPtr<ID3D11VertexShader> vs; + ComPtr<ID3D11InputLayout> layout; + ComPtr<ID3D11SamplerState> linear_sampler; + ComPtr<ID3D11Buffer> transform_const_buffer; + ComPtr<ID3D11Buffer> alpha_const_buffer; + ComPtr<ID3D11Buffer> vertex_buffer; + ComPtr<ID3D11Buffer> index_buffer; + /* *INDENT-ON* */ + const guint index_count = 2 * 3; + gint i; + gboolean ret; + ID3D11Buffer *const_buffers[2] = { nullptr, nullptr }; + guint num_const_buffers = 0; + + memset (&sampler_desc, 0, sizeof (sampler_desc)); + memset (input_desc, 0, sizeof (input_desc)); + memset (&buffer_desc, 0, sizeof (buffer_desc)); + + device_handle = gst_d3d11_device_get_device_handle (device); + context_handle = gst_d3d11_device_get_device_context_handle (device); + + /* bilinear filtering */ + sampler_desc.Filter = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; + sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.ComparisonFunc = D3D11_COMPARISON_ALWAYS; + sampler_desc.MinLOD = 0; + sampler_desc.MaxLOD = D3D11_FLOAT32_MAX; + + hr = device_handle->CreateSamplerState (&sampler_desc, &linear_sampler); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create sampler state, hr: 0x%x", (guint) hr); + return FALSE; + } + + for (i = 0; i < CONVERTER_MAX_QUADS; i++) { + gchar *shader_code = NULL; + + if (convert_info->ps_body[i]) { + gchar *transform_const_buffer = nullptr; + gchar *alpha_const_buffer = nullptr; + guint register_idx = 0; + + g_assert (convert_info->ps_output[i] != 
NULL); + + if (convert_info->templ->has_transform) { + transform_const_buffer = + g_strdup_printf (templ_color_transform_const_buffer, register_idx); + register_idx++; + } + + if (convert_info->templ->has_alpha) { + alpha_const_buffer = + g_strdup_printf (templ_alpha_const_buffer, register_idx); + register_idx++; + } + + shader_code = g_strdup_printf (templ_pixel_shader, + transform_const_buffer ? transform_const_buffer : "", + alpha_const_buffer ? alpha_const_buffer : "", + convert_info->ps_output[i], + convert_info->templ->func ? convert_info->templ->func : "", + convert_info->ps_body[i]); + g_free (transform_const_buffer); + g_free (alpha_const_buffer); + + ret = gst_d3d11_create_pixel_shader (device, shader_code, &ps[i]); + g_free (shader_code); + if (!ret) { + return FALSE; + } + } + } + + if (convert_info->templ->has_transform) { + D3D11_BUFFER_DESC const_buffer_desc = { 0, }; + + G_STATIC_ASSERT (sizeof (PixelShaderColorTransform) % 16 == 0); + + const_buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + const_buffer_desc.ByteWidth = sizeof (PixelShaderColorTransform); + const_buffer_desc.BindFlags = D3D11_BIND_CONSTANT_BUFFER; + const_buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + const_buffer_desc.MiscFlags = 0; + const_buffer_desc.StructureByteStride = 0; + + hr = device_handle->CreateBuffer (&const_buffer_desc, nullptr, + &transform_const_buffer); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create constant buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + gst_d3d11_device_lock (device); + hr = context_handle->Map (transform_const_buffer.Get (), + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't map constant buffer, hr: 0x%x", (guint) hr); + gst_d3d11_device_unlock (device); + return FALSE; + } + + memcpy (map.pData, &convert_info->transform, + sizeof (PixelShaderColorTransform)); + + context_handle->Unmap (transform_const_buffer.Get (), 0); + gst_d3d11_device_unlock (device); + 
+ const_buffers[num_const_buffers] = transform_const_buffer.Get (); + num_const_buffers++; + } + + if (convert_info->templ->has_alpha) { + D3D11_BUFFER_DESC const_buffer_desc = { 0, }; + AlphaConstBuffer *alpha_const; + + G_STATIC_ASSERT (sizeof (AlphaConstBuffer) % 16 == 0); + + const_buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + const_buffer_desc.ByteWidth = sizeof (AlphaConstBuffer); + const_buffer_desc.BindFlags = D3D11_BIND_CONSTANT_BUFFER; + const_buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + const_buffer_desc.MiscFlags = 0; + const_buffer_desc.StructureByteStride = 0; + + hr = device_handle->CreateBuffer (&const_buffer_desc, + nullptr, &alpha_const_buffer); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create constant buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + gst_d3d11_device_lock (device); + hr = context_handle->Map (alpha_const_buffer.Get (), + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't map constant buffer, hr: 0x%x", (guint) hr); + gst_d3d11_device_unlock (device); + return FALSE; + } + + alpha_const = (AlphaConstBuffer *) map.pData; + memset (alpha_const, 0, sizeof (AlphaConstBuffer)); + alpha_const->alpha_mul = (FLOAT) self->alpha; + + context_handle->Unmap (alpha_const_buffer.Get (), 0); + gst_d3d11_device_unlock (device); + + self->alpha_const_buffer = alpha_const_buffer.Get (); + /* We will hold this buffer and update later */ + self->alpha_const_buffer->AddRef (); + const_buffers[num_const_buffers] = alpha_const_buffer.Get (); + num_const_buffers++; + } + + input_desc[0].SemanticName = "POSITION"; + input_desc[0].SemanticIndex = 0; + input_desc[0].Format = DXGI_FORMAT_R32G32B32_FLOAT; + input_desc[0].InputSlot = 0; + input_desc[0].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; + input_desc[0].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; + input_desc[0].InstanceDataStepRate = 0; + + input_desc[1].SemanticName = "TEXCOORD"; + 
input_desc[1].SemanticIndex = 0; + input_desc[1].Format = DXGI_FORMAT_R32G32_FLOAT; + input_desc[1].InputSlot = 0; + input_desc[1].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; + input_desc[1].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; + input_desc[1].InstanceDataStepRate = 0; + + if (!gst_d3d11_create_vertex_shader (device, templ_vertex_shader, + input_desc, G_N_ELEMENTS (input_desc), &vs, &layout)) { + GST_ERROR ("Couldn't create vertex shader"); + return FALSE; + } + + /* set up vertex buffer and index buffer */ + buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (VertexData) * 4; + buffer_desc.BindFlags = D3D11_BIND_VERTEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + hr = device_handle->CreateBuffer (&buffer_desc, NULL, &vertex_buffer); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create vertex buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (WORD) * index_count; + buffer_desc.BindFlags = D3D11_BIND_INDEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + hr = device_handle->CreateBuffer (&buffer_desc, NULL, &index_buffer); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create index buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + gst_d3d11_device_lock (device); + hr = context_handle->Map (vertex_buffer.Get (), 0, D3D11_MAP_WRITE_DISCARD, 0, + &map); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't map vertex buffer, hr: 0x%x", (guint) hr); + gst_d3d11_device_unlock (device); + return FALSE; + } + + vertex_data = (VertexData *) map.pData; + + hr = context_handle->Map (index_buffer.Get (), 0, D3D11_MAP_WRITE_DISCARD, 0, + &map); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't map index buffer, hr: 0x%x", (guint) hr); + context_handle->Unmap (vertex_buffer.Get (), 0); + gst_d3d11_device_unlock (device); + return FALSE; + } + + indices = (WORD *)
map.pData; + + /* bottom left */ + vertex_data[0].position.x = -1.0f; + vertex_data[0].position.y = -1.0f; + vertex_data[0].position.z = 0.0f; + vertex_data[0].texture.x = 0.0f; + vertex_data[0].texture.y = 1.0f; + + /* top left */ + vertex_data[1].position.x = -1.0f; + vertex_data[1].position.y = 1.0f; + vertex_data[1].position.z = 0.0f; + vertex_data[1].texture.x = 0.0f; + vertex_data[1].texture.y = 0.0f; + + /* top right */ + vertex_data[2].position.x = 1.0f; + vertex_data[2].position.y = 1.0f; + vertex_data[2].position.z = 0.0f; + vertex_data[2].texture.x = 1.0f; + vertex_data[2].texture.y = 0.0f; + + /* bottom right */ + vertex_data[3].position.x = 1.0f; + vertex_data[3].position.y = -1.0f; + vertex_data[3].position.z = 0.0f; + vertex_data[3].texture.x = 1.0f; + vertex_data[3].texture.y = 1.0f; + + /* clockwise indexing */ + indices[0] = 0; /* bottom left */ + indices[1] = 1; /* top left */ + indices[2] = 2; /* top right */ + + indices[3] = 3; /* bottom right */ + indices[4] = 0; /* bottom left */ + indices[5] = 2; /* top right */ + + context_handle->Unmap (vertex_buffer.Get (), 0); + context_handle->Unmap (index_buffer.Get (), 0); + gst_d3d11_device_unlock (device); + + self->quad[0] = gst_d3d11_quad_new (device, + ps[0].Get (), vs.Get (), layout.Get (), + const_buffers, num_const_buffers, + vertex_buffer.Get (), sizeof (VertexData), + index_buffer.Get (), DXGI_FORMAT_R16_UINT, index_count); + + if (ps[1]) { + self->quad[1] = gst_d3d11_quad_new (device, + ps[1].Get (), vs.Get (), layout.Get (), + const_buffers, num_const_buffers, + vertex_buffer.Get (), sizeof (VertexData), + index_buffer.Get (), DXGI_FORMAT_R16_UINT, index_count); + } + + self->num_input_view = GST_VIDEO_INFO_N_PLANES (in_info); + self->num_output_view = GST_VIDEO_INFO_N_PLANES (out_info); + + /* holds vertex buffer for crop rect update */ + self->vertex_buffer = vertex_buffer.Detach (); + + self->src_rect.left = 0; + self->src_rect.top = 0; + self->src_rect.right = GST_VIDEO_INFO_WIDTH 
(in_info); + self->src_rect.bottom = GST_VIDEO_INFO_HEIGHT (in_info); + + self->dest_rect.left = 0; + self->dest_rect.top = 0; + self->dest_rect.right = GST_VIDEO_INFO_WIDTH (out_info); + self->dest_rect.bottom = GST_VIDEO_INFO_HEIGHT (out_info); + + self->input_texture_width = GST_VIDEO_INFO_WIDTH (in_info); + self->input_texture_height = GST_VIDEO_INFO_HEIGHT (in_info); + + self->linear_sampler = linear_sampler.Detach (); + + return TRUE; +} + +static gboolean +copy_config (GQuark field_id, const GValue * value, GstD3D11Converter * self) +{ + gst_structure_id_set_value (self->config, field_id, value); + + return TRUE; +} + +static gboolean +gst_d3d11_converter_set_config (GstD3D11Converter * converter, + GstStructure * config) +{ + gst_structure_foreach (config, (GstStructureForeachFunc) copy_config, + converter); + gst_structure_free (config); + + return TRUE; +} + +GstD3D11Converter * +gst_d3d11_converter_new (GstD3D11Device * device, + GstVideoInfo * in_info, GstVideoInfo * out_info, GstStructure * config) +{ + const GstVideoInfo *unknown_info; + const GstD3D11Format *in_d3d11_format; + const GstD3D11Format *out_d3d11_format; + gboolean is_supported = FALSE; + MatrixData matrix; + GstD3D11Converter *converter = NULL; + gboolean ret; + guint i; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (in_info != NULL, NULL); + g_return_val_if_fail (out_info != NULL, NULL); + + GST_DEBUG ("Setup convert with format %s -> %s", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)), + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); + + in_d3d11_format = + gst_d3d11_device_format_from_gst (device, + GST_VIDEO_INFO_FORMAT (in_info)); + if (!in_d3d11_format) { + unknown_info = in_info; + goto format_unknown; + } + + out_d3d11_format = + gst_d3d11_device_format_from_gst (device, + GST_VIDEO_INFO_FORMAT (out_info)); + if (!out_d3d11_format) { + unknown_info = out_info; + goto format_unknown; + } + + converter = 
g_new0 (GstD3D11Converter, 1); + converter->device = (GstD3D11Device *) gst_object_ref (device); + converter->config = gst_structure_new_empty ("GstD3D11Converter-Config"); + if (config) + gst_d3d11_converter_set_config (converter, config); + + converter->alpha = GET_OPT_ALPHA_VALUE (converter); + + if (GST_VIDEO_INFO_IS_RGB (in_info)) { + if (GST_VIDEO_INFO_IS_RGB (out_info)) { + is_supported = + setup_convert_info_rgb_to_rgb (converter, in_info, out_info); + } else if (GST_VIDEO_INFO_IS_YUV (out_info)) { + is_supported = + setup_convert_info_rgb_to_yuv (converter, in_info, out_info); + } else if (GST_VIDEO_INFO_IS_GRAY (out_info)) { + is_supported = + setup_convert_info_rgb_to_gray (converter, in_info, out_info); + } + } else if (GST_VIDEO_INFO_IS_YUV (in_info)) { + if (GST_VIDEO_INFO_IS_RGB (out_info)) { + is_supported = + setup_convert_info_yuv_to_rgb (converter, in_info, out_info); + } else if (GST_VIDEO_INFO_IS_YUV (out_info)) { + is_supported = + setup_convert_info_yuv_to_yuv (converter, in_info, out_info); + } else if (GST_VIDEO_INFO_IS_GRAY (out_info)) { + is_supported = + setup_convert_info_yuv_to_gray (converter, in_info, out_info); + } + } else if (GST_VIDEO_INFO_IS_GRAY (in_info)) { + if (GST_VIDEO_INFO_IS_RGB (out_info)) { + is_supported = + setup_convert_info_gray_to_rgb (converter, in_info, out_info); + } else if (GST_VIDEO_INFO_IS_YUV (out_info)) { + is_supported = + setup_convert_info_gray_to_yuv (converter, in_info, out_info); + } else if (GST_VIDEO_INFO_IS_GRAY (out_info)) { + is_supported = + setup_convert_info_gray_to_gray (converter, in_info, out_info); + } + } + + if (!is_supported) { + goto conversion_not_supported; + } + + if (converter_get_matrix (converter, &matrix, in_info, out_info)) { + PixelShaderColorTransform *transform = &converter->convert_info.transform; + + /* padding the last column for 16bytes alignment */ + transform->trans_matrix[0] = matrix.dm[0][0]; + transform->trans_matrix[1] = matrix.dm[0][1]; + 
transform->trans_matrix[2] = matrix.dm[0][2]; + transform->trans_matrix[3] = 0; + transform->trans_matrix[4] = matrix.dm[1][0]; + transform->trans_matrix[5] = matrix.dm[1][1]; + transform->trans_matrix[6] = matrix.dm[1][2]; + transform->trans_matrix[7] = 0; + transform->trans_matrix[8] = matrix.dm[2][0]; + transform->trans_matrix[9] = matrix.dm[2][1]; + transform->trans_matrix[10] = matrix.dm[2][2]; + transform->trans_matrix[11] = 0; + } + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (out_info); i++) { + converter->viewport[i].TopLeftX = 0; + converter->viewport[i].TopLeftY = 0; + converter->viewport[i].Width = GST_VIDEO_INFO_COMP_WIDTH (out_info, i); + converter->viewport[i].Height = GST_VIDEO_INFO_COMP_HEIGHT (out_info, i); + converter->viewport[i].MinDepth = 0.0f; + converter->viewport[i].MaxDepth = 1.0f; + } + + ret = gst_d3d11_color_convert_setup_shader (converter, + device, in_info, out_info); + + if (!ret) { + GST_ERROR ("Couldn't setup shader"); + gst_d3d11_converter_free (converter); + converter = NULL; + } else { + converter->in_info = *in_info; + converter->out_info = *out_info; + } + + return converter; + + /* ERRORS */ +format_unknown: + { + GST_ERROR ("%s couldn't be converted to d3d11 format", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (unknown_info))); + if (config) + gst_structure_free (config); + + return NULL; + } +conversion_not_supported: + { + GST_ERROR ("Conversion %s to %s not supported", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (in_info)), + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (out_info))); + gst_d3d11_converter_free (converter); + return NULL; + } +} + +void +gst_d3d11_converter_free (GstD3D11Converter * converter) +{ + gint i; + + g_return_if_fail (converter != NULL); + + for (i = 0; i < CONVERTER_MAX_QUADS; i++) { + if (converter->quad[i]) + gst_d3d11_quad_free (converter->quad[i]); + + g_free (converter->convert_info.ps_body[i]); + } + + GST_D3D11_CLEAR_COM (converter->vertex_buffer); + GST_D3D11_CLEAR_COM 
(converter->linear_sampler); + GST_D3D11_CLEAR_COM (converter->alpha_const_buffer); + + gst_clear_object (&converter->device); + + if (converter->config) + gst_structure_free (converter->config); + + g_free (converter); +} + +/* must be called with gst_d3d11_device_lock since ID3D11DeviceContext is not + * thread-safe */ +static gboolean +gst_d3d11_converter_update_vertex_buffer (GstD3D11Converter * self) +{ + D3D11_MAPPED_SUBRESOURCE map; + VertexData *vertex_data; + ID3D11DeviceContext *context_handle; + HRESULT hr; + FLOAT x1, y1, x2, y2; + FLOAT u, v; + const RECT *src_rect = &self->src_rect; + const RECT *dest_rect = &self->dest_rect; + gint texture_width = self->input_texture_width; + gint texture_height = self->input_texture_height; + gdouble val; + + context_handle = gst_d3d11_device_get_device_context_handle (self->device); + + hr = context_handle->Map (self->vertex_buffer, 0, D3D11_MAP_WRITE_DISCARD, + 0, &map); + + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR ("Couldn't map vertex buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + vertex_data = (VertexData *) map.pData; + /* bottom left */ + gst_util_fraction_to_double (dest_rect->left, + GST_VIDEO_INFO_WIDTH (&self->out_info), &val); + x1 = (val * 2.0f) - 1.0f; + + gst_util_fraction_to_double (dest_rect->bottom, + GST_VIDEO_INFO_HEIGHT (&self->out_info), &val); + y1 = (val * -2.0f) + 1.0f; + + /* top right */ + gst_util_fraction_to_double (dest_rect->right, + GST_VIDEO_INFO_WIDTH (&self->out_info), &val); + x2 = (val * 2.0f) - 1.0f; + + gst_util_fraction_to_double (dest_rect->top, + GST_VIDEO_INFO_HEIGHT (&self->out_info), &val); + y2 = (val * -2.0f) + 1.0f; + + /* bottom left */ + u = (src_rect->left / (gfloat) texture_width) - 0.5f / texture_width; + v = (src_rect->bottom / (gfloat) texture_height) - 0.5f / texture_height; + + vertex_data[0].position.x = x1; + vertex_data[0].position.y = y1; + vertex_data[0].position.z = 0.0f; + vertex_data[0].texture.x = u; + vertex_data[0].texture.y 
= v; + + /* top left */ + u = (src_rect->left / (gfloat) texture_width) - 0.5f / texture_width; + v = (src_rect->top / (gfloat) texture_height) - 0.5f / texture_height; + + vertex_data[1].position.x = x1; + vertex_data[1].position.y = y2; + vertex_data[1].position.z = 0.0f; + vertex_data[1].texture.x = u; + vertex_data[1].texture.y = v; + + /* top right */ + u = (src_rect->right / (gfloat) texture_width) - 0.5f / texture_width; + v = (src_rect->top / (gfloat) texture_height) - 0.5f / texture_height; + + vertex_data[2].position.x = x2; + vertex_data[2].position.y = y2; + vertex_data[2].position.z = 0.0f; + vertex_data[2].texture.x = u; + vertex_data[2].texture.y = v; + + /* bottom right */ + u = (src_rect->right / (gfloat) texture_width) - 0.5f / texture_width; + v = (src_rect->bottom / (gfloat) texture_height) - 0.5f / texture_height; + + vertex_data[3].position.x = x2; + vertex_data[3].position.y = y1; + vertex_data[3].position.z = 0.0f; + vertex_data[3].texture.x = u; + vertex_data[3].texture.y = v; + + context_handle->Unmap (self->vertex_buffer, 0); + + self->update_vertex = FALSE; + + return TRUE; +} + +gboolean +gst_d3d11_converter_convert (GstD3D11Converter * converter, + ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], + ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES], + ID3D11BlendState * blend, gfloat blend_factor[4]) +{ + gboolean ret; + + g_return_val_if_fail (converter != NULL, FALSE); + g_return_val_if_fail (srv != NULL, FALSE); + g_return_val_if_fail (rtv != NULL, FALSE); + + gst_d3d11_device_lock (converter->device); + ret = gst_d3d11_converter_convert_unlocked (converter, + srv, rtv, blend, blend_factor); + gst_d3d11_device_unlock (converter->device); + + return ret; +} + +gboolean +gst_d3d11_converter_convert_unlocked (GstD3D11Converter * converter, + ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], + ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES], + ID3D11BlendState * blend, gfloat blend_factor[4]) +{ + gboolean ret; + /* 
*INDENT-OFF* */ + ComPtr<ID3D11Resource> resource; + ComPtr<ID3D11Texture2D> texture; + /* *INDENT-ON* */ + D3D11_TEXTURE2D_DESC desc; + + g_return_val_if_fail (converter != NULL, FALSE); + g_return_val_if_fail (srv != NULL, FALSE); + g_return_val_if_fail (rtv != NULL, FALSE); + + /* check texture resolution and update crop area */ + srv[0]->GetResource (&resource); + resource.As (&texture); + texture->GetDesc (&desc); + + if (converter->update_vertex || + desc.Width != (guint) converter->input_texture_width || + desc.Height != (guint) converter->input_texture_height) { + GST_DEBUG ("Update vertex buffer, texture resolution: %dx%d", + desc.Width, desc.Height); + + converter->input_texture_width = desc.Width; + converter->input_texture_height = desc.Height; + + if (!gst_d3d11_converter_update_vertex_buffer (converter)) { + GST_ERROR ("Cannot update vertex buffer"); + return FALSE; + } + } + + if (converter->update_alpha) { + D3D11_MAPPED_SUBRESOURCE map; + ID3D11DeviceContext *context_handle; + AlphaConstBuffer *alpha_const; + HRESULT hr; + + g_assert (converter->alpha_const_buffer != nullptr); + + context_handle = + gst_d3d11_device_get_device_context_handle (converter->device); + + hr = context_handle->Map (converter->alpha_const_buffer, + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, converter->device)) { + GST_ERROR ("Couldn't map constant buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + alpha_const = (AlphaConstBuffer *) map.pData; + alpha_const->alpha_mul = (FLOAT) converter->alpha; + + context_handle->Unmap (converter->alpha_const_buffer, 0); + converter->update_alpha = FALSE; + } + + ret = gst_d3d11_draw_quad_unlocked (converter->quad[0], converter->viewport, + 1, srv, converter->num_input_view, rtv, 1, blend, blend_factor, + &converter->linear_sampler, 1); + + if (!ret) + return FALSE; + + if (converter->quad[1]) { + ret = gst_d3d11_draw_quad_unlocked (converter->quad[1], + &converter->viewport[1], converter->num_output_view -
1, + srv, converter->num_input_view, &rtv[1], converter->num_output_view - 1, + blend, blend_factor, &converter->linear_sampler, 1); + + if (!ret) + return FALSE; + } + + return TRUE; +} + +gboolean +gst_d3d11_converter_update_viewport (GstD3D11Converter * converter, + D3D11_VIEWPORT * viewport) +{ + g_return_val_if_fail (converter != NULL, FALSE); + g_return_val_if_fail (viewport != NULL, FALSE); + + converter->viewport[0] = *viewport; + + switch (GST_VIDEO_INFO_FORMAT (&converter->out_info)) { + case GST_VIDEO_FORMAT_NV12: + case GST_VIDEO_FORMAT_NV21: + case GST_VIDEO_FORMAT_P010_10LE: + case GST_VIDEO_FORMAT_P012_LE: + case GST_VIDEO_FORMAT_P016_LE: + case GST_VIDEO_FORMAT_I420: + case GST_VIDEO_FORMAT_YV12: + case GST_VIDEO_FORMAT_I420_10LE: + case GST_VIDEO_FORMAT_I420_12LE: + { + guint i; + converter->viewport[1].TopLeftX = converter->viewport[0].TopLeftX / 2; + converter->viewport[1].TopLeftY = converter->viewport[0].TopLeftY / 2; + converter->viewport[1].Width = converter->viewport[0].Width / 2; + converter->viewport[1].Height = converter->viewport[0].Height / 2; + + for (i = 2; i < GST_VIDEO_INFO_N_PLANES (&converter->out_info); i++) + converter->viewport[i] = converter->viewport[1]; + + break; + } + case GST_VIDEO_FORMAT_Y42B: + case GST_VIDEO_FORMAT_I422_10LE: + case GST_VIDEO_FORMAT_I422_12LE: + { + guint i; + converter->viewport[1].TopLeftX = converter->viewport[0].TopLeftX / 2; + converter->viewport[1].TopLeftY = converter->viewport[0].TopLeftY; + converter->viewport[1].Width = converter->viewport[0].Width / 2; + converter->viewport[1].Height = converter->viewport[0].Height; + + for (i = 2; i < GST_VIDEO_INFO_N_PLANES (&converter->out_info); i++) + converter->viewport[i] = converter->viewport[1]; + + break; + } + case GST_VIDEO_FORMAT_Y444: + case GST_VIDEO_FORMAT_Y444_10LE: + case GST_VIDEO_FORMAT_Y444_12LE: + case GST_VIDEO_FORMAT_Y444_16LE: + { + guint i; + for (i = 1; i < GST_VIDEO_INFO_N_PLANES (&converter->out_info); i++) + 
converter->viewport[i] = converter->viewport[0]; + break; + } + default: + if (converter->num_output_view > 1) + g_assert_not_reached (); + break; + } + + return TRUE; +} + +gboolean +gst_d3d11_converter_update_src_rect (GstD3D11Converter * converter, + RECT * src_rect) +{ + g_return_val_if_fail (converter != NULL, FALSE); + g_return_val_if_fail (src_rect != NULL, FALSE); + + gst_d3d11_device_lock (converter->device); + if (converter->src_rect.left != src_rect->left || + converter->src_rect.top != src_rect->top || + converter->src_rect.right != src_rect->right || + converter->src_rect.bottom != src_rect->bottom) { + converter->src_rect = *src_rect; + + /* vertex buffer will be updated on next convert() call */ + converter->update_vertex = TRUE; + } + gst_d3d11_device_unlock (converter->device); + + return TRUE; +} + +gboolean +gst_d3d11_converter_update_dest_rect (GstD3D11Converter * converter, + RECT * dest_rect) +{ + g_return_val_if_fail (converter != NULL, FALSE); + g_return_val_if_fail (dest_rect != NULL, FALSE); + + gst_d3d11_device_lock (converter->device); + if (converter->dest_rect.left != dest_rect->left || + converter->dest_rect.top != dest_rect->top || + converter->dest_rect.right != dest_rect->right || + converter->dest_rect.bottom != dest_rect->bottom) { + converter->dest_rect = *dest_rect; + + /* vertex buffer will be updated on next convert() call */ + converter->update_vertex = TRUE; + } + gst_d3d11_device_unlock (converter->device); + + return TRUE; +} + +gboolean +gst_d3d11_converter_update_config (GstD3D11Converter * converter, + GstStructure * config) +{ + g_return_val_if_fail (converter != nullptr, FALSE); + g_return_val_if_fail (config != nullptr, FALSE); + + gst_d3d11_device_lock (converter->device); + gst_d3d11_converter_set_config (converter, config); + + /* Check whether options are updated or not */ + if (converter->alpha_const_buffer) { + gdouble alpha = GET_OPT_ALPHA_VALUE (converter); + + if (alpha != converter->alpha) { + GST_DEBUG
("Updating alpha %lf -> %lf", converter->alpha, alpha); + converter->alpha = alpha; + converter->update_alpha = TRUE; + } + } + gst_d3d11_device_unlock (converter->device); + + return TRUE; +}
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11converter.h
Added
@@ -0,0 +1,72 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_COLOR_CONVERTER_H__ +#define __GST_D3D11_COLOR_CONVERTER_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11.h> + +G_BEGIN_DECLS + +typedef struct _GstD3D11Converter GstD3D11Converter; + +/** + * GST_D3D11_CONVERTER_OPT_ALPHA_VALUE + * + * #G_TYPE_FLOAT, the alpha value color value to use. 
+ * Default is 1.0 + */ +#define GST_D3D11_CONVERTER_OPT_ALPHA_VALUE "GstD3D11Converter.alpha-value" + +GstD3D11Converter * gst_d3d11_converter_new (GstD3D11Device * device, + GstVideoInfo * in_info, + GstVideoInfo * out_info, + GstStructure * config); + +void gst_d3d11_converter_free (GstD3D11Converter * converter); + +gboolean gst_d3d11_converter_convert (GstD3D11Converter * converter, + ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES], + ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES], + ID3D11BlendState *blend, + gfloat blend_factor[4]); + +gboolean gst_d3d11_converter_convert_unlocked (GstD3D11Converter * converter, + ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES], + ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES], + ID3D11BlendState *blend, + gfloat blend_factor[4]); + +gboolean gst_d3d11_converter_update_viewport (GstD3D11Converter * converter, + D3D11_VIEWPORT * viewport); + +gboolean gst_d3d11_converter_update_src_rect (GstD3D11Converter * converter, + RECT * src_rect); + +gboolean gst_d3d11_converter_update_dest_rect (GstD3D11Converter * converter, + RECT * dest_rect); + +gboolean gst_d3d11_converter_update_config (GstD3D11Converter * converter, + GstStructure * config); + +G_END_DECLS + +#endif /* __GST_D3D11_COLOR_CONVERTER_H__ */
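The header exposes converter options as named fields of the `GstStructure` passed to `gst_d3d11_converter_new()`/`_update_config()`; the `.c` file's `GET_OPT_ALPHA_VALUE` reads `"GstD3D11Converter.alpha-value"` and falls back to the documented default of 1.0 when the field is absent. A tiny stand-in for that lookup-with-default, without GLib (the `opt_t` table and `get_opt_double()` are illustrative, not part of the real API):

```c
#include <string.h>
#include <assert.h>

/* Hypothetical flat key/value table standing in for a GstStructure */
typedef struct { const char *key; double value; } opt_t;

/* Return the option value if the key is present, else the default --
 * the same shape as GET_OPT_ALPHA_VALUE's lookup with default 1.0. */
static double
get_opt_double (const opt_t * opts, int n, const char *key, double def)
{
  for (int i = 0; i < n; i++)
    if (strcmp (opts[i].key, key) == 0)
      return opts[i].value;
  return def;           /* option absent: fall back to the default */
}
```

Because the default is applied at lookup time, a caller that never sets the option still gets fully opaque output (alpha 1.0) without any extra configuration.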
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11decoder.cpp
Added
@@ -0,0 +1,2245 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + * + * NOTE: some of implementations are copied/modified from Chromium code + * + * Copyright 2015 The Chromium Authors. All rights reserved. + * + * Redistribution and use in source and binary forms, with or without + * modification, are permitted provided that the following conditions are + * met: + * + * * Redistributions of source code must retain the above copyright + * notice, this list of conditions and the following disclaimer. + * * Redistributions in binary form must reproduce the above + * copyright notice, this list of conditions and the following disclaimer + * in the documentation and/or other materials provided with the + * distribution. + * * Neither the name of Google Inc. nor the names of its + * contributors may be used to endorse or promote products derived from + * this software without specific prior written permission. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR + * A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT + * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, + * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT + * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, + * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY + * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT + * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE + * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11decoder.h" +#include "gstd3d11converter.h" +#include "gstd3d11pluginutils.h" +#include <string.h> +#include <string> + +#ifdef HAVE_WINMM +#include <timeapi.h> +#endif + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_decoder_debug); +#define GST_CAT_DEFAULT gst_d3d11_decoder_debug + +/* GUID might not be defined in MinGW header */ +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_H264_IDCT_FGT, 0x1b81be67, 0xa0c7, + 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_NOFGT, 0x1b81be68, 0xa0c7, + 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_FGT, 0x1b81be69, 0xa0c7, + 0x11d3, 0xb9, 0x84, 0x00, 0xc0, 0x4f, 0x2e, 0x73, 0xc5); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN, + 0x5b11d51b, 0x2f4c, 0x4452, 0xbc, 0xc3, 0x09, 0xf2, 0xa1, 0x16, 0x0c, 0xc0); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN10, + 0x107af0e0, 0xef1a, 0x4d19, 0xab, 0xa8, 0x67, 0xa1, 0x63, 0x07, 0x3d, 0x13); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_VP8_VLD, + 0x90b899ea, 0x3a62, 0x4705, 0x88, 0xb3, 0x8d, 0xf0, 0x4b, 0x27, 0x44, 0xe7); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_PROFILE0, + 0x463707f8, 0xa1d0, 0x4585, 0x87, 0x6d, 0x83, 0xaa, 0x6d, 0x60, 0xb8, 0x9e); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_10BIT_PROFILE2, + 0xa4c749ef, 
0x6ecf, 0x48aa, 0x84, 0x48, 0x50, 0xa7, 0xa1, 0x16, 0x5f, 0xf7); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_MPEG2_VLD, 0xee27417f, 0x5e28, + 0x4e65, 0xbe, 0xea, 0x1d, 0x26, 0xb5, 0x08, 0xad, 0xc9); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_MPEG2and1_VLD, 0x86695f12, 0x340e, + 0x4f04, 0x9f, 0xd3, 0x92, 0x53, 0xdd, 0x32, 0x74, 0x60); +DEFINE_GUID (GST_GUID_D3D11_DECODER_PROFILE_AV1_VLD_PROFILE0, 0xb8be4ccb, + 0xcf53, 0x46ba, 0x8d, 0x59, 0xd6, 0xb8, 0xa6, 0xda, 0x5d, 0x2a); + +static const GUID *profile_h264_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_H264_IDCT_FGT, + &GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_NOFGT, + &GST_GUID_D3D11_DECODER_PROFILE_H264_VLD_FGT, +}; + +static const GUID *profile_hevc_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN, +}; + +static const GUID *profile_hevc_10_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_HEVC_VLD_MAIN10, +}; + +static const GUID *profile_vp8_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_VP8_VLD, +}; + +static const GUID *profile_vp9_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_PROFILE0, +}; + +static const GUID *profile_vp9_10_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_VP9_VLD_10BIT_PROFILE2, +}; + +static const GUID *profile_mpeg2_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_MPEG2_VLD, + &GST_GUID_D3D11_DECODER_PROFILE_MPEG2and1_VLD +}; + +static const GUID *profile_av1_list[] = { + &GST_GUID_D3D11_DECODER_PROFILE_AV1_VLD_PROFILE0, + /* TODO: add more profile */ +}; + +enum +{ + PROP_0, + PROP_DEVICE, +}; + +struct _GstD3D11Decoder +{ + GstObject parent; + + gboolean configured; + gboolean opened; + + GstD3D11Device *device; + + ID3D11VideoDevice *video_device; + ID3D11VideoContext *video_context; + + ID3D11VideoDecoder *decoder_handle; + + GstVideoInfo info; + GstVideoInfo output_info; + GstDXVACodec codec; + gint coded_width; + gint coded_height; + DXGI_FORMAT decoder_format; + gboolean downstream_supports_d3d11; + + GstVideoCodecState *input_state; + GstVideoCodecState *output_state; + + /* 
Protect internal pool */ + GMutex internal_pool_lock; + + GstBufferPool *internal_pool; + /* Internal pool params */ + gint aligned_width; + gint aligned_height; + gboolean use_array_of_texture; + guint dpb_size; + guint downstream_min_buffers; + gboolean wait_on_pool_full; + + /* Used for array-of-texture */ + guint8 next_view_id; + + /* for staging */ + ID3D11Texture2D *staging; + gsize staging_texture_offset[GST_VIDEO_MAX_PLANES]; + gint stating_texture_stride[GST_VIDEO_MAX_PLANES]; + + GUID decoder_profile; + + /* For device specific workaround */ + gboolean can_direct_rendering; + + /* For high precision clock */ + guint timer_resolution; +}; + +static void gst_d3d11_decoder_constructed (GObject * object); +static void gst_d3d11_decoder_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_d3d11_decoder_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); +static void gst_d3d11_decoder_dispose (GObject * obj); +static void gst_d3d11_decoder_finalize (GObject * obj); +static gboolean gst_d3d11_decoder_can_direct_render (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, GstBuffer * view_buffer, + gint display_width, gint display_height); + +#define parent_class gst_d3d11_decoder_parent_class +G_DEFINE_TYPE (GstD3D11Decoder, gst_d3d11_decoder, GST_TYPE_OBJECT); + +static void +gst_d3d11_decoder_class_init (GstD3D11DecoderClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->constructed = gst_d3d11_decoder_constructed; + gobject_class->set_property = gst_d3d11_decoder_set_property; + gobject_class->get_property = gst_d3d11_decoder_get_property; + gobject_class->dispose = gst_d3d11_decoder_dispose; + gobject_class->finalize = gst_d3d11_decoder_finalize; + + g_object_class_install_property (gobject_class, PROP_DEVICE, + g_param_spec_object ("device", "Device", + "D3D11 Devicd to use", GST_TYPE_D3D11_DEVICE, + (GParamFlags) 
(G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | + G_PARAM_STATIC_STRINGS))); +} + +static void +gst_d3d11_decoder_init (GstD3D11Decoder * self) +{ + g_mutex_init (&self->internal_pool_lock); +} + +static void +gst_d3d11_decoder_constructed (GObject * object) +{ + GstD3D11Decoder *self = GST_D3D11_DECODER (object); + ID3D11VideoDevice *video_device; + ID3D11VideoContext *video_context; + + if (!self->device) { + GST_ERROR_OBJECT (self, "No D3D11Device available"); + return; + } + + video_device = gst_d3d11_device_get_video_device_handle (self->device); + if (!video_device) { + GST_WARNING_OBJECT (self, "ID3D11VideoDevice is not available"); + return; + } + + video_context = gst_d3d11_device_get_video_context_handle (self->device); + if (!video_context) { + GST_WARNING_OBJECT (self, "ID3D11VideoContext is not available"); + return; + } + + self->video_device = video_device; + video_device->AddRef (); + + self->video_context = video_context; + video_context->AddRef (); + + return; +} + +static void +gst_d3d11_decoder_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11Decoder *self = GST_D3D11_DECODER (object); + + switch (prop_id) { + case PROP_DEVICE: + self->device = (GstD3D11Device *) g_value_dup_object (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_decoder_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11Decoder *self = GST_D3D11_DECODER (object); + + switch (prop_id) { + case PROP_DEVICE: + g_value_set_object (value, self->device); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_decoder_clear_resource (GstD3D11Decoder * self) +{ + g_mutex_lock (&self->internal_pool_lock); + if (self->internal_pool) { + gst_buffer_pool_set_active (self->internal_pool, FALSE); + gst_clear_object (&self->internal_pool); + } + 
g_mutex_unlock (&self->internal_pool_lock); + + GST_D3D11_CLEAR_COM (self->decoder_handle); + GST_D3D11_CLEAR_COM (self->staging); + + memset (self->staging_texture_offset, + 0, sizeof (self->staging_texture_offset)); + memset (self->stating_texture_stride, + 0, sizeof (self->stating_texture_stride)); +} + +static void +gst_d3d11_decoder_reset (GstD3D11Decoder * self) +{ + gst_d3d11_decoder_clear_resource (self); + + self->dpb_size = 0; + self->downstream_min_buffers = 0; + + self->configured = FALSE; + self->opened = FALSE; + + self->use_array_of_texture = FALSE; + self->downstream_supports_d3d11 = FALSE; + + g_clear_pointer (&self->output_state, gst_video_codec_state_unref); + g_clear_pointer (&self->input_state, gst_video_codec_state_unref); +} + +static void +gst_d3d11_decoder_dispose (GObject * obj) +{ + GstD3D11Decoder *self = GST_D3D11_DECODER (obj); + + gst_d3d11_decoder_reset (self); + + GST_D3D11_CLEAR_COM (self->video_device); + GST_D3D11_CLEAR_COM (self->video_context); + + gst_clear_object (&self->device); + + G_OBJECT_CLASS (parent_class)->dispose (obj); +} + +static void +gst_d3d11_decoder_finalize (GObject * obj) +{ + GstD3D11Decoder *self = GST_D3D11_DECODER (obj); + +#if HAVE_WINMM + /* Restore clock precision */ + if (self->timer_resolution) + timeEndPeriod (self->timer_resolution); +#endif + + g_mutex_clear (&self->internal_pool_lock); + + G_OBJECT_CLASS (parent_class)->finalize (obj); +} + +GstD3D11Decoder * +gst_d3d11_decoder_new (GstD3D11Device * device, GstDXVACodec codec) +{ + GstD3D11Decoder *self; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), nullptr); + g_return_val_if_fail (codec > GST_DXVA_CODEC_NONE, nullptr); + g_return_val_if_fail (codec < GST_DXVA_CODEC_LAST, nullptr); + + self = (GstD3D11Decoder *) + g_object_new (GST_TYPE_D3D11_DECODER, "device", device, NULL); + + if (!self->video_device || !self->video_context) { + gst_object_unref (self); + return NULL; + } + + self->codec = codec; + + gst_object_ref_sink (self); + + 
return self; +} + +gboolean +gst_d3d11_decoder_is_configured (GstD3D11Decoder * decoder) +{ + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + + return decoder->configured; +} + +static GQuark +gst_d3d11_decoder_view_id_quark (void) +{ + static gsize id_quark = 0; + + if (g_once_init_enter (&id_quark)) { + GQuark quark = g_quark_from_string ("GstD3D11DecoderViewId"); + g_once_init_leave (&id_quark, quark); + } + + return (GQuark) id_quark; +} + +static gboolean +gst_d3d11_decoder_ensure_output_view (GstD3D11Decoder * self, + GstBuffer * buffer) +{ + GstD3D11Memory *mem; + gpointer val = NULL; + + mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, 0); + if (!gst_d3d11_memory_get_decoder_output_view (mem, self->video_device, + &self->decoder_profile)) { + GST_ERROR_OBJECT (self, "Decoder output view is unavailable"); + return FALSE; + } + + if (!self->use_array_of_texture) + return TRUE; + + val = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem), + gst_d3d11_decoder_view_id_quark ()); + if (!val) { + g_assert (self->next_view_id < 128); + g_assert (self->next_view_id > 0); + + gst_mini_object_set_qdata (GST_MINI_OBJECT (mem), + gst_d3d11_decoder_view_id_quark (), + GUINT_TO_POINTER (self->next_view_id), NULL); + + self->next_view_id++; + /* valid view range is [0, 126], but 0 is not used to here + * (it's NULL as well) */ + self->next_view_id %= 128; + if (self->next_view_id == 0) + self->next_view_id = 1; + } + + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_prepare_output_view_pool (GstD3D11Decoder * self) +{ + GstD3D11AllocationParams *alloc_params = NULL; + GstBufferPool *pool = NULL; + GstCaps *caps = NULL; + GstVideoAlignment align; + GstD3D11AllocationFlags alloc_flags = (GstD3D11AllocationFlags) 0; + gint bind_flags = D3D11_BIND_DECODER; + GstVideoInfo *info = &self->info; + guint pool_size; + + g_mutex_lock (&self->internal_pool_lock); + if (self->internal_pool) { + gst_buffer_pool_set_active (self->internal_pool, FALSE); + 
gst_clear_object (&self->internal_pool); + } + g_mutex_unlock (&self->internal_pool_lock); + + if (!self->use_array_of_texture) { + alloc_flags = GST_D3D11_ALLOCATION_FLAG_TEXTURE_ARRAY; + } else { + /* array of texture can have shader resource view */ + bind_flags |= D3D11_BIND_SHADER_RESOURCE; + } + + alloc_params = gst_d3d11_allocation_params_new (self->device, info, + alloc_flags, bind_flags); + + if (!alloc_params) { + GST_ERROR_OBJECT (self, "Failed to create allocation param"); + goto error; + } + + pool_size = self->dpb_size + self->downstream_min_buffers; + GST_DEBUG_OBJECT (self, + "Configuring internal pool with size %d " + "(dpb size: %d, downstream min buffers: %d)", pool_size, self->dpb_size, + self->downstream_min_buffers); + + if (!self->use_array_of_texture) { + alloc_params->desc[0].ArraySize = pool_size; + } else { + /* Valid view id is [0, 126], but we will use [1, 127] range so that + * it can be used by qdata, because zero is equal to null */ + self->next_view_id = 1; + + /* our pool size can be increased as much as possbile */ + pool_size = 0; + } + + gst_video_alignment_reset (&align); + + align.padding_right = self->aligned_width - GST_VIDEO_INFO_WIDTH (info); + align.padding_bottom = self->aligned_height - GST_VIDEO_INFO_HEIGHT (info); + if (!gst_d3d11_allocation_params_alignment (alloc_params, &align)) { + GST_ERROR_OBJECT (self, "Cannot set alignment"); + goto error; + } + + caps = gst_video_info_to_caps (info); + if (!caps) { + GST_ERROR_OBJECT (self, "Couldn't convert video info to caps"); + goto error; + } + + pool = gst_d3d11_buffer_pool_new_with_options (self->device, + caps, alloc_params, 0, pool_size); + gst_clear_caps (&caps); + g_clear_pointer (&alloc_params, gst_d3d11_allocation_params_free); + + if (!pool) { + GST_ERROR_OBJECT (self, "Failed to create buffer pool"); + goto error; + } + + if (!gst_buffer_pool_set_active (pool, TRUE)) { + GST_ERROR_OBJECT (self, "Couldn't activate pool"); + goto error; + } + + g_mutex_lock 
(&self->internal_pool_lock); + self->internal_pool = pool; + g_mutex_unlock (&self->internal_pool_lock); + + return TRUE; + +error: + if (alloc_params) + gst_d3d11_allocation_params_free (alloc_params); + if (pool) + gst_object_unref (pool); + if (caps) + gst_caps_unref (caps); + + return FALSE; +} + +static const gchar * +gst_dxva_codec_to_string (GstDXVACodec codec) +{ + switch (codec) { + case GST_DXVA_CODEC_NONE: + return "none"; + case GST_DXVA_CODEC_H264: + return "H.264"; + case GST_DXVA_CODEC_VP9: + return "VP9"; + case GST_DXVA_CODEC_H265: + return "H.265"; + case GST_DXVA_CODEC_VP8: + return "VP8"; + case GST_DXVA_CODEC_MPEG2: + return "MPEG2"; + case GST_DXVA_CODEC_AV1: + return "AV1"; + default: + g_assert_not_reached (); + break; + } + + return "Unknown"; +} + +gboolean +gst_d3d11_decoder_get_supported_decoder_profile (GstD3D11Device * device, + GstDXVACodec codec, GstVideoFormat format, const GUID ** selected_profile) +{ + GUID *guid_list = nullptr; + const GUID *profile = nullptr; + guint available_profile_count; + guint i, j; + HRESULT hr; + ID3D11VideoDevice *video_device; + const GUID **profile_list = nullptr; + guint profile_size = 0; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); + g_return_val_if_fail (selected_profile != nullptr, FALSE); + + video_device = gst_d3d11_device_get_video_device_handle (device); + if (!video_device) + return FALSE; + + switch (codec) { + case GST_DXVA_CODEC_H264: + if (format == GST_VIDEO_FORMAT_NV12) { + profile_list = profile_h264_list; + profile_size = G_N_ELEMENTS (profile_h264_list); + } + break; + case GST_DXVA_CODEC_H265: + if (format == GST_VIDEO_FORMAT_NV12) { + profile_list = profile_hevc_list; + profile_size = G_N_ELEMENTS (profile_hevc_list); + } else if (format == GST_VIDEO_FORMAT_P010_10LE) { + profile_list = profile_hevc_10_list; + profile_size = G_N_ELEMENTS (profile_hevc_10_list); + } + break; + case GST_DXVA_CODEC_VP8: + if (format == GST_VIDEO_FORMAT_NV12) { + profile_list = 
profile_vp8_list; + profile_size = G_N_ELEMENTS (profile_vp8_list); + } + break; + case GST_DXVA_CODEC_VP9: + if (format == GST_VIDEO_FORMAT_NV12) { + profile_list = profile_vp9_list; + profile_size = G_N_ELEMENTS (profile_vp9_list); + } else if (format == GST_VIDEO_FORMAT_P010_10LE) { + profile_list = profile_vp9_10_list; + profile_size = G_N_ELEMENTS (profile_vp9_10_list); + } + break; + case GST_DXVA_CODEC_MPEG2: + if (format == GST_VIDEO_FORMAT_NV12) { + profile_list = profile_mpeg2_list; + profile_size = G_N_ELEMENTS (profile_mpeg2_list); + } + break; + case GST_DXVA_CODEC_AV1: + profile_list = profile_av1_list; + profile_size = G_N_ELEMENTS (profile_av1_list); + break; + default: + break; + } + + if (!profile_list) { + GST_ERROR_OBJECT (device, + "Not supported codec (%d) and format (%s) configuration", codec, + gst_video_format_to_string (format)); + return FALSE; + } + + available_profile_count = video_device->GetVideoDecoderProfileCount (); + + if (available_profile_count == 0) { + GST_INFO_OBJECT (device, "No available decoder profile"); + return FALSE; + } + + GST_DEBUG_OBJECT (device, + "Have %u available decoder profiles", available_profile_count); + guid_list = (GUID *) g_alloca (sizeof (GUID) * available_profile_count); + + for (i = 0; i < available_profile_count; i++) { + hr = video_device->GetVideoDecoderProfile (i, &guid_list[i]); + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (device, "Failed to get %d th decoder profile", i); + return FALSE; + } + } + +#ifndef GST_DISABLE_GST_DEBUG + GST_LOG_OBJECT (device, "Supported decoder GUID"); + for (i = 0; i < available_profile_count; i++) { + const GUID *guid = &guid_list[i]; + + GST_LOG_OBJECT (device, + "\t { %8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x }", + (guint) guid->Data1, (guint) guid->Data2, (guint) guid->Data3, + guid->Data4[0], guid->Data4[1], guid->Data4[2], guid->Data4[3], + guid->Data4[4], guid->Data4[5], guid->Data4[6], guid->Data4[7]); + } + + 
GST_LOG_OBJECT (device, "Requested decoder GUID"); + for (i = 0; i < profile_size; i++) { + const GUID *guid = profile_list[i]; + + GST_LOG_OBJECT (device, + "\t { %8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x }", + (guint) guid->Data1, (guint) guid->Data2, (guint) guid->Data3, + guid->Data4[0], guid->Data4[1], guid->Data4[2], guid->Data4[3], + guid->Data4[4], guid->Data4[5], guid->Data4[6], guid->Data4[7]); + } +#endif + + for (i = 0; i < profile_size; i++) { + for (j = 0; j < available_profile_count; j++) { + if (IsEqualGUID (*profile_list[i], guid_list[j])) { + profile = profile_list[i]; + break; + } + } + } + + if (!profile) { + GST_INFO_OBJECT (device, "No supported decoder profile for %s codec", + gst_dxva_codec_to_string (codec)); + return FALSE; + } + + *selected_profile = profile; + + GST_DEBUG_OBJECT (device, + "Selected guid " + "{ %8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x }", + (guint) profile->Data1, (guint) profile->Data2, (guint) profile->Data3, + profile->Data4[0], profile->Data4[1], profile->Data4[2], + profile->Data4[3], profile->Data4[4], profile->Data4[5], + profile->Data4[6], profile->Data4[7]); + + return TRUE; +} + + +gboolean +gst_d3d11_decoder_configure (GstD3D11Decoder * decoder, + GstVideoCodecState * input_state, GstVideoInfo * info, gint coded_width, + gint coded_height, guint dpb_size) +{ + const GstD3D11Format *d3d11_format; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + g_return_val_if_fail (info != NULL, FALSE); + g_return_val_if_fail (input_state != NULL, FALSE); + g_return_val_if_fail (coded_width >= GST_VIDEO_INFO_WIDTH (info), FALSE); + g_return_val_if_fail (coded_height >= GST_VIDEO_INFO_HEIGHT (info), FALSE); + g_return_val_if_fail (dpb_size > 0, FALSE); + + gst_d3d11_decoder_reset (decoder); + + d3d11_format = gst_d3d11_device_format_from_gst (decoder->device, + GST_VIDEO_INFO_FORMAT (info)); + if (!d3d11_format || d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + 
GST_ERROR_OBJECT (decoder, "Could not determine dxgi format from %s", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (info))); + return FALSE; + } + + decoder->input_state = gst_video_codec_state_ref (input_state); + decoder->info = decoder->output_info = *info; + decoder->coded_width = coded_width; + decoder->coded_height = coded_height; + decoder->dpb_size = dpb_size; + decoder->decoder_format = d3d11_format->dxgi_format; + + decoder->configured = TRUE; + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_ensure_staging_texture (GstD3D11Decoder * self) +{ + ID3D11Device *device_handle; + D3D11_TEXTURE2D_DESC desc = { 0, }; + HRESULT hr; + + if (self->staging) + return TRUE; + + device_handle = gst_d3d11_device_get_device_handle (self->device); + + /* create stage texture to copy out */ + desc.Width = self->aligned_width; + desc.Height = self->aligned_height; + desc.MipLevels = 1; + desc.Format = self->decoder_format; + desc.SampleDesc.Count = 1; + desc.ArraySize = 1; + desc.Usage = D3D11_USAGE_STAGING; + desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ; + + hr = device_handle->CreateTexture2D (&desc, NULL, &self->staging); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Couldn't create staging texture"); + return FALSE; + } + + return TRUE; +} + +static void +gst_d3d11_decoder_enable_high_precision_timer (GstD3D11Decoder * self) +{ +#if HAVE_WINMM + GstD3D11DeviceVendor vendor; + + if (self->timer_resolution) + return; + + vendor = gst_d3d11_get_device_vendor (self->device); + /* Do this only for NVIDIA at the moment, other vendors doesn't seem to be + * requiring retry for BeginFrame() */ + if (vendor == GST_D3D11_DEVICE_VENDOR_NVIDIA) { + TIMECAPS time_caps; + if (timeGetDevCaps (&time_caps, sizeof (TIMECAPS)) == TIMERR_NOERROR) { + guint resolution; + MMRESULT ret; + + resolution = MIN (MAX (time_caps.wPeriodMin, 1), time_caps.wPeriodMax); + + ret = timeBeginPeriod (resolution); + if (ret == TIMERR_NOERROR) { + 
self->timer_resolution = resolution; + GST_INFO_OBJECT (self, "Updated timer resolution to %d", resolution); + } + } + } +#endif +} + +static gboolean +gst_d3d11_decoder_open (GstD3D11Decoder * self) +{ + HRESULT hr; + BOOL can_support = FALSE; + guint config_count; + D3D11_VIDEO_DECODER_CONFIG *config_list; + D3D11_VIDEO_DECODER_CONFIG *best_config = NULL; + D3D11_VIDEO_DECODER_DESC decoder_desc = { 0, }; + const GUID *selected_profile = NULL; + guint i; + gint aligned_width, aligned_height; + guint alignment; + GstD3D11DeviceVendor vendor; + ID3D11VideoDevice *video_device; + GstVideoInfo *info = &self->info; + + if (self->opened) + return TRUE; + + if (!self->configured) { + GST_ERROR_OBJECT (self, "Should configure first"); + return FALSE; + } + + video_device = self->video_device; + + gst_d3d11_device_lock (self->device); + if (!gst_d3d11_decoder_get_supported_decoder_profile (self->device, + self->codec, GST_VIDEO_INFO_FORMAT (info), &selected_profile)) { + goto error; + } + + hr = video_device->CheckVideoDecoderFormat (selected_profile, + self->decoder_format, &can_support); + if (!gst_d3d11_result (hr, self->device) || !can_support) { + GST_ERROR_OBJECT (self, + "VideoDevice could not support dxgi format %d, hr: 0x%x", + self->decoder_format, (guint) hr); + goto error; + } + + gst_d3d11_decoder_clear_resource (self); + self->can_direct_rendering = TRUE; + + vendor = gst_d3d11_get_device_vendor (self->device); + switch (vendor) { + case GST_D3D11_DEVICE_VENDOR_XBOX: + /* FIXME: Need to figure out Xbox device's behavior + * https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1312 + */ + self->can_direct_rendering = FALSE; + break; + default: + break; + } + + /* NOTE: other dxva implementations (ffmpeg and vlc) do this + * and they say the required alignment were mentioned by dxva spec. + * See ff_dxva2_common_frame_params() in dxva.c of ffmpeg and + * directx_va_Setup() in directx_va.c of vlc. + * But... where it is? 
*/ + switch (self->codec) { + case GST_DXVA_CODEC_H265: + case GST_DXVA_CODEC_AV1: + /* See directx_va_Setup() impl. in vlc */ + if (vendor != GST_D3D11_DEVICE_VENDOR_XBOX) + alignment = 128; + else + alignment = 16; + break; + case GST_DXVA_CODEC_MPEG2: + /* XXX: ffmpeg does this */ + alignment = 32; + break; + default: + alignment = 16; + break; + } + + aligned_width = GST_ROUND_UP_N (self->coded_width, alignment); + aligned_height = GST_ROUND_UP_N (self->coded_height, alignment); + if (aligned_width != self->coded_width || + aligned_height != self->coded_height) { + GST_DEBUG_OBJECT (self, + "coded resolution %dx%d is not aligned to %d, adjust to %dx%d", + self->coded_width, self->coded_height, alignment, aligned_width, + aligned_height); + } + + self->aligned_width = aligned_width; + self->aligned_height = aligned_height; + + decoder_desc.SampleWidth = aligned_width; + decoder_desc.SampleHeight = aligned_height; + decoder_desc.OutputFormat = self->decoder_format; + decoder_desc.Guid = *selected_profile; + + hr = video_device->GetVideoDecoderConfigCount (&decoder_desc, &config_count); + if (!gst_d3d11_result (hr, self->device) || config_count == 0) { + GST_ERROR_OBJECT (self, "Could not get decoder config count, hr: 0x%x", + (guint) hr); + goto error; + } + + GST_DEBUG_OBJECT (self, "Total %d config available", config_count); + + config_list = (D3D11_VIDEO_DECODER_CONFIG *) + g_alloca (sizeof (D3D11_VIDEO_DECODER_CONFIG) * config_count); + + for (i = 0; i < config_count; i++) { + hr = video_device->GetVideoDecoderConfig (&decoder_desc, i, + &config_list[i]); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Could not get decoder %dth config, hr: 0x%x", + i, (guint) hr); + goto error; + } + + /* FIXME: need support DXVA_Slice_H264_Long ?? 
*/ + /* this config uses DXVA_Slice_H264_Short */ + switch (self->codec) { + case GST_DXVA_CODEC_H264: + if (config_list[i].ConfigBitstreamRaw == 2) + best_config = &config_list[i]; + break; + case GST_DXVA_CODEC_H265: + case GST_DXVA_CODEC_VP9: + case GST_DXVA_CODEC_VP8: + case GST_DXVA_CODEC_MPEG2: + case GST_DXVA_CODEC_AV1: + if (config_list[i].ConfigBitstreamRaw == 1) + best_config = &config_list[i]; + break; + default: + g_assert_not_reached (); + goto error; + } + + if (best_config) + break; + } + + if (best_config == NULL) { + GST_ERROR_OBJECT (self, "Could not determine decoder config"); + goto error; + } + + GST_DEBUG_OBJECT (self, "ConfigDecoderSpecific 0x%x", + best_config->ConfigDecoderSpecific); + + /* bit 14 is equal to 1b means this config support array of texture and + * it's recommended type as per DXVA spec */ + if ((best_config->ConfigDecoderSpecific & 0x4000) == 0x4000) { + GST_DEBUG_OBJECT (self, "Config support array of texture"); + self->use_array_of_texture = TRUE; + } + + hr = video_device->CreateVideoDecoder (&decoder_desc, + best_config, &self->decoder_handle); + if (!gst_d3d11_result (hr, self->device) || !self->decoder_handle) { + GST_ERROR_OBJECT (self, + "Could not create decoder object, hr: 0x%x", (guint) hr); + goto error; + } + + GST_DEBUG_OBJECT (self, "Decoder object %p created", self->decoder_handle); + + if (!self->downstream_supports_d3d11 && + !gst_d3d11_decoder_ensure_staging_texture (self)) { + GST_ERROR_OBJECT (self, "Couldn't prepare staging texture"); + goto error; + } + + self->decoder_profile = *selected_profile; + + /* Store pool related information here, then we will setup internal pool + * later once the number of min buffer size required by downstream is known. 
+ * Actual buffer pool size will be "dpb_size + downstream_min_buffers" + */ + self->downstream_min_buffers = 0; + self->wait_on_pool_full = FALSE; + + self->opened = TRUE; + gst_d3d11_device_unlock (self->device); + + gst_d3d11_decoder_enable_high_precision_timer (self); + + return TRUE; + +error: + gst_d3d11_decoder_reset (self); + gst_d3d11_device_unlock (self->device); + + return FALSE; +} + +static gboolean +gst_d3d11_decoder_begin_frame (GstD3D11Decoder * decoder, + ID3D11VideoDecoderOutputView * output_view, guint content_key_size, + gconstpointer content_key) +{ + ID3D11VideoContext *video_context; + guint retry_count = 0; + HRESULT hr; + guint retry_threshold = 100; + + /* if we have high resolution timer, do more retry */ + if (decoder->timer_resolution) + retry_threshold = 500; + + video_context = decoder->video_context; + + do { + GST_LOG_OBJECT (decoder, "Try begin frame, retry count %d", retry_count); + hr = video_context->DecoderBeginFrame (decoder->decoder_handle, + output_view, content_key_size, content_key); + + /* HACK: Do retry with 1ms sleep per failure, since DXVA/D3D11 + * doesn't provide API for "GPU-IS-READY-TO-DECODE" like signal. + */ + if (hr == E_PENDING && retry_count < retry_threshold) { + GST_LOG_OBJECT (decoder, "GPU is busy, try again. 
Retry count %d", + retry_count); + g_usleep (1000); + } else { + if (gst_d3d11_result (hr, decoder->device)) + GST_LOG_OBJECT (decoder, "Succeeded with retry count %d", retry_count); + break; + } + + retry_count++; + } while (TRUE); + + if (!gst_d3d11_result (hr, decoder->device)) { + GST_ERROR_OBJECT (decoder, "Failed to begin frame, hr: 0x%x", (guint) hr); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_end_frame (GstD3D11Decoder * decoder) +{ + HRESULT hr; + ID3D11VideoContext *video_context; + + video_context = decoder->video_context; + hr = video_context->DecoderEndFrame (decoder->decoder_handle); + + if (!gst_d3d11_result (hr, decoder->device)) { + GST_WARNING_OBJECT (decoder, "EndFrame failed, hr: 0x%x", (guint) hr); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_get_decoder_buffer (GstD3D11Decoder * decoder, + D3D11_VIDEO_DECODER_BUFFER_TYPE type, guint * buffer_size, + gpointer * buffer) +{ + UINT size; + void *decoder_buffer; + HRESULT hr; + ID3D11VideoContext *video_context; + + video_context = decoder->video_context; + hr = video_context->GetDecoderBuffer (decoder->decoder_handle, + type, &size, &decoder_buffer); + + if (!gst_d3d11_result (hr, decoder->device)) { + GST_WARNING_OBJECT (decoder, "Getting buffer type %d error, hr: 0x%x", + type, (guint) hr); + return FALSE; + } + + *buffer_size = size; + *buffer = decoder_buffer; + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_release_decoder_buffer (GstD3D11Decoder * decoder, + D3D11_VIDEO_DECODER_BUFFER_TYPE type) +{ + HRESULT hr; + ID3D11VideoContext *video_context; + + video_context = decoder->video_context; + hr = video_context->ReleaseDecoderBuffer (decoder->decoder_handle, type); + + if (!gst_d3d11_result (hr, decoder->device)) { + GST_WARNING_OBJECT (decoder, "ReleaseDecoderBuffer failed, hr: 0x%x", + (guint) hr); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_submit_decoder_buffers 
(GstD3D11Decoder * decoder, + guint buffer_count, const D3D11_VIDEO_DECODER_BUFFER_DESC * buffers) +{ + HRESULT hr; + ID3D11VideoContext *video_context; + + video_context = decoder->video_context; + hr = video_context->SubmitDecoderBuffers (decoder->decoder_handle, + buffer_count, buffers); + if (!gst_d3d11_result (hr, decoder->device)) { + GST_WARNING_OBJECT (decoder, "SubmitDecoderBuffers failed, hr: 0x%x", + (guint) hr); + return FALSE; + } + + return TRUE; +} + +gboolean +gst_d3d11_decoder_decode_frame (GstD3D11Decoder * decoder, + ID3D11VideoDecoderOutputView * output_view, + GstD3D11DecodeInputStreamArgs * input_args) +{ + guint d3d11_buffer_size; + gpointer d3d11_buffer; + D3D11_VIDEO_DECODER_BUFFER_DESC buffer_desc[4]; + guint buffer_desc_size; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + g_return_val_if_fail (output_view != nullptr, FALSE); + g_return_val_if_fail (input_args != nullptr, FALSE); + + memset (buffer_desc, 0, sizeof (buffer_desc)); + + buffer_desc[0].BufferType = D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS; + buffer_desc[0].DataSize = input_args->picture_params_size; + + buffer_desc[1].BufferType = D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL; + buffer_desc[1].DataSize = input_args->slice_control_size; + + buffer_desc[2].BufferType = D3D11_VIDEO_DECODER_BUFFER_BITSTREAM; + buffer_desc[2].DataOffset = 0; + buffer_desc[2].DataSize = input_args->bitstream_size; + + buffer_desc_size = 3; + if (input_args->inverse_quantization_matrix && + input_args->inverse_quantization_matrix_size > 0) { + buffer_desc[3].BufferType = + D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX; + buffer_desc[3].DataSize = input_args->inverse_quantization_matrix_size; + buffer_desc_size++; + } + + gst_d3d11_device_lock (decoder->device); + if (!gst_d3d11_decoder_begin_frame (decoder, output_view, 0, nullptr)) { + gst_d3d11_device_unlock (decoder->device); + + return FALSE; + } + + if (!gst_d3d11_decoder_get_decoder_buffer (decoder, + 
D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS, &d3d11_buffer_size, + &d3d11_buffer)) { + GST_ERROR_OBJECT (decoder, + "Failed to get decoder buffer for picture parameters"); + goto error; + } + + if (d3d11_buffer_size < input_args->picture_params_size) { + GST_ERROR_OBJECT (decoder, + "Too small picture param buffer size %d", d3d11_buffer_size); + + gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS); + goto error; + } + + memcpy (d3d11_buffer, input_args->picture_params, + input_args->picture_params_size); + + if (!gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_PICTURE_PARAMETERS)) { + GST_ERROR_OBJECT (decoder, "Failed to release picture param buffer"); + goto error; + } + + if (!gst_d3d11_decoder_get_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL, &d3d11_buffer_size, + &d3d11_buffer)) { + GST_ERROR_OBJECT (decoder, "Failed to get slice control buffer"); + goto error; + } + + if (d3d11_buffer_size < input_args->slice_control_size) { + GST_ERROR_OBJECT (decoder, + "Too small slice control buffer size %d", d3d11_buffer_size); + + gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL); + goto error; + } + + memcpy (d3d11_buffer, + input_args->slice_control, input_args->slice_control_size); + + if (!gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_SLICE_CONTROL)) { + GST_ERROR_OBJECT (decoder, "Failed to release slice control buffer"); + goto error; + } + + if (!gst_d3d11_decoder_get_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_BITSTREAM, &d3d11_buffer_size, + &d3d11_buffer)) { + GST_ERROR_OBJECT (decoder, "Failed to get bitstream buffer"); + goto error; + } + + if (d3d11_buffer_size < input_args->bitstream_size) { + GST_ERROR_OBJECT (decoder, "Too small bitstream buffer size %d", + d3d11_buffer_size); + + gst_d3d11_decoder_release_decoder_buffer (decoder, + 
D3D11_VIDEO_DECODER_BUFFER_BITSTREAM); + goto error; + } + + memcpy (d3d11_buffer, input_args->bitstream, input_args->bitstream_size); + + if (!gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_BITSTREAM)) { + GST_ERROR_OBJECT (decoder, "Failed to release bitstream buffer"); + goto error; + } + + if (input_args->inverse_quantization_matrix_size > 0) { + if (!gst_d3d11_decoder_get_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX, + &d3d11_buffer_size, &d3d11_buffer)) { + GST_ERROR_OBJECT (decoder, + "Failed to get inverse quantization matrix buffer"); + goto error; + } + + if (d3d11_buffer_size < input_args->inverse_quantization_matrix_size) { + GST_ERROR_OBJECT (decoder, + "Too small inverse quantization matrix buffer size %d", + d3d11_buffer_size); + + gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX); + goto error; + } + + memcpy (d3d11_buffer, input_args->inverse_quantization_matrix, + input_args->inverse_quantization_matrix_size); + + if (!gst_d3d11_decoder_release_decoder_buffer (decoder, + D3D11_VIDEO_DECODER_BUFFER_INVERSE_QUANTIZATION_MATRIX)) { + GST_ERROR_OBJECT (decoder, + "Failed to release inverse quantization matrix buffer"); + goto error; + } + } + + if (!gst_d3d11_decoder_submit_decoder_buffers (decoder, + buffer_desc_size, buffer_desc)) { + GST_ERROR_OBJECT (decoder, "Failed to submit decoder buffers"); + goto error; + } + + if (!gst_d3d11_decoder_end_frame (decoder)) { + gst_d3d11_device_unlock (decoder->device); + return FALSE; + } + + gst_d3d11_device_unlock (decoder->device); + + return TRUE; + +error: + gst_d3d11_decoder_end_frame (decoder); + gst_d3d11_device_unlock (decoder->device); + return FALSE; +} + +GstBuffer * +gst_d3d11_decoder_get_output_view_buffer (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec) +{ + GstBuffer *buf = NULL; + GstFlowReturn ret; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), 
FALSE); + + if (!decoder->internal_pool) { + /* Try to negotiate again regardless of the previous negotiation result. + * There could be updated field(s) in sinkpad caps after we negotiated with + * downstream on new_sequence() call. For example, h264/h265 parse + * will be able to update HDR10 related caps field after parsing + * corresponding SEI messages which are usually placed after the essential + * headers */ + gst_video_decoder_negotiate (videodec); + + if (!gst_d3d11_decoder_prepare_output_view_pool (decoder)) { + GST_ERROR_OBJECT (videodec, "Failed to setup internal pool"); + return NULL; + } + } else if (!gst_buffer_pool_set_active (decoder->internal_pool, TRUE)) { + GST_ERROR_OBJECT (videodec, "Couldn't set active internal pool"); + return NULL; + } + + ret = gst_buffer_pool_acquire_buffer (decoder->internal_pool, &buf, NULL); + + if (ret != GST_FLOW_OK || !buf) { + if (ret != GST_FLOW_FLUSHING) { + GST_ERROR_OBJECT (videodec, "Couldn't get buffer from pool, ret %s", + gst_flow_get_name (ret)); + } else { + GST_DEBUG_OBJECT (videodec, "We are flushing"); + } + + return NULL; + } + + if (!gst_d3d11_decoder_ensure_output_view (decoder, buf)) { + GST_ERROR_OBJECT (videodec, "Output view unavailable"); + gst_buffer_unref (buf); + + return NULL; + } + + return buf; +} + +ID3D11VideoDecoderOutputView * +gst_d3d11_decoder_get_output_view_from_buffer (GstD3D11Decoder * decoder, + GstBuffer * buffer, guint8 * index) +{ + GstMemory *mem; + GstD3D11Memory *dmem; + ID3D11VideoDecoderOutputView *view; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), NULL); + g_return_val_if_fail (GST_IS_BUFFER (buffer), NULL); + + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + GST_WARNING_OBJECT (decoder, "Not a d3d11 memory"); + return NULL; + } + + dmem = (GstD3D11Memory *) mem; + view = gst_d3d11_memory_get_decoder_output_view (dmem, decoder->video_device, + &decoder->decoder_profile); + + if (!view) { + GST_ERROR_OBJECT (decoder, "Decoder 
output view is unavailable"); + return NULL; + } + + if (index) { + if (decoder->use_array_of_texture) { + guint8 id; + gpointer val = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem), + gst_d3d11_decoder_view_id_quark ()); + if (!val) { + GST_ERROR_OBJECT (decoder, "memory has no qdata"); + return NULL; + } + + id = (guint8) GPOINTER_TO_UINT (val); + g_assert (id < 128); + + *index = (id - 1); + } else { + *index = gst_d3d11_memory_get_subresource_index (dmem); + } + } + + return view; +} + +static gboolean +copy_to_system (GstD3D11Decoder * self, GstBuffer * decoder_buffer, + GstBuffer * output) +{ + GstVideoFrame out_frame; + GstVideoInfo *info = &self->output_info; + guint i; + GstD3D11Memory *in_mem; + D3D11_MAPPED_SUBRESOURCE map; + HRESULT hr; + ID3D11Texture2D *in_texture; + guint in_subresource_index; + ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (self->device); + + if (!gst_d3d11_decoder_ensure_staging_texture (self)) { + GST_ERROR_OBJECT (self, "Staging texture is not available"); + return FALSE; + } + + if (!gst_video_frame_map (&out_frame, info, output, GST_MAP_WRITE)) { + GST_ERROR_OBJECT (self, "Couldn't map output buffer"); + return FALSE; + } + + in_mem = (GstD3D11Memory *) gst_buffer_peek_memory (decoder_buffer, 0); + + in_texture = gst_d3d11_memory_get_texture_handle (in_mem); + in_subresource_index = gst_d3d11_memory_get_subresource_index (in_mem); + + gst_d3d11_device_lock (self->device); + device_context->CopySubresourceRegion (self->staging, 0, 0, 0, 0, + in_texture, in_subresource_index, NULL); + + hr = device_context->Map (self->staging, 0, D3D11_MAP_READ, 0, &map); + + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Failed to map, hr: 0x%x", (guint) hr); + + gst_d3d11_device_unlock (self->device); + gst_video_frame_unmap (&out_frame); + + return FALSE; + } + + /* calculate stride and offset only once */ + if (self->stating_texture_stride[0] == 0) { + D3D11_TEXTURE2D_DESC desc; + 
gsize dummy; + + self->staging->GetDesc (&desc); + + gst_d3d11_dxgi_format_get_size (desc.Format, desc.Width, desc.Height, + map.RowPitch, self->staging_texture_offset, + self->stating_texture_stride, &dummy); + } + + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&out_frame); i++) { + guint8 *src, *dst; + gint j; + gint width; + + src = (guint8 *) map.pData + self->staging_texture_offset[i]; + dst = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (&out_frame, i); + width = GST_VIDEO_FRAME_COMP_WIDTH (&out_frame, i) * + GST_VIDEO_FRAME_COMP_PSTRIDE (&out_frame, i); + + for (j = 0; j < GST_VIDEO_FRAME_COMP_HEIGHT (&out_frame, i); j++) { + memcpy (dst, src, width); + dst += GST_VIDEO_FRAME_PLANE_STRIDE (&out_frame, i); + src += self->stating_texture_stride[i]; + } + } + + gst_video_frame_unmap (&out_frame); + device_context->Unmap (self->staging, 0); + gst_d3d11_device_unlock (self->device); + + return TRUE; +} + +static gboolean +copy_to_d3d11 (GstD3D11Decoder * self, GstBuffer * decoder_buffer, + GstBuffer * output) +{ + GstVideoInfo *info = &self->output_info; + GstD3D11Memory *in_mem; + GstD3D11Memory *out_mem; + GstMapInfo out_map; + D3D11_BOX src_box; + ID3D11Texture2D *in_texture; + guint in_subresource_index, out_subresource_index; + ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (self->device); + + in_mem = (GstD3D11Memory *) gst_buffer_peek_memory (decoder_buffer, 0); + out_mem = (GstD3D11Memory *) gst_buffer_peek_memory (output, 0); + + if (!gst_memory_map (GST_MEMORY_CAST (out_mem), + &out_map, (GstMapFlags) (GST_MAP_WRITE | GST_MAP_D3D11))) { + GST_ERROR_OBJECT (self, "Couldn't map output d3d11 memory"); + return FALSE; + } + + gst_d3d11_device_lock (self->device); + in_texture = gst_d3d11_memory_get_texture_handle (in_mem); + in_subresource_index = gst_d3d11_memory_get_subresource_index (in_mem); + + src_box.left = 0; + src_box.top = 0; + src_box.front = 0; + src_box.back = 1; + + src_box.right = GST_ROUND_UP_2 (GST_VIDEO_INFO_WIDTH 
(info)); + src_box.bottom = GST_ROUND_UP_2 (GST_VIDEO_INFO_HEIGHT (info)); + + out_subresource_index = gst_d3d11_memory_get_subresource_index (out_mem); + device_context->CopySubresourceRegion ((ID3D11Resource *) out_map.data, + out_subresource_index, 0, 0, 0, in_texture, in_subresource_index, + &src_box); + + gst_d3d11_device_unlock (self->device); + gst_memory_unmap (GST_MEMORY_CAST (out_mem), &out_map); + + return TRUE; +} + +gboolean +gst_d3d11_decoder_process_output (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, gint display_width, gint display_height, + GstBuffer * decoder_buffer, GstBuffer ** output) +{ + gboolean can_device_copy = TRUE; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_VIDEO_DECODER (videodec), FALSE); + g_return_val_if_fail (GST_IS_BUFFER (decoder_buffer), FALSE); + g_return_val_if_fail (output != NULL, FALSE); + + if (display_width != GST_VIDEO_INFO_WIDTH (&decoder->output_info) || + display_height != GST_VIDEO_INFO_HEIGHT (&decoder->output_info)) { + GST_INFO_OBJECT (videodec, "Frame size changed, do renegotiate"); + + gst_video_info_set_format (&decoder->output_info, + GST_VIDEO_INFO_FORMAT (&decoder->info), display_width, display_height); + GST_VIDEO_INFO_INTERLACE_MODE (&decoder->output_info) = + GST_VIDEO_INFO_INTERLACE_MODE (&decoder->info); + + if (!gst_video_decoder_negotiate (videodec)) { + GST_ERROR_OBJECT (videodec, "Failed to re-negotiate with new frame size"); + return FALSE; + } + } + + if (gst_d3d11_decoder_can_direct_render (decoder, videodec, decoder_buffer, + display_width, display_height)) { + GstMemory *mem; + + mem = gst_buffer_peek_memory (decoder_buffer, 0); + GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD); + + *output = gst_buffer_ref (decoder_buffer); + + return TRUE; + } + + *output = gst_video_decoder_allocate_output_buffer (videodec); + if (*output == NULL) { + GST_ERROR_OBJECT (videodec, "Couldn't allocate output buffer"); + + 
return FALSE; + } + + /* decoder buffer must have single memory */ + if (gst_buffer_n_memory (decoder_buffer) == gst_buffer_n_memory (*output)) { + GstMemory *mem; + GstD3D11Memory *dmem; + + mem = gst_buffer_peek_memory (*output, 0); + if (!gst_is_d3d11_memory (mem)) { + can_device_copy = FALSE; + goto do_process; + } + + dmem = (GstD3D11Memory *) mem; + if (dmem->device != decoder->device) + can_device_copy = FALSE; + } else { + can_device_copy = FALSE; + } + +do_process: + if (can_device_copy) { + return copy_to_d3d11 (decoder, decoder_buffer, *output); + } + + return copy_to_system (decoder, decoder_buffer, *output); +} + +gboolean +gst_d3d11_decoder_negotiate (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec) +{ + GstVideoInfo *info; + GstCaps *peer_caps; + GstVideoCodecState *state = NULL; + gboolean alternate_interlaced; + gboolean alternate_supported = FALSE; + gboolean d3d11_supported = FALSE; + GstVideoCodecState *input_state; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_VIDEO_DECODER (videodec), FALSE); + + info = &decoder->output_info; + input_state = decoder->input_state; + + alternate_interlaced = + (GST_VIDEO_INFO_INTERLACE_MODE (info) == + GST_VIDEO_INTERLACE_MODE_ALTERNATE); + + peer_caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (videodec)); + GST_DEBUG_OBJECT (videodec, "Allowed caps %" GST_PTR_FORMAT, peer_caps); + + if (!peer_caps || gst_caps_is_any (peer_caps)) { + GST_DEBUG_OBJECT (videodec, + "cannot determine output format, use system memory"); + } else { + GstCapsFeatures *features; + guint size = gst_caps_get_size (peer_caps); + guint i; + + for (i = 0; i < size; i++) { + features = gst_caps_get_features (peer_caps, i); + + if (!features) + continue; + + if (gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + d3d11_supported = TRUE; + } + + /* FIXME: software deinterlace element will not return interlaced caps + * feature... 
We should fix it */ + if (gst_caps_features_contains (features, + GST_CAPS_FEATURE_FORMAT_INTERLACED)) { + alternate_supported = TRUE; + } + } + } + gst_clear_caps (&peer_caps); + + GST_DEBUG_OBJECT (videodec, + "Downstream feature support, D3D11 memory: %d, interlaced format %d", + d3d11_supported, alternate_supported); + + if (alternate_interlaced) { + /* FIXME: D3D11 cannot support alternating interlaced stream yet */ + GST_FIXME_OBJECT (videodec, + "Implement alternating interlaced stream for D3D11"); + + if (alternate_supported) { + gint height = GST_VIDEO_INFO_HEIGHT (info); + + /* Set caps resolution with display size, that's how we designed + * for alternating interlaced stream */ + height = 2 * height; + state = gst_video_decoder_set_interlaced_output_state (videodec, + GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_INTERLACE_MODE (info), + GST_VIDEO_INFO_WIDTH (info), height, input_state); + } else { + GST_WARNING_OBJECT (videodec, + "Downstream doesn't support alternating interlaced stream"); + + state = gst_video_decoder_set_output_state (videodec, + GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_WIDTH (info), + GST_VIDEO_INFO_HEIGHT (info), input_state); + + /* XXX: adjust PAR, this would produce output similar to that of + * "line doubling" (so called bob deinterlacing) processing. + * apart from missing anchor line (top-field or bottom-field) information. + * Potentially flickering could happen. So this might not be correct. 
+ * But it would be better than negotiation error of half-height squeezed + * image */ + state->info.par_d *= 2; + state->info.fps_n *= 2; + } + } else { + state = gst_video_decoder_set_interlaced_output_state (videodec, + GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_INTERLACE_MODE (info), + GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info), input_state); + } + + if (!state) { + GST_ERROR_OBJECT (decoder, "Couldn't set output state"); + return FALSE; + } + + state->caps = gst_video_info_to_caps (&state->info); + + g_clear_pointer (&decoder->output_state, gst_video_codec_state_unref); + decoder->output_state = state; + + if (d3d11_supported) { + gst_caps_set_features (state->caps, 0, + gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, NULL)); + } + + decoder->downstream_supports_d3d11 = d3d11_supported; + + return gst_d3d11_decoder_open (decoder); +} + +gboolean +gst_d3d11_decoder_decide_allocation (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, GstQuery * query) +{ + GstCaps *outcaps; + GstBufferPool *pool = NULL; + guint n, size, min = 0, max = 0; + GstVideoInfo vinfo = { 0, }; + GstStructure *config; + GstD3D11AllocationParams *d3d11_params; + gboolean use_d3d11_pool; + + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_VIDEO_DECODER (videodec), FALSE); + g_return_val_if_fail (query != NULL, FALSE); + + if (!decoder->opened) { + GST_ERROR_OBJECT (videodec, "Should open decoder first"); + return FALSE; + } + + gst_query_parse_allocation (query, &outcaps, NULL); + + if (!outcaps) { + GST_DEBUG_OBJECT (decoder, "No output caps"); + return FALSE; + } + + use_d3d11_pool = decoder->downstream_supports_d3d11; + + gst_video_info_from_caps (&vinfo, outcaps); + n = gst_query_get_n_allocation_pools (query); + if (n > 0) + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + + /* create our own pool */ + if (pool && use_d3d11_pool) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + 
GST_DEBUG_OBJECT (videodec, + "Downstream pool is not d3d11, will create new one"); + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != decoder->device) { + GST_DEBUG_OBJECT (videodec, "Different device, will create new one"); + gst_clear_object (&pool); + } + } + } + + if (!pool) { + if (use_d3d11_pool) + pool = gst_d3d11_buffer_pool_new (decoder->device); + else + pool = gst_video_buffer_pool_new (); + + size = (guint) vinfo.size; + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + if (use_d3d11_pool) { + GstVideoAlignment align; + gint width, height; + + gst_video_alignment_reset (&align); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) + d3d11_params = gst_d3d11_allocation_params_new (decoder->device, &vinfo, + (GstD3D11AllocationFlags) 0, 0); + + width = GST_VIDEO_INFO_WIDTH (&vinfo); + height = GST_VIDEO_INFO_HEIGHT (&vinfo); + + /* need alignment to copy decoder output texture to downstream texture */ + align.padding_right = GST_ROUND_UP_16 (width) - width; + align.padding_bottom = GST_ROUND_UP_16 (height) - height; + if (!gst_d3d11_allocation_params_alignment (d3d11_params, &align)) { + GST_ERROR_OBJECT (videodec, "Cannot set alignment"); + return FALSE; + } + + /* Needs render target bind flag so that it can be used for + * output of shader pipeline if internal resizing is required. + * Also, downstream can keep using video processor even if we copy + * some decoded textures into downstream buffer */ + d3d11_params->desc[0].BindFlags |= D3D11_BIND_RENDER_TARGET; + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + /* Store min buffer size. 
We need to take into account the number of buffers + * that might be held by downstream in case of zero-copy playback */ + if (!decoder->internal_pool) { + if (n > 0) { + GST_DEBUG_OBJECT (videodec, "Downstream proposed pool"); + decoder->wait_on_pool_full = TRUE; + /* XXX: hardcoded bound 16, to avoid too large pool size */ + decoder->downstream_min_buffers = MIN (min, 16); + } else { + GST_DEBUG_OBJECT (videodec, "Downstream didn't propose pool"); + decoder->wait_on_pool_full = FALSE; + /* don't know how many buffers would be queued by downstream */ + decoder->downstream_min_buffers = 4; + } + } else { + /* We configured our DPB pool already, let's check if our margin can + * cover min size */ + decoder->wait_on_pool_full = FALSE; + + if (n > 0) { + if (decoder->downstream_min_buffers >= min) + decoder->wait_on_pool_full = TRUE; + + GST_DEBUG_OBJECT (videodec, + "Pre-allocated margin %d can%s cover downstream min size %d", + decoder->downstream_min_buffers, + decoder->wait_on_pool_full ? 
"" : "not", min); + } else { + GST_DEBUG_OBJECT (videodec, "Downstream min size is unknown"); + } + } + + GST_DEBUG_OBJECT (videodec, "Downstream min buffers: %d", min); + } + + gst_buffer_pool_set_config (pool, config); + if (use_d3d11_pool) { + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, + nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + } + + if (n > 0) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + gst_object_unref (pool); + + return TRUE; +} + +gboolean +gst_d3d11_decoder_set_flushing (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, gboolean flushing) +{ + g_return_val_if_fail (GST_IS_D3D11_DECODER (decoder), FALSE); + + g_mutex_lock (&decoder->internal_pool_lock); + if (decoder->internal_pool) + gst_buffer_pool_set_flushing (decoder->internal_pool, flushing); + g_mutex_unlock (&decoder->internal_pool_lock); + + return TRUE; +} + +static gboolean +gst_d3d11_decoder_can_direct_render (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, GstBuffer * view_buffer, + gint display_width, gint display_height) +{ + GstMemory *mem; + GstD3D11PoolAllocator *alloc; + guint max_size = 0, outstanding_size = 0; + + /* We don't support direct render for reverse playback */ + if (videodec->input_segment.rate < 0) + return FALSE; + + if (!decoder->can_direct_rendering || !decoder->downstream_supports_d3d11) + return FALSE; + + /* different size, need copy */ + /* TODO: crop meta */ + if (display_width != GST_VIDEO_INFO_WIDTH (&decoder->info) || + display_height != GST_VIDEO_INFO_HEIGHT (&decoder->info)) + return FALSE; + + /* we can do direct render in this case, since there is no DPB pool size + * limit */ + if (decoder->use_array_of_texture) + return TRUE; + + /* Let's believe downstream info */ + 
if (decoder->wait_on_pool_full) + return TRUE; + + /* Check if the pool is about to be full */ + mem = gst_buffer_peek_memory (view_buffer, 0); + + /* something went wrong */ + if (!gst_is_d3d11_memory (mem)) { + GST_ERROR_OBJECT (decoder, "Not a D3D11 memory"); + return FALSE; + } + + alloc = GST_D3D11_POOL_ALLOCATOR (mem->allocator); + if (!gst_d3d11_pool_allocator_get_pool_size (alloc, &max_size, + &outstanding_size)) { + GST_ERROR_OBJECT (decoder, "Couldn't query pool size"); + return FALSE; + } + + /* 2 buffer margin */ + if (max_size <= outstanding_size + 1) { + GST_DEBUG_OBJECT (decoder, "memory pool is about to be full (%u/%u)", + outstanding_size, max_size); + return FALSE; + } + + GST_LOG_OBJECT (decoder, "Can do direct rendering"); + + return TRUE; +} + +/* Keep in sync with chromium and keep in sorted order. + * See supported_profile_helpers.cc in chromium */ +static const guint legacy_amd_list[] = { + 0x130f, 0x6700, 0x6701, 0x6702, 0x6703, 0x6704, 0x6705, 0x6706, 0x6707, + 0x6708, 0x6709, 0x6718, 0x6719, 0x671c, 0x671d, 0x671f, 0x6720, 0x6721, + 0x6722, 0x6723, 0x6724, 0x6725, 0x6726, 0x6727, 0x6728, 0x6729, 0x6738, + 0x6739, 0x673e, 0x6740, 0x6741, 0x6742, 0x6743, 0x6744, 0x6745, 0x6746, + 0x6747, 0x6748, 0x6749, 0x674a, 0x6750, 0x6751, 0x6758, 0x6759, 0x675b, + 0x675d, 0x675f, 0x6760, 0x6761, 0x6762, 0x6763, 0x6764, 0x6765, 0x6766, + 0x6767, 0x6768, 0x6770, 0x6771, 0x6772, 0x6778, 0x6779, 0x677b, 0x6798, + 0x67b1, 0x6821, 0x683d, 0x6840, 0x6841, 0x6842, 0x6843, 0x6849, 0x6850, + 0x6858, 0x6859, 0x6880, 0x6888, 0x6889, 0x688a, 0x688c, 0x688d, 0x6898, + 0x6899, 0x689b, 0x689c, 0x689d, 0x689e, 0x68a0, 0x68a1, 0x68a8, 0x68a9, + 0x68b0, 0x68b8, 0x68b9, 0x68ba, 0x68be, 0x68bf, 0x68c0, 0x68c1, 0x68c7, + 0x68c8, 0x68c9, 0x68d8, 0x68d9, 0x68da, 0x68de, 0x68e0, 0x68e1, 0x68e4, + 0x68e5, 0x68e8, 0x68e9, 0x68f1, 0x68f2, 0x68f8, 0x68f9, 0x68fa, 0x68fe, + 0x9400, 0x9401, 0x9402, 0x9403, 0x9405, 0x940a, 0x940b, 0x940f, 0x9440, + 0x9441, 0x9442, 0x9443, 0x9444, 0x9446, 0x944a, 
0x944b, 0x944c, 0x944e, + 0x9450, 0x9452, 0x9456, 0x945a, 0x945b, 0x945e, 0x9460, 0x9462, 0x946a, + 0x946b, 0x947a, 0x947b, 0x9480, 0x9487, 0x9488, 0x9489, 0x948a, 0x948f, + 0x9490, 0x9491, 0x9495, 0x9498, 0x949c, 0x949e, 0x949f, 0x94a0, 0x94a1, + 0x94a3, 0x94b1, 0x94b3, 0x94b4, 0x94b5, 0x94b9, 0x94c0, 0x94c1, 0x94c3, + 0x94c4, 0x94c5, 0x94c6, 0x94c7, 0x94c8, 0x94c9, 0x94cb, 0x94cc, 0x94cd, + 0x9500, 0x9501, 0x9504, 0x9505, 0x9506, 0x9507, 0x9508, 0x9509, 0x950f, + 0x9511, 0x9515, 0x9517, 0x9519, 0x9540, 0x9541, 0x9542, 0x954e, 0x954f, + 0x9552, 0x9553, 0x9555, 0x9557, 0x955f, 0x9580, 0x9581, 0x9583, 0x9586, + 0x9587, 0x9588, 0x9589, 0x958a, 0x958b, 0x958c, 0x958d, 0x958e, 0x958f, + 0x9590, 0x9591, 0x9593, 0x9595, 0x9596, 0x9597, 0x9598, 0x9599, 0x959b, + 0x95c0, 0x95c2, 0x95c4, 0x95c5, 0x95c6, 0x95c7, 0x95c9, 0x95cc, 0x95cd, + 0x95ce, 0x95cf, 0x9610, 0x9611, 0x9612, 0x9613, 0x9614, 0x9615, 0x9616, + 0x9640, 0x9641, 0x9642, 0x9643, 0x9644, 0x9645, 0x9647, 0x9648, 0x9649, + 0x964a, 0x964b, 0x964c, 0x964e, 0x964f, 0x9710, 0x9711, 0x9712, 0x9713, + 0x9714, 0x9715, 0x9802, 0x9803, 0x9804, 0x9805, 0x9806, 0x9807, 0x9808, + 0x9809, 0x980a, 0x9830, 0x983d, 0x9850, 0x9851, 0x9874, 0x9900, 0x9901, + 0x9903, 0x9904, 0x9905, 0x9906, 0x9907, 0x9908, 0x9909, 0x990a, 0x990b, + 0x990c, 0x990d, 0x990e, 0x990f, 0x9910, 0x9913, 0x9917, 0x9918, 0x9919, + 0x9990, 0x9991, 0x9992, 0x9993, 0x9994, 0x9995, 0x9996, 0x9997, 0x9998, + 0x9999, 0x999a, 0x999b, 0x999c, 0x999d, 0x99a0, 0x99a2, 0x99a4 +}; + +static const guint legacy_intel_list[] = { + 0x102, 0x106, 0x116, 0x126, 0x152, 0x156, 0x166, + 0x402, 0x406, 0x416, 0x41e, 0xa06, 0xa16, 0xf31, +}; + +static gint +binary_search_compare (const guint * a, const guint * b) +{ + return *a - *b; +} + +/* Certain AMD GPU drivers like R600, R700, Evergreen and Cayman and some second + * generation Intel GPU drivers crash if we create a video device with a + * resolution higher than 1920 x 1088. 
This function checks whether the GPU is in + * this list and returns TRUE if so. */ +gboolean +gst_d3d11_decoder_util_is_legacy_device (GstD3D11Device * device) +{ + const guint amd_id[] = { 0x1002, 0x1022 }; + const guint intel_id = 0x8086; + guint device_id = 0; + guint vendor_id = 0; + guint *match = NULL; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); + + g_object_get (device, "device-id", &device_id, "vendor-id", &vendor_id, NULL); + + if (vendor_id == amd_id[0] || vendor_id == amd_id[1]) { + match = + (guint *) gst_util_array_binary_search ((gpointer) legacy_amd_list, + G_N_ELEMENTS (legacy_amd_list), sizeof (guint), + (GCompareDataFunc) binary_search_compare, + GST_SEARCH_MODE_EXACT, &device_id, NULL); + } else if (vendor_id == intel_id) { + match = + (guint *) gst_util_array_binary_search ((gpointer) legacy_intel_list, + G_N_ELEMENTS (legacy_intel_list), sizeof (guint), + (GCompareDataFunc) binary_search_compare, + GST_SEARCH_MODE_EXACT, &device_id, NULL); + } + + if (match) { + GST_DEBUG_OBJECT (device, "it's a legacy device"); + return TRUE; + } + + return FALSE; +} + +gboolean +gst_d3d11_decoder_supports_format (GstD3D11Device * device, + const GUID * decoder_profile, DXGI_FORMAT format) +{ + HRESULT hr; + BOOL can_support = FALSE; + ID3D11VideoDevice *video_device; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); + g_return_val_if_fail (decoder_profile != NULL, FALSE); + g_return_val_if_fail (format != DXGI_FORMAT_UNKNOWN, FALSE); + + video_device = gst_d3d11_device_get_video_device_handle (device); + if (!video_device) + return FALSE; + + hr = video_device->CheckVideoDecoderFormat (decoder_profile, format, + &can_support); + if (!gst_d3d11_result (hr, device) || !can_support) { + GST_DEBUG_OBJECT (device, + "VideoDevice does not support DXGI format %d, hr: 0x%x", + format, (guint) hr); + + return FALSE; + } + + return TRUE; +} + +/* Don't call this method with a legacy device */ +gboolean +gst_d3d11_decoder_supports_resolution 
(GstD3D11Device * device, + const GUID * decoder_profile, DXGI_FORMAT format, guint width, guint height) +{ + D3D11_VIDEO_DECODER_DESC desc; + HRESULT hr; + UINT config_count; + ID3D11VideoDevice *video_device; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); + g_return_val_if_fail (decoder_profile != NULL, FALSE); + g_return_val_if_fail (format != DXGI_FORMAT_UNKNOWN, FALSE); + + video_device = gst_d3d11_device_get_video_device_handle (device); + if (!video_device) + return FALSE; + + desc.SampleWidth = width; + desc.SampleHeight = height; + desc.OutputFormat = format; + desc.Guid = *decoder_profile; + + hr = video_device->GetVideoDecoderConfigCount (&desc, &config_count); + if (!gst_d3d11_result (hr, device) || config_count == 0) { + GST_DEBUG_OBJECT (device, "Could not get decoder config count, hr: 0x%x", + (guint) hr); + return FALSE; + } + + return TRUE; +} + +enum +{ + PROP_DECODER_ADAPTER_LUID = 1, + PROP_DECODER_DEVICE_ID, + PROP_DECODER_VENDOR_ID, +}; + +struct _GstD3D11DecoderClassData +{ + GstD3D11DecoderSubClassData subclass_data; + GstCaps *sink_caps; + GstCaps *src_caps; + gchar *description; +}; + +/** + * gst_d3d11_decoder_class_data_new: + * @device: (transfer none): a #GstD3D11Device + * @sink_caps: (transfer full): a #GstCaps + * @src_caps: (transfer full): a #GstCaps + * + * Create new #GstD3D11DecoderClassData + * + * Returns: (transfer full): the new #GstD3D11DecoderClassData + */ +GstD3D11DecoderClassData * +gst_d3d11_decoder_class_data_new (GstD3D11Device * device, GstDXVACodec codec, + GstCaps * sink_caps, GstCaps * src_caps) +{ + GstD3D11DecoderClassData *ret; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (sink_caps != NULL, NULL); + g_return_val_if_fail (src_caps != NULL, NULL); + + ret = g_new0 (GstD3D11DecoderClassData, 1); + + /* class data will be leaked if the element never gets instantiated */ + GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + 
GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + ret->subclass_data.codec = codec; + g_object_get (device, "adapter-luid", &ret->subclass_data.adapter_luid, + "device-id", &ret->subclass_data.device_id, + "vendor-id", &ret->subclass_data.vendor_id, + "description", &ret->description, nullptr); + ret->sink_caps = sink_caps; + ret->src_caps = src_caps; + + return ret; +} + +void +gst_d3d11_decoder_class_data_fill_subclass_data (GstD3D11DecoderClassData * + data, GstD3D11DecoderSubClassData * subclass_data) +{ + g_return_if_fail (data != nullptr); + g_return_if_fail (subclass_data != nullptr); + + *subclass_data = data->subclass_data; +} + +static void +gst_d3d11_decoder_class_data_free (GstD3D11DecoderClassData * data) +{ + if (!data) + return; + + gst_clear_caps (&data->sink_caps); + gst_clear_caps (&data->src_caps); + g_free (data->description); + g_free (data); +} + +void +gst_d3d11_decoder_proxy_class_init (GstElementClass * klass, + GstD3D11DecoderClassData * data, const gchar * author) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstD3D11DecoderSubClassData *cdata = &data->subclass_data; + std::string long_name; + std::string description; + const gchar *codec_name; + + g_object_class_install_property (gobject_class, PROP_DECODER_ADAPTER_LUID, + g_param_spec_int64 ("adapter-luid", "Adapter LUID", + "DXGI Adapter LUID (Locally Unique Identifier) of created device", + G_MININT64, G_MAXINT64, cdata->adapter_luid, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_DECODER_DEVICE_ID, + g_param_spec_uint ("device-id", "Device Id", + "DXGI Device ID", 0, G_MAXUINT32, 0, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_DECODER_VENDOR_ID, + g_param_spec_uint ("vendor-id", "Vendor Id", + "DXGI Vendor ID", 0, G_MAXUINT32, 0, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + 
codec_name = gst_dxva_codec_to_string (cdata->codec); + long_name = "Direct3D11/DXVA " + std::string (codec_name) + " " + + std::string (data->description) + " Decoder"; + description = "Direct3D11/DXVA based " + std::string (codec_name) + + " video decoder"; + + gst_element_class_set_metadata (klass, long_name.c_str (), + "Codec/Decoder/Video/Hardware", description.c_str (), author); + + gst_element_class_add_pad_template (klass, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + data->sink_caps)); + gst_element_class_add_pad_template (klass, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + data->src_caps)); + + gst_d3d11_decoder_class_data_free (data); +} + +void +gst_d3d11_decoder_proxy_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec, + GstD3D11DecoderSubClassData * subclass_data) +{ + switch (prop_id) { + case PROP_DECODER_ADAPTER_LUID: + g_value_set_int64 (value, subclass_data->adapter_luid); + break; + case PROP_DECODER_DEVICE_ID: + g_value_set_uint (value, subclass_data->device_id); + break; + case PROP_DECODER_VENDOR_ID: + g_value_set_uint (value, subclass_data->vendor_id); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +gboolean +gst_d3d11_decoder_proxy_open (GstVideoDecoder * videodec, + GstD3D11DecoderSubClassData * subclass_data, GstD3D11Device ** device, + GstD3D11Decoder ** decoder) +{ + GstElement *elem = GST_ELEMENT (videodec); + + if (!gst_d3d11_ensure_element_data_for_adapter_luid (elem, + subclass_data->adapter_luid, device)) { + GST_ERROR_OBJECT (elem, "Cannot create d3d11device"); + return FALSE; + } + + *decoder = gst_d3d11_decoder_new (*device, subclass_data->codec); + + if (*decoder == nullptr) { + GST_ERROR_OBJECT (elem, "Cannot create d3d11 decoder"); + gst_clear_object (device); + return FALSE; + } + + return TRUE; +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11decoder.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11decoder.h
Changed
@@ -22,161 +22,146 @@ #include <gst/gst.h> #include <gst/video/video.h> -#include "gstd3d11_fwd.h" -#include "gstd3d11device.h" -#include "gstd3d11utils.h" +#include <gst/d3d11/gstd3d11.h> G_BEGIN_DECLS -#define GST_TYPE_D3D11_DECODER \ - (gst_d3d11_decoder_get_type()) -#define GST_D3D11_DECODER(obj) \ - (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_DECODER,GstD3D11Decoder)) -#define GST_D3D11_DECODER_CLASS(klass) \ - (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_D3D11_DECODER,GstD3D11DecoderClass)) -#define GST_D3D11_DECODER_GET_CLASS(obj) \ - (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_D3D11_DECODER,GstD3D11DecoderClass)) -#define GST_IS_D3D11_DECODER(obj) \ - (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_DECODER)) -#define GST_IS_D3D11_DECODER_CLASS(klass) \ - (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_DECODER)) - -typedef struct _GstD3D11DecoderOutputView GstD3D11DecoderOutputView; - -struct _GstD3D11DecoderOutputView -{ - GstD3D11Device *device; - ID3D11VideoDecoderOutputView *handle; - guint view_id; -}; +#define GST_TYPE_D3D11_DECODER (gst_d3d11_decoder_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11Decoder, + gst_d3d11_decoder, GST, D3D11_DECODER, GstObject); + +typedef struct _GstD3D11DecoderClassData GstD3D11DecoderClassData; typedef enum { - GST_D3D11_CODEC_NONE, - GST_D3D11_CODEC_H264, - GST_D3D11_CODEC_VP9, - GST_D3D11_CODEC_H265, - GST_D3D11_CODEC_VP8, + GST_DXVA_CODEC_NONE, + GST_DXVA_CODEC_MPEG2, + GST_DXVA_CODEC_H264, + GST_DXVA_CODEC_H265, + GST_DXVA_CODEC_VP8, + GST_DXVA_CODEC_VP9, + GST_DXVA_CODEC_AV1, /* the last of supported codec */ - GST_D3D11_CODEC_LAST -} GstD3D11Codec; + GST_DXVA_CODEC_LAST +} GstDXVACodec; typedef struct { - GstCaps *sink_caps; - GstCaps *src_caps; - guint adapter; + GstDXVACodec codec; + gint64 adapter_luid; guint device_id; guint vendor_id; - gchar *description; -} GstD3D11DecoderClassData; - -struct _GstD3D11Decoder -{ - GstObject parent; +} GstD3D11DecoderSubClassData; - /* TRUE if decoder was successfully opened 
ever */ - gboolean opened; - - /*< private >*/ - GstD3D11DecoderPrivate *priv; - gpointer padding[GST_PADDING_LARGE]; -}; - -struct _GstD3D11DecoderClass +typedef struct _GstD3D11DecodeInputStreamArgs { - GstObjectClass parent_class; -}; - -GType gst_d3d11_decoder_get_type (void); + gpointer picture_params; + gsize picture_params_size; -GstD3D11Decoder * gst_d3d11_decoder_new (GstD3D11Device * device); + gpointer slice_control; + gsize slice_control_size; -gboolean gst_d3d11_decoder_open (GstD3D11Decoder * decoder, - GstD3D11Codec codec, - GstVideoInfo * info, - guint codec_width, - guint codec_height, - guint pool_size, - const GUID ** decoder_profiles, - guint profile_size); + gpointer bitstream; + gsize bitstream_size; -void gst_d3d11_decoder_reset (GstD3D11Decoder * decoder); + gpointer inverse_quantization_matrix; + gsize inverse_quantization_matrix_size; +} GstD3D11DecodeInputStreamArgs; -gboolean gst_d3d11_decoder_begin_frame (GstD3D11Decoder * decoder, - GstD3D11DecoderOutputView * output_view, - guint content_key_size, - gconstpointer content_key); +GstD3D11Decoder * gst_d3d11_decoder_new (GstD3D11Device * device, + GstDXVACodec codec); -gboolean gst_d3d11_decoder_end_frame (GstD3D11Decoder * decoder); +gboolean gst_d3d11_decoder_is_configured (GstD3D11Decoder * decoder); -gboolean gst_d3d11_decoder_get_decoder_buffer (GstD3D11Decoder * decoder, - D3D11_VIDEO_DECODER_BUFFER_TYPE type, - guint * buffer_size, - gpointer * buffer); +gboolean gst_d3d11_decoder_configure (GstD3D11Decoder * decoder, + GstVideoCodecState * input_state, + GstVideoInfo * info, + gint coded_width, + gint coded_height, + guint dpb_size); -gboolean gst_d3d11_decoder_release_decoder_buffer (GstD3D11Decoder * decoder, - D3D11_VIDEO_DECODER_BUFFER_TYPE type); +gboolean gst_d3d11_decoder_decode_frame (GstD3D11Decoder * decoder, + ID3D11VideoDecoderOutputView * output_view, + GstD3D11DecodeInputStreamArgs * input_args); -gboolean gst_d3d11_decoder_submit_decoder_buffers (GstD3D11Decoder * 
decoder, - guint buffer_count, - const D3D11_VIDEO_DECODER_BUFFER_DESC * buffers); -GstBuffer * gst_d3d11_decoder_get_output_view_buffer (GstD3D11Decoder * decoder); +GstBuffer * gst_d3d11_decoder_get_output_view_buffer (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec); -GstD3D11DecoderOutputView * gst_d3d11_decoder_get_output_view_from_buffer (GstD3D11Decoder * decoder, - GstBuffer * buffer); - -guint gst_d3d11_decoder_get_output_view_index (GstD3D11Decoder * decoder, - ID3D11VideoDecoderOutputView * view_handle); +ID3D11VideoDecoderOutputView * gst_d3d11_decoder_get_output_view_from_buffer (GstD3D11Decoder * decoder, + GstBuffer * buffer, + guint8 * view_id); gboolean gst_d3d11_decoder_process_output (GstD3D11Decoder * decoder, - GstVideoInfo * info, + GstVideoDecoder * videodec, gint display_width, gint display_height, GstBuffer * decoder_buffer, - GstBuffer * output); + GstBuffer ** output); -gboolean gst_d3d11_decoder_negotiate (GstVideoDecoder * decoder, - GstVideoCodecState * input_state, - GstVideoFormat format, - guint width, - guint height, - GstVideoCodecState ** output_state, - gboolean * downstream_supports_d3d11); +gboolean gst_d3d11_decoder_negotiate (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec); -gboolean gst_d3d11_decoder_decide_allocation (GstVideoDecoder * decoder, - GstQuery * query, - GstD3D11Device * device, - GstD3D11Codec codec, - gboolean use_d3d11_pool); +gboolean gst_d3d11_decoder_decide_allocation (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, + GstQuery * query); -gboolean gst_d3d11_decoder_supports_direct_rendering (GstD3D11Decoder * decoder); +gboolean gst_d3d11_decoder_set_flushing (GstD3D11Decoder * decoder, + GstVideoDecoder * videodec, + gboolean flushing); /* Utils for class registration */ +typedef struct _GstDXVAResolution +{ + guint width; + guint height; +} GstDXVAResolution; + +static const GstDXVAResolution gst_dxva_resolutions[] = { + {1920, 1088}, {2560, 1440}, {3840, 2160}, {4096, 2160}, + 
{7680, 4320}, {8192, 4320} +}; + gboolean gst_d3d11_decoder_util_is_legacy_device (GstD3D11Device * device); -gboolean gst_d3d11_decoder_get_supported_decoder_profile (GstD3D11Decoder * decoder, - const GUID ** decoder_profiles, - guint profile_size, - GUID * selected_profile); +gboolean gst_d3d11_decoder_get_supported_decoder_profile (GstD3D11Device * device, + GstDXVACodec codec, + GstVideoFormat format, + const GUID ** selected_profile); -gboolean gst_d3d11_decoder_supports_format (GstD3D11Decoder * decoder, +gboolean gst_d3d11_decoder_supports_format (GstD3D11Device * device, const GUID * decoder_profile, DXGI_FORMAT format); -gboolean gst_d3d11_decoder_supports_resolution (GstD3D11Decoder * decoder, +gboolean gst_d3d11_decoder_supports_resolution (GstD3D11Device * device, const GUID * decoder_profile, DXGI_FORMAT format, guint width, guint height); -GstD3D11DecoderClassData * gst_d3d11_decoder_class_data_new (GstD3D11Device * device, - GstCaps * sink_caps, - GstCaps * src_caps); +GstD3D11DecoderClassData * gst_d3d11_decoder_class_data_new (GstD3D11Device * device, + GstDXVACodec codec, + GstCaps * sink_caps, + GstCaps * src_caps); + +void gst_d3d11_decoder_class_data_fill_subclass_data (GstD3D11DecoderClassData * data, + GstD3D11DecoderSubClassData * subclass_data); + +void gst_d3d11_decoder_proxy_class_init (GstElementClass * klass, + GstD3D11DecoderClassData * data, + const gchar * author); + +void gst_d3d11_decoder_proxy_get_property (GObject * object, + guint prop_id, + GValue * value, + GParamSpec * pspec, + GstD3D11DecoderSubClassData * subclass_data); -void gst_d3d11_decoder_class_data_free (GstD3D11DecoderClassData * data); +gboolean gst_d3d11_decoder_proxy_open (GstVideoDecoder * videodec, + GstD3D11DecoderSubClassData * subclass_data, + GstD3D11Device ** device, + GstD3D11Decoder ** decoder); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11deinterlace.cpp
Added
@@ -0,0 +1,2485 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11deinterlaceelement + * @title: d3d11deinterlaceelement + * + * Deinterlacing interlaced video frames to progressive video frames by using + * ID3D11VideoProcessor API. Depending on the hardware it runs on, + * this element will only support a very limited set of video formats. + * Use #d3d11deinterlace instead, which will take care of conversion. + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/video/video.h> +#include <gst/base/gstbasetransform.h> + +#include "gstd3d11deinterlace.h" +#include "gstd3d11pluginutils.h" +#include <wrl.h> +#include <string.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_deinterlace_debug); +#define GST_CAT_DEFAULT gst_d3d11_deinterlace_debug + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +/* Deinterlacing Methods: + * Direct3D11 provides Blend, Bob, Adaptive, Motion Compensation, and + * Inverse Telecine methods. But depending on video processor device, + * some of method might not be supported. 
+ * - Blend: the two fields of a interlaced frame are blended into a single + * progressive frame. Output rate will be half of input (e.g., 60i -> 30p) + * but due to the way of framerate signalling of GStreamer, that is, it uses + * frame rate, not field rate for interlaced stream, in/output framerate + * of caps will be identical. + * - Bob: missing field lines are interpolated from the lines above and below. + * Output rate will be the same as that of input (e.g., 60i -> 60p). + * In order words, video processor will generate two frames from two field + * of a intelaced frame. + * - Adaptive, Motion Compensation: future and past frames are used for + * reference frame for deinterlacing process. User should provide sufficent + * number of reference frames, otherwise processor device will fallback to + * Bob method. + * + * Direct3D11 doesn't provide a method for explicit deinterlacing method + * selection. Instead, it could be done indirectly. + * - Blend: sets output rate as half via VideoProcessorSetStreamOutputRate(). + * - Bob: sets output rate as normal. And performs VideoProcessorBlt() twice per + * a interlaced frame. D3D11_VIDEO_PROCESSOR_STREAM::OutputIndex needs to be + * incremented per field (e.g., OutputIndex = 0 for the first field, + * and 1 for the second field). 
+ * - Adaptive, Motion Compensation: in addition to the requirement of Bob, + * user should provide reference frames via + * D3D11_VIDEO_PROCESSOR_STREAM::ppPastSurfaces and + * D3D11_VIDEO_PROCESSOR_STREAM::ppFutureSurfaces + */ + +/* g_queue_clear_full is available since 2.60 */ +#if !GLIB_CHECK_VERSION(2,60,0) +#define g_queue_clear_full gst_d3d11_deinterlace_g_queue_clear_full +static void +gst_d3d11_deinterlace_g_queue_clear_full (GQueue * queue, + GDestroyNotify free_func) +{ + g_return_if_fail (queue != NULL); + + if (free_func != NULL) + g_queue_foreach (queue, (GFunc) free_func, NULL); + + g_queue_clear (queue); +} +#endif + +typedef enum +{ + GST_D3D11_DEINTERLACE_METHOD_BLEND = + D3D11_VIDEO_PROCESSOR_PROCESSOR_CAPS_DEINTERLACE_BLEND, + GST_D3D11_DEINTERLACE_METHOD_BOB = + D3D11_VIDEO_PROCESSOR_PROCESSOR_CAPS_DEINTERLACE_BOB, + GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE = + D3D11_VIDEO_PROCESSOR_PROCESSOR_CAPS_DEINTERLACE_ADAPTIVE, + GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION = + D3D11_VIDEO_PROCESSOR_PROCESSOR_CAPS_DEINTERLACE_MOTION_COMPENSATION, + + /* TODO: INVERSE_TELECINE */ +} GstD3D11DeinterlaceMethod; + +/** + * GstD3D11DeinterlaceMethod: + * + * Deinterlacing method + * + * Since: 1.20 + */ +#define GST_TYPE_D3D11_DEINTERLACE_METHOD (gst_d3d11_deinterlace_method_type()) + +static GType +gst_d3d11_deinterlace_method_type (void) +{ + static gsize method_type = 0; + + if (g_once_init_enter (&method_type)) { + static const GFlagsValue method_types[] = { + {GST_D3D11_DEINTERLACE_METHOD_BLEND, + "Blend: Blending top/bottom field pictures into one frame. " + "Framerate will be preserved (e.g., 60i -> 30p)", "blend"}, + {GST_D3D11_DEINTERLACE_METHOD_BOB, + "Bob: Interpolating missing lines by using the adjacent lines. " + "Framerate will be doubled (e,g, 60i -> 60p)", "bob"}, + {GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE, + "Adaptive: Interpolating missing lines by using spatial/temporal references. 
" + "Framerate will be doubled (e,g, 60i -> 60p)", + "adaptive"}, + {GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION, + "Motion Compensation: Recreating missing lines by using motion vector. " + "Framerate will be doubled (e,g, 60i -> 60p)", "mocomp"}, + {0, NULL, NULL}, + }; + GType tmp = g_flags_register_static ("GstD3D11DeinterlaceMethod", + method_types); + g_once_init_leave (&method_type, tmp); + } + + return (GType) method_type; +} + +typedef struct +{ + GstD3D11DeinterlaceMethod supported_methods; + GstD3D11DeinterlaceMethod default_method; + + guint max_past_frames; + guint max_future_frames; +} GstD3D11DeinterlaceDeviceCaps; + +typedef struct +{ + GType deinterlace_type; + + GstCaps *sink_caps; + GstCaps *src_caps; + guint adapter; + guint device_id; + guint vendor_id; + gchar *description; + + GstD3D11DeinterlaceDeviceCaps device_caps; + + guint ref_count; +} GstD3D11DeinterlaceClassData; + +static GstD3D11DeinterlaceClassData * +gst_d3d11_deinterlace_class_data_new (void) +{ + GstD3D11DeinterlaceClassData *self = g_new0 (GstD3D11DeinterlaceClassData, 1); + + self->ref_count = 1; + + return self; +} + +static GstD3D11DeinterlaceClassData * +gst_d3d11_deinterlace_class_data_ref (GstD3D11DeinterlaceClassData * data) +{ + g_assert (data != NULL); + + g_atomic_int_add (&data->ref_count, 1); + + return data; +} + +static void +gst_d3d11_deinterlace_class_data_unref (GstD3D11DeinterlaceClassData * data) +{ + g_assert (data != NULL); + + if (g_atomic_int_dec_and_test (&data->ref_count)) { + gst_clear_caps (&data->sink_caps); + gst_clear_caps (&data->src_caps); + g_free (data->description); + g_free (data); + } +} + +enum +{ + PROP_0, + PROP_ADAPTER, + PROP_DEVICE_ID, + PROP_VENDOR_ID, + PROP_METHOD, + PROP_SUPPORTED_METHODS, +}; + +/* hardcoded maximum queue size for each past/future frame queue */ +#define MAX_NUM_REFERENCES 2 + +typedef struct _GstD3D11Deinterlace +{ + GstBaseTransform parent; + + GstVideoInfo in_info; + GstVideoInfo out_info; + /* Calculated 
buffer duration by using upstream framerate */ + GstClockTime default_buffer_duration; + + GstD3D11Device *device; + + ID3D11VideoDevice *video_device; + ID3D11VideoContext *video_context; + ID3D11VideoProcessorEnumerator *video_enum; + ID3D11VideoProcessor *video_proc; + + GstD3D11DeinterlaceMethod method; + + GRecMutex lock; + GQueue past_frame_queue; + GQueue future_frame_queue; + GstBuffer *to_process; + + guint max_past_frames; + guint max_future_frames; + + /* D3D11_VIDEO_PROCESSOR_STREAM::InputFrameOrField */ + guint input_index; + + /* Clear/Update per submit_input_buffer() */ + guint num_output_per_input; + guint num_transformed; + gboolean first_output; + + GstBufferPool *fallback_in_pool; + GstBufferPool *fallback_out_pool; +} GstD3D11Deinterlace; + +typedef struct _GstD3D11DeinterlaceClass +{ + GstBaseTransformClass parent_class; + + guint adapter; + guint device_id; + guint vendor_id; + + GstD3D11DeinterlaceDeviceCaps device_caps; +} GstD3D11DeinterlaceClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_DEINTERLACE(object) ((GstD3D11Deinterlace *) (object)) +#define GST_D3D11_DEINTERLACE_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object), \ + GstD3D11DeinterlaceClass)) +#define GST_D3D11_DEINTERLACE_LOCK(self) \ + g_rec_mutex_lock (&GST_D3D11_DEINTERLACE (self)->lock); +#define GST_D3D11_DEINTERLACE_UNLOCK(self) \ + g_rec_mutex_unlock (&GST_D3D11_DEINTERLACE (self)->lock); + +static gboolean +gst_d3d11_deinterlace_update_method (GstD3D11Deinterlace * self); +static void gst_d3d11_deinterlace_reset (GstD3D11Deinterlace * self); +static GstFlowReturn gst_d3d11_deinterlace_drain (GstD3D11Deinterlace * self); + +/* GObjectClass vfunc */ +static void gst_d3d11_deinterlace_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_deinterlace_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void 
gst_d3d11_deinterlace_finalize (GObject * object); + +/* GstElementClass vfunc */ +static void gst_d3d11_deinterlace_set_context (GstElement * element, + GstContext * context); + +/* GstBaseTransformClass vfunc */ +static gboolean gst_d3d11_deinterlace_start (GstBaseTransform * trans); +static gboolean gst_d3d11_deinterlace_stop (GstBaseTransform * trans); +static gboolean gst_d3d11_deinterlace_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query); +static GstCaps *gst_d3d11_deinterlace_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter); +static GstCaps *gst_d3d11_deinterlace_fixate_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); +static gboolean +gst_d3d11_deinterlace_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query); +static gboolean +gst_d3d11_deinterlace_decide_allocation (GstBaseTransform * trans, + GstQuery * query); +static gboolean gst_d3d11_deinterlace_set_caps (GstBaseTransform * trans, + GstCaps * incaps, GstCaps * outcaps); +static GstFlowReturn +gst_d3d11_deinterlace_submit_input_buffer (GstBaseTransform * trans, + gboolean is_discont, GstBuffer * input); +static GstFlowReturn +gst_d3d11_deinterlace_generate_output (GstBaseTransform * trans, + GstBuffer ** outbuf); +static GstFlowReturn +gst_d3d11_deinterlace_transform (GstBaseTransform * trans, GstBuffer * inbuf, + GstBuffer * outbuf); +static gboolean gst_d3d11_deinterlace_sink_event (GstBaseTransform * trans, + GstEvent * event); +static void gst_d3d11_deinterlace_before_transform (GstBaseTransform * trans, + GstBuffer * buffer); + +static void +gst_d3d11_deinterlace_class_init (GstD3D11DeinterlaceClass * klass, + gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + 
GstD3D11DeinterlaceClassData *cdata = (GstD3D11DeinterlaceClassData *) data; + gchar *long_name; + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + + gobject_class->get_property = gst_d3d11_deinterlace_get_property; + gobject_class->set_property = gst_d3d11_deinterlace_set_property; + gobject_class->finalize = gst_d3d11_deinterlace_finalize; + + g_object_class_install_property (gobject_class, PROP_ADAPTER, + g_param_spec_uint ("adapter", "Adapter", + "DXGI Adapter index for creating device", + 0, G_MAXUINT32, cdata->adapter, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_DEVICE_ID, + g_param_spec_uint ("device-id", "Device Id", + "DXGI Device ID", 0, G_MAXUINT32, 0, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_VENDOR_ID, + g_param_spec_uint ("vendor-id", "Vendor Id", + "DXGI Vendor ID", 0, G_MAXUINT32, 0, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_METHOD, + g_param_spec_flags ("method", "Method", + "Deinterlace Method. Use can set multiple methods as a flagset " + "and element will select one of method automatically. 
" + "If deinterlacing device failed to deinterlace with given mode, " + "fallback might happen by the device", + GST_TYPE_D3D11_DEINTERLACE_METHOD, cdata->device_caps.default_method, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + g_object_class_install_property (gobject_class, PROP_SUPPORTED_METHODS, + g_param_spec_flags ("supported-methods", "Supported Methods", + "Set of supported deinterlace methods by device", + GST_TYPE_D3D11_DEINTERLACE_METHOD, + cdata->device_caps.supported_methods, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_set_context); + + long_name = g_strdup_printf ("Direct3D11 %s Deinterlacer", + cdata->description); + gst_element_class_set_metadata (element_class, long_name, + "Filter/Effect/Video/Deinterlace/Hardware", + "A Direct3D11 based deinterlacer", + "Seungha Yang <seungha@centricular.com>"); + g_free (long_name); + + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + cdata->sink_caps)); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + cdata->src_caps)); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_start); + trans_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_stop); + trans_class->query = GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_query); + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_transform_caps); + trans_class->fixate_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_fixate_caps); + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_propose_allocation); + trans_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_decide_allocation); + trans_class->set_caps = GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_set_caps); + 
trans_class->submit_input_buffer = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_submit_input_buffer); + trans_class->generate_output = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_generate_output); + trans_class->transform = GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_transform); + trans_class->sink_event = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_sink_event); + trans_class->before_transform = + GST_DEBUG_FUNCPTR (gst_d3d11_deinterlace_before_transform); + + klass->adapter = cdata->adapter; + klass->device_id = cdata->device_id; + klass->vendor_id = cdata->vendor_id; + klass->device_caps = cdata->device_caps; + + gst_d3d11_deinterlace_class_data_unref (cdata); + + gst_type_mark_as_plugin_api (GST_TYPE_D3D11_DEINTERLACE_METHOD, + (GstPluginAPIFlags) 0); +} + +static void +gst_d3d11_deinterlace_init (GstD3D11Deinterlace * self) +{ + GstD3D11DeinterlaceClass *klass = GST_D3D11_DEINTERLACE_GET_CLASS (self); + + self->method = klass->device_caps.default_method; + self->default_buffer_duration = GST_CLOCK_TIME_NONE; + gst_d3d11_deinterlace_update_method (self); + + g_queue_init (&self->past_frame_queue); + g_queue_init (&self->future_frame_queue); + g_rec_mutex_init (&self->lock); +} + +static void +gst_d3d11_deinterlace_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (object); + GstD3D11DeinterlaceClass *klass = GST_D3D11_DEINTERLACE_GET_CLASS (object); + + switch (prop_id) { + case PROP_ADAPTER: + g_value_set_uint (value, klass->adapter); + break; + case PROP_DEVICE_ID: + g_value_set_uint (value, klass->device_id); + break; + case PROP_VENDOR_ID: + g_value_set_uint (value, klass->vendor_id); + break; + case PROP_METHOD: + g_value_set_flags (value, self->method); + break; + case PROP_SUPPORTED_METHODS: + g_value_set_flags (value, klass->device_caps.supported_methods); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static gboolean 
+gst_d3d11_deinterlace_update_method (GstD3D11Deinterlace * self) +{ + GstD3D11DeinterlaceClass *klass = GST_D3D11_DEINTERLACE_GET_CLASS (self); + GstD3D11DeinterlaceMethod requested_method = self->method; + gboolean updated = TRUE; + + /* Verify whether requested method is supported */ + if ((self->method & klass->device_caps.supported_methods) == 0) { +#ifndef GST_DISABLE_GST_DEBUG + gchar *supported, *requested; + + supported = g_flags_to_string (GST_TYPE_D3D11_DEINTERLACE_METHOD, + klass->device_caps.supported_methods); + requested = g_flags_to_string (GST_TYPE_D3D11_DEINTERLACE_METHOD, + klass->device_caps.supported_methods); + + GST_WARNING_OBJECT (self, + "Requested method %s is not supported (supported: %s)", + requested, supported); + + g_free (supported); + g_free (requested); +#endif + + self->method = klass->device_caps.default_method; + + goto done; + } + + /* Drop not supported methods */ + self->method = (GstD3D11DeinterlaceMethod) + (klass->device_caps.supported_methods & self->method); + + /* Single method was requested? 
*/ + if (self->method == GST_D3D11_DEINTERLACE_METHOD_BLEND || + self->method == GST_D3D11_DEINTERLACE_METHOD_BOB || + self->method == GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE || + self->method == GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION) { + if (self->method == requested_method) + updated = FALSE; + } else { + /* Pick single method from requested */ + if ((self->method & GST_D3D11_DEINTERLACE_METHOD_BOB) == + GST_D3D11_DEINTERLACE_METHOD_BOB) { + self->method = GST_D3D11_DEINTERLACE_METHOD_BOB; + } else if ((self->method & GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE) == + GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE) { + self->method = GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE; + } else if ((self->method & GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION) + == GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION) { + self->method = GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION; + } else if ((self->method & GST_D3D11_DEINTERLACE_METHOD_BLEND) == + GST_D3D11_DEINTERLACE_METHOD_BLEND) { + self->method = GST_D3D11_DEINTERLACE_METHOD_BLEND; + } else { + self->method = klass->device_caps.default_method; + g_assert_not_reached (); + } + } + +done: + if (self->method == GST_D3D11_DEINTERLACE_METHOD_BLEND) { + /* Both methods don't use reference frame for deinterlacing */ + self->max_past_frames = self->max_future_frames = 0; + } else if (self->method == GST_D3D11_DEINTERLACE_METHOD_BOB) { + /* To calculate timestamp and duration of output fraems, we will hold one + * future frame even though processor device will not use reference */ + self->max_past_frames = 0; + self->max_future_frames = 1; + } else { + /* FIXME: how many frames should be allowed? 
also, this needs to be
+     * configurable */
+    self->max_past_frames = MIN (klass->device_caps.max_past_frames,
+        MAX_NUM_REFERENCES);
+
+    /* As with the bob method, we need at least one future frame for
+     * timestamp/duration calculation */
+    self->max_future_frames =
+        MAX (MIN (klass->device_caps.max_future_frames, MAX_NUM_REFERENCES), 1);
+  }
+
+  return updated;
+}
+
+static void
+gst_d3d11_deinterlace_set_property (GObject * object, guint prop_id,
+    const GValue * value, GParamSpec * pspec)
+{
+  GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (object);
+
+  switch (prop_id) {
+    case PROP_METHOD:{
+      gboolean notify_update = FALSE;
+
+      GST_OBJECT_LOCK (self);
+      self->method = (GstD3D11DeinterlaceMethod) g_value_get_flags (value);
+      notify_update = gst_d3d11_deinterlace_update_method (self);
+      GST_OBJECT_UNLOCK (self);
+
+      if (notify_update)
+        g_object_notify (object, "method");
+      break;
+    }
+    default:
+      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
+      break;
+  }
+}
+
+static void
+gst_d3d11_deinterlace_finalize (GObject * object)
+{
+  GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (object);
+
+  g_rec_mutex_clear (&self->lock);
+
+  G_OBJECT_CLASS (parent_class)->finalize (object);
+}
+
+static void
+gst_d3d11_deinterlace_set_context (GstElement * element, GstContext * context)
+{
+  GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (element);
+  GstD3D11DeinterlaceClass *klass = GST_D3D11_DEINTERLACE_GET_CLASS (self);
+
+  gst_d3d11_handle_set_context (element, context, klass->adapter,
+      &self->device);
+
+  GST_ELEMENT_CLASS (parent_class)->set_context (element, context);
+}
+
+static gboolean
+gst_d3d11_deinterlace_open (GstD3D11Deinterlace * self)
+{
+  ID3D11VideoDevice *video_device;
+  ID3D11VideoContext *video_context;
+
+  video_device = gst_d3d11_device_get_video_device_handle (self->device);
+  if (!video_device) {
+    GST_ERROR_OBJECT (self, "ID3D11VideoDevice is not available");
+    return FALSE;
+  }
+
+  video_context =
gst_d3d11_device_get_video_context_handle (self->device); + if (!video_context) { + GST_ERROR_OBJECT (self, "ID3D11VideoContext is not available"); + return FALSE; + } + + self->video_device = video_device; + video_device->AddRef (); + + self->video_context = video_context; + video_context->AddRef (); + + return TRUE; +} + +/* Must be called with lock taken */ +static void +gst_d3d11_deinterlace_reset_history (GstD3D11Deinterlace * self) +{ + self->input_index = 0; + self->num_output_per_input = 1; + self->num_transformed = 0; + self->first_output = TRUE; + + g_queue_clear_full (&self->past_frame_queue, + (GDestroyNotify) gst_buffer_unref); + g_queue_clear_full (&self->future_frame_queue, + (GDestroyNotify) gst_buffer_unref); + gst_clear_buffer (&self->to_process); +} + +static void +gst_d3d11_deinterlace_reset (GstD3D11Deinterlace * self) +{ + GST_D3D11_DEINTERLACE_LOCK (self); + if (self->fallback_in_pool) { + gst_buffer_pool_set_active (self->fallback_in_pool, FALSE); + gst_object_unref (self->fallback_in_pool); + self->fallback_in_pool = NULL; + } + + if (self->fallback_out_pool) { + gst_buffer_pool_set_active (self->fallback_out_pool, FALSE); + gst_object_unref (self->fallback_out_pool); + self->fallback_out_pool = NULL; + } + + GST_D3D11_CLEAR_COM (self->video_enum); + GST_D3D11_CLEAR_COM (self->video_proc); + + gst_d3d11_deinterlace_reset_history (self); + self->default_buffer_duration = GST_CLOCK_TIME_NONE; + + GST_D3D11_DEINTERLACE_UNLOCK (self); +} + +static void +gst_d3d11_deinterlace_close (GstD3D11Deinterlace * self) +{ + gst_d3d11_deinterlace_reset (self); + + GST_D3D11_CLEAR_COM (self->video_device); + GST_D3D11_CLEAR_COM (self->video_context); + + gst_clear_object (&self->device); +} + +static gboolean +gst_d3d11_deinterlace_start (GstBaseTransform * trans) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstD3D11DeinterlaceClass *klass = GST_D3D11_DEINTERLACE_GET_CLASS (self); + + if (!gst_d3d11_ensure_element_data 
(GST_ELEMENT_CAST (self), klass->adapter, + &self->device)) { + GST_ERROR_OBJECT (self, "Couldn't create d3d11device"); + return FALSE; + } + + if (!gst_d3d11_deinterlace_open (self)) { + GST_ERROR_OBJECT (self, "Couldn't open video device"); + gst_d3d11_deinterlace_close (self); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_deinterlace_stop (GstBaseTransform * trans) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + + gst_d3d11_deinterlace_close (self); + + return TRUE; +} + +static gboolean +gst_d3d11_deinterlace_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT_CAST (self), + query, self->device)) { + return TRUE; + } + break; + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->query (trans, direction, + query); +} + +static GstCaps * +gst_d3d11_deinterlace_remove_interlace_info (GstCaps * caps, + gboolean remove_framerate) +{ + GstStructure *st; + GstCapsFeatures *f; + gint i, n; + GstCaps *res; + GstCapsFeatures *feature = + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); + + res = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + st = gst_caps_get_structure (caps, i); + f = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && gst_caps_is_subset_structure_full (res, st, f)) + continue; + + st = gst_structure_copy (st); + /* Only remove format info for the cases when we can actually convert */ + if (!gst_caps_features_is_any (f) + && gst_caps_features_is_equal (f, feature)) { + if (remove_framerate) { + gst_structure_remove_fields (st, "interlace-mode", "field-order", + "framerate", NULL); + } else { + gst_structure_remove_fields (st, "interlace-mode", 
"field-order", NULL); + } + } + + gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); + } + + gst_caps_features_free (feature); + + return res; +} + +static GstCaps * +gst_d3d11_deinterlace_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstCaps *tmp, *tmp2; + GstCaps *result; + + /* Get all possible caps that we can transform to */ + tmp = gst_d3d11_deinterlace_remove_interlace_info (caps, + /* Non-blend mode will double framerate */ + self->method != GST_D3D11_DEINTERLACE_METHOD_BLEND); + + if (filter) { + tmp2 = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + tmp = tmp2; + } + + result = tmp; + + GST_DEBUG_OBJECT (trans, "transformed %" GST_PTR_FORMAT " into %" + GST_PTR_FORMAT, caps, result); + + return result; +} + +static GstCaps * +gst_d3d11_deinterlace_fixate_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstStructure *s; + GstCaps *tmp; + gint fps_n, fps_d; + GstVideoInfo info; + const gchar *interlace_mode; + + othercaps = gst_caps_truncate (othercaps); + othercaps = gst_caps_make_writable (othercaps); + + if (direction == GST_PAD_SRC) + return gst_caps_fixate (othercaps); + + tmp = gst_caps_copy (caps); + tmp = gst_caps_fixate (tmp); + + if (!gst_video_info_from_caps (&info, tmp)) { + GST_WARNING_OBJECT (self, "Invalid caps %" GST_PTR_FORMAT, caps); + gst_caps_unref (tmp); + + return gst_caps_fixate (othercaps); + } + + s = gst_caps_get_structure (tmp, 0); + if (gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d)) { + /* for non-blend method, output framerate will be doubled */ + if (self->method != GST_D3D11_DEINTERLACE_METHOD_BLEND && + GST_VIDEO_INFO_IS_INTERLACED (&info)) { + fps_n *= 2; + } + + gst_caps_set_simple (othercaps, + "framerate", 
GST_TYPE_FRACTION, fps_n, fps_d, NULL);
+  }
+
+  interlace_mode = gst_structure_get_string (s, "interlace-mode");
+  if (g_strcmp0 ("progressive", interlace_mode) == 0) {
+    /* Just forward interlace-mode=progressive.
+     * That way, basetransform will enable passthrough for non-interlaced
+     * streams */
+    gst_caps_set_simple (othercaps,
+        "interlace-mode", G_TYPE_STRING, "progressive", NULL);
+  }
+
+  gst_caps_unref (tmp);
+
+  return gst_caps_fixate (othercaps);
+}
+
+static gboolean
+gst_d3d11_deinterlace_propose_allocation (GstBaseTransform * trans,
+    GstQuery * decide_query, GstQuery * query)
+{
+  GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans);
+  GstVideoInfo info;
+  GstBufferPool *pool = NULL;
+  GstCaps *caps;
+  guint n_pools, i;
+  GstStructure *config;
+  guint size;
+  GstD3D11AllocationParams *d3d11_params;
+  guint min_buffers = 0;
+
+  if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans,
+          decide_query, query))
+    return FALSE;
+
+  /* passthrough, we're done */
+  if (decide_query == NULL)
+    return TRUE;
+
+  gst_query_parse_allocation (query, &caps, NULL);
+
+  if (caps == NULL)
+    return FALSE;
+
+  if (!gst_video_info_from_caps (&info, caps))
+    return FALSE;
+
+  n_pools = gst_query_get_n_allocation_pools (query);
+  for (i = 0; i < n_pools; i++) {
+    gst_query_parse_nth_allocation_pool (query, i, &pool, NULL, NULL, NULL);
+    if (pool) {
+      if (!GST_IS_D3D11_BUFFER_POOL (pool)) {
+        gst_clear_object (&pool);
+      } else {
+        GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool);
+        if (dpool->device != self->device)
+          gst_clear_object (&pool);
+      }
+    }
+  }
+
+  if (!pool)
+    pool = gst_d3d11_buffer_pool_new (self->device);
+
+  config = gst_buffer_pool_get_config (pool);
+  gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META);
+
+  d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config);
+  if (!d3d11_params) {
+    d3d11_params = gst_d3d11_allocation_params_new (self->device, &info,
+        (GstD3D11AllocationFlags)
0, D3D11_BIND_RENDER_TARGET);
+  } else {
+    d3d11_params->desc[0].BindFlags |= D3D11_BIND_RENDER_TARGET;
+  }
+
+  gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params);
+  gst_d3d11_allocation_params_free (d3d11_params);
+
+  if (self->method == GST_D3D11_DEINTERLACE_METHOD_BOB) {
+    /* For non-blend methods, we will produce two progressive frames from
+     * a single interlaced frame. To determine timestamp and duration,
+     * we might need to hold one past frame if buffer duration is unknown */
+    min_buffers = 2;
+  } else if (self->method == GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE ||
+      self->method == GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION) {
+    /* For advanced deinterlacing methods, we will hold more frames so that
+     * the device can use them as reference frames */
+
+    min_buffers += self->max_past_frames;
+    min_buffers += self->max_future_frames;
+    /* And one for current frame */
+    min_buffers++;
+
+    /* we will hold at least one frame for timestamp/duration calculation */
+    min_buffers = MAX (min_buffers, 2);
+  }
+
+  /* size will be updated by d3d11 buffer pool */
+  gst_buffer_pool_config_set_params (config, caps, 0, min_buffers, 0);
+
+  if (!gst_buffer_pool_set_config (pool, config))
+    goto config_failed;
+
+  gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL);
+  gst_query_add_allocation_meta (query,
+      GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL);
+
+  /* d3d11 buffer pool will update buffer size based on allocated texture,
+   * get size from config again */
+  config = gst_buffer_pool_get_config (pool);
+  gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, nullptr);
+  gst_structure_free (config);
+
+  gst_query_add_allocation_pool (query, pool, size, min_buffers, 0);
+
+  gst_object_unref (pool);
+
+  return TRUE;
+
+  /* ERRORS */
+config_failed:
+  {
+    GST_ERROR_OBJECT (self, "failed to set config");
+    gst_object_unref (pool);
+    return FALSE;
+  }
+}
+
+static gboolean
+gst_d3d11_deinterlace_decide_allocation
(GstBaseTransform * trans, + GstQuery * query) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstCaps *outcaps = NULL; + GstBufferPool *pool = NULL; + guint size, min = 0, max = 0; + GstStructure *config; + GstD3D11AllocationParams *d3d11_params; + gboolean update_pool = FALSE; + GstVideoInfo info; + + gst_query_parse_allocation (query, &outcaps, NULL); + + if (!outcaps) + return FALSE; + + if (!gst_video_info_from_caps (&info, outcaps)) + return FALSE; + + size = GST_VIDEO_INFO_SIZE (&info); + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (pool) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != self->device) + gst_clear_object (&pool); + } + } + + update_pool = TRUE; + } + + if (!pool) + pool = gst_d3d11_buffer_pool_new (self->device); + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (self->device, &info, + (GstD3D11AllocationFlags) 0, D3D11_BIND_RENDER_TARGET); + } else { + d3d11_params->desc[0].BindFlags |= D3D11_BIND_RENDER_TARGET; + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_set_config (pool, config); + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, 
size, min, max);
+  else
+    gst_query_add_allocation_pool (query, pool, size, min, max);
+
+  gst_object_unref (pool);
+
+  return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans,
+      query);
+}
+
+static gboolean
+gst_d3d11_deinterlace_prepare_fallback_pool (GstD3D11Deinterlace * self,
+    GstCaps * in_caps, GstVideoInfo * in_info, GstCaps * out_caps,
+    GstVideoInfo * out_info)
+{
+  GstD3D11AllocationParams *d3d11_params;
+
+  /* Clearing any remaining resources here should be redundant,
+   * but do it just to be safe */
+  g_queue_clear_full (&self->past_frame_queue,
+      (GDestroyNotify) gst_buffer_unref);
+  g_queue_clear_full (&self->future_frame_queue,
+      (GDestroyNotify) gst_buffer_unref);
+
+  if (self->fallback_in_pool) {
+    gst_buffer_pool_set_active (self->fallback_in_pool, FALSE);
+    gst_object_unref (self->fallback_in_pool);
+    self->fallback_in_pool = NULL;
+  }
+
+  if (self->fallback_out_pool) {
+    gst_buffer_pool_set_active (self->fallback_out_pool, FALSE);
+    gst_object_unref (self->fallback_out_pool);
+    self->fallback_out_pool = NULL;
+  }
+
+  /* Empty bind flag is allowed for video processor input */
+  d3d11_params = gst_d3d11_allocation_params_new (self->device, in_info,
+      (GstD3D11AllocationFlags) 0, 0);
+  self->fallback_in_pool = gst_d3d11_buffer_pool_new_with_options (self->device,
+      in_caps, d3d11_params, 0, 0);
+  gst_d3d11_allocation_params_free (d3d11_params);
+
+  if (!self->fallback_in_pool) {
+    GST_ERROR_OBJECT (self, "Failed to create input fallback buffer pool");
+    return FALSE;
+  }
+
+  /* For processor output, render target bind flag is required */
+  d3d11_params = gst_d3d11_allocation_params_new (self->device, out_info,
+      (GstD3D11AllocationFlags) 0, D3D11_BIND_RENDER_TARGET);
+  self->fallback_out_pool =
+      gst_d3d11_buffer_pool_new_with_options (self->device,
+      out_caps, d3d11_params, 0, 0);
+  gst_d3d11_allocation_params_free (d3d11_params);
+
+  if (!self->fallback_out_pool) {
+    GST_ERROR_OBJECT (self, "Failed to create output fallback
buffer pool"); + gst_clear_object (&self->fallback_out_pool); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_deinterlace_set_caps (GstBaseTransform * trans, + GstCaps * incaps, GstCaps * outcaps) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstVideoInfo in_info, out_info; + /* *INDENT-OFF* */ + ComPtr<ID3D11VideoProcessorEnumerator> video_enum; + ComPtr<ID3D11VideoProcessor> video_proc; + /* *INDENT-ON* */ + D3D11_VIDEO_PROCESSOR_CONTENT_DESC desc; + D3D11_VIDEO_PROCESSOR_CAPS proc_caps; + D3D11_VIDEO_PROCESSOR_RATE_CONVERSION_CAPS rate_conv_caps; + D3D11_VIDEO_PROCESSOR_OUTPUT_RATE output_rate = + D3D11_VIDEO_PROCESSOR_OUTPUT_RATE_NORMAL; + HRESULT hr; + RECT rect; + guint i; + + if (gst_base_transform_is_passthrough (trans)) + return TRUE; + + if (!gst_video_info_from_caps (&in_info, incaps)) { + GST_ERROR_OBJECT (self, "Invalid input caps %" GST_PTR_FORMAT, incaps); + return FALSE; + } + + if (!gst_video_info_from_caps (&out_info, outcaps)) { + GST_ERROR_OBJECT (self, "Invalid output caps %" GST_PTR_FORMAT, outcaps); + return FALSE; + } + + self->in_info = in_info; + self->out_info = out_info; + + /* Calculate expected buffer duration. We might need to reference this value + * when buffer duration is unknown */ + if (GST_VIDEO_INFO_FPS_N (&in_info) > 0 && + GST_VIDEO_INFO_FPS_D (&in_info) > 0) { + self->default_buffer_duration = + gst_util_uint64_scale_int (GST_SECOND, GST_VIDEO_INFO_FPS_D (&in_info), + GST_VIDEO_INFO_FPS_N (&in_info)); + } else { + /* Assume 25 fps. 
We need this for reporting latency at least */ + self->default_buffer_duration = + gst_util_uint64_scale_int (GST_SECOND, 1, 25); + } + + gst_d3d11_deinterlace_reset (self); + + /* Nothing to do */ + if (!GST_VIDEO_INFO_IS_INTERLACED (&in_info)) { + gst_base_transform_set_passthrough (trans, TRUE); + + return TRUE; + } + + /* TFF or BFF is not important here, this is just for enumerating + * available deinterlace devices */ + memset (&desc, 0, sizeof (D3D11_VIDEO_PROCESSOR_CONTENT_DESC)); + + desc.InputFrameFormat = D3D11_VIDEO_FRAME_FORMAT_INTERLACED_TOP_FIELD_FIRST; + if (GST_VIDEO_INFO_FIELD_ORDER (&in_info) == + GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST) + desc.InputFrameFormat = + D3D11_VIDEO_FRAME_FORMAT_INTERLACED_BOTTOM_FIELD_FIRST; + desc.InputWidth = GST_VIDEO_INFO_WIDTH (&in_info); + desc.InputHeight = GST_VIDEO_INFO_HEIGHT (&in_info); + desc.OutputWidth = GST_VIDEO_INFO_WIDTH (&out_info); + desc.OutputHeight = GST_VIDEO_INFO_HEIGHT (&out_info); + desc.Usage = D3D11_VIDEO_USAGE_PLAYBACK_NORMAL; + + hr = self->video_device->CreateVideoProcessorEnumerator (&desc, &video_enum); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Couldn't create VideoProcessorEnumerator"); + return FALSE; + } + + hr = video_enum->GetVideoProcessorCaps (&proc_caps); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Couldn't query processor caps"); + return FALSE; + } + + /* Shouldn't happen, we checked this already during plugin_init */ + if (proc_caps.RateConversionCapsCount == 0) { + GST_ERROR_OBJECT (self, "Deinterlacing is not supported"); + return FALSE; + } + + for (i = 0; i < proc_caps.RateConversionCapsCount; i++) { + hr = video_enum->GetVideoProcessorRateConversionCaps (i, &rate_conv_caps); + if (FAILED (hr)) + continue; + + if ((rate_conv_caps.ProcessorCaps & self->method) == self->method) + break; + } + + if (i >= proc_caps.RateConversionCapsCount) { + GST_ERROR_OBJECT (self, "Deinterlacing method 0x%x is not supported", + 
self->method); + return FALSE; + } + + hr = self->video_device->CreateVideoProcessor (video_enum.Get (), + i, &video_proc); + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Couldn't create processor"); + return FALSE; + } + + if (!gst_d3d11_deinterlace_prepare_fallback_pool (self, incaps, &in_info, + outcaps, &out_info)) { + GST_ERROR_OBJECT (self, "Couldn't prepare fallback buffer pool"); + return FALSE; + } + + self->video_enum = video_enum.Detach (); + self->video_proc = video_proc.Detach (); + + rect.left = 0; + rect.top = 0; + rect.right = GST_VIDEO_INFO_WIDTH (&self->in_info); + rect.bottom = GST_VIDEO_INFO_HEIGHT (&self->in_info); + + /* Blending seems to be considered as half rate. See also + * https://docs.microsoft.com/en-us/windows/win32/api/d3d12video/ns-d3d12video-d3d12_video_process_input_stream_rate */ + if (self->method == GST_D3D11_DEINTERLACE_METHOD_BLEND) + output_rate = D3D11_VIDEO_PROCESSOR_OUTPUT_RATE_HALF; + + gst_d3d11_device_lock (self->device); + self->video_context->VideoProcessorSetStreamSourceRect (self->video_proc, + 0, TRUE, &rect); + self->video_context->VideoProcessorSetStreamDestRect (self->video_proc, + 0, TRUE, &rect); + self->video_context->VideoProcessorSetOutputTargetRect (self->video_proc, + TRUE, &rect); + self->video_context-> + VideoProcessorSetStreamAutoProcessingMode (self->video_proc, 0, FALSE); + self->video_context->VideoProcessorSetStreamOutputRate (self->video_proc, 0, + output_rate, TRUE, NULL); + gst_d3d11_device_unlock (self->device); + + return TRUE; +} + +static ID3D11VideoProcessorInputView * +gst_d3d11_deinterace_get_piv_from_buffer (GstD3D11Deinterlace * self, + GstBuffer * buffer) +{ + GstMemory *mem; + GstD3D11Memory *dmem; + ID3D11VideoProcessorInputView *piv; + + if (gst_buffer_n_memory (buffer) != 1) { + GST_WARNING_OBJECT (self, "Input buffer has more than one memory"); + return NULL; + } + + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + 
GST_WARNING_OBJECT (self, "Input buffer is holding non-D3D11 memory");
+    return NULL;
+  }
+
+  dmem = (GstD3D11Memory *) mem;
+  if (dmem->device != self->device) {
+    GST_WARNING_OBJECT (self,
+        "Input D3D11 memory was allocated by other device");
+    return NULL;
+  }
+
+  piv = gst_d3d11_memory_get_processor_input_view (dmem,
+      self->video_device, self->video_enum);
+  if (!piv) {
+    GST_WARNING_OBJECT (self, "ID3D11VideoProcessorInputView is unavailable");
+    return NULL;
+  }
+
+  return piv;
+}
+
+static GstBuffer *
+gst_d3d11_deinterlace_ensure_input_buffer (GstD3D11Deinterlace * self,
+    GstBuffer * input)
+{
+  GstD3D11Memory *dmem;
+  ID3D11VideoProcessorInputView *piv;
+  GstBuffer *new_buf = NULL;
+
+  if (!input)
+    return NULL;
+
+  piv = gst_d3d11_deinterace_get_piv_from_buffer (self, input);
+  if (piv)
+    return input;
+
+  if (!self->fallback_in_pool ||
+      !gst_buffer_pool_set_active (self->fallback_in_pool, TRUE) ||
+      gst_buffer_pool_acquire_buffer (self->fallback_in_pool, &new_buf,
+          NULL) != GST_FLOW_OK) {
+    GST_ERROR_OBJECT (self, "Fallback input buffer is unavailable");
+    gst_buffer_unref (input);
+
+    return NULL;
+  }
+
+  if (!gst_d3d11_buffer_copy_into (new_buf, input, &self->in_info)) {
+    GST_ERROR_OBJECT (self, "Couldn't copy input buffer to fallback buffer");
+    gst_buffer_unref (new_buf);
+    gst_buffer_unref (input);
+
+    return NULL;
+  }
+
+  dmem = (GstD3D11Memory *) gst_buffer_peek_memory (new_buf, 0);
+  piv = gst_d3d11_memory_get_processor_input_view (dmem,
+      self->video_device, self->video_enum);
+  if (!piv) {
+    GST_ERROR_OBJECT (self, "ID3D11VideoProcessorInputView is unavailable");
+    gst_buffer_unref (new_buf);
+    gst_buffer_unref (input);
+
+    return NULL;
+  }
+
+  /* copy metadata, default implementation of baseclass will copy everything
+   * we need */
+  GST_BASE_TRANSFORM_CLASS (parent_class)->copy_metadata
+      (GST_BASE_TRANSFORM_CAST (self), input, new_buf);
+
+  gst_buffer_unref (input);
+
+  return new_buf;
+}
+
+static GstFlowReturn
+gst_d3d11_deinterlace_submit_future_frame (GstD3D11Deinterlace * self, + GstBuffer * buffer) +{ + GstBaseTransform *trans = GST_BASE_TRANSFORM_CAST (self); + guint len; + + /* push tail and pop head, so that head frame can be the nearest frame + * of current frame */ + if (buffer) + g_queue_push_tail (&self->future_frame_queue, buffer); + + len = g_queue_get_length (&self->future_frame_queue); + + g_assert (len <= self->max_future_frames + 1); + + if (self->to_process) { + GST_WARNING_OBJECT (self, "Found uncleared processing buffer"); + gst_clear_buffer (&self->to_process); + } + + if (len > self->max_future_frames || + /* NULL means drain */ + (buffer == NULL && len > 0)) { + GstClockTime cur_timestmap = GST_CLOCK_TIME_NONE; + GstClockTime duration = GST_CLOCK_TIME_NONE; + GstBuffer *next_buf; + + self->to_process = + (GstBuffer *) g_queue_pop_head (&self->future_frame_queue); + + /* For non-blend methods, we will produce two frames from a single + * interlaced frame. So, sufficiently correct buffer duration is required + * to set timestamp for the second output frame */ + if (self->method != GST_D3D11_DEINTERLACE_METHOD_BLEND) { + if (GST_BUFFER_PTS_IS_VALID (self->to_process)) { + cur_timestmap = GST_BUFFER_PTS (self->to_process); + } else { + cur_timestmap = GST_BUFFER_DTS (self->to_process); + } + + /* Ensure buffer duration */ + next_buf = (GstBuffer *) g_queue_peek_head (&self->future_frame_queue); + if (next_buf && GST_CLOCK_STIME_IS_VALID (cur_timestmap)) { + GstClockTime next_timestamp; + + if (GST_BUFFER_PTS_IS_VALID (next_buf)) { + next_timestamp = GST_BUFFER_PTS (next_buf); + } else { + next_timestamp = GST_BUFFER_DTS (next_buf); + } + + if (GST_CLOCK_STIME_IS_VALID (next_timestamp)) { + if (trans->segment.rate >= 0.0 && next_timestamp > cur_timestmap) { + duration = next_timestamp - cur_timestmap; + } else if (trans->segment.rate < 0.0 + && next_timestamp < cur_timestmap) { + duration = cur_timestmap - next_timestamp; + } + } + } + + /* Make sure 
that we can update buffer duration safely */ + self->to_process = gst_buffer_make_writable (self->to_process); + if (GST_CLOCK_TIME_IS_VALID (duration)) { + GST_BUFFER_DURATION (self->to_process) = duration; + } else { + GST_BUFFER_DURATION (self->to_process) = self->default_buffer_duration; + } + + /* Bonus points, DTS doesn't make sense for raw video frame */ + GST_BUFFER_PTS (self->to_process) = cur_timestmap; + GST_BUFFER_DTS (self->to_process) = GST_CLOCK_TIME_NONE; + + /* And mark the number of output frames for this input frame */ + self->num_output_per_input = 2; + } else { + self->num_output_per_input = 1; + } + + self->first_output = TRUE; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_deinterlace_submit_input_buffer (GstBaseTransform * trans, + gboolean is_discont, GstBuffer * input) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstFlowReturn ret; + GstBuffer *buf; + + /* Let baseclass handle QoS first */ + ret = GST_BASE_TRANSFORM_CLASS (parent_class)->submit_input_buffer (trans, + is_discont, input); + if (ret != GST_FLOW_OK) + return ret; + + if (gst_base_transform_is_passthrough (trans)) + return ret; + + /* at this moment, baseclass must hold queued_buf */ + g_assert (trans->queued_buf != NULL); + + /* Check if we can use this buffer directly. 
If not, copy this into + * our fallback buffer */ + buf = trans->queued_buf; + trans->queued_buf = NULL; + + buf = gst_d3d11_deinterlace_ensure_input_buffer (self, buf); + if (!buf) { + GST_ERROR_OBJECT (self, "Invalid input buffer"); + return GST_FLOW_ERROR; + } + + return gst_d3d11_deinterlace_submit_future_frame (self, buf); +} + +static ID3D11VideoProcessorOutputView * +gst_d3d11_deinterace_get_pov_from_buffer (GstD3D11Deinterlace * self, + GstBuffer * buffer) +{ + GstMemory *mem; + GstD3D11Memory *dmem; + ID3D11VideoProcessorOutputView *pov; + + if (gst_buffer_n_memory (buffer) != 1) { + GST_WARNING_OBJECT (self, "Output buffer has more than one memory"); + return NULL; + } + + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + GST_WARNING_OBJECT (self, "Output buffer is holding non-D3D11 memory"); + return NULL; + } + + dmem = (GstD3D11Memory *) mem; + if (dmem->device != self->device) { + GST_WARNING_OBJECT (self, + "Output D3D11 memory was allocated by other device"); + return NULL; + } + + pov = gst_d3d11_memory_get_processor_output_view (dmem, + self->video_device, self->video_enum); + if (!pov) { + GST_WARNING_OBJECT (self, "ID3D11VideoProcessorOutputView is unavailable"); + return NULL; + } + + return pov; +} + +static GstBuffer * +gst_d3d11_deinterlace_ensure_output_buffer (GstD3D11Deinterlace * self, + GstBuffer * output) +{ + GstD3D11Memory *dmem; + ID3D11VideoProcessorOutputView *pov; + GstBuffer *new_buf = NULL; + + pov = gst_d3d11_deinterace_get_pov_from_buffer (self, output); + if (pov) + return output; + + if (!self->fallback_out_pool || + !gst_buffer_pool_set_active (self->fallback_out_pool, TRUE) || + gst_buffer_pool_acquire_buffer (self->fallback_out_pool, &new_buf, + NULL) != GST_FLOW_OK) { + GST_ERROR_OBJECT (self, "Fallback output buffer is unavailable"); + gst_buffer_unref (output); + + return NULL; + } + + dmem = (GstD3D11Memory *) gst_buffer_peek_memory (new_buf, 0); + pov = 
gst_d3d11_memory_get_processor_output_view (dmem,
+      self->video_device, self->video_enum);
+  if (!pov) {
+    GST_ERROR_OBJECT (self, "ID3D11VideoProcessorOutputView is unavailable");
+    gst_buffer_unref (new_buf);
+    gst_buffer_unref (output);
+
+    return NULL;
+  }
+
+  /* copy metadata, default implementation of baseclass will copy everything
+   * we need */
+  GST_BASE_TRANSFORM_CLASS (parent_class)->copy_metadata
+      (GST_BASE_TRANSFORM_CAST (self), output, new_buf);
+
+  gst_buffer_unref (output);
+
+  return new_buf;
+}
+
+static GstFlowReturn
+gst_d3d11_deinterlace_submit_past_frame (GstD3D11Deinterlace * self,
+    GstBuffer * buffer)
+{
+  /* push head and pop tail, so that head frame can be the nearest frame
+   * of current frame */
+  g_queue_push_head (&self->past_frame_queue, buffer);
+  while (g_queue_get_length (&self->past_frame_queue) > self->max_past_frames) {
+    GstBuffer *to_drop =
+        (GstBuffer *) g_queue_pop_tail (&self->past_frame_queue);
+
+    if (to_drop)
+      gst_buffer_unref (to_drop);
+  }
+
+  return GST_FLOW_OK;
+}
+
+static GstFlowReturn
+gst_d3d11_deinterlace_generate_output (GstBaseTransform * trans,
+    GstBuffer ** outbuf)
+{
+  GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans);
+  GstFlowReturn ret = GST_FLOW_OK;
+  GstBuffer *inbuf;
+  GstBuffer *buf = NULL;
+
+  if (gst_base_transform_is_passthrough (trans)) {
+    return GST_BASE_TRANSFORM_CLASS (parent_class)->generate_output (trans,
+        outbuf);
+  }
+
+  *outbuf = NULL;
+  inbuf = self->to_process;
+  if (inbuf == NULL)
+    return GST_FLOW_OK;
+
+  ret =
+      GST_BASE_TRANSFORM_CLASS (parent_class)->prepare_output_buffer (trans,
+      inbuf, &buf);
+
+  if (ret != GST_FLOW_OK || !buf) {
+    GST_WARNING_OBJECT (trans, "could not get buffer from pool: %s",
+        gst_flow_get_name (ret));
+
+    return ret;
+  }
+
+  g_assert (inbuf != buf);
+
+  buf = gst_d3d11_deinterlace_ensure_output_buffer (self, buf);
+  if (!buf) {
+    GST_ERROR_OBJECT (self, "Failed to allocate output buffer to process");
+
+    return GST_FLOW_ERROR;
+  }
+
+  ret
= gst_d3d11_deinterlace_transform (trans, inbuf, buf);
+  if (ret != GST_FLOW_OK) {
+    gst_buffer_unref (buf);
+    return ret;
+  }
+
+  g_assert (self->num_output_per_input == 1 || self->num_output_per_input == 2);
+
+  /* Update timestamp and buffer duration.
+   * Here, PTS and duration of inbuf must be valid,
+   * unless there's a programming error, since we updated timestamp and
+   * duration already around submit_input_buffer() */
+  if (self->num_output_per_input == 2) {
+    if (!GST_BUFFER_DURATION_IS_VALID (inbuf)) {
+      GST_LOG_OBJECT (self, "Input buffer duration is unknown");
+    } else if (!GST_BUFFER_PTS_IS_VALID (inbuf)) {
+      GST_LOG_OBJECT (self, "Input buffer timestamp is unknown");
+    } else {
+      GstClockTime duration = GST_BUFFER_DURATION (inbuf) / 2;
+      gboolean second_field = FALSE;
+
+      if (self->first_output) {
+        /* For reverse playback, first output is the second field */
+        if (trans->segment.rate < 0)
+          second_field = TRUE;
+        else
+          second_field = FALSE;
+      } else {
+        if (trans->segment.rate < 0)
+          second_field = FALSE;
+        else
+          second_field = TRUE;
+      }
+
+      GST_BUFFER_DURATION (buf) = duration;
+      if (second_field) {
+        GST_BUFFER_PTS (buf) = GST_BUFFER_PTS (buf) + duration;
+      }
+    }
+  }
+
+  *outbuf = buf;
+  self->first_output = FALSE;
+  self->num_transformed++;
+  /* https://docs.microsoft.com/en-us/windows/win32/api/d3d12video/ns-d3d12video-d3d12_video_process_input_stream_rate */
+  if (self->method == GST_D3D11_DEINTERLACE_METHOD_BLEND) {
+    self->input_index += 2;
+  } else {
+    self->input_index++;
+  }
+
+  if (self->num_output_per_input <= self->num_transformed) {
+    /* Move processed frame to past_frame queue */
+    gst_d3d11_deinterlace_submit_past_frame (self, self->to_process);
+    self->to_process = NULL;
+  }
+
+  return ret;
+}
+
+static GstFlowReturn
+gst_d3d11_deinterlace_transform (GstBaseTransform * trans, GstBuffer * inbuf,
+    GstBuffer * outbuf)
+{
+  GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans);
+  ID3D11VideoProcessorInputView *piv;
+ 
ID3D11VideoProcessorOutputView *pov; + D3D11_VIDEO_FRAME_FORMAT frame_foramt = D3D11_VIDEO_FRAME_FORMAT_PROGRESSIVE; + D3D11_VIDEO_PROCESSOR_STREAM proc_stream = { 0, }; + ID3D11VideoProcessorInputView *future_surfaces[MAX_NUM_REFERENCES] = + { NULL, }; + ID3D11VideoProcessorInputView *past_surfaces[MAX_NUM_REFERENCES] = { NULL, }; + guint future_frames = 0; + guint past_frames = 0; + HRESULT hr; + guint i; + + /* Input/output buffer must be holding valid D3D11 memory here, + * as we checked it already in submit_input_buffer() and generate_output() */ + piv = gst_d3d11_deinterace_get_piv_from_buffer (self, inbuf); + if (!piv) { + GST_ERROR_OBJECT (self, "ID3D11VideoProcessorInputView is unavailable"); + return GST_FLOW_ERROR; + } + + pov = gst_d3d11_deinterace_get_pov_from_buffer (self, outbuf); + if (!pov) { + GST_ERROR_OBJECT (self, "ID3D11VideoProcessorOutputView is unavailable"); + return GST_FLOW_ERROR; + } + + /* Check field order */ + if (GST_VIDEO_INFO_INTERLACE_MODE (&self->in_info) == + GST_VIDEO_INTERLACE_MODE_MIXED || + (GST_VIDEO_INFO_INTERLACE_MODE (&self->in_info) == + GST_VIDEO_INTERLACE_MODE_INTERLEAVED && + GST_VIDEO_INFO_FIELD_ORDER (&self->in_info) == + GST_VIDEO_FIELD_ORDER_UNKNOWN)) { + if (!GST_BUFFER_FLAG_IS_SET (inbuf, GST_VIDEO_BUFFER_FLAG_INTERLACED)) { + frame_foramt = D3D11_VIDEO_FRAME_FORMAT_PROGRESSIVE; + } else if (GST_BUFFER_FLAG_IS_SET (inbuf, GST_VIDEO_BUFFER_FLAG_TFF)) { + frame_foramt = D3D11_VIDEO_FRAME_FORMAT_INTERLACED_TOP_FIELD_FIRST; + } else { + frame_foramt = D3D11_VIDEO_FRAME_FORMAT_INTERLACED_BOTTOM_FIELD_FIRST; + } + } else if (GST_VIDEO_INFO_FIELD_ORDER (&self->in_info) == + GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST) { + frame_foramt = D3D11_VIDEO_FRAME_FORMAT_INTERLACED_TOP_FIELD_FIRST; + } else if (GST_VIDEO_INFO_FIELD_ORDER (&self->in_info) == + GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST) { + frame_foramt = D3D11_VIDEO_FRAME_FORMAT_INTERLACED_BOTTOM_FIELD_FIRST; + } + + if (frame_foramt == 
D3D11_VIDEO_FRAME_FORMAT_PROGRESSIVE) { + /* A progressive stream will produce only one output frame per input frame */ + self->num_output_per_input = 1; + } else if (self->method != GST_D3D11_DEINTERLACE_METHOD_BLEND && + self->method != GST_D3D11_DEINTERLACE_METHOD_BOB) { + /* Fill reference frames */ + for (i = 0; i < g_queue_get_length (&self->future_frame_queue) && + i < G_N_ELEMENTS (future_surfaces); i++) { + GstBuffer *future_buf; + ID3D11VideoProcessorInputView *future_piv; + + future_buf = + (GstBuffer *) g_queue_peek_nth (&self->future_frame_queue, i); + future_piv = gst_d3d11_deinterace_get_piv_from_buffer (self, future_buf); + if (!future_piv) { + GST_WARNING_OBJECT (self, + "Couldn't get ID3D11VideoProcessorInputView from future " + "reference %d", i); + break; + } + + future_surfaces[i] = future_piv; + future_frames++; + } + + for (i = 0; i < g_queue_get_length (&self->past_frame_queue) && + i < G_N_ELEMENTS (past_surfaces); i++) { + GstBuffer *past_buf; + ID3D11VideoProcessorInputView *past_piv; + + past_buf = (GstBuffer *) g_queue_peek_nth (&self->past_frame_queue, i); + past_piv = gst_d3d11_deinterace_get_piv_from_buffer (self, past_buf); + if (!past_piv) { + GST_WARNING_OBJECT (self, + "Couldn't get ID3D11VideoProcessorInputView from past " + "reference %d", i); + break; + } + + past_surfaces[i] = past_piv; + past_frames++; + } + } + + proc_stream.Enable = TRUE; + proc_stream.pInputSurface = piv; + proc_stream.InputFrameOrField = self->input_index; + /* FIXME: This is wrong for the inverse telecine case */ + /* OutputIndex == 0 for the first field, and 1 for the second field */ + if (self->num_output_per_input == 2) { + if (trans->segment.rate < 0.0) { + /* Process the second frame first in case of reverse playback */ + proc_stream.OutputIndex = self->first_output ? 1 : 0; + } else { + proc_stream.OutputIndex = self->first_output ? 
0 : 1; + } + } else { + proc_stream.OutputIndex = 0; + } + + if (future_frames) { + proc_stream.FutureFrames = future_frames; + proc_stream.ppFutureSurfaces = future_surfaces; + } + + if (past_frames) { + proc_stream.PastFrames = past_frames; + proc_stream.ppPastSurfaces = past_surfaces; + } + + gst_d3d11_device_lock (self->device); + self->video_context->VideoProcessorSetStreamFrameFormat (self->video_proc, 0, + frame_foramt); + + hr = self->video_context->VideoProcessorBlt (self->video_proc, pov, 0, + 1, &proc_stream); + gst_d3d11_device_unlock (self->device); + + if (!gst_d3d11_result (hr, self->device)) { + GST_ERROR_OBJECT (self, "Failed to perform deinterlacing"); + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static gboolean +gst_d3d11_deinterlace_sink_event (GstBaseTransform * trans, GstEvent * event) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_STREAM_START: + /* stream-start means discont stream from previous one. 
Drain pending + * frames if any */ + GST_DEBUG_OBJECT (self, "Have stream-start, drain frames if any"); + gst_d3d11_deinterlace_drain (self); + break; + case GST_EVENT_CAPS:{ + GstPad *sinkpad = GST_BASE_TRANSFORM_SINK_PAD (trans); + GstCaps *prev_caps; + + prev_caps = gst_pad_get_current_caps (sinkpad); + if (prev_caps) { + GstCaps *caps; + gst_event_parse_caps (event, &caps); + /* If caps is updated, drain pending frames */ + if (!gst_caps_is_equal (prev_caps, caps)) { + GST_DEBUG_OBJECT (self, "Caps updated from %" GST_PTR_FORMAT " to %" + GST_PTR_FORMAT, prev_caps, caps); + gst_d3d11_deinterlace_drain (self); + } + + gst_caps_unref (prev_caps); + } + break; + } + case GST_EVENT_SEGMENT: + /* a new segment means a temporal discontinuity */ + case GST_EVENT_SEGMENT_DONE: + case GST_EVENT_EOS: + GST_DEBUG_OBJECT (self, "Have event %s, drain frames if any", + GST_EVENT_TYPE_NAME (event)); + gst_d3d11_deinterlace_drain (self); + break; + case GST_EVENT_FLUSH_STOP: + GST_D3D11_DEINTERLACE_LOCK (self); + gst_d3d11_deinterlace_reset_history (self); + GST_D3D11_DEINTERLACE_UNLOCK (self); + break; + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->sink_event (trans, event); +} + +static void +gst_d3d11_deinterlace_before_transform (GstBaseTransform * trans, + GstBuffer * buffer) +{ + GstD3D11Deinterlace *self = GST_D3D11_DEINTERLACE (trans); + GstD3D11DeinterlaceClass *klass = GST_D3D11_DEINTERLACE_GET_CLASS (self); + GstD3D11Memory *dmem; + GstMemory *mem; + GstCaps *in_caps = NULL; + GstCaps *out_caps = NULL; + guint adapter = 0; + + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + GST_ELEMENT_ERROR (self, CORE, FAILED, (NULL), ("Invalid memory")); + return; + } + + dmem = GST_D3D11_MEMORY_CAST (mem); + /* Same device, nothing to do */ + if (dmem->device == self->device) + return; + + g_object_get (dmem->device, "adapter", &adapter, NULL); + /* We have per-GPU deinterlace elements because of different 
capability + * per GPU, so we cannot accept another GPU at the moment */ + if (adapter != klass->adapter) + return; + + GST_INFO_OBJECT (self, "Updating device %" GST_PTR_FORMAT " -> %" + GST_PTR_FORMAT, self->device, dmem->device); + + /* Drain buffers before updating device */ + gst_d3d11_deinterlace_drain (self); + + gst_object_unref (self->device); + self->device = (GstD3D11Device *) gst_object_ref (dmem->device); + + in_caps = gst_pad_get_current_caps (GST_BASE_TRANSFORM_SINK_PAD (trans)); + if (!in_caps) { + GST_WARNING_OBJECT (self, "sinkpad has null caps"); + goto out; + } + + out_caps = gst_pad_get_current_caps (GST_BASE_TRANSFORM_SRC_PAD (trans)); + if (!out_caps) { + GST_WARNING_OBJECT (self, "Has no configured output caps"); + goto out; + } + + gst_d3d11_deinterlace_set_caps (trans, in_caps, out_caps); + + /* Mark reconfigure so that we can update pool */ + gst_base_transform_reconfigure_src (trans); + +out: + gst_clear_caps (&in_caps); + gst_clear_caps (&out_caps); + + return; +} + +/* FIXME: might be the job of basetransform */ +static GstFlowReturn +gst_d3d11_deinterlace_drain (GstD3D11Deinterlace * self) +{ + GstBaseTransform *trans = GST_BASE_TRANSFORM_CAST (self); + GstFlowReturn ret = GST_FLOW_OK; + GstBuffer *outbuf = NULL; + + GST_D3D11_DEINTERLACE_LOCK (self); + if (gst_base_transform_is_passthrough (trans)) { + /* If we were passthrough, nothing to do */ + goto done; + } else if (!g_queue_get_length (&self->future_frame_queue)) { + /* No pending data, nothing to do */ + goto done; + } + + while (g_queue_get_length (&self->future_frame_queue)) { + gst_d3d11_deinterlace_submit_future_frame (self, NULL); + if (!self->to_process) + break; + + do { + outbuf = NULL; + + ret = gst_d3d11_deinterlace_generate_output (trans, &outbuf); + if (outbuf != NULL) { + /* Release lock during push buffer */ + GST_D3D11_DEINTERLACE_UNLOCK (self); + ret = gst_pad_push (trans->srcpad, outbuf); + GST_D3D11_DEINTERLACE_LOCK (self); + } + } while (ret == GST_FLOW_OK && outbuf 
!= NULL); + } + +done: + gst_d3d11_deinterlace_reset_history (self); + GST_D3D11_DEINTERLACE_UNLOCK (self); + + return ret; +} + +/** + * SECTION:element-d3d11deinterlace + * @title: d3d11deinterlace + * @short_description: A Direct3D11 based deinterlace element + * + * Deinterlacing interlaced video frames to progressive video frames by using + * ID3D11VideoProcessor API. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/h264/file ! parsebin ! d3d11h264dec ! d3d11deinterlace ! d3d11videosink + * ``` + * + * Since: 1.20 + * + */ + +/* GstD3D11DeinterlaceBin */ +enum +{ + PROP_BIN_0, + /* basetransform */ + PROP_BIN_QOS, + /* deinterlace */ + PROP_BIN_ADAPTER, + PROP_BIN_DEVICE_ID, + PROP_BIN_VENDOR_ID, + PROP_BIN_METHOD, + PROP_BIN_SUPPORTED_METHODS, +}; + +typedef struct _GstD3D11DeinterlaceBin +{ + GstBin parent; + + GstPad *sinkpad; + GstPad *srcpad; + + GstElement *deinterlace; + GstElement *in_convert; + GstElement *out_convert; + GstElement *upload; + GstElement *download; +} GstD3D11DeinterlaceBin; + +typedef struct _GstD3D11DeinterlaceBinClass +{ + GstBinClass parent_class; + + guint adapter; + GType child_type; +} GstD3D11DeinterlaceBinClass; + +static GstElementClass *bin_parent_class = NULL; +#define GST_D3D11_DEINTERLACE_BIN(object) ((GstD3D11DeinterlaceBin *) (object)) +#define GST_D3D11_DEINTERLACE_BIN_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object), \ + GstD3D11DeinterlaceBinClass)) + +#define GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE(format) \ + "video/x-raw, " \ + "format = (string) " format ", " \ + "width = (int) [1, 8192], " \ + "height = (int) [1, 8192] " + +#define GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES(features,format) \ + "video/x-raw(" features "), " \ + "format = (string) " format ", " \ + "width = (int) [1, 8192], " \ + "height = (int) [1, 8192] " + +static GstStaticPadTemplate bin_sink_template_caps = + GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + 
GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_SINK_FORMATS) "; " + GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SINK_FORMATS) "; " + GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE (GST_D3D11_SINK_FORMATS) "; " + GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SINK_FORMATS) + )); + +static GstStaticPadTemplate bin_src_template_caps = + GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_SRC_FORMATS) "; " + GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SRC_FORMATS) "; " + GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE (GST_D3D11_SRC_FORMATS) "; " + GST_D3D11_DEINTERLACE_BIN_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SRC_FORMATS) + )); + +static void gst_d3d11_deinterlace_bin_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_d3d11_deinterlace_bin_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); + +static void +gst_d3d11_deinterlace_bin_class_init (GstD3D11DeinterlaceBinClass * klass, + gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstD3D11DeinterlaceClassData *cdata = (GstD3D11DeinterlaceClassData *) data; + gchar *long_name; + + bin_parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + + gobject_class->get_property = 
gst_d3d11_deinterlace_bin_get_property; + gobject_class->set_property = gst_d3d11_deinterlace_bin_set_property; + + /* basetransform */ + g_object_class_install_property (gobject_class, PROP_BIN_QOS, + g_param_spec_boolean ("qos", "QoS", "Handle Quality-of-Service events", + FALSE, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /* deinterlace */ + g_object_class_install_property (gobject_class, PROP_BIN_ADAPTER, + g_param_spec_uint ("adapter", "Adapter", + "DXGI Adapter index for creating device", + 0, G_MAXUINT32, cdata->adapter, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_BIN_DEVICE_ID, + g_param_spec_uint ("device-id", "Device Id", + "DXGI Device ID", 0, G_MAXUINT32, 0, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_BIN_VENDOR_ID, + g_param_spec_uint ("vendor-id", "Vendor Id", + "DXGI Vendor ID", 0, G_MAXUINT32, 0, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_BIN_METHOD, + g_param_spec_flags ("method", "Method", + "Deinterlace Method. The user can set multiple methods as a flagset " + "and the element will select one of the methods automatically. 
" + "If deinterlacing device failed to deinterlace with given mode, " + "fallback might happen by the device", + GST_TYPE_D3D11_DEINTERLACE_METHOD, cdata->device_caps.default_method, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | + GST_PARAM_MUTABLE_READY))); + g_object_class_install_property (gobject_class, PROP_BIN_SUPPORTED_METHODS, + g_param_spec_flags ("supported-methods", "Supported Methods", + "Set of supported deinterlace methods by device", + GST_TYPE_D3D11_DEINTERLACE_METHOD, + cdata->device_caps.supported_methods, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + long_name = g_strdup_printf ("Direct3D11 %s Deinterlacer Bin", + cdata->description); + gst_element_class_set_metadata (element_class, long_name, + "Filter/Effect/Video/Deinterlace/Hardware", + "A Direct3D11 based deinterlacer bin", + "Seungha Yang <seungha@centricular.com>"); + g_free (long_name); + + gst_element_class_add_static_pad_template (element_class, + &bin_sink_template_caps); + gst_element_class_add_static_pad_template (element_class, + &bin_src_template_caps); + + klass->adapter = cdata->adapter; + klass->child_type = cdata->deinterlace_type; + + gst_d3d11_deinterlace_class_data_unref (cdata); +} + +static void +gst_d3d11_deinterlace_bin_init (GstD3D11DeinterlaceBin * self) +{ + GstD3D11DeinterlaceBinClass *klass = + GST_D3D11_DEINTERLACE_BIN_GET_CLASS (self); + GstPad *pad; + + self->deinterlace = (GstElement *) g_object_new (klass->child_type, + "name", "deinterlace", NULL); + self->in_convert = gst_element_factory_make ("d3d11colorconvert", NULL); + self->out_convert = gst_element_factory_make ("d3d11colorconvert", NULL); + self->upload = gst_element_factory_make ("d3d11upload", NULL); + self->download = gst_element_factory_make ("d3d11download", NULL); + + /* Specify DXGI adapter index to use */ + g_object_set (G_OBJECT (self->in_convert), "adapter", klass->adapter, NULL); + g_object_set (G_OBJECT (self->out_convert), "adapter", klass->adapter, 
NULL); + g_object_set (G_OBJECT (self->upload), "adapter", klass->adapter, NULL); + g_object_set (G_OBJECT (self->download), "adapter", klass->adapter, NULL); + + gst_bin_add_many (GST_BIN_CAST (self), self->upload, self->in_convert, + self->deinterlace, self->out_convert, self->download, NULL); + gst_element_link_many (self->upload, self->in_convert, self->deinterlace, + self->out_convert, self->download, NULL); + + pad = gst_element_get_static_pad (self->upload, "sink"); + self->sinkpad = gst_ghost_pad_new ("sink", pad); + gst_element_add_pad (GST_ELEMENT_CAST (self), self->sinkpad); + gst_object_unref (pad); + + pad = gst_element_get_static_pad (self->download, "src"); + self->srcpad = gst_ghost_pad_new ("src", pad); + gst_element_add_pad (GST_ELEMENT_CAST (self), self->srcpad); + gst_object_unref (pad); +} + +static void +gst_d3d11_deinterlace_bin_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11DeinterlaceBin *self = GST_D3D11_DEINTERLACE_BIN (object); + + g_object_set_property (G_OBJECT (self->deinterlace), pspec->name, value); +} + +static void +gst_d3d11_deinterlace_bin_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11DeinterlaceBin *self = GST_D3D11_DEINTERLACE_BIN (object); + + g_object_get_property (G_OBJECT (self->deinterlace), pspec->name, value); +} + +void +gst_d3d11_deinterlace_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank) +{ + GType type; + GType bin_type; + gchar *type_name; + gchar *feature_name; + guint index = 0; + GTypeInfo type_info = { + sizeof (GstD3D11DeinterlaceClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_deinterlace_class_init, + NULL, + NULL, + sizeof (GstD3D11Deinterlace), + 0, + (GInstanceInitFunc) gst_d3d11_deinterlace_init, + }; + GTypeInfo bin_type_info = { + sizeof (GstD3D11DeinterlaceBinClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_deinterlace_bin_class_init, + NULL, + NULL, + sizeof 
(GstD3D11DeinterlaceBin), + 0, + (GInstanceInitFunc) gst_d3d11_deinterlace_bin_init, + }; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + GstCaps *caps = NULL; + GstCapsFeatures *caps_features; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + /* *INDENT-OFF* */ + ComPtr<ID3D11VideoDevice> video_device; + ComPtr<ID3D11VideoContext> video_context; + ComPtr<ID3D11VideoProcessorEnumerator> video_proc_enum; + ComPtr<ID3D11VideoProcessorEnumerator1> video_proc_enum1; + /* *INDENT-ON* */ + HRESULT hr; + D3D11_VIDEO_PROCESSOR_CONTENT_DESC desc; + D3D11_VIDEO_PROCESSOR_CAPS proc_caps = { 0, }; + UINT supported_methods = 0; + GstD3D11DeinterlaceMethod default_method; + gboolean blend; + gboolean bob; + gboolean adaptive; + gboolean mocomp; + /* NOTE: the processor might be able to handle other formats. + * However, not all YUV formats can be used as render targets. + * For instance, the DXGI_FORMAT_Y210 and DXGI_FORMAT_Y410 formats cannot be + * render targets. In practice, interlaced streams are the output of video + * decoders, so NV12/P010/P016 cover most real-world use cases. 
+ */ + DXGI_FORMAT formats_to_check[] = { + DXGI_FORMAT_NV12, /* NV12 */ + DXGI_FORMAT_P010, /* P010_10LE */ + DXGI_FORMAT_P016, /* P016_LE */ + }; + GValue *supported_formats = NULL; + GstD3D11DeinterlaceClassData *cdata; + guint max_past_frames = 0; + guint max_future_frames = 0; + guint i; + + device_handle = gst_d3d11_device_get_device_handle (device); + context_handle = gst_d3d11_device_get_device_context_handle (device); + + hr = device_handle->QueryInterface (IID_PPV_ARGS (&video_device)); + if (!gst_d3d11_result (hr, device)) + return; + + hr = context_handle->QueryInterface (IID_PPV_ARGS (&video_context)); + if (!gst_d3d11_result (hr, device)) + return; + + memset (&desc, 0, sizeof (D3D11_VIDEO_PROCESSOR_CONTENT_DESC)); + desc.InputFrameFormat = D3D11_VIDEO_FRAME_FORMAT_INTERLACED_TOP_FIELD_FIRST; + desc.InputWidth = 320; + desc.InputHeight = 240; + desc.OutputWidth = 320; + desc.OutputHeight = 240; + desc.Usage = D3D11_VIDEO_USAGE_PLAYBACK_NORMAL; + + hr = video_device->CreateVideoProcessorEnumerator (&desc, &video_proc_enum); + if (!gst_d3d11_result (hr, device)) + return; + + /* We need ID3D11VideoProcessorEnumerator1 interface to check conversion + * capability of device via CheckVideoProcessorFormatConversion() */ + hr = video_proc_enum.As (&video_proc_enum1); + if (!gst_d3d11_result (hr, device)) + return; + + hr = video_proc_enum->GetVideoProcessorCaps (&proc_caps); + if (!gst_d3d11_result (hr, device)) + return; + + for (i = 0; i < proc_caps.RateConversionCapsCount; i++) { + D3D11_VIDEO_PROCESSOR_RATE_CONVERSION_CAPS rate_conv_caps = { 0, }; + + hr = video_proc_enum->GetVideoProcessorRateConversionCaps (i, + &rate_conv_caps); + if (FAILED (hr)) + continue; + + supported_methods |= rate_conv_caps.ProcessorCaps; + max_past_frames = MAX (max_past_frames, rate_conv_caps.PastFrames); + max_future_frames = MAX (max_future_frames, rate_conv_caps.FutureFrames); + } + + if (supported_methods == 0) + return; + +#define IS_SUPPORTED_METHOD(flags,val) (flags & 
val) == val + blend = IS_SUPPORTED_METHOD (supported_methods, + GST_D3D11_DEINTERLACE_METHOD_BLEND); + bob = IS_SUPPORTED_METHOD (supported_methods, + GST_D3D11_DEINTERLACE_METHOD_BOB); + adaptive = IS_SUPPORTED_METHOD (supported_methods, + GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE); + mocomp = IS_SUPPORTED_METHOD (supported_methods, + GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION); +#undef IS_SUPPORTED_METHOD + + if (!blend && !bob && !adaptive && !mocomp) + return; + + /* Drop all unsupported methods from the flags */ + supported_methods = supported_methods & + (GST_D3D11_DEINTERLACE_METHOD_BLEND | GST_D3D11_DEINTERLACE_METHOD_BOB | + GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE | + GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION); + + /* Prefer bob; it's equivalent to "linear", which is the default mode of + * the software deinterlace element, and it's also the fallback mode + * for our "adaptive" and "mocomp" modes. Note that since Direct3D12, "blend" + * mode is no longer supported; instead, "bob" and "custom" modes are supported + * by Direct3D12 */ + if (bob) { + default_method = GST_D3D11_DEINTERLACE_METHOD_BOB; + } else if (adaptive) { + default_method = GST_D3D11_DEINTERLACE_METHOD_ADAPTVIE; + } else if (mocomp) { + default_method = GST_D3D11_DEINTERLACE_METHOD_MOTION_COMPENSATION; + } else if (blend) { + default_method = GST_D3D11_DEINTERLACE_METHOD_BLEND; + } else { + /* Programming error */ + g_return_if_reached (); + } + + for (i = 0; i < G_N_ELEMENTS (formats_to_check); i++) { + UINT flags = 0; + GValue val = G_VALUE_INIT; + GstVideoFormat format; + BOOL supported = FALSE; + + hr = video_proc_enum->CheckVideoProcessorFormat (formats_to_check[i], + &flags); + if (FAILED (hr)) + continue; + + /* The D3D11 video processor can support other conversions at once, + * including color format conversion. + * But not all combinations of in/out pairs can be supported. 
+ * To make things simple, this element will do only deinterlacing + * (might not be optimal in terms of processing power/resource though) */ + + /* D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_INPUT = 0x1, + * D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_OUTPUT = 0x2, + * The MinGW header might not define the above enum values */ + if ((flags & 0x3) != 0x3) + continue; + + format = gst_d3d11_dxgi_format_to_gst (formats_to_check[i]); + /* This is a programming error! */ + if (format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ERROR ("Couldn't convert DXGI format %d to video format", + formats_to_check[i]); + continue; + } + + hr = video_proc_enum1->CheckVideoProcessorFormatConversion + (formats_to_check[i], DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P709, + formats_to_check[i], DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P709, + &supported); + if (FAILED (hr) || !supported) + continue; + + if (!supported_formats) { + supported_formats = g_new0 (GValue, 1); + g_value_init (supported_formats, GST_TYPE_LIST); + } + + if (formats_to_check[i] == DXGI_FORMAT_P016) { + /* This is used for P012 as well */ + g_value_init (&val, G_TYPE_STRING); + g_value_set_static_string (&val, + gst_video_format_to_string (GST_VIDEO_FORMAT_P012_LE)); + gst_value_list_append_and_take_value (supported_formats, &val); + } + + g_value_init (&val, G_TYPE_STRING); + g_value_set_static_string (&val, gst_video_format_to_string (format)); + gst_value_list_append_and_take_value (supported_formats, &val); + } + + if (!supported_formats) + return; + + caps = gst_caps_new_empty_simple ("video/x-raw"); + /* FIXME: Check supported resolution, it would be different from + * supported max texture dimension */ + gst_caps_set_simple (caps, + "width", GST_TYPE_INT_RANGE, 1, 8192, + "height", GST_TYPE_INT_RANGE, 1, 8192, NULL); + gst_caps_set_value (caps, "format", supported_formats); + g_value_unset (supported_formats); + g_free (supported_formats); + + /* TODO: Add alternating deinterlace */ + src_caps = gst_caps_copy (caps); + 
caps_features = gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, + NULL); + gst_caps_set_features_simple (src_caps, caps_features); + + caps_features = gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, NULL); + gst_caps_set_features_simple (caps, caps_features); + gst_caps_append (src_caps, caps); + + sink_caps = gst_caps_copy (src_caps); + + GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + cdata = gst_d3d11_deinterlace_class_data_new (); + cdata->sink_caps = sink_caps; + cdata->src_caps = src_caps; + cdata->device_caps.supported_methods = + (GstD3D11DeinterlaceMethod) supported_methods; + cdata->device_caps.default_method = default_method; + cdata->device_caps.max_past_frames = max_past_frames; + cdata->device_caps.max_future_frames = max_future_frames; + + g_object_get (device, "adapter", &cdata->adapter, + "device-id", &cdata->device_id, "vendor-id", &cdata->vendor_id, + "description", &cdata->description, NULL); + type_info.class_data = cdata; + bin_type_info.class_data = gst_d3d11_deinterlace_class_data_ref (cdata); + + type_name = g_strdup ("GstD3D11Deinterlace"); + feature_name = g_strdup ("d3d11deinterlaceelement"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11Device%dDeinterlace", index); + feature_name = g_strdup_printf ("d3d11device%ddeinterlaceelement", index); + } + + type = g_type_register_static (GST_TYPE_BASE_TRANSFORM, + type_name, &type_info, (GTypeFlags) 0); + cdata->deinterlace_type = type; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, GST_RANK_NONE, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); + + /* Register wrapper bin */ + 
index = 0; + type_name = g_strdup ("GstD3D11DeinterlaceBin"); + feature_name = g_strdup ("d3d11deinterlace"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11Device%dDeinterlaceBin", index); + feature_name = g_strdup_printf ("d3d11device%ddeinterlace", index); + } + + bin_type = g_type_register_static (GST_TYPE_BIN, + type_name, &bin_type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (bin_type); + + if (!gst_element_register (plugin, feature_name, rank, bin_type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11deinterlace.h
Added
@@ -0,0 +1,34 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_DEINTERLACE_H__ +#define __GST_D3D11_DEINTERLACE_H__ + +#include <gst/gst.h> +#include <gst/d3d11/gstd3d11.h> + +G_BEGIN_DECLS + +void gst_d3d11_deinterlace_register (GstPlugin * plugin, + GstD3D11Device * device, + guint rank); + +G_END_DECLS + +#endif /* __GST_D3D11_DEINTERLACE_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11download.cpp
Added
@@ -0,0 +1,465 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11download + * @title: d3d11download + * @short_description: Downloads Direct3D11 texture memory into system memory + * + * Downloads Direct3D11 texture memory into system memory + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=test_h264.mp4 ! parsebin ! d3d11h264dec ! \ + * d3d11convert ! d3d11download ! video/x-raw,width=640,height=480 ! mfh264enc ! \ + * h264parse ! mp4mux ! filesink location=output.mp4 + * ``` + * This pipeline will resize decoded (by #d3d11h264dec) frames to 640x480 + * resolution by using #d3d11convert. Then it will be copied into system memory + * by d3d11download. 
Finally downloaded frames will be encoded as a new + * H.264 stream via #mfh264enc and muxed via mp4mux + * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstd3d11download.h" + +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_download_debug); +#define GST_CAT_DEFAULT gst_d3d11_download_debug + +static GstStaticCaps sink_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE (GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS)); + +static GstStaticCaps src_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE (GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS)); + +struct _GstD3D11Download +{ + GstD3D11BaseFilter parent; + + GstBuffer *staging_buffer; +}; + +#define gst_d3d11_download_parent_class parent_class +G_DEFINE_TYPE (GstD3D11Download, gst_d3d11_download, + GST_TYPE_D3D11_BASE_FILTER); + +static void gst_d3d11_download_dispose (GObject * object); +static gboolean gst_d3d11_download_stop (GstBaseTransform * trans); +static gboolean gst_d3d11_download_sink_event (GstBaseTransform * trans, + GstEvent * event); +static GstCaps *gst_d3d11_download_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * 
caps, GstCaps * filter); +static gboolean gst_d3d11_download_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query); +static gboolean gst_d3d11_download_decide_allocation (GstBaseTransform * trans, + GstQuery * query); +static GstFlowReturn gst_d3d11_download_transform (GstBaseTransform * trans, + GstBuffer * inbuf, GstBuffer * outbuf); +static gboolean gst_d3d11_download_set_info (GstD3D11BaseFilter * filter, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info); + +static void +gst_d3d11_download_class_init (GstD3D11DownloadClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + GstD3D11BaseFilterClass *bfilter_class = GST_D3D11_BASE_FILTER_CLASS (klass); + GstCaps *caps; + + gobject_class->dispose = gst_d3d11_download_dispose; + + caps = gst_d3d11_get_updated_template_caps (&sink_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + caps = gst_d3d11_get_updated_template_caps (&src_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 downloader", "Filter/Video", + "Downloads Direct3D11 texture memory into system memory", + "Seungha Yang <seungha.yang@navercorp.com>"); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_download_stop); + trans_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_download_sink_event); + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_download_transform_caps); + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_download_propose_allocation); + 
trans_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_download_decide_allocation); + trans_class->transform = GST_DEBUG_FUNCPTR (gst_d3d11_download_transform); + + bfilter_class->set_info = GST_DEBUG_FUNCPTR (gst_d3d11_download_set_info); + + GST_DEBUG_CATEGORY_INIT (gst_d3d11_download_debug, + "d3d11download", 0, "d3d11download Element"); +} + +static void +gst_d3d11_download_init (GstD3D11Download * download) +{ +} + +static void +gst_d3d11_download_dispose (GObject * object) +{ + GstD3D11Download *self = GST_D3D11_DOWNLOAD (object); + + gst_clear_buffer (&self->staging_buffer); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static gboolean +gst_d3d11_download_stop (GstBaseTransform * trans) +{ + GstD3D11Download *self = GST_D3D11_DOWNLOAD (trans); + + gst_clear_buffer (&self->staging_buffer); + + return GST_BASE_TRANSFORM_CLASS (parent_class)->stop (trans); +} + +static gboolean +gst_d3d11_download_sink_event (GstBaseTransform * trans, GstEvent * event) +{ + GstD3D11Download *self = GST_D3D11_DOWNLOAD (trans); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_EOS: + /* We don't need to hold this staging buffer after eos */ + gst_clear_buffer (&self->staging_buffer); + break; + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->sink_event (trans, event); +} + +static GstCaps * +_set_caps_features (const GstCaps * caps, const gchar * feature_name) +{ + GstCaps *tmp = gst_caps_copy (caps); + guint n = gst_caps_get_size (tmp); + guint i = 0; + + for (i = 0; i < n; i++) + gst_caps_set_features (tmp, i, + gst_caps_features_from_string (feature_name)); + + return tmp; +} + +static GstCaps * +gst_d3d11_download_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *result, *tmp; + + GST_DEBUG_OBJECT (trans, + "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, + (direction == GST_PAD_SINK) ? 
"sink" : "src"); + + if (direction == GST_PAD_SINK) { + tmp = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); + tmp = gst_caps_merge (gst_caps_ref (caps), tmp); + } else { + GstCaps *newcaps; + tmp = gst_caps_ref (caps); + + newcaps = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); + tmp = gst_caps_merge (tmp, newcaps); + } + + if (filter) { + result = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + } else { + result = tmp; + } + + GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, result); + + return result; +} + +static gboolean +gst_d3d11_download_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstVideoInfo info; + GstBufferPool *pool; + GstCaps *caps; + guint size; + + if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, + decide_query, query)) + return FALSE; + + /* passthrough, we're done */ + if (decide_query == NULL) + return TRUE; + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE; + + if (!gst_video_info_from_caps (&info, caps)) + return FALSE; + + if (gst_query_get_n_allocation_pools (query) == 0) { + GstCapsFeatures *features; + GstStructure *config; + gboolean is_d3d11 = FALSE; + + features = gst_caps_get_features (caps, 0); + + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + GST_DEBUG_OBJECT (filter, "upstream support d3d11 memory"); + pool = gst_d3d11_buffer_pool_new (filter->device); + is_d3d11 = TRUE; + } else { + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + /* d3d11 pool does not support video alignment */ + if (!is_d3d11) { + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + } 
+ + size = GST_VIDEO_INFO_SIZE (&info); + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + + if (!gst_buffer_pool_set_config (pool, config)) + goto config_failed; + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + + if (is_d3d11) { + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, + nullptr); + gst_structure_free (config); + } + + gst_query_add_allocation_pool (query, pool, size, 0, 0); + + gst_object_unref (pool); + } + + return TRUE; + + /* ERRORS */ +config_failed: + { + GST_ERROR_OBJECT (filter, "failed to set config"); + gst_object_unref (pool); + return FALSE; + } +} + +static gboolean +gst_d3d11_download_decide_allocation (GstBaseTransform * trans, + GstQuery * query) +{ + GstBufferPool *pool = NULL; + GstStructure *config; + guint min, max, size; + gboolean update_pool; + GstCaps *outcaps = NULL; + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + + if (!pool) + gst_query_parse_allocation (query, &outcaps, NULL); + + update_pool = TRUE; + } else { + GstVideoInfo vinfo; + + gst_query_parse_allocation (query, &outcaps, NULL); + gst_video_info_from_caps (&vinfo, outcaps); + size = vinfo.size; + min = max = 0; + update_pool = FALSE; + } + + if (!pool) + pool = gst_video_buffer_pool_new (); + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + if (outcaps) + gst_buffer_pool_config_set_params (config, outcaps, size, 0, 0); + gst_buffer_pool_set_config (pool, config); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + + gst_object_unref (pool); + + return GST_BASE_TRANSFORM_CLASS 
(parent_class)->decide_allocation (trans, + query); +} + +static gboolean +gst_d3d11_download_can_use_staging_buffer (GstD3D11Download * self, + GstBuffer * inbuf) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (self); + ID3D11Device *device_handle = + gst_d3d11_device_get_device_handle (filter->device); + + if (!gst_d3d11_buffer_can_access_device (inbuf, device_handle)) + return FALSE; + + if (self->staging_buffer) + return TRUE; + + self->staging_buffer = gst_d3d11_allocate_staging_buffer_for (inbuf, + &filter->in_info, TRUE); + + if (!self->staging_buffer) { + GST_WARNING_OBJECT (self, "Couldn't allocate staging buffer"); + return FALSE; + } + + return TRUE; +} + +static GstFlowReturn +gst_d3d11_download_transform (GstBaseTransform * trans, GstBuffer * inbuf, + GstBuffer * outbuf) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstD3D11Download *self = GST_D3D11_DOWNLOAD (trans); + GstVideoFrame in_frame, out_frame; + GstFlowReturn ret = GST_FLOW_OK; + gboolean use_staging_buf; + GstBuffer *target_inbuf = inbuf; + guint i; + + use_staging_buf = gst_d3d11_download_can_use_staging_buffer (self, inbuf); + + if (use_staging_buf) { + GST_TRACE_OBJECT (self, "Copy input buffer to staging buffer"); + + /* Copy d3d11 texture to staging texture */ + if (!gst_d3d11_buffer_copy_into (self->staging_buffer, + inbuf, &filter->in_info)) { + GST_ERROR_OBJECT (self, + "Failed to copy input buffer into staging texture"); + return GST_FLOW_ERROR; + } + + target_inbuf = self->staging_buffer; + } + + if (!gst_video_frame_map (&in_frame, &filter->in_info, target_inbuf, + (GstMapFlags) (GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) + goto invalid_buffer; + + if (!gst_video_frame_map (&out_frame, &filter->out_info, outbuf, + (GstMapFlags) (GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) { + gst_video_frame_unmap (&in_frame); + goto invalid_buffer; + } + + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&in_frame); i++) { + if (!gst_video_frame_copy_plane 
(&out_frame, &in_frame, i)) { + GST_ERROR_OBJECT (filter, "Couldn't copy %dth plane", i); + ret = GST_FLOW_ERROR; + break; + } + } + + gst_video_frame_unmap (&out_frame); + gst_video_frame_unmap (&in_frame); + + return ret; + + /* ERRORS */ +invalid_buffer: + { + GST_ELEMENT_WARNING (filter, CORE, NOT_IMPLEMENTED, (NULL), + ("invalid video buffer received")); + return GST_FLOW_ERROR; + } +} + +static gboolean +gst_d3d11_download_set_info (GstD3D11BaseFilter * filter, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info) +{ + GstD3D11Download *self = GST_D3D11_DOWNLOAD (filter); + + gst_clear_buffer (&self->staging_buffer); + + return TRUE; +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11download.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11download.h
Changed
@@ -24,24 +24,9 @@ G_BEGIN_DECLS -#define GST_TYPE_D3D11_DOWNLOAD (gst_d3d11_download_get_type()) -#define GST_D3D11_DOWNLOAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_DOWNLOAD,GstD3D11Download)) -#define GST_D3D11_DOWNLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_DOWNLOAD,GstD3D11DownloadClass)) -#define GST_D3D11_DOWNLOAD_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_DOWNLOAD,GstD3D11DownloadClass)) -#define GST_IS_D3D11_DOWNLOAD(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_DOWNLOAD)) -#define GST_IS_D3D11_DOWNLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_DOWNLOAD)) - -struct _GstD3D11Download -{ - GstD3D11BaseFilter parent; -}; - -struct _GstD3D11DownloadClass -{ - GstD3D11BaseFilterClass parent_class; -}; - -GType gst_d3d11_download_get_type (void); +#define GST_TYPE_D3D11_DOWNLOAD (gst_d3d11_download_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11Download, + gst_d3d11_download, GST, D3D11_DOWNLOAD, GstD3D11BaseFilter); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11h264dec.cpp
Added
@@ -0,0 +1,1031 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + * + * NOTE: some of implementations are copied/modified from Chromium code + * + * Copyright 2015 The Chromium Authors. All rights reserved. + * + * Redistribution and use in source and binary forms, with or without + * modification, are permitted provided that the following conditions are + * met: + * + * * Redistributions of source code must retain the above copyright + * notice, this list of conditions and the following disclaimer. + * * Redistributions in binary form must reproduce the above + * copyright notice, this list of conditions and the following disclaimer + * in the documentation and/or other materials provided with the + * distribution. + * * Neither the name of Google Inc. nor the names of its + * contributors may be used to endorse or promote products derived from + * this software without specific prior written permission. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR + * A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT + * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, + * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT + * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, + * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY + * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT + * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE + * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + */ + +/** + * SECTION:element-d3d11h264dec + * @title: d3d11h264dec + * + * A Direct3D11/DXVA based H.264 video decoder + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/h264/file ! parsebin ! d3d11h264dec ! d3d11videosink + * ``` + * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11h264dec.h" + +#include <gst/codecs/gsth264decoder.h> +#include <string.h> +#include <vector> + +/* HACK: to expose dxva data structure on UWP */ +#ifdef WINAPI_PARTITION_DESKTOP +#undef WINAPI_PARTITION_DESKTOP +#endif +#define WINAPI_PARTITION_DESKTOP 1 +#include <d3d9.h> +#include <dxva.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_h264_dec_debug); +#define GST_CAT_DEFAULT gst_d3d11_h264_dec_debug + +/* *INDENT-OFF* */ +typedef struct _GstD3D11H264DecInner +{ + GstD3D11Device *device = nullptr; + GstD3D11Decoder *d3d11_decoder = nullptr; + + DXVA_PicParams_H264 pic_params; + DXVA_Qmatrix_H264 iq_matrix; + + std::vector<DXVA_Slice_H264_Short> slice_list; + std::vector<guint8> bitstream_buffer; + + gint width = 0; + gint height = 0; + gint coded_width = 0; + gint coded_height = 0; + gint bitdepth = 0; + guint8 chroma_format_idc = 0; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; + gboolean interlaced = FALSE; + gint max_dpb_size = 0; +} GstD3D11H264DecInner; + +/* *INDENT-ON* */ +typedef struct _GstD3D11H264Dec +{ + GstH264Decoder parent; + GstD3D11H264DecInner 
*inner; +} GstD3D11H264Dec; + +typedef struct _GstD3D11H264DecClass +{ + GstH264DecoderClass parent_class; + GstD3D11DecoderSubClassData class_data; +} GstD3D11H264DecClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_H264_DEC(object) ((GstD3D11H264Dec *) (object)) +#define GST_D3D11_H264_DEC_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11H264DecClass)) + +static void gst_d3d11_h264_dec_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_h264_dec_finalize (GObject * object); +static void gst_d3d11_h264_dec_set_context (GstElement * element, + GstContext * context); + +static gboolean gst_d3d11_h264_dec_open (GstVideoDecoder * decoder); +static gboolean gst_d3d11_h264_dec_close (GstVideoDecoder * decoder); +static gboolean gst_d3d11_h264_dec_negotiate (GstVideoDecoder * decoder); +static gboolean gst_d3d11_h264_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_d3d11_h264_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); +static gboolean gst_d3d11_h264_dec_sink_event (GstVideoDecoder * decoder, + GstEvent * event); + +/* GstH264Decoder */ +static GstFlowReturn gst_d3d11_h264_dec_new_sequence (GstH264Decoder * decoder, + const GstH264SPS * sps, gint max_dpb_size); +static GstFlowReturn gst_d3d11_h264_dec_new_picture (GstH264Decoder * decoder, + GstVideoCodecFrame * frame, GstH264Picture * picture); +static GstFlowReturn gst_d3d11_h264_dec_new_field_picture (GstH264Decoder * + decoder, const GstH264Picture * first_field, GstH264Picture * second_field); +static GstFlowReturn gst_d3d11_h264_dec_start_picture (GstH264Decoder * decoder, + GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb); +static GstFlowReturn gst_d3d11_h264_dec_decode_slice (GstH264Decoder * decoder, + GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, + GArray * ref_pic_list1); 
+static GstFlowReturn gst_d3d11_h264_dec_end_picture (GstH264Decoder * decoder, + GstH264Picture * picture); +static GstFlowReturn gst_d3d11_h264_dec_output_picture (GstH264Decoder * + decoder, GstVideoCodecFrame * frame, GstH264Picture * picture); + +static void +gst_d3d11_h264_dec_class_init (GstD3D11H264DecClass * klass, gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstH264DecoderClass *h264decoder_class = GST_H264_DECODER_CLASS (klass); + GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; + + gobject_class->get_property = gst_d3d11_h264_dec_get_property; + gobject_class->finalize = gst_d3d11_h264_dec_finalize; + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_set_context); + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + gst_d3d11_decoder_class_data_fill_subclass_data (cdata, &klass->class_data); + + /** + * GstD3D11H264Dec:adapter-luid: + * + * DXGI Adapter LUID for this element + * + * Since: 1.20 + */ + gst_d3d11_decoder_proxy_class_init (element_class, cdata, + "Seungha Yang <seungha.yang@navercorp.com>"); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_decide_allocation); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_src_query); + decoder_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_sink_event); + + h264decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_new_sequence); + h264decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_new_picture); + h264decoder_class->new_field_picture = + GST_DEBUG_FUNCPTR 
(gst_d3d11_h264_dec_new_field_picture); + h264decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_start_picture); + h264decoder_class->decode_slice = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_decode_slice); + h264decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_end_picture); + h264decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h264_dec_output_picture); +} + +static void +gst_d3d11_h264_dec_init (GstD3D11H264Dec * self) +{ + self->inner = new GstD3D11H264DecInner (); +} + +static void +gst_d3d11_h264_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11H264DecClass *klass = GST_D3D11_H264_DEC_GET_CLASS (object); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_decoder_proxy_get_property (object, prop_id, value, pspec, cdata); +} + +static void +gst_d3d11_h264_dec_finalize (GObject * object) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (object); + + delete self->inner; + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_h264_dec_set_context (GstElement * element, GstContext * context) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (element); + GstD3D11H264DecInner *inner = self->inner; + GstD3D11H264DecClass *klass = GST_D3D11_H264_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, cdata->adapter_luid, &inner->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +/* Clear all codec specific (e.g., SPS) data */ +static void +gst_d3d11_h264_dec_reset (GstD3D11H264Dec * self) +{ + GstD3D11H264DecInner *inner = self->inner; + + inner->width = 0; + inner->height = 0; + inner->coded_width = 0; + inner->coded_height = 0; + inner->bitdepth = 0; + inner->chroma_format_idc = 0; + inner->out_format = GST_VIDEO_FORMAT_UNKNOWN; + inner->interlaced = FALSE; + inner->max_dpb_size = 0; +} + 
+static gboolean +gst_d3d11_h264_dec_open (GstVideoDecoder * decoder) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + GstD3D11H264DecClass *klass = GST_D3D11_H264_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + if (!gst_d3d11_decoder_proxy_open (decoder, + cdata, &inner->device, &inner->d3d11_decoder)) { + GST_ERROR_OBJECT (self, "Failed to open decoder"); + return FALSE; + } + + gst_d3d11_h264_dec_reset (self); + + return TRUE; +} + +static gboolean +gst_d3d11_h264_dec_close (GstVideoDecoder * decoder) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + + gst_clear_object (&inner->d3d11_decoder); + gst_clear_object (&inner->device); + + return TRUE; +} + +static gboolean +gst_d3d11_h264_dec_negotiate (GstVideoDecoder * decoder) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_negotiate (inner->d3d11_decoder, decoder)) + return FALSE; + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_d3d11_h264_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_decide_allocation (inner->d3d11_decoder, + decoder, query)) { + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_d3d11_h264_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), + query, inner->device)) { + return TRUE; + } + break; + default: + break; + } + + return 
GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static gboolean +gst_d3d11_h264_dec_sink_event (GstVideoDecoder * decoder, GstEvent * event) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, TRUE); + break; + case GST_EVENT_FLUSH_STOP: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, FALSE); + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstFlowReturn +gst_d3d11_h264_dec_new_sequence (GstH264Decoder * decoder, + const GstH264SPS * sps, gint max_dpb_size) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + gint crop_width, crop_height; + gboolean interlaced; + gboolean modified = FALSE; + + GST_LOG_OBJECT (self, "new sequence"); + + if (sps->frame_cropping_flag) { + crop_width = sps->crop_rect_width; + crop_height = sps->crop_rect_height; + } else { + crop_width = sps->width; + crop_height = sps->height; + } + + if (inner->width != crop_width || inner->height != crop_height || + inner->coded_width != sps->width || inner->coded_height != sps->height) { + GST_INFO_OBJECT (self, "resolution changed %dx%d (%dx%d)", + crop_width, crop_height, sps->width, sps->height); + inner->width = crop_width; + inner->height = crop_height; + inner->coded_width = sps->width; + inner->coded_height = sps->height; + modified = TRUE; + } + + if (inner->bitdepth != sps->bit_depth_luma_minus8 + 8) { + GST_INFO_OBJECT (self, "bitdepth changed"); + inner->bitdepth = (guint) sps->bit_depth_luma_minus8 + 8; + modified = TRUE; + } + + if (inner->chroma_format_idc != sps->chroma_format_idc) { + GST_INFO_OBJECT (self, "chroma format changed"); + inner->chroma_format_idc = sps->chroma_format_idc; 
+ modified = TRUE; + } + + interlaced = !sps->frame_mbs_only_flag; + if (inner->interlaced != interlaced) { + GST_INFO_OBJECT (self, "interlaced sequence changed"); + inner->interlaced = interlaced; + modified = TRUE; + } + + if (inner->max_dpb_size < max_dpb_size) { + GST_INFO_OBJECT (self, "Requires larger DPB size (%d -> %d)", + inner->max_dpb_size, max_dpb_size); + modified = TRUE; + } + + if (modified || !gst_d3d11_decoder_is_configured (inner->d3d11_decoder)) { + GstVideoInfo info; + + inner->out_format = GST_VIDEO_FORMAT_UNKNOWN; + + if (inner->bitdepth == 8) { + if (inner->chroma_format_idc == 1) + inner->out_format = GST_VIDEO_FORMAT_NV12; + else { + GST_FIXME_OBJECT (self, "Could not support 8bits non-4:2:0 format"); + } + } + + if (inner->out_format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ERROR_OBJECT (self, "Could not support bitdepth/chroma format"); + return GST_FLOW_NOT_NEGOTIATED; + } + + gst_video_info_set_format (&info, + inner->out_format, inner->width, inner->height); + if (inner->interlaced) + GST_VIDEO_INFO_INTERLACE_MODE (&info) = GST_VIDEO_INTERLACE_MODE_MIXED; + + /* Store configured DPB size here. Then, it will be referenced later + * to decide whether we need to re-open decoder object or not. + * For instance, if every configuration is same apart from DPB size and + * new DPB size is decreased, we can reuse existing decoder object. 
+ */ + inner->max_dpb_size = max_dpb_size; + if (!gst_d3d11_decoder_configure (inner->d3d11_decoder, + decoder->input_state, &info, + inner->coded_width, inner->coded_height, + /* Additional 4 views margin for zero-copy rendering */ + max_dpb_size + 4)) { + GST_ERROR_OBJECT (self, "Failed to create decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h264_dec_new_picture (GstH264Decoder * decoder, + GstVideoCodecFrame * frame, GstH264Picture * picture) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + GstBuffer *view_buffer; + + view_buffer = gst_d3d11_decoder_get_output_view_buffer (inner->d3d11_decoder, + GST_VIDEO_DECODER (decoder)); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "No available output view buffer"); + return GST_FLOW_FLUSHING; + } + + GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT, view_buffer); + + gst_h264_picture_set_user_data (picture, + view_buffer, (GDestroyNotify) gst_buffer_unref); + + GST_LOG_OBJECT (self, "New h264picture %p", picture); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h264_dec_new_field_picture (GstH264Decoder * decoder, + const GstH264Picture * first_field, GstH264Picture * second_field) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstBuffer *view_buffer; + + view_buffer = (GstBuffer *) gst_h264_picture_get_user_data ((GstH264Picture *) + first_field); + + if (!view_buffer) { + GST_WARNING_OBJECT (self, "First picture does not have output view buffer"); + return GST_FLOW_OK; + } + + GST_LOG_OBJECT (self, "New field picture with buffer %" GST_PTR_FORMAT, + view_buffer); + + gst_h264_picture_set_user_data (second_field, + gst_buffer_ref (view_buffer), (GDestroyNotify) gst_buffer_unref); + + 
return GST_FLOW_OK; +} + +static ID3D11VideoDecoderOutputView * +gst_d3d11_h264_dec_get_output_view_from_picture (GstD3D11H264Dec * self, + GstH264Picture * picture, guint8 * view_id) +{ + GstD3D11H264DecInner *inner = self->inner; + GstBuffer *view_buffer; + ID3D11VideoDecoderOutputView *view; + + view_buffer = (GstBuffer *) gst_h264_picture_get_user_data (picture); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); + return NULL; + } + + view = gst_d3d11_decoder_get_output_view_from_buffer (inner->d3d11_decoder, + view_buffer, view_id); + if (!view) { + GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); + return NULL; + } + + return view; +} + +static void +gst_d3d11_h264_dec_picture_params_from_sps (GstD3D11H264Dec * self, + const GstH264SPS * sps, gboolean field_pic, DXVA_PicParams_H264 * params) +{ +#define COPY_FIELD(f) \ + (params)->f = (sps)->f + + params->wFrameWidthInMbsMinus1 = sps->pic_width_in_mbs_minus1; + if (!sps->frame_mbs_only_flag) { + params->wFrameHeightInMbsMinus1 = + ((sps->pic_height_in_map_units_minus1 + 1) << 1) - 1; + } else { + params->wFrameHeightInMbsMinus1 = sps->pic_height_in_map_units_minus1; + } + params->residual_colour_transform_flag = sps->separate_colour_plane_flag; + params->MbaffFrameFlag = (sps->mb_adaptive_frame_field_flag && !field_pic); + params->field_pic_flag = field_pic; + params->MinLumaBipredSize8x8Flag = sps->level_idc >= 31; + + COPY_FIELD (num_ref_frames); + COPY_FIELD (chroma_format_idc); + COPY_FIELD (frame_mbs_only_flag); + COPY_FIELD (bit_depth_luma_minus8); + COPY_FIELD (bit_depth_chroma_minus8); + COPY_FIELD (log2_max_frame_num_minus4); + COPY_FIELD (pic_order_cnt_type); + COPY_FIELD (log2_max_pic_order_cnt_lsb_minus4); + COPY_FIELD (delta_pic_order_always_zero_flag); + COPY_FIELD (direct_8x8_inference_flag); + +#undef COPY_FIELD +} + +static void +gst_d3d11_h264_dec_picture_params_from_pps (GstD3D11H264Dec * self, + const GstH264PPS 
* pps, DXVA_PicParams_H264 * params) +{ +#define COPY_FIELD(f) \ + (params)->f = (pps)->f + + COPY_FIELD (constrained_intra_pred_flag); + COPY_FIELD (weighted_pred_flag); + COPY_FIELD (weighted_bipred_idc); + COPY_FIELD (transform_8x8_mode_flag); + COPY_FIELD (pic_init_qs_minus26); + COPY_FIELD (chroma_qp_index_offset); + COPY_FIELD (second_chroma_qp_index_offset); + COPY_FIELD (pic_init_qp_minus26); + COPY_FIELD (num_ref_idx_l0_active_minus1); + COPY_FIELD (num_ref_idx_l1_active_minus1); + COPY_FIELD (entropy_coding_mode_flag); + COPY_FIELD (pic_order_present_flag); + COPY_FIELD (deblocking_filter_control_present_flag); + COPY_FIELD (redundant_pic_cnt_present_flag); + COPY_FIELD (num_slice_groups_minus1); + COPY_FIELD (slice_group_map_type); + +#undef COPY_FIELD +} + +static void +gst_d3d11_h264_dec_picture_params_from_slice_header (GstD3D11H264Dec * + self, const GstH264SliceHdr * slice_header, DXVA_PicParams_H264 * params) +{ + params->sp_for_switch_flag = slice_header->sp_for_switch_flag; + params->field_pic_flag = slice_header->field_pic_flag; + params->CurrPic.AssociatedFlag = slice_header->bottom_field_flag; + params->IntraPicFlag = + GST_H264_IS_I_SLICE (slice_header) || GST_H264_IS_SI_SLICE (slice_header); +} + +static gboolean +gst_d3d11_h264_dec_fill_picture_params (GstD3D11H264Dec * self, + const GstH264SliceHdr * slice_header, DXVA_PicParams_H264 * params) +{ + const GstH264SPS *sps; + const GstH264PPS *pps; + + g_return_val_if_fail (slice_header->pps != NULL, FALSE); + g_return_val_if_fail (slice_header->pps->sequence != NULL, FALSE); + + pps = slice_header->pps; + sps = pps->sequence; + + params->MbsConsecutiveFlag = 1; + params->Reserved16Bits = 3; + params->ContinuationFlag = 1; + params->Reserved8BitsA = 0; + params->Reserved8BitsB = 0; + params->StatusReportFeedbackNumber = 1; + + gst_d3d11_h264_dec_picture_params_from_sps (self, + sps, slice_header->field_pic_flag, params); + gst_d3d11_h264_dec_picture_params_from_pps (self, pps, params); + 
gst_d3d11_h264_dec_picture_params_from_slice_header (self, + slice_header, params); + + return TRUE; +} + +static inline void +init_pic_params (DXVA_PicParams_H264 * params) +{ + memset (params, 0, sizeof (DXVA_PicParams_H264)); + for (guint i = 0; i < G_N_ELEMENTS (params->RefFrameList); i++) + params->RefFrameList[i].bPicEntry = 0xff; +} + +static GstFlowReturn +gst_d3d11_h264_dec_start_picture (GstH264Decoder * decoder, + GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + DXVA_PicParams_H264 *pic_params = &inner->pic_params; + DXVA_Qmatrix_H264 *iq_matrix = &inner->iq_matrix; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + GArray *dpb_array; + GstH264PPS *pps; + guint i, j; + + pps = slice->header.pps; + + view = gst_d3d11_h264_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + init_pic_params (pic_params); + gst_d3d11_h264_dec_fill_picture_params (self, &slice->header, pic_params); + + pic_params->CurrPic.Index7Bits = view_id; + pic_params->RefPicFlag = GST_H264_PICTURE_IS_REF (picture); + pic_params->frame_num = picture->frame_num; + + if (picture->field == GST_H264_PICTURE_FIELD_TOP_FIELD) { + pic_params->CurrFieldOrderCnt[0] = picture->top_field_order_cnt; + pic_params->CurrFieldOrderCnt[1] = 0; + } else if (picture->field == GST_H264_PICTURE_FIELD_BOTTOM_FIELD) { + pic_params->CurrFieldOrderCnt[0] = 0; + pic_params->CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; + } else { + pic_params->CurrFieldOrderCnt[0] = picture->top_field_order_cnt; + pic_params->CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; + } + + dpb_array = gst_h264_dpb_get_pictures_all (dpb); + for (i = 0, j = 0; i < dpb_array->len && j < 16; i++) { + GstH264Picture *other = g_array_index (dpb_array, 
GstH264Picture *, i); + guint8 id = 0xff; + + if (!GST_H264_PICTURE_IS_REF (other)) + continue; + + /* The second field picture will be handled differently */ + if (other->second_field) + continue; + + gst_d3d11_h264_dec_get_output_view_from_picture (self, other, &id); + pic_params->RefFrameList[j].Index7Bits = id; + + if (GST_H264_PICTURE_IS_LONG_TERM_REF (other)) { + pic_params->RefFrameList[j].AssociatedFlag = 1; + pic_params->FrameNumList[j] = other->long_term_frame_idx; + } else { + pic_params->RefFrameList[j].AssociatedFlag = 0; + pic_params->FrameNumList[j] = other->frame_num; + } + + switch (other->field) { + case GST_H264_PICTURE_FIELD_TOP_FIELD: + pic_params->FieldOrderCntList[j][0] = other->top_field_order_cnt; + pic_params->UsedForReferenceFlags |= 0x1 << (2 * j); + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + pic_params->FieldOrderCntList[j][1] = other->bottom_field_order_cnt; + pic_params->UsedForReferenceFlags |= 0x1 << (2 * j + 1); + break; + default: + pic_params->FieldOrderCntList[j][0] = other->top_field_order_cnt; + pic_params->FieldOrderCntList[j][1] = other->bottom_field_order_cnt; + pic_params->UsedForReferenceFlags |= 0x3 << (2 * j); + break; + } + + if (other->other_field) { + GstH264Picture *other_field = other->other_field; + + switch (other_field->field) { + case GST_H264_PICTURE_FIELD_TOP_FIELD: + pic_params->FieldOrderCntList[j][0] = + other_field->top_field_order_cnt; + pic_params->UsedForReferenceFlags |= 0x1 << (2 * j); + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + pic_params->FieldOrderCntList[j][1] = + other_field->bottom_field_order_cnt; + pic_params->UsedForReferenceFlags |= 0x1 << (2 * j + 1); + break; + default: + break; + } + } + + pic_params->NonExistingFrameFlags |= (other->nonexisting) << j; + j++; + } + g_array_unref (dpb_array); + + G_STATIC_ASSERT (sizeof (iq_matrix->bScalingLists4x4) == + sizeof (pps->scaling_lists_4x4)); + memcpy (iq_matrix->bScalingLists4x4, pps->scaling_lists_4x4, + sizeof 
(pps->scaling_lists_4x4)); + + G_STATIC_ASSERT (sizeof (iq_matrix->bScalingLists8x8[0]) == + sizeof (pps->scaling_lists_8x8[0])); + memcpy (iq_matrix->bScalingLists8x8[0], pps->scaling_lists_8x8[0], + sizeof (pps->scaling_lists_8x8[0])); + memcpy (iq_matrix->bScalingLists8x8[1], pps->scaling_lists_8x8[1], + sizeof (pps->scaling_lists_8x8[1])); + + inner->slice_list.resize (0); + inner->bitstream_buffer.resize (0); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h264_dec_decode_slice (GstH264Decoder * decoder, + GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, + GArray * ref_pic_list1) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + DXVA_Slice_H264_Short dxva_slice; + static const guint8 start_code[] = { 0, 0, 1 }; + const size_t start_code_size = sizeof (start_code); + + dxva_slice.BSNALunitDataLocation = inner->bitstream_buffer.size (); + /* Includes 3 bytes start code prefix */ + dxva_slice.SliceBytesInBuffer = slice->nalu.size + start_code_size; + dxva_slice.wBadSliceChopping = 0; + + inner->slice_list.push_back (dxva_slice); + + size_t pos = inner->bitstream_buffer.size (); + inner->bitstream_buffer.resize (pos + start_code_size + slice->nalu.size); + + /* Fill start code prefix */ + memcpy (&inner->bitstream_buffer[0] + pos, start_code, start_code_size); + + /* Copy bitstream */ + memcpy (&inner->bitstream_buffer[0] + pos + start_code_size, + slice->nalu.data + slice->nalu.offset, slice->nalu.size); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h264_dec_end_picture (GstH264Decoder * decoder, + GstH264Picture * picture) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + size_t bitstream_buffer_size; + size_t bitstream_pos; + GstD3D11DecodeInputStreamArgs input_args; + + GST_LOG_OBJECT (self, "end picture %p, (poc %d)", + picture, 
picture->pic_order_cnt); + + if (inner->bitstream_buffer.empty () || inner->slice_list.empty ()) { + GST_ERROR_OBJECT (self, "No bitstream buffer to submit"); + return GST_FLOW_ERROR; + } + + view = gst_d3d11_h264_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (&input_args, 0, sizeof (GstD3D11DecodeInputStreamArgs)); + + bitstream_pos = inner->bitstream_buffer.size (); + bitstream_buffer_size = GST_ROUND_UP_128 (bitstream_pos); + + if (bitstream_buffer_size > bitstream_pos) { + size_t padding = bitstream_buffer_size - bitstream_pos; + + /* As per DXVA spec, total amount of bitstream buffer size should be + * 128 bytes aligned. If actual data is not multiple of 128 bytes, + * the last slice data needs to be zero-padded */ + inner->bitstream_buffer.resize (bitstream_buffer_size, 0); + + DXVA_Slice_H264_Short & slice = inner->slice_list.back (); + slice.SliceBytesInBuffer += padding; + } + + input_args.picture_params = &inner->pic_params; + input_args.picture_params_size = sizeof (DXVA_PicParams_H264); + input_args.slice_control = &inner->slice_list[0]; + input_args.slice_control_size = + sizeof (DXVA_Slice_H264_Short) * inner->slice_list.size (); + input_args.bitstream = &inner->bitstream_buffer[0]; + input_args.bitstream_size = inner->bitstream_buffer.size (); + input_args.inverse_quantization_matrix = &inner->iq_matrix; + input_args.inverse_quantization_matrix_size = sizeof (DXVA_Qmatrix_H264); + + if (!gst_d3d11_decoder_decode_frame (inner->d3d11_decoder, view, &input_args)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h264_dec_output_picture (GstH264Decoder * decoder, + GstVideoCodecFrame * frame, GstH264Picture * picture) +{ + GstD3D11H264Dec *self = GST_D3D11_H264_DEC (decoder); + GstD3D11H264DecInner *inner = self->inner; + GstVideoDecoder *vdec = GST_VIDEO_DECODER 
(decoder); + GstBuffer *view_buffer; + + GST_LOG_OBJECT (self, + "Outputting picture %p (poc %d)", picture, picture->pic_order_cnt); + + view_buffer = (GstBuffer *) gst_h264_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Could not get output view"); + goto error; + } + + if (!gst_d3d11_decoder_process_output (inner->d3d11_decoder, vdec, + inner->width, inner->height, view_buffer, &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to copy buffer"); + goto error; + } + + if (picture->buffer_flags != 0) { + gboolean interlaced = + (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_INTERLACED) != 0; + gboolean tff = (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_TFF) != 0; + + GST_TRACE_OBJECT (self, + "apply buffer flags 0x%x (interlaced %d, top-field-first %d)", + picture->buffer_flags, interlaced, tff); + GST_BUFFER_FLAG_SET (frame->output_buffer, picture->buffer_flags); + } + + gst_h264_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_h264_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); + + return GST_FLOW_ERROR; +} + +void +gst_d3d11_h264_dec_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank, gboolean legacy) +{ + GType type; + gchar *type_name; + gchar *feature_name; + guint index = 0; + guint i; + gboolean ret; + GTypeInfo type_info = { + sizeof (GstD3D11H264DecClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_h264_dec_class_init, + NULL, + NULL, + sizeof (GstD3D11H264Dec), + 0, + (GInstanceInitFunc) gst_d3d11_h264_dec_init, + }; + const GUID *supported_profile = NULL; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + guint max_width = 0; + guint max_height = 0; + guint resolution; + + ret = gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_H264, GST_VIDEO_FORMAT_NV12, &supported_profile); + + if (!ret) { + GST_WARNING_OBJECT (device, "decoder profile unavailable"); + return; + } + + ret = + 
gst_d3d11_decoder_supports_format (device, supported_profile, + DXGI_FORMAT_NV12); + if (!ret) { + GST_FIXME_OBJECT (device, "device does not support NV12 format"); + return; + } + + /* we will not check the maximum resolution for legacy devices. + * it might cause crash */ + if (legacy) { + max_width = gst_dxva_resolutions[0].width; + max_height = gst_dxva_resolutions[0].height; + } else { + for (i = 0; i < G_N_ELEMENTS (gst_dxva_resolutions); i++) { + if (gst_d3d11_decoder_supports_resolution (device, supported_profile, + DXGI_FORMAT_NV12, gst_dxva_resolutions[i].width, + gst_dxva_resolutions[i].height)) { + max_width = gst_dxva_resolutions[i].width; + max_height = gst_dxva_resolutions[i].height; + + GST_DEBUG_OBJECT (device, + "device support resolution %dx%d", max_width, max_height); + } else { + break; + } + } + } + + if (max_width == 0 || max_height == 0) { + GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); + return; + } + + sink_caps = gst_caps_from_string ("video/x-h264, " + "stream-format= (string) { avc, avc3, byte-stream }, " + "alignment= (string) au, " + "profile = (string) { high, progressive-high, constrained-high, main, constrained-baseline, baseline }"); + src_caps = gst_caps_from_string ("video/x-raw(" + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "), format = (string) NV12; " + "video/x-raw, format = (string) NV12"); + + /* To cover both landscape and portrait, select max value */ + resolution = MAX (max_width, max_height); + gst_caps_set_simple (sink_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + + type_info.class_data = + gst_d3d11_decoder_class_data_new (device, GST_DXVA_CODEC_H264, + sink_caps, src_caps); + + type_name = g_strdup ("GstD3D11H264Dec"); + feature_name = g_strdup ("d3d11h264dec"); + + while (g_type_from_name (type_name)) { + 
index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11H264Device%dDec", index); + feature_name = g_strdup_printf ("d3d11h264device%ddec", index); + } + + type = g_type_register_static (GST_TYPE_H264_DECODER, + type_name, &type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11h264dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11h264dec.h
Changed
@@ -26,7 +26,6 @@ void gst_d3d11_h264_dec_register (GstPlugin * plugin, GstD3D11Device * device, - GstD3D11Decoder * decoder, guint rank, gboolean legacy);
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11h265dec.cpp
Added
@@ -0,0 +1,1157 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11h265dec + * @title: d3d11h265dec + * + * A Direct3D11/DXVA based H.265 video decoder + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/hevc/file ! parsebin ! d3d11h265dec ! 
d3d11videosink + * ``` + * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11h265dec.h" + +#include <gst/codecs/gsth265decoder.h> +#include <string.h> +#include <vector> + +/* HACK: to expose dxva data structure on UWP */ +#ifdef WINAPI_PARTITION_DESKTOP +#undef WINAPI_PARTITION_DESKTOP +#endif +#define WINAPI_PARTITION_DESKTOP 1 +#include <d3d9.h> +#include <dxva.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_h265_dec_debug); +#define GST_CAT_DEFAULT gst_d3d11_h265_dec_debug + +/* *INDENT-OFF* */ +typedef struct _GstD3D11H265DecInner +{ + GstD3D11Device *device = nullptr; + GstD3D11Decoder *d3d11_decoder = nullptr; + + DXVA_PicParams_HEVC pic_params; + DXVA_Qmatrix_HEVC iq_matrix; + + std::vector<DXVA_Slice_HEVC_Short> slice_list; + std::vector<guint8> bitstream_buffer; + + gboolean submit_iq_data; + + gint width = 0; + gint height = 0; + gint coded_width = 0; + gint coded_height = 0; + guint bitdepth = 0; + guint8 chroma_format_idc = 0; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; + GstVideoInterlaceMode interlace_mode = GST_VIDEO_INTERLACE_MODE_PROGRESSIVE; +} GstD3D11H265DecInner; +/* *INDENT-ON* */ + +typedef struct _GstD3D11H265Dec +{ + GstH265Decoder parent; + GstD3D11H265DecInner *inner; +} GstD3D11H265Dec; + +typedef struct _GstD3D11H265DecClass +{ + GstH265DecoderClass parent_class; + GstD3D11DecoderSubClassData class_data; +} GstD3D11H265DecClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_H265_DEC(object) ((GstD3D11H265Dec *) (object)) +#define GST_D3D11_H265_DEC_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11H265DecClass)) + +static void gst_d3d11_h265_dec_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_h265_dec_finalize (GObject * object); +static void gst_d3d11_h265_dec_set_context (GstElement * element, + GstContext * context); + +static gboolean 
gst_d3d11_h265_dec_open (GstVideoDecoder * decoder); +static gboolean gst_d3d11_h265_dec_close (GstVideoDecoder * decoder); +static gboolean gst_d3d11_h265_dec_negotiate (GstVideoDecoder * decoder); +static gboolean gst_d3d11_h265_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_d3d11_h265_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); +static gboolean gst_d3d11_h265_dec_sink_event (GstVideoDecoder * decoder, + GstEvent * event); + +/* GstH265Decoder */ +static GstFlowReturn gst_d3d11_h265_dec_new_sequence (GstH265Decoder * decoder, + const GstH265SPS * sps, gint max_dpb_size); +static GstFlowReturn gst_d3d11_h265_dec_new_picture (GstH265Decoder * decoder, + GstVideoCodecFrame * cframe, GstH265Picture * picture); +static GstFlowReturn gst_d3d11_h265_dec_start_picture (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb); +static GstFlowReturn gst_d3d11_h265_dec_decode_slice (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, + GArray * ref_pic_list0, GArray * ref_pic_list1); +static GstFlowReturn gst_d3d11_h265_dec_end_picture (GstH265Decoder * decoder, + GstH265Picture * picture); +static GstFlowReturn gst_d3d11_h265_dec_output_picture (GstH265Decoder * + decoder, GstVideoCodecFrame * frame, GstH265Picture * picture); + +static void +gst_d3d11_h265_dec_class_init (GstD3D11H265DecClass * klass, gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstH265DecoderClass *h265decoder_class = GST_H265_DECODER_CLASS (klass); + GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; + + gobject_class->get_property = gst_d3d11_h265_dec_get_property; + gobject_class->finalize = gst_d3d11_h265_dec_finalize; + + element_class->set_context = + GST_DEBUG_FUNCPTR 
(gst_d3d11_h265_dec_set_context); + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + gst_d3d11_decoder_class_data_fill_subclass_data (cdata, &klass->class_data); + + /** + * GstD3D11H265Dec:adapter-luid: + * + * DXGI Adapter LUID for this element + * + * Since: 1.20 + */ + gst_d3d11_decoder_proxy_class_init (element_class, cdata, + "Seungha Yang <seungha.yang@navercorp.com>"); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_decide_allocation); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_src_query); + decoder_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_sink_event); + + h265decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_new_sequence); + h265decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_new_picture); + h265decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_start_picture); + h265decoder_class->decode_slice = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_decode_slice); + h265decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_end_picture); + h265decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_h265_dec_output_picture); +} + +static void +gst_d3d11_h265_dec_init (GstD3D11H265Dec * self) +{ + self->inner = new GstD3D11H265DecInner (); +} + +static void +gst_d3d11_h265_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11H265DecClass *klass = GST_D3D11_H265_DEC_GET_CLASS (object); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_decoder_proxy_get_property (object, prop_id, value, pspec, cdata); +} + +static void +gst_d3d11_h265_dec_finalize (GObject * object) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC 
(object); + + delete self->inner; + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_h265_dec_set_context (GstElement * element, GstContext * context) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (element); + GstD3D11H265DecInner *inner = self->inner; + GstD3D11H265DecClass *klass = GST_D3D11_H265_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, cdata->adapter_luid, &inner->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_h265_dec_open (GstVideoDecoder * decoder) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + GstD3D11H265DecClass *klass = GST_D3D11_H265_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + if (!gst_d3d11_decoder_proxy_open (decoder, + cdata, &inner->device, &inner->d3d11_decoder)) { + GST_ERROR_OBJECT (self, "Failed to open decoder"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_h265_dec_close (GstVideoDecoder * decoder) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + + gst_clear_object (&inner->d3d11_decoder); + gst_clear_object (&inner->device); + + return TRUE; +} + +static gboolean +gst_d3d11_h265_dec_negotiate (GstVideoDecoder * decoder) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_negotiate (inner->d3d11_decoder, decoder)) + return FALSE; + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_d3d11_h265_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_decide_allocation (inner->d3d11_decoder, + 
decoder, query)) { + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_d3d11_h265_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), + query, inner->device)) { + return TRUE; + } + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static gboolean +gst_d3d11_h265_dec_sink_event (GstVideoDecoder * decoder, GstEvent * event) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, TRUE); + break; + case GST_EVENT_FLUSH_STOP: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, FALSE); + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstFlowReturn +gst_d3d11_h265_dec_new_sequence (GstH265Decoder * decoder, + const GstH265SPS * sps, gint max_dpb_size) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + gint crop_width, crop_height; + gboolean modified = FALSE; + GstVideoInterlaceMode interlace_mode = GST_VIDEO_INTERLACE_MODE_PROGRESSIVE; + + GST_LOG_OBJECT (self, "new sequence"); + + if (sps->conformance_window_flag) { + crop_width = sps->crop_rect_width; + crop_height = sps->crop_rect_height; + } else { + crop_width = sps->width; + crop_height = sps->height; + } + + if (inner->width != crop_width || inner->height != crop_height || + inner->coded_width != sps->width || inner->coded_height != sps->height) { + GST_INFO_OBJECT 
(self, "resolution changed %dx%d -> %dx%d", + crop_width, crop_height, sps->width, sps->height); + inner->width = crop_width; + inner->height = crop_height; + inner->coded_width = sps->width; + inner->coded_height = sps->height; + modified = TRUE; + } + + if (inner->bitdepth != (guint) sps->bit_depth_luma_minus8 + 8) { + GST_INFO_OBJECT (self, "bitdepth changed"); + inner->bitdepth = sps->bit_depth_luma_minus8 + 8; + modified = TRUE; + } + + if (sps->vui_parameters_present_flag && sps->vui_params.field_seq_flag) { + interlace_mode = GST_VIDEO_INTERLACE_MODE_ALTERNATE; + } else { + /* 7.4.4 Profile, tier and level sementics */ + if (sps->profile_tier_level.progressive_source_flag && + !sps->profile_tier_level.interlaced_source_flag) { + interlace_mode = GST_VIDEO_INTERLACE_MODE_PROGRESSIVE; + } else { + interlace_mode = GST_VIDEO_INTERLACE_MODE_MIXED; + } + } + + if (inner->interlace_mode != interlace_mode) { + GST_INFO_OBJECT (self, "Interlace mode change %d -> %d", + inner->interlace_mode, interlace_mode); + inner->interlace_mode = interlace_mode; + modified = TRUE; + } + + if (inner->chroma_format_idc != sps->chroma_format_idc) { + GST_INFO_OBJECT (self, "chroma format changed"); + inner->chroma_format_idc = sps->chroma_format_idc; + modified = TRUE; + } + + if (modified || !gst_d3d11_decoder_is_configured (inner->d3d11_decoder)) { + GstVideoInfo info; + + inner->out_format = GST_VIDEO_FORMAT_UNKNOWN; + + if (inner->bitdepth == 8) { + if (inner->chroma_format_idc == 1) { + inner->out_format = GST_VIDEO_FORMAT_NV12; + } else { + GST_FIXME_OBJECT (self, "Could not support 8bits non-4:2:0 format"); + } + } else if (inner->bitdepth == 10) { + if (inner->chroma_format_idc == 1) { + inner->out_format = GST_VIDEO_FORMAT_P010_10LE; + } else { + GST_FIXME_OBJECT (self, "Could not support 10bits non-4:2:0 format"); + } + } + + if (inner->out_format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ERROR_OBJECT (self, "Could not support bitdepth/chroma format"); + return 
GST_FLOW_NOT_NEGOTIATED; + } + + gst_video_info_set_format (&info, + inner->out_format, inner->width, inner->height); + GST_VIDEO_INFO_INTERLACE_MODE (&info) = inner->interlace_mode; + + if (!gst_d3d11_decoder_configure (inner->d3d11_decoder, + decoder->input_state, &info, + inner->coded_width, inner->coded_height, + /* Additional 4 views margin for zero-copy rendering */ + max_dpb_size + 4)) { + GST_ERROR_OBJECT (self, "Failed to create decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h265_dec_new_picture (GstH265Decoder * decoder, + GstVideoCodecFrame * cframe, GstH265Picture * picture) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + GstBuffer *view_buffer; + + view_buffer = gst_d3d11_decoder_get_output_view_buffer (inner->d3d11_decoder, + GST_VIDEO_DECODER (decoder)); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "No available output view buffer"); + return GST_FLOW_FLUSHING; + } + + GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT, view_buffer); + + gst_h265_picture_set_user_data (picture, + view_buffer, (GDestroyNotify) gst_buffer_unref); + + GST_LOG_OBJECT (self, "New h265picture %p", picture); + + return GST_FLOW_OK; +} + +static void +gst_d3d11_h265_dec_picture_params_from_sps (GstD3D11H265Dec * self, + const GstH265SPS * sps, DXVA_PicParams_HEVC * params) +{ +#define COPY_FIELD(f) \ + (params)->f = (sps)->f +#define COPY_FIELD_WITH_PREFIX(f) \ + (params)->G_PASTE(sps_,f) = (sps)->f + + params->PicWidthInMinCbsY = + sps->width >> (sps->log2_min_luma_coding_block_size_minus3 + 3); + params->PicHeightInMinCbsY = + sps->height >> (sps->log2_min_luma_coding_block_size_minus3 + 3); + params->sps_max_dec_pic_buffering_minus1 = + 
sps->max_dec_pic_buffering_minus1[sps->max_sub_layers_minus1]; + + COPY_FIELD (chroma_format_idc); + COPY_FIELD (separate_colour_plane_flag); + COPY_FIELD (bit_depth_luma_minus8); + COPY_FIELD (bit_depth_chroma_minus8); + COPY_FIELD (log2_max_pic_order_cnt_lsb_minus4); + COPY_FIELD (log2_min_luma_coding_block_size_minus3); + COPY_FIELD (log2_diff_max_min_luma_coding_block_size); + COPY_FIELD (log2_min_transform_block_size_minus2); + COPY_FIELD (log2_diff_max_min_transform_block_size); + COPY_FIELD (max_transform_hierarchy_depth_inter); + COPY_FIELD (max_transform_hierarchy_depth_intra); + COPY_FIELD (num_short_term_ref_pic_sets); + COPY_FIELD (num_long_term_ref_pics_sps); + COPY_FIELD (scaling_list_enabled_flag); + COPY_FIELD (amp_enabled_flag); + COPY_FIELD (sample_adaptive_offset_enabled_flag); + COPY_FIELD (pcm_enabled_flag); + + if (sps->pcm_enabled_flag) { + COPY_FIELD (pcm_sample_bit_depth_luma_minus1); + COPY_FIELD (pcm_sample_bit_depth_chroma_minus1); + COPY_FIELD (log2_min_pcm_luma_coding_block_size_minus3); + COPY_FIELD (log2_diff_max_min_pcm_luma_coding_block_size); + } + + COPY_FIELD (pcm_loop_filter_disabled_flag); + COPY_FIELD (long_term_ref_pics_present_flag); + COPY_FIELD_WITH_PREFIX (temporal_mvp_enabled_flag); + COPY_FIELD (strong_intra_smoothing_enabled_flag); + +#undef COPY_FIELD +#undef COPY_FIELD_WITH_PREFIX +} + +static void +gst_d3d11_h265_dec_picture_params_from_pps (GstD3D11H265Dec * self, + const GstH265PPS * pps, DXVA_PicParams_HEVC * params) +{ + guint i; + +#define COPY_FIELD(f) \ + (params)->f = (pps)->f +#define COPY_FIELD_WITH_PREFIX(f) \ + (params)->G_PASTE(pps_,f) = (pps)->f + + COPY_FIELD (num_ref_idx_l0_default_active_minus1); + COPY_FIELD (num_ref_idx_l1_default_active_minus1); + COPY_FIELD (init_qp_minus26); + COPY_FIELD (dependent_slice_segments_enabled_flag); + COPY_FIELD (output_flag_present_flag); + COPY_FIELD (num_extra_slice_header_bits); + COPY_FIELD (sign_data_hiding_enabled_flag); + COPY_FIELD 
(cabac_init_present_flag); + COPY_FIELD (constrained_intra_pred_flag); + COPY_FIELD (transform_skip_enabled_flag); + COPY_FIELD (cu_qp_delta_enabled_flag); + COPY_FIELD_WITH_PREFIX (slice_chroma_qp_offsets_present_flag); + COPY_FIELD (weighted_pred_flag); + COPY_FIELD (weighted_bipred_flag); + COPY_FIELD (transquant_bypass_enabled_flag); + COPY_FIELD (tiles_enabled_flag); + COPY_FIELD (entropy_coding_sync_enabled_flag); + COPY_FIELD (uniform_spacing_flag); + + if (pps->tiles_enabled_flag) + COPY_FIELD (loop_filter_across_tiles_enabled_flag); + + COPY_FIELD_WITH_PREFIX (loop_filter_across_slices_enabled_flag); + COPY_FIELD (deblocking_filter_override_enabled_flag); + COPY_FIELD_WITH_PREFIX (deblocking_filter_disabled_flag); + COPY_FIELD (lists_modification_present_flag); + COPY_FIELD (slice_segment_header_extension_present_flag); + COPY_FIELD_WITH_PREFIX (cb_qp_offset); + COPY_FIELD_WITH_PREFIX (cr_qp_offset); + + if (pps->tiles_enabled_flag) { + COPY_FIELD (num_tile_columns_minus1); + COPY_FIELD (num_tile_rows_minus1); + if (!pps->uniform_spacing_flag) { + for (i = 0; i < pps->num_tile_columns_minus1 && + i < G_N_ELEMENTS (params->column_width_minus1); i++) + COPY_FIELD (column_width_minus1[i]); + + for (i = 0; i < pps->num_tile_rows_minus1 && + i < G_N_ELEMENTS (params->row_height_minus1); i++) + COPY_FIELD (row_height_minus1[i]); + } + } + + COPY_FIELD (diff_cu_qp_delta_depth); + COPY_FIELD_WITH_PREFIX (beta_offset_div2); + COPY_FIELD_WITH_PREFIX (tc_offset_div2); + COPY_FIELD (log2_parallel_merge_level_minus2); + +#undef COPY_FIELD +#undef COPY_FIELD_WITH_PREFIX +} + +static void +gst_d3d11_h265_dec_picture_params_from_slice_header (GstD3D11H265Dec * + self, const GstH265SliceHdr * slice_header, DXVA_PicParams_HEVC * params) +{ + if (slice_header->short_term_ref_pic_set_sps_flag == 0) { + params->ucNumDeltaPocsOfRefRpsIdx = + slice_header->short_term_ref_pic_sets.NumDeltaPocsOfRefRpsIdx; + params->wNumBitsForShortTermRPSInSlice = + 
slice_header->short_term_ref_pic_set_size; + } +} + +static gboolean +gst_d3d11_h265_dec_fill_picture_params (GstD3D11H265Dec * self, + const GstH265SliceHdr * slice_header, DXVA_PicParams_HEVC * params) +{ + const GstH265SPS *sps; + const GstH265PPS *pps; + + g_return_val_if_fail (slice_header->pps != NULL, FALSE); + g_return_val_if_fail (slice_header->pps->sps != NULL, FALSE); + + pps = slice_header->pps; + sps = pps->sps; + + /* not related to hevc syntax */ + params->NoPicReorderingFlag = 0; + params->NoBiPredFlag = 0; + params->ReservedBits1 = 0; + params->StatusReportFeedbackNumber = 1; + + gst_d3d11_h265_dec_picture_params_from_sps (self, sps, params); + gst_d3d11_h265_dec_picture_params_from_pps (self, pps, params); + gst_d3d11_h265_dec_picture_params_from_slice_header (self, + slice_header, params); + + return TRUE; +} + +static ID3D11VideoDecoderOutputView * +gst_d3d11_h265_dec_get_output_view_from_picture (GstD3D11H265Dec * self, + GstH265Picture * picture, guint8 * view_id) +{ + GstD3D11H265DecInner *inner = self->inner; + GstBuffer *view_buffer; + ID3D11VideoDecoderOutputView *view; + + view_buffer = (GstBuffer *) gst_h265_picture_get_user_data (picture); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); + return NULL; + } + + view = gst_d3d11_decoder_get_output_view_from_buffer (inner->d3d11_decoder, + view_buffer, view_id); + if (!view) { + GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); + return NULL; + } + + return view; +} + +static UCHAR +gst_d3d11_h265_dec_get_ref_index (const DXVA_PicParams_HEVC * pic_params, + guint8 view_id) +{ + for (UCHAR i = 0; i < G_N_ELEMENTS (pic_params->RefPicList); i++) { + if (pic_params->RefPicList[i].Index7Bits == view_id) + return i; + } + + return 0xff; +} + +static inline void +init_pic_params (DXVA_PicParams_HEVC * params) +{ + memset (params, 0, sizeof (DXVA_PicParams_HEVC)); + for (guint i = 0; i < G_N_ELEMENTS 
(params->RefPicList); i++) + params->RefPicList[i].bPicEntry = 0xff; + + for (guint i = 0; i < G_N_ELEMENTS (params->RefPicSetStCurrBefore); i++) { + params->RefPicSetStCurrBefore[i] = 0xff; + params->RefPicSetStCurrAfter[i] = 0xff; + params->RefPicSetLtCurr[i] = 0xff; + } +} + +static GstFlowReturn +gst_d3d11_h265_dec_start_picture (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + DXVA_PicParams_HEVC *pic_params = &inner->pic_params; + DXVA_Qmatrix_HEVC *iq_matrix = &inner->iq_matrix; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + guint i, j; + GArray *dpb_array; + GstH265SPS *sps; + GstH265PPS *pps; + GstH265ScalingList *scaling_list = nullptr; + + pps = slice->header.pps; + sps = pps->sps; + + view = gst_d3d11_h265_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + init_pic_params (pic_params); + gst_d3d11_h265_dec_fill_picture_params (self, &slice->header, pic_params); + + pic_params->CurrPic.Index7Bits = view_id; + pic_params->IrapPicFlag = GST_H265_IS_NAL_TYPE_IRAP (slice->nalu.type); + pic_params->IdrPicFlag = GST_H265_IS_NAL_TYPE_IDR (slice->nalu.type); + pic_params->IntraPicFlag = GST_H265_IS_NAL_TYPE_IRAP (slice->nalu.type); + pic_params->CurrPicOrderCntVal = picture->pic_order_cnt; + + dpb_array = gst_h265_dpb_get_pictures_all (dpb); + for (i = 0, j = 0; + i < dpb_array->len && j < G_N_ELEMENTS (pic_params->RefPicList); i++) { + GstH265Picture *other = g_array_index (dpb_array, GstH265Picture *, i); + guint8 id = 0xff; + + if (!other->ref) { + GST_LOG_OBJECT (self, "%dth picture in dpb is not reference, skip", i); + continue; + } + + gst_d3d11_h265_dec_get_output_view_from_picture (self, other, &id); + pic_params->RefPicList[j].Index7Bits = id; + 
pic_params->RefPicList[j].AssociatedFlag = other->long_term; + pic_params->PicOrderCntValList[j] = other->pic_order_cnt; + j++; + } + g_array_unref (dpb_array); + + for (i = 0, j = 0; i < G_N_ELEMENTS (pic_params->RefPicSetStCurrBefore); i++) { + GstH265Picture *other = nullptr; + guint8 other_view_id = 0xff; + guint8 id = 0xff; + + while (!other && j < decoder->NumPocStCurrBefore) + other = decoder->RefPicSetStCurrBefore[j++]; + + if (other) { + ID3D11VideoDecoderOutputView *other_view; + + other_view = gst_d3d11_h265_dec_get_output_view_from_picture (self, + other, &other_view_id); + + if (other_view) + id = gst_d3d11_h265_dec_get_ref_index (pic_params, other_view_id); + } + + pic_params->RefPicSetStCurrBefore[i] = id; + } + + for (i = 0, j = 0; i < G_N_ELEMENTS (pic_params->RefPicSetStCurrAfter); i++) { + GstH265Picture *other = nullptr; + guint8 other_view_id = 0xff; + guint8 id = 0xff; + + while (!other && j < decoder->NumPocStCurrAfter) + other = decoder->RefPicSetStCurrAfter[j++]; + + if (other) { + ID3D11VideoDecoderOutputView *other_view; + + other_view = gst_d3d11_h265_dec_get_output_view_from_picture (self, + other, &other_view_id); + + if (other_view) + id = gst_d3d11_h265_dec_get_ref_index (pic_params, other_view_id); + } + + pic_params->RefPicSetStCurrAfter[i] = id; + } + + for (i = 0, j = 0; i < G_N_ELEMENTS (pic_params->RefPicSetLtCurr); i++) { + GstH265Picture *other = nullptr; + guint8 other_view_id = 0xff; + guint8 id = 0xff; + + while (!other && j < decoder->NumPocLtCurr) + other = decoder->RefPicSetLtCurr[j++]; + + if (other) { + ID3D11VideoDecoderOutputView *other_view; + + other_view = gst_d3d11_h265_dec_get_output_view_from_picture (self, + other, &other_view_id); + + if (other_view) + id = gst_d3d11_h265_dec_get_ref_index (pic_params, other_view_id); + } + + pic_params->RefPicSetLtCurr[i] = id; + } + + if (pps->scaling_list_data_present_flag || + (sps->scaling_list_enabled_flag + && !sps->scaling_list_data_present_flag)) { + scaling_list = 
&pps->scaling_list; + } else if (sps->scaling_list_enabled_flag && + sps->scaling_list_data_present_flag) { + scaling_list = &sps->scaling_list; + } + + if (scaling_list) { + G_STATIC_ASSERT (sizeof (iq_matrix->ucScalingLists0) == + sizeof (scaling_list->scaling_lists_4x4)); + G_STATIC_ASSERT (sizeof (iq_matrix->ucScalingLists1) == + sizeof (scaling_list->scaling_lists_8x8)); + G_STATIC_ASSERT (sizeof (iq_matrix->ucScalingLists2) == + sizeof (scaling_list->scaling_lists_16x16)); + G_STATIC_ASSERT (sizeof (iq_matrix->ucScalingLists3) == + sizeof (scaling_list->scaling_lists_32x32)); + + memcpy (iq_matrix->ucScalingLists0, scaling_list->scaling_lists_4x4, + sizeof (iq_matrix->ucScalingLists0)); + memcpy (iq_matrix->ucScalingLists1, scaling_list->scaling_lists_8x8, + sizeof (iq_matrix->ucScalingLists1)); + memcpy (iq_matrix->ucScalingLists2, scaling_list->scaling_lists_16x16, + sizeof (iq_matrix->ucScalingLists2)); + memcpy (iq_matrix->ucScalingLists3, scaling_list->scaling_lists_32x32, + sizeof (iq_matrix->ucScalingLists3)); + + for (i = 0; i < G_N_ELEMENTS (iq_matrix->ucScalingListDCCoefSizeID2); i++) { + iq_matrix->ucScalingListDCCoefSizeID2[i] = + scaling_list->scaling_list_dc_coef_minus8_16x16[i] + 8; + } + + for (i = 0; i < G_N_ELEMENTS (iq_matrix->ucScalingListDCCoefSizeID3); i++) { + iq_matrix->ucScalingListDCCoefSizeID3[i] = + scaling_list->scaling_list_dc_coef_minus8_32x32[i] + 8; + } + + inner->submit_iq_data = TRUE; + } else { + inner->submit_iq_data = FALSE; + } + + inner->slice_list.resize (0); + inner->bitstream_buffer.resize (0); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h265_dec_decode_slice (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, + GArray * ref_pic_list0, GArray * ref_pic_list1) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + DXVA_Slice_HEVC_Short dxva_slice; + static const guint8 start_code[] = { 0, 0, 1 }; + const size_t 
start_code_size = sizeof (start_code); + + dxva_slice.BSNALunitDataLocation = inner->bitstream_buffer.size (); + /* Includes 3 bytes start code prefix */ + dxva_slice.SliceBytesInBuffer = slice->nalu.size + start_code_size; + dxva_slice.wBadSliceChopping = 0; + + inner->slice_list.push_back (dxva_slice); + + size_t pos = inner->bitstream_buffer.size (); + inner->bitstream_buffer.resize (pos + start_code_size + slice->nalu.size); + + /* Fill start code prefix */ + memcpy (&inner->bitstream_buffer[0] + pos, start_code, start_code_size); + + /* Copy bitstream */ + memcpy (&inner->bitstream_buffer[0] + pos + start_code_size, + slice->nalu.data + slice->nalu.offset, slice->nalu.size); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h265_dec_end_picture (GstH265Decoder * decoder, + GstH265Picture * picture) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + size_t bitstream_buffer_size; + size_t bitstream_pos; + GstD3D11DecodeInputStreamArgs input_args; + + GST_LOG_OBJECT (self, "end picture %p, (poc %d)", + picture, picture->pic_order_cnt); + + if (inner->bitstream_buffer.empty () || inner->slice_list.empty ()) { + GST_ERROR_OBJECT (self, "No bitstream buffer to submit"); + return GST_FLOW_ERROR; + } + + view = gst_d3d11_h265_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (&input_args, 0, sizeof (GstD3D11DecodeInputStreamArgs)); + + bitstream_pos = inner->bitstream_buffer.size (); + bitstream_buffer_size = GST_ROUND_UP_128 (bitstream_pos); + + if (bitstream_buffer_size > bitstream_pos) { + size_t padding = bitstream_buffer_size - bitstream_pos; + + /* As per DXVA spec, total amount of bitstream buffer size should be + * 128 bytes aligned. 
If actual data is not multiple of 128 bytes, + * the last slice data needs to be zero-padded */ + inner->bitstream_buffer.resize (bitstream_buffer_size, 0); + + DXVA_Slice_HEVC_Short & slice = inner->slice_list.back (); + slice.SliceBytesInBuffer += padding; + } + + input_args.picture_params = &inner->pic_params; + input_args.picture_params_size = sizeof (DXVA_PicParams_HEVC); + input_args.slice_control = &inner->slice_list[0]; + input_args.slice_control_size = + sizeof (DXVA_Slice_HEVC_Short) * inner->slice_list.size (); + input_args.bitstream = &inner->bitstream_buffer[0]; + input_args.bitstream_size = inner->bitstream_buffer.size (); + + if (inner->submit_iq_data) { + input_args.inverse_quantization_matrix = &inner->iq_matrix; + input_args.inverse_quantization_matrix_size = sizeof (DXVA_Qmatrix_HEVC); + } + + if (!gst_d3d11_decoder_decode_frame (inner->d3d11_decoder, view, &input_args)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_h265_dec_output_picture (GstH265Decoder * decoder, + GstVideoCodecFrame * frame, GstH265Picture * picture) +{ + GstD3D11H265Dec *self = GST_D3D11_H265_DEC (decoder); + GstD3D11H265DecInner *inner = self->inner; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstBuffer *view_buffer; + + GST_LOG_OBJECT (self, "Outputting picture %p, poc %d, picture_struct %d, " + "buffer flags 0x%x", picture, picture->pic_order_cnt, picture->pic_struct, + picture->buffer_flags); + + view_buffer = (GstBuffer *) gst_h265_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Could not get output view"); + goto error; + } + + if (!gst_d3d11_decoder_process_output (inner->d3d11_decoder, vdec, + inner->width, inner->height, view_buffer, &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to copy buffer"); + goto error; + } + + GST_BUFFER_FLAG_SET (frame->output_buffer, picture->buffer_flags); + gst_h265_picture_unref (picture); + + return gst_video_decoder_finish_frame 
(GST_VIDEO_DECODER (self), frame); + +error: + gst_h265_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); + + return GST_FLOW_ERROR; +} + +void +gst_d3d11_h265_dec_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank) +{ + GType type; + gchar *type_name; + gchar *feature_name; + guint index = 0; + guint i; + const GUID *profile = NULL; + GTypeInfo type_info = { + sizeof (GstD3D11H265DecClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_h265_dec_class_init, + NULL, + NULL, + sizeof (GstD3D11H265Dec), + 0, + (GInstanceInitFunc) gst_d3d11_h265_dec_init, + }; + const GUID *main_10_guid = NULL; + const GUID *main_guid = NULL; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + GstCaps *src_caps_copy; + GstCaps *tmp; + GstCapsFeatures *caps_features; + guint max_width = 0; + guint max_height = 0; + guint resolution; + gboolean have_main10 = FALSE; + gboolean have_main = FALSE; + DXGI_FORMAT format = DXGI_FORMAT_UNKNOWN; + + have_main10 = gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_H265, GST_VIDEO_FORMAT_P010_10LE, &main_10_guid); + if (!have_main10) { + GST_DEBUG_OBJECT (device, "decoder does not support HEVC_VLD_MAIN10"); + } else { + have_main10 &= + gst_d3d11_decoder_supports_format (device, main_10_guid, + DXGI_FORMAT_P010); + if (!have_main10) { + GST_FIXME_OBJECT (device, "device does not support P010 format"); + } + } + + have_main = gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_H265, GST_VIDEO_FORMAT_NV12, &main_guid); + if (!have_main) { + GST_DEBUG_OBJECT (device, "decoder does not support HEVC_VLD_MAIN"); + } else { + have_main = + gst_d3d11_decoder_supports_format (device, main_guid, DXGI_FORMAT_NV12); + if (!have_main) { + GST_FIXME_OBJECT (device, "device does not support NV12 format"); + } + } + + if (!have_main10 && !have_main) { + GST_INFO_OBJECT (device, "device does not support h.265 decoding"); + return; + } + + if (have_main) { + profile = 
main_guid; + format = DXGI_FORMAT_NV12; + } else { + profile = main_10_guid; + format = DXGI_FORMAT_P010; + } + + for (i = 0; i < G_N_ELEMENTS (gst_dxva_resolutions); i++) { + if (gst_d3d11_decoder_supports_resolution (device, profile, + format, gst_dxva_resolutions[i].width, + gst_dxva_resolutions[i].height)) { + max_width = gst_dxva_resolutions[i].width; + max_height = gst_dxva_resolutions[i].height; + + GST_DEBUG_OBJECT (device, + "device support resolution %dx%d", max_width, max_height); + } else { + break; + } + } + + if (max_width == 0 || max_height == 0) { + GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); + return; + } + + sink_caps = gst_caps_from_string ("video/x-h265, " + "stream-format=(string) { hev1, hvc1, byte-stream }, " + "alignment= (string) au"); + src_caps = gst_caps_new_empty_simple ("video/x-raw"); + + if (have_main10) { + /* main10 profile covers main and main10 */ + GValue profile_list = G_VALUE_INIT; + GValue profile_value = G_VALUE_INIT; + GValue format_list = G_VALUE_INIT; + GValue format_value = G_VALUE_INIT; + + g_value_init (&profile_list, GST_TYPE_LIST); + + g_value_init (&profile_value, G_TYPE_STRING); + g_value_set_string (&profile_value, "main"); + gst_value_list_append_and_take_value (&profile_list, &profile_value); + + g_value_init (&profile_value, G_TYPE_STRING); + g_value_set_string (&profile_value, "main-10"); + gst_value_list_append_and_take_value (&profile_list, &profile_value); + + + g_value_init (&format_list, GST_TYPE_LIST); + + g_value_init (&format_value, G_TYPE_STRING); + g_value_set_string (&format_value, "NV12"); + gst_value_list_append_and_take_value (&format_list, &format_value); + + g_value_init (&format_value, G_TYPE_STRING); + g_value_set_string (&format_value, "P010_10LE"); + gst_value_list_append_and_take_value (&format_list, &format_value); + + gst_caps_set_value (sink_caps, "profile", &profile_list); + gst_caps_set_value (src_caps, "format", &format_list); + g_value_unset (&profile_list); 
+ g_value_unset (&format_list); + } else { + gst_caps_set_simple (sink_caps, "profile", G_TYPE_STRING, "main", NULL); + gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); + } + + /* To cover both landscape and portrait, select max value */ + resolution = MAX (max_width, max_height); + gst_caps_set_simple (sink_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + + /* Copy src caps to append other capsfeatures */ + src_caps_copy = gst_caps_copy (src_caps); + + /* System memory with alternate interlace-mode */ + tmp = gst_caps_copy (src_caps_copy); + caps_features = gst_caps_features_new (GST_CAPS_FEATURE_FORMAT_INTERLACED, + NULL); + gst_caps_set_features_simple (tmp, caps_features); + gst_caps_set_simple (tmp, "interlace-mode", G_TYPE_STRING, "alternate", NULL); + gst_caps_append (src_caps, tmp); + + /* D3D11 memory feature */ + tmp = gst_caps_copy (src_caps_copy); + caps_features = gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, + NULL); + gst_caps_set_features_simple (tmp, caps_features); + gst_caps_append (src_caps, tmp); + + /* FIXME: D3D11 deinterlace element is not prepared, so this D3D11 with + * interlaced caps feature is pointless at the moment */ +#if 0 + /* D3D11 memory with alternate interlace-mode */ + tmp = gst_caps_copy (src_caps_copy); + caps_features = gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, + GST_CAPS_FEATURE_FORMAT_INTERLACED, NULL); + gst_caps_set_features_simple (tmp, caps_features); + gst_caps_set_simple (tmp, "interlace-mode", G_TYPE_STRING, "alternate", NULL); + gst_caps_append (src_caps, tmp); +#endif + + gst_caps_unref (src_caps_copy); + + type_info.class_data = + gst_d3d11_decoder_class_data_new (device, GST_DXVA_CODEC_H265, + sink_caps, src_caps); + + type_name = g_strdup ("GstD3D11H265Dec"); + 
feature_name = g_strdup ("d3d11h265dec"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11H265Device%dDec", index); + feature_name = g_strdup_printf ("d3d11h265device%ddec", index); + } + + type = g_type_register_static (GST_TYPE_H265_DECODER, + type_name, &type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}

gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11h265dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11h265dec.h
Changed
@@ -26,7 +26,6 @@ void gst_d3d11_h265_dec_register (GstPlugin * plugin, GstD3D11Device * device, - GstD3D11Decoder * decoder, guint rank); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11mpeg2dec.cpp
Added
@@ -0,0 +1,856 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11mpeg2dec + * @title: d3d11mpeg2dec + * + * A Direct3D11/DXVA based MPEG-2 video decoder + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/mpeg2/file ! parsebin ! d3d11mpeg2dec ! 
d3d11videosink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11mpeg2dec.h" + +#include <gst/codecs/gstmpeg2decoder.h> +#include <string.h> +#include <vector> + +/* HACK: to expose dxva data structure on UWP */ +#ifdef WINAPI_PARTITION_DESKTOP +#undef WINAPI_PARTITION_DESKTOP +#endif +#define WINAPI_PARTITION_DESKTOP 1 +#include <d3d9.h> +#include <dxva.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_mpeg2_dec_debug); +#define GST_CAT_DEFAULT gst_d3d11_mpeg2_dec_debug + +/* reference list 2 + 4 margin */ +#define NUM_OUTPUT_VIEW 6 + +/* *INDENT-OFF* */ +typedef struct _GstD3D11Mpeg2DecInner +{ + GstD3D11Device *device = nullptr; + GstD3D11Decoder *d3d11_decoder = nullptr; + + DXVA_PictureParameters pic_params; + DXVA_QmatrixData iq_matrix; + + std::vector<DXVA_SliceInfo> slice_list; + std::vector<guint8> bitstream_buffer; + + gboolean submit_iq_data; + + gint width = 0; + gint height = 0; + guint width_in_mb = 0; + guint height_in_mb = 0; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; + GstMpegVideoSequenceHdr seq; + GstMpegVideoProfile profile = GST_MPEG_VIDEO_PROFILE_MAIN; + gboolean interlaced = FALSE; +} GstD3D11Mpeg2DecInner; +/* *INDENT-ON* */ + +typedef struct _GstD3D11Mpeg2Dec +{ + GstMpeg2Decoder parent; + GstD3D11Mpeg2DecInner *inner; +} GstD3D11Mpeg2Dec; + +typedef struct _GstD3D11Mpeg2DecClass +{ + GstMpeg2DecoderClass parent_class; + GstD3D11DecoderSubClassData class_data; +} GstD3D11Mpeg2DecClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_MPEG2_DEC(object) ((GstD3D11Mpeg2Dec *) (object)) +#define GST_D3D11_MPEG2_DEC_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11Mpeg2DecClass)) + +static void gst_d3d11_mpeg2_dec_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_mpeg2_dec_finalize (GObject * object); +static void gst_d3d11_mpeg2_dec_set_context 
(GstElement * element, + GstContext * context); + +static gboolean gst_d3d11_mpeg2_dec_open (GstVideoDecoder * decoder); +static gboolean gst_d3d11_mpeg2_dec_close (GstVideoDecoder * decoder); +static gboolean gst_d3d11_mpeg2_dec_negotiate (GstVideoDecoder * decoder); +static gboolean gst_d3d11_mpeg2_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_d3d11_mpeg2_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); +static gboolean gst_d3d11_mpeg2_dec_sink_event (GstVideoDecoder * decoder, + GstEvent * event); + +/* GstMpeg2Decoder */ +static GstFlowReturn gst_d3d11_mpeg2_dec_new_sequence (GstMpeg2Decoder * + decoder, const GstMpegVideoSequenceHdr * seq, + const GstMpegVideoSequenceExt * seq_ext, + const GstMpegVideoSequenceDisplayExt * seq_display_ext, + const GstMpegVideoSequenceScalableExt * seq_scalable_ext); +static GstFlowReturn gst_d3d11_mpeg2_dec_new_picture (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, GstMpeg2Picture * picture); +static GstFlowReturn gst_d3d11_mpeg2_dec_new_field_picture (GstMpeg2Decoder * + decoder, const GstMpeg2Picture * first_field, + GstMpeg2Picture * second_field); +static GstFlowReturn gst_d3d11_mpeg2_dec_start_picture (GstMpeg2Decoder * + decoder, GstMpeg2Picture * picture, GstMpeg2Slice * slice, + GstMpeg2Picture * prev_picture, GstMpeg2Picture * next_picture); +static GstFlowReturn gst_d3d11_mpeg2_dec_decode_slice (GstMpeg2Decoder * + decoder, GstMpeg2Picture * picture, GstMpeg2Slice * slice); +static GstFlowReturn gst_d3d11_mpeg2_dec_end_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture); +static GstFlowReturn gst_d3d11_mpeg2_dec_output_picture (GstMpeg2Decoder * + decoder, GstVideoCodecFrame * frame, GstMpeg2Picture * picture); + +static void +gst_d3d11_mpeg2_dec_class_init (GstD3D11Mpeg2DecClass * klass, gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + 
GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstMpeg2DecoderClass *mpeg2decoder_class = GST_MPEG2_DECODER_CLASS (klass); + GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; + + gobject_class->get_property = gst_d3d11_mpeg2_dec_get_property; + gobject_class->finalize = gst_d3d11_mpeg2_dec_finalize; + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_set_context); + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + gst_d3d11_decoder_class_data_fill_subclass_data (cdata, &klass->class_data); + + /** + * GstD3D11Mpeg2Dec:adapter-luid: + * + * DXGI Adapter LUID for this element + * + * Since: 1.20 + */ + gst_d3d11_decoder_proxy_class_init (element_class, cdata, + "Seungha Yang <seungha@centricular.com>"); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_decide_allocation); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_src_query); + decoder_class->sink_event = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_sink_event); + + mpeg2decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_new_sequence); + mpeg2decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_new_picture); + mpeg2decoder_class->new_field_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_new_field_picture); + mpeg2decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_start_picture); + mpeg2decoder_class->decode_slice = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_decode_slice); + mpeg2decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_end_picture); + mpeg2decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_mpeg2_dec_output_picture); +} + +static void +gst_d3d11_mpeg2_dec_init 
(GstD3D11Mpeg2Dec * self) +{ + self->inner = new GstD3D11Mpeg2DecInner (); +} + +static void +gst_d3d11_mpeg2_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11Mpeg2DecClass *klass = GST_D3D11_MPEG2_DEC_GET_CLASS (object); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_decoder_proxy_get_property (object, prop_id, value, pspec, cdata); +} + +static void +gst_d3d11_mpeg2_dec_finalize (GObject * object) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (object); + + delete self->inner; + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_mpeg2_dec_set_context (GstElement * element, GstContext * context) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (element); + GstD3D11Mpeg2DecInner *inner = self->inner; + GstD3D11Mpeg2DecClass *klass = GST_D3D11_MPEG2_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, cdata->adapter_luid, &inner->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_mpeg2_dec_open (GstVideoDecoder * decoder) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + GstD3D11Mpeg2DecClass *klass = GST_D3D11_MPEG2_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + if (!gst_d3d11_decoder_proxy_open (decoder, + cdata, &inner->device, &inner->d3d11_decoder)) { + GST_ERROR_OBJECT (self, "Failed to open decoder"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_mpeg2_dec_close (GstVideoDecoder * decoder) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + + gst_clear_object (&inner->d3d11_decoder); + gst_clear_object (&inner->device); + + return TRUE; +} + +static gboolean +gst_d3d11_mpeg2_dec_negotiate (GstVideoDecoder * decoder) 
+{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_negotiate (inner->d3d11_decoder, decoder)) + return FALSE; + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_d3d11_mpeg2_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_decide_allocation (inner->d3d11_decoder, + decoder, query)) { + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_d3d11_mpeg2_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), + query, inner->device)) { + return TRUE; + } + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static gboolean +gst_d3d11_mpeg2_dec_sink_event (GstVideoDecoder * decoder, GstEvent * event) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, TRUE); + break; + case GST_EVENT_FLUSH_STOP: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, FALSE); + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_new_sequence (GstMpeg2Decoder * decoder, + const GstMpegVideoSequenceHdr * seq, + const GstMpegVideoSequenceExt * seq_ext, + const GstMpegVideoSequenceDisplayExt * 
seq_display_ext, + const GstMpegVideoSequenceScalableExt * seq_scalable_ext) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + gboolean interlaced; + gboolean modified = FALSE; + gint width, height; + GstMpegVideoProfile mpeg_profile; + + GST_LOG_OBJECT (self, "new sequence"); + + interlaced = seq_ext ? !seq_ext->progressive : FALSE; + if (inner->interlaced != interlaced) { + GST_INFO_OBJECT (self, "interlaced sequence change"); + inner->interlaced = interlaced; + modified = TRUE; + } + + width = seq->width; + height = seq->height; + if (seq_ext) { + width = (width & 0x0fff) | ((guint32) seq_ext->horiz_size_ext << 12); + height = (height & 0x0fff) | ((guint32) seq_ext->vert_size_ext << 12); + } + + if (inner->width != width || inner->height != height) { + GST_INFO_OBJECT (self, "resolution change %dx%d -> %dx%d", + inner->width, inner->height, width, height); + inner->width = width; + inner->height = height; + inner->width_in_mb = GST_ROUND_UP_16 (width) >> 4; + inner->height_in_mb = GST_ROUND_UP_16 (height) >> 4; + modified = TRUE; + } + + mpeg_profile = GST_MPEG_VIDEO_PROFILE_MAIN; + if (seq_ext) + mpeg_profile = (GstMpegVideoProfile) seq_ext->profile; + + if (mpeg_profile != GST_MPEG_VIDEO_PROFILE_MAIN && + mpeg_profile != GST_MPEG_VIDEO_PROFILE_SIMPLE) { + GST_ERROR_OBJECT (self, "Cannot support profile %d", mpeg_profile); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (inner->profile != mpeg_profile) { + GST_INFO_OBJECT (self, "Profile change %d -> %d", + inner->profile, mpeg_profile); + inner->profile = mpeg_profile; + modified = TRUE; + } + + if (modified || !gst_d3d11_decoder_is_configured (inner->d3d11_decoder)) { + GstVideoInfo info; + + /* FIXME: support I420 */ + inner->out_format = GST_VIDEO_FORMAT_NV12; + + gst_video_info_set_format (&info, + inner->out_format, inner->width, inner->height); + if (inner->interlaced) + GST_VIDEO_INFO_INTERLACE_MODE (&info) = GST_VIDEO_INTERLACE_MODE_MIXED; + + if 
(!gst_d3d11_decoder_configure (inner->d3d11_decoder, + decoder->input_state, &info, + inner->width, inner->height, NUM_OUTPUT_VIEW)) { + GST_ERROR_OBJECT (self, "Failed to create decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_new_picture (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, GstMpeg2Picture * picture) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + GstBuffer *view_buffer; + + view_buffer = gst_d3d11_decoder_get_output_view_buffer (inner->d3d11_decoder, + GST_VIDEO_DECODER (decoder)); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "No available output view buffer"); + return GST_FLOW_ERROR; + } + + GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT, view_buffer); + + gst_mpeg2_picture_set_user_data (picture, + view_buffer, (GDestroyNotify) gst_buffer_unref); + + GST_LOG_OBJECT (self, "New MPEG2 picture %p", picture); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_new_field_picture (GstMpeg2Decoder * decoder, + const GstMpeg2Picture * first_field, GstMpeg2Picture * second_field) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstBuffer *view_buffer; + + view_buffer = (GstBuffer *) + gst_mpeg2_picture_get_user_data ((GstMpeg2Picture *) first_field); + + if (!view_buffer) { + GST_WARNING_OBJECT (self, "First picture does not have output view buffer"); + return GST_FLOW_OK; + } + + GST_LOG_OBJECT (self, "New field picture with buffer %" GST_PTR_FORMAT, + view_buffer); + + gst_mpeg2_picture_set_user_data (second_field, + gst_buffer_ref (view_buffer), (GDestroyNotify) gst_buffer_unref); + + return GST_FLOW_OK; +} + +static ID3D11VideoDecoderOutputView * 
+gst_d3d11_mpeg2_dec_get_output_view_from_picture (GstD3D11Mpeg2Dec * self, + GstMpeg2Picture * picture, guint8 * view_id) +{ + GstD3D11Mpeg2DecInner *inner = self->inner; + GstBuffer *view_buffer; + ID3D11VideoDecoderOutputView *view; + + if (!picture) + return NULL; + + view_buffer = (GstBuffer *) gst_mpeg2_picture_get_user_data (picture); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); + return NULL; + } + + view = + gst_d3d11_decoder_get_output_view_from_buffer (inner->d3d11_decoder, + view_buffer, view_id); + if (!view) { + GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); + return NULL; + } + + return view; +} + +static inline WORD +_pack_f_codes (guint8 f_code[2][2]) +{ + return (((WORD) f_code[0][0] << 12) + | ((WORD) f_code[0][1] << 8) + | ((WORD) f_code[1][0] << 4) + | (f_code[1][1])); +} + +static inline WORD +_pack_pce_elements (GstMpeg2Slice * slice) +{ + return (((WORD) slice->pic_ext->intra_dc_precision << 14) + | ((WORD) slice->pic_ext->picture_structure << 12) + | ((WORD) slice->pic_ext->top_field_first << 11) + | ((WORD) slice->pic_ext->frame_pred_frame_dct << 10) + | ((WORD) slice->pic_ext->concealment_motion_vectors << 9) + | ((WORD) slice->pic_ext->q_scale_type << 8) + | ((WORD) slice->pic_ext->intra_vlc_format << 7) + | ((WORD) slice->pic_ext->alternate_scan << 6) + | ((WORD) slice->pic_ext->repeat_first_field << 5) + | ((WORD) slice->pic_ext->chroma_420_type << 4) + | ((WORD) slice->pic_ext->progressive_frame << 3)); +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_start_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice, + GstMpeg2Picture * prev_picture, GstMpeg2Picture * next_picture) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + DXVA_PictureParameters *pic_params = &inner->pic_params; + DXVA_QmatrixData *iq_matrix = &inner->iq_matrix; + ID3D11VideoDecoderOutputView 
*view; + ID3D11VideoDecoderOutputView *other_view; + guint8 view_id = 0xff; + guint8 other_view_id = 0xff; + gboolean is_field = + picture->structure != GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME; + + view = gst_d3d11_mpeg2_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (pic_params, 0, sizeof (DXVA_PictureParameters)); + memset (iq_matrix, 0, sizeof (DXVA_QmatrixData)); + + /* Fill DXVA_PictureParameters */ + pic_params->wDecodedPictureIndex = view_id; + pic_params->wForwardRefPictureIndex = 0xffff; + pic_params->wBackwardRefPictureIndex = 0xffff; + + switch (picture->type) { + case GST_MPEG_VIDEO_PICTURE_TYPE_B:{ + if (next_picture) { + other_view = + gst_d3d11_mpeg2_dec_get_output_view_from_picture (self, + next_picture, &other_view_id); + if (other_view) + pic_params->wBackwardRefPictureIndex = other_view_id; + } + } + /* fall-through */ + case GST_MPEG_VIDEO_PICTURE_TYPE_P:{ + if (prev_picture) { + other_view = + gst_d3d11_mpeg2_dec_get_output_view_from_picture (self, + prev_picture, &other_view_id); + if (other_view) + pic_params->wForwardRefPictureIndex = other_view_id; + } + } + default: + break; + } + + pic_params->wPicWidthInMBminus1 = inner->width_in_mb - 1; + pic_params->wPicHeightInMBminus1 = (inner->height_in_mb >> is_field) - 1; + pic_params->bMacroblockWidthMinus1 = 15; + pic_params->bMacroblockHeightMinus1 = 15; + pic_params->bBlockWidthMinus1 = 7; + pic_params->bBlockHeightMinus1 = 7; + pic_params->bBPPminus1 = 7; + pic_params->bPicStructure = (BYTE) picture->structure; + if (picture->first_field && is_field) { + pic_params->bSecondField = TRUE; + } + pic_params->bPicIntra = picture->type == GST_MPEG_VIDEO_PICTURE_TYPE_I; + pic_params->bPicBackwardPrediction = + picture->type == GST_MPEG_VIDEO_PICTURE_TYPE_B; + /* FIXME: 1 -> 4:2:0, 2 -> 4:2:2, 3 -> 4:4:4 */ + pic_params->bChromaFormat = 1; + 
pic_params->bPicScanFixed = 1; + pic_params->bPicScanMethod = slice->pic_ext->alternate_scan; + pic_params->wBitstreamFcodes = _pack_f_codes (slice->pic_ext->f_code); + pic_params->wBitstreamPCEelements = _pack_pce_elements (slice); + + /* Fill DXVA_QmatrixData */ + if (slice->quant_matrix && + /* The value in bNewQmatrix[0] and bNewQmatrix[1] must not both be zero. + * https://docs.microsoft.com/en-us/windows-hardware/drivers/ddi/dxva/ns-dxva-_dxva_qmatrixdata + */ + (slice->quant_matrix->load_intra_quantiser_matrix || + slice->quant_matrix->load_non_intra_quantiser_matrix)) { + GstMpegVideoQuantMatrixExt *quant_matrix = slice->quant_matrix; + + if (quant_matrix->load_intra_quantiser_matrix) { + iq_matrix->bNewQmatrix[0] = 1; + for (guint i = 0; i < 64; i++) { + iq_matrix->Qmatrix[0][i] = quant_matrix->intra_quantiser_matrix[i]; + } + } + + if (quant_matrix->load_non_intra_quantiser_matrix) { + iq_matrix->bNewQmatrix[1] = 1; + for (guint i = 0; i < 64; i++) { + iq_matrix->Qmatrix[1][i] = quant_matrix->non_intra_quantiser_matrix[i]; + } + } + + if (quant_matrix->load_chroma_intra_quantiser_matrix) { + iq_matrix->bNewQmatrix[2] = 1; + for (guint i = 0; i < 64; i++) { + iq_matrix->Qmatrix[2][i] = + quant_matrix->chroma_intra_quantiser_matrix[i]; + } + } + + if (quant_matrix->load_chroma_non_intra_quantiser_matrix) { + iq_matrix->bNewQmatrix[3] = 1; + for (guint i = 0; i < 64; i++) { + iq_matrix->Qmatrix[3][i] = + quant_matrix->chroma_non_intra_quantiser_matrix[i]; + } + } + + inner->submit_iq_data = TRUE; + } else { + inner->submit_iq_data = FALSE; + } + + inner->slice_list.resize (0); + inner->bitstream_buffer.resize (0); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_decode_slice (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + GstMpegVideoSliceHdr *header = &slice->header; + GstMpegVideoPacket *packet 
= &slice->packet; + DXVA_SliceInfo slice_info = { 0, }; + + g_assert (packet->offset >= 4); + + slice_info.wHorizontalPosition = header->mb_column; + slice_info.wVerticalPosition = header->mb_row; + /* including start code 4 bytes */ + slice_info.dwSliceBitsInBuffer = 8 * (packet->size + 4); + slice_info.dwSliceDataLocation = inner->bitstream_buffer.size (); + /* XXX: We don't have information about the number of MBs in this slice. + * Just store offset here, and actual number will be calculated later */ + slice_info.wNumberMBsInSlice = + (header->mb_row * inner->width_in_mb) + header->mb_column; + slice_info.wQuantizerScaleCode = header->quantiser_scale_code; + slice_info.wMBbitOffset = header->header_size + 32; + + inner->slice_list.push_back (slice_info); + + size_t pos = inner->bitstream_buffer.size (); + inner->bitstream_buffer.resize (pos + packet->size + 4); + memcpy (&inner->bitstream_buffer[0] + pos, packet->data + packet->offset - 4, + packet->size + 4); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_end_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + GstD3D11DecodeInputStreamArgs input_args; + gboolean is_field = + picture->structure != GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME; + guint mb_count = inner->width_in_mb * (inner->height_in_mb >> is_field); + + if (inner->bitstream_buffer.empty ()) { + GST_ERROR_OBJECT (self, "No bitstream buffer to submit"); + return GST_FLOW_ERROR; + } + + view = gst_d3d11_mpeg2_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (&input_args, 0, sizeof (GstD3D11DecodeInputStreamArgs)); + + DXVA_SliceInfo *first = &inner->slice_list[0]; + for (size_t i = 0; i < 
inner->slice_list.size (); i++) { + DXVA_SliceInfo *slice = first + i; + + /* Update the number of MBs per slice */ + if (i == inner->slice_list.size () - 1) { + slice->wNumberMBsInSlice = mb_count - slice->wNumberMBsInSlice; + } else { + DXVA_SliceInfo *next = first + i + 1; + slice->wNumberMBsInSlice = + next->wNumberMBsInSlice - slice->wNumberMBsInSlice; + } + } + + input_args.picture_params = &inner->pic_params; + input_args.picture_params_size = sizeof (DXVA_PictureParameters); + input_args.slice_control = &inner->slice_list[0]; + input_args.slice_control_size = + sizeof (DXVA_SliceInfo) * inner->slice_list.size (); + input_args.bitstream = &inner->bitstream_buffer[0]; + input_args.bitstream_size = inner->bitstream_buffer.size (); + if (inner->submit_iq_data) { + input_args.inverse_quantization_matrix = &inner->iq_matrix; + input_args.inverse_quantization_matrix_size = sizeof (DXVA_QmatrixData); + } + + if (!gst_d3d11_decoder_decode_frame (inner->d3d11_decoder, view, &input_args)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_mpeg2_dec_output_picture (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, GstMpeg2Picture * picture) +{ + GstD3D11Mpeg2Dec *self = GST_D3D11_MPEG2_DEC (decoder); + GstD3D11Mpeg2DecInner *inner = self->inner; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstBuffer *view_buffer; + + GST_LOG_OBJECT (self, "Outputting picture %p", picture); + + view_buffer = (GstBuffer *) gst_mpeg2_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Could not get output view"); + goto error; + } + + if (!gst_d3d11_decoder_process_output (inner->d3d11_decoder, vdec, + inner->width, inner->height, view_buffer, &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to copy buffer"); + goto error; + } + + if (picture->buffer_flags != 0) { + gboolean interlaced = + (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_INTERLACED) != 0; + gboolean tff = (picture->buffer_flags 
& GST_VIDEO_BUFFER_FLAG_TFF) != 0; + + GST_TRACE_OBJECT (self, + "apply buffer flags 0x%x (interlaced %d, top-field-first %d)", + picture->buffer_flags, interlaced, tff); + GST_BUFFER_FLAG_SET (frame->output_buffer, picture->buffer_flags); + } + + gst_mpeg2_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_mpeg2_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); + + return GST_FLOW_ERROR; +} + +void +gst_d3d11_mpeg2_dec_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank) +{ + GType type; + gchar *type_name; + gchar *feature_name; + guint index = 0; + GTypeInfo type_info = { + sizeof (GstD3D11Mpeg2DecClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_mpeg2_dec_class_init, + NULL, + NULL, + sizeof (GstD3D11Mpeg2Dec), + 0, + (GInstanceInitFunc) gst_d3d11_mpeg2_dec_init, + }; + const GUID *supported_profile = NULL; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + + if (!gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_MPEG2, GST_VIDEO_FORMAT_NV12, &supported_profile)) { + GST_INFO_OBJECT (device, "device does not support MPEG-2 video decoding"); + return; + } + + sink_caps = gst_caps_from_string ("video/mpeg, " + "mpegversion = (int)2, systemstream = (boolean) false, " + "profile = (string) { main, simple }"); + src_caps = gst_caps_from_string ("video/x-raw(" + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "); video/x-raw"); + + /* NOTE: We are supporting only 4:2:0, main or simple profiles */ + gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); + + gst_caps_set_simple (sink_caps, + "width", GST_TYPE_INT_RANGE, 1, 1920, + "height", GST_TYPE_INT_RANGE, 1, 1920, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 1, 1920, + "height", GST_TYPE_INT_RANGE, 1, 1920, NULL); + + type_info.class_data = + gst_d3d11_decoder_class_data_new (device, GST_DXVA_CODEC_MPEG2, + sink_caps, src_caps); + + type_name = g_strdup 
("GstD3D11Mpeg2Dec"); + feature_name = g_strdup ("d3d11mpeg2dec"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11Mpeg2Device%dDec", index); + feature_name = g_strdup_printf ("d3d11mpeg2device%ddec", index); + } + + type = g_type_register_static (GST_TYPE_MPEG2_DECODER, + type_name, &type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11mpeg2dec.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_MPEG2_DEC_H__ +#define __GST_D3D11_MPEG2_DEC_H__ + +#include "gstd3d11decoder.h" + +G_BEGIN_DECLS + +void gst_d3d11_mpeg2_dec_register (GstPlugin * plugin, + GstD3D11Device * device, + guint rank); + +G_END_DECLS + +#endif /* __GST_D3D11_MPEG2_DEC_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11overlaycompositor.cpp
Added
@@ -0,0 +1,645 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstd3d11overlaycompositor.h" +#include "gstd3d11shader.h" +#include "gstd3d11pluginutils.h" +#include <wrl.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_overlay_compositor_debug); +#define GST_CAT_DEFAULT gst_d3d11_overlay_compositor_debug + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; + +typedef struct +{ + struct { + FLOAT x; + FLOAT y; + FLOAT z; + } position; + struct { + FLOAT x; + FLOAT y; + } texture; +} VertexData; + +static const gchar templ_pixel_shader[] = + "Texture2D shaderTexture;\n" + "SamplerState samplerState;\n" + "\n" + "struct PS_INPUT\n" + "{\n" + " float4 Position: SV_POSITION;\n" + " float3 Texture: TEXCOORD0;\n" + "};\n" + "\n" + "float4 main(PS_INPUT input): SV_TARGET\n" + "{\n" + " return shaderTexture.Sample(samplerState, input.Texture);\n" + "}\n"; + +static const gchar templ_vertex_shader[] = + "struct VS_INPUT\n" + "{\n" + " float4 Position : POSITION;\n" + " float4 Texture : TEXCOORD0;\n" + "};\n" + "\n" + "struct VS_OUTPUT\n" + "{\n" + " float4 Position: SV_POSITION;\n" + " float4 Texture: TEXCOORD0;\n" + "};\n" + 
"\n" + "VS_OUTPUT main(VS_INPUT input)\n" + "{\n" + " return input;\n" + "}\n"; +/* *INDENT-ON* */ + +struct _GstD3D11OverlayCompositor +{ + GstD3D11Device *device; + GstVideoInfo out_info; + + D3D11_VIEWPORT viewport; + + ID3D11PixelShader *ps; + ID3D11VertexShader *vs; + ID3D11InputLayout *layout; + ID3D11SamplerState *sampler; + ID3D11BlendState *blend; + ID3D11Buffer *index_buffer; + + /* GstD3D11CompositionOverlay */ + GList *overlays; +}; + +typedef struct +{ + GstVideoOverlayRectangle *overlay_rect; + ID3D11Texture2D *texture; + ID3D11ShaderResourceView *srv; + GstD3D11Quad *quad; +} GstD3D11CompositionOverlay; + +static GstD3D11CompositionOverlay * +gst_d3d11_composition_overlay_new (GstD3D11OverlayCompositor * self, + GstVideoOverlayRectangle * overlay_rect) +{ + GstD3D11CompositionOverlay *overlay = NULL; + gint x, y; + guint width, height; + D3D11_SUBRESOURCE_DATA subresource_data; + D3D11_TEXTURE2D_DESC texture_desc; + D3D11_SHADER_RESOURCE_VIEW_DESC srv_desc; + D3D11_BUFFER_DESC buffer_desc; + D3D11_MAPPED_SUBRESOURCE map; + VertexData *vertex_data; + GstBuffer *buf; + GstVideoMeta *vmeta; + GstMapInfo info; + guint8 *data; + gint stride; + HRESULT hr; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + GstD3D11Device *device = self->device; + const guint index_count = 2 * 3; + FLOAT x1, y1, x2, y2; + gdouble val; + /* *INDENT-OFF* */ + ComPtr<ID3D11Texture2D> texture; + ComPtr<ID3D11ShaderResourceView> srv; + ComPtr<ID3D11Buffer> vertex_buffer; + /* *INDENT-ON* */ + + g_return_val_if_fail (overlay_rect != NULL, NULL); + + memset (&subresource_data, 0, sizeof (subresource_data)); + memset (&texture_desc, 0, sizeof (texture_desc)); + memset (&srv_desc, 0, sizeof (srv_desc)); + memset (&buffer_desc, 0, sizeof (buffer_desc)); + + device_handle = gst_d3d11_device_get_device_handle (device); + context_handle = gst_d3d11_device_get_device_context_handle (device); + + if (!gst_video_overlay_rectangle_get_render_rectangle (overlay_rect, &x, 
&y, + &width, &height)) { + GST_ERROR ("Failed to get render rectangle"); + return NULL; + } + + buf = gst_video_overlay_rectangle_get_pixels_unscaled_argb (overlay_rect, + GST_VIDEO_OVERLAY_FORMAT_FLAG_NONE); + if (!buf) { + GST_ERROR ("Failed to get overlay buffer"); + return NULL; + } + + vmeta = gst_buffer_get_video_meta (buf); + if (!vmeta) { + GST_ERROR ("Failed to get video meta"); + return NULL; + } + + if (!gst_video_meta_map (vmeta, + 0, &info, (gpointer *) & data, &stride, GST_MAP_READ)) { + GST_ERROR ("Failed to map"); + return NULL; + } + + /* Create the texture and upload the pixels in one call, since an immutable texture must be initialized at creation time */ + subresource_data.pSysMem = data; + subresource_data.SysMemPitch = stride; + subresource_data.SysMemSlicePitch = 0; + + texture_desc.Width = width; + texture_desc.Height = height; + texture_desc.MipLevels = 1; + texture_desc.ArraySize = 1; + /* FIXME: need to consider non-BGRA ? */ + texture_desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; + texture_desc.SampleDesc.Count = 1; + texture_desc.SampleDesc.Quality = 0; + texture_desc.Usage = D3D11_USAGE_IMMUTABLE; + texture_desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; + texture_desc.CPUAccessFlags = 0; + + hr = device_handle->CreateTexture2D (&texture_desc, + &subresource_data, &texture); + gst_video_meta_unmap (vmeta, 0, &info); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Failed to create texture"); + return NULL; + } + + srv_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; + srv_desc.Texture2D.MipLevels = 1; + + hr = device_handle->CreateShaderResourceView (texture.Get (), &srv_desc, + &srv); + if (!gst_d3d11_result (hr, device) || !srv) { + GST_ERROR ("Failed to create shader resource view"); + return NULL; + } + + buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (VertexData) * 4; + buffer_desc.BindFlags = D3D11_BIND_VERTEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + hr = device_handle->CreateBuffer (&buffer_desc, NULL, 
&vertex_buffer); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create vertex buffer, hr: 0x%x", (guint) hr); + return NULL; + } + + gst_d3d11_device_lock (device); + hr = context_handle->Map (vertex_buffer.Get (), + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't map vertex buffer, hr: 0x%x", (guint) hr); + gst_d3d11_device_unlock (device); + return NULL; + } + + vertex_data = (VertexData *) map.pData; + /* bottom left */ + gst_util_fraction_to_double (x, GST_VIDEO_INFO_WIDTH (&self->out_info), &val); + x1 = (val * 2.0f) - 1.0f; + + gst_util_fraction_to_double (y + height, + GST_VIDEO_INFO_HEIGHT (&self->out_info), &val); + y1 = (val * -2.0f) + 1.0f; + + /* top right */ + gst_util_fraction_to_double (x + width, + GST_VIDEO_INFO_WIDTH (&self->out_info), &val); + x2 = (val * 2.0f) - 1.0f; + + gst_util_fraction_to_double (y, + GST_VIDEO_INFO_HEIGHT (&self->out_info), &val); + y2 = (val * -2.0f) + 1.0f; + + /* bottom left */ + vertex_data[0].position.x = x1; + vertex_data[0].position.y = y1; + vertex_data[0].position.z = 0.0f; + vertex_data[0].texture.x = 0.0f; + vertex_data[0].texture.y = 1.0f; + + /* top left */ + vertex_data[1].position.x = x1; + vertex_data[1].position.y = y2; + vertex_data[1].position.z = 0.0f; + vertex_data[1].texture.x = 0.0f; + vertex_data[1].texture.y = 0.0f; + + /* top right */ + vertex_data[2].position.x = x2; + vertex_data[2].position.y = y2; + vertex_data[2].position.z = 0.0f; + vertex_data[2].texture.x = 1.0f; + vertex_data[2].texture.y = 0.0f; + + /* bottom right */ + vertex_data[3].position.x = x2; + vertex_data[3].position.y = y1; + vertex_data[3].position.z = 0.0f; + vertex_data[3].texture.x = 1.0f; + vertex_data[3].texture.y = 1.0f; + + context_handle->Unmap (vertex_buffer.Get (), 0); + gst_d3d11_device_unlock (device); + + overlay = g_new0 (GstD3D11CompositionOverlay, 1); + overlay->overlay_rect = gst_video_overlay_rectangle_ref (overlay_rect); + 
overlay->texture = texture.Detach (); + overlay->srv = srv.Detach (); + overlay->quad = gst_d3d11_quad_new (device, + self->ps, self->vs, self->layout, nullptr, 0, + vertex_buffer.Get (), sizeof (VertexData), + self->index_buffer, DXGI_FORMAT_R16_UINT, index_count); + + return overlay; +} + +static void +gst_d3d11_composition_overlay_free (GstD3D11CompositionOverlay * overlay) +{ + if (!overlay) + return; + + if (overlay->overlay_rect) + gst_video_overlay_rectangle_unref (overlay->overlay_rect); + + GST_D3D11_CLEAR_COM (overlay->srv); + GST_D3D11_CLEAR_COM (overlay->texture); + + if (overlay->quad) + gst_d3d11_quad_free (overlay->quad); + + g_free (overlay); +} + +static gboolean +gst_d3d11_overlay_compositor_setup_shader (GstD3D11OverlayCompositor * self, + GstD3D11Device * device) +{ + HRESULT hr; + D3D11_SAMPLER_DESC sampler_desc; + D3D11_INPUT_ELEMENT_DESC input_desc[2]; + D3D11_BUFFER_DESC buffer_desc; + D3D11_BLEND_DESC blend_desc; + D3D11_MAPPED_SUBRESOURCE map; + WORD *indices; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + /* *INDENT-OFF* */ + ComPtr<ID3D11PixelShader> ps; + ComPtr<ID3D11VertexShader> vs; + ComPtr<ID3D11InputLayout> layout; + ComPtr<ID3D11SamplerState> sampler; + ComPtr<ID3D11BlendState> blend; + ComPtr<ID3D11Buffer> index_buffer; + /* *INDENT-ON* */ + const guint index_count = 2 * 3; + + memset (&sampler_desc, 0, sizeof (sampler_desc)); + memset (input_desc, 0, sizeof (input_desc)); + memset (&buffer_desc, 0, sizeof (buffer_desc)); + memset (&blend_desc, 0, sizeof (blend_desc)); + + device_handle = gst_d3d11_device_get_device_handle (device); + context_handle = gst_d3d11_device_get_device_context_handle (device); + + /* bilinear filtering */ + sampler_desc.Filter = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; + sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.ComparisonFunc = 
D3D11_COMPARISON_ALWAYS; + sampler_desc.MinLOD = 0; + sampler_desc.MaxLOD = D3D11_FLOAT32_MAX; + + hr = device_handle->CreateSamplerState (&sampler_desc, &sampler); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create sampler state, hr: 0x%x", (guint) hr); + return FALSE; + } + + GST_LOG ("Create Pixel Shader \n%s", templ_pixel_shader); + + if (!gst_d3d11_create_pixel_shader (device, templ_pixel_shader, &ps)) { + GST_ERROR ("Couldn't create pixel shader"); + return FALSE; + } + + input_desc[0].SemanticName = "POSITION"; + input_desc[0].SemanticIndex = 0; + input_desc[0].Format = DXGI_FORMAT_R32G32B32_FLOAT; + input_desc[0].InputSlot = 0; + input_desc[0].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; + input_desc[0].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; + input_desc[0].InstanceDataStepRate = 0; + + input_desc[1].SemanticName = "TEXCOORD"; + input_desc[1].SemanticIndex = 0; + input_desc[1].Format = DXGI_FORMAT_R32G32_FLOAT; + input_desc[1].InputSlot = 0; + input_desc[1].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT; + input_desc[1].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA; + input_desc[1].InstanceDataStepRate = 0; + + if (!gst_d3d11_create_vertex_shader (device, templ_vertex_shader, + input_desc, G_N_ELEMENTS (input_desc), &vs, &layout)) { + GST_ERROR ("Couldn't create vertex shader"); + return FALSE; + } + + blend_desc.AlphaToCoverageEnable = FALSE; + blend_desc.IndependentBlendEnable = FALSE; + blend_desc.RenderTarget[0].BlendEnable = TRUE; + blend_desc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA; + blend_desc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA; + blend_desc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD; + blend_desc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE; + blend_desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO; + blend_desc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD; + blend_desc.RenderTarget[0].RenderTargetWriteMask = + D3D11_COLOR_WRITE_ENABLE_ALL; + + hr = 
device_handle->CreateBlendState (&blend_desc, &blend); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create blend state, hr: 0x%x", (guint) hr); + return FALSE; + } + + buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (WORD) * index_count; + buffer_desc.BindFlags = D3D11_BIND_INDEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + hr = device_handle->CreateBuffer (&buffer_desc, NULL, &index_buffer); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't create index buffer, hr: 0x%x", (guint) hr); + return FALSE; + } + + gst_d3d11_device_lock (device); + hr = context_handle->Map (index_buffer.Get (), + 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't map index buffer, hr: 0x%x", (guint) hr); + gst_d3d11_device_unlock (device); + return FALSE; + } + + indices = (WORD *) map.pData; + + /* clockwise indexing */ + indices[0] = 0; /* bottom left */ + indices[1] = 1; /* top left */ + indices[2] = 2; /* top right */ + + indices[3] = 3; /* bottom right */ + indices[4] = 0; /* bottom left */ + indices[5] = 2; /* top right */ + + context_handle->Unmap (index_buffer.Get (), 0); + gst_d3d11_device_unlock (device); + + self->ps = ps.Detach (); + self->vs = vs.Detach (); + self->layout = layout.Detach (); + self->sampler = sampler.Detach (); + self->blend = blend.Detach (); + self->index_buffer = index_buffer.Detach (); + + return TRUE; +} + + +GstD3D11OverlayCompositor * +gst_d3d11_overlay_compositor_new (GstD3D11Device * device, + GstVideoInfo * out_info) +{ + GstD3D11OverlayCompositor *compositor = NULL; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (out_info != NULL, NULL); + + compositor = g_new0 (GstD3D11OverlayCompositor, 1); + + if (!gst_d3d11_overlay_compositor_setup_shader (compositor, device)) { + gst_d3d11_overlay_compositor_free (compositor); + return NULL; + } + + compositor->device = (GstD3D11Device *) 
gst_object_ref (device); + compositor->out_info = *out_info; + + compositor->viewport.TopLeftX = 0; + compositor->viewport.TopLeftY = 0; + compositor->viewport.Width = GST_VIDEO_INFO_WIDTH (out_info); + compositor->viewport.Height = GST_VIDEO_INFO_HEIGHT (out_info); + compositor->viewport.MinDepth = 0.0f; + compositor->viewport.MaxDepth = 1.0f; + + return compositor; +} + +void +gst_d3d11_overlay_compositor_free (GstD3D11OverlayCompositor * compositor) +{ + g_return_if_fail (compositor != NULL); + + gst_d3d11_overlay_compositor_free_overlays (compositor); + + GST_D3D11_CLEAR_COM (compositor->ps); + GST_D3D11_CLEAR_COM (compositor->vs); + GST_D3D11_CLEAR_COM (compositor->layout); + GST_D3D11_CLEAR_COM (compositor->sampler); + GST_D3D11_CLEAR_COM (compositor->blend); + GST_D3D11_CLEAR_COM (compositor->index_buffer); + + gst_clear_object (&compositor->device); + g_free (compositor); +} + +static gint +find_in_compositor (const GstD3D11CompositionOverlay * overlay, + const GstVideoOverlayRectangle * rect) +{ + return !(overlay->overlay_rect == rect); +} + +static gboolean +is_in_video_overlay_composition (GstVideoOverlayComposition * voc, + GstD3D11CompositionOverlay * overlay) +{ + guint i; + + for (i = 0; i < gst_video_overlay_composition_n_rectangles (voc); i++) { + GstVideoOverlayRectangle *rectangle = + gst_video_overlay_composition_get_rectangle (voc, i); + if (overlay->overlay_rect == rectangle) + return TRUE; + } + return FALSE; +} + +gboolean +gst_d3d11_overlay_compositor_upload (GstD3D11OverlayCompositor * compositor, + GstBuffer * buf) +{ + GstVideoOverlayCompositionMeta *meta; + gint i, num_overlays; + GList *iter; + + g_return_val_if_fail (compositor != NULL, FALSE); + g_return_val_if_fail (GST_IS_BUFFER (buf), FALSE); + + meta = gst_buffer_get_video_overlay_composition_meta (buf); + + if (!meta) { + gst_d3d11_overlay_compositor_free_overlays (compositor); + return TRUE; + } + + num_overlays = gst_video_overlay_composition_n_rectangles (meta->overlay); + 
if (!num_overlays) { + gst_d3d11_overlay_compositor_free_overlays (compositor); + return TRUE; + } + + GST_LOG ("Upload %d overlay rectangles", num_overlays); + + /* Upload new overlay */ + for (i = 0; i < num_overlays; i++) { + GstVideoOverlayRectangle *rectangle = + gst_video_overlay_composition_get_rectangle (meta->overlay, i); + + if (!g_list_find_custom (compositor->overlays, + rectangle, (GCompareFunc) find_in_compositor)) { + GstD3D11CompositionOverlay *overlay = NULL; + + overlay = gst_d3d11_composition_overlay_new (compositor, rectangle); + + if (!overlay) + return FALSE; + + compositor->overlays = g_list_append (compositor->overlays, overlay); + } + } + + /* Remove old overlay */ + iter = compositor->overlays; + while (iter) { + GstD3D11CompositionOverlay *overlay = + (GstD3D11CompositionOverlay *) iter->data; + GList *next = iter->next; + + if (!is_in_video_overlay_composition (meta->overlay, overlay)) { + compositor->overlays = g_list_delete_link (compositor->overlays, iter); + gst_d3d11_composition_overlay_free (overlay); + } + + iter = next; + } + + return TRUE; +} + +void +gst_d3d11_overlay_compositor_free_overlays (GstD3D11OverlayCompositor * + compositor) +{ + g_return_if_fail (compositor != NULL); + + if (compositor->overlays) { + g_list_free_full (compositor->overlays, + (GDestroyNotify) gst_d3d11_composition_overlay_free); + + compositor->overlays = NULL; + } +} + +gboolean +gst_d3d11_overlay_compositor_update_viewport (GstD3D11OverlayCompositor * + compositor, D3D11_VIEWPORT * viewport) +{ + g_return_val_if_fail (compositor != NULL, FALSE); + g_return_val_if_fail (viewport != NULL, FALSE); + + compositor->viewport = *viewport; + + return TRUE; +} + +gboolean +gst_d3d11_overlay_compositor_draw (GstD3D11OverlayCompositor * compositor, + ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) +{ + gboolean ret = TRUE; + + g_return_val_if_fail (compositor != NULL, FALSE); + g_return_val_if_fail (rtv != NULL, FALSE); + + gst_d3d11_device_lock 
(compositor->device); + ret = gst_d3d11_overlay_compositor_draw_unlocked (compositor, rtv); + gst_d3d11_device_unlock (compositor->device); + + return ret; +} + +gboolean +gst_d3d11_overlay_compositor_draw_unlocked (GstD3D11OverlayCompositor * + compositor, ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES]) +{ + gboolean ret = TRUE; + GList *iter; + + g_return_val_if_fail (compositor != NULL, FALSE); + g_return_val_if_fail (rtv != NULL, FALSE); + + for (iter = compositor->overlays; iter; iter = g_list_next (iter)) { + GstD3D11CompositionOverlay *overlay = + (GstD3D11CompositionOverlay *) iter->data; + + ret = gst_d3d11_draw_quad_unlocked (overlay->quad, + &compositor->viewport, 1, &overlay->srv, 1, rtv, 1, + compositor->blend, NULL, &compositor->sampler, 1); + + if (!ret) + break; + } + + return ret; +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11overlaycompositor.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11overlaycompositor.h
Changed
@@ -22,7 +22,7 @@
 
 #include <gst/gst.h>
 #include <gst/video/video.h>
-#include "gstd3d11_fwd.h"
+#include <gst/d3d11/gstd3d11.h>
 
 G_BEGIN_DECLS
 
@@ -38,8 +38,8 @@
 void     gst_d3d11_overlay_compositor_free_overlays (GstD3D11OverlayCompositor * compositor);
 
-gboolean gst_d3d11_overlay_compositor_update_rect (GstD3D11OverlayCompositor * compositor,
-                                                   RECT *rect);
+gboolean gst_d3d11_overlay_compositor_update_viewport (GstD3D11OverlayCompositor * compositor,
+                                                       D3D11_VIEWPORT * viewport);
 
 gboolean gst_d3d11_overlay_compositor_draw (GstD3D11OverlayCompositor * compositor,
                                             ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES]);
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11pluginutils.cpp
Added
@@ -0,0 +1,1032 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11pluginutils.h" + +#include <windows.h> +#include <versionhelpers.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_plugin_utils_debug); +#define GST_CAT_DEFAULT gst_d3d11_plugin_utils_debug + +/* Max Texture Dimension for feature level 11_0 ~ 12_1 */ +static guint _gst_d3d11_texture_max_dimension = 16384; + +void +gst_d3d11_plugin_utils_init (D3D_FEATURE_LEVEL feature_level) +{ + static gsize _init_once = 0; + + if (g_once_init_enter (&_init_once)) { + /* https://docs.microsoft.com/en-us/windows/win32/direct3d11/overviews-direct3d-11-devices-downlevel-intro */ + if (feature_level >= D3D_FEATURE_LEVEL_11_0) + _gst_d3d11_texture_max_dimension = 16384; + else if (feature_level >= D3D_FEATURE_LEVEL_10_0) + _gst_d3d11_texture_max_dimension = 8192; + else + _gst_d3d11_texture_max_dimension = 4096; + + g_once_init_leave (&_init_once, 1); + } +} + +GstCaps * +gst_d3d11_get_updated_template_caps (GstStaticCaps * template_caps) +{ + GstCaps *caps; + + g_return_val_if_fail (template_caps != NULL, NULL); + 
+ caps = gst_static_caps_get (template_caps); + if (!caps) { + GST_ERROR ("Couldn't get caps from static caps"); + return NULL; + } + + caps = gst_caps_make_writable (caps); + gst_caps_set_simple (caps, + "width", GST_TYPE_INT_RANGE, 1, _gst_d3d11_texture_max_dimension, + "height", GST_TYPE_INT_RANGE, 1, _gst_d3d11_texture_max_dimension, NULL); + + return caps; +} + +gboolean +gst_d3d11_is_windows_8_or_greater (void) +{ + static gsize version_once = 0; + static gboolean ret = FALSE; + + if (g_once_init_enter (&version_once)) { +#if (!GST_D3D11_WINAPI_ONLY_APP) + if (IsWindows8OrGreater ()) + ret = TRUE; +#else + ret = TRUE; +#endif + + g_once_init_leave (&version_once, 1); + } + + return ret; +} + +GstD3D11DeviceVendor +gst_d3d11_get_device_vendor (GstD3D11Device * device) +{ + guint device_id = 0; + guint vendor_id = 0; + gchar *desc = NULL; + GstD3D11DeviceVendor vendor = GST_D3D11_DEVICE_VENDOR_UNKNOWN; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), + GST_D3D11_DEVICE_VENDOR_UNKNOWN); + + g_object_get (device, "device-id", &device_id, "vendor-id", &vendor_id, + "description", &desc, NULL); + + switch (vendor_id) { + case 0: + if (device_id == 0 && desc && g_strrstr (desc, "SraKmd")) + vendor = GST_D3D11_DEVICE_VENDOR_XBOX; + break; + case 0x1002: + case 0x1022: + vendor = GST_D3D11_DEVICE_VENDOR_AMD; + break; + case 0x8086: + vendor = GST_D3D11_DEVICE_VENDOR_INTEL; + break; + case 0x10de: + vendor = GST_D3D11_DEVICE_VENDOR_NVIDIA; + break; + case 0x4d4f4351: + vendor = GST_D3D11_DEVICE_VENDOR_QUALCOMM; + break; + default: + break; + } + + g_free (desc); + + return vendor; +} + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) +gboolean +gst_d3d11_hdr_meta_data_to_dxgi (GstVideoMasteringDisplayInfo * minfo, + GstVideoContentLightLevel * cll, DXGI_HDR_METADATA_HDR10 * dxgi_hdr10) +{ + g_return_val_if_fail (dxgi_hdr10 != NULL, FALSE); + + memset (dxgi_hdr10, 0, sizeof (DXGI_HDR_METADATA_HDR10)); + + if (minfo) { + dxgi_hdr10->RedPrimary[0] = 
minfo->display_primaries[0].x; + dxgi_hdr10->RedPrimary[1] = minfo->display_primaries[0].y; + dxgi_hdr10->GreenPrimary[0] = minfo->display_primaries[1].x; + dxgi_hdr10->GreenPrimary[1] = minfo->display_primaries[1].y; + dxgi_hdr10->BluePrimary[0] = minfo->display_primaries[2].x; + dxgi_hdr10->BluePrimary[1] = minfo->display_primaries[2].y; + + dxgi_hdr10->WhitePoint[0] = minfo->white_point.x; + dxgi_hdr10->WhitePoint[1] = minfo->white_point.y; + dxgi_hdr10->MaxMasteringLuminance = minfo->max_display_mastering_luminance; + dxgi_hdr10->MinMasteringLuminance = minfo->min_display_mastering_luminance; + } + + if (cll) { + dxgi_hdr10->MaxContentLightLevel = cll->max_content_light_level; + dxgi_hdr10->MaxFrameAverageLightLevel = cll->max_frame_average_light_level; + } + + return TRUE; +} +#endif + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) +typedef enum +{ + GST_DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 = 0, + GST_DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 = 1, + GST_DXGI_COLOR_SPACE_RGB_STUDIO_G22_NONE_P709 = 2, + GST_DXGI_COLOR_SPACE_RGB_STUDIO_G22_NONE_P2020 = 3, + GST_DXGI_COLOR_SPACE_RESERVED = 4, + GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_NONE_P709_X601 = 5, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P601 = 6, + GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_LEFT_P601 = 7, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P709 = 8, + GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_LEFT_P709 = 9, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P2020 = 10, + GST_DXGI_COLOR_SPACE_YCBCR_FULL_G22_LEFT_P2020 = 11, + GST_DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020 = 12, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G2084_LEFT_P2020 = 13, + GST_DXGI_COLOR_SPACE_RGB_STUDIO_G2084_NONE_P2020 = 14, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_TOPLEFT_P2020 = 15, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G2084_TOPLEFT_P2020 = 16, + GST_DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P2020 = 17, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_GHLG_TOPLEFT_P2020 = 18, + GST_DXGI_COLOR_SPACE_YCBCR_FULL_GHLG_TOPLEFT_P2020 = 19, + 
GST_DXGI_COLOR_SPACE_RGB_STUDIO_G24_NONE_P709 = 20, + GST_DXGI_COLOR_SPACE_RGB_STUDIO_G24_NONE_P2020 = 21, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G24_LEFT_P709 = 22, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G24_LEFT_P2020 = 23, + GST_DXGI_COLOR_SPACE_YCBCR_STUDIO_G24_TOPLEFT_P2020 = 24, + GST_DXGI_COLOR_SPACE_CUSTOM = 0xFFFFFFFF +} GST_DXGI_COLOR_SPACE_TYPE; + +/* https://docs.microsoft.com/en-us/windows/win32/api/dxgicommon/ne-dxgicommon-dxgi_color_space_type */ + +#define MAKE_COLOR_MAP(d,r,m,t,p) \ + { GST_DXGI_COLOR_SPACE_ ##d, GST_VIDEO_COLOR_RANGE ##r, \ + GST_VIDEO_COLOR_MATRIX_ ##m, GST_VIDEO_TRANSFER_ ##t, \ + GST_VIDEO_COLOR_PRIMARIES_ ##p } + +static const GstDxgiColorSpace rgb_colorspace_map[] = { + /* 1) DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 + * 2) DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 + * 3) DXGI_COLOR_SPACE_RGB_STUDIO_G22_NONE_P709 + * 4) DXGI_COLOR_SPACE_RGB_STUDIO_G22_NONE_P2020 + * 5) DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020 + * 6) DXGI_COLOR_SPACE_RGB_STUDIO_G2084_NONE_P2020 + * 7) DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P2020 + * 8) DXGI_COLOR_SPACE_RGB_STUDIO_G24_NONE_P709 + * 9) DXGI_COLOR_SPACE_RGB_STUDIO_G24_NONE_P2020 + * + * NOTE: if G24 (Gamma 2.4, SRGB) transfer is not defined, + * it will be approximated as G22. 
+ * NOTE: BT470BG ~= BT709 + */ + + /* 1) RGB_FULL_G22_NONE_P709 */ + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT709, BT709), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT601, BT709), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT2020_10, BT709), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT2020_12, BT709), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT709, BT470BG), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT601, BT470BG), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT2020_10, BT470BG), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, BT2020_12, BT470BG), + + /* 1-1) Approximation for RGB_FULL_G22_NONE_P709 */ + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, SRGB, BT709), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P709, _0_255, UNKNOWN, SRGB, BT470BG), + + /* 2) RGB_FULL_G10_NONE_P709 */ + MAKE_COLOR_MAP (RGB_FULL_G10_NONE_P709, _0_255, UNKNOWN, GAMMA10, BT709), + MAKE_COLOR_MAP (RGB_FULL_G10_NONE_P709, _0_255, UNKNOWN, GAMMA10, BT470BG), + + /* 3) RGB_STUDIO_G22_NONE_P709 */ + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT709, BT709), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT601, BT709), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT2020_10, BT709), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT2020_12, BT709), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT709, BT470BG), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT601, BT470BG), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT2020_10, + BT470BG), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, BT2020_12, + BT470BG), + + /* 3-1) Approximation for RGB_STUDIO_G22_NONE_P709 */ + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, SRGB, BT709), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P709, _16_235, UNKNOWN, SRGB, BT470BG), + + /* 4) 
RGB_STUDIO_G22_NONE_P2020 */ + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P2020, _16_235, UNKNOWN, BT709, BT2020), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P2020, _16_235, UNKNOWN, BT601, BT2020), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P2020, _16_235, UNKNOWN, BT2020_10, + BT2020), + MAKE_COLOR_MAP (RGB_STUDIO_G22_NONE_P2020, _16_235, UNKNOWN, BT2020_12, + BT2020), + + /* 5) RGB_FULL_G2084_NONE_P2020 */ + MAKE_COLOR_MAP (RGB_FULL_G2084_NONE_P2020, _0_255, UNKNOWN, SMPTE2084, + BT2020), + + /* 6) RGB_STUDIO_G2084_NONE_P2020 */ + MAKE_COLOR_MAP (RGB_STUDIO_G2084_NONE_P2020, _16_235, UNKNOWN, SMPTE2084, + BT2020), + + /* 7) RGB_FULL_G22_NONE_P2020 */ + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, BT709, BT2020), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, BT601, BT2020), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, BT2020_10, BT2020), + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, BT2020_12, BT2020), + + /* 7-1) Approximation for RGB_FULL_G22_NONE_P2020 */ + MAKE_COLOR_MAP (RGB_FULL_G22_NONE_P2020, _0_255, UNKNOWN, SRGB, BT2020), + + /* 8) RGB_STUDIO_G24_NONE_P709 */ + MAKE_COLOR_MAP (RGB_STUDIO_G24_NONE_P709, _16_235, UNKNOWN, SRGB, BT709), + MAKE_COLOR_MAP (RGB_STUDIO_G24_NONE_P709, _16_235, UNKNOWN, SRGB, BT470BG), + + /* 9) RGB_STUDIO_G24_NONE_P2020 */ + MAKE_COLOR_MAP (RGB_STUDIO_G24_NONE_P2020, _16_235, UNKNOWN, SRGB, BT2020), +}; + +static const GstDxgiColorSpace yuv_colorspace_map[] = { + /* 1) YCBCR_FULL_G22_NONE_P709_X601 + * 2) YCBCR_STUDIO_G22_LEFT_P601 + * 3) YCBCR_FULL_G22_LEFT_P601 + * 4) YCBCR_STUDIO_G22_LEFT_P709 + * 5) YCBCR_FULL_G22_LEFT_P709 + * 6) YCBCR_STUDIO_G22_LEFT_P2020 + * 7) YCBCR_FULL_G22_LEFT_P2020 + * 8) YCBCR_STUDIO_G2084_LEFT_P2020 + * 9) YCBCR_STUDIO_G22_TOPLEFT_P2020 + * 10) YCBCR_STUDIO_G2084_TOPLEFT_P2020 + * 11) YCBCR_STUDIO_GHLG_TOPLEFT_P2020 + * 12) YCBCR_FULL_GHLG_TOPLEFT_P2020 + * 13) YCBCR_STUDIO_G24_LEFT_P709 + * 14) YCBCR_STUDIO_G24_LEFT_P2020 + * 15) 
YCBCR_STUDIO_G24_TOPLEFT_P2020 + * + * NOTE: BT470BG ~= BT709 + */ + + /* 1) YCBCR_FULL_G22_NONE_P709_X601 */ + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT709, BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT601, BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT2020_10, + BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT2020_12, + BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT709, BT470BG), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT601, BT470BG), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT2020_10, + BT470BG), + MAKE_COLOR_MAP (YCBCR_FULL_G22_NONE_P709_X601, _0_255, BT601, BT2020_12, + BT470BG), + + /* 2) YCBCR_STUDIO_G22_LEFT_P601 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT601, SMPTE170M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT709, SMPTE170M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT2020_10, + SMPTE170M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT2020_12, + SMPTE170M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT601, SMPTE240M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT709, SMPTE240M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT2020_10, + SMPTE240M), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P601, _16_235, BT601, BT2020_12, + SMPTE240M), + + /* 3) YCBCR_FULL_G22_LEFT_P601 */ + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT601, SMPTE170M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT709, SMPTE170M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT2020_10, + SMPTE170M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT2020_12, + SMPTE170M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT601, SMPTE240M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT709, 
SMPTE240M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT2020_10, + SMPTE240M), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P601, _0_255, BT601, BT2020_12, + SMPTE240M), + + /* 4) YCBCR_STUDIO_G22_LEFT_P709 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT709, BT709), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT601, BT709), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT2020_10, + BT709), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT2020_12, + BT709), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT709, BT470BG), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT601, BT470BG), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT2020_10, + BT470BG), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P709, _16_235, BT709, BT2020_12, + BT470BG), + + /* 5) YCBCR_FULL_G22_LEFT_P709 */ + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT709, BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT601, BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT2020_10, BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT2020_12, BT709), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT709, BT470BG), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT601, BT470BG), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT2020_10, BT470BG), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P709, _0_255, BT709, BT2020_12, BT470BG), + + /* 6) YCBCR_STUDIO_G22_LEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P2020, _16_235, BT2020, BT709, BT2020), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P2020, _16_235, BT2020, BT601, BT2020), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P2020, _16_235, BT2020, BT2020_10, + BT2020), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_LEFT_P2020, _16_235, BT2020, BT2020_12, + BT2020), + + /* 7) YCBCR_FULL_G22_LEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P2020, _0_255, BT2020, 
BT709, BT2020), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P2020, _0_255, BT2020, BT601, BT2020), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P2020, _0_255, BT2020, BT2020_10, + BT2020), + MAKE_COLOR_MAP (YCBCR_FULL_G22_LEFT_P2020, _0_255, BT2020, BT2020_12, + BT2020), + + /* 8) YCBCR_STUDIO_G2084_LEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G2084_LEFT_P2020, _16_235, BT2020, SMPTE2084, + BT2020), + + /* 9) YCBCR_STUDIO_G22_TOPLEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_TOPLEFT_P2020, _16_235, BT2020, BT2020_10, + BT2020), + MAKE_COLOR_MAP (YCBCR_STUDIO_G22_TOPLEFT_P2020, _16_235, BT2020, BT2020_12, + BT2020), + + /* 10) YCBCR_STUDIO_G2084_TOPLEFT_P2020 */ + /* FIXME: check chroma-site to differentiate this from + * YCBCR_STUDIO_G2084_LEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G2084_TOPLEFT_P2020, _16_235, BT2020, SMPTE2084, + BT2020), + + /* 11) YCBCR_STUDIO_GHLG_TOPLEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_GHLG_TOPLEFT_P2020, _16_235, BT2020, + ARIB_STD_B67, BT2020), + + /* 12) YCBCR_FULL_GHLG_TOPLEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_FULL_GHLG_TOPLEFT_P2020, _0_255, BT2020, ARIB_STD_B67, + BT2020), + + /* 13) YCBCR_STUDIO_G24_LEFT_P709 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G24_LEFT_P709, _16_235, BT709, SRGB, BT709), + + /* 14) YCBCR_STUDIO_G24_LEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G24_LEFT_P2020, _16_235, BT2020, SRGB, BT2020), + + /* 15) YCBCR_STUDIO_G24_TOPLEFT_P2020 */ + /* FIXME: check chroma-site to differentiate this from + * YCBCR_STUDIO_G24_LEFT_P2020 */ + MAKE_COLOR_MAP (YCBCR_STUDIO_G24_TOPLEFT_P2020, _16_235, BT2020, SRGB, + BT2020), +}; + +#define SCORE_RANGE_MISMATCH 5 +#define SCORE_MATRIX_MISMATCH 5 +#define SCORE_TRANSFER_MISMATCH 5 +#define SCORE_PRIMARY_MISMATCH 10 + +static gint +get_score (GstVideoInfo * info, const GstDxgiColorSpace * color_map, + gboolean is_yuv) +{ + gint loss = 0; + GstVideoColorimetry *color = &info->colorimetry; + + if (color->range != color_map->range) + loss += SCORE_RANGE_MISMATCH; + + if (is_yuv && color->matrix 
!= color_map->matrix) + loss += SCORE_MATRIX_MISMATCH; + + if (color->transfer != color_map->transfer) + loss += SCORE_TRANSFER_MISMATCH; + + if (color->primaries != color_map->primaries) + loss += SCORE_PRIMARY_MISMATCH; + + return loss; +} + +static const GstDxgiColorSpace * +gst_d3d11_video_info_to_dxgi_color_space_rgb (GstVideoInfo * info) +{ + gint best_score = G_MAXINT; + gint score; + guint i; + const GstDxgiColorSpace *colorspace = NULL; + + for (i = 0; i < G_N_ELEMENTS (rgb_colorspace_map); i++) { + score = get_score (info, &rgb_colorspace_map[i], FALSE); + + if (score < best_score) { + best_score = score; + colorspace = &rgb_colorspace_map[i]; + + if (score == 0) + break; + } + } + + return colorspace; +} + +static const GstDxgiColorSpace * +gst_d3d11_video_info_to_dxgi_color_space_yuv (GstVideoInfo * info) +{ + gint best_score = G_MAXINT; + gint score; + guint i; + const GstDxgiColorSpace *colorspace = NULL; + + for (i = 0; i < G_N_ELEMENTS (yuv_colorspace_map); i++) { + score = get_score (info, &yuv_colorspace_map[i], TRUE); + + if (score < best_score) { + best_score = score; + colorspace = &yuv_colorspace_map[i]; + + if (score == 0) + break; + } + } + + return colorspace; +} + +const GstDxgiColorSpace * +gst_d3d11_video_info_to_dxgi_color_space (GstVideoInfo * info) +{ + g_return_val_if_fail (info != NULL, NULL); + + if (GST_VIDEO_INFO_IS_RGB (info)) { + return gst_d3d11_video_info_to_dxgi_color_space_rgb (info); + } else if (GST_VIDEO_INFO_IS_YUV (info)) { + return gst_d3d11_video_info_to_dxgi_color_space_yuv (info); + } + + return NULL; +} + +const GstDxgiColorSpace * +gst_d3d11_find_swap_chain_color_space (GstVideoInfo * info, + IDXGISwapChain3 * swapchain) +{ + const GstDxgiColorSpace *colorspace = NULL; + gint best_score = G_MAXINT; + guint i; + /* list of tested display color spaces */ + static GST_DXGI_COLOR_SPACE_TYPE whitelist[] = { + GST_DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709, + GST_DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709, + 
GST_DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020, + }; + + g_return_val_if_fail (info != NULL, FALSE); + g_return_val_if_fail (swapchain != NULL, FALSE); + + if (!GST_VIDEO_INFO_IS_RGB (info)) { + GST_WARNING ("Swapchain colorspace should be RGB format"); + return FALSE; + } + + for (i = 0; i < G_N_ELEMENTS (rgb_colorspace_map); i++) { + UINT can_support = 0; + HRESULT hr; + gint score; + gboolean valid = FALSE; + GST_DXGI_COLOR_SPACE_TYPE cur_type = + (GST_DXGI_COLOR_SPACE_TYPE) rgb_colorspace_map[i].dxgi_color_space_type; + + for (guint j = 0; j < G_N_ELEMENTS (whitelist); j++) { + if (whitelist[j] == cur_type) { + valid = TRUE; + break; + } + } + + if (!valid) + continue; + + hr = swapchain->CheckColorSpaceSupport ((DXGI_COLOR_SPACE_TYPE) cur_type, + &can_support); + + if (FAILED (hr)) + continue; + + if ((can_support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) == + DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) { + score = get_score (info, &rgb_colorspace_map[i], FALSE); + + GST_DEBUG ("colorspace %d supported, score %d", cur_type, score); + + if (score < best_score) { + best_score = score; + colorspace = &rgb_colorspace_map[i]; + } + } + } + + return colorspace; +} +#endif + +static void +fill_staging_desc (const D3D11_TEXTURE2D_DESC * ref, + D3D11_TEXTURE2D_DESC * staging) +{ + memset (staging, 0, sizeof (D3D11_TEXTURE2D_DESC)); + + staging->Width = ref->Width; + staging->Height = ref->Height; + staging->MipLevels = 1; + staging->Format = ref->Format; + staging->SampleDesc.Count = 1; + staging->ArraySize = 1; + staging->Usage = D3D11_USAGE_STAGING; + staging->CPUAccessFlags = (D3D11_CPU_ACCESS_READ | D3D11_CPU_ACCESS_WRITE); +} + +GstBuffer * +gst_d3d11_allocate_staging_buffer_for (GstBuffer * buffer, + const GstVideoInfo * info, gboolean add_videometa) +{ + GstD3D11Memory *dmem; + GstD3D11Device *device; + GstD3D11Allocator *alloc = NULL; + GstBuffer *staging_buffer = NULL; + gint stride[GST_VIDEO_MAX_PLANES] = { 0, }; + gsize 
offset[GST_VIDEO_MAX_PLANES] = { 0, }; + guint i; + gsize size = 0; + const GstD3D11Format *format; + D3D11_TEXTURE2D_DESC desc; + + for (i = 0; i < gst_buffer_n_memory (buffer); i++) { + GstMemory *mem = gst_buffer_peek_memory (buffer, i); + + if (!gst_is_d3d11_memory (mem)) { + GST_DEBUG ("Not a d3d11 memory"); + + return NULL; + } + } + + dmem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, 0); + device = dmem->device; + format = gst_d3d11_device_format_from_gst (device, + GST_VIDEO_INFO_FORMAT (info)); + if (!format) { + GST_ERROR ("Unknown d3d11 format"); + return NULL; + } + + alloc = (GstD3D11Allocator *) gst_allocator_find (GST_D3D11_MEMORY_NAME); + if (!alloc) { + GST_ERROR ("D3D11 allocator is not available"); + return NULL; + } + + staging_buffer = gst_buffer_new (); + for (i = 0; i < gst_buffer_n_memory (buffer); i++) { + D3D11_TEXTURE2D_DESC staging_desc; + GstD3D11Memory *mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, i); + GstD3D11Memory *new_mem; + + guint cur_stride = 0; + + gst_d3d11_memory_get_texture_desc (mem, &desc); + fill_staging_desc (&desc, &staging_desc); + + new_mem = (GstD3D11Memory *) + gst_d3d11_allocator_alloc (alloc, mem->device, &staging_desc); + if (!new_mem) { + GST_ERROR ("Failed to allocate memory"); + goto error; + } + + if (!gst_d3d11_memory_get_texture_stride (new_mem, &cur_stride) || + cur_stride < staging_desc.Width) { + GST_ERROR ("Failed to calculate memory size"); + gst_memory_unref (GST_MEMORY_CAST (mem)); + goto error; + } + + offset[i] = size; + stride[i] = cur_stride; + size += GST_MEMORY_CAST (new_mem)->size; + + gst_buffer_append_memory (staging_buffer, GST_MEMORY_CAST (new_mem)); + } + + /* single texture semi-planar formats */ + if (format->dxgi_format != DXGI_FORMAT_UNKNOWN && + GST_VIDEO_INFO_N_PLANES (info) == 2) { + stride[1] = stride[0]; + offset[1] = stride[0] * desc.Height; + } + + gst_buffer_add_video_meta_full (staging_buffer, GST_VIDEO_FRAME_FLAG_NONE, + GST_VIDEO_INFO_FORMAT (info), 
GST_VIDEO_INFO_WIDTH (info), + GST_VIDEO_INFO_HEIGHT (info), GST_VIDEO_INFO_N_PLANES (info), + offset, stride); + + if (alloc) + gst_object_unref (alloc); + + return staging_buffer; + +error: + gst_clear_buffer (&staging_buffer); + gst_clear_object (&alloc); + + return NULL; +} + +static gboolean +gst_d3d11_buffer_copy_into_fallback (GstBuffer * dst, GstBuffer * src, + const GstVideoInfo * info) +{ + GstVideoFrame in_frame, out_frame; + gboolean ret; + + if (!gst_video_frame_map (&in_frame, (GstVideoInfo *) info, src, + (GstMapFlags) (GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) + goto invalid_buffer; + + if (!gst_video_frame_map (&out_frame, (GstVideoInfo *) info, dst, + (GstMapFlags) (GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) { + gst_video_frame_unmap (&in_frame); + goto invalid_buffer; + } + + ret = gst_video_frame_copy (&out_frame, &in_frame); + + gst_video_frame_unmap (&in_frame); + gst_video_frame_unmap (&out_frame); + + return ret; + + /* ERRORS */ +invalid_buffer: + { + GST_ERROR ("Invalid video buffer"); + return FALSE; + } +} + +gboolean +gst_d3d11_buffer_copy_into (GstBuffer * dst, GstBuffer * src, + const GstVideoInfo * info) +{ + guint i; + + g_return_val_if_fail (GST_IS_BUFFER (dst), FALSE); + g_return_val_if_fail (GST_IS_BUFFER (src), FALSE); + g_return_val_if_fail (info != NULL, FALSE); + + if (gst_buffer_n_memory (dst) != gst_buffer_n_memory (src)) { + GST_LOG ("different memory layout, perform fallback copy"); + return gst_d3d11_buffer_copy_into_fallback (dst, src, info); + } + + if (!gst_is_d3d11_buffer (dst) || !gst_is_d3d11_buffer (src)) { + GST_LOG ("non-d3d11 memory, perform fallback copy"); + return gst_d3d11_buffer_copy_into_fallback (dst, src, info); + } + + for (i = 0; i < gst_buffer_n_memory (dst); i++) { + GstMemory *dst_mem, *src_mem; + GstD3D11Memory *dst_dmem, *src_dmem; + GstMapInfo dst_info; + GstMapInfo src_info; + ID3D11Resource *dst_texture, *src_texture; + ID3D11DeviceContext *device_context; + GstD3D11Device 
*device; + D3D11_BOX src_box = { 0, }; + D3D11_TEXTURE2D_DESC dst_desc, src_desc; + guint dst_subidx, src_subidx; + + dst_mem = gst_buffer_peek_memory (dst, i); + src_mem = gst_buffer_peek_memory (src, i); + + dst_dmem = (GstD3D11Memory *) dst_mem; + src_dmem = (GstD3D11Memory *) src_mem; + + device = dst_dmem->device; + if (device != src_dmem->device) { + GST_LOG ("different device, perform fallback copy"); + return gst_d3d11_buffer_copy_into_fallback (dst, src, info); + } + + gst_d3d11_memory_get_texture_desc (dst_dmem, &dst_desc); + gst_d3d11_memory_get_texture_desc (src_dmem, &src_desc); + + if (dst_desc.Format != src_desc.Format) { + GST_WARNING ("different dxgi format"); + return FALSE; + } + + device_context = gst_d3d11_device_get_device_context_handle (device); + + if (!gst_memory_map (dst_mem, &dst_info, + (GstMapFlags) (GST_MAP_WRITE | GST_MAP_D3D11))) { + GST_ERROR ("Cannot map dst d3d11 memory"); + return FALSE; + } + + if (!gst_memory_map (src_mem, &src_info, + (GstMapFlags) (GST_MAP_READ | GST_MAP_D3D11))) { + GST_ERROR ("Cannot map src d3d11 memory"); + gst_memory_unmap (dst_mem, &dst_info); + return FALSE; + } + + dst_texture = (ID3D11Resource *) dst_info.data; + src_texture = (ID3D11Resource *) src_info.data; + + /* src/dst texture size might be different if padding was used. 
+ * select smaller size */ + src_box.left = 0; + src_box.top = 0; + src_box.front = 0; + src_box.back = 1; + src_box.right = MIN (src_desc.Width, dst_desc.Width); + src_box.bottom = MIN (src_desc.Height, dst_desc.Height); + + dst_subidx = gst_d3d11_memory_get_subresource_index (dst_dmem); + src_subidx = gst_d3d11_memory_get_subresource_index (src_dmem); + + gst_d3d11_device_lock (device); + device_context->CopySubresourceRegion (dst_texture, dst_subidx, 0, 0, 0, + src_texture, src_subidx, &src_box); + gst_d3d11_device_unlock (device); + + gst_memory_unmap (src_mem, &src_info); + gst_memory_unmap (dst_mem, &dst_info); + } + + return TRUE; +} + +gboolean +gst_is_d3d11_buffer (GstBuffer * buffer) +{ + guint i; + guint size; + + g_return_val_if_fail (GST_IS_BUFFER (buffer), FALSE); + + size = gst_buffer_n_memory (buffer); + if (size == 0) + return FALSE; + + for (i = 0; i < size; i++) { + GstMemory *mem = gst_buffer_peek_memory (buffer, i); + + if (!gst_is_d3d11_memory (mem)) + return FALSE; + } + + return TRUE; +} + +gboolean +gst_d3d11_buffer_can_access_device (GstBuffer * buffer, ID3D11Device * device) +{ + guint i; + + g_return_val_if_fail (GST_IS_BUFFER (buffer), FALSE); + g_return_val_if_fail (device != NULL, FALSE); + + if (!gst_is_d3d11_buffer (buffer)) { + GST_LOG ("Not a d3d11 buffer"); + return FALSE; + } + + for (i = 0; i < gst_buffer_n_memory (buffer); i++) { + GstD3D11Memory *mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, i); + ID3D11Device *handle; + + handle = gst_d3d11_device_get_device_handle (mem->device); + if (handle != device) { + GST_LOG ("D3D11 device is incompatible"); + return FALSE; + } + } + + return TRUE; +} + +gboolean +gst_d3d11_buffer_map (GstBuffer * buffer, ID3D11Device * device, + GstMapInfo info[GST_VIDEO_MAX_PLANES], GstMapFlags flags) +{ + GstMapFlags map_flags; + guint num_mapped = 0; + + g_return_val_if_fail (GST_IS_BUFFER (buffer), FALSE); + g_return_val_if_fail (info != NULL, FALSE); + + if 
(!gst_d3d11_buffer_can_access_device (buffer, device)) + return FALSE; + + map_flags = (GstMapFlags) (flags | GST_MAP_D3D11); + + for (num_mapped = 0; num_mapped < gst_buffer_n_memory (buffer); num_mapped++) { + GstMemory *mem = gst_buffer_peek_memory (buffer, num_mapped); + + if (!gst_memory_map (mem, &info[num_mapped], map_flags)) { + GST_ERROR ("Couldn't map memory"); + goto error; + } + } + + return TRUE; + +error: + { + guint i; + for (i = 0; i < num_mapped; i++) { + GstMemory *mem = gst_buffer_peek_memory (buffer, i); + gst_memory_unmap (mem, &info[i]); + } + + return FALSE; + } +} + +gboolean +gst_d3d11_buffer_unmap (GstBuffer * buffer, + GstMapInfo info[GST_VIDEO_MAX_PLANES]) +{ + guint i; + + g_return_val_if_fail (GST_IS_BUFFER (buffer), FALSE); + g_return_val_if_fail (info != NULL, FALSE); + + for (i = 0; i < gst_buffer_n_memory (buffer); i++) { + GstMemory *mem = gst_buffer_peek_memory (buffer, i); + + gst_memory_unmap (mem, &info[i]); + } + + return TRUE; +} + +guint +gst_d3d11_buffer_get_shader_resource_view (GstBuffer * buffer, + ID3D11ShaderResourceView * view[GST_VIDEO_MAX_PLANES]) +{ + guint i; + guint num_views = 0; + + g_return_val_if_fail (GST_IS_BUFFER (buffer), 0); + g_return_val_if_fail (view != NULL, 0); + + if (!gst_is_d3d11_buffer (buffer)) { + GST_ERROR ("Buffer contains non-d3d11 memory"); + return 0; + } + + for (i = 0; i < gst_buffer_n_memory (buffer); i++) { + GstD3D11Memory *mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, i); + guint view_size; + guint j; + + view_size = gst_d3d11_memory_get_shader_resource_view_size (mem); + if (!view_size) { + GST_LOG ("SRV is unavailable for memory index %d", i); + return 0; + } + + for (j = 0; j < view_size; j++) { + if (num_views >= GST_VIDEO_MAX_PLANES) { + GST_ERROR ("Too many SRVs"); + return 0; + } + + view[num_views++] = gst_d3d11_memory_get_shader_resource_view (mem, j); + } + } + + return num_views; +} + +guint +gst_d3d11_buffer_get_render_target_view (GstBuffer * buffer, + 
ID3D11RenderTargetView * view[GST_VIDEO_MAX_PLANES]) +{ + guint i; + guint num_views = 0; + + g_return_val_if_fail (GST_IS_BUFFER (buffer), 0); + g_return_val_if_fail (view != NULL, 0); + + if (!gst_is_d3d11_buffer (buffer)) { + GST_ERROR ("Buffer contains non-d3d11 memory"); + return 0; + } + + for (i = 0; i < gst_buffer_n_memory (buffer); i++) { + GstD3D11Memory *mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, i); + guint view_size; + guint j; + + view_size = gst_d3d11_memory_get_render_target_view_size (mem); + if (!view_size) { + GST_LOG ("RTV is unavailable for memory index %d", i); + return 0; + } + + for (j = 0; j < view_size; j++) { + if (num_views >= GST_VIDEO_MAX_PLANES) { + GST_ERROR ("Too many RTVs"); + return 0; + } + + view[num_views++] = gst_d3d11_memory_get_render_target_view (mem, j); + } + } + + return num_views; +} + +GstBufferPool * +gst_d3d11_buffer_pool_new_with_options (GstD3D11Device * device, + GstCaps * caps, GstD3D11AllocationParams * alloc_params, + guint min_buffers, guint max_buffers) +{ + GstBufferPool *pool; + GstStructure *config; + GstVideoInfo info; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (GST_IS_CAPS (caps), NULL); + g_return_val_if_fail (alloc_params != NULL, NULL); + + if (!gst_video_info_from_caps (&info, caps)) { + GST_ERROR_OBJECT (device, "invalid caps"); + return NULL; + } + + pool = gst_d3d11_buffer_pool_new (device); + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, + caps, GST_VIDEO_INFO_SIZE (&info), min_buffers, max_buffers); + + gst_buffer_pool_config_set_d3d11_allocation_params (config, alloc_params); + + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + if (!gst_buffer_pool_set_config (pool, config)) { + GST_ERROR_OBJECT (pool, "Couldn't set config"); + gst_object_unref (pool); + return NULL; + } + + return pool; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11pluginutils.h
Added
@@ -0,0 +1,114 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_PLUGIN_UTILS_H__ +#define __GST_D3D11_PLUGIN_UTILS_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11.h> + +G_BEGIN_DECLS + +typedef struct _GstDxgiColorSpace GstDxgiColorSpace; + +typedef enum +{ + GST_D3D11_DEVICE_VENDOR_UNKNOWN = 0, + GST_D3D11_DEVICE_VENDOR_AMD, + GST_D3D11_DEVICE_VENDOR_INTEL, + GST_D3D11_DEVICE_VENDOR_NVIDIA, + GST_D3D11_DEVICE_VENDOR_QUALCOMM, + GST_D3D11_DEVICE_VENDOR_XBOX, +} GstD3D11DeviceVendor; + +struct _GstDxgiColorSpace +{ + guint dxgi_color_space_type; + GstVideoColorRange range; + GstVideoColorMatrix matrix; + GstVideoTransferFunction transfer; + GstVideoColorPrimaries primaries; +}; + +#define GST_D3D11_CLEAR_COM(obj) G_STMT_START { \ + if (obj) { \ + (obj)->Release (); \ + (obj) = NULL; \ + } \ + } G_STMT_END + +void gst_d3d11_plugin_utils_init (D3D_FEATURE_LEVEL feature_level); + +GstCaps * gst_d3d11_get_updated_template_caps (GstStaticCaps * template_caps); + +gboolean gst_d3d11_is_windows_8_or_greater (void); + +GstD3D11DeviceVendor gst_d3d11_get_device_vendor 
(GstD3D11Device * device); + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) +gboolean gst_d3d11_hdr_meta_data_to_dxgi (GstVideoMasteringDisplayInfo * minfo, + GstVideoContentLightLevel * cll, + DXGI_HDR_METADATA_HDR10 * dxgi_hdr10); +#endif + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) +const GstDxgiColorSpace * gst_d3d11_video_info_to_dxgi_color_space (GstVideoInfo * info); + +const GstDxgiColorSpace * gst_d3d11_find_swap_chain_color_space (GstVideoInfo * info, + IDXGISwapChain3 * swapchain); +#endif + +GstBuffer * gst_d3d11_allocate_staging_buffer_for (GstBuffer * buffer, + const GstVideoInfo * info, + gboolean add_videometa); + +gboolean gst_d3d11_buffer_copy_into (GstBuffer * dst, + GstBuffer * src, + const GstVideoInfo * info); + +gboolean gst_is_d3d11_buffer (GstBuffer * buffer); + +gboolean gst_d3d11_buffer_can_access_device (GstBuffer * buffer, + ID3D11Device * device); + +gboolean gst_d3d11_buffer_map (GstBuffer * buffer, + ID3D11Device * device, + GstMapInfo info[GST_VIDEO_MAX_PLANES], + GstMapFlags flags); + +gboolean gst_d3d11_buffer_unmap (GstBuffer * buffer, + GstMapInfo info[GST_VIDEO_MAX_PLANES]); + +guint gst_d3d11_buffer_get_shader_resource_view (GstBuffer * buffer, + ID3D11ShaderResourceView * view[GST_VIDEO_MAX_PLANES]); + +guint gst_d3d11_buffer_get_render_target_view (GstBuffer * buffer, + ID3D11RenderTargetView * view[GST_VIDEO_MAX_PLANES]); + +GstBufferPool * gst_d3d11_buffer_pool_new_with_options (GstD3D11Device * device, + GstCaps * caps, + GstD3D11AllocationParams * alloc_params, + guint min_buffers, + guint max_buffers); + +G_END_DECLS + +#endif /* __GST_D3D11_PLUGIN_UTILS_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11screencapture.cpp
Added
@@ -0,0 +1,2004 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/* + * The MIT License (MIT) + * + * Copyright (c) Microsoft Corporation + * + * Permission is hereby granted, free of charge, to any person obtaining a copy + * of this software and associated documentation files (the "Software"), to deal + * in the Software without restriction, including without limitation the rights + * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell + * copies of the Software, and to permit persons to whom the Software is + * furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, + * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN + * THE SOFTWARE. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11screencapture.h" +#include "gstd3d11shader.h" +#include "gstd3d11pluginutils.h" +#include <string.h> + +#include <wrl.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_screen_capture_debug); +#define GST_CAT_DEFAULT gst_d3d11_screen_capture_debug + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; + +/* List of GstD3D11ScreenCapture weakref */ +G_LOCK_DEFINE_STATIC (dupl_list_lock); +static GList *dupl_list = nullptr; + +/* The implementation below was taken from the Microsoft sample + * https://github.com/microsoft/Windows-classic-samples/tree/master/Samples/DXGIDesktopDuplication + */ +#define NUMVERTICES 6 +#define BPP 4 + +/* Define our own MyFLOAT3 and MyFLOAT2 struct, since MinGW doesn't support + * DirectXMath.h + */ +struct MyFLOAT3 +{ + float x; + float y; + float z; + + MyFLOAT3() = default; + + MyFLOAT3(const MyFLOAT3&) = default; + MyFLOAT3& operator=(const MyFLOAT3&) = default; + + MyFLOAT3(MyFLOAT3&&) = default; + MyFLOAT3& operator=(MyFLOAT3&&) = default; + + constexpr MyFLOAT3(float _x, float _y, float _z) : x(_x), y(_y), z(_z) {} + explicit MyFLOAT3(const float *pArray) : x(pArray[0]), y(pArray[1]), z(pArray[2]) {} +}; + +struct MyFLOAT2 +{ + float x; + float y; + + MyFLOAT2() = default; + + MyFLOAT2(const MyFLOAT2&) = default; + MyFLOAT2& operator=(const MyFLOAT2&) = default; + + MyFLOAT2(MyFLOAT2&&) = default; + MyFLOAT2& operator=(MyFLOAT2&&) = default; + + constexpr MyFLOAT2(float _x, float _y) : x(_x), y(_y) {} + explicit MyFLOAT2(const float *pArray) : x(pArray[0]), y(pArray[1]) {} +}; + +typedef struct +{ + MyFLOAT3 Pos; + MyFLOAT2 TexCoord; +} VERTEX; + +/* List of expected error cases */ +/* These are 
the errors we expect from the general DXGI API due to a transition */ +HRESULT SystemTransitionsExpectedErrors[] = { + DXGI_ERROR_DEVICE_REMOVED, + DXGI_ERROR_ACCESS_LOST, + static_cast<HRESULT>(WAIT_ABANDONED), + S_OK +}; + +/* These are the errors we expect from IDXGIOutput1::DuplicateOutput + * due to a transition */ +HRESULT CreateDuplicationExpectedErrors[] = { + DXGI_ERROR_DEVICE_REMOVED, + static_cast<HRESULT>(E_ACCESSDENIED), + DXGI_ERROR_SESSION_DISCONNECTED, + S_OK +}; + +/* These are the errors we expect from IDXGIOutputDuplication methods + * due to a transition */ +HRESULT FrameInfoExpectedErrors[] = { + DXGI_ERROR_DEVICE_REMOVED, + DXGI_ERROR_ACCESS_LOST, + S_OK +}; + +/* These are the errors we expect from IDXGIAdapter::EnumOutputs methods + * due to outputs becoming stale during a transition */ +HRESULT EnumOutputsExpectedErrors[] = { + DXGI_ERROR_NOT_FOUND, + S_OK +}; + +static GstFlowReturn +gst_d3d11_screen_capture_return_from_hr (ID3D11Device * device, + HRESULT hr, HRESULT * expected_errors = nullptr) +{ + HRESULT translated_hr = hr; + + /* On an error check if the DX device is lost */ + if (device) { + HRESULT remove_reason = device->GetDeviceRemovedReason (); + + switch (remove_reason) { + case DXGI_ERROR_DEVICE_REMOVED: + case DXGI_ERROR_DEVICE_RESET: + case static_cast<HRESULT>(E_OUTOFMEMORY): + /* Our device has been stopped due to an external event on the GPU so + * map them all to device removed and continue processing the condition + */ + translated_hr = DXGI_ERROR_DEVICE_REMOVED; + break; + case S_OK: + /* Device is not removed so use original error */ + break; + default: + /* Device is removed but not an error we want to remap */ + translated_hr = remove_reason; + break; + } + } + + /* Check if this error was expected or not */ + if (expected_errors) { + HRESULT* rst = expected_errors; + + while (*rst != S_OK) { + if (*rst == translated_hr) + return GST_D3D11_SCREEN_CAPTURE_FLOW_EXPECTED_ERROR; + + rst++; + } + } + + return GST_FLOW_ERROR; 
+} + +class PTR_INFO +{ +public: + PTR_INFO () + : PtrShapeBuffer (nullptr) + , BufferSize (0) + { + LastTimeStamp.QuadPart = 0; + } + + ~PTR_INFO () + { + if (PtrShapeBuffer) + delete[] PtrShapeBuffer; + } + + void + MaybeReallocBuffer (UINT buffer_size) + { + if (buffer_size <= BufferSize) + return; + + if (PtrShapeBuffer) + delete[] PtrShapeBuffer; + + PtrShapeBuffer = new BYTE[buffer_size]; + BufferSize = buffer_size; + } + + BYTE* PtrShapeBuffer; + UINT BufferSize; + DXGI_OUTDUPL_POINTER_SHAPE_INFO shape_info; + POINT Position; + bool Visible; + LARGE_INTEGER LastTimeStamp; +}; + +class D3D11DesktopDupObject +{ +public: + D3D11DesktopDupObject () + : device_(nullptr) + , metadata_buffer_(nullptr) + , metadata_buffer_size_(0) + , vertex_buffer_(nullptr) + , vertex_buffer_size_(0) + { + } + + ~D3D11DesktopDupObject () + { + if (metadata_buffer_) + delete[] metadata_buffer_; + + if (vertex_buffer_) + delete[] vertex_buffer_; + + gst_clear_object (&device_); + } + + GstFlowReturn + Init (GstD3D11Device * device, HMONITOR monitor) + { + GstFlowReturn ret; + ID3D11Device *device_handle; + HRESULT hr; + D3D11_TEXTURE2D_DESC texture_desc = { 0, }; + + if (!InitShader (device)) + return GST_FLOW_ERROR; + + ret = InitDupl (device, monitor); + if (ret != GST_FLOW_OK) + return ret; + + GST_INFO ("Init done"); + + device_handle = gst_d3d11_device_get_device_handle (device); + + texture_desc.Width = output_desc_.ModeDesc.Width; + texture_desc.Height = output_desc_.ModeDesc.Height; + texture_desc.MipLevels = 1; + texture_desc.ArraySize = 1; + /* FIXME: we can support DXGI_FORMAT_R10G10B10A2_UNORM */ + texture_desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; + texture_desc.SampleDesc.Count = 1; + texture_desc.Usage = D3D11_USAGE_DEFAULT; + texture_desc.BindFlags = + D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE; + texture_desc.CPUAccessFlags = 0; + texture_desc.MiscFlags = 0; + + hr = device_handle->CreateTexture2D (&texture_desc, + nullptr, &shared_texture_); + if 
(!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (device, "Couldn't create texture, hr 0x%x", (guint) hr); + return GST_FLOW_ERROR; + } + + device_ = (GstD3D11Device *) gst_object_ref (device); + + return GST_FLOW_OK; + } + + GstFlowReturn + Capture (gboolean draw_mouse) + { + GstFlowReturn ret; + bool timeout = false; + ComPtr<ID3D11Texture2D> texture; + UINT move_count, dirty_count; + DXGI_OUTDUPL_FRAME_INFO frame_info; + + GST_TRACE ("Capturing"); + ret = GetFrame (&texture, &move_count, &dirty_count, &frame_info, &timeout); + if (ret != GST_FLOW_OK) + return ret; + + /* Nothing updated */ + if (timeout) { + GST_TRACE ("timeout"); + return GST_FLOW_OK; + } + + if (draw_mouse) { + GST_TRACE ("Getting mouse pointer info"); + ret = GetMouse (&ptr_info_, &frame_info); + if (ret != GST_FLOW_OK) { + GST_WARNING ("Couldn't get mouse pointer info"); + dupl_->ReleaseFrame (); + return ret; + } + } + + ret = ProcessFrame (texture.Get(), shared_texture_.Get(), + &output_desc_, move_count, dirty_count, &frame_info); + + if (ret != GST_FLOW_OK) { + dupl_->ReleaseFrame (); + GST_WARNING ("Couldn't process frame"); + return ret; + } + + HRESULT hr = dupl_->ReleaseFrame (); + if (!gst_d3d11_result (hr, device_)) { + GST_WARNING ("Couldn't release frame"); + return gst_d3d11_screen_capture_return_from_hr (nullptr, hr, FrameInfoExpectedErrors); + } + + GST_TRACE ("Capture done"); + + return GST_FLOW_OK; + } + + bool DrawMouse (ID3D11RenderTargetView * rtv) + { + GST_TRACE ("Drawing mouse"); + + if (!ptr_info_.Visible) { + GST_TRACE ("Mouse is invisible"); + return true; + } + + ComPtr<ID3D11Texture2D> MouseTex; + ComPtr<ID3D11ShaderResourceView> ShaderRes; + ComPtr<ID3D11Buffer> VertexBufferMouse; + D3D11_SUBRESOURCE_DATA InitData; + D3D11_TEXTURE2D_DESC Desc; + D3D11_SHADER_RESOURCE_VIEW_DESC SDesc; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device_); + ID3D11DeviceContext *context_handle = + gst_d3d11_device_get_device_context_handle (device_); + 
+ VERTEX Vertices[NUMVERTICES] = + { + {MyFLOAT3(-1.0f, -1.0f, 0), MyFLOAT2(0.0f, 1.0f)}, + {MyFLOAT3(-1.0f, 1.0f, 0), MyFLOAT2(0.0f, 0.0f)}, + {MyFLOAT3(1.0f, -1.0f, 0), MyFLOAT2(1.0f, 1.0f)}, + {MyFLOAT3(1.0f, -1.0f, 0), MyFLOAT2(1.0f, 1.0f)}, + {MyFLOAT3(-1.0f, 1.0f, 0), MyFLOAT2(0.0f, 0.0f)}, + {MyFLOAT3(1.0f, 1.0f, 0), MyFLOAT2(1.0f, 0.0f)}, + }; + + D3D11_TEXTURE2D_DESC FullDesc; + shared_texture_->GetDesc(&FullDesc); + INT DesktopWidth = FullDesc.Width; + INT DesktopHeight = FullDesc.Height; + + INT CenterX = (DesktopWidth / 2); + INT CenterY = (DesktopHeight / 2); + + INT PtrWidth = 0; + INT PtrHeight = 0; + INT PtrLeft = 0; + INT PtrTop = 0; + + BYTE* InitBuffer = nullptr; + + D3D11_BOX Box; + Box.front = 0; + Box.back = 1; + + Desc.MipLevels = 1; + Desc.ArraySize = 1; + Desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; + Desc.SampleDesc.Count = 1; + Desc.SampleDesc.Quality = 0; + Desc.Usage = D3D11_USAGE_DEFAULT; + Desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; + Desc.CPUAccessFlags = 0; + Desc.MiscFlags = 0; + + // Set shader resource properties + SDesc.Format = Desc.Format; + SDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; + SDesc.Texture2D.MostDetailedMip = Desc.MipLevels - 1; + SDesc.Texture2D.MipLevels = Desc.MipLevels; + + switch (ptr_info_.shape_info.Type) { + case DXGI_OUTDUPL_POINTER_SHAPE_TYPE_COLOR: + PtrLeft = ptr_info_.Position.x; + PtrTop = ptr_info_.Position.y; + + PtrWidth = static_cast<INT>(ptr_info_.shape_info.Width); + PtrHeight = static_cast<INT>(ptr_info_.shape_info.Height); + break; + case DXGI_OUTDUPL_POINTER_SHAPE_TYPE_MONOCHROME: + ProcessMonoMask(true, &ptr_info_, &PtrWidth, &PtrHeight, &PtrLeft, + &PtrTop, &InitBuffer, &Box); + break; + case DXGI_OUTDUPL_POINTER_SHAPE_TYPE_MASKED_COLOR: + ProcessMonoMask(false, &ptr_info_, &PtrWidth, &PtrHeight, &PtrLeft, + &PtrTop, &InitBuffer, &Box); + break; + default: + break; + } + + /* Nothing to draw */ + if (PtrWidth == 0 || PtrHeight == 0) { + if (InitBuffer) + delete[] InitBuffer; + + return 
true; + } + + Vertices[0].Pos.x = (PtrLeft - CenterX) / (FLOAT)CenterX; + Vertices[0].Pos.y = -1 * ((PtrTop + PtrHeight) - CenterY) / (FLOAT)CenterY; + Vertices[1].Pos.x = (PtrLeft - CenterX) / (FLOAT)CenterX; + Vertices[1].Pos.y = -1 * (PtrTop - CenterY) / (FLOAT)CenterY; + Vertices[2].Pos.x = ((PtrLeft + PtrWidth) - CenterX) / (FLOAT)CenterX; + Vertices[2].Pos.y = -1 * ((PtrTop + PtrHeight) - CenterY) / (FLOAT)CenterY; + Vertices[3].Pos.x = Vertices[2].Pos.x; + Vertices[3].Pos.y = Vertices[2].Pos.y; + Vertices[4].Pos.x = Vertices[1].Pos.x; + Vertices[4].Pos.y = Vertices[1].Pos.y; + Vertices[5].Pos.x = ((PtrLeft + PtrWidth) - CenterX) / (FLOAT)CenterX; + Vertices[5].Pos.y = -1 * (PtrTop - CenterY) / (FLOAT)CenterY; + + Desc.Width = PtrWidth; + Desc.Height = PtrHeight; + + InitData.pSysMem = + (ptr_info_.shape_info.Type == DXGI_OUTDUPL_POINTER_SHAPE_TYPE_COLOR) ? + ptr_info_.PtrShapeBuffer : InitBuffer; + InitData.SysMemPitch = + (ptr_info_.shape_info.Type == DXGI_OUTDUPL_POINTER_SHAPE_TYPE_COLOR) ? 
+ ptr_info_.shape_info.Pitch : PtrWidth * BPP; + InitData.SysMemSlicePitch = 0; + + // Create mouseshape as texture + HRESULT hr = device_handle->CreateTexture2D(&Desc, &InitData, &MouseTex); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Failed to create texture for rendering mouse"); + return false; + } + + // Create shader resource from texture + hr = device_handle->CreateShaderResourceView(MouseTex.Get(), &SDesc, + &ShaderRes); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Failed to create shader resource view for rendering mouse"); + return false; + } + + D3D11_BUFFER_DESC BDesc; + memset (&BDesc, 0, sizeof(D3D11_BUFFER_DESC)); + BDesc.Usage = D3D11_USAGE_DEFAULT; + BDesc.ByteWidth = sizeof(VERTEX) * NUMVERTICES; + BDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER; + BDesc.CPUAccessFlags = 0; + + memset (&InitData, 0, sizeof(D3D11_SUBRESOURCE_DATA)); + InitData.pSysMem = Vertices; + + // Create vertex buffer + hr = device_handle->CreateBuffer(&BDesc, &InitData, &VertexBufferMouse); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Failed to create vertex buffer for rendering mouse"); + return false; + } + + FLOAT BlendFactor[4] = {0.f, 0.f, 0.f, 0.f}; + UINT Stride = sizeof(VERTEX); + UINT Offset = 0; + ID3D11SamplerState *samplers = sampler_.Get(); + ID3D11ShaderResourceView *srv = ShaderRes.Get(); + ID3D11Buffer *vert_buf = VertexBufferMouse.Get(); + + context_handle->IASetVertexBuffers(0, 1, &vert_buf, &Stride, &Offset); + context_handle->OMSetBlendState(blend_.Get(), BlendFactor, 0xFFFFFFFF); + context_handle->OMSetRenderTargets(1, &rtv, nullptr); + context_handle->VSSetShader(vs_.Get(), nullptr, 0); + context_handle->PSSetShader(ps_.Get(), nullptr, 0); + context_handle->PSSetShaderResources(0, 1, &srv); + context_handle->PSSetSamplers(0, 1, &samplers); + context_handle->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST); + context_handle->IASetInputLayout(layout_.Get()); + + context_handle->Draw(NUMVERTICES, 0); + + /* Unbind srv 
and rtv from context */ + srv = nullptr; + context_handle->PSSetShaderResources (0, 1, &srv); + context_handle->OMSetRenderTargets (0, nullptr, nullptr); + + if (InitBuffer) + delete[] InitBuffer; + + return true; + } + + void + CopyToTexture (ID3D11Texture2D * texture) + { + ID3D11DeviceContext *context_handle = + gst_d3d11_device_get_device_context_handle (device_); + + context_handle->CopySubresourceRegion (texture, 0, 0, 0, 0, + shared_texture_.Get(), 0, nullptr); + } + + void + GetSize (guint * width, guint * height) + { + *width = output_desc_.ModeDesc.Width; + *height = output_desc_.ModeDesc.Height; + } + +private: + /* This method is not expected to be failed unless un-recoverable error case */ + bool + InitShader (GstD3D11Device * device) + { + static const gchar vs_str[] = + "struct VS_INPUT {\n" + " float4 Position: POSITION;\n" + " float2 Texture: TEXCOORD;\n" + "};\n" + "\n" + "struct VS_OUTPUT {\n" + " float4 Position: SV_POSITION;\n" + " float2 Texture: TEXCOORD;\n" + "};\n" + "\n" + "VS_OUTPUT main (VS_INPUT input)\n" + "{\n" + " return input;\n" + "}"; + + static const gchar ps_str[] = + "Texture2D shaderTexture;\n" + "SamplerState samplerState;\n" + "\n" + "struct PS_INPUT {\n" + " float4 Position: SV_POSITION;\n" + " float2 Texture: TEXCOORD;\n" + "};\n" + "\n" + "struct PS_OUTPUT {\n" + " float4 Plane: SV_Target;\n" + "};\n" + "\n" + "PS_OUTPUT main(PS_INPUT input)\n" + "{\n" + " PS_OUTPUT output;\n" + " output.Plane = shaderTexture.Sample(samplerState, input.Texture);\n" + " return output;\n" + "}"; + + D3D11_INPUT_ELEMENT_DESC input_desc[] = { + {"POSITION", + 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0}, + {"TEXCOORD", + 0, DXGI_FORMAT_R32G32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0} + }; + + ComPtr<ID3D11VertexShader> vs; + ComPtr<ID3D11InputLayout> layout; + if (!gst_d3d11_create_vertex_shader (device, + vs_str, input_desc, G_N_ELEMENTS (input_desc), &vs, &layout)) { + GST_ERROR ("Failed to create vertex 
shader"); + return false; + } + + ComPtr<ID3D11PixelShader> ps; + if (!gst_d3d11_create_pixel_shader (device, ps_str, &ps)) { + GST_ERROR ("Failed to create pixel shader"); + return false; + } + + D3D11_SAMPLER_DESC sampler_desc; + memset (&sampler_desc, 0, sizeof (D3D11_SAMPLER_DESC)); + sampler_desc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR; + sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.ComparisonFunc = D3D11_COMPARISON_NEVER; + sampler_desc.MinLOD = 0; + sampler_desc.MaxLOD = D3D11_FLOAT32_MAX; + + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + ComPtr<ID3D11SamplerState> sampler; + HRESULT hr = device_handle->CreateSamplerState (&sampler_desc, &sampler); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Failed to create sampler state, hr 0x%x", (guint) hr); + return false; + } + + /* For blending mouse pointer texture */ + D3D11_BLEND_DESC blend_desc; + blend_desc.AlphaToCoverageEnable = FALSE; + blend_desc.IndependentBlendEnable = FALSE; + blend_desc.RenderTarget[0].BlendEnable = TRUE; + blend_desc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA; + blend_desc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA; + blend_desc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD; + blend_desc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE; + blend_desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO; + blend_desc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD; + blend_desc.RenderTarget[0].RenderTargetWriteMask = + D3D11_COLOR_WRITE_ENABLE_ALL; + + ComPtr<ID3D11BlendState> blend; + hr = device_handle->CreateBlendState (&blend_desc, &blend); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Failed to create blend state, hr 0x%x", (guint) hr); + return false; + } + + /* Everything is prepared now */ + vs_ = vs; + ps_ = ps; + layout_ = layout; + sampler_ = sampler; + blend_ = blend; + + return true; 
+ } + + /* May return an expected error code depending on desktop status */ + GstFlowReturn + InitDupl (GstD3D11Device * device, HMONITOR monitor) + { + ComPtr<ID3D11Device> d3d11_device; + ComPtr<IDXGIAdapter1> adapter; + ComPtr<IDXGIOutput> output; + ComPtr<IDXGIOutput1> output1; + + d3d11_device = gst_d3d11_device_get_device_handle (device); + + HRESULT hr = gst_d3d11_screen_capture_find_output_for_monitor (monitor, + &adapter, &output); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't get adapter and output for monitor"); + return GST_FLOW_ERROR; + } + + hr = output.As (&output1); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("Couldn't get IDXGIOutput1 interface, hr 0x%x", (guint) hr); + return GST_FLOW_ERROR; + } + + HDESK hdesk = OpenInputDesktop (0, FALSE, GENERIC_ALL); + if (hdesk) { + if (!SetThreadDesktop (hdesk)) { + GST_WARNING ("SetThreadDesktop() failed, error %lu", GetLastError()); + } + + CloseDesktop (hdesk); + } else { + GST_WARNING ("OpenInputDesktop() failed, error %lu", GetLastError()); + } + + /* FIXME: Use DuplicateOutput1 to avoid potential color conversion */ + hr = output1->DuplicateOutput(d3d11_device.Get(), &dupl_); + if (!gst_d3d11_result (hr, device)) { + if (hr == DXGI_ERROR_NOT_CURRENTLY_AVAILABLE) { + GST_ERROR ("Hit the max allowed number of Desktop Duplication sessions"); + return GST_FLOW_ERROR; + } + + /* Seems to be one limitation of Desktop Duplication API design + * See + * https://docs.microsoft.com/en-US/troubleshoot/windows-client/shell-experience/error-when-dda-capable-app-is-against-gpu + */ + if (hr == DXGI_ERROR_UNSUPPORTED) { + GST_WARNING ("IDXGIOutput1::DuplicateOutput returned " + "DXGI_ERROR_UNSUPPORTED, possibly the application is running against a " + "discrete GPU"); + return GST_D3D11_SCREEN_CAPTURE_FLOW_UNSUPPORTED; + } + + return gst_d3d11_screen_capture_return_from_hr (d3d11_device.Get(), hr, + CreateDuplicationExpectedErrors); + } + + dupl_->GetDesc (&output_desc_); + + return GST_FLOW_OK; 
+ } + + GstFlowReturn + GetMouse (PTR_INFO * ptr_info, DXGI_OUTDUPL_FRAME_INFO * frame_info) + { + /* A non-zero mouse update timestamp indicates that there is a mouse + * position update and optionally a shape change */ + if (frame_info->LastMouseUpdateTime.QuadPart == 0) + return GST_FLOW_OK; + + ptr_info->Position.x = frame_info->PointerPosition.Position.x; + ptr_info->Position.y = frame_info->PointerPosition.Position.y; + ptr_info->LastTimeStamp = frame_info->LastMouseUpdateTime; + ptr_info->Visible = frame_info->PointerPosition.Visible != 0; + + /* Mouse is invisible */ + if (!ptr_info->Visible) + return GST_FLOW_OK; + + /* No new shape */ + if (frame_info->PointerShapeBufferSize == 0) + return GST_FLOW_OK; + + /* Realloc buffer if needed */ + ptr_info->MaybeReallocBuffer (frame_info->PointerShapeBufferSize); + + /* Get shape */ + UINT dummy; + HRESULT hr = dupl_->GetFramePointerShape(frame_info->PointerShapeBufferSize, + (void *) ptr_info->PtrShapeBuffer, &dummy, &ptr_info->shape_info); + + if (!gst_d3d11_result (hr, device_)) { + ID3D11Device *device_handle = + gst_d3d11_device_get_device_handle (device_); + + return gst_d3d11_screen_capture_return_from_hr(device_handle, hr, + FrameInfoExpectedErrors); + } + + return GST_FLOW_OK; + } + + void + MaybeReallocMetadataBuffer (UINT buffer_size) + { + if (buffer_size <= metadata_buffer_size_) + return; + + if (metadata_buffer_) + delete[] metadata_buffer_; + + metadata_buffer_ = new BYTE[buffer_size]; + metadata_buffer_size_ = buffer_size; + } + + GstFlowReturn + GetFrame (ID3D11Texture2D ** texture, UINT * move_count, UINT * dirty_count, + DXGI_OUTDUPL_FRAME_INFO * frame_info, bool* timeout) + { + ComPtr<IDXGIResource> resource; + ComPtr<ID3D11Texture2D> acquired_texture; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device_); + + /* Get new frame */ + HRESULT hr = dupl_->AcquireNextFrame(0, frame_info, &resource); + if (hr == DXGI_ERROR_WAIT_TIMEOUT) { + GST_TRACE ("Timeout"); + + *timeout 
= true; + return GST_FLOW_OK; + } + + *timeout = false; + *move_count = 0; + *dirty_count = 0; + + if (!gst_d3d11_result (hr, device_)) { + return gst_d3d11_screen_capture_return_from_hr(device_handle, hr, + FrameInfoExpectedErrors); + } + + GST_TRACE ( + "LastPresentTime: %" G_GINT64_FORMAT + ", LastMouseUpdateTime: %" G_GINT64_FORMAT + ", AccumulatedFrames: %d" + ", RectsCoalesced: %d" + ", ProtectedContentMaskedOut: %d" + ", PointerPosition: (%ldx%ld, visible %d)" + ", TotalMetadataBufferSize: %d" + ", PointerShapeBufferSize: %d", + frame_info->LastPresentTime.QuadPart, + frame_info->LastMouseUpdateTime.QuadPart, + frame_info->AccumulatedFrames, + frame_info->RectsCoalesced, + frame_info->ProtectedContentMaskedOut, + frame_info->PointerPosition.Position.x, + frame_info->PointerPosition.Position.y, + frame_info->PointerPosition.Visible, + frame_info->TotalMetadataBufferSize, + frame_info->PointerShapeBufferSize); + + hr = resource.As (&acquired_texture); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Failed to get ID3D11Texture2D interface from IDXGIResource " + "hr 0x%x", (guint) hr); + return GST_FLOW_ERROR; + } + + /* Get metadata */ + if (frame_info->TotalMetadataBufferSize) { + UINT buf_size = frame_info->TotalMetadataBufferSize; + + MaybeReallocMetadataBuffer (buf_size); + + /* Get move rectangles */ + hr = dupl_->GetFrameMoveRects(buf_size, + (DXGI_OUTDUPL_MOVE_RECT *) metadata_buffer_, &buf_size); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't get move rect, hr 0x%x", (guint) hr); + + return gst_d3d11_screen_capture_return_from_hr(nullptr, hr, + FrameInfoExpectedErrors); + } + + *move_count = buf_size / sizeof(DXGI_OUTDUPL_MOVE_RECT); + + GST_TRACE ("MoveRects count %d", *move_count); +#ifndef GST_DISABLE_GST_DEBUG + { + DXGI_OUTDUPL_MOVE_RECT *rects = + (DXGI_OUTDUPL_MOVE_RECT *) metadata_buffer_; + for (guint i = 0; i < *move_count; i++) { + GST_TRACE ("MoveRect[%d] SourcePoint: %ldx%ld, " + "DestinationRect 
(left:top:right:bottom): %ldx%ldx%ldx%ld", + i, rects->SourcePoint.x, rects->SourcePoint.y, + rects->DestinationRect.left, rects->DestinationRect.top, + rects->DestinationRect.right, rects->DestinationRect.bottom); + } + } +#endif + + BYTE* dirty_rects = metadata_buffer_ + buf_size; + buf_size = frame_info->TotalMetadataBufferSize - buf_size; + + /* Get dirty rectangles */ + hr = dupl_->GetFrameDirtyRects(buf_size, (RECT *) dirty_rects, &buf_size); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't get dirty rect, hr 0x%x", (guint) hr); + *move_count = 0; + *dirty_count = 0; + + return gst_d3d11_screen_capture_return_from_hr(nullptr, + hr, FrameInfoExpectedErrors); + } + + *dirty_count = buf_size / sizeof(RECT); + + GST_TRACE ("DirtyRects count %d", *dirty_count); +#ifndef GST_DISABLE_GST_DEBUG + { + RECT *rects = (RECT *) dirty_rects; + for (guint i = 0; i < *dirty_count; i++) { + GST_TRACE ("DirtyRect[%d] left:top:right:bottom: %ldx%ldx%ldx%ld", + i, rects->left, rects->top, rects->right, rects->bottom); + } + } +#endif + } + + *texture = acquired_texture.Detach(); + + return GST_FLOW_OK; + } + + void + SetMoveRect (RECT* SrcRect, RECT* DestRect, DXGI_OUTDUPL_DESC* DeskDesc, + DXGI_OUTDUPL_MOVE_RECT* MoveRect, INT TexWidth, INT TexHeight) + { + switch (DeskDesc->Rotation) + { + case DXGI_MODE_ROTATION_UNSPECIFIED: + case DXGI_MODE_ROTATION_IDENTITY: + SrcRect->left = MoveRect->SourcePoint.x; + SrcRect->top = MoveRect->SourcePoint.y; + SrcRect->right = MoveRect->SourcePoint.x + + MoveRect->DestinationRect.right - MoveRect->DestinationRect.left; + SrcRect->bottom = MoveRect->SourcePoint.y + + MoveRect->DestinationRect.bottom - MoveRect->DestinationRect.top; + + *DestRect = MoveRect->DestinationRect; + break; + case DXGI_MODE_ROTATION_ROTATE90: + SrcRect->left = TexHeight - (MoveRect->SourcePoint.y + + MoveRect->DestinationRect.bottom - MoveRect->DestinationRect.top); + SrcRect->top = MoveRect->SourcePoint.x; + SrcRect->right = TexHeight - 
MoveRect->SourcePoint.y; + SrcRect->bottom = MoveRect->SourcePoint.x + + MoveRect->DestinationRect.right - MoveRect->DestinationRect.left; + + DestRect->left = TexHeight - MoveRect->DestinationRect.bottom; + DestRect->top = MoveRect->DestinationRect.left; + DestRect->right = TexHeight - MoveRect->DestinationRect.top; + DestRect->bottom = MoveRect->DestinationRect.right; + break; + case DXGI_MODE_ROTATION_ROTATE180: + SrcRect->left = TexWidth - (MoveRect->SourcePoint.x + + MoveRect->DestinationRect.right - MoveRect->DestinationRect.left); + SrcRect->top = TexHeight - (MoveRect->SourcePoint.y + + MoveRect->DestinationRect.bottom - MoveRect->DestinationRect.top); + SrcRect->right = TexWidth - MoveRect->SourcePoint.x; + SrcRect->bottom = TexHeight - MoveRect->SourcePoint.y; + + DestRect->left = TexWidth - MoveRect->DestinationRect.right; + DestRect->top = TexHeight - MoveRect->DestinationRect.bottom; + DestRect->right = TexWidth - MoveRect->DestinationRect.left; + DestRect->bottom = TexHeight - MoveRect->DestinationRect.top; + break; + case DXGI_MODE_ROTATION_ROTATE270: + SrcRect->left = MoveRect->SourcePoint.x; + SrcRect->top = TexWidth - (MoveRect->SourcePoint.x + + MoveRect->DestinationRect.right - MoveRect->DestinationRect.left); + SrcRect->right = MoveRect->SourcePoint.y + + MoveRect->DestinationRect.bottom - MoveRect->DestinationRect.top; + SrcRect->bottom = TexWidth - MoveRect->SourcePoint.x; + + DestRect->left = MoveRect->DestinationRect.top; + DestRect->top = TexWidth - MoveRect->DestinationRect.right; + DestRect->right = MoveRect->DestinationRect.bottom; + DestRect->bottom = TexWidth - MoveRect->DestinationRect.left; + break; + default: + memset (DestRect, 0, sizeof (RECT)); + memset (SrcRect, 0, sizeof (RECT)); + break; + } + } + + GstFlowReturn + CopyMove (ID3D11Texture2D* SharedSurf, DXGI_OUTDUPL_MOVE_RECT* MoveBuffer, + UINT MoveCount, DXGI_OUTDUPL_DESC* DeskDesc) + { + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device_); + 
ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (device_); + D3D11_TEXTURE2D_DESC FullDesc; + SharedSurf->GetDesc(&FullDesc); + + GST_TRACE ("Copying MoveRects (count %d)", MoveCount); + + /* Make new intermediate surface to copy into for moving */ + if (!move_texture_) { + D3D11_TEXTURE2D_DESC MoveDesc; + MoveDesc = FullDesc; + MoveDesc.BindFlags = D3D11_BIND_RENDER_TARGET; + MoveDesc.MiscFlags = 0; + HRESULT hr = device_handle->CreateTexture2D(&MoveDesc, + nullptr, &move_texture_); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't create intermediate texture, hr 0x%x", (guint) hr); + return GST_FLOW_ERROR; + } + } + + for (UINT i = 0; i < MoveCount; i++) { + RECT SrcRect; + RECT DestRect; + + SetMoveRect(&SrcRect, &DestRect, DeskDesc, &MoveBuffer[i], + FullDesc.Width, FullDesc.Height); + + /* Copy rect out of shared surface */ + D3D11_BOX Box; + Box.left = SrcRect.left; + Box.top = SrcRect.top; + Box.front = 0; + Box.right = SrcRect.right; + Box.bottom = SrcRect.bottom; + Box.back = 1; + device_context->CopySubresourceRegion(move_texture_.Get(), + 0, SrcRect.left, SrcRect.top, 0, SharedSurf, 0, &Box); + + /* Copy back to shared surface */ + device_context->CopySubresourceRegion(SharedSurf, + 0, DestRect.left, DestRect.top, 0, move_texture_.Get(), 0, &Box); + } + + return GST_FLOW_OK; + } + + void + SetDirtyVert (VERTEX* Vertices, RECT* Dirty, + DXGI_OUTDUPL_DESC* DeskDesc, D3D11_TEXTURE2D_DESC* FullDesc, + D3D11_TEXTURE2D_DESC* ThisDesc) + { + INT CenterX = FullDesc->Width / 2; + INT CenterY = FullDesc->Height / 2; + + INT Width = FullDesc->Width; + INT Height = FullDesc->Height; + + /* Rotation compensated destination rect */ + RECT DestDirty = *Dirty; + + /* Set appropriate coordinates compensated for rotation */ + switch (DeskDesc->Rotation) + { + case DXGI_MODE_ROTATION_ROTATE90: + DestDirty.left = Width - Dirty->bottom; + DestDirty.top = Dirty->left; + DestDirty.right = Width - Dirty->top; + DestDirty.bottom = 
Dirty->right; + + Vertices[0].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[1].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[2].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[5].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + break; + case DXGI_MODE_ROTATION_ROTATE180: + DestDirty.left = Width - Dirty->right; + DestDirty.top = Height - Dirty->bottom; + DestDirty.right = Width - Dirty->left; + DestDirty.bottom = Height - Dirty->top; + + Vertices[0].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[1].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[2].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[5].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + break; + case DXGI_MODE_ROTATION_ROTATE270: + DestDirty.left = Dirty->top; + DestDirty.top = Height - Dirty->right; + DestDirty.right = Dirty->bottom; + DestDirty.bottom = Height - Dirty->left; + + Vertices[0].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[1].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[2].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); 
+ Vertices[5].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + break; + case DXGI_MODE_ROTATION_UNSPECIFIED: + case DXGI_MODE_ROTATION_IDENTITY: + default: + Vertices[0].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[1].TexCoord = + MyFLOAT2(Dirty->left / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[2].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->bottom / static_cast<FLOAT>(ThisDesc->Height)); + Vertices[5].TexCoord = + MyFLOAT2(Dirty->right / static_cast<FLOAT>(ThisDesc->Width), + Dirty->top / static_cast<FLOAT>(ThisDesc->Height)); + break; + } + + /* Set positions */ + Vertices[0].Pos = + MyFLOAT3( + (DestDirty.left - CenterX) / static_cast<FLOAT>(CenterX), + -1 * (DestDirty.bottom - CenterY) / static_cast<FLOAT>(CenterY), + 0.0f); + Vertices[1].Pos = + MyFLOAT3( + (DestDirty.left - CenterX) / static_cast<FLOAT>(CenterX), + -1 * (DestDirty.top - CenterY) / static_cast<FLOAT>(CenterY), + 0.0f); + Vertices[2].Pos = + MyFLOAT3( + (DestDirty.right - CenterX) / static_cast<FLOAT>(CenterX), + -1 * (DestDirty.bottom - CenterY) / static_cast<FLOAT>(CenterY), + 0.0f); + Vertices[3].Pos = Vertices[2].Pos; + Vertices[4].Pos = Vertices[1].Pos; + Vertices[5].Pos = + MyFLOAT3( + (DestDirty.right - CenterX) / static_cast<FLOAT>(CenterX), + -1 * (DestDirty.top - CenterY) / static_cast<FLOAT>(CenterY), + 0.0f); + + Vertices[3].TexCoord = Vertices[2].TexCoord; + Vertices[4].TexCoord = Vertices[1].TexCoord; + } + + void + MaybeReallocVertexBuffer (UINT buffer_size) + { + if (buffer_size <= vertex_buffer_size_) + return; + + if (vertex_buffer_) + delete[] vertex_buffer_; + + vertex_buffer_ = new BYTE[buffer_size]; + vertex_buffer_size_ = buffer_size; + } + + GstFlowReturn + CopyDirty (ID3D11Texture2D* SrcSurface, 
ID3D11Texture2D* SharedSurf, + RECT* DirtyBuffer, UINT DirtyCount, DXGI_OUTDUPL_DESC* DeskDesc) + { + D3D11_TEXTURE2D_DESC FullDesc; + D3D11_TEXTURE2D_DESC ThisDesc; + ComPtr<ID3D11ShaderResourceView> ShaderResource; + ComPtr<ID3D11Buffer> VertBuf; + HRESULT hr; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device_); + ID3D11DeviceContext *device_context = + gst_d3d11_device_get_device_context_handle (device_); + + GST_TRACE ("Copying DirtyRects (count %d)", DirtyCount); + + SharedSurf->GetDesc(&FullDesc); + SrcSurface->GetDesc(&ThisDesc); + + if (!rtv_) { + hr = device_handle->CreateRenderTargetView(SharedSurf, nullptr, &rtv_); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't create render target view, hr 0x%x", (guint) hr); + return GST_FLOW_ERROR; + } + } + + D3D11_SHADER_RESOURCE_VIEW_DESC ShaderDesc; + ShaderDesc.Format = ThisDesc.Format; + ShaderDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; + ShaderDesc.Texture2D.MostDetailedMip = ThisDesc.MipLevels - 1; + ShaderDesc.Texture2D.MipLevels = ThisDesc.MipLevels; + + /* Create new shader resource view */ + hr = device_handle->CreateShaderResourceView(SrcSurface, + &ShaderDesc, &ShaderResource); + if (!gst_d3d11_result (hr, device_)) { + return gst_d3d11_screen_capture_return_from_hr(device_handle, hr, + SystemTransitionsExpectedErrors); + } + + ID3D11SamplerState *samplers = sampler_.Get(); + ID3D11ShaderResourceView *srv = ShaderResource.Get(); + ID3D11RenderTargetView *rtv = rtv_.Get(); + device_context->OMSetBlendState(nullptr, nullptr, 0xFFFFFFFF); + device_context->OMSetRenderTargets(1, &rtv, nullptr); + device_context->VSSetShader(vs_.Get(), nullptr, 0); + device_context->PSSetShader(ps_.Get(), nullptr, 0); + device_context->PSSetShaderResources(0, 1, &srv); + device_context->PSSetSamplers(0, 1, &samplers); + device_context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST); + device_context->IASetInputLayout(layout_.Get()); + + /* Create space for 
vertices for the dirty rects if the current space isn't + * large enough */ + UINT byte_needed = sizeof(VERTEX) * NUMVERTICES * DirtyCount; + MaybeReallocVertexBuffer (byte_needed); + + /* Fill them in */ + VERTEX* DirtyVertex = (VERTEX *) vertex_buffer_; + for (UINT i = 0; i < DirtyCount; ++i, DirtyVertex += NUMVERTICES) { + SetDirtyVert(DirtyVertex, &DirtyBuffer[i], DeskDesc, + &FullDesc, &ThisDesc); + } + + /* Create vertex buffer */ + D3D11_BUFFER_DESC BufferDesc; + memset (&BufferDesc, 0, sizeof (D3D11_BUFFER_DESC)); + BufferDesc.Usage = D3D11_USAGE_DEFAULT; + BufferDesc.ByteWidth = byte_needed; + BufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER; + BufferDesc.CPUAccessFlags = 0; + D3D11_SUBRESOURCE_DATA InitData; + memset (&InitData, 0, sizeof (D3D11_SUBRESOURCE_DATA)); + InitData.pSysMem = vertex_buffer_; + + hr = device_handle->CreateBuffer(&BufferDesc, &InitData, &VertBuf); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Failed to create vertex buffer"); + return GST_FLOW_ERROR; + } + + UINT Stride = sizeof(VERTEX); + UINT Offset = 0; + ID3D11Buffer *vert_buf = VertBuf.Get(); + device_context->IASetVertexBuffers(0, 1, &vert_buf, &Stride, &Offset); + + D3D11_VIEWPORT VP; + VP.Width = static_cast<FLOAT>(FullDesc.Width); + VP.Height = static_cast<FLOAT>(FullDesc.Height); + VP.MinDepth = 0.0f; + VP.MaxDepth = 1.0f; + VP.TopLeftX = 0.0f; + VP.TopLeftY = 0.0f; + device_context->RSSetViewports(1, &VP); + + device_context->Draw(NUMVERTICES * DirtyCount, 0); + + /* Unbind srv and rtv from context */ + srv = nullptr; + device_context->PSSetShaderResources (0, 1, &srv); + device_context->OMSetRenderTargets (0, nullptr, nullptr); + + return GST_FLOW_OK; + } + + GstFlowReturn + ProcessFrame(ID3D11Texture2D * acquired_texture, ID3D11Texture2D* SharedSurf, + DXGI_OUTDUPL_DESC* DeskDesc, UINT move_count, UINT dirty_count, + DXGI_OUTDUPL_FRAME_INFO * frame_info) + { + GstFlowReturn ret = GST_FLOW_OK; + + GST_TRACE ("Processing frame"); + + /* Process dirties and moves 
*/ + if (frame_info->TotalMetadataBufferSize) { + if (move_count) { + ret = CopyMove(SharedSurf, (DXGI_OUTDUPL_MOVE_RECT *) metadata_buffer_, + move_count, DeskDesc); + + if (ret != GST_FLOW_OK) + return ret; + } + + if (dirty_count) { + ret = CopyDirty(acquired_texture, SharedSurf, + (RECT *)(metadata_buffer_ + + (move_count * sizeof(DXGI_OUTDUPL_MOVE_RECT))), + dirty_count, DeskDesc); + } + } else { + GST_TRACE ("No metadata"); + } + + return ret; + } + + /* To draw mouse */ + bool + ProcessMonoMask (bool IsMono, PTR_INFO* PtrInfo, INT* PtrWidth, + INT* PtrHeight, INT* PtrLeft, INT* PtrTop, BYTE** InitBuffer, + D3D11_BOX* Box) + { + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device_); + ID3D11DeviceContext *context_handle = + gst_d3d11_device_get_device_context_handle (device_); + + D3D11_TEXTURE2D_DESC FullDesc; + shared_texture_->GetDesc(&FullDesc); + INT DesktopWidth = FullDesc.Width; + INT DesktopHeight = FullDesc.Height; + + // Pointer position + INT GivenLeft = PtrInfo->Position.x; + INT GivenTop = PtrInfo->Position.y; + + // Figure out if any adjustment is needed for out of bound positions + if (GivenLeft < 0) { + *PtrWidth = GivenLeft + static_cast<INT>(PtrInfo->shape_info.Width); + } else if ((GivenLeft + static_cast<INT>(PtrInfo->shape_info.Width)) > DesktopWidth) { + *PtrWidth = DesktopWidth - GivenLeft; + } else { + *PtrWidth = static_cast<INT>(PtrInfo->shape_info.Width); + } + + if (IsMono) + PtrInfo->shape_info.Height = PtrInfo->shape_info.Height / 2; + + if (GivenTop < 0) { + *PtrHeight = GivenTop + static_cast<INT>(PtrInfo->shape_info.Height); + } else if ((GivenTop + static_cast<INT>(PtrInfo->shape_info.Height)) > DesktopHeight) { + *PtrHeight = DesktopHeight - GivenTop; + } else { + *PtrHeight = static_cast<INT>(PtrInfo->shape_info.Height); + } + + if (IsMono) + PtrInfo->shape_info.Height = PtrInfo->shape_info.Height * 2; + + *PtrLeft = (GivenLeft < 0) ? 0 : GivenLeft; + *PtrTop = (GivenTop < 0) ? 
0 : GivenTop; + + D3D11_TEXTURE2D_DESC CopyBufferDesc; + CopyBufferDesc.Width = *PtrWidth; + CopyBufferDesc.Height = *PtrHeight; + CopyBufferDesc.MipLevels = 1; + CopyBufferDesc.ArraySize = 1; + CopyBufferDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; + CopyBufferDesc.SampleDesc.Count = 1; + CopyBufferDesc.SampleDesc.Quality = 0; + CopyBufferDesc.Usage = D3D11_USAGE_STAGING; + CopyBufferDesc.BindFlags = 0; + CopyBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_READ; + CopyBufferDesc.MiscFlags = 0; + + ComPtr<ID3D11Texture2D> CopyBuffer; + HRESULT hr = device_handle->CreateTexture2D(&CopyBufferDesc, + nullptr, &CopyBuffer); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't create texture for mouse pointer"); + return false; + } + + Box->left = *PtrLeft; + Box->top = *PtrTop; + Box->right = *PtrLeft + *PtrWidth; + Box->bottom = *PtrTop + *PtrHeight; + context_handle->CopySubresourceRegion(CopyBuffer.Get(), + 0, 0, 0, 0, shared_texture_.Get(), 0, Box); + + ComPtr<IDXGISurface> CopySurface; + hr = CopyBuffer.As (&CopySurface); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't get DXGI resource from mouse texture"); + return false; + } + + DXGI_MAPPED_RECT MappedSurface; + hr = CopySurface->Map(&MappedSurface, DXGI_MAP_READ); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Couldn't map DXGI surface"); + return false; + } + + *InitBuffer = new BYTE[*PtrWidth * *PtrHeight * BPP]; + + UINT* InitBuffer32 = reinterpret_cast<UINT*>(*InitBuffer); + UINT* Desktop32 = reinterpret_cast<UINT*>(MappedSurface.pBits); + UINT DesktopPitchInPixels = MappedSurface.Pitch / sizeof(UINT); + + // What to skip (pixel offset) + UINT SkipX = (GivenLeft < 0) ? (-1 * GivenLeft) : (0); + UINT SkipY = (GivenTop < 0) ? 
(-1 * GivenTop) : (0); + + if (IsMono) { + for (INT Row = 0; Row < *PtrHeight; Row++) { + BYTE Mask = 0x80; + Mask = Mask >> (SkipX % 8); + for (INT Col = 0; Col < *PtrWidth; Col++) { + BYTE AndMask = PtrInfo->PtrShapeBuffer[((Col + SkipX) / 8) + + ((Row + SkipY) * (PtrInfo->shape_info.Pitch))] & Mask; + BYTE XorMask = PtrInfo->PtrShapeBuffer[((Col + SkipX) / 8) + + ((Row + SkipY + (PtrInfo->shape_info.Height / 2)) * + (PtrInfo->shape_info.Pitch))] & Mask; + UINT AndMask32 = (AndMask) ? 0xFFFFFFFF : 0xFF000000; + UINT XorMask32 = (XorMask) ? 0x00FFFFFF : 0x00000000; + + InitBuffer32[(Row * *PtrWidth) + Col] = + (Desktop32[(Row * DesktopPitchInPixels) + Col] & AndMask32) ^ XorMask32; + + if (Mask == 0x01) { + Mask = 0x80; + } else { + Mask = Mask >> 1; + } + } + } + } else { + UINT* Buffer32 = reinterpret_cast<UINT*>(PtrInfo->PtrShapeBuffer); + + for (INT Row = 0; Row < *PtrHeight; Row++) { + for (INT Col = 0; Col < *PtrWidth; ++Col) { + // Set up mask + UINT MaskVal = 0xFF000000 & Buffer32[(Col + SkipX) + ((Row + SkipY) * + (PtrInfo->shape_info.Pitch / sizeof(UINT)))]; + if (MaskVal) { + // Mask was 0xFF + InitBuffer32[(Row * *PtrWidth) + Col] = + (Desktop32[(Row * DesktopPitchInPixels) + Col] ^ + Buffer32[(Col + SkipX) + ((Row + SkipY) * + (PtrInfo->shape_info.Pitch / sizeof(UINT)))]) | 0xFF000000; + } else { + // Mask was 0x00 + InitBuffer32[(Row * *PtrWidth) + Col] = Buffer32[(Col + SkipX) + + ((Row + SkipY) * (PtrInfo->shape_info.Pitch / sizeof(UINT)))] | 0xFF000000; + } + } + } + } + + // Done with resource + hr = CopySurface->Unmap(); + if (!gst_d3d11_result (hr, device_)) { + GST_ERROR ("Failed to unmap DXGI surface"); + return false; + } + + return true; + } + +private: + PTR_INFO ptr_info_; + DXGI_OUTDUPL_DESC output_desc_; + GstD3D11Device * device_; + + ComPtr<ID3D11Texture2D> shared_texture_; + ComPtr<ID3D11RenderTargetView> rtv_; + ComPtr<ID3D11Texture2D> move_texture_; + ComPtr<ID3D11VertexShader> vs_; + ComPtr<ID3D11PixelShader> ps_; + 
ComPtr<ID3D11InputLayout> layout_; + ComPtr<ID3D11SamplerState> sampler_; + ComPtr<IDXGIOutputDuplication> dupl_; + ComPtr<ID3D11BlendState> blend_; + + /* frame metadata */ + BYTE *metadata_buffer_; + UINT metadata_buffer_size_; + + /* vertex buffers */ + BYTE *vertex_buffer_; + UINT vertex_buffer_size_; +}; +/* *INDENT-ON* */ + +enum +{ + PROP_0, + PROP_D3D11_DEVICE, + PROP_MONITOR_HANDLE, +}; + +#define DEFAULT_MONITOR_INDEX -1 + +struct _GstD3D11ScreenCapture +{ + GstObject parent; + + GstD3D11Device *device; + guint cached_width; + guint cached_height; + + D3D11DesktopDupObject *dupl_obj; + + HMONITOR monitor_handle; + RECT desktop_coordinates; + gboolean prepared; + + GRecMutex lock; +}; + +static void gst_d3d11_screen_capture_constructed (GObject * object); +static void gst_d3d11_screen_capture_dispose (GObject * object); +static void gst_d3d11_screen_capture_finalize (GObject * object); +static void gst_d3d11_screen_capture_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); + +#define gst_d3d11_screen_capture_parent_class parent_class +G_DEFINE_TYPE (GstD3D11ScreenCapture, gst_d3d11_screen_capture, + GST_TYPE_OBJECT); + +static void +gst_d3d11_screen_capture_class_init (GstD3D11ScreenCaptureClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->constructed = gst_d3d11_screen_capture_constructed; + gobject_class->dispose = gst_d3d11_screen_capture_dispose; + gobject_class->finalize = gst_d3d11_screen_capture_finalize; + gobject_class->set_property = gst_d3d11_screen_capture_set_property; + + g_object_class_install_property (gobject_class, PROP_D3D11_DEVICE, + g_param_spec_object ("d3d11device", "D3D11 Device", + "GstD3D11Device object for operating", + GST_TYPE_D3D11_DEVICE, (GParamFlags) + (G_PARAM_WRITABLE | G_PARAM_CONSTRUCT_ONLY | + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_MONITOR_HANDLE, + g_param_spec_pointer ("monitor-handle", 
"Monitor Handle", + "A HMONITOR handle of monitor to capture", (GParamFlags) + (G_PARAM_WRITABLE | G_PARAM_CONSTRUCT_ONLY | + G_PARAM_STATIC_STRINGS))); +} + +static void +gst_d3d11_screen_capture_init (GstD3D11ScreenCapture * self) +{ + g_rec_mutex_init (&self->lock); + + memset (&self->desktop_coordinates, 0, sizeof (RECT)); +} + +static void +gst_d3d11_screen_capture_constructed (GObject * object) +{ + GstD3D11ScreenCapture *self = GST_D3D11_SCREEN_CAPTURE (object); + /* *INDENT-OFF* */ + ComPtr<IDXGIDevice> dxgi_device; + ComPtr<IDXGIAdapter1> adapter; + ComPtr<IDXGIOutput> output; + ComPtr<IDXGIOutput1> output1; + /* *INDENT-ON* */ + HRESULT hr; + gboolean ret = FALSE; + DXGI_OUTPUT_DESC output_desc; + DXGI_ADAPTER_DESC adapter_desc; + gint64 luid, device_luid; + + if (!self->device) { + GST_WARNING_OBJECT (self, "D3D11 device is unavailable"); + goto out; + } + + if (!self->monitor_handle) { + GST_WARNING_OBJECT (self, "Null monitor handle"); + goto out; + } + + hr = gst_d3d11_screen_capture_find_output_for_monitor (self->monitor_handle, + &adapter, &output); + if (!gst_d3d11_result (hr, self->device)) { + GST_WARNING_OBJECT (self, + "Failed to find associated adapter for monitor %p", + self->monitor_handle); + goto out; + } + + hr = output.As (&output1); + if (!gst_d3d11_result (hr, self->device)) { + GST_WARNING_OBJECT (self, "IDXGIOutput1 interface is unavailable"); + goto out; + } + + hr = adapter->GetDesc (&adapter_desc); + if (!gst_d3d11_result (hr, self->device)) { + GST_WARNING_OBJECT (self, "Failed to get adapter desc"); + goto out; + } + + luid = gst_d3d11_luid_to_int64 (&adapter_desc.AdapterLuid); + g_object_get (self->device, "adapter-luid", &device_luid, nullptr); + if (luid != device_luid) { + GST_WARNING_OBJECT (self, "Incompatible d3d11 device"); + goto out; + } + + hr = output->GetDesc (&output_desc); + if (!gst_d3d11_result (hr, self->device)) { + GST_WARNING_OBJECT (self, "Failed to get output desc"); + goto out; + } + + /* 
DesktopCoordinates will not report actual texture size in case that + * application is running without dpi-awareness. To get actual monitor size, + * we need to use Win32 API... */ + MONITORINFOEXW monitor_info; + DEVMODEW dev_mode; + + monitor_info.cbSize = sizeof (MONITORINFOEXW); + if (!GetMonitorInfoW (output_desc.Monitor, (LPMONITORINFO) & monitor_info)) { + GST_WARNING_OBJECT (self, "Couldn't get monitor info"); + goto out; + } + + dev_mode.dmSize = sizeof (DEVMODEW); + dev_mode.dmDriverExtra = sizeof (POINTL); + dev_mode.dmFields = DM_POSITION; + if (!EnumDisplaySettingsW + (monitor_info.szDevice, ENUM_CURRENT_SETTINGS, &dev_mode)) { + GST_WARNING_OBJECT (self, "Couldn't enumerate display settings"); + goto out; + } + + self->desktop_coordinates.left = dev_mode.dmPosition.x; + self->desktop_coordinates.top = dev_mode.dmPosition.y; + self->desktop_coordinates.right = + dev_mode.dmPosition.x + dev_mode.dmPelsWidth; + self->desktop_coordinates.bottom = + dev_mode.dmPosition.y + dev_mode.dmPelsHeight; + + self->cached_width = + self->desktop_coordinates.right - self->desktop_coordinates.left; + self->cached_height = + self->desktop_coordinates.bottom - self->desktop_coordinates.top; + + GST_DEBUG_OBJECT (self, + "Desktop coordinates left:top:right:bottom = %ld:%ld:%ld:%ld (%dx%d)", + self->desktop_coordinates.left, self->desktop_coordinates.top, + self->desktop_coordinates.right, self->desktop_coordinates.bottom, + self->cached_width, self->cached_height); + + ret = TRUE; + +out: + if (!ret) + gst_clear_object (&self->device); + + G_OBJECT_CLASS (parent_class)->constructed (object); +} + +static void +gst_d3d11_screen_capture_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11ScreenCapture *self = GST_D3D11_SCREEN_CAPTURE (object); + + switch (prop_id) { + case PROP_D3D11_DEVICE: + self->device = (GstD3D11Device *) g_value_dup_object (value); + break; + case PROP_MONITOR_HANDLE: + self->monitor_handle = 
(HMONITOR) g_value_get_pointer (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_screen_capture_dispose (GObject * object) +{ + GstD3D11ScreenCapture *self = GST_D3D11_SCREEN_CAPTURE (object); + + if (self->dupl_obj) { + delete self->dupl_obj; + self->dupl_obj = nullptr; + } + + gst_clear_object (&self->device); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_d3d11_screen_capture_finalize (GObject * object) +{ + GstD3D11ScreenCapture *self = GST_D3D11_SCREEN_CAPTURE (object); + + g_rec_mutex_clear (&self->lock); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_screen_capture_weak_ref_notify (gpointer data, + GstD3D11ScreenCapture * dupl) +{ + G_LOCK (dupl_list_lock); + dupl_list = g_list_remove (dupl_list, dupl); + G_UNLOCK (dupl_list_lock); +} + +GstD3D11ScreenCapture * +gst_d3d11_screen_capture_new (GstD3D11Device * device, HMONITOR monitor_handle) +{ + GstD3D11ScreenCapture *self = nullptr; + GList *iter; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), nullptr); + + /* Check if we have dup object corresponding to monitor_handle, + * and if there is already configured capture object, reuse it. + * This is because of the limitation of desktop duplication API + * (i.e., in a process, only one duplication object can exist). 
+ * See also + * https://docs.microsoft.com/en-us/windows/win32/api/dxgi1_2/nf-dxgi1_2-idxgioutput1-duplicateoutput#remarks + */ + G_LOCK (dupl_list_lock); + for (iter = dupl_list; iter; iter = g_list_next (iter)) { + GstD3D11ScreenCapture *dupl = (GstD3D11ScreenCapture *) iter->data; + + if (dupl->monitor_handle == monitor_handle) { + GST_DEBUG ("Found configured desktop dup object for monitor handle %p", + monitor_handle); + self = (GstD3D11ScreenCapture *) gst_object_ref (dupl); + break; + } + } + + if (self) { + G_UNLOCK (dupl_list_lock); + return self; + } + + self = (GstD3D11ScreenCapture *) g_object_new (GST_TYPE_D3D11_SCREEN_CAPTURE, + "d3d11device", device, "monitor-handle", monitor_handle, nullptr); + + if (!self->device) { + GST_WARNING_OBJECT (self, "Couldn't configure desktop dup object"); + gst_object_unref (self); + G_UNLOCK (dupl_list_lock); + + return nullptr; + } + + g_object_weak_ref (G_OBJECT (self), + (GWeakNotify) gst_d3d11_screen_capture_weak_ref_notify, nullptr); + dupl_list = g_list_append (dupl_list, self); + + G_UNLOCK (dupl_list_lock); + + return self; +} + +GstFlowReturn +gst_d3d11_screen_capture_prepare (GstD3D11ScreenCapture * capture) +{ + GstFlowReturn ret; + + g_return_val_if_fail (GST_IS_D3D11_SCREEN_CAPTURE (capture), GST_FLOW_ERROR); + g_return_val_if_fail (capture->device != nullptr, GST_FLOW_ERROR); + + g_rec_mutex_lock (&capture->lock); + if (capture->prepared) { + GST_DEBUG_OBJECT (capture, "Already prepared"); + g_rec_mutex_unlock (&capture->lock); + return GST_FLOW_OK; + } + + capture->dupl_obj = new D3D11DesktopDupObject (); + ret = capture->dupl_obj->Init (capture->device, capture->monitor_handle); + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (capture, + "Couldn't prepare capturing, %sexpected failure", + ret == GST_D3D11_SCREEN_CAPTURE_FLOW_EXPECTED_ERROR ? 
"" : "un"); + + delete capture->dupl_obj; + capture->dupl_obj = nullptr; + g_rec_mutex_unlock (&capture->lock); + + return ret; + } + + capture->prepared = TRUE; + g_rec_mutex_unlock (&capture->lock); + + return GST_FLOW_OK; +} + +gboolean +gst_d3d11_screen_capture_get_size (GstD3D11ScreenCapture * capture, + guint * width, guint * height) +{ + g_return_val_if_fail (GST_IS_D3D11_SCREEN_CAPTURE (capture), FALSE); + g_return_val_if_fail (width != nullptr, FALSE); + g_return_val_if_fail (height != nullptr, FALSE); + + g_rec_mutex_lock (&capture->lock); + *width = 0; + *height = 0; + + if (capture->dupl_obj) { + capture->dupl_obj->GetSize (&capture->cached_width, + &capture->cached_height); + } + + *width = capture->cached_width; + *height = capture->cached_height; + g_rec_mutex_unlock (&capture->lock); + + return TRUE; +} + +GstFlowReturn +gst_d3d11_screen_capture_do_capture (GstD3D11ScreenCapture * capture, + ID3D11Texture2D * texture, ID3D11RenderTargetView * rtv, + gboolean draw_mouse) +{ + GstFlowReturn ret = GST_FLOW_OK; + D3D11_TEXTURE2D_DESC desc; + guint width, height; + + g_return_val_if_fail (GST_IS_D3D11_SCREEN_CAPTURE (capture), GST_FLOW_ERROR); + g_return_val_if_fail (texture != nullptr, GST_FLOW_ERROR); + + g_rec_mutex_lock (&capture->lock); + if (!capture->prepared) + ret = gst_d3d11_screen_capture_prepare (capture); + + if (ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (capture, "We are not prepared"); + g_rec_mutex_unlock (&capture->lock); + return ret; + } + + gst_d3d11_screen_capture_get_size (capture, &width, &height); + + texture->GetDesc (&desc); + if (desc.Width != width || desc.Height != height) { + GST_INFO_OBJECT (capture, + "Different texture size, ours: %dx%d, external: %dx%d", + width, height, desc.Width, desc.Height); + g_rec_mutex_unlock (&capture->lock); + + return GST_D3D11_SCREEN_CAPTURE_FLOW_SIZE_CHANGED; + } + + gst_d3d11_device_lock (capture->device); + ret = capture->dupl_obj->Capture (draw_mouse); + if (ret != GST_FLOW_OK) { + 
gst_d3d11_device_unlock (capture->device); + + delete capture->dupl_obj; + capture->dupl_obj = nullptr; + capture->prepared = FALSE; + + if (ret == GST_D3D11_SCREEN_CAPTURE_FLOW_EXPECTED_ERROR) { + GST_WARNING_OBJECT (capture, + "Couldn't capture frame, but expected failure"); + } else { + GST_ERROR_OBJECT (capture, "Unexpected failure during capture"); + } + + g_rec_mutex_unlock (&capture->lock); + return ret; + } + + GST_LOG_OBJECT (capture, "Capture done"); + + capture->dupl_obj->CopyToTexture (texture); + if (draw_mouse) + capture->dupl_obj->DrawMouse (rtv); + gst_d3d11_device_unlock (capture->device); + g_rec_mutex_unlock (&capture->lock); + + return GST_FLOW_OK; +} + +HRESULT +gst_d3d11_screen_capture_find_output_for_monitor (HMONITOR monitor, + IDXGIAdapter1 ** adapter, IDXGIOutput ** output) +{ + ComPtr < IDXGIFactory1 > factory; + HRESULT hr = S_OK; + + g_return_val_if_fail (monitor != nullptr, E_INVALIDARG); + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (FAILED (hr)) + return hr; + + for (UINT adapter_idx = 0;; adapter_idx++) { + ComPtr < IDXGIAdapter1 > adapter_tmp; + + hr = factory->EnumAdapters1 (adapter_idx, &adapter_tmp); + if (FAILED (hr)) + break; + + for (UINT output_idx = 0;; output_idx++) { + ComPtr < IDXGIOutput > output_tmp; + DXGI_OUTPUT_DESC desc; + + hr = adapter_tmp->EnumOutputs (output_idx, &output_tmp); + if (FAILED (hr)) + break; + + hr = output_tmp->GetDesc (&desc); + if (FAILED (hr)) + continue; + + if (desc.Monitor == monitor) { + if (adapter) + *adapter = adapter_tmp.Detach (); + if (output) + *output = output_tmp.Detach (); + + return S_OK; + } + } + } + + return E_FAIL; +} + +HRESULT +gst_d3d11_screen_capture_find_primary_monitor (HMONITOR * monitor, + IDXGIAdapter1 ** adapter, IDXGIOutput ** output) +{ + ComPtr < IDXGIFactory1 > factory; + HRESULT hr = S_OK; + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (FAILED (hr)) + return hr; + + for (UINT adapter_idx = 0;; adapter_idx++) { + ComPtr < IDXGIAdapter1 
> adapter_tmp; + + hr = factory->EnumAdapters1 (adapter_idx, &adapter_tmp); + if (FAILED (hr)) + break; + + for (UINT output_idx = 0;; output_idx++) { + ComPtr < IDXGIOutput > output_tmp; + DXGI_OUTPUT_DESC desc; + MONITORINFOEXW minfo; + + hr = adapter_tmp->EnumOutputs (output_idx, &output_tmp); + if (FAILED (hr)) + break; + + hr = output_tmp->GetDesc (&desc); + if (FAILED (hr)) + continue; + + minfo.cbSize = sizeof (MONITORINFOEXW); + if (!GetMonitorInfoW (desc.Monitor, &minfo)) + continue; + + if ((minfo.dwFlags & MONITORINFOF_PRIMARY) != 0) { + if (monitor) + *monitor = desc.Monitor; + if (adapter) + *adapter = adapter_tmp.Detach (); + if (output) + *output = output_tmp.Detach (); + + return S_OK; + } + } + } + + return E_FAIL; +} + +HRESULT +gst_d3d11_screen_capture_find_nth_monitor (guint index, HMONITOR * monitor, + IDXGIAdapter1 ** adapter, IDXGIOutput ** output) +{ + ComPtr < IDXGIFactory1 > factory; + HRESULT hr = S_OK; + guint num_found = 0; + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (FAILED (hr)) + return hr; + + for (UINT adapter_idx = 0;; adapter_idx++) { + ComPtr < IDXGIAdapter1 > adapter_tmp; + + hr = factory->EnumAdapters1 (adapter_idx, &adapter_tmp); + if (FAILED (hr)) + break; + + for (UINT output_idx = 0;; output_idx++) { + ComPtr < IDXGIOutput > output_tmp; + DXGI_OUTPUT_DESC desc; + MONITORINFOEXW minfo; + + hr = adapter_tmp->EnumOutputs (output_idx, &output_tmp); + if (FAILED (hr)) + break; + + hr = output_tmp->GetDesc (&desc); + if (FAILED (hr)) + continue; + + minfo.cbSize = sizeof (MONITORINFOEXW); + if (!GetMonitorInfoW (desc.Monitor, &minfo)) + continue; + + if (num_found == index) { + if (monitor) + *monitor = desc.Monitor; + if (adapter) + *adapter = adapter_tmp.Detach (); + if (output) + *output = output_tmp.Detach (); + + return S_OK; + } + + num_found++; + } + } + + return E_FAIL; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11screencapture.h
Added
@@ -0,0 +1,65 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/d3d11/gstd3d11.h> + +G_BEGIN_DECLS + +#define GST_D3D11_SCREEN_CAPTURE_FLOW_EXPECTED_ERROR GST_FLOW_CUSTOM_SUCCESS +#define GST_D3D11_SCREEN_CAPTURE_FLOW_SIZE_CHANGED GST_FLOW_CUSTOM_SUCCESS_1 +#define GST_D3D11_SCREEN_CAPTURE_FLOW_UNSUPPORTED GST_FLOW_CUSTOM_ERROR + +#define GST_TYPE_D3D11_SCREEN_CAPTURE (gst_d3d11_screen_capture_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11ScreenCapture, gst_d3d11_screen_capture, + GST, D3D11_SCREEN_CAPTURE, GstObject); + +GstD3D11ScreenCapture * gst_d3d11_screen_capture_new (GstD3D11Device * device, + HMONITOR monitor_handle); + +GstFlowReturn gst_d3d11_screen_capture_prepare (GstD3D11ScreenCapture * capture); + +gboolean gst_d3d11_screen_capture_get_size (GstD3D11ScreenCapture * capture, + guint * width, + guint * height); + +GstFlowReturn gst_d3d11_screen_capture_do_capture (GstD3D11ScreenCapture * capture, + ID3D11Texture2D * texture, + ID3D11RenderTargetView *rtv, + gboolean draw_mouse); + +HRESULT gst_d3d11_screen_capture_find_output_for_monitor (HMONITOR monitor, + IDXGIAdapter1 ** adapter, + 
IDXGIOutput ** output); + +HRESULT gst_d3d11_screen_capture_find_primary_monitor (HMONITOR * monitor, + IDXGIAdapter1 ** adapter, + IDXGIOutput ** output); + +HRESULT gst_d3d11_screen_capture_find_nth_monitor (guint index, + HMONITOR * monitor, + IDXGIAdapter1 ** adapter, + IDXGIOutput ** output); + +G_END_DECLS +
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11screencapturedevice.cpp
Added
@@ -0,0 +1,448 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11screencapturedevice.h" +#include <gst/d3d11/gstd3d11.h> +#include <gst/video/video.h> +#include <wrl.h> +#include <string.h> +#include <string> +#include <locale> +#include <codecvt> + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_screen_capture_device_debug); +#define GST_CAT_DEFAULT gst_d3d11_screen_capture_device_debug + +static GstStaticCaps template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, "BGRA") ";" + GST_VIDEO_CAPS_MAKE ("BGRA")); + +enum +{ + PROP_0, + PROP_MONITOR_HANDLE, +}; + +struct _GstD3D11ScreenCaptureDevice +{ + GstDevice parent; + + HMONITOR monitor_handle; +}; + +static void gst_d3d11_screen_capture_device_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_screen_capture_device_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static GstElement *gst_d3d11_screen_capture_device_create_element (GstDevice * + 
device, const gchar * name); + +G_DEFINE_TYPE (GstD3D11ScreenCaptureDevice, + gst_d3d11_screen_capture_device, GST_TYPE_DEVICE); + +static void +gst_d3d11_screen_capture_device_class_init (GstD3D11ScreenCaptureDeviceClass * + klass) +{ + GObjectClass *object_class = G_OBJECT_CLASS (klass); + GstDeviceClass *dev_class = GST_DEVICE_CLASS (klass); + + object_class->get_property = gst_d3d11_screen_capture_device_get_property; + object_class->set_property = gst_d3d11_screen_capture_device_set_property; + + g_object_class_install_property (object_class, PROP_MONITOR_HANDLE, + g_param_spec_uint64 ("monitor-handle", "Monitor Handle", + "A HMONITOR handle", 0, G_MAXUINT64, 0, + (GParamFlags) (G_PARAM_STATIC_STRINGS | G_PARAM_READWRITE | + G_PARAM_CONSTRUCT_ONLY))); + + + dev_class->create_element = gst_d3d11_screen_capture_device_create_element; +} + +static void +gst_d3d11_screen_capture_device_init (GstD3D11ScreenCaptureDevice * self) +{ +} + +static void +gst_d3d11_screen_capture_device_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11ScreenCaptureDevice *self = GST_D3D11_SCREEN_CAPTURE_DEVICE (object); + + switch (prop_id) { + case PROP_MONITOR_HANDLE: + g_value_set_uint64 (value, (guint64) self->monitor_handle); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_screen_capture_device_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11ScreenCaptureDevice *self = GST_D3D11_SCREEN_CAPTURE_DEVICE (object); + + switch (prop_id) { + case PROP_MONITOR_HANDLE: + self->monitor_handle = (HMONITOR) g_value_get_uint64 (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static GstElement * +gst_d3d11_screen_capture_device_create_element (GstDevice * device, + const gchar * name) +{ + GstD3D11ScreenCaptureDevice *self = GST_D3D11_SCREEN_CAPTURE_DEVICE 
(device); + GstElement *elem; + + elem = gst_element_factory_make ("d3d11screencapturesrc", name); + + g_object_set (elem, "monitor-handle", self->monitor_handle, nullptr); + + return elem; +} + +struct _GstD3D11ScreenCaptureDeviceProvider +{ + GstDeviceProvider parent; +}; + +G_DEFINE_TYPE (GstD3D11ScreenCaptureDeviceProvider, + gst_d3d11_screen_capture_device_provider, GST_TYPE_DEVICE_PROVIDER); + +static GList *gst_d3d11_screen_capture_device_provider_probe (GstDeviceProvider + * provider); + +static void + gst_d3d11_screen_capture_device_provider_class_init + (GstD3D11ScreenCaptureDeviceProviderClass * klass) +{ + GstDeviceProviderClass *provider_class = GST_DEVICE_PROVIDER_CLASS (klass); + + provider_class->probe = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_device_provider_probe); + + gst_device_provider_class_set_static_metadata (provider_class, + "Direct3D11 Desktop Capture Device Provider", + "Source/Monitor", "List Direct3D11 desktop capture source devices", + "Seungha Yang <seungha@centricular.com>"); +} + +static void + gst_d3d11_screen_capture_device_provider_init + (GstD3D11ScreenCaptureDeviceProvider * self) +{ +} + +static gboolean +get_monitor_name (const MONITORINFOEXW * info, + DISPLAYCONFIG_TARGET_DEVICE_NAME * target) +{ + UINT32 num_path = 0; + UINT32 num_mode = 0; + LONG query_ret; + + memset (target, 0, sizeof (DISPLAYCONFIG_TARGET_DEVICE_NAME)); + + query_ret = GetDisplayConfigBufferSizes (QDC_ONLY_ACTIVE_PATHS, + &num_path, &num_mode); + if (query_ret != ERROR_SUCCESS || num_path == 0 || num_mode == 0) + return FALSE; + + DISPLAYCONFIG_PATH_INFO *path_infos = (DISPLAYCONFIG_PATH_INFO *) + g_alloca (num_path * sizeof (DISPLAYCONFIG_PATH_INFO)); + DISPLAYCONFIG_MODE_INFO *mode_infos = (DISPLAYCONFIG_MODE_INFO *) + g_alloca (num_mode * sizeof (DISPLAYCONFIG_MODE_INFO)); + + query_ret = QueryDisplayConfig (QDC_ONLY_ACTIVE_PATHS, &num_path, + path_infos, &num_mode, mode_infos, nullptr); + if (query_ret != ERROR_SUCCESS) + return FALSE; + + 
for (UINT32 i = 0; i < num_path; i++) { + DISPLAYCONFIG_PATH_INFO *p = &path_infos[i]; + DISPLAYCONFIG_SOURCE_DEVICE_NAME source; + + memset (&source, 0, sizeof (DISPLAYCONFIG_SOURCE_DEVICE_NAME)); + + source.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_SOURCE_NAME; + source.header.size = sizeof (DISPLAYCONFIG_SOURCE_DEVICE_NAME); + source.header.adapterId = p->sourceInfo.adapterId; + source.header.id = p->sourceInfo.id; + + query_ret = DisplayConfigGetDeviceInfo (&source.header); + if (query_ret != ERROR_SUCCESS) + continue; + + if (wcscmp (info->szDevice, source.viewGdiDeviceName) != 0) + continue; + + DISPLAYCONFIG_TARGET_DEVICE_NAME tmp; + + memset (&tmp, 0, sizeof (DISPLAYCONFIG_TARGET_DEVICE_NAME)); + + tmp.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_TARGET_NAME; + tmp.header.size = sizeof (DISPLAYCONFIG_TARGET_DEVICE_NAME); + tmp.header.adapterId = p->sourceInfo.adapterId; + tmp.header.id = p->targetInfo.id; + + query_ret = DisplayConfigGetDeviceInfo (&tmp.header); + if (query_ret != ERROR_SUCCESS) + continue; + + memcpy (target, &tmp, sizeof (DISPLAYCONFIG_TARGET_DEVICE_NAME)); + + return TRUE; + } + + return FALSE; +} + +/* XXX: please bump MinGW toolchain version, + * DISPLAYCONFIG_VIDEO_OUTPUT_TECHNOLOGY defined in wingdi.h */ +typedef enum +{ + OUTPUT_TECH_OTHER = -1, + OUTPUT_TECH_HD15 = 0, + OUTPUT_TECH_SVIDEO = 1, + OUTPUT_TECH_COMPOSITE_VIDEO = 2, + OUTPUT_TECH_COMPONENT_VIDEO = 3, + OUTPUT_TECH_DVI = 4, + OUTPUT_TECH_HDMI = 5, + OUTPUT_TECH_LVDS = 6, + OUTPUT_TECH_D_JPN = 8, + OUTPUT_TECH_SDI = 9, + OUTPUT_TECH_DISPLAYPORT_EXTERNAL = 10, + OUTPUT_TECH_DISPLAYPORT_EMBEDDED = 11, + OUTPUT_TECH_UDI_EXTERNAL = 12, + OUTPUT_TECH_UDI_EMBEDDED = 13, + OUTPUT_TECH_SDTVDONGLE = 14, + OUTPUT_TECH_MIRACAST = 15, + OUTPUT_TECH_INDIRECT_WIRED = 16, + OUTPUT_TECH_INDIRECT_VIRTUAL = 17, + OUTPUT_TECH_INTERNAL = 0x80000000, + OUTPUT_TECH_FORCE_UINT32 = 0xFFFFFFFF +} GST_OUTPUT_TECHNOLOGY; + +static const gchar * +output_tech_to_string (GST_OUTPUT_TECHNOLOGY tech) 
+{ + switch (tech) { + case OUTPUT_TECH_HD15: + return "hd15"; + case OUTPUT_TECH_SVIDEO: + return "svideo"; + case OUTPUT_TECH_COMPOSITE_VIDEO: + return "composite-video"; + case OUTPUT_TECH_DVI: + return "dvi"; + case OUTPUT_TECH_HDMI: + return "hdmi"; + case OUTPUT_TECH_LVDS: + return "lvds"; + case OUTPUT_TECH_D_JPN: + return "d-jpn"; + case OUTPUT_TECH_SDI: + return "sdi"; + case OUTPUT_TECH_DISPLAYPORT_EXTERNAL: + return "displayport-external"; + case OUTPUT_TECH_DISPLAYPORT_EMBEDDED: + return "displayport-internal"; + case OUTPUT_TECH_UDI_EXTERNAL: + return "udi-external"; + case OUTPUT_TECH_UDI_EMBEDDED: + return "udi-embedded"; + case OUTPUT_TECH_SDTVDONGLE: + return "sdtv"; + case OUTPUT_TECH_MIRACAST: + return "miracast"; + case OUTPUT_TECH_INDIRECT_WIRED: + return "indirect-wired"; + case OUTPUT_TECH_INDIRECT_VIRTUAL: + return "indirect-virtual"; + case OUTPUT_TECH_INTERNAL: + return "internal"; + default: + break; + } + + return "unknown"; +} + +static GstDevice * +create_device (const DXGI_ADAPTER_DESC * adapter_desc, + const DXGI_OUTPUT_DESC * output_desc, + const MONITORINFOEXW * minfo, const DEVMODEW * dev_mode, + const DISPLAYCONFIG_TARGET_DEVICE_NAME * target) +{ + GstCaps *caps; + gint width, height, left, top, right, bottom; + GstStructure *props; + std::wstring_convert < std::codecvt_utf8 < wchar_t >, wchar_t >converter; + std::string device_name; + std::string display_name; + std::string device_path; + std::string device_description; + const gchar *output_type; + gboolean primary = FALSE; + GstDevice *device; + + left = (gint) dev_mode->dmPosition.x; + top = (gint) dev_mode->dmPosition.y; + width = dev_mode->dmPelsWidth; + height = dev_mode->dmPelsHeight; + right = left + width; + bottom = top + height; + + caps = gst_static_caps_get (&template_caps); + caps = gst_caps_make_writable (caps); + gst_caps_set_simple (caps, + "width", G_TYPE_INT, width, "height", G_TYPE_INT, height, nullptr); + + device_name = converter.to_bytes (minfo->szDevice); 
+ display_name = converter.to_bytes (target->monitorFriendlyDeviceName); + device_path = converter.to_bytes (target->monitorDevicePath); + device_description = converter.to_bytes (adapter_desc->Description); + output_type = + output_tech_to_string ((GST_OUTPUT_TECHNOLOGY) target->outputTechnology); + if ((minfo->dwFlags & MONITORINFOF_PRIMARY) != 0) + primary = TRUE; + + props = gst_structure_new ("d3d11screencapturedevice-proplist", + "device.api", G_TYPE_STRING, "d3d11", + "device.name", G_TYPE_STRING, GST_STR_NULL (device_name.c_str ()), + "device.path", G_TYPE_STRING, GST_STR_NULL (device_path.c_str ()), + "device.primary", G_TYPE_BOOLEAN, primary, + "device.type", G_TYPE_STRING, output_type, + "device.hmonitor", G_TYPE_UINT64, (guint64) output_desc->Monitor, + "device.adapter.luid", G_TYPE_INT64, + gst_d3d11_luid_to_int64 (&adapter_desc->AdapterLuid), + "device.adapter.description", G_TYPE_STRING, + GST_STR_NULL (device_description.c_str ()), + "desktop.coordinates.left", G_TYPE_INT, + (gint) output_desc->DesktopCoordinates.left, + "desktop.coordinates.top", G_TYPE_INT, + (gint) output_desc->DesktopCoordinates.top, + "desktop.coordinates.right", G_TYPE_INT, + (gint) output_desc->DesktopCoordinates.right, + "desktop.coordinates.bottom", G_TYPE_INT, + (gint) output_desc->DesktopCoordinates.bottom, + "display.coordinates.left", G_TYPE_INT, left, + "display.coordinates.top", G_TYPE_INT, top, + "display.coordinates.right", G_TYPE_INT, right, + "display.coordinates.bottom", G_TYPE_INT, bottom, nullptr); + + device = (GstDevice *) g_object_new (GST_TYPE_D3D11_SCREEN_CAPTURE_DEVICE, + "display-name", display_name.c_str (), "caps", caps, "device-class", + "Source/Monitor", "properties", props, "monitor-handle", + (guint64) output_desc->Monitor, nullptr); + + gst_caps_unref (caps); + + return device; +} + +static GList * +gst_d3d11_screen_capture_device_provider_probe (GstDeviceProvider * provider) +{ + GList *devices = nullptr; + ComPtr < IDXGIFactory1 > factory; + 
HRESULT hr = S_OK; + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (FAILED (hr)) + return nullptr; + + for (UINT adapter_idx = 0;; adapter_idx++) { + ComPtr < IDXGIAdapter1 > adapter; + DXGI_ADAPTER_DESC adapter_desc; + + hr = factory->EnumAdapters1 (adapter_idx, &adapter); + if (FAILED (hr)) + break; + + hr = adapter->GetDesc (&adapter_desc); + if (FAILED (hr)) + continue; + + for (UINT output_idx = 0;; output_idx++) { + ComPtr < IDXGIOutput > output; + ComPtr < IDXGIOutput1 > output1; + DXGI_OUTPUT_DESC desc; + MONITORINFOEXW minfo; + DEVMODEW dev_mode; + DISPLAYCONFIG_TARGET_DEVICE_NAME target; + GstDevice *dev; + + hr = adapter->EnumOutputs (output_idx, &output); + if (FAILED (hr)) + break; + + hr = output.As (&output1); + if (FAILED (hr)) + continue; + + hr = output->GetDesc (&desc); + if (FAILED (hr)) + continue; + + minfo.cbSize = sizeof (MONITORINFOEXW); + if (!GetMonitorInfoW (desc.Monitor, &minfo)) + continue; + + dev_mode.dmSize = sizeof (DEVMODEW); + dev_mode.dmDriverExtra = sizeof (POINTL); + dev_mode.dmFields = DM_POSITION; + if (!EnumDisplaySettingsW (minfo.szDevice, + ENUM_CURRENT_SETTINGS, &dev_mode)) { + continue; + } + + /* A human-readable monitor name is not always available; if it's empty, + * fill it with a generic one */ + if (!get_monitor_name (&minfo, &target) || + wcslen (target.monitorFriendlyDeviceName) == 0) { + wcscpy (target.monitorFriendlyDeviceName, L"Generic PnP Monitor"); + } + + dev = create_device (&adapter_desc, &desc, &minfo, &dev_mode, &target); + devices = g_list_append (devices, dev); + } + } + + return devices; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11screencapturedevice.h
Added
@@ -0,0 +1,36 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/gst.h> + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_SCREEN_CAPTURE_DEVICE (gst_d3d11_screen_capture_device_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11ScreenCaptureDevice, gst_d3d11_screen_capture_device, + GST, D3D11_SCREEN_CAPTURE_DEVICE, GstDevice); + +#define GST_TYPE_D3D11_SCREEN_CAPTURE_DEVICE_PROVIDER (gst_d3d11_screen_capture_device_provider_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11ScreenCaptureDeviceProvider, + gst_d3d11_screen_capture_device_provider, + GST, D3D11_SCREEN_CAPTURE_DEVICE_PROVIDER, GstDeviceProvider); + +G_END_DECLS +
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11screencapturesrc.cpp
Added
@@ -0,0 +1,919 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11screencapturesrc + * @title: d3d11screencapturesrc + * + * A DXGI Desktop Duplication API based screen capture element + * + * ## Example launch line + * ``` + * gst-launch-1.0 d3d11screencapturesrc ! queue ! 
d3d11videosink + * ``` + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11screencapturesrc.h" +#include "gstd3d11screencapture.h" +#include "gstd3d11pluginutils.h" +#include <wrl.h> +#include <string.h> + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_screen_capture_debug); +#define GST_CAT_DEFAULT gst_d3d11_screen_capture_debug + +enum +{ + PROP_0, + PROP_MONITOR_INDEX, + PROP_MONITOR_HANDLE, + PROP_SHOW_CURSOR, + + PROP_LAST, +}; + +static GParamSpec *properties[PROP_LAST]; + +#define DEFAULT_MONITOR_INDEX -1 +#define DEFAULT_SHOW_CURSOR FALSE + +static GstStaticCaps template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, "BGRA") ";" + GST_VIDEO_CAPS_MAKE ("BGRA")); + +struct _GstD3D11ScreenCaptureSrc +{ + GstBaseSrc src; + + guint64 last_frame_no; + GstClockID clock_id; + GstVideoInfo video_info; + + GstD3D11Device *device; + GstD3D11ScreenCapture *capture; + + GstBufferPool *pool; + + gint64 adapter_luid; + gint monitor_index; + HMONITOR monitor_handle; + gboolean show_cursor; + + gboolean flushing; + GstClockTime min_latency; + GstClockTime max_latency; + + gboolean downstream_supports_d3d11; +}; + +static void gst_d3d11_screen_capture_src_dispose (GObject * object); +static void gst_d3d11_screen_capture_src_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_d3d11_screen_capture_src_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); + +static void gst_d3d11_screen_capture_src_set_context (GstElement * element, + GstContext * context); + +static GstCaps *gst_d3d11_screen_capture_src_get_caps (GstBaseSrc * bsrc, + GstCaps * filter); +static GstCaps *gst_d3d11_screen_capture_src_fixate (GstBaseSrc * bsrc, + GstCaps * caps); +static gboolean gst_d3d11_screen_capture_src_set_caps (GstBaseSrc * bsrc, + GstCaps 
* caps); +static gboolean gst_d3d11_screen_capture_src_decide_allocation (GstBaseSrc * + bsrc, GstQuery * query); +static gboolean gst_d3d11_screen_capture_src_start (GstBaseSrc * bsrc); +static gboolean gst_d3d11_screen_capture_src_stop (GstBaseSrc * bsrc); +static gboolean gst_d3d11_screen_capture_src_unlock (GstBaseSrc * bsrc); +static gboolean gst_d3d11_screen_capture_src_unlock_stop (GstBaseSrc * bsrc); +static gboolean +gst_d3d11_screen_capture_src_src_query (GstBaseSrc * bsrc, GstQuery * query); + +static GstFlowReturn gst_d3d11_screen_capture_src_create (GstBaseSrc * bsrc, + guint64 offset, guint size, GstBuffer ** buf); + +#define gst_d3d11_screen_capture_src_parent_class parent_class +G_DEFINE_TYPE (GstD3D11ScreenCaptureSrc, gst_d3d11_screen_capture_src, + GST_TYPE_BASE_SRC); + +static void +gst_d3d11_screen_capture_src_class_init (GstD3D11ScreenCaptureSrcClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseSrcClass *basesrc_class = GST_BASE_SRC_CLASS (klass); + GstCaps *caps; + + gobject_class->dispose = gst_d3d11_screen_capture_src_dispose; + gobject_class->set_property = gst_d3d11_screen_capture_src_set_property; + gobject_class->get_property = gst_d3d11_screen_capture_src_get_property; + + properties[PROP_MONITOR_INDEX] = + g_param_spec_int ("monitor-index", "Monitor Index", + "Zero-based index for monitor to capture (-1 = primary monitor)", + -1, G_MAXINT, DEFAULT_MONITOR_INDEX, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS)); + + properties[PROP_MONITOR_HANDLE] = + g_param_spec_uint64 ("monitor-handle", "Monitor Handle", + "A HMONITOR handle of monitor to capture", + 0, G_MAXUINT64, 0, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS)); + + properties[PROP_SHOW_CURSOR] = + g_param_spec_boolean ("show-cursor", + "Show Mouse Cursor", "Whether to show mouse cursor", + 
DEFAULT_SHOW_CURSOR, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_properties (gobject_class, PROP_LAST, properties); + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_set_context); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 screen capture src", "Source/Video", + "Capture desktop image by using Desktop Duplication API", + "Seungha Yang <seungha@centricular.com>"); + + caps = gst_d3d11_get_updated_template_caps (&template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + basesrc_class->get_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_get_caps); + basesrc_class->fixate = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_fixate); + basesrc_class->set_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_set_caps); + basesrc_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_decide_allocation); + basesrc_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_start); + basesrc_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_stop); + basesrc_class->unlock = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_unlock); + basesrc_class->unlock_stop = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_unlock_stop); + basesrc_class->query = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_src_query); + + basesrc_class->create = + GST_DEBUG_FUNCPTR (gst_d3d11_screen_capture_src_create); +} + +static void +gst_d3d11_screen_capture_src_init (GstD3D11ScreenCaptureSrc * self) +{ + gst_base_src_set_live (GST_BASE_SRC (self), TRUE); + gst_base_src_set_format (GST_BASE_SRC (self), GST_FORMAT_TIME); + + self->monitor_index = DEFAULT_MONITOR_INDEX; + self->show_cursor = DEFAULT_SHOW_CURSOR; + self->min_latency = GST_CLOCK_TIME_NONE; + self->max_latency = GST_CLOCK_TIME_NONE; +} + +static void 
+gst_d3d11_screen_capture_src_dispose (GObject * object) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (object); + + gst_clear_object (&self->capture); + gst_clear_object (&self->device); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_d3d11_screen_capture_src_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (object); + + switch (prop_id) { + case PROP_MONITOR_INDEX: + self->monitor_index = g_value_get_int (value); + break; + case PROP_MONITOR_HANDLE: + self->monitor_handle = (HMONITOR) g_value_get_uint64 (value); + break; + case PROP_SHOW_CURSOR: + self->show_cursor = g_value_get_boolean (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + }; +} + +static void +gst_d3d11_screen_capture_src_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (object); + + switch (prop_id) { + case PROP_MONITOR_INDEX: + g_value_set_int (value, self->monitor_index); + break; + case PROP_MONITOR_HANDLE: + g_value_set_uint64 (value, (guint64) self->monitor_handle); + break; + case PROP_SHOW_CURSOR: + g_value_set_boolean (value, self->show_cursor); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + }; +} + +static void +gst_d3d11_screen_capture_src_set_context (GstElement * element, + GstContext * context) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (element); + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, self->adapter_luid, &self->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static GstCaps * +gst_d3d11_screen_capture_src_get_caps (GstBaseSrc * bsrc, GstCaps * filter) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + GstCaps *caps 
= NULL; + guint width, height; + + if (!self->capture) { + GST_DEBUG_OBJECT (self, "Duplication object is not configured yet"); + return gst_pad_get_pad_template_caps (GST_BASE_SRC_PAD (bsrc)); + } + + if (!gst_d3d11_screen_capture_get_size (self->capture, &width, &height)) { + GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, + ("Cannot query supported resolution"), (NULL)); + return NULL; + } + + caps = gst_pad_get_pad_template_caps (GST_BASE_SRC_PAD (bsrc)); + caps = gst_caps_make_writable (caps); + + gst_caps_set_simple (caps, "width", G_TYPE_INT, width, "height", + G_TYPE_INT, height, nullptr); + + if (filter) { + GstCaps *tmp = + gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + + caps = tmp; + } + + return caps; +} + +static GstCaps * +gst_d3d11_screen_capture_src_fixate (GstBaseSrc * bsrc, GstCaps * caps) +{ + guint size; + GstCaps *d3d11_caps = nullptr; + + caps = gst_caps_make_writable (caps); + size = gst_caps_get_size (caps); + + for (guint i = 0; i < size; i++) { + GstStructure *s; + + s = gst_caps_get_structure (caps, i); + gst_structure_fixate_field_nearest_fraction (s, "framerate", 30, 1); + + if (!d3d11_caps) { + GstCapsFeatures *features; + features = gst_caps_get_features (caps, i); + + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + + d3d11_caps = gst_caps_new_empty (); + gst_caps_append_structure (d3d11_caps, gst_structure_copy (s)); + + gst_caps_set_features (d3d11_caps, 0, + gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, + nullptr)); + + break; + } + } + } + + if (d3d11_caps) { + gst_caps_unref (caps); + caps = d3d11_caps; + } + + return gst_caps_fixate (caps); +} + +static gboolean +gst_d3d11_screen_capture_src_set_caps (GstBaseSrc * bsrc, GstCaps * caps) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + GstCapsFeatures *features; + + GST_DEBUG_OBJECT (self, "Set caps %" GST_PTR_FORMAT, caps); + + features = 
gst_caps_get_features (caps, 0); + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + self->downstream_supports_d3d11 = TRUE; + } else { + self->downstream_supports_d3d11 = FALSE; + } + + gst_video_info_from_caps (&self->video_info, caps); + gst_base_src_set_blocksize (bsrc, GST_VIDEO_INFO_SIZE (&self->video_info)); + + return TRUE; +} + +static gboolean +gst_d3d11_screen_capture_src_decide_allocation (GstBaseSrc * bsrc, + GstQuery * query) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + GstBufferPool *pool = NULL; + GstStructure *config; + GstD3D11AllocationParams *d3d11_params; + GstCaps *caps; + guint min, max, size; + gboolean update_pool; + GstVideoInfo vinfo; + + if (self->pool) { + gst_buffer_pool_set_active (self->pool, FALSE); + gst_clear_object (&self->pool); + } + + gst_query_parse_allocation (query, &caps, NULL); + + if (!caps) { + GST_ERROR_OBJECT (self, "No output caps"); + return FALSE; + } + + gst_video_info_from_caps (&vinfo, caps); + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + update_pool = TRUE; + } else { + size = GST_VIDEO_INFO_SIZE (&vinfo); + + min = max = 0; + update_pool = FALSE; + } + + if (pool && self->downstream_supports_d3d11) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != self->device) + gst_clear_object (&pool); + } + } + + if (!pool) { + if (self->downstream_supports_d3d11) + pool = gst_d3d11_buffer_pool_new (self->device); + else + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_set_params (config, caps, size, min, max); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + if (self->downstream_supports_d3d11) { + d3d11_params = 
gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (self->device, &vinfo, + (GstD3D11AllocationFlags) 0, + D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET); + } else { + d3d11_params->desc[0].BindFlags |= D3D11_BIND_RENDER_TARGET; + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + } + + if (!gst_buffer_pool_set_config (pool, config)) { + GST_ERROR_OBJECT (self, "Failed to set config"); + goto error; + } + + if (self->downstream_supports_d3d11) { + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, + nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + } else { + self->pool = gst_d3d11_buffer_pool_new (self->device); + + config = gst_buffer_pool_get_config (self->pool); + + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (self->device, &vinfo, + (GstD3D11AllocationFlags) 0, D3D11_BIND_RENDER_TARGET); + } else { + d3d11_params->desc[0].BindFlags |= D3D11_BIND_RENDER_TARGET; + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + if (!gst_buffer_pool_set_config (self->pool, config)) { + GST_ERROR_OBJECT (self, "Failed to set config for internal pool"); + goto error; + } + + if (!gst_buffer_pool_set_active (self->pool, TRUE)) { + GST_ERROR_OBJECT (self, "Failed to activate internal pool"); + goto error; + } + } + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + 
gst_query_add_allocation_pool (query, pool, size, min, max); + + gst_object_unref (pool); + + return TRUE; + +error: + gst_clear_object (&self->pool); + gst_clear_object (&pool); + + return FALSE; +} + +static gboolean +gst_d3d11_screen_capture_src_start (GstBaseSrc * bsrc) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + GstFlowReturn ret; + HMONITOR monitor = self->monitor_handle; + ComPtr < IDXGIAdapter1 > adapter; + DXGI_ADAPTER_DESC desc; + HRESULT hr; + + if (monitor) { + hr = gst_d3d11_screen_capture_find_output_for_monitor (monitor, + &adapter, nullptr); + } else if (self->monitor_index < 0) { + hr = gst_d3d11_screen_capture_find_primary_monitor (&monitor, + &adapter, nullptr); + } else { + hr = gst_d3d11_screen_capture_find_nth_monitor (self->monitor_index, + &monitor, &adapter, nullptr); + } + + if (FAILED (hr)) + goto error; + + hr = adapter->GetDesc (&desc); + if (FAILED (hr)) + goto error; + + self->adapter_luid = gst_d3d11_luid_to_int64 (&desc.AdapterLuid); + gst_clear_object (&self->device); + + if (!gst_d3d11_ensure_element_data_for_adapter_luid (GST_ELEMENT_CAST (self), + self->adapter_luid, &self->device)) { + GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, + ("D3D11 device for LUID %" G_GINT64_FORMAT " is unavailble", + self->adapter_luid), (nullptr)); + + return FALSE; + } + + self->capture = gst_d3d11_screen_capture_new (self->device, monitor); + if (!self->capture) + goto error; + + /* Check if we can open device */ + ret = gst_d3d11_screen_capture_prepare (self->capture); + switch (ret) { + case GST_D3D11_SCREEN_CAPTURE_FLOW_EXPECTED_ERROR: + case GST_FLOW_OK: + break; + case GST_D3D11_SCREEN_CAPTURE_FLOW_UNSUPPORTED: + goto unsupported; + default: + goto error; + } + + self->last_frame_no = -1; + self->min_latency = self->max_latency = GST_CLOCK_TIME_NONE; + + return TRUE; + +error: + { + GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, + ("Failed to prepare capture object with given configuration, " + "monitor-index: 
%d, monitor-handle: %p", + self->monitor_index, self->monitor_handle), (nullptr)); + return FALSE; + } + +unsupported: + { + GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, + ("Failed to prepare capture object with given configuration, " + "monitor-index: %d, monitor-handle: %p", + self->monitor_index, self->monitor_handle), + ("Try running the application on the integrated GPU")); + return FALSE; + } +} + +static gboolean +gst_d3d11_screen_capture_src_stop (GstBaseSrc * bsrc) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + + if (self->pool) { + gst_buffer_pool_set_active (self->pool, FALSE); + gst_clear_object (&self->pool); + } + + gst_clear_object (&self->capture); + gst_clear_object (&self->device); + + return TRUE; +} + +static gboolean +gst_d3d11_screen_capture_src_unlock (GstBaseSrc * bsrc) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + + GST_OBJECT_LOCK (self); + if (self->clock_id) { + GST_DEBUG_OBJECT (self, "Waking up waiting clock"); + gst_clock_id_unschedule (self->clock_id); + } + self->flushing = TRUE; + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gboolean +gst_d3d11_screen_capture_src_unlock_stop (GstBaseSrc * bsrc) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + + GST_OBJECT_LOCK (self); + self->flushing = FALSE; + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gboolean +gst_d3d11_screen_capture_src_src_query (GstBaseSrc * bsrc, GstQuery * query) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT_CAST (self), query, + self->device)) { + return TRUE; + } + break; + case GST_QUERY_LATENCY: + if (GST_CLOCK_TIME_IS_VALID (self->min_latency)) { + gst_query_set_latency (query, + TRUE, self->min_latency, self->max_latency); + return TRUE; + } + break; + default: + break; + } + + return GST_BASE_SRC_CLASS 
(parent_class)->query (bsrc, query); +} + +static GstFlowReturn +gst_d3d11_screen_capture_src_create (GstBaseSrc * bsrc, guint64 offset, + guint size, GstBuffer ** buf) +{ + GstD3D11ScreenCaptureSrc *self = GST_D3D11_SCREEN_CAPTURE_SRC (bsrc); + ID3D11Texture2D *texture; + ID3D11RenderTargetView *rtv = NULL; + gint fps_n, fps_d; + GstMapInfo info; + GstMemory *mem; + GstD3D11Memory *dmem; + GstFlowReturn ret = GST_FLOW_OK; + GstClock *clock = NULL; + GstClockTime base_time; + GstClockTime next_capture_ts; + GstClockTime before_capture; + GstClockTime after_capture; + GstClockTime latency; + GstClockTime dur; + gboolean update_latency = FALSE; + guint64 next_frame_no; + gboolean draw_mouse; + /* Just magic number... */ + gint unsupported_retry_count = 100; + GstBuffer *buffer = NULL; + GstBuffer *sysmem_buf = NULL; + + if (!self->capture) { + GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, + ("Couldn't configure capture object"), (nullptr)); + return GST_FLOW_NOT_NEGOTIATED; + } + + fps_n = GST_VIDEO_INFO_FPS_N (&self->video_info); + fps_d = GST_VIDEO_INFO_FPS_D (&self->video_info); + + if (fps_n <= 0 || fps_d <= 0) + return GST_FLOW_NOT_NEGOTIATED; + +again: + clock = gst_element_get_clock (GST_ELEMENT_CAST (self)); + if (!clock) { + GST_ELEMENT_ERROR (self, RESOURCE, FAILED, + ("Cannot operate without a clock"), (nullptr)); + return GST_FLOW_ERROR; + } + + /* Check flushing before waiting on the clock because we might be doing + * a retry */ + GST_OBJECT_LOCK (self); + if (self->flushing) { + ret = GST_FLOW_FLUSHING; + GST_OBJECT_UNLOCK (self); + goto out; + } + + base_time = GST_ELEMENT_CAST (self)->base_time; + next_capture_ts = gst_clock_get_time (clock); + next_capture_ts -= base_time; + + next_frame_no = gst_util_uint64_scale (next_capture_ts, + fps_n, GST_SECOND * fps_d); + + if (next_frame_no == self->last_frame_no) { + GstClockID id; + GstClockReturn clock_ret; + + /* Need to wait for the next frame */ + next_frame_no += 1; + + /* Figure out what the next frame 
time is */ + next_capture_ts = gst_util_uint64_scale (next_frame_no, + fps_d * GST_SECOND, fps_n); + + id = gst_clock_new_single_shot_id (GST_ELEMENT_CLOCK (self), + next_capture_ts + base_time); + self->clock_id = id; + + /* release the object lock while waiting */ + GST_OBJECT_UNLOCK (self); + + GST_LOG_OBJECT (self, "Waiting for next frame time %" GST_TIME_FORMAT, + GST_TIME_ARGS (next_capture_ts)); + clock_ret = gst_clock_id_wait (id, NULL); + GST_OBJECT_LOCK (self); + + gst_clock_id_unref (id); + self->clock_id = NULL; + + if (clock_ret == GST_CLOCK_UNSCHEDULED) { + /* Got woken up by the unlock function */ + ret = GST_FLOW_FLUSHING; + GST_OBJECT_UNLOCK (self); + goto out; + } + /* Duration is a complete 1/fps frame duration */ + dur = gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n); + } else { + GstClockTime next_frame_ts; + + GST_LOG_OBJECT (self, "No need to wait for next frame time %" + GST_TIME_FORMAT " next frame = %" G_GINT64_FORMAT " prev = %" + G_GINT64_FORMAT, GST_TIME_ARGS (next_capture_ts), next_frame_no, + self->last_frame_no); + + next_frame_ts = gst_util_uint64_scale (next_frame_no + 1, + fps_d * GST_SECOND, fps_n); + /* Frame duration is from now until the next expected capture time */ + dur = next_frame_ts - next_capture_ts; + } + + self->last_frame_no = next_frame_no; + GST_OBJECT_UNLOCK (self); + + if (!buffer) { + if (self->downstream_supports_d3d11) { + ret = GST_BASE_SRC_CLASS (parent_class)->alloc (bsrc, + offset, size, &buffer); + } else { + if (!self->pool) { + GST_ERROR_OBJECT (self, "Internal pool wasn't configured"); + goto error; + } + + ret = gst_buffer_pool_acquire_buffer (self->pool, &buffer, nullptr); + } + + if (ret != GST_FLOW_OK) + goto out; + } + + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + GST_ERROR_OBJECT (self, "Not a D3D11 memory"); + goto error; + } + + dmem = (GstD3D11Memory *) mem; + draw_mouse = self->show_cursor; + rtv = gst_d3d11_memory_get_render_target_view (dmem, 0); + 
if (draw_mouse && !rtv) { + GST_ERROR_OBJECT (self, "Render target view is unavailable"); + goto error; + } + + if (!gst_memory_map (mem, &info, + (GstMapFlags) (GST_MAP_WRITE | GST_MAP_D3D11))) { + GST_ERROR_OBJECT (self, "Failed to map d3d11 memory"); + goto error; + } + + texture = (ID3D11Texture2D *) info.data; + before_capture = gst_clock_get_time (clock); + ret = + gst_d3d11_screen_capture_do_capture (self->capture, texture, rtv, + draw_mouse); + gst_memory_unmap (mem, &info); + + switch (ret) { + case GST_D3D11_SCREEN_CAPTURE_FLOW_EXPECTED_ERROR: + GST_WARNING_OBJECT (self, "Got expected error, try again"); + gst_clear_object (&clock); + goto again; + case GST_D3D11_SCREEN_CAPTURE_FLOW_UNSUPPORTED: + GST_WARNING_OBJECT (self, "Got DXGI_ERROR_UNSUPPORTED error"); + unsupported_retry_count--; + + if (unsupported_retry_count < 0) + goto error; + + gst_clear_object (&clock); + goto again; + case GST_D3D11_SCREEN_CAPTURE_FLOW_SIZE_CHANGED: + GST_INFO_OBJECT (self, "Size was changed, need negotiation"); + gst_clear_buffer (&buffer); + gst_clear_object (&clock); + + if (!gst_base_src_negotiate (bsrc)) { + GST_ERROR_OBJECT (self, "Failed to negotiate with new size"); + ret = GST_FLOW_NOT_NEGOTIATED; + goto out; + } + goto again; + default: + break; + } + + if (!self->downstream_supports_d3d11) { + GstVideoFrame src_frame, dst_frame; + gboolean copy_ret; + + ret = GST_BASE_SRC_CLASS (parent_class)->alloc (bsrc, + offset, size, &sysmem_buf); + if (ret != GST_FLOW_OK) { + gst_clear_buffer (&buffer); + goto out; + } + + if (!gst_video_frame_map (&src_frame, &self->video_info, buffer, + GST_MAP_READ)) { + GST_ERROR_OBJECT (self, "Failed to map d3d11 buffer"); + goto error; + } + + if (!gst_video_frame_map (&dst_frame, &self->video_info, sysmem_buf, + GST_MAP_WRITE)) { + GST_ERROR_OBJECT (self, "Failed to map sysmem buffer"); + gst_video_frame_unmap (&src_frame); + goto error; + } + + copy_ret = gst_video_frame_copy (&dst_frame, &src_frame); + gst_video_frame_unmap 
(&dst_frame); + gst_video_frame_unmap (&src_frame); + + if (!copy_ret) { + GST_ERROR_OBJECT (self, "Failed to copy frame"); + goto error; + } + + gst_buffer_unref (buffer); + buffer = sysmem_buf; + sysmem_buf = nullptr; + } + + GST_BUFFER_DTS (buffer) = GST_CLOCK_TIME_NONE; + GST_BUFFER_PTS (buffer) = next_capture_ts; + GST_BUFFER_DURATION (buffer) = dur; + + after_capture = gst_clock_get_time (clock); + latency = after_capture - before_capture; + if (!GST_CLOCK_TIME_IS_VALID (self->min_latency)) { + self->min_latency = self->max_latency = latency; + update_latency = TRUE; + GST_DEBUG_OBJECT (self, "Initial latency %" GST_TIME_FORMAT, + GST_TIME_ARGS (latency)); + } + + if (latency > self->max_latency) { + self->max_latency = latency; + update_latency = TRUE; + GST_DEBUG_OBJECT (self, "Updating max latency %" GST_TIME_FORMAT, + GST_TIME_ARGS (latency)); + } + + if (update_latency) { + gst_element_post_message (GST_ELEMENT_CAST (self), + gst_message_new_latency (GST_OBJECT_CAST (self))); + } + +out: + gst_clear_object (&clock); + *buf = buffer; + + return ret; + +error: + gst_clear_buffer (&buffer); + gst_clear_buffer (&sysmem_buf); + gst_clear_object (&clock); + + return GST_FLOW_ERROR; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11screencapturesrc.h
Added
@@ -0,0 +1,35 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/base/gstbasesrc.h> +#include <gst/d3d11/gstd3d11.h> + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_SCREEN_CAPTURE_SRC (gst_d3d11_screen_capture_src_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11ScreenCaptureSrc, gst_d3d11_screen_capture_src, + GST, D3D11_SCREEN_CAPTURE_SRC, GstBaseSrc); + +G_END_DECLS +
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11shader.cpp
Added
@@ -0,0 +1,396 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11shader.h" +#include "gstd3d11pluginutils.h" +#include <gmodule.h> +#include <wrl.h> + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_shader_debug); +#define GST_CAT_DEFAULT gst_d3d11_shader_debug + +/* too many const buffers doesn't make sense */ +#define MAX_CONST_BUFFERS 16 + +static GModule *d3d_compiler_module = NULL; +static pD3DCompile GstD3DCompileFunc = NULL; + +gboolean +gst_d3d11_shader_init (void) +{ + static gsize _init = 0; + + if (g_once_init_enter (&_init)) { +#if GST_D3D11_WINAPI_ONLY_APP + /* Assuming that d3d compiler library is available */ + GstD3DCompileFunc = D3DCompile; +#else + static const gchar *d3d_compiler_names[] = { + "d3dcompiler_47.dll", + "d3dcompiler_46.dll", + "d3dcompiler_45.dll", + "d3dcompiler_44.dll", + "d3dcompiler_43.dll", + }; + guint i; + for (i = 0; i < G_N_ELEMENTS (d3d_compiler_names); i++) { + d3d_compiler_module = + g_module_open (d3d_compiler_names[i], G_MODULE_BIND_LAZY); + + if (d3d_compiler_module) { + GST_INFO ("D3D compiler %s is 
available", d3d_compiler_names[i]); + if (!g_module_symbol (d3d_compiler_module, "D3DCompile", + (gpointer *) & GstD3DCompileFunc)) { + GST_ERROR ("Cannot load D3DCompile symbol from %s", + d3d_compiler_names[i]); + g_module_close (d3d_compiler_module); + d3d_compiler_module = NULL; + GstD3DCompileFunc = NULL; + } else { + break; + } + } + } + + if (!GstD3DCompileFunc) + GST_WARNING ("D3D11 compiler library is unavailable"); +#endif + + g_once_init_leave (&_init, 1); + } + + return !!GstD3DCompileFunc; +} + +static gboolean +compile_shader (GstD3D11Device * device, const gchar * shader_source, + gboolean is_pixel_shader, ID3DBlob ** blob) +{ + const gchar *shader_target; + D3D_FEATURE_LEVEL feature_level; + HRESULT hr; + ID3D11Device *device_handle; + /* *INDENT-OFF* */ + ComPtr<ID3DBlob> ret; + ComPtr<ID3DBlob> error; + /* *INDENT-ON* */ + + if (!gst_d3d11_shader_init ()) { + GST_ERROR ("D3DCompiler is unavailable"); + return FALSE; + } + + device_handle = gst_d3d11_device_get_device_handle (device); + feature_level = device_handle->GetFeatureLevel (); + + if (is_pixel_shader) { + if (feature_level >= D3D_FEATURE_LEVEL_10_0) + shader_target = "ps_4_0"; + else if (feature_level >= D3D_FEATURE_LEVEL_9_3) + shader_target = "ps_4_0_level_9_3"; + else + shader_target = "ps_4_0_level_9_1"; + } else { + if (feature_level >= D3D_FEATURE_LEVEL_10_0) + shader_target = "vs_4_0"; + else if (feature_level >= D3D_FEATURE_LEVEL_9_3) + shader_target = "vs_4_0_level_9_3"; + else + shader_target = "vs_4_0_level_9_1"; + } + + g_assert (GstD3DCompileFunc); + + GST_TRACE ("Compile code \n%s", shader_source); + + hr = GstD3DCompileFunc (shader_source, strlen (shader_source), NULL, NULL, + NULL, "main", shader_target, 0, 0, &ret, &error); + + if (!gst_d3d11_result (hr, device)) { + const gchar *err = NULL; + + if (error) + err = (const gchar *) error->GetBufferPointer (); + + GST_ERROR ("could not compile source, hr: 0x%x, error detail %s", + (guint) hr, GST_STR_NULL (err)); + return 
FALSE; + } + + if (error) { + const gchar *err = (const gchar *) error->GetBufferPointer (); + + GST_DEBUG ("HLSL compiler warnings:\n%s\nShader code:\n%s", + GST_STR_NULL (err), GST_STR_NULL (shader_source)); + } + + *blob = ret.Detach (); + + return TRUE; +} + +gboolean +gst_d3d11_create_pixel_shader (GstD3D11Device * device, + const gchar * source, ID3D11PixelShader ** shader) +{ + ID3D11Device *device_handle; + HRESULT hr; + /* *INDENT-OFF* */ + ComPtr<ID3DBlob> ps_blob; + /* *INDENT-ON* */ + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); + g_return_val_if_fail (source != NULL, FALSE); + g_return_val_if_fail (shader != NULL, FALSE); + + if (!compile_shader (device, source, TRUE, &ps_blob)) { + GST_ERROR ("Failed to compile pixel shader"); + return FALSE; + } + + device_handle = gst_d3d11_device_get_device_handle (device); + hr = device_handle->CreatePixelShader (ps_blob->GetBufferPointer (), + ps_blob->GetBufferSize (), NULL, shader); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("could not create pixel shader, hr: 0x%x", (guint) hr); + return FALSE; + } + + return TRUE; +} + +gboolean +gst_d3d11_create_vertex_shader (GstD3D11Device * device, const gchar * source, + const D3D11_INPUT_ELEMENT_DESC * input_desc, guint desc_len, + ID3D11VertexShader ** shader, ID3D11InputLayout ** layout) +{ + ID3D11Device *device_handle; + HRESULT hr; + /* *INDENT-OFF* */ + ComPtr<ID3DBlob> vs_blob; + ComPtr<ID3D11VertexShader> vs; + ComPtr<ID3D11InputLayout> in_layout; + /* *INDENT-ON* */ + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), FALSE); + g_return_val_if_fail (source != NULL, FALSE); + g_return_val_if_fail (input_desc != NULL, FALSE); + g_return_val_if_fail (desc_len > 0, FALSE); + g_return_val_if_fail (shader != NULL, FALSE); + g_return_val_if_fail (layout != NULL, FALSE); + + if (!compile_shader (device, source, FALSE, &vs_blob)) { + GST_ERROR ("Failed to compile shader code"); + return FALSE; + } + + device_handle = 
gst_d3d11_device_get_device_handle (device); + hr = device_handle->CreateVertexShader (vs_blob->GetBufferPointer (), + vs_blob->GetBufferSize (), NULL, &vs); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("could not create vertex shader, hr: 0x%x", (guint) hr); + return FALSE; + } + + hr = device_handle->CreateInputLayout (input_desc, + desc_len, vs_blob->GetBufferPointer (), + vs_blob->GetBufferSize (), &in_layout); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR ("could not create input layout shader, hr: 0x%x", (guint) hr); + return FALSE; + } + + *shader = vs.Detach (); + *layout = in_layout.Detach (); + + return TRUE; +} + +struct _GstD3D11Quad +{ + GstD3D11Device *device; + ID3D11PixelShader *ps; + ID3D11VertexShader *vs; + ID3D11InputLayout *layout; + ID3D11Buffer *const_buffer[MAX_CONST_BUFFERS]; + guint num_const_buffers; + ID3D11Buffer *vertex_buffer; + guint vertex_stride; + ID3D11Buffer *index_buffer; + DXGI_FORMAT index_format; + guint index_count; + D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES]; + ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES]; + guint num_srv; + ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES]; + guint num_rtv; +}; + +GstD3D11Quad * +gst_d3d11_quad_new (GstD3D11Device * device, ID3D11PixelShader * pixel_shader, + ID3D11VertexShader * vertex_shader, ID3D11InputLayout * layout, + ID3D11Buffer ** const_buffers, guint num_const_buffers, + ID3D11Buffer * vertex_buffer, guint vertex_stride, + ID3D11Buffer * index_buffer, DXGI_FORMAT index_format, guint index_count) +{ + GstD3D11Quad *quad; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + g_return_val_if_fail (pixel_shader != NULL, NULL); + g_return_val_if_fail (vertex_shader != NULL, NULL); + g_return_val_if_fail (layout != NULL, NULL); + g_return_val_if_fail (num_const_buffers <= MAX_CONST_BUFFERS, NULL); + g_return_val_if_fail (vertex_buffer != NULL, NULL); + g_return_val_if_fail (vertex_stride > 0, NULL); + g_return_val_if_fail (index_buffer != NULL, 
NULL); + g_return_val_if_fail (index_format != DXGI_FORMAT_UNKNOWN, NULL); + + quad = g_new0 (GstD3D11Quad, 1); + + quad->device = (GstD3D11Device *) gst_object_ref (device); + quad->ps = pixel_shader; + quad->vs = vertex_shader; + quad->layout = layout; + quad->vertex_buffer = vertex_buffer; + quad->vertex_stride = vertex_stride; + quad->index_buffer = index_buffer; + quad->index_format = index_format; + quad->index_count = index_count; + + pixel_shader->AddRef (); + vertex_shader->AddRef (); + layout->AddRef (); + vertex_buffer->AddRef (); + index_buffer->AddRef (); + + if (num_const_buffers > 0) { + guint i; + + g_assert (const_buffers); + + for (i = 0; i < num_const_buffers; i++) { + quad->const_buffer[i] = const_buffers[i]; + quad->const_buffer[i]->AddRef (); + } + + quad->num_const_buffers = num_const_buffers; + } + + return quad; +} + +void +gst_d3d11_quad_free (GstD3D11Quad * quad) +{ + guint i; + + g_return_if_fail (quad != NULL); + + GST_D3D11_CLEAR_COM (quad->ps); + GST_D3D11_CLEAR_COM (quad->vs); + GST_D3D11_CLEAR_COM (quad->layout); + for (i = 0; i < quad->num_const_buffers; i++) + quad->const_buffer[i]->Release (); + GST_D3D11_CLEAR_COM (quad->vertex_buffer); + GST_D3D11_CLEAR_COM (quad->index_buffer); + + gst_clear_object (&quad->device); + g_free (quad); +} + +gboolean +gst_d3d11_draw_quad (GstD3D11Quad * quad, + D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES], guint num_viewport, + ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], guint num_srv, + ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES], guint num_rtv, + ID3D11BlendState * blend, gfloat blend_factor[4], + ID3D11SamplerState ** sampler, guint num_sampler) +{ + gboolean ret; + + g_return_val_if_fail (quad != NULL, FALSE); + + gst_d3d11_device_lock (quad->device); + ret = gst_d3d11_draw_quad_unlocked (quad, viewport, num_viewport, + srv, num_srv, rtv, num_rtv, blend, blend_factor, sampler, + num_sampler); + gst_d3d11_device_unlock (quad->device); + + return ret; +} + +gboolean 
+gst_d3d11_draw_quad_unlocked (GstD3D11Quad * quad, + D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES], guint num_viewport, + ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], guint num_srv, + ID3D11RenderTargetView * rtv[GST_VIDEO_MAX_PLANES], guint num_rtv, + ID3D11BlendState * blend, gfloat blend_factor[4], + ID3D11SamplerState ** sampler, guint num_sampler) +{ + ID3D11DeviceContext *context; + UINT offsets = 0; + ID3D11ShaderResourceView *clear_view[GST_VIDEO_MAX_PLANES] = { NULL, }; + ID3D11BlendState *blend_state = blend; + + g_return_val_if_fail (quad != NULL, FALSE); + g_return_val_if_fail (viewport != NULL, FALSE); + g_return_val_if_fail (num_viewport <= GST_VIDEO_MAX_PLANES, FALSE); + g_return_val_if_fail (rtv != NULL, FALSE); + g_return_val_if_fail (num_rtv <= GST_VIDEO_MAX_PLANES, FALSE); + + context = gst_d3d11_device_get_device_context_handle (quad->device); + + context->IASetPrimitiveTopology (D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST); + context->IASetInputLayout (quad->layout); + context->IASetVertexBuffers (0, 1, &quad->vertex_buffer, &quad->vertex_stride, + &offsets); + context->IASetIndexBuffer (quad->index_buffer, quad->index_format, 0); + + if (sampler) + context->PSSetSamplers (0, num_sampler, sampler); + context->VSSetShader (quad->vs, NULL, 0); + context->PSSetShader (quad->ps, NULL, 0); + context->RSSetViewports (num_viewport, viewport); + + if (quad->num_const_buffers) { + context->PSSetConstantBuffers (0, quad->num_const_buffers, + quad->const_buffer); + } + + if (srv) + context->PSSetShaderResources (0, num_srv, srv); + context->OMSetRenderTargets (num_rtv, rtv, NULL); + context->OMSetBlendState (blend_state, blend_factor, 0xffffffff); + + context->DrawIndexed (quad->index_count, 0, 0); + + if (srv) + context->PSSetShaderResources (0, num_srv, clear_view); + context->OMSetRenderTargets (0, NULL, NULL); + + return TRUE; +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11shader.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11shader.h
Changed
@@ -21,8 +21,8 @@ #define __GST_D3D11_SHADER_H__ #include <gst/gst.h> -#include "gstd3d11_fwd.h" #include <gst/video/video.h> +#include <gst/d3d11/gstd3d11.h> #include <d3dcompiler.h> @@ -47,10 +47,8 @@ ID3D11PixelShader * pixel_shader, ID3D11VertexShader * vertex_shader, ID3D11InputLayout * layout, - ID3D11SamplerState * sampler, - ID3D11BlendState * blend, - ID3D11DepthStencilState *depth_stencil, - ID3D11Buffer * const_buffer, + ID3D11Buffer ** const_buffers, + guint num_const_buffers, ID3D11Buffer * vertex_buffer, guint vertex_stride, ID3D11Buffer * index_buffer, @@ -66,7 +64,10 @@ guint num_srv, ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES], guint num_rtv, - ID3D11DepthStencilView *dsv); + ID3D11BlendState *blend, + gfloat blend_factor[4], + ID3D11SamplerState ** sampler, + guint num_sampler); gboolean gst_d3d11_draw_quad_unlocked (GstD3D11Quad * quad, D3D11_VIEWPORT viewport[GST_VIDEO_MAX_PLANES], @@ -75,7 +76,10 @@ guint num_srv, ID3D11RenderTargetView *rtv[GST_VIDEO_MAX_PLANES], guint num_rtv, - ID3D11DepthStencilView *dsv); + ID3D11BlendState *blend, + gfloat blend_factor[4], + ID3D11SamplerState ** sampler, + guint num_sampler); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11upload.cpp
Added
@@ -0,0 +1,566 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11upload + * @title: d3d11upload + * + * Upload video frames to Direct3D11 texture memory + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! d3d11upload ! d3d11videosinkelement + * ``` + * This pipeline will upload video test frames (system memory) into Direct3D11 + * textures, and d3d11videosinkelement will display the frames on screen. 
+ * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstd3d11upload.h" + +#include <string.h> + +GST_DEBUG_CATEGORY_STATIC (gst_d3d11_upload_debug); +#define GST_CAT_DEFAULT gst_d3d11_upload_debug + +static GstStaticCaps sink_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, + GST_D3D11_ALL_FORMATS) ";" + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY + "," GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS)); + +static GstStaticCaps src_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS) ";" + GST_VIDEO_CAPS_MAKE (GST_D3D11_ALL_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_ALL_FORMATS)); + +struct _GstD3D11Upload +{ + GstD3D11BaseFilter parent; + + GstBuffer *staging_buffer; +}; + +#define gst_d3d11_upload_parent_class parent_class +G_DEFINE_TYPE (GstD3D11Upload, gst_d3d11_upload, GST_TYPE_D3D11_BASE_FILTER); + +static void gst_d3d11_upload_dispose (GObject * object); +static gboolean gst_d3d11_upload_stop (GstBaseTransform * trans); +static gboolean gst_d3d11_upload_sink_event (GstBaseTransform * trans, + GstEvent * event); +static GstCaps *gst_d3d11_upload_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter); +static gboolean gst_d3d11_upload_propose_allocation (GstBaseTransform * trans, + 
GstQuery * decide_query, GstQuery * query); +static gboolean gst_d3d11_upload_decide_allocation (GstBaseTransform * trans, + GstQuery * query); +static GstFlowReturn gst_d3d11_upload_transform (GstBaseTransform * trans, + GstBuffer * inbuf, GstBuffer * outbuf); +static gboolean gst_d3d11_upload_set_info (GstD3D11BaseFilter * filter, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info); + +static void +gst_d3d11_upload_class_init (GstD3D11UploadClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + GstD3D11BaseFilterClass *bfilter_class = GST_D3D11_BASE_FILTER_CLASS (klass); + GstCaps *caps; + + gobject_class->dispose = gst_d3d11_upload_dispose; + + caps = gst_d3d11_get_updated_template_caps (&sink_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + caps = gst_d3d11_get_updated_template_caps (&src_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 uploader", "Filter/Video", + "Uploads data into Direct3D11 texture memory", + "Seungha Yang <seungha.yang@navercorp.com>"); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_upload_stop); + trans_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_upload_sink_event); + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_d3d11_upload_transform_caps); + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_upload_propose_allocation); + trans_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_upload_decide_allocation); + trans_class->transform = GST_DEBUG_FUNCPTR 
(gst_d3d11_upload_transform); + + bfilter_class->set_info = GST_DEBUG_FUNCPTR (gst_d3d11_upload_set_info); + + GST_DEBUG_CATEGORY_INIT (gst_d3d11_upload_debug, + "d3d11upload", 0, "d3d11upload Element"); +} + +static void +gst_d3d11_upload_init (GstD3D11Upload * upload) +{ +} + +static void +gst_d3d11_upload_dispose (GObject * object) +{ + GstD3D11Upload *self = GST_D3D11_UPLOAD (object); + + gst_clear_buffer (&self->staging_buffer); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static gboolean +gst_d3d11_upload_stop (GstBaseTransform * trans) +{ + GstD3D11Upload *self = GST_D3D11_UPLOAD (trans); + + gst_clear_buffer (&self->staging_buffer); + + return GST_BASE_TRANSFORM_CLASS (parent_class)->stop (trans); +} + +static gboolean +gst_d3d11_upload_sink_event (GstBaseTransform * trans, GstEvent * event) +{ + GstD3D11Upload *self = GST_D3D11_UPLOAD (trans); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_EOS: + /* We don't need to hold this staging buffer after eos */ + gst_clear_buffer (&self->staging_buffer); + break; + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->sink_event (trans, event); +} + +static GstCaps * +_set_caps_features (const GstCaps * caps, const gchar * feature_name) +{ + guint i, j, m, n; + GstCaps *tmp; + GstCapsFeatures *overlay_feature = + gst_caps_features_from_string + (GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION); + + tmp = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + GstCapsFeatures *features, *orig_features; + GstStructure *s = gst_caps_get_structure (caps, i); + + orig_features = gst_caps_get_features (caps, i); + features = gst_caps_features_new (feature_name, NULL); + + if (gst_caps_features_is_any (orig_features)) { + gst_caps_append_structure_full (tmp, gst_structure_copy (s), + gst_caps_features_copy (features)); + + if (!gst_caps_features_contains (features, + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION)) + gst_caps_features_add 
(features, + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION); + } else { + m = gst_caps_features_get_size (orig_features); + for (j = 0; j < m; j++) { + const gchar *feature = gst_caps_features_get_nth (orig_features, j); + + /* if we already have the features */ + if (gst_caps_features_contains (features, feature)) + continue; + + if (g_strcmp0 (feature, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY) == 0) + continue; + + if (gst_caps_features_contains (overlay_feature, feature)) { + gst_caps_features_add (features, feature); + } + } + } + + gst_caps_append_structure_full (tmp, gst_structure_copy (s), features); + } + + gst_caps_features_free (overlay_feature); + + return tmp; +} + +static GstCaps * +gst_d3d11_upload_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *result, *tmp; + + GST_DEBUG_OBJECT (trans, + "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, + (direction == GST_PAD_SINK) ? "sink" : "src"); + + if (direction == GST_PAD_SINK) { + tmp = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY); + tmp = gst_caps_merge (gst_caps_ref (caps), tmp); + } else { + GstCaps *newcaps; + tmp = gst_caps_ref (caps); + newcaps = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); + tmp = gst_caps_merge (tmp, newcaps); + } + + if (filter) { + result = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + } else { + result = tmp; + } + + GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, result); + + return result; +} + +static gboolean +gst_d3d11_upload_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstVideoInfo info; + GstBufferPool *pool; + GstCaps *caps; + guint size; + + if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, + decide_query, query)) + return FALSE; + + /* passthrough, we're 
done */ + if (decide_query == NULL) + return TRUE; + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE; + + if (!gst_video_info_from_caps (&info, caps)) + return FALSE; + + if (gst_query_get_n_allocation_pools (query) == 0) { + GstCapsFeatures *features; + GstStructure *config; + gboolean is_d3d11 = FALSE; + + features = gst_caps_get_features (caps, 0); + + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + GST_DEBUG_OBJECT (filter, "upstream support d3d11 memory"); + pool = gst_d3d11_buffer_pool_new (filter->device); + is_d3d11 = TRUE; + } else { + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + /* d3d11 pool does not support video alignment */ + if (!is_d3d11) { + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + } + + size = GST_VIDEO_INFO_SIZE (&info); + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + + if (!gst_buffer_pool_set_config (pool, config)) + goto config_failed; + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + if (is_d3d11) { + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, + nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + } + + gst_query_add_allocation_pool (query, pool, size, 0, 0); + gst_object_unref (pool); + } + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + gst_query_add_allocation_meta (query, + GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL); + + return TRUE; + + /* ERRORS */ +config_failed: + { + GST_ERROR_OBJECT (filter, "failed to set config"); + gst_object_unref (pool); + return FALSE; + } +} + +static gboolean +gst_d3d11_upload_decide_allocation (GstBaseTransform * trans, GstQuery * query) +{ + GstD3D11BaseFilter *filter 
= GST_D3D11_BASE_FILTER (trans); + GstCaps *outcaps = NULL; + GstBufferPool *pool = NULL; + guint size, min, max; + GstStructure *config; + gboolean update_pool = FALSE; + GstVideoInfo vinfo; + const GstD3D11Format *d3d11_format; + GstD3D11AllocationParams *d3d11_params; + guint bind_flags = 0; + guint i; + DXGI_FORMAT dxgi_format = DXGI_FORMAT_UNKNOWN; + UINT supported = 0; + HRESULT hr; + ID3D11Device *device_handle; + + gst_query_parse_allocation (query, &outcaps, NULL); + + if (!outcaps) + return FALSE; + + gst_video_info_from_caps (&vinfo, outcaps); + + d3d11_format = gst_d3d11_device_format_from_gst (filter->device, + GST_VIDEO_INFO_FORMAT (&vinfo)); + if (!d3d11_format) { + GST_ERROR_OBJECT (filter, "Unknown format caps %" GST_PTR_FORMAT, outcaps); + return FALSE; + } + + if (d3d11_format->dxgi_format == DXGI_FORMAT_UNKNOWN) { + dxgi_format = d3d11_format->resource_format[0]; + } else { + dxgi_format = d3d11_format->dxgi_format; + } + + device_handle = gst_d3d11_device_get_device_handle (filter->device); + hr = device_handle->CheckFormatSupport (dxgi_format, &supported); + if (gst_d3d11_result (hr, filter->device)) { + if ((supported & D3D11_FORMAT_SUPPORT_SHADER_SAMPLE) == + D3D11_FORMAT_SUPPORT_SHADER_SAMPLE) { + bind_flags |= D3D11_BIND_SHADER_RESOURCE; + } + + if ((supported & D3D11_FORMAT_SUPPORT_RENDER_TARGET) == + D3D11_FORMAT_SUPPORT_RENDER_TARGET) { + bind_flags |= D3D11_BIND_RENDER_TARGET; + } + } + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (pool) { + if (!GST_IS_D3D11_BUFFER_POOL (pool)) { + gst_clear_object (&pool); + } else { + GstD3D11BufferPool *dpool = GST_D3D11_BUFFER_POOL (pool); + if (dpool->device != filter->device) + gst_clear_object (&pool); + } + } + + update_pool = TRUE; + } else { + size = GST_VIDEO_INFO_SIZE (&vinfo); + min = max = 0; + } + + if (!pool) { + GST_DEBUG_OBJECT (trans, "create our pool"); + + pool = gst_d3d11_buffer_pool_new 
(filter->device); + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + + d3d11_params = gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (filter->device, &vinfo, + (GstD3D11AllocationFlags) 0, bind_flags); + } else { + /* Set bind flag */ + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&vinfo); i++) { + d3d11_params->desc[i].BindFlags |= bind_flags; + } + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + + gst_buffer_pool_set_config (pool, config); + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + + gst_object_unref (pool); + + return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, + query); +} + +static gboolean +gst_d3d11_upload_can_use_staging_buffer (GstD3D11Upload * self, + GstBuffer * outbuf) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (self); + ID3D11Device *device_handle = + gst_d3d11_device_get_device_handle (filter->device); + + if (!gst_d3d11_buffer_can_access_device (outbuf, device_handle)) + return FALSE; + + if (self->staging_buffer) + return TRUE; + + self->staging_buffer = gst_d3d11_allocate_staging_buffer_for (outbuf, + &filter->out_info, TRUE); + + if (!self->staging_buffer) { + GST_WARNING_OBJECT (self, "Couldn't allocate staging buffer"); + return FALSE; + } + + return TRUE; +} + +static GstFlowReturn +gst_d3d11_upload_transform 
(GstBaseTransform * trans, GstBuffer * inbuf, + GstBuffer * outbuf) +{ + GstD3D11BaseFilter *filter = GST_D3D11_BASE_FILTER (trans); + GstD3D11Upload *self = GST_D3D11_UPLOAD (trans); + GstVideoFrame in_frame, out_frame; + GstFlowReturn ret = GST_FLOW_OK; + gboolean use_staging_buf; + GstBuffer *target_outbuf = outbuf; + guint i; + + use_staging_buf = gst_d3d11_upload_can_use_staging_buffer (self, outbuf); + + if (use_staging_buf) { + GST_TRACE_OBJECT (self, "Copy input buffer to staging buffer"); + target_outbuf = self->staging_buffer; + } + + if (!gst_video_frame_map (&in_frame, &filter->in_info, inbuf, + (GstMapFlags) (GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) + goto invalid_buffer; + + if (!gst_video_frame_map (&out_frame, &filter->out_info, target_outbuf, + (GstMapFlags) (GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) { + gst_video_frame_unmap (&in_frame); + goto invalid_buffer; + } + + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&in_frame); i++) { + if (!gst_video_frame_copy_plane (&out_frame, &in_frame, i)) { + GST_ERROR_OBJECT (filter, "Couldn't copy plane %d", i); + ret = GST_FLOW_ERROR; + break; + } + } + + gst_video_frame_unmap (&out_frame); + gst_video_frame_unmap (&in_frame); + + /* Copy staging texture to d3d11 texture */ + if (use_staging_buf) { + if (!gst_d3d11_buffer_copy_into (outbuf, + self->staging_buffer, &filter->out_info)) { + GST_ERROR_OBJECT (self, "Cannot copy staging texture into texture"); + return GST_FLOW_ERROR; + } + } + + return ret; + + /* ERRORS */ +invalid_buffer: + { + GST_ELEMENT_WARNING (filter, CORE, NOT_IMPLEMENTED, (NULL), + ("invalid video buffer received")); + return GST_FLOW_ERROR; + } +} + +static gboolean +gst_d3d11_upload_set_info (GstD3D11BaseFilter * filter, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info) +{ + GstD3D11Upload *self = GST_D3D11_UPLOAD (filter); + + gst_clear_buffer (&self->staging_buffer); + + return TRUE; +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11upload.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11upload.h
Changed
@@ -24,24 +24,9 @@ G_BEGIN_DECLS -#define GST_TYPE_D3D11_UPLOAD (gst_d3d11_upload_get_type()) -#define GST_D3D11_UPLOAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_UPLOAD,GstD3D11Upload)) -#define GST_D3D11_UPLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_UPLOAD,GstD3D11UploadClass)) -#define GST_D3D11_UPLOAD_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_UPLOAD,GstD3D11UploadClass)) -#define GST_IS_D3D11_UPLOAD(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_UPLOAD)) -#define GST_IS_D3D11_UPLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_UPLOAD)) - -struct _GstD3D11Upload -{ - GstD3D11BaseFilter parent; -}; - -struct _GstD3D11UploadClass -{ - GstD3D11BaseFilterClass parent_class; -}; - -GType gst_d3d11_upload_get_type (void); +#define GST_TYPE_D3D11_UPLOAD (gst_d3d11_upload_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11Upload, + gst_d3d11_upload, GST, D3D11_UPLOAD, GstD3D11BaseFilter); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11videoprocessor.cpp
Added
@@ -0,0 +1,563 @@ +/* GStreamer + * Copyright (C) <2020> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstd3d11videoprocessor.h" +#include "gstd3d11pluginutils.h" + +#include <string.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_video_processor_debug); +#define GST_CAT_DEFAULT gst_d3d11_video_processor_debug + +#if (GST_D3D11_HEADER_VERSION >= 1 && GST_D3D11_DXGI_HEADER_VERSION >= 4) +#define HAVE_VIDEO_CONTEXT_ONE +#endif + +#if (GST_D3D11_HEADER_VERSION >= 4) && (GST_D3D11_DXGI_HEADER_VERSION >= 5) +#define HAVE_VIDEO_CONTEXT_TWO +#endif + +struct _GstD3D11VideoProcessor +{ + GstD3D11Device *device; + + ID3D11VideoDevice *video_device; + ID3D11VideoContext *video_context; +#ifdef HAVE_VIDEO_CONTEXT_ONE + ID3D11VideoContext1 *video_context1; +#endif +#ifdef HAVE_VIDEO_CONTEXT_TWO + ID3D11VideoContext2 *video_context2; +#endif + ID3D11VideoProcessor *processor; + ID3D11VideoProcessorEnumerator *enumerator; +#ifdef HAVE_VIDEO_CONTEXT_ONE + ID3D11VideoProcessorEnumerator1 *enumerator1; +#endif + D3D11_VIDEO_PROCESSOR_CAPS processor_caps; +}; + +GstD3D11VideoProcessor * +gst_d3d11_video_processor_new (GstD3D11Device * device, guint in_width, + guint in_height, 
guint out_width, guint out_height) + { + GstD3D11VideoProcessor *self; + ID3D11VideoDevice *video_device; + ID3D11VideoContext *video_context; + HRESULT hr; + D3D11_VIDEO_PROCESSOR_CONTENT_DESC desc; + + g_return_val_if_fail (GST_IS_D3D11_DEVICE (device), NULL); + + video_device = gst_d3d11_device_get_video_device_handle (device); + if (!video_device) { + GST_WARNING_OBJECT (device, "ID3D11VideoDevice is not available"); + return NULL; + } + + video_context = gst_d3d11_device_get_video_context_handle (device); + if (!video_context) { + GST_WARNING_OBJECT (device, "ID3D11VideoContext is not available"); + return NULL; + } + + memset (&desc, 0, sizeof (desc)); + + self = g_new0 (GstD3D11VideoProcessor, 1); + self->device = (GstD3D11Device *) gst_object_ref (device); + + self->video_device = video_device; + video_device->AddRef (); + + self->video_context = video_context; + video_context->AddRef (); + + /* FIXME: Add interlace support */ + desc.InputFrameFormat = D3D11_VIDEO_FRAME_FORMAT_PROGRESSIVE; + desc.InputWidth = in_width; + desc.InputHeight = in_height; + desc.OutputWidth = out_width; + desc.OutputHeight = out_height; + /* TODO: make option for this */ + desc.Usage = D3D11_VIDEO_USAGE_PLAYBACK_NORMAL; + + hr = self->video_device->CreateVideoProcessorEnumerator (&desc, + &self->enumerator); + if (!gst_d3d11_result (hr, device)) + goto fail; +#ifdef HAVE_VIDEO_CONTEXT_ONE + hr = self->enumerator->QueryInterface (IID_PPV_ARGS (&self->enumerator1)); + if (gst_d3d11_result (hr, device)) { + GST_DEBUG ("ID3D11VideoProcessorEnumerator1 interface available"); + } +#endif + + hr = self->enumerator->GetVideoProcessorCaps (&self->processor_caps); + if (!gst_d3d11_result (hr, device)) + goto fail; + + hr = self->video_device->CreateVideoProcessor (self->enumerator, 0, + &self->processor); + if (!gst_d3d11_result (hr, device)) + goto fail; + +#ifdef HAVE_VIDEO_CONTEXT_ONE + hr = self->video_context-> + QueryInterface (IID_PPV_ARGS (&self->video_context1)); + if 
(gst_d3d11_result (hr, device)) { + GST_DEBUG ("ID3D11VideoContext1 interface available"); + } +#endif +#ifdef HAVE_VIDEO_CONTEXT_TWO + hr = self->video_context-> + QueryInterface (IID_PPV_ARGS (&self->video_context2)); + if (gst_d3d11_result (hr, device)) { + GST_DEBUG ("ID3D11VideoContext2 interface available"); + } +#endif + + /* Setting up default options */ + gst_d3d11_device_lock (self->device); + /* We don't want auto processing by driver */ + self->video_context->VideoProcessorSetStreamAutoProcessingMode + (self->processor, 0, FALSE); + gst_d3d11_device_unlock (self->device); + + return self; + +fail: + gst_d3d11_video_processor_free (self); + + return NULL; +} + +void +gst_d3d11_video_processor_free (GstD3D11VideoProcessor * processor) +{ + g_return_if_fail (processor != NULL); + + GST_D3D11_CLEAR_COM (processor->video_device); + GST_D3D11_CLEAR_COM (processor->video_context); +#ifdef HAVE_VIDEO_CONTEXT_ONE + GST_D3D11_CLEAR_COM (processor->video_context1); +#endif +#ifdef HAVE_VIDEO_CONTEXT_TWO + GST_D3D11_CLEAR_COM (processor->video_context2); +#endif + GST_D3D11_CLEAR_COM (processor->processor); + GST_D3D11_CLEAR_COM (processor->enumerator); +#ifdef HAVE_VIDEO_CONTEXT_ONE + GST_D3D11_CLEAR_COM (processor->enumerator1); +#endif + + gst_clear_object (&processor->device); + g_free (processor); +} + +static gboolean +gst_d3d11_video_processor_supports_format (GstD3D11VideoProcessor * + self, DXGI_FORMAT format, gboolean is_input) +{ + HRESULT hr; + UINT flag = 0; + + hr = self->enumerator->CheckVideoProcessorFormat (format, &flag); + + if (!gst_d3d11_result (hr, self->device)) + return FALSE; + + if (is_input) { + /* D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_INPUT, missing in mingw header */ + if ((flag & 0x1) != 0) + return TRUE; + } else { + /* D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_OUTPUT, missing in mingw header */ + if ((flag & 0x2) != 0) + return TRUE; + } + + return FALSE; +} + +gboolean +gst_d3d11_video_processor_supports_input_format 
(GstD3D11VideoProcessor * + processor, DXGI_FORMAT format) +{ + g_return_val_if_fail (processor != NULL, FALSE); + + if (format == DXGI_FORMAT_UNKNOWN) + return FALSE; + + return gst_d3d11_video_processor_supports_format (processor, format, TRUE); +} + +gboolean +gst_d3d11_video_processor_supports_output_format (GstD3D11VideoProcessor * + processor, DXGI_FORMAT format) +{ + g_return_val_if_fail (processor != NULL, FALSE); + + if (format == DXGI_FORMAT_UNKNOWN) + return FALSE; + + return gst_d3d11_video_processor_supports_format (processor, format, FALSE); +} + +gboolean +gst_d3d11_video_processor_get_caps (GstD3D11VideoProcessor * processor, + D3D11_VIDEO_PROCESSOR_CAPS * caps) +{ + g_return_val_if_fail (processor != NULL, FALSE); + g_return_val_if_fail (caps != NULL, FALSE); + + *caps = processor->processor_caps; + + return TRUE; +} + +static void +video_processor_color_space_from_gst (GstD3D11VideoProcessor * self, + GstVideoColorimetry * color, D3D11_VIDEO_PROCESSOR_COLOR_SPACE * colorspace) +{ + /* D3D11_VIDEO_PROCESSOR_DEVICE_CAPS_xvYCC */ + UINT can_xvYCC = 2; + + /* 0: playback, 1: video processing */ + colorspace->Usage = 0; + + if (color->range == GST_VIDEO_COLOR_RANGE_0_255) { + colorspace->RGB_Range = 0; + colorspace->Nominal_Range = D3D11_VIDEO_PROCESSOR_NOMINAL_RANGE_0_255; + } else { + /* 16-235 */ + colorspace->RGB_Range = 1; + colorspace->Nominal_Range = D3D11_VIDEO_PROCESSOR_NOMINAL_RANGE_16_235; + } + + if (color->matrix == GST_VIDEO_COLOR_MATRIX_BT601) { + colorspace->YCbCr_Matrix = 0; + } else { + /* BT.709, no other options (such as BT2020) */ + colorspace->YCbCr_Matrix = 1; + } + + if ((self->processor_caps.DeviceCaps & can_xvYCC) == can_xvYCC) { + colorspace->YCbCr_xvYCC = 1; + } else { + colorspace->YCbCr_xvYCC = 0; + } +} + +gboolean +gst_d3d11_video_processor_set_input_color_space (GstD3D11VideoProcessor * + processor, GstVideoColorimetry * color) +{ + D3D11_VIDEO_PROCESSOR_COLOR_SPACE color_space; + + g_return_val_if_fail (processor != 
NULL, FALSE); + g_return_val_if_fail (color != NULL, FALSE); + + video_processor_color_space_from_gst (processor, color, &color_space); + + processor->video_context->VideoProcessorSetStreamColorSpace + (processor->processor, 0, &color_space); + + return TRUE; +} + +gboolean +gst_d3d11_video_processor_set_output_color_space (GstD3D11VideoProcessor * + processor, GstVideoColorimetry * color) +{ + D3D11_VIDEO_PROCESSOR_COLOR_SPACE color_space; + + g_return_val_if_fail (processor != NULL, FALSE); + g_return_val_if_fail (color != NULL, FALSE); + + video_processor_color_space_from_gst (processor, color, &color_space); + + processor->video_context->VideoProcessorSetOutputColorSpace + (processor->processor, &color_space); + + return TRUE; +} + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) +gboolean +gst_d3d11_video_processor_check_format_conversion (GstD3D11VideoProcessor * + processor, DXGI_FORMAT in_format, DXGI_COLOR_SPACE_TYPE in_color_space, + DXGI_FORMAT out_format, DXGI_COLOR_SPACE_TYPE out_color_space) +{ +#ifdef HAVE_VIDEO_CONTEXT_ONE + HRESULT hr; + BOOL supported = TRUE; + + g_return_val_if_fail (processor != NULL, FALSE); + + if (!processor->enumerator1) + return FALSE; + + hr = processor->enumerator1->CheckVideoProcessorFormatConversion + (in_format, in_color_space, out_format, out_color_space, &supported); + if (!gst_d3d11_result (hr, processor->device)) { + GST_WARNING ("Failed to check conversion support"); + return FALSE; + } + + return supported; +#endif + + return FALSE; +} + +gboolean +gst_d3d11_video_processor_set_input_dxgi_color_space (GstD3D11VideoProcessor * + processor, DXGI_COLOR_SPACE_TYPE color_space) +{ + g_return_val_if_fail (processor != NULL, FALSE); + +#ifdef HAVE_VIDEO_CONTEXT_ONE + if (processor->video_context1) { + processor->video_context1->VideoProcessorSetStreamColorSpace1 + (processor->processor, 0, color_space); + return TRUE; + } +#endif + + return FALSE; +} + +gboolean +gst_d3d11_video_processor_set_output_dxgi_color_space 
(GstD3D11VideoProcessor * + processor, DXGI_COLOR_SPACE_TYPE color_space) +{ + g_return_val_if_fail (processor != NULL, FALSE); + +#ifdef HAVE_VIDEO_CONTEXT_ONE + if (processor->video_context1) { + processor->video_context1->VideoProcessorSetOutputColorSpace1 + (processor->processor, color_space); + return TRUE; + } +#endif + + return FALSE; +} +#endif + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) +/* D3D11_VIDEO_PROCESSOR_FEATURE_CAPS_METADATA_HDR10 + * missing in mingw header */ +#define FEATURE_CAPS_METADATA_HDR10 (0x800) + +gboolean +gst_d3d11_video_processor_set_input_hdr10_metadata (GstD3D11VideoProcessor * + processor, DXGI_HDR_METADATA_HDR10 * hdr10_meta) +{ + g_return_val_if_fail (processor != NULL, FALSE); + +#ifdef HAVE_VIDEO_CONTEXT_TWO + if (processor->video_context2 && (processor->processor_caps.FeatureCaps & + FEATURE_CAPS_METADATA_HDR10)) { + if (hdr10_meta) { + processor->video_context2->VideoProcessorSetStreamHDRMetaData + (processor->processor, 0, + DXGI_HDR_METADATA_TYPE_HDR10, sizeof (DXGI_HDR_METADATA_HDR10), + hdr10_meta); + } else { + processor->video_context2->VideoProcessorSetStreamHDRMetaData + (processor->processor, 0, DXGI_HDR_METADATA_TYPE_NONE, 0, NULL); + } + + return TRUE; + } +#endif + + return FALSE; +} + +gboolean +gst_d3d11_video_processor_set_output_hdr10_metadata (GstD3D11VideoProcessor * + processor, DXGI_HDR_METADATA_HDR10 * hdr10_meta) +{ + g_return_val_if_fail (processor != NULL, FALSE); + +#ifdef HAVE_VIDEO_CONTEXT_TWO + if (processor->video_context2 && (processor->processor_caps.FeatureCaps & + FEATURE_CAPS_METADATA_HDR10)) { + if (hdr10_meta) { + processor->video_context2->VideoProcessorSetOutputHDRMetaData + (processor->processor, DXGI_HDR_METADATA_TYPE_HDR10, + sizeof (DXGI_HDR_METADATA_HDR10), hdr10_meta); + } else { + processor->video_context2->VideoProcessorSetOutputHDRMetaData + (processor->processor, DXGI_HDR_METADATA_TYPE_NONE, 0, NULL); + } + + return TRUE; + } +#endif + + return FALSE; +} +#endif + +gboolean 
+gst_d3d11_video_processor_create_input_view (GstD3D11VideoProcessor * processor, + D3D11_VIDEO_PROCESSOR_INPUT_VIEW_DESC * desc, ID3D11Resource * resource, + ID3D11VideoProcessorInputView ** view) +{ + HRESULT hr; + + g_return_val_if_fail (processor != NULL, FALSE); + g_return_val_if_fail (desc != NULL, FALSE); + g_return_val_if_fail (resource != NULL, FALSE); + g_return_val_if_fail (view != NULL, FALSE); + + hr = processor->video_device->CreateVideoProcessorInputView (resource, + processor->enumerator, desc, view); + if (!gst_d3d11_result (hr, processor->device)) + return FALSE; + + return TRUE; +} + +ID3D11VideoProcessorInputView * +gst_d3d11_video_processor_get_input_view (GstD3D11VideoProcessor * processor, + GstD3D11Memory * mem) +{ + return gst_d3d11_memory_get_processor_input_view (mem, + processor->video_device, processor->enumerator); +} + +gboolean +gst_d3d11_video_processor_create_output_view (GstD3D11VideoProcessor * + processor, D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC * desc, + ID3D11Resource * resource, ID3D11VideoProcessorOutputView ** view) +{ + HRESULT hr; + + g_return_val_if_fail (processor != NULL, FALSE); + g_return_val_if_fail (desc != NULL, FALSE); + g_return_val_if_fail (resource != NULL, FALSE); + g_return_val_if_fail (view != NULL, FALSE); + + hr = processor->video_device->CreateVideoProcessorOutputView + (resource, processor->enumerator, desc, view); + if (!gst_d3d11_result (hr, processor->device)) + return FALSE; + + return TRUE; +} + +ID3D11VideoProcessorOutputView * +gst_d3d11_video_processor_get_output_view (GstD3D11VideoProcessor * + processor, GstD3D11Memory * mem) +{ + return gst_d3d11_memory_get_processor_output_view (mem, + processor->video_device, processor->enumerator); +} + +gboolean +gst_d3d11_video_processor_render (GstD3D11VideoProcessor * processor, + RECT * in_rect, ID3D11VideoProcessorInputView * in_view, + RECT * out_rect, ID3D11VideoProcessorOutputView * out_view) +{ + gboolean ret; + + g_return_val_if_fail (processor 
!= NULL, FALSE); + g_return_val_if_fail (in_view != NULL, FALSE); + g_return_val_if_fail (out_view != NULL, FALSE); + + gst_d3d11_device_lock (processor->device); + ret = gst_d3d11_video_processor_render_unlocked (processor, in_rect, in_view, + out_rect, out_view); + gst_d3d11_device_unlock (processor->device); + + return ret; +} + +gboolean +gst_d3d11_video_processor_render_unlocked (GstD3D11VideoProcessor * processor, + RECT * in_rect, ID3D11VideoProcessorInputView * in_view, + RECT * out_rect, ID3D11VideoProcessorOutputView * out_view) +{ + HRESULT hr; + D3D11_VIDEO_PROCESSOR_STREAM stream = { 0, }; + ID3D11VideoContext *context; + ID3D11VideoProcessor *proc; + + g_return_val_if_fail (processor != NULL, FALSE); + g_return_val_if_fail (in_view != NULL, FALSE); + g_return_val_if_fail (out_view != NULL, FALSE); + + stream.Enable = TRUE; + stream.pInputSurface = in_view; + context = processor->video_context; + proc = processor->processor; + + if (in_rect) { + context->VideoProcessorSetStreamSourceRect (proc, 0, TRUE, in_rect); + } else { + context->VideoProcessorSetStreamSourceRect (proc, 0, FALSE, NULL); + } + + if (out_rect) { + context->VideoProcessorSetStreamDestRect (proc, 0, TRUE, out_rect); + context->VideoProcessorSetOutputTargetRect (proc, TRUE, out_rect); + } else { + context->VideoProcessorSetStreamDestRect (proc, 0, FALSE, NULL); + context->VideoProcessorSetOutputTargetRect (proc, FALSE, NULL); + } + + hr = context->VideoProcessorBlt (proc, out_view, 0, 1, &stream); + if (!gst_d3d11_result (hr, processor->device)) + return FALSE; + + return TRUE; +} + +gboolean +gst_d3d11_video_processor_check_bind_flags_for_input_view (guint bind_flags) +{ + static const guint compatible_flags = (D3D11_BIND_DECODER | + D3D11_BIND_VIDEO_ENCODER | D3D11_BIND_RENDER_TARGET | + D3D11_BIND_UNORDERED_ACCESS); + + if (bind_flags == 0) + return TRUE; + + if ((bind_flags & compatible_flags) != 0) + return TRUE; + + return FALSE; +} + +gboolean 
+gst_d3d11_video_processor_check_bind_flags_for_output_view (guint bind_flags) +{ + if ((bind_flags & D3D11_BIND_RENDER_TARGET) == D3D11_BIND_RENDER_TARGET) + return TRUE; + + return FALSE; +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11videoprocessor.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11videoprocessor.h
Changed
@@ -22,15 +22,12 @@ #include <gst/gst.h> #include <gst/video/video.h> -#include "gstd3d11_fwd.h" +#include <gst/d3d11/gstd3d11.h> G_BEGIN_DECLS typedef struct _GstD3D11VideoProcessor GstD3D11VideoProcessor; -GQuark gst_d3d11_video_processor_input_view_quark (void); -GQuark gst_d3d11_video_processor_output_view_quark (void); - GstD3D11VideoProcessor * gst_d3d11_video_processor_new (GstD3D11Device * device, guint in_width, guint in_height, @@ -54,7 +51,13 @@ gboolean gst_d3d11_video_processor_set_output_color_space (GstD3D11VideoProcessor * processor, GstVideoColorimetry * color); -#if (DXGI_HEADER_VERSION >= 4) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) +gboolean gst_d3d11_video_processor_check_format_conversion (GstD3D11VideoProcessor * processor, + DXGI_FORMAT in_format, + DXGI_COLOR_SPACE_TYPE in_color_space, + DXGI_FORMAT out_format, + DXGI_COLOR_SPACE_TYPE out_color_space); + gboolean gst_d3d11_video_processor_set_input_dxgi_color_space (GstD3D11VideoProcessor * processor, DXGI_COLOR_SPACE_TYPE color_space); @@ -62,7 +65,7 @@ DXGI_COLOR_SPACE_TYPE color_space); #endif -#if (DXGI_HEADER_VERSION >= 5) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) gboolean gst_d3d11_video_processor_set_input_hdr10_metadata (GstD3D11VideoProcessor * processor, DXGI_HDR_METADATA_HDR10 * hdr10_meta); @@ -75,14 +78,16 @@ ID3D11Resource *resource, ID3D11VideoProcessorInputView ** view); +ID3D11VideoProcessorInputView * gst_d3d11_video_processor_get_input_view (GstD3D11VideoProcessor * processor, + GstD3D11Memory *mem); + gboolean gst_d3d11_video_processor_create_output_view (GstD3D11VideoProcessor * processor, D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC * desc, ID3D11Resource *resource, ID3D11VideoProcessorOutputView ** view); -void gst_d3d11_video_processor_input_view_release (ID3D11VideoProcessorInputView * view); - -void gst_d3d11_video_processor_output_view_release (ID3D11VideoProcessorOutputView * view); +ID3D11VideoProcessorOutputView * gst_d3d11_video_processor_get_output_view 
(GstD3D11VideoProcessor * processor, + GstD3D11Memory *mem); gboolean gst_d3d11_video_processor_render (GstD3D11VideoProcessor * processor, RECT *in_rect,
View file
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11videosink.cpp
Added
@@ -0,0 +1,1402 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11videosink + * @title: d3d11videosink + * + * Direct3D11 based video render element + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! d3d11upload ! 
d3d11videosink + * ``` + * This pipeline will display test video stream on screen via d3d11videosink + * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11videosink.h" +#include "gstd3d11videoprocessor.h" +#include "gstd3d11pluginutils.h" +#include <string> + +#if GST_D3D11_WINAPI_APP +#include "gstd3d11window_corewindow.h" +#include "gstd3d11window_swapchainpanel.h" +#endif +#if (!GST_D3D11_WINAPI_ONLY_APP) +#include "gstd3d11window_win32.h" +#endif +#include "gstd3d11window_dummy.h" + +enum +{ + PROP_0, + PROP_ADAPTER, + PROP_FORCE_ASPECT_RATIO, + PROP_ENABLE_NAVIGATION_EVENTS, + PROP_FULLSCREEN_TOGGLE_MODE, + PROP_FULLSCREEN, + PROP_DRAW_ON_SHARED_TEXTURE, +}; + +#define DEFAULT_ADAPTER -1 +#define DEFAULT_FORCE_ASPECT_RATIO TRUE +#define DEFAULT_ENABLE_NAVIGATION_EVENTS TRUE +#define DEFAULT_FULLSCREEN_TOGGLE_MODE GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_NONE +#define DEFAULT_FULLSCREEN FALSE +#define DEFAULT_DRAW_ON_SHARED_TEXTURE FALSE + +enum +{ + /* signals */ + SIGNAL_BEGIN_DRAW, + + /* actions */ + SIGNAL_DRAW, + + LAST_SIGNAL +}; + +static guint gst_d3d11_video_sink_signals[LAST_SIGNAL] = { 0, }; + +static GstStaticCaps pad_template_caps = + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY, GST_D3D11_SINK_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SINK_FORMATS) ";" + GST_VIDEO_CAPS_MAKE (GST_D3D11_SINK_FORMATS) "; " + GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY "," + GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, + GST_D3D11_SINK_FORMATS)); + +GST_DEBUG_CATEGORY (d3d11_video_sink_debug); +#define GST_CAT_DEFAULT d3d11_video_sink_debug + +struct _GstD3D11VideoSink +{ + GstVideoSink parent; + GstD3D11Device *device; + GstD3D11Window *window; + gint video_width; + gint video_height; + + GstVideoInfo info; + + 
guintptr window_id; + + gboolean caps_updated; + + /* properties */ + gint adapter; + gboolean force_aspect_ratio; + gboolean enable_navigation_events; + GstD3D11WindowFullscreenToggleMode fullscreen_toggle_mode; + gboolean fullscreen; + gboolean draw_on_shared_texture; + + /* saved render rectangle until we have a window */ + GstVideoRectangle render_rect; + gboolean pending_render_rect; + + GstBufferPool *fallback_pool; + gboolean have_video_processor; + gboolean processor_in_use; + + /* For drawing on user texture */ + gboolean drawing; + GstBuffer *current_buffer; + GRecMutex draw_lock; + + gchar *title; +}; + +static void gst_d3d11_videosink_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec); +static void gst_d3d11_videosink_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec); +static void gst_d3d11_video_sink_finalize (GObject * object); +static gboolean +gst_d3d11_video_sink_draw_action (GstD3D11VideoSink * self, + gpointer shared_handle, guint texture_misc_flags, guint64 acquire_key, + guint64 release_key); + +static void +gst_d3d11_video_sink_video_overlay_init (GstVideoOverlayInterface * iface); +static void +gst_d3d11_video_sink_navigation_init (GstNavigationInterface * iface); + +static void gst_d3d11_video_sink_set_context (GstElement * element, + GstContext * context); +static GstCaps *gst_d3d11_video_sink_get_caps (GstBaseSink * sink, + GstCaps * filter); +static gboolean gst_d3d11_video_sink_set_caps (GstBaseSink * sink, + GstCaps * caps); + +static gboolean gst_d3d11_video_sink_start (GstBaseSink * sink); +static gboolean gst_d3d11_video_sink_stop (GstBaseSink * sink); +static gboolean gst_d3d11_video_sink_propose_allocation (GstBaseSink * sink, + GstQuery * query); +static gboolean gst_d3d11_video_sink_query (GstBaseSink * sink, + GstQuery * query); +static gboolean gst_d3d11_video_sink_unlock (GstBaseSink * sink); +static gboolean gst_d3d11_video_sink_unlock_stop 
(GstBaseSink * sink); +static gboolean gst_d3d11_video_sink_event (GstBaseSink * sink, + GstEvent * event); + +static GstFlowReturn +gst_d3d11_video_sink_show_frame (GstVideoSink * sink, GstBuffer * buf); +static gboolean gst_d3d11_video_sink_prepare_window (GstD3D11VideoSink * self); + +#define gst_d3d11_video_sink_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstD3D11VideoSink, gst_d3d11_video_sink, + GST_TYPE_VIDEO_SINK, + G_IMPLEMENT_INTERFACE (GST_TYPE_VIDEO_OVERLAY, + gst_d3d11_video_sink_video_overlay_init); + G_IMPLEMENT_INTERFACE (GST_TYPE_NAVIGATION, + gst_d3d11_video_sink_navigation_init); + GST_DEBUG_CATEGORY_INIT (d3d11_video_sink_debug, + "d3d11videosink", 0, "Direct3D11 Video Sink")); + +static void +gst_d3d11_video_sink_class_init (GstD3D11VideoSinkClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseSinkClass *basesink_class = GST_BASE_SINK_CLASS (klass); + GstVideoSinkClass *videosink_class = GST_VIDEO_SINK_CLASS (klass); + GstCaps *caps; + + gobject_class->set_property = gst_d3d11_videosink_set_property; + gobject_class->get_property = gst_d3d11_videosink_get_property; + gobject_class->finalize = gst_d3d11_video_sink_finalize; + + g_object_class_install_property (gobject_class, PROP_ADAPTER, + g_param_spec_int ("adapter", "Adapter", + "Adapter index for creating device (-1 for default)", + -1, G_MAXINT32, DEFAULT_ADAPTER, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_FORCE_ASPECT_RATIO, + g_param_spec_boolean ("force-aspect-ratio", + "Force aspect ratio", + "When enabled, scaling will respect original aspect ratio", + DEFAULT_FORCE_ASPECT_RATIO, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_ENABLE_NAVIGATION_EVENTS, + g_param_spec_boolean ("enable-navigation-events", + 
"Enable navigation events", + "When enabled, navigation events are sent upstream", + DEFAULT_ENABLE_NAVIGATION_EVENTS, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_FULLSCREEN_TOGGLE_MODE, + g_param_spec_flags ("fullscreen-toggle-mode", + "Full screen toggle mode", + "Full screen toggle mode used to trigger fullscreen mode change", + GST_D3D11_WINDOW_TOGGLE_MODE_GET_TYPE, DEFAULT_FULLSCREEN_TOGGLE_MODE, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + g_object_class_install_property (gobject_class, PROP_FULLSCREEN, + g_param_spec_boolean ("fullscreen", + "fullscreen", + "Ignored when \"fullscreen-toggle-mode\" does not include \"property\"", + DEFAULT_FULLSCREEN, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + + /** + * GstD3D11VideoSink:draw-on-shared-texture: + * + * Instruct the sink to draw on a shared texture provided by user. + * User must watch #d3d11videosink::begin-draw signal and should call + * #d3d11videosink::draw method on the #d3d11videosink::begin-draw + * signal handler. + * + * Currently supported formats for user texture are: + * - DXGI_FORMAT_R8G8B8A8_UNORM + * - DXGI_FORMAT_B8G8R8A8_UNORM + * - DXGI_FORMAT_R10G10B10A2_UNORM + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_DRAW_ON_SHARED_TEXTURE, + g_param_spec_boolean ("draw-on-shared-texture", + "Draw on shared texture", + "Draw on user provided shared texture instead of window. " + "When enabled, user can pass application's own texture to sink " + "by using \"draw\" action signal on \"begin-draw\" signal handler, " + "so that sink can draw video data on application's texture. 
" + "Supported texture formats for user texture are " + "DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_FORMAT_B8G8R8A8_UNORM, and " + "DXGI_FORMAT_R10G10B10A2_UNORM.", + DEFAULT_DRAW_ON_SHARED_TEXTURE, + (GParamFlags) (G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS))); + + /** + * GstD3D11VideoSink::begin-draw: + * @videosink: the #d3d11videosink + * + * Emitted when sink has a texture to draw. Application needs to invoke + * #d3d11videosink::draw action signal before returning from + * #d3d11videosink::begin-draw signal handler. + * + * Since: 1.20 + */ + gst_d3d11_video_sink_signals[SIGNAL_BEGIN_DRAW] = + g_signal_new ("begin-draw", G_TYPE_FROM_CLASS (klass), G_SIGNAL_RUN_LAST, + G_STRUCT_OFFSET (GstD3D11VideoSinkClass, begin_draw), + NULL, NULL, NULL, G_TYPE_NONE, 0, G_TYPE_NONE); + + /** + * GstD3D11VideoSink::draw: + * @videosink: the #d3d11videosink + * @shard_handle: a pointer to HANDLE + * @texture_misc_flags: a D3D11_RESOURCE_MISC_FLAG value + * @acquire_key: a key value used for IDXGIKeyedMutex::AcquireSync + * @release_key: a key value used for IDXGIKeyedMutex::ReleaseSync + * + * Draws on a shared texture. @shard_handle must be a valid pointer to + * a HANDLE which was obtained via IDXGIResource::GetSharedHandle or + * IDXGIResource1::CreateSharedHandle. + * + * If the texture was created with D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX flag, + * caller must specify valid @acquire_key and @release_key. + * Otherwise (i.e., created with D3D11_RESOURCE_MISC_SHARED flag), + * @acquire_key and @release_key will be ignored. 
+ * + * Since: 1.20 + */ + gst_d3d11_video_sink_signals[SIGNAL_DRAW] = + g_signal_new ("draw", G_TYPE_FROM_CLASS (klass), + (GSignalFlags) (G_SIGNAL_RUN_LAST | G_SIGNAL_ACTION), + G_STRUCT_OFFSET (GstD3D11VideoSinkClass, draw), NULL, NULL, NULL, + G_TYPE_BOOLEAN, 4, G_TYPE_POINTER, G_TYPE_UINT, G_TYPE_UINT64, + G_TYPE_UINT64); + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_set_context); + + gst_element_class_set_static_metadata (element_class, + "Direct3D11 video sink", "Sink/Video", + "A Direct3D11 based videosink", + "Seungha Yang <seungha.yang@navercorp.com>"); + + caps = gst_d3d11_get_updated_template_caps (&pad_template_caps); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, caps)); + gst_caps_unref (caps); + + basesink_class->get_caps = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_get_caps); + basesink_class->set_caps = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_set_caps); + basesink_class->start = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_start); + basesink_class->stop = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_stop); + basesink_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_propose_allocation); + basesink_class->query = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_query); + basesink_class->unlock = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_unlock); + basesink_class->unlock_stop = + GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_unlock_stop); + basesink_class->event = GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_event); + + videosink_class->show_frame = + GST_DEBUG_FUNCPTR (gst_d3d11_video_sink_show_frame); + + klass->draw = gst_d3d11_video_sink_draw_action; + + gst_type_mark_as_plugin_api (GST_D3D11_WINDOW_TOGGLE_MODE_GET_TYPE, + (GstPluginAPIFlags) 0); +} + +static void +gst_d3d11_video_sink_init (GstD3D11VideoSink * self) +{ + self->adapter = DEFAULT_ADAPTER; + self->force_aspect_ratio = DEFAULT_FORCE_ASPECT_RATIO; + self->enable_navigation_events = 
DEFAULT_ENABLE_NAVIGATION_EVENTS; + self->fullscreen_toggle_mode = DEFAULT_FULLSCREEN_TOGGLE_MODE; + self->fullscreen = DEFAULT_FULLSCREEN; + self->draw_on_shared_texture = DEFAULT_DRAW_ON_SHARED_TEXTURE; + + g_rec_mutex_init (&self->draw_lock); +} + +static void +gst_d3d11_videosink_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (object); + + GST_OBJECT_LOCK (self); + switch (prop_id) { + case PROP_ADAPTER: + self->adapter = g_value_get_int (value); + break; + case PROP_FORCE_ASPECT_RATIO: + self->force_aspect_ratio = g_value_get_boolean (value); + if (self->window) + g_object_set (self->window, + "force-aspect-ratio", self->force_aspect_ratio, NULL); + break; + case PROP_ENABLE_NAVIGATION_EVENTS: + self->enable_navigation_events = g_value_get_boolean (value); + if (self->window) { + g_object_set (self->window, + "enable-navigation-events", self->enable_navigation_events, NULL); + } + break; + case PROP_FULLSCREEN_TOGGLE_MODE: + self->fullscreen_toggle_mode = + (GstD3D11WindowFullscreenToggleMode) g_value_get_flags (value); + if (self->window) { + g_object_set (self->window, + "fullscreen-toggle-mode", self->fullscreen_toggle_mode, NULL); + } + break; + case PROP_FULLSCREEN: + self->fullscreen = g_value_get_boolean (value); + if (self->window) { + g_object_set (self->window, "fullscreen", self->fullscreen, NULL); + } + break; + case PROP_DRAW_ON_SHARED_TEXTURE: + self->draw_on_shared_texture = g_value_get_boolean (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + GST_OBJECT_UNLOCK (self); +} + +static void +gst_d3d11_videosink_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (object); + + switch (prop_id) { + case PROP_ADAPTER: + g_value_set_int (value, self->adapter); + break; + case PROP_FORCE_ASPECT_RATIO: + 
g_value_set_boolean (value, self->force_aspect_ratio); + break; + case PROP_ENABLE_NAVIGATION_EVENTS: + g_value_set_boolean (value, self->enable_navigation_events); + break; + case PROP_FULLSCREEN_TOGGLE_MODE: + g_value_set_flags (value, self->fullscreen_toggle_mode); + break; + case PROP_FULLSCREEN: + if (self->window) { + g_object_get_property (G_OBJECT (self->window), pspec->name, value); + } else { + g_value_set_boolean (value, self->fullscreen); + } + break; + case PROP_DRAW_ON_SHARED_TEXTURE: + g_value_set_boolean (value, self->draw_on_shared_texture); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_d3d11_video_sink_finalize (GObject * object) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (object); + + g_rec_mutex_clear (&self->draw_lock); + g_free (self->title); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_video_sink_set_context (GstElement * element, GstContext * context) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (element); + + gst_d3d11_handle_set_context (element, context, self->adapter, &self->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static GstCaps * +gst_d3d11_video_sink_get_caps (GstBaseSink * sink, GstCaps * filter) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + GstCaps *caps = NULL; + + caps = gst_pad_get_pad_template_caps (GST_VIDEO_SINK_PAD (sink)); + + if (self->device) { + gboolean is_hardware = FALSE; + + g_object_get (self->device, "hardware", &is_hardware, NULL); + + /* In case of WARP device, conversion via shader would be less efficient than + * upstream videoconvert. 
Allow native formats in this case */ + if (!is_hardware) { + GValue format_list = G_VALUE_INIT; + GValue format = G_VALUE_INIT; + + g_value_init (&format_list, GST_TYPE_LIST); + g_value_init (&format, G_TYPE_STRING); + + g_value_set_string (&format, "RGBA"); + gst_value_list_append_and_take_value (&format_list, &format); + + format = G_VALUE_INIT; + g_value_init (&format, G_TYPE_STRING); + g_value_set_string (&format, "BGRA"); + gst_value_list_append_and_take_value (&format_list, &format); + + caps = gst_caps_make_writable (caps); + gst_caps_set_value (caps, "format", &format_list); + g_value_unset (&format_list); + } + } + + if (filter) { + GstCaps *isect; + isect = gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = isect; + } + + return caps; +} + +static gboolean +gst_d3d11_video_sink_set_caps (GstBaseSink * sink, GstCaps * caps) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + GST_DEBUG_OBJECT (self, "set caps %" GST_PTR_FORMAT, caps); + + /* We will update window on show_frame() */ + self->caps_updated = TRUE; + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_update_window (GstD3D11VideoSink * self, GstCaps * caps) +{ + gint video_width, video_height; + gint video_par_n, video_par_d; /* video's PAR */ + gint display_par_n = 1, display_par_d = 1; /* display's PAR */ + guint num, den; + GError *error = NULL; + + GST_DEBUG_OBJECT (self, "Updating window with caps %" GST_PTR_FORMAT, caps); + + self->caps_updated = FALSE; + + if (!gst_d3d11_video_sink_prepare_window (self)) + goto no_window; + + if (!gst_video_info_from_caps (&self->info, caps)) + goto invalid_format; + + video_width = GST_VIDEO_INFO_WIDTH (&self->info); + video_height = GST_VIDEO_INFO_HEIGHT (&self->info); + video_par_n = GST_VIDEO_INFO_PAR_N (&self->info); + video_par_d = GST_VIDEO_INFO_PAR_D (&self->info); + + /* get aspect ratio from caps if it's present, and + * convert video width and height to a display width and 
height + * using wd / hd = wv / hv * PARv / PARd */ + + /* TODO: Get display PAR */ + + if (!gst_video_calculate_display_ratio (&num, &den, video_width, + video_height, video_par_n, video_par_d, display_par_n, display_par_d)) + goto no_disp_ratio; + + GST_DEBUG_OBJECT (self, + "video width/height: %dx%d, calculated display ratio: %d/%d format: %s", + video_width, video_height, num, den, + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (&self->info))); + + /* now find a width x height that respects this display ratio. + * prefer those that have one of w/h the same as the incoming video + * using wd / hd = num / den + */ + + /* start with same height, because of interlaced video + * check hd / den is an integer scale factor, and scale wd with the PAR + */ + if (video_height % den == 0) { + GST_DEBUG_OBJECT (self, "keeping video height"); + GST_VIDEO_SINK_WIDTH (self) = (guint) + gst_util_uint64_scale_int (video_height, num, den); + GST_VIDEO_SINK_HEIGHT (self) = video_height; + } else if (video_width % num == 0) { + GST_DEBUG_OBJECT (self, "keeping video width"); + GST_VIDEO_SINK_WIDTH (self) = video_width; + GST_VIDEO_SINK_HEIGHT (self) = (guint) + gst_util_uint64_scale_int (video_width, den, num); + } else { + GST_DEBUG_OBJECT (self, "approximating while keeping video height"); + GST_VIDEO_SINK_WIDTH (self) = (guint) + gst_util_uint64_scale_int (video_height, num, den); + GST_VIDEO_SINK_HEIGHT (self) = video_height; + } + + GST_DEBUG_OBJECT (self, "scaling to %dx%d", + GST_VIDEO_SINK_WIDTH (self), GST_VIDEO_SINK_HEIGHT (self)); + self->video_width = video_width; + self->video_height = video_height; + + if (GST_VIDEO_SINK_WIDTH (self) <= 0 || GST_VIDEO_SINK_HEIGHT (self) <= 0) + goto no_display_size; + + GST_OBJECT_LOCK (self); + if (self->pending_render_rect) { + GstVideoRectangle rect = self->render_rect; + + self->pending_render_rect = FALSE; + GST_OBJECT_UNLOCK (self); + + gst_d3d11_window_set_render_rectangle (self->window, &rect); + } else { + 
GST_OBJECT_UNLOCK (self); + } + + self->have_video_processor = FALSE; + if (!gst_d3d11_window_prepare (self->window, GST_VIDEO_SINK_WIDTH (self), + GST_VIDEO_SINK_HEIGHT (self), caps, &self->have_video_processor, + &error)) { + GstMessage *error_msg; + + GST_ERROR_OBJECT (self, "cannot create swapchain"); + error_msg = gst_message_new_error (GST_OBJECT_CAST (self), + error, "Failed to prepare d3d11window"); + g_clear_error (&error); + gst_element_post_message (GST_ELEMENT (self), error_msg); + + return FALSE; + } + + if (self->fallback_pool) { + gst_buffer_pool_set_active (self->fallback_pool, FALSE); + gst_clear_object (&self->fallback_pool); + } + + { + GstD3D11AllocationParams *d3d11_params; + gint bind_flags = D3D11_BIND_SHADER_RESOURCE; + + if (self->have_video_processor) { + /* To create video processor input view, one of following bind flags + * is required + * NOTE: Any texture arrays which were created with D3D11_BIND_DECODER flag + * cannot be used for shader input. + * + * D3D11_BIND_DECODER + * D3D11_BIND_VIDEO_ENCODER + * D3D11_BIND_RENDER_TARGET + * D3D11_BIND_UNORDERED_ACCESS_VIEW + */ + bind_flags |= D3D11_BIND_RENDER_TARGET; + } + + d3d11_params = gst_d3d11_allocation_params_new (self->device, + &self->info, (GstD3D11AllocationFlags) 0, bind_flags); + + self->fallback_pool = gst_d3d11_buffer_pool_new_with_options (self->device, + caps, d3d11_params, 2, 0); + gst_d3d11_allocation_params_free (d3d11_params); + } + + if (!self->fallback_pool) { + GST_ERROR_OBJECT (self, "Failed to configure fallback pool"); + return FALSE; + } + + self->processor_in_use = FALSE; + + if (self->title) { + gst_d3d11_window_set_title (self->window, self->title); + g_clear_pointer (&self->title, g_free); + } + + return TRUE; + + /* ERRORS */ +invalid_format: + { + GST_DEBUG_OBJECT (self, + "Could not locate image format from caps %" GST_PTR_FORMAT, caps); + return FALSE; + } +no_window: + { + GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, (NULL), + ("Failed to open 
window.")); + return FALSE; + } +no_disp_ratio: + { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output display ratio of the video.")); + return FALSE; + } +no_display_size: + { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output display ratio of the video.")); + return FALSE; + } +} + +static void +gst_d3d11_video_sink_key_event (GstD3D11Window * window, const gchar * event, + const gchar * key, GstD3D11VideoSink * self) +{ + if (self->enable_navigation_events) { + GST_LOG_OBJECT (self, "send key event %s, key %s", event, key); + gst_navigation_send_key_event (GST_NAVIGATION (self), event, key); + } +} + +static void +gst_d3d11_video_mouse_key_event (GstD3D11Window * window, const gchar * event, + gint button, gdouble x, gdouble y, GstD3D11VideoSink * self) +{ + if (self->enable_navigation_events) { + GST_LOG_OBJECT (self, + "send mouse event %s, button %d (%.1f, %.1f)", event, button, x, y); + gst_navigation_send_mouse_event (GST_NAVIGATION (self), event, button, x, + y); + } +} + +static gboolean +gst_d3d11_video_sink_start (GstBaseSink * sink) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + GST_DEBUG_OBJECT (self, "Start"); + + if (!gst_d3d11_ensure_element_data (GST_ELEMENT_CAST (self), self->adapter, + &self->device)) { + GST_ERROR_OBJECT (sink, "Cannot create d3d11device"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_prepare_window (GstD3D11VideoSink * self) +{ + GstD3D11WindowNativeType window_type = GST_D3D11_WINDOW_NATIVE_TYPE_HWND; + + if (self->window) + return TRUE; + + if (self->draw_on_shared_texture) { + GST_INFO_OBJECT (self, + "Create dummy window for rendering on shared texture"); + self->window = gst_d3d11_window_dummy_new (self->device); + return TRUE; + } + + if (!self->window_id) + gst_video_overlay_prepare_window_handle (GST_VIDEO_OVERLAY (self)); + + if (self->window_id) { + window_type = + 
gst_d3d11_window_get_native_type_from_handle (self->window_id); + + if (window_type != GST_D3D11_WINDOW_NATIVE_TYPE_NONE) { + GST_DEBUG_OBJECT (self, "Have window handle %" G_GUINTPTR_FORMAT, + self->window_id); + gst_video_overlay_got_window_handle (GST_VIDEO_OVERLAY (self), + self->window_id); + } + } + + GST_DEBUG_OBJECT (self, "Create window (type: %s)", + gst_d3d11_window_get_native_type_to_string (window_type)); + +#if GST_D3D11_WINAPI_ONLY_APP + if (window_type != GST_D3D11_WINDOW_NATIVE_TYPE_CORE_WINDOW && + window_type != GST_D3D11_WINDOW_NATIVE_TYPE_SWAP_CHAIN_PANEL) { + GST_ERROR_OBJECT (self, "Overlay handle must be set before READY state"); + return FALSE; + } +#endif + + switch (window_type) { +#if (!GST_D3D11_WINAPI_ONLY_APP) + case GST_D3D11_WINDOW_NATIVE_TYPE_HWND: + self->window = gst_d3d11_window_win32_new (self->device, self->window_id); + break; +#endif +#if GST_D3D11_WINAPI_APP + case GST_D3D11_WINDOW_NATIVE_TYPE_CORE_WINDOW: + self->window = gst_d3d11_window_core_window_new (self->device, + self->window_id); + break; + case GST_D3D11_WINDOW_NATIVE_TYPE_SWAP_CHAIN_PANEL: + self->window = gst_d3d11_window_swap_chain_panel_new (self->device, + self->window_id); + break; +#endif + default: + break; + } + + if (!self->window) { + GST_ERROR_OBJECT (self, "Cannot create d3d11window"); + return FALSE; + } + + GST_OBJECT_LOCK (self); + g_object_set (self->window, + "force-aspect-ratio", self->force_aspect_ratio, + "fullscreen-toggle-mode", self->fullscreen_toggle_mode, + "fullscreen", self->fullscreen, + "enable-navigation-events", self->enable_navigation_events, NULL); + GST_OBJECT_UNLOCK (self); + + g_signal_connect (self->window, "key-event", + G_CALLBACK (gst_d3d11_video_sink_key_event), self); + g_signal_connect (self->window, "mouse-event", + G_CALLBACK (gst_d3d11_video_mouse_key_event), self); + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_stop (GstBaseSink * sink) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + 
GST_DEBUG_OBJECT (self, "Stop"); + + if (self->fallback_pool) { + gst_buffer_pool_set_active (self->fallback_pool, FALSE); + gst_object_unref (self->fallback_pool); + self->fallback_pool = NULL; + } + + if (self->window) + gst_d3d11_window_unprepare (self->window); + + gst_clear_object (&self->device); + gst_clear_object (&self->window); + + g_clear_pointer (&self->title, g_free); + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_propose_allocation (GstBaseSink * sink, GstQuery * query) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + GstCaps *caps; + GstBufferPool *pool = NULL; + GstVideoInfo info; + guint size; + gboolean need_pool; + + if (!self->device) + return FALSE; + + gst_query_parse_allocation (query, &caps, &need_pool); + + if (caps == NULL) + goto no_caps; + + if (!gst_video_info_from_caps (&info, caps)) + goto invalid_caps; + + /* the normal size of a frame */ + size = info.size; + + if (need_pool) { + GstCapsFeatures *features; + GstStructure *config; + gboolean is_d3d11 = false; + + features = gst_caps_get_features (caps, 0); + if (features + && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + GST_DEBUG_OBJECT (self, "upstream support d3d11 memory"); + pool = gst_d3d11_buffer_pool_new (self->device); + is_d3d11 = true; + } else { + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + /* d3d11 pool does not support video alignment */ + if (!is_d3d11) { + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + } + + size = GST_VIDEO_INFO_SIZE (&info); + if (is_d3d11) { + GstD3D11AllocationParams *d3d11_params; + + d3d11_params = + gst_d3d11_allocation_params_new (self->device, + &info, (GstD3D11AllocationFlags) 0, D3D11_BIND_SHADER_RESOURCE); + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + 
gst_d3d11_allocation_params_free (d3d11_params); + } + + gst_buffer_pool_config_set_params (config, caps, (guint) size, 2, 0); + + if (!gst_buffer_pool_set_config (pool, config)) { + GST_ERROR_OBJECT (pool, "Couldn't set config"); + gst_object_unref (pool); + + return FALSE; + } + + if (is_d3d11) { + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, nullptr, &size, nullptr, + nullptr); + gst_structure_free (config); + + /* In case of system memory, we will upload video frame to GPU memory, + * (which is copy in any case), so crop meta support for system memory + * is almost pointless */ + gst_query_add_allocation_meta (query, + GST_VIDEO_CROP_META_API_TYPE, nullptr); + } + } + + /* We need at least 2 buffers because we hold on to the last one for redrawing + * on window-resize event */ + gst_query_add_allocation_pool (query, pool, size, 2, 0); + if (pool) + g_object_unref (pool); + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + gst_query_add_allocation_meta (query, + GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL); + + return TRUE; + + /* ERRORS */ +no_caps: + { + GST_WARNING_OBJECT (self, "no caps specified"); + return FALSE; + } +invalid_caps: + { + GST_WARNING_OBJECT (self, "invalid caps specified"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_query (GstBaseSink * sink, GstQuery * query) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (self), query, + self->device)) { + return TRUE; + } + break; + default: + break; + } + + return GST_BASE_SINK_CLASS (parent_class)->query (sink, query); +} + +static gboolean +gst_d3d11_video_sink_unlock (GstBaseSink * sink) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + if 
(self->window) + gst_d3d11_window_unlock (self->window); + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_unlock_stop (GstBaseSink * sink) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + if (self->window) + gst_d3d11_window_unlock_stop (self->window); + + return TRUE; +} + +static gboolean +gst_d3d11_video_sink_event (GstBaseSink * sink, GstEvent * event) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_TAG:{ + GstTagList *taglist; + gchar *title = nullptr; + + gst_event_parse_tag (event, &taglist); + gst_tag_list_get_string (taglist, GST_TAG_TITLE, &title); + + if (title) { + const gchar *app_name = g_get_application_name (); + std::string title_string; + + if (app_name) { + title_string = std::string (title) + " : " + std::string (app_name); + } else { + title_string = std::string (title); + } + + if (self->window) { + gst_d3d11_window_set_title (self->window, title_string.c_str ()); + } else { + g_free (self->title); + self->title = g_strdup (title_string.c_str ()); + } + + g_free (title); + } + break; + } + default: + break; + } + + return GST_BASE_SINK_CLASS (parent_class)->event (sink, event); +} + +static gboolean +gst_d3d11_video_sink_upload_frame (GstD3D11VideoSink * self, GstBuffer * inbuf, + GstBuffer * outbuf) +{ + GstVideoFrame in_frame, out_frame; + gboolean ret; + + GST_LOG_OBJECT (self, "Copy to fallback buffer"); + + if (!gst_video_frame_map (&in_frame, &self->info, inbuf, + (GstMapFlags) (GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) + goto invalid_buffer; + + if (!gst_video_frame_map (&out_frame, &self->info, outbuf, + (GstMapFlags) (GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF))) { + gst_video_frame_unmap (&in_frame); + goto invalid_buffer; + } + + ret = gst_video_frame_copy (&out_frame, &in_frame); + + gst_video_frame_unmap (&in_frame); + gst_video_frame_unmap (&out_frame); + + return ret; + + /* ERRORS */ +invalid_buffer: + { + 
GST_ELEMENT_WARNING (self, CORE, NOT_IMPLEMENTED, (NULL), + ("invalid video buffer received")); + return FALSE; + } +} + +static gboolean +gst_d3d11_video_sink_copy_d3d11_to_d3d11 (GstD3D11VideoSink * self, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GST_LOG_OBJECT (self, "Copy to fallback buffer using device memory copy"); + + return gst_d3d11_buffer_copy_into (outbuf, inbuf, &self->info); +} + +static gboolean +gst_d3d11_video_sink_get_fallback_buffer (GstD3D11VideoSink * self, + GstBuffer * inbuf, GstBuffer ** fallback_buf, gboolean device_copy) +{ + GstBuffer *outbuf = NULL; + ID3D11ShaderResourceView *view[GST_VIDEO_MAX_PLANES]; + GstVideoOverlayCompositionMeta *compo_meta; + GstVideoCropMeta *crop_meta; + + if (!self->fallback_pool || + !gst_buffer_pool_set_active (self->fallback_pool, TRUE) || + gst_buffer_pool_acquire_buffer (self->fallback_pool, &outbuf, + NULL) != GST_FLOW_OK) { + GST_ERROR_OBJECT (self, "fallback pool is unavailable"); + return FALSE; + } + + /* Ensure SRV */ + if (!gst_d3d11_buffer_get_shader_resource_view (outbuf, view)) { + GST_ERROR_OBJECT (self, "fallback SRV is unavailable"); + goto error; + } + + if (device_copy) { + if (!gst_d3d11_video_sink_copy_d3d11_to_d3d11 (self, inbuf, outbuf)) { + GST_ERROR_OBJECT (self, "cannot copy frame"); + goto error; + } + } else if (!gst_d3d11_video_sink_upload_frame (self, inbuf, outbuf)) { + GST_ERROR_OBJECT (self, "cannot upload frame"); + goto error; + } + + /* Copy overlaycomposition meta if any */ + compo_meta = gst_buffer_get_video_overlay_composition_meta (inbuf); + if (compo_meta) + gst_buffer_add_video_overlay_composition_meta (outbuf, compo_meta->overlay); + + /* And copy crop meta as well */ + crop_meta = gst_buffer_get_video_crop_meta (inbuf); + if (crop_meta) { + GstVideoCropMeta *new_crop_meta = gst_buffer_add_video_crop_meta (outbuf); + + new_crop_meta->x = crop_meta->x; + new_crop_meta->y = crop_meta->y; + new_crop_meta->width = crop_meta->width; + new_crop_meta->height = 
crop_meta->height; + } + + *fallback_buf = outbuf; + + return TRUE; + +error: + gst_buffer_unref (outbuf); + return FALSE; +} + +static void +gst_d3d11_video_sink_check_device_update (GstD3D11VideoSink * self, + GstBuffer * buf) +{ + GstMemory *mem; + GstD3D11Memory *dmem; + gboolean update_device = FALSE; + + /* We have configured window already, cannot update device */ + if (self->window) + return; + + mem = gst_buffer_peek_memory (buf, 0); + if (!gst_is_d3d11_memory (mem)) + return; + + dmem = GST_D3D11_MEMORY_CAST (mem); + /* Same device, nothing to do */ + if (dmem->device == self->device) + return; + + if (self->adapter < 0) { + update_device = TRUE; + } else { + guint adapter = 0; + + g_object_get (dmem->device, "adapter", &adapter, NULL); + /* The same GPU as what user wanted, update */ + if (adapter == (guint) self->adapter) + update_device = TRUE; + } + + if (!update_device) + return; + + GST_INFO_OBJECT (self, "Updating device %" GST_PTR_FORMAT " -> %" + GST_PTR_FORMAT, self->device, dmem->device); + + gst_object_unref (self->device); + self->device = (GstD3D11Device *) gst_object_ref (dmem->device); +} + +static GstFlowReturn +gst_d3d11_video_sink_show_frame (GstVideoSink * sink, GstBuffer * buf) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (sink); + GstFlowReturn ret = GST_FLOW_OK; + GstBuffer *fallback_buf = NULL; + ID3D11Device *device_handle = + gst_d3d11_device_get_device_handle (self->device); + ID3D11ShaderResourceView *view[GST_VIDEO_MAX_PLANES]; + + gst_d3d11_video_sink_check_device_update (self, buf); + + if (self->caps_updated || !self->window) { + GstCaps *caps = gst_pad_get_current_caps (GST_BASE_SINK_PAD (sink)); + gboolean update_ret; + + /* shouldn't happen */ + if (!caps) + return GST_FLOW_NOT_NEGOTIATED; + + update_ret = gst_d3d11_video_sink_update_window (self, caps); + gst_caps_unref (caps); + + if (!update_ret) + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_d3d11_buffer_can_access_device (buf, device_handle)) { + 
GST_LOG_OBJECT (self, "Need fallback buffer"); + + if (!gst_d3d11_video_sink_get_fallback_buffer (self, buf, &fallback_buf, + FALSE)) { + return GST_FLOW_ERROR; + } + } else { + gboolean direct_rendering = FALSE; + + /* Check if we can use video processor for conversion */ + if (gst_buffer_n_memory (buf) == 1 && self->have_video_processor) { + GstD3D11Memory *mem = (GstD3D11Memory *) gst_buffer_peek_memory (buf, 0); + D3D11_TEXTURE2D_DESC desc; + + gst_d3d11_memory_get_texture_desc (mem, &desc); + if ((desc.BindFlags & D3D11_BIND_DECODER) == D3D11_BIND_DECODER) { + GST_TRACE_OBJECT (self, + "Got VideoProcessor compatible texture, do direct rendering"); + direct_rendering = TRUE; + self->processor_in_use = TRUE; + } else if (self->processor_in_use && + (desc.BindFlags & D3D11_BIND_RENDER_TARGET) == + D3D11_BIND_RENDER_TARGET) { + direct_rendering = TRUE; + } + } + + /* Or, SRV should be available */ + if (!direct_rendering) { + if (gst_d3d11_buffer_get_shader_resource_view (buf, view)) { + GST_TRACE_OBJECT (self, "SRV is available, do direct rendering"); + direct_rendering = TRUE; + } + } + + if (!direct_rendering && + !gst_d3d11_video_sink_get_fallback_buffer (self, buf, &fallback_buf, + TRUE)) { + return GST_FLOW_ERROR; + } + } + + gst_d3d11_window_show (self->window); + + if (self->draw_on_shared_texture) { + g_rec_mutex_lock (&self->draw_lock); + self->current_buffer = fallback_buf ? fallback_buf : buf; + self->drawing = TRUE; + + GST_LOG_OBJECT (self, "Begin drawing"); + + /* Application should call draw method on this callback */ + g_signal_emit (self, gst_d3d11_video_sink_signals[SIGNAL_BEGIN_DRAW], 0, + NULL); + + GST_LOG_OBJECT (self, "End drawing"); + self->drawing = FALSE; + self->current_buffer = NULL; + g_rec_mutex_unlock (&self->draw_lock); + } else { + ret = gst_d3d11_window_render (self->window, + fallback_buf ? 
fallback_buf : buf); + } + + gst_clear_buffer (&fallback_buf); + + if (ret == GST_D3D11_WINDOW_FLOW_CLOSED) { + GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, + ("Output window was closed"), (NULL)); + + ret = GST_FLOW_ERROR; + } + + return ret; +} + +/* VideoOverlay interface */ +static void +gst_d3d11_video_sink_set_window_handle (GstVideoOverlay * overlay, + guintptr window_id) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (overlay); + + GST_DEBUG ("set window handle %" G_GUINTPTR_FORMAT, window_id); + + self->window_id = window_id; +} + +static void +gst_d3d11_video_sink_set_render_rectangle (GstVideoOverlay * overlay, gint x, + gint y, gint width, gint height) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (overlay); + + GST_DEBUG_OBJECT (self, + "render rect x: %d, y: %d, width: %d, height %d", x, y, width, height); + + GST_OBJECT_LOCK (self); + if (self->window) { + GstVideoRectangle rect; + + rect.x = x; + rect.y = y; + rect.w = width; + rect.h = height; + + self->render_rect = rect; + GST_OBJECT_UNLOCK (self); + + gst_d3d11_window_set_render_rectangle (self->window, &rect); + } else { + self->render_rect.x = x; + self->render_rect.y = y; + self->render_rect.w = width; + self->render_rect.h = height; + self->pending_render_rect = TRUE; + GST_OBJECT_UNLOCK (self); + } +} + +static void +gst_d3d11_video_sink_expose (GstVideoOverlay * overlay) +{ + GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (overlay); + + if (self->window && self->window->swap_chain) { + gst_d3d11_window_render (self->window, NULL); + } +} + +static void +gst_d3d11_video_sink_video_overlay_init (GstVideoOverlayInterface * iface) +{ + iface->set_window_handle = gst_d3d11_video_sink_set_window_handle; + iface->set_render_rectangle = gst_d3d11_video_sink_set_render_rectangle; + iface->expose = gst_d3d11_video_sink_expose; +} + +/* Navigation interface */ +static void +gst_d3d11_video_sink_navigation_send_event (GstNavigation * navigation, + GstStructure * structure) +{ + 
GstD3D11VideoSink *self = GST_D3D11_VIDEO_SINK (navigation); + GstEvent *event = gst_event_new_navigation (structure); + + /* TODO: add support for translating native coordinate and video coordinate + * when force-aspect-ratio is set */ + if (event) { + gboolean handled; + + gst_event_ref (event); + handled = gst_pad_push_event (GST_VIDEO_SINK_PAD (self), event); + + if (!handled) + gst_element_post_message (GST_ELEMENT_CAST (self), + gst_navigation_message_new_event (GST_OBJECT_CAST (self), event)); + + gst_event_unref (event); + } +} + +static void +gst_d3d11_video_sink_navigation_init (GstNavigationInterface * iface) +{ + iface->send_event = gst_d3d11_video_sink_navigation_send_event; +} + +static gboolean +gst_d3d11_video_sink_draw_action (GstD3D11VideoSink * self, + gpointer shared_handle, guint texture_misc_flags, + guint64 acquire_key, guint64 release_key) +{ + GstFlowReturn ret; + g_return_val_if_fail (shared_handle != NULL, FALSE); + + if (!self->draw_on_shared_texture) { + GST_ERROR_OBJECT (self, "Invalid draw call, we are drawing on window"); + return FALSE; + } + + if (!shared_handle) { + GST_ERROR_OBJECT (self, "Invalid handle"); + return FALSE; + } + + g_rec_mutex_lock (&self->draw_lock); + if (!self->drawing || !self->current_buffer) { + GST_WARNING_OBJECT (self, "Nothing to draw"); + g_rec_mutex_unlock (&self->draw_lock); + return FALSE; + } + + GST_LOG_OBJECT (self, "Drawing on shared handle %p, MiscFlags: 0x%x" + ", acquire key: %" G_GUINT64_FORMAT ", release key: %" + G_GUINT64_FORMAT, shared_handle, texture_misc_flags, acquire_key, + release_key); + + ret = gst_d3d11_window_render_on_shared_handle (self->window, + self->current_buffer, shared_handle, texture_misc_flags, acquire_key, + release_key); + g_rec_mutex_unlock (&self->draw_lock); + + return ret == GST_FLOW_OK; +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11videosink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11videosink.h
Changed
@@ -1,5 +1,6 @@
 /* GStreamer
  * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
+ * Copyright (C) 2020 Seungha Yang <seungha@centricular.com>
  *
  * This library is free software; you can redistribute it and/or
  * modify it under the terms of the GNU Library General Public
@@ -25,58 +26,37 @@
 #include <gst/video/gstvideosink.h>
 #include <gst/video/videooverlay.h>
 #include <gst/video/navigation.h>
-
-#include "gstd3d11_fwd.h"
+#include <gst/d3d11/gstd3d11.h>
 #include "gstd3d11window.h"
 
 G_BEGIN_DECLS
 
-#define GST_TYPE_D3D11_VIDEO_SINK (gst_d3d11_video_sink_get_type())
-#define GST_D3D11_VIDEO_SINK(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_VIDEO_SINK,GstD3D11VideoSink))
-#define GST_D3D11_VIDEO_SINK_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_D3D11_VIDEO_SINK,GstD3D11VideoSinkClass))
-#define GST_D3D11_VIDEO_SINK_GET_CLASS(obj) (GST_D3D11_VIDEO_SINK_CLASS(G_OBJECT_GET_CLASS(obj)))
-#define GST_IS_D3D11_VIDEO_SINK(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_VIDEO_SINK))
-#define GST_IS_D3D11_VIDEO_SINK_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_D3D11_VIDEO_SINK))
+#define GST_TYPE_D3D11_VIDEO_SINK             (gst_d3d11_video_sink_get_type())
+#define GST_D3D11_VIDEO_SINK(obj)             (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_D3D11_VIDEO_SINK, GstD3D11VideoSink))
+#define GST_D3D11_VIDEO_SINK_CLASS(klass)     (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_D3D11_VIDEO_SINK, GstD3D11VideoSinkClass))
+#define GST_IS_D3D11_VIDEO_SINK(obj)          (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_D3D11_VIDEO_SINK))
+#define GST_IS_D3D11_VIDEO_SINK_CLASS(klass)  (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_D3D11_VIDEO_SINK))
+#define GST_D3D11_VIDEO_SINK_GET_CLASS(obj)   (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_D3D11_VIDEO_SINK, GstD3D11VideoSinkClass))
 
 typedef struct _GstD3D11VideoSink GstD3D11VideoSink;
 typedef struct _GstD3D11VideoSinkClass GstD3D11VideoSinkClass;
-
-struct _GstD3D11VideoSink
-{
-  GstVideoSink sink;
-  GstD3D11Device *device;
-  GstD3D11Window *window;
-  gint video_width;
-  gint video_height;
-
-  GstVideoInfo info;
-
-  guintptr window_id;
-
-  /* properties */
-  gint adapter;
-  gboolean force_aspect_ratio;
-  gboolean enable_navigation_events;
-  GstD3D11WindowFullscreenToggleMode fullscreen_toggle_mode;
-  gboolean fullscreen;
-
-  /* saved render rectangle until we have a window */
-  GstVideoRectangle render_rect;
-  gboolean pending_render_rect;
-
-  GstBufferPool *fallback_pool;
-  gboolean can_convert;
-  gboolean have_video_processor;
-};
-
 struct _GstD3D11VideoSinkClass
 {
   GstVideoSinkClass parent_class;
+
+  /* signals */
+  void (*begin_draw) (GstD3D11VideoSink * videosink);
+
+  /* actions */
+  gboolean (*draw)   (GstD3D11VideoSink * videosink,
+                      gpointer shared_handle,
+                      guint texture_misc_flags,
+                      guint64 acquire_key,
+                      guint64 release_key);
 };
 
-GType gst_d3d11_video_sink_get_type (void);
+GType gst_d3d11_video_sink_get_type (void);
 
 G_END_DECLS
-
 #endif /* __GST_D3D11_VIDEO_SINK_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11vp8dec.cpp
Added
@@ -0,0 +1,767 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-d3d11vp8dec + * @title: d3d11vp8dec + * + * A Direct3D11/DXVA based VP8 video decoder + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/vp8/file ! parsebin ! d3d11vp8dec ! 
d3d11videosink + * ``` + * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11vp8dec.h" + +#include <gst/codecs/gstvp8decoder.h> +#include <string.h> +#include <vector> + +/* HACK: to expose dxva data structure on UWP */ +#ifdef WINAPI_PARTITION_DESKTOP +#undef WINAPI_PARTITION_DESKTOP +#endif +#define WINAPI_PARTITION_DESKTOP 1 +#include <d3d9.h> +#include <dxva.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_vp8_dec_debug); +#define GST_CAT_DEFAULT gst_d3d11_vp8_dec_debug + +/* reference list 4 + 4 margin */ +#define NUM_OUTPUT_VIEW 8 + +/* *INDENT-OFF* */ +typedef struct _GstD3D11Vp8DecInner +{ + GstD3D11Device *device = nullptr; + GstD3D11Decoder *d3d11_decoder = nullptr; + + DXVA_PicParams_VP8 pic_params; + DXVA_Slice_VPx_Short slice; + + /* In case of VP8, there's only one slice per picture so we don't + * need this bitstream buffer, but this will be used for 128 bytes alignment */ + std::vector<guint8> bitstream_buffer; + + guint width = 0; + guint height = 0; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; +} GstD3D11Vp8DecInner; +/* *INDENT-ON* */ + +typedef struct _GstD3D11Vp8Dec +{ + GstVp8Decoder parent; + GstD3D11Vp8DecInner *inner; +} GstD3D11Vp8Dec; + +typedef struct _GstD3D11Vp8DecClass +{ + GstVp8DecoderClass parent_class; + GstD3D11DecoderSubClassData class_data; +} GstD3D11Vp8DecClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_VP8_DEC(object) ((GstD3D11Vp8Dec *) (object)) +#define GST_D3D11_VP8_DEC_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11Vp8DecClass)) + +static void gst_d3d11_vp8_dec_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_vp8_dec_finalize (GObject * object); +static void gst_d3d11_vp8_dec_set_context (GstElement * element, + GstContext * context); + +static gboolean gst_d3d11_vp8_dec_open (GstVideoDecoder * decoder); +static gboolean 
gst_d3d11_vp8_dec_close (GstVideoDecoder * decoder); +static gboolean gst_d3d11_vp8_dec_negotiate (GstVideoDecoder * decoder); +static gboolean gst_d3d11_vp8_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_d3d11_vp8_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); +static gboolean gst_d3d11_vp8_sink_event (GstVideoDecoder * decoder, + GstEvent * event); + +/* GstVp8Decoder */ +static GstFlowReturn gst_d3d11_vp8_dec_new_sequence (GstVp8Decoder * decoder, + const GstVp8FrameHdr * frame_hdr); +static GstFlowReturn gst_d3d11_vp8_dec_new_picture (GstVp8Decoder * decoder, + GstVideoCodecFrame * frame, GstVp8Picture * picture); +static GstFlowReturn gst_d3d11_vp8_dec_start_picture (GstVp8Decoder * decoder, + GstVp8Picture * picture); +static GstFlowReturn gst_d3d11_vp8_dec_decode_picture (GstVp8Decoder * decoder, + GstVp8Picture * picture, GstVp8Parser * parser); +static GstFlowReturn gst_d3d11_vp8_dec_end_picture (GstVp8Decoder * decoder, + GstVp8Picture * picture); +static GstFlowReturn gst_d3d11_vp8_dec_output_picture (GstVp8Decoder * + decoder, GstVideoCodecFrame * frame, GstVp8Picture * picture); + +static void +gst_d3d11_vp8_dec_class_init (GstD3D11Vp8DecClass * klass, gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstVp8DecoderClass *vp8decoder_class = GST_VP8_DECODER_CLASS (klass); + GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; + + gobject_class->get_property = gst_d3d11_vp8_dec_get_property; + gobject_class->finalize = gst_d3d11_vp8_dec_finalize; + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_set_context); + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + gst_d3d11_decoder_class_data_fill_subclass_data (cdata, &klass->class_data); + + /** + * GstD3D11Vp8Dec:adapter-luid: + 
* + * DXGI Adapter LUID for this element + * + * Since: 1.20 + */ + gst_d3d11_decoder_proxy_class_init (element_class, cdata, + "Seungha Yang <seungha.yang@navercorp.com>"); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_decide_allocation); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_src_query); + decoder_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_vp8_sink_event); + + vp8decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_new_sequence); + vp8decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_new_picture); + vp8decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_start_picture); + vp8decoder_class->decode_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_decode_picture); + vp8decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_end_picture); + vp8decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp8_dec_output_picture); +} + +static void +gst_d3d11_vp8_dec_init (GstD3D11Vp8Dec * self) +{ + self->inner = new GstD3D11Vp8DecInner (); +} + +static void +gst_d3d11_vp8_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11Vp8DecClass *klass = GST_D3D11_VP8_DEC_GET_CLASS (object); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_decoder_proxy_get_property (object, prop_id, value, pspec, cdata); +} + +static void +gst_d3d11_vp8_dec_finalize (GObject * object) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (object); + + delete self->inner; + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_vp8_dec_set_context (GstElement * element, GstContext * context) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (element); + 
GstD3D11Vp8DecInner *inner = self->inner; + GstD3D11Vp8DecClass *klass = GST_D3D11_VP8_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, cdata->adapter_luid, &inner->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_vp8_dec_open (GstVideoDecoder * decoder) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + GstD3D11Vp8DecClass *klass = GST_D3D11_VP8_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + if (!gst_d3d11_decoder_proxy_open (decoder, + cdata, &inner->device, &inner->d3d11_decoder)) { + GST_ERROR_OBJECT (self, "Failed to open decoder"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_d3d11_vp8_dec_close (GstVideoDecoder * decoder) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + + gst_clear_object (&inner->d3d11_decoder); + gst_clear_object (&inner->device); + + return TRUE; +} + +static gboolean +gst_d3d11_vp8_dec_negotiate (GstVideoDecoder * decoder) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_negotiate (inner->d3d11_decoder, decoder)) + return FALSE; + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_d3d11_vp8_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_decide_allocation (inner->d3d11_decoder, decoder, + query)) { + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_d3d11_vp8_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC 
(decoder); + GstD3D11Vp8DecInner *inner = self->inner; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), + query, inner->device)) { + return TRUE; + } + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static gboolean +gst_d3d11_vp8_sink_event (GstVideoDecoder * decoder, GstEvent * event) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, TRUE); + break; + case GST_EVENT_FLUSH_STOP: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, FALSE); + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstFlowReturn +gst_d3d11_vp8_dec_new_sequence (GstVp8Decoder * decoder, + const GstVp8FrameHdr * frame_hdr) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + GstVideoInfo info; + + GST_LOG_OBJECT (self, "new sequence"); + + /* FIXME: support I420 */ + inner->out_format = GST_VIDEO_FORMAT_NV12; + inner->width = frame_hdr->width; + inner->height = frame_hdr->height; + + gst_video_info_set_format (&info, + inner->out_format, inner->width, inner->height); + + if (!gst_d3d11_decoder_configure (inner->d3d11_decoder, + decoder->input_state, &info, inner->width, inner->height, + NUM_OUTPUT_VIEW)) { + GST_ERROR_OBJECT (self, "Failed to create decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp8_dec_new_picture (GstVp8Decoder * decoder, + 
GstVideoCodecFrame * frame, GstVp8Picture * picture) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + GstBuffer *view_buffer; + + view_buffer = gst_d3d11_decoder_get_output_view_buffer (inner->d3d11_decoder, + GST_VIDEO_DECODER (decoder)); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "No available output view buffer"); + return GST_FLOW_FLUSHING; + } + + GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT, view_buffer); + + gst_vp8_picture_set_user_data (picture, + view_buffer, (GDestroyNotify) gst_buffer_unref); + + GST_LOG_OBJECT (self, "New VP8 picture %p", picture); + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp8_dec_start_picture (GstVp8Decoder * decoder, + GstVp8Picture * picture) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + + inner->bitstream_buffer.resize (0); + + return GST_FLOW_OK; +} + +static ID3D11VideoDecoderOutputView * +gst_d3d11_vp8_dec_get_output_view_from_picture (GstD3D11Vp8Dec * self, + GstVp8Picture * picture, guint8 * view_id) +{ + GstD3D11Vp8DecInner *inner = self->inner; + GstBuffer *view_buffer; + ID3D11VideoDecoderOutputView *view; + + view_buffer = (GstBuffer *) gst_vp8_picture_get_user_data (picture); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); + return NULL; + } + + view = + gst_d3d11_decoder_get_output_view_from_buffer (inner->d3d11_decoder, + view_buffer, view_id); + if (!view) { + GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); + return NULL; + } + + return view; +} + +static void +gst_d3d11_vp8_dec_copy_frame_params (GstD3D11Vp8Dec * self, + GstVp8Picture * picture, GstVp8Parser * parser, DXVA_PicParams_VP8 * params) +{ + const GstVp8FrameHdr *frame_hdr = &picture->frame_hdr; + gint i; + + /* 0: keyframe, 1: inter */ + params->frame_type = !frame_hdr->key_frame; + params->version = frame_hdr->version; + 
params->show_frame = frame_hdr->show_frame; + params->clamp_type = frame_hdr->clamping_type; + + params->filter_type = frame_hdr->filter_type; + params->filter_level = frame_hdr->loop_filter_level; + params->sharpness_level = frame_hdr->sharpness_level; + params->mode_ref_lf_delta_enabled = + parser->mb_lf_adjust.loop_filter_adj_enable; + params->mode_ref_lf_delta_update = + parser->mb_lf_adjust.mode_ref_lf_delta_update; + for (i = 0; i < 4; i++) { + params->ref_lf_deltas[i] = parser->mb_lf_adjust.ref_frame_delta[i]; + params->mode_lf_deltas[i] = parser->mb_lf_adjust.mb_mode_delta[i]; + } + params->log2_nbr_of_dct_partitions = frame_hdr->log2_nbr_of_dct_partitions; + params->base_qindex = frame_hdr->quant_indices.y_ac_qi; + params->y1dc_delta_q = frame_hdr->quant_indices.y_dc_delta; + params->y2dc_delta_q = frame_hdr->quant_indices.y2_dc_delta; + params->y2ac_delta_q = frame_hdr->quant_indices.y2_ac_delta; + params->uvdc_delta_q = frame_hdr->quant_indices.uv_dc_delta; + params->uvac_delta_q = frame_hdr->quant_indices.uv_ac_delta; + + params->ref_frame_sign_bias_golden = frame_hdr->sign_bias_golden; + params->ref_frame_sign_bias_altref = frame_hdr->sign_bias_alternate; + + params->refresh_entropy_probs = frame_hdr->refresh_entropy_probs; + + memcpy (params->vp8_coef_update_probs, frame_hdr->token_probs.prob, + sizeof (frame_hdr->token_probs.prob)); + + params->mb_no_coeff_skip = frame_hdr->mb_no_skip_coeff; + params->prob_skip_false = frame_hdr->prob_skip_false; + params->prob_intra = frame_hdr->prob_intra; + params->prob_last = frame_hdr->prob_last; + params->prob_golden = frame_hdr->prob_gf; + + memcpy (params->intra_16x16_prob, frame_hdr->mode_probs.y_prob, + sizeof (frame_hdr->mode_probs.y_prob)); + memcpy (params->intra_chroma_prob, frame_hdr->mode_probs.uv_prob, + sizeof (frame_hdr->mode_probs.uv_prob)); + memcpy (params->vp8_mv_update_probs, frame_hdr->mv_probs.prob, + sizeof (frame_hdr->mv_probs.prob)); +} + +static void 
+gst_d3d11_vp8_dec_copy_reference_frames (GstD3D11Vp8Dec * self, + DXVA_PicParams_VP8 * params) +{ + GstVp8Decoder *decoder = GST_VP8_DECODER (self); + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + + if (decoder->alt_ref_picture) { + view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, + decoder->alt_ref_picture, &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "picture does not have output view handle"); + return; + } + + params->alt_fb_idx.Index7Bits = view_id; + } else { + params->alt_fb_idx.bPicEntry = 0xff; + } + + if (decoder->golden_ref_picture) { + view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, + decoder->golden_ref_picture, &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "picture does not have output view handle"); + return; + } + + params->gld_fb_idx.Index7Bits = view_id; + } else { + params->gld_fb_idx.bPicEntry = 0xff; + } + + if (decoder->last_picture) { + view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, + decoder->last_picture, &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "picture does not have output view handle"); + return; + } + + params->lst_fb_idx.Index7Bits = view_id; + } else { + params->lst_fb_idx.bPicEntry = 0xff; + } +} + +static void +gst_d3d11_vp8_dec_copy_segmentation_params (GstD3D11Vp8Dec * self, + GstVp8Parser * parser, DXVA_PicParams_VP8 * params) +{ + const GstVp8Segmentation *seg = &parser->segmentation; + gint i; + + params->stVP8Segments.segmentation_enabled = seg->segmentation_enabled; + params->stVP8Segments.update_mb_segmentation_map = + seg->update_mb_segmentation_map; + params->stVP8Segments.update_mb_segmentation_data = + seg->update_segment_feature_data; + params->stVP8Segments.mb_segement_abs_delta = seg->segment_feature_mode; + + for (i = 0; i < 4; i++) { + params->stVP8Segments.segment_feature_data[0][i] = + seg->quantizer_update_value[i]; + } + + for (i = 0; i < 4; i++) { + params->stVP8Segments.segment_feature_data[1][i] = seg->lf_update_value[i]; + } + + 
for (i = 0; i < 3; i++) { + params->stVP8Segments.mb_segment_tree_probs[i] = seg->segment_prob[i]; + } +} + +static GstFlowReturn +gst_d3d11_vp8_dec_decode_picture (GstVp8Decoder * decoder, + GstVp8Picture * picture, GstVp8Parser * parser) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + DXVA_PicParams_VP8 *pic_params = &inner->pic_params; + DXVA_Slice_VPx_Short *slice = &inner->slice; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + const GstVp8FrameHdr *frame_hdr = &picture->frame_hdr; + + view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, + picture, &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (pic_params, 0, sizeof (DXVA_PicParams_VP8)); + + pic_params->first_part_size = frame_hdr->first_part_size; + pic_params->width = inner->width; + pic_params->height = inner->height; + pic_params->CurrPic.Index7Bits = view_id; + pic_params->StatusReportFeedbackNumber = 1; + + gst_d3d11_vp8_dec_copy_frame_params (self, picture, parser, pic_params); + gst_d3d11_vp8_dec_copy_reference_frames (self, pic_params); + gst_d3d11_vp8_dec_copy_segmentation_params (self, parser, pic_params); + + inner->bitstream_buffer.resize (picture->size); + memcpy (&inner->bitstream_buffer[0], picture->data, picture->size); + + slice->BSNALunitDataLocation = 0; + slice->SliceBytesInBuffer = inner->bitstream_buffer.size (); + slice->wBadSliceChopping = 0; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp8_dec_end_picture (GstVp8Decoder * decoder, GstVp8Picture * picture) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + size_t bitstream_buffer_size; + size_t bitstream_pos; + GstD3D11DecodeInputStreamArgs input_args; + + if (inner->bitstream_buffer.empty ()) { + GST_ERROR_OBJECT (self, "No 
bitstream buffer to submit"); + return GST_FLOW_ERROR; + } + + view = gst_d3d11_vp8_dec_get_output_view_from_picture (self, + picture, &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (&input_args, 0, sizeof (GstD3D11DecodeInputStreamArgs)); + + bitstream_pos = inner->bitstream_buffer.size (); + bitstream_buffer_size = GST_ROUND_UP_128 (bitstream_pos); + + if (bitstream_buffer_size > bitstream_pos) { + size_t padding = bitstream_buffer_size - bitstream_pos; + + /* As per DXVA spec, total amount of bitstream buffer size should be + * 128 bytes aligned. If actual data is not multiple of 128 bytes, + * the last slice data needs to be zero-padded */ + inner->bitstream_buffer.resize (bitstream_buffer_size, 0); + + inner->slice.SliceBytesInBuffer += padding; + } + + input_args.picture_params = &inner->pic_params; + input_args.picture_params_size = sizeof (DXVA_PicParams_VP8); + input_args.slice_control = &inner->slice; + input_args.slice_control_size = sizeof (DXVA_Slice_VPx_Short); + input_args.bitstream = &inner->bitstream_buffer[0]; + input_args.bitstream_size = inner->bitstream_buffer.size (); + + if (!gst_d3d11_decoder_decode_frame (inner->d3d11_decoder, view, &input_args)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp8_dec_output_picture (GstVp8Decoder * decoder, + GstVideoCodecFrame * frame, GstVp8Picture * picture) +{ + GstD3D11Vp8Dec *self = GST_D3D11_VP8_DEC (decoder); + GstD3D11Vp8DecInner *inner = self->inner; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstBuffer *view_buffer; + + g_assert (picture->frame_hdr.show_frame); + + GST_LOG_OBJECT (self, "Outputting picture %p", picture); + + view_buffer = (GstBuffer *) gst_vp8_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Could not get output view"); + goto error; + } + + if (!gst_d3d11_decoder_process_output 
(inner->d3d11_decoder, vdec, + inner->width, inner->height, view_buffer, &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to copy buffer"); + goto error; + } + + gst_vp8_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_vp8_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); + + return GST_FLOW_ERROR; +} + +void +gst_d3d11_vp8_dec_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank) +{ + GType type; + gchar *type_name; + gchar *feature_name; + guint index = 0; + guint i; + GTypeInfo type_info = { + sizeof (GstD3D11Vp8DecClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_vp8_dec_class_init, + NULL, + NULL, + sizeof (GstD3D11Vp8Dec), + 0, + (GInstanceInitFunc) gst_d3d11_vp8_dec_init, + }; + const GUID *profile_guid = NULL; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + guint max_width = 0; + guint max_height = 0; + guint resolution; + DXGI_FORMAT format = DXGI_FORMAT_NV12; + + if (!gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_VP8, GST_VIDEO_FORMAT_NV12, &profile_guid)) { + GST_INFO_OBJECT (device, "device does not support VP8 decoding"); + return; + } + + for (i = 0; i < G_N_ELEMENTS (gst_dxva_resolutions); i++) { + if (gst_d3d11_decoder_supports_resolution (device, profile_guid, + format, gst_dxva_resolutions[i].width, + gst_dxva_resolutions[i].height)) { + max_width = gst_dxva_resolutions[i].width; + max_height = gst_dxva_resolutions[i].height; + + GST_DEBUG_OBJECT (device, + "device support resolution %dx%d", max_width, max_height); + } else { + break; + } + } + + if (max_width == 0 || max_height == 0) { + GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); + return; + } + + sink_caps = gst_caps_from_string ("video/x-vp8"); + src_caps = gst_caps_from_string ("video/x-raw(" + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "); video/x-raw"); + + gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); + + /* 
To cover both landscape and portrait, select max value */ + resolution = MAX (max_width, max_height); + gst_caps_set_simple (sink_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + + type_info.class_data = + gst_d3d11_decoder_class_data_new (device, GST_DXVA_CODEC_VP8, + sink_caps, src_caps); + + type_name = g_strdup ("GstD3D11Vp8Dec"); + feature_name = g_strdup ("d3d11vp8dec"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11Vp8Device%dDec", index); + feature_name = g_strdup_printf ("d3d11vp8device%ddec", index); + } + + type = g_type_register_static (GST_TYPE_VP8_DECODER, + type_name, &type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11vp8dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11vp8dec.h
Changed
@@ -26,7 +26,6 @@
 
 void gst_d3d11_vp8_dec_register (GstPlugin * plugin,
                                  GstD3D11Device * device,
-                                 GstD3D11Decoder * decoder,
                                  guint rank);
 
 G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11vp9dec.cpp
Added
@@ -0,0 +1,979 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + * + * NOTE: some of implementations are copied/modified from Chromium code + * + * Copyright 2015 The Chromium Authors. All rights reserved. + * + * Redistribution and use in source and binary forms, with or without + * modification, are permitted provided that the following conditions are + * met: + * + * * Redistributions of source code must retain the above copyright + * notice, this list of conditions and the following disclaimer. + * * Redistributions in binary form must reproduce the above + * copyright notice, this list of conditions and the following disclaimer + * in the documentation and/or other materials provided with the + * distribution. + * * Neither the name of Google Inc. nor the names of its + * contributors may be used to endorse or promote products derived from + * this software without specific prior written permission. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR + * A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT + * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, + * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT + * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, + * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY + * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT + * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE + * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + */ + +/** + * SECTION:element-d3d11vp9dec + * @title: d3d11vp9dec + * + * A Direct3D11/DXVA based VP9 video decoder + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=/path/to/vp9/file ! parsebin ! d3d11vp9dec ! d3d11videosink + * ``` + * + * Since: 1.18 + * + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstd3d11vp9dec.h" +#include "gstd3d11pluginutils.h" + +#include <gst/codecs/gstvp9decoder.h> +#include <string.h> +#include <vector> + +/* HACK: to expose dxva data structure on UWP */ +#ifdef WINAPI_PARTITION_DESKTOP +#undef WINAPI_PARTITION_DESKTOP +#endif +#define WINAPI_PARTITION_DESKTOP 1 +#include <d3d9.h> +#include <dxva.h> + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_vp9_dec_debug); +#define GST_CAT_DEFAULT gst_d3d11_vp9_dec_debug + +/* reference list 8 + 4 margin */ +#define NUM_OUTPUT_VIEW 12 + +/* *INDENT-OFF* */ +typedef struct _GstD3D11Vp9DecInner +{ + GstD3D11Device *device = nullptr; + GstD3D11Decoder *d3d11_decoder = nullptr; + + DXVA_PicParams_VP9 pic_params; + DXVA_Slice_VPx_Short slice; + + /* In case of VP9, there's only one slice per picture so we don't + * need this bitstream buffer, but this will be used for 128 bytes alignment */ + std::vector<guint8> bitstream_buffer; + + /* To calculate use_prev_in_find_mv_refs */ + guint last_frame_width = 0; + guint last_frame_height = 0; + gboolean last_show_frame = FALSE; +} GstD3D11Vp9DecInner; +/* *INDENT-ON* */ + +typedef 
struct _GstD3D11Vp9Dec +{ + GstVp9Decoder parent; + GstD3D11Vp9DecInner *inner; +} GstD3D11Vp9Dec; + +typedef struct _GstD3D11Vp9DecClass +{ + GstVp9DecoderClass parent_class; + GstD3D11DecoderSubClassData class_data; +} GstD3D11Vp9DecClass; + +static GstElementClass *parent_class = NULL; + +#define GST_D3D11_VP9_DEC(object) ((GstD3D11Vp9Dec *) (object)) +#define GST_D3D11_VP9_DEC_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstD3D11Vp9DecClass)) + +static void gst_d3d11_vp9_dec_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_d3d11_vp9_dec_finalize (GObject * object); +static void gst_d3d11_vp9_dec_set_context (GstElement * element, + GstContext * context); + +static gboolean gst_d3d11_vp9_dec_open (GstVideoDecoder * decoder); +static gboolean gst_d3d11_vp9_dec_close (GstVideoDecoder * decoder); +static gboolean gst_d3d11_vp9_dec_negotiate (GstVideoDecoder * decoder); +static gboolean gst_d3d11_vp9_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_d3d11_vp9_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); +static gboolean gst_d3d11_vp9_dec_sink_event (GstVideoDecoder * decoder, + GstEvent * event); + +/* GstVp9Decoder */ +static GstFlowReturn gst_d3d11_vp9_dec_new_sequence (GstVp9Decoder * decoder, + const GstVp9FrameHeader * frame_hdr); +static GstFlowReturn gst_d3d11_vp9_dec_new_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture); +static GstVp9Picture *gst_d3d11_vp9_dec_duplicate_picture (GstVp9Decoder * + decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture); +static GstFlowReturn gst_d3d11_vp9_dec_start_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture); +static GstFlowReturn gst_d3d11_vp9_dec_decode_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture, GstVp9Dpb * dpb); +static GstFlowReturn gst_d3d11_vp9_dec_end_picture (GstVp9Decoder * decoder, 
+ GstVp9Picture * picture); +static GstFlowReturn gst_d3d11_vp9_dec_output_picture (GstVp9Decoder * + decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture); + +static void +gst_d3d11_vp9_dec_class_init (GstD3D11Vp9DecClass * klass, gpointer data) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstVp9DecoderClass *vp9decoder_class = GST_VP9_DECODER_CLASS (klass); + GstD3D11DecoderClassData *cdata = (GstD3D11DecoderClassData *) data; + + gobject_class->get_property = gst_d3d11_vp9_dec_get_property; + gobject_class->finalize = gst_d3d11_vp9_dec_finalize; + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_set_context); + + parent_class = (GstElementClass *) g_type_class_peek_parent (klass); + gst_d3d11_decoder_class_data_fill_subclass_data (cdata, &klass->class_data); + + /** + * GstD3D11Vp9Dec:adapter-luid: + * + * DXGI Adapter LUID for this element + * + * Since: 1.20 + */ + + gst_d3d11_decoder_proxy_class_init (element_class, cdata, + "Seungha Yang <seungha.yang@navercorp.com>"); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_decide_allocation); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_src_query); + decoder_class->sink_event = GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_sink_event); + + vp9decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_new_sequence); + vp9decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_new_picture); + vp9decoder_class->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_duplicate_picture); + vp9decoder_class->start_picture = + GST_DEBUG_FUNCPTR 
(gst_d3d11_vp9_dec_start_picture); + vp9decoder_class->decode_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_decode_picture); + vp9decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_end_picture); + vp9decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_d3d11_vp9_dec_output_picture); +} + +static void +gst_d3d11_vp9_dec_init (GstD3D11Vp9Dec * self) +{ + self->inner = new GstD3D11Vp9DecInner (); +} + +static void +gst_d3d11_vp9_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstD3D11Vp9DecClass *klass = GST_D3D11_VP9_DEC_GET_CLASS (object); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_decoder_proxy_get_property (object, prop_id, value, pspec, cdata); +} + +static void +gst_d3d11_vp9_dec_finalize (GObject * object) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (object); + + delete self->inner; + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_d3d11_vp9_dec_set_context (GstElement * element, GstContext * context) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (element); + GstD3D11Vp9DecInner *inner = self->inner; + GstD3D11Vp9DecClass *klass = GST_D3D11_VP9_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + gst_d3d11_handle_set_context_for_adapter_luid (element, + context, cdata->adapter_luid, &inner->device); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_d3d11_vp9_dec_open (GstVideoDecoder * decoder) +{ + GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder); + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + GstD3D11Vp9DecClass *klass = GST_D3D11_VP9_DEC_GET_CLASS (self); + GstD3D11DecoderSubClassData *cdata = &klass->class_data; + + if (!gst_d3d11_decoder_proxy_open (decoder, + cdata, &inner->device, &inner->d3d11_decoder)) { + GST_ERROR_OBJECT (self, "Failed to open decoder"); + return FALSE; + } + + /* XXX: 
ConfigDecoderSpecific bit 12 indicates whether accelerator can + * support non-keyframe format change or not, but it doesn't seem to be + * reliable, since 1b means that it's supported and 0b indicates it may not be + * supported. Because some GPUs can support it even if the bit 12 is not + * set, do filtering by vendor for now (AMD and Intel looks fine) */ + if (gst_d3d11_get_device_vendor (inner->device) == + GST_D3D11_DEVICE_VENDOR_NVIDIA) { + gst_vp9_decoder_set_non_keyframe_format_change_support (vp9dec, FALSE); + } + + return TRUE; +} + +static gboolean +gst_d3d11_vp9_dec_close (GstVideoDecoder * decoder) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + + gst_clear_object (&inner->d3d11_decoder); + gst_clear_object (&inner->device); + + return TRUE; +} + +static gboolean +gst_d3d11_vp9_dec_negotiate (GstVideoDecoder * decoder) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_negotiate (inner->d3d11_decoder, decoder)) + return FALSE; + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_d3d11_vp9_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + + if (!gst_d3d11_decoder_decide_allocation (inner->d3d11_decoder, + decoder, query)) { + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_d3d11_vp9_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (decoder), + query, inner->device)) { + return TRUE; + } + break; + default: + break; + } + + return 
GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static gboolean +gst_d3d11_vp9_dec_sink_event (GstVideoDecoder * decoder, GstEvent * event) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, TRUE); + break; + case GST_EVENT_FLUSH_STOP: + if (inner->d3d11_decoder) + gst_d3d11_decoder_set_flushing (inner->d3d11_decoder, decoder, FALSE); + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstFlowReturn +gst_d3d11_vp9_dec_new_sequence (GstVp9Decoder * decoder, + const GstVp9FrameHeader * frame_hdr) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + GstVideoInfo info; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; + + GST_LOG_OBJECT (self, "new sequence"); + + if (frame_hdr->profile == GST_VP9_PROFILE_0) + out_format = GST_VIDEO_FORMAT_NV12; + else if (frame_hdr->profile == GST_VP9_PROFILE_2) + out_format = GST_VIDEO_FORMAT_P010_10LE; + + if (out_format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ERROR_OBJECT (self, "Could not support profile %d", frame_hdr->profile); + return GST_FLOW_NOT_NEGOTIATED; + } + + gst_video_info_set_format (&info, + out_format, frame_hdr->width, frame_hdr->height); + + if (!gst_d3d11_decoder_configure (inner->d3d11_decoder, + decoder->input_state, &info, (gint) frame_hdr->width, + (gint) frame_hdr->height, NUM_OUTPUT_VIEW)) { + GST_ERROR_OBJECT (self, "Failed to create decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + + /* Will be updated per decode_picture */ + inner->last_frame_width = inner->last_frame_height = 0; + 
inner->last_show_frame = FALSE; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp9_dec_new_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + GstBuffer *view_buffer; + + view_buffer = gst_d3d11_decoder_get_output_view_buffer (inner->d3d11_decoder, + GST_VIDEO_DECODER (decoder)); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "No available output view buffer"); + return GST_FLOW_FLUSHING; + } + + GST_LOG_OBJECT (self, "New output view buffer %" GST_PTR_FORMAT, view_buffer); + + gst_vp9_picture_set_user_data (picture, + view_buffer, (GDestroyNotify) gst_buffer_unref); + + GST_LOG_OBJECT (self, "New VP9 picture %p", picture); + + return GST_FLOW_OK; +} + +static GstVp9Picture * +gst_d3d11_vp9_dec_duplicate_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstBuffer *view_buffer; + GstVp9Picture *new_picture; + + view_buffer = (GstBuffer *) gst_vp9_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Parent picture does not have output view buffer"); + return NULL; + } + + new_picture = gst_vp9_picture_new (); + new_picture->frame_hdr = picture->frame_hdr; + + GST_LOG_OBJECT (self, "Duplicate output with buffer %" GST_PTR_FORMAT, + view_buffer); + + gst_vp9_picture_set_user_data (new_picture, + gst_buffer_ref (view_buffer), (GDestroyNotify) gst_buffer_unref); + + return new_picture; +} + +static GstFlowReturn +gst_d3d11_vp9_dec_start_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + + inner->bitstream_buffer.resize (0); + + return GST_FLOW_OK; +} + +static ID3D11VideoDecoderOutputView * +gst_d3d11_vp9_dec_get_output_view_from_picture (GstD3D11Vp9Dec * self, + GstVp9Picture * 
picture, guint8 * view_id) +{ + GstD3D11Vp9DecInner *inner = self->inner; + GstBuffer *view_buffer; + ID3D11VideoDecoderOutputView *view; + + view_buffer = (GstBuffer *) gst_vp9_picture_get_user_data (picture); + if (!view_buffer) { + GST_DEBUG_OBJECT (self, "current picture does not have output view buffer"); + return NULL; + } + + view = + gst_d3d11_decoder_get_output_view_from_buffer (inner->d3d11_decoder, + view_buffer, view_id); + if (!view) { + GST_DEBUG_OBJECT (self, "current picture does not have output view handle"); + return NULL; + } + + return view; +} + +static void +gst_d3d11_vp9_dec_copy_frame_params (GstD3D11Vp9Dec * self, + GstVp9Picture * picture, DXVA_PicParams_VP9 * params) +{ + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + + params->profile = frame_hdr->profile; + params->frame_type = frame_hdr->frame_type; + params->show_frame = frame_hdr->show_frame; + params->error_resilient_mode = frame_hdr->error_resilient_mode; + params->subsampling_x = frame_hdr->subsampling_x; + params->subsampling_y = frame_hdr->subsampling_y; + params->refresh_frame_context = frame_hdr->refresh_frame_context; + params->frame_parallel_decoding_mode = + frame_hdr->frame_parallel_decoding_mode; + params->intra_only = frame_hdr->intra_only; + params->frame_context_idx = frame_hdr->frame_context_idx; + params->reset_frame_context = frame_hdr->reset_frame_context; + if (frame_hdr->frame_type == GST_VP9_KEY_FRAME) + params->allow_high_precision_mv = 0; + else + params->allow_high_precision_mv = frame_hdr->allow_high_precision_mv; + + params->width = frame_hdr->width; + params->height = frame_hdr->height; + params->BitDepthMinus8Luma = frame_hdr->bit_depth - 8; + params->BitDepthMinus8Chroma = frame_hdr->bit_depth - 8; + + params->interp_filter = frame_hdr->interpolation_filter; + params->log2_tile_cols = frame_hdr->tile_cols_log2; + params->log2_tile_rows = frame_hdr->tile_rows_log2; +} + +static void +gst_d3d11_vp9_dec_copy_reference_frames (GstD3D11Vp9Dec * 
self, + GstVp9Picture * picture, GstVp9Dpb * dpb, DXVA_PicParams_VP9 * params) +{ + gint i; + + for (i = 0; i < GST_VP9_REF_FRAMES; i++) { + if (dpb->pic_list[i]) { + GstVp9Picture *other_pic = dpb->pic_list[i]; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + + view = gst_d3d11_vp9_dec_get_output_view_from_picture (self, other_pic, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "picture does not have output view handle"); + return; + } + + params->ref_frame_map[i].Index7Bits = view_id; + params->ref_frame_coded_width[i] = picture->frame_hdr.width; + params->ref_frame_coded_height[i] = picture->frame_hdr.height; + } else { + params->ref_frame_map[i].bPicEntry = 0xff; + params->ref_frame_coded_width[i] = 0; + params->ref_frame_coded_height[i] = 0; + } + } +} + +static void +gst_d3d11_vp9_dec_copy_frame_refs (GstD3D11Vp9Dec * self, + GstVp9Picture * picture, DXVA_PicParams_VP9 * params) +{ + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + gint i; + + for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) { + params->frame_refs[i] = params->ref_frame_map[frame_hdr->ref_frame_idx[i]]; + } + + G_STATIC_ASSERT (G_N_ELEMENTS (params->ref_frame_sign_bias) == + G_N_ELEMENTS (frame_hdr->ref_frame_sign_bias)); + G_STATIC_ASSERT (sizeof (params->ref_frame_sign_bias) == + sizeof (frame_hdr->ref_frame_sign_bias)); + memcpy (params->ref_frame_sign_bias, + frame_hdr->ref_frame_sign_bias, sizeof (frame_hdr->ref_frame_sign_bias)); +} + +static void +gst_d3d11_vp9_dec_copy_loop_filter_params (GstD3D11Vp9Dec * self, + GstVp9Picture * picture, DXVA_PicParams_VP9 * params) +{ + GstD3D11Vp9DecInner *inner = self->inner; + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + const GstVp9LoopFilterParams *lfp = &frame_hdr->loop_filter_params; + + params->filter_level = lfp->loop_filter_level; + params->sharpness_level = lfp->loop_filter_sharpness; + params->mode_ref_delta_enabled = lfp->loop_filter_delta_enabled; + params->mode_ref_delta_update = 
lfp->loop_filter_delta_update; + params->use_prev_in_find_mv_refs = + inner->last_show_frame && !frame_hdr->error_resilient_mode; + + if (frame_hdr->frame_type != GST_VP9_KEY_FRAME && !frame_hdr->intra_only) { + params->use_prev_in_find_mv_refs &= + (frame_hdr->width == inner->last_frame_width && + frame_hdr->height == inner->last_frame_height); + } + + G_STATIC_ASSERT (G_N_ELEMENTS (params->ref_deltas) == + G_N_ELEMENTS (lfp->loop_filter_ref_deltas)); + G_STATIC_ASSERT (sizeof (params->ref_deltas) == + sizeof (lfp->loop_filter_ref_deltas)); + memcpy (params->ref_deltas, lfp->loop_filter_ref_deltas, + sizeof (lfp->loop_filter_ref_deltas)); + + G_STATIC_ASSERT (G_N_ELEMENTS (params->mode_deltas) == + G_N_ELEMENTS (lfp->loop_filter_mode_deltas)); + G_STATIC_ASSERT (sizeof (params->mode_deltas) == + sizeof (lfp->loop_filter_mode_deltas)); + memcpy (params->mode_deltas, lfp->loop_filter_mode_deltas, + sizeof (lfp->loop_filter_mode_deltas)); +} + +static void +gst_d3d11_vp9_dec_copy_quant_params (GstD3D11Vp9Dec * self, + GstVp9Picture * picture, DXVA_PicParams_VP9 * params) +{ + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + const GstVp9QuantizationParams *qp = &frame_hdr->quantization_params; + + params->base_qindex = qp->base_q_idx; + params->y_dc_delta_q = qp->delta_q_y_dc; + params->uv_dc_delta_q = qp->delta_q_uv_dc; + params->uv_ac_delta_q = qp->delta_q_uv_ac; +} + +static void +gst_d3d11_vp9_dec_copy_segmentation_params (GstD3D11Vp9Dec * self, + GstVp9Picture * picture, DXVA_PicParams_VP9 * params) +{ + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + const GstVp9SegmentationParams *sp = &frame_hdr->segmentation_params; + gint i, j; + + params->stVP9Segments.enabled = sp->segmentation_enabled; + params->stVP9Segments.update_map = sp->segmentation_update_map; + params->stVP9Segments.temporal_update = sp->segmentation_temporal_update; + params->stVP9Segments.abs_delta = sp->segmentation_abs_or_delta_update; + + G_STATIC_ASSERT (G_N_ELEMENTS 
(params->stVP9Segments.tree_probs) == + G_N_ELEMENTS (sp->segmentation_tree_probs)); + G_STATIC_ASSERT (sizeof (params->stVP9Segments.tree_probs) == + sizeof (sp->segmentation_tree_probs)); + memcpy (params->stVP9Segments.tree_probs, sp->segmentation_tree_probs, + sizeof (sp->segmentation_tree_probs)); + + G_STATIC_ASSERT (G_N_ELEMENTS (params->stVP9Segments.pred_probs) == + G_N_ELEMENTS (sp->segmentation_pred_prob)); + G_STATIC_ASSERT (sizeof (params->stVP9Segments.pred_probs) == + sizeof (sp->segmentation_pred_prob)); + + if (sp->segmentation_temporal_update) { + memcpy (params->stVP9Segments.pred_probs, sp->segmentation_pred_prob, + sizeof (params->stVP9Segments.pred_probs)); + } else { + memset (params->stVP9Segments.pred_probs, 255, + sizeof (params->stVP9Segments.pred_probs)); + } + + for (i = 0; i < GST_VP9_MAX_SEGMENTS; i++) { + params->stVP9Segments.feature_mask[i] = + (sp->feature_enabled[i][GST_VP9_SEG_LVL_ALT_Q] << 0) | + (sp->feature_enabled[i][GST_VP9_SEG_LVL_ALT_L] << 1) | + (sp->feature_enabled[i][GST_VP9_SEG_LVL_REF_FRAME] << 2) | + (sp->feature_enabled[i][GST_VP9_SEG_SEG_LVL_SKIP] << 3); + + for (j = 0; j < 3; j++) + params->stVP9Segments.feature_data[i][j] = sp->feature_data[i][j]; + params->stVP9Segments.feature_data[i][3] = 0; + } +} + +static GstFlowReturn +gst_d3d11_vp9_dec_decode_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture, GstVp9Dpb * dpb) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + DXVA_PicParams_VP9 *pic_params = &inner->pic_params; + DXVA_Slice_VPx_Short *slice = &inner->slice; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + + view = gst_d3d11_vp9_dec_get_output_view_from_picture (self, picture, + &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (pic_params, 0, sizeof (DXVA_PicParams_VP9)); + + pic_params->CurrPic.Index7Bits = view_id; + 
pic_params->uncompressed_header_size_byte_aligned = + picture->frame_hdr.frame_header_length_in_bytes; + pic_params->first_partition_size = picture->frame_hdr.header_size_in_bytes; + pic_params->StatusReportFeedbackNumber = 1; + + gst_d3d11_vp9_dec_copy_frame_params (self, picture, pic_params); + gst_d3d11_vp9_dec_copy_reference_frames (self, picture, dpb, pic_params); + gst_d3d11_vp9_dec_copy_frame_refs (self, picture, pic_params); + gst_d3d11_vp9_dec_copy_loop_filter_params (self, picture, pic_params); + gst_d3d11_vp9_dec_copy_quant_params (self, picture, pic_params); + gst_d3d11_vp9_dec_copy_segmentation_params (self, picture, pic_params); + + inner->bitstream_buffer.resize (picture->size); + memcpy (&inner->bitstream_buffer[0], picture->data, picture->size); + + slice->BSNALunitDataLocation = 0; + slice->SliceBytesInBuffer = inner->bitstream_buffer.size (); + slice->wBadSliceChopping = 0; + + inner->last_frame_width = pic_params->width; + inner->last_frame_height = pic_params->height; + inner->last_show_frame = pic_params->show_frame; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp9_dec_end_picture (GstVp9Decoder * decoder, GstVp9Picture * picture) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + ID3D11VideoDecoderOutputView *view; + guint8 view_id = 0xff; + size_t bitstream_buffer_size; + size_t bitstream_pos; + GstD3D11DecodeInputStreamArgs input_args; + + if (inner->bitstream_buffer.empty ()) { + GST_ERROR_OBJECT (self, "No bitstream buffer to submit"); + return GST_FLOW_ERROR; + } + + view = gst_d3d11_vp9_dec_get_output_view_from_picture (self, + picture, &view_id); + if (!view) { + GST_ERROR_OBJECT (self, "current picture does not have output view handle"); + return GST_FLOW_ERROR; + } + + memset (&input_args, 0, sizeof (GstD3D11DecodeInputStreamArgs)); + + bitstream_pos = inner->bitstream_buffer.size (); + bitstream_buffer_size = GST_ROUND_UP_128 (bitstream_pos); + + if 
(bitstream_buffer_size > bitstream_pos) { + size_t padding = bitstream_buffer_size - bitstream_pos; + + /* As per DXVA spec, total amount of bitstream buffer size should be + * 128 bytes aligned. If actual data is not multiple of 128 bytes, + * the last slice data needs to be zero-padded */ + inner->bitstream_buffer.resize (bitstream_buffer_size, 0); + + inner->slice.SliceBytesInBuffer += padding; + } + + input_args.picture_params = &inner->pic_params; + input_args.picture_params_size = sizeof (DXVA_PicParams_VP9); + input_args.slice_control = &inner->slice; + input_args.slice_control_size = sizeof (DXVA_Slice_VPx_Short); + input_args.bitstream = &inner->bitstream_buffer[0]; + input_args.bitstream_size = inner->bitstream_buffer.size (); + + if (!gst_d3d11_decoder_decode_frame (inner->d3d11_decoder, view, &input_args)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_d3d11_vp9_dec_output_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstD3D11Vp9Dec *self = GST_D3D11_VP9_DEC (decoder); + GstD3D11Vp9DecInner *inner = self->inner; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstBuffer *view_buffer; + + GST_LOG_OBJECT (self, "Outputting picture %p", picture); + + view_buffer = (GstBuffer *) gst_vp9_picture_get_user_data (picture); + + if (!view_buffer) { + GST_ERROR_OBJECT (self, "Could not get output view"); + goto error; + } + + if (!gst_d3d11_decoder_process_output (inner->d3d11_decoder, vdec, + picture->frame_hdr.width, picture->frame_hdr.height, view_buffer, + &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to copy buffer"); + goto error; + } + + gst_vp9_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_vp9_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); + + return GST_FLOW_ERROR; +} + +void +gst_d3d11_vp9_dec_register (GstPlugin * plugin, GstD3D11Device * device, + guint rank) +{ + GType 
type; + gchar *type_name; + gchar *feature_name; + guint index = 0; + guint i; + const GUID *profile; + GTypeInfo type_info = { + sizeof (GstD3D11Vp9DecClass), + NULL, + NULL, + (GClassInitFunc) gst_d3d11_vp9_dec_class_init, + NULL, + NULL, + sizeof (GstD3D11Vp9Dec), + 0, + (GInstanceInitFunc) gst_d3d11_vp9_dec_init, + }; + const GUID *profile2_guid = NULL; + const GUID *profile0_guid = NULL; + GstCaps *sink_caps = NULL; + GstCaps *src_caps = NULL; + guint max_width = 0; + guint max_height = 0; + guint resolution; + gboolean have_profile2 = FALSE; + gboolean have_profile0 = FALSE; + DXGI_FORMAT format = DXGI_FORMAT_UNKNOWN; + GValue vp9_profiles = G_VALUE_INIT; + + have_profile2 = gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_VP9, GST_VIDEO_FORMAT_P010_10LE, &profile2_guid); + if (!have_profile2) { + GST_DEBUG_OBJECT (device, + "decoder does not support VP9_VLD_10BIT_PROFILE2"); + } else { + have_profile2 &= + gst_d3d11_decoder_supports_format (device, + profile2_guid, DXGI_FORMAT_P010); + if (!have_profile2) { + GST_FIXME_OBJECT (device, "device does not support P010 format"); + } + } + + have_profile0 = gst_d3d11_decoder_get_supported_decoder_profile (device, + GST_DXVA_CODEC_VP9, GST_VIDEO_FORMAT_NV12, &profile0_guid); + if (!have_profile0) { + GST_DEBUG_OBJECT (device, "decoder does not support VP9_VLD_PROFILE0"); + } else { + have_profile0 = + gst_d3d11_decoder_supports_format (device, profile0_guid, + DXGI_FORMAT_NV12); + if (!have_profile0) { + GST_FIXME_OBJECT (device, "device does not support NV12 format"); + } + } + + if (!have_profile2 && !have_profile0) { + GST_INFO_OBJECT (device, "device does not support VP9 decoding"); + return; + } + + if (have_profile0) { + profile = profile0_guid; + format = DXGI_FORMAT_NV12; + } else { + profile = profile2_guid; + format = DXGI_FORMAT_P010; + } + + for (i = 0; i < G_N_ELEMENTS (gst_dxva_resolutions); i++) { + if (gst_d3d11_decoder_supports_resolution (device, profile, + format, 
gst_dxva_resolutions[i].width, + gst_dxva_resolutions[i].height)) { + max_width = gst_dxva_resolutions[i].width; + max_height = gst_dxva_resolutions[i].height; + + GST_DEBUG_OBJECT (device, + "device support resolution %dx%d", max_width, max_height); + } else { + break; + } + } + + if (max_width == 0 || max_height == 0) { + GST_WARNING_OBJECT (device, "Couldn't query supported resolution"); + return; + } + + sink_caps = gst_caps_from_string ("video/x-vp9, alignment = (string) frame"); + src_caps = gst_caps_from_string ("video/x-raw(" + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY "); video/x-raw"); + + g_value_init (&vp9_profiles, GST_TYPE_LIST); + + if (have_profile0) { + GValue vp9_profile_val = G_VALUE_INIT; + + g_value_init (&vp9_profile_val, G_TYPE_STRING); + g_value_set_string (&vp9_profile_val, "0"); + gst_value_list_append_and_take_value (&vp9_profiles, &vp9_profile_val); + } + + if (have_profile2) { + GValue format_list = G_VALUE_INIT; + GValue format_value = G_VALUE_INIT; + GValue vp9_profile_val = G_VALUE_INIT; + + g_value_init (&format_list, GST_TYPE_LIST); + + g_value_init (&format_value, G_TYPE_STRING); + g_value_set_string (&format_value, "NV12"); + gst_value_list_append_and_take_value (&format_list, &format_value); + + g_value_init (&format_value, G_TYPE_STRING); + g_value_set_string (&format_value, "P010_10LE"); + gst_value_list_append_and_take_value (&format_list, &format_value); + + gst_caps_set_value (src_caps, "format", &format_list); + g_value_unset (&format_list); + + g_value_init (&vp9_profile_val, G_TYPE_STRING); + g_value_set_string (&vp9_profile_val, "2"); + gst_value_list_append_and_take_value (&vp9_profiles, &vp9_profile_val); + } else { + gst_caps_set_simple (src_caps, "format", G_TYPE_STRING, "NV12", NULL); + } + + gst_caps_set_value (sink_caps, "profile", &vp9_profiles); + g_value_unset (&vp9_profiles); + + /* To cover both landscape and portrait, select max value */ + resolution = MAX (max_width, max_height); + gst_caps_set_simple 
(sink_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 1, resolution, + "height", GST_TYPE_INT_RANGE, 1, resolution, NULL); + + type_info.class_data = + gst_d3d11_decoder_class_data_new (device, GST_DXVA_CODEC_VP9, + sink_caps, src_caps); + + type_name = g_strdup ("GstD3D11Vp9Dec"); + feature_name = g_strdup ("d3d11vp9dec"); + + while (g_type_from_name (type_name)) { + index++; + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstD3D11Vp9Device%dDec", index); + feature_name = g_strdup_printf ("d3d11vp9device%ddec", index); + } + + type = g_type_register_static (GST_TYPE_VP9_DECODER, + type_name, &type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && index != 0) + rank--; + + if (index != 0) + gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11vp9dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11vp9dec.h
Changed
@@ -26,7 +26,6 @@
 void gst_d3d11_vp9_dec_register (GstPlugin * plugin,
     GstD3D11Device * device,
-    GstD3D11Decoder * decoder,
     guint rank);
 
 G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window.cpp
Changed
@@ -23,27 +23,26 @@ #endif #include "gstd3d11window.h" -#include "gstd3d11device.h" -#include "gstd3d11memory.h" -#include "gstd3d11utils.h" +#include "gstd3d11pluginutils.h" -#if GST_D3D11_WINAPI_ONLY_APP +#if GST_D3D11_WINAPI_APP /* workaround for GetCurrentTime collision */ #ifdef GetCurrentTime #undef GetCurrentTime #endif #include <windows.ui.xaml.h> #include <windows.applicationmodel.core.h> +#endif + #include <wrl.h> -#include <wrl/wrappers/corewrappers.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; -#endif +/* *INDENT-ON* */ -extern "C" { GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_window_debug); #define GST_CAT_DEFAULT gst_d3d11_window_debug -} + enum { @@ -54,12 +53,14 @@ PROP_FULLSCREEN_TOGGLE_MODE, PROP_FULLSCREEN, PROP_WINDOW_HANDLE, + PROP_RENDER_STATS, }; #define DEFAULT_ENABLE_NAVIGATION_EVENTS TRUE #define DEFAULT_FORCE_ASPECT_RATIO TRUE #define DEFAULT_FULLSCREEN_TOGGLE_MODE GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_NONE #define DEFAULT_FULLSCREEN FALSE +#define DEFAULT_RENDER_STATS FALSE enum { @@ -102,9 +103,13 @@ GValue * value, GParamSpec * pspec); static void gst_d3d11_window_dispose (GObject * object); static GstFlowReturn gst_d3d111_window_present (GstD3D11Window * self, - GstBuffer * buffer); + GstBuffer * buffer, ID3D11VideoProcessorOutputView * pov, + ID3D11RenderTargetView * rtv); static void gst_d3d11_window_on_resize_default (GstD3D11Window * window, guint width, guint height); +static gboolean gst_d3d11_window_prepare_default (GstD3D11Window * window, + guint display_width, guint display_height, GstCaps * caps, + gboolean * video_processor_available, GError ** error); static void gst_d3d11_window_class_init (GstD3D11WindowClass * klass) @@ -116,6 +121,7 @@ gobject_class->dispose = gst_d3d11_window_dispose; klass->on_resize = GST_DEBUG_FUNCPTR (gst_d3d11_window_on_resize_default); + klass->prepare = GST_DEBUG_FUNCPTR (gst_d3d11_window_prepare_default); g_object_class_install_property (gobject_class, PROP_D3D11_DEVICE, 
g_param_spec_object ("d3d11device", "D3D11 Device", @@ -176,6 +182,7 @@ self->enable_navigation_events = DEFAULT_ENABLE_NAVIGATION_EVENTS; self->fullscreen_toggle_mode = GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_NONE; self->fullscreen = DEFAULT_FULLSCREEN; + self->render_stats = DEFAULT_RENDER_STATS; } static void @@ -248,20 +255,9 @@ gst_d3d11_window_release_resources (GstD3D11Device * device, GstD3D11Window * window) { - if (window->rtv) { - window->rtv->Release (); - window->rtv = NULL; - } - - if (window->pov) { - window->pov->Release (); - window->pov = NULL; - } - - if (window->swap_chain) { - window->swap_chain->Release (); - window->swap_chain = NULL; - } + GST_D3D11_CLEAR_COM (window->rtv); + GST_D3D11_CLEAR_COM (window->pov); + GST_D3D11_CLEAR_COM (window->swap_chain); } static void @@ -274,7 +270,7 @@ } g_clear_pointer (&self->processor, gst_d3d11_video_processor_free); - g_clear_pointer (&self->converter, gst_d3d11_color_converter_free); + g_clear_pointer (&self->converter, gst_d3d11_converter_free); g_clear_pointer (&self->compositor, gst_d3d11_overlay_compositor_free); gst_clear_buffer (&self->cached_buffer); @@ -302,15 +298,8 @@ device_handle = gst_d3d11_device_get_device_handle (window->device); swap_chain = window->swap_chain; - if (window->rtv) { - window->rtv->Release (); - window->rtv = NULL; - } - - if (window->pov) { - window->pov->Release (); - window->pov = NULL; - } + GST_D3D11_CLEAR_COM (window->rtv); + GST_D3D11_CLEAR_COM (window->pov); swap_chain->GetDesc (&swap_desc); hr = swap_chain->ResizeBuffers (0, width, height, window->dxgi_format, @@ -320,7 +309,7 @@ goto done; } - hr = swap_chain->GetBuffer (0, IID_ID3D11Texture2D, (void **) &backbuffer); + hr = swap_chain->GetBuffer (0, IID_PPV_ARGS (&backbuffer)); if (!gst_d3d11_result (hr, window->device)) { GST_ERROR_OBJECT (window, "Cannot get backbuffer from swapchain, hr: 0x%x", (guint) hr); @@ -358,8 +347,7 @@ "New client area %dx%d, render rect x: %d, y: %d, %dx%d", desc.Width, 
desc.Height, rst_rect.x, rst_rect.y, rst_rect.w, rst_rect.h); - hr = device_handle->CreateRenderTargetView ((ID3D11Resource *) backbuffer, - NULL, &window->rtv); + hr = device_handle->CreateRenderTargetView (backbuffer, NULL, &window->rtv); if (!gst_d3d11_result (hr, window->device)) { GST_ERROR_OBJECT (window, "Cannot create render target view, hr: 0x%x", (guint) hr); @@ -374,18 +362,20 @@ pov_desc.Texture2D.MipSlice = 0; if (!gst_d3d11_video_processor_create_output_view (window->processor, - &pov_desc, (ID3D11Resource *) backbuffer, &window->pov)) + &pov_desc, backbuffer, &window->pov)) goto done; } window->first_present = TRUE; /* redraw the last scene if cached buffer exits */ - gst_d3d111_window_present (window, NULL); + if (window->cached_buffer) { + gst_d3d111_window_present (window, window->cached_buffer, + window->pov, window->rtv); + } done: - if (backbuffer) - backbuffer->Release (); + GST_D3D11_CLEAR_COM (backbuffer); gst_d3d11_device_unlock (window->device); } @@ -422,13 +412,31 @@ gboolean supported; } GstD3D11WindowDisplayFormat; - gboolean gst_d3d11_window_prepare (GstD3D11Window * window, guint display_width, guint display_height, GstCaps * caps, gboolean * video_processor_available, GError ** error) { GstD3D11WindowClass *klass; + + g_return_val_if_fail (GST_IS_D3D11_WINDOW (window), FALSE); + + klass = GST_D3D11_WINDOW_GET_CLASS (window); + g_assert (klass->prepare != NULL); + + GST_DEBUG_OBJECT (window, "Prepare window, display resolution %dx%d, caps %" + GST_PTR_FORMAT, display_width, display_height, caps); + + return klass->prepare (window, display_width, display_height, caps, + video_processor_available, error); +} + +static gboolean +gst_d3d11_window_prepare_default (GstD3D11Window * window, guint display_width, + guint display_height, GstCaps * caps, gboolean * video_processor_available, + GError ** error) +{ + GstD3D11WindowClass *klass; guint swapchain_flags = 0; ID3D11Device *device_handle; guint i; @@ -438,32 +446,29 @@ 
D3D11_FORMAT_SUPPORT_TEXTURE2D | D3D11_FORMAT_SUPPORT_DISPLAY; UINT supported_flags = 0; GstD3D11WindowDisplayFormat formats[] = { - { DXGI_FORMAT_R8G8B8A8_UNORM, GST_VIDEO_FORMAT_RGBA, FALSE }, - { DXGI_FORMAT_B8G8R8A8_UNORM, GST_VIDEO_FORMAT_BGRA, FALSE }, - { DXGI_FORMAT_R10G10B10A2_UNORM, GST_VIDEO_FORMAT_RGB10A2_LE, FALSE }, + {DXGI_FORMAT_R8G8B8A8_UNORM, GST_VIDEO_FORMAT_RGBA, FALSE}, + {DXGI_FORMAT_B8G8R8A8_UNORM, GST_VIDEO_FORMAT_BGRA, FALSE}, + {DXGI_FORMAT_R10G10B10A2_UNORM, GST_VIDEO_FORMAT_RGB10A2_LE, FALSE}, }; const GstD3D11WindowDisplayFormat *chosen_format = NULL; - const GstDxgiColorSpace * chosen_colorspace = NULL; -#if (DXGI_HEADER_VERSION >= 4) + const GstDxgiColorSpace *chosen_colorspace = NULL; +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) gboolean have_hdr10 = FALSE; DXGI_COLOR_SPACE_TYPE native_colorspace_type = DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709; #endif -#if (DXGI_HEADER_VERSION >= 5) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) DXGI_HDR_METADATA_HDR10 hdr10_metadata = { 0, }; #endif - g_return_val_if_fail (GST_IS_D3D11_WINDOW (window), FALSE); - - GST_DEBUG_OBJECT (window, "Prepare window, display resolution %dx%d, caps %" - GST_PTR_FORMAT, display_width, display_height, caps); - /* Step 1: Clear old resources and objects */ gst_clear_buffer (&window->cached_buffer); g_clear_pointer (&window->processor, gst_d3d11_video_processor_free); - g_clear_pointer (&window->converter, gst_d3d11_color_converter_free); + g_clear_pointer (&window->converter, gst_d3d11_converter_free); g_clear_pointer (&window->compositor, gst_d3d11_overlay_compositor_free); + window->processor_in_use = FALSE; + /* Step 2: Decide display color format * If upstream format is 10bits, try DXGI_FORMAT_R10G10B10A2_UNORM first * Otherwise, use DXGI_FORMAT_B8G8R8A8_UNORM or DXGI_FORMAT_B8G8R8A8_UNORM @@ -521,14 +526,32 @@ g_assert (chosen_format != NULL); - GST_DEBUG_OBJECT (window, "chosen rendero format %s (DXGI_FORMAT %d)", + GST_DEBUG_OBJECT (window, "chosen render format %s 
(DXGI_FORMAT %d)", gst_video_format_to_string (chosen_format->gst_format), chosen_format->dxgi_format); /* Step 3: Create swapchain * (or reuse old swapchain if the format is not changed) */ window->allow_tearing = FALSE; - g_object_get (window->device, "allow-tearing", &window->allow_tearing, NULL); + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) + { + ComPtr < IDXGIFactory5 > factory5; + IDXGIFactory1 *factory_handle; + BOOL allow_tearing = FALSE; + + factory_handle = gst_d3d11_device_get_dxgi_factory_handle (window->device); + hr = factory_handle->QueryInterface (IID_PPV_ARGS (&factory5)); + if (SUCCEEDED (hr)) { + hr = factory5->CheckFeatureSupport (DXGI_FEATURE_PRESENT_ALLOW_TEARING, + (void *) &allow_tearing, sizeof (allow_tearing)); + } + + if (SUCCEEDED (hr) && allow_tearing) + window->allow_tearing = allow_tearing; + } +#endif + if (window->allow_tearing) { GST_DEBUG_OBJECT (window, "device support tearning"); swapchain_flags |= DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING; @@ -540,15 +563,13 @@ klass = GST_D3D11_WINDOW_GET_CLASS (window); if (!window->swap_chain && !klass->create_swap_chain (window, window->dxgi_format, - display_width, display_height, swapchain_flags, &window->swap_chain)) { + display_width, display_height, swapchain_flags, + &window->swap_chain)) { GST_ERROR_OBJECT (window, "Cannot create swapchain"); g_set_error (error, GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_FAILED, "Cannot create swapchain"); - gst_d3d11_device_unlock (window->device); - - return FALSE; + goto error; } - gst_d3d11_device_unlock (window->device); /* this rect struct will be used to calculate render area */ window->render_rect.left = 0; @@ -562,29 +583,17 @@ window->input_rect.bottom = GST_VIDEO_INFO_HEIGHT (&window->info); /* Step 4: Decide render color space and set it on converter/processor */ - - /* check HDR10 metadata. If HDR APIs are available, BT2020 primaries colorspcae - * will be used. - * - * FIXME: need to query h/w level support. 
If that's not the case, tone-mapping - * should be placed somewhere. - * - * FIXME: Non-HDR colorspace with BT2020 primaries will break rendering. - * https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/1175 - * To workaround it, BT709 colorspace will be chosen for non-HDR case. - */ -#if (DXGI_HEADER_VERSION >= 5) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) { GstVideoMasteringDisplayInfo minfo; GstVideoContentLightLevel cll; if (gst_video_mastering_display_info_from_caps (&minfo, caps) && gst_video_content_light_level_from_caps (&cll, caps)) { - IDXGISwapChain4 *swapchain4 = NULL; + ComPtr < IDXGISwapChain4 > swapchain4; HRESULT hr; - hr = window->swap_chain->QueryInterface (IID_IDXGISwapChain4, - (void **) &swapchain4); + hr = window->swap_chain->QueryInterface (IID_PPV_ARGS (&swapchain4)); if (gst_d3d11_result (hr, window->device)) { GST_DEBUG_OBJECT (window, "Have HDR metadata, set to DXGI swapchain"); @@ -598,8 +607,6 @@ } else { have_hdr10 = TRUE; } - - swapchain4->Release (); } } } @@ -613,26 +620,29 @@ window->render_info.colorimetry.primaries = window->info.colorimetry.primaries; window->render_info.colorimetry.transfer = window->info.colorimetry.transfer; - window->render_info.colorimetry.range = window->info.colorimetry.range; + /* prefer FULL range RGB. 
STUDIO range doesn't seem to be well supported + * color space by GPUs and we don't need to preserve color range for + * target display color space type */ + window->render_info.colorimetry.range = GST_VIDEO_COLOR_RANGE_0_255; -#if (DXGI_HEADER_VERSION >= 4) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) { - IDXGISwapChain3 *swapchain3 = NULL; + ComPtr < IDXGISwapChain3 > swapchain3; HRESULT hr; - hr = window->swap_chain->QueryInterface (IID_IDXGISwapChain3, - (void **) &swapchain3); + hr = window->swap_chain->QueryInterface (IID_PPV_ARGS (&swapchain3)); if (gst_d3d11_result (hr, window->device)) { - chosen_colorspace = gst_d3d11_find_swap_chain_color_space (&window->render_info, - swapchain3, have_hdr10); + chosen_colorspace = + gst_d3d11_find_swap_chain_color_space (&window->render_info, + swapchain3.Get ()); if (chosen_colorspace) { native_colorspace_type = (DXGI_COLOR_SPACE_TYPE) chosen_colorspace->dxgi_color_space_type; hr = swapchain3->SetColorSpace1 (native_colorspace_type); if (!gst_d3d11_result (hr, window->device)) { GST_WARNING_OBJECT (window, "Failed to set colorspace %d, hr: 0x%x", - native_colorspace_type, (guint) hr); + native_colorspace_type, (guint) hr); chosen_colorspace = NULL; native_colorspace_type = DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709; } else { @@ -644,14 +654,10 @@ chosen_colorspace->primaries; window->render_info.colorimetry.transfer = chosen_colorspace->transfer; - window->render_info.colorimetry.range = - chosen_colorspace->range; - window->render_info.colorimetry.matrix = - chosen_colorspace->matrix; + window->render_info.colorimetry.range = chosen_colorspace->range; + window->render_info.colorimetry.matrix = chosen_colorspace->matrix; } } - - swapchain3->Release (); } } #endif @@ -664,128 +670,75 @@ window->render_info.colorimetry.transfer = GST_VIDEO_TRANSFER_BT709; window->render_info.colorimetry.range = GST_VIDEO_COLOR_RANGE_0_255; } - - /* FIXME: need to verify video processor on Xbox - * 
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1312 - */ - - /* XXX: Depending on driver/vendor, d3d11 video processor might not support - * HDR10 metadata. Even worse thing here is, - * although the d3d11 video processor's capability flag indicated that - * HDR10 metadata is supported, it would result to black screen when HDR10 - * metadata is passed to d3d11 video processor. (without any error message). - * Let's disable d3d11 video processor. - */ - if (!have_hdr10 && gst_d3d11_get_device_vendor (window->device) != - GST_D3D11_DEVICE_VENDOR_XBOX) { - window->processor = - gst_d3d11_video_processor_new (window->device, - GST_VIDEO_INFO_WIDTH (&window->info), - GST_VIDEO_INFO_HEIGHT (&window->info), - display_width, display_height); - } - - if (window->processor) { - const GstD3D11Format *in_format; - const GstD3D11Format *out_format; - gboolean input_support = FALSE; - gboolean out_support = FALSE; - - in_format = gst_d3d11_device_format_from_gst (window->device, +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) + if (chosen_colorspace) { + const GstDxgiColorSpace *in_color_space = + gst_d3d11_video_info_to_dxgi_color_space (&window->info); + const GstD3D11Format *in_format = + gst_d3d11_device_format_from_gst (window->device, GST_VIDEO_INFO_FORMAT (&window->info)); - out_format = gst_d3d11_device_format_from_gst (window->device, - GST_VIDEO_INFO_FORMAT (&window->render_info)); + gboolean hardware = FALSE; + GstD3D11VideoProcessor *processor = NULL; - if (gst_d3d11_video_processor_supports_input_format (window->processor, - in_format->dxgi_format)) { - input_support = TRUE; - } else { - GST_DEBUG_OBJECT (window, - "IVideoProcessor cannot support input dxgi format %d", - in_format->dxgi_format); + if (in_color_space && in_format && + in_format->dxgi_format != DXGI_FORMAT_UNKNOWN) { + g_object_get (window->device, "hardware", &hardware, NULL); } - if (gst_d3d11_video_processor_supports_output_format (window->processor, - out_format->dxgi_format)) { - 
out_support = TRUE; - } else { - GST_DEBUG_OBJECT (window, - "IVideoProcessor cannot support output dxgi format %d", - out_format->dxgi_format); + if (hardware) { + processor = + gst_d3d11_video_processor_new (window->device, + GST_VIDEO_INFO_WIDTH (&window->info), + GST_VIDEO_INFO_HEIGHT (&window->info), display_width, display_height); } - if (!input_support || !out_support) { - gst_d3d11_video_processor_free (window->processor); - window->processor = NULL; - } else { - gboolean processor_input_configured = FALSE; - gboolean processor_output_configured = FALSE; - - GST_DEBUG_OBJECT (window, "IVideoProcessor interface available"); - *video_processor_available = TRUE; -#if (DXGI_HEADER_VERSION >= 5) - if (have_hdr10) { - GST_DEBUG_OBJECT (window, "Set HDR metadata on video processor"); - gst_d3d11_video_processor_set_input_hdr10_metadata (window->processor, - &hdr10_metadata); - gst_d3d11_video_processor_set_output_hdr10_metadata (window->processor, - &hdr10_metadata); - } -#endif - -#if (DXGI_HEADER_VERSION >= 4) - { - DXGI_COLOR_SPACE_TYPE in_native_cs = - DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709; - const GstDxgiColorSpace *in_cs = - gst_d3d11_video_info_to_dxgi_color_space (&window->info); - - if (in_cs) { - in_native_cs = (DXGI_COLOR_SPACE_TYPE) in_cs->dxgi_color_space_type; - } else { - GST_WARNING_OBJECT (window, - "Cannot figure out input dxgi color space"); - if (GST_VIDEO_INFO_IS_RGB (&window->info)) { - in_native_cs = DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709; - } else { - in_native_cs = DXGI_COLOR_SPACE_YCBCR_STUDIO_G22_LEFT_P709; - } + if (processor) { + DXGI_FORMAT in_dxgi_format = in_format->dxgi_format; + DXGI_FORMAT out_dxgi_format = chosen_format->dxgi_format; + DXGI_COLOR_SPACE_TYPE in_dxgi_color_space = + (DXGI_COLOR_SPACE_TYPE) in_color_space->dxgi_color_space_type; + DXGI_COLOR_SPACE_TYPE out_dxgi_color_space = native_colorspace_type; + + if (!gst_d3d11_video_processor_check_format_conversion (processor, + in_dxgi_format, in_dxgi_color_space, 
out_dxgi_format, + out_dxgi_color_space)) { + GST_DEBUG_OBJECT (window, "Conversion is not supported by device"); + gst_d3d11_video_processor_free (processor); + processor = NULL; + } else { + GST_DEBUG_OBJECT (window, "video processor supports conversion"); + gst_d3d11_video_processor_set_input_dxgi_color_space (processor, + in_dxgi_color_space); + gst_d3d11_video_processor_set_output_dxgi_color_space (processor, + out_dxgi_color_space); + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) + if (have_hdr10) { + GST_DEBUG_OBJECT (window, "Set HDR metadata on video processor"); + gst_d3d11_video_processor_set_input_hdr10_metadata (processor, + &hdr10_metadata); + gst_d3d11_video_processor_set_output_hdr10_metadata (processor, + &hdr10_metadata); } - - GST_DEBUG_OBJECT (window, - "Set color space on video processor, in %d, out %d", - in_native_cs, native_colorspace_type); - processor_input_configured = - gst_d3d11_video_processor_set_input_dxgi_color_space - (window->processor, in_native_cs); - processor_output_configured = - gst_d3d11_video_processor_set_output_dxgi_color_space - (window->processor, native_colorspace_type); - } #endif - if (!processor_input_configured) { - gst_d3d11_video_processor_set_input_color_space (window->processor, - &window->info.colorimetry); } - if (!processor_output_configured) { - gst_d3d11_video_processor_set_output_color_space (window->processor, - &window->render_info.colorimetry); - } + window->processor = processor; } } +#endif + *video_processor_available = !!window->processor; /* configure shader even if video processor is available for fallback */ window->converter = - gst_d3d11_color_converter_new (window->device, &window->info, - &window->render_info); + gst_d3d11_converter_new (window->device, &window->info, + &window->render_info, nullptr); if (!window->converter) { GST_ERROR_OBJECT (window, "Cannot create converter"); g_set_error (error, GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_FAILED, "Cannot create converter"); - - return FALSE; + 
goto error; } window->compositor = @@ -794,9 +747,9 @@ GST_ERROR_OBJECT (window, "Cannot create overlay compositor"); g_set_error (error, GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_FAILED, "Cannot create overlay compositor"); - - return FALSE; + goto error; } + gst_d3d11_device_unlock (window->device); /* call resize to allocated resources */ klass->on_resize (window, display_width, display_height); @@ -808,6 +761,11 @@ GST_DEBUG_OBJECT (window, "New swap chain 0x%p created", window->swap_chain); return TRUE; + +error: + gst_d3d11_device_unlock (window->device); + + return FALSE; } void @@ -824,12 +782,30 @@ } void -gst_d3d11_window_set_render_rectangle (GstD3D11Window * window, gint x, gint y, - gint width, gint height) +gst_d3d11_window_set_render_rectangle (GstD3D11Window * window, + const GstVideoRectangle * rect) +{ + GstD3D11WindowClass *klass; + + g_return_if_fail (GST_IS_D3D11_WINDOW (window)); + + klass = GST_D3D11_WINDOW_GET_CLASS (window); + + if (klass->set_render_rectangle) + klass->set_render_rectangle (window, rect); +} + +void +gst_d3d11_window_set_title (GstD3D11Window * window, const gchar * title) { + GstD3D11WindowClass *klass; + g_return_if_fail (GST_IS_D3D11_WINDOW (window)); - /* TODO: resize window and view */ + klass = GST_D3D11_WINDOW_GET_CLASS (window); + + if (klass->set_title) + klass->set_title (window, title); } static gboolean @@ -837,8 +813,7 @@ GstBuffer * buffer, ID3D11VideoProcessorInputView ** in_view) { GstD3D11Memory *mem; - ID3D11VideoProcessorInputView *view; - GQuark quark; + ID3D11VideoProcessorInputView *piv; if (!self->processor) return FALSE; @@ -847,122 +822,224 @@ return FALSE; mem = (GstD3D11Memory *) gst_buffer_peek_memory (buffer, 0); - - if (!gst_d3d11_video_processor_check_bind_flags_for_input_view - (mem->desc.BindFlags)) { + piv = gst_d3d11_video_processor_get_input_view (self->processor, mem); + if (!piv) { + GST_LOG_OBJECT (self, "Failed to get processor input view"); return FALSE; } - quark = 
gst_d3d11_video_processor_input_view_quark (); - view = (ID3D11VideoProcessorInputView *) - gst_mini_object_get_qdata (GST_MINI_OBJECT (mem), quark); + *in_view = piv; - if (!view) { - D3D11_VIDEO_PROCESSOR_INPUT_VIEW_DESC in_desc; + return TRUE; +} - in_desc.FourCC = 0; - in_desc.ViewDimension = D3D11_VPIV_DIMENSION_TEXTURE2D; - in_desc.Texture2D.MipSlice = 0; - in_desc.Texture2D.ArraySlice = mem->subresource_index; +static gboolean +gst_d3d11_window_do_processor (GstD3D11Window * self, + ID3D11VideoProcessorInputView * piv, ID3D11VideoProcessorOutputView * pov, + RECT * input_rect) +{ + gboolean ret; - GST_TRACE_OBJECT (self, "Create new processor input view"); + ret = gst_d3d11_video_processor_render_unlocked (self->processor, + input_rect, piv, &self->render_rect, pov); + if (!ret) { + GST_ERROR_OBJECT (self, "Couldn't render to backbuffer using processor"); + } else { + GST_TRACE_OBJECT (self, "Rendered using processor"); + self->processor_in_use = TRUE; + } - if (!gst_d3d11_video_processor_create_input_view (self->processor, - &in_desc, mem->texture, &view)) { - GST_LOG_OBJECT (self, "Failed to create processor input view"); - return FALSE; - } + return ret; +} - gst_mini_object_set_qdata (GST_MINI_OBJECT (mem), quark, view, - (GDestroyNotify) gst_d3d11_video_processor_input_view_release); - } else { - GST_TRACE_OBJECT (self, "Reuse existing processor input view %p", view); +static gboolean +gst_d3d11_window_do_convert (GstD3D11Window * self, + ID3D11ShaderResourceView * srv[GST_VIDEO_MAX_PLANES], + ID3D11RenderTargetView * rtv, RECT * input_rect) +{ + if (!gst_d3d11_converter_update_src_rect (self->converter, input_rect)) { + GST_ERROR_OBJECT (self, "Failed to update src rect"); + return FALSE; } - *in_view = view; + if (!gst_d3d11_converter_convert_unlocked (self->converter, + srv, &rtv, NULL, NULL)) { + GST_ERROR_OBJECT (self, "Couldn't render to backbuffer using converter"); + return FALSE; + } else { + GST_TRACE_OBJECT (self, "Rendered using converter"); 
+ } return TRUE; } static GstFlowReturn -gst_d3d111_window_present (GstD3D11Window * self, GstBuffer * buffer) +gst_d3d111_window_present (GstD3D11Window * self, GstBuffer * buffer, + ID3D11VideoProcessorOutputView * pov, ID3D11RenderTargetView * rtv) { GstD3D11WindowClass *klass = GST_D3D11_WINDOW_GET_CLASS (self); GstFlowReturn ret = GST_FLOW_OK; guint present_flags = 0; - if (buffer) { - gst_buffer_replace (&self->cached_buffer, buffer); - } + if (!buffer) + return GST_FLOW_OK; - if (self->cached_buffer) { + { + GstMapInfo infos[GST_VIDEO_MAX_PLANES]; ID3D11ShaderResourceView *srv[GST_VIDEO_MAX_PLANES]; ID3D11VideoProcessorInputView *piv = NULL; - guint i, j, k; - - if (!gst_d3d11_window_buffer_ensure_processor_input (self, - self->cached_buffer, &piv)) { - for (i = 0, j = 0; i < gst_buffer_n_memory (self->cached_buffer); i++) { - GstD3D11Memory *mem = - (GstD3D11Memory *) gst_buffer_peek_memory (self->cached_buffer, i); - for (k = 0; k < mem->num_shader_resource_views; k++) { - srv[j] = mem->shader_resource_view[k]; - j++; - } + ID3D11Device *device_handle = + gst_d3d11_device_get_device_handle (self->device); + gboolean can_convert = FALSE; + gboolean can_process = FALSE; + gboolean convert_ret = FALSE; + RECT input_rect = self->input_rect; + GstVideoCropMeta *crop_meta; + + /* Map memory in any case so that we can upload pending stage texture */ + if (!gst_d3d11_buffer_map (buffer, device_handle, infos, GST_MAP_READ)) { + GST_ERROR_OBJECT (self, "Couldn't map buffer"); + return GST_FLOW_ERROR; + } + + can_convert = gst_d3d11_buffer_get_shader_resource_view (buffer, srv); + if (pov) { + can_process = gst_d3d11_window_buffer_ensure_processor_input (self, + buffer, &piv); + } + + if (!can_convert && !can_process) { + GST_ERROR_OBJECT (self, "Input texture cannot be used for converter"); + return GST_FLOW_ERROR; + } + + crop_meta = gst_buffer_get_video_crop_meta (buffer); + /* Do minimal validate */ + if (crop_meta) { + ID3D11Texture2D *texture = (ID3D11Texture2D 
*) infos[0].data; + D3D11_TEXTURE2D_DESC desc = { 0, }; + + texture->GetDesc (&desc); + + if (desc.Width < crop_meta->x + crop_meta->width || + desc.Height < crop_meta->y + crop_meta->height) { + GST_WARNING_OBJECT (self, "Invalid crop meta, ignore"); + + crop_meta = nullptr; } } + if (crop_meta) { + input_rect.left = crop_meta->x; + input_rect.right = crop_meta->x + crop_meta->width; + input_rect.top = crop_meta->y; + input_rect.bottom = crop_meta->y + crop_meta->height; + } + if (self->first_present) { - gst_d3d11_color_converter_update_rect (self->converter, - &self->render_rect); - gst_d3d11_overlay_compositor_update_rect (self->compositor, - &self->render_rect); + D3D11_VIEWPORT viewport; + + viewport.TopLeftX = self->render_rect.left; + viewport.TopLeftY = self->render_rect.top; + viewport.Width = self->render_rect.right - self->render_rect.left; + viewport.Height = self->render_rect.bottom - self->render_rect.top; + viewport.MinDepth = 0.0f; + viewport.MaxDepth = 1.0f; + gst_d3d11_converter_update_viewport (self->converter, &viewport); + gst_d3d11_overlay_compositor_update_viewport (self->compositor, + &viewport); } - if (self->processor && piv && self->pov) { - if (!gst_d3d11_video_processor_render_unlocked (self->processor, - &self->input_rect, piv, &self->render_rect, self->pov)) { - GST_ERROR_OBJECT (self, "Couldn't render to backbuffer using processor"); - return GST_FLOW_ERROR; - } else { - GST_TRACE_OBJECT (self, "Rendered using processor"); - } + /* Converter preference order + * 1) If this texture can be converted via processor, and we used processor + * previously, use processor + * 2) If SRV is available, use converter + * 3) otherwise, use processor + */ + if (can_process && self->processor_in_use) { + convert_ret = gst_d3d11_window_do_processor (self, piv, pov, &input_rect); + } else if (can_convert) { + convert_ret = gst_d3d11_window_do_convert (self, srv, rtv, &input_rect); + } else if (can_process) { + convert_ret = 
gst_d3d11_window_do_processor (self, piv, pov, &input_rect); } else { - if (!gst_d3d11_color_converter_convert_unlocked (self->converter, - srv, &self->rtv)) { - GST_ERROR_OBJECT (self, "Couldn't render to backbuffer using converter"); - return GST_FLOW_ERROR; - } else { - GST_TRACE_OBJECT (self, "Rendered using converter"); - } + g_assert_not_reached (); + ret = GST_FLOW_ERROR; + goto unmap_and_out; } - gst_d3d11_overlay_compositor_upload (self->compositor, self->cached_buffer); - gst_d3d11_overlay_compositor_draw_unlocked (self->compositor, &self->rtv); + if (!convert_ret) { + ret = GST_FLOW_ERROR; + goto unmap_and_out; + } + + gst_d3d11_overlay_compositor_upload (self->compositor, buffer); + gst_d3d11_overlay_compositor_draw_unlocked (self->compositor, &rtv); -#if (DXGI_HEADER_VERSION >= 5) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 5) if (self->allow_tearing && self->fullscreen) { present_flags |= DXGI_PRESENT_ALLOW_TEARING; } #endif - ret = klass->present (self, present_flags); + if (klass->present) + ret = klass->present (self, present_flags); self->first_present = FALSE; + + unmap_and_out: + gst_d3d11_buffer_unmap (buffer, infos); } return ret; } GstFlowReturn -gst_d3d11_window_render (GstD3D11Window * window, GstBuffer * buffer, - GstVideoRectangle * rect) +gst_d3d11_window_render (GstD3D11Window * window, GstBuffer * buffer) { GstMemory *mem; GstFlowReturn ret; g_return_val_if_fail (GST_IS_D3D11_WINDOW (window), GST_FLOW_ERROR); - g_return_val_if_fail (rect != NULL, GST_FLOW_ERROR); + + if (buffer) { + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + GST_ERROR_OBJECT (window, "Invalid buffer"); + + return GST_FLOW_ERROR; + } + } + + gst_d3d11_device_lock (window->device); + if (buffer) + gst_buffer_replace (&window->cached_buffer, buffer); + + ret = gst_d3d111_window_present (window, window->cached_buffer, + window->pov, window->rtv); + gst_d3d11_device_unlock (window->device); + + return ret; +} + +GstFlowReturn 
+gst_d3d11_window_render_on_shared_handle (GstD3D11Window * window, + GstBuffer * buffer, HANDLE shared_handle, guint texture_misc_flags, + guint64 acquire_key, guint64 release_key) +{ + GstD3D11WindowClass *klass; + GstMemory *mem; + GstFlowReturn ret = GST_FLOW_OK; + GstD3D11WindowSharedHandleData data = { NULL, }; + ID3D11VideoProcessorOutputView *pov = NULL; + ID3D11RenderTargetView *rtv = NULL; + + g_return_val_if_fail (GST_IS_D3D11_WINDOW (window), GST_FLOW_ERROR); + + klass = GST_D3D11_WINDOW_GET_CLASS (window); + + g_assert (klass->open_shared_handle != NULL); + g_assert (klass->release_shared_handle != NULL); mem = gst_buffer_peek_memory (buffer, 0); if (!gst_is_d3d11_memory (mem)) { @@ -971,8 +1048,29 @@ return GST_FLOW_ERROR; } + data.shared_handle = shared_handle; + data.texture_misc_flags = texture_misc_flags; + data.acquire_key = acquire_key; + data.release_key = release_key; + gst_d3d11_device_lock (window->device); - ret = gst_d3d111_window_present (window, buffer); + if (!klass->open_shared_handle (window, &data)) { + GST_ERROR_OBJECT (window, "Couldn't open shared handle"); + gst_d3d11_device_unlock (window->device); + return GST_FLOW_OK; + } + + if (data.fallback_rtv) { + rtv = data.fallback_rtv; + pov = data.fallback_pov; + } else { + rtv = data.rtv; + pov = data.pov; + } + + ret = gst_d3d111_window_present (window, buffer, pov, rtv); + + klass->release_shared_handle (window, &data); gst_d3d11_device_unlock (window->device); return ret; @@ -1036,11 +1134,14 @@ #if (!GST_D3D11_WINAPI_ONLY_APP) if (IsWindow ((HWND) handle)) return GST_D3D11_WINDOW_NATIVE_TYPE_HWND; -#else +#endif +#if GST_D3D11_WINAPI_ONLY_APP { + /* *INDENT-OFF* */ ComPtr<IInspectable> window = reinterpret_cast<IInspectable*> (handle); ComPtr<ABI::Windows::UI::Core::ICoreWindow> core_window; ComPtr<ABI::Windows::UI::Xaml::Controls::ISwapChainPanel> panel; + /* *INDENT-ON* */ if (SUCCEEDED (window.As (&core_window))) return GST_D3D11_WINDOW_NATIVE_TYPE_CORE_WINDOW; @@ -1070,4 
+1171,4 @@ } return "none"; -} \ No newline at end of file +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window.h
Changed
@@ -24,10 +24,11 @@ #include <gst/gst.h> #include <gst/video/video.h> -#include "gstd3d11_fwd.h" -#include "gstd3d11colorconverter.h" +#include <gst/d3d11/gstd3d11.h> +#include "gstd3d11converter.h" #include "gstd3d11overlaycompositor.h" #include "gstd3d11videoprocessor.h" +#include "gstd3d11pluginutils.h" G_BEGIN_DECLS @@ -61,6 +62,22 @@ GST_D3D11_WINDOW_NATIVE_TYPE_SWAP_CHAIN_PANEL, } GstD3D11WindowNativeType; +typedef struct +{ + HANDLE shared_handle; + guint texture_misc_flags; + guint64 acquire_key; + guint64 release_key; + + ID3D11Texture2D *texture; + IDXGIKeyedMutex *keyed_mutex; + ID3D11VideoProcessorOutputView *pov; + ID3D11RenderTargetView *rtv; + + ID3D11VideoProcessorOutputView *fallback_pov; + ID3D11RenderTargetView *fallback_rtv; +} GstD3D11WindowSharedHandleData; + struct _GstD3D11Window { GstObject parent; @@ -76,13 +93,16 @@ GstD3D11WindowFullscreenToggleMode fullscreen_toggle_mode; gboolean requested_fullscreen; gboolean fullscreen; + gboolean render_stats; GstVideoInfo info; GstVideoInfo render_info; GstD3D11VideoProcessor *processor; - GstD3D11ColorConverter *converter; + GstD3D11Converter *converter; GstD3D11OverlayCompositor *compositor; + gboolean processor_in_use; + /* calculated rect with aspect ratio and window area */ RECT render_rect; @@ -133,7 +153,26 @@ guint width, guint height); + gboolean (*prepare) (GstD3D11Window * window, + guint display_width, + guint display_height, + GstCaps * caps, + gboolean * video_processor_available, + GError ** error); + void (*unprepare) (GstD3D11Window * window); + + gboolean (*open_shared_handle) (GstD3D11Window * window, + GstD3D11WindowSharedHandleData * data); + + gboolean (*release_shared_handle) (GstD3D11Window * window, + GstD3D11WindowSharedHandleData * data); + + void (*set_render_rectangle) (GstD3D11Window * window, + const GstVideoRectangle * rect); + + void (*set_title) (GstD3D11Window * window, + const gchar *title); }; GType gst_d3d11_window_get_type (void); @@ -141,8 +180,10 @@ void 
gst_d3d11_window_show (GstD3D11Window * window); void gst_d3d11_window_set_render_rectangle (GstD3D11Window * window, - gint x, gint y, - gint width, gint height); + const GstVideoRectangle * rect); + +void gst_d3d11_window_set_title (GstD3D11Window * window, + const gchar *title); gboolean gst_d3d11_window_prepare (GstD3D11Window * window, guint display_width, @@ -152,8 +193,14 @@ GError ** error); GstFlowReturn gst_d3d11_window_render (GstD3D11Window * window, - GstBuffer * buffer, - GstVideoRectangle * src_rect); + GstBuffer * buffer); + +GstFlowReturn gst_d3d11_window_render_on_shared_handle (GstD3D11Window * window, + GstBuffer * buffer, + HANDLE shared_handle, + guint texture_misc_flags, + guint64 acquire_key, + guint64 release_key); gboolean gst_d3d11_window_unlock (GstD3D11Window * window);
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window_corewindow.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_corewindow.cpp
Changed
@@ -22,10 +22,6 @@ #include "config.h" #endif -#include "gstd3d11window.h" -#include "gstd3d11device.h" -#include "gstd3d11memory.h" -#include "gstd3d11utils.h" #include "gstd3d11window_corewindow.h" /* workaround for GetCurrentTime collision */ @@ -39,6 +35,7 @@ #include <wrl/wrappers/corewrappers.h> #include <windows.graphics.display.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; using namespace Microsoft::WRL::Wrappers; using namespace ABI::Windows::UI; @@ -50,10 +47,8 @@ __FITypedEventHandler_2_Windows__CUI__CCore__CCoreWindow_Windows__CUI__CCore__CWindowSizeChangedEventArgs_t IWindowSizeChangedEventHandler; -extern "C" { GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_window_debug); #define GST_CAT_DEFAULT gst_d3d11_window_debug -} /* timeout to wait busy UI thread */ #define DEFAULT_ASYNC_TIMEOUT (10 * 1000) @@ -65,6 +60,7 @@ HANDLE cancellable; EventRegistrationToken event_token; } CoreWindowWinRTStorage; +/* *INDENT-ON* */ struct _GstD3D11WindowCoreWindow { @@ -79,18 +75,16 @@ static void gst_d3d11_window_core_window_constructed (GObject * object); static void gst_d3d11_window_core_window_dispose (GObject * object); -static void gst_d3d11_window_core_window_update_swap_chain (GstD3D11Window * window); -static void -gst_d3d11_window_core_window_change_fullscreen_mode (GstD3D11Window * window); -static gboolean -gst_d3d11_window_core_window_create_swap_chain (GstD3D11Window * window, - DXGI_FORMAT format, guint width, guint height, guint swapchain_flags, - IDXGISwapChain ** swap_chain); -static GstFlowReturn -gst_d3d11_window_core_window_present (GstD3D11Window * window, - guint present_flags); -static gboolean -gst_d3d11_window_core_window_unlock (GstD3D11Window * window); +static void gst_d3d11_window_core_window_update_swap_chain (GstD3D11Window * + window); +static void gst_d3d11_window_core_window_change_fullscreen_mode (GstD3D11Window + * window); +static gboolean gst_d3d11_window_core_window_create_swap_chain (GstD3D11Window * + window, DXGI_FORMAT 
format, guint width, guint height, + guint swapchain_flags, IDXGISwapChain ** swap_chain); +static GstFlowReturn gst_d3d11_window_core_window_present (GstD3D11Window * + window, guint present_flags); +static gboolean gst_d3d11_window_core_window_unlock (GstD3D11Window * window); static gboolean gst_d3d11_window_core_window_unlock_stop (GstD3D11Window * window); static void @@ -98,18 +92,20 @@ guint width, guint height); static void gst_d3d11_window_core_window_on_resize_sync (GstD3D11Window * window); -static void -gst_d3d11_window_core_window_unprepare (GstD3D11Window * window); +static void gst_d3d11_window_core_window_unprepare (GstD3D11Window * window); static float get_logical_dpi (void) { + /* *INDENT-OFF* */ ComPtr<Display::IDisplayPropertiesStatics> properties; + /* *INDENT-ON* */ HRESULT hr; HStringReference str_ref = - HStringReference (RuntimeClass_Windows_Graphics_Display_DisplayProperties); + HStringReference + (RuntimeClass_Windows_Graphics_Display_DisplayProperties); - hr = GetActivationFactory (str_ref.Get(), properties.GetAddressOf()); + hr = GetActivationFactory (str_ref.Get (), properties.GetAddressOf ()); if (gst_d3d11_result (hr, NULL)) { float dpi = 96.0f; @@ -122,12 +118,14 @@ return 96.0f; } -static inline float dip_to_pixel (float dip) +static inline float +dip_to_pixel (float dip) { /* https://docs.microsoft.com/en-us/windows/win32/learnwin32/dpi-and-device-independent-pixels */ - return dip * get_logical_dpi() / 96.0f; + return dip * get_logical_dpi () / 96.0f; } +/* *INDENT-OFF* */ class CoreResizeHandler : public RuntimeClass<RuntimeClassFlags<ClassicCom>, IWindowSizeChangedEventHandler> @@ -237,6 +235,7 @@ return hr; }); } +/* *INDENT-ON* */ static void gst_d3d11_window_core_window_class_init (GstD3D11WindowCoreWindowClass * klass) @@ -272,6 +271,7 @@ self->storage->cancellable = CreateEvent (NULL, TRUE, FALSE, NULL); } +/* *INDENT-OFF* */ static void gst_d3d11_window_core_window_constructed (GObject * object) { @@ -281,8 +281,8 @@ 
HRESULT hr; ComPtr<IInspectable> inspectable; ComPtr<IWindowSizeChangedEventHandler> resize_handler; - Size size; ComPtr<Core::ICoreWindow> core_window; + Size size; if (!window->external_handle) { GST_ERROR_OBJECT (self, "No external window handle"); @@ -333,6 +333,7 @@ GST_ERROR_OBJECT (self, "Invalid window handle"); return; } +/* *INDENT-ON* */ static void gst_d3d11_window_core_window_dispose (GObject * object) @@ -342,6 +343,7 @@ G_OBJECT_CLASS (parent_class)->dispose (object); } +/* *INDENT-OFF* */ static void gst_d3d11_window_core_window_unprepare (GstD3D11Window * window) { @@ -369,6 +371,38 @@ self->storage = NULL; } +/* *INDENT-ON* */ + +static IDXGISwapChain1 * +create_swap_chain_for_core_window (GstD3D11WindowCoreWindow * self, + GstD3D11Device * device, guintptr core_window, DXGI_SWAP_CHAIN_DESC1 * desc, + IDXGIOutput * output) +{ + HRESULT hr; + IDXGISwapChain1 *swap_chain = NULL; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + IDXGIFactory1 *factory = gst_d3d11_device_get_dxgi_factory_handle (device); + ComPtr < IDXGIFactory2 > factory2; + + hr = factory->QueryInterface (IID_PPV_ARGS (&factory2)); + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "IDXGIFactory2 interface is unavailable"); + return NULL; + } + + gst_d3d11_device_lock (device); + hr = factory2->CreateSwapChainForCoreWindow (device_handle, + (IUnknown *) core_window, desc, output, &swap_chain); + gst_d3d11_device_unlock (device); + + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "Cannot create SwapChain Object: 0x%x", + (guint) hr); + swap_chain = NULL; + } + + return swap_chain; +} static gboolean gst_d3d11_window_core_window_create_swap_chain (GstD3D11Window * window, @@ -376,8 +410,9 @@ IDXGISwapChain ** swap_chain) { GstD3D11WindowCoreWindow *self = GST_D3D11_WINDOW_CORE_WINDOW (window); - CoreWindowWinRTStorage *storage = self->storage; + /* *INDENT-OFF* */ ComPtr<IDXGISwapChain1> new_swapchain; + /* *INDENT-ON* 
*/ GstD3D11Device *device = window->device; DXGI_SWAP_CHAIN_DESC1 desc1 = { 0, }; @@ -395,8 +430,8 @@ desc1.Flags = swapchain_flags; new_swapchain = - gst_d3d11_device_create_swap_chain_for_core_window (device, - window->external_handle, &desc1, NULL); + create_swap_chain_for_core_window (self, device, + window->external_handle, &desc1, NULL); if (!new_swapchain) { GST_ERROR_OBJECT (self, "Cannot create swapchain"); @@ -458,8 +493,6 @@ static void gst_d3d11_window_core_window_update_swap_chain (GstD3D11Window * window) { - GstD3D11WindowCoreWindow *self = GST_D3D11_WINDOW_CORE_WINDOW (window); - gst_d3d11_window_core_window_on_resize (window, window->surface_width, window->surface_height); @@ -481,11 +514,13 @@ GstD3D11WindowCoreWindow *self = GST_D3D11_WINDOW_CORE_WINDOW (window); CoreWindowWinRTStorage *storage = self->storage; + /* *INDENT-OFF* */ run_async (storage->dispatcher, storage->cancellable, INFINITE, [window] { gst_d3d11_window_core_window_on_resize_sync (window); return S_OK; }); + /* *INDENT-ON* */ } static void @@ -518,4 +553,4 @@ g_object_ref_sink (window); return window; -} \ No newline at end of file +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window_corewindow.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_corewindow.h
Changed
@@ -23,7 +23,6 @@ #include <gst/gst.h> #include <gst/video/video.h> -#include "gstd3d11_fwd.h" #include "gstd3d11window.h" G_BEGIN_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_dummy.cpp
Added
@@ -0,0 +1,537 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstd3d11window_dummy.h" +#include "gstd3d11pluginutils.h" +#include <wrl.h> + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_window_debug); +#define GST_CAT_DEFAULT gst_d3d11_window_debug + +struct _GstD3D11WindowDummy +{ + GstD3D11Window parent; + + ID3D11Texture2D *fallback_texture; + ID3D11VideoProcessorOutputView *fallback_pov; + ID3D11RenderTargetView *fallback_rtv; +}; + +#define gst_d3d11_window_dummy_parent_class parent_class +G_DEFINE_TYPE (GstD3D11WindowDummy, gst_d3d11_window_dummy, + GST_TYPE_D3D11_WINDOW); + +static void gst_d3d11_window_dummy_on_resize (GstD3D11Window * window, + guint width, guint height); +static gboolean gst_d3d11_window_dummy_prepare (GstD3D11Window * window, + guint display_width, guint display_height, GstCaps * caps, + gboolean * video_processor_available, GError ** error); +static void gst_d3d11_window_dummy_unprepare (GstD3D11Window * window); +static gboolean +gst_d3d11_window_dummy_open_shared_handle (GstD3D11Window * window, + 
GstD3D11WindowSharedHandleData * data); +static gboolean +gst_d3d11_window_dummy_release_shared_handle (GstD3D11Window * window, + GstD3D11WindowSharedHandleData * data); + +static void +gst_d3d11_window_dummy_class_init (GstD3D11WindowDummyClass * klass) +{ + GstD3D11WindowClass *window_class = GST_D3D11_WINDOW_CLASS (klass); + + window_class->on_resize = + GST_DEBUG_FUNCPTR (gst_d3d11_window_dummy_on_resize); + window_class->prepare = GST_DEBUG_FUNCPTR (gst_d3d11_window_dummy_prepare); + window_class->unprepare = + GST_DEBUG_FUNCPTR (gst_d3d11_window_dummy_unprepare); + window_class->open_shared_handle = + GST_DEBUG_FUNCPTR (gst_d3d11_window_dummy_open_shared_handle); + window_class->release_shared_handle = + GST_DEBUG_FUNCPTR (gst_d3d11_window_dummy_release_shared_handle); +} + +static void +gst_d3d11_window_dummy_init (GstD3D11WindowDummy * self) +{ +} + +static gboolean +gst_d3d11_window_dummy_prepare (GstD3D11Window * window, + guint display_width, guint display_height, GstCaps * caps, + gboolean * video_processor_available, GError ** error) +{ + g_clear_pointer (&window->processor, gst_d3d11_video_processor_free); + g_clear_pointer (&window->converter, gst_d3d11_converter_free); + g_clear_pointer (&window->compositor, gst_d3d11_overlay_compositor_free); + + /* We are supporting only RGBA, BGRA or RGB10A2_LE formats but we don't know + * which format texture will be used at this moment */ + + gst_video_info_from_caps (&window->info, caps); + window->render_rect.left = 0; + window->render_rect.top = 0; + window->render_rect.right = display_width; + window->render_rect.bottom = display_height; + + window->input_rect.left = 0; + window->input_rect.top = 0; + window->input_rect.right = GST_VIDEO_INFO_WIDTH (&window->info); + window->input_rect.bottom = GST_VIDEO_INFO_HEIGHT (&window->info); + + gst_video_info_set_format (&window->render_info, + GST_VIDEO_FORMAT_BGRA, display_width, display_height); + + /* TODO: not sure which colorspace should be used, let's use 
BT709 since + * it's default and most common one */ + window->render_info.colorimetry.primaries = GST_VIDEO_COLOR_PRIMARIES_BT709; + window->render_info.colorimetry.transfer = GST_VIDEO_TRANSFER_BT709; + window->render_info.colorimetry.range = GST_VIDEO_COLOR_RANGE_0_255; + + gst_d3d11_device_lock (window->device); + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 4) + { + const GstDxgiColorSpace *in_color_space = + gst_d3d11_video_info_to_dxgi_color_space (&window->info); + const GstD3D11Format *in_format = + gst_d3d11_device_format_from_gst (window->device, + GST_VIDEO_INFO_FORMAT (&window->info)); + gboolean hardware = FALSE; + GstD3D11VideoProcessor *processor = NULL; + guint i; + DXGI_FORMAT formats_to_check[] = { + DXGI_FORMAT_R8G8B8A8_UNORM, + DXGI_FORMAT_B8G8R8A8_UNORM, + DXGI_FORMAT_R10G10B10A2_UNORM + }; + + if (in_color_space && in_format && + in_format->dxgi_format != DXGI_FORMAT_UNKNOWN) { + g_object_get (window->device, "hardware", &hardware, NULL); + } + + if (hardware) { + processor = + gst_d3d11_video_processor_new (window->device, + GST_VIDEO_INFO_WIDTH (&window->info), + GST_VIDEO_INFO_HEIGHT (&window->info), display_width, display_height); + } + + /* Check if video processor can support all possible output dxgi formats */ + for (i = 0; i < G_N_ELEMENTS (formats_to_check) && processor; i++) { + DXGI_FORMAT in_dxgi_format = in_format->dxgi_format; + DXGI_FORMAT out_dxgi_format = formats_to_check[i]; + DXGI_COLOR_SPACE_TYPE in_dxgi_color_space = + (DXGI_COLOR_SPACE_TYPE) in_color_space->dxgi_color_space_type; + + if (!gst_d3d11_video_processor_check_format_conversion (processor, + in_dxgi_format, in_dxgi_color_space, out_dxgi_format, + DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709)) { + GST_DEBUG_OBJECT (window, "Conversion is not supported by device"); + g_clear_pointer (&processor, gst_d3d11_video_processor_free); + break; + } + } + + if (processor) { + gst_d3d11_video_processor_set_input_dxgi_color_space (processor, + (DXGI_COLOR_SPACE_TYPE) 
in_color_space->dxgi_color_space_type); + gst_d3d11_video_processor_set_output_dxgi_color_space (processor, + DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709); + } + + window->processor = processor; + } +#endif + *video_processor_available = !!window->processor; + + window->converter = + gst_d3d11_converter_new (window->device, &window->info, + &window->render_info, nullptr); + + if (!window->converter) { + GST_ERROR_OBJECT (window, "Cannot create converter"); + g_set_error (error, GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_FAILED, + "Cannot create converter"); + goto error; + } + + window->compositor = + gst_d3d11_overlay_compositor_new (window->device, &window->render_info); + if (!window->compositor) { + GST_ERROR_OBJECT (window, "Cannot create overlay compositor"); + g_set_error (error, GST_RESOURCE_ERROR, GST_RESOURCE_ERROR_FAILED, + "Cannot create overlay compositor"); + goto error; + } + + gst_d3d11_device_unlock (window->device); + + return TRUE; + +error: + gst_d3d11_device_unlock (window->device); + + return FALSE; +} + +static void +gst_d3d11_window_dummy_clear_resources (GstD3D11WindowDummy * self) +{ + GST_D3D11_CLEAR_COM (self->fallback_pov); + GST_D3D11_CLEAR_COM (self->fallback_rtv); + GST_D3D11_CLEAR_COM (self->fallback_texture); +} + +static void +gst_d3d11_window_dummy_unprepare (GstD3D11Window * window) +{ + GstD3D11WindowDummy *self = GST_D3D11_WINDOW_DUMMY (window); + + gst_d3d11_window_dummy_clear_resources (self); +} + +static void +gst_d3d11_window_dummy_on_resize (GstD3D11Window * window, + guint width, guint height) +{ + GstVideoRectangle src_rect, dst_rect, rst_rect; + + dst_rect.x = 0; + dst_rect.y = 0; + dst_rect.w = width; + dst_rect.h = height; + + if (window->force_aspect_ratio) { + src_rect.x = 0; + src_rect.y = 0; + src_rect.w = GST_VIDEO_INFO_WIDTH (&window->render_info); + src_rect.h = GST_VIDEO_INFO_HEIGHT (&window->render_info); + + gst_video_sink_center_rect (src_rect, dst_rect, &rst_rect, TRUE); + } else { + rst_rect = dst_rect; + } + + 
window->render_rect.left = rst_rect.x; + window->render_rect.top = rst_rect.y; + window->render_rect.right = rst_rect.x + rst_rect.w; + window->render_rect.bottom = rst_rect.y + rst_rect.h; + + window->first_present = TRUE; +} + +static gboolean +gst_d3d11_window_dummy_setup_fallback_texture (GstD3D11Window * window, + D3D11_TEXTURE2D_DESC * shared_desc) +{ + GstD3D11WindowDummy *self = GST_D3D11_WINDOW_DUMMY (window); + D3D11_TEXTURE2D_DESC desc = { 0, }; + D3D11_RENDER_TARGET_VIEW_DESC rtv_desc; + ID3D11Device *device_handle = + gst_d3d11_device_get_device_handle (window->device); + gboolean need_new_texture = FALSE; + HRESULT hr; + + if (!self->fallback_texture) { + GST_DEBUG_OBJECT (self, + "We have no configured fallback texture, create new one"); + need_new_texture = TRUE; + } else { + self->fallback_texture->GetDesc (&desc); + if (shared_desc->Format != desc.Format) { + GST_DEBUG_OBJECT (self, "Texture formats are different, create new one"); + need_new_texture = TRUE; + } else if (shared_desc->Width > desc.Width || + shared_desc->Height > desc.Height) { + GST_DEBUG_OBJECT (self, "Needs larger size of fallback texture"); + need_new_texture = TRUE; + } + } + + if (!need_new_texture) + return TRUE; + + gst_d3d11_window_dummy_clear_resources (self); + + desc.Width = shared_desc->Width; + desc.Height = shared_desc->Height; + desc.MipLevels = 1; + desc.ArraySize = 1; + desc.Format = shared_desc->Format; + desc.SampleDesc.Count = 1; + desc.SampleDesc.Quality = 0; + desc.Usage = D3D11_USAGE_DEFAULT; + desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET; + + hr = device_handle->CreateTexture2D (&desc, NULL, &self->fallback_texture); + if (!gst_d3d11_result (hr, window->device)) { + GST_ERROR_OBJECT (self, "Couldn't create fallback texture"); + return FALSE; + } + + rtv_desc.Format = DXGI_FORMAT_UNKNOWN; + rtv_desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; + rtv_desc.Texture2D.MipSlice = 0; + + hr = device_handle->CreateRenderTargetView 
(self->fallback_texture, &rtv_desc, + &self->fallback_rtv); + if (!gst_d3d11_result (hr, window->device)) { + GST_ERROR_OBJECT (self, + "Couldn't get render target view from fallback texture"); + gst_d3d11_window_dummy_clear_resources (self); + return FALSE; + } + + if (window->processor) { + D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC pov_desc; + + pov_desc.ViewDimension = D3D11_VPOV_DIMENSION_TEXTURE2D; + pov_desc.Texture2D.MipSlice = 0; + + if (!gst_d3d11_video_processor_create_output_view (window->processor, + &pov_desc, self->fallback_texture, &self->fallback_pov)) { + GST_ERROR_OBJECT (window, + "ID3D11VideoProcessorOutputView is unavailable"); + gst_d3d11_window_dummy_clear_resources (self); + return FALSE; + } + } + + return TRUE; +} + +/* *INDENT-OFF* */ +static gboolean +gst_d3d11_window_dummy_open_shared_handle (GstD3D11Window * window, + GstD3D11WindowSharedHandleData * data) +{ + GstD3D11WindowDummy *self = GST_D3D11_WINDOW_DUMMY (window); + GstD3D11Device *device = window->device; + ID3D11Device *device_handle; + HRESULT hr; + ID3D11Texture2D *texture = NULL; + IDXGIKeyedMutex *keyed_mutex = NULL; + ID3D11VideoProcessorOutputView *pov = NULL; + ID3D11RenderTargetView *rtv = NULL; + D3D11_TEXTURE2D_DESC desc; + gboolean use_keyed_mutex = FALSE; + gboolean need_fallback_texture = FALSE; + + device_handle = gst_d3d11_device_get_device_handle (device); + + if ((data->texture_misc_flags & D3D11_RESOURCE_MISC_SHARED_NTHANDLE) == + D3D11_RESOURCE_MISC_SHARED_NTHANDLE) { + ComPtr<ID3D11Device1> device1_handle; + + hr = device_handle->QueryInterface (IID_PPV_ARGS (&device1_handle)); + if (!gst_d3d11_result (hr, device)) + return FALSE; + + hr = device1_handle->OpenSharedResource1 (data->shared_handle, + IID_PPV_ARGS (&texture)); + } else { + hr = device_handle->OpenSharedResource (data->shared_handle, + IID_PPV_ARGS (&texture)); + } + + if (!gst_d3d11_result (hr, device)) + return FALSE; + + texture->GetDesc (&desc); + use_keyed_mutex = (desc.MiscFlags & 
D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX) == + D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX; + + if (use_keyed_mutex) { + hr = texture->QueryInterface (IID_PPV_ARGS (&keyed_mutex)); + if (!gst_d3d11_result (hr, device)) + goto out; + } + + if (window->processor) { + if (use_keyed_mutex) { + D3D11_VIDEO_PROCESSOR_OUTPUT_VIEW_DESC pov_desc; + + pov_desc.ViewDimension = D3D11_VPOV_DIMENSION_TEXTURE2D; + pov_desc.Texture2D.MipSlice = 0; + + if (!gst_d3d11_video_processor_create_output_view (window->processor, + &pov_desc, texture, &pov)) { + GST_WARNING_OBJECT (window, + "ID3D11VideoProcessorOutputView is unavailable"); + } + } else { + /* HACK: If the external texture was created without a keyed mutex + * and we need to use the video processor to convert the decoder output texture + * to the external texture, the texture converted by the video processor seems to be broken. + * Probably that's because of missing flush/sync API around the video processor + * (e.g., ID3D11VideoContext and ID3D11VideoProcessor have no + * flushing api such as ID3D11DeviceContext::Flush).
+ * To workaround the case, we need to use fallback texture and copy back + * to external texture + */ + + need_fallback_texture = TRUE; + + GST_TRACE_OBJECT (window, + "We are using video processor but keyed mutex is unavailable"); + if (!gst_d3d11_window_dummy_setup_fallback_texture (window, &desc)) { + goto out; + } + } + } + + hr = device_handle->CreateRenderTargetView ((ID3D11Resource *) texture, + NULL, &rtv); + if (!gst_d3d11_result (hr, device)) + goto out; + + if (keyed_mutex) { + hr = keyed_mutex->AcquireSync(data->acquire_key, INFINITE); + if (!gst_d3d11_result (hr, device)) + goto out; + } + + /* Everything is prepared now */ + gst_d3d11_window_dummy_on_resize (window, desc.Width, desc.Height); + + /* Move owned resources */ + data->texture = texture; + data->keyed_mutex = keyed_mutex; + data->pov = pov; + data->rtv = rtv; + + if (need_fallback_texture) { + data->fallback_pov = self->fallback_pov; + data->fallback_rtv = self->fallback_rtv; + } else { + data->fallback_pov = nullptr; + data->fallback_rtv = nullptr; + } + + return TRUE; + +out: + GST_D3D11_CLEAR_COM (texture); + GST_D3D11_CLEAR_COM (keyed_mutex); + GST_D3D11_CLEAR_COM (pov); + GST_D3D11_CLEAR_COM (rtv); + + return FALSE; +} +/* *INDENT-ON* */ + +static gboolean +gst_d3d11_window_dummy_release_shared_handle (GstD3D11Window * window, + GstD3D11WindowSharedHandleData * data) +{ + GstD3D11WindowDummy *self = GST_D3D11_WINDOW_DUMMY (window); + GstD3D11Device *device = window->device; + HRESULT hr; + + /* TODO: cache owned resource for the later reuse? 
*/ + if (data->keyed_mutex) { + hr = data->keyed_mutex->ReleaseSync (data->release_key); + gst_d3d11_result (hr, device); + + data->keyed_mutex->Release (); + } else { + /* *INDENT-OFF* */ + ComPtr<ID3D11Query> query; + /* *INDENT-ON* */ + D3D11_QUERY_DESC query_desc; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + ID3D11DeviceContext *context_handle = + gst_d3d11_device_get_device_context_handle (device); + BOOL sync_done = FALSE; + + /* If keyed mutex is not used, let's handle sync manually by using + * ID3D11Query. Issued GPU commands might not be finished yet */ + query_desc.Query = D3D11_QUERY_EVENT; + query_desc.MiscFlags = 0; + + hr = device_handle->CreateQuery (&query_desc, &query); + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Couldn't Create event query"); + return FALSE; + } + + /* Copy from fallback texture to user's texture */ + if (data->fallback_rtv) { + D3D11_BOX src_box; + D3D11_TEXTURE2D_DESC desc; + ID3D11DeviceContext *context_handle = + gst_d3d11_device_get_device_context_handle (device); + + data->texture->GetDesc (&desc); + + src_box.left = 0; + src_box.top = 0; + src_box.front = 0; + src_box.back = 1; + src_box.right = desc.Width; + src_box.bottom = desc.Height; + + context_handle->CopySubresourceRegion (data->texture, 0, 0, 0, 0, + self->fallback_texture, 0, &src_box); + } + context_handle->End (query.Get ()); + + /* Wait until all issued GPU commands are finished */ + do { + context_handle->GetData (query.Get (), &sync_done, sizeof (BOOL), 0); + } while (!sync_done && (hr == S_OK || hr == S_FALSE)); + + if (!gst_d3d11_result (hr, device)) { + GST_ERROR_OBJECT (self, "Couldn't sync GPU operation"); + return FALSE; + } + } + + GST_D3D11_CLEAR_COM (data->rtv); + GST_D3D11_CLEAR_COM (data->pov); + GST_D3D11_CLEAR_COM (data->texture); + + return TRUE; +} + +GstD3D11Window * +gst_d3d11_window_dummy_new (GstD3D11Device * device) +{ + GstD3D11Window *window; + + g_return_val_if_fail 
(GST_IS_D3D11_DEVICE (device), NULL); + + window = (GstD3D11Window *) + g_object_new (GST_TYPE_D3D11_WINDOW_DUMMY, "d3d11device", device, NULL); + + window->initialized = TRUE; + g_object_ref_sink (window); + + return window; +}
gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_dummy.h
Added
@@ -0,0 +1,38 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_D3D11_WINDOW_DUMMY_H__ +#define __GST_D3D11_WINDOW_DUMMY_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include "gstd3d11window.h" + +G_BEGIN_DECLS + +#define GST_TYPE_D3D11_WINDOW_DUMMY (gst_d3d11_window_dummy_get_type()) +G_DECLARE_FINAL_TYPE (GstD3D11WindowDummy, + gst_d3d11_window_dummy, GST, D3D11_WINDOW_DUMMY, GstD3D11Window); + +GstD3D11Window * gst_d3d11_window_dummy_new (GstD3D11Device * device); + +G_END_DECLS + +#endif /* __GST_D3D11_WINDOW_DUMMY_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window_swapchainpanel.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_swapchainpanel.cpp
Changed
@@ -22,10 +22,6 @@ #include "config.h" #endif -#include "gstd3d11window.h" -#include "gstd3d11device.h" -#include "gstd3d11memory.h" -#include "gstd3d11utils.h" #include "gstd3d11window_swapchainpanel.h" /* workaround for GetCurrentTime collision */ @@ -38,15 +34,15 @@ #include <wrl.h> #include <wrl/wrappers/corewrappers.h> +/* *INDENT-OFF* */ + using namespace Microsoft::WRL; using namespace Microsoft::WRL::Wrappers; using namespace ABI::Windows::UI; using namespace ABI::Windows::Foundation; -extern "C" { GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_window_debug); #define GST_CAT_DEFAULT gst_d3d11_window_debug -} /* timeout to wait busy UI thread */ #define DEFAULT_ASYNC_TIMEOUT (10 * 1000) @@ -59,6 +55,7 @@ HANDLE cancellable; EventRegistrationToken event_token; } SwapChainPanelWinRTStorage; +/* *INDENT-ON* */ struct _GstD3D11WindowSwapChainPanel { @@ -94,11 +91,11 @@ gst_d3d11_window_swap_chain_panel_on_resize (GstD3D11Window * window, guint width, guint height); static void -gst_d3d11_window_swap_chain_panel_on_resize_sync (GstD3D11Window * - window); +gst_d3d11_window_swap_chain_panel_on_resize_sync (GstD3D11Window * window); static void gst_d3d11_window_swap_chain_panel_unprepare (GstD3D11Window * window); +/* *INDENT-OFF* */ class PanelResizeHandler : public RuntimeClass<RuntimeClassFlags<ClassicCom>, Xaml::ISizeChangedEventHandler> @@ -202,6 +199,7 @@ return hr; } +/* *INDENT-ON* */ static void gst_d3d11_window_swap_chain_panel_class_init (GstD3D11WindowSwapChainPanelClass @@ -216,7 +214,8 @@ window_class->update_swap_chain = GST_DEBUG_FUNCPTR (gst_d3d11_window_swap_chain_panel_update_swap_chain); window_class->change_fullscreen_mode = - GST_DEBUG_FUNCPTR (gst_d3d11_window_swap_chain_panel_change_fullscreen_mode); + GST_DEBUG_FUNCPTR + (gst_d3d11_window_swap_chain_panel_change_fullscreen_mode); window_class->create_swap_chain = GST_DEBUG_FUNCPTR (gst_d3d11_window_swap_chain_panel_create_swap_chain); window_class->present = @@ -238,6 +237,7 @@ 
self->storage->cancellable = CreateEvent (NULL, TRUE, FALSE, NULL); } +/* *INDENT-OFF* */ static void gst_d3d11_window_swap_chain_panel_constructed (GObject * object) { @@ -305,6 +305,7 @@ GST_ERROR_OBJECT (self, "Invalid window handle"); return; } +/* *INDENT-ON* */ static void gst_d3d11_window_swap_chain_panel_dispose (GObject * object) @@ -314,6 +315,7 @@ G_OBJECT_CLASS (parent_class)->dispose (object); } +/* *INDENT-OFF* */ static void gst_d3d11_window_swap_chain_panel_unprepare (GstD3D11Window * window) { @@ -343,7 +345,39 @@ self->storage = NULL; } +/* *INDENT-ON* */ + +static IDXGISwapChain1 * +create_swap_chain_for_composition (GstD3D11WindowSwapChainPanel * self, + GstD3D11Device * device, DXGI_SWAP_CHAIN_DESC1 * desc, IDXGIOutput * output) +{ + HRESULT hr; + IDXGISwapChain1 *swap_chain = NULL; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + IDXGIFactory1 *factory = gst_d3d11_device_get_dxgi_factory_handle (device); + ComPtr < IDXGIFactory2 > factory2; + + hr = factory->QueryInterface (IID_PPV_ARGS (&factory2)); + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "IDXGIFactory2 interface is unavailable"); + return NULL; + } + + gst_d3d11_device_lock (device); + hr = factory2->CreateSwapChainForComposition (device_handle, + desc, output, &swap_chain); + gst_d3d11_device_unlock (device); + + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "Cannot create SwapChain Object: 0x%x", + (guint) hr); + swap_chain = NULL; + } + + return swap_chain; +} +/* *INDENT-OFF* */ static gboolean gst_d3d11_window_swap_chain_panel_create_swap_chain (GstD3D11Window * window, DXGI_FORMAT format, guint width, guint height, guint swapchain_flags, @@ -372,7 +406,7 @@ desc1.Flags = swapchain_flags; new_swapchain = - gst_d3d11_device_create_swap_chain_for_composition (device, &desc1, NULL); + create_swap_chain_for_composition (self, device, &desc1, NULL); if (!new_swapchain) { GST_ERROR_OBJECT (self, "Cannot create 
swapchain"); @@ -396,6 +430,7 @@ return TRUE; } +/* *INDENT-ON* */ static GstFlowReturn gst_d3d11_window_swap_chain_panel_present (GstD3D11Window * window, @@ -450,9 +485,6 @@ static void gst_d3d11_window_swap_chain_panel_update_swap_chain (GstD3D11Window * window) { - GstD3D11WindowSwapChainPanel *self = - GST_D3D11_WINDOW_SWAP_CHAIN_PANEL (window); - gst_d3d11_window_swap_chain_panel_on_resize (window, window->surface_width, window->surface_height); @@ -476,11 +508,11 @@ GST_D3D11_WINDOW_SWAP_CHAIN_PANEL (window); SwapChainPanelWinRTStorage *storage = self->storage; - run_async (storage->dispatcher, storage->cancellable, INFINITE, - [window] { + run_async (storage->dispatcher, storage->cancellable, INFINITE,[window] { gst_d3d11_window_swap_chain_panel_on_resize_sync (window); return S_OK; - }); + } + ); } static void @@ -503,7 +535,7 @@ window = (GstD3D11Window *) g_object_new (GST_TYPE_D3D11_WINDOW_SWAP_CHAIN_PANEL, - "d3d11device", device, "window-handle", handle, NULL); + "d3d11device", device, "window-handle", handle, NULL); if (!window->initialized) { gst_object_unref (window); return NULL; @@ -512,4 +544,4 @@ g_object_ref_sink (window); return window; -} \ No newline at end of file +}
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window_swapchainpanel.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_swapchainpanel.h
Changed
@@ -23,7 +23,6 @@ #include <gst/gst.h> #include <gst/video/video.h> -#include "gstd3d11_fwd.h" #include "gstd3d11window.h" G_BEGIN_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window_win32.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_win32.cpp
Changed
@@ -24,15 +24,15 @@ #include "config.h" #endif -#include "gstd3d11device.h" -#include "gstd3d11memory.h" -#include "gstd3d11utils.h" #include "gstd3d11window_win32.h" +#include <wrl.h> + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ -extern "C" { GST_DEBUG_CATEGORY_EXTERN (gst_d3d11_window_debug); #define GST_CAT_DEFAULT gst_d3d11_window_debug -} G_LOCK_DEFINE_STATIC (create_lock); @@ -41,6 +41,8 @@ #define WM_GST_D3D11_FULLSCREEN (WM_USER + 1) #define WM_GST_D3D11_CONSTRUCT_INTERNAL_WINDOW (WM_USER + 2) +#define WM_GST_D3D11_DESTROY_INTERNAL_WINDOW (WM_USER + 3) +#define WM_GST_D3D11_MOVE_WINDOW (WM_USER + 4) static LRESULT CALLBACK window_proc (HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam); @@ -71,6 +73,8 @@ GThread *thread; + GThread *internal_hwnd_thread; + HWND internal_hwnd; HWND external_hwnd; GstD3D11WindowWin32OverlayState overlay_state; @@ -81,10 +85,14 @@ /* atomic */ gint pending_fullscreen_count; + gint pending_move_window; /* fullscreen related */ RECT restore_rect; LONG restore_style; + + /* Handle set_render_rectangle */ + GstVideoRectangle render_rect; }; #define gst_d3d11_window_win32_parent_class parent_class @@ -109,10 +117,8 @@ static gpointer gst_d3d11_window_win32_thread_func (gpointer data); static gboolean gst_d3d11_window_win32_create_internal_window (GstD3D11WindowWin32 * self); -static void gst_d3d11_window_win32_close_internal_window (GstD3D11WindowWin32 * - self); -static void gst_d3d11_window_win32_release_external_handle (GstD3D11WindowWin32 - * self); +static void gst_d3d11_window_win32_destroy_internal_window (HWND hwnd); +static void gst_d3d11_window_win32_release_external_handle (HWND hwnd); static void gst_d3d11_window_win32_set_window_handle (GstD3D11WindowWin32 * self, guintptr handle); @@ -120,6 +126,11 @@ gst_d3d11_window_win32_on_resize (GstD3D11Window * window, guint width, guint height); static void gst_d3d11_window_win32_unprepare (GstD3D11Window * window); +static void 
+gst_d3d11_window_win32_set_render_rectangle (GstD3D11Window * window, + const GstVideoRectangle * rect); +static void gst_d3d11_window_win32_set_title (GstD3D11Window * window, + const gchar * title); static void gst_d3d11_window_win32_class_init (GstD3D11WindowWin32Class * klass) @@ -143,6 +154,10 @@ GST_DEBUG_FUNCPTR (gst_d3d11_window_win32_on_resize); window_class->unprepare = GST_DEBUG_FUNCPTR (gst_d3d11_window_win32_unprepare); + window_class->set_render_rectangle = + GST_DEBUG_FUNCPTR (gst_d3d11_window_win32_set_render_rectangle); + window_class->set_title = + GST_DEBUG_FUNCPTR (gst_d3d11_window_win32_set_title); } static void @@ -167,7 +182,7 @@ g_mutex_lock (&self->lock); self->loop = g_main_loop_new (self->main_context, FALSE); - self->thread = g_thread_new ("GstD3D11WindowWin32Win32", + self->thread = g_thread_new ("GstD3D11WindowWin32", (GThreadFunc) gst_d3d11_window_win32_thread_func, self); while (!g_main_loop_is_running (self->loop)) g_cond_wait (&self->cond, &self->lock); @@ -190,7 +205,32 @@ { GstD3D11WindowWin32 *self = GST_D3D11_WINDOW_WIN32 (window); - gst_d3d11_window_win32_release_external_handle (self); + if (self->external_hwnd) { + gst_d3d11_window_win32_release_external_handle (self->external_hwnd); + RemoveProp (self->internal_hwnd, D3D11_WINDOW_PROP_NAME); + + if (self->internal_hwnd_thread == g_thread_self ()) { + /* State changing thread is identical to internal window thread. + * window can be closed here */ + + GST_INFO_OBJECT (self, "Closing internal window immediately"); + gst_d3d11_window_win32_destroy_internal_window (self->internal_hwnd); + } else { + /* We cannot destroy internal window from non-window thread. + * and we cannot use synchronously SendMessage() method at this point + * since window thread might be wait for current thread and SendMessage() + * will be blocked until it's called from window thread. 
+ * Instead, posts message so that it can be closed from window thread + * asynchronously */ + GST_INFO_OBJECT (self, "Posting custom destory message"); + PostMessage (self->internal_hwnd, WM_GST_D3D11_DESTROY_INTERNAL_WINDOW, + 0, 0); + } + + self->external_hwnd = NULL; + self->internal_hwnd = NULL; + self->internal_hwnd_thread = NULL; + } if (self->loop) { g_main_loop_quit (self->loop); @@ -210,8 +250,53 @@ g_main_context_unref (self->main_context); self->main_context = NULL; } +} - gst_d3d11_window_win32_close_internal_window (self); +static void +gst_d3d11_window_win32_set_render_rectangle (GstD3D11Window * window, + const GstVideoRectangle * rect) +{ + GstD3D11WindowWin32 *self = GST_D3D11_WINDOW_WIN32 (window); + + if (self->external_hwnd && self->internal_hwnd) { + g_atomic_int_add (&self->pending_move_window, 1); + self->render_rect = *rect; + + if (self->internal_hwnd_thread == g_thread_self ()) { + /* We are on message pumping thread already, handle this synchroniously */ + SendMessage (self->internal_hwnd, WM_GST_D3D11_MOVE_WINDOW, 0, 0); + } else { + /* Post message to message pumping thread. Handling HWND specific message + * on message pumping thread is not a worst idea in generall */ + PostMessage (self->internal_hwnd, WM_GST_D3D11_MOVE_WINDOW, 0, 0); + } + } else { + /* XXX: Not sure what's expected behavior if we are drawing on internal + * HWND but user wants to specify rectangle. + * + * - Should we move window to corresponding desktop coordinates ? + * - Or should crop correspondingly by modifying viewport of + * render target view of swapchian's backbuffer or so ? + * - Or should we ignore set_render_rectangle if we are drawing on + * internal HWND without external HWND ? 
+ */ + } +} + +static void +gst_d3d11_window_win32_set_title (GstD3D11Window * window, const gchar * title) +{ + GstD3D11WindowWin32 *self = GST_D3D11_WINDOW_WIN32 (window); + + /* Do this only when we are rendring on our own HWND */ + if (!self->external_hwnd && self->internal_hwnd) { + gunichar2 *str = g_utf8_to_utf16 (title, -1, nullptr, nullptr, nullptr); + + if (str) { + SetWindowTextW (self->internal_hwnd, (LPCWSTR) str); + g_free (str); + } + } } static void @@ -278,7 +363,21 @@ g_main_loop_run (self->loop); - gst_d3d11_window_win32_close_internal_window (self); + RemoveProp (self->internal_hwnd, D3D11_WINDOW_PROP_NAME); + gst_d3d11_window_win32_destroy_internal_window (self->internal_hwnd); + self->internal_hwnd = NULL; + self->internal_hwnd_thread = NULL; + + if (self->msg_source) { + g_source_destroy (self->msg_source); + g_source_unref (self->msg_source); + self->msg_source = NULL; + } + + if (self->msg_io_channel) { + g_io_channel_unref (self->msg_io_channel); + self->msg_io_channel = NULL; + } g_main_context_pop_thread_default (self->main_context); @@ -288,28 +387,18 @@ } static void -gst_d3d11_window_win32_close_internal_window (GstD3D11WindowWin32 * self) +gst_d3d11_window_win32_destroy_internal_window (HWND hwnd) { - if (self->internal_hwnd) { - RemoveProp (self->internal_hwnd, D3D11_WINDOW_PROP_NAME); - ShowWindow (self->internal_hwnd, SW_HIDE); - SetParent (self->internal_hwnd, NULL); - if (!DestroyWindow (self->internal_hwnd)) - GST_WARNING ("failed to destroy window %" G_GUINTPTR_FORMAT - ", 0x%x", (guintptr) self->internal_hwnd, (guint) GetLastError ()); - self->internal_hwnd = NULL; - } + if (!hwnd) + return; - if (self->msg_source) { - g_source_destroy (self->msg_source); - g_source_unref (self->msg_source); - self->msg_source = NULL; - } + SetParent (hwnd, NULL); - if (self->msg_io_channel) { - g_io_channel_unref (self->msg_io_channel); - self->msg_io_channel = NULL; - } + GST_INFO ("Destroying internal window %" G_GUINTPTR_FORMAT, 
(guintptr) hwnd); + + if (!DestroyWindow (hwnd)) + GST_WARNING ("failed to destroy window %" G_GUINTPTR_FORMAT + ", 0x%x", (guintptr) hwnd, (guint) GetLastError ()); } static void @@ -336,30 +425,27 @@ } static void -gst_d3d11_window_win32_release_external_handle (GstD3D11WindowWin32 * self) +gst_d3d11_window_win32_release_external_handle (HWND hwnd) { WNDPROC external_proc; - if (!self->external_hwnd) + if (!hwnd) return; - external_proc = - (WNDPROC) GetProp (self->external_hwnd, EXTERNAL_PROC_PROP_NAME); - if (!external_proc) + external_proc = (WNDPROC) GetProp (hwnd, EXTERNAL_PROC_PROP_NAME); + if (!external_proc) { + GST_WARNING ("Failed to get original window procedure"); return; + } - GST_DEBUG_OBJECT (self, "release external window %" G_GUINTPTR_FORMAT - ", original window procedure %p", (guintptr) self->external_hwnd, - external_proc); + GST_DEBUG ("release external window %" G_GUINTPTR_FORMAT + ", original window procedure %p", (guintptr) hwnd, external_proc); - if (!SetWindowLongPtr (self->external_hwnd, - GWLP_WNDPROC, (LONG_PTR) external_proc)) { - GST_WARNING_OBJECT (self, "Couldn't restore original window procedure"); - } + RemoveProp (hwnd, EXTERNAL_PROC_PROP_NAME); + RemoveProp (hwnd, D3D11_WINDOW_PROP_NAME); - RemoveProp (self->external_hwnd, EXTERNAL_PROC_PROP_NAME); - RemoveProp (self->external_hwnd, D3D11_WINDOW_PROP_NAME); - self->external_hwnd = NULL; + if (!SetWindowLongPtr (hwnd, GWLP_WNDPROC, (LONG_PTR) external_proc)) + GST_WARNING ("Couldn't restore original window procedure"); } static gboolean @@ -428,6 +514,8 @@ GST_LOG_OBJECT (self, "Created a internal d3d11 window %p", self->internal_hwnd); + self->internal_hwnd_thread = g_thread_self (); + return TRUE; } @@ -463,7 +551,7 @@ ShowWindow (hwnd, SW_NORMAL); } else { - IDXGIOutput *output; + ComPtr < IDXGIOutput > output; DXGI_OUTPUT_DESC output_desc; IDXGISwapChain *swap_chain = window->swap_chain; @@ -483,7 +571,6 @@ swap_chain->GetContainingOutput (&output); output->GetDesc 
(&output_desc); - output->Release (); SetWindowPos (hwnd, HWND_TOPMOST, output_desc.DesktopCoordinates.left, @@ -585,8 +672,11 @@ break; case WM_CLOSE: if (self->internal_hwnd) { + RemoveProp (self->internal_hwnd, D3D11_WINDOW_PROP_NAME); ShowWindow (self->internal_hwnd, SW_HIDE); - gst_d3d11_window_win32_close_internal_window (self); + gst_d3d11_window_win32_destroy_internal_window (self->internal_hwnd); + self->internal_hwnd = NULL; + self->internal_hwnd_thread = NULL; } break; case WM_KEYDOWN: @@ -602,7 +692,8 @@ case WM_MOUSEMOVE: /* To handle mouse event only once, do this only for internal window */ if (self->internal_hwnd && self->internal_hwnd == hWnd) - gst_d3d11_window_win32_on_mouse_event (self, hWnd, uMsg, wParam, lParam); + gst_d3d11_window_win32_on_mouse_event (self, hWnd, uMsg, wParam, + lParam); /* DefWindowProc will not chain up mouse event to parent window */ if (self->external_hwnd && self->external_hwnd != hWnd) @@ -630,6 +721,27 @@ gst_d3d11_window_win32_change_fullscreen_mode_internal (self); } break; + case WM_GST_D3D11_MOVE_WINDOW: + if (g_atomic_int_get (&self->pending_move_window)) { + g_atomic_int_set (&self->pending_move_window, 0); + + if (self->internal_hwnd && self->external_hwnd) { + if (self->render_rect.w < 0 || self->render_rect.h < 0) { + RECT rect; + + /* Reset render rect and back to full-size window */ + if (GetClientRect (self->external_hwnd, &rect)) { + MoveWindow (self->internal_hwnd, 0, 0, + rect.right - rect.left, rect.bottom - rect.top, FALSE); + } + } else { + MoveWindow (self->internal_hwnd, self->render_rect.x, + self->render_rect.y, self->render_rect.w, self->render_rect.h, + FALSE); + } + } + } + break; default: break; } @@ -668,6 +780,11 @@ gst_d3d11_window_win32_handle_window_proc (self, hWnd, uMsg, wParam, lParam); + } else if (uMsg == WM_GST_D3D11_DESTROY_INTERNAL_WINDOW) { + GST_INFO ("Handle destroy window message"); + gst_d3d11_window_win32_destroy_internal_window (hWnd); + + return 0; } if (uMsg == WM_SIZE) 
@@ -708,19 +825,28 @@ /* don't need to be chained up to parent window procedure, * as this is our custom message */ return 0; - } else if (uMsg == WM_SIZE) { - MoveWindow (self->internal_hwnd, 0, 0, LOWORD (lParam), HIWORD (lParam), - FALSE); - } else if (uMsg == WM_CLOSE || uMsg == WM_DESTROY) { - g_mutex_lock (&self->lock); - GST_WARNING_OBJECT (self, "external window is closing"); - gst_d3d11_window_win32_release_external_handle (self); - self->external_hwnd = NULL; - self->overlay_state = GST_D3D11_WINDOW_WIN32_OVERLAY_STATE_CLOSED; - g_mutex_unlock (&self->lock); - } else { - gst_d3d11_window_win32_handle_window_proc (self, hWnd, uMsg, wParam, - lParam); + } else if (self) { + if (uMsg == WM_SIZE) { + MoveWindow (self->internal_hwnd, 0, 0, LOWORD (lParam), HIWORD (lParam), + FALSE); + } else if (uMsg == WM_CLOSE || uMsg == WM_DESTROY) { + g_mutex_lock (&self->lock); + GST_WARNING_OBJECT (self, "external window is closing"); + gst_d3d11_window_win32_release_external_handle (self->external_hwnd); + self->external_hwnd = NULL; + + RemoveProp (self->internal_hwnd, D3D11_WINDOW_PROP_NAME); + ShowWindow (self->internal_hwnd, SW_HIDE); + gst_d3d11_window_win32_destroy_internal_window (self->internal_hwnd); + self->internal_hwnd = NULL; + self->internal_hwnd_thread = NULL; + + self->overlay_state = GST_D3D11_WINDOW_WIN32_OVERLAY_STATE_CLOSED; + g_mutex_unlock (&self->lock); + } else { + gst_d3d11_window_win32_handle_window_proc (self, hWnd, uMsg, wParam, + lParam); + } } return CallWindowProc (external_window_proc, hWnd, uMsg, wParam, lParam); @@ -730,10 +856,10 @@ gst_d3d11_window_win32_disable_alt_enter (GstD3D11WindowWin32 * self, GstD3D11Device * device, IDXGISwapChain * swap_chain, HWND hwnd) { - IDXGIFactory1 *factory = NULL; + ComPtr < IDXGIFactory1 > factory; HRESULT hr; - hr = swap_chain->GetParent (IID_IDXGIFactory1, (void **) &factory); + hr = swap_chain->GetParent (IID_PPV_ARGS (&factory)); if (!gst_d3d11_result (hr, device) || !factory) { 
GST_WARNING_OBJECT (self, "Cannot get parent dxgi factory for swapchain %p, hr: 0x%x", @@ -746,9 +872,62 @@ GST_WARNING_OBJECT (self, "MakeWindowAssociation failure, hr: 0x%x", (guint) hr); } +} + +static IDXGISwapChain * +create_swap_chain (GstD3D11WindowWin32 * self, GstD3D11Device * device, + DXGI_SWAP_CHAIN_DESC * desc) +{ + HRESULT hr; + IDXGISwapChain *swap_chain = NULL; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + IDXGIFactory1 *factory = gst_d3d11_device_get_dxgi_factory_handle (device); + + gst_d3d11_device_lock (device); + hr = factory->CreateSwapChain (device_handle, desc, &swap_chain); + gst_d3d11_device_unlock (device); + + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "Cannot create SwapChain Object: 0x%x", + (guint) hr); + swap_chain = NULL; + } + + return swap_chain; +} + +#if (GST_D3D11_DXGI_HEADER_VERSION >= 2) +static IDXGISwapChain1 * +create_swap_chain_for_hwnd (GstD3D11WindowWin32 * self, GstD3D11Device * device, + HWND hwnd, DXGI_SWAP_CHAIN_DESC1 * desc, + DXGI_SWAP_CHAIN_FULLSCREEN_DESC * fullscreen_desc, IDXGIOutput * output) +{ + HRESULT hr; + IDXGISwapChain1 *swap_chain = NULL; + ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device); + IDXGIFactory1 *factory = gst_d3d11_device_get_dxgi_factory_handle (device); + ComPtr < IDXGIFactory2 > factory2; + + hr = factory->QueryInterface (IID_PPV_ARGS (&factory2)); + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "IDXGIFactory2 interface is unavailable"); + return NULL; + } - factory->Release (); + gst_d3d11_device_lock (device); + hr = factory2->CreateSwapChainForHwnd (device_handle, hwnd, desc, + fullscreen_desc, output, &swap_chain); + gst_d3d11_device_unlock (device); + + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, "Cannot create SwapChain Object: 0x%x", + (guint) hr); + swap_chain = NULL; + } + + return swap_chain; } +#endif static gboolean 
gst_d3d11_window_win32_create_swap_chain (GstD3D11Window * window, @@ -762,7 +941,7 @@ self->have_swapchain1 = FALSE; -#if (DXGI_HEADER_VERSION >= 2) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 2) { DXGI_SWAP_CHAIN_DESC1 desc1 = { 0, }; desc1.Width = 0; @@ -784,8 +963,7 @@ desc1.AlphaMode = DXGI_ALPHA_MODE_UNSPECIFIED; desc1.Flags = swapchain_flags; - new_swapchain = (IDXGISwapChain *) - gst_d3d11_device_create_swap_chain_for_hwnd (device, + new_swapchain = create_swap_chain_for_hwnd (self, device, self->internal_hwnd, &desc1, NULL, NULL); if (!new_swapchain) { @@ -820,7 +998,7 @@ desc.Windowed = TRUE; desc.Flags = swapchain_flags; - new_swapchain = gst_d3d11_device_create_swap_chain (device, &desc); + new_swapchain = create_swap_chain (self, device, &desc); } if (!new_swapchain) { @@ -892,8 +1070,7 @@ return GST_D3D11_WINDOW_FLOW_CLOSED; } - -#if (DXGI_HEADER_VERSION >= 2) +#if (GST_D3D11_DXGI_HEADER_VERSION >= 2) if (self->have_swapchain1) { IDXGISwapChain1 *swap_chain1 = (IDXGISwapChain1 *) window->swap_chain; DXGI_PRESENT_PARAMETERS present_params = { 0, }; @@ -966,4 +1143,4 @@ g_object_ref_sink (window); return window; -} \ No newline at end of file +}
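The gstd3d11window_win32.cpp hunks above replace the synchronous close path with a custom WM_GST_D3D11_DESTROY_INTERNAL_WINDOW message posted to the window's own message-pumping thread: Win32 requires DestroyWindow() to run on the thread that created the HWND, and a blocking SendMessage() from another thread could deadlock if the window thread is itself waiting on the caller. A minimal, platform-neutral sketch of that "post to the owning thread" pattern (class and message names are illustrative, not from GStreamer or Win32):

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

// Stand-ins for the custom WM_GST_D3D11_* window messages in the diff.
enum class Msg { Redraw, Destroy };

// Owns a resource that, like a Win32 HWND, may only be torn down on the
// thread that created it. Other threads post a Destroy message instead of
// destroying it directly (the PostMessage() path in the diff).
class WindowOwner {
public:
  WindowOwner() { pump_ = std::thread([this] { pump(); }); }

  // Analogue of PostMessage(): enqueue and return immediately, never
  // blocking on the window thread (which is what made SendMessage() unsafe).
  void post(Msg m) {
    {
      std::lock_guard<std::mutex> lk(mtx_);
      queue_.push(m);
    }
    cv_.notify_one();
  }

  void join() { pump_.join(); }
  bool destroyed_on_owner_thread() const { return destroyed_on_owner_; }

private:
  void pump() {
    const std::thread::id owner = std::this_thread::get_id();
    for (;;) {
      Msg m;
      {
        std::unique_lock<std::mutex> lk(mtx_);
        cv_.wait(lk, [this] { return !queue_.empty(); });
        m = queue_.front();
        queue_.pop();
      }
      if (m == Msg::Destroy) {
        // DestroyWindow() would run here, on the creating thread only.
        destroyed_on_owner_ = (std::this_thread::get_id() == owner);
        return;
      }
    }
  }

  std::thread pump_;
  std::mutex mtx_;
  std::condition_variable cv_;
  std::queue<Msg> queue_;
  bool destroyed_on_owner_ = false;
};
```

The diff keeps the synchronous path only when the caller already is the window thread (the `internal_hwnd_thread == g_thread_self ()` check), which is exactly the case where a direct call cannot deadlock.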
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/gstd3d11window_win32.h -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/gstd3d11window_win32.h
Changed
@@ -23,7 +23,6 @@ #include <gst/gst.h> #include <gst/video/video.h> -#include "gstd3d11_fwd.h" #include "gstd3d11window.h" G_BEGIN_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/d3d11/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/meson.build
Changed
@@ -1,84 +1,53 @@ d3d11_sources = [ - 'gstd3d11bufferpool.c', - 'gstd3d11device.c', - 'gstd3d11memory.c', - 'gstd3d11utils.c', - 'gstd3d11videosink.c', + 'gstd3d11basefilter.cpp', + 'gstd3d11convert.cpp', + 'gstd3d11converter.cpp', + 'gstd3d11compositor.cpp', + 'gstd3d11compositorbin.cpp', + 'gstd3d11download.cpp', + 'gstd3d11overlaycompositor.cpp', + 'gstd3d11pluginutils.cpp', + 'gstd3d11shader.cpp', + 'gstd3d11upload.cpp', + 'gstd3d11videoprocessor.cpp', + 'gstd3d11videosink.cpp', 'gstd3d11window.cpp', - 'plugin.c', - 'gstd3d11format.c', - 'gstd3d11basefilter.c', - 'gstd3d11upload.c', - 'gstd3d11download.c', - 'gstd3d11colorconvert.c', - 'gstd3d11videosinkbin.c', - 'gstd3d11shader.c', - 'gstd3d11colorconverter.c', - 'gstd3d11overlaycompositor.c', - 'gstd3d11videoprocessor.c', + 'gstd3d11window_dummy.cpp', + 'plugin.cpp', ] d3d11_dec_sources = [ - 'gstd3d11decoder.c', - 'gstd3d11h264dec.c', - 'gstd3d11vp9dec.c', - 'gstd3d11h265dec.c', - 'gstd3d11vp8dec.c', + 'gstd3d11av1dec.cpp', + 'gstd3d11decoder.cpp', + 'gstd3d11h264dec.cpp', + 'gstd3d11vp9dec.cpp', + 'gstd3d11h265dec.cpp', + 'gstd3d11mpeg2dec.cpp', + 'gstd3d11vp8dec.cpp', ] -dxgi_headers = [ - ['dxgi1_6.h', 6], - ['dxgi1_5.h', 5], - ['dxgi1_4.h', 4], - ['dxgi1_3.h', 3], - ['dxgi1_2.h', 2], - ['dxgi.h', 1] -] - -d3d11_headers = [ - ['d3d11_4.h', 4], - ['d3d11_3.h', 3], - ['d3d11_2.h', 2], - ['d3d11_1.h', 1], - ['d3d11.h', 0] -] - -have_d3d11 = false extra_c_args = ['-DCOBJMACROS'] -have_dxgi_header = false -have_d3d11_header = false -have_d3d11sdk_h = false -have_dxgidebug_h = false -winapi_desktop = false -winapi_app = false +extra_args = ['-DGST_USE_UNSTABLE_API'] extra_dep = [] -d3d11_conf = configuration_data() d3d11_option = get_option('d3d11') if host_system != 'windows' or d3d11_option.disabled() subdir_done() endif -d3d11_lib = cc.find_library('d3d11', required : d3d11_option) -dxgi_lib = cc.find_library('dxgi', required : d3d11_option) -d3dcompiler_lib = cc.find_library('d3dcompiler', required: 
d3d11_option) -runtimeobject_lib = cc.find_library('runtimeobject', required : false) - -foreach dxgi_h: dxgi_headers - if not have_dxgi_header and cc.has_header(dxgi_h[0]) - d3d11_conf.set('DXGI_HEADER_VERSION', dxgi_h[1]) - have_dxgi_header = true +if not gstd3d11_dep.found() + if d3d11_option.enabled() + error('The d3d11 was enabled explicitly, but required dependencies were not found.') endif -endforeach + subdir_done() +endif -foreach d3d11_h: d3d11_headers - if not have_d3d11_header and cc.has_header(d3d11_h[0]) - d3d11_conf.set('D3D11_HEADER_VERSION', d3d11_h[1]) - have_d3d11_header = true - endif -endforeach +d3dcompiler_lib = cc.find_library('d3dcompiler', required: d3d11_option) +runtimeobject_lib = cc.find_library('runtimeobject', required : false) +winmm_lib = cc.find_library('winmm', required: false) +has_decoder = false -have_d3d11 = d3d11_lib.found() and dxgi_lib.found() and have_d3d11_header and have_dxgi_header and cc.has_header('d3dcompiler.h') +have_d3d11 = cc.has_header('d3dcompiler.h') if not have_d3d11 if d3d11_option.enabled() error('The d3d11 plugin was enabled explicitly, but required dependencies were not found.') @@ -88,111 +57,67 @@ # d3d11 video api uses dxva structure for decoding, and dxva.h needs d3d9 types if cc.has_header('dxva.h') and cc.has_header('d3d9.h') - d3d11_conf.set('HAVE_DXVA_H', 1) d3d11_sources += d3d11_dec_sources - extra_c_args += ['-DGST_USE_UNSTABLE_API'] + extra_args += ['-DHAVE_DXVA_H'] extra_dep += [gstcodecs_dep] + has_decoder = true endif -winapi_desktop = cxx.compiles('''#include <winapifamily.h> - #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP) - #error "not win32" - #endif''', - dependencies: [d3d11_lib, dxgi_lib], - name: 'checking if building for Win32') - -if runtimeobject_lib.found() and d3dcompiler_lib.found() - winapi_app = cxx.compiles('''#include <winapifamily.h> - #include <windows.applicationmodel.core.h> - #include <wrl.h> - #include <wrl/wrappers/corewrappers.h> - #include <d3d11.h> - 
#include <dxgi1_2.h> - #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_APP) - #error "not winrt" - #endif''', - dependencies: [d3d11_lib, dxgi_lib, runtimeobject_lib], - name: 'checking if building for WinRT') -endif - -if not winapi_desktop and not winapi_app - error('Neither Desktop partition nor App partition') +if d3d11_winapi_only_app and (not d3dcompiler_lib.found() or not runtimeobject_lib.found()) + if d3d11_option.enabled() + error('The d3d11 plugin was enabled explicitly, but required dependencies were not found.') + endif + subdir_done() endif -winapi_app_only = winapi_app and not winapi_desktop - -if winapi_app_only +# if build target is Windows 10 and WINAPI_PARTITION_APP is allowed, +# we can build UWP only modules as well +if d3d11_winapi_app d3d11_sources += ['gstd3d11window_corewindow.cpp', 'gstd3d11window_swapchainpanel.cpp'] extra_dep += [runtimeobject_lib, d3dcompiler_lib] -else - d3d11_sources += ['gstd3d11window_win32.cpp'] endif -d3d11_conf.set10('GST_D3D11_WINAPI_ONLY_APP', winapi_app_only) - -# for enabling debug layer -# NOTE: Disable d3d11/dxgi debug layer in case of [UWP build + release CRT] -# WACK (Windows App Certification Kit) doesn't seem to be happy with -# the DXGIGetDebugInterface1 symbol. - -# FIXME: Probably DXGIGetDebugInterface1 might be used on UWP app for development -# purpose. So, I suspect one possible reason why WACK is complaining about -# DXGIGetDebugInterface1 is that debugging APIs couldn't be used for -# Windows store app, but couldn't find any reference about that. -# -# [IDXGIDebug1] -# https://docs.microsoft.com/en-us/windows/win32/api/dxgidebug/nn-dxgidebug-idxgidebug1 -# is saying that the IDXGIDebug1 interface is available for both desktop app and -# UWP. And then the *DXGIGetDebugInterface1* method need to be called to obtain -# the IDXGIDebug1 interface. 
-# -# [DXGIGetDebugInterface1] -# https://docs.microsoft.com/en-us/windows/win32/api/dxgi1_3/nf-dxgi1_3-dxgigetdebuginterface1 -# is mentioning that DXGIGetDebugInterface1 is desktop app only. -# -# PLEASE LET US KNOW A CORRECT WAY TO OBTAIN IDXGIDebug1 ON UWP, MICROSOFT -if get_option('debug') and not (winapi_app_only and get_option('b_vscrt') == 'md') - d3d11_debug_libs = [ - ['d3d11sdklayers.h', 'ID3D11Debug', 'ID3D11InfoQueue', 'have_d3d11sdk_h'], - ['dxgidebug.h', 'IDXGIDebug', 'IDXGIInfoQueue', 'have_dxgidebug_h'], - ] +if d3d11_winapi_desktop + d3d11_sources += ['gstd3d11window_win32.cpp'] + if d3d11_conf.get('GST_D3D11_DXGI_HEADER_VERSION') >= 6 + # Desktop Duplication API is unavailable for UWP + # and MinGW is not supported due to some missing headers + extra_args += ['-DHAVE_DXGI_DESKTOP_DUP'] + d3d11_sources += ['gstd3d11screencapture.cpp', + 'gstd3d11screencapturedevice.cpp', + 'gstd3d11screencapturesrc.cpp'] + message('Enable D3D11 Desktop Duplication API') + endif + # multimedia clock is desktop only API + if has_decoder and winmm_lib.found() and cc.has_header('timeapi.h') + extra_args += ['-DHAVE_WINMM'] + extra_dep += [winmm_lib] + endif +endif - foreach f : d3d11_debug_libs - header = f.get(0) - debug_obj = f.get(1) - info_obj = f.get(2) - compile_code = ''' - #include <d3d11.h> - #include <dxgi.h> - #include <@0@> - int main(int arc, char ** argv) { - @1@ *debug = NULL; - @2@ *info_queue = NULL; - return 0; - }'''.format(header, debug_obj, info_obj) - if cc.compiles(compile_code, dependencies: [d3d11_lib, dxgi_lib], name: debug_obj) - set_variable(f.get(3), true) - endif - endforeach -else - message('Disable D3D11Debug and DXGIDebug layers') +# need dxgi1_5.h for HDR10 processing and d3d11_4.h for ID3D11VideoContext2 interface +if d3d11_conf.get('GST_D3D11_DXGI_HEADER_VERSION') >= 5 and d3d11_conf.get('GST_D3D11_HEADER_VERSION') >= 4 + d3d11_sources += ['gstd3d11deinterlace.cpp'] + extra_args += ['-DHAVE_D3D11_VIDEO_PROC'] endif 
-d3d11_conf.set10('HAVE_D3D11SDKLAYERS_H', have_d3d11sdk_h) -d3d11_conf.set10('HAVE_DXGIDEBUG_H', have_dxgidebug_h) +# MinGW 32bits compiler seems to be complaining about redundant-decls +# when ComPtr is in use. Let's just disable the warning +if cc.get_id() != 'msvc' + extra_mingw_args = cc.get_supported_arguments([ + '-Wno-redundant-decls', + ]) -configure_file( - output: 'gstd3d11config.h', - configuration: d3d11_conf, -) + extra_args += extra_mingw_args +endif gstd3d11 = library('gstd3d11', d3d11_sources, - c_args : gst_plugins_bad_args + extra_c_args, - cpp_args: gst_plugins_bad_args, + c_args : gst_plugins_bad_args + extra_c_args + extra_args, + cpp_args: gst_plugins_bad_args + extra_args, include_directories : [configinc], - dependencies : [gstbase_dep, gstvideo_dep, gmodule_dep, d3d11_lib, dxgi_lib] + extra_dep, + dependencies : [gstbase_dep, gstvideo_dep, gmodule_dep, gstcontroller_dep, gstd3d11_dep] + extra_dep, install : true, install_dir : plugins_install_dir, )
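The rewritten meson.build above stops probing for the D3D11/DXGI headers itself; it reads `GST_D3D11_DXGI_HEADER_VERSION` and friends from the shared gstd3d11 library's configuration and turns each optional capability into a preprocessor define (`HAVE_DXVA_H`, `HAVE_DXGI_DESKTOP_DUP`, `HAVE_WINMM`, ...) that plugin.cpp then gates with `#ifdef`. A small self-contained sketch of that build-time feature-gating pattern (the define names mirror the diff, but the function itself is illustrative):

```cpp
#include <string>
#include <vector>

// Normally these arrive from the build system as -D compiler flags; they are
// defined inline here only so the sketch compiles on its own. The
// commented-out define mimics a build (e.g. MinGW) where the Desktop
// Duplication headers were not found.
#define HAVE_DXVA_H
// #define HAVE_DXGI_DESKTOP_DUP

std::vector<std::string> enabled_features() {
  std::vector<std::string> features;
#ifdef HAVE_DXVA_H
  features.push_back("decoders");       // d3d11h264dec, d3d11vp9dec, ...
#endif
#ifdef HAVE_DXGI_DESKTOP_DUP
  features.push_back("screencapture");  // d3d11screencapturesrc
#endif
  return features;
}
```

This is why the plugin registers a different element set per build: code behind an undefined feature macro is never compiled, rather than being disabled at runtime.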

gst-plugins-bad-1.20.1.tar.xz/sys/d3d11/plugin.cpp
Added
@@ -0,0 +1,276 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:plugin-d3d11 + * + * Microsoft Direct3D11 plugin. + * + * This plugin consists of various video filter, screen capture source, + * video sink, and video decoder elements. + * + * GstD3D11 plugin supports H.264/AVC, H.265/HEVC, VP8, VP9, H.262/MPEG-2 video, + * and AV1 codecs for decoding as well as hardware-accelerated video + * deinterlacing. Note that minimum required OS version for video decoder and + * deinterlacing elements is Windows 8. + * + * Plugin feature names of decoders: + * - d3d11h264dec + * - d3d11h265dec + * - d3d11vp8dec + * - d3d11vp9dec + * - d3d11mpeg2dec + * - d3d11av1dec + * + * Similar to the video decoder case, deinterlacing element would be registered + * only if its supported by hardware with the feature name `d3d11deinterlace` + * + * However, depending on the hardware it runs on, some elements might not be + * registered in case that underlying hardware doesn't support the feature. + * For a system with multiple Direct3D11 compatible hardwares (i.e., GPU), + * there can be multiple plugin features having the same role. 
+ * The naming rule for the non-primary decoder element is + * `d3d11{codec}device{index}dec` where `index` is an arbitrary index number of + * hardware starting from 1. + * + * To get a list of all available elements, user can run + * ```sh + * gst-inspect-1.0.exe d3d11 + * ``` + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/d3d11/gstd3d11.h> +#include "gstd3d11videosink.h" +#include "gstd3d11upload.h" +#include "gstd3d11download.h" +#include "gstd3d11convert.h" +#include "gstd3d11shader.h" +#include "gstd3d11compositor.h" +#include "gstd3d11compositorbin.h" +#ifdef HAVE_DXVA_H +#include "gstd3d11h264dec.h" +#include "gstd3d11h265dec.h" +#include "gstd3d11vp9dec.h" +#include "gstd3d11vp8dec.h" +#include "gstd3d11mpeg2dec.h" +#include "gstd3d11av1dec.h" +#endif +#ifdef HAVE_DXGI_DESKTOP_DUP +#include "gstd3d11screencapturesrc.h" +#include "gstd3d11screencapturedevice.h" +#endif +#ifdef HAVE_D3D11_VIDEO_PROC +#include "gstd3d11deinterlace.h" +#endif + +GST_DEBUG_CATEGORY (gst_d3d11_debug); +GST_DEBUG_CATEGORY (gst_d3d11_shader_debug); +GST_DEBUG_CATEGORY (gst_d3d11_converter_debug); +GST_DEBUG_CATEGORY (gst_d3d11_plugin_utils_debug); +GST_DEBUG_CATEGORY (gst_d3d11_format_debug); +GST_DEBUG_CATEGORY (gst_d3d11_device_debug); +GST_DEBUG_CATEGORY (gst_d3d11_overlay_compositor_debug); +GST_DEBUG_CATEGORY (gst_d3d11_window_debug); +GST_DEBUG_CATEGORY (gst_d3d11_video_processor_debug); +GST_DEBUG_CATEGORY (gst_d3d11_compositor_debug); + +#ifdef HAVE_DXVA_H +GST_DEBUG_CATEGORY (gst_d3d11_decoder_debug); +GST_DEBUG_CATEGORY (gst_d3d11_h264_dec_debug); +GST_DEBUG_CATEGORY (gst_d3d11_h265_dec_debug); +GST_DEBUG_CATEGORY (gst_d3d11_vp9_dec_debug); +GST_DEBUG_CATEGORY (gst_d3d11_vp8_dec_debug); +GST_DEBUG_CATEGORY (gst_d3d11_mpeg2_dec_debug); +GST_DEBUG_CATEGORY (gst_d3d11_av1_dec_debug); +#endif + +#ifdef HAVE_DXGI_DESKTOP_DUP +GST_DEBUG_CATEGORY (gst_d3d11_screen_capture_debug); +GST_DEBUG_CATEGORY 
(gst_d3d11_screen_capture_device_debug); +#endif + +#ifdef HAVE_D3D11_VIDEO_PROC +GST_DEBUG_CATEGORY (gst_d3d11_deinterlace_debug); +#endif + +#define GST_CAT_DEFAULT gst_d3d11_debug + +static gboolean +plugin_init (GstPlugin * plugin) +{ + GstRank video_sink_rank = GST_RANK_PRIMARY; + D3D_FEATURE_LEVEL max_feature_level = D3D_FEATURE_LEVEL_9_3; + guint i; + + GST_DEBUG_CATEGORY_INIT (gst_d3d11_debug, "d3d11", 0, "direct3d 11 plugin"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_shader_debug, + "d3d11shader", 0, "d3d11shader"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_converter_debug, + "d3d11converter", 0, "d3d11converter"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_plugin_utils_debug, + "d3d11pluginutils", 0, "d3d11 plugin utility functions"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_overlay_compositor_debug, + "d3d11overlaycompositor", 0, "d3d11overlaycompositor"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_window_debug, + "d3d11window", 0, "d3d11window"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_video_processor_debug, + "d3d11videoprocessor", 0, "d3d11videoprocessor"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_compositor_debug, + "d3d11compositor", 0, "d3d11compositor element"); + + if (!gst_d3d11_shader_init ()) { + GST_WARNING ("Cannot initialize d3d11 shader"); + return TRUE; + } +#ifdef HAVE_DXVA_H + /* DXVA2 API is availble since Windows 8 */ + if (gst_d3d11_is_windows_8_or_greater ()) { + GST_DEBUG_CATEGORY_INIT (gst_d3d11_decoder_debug, + "d3d11decoder", 0, "Direct3D11 Video Decoder object"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_h264_dec_debug, + "d3d11h264dec", 0, "Direct3D11 H.264 Video Decoder"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_vp9_dec_debug, + "d3d11vp9dec", 0, "Direct3D11 VP9 Video Decoder"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_h265_dec_debug, + "d3d11h265dec", 0, "Direct3D11 H.265 Video Decoder"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_vp8_dec_debug, + "d3d11vp8dec", 0, "Direct3D11 VP8 Decoder"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_mpeg2_dec_debug, + "d3d11mpeg2dec", 0, 
"Direct3D11 MPEG2 Decoder"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_av1_dec_debug, + "d3d11av1dec", 0, "Direct3D11 AV1 Decoder"); + } +#endif + +#ifdef HAVE_D3D11_VIDEO_PROC + GST_DEBUG_CATEGORY_INIT (gst_d3d11_deinterlace_debug, + "d3d11deinterlace", 0, "Direct3D11 Deinterlacer"); +#endif + + /* Enumerate devices to register decoders per device and to get the highest + * feature level */ + /* AMD seems supporting up to 12 cards, and 8 for NVIDIA */ + for (i = 0; i < 12; i++) { + GstD3D11Device *device = NULL; + ID3D11Device *device_handle; + D3D_FEATURE_LEVEL feature_level; + + device = gst_d3d11_device_new (i, D3D11_CREATE_DEVICE_BGRA_SUPPORT); + if (!device) + break; + + device_handle = gst_d3d11_device_get_device_handle (device); + feature_level = device_handle->GetFeatureLevel (); + + if (feature_level > max_feature_level) + max_feature_level = feature_level; + +#ifdef HAVE_DXVA_H + /* DXVA2 API is availble since Windows 8 */ + if (gst_d3d11_is_windows_8_or_greater () && + gst_d3d11_device_get_video_device_handle (device)) { + gboolean legacy = gst_d3d11_decoder_util_is_legacy_device (device); + + gst_d3d11_h264_dec_register (plugin, device, GST_RANK_SECONDARY, legacy); + if (!legacy) { + gst_d3d11_h265_dec_register (plugin, device, GST_RANK_SECONDARY); + gst_d3d11_vp9_dec_register (plugin, device, GST_RANK_SECONDARY); + gst_d3d11_vp8_dec_register (plugin, device, GST_RANK_SECONDARY); + gst_d3d11_mpeg2_dec_register (plugin, device, GST_RANK_SECONDARY); + gst_d3d11_av1_dec_register (plugin, device, GST_RANK_SECONDARY); + } + } +#endif + +#ifdef HAVE_D3D11_VIDEO_PROC + /* D3D11 video processor API is availble since Windows 8 */ + if (gst_d3d11_is_windows_8_or_greater ()) { + gboolean hardware; + + g_object_get (device, "hardware", &hardware, NULL); + if (hardware) + gst_d3d11_deinterlace_register (plugin, device, GST_RANK_MARGINAL); + } +#endif + + gst_object_unref (device); + } + + /* FIXME: Our shader code is not compatible with D3D_FEATURE_LEVEL_9_3 + * or 
lower. So HLSL compiler cannot understand our shader code and + * therefore d3d11colorconverter cannot be configured. + * + * Known D3D_FEATURE_LEVEL_9_3 driver is + * "VirtualBox Graphics Adapter (WDDM)" + * ... and there might be some more old physical devices which don't support + * D3D_FEATURE_LEVEL_10_0. + */ + if (max_feature_level < D3D_FEATURE_LEVEL_10_0) + video_sink_rank = GST_RANK_NONE; + + gst_d3d11_plugin_utils_init (max_feature_level); + + gst_element_register (plugin, + "d3d11upload", GST_RANK_NONE, GST_TYPE_D3D11_UPLOAD); + gst_element_register (plugin, + "d3d11download", GST_RANK_NONE, GST_TYPE_D3D11_DOWNLOAD); + gst_element_register (plugin, + "d3d11convert", GST_RANK_NONE, GST_TYPE_D3D11_CONVERT); + gst_element_register (plugin, + "d3d11colorconvert", GST_RANK_NONE, GST_TYPE_D3D11_COLOR_CONVERT); + gst_element_register (plugin, + "d3d11scale", GST_RANK_NONE, GST_TYPE_D3D11_SCALE); + gst_element_register (plugin, + "d3d11videosink", video_sink_rank, GST_TYPE_D3D11_VIDEO_SINK); + + gst_element_register (plugin, + "d3d11compositorelement", GST_RANK_NONE, GST_TYPE_D3D11_COMPOSITOR); + gst_element_register (plugin, + "d3d11compositor", GST_RANK_SECONDARY, GST_TYPE_D3D11_COMPOSITOR_BIN); + +#ifdef HAVE_DXGI_DESKTOP_DUP + if (gst_d3d11_is_windows_8_or_greater ()) { + GST_DEBUG_CATEGORY_INIT (gst_d3d11_screen_capture_debug, + "d3d11screencapturesrc", 0, "d3d11screencapturesrc"); + GST_DEBUG_CATEGORY_INIT (gst_d3d11_screen_capture_device_debug, + "d3d11screencapturedevice", 0, "d3d11screencapturedevice"); + + gst_element_register (plugin, + "d3d11screencapturesrc", GST_RANK_NONE, + GST_TYPE_D3D11_SCREEN_CAPTURE_SRC); + gst_device_provider_register (plugin, + "d3d11screencapturedeviceprovider", GST_RANK_PRIMARY, + GST_TYPE_D3D11_SCREEN_CAPTURE_DEVICE_PROVIDER); + } +#endif + + return TRUE; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + d3d11, + "Direct3D11 plugin", + plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
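Several hunks in this revision (the `IDXGIOutput` and `IDXGIFactory1` changes in gstd3d11window_win32.cpp in particular) swap manual `Release ()` calls for `Microsoft::WRL::ComPtr`, which drops the COM reference automatically when the smart pointer leaves scope. A stripped-down, non-Windows sketch of that RAII idea (`RefCounted` and `ScopedComPtr` are illustrative names, not real COM or WRL types):

```cpp
#include <cstddef>

// Toy COM-style object: real COM interfaces expose AddRef()/Release(), and
// the object frees itself once the reference count reaches zero.
struct RefCounted {
  std::size_t refs = 1;
  void AddRef() { ++refs; }
  std::size_t Release() { return --refs; }
};

// What Microsoft::WRL::ComPtr automates: Release() runs on every exit path,
// so the early-return error branches in the diff can no longer leak a
// reference the way a forgotten factory->Release() could.
template <typename T>
class ScopedComPtr {
public:
  explicit ScopedComPtr(T *p = nullptr) : p_(p) {}
  ~ScopedComPtr() {
    if (p_)
      p_->Release();
  }
  ScopedComPtr(const ScopedComPtr &) = delete;
  ScopedComPtr &operator=(const ScopedComPtr &) = delete;
  T *operator->() const { return p_; }
  T *get() const { return p_; }

private:
  T *p_;
};
```

The related `IID_PPV_ARGS (&factory)` change in the diff serves the same safety goal: it derives the interface IID from the pointer type at compile time instead of trusting a hand-written IID/cast pair.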
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklink.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklink.cpp
Changed
@@ -88,11 +88,16 @@ {GST_DECKLINK_MODE_2160p5994, "4k 59.94p", "2160p5994"}, {GST_DECKLINK_MODE_2160p60, "4k 60p", "2160p60"}, - {GST_DECKLINK_MODE_NTSC_WIDESCREEN, "NTSC SD 60i Widescreen", "ntsc-widescreen"}, - {GST_DECKLINK_MODE_NTSC2398_WIDESCREEN, "NTSC SD 60i Widescreen (24 fps)", "ntsc2398-widescreen"}, - {GST_DECKLINK_MODE_PAL_WIDESCREEN, "PAL SD 50i Widescreen", "pal-widescreen"}, - {GST_DECKLINK_MODE_NTSC_P_WIDESCREEN, "NTSC SD 60p Widescreen", "ntsc-p-widescreen"}, - {GST_DECKLINK_MODE_PAL_P_WIDESCREEN, "PAL SD 50p Widescreen", "pal-p-widescreen"}, + {GST_DECKLINK_MODE_NTSC_WIDESCREEN, "NTSC SD 60i Widescreen", + "ntsc-widescreen"}, + {GST_DECKLINK_MODE_NTSC2398_WIDESCREEN, "NTSC SD 60i Widescreen (24 fps)", + "ntsc2398-widescreen"}, + {GST_DECKLINK_MODE_PAL_WIDESCREEN, "PAL SD 50i Widescreen", + "pal-widescreen"}, + {GST_DECKLINK_MODE_NTSC_P_WIDESCREEN, "NTSC SD 60p Widescreen", + "ntsc-p-widescreen"}, + {GST_DECKLINK_MODE_PAL_P_WIDESCREEN, "PAL SD 50p Widescreen", + "pal-p-widescreen"}, {0, NULL, NULL} }; @@ -156,18 +161,41 @@ return (GType) id; } +/** + * GstDecklinkProfileId: + * @GST_DECKLINK_PROFILE_ID_DEFAULT: Don't change the profile + * @GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_FULL_DUPLEX: Equivalent to bmdProfileOneSubDeviceFullDuplex + * @GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_HALF_DUPLEX: Equivalent to bmdProfileOneSubDeviceHalfDuplex + * @GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_FULL_DUPLEX: Equivalent to bmdProfileTwoSubDevicesFullDuplex + * @GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_HALF_DUPLEX: Equivalent to bmdProfileTwoSubDevicesHalfDuplex + * @GST_DECKLINK_PROFILE_ID_FOUR_SUB_DEVICES_HALF_DUPLEX: Equivalent to bmdProfileFourSubDevicesHalfDuplex + * + * Decklink Profile ID + * + * Since: 1.20 + */ GType -gst_decklink_duplex_mode_get_type (void) +gst_decklink_profile_id_get_type (void) { static gsize id = 0; static const GEnumValue types[] = { - {GST_DECKLINK_DUPLEX_MODE_HALF, "Half-Duplex", "half"}, - {GST_DECKLINK_DUPLEX_MODE_FULL, 
"Full-Duplex", "full"}, + {GST_DECKLINK_PROFILE_ID_DEFAULT, "Default, don't change profile", + "default"}, + {GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_FULL_DUPLEX, + "One sub-device, Full-Duplex", "one-sub-device-full"}, + {GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_HALF_DUPLEX, + "One sub-device, Half-Duplex", "one-sub-device-half"}, + {GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_FULL_DUPLEX, + "Two sub-devices, Full-Duplex", "two-sub-devices-full"}, + {GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_HALF_DUPLEX, + "Two sub-devices, Half-Duplex", "two-sub-devices-half"}, + {GST_DECKLINK_PROFILE_ID_FOUR_SUB_DEVICES_HALF_DUPLEX, + "Four sub-devices, Half-Duplex", "four-sub-devices-half"}, {0, NULL, NULL} }; if (g_once_init_enter (&id)) { - GType tmp = g_enum_register_static ("GstDecklinkDuplexMode", types); + GType tmp = g_enum_register_static ("GstDecklinkProfileId", types); g_once_init_leave (&id, tmp); } @@ -351,15 +379,11 @@ /* *INDENT-ON* */ }; -static const struct +enum ProfileSetOperationResult { - BMDDuplexMode mode; - GstDecklinkDuplexMode gstmode; -} duplex_modes[] = { - /* *INDENT-OFF* */ - {bmdDuplexModeHalf, GST_DECKLINK_DUPLEX_MODE_HALF}, - {bmdDuplexModeFull, GST_DECKLINK_DUPLEX_MODE_FULL}, - /* *INDENT-ON* */ + PROFILE_SET_UNSUPPORTED, + PROFILE_SET_SUCCESS, + PROFILE_SET_FAILURE }; enum DuplexModeSetOperationResult @@ -589,25 +613,6 @@ return GST_DECKLINK_TIMECODE_FORMAT_RP188ANY; } -const BMDDuplexMode -gst_decklink_duplex_mode_from_enum (GstDecklinkDuplexMode m) -{ - return duplex_modes[m].mode; -} - -const GstDecklinkDuplexMode -gst_decklink_duplex_mode_to_enum (BMDDuplexMode m) -{ - guint i; - - for (i = 0; i < G_N_ELEMENTS (duplex_modes); i++) { - if (duplex_modes[i].mode == m) - return duplex_modes[i].gstmode; - } - g_assert_not_reached (); - return GST_DECKLINK_DUPLEX_MODE_HALF; -} - const BMDKeyerMode gst_decklink_keyer_mode_from_enum (GstDecklinkKeyerMode m) { @@ -860,14 +865,8 @@ GstDecklinkDevice *devices[4]; }; -DuplexModeSetOperationResult 
gst_decklink_configure_duplex_mode (Device * - device, BMDDuplexMode duplex_mode); -DuplexModeSetOperationResult -gst_decklink_configure_duplex_mode_pair_device (Device * device, - BMDDuplexMode duplex_mode); -Device *gst_decklink_find_device_by_persistent_id (int64_t persistent_id); -gboolean gst_decklink_device_has_persistent_id (Device * device, - int64_t persistent_id); +static ProfileSetOperationResult gst_decklink_configure_profile (Device * + device, GstDecklinkProfileId profile_id); class GStreamerDecklinkInputCallback:public IDeckLinkInputCallback { @@ -947,7 +946,8 @@ /* Reset any timestamp observations we might've made */ if (m_input->videosrc) { - GstDecklinkVideoSrc *videosrc = GST_DECKLINK_VIDEO_SRC (m_input->videosrc); + GstDecklinkVideoSrc *videosrc = + GST_DECKLINK_VIDEO_SRC (m_input->videosrc); g_mutex_lock (&videosrc->lock); videosrc->window_fill = 0; @@ -991,7 +991,7 @@ GstClockTime stream_time, GstClockTime stream_duration, GstClockTime hardware_time, GstClockTime hardware_duration, gboolean no_signal) = NULL; - GstDecklinkModeEnum mode; + GstDecklinkModeEnum mode = GST_DECKLINK_MODE_AUTO; GstClockTime capture_time = GST_CLOCK_TIME_NONE; GstClockTime base_time = 0; gboolean no_signal = FALSE; @@ -1009,7 +1009,9 @@ base_time = gst_element_get_base_time (videosrc); got_video_frame = m_input->got_video_frame; } - mode = gst_decklink_get_mode_enum_from_bmd (m_input->mode->mode); + + if (m_input->mode) + mode = gst_decklink_get_mode_enum_from_bmd (m_input->mode->mode); if (m_input->audiosrc) { audiosrc = GST_ELEMENT_CAST (gst_object_ref (m_input->audiosrc)); @@ -1152,7 +1154,7 @@ while ((buf = (uint8_t *) gst_queue_array_pop_head (m_buffers))) { uint8_t offset = *(buf - 1); void *alloc_buf = buf - 128 + offset; - g_free (alloc_buf); + g_free (alloc_buf); } } @@ -1582,7 +1584,7 @@ } } - ret = decklink->QueryInterface (IID_IDeckLinkAttributes, + ret = decklink->QueryInterface (IID_IDeckLinkProfileAttributes, (void **) &dev->input.attributes); 
dev->output.attributes = dev->input.attributes; if (ret != S_OK) { @@ -1592,8 +1594,7 @@ bool tmp_bool = false; int64_t tmp_int = 2; - dev->input.attributes->GetInt (BMDDeckLinkMaximumAudioChannels, - &tmp_int); + dev->input.attributes->GetInt (BMDDeckLinkMaximumAudioChannels, &tmp_int); dev->input.attributes->GetFlag (BMDDeckLinkSupportsInputFormatDetection, &tmp_bool); supports_format_detection = tmp_bool; @@ -1713,8 +1714,8 @@ if (!is_audio) { GstDecklinkVideoSink *videosink = (GstDecklinkVideoSink *) (sink); - if (gst_decklink_configure_duplex_mode (device, - videosink->duplex_mode) == DUPLEX_MODE_SET_FAILURE) { + if (gst_decklink_configure_profile (device, + videosink->profile_id) == PROFILE_SET_FAILURE) { return NULL; } } @@ -1787,11 +1788,12 @@ if (!is_audio) { GstDecklinkVideoSrc *videosrc = (GstDecklinkVideoSrc *) (src); - if (gst_decklink_configure_duplex_mode (device, - videosrc->duplex_mode) == DUPLEX_MODE_SET_FAILURE) { + if (gst_decklink_configure_profile (device, + videosrc->profile_id) == PROFILE_SET_FAILURE) { return NULL; } } + g_mutex_lock (&input->lock); input->input->SetVideoInputFrameMemoryAllocator (new GStreamerDecklinkMemoryAllocator); @@ -1804,6 +1806,7 @@ g_mutex_unlock (&input->lock); return input; } + g_mutex_unlock (&input->lock); GST_ERROR ("Input device %d (audio: %d) in use already", n, is_audio); @@ -1840,150 +1843,68 @@ g_mutex_unlock (&input->lock); } -/* - * Probes if duplex-mode is supported and sets it accordingly. I duplex-mode is not supported - * but this device is part of a pair (Duo2- and Quad2-Cards) and Half-Dupley-Mode is requested, - * the parent device is also checked and configured accordingly. - * - * If - * - full-duplex-mode is requested and the device does not support it *or* - * - half-duplex-mode is requested and there is not parent-device *or* - * - half-duplex-mode is requested and neither the device nor the parent device does support setting - * the duplex-mode, DUPLEX_MODE_SET_UNSUPPORTED is returnded. 
- * If the device does support duplex-mode and setting it succeeded, DUPLEX_MODE_SET_SUCCESS is rerturned. - * If - * - the device does support duplex-mode and setting it failed *or* - * - the Device reported a pair-device that does not exist in the system, - * DUPLEX_MODE_SET_FAILURE is returned. - */ -DuplexModeSetOperationResult -gst_decklink_configure_duplex_mode (Device * device, BMDDuplexMode duplex_mode) +static ProfileSetOperationResult +gst_decklink_configure_profile (Device * device, + GstDecklinkProfileId profile_id) { - HRESULT result; - bool duplex_supported; - int64_t paired_device_id; + HRESULT res; + + if (profile_id == GST_DECKLINK_PROFILE_ID_DEFAULT) + return PROFILE_SET_SUCCESS; GstDecklinkInput *input = &device->input; + IDeckLink *decklink = input->device; + + IDeckLinkProfileManager *manager = NULL; + if (decklink->QueryInterface (IID_IDeckLinkProfileManager, + (void **) &manager) == S_OK) { + BMDProfileID bmd_profile_id; + + switch (profile_id) { + case GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_FULL_DUPLEX: + bmd_profile_id = bmdProfileOneSubDeviceFullDuplex; + break; + case GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_HALF_DUPLEX: + bmd_profile_id = bmdProfileOneSubDeviceHalfDuplex; + break; + case GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_FULL_DUPLEX: + bmd_profile_id = bmdProfileTwoSubDevicesFullDuplex; + break; + case GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_HALF_DUPLEX: + bmd_profile_id = bmdProfileTwoSubDevicesHalfDuplex; + break; + case GST_DECKLINK_PROFILE_ID_FOUR_SUB_DEVICES_HALF_DUPLEX: + bmd_profile_id = bmdProfileFourSubDevicesHalfDuplex; + break; + default: + case GST_DECKLINK_PROFILE_ID_DEFAULT: + g_assert_not_reached (); + break; + } - result = - input->attributes->GetFlag (BMDDeckLinkSupportsDuplexModeConfiguration, - &duplex_supported); - if (result != S_OK) { - duplex_supported = false; - } + IDeckLinkProfile *profile = NULL; + res = manager->GetProfile (bmd_profile_id, &profile); - if (!duplex_supported) { - if (duplex_mode == 
bmdDuplexModeFull) { - GST_DEBUG ("Device does not support Full-Duplex-Mode"); - return DUPLEX_MODE_SET_UNSUPPORTED; - } else if (duplex_mode == bmdDuplexModeHalf) { - result = - input->attributes->GetInt (BMDDeckLinkPairedDevicePersistentID, - &paired_device_id); - - if (result == S_OK) { - GST_DEBUG ("Device does not support Half-Duplex-Mode but the Device is " - "a Part of a Device-Pair, trying to set Half-Duplex-Mode " - "on the Parent-Device"); - - Device *pair_device = - gst_decklink_find_device_by_persistent_id (paired_device_id); - if (pair_device == NULL) { - GST_ERROR ("Device reported as Pair-Device does not exist"); - return DUPLEX_MODE_SET_FAILURE; - } - return gst_decklink_configure_duplex_mode_pair_device (pair_device, - duplex_mode); - } else { - GST_DEBUG ("Device does not support Half-Duplex-Mode"); - return DUPLEX_MODE_SET_SUCCESS; - } - } else { - GST_ERROR ("duplex_mode=%d", duplex_mode); - g_assert_not_reached (); + if (res == S_OK && profile) { + res = profile->SetActive (); + profile->Release (); } - } else { - GST_DEBUG ("Setting duplex-mode of Device"); - result = input->config->SetInt (bmdDeckLinkConfigDuplexMode, duplex_mode); - if (result == S_OK) { - GST_DEBUG ("Duplex mode set successful"); - return DUPLEX_MODE_SET_SUCCESS; + manager->Release (); + + if (res == S_OK) { + GST_DEBUG ("Successfully set profile"); + return PROFILE_SET_SUCCESS; } else { - GST_ERROR ("Setting duplex mode failed"); - return DUPLEX_MODE_SET_FAILURE; + GST_ERROR ("Failed to set profile"); + return PROFILE_SET_FAILURE; } - } - - g_assert_not_reached (); - return DUPLEX_MODE_SET_FAILURE; -} - -DuplexModeSetOperationResult -gst_decklink_configure_duplex_mode_pair_device (Device * device, - BMDDuplexMode duplex_mode) -{ - HRESULT result; - bool duplex_supported; - - GstDecklinkInput *input = &device->input; - - result = - input->attributes->GetFlag (BMDDeckLinkSupportsDuplexModeConfiguration, - &duplex_supported); - if (result != S_OK) { - duplex_supported = 
false; - } - - if (!duplex_supported) { - GST_DEBUG ("Pair-Device does not support Duplex-Mode"); - return DUPLEX_MODE_SET_UNSUPPORTED; - } - - GST_DEBUG ("Setting duplex-mode of Pair-Device"); - result = input->config->SetInt (bmdDeckLinkConfigDuplexMode, duplex_mode); - - if (result == S_OK) { - GST_DEBUG ("Duplex mode set successful"); - return DUPLEX_MODE_SET_SUCCESS; } else { - GST_ERROR ("Setting duplex mode failed"); - return DUPLEX_MODE_SET_FAILURE; + GST_DEBUG ("Device has only one profile"); + return PROFILE_SET_UNSUPPORTED; } } -gboolean -gst_decklink_device_has_persistent_id (Device * device, int64_t persistent_id) -{ - HRESULT result; - int64_t this_device_persistent_id; - - GstDecklinkInput *input = &device->input; - - result = - input->attributes->GetInt (BMDDeckLinkPersistentID, - &this_device_persistent_id); - return (result == S_OK) && (this_device_persistent_id == persistent_id); -} - -Device * -gst_decklink_find_device_by_persistent_id (int64_t persistent_id) -{ - GST_DEBUG ("Searching Device by persistent ID %" G_GINT64_FORMAT, - (gint64) persistent_id); - - for (guint index = 0; index < devices->len; index++) { - Device *device = (Device *) g_ptr_array_index (devices, index); - - if (gst_decklink_device_has_persistent_id (device, persistent_id)) { - GST_DEBUG ("Found matching Device %u", index); - return device; - } - } - - return NULL; -} - G_DEFINE_TYPE (GstDecklinkClock, gst_decklink_clock, GST_TYPE_SYSTEM_CLOCK); static GstClockTime gst_decklink_clock_get_internal_time (GstClock * clock); @@ -2073,38 +1994,22 @@ return result; } -static gboolean -plugin_init (GstPlugin * plugin) +void +decklink_element_init (GstPlugin * plugin) { - GST_DEBUG_CATEGORY_INIT (gst_decklink_debug, "decklink", 0, - "debug category for decklink plugin"); - - gst_element_register (plugin, "decklinkaudiosink", GST_RANK_NONE, - GST_TYPE_DECKLINK_AUDIO_SINK); - gst_element_register (plugin, "decklinkvideosink", GST_RANK_NONE, - GST_TYPE_DECKLINK_VIDEO_SINK); - 
gst_element_register (plugin, "decklinkaudiosrc", GST_RANK_NONE, - GST_TYPE_DECKLINK_AUDIO_SRC); - gst_element_register (plugin, "decklinkvideosrc", GST_RANK_NONE, - GST_TYPE_DECKLINK_VIDEO_SRC); - - gst_device_provider_register (plugin, "decklinkdeviceprovider", - GST_RANK_PRIMARY, GST_TYPE_DECKLINK_DEVICE_PROVIDER); - - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_AUDIO_CHANNELS, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_AUDIO_CONNECTION, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_DUPLEX_MODE, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_KEYER_MODE, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_MODE, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_TIMECODE_FORMAT, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_VIDEO_FORMAT, (GstPluginAPIFlags) 0); - gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_CONNECTION, (GstPluginAPIFlags) 0); - - return TRUE; + static gsize res = FALSE; + if (g_once_init_enter (&res)) { + GST_DEBUG_CATEGORY_INIT (gst_decklink_debug, "decklink", 0, + "debug category for decklink plugin"); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_AUDIO_CHANNELS, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_AUDIO_CONNECTION, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_PROFILE_ID, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_KEYER_MODE, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_MODE, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_TIMECODE_FORMAT, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_VIDEO_FORMAT, (GstPluginAPIFlags) 0); + gst_type_mark_as_plugin_api (GST_TYPE_DECKLINK_CONNECTION, (GstPluginAPIFlags) 0); + + g_once_init_leave (&res, TRUE); + } } - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - 
decklink, - "Blackmagic Decklink plugin", - plugin_init, VERSION, "LGPL", PACKAGE_NAME, GST_PACKAGE_ORIGIN)
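The new `gst_decklink_configure_profile` above replaces the old duplex-mode plumbing: it maps the GStreamer profile enum onto the DeckLink SDK's BMDProfileID, activates the matching profile via IDeckLinkProfileManager::GetProfile and IDeckLinkProfile::SetActive, and returns one of three results. The control flow can be sketched as follows, with string stand-ins for the enum values and a duck-typed `manager` standing in for the SDK interface (both are simplifications, not the SDK's actual types):

```python
# Sketch of gst_decklink_configure_profile's decision flow above.
# Profile nicks mirror the GstDecklinkProfileId -> BMDProfileID switch;
# "manager" stands in for IDeckLinkProfileManager (None = not exposed).
PROFILE_SET_UNSUPPORTED, PROFILE_SET_SUCCESS, PROFILE_SET_FAILURE = range(3)

GST_TO_BMD_PROFILE = {
    "one-sub-device-full": "bmdProfileOneSubDeviceFullDuplex",
    "one-sub-device-half": "bmdProfileOneSubDeviceHalfDuplex",
    "two-sub-devices-full": "bmdProfileTwoSubDevicesFullDuplex",
    "two-sub-devices-half": "bmdProfileTwoSubDevicesHalfDuplex",
    "four-sub-devices-half": "bmdProfileFourSubDevicesHalfDuplex",
}

def configure_profile(profile_id, manager):
    if profile_id == "default":          # don't touch the active profile
        return PROFILE_SET_SUCCESS
    if manager is None:                  # QueryInterface failed: the device
        return PROFILE_SET_UNSUPPORTED   # has only one profile
    bmd_id = GST_TO_BMD_PROFILE[profile_id]
    profile = manager.get_profile(bmd_id)            # GetProfile
    if profile is not None and profile.set_active():  # SetActive
        return PROFILE_SET_SUCCESS
    return PROFILE_SET_FAILURE
```

This also makes clear why the per-device pair-device search of the old duplex code could be deleted: profile activation is delegated entirely to the SDK's profile manager, which already knows which sub-devices share a profile group.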
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklink.h
Changed
@@ -55,6 +55,8 @@ #define WINAPI #endif /* G_OS_WIN32 */ +void decklink_element_init (GstPlugin * plugin); + typedef enum { GST_DECKLINK_MODE_AUTO, @@ -161,11 +163,15 @@ GType gst_decklink_video_format_get_type (void); typedef enum { - GST_DECKLINK_DUPLEX_MODE_HALF, /* bmdDuplexModeHalf */ - GST_DECKLINK_DUPLEX_MODE_FULL, /* bmdDuplexModeFull */ -} GstDecklinkDuplexMode; -#define GST_TYPE_DECKLINK_DUPLEX_MODE (gst_decklink_duplex_mode_get_type ()) -GType gst_decklink_duplex_mode_get_type (void); + GST_DECKLINK_PROFILE_ID_DEFAULT, + GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_FULL_DUPLEX, /* bmdProfileOneSubDeviceFullDuplex */ + GST_DECKLINK_PROFILE_ID_ONE_SUB_DEVICE_HALF_DUPLEX, /* bmdProfileOneSubDeviceHalfDuplex */ + GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_FULL_DUPLEX, /* bmdProfileTwoSubDevicesFullDuplex */ + GST_DECKLINK_PROFILE_ID_TWO_SUB_DEVICES_HALF_DUPLEX, /* bmdProfileTwoSubDevicesHalfDuplex */ + GST_DECKLINK_PROFILE_ID_FOUR_SUB_DEVICES_HALF_DUPLEX, /* bmdProfileFourSubDevicesHalfDuplex */ +} GstDecklinkProfileId; +#define GST_TYPE_DECKLINK_PROFILE_ID (gst_decklink_profile_id_get_type ()) +GType gst_decklink_profile_id_get_type (void); typedef enum { GST_DECKLINK_TIMECODE_FORMAT_RP188VITC1, /*bmdTimecodeRP188VITC1 */ @@ -204,8 +210,8 @@ GstVideoFormat gst_decklink_video_format_from_type (BMDPixelFormat pf); const BMDTimecodeFormat gst_decklink_timecode_format_from_enum (GstDecklinkTimecodeFormat f); const GstDecklinkTimecodeFormat gst_decklink_timecode_format_to_enum (BMDTimecodeFormat f); -const BMDDuplexMode gst_decklink_duplex_mode_from_enum (GstDecklinkDuplexMode m); -const GstDecklinkDuplexMode gst_decklink_duplex_mode_to_enum (BMDDuplexMode m); +const BMDProfileID gst_decklink_profile_id_from_enum (GstDecklinkProfileId p); +const GstDecklinkProfileId gst_decklink_profile_id_to_enum (BMDProfileID p); const BMDKeyerMode gst_decklink_keyer_mode_from_enum (GstDecklinkKeyerMode m); const GstDecklinkKeyerMode gst_decklink_keyer_mode_to_enum (BMDKeyerMode m); @@ 
-233,7 +239,7 @@ struct _GstDecklinkOutput { IDeckLink *device; IDeckLinkOutput *output; - IDeckLinkAttributes *attributes; + IDeckLinkProfileAttributes *attributes; IDeckLinkKeyer *keyer; gchar *hw_serial_number; @@ -264,7 +270,7 @@ IDeckLink *device; IDeckLinkInput *input; IDeckLinkConfiguration *config; - IDeckLinkAttributes *attributes; + IDeckLinkProfileAttributes *attributes; gchar *hw_serial_number;
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkaudiosink.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkaudiosink.cpp
Changed
@@ -101,6 +101,8 @@ #define parent_class gst_decklink_audio_sink_parent_class G_DEFINE_TYPE (GstDecklinkAudioSink, gst_decklink_audio_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (decklinkaudiosink, "decklinkaudiosink", GST_RANK_NONE, + GST_TYPE_DECKLINK_AUDIO_SINK, decklink_element_init (plugin)); static void gst_decklink_audio_sink_class_init (GstDecklinkAudioSinkClass * klass) @@ -684,7 +686,8 @@ GST_TIME_ARGS (buffered_time), buffered_samples); { - GstClockTimeDiff buffered_ahead_of_clock_ahead = GST_CLOCK_DIFF (clock_ahead, buffered_time); + GstClockTimeDiff buffered_ahead_of_clock_ahead = + GST_CLOCK_DIFF (clock_ahead, buffered_time); GST_DEBUG_OBJECT (self, "driver is %" GST_STIME_FORMAT " ahead of the " "expected clock", GST_STIME_ARGS (buffered_ahead_of_clock_ahead)); @@ -693,9 +696,11 @@ * own synchronisation. It seems to count samples instead. */ /* FIXME: do we need to split buffers? */ if (buffered_ahead_of_clock_ahead > 0 && - buffered_ahead_of_clock_ahead > gst_base_sink_get_max_lateness (bsink)) { - GST_DEBUG_OBJECT (self, "Dropping buffer that is %" GST_STIME_FORMAT - " too late", GST_STIME_ARGS (buffered_ahead_of_clock_ahead)); + buffered_ahead_of_clock_ahead > + gst_base_sink_get_max_lateness (bsink)) { + GST_DEBUG_OBJECT (self, + "Dropping buffer that is %" GST_STIME_FORMAT " too late", + GST_STIME_ARGS (buffered_ahead_of_clock_ahead)); if (self->resampler) gst_audio_resampler_reset (self->resampler); flow_ret = GST_FLOW_OK; @@ -710,7 +715,8 @@ GST_DEBUG_OBJECT (self, "Buffered enough, wait for preroll or the clock or flushing. " - "Configured buffer time: %" GST_TIME_FORMAT, GST_TIME_ARGS (self->buffer_time)); + "Configured buffer time: %" GST_TIME_FORMAT, + GST_TIME_ARGS (self->buffer_time)); if (wait_time < self->buffer_time) wait_time = 0;
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkaudiosink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkaudiosink.h
Changed
@@ -67,6 +67,8 @@ GType gst_decklink_audio_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (decklinkaudiosink); + G_END_DECLS #endif /* __GST_DECKLINK_AUDIO_SINK_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkaudiosrc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkaudiosrc.cpp
Changed
@@ -125,6 +125,8 @@ static gboolean gst_decklink_audio_src_unlock (GstBaseSrc * bsrc); static gboolean gst_decklink_audio_src_unlock_stop (GstBaseSrc * bsrc); +static GstCaps *gst_decklink_audio_src_get_caps (GstBaseSrc * bsrc, + GstCaps * filter); static gboolean gst_decklink_audio_src_query (GstBaseSrc * bsrc, GstQuery * query); @@ -138,6 +140,8 @@ #define parent_class gst_decklink_audio_src_parent_class G_DEFINE_TYPE (GstDecklinkAudioSrc, gst_decklink_audio_src, GST_TYPE_PUSH_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (decklinkaudiosrc, "decklinkaudiosrc", GST_RANK_NONE, + GST_TYPE_DECKLINK_AUDIO_SRC, decklink_element_init (plugin)); static void gst_decklink_audio_src_class_init (GstDecklinkAudioSrcClass * klass) @@ -156,6 +160,7 @@ basesrc_class->query = GST_DEBUG_FUNCPTR (gst_decklink_audio_src_query); basesrc_class->negotiate = NULL; + basesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_decklink_audio_src_get_caps); basesrc_class->unlock = GST_DEBUG_FUNCPTR (gst_decklink_audio_src_unlock); basesrc_class->unlock_stop = GST_DEBUG_FUNCPTR (gst_decklink_audio_src_unlock_stop); @@ -237,6 +242,10 @@ self->current_packets = gst_queue_array_new_for_struct (sizeof (CapturePacket), DEFAULT_BUFFER_SIZE); + + self->skipped_last = 0; + self->skip_from_timestamp = GST_CLOCK_TIME_NONE; + self->skip_to_timestamp = GST_CLOCK_TIME_NONE; } void @@ -466,6 +475,10 @@ } gst_caps_unref (caps); + self->skipped_last = 0; + self->skip_from_timestamp = GST_CLOCK_TIME_NONE; + self->skip_to_timestamp = GST_CLOCK_TIME_NONE; + return TRUE; } @@ -477,7 +490,7 @@ gboolean no_signal) { GstDecklinkAudioSrc *self = GST_DECKLINK_AUDIO_SRC_CAST (element); - GstClockTime timestamp; + GstClockTime timestamp = GST_CLOCK_TIME_NONE; GST_LOG_OBJECT (self, "Got audio packet at %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT @@ -497,8 +510,10 @@ if (videosrc->first_time == GST_CLOCK_TIME_NONE) videosrc->first_time = stream_time; - if (videosrc->skip_first_time > 0 - && stream_time - videosrc->first_time < 
videosrc->skip_first_time) { + if (GST_CLOCK_TIME_IS_VALID (videosrc->first_time) && + GST_CLOCK_TIME_IS_VALID (stream_time) && + videosrc->skip_first_time > 0 && + stream_time - videosrc->first_time < videosrc->skip_first_time) { GST_DEBUG_OBJECT (self, "Skipping frame as requested: %" GST_TIME_FORMAT " < %" GST_TIME_FORMAT, GST_TIME_ARGS (stream_time), @@ -509,7 +524,7 @@ if (videosrc->output_stream_time) timestamp = stream_time; - else + else if (GST_CLOCK_TIME_IS_VALID (stream_time)) timestamp = gst_clock_adjust_with_calibration (NULL, stream_time, videosrc->current_time_mapping.xbase, videosrc->current_time_mapping.b, videosrc->current_time_mapping.num, @@ -526,25 +541,36 @@ if (!self->flushing) { CapturePacket p; guint skipped_packets = 0; - GstClockTime from_timestamp = GST_CLOCK_TIME_NONE; - GstClockTime to_timestamp = GST_CLOCK_TIME_NONE; while (gst_queue_array_get_length (self->current_packets) >= self->buffer_size) { CapturePacket *tmp = (CapturePacket *) gst_queue_array_pop_head_struct (self->current_packets); - if (skipped_packets == 0) - from_timestamp = tmp->timestamp; + if (skipped_packets == 0 && self->skipped_last == 0) + self->skip_from_timestamp = tmp->timestamp; skipped_packets++; - to_timestamp = tmp->timestamp; + self->skip_to_timestamp = tmp->timestamp; capture_packet_clear (tmp); } - if (skipped_packets > 0) - GST_WARNING_OBJECT (self, - "Dropped %u old packets from %" GST_TIME_FORMAT " to %" - GST_TIME_FORMAT, skipped_packets, GST_TIME_ARGS (from_timestamp), - GST_TIME_ARGS (to_timestamp)); + if (self->skipped_last == 0 && skipped_packets > 0) { + GST_WARNING_OBJECT (self, "Starting to drop audio packets"); + } + + if (skipped_packets == 0 && self->skipped_last > 0) { + GST_ELEMENT_WARNING_WITH_DETAILS (self, + STREAM, FAILED, + ("Dropped %u old packets from %" GST_TIME_FORMAT " to %" + GST_TIME_FORMAT, self->skipped_last, + GST_TIME_ARGS (self->skip_from_timestamp), + GST_TIME_ARGS (self->skip_to_timestamp)), + (NULL), + ("dropped", 
G_TYPE_UINT, self->skipped_last, + "from", G_TYPE_UINT64, self->skip_from_timestamp, + "to", G_TYPE_UINT64, self->skip_to_timestamp, NULL)); + self->skipped_last = 0; + } + self->skipped_last += skipped_packets; memset (&p, 0, sizeof (p)); p.packet = packet; @@ -604,12 +630,33 @@ sample_count = p.packet->GetSampleFrameCount (); data_size = self->info.bpf * sample_count; - if (p.timestamp == GST_CLOCK_TIME_NONE && self->next_offset == (guint64) - 1) { - GST_DEBUG_OBJECT (self, - "Got packet without timestamp before initial " - "timestamp after discont - dropping"); - capture_packet_clear (&p); - goto retry; + timestamp = p.timestamp; + + if (!GST_CLOCK_TIME_IS_VALID (timestamp)) { + if (self->next_offset == (guint64) - 1) { + GST_DEBUG_OBJECT (self, + "Got packet without timestamp before initial " + "timestamp after discont - dropping"); + capture_packet_clear (&p); + goto retry; + } else { + GST_INFO_OBJECT (self, "Unknown timestamp value"); + + /* Likely the case where IDeckLinkInputCallback::VideoInputFrameArrived() + * didn't provide video frame, so no reference video stream timestamp + * is available. It can happen as per SDK documentation + * under the following circumstances: + * - On Intensity Pro with progressive NTSC only, every video frame will + * have two audio packets. + * - With 3:2 pulldown there are five audio packets for each four + * video frames. + * - If video processing is not fast enough, audio will still be delivered + * + * Assume there was no packet drop from previous valid packet, and use + * previously calculated expected timestamp here */ + timestamp = gst_util_uint64_scale (self->next_offset, + GST_SECOND, self->info.rate); + } } ap = (AudioPacket *) g_malloc0 (sizeof (AudioPacket)); @@ -624,8 +671,6 @@ ap->input = self->input->input; ap->input->AddRef (); - timestamp = p.timestamp; - // Jitter and discontinuity handling, based on audiobasesrc start_time = timestamp; @@ -657,7 +702,9 @@ GST_SECOND); // Discont! 
- if (G_UNLIKELY (diff >= max_sample_diff)) { + if (self->alignment_threshold > 0 + && self->alignment_threshold != GST_CLOCK_TIME_NONE + && G_UNLIKELY (diff >= max_sample_diff)) { if (self->discont_wait > 0) { if (self->discont_time == GST_CLOCK_TIME_NONE) { self->discont_time = start_time; @@ -684,6 +731,8 @@ self->next_offset = end_offset; // Got a discont and adjusted, reset the discont_time marker. self->discont_time = GST_CLOCK_TIME_NONE; + } else if (self->alignment_threshold == 0) { + // Don't align, just pass through timestamps } else { // No discont, just keep counting timestamp = @@ -780,6 +829,42 @@ return flow_ret; } +static GstCaps * +gst_decklink_audio_src_get_caps (GstBaseSrc * bsrc, GstCaps * filter) +{ + GstDecklinkAudioSrc *self = GST_DECKLINK_AUDIO_SRC_CAST (bsrc); + GstCaps *caps, *template_caps; + const GstStructure *s; + gint channels; + + channels = self->channels; + if (channels == 0) + channels = self->channels_found; + + template_caps = gst_pad_get_pad_template_caps (GST_BASE_SRC_PAD (bsrc)); + if (channels == 0) { + caps = template_caps; + } else { + if (channels > 2) + s = gst_caps_get_structure (template_caps, 1); + else + s = gst_caps_get_structure (template_caps, 0); + + caps = gst_caps_new_full (gst_structure_copy (s), NULL); + gst_caps_set_simple (caps, "channels", G_TYPE_INT, channels, NULL); + gst_caps_unref (template_caps); + } + + if (filter) { + GstCaps *tmp = + gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = tmp; + } + + return caps; +} + static gboolean gst_decklink_audio_src_query (GstBaseSrc * bsrc, GstQuery * query) {
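When a capture packet arrives without a timestamp (no reference video frame was delivered, for the reasons the new comment in the diff above enumerates), the audio source now extrapolates the expected timestamp from the running sample offset instead of dropping the packet. The arithmetic is plain offset-to-time scaling, shown here as a sketch of the `gst_util_uint64_scale (self->next_offset, GST_SECOND, self->info.rate)` call:

```python
# Sketch of the fallback timestamp computation in the audiosrc diff
# above: extrapolate the expected timestamp from the next sample
# offset and the audio sample rate, assuming no packets were lost.
GST_SECOND = 1_000_000_000  # GStreamer clock times are in nanoseconds

def extrapolated_timestamp(next_offset, rate):
    """Integer scaling, as gst_util_uint64_scale does (rounds down)."""
    return next_offset * GST_SECOND // rate
```

For example, an offset of 48000 samples at a 48 kHz rate maps to exactly one second; the multiply-before-divide order preserves precision for offsets that do not divide evenly.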
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkaudiosrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkaudiosrc.h
Changed
@@ -81,6 +81,10 @@ GstClockTime discont_time; guint buffer_size; + + guint skipped_last; + GstClockTime skip_from_timestamp; + GstClockTime skip_to_timestamp; }; struct _GstDecklinkAudioSrcClass @@ -90,6 +94,8 @@ GType gst_decklink_audio_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (decklinkaudiosrc); + G_END_DECLS #endif /* __GST_DECKLINK_AUDIO_SRC_H__ */
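The `skipped_last`, `skip_from_timestamp`, and `skip_to_timestamp` fields added to the struct above let the source coalesce a burst of queue-overflow drops into a single element warning, emitted once the queue stops overflowing, instead of logging one warning per capture callback. The state machine can be sketched like this, where the `warn` callback stands in for the GST_ELEMENT_WARNING_WITH_DETAILS call in the .cpp diff:

```python
class DropAggregator:
    """Coalesce per-callback audio packet drops into one warning per burst."""

    def __init__(self, warn):
        self.warn = warn          # called as warn(count, from_ts, to_ts)
        self.skipped_last = 0     # drops accumulated across callbacks
        self.skip_from = None
        self.skip_to = None

    def on_capture(self, dropped_timestamps):
        """dropped_timestamps: timestamps evicted this callback (may be empty)."""
        skipped = 0
        for ts in dropped_timestamps:
            if skipped == 0 and self.skipped_last == 0:
                self.skip_from = ts   # first drop of a fresh burst
            skipped += 1
            self.skip_to = ts
        if skipped == 0 and self.skipped_last > 0:
            # Burst is over: report the whole dropped range exactly once.
            self.warn(self.skipped_last, self.skip_from, self.skip_to)
            self.skipped_last = 0
        self.skipped_last += skipped
```

Three callbacks dropping 2, then 1, then 0 packets thus produce a single warning covering all 3 drops, spanning the first and last dropped timestamps.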
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkdeviceprovider.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkdeviceprovider.cpp
Changed
@@ -26,6 +26,8 @@ G_DEFINE_TYPE (GstDecklinkDeviceProvider, gst_decklink_device_provider, GST_TYPE_DEVICE_PROVIDER); +GST_DEVICE_PROVIDER_REGISTER_DEFINE (decklinkdeviceprovider, "decklinkdeviceprovider", + GST_RANK_PRIMARY, GST_TYPE_DECKLINK_DEVICE_PROVIDER); static void gst_decklink_device_provider_init (GstDecklinkDeviceProvider * self)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkdeviceprovider.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkdeviceprovider.h
Changed
@@ -41,6 +41,7 @@ }; GType gst_decklink_device_provider_get_type (void); +GST_DEVICE_PROVIDER_REGISTER_DECLARE (decklinkdeviceprovider); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkplugin.cpp
Added
@@ -0,0 +1,52 @@ +/* GStreamer + * Copyright (C) 2011 David Schleef <ds@schleef.org> + * Copyright (C) 2014 Sebastian Dröge <sebastian@centricular.com> + * Copyright (C) 2015 Florian Langlois <florian.langlois@fr.thalesgroup.com> + * Copyright (C) 2020 Sohonet <dev@sohonet.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Suite 500, + * Boston, MA 02110-1335, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> + +#include "gstdecklinkaudiosink.h" +#include "gstdecklinkvideosink.h" +#include "gstdecklinkaudiosrc.h" +#include "gstdecklinkvideosrc.h" +#include "gstdecklinkdeviceprovider.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + GST_ELEMENT_REGISTER (decklinkaudiosink, plugin); + GST_ELEMENT_REGISTER (decklinkvideosink, plugin); + GST_ELEMENT_REGISTER (decklinkaudiosrc, plugin); + GST_ELEMENT_REGISTER (decklinkvideosrc, plugin); + + GST_DEVICE_PROVIDER_REGISTER (decklinkdeviceprovider, plugin); + + return TRUE; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + decklink, + "Blackmagic Decklink plugin", + plugin_init, VERSION, "LGPL", PACKAGE_NAME, GST_PACKAGE_ORIGIN)
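With the 1.20 per-element registration macros used above (GST_ELEMENT_REGISTER_DEFINE_WITH_CODE passing `decklink_element_init (plugin)`), whichever decklink element happens to register first runs the shared setup: the g_once_init_enter/g_once_init_leave pair in gstdecklink.cpp guarantees the debug category and plugin-API type marks are initialized exactly once, even if several elements register concurrently. A thread-safe Python stand-in for that once-init pattern (the GLib primitive itself is lock-free; a lock is used here for simplicity):

```python
import threading

class OnceInit:
    """Run a shared init function exactly once, like g_once_init_enter/leave."""

    def __init__(self, init_fn):
        self._init_fn = init_fn
        self._done = False
        self._lock = threading.Lock()

    def ensure(self):
        # Fast path: already initialised (g_once_init_enter returns FALSE).
        if self._done:
            return
        with self._lock:
            if not self._done:
                self._init_fn()    # debug category, plugin API marks, ...
                self._done = True  # g_once_init_leave publishes the result
```

Each element's registration path would call `ensure()` before doing its own work, so static registration (gst-inspect, the plugin's own plugin_init) and any future selective registration all share one initialization.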
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkvideosink.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkvideosink.cpp
Changed
@@ -238,7 +238,7 @@ PROP_MODE, PROP_DEVICE_NUMBER, PROP_VIDEO_FORMAT, - PROP_DUPLEX_MODE, + PROP_PROFILE_ID, PROP_TIMECODE_FORMAT, PROP_KEYER_MODE, PROP_KEYER_LEVEL, @@ -280,6 +280,8 @@ #define parent_class gst_decklink_video_sink_parent_class G_DEFINE_TYPE (GstDecklinkVideoSink, gst_decklink_video_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (decklinkvideosink, "decklinkvideosink", GST_RANK_NONE, + GST_TYPE_DECKLINK_VIDEO_SINK, decklink_element_init (plugin)); static gboolean reset_framerate (GstCapsFeatures * features, GstStructure * structure, @@ -341,17 +343,26 @@ (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT))); - g_object_class_install_property (gobject_class, PROP_DUPLEX_MODE, - g_param_spec_enum ("duplex-mode", "Duplex mode", - "Certain DeckLink devices such as the DeckLink Quad 2 and the " + /** + * GstDecklinkVideoSink:profile + * + * Specifies decklink profile to use. + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_PROFILE_ID, + g_param_spec_enum ("profile", "Profile", + "Certain DeckLink devices such as the DeckLink 8K Pro, the DeckLink " + "Quad 2 and the DeckLink Duo 2 support multiple profiles to " + "configure the capture and playback behavior of its sub-devices." + "For the DeckLink Duo 2 and DeckLink Quad 2, a profile is shared " + "between any 2 sub-devices that utilize the same connectors. For the " + "DeckLink 8K Pro, a profile is shared between all 4 sub-devices. Any " + "sub-devices that share a profile are considered to be part of the " + "same profile group." "DeckLink Duo 2 support configuration of the duplex mode of " - "individual sub-devices." - "A sub-device configured as full-duplex will use two connectors, " - "which allows simultaneous capture and playback, internal keying, " - "and fill & key scenarios." 
- "A half-duplex sub-device will use a single connector as an " - "individual capture or playback channel.", - GST_TYPE_DECKLINK_DUPLEX_MODE, GST_DECKLINK_DUPLEX_MODE_HALF, + "individual sub-devices.", + GST_TYPE_DECKLINK_PROFILE_ID, GST_DECKLINK_PROFILE_ID_DEFAULT, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT))); @@ -419,7 +430,7 @@ self->mode = GST_DECKLINK_MODE_NTSC; self->device_number = 0; self->video_format = GST_DECKLINK_VIDEO_FORMAT_8BIT_YUV; - self->duplex_mode = bmdDuplexModeHalf; + self->profile_id = GST_DECKLINK_PROFILE_ID_DEFAULT; /* VITC is legacy, we should expect RP188 in modern use cases */ self->timecode_format = bmdTimecodeRP188Any; self->caption_line = 0; @@ -457,10 +468,8 @@ break; } break; - case PROP_DUPLEX_MODE: - self->duplex_mode = - gst_decklink_duplex_mode_from_enum ((GstDecklinkDuplexMode) - g_value_get_enum (value)); + case PROP_PROFILE_ID: + self->profile_id = (GstDecklinkProfileId) g_value_get_enum (value); break; case PROP_TIMECODE_FORMAT: self->timecode_format = @@ -503,9 +512,8 @@ case PROP_VIDEO_FORMAT: g_value_set_enum (value, self->video_format); break; - case PROP_DUPLEX_MODE: - g_value_set_enum (value, - gst_decklink_duplex_mode_to_enum (self->duplex_mode)); + case PROP_PROFILE_ID: + g_value_set_enum (value, self->profile_id); break; case PROP_TIMECODE_FORMAT: g_value_set_enum (value, @@ -1617,7 +1625,8 @@ self->internal_time_offset = self->internal_base_time; } else if (GST_CLOCK_TIME_IS_VALID (self->internal_pause_time)) { self->internal_time_offset += - gst_clock_get_internal_time (self->output->clock) - self->internal_pause_time; + gst_clock_get_internal_time (self->output->clock) - + self->internal_pause_time; } GST_INFO_OBJECT (self, "clock has been set to %" GST_PTR_FORMAT @@ -1681,7 +1690,8 @@ case GST_STATE_CHANGE_PAUSED_TO_PLAYING: break; case GST_STATE_CHANGE_PLAYING_TO_PAUSED: - self->internal_pause_time = gst_clock_get_internal_time (self->output->clock); + 
self->internal_pause_time = + gst_clock_get_internal_time (self->output->clock); break; default: break;
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkvideosink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkvideosink.h
Changed
@@ -52,7 +52,7 @@ GstDecklinkModeEnum mode; gint device_number; GstDecklinkVideoFormat video_format; - BMDDuplexMode duplex_mode; + GstDecklinkProfileId profile_id; BMDTimecodeFormat timecode_format; BMDKeyerMode keyer_mode; gint keyer_level; @@ -84,6 +84,8 @@ GType gst_decklink_video_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (decklinkvideosink); + void gst_decklink_video_sink_convert_to_internal_clock (GstDecklinkVideoSink * self, GstClockTime * timestamp, GstClockTime * duration);
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkvideosrc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkvideosrc.cpp
Changed
@@ -162,7 +162,7 @@ PROP_DEVICE_NUMBER, PROP_BUFFER_SIZE, PROP_VIDEO_FORMAT, - PROP_DUPLEX_MODE, + PROP_PROFILE_ID, PROP_TIMECODE_FORMAT, PROP_OUTPUT_STREAM_TIME, PROP_SKIP_FIRST_TIME, @@ -223,6 +223,8 @@ gst_decklink_video_src_change_state (GstElement * element, GstStateChange transition); +static GstCaps *gst_decklink_video_src_get_caps (GstBaseSrc * bsrc, + GstCaps * filter); static gboolean gst_decklink_video_src_query (GstBaseSrc * bsrc, GstQuery * query); static gboolean gst_decklink_video_src_unlock (GstBaseSrc * bsrc); @@ -240,6 +242,8 @@ #define parent_class gst_decklink_video_src_parent_class G_DEFINE_TYPE (GstDecklinkVideoSrc, gst_decklink_video_src, GST_TYPE_PUSH_SRC); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (decklinkvideosrc, "decklinkvideosrc", GST_RANK_NONE, + GST_TYPE_DECKLINK_VIDEO_SRC, decklink_element_init (plugin)); static void gst_decklink_video_src_class_init (GstDecklinkVideoSrcClass * klass) @@ -259,6 +263,7 @@ basesrc_class->query = GST_DEBUG_FUNCPTR (gst_decklink_video_src_query); basesrc_class->negotiate = NULL; + basesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_decklink_video_src_get_caps); basesrc_class->unlock = GST_DEBUG_FUNCPTR (gst_decklink_video_src_unlock); basesrc_class->unlock_stop = GST_DEBUG_FUNCPTR (gst_decklink_video_src_unlock_stop); @@ -298,17 +303,26 @@ (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT))); - g_object_class_install_property (gobject_class, PROP_DUPLEX_MODE, - g_param_spec_enum ("duplex-mode", "Duplex mode", - "Certain DeckLink devices such as the DeckLink Quad 2 and the " + /** + * GstDecklinkVideoSrc:profile + * + * Specifies decklink profile to use. 
+ * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_PROFILE_ID, + g_param_spec_enum ("profile", "Profile", + "Certain DeckLink devices such as the DeckLink 8K Pro, the DeckLink " + "Quad 2 and the DeckLink Duo 2 support multiple profiles to " + "configure the capture and playback behavior of its sub-devices." + "For the DeckLink Duo 2 and DeckLink Quad 2, a profile is shared " + "between any 2 sub-devices that utilize the same connectors. For the " + "DeckLink 8K Pro, a profile is shared between all 4 sub-devices. Any " + "sub-devices that share a profile are considered to be part of the " + "same profile group." "DeckLink Duo 2 support configuration of the duplex mode of " - "individual sub-devices." - "A sub-device configured as full-duplex will use two connectors, " - "which allows simultaneous capture and playback, internal keying, " - "and fill & key scenarios." - "A half-duplex sub-device will use a single connector as an " - "individual capture or playback channel.", - GST_TYPE_DECKLINK_DUPLEX_MODE, GST_DECKLINK_DUPLEX_MODE_HALF, + "individual sub-devices.", + GST_TYPE_DECKLINK_PROFILE_ID, GST_DECKLINK_PROFILE_ID_DEFAULT, (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS | G_PARAM_CONSTRUCT))); @@ -384,7 +398,7 @@ self->device_number = 0; self->buffer_size = DEFAULT_BUFFER_SIZE; self->video_format = GST_DECKLINK_VIDEO_FORMAT_AUTO; - self->duplex_mode = bmdDuplexModeHalf; + self->profile_id = GST_DECKLINK_PROFILE_ID_DEFAULT; self->timecode_format = bmdTimecodeRP188Any; self->signal_state = SIGNAL_STATE_UNKNOWN; self->output_stream_time = DEFAULT_OUTPUT_STREAM_TIME; @@ -399,6 +413,9 @@ self->window_fill = 0; self->window_skip = 1; self->window_skip_count = 0; + self->skipped_last = 0; + self->skip_from_timestamp = GST_CLOCK_TIME_NONE; + self->skip_to_timestamp = GST_CLOCK_TIME_NONE; gst_base_src_set_live (GST_BASE_SRC (self), TRUE); gst_base_src_set_format (GST_BASE_SRC (self), GST_FORMAT_TIME); @@ -456,10 +473,8 @@ break; } 
break; - case PROP_DUPLEX_MODE: - self->duplex_mode = - gst_decklink_duplex_mode_from_enum ((GstDecklinkDuplexMode) - g_value_get_enum (value)); + case PROP_PROFILE_ID: + self->profile_id = (GstDecklinkProfileId) g_value_get_enum (value); break; case PROP_TIMECODE_FORMAT: self->timecode_format = @@ -509,9 +524,8 @@ case PROP_VIDEO_FORMAT: g_value_set_enum (value, self->video_format); break; - case PROP_DUPLEX_MODE: - g_value_set_enum (value, - gst_decklink_duplex_mode_to_enum (self->duplex_mode)); + case PROP_PROFILE_ID: + g_value_set_enum (value, self->profile_id); break; case PROP_TIMECODE_FORMAT: g_value_set_enum (value, @@ -649,6 +663,11 @@ self->input->start_streams (self->input->videosrc); g_mutex_unlock (&self->input->lock); + self->skipped_last = 0; + self->skip_from_timestamp = GST_CLOCK_TIME_NONE; + self->skip_to_timestamp = GST_CLOCK_TIME_NONE; + self->aspect_ratio_flag = -1; + return TRUE; } @@ -828,27 +847,39 @@ GstVideoTimeCodeFlags flags = GST_VIDEO_TIME_CODE_FLAGS_NONE; guint field_count = 0; guint skipped_frames = 0; - GstClockTime from_timestamp = GST_CLOCK_TIME_NONE; - GstClockTime to_timestamp = GST_CLOCK_TIME_NONE; while (gst_queue_array_get_length (self->current_frames) >= self->buffer_size) { CaptureFrame *tmp = (CaptureFrame *) gst_queue_array_pop_head_struct (self->current_frames); if (tmp->frame) { - if (skipped_frames == 0) - from_timestamp = tmp->timestamp; + if (skipped_frames == 0 && self->skipped_last == 0) + self->skip_from_timestamp = tmp->timestamp; skipped_frames++; - to_timestamp = tmp->timestamp; + self->skip_to_timestamp = tmp->timestamp; } capture_frame_clear (tmp); } - if (skipped_frames > 0) - GST_WARNING_OBJECT (self, - "Dropped %u old frames from %" GST_TIME_FORMAT " to %" - GST_TIME_FORMAT, skipped_frames, GST_TIME_ARGS (from_timestamp), - GST_TIME_ARGS (to_timestamp)); + if (self->skipped_last == 0 && skipped_frames > 0) { + GST_WARNING_OBJECT (self, "Starting to drop frames"); + } + + if (skipped_frames == 0 && 
self->skipped_last > 0) { + GST_ELEMENT_WARNING_WITH_DETAILS (self, + STREAM, FAILED, + ("Dropped %u old frames from %" GST_TIME_FORMAT " to %" + GST_TIME_FORMAT, self->skipped_last, + GST_TIME_ARGS (self->skip_from_timestamp), + GST_TIME_ARGS (self->skip_to_timestamp)), + (NULL), + ("dropped", G_TYPE_UINT, self->skipped_last, + "from", G_TYPE_UINT64, self->skip_from_timestamp, + "to", G_TYPE_UINT64, self->skip_to_timestamp, NULL)); + self->skipped_last = 0; + } + + self->skipped_last += skipped_frames; memset (&f, 0, sizeof (f)); f.frame = frame; @@ -979,11 +1010,13 @@ "Adding AFD/Bar meta to buffer for line %u", field2_offset + line); GST_MEMDUMP_OBJECT (self, "AFD/Bar", gstanc.data, gstanc.data_count); - if (gstanc.data_count < 16) { + if (gstanc.data_count < 8) { GST_WARNING_OBJECT (self, "AFD/Bar data too small"); continue; } + self->aspect_ratio_flag = (gstanc.data[0] >> 2) & 0x1; + afd = (GstVideoAFDValue) ((gstanc.data[0] >> 3) & 0xf); is_letterbox = ((gstanc.data[3] >> 4) & 0x3) == 0; bar1 = GST_READ_UINT16_BE (&gstanc.data[4]); @@ -1216,9 +1249,71 @@ // If we're not flushing, we should have a valid frame from the queue g_assert (f.frame != NULL); + // Create output buffer + f.frame->GetBytes ((gpointer *) & data); + data_size = f.frame->GetHeight() * f.frame->GetRowBytes(); + + vf = (VideoFrame *) g_malloc0 (sizeof (VideoFrame)); + + *buffer = + gst_buffer_new_wrapped_full ((GstMemoryFlags) GST_MEMORY_FLAG_READONLY, + (gpointer) data, data_size, 0, data_size, vf, + (GDestroyNotify) video_frame_free); + + vf->frame = f.frame; + f.frame->AddRef (); + vf->input = self->input->input; + vf->input->AddRef (); + + // Reset aspect ratio flag if the mode has changed. The new mode might not + // have AFD/Bar VANC. + if (self->caps_mode != f.mode) { + self->aspect_ratio_flag = -1; + } + // If we have a format that supports VANC and we are asked to extract CC, + // then do it here. 
+ if ((self->output_cc || self->output_afd_bar) + && self->signal_state != SIGNAL_STATE_LOST) + extract_vbi (self, buffer, vf); + if (!gst_pad_has_current_caps (GST_BASE_SRC_PAD (self))) { caps_changed = TRUE; } + // If there was AFD information with the aspect ratio flag set and the mode + // is auto then we have to switch from normal NTSC/PAL to the widescreen + // variants + if (self->aspect_ratio_flag != -1 && self->mode == GST_DECKLINK_MODE_AUTO) { + switch (f.mode) { + case GST_DECKLINK_MODE_NTSC: + f.mode = + self->aspect_ratio_flag == + 1 ? GST_DECKLINK_MODE_NTSC_WIDESCREEN : GST_DECKLINK_MODE_NTSC; + break; + case GST_DECKLINK_MODE_NTSC_P: + f.mode = + self->aspect_ratio_flag == + 1 ? GST_DECKLINK_MODE_NTSC_P_WIDESCREEN : GST_DECKLINK_MODE_NTSC_P; + break; + case GST_DECKLINK_MODE_NTSC2398: + f.mode = + self->aspect_ratio_flag == + 1 ? GST_DECKLINK_MODE_NTSC2398_WIDESCREEN : + GST_DECKLINK_MODE_NTSC2398; + break; + case GST_DECKLINK_MODE_PAL: + f.mode = + self->aspect_ratio_flag == + 1 ? GST_DECKLINK_MODE_PAL_WIDESCREEN : GST_DECKLINK_MODE_PAL; + break; + case GST_DECKLINK_MODE_PAL_P: + f.mode = + self->aspect_ratio_flag == + 1 ? 
GST_DECKLINK_MODE_PAL_P_WIDESCREEN : GST_DECKLINK_MODE_PAL_P; + break; + default: + break; + } + } if (self->caps_mode != f.mode) { if (self->mode == GST_DECKLINK_MODE_AUTO @@ -1233,6 +1328,7 @@ ("Invalid mode in captured frame"), ("Mode set to %d but captured %d", self->caps_mode, f.mode)); capture_frame_clear (&f); + gst_clear_buffer (buffer); return GST_FLOW_NOT_NEGOTIATED; } } @@ -1249,6 +1345,7 @@ ("Invalid pixel format in captured frame"), ("Format set to %d but captured %d", self->caps_format, f.format)); capture_frame_clear (&f); + gst_clear_buffer (buffer); return GST_FLOW_NOT_NEGOTIATED; } } @@ -1298,27 +1395,6 @@ } } - f.frame->GetBytes ((gpointer *) & data); - data_size = self->info.size; - - vf = (VideoFrame *) g_malloc0 (sizeof (VideoFrame)); - - *buffer = - gst_buffer_new_wrapped_full ((GstMemoryFlags) GST_MEMORY_FLAG_READONLY, - (gpointer) data, data_size, 0, data_size, vf, - (GDestroyNotify) video_frame_free); - - vf->frame = f.frame; - f.frame->AddRef (); - vf->input = self->input->input; - vf->input->AddRef (); - - // If we have a format that supports VANC and we are asked to extract CC, - // then do it here. 
- if ((self->output_cc || self->output_afd_bar) - && self->signal_state != SIGNAL_STATE_LOST) - extract_vbi (self, buffer, vf); - if (f.no_signal) GST_BUFFER_FLAG_SET (*buffer, GST_BUFFER_FLAG_GAP); GST_BUFFER_TIMESTAMP (*buffer) = f.timestamp; @@ -1349,6 +1425,31 @@ return flow_ret; } +static GstCaps * +gst_decklink_video_src_get_caps (GstBaseSrc * bsrc, GstCaps * filter) +{ + GstDecklinkVideoSrc *self = GST_DECKLINK_VIDEO_SRC_CAST (bsrc); + GstCaps *caps; + + if (self->mode != GST_DECKLINK_MODE_AUTO) { + caps = gst_decklink_mode_get_caps (self->mode, self->caps_format, TRUE); + } else if (self->caps_mode != GST_DECKLINK_MODE_AUTO) { + caps = + gst_decklink_mode_get_caps (self->caps_mode, self->caps_format, TRUE); + } else { + caps = gst_pad_get_pad_template_caps (GST_BASE_SRC_PAD (bsrc)); + } + + if (filter) { + GstCaps *tmp = + gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = tmp; + } + + return caps; +} + static gboolean gst_decklink_video_src_query (GstBaseSrc * bsrc, GstQuery * query) {
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/gstdecklinkvideosrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/gstdecklinkvideosrc.h
Changed
@@ -58,6 +58,7 @@ GstDecklinkModeEnum mode; GstDecklinkModeEnum caps_mode; + gint aspect_ratio_flag; /* -1 when unknown, 0 not set, 1 set */ BMDPixelFormat caps_format; GstDecklinkConnectionEnum connection; gint device_number; @@ -71,7 +72,7 @@ GstVideoInfo info; GstDecklinkVideoFormat video_format; - BMDDuplexMode duplex_mode; + GstDecklinkProfileId profile_id; BMDTimecodeFormat timecode_format; GstDecklinkInput *input; @@ -111,6 +112,10 @@ gboolean output_afd_bar; gint last_afd_bar_vbi_line; gint last_afd_bar_vbi_line_field2; + + guint skipped_last; + GstClockTime skip_from_timestamp; + GstClockTime skip_to_timestamp; }; struct _GstDecklinkVideoSrcClass @@ -120,6 +125,8 @@ GType gst_decklink_video_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (decklinkvideosrc); + G_END_DECLS #endif /* __GST_DECKLINK_VIDEO_SRC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPI.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPI.h
Changed
@@ -1,5 +1,5 @@ /* -LICENSE-START- -** Copyright (c) 2018 Blackmagic Design +** Copyright (c) 2019 Blackmagic Design ** ** Permission is hereby granted, free of charge, to any person or organization ** obtaining a copy of the software and accompanying documentation covered by @@ -66,10 +66,10 @@ BMD_CONST REFIID IID_IDeckLinkAudioOutputCallback = /* 403C681B-7F46-4A12-B993-2BB127084EE6 */ {0x40,0x3C,0x68,0x1B,0x7F,0x46,0x4A,0x12,0xB9,0x93,0x2B,0xB1,0x27,0x08,0x4E,0xE6}; BMD_CONST REFIID IID_IDeckLinkIterator = /* 50FB36CD-3063-4B73-BDBB-958087F2D8BA */ {0x50,0xFB,0x36,0xCD,0x30,0x63,0x4B,0x73,0xBD,0xBB,0x95,0x80,0x87,0xF2,0xD8,0xBA}; BMD_CONST REFIID IID_IDeckLinkAPIInformation = /* 7BEA3C68-730D-4322-AF34-8A7152B532A4 */ {0x7B,0xEA,0x3C,0x68,0x73,0x0D,0x43,0x22,0xAF,0x34,0x8A,0x71,0x52,0xB5,0x32,0xA4}; -BMD_CONST REFIID IID_IDeckLinkOutput = /* CC5C8A6E-3F2F-4B3A-87EA-FD78AF300564 */ {0xCC,0x5C,0x8A,0x6E,0x3F,0x2F,0x4B,0x3A,0x87,0xEA,0xFD,0x78,0xAF,0x30,0x05,0x64}; -BMD_CONST REFIID IID_IDeckLinkInput = /* AF22762B-DFAC-4846-AA79-FA8883560995 */ {0xAF,0x22,0x76,0x2B,0xDF,0xAC,0x48,0x46,0xAA,0x79,0xFA,0x88,0x83,0x56,0x09,0x95}; +BMD_CONST REFIID IID_IDeckLinkOutput = /* 065A0F6C-C508-4D0D-B919-F5EB0EBFC96B */ {0x06,0x5A,0x0F,0x6C,0xC5,0x08,0x4D,0x0D,0xB9,0x19,0xF5,0xEB,0x0E,0xBF,0xC9,0x6B}; +BMD_CONST REFIID IID_IDeckLinkInput = /* 2A88CF76-F494-4216-A7EF-DC74EEB83882 */ {0x2A,0x88,0xCF,0x76,0xF4,0x94,0x42,0x16,0xA7,0xEF,0xDC,0x74,0xEE,0xB8,0x38,0x82}; BMD_CONST REFIID IID_IDeckLinkHDMIInputEDID = /* ABBBACBC-45BC-4665-9D92-ACE6E5A97902 */ {0xAB,0xBB,0xAC,0xBC,0x45,0xBC,0x46,0x65,0x9D,0x92,0xAC,0xE6,0xE5,0xA9,0x79,0x02}; -BMD_CONST REFIID IID_IDeckLinkEncoderInput = /* 270587DA-6B7D-42E7-A1F0-6D853F581185 */ {0x27,0x05,0x87,0xDA,0x6B,0x7D,0x42,0xE7,0xA1,0xF0,0x6D,0x85,0x3F,0x58,0x11,0x85}; +BMD_CONST REFIID IID_IDeckLinkEncoderInput = /* F222551D-13DF-4FD8-B587-9D4F19EC12C9 */ {0xF2,0x22,0x55,0x1D,0x13,0xDF,0x4F,0xD8,0xB5,0x87,0x9D,0x4F,0x19,0xEC,0x12,0xC9}; 
BMD_CONST REFIID IID_IDeckLinkVideoFrame = /* 3F716FE0-F023-4111-BE5D-EF4414C05B17 */ {0x3F,0x71,0x6F,0xE0,0xF0,0x23,0x41,0x11,0xBE,0x5D,0xEF,0x44,0x14,0xC0,0x5B,0x17}; BMD_CONST REFIID IID_IDeckLinkMutableVideoFrame = /* 69E2639F-40DA-4E19-B6F2-20ACE815C390 */ {0x69,0xE2,0x63,0x9F,0x40,0xDA,0x4E,0x19,0xB6,0xF2,0x20,0xAC,0xE8,0x15,0xC3,0x90}; BMD_CONST REFIID IID_IDeckLinkVideoFrame3DExtensions = /* DA0F7E4A-EDC7-48A8-9CDD-2DB51C729CD7 */ {0xDA,0x0F,0x7E,0x4A,0xED,0xC7,0x48,0xA8,0x9C,0xDD,0x2D,0xB5,0x1C,0x72,0x9C,0xD7}; @@ -87,8 +87,12 @@ BMD_CONST REFIID IID_IDeckLinkScreenPreviewCallback = /* B1D3F49A-85FE-4C5D-95C8-0B5D5DCCD438 */ {0xB1,0xD3,0xF4,0x9A,0x85,0xFE,0x4C,0x5D,0x95,0xC8,0x0B,0x5D,0x5D,0xCC,0xD4,0x38}; BMD_CONST REFIID IID_IDeckLinkGLScreenPreviewHelper = /* 504E2209-CAC7-4C1A-9FB4-C5BB6274D22F */ {0x50,0x4E,0x22,0x09,0xCA,0xC7,0x4C,0x1A,0x9F,0xB4,0xC5,0xBB,0x62,0x74,0xD2,0x2F}; BMD_CONST REFIID IID_IDeckLinkNotificationCallback = /* B002A1EC-070D-4288-8289-BD5D36E5FF0D */ {0xB0,0x02,0xA1,0xEC,0x07,0x0D,0x42,0x88,0x82,0x89,0xBD,0x5D,0x36,0xE5,0xFF,0x0D}; -BMD_CONST REFIID IID_IDeckLinkNotification = /* 0A1FB207-E215-441B-9B19-6FA1575946C5 */ {0x0A,0x1F,0xB2,0x07,0xE2,0x15,0x44,0x1B,0x9B,0x19,0x6F,0xA1,0x57,0x59,0x46,0xC5}; -BMD_CONST REFIID IID_IDeckLinkAttributes = /* ABC11843-D966-44CB-96E2-A1CB5D3135C4 */ {0xAB,0xC1,0x18,0x43,0xD9,0x66,0x44,0xCB,0x96,0xE2,0xA1,0xCB,0x5D,0x31,0x35,0xC4}; +BMD_CONST REFIID IID_IDeckLinkNotification = /* B85DF4C8-BDF5-47C1-8064-28162EBDD4EB */ {0xB8,0x5D,0xF4,0xC8,0xBD,0xF5,0x47,0xC1,0x80,0x64,0x28,0x16,0x2E,0xBD,0xD4,0xEB}; +BMD_CONST REFIID IID_IDeckLinkProfileAttributes = /* 17D4BF8E-4911-473A-80A0-731CF6FF345B */ {0x17,0xD4,0xBF,0x8E,0x49,0x11,0x47,0x3A,0x80,0xA0,0x73,0x1C,0xF6,0xFF,0x34,0x5B}; +BMD_CONST REFIID IID_IDeckLinkProfileIterator = /* 29E5A8C0-8BE4-46EB-93AC-31DAAB5B7BF2 */ {0x29,0xE5,0xA8,0xC0,0x8B,0xE4,0x46,0xEB,0x93,0xAC,0x31,0xDA,0xAB,0x5B,0x7B,0xF2}; +BMD_CONST REFIID IID_IDeckLinkProfile = /* 
16093466-674A-432B-9DA0-1AC2C5A8241C */ {0x16,0x09,0x34,0x66,0x67,0x4A,0x43,0x2B,0x9D,0xA0,0x1A,0xC2,0xC5,0xA8,0x24,0x1C}; +BMD_CONST REFIID IID_IDeckLinkProfileCallback = /* A4F9341E-97AA-4E04-8935-15F809898CEA */ {0xA4,0xF9,0x34,0x1E,0x97,0xAA,0x4E,0x04,0x89,0x35,0x15,0xF8,0x09,0x89,0x8C,0xEA}; +BMD_CONST REFIID IID_IDeckLinkProfileManager = /* 30D41429-3998-4B6D-84F8-78C94A797C6E */ {0x30,0xD4,0x14,0x29,0x39,0x98,0x4B,0x6D,0x84,0xF8,0x78,0xC9,0x4A,0x79,0x7C,0x6E}; BMD_CONST REFIID IID_IDeckLinkStatus = /* 5F558200-4028-49BC-BEAC-DB3FA4A96E46 */ {0x5F,0x55,0x82,0x00,0x40,0x28,0x49,0xBC,0xBE,0xAC,0xDB,0x3F,0xA4,0xA9,0x6E,0x46}; BMD_CONST REFIID IID_IDeckLinkKeyer = /* 89AFCAF5-65F8-421E-98F7-96FE5F5BFBA3 */ {0x89,0xAF,0xCA,0xF5,0x65,0xF8,0x42,0x1E,0x98,0xF7,0x96,0xFE,0x5F,0x5B,0xFB,0xA3}; BMD_CONST REFIID IID_IDeckLinkVideoConversion = /* 3BBCB8A2-DA2C-42D9-B5D8-88083644E99A */ {0x3B,0xBC,0xB8,0xA2,0xDA,0x2C,0x42,0xD9,0xB5,0xD8,0x88,0x08,0x36,0x44,0xE9,0x9A}; @@ -103,7 +107,21 @@ bmdVideoOutputVANC = 1 << 0, bmdVideoOutputVITC = 1 << 1, bmdVideoOutputRP188 = 1 << 2, - bmdVideoOutputDualStream3D = 1 << 4 + bmdVideoOutputDualStream3D = 1 << 4, + bmdVideoOutputSynchronizeToPlaybackGroup = 1 << 6 +}; + +/* Enum BMDSupportedVideoModeFlags - Flags to describe supported video mode */ + +typedef uint32_t BMDSupportedVideoModeFlags; +enum _BMDSupportedVideoModeFlags { + bmdSupportedVideoModeDefault = 0, + bmdSupportedVideoModeKeying = 1 << 0, + bmdSupportedVideoModeDualStream3D = 1 << 1, + bmdSupportedVideoModeSDISingleLink = 1 << 2, + bmdSupportedVideoModeSDIDualLink = 1 << 3, + bmdSupportedVideoModeSDIQuadLink = 1 << 4, + bmdSupportedVideoModeInAnyProfile = 1 << 5 }; /* Enum BMDPacketType - Type of packet */ @@ -135,7 +153,8 @@ enum _BMDVideoInputFlags { bmdVideoInputFlagDefault = 0, bmdVideoInputEnableFormatDetection = 1 << 0, - bmdVideoInputDualStream3D = 1 << 1 + bmdVideoInputDualStream3D = 1 << 1, + bmdVideoInputSynchronizeToCaptureGroup = 1 << 2 }; /* Enum 
BMDVideoInputFormatChangedEvents - Bitmask passed to the VideoInputFormatChanged notification to identify the properties of the input signal that have changed */ @@ -214,15 +233,6 @@ bmdAudioOutputStreamTimestamped }; -/* Enum BMDDisplayModeSupport - Output mode supported flags */ - -typedef uint32_t BMDDisplayModeSupport; -enum _BMDDisplayModeSupport { - bmdDisplayModeNotSupported = 0, - bmdDisplayModeSupported, - bmdDisplayModeSupportedWithConversion -}; - /* Enum BMDAncillaryPacketFormat - Ancillary packet format */ typedef uint32_t BMDAncillaryPacketFormat; @@ -239,7 +249,8 @@ bmdTimecodeRP188VITC1 = /* 'rpv1' */ 0x72707631, // RP188 timecode where DBB1 equals VITC1 (line 9) bmdTimecodeRP188VITC2 = /* 'rp12' */ 0x72703132, // RP188 timecode where DBB1 equals VITC2 (line 9 for progressive or line 571 for interlaced/PsF) bmdTimecodeRP188LTC = /* 'rplt' */ 0x72706C74, // RP188 timecode where DBB1 equals LTC (line 10) - bmdTimecodeRP188Any = /* 'rp18' */ 0x72703138, // For capture: return the first valid timecode in {VITC1, LTC ,VITC2} - For playback: set the timecode as VITC1 + bmdTimecodeRP188HighFrameRate = /* 'rphr' */ 0x72706872, // RP188 timecode where DBB1 is an HFRTC (SMPTE ST 12-3), the only timecode allowing the frame value to go above 30 + bmdTimecodeRP188Any = /* 'rp18' */ 0x72703138, // Convenience for capture, returning the first valid timecode in {HFRTC (if supported), VITC1, LTC, VITC2} bmdTimecodeVITC = /* 'vitc' */ 0x76697463, bmdTimecodeVITCField2 = /* 'vit2' */ 0x76697432, bmdTimecodeSerial = /* 'seri' */ 0x73657269 @@ -384,8 +395,6 @@ bmdDeckLinkFrameMetadataHDRElectroOpticalTransferFunc = /* 'eotf' */ 0x656F7466, // EOTF in range 0-7 as per CEA 861.3 bmdDeckLinkFrameMetadataCintelFilmType = /* 'cfty' */ 0x63667479, // Current film type bmdDeckLinkFrameMetadataCintelFilmGauge = /* 'cfga' */ 0x63666761, // Current film gauge - bmdDeckLinkFrameMetadataCintelOffsetDetectedHorizontal = /* 'odfh' */ 0x6F646668, // Horizontal offset (pixels) detected 
in image - bmdDeckLinkFrameMetadataCintelOffsetDetectedVertical = /* 'odfv' */ 0x6F646676, // Vertical offset (pixels) detected in image bmdDeckLinkFrameMetadataCintelKeykodeLow = /* 'ckkl' */ 0x636B6B6C, // Raw keykode value - low 64 bits bmdDeckLinkFrameMetadataCintelKeykodeHigh = /* 'ckkh' */ 0x636B6B68, // Raw keykode value - high 64 bits bmdDeckLinkFrameMetadataCintelTile1Size = /* 'ct1s' */ 0x63743173, // Size in bytes of compressed raw tile 1 @@ -432,15 +441,30 @@ bmdDeckLinkFrameMetadataCintelGainBlue = /* 'LfBl' */ 0x4C66426C, // Blue gain parameter to apply after log bmdDeckLinkFrameMetadataCintelLiftRed = /* 'GnRd' */ 0x476E5264, // Red lift parameter to apply after log and gain bmdDeckLinkFrameMetadataCintelLiftGreen = /* 'GnGr' */ 0x476E4772, // Green lift parameter to apply after log and gain - bmdDeckLinkFrameMetadataCintelLiftBlue = /* 'GnBl' */ 0x476E426C // Blue lift parameter to apply after log and gain + bmdDeckLinkFrameMetadataCintelLiftBlue = /* 'GnBl' */ 0x476E426C, // Blue lift parameter to apply after log and gain + bmdDeckLinkFrameMetadataCintelHDRGainRed = /* 'HGRd' */ 0x48475264, // Red gain parameter to apply to linear data for HDR Combination + bmdDeckLinkFrameMetadataCintelHDRGainGreen = /* 'HGGr' */ 0x48474772, // Green gain parameter to apply to linear data for HDR Combination + bmdDeckLinkFrameMetadataCintelHDRGainBlue = /* 'HGBl' */ 0x4847426C // Blue gain parameter to apply to linear data for HDR Combination }; -/* Enum BMDDuplexMode - Duplex for configurable ports */ +/* Enum BMDProfileID - Identifies a profile */ -typedef uint32_t BMDDuplexMode; -enum _BMDDuplexMode { - bmdDuplexModeFull = /* 'fdup' */ 0x66647570, - bmdDuplexModeHalf = /* 'hdup' */ 0x68647570 +typedef uint32_t BMDProfileID; +enum _BMDProfileID { + bmdProfileOneSubDeviceFullDuplex = /* '1dfd' */ 0x31646664, + bmdProfileOneSubDeviceHalfDuplex = /* '1dhd' */ 0x31646864, + bmdProfileTwoSubDevicesFullDuplex = /* '2dfd' */ 0x32646664, + 
bmdProfileTwoSubDevicesHalfDuplex = /* '2dhd' */ 0x32646864, + bmdProfileFourSubDevicesHalfDuplex = /* '4dhd' */ 0x34646864 +}; + +/* Enum BMDHDMITimecodePacking - Packing form of timecode on HDMI */ + +typedef uint32_t BMDHDMITimecodePacking; +enum _BMDHDMITimecodePacking { + bmdHDMITimecodePackingIEEEOUI000085 = 0x00008500, + bmdHDMITimecodePackingIEEEOUI080046 = 0x08004601, + bmdHDMITimecodePackingIEEEOUI5CF9F0 = 0x5CF9F003 }; /* Enum BMDDeckLinkAttributeID - DeckLink Attribute ID */ @@ -452,7 +476,6 @@ BMDDeckLinkSupportsInternalKeying = /* 'keyi' */ 0x6B657969, BMDDeckLinkSupportsExternalKeying = /* 'keye' */ 0x6B657965, - BMDDeckLinkSupportsHDKeying = /* 'keyh' */ 0x6B657968, BMDDeckLinkSupportsInputFormatDetection = /* 'infd' */ 0x696E6664, BMDDeckLinkHasReferenceInput = /* 'hrin' */ 0x6872696E, BMDDeckLinkHasSerialPort = /* 'hspt' */ 0x68737074, @@ -461,16 +484,19 @@ BMDDeckLinkHasVideoInputAntiAliasingFilter = /* 'aafl' */ 0x6161666C, BMDDeckLinkHasBypass = /* 'byps' */ 0x62797073, BMDDeckLinkSupportsClockTimingAdjustment = /* 'ctad' */ 0x63746164, - BMDDeckLinkSupportsFullDuplex = /* 'fdup' */ 0x66647570, BMDDeckLinkSupportsFullFrameReferenceInputTimingOffset = /* 'frin' */ 0x6672696E, BMDDeckLinkSupportsSMPTELevelAOutput = /* 'lvla' */ 0x6C766C61, BMDDeckLinkSupportsDualLinkSDI = /* 'sdls' */ 0x73646C73, BMDDeckLinkSupportsQuadLinkSDI = /* 'sqls' */ 0x73716C73, BMDDeckLinkSupportsIdleOutput = /* 'idou' */ 0x69646F75, + BMDDeckLinkVANCRequires10BitYUVVideoFrames = /* 'vioY' */ 0x76696F59, // Legacy product requires v210 active picture for IDeckLinkVideoFrameAncillaryPackets or 10-bit VANC BMDDeckLinkHasLTCTimecodeInput = /* 'hltc' */ 0x686C7463, - BMDDeckLinkSupportsDuplexModeConfiguration = /* 'dupx' */ 0x64757078, BMDDeckLinkSupportsHDRMetadata = /* 'hdrm' */ 0x6864726D, BMDDeckLinkSupportsColorspaceMetadata = /* 'cmet' */ 0x636D6574, + BMDDeckLinkSupportsHDMITimecode = /* 'htim' */ 0x6874696D, + BMDDeckLinkSupportsHighFrameRateTimecode = /* 'HFRT' */ 
0x48465254, + BMDDeckLinkSupportsSynchronizeToCaptureGroup = /* 'stcg' */ 0x73746367, + BMDDeckLinkSupportsSynchronizeToPlaybackGroup = /* 'stpg' */ 0x73747067, /* Integers */ @@ -493,7 +519,8 @@ BMDDeckLinkAudioInputXLRChannelCount = /* 'aixc' */ 0x61697863, BMDDeckLinkAudioOutputRCAChannelCount = /* 'aorc' */ 0x616F7263, BMDDeckLinkAudioOutputXLRChannelCount = /* 'aoxc' */ 0x616F7863, - BMDDeckLinkPairedDevicePersistentID = /* 'ppid' */ 0x70706964, + BMDDeckLinkProfileID = /* 'prid' */ 0x70726964, // Returns a BMDProfileID + BMDDeckLinkDuplex = /* 'dupx' */ 0x64757078, /* Floats */ @@ -539,7 +566,6 @@ bmdDeckLinkStatusLastVideoOutputPixelFormat = /* 'opix' */ 0x6F706978, bmdDeckLinkStatusReferenceSignalMode = /* 'refm' */ 0x7265666D, bmdDeckLinkStatusReferenceSignalFlags = /* 'reff' */ 0x72656666, - bmdDeckLinkStatusDuplexMode = /* 'dupx' */ 0x64757078, bmdDeckLinkStatusBusy = /* 'busy' */ 0x62757379, bmdDeckLinkStatusInterchangeablePanelType = /* 'icpt' */ 0x69637074, bmdDeckLinkStatusDeviceTemperature = /* 'dtmp' */ 0x64746D70, @@ -559,14 +585,14 @@ bmdDeckLinkVideoStatusDualStream3D = 1 << 1 }; -/* Enum BMDDuplexStatus - Duplex status of the device */ +/* Enum BMDDuplexMode - Duplex of the device */ -typedef uint32_t BMDDuplexStatus; -enum _BMDDuplexStatus { - bmdDuplexStatusFullDuplex = /* 'fdup' */ 0x66647570, - bmdDuplexStatusHalfDuplex = /* 'hdup' */ 0x68647570, - bmdDuplexStatusSimplex = /* 'splx' */ 0x73706C78, - bmdDuplexStatusInactive = /* 'inac' */ 0x696E6163 +typedef uint32_t BMDDuplexMode; +enum _BMDDuplexMode { + bmdDuplexFull = /* 'dxfu' */ 0x64786675, + bmdDuplexHalf = /* 'dxha' */ 0x64786861, + bmdDuplexSimplex = /* 'dxsp' */ 0x64787370, + bmdDuplexInactive = /* 'dxin' */ 0x6478696E }; /* Enum BMDPanelType - The type of interchangeable panel */ @@ -646,7 +672,11 @@ class IDeckLinkGLScreenPreviewHelper; class IDeckLinkNotificationCallback; class IDeckLinkNotification; -class IDeckLinkAttributes; +class IDeckLinkProfileAttributes; +class 
IDeckLinkProfileIterator; +class IDeckLinkProfile; +class IDeckLinkProfileCallback; +class IDeckLinkProfileManager; class IDeckLinkStatus; class IDeckLinkKeyer; class IDeckLinkVideoConversion; @@ -737,7 +767,8 @@ class BMD_PUBLIC IDeckLinkOutput : public IUnknown { public: - virtual HRESULT DoesSupportVideoMode (/* in */ BMDDisplayMode displayMode, /* in */ BMDPixelFormat pixelFormat, /* in */ BMDVideoOutputFlags flags, /* out */ BMDDisplayModeSupport *result, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0; + virtual HRESULT DoesSupportVideoMode (/* in */ BMDVideoConnection connection /* If a value of 0 is specified, the caller does not care about the connection */, /* in */ BMDDisplayMode requestedMode, /* in */ BMDPixelFormat requestedPixelFormat, /* in */ BMDSupportedVideoModeFlags flags, /* out */ BMDDisplayMode *actualMode, /* out */ bool *supported) = 0; + virtual HRESULT GetDisplayMode (/* in */ BMDDisplayMode displayMode, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0; virtual HRESULT GetDisplayModeIterator (/* out */ IDeckLinkDisplayModeIterator **iterator) = 0; virtual HRESULT SetScreenPreviewCallback (/* in */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; @@ -794,7 +825,8 @@ class BMD_PUBLIC IDeckLinkInput : public IUnknown { public: - virtual HRESULT DoesSupportVideoMode (/* in */ BMDDisplayMode displayMode, /* in */ BMDPixelFormat pixelFormat, /* in */ BMDVideoInputFlags flags, /* out */ BMDDisplayModeSupport *result, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0; + virtual HRESULT DoesSupportVideoMode (/* in */ BMDVideoConnection connection /* If a value of 0 is specified, the caller does not care about the connection */, /* in */ BMDDisplayMode requestedMode, /* in */ BMDPixelFormat requestedPixelFormat, /* in */ BMDSupportedVideoModeFlags flags, /* out */ bool *supported) = 0; + virtual HRESULT GetDisplayMode (/* in */ BMDDisplayMode displayMode, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0; virtual 
HRESULT GetDisplayModeIterator (/* out */ IDeckLinkDisplayModeIterator **iterator) = 0; virtual HRESULT SetScreenPreviewCallback (/* in */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; @@ -846,7 +878,8 @@ class BMD_PUBLIC IDeckLinkEncoderInput : public IUnknown { public: - virtual HRESULT DoesSupportVideoMode (/* in */ BMDDisplayMode displayMode, /* in */ BMDPixelFormat pixelFormat, /* in */ BMDVideoInputFlags flags, /* out */ BMDDisplayModeSupport *result, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0; + virtual HRESULT DoesSupportVideoMode (/* in */ BMDVideoConnection connection /* If a value of 0 is specified, the caller does not care about the connection */, /* in */ BMDDisplayMode requestedMode, /* in */ BMDPixelFormat requestedCodec, /* in */ uint32_t requestedCodecProfile, /* in */ BMDSupportedVideoModeFlags flags, /* out */ bool *supported) = 0; + virtual HRESULT GetDisplayMode (/* in */ BMDDisplayMode displayMode, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0; virtual HRESULT GetDisplayModeIterator (/* out */ IDeckLinkDisplayModeIterator **iterator) = 0; /* Video Input */ @@ -1116,11 +1149,14 @@ public: virtual HRESULT Subscribe (/* in */ BMDNotifications topic, /* in */ IDeckLinkNotificationCallback *theCallback) = 0; virtual HRESULT Unsubscribe (/* in */ BMDNotifications topic, /* in */ IDeckLinkNotificationCallback *theCallback) = 0; + +protected: + virtual ~IDeckLinkNotification () {} // call Release method to drop reference count }; -/* Interface IDeckLinkAttributes - DeckLink Attribute interface */ +/* Interface IDeckLinkProfileAttributes - Created by QueryInterface from an IDeckLinkProfile, or from IDeckLink. 
When queried from IDeckLink, interrogates the active profile */ -class BMD_PUBLIC IDeckLinkAttributes : public IUnknown +class BMD_PUBLIC IDeckLinkProfileAttributes : public IUnknown { public: virtual HRESULT GetFlag (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ bool *value) = 0; @@ -1129,7 +1165,57 @@ virtual HRESULT GetString (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ const char **value) = 0; protected: - virtual ~IDeckLinkAttributes () {} // call Release method to drop reference count + virtual ~IDeckLinkProfileAttributes () {} // call Release method to drop reference count +}; + +/* Interface IDeckLinkProfileIterator - Enumerates IDeckLinkProfile interfaces */ + +class BMD_PUBLIC IDeckLinkProfileIterator : public IUnknown +{ +public: + virtual HRESULT Next (/* out */ IDeckLinkProfile **profile) = 0; + +protected: + virtual ~IDeckLinkProfileIterator () {} // call Release method to drop reference count +}; + +/* Interface IDeckLinkProfile - Represents the active profile when queried from IDeckLink */ + +class BMD_PUBLIC IDeckLinkProfile : public IUnknown +{ +public: + virtual HRESULT GetDevice (/* out */ IDeckLink **device) = 0; // Device affected when this profile becomes active + virtual HRESULT IsActive (/* out */ bool *isActive) = 0; + virtual HRESULT SetActive (void) = 0; // Activating a profile will also change the profile on all devices enumerated by GetPeers(). 
Activation is not complete until IDeckLinkProfileCallback::ProfileActivated() is called + virtual HRESULT GetPeers (/* out */ IDeckLinkProfileIterator **profileIterator) = 0; // Profiles of other devices activated with this profile + +protected: + virtual ~IDeckLinkProfile () {} // call Release method to drop reference count +}; + +/* Interface IDeckLinkProfileCallback - Receive notifications about profiles related to this device */ + +class BMD_PUBLIC IDeckLinkProfileCallback : public IUnknown +{ +public: + virtual HRESULT ProfileChanging (/* in */ IDeckLinkProfile *profileToBeActivated, /* in */ bool streamsWillBeForcedToStop) = 0; // Called before this device changes profile. User has an opportunity for teardown if streamsWillBeForcedToStop + virtual HRESULT ProfileActivated (/* in */ IDeckLinkProfile *activatedProfile) = 0; // Called after this device has been activated with a new profile + +protected: + virtual ~IDeckLinkProfileCallback () {} // call Release method to drop reference count +}; + +/* Interface IDeckLinkProfileManager - Created by QueryInterface from IDeckLink when a device has multiple optional profiles */ + +class BMD_PUBLIC IDeckLinkProfileManager : public IUnknown +{ +public: + virtual HRESULT GetProfiles (/* out */ IDeckLinkProfileIterator **profileIterator) = 0; // All available profiles for this device + virtual HRESULT GetProfile (/* in */ BMDProfileID profileID, /* out */ IDeckLinkProfile **profile) = 0; + virtual HRESULT SetCallback (/* in */ IDeckLinkProfileCallback *callback) = 0; + +protected: + virtual ~IDeckLinkProfileManager () {} // call Release method to drop reference count }; /* Interface IDeckLinkStatus - DeckLink Status interface */
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPIConfiguration.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPIConfiguration.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
@@ -46,7 +46,7 @@
 // Interface ID Declarations
 
-BMD_CONST REFIID IID_IDeckLinkConfiguration = /* EF90380B-4AE5-4346-9077-E288E149F129 */ {0xEF,0x90,0x38,0x0B,0x4A,0xE5,0x43,0x46,0x90,0x77,0xE2,0x88,0xE1,0x49,0xF1,0x29};
+BMD_CONST REFIID IID_IDeckLinkConfiguration = /* 912F634B-2D4E-40A4-8AAB-8D80B73F1289 */ {0x91,0x2F,0x63,0x4B,0x2D,0x4E,0x40,0xA4,0x8A,0xAB,0x8D,0x80,0xB7,0x3F,0x12,0x89};
 BMD_CONST REFIID IID_IDeckLinkEncoderConfiguration = /* 138050E5-C60A-4552-BF3F-0F358049327E */ {0x13,0x80,0x50,0xE5,0xC6,0x0A,0x45,0x52,0xBF,0x3F,0x0F,0x35,0x80,0x49,0x32,0x7E};
 
 /* Enum BMDDeckLinkConfigurationID - DeckLink Configuration ID */
@@ -63,7 +63,6 @@
     bmdDeckLinkConfigHDMI3DPackingFormat = /* '3dpf' */ 0x33647066,
     bmdDeckLinkConfigBypass = /* 'byps' */ 0x62797073,
     bmdDeckLinkConfigClockTimingAdjustment = /* 'ctad' */ 0x63746164,
-    bmdDeckLinkConfigDuplexMode = /* 'dupx' */ 0x64757078,
 
     /* Audio Input/Output Flags */
@@ -95,6 +94,8 @@
     bmdDeckLinkConfigDefaultVideoOutputMode = /* 'dvom' */ 0x64766F6D,
     bmdDeckLinkConfigDefaultVideoOutputModeFlags = /* 'dvof' */ 0x64766F66,
     bmdDeckLinkConfigSDIOutputLinkConfiguration = /* 'solc' */ 0x736F6C63,
+    bmdDeckLinkConfigHDMITimecodePacking = /* 'htpk' */ 0x6874706B,
+    bmdDeckLinkConfigPlaybackGroup = /* 'plgr' */ 0x706C6772,
 
     /* Video Output Floats */
@@ -126,6 +127,7 @@
     bmdDeckLinkConfigVANCSourceLine2Mapping = /* 'vsl2' */ 0x76736C32,
     bmdDeckLinkConfigVANCSourceLine3Mapping = /* 'vsl3' */ 0x76736C33,
     bmdDeckLinkConfigCapturePassThroughMode = /* 'cptm' */ 0x6370746D,
+    bmdDeckLinkConfigCaptureGroup = /* 'cpgr' */ 0x63706772,
 
     /* Video Input Floats */
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPIDeckControl.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPIDeckControl.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPIDiscovery.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPIDiscovery.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPIDispatch.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPIDispatch.cpp
Changed
@@ -7,14 +7,14 @@ ** execute, and transmit the Software, and to prepare derivative works of the ** Software, and to permit third-parties to whom the Software is furnished to ** do so, all subject to the following: -** +** ** The copyright notices in the Software and this entire statement, including ** the above license grant, this restriction and the following disclaimer, ** must be included in all copies of the Software, in whole or in part, and ** all derivative works of the Software, unless such copies or derivative ** works are solely in the form of machine-executable object code generated by ** a source language processor. -** +** ** THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR ** IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, ** FITNESS FOR A PARTICULAR PURPOSE, TITLE AND NON-INFRINGEMENT. IN NO EVENT @@ -56,20 +56,17 @@ static void InitDeckLinkAPI (void) { void *libraryHandle; - + libraryHandle = dlopen(kDeckLinkAPI_Name, RTLD_NOW|RTLD_GLOBAL); if (!libraryHandle) { - /* As we install this plugin regardless if there is a - * proprietary library present or not, let's stay silent - * to avoid poluting the logs */ // fprintf(stderr, "%s\n", dlerror()); return; } - + gLoadedDeckLinkAPI = true; - - gCreateIteratorFunc = (CreateIteratorFunc)dlsym(libraryHandle, "CreateDeckLinkIteratorInstance_0003"); + + gCreateIteratorFunc = (CreateIteratorFunc)dlsym(libraryHandle, "CreateDeckLinkIteratorInstance_0004"); if (!gCreateIteratorFunc) fprintf(stderr, "%s\n", dlerror()); gCreateAPIInformationFunc = (CreateAPIInformationFunc)dlsym(libraryHandle, "CreateDeckLinkAPIInformationInstance_0001"); @@ -78,7 +75,7 @@ gCreateVideoConversionFunc = (CreateVideoConversionInstanceFunc)dlsym(libraryHandle, "CreateVideoConversionInstance_0001"); if (!gCreateVideoConversionFunc) fprintf(stderr, "%s\n", dlerror()); - gCreateDeckLinkDiscoveryFunc = (CreateDeckLinkDiscoveryInstanceFunc)dlsym(libraryHandle, 
"CreateDeckLinkDiscoveryInstance_0002"); + gCreateDeckLinkDiscoveryFunc = (CreateDeckLinkDiscoveryInstanceFunc)dlsym(libraryHandle, "CreateDeckLinkDiscoveryInstance_0003"); if (!gCreateDeckLinkDiscoveryFunc) fprintf(stderr, "%s\n", dlerror()); gCreateVideoFrameAncillaryPacketsFunc = (CreateVideoFrameAncillaryPacketsInstanceFunc)dlsym(libraryHandle, "CreateVideoFrameAncillaryPacketsInstance_0001"); @@ -89,7 +86,7 @@ static void InitDeckLinkPreviewAPI (void) { void *libraryHandle; - + libraryHandle = dlopen(KDeckLinkPreviewAPI_Name, RTLD_NOW|RTLD_GLOBAL); if (!libraryHandle) { @@ -112,7 +109,7 @@ IDeckLinkIterator* CreateDeckLinkIteratorInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateIteratorFunc == NULL) return NULL; return gCreateIteratorFunc(); @@ -121,7 +118,7 @@ IDeckLinkAPIInformation* CreateDeckLinkAPIInformationInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateAPIInformationFunc == NULL) return NULL; return gCreateAPIInformationFunc(); @@ -131,7 +128,7 @@ { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); pthread_once(&gPreviewOnceControl, InitDeckLinkPreviewAPI); - + if (gCreateOpenGLPreviewFunc == NULL) return NULL; return gCreateOpenGLPreviewFunc(); @@ -140,7 +137,7 @@ IDeckLinkVideoConversion* CreateVideoConversionInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateVideoConversionFunc == NULL) return NULL; return gCreateVideoConversionFunc(); @@ -149,7 +146,7 @@ IDeckLinkDiscovery* CreateDeckLinkDiscoveryInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateDeckLinkDiscoveryFunc == NULL) return NULL; return gCreateDeckLinkDiscoveryFunc(); @@ -158,7 +155,7 @@ IDeckLinkVideoFrameAncillaryPackets* CreateVideoFrameAncillaryPacketsInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateVideoFrameAncillaryPacketsFunc == NULL) return NULL; return gCreateVideoFrameAncillaryPacketsFunc();
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPIModes.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPIModes.h
Changed
@@ -1,5 +1,5 @@ /* -LICENSE-START- -** Copyright (c) 2018 Blackmagic Design +** Copyright (c) 2019 Blackmagic Design ** ** Permission is hereby granted, free of charge, to any person or organization ** obtaining a copy of the software and accompanying documentation covered by @@ -69,9 +69,16 @@ bmdModeHD1080p25 = /* 'Hp25' */ 0x48703235, bmdModeHD1080p2997 = /* 'Hp29' */ 0x48703239, bmdModeHD1080p30 = /* 'Hp30' */ 0x48703330, + bmdModeHD1080p4795 = /* 'Hp47' */ 0x48703437, + bmdModeHD1080p48 = /* 'Hp48' */ 0x48703438, bmdModeHD1080p50 = /* 'Hp50' */ 0x48703530, bmdModeHD1080p5994 = /* 'Hp59' */ 0x48703539, bmdModeHD1080p6000 = /* 'Hp60' */ 0x48703630, // N.B. This _really_ is 60.00 Hz. + bmdModeHD1080p9590 = /* 'Hp95' */ 0x48703935, + bmdModeHD1080p96 = /* 'Hp96' */ 0x48703936, + bmdModeHD1080p100 = /* 'Hp10' */ 0x48703130, + bmdModeHD1080p11988 = /* 'Hp11' */ 0x48703131, + bmdModeHD1080p120 = /* 'Hp12' */ 0x48703132, bmdModeHD1080i50 = /* 'Hi50' */ 0x48693530, bmdModeHD1080i5994 = /* 'Hi59' */ 0x48693539, bmdModeHD1080i6000 = /* 'Hi60' */ 0x48693630, // N.B. This _really_ is 60.00 Hz. 
@@ -95,9 +102,16 @@ bmdMode2kDCI25 = /* '2d25' */ 0x32643235, bmdMode2kDCI2997 = /* '2d29' */ 0x32643239, bmdMode2kDCI30 = /* '2d30' */ 0x32643330, + bmdMode2kDCI4795 = /* '2d47' */ 0x32643437, + bmdMode2kDCI48 = /* '2d48' */ 0x32643438, bmdMode2kDCI50 = /* '2d50' */ 0x32643530, bmdMode2kDCI5994 = /* '2d59' */ 0x32643539, bmdMode2kDCI60 = /* '2d60' */ 0x32643630, + bmdMode2kDCI9590 = /* '2d95' */ 0x32643935, + bmdMode2kDCI96 = /* '2d96' */ 0x32643936, + bmdMode2kDCI100 = /* '2d10' */ 0x32643130, + bmdMode2kDCI11988 = /* '2d11' */ 0x32643131, + bmdMode2kDCI120 = /* '2d12' */ 0x32643132, /* 4K UHD Modes */ @@ -106,9 +120,16 @@ bmdMode4K2160p25 = /* '4k25' */ 0x346B3235, bmdMode4K2160p2997 = /* '4k29' */ 0x346B3239, bmdMode4K2160p30 = /* '4k30' */ 0x346B3330, + bmdMode4K2160p4795 = /* '4k47' */ 0x346B3437, + bmdMode4K2160p48 = /* '4k48' */ 0x346B3438, bmdMode4K2160p50 = /* '4k50' */ 0x346B3530, bmdMode4K2160p5994 = /* '4k59' */ 0x346B3539, bmdMode4K2160p60 = /* '4k60' */ 0x346B3630, + bmdMode4K2160p9590 = /* '4k95' */ 0x346B3935, + bmdMode4K2160p96 = /* '4k96' */ 0x346B3936, + bmdMode4K2160p100 = /* '4k10' */ 0x346B3130, + bmdMode4K2160p11988 = /* '4k11' */ 0x346B3131, + bmdMode4K2160p120 = /* '4k12' */ 0x346B3132, /* 4K DCI Modes */ @@ -117,9 +138,16 @@ bmdMode4kDCI25 = /* '4d25' */ 0x34643235, bmdMode4kDCI2997 = /* '4d29' */ 0x34643239, bmdMode4kDCI30 = /* '4d30' */ 0x34643330, + bmdMode4kDCI4795 = /* '4d47' */ 0x34643437, + bmdMode4kDCI48 = /* '4d48' */ 0x34643438, bmdMode4kDCI50 = /* '4d50' */ 0x34643530, bmdMode4kDCI5994 = /* '4d59' */ 0x34643539, bmdMode4kDCI60 = /* '4d60' */ 0x34643630, + bmdMode4kDCI9590 = /* '4d95' */ 0x34643935, + bmdMode4kDCI96 = /* '4d96' */ 0x34643936, + bmdMode4kDCI100 = /* '4d10' */ 0x34643130, + bmdMode4kDCI11988 = /* '4d11' */ 0x34643131, + bmdMode4kDCI120 = /* '4d12' */ 0x34643132, /* 8K UHD Modes */ @@ -128,6 +156,8 @@ bmdMode8K4320p25 = /* '8k25' */ 0x386B3235, bmdMode8K4320p2997 = /* '8k29' */ 0x386B3239, bmdMode8K4320p30 = /* 
'8k30' */ 0x386B3330, + bmdMode8K4320p4795 = /* '8k47' */ 0x386B3437, + bmdMode8K4320p48 = /* '8k48' */ 0x386B3438, bmdMode8K4320p50 = /* '8k50' */ 0x386B3530, bmdMode8K4320p5994 = /* '8k59' */ 0x386B3539, bmdMode8K4320p60 = /* '8k60' */ 0x386B3630, @@ -139,10 +169,31 @@ bmdMode8kDCI25 = /* '8d25' */ 0x38643235, bmdMode8kDCI2997 = /* '8d29' */ 0x38643239, bmdMode8kDCI30 = /* '8d30' */ 0x38643330, + bmdMode8kDCI4795 = /* '8d47' */ 0x38643437, + bmdMode8kDCI48 = /* '8d48' */ 0x38643438, bmdMode8kDCI50 = /* '8d50' */ 0x38643530, bmdMode8kDCI5994 = /* '8d59' */ 0x38643539, bmdMode8kDCI60 = /* '8d60' */ 0x38643630, + /* PC Modes */ + + bmdMode640x480p60 = /* 'vga6' */ 0x76676136, + bmdMode800x600p60 = /* 'svg6' */ 0x73766736, + bmdMode1440x900p50 = /* 'wxg5' */ 0x77786735, + bmdMode1440x900p60 = /* 'wxg6' */ 0x77786736, + bmdMode1440x1080p50 = /* 'sxg5' */ 0x73786735, + bmdMode1440x1080p60 = /* 'sxg6' */ 0x73786736, + bmdMode1600x1200p50 = /* 'uxg5' */ 0x75786735, + bmdMode1600x1200p60 = /* 'uxg6' */ 0x75786736, + bmdMode1920x1200p50 = /* 'wux5' */ 0x77757835, + bmdMode1920x1200p60 = /* 'wux6' */ 0x77757836, + bmdMode1920x1440p50 = /* '1945' */ 0x31393435, + bmdMode1920x1440p60 = /* '1946' */ 0x31393436, + bmdMode2560x1440p50 = /* 'wqh5' */ 0x77716835, + bmdMode2560x1440p60 = /* 'wqh6' */ 0x77716836, + bmdMode2560x1600p50 = /* 'wqx5' */ 0x77717835, + bmdMode2560x1600p60 = /* 'wqx6' */ 0x77717836, + /* RAW Modes for Cintel (input only) */ bmdModeCintelRAW = /* 'rwci' */ 0x72776369, // Frame size up to 4096x3072, variable frame rate @@ -168,6 +219,7 @@ typedef uint32_t BMDPixelFormat; enum _BMDPixelFormat { + bmdFormatUnspecified = 0, bmdFormat8BitYUV = /* '2vuy' */ 0x32767579, bmdFormat10BitYUV = /* 'v210' */ 0x76323130, bmdFormat8BitARGB = 32,
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPITypes.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPITypes.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
@@ -59,13 +59,16 @@
     bmdTimecodeFlagDefault = 0,
     bmdTimecodeIsDropFrame = 1 << 0,
     bmdTimecodeFieldMark = 1 << 1,
-    bmdTimecodeColorFrame = 1 << 2
+    bmdTimecodeColorFrame = 1 << 2,
+    bmdTimecodeEmbedRecordingTrigger = 1 << 3, // On SDI recording trigger utilises a user-bit
+    bmdTimecodeRecordingTriggered = 1 << 4
 };
 
 /* Enum BMDVideoConnection - Video connection types */
 
 typedef uint32_t BMDVideoConnection;
 enum _BMDVideoConnection {
+    bmdVideoConnectionUnspecified = 0,
     bmdVideoConnectionSDI = 1 << 0,
     bmdVideoConnectionHDMI = 1 << 1,
     bmdVideoConnectionOpticalSDI = 1 << 2,
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/linux/DeckLinkAPIVersion.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/linux/DeckLinkAPIVersion.h
Changed
@@ -30,8 +30,8 @@
 #ifndef __DeckLink_API_Version_h__
 #define __DeckLink_API_Version_h__
 
-#define BLACKMAGIC_DECKLINK_API_VERSION 0x0a0b0400
-#define BLACKMAGIC_DECKLINK_API_VERSION_STRING "10.11.4"
+#define BLACKMAGIC_DECKLINK_API_VERSION 0x0b020000
+#define BLACKMAGIC_DECKLINK_API_VERSION_STRING "11.2"
 
 #endif // __DeckLink_API_Version_h__
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/meson.build
Changed
@@ -1,4 +1,5 @@
 decklink_sources = [
+  'gstdecklinkplugin.cpp',
   'gstdecklink.cpp',
   'gstdecklinkaudiosink.cpp',
   'gstdecklinkvideosink.cpp',
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPI.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPI.h
Changed
@@ -1,5 +1,5 @@ /* -LICENSE-START- -** Copyright (c) 2018 Blackmagic Design +** Copyright (c) 2019 Blackmagic Design ** ** Permission is hereby granted, free of charge, to any person or organization ** obtaining a copy of the software and accompanying documentation covered by @@ -69,10 +69,10 @@ BMD_CONST REFIID IID_IDeckLinkAudioOutputCallback = /* 403C681B-7F46-4A12-B993-2BB127084EE6 */ {0x40,0x3C,0x68,0x1B,0x7F,0x46,0x4A,0x12,0xB9,0x93,0x2B,0xB1,0x27,0x08,0x4E,0xE6}; BMD_CONST REFIID IID_IDeckLinkIterator = /* 50FB36CD-3063-4B73-BDBB-958087F2D8BA */ {0x50,0xFB,0x36,0xCD,0x30,0x63,0x4B,0x73,0xBD,0xBB,0x95,0x80,0x87,0xF2,0xD8,0xBA}; BMD_CONST REFIID IID_IDeckLinkAPIInformation = /* 7BEA3C68-730D-4322-AF34-8A7152B532A4 */ {0x7B,0xEA,0x3C,0x68,0x73,0x0D,0x43,0x22,0xAF,0x34,0x8A,0x71,0x52,0xB5,0x32,0xA4}; -BMD_CONST REFIID IID_IDeckLinkOutput = /* CC5C8A6E-3F2F-4B3A-87EA-FD78AF300564 */ {0xCC,0x5C,0x8A,0x6E,0x3F,0x2F,0x4B,0x3A,0x87,0xEA,0xFD,0x78,0xAF,0x30,0x05,0x64}; -BMD_CONST REFIID IID_IDeckLinkInput = /* AF22762B-DFAC-4846-AA79-FA8883560995 */ {0xAF,0x22,0x76,0x2B,0xDF,0xAC,0x48,0x46,0xAA,0x79,0xFA,0x88,0x83,0x56,0x09,0x95}; +BMD_CONST REFIID IID_IDeckLinkOutput = /* 065A0F6C-C508-4D0D-B919-F5EB0EBFC96B */ {0x06,0x5A,0x0F,0x6C,0xC5,0x08,0x4D,0x0D,0xB9,0x19,0xF5,0xEB,0x0E,0xBF,0xC9,0x6B}; +BMD_CONST REFIID IID_IDeckLinkInput = /* 2A88CF76-F494-4216-A7EF-DC74EEB83882 */ {0x2A,0x88,0xCF,0x76,0xF4,0x94,0x42,0x16,0xA7,0xEF,0xDC,0x74,0xEE,0xB8,0x38,0x82}; BMD_CONST REFIID IID_IDeckLinkHDMIInputEDID = /* ABBBACBC-45BC-4665-9D92-ACE6E5A97902 */ {0xAB,0xBB,0xAC,0xBC,0x45,0xBC,0x46,0x65,0x9D,0x92,0xAC,0xE6,0xE5,0xA9,0x79,0x02}; -BMD_CONST REFIID IID_IDeckLinkEncoderInput = /* 270587DA-6B7D-42E7-A1F0-6D853F581185 */ {0x27,0x05,0x87,0xDA,0x6B,0x7D,0x42,0xE7,0xA1,0xF0,0x6D,0x85,0x3F,0x58,0x11,0x85}; +BMD_CONST REFIID IID_IDeckLinkEncoderInput = /* F222551D-13DF-4FD8-B587-9D4F19EC12C9 */ {0xF2,0x22,0x55,0x1D,0x13,0xDF,0x4F,0xD8,0xB5,0x87,0x9D,0x4F,0x19,0xEC,0x12,0xC9}; 
BMD_CONST REFIID IID_IDeckLinkVideoFrame = /* 3F716FE0-F023-4111-BE5D-EF4414C05B17 */ {0x3F,0x71,0x6F,0xE0,0xF0,0x23,0x41,0x11,0xBE,0x5D,0xEF,0x44,0x14,0xC0,0x5B,0x17}; BMD_CONST REFIID IID_IDeckLinkMutableVideoFrame = /* 69E2639F-40DA-4E19-B6F2-20ACE815C390 */ {0x69,0xE2,0x63,0x9F,0x40,0xDA,0x4E,0x19,0xB6,0xF2,0x20,0xAC,0xE8,0x15,0xC3,0x90}; BMD_CONST REFIID IID_IDeckLinkVideoFrame3DExtensions = /* DA0F7E4A-EDC7-48A8-9CDD-2DB51C729CD7 */ {0xDA,0x0F,0x7E,0x4A,0xED,0xC7,0x48,0xA8,0x9C,0xDD,0x2D,0xB5,0x1C,0x72,0x9C,0xD7}; @@ -91,8 +91,12 @@ BMD_CONST REFIID IID_IDeckLinkCocoaScreenPreviewCallback = /* D174152F-8F96-4C07-83A5-DD5F5AF0A2AA */ {0xD1,0x74,0x15,0x2F,0x8F,0x96,0x4C,0x07,0x83,0xA5,0xDD,0x5F,0x5A,0xF0,0xA2,0xAA}; BMD_CONST REFIID IID_IDeckLinkGLScreenPreviewHelper = /* 504E2209-CAC7-4C1A-9FB4-C5BB6274D22F */ {0x50,0x4E,0x22,0x09,0xCA,0xC7,0x4C,0x1A,0x9F,0xB4,0xC5,0xBB,0x62,0x74,0xD2,0x2F}; BMD_CONST REFIID IID_IDeckLinkNotificationCallback = /* B002A1EC-070D-4288-8289-BD5D36E5FF0D */ {0xB0,0x02,0xA1,0xEC,0x07,0x0D,0x42,0x88,0x82,0x89,0xBD,0x5D,0x36,0xE5,0xFF,0x0D}; -BMD_CONST REFIID IID_IDeckLinkNotification = /* 0A1FB207-E215-441B-9B19-6FA1575946C5 */ {0x0A,0x1F,0xB2,0x07,0xE2,0x15,0x44,0x1B,0x9B,0x19,0x6F,0xA1,0x57,0x59,0x46,0xC5}; -BMD_CONST REFIID IID_IDeckLinkAttributes = /* ABC11843-D966-44CB-96E2-A1CB5D3135C4 */ {0xAB,0xC1,0x18,0x43,0xD9,0x66,0x44,0xCB,0x96,0xE2,0xA1,0xCB,0x5D,0x31,0x35,0xC4}; +BMD_CONST REFIID IID_IDeckLinkNotification = /* B85DF4C8-BDF5-47C1-8064-28162EBDD4EB */ {0xB8,0x5D,0xF4,0xC8,0xBD,0xF5,0x47,0xC1,0x80,0x64,0x28,0x16,0x2E,0xBD,0xD4,0xEB}; +BMD_CONST REFIID IID_IDeckLinkProfileAttributes = /* 17D4BF8E-4911-473A-80A0-731CF6FF345B */ {0x17,0xD4,0xBF,0x8E,0x49,0x11,0x47,0x3A,0x80,0xA0,0x73,0x1C,0xF6,0xFF,0x34,0x5B}; +BMD_CONST REFIID IID_IDeckLinkProfileIterator = /* 29E5A8C0-8BE4-46EB-93AC-31DAAB5B7BF2 */ {0x29,0xE5,0xA8,0xC0,0x8B,0xE4,0x46,0xEB,0x93,0xAC,0x31,0xDA,0xAB,0x5B,0x7B,0xF2}; +BMD_CONST REFIID IID_IDeckLinkProfile = /* 
16093466-674A-432B-9DA0-1AC2C5A8241C */ {0x16,0x09,0x34,0x66,0x67,0x4A,0x43,0x2B,0x9D,0xA0,0x1A,0xC2,0xC5,0xA8,0x24,0x1C}; +BMD_CONST REFIID IID_IDeckLinkProfileCallback = /* A4F9341E-97AA-4E04-8935-15F809898CEA */ {0xA4,0xF9,0x34,0x1E,0x97,0xAA,0x4E,0x04,0x89,0x35,0x15,0xF8,0x09,0x89,0x8C,0xEA}; +BMD_CONST REFIID IID_IDeckLinkProfileManager = /* 30D41429-3998-4B6D-84F8-78C94A797C6E */ {0x30,0xD4,0x14,0x29,0x39,0x98,0x4B,0x6D,0x84,0xF8,0x78,0xC9,0x4A,0x79,0x7C,0x6E}; BMD_CONST REFIID IID_IDeckLinkStatus = /* 5F558200-4028-49BC-BEAC-DB3FA4A96E46 */ {0x5F,0x55,0x82,0x00,0x40,0x28,0x49,0xBC,0xBE,0xAC,0xDB,0x3F,0xA4,0xA9,0x6E,0x46}; BMD_CONST REFIID IID_IDeckLinkKeyer = /* 89AFCAF5-65F8-421E-98F7-96FE5F5BFBA3 */ {0x89,0xAF,0xCA,0xF5,0x65,0xF8,0x42,0x1E,0x98,0xF7,0x96,0xFE,0x5F,0x5B,0xFB,0xA3}; BMD_CONST REFIID IID_IDeckLinkVideoConversion = /* 3BBCB8A2-DA2C-42D9-B5D8-88083644E99A */ {0x3B,0xBC,0xB8,0xA2,0xDA,0x2C,0x42,0xD9,0xB5,0xD8,0x88,0x08,0x36,0x44,0xE9,0x9A}; @@ -107,7 +111,21 @@ bmdVideoOutputVANC = 1 << 0, bmdVideoOutputVITC = 1 << 1, bmdVideoOutputRP188 = 1 << 2, - bmdVideoOutputDualStream3D = 1 << 4 + bmdVideoOutputDualStream3D = 1 << 4, + bmdVideoOutputSynchronizeToPlaybackGroup = 1 << 6 +}; + +/* Enum BMDSupportedVideoModeFlags - Flags to describe supported video mode */ + +typedef uint32_t BMDSupportedVideoModeFlags; +enum _BMDSupportedVideoModeFlags { + bmdSupportedVideoModeDefault = 0, + bmdSupportedVideoModeKeying = 1 << 0, + bmdSupportedVideoModeDualStream3D = 1 << 1, + bmdSupportedVideoModeSDISingleLink = 1 << 2, + bmdSupportedVideoModeSDIDualLink = 1 << 3, + bmdSupportedVideoModeSDIQuadLink = 1 << 4, + bmdSupportedVideoModeInAnyProfile = 1 << 5 }; /* Enum BMDPacketType - Type of packet */ @@ -139,7 +157,8 @@ enum _BMDVideoInputFlags { bmdVideoInputFlagDefault = 0, bmdVideoInputEnableFormatDetection = 1 << 0, - bmdVideoInputDualStream3D = 1 << 1 + bmdVideoInputDualStream3D = 1 << 1, + bmdVideoInputSynchronizeToCaptureGroup = 1 << 2 }; /* Enum 
BMDVideoInputFormatChangedEvents - Bitmask passed to the VideoInputFormatChanged notification to identify the properties of the input signal that have changed */ @@ -218,15 +237,6 @@ bmdAudioOutputStreamTimestamped }; -/* Enum BMDDisplayModeSupport - Output mode supported flags */ - -typedef uint32_t BMDDisplayModeSupport; -enum _BMDDisplayModeSupport { - bmdDisplayModeNotSupported = 0, - bmdDisplayModeSupported, - bmdDisplayModeSupportedWithConversion -}; - /* Enum BMDAncillaryPacketFormat - Ancillary packet format */ typedef uint32_t BMDAncillaryPacketFormat; @@ -243,7 +253,8 @@ bmdTimecodeRP188VITC1 = 'rpv1', // RP188 timecode where DBB1 equals VITC1 (line 9) bmdTimecodeRP188VITC2 = 'rp12', // RP188 timecode where DBB1 equals VITC2 (line 9 for progressive or line 571 for interlaced/PsF) bmdTimecodeRP188LTC = 'rplt', // RP188 timecode where DBB1 equals LTC (line 10) - bmdTimecodeRP188Any = 'rp18', // For capture: return the first valid timecode in {VITC1, LTC ,VITC2} - For playback: set the timecode as VITC1 + bmdTimecodeRP188HighFrameRate = 'rphr', // RP188 timecode where DBB1 is an HFRTC (SMPTE ST 12-3), the only timecode allowing the frame value to go above 30 + bmdTimecodeRP188Any = 'rp18', // Convenience for capture, returning the first valid timecode in {HFRTC (if supported), VITC1, LTC, VITC2} bmdTimecodeVITC = 'vitc', bmdTimecodeVITCField2 = 'vit2', bmdTimecodeSerial = 'seri' @@ -388,8 +399,6 @@ bmdDeckLinkFrameMetadataHDRElectroOpticalTransferFunc = 'eotf', // EOTF in range 0-7 as per CEA 861.3 bmdDeckLinkFrameMetadataCintelFilmType = 'cfty', // Current film type bmdDeckLinkFrameMetadataCintelFilmGauge = 'cfga', // Current film gauge - bmdDeckLinkFrameMetadataCintelOffsetDetectedHorizontal = 'odfh', // Horizontal offset (pixels) detected in image - bmdDeckLinkFrameMetadataCintelOffsetDetectedVertical = 'odfv', // Vertical offset (pixels) detected in image bmdDeckLinkFrameMetadataCintelKeykodeLow = 'ckkl', // Raw keykode value - low 64 bits 
bmdDeckLinkFrameMetadataCintelKeykodeHigh = 'ckkh', // Raw keykode value - high 64 bits bmdDeckLinkFrameMetadataCintelTile1Size = 'ct1s', // Size in bytes of compressed raw tile 1 @@ -436,15 +445,30 @@ bmdDeckLinkFrameMetadataCintelGainBlue = 'LfBl', // Blue gain parameter to apply after log bmdDeckLinkFrameMetadataCintelLiftRed = 'GnRd', // Red lift parameter to apply after log and gain bmdDeckLinkFrameMetadataCintelLiftGreen = 'GnGr', // Green lift parameter to apply after log and gain - bmdDeckLinkFrameMetadataCintelLiftBlue = 'GnBl' // Blue lift parameter to apply after log and gain + bmdDeckLinkFrameMetadataCintelLiftBlue = 'GnBl', // Blue lift parameter to apply after log and gain + bmdDeckLinkFrameMetadataCintelHDRGainRed = 'HGRd', // Red gain parameter to apply to linear data for HDR Combination + bmdDeckLinkFrameMetadataCintelHDRGainGreen = 'HGGr', // Green gain parameter to apply to linear data for HDR Combination + bmdDeckLinkFrameMetadataCintelHDRGainBlue = 'HGBl' // Blue gain parameter to apply to linear data for HDR Combination }; -/* Enum BMDDuplexMode - Duplex for configurable ports */ +/* Enum BMDProfileID - Identifies a profile */ -typedef uint32_t BMDDuplexMode; -enum _BMDDuplexMode { - bmdDuplexModeFull = 'fdup', - bmdDuplexModeHalf = 'hdup' +typedef uint32_t BMDProfileID; +enum _BMDProfileID { + bmdProfileOneSubDeviceFullDuplex = '1dfd', + bmdProfileOneSubDeviceHalfDuplex = '1dhd', + bmdProfileTwoSubDevicesFullDuplex = '2dfd', + bmdProfileTwoSubDevicesHalfDuplex = '2dhd', + bmdProfileFourSubDevicesHalfDuplex = '4dhd' +}; + +/* Enum BMDHDMITimecodePacking - Packing form of timecode on HDMI */ + +typedef uint32_t BMDHDMITimecodePacking; +enum _BMDHDMITimecodePacking { + bmdHDMITimecodePackingIEEEOUI000085 = 0x00008500, + bmdHDMITimecodePackingIEEEOUI080046 = 0x08004601, + bmdHDMITimecodePackingIEEEOUI5CF9F0 = 0x5CF9F003 }; /* Enum BMDDeckLinkAttributeID - DeckLink Attribute ID */ @@ -456,7 +480,6 @@ BMDDeckLinkSupportsInternalKeying = 'keyi', 
BMDDeckLinkSupportsExternalKeying = 'keye', - BMDDeckLinkSupportsHDKeying = 'keyh', BMDDeckLinkSupportsInputFormatDetection = 'infd', BMDDeckLinkHasReferenceInput = 'hrin', BMDDeckLinkHasSerialPort = 'hspt', @@ -465,16 +488,19 @@ BMDDeckLinkHasVideoInputAntiAliasingFilter = 'aafl', BMDDeckLinkHasBypass = 'byps', BMDDeckLinkSupportsClockTimingAdjustment = 'ctad', - BMDDeckLinkSupportsFullDuplex = 'fdup', BMDDeckLinkSupportsFullFrameReferenceInputTimingOffset = 'frin', BMDDeckLinkSupportsSMPTELevelAOutput = 'lvla', BMDDeckLinkSupportsDualLinkSDI = 'sdls', BMDDeckLinkSupportsQuadLinkSDI = 'sqls', BMDDeckLinkSupportsIdleOutput = 'idou', + BMDDeckLinkVANCRequires10BitYUVVideoFrames = 'vioY', // Legacy product requires v210 active picture for IDeckLinkVideoFrameAncillaryPackets or 10-bit VANC BMDDeckLinkHasLTCTimecodeInput = 'hltc', - BMDDeckLinkSupportsDuplexModeConfiguration = 'dupx', BMDDeckLinkSupportsHDRMetadata = 'hdrm', BMDDeckLinkSupportsColorspaceMetadata = 'cmet', + BMDDeckLinkSupportsHDMITimecode = 'htim', + BMDDeckLinkSupportsHighFrameRateTimecode = 'HFRT', + BMDDeckLinkSupportsSynchronizeToCaptureGroup = 'stcg', + BMDDeckLinkSupportsSynchronizeToPlaybackGroup = 'stpg', /* Integers */ @@ -497,7 +523,8 @@ BMDDeckLinkAudioInputXLRChannelCount = 'aixc', BMDDeckLinkAudioOutputRCAChannelCount = 'aorc', BMDDeckLinkAudioOutputXLRChannelCount = 'aoxc', - BMDDeckLinkPairedDevicePersistentID = 'ppid', + BMDDeckLinkProfileID = 'prid', // Returns a BMDProfileID + BMDDeckLinkDuplex = 'dupx', /* Floats */ @@ -543,7 +570,6 @@ bmdDeckLinkStatusLastVideoOutputPixelFormat = 'opix', bmdDeckLinkStatusReferenceSignalMode = 'refm', bmdDeckLinkStatusReferenceSignalFlags = 'reff', - bmdDeckLinkStatusDuplexMode = 'dupx', bmdDeckLinkStatusBusy = 'busy', bmdDeckLinkStatusInterchangeablePanelType = 'icpt', bmdDeckLinkStatusDeviceTemperature = 'dtmp', @@ -563,14 +589,14 @@ bmdDeckLinkVideoStatusDualStream3D = 1 << 1 }; -/* Enum BMDDuplexStatus - Duplex status of the device */ +/* Enum 
BMDDuplexMode - Duplex of the device */
-typedef uint32_t BMDDuplexStatus;
-enum _BMDDuplexStatus {
-    bmdDuplexStatusFullDuplex = 'fdup',
-    bmdDuplexStatusHalfDuplex = 'hdup',
-    bmdDuplexStatusSimplex = 'splx',
-    bmdDuplexStatusInactive = 'inac'
+typedef uint32_t BMDDuplexMode;
+enum _BMDDuplexMode {
+    bmdDuplexFull = 'dxfu',
+    bmdDuplexHalf = 'dxha',
+    bmdDuplexSimplex = 'dxsp',
+    bmdDuplexInactive = 'dxin'
 };

 /* Enum BMDPanelType - The type of interchangeable panel */

@@ -651,7 +677,11 @@
 class IDeckLinkGLScreenPreviewHelper;
 class IDeckLinkNotificationCallback;
 class IDeckLinkNotification;
-class IDeckLinkAttributes;
+class IDeckLinkProfileAttributes;
+class IDeckLinkProfileIterator;
+class IDeckLinkProfile;
+class IDeckLinkProfileCallback;
+class IDeckLinkProfileManager;
 class IDeckLinkStatus;
 class IDeckLinkKeyer;
 class IDeckLinkVideoConversion;

@@ -742,7 +772,8 @@
 class BMD_PUBLIC IDeckLinkOutput : public IUnknown
 {
 public:
-    virtual HRESULT DoesSupportVideoMode (/* in */ BMDDisplayMode displayMode, /* in */ BMDPixelFormat pixelFormat, /* in */ BMDVideoOutputFlags flags, /* out */ BMDDisplayModeSupport *result, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0;
+    virtual HRESULT DoesSupportVideoMode (/* in */ BMDVideoConnection connection /* If a value of 0 is specified, the caller does not care about the connection */, /* in */ BMDDisplayMode requestedMode, /* in */ BMDPixelFormat requestedPixelFormat, /* in */ BMDSupportedVideoModeFlags flags, /* out */ BMDDisplayMode *actualMode, /* out */ bool *supported) = 0;
+    virtual HRESULT GetDisplayMode (/* in */ BMDDisplayMode displayMode, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0;
     virtual HRESULT GetDisplayModeIterator (/* out */ IDeckLinkDisplayModeIterator **iterator) = 0;
     virtual HRESULT SetScreenPreviewCallback (/* in */ IDeckLinkScreenPreviewCallback *previewCallback) = 0;

@@ -799,7 +830,8 @@
 class BMD_PUBLIC IDeckLinkInput : public IUnknown
 {
 public:
-    virtual HRESULT DoesSupportVideoMode (/* in */ BMDDisplayMode displayMode, /* in */ BMDPixelFormat pixelFormat, /* in */ BMDVideoInputFlags flags, /* out */ BMDDisplayModeSupport *result, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0;
+    virtual HRESULT DoesSupportVideoMode (/* in */ BMDVideoConnection connection /* If a value of 0 is specified, the caller does not care about the connection */, /* in */ BMDDisplayMode requestedMode, /* in */ BMDPixelFormat requestedPixelFormat, /* in */ BMDSupportedVideoModeFlags flags, /* out */ bool *supported) = 0;
+    virtual HRESULT GetDisplayMode (/* in */ BMDDisplayMode displayMode, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0;
     virtual HRESULT GetDisplayModeIterator (/* out */ IDeckLinkDisplayModeIterator **iterator) = 0;
     virtual HRESULT SetScreenPreviewCallback (/* in */ IDeckLinkScreenPreviewCallback *previewCallback) = 0;

@@ -851,7 +883,8 @@
 class BMD_PUBLIC IDeckLinkEncoderInput : public IUnknown
 {
 public:
-    virtual HRESULT DoesSupportVideoMode (/* in */ BMDDisplayMode displayMode, /* in */ BMDPixelFormat pixelFormat, /* in */ BMDVideoInputFlags flags, /* out */ BMDDisplayModeSupport *result, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0;
+    virtual HRESULT DoesSupportVideoMode (/* in */ BMDVideoConnection connection /* If a value of 0 is specified, the caller does not care about the connection */, /* in */ BMDDisplayMode requestedMode, /* in */ BMDPixelFormat requestedCodec, /* in */ uint32_t requestedCodecProfile, /* in */ BMDSupportedVideoModeFlags flags, /* out */ bool *supported) = 0;
+    virtual HRESULT GetDisplayMode (/* in */ BMDDisplayMode displayMode, /* out */ IDeckLinkDisplayMode **resultDisplayMode) = 0;
     virtual HRESULT GetDisplayModeIterator (/* out */ IDeckLinkDisplayModeIterator **iterator) = 0;

     /* Video Input */

@@ -1131,11 +1164,14 @@
 public:
     virtual HRESULT Subscribe (/* in */ BMDNotifications topic, /* in */ IDeckLinkNotificationCallback *theCallback) = 0;
     virtual HRESULT Unsubscribe (/* in */ BMDNotifications topic, /* in */ IDeckLinkNotificationCallback *theCallback) = 0;
+
+protected:
+    virtual ~IDeckLinkNotification () {} // call Release method to drop reference count
 };

-/* Interface IDeckLinkAttributes - DeckLink Attribute interface */
+/* Interface IDeckLinkProfileAttributes - Created by QueryInterface from an IDeckLinkProfile, or from IDeckLink. When queried from IDeckLink, interrogates the active profile */

-class BMD_PUBLIC IDeckLinkAttributes : public IUnknown
+class BMD_PUBLIC IDeckLinkProfileAttributes : public IUnknown
 {
 public:
     virtual HRESULT GetFlag (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ bool *value) = 0;

@@ -1144,7 +1180,57 @@
     virtual HRESULT GetString (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ CFStringRef *value) = 0;

 protected:
-    virtual ~IDeckLinkAttributes () {} // call Release method to drop reference count
+    virtual ~IDeckLinkProfileAttributes () {} // call Release method to drop reference count
+};
+
+/* Interface IDeckLinkProfileIterator - Enumerates IDeckLinkProfile interfaces */
+
+class BMD_PUBLIC IDeckLinkProfileIterator : public IUnknown
+{
+public:
+    virtual HRESULT Next (/* out */ IDeckLinkProfile **profile) = 0;
+
+protected:
+    virtual ~IDeckLinkProfileIterator () {} // call Release method to drop reference count
+};
+
+/* Interface IDeckLinkProfile - Represents the active profile when queried from IDeckLink */
+
+class BMD_PUBLIC IDeckLinkProfile : public IUnknown
+{
+public:
+    virtual HRESULT GetDevice (/* out */ IDeckLink **device) = 0; // Device affected when this profile becomes active
+    virtual HRESULT IsActive (/* out */ bool *isActive) = 0;
+    virtual HRESULT SetActive (void) = 0; // Activating a profile will also change the profile on all devices enumerated by GetPeers(). Activation is not complete until IDeckLinkProfileCallback::ProfileActivated() is called
+    virtual HRESULT GetPeers (/* out */ IDeckLinkProfileIterator **profileIterator) = 0; // Profiles of other devices activated with this profile
+
+protected:
+    virtual ~IDeckLinkProfile () {} // call Release method to drop reference count
+};
+
+/* Interface IDeckLinkProfileCallback - Receive notifications about profiles related to this device */
+
+class BMD_PUBLIC IDeckLinkProfileCallback : public IUnknown
+{
+public:
+    virtual HRESULT ProfileChanging (/* in */ IDeckLinkProfile *profileToBeActivated, /* in */ bool streamsWillBeForcedToStop) = 0; // Called before this device changes profile. User has an opportunity for teardown if streamsWillBeForcedToStop
+    virtual HRESULT ProfileActivated (/* in */ IDeckLinkProfile *activatedProfile) = 0; // Called after this device has been activated with a new profile
+
+protected:
+    virtual ~IDeckLinkProfileCallback () {} // call Release method to drop reference count
+};
+
+/* Interface IDeckLinkProfileManager - Created by QueryInterface from IDeckLink when a device has multiple optional profiles */
+
+class BMD_PUBLIC IDeckLinkProfileManager : public IUnknown
+{
+public:
+    virtual HRESULT GetProfiles (/* out */ IDeckLinkProfileIterator **profileIterator) = 0; // All available profiles for this device
+    virtual HRESULT GetProfile (/* in */ BMDProfileID profileID, /* out */ IDeckLinkProfile **profile) = 0;
+    virtual HRESULT SetCallback (/* in */ IDeckLinkProfileCallback *callback) = 0;
+
+protected:
+    virtual ~IDeckLinkProfileManager () {} // call Release method to drop reference count
 };

 /* Interface IDeckLinkStatus - DeckLink Status interface */
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIConfiguration.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIConfiguration.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by

@@ -46,7 +46,7 @@
 // Interface ID Declarations

-BMD_CONST REFIID IID_IDeckLinkConfiguration = /* EF90380B-4AE5-4346-9077-E288E149F129 */ {0xEF,0x90,0x38,0x0B,0x4A,0xE5,0x43,0x46,0x90,0x77,0xE2,0x88,0xE1,0x49,0xF1,0x29};
+BMD_CONST REFIID IID_IDeckLinkConfiguration = /* 912F634B-2D4E-40A4-8AAB-8D80B73F1289 */ {0x91,0x2F,0x63,0x4B,0x2D,0x4E,0x40,0xA4,0x8A,0xAB,0x8D,0x80,0xB7,0x3F,0x12,0x89};
 BMD_CONST REFIID IID_IDeckLinkEncoderConfiguration = /* 138050E5-C60A-4552-BF3F-0F358049327E */ {0x13,0x80,0x50,0xE5,0xC6,0x0A,0x45,0x52,0xBF,0x3F,0x0F,0x35,0x80,0x49,0x32,0x7E};

 /* Enum BMDDeckLinkConfigurationID - DeckLink Configuration ID */

@@ -63,7 +63,6 @@
     bmdDeckLinkConfigHDMI3DPackingFormat = '3dpf',
     bmdDeckLinkConfigBypass = 'byps',
     bmdDeckLinkConfigClockTimingAdjustment = 'ctad',
-    bmdDeckLinkConfigDuplexMode = 'dupx',

     /* Audio Input/Output Flags */

@@ -95,6 +94,8 @@
     bmdDeckLinkConfigDefaultVideoOutputMode = 'dvom',
     bmdDeckLinkConfigDefaultVideoOutputModeFlags = 'dvof',
     bmdDeckLinkConfigSDIOutputLinkConfiguration = 'solc',
+    bmdDeckLinkConfigHDMITimecodePacking = 'htpk',
+    bmdDeckLinkConfigPlaybackGroup = 'plgr',

     /* Video Output Floats */

@@ -126,6 +127,7 @@
     bmdDeckLinkConfigVANCSourceLine2Mapping = 'vsl2',
     bmdDeckLinkConfigVANCSourceLine3Mapping = 'vsl3',
     bmdDeckLinkConfigCapturePassThroughMode = 'cptm',
+    bmdDeckLinkConfigCaptureGroup = 'cpgr',

     /* Video Input Floats */
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIDeckControl.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIDeckControl.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIDiscovery.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIDiscovery.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIDispatch.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIDispatch.cpp
Changed
@@ -7,14 +7,14 @@ ** execute, and transmit the Software, and to prepare derivative works of the ** Software, and to permit third-parties to whom the Software is furnished to ** do so, all subject to the following: -** +** ** The copyright notices in the Software and this entire statement, including ** the above license grant, this restriction and the following disclaimer, ** must be included in all copies of the Software, in whole or in part, and ** all derivative works of the Software, unless such copies or derivative ** works are solely in the form of machine-executable object code generated by ** a source language processor. -** +** ** THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR ** IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, ** FITNESS FOR A PARTICULAR PURPOSE, TITLE AND NON-INFRINGEMENT. IN NO EVENT @@ -65,12 +65,12 @@ gDeckLinkAPIBundleRef = CFBundleCreate(kCFAllocatorDefault, bundleURL); if (gDeckLinkAPIBundleRef != NULL) { - gCreateIteratorFunc = (CreateIteratorFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateDeckLinkIteratorInstance_0003")); + gCreateIteratorFunc = (CreateIteratorFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateDeckLinkIteratorInstance_0004")); gCreateAPIInformationFunc = (CreateAPIInformationFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateDeckLinkAPIInformationInstance_0001")); gCreateOpenGLPreviewFunc = (CreateOpenGLScreenPreviewHelperFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateOpenGLScreenPreviewHelper_0001")); gCreateCocoaPreviewFunc = (CreateCocoaScreenPreviewFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateCocoaScreenPreview_0001")); gCreateVideoConversionFunc = (CreateVideoConversionInstanceFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateVideoConversionInstance_0001")); - gCreateDeckLinkDiscoveryFunc = 
(CreateDeckLinkDiscoveryInstanceFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateDeckLinkDiscoveryInstance_0002")); + gCreateDeckLinkDiscoveryFunc = (CreateDeckLinkDiscoveryInstanceFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateDeckLinkDiscoveryInstance_0003")); gCreateVideoFrameAncillaryPacketsFunc = (CreateVideoFrameAncillaryPacketsInstanceFunc)CFBundleGetFunctionPointerForName(gDeckLinkAPIBundleRef, CFSTR("CreateVideoFrameAncillaryPacketsInstance_0001")); } CFRelease(bundleURL); @@ -83,7 +83,7 @@ // If the DeckLink API bundle was successfully loaded, return this knowledge to the caller if (gDeckLinkAPIBundleRef != NULL) return true; - + return false; } #endif @@ -91,70 +91,70 @@ IDeckLinkIterator* CreateDeckLinkIteratorInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateIteratorFunc == NULL) return NULL; - + return gCreateIteratorFunc(); } IDeckLinkAPIInformation* CreateDeckLinkAPIInformationInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateAPIInformationFunc == NULL) return NULL; - + return gCreateAPIInformationFunc(); } IDeckLinkGLScreenPreviewHelper* CreateOpenGLScreenPreviewHelper (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateOpenGLPreviewFunc == NULL) return NULL; - + return gCreateOpenGLPreviewFunc(); } IDeckLinkCocoaScreenPreviewCallback* CreateCocoaScreenPreview (void* parentView) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateCocoaPreviewFunc == NULL) return NULL; - + return gCreateCocoaPreviewFunc(parentView); } IDeckLinkVideoConversion* CreateVideoConversionInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateVideoConversionFunc == NULL) return NULL; - + return gCreateVideoConversionFunc(); } IDeckLinkDiscovery* CreateDeckLinkDiscoveryInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if 
(gCreateDeckLinkDiscoveryFunc == NULL) return NULL; - + return gCreateDeckLinkDiscoveryFunc(); } IDeckLinkVideoFrameAncillaryPackets* CreateVideoFrameAncillaryPacketsInstance (void) { pthread_once(&gDeckLinkOnceControl, InitDeckLinkAPI); - + if (gCreateVideoFrameAncillaryPacketsFunc == NULL) return NULL; - + return gCreateVideoFrameAncillaryPacketsFunc(); }
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIModes.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIModes.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by

@@ -69,9 +69,16 @@
     bmdModeHD1080p25 = 'Hp25',
     bmdModeHD1080p2997 = 'Hp29',
     bmdModeHD1080p30 = 'Hp30',
+    bmdModeHD1080p4795 = 'Hp47',
+    bmdModeHD1080p48 = 'Hp48',
     bmdModeHD1080p50 = 'Hp50',
     bmdModeHD1080p5994 = 'Hp59',
     bmdModeHD1080p6000 = 'Hp60',    // N.B. This _really_ is 60.00 Hz.
+    bmdModeHD1080p9590 = 'Hp95',
+    bmdModeHD1080p96 = 'Hp96',
+    bmdModeHD1080p100 = 'Hp10',
+    bmdModeHD1080p11988 = 'Hp11',
+    bmdModeHD1080p120 = 'Hp12',
     bmdModeHD1080i50 = 'Hi50',
     bmdModeHD1080i5994 = 'Hi59',
     bmdModeHD1080i6000 = 'Hi60',    // N.B. This _really_ is 60.00 Hz.

@@ -95,9 +102,16 @@
     bmdMode2kDCI25 = '2d25',
     bmdMode2kDCI2997 = '2d29',
     bmdMode2kDCI30 = '2d30',
+    bmdMode2kDCI4795 = '2d47',
+    bmdMode2kDCI48 = '2d48',
     bmdMode2kDCI50 = '2d50',
     bmdMode2kDCI5994 = '2d59',
     bmdMode2kDCI60 = '2d60',
+    bmdMode2kDCI9590 = '2d95',
+    bmdMode2kDCI96 = '2d96',
+    bmdMode2kDCI100 = '2d10',
+    bmdMode2kDCI11988 = '2d11',
+    bmdMode2kDCI120 = '2d12',

     /* 4K UHD Modes */

@@ -106,9 +120,16 @@
     bmdMode4K2160p25 = '4k25',
     bmdMode4K2160p2997 = '4k29',
     bmdMode4K2160p30 = '4k30',
+    bmdMode4K2160p4795 = '4k47',
+    bmdMode4K2160p48 = '4k48',
     bmdMode4K2160p50 = '4k50',
     bmdMode4K2160p5994 = '4k59',
     bmdMode4K2160p60 = '4k60',
+    bmdMode4K2160p9590 = '4k95',
+    bmdMode4K2160p96 = '4k96',
+    bmdMode4K2160p100 = '4k10',
+    bmdMode4K2160p11988 = '4k11',
+    bmdMode4K2160p120 = '4k12',

     /* 4K DCI Modes */

@@ -117,9 +138,16 @@
     bmdMode4kDCI25 = '4d25',
     bmdMode4kDCI2997 = '4d29',
     bmdMode4kDCI30 = '4d30',
+    bmdMode4kDCI4795 = '4d47',
+    bmdMode4kDCI48 = '4d48',
     bmdMode4kDCI50 = '4d50',
     bmdMode4kDCI5994 = '4d59',
     bmdMode4kDCI60 = '4d60',
+    bmdMode4kDCI9590 = '4d95',
+    bmdMode4kDCI96 = '4d96',
+    bmdMode4kDCI100 = '4d10',
+    bmdMode4kDCI11988 = '4d11',
+    bmdMode4kDCI120 = '4d12',

     /* 8K UHD Modes */

@@ -128,6 +156,8 @@
     bmdMode8K4320p25 = '8k25',
     bmdMode8K4320p2997 = '8k29',
     bmdMode8K4320p30 = '8k30',
+    bmdMode8K4320p4795 = '8k47',
+    bmdMode8K4320p48 = '8k48',
     bmdMode8K4320p50 = '8k50',
     bmdMode8K4320p5994 = '8k59',
     bmdMode8K4320p60 = '8k60',

@@ -139,10 +169,31 @@
     bmdMode8kDCI25 = '8d25',
     bmdMode8kDCI2997 = '8d29',
     bmdMode8kDCI30 = '8d30',
+    bmdMode8kDCI4795 = '8d47',
+    bmdMode8kDCI48 = '8d48',
     bmdMode8kDCI50 = '8d50',
     bmdMode8kDCI5994 = '8d59',
     bmdMode8kDCI60 = '8d60',

+    /* PC Modes */
+
+    bmdMode640x480p60 = 'vga6',
+    bmdMode800x600p60 = 'svg6',
+    bmdMode1440x900p50 = 'wxg5',
+    bmdMode1440x900p60 = 'wxg6',
+    bmdMode1440x1080p50 = 'sxg5',
+    bmdMode1440x1080p60 = 'sxg6',
+    bmdMode1600x1200p50 = 'uxg5',
+    bmdMode1600x1200p60 = 'uxg6',
+    bmdMode1920x1200p50 = 'wux5',
+    bmdMode1920x1200p60 = 'wux6',
+    bmdMode1920x1440p50 = '1945',
+    bmdMode1920x1440p60 = '1946',
+    bmdMode2560x1440p50 = 'wqh5',
+    bmdMode2560x1440p60 = 'wqh6',
+    bmdMode2560x1600p50 = 'wqx5',
+    bmdMode2560x1600p60 = 'wqx6',
+
     /* RAW Modes for Cintel (input only) */

     bmdModeCintelRAW = 'rwci',    // Frame size up to 4096x3072, variable frame rate

@@ -168,6 +219,7 @@
 typedef uint32_t BMDPixelFormat;
 enum _BMDPixelFormat {
+    bmdFormatUnspecified = 0,
     bmdFormat8BitYUV = '2vuy',
     bmdFormat10BitYUV = 'v210',
     bmdFormat8BitARGB = 32,
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIStreaming.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIStreaming.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPITypes.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPITypes.h
Changed
@@ -1,5 +1,5 @@
 /* -LICENSE-START-
-** Copyright (c) 2018 Blackmagic Design
+** Copyright (c) 2019 Blackmagic Design
 **
 ** Permission is hereby granted, free of charge, to any person or organization
 ** obtaining a copy of the software and accompanying documentation covered by

@@ -59,13 +59,16 @@
     bmdTimecodeFlagDefault = 0,
     bmdTimecodeIsDropFrame = 1 << 0,
     bmdTimecodeFieldMark = 1 << 1,
-    bmdTimecodeColorFrame = 1 << 2
+    bmdTimecodeColorFrame = 1 << 2,
+    bmdTimecodeEmbedRecordingTrigger = 1 << 3,    // On SDI recording trigger utilises a user-bit
+    bmdTimecodeRecordingTriggered = 1 << 4
 };

 /* Enum BMDVideoConnection - Video connection types */

 typedef uint32_t BMDVideoConnection;
 enum _BMDVideoConnection {
+    bmdVideoConnectionUnspecified = 0,
     bmdVideoConnectionSDI = 1 << 0,
     bmdVideoConnectionHDMI = 1 << 1,
     bmdVideoConnectionOpticalSDI = 1 << 2,
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/osx/DeckLinkAPIVersion.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/osx/DeckLinkAPIVersion.h
Changed
@@ -30,8 +30,8 @@
 #ifndef __DeckLink_API_Version_h__
 #define __DeckLink_API_Version_h__

-#define BLACKMAGIC_DECKLINK_API_VERSION         0x0a0b0400
-#define BLACKMAGIC_DECKLINK_API_VERSION_STRING  "10.11.4"
+#define BLACKMAGIC_DECKLINK_API_VERSION         0x0b020000
+#define BLACKMAGIC_DECKLINK_API_VERSION_STRING  "11.2"

 #endif  // __DeckLink_API_Version_h__
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/win/DeckLinkAPI.h -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/win/DeckLinkAPI.h
Changed
@@ -4,13 +4,13 @@ /* File created by MIDL compiler version 8.01.0622 */ -/* at Fri Feb 28 12:18:07 2020 +/* at Tue Jan 19 12:14:07 2038 */ /* Compiler settings for ..\..\Blackmagic\DeckLink_SDK_10.11.4\Win\include\DeckLinkAPI.idl: - Oicf, W1, Zp8, env=Win64 (32b run), target_arch=AMD64 8.01.0622 + Oicf, W1, Zp8, env=Win64 (32b run), target_arch=AMD64 8.01.0622 protocol : dce , ms_ext, c_ext, robust - error checks: allocation ref bounds_check enum stub_data - VC __declspec() decoration level: + error checks: allocation ref bounds_check enum stub_data + VC __declspec() decoration level: __declspec(uuid()), __declspec(selectany), __declspec(novtable) DECLSPEC_UUID(), MIDL_INTERFACE() */ @@ -34,14 +34,14 @@ #endif /* __RPCNDR_H_VERSION__ */ -#ifndef __DeckLinkAPI_h_h__ -#define __DeckLinkAPI_h_h__ +#ifndef __DeckLinkAPI_h__ +#define __DeckLinkAPI_h__ #if defined(_MSC_VER) && (_MSC_VER >= 1020) #pragma once #endif -/* Forward Declarations */ +/* Forward Declarations */ #ifndef __IDeckLinkTimecode_FWD_DEFINED__ #define __IDeckLinkTimecode_FWD_DEFINED__ @@ -410,11 +410,39 @@ #endif /* __IDeckLinkNotification_FWD_DEFINED__ */ -#ifndef __IDeckLinkAttributes_FWD_DEFINED__ -#define __IDeckLinkAttributes_FWD_DEFINED__ -typedef interface IDeckLinkAttributes IDeckLinkAttributes; +#ifndef __IDeckLinkProfileAttributes_FWD_DEFINED__ +#define __IDeckLinkProfileAttributes_FWD_DEFINED__ +typedef interface IDeckLinkProfileAttributes IDeckLinkProfileAttributes; -#endif /* __IDeckLinkAttributes_FWD_DEFINED__ */ +#endif /* __IDeckLinkProfileAttributes_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkProfileIterator_FWD_DEFINED__ +#define __IDeckLinkProfileIterator_FWD_DEFINED__ +typedef interface IDeckLinkProfileIterator IDeckLinkProfileIterator; + +#endif /* __IDeckLinkProfileIterator_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkProfile_FWD_DEFINED__ +#define __IDeckLinkProfile_FWD_DEFINED__ +typedef interface IDeckLinkProfile IDeckLinkProfile; + +#endif /* __IDeckLinkProfile_FWD_DEFINED__ */ + + 
+#ifndef __IDeckLinkProfileCallback_FWD_DEFINED__ +#define __IDeckLinkProfileCallback_FWD_DEFINED__ +typedef interface IDeckLinkProfileCallback IDeckLinkProfileCallback; + +#endif /* __IDeckLinkProfileCallback_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkProfileManager_FWD_DEFINED__ +#define __IDeckLinkProfileManager_FWD_DEFINED__ +typedef interface IDeckLinkProfileManager IDeckLinkProfileManager; + +#endif /* __IDeckLinkProfileManager_FWD_DEFINED__ */ #ifndef __IDeckLinkStatus_FWD_DEFINED__ @@ -536,6 +564,72 @@ #endif /* __CDeckLinkVideoFrameAncillaryPackets_FWD_DEFINED__ */ +#ifndef __IDeckLinkConfiguration_v10_11_FWD_DEFINED__ +#define __IDeckLinkConfiguration_v10_11_FWD_DEFINED__ +typedef interface IDeckLinkConfiguration_v10_11 IDeckLinkConfiguration_v10_11; + +#endif /* __IDeckLinkConfiguration_v10_11_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkAttributes_v10_11_FWD_DEFINED__ +#define __IDeckLinkAttributes_v10_11_FWD_DEFINED__ +typedef interface IDeckLinkAttributes_v10_11 IDeckLinkAttributes_v10_11; + +#endif /* __IDeckLinkAttributes_v10_11_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkNotification_v10_11_FWD_DEFINED__ +#define __IDeckLinkNotification_v10_11_FWD_DEFINED__ +typedef interface IDeckLinkNotification_v10_11 IDeckLinkNotification_v10_11; + +#endif /* __IDeckLinkNotification_v10_11_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkOutput_v10_11_FWD_DEFINED__ +#define __IDeckLinkOutput_v10_11_FWD_DEFINED__ +typedef interface IDeckLinkOutput_v10_11 IDeckLinkOutput_v10_11; + +#endif /* __IDeckLinkOutput_v10_11_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkInput_v10_11_FWD_DEFINED__ +#define __IDeckLinkInput_v10_11_FWD_DEFINED__ +typedef interface IDeckLinkInput_v10_11 IDeckLinkInput_v10_11; + +#endif /* __IDeckLinkInput_v10_11_FWD_DEFINED__ */ + + +#ifndef __IDeckLinkEncoderInput_v10_11_FWD_DEFINED__ +#define __IDeckLinkEncoderInput_v10_11_FWD_DEFINED__ +typedef interface IDeckLinkEncoderInput_v10_11 IDeckLinkEncoderInput_v10_11; + +#endif /* 
__IDeckLinkEncoderInput_v10_11_FWD_DEFINED__ */ + + +#ifndef __CDeckLinkIterator_v10_11_FWD_DEFINED__ +#define __CDeckLinkIterator_v10_11_FWD_DEFINED__ + +#ifdef __cplusplus +typedef class CDeckLinkIterator_v10_11 CDeckLinkIterator_v10_11; +#else +typedef struct CDeckLinkIterator_v10_11 CDeckLinkIterator_v10_11; +#endif /* __cplusplus */ + +#endif /* __CDeckLinkIterator_v10_11_FWD_DEFINED__ */ + + +#ifndef __CDeckLinkDiscovery_v10_11_FWD_DEFINED__ +#define __CDeckLinkDiscovery_v10_11_FWD_DEFINED__ + +#ifdef __cplusplus +typedef class CDeckLinkDiscovery_v10_11 CDeckLinkDiscovery_v10_11; +#else +typedef struct CDeckLinkDiscovery_v10_11 CDeckLinkDiscovery_v10_11; +#endif /* __cplusplus */ + +#endif /* __CDeckLinkDiscovery_v10_11_FWD_DEFINED__ */ + + #ifndef __IDeckLinkConfiguration_v10_9_FWD_DEFINED__ #define __IDeckLinkConfiguration_v10_9_FWD_DEFINED__ typedef interface IDeckLinkConfiguration_v10_9 IDeckLinkConfiguration_v10_9; @@ -879,7 +973,7 @@ #ifdef __cplusplus extern "C"{ -#endif +#endif @@ -887,7 +981,7 @@ #define __DeckLinkAPI_LIBRARY_DEFINED__ /* library DeckLinkAPI */ -/* [helpstring][version][uuid] */ +/* [helpstring][version][uuid] */ typedef LONGLONG BMDTimeValue; @@ -902,26 +996,29 @@ typedef enum _BMDTimecodeFlags BMDTimecodeFlags; #endif -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDTimecodeFlags { bmdTimecodeFlagDefault = 0, bmdTimecodeIsDropFrame = ( 1 << 0 ) , bmdTimecodeFieldMark = ( 1 << 1 ) , - bmdTimecodeColorFrame = ( 1 << 2 ) + bmdTimecodeColorFrame = ( 1 << 2 ) , + bmdTimecodeEmbedRecordingTrigger = ( 1 << 3 ) , + bmdTimecodeRecordingTriggered = ( 1 << 4 ) } ; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoConnection { + bmdVideoConnectionUnspecified = 0, bmdVideoConnectionSDI = ( 1 << 0 ) , bmdVideoConnectionHDMI = ( 1 << 1 ) , bmdVideoConnectionOpticalSDI = ( 1 << 2 ) , bmdVideoConnectionComponent = ( 1 << 3 ) , bmdVideoConnectionComposite = ( 1 << 4 ) , - bmdVideoConnectionSVideo = ( 1 << 5 ) + bmdVideoConnectionSVideo = ( 1 
<< 5 ) } BMDVideoConnection; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioConnection { bmdAudioConnectionEmbedded = ( 1 << 0 ) , @@ -930,14 +1027,14 @@ bmdAudioConnectionAnalogXLR = ( 1 << 3 ) , bmdAudioConnectionAnalogRCA = ( 1 << 4 ) , bmdAudioConnectionMicrophone = ( 1 << 5 ) , - bmdAudioConnectionHeadphones = ( 1 << 6 ) + bmdAudioConnectionHeadphones = ( 1 << 6 ) } BMDAudioConnection; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckControlConnection { bmdDeckControlConnectionRS422Remote1 = ( 1 << 0 ) , - bmdDeckControlConnectionRS422Remote2 = ( 1 << 1 ) + bmdDeckControlConnectionRS422Remote2 = ( 1 << 1 ) } BMDDeckControlConnection; @@ -946,7 +1043,7 @@ typedef enum _BMDDisplayModeFlags BMDDisplayModeFlags; #endif -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDisplayMode { bmdModeNTSC = 0x6e747363, @@ -959,9 +1056,16 @@ bmdModeHD1080p25 = 0x48703235, bmdModeHD1080p2997 = 0x48703239, bmdModeHD1080p30 = 0x48703330, + bmdModeHD1080p4795 = 0x48703437, + bmdModeHD1080p48 = 0x48703438, bmdModeHD1080p50 = 0x48703530, bmdModeHD1080p5994 = 0x48703539, bmdModeHD1080p6000 = 0x48703630, + bmdModeHD1080p9590 = 0x48703935, + bmdModeHD1080p96 = 0x48703936, + bmdModeHD1080p100 = 0x48703130, + bmdModeHD1080p11988 = 0x48703131, + bmdModeHD1080p120 = 0x48703132, bmdModeHD1080i50 = 0x48693530, bmdModeHD1080i5994 = 0x48693539, bmdModeHD1080i6000 = 0x48693630, @@ -976,30 +1080,53 @@ bmdMode2kDCI25 = 0x32643235, bmdMode2kDCI2997 = 0x32643239, bmdMode2kDCI30 = 0x32643330, + bmdMode2kDCI4795 = 0x32643437, + bmdMode2kDCI48 = 0x32643438, bmdMode2kDCI50 = 0x32643530, bmdMode2kDCI5994 = 0x32643539, bmdMode2kDCI60 = 0x32643630, + bmdMode2kDCI9590 = 0x32643935, + bmdMode2kDCI96 = 0x32643936, + bmdMode2kDCI100 = 0x32643130, + bmdMode2kDCI11988 = 0x32643131, + bmdMode2kDCI120 = 0x32643132, bmdMode4K2160p2398 = 0x346b3233, bmdMode4K2160p24 = 0x346b3234, bmdMode4K2160p25 = 0x346b3235, bmdMode4K2160p2997 = 0x346b3239, bmdMode4K2160p30 = 0x346b3330, + 
bmdMode4K2160p4795 = 0x346b3437, + bmdMode4K2160p48 = 0x346b3438, bmdMode4K2160p50 = 0x346b3530, bmdMode4K2160p5994 = 0x346b3539, bmdMode4K2160p60 = 0x346b3630, + bmdMode4K2160p9590 = 0x346b3935, + bmdMode4K2160p96 = 0x346b3936, + bmdMode4K2160p100 = 0x346b3130, + bmdMode4K2160p11988 = 0x346b3131, + bmdMode4K2160p120 = 0x346b3132, bmdMode4kDCI2398 = 0x34643233, bmdMode4kDCI24 = 0x34643234, bmdMode4kDCI25 = 0x34643235, bmdMode4kDCI2997 = 0x34643239, bmdMode4kDCI30 = 0x34643330, + bmdMode4kDCI4795 = 0x34643437, + bmdMode4kDCI48 = 0x34643438, bmdMode4kDCI50 = 0x34643530, bmdMode4kDCI5994 = 0x34643539, bmdMode4kDCI60 = 0x34643630, + bmdMode4kDCI9590 = 0x34643935, + bmdMode4kDCI96 = 0x34643936, + bmdMode4kDCI100 = 0x34643130, + bmdMode4kDCI11988 = 0x34643131, + bmdMode4kDCI120 = 0x34643132, bmdMode8K4320p2398 = 0x386b3233, bmdMode8K4320p24 = 0x386b3234, bmdMode8K4320p25 = 0x386b3235, bmdMode8K4320p2997 = 0x386b3239, bmdMode8K4320p30 = 0x386b3330, + bmdMode8K4320p4795 = 0x386b3437, + bmdMode8K4320p48 = 0x386b3438, bmdMode8K4320p50 = 0x386b3530, bmdMode8K4320p5994 = 0x386b3539, bmdMode8K4320p60 = 0x386b3630, @@ -1008,15 +1135,33 @@ bmdMode8kDCI25 = 0x38643235, bmdMode8kDCI2997 = 0x38643239, bmdMode8kDCI30 = 0x38643330, + bmdMode8kDCI4795 = 0x38643437, + bmdMode8kDCI48 = 0x38643438, bmdMode8kDCI50 = 0x38643530, bmdMode8kDCI5994 = 0x38643539, bmdMode8kDCI60 = 0x38643630, + bmdMode640x480p60 = 0x76676136, + bmdMode800x600p60 = 0x73766736, + bmdMode1440x900p50 = 0x77786735, + bmdMode1440x900p60 = 0x77786736, + bmdMode1440x1080p50 = 0x73786735, + bmdMode1440x1080p60 = 0x73786736, + bmdMode1600x1200p50 = 0x75786735, + bmdMode1600x1200p60 = 0x75786736, + bmdMode1920x1200p50 = 0x77757835, + bmdMode1920x1200p60 = 0x77757836, + bmdMode1920x1440p50 = 0x31393435, + bmdMode1920x1440p60 = 0x31393436, + bmdMode2560x1440p50 = 0x77716835, + bmdMode2560x1440p60 = 0x77716836, + bmdMode2560x1600p50 = 0x77717835, + bmdMode2560x1600p60 = 0x77717836, bmdModeCintelRAW = 0x72776369, 
bmdModeCintelCompressedRAW = 0x72776363, bmdModeUnknown = 0x69756e6b } BMDDisplayMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDFieldDominance { bmdUnknownFieldDominance = 0, @@ -1026,9 +1171,10 @@ bmdProgressiveSegmentedFrame = 0x70736620 } BMDFieldDominance; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDPixelFormat { + bmdFormatUnspecified = 0, bmdFormat8BitYUV = 0x32767579, bmdFormat10BitYUV = 0x76323130, bmdFormat8BitARGB = 32, @@ -1044,13 +1190,13 @@ bmdFormat12BitRAWJPEG = 0x72313670 } BMDPixelFormat; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDDisplayModeFlags { bmdDisplayModeSupports3D = ( 1 << 0 ) , bmdDisplayModeColorspaceRec601 = ( 1 << 1 ) , bmdDisplayModeColorspaceRec709 = ( 1 << 2 ) , - bmdDisplayModeColorspaceRec2020 = ( 1 << 3 ) + bmdDisplayModeColorspaceRec2020 = ( 1 << 3 ) } ; @@ -1059,14 +1205,13 @@ #if 0 #endif -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkConfigurationID { bmdDeckLinkConfigSwapSerialRxTx = 0x73737274, bmdDeckLinkConfigHDMI3DPackingFormat = 0x33647066, bmdDeckLinkConfigBypass = 0x62797073, bmdDeckLinkConfigClockTimingAdjustment = 0x63746164, - bmdDeckLinkConfigDuplexMode = 0x64757078, bmdDeckLinkConfigAnalogAudioConsumerLevels = 0x6161636c, bmdDeckLinkConfigFieldFlickerRemoval = 0x66646672, bmdDeckLinkConfigHD1080p24ToHD1080i5994Conversion = 0x746f3539, @@ -1086,6 +1231,8 @@ bmdDeckLinkConfigDefaultVideoOutputMode = 0x64766f6d, bmdDeckLinkConfigDefaultVideoOutputModeFlags = 0x64766f66, bmdDeckLinkConfigSDIOutputLinkConfiguration = 0x736f6c63, + bmdDeckLinkConfigHDMITimecodePacking = 0x6874706b, + bmdDeckLinkConfigPlaybackGroup = 0x706c6772, bmdDeckLinkConfigVideoOutputComponentLumaGain = 0x6f636c67, bmdDeckLinkConfigVideoOutputComponentChromaBlueGain = 0x6f636362, bmdDeckLinkConfigVideoOutputComponentChromaRedGain = 0x6f636372, @@ -1105,6 +1252,7 @@ bmdDeckLinkConfigVANCSourceLine2Mapping = 0x76736c32, bmdDeckLinkConfigVANCSourceLine3Mapping = 0x76736c33, 
bmdDeckLinkConfigCapturePassThroughMode = 0x6370746d, + bmdDeckLinkConfigCaptureGroup = 0x63706772, bmdDeckLinkConfigVideoInputComponentLumaGain = 0x69636c67, bmdDeckLinkConfigVideoInputComponentChromaBlueGain = 0x69636362, bmdDeckLinkConfigVideoInputComponentChromaRedGain = 0x69636372, @@ -1136,7 +1284,7 @@ bmdDeckLinkConfigDeckControlConnection = 0x6463636f } BMDDeckLinkConfigurationID; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkEncoderConfigurationID { bmdDeckLinkEncoderConfigPreferredBitDepth = 0x65706272, @@ -1158,7 +1306,7 @@ typedef enum _BMDDeckControlExportModeOpsFlags BMDDeckControlExportModeOpsFlags; #endif -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckControlMode { bmdDeckControlNotOpened = 0x6e746f70, @@ -1167,7 +1315,7 @@ bmdDeckControlCaptureMode = 0x6361706d } BMDDeckControlMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckControlEvent { bmdDeckControlAbortedEvent = 0x61627465, @@ -1177,7 +1325,7 @@ bmdDeckControlCaptureCompleteEvent = 0x63636576 } BMDDeckControlEvent; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckControlVTRControlState { bmdDeckControlNotInVTRControlMode = 0x6e76636d, @@ -1191,15 +1339,15 @@ bmdDeckControlVTRControlStopped = 0x7674726f } BMDDeckControlVTRControlState; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDDeckControlStatusFlags { bmdDeckControlStatusDeckConnected = ( 1 << 0 ) , bmdDeckControlStatusRemoteMode = ( 1 << 1 ) , bmdDeckControlStatusRecordInhibited = ( 1 << 2 ) , - bmdDeckControlStatusCassetteOut = ( 1 << 3 ) + bmdDeckControlStatusCassetteOut = ( 1 << 3 ) } ; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDDeckControlExportModeOpsFlags { bmdDeckControlExportModeInsertVideo = ( 1 << 0 ) , @@ -1218,9 +1366,9 @@ bmdDeckControlExportModeInsertTimeCode = ( 1 << 13 ) , bmdDeckControlExportModeInsertAssemble = ( 1 << 14 ) , bmdDeckControlExportModeInsertPreview = ( 1 << 15 ) , - bmdDeckControlUseManualExport = ( 1 << 16 ) + bmdDeckControlUseManualExport 
= ( 1 << 16 ) } ; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckControlError { bmdDeckControlNoError = 0x6e6f6572, @@ -1245,7 +1393,7 @@ #if 0 #endif -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingDeviceMode { bmdStreamingDeviceIdle = 0x69646c65, @@ -1254,7 +1402,7 @@ bmdStreamingDeviceUnknown = 0x6d756e6b } BMDStreamingDeviceMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingEncodingFrameRate { bmdStreamingEncodedFrameRate50i = 0x65353069, @@ -1270,21 +1418,21 @@ bmdStreamingEncodedFrameRate60p = 0x65363070 } BMDStreamingEncodingFrameRate; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingEncodingSupport { bmdStreamingEncodingModeNotSupported = 0, bmdStreamingEncodingModeSupported = ( bmdStreamingEncodingModeNotSupported + 1 ) , - bmdStreamingEncodingModeSupportedWithChanges = ( bmdStreamingEncodingModeSupported + 1 ) + bmdStreamingEncodingModeSupportedWithChanges = ( bmdStreamingEncodingModeSupported + 1 ) } BMDStreamingEncodingSupport; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingVideoCodec { bmdStreamingVideoCodecH264 = 0x48323634 } BMDStreamingVideoCodec; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingH264Profile { bmdStreamingH264ProfileHigh = 0x68696768, @@ -1292,7 +1440,7 @@ bmdStreamingH264ProfileBaseline = 0x62617365 } BMDStreamingH264Profile; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingH264Level { bmdStreamingH264Level12 = 0x6c763132, @@ -1308,20 +1456,20 @@ bmdStreamingH264Level42 = 0x6c763432 } BMDStreamingH264Level; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingH264EntropyCoding { bmdStreamingH264EntropyCodingCAVLC = 0x45564c43, bmdStreamingH264EntropyCodingCABAC = 0x45424143 } BMDStreamingH264EntropyCoding; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDStreamingAudioCodec { bmdStreamingAudioCodecAAC = 0x41414320 } BMDStreamingAudioCodec; -typedef /* [v1_enum] */ +typedef 
/* [v1_enum] */ enum _BMDStreamingEncodingModePropertyID { bmdStreamingEncodingPropertyVideoFrameRate = 0x76667274, @@ -1370,24 +1518,37 @@ typedef enum _BMDDeviceBusyState BMDDeviceBusyState; #endif -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoOutputFlags { bmdVideoOutputFlagDefault = 0, bmdVideoOutputVANC = ( 1 << 0 ) , bmdVideoOutputVITC = ( 1 << 1 ) , bmdVideoOutputRP188 = ( 1 << 2 ) , - bmdVideoOutputDualStream3D = ( 1 << 4 ) + bmdVideoOutputDualStream3D = ( 1 << 4 ) , + bmdVideoOutputSynchronizeToPlaybackGroup = ( 1 << 6 ) } BMDVideoOutputFlags; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ +enum _BMDSupportedVideoModeFlags + { + bmdSupportedVideoModeDefault = 0, + bmdSupportedVideoModeKeying = ( 1 << 0 ) , + bmdSupportedVideoModeDualStream3D = ( 1 << 1 ) , + bmdSupportedVideoModeSDISingleLink = ( 1 << 2 ) , + bmdSupportedVideoModeSDIDualLink = ( 1 << 3 ) , + bmdSupportedVideoModeSDIQuadLink = ( 1 << 4 ) , + bmdSupportedVideoModeInAnyProfile = ( 1 << 5 ) + } BMDSupportedVideoModeFlags; + +typedef /* [v1_enum] */ enum _BMDPacketType { bmdPacketTypeStreamInterruptedMarker = 0x73696e74, bmdPacketTypeStreamData = 0x73646174 } BMDPacketType; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDFrameFlags { bmdFrameFlagDefault = 0, @@ -1395,88 +1556,81 @@ bmdFrameContainsHDRMetadata = ( 1 << 1 ) , bmdFrameContainsCintelMetadata = ( 1 << 2 ) , bmdFrameCapturedAsPsF = ( 1 << 30 ) , - bmdFrameHasNoInputSource = ( 1 << 31 ) + bmdFrameHasNoInputSource = ( 1 << 31 ) } ; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDVideoInputFlags { bmdVideoInputFlagDefault = 0, bmdVideoInputEnableFormatDetection = ( 1 << 0 ) , - bmdVideoInputDualStream3D = ( 1 << 1 ) + bmdVideoInputDualStream3D = ( 1 << 1 ) , + bmdVideoInputSynchronizeToCaptureGroup = ( 1 << 2 ) } ; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDVideoInputFormatChangedEvents { bmdVideoInputDisplayModeChanged = ( 1 << 0 ) , bmdVideoInputFieldDominanceChanged = ( 1 << 1 ) , - bmdVideoInputColorspaceChanged = ( 1 << 
2 ) + bmdVideoInputColorspaceChanged = ( 1 << 2 ) } ; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDDetectedVideoInputFormatFlags { bmdDetectedVideoInputYCbCr422 = ( 1 << 0 ) , bmdDetectedVideoInputRGB444 = ( 1 << 1 ) , - bmdDetectedVideoInputDualStream3D = ( 1 << 2 ) + bmdDetectedVideoInputDualStream3D = ( 1 << 2 ) } ; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDDeckLinkCapturePassthroughMode { bmdDeckLinkCapturePassthroughModeDisabled = 0x70646973, bmdDeckLinkCapturePassthroughModeDirect = 0x70646972, bmdDeckLinkCapturePassthroughModeCleanSwitch = 0x70636c6e } ; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDOutputFrameCompletionResult { bmdOutputFrameCompleted = 0, bmdOutputFrameDisplayedLate = ( bmdOutputFrameCompleted + 1 ) , bmdOutputFrameDropped = ( bmdOutputFrameDisplayedLate + 1 ) , - bmdOutputFrameFlushed = ( bmdOutputFrameDropped + 1 ) + bmdOutputFrameFlushed = ( bmdOutputFrameDropped + 1 ) } BMDOutputFrameCompletionResult; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDReferenceStatus { bmdReferenceNotSupportedByHardware = ( 1 << 0 ) , - bmdReferenceLocked = ( 1 << 1 ) + bmdReferenceLocked = ( 1 << 1 ) } BMDReferenceStatus; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioFormat { bmdAudioFormatPCM = 0x6c70636d } BMDAudioFormat; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioSampleRate { bmdAudioSampleRate48kHz = 48000 } BMDAudioSampleRate; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioSampleType { bmdAudioSampleType16bitInteger = 16, bmdAudioSampleType32bitInteger = 32 } BMDAudioSampleType; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioOutputStreamType { bmdAudioOutputStreamContinuous = 0, bmdAudioOutputStreamContinuousDontResample = ( bmdAudioOutputStreamContinuous + 1 ) , - bmdAudioOutputStreamTimestamped = ( bmdAudioOutputStreamContinuousDontResample + 1 ) + bmdAudioOutputStreamTimestamped = ( bmdAudioOutputStreamContinuousDontResample + 1 ) } 
BMDAudioOutputStreamType; -typedef /* [v1_enum] */ -enum _BMDDisplayModeSupport - { - bmdDisplayModeNotSupported = 0, - bmdDisplayModeSupported = ( bmdDisplayModeNotSupported + 1 ) , - bmdDisplayModeSupportedWithConversion = ( bmdDisplayModeSupported + 1 ) - } BMDDisplayModeSupport; - -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAncillaryPacketFormat { bmdAncillaryPacketFormatUInt8 = 0x75693038, @@ -1484,32 +1638,33 @@ bmdAncillaryPacketFormatYCbCr10 = 0x76323130 } BMDAncillaryPacketFormat; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDTimecodeFormat { bmdTimecodeRP188VITC1 = 0x72707631, bmdTimecodeRP188VITC2 = 0x72703132, bmdTimecodeRP188LTC = 0x72706c74, + bmdTimecodeRP188HighFrameRate = 0x72706872, bmdTimecodeRP188Any = 0x72703138, bmdTimecodeVITC = 0x76697463, bmdTimecodeVITCField2 = 0x76697432, bmdTimecodeSerial = 0x73657269 } BMDTimecodeFormat; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDAnalogVideoFlags { bmdAnalogVideoFlagCompositeSetup75 = ( 1 << 0 ) , - bmdAnalogVideoFlagComponentBetacamLevels = ( 1 << 1 ) + bmdAnalogVideoFlagComponentBetacamLevels = ( 1 << 1 ) } ; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioOutputAnalogAESSwitch { bmdAudioOutputSwitchAESEBU = 0x61657320, bmdAudioOutputSwitchAnalog = 0x616e6c67 } BMDAudioOutputAnalogAESSwitch; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoOutputConversionMode { bmdNoVideoOutputConversion = 0x6e6f6e65, @@ -1528,7 +1683,7 @@ bmdVideoOutputHardwarePillarbox1080iUpconversion = 0x75703169 } BMDVideoOutputConversionMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoInputConversionMode { bmdNoVideoInputConversion = 0x6e6f6e65, @@ -1540,7 +1695,7 @@ bmdVideoInputAnamorphicUpconversion = 0x616d7570 } BMDVideoInputConversionMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideo3DPackingFormat { bmdVideo3DPackingSidebySideHalf = 0x73627368, @@ -1551,21 +1706,21 @@ bmdVideo3DPackingRightOnly = 0x72696768 } 
BMDVideo3DPackingFormat; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDIdleVideoOutputOperation { bmdIdleVideoOutputBlack = 0x626c6163, bmdIdleVideoOutputLastFrame = 0x6c616661 } BMDIdleVideoOutputOperation; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoEncoderFrameCodingMode { bmdVideoEncoderFrameCodingModeInter = 0x696e7465, bmdVideoEncoderFrameCodingModeIntra = 0x696e7472 } BMDVideoEncoderFrameCodingMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDNxHRLevel { bmdDNxHRLevelSQ = 0x646e7371, @@ -1575,7 +1730,7 @@ bmdDNxHRLevel444 = 0x64343434 } BMDDNxHRLevel; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDLinkConfiguration { bmdLinkConfigurationSingleLink = 0x6c63736c, @@ -1583,7 +1738,7 @@ bmdLinkConfigurationQuadLink = 0x6c63716c } BMDLinkConfiguration; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeviceInterface { bmdDeviceInterfacePCI = 0x70636920, @@ -1591,7 +1746,7 @@ bmdDeviceInterfaceThunderbolt = 0x7468756e } BMDDeviceInterface; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDColorspace { bmdColorspaceRec601 = 0x72363031, @@ -1599,29 +1754,27 @@ bmdColorspaceRec2020 = 0x32303230 } BMDColorspace; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDynamicRange { bmdDynamicRangeSDR = 0, bmdDynamicRangeHDRStaticPQ = ( 1 << 29 ) , - bmdDynamicRangeHDRStaticHLG = ( 1 << 30 ) + bmdDynamicRangeHDRStaticHLG = ( 1 << 30 ) } BMDDynamicRange; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkHDMIInputEDIDID { bmdDeckLinkHDMIInputEDIDDynamicRange = 0x48494479 } BMDDeckLinkHDMIInputEDIDID; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkFrameMetadataID { bmdDeckLinkFrameMetadataColorspace = 0x63737063, bmdDeckLinkFrameMetadataHDRElectroOpticalTransferFunc = 0x656f7466, bmdDeckLinkFrameMetadataCintelFilmType = 0x63667479, bmdDeckLinkFrameMetadataCintelFilmGauge = 0x63666761, - bmdDeckLinkFrameMetadataCintelOffsetDetectedHorizontal = 0x6f646668, 
- bmdDeckLinkFrameMetadataCintelOffsetDetectedVertical = 0x6f646676, bmdDeckLinkFrameMetadataCintelKeykodeLow = 0x636b6b6c, bmdDeckLinkFrameMetadataCintelKeykodeHigh = 0x636b6b68, bmdDeckLinkFrameMetadataCintelTile1Size = 0x63743173, @@ -1668,22 +1821,35 @@ bmdDeckLinkFrameMetadataCintelGainBlue = 0x4c66426c, bmdDeckLinkFrameMetadataCintelLiftRed = 0x476e5264, bmdDeckLinkFrameMetadataCintelLiftGreen = 0x476e4772, - bmdDeckLinkFrameMetadataCintelLiftBlue = 0x476e426c + bmdDeckLinkFrameMetadataCintelLiftBlue = 0x476e426c, + bmdDeckLinkFrameMetadataCintelHDRGainRed = 0x48475264, + bmdDeckLinkFrameMetadataCintelHDRGainGreen = 0x48474772, + bmdDeckLinkFrameMetadataCintelHDRGainBlue = 0x4847426c } BMDDeckLinkFrameMetadataID; -typedef /* [v1_enum] */ -enum _BMDDuplexMode +typedef /* [v1_enum] */ +enum _BMDProfileID { - bmdDuplexModeFull = 0x66647570, - bmdDuplexModeHalf = 0x68647570 - } BMDDuplexMode; + bmdProfileOneSubDeviceFullDuplex = 0x31646664, + bmdProfileOneSubDeviceHalfDuplex = 0x31646864, + bmdProfileTwoSubDevicesFullDuplex = 0x32646664, + bmdProfileTwoSubDevicesHalfDuplex = 0x32646864, + bmdProfileFourSubDevicesHalfDuplex = 0x34646864 + } BMDProfileID; + +typedef /* [v1_enum] */ +enum _BMDHDMITimecodePacking + { + bmdHDMITimecodePackingIEEEOUI000085 = 0x8500, + bmdHDMITimecodePackingIEEEOUI080046 = 0x8004601, + bmdHDMITimecodePackingIEEEOUI5CF9F0 = 0x5cf9f003 + } BMDHDMITimecodePacking; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkAttributeID { BMDDeckLinkSupportsInternalKeying = 0x6b657969, BMDDeckLinkSupportsExternalKeying = 0x6b657965, - BMDDeckLinkSupportsHDKeying = 0x6b657968, BMDDeckLinkSupportsInputFormatDetection = 0x696e6664, BMDDeckLinkHasReferenceInput = 0x6872696e, BMDDeckLinkHasSerialPort = 0x68737074, @@ -1692,16 +1858,19 @@ BMDDeckLinkHasVideoInputAntiAliasingFilter = 0x6161666c, BMDDeckLinkHasBypass = 0x62797073, BMDDeckLinkSupportsClockTimingAdjustment = 0x63746164, - BMDDeckLinkSupportsFullDuplex = 0x66647570, 
BMDDeckLinkSupportsFullFrameReferenceInputTimingOffset = 0x6672696e, BMDDeckLinkSupportsSMPTELevelAOutput = 0x6c766c61, BMDDeckLinkSupportsDualLinkSDI = 0x73646c73, BMDDeckLinkSupportsQuadLinkSDI = 0x73716c73, BMDDeckLinkSupportsIdleOutput = 0x69646f75, + BMDDeckLinkVANCRequires10BitYUVVideoFrames = 0x76696f59, BMDDeckLinkHasLTCTimecodeInput = 0x686c7463, - BMDDeckLinkSupportsDuplexModeConfiguration = 0x64757078, BMDDeckLinkSupportsHDRMetadata = 0x6864726d, BMDDeckLinkSupportsColorspaceMetadata = 0x636d6574, + BMDDeckLinkSupportsHDMITimecode = 0x6874696d, + BMDDeckLinkSupportsHighFrameRateTimecode = 0x48465254, + BMDDeckLinkSupportsSynchronizeToCaptureGroup = 0x73746367, + BMDDeckLinkSupportsSynchronizeToPlaybackGroup = 0x73747067, BMDDeckLinkMaximumAudioChannels = 0x6d616368, BMDDeckLinkMaximumAnalogAudioInputChannels = 0x69616368, BMDDeckLinkMaximumAnalogAudioOutputChannels = 0x61616368, @@ -1721,7 +1890,8 @@ BMDDeckLinkAudioInputXLRChannelCount = 0x61697863, BMDDeckLinkAudioOutputRCAChannelCount = 0x616f7263, BMDDeckLinkAudioOutputXLRChannelCount = 0x616f7863, - BMDDeckLinkPairedDevicePersistentID = 0x70706964, + BMDDeckLinkProfileID = 0x70726964, + BMDDeckLinkDuplex = 0x64757078, BMDDeckLinkVideoInputGainMinimum = 0x7669676d, BMDDeckLinkVideoInputGainMaximum = 0x76696778, BMDDeckLinkVideoOutputGainMinimum = 0x766f676d, @@ -1735,13 +1905,13 @@ BMDDeckLinkDeviceHandle = 0x64657668 } BMDDeckLinkAttributeID; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkAPIInformationID { BMDDeckLinkAPIVersion = 0x76657273 } BMDDeckLinkAPIInformationID; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkStatusID { bmdDeckLinkStatusDetectedVideoInputMode = 0x6476696d, @@ -1756,7 +1926,6 @@ bmdDeckLinkStatusLastVideoOutputPixelFormat = 0x6f706978, bmdDeckLinkStatusReferenceSignalMode = 0x7265666d, bmdDeckLinkStatusReferenceSignalFlags = 0x72656666, - bmdDeckLinkStatusDuplexMode = 0x64757078, bmdDeckLinkStatusBusy = 0x62757379, 
bmdDeckLinkStatusInterchangeablePanelType = 0x69637074, bmdDeckLinkStatusDeviceTemperature = 0x64746d70, @@ -1765,44 +1934,44 @@ bmdDeckLinkStatusReceivedEDID = 0x65646964 } BMDDeckLinkStatusID; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkVideoStatusFlags { bmdDeckLinkVideoStatusPsF = ( 1 << 0 ) , - bmdDeckLinkVideoStatusDualStream3D = ( 1 << 1 ) + bmdDeckLinkVideoStatusDualStream3D = ( 1 << 1 ) } BMDDeckLinkVideoStatusFlags; -typedef /* [v1_enum] */ -enum _BMDDuplexStatus +typedef /* [v1_enum] */ +enum _BMDDuplexMode { - bmdDuplexStatusFullDuplex = 0x66647570, - bmdDuplexStatusHalfDuplex = 0x68647570, - bmdDuplexStatusSimplex = 0x73706c78, - bmdDuplexStatusInactive = 0x696e6163 - } BMDDuplexStatus; + bmdDuplexFull = 0x64786675, + bmdDuplexHalf = 0x64786861, + bmdDuplexSimplex = 0x64787370, + bmdDuplexInactive = 0x6478696e + } BMDDuplexMode; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDPanelType { bmdPanelNotDetected = 0x6e706e6c, bmdPanelTeranexMiniSmartPanel = 0x746d736d } BMDPanelType; -/* [v1_enum] */ +/* [v1_enum] */ enum _BMDDeviceBusyState { bmdDeviceCaptureBusy = ( 1 << 0 ) , bmdDevicePlaybackBusy = ( 1 << 1 ) , - bmdDeviceSerialPortBusy = ( 1 << 2 ) + bmdDeviceSerialPortBusy = ( 1 << 2 ) } ; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoIOSupport { bmdDeviceSupportsCapture = ( 1 << 0 ) , - bmdDeviceSupportsPlayback = ( 1 << 1 ) + bmdDeviceSupportsPlayback = ( 1 << 1 ) } BMDVideoIOSupport; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMD3DPreviewFormat { bmd3DPreviewFormatDefault = 0x64656661, @@ -1812,7 +1981,7 @@ bmd3DPreviewFormatTopBottom = 0x746f7062 } BMD3DPreviewFormat; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDNotifications { bmdPreferencesChanged = 0x70726566, @@ -1855,46 +2024,98 @@ -typedef /* [v1_enum] */ + + + + +typedef /* [v1_enum] */ +enum _BMDDisplayModeSupport_v10_11 + { + bmdDisplayModeNotSupported_v10_11 = 0, + bmdDisplayModeSupported_v10_11 = ( 
bmdDisplayModeNotSupported_v10_11 + 1 ) , + bmdDisplayModeSupportedWithConversion_v10_11 = ( bmdDisplayModeSupported_v10_11 + 1 ) + } BMDDisplayModeSupport_v10_11; + +typedef /* [v1_enum] */ +enum _BMDDuplexMode_v10_11 + { + bmdDuplexModeFull_v10_11 = 0x66647570, + bmdDuplexModeHalf_v10_11 = 0x68647570 + } BMDDuplexMode_v10_11; + +typedef /* [v1_enum] */ +enum _BMDDeckLinkConfigurationID_v10_11 + { + bmdDeckLinkConfigDuplexMode_v10_11 = 0x64757078 + } BMDDeckLinkConfigurationID_v10_11; + +typedef /* [v1_enum] */ +enum _BMDDeckLinkAttributeID_v10_11 + { + BMDDeckLinkSupportsDuplexModeConfiguration_v10_11 = 0x64757078, + BMDDeckLinkSupportsHDKeying_v10_11 = 0x6b657968, + BMDDeckLinkPairedDevicePersistentID_v10_11 = 0x70706964, + BMDDeckLinkSupportsFullDuplex_v10_11 = 0x66647570 + } BMDDeckLinkAttributeID_v10_11; + +typedef /* [v1_enum] */ +enum _BMDDeckLinkStatusID_v10_11 + { + bmdDeckLinkStatusDuplexMode_v10_11 = 0x64757078 + } BMDDeckLinkStatusID_v10_11; + +typedef /* [v1_enum] */ +enum _BMDDuplexStatus_v10_11 + { + bmdDuplexFullDuplex_v10_11 = 0x66647570, + bmdDuplexHalfDuplex_v10_11 = 0x68647570, + bmdDuplexSimplex_v10_11 = 0x73706c78, + bmdDuplexInactive_v10_11 = 0x696e6163 + } BMDDuplexStatus_v10_11; + + + + +typedef /* [v1_enum] */ enum _BMDDeckLinkConfigurationID_v10_9 { bmdDeckLinkConfig1080pNotPsF_v10_9 = 0x6670726f } BMDDeckLinkConfigurationID_v10_9; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkAttributeID_v10_6 { BMDDeckLinkSupportsDesktopDisplay_v10_6 = 0x65787464 } BMDDeckLinkAttributeID_v10_6; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDIdleVideoOutputOperation_v10_6 { bmdIdleVideoOutputDesktop_v10_6 = 0x6465736b } BMDIdleVideoOutputOperation_v10_6; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkAttributeID_v10_5 { BMDDeckLinkDeviceBusyState_v10_5 = 0x64627374 } BMDDeckLinkAttributeID_v10_5; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkConfigurationID_v10_4 { 
bmdDeckLinkConfigSingleLinkVideoOutput_v10_4 = 0x73676c6f } BMDDeckLinkConfigurationID_v10_4; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckLinkConfigurationID_v10_2 { bmdDeckLinkConfig3GBpsVideoOutput_v10_2 = 0x33676273 } BMDDeckLinkConfigurationID_v10_2; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDAudioConnection_v10_2 { bmdAudioConnectionEmbedded_v10_2 = 0x656d6264, @@ -1905,7 +2126,7 @@ } BMDAudioConnection_v10_2; -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDDeckControlVTRControlState_v8_1 { bmdDeckControlNotInVTRControlMode_v8_1 = 0x6e76636d, @@ -1918,7 +2139,7 @@ -typedef /* [v1_enum] */ +typedef /* [v1_enum] */ enum _BMDVideoConnection_v7_6 { bmdVideoConnectionSDI_v7_6 = 0x73646920, @@ -1957,75 +2178,75 @@ #define __IDeckLinkTimecode_INTERFACE_DEFINED__ /* interface IDeckLinkTimecode */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkTimecode; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("BC6CFBD3-8317-4325-AC1C-1216391E9340") IDeckLinkTimecode : public IUnknown { public: virtual BMDTimecodeBCD STDMETHODCALLTYPE GetBCD( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetComponents( + + virtual HRESULT STDMETHODCALLTYPE GetComponents( /* [out] */ unsigned char *hours, /* [out] */ unsigned char *minutes, /* [out] */ unsigned char *seconds, /* [out] */ unsigned char *frames) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [out] */ BSTR *timecode) = 0; - + virtual BMDTimecodeFlags STDMETHODCALLTYPE GetFlags( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecodeUserBits( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeUserBits( /* [out] */ BMDTimecodeUserBits *userBits) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkTimecodeVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( 
IDeckLinkTimecode * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkTimecode * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkTimecode * This); - - BMDTimecodeBCD ( STDMETHODCALLTYPE *GetBCD )( + + BMDTimecodeBCD ( STDMETHODCALLTYPE *GetBCD )( IDeckLinkTimecode * This); - - HRESULT ( STDMETHODCALLTYPE *GetComponents )( + + HRESULT ( STDMETHODCALLTYPE *GetComponents )( IDeckLinkTimecode * This, /* [out] */ unsigned char *hours, /* [out] */ unsigned char *minutes, /* [out] */ unsigned char *seconds, /* [out] */ unsigned char *frames); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkTimecode * This, /* [out] */ BSTR *timecode); - - BMDTimecodeFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDTimecodeFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkTimecode * This); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeUserBits )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeUserBits )( IDeckLinkTimecode * This, /* [out] */ BMDTimecodeUserBits *userBits); - + END_INTERFACE } IDeckLinkTimecodeVtbl; @@ -2034,35 +2255,35 @@ CONST_VTBL struct IDeckLinkTimecodeVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkTimecode_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkTimecode_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkTimecode_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkTimecode_GetBCD(This) \ - ( (This)->lpVtbl -> GetBCD(This) ) + ( (This)->lpVtbl -> GetBCD(This) ) #define IDeckLinkTimecode_GetComponents(This,hours,minutes,seconds,frames) \ - ( (This)->lpVtbl 
-> GetComponents(This,hours,minutes,seconds,frames) ) + ( (This)->lpVtbl -> GetComponents(This,hours,minutes,seconds,frames) ) #define IDeckLinkTimecode_GetString(This,timecode) \ - ( (This)->lpVtbl -> GetString(This,timecode) ) + ( (This)->lpVtbl -> GetString(This,timecode) ) #define IDeckLinkTimecode_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkTimecode_GetTimecodeUserBits(This,userBits) \ - ( (This)->lpVtbl -> GetTimecodeUserBits(This,userBits) ) + ( (This)->lpVtbl -> GetTimecodeUserBits(This,userBits) ) #endif /* COBJMACROS */ @@ -2079,45 +2300,45 @@ #define __IDeckLinkDisplayModeIterator_INTERFACE_DEFINED__ /* interface IDeckLinkDisplayModeIterator */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDisplayModeIterator; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("9C88499F-F601-4021-B80B-032E4EB41C35") IDeckLinkDisplayModeIterator : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Next( + virtual HRESULT STDMETHODCALLTYPE Next( /* [out] */ IDeckLinkDisplayMode **deckLinkDisplayMode) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDisplayModeIteratorVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDisplayModeIterator * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDisplayModeIterator * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDisplayModeIterator * This); - - HRESULT ( STDMETHODCALLTYPE *Next )( + + HRESULT ( STDMETHODCALLTYPE *Next )( IDeckLinkDisplayModeIterator * This, /* [out] */ IDeckLinkDisplayMode **deckLinkDisplayMode); - + END_INTERFACE } 
IDeckLinkDisplayModeIteratorVtbl; @@ -2126,23 +2347,23 @@ CONST_VTBL struct IDeckLinkDisplayModeIteratorVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDisplayModeIterator_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDisplayModeIterator_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDisplayModeIterator_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDisplayModeIterator_Next(This,deckLinkDisplayMode) \ - ( (This)->lpVtbl -> Next(This,deckLinkDisplayMode) ) + ( (This)->lpVtbl -> Next(This,deckLinkDisplayMode) ) #endif /* COBJMACROS */ @@ -2159,79 +2380,79 @@ #define __IDeckLinkDisplayMode_INTERFACE_DEFINED__ /* interface IDeckLinkDisplayMode */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDisplayMode; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("3EB2C1AB-0A3D-4523-A3AD-F40D7FB14E78") IDeckLinkDisplayMode : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetName( + virtual HRESULT STDMETHODCALLTYPE GetName( /* [out] */ BSTR *name) = 0; - + virtual BMDDisplayMode STDMETHODCALLTYPE GetDisplayMode( void) = 0; - + virtual long STDMETHODCALLTYPE GetWidth( void) = 0; - + virtual long STDMETHODCALLTYPE GetHeight( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFrameRate( + + virtual HRESULT STDMETHODCALLTYPE GetFrameRate( /* [out] */ BMDTimeValue *frameDuration, /* [out] */ BMDTimeScale *timeScale) = 0; - + virtual BMDFieldDominance STDMETHODCALLTYPE GetFieldDominance( void) = 0; - + virtual BMDDisplayModeFlags STDMETHODCALLTYPE GetFlags( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDisplayModeVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( 
STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDisplayMode * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDisplayMode * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDisplayMode * This); - - HRESULT ( STDMETHODCALLTYPE *GetName )( + + HRESULT ( STDMETHODCALLTYPE *GetName )( IDeckLinkDisplayMode * This, /* [out] */ BSTR *name); - - BMDDisplayMode ( STDMETHODCALLTYPE *GetDisplayMode )( + + BMDDisplayMode ( STDMETHODCALLTYPE *GetDisplayMode )( IDeckLinkDisplayMode * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkDisplayMode * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkDisplayMode * This); - - HRESULT ( STDMETHODCALLTYPE *GetFrameRate )( + + HRESULT ( STDMETHODCALLTYPE *GetFrameRate )( IDeckLinkDisplayMode * This, /* [out] */ BMDTimeValue *frameDuration, /* [out] */ BMDTimeScale *timeScale); - - BMDFieldDominance ( STDMETHODCALLTYPE *GetFieldDominance )( + + BMDFieldDominance ( STDMETHODCALLTYPE *GetFieldDominance )( IDeckLinkDisplayMode * This); - - BMDDisplayModeFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDDisplayModeFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkDisplayMode * This); - + END_INTERFACE } IDeckLinkDisplayModeVtbl; @@ -2240,41 +2461,41 @@ CONST_VTBL struct IDeckLinkDisplayModeVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDisplayMode_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDisplayMode_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDisplayMode_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> 
Release(This) ) #define IDeckLinkDisplayMode_GetName(This,name) \ - ( (This)->lpVtbl -> GetName(This,name) ) + ( (This)->lpVtbl -> GetName(This,name) ) #define IDeckLinkDisplayMode_GetDisplayMode(This) \ - ( (This)->lpVtbl -> GetDisplayMode(This) ) + ( (This)->lpVtbl -> GetDisplayMode(This) ) #define IDeckLinkDisplayMode_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkDisplayMode_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkDisplayMode_GetFrameRate(This,frameDuration,timeScale) \ - ( (This)->lpVtbl -> GetFrameRate(This,frameDuration,timeScale) ) + ( (This)->lpVtbl -> GetFrameRate(This,frameDuration,timeScale) ) #define IDeckLinkDisplayMode_GetFieldDominance(This) \ - ( (This)->lpVtbl -> GetFieldDominance(This) ) + ( (This)->lpVtbl -> GetFieldDominance(This) ) #define IDeckLinkDisplayMode_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #endif /* COBJMACROS */ @@ -2291,52 +2512,52 @@ #define __IDeckLink_INTERFACE_DEFINED__ /* interface IDeckLink */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLink; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("C418FBDD-0587-48ED-8FE5-640F0A14AF91") IDeckLink : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetModelName( + virtual HRESULT STDMETHODCALLTYPE GetModelName( /* [out] */ BSTR *modelName) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayName( + + virtual HRESULT STDMETHODCALLTYPE GetDisplayName( /* [out] */ BSTR *displayName) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLink * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void 
**ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLink * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLink * This); - - HRESULT ( STDMETHODCALLTYPE *GetModelName )( + + HRESULT ( STDMETHODCALLTYPE *GetModelName )( IDeckLink * This, /* [out] */ BSTR *modelName); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayName )( + + HRESULT ( STDMETHODCALLTYPE *GetDisplayName )( IDeckLink * This, /* [out] */ BSTR *displayName); - + END_INTERFACE } IDeckLinkVtbl; @@ -2345,26 +2566,26 @@ CONST_VTBL struct IDeckLinkVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLink_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLink_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLink_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLink_GetModelName(This,modelName) \ - ( (This)->lpVtbl -> GetModelName(This,modelName) ) + ( (This)->lpVtbl -> GetModelName(This,modelName) ) #define IDeckLink_GetDisplayName(This,displayName) \ - ( (This)->lpVtbl -> GetDisplayName(This,displayName) ) + ( (This)->lpVtbl -> GetDisplayName(This,displayName) ) #endif /* COBJMACROS */ @@ -2381,115 +2602,115 @@ #define __IDeckLinkConfiguration_INTERFACE_DEFINED__ /* interface IDeckLinkConfiguration */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkConfiguration; #if defined(__cplusplus) && !defined(CINTERFACE) - - MIDL_INTERFACE("EF90380B-4AE5-4346-9077-E288E149F129") + + MIDL_INTERFACE("912F634B-2D4E-40A4-8AAB-8D80B73F1289") IDeckLinkConfiguration : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetFlag( + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL 
value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT STDMETHODCALLTYPE SetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value) = 0; - + virtual HRESULT STDMETHODCALLTYPE WriteConfigurationToPreferences( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkConfigurationVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkConfiguration * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkConfiguration * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkConfiguration * This); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IDeckLinkConfiguration * 
This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkConfiguration * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( + + HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( IDeckLinkConfiguration * This); - + END_INTERFACE } IDeckLinkConfigurationVtbl; @@ -2498,47 +2719,47 @@ CONST_VTBL struct IDeckLinkConfigurationVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkConfiguration_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkConfiguration_AddRef(This) \ - ( (This)->lpVtbl -> 
AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkConfiguration_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkConfiguration_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #define IDeckLinkConfiguration_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #define IDeckLinkConfiguration_WriteConfigurationToPreferences(This) \ - ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) + ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) #endif /* COBJMACROS */ @@ -2555,121 +2776,121 @@ #define __IDeckLinkEncoderConfiguration_INTERFACE_DEFINED__ /* interface IDeckLinkEncoderConfiguration */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkEncoderConfiguration; #if defined(__cplusplus) && !defined(CINTERFACE) - + 
MIDL_INTERFACE("138050E5-C60A-4552-BF3F-0F358049327E") IDeckLinkEncoderConfiguration : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetFlag( + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BOOL value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT STDMETHODCALLTYPE SetFloat( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BSTR value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BSTR *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ void *buffer, /* [out][in] */ unsigned int *bufferSize) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkEncoderConfigurationVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkEncoderConfiguration * 
This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkEncoderConfiguration * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkEncoderConfiguration * This); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BSTR value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* 
[out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkEncoderConfiguration * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ void *buffer, /* [out][in] */ unsigned int *bufferSize); - + END_INTERFACE } IDeckLinkEncoderConfigurationVtbl; @@ -2678,47 +2899,47 @@ CONST_VTBL struct IDeckLinkEncoderConfigurationVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkEncoderConfiguration_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkEncoderConfiguration_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkEncoderConfiguration_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkEncoderConfiguration_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> 
SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_GetBytes(This,cfgID,buffer,bufferSize) \ - ( (This)->lpVtbl -> GetBytes(This,cfgID,buffer,bufferSize) ) + ( (This)->lpVtbl -> GetBytes(This,cfgID,buffer,bufferSize) ) #endif /* COBJMACROS */ @@ -2735,72 +2956,72 @@ #define __IDeckLinkDeckControlStatusCallback_INTERFACE_DEFINED__ /* interface IDeckLinkDeckControlStatusCallback */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDeckControlStatusCallback; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("53436FFB-B434-4906-BADC-AE3060FFE8EF") IDeckLinkDeckControlStatusCallback : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE TimecodeUpdate( + virtual HRESULT STDMETHODCALLTYPE TimecodeUpdate( /* [in] */ BMDTimecodeBCD currentTimecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE VTRControlStateChanged( + + virtual HRESULT STDMETHODCALLTYPE VTRControlStateChanged( /* [in] */ BMDDeckControlVTRControlState newState, /* [in] */ BMDDeckControlError error) = 0; - - virtual HRESULT STDMETHODCALLTYPE DeckControlEventReceived( + + virtual HRESULT STDMETHODCALLTYPE DeckControlEventReceived( /* [in] */ BMDDeckControlEvent event, /* [in] */ BMDDeckControlError error) = 0; - - virtual HRESULT STDMETHODCALLTYPE DeckControlStatusChanged( + + virtual HRESULT STDMETHODCALLTYPE DeckControlStatusChanged( /* [in] */ BMDDeckControlStatusFlags flags, /* [in] */ unsigned int mask) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDeckControlStatusCallbackVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDeckControlStatusCallback * This, /* [in] */ REFIID riid, - /* 
[annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDeckControlStatusCallback * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDeckControlStatusCallback * This); - - HRESULT ( STDMETHODCALLTYPE *TimecodeUpdate )( + + HRESULT ( STDMETHODCALLTYPE *TimecodeUpdate )( IDeckLinkDeckControlStatusCallback * This, /* [in] */ BMDTimecodeBCD currentTimecode); - - HRESULT ( STDMETHODCALLTYPE *VTRControlStateChanged )( + + HRESULT ( STDMETHODCALLTYPE *VTRControlStateChanged )( IDeckLinkDeckControlStatusCallback * This, /* [in] */ BMDDeckControlVTRControlState newState, /* [in] */ BMDDeckControlError error); - - HRESULT ( STDMETHODCALLTYPE *DeckControlEventReceived )( + + HRESULT ( STDMETHODCALLTYPE *DeckControlEventReceived )( IDeckLinkDeckControlStatusCallback * This, /* [in] */ BMDDeckControlEvent event, /* [in] */ BMDDeckControlError error); - - HRESULT ( STDMETHODCALLTYPE *DeckControlStatusChanged )( + + HRESULT ( STDMETHODCALLTYPE *DeckControlStatusChanged )( IDeckLinkDeckControlStatusCallback * This, /* [in] */ BMDDeckControlStatusFlags flags, /* [in] */ unsigned int mask); - + END_INTERFACE } IDeckLinkDeckControlStatusCallbackVtbl; @@ -2809,32 +3030,32 @@ CONST_VTBL struct IDeckLinkDeckControlStatusCallbackVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDeckControlStatusCallback_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDeckControlStatusCallback_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDeckControlStatusCallback_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDeckControlStatusCallback_TimecodeUpdate(This,currentTimecode) \ - ( 
(This)->lpVtbl -> TimecodeUpdate(This,currentTimecode) ) + ( (This)->lpVtbl -> TimecodeUpdate(This,currentTimecode) ) #define IDeckLinkDeckControlStatusCallback_VTRControlStateChanged(This,newState,error) \ - ( (This)->lpVtbl -> VTRControlStateChanged(This,newState,error) ) + ( (This)->lpVtbl -> VTRControlStateChanged(This,newState,error) ) #define IDeckLinkDeckControlStatusCallback_DeckControlEventReceived(This,event,error) \ - ( (This)->lpVtbl -> DeckControlEventReceived(This,event,error) ) + ( (This)->lpVtbl -> DeckControlEventReceived(This,event,error) ) #define IDeckLinkDeckControlStatusCallback_DeckControlStatusChanged(This,flags,mask) \ - ( (This)->lpVtbl -> DeckControlStatusChanged(This,flags,mask) ) + ( (This)->lpVtbl -> DeckControlStatusChanged(This,flags,mask) ) #endif /* COBJMACROS */ @@ -2851,183 +3072,183 @@ #define __IDeckLinkDeckControl_INTERFACE_DEFINED__ /* interface IDeckLinkDeckControl */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDeckControl; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("8E1C3ACE-19C7-4E00-8B92-D80431D958BE") IDeckLinkDeckControl : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Open( + virtual HRESULT STDMETHODCALLTYPE Open( /* [in] */ BMDTimeScale timeScale, /* [in] */ BMDTimeValue timeValue, /* [in] */ BOOL timecodeIsDropFrame, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Close( + + virtual HRESULT STDMETHODCALLTYPE Close( /* [in] */ BOOL standbyOn) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetCurrentState( + + virtual HRESULT STDMETHODCALLTYPE GetCurrentState( /* [out] */ BMDDeckControlMode *mode, /* [out] */ BMDDeckControlVTRControlState *vtrControlState, /* [out] */ BMDDeckControlStatusFlags *flags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetStandby( + + virtual HRESULT STDMETHODCALLTYPE SetStandby( /* [in] */ BOOL standbyOn) = 0; - - virtual HRESULT STDMETHODCALLTYPE SendCommand( + + 
virtual HRESULT STDMETHODCALLTYPE SendCommand( /* [in] */ unsigned char *inBuffer, /* [in] */ unsigned int inBufferSize, /* [out] */ unsigned char *outBuffer, /* [out] */ unsigned int *outDataSize, /* [in] */ unsigned int outBufferSize, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Play( + + virtual HRESULT STDMETHODCALLTYPE Play( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Stop( + + virtual HRESULT STDMETHODCALLTYPE Stop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE TogglePlayStop( + + virtual HRESULT STDMETHODCALLTYPE TogglePlayStop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Eject( + + virtual HRESULT STDMETHODCALLTYPE Eject( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GoToTimecode( + + virtual HRESULT STDMETHODCALLTYPE GoToTimecode( /* [in] */ BMDTimecodeBCD timecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE FastForward( + + virtual HRESULT STDMETHODCALLTYPE FastForward( /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Rewind( + + virtual HRESULT STDMETHODCALLTYPE Rewind( /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StepForward( + + virtual HRESULT STDMETHODCALLTYPE StepForward( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StepBack( + + virtual HRESULT STDMETHODCALLTYPE StepBack( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Jog( + + virtual HRESULT STDMETHODCALLTYPE Jog( /* [in] */ double rate, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Shuttle( + + virtual HRESULT STDMETHODCALLTYPE Shuttle( /* [in] */ double rate, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT 
STDMETHODCALLTYPE GetTimecodeString( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeString( /* [out] */ BSTR *currentTimeCode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecode( + + virtual HRESULT STDMETHODCALLTYPE GetTimecode( /* [out] */ IDeckLinkTimecode **currentTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecodeBCD( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeBCD( /* [out] */ BMDTimecodeBCD *currentTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetPreroll( + + virtual HRESULT STDMETHODCALLTYPE SetPreroll( /* [in] */ unsigned int prerollSeconds) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetPreroll( + + virtual HRESULT STDMETHODCALLTYPE GetPreroll( /* [out] */ unsigned int *prerollSeconds) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetExportOffset( + + virtual HRESULT STDMETHODCALLTYPE SetExportOffset( /* [in] */ int exportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetExportOffset( + + virtual HRESULT STDMETHODCALLTYPE GetExportOffset( /* [out] */ int *exportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetManualExportOffset( + + virtual HRESULT STDMETHODCALLTYPE GetManualExportOffset( /* [out] */ int *deckManualExportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCaptureOffset( + + virtual HRESULT STDMETHODCALLTYPE SetCaptureOffset( /* [in] */ int captureOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetCaptureOffset( + + virtual HRESULT STDMETHODCALLTYPE GetCaptureOffset( /* [out] */ int *captureOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartExport( + + virtual HRESULT STDMETHODCALLTYPE StartExport( /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [in] */ BMDDeckControlExportModeOpsFlags exportModeOps, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartCapture( + + virtual HRESULT 
STDMETHODCALLTYPE StartCapture( /* [in] */ BOOL useVITC, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDeviceID( + + virtual HRESULT STDMETHODCALLTYPE GetDeviceID( /* [out] */ unsigned short *deviceId, /* [out] */ BMDDeckControlError *error) = 0; - + virtual HRESULT STDMETHODCALLTYPE Abort( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE CrashRecordStart( + + virtual HRESULT STDMETHODCALLTYPE CrashRecordStart( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE CrashRecordStop( + + virtual HRESULT STDMETHODCALLTYPE CrashRecordStop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkDeckControlStatusCallback *callback) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDeckControlVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDeckControl * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDeckControl * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDeckControl * This); - - HRESULT ( STDMETHODCALLTYPE *Open )( + + HRESULT ( STDMETHODCALLTYPE *Open )( IDeckLinkDeckControl * This, /* [in] */ BMDTimeScale timeScale, /* [in] */ BMDTimeValue timeValue, /* [in] */ BOOL timecodeIsDropFrame, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Close )( + + HRESULT ( STDMETHODCALLTYPE *Close )( IDeckLinkDeckControl * This, /* [in] */ BOOL standbyOn); - - HRESULT ( STDMETHODCALLTYPE *GetCurrentState )( + + HRESULT ( STDMETHODCALLTYPE *GetCurrentState )( 
IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlMode *mode, /* [out] */ BMDDeckControlVTRControlState *vtrControlState, /* [out] */ BMDDeckControlStatusFlags *flags); - - HRESULT ( STDMETHODCALLTYPE *SetStandby )( + + HRESULT ( STDMETHODCALLTYPE *SetStandby )( IDeckLinkDeckControl * This, /* [in] */ BOOL standbyOn); - - HRESULT ( STDMETHODCALLTYPE *SendCommand )( + + HRESULT ( STDMETHODCALLTYPE *SendCommand )( IDeckLinkDeckControl * This, /* [in] */ unsigned char *inBuffer, /* [in] */ unsigned int inBufferSize, @@ -3035,133 +3256,133 @@ /* [out] */ unsigned int *outDataSize, /* [in] */ unsigned int outBufferSize, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Play )( + + HRESULT ( STDMETHODCALLTYPE *Play )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Stop )( + + HRESULT ( STDMETHODCALLTYPE *Stop )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *TogglePlayStop )( + + HRESULT ( STDMETHODCALLTYPE *TogglePlayStop )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Eject )( + + HRESULT ( STDMETHODCALLTYPE *Eject )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GoToTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GoToTimecode )( IDeckLinkDeckControl * This, /* [in] */ BMDTimecodeBCD timecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *FastForward )( + + HRESULT ( STDMETHODCALLTYPE *FastForward )( IDeckLinkDeckControl * This, /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Rewind )( + + HRESULT ( STDMETHODCALLTYPE *Rewind )( IDeckLinkDeckControl * This, /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StepForward )( + + HRESULT ( STDMETHODCALLTYPE *StepForward )( IDeckLinkDeckControl * 
This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StepBack )( + + HRESULT ( STDMETHODCALLTYPE *StepBack )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Jog )( + + HRESULT ( STDMETHODCALLTYPE *Jog )( IDeckLinkDeckControl * This, /* [in] */ double rate, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Shuttle )( + + HRESULT ( STDMETHODCALLTYPE *Shuttle )( IDeckLinkDeckControl * This, /* [in] */ double rate, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeString )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeString )( IDeckLinkDeckControl * This, /* [out] */ BSTR *currentTimeCode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkDeckControl * This, /* [out] */ IDeckLinkTimecode **currentTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeBCD )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeBCD )( IDeckLinkDeckControl * This, /* [out] */ BMDTimecodeBCD *currentTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *SetPreroll )( + + HRESULT ( STDMETHODCALLTYPE *SetPreroll )( IDeckLinkDeckControl * This, /* [in] */ unsigned int prerollSeconds); - - HRESULT ( STDMETHODCALLTYPE *GetPreroll )( + + HRESULT ( STDMETHODCALLTYPE *GetPreroll )( IDeckLinkDeckControl * This, /* [out] */ unsigned int *prerollSeconds); - - HRESULT ( STDMETHODCALLTYPE *SetExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *SetExportOffset )( IDeckLinkDeckControl * This, /* [in] */ int exportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetExportOffset )( IDeckLinkDeckControl * This, /* [out] */ int *exportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetManualExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetManualExportOffset )( 
IDeckLinkDeckControl * This, /* [out] */ int *deckManualExportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *SetCaptureOffset )( + + HRESULT ( STDMETHODCALLTYPE *SetCaptureOffset )( IDeckLinkDeckControl * This, /* [in] */ int captureOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetCaptureOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetCaptureOffset )( IDeckLinkDeckControl * This, /* [out] */ int *captureOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *StartExport )( + + HRESULT ( STDMETHODCALLTYPE *StartExport )( IDeckLinkDeckControl * This, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [in] */ BMDDeckControlExportModeOpsFlags exportModeOps, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StartCapture )( + + HRESULT ( STDMETHODCALLTYPE *StartCapture )( IDeckLinkDeckControl * This, /* [in] */ BOOL useVITC, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetDeviceID )( + + HRESULT ( STDMETHODCALLTYPE *GetDeviceID )( IDeckLinkDeckControl * This, /* [out] */ unsigned short *deviceId, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Abort )( + + HRESULT ( STDMETHODCALLTYPE *Abort )( IDeckLinkDeckControl * This); - - HRESULT ( STDMETHODCALLTYPE *CrashRecordStart )( + + HRESULT ( STDMETHODCALLTYPE *CrashRecordStart )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *CrashRecordStop )( + + HRESULT ( STDMETHODCALLTYPE *CrashRecordStop )( IDeckLinkDeckControl * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkDeckControl * This, /* [in] */ IDeckLinkDeckControlStatusCallback *callback); - + END_INTERFACE } IDeckLinkDeckControlVtbl; @@ -3170,119 +3391,119 @@ CONST_VTBL struct IDeckLinkDeckControlVtbl *lpVtbl; }; - + #ifdef COBJMACROS 
#define IDeckLinkDeckControl_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDeckControl_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDeckControl_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDeckControl_Open(This,timeScale,timeValue,timecodeIsDropFrame,error) \ - ( (This)->lpVtbl -> Open(This,timeScale,timeValue,timecodeIsDropFrame,error) ) + ( (This)->lpVtbl -> Open(This,timeScale,timeValue,timecodeIsDropFrame,error) ) #define IDeckLinkDeckControl_Close(This,standbyOn) \ - ( (This)->lpVtbl -> Close(This,standbyOn) ) + ( (This)->lpVtbl -> Close(This,standbyOn) ) #define IDeckLinkDeckControl_GetCurrentState(This,mode,vtrControlState,flags) \ - ( (This)->lpVtbl -> GetCurrentState(This,mode,vtrControlState,flags) ) + ( (This)->lpVtbl -> GetCurrentState(This,mode,vtrControlState,flags) ) #define IDeckLinkDeckControl_SetStandby(This,standbyOn) \ - ( (This)->lpVtbl -> SetStandby(This,standbyOn) ) + ( (This)->lpVtbl -> SetStandby(This,standbyOn) ) #define IDeckLinkDeckControl_SendCommand(This,inBuffer,inBufferSize,outBuffer,outDataSize,outBufferSize,error) \ - ( (This)->lpVtbl -> SendCommand(This,inBuffer,inBufferSize,outBuffer,outDataSize,outBufferSize,error) ) + ( (This)->lpVtbl -> SendCommand(This,inBuffer,inBufferSize,outBuffer,outDataSize,outBufferSize,error) ) #define IDeckLinkDeckControl_Play(This,error) \ - ( (This)->lpVtbl -> Play(This,error) ) + ( (This)->lpVtbl -> Play(This,error) ) #define IDeckLinkDeckControl_Stop(This,error) \ - ( (This)->lpVtbl -> Stop(This,error) ) + ( (This)->lpVtbl -> Stop(This,error) ) #define IDeckLinkDeckControl_TogglePlayStop(This,error) \ - ( (This)->lpVtbl -> TogglePlayStop(This,error) ) + ( (This)->lpVtbl -> TogglePlayStop(This,error) ) #define IDeckLinkDeckControl_Eject(This,error) 
\ - ( (This)->lpVtbl -> Eject(This,error) ) + ( (This)->lpVtbl -> Eject(This,error) ) #define IDeckLinkDeckControl_GoToTimecode(This,timecode,error) \ - ( (This)->lpVtbl -> GoToTimecode(This,timecode,error) ) + ( (This)->lpVtbl -> GoToTimecode(This,timecode,error) ) #define IDeckLinkDeckControl_FastForward(This,viewTape,error) \ - ( (This)->lpVtbl -> FastForward(This,viewTape,error) ) + ( (This)->lpVtbl -> FastForward(This,viewTape,error) ) #define IDeckLinkDeckControl_Rewind(This,viewTape,error) \ - ( (This)->lpVtbl -> Rewind(This,viewTape,error) ) + ( (This)->lpVtbl -> Rewind(This,viewTape,error) ) #define IDeckLinkDeckControl_StepForward(This,error) \ - ( (This)->lpVtbl -> StepForward(This,error) ) + ( (This)->lpVtbl -> StepForward(This,error) ) #define IDeckLinkDeckControl_StepBack(This,error) \ - ( (This)->lpVtbl -> StepBack(This,error) ) + ( (This)->lpVtbl -> StepBack(This,error) ) #define IDeckLinkDeckControl_Jog(This,rate,error) \ - ( (This)->lpVtbl -> Jog(This,rate,error) ) + ( (This)->lpVtbl -> Jog(This,rate,error) ) #define IDeckLinkDeckControl_Shuttle(This,rate,error) \ - ( (This)->lpVtbl -> Shuttle(This,rate,error) ) + ( (This)->lpVtbl -> Shuttle(This,rate,error) ) #define IDeckLinkDeckControl_GetTimecodeString(This,currentTimeCode,error) \ - ( (This)->lpVtbl -> GetTimecodeString(This,currentTimeCode,error) ) + ( (This)->lpVtbl -> GetTimecodeString(This,currentTimeCode,error) ) #define IDeckLinkDeckControl_GetTimecode(This,currentTimecode,error) \ - ( (This)->lpVtbl -> GetTimecode(This,currentTimecode,error) ) + ( (This)->lpVtbl -> GetTimecode(This,currentTimecode,error) ) #define IDeckLinkDeckControl_GetTimecodeBCD(This,currentTimecode,error) \ - ( (This)->lpVtbl -> GetTimecodeBCD(This,currentTimecode,error) ) + ( (This)->lpVtbl -> GetTimecodeBCD(This,currentTimecode,error) ) #define IDeckLinkDeckControl_SetPreroll(This,prerollSeconds) \ - ( (This)->lpVtbl -> SetPreroll(This,prerollSeconds) ) + ( (This)->lpVtbl -> SetPreroll(This,prerollSeconds) ) 
#define IDeckLinkDeckControl_GetPreroll(This,prerollSeconds) \ - ( (This)->lpVtbl -> GetPreroll(This,prerollSeconds) ) + ( (This)->lpVtbl -> GetPreroll(This,prerollSeconds) ) #define IDeckLinkDeckControl_SetExportOffset(This,exportOffsetFields) \ - ( (This)->lpVtbl -> SetExportOffset(This,exportOffsetFields) ) + ( (This)->lpVtbl -> SetExportOffset(This,exportOffsetFields) ) #define IDeckLinkDeckControl_GetExportOffset(This,exportOffsetFields) \ - ( (This)->lpVtbl -> GetExportOffset(This,exportOffsetFields) ) + ( (This)->lpVtbl -> GetExportOffset(This,exportOffsetFields) ) #define IDeckLinkDeckControl_GetManualExportOffset(This,deckManualExportOffsetFields) \ - ( (This)->lpVtbl -> GetManualExportOffset(This,deckManualExportOffsetFields) ) + ( (This)->lpVtbl -> GetManualExportOffset(This,deckManualExportOffsetFields) ) #define IDeckLinkDeckControl_SetCaptureOffset(This,captureOffsetFields) \ - ( (This)->lpVtbl -> SetCaptureOffset(This,captureOffsetFields) ) + ( (This)->lpVtbl -> SetCaptureOffset(This,captureOffsetFields) ) #define IDeckLinkDeckControl_GetCaptureOffset(This,captureOffsetFields) \ - ( (This)->lpVtbl -> GetCaptureOffset(This,captureOffsetFields) ) + ( (This)->lpVtbl -> GetCaptureOffset(This,captureOffsetFields) ) #define IDeckLinkDeckControl_StartExport(This,inTimecode,outTimecode,exportModeOps,error) \ - ( (This)->lpVtbl -> StartExport(This,inTimecode,outTimecode,exportModeOps,error) ) + ( (This)->lpVtbl -> StartExport(This,inTimecode,outTimecode,exportModeOps,error) ) #define IDeckLinkDeckControl_StartCapture(This,useVITC,inTimecode,outTimecode,error) \ - ( (This)->lpVtbl -> StartCapture(This,useVITC,inTimecode,outTimecode,error) ) + ( (This)->lpVtbl -> StartCapture(This,useVITC,inTimecode,outTimecode,error) ) #define IDeckLinkDeckControl_GetDeviceID(This,deviceId,error) \ - ( (This)->lpVtbl -> GetDeviceID(This,deviceId,error) ) + ( (This)->lpVtbl -> GetDeviceID(This,deviceId,error) ) #define IDeckLinkDeckControl_Abort(This) \ - ( (This)->lpVtbl -> 
Abort(This) ) + ( (This)->lpVtbl -> Abort(This) ) #define IDeckLinkDeckControl_CrashRecordStart(This,error) \ - ( (This)->lpVtbl -> CrashRecordStart(This,error) ) + ( (This)->lpVtbl -> CrashRecordStart(This,error) ) #define IDeckLinkDeckControl_CrashRecordStop(This,error) \ - ( (This)->lpVtbl -> CrashRecordStop(This,error) ) + ( (This)->lpVtbl -> CrashRecordStop(This,error) ) #define IDeckLinkDeckControl_SetCallback(This,callback) \ - ( (This)->lpVtbl -> SetCallback(This,callback) ) + ( (This)->lpVtbl -> SetCallback(This,callback) ) #endif /* COBJMACROS */ @@ -3299,61 +3520,61 @@ #define __IBMDStreamingDeviceNotificationCallback_INTERFACE_DEFINED__ /* interface IBMDStreamingDeviceNotificationCallback */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IBMDStreamingDeviceNotificationCallback; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("F9531D64-3305-4B29-A387-7F74BB0D0E84") IBMDStreamingDeviceNotificationCallback : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE StreamingDeviceArrived( + virtual HRESULT STDMETHODCALLTYPE StreamingDeviceArrived( /* [in] */ IDeckLink *device) = 0; - - virtual HRESULT STDMETHODCALLTYPE StreamingDeviceRemoved( + + virtual HRESULT STDMETHODCALLTYPE StreamingDeviceRemoved( /* [in] */ IDeckLink *device) = 0; - - virtual HRESULT STDMETHODCALLTYPE StreamingDeviceModeChanged( + + virtual HRESULT STDMETHODCALLTYPE StreamingDeviceModeChanged( /* [in] */ IDeckLink *device, /* [in] */ BMDStreamingDeviceMode mode) = 0; - + }; - - + + #else /* C style interface */ typedef struct IBMDStreamingDeviceNotificationCallbackVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IBMDStreamingDeviceNotificationCallback * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( 
STDMETHODCALLTYPE *AddRef )( IBMDStreamingDeviceNotificationCallback * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IBMDStreamingDeviceNotificationCallback * This); - - HRESULT ( STDMETHODCALLTYPE *StreamingDeviceArrived )( + + HRESULT ( STDMETHODCALLTYPE *StreamingDeviceArrived )( IBMDStreamingDeviceNotificationCallback * This, /* [in] */ IDeckLink *device); - - HRESULT ( STDMETHODCALLTYPE *StreamingDeviceRemoved )( + + HRESULT ( STDMETHODCALLTYPE *StreamingDeviceRemoved )( IBMDStreamingDeviceNotificationCallback * This, /* [in] */ IDeckLink *device); - - HRESULT ( STDMETHODCALLTYPE *StreamingDeviceModeChanged )( + + HRESULT ( STDMETHODCALLTYPE *StreamingDeviceModeChanged )( IBMDStreamingDeviceNotificationCallback * This, /* [in] */ IDeckLink *device, /* [in] */ BMDStreamingDeviceMode mode); - + END_INTERFACE } IBMDStreamingDeviceNotificationCallbackVtbl; @@ -3362,29 +3583,29 @@ CONST_VTBL struct IBMDStreamingDeviceNotificationCallbackVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IBMDStreamingDeviceNotificationCallback_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IBMDStreamingDeviceNotificationCallback_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IBMDStreamingDeviceNotificationCallback_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IBMDStreamingDeviceNotificationCallback_StreamingDeviceArrived(This,device) \ - ( (This)->lpVtbl -> StreamingDeviceArrived(This,device) ) + ( (This)->lpVtbl -> StreamingDeviceArrived(This,device) ) #define IBMDStreamingDeviceNotificationCallback_StreamingDeviceRemoved(This,device) \ - ( (This)->lpVtbl -> StreamingDeviceRemoved(This,device) ) + ( (This)->lpVtbl -> StreamingDeviceRemoved(This,device) ) #define 
IBMDStreamingDeviceNotificationCallback_StreamingDeviceModeChanged(This,device,mode) \ - ( (This)->lpVtbl -> StreamingDeviceModeChanged(This,device,mode) ) + ( (This)->lpVtbl -> StreamingDeviceModeChanged(This,device,mode) ) #endif /* COBJMACROS */ @@ -3401,74 +3622,74 @@ #define __IBMDStreamingH264InputCallback_INTERFACE_DEFINED__ /* interface IBMDStreamingH264InputCallback */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IBMDStreamingH264InputCallback; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("823C475F-55AE-46F9-890C-537CC5CEDCCA") IBMDStreamingH264InputCallback : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE H264NALPacketArrived( + virtual HRESULT STDMETHODCALLTYPE H264NALPacketArrived( /* [in] */ IBMDStreamingH264NALPacket *nalPacket) = 0; - - virtual HRESULT STDMETHODCALLTYPE H264AudioPacketArrived( + + virtual HRESULT STDMETHODCALLTYPE H264AudioPacketArrived( /* [in] */ IBMDStreamingAudioPacket *audioPacket) = 0; - - virtual HRESULT STDMETHODCALLTYPE MPEG2TSPacketArrived( + + virtual HRESULT STDMETHODCALLTYPE MPEG2TSPacketArrived( /* [in] */ IBMDStreamingMPEG2TSPacket *tsPacket) = 0; - + virtual HRESULT STDMETHODCALLTYPE H264VideoInputConnectorScanningChanged( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE H264VideoInputConnectorChanged( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE H264VideoInputModeChanged( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IBMDStreamingH264InputCallbackVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IBMDStreamingH264InputCallback * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IBMDStreamingH264InputCallback * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( 
STDMETHODCALLTYPE *Release )( IBMDStreamingH264InputCallback * This); - - HRESULT ( STDMETHODCALLTYPE *H264NALPacketArrived )( + + HRESULT ( STDMETHODCALLTYPE *H264NALPacketArrived )( IBMDStreamingH264InputCallback * This, /* [in] */ IBMDStreamingH264NALPacket *nalPacket); - - HRESULT ( STDMETHODCALLTYPE *H264AudioPacketArrived )( + + HRESULT ( STDMETHODCALLTYPE *H264AudioPacketArrived )( IBMDStreamingH264InputCallback * This, /* [in] */ IBMDStreamingAudioPacket *audioPacket); - - HRESULT ( STDMETHODCALLTYPE *MPEG2TSPacketArrived )( + + HRESULT ( STDMETHODCALLTYPE *MPEG2TSPacketArrived )( IBMDStreamingH264InputCallback * This, /* [in] */ IBMDStreamingMPEG2TSPacket *tsPacket); - - HRESULT ( STDMETHODCALLTYPE *H264VideoInputConnectorScanningChanged )( + + HRESULT ( STDMETHODCALLTYPE *H264VideoInputConnectorScanningChanged )( IBMDStreamingH264InputCallback * This); - - HRESULT ( STDMETHODCALLTYPE *H264VideoInputConnectorChanged )( + + HRESULT ( STDMETHODCALLTYPE *H264VideoInputConnectorChanged )( IBMDStreamingH264InputCallback * This); - - HRESULT ( STDMETHODCALLTYPE *H264VideoInputModeChanged )( + + HRESULT ( STDMETHODCALLTYPE *H264VideoInputModeChanged )( IBMDStreamingH264InputCallback * This); - + END_INTERFACE } IBMDStreamingH264InputCallbackVtbl; @@ -3477,38 +3698,38 @@ CONST_VTBL struct IBMDStreamingH264InputCallbackVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IBMDStreamingH264InputCallback_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IBMDStreamingH264InputCallback_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IBMDStreamingH264InputCallback_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IBMDStreamingH264InputCallback_H264NALPacketArrived(This,nalPacket) \ - ( (This)->lpVtbl -> H264NALPacketArrived(This,nalPacket) ) + ( (This)->lpVtbl -> 
H264NALPacketArrived(This,nalPacket) ) #define IBMDStreamingH264InputCallback_H264AudioPacketArrived(This,audioPacket) \ - ( (This)->lpVtbl -> H264AudioPacketArrived(This,audioPacket) ) + ( (This)->lpVtbl -> H264AudioPacketArrived(This,audioPacket) ) #define IBMDStreamingH264InputCallback_MPEG2TSPacketArrived(This,tsPacket) \ - ( (This)->lpVtbl -> MPEG2TSPacketArrived(This,tsPacket) ) + ( (This)->lpVtbl -> MPEG2TSPacketArrived(This,tsPacket) ) #define IBMDStreamingH264InputCallback_H264VideoInputConnectorScanningChanged(This) \ - ( (This)->lpVtbl -> H264VideoInputConnectorScanningChanged(This) ) + ( (This)->lpVtbl -> H264VideoInputConnectorScanningChanged(This) ) #define IBMDStreamingH264InputCallback_H264VideoInputConnectorChanged(This) \ - ( (This)->lpVtbl -> H264VideoInputConnectorChanged(This) ) + ( (This)->lpVtbl -> H264VideoInputConnectorChanged(This) ) #define IBMDStreamingH264InputCallback_H264VideoInputModeChanged(This) \ - ( (This)->lpVtbl -> H264VideoInputModeChanged(This) ) + ( (This)->lpVtbl -> H264VideoInputModeChanged(This) ) #endif /* COBJMACROS */ @@ -3525,50 +3746,50 @@ #define __IBMDStreamingDiscovery_INTERFACE_DEFINED__ /* interface IBMDStreamingDiscovery */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IBMDStreamingDiscovery; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("2C837444-F989-4D87-901A-47C8A36D096D") IBMDStreamingDiscovery : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE InstallDeviceNotifications( + virtual HRESULT STDMETHODCALLTYPE InstallDeviceNotifications( /* [in] */ IBMDStreamingDeviceNotificationCallback *theCallback) = 0; - + virtual HRESULT STDMETHODCALLTYPE UninstallDeviceNotifications( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IBMDStreamingDiscoveryVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IBMDStreamingDiscovery * This, /* [in] */ 
REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IBMDStreamingDiscovery * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IBMDStreamingDiscovery * This); - - HRESULT ( STDMETHODCALLTYPE *InstallDeviceNotifications )( + + HRESULT ( STDMETHODCALLTYPE *InstallDeviceNotifications )( IBMDStreamingDiscovery * This, /* [in] */ IBMDStreamingDeviceNotificationCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *UninstallDeviceNotifications )( + + HRESULT ( STDMETHODCALLTYPE *UninstallDeviceNotifications )( IBMDStreamingDiscovery * This); - + END_INTERFACE } IBMDStreamingDiscoveryVtbl; @@ -3577,26 +3798,26 @@ CONST_VTBL struct IBMDStreamingDiscoveryVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IBMDStreamingDiscovery_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IBMDStreamingDiscovery_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IBMDStreamingDiscovery_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IBMDStreamingDiscovery_InstallDeviceNotifications(This,theCallback) \ - ( (This)->lpVtbl -> InstallDeviceNotifications(This,theCallback) ) + ( (This)->lpVtbl -> InstallDeviceNotifications(This,theCallback) ) #define IBMDStreamingDiscovery_UninstallDeviceNotifications(This) \ - ( (This)->lpVtbl -> UninstallDeviceNotifications(This) ) + ( (This)->lpVtbl -> UninstallDeviceNotifications(This) ) #endif /* COBJMACROS */ @@ -3613,123 +3834,123 @@ #define __IBMDStreamingVideoEncodingMode_INTERFACE_DEFINED__ /* interface IBMDStreamingVideoEncodingMode */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IBMDStreamingVideoEncodingMode; #if 
defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("1AB8035B-CD13-458D-B6DF-5E8F7C2141D9") IBMDStreamingVideoEncodingMode : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetName( + virtual HRESULT STDMETHODCALLTYPE GetName( /* [out] */ BSTR *name) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetPresetID( void) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetSourcePositionX( void) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetSourcePositionY( void) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetSourceWidth( void) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetSourceHeight( void) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetDestWidth( void) = 0; - + virtual unsigned int STDMETHODCALLTYPE GetDestHeight( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ BSTR *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateMutableVideoEncodingMode( + + virtual HRESULT STDMETHODCALLTYPE CreateMutableVideoEncodingMode( /* [out] */ IBMDStreamingMutableVideoEncodingMode **newEncodingMode) = 0; - + }; - - + + #else /* C style interface */ typedef struct IBMDStreamingVideoEncodingModeVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IBMDStreamingVideoEncodingMode * This, /* [in] */ REFIID riid, - /* 
[annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IBMDStreamingVideoEncodingMode * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IBMDStreamingVideoEncodingMode * This); - - HRESULT ( STDMETHODCALLTYPE *GetName )( + + HRESULT ( STDMETHODCALLTYPE *GetName )( IBMDStreamingVideoEncodingMode * This, /* [out] */ BSTR *name); - - unsigned int ( STDMETHODCALLTYPE *GetPresetID )( + + unsigned int ( STDMETHODCALLTYPE *GetPresetID )( IBMDStreamingVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourcePositionX )( + + unsigned int ( STDMETHODCALLTYPE *GetSourcePositionX )( IBMDStreamingVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourcePositionY )( + + unsigned int ( STDMETHODCALLTYPE *GetSourcePositionY )( IBMDStreamingVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourceWidth )( + + unsigned int ( STDMETHODCALLTYPE *GetSourceWidth )( IBMDStreamingVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourceHeight )( + + unsigned int ( STDMETHODCALLTYPE *GetSourceHeight )( IBMDStreamingVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetDestWidth )( + + unsigned int ( STDMETHODCALLTYPE *GetDestWidth )( IBMDStreamingVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetDestHeight )( + + unsigned int ( STDMETHODCALLTYPE *GetDestHeight )( IBMDStreamingVideoEncodingMode * This); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IBMDStreamingVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IBMDStreamingVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( 
STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IBMDStreamingVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IBMDStreamingVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *CreateMutableVideoEncodingMode )( + + HRESULT ( STDMETHODCALLTYPE *CreateMutableVideoEncodingMode )( IBMDStreamingVideoEncodingMode * This, /* [out] */ IBMDStreamingMutableVideoEncodingMode **newEncodingMode); - + END_INTERFACE } IBMDStreamingVideoEncodingModeVtbl; @@ -3738,59 +3959,59 @@ CONST_VTBL struct IBMDStreamingVideoEncodingModeVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IBMDStreamingVideoEncodingMode_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IBMDStreamingVideoEncodingMode_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IBMDStreamingVideoEncodingMode_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IBMDStreamingVideoEncodingMode_GetName(This,name) \ - ( (This)->lpVtbl -> GetName(This,name) ) + ( (This)->lpVtbl -> GetName(This,name) ) #define IBMDStreamingVideoEncodingMode_GetPresetID(This) \ - ( (This)->lpVtbl -> GetPresetID(This) ) + ( (This)->lpVtbl -> GetPresetID(This) ) #define IBMDStreamingVideoEncodingMode_GetSourcePositionX(This) \ - ( (This)->lpVtbl -> GetSourcePositionX(This) ) + ( (This)->lpVtbl -> GetSourcePositionX(This) ) #define IBMDStreamingVideoEncodingMode_GetSourcePositionY(This) \ - ( (This)->lpVtbl -> GetSourcePositionY(This) ) + ( (This)->lpVtbl -> GetSourcePositionY(This) ) #define IBMDStreamingVideoEncodingMode_GetSourceWidth(This) \ - ( (This)->lpVtbl -> GetSourceWidth(This) ) + ( 
(This)->lpVtbl -> GetSourceWidth(This) ) #define IBMDStreamingVideoEncodingMode_GetSourceHeight(This) \ - ( (This)->lpVtbl -> GetSourceHeight(This) ) + ( (This)->lpVtbl -> GetSourceHeight(This) ) #define IBMDStreamingVideoEncodingMode_GetDestWidth(This) \ - ( (This)->lpVtbl -> GetDestWidth(This) ) + ( (This)->lpVtbl -> GetDestWidth(This) ) #define IBMDStreamingVideoEncodingMode_GetDestHeight(This) \ - ( (This)->lpVtbl -> GetDestHeight(This) ) + ( (This)->lpVtbl -> GetDestHeight(This) ) #define IBMDStreamingVideoEncodingMode_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IBMDStreamingVideoEncodingMode_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IBMDStreamingVideoEncodingMode_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IBMDStreamingVideoEncodingMode_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #define IBMDStreamingVideoEncodingMode_CreateMutableVideoEncodingMode(This,newEncodingMode) \ - ( (This)->lpVtbl -> CreateMutableVideoEncodingMode(This,newEncodingMode) ) + ( (This)->lpVtbl -> CreateMutableVideoEncodingMode(This,newEncodingMode) ) #endif /* COBJMACROS */ @@ -3807,145 +4028,145 @@ #define __IBMDStreamingMutableVideoEncodingMode_INTERFACE_DEFINED__ /* interface IBMDStreamingMutableVideoEncodingMode */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IBMDStreamingMutableVideoEncodingMode; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("19BF7D90-1E0A-400D-B2C6-FFC4E78AD49D") IBMDStreamingMutableVideoEncodingMode : public IBMDStreamingVideoEncodingMode { public: - virtual HRESULT STDMETHODCALLTYPE SetSourceRect( + virtual HRESULT STDMETHODCALLTYPE 
SetSourceRect( /* [in] */ unsigned int posX, /* [in] */ unsigned int posY, /* [in] */ unsigned int width, /* [in] */ unsigned int height) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetDestSize( + + virtual HRESULT STDMETHODCALLTYPE SetDestSize( /* [in] */ unsigned int width, /* [in] */ unsigned int height) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFlag( + + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ BOOL value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT STDMETHODCALLTYPE SetFloat( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ BSTR value) = 0; - + }; - - + + #else /* C style interface */ typedef struct IBMDStreamingMutableVideoEncodingModeVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IBMDStreamingMutableVideoEncodingMode * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IBMDStreamingMutableVideoEncodingMode * This); - - HRESULT ( STDMETHODCALLTYPE *GetName )( + + HRESULT ( STDMETHODCALLTYPE *GetName )( IBMDStreamingMutableVideoEncodingMode * This, /* [out] */ BSTR *name); - - unsigned int ( STDMETHODCALLTYPE *GetPresetID )( + + unsigned int ( STDMETHODCALLTYPE *GetPresetID )( IBMDStreamingMutableVideoEncodingMode * This); - - unsigned 
int ( STDMETHODCALLTYPE *GetSourcePositionX )( + + unsigned int ( STDMETHODCALLTYPE *GetSourcePositionX )( IBMDStreamingMutableVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourcePositionY )( + + unsigned int ( STDMETHODCALLTYPE *GetSourcePositionY )( IBMDStreamingMutableVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourceWidth )( + + unsigned int ( STDMETHODCALLTYPE *GetSourceWidth )( IBMDStreamingMutableVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetSourceHeight )( + + unsigned int ( STDMETHODCALLTYPE *GetSourceHeight )( IBMDStreamingMutableVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetDestWidth )( + + unsigned int ( STDMETHODCALLTYPE *GetDestWidth )( IBMDStreamingMutableVideoEncodingMode * This); - - unsigned int ( STDMETHODCALLTYPE *GetDestHeight )( + + unsigned int ( STDMETHODCALLTYPE *GetDestHeight )( IBMDStreamingMutableVideoEncodingMode * This); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *CreateMutableVideoEncodingMode )( + + HRESULT ( STDMETHODCALLTYPE *CreateMutableVideoEncodingMode )( IBMDStreamingMutableVideoEncodingMode * This, /* 
[out] */ IBMDStreamingMutableVideoEncodingMode **newEncodingMode); - - HRESULT ( STDMETHODCALLTYPE *SetSourceRect )( + + HRESULT ( STDMETHODCALLTYPE *SetSourceRect )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ unsigned int posX, /* [in] */ unsigned int posY, /* [in] */ unsigned int width, /* [in] */ unsigned int height); - - HRESULT ( STDMETHODCALLTYPE *SetDestSize )( + + HRESULT ( STDMETHODCALLTYPE *SetDestSize )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ unsigned int width, /* [in] */ unsigned int height); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IBMDStreamingMutableVideoEncodingMode * This, /* [in] */ BMDStreamingEncodingModePropertyID cfgID, /* [in] */ BSTR value); - + END_INTERFACE } IBMDStreamingMutableVideoEncodingModeVtbl; @@ -3954,78 +4175,78 @@ CONST_VTBL struct IBMDStreamingMutableVideoEncodingModeVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IBMDStreamingMutableVideoEncodingMode_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IBMDStreamingMutableVideoEncodingMode_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IBMDStreamingMutableVideoEncodingMode_Release(This) \ - ( (This)->lpVtbl 
-> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetName(This,name) \ - ( (This)->lpVtbl -> GetName(This,name) ) + ( (This)->lpVtbl -> GetName(This,name) ) #define IBMDStreamingMutableVideoEncodingMode_GetPresetID(This) \ - ( (This)->lpVtbl -> GetPresetID(This) ) + ( (This)->lpVtbl -> GetPresetID(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetSourcePositionX(This) \ - ( (This)->lpVtbl -> GetSourcePositionX(This) ) + ( (This)->lpVtbl -> GetSourcePositionX(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetSourcePositionY(This) \ - ( (This)->lpVtbl -> GetSourcePositionY(This) ) + ( (This)->lpVtbl -> GetSourcePositionY(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetSourceWidth(This) \ - ( (This)->lpVtbl -> GetSourceWidth(This) ) + ( (This)->lpVtbl -> GetSourceWidth(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetSourceHeight(This) \ - ( (This)->lpVtbl -> GetSourceHeight(This) ) + ( (This)->lpVtbl -> GetSourceHeight(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetDestWidth(This) \ - ( (This)->lpVtbl -> GetDestWidth(This) ) + ( (This)->lpVtbl -> GetDestWidth(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetDestHeight(This) \ - ( (This)->lpVtbl -> GetDestHeight(This) ) + ( (This)->lpVtbl -> GetDestHeight(This) ) #define IBMDStreamingMutableVideoEncodingMode_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( 
(This)->lpVtbl -> GetString(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_CreateMutableVideoEncodingMode(This,newEncodingMode) \ - ( (This)->lpVtbl -> CreateMutableVideoEncodingMode(This,newEncodingMode) ) + ( (This)->lpVtbl -> CreateMutableVideoEncodingMode(This,newEncodingMode) ) #define IBMDStreamingMutableVideoEncodingMode_SetSourceRect(This,posX,posY,width,height) \ - ( (This)->lpVtbl -> SetSourceRect(This,posX,posY,width,height) ) + ( (This)->lpVtbl -> SetSourceRect(This,posX,posY,width,height) ) #define IBMDStreamingMutableVideoEncodingMode_SetDestSize(This,width,height) \ - ( (This)->lpVtbl -> SetDestSize(This,width,height) ) + ( (This)->lpVtbl -> SetDestSize(This,width,height) ) #define IBMDStreamingMutableVideoEncodingMode_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IBMDStreamingMutableVideoEncodingMode_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #endif /* COBJMACROS */ @@ -4042,45 +4263,45 @@ #define __IBMDStreamingVideoEncodingModePresetIterator_INTERFACE_DEFINED__ /* interface IBMDStreamingVideoEncodingModePresetIterator */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IBMDStreamingVideoEncodingModePresetIterator; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("7AC731A3-C950-4AD0-804A-8377AA51C6C4") IBMDStreamingVideoEncodingModePresetIterator : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Next( + virtual HRESULT STDMETHODCALLTYPE 
Next( /* [out] */ IBMDStreamingVideoEncodingMode **videoEncodingMode) = 0;
};

[Whitespace-only hunks collapsed: in this region the 1.20 header only strips trailing whitespace from the MIDL-generated declarations; no functional changes. The affected COM interfaces, each declared as a C++ class and, under the C-style interface (`CINTERFACE`), as a vtable struct with COBJMACROS call wrappers, are: IBMDStreamingVideoEncodingModePresetIterator, IBMDStreamingDeviceInput, IBMDStreamingH264NALPacket, IBMDStreamingAudioPacket, IBMDStreamingMPEG2TSPacket, IBMDStreamingH264NALParser, IDeckLinkVideoOutputCallback, IDeckLinkInputCallback, IDeckLinkEncoderInputCallback, IDeckLinkMemoryAllocator, IDeckLinkAudioOutputCallback, IDeckLinkIterator.]

@@ -5314,74 +5535,74 @@
 #define __IDeckLinkAPIInformation_INTERFACE_DEFINED__

 /* interface IDeckLinkAPIInformation */
-/* 
[helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkAPIInformation; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("7BEA3C68-730D-4322-AF34-8A7152B532A4") IDeckLinkAPIInformation : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetFlag( + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ BSTR *value) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkAPIInformationVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkAPIInformation * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkAPIInformation * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkAPIInformation * This); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkAPIInformation * This, /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkAPIInformation * This, /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( 
STDMETHODCALLTYPE *GetFloat )( IDeckLinkAPIInformation * This, /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkAPIInformation * This, /* [in] */ BMDDeckLinkAPIInformationID cfgID, /* [out] */ BSTR *value); - + END_INTERFACE } IDeckLinkAPIInformationVtbl; @@ -5390,32 +5611,32 @@ CONST_VTBL struct IDeckLinkAPIInformationVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkAPIInformation_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkAPIInformation_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkAPIInformation_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkAPIInformation_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IDeckLinkAPIInformation_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkAPIInformation_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkAPIInformation_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #endif /* COBJMACROS */ @@ -5432,180 +5653,191 @@ #define __IDeckLinkOutput_INTERFACE_DEFINED__ /* interface IDeckLinkOutput */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkOutput; #if defined(__cplusplus) && !defined(CINTERFACE) - - MIDL_INTERFACE("CC5C8A6E-3F2F-4B3A-87EA-FD78AF300564") + + MIDL_INTERFACE("065A0F6C-C508-4D0D-B919-F5EB0EBFC96B") IDeckLinkOutput 
: public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + /* [in] */ BMDVideoConnection connection, + /* [in] */ BMDDisplayMode requestedMode, + /* [in] */ BMDPixelFormat requestedPixelFormat, + /* [in] */ BMDSupportedVideoModeFlags flags, + /* [out] */ BMDDisplayMode *actualMode, + /* [out] */ BOOL *supported) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayMode( /* [in] */ BMDDisplayMode displayMode, - /* [in] */ BMDPixelFormat pixelFormat, - /* [in] */ BMDVideoOutputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDVideoOutputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( /* [in] */ int width, /* [in] */ int height, /* [in] */ int rowBytes, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame **outFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( /* [in] */ BMDPixelFormat pixelFormat, /* 
[out] */ IDeckLinkVideoFrameAncillary **outBuffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( + + virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( /* [in] */ IDeckLinkVideoFrame *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( /* [in] */ IDeckLinkVideoFrame *theFrame, /* [in] */ BMDTimeValue displayTime, /* [in] */ BMDTimeValue displayDuration, /* [in] */ BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( /* [in] */ IDeckLinkVideoOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( /* [out] */ unsigned int *bufferedFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount, /* [in] */ BMDAudioOutputStreamType streamType) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( + + virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten) = 0; - + virtual HRESULT STDMETHODCALLTYPE BeginAudioPreroll( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE EndAudioPreroll( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( + + virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [in] */ BMDTimeValue streamTime, /* [in] */ BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten) = 0; - - virtual HRESULT STDMETHODCALLTYPE 
GetBufferedAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( /* [out] */ unsigned int *bufferedSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushBufferedAudioSamples( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( + + virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( /* [in] */ IDeckLinkAudioOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( /* [in] */ BMDTimeValue playbackStartTime, /* [in] */ BMDTimeScale timeScale, /* [in] */ double playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( /* [in] */ BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, /* [in] */ BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( + + virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( /* [out] */ BOOL *active) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( + + virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *streamTime, /* [out] */ double *playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetReferenceStatus( + + virtual HRESULT STDMETHODCALLTYPE GetReferenceStatus( /* [out] */ BMDReferenceStatus *referenceStatus) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFrameCompletionReferenceTimestamp( + + virtual HRESULT STDMETHODCALLTYPE GetFrameCompletionReferenceTimestamp( /* [in] */ IDeckLinkVideoFrame *theFrame, /* [in] */ BMDTimeScale 
desiredTimeScale, /* [out] */ BMDTimeValue *frameCompletionTimestamp) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkOutputVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkOutput * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkOutput * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkOutput * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + IDeckLinkOutput * This, + /* [in] */ BMDVideoConnection connection, + /* [in] */ BMDDisplayMode requestedMode, + /* [in] */ BMDPixelFormat requestedPixelFormat, + /* [in] */ BMDSupportedVideoModeFlags flags, + /* [out] */ BMDDisplayMode *actualMode, + /* [out] */ BOOL *supported); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayMode )( IDeckLinkOutput * This, /* [in] */ BMDDisplayMode displayMode, - /* [in] */ BMDPixelFormat pixelFormat, - /* [in] */ BMDVideoOutputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkOutput * This, /* [out] */ IDeckLinkDisplayModeIterator **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkOutput * This, /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( IDeckLinkOutput * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDVideoOutputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( 
+ + HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( IDeckLinkOutput * This); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( IDeckLinkOutput * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( IDeckLinkOutput * This, /* [in] */ int width, /* [in] */ int height, @@ -5613,111 +5845,111 @@ /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame **outFrame); - - HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( IDeckLinkOutput * This, /* [in] */ BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer); - - HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( + + HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( IDeckLinkOutput * This, /* [in] */ IDeckLinkVideoFrame *theFrame); - - HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( IDeckLinkOutput * This, /* [in] */ IDeckLinkVideoFrame *theFrame, /* [in] */ BMDTimeValue displayTime, /* [in] */ BMDTimeValue displayDuration, /* [in] */ BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( IDeckLinkOutput * This, /* [in] */ IDeckLinkVideoOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( IDeckLinkOutput * This, /* [out] */ unsigned int *bufferedFrameCount); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( IDeckLinkOutput * This, /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount, 
/* [in] */ BMDAudioOutputStreamType streamType); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( IDeckLinkOutput * This); - - HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( + + HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( IDeckLinkOutput * This, /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( IDeckLinkOutput * This); - - HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( IDeckLinkOutput * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( IDeckLinkOutput * This, /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [in] */ BMDTimeValue streamTime, /* [in] */ BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( IDeckLinkOutput * This, /* [out] */ unsigned int *bufferedSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( IDeckLinkOutput * This); - - HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( IDeckLinkOutput * This, /* [in] */ IDeckLinkAudioOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( IDeckLinkOutput * This, /* [in] */ BMDTimeValue playbackStartTime, /* [in] */ BMDTimeScale timeScale, /* [in] */ double playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( IDeckLinkOutput * This, /* [in] */ BMDTimeValue 
stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, /* [in] */ BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( + + HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( IDeckLinkOutput * This, /* [out] */ BOOL *active); - - HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( + + HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( IDeckLinkOutput * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *streamTime, /* [out] */ double *playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *GetReferenceStatus )( + + HRESULT ( STDMETHODCALLTYPE *GetReferenceStatus )( IDeckLinkOutput * This, /* [out] */ BMDReferenceStatus *referenceStatus); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkOutput * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - - HRESULT ( STDMETHODCALLTYPE *GetFrameCompletionReferenceTimestamp )( + + HRESULT ( STDMETHODCALLTYPE *GetFrameCompletionReferenceTimestamp )( IDeckLinkOutput * This, /* [in] */ IDeckLinkVideoFrame *theFrame, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *frameCompletionTimestamp); - + END_INTERFACE } IDeckLinkOutputVtbl; @@ -5726,104 +5958,107 @@ CONST_VTBL struct IDeckLinkOutputVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkOutput_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkOutput_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkOutput_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) + +#define 
IDeckLinkOutput_DoesSupportVideoMode(This,connection,requestedMode,requestedPixelFormat,flags,actualMode,supported) \ + ( (This)->lpVtbl -> DoesSupportVideoMode(This,connection,requestedMode,requestedPixelFormat,flags,actualMode,supported) ) -#define IDeckLinkOutput_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) +#define IDeckLinkOutput_GetDisplayMode(This,displayMode,resultDisplayMode) \ + ( (This)->lpVtbl -> GetDisplayMode(This,displayMode,resultDisplayMode) ) #define IDeckLinkOutput_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkOutput_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkOutput_EnableVideoOutput(This,displayMode,flags) \ - ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) + ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) #define IDeckLinkOutput_DisableVideoOutput(This) \ - ( (This)->lpVtbl -> DisableVideoOutput(This) ) + ( (This)->lpVtbl -> DisableVideoOutput(This) ) #define IDeckLinkOutput_SetVideoOutputFrameMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) #define IDeckLinkOutput_CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) \ - ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) + ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) #define IDeckLinkOutput_CreateAncillaryData(This,pixelFormat,outBuffer) \ - ( (This)->lpVtbl -> 
CreateAncillaryData(This,pixelFormat,outBuffer) ) + ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) #define IDeckLinkOutput_DisplayVideoFrameSync(This,theFrame) \ - ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) + ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) #define IDeckLinkOutput_ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) \ - ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) + ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) #define IDeckLinkOutput_SetScheduledFrameCompletionCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) #define IDeckLinkOutput_GetBufferedVideoFrameCount(This,bufferedFrameCount) \ - ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) + ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) #define IDeckLinkOutput_EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) \ - ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) + ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) #define IDeckLinkOutput_DisableAudioOutput(This) \ - ( (This)->lpVtbl -> DisableAudioOutput(This) ) + ( (This)->lpVtbl -> DisableAudioOutput(This) ) #define IDeckLinkOutput_WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) \ - ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) + ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) #define IDeckLinkOutput_BeginAudioPreroll(This) \ - ( (This)->lpVtbl -> BeginAudioPreroll(This) ) + ( (This)->lpVtbl -> BeginAudioPreroll(This) ) #define IDeckLinkOutput_EndAudioPreroll(This) \ - ( (This)->lpVtbl 
-> EndAudioPreroll(This) ) + ( (This)->lpVtbl -> EndAudioPreroll(This) ) #define IDeckLinkOutput_ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) \ - ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) + ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) #define IDeckLinkOutput_GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) \ - ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) + ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) #define IDeckLinkOutput_FlushBufferedAudioSamples(This) \ - ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) + ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) #define IDeckLinkOutput_SetAudioCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) #define IDeckLinkOutput_StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) \ - ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) + ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) #define IDeckLinkOutput_StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) \ - ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) + ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) #define IDeckLinkOutput_IsScheduledPlaybackRunning(This,active) \ - ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) + ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) #define IDeckLinkOutput_GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) \ - ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) 
) + ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) ) #define IDeckLinkOutput_GetReferenceStatus(This,referenceStatus) \ - ( (This)->lpVtbl -> GetReferenceStatus(This,referenceStatus) ) + ( (This)->lpVtbl -> GetReferenceStatus(This,referenceStatus) ) #define IDeckLinkOutput_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #define IDeckLinkOutput_GetFrameCompletionReferenceTimestamp(This,theFrame,desiredTimeScale,frameCompletionTimestamp) \ - ( (This)->lpVtbl -> GetFrameCompletionReferenceTimestamp(This,theFrame,desiredTimeScale,frameCompletionTimestamp) ) + ( (This)->lpVtbl -> GetFrameCompletionReferenceTimestamp(This,theFrame,desiredTimeScale,frameCompletionTimestamp) ) #endif /* COBJMACROS */ @@ -5840,160 +6075,169 @@ #define __IDeckLinkInput_INTERFACE_DEFINED__ /* interface IDeckLinkInput */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInput; #if defined(__cplusplus) && !defined(CINTERFACE) - - MIDL_INTERFACE("AF22762B-DFAC-4846-AA79-FA8883560995") + + MIDL_INTERFACE("2A88CF76-F494-4216-A7EF-DC74EEB83882") IDeckLinkInput : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + /* [in] */ BMDVideoConnection connection, + /* [in] */ BMDDisplayMode requestedMode, + /* [in] */ BMDPixelFormat requestedPixelFormat, + /* [in] */ BMDSupportedVideoModeFlags flags, + /* [out] */ BOOL *supported) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayMode( /* [in] */ BMDDisplayMode displayMode, - /* [in] */ BMDPixelFormat pixelFormat, - /* [in] */ BMDVideoInputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, /* [out] */ 
IDeckLinkDisplayMode **resultDisplayMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( /* [out] */ unsigned int *availableFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoInputFrameMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetVideoInputFrameMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( /* [out] */ unsigned int *availableSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE PauseStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ 
IDeckLinkInputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInputVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInput * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInput * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + IDeckLinkInput * This, + /* [in] */ BMDVideoConnection connection, + /* [in] */ BMDDisplayMode requestedMode, + /* [in] */ BMDPixelFormat requestedPixelFormat, + /* [in] */ BMDSupportedVideoModeFlags flags, + /* [out] */ BOOL *supported); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayMode )( IDeckLinkInput * This, /* [in] */ BMDDisplayMode displayMode, - /* [in] */ BMDPixelFormat pixelFormat, - /* [in] */ BMDVideoInputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkInput * This, /* [out] */ IDeckLinkDisplayModeIterator **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkInput * This, /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoInput 
)( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( IDeckLinkInput * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( IDeckLinkInput * This, /* [out] */ unsigned int *availableFrameCount); - - HRESULT ( STDMETHODCALLTYPE *SetVideoInputFrameMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoInputFrameMemoryAllocator )( IDeckLinkInput * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( IDeckLinkInput * This, /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( IDeckLinkInput * This, /* [out] */ unsigned int *availableSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *StartStreams )( + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *StopStreams )( + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( IDeckLinkInput * This); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkInput * This, /* [in] */ IDeckLinkInputCallback *theCallback); - - 
HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkInput * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - + END_INTERFACE } IDeckLinkInputVtbl; @@ -6002,68 +6246,71 @@ CONST_VTBL struct IDeckLinkInputVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInput_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInput_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInput_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) + +#define IDeckLinkInput_DoesSupportVideoMode(This,connection,requestedMode,requestedPixelFormat,flags,supported) \ + ( (This)->lpVtbl -> DoesSupportVideoMode(This,connection,requestedMode,requestedPixelFormat,flags,supported) ) -#define IDeckLinkInput_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) +#define IDeckLinkInput_GetDisplayMode(This,displayMode,resultDisplayMode) \ + ( (This)->lpVtbl -> GetDisplayMode(This,displayMode,resultDisplayMode) ) #define IDeckLinkInput_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkInput_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkInput_EnableVideoInput(This,displayMode,pixelFormat,flags) \ - ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) 
) + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) #define IDeckLinkInput_DisableVideoInput(This) \ - ( (This)->lpVtbl -> DisableVideoInput(This) ) + ( (This)->lpVtbl -> DisableVideoInput(This) ) #define IDeckLinkInput_GetAvailableVideoFrameCount(This,availableFrameCount) \ - ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) + ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) #define IDeckLinkInput_SetVideoInputFrameMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetVideoInputFrameMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetVideoInputFrameMemoryAllocator(This,theAllocator) ) #define IDeckLinkInput_EnableAudioInput(This,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) #define IDeckLinkInput_DisableAudioInput(This) \ - ( (This)->lpVtbl -> DisableAudioInput(This) ) + ( (This)->lpVtbl -> DisableAudioInput(This) ) #define IDeckLinkInput_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ - ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) #define IDeckLinkInput_StartStreams(This) \ - ( (This)->lpVtbl -> StartStreams(This) ) + ( (This)->lpVtbl -> StartStreams(This) ) #define IDeckLinkInput_StopStreams(This) \ - ( (This)->lpVtbl -> StopStreams(This) ) + ( (This)->lpVtbl -> StopStreams(This) ) #define IDeckLinkInput_PauseStreams(This) \ - ( (This)->lpVtbl -> PauseStreams(This) ) + ( (This)->lpVtbl -> PauseStreams(This) ) #define IDeckLinkInput_FlushStreams(This) \ - ( (This)->lpVtbl -> FlushStreams(This) ) + ( (This)->lpVtbl -> FlushStreams(This) ) #define IDeckLinkInput_SetCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + ( (This)->lpVtbl 
-> SetCallback(This,theCallback) ) #define IDeckLinkInput_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #endif /* COBJMACROS */ @@ -6080,61 +6327,61 @@ #define __IDeckLinkHDMIInputEDID_INTERFACE_DEFINED__ /* interface IDeckLinkHDMIInputEDID */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkHDMIInputEDID; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("ABBBACBC-45BC-4665-9D92-ACE6E5A97902") IDeckLinkHDMIInputEDID : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetInt( + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkHDMIInputEDIDID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkHDMIInputEDIDID cfgID, /* [out] */ LONGLONG *value) = 0; - + virtual HRESULT STDMETHODCALLTYPE WriteToEDID( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkHDMIInputEDIDVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkHDMIInputEDID * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkHDMIInputEDID * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkHDMIInputEDID * This); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkHDMIInputEDID * This, /* [in] */ BMDDeckLinkHDMIInputEDIDID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE 
*GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkHDMIInputEDID * This, /* [in] */ BMDDeckLinkHDMIInputEDIDID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *WriteToEDID )( + + HRESULT ( STDMETHODCALLTYPE *WriteToEDID )( IDeckLinkHDMIInputEDID * This); - + END_INTERFACE } IDeckLinkHDMIInputEDIDVtbl; @@ -6143,29 +6390,29 @@ CONST_VTBL struct IDeckLinkHDMIInputEDIDVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkHDMIInputEDID_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkHDMIInputEDID_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkHDMIInputEDID_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkHDMIInputEDID_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkHDMIInputEDID_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkHDMIInputEDID_WriteToEDID(This) \ - ( (This)->lpVtbl -> WriteToEDID(This) ) + ( (This)->lpVtbl -> WriteToEDID(This) ) #endif /* COBJMACROS */ @@ -6182,155 +6429,166 @@ #define __IDeckLinkEncoderInput_INTERFACE_DEFINED__ /* interface IDeckLinkEncoderInput */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkEncoderInput; #if defined(__cplusplus) && !defined(CINTERFACE) - - MIDL_INTERFACE("270587DA-6B7D-42E7-A1F0-6D853F581185") + + MIDL_INTERFACE("F222551D-13DF-4FD8-B587-9D4F19EC12C9") IDeckLinkEncoderInput : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + /* [in] */ BMDVideoConnection connection, + /* [in] */ BMDDisplayMode requestedMode, + /* 
[in] */ BMDPixelFormat requestedCodec, + /* [in] */ unsigned int requestedCodecProfile, + /* [in] */ BMDSupportedVideoModeFlags flags, + /* [out] */ BOOL *supported) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayMode( /* [in] */ BMDDisplayMode displayMode, - /* [in] */ BMDPixelFormat pixelFormat, - /* [in] */ BMDVideoInputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailablePacketsCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailablePacketsCount( /* [out] */ unsigned int *availablePacketsCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( /* [in] */ BMDAudioFormat audioFormat, /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( /* [out] */ unsigned int *availableSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE 
PauseStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkEncoderInputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkEncoderInputVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkEncoderInput * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkEncoderInput * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkEncoderInput * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + IDeckLinkEncoderInput * This, + /* [in] */ BMDVideoConnection connection, + /* [in] */ BMDDisplayMode requestedMode, + /* [in] */ BMDPixelFormat requestedCodec, + /* [in] */ unsigned int requestedCodecProfile, + /* [in] */ BMDSupportedVideoModeFlags flags, + /* [out] */ BOOL *supported); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayMode )( IDeckLinkEncoderInput * This, /* [in] */ BMDDisplayMode displayMode, - /* [in] */ BMDPixelFormat pixelFormat, - /* [in] */ BMDVideoInputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkEncoderInput * This, /* [out] */ 
IDeckLinkDisplayModeIterator **iterator); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( IDeckLinkEncoderInput * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( IDeckLinkEncoderInput * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailablePacketsCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailablePacketsCount )( IDeckLinkEncoderInput * This, /* [out] */ unsigned int *availablePacketsCount); - - HRESULT ( STDMETHODCALLTYPE *SetMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetMemoryAllocator )( IDeckLinkEncoderInput * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( IDeckLinkEncoderInput * This, /* [in] */ BMDAudioFormat audioFormat, /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( IDeckLinkEncoderInput * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( IDeckLinkEncoderInput * This, /* [out] */ unsigned int *availableSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *StartStreams )( + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( IDeckLinkEncoderInput * This); - - HRESULT ( STDMETHODCALLTYPE *StopStreams )( + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( IDeckLinkEncoderInput * This); - - HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( IDeckLinkEncoderInput * This); - - HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( IDeckLinkEncoderInput * 
This); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkEncoderInput * This, /* [in] */ IDeckLinkEncoderInputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkEncoderInput * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - + END_INTERFACE } IDeckLinkEncoderInputVtbl; @@ -6339,65 +6597,68 @@ CONST_VTBL struct IDeckLinkEncoderInputVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkEncoderInput_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkEncoderInput_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkEncoderInput_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) + +#define IDeckLinkEncoderInput_DoesSupportVideoMode(This,connection,requestedMode,requestedCodec,requestedCodecProfile,flags,supported) \ + ( (This)->lpVtbl -> DoesSupportVideoMode(This,connection,requestedMode,requestedCodec,requestedCodecProfile,flags,supported) ) -#define IDeckLinkEncoderInput_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) +#define IDeckLinkEncoderInput_GetDisplayMode(This,displayMode,resultDisplayMode) \ + ( (This)->lpVtbl -> GetDisplayMode(This,displayMode,resultDisplayMode) ) #define IDeckLinkEncoderInput_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define 
IDeckLinkEncoderInput_EnableVideoInput(This,displayMode,pixelFormat,flags) \ - ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) #define IDeckLinkEncoderInput_DisableVideoInput(This) \ - ( (This)->lpVtbl -> DisableVideoInput(This) ) + ( (This)->lpVtbl -> DisableVideoInput(This) ) #define IDeckLinkEncoderInput_GetAvailablePacketsCount(This,availablePacketsCount) \ - ( (This)->lpVtbl -> GetAvailablePacketsCount(This,availablePacketsCount) ) + ( (This)->lpVtbl -> GetAvailablePacketsCount(This,availablePacketsCount) ) #define IDeckLinkEncoderInput_SetMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetMemoryAllocator(This,theAllocator) ) #define IDeckLinkEncoderInput_EnableAudioInput(This,audioFormat,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> EnableAudioInput(This,audioFormat,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioInput(This,audioFormat,sampleRate,sampleType,channelCount) ) #define IDeckLinkEncoderInput_DisableAudioInput(This) \ - ( (This)->lpVtbl -> DisableAudioInput(This) ) + ( (This)->lpVtbl -> DisableAudioInput(This) ) #define IDeckLinkEncoderInput_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ - ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) #define IDeckLinkEncoderInput_StartStreams(This) \ - ( (This)->lpVtbl -> StartStreams(This) ) + ( (This)->lpVtbl -> StartStreams(This) ) #define IDeckLinkEncoderInput_StopStreams(This) \ - ( (This)->lpVtbl -> StopStreams(This) ) + ( (This)->lpVtbl -> StopStreams(This) ) #define IDeckLinkEncoderInput_PauseStreams(This) \ - ( (This)->lpVtbl -> PauseStreams(This) ) + ( (This)->lpVtbl -> PauseStreams(This) ) #define IDeckLinkEncoderInput_FlushStreams(This) \ - ( 
(This)->lpVtbl -> FlushStreams(This) ) + ( (This)->lpVtbl -> FlushStreams(This) ) #define IDeckLinkEncoderInput_SetCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetCallback(This,theCallback) ) #define IDeckLinkEncoderInput_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #endif /* COBJMACROS */ @@ -6414,86 +6675,86 @@ #define __IDeckLinkVideoFrame_INTERFACE_DEFINED__ /* interface IDeckLinkVideoFrame */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoFrame; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("3F716FE0-F023-4111-BE5D-EF4414C05B17") IDeckLinkVideoFrame : public IUnknown { public: virtual long STDMETHODCALLTYPE GetWidth( void) = 0; - + virtual long STDMETHODCALLTYPE GetHeight( void) = 0; - + virtual long STDMETHODCALLTYPE GetRowBytes( void) = 0; - + virtual BMDPixelFormat STDMETHODCALLTYPE GetPixelFormat( void) = 0; - + virtual BMDFrameFlags STDMETHODCALLTYPE GetFlags( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( /* [out] */ void **buffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecode( + + virtual HRESULT STDMETHODCALLTYPE GetTimecode( /* [in] */ BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode **timecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE GetAncillaryData( /* [out] */ IDeckLinkVideoFrameAncillary **ancillary) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoFrameVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE 
*QueryInterface )( IDeckLinkVideoFrame * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoFrame * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoFrame * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkVideoFrame * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkVideoFrame * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkVideoFrame * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkVideoFrame * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkVideoFrame * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkVideoFrame * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkVideoFrame * This, /* [in] */ BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( IDeckLinkVideoFrame * This, /* [out] */ IDeckLinkVideoFrameAncillary **ancillary); - + END_INTERFACE } IDeckLinkVideoFrameVtbl; @@ -6502,44 +6763,44 @@ CONST_VTBL struct IDeckLinkVideoFrameVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoFrame_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoFrame_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> 
AddRef(This) ) #define IDeckLinkVideoFrame_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoFrame_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkVideoFrame_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkVideoFrame_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkVideoFrame_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkVideoFrame_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkVideoFrame_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkVideoFrame_GetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) #define IDeckLinkVideoFrame_GetAncillaryData(This,ancillary) \ - ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) + ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) #endif /* COBJMACROS */ @@ -6556,98 +6817,98 @@ #define __IDeckLinkMutableVideoFrame_INTERFACE_DEFINED__ /* interface IDeckLinkMutableVideoFrame */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkMutableVideoFrame; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("69E2639F-40DA-4E19-B6F2-20ACE815C390") IDeckLinkMutableVideoFrame : public IDeckLinkVideoFrame { public: - virtual HRESULT STDMETHODCALLTYPE SetFlags( + virtual HRESULT STDMETHODCALLTYPE SetFlags( /* [in] */ BMDFrameFlags newFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetTimecode( + + virtual HRESULT STDMETHODCALLTYPE SetTimecode( /* [in] */ BMDTimecodeFormat 
format, /* [in] */ IDeckLinkTimecode *timecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetTimecodeFromComponents( + + virtual HRESULT STDMETHODCALLTYPE SetTimecodeFromComponents( /* [in] */ BMDTimecodeFormat format, /* [in] */ unsigned char hours, /* [in] */ unsigned char minutes, /* [in] */ unsigned char seconds, /* [in] */ unsigned char frames, /* [in] */ BMDTimecodeFlags flags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE SetAncillaryData( /* [in] */ IDeckLinkVideoFrameAncillary *ancillary) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetTimecodeUserBits( + + virtual HRESULT STDMETHODCALLTYPE SetTimecodeUserBits( /* [in] */ BMDTimecodeFormat format, /* [in] */ BMDTimecodeUserBits userBits) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkMutableVideoFrameVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkMutableVideoFrame * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkMutableVideoFrame * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkMutableVideoFrame * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkMutableVideoFrame * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkMutableVideoFrame * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkMutableVideoFrame * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkMutableVideoFrame * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( 
IDeckLinkMutableVideoFrame * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkMutableVideoFrame * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkMutableVideoFrame * This, /* [in] */ BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( IDeckLinkMutableVideoFrame * This, /* [out] */ IDeckLinkVideoFrameAncillary **ancillary); - - HRESULT ( STDMETHODCALLTYPE *SetFlags )( + + HRESULT ( STDMETHODCALLTYPE *SetFlags )( IDeckLinkMutableVideoFrame * This, /* [in] */ BMDFrameFlags newFlags); - - HRESULT ( STDMETHODCALLTYPE *SetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *SetTimecode )( IDeckLinkMutableVideoFrame * This, /* [in] */ BMDTimecodeFormat format, /* [in] */ IDeckLinkTimecode *timecode); - - HRESULT ( STDMETHODCALLTYPE *SetTimecodeFromComponents )( + + HRESULT ( STDMETHODCALLTYPE *SetTimecodeFromComponents )( IDeckLinkMutableVideoFrame * This, /* [in] */ BMDTimecodeFormat format, /* [in] */ unsigned char hours, @@ -6655,16 +6916,16 @@ /* [in] */ unsigned char seconds, /* [in] */ unsigned char frames, /* [in] */ BMDTimecodeFlags flags); - - HRESULT ( STDMETHODCALLTYPE *SetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *SetAncillaryData )( IDeckLinkMutableVideoFrame * This, /* [in] */ IDeckLinkVideoFrameAncillary *ancillary); - - HRESULT ( STDMETHODCALLTYPE *SetTimecodeUserBits )( + + HRESULT ( STDMETHODCALLTYPE *SetTimecodeUserBits )( IDeckLinkMutableVideoFrame * This, /* [in] */ BMDTimecodeFormat format, /* [in] */ BMDTimecodeUserBits userBits); - + END_INTERFACE } IDeckLinkMutableVideoFrameVtbl; @@ -6673,60 +6934,60 @@ CONST_VTBL struct IDeckLinkMutableVideoFrameVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkMutableVideoFrame_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> 
[Diff omitted: whitespace-only changes (trailing-space cleanup in the MIDL-generated output) to the DeckLink COM API header bundled with gst-plugins-bad, covering the COBJMACROS vtable `#define`s and C-style interface declarations for IDeckLinkMutableVideoFrame, IDeckLinkVideoFrame3DExtensions, IDeckLinkVideoFrameMetadataExtensions, IDeckLinkVideoInputFrame, IDeckLinkAncillaryPacket, IDeckLinkAncillaryPacketIterator, IDeckLinkVideoFrameAncillaryPackets, IDeckLinkVideoFrameAncillary, IDeckLinkEncoderPacket, IDeckLinkEncoderVideoPacket, IDeckLinkEncoderAudioPacket, and IDeckLinkH265NALPacket.]
+ + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkH265NALPacket * This); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceTimestamp )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceTimestamp )( IDeckLinkH265NALPacket * This, /* [in] */ BMDTimeScale timeScale, /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkH265NALPacket * This, /* [in] */ BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetUnitType )( + + HRESULT ( STDMETHODCALLTYPE *GetUnitType )( IDeckLinkH265NALPacket * This, /* [out] */ unsigned char *unitType); - - HRESULT ( STDMETHODCALLTYPE *GetBytesNoPrefix )( + + HRESULT ( STDMETHODCALLTYPE *GetBytesNoPrefix )( IDeckLinkH265NALPacket * This, /* [out] */ void **buffer); - - long ( STDMETHODCALLTYPE *GetSizeNoPrefix )( + + long ( STDMETHODCALLTYPE *GetSizeNoPrefix )( IDeckLinkH265NALPacket * This); - + END_INTERFACE } IDeckLinkH265NALPacketVtbl; @@ -7950,52 +8211,52 @@ CONST_VTBL struct IDeckLinkH265NALPacketVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkH265NALPacket_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkH265NALPacket_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkH265NALPacket_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkH265NALPacket_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkH265NALPacket_GetSize(This) \ - ( (This)->lpVtbl -> GetSize(This) ) + ( (This)->lpVtbl -> GetSize(This) ) #define IDeckLinkH265NALPacket_GetStreamTime(This,frameTime,timeScale) \ - ( (This)->lpVtbl -> 
GetStreamTime(This,frameTime,timeScale) ) + ( (This)->lpVtbl -> GetStreamTime(This,frameTime,timeScale) ) #define IDeckLinkH265NALPacket_GetPacketType(This) \ - ( (This)->lpVtbl -> GetPacketType(This) ) + ( (This)->lpVtbl -> GetPacketType(This) ) #define IDeckLinkH265NALPacket_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkH265NALPacket_GetHardwareReferenceTimestamp(This,timeScale,frameTime,frameDuration) \ - ( (This)->lpVtbl -> GetHardwareReferenceTimestamp(This,timeScale,frameTime,frameDuration) ) + ( (This)->lpVtbl -> GetHardwareReferenceTimestamp(This,timeScale,frameTime,frameDuration) ) #define IDeckLinkH265NALPacket_GetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) #define IDeckLinkH265NALPacket_GetUnitType(This,unitType) \ - ( (This)->lpVtbl -> GetUnitType(This,unitType) ) + ( (This)->lpVtbl -> GetUnitType(This,unitType) ) #define IDeckLinkH265NALPacket_GetBytesNoPrefix(This,buffer) \ - ( (This)->lpVtbl -> GetBytesNoPrefix(This,buffer) ) + ( (This)->lpVtbl -> GetBytesNoPrefix(This,buffer) ) #define IDeckLinkH265NALPacket_GetSizeNoPrefix(This) \ - ( (This)->lpVtbl -> GetSizeNoPrefix(This) ) + ( (This)->lpVtbl -> GetSizeNoPrefix(This) ) #endif /* COBJMACROS */ @@ -8012,59 +8273,59 @@ #define __IDeckLinkAudioInputPacket_INTERFACE_DEFINED__ /* interface IDeckLinkAudioInputPacket */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkAudioInputPacket; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("E43D5870-2894-11DE-8C30-0800200C9A66") IDeckLinkAudioInputPacket : public IUnknown { public: virtual long STDMETHODCALLTYPE GetSampleFrameCount( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( /* [out] */ void **buffer) = 0; - - virtual HRESULT 
STDMETHODCALLTYPE GetPacketTime( + + virtual HRESULT STDMETHODCALLTYPE GetPacketTime( /* [out] */ BMDTimeValue *packetTime, /* [in] */ BMDTimeScale timeScale) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkAudioInputPacketVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkAudioInputPacket * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkAudioInputPacket * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkAudioInputPacket * This); - - long ( STDMETHODCALLTYPE *GetSampleFrameCount )( + + long ( STDMETHODCALLTYPE *GetSampleFrameCount )( IDeckLinkAudioInputPacket * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkAudioInputPacket * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetPacketTime )( + + HRESULT ( STDMETHODCALLTYPE *GetPacketTime )( IDeckLinkAudioInputPacket * This, /* [out] */ BMDTimeValue *packetTime, /* [in] */ BMDTimeScale timeScale); - + END_INTERFACE } IDeckLinkAudioInputPacketVtbl; @@ -8073,29 +8334,29 @@ CONST_VTBL struct IDeckLinkAudioInputPacketVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkAudioInputPacket_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkAudioInputPacket_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkAudioInputPacket_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkAudioInputPacket_GetSampleFrameCount(This) \ - ( (This)->lpVtbl -> GetSampleFrameCount(This) ) + ( (This)->lpVtbl 
-> GetSampleFrameCount(This) ) #define IDeckLinkAudioInputPacket_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkAudioInputPacket_GetPacketTime(This,packetTime,timeScale) \ - ( (This)->lpVtbl -> GetPacketTime(This,packetTime,timeScale) ) + ( (This)->lpVtbl -> GetPacketTime(This,packetTime,timeScale) ) #endif /* COBJMACROS */ @@ -8112,45 +8373,45 @@ #define __IDeckLinkScreenPreviewCallback_INTERFACE_DEFINED__ /* interface IDeckLinkScreenPreviewCallback */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkScreenPreviewCallback; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("B1D3F49A-85FE-4C5D-95C8-0B5D5DCCD438") IDeckLinkScreenPreviewCallback : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DrawFrame( + virtual HRESULT STDMETHODCALLTYPE DrawFrame( /* [in] */ IDeckLinkVideoFrame *theFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkScreenPreviewCallbackVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkScreenPreviewCallback * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkScreenPreviewCallback * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkScreenPreviewCallback * This); - - HRESULT ( STDMETHODCALLTYPE *DrawFrame )( + + HRESULT ( STDMETHODCALLTYPE *DrawFrame )( IDeckLinkScreenPreviewCallback * This, /* [in] */ IDeckLinkVideoFrame *theFrame); - + END_INTERFACE } IDeckLinkScreenPreviewCallbackVtbl; @@ -8159,23 +8420,23 @@ CONST_VTBL struct IDeckLinkScreenPreviewCallbackVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define 
IDeckLinkScreenPreviewCallback_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkScreenPreviewCallback_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkScreenPreviewCallback_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkScreenPreviewCallback_DrawFrame(This,theFrame) \ - ( (This)->lpVtbl -> DrawFrame(This,theFrame) ) + ( (This)->lpVtbl -> DrawFrame(This,theFrame) ) #endif /* COBJMACROS */ @@ -8192,62 +8453,62 @@ #define __IDeckLinkGLScreenPreviewHelper_INTERFACE_DEFINED__ /* interface IDeckLinkGLScreenPreviewHelper */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkGLScreenPreviewHelper; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("504E2209-CAC7-4C1A-9FB4-C5BB6274D22F") IDeckLinkGLScreenPreviewHelper : public IUnknown { public: virtual HRESULT STDMETHODCALLTYPE InitializeGL( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE PaintGL( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFrame( + + virtual HRESULT STDMETHODCALLTYPE SetFrame( /* [in] */ IDeckLinkVideoFrame *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE Set3DPreviewFormat( + + virtual HRESULT STDMETHODCALLTYPE Set3DPreviewFormat( /* [in] */ BMD3DPreviewFormat previewFormat) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkGLScreenPreviewHelperVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkGLScreenPreviewHelper * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkGLScreenPreviewHelper * This); - 
- ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkGLScreenPreviewHelper * This); - - HRESULT ( STDMETHODCALLTYPE *InitializeGL )( + + HRESULT ( STDMETHODCALLTYPE *InitializeGL )( IDeckLinkGLScreenPreviewHelper * This); - - HRESULT ( STDMETHODCALLTYPE *PaintGL )( + + HRESULT ( STDMETHODCALLTYPE *PaintGL )( IDeckLinkGLScreenPreviewHelper * This); - - HRESULT ( STDMETHODCALLTYPE *SetFrame )( + + HRESULT ( STDMETHODCALLTYPE *SetFrame )( IDeckLinkGLScreenPreviewHelper * This, /* [in] */ IDeckLinkVideoFrame *theFrame); - - HRESULT ( STDMETHODCALLTYPE *Set3DPreviewFormat )( + + HRESULT ( STDMETHODCALLTYPE *Set3DPreviewFormat )( IDeckLinkGLScreenPreviewHelper * This, /* [in] */ BMD3DPreviewFormat previewFormat); - + END_INTERFACE } IDeckLinkGLScreenPreviewHelperVtbl; @@ -8256,32 +8517,32 @@ CONST_VTBL struct IDeckLinkGLScreenPreviewHelperVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkGLScreenPreviewHelper_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkGLScreenPreviewHelper_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkGLScreenPreviewHelper_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkGLScreenPreviewHelper_InitializeGL(This) \ - ( (This)->lpVtbl -> InitializeGL(This) ) + ( (This)->lpVtbl -> InitializeGL(This) ) #define IDeckLinkGLScreenPreviewHelper_PaintGL(This) \ - ( (This)->lpVtbl -> PaintGL(This) ) + ( (This)->lpVtbl -> PaintGL(This) ) #define IDeckLinkGLScreenPreviewHelper_SetFrame(This,theFrame) \ - ( (This)->lpVtbl -> SetFrame(This,theFrame) ) + ( (This)->lpVtbl -> SetFrame(This,theFrame) ) #define IDeckLinkGLScreenPreviewHelper_Set3DPreviewFormat(This,previewFormat) \ - ( (This)->lpVtbl -> Set3DPreviewFormat(This,previewFormat) ) + ( (This)->lpVtbl -> 
Set3DPreviewFormat(This,previewFormat) ) #endif /* COBJMACROS */ @@ -8298,66 +8559,66 @@ #define __IDeckLinkDX9ScreenPreviewHelper_INTERFACE_DEFINED__ /* interface IDeckLinkDX9ScreenPreviewHelper */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDX9ScreenPreviewHelper; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("2094B522-D1A1-40C0-9AC7-1C012218EF02") IDeckLinkDX9ScreenPreviewHelper : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Initialize( + virtual HRESULT STDMETHODCALLTYPE Initialize( /* [in] */ void *device) = 0; - - virtual HRESULT STDMETHODCALLTYPE Render( + + virtual HRESULT STDMETHODCALLTYPE Render( /* [in] */ RECT *rc) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFrame( + + virtual HRESULT STDMETHODCALLTYPE SetFrame( /* [in] */ IDeckLinkVideoFrame *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE Set3DPreviewFormat( + + virtual HRESULT STDMETHODCALLTYPE Set3DPreviewFormat( /* [in] */ BMD3DPreviewFormat previewFormat) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDX9ScreenPreviewHelperVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDX9ScreenPreviewHelper * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDX9ScreenPreviewHelper * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDX9ScreenPreviewHelper * This); - - HRESULT ( STDMETHODCALLTYPE *Initialize )( + + HRESULT ( STDMETHODCALLTYPE *Initialize )( IDeckLinkDX9ScreenPreviewHelper * This, /* [in] */ void *device); - - HRESULT ( STDMETHODCALLTYPE *Render )( + + HRESULT ( STDMETHODCALLTYPE *Render )( IDeckLinkDX9ScreenPreviewHelper * This, /* [in] */ RECT *rc); - - 
HRESULT ( STDMETHODCALLTYPE *SetFrame )( + + HRESULT ( STDMETHODCALLTYPE *SetFrame )( IDeckLinkDX9ScreenPreviewHelper * This, /* [in] */ IDeckLinkVideoFrame *theFrame); - - HRESULT ( STDMETHODCALLTYPE *Set3DPreviewFormat )( + + HRESULT ( STDMETHODCALLTYPE *Set3DPreviewFormat )( IDeckLinkDX9ScreenPreviewHelper * This, /* [in] */ BMD3DPreviewFormat previewFormat); - + END_INTERFACE } IDeckLinkDX9ScreenPreviewHelperVtbl; @@ -8366,32 +8627,32 @@ CONST_VTBL struct IDeckLinkDX9ScreenPreviewHelperVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDX9ScreenPreviewHelper_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDX9ScreenPreviewHelper_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDX9ScreenPreviewHelper_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDX9ScreenPreviewHelper_Initialize(This,device) \ - ( (This)->lpVtbl -> Initialize(This,device) ) + ( (This)->lpVtbl -> Initialize(This,device) ) #define IDeckLinkDX9ScreenPreviewHelper_Render(This,rc) \ - ( (This)->lpVtbl -> Render(This,rc) ) + ( (This)->lpVtbl -> Render(This,rc) ) #define IDeckLinkDX9ScreenPreviewHelper_SetFrame(This,theFrame) \ - ( (This)->lpVtbl -> SetFrame(This,theFrame) ) + ( (This)->lpVtbl -> SetFrame(This,theFrame) ) #define IDeckLinkDX9ScreenPreviewHelper_Set3DPreviewFormat(This,previewFormat) \ - ( (This)->lpVtbl -> Set3DPreviewFormat(This,previewFormat) ) + ( (This)->lpVtbl -> Set3DPreviewFormat(This,previewFormat) ) #endif /* COBJMACROS */ @@ -8408,49 +8669,49 @@ #define __IDeckLinkNotificationCallback_INTERFACE_DEFINED__ /* interface IDeckLinkNotificationCallback */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkNotificationCallback; #if defined(__cplusplus) && 
!defined(CINTERFACE) - + MIDL_INTERFACE("b002a1ec-070d-4288-8289-bd5d36e5ff0d") IDeckLinkNotificationCallback : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Notify( + virtual HRESULT STDMETHODCALLTYPE Notify( /* [in] */ BMDNotifications topic, /* [in] */ ULONGLONG param1, /* [in] */ ULONGLONG param2) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkNotificationCallbackVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkNotificationCallback * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkNotificationCallback * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkNotificationCallback * This); - - HRESULT ( STDMETHODCALLTYPE *Notify )( + + HRESULT ( STDMETHODCALLTYPE *Notify )( IDeckLinkNotificationCallback * This, /* [in] */ BMDNotifications topic, /* [in] */ ULONGLONG param1, /* [in] */ ULONGLONG param2); - + END_INTERFACE } IDeckLinkNotificationCallbackVtbl; @@ -8459,23 +8720,23 @@ CONST_VTBL struct IDeckLinkNotificationCallbackVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkNotificationCallback_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkNotificationCallback_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkNotificationCallback_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkNotificationCallback_Notify(This,topic,param1,param2) \ - ( (This)->lpVtbl -> Notify(This,topic,param1,param2) ) + ( (This)->lpVtbl -> Notify(This,topic,param1,param2) ) #endif /* COBJMACROS */ @@ 
-8492,56 +8753,56 @@ #define __IDeckLinkNotification_INTERFACE_DEFINED__ /* interface IDeckLinkNotification */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkNotification; #if defined(__cplusplus) && !defined(CINTERFACE) - - MIDL_INTERFACE("0a1fb207-e215-441b-9b19-6fa1575946c5") + + MIDL_INTERFACE("b85df4c8-bdf5-47c1-8064-28162ebdd4eb") IDeckLinkNotification : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Subscribe( + virtual HRESULT STDMETHODCALLTYPE Subscribe( /* [in] */ BMDNotifications topic, /* [in] */ IDeckLinkNotificationCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE Unsubscribe( + + virtual HRESULT STDMETHODCALLTYPE Unsubscribe( /* [in] */ BMDNotifications topic, /* [in] */ IDeckLinkNotificationCallback *theCallback) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkNotificationVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkNotification * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkNotification * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkNotification * This); - - HRESULT ( STDMETHODCALLTYPE *Subscribe )( + + HRESULT ( STDMETHODCALLTYPE *Subscribe )( IDeckLinkNotification * This, /* [in] */ BMDNotifications topic, /* [in] */ IDeckLinkNotificationCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *Unsubscribe )( + + HRESULT ( STDMETHODCALLTYPE *Unsubscribe )( IDeckLinkNotification * This, /* [in] */ BMDNotifications topic, /* [in] */ IDeckLinkNotificationCallback *theCallback); - + END_INTERFACE } IDeckLinkNotificationVtbl; @@ -8550,26 +8811,26 @@ CONST_VTBL struct IDeckLinkNotificationVtbl *lpVtbl; }; - + #ifdef 
COBJMACROS #define IDeckLinkNotification_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkNotification_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkNotification_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkNotification_Subscribe(This,topic,theCallback) \ - ( (This)->lpVtbl -> Subscribe(This,topic,theCallback) ) + ( (This)->lpVtbl -> Subscribe(This,topic,theCallback) ) #define IDeckLinkNotification_Unsubscribe(This,topic,theCallback) \ - ( (This)->lpVtbl -> Unsubscribe(This,topic,theCallback) ) + ( (This)->lpVtbl -> Unsubscribe(This,topic,theCallback) ) #endif /* COBJMACROS */ @@ -8582,112 +8843,494 @@ #endif /* __IDeckLinkNotification_INTERFACE_DEFINED__ */ -#ifndef __IDeckLinkAttributes_INTERFACE_DEFINED__ -#define __IDeckLinkAttributes_INTERFACE_DEFINED__ +#ifndef __IDeckLinkProfileAttributes_INTERFACE_DEFINED__ +#define __IDeckLinkProfileAttributes_INTERFACE_DEFINED__ -/* interface IDeckLinkAttributes */ -/* [helpstring][local][uuid][object] */ +/* interface IDeckLinkProfileAttributes */ +/* [helpstring][local][uuid][object] */ -EXTERN_C const IID IID_IDeckLinkAttributes; +EXTERN_C const IID IID_IDeckLinkProfileAttributes; #if defined(__cplusplus) && !defined(CINTERFACE) - - MIDL_INTERFACE("ABC11843-D966-44CB-96E2-A1CB5D3135C4") - IDeckLinkAttributes : public IUnknown + + MIDL_INTERFACE("17D4BF8E-4911-473A-80A0-731CF6FF345B") + IDeckLinkProfileAttributes : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetFlag( + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual 
HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ BSTR *value) = 0; - + }; - - + + #else /* C style interface */ - typedef struct IDeckLinkAttributesVtbl + typedef struct IDeckLinkProfileAttributesVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( - IDeckLinkAttributes * This, + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkProfileAttributes * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( - IDeckLinkAttributes * This); - - ULONG ( STDMETHODCALLTYPE *Release )( - IDeckLinkAttributes * This); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( - IDeckLinkAttributes * This, + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkProfileAttributes * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkProfileAttributes * This); + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( + IDeckLinkProfileAttributes * This, /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( - IDeckLinkAttributes * This, + + HRESULT ( STDMETHODCALLTYPE *GetInt )( + IDeckLinkProfileAttributes * This, /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( - IDeckLinkAttributes * This, + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( + IDeckLinkProfileAttributes * This, /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( - IDeckLinkAttributes * This, + + HRESULT ( STDMETHODCALLTYPE *GetString )( + IDeckLinkProfileAttributes * This, /* [in] */ BMDDeckLinkAttributeID cfgID, /* [out] */ BSTR *value); - + + END_INTERFACE + } 
IDeckLinkProfileAttributesVtbl; + + interface IDeckLinkProfileAttributes + { + CONST_VTBL struct IDeckLinkProfileAttributesVtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkProfileAttributes_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkProfileAttributes_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkProfileAttributes_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkProfileAttributes_GetFlag(This,cfgID,value) \ + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + +#define IDeckLinkProfileAttributes_GetInt(This,cfgID,value) \ + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + +#define IDeckLinkProfileAttributes_GetFloat(This,cfgID,value) \ + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + +#define IDeckLinkProfileAttributes_GetString(This,cfgID,value) \ + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkProfileAttributes_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkProfileIterator_INTERFACE_DEFINED__ +#define __IDeckLinkProfileIterator_INTERFACE_DEFINED__ + +/* interface IDeckLinkProfileIterator */ +/* [helpstring][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkProfileIterator; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("29E5A8C0-8BE4-46EB-93AC-31DAAB5B7BF2") + IDeckLinkProfileIterator : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE Next( + /* [out] */ IDeckLinkProfile **profile) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkProfileIteratorVtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkProfileIterator * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkProfileIterator * This); + + ULONG ( STDMETHODCALLTYPE 
*Release )( + IDeckLinkProfileIterator * This); + + HRESULT ( STDMETHODCALLTYPE *Next )( + IDeckLinkProfileIterator * This, + /* [out] */ IDeckLinkProfile **profile); + + END_INTERFACE + } IDeckLinkProfileIteratorVtbl; + + interface IDeckLinkProfileIterator + { + CONST_VTBL struct IDeckLinkProfileIteratorVtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkProfileIterator_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkProfileIterator_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkProfileIterator_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkProfileIterator_Next(This,profile) \ + ( (This)->lpVtbl -> Next(This,profile) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkProfileIterator_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkProfile_INTERFACE_DEFINED__ +#define __IDeckLinkProfile_INTERFACE_DEFINED__ + +/* interface IDeckLinkProfile */ +/* [helpstring][local][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkProfile; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("16093466-674A-432B-9DA0-1AC2C5A8241C") + IDeckLinkProfile : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE GetDevice( + /* [out] */ IDeckLink **device) = 0; + + virtual HRESULT STDMETHODCALLTYPE IsActive( + /* [out] */ BOOL *isActive) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetActive( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetPeers( + /* [out] */ IDeckLinkProfileIterator **profileIterator) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkProfileVtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkProfile * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkProfile * This); + + ULONG ( STDMETHODCALLTYPE 
*Release )( + IDeckLinkProfile * This); + + HRESULT ( STDMETHODCALLTYPE *GetDevice )( + IDeckLinkProfile * This, + /* [out] */ IDeckLink **device); + + HRESULT ( STDMETHODCALLTYPE *IsActive )( + IDeckLinkProfile * This, + /* [out] */ BOOL *isActive); + + HRESULT ( STDMETHODCALLTYPE *SetActive )( + IDeckLinkProfile * This); + + HRESULT ( STDMETHODCALLTYPE *GetPeers )( + IDeckLinkProfile * This, + /* [out] */ IDeckLinkProfileIterator **profileIterator); + END_INTERFACE - } IDeckLinkAttributesVtbl; + } IDeckLinkProfileVtbl; - interface IDeckLinkAttributes + interface IDeckLinkProfile { - CONST_VTBL struct IDeckLinkAttributesVtbl *lpVtbl; + CONST_VTBL struct IDeckLinkProfileVtbl *lpVtbl; }; - + #ifdef COBJMACROS -#define IDeckLinkAttributes_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) +#define IDeckLinkProfile_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkProfile_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkProfile_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkProfile_GetDevice(This,device) \ + ( (This)->lpVtbl -> GetDevice(This,device) ) + +#define IDeckLinkProfile_IsActive(This,isActive) \ + ( (This)->lpVtbl -> IsActive(This,isActive) ) + +#define IDeckLinkProfile_SetActive(This) \ + ( (This)->lpVtbl -> SetActive(This) ) + +#define IDeckLinkProfile_GetPeers(This,profileIterator) \ + ( (This)->lpVtbl -> GetPeers(This,profileIterator) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkProfile_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkProfileCallback_INTERFACE_DEFINED__ +#define __IDeckLinkProfileCallback_INTERFACE_DEFINED__ + +/* interface IDeckLinkProfileCallback */ +/* [helpstring][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkProfileCallback; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + 
MIDL_INTERFACE("A4F9341E-97AA-4E04-8935-15F809898CEA") + IDeckLinkProfileCallback : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE ProfileChanging( + /* [in] */ IDeckLinkProfile *profileToBeActivated, + /* [in] */ BOOL streamsWillBeForcedToStop) = 0; + + virtual HRESULT STDMETHODCALLTYPE ProfileActivated( + /* [in] */ IDeckLinkProfile *activatedProfile) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkProfileCallbackVtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkProfileCallback * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkProfileCallback * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkProfileCallback * This); + + HRESULT ( STDMETHODCALLTYPE *ProfileChanging )( + IDeckLinkProfileCallback * This, + /* [in] */ IDeckLinkProfile *profileToBeActivated, + /* [in] */ BOOL streamsWillBeForcedToStop); + + HRESULT ( STDMETHODCALLTYPE *ProfileActivated )( + IDeckLinkProfileCallback * This, + /* [in] */ IDeckLinkProfile *activatedProfile); + + END_INTERFACE + } IDeckLinkProfileCallbackVtbl; + + interface IDeckLinkProfileCallback + { + CONST_VTBL struct IDeckLinkProfileCallbackVtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkProfileCallback_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkProfileCallback_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkProfileCallback_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkProfileCallback_ProfileChanging(This,profileToBeActivated,streamsWillBeForcedToStop) \ + ( (This)->lpVtbl -> ProfileChanging(This,profileToBeActivated,streamsWillBeForcedToStop) ) + +#define IDeckLinkProfileCallback_ProfileActivated(This,activatedProfile) \ + ( (This)->lpVtbl -> ProfileActivated(This,activatedProfile) ) + 
+#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkProfileCallback_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkProfileManager_INTERFACE_DEFINED__ +#define __IDeckLinkProfileManager_INTERFACE_DEFINED__ + +/* interface IDeckLinkProfileManager */ +/* [helpstring][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkProfileManager; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("30D41429-3998-4B6D-84F8-78C94A797C6E") + IDeckLinkProfileManager : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE GetProfiles( + /* [out] */ IDeckLinkProfileIterator **profileIterator) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetProfile( + /* [in] */ BMDProfileID profileID, + /* [out] */ IDeckLinkProfile **profile) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetCallback( + /* [in] */ IDeckLinkProfileCallback *callback) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkProfileManagerVtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkProfileManager * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkProfileManager * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkProfileManager * This); + + HRESULT ( STDMETHODCALLTYPE *GetProfiles )( + IDeckLinkProfileManager * This, + /* [out] */ IDeckLinkProfileIterator **profileIterator); + + HRESULT ( STDMETHODCALLTYPE *GetProfile )( + IDeckLinkProfileManager * This, + /* [in] */ BMDProfileID profileID, + /* [out] */ IDeckLinkProfile **profile); + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( + IDeckLinkProfileManager * This, + /* [in] */ IDeckLinkProfileCallback *callback); + + END_INTERFACE + } IDeckLinkProfileManagerVtbl; + + interface IDeckLinkProfileManager + { + CONST_VTBL struct IDeckLinkProfileManagerVtbl *lpVtbl; + }; -#define IDeckLinkAttributes_AddRef(This) \ - ( (This)->lpVtbl -> 
AddRef(This) ) -#define IDeckLinkAttributes_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) +#ifdef COBJMACROS + + +#define IDeckLinkProfileManager_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) -#define IDeckLinkAttributes_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) +#define IDeckLinkProfileManager_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) -#define IDeckLinkAttributes_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) +#define IDeckLinkProfileManager_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) -#define IDeckLinkAttributes_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) -#define IDeckLinkAttributes_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) +#define IDeckLinkProfileManager_GetProfiles(This,profileIterator) \ + ( (This)->lpVtbl -> GetProfiles(This,profileIterator) ) + +#define IDeckLinkProfileManager_GetProfile(This,profileID,profile) \ + ( (This)->lpVtbl -> GetProfile(This,profileID,profile) ) + +#define IDeckLinkProfileManager_SetCallback(This,callback) \ + ( (This)->lpVtbl -> SetCallback(This,callback) ) #endif /* COBJMACROS */ @@ -8697,92 +9340,92 @@ -#endif /* __IDeckLinkAttributes_INTERFACE_DEFINED__ */ +#endif /* __IDeckLinkProfileManager_INTERFACE_DEFINED__ */ #ifndef __IDeckLinkStatus_INTERFACE_DEFINED__ #define __IDeckLinkStatus_INTERFACE_DEFINED__ /* interface IDeckLinkStatus */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkStatus; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("5F558200-4028-49BC-BEAC-DB3FA4A96E46") IDeckLinkStatus : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetFlag( + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + 
+ virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ BSTR *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ void *buffer, /* [out][in] */ unsigned int *bufferSize) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkStatusVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkStatus * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkStatus * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkStatus * This); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkStatus * This, /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkStatus * This, /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkStatus * This, /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkStatus * This, /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *GetBytes 
)( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkStatus * This, /* [in] */ BMDDeckLinkStatusID statusID, /* [out] */ void *buffer, /* [out][in] */ unsigned int *bufferSize); - + END_INTERFACE } IDeckLinkStatusVtbl; @@ -8791,35 +9434,35 @@ CONST_VTBL struct IDeckLinkStatusVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkStatus_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkStatus_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkStatus_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkStatus_GetFlag(This,statusID,value) \ - ( (This)->lpVtbl -> GetFlag(This,statusID,value) ) + ( (This)->lpVtbl -> GetFlag(This,statusID,value) ) #define IDeckLinkStatus_GetInt(This,statusID,value) \ - ( (This)->lpVtbl -> GetInt(This,statusID,value) ) + ( (This)->lpVtbl -> GetInt(This,statusID,value) ) #define IDeckLinkStatus_GetFloat(This,statusID,value) \ - ( (This)->lpVtbl -> GetFloat(This,statusID,value) ) + ( (This)->lpVtbl -> GetFloat(This,statusID,value) ) #define IDeckLinkStatus_GetString(This,statusID,value) \ - ( (This)->lpVtbl -> GetString(This,statusID,value) ) + ( (This)->lpVtbl -> GetString(This,statusID,value) ) #define IDeckLinkStatus_GetBytes(This,statusID,buffer,bufferSize) \ - ( (This)->lpVtbl -> GetBytes(This,statusID,buffer,bufferSize) ) + ( (This)->lpVtbl -> GetBytes(This,statusID,buffer,bufferSize) ) #endif /* COBJMACROS */ @@ -8836,71 +9479,71 @@ #define __IDeckLinkKeyer_INTERFACE_DEFINED__ /* interface IDeckLinkKeyer */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkKeyer; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("89AFCAF5-65F8-421E-98F7-96FE5F5BFBA3") IDeckLinkKeyer : public IUnknown { public: - virtual HRESULT 
STDMETHODCALLTYPE Enable( + virtual HRESULT STDMETHODCALLTYPE Enable( /* [in] */ BOOL isExternal) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetLevel( + + virtual HRESULT STDMETHODCALLTYPE SetLevel( /* [in] */ unsigned char level) = 0; - - virtual HRESULT STDMETHODCALLTYPE RampUp( + + virtual HRESULT STDMETHODCALLTYPE RampUp( /* [in] */ unsigned int numberOfFrames) = 0; - - virtual HRESULT STDMETHODCALLTYPE RampDown( + + virtual HRESULT STDMETHODCALLTYPE RampDown( /* [in] */ unsigned int numberOfFrames) = 0; - + virtual HRESULT STDMETHODCALLTYPE Disable( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkKeyerVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkKeyer * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkKeyer * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkKeyer * This); - - HRESULT ( STDMETHODCALLTYPE *Enable )( + + HRESULT ( STDMETHODCALLTYPE *Enable )( IDeckLinkKeyer * This, /* [in] */ BOOL isExternal); - - HRESULT ( STDMETHODCALLTYPE *SetLevel )( + + HRESULT ( STDMETHODCALLTYPE *SetLevel )( IDeckLinkKeyer * This, /* [in] */ unsigned char level); - - HRESULT ( STDMETHODCALLTYPE *RampUp )( + + HRESULT ( STDMETHODCALLTYPE *RampUp )( IDeckLinkKeyer * This, /* [in] */ unsigned int numberOfFrames); - - HRESULT ( STDMETHODCALLTYPE *RampDown )( + + HRESULT ( STDMETHODCALLTYPE *RampDown )( IDeckLinkKeyer * This, /* [in] */ unsigned int numberOfFrames); - - HRESULT ( STDMETHODCALLTYPE *Disable )( + + HRESULT ( STDMETHODCALLTYPE *Disable )( IDeckLinkKeyer * This); - + END_INTERFACE } IDeckLinkKeyerVtbl; @@ -8909,35 +9552,35 @@ CONST_VTBL struct IDeckLinkKeyerVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define 
IDeckLinkKeyer_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkKeyer_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkKeyer_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkKeyer_Enable(This,isExternal) \ - ( (This)->lpVtbl -> Enable(This,isExternal) ) + ( (This)->lpVtbl -> Enable(This,isExternal) ) #define IDeckLinkKeyer_SetLevel(This,level) \ - ( (This)->lpVtbl -> SetLevel(This,level) ) + ( (This)->lpVtbl -> SetLevel(This,level) ) #define IDeckLinkKeyer_RampUp(This,numberOfFrames) \ - ( (This)->lpVtbl -> RampUp(This,numberOfFrames) ) + ( (This)->lpVtbl -> RampUp(This,numberOfFrames) ) #define IDeckLinkKeyer_RampDown(This,numberOfFrames) \ - ( (This)->lpVtbl -> RampDown(This,numberOfFrames) ) + ( (This)->lpVtbl -> RampDown(This,numberOfFrames) ) #define IDeckLinkKeyer_Disable(This) \ - ( (This)->lpVtbl -> Disable(This) ) + ( (This)->lpVtbl -> Disable(This) ) #endif /* COBJMACROS */ @@ -8954,47 +9597,47 @@ #define __IDeckLinkVideoConversion_INTERFACE_DEFINED__ /* interface IDeckLinkVideoConversion */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoConversion; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("3BBCB8A2-DA2C-42D9-B5D8-88083644E99A") IDeckLinkVideoConversion : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE ConvertFrame( + virtual HRESULT STDMETHODCALLTYPE ConvertFrame( /* [in] */ IDeckLinkVideoFrame *srcFrame, /* [in] */ IDeckLinkVideoFrame *dstFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoConversionVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoConversion * This, /* [in] */ 
REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoConversion * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoConversion * This); - - HRESULT ( STDMETHODCALLTYPE *ConvertFrame )( + + HRESULT ( STDMETHODCALLTYPE *ConvertFrame )( IDeckLinkVideoConversion * This, /* [in] */ IDeckLinkVideoFrame *srcFrame, /* [in] */ IDeckLinkVideoFrame *dstFrame); - + END_INTERFACE } IDeckLinkVideoConversionVtbl; @@ -9003,23 +9646,23 @@ CONST_VTBL struct IDeckLinkVideoConversionVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoConversion_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoConversion_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoConversion_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoConversion_ConvertFrame(This,srcFrame,dstFrame) \ - ( (This)->lpVtbl -> ConvertFrame(This,srcFrame,dstFrame) ) + ( (This)->lpVtbl -> ConvertFrame(This,srcFrame,dstFrame) ) #endif /* COBJMACROS */ @@ -9036,52 +9679,52 @@ #define __IDeckLinkDeviceNotificationCallback_INTERFACE_DEFINED__ /* interface IDeckLinkDeviceNotificationCallback */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDeviceNotificationCallback; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("4997053B-0ADF-4CC8-AC70-7A50C4BE728F") IDeckLinkDeviceNotificationCallback : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DeckLinkDeviceArrived( + virtual HRESULT STDMETHODCALLTYPE DeckLinkDeviceArrived( /* [in] */ IDeckLink *deckLinkDevice) = 0; - - virtual HRESULT 
STDMETHODCALLTYPE DeckLinkDeviceRemoved( + + virtual HRESULT STDMETHODCALLTYPE DeckLinkDeviceRemoved( /* [in] */ IDeckLink *deckLinkDevice) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDeviceNotificationCallbackVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDeviceNotificationCallback * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDeviceNotificationCallback * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDeviceNotificationCallback * This); - - HRESULT ( STDMETHODCALLTYPE *DeckLinkDeviceArrived )( + + HRESULT ( STDMETHODCALLTYPE *DeckLinkDeviceArrived )( IDeckLinkDeviceNotificationCallback * This, /* [in] */ IDeckLink *deckLinkDevice); - - HRESULT ( STDMETHODCALLTYPE *DeckLinkDeviceRemoved )( + + HRESULT ( STDMETHODCALLTYPE *DeckLinkDeviceRemoved )( IDeckLinkDeviceNotificationCallback * This, /* [in] */ IDeckLink *deckLinkDevice); - + END_INTERFACE } IDeckLinkDeviceNotificationCallbackVtbl; @@ -9090,26 +9733,26 @@ CONST_VTBL struct IDeckLinkDeviceNotificationCallbackVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDeviceNotificationCallback_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDeviceNotificationCallback_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDeviceNotificationCallback_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDeviceNotificationCallback_DeckLinkDeviceArrived(This,deckLinkDevice) \ - ( (This)->lpVtbl -> DeckLinkDeviceArrived(This,deckLinkDevice) ) + ( 
(This)->lpVtbl -> DeckLinkDeviceArrived(This,deckLinkDevice) ) #define IDeckLinkDeviceNotificationCallback_DeckLinkDeviceRemoved(This,deckLinkDevice) \ - ( (This)->lpVtbl -> DeckLinkDeviceRemoved(This,deckLinkDevice) ) + ( (This)->lpVtbl -> DeckLinkDeviceRemoved(This,deckLinkDevice) ) #endif /* COBJMACROS */ @@ -9126,50 +9769,50 @@ #define __IDeckLinkDiscovery_INTERFACE_DEFINED__ /* interface IDeckLinkDiscovery */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDiscovery; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("CDBF631C-BC76-45FA-B44D-C55059BC6101") IDeckLinkDiscovery : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE InstallDeviceNotifications( + virtual HRESULT STDMETHODCALLTYPE InstallDeviceNotifications( /* [in] */ IDeckLinkDeviceNotificationCallback *deviceNotificationCallback) = 0; - + virtual HRESULT STDMETHODCALLTYPE UninstallDeviceNotifications( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDiscoveryVtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDiscovery * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDiscovery * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDiscovery * This); - - HRESULT ( STDMETHODCALLTYPE *InstallDeviceNotifications )( + + HRESULT ( STDMETHODCALLTYPE *InstallDeviceNotifications )( IDeckLinkDiscovery * This, /* [in] */ IDeckLinkDeviceNotificationCallback *deviceNotificationCallback); - - HRESULT ( STDMETHODCALLTYPE *UninstallDeviceNotifications )( + + HRESULT ( STDMETHODCALLTYPE *UninstallDeviceNotifications )( IDeckLinkDiscovery * This); - + END_INTERFACE } IDeckLinkDiscoveryVtbl; @@ -9178,26 
+9821,26 @@ CONST_VTBL struct IDeckLinkDiscoveryVtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDiscovery_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDiscovery_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDiscovery_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDiscovery_InstallDeviceNotifications(This,deviceNotificationCallback) \ - ( (This)->lpVtbl -> InstallDeviceNotifications(This,deviceNotificationCallback) ) + ( (This)->lpVtbl -> InstallDeviceNotifications(This,deviceNotificationCallback) ) #define IDeckLinkDiscovery_UninstallDeviceNotifications(This) \ - ( (This)->lpVtbl -> UninstallDeviceNotifications(This) ) + ( (This)->lpVtbl -> UninstallDeviceNotifications(This) ) #endif /* COBJMACROS */ @@ -9214,7 +9857,7 @@ #ifdef __cplusplus -class DECLSPEC_UUID("87D2693F-8D4A-45C7-B43F-10ACBA25E68F") +class DECLSPEC_UUID("BA6C6F44-6DA5-4DCE-94AA-EE2D1372A676") CDeckLinkIterator; #endif @@ -9254,7 +9897,7 @@ #ifdef __cplusplus -class DECLSPEC_UUID("652615D4-26CD-4514-B161-2FD5072ED008") +class DECLSPEC_UUID("22FBFC33-8D07-495C-A5BF-DAB5EA9B82DB") CDeckLinkDiscovery; #endif @@ -9266,119 +9909,1401 @@ CDeckLinkVideoFrameAncillaryPackets; #endif +#ifndef __IDeckLinkConfiguration_v10_11_INTERFACE_DEFINED__ +#define __IDeckLinkConfiguration_v10_11_INTERFACE_DEFINED__ + +/* interface IDeckLinkConfiguration_v10_11 */ +/* [helpstring][local][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkConfiguration_v10_11; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("EF90380B-4AE5-4346-9077-E288E149F129") + IDeckLinkConfiguration_v10_11 : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE SetFlag( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ BOOL value) = 0; + + virtual 
HRESULT STDMETHODCALLTYPE GetFlag( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ BOOL *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetInt( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ LONGLONG value) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetInt( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ LONGLONG *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetFloat( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ double value) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetFloat( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ double *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetString( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ BSTR value) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetString( + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ BSTR *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE WriteConfigurationToPreferences( void) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkConfiguration_v10_11Vtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkConfiguration_v10_11 * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkConfiguration_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ BOOL value); + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ BOOL *value); + + HRESULT ( STDMETHODCALLTYPE *SetInt )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ LONGLONG value); + + HRESULT ( STDMETHODCALLTYPE *GetInt )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] 
*/ BMDDeckLinkConfigurationID cfgID, + /* [out] */ LONGLONG *value); + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ double value); + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ double *value); + + HRESULT ( STDMETHODCALLTYPE *SetString )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [in] */ BSTR value); + + HRESULT ( STDMETHODCALLTYPE *GetString )( + IDeckLinkConfiguration_v10_11 * This, + /* [in] */ BMDDeckLinkConfigurationID cfgID, + /* [out] */ BSTR *value); + + HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( + IDeckLinkConfiguration_v10_11 * This); + + END_INTERFACE + } IDeckLinkConfiguration_v10_11Vtbl; + + interface IDeckLinkConfiguration_v10_11 + { + CONST_VTBL struct IDeckLinkConfiguration_v10_11Vtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkConfiguration_v10_11_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkConfiguration_v10_11_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkConfiguration_v10_11_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkConfiguration_v10_11_SetFlag(This,cfgID,value) \ + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_GetFlag(This,cfgID,value) \ + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_SetInt(This,cfgID,value) \ + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_GetInt(This,cfgID,value) \ + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_SetFloat(This,cfgID,value) \ + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + +#define 
IDeckLinkConfiguration_v10_11_GetFloat(This,cfgID,value) \ + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_SetString(This,cfgID,value) \ + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_GetString(This,cfgID,value) \ + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + +#define IDeckLinkConfiguration_v10_11_WriteConfigurationToPreferences(This) \ + ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkConfiguration_v10_11_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkAttributes_v10_11_INTERFACE_DEFINED__ +#define __IDeckLinkAttributes_v10_11_INTERFACE_DEFINED__ + +/* interface IDeckLinkAttributes_v10_11 */ +/* [helpstring][local][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkAttributes_v10_11; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("ABC11843-D966-44CB-96E2-A1CB5D3135C4") + IDeckLinkAttributes_v10_11 : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE GetFlag( + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ BOOL *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetInt( + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ LONGLONG *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetFloat( + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ double *value) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetString( + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ BSTR *value) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkAttributes_v10_11Vtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkAttributes_v10_11 * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkAttributes_v10_11 * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + 
IDeckLinkAttributes_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( + IDeckLinkAttributes_v10_11 * This, + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ BOOL *value); + + HRESULT ( STDMETHODCALLTYPE *GetInt )( + IDeckLinkAttributes_v10_11 * This, + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ LONGLONG *value); + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( + IDeckLinkAttributes_v10_11 * This, + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ double *value); + + HRESULT ( STDMETHODCALLTYPE *GetString )( + IDeckLinkAttributes_v10_11 * This, + /* [in] */ BMDDeckLinkAttributeID cfgID, + /* [out] */ BSTR *value); + + END_INTERFACE + } IDeckLinkAttributes_v10_11Vtbl; + + interface IDeckLinkAttributes_v10_11 + { + CONST_VTBL struct IDeckLinkAttributes_v10_11Vtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkAttributes_v10_11_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkAttributes_v10_11_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkAttributes_v10_11_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkAttributes_v10_11_GetFlag(This,cfgID,value) \ + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + +#define IDeckLinkAttributes_v10_11_GetInt(This,cfgID,value) \ + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + +#define IDeckLinkAttributes_v10_11_GetFloat(This,cfgID,value) \ + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + +#define IDeckLinkAttributes_v10_11_GetString(This,cfgID,value) \ + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkAttributes_v10_11_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkNotification_v10_11_INTERFACE_DEFINED__ +#define __IDeckLinkNotification_v10_11_INTERFACE_DEFINED__ + +/* interface IDeckLinkNotification_v10_11 */ +/* [helpstring][local][uuid][object] */ + + +EXTERN_C const 
IID IID_IDeckLinkNotification_v10_11; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("0A1FB207-E215-441B-9B19-6FA1575946C5") + IDeckLinkNotification_v10_11 : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE Subscribe( + /* [in] */ BMDNotifications topic, + /* [in] */ IDeckLinkNotificationCallback *theCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE Unsubscribe( + /* [in] */ BMDNotifications topic, + /* [in] */ IDeckLinkNotificationCallback *theCallback) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkNotification_v10_11Vtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkNotification_v10_11 * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkNotification_v10_11 * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkNotification_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *Subscribe )( + IDeckLinkNotification_v10_11 * This, + /* [in] */ BMDNotifications topic, + /* [in] */ IDeckLinkNotificationCallback *theCallback); + + HRESULT ( STDMETHODCALLTYPE *Unsubscribe )( + IDeckLinkNotification_v10_11 * This, + /* [in] */ BMDNotifications topic, + /* [in] */ IDeckLinkNotificationCallback *theCallback); + + END_INTERFACE + } IDeckLinkNotification_v10_11Vtbl; + + interface IDeckLinkNotification_v10_11 + { + CONST_VTBL struct IDeckLinkNotification_v10_11Vtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkNotification_v10_11_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkNotification_v10_11_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkNotification_v10_11_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkNotification_v10_11_Subscribe(This,topic,theCallback) \ + ( (This)->lpVtbl -> Subscribe(This,topic,theCallback) ) + 
+#define IDeckLinkNotification_v10_11_Unsubscribe(This,topic,theCallback) \ + ( (This)->lpVtbl -> Unsubscribe(This,topic,theCallback) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkNotification_v10_11_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkOutput_v10_11_INTERFACE_DEFINED__ +#define __IDeckLinkOutput_v10_11_INTERFACE_DEFINED__ + +/* interface IDeckLinkOutput_v10_11 */ +/* [helpstring][local][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkOutput_v10_11; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("CC5C8A6E-3F2F-4B3A-87EA-FD78AF300564") + IDeckLinkOutput_v10_11 : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoOutputFlags flags, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, + /* [out] */ IDeckLinkDisplayMode **resultDisplayMode) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDVideoOutputFlags flags) = 0; + + virtual HRESULT STDMETHODCALLTYPE DisableVideoOutput( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( + /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( + /* [in] */ int width, + /* [in] */ int height, + /* [in] */ int rowBytes, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDFrameFlags flags, + /* [out] */ IDeckLinkMutableVideoFrame **outFrame) = 0; + + virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( + /* [in] */ BMDPixelFormat pixelFormat, + /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer) = 
0; + + virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( + /* [in] */ IDeckLinkVideoFrame *theFrame) = 0; + + virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( + /* [in] */ IDeckLinkVideoFrame *theFrame, + /* [in] */ BMDTimeValue displayTime, + /* [in] */ BMDTimeValue displayDuration, + /* [in] */ BMDTimeScale timeScale) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( + /* [in] */ IDeckLinkVideoOutputCallback *theCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( + /* [out] */ unsigned int *bufferedFrameCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( + /* [in] */ BMDAudioSampleRate sampleRate, + /* [in] */ BMDAudioSampleType sampleType, + /* [in] */ unsigned int channelCount, + /* [in] */ BMDAudioOutputStreamType streamType) = 0; + + virtual HRESULT STDMETHODCALLTYPE DisableAudioOutput( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( + /* [in] */ void *buffer, + /* [in] */ unsigned int sampleFrameCount, + /* [out] */ unsigned int *sampleFramesWritten) = 0; + + virtual HRESULT STDMETHODCALLTYPE BeginAudioPreroll( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE EndAudioPreroll( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( + /* [in] */ void *buffer, + /* [in] */ unsigned int sampleFrameCount, + /* [in] */ BMDTimeValue streamTime, + /* [in] */ BMDTimeScale timeScale, + /* [out] */ unsigned int *sampleFramesWritten) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( + /* [out] */ unsigned int *bufferedSampleFrameCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE FlushBufferedAudioSamples( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( + /* [in] */ IDeckLinkAudioOutputCallback *theCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( + /* [in] */ BMDTimeValue playbackStartTime, + /* [in] */ BMDTimeScale timeScale, + /* [in] */ double playbackSpeed) = 0; + + 
virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( + /* [in] */ BMDTimeValue stopPlaybackAtTime, + /* [out] */ BMDTimeValue *actualStopTime, + /* [in] */ BMDTimeScale timeScale) = 0; + + virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( + /* [out] */ BOOL *active) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *streamTime, + /* [out] */ double *playbackSpeed) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetReferenceStatus( + /* [out] */ BMDReferenceStatus *referenceStatus) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *hardwareTime, + /* [out] */ BMDTimeValue *timeInFrame, + /* [out] */ BMDTimeValue *ticksPerFrame) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetFrameCompletionReferenceTimestamp( + /* [in] */ IDeckLinkVideoFrame *theFrame, + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *frameCompletionTimestamp) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkOutput_v10_11Vtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkOutput_v10_11 * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkOutput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoOutputFlags flags, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, + /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + IDeckLinkOutput_v10_11 * This, + /* [out] */ IDeckLinkDisplayModeIterator **iterator); + + HRESULT ( STDMETHODCALLTYPE 
*SetScreenPreviewCallback )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); + + HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDVideoOutputFlags flags); + + HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( + IDeckLinkOutput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkMemoryAllocator *theAllocator); + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ int width, + /* [in] */ int height, + /* [in] */ int rowBytes, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDFrameFlags flags, + /* [out] */ IDeckLinkMutableVideoFrame **outFrame); + + HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDPixelFormat pixelFormat, + /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer); + + HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkVideoFrame *theFrame); + + HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkVideoFrame *theFrame, + /* [in] */ BMDTimeValue displayTime, + /* [in] */ BMDTimeValue displayDuration, + /* [in] */ BMDTimeScale timeScale); + + HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkVideoOutputCallback *theCallback); + + HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( + IDeckLinkOutput_v10_11 * This, + /* [out] */ unsigned int *bufferedFrameCount); + + HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDAudioSampleRate sampleRate, + /* [in] */ BMDAudioSampleType sampleType, + /* [in] */ unsigned int channelCount, + /* [in] */ BMDAudioOutputStreamType streamType); + + HRESULT ( 
STDMETHODCALLTYPE *DisableAudioOutput )( + IDeckLinkOutput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ void *buffer, + /* [in] */ unsigned int sampleFrameCount, + /* [out] */ unsigned int *sampleFramesWritten); + + HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( + IDeckLinkOutput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( + IDeckLinkOutput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ void *buffer, + /* [in] */ unsigned int sampleFrameCount, + /* [in] */ BMDTimeValue streamTime, + /* [in] */ BMDTimeScale timeScale, + /* [out] */ unsigned int *sampleFramesWritten); + + HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + IDeckLinkOutput_v10_11 * This, + /* [out] */ unsigned int *bufferedSampleFrameCount); + + HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( + IDeckLinkOutput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkAudioOutputCallback *theCallback); + + HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDTimeValue playbackStartTime, + /* [in] */ BMDTimeScale timeScale, + /* [in] */ double playbackSpeed); + + HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDTimeValue stopPlaybackAtTime, + /* [out] */ BMDTimeValue *actualStopTime, + /* [in] */ BMDTimeScale timeScale); + + HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( + IDeckLinkOutput_v10_11 * This, + /* [out] */ BOOL *active); + + HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *streamTime, + /* [out] */ double *playbackSpeed); + + HRESULT ( STDMETHODCALLTYPE *GetReferenceStatus )( + IDeckLinkOutput_v10_11 * This, 
+ /* [out] */ BMDReferenceStatus *referenceStatus); + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *hardwareTime, + /* [out] */ BMDTimeValue *timeInFrame, + /* [out] */ BMDTimeValue *ticksPerFrame); + + HRESULT ( STDMETHODCALLTYPE *GetFrameCompletionReferenceTimestamp )( + IDeckLinkOutput_v10_11 * This, + /* [in] */ IDeckLinkVideoFrame *theFrame, + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *frameCompletionTimestamp); + + END_INTERFACE + } IDeckLinkOutput_v10_11Vtbl; + + interface IDeckLinkOutput_v10_11 + { + CONST_VTBL struct IDeckLinkOutput_v10_11Vtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkOutput_v10_11_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkOutput_v10_11_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkOutput_v10_11_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkOutput_v10_11_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) + +#define IDeckLinkOutput_v10_11_GetDisplayModeIterator(This,iterator) \ + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + +#define IDeckLinkOutput_v10_11_SetScreenPreviewCallback(This,previewCallback) \ + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + +#define IDeckLinkOutput_v10_11_EnableVideoOutput(This,displayMode,flags) \ + ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) + +#define IDeckLinkOutput_v10_11_DisableVideoOutput(This) \ + ( (This)->lpVtbl -> DisableVideoOutput(This) ) + +#define IDeckLinkOutput_v10_11_SetVideoOutputFrameMemoryAllocator(This,theAllocator) \ + ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) + 
+#define IDeckLinkOutput_v10_11_CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) \ + ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) + +#define IDeckLinkOutput_v10_11_CreateAncillaryData(This,pixelFormat,outBuffer) \ + ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) + +#define IDeckLinkOutput_v10_11_DisplayVideoFrameSync(This,theFrame) \ + ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) + +#define IDeckLinkOutput_v10_11_ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) \ + ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) + +#define IDeckLinkOutput_v10_11_SetScheduledFrameCompletionCallback(This,theCallback) \ + ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) + +#define IDeckLinkOutput_v10_11_GetBufferedVideoFrameCount(This,bufferedFrameCount) \ + ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) + +#define IDeckLinkOutput_v10_11_EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) \ + ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) + +#define IDeckLinkOutput_v10_11_DisableAudioOutput(This) \ + ( (This)->lpVtbl -> DisableAudioOutput(This) ) + +#define IDeckLinkOutput_v10_11_WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) \ + ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) + +#define IDeckLinkOutput_v10_11_BeginAudioPreroll(This) \ + ( (This)->lpVtbl -> BeginAudioPreroll(This) ) + +#define IDeckLinkOutput_v10_11_EndAudioPreroll(This) \ + ( (This)->lpVtbl -> EndAudioPreroll(This) ) + +#define IDeckLinkOutput_v10_11_ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) \ + ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) + 
+#define IDeckLinkOutput_v10_11_GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) \ + ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) + +#define IDeckLinkOutput_v10_11_FlushBufferedAudioSamples(This) \ + ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) + +#define IDeckLinkOutput_v10_11_SetAudioCallback(This,theCallback) \ + ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) + +#define IDeckLinkOutput_v10_11_StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) \ + ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) + +#define IDeckLinkOutput_v10_11_StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) \ + ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) + +#define IDeckLinkOutput_v10_11_IsScheduledPlaybackRunning(This,active) \ + ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) + +#define IDeckLinkOutput_v10_11_GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) \ + ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) ) + +#define IDeckLinkOutput_v10_11_GetReferenceStatus(This,referenceStatus) \ + ( (This)->lpVtbl -> GetReferenceStatus(This,referenceStatus) ) + +#define IDeckLinkOutput_v10_11_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + +#define IDeckLinkOutput_v10_11_GetFrameCompletionReferenceTimestamp(This,theFrame,desiredTimeScale,frameCompletionTimestamp) \ + ( (This)->lpVtbl -> GetFrameCompletionReferenceTimestamp(This,theFrame,desiredTimeScale,frameCompletionTimestamp) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkOutput_v10_11_INTERFACE_DEFINED__ */ + + +#ifndef 
__IDeckLinkInput_v10_11_INTERFACE_DEFINED__ +#define __IDeckLinkInput_v10_11_INTERFACE_DEFINED__ + +/* interface IDeckLinkInput_v10_11 */ +/* [helpstring][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkInput_v10_11; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("AF22762B-DFAC-4846-AA79-FA8883560995") + IDeckLinkInput_v10_11 : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, + /* [out] */ IDeckLinkDisplayMode **resultDisplayMode) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags) = 0; + + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( + /* [out] */ unsigned int *availableFrameCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetVideoInputFrameMemoryAllocator( + /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + /* [in] */ BMDAudioSampleRate sampleRate, + /* [in] */ BMDAudioSampleType sampleType, + /* [in] */ unsigned int channelCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + /* [out] */ unsigned int *availableSampleFrameCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE 
PauseStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetCallback( + /* [in] */ IDeckLinkInputCallback *theCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *hardwareTime, + /* [out] */ BMDTimeValue *timeInFrame, + /* [out] */ BMDTimeValue *ticksPerFrame) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkInput_v10_11Vtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkInput_v10_11 * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkInput_v10_11 * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + IDeckLinkInput_v10_11 * This, + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, + /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + IDeckLinkInput_v10_11 * This, + /* [out] */ IDeckLinkDisplayModeIterator **iterator); + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + IDeckLinkInput_v10_11 * This, + /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + IDeckLinkInput_v10_11 * This, + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags); + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( + IDeckLinkInput_v10_11 * This, + /* [out] */ unsigned int *availableFrameCount); + + HRESULT ( STDMETHODCALLTYPE *SetVideoInputFrameMemoryAllocator 
)( + IDeckLinkInput_v10_11 * This, + /* [in] */ IDeckLinkMemoryAllocator *theAllocator); + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + IDeckLinkInput_v10_11 * This, + /* [in] */ BMDAudioSampleRate sampleRate, + /* [in] */ BMDAudioSampleType sampleType, + /* [in] */ unsigned int channelCount); + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + IDeckLinkInput_v10_11 * This, + /* [out] */ unsigned int *availableSampleFrameCount); + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + IDeckLinkInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( + IDeckLinkInput_v10_11 * This, + /* [in] */ IDeckLinkInputCallback *theCallback); + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + IDeckLinkInput_v10_11 * This, + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *hardwareTime, + /* [out] */ BMDTimeValue *timeInFrame, + /* [out] */ BMDTimeValue *ticksPerFrame); + + END_INTERFACE + } IDeckLinkInput_v10_11Vtbl; + + interface IDeckLinkInput_v10_11 + { + CONST_VTBL struct IDeckLinkInput_v10_11Vtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkInput_v10_11_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkInput_v10_11_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkInput_v10_11_Release(This) \ + ( (This)->lpVtbl -> Release(This) ) + + +#define IDeckLinkInput_v10_11_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) + +#define 
IDeckLinkInput_v10_11_GetDisplayModeIterator(This,iterator) \ + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + +#define IDeckLinkInput_v10_11_SetScreenPreviewCallback(This,previewCallback) \ + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + +#define IDeckLinkInput_v10_11_EnableVideoInput(This,displayMode,pixelFormat,flags) \ + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + +#define IDeckLinkInput_v10_11_DisableVideoInput(This) \ + ( (This)->lpVtbl -> DisableVideoInput(This) ) + +#define IDeckLinkInput_v10_11_GetAvailableVideoFrameCount(This,availableFrameCount) \ + ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) + +#define IDeckLinkInput_v10_11_SetVideoInputFrameMemoryAllocator(This,theAllocator) \ + ( (This)->lpVtbl -> SetVideoInputFrameMemoryAllocator(This,theAllocator) ) + +#define IDeckLinkInput_v10_11_EnableAudioInput(This,sampleRate,sampleType,channelCount) \ + ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) + +#define IDeckLinkInput_v10_11_DisableAudioInput(This) \ + ( (This)->lpVtbl -> DisableAudioInput(This) ) + +#define IDeckLinkInput_v10_11_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ + ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + +#define IDeckLinkInput_v10_11_StartStreams(This) \ + ( (This)->lpVtbl -> StartStreams(This) ) + +#define IDeckLinkInput_v10_11_StopStreams(This) \ + ( (This)->lpVtbl -> StopStreams(This) ) + +#define IDeckLinkInput_v10_11_PauseStreams(This) \ + ( (This)->lpVtbl -> PauseStreams(This) ) + +#define IDeckLinkInput_v10_11_FlushStreams(This) \ + ( (This)->lpVtbl -> FlushStreams(This) ) + +#define IDeckLinkInput_v10_11_SetCallback(This,theCallback) \ + ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + +#define IDeckLinkInput_v10_11_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ + ( 
(This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkInput_v10_11_INTERFACE_DEFINED__ */ + + +#ifndef __IDeckLinkEncoderInput_v10_11_INTERFACE_DEFINED__ +#define __IDeckLinkEncoderInput_v10_11_INTERFACE_DEFINED__ + +/* interface IDeckLinkEncoderInput_v10_11 */ +/* [helpstring][uuid][object] */ + + +EXTERN_C const IID IID_IDeckLinkEncoderInput_v10_11; + +#if defined(__cplusplus) && !defined(CINTERFACE) + + MIDL_INTERFACE("270587DA-6B7D-42E7-A1F0-6D853F581185") + IDeckLinkEncoderInput_v10_11 : public IUnknown + { + public: + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, + /* [out] */ IDeckLinkDisplayMode **resultDisplayMode) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags) = 0; + + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetAvailablePacketsCount( + /* [out] */ unsigned int *availablePacketsCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetMemoryAllocator( + /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + /* [in] */ BMDAudioFormat audioFormat, + /* [in] */ BMDAudioSampleRate sampleRate, + /* [in] */ BMDAudioSampleType sampleType, + /* [in] */ unsigned int channelCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + /* [out] */ unsigned int 
*availableSampleFrameCount) = 0; + + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE PauseStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; + + virtual HRESULT STDMETHODCALLTYPE SetCallback( + /* [in] */ IDeckLinkEncoderInputCallback *theCallback) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *hardwareTime, + /* [out] */ BMDTimeValue *timeInFrame, + /* [out] */ BMDTimeValue *ticksPerFrame) = 0; + + }; + + +#else /* C style interface */ + + typedef struct IDeckLinkEncoderInput_v10_11Vtbl + { + BEGIN_INTERFACE + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ REFIID riid, + /* [annotation][iid_is][out] */ + _COM_Outptr_ void **ppvObject); + + ULONG ( STDMETHODCALLTYPE *AddRef )( + IDeckLinkEncoderInput_v10_11 * This); + + ULONG ( STDMETHODCALLTYPE *Release )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, + /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + IDeckLinkEncoderInput_v10_11 * This, + /* [out] */ IDeckLinkDisplayModeIterator **iterator); + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ BMDDisplayMode displayMode, + /* [in] */ BMDPixelFormat pixelFormat, + /* [in] */ BMDVideoInputFlags flags); + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *GetAvailablePacketsCount )( + IDeckLinkEncoderInput_v10_11 * This, + /* [out] 
*/ unsigned int *availablePacketsCount); + + HRESULT ( STDMETHODCALLTYPE *SetMemoryAllocator )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ IDeckLinkMemoryAllocator *theAllocator); + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ BMDAudioFormat audioFormat, + /* [in] */ BMDAudioSampleRate sampleRate, + /* [in] */ BMDAudioSampleType sampleType, + /* [in] */ unsigned int channelCount); + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + IDeckLinkEncoderInput_v10_11 * This, + /* [out] */ unsigned int *availableSampleFrameCount); + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + IDeckLinkEncoderInput_v10_11 * This); + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ IDeckLinkEncoderInputCallback *theCallback); + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + IDeckLinkEncoderInput_v10_11 * This, + /* [in] */ BMDTimeScale desiredTimeScale, + /* [out] */ BMDTimeValue *hardwareTime, + /* [out] */ BMDTimeValue *timeInFrame, + /* [out] */ BMDTimeValue *ticksPerFrame); + + END_INTERFACE + } IDeckLinkEncoderInput_v10_11Vtbl; + + interface IDeckLinkEncoderInput_v10_11 + { + CONST_VTBL struct IDeckLinkEncoderInput_v10_11Vtbl *lpVtbl; + }; + + + +#ifdef COBJMACROS + + +#define IDeckLinkEncoderInput_v10_11_QueryInterface(This,riid,ppvObject) \ + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + +#define IDeckLinkEncoderInput_v10_11_AddRef(This) \ + ( (This)->lpVtbl -> AddRef(This) ) + +#define IDeckLinkEncoderInput_v10_11_Release(This) \ + ( (This)->lpVtbl -> 
Release(This) ) + + +#define IDeckLinkEncoderInput_v10_11_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) + +#define IDeckLinkEncoderInput_v10_11_GetDisplayModeIterator(This,iterator) \ + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + +#define IDeckLinkEncoderInput_v10_11_EnableVideoInput(This,displayMode,pixelFormat,flags) \ + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + +#define IDeckLinkEncoderInput_v10_11_DisableVideoInput(This) \ + ( (This)->lpVtbl -> DisableVideoInput(This) ) + +#define IDeckLinkEncoderInput_v10_11_GetAvailablePacketsCount(This,availablePacketsCount) \ + ( (This)->lpVtbl -> GetAvailablePacketsCount(This,availablePacketsCount) ) + +#define IDeckLinkEncoderInput_v10_11_SetMemoryAllocator(This,theAllocator) \ + ( (This)->lpVtbl -> SetMemoryAllocator(This,theAllocator) ) + +#define IDeckLinkEncoderInput_v10_11_EnableAudioInput(This,audioFormat,sampleRate,sampleType,channelCount) \ + ( (This)->lpVtbl -> EnableAudioInput(This,audioFormat,sampleRate,sampleType,channelCount) ) + +#define IDeckLinkEncoderInput_v10_11_DisableAudioInput(This) \ + ( (This)->lpVtbl -> DisableAudioInput(This) ) + +#define IDeckLinkEncoderInput_v10_11_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ + ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + +#define IDeckLinkEncoderInput_v10_11_StartStreams(This) \ + ( (This)->lpVtbl -> StartStreams(This) ) + +#define IDeckLinkEncoderInput_v10_11_StopStreams(This) \ + ( (This)->lpVtbl -> StopStreams(This) ) + +#define IDeckLinkEncoderInput_v10_11_PauseStreams(This) \ + ( (This)->lpVtbl -> PauseStreams(This) ) + +#define IDeckLinkEncoderInput_v10_11_FlushStreams(This) \ + ( (This)->lpVtbl -> FlushStreams(This) ) + +#define IDeckLinkEncoderInput_v10_11_SetCallback(This,theCallback) \ + ( 
(This)->lpVtbl -> SetCallback(This,theCallback) ) + +#define IDeckLinkEncoderInput_v10_11_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + +#endif /* COBJMACROS */ + + +#endif /* C style interface */ + + + + +#endif /* __IDeckLinkEncoderInput_v10_11_INTERFACE_DEFINED__ */ + + +EXTERN_C const CLSID CLSID_CDeckLinkIterator_v10_11; + +#ifdef __cplusplus + +class DECLSPEC_UUID("87D2693F-8D4A-45C7-B43F-10ACBA25E68F") +CDeckLinkIterator_v10_11; +#endif + +EXTERN_C const CLSID CLSID_CDeckLinkDiscovery_v10_11; + +#ifdef __cplusplus + +class DECLSPEC_UUID("652615D4-26CD-4514-B161-2FD5072ED008") +CDeckLinkDiscovery_v10_11; +#endif + #ifndef __IDeckLinkConfiguration_v10_9_INTERFACE_DEFINED__ #define __IDeckLinkConfiguration_v10_9_INTERFACE_DEFINED__ /* interface IDeckLinkConfiguration_v10_9 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkConfiguration_v10_9; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("CB71734A-FE37-4E8D-8E13-802133A1C3F2") IDeckLinkConfiguration_v10_9 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetFlag( + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT 
STDMETHODCALLTYPE SetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value) = 0; - + virtual HRESULT STDMETHODCALLTYPE WriteConfigurationToPreferences( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkConfiguration_v10_9Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkConfiguration_v10_9 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkConfiguration_v10_9 * This); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( 
IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkConfiguration_v10_9 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( + + HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( IDeckLinkConfiguration_v10_9 * This); - + END_INTERFACE } IDeckLinkConfiguration_v10_9Vtbl; @@ -9387,47 +11312,47 @@ CONST_VTBL struct IDeckLinkConfiguration_v10_9Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkConfiguration_v10_9_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkConfiguration_v10_9_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkConfiguration_v10_9_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkConfiguration_v10_9_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> 
GetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_9_WriteConfigurationToPreferences(This) \ - ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) + ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) #endif /* COBJMACROS */ @@ -9468,121 +11393,121 @@ #define __IDeckLinkEncoderConfiguration_v10_5_INTERFACE_DEFINED__ /* interface IDeckLinkEncoderConfiguration_v10_5 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkEncoderConfiguration_v10_5; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("67455668-0848-45DF-8D8E-350A77C9A028") IDeckLinkEncoderConfiguration_v10_5 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetFlag( + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BOOL value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ 
BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT STDMETHODCALLTYPE SetFloat( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BSTR value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BSTR *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDecoderConfigurationInfo( + + virtual HRESULT STDMETHODCALLTYPE GetDecoderConfigurationInfo( /* [out] */ void *buffer, /* [in] */ long bufferSize, /* [out] */ long *returnedSize) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkEncoderConfiguration_v10_5Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkEncoderConfiguration_v10_5 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( 
IDeckLinkEncoderConfiguration_v10_5 * This); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [in] */ BSTR value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [in] */ BMDDeckLinkEncoderConfigurationID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *GetDecoderConfigurationInfo )( + + HRESULT ( STDMETHODCALLTYPE *GetDecoderConfigurationInfo )( IDeckLinkEncoderConfiguration_v10_5 * This, /* [out] */ void *buffer, /* [in] */ long bufferSize, /* [out] */ long *returnedSize); - + 
END_INTERFACE } IDeckLinkEncoderConfiguration_v10_5Vtbl; @@ -9591,47 +11516,47 @@ CONST_VTBL struct IDeckLinkEncoderConfiguration_v10_5Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkEncoderConfiguration_v10_5_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkEncoderConfiguration_v10_5_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkEncoderConfiguration_v10_5_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkEncoderConfiguration_v10_5_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl 
-> GetString(This,cfgID,value) ) #define IDeckLinkEncoderConfiguration_v10_5_GetDecoderConfigurationInfo(This,buffer,bufferSize,returnedSize) \ - ( (This)->lpVtbl -> GetDecoderConfigurationInfo(This,buffer,bufferSize,returnedSize) ) + ( (This)->lpVtbl -> GetDecoderConfigurationInfo(This,buffer,bufferSize,returnedSize) ) #endif /* COBJMACROS */ @@ -9648,115 +11573,115 @@ #define __IDeckLinkConfiguration_v10_4_INTERFACE_DEFINED__ /* interface IDeckLinkConfiguration_v10_4 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkConfiguration_v10_4; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("1E69FCF6-4203-4936-8076-2A9F4CFD50CB") IDeckLinkConfiguration_v10_4 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetFlag( + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT STDMETHODCALLTYPE SetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value) = 0; - - virtual HRESULT 
STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value) = 0; - + virtual HRESULT STDMETHODCALLTYPE WriteConfigurationToPreferences( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkConfiguration_v10_4Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkConfiguration_v10_4 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkConfiguration_v10_4 * This); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID 
cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkConfiguration_v10_4 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( + + HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( IDeckLinkConfiguration_v10_4 * This); - + END_INTERFACE } IDeckLinkConfiguration_v10_4Vtbl; @@ -9765,47 +11690,47 @@ CONST_VTBL struct IDeckLinkConfiguration_v10_4Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkConfiguration_v10_4_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkConfiguration_v10_4_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkConfiguration_v10_4_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkConfiguration_v10_4_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( 
(This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_4_WriteConfigurationToPreferences(This) \ - ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) + ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) #endif /* COBJMACROS */ @@ -9822,115 +11747,115 @@ #define __IDeckLinkConfiguration_v10_2_INTERFACE_DEFINED__ /* interface IDeckLinkConfiguration_v10_2 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkConfiguration_v10_2; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("C679A35B-610C-4D09-B748-1D0478100FC0") IDeckLinkConfiguration_v10_2 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE SetFlag( + virtual HRESULT STDMETHODCALLTYPE SetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFlag( + + virtual HRESULT STDMETHODCALLTYPE GetFlag( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetInt( + + virtual HRESULT STDMETHODCALLTYPE SetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetInt( + + virtual HRESULT STDMETHODCALLTYPE GetInt( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFloat( + + virtual HRESULT STDMETHODCALLTYPE SetFloat( /* [in] */ 
BMDDeckLinkConfigurationID cfgID, /* [in] */ double value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFloat( + + virtual HRESULT STDMETHODCALLTYPE GetFloat( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetString( + + virtual HRESULT STDMETHODCALLTYPE SetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value) = 0; - + virtual HRESULT STDMETHODCALLTYPE WriteConfigurationToPreferences( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkConfiguration_v10_2Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkConfiguration_v10_2 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkConfiguration_v10_2 * This); - - HRESULT ( STDMETHODCALLTYPE *SetFlag )( + + HRESULT ( STDMETHODCALLTYPE *SetFlag )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BOOL value); - - HRESULT ( STDMETHODCALLTYPE *GetFlag )( + + HRESULT ( STDMETHODCALLTYPE *GetFlag )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BOOL *value); - - HRESULT ( STDMETHODCALLTYPE *SetInt )( + + HRESULT ( STDMETHODCALLTYPE *SetInt )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ LONGLONG value); - - HRESULT ( STDMETHODCALLTYPE *GetInt )( + + HRESULT ( STDMETHODCALLTYPE *GetInt )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ 
BMDDeckLinkConfigurationID cfgID, /* [out] */ LONGLONG *value); - - HRESULT ( STDMETHODCALLTYPE *SetFloat )( + + HRESULT ( STDMETHODCALLTYPE *SetFloat )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ double value); - - HRESULT ( STDMETHODCALLTYPE *GetFloat )( + + HRESULT ( STDMETHODCALLTYPE *GetFloat )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ double *value); - - HRESULT ( STDMETHODCALLTYPE *SetString )( + + HRESULT ( STDMETHODCALLTYPE *SetString )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [in] */ BSTR value); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkConfiguration_v10_2 * This, /* [in] */ BMDDeckLinkConfigurationID cfgID, /* [out] */ BSTR *value); - - HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( + + HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( IDeckLinkConfiguration_v10_2 * This); - + END_INTERFACE } IDeckLinkConfiguration_v10_2Vtbl; @@ -9939,47 +11864,47 @@ CONST_VTBL struct IDeckLinkConfiguration_v10_2Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkConfiguration_v10_2_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkConfiguration_v10_2_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkConfiguration_v10_2_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkConfiguration_v10_2_SetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFlag(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_GetFlag(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFlag(This,cfgID,value) ) #define 
IDeckLinkConfiguration_v10_2_SetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> SetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_GetInt(This,cfgID,value) \ - ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) + ( (This)->lpVtbl -> GetInt(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_SetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> SetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_GetFloat(This,cfgID,value) \ - ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) + ( (This)->lpVtbl -> GetFloat(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_SetString(This,cfgID,value) \ - ( (This)->lpVtbl -> SetString(This,cfgID,value) ) + ( (This)->lpVtbl -> SetString(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_GetString(This,cfgID,value) \ - ( (This)->lpVtbl -> GetString(This,cfgID,value) ) + ( (This)->lpVtbl -> GetString(This,cfgID,value) ) #define IDeckLinkConfiguration_v10_2_WriteConfigurationToPreferences(This) \ - ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) + ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) #endif /* COBJMACROS */ @@ -9996,175 +11921,175 @@ #define __IDeckLinkOutput_v9_9_INTERFACE_DEFINED__ /* interface IDeckLinkOutput_v9_9 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkOutput_v9_9; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("A3EF0963-0862-44ED-92A9-EE89ABF431C7") IDeckLinkOutput_v9_9 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoOutputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, /* [out] */ IDeckLinkDisplayMode 
**resultDisplayMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDVideoOutputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( /* [in] */ int width, /* [in] */ int height, /* [in] */ int rowBytes, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame **outFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( /* [in] */ BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( + + virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( /* [in] */ IDeckLinkVideoFrame *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( /* [in] */ IDeckLinkVideoFrame *theFrame, /* [in] */ BMDTimeValue displayTime, /* [in] */ BMDTimeValue displayDuration, /* [in] */ BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( /* [in] 
*/ IDeckLinkVideoOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( /* [out] */ unsigned int *bufferedFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount, /* [in] */ BMDAudioOutputStreamType streamType) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( + + virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten) = 0; - + virtual HRESULT STDMETHODCALLTYPE BeginAudioPreroll( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE EndAudioPreroll( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( + + virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [in] */ BMDTimeValue streamTime, /* [in] */ BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( /* [out] */ unsigned int *bufferedSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushBufferedAudioSamples( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( + + virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( /* [in] */ IDeckLinkAudioOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( /* [in] */ BMDTimeValue playbackStartTime, /* [in] */ BMDTimeScale timeScale, /* [in] */ double playbackSpeed) = 0; - - virtual HRESULT 
STDMETHODCALLTYPE StopScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( /* [in] */ BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, /* [in] */ BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( + + virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( /* [out] */ BOOL *active) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( + + virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *streamTime, /* [out] */ double *playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetReferenceStatus( + + virtual HRESULT STDMETHODCALLTYPE GetReferenceStatus( /* [out] */ BMDReferenceStatus *referenceStatus) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkOutput_v9_9Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkOutput_v9_9 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkOutput_v9_9 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkOutput_v9_9 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoOutputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, + /* 
[out] */ BMDDisplayModeSupport_v10_11 *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkOutput_v9_9 * This, /* [out] */ IDeckLinkDisplayModeIterator **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkOutput_v9_9 * This, /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDVideoOutputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( IDeckLinkOutput_v9_9 * This); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( IDeckLinkOutput_v9_9 * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( IDeckLinkOutput_v9_9 * This, /* [in] */ int width, /* [in] */ int height, @@ -10172,105 +12097,105 @@ /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame **outFrame); - - HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer); - - HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( + + HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( IDeckLinkOutput_v9_9 * This, /* [in] */ IDeckLinkVideoFrame *theFrame); - - HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( IDeckLinkOutput_v9_9 * This, /* [in] */ 
IDeckLinkVideoFrame *theFrame, /* [in] */ BMDTimeValue displayTime, /* [in] */ BMDTimeValue displayDuration, /* [in] */ BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( IDeckLinkOutput_v9_9 * This, /* [in] */ IDeckLinkVideoOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( IDeckLinkOutput_v9_9 * This, /* [out] */ unsigned int *bufferedFrameCount); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount, /* [in] */ BMDAudioOutputStreamType streamType); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( IDeckLinkOutput_v9_9 * This); - - HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( + + HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( IDeckLinkOutput_v9_9 * This, /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( IDeckLinkOutput_v9_9 * This); - - HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( IDeckLinkOutput_v9_9 * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( IDeckLinkOutput_v9_9 * This, /* [in] */ void *buffer, /* [in] */ unsigned int sampleFrameCount, /* [in] */ BMDTimeValue streamTime, /* [in] */ BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE 
*GetBufferedAudioSampleFrameCount )( IDeckLinkOutput_v9_9 * This, /* [out] */ unsigned int *bufferedSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( IDeckLinkOutput_v9_9 * This); - - HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( IDeckLinkOutput_v9_9 * This, /* [in] */ IDeckLinkAudioOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDTimeValue playbackStartTime, /* [in] */ BMDTimeScale timeScale, /* [in] */ double playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, /* [in] */ BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( + + HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( IDeckLinkOutput_v9_9 * This, /* [out] */ BOOL *active); - - HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( + + HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *streamTime, /* [out] */ double *playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *GetReferenceStatus )( + + HRESULT ( STDMETHODCALLTYPE *GetReferenceStatus )( IDeckLinkOutput_v9_9 * This, /* [out] */ BMDReferenceStatus *referenceStatus); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkOutput_v9_9 * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - + END_INTERFACE } IDeckLinkOutput_v9_9Vtbl; @@ -10279,101 
+12204,101 @@ CONST_VTBL struct IDeckLinkOutput_v9_9Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkOutput_v9_9_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkOutput_v9_9_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkOutput_v9_9_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkOutput_v9_9_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) #define IDeckLinkOutput_v9_9_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkOutput_v9_9_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkOutput_v9_9_EnableVideoOutput(This,displayMode,flags) \ - ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) + ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) #define IDeckLinkOutput_v9_9_DisableVideoOutput(This) \ - ( (This)->lpVtbl -> DisableVideoOutput(This) ) + ( (This)->lpVtbl -> DisableVideoOutput(This) ) #define IDeckLinkOutput_v9_9_SetVideoOutputFrameMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) #define IDeckLinkOutput_v9_9_CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) \ - ( (This)->lpVtbl -> 
CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) + ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) #define IDeckLinkOutput_v9_9_CreateAncillaryData(This,pixelFormat,outBuffer) \ - ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) + ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) #define IDeckLinkOutput_v9_9_DisplayVideoFrameSync(This,theFrame) \ - ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) + ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) #define IDeckLinkOutput_v9_9_ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) \ - ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) + ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) #define IDeckLinkOutput_v9_9_SetScheduledFrameCompletionCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) #define IDeckLinkOutput_v9_9_GetBufferedVideoFrameCount(This,bufferedFrameCount) \ - ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) + ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) #define IDeckLinkOutput_v9_9_EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) \ - ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) + ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) #define IDeckLinkOutput_v9_9_DisableAudioOutput(This) \ - ( (This)->lpVtbl -> DisableAudioOutput(This) ) + ( (This)->lpVtbl -> DisableAudioOutput(This) ) #define IDeckLinkOutput_v9_9_WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) \ - ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) + ( 
(This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) #define IDeckLinkOutput_v9_9_BeginAudioPreroll(This) \ - ( (This)->lpVtbl -> BeginAudioPreroll(This) ) + ( (This)->lpVtbl -> BeginAudioPreroll(This) ) #define IDeckLinkOutput_v9_9_EndAudioPreroll(This) \ - ( (This)->lpVtbl -> EndAudioPreroll(This) ) + ( (This)->lpVtbl -> EndAudioPreroll(This) ) #define IDeckLinkOutput_v9_9_ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) \ - ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) + ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) #define IDeckLinkOutput_v9_9_GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) \ - ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) + ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) #define IDeckLinkOutput_v9_9_FlushBufferedAudioSamples(This) \ - ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) + ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) #define IDeckLinkOutput_v9_9_SetAudioCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) #define IDeckLinkOutput_v9_9_StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) \ - ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) + ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) #define IDeckLinkOutput_v9_9_StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) \ - ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) + ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) #define 
IDeckLinkOutput_v9_9_IsScheduledPlaybackRunning(This,active) \ - ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) + ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) #define IDeckLinkOutput_v9_9_GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) \ - ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) ) + ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) ) #define IDeckLinkOutput_v9_9_GetReferenceStatus(This,referenceStatus) \ - ( (This)->lpVtbl -> GetReferenceStatus(This,referenceStatus) ) + ( (This)->lpVtbl -> GetReferenceStatus(This,referenceStatus) ) #define IDeckLinkOutput_v9_9_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #endif /* COBJMACROS */ @@ -10390,153 +12315,153 @@ #define __IDeckLinkInput_v9_2_INTERFACE_DEFINED__ /* interface IDeckLinkInput_v9_2 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInput_v9_2; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("6D40EF78-28B9-4E21-990D-95BB7750A04F") IDeckLinkInput_v9_2 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator **iterator) = 
0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( /* [out] */ unsigned int *availableFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( /* [out] */ unsigned int *availableSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE PauseStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkInputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInput_v9_2Vtbl { BEGIN_INTERFACE - 
- HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInput_v9_2 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInput_v9_2 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkInput_v9_2 * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags, - /* [out] */ BMDDisplayModeSupport *result, + /* [out] */ BMDDisplayModeSupport_v10_11 *result, /* [out] */ IDeckLinkDisplayMode **resultDisplayMode); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkInput_v9_2 * This, /* [out] */ IDeckLinkDisplayModeIterator **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkInput_v9_2 * This, /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( IDeckLinkInput_v9_2 * This, /* [in] */ BMDDisplayMode displayMode, /* [in] */ BMDPixelFormat pixelFormat, /* [in] */ BMDVideoInputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( IDeckLinkInput_v9_2 * This, /* [out] */ unsigned int *availableFrameCount); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( 
IDeckLinkInput_v9_2 * This, /* [in] */ BMDAudioSampleRate sampleRate, /* [in] */ BMDAudioSampleType sampleType, /* [in] */ unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( IDeckLinkInput_v9_2 * This, /* [out] */ unsigned int *availableSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *StartStreams )( + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *StopStreams )( + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( IDeckLinkInput_v9_2 * This); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkInput_v9_2 * This, /* [in] */ IDeckLinkInputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkInput_v9_2 * This, /* [in] */ BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - + END_INTERFACE } IDeckLinkInput_v9_2Vtbl; @@ -10545,65 +12470,65 @@ CONST_VTBL struct IDeckLinkInput_v9_2Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInput_v9_2_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInput_v9_2_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInput_v9_2_Release(This) \ - 
( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInput_v9_2_DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,flags,result,resultDisplayMode) ) #define IDeckLinkInput_v9_2_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkInput_v9_2_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkInput_v9_2_EnableVideoInput(This,displayMode,pixelFormat,flags) \ - ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) #define IDeckLinkInput_v9_2_DisableVideoInput(This) \ - ( (This)->lpVtbl -> DisableVideoInput(This) ) + ( (This)->lpVtbl -> DisableVideoInput(This) ) #define IDeckLinkInput_v9_2_GetAvailableVideoFrameCount(This,availableFrameCount) \ - ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) + ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) #define IDeckLinkInput_v9_2_EnableAudioInput(This,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) #define IDeckLinkInput_v9_2_DisableAudioInput(This) \ - ( (This)->lpVtbl -> DisableAudioInput(This) ) + ( (This)->lpVtbl -> DisableAudioInput(This) ) #define IDeckLinkInput_v9_2_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ - ( (This)->lpVtbl -> 
GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) #define IDeckLinkInput_v9_2_StartStreams(This) \ - ( (This)->lpVtbl -> StartStreams(This) ) + ( (This)->lpVtbl -> StartStreams(This) ) #define IDeckLinkInput_v9_2_StopStreams(This) \ - ( (This)->lpVtbl -> StopStreams(This) ) + ( (This)->lpVtbl -> StopStreams(This) ) #define IDeckLinkInput_v9_2_PauseStreams(This) \ - ( (This)->lpVtbl -> PauseStreams(This) ) + ( (This)->lpVtbl -> PauseStreams(This) ) #define IDeckLinkInput_v9_2_FlushStreams(This) \ - ( (This)->lpVtbl -> FlushStreams(This) ) + ( (This)->lpVtbl -> FlushStreams(This) ) #define IDeckLinkInput_v9_2_SetCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetCallback(This,theCallback) ) #define IDeckLinkInput_v9_2_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #endif /* COBJMACROS */ @@ -10620,72 +12545,72 @@ #define __IDeckLinkDeckControlStatusCallback_v8_1_INTERFACE_DEFINED__ /* interface IDeckLinkDeckControlStatusCallback_v8_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDeckControlStatusCallback_v8_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("E5F693C1-4283-4716-B18F-C1431521955B") IDeckLinkDeckControlStatusCallback_v8_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE TimecodeUpdate( + virtual HRESULT STDMETHODCALLTYPE TimecodeUpdate( /* [in] */ BMDTimecodeBCD currentTimecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE VTRControlStateChanged( + + virtual HRESULT STDMETHODCALLTYPE VTRControlStateChanged( /* [in] */ 
BMDDeckControlVTRControlState_v8_1 newState, /* [in] */ BMDDeckControlError error) = 0; - - virtual HRESULT STDMETHODCALLTYPE DeckControlEventReceived( + + virtual HRESULT STDMETHODCALLTYPE DeckControlEventReceived( /* [in] */ BMDDeckControlEvent event, /* [in] */ BMDDeckControlError error) = 0; - - virtual HRESULT STDMETHODCALLTYPE DeckControlStatusChanged( + + virtual HRESULT STDMETHODCALLTYPE DeckControlStatusChanged( /* [in] */ BMDDeckControlStatusFlags flags, /* [in] */ unsigned int mask) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDeckControlStatusCallback_v8_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDeckControlStatusCallback_v8_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDeckControlStatusCallback_v8_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDeckControlStatusCallback_v8_1 * This); - - HRESULT ( STDMETHODCALLTYPE *TimecodeUpdate )( + + HRESULT ( STDMETHODCALLTYPE *TimecodeUpdate )( IDeckLinkDeckControlStatusCallback_v8_1 * This, /* [in] */ BMDTimecodeBCD currentTimecode); - - HRESULT ( STDMETHODCALLTYPE *VTRControlStateChanged )( + + HRESULT ( STDMETHODCALLTYPE *VTRControlStateChanged )( IDeckLinkDeckControlStatusCallback_v8_1 * This, /* [in] */ BMDDeckControlVTRControlState_v8_1 newState, /* [in] */ BMDDeckControlError error); - - HRESULT ( STDMETHODCALLTYPE *DeckControlEventReceived )( + + HRESULT ( STDMETHODCALLTYPE *DeckControlEventReceived )( IDeckLinkDeckControlStatusCallback_v8_1 * This, /* [in] */ BMDDeckControlEvent event, /* [in] */ BMDDeckControlError error); - - HRESULT ( STDMETHODCALLTYPE *DeckControlStatusChanged )( + + HRESULT ( STDMETHODCALLTYPE *DeckControlStatusChanged )( 
IDeckLinkDeckControlStatusCallback_v8_1 * This, /* [in] */ BMDDeckControlStatusFlags flags, /* [in] */ unsigned int mask); - + END_INTERFACE } IDeckLinkDeckControlStatusCallback_v8_1Vtbl; @@ -10694,32 +12619,32 @@ CONST_VTBL struct IDeckLinkDeckControlStatusCallback_v8_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDeckControlStatusCallback_v8_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDeckControlStatusCallback_v8_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDeckControlStatusCallback_v8_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDeckControlStatusCallback_v8_1_TimecodeUpdate(This,currentTimecode) \ - ( (This)->lpVtbl -> TimecodeUpdate(This,currentTimecode) ) + ( (This)->lpVtbl -> TimecodeUpdate(This,currentTimecode) ) #define IDeckLinkDeckControlStatusCallback_v8_1_VTRControlStateChanged(This,newState,error) \ - ( (This)->lpVtbl -> VTRControlStateChanged(This,newState,error) ) + ( (This)->lpVtbl -> VTRControlStateChanged(This,newState,error) ) #define IDeckLinkDeckControlStatusCallback_v8_1_DeckControlEventReceived(This,event,error) \ - ( (This)->lpVtbl -> DeckControlEventReceived(This,event,error) ) + ( (This)->lpVtbl -> DeckControlEventReceived(This,event,error) ) #define IDeckLinkDeckControlStatusCallback_v8_1_DeckControlStatusChanged(This,flags,mask) \ - ( (This)->lpVtbl -> DeckControlStatusChanged(This,flags,mask) ) + ( (This)->lpVtbl -> DeckControlStatusChanged(This,flags,mask) ) #endif /* COBJMACROS */ @@ -10736,183 +12661,183 @@ #define __IDeckLinkDeckControl_v8_1_INTERFACE_DEFINED__ /* interface IDeckLinkDeckControl_v8_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDeckControl_v8_1; #if defined(__cplusplus) && !defined(CINTERFACE) 
- + MIDL_INTERFACE("522A9E39-0F3C-4742-94EE-D80DE335DA1D") IDeckLinkDeckControl_v8_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Open( + virtual HRESULT STDMETHODCALLTYPE Open( /* [in] */ BMDTimeScale timeScale, /* [in] */ BMDTimeValue timeValue, /* [in] */ BOOL timecodeIsDropFrame, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Close( + + virtual HRESULT STDMETHODCALLTYPE Close( /* [in] */ BOOL standbyOn) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetCurrentState( + + virtual HRESULT STDMETHODCALLTYPE GetCurrentState( /* [out] */ BMDDeckControlMode *mode, /* [out] */ BMDDeckControlVTRControlState_v8_1 *vtrControlState, /* [out] */ BMDDeckControlStatusFlags *flags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetStandby( + + virtual HRESULT STDMETHODCALLTYPE SetStandby( /* [in] */ BOOL standbyOn) = 0; - - virtual HRESULT STDMETHODCALLTYPE SendCommand( + + virtual HRESULT STDMETHODCALLTYPE SendCommand( /* [in] */ unsigned char *inBuffer, /* [in] */ unsigned int inBufferSize, /* [out] */ unsigned char *outBuffer, /* [out] */ unsigned int *outDataSize, /* [in] */ unsigned int outBufferSize, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Play( + + virtual HRESULT STDMETHODCALLTYPE Play( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Stop( + + virtual HRESULT STDMETHODCALLTYPE Stop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE TogglePlayStop( + + virtual HRESULT STDMETHODCALLTYPE TogglePlayStop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Eject( + + virtual HRESULT STDMETHODCALLTYPE Eject( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GoToTimecode( + + virtual HRESULT STDMETHODCALLTYPE GoToTimecode( /* [in] */ BMDTimecodeBCD timecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE FastForward( + + 
virtual HRESULT STDMETHODCALLTYPE FastForward( /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Rewind( + + virtual HRESULT STDMETHODCALLTYPE Rewind( /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StepForward( + + virtual HRESULT STDMETHODCALLTYPE StepForward( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StepBack( + + virtual HRESULT STDMETHODCALLTYPE StepBack( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Jog( + + virtual HRESULT STDMETHODCALLTYPE Jog( /* [in] */ double rate, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Shuttle( + + virtual HRESULT STDMETHODCALLTYPE Shuttle( /* [in] */ double rate, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecodeString( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeString( /* [out] */ BSTR *currentTimeCode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecode( + + virtual HRESULT STDMETHODCALLTYPE GetTimecode( /* [out] */ IDeckLinkTimecode **currentTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecodeBCD( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeBCD( /* [out] */ BMDTimecodeBCD *currentTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetPreroll( + + virtual HRESULT STDMETHODCALLTYPE SetPreroll( /* [in] */ unsigned int prerollSeconds) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetPreroll( + + virtual HRESULT STDMETHODCALLTYPE GetPreroll( /* [out] */ unsigned int *prerollSeconds) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetExportOffset( + + virtual HRESULT STDMETHODCALLTYPE SetExportOffset( /* [in] */ int exportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetExportOffset( + + virtual HRESULT 
STDMETHODCALLTYPE GetExportOffset( /* [out] */ int *exportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetManualExportOffset( + + virtual HRESULT STDMETHODCALLTYPE GetManualExportOffset( /* [out] */ int *deckManualExportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCaptureOffset( + + virtual HRESULT STDMETHODCALLTYPE SetCaptureOffset( /* [in] */ int captureOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetCaptureOffset( + + virtual HRESULT STDMETHODCALLTYPE GetCaptureOffset( /* [out] */ int *captureOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartExport( + + virtual HRESULT STDMETHODCALLTYPE StartExport( /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [in] */ BMDDeckControlExportModeOpsFlags exportModeOps, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartCapture( + + virtual HRESULT STDMETHODCALLTYPE StartCapture( /* [in] */ BOOL useVITC, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDeviceID( + + virtual HRESULT STDMETHODCALLTYPE GetDeviceID( /* [out] */ unsigned short *deviceId, /* [out] */ BMDDeckControlError *error) = 0; - + virtual HRESULT STDMETHODCALLTYPE Abort( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE CrashRecordStart( + + virtual HRESULT STDMETHODCALLTYPE CrashRecordStart( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE CrashRecordStop( + + virtual HRESULT STDMETHODCALLTYPE CrashRecordStop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkDeckControlStatusCallback_v8_1 *callback) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDeckControl_v8_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( 
STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDeckControl_v8_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDeckControl_v8_1 * This); - - HRESULT ( STDMETHODCALLTYPE *Open )( + + HRESULT ( STDMETHODCALLTYPE *Open )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BMDTimeScale timeScale, /* [in] */ BMDTimeValue timeValue, /* [in] */ BOOL timecodeIsDropFrame, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Close )( + + HRESULT ( STDMETHODCALLTYPE *Close )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BOOL standbyOn); - - HRESULT ( STDMETHODCALLTYPE *GetCurrentState )( + + HRESULT ( STDMETHODCALLTYPE *GetCurrentState )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlMode *mode, /* [out] */ BMDDeckControlVTRControlState_v8_1 *vtrControlState, /* [out] */ BMDDeckControlStatusFlags *flags); - - HRESULT ( STDMETHODCALLTYPE *SetStandby )( + + HRESULT ( STDMETHODCALLTYPE *SetStandby )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BOOL standbyOn); - - HRESULT ( STDMETHODCALLTYPE *SendCommand )( + + HRESULT ( STDMETHODCALLTYPE *SendCommand )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ unsigned char *inBuffer, /* [in] */ unsigned int inBufferSize, @@ -10920,133 +12845,133 @@ /* [out] */ unsigned int *outDataSize, /* [in] */ unsigned int outBufferSize, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Play )( + + HRESULT ( STDMETHODCALLTYPE *Play )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Stop )( + + HRESULT ( STDMETHODCALLTYPE *Stop )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *TogglePlayStop 
)( + + HRESULT ( STDMETHODCALLTYPE *TogglePlayStop )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Eject )( + + HRESULT ( STDMETHODCALLTYPE *Eject )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GoToTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GoToTimecode )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BMDTimecodeBCD timecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *FastForward )( + + HRESULT ( STDMETHODCALLTYPE *FastForward )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Rewind )( + + HRESULT ( STDMETHODCALLTYPE *Rewind )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StepForward )( + + HRESULT ( STDMETHODCALLTYPE *StepForward )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StepBack )( + + HRESULT ( STDMETHODCALLTYPE *StepBack )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Jog )( + + HRESULT ( STDMETHODCALLTYPE *Jog )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ double rate, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Shuttle )( + + HRESULT ( STDMETHODCALLTYPE *Shuttle )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ double rate, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeString )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeString )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BSTR *currentTimeCode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ IDeckLinkTimecode **currentTimecode, /* [out] */ 
BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeBCD )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeBCD )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDTimecodeBCD *currentTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *SetPreroll )( + + HRESULT ( STDMETHODCALLTYPE *SetPreroll )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ unsigned int prerollSeconds); - - HRESULT ( STDMETHODCALLTYPE *GetPreroll )( + + HRESULT ( STDMETHODCALLTYPE *GetPreroll )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ unsigned int *prerollSeconds); - - HRESULT ( STDMETHODCALLTYPE *SetExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *SetExportOffset )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ int exportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetExportOffset )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ int *exportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetManualExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetManualExportOffset )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ int *deckManualExportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *SetCaptureOffset )( + + HRESULT ( STDMETHODCALLTYPE *SetCaptureOffset )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ int captureOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetCaptureOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetCaptureOffset )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ int *captureOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *StartExport )( + + HRESULT ( STDMETHODCALLTYPE *StartExport )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [in] */ BMDDeckControlExportModeOpsFlags exportModeOps, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StartCapture )( + + HRESULT ( STDMETHODCALLTYPE *StartCapture )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ BOOL useVITC, /* [in] */ BMDTimecodeBCD inTimecode, /* 
[in] */ BMDTimecodeBCD outTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetDeviceID )( + + HRESULT ( STDMETHODCALLTYPE *GetDeviceID )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ unsigned short *deviceId, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Abort )( + + HRESULT ( STDMETHODCALLTYPE *Abort )( IDeckLinkDeckControl_v8_1 * This); - - HRESULT ( STDMETHODCALLTYPE *CrashRecordStart )( + + HRESULT ( STDMETHODCALLTYPE *CrashRecordStart )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *CrashRecordStop )( + + HRESULT ( STDMETHODCALLTYPE *CrashRecordStop )( IDeckLinkDeckControl_v8_1 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkDeckControl_v8_1 * This, /* [in] */ IDeckLinkDeckControlStatusCallback_v8_1 *callback); - + END_INTERFACE } IDeckLinkDeckControl_v8_1Vtbl; @@ -11055,119 +12980,119 @@ CONST_VTBL struct IDeckLinkDeckControl_v8_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDeckControl_v8_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDeckControl_v8_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDeckControl_v8_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDeckControl_v8_1_Open(This,timeScale,timeValue,timecodeIsDropFrame,error) \ - ( (This)->lpVtbl -> Open(This,timeScale,timeValue,timecodeIsDropFrame,error) ) + ( (This)->lpVtbl -> Open(This,timeScale,timeValue,timecodeIsDropFrame,error) ) #define IDeckLinkDeckControl_v8_1_Close(This,standbyOn) \ - ( (This)->lpVtbl -> Close(This,standbyOn) ) + ( (This)->lpVtbl -> Close(This,standbyOn) ) #define 
IDeckLinkDeckControl_v8_1_GetCurrentState(This,mode,vtrControlState,flags) \ - ( (This)->lpVtbl -> GetCurrentState(This,mode,vtrControlState,flags) ) + ( (This)->lpVtbl -> GetCurrentState(This,mode,vtrControlState,flags) ) #define IDeckLinkDeckControl_v8_1_SetStandby(This,standbyOn) \ - ( (This)->lpVtbl -> SetStandby(This,standbyOn) ) + ( (This)->lpVtbl -> SetStandby(This,standbyOn) ) #define IDeckLinkDeckControl_v8_1_SendCommand(This,inBuffer,inBufferSize,outBuffer,outDataSize,outBufferSize,error) \ - ( (This)->lpVtbl -> SendCommand(This,inBuffer,inBufferSize,outBuffer,outDataSize,outBufferSize,error) ) + ( (This)->lpVtbl -> SendCommand(This,inBuffer,inBufferSize,outBuffer,outDataSize,outBufferSize,error) ) #define IDeckLinkDeckControl_v8_1_Play(This,error) \ - ( (This)->lpVtbl -> Play(This,error) ) + ( (This)->lpVtbl -> Play(This,error) ) #define IDeckLinkDeckControl_v8_1_Stop(This,error) \ - ( (This)->lpVtbl -> Stop(This,error) ) + ( (This)->lpVtbl -> Stop(This,error) ) #define IDeckLinkDeckControl_v8_1_TogglePlayStop(This,error) \ - ( (This)->lpVtbl -> TogglePlayStop(This,error) ) + ( (This)->lpVtbl -> TogglePlayStop(This,error) ) #define IDeckLinkDeckControl_v8_1_Eject(This,error) \ - ( (This)->lpVtbl -> Eject(This,error) ) + ( (This)->lpVtbl -> Eject(This,error) ) #define IDeckLinkDeckControl_v8_1_GoToTimecode(This,timecode,error) \ - ( (This)->lpVtbl -> GoToTimecode(This,timecode,error) ) + ( (This)->lpVtbl -> GoToTimecode(This,timecode,error) ) #define IDeckLinkDeckControl_v8_1_FastForward(This,viewTape,error) \ - ( (This)->lpVtbl -> FastForward(This,viewTape,error) ) + ( (This)->lpVtbl -> FastForward(This,viewTape,error) ) #define IDeckLinkDeckControl_v8_1_Rewind(This,viewTape,error) \ - ( (This)->lpVtbl -> Rewind(This,viewTape,error) ) + ( (This)->lpVtbl -> Rewind(This,viewTape,error) ) #define IDeckLinkDeckControl_v8_1_StepForward(This,error) \ - ( (This)->lpVtbl -> StepForward(This,error) ) + ( (This)->lpVtbl -> StepForward(This,error) ) #define 
IDeckLinkDeckControl_v8_1_StepBack(This,error) \ - ( (This)->lpVtbl -> StepBack(This,error) ) + ( (This)->lpVtbl -> StepBack(This,error) ) #define IDeckLinkDeckControl_v8_1_Jog(This,rate,error) \ - ( (This)->lpVtbl -> Jog(This,rate,error) ) + ( (This)->lpVtbl -> Jog(This,rate,error) ) #define IDeckLinkDeckControl_v8_1_Shuttle(This,rate,error) \ - ( (This)->lpVtbl -> Shuttle(This,rate,error) ) + ( (This)->lpVtbl -> Shuttle(This,rate,error) ) #define IDeckLinkDeckControl_v8_1_GetTimecodeString(This,currentTimeCode,error) \ - ( (This)->lpVtbl -> GetTimecodeString(This,currentTimeCode,error) ) + ( (This)->lpVtbl -> GetTimecodeString(This,currentTimeCode,error) ) #define IDeckLinkDeckControl_v8_1_GetTimecode(This,currentTimecode,error) \ - ( (This)->lpVtbl -> GetTimecode(This,currentTimecode,error) ) + ( (This)->lpVtbl -> GetTimecode(This,currentTimecode,error) ) #define IDeckLinkDeckControl_v8_1_GetTimecodeBCD(This,currentTimecode,error) \ - ( (This)->lpVtbl -> GetTimecodeBCD(This,currentTimecode,error) ) + ( (This)->lpVtbl -> GetTimecodeBCD(This,currentTimecode,error) ) #define IDeckLinkDeckControl_v8_1_SetPreroll(This,prerollSeconds) \ - ( (This)->lpVtbl -> SetPreroll(This,prerollSeconds) ) + ( (This)->lpVtbl -> SetPreroll(This,prerollSeconds) ) #define IDeckLinkDeckControl_v8_1_GetPreroll(This,prerollSeconds) \ - ( (This)->lpVtbl -> GetPreroll(This,prerollSeconds) ) + ( (This)->lpVtbl -> GetPreroll(This,prerollSeconds) ) #define IDeckLinkDeckControl_v8_1_SetExportOffset(This,exportOffsetFields) \ - ( (This)->lpVtbl -> SetExportOffset(This,exportOffsetFields) ) + ( (This)->lpVtbl -> SetExportOffset(This,exportOffsetFields) ) #define IDeckLinkDeckControl_v8_1_GetExportOffset(This,exportOffsetFields) \ - ( (This)->lpVtbl -> GetExportOffset(This,exportOffsetFields) ) + ( (This)->lpVtbl -> GetExportOffset(This,exportOffsetFields) ) #define IDeckLinkDeckControl_v8_1_GetManualExportOffset(This,deckManualExportOffsetFields) \ - ( (This)->lpVtbl -> 
GetManualExportOffset(This,deckManualExportOffsetFields) ) + ( (This)->lpVtbl -> GetManualExportOffset(This,deckManualExportOffsetFields) ) #define IDeckLinkDeckControl_v8_1_SetCaptureOffset(This,captureOffsetFields) \ - ( (This)->lpVtbl -> SetCaptureOffset(This,captureOffsetFields) ) + ( (This)->lpVtbl -> SetCaptureOffset(This,captureOffsetFields) ) #define IDeckLinkDeckControl_v8_1_GetCaptureOffset(This,captureOffsetFields) \ - ( (This)->lpVtbl -> GetCaptureOffset(This,captureOffsetFields) ) + ( (This)->lpVtbl -> GetCaptureOffset(This,captureOffsetFields) ) #define IDeckLinkDeckControl_v8_1_StartExport(This,inTimecode,outTimecode,exportModeOps,error) \ - ( (This)->lpVtbl -> StartExport(This,inTimecode,outTimecode,exportModeOps,error) ) + ( (This)->lpVtbl -> StartExport(This,inTimecode,outTimecode,exportModeOps,error) ) #define IDeckLinkDeckControl_v8_1_StartCapture(This,useVITC,inTimecode,outTimecode,error) \ - ( (This)->lpVtbl -> StartCapture(This,useVITC,inTimecode,outTimecode,error) ) + ( (This)->lpVtbl -> StartCapture(This,useVITC,inTimecode,outTimecode,error) ) #define IDeckLinkDeckControl_v8_1_GetDeviceID(This,deviceId,error) \ - ( (This)->lpVtbl -> GetDeviceID(This,deviceId,error) ) + ( (This)->lpVtbl -> GetDeviceID(This,deviceId,error) ) #define IDeckLinkDeckControl_v8_1_Abort(This) \ - ( (This)->lpVtbl -> Abort(This) ) + ( (This)->lpVtbl -> Abort(This) ) #define IDeckLinkDeckControl_v8_1_CrashRecordStart(This,error) \ - ( (This)->lpVtbl -> CrashRecordStart(This,error) ) + ( (This)->lpVtbl -> CrashRecordStart(This,error) ) #define IDeckLinkDeckControl_v8_1_CrashRecordStop(This,error) \ - ( (This)->lpVtbl -> CrashRecordStop(This,error) ) + ( (This)->lpVtbl -> CrashRecordStop(This,error) ) #define IDeckLinkDeckControl_v8_1_SetCallback(This,callback) \ - ( (This)->lpVtbl -> SetCallback(This,callback) ) + ( (This)->lpVtbl -> SetCallback(This,callback) ) #endif /* COBJMACROS */ @@ -11184,45 +13109,45 @@ #define __IDeckLink_v8_0_INTERFACE_DEFINED__ /* interface 
IDeckLink_v8_0 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLink_v8_0; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("62BFF75D-6569-4E55-8D4D-66AA03829ABC") IDeckLink_v8_0 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetModelName( + virtual HRESULT STDMETHODCALLTYPE GetModelName( /* [out] */ BSTR *modelName) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLink_v8_0Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLink_v8_0 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLink_v8_0 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLink_v8_0 * This); - - HRESULT ( STDMETHODCALLTYPE *GetModelName )( + + HRESULT ( STDMETHODCALLTYPE *GetModelName )( IDeckLink_v8_0 * This, /* [out] */ BSTR *modelName); - + END_INTERFACE } IDeckLink_v8_0Vtbl; @@ -11231,23 +13156,23 @@ CONST_VTBL struct IDeckLink_v8_0Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLink_v8_0_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLink_v8_0_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLink_v8_0_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLink_v8_0_GetModelName(This,modelName) \ - ( (This)->lpVtbl -> GetModelName(This,modelName) ) + ( (This)->lpVtbl -> GetModelName(This,modelName) ) #endif /* COBJMACROS */ @@ -11264,45 +13189,45 @@ #define __IDeckLinkIterator_v8_0_INTERFACE_DEFINED__ /* interface IDeckLinkIterator_v8_0 */ -/* [helpstring][uuid][object] 
*/ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkIterator_v8_0; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("74E936FC-CC28-4A67-81A0-1E94E52D4E69") IDeckLinkIterator_v8_0 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Next( + virtual HRESULT STDMETHODCALLTYPE Next( /* [out] */ IDeckLink_v8_0 **deckLinkInstance) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkIterator_v8_0Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkIterator_v8_0 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkIterator_v8_0 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkIterator_v8_0 * This); - - HRESULT ( STDMETHODCALLTYPE *Next )( + + HRESULT ( STDMETHODCALLTYPE *Next )( IDeckLinkIterator_v8_0 * This, /* [out] */ IDeckLink_v8_0 **deckLinkInstance); - + END_INTERFACE } IDeckLinkIterator_v8_0Vtbl; @@ -11311,23 +13236,23 @@ CONST_VTBL struct IDeckLinkIterator_v8_0Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkIterator_v8_0_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkIterator_v8_0_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkIterator_v8_0_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkIterator_v8_0_Next(This,deckLinkInstance) \ - ( (This)->lpVtbl -> Next(This,deckLinkInstance) ) + ( (This)->lpVtbl -> Next(This,deckLinkInstance) ) #endif /* COBJMACROS */ @@ -11352,300 +13277,300 @@ #define __IDeckLinkDeckControl_v7_9_INTERFACE_DEFINED__ /* 
interface IDeckLinkDeckControl_v7_9 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDeckControl_v7_9; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("A4D81043-0619-42B7-8ED6-602D29041DF7") IDeckLinkDeckControl_v7_9 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Open( + virtual HRESULT STDMETHODCALLTYPE Open( /* [in] */ BMDTimeScale timeScale, /* [in] */ BMDTimeValue timeValue, /* [in] */ BOOL timecodeIsDropFrame, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Close( + + virtual HRESULT STDMETHODCALLTYPE Close( /* [in] */ BOOL standbyOn) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetCurrentState( + + virtual HRESULT STDMETHODCALLTYPE GetCurrentState( /* [out] */ BMDDeckControlMode *mode, /* [out] */ BMDDeckControlVTRControlState *vtrControlState, /* [out] */ BMDDeckControlStatusFlags *flags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetStandby( + + virtual HRESULT STDMETHODCALLTYPE SetStandby( /* [in] */ BOOL standbyOn) = 0; - - virtual HRESULT STDMETHODCALLTYPE Play( + + virtual HRESULT STDMETHODCALLTYPE Play( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Stop( + + virtual HRESULT STDMETHODCALLTYPE Stop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE TogglePlayStop( + + virtual HRESULT STDMETHODCALLTYPE TogglePlayStop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Eject( + + virtual HRESULT STDMETHODCALLTYPE Eject( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GoToTimecode( + + virtual HRESULT STDMETHODCALLTYPE GoToTimecode( /* [in] */ BMDTimecodeBCD timecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE FastForward( + + virtual HRESULT STDMETHODCALLTYPE FastForward( /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT 
STDMETHODCALLTYPE Rewind( + + virtual HRESULT STDMETHODCALLTYPE Rewind( /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StepForward( + + virtual HRESULT STDMETHODCALLTYPE StepForward( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StepBack( + + virtual HRESULT STDMETHODCALLTYPE StepBack( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Jog( + + virtual HRESULT STDMETHODCALLTYPE Jog( /* [in] */ double rate, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE Shuttle( + + virtual HRESULT STDMETHODCALLTYPE Shuttle( /* [in] */ double rate, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecodeString( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeString( /* [out] */ BSTR *currentTimeCode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecode( + + virtual HRESULT STDMETHODCALLTYPE GetTimecode( /* [out] */ IDeckLinkTimecode **currentTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecodeBCD( + + virtual HRESULT STDMETHODCALLTYPE GetTimecodeBCD( /* [out] */ BMDTimecodeBCD *currentTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetPreroll( + + virtual HRESULT STDMETHODCALLTYPE SetPreroll( /* [in] */ unsigned int prerollSeconds) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetPreroll( + + virtual HRESULT STDMETHODCALLTYPE GetPreroll( /* [out] */ unsigned int *prerollSeconds) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetExportOffset( + + virtual HRESULT STDMETHODCALLTYPE SetExportOffset( /* [in] */ int exportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetExportOffset( + + virtual HRESULT STDMETHODCALLTYPE GetExportOffset( /* [out] */ int *exportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetManualExportOffset( + + 
virtual HRESULT STDMETHODCALLTYPE GetManualExportOffset( /* [out] */ int *deckManualExportOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCaptureOffset( + + virtual HRESULT STDMETHODCALLTYPE SetCaptureOffset( /* [in] */ int captureOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetCaptureOffset( + + virtual HRESULT STDMETHODCALLTYPE GetCaptureOffset( /* [out] */ int *captureOffsetFields) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartExport( + + virtual HRESULT STDMETHODCALLTYPE StartExport( /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [in] */ BMDDeckControlExportModeOpsFlags exportModeOps, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartCapture( + + virtual HRESULT STDMETHODCALLTYPE StartCapture( /* [in] */ BOOL useVITC, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDeviceID( + + virtual HRESULT STDMETHODCALLTYPE GetDeviceID( /* [out] */ unsigned short *deviceId, /* [out] */ BMDDeckControlError *error) = 0; - + virtual HRESULT STDMETHODCALLTYPE Abort( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE CrashRecordStart( + + virtual HRESULT STDMETHODCALLTYPE CrashRecordStart( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE CrashRecordStop( + + virtual HRESULT STDMETHODCALLTYPE CrashRecordStop( /* [out] */ BMDDeckControlError *error) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkDeckControlStatusCallback *callback) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDeckControl_v7_9Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* 
[annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDeckControl_v7_9 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDeckControl_v7_9 * This); - - HRESULT ( STDMETHODCALLTYPE *Open )( + + HRESULT ( STDMETHODCALLTYPE *Open )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BMDTimeScale timeScale, /* [in] */ BMDTimeValue timeValue, /* [in] */ BOOL timecodeIsDropFrame, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Close )( + + HRESULT ( STDMETHODCALLTYPE *Close )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BOOL standbyOn); - - HRESULT ( STDMETHODCALLTYPE *GetCurrentState )( + + HRESULT ( STDMETHODCALLTYPE *GetCurrentState )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlMode *mode, /* [out] */ BMDDeckControlVTRControlState *vtrControlState, /* [out] */ BMDDeckControlStatusFlags *flags); - - HRESULT ( STDMETHODCALLTYPE *SetStandby )( + + HRESULT ( STDMETHODCALLTYPE *SetStandby )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BOOL standbyOn); - - HRESULT ( STDMETHODCALLTYPE *Play )( + + HRESULT ( STDMETHODCALLTYPE *Play )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Stop )( + + HRESULT ( STDMETHODCALLTYPE *Stop )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *TogglePlayStop )( + + HRESULT ( STDMETHODCALLTYPE *TogglePlayStop )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Eject )( + + HRESULT ( STDMETHODCALLTYPE *Eject )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GoToTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GoToTimecode )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BMDTimecodeBCD timecode, /* [out] */ BMDDeckControlError 
*error); - - HRESULT ( STDMETHODCALLTYPE *FastForward )( + + HRESULT ( STDMETHODCALLTYPE *FastForward )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Rewind )( + + HRESULT ( STDMETHODCALLTYPE *Rewind )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BOOL viewTape, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StepForward )( + + HRESULT ( STDMETHODCALLTYPE *StepForward )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StepBack )( + + HRESULT ( STDMETHODCALLTYPE *StepBack )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Jog )( + + HRESULT ( STDMETHODCALLTYPE *Jog )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ double rate, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Shuttle )( + + HRESULT ( STDMETHODCALLTYPE *Shuttle )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ double rate, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeString )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeString )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BSTR *currentTimeCode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ IDeckLinkTimecode **currentTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetTimecodeBCD )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecodeBCD )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDTimecodeBCD *currentTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *SetPreroll )( + + HRESULT ( STDMETHODCALLTYPE *SetPreroll )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ unsigned int prerollSeconds); - - HRESULT ( STDMETHODCALLTYPE *GetPreroll )( + + HRESULT ( 
STDMETHODCALLTYPE *GetPreroll )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ unsigned int *prerollSeconds); - - HRESULT ( STDMETHODCALLTYPE *SetExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *SetExportOffset )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ int exportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetExportOffset )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ int *exportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetManualExportOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetManualExportOffset )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ int *deckManualExportOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *SetCaptureOffset )( + + HRESULT ( STDMETHODCALLTYPE *SetCaptureOffset )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ int captureOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *GetCaptureOffset )( + + HRESULT ( STDMETHODCALLTYPE *GetCaptureOffset )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ int *captureOffsetFields); - - HRESULT ( STDMETHODCALLTYPE *StartExport )( + + HRESULT ( STDMETHODCALLTYPE *StartExport )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [in] */ BMDDeckControlExportModeOpsFlags exportModeOps, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *StartCapture )( + + HRESULT ( STDMETHODCALLTYPE *StartCapture )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ BOOL useVITC, /* [in] */ BMDTimecodeBCD inTimecode, /* [in] */ BMDTimecodeBCD outTimecode, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *GetDeviceID )( + + HRESULT ( STDMETHODCALLTYPE *GetDeviceID )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ unsigned short *deviceId, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *Abort )( + + HRESULT ( STDMETHODCALLTYPE *Abort )( IDeckLinkDeckControl_v7_9 * This); - - HRESULT ( STDMETHODCALLTYPE *CrashRecordStart )( + + HRESULT ( 
STDMETHODCALLTYPE *CrashRecordStart )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *CrashRecordStop )( + + HRESULT ( STDMETHODCALLTYPE *CrashRecordStop )( IDeckLinkDeckControl_v7_9 * This, /* [out] */ BMDDeckControlError *error); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkDeckControl_v7_9 * This, /* [in] */ IDeckLinkDeckControlStatusCallback *callback); - + END_INTERFACE } IDeckLinkDeckControl_v7_9Vtbl; @@ -11654,116 +13579,116 @@ CONST_VTBL struct IDeckLinkDeckControl_v7_9Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDeckControl_v7_9_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDeckControl_v7_9_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDeckControl_v7_9_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDeckControl_v7_9_Open(This,timeScale,timeValue,timecodeIsDropFrame,error) \ - ( (This)->lpVtbl -> Open(This,timeScale,timeValue,timecodeIsDropFrame,error) ) + ( (This)->lpVtbl -> Open(This,timeScale,timeValue,timecodeIsDropFrame,error) ) #define IDeckLinkDeckControl_v7_9_Close(This,standbyOn) \ - ( (This)->lpVtbl -> Close(This,standbyOn) ) + ( (This)->lpVtbl -> Close(This,standbyOn) ) #define IDeckLinkDeckControl_v7_9_GetCurrentState(This,mode,vtrControlState,flags) \ - ( (This)->lpVtbl -> GetCurrentState(This,mode,vtrControlState,flags) ) + ( (This)->lpVtbl -> GetCurrentState(This,mode,vtrControlState,flags) ) #define IDeckLinkDeckControl_v7_9_SetStandby(This,standbyOn) \ - ( (This)->lpVtbl -> SetStandby(This,standbyOn) ) + ( (This)->lpVtbl -> SetStandby(This,standbyOn) ) #define IDeckLinkDeckControl_v7_9_Play(This,error) \ - ( (This)->lpVtbl -> Play(This,error) ) + ( (This)->lpVtbl -> 
Play(This,error) ) #define IDeckLinkDeckControl_v7_9_Stop(This,error) \ - ( (This)->lpVtbl -> Stop(This,error) ) + ( (This)->lpVtbl -> Stop(This,error) ) #define IDeckLinkDeckControl_v7_9_TogglePlayStop(This,error) \ - ( (This)->lpVtbl -> TogglePlayStop(This,error) ) + ( (This)->lpVtbl -> TogglePlayStop(This,error) ) #define IDeckLinkDeckControl_v7_9_Eject(This,error) \ - ( (This)->lpVtbl -> Eject(This,error) ) + ( (This)->lpVtbl -> Eject(This,error) ) #define IDeckLinkDeckControl_v7_9_GoToTimecode(This,timecode,error) \ - ( (This)->lpVtbl -> GoToTimecode(This,timecode,error) ) + ( (This)->lpVtbl -> GoToTimecode(This,timecode,error) ) #define IDeckLinkDeckControl_v7_9_FastForward(This,viewTape,error) \ - ( (This)->lpVtbl -> FastForward(This,viewTape,error) ) + ( (This)->lpVtbl -> FastForward(This,viewTape,error) ) #define IDeckLinkDeckControl_v7_9_Rewind(This,viewTape,error) \ - ( (This)->lpVtbl -> Rewind(This,viewTape,error) ) + ( (This)->lpVtbl -> Rewind(This,viewTape,error) ) #define IDeckLinkDeckControl_v7_9_StepForward(This,error) \ - ( (This)->lpVtbl -> StepForward(This,error) ) + ( (This)->lpVtbl -> StepForward(This,error) ) #define IDeckLinkDeckControl_v7_9_StepBack(This,error) \ - ( (This)->lpVtbl -> StepBack(This,error) ) + ( (This)->lpVtbl -> StepBack(This,error) ) #define IDeckLinkDeckControl_v7_9_Jog(This,rate,error) \ - ( (This)->lpVtbl -> Jog(This,rate,error) ) + ( (This)->lpVtbl -> Jog(This,rate,error) ) #define IDeckLinkDeckControl_v7_9_Shuttle(This,rate,error) \ - ( (This)->lpVtbl -> Shuttle(This,rate,error) ) + ( (This)->lpVtbl -> Shuttle(This,rate,error) ) #define IDeckLinkDeckControl_v7_9_GetTimecodeString(This,currentTimeCode,error) \ - ( (This)->lpVtbl -> GetTimecodeString(This,currentTimeCode,error) ) + ( (This)->lpVtbl -> GetTimecodeString(This,currentTimeCode,error) ) #define IDeckLinkDeckControl_v7_9_GetTimecode(This,currentTimecode,error) \ - ( (This)->lpVtbl -> GetTimecode(This,currentTimecode,error) ) + ( (This)->lpVtbl -> 
GetTimecode(This,currentTimecode,error) ) #define IDeckLinkDeckControl_v7_9_GetTimecodeBCD(This,currentTimecode,error) \ - ( (This)->lpVtbl -> GetTimecodeBCD(This,currentTimecode,error) ) + ( (This)->lpVtbl -> GetTimecodeBCD(This,currentTimecode,error) ) #define IDeckLinkDeckControl_v7_9_SetPreroll(This,prerollSeconds) \ - ( (This)->lpVtbl -> SetPreroll(This,prerollSeconds) ) + ( (This)->lpVtbl -> SetPreroll(This,prerollSeconds) ) #define IDeckLinkDeckControl_v7_9_GetPreroll(This,prerollSeconds) \ - ( (This)->lpVtbl -> GetPreroll(This,prerollSeconds) ) + ( (This)->lpVtbl -> GetPreroll(This,prerollSeconds) ) #define IDeckLinkDeckControl_v7_9_SetExportOffset(This,exportOffsetFields) \ - ( (This)->lpVtbl -> SetExportOffset(This,exportOffsetFields) ) + ( (This)->lpVtbl -> SetExportOffset(This,exportOffsetFields) ) #define IDeckLinkDeckControl_v7_9_GetExportOffset(This,exportOffsetFields) \ - ( (This)->lpVtbl -> GetExportOffset(This,exportOffsetFields) ) + ( (This)->lpVtbl -> GetExportOffset(This,exportOffsetFields) ) #define IDeckLinkDeckControl_v7_9_GetManualExportOffset(This,deckManualExportOffsetFields) \ - ( (This)->lpVtbl -> GetManualExportOffset(This,deckManualExportOffsetFields) ) + ( (This)->lpVtbl -> GetManualExportOffset(This,deckManualExportOffsetFields) ) #define IDeckLinkDeckControl_v7_9_SetCaptureOffset(This,captureOffsetFields) \ - ( (This)->lpVtbl -> SetCaptureOffset(This,captureOffsetFields) ) + ( (This)->lpVtbl -> SetCaptureOffset(This,captureOffsetFields) ) #define IDeckLinkDeckControl_v7_9_GetCaptureOffset(This,captureOffsetFields) \ - ( (This)->lpVtbl -> GetCaptureOffset(This,captureOffsetFields) ) + ( (This)->lpVtbl -> GetCaptureOffset(This,captureOffsetFields) ) #define IDeckLinkDeckControl_v7_9_StartExport(This,inTimecode,outTimecode,exportModeOps,error) \ - ( (This)->lpVtbl -> StartExport(This,inTimecode,outTimecode,exportModeOps,error) ) + ( (This)->lpVtbl -> StartExport(This,inTimecode,outTimecode,exportModeOps,error) ) #define 
IDeckLinkDeckControl_v7_9_StartCapture(This,useVITC,inTimecode,outTimecode,error) \ - ( (This)->lpVtbl -> StartCapture(This,useVITC,inTimecode,outTimecode,error) ) + ( (This)->lpVtbl -> StartCapture(This,useVITC,inTimecode,outTimecode,error) ) #define IDeckLinkDeckControl_v7_9_GetDeviceID(This,deviceId,error) \ - ( (This)->lpVtbl -> GetDeviceID(This,deviceId,error) ) + ( (This)->lpVtbl -> GetDeviceID(This,deviceId,error) ) #define IDeckLinkDeckControl_v7_9_Abort(This) \ - ( (This)->lpVtbl -> Abort(This) ) + ( (This)->lpVtbl -> Abort(This) ) #define IDeckLinkDeckControl_v7_9_CrashRecordStart(This,error) \ - ( (This)->lpVtbl -> CrashRecordStart(This,error) ) + ( (This)->lpVtbl -> CrashRecordStart(This,error) ) #define IDeckLinkDeckControl_v7_9_CrashRecordStop(This,error) \ - ( (This)->lpVtbl -> CrashRecordStop(This,error) ) + ( (This)->lpVtbl -> CrashRecordStop(This,error) ) #define IDeckLinkDeckControl_v7_9_SetCallback(This,callback) \ - ( (This)->lpVtbl -> SetCallback(This,callback) ) + ( (This)->lpVtbl -> SetCallback(This,callback) ) #endif /* COBJMACROS */ @@ -11780,45 +13705,45 @@ #define __IDeckLinkDisplayModeIterator_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkDisplayModeIterator_v7_6 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDisplayModeIterator_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("455D741F-1779-4800-86F5-0B5D13D79751") IDeckLinkDisplayModeIterator_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Next( + virtual HRESULT STDMETHODCALLTYPE Next( /* [out] */ IDeckLinkDisplayMode_v7_6 **deckLinkDisplayMode) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDisplayModeIterator_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDisplayModeIterator_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* 
[annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDisplayModeIterator_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDisplayModeIterator_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *Next )( + + HRESULT ( STDMETHODCALLTYPE *Next )( IDeckLinkDisplayModeIterator_v7_6 * This, /* [out] */ IDeckLinkDisplayMode_v7_6 **deckLinkDisplayMode); - + END_INTERFACE } IDeckLinkDisplayModeIterator_v7_6Vtbl; @@ -11827,23 +13752,23 @@ CONST_VTBL struct IDeckLinkDisplayModeIterator_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDisplayModeIterator_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDisplayModeIterator_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDisplayModeIterator_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDisplayModeIterator_v7_6_Next(This,deckLinkDisplayMode) \ - ( (This)->lpVtbl -> Next(This,deckLinkDisplayMode) ) + ( (This)->lpVtbl -> Next(This,deckLinkDisplayMode) ) #endif /* COBJMACROS */ @@ -11860,74 +13785,74 @@ #define __IDeckLinkDisplayMode_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkDisplayMode_v7_6 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDisplayMode_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("87451E84-2B7E-439E-A629-4393EA4A8550") IDeckLinkDisplayMode_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetName( + virtual HRESULT STDMETHODCALLTYPE GetName( /* [out] */ BSTR *name) = 0; - + virtual BMDDisplayMode STDMETHODCALLTYPE GetDisplayMode( void) = 0; - + virtual long STDMETHODCALLTYPE GetWidth( void) = 0; 
- + virtual long STDMETHODCALLTYPE GetHeight( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFrameRate( + + virtual HRESULT STDMETHODCALLTYPE GetFrameRate( /* [out] */ BMDTimeValue *frameDuration, /* [out] */ BMDTimeScale *timeScale) = 0; - + virtual BMDFieldDominance STDMETHODCALLTYPE GetFieldDominance( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDisplayMode_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDisplayMode_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDisplayMode_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDisplayMode_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetName )( + + HRESULT ( STDMETHODCALLTYPE *GetName )( IDeckLinkDisplayMode_v7_6 * This, /* [out] */ BSTR *name); - - BMDDisplayMode ( STDMETHODCALLTYPE *GetDisplayMode )( + + BMDDisplayMode ( STDMETHODCALLTYPE *GetDisplayMode )( IDeckLinkDisplayMode_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkDisplayMode_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkDisplayMode_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetFrameRate )( + + HRESULT ( STDMETHODCALLTYPE *GetFrameRate )( IDeckLinkDisplayMode_v7_6 * This, /* [out] */ BMDTimeValue *frameDuration, /* [out] */ BMDTimeScale *timeScale); - - BMDFieldDominance ( STDMETHODCALLTYPE *GetFieldDominance )( + + BMDFieldDominance ( STDMETHODCALLTYPE *GetFieldDominance )( IDeckLinkDisplayMode_v7_6 * This); - + END_INTERFACE } IDeckLinkDisplayMode_v7_6Vtbl; @@ -11936,38 +13861,38 @@ CONST_VTBL struct IDeckLinkDisplayMode_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS 
#define IDeckLinkDisplayMode_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDisplayMode_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDisplayMode_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDisplayMode_v7_6_GetName(This,name) \ - ( (This)->lpVtbl -> GetName(This,name) ) + ( (This)->lpVtbl -> GetName(This,name) ) #define IDeckLinkDisplayMode_v7_6_GetDisplayMode(This) \ - ( (This)->lpVtbl -> GetDisplayMode(This) ) + ( (This)->lpVtbl -> GetDisplayMode(This) ) #define IDeckLinkDisplayMode_v7_6_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkDisplayMode_v7_6_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkDisplayMode_v7_6_GetFrameRate(This,frameDuration,timeScale) \ - ( (This)->lpVtbl -> GetFrameRate(This,frameDuration,timeScale) ) + ( (This)->lpVtbl -> GetFrameRate(This,frameDuration,timeScale) ) #define IDeckLinkDisplayMode_v7_6_GetFieldDominance(This) \ - ( (This)->lpVtbl -> GetFieldDominance(This) ) + ( (This)->lpVtbl -> GetFieldDominance(This) ) #endif /* COBJMACROS */ @@ -11984,168 +13909,168 @@ #define __IDeckLinkOutput_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkOutput_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkOutput_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("29228142-EB8C-4141-A621-F74026450955") IDeckLinkOutput_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result) 
= 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ BMDDisplayModeSupport_v10_11 *result) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback_v7_6 *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( BMDDisplayMode displayMode, BMDVideoOutputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( int width, int height, int rowBytes, BMDPixelFormat pixelFormat, BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame_v7_6 **outFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( + + virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame, BMDTimeValue displayTime, BMDTimeValue displayDuration, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( /* [in] */ IDeckLinkVideoOutputCallback_v7_6 *theCallback) = 0; - - 
virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( /* [out] */ unsigned int *bufferedFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount, BMDAudioOutputStreamType streamType) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( + + virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( /* [in] */ void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten) = 0; - + virtual HRESULT STDMETHODCALLTYPE BeginAudioPreroll( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE EndAudioPreroll( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( + + virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( /* [in] */ void *buffer, unsigned int sampleFrameCount, BMDTimeValue streamTime, BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( /* [out] */ unsigned int *bufferedSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushBufferedAudioSamples( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( + + virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( /* [in] */ IDeckLinkAudioOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( BMDTimeValue playbackStartTime, BMDTimeScale timeScale, double playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, BMDTimeScale 
timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( + + virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( /* [out] */ BOOL *active) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( + + virtual HRESULT STDMETHODCALLTYPE GetScheduledStreamTime( BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *streamTime, /* [out] */ double *playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkOutput_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkOutput_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkOutput_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkOutput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkOutput_v7_6 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + /* [out] */ BMDDisplayModeSupport_v10_11 *result); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkOutput_v7_6 * This, /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkOutput_v7_6 * This, /* [in] */ IDeckLinkScreenPreviewCallback_v7_6 
*previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( IDeckLinkOutput_v7_6 * This, BMDDisplayMode displayMode, BMDVideoOutputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( IDeckLinkOutput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( IDeckLinkOutput_v7_6 * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( IDeckLinkOutput_v7_6 * This, int width, int height, @@ -12153,101 +14078,101 @@ BMDPixelFormat pixelFormat, BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame_v7_6 **outFrame); - - HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( IDeckLinkOutput_v7_6 * This, BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer); - - HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( + + HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( IDeckLinkOutput_v7_6 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame); - - HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( IDeckLinkOutput_v7_6 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame, BMDTimeValue displayTime, BMDTimeValue displayDuration, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( IDeckLinkOutput_v7_6 * This, /* [in] */ IDeckLinkVideoOutputCallback_v7_6 *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( IDeckLinkOutput_v7_6 * This, /* [out] */ unsigned int *bufferedFrameCount); - - HRESULT ( 
STDMETHODCALLTYPE *EnableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( IDeckLinkOutput_v7_6 * This, BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount, BMDAudioOutputStreamType streamType); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( IDeckLinkOutput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( + + HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( IDeckLinkOutput_v7_6 * This, /* [in] */ void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( IDeckLinkOutput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( IDeckLinkOutput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( IDeckLinkOutput_v7_6 * This, /* [in] */ void *buffer, unsigned int sampleFrameCount, BMDTimeValue streamTime, BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( IDeckLinkOutput_v7_6 * This, /* [out] */ unsigned int *bufferedSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( IDeckLinkOutput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( IDeckLinkOutput_v7_6 * This, /* [in] */ IDeckLinkAudioOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( IDeckLinkOutput_v7_6 * This, BMDTimeValue playbackStartTime, BMDTimeScale timeScale, double playbackSpeed); - 
- HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( IDeckLinkOutput_v7_6 * This, BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( + + HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( IDeckLinkOutput_v7_6 * This, /* [out] */ BOOL *active); - - HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( + + HRESULT ( STDMETHODCALLTYPE *GetScheduledStreamTime )( IDeckLinkOutput_v7_6 * This, BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *streamTime, /* [out] */ double *playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkOutput_v7_6 * This, BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - + END_INTERFACE } IDeckLinkOutput_v7_6Vtbl; @@ -12256,98 +14181,98 @@ CONST_VTBL struct IDeckLinkOutput_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkOutput_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkOutput_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkOutput_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkOutput_v7_6_DoesSupportVideoMode(This,displayMode,pixelFormat,result) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) #define IDeckLinkOutput_v7_6_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> 
GetDisplayModeIterator(This,iterator) ) #define IDeckLinkOutput_v7_6_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkOutput_v7_6_EnableVideoOutput(This,displayMode,flags) \ - ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) + ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) #define IDeckLinkOutput_v7_6_DisableVideoOutput(This) \ - ( (This)->lpVtbl -> DisableVideoOutput(This) ) + ( (This)->lpVtbl -> DisableVideoOutput(This) ) #define IDeckLinkOutput_v7_6_SetVideoOutputFrameMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) #define IDeckLinkOutput_v7_6_CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) \ - ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) + ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) #define IDeckLinkOutput_v7_6_CreateAncillaryData(This,pixelFormat,outBuffer) \ - ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) + ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) #define IDeckLinkOutput_v7_6_DisplayVideoFrameSync(This,theFrame) \ - ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) + ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) #define IDeckLinkOutput_v7_6_ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) \ - ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) + ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) #define IDeckLinkOutput_v7_6_SetScheduledFrameCompletionCallback(This,theCallback) \ - ( (This)->lpVtbl -> 
SetScheduledFrameCompletionCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) #define IDeckLinkOutput_v7_6_GetBufferedVideoFrameCount(This,bufferedFrameCount) \ - ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) + ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) #define IDeckLinkOutput_v7_6_EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) \ - ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) + ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) #define IDeckLinkOutput_v7_6_DisableAudioOutput(This) \ - ( (This)->lpVtbl -> DisableAudioOutput(This) ) + ( (This)->lpVtbl -> DisableAudioOutput(This) ) #define IDeckLinkOutput_v7_6_WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) \ - ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) + ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) #define IDeckLinkOutput_v7_6_BeginAudioPreroll(This) \ - ( (This)->lpVtbl -> BeginAudioPreroll(This) ) + ( (This)->lpVtbl -> BeginAudioPreroll(This) ) #define IDeckLinkOutput_v7_6_EndAudioPreroll(This) \ - ( (This)->lpVtbl -> EndAudioPreroll(This) ) + ( (This)->lpVtbl -> EndAudioPreroll(This) ) #define IDeckLinkOutput_v7_6_ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) \ - ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) + ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) #define IDeckLinkOutput_v7_6_GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) \ - ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) + ( (This)->lpVtbl -> 
GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) #define IDeckLinkOutput_v7_6_FlushBufferedAudioSamples(This) \ - ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) + ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) #define IDeckLinkOutput_v7_6_SetAudioCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) #define IDeckLinkOutput_v7_6_StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) \ - ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) + ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) #define IDeckLinkOutput_v7_6_StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) \ - ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) + ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) #define IDeckLinkOutput_v7_6_IsScheduledPlaybackRunning(This,active) \ - ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) + ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) #define IDeckLinkOutput_v7_6_GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) \ - ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) ) + ( (This)->lpVtbl -> GetScheduledStreamTime(This,desiredTimeScale,streamTime,playbackSpeed) ) #define IDeckLinkOutput_v7_6_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #endif /* COBJMACROS */ @@ -12364,149 +14289,149 @@ #define __IDeckLinkInput_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkInput_v7_6 */ -/* 
[helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInput_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("300C135A-9F43-48E2-9906-6D7911D93CF1") IDeckLinkInput_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ BMDDisplayModeSupport_v10_11 *result) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback_v7_6 *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, BMDVideoInputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( /* [out] */ unsigned int *availableFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( /* [out] */ unsigned int *availableSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; - + virtual 
HRESULT STDMETHODCALLTYPE PauseStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkInputCallback_v7_6 *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInput_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInput_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInput_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkInput_v7_6 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + /* [out] */ BMDDisplayModeSupport_v10_11 *result); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkInput_v7_6 * This, /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkInput_v7_6 * This, /* [in] */ IDeckLinkScreenPreviewCallback_v7_6 *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( IDeckLinkInput_v7_6 * This, 
BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, BMDVideoInputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( IDeckLinkInput_v7_6 * This, /* [out] */ unsigned int *availableFrameCount); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( IDeckLinkInput_v7_6 * This, BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( IDeckLinkInput_v7_6 * This, /* [out] */ unsigned int *availableSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *StartStreams )( + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *StopStreams )( + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( IDeckLinkInput_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkInput_v7_6 * This, /* [in] */ IDeckLinkInputCallback_v7_6 *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkInput_v7_6 * This, BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *hardwareTime, /* [out] */ BMDTimeValue *timeInFrame, /* [out] */ BMDTimeValue *ticksPerFrame); - + 
END_INTERFACE } IDeckLinkInput_v7_6Vtbl; @@ -12515,65 +14440,65 @@ CONST_VTBL struct IDeckLinkInput_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInput_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInput_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInput_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInput_v7_6_DoesSupportVideoMode(This,displayMode,pixelFormat,result) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) #define IDeckLinkInput_v7_6_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkInput_v7_6_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkInput_v7_6_EnableVideoInput(This,displayMode,pixelFormat,flags) \ - ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) #define IDeckLinkInput_v7_6_DisableVideoInput(This) \ - ( (This)->lpVtbl -> DisableVideoInput(This) ) + ( (This)->lpVtbl -> DisableVideoInput(This) ) #define IDeckLinkInput_v7_6_GetAvailableVideoFrameCount(This,availableFrameCount) \ - ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) + ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) #define IDeckLinkInput_v7_6_EnableAudioInput(This,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> 
EnableAudioInput(This,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) #define IDeckLinkInput_v7_6_DisableAudioInput(This) \ - ( (This)->lpVtbl -> DisableAudioInput(This) ) + ( (This)->lpVtbl -> DisableAudioInput(This) ) #define IDeckLinkInput_v7_6_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ - ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) #define IDeckLinkInput_v7_6_StartStreams(This) \ - ( (This)->lpVtbl -> StartStreams(This) ) + ( (This)->lpVtbl -> StartStreams(This) ) #define IDeckLinkInput_v7_6_StopStreams(This) \ - ( (This)->lpVtbl -> StopStreams(This) ) + ( (This)->lpVtbl -> StopStreams(This) ) #define IDeckLinkInput_v7_6_PauseStreams(This) \ - ( (This)->lpVtbl -> PauseStreams(This) ) + ( (This)->lpVtbl -> PauseStreams(This) ) #define IDeckLinkInput_v7_6_FlushStreams(This) \ - ( (This)->lpVtbl -> FlushStreams(This) ) + ( (This)->lpVtbl -> FlushStreams(This) ) #define IDeckLinkInput_v7_6_SetCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetCallback(This,theCallback) ) #define IDeckLinkInput_v7_6_GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,hardwareTime,timeInFrame,ticksPerFrame) ) #endif /* COBJMACROS */ @@ -12590,68 +14515,68 @@ #define __IDeckLinkTimecode_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkTimecode_v7_6 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkTimecode_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("EFB9BCA6-A521-44F7-BD69-2332F24D9EE6") 
IDeckLinkTimecode_v7_6 : public IUnknown { public: virtual BMDTimecodeBCD STDMETHODCALLTYPE GetBCD( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetComponents( + + virtual HRESULT STDMETHODCALLTYPE GetComponents( /* [out] */ unsigned char *hours, /* [out] */ unsigned char *minutes, /* [out] */ unsigned char *seconds, /* [out] */ unsigned char *frames) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetString( + + virtual HRESULT STDMETHODCALLTYPE GetString( /* [out] */ BSTR *timecode) = 0; - + virtual BMDTimecodeFlags STDMETHODCALLTYPE GetFlags( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkTimecode_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkTimecode_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkTimecode_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkTimecode_v7_6 * This); - - BMDTimecodeBCD ( STDMETHODCALLTYPE *GetBCD )( + + BMDTimecodeBCD ( STDMETHODCALLTYPE *GetBCD )( IDeckLinkTimecode_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetComponents )( + + HRESULT ( STDMETHODCALLTYPE *GetComponents )( IDeckLinkTimecode_v7_6 * This, /* [out] */ unsigned char *hours, /* [out] */ unsigned char *minutes, /* [out] */ unsigned char *seconds, /* [out] */ unsigned char *frames); - - HRESULT ( STDMETHODCALLTYPE *GetString )( + + HRESULT ( STDMETHODCALLTYPE *GetString )( IDeckLinkTimecode_v7_6 * This, /* [out] */ BSTR *timecode); - - BMDTimecodeFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDTimecodeFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkTimecode_v7_6 * This); - + END_INTERFACE } IDeckLinkTimecode_v7_6Vtbl; @@ -12660,32 +14585,32 @@ CONST_VTBL struct IDeckLinkTimecode_v7_6Vtbl *lpVtbl; }; - + #ifdef 
COBJMACROS #define IDeckLinkTimecode_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkTimecode_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkTimecode_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkTimecode_v7_6_GetBCD(This) \ - ( (This)->lpVtbl -> GetBCD(This) ) + ( (This)->lpVtbl -> GetBCD(This) ) #define IDeckLinkTimecode_v7_6_GetComponents(This,hours,minutes,seconds,frames) \ - ( (This)->lpVtbl -> GetComponents(This,hours,minutes,seconds,frames) ) + ( (This)->lpVtbl -> GetComponents(This,hours,minutes,seconds,frames) ) #define IDeckLinkTimecode_v7_6_GetString(This,timecode) \ - ( (This)->lpVtbl -> GetString(This,timecode) ) + ( (This)->lpVtbl -> GetString(This,timecode) ) #define IDeckLinkTimecode_v7_6_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #endif /* COBJMACROS */ @@ -12702,86 +14627,86 @@ #define __IDeckLinkVideoFrame_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkVideoFrame_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoFrame_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("A8D8238E-6B18-4196-99E1-5AF717B83D32") IDeckLinkVideoFrame_v7_6 : public IUnknown { public: virtual long STDMETHODCALLTYPE GetWidth( void) = 0; - + virtual long STDMETHODCALLTYPE GetHeight( void) = 0; - + virtual long STDMETHODCALLTYPE GetRowBytes( void) = 0; - + virtual BMDPixelFormat STDMETHODCALLTYPE GetPixelFormat( void) = 0; - + virtual BMDFrameFlags STDMETHODCALLTYPE GetFlags( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( /* [out] */ void **buffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetTimecode( + + 
virtual HRESULT STDMETHODCALLTYPE GetTimecode( BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode_v7_6 **timecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE GetAncillaryData( /* [out] */ IDeckLinkVideoFrameAncillary **ancillary) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoFrame_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoFrame_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoFrame_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkVideoFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkVideoFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkVideoFrame_v7_6 * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkVideoFrame_v7_6 * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkVideoFrame_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkVideoFrame_v7_6 * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkVideoFrame_v7_6 * This, BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode_v7_6 **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE 
*GetAncillaryData )( IDeckLinkVideoFrame_v7_6 * This, /* [out] */ IDeckLinkVideoFrameAncillary **ancillary); - + END_INTERFACE } IDeckLinkVideoFrame_v7_6Vtbl; @@ -12790,44 +14715,44 @@ CONST_VTBL struct IDeckLinkVideoFrame_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoFrame_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoFrame_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoFrame_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoFrame_v7_6_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkVideoFrame_v7_6_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkVideoFrame_v7_6_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkVideoFrame_v7_6_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkVideoFrame_v7_6_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkVideoFrame_v7_6_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkVideoFrame_v7_6_GetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) #define IDeckLinkVideoFrame_v7_6_GetAncillaryData(This,ancillary) \ - ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) + ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) #endif /* COBJMACROS */ @@ -12844,94 +14769,94 @@ #define 
__IDeckLinkMutableVideoFrame_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkMutableVideoFrame_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkMutableVideoFrame_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("46FCEE00-B4E6-43D0-91C0-023A7FCEB34F") IDeckLinkMutableVideoFrame_v7_6 : public IDeckLinkVideoFrame_v7_6 { public: - virtual HRESULT STDMETHODCALLTYPE SetFlags( + virtual HRESULT STDMETHODCALLTYPE SetFlags( BMDFrameFlags newFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetTimecode( + + virtual HRESULT STDMETHODCALLTYPE SetTimecode( BMDTimecodeFormat format, /* [in] */ IDeckLinkTimecode_v7_6 *timecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetTimecodeFromComponents( + + virtual HRESULT STDMETHODCALLTYPE SetTimecodeFromComponents( BMDTimecodeFormat format, unsigned char hours, unsigned char minutes, unsigned char seconds, unsigned char frames, BMDTimecodeFlags flags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE SetAncillaryData( /* [in] */ IDeckLinkVideoFrameAncillary *ancillary) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkMutableVideoFrame_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkMutableVideoFrame_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkMutableVideoFrame_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkMutableVideoFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkMutableVideoFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE 
*GetHeight )( IDeckLinkMutableVideoFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkMutableVideoFrame_v7_6 * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkMutableVideoFrame_v7_6 * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkMutableVideoFrame_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkMutableVideoFrame_v7_6 * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkMutableVideoFrame_v7_6 * This, BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode_v7_6 **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( IDeckLinkMutableVideoFrame_v7_6 * This, /* [out] */ IDeckLinkVideoFrameAncillary **ancillary); - - HRESULT ( STDMETHODCALLTYPE *SetFlags )( + + HRESULT ( STDMETHODCALLTYPE *SetFlags )( IDeckLinkMutableVideoFrame_v7_6 * This, BMDFrameFlags newFlags); - - HRESULT ( STDMETHODCALLTYPE *SetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *SetTimecode )( IDeckLinkMutableVideoFrame_v7_6 * This, BMDTimecodeFormat format, /* [in] */ IDeckLinkTimecode_v7_6 *timecode); - - HRESULT ( STDMETHODCALLTYPE *SetTimecodeFromComponents )( + + HRESULT ( STDMETHODCALLTYPE *SetTimecodeFromComponents )( IDeckLinkMutableVideoFrame_v7_6 * This, BMDTimecodeFormat format, unsigned char hours, @@ -12939,11 +14864,11 @@ unsigned char seconds, unsigned char frames, BMDTimecodeFlags flags); - - HRESULT ( STDMETHODCALLTYPE *SetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *SetAncillaryData )( IDeckLinkMutableVideoFrame_v7_6 * This, /* [in] */ IDeckLinkVideoFrameAncillary *ancillary); - + END_INTERFACE } IDeckLinkMutableVideoFrame_v7_6Vtbl; @@ -12952,57 
+14877,57 @@ CONST_VTBL struct IDeckLinkMutableVideoFrame_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkMutableVideoFrame_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkMutableVideoFrame_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkMutableVideoFrame_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkMutableVideoFrame_v7_6_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkMutableVideoFrame_v7_6_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkMutableVideoFrame_v7_6_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkMutableVideoFrame_v7_6_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkMutableVideoFrame_v7_6_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkMutableVideoFrame_v7_6_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkMutableVideoFrame_v7_6_GetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) #define IDeckLinkMutableVideoFrame_v7_6_GetAncillaryData(This,ancillary) \ - ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) + ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) #define IDeckLinkMutableVideoFrame_v7_6_SetFlags(This,newFlags) \ - ( (This)->lpVtbl -> SetFlags(This,newFlags) ) + ( (This)->lpVtbl -> SetFlags(This,newFlags) ) #define 
IDeckLinkMutableVideoFrame_v7_6_SetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> SetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> SetTimecode(This,format,timecode) ) #define IDeckLinkMutableVideoFrame_v7_6_SetTimecodeFromComponents(This,format,hours,minutes,seconds,frames,flags) \ - ( (This)->lpVtbl -> SetTimecodeFromComponents(This,format,hours,minutes,seconds,frames,flags) ) + ( (This)->lpVtbl -> SetTimecodeFromComponents(This,format,hours,minutes,seconds,frames,flags) ) #define IDeckLinkMutableVideoFrame_v7_6_SetAncillaryData(This,ancillary) \ - ( (This)->lpVtbl -> SetAncillaryData(This,ancillary) ) + ( (This)->lpVtbl -> SetAncillaryData(This,ancillary) ) #endif /* COBJMACROS */ @@ -13019,88 +14944,88 @@ #define __IDeckLinkVideoInputFrame_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkVideoInputFrame_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoInputFrame_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("9A74FA41-AE9F-47AC-8CF4-01F42DD59965") IDeckLinkVideoInputFrame_v7_6 : public IDeckLinkVideoFrame_v7_6 { public: - virtual HRESULT STDMETHODCALLTYPE GetStreamTime( + virtual HRESULT STDMETHODCALLTYPE GetStreamTime( /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceTimestamp( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceTimestamp( BMDTimeScale timeScale, /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoInputFrame_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoInputFrame_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG 
( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoInputFrame_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoInputFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkVideoInputFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkVideoInputFrame_v7_6 * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkVideoInputFrame_v7_6 * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkVideoInputFrame_v7_6 * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkVideoInputFrame_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkVideoInputFrame_v7_6 * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkVideoInputFrame_v7_6 * This, BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode_v7_6 **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( IDeckLinkVideoInputFrame_v7_6 * This, /* [out] */ IDeckLinkVideoFrameAncillary **ancillary); - - HRESULT ( STDMETHODCALLTYPE *GetStreamTime )( + + HRESULT ( STDMETHODCALLTYPE *GetStreamTime )( IDeckLinkVideoInputFrame_v7_6 * This, /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceTimestamp )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceTimestamp )( IDeckLinkVideoInputFrame_v7_6 * This, BMDTimeScale timeScale, /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration); - + 
END_INTERFACE } IDeckLinkVideoInputFrame_v7_6Vtbl; @@ -13109,51 +15034,51 @@ CONST_VTBL struct IDeckLinkVideoInputFrame_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoInputFrame_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoInputFrame_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoInputFrame_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoInputFrame_v7_6_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkVideoInputFrame_v7_6_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkVideoInputFrame_v7_6_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkVideoInputFrame_v7_6_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkVideoInputFrame_v7_6_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkVideoInputFrame_v7_6_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkVideoInputFrame_v7_6_GetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) #define IDeckLinkVideoInputFrame_v7_6_GetAncillaryData(This,ancillary) \ - ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) + ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) #define IDeckLinkVideoInputFrame_v7_6_GetStreamTime(This,frameTime,frameDuration,timeScale) \ - ( (This)->lpVtbl -> 
GetStreamTime(This,frameTime,frameDuration,timeScale) ) + ( (This)->lpVtbl -> GetStreamTime(This,frameTime,frameDuration,timeScale) ) #define IDeckLinkVideoInputFrame_v7_6_GetHardwareReferenceTimestamp(This,timeScale,frameTime,frameDuration) \ - ( (This)->lpVtbl -> GetHardwareReferenceTimestamp(This,timeScale,frameTime,frameDuration) ) + ( (This)->lpVtbl -> GetHardwareReferenceTimestamp(This,timeScale,frameTime,frameDuration) ) #endif /* COBJMACROS */ @@ -13170,45 +15095,45 @@ #define __IDeckLinkScreenPreviewCallback_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkScreenPreviewCallback_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkScreenPreviewCallback_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("373F499D-4B4D-4518-AD22-6354E5A5825E") IDeckLinkScreenPreviewCallback_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DrawFrame( + virtual HRESULT STDMETHODCALLTYPE DrawFrame( /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkScreenPreviewCallback_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkScreenPreviewCallback_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkScreenPreviewCallback_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkScreenPreviewCallback_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *DrawFrame )( + + HRESULT ( STDMETHODCALLTYPE *DrawFrame )( IDeckLinkScreenPreviewCallback_v7_6 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame); - + END_INTERFACE } IDeckLinkScreenPreviewCallback_v7_6Vtbl; @@ -13217,23 +15142,23 @@ CONST_VTBL struct 
IDeckLinkScreenPreviewCallback_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkScreenPreviewCallback_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkScreenPreviewCallback_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkScreenPreviewCallback_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkScreenPreviewCallback_v7_6_DrawFrame(This,theFrame) \ - ( (This)->lpVtbl -> DrawFrame(This,theFrame) ) + ( (This)->lpVtbl -> DrawFrame(This,theFrame) ) #endif /* COBJMACROS */ @@ -13250,55 +15175,55 @@ #define __IDeckLinkGLScreenPreviewHelper_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkGLScreenPreviewHelper_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkGLScreenPreviewHelper_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("BA575CD9-A15E-497B-B2C2-F9AFE7BE4EBA") IDeckLinkGLScreenPreviewHelper_v7_6 : public IUnknown { public: virtual HRESULT STDMETHODCALLTYPE InitializeGL( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE PaintGL( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetFrame( + + virtual HRESULT STDMETHODCALLTYPE SetFrame( /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkGLScreenPreviewHelper_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkGLScreenPreviewHelper_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkGLScreenPreviewHelper_v7_6 * This); - - ULONG ( 
STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkGLScreenPreviewHelper_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *InitializeGL )( + + HRESULT ( STDMETHODCALLTYPE *InitializeGL )( IDeckLinkGLScreenPreviewHelper_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *PaintGL )( + + HRESULT ( STDMETHODCALLTYPE *PaintGL )( IDeckLinkGLScreenPreviewHelper_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *SetFrame )( + + HRESULT ( STDMETHODCALLTYPE *SetFrame )( IDeckLinkGLScreenPreviewHelper_v7_6 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame); - + END_INTERFACE } IDeckLinkGLScreenPreviewHelper_v7_6Vtbl; @@ -13307,29 +15232,29 @@ CONST_VTBL struct IDeckLinkGLScreenPreviewHelper_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkGLScreenPreviewHelper_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkGLScreenPreviewHelper_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkGLScreenPreviewHelper_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkGLScreenPreviewHelper_v7_6_InitializeGL(This) \ - ( (This)->lpVtbl -> InitializeGL(This) ) + ( (This)->lpVtbl -> InitializeGL(This) ) #define IDeckLinkGLScreenPreviewHelper_v7_6_PaintGL(This) \ - ( (This)->lpVtbl -> PaintGL(This) ) + ( (This)->lpVtbl -> PaintGL(This) ) #define IDeckLinkGLScreenPreviewHelper_v7_6_SetFrame(This,theFrame) \ - ( (This)->lpVtbl -> SetFrame(This,theFrame) ) + ( (This)->lpVtbl -> SetFrame(This,theFrame) ) #endif /* COBJMACROS */ @@ -13346,47 +15271,47 @@ #define __IDeckLinkVideoConversion_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkVideoConversion_v7_6 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoConversion_v7_6; #if defined(__cplusplus) && 
!defined(CINTERFACE) - + MIDL_INTERFACE("3EB504C9-F97D-40FE-A158-D407D48CB53B") IDeckLinkVideoConversion_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE ConvertFrame( + virtual HRESULT STDMETHODCALLTYPE ConvertFrame( /* [in] */ IDeckLinkVideoFrame_v7_6 *srcFrame, /* [in] */ IDeckLinkVideoFrame_v7_6 *dstFrame) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoConversion_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoConversion_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoConversion_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoConversion_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *ConvertFrame )( + + HRESULT ( STDMETHODCALLTYPE *ConvertFrame )( IDeckLinkVideoConversion_v7_6 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *srcFrame, /* [in] */ IDeckLinkVideoFrame_v7_6 *dstFrame); - + END_INTERFACE } IDeckLinkVideoConversion_v7_6Vtbl; @@ -13395,23 +15320,23 @@ CONST_VTBL struct IDeckLinkVideoConversion_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoConversion_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoConversion_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoConversion_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoConversion_v7_6_ConvertFrame(This,srcFrame,dstFrame) \ - ( (This)->lpVtbl -> ConvertFrame(This,srcFrame,dstFrame) ) + ( (This)->lpVtbl -> 
ConvertFrame(This,srcFrame,dstFrame) ) #endif /* COBJMACROS */ @@ -13428,246 +15353,246 @@ #define __IDeckLinkConfiguration_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkConfiguration_v7_6 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkConfiguration_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("B8EAD569-B764-47F0-A73F-AE40DF6CBF10") IDeckLinkConfiguration_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetConfigurationValidator( + virtual HRESULT STDMETHODCALLTYPE GetConfigurationValidator( /* [out] */ IDeckLinkConfiguration_v7_6 **configObject) = 0; - + virtual HRESULT STDMETHODCALLTYPE WriteConfigurationToPreferences( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFormat( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFormat( /* [in] */ BMDVideoConnection_v7_6 videoOutputConnection) = 0; - - virtual HRESULT STDMETHODCALLTYPE IsVideoOutputActive( + + virtual HRESULT STDMETHODCALLTYPE IsVideoOutputActive( /* [in] */ BMDVideoConnection_v7_6 videoOutputConnection, /* [out] */ BOOL *active) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAnalogVideoOutputFlags( + + virtual HRESULT STDMETHODCALLTYPE SetAnalogVideoOutputFlags( /* [in] */ BMDAnalogVideoFlags analogVideoFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAnalogVideoOutputFlags( + + virtual HRESULT STDMETHODCALLTYPE GetAnalogVideoOutputFlags( /* [out] */ BMDAnalogVideoFlags *analogVideoFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableFieldFlickerRemovalWhenPaused( + + virtual HRESULT STDMETHODCALLTYPE EnableFieldFlickerRemovalWhenPaused( /* [in] */ BOOL enable) = 0; - - virtual HRESULT STDMETHODCALLTYPE IsEnabledFieldFlickerRemovalWhenPaused( + + virtual HRESULT STDMETHODCALLTYPE IsEnabledFieldFlickerRemovalWhenPaused( /* [out] */ BOOL *enabled) = 0; - - virtual HRESULT STDMETHODCALLTYPE Set444And3GBpsVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE 
Set444And3GBpsVideoOutput( /* [in] */ BOOL enable444VideoOutput, /* [in] */ BOOL enable3GbsOutput) = 0; - - virtual HRESULT STDMETHODCALLTYPE Get444And3GBpsVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE Get444And3GBpsVideoOutput( /* [out] */ BOOL *is444VideoOutputEnabled, /* [out] */ BOOL *threeGbsOutputEnabled) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputConversionMode( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputConversionMode( /* [in] */ BMDVideoOutputConversionMode conversionMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetVideoOutputConversionMode( + + virtual HRESULT STDMETHODCALLTYPE GetVideoOutputConversionMode( /* [out] */ BMDVideoOutputConversionMode *conversionMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE Set_HD1080p24_to_HD1080i5994_Conversion( + + virtual HRESULT STDMETHODCALLTYPE Set_HD1080p24_to_HD1080i5994_Conversion( /* [in] */ BOOL enable) = 0; - - virtual HRESULT STDMETHODCALLTYPE Get_HD1080p24_to_HD1080i5994_Conversion( + + virtual HRESULT STDMETHODCALLTYPE Get_HD1080p24_to_HD1080i5994_Conversion( /* [out] */ BOOL *enabled) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoInputFormat( + + virtual HRESULT STDMETHODCALLTYPE SetVideoInputFormat( /* [in] */ BMDVideoConnection_v7_6 videoInputFormat) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetVideoInputFormat( + + virtual HRESULT STDMETHODCALLTYPE GetVideoInputFormat( /* [out] */ BMDVideoConnection_v7_6 *videoInputFormat) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAnalogVideoInputFlags( + + virtual HRESULT STDMETHODCALLTYPE SetAnalogVideoInputFlags( /* [in] */ BMDAnalogVideoFlags analogVideoFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAnalogVideoInputFlags( + + virtual HRESULT STDMETHODCALLTYPE GetAnalogVideoInputFlags( /* [out] */ BMDAnalogVideoFlags *analogVideoFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoInputConversionMode( + + virtual HRESULT STDMETHODCALLTYPE SetVideoInputConversionMode( /* [in] */ BMDVideoInputConversionMode 
conversionMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetVideoInputConversionMode( + + virtual HRESULT STDMETHODCALLTYPE GetVideoInputConversionMode( /* [out] */ BMDVideoInputConversionMode *conversionMode) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetBlackVideoOutputDuringCapture( + + virtual HRESULT STDMETHODCALLTYPE SetBlackVideoOutputDuringCapture( /* [in] */ BOOL blackOutInCapture) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBlackVideoOutputDuringCapture( + + virtual HRESULT STDMETHODCALLTYPE GetBlackVideoOutputDuringCapture( /* [out] */ BOOL *blackOutInCapture) = 0; - - virtual HRESULT STDMETHODCALLTYPE Set32PulldownSequenceInitialTimecodeFrame( + + virtual HRESULT STDMETHODCALLTYPE Set32PulldownSequenceInitialTimecodeFrame( /* [in] */ unsigned int aFrameTimecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE Get32PulldownSequenceInitialTimecodeFrame( + + virtual HRESULT STDMETHODCALLTYPE Get32PulldownSequenceInitialTimecodeFrame( /* [out] */ unsigned int *aFrameTimecode) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVancSourceLineMapping( + + virtual HRESULT STDMETHODCALLTYPE SetVancSourceLineMapping( /* [in] */ unsigned int activeLine1VANCsource, /* [in] */ unsigned int activeLine2VANCsource, /* [in] */ unsigned int activeLine3VANCsource) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetVancSourceLineMapping( + + virtual HRESULT STDMETHODCALLTYPE GetVancSourceLineMapping( /* [out] */ unsigned int *activeLine1VANCsource, /* [out] */ unsigned int *activeLine2VANCsource, /* [out] */ unsigned int *activeLine3VANCsource) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAudioInputFormat( + + virtual HRESULT STDMETHODCALLTYPE SetAudioInputFormat( /* [in] */ BMDAudioConnection_v10_2 audioInputFormat) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAudioInputFormat( + + virtual HRESULT STDMETHODCALLTYPE GetAudioInputFormat( /* [out] */ BMDAudioConnection_v10_2 *audioInputFormat) = 0; - + }; - - + + #else /* C style interface */ typedef struct 
IDeckLinkConfiguration_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkConfiguration_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkConfiguration_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *GetConfigurationValidator )( + + HRESULT ( STDMETHODCALLTYPE *GetConfigurationValidator )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ IDeckLinkConfiguration_v7_6 **configObject); - - HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( + + HRESULT ( STDMETHODCALLTYPE *WriteConfigurationToPreferences )( IDeckLinkConfiguration_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFormat )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFormat )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDVideoConnection_v7_6 videoOutputConnection); - - HRESULT ( STDMETHODCALLTYPE *IsVideoOutputActive )( + + HRESULT ( STDMETHODCALLTYPE *IsVideoOutputActive )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDVideoConnection_v7_6 videoOutputConnection, /* [out] */ BOOL *active); - - HRESULT ( STDMETHODCALLTYPE *SetAnalogVideoOutputFlags )( + + HRESULT ( STDMETHODCALLTYPE *SetAnalogVideoOutputFlags )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDAnalogVideoFlags analogVideoFlags); - - HRESULT ( STDMETHODCALLTYPE *GetAnalogVideoOutputFlags )( + + HRESULT ( STDMETHODCALLTYPE *GetAnalogVideoOutputFlags )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BMDAnalogVideoFlags *analogVideoFlags); - - HRESULT ( STDMETHODCALLTYPE *EnableFieldFlickerRemovalWhenPaused )( + + HRESULT ( STDMETHODCALLTYPE *EnableFieldFlickerRemovalWhenPaused )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ 
BOOL enable); - - HRESULT ( STDMETHODCALLTYPE *IsEnabledFieldFlickerRemovalWhenPaused )( + + HRESULT ( STDMETHODCALLTYPE *IsEnabledFieldFlickerRemovalWhenPaused )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BOOL *enabled); - - HRESULT ( STDMETHODCALLTYPE *Set444And3GBpsVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *Set444And3GBpsVideoOutput )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BOOL enable444VideoOutput, /* [in] */ BOOL enable3GbsOutput); - - HRESULT ( STDMETHODCALLTYPE *Get444And3GBpsVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *Get444And3GBpsVideoOutput )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BOOL *is444VideoOutputEnabled, /* [out] */ BOOL *threeGbsOutputEnabled); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputConversionMode )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputConversionMode )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDVideoOutputConversionMode conversionMode); - - HRESULT ( STDMETHODCALLTYPE *GetVideoOutputConversionMode )( + + HRESULT ( STDMETHODCALLTYPE *GetVideoOutputConversionMode )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BMDVideoOutputConversionMode *conversionMode); - - HRESULT ( STDMETHODCALLTYPE *Set_HD1080p24_to_HD1080i5994_Conversion )( + + HRESULT ( STDMETHODCALLTYPE *Set_HD1080p24_to_HD1080i5994_Conversion )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BOOL enable); - - HRESULT ( STDMETHODCALLTYPE *Get_HD1080p24_to_HD1080i5994_Conversion )( + + HRESULT ( STDMETHODCALLTYPE *Get_HD1080p24_to_HD1080i5994_Conversion )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BOOL *enabled); - - HRESULT ( STDMETHODCALLTYPE *SetVideoInputFormat )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoInputFormat )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDVideoConnection_v7_6 videoInputFormat); - - HRESULT ( STDMETHODCALLTYPE *GetVideoInputFormat )( + + HRESULT ( STDMETHODCALLTYPE *GetVideoInputFormat )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BMDVideoConnection_v7_6 *videoInputFormat); - - 
HRESULT ( STDMETHODCALLTYPE *SetAnalogVideoInputFlags )( + + HRESULT ( STDMETHODCALLTYPE *SetAnalogVideoInputFlags )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDAnalogVideoFlags analogVideoFlags); - - HRESULT ( STDMETHODCALLTYPE *GetAnalogVideoInputFlags )( + + HRESULT ( STDMETHODCALLTYPE *GetAnalogVideoInputFlags )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BMDAnalogVideoFlags *analogVideoFlags); - - HRESULT ( STDMETHODCALLTYPE *SetVideoInputConversionMode )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoInputConversionMode )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDVideoInputConversionMode conversionMode); - - HRESULT ( STDMETHODCALLTYPE *GetVideoInputConversionMode )( + + HRESULT ( STDMETHODCALLTYPE *GetVideoInputConversionMode )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BMDVideoInputConversionMode *conversionMode); - - HRESULT ( STDMETHODCALLTYPE *SetBlackVideoOutputDuringCapture )( + + HRESULT ( STDMETHODCALLTYPE *SetBlackVideoOutputDuringCapture )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BOOL blackOutInCapture); - - HRESULT ( STDMETHODCALLTYPE *GetBlackVideoOutputDuringCapture )( + + HRESULT ( STDMETHODCALLTYPE *GetBlackVideoOutputDuringCapture )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BOOL *blackOutInCapture); - - HRESULT ( STDMETHODCALLTYPE *Set32PulldownSequenceInitialTimecodeFrame )( + + HRESULT ( STDMETHODCALLTYPE *Set32PulldownSequenceInitialTimecodeFrame )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ unsigned int aFrameTimecode); - - HRESULT ( STDMETHODCALLTYPE *Get32PulldownSequenceInitialTimecodeFrame )( + + HRESULT ( STDMETHODCALLTYPE *Get32PulldownSequenceInitialTimecodeFrame )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ unsigned int *aFrameTimecode); - - HRESULT ( STDMETHODCALLTYPE *SetVancSourceLineMapping )( + + HRESULT ( STDMETHODCALLTYPE *SetVancSourceLineMapping )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ unsigned int activeLine1VANCsource, /* [in] */ unsigned int activeLine2VANCsource, 
/* [in] */ unsigned int activeLine3VANCsource); - - HRESULT ( STDMETHODCALLTYPE *GetVancSourceLineMapping )( + + HRESULT ( STDMETHODCALLTYPE *GetVancSourceLineMapping )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ unsigned int *activeLine1VANCsource, /* [out] */ unsigned int *activeLine2VANCsource, /* [out] */ unsigned int *activeLine3VANCsource); - - HRESULT ( STDMETHODCALLTYPE *SetAudioInputFormat )( + + HRESULT ( STDMETHODCALLTYPE *SetAudioInputFormat )( IDeckLinkConfiguration_v7_6 * This, /* [in] */ BMDAudioConnection_v10_2 audioInputFormat); - - HRESULT ( STDMETHODCALLTYPE *GetAudioInputFormat )( + + HRESULT ( STDMETHODCALLTYPE *GetAudioInputFormat )( IDeckLinkConfiguration_v7_6 * This, /* [out] */ BMDAudioConnection_v10_2 *audioInputFormat); - + END_INTERFACE } IDeckLinkConfiguration_v7_6Vtbl; @@ -13676,104 +15601,104 @@ CONST_VTBL struct IDeckLinkConfiguration_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkConfiguration_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkConfiguration_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkConfiguration_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkConfiguration_v7_6_GetConfigurationValidator(This,configObject) \ - ( (This)->lpVtbl -> GetConfigurationValidator(This,configObject) ) + ( (This)->lpVtbl -> GetConfigurationValidator(This,configObject) ) #define IDeckLinkConfiguration_v7_6_WriteConfigurationToPreferences(This) \ - ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) + ( (This)->lpVtbl -> WriteConfigurationToPreferences(This) ) #define IDeckLinkConfiguration_v7_6_SetVideoOutputFormat(This,videoOutputConnection) \ - ( (This)->lpVtbl -> SetVideoOutputFormat(This,videoOutputConnection) ) + ( (This)->lpVtbl -> 
SetVideoOutputFormat(This,videoOutputConnection) ) #define IDeckLinkConfiguration_v7_6_IsVideoOutputActive(This,videoOutputConnection,active) \ - ( (This)->lpVtbl -> IsVideoOutputActive(This,videoOutputConnection,active) ) + ( (This)->lpVtbl -> IsVideoOutputActive(This,videoOutputConnection,active) ) #define IDeckLinkConfiguration_v7_6_SetAnalogVideoOutputFlags(This,analogVideoFlags) \ - ( (This)->lpVtbl -> SetAnalogVideoOutputFlags(This,analogVideoFlags) ) + ( (This)->lpVtbl -> SetAnalogVideoOutputFlags(This,analogVideoFlags) ) #define IDeckLinkConfiguration_v7_6_GetAnalogVideoOutputFlags(This,analogVideoFlags) \ - ( (This)->lpVtbl -> GetAnalogVideoOutputFlags(This,analogVideoFlags) ) + ( (This)->lpVtbl -> GetAnalogVideoOutputFlags(This,analogVideoFlags) ) #define IDeckLinkConfiguration_v7_6_EnableFieldFlickerRemovalWhenPaused(This,enable) \ - ( (This)->lpVtbl -> EnableFieldFlickerRemovalWhenPaused(This,enable) ) + ( (This)->lpVtbl -> EnableFieldFlickerRemovalWhenPaused(This,enable) ) #define IDeckLinkConfiguration_v7_6_IsEnabledFieldFlickerRemovalWhenPaused(This,enabled) \ - ( (This)->lpVtbl -> IsEnabledFieldFlickerRemovalWhenPaused(This,enabled) ) + ( (This)->lpVtbl -> IsEnabledFieldFlickerRemovalWhenPaused(This,enabled) ) #define IDeckLinkConfiguration_v7_6_Set444And3GBpsVideoOutput(This,enable444VideoOutput,enable3GbsOutput) \ - ( (This)->lpVtbl -> Set444And3GBpsVideoOutput(This,enable444VideoOutput,enable3GbsOutput) ) + ( (This)->lpVtbl -> Set444And3GBpsVideoOutput(This,enable444VideoOutput,enable3GbsOutput) ) #define IDeckLinkConfiguration_v7_6_Get444And3GBpsVideoOutput(This,is444VideoOutputEnabled,threeGbsOutputEnabled) \ - ( (This)->lpVtbl -> Get444And3GBpsVideoOutput(This,is444VideoOutputEnabled,threeGbsOutputEnabled) ) + ( (This)->lpVtbl -> Get444And3GBpsVideoOutput(This,is444VideoOutputEnabled,threeGbsOutputEnabled) ) #define IDeckLinkConfiguration_v7_6_SetVideoOutputConversionMode(This,conversionMode) \ - ( (This)->lpVtbl -> 
SetVideoOutputConversionMode(This,conversionMode) ) + ( (This)->lpVtbl -> SetVideoOutputConversionMode(This,conversionMode) ) #define IDeckLinkConfiguration_v7_6_GetVideoOutputConversionMode(This,conversionMode) \ - ( (This)->lpVtbl -> GetVideoOutputConversionMode(This,conversionMode) ) + ( (This)->lpVtbl -> GetVideoOutputConversionMode(This,conversionMode) ) #define IDeckLinkConfiguration_v7_6_Set_HD1080p24_to_HD1080i5994_Conversion(This,enable) \ - ( (This)->lpVtbl -> Set_HD1080p24_to_HD1080i5994_Conversion(This,enable) ) + ( (This)->lpVtbl -> Set_HD1080p24_to_HD1080i5994_Conversion(This,enable) ) #define IDeckLinkConfiguration_v7_6_Get_HD1080p24_to_HD1080i5994_Conversion(This,enabled) \ - ( (This)->lpVtbl -> Get_HD1080p24_to_HD1080i5994_Conversion(This,enabled) ) + ( (This)->lpVtbl -> Get_HD1080p24_to_HD1080i5994_Conversion(This,enabled) ) #define IDeckLinkConfiguration_v7_6_SetVideoInputFormat(This,videoInputFormat) \ - ( (This)->lpVtbl -> SetVideoInputFormat(This,videoInputFormat) ) + ( (This)->lpVtbl -> SetVideoInputFormat(This,videoInputFormat) ) #define IDeckLinkConfiguration_v7_6_GetVideoInputFormat(This,videoInputFormat) \ - ( (This)->lpVtbl -> GetVideoInputFormat(This,videoInputFormat) ) + ( (This)->lpVtbl -> GetVideoInputFormat(This,videoInputFormat) ) #define IDeckLinkConfiguration_v7_6_SetAnalogVideoInputFlags(This,analogVideoFlags) \ - ( (This)->lpVtbl -> SetAnalogVideoInputFlags(This,analogVideoFlags) ) + ( (This)->lpVtbl -> SetAnalogVideoInputFlags(This,analogVideoFlags) ) #define IDeckLinkConfiguration_v7_6_GetAnalogVideoInputFlags(This,analogVideoFlags) \ - ( (This)->lpVtbl -> GetAnalogVideoInputFlags(This,analogVideoFlags) ) + ( (This)->lpVtbl -> GetAnalogVideoInputFlags(This,analogVideoFlags) ) #define IDeckLinkConfiguration_v7_6_SetVideoInputConversionMode(This,conversionMode) \ - ( (This)->lpVtbl -> SetVideoInputConversionMode(This,conversionMode) ) + ( (This)->lpVtbl -> SetVideoInputConversionMode(This,conversionMode) ) #define 
IDeckLinkConfiguration_v7_6_GetVideoInputConversionMode(This,conversionMode) \ - ( (This)->lpVtbl -> GetVideoInputConversionMode(This,conversionMode) ) + ( (This)->lpVtbl -> GetVideoInputConversionMode(This,conversionMode) ) #define IDeckLinkConfiguration_v7_6_SetBlackVideoOutputDuringCapture(This,blackOutInCapture) \ - ( (This)->lpVtbl -> SetBlackVideoOutputDuringCapture(This,blackOutInCapture) ) + ( (This)->lpVtbl -> SetBlackVideoOutputDuringCapture(This,blackOutInCapture) ) #define IDeckLinkConfiguration_v7_6_GetBlackVideoOutputDuringCapture(This,blackOutInCapture) \ - ( (This)->lpVtbl -> GetBlackVideoOutputDuringCapture(This,blackOutInCapture) ) + ( (This)->lpVtbl -> GetBlackVideoOutputDuringCapture(This,blackOutInCapture) ) #define IDeckLinkConfiguration_v7_6_Set32PulldownSequenceInitialTimecodeFrame(This,aFrameTimecode) \ - ( (This)->lpVtbl -> Set32PulldownSequenceInitialTimecodeFrame(This,aFrameTimecode) ) + ( (This)->lpVtbl -> Set32PulldownSequenceInitialTimecodeFrame(This,aFrameTimecode) ) #define IDeckLinkConfiguration_v7_6_Get32PulldownSequenceInitialTimecodeFrame(This,aFrameTimecode) \ - ( (This)->lpVtbl -> Get32PulldownSequenceInitialTimecodeFrame(This,aFrameTimecode) ) + ( (This)->lpVtbl -> Get32PulldownSequenceInitialTimecodeFrame(This,aFrameTimecode) ) #define IDeckLinkConfiguration_v7_6_SetVancSourceLineMapping(This,activeLine1VANCsource,activeLine2VANCsource,activeLine3VANCsource) \ - ( (This)->lpVtbl -> SetVancSourceLineMapping(This,activeLine1VANCsource,activeLine2VANCsource,activeLine3VANCsource) ) + ( (This)->lpVtbl -> SetVancSourceLineMapping(This,activeLine1VANCsource,activeLine2VANCsource,activeLine3VANCsource) ) #define IDeckLinkConfiguration_v7_6_GetVancSourceLineMapping(This,activeLine1VANCsource,activeLine2VANCsource,activeLine3VANCsource) \ - ( (This)->lpVtbl -> GetVancSourceLineMapping(This,activeLine1VANCsource,activeLine2VANCsource,activeLine3VANCsource) ) + ( (This)->lpVtbl -> 
GetVancSourceLineMapping(This,activeLine1VANCsource,activeLine2VANCsource,activeLine3VANCsource) ) #define IDeckLinkConfiguration_v7_6_SetAudioInputFormat(This,audioInputFormat) \ - ( (This)->lpVtbl -> SetAudioInputFormat(This,audioInputFormat) ) + ( (This)->lpVtbl -> SetAudioInputFormat(This,audioInputFormat) ) #define IDeckLinkConfiguration_v7_6_GetAudioInputFormat(This,audioInputFormat) \ - ( (This)->lpVtbl -> GetAudioInputFormat(This,audioInputFormat) ) + ( (This)->lpVtbl -> GetAudioInputFormat(This,audioInputFormat) ) #endif /* COBJMACROS */ @@ -13790,52 +15715,52 @@ #define __IDeckLinkVideoOutputCallback_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkVideoOutputCallback_v7_6 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoOutputCallback_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("E763A626-4A3C-49D1-BF13-E7AD3692AE52") IDeckLinkVideoOutputCallback_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE ScheduledFrameCompleted( + virtual HRESULT STDMETHODCALLTYPE ScheduledFrameCompleted( /* [in] */ IDeckLinkVideoFrame_v7_6 *completedFrame, /* [in] */ BMDOutputFrameCompletionResult result) = 0; - + virtual HRESULT STDMETHODCALLTYPE ScheduledPlaybackHasStopped( void) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoOutputCallback_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoOutputCallback_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoOutputCallback_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoOutputCallback_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduledFrameCompleted )( + + 
HRESULT ( STDMETHODCALLTYPE *ScheduledFrameCompleted )( IDeckLinkVideoOutputCallback_v7_6 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *completedFrame, /* [in] */ BMDOutputFrameCompletionResult result); - - HRESULT ( STDMETHODCALLTYPE *ScheduledPlaybackHasStopped )( + + HRESULT ( STDMETHODCALLTYPE *ScheduledPlaybackHasStopped )( IDeckLinkVideoOutputCallback_v7_6 * This); - + END_INTERFACE } IDeckLinkVideoOutputCallback_v7_6Vtbl; @@ -13844,26 +15769,26 @@ CONST_VTBL struct IDeckLinkVideoOutputCallback_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoOutputCallback_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoOutputCallback_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoOutputCallback_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoOutputCallback_v7_6_ScheduledFrameCompleted(This,completedFrame,result) \ - ( (This)->lpVtbl -> ScheduledFrameCompleted(This,completedFrame,result) ) + ( (This)->lpVtbl -> ScheduledFrameCompleted(This,completedFrame,result) ) #define IDeckLinkVideoOutputCallback_v7_6_ScheduledPlaybackHasStopped(This) \ - ( (This)->lpVtbl -> ScheduledPlaybackHasStopped(This) ) + ( (This)->lpVtbl -> ScheduledPlaybackHasStopped(This) ) #endif /* COBJMACROS */ @@ -13880,58 +15805,58 @@ #define __IDeckLinkInputCallback_v7_6_INTERFACE_DEFINED__ /* interface IDeckLinkInputCallback_v7_6 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInputCallback_v7_6; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("31D28EE7-88B6-4CB1-897A-CDBF79A26414") IDeckLinkInputCallback_v7_6 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE VideoInputFormatChanged( + virtual HRESULT STDMETHODCALLTYPE 
VideoInputFormatChanged( /* [in] */ BMDVideoInputFormatChangedEvents notificationEvents, /* [in] */ IDeckLinkDisplayMode_v7_6 *newDisplayMode, /* [in] */ BMDDetectedVideoInputFormatFlags detectedSignalFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived( + + virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived( /* [in] */ IDeckLinkVideoInputFrame_v7_6 *videoFrame, /* [in] */ IDeckLinkAudioInputPacket *audioPacket) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInputCallback_v7_6Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInputCallback_v7_6 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInputCallback_v7_6 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInputCallback_v7_6 * This); - - HRESULT ( STDMETHODCALLTYPE *VideoInputFormatChanged )( + + HRESULT ( STDMETHODCALLTYPE *VideoInputFormatChanged )( IDeckLinkInputCallback_v7_6 * This, /* [in] */ BMDVideoInputFormatChangedEvents notificationEvents, /* [in] */ IDeckLinkDisplayMode_v7_6 *newDisplayMode, /* [in] */ BMDDetectedVideoInputFormatFlags detectedSignalFlags); - - HRESULT ( STDMETHODCALLTYPE *VideoInputFrameArrived )( + + HRESULT ( STDMETHODCALLTYPE *VideoInputFrameArrived )( IDeckLinkInputCallback_v7_6 * This, /* [in] */ IDeckLinkVideoInputFrame_v7_6 *videoFrame, /* [in] */ IDeckLinkAudioInputPacket *audioPacket); - + END_INTERFACE } IDeckLinkInputCallback_v7_6Vtbl; @@ -13940,26 +15865,26 @@ CONST_VTBL struct IDeckLinkInputCallback_v7_6Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInputCallback_v7_6_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> 
QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInputCallback_v7_6_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInputCallback_v7_6_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInputCallback_v7_6_VideoInputFormatChanged(This,notificationEvents,newDisplayMode,detectedSignalFlags) \ - ( (This)->lpVtbl -> VideoInputFormatChanged(This,notificationEvents,newDisplayMode,detectedSignalFlags) ) + ( (This)->lpVtbl -> VideoInputFormatChanged(This,notificationEvents,newDisplayMode,detectedSignalFlags) ) #define IDeckLinkInputCallback_v7_6_VideoInputFrameArrived(This,videoFrame,audioPacket) \ - ( (This)->lpVtbl -> VideoInputFrameArrived(This,videoFrame,audioPacket) ) + ( (This)->lpVtbl -> VideoInputFrameArrived(This,videoFrame,audioPacket) ) #endif /* COBJMACROS */ @@ -13992,58 +15917,58 @@ #define __IDeckLinkInputCallback_v7_3_INTERFACE_DEFINED__ /* interface IDeckLinkInputCallback_v7_3 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInputCallback_v7_3; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("FD6F311D-4D00-444B-9ED4-1F25B5730AD0") IDeckLinkInputCallback_v7_3 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE VideoInputFormatChanged( + virtual HRESULT STDMETHODCALLTYPE VideoInputFormatChanged( /* [in] */ BMDVideoInputFormatChangedEvents notificationEvents, /* [in] */ IDeckLinkDisplayMode_v7_6 *newDisplayMode, /* [in] */ BMDDetectedVideoInputFormatFlags detectedSignalFlags) = 0; - - virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived( + + virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived( /* [in] */ IDeckLinkVideoInputFrame_v7_3 *videoFrame, /* [in] */ IDeckLinkAudioInputPacket *audioPacket) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInputCallback_v7_3Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE 
*QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInputCallback_v7_3 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInputCallback_v7_3 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInputCallback_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *VideoInputFormatChanged )( + + HRESULT ( STDMETHODCALLTYPE *VideoInputFormatChanged )( IDeckLinkInputCallback_v7_3 * This, /* [in] */ BMDVideoInputFormatChangedEvents notificationEvents, /* [in] */ IDeckLinkDisplayMode_v7_6 *newDisplayMode, /* [in] */ BMDDetectedVideoInputFormatFlags detectedSignalFlags); - - HRESULT ( STDMETHODCALLTYPE *VideoInputFrameArrived )( + + HRESULT ( STDMETHODCALLTYPE *VideoInputFrameArrived )( IDeckLinkInputCallback_v7_3 * This, /* [in] */ IDeckLinkVideoInputFrame_v7_3 *videoFrame, /* [in] */ IDeckLinkAudioInputPacket *audioPacket); - + END_INTERFACE } IDeckLinkInputCallback_v7_3Vtbl; @@ -14052,26 +15977,26 @@ CONST_VTBL struct IDeckLinkInputCallback_v7_3Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInputCallback_v7_3_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInputCallback_v7_3_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInputCallback_v7_3_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInputCallback_v7_3_VideoInputFormatChanged(This,notificationEvents,newDisplayMode,detectedSignalFlags) \ - ( (This)->lpVtbl -> VideoInputFormatChanged(This,notificationEvents,newDisplayMode,detectedSignalFlags) ) + ( (This)->lpVtbl -> 
VideoInputFormatChanged(This,notificationEvents,newDisplayMode,detectedSignalFlags) ) #define IDeckLinkInputCallback_v7_3_VideoInputFrameArrived(This,videoFrame,audioPacket) \ - ( (This)->lpVtbl -> VideoInputFrameArrived(This,videoFrame,audioPacket) ) + ( (This)->lpVtbl -> VideoInputFrameArrived(This,videoFrame,audioPacket) ) #endif /* COBJMACROS */ @@ -14088,161 +16013,161 @@ #define __IDeckLinkOutput_v7_3_INTERFACE_DEFINED__ /* interface IDeckLinkOutput_v7_3 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkOutput_v7_3; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("271C65E3-C323-4344-A30F-D908BCB20AA3") IDeckLinkOutput_v7_3 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ BMDDisplayModeSupport_v10_11 *result) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( BMDDisplayMode displayMode, BMDVideoOutputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( int width, int height, int 
rowBytes, BMDPixelFormat pixelFormat, BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame_v7_6 **outFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( + + virtual HRESULT STDMETHODCALLTYPE CreateAncillaryData( BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( + + virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame, BMDTimeValue displayTime, BMDTimeValue displayDuration, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( /* [in] */ IDeckLinkVideoOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedVideoFrameCount( /* [out] */ unsigned int *bufferedFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount, BMDAudioOutputStreamType streamType) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( + + virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( /* [in] */ void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten) = 0; - + virtual HRESULT STDMETHODCALLTYPE BeginAudioPreroll( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE EndAudioPreroll( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( + + virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( /* [in] */ void *buffer, unsigned int sampleFrameCount, 
BMDTimeValue streamTime, BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( /* [out] */ unsigned int *bufferedSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushBufferedAudioSamples( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( + + virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( /* [in] */ IDeckLinkAudioOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( BMDTimeValue playbackStartTime, BMDTimeScale timeScale, double playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( + + virtual HRESULT STDMETHODCALLTYPE IsScheduledPlaybackRunning( /* [out] */ BOOL *active) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *elapsedTimeSinceSchedulerBegan) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkOutput_v7_3Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkOutput_v7_3 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkOutput_v7_3 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkOutput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE 
*DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkOutput_v7_3 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + /* [out] */ BMDDisplayModeSupport_v10_11 *result); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkOutput_v7_3 * This, /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkOutput_v7_3 * This, /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( IDeckLinkOutput_v7_3 * This, BMDDisplayMode displayMode, BMDVideoOutputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( IDeckLinkOutput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( IDeckLinkOutput_v7_3 * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( IDeckLinkOutput_v7_3 * This, int width, int height, @@ -14250,93 +16175,93 @@ BMDPixelFormat pixelFormat, BMDFrameFlags flags, /* [out] */ IDeckLinkMutableVideoFrame_v7_6 **outFrame); - - HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *CreateAncillaryData )( IDeckLinkOutput_v7_3 * This, BMDPixelFormat pixelFormat, /* [out] */ IDeckLinkVideoFrameAncillary **outBuffer); - - HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( + + HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( IDeckLinkOutput_v7_3 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame); - - HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame 
)( + + HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( IDeckLinkOutput_v7_3 * This, /* [in] */ IDeckLinkVideoFrame_v7_6 *theFrame, BMDTimeValue displayTime, BMDTimeValue displayDuration, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( IDeckLinkOutput_v7_3 * This, /* [in] */ IDeckLinkVideoOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedVideoFrameCount )( IDeckLinkOutput_v7_3 * This, /* [out] */ unsigned int *bufferedFrameCount); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( IDeckLinkOutput_v7_3 * This, BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount, BMDAudioOutputStreamType streamType); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( IDeckLinkOutput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( + + HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( IDeckLinkOutput_v7_3 * This, /* [in] */ void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( IDeckLinkOutput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( IDeckLinkOutput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( IDeckLinkOutput_v7_3 * This, /* [in] */ void *buffer, unsigned int sampleFrameCount, BMDTimeValue streamTime, BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE 
*GetBufferedAudioSampleFrameCount )( IDeckLinkOutput_v7_3 * This, /* [out] */ unsigned int *bufferedSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( IDeckLinkOutput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( IDeckLinkOutput_v7_3 * This, /* [in] */ IDeckLinkAudioOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( IDeckLinkOutput_v7_3 * This, BMDTimeValue playbackStartTime, BMDTimeScale timeScale, double playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( IDeckLinkOutput_v7_3 * This, BMDTimeValue stopPlaybackAtTime, /* [out] */ BMDTimeValue *actualStopTime, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( + + HRESULT ( STDMETHODCALLTYPE *IsScheduledPlaybackRunning )( IDeckLinkOutput_v7_3 * This, /* [out] */ BOOL *active); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkOutput_v7_3 * This, BMDTimeScale desiredTimeScale, /* [out] */ BMDTimeValue *elapsedTimeSinceSchedulerBegan); - + END_INTERFACE } IDeckLinkOutput_v7_3Vtbl; @@ -14345,95 +16270,95 @@ CONST_VTBL struct IDeckLinkOutput_v7_3Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkOutput_v7_3_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkOutput_v7_3_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkOutput_v7_3_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define 
IDeckLinkOutput_v7_3_DoesSupportVideoMode(This,displayMode,pixelFormat,result) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) #define IDeckLinkOutput_v7_3_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkOutput_v7_3_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkOutput_v7_3_EnableVideoOutput(This,displayMode,flags) \ - ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) + ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode,flags) ) #define IDeckLinkOutput_v7_3_DisableVideoOutput(This) \ - ( (This)->lpVtbl -> DisableVideoOutput(This) ) + ( (This)->lpVtbl -> DisableVideoOutput(This) ) #define IDeckLinkOutput_v7_3_SetVideoOutputFrameMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) #define IDeckLinkOutput_v7_3_CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) \ - ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) + ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) #define IDeckLinkOutput_v7_3_CreateAncillaryData(This,pixelFormat,outBuffer) \ - ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) + ( (This)->lpVtbl -> CreateAncillaryData(This,pixelFormat,outBuffer) ) #define IDeckLinkOutput_v7_3_DisplayVideoFrameSync(This,theFrame) \ - ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) + ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) #define 
IDeckLinkOutput_v7_3_ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) \ - ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) + ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) #define IDeckLinkOutput_v7_3_SetScheduledFrameCompletionCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) #define IDeckLinkOutput_v7_3_GetBufferedVideoFrameCount(This,bufferedFrameCount) \ - ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) + ( (This)->lpVtbl -> GetBufferedVideoFrameCount(This,bufferedFrameCount) ) #define IDeckLinkOutput_v7_3_EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) \ - ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) + ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount,streamType) ) #define IDeckLinkOutput_v7_3_DisableAudioOutput(This) \ - ( (This)->lpVtbl -> DisableAudioOutput(This) ) + ( (This)->lpVtbl -> DisableAudioOutput(This) ) #define IDeckLinkOutput_v7_3_WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) \ - ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) + ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) #define IDeckLinkOutput_v7_3_BeginAudioPreroll(This) \ - ( (This)->lpVtbl -> BeginAudioPreroll(This) ) + ( (This)->lpVtbl -> BeginAudioPreroll(This) ) #define IDeckLinkOutput_v7_3_EndAudioPreroll(This) \ - ( (This)->lpVtbl -> EndAudioPreroll(This) ) + ( (This)->lpVtbl -> EndAudioPreroll(This) ) #define IDeckLinkOutput_v7_3_ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) \ - ( (This)->lpVtbl -> 
ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) + ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) #define IDeckLinkOutput_v7_3_GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) \ - ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) + ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleFrameCount) ) #define IDeckLinkOutput_v7_3_FlushBufferedAudioSamples(This) \ - ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) + ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) #define IDeckLinkOutput_v7_3_SetAudioCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) #define IDeckLinkOutput_v7_3_StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) \ - ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) + ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) #define IDeckLinkOutput_v7_3_StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) \ - ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) + ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) #define IDeckLinkOutput_v7_3_IsScheduledPlaybackRunning(This,active) \ - ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) + ( (This)->lpVtbl -> IsScheduledPlaybackRunning(This,active) ) #define IDeckLinkOutput_v7_3_GetHardwareReferenceClock(This,desiredTimeScale,elapsedTimeSinceSchedulerBegan) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,elapsedTimeSinceSchedulerBegan) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,elapsedTimeSinceSchedulerBegan) ) #endif /* COBJMACROS */ @@ -14450,136 +16375,136 @@ 
#define __IDeckLinkInput_v7_3_INTERFACE_DEFINED__ /* interface IDeckLinkInput_v7_3 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInput_v7_3; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("4973F012-9925-458C-871C-18774CDBBECB") IDeckLinkInput_v7_3 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ BMDDisplayModeSupport_v10_11 *result) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScreenPreviewCallback( /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, BMDVideoInputFlags flags) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableVideoFrameCount( /* [out] */ unsigned int *availableFrameCount) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetAvailableAudioSampleFrameCount( /* [out] */ unsigned int *availableSampleFrameCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE StartStreams( 
void) = 0; - + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE PauseStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushStreams( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkInputCallback_v7_3 *theCallback) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInput_v7_3Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInput_v7_3 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInput_v7_3 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkInput_v7_3 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + /* [out] */ BMDDisplayModeSupport_v10_11 *result); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkInput_v7_3 * This, /* [out] */ IDeckLinkDisplayModeIterator_v7_6 **iterator); - - HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScreenPreviewCallback )( IDeckLinkInput_v7_3 * This, /* [in] */ IDeckLinkScreenPreviewCallback *previewCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( IDeckLinkInput_v7_3 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, BMDVideoInputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( IDeckLinkInput_v7_3 * 
This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableVideoFrameCount )( IDeckLinkInput_v7_3 * This, /* [out] */ unsigned int *availableFrameCount); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( IDeckLinkInput_v7_3 * This, BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( IDeckLinkInput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetAvailableAudioSampleFrameCount )( IDeckLinkInput_v7_3 * This, /* [out] */ unsigned int *availableSampleFrameCount); - - HRESULT ( STDMETHODCALLTYPE *StartStreams )( + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( IDeckLinkInput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *StopStreams )( + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( IDeckLinkInput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( IDeckLinkInput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *FlushStreams )( + + HRESULT ( STDMETHODCALLTYPE *FlushStreams )( IDeckLinkInput_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkInput_v7_3 * This, /* [in] */ IDeckLinkInputCallback_v7_3 *theCallback); - + END_INTERFACE } IDeckLinkInput_v7_3Vtbl; @@ -14588,62 +16513,62 @@ CONST_VTBL struct IDeckLinkInput_v7_3Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInput_v7_3_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInput_v7_3_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInput_v7_3_Release(This) \ - ( (This)->lpVtbl -> 
Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInput_v7_3_DoesSupportVideoMode(This,displayMode,pixelFormat,result) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) #define IDeckLinkInput_v7_3_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkInput_v7_3_SetScreenPreviewCallback(This,previewCallback) \ - ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) + ( (This)->lpVtbl -> SetScreenPreviewCallback(This,previewCallback) ) #define IDeckLinkInput_v7_3_EnableVideoInput(This,displayMode,pixelFormat,flags) \ - ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) #define IDeckLinkInput_v7_3_DisableVideoInput(This) \ - ( (This)->lpVtbl -> DisableVideoInput(This) ) + ( (This)->lpVtbl -> DisableVideoInput(This) ) #define IDeckLinkInput_v7_3_GetAvailableVideoFrameCount(This,availableFrameCount) \ - ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) + ( (This)->lpVtbl -> GetAvailableVideoFrameCount(This,availableFrameCount) ) #define IDeckLinkInput_v7_3_EnableAudioInput(This,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) #define IDeckLinkInput_v7_3_DisableAudioInput(This) \ - ( (This)->lpVtbl -> DisableAudioInput(This) ) + ( (This)->lpVtbl -> DisableAudioInput(This) ) #define IDeckLinkInput_v7_3_GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) \ - ( (This)->lpVtbl -> GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) + ( (This)->lpVtbl -> 
GetAvailableAudioSampleFrameCount(This,availableSampleFrameCount) ) #define IDeckLinkInput_v7_3_StartStreams(This) \ - ( (This)->lpVtbl -> StartStreams(This) ) + ( (This)->lpVtbl -> StartStreams(This) ) #define IDeckLinkInput_v7_3_StopStreams(This) \ - ( (This)->lpVtbl -> StopStreams(This) ) + ( (This)->lpVtbl -> StopStreams(This) ) #define IDeckLinkInput_v7_3_PauseStreams(This) \ - ( (This)->lpVtbl -> PauseStreams(This) ) + ( (This)->lpVtbl -> PauseStreams(This) ) #define IDeckLinkInput_v7_3_FlushStreams(This) \ - ( (This)->lpVtbl -> FlushStreams(This) ) + ( (This)->lpVtbl -> FlushStreams(This) ) #define IDeckLinkInput_v7_3_SetCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetCallback(This,theCallback) ) #endif /* COBJMACROS */ @@ -14660,77 +16585,77 @@ #define __IDeckLinkVideoInputFrame_v7_3_INTERFACE_DEFINED__ /* interface IDeckLinkVideoInputFrame_v7_3 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoInputFrame_v7_3; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("CF317790-2894-11DE-8C30-0800200C9A66") IDeckLinkVideoInputFrame_v7_3 : public IDeckLinkVideoFrame_v7_6 { public: - virtual HRESULT STDMETHODCALLTYPE GetStreamTime( + virtual HRESULT STDMETHODCALLTYPE GetStreamTime( /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration, BMDTimeScale timeScale) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoInputFrame_v7_3Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoInputFrame_v7_3 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoInputFrame_v7_3 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + 
ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoInputFrame_v7_3 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkVideoInputFrame_v7_3 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkVideoInputFrame_v7_3 * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkVideoInputFrame_v7_3 * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkVideoInputFrame_v7_3 * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkVideoInputFrame_v7_3 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkVideoInputFrame_v7_3 * This, /* [out] */ void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetTimecode )( + + HRESULT ( STDMETHODCALLTYPE *GetTimecode )( IDeckLinkVideoInputFrame_v7_3 * This, BMDTimecodeFormat format, /* [out] */ IDeckLinkTimecode_v7_6 **timecode); - - HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( + + HRESULT ( STDMETHODCALLTYPE *GetAncillaryData )( IDeckLinkVideoInputFrame_v7_3 * This, /* [out] */ IDeckLinkVideoFrameAncillary **ancillary); - - HRESULT ( STDMETHODCALLTYPE *GetStreamTime )( + + HRESULT ( STDMETHODCALLTYPE *GetStreamTime )( IDeckLinkVideoInputFrame_v7_3 * This, /* [out] */ BMDTimeValue *frameTime, /* [out] */ BMDTimeValue *frameDuration, BMDTimeScale timeScale); - + END_INTERFACE } IDeckLinkVideoInputFrame_v7_3Vtbl; @@ -14739,48 +16664,48 @@ CONST_VTBL struct IDeckLinkVideoInputFrame_v7_3Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoInputFrame_v7_3_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoInputFrame_v7_3_AddRef(This) \ - ( (This)->lpVtbl -> 
AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoInputFrame_v7_3_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoInputFrame_v7_3_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkVideoInputFrame_v7_3_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkVideoInputFrame_v7_3_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkVideoInputFrame_v7_3_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkVideoInputFrame_v7_3_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkVideoInputFrame_v7_3_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkVideoInputFrame_v7_3_GetTimecode(This,format,timecode) \ - ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) + ( (This)->lpVtbl -> GetTimecode(This,format,timecode) ) #define IDeckLinkVideoInputFrame_v7_3_GetAncillaryData(This,ancillary) \ - ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) + ( (This)->lpVtbl -> GetAncillaryData(This,ancillary) ) #define IDeckLinkVideoInputFrame_v7_3_GetStreamTime(This,frameTime,frameDuration,timeScale) \ - ( (This)->lpVtbl -> GetStreamTime(This,frameTime,frameDuration,timeScale) ) + ( (This)->lpVtbl -> GetStreamTime(This,frameTime,frameDuration,timeScale) ) #endif /* COBJMACROS */ @@ -14797,45 +16722,45 @@ #define __IDeckLinkDisplayModeIterator_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkDisplayModeIterator_v7_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDisplayModeIterator_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - 
+ MIDL_INTERFACE("B28131B6-59AC-4857-B5AC-CD75D5883E2F") IDeckLinkDisplayModeIterator_v7_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE Next( + virtual HRESULT STDMETHODCALLTYPE Next( /* [out] */ IDeckLinkDisplayMode_v7_1 **deckLinkDisplayMode) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDisplayModeIterator_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDisplayModeIterator_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDisplayModeIterator_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDisplayModeIterator_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *Next )( + + HRESULT ( STDMETHODCALLTYPE *Next )( IDeckLinkDisplayModeIterator_v7_1 * This, /* [out] */ IDeckLinkDisplayMode_v7_1 **deckLinkDisplayMode); - + END_INTERFACE } IDeckLinkDisplayModeIterator_v7_1Vtbl; @@ -14844,23 +16769,23 @@ CONST_VTBL struct IDeckLinkDisplayModeIterator_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDisplayModeIterator_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDisplayModeIterator_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDisplayModeIterator_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDisplayModeIterator_v7_1_Next(This,deckLinkDisplayMode) \ - ( (This)->lpVtbl -> Next(This,deckLinkDisplayMode) ) + ( (This)->lpVtbl -> Next(This,deckLinkDisplayMode) ) #endif /* COBJMACROS */ @@ -14877,69 +16802,69 @@ #define 
__IDeckLinkDisplayMode_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkDisplayMode_v7_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkDisplayMode_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("AF0CD6D5-8376-435E-8433-54F9DD530AC3") IDeckLinkDisplayMode_v7_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE GetName( + virtual HRESULT STDMETHODCALLTYPE GetName( /* [out] */ BSTR *name) = 0; - + virtual BMDDisplayMode STDMETHODCALLTYPE GetDisplayMode( void) = 0; - + virtual long STDMETHODCALLTYPE GetWidth( void) = 0; - + virtual long STDMETHODCALLTYPE GetHeight( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetFrameRate( + + virtual HRESULT STDMETHODCALLTYPE GetFrameRate( /* [out] */ BMDTimeValue *frameDuration, /* [out] */ BMDTimeScale *timeScale) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkDisplayMode_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkDisplayMode_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkDisplayMode_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkDisplayMode_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *GetName )( + + HRESULT ( STDMETHODCALLTYPE *GetName )( IDeckLinkDisplayMode_v7_1 * This, /* [out] */ BSTR *name); - - BMDDisplayMode ( STDMETHODCALLTYPE *GetDisplayMode )( + + BMDDisplayMode ( STDMETHODCALLTYPE *GetDisplayMode )( IDeckLinkDisplayMode_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkDisplayMode_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkDisplayMode_v7_1 * 
This); - - HRESULT ( STDMETHODCALLTYPE *GetFrameRate )( + + HRESULT ( STDMETHODCALLTYPE *GetFrameRate )( IDeckLinkDisplayMode_v7_1 * This, /* [out] */ BMDTimeValue *frameDuration, /* [out] */ BMDTimeScale *timeScale); - + END_INTERFACE } IDeckLinkDisplayMode_v7_1Vtbl; @@ -14948,35 +16873,35 @@ CONST_VTBL struct IDeckLinkDisplayMode_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkDisplayMode_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkDisplayMode_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkDisplayMode_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkDisplayMode_v7_1_GetName(This,name) \ - ( (This)->lpVtbl -> GetName(This,name) ) + ( (This)->lpVtbl -> GetName(This,name) ) #define IDeckLinkDisplayMode_v7_1_GetDisplayMode(This) \ - ( (This)->lpVtbl -> GetDisplayMode(This) ) + ( (This)->lpVtbl -> GetDisplayMode(This) ) #define IDeckLinkDisplayMode_v7_1_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkDisplayMode_v7_1_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkDisplayMode_v7_1_GetFrameRate(This,frameDuration,timeScale) \ - ( (This)->lpVtbl -> GetFrameRate(This,frameDuration,timeScale) ) + ( (This)->lpVtbl -> GetFrameRate(This,frameDuration,timeScale) ) #endif /* COBJMACROS */ @@ -14993,70 +16918,70 @@ #define __IDeckLinkVideoFrame_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkVideoFrame_v7_1 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoFrame_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("333F3A10-8C2D-43CF-B79D-46560FEEA1CE") IDeckLinkVideoFrame_v7_1 : public 
IUnknown { public: virtual long STDMETHODCALLTYPE GetWidth( void) = 0; - + virtual long STDMETHODCALLTYPE GetHeight( void) = 0; - + virtual long STDMETHODCALLTYPE GetRowBytes( void) = 0; - + virtual BMDPixelFormat STDMETHODCALLTYPE GetPixelFormat( void) = 0; - + virtual BMDFrameFlags STDMETHODCALLTYPE GetFlags( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( void **buffer) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoFrame_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoFrame_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoFrame_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoFrame_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkVideoFrame_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkVideoFrame_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkVideoFrame_v7_1 * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkVideoFrame_v7_1 * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkVideoFrame_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkVideoFrame_v7_1 * This, void **buffer); - + END_INTERFACE } IDeckLinkVideoFrame_v7_1Vtbl; @@ -15065,38 +16990,38 @@ CONST_VTBL struct IDeckLinkVideoFrame_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define 
IDeckLinkVideoFrame_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoFrame_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoFrame_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoFrame_v7_1_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkVideoFrame_v7_1_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkVideoFrame_v7_1_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkVideoFrame_v7_1_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkVideoFrame_v7_1_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkVideoFrame_v7_1_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #endif /* COBJMACROS */ @@ -15113,68 +17038,68 @@ #define __IDeckLinkVideoInputFrame_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkVideoInputFrame_v7_1 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoInputFrame_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("C8B41D95-8848-40EE-9B37-6E3417FB114B") IDeckLinkVideoInputFrame_v7_1 : public IDeckLinkVideoFrame_v7_1 { public: - virtual HRESULT STDMETHODCALLTYPE GetFrameTime( + virtual HRESULT STDMETHODCALLTYPE GetFrameTime( BMDTimeValue *frameTime, BMDTimeValue *frameDuration, BMDTimeScale timeScale) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoInputFrame_v7_1Vtbl 
{ BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoInputFrame_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoInputFrame_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoInputFrame_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetWidth )( + + long ( STDMETHODCALLTYPE *GetWidth )( IDeckLinkVideoInputFrame_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetHeight )( + + long ( STDMETHODCALLTYPE *GetHeight )( IDeckLinkVideoInputFrame_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetRowBytes )( + + long ( STDMETHODCALLTYPE *GetRowBytes )( IDeckLinkVideoInputFrame_v7_1 * This); - - BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( + + BMDPixelFormat ( STDMETHODCALLTYPE *GetPixelFormat )( IDeckLinkVideoInputFrame_v7_1 * This); - - BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( + + BMDFrameFlags ( STDMETHODCALLTYPE *GetFlags )( IDeckLinkVideoInputFrame_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkVideoInputFrame_v7_1 * This, void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetFrameTime )( + + HRESULT ( STDMETHODCALLTYPE *GetFrameTime )( IDeckLinkVideoInputFrame_v7_1 * This, BMDTimeValue *frameTime, BMDTimeValue *frameDuration, BMDTimeScale timeScale); - + END_INTERFACE } IDeckLinkVideoInputFrame_v7_1Vtbl; @@ -15183,42 +17108,42 @@ CONST_VTBL struct IDeckLinkVideoInputFrame_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoInputFrame_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoInputFrame_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> 
AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoInputFrame_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoInputFrame_v7_1_GetWidth(This) \ - ( (This)->lpVtbl -> GetWidth(This) ) + ( (This)->lpVtbl -> GetWidth(This) ) #define IDeckLinkVideoInputFrame_v7_1_GetHeight(This) \ - ( (This)->lpVtbl -> GetHeight(This) ) + ( (This)->lpVtbl -> GetHeight(This) ) #define IDeckLinkVideoInputFrame_v7_1_GetRowBytes(This) \ - ( (This)->lpVtbl -> GetRowBytes(This) ) + ( (This)->lpVtbl -> GetRowBytes(This) ) #define IDeckLinkVideoInputFrame_v7_1_GetPixelFormat(This) \ - ( (This)->lpVtbl -> GetPixelFormat(This) ) + ( (This)->lpVtbl -> GetPixelFormat(This) ) #define IDeckLinkVideoInputFrame_v7_1_GetFlags(This) \ - ( (This)->lpVtbl -> GetFlags(This) ) + ( (This)->lpVtbl -> GetFlags(This) ) #define IDeckLinkVideoInputFrame_v7_1_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkVideoInputFrame_v7_1_GetFrameTime(This,frameTime,frameDuration,timeScale) \ - ( (This)->lpVtbl -> GetFrameTime(This,frameTime,frameDuration,timeScale) ) + ( (This)->lpVtbl -> GetFrameTime(This,frameTime,frameDuration,timeScale) ) #endif /* COBJMACROS */ @@ -15235,59 +17160,59 @@ #define __IDeckLinkAudioInputPacket_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkAudioInputPacket_v7_1 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkAudioInputPacket_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("C86DE4F6-A29F-42E3-AB3A-1363E29F0788") IDeckLinkAudioInputPacket_v7_1 : public IUnknown { public: virtual long STDMETHODCALLTYPE GetSampleCount( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBytes( + + virtual HRESULT STDMETHODCALLTYPE GetBytes( void **buffer) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetAudioPacketTime( + + virtual HRESULT 
STDMETHODCALLTYPE GetAudioPacketTime( BMDTimeValue *packetTime, BMDTimeScale timeScale) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkAudioInputPacket_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkAudioInputPacket_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkAudioInputPacket_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkAudioInputPacket_v7_1 * This); - - long ( STDMETHODCALLTYPE *GetSampleCount )( + + long ( STDMETHODCALLTYPE *GetSampleCount )( IDeckLinkAudioInputPacket_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *GetBytes )( + + HRESULT ( STDMETHODCALLTYPE *GetBytes )( IDeckLinkAudioInputPacket_v7_1 * This, void **buffer); - - HRESULT ( STDMETHODCALLTYPE *GetAudioPacketTime )( + + HRESULT ( STDMETHODCALLTYPE *GetAudioPacketTime )( IDeckLinkAudioInputPacket_v7_1 * This, BMDTimeValue *packetTime, BMDTimeScale timeScale); - + END_INTERFACE } IDeckLinkAudioInputPacket_v7_1Vtbl; @@ -15296,29 +17221,29 @@ CONST_VTBL struct IDeckLinkAudioInputPacket_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkAudioInputPacket_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkAudioInputPacket_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkAudioInputPacket_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkAudioInputPacket_v7_1_GetSampleCount(This) \ - ( (This)->lpVtbl -> GetSampleCount(This) ) + ( (This)->lpVtbl -> GetSampleCount(This) ) #define 
IDeckLinkAudioInputPacket_v7_1_GetBytes(This,buffer) \ - ( (This)->lpVtbl -> GetBytes(This,buffer) ) + ( (This)->lpVtbl -> GetBytes(This,buffer) ) #define IDeckLinkAudioInputPacket_v7_1_GetAudioPacketTime(This,packetTime,timeScale) \ - ( (This)->lpVtbl -> GetAudioPacketTime(This,packetTime,timeScale) ) + ( (This)->lpVtbl -> GetAudioPacketTime(This,packetTime,timeScale) ) #endif /* COBJMACROS */ @@ -15335,47 +17260,47 @@ #define __IDeckLinkVideoOutputCallback_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkVideoOutputCallback_v7_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkVideoOutputCallback_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("EBD01AFA-E4B0-49C6-A01D-EDB9D1B55FD9") IDeckLinkVideoOutputCallback_v7_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE ScheduledFrameCompleted( + virtual HRESULT STDMETHODCALLTYPE ScheduledFrameCompleted( /* [in] */ IDeckLinkVideoFrame_v7_1 *completedFrame, /* [in] */ BMDOutputFrameCompletionResult result) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkVideoOutputCallback_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkVideoOutputCallback_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkVideoOutputCallback_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkVideoOutputCallback_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduledFrameCompleted )( + + HRESULT ( STDMETHODCALLTYPE *ScheduledFrameCompleted )( IDeckLinkVideoOutputCallback_v7_1 * This, /* [in] */ IDeckLinkVideoFrame_v7_1 *completedFrame, /* [in] */ BMDOutputFrameCompletionResult result); - + END_INTERFACE } 
IDeckLinkVideoOutputCallback_v7_1Vtbl; @@ -15384,23 +17309,23 @@ CONST_VTBL struct IDeckLinkVideoOutputCallback_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkVideoOutputCallback_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkVideoOutputCallback_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkVideoOutputCallback_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkVideoOutputCallback_v7_1_ScheduledFrameCompleted(This,completedFrame,result) \ - ( (This)->lpVtbl -> ScheduledFrameCompleted(This,completedFrame,result) ) + ( (This)->lpVtbl -> ScheduledFrameCompleted(This,completedFrame,result) ) #endif /* COBJMACROS */ @@ -15417,47 +17342,47 @@ #define __IDeckLinkInputCallback_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkInputCallback_v7_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInputCallback_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("7F94F328-5ED4-4E9F-9729-76A86BDC99CC") IDeckLinkInputCallback_v7_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived( + virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived( /* [in] */ IDeckLinkVideoInputFrame_v7_1 *videoFrame, /* [in] */ IDeckLinkAudioInputPacket_v7_1 *audioPacket) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInputCallback_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInputCallback_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( 
IDeckLinkInputCallback_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInputCallback_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *VideoInputFrameArrived )( + + HRESULT ( STDMETHODCALLTYPE *VideoInputFrameArrived )( IDeckLinkInputCallback_v7_1 * This, /* [in] */ IDeckLinkVideoInputFrame_v7_1 *videoFrame, /* [in] */ IDeckLinkAudioInputPacket_v7_1 *audioPacket); - + END_INTERFACE } IDeckLinkInputCallback_v7_1Vtbl; @@ -15466,23 +17391,23 @@ CONST_VTBL struct IDeckLinkInputCallback_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInputCallback_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInputCallback_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInputCallback_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInputCallback_v7_1_VideoInputFrameArrived(This,videoFrame,audioPacket) \ - ( (This)->lpVtbl -> VideoInputFrameArrived(This,videoFrame,audioPacket) ) + ( (This)->lpVtbl -> VideoInputFrameArrived(This,videoFrame,audioPacket) ) #endif /* COBJMACROS */ @@ -15499,42 +17424,42 @@ #define __IDeckLinkOutput_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkOutput_v7_1 */ -/* [helpstring][local][uuid][object] */ +/* [helpstring][local][uuid][object] */ EXTERN_C const IID IID_IDeckLinkOutput_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("AE5B3E9B-4E1E-4535-B6E8-480FF52F6CE5") IDeckLinkOutput_v7_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ 
BMDDisplayModeSupport_v10_11 *result) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator_v7_1 **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoOutput( BMDDisplayMode displayMode) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableVideoOutput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( + + virtual HRESULT STDMETHODCALLTYPE SetVideoOutputFrameMemoryAllocator( /* [in] */ IDeckLinkMemoryAllocator *theAllocator) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrame( int width, int height, int rowBytes, BMDPixelFormat pixelFormat, BMDFrameFlags flags, IDeckLinkVideoFrame_v7_1 **outFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE CreateVideoFrameFromBuffer( + + virtual HRESULT STDMETHODCALLTYPE CreateVideoFrameFromBuffer( void *buffer, int width, int height, @@ -15542,107 +17467,107 @@ BMDPixelFormat pixelFormat, BMDFrameFlags flags, IDeckLinkVideoFrame_v7_1 **outFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( + + virtual HRESULT STDMETHODCALLTYPE DisplayVideoFrameSync( IDeckLinkVideoFrame_v7_1 *theFrame) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( + + virtual HRESULT STDMETHODCALLTYPE ScheduleVideoFrame( IDeckLinkVideoFrame_v7_1 *theFrame, BMDTimeValue displayTime, BMDTimeValue displayDuration, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( + + virtual HRESULT STDMETHODCALLTYPE SetScheduledFrameCompletionCallback( /* [in] */ IDeckLinkVideoOutputCallback_v7_1 *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioOutput( BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioOutput( 
void) = 0; - - virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( + + virtual HRESULT STDMETHODCALLTYPE WriteAudioSamplesSync( void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten) = 0; - + virtual HRESULT STDMETHODCALLTYPE BeginAudioPreroll( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE EndAudioPreroll( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( + + virtual HRESULT STDMETHODCALLTYPE ScheduleAudioSamples( void *buffer, unsigned int sampleFrameCount, BMDTimeValue streamTime, BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( /* [out] */ unsigned int *bufferedSampleCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE FlushBufferedAudioSamples( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( + + virtual HRESULT STDMETHODCALLTYPE SetAudioCallback( /* [in] */ IDeckLinkAudioOutputCallback *theCallback) = 0; - - virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StartScheduledPlayback( BMDTimeValue playbackStartTime, BMDTimeScale timeScale, double playbackSpeed) = 0; - - virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( + + virtual HRESULT STDMETHODCALLTYPE StopScheduledPlayback( BMDTimeValue stopPlaybackAtTime, BMDTimeValue *actualStopTime, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( + + virtual HRESULT STDMETHODCALLTYPE GetHardwareReferenceClock( BMDTimeScale desiredTimeScale, BMDTimeValue *elapsedTimeSinceSchedulerBegan) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkOutput_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkOutput_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + 
/* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkOutput_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkOutput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkOutput_v7_1 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result); - - HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( + /* [out] */ BMDDisplayModeSupport_v10_11 *result); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkOutput_v7_1 * This, /* [out] */ IDeckLinkDisplayModeIterator_v7_1 **iterator); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoOutput )( IDeckLinkOutput_v7_1 * This, BMDDisplayMode displayMode); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoOutput )( IDeckLinkOutput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( + + HRESULT ( STDMETHODCALLTYPE *SetVideoOutputFrameMemoryAllocator )( IDeckLinkOutput_v7_1 * This, /* [in] */ IDeckLinkMemoryAllocator *theAllocator); - - HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrame )( IDeckLinkOutput_v7_1 * This, int width, int height, @@ -15650,8 +17575,8 @@ BMDPixelFormat pixelFormat, BMDFrameFlags flags, IDeckLinkVideoFrame_v7_1 **outFrame); - - HRESULT ( STDMETHODCALLTYPE *CreateVideoFrameFromBuffer )( + + HRESULT ( STDMETHODCALLTYPE *CreateVideoFrameFromBuffer )( IDeckLinkOutput_v7_1 * This, void *buffer, int width, @@ -15660,79 +17585,79 @@ BMDPixelFormat pixelFormat, BMDFrameFlags flags, IDeckLinkVideoFrame_v7_1 **outFrame); - - HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync )( + + HRESULT ( STDMETHODCALLTYPE *DisplayVideoFrameSync 
)( IDeckLinkOutput_v7_1 * This, IDeckLinkVideoFrame_v7_1 *theFrame); - - HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleVideoFrame )( IDeckLinkOutput_v7_1 * This, IDeckLinkVideoFrame_v7_1 *theFrame, BMDTimeValue displayTime, BMDTimeValue displayDuration, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetScheduledFrameCompletionCallback )( IDeckLinkOutput_v7_1 * This, /* [in] */ IDeckLinkVideoOutputCallback_v7_1 *theCallback); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioOutput )( IDeckLinkOutput_v7_1 * This, BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioOutput )( IDeckLinkOutput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( + + HRESULT ( STDMETHODCALLTYPE *WriteAudioSamplesSync )( IDeckLinkOutput_v7_1 * This, void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *BeginAudioPreroll )( IDeckLinkOutput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( + + HRESULT ( STDMETHODCALLTYPE *EndAudioPreroll )( IDeckLinkOutput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *ScheduleAudioSamples )( IDeckLinkOutput_v7_1 * This, void *buffer, unsigned int sampleFrameCount, BMDTimeValue streamTime, BMDTimeScale timeScale, /* [out] */ unsigned int *sampleFramesWritten); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( IDeckLinkOutput_v7_1 * This, /* [out] */ unsigned int *bufferedSampleCount); - - HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( + 
+ HRESULT ( STDMETHODCALLTYPE *FlushBufferedAudioSamples )( IDeckLinkOutput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetAudioCallback )( IDeckLinkOutput_v7_1 * This, /* [in] */ IDeckLinkAudioOutputCallback *theCallback); - - HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StartScheduledPlayback )( IDeckLinkOutput_v7_1 * This, BMDTimeValue playbackStartTime, BMDTimeScale timeScale, double playbackSpeed); - - HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( + + HRESULT ( STDMETHODCALLTYPE *StopScheduledPlayback )( IDeckLinkOutput_v7_1 * This, BMDTimeValue stopPlaybackAtTime, BMDTimeValue *actualStopTime, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( + + HRESULT ( STDMETHODCALLTYPE *GetHardwareReferenceClock )( IDeckLinkOutput_v7_1 * This, BMDTimeScale desiredTimeScale, BMDTimeValue *elapsedTimeSinceSchedulerBegan); - + END_INTERFACE } IDeckLinkOutput_v7_1Vtbl; @@ -15741,86 +17666,86 @@ CONST_VTBL struct IDeckLinkOutput_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkOutput_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkOutput_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkOutput_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkOutput_v7_1_DoesSupportVideoMode(This,displayMode,pixelFormat,result) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) #define IDeckLinkOutput_v7_1_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define 
IDeckLinkOutput_v7_1_EnableVideoOutput(This,displayMode) \ - ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode) ) + ( (This)->lpVtbl -> EnableVideoOutput(This,displayMode) ) #define IDeckLinkOutput_v7_1_DisableVideoOutput(This) \ - ( (This)->lpVtbl -> DisableVideoOutput(This) ) + ( (This)->lpVtbl -> DisableVideoOutput(This) ) #define IDeckLinkOutput_v7_1_SetVideoOutputFrameMemoryAllocator(This,theAllocator) \ - ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) + ( (This)->lpVtbl -> SetVideoOutputFrameMemoryAllocator(This,theAllocator) ) #define IDeckLinkOutput_v7_1_CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) \ - ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) + ( (This)->lpVtbl -> CreateVideoFrame(This,width,height,rowBytes,pixelFormat,flags,outFrame) ) #define IDeckLinkOutput_v7_1_CreateVideoFrameFromBuffer(This,buffer,width,height,rowBytes,pixelFormat,flags,outFrame) \ - ( (This)->lpVtbl -> CreateVideoFrameFromBuffer(This,buffer,width,height,rowBytes,pixelFormat,flags,outFrame) ) + ( (This)->lpVtbl -> CreateVideoFrameFromBuffer(This,buffer,width,height,rowBytes,pixelFormat,flags,outFrame) ) #define IDeckLinkOutput_v7_1_DisplayVideoFrameSync(This,theFrame) \ - ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) + ( (This)->lpVtbl -> DisplayVideoFrameSync(This,theFrame) ) #define IDeckLinkOutput_v7_1_ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) \ - ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) + ( (This)->lpVtbl -> ScheduleVideoFrame(This,theFrame,displayTime,displayDuration,timeScale) ) #define IDeckLinkOutput_v7_1_SetScheduledFrameCompletionCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetScheduledFrameCompletionCallback(This,theCallback) ) #define 
IDeckLinkOutput_v7_1_EnableAudioOutput(This,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioOutput(This,sampleRate,sampleType,channelCount) ) #define IDeckLinkOutput_v7_1_DisableAudioOutput(This) \ - ( (This)->lpVtbl -> DisableAudioOutput(This) ) + ( (This)->lpVtbl -> DisableAudioOutput(This) ) #define IDeckLinkOutput_v7_1_WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) \ - ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) + ( (This)->lpVtbl -> WriteAudioSamplesSync(This,buffer,sampleFrameCount,sampleFramesWritten) ) #define IDeckLinkOutput_v7_1_BeginAudioPreroll(This) \ - ( (This)->lpVtbl -> BeginAudioPreroll(This) ) + ( (This)->lpVtbl -> BeginAudioPreroll(This) ) #define IDeckLinkOutput_v7_1_EndAudioPreroll(This) \ - ( (This)->lpVtbl -> EndAudioPreroll(This) ) + ( (This)->lpVtbl -> EndAudioPreroll(This) ) #define IDeckLinkOutput_v7_1_ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) \ - ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) + ( (This)->lpVtbl -> ScheduleAudioSamples(This,buffer,sampleFrameCount,streamTime,timeScale,sampleFramesWritten) ) #define IDeckLinkOutput_v7_1_GetBufferedAudioSampleFrameCount(This,bufferedSampleCount) \ - ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleCount) ) + ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleCount) ) #define IDeckLinkOutput_v7_1_FlushBufferedAudioSamples(This) \ - ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) + ( (This)->lpVtbl -> FlushBufferedAudioSamples(This) ) #define IDeckLinkOutput_v7_1_SetAudioCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetAudioCallback(This,theCallback) ) #define 
IDeckLinkOutput_v7_1_StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) \ - ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) + ( (This)->lpVtbl -> StartScheduledPlayback(This,playbackStartTime,timeScale,playbackSpeed) ) #define IDeckLinkOutput_v7_1_StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) \ - ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) + ( (This)->lpVtbl -> StopScheduledPlayback(This,stopPlaybackAtTime,actualStopTime,timeScale) ) #define IDeckLinkOutput_v7_1_GetHardwareReferenceClock(This,desiredTimeScale,elapsedTimeSinceSchedulerBegan) \ - ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,elapsedTimeSinceSchedulerBegan) ) + ( (This)->lpVtbl -> GetHardwareReferenceClock(This,desiredTimeScale,elapsedTimeSinceSchedulerBegan) ) #endif /* COBJMACROS */ @@ -15837,132 +17762,132 @@ #define __IDeckLinkInput_v7_1_INTERFACE_DEFINED__ /* interface IDeckLinkInput_v7_1 */ -/* [helpstring][uuid][object] */ +/* [helpstring][uuid][object] */ EXTERN_C const IID IID_IDeckLinkInput_v7_1; #if defined(__cplusplus) && !defined(CINTERFACE) - + MIDL_INTERFACE("2B54EDEF-5B32-429F-BA11-BB990596EACD") IDeckLinkInput_v7_1 : public IUnknown { public: - virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( + virtual HRESULT STDMETHODCALLTYPE DoesSupportVideoMode( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( + /* [out] */ BMDDisplayModeSupport_v10_11 *result) = 0; + + virtual HRESULT STDMETHODCALLTYPE GetDisplayModeIterator( /* [out] */ IDeckLinkDisplayModeIterator_v7_1 **iterator) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( + + virtual HRESULT STDMETHODCALLTYPE EnableVideoInput( BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, BMDVideoInputFlags flags) = 0; - + virtual HRESULT 
STDMETHODCALLTYPE DisableVideoInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( + + virtual HRESULT STDMETHODCALLTYPE EnableAudioInput( BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE DisableAudioInput( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE ReadAudioSamples( + + virtual HRESULT STDMETHODCALLTYPE ReadAudioSamples( void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesRead, /* [out] */ BMDTimeValue *audioPacketTime, BMDTimeScale timeScale) = 0; - - virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( + + virtual HRESULT STDMETHODCALLTYPE GetBufferedAudioSampleFrameCount( /* [out] */ unsigned int *bufferedSampleCount) = 0; - + virtual HRESULT STDMETHODCALLTYPE StartStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE StopStreams( void) = 0; - + virtual HRESULT STDMETHODCALLTYPE PauseStreams( void) = 0; - - virtual HRESULT STDMETHODCALLTYPE SetCallback( + + virtual HRESULT STDMETHODCALLTYPE SetCallback( /* [in] */ IDeckLinkInputCallback_v7_1 *theCallback) = 0; - + }; - - + + #else /* C style interface */ typedef struct IDeckLinkInput_v7_1Vtbl { BEGIN_INTERFACE - - HRESULT ( STDMETHODCALLTYPE *QueryInterface )( + + HRESULT ( STDMETHODCALLTYPE *QueryInterface )( IDeckLinkInput_v7_1 * This, /* [in] */ REFIID riid, - /* [annotation][iid_is][out] */ + /* [annotation][iid_is][out] */ _COM_Outptr_ void **ppvObject); - - ULONG ( STDMETHODCALLTYPE *AddRef )( + + ULONG ( STDMETHODCALLTYPE *AddRef )( IDeckLinkInput_v7_1 * This); - - ULONG ( STDMETHODCALLTYPE *Release )( + + ULONG ( STDMETHODCALLTYPE *Release )( IDeckLinkInput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( + + HRESULT ( STDMETHODCALLTYPE *DoesSupportVideoMode )( IDeckLinkInput_v7_1 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, - /* [out] */ BMDDisplayModeSupport *result); - - HRESULT ( STDMETHODCALLTYPE 
*GetDisplayModeIterator )( + /* [out] */ BMDDisplayModeSupport_v10_11 *result); + + HRESULT ( STDMETHODCALLTYPE *GetDisplayModeIterator )( IDeckLinkInput_v7_1 * This, /* [out] */ IDeckLinkDisplayModeIterator_v7_1 **iterator); - - HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableVideoInput )( IDeckLinkInput_v7_1 * This, BMDDisplayMode displayMode, BMDPixelFormat pixelFormat, BMDVideoInputFlags flags); - - HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableVideoInput )( IDeckLinkInput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *EnableAudioInput )( IDeckLinkInput_v7_1 * This, BMDAudioSampleRate sampleRate, BMDAudioSampleType sampleType, unsigned int channelCount); - - HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( + + HRESULT ( STDMETHODCALLTYPE *DisableAudioInput )( IDeckLinkInput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *ReadAudioSamples )( + + HRESULT ( STDMETHODCALLTYPE *ReadAudioSamples )( IDeckLinkInput_v7_1 * This, void *buffer, unsigned int sampleFrameCount, /* [out] */ unsigned int *sampleFramesRead, /* [out] */ BMDTimeValue *audioPacketTime, BMDTimeScale timeScale); - - HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( + + HRESULT ( STDMETHODCALLTYPE *GetBufferedAudioSampleFrameCount )( IDeckLinkInput_v7_1 * This, /* [out] */ unsigned int *bufferedSampleCount); - - HRESULT ( STDMETHODCALLTYPE *StartStreams )( + + HRESULT ( STDMETHODCALLTYPE *StartStreams )( IDeckLinkInput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *StopStreams )( + + HRESULT ( STDMETHODCALLTYPE *StopStreams )( IDeckLinkInput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *PauseStreams )( + + HRESULT ( STDMETHODCALLTYPE *PauseStreams )( IDeckLinkInput_v7_1 * This); - - HRESULT ( STDMETHODCALLTYPE *SetCallback )( + + HRESULT ( STDMETHODCALLTYPE *SetCallback )( IDeckLinkInput_v7_1 * This, /* [in] */ IDeckLinkInputCallback_v7_1 
*theCallback); - + END_INTERFACE } IDeckLinkInput_v7_1Vtbl; @@ -15971,56 +17896,56 @@ CONST_VTBL struct IDeckLinkInput_v7_1Vtbl *lpVtbl; }; - + #ifdef COBJMACROS #define IDeckLinkInput_v7_1_QueryInterface(This,riid,ppvObject) \ - ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) + ( (This)->lpVtbl -> QueryInterface(This,riid,ppvObject) ) #define IDeckLinkInput_v7_1_AddRef(This) \ - ( (This)->lpVtbl -> AddRef(This) ) + ( (This)->lpVtbl -> AddRef(This) ) #define IDeckLinkInput_v7_1_Release(This) \ - ( (This)->lpVtbl -> Release(This) ) + ( (This)->lpVtbl -> Release(This) ) #define IDeckLinkInput_v7_1_DoesSupportVideoMode(This,displayMode,pixelFormat,result) \ - ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) + ( (This)->lpVtbl -> DoesSupportVideoMode(This,displayMode,pixelFormat,result) ) #define IDeckLinkInput_v7_1_GetDisplayModeIterator(This,iterator) \ - ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) + ( (This)->lpVtbl -> GetDisplayModeIterator(This,iterator) ) #define IDeckLinkInput_v7_1_EnableVideoInput(This,displayMode,pixelFormat,flags) \ - ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) + ( (This)->lpVtbl -> EnableVideoInput(This,displayMode,pixelFormat,flags) ) #define IDeckLinkInput_v7_1_DisableVideoInput(This) \ - ( (This)->lpVtbl -> DisableVideoInput(This) ) + ( (This)->lpVtbl -> DisableVideoInput(This) ) #define IDeckLinkInput_v7_1_EnableAudioInput(This,sampleRate,sampleType,channelCount) \ - ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) + ( (This)->lpVtbl -> EnableAudioInput(This,sampleRate,sampleType,channelCount) ) #define IDeckLinkInput_v7_1_DisableAudioInput(This) \ - ( (This)->lpVtbl -> DisableAudioInput(This) ) + ( (This)->lpVtbl -> DisableAudioInput(This) ) #define IDeckLinkInput_v7_1_ReadAudioSamples(This,buffer,sampleFrameCount,sampleFramesRead,audioPacketTime,timeScale) \ - ( (This)->lpVtbl -> 
ReadAudioSamples(This,buffer,sampleFrameCount,sampleFramesRead,audioPacketTime,timeScale) ) + ( (This)->lpVtbl -> ReadAudioSamples(This,buffer,sampleFrameCount,sampleFramesRead,audioPacketTime,timeScale) ) #define IDeckLinkInput_v7_1_GetBufferedAudioSampleFrameCount(This,bufferedSampleCount) \ - ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleCount) ) + ( (This)->lpVtbl -> GetBufferedAudioSampleFrameCount(This,bufferedSampleCount) ) #define IDeckLinkInput_v7_1_StartStreams(This) \ - ( (This)->lpVtbl -> StartStreams(This) ) + ( (This)->lpVtbl -> StartStreams(This) ) #define IDeckLinkInput_v7_1_StopStreams(This) \ - ( (This)->lpVtbl -> StopStreams(This) ) + ( (This)->lpVtbl -> StopStreams(This) ) #define IDeckLinkInput_v7_1_PauseStreams(This) \ - ( (This)->lpVtbl -> PauseStreams(This) ) + ( (This)->lpVtbl -> PauseStreams(This) ) #define IDeckLinkInput_v7_1_SetCallback(This,theCallback) \ - ( (This)->lpVtbl -> SetCallback(This,theCallback) ) + ( (This)->lpVtbl -> SetCallback(This,theCallback) ) #endif /* COBJMACROS */
gst-plugins-bad-1.18.6.tar.xz/sys/decklink/win/DeckLinkAPI_i.c -> gst-plugins-bad-1.20.1.tar.xz/sys/decklink/win/DeckLinkAPI_i.c
Changed
@@ -1,6 +1,4 @@ /* *INDENT-OFF* */ - - /* this ALWAYS GENERATED file contains the IIDs and CLSIDs */ @@ -8,13 +6,13 @@ /* File created by MIDL compiler version 8.01.0622 */ -/* at Fri Feb 28 12:18:07 2020 +/* at Tue Jan 19 12:14:07 2038 */ /* Compiler settings for ..\..\Blackmagic\DeckLink_SDK_10.11.4\Win\include\DeckLinkAPI.idl: - Oicf, W1, Zp8, env=Win64 (32b run), target_arch=AMD64 8.01.0622 + Oicf, W1, Zp8, env=Win64 (32b run), target_arch=AMD64 8.01.0622 protocol : dce , ms_ext, c_ext, robust - error checks: allocation ref bounds_check enum stub_data - VC __declspec() decoration level: + error checks: allocation ref bounds_check enum stub_data + VC __declspec() decoration level: __declspec(uuid()), __declspec(selectany), __declspec(novtable) DECLSPEC_UUID(), MIDL_INTERFACE() */ @@ -24,10 +22,9 @@ #pragma warning( disable: 4049 ) /* more than 64k source lines */ #endif - #ifdef __cplusplus extern "C"{ -#endif +#endif #include <rpc.h> @@ -86,7 +83,7 @@ MIDL_DEFINE_GUID(IID, IID_IDeckLink,0xC418FBDD,0x0587,0x48ED,0x8F,0xE5,0x64,0x0F,0x0A,0x14,0xAF,0x91); -MIDL_DEFINE_GUID(IID, IID_IDeckLinkConfiguration,0xEF90380B,0x4AE5,0x4346,0x90,0x77,0xE2,0x88,0xE1,0x49,0xF1,0x29); +MIDL_DEFINE_GUID(IID, IID_IDeckLinkConfiguration,0x912F634B,0x2D4E,0x40A4,0x8A,0xAB,0x8D,0x80,0xB7,0x3F,0x12,0x89); MIDL_DEFINE_GUID(IID, IID_IDeckLinkEncoderConfiguration,0x138050E5,0xC60A,0x4552,0xBF,0x3F,0x0F,0x35,0x80,0x49,0x32,0x7E); @@ -158,16 +155,16 @@ MIDL_DEFINE_GUID(IID, IID_IDeckLinkAPIInformation,0x7BEA3C68,0x730D,0x4322,0xAF,0x34,0x8A,0x71,0x52,0xB5,0x32,0xA4); -MIDL_DEFINE_GUID(IID, IID_IDeckLinkOutput,0xCC5C8A6E,0x3F2F,0x4B3A,0x87,0xEA,0xFD,0x78,0xAF,0x30,0x05,0x64); +MIDL_DEFINE_GUID(IID, IID_IDeckLinkOutput,0x065A0F6C,0xC508,0x4D0D,0xB9,0x19,0xF5,0xEB,0x0E,0xBF,0xC9,0x6B); -MIDL_DEFINE_GUID(IID, IID_IDeckLinkInput,0xAF22762B,0xDFAC,0x4846,0xAA,0x79,0xFA,0x88,0x83,0x56,0x09,0x95); +MIDL_DEFINE_GUID(IID, 
IID_IDeckLinkInput,0x2A88CF76,0xF494,0x4216,0xA7,0xEF,0xDC,0x74,0xEE,0xB8,0x38,0x82); MIDL_DEFINE_GUID(IID, IID_IDeckLinkHDMIInputEDID,0xABBBACBC,0x45BC,0x4665,0x9D,0x92,0xAC,0xE6,0xE5,0xA9,0x79,0x02); -MIDL_DEFINE_GUID(IID, IID_IDeckLinkEncoderInput,0x270587DA,0x6B7D,0x42E7,0xA1,0xF0,0x6D,0x85,0x3F,0x58,0x11,0x85); +MIDL_DEFINE_GUID(IID, IID_IDeckLinkEncoderInput,0xF222551D,0x13DF,0x4FD8,0xB5,0x87,0x9D,0x4F,0x19,0xEC,0x12,0xC9); MIDL_DEFINE_GUID(IID, IID_IDeckLinkVideoFrame,0x3F716FE0,0xF023,0x4111,0xBE,0x5D,0xEF,0x44,0x14,0xC0,0x5B,0x17); @@ -224,10 +221,22 @@ MIDL_DEFINE_GUID(IID, IID_IDeckLinkNotificationCallback,0xb002a1ec,0x070d,0x4288,0x82,0x89,0xbd,0x5d,0x36,0xe5,0xff,0x0d); -MIDL_DEFINE_GUID(IID, IID_IDeckLinkNotification,0x0a1fb207,0xe215,0x441b,0x9b,0x19,0x6f,0xa1,0x57,0x59,0x46,0xc5); +MIDL_DEFINE_GUID(IID, IID_IDeckLinkNotification,0xb85df4c8,0xbdf5,0x47c1,0x80,0x64,0x28,0x16,0x2e,0xbd,0xd4,0xeb); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkProfileAttributes,0x17D4BF8E,0x4911,0x473A,0x80,0xA0,0x73,0x1C,0xF6,0xFF,0x34,0x5B); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkProfileIterator,0x29E5A8C0,0x8BE4,0x46EB,0x93,0xAC,0x31,0xDA,0xAB,0x5B,0x7B,0xF2); -MIDL_DEFINE_GUID(IID, IID_IDeckLinkAttributes,0xABC11843,0xD966,0x44CB,0x96,0xE2,0xA1,0xCB,0x5D,0x31,0x35,0xC4); +MIDL_DEFINE_GUID(IID, IID_IDeckLinkProfile,0x16093466,0x674A,0x432B,0x9D,0xA0,0x1A,0xC2,0xC5,0xA8,0x24,0x1C); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkProfileCallback,0xA4F9341E,0x97AA,0x4E04,0x89,0x35,0x15,0xF8,0x09,0x89,0x8C,0xEA); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkProfileManager,0x30D41429,0x3998,0x4B6D,0x84,0xF8,0x78,0xC9,0x4A,0x79,0x7C,0x6E); MIDL_DEFINE_GUID(IID, IID_IDeckLinkStatus,0x5F558200,0x4028,0x49BC,0xBE,0xAC,0xDB,0x3F,0xA4,0xA9,0x6E,0x46); @@ -245,7 +254,7 @@ MIDL_DEFINE_GUID(IID, IID_IDeckLinkDiscovery,0xCDBF631C,0xBC76,0x45FA,0xB4,0x4D,0xC5,0x50,0x59,0xBC,0x61,0x01); -MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkIterator,0x87D2693F,0x8D4A,0x45C7,0xB4,0x3F,0x10,0xAC,0xBA,0x25,0xE6,0x8F); 
+MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkIterator,0xBA6C6F44,0x6DA5,0x4DCE,0x94,0xAA,0xEE,0x2D,0x13,0x72,0xA6,0x76); MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkAPIInformation,0x263CA19F,0xED09,0x482E,0x9F,0x9D,0x84,0x00,0x57,0x83,0xA2,0x37); @@ -260,12 +269,36 @@ MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkVideoConversion,0x7DBBBB11,0x5B7B,0x467D,0xAE,0xA4,0xCE,0xA4,0x68,0xFD,0x36,0x8C); -MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkDiscovery,0x652615D4,0x26CD,0x4514,0xB1,0x61,0x2F,0xD5,0x07,0x2E,0xD0,0x08); +MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkDiscovery,0x22FBFC33,0x8D07,0x495C,0xA5,0xBF,0xDA,0xB5,0xEA,0x9B,0x82,0xDB); MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkVideoFrameAncillaryPackets,0xF891AD29,0xD0C2,0x46E9,0xA9,0x26,0x4E,0x2D,0x0D,0xD8,0xCF,0xAD); +MIDL_DEFINE_GUID(IID, IID_IDeckLinkConfiguration_v10_11,0xEF90380B,0x4AE5,0x4346,0x90,0x77,0xE2,0x88,0xE1,0x49,0xF1,0x29); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkAttributes_v10_11,0xABC11843,0xD966,0x44CB,0x96,0xE2,0xA1,0xCB,0x5D,0x31,0x35,0xC4); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkNotification_v10_11,0x0A1FB207,0xE215,0x441B,0x9B,0x19,0x6F,0xA1,0x57,0x59,0x46,0xC5); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkOutput_v10_11,0xCC5C8A6E,0x3F2F,0x4B3A,0x87,0xEA,0xFD,0x78,0xAF,0x30,0x05,0x64); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkInput_v10_11,0xAF22762B,0xDFAC,0x4846,0xAA,0x79,0xFA,0x88,0x83,0x56,0x09,0x95); + + +MIDL_DEFINE_GUID(IID, IID_IDeckLinkEncoderInput_v10_11,0x270587DA,0x6B7D,0x42E7,0xA1,0xF0,0x6D,0x85,0x3F,0x58,0x11,0x85); + + +MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkIterator_v10_11,0x87D2693F,0x8D4A,0x45C7,0xB4,0x3F,0x10,0xAC,0xBA,0x25,0xE6,0x8F); + + +MIDL_DEFINE_GUID(CLSID, CLSID_CDeckLinkDiscovery_v10_11,0x652615D4,0x26CD,0x4514,0xB1,0x61,0x2F,0xD5,0x07,0x2E,0xD0,0x08); + + MIDL_DEFINE_GUID(IID, IID_IDeckLinkConfiguration_v10_9,0xCB71734A,0xFE37,0x4E8D,0x8E,0x13,0x80,0x21,0x33,0xA1,0xC3,0xF2); @@ -404,6 +437,4 @@ #endif - - /* *INDENT-ON* */
gst-plugins-bad-1.18.6.tar.xz/sys/dvb/dvbbasebin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/dvb/dvbbasebin.c
Changed
@@ -30,6 +30,7 @@ #include <stdlib.h> #include <string.h> #include <gst/mpegts/mpegts.h> +#include "gstdvbelements.h" #include "dvbbasebin.h" #include "parsechannels.h" @@ -169,7 +170,12 @@ 0, G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER, dvb_base_bin_uri_handler_init)); - +#define _do_init \ + GST_DEBUG_CATEGORY_INIT (dvb_base_bin_debug, "dvbbasebin", 0, "DVB bin"); \ + cam_init (); \ + dvb_element_init (plugin); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dvbbasebin, "dvbbasebin", GST_RANK_NONE, + GST_TYPE_DVB_BASE_BIN, _do_init); static void dvb_base_bin_ref_stream (DvbBaseBinStream * stream) @@ -745,7 +751,7 @@ if (name == NULL) name = GST_PAD_TEMPLATE_NAME_TEMPLATE (templ); - pad = gst_element_get_request_pad (dvbbasebin->tsparse, name); + pad = gst_element_request_pad_simple (dvbbasebin->tsparse, name); if (pad == NULL) return NULL; @@ -1216,16 +1222,6 @@ iface->set_uri = dvb_base_bin_uri_set_uri; } -gboolean -gst_dvb_base_bin_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (dvb_base_bin_debug, "dvbbasebin", 0, "DVB bin"); - - cam_init (); - - return gst_element_register (plugin, "dvbbasebin", - GST_RANK_NONE, GST_TYPE_DVB_BASE_BIN); -} static void dvb_base_bin_program_destroy (gpointer data)
gst-plugins-bad-1.18.6.tar.xz/sys/dvb/dvbbasebin.h -> gst-plugins-bad-1.20.1.tar.xz/sys/dvb/dvbbasebin.h
Changed
@@ -75,7 +75,6 @@ }; GType dvb_base_bin_get_type(void); -gboolean gst_dvb_base_bin_plugin_init (GstPlugin *plugin); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/dvb/gstdvb.c -> gst-plugins-bad-1.20.1.tar.xz/sys/dvb/gstdvb.c
Changed
@@ -1,3 +1,4 @@ + /* * gstdvb.c - * Copyright (C) 2007 Alessandro Decina @@ -26,26 +27,17 @@ #include <gst/gst-i18n-plugin.h> -#include "gstdvbsrc.h" -#include "dvbbasebin.h" +#include "gstdvbelements.h" static gboolean plugin_init (GstPlugin * plugin) { -#ifdef ENABLE_NLS - GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, - LOCALEDIR); - bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); - bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); -#endif /* ENABLE_NLS */ - - if (!gst_dvbsrc_plugin_init (plugin)) - return FALSE; + gboolean ret = FALSE; - if (!gst_dvb_base_bin_plugin_init (plugin)) - return FALSE; + ret |= GST_ELEMENT_REGISTER (dvbsrc, plugin); + ret |= GST_ELEMENT_REGISTER (dvbbasebin, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.20.1.tar.xz/sys/dvb/gstdvbelement.c
Added
@@ -0,0 +1,44 @@ +/* + * gstdvb.c - + * Copyright (C) 2007 Alessandro Decina + * + * Authors: + * Alessandro Decina <alessandro.d@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst-i18n-plugin.h> + +#include "gstdvbelements.h" + +void +dvb_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + if (g_once_init_enter (&res)) { +#ifdef ENABLE_NLS + GST_DEBUG ("binding text domain %s to locale dir %s", GETTEXT_PACKAGE, + LOCALEDIR); + bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR); + bind_textdomain_codeset (GETTEXT_PACKAGE, "UTF-8"); +#endif /* ENABLE_NLS */ + g_once_init_leave (&res, TRUE); + } +}
gst-plugins-bad-1.20.1.tar.xz/sys/dvb/gstdvbelements.h
Added
@@ -0,0 +1,34 @@ +/* GStreamer + * Copyright (C) <2020> Julian Bouzas <julian.bouzas@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_DVB_ELEMENTS_H__ +#define __GST_DVB_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void dvb_element_init (GstPlugin * plugin); +GST_ELEMENT_REGISTER_DECLARE (dvbbasebin); +GST_ELEMENT_REGISTER_DECLARE (dvbsrc); + +#endif /* __GST_DVB_ELEMENTS_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/dvb/gstdvbsrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/dvb/gstdvbsrc.c
Changed
@@ -93,6 +93,7 @@ #include "config.h" #endif +#include "gstdvbelements.h" #include "gstdvbsrc.h" #include <gst/gst.h> #include <gst/glib-compat-private.h> @@ -601,6 +602,11 @@ #define gst_dvbsrc_parent_class parent_class G_DEFINE_TYPE (GstDvbSrc, gst_dvbsrc, GST_TYPE_PUSH_SRC); +#define _do_init \ + GST_DEBUG_CATEGORY_INIT (gstdvbsrc_debug, "dvbsrc", 0, "DVB Source Element");\ + dvb_element_init (plugin); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (dvbsrc, "dvbsrc", GST_RANK_NONE, + GST_TYPE_DVBSRC, _do_init); static guint gst_dvbsrc_signals[LAST_SIGNAL] = { 0 }; @@ -1859,14 +1865,6 @@ * register the element factories and pad templates * register the features */ -gboolean -gst_dvbsrc_plugin_init (GstPlugin * plugin) -{ - GST_DEBUG_CATEGORY_INIT (gstdvbsrc_debug, "dvbsrc", 0, "DVB Source Element"); - - return gst_element_register (plugin, "dvbsrc", GST_RANK_NONE, - GST_TYPE_DVBSRC); -} static GstFlowReturn gst_dvbsrc_read_device (GstDvbSrc * object, int size, GstBuffer ** buffer)
gst-plugins-bad-1.18.6.tar.xz/sys/dvb/gstdvbsrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/dvb/gstdvbsrc.h
Changed
@@ -144,7 +144,6 @@ GType gst_dvbsrc_get_type (void); -gboolean gst_dvbsrc_plugin_init (GstPlugin * plugin); G_END_DECLS #endif /* __GST_DVBSRC_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/dvb/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/dvb/meson.build
Changed
@@ -11,6 +11,7 @@ 'camutils.c', 'dvbbasebin.c', 'gstdvb.c', + 'gstdvbelement.c', 'gstdvbsrc.c', 'parsechannels.c', ]
gst-plugins-bad-1.18.6.tar.xz/sys/fbdev/gstfbdevsink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/fbdev/gstfbdevsink.c
Changed
@@ -83,6 +83,8 @@ #define parent_class gst_fbdevsink_parent_class G_DEFINE_TYPE (GstFBDEVSink, gst_fbdevsink, GST_TYPE_VIDEO_SINK); +GST_ELEMENT_REGISTER_DEFINE (fbdevsink, "fbdevsink", GST_RANK_NONE, + GST_TYPE_FBDEVSINK); static void gst_fbdevsink_init (GstFBDEVSink * fbdevsink) @@ -394,11 +396,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "fbdevsink", GST_RANK_NONE, - GST_TYPE_FBDEVSINK)) - return FALSE; - - return TRUE; + return GST_ELEMENT_REGISTER (fbdevsink, plugin); } static void
gst-plugins-bad-1.18.6.tar.xz/sys/fbdev/gstfbdevsink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/fbdev/gstfbdevsink.h
Changed
@@ -67,6 +67,7 @@ }; GType gst_fbdevsink_get_type(void); +GST_ELEMENT_REGISTER_DECLARE (fbdevsink); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/ipcpipeline/gstipcpipeline.c -> gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcpipeline.c
Changed
@@ -22,21 +22,15 @@ #include "config.h" #endif -#include "gstipcpipelinecomm.h" -#include "gstipcpipelinesink.h" -#include "gstipcpipelinesrc.h" -#include "gstipcslavepipeline.h" +#include "gstipcpipelineelements.h" + static gboolean plugin_init (GstPlugin * plugin) { - gst_ipc_pipeline_comm_plugin_init (); - gst_element_register (plugin, "ipcpipelinesrc", GST_RANK_NONE, - GST_TYPE_IPC_PIPELINE_SRC); - gst_element_register (plugin, "ipcpipelinesink", GST_RANK_NONE, - GST_TYPE_IPC_PIPELINE_SINK); - gst_element_register (plugin, "ipcslavepipeline", GST_RANK_NONE, - GST_TYPE_IPC_SLAVE_PIPELINE); + GST_ELEMENT_REGISTER (ipcpipelinesrc, plugin); + GST_ELEMENT_REGISTER (ipcpipelinesink, plugin); + GST_ELEMENT_REGISTER (ipcslavepipeline, plugin); return TRUE; }
gst-plugins-bad-1.18.6.tar.xz/sys/ipcpipeline/gstipcpipelinecomm.c -> gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcpipelinecomm.c
Changed
@@ -28,8 +28,9 @@ # include <unistd.h> #endif #ifdef _MSC_VER -/* ssize_t is not available, so match return value of read()/write() on MSVC */ -#define ssize_t int +/* ssize_t is not available, so match return value of recv()/send() on MSVC */ +# define ssize_t int +# include <winsock2.h> #endif #include <errno.h> #include <string.h> @@ -228,6 +229,20 @@ GST_TRACE_OBJECT (comm->element, "Writing %u bytes to fdout", (unsigned) size); while (size) { +#ifdef _MSC_VER + ssize_t written = + send (comm->fdout, (const unsigned char *) data + offset, size, 0); + if (written < 0) { + int last_error = WSAGetLastError (); + if (last_error == WSAEWOULDBLOCK) + continue; + gchar *error_text = g_win32_error_message (last_error); + GST_ERROR_OBJECT (comm->element, "Failed to write to fd: %s", error_text); + g_free (error_text); + ret = FALSE; + goto done; + } +#else ssize_t written = write (comm->fdout, (const unsigned char *) data + offset, size); if (written < 0) { @@ -238,6 +253,7 @@ ret = FALSE; goto done; } +#endif size -= written; offset += written; } @@ -1752,7 +1768,19 @@ mem = gst_allocator_alloc (NULL, comm->read_chunk_size, NULL); gst_memory_map (mem, &map, GST_MAP_WRITE); +#ifdef _MSC_VER + sz = recv (comm->pollFDin.fd, map.data, map.size, 0); + if (sz < 0) { + int last_error = WSAGetLastError (); + if (last_error == WSAEWOULDBLOCK) { + errno = EAGAIN; + } else { + errno = last_error; + } + } +#else sz = read (comm->pollFDin.fd, map.data, map.size); +#endif gst_memory_unmap (mem, &map); if (sz <= 0) {
gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcpipelineelement.c
Added
@@ -0,0 +1,37 @@ +/* GStreamer + * Copyright (C) 2017 YouView TV Ltd + * Author: George Kiagiadakis <george.Kiagiadakis@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin Street, Suite 500, + * Boston, MA 02110-1335, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstipcpipelineelements.h" +#include "gstipcpipelinecomm.h" + + +void +icepipeline_element_init (GstPlugin * plugin) +{ + static gsize res = FALSE; + if (g_once_init_enter (&res)) { + gst_ipc_pipeline_comm_plugin_init (); + g_once_init_leave (&res, TRUE); + } +}
gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcpipelineelements.h
Added
@@ -0,0 +1,37 @@ +/* GStreamer + * Copyright (C) 2017 YouView TV Ltd + * Author: George Kiagiadakis <george.Kiagiadakis@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifndef __GST_IPCPIPELINE_ELEMENTS_H__ +#define __GST_IPCPIPELINE_ELEMENTS_H__ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include <gst/gst.h> + +void icepipeline_element_init (GstPlugin * plugin); + +GST_ELEMENT_REGISTER_DECLARE (ipcpipelinesink); +GST_ELEMENT_REGISTER_DECLARE (ipcpipelinesrc); +GST_ELEMENT_REGISTER_DECLARE (ipcslavepipeline); + +#endif /* __GST_IPCPIPELINE_ELEMENTS_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/ipcpipeline/gstipcpipelinesink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcpipelinesink.c
Changed
@@ -73,6 +73,7 @@ # include "config.h" #endif +#include "gstipcpipelineelements.h" #include "gstipcpipelinesink.h" static GstStaticPadTemplate sinktemplate = GST_STATIC_PAD_TEMPLATE ("sink", @@ -109,6 +110,9 @@ #define gst_ipc_pipeline_sink_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstIpcPipelineSink, gst_ipc_pipeline_sink, GST_TYPE_ELEMENT, _do_init); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (ipcpipelinesink, "ipcpipelinesink", + GST_RANK_NONE, GST_TYPE_IPC_PIPELINE_SINK, + icepipeline_element_init (plugin)); static void gst_ipc_pipeline_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
gst-plugins-bad-1.18.6.tar.xz/sys/ipcpipeline/gstipcpipelinesrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcpipelinesrc.c
Changed
@@ -40,6 +40,7 @@ # include "config.h" #endif +#include "gstipcpipelineelements.h" #include "gstipcpipelinesrc.h" static GstStaticPadTemplate srctemplate = GST_STATIC_PAD_TEMPLATE ("src", @@ -79,6 +80,9 @@ #define gst_ipc_pipeline_src_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstIpcPipelineSrc, gst_ipc_pipeline_src, GST_TYPE_ELEMENT, _do_init); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (ipcpipelinesrc, "ipcpipelinesrc", + GST_RANK_NONE, GST_TYPE_IPC_PIPELINE_SRC, + icepipeline_element_init (plugin)); static void gst_ipc_pipeline_src_finalize (GObject * object); static void gst_ipc_pipeline_src_dispose (GObject * object);
gst-plugins-bad-1.18.6.tar.xz/sys/ipcpipeline/gstipcslavepipeline.c -> gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/gstipcslavepipeline.c
Changed
@@ -42,6 +42,7 @@ #include <string.h> +#include "gstipcpipelineelements.h" #include "gstipcpipelinesrc.h" #include "gstipcslavepipeline.h" @@ -53,6 +54,9 @@ #define gst_ipc_slave_pipeline_parent_class parent_class G_DEFINE_TYPE_WITH_CODE (GstIpcSlavePipeline, gst_ipc_slave_pipeline, GST_TYPE_PIPELINE, _do_init); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (ipcslavepipeline, "ipcslavepipeline", + GST_RANK_NONE, GST_TYPE_IPC_SLAVE_PIPELINE, + icepipeline_element_init (plugin)); static gboolean gst_ipc_slave_pipeline_post_message (GstElement * element, GstMessage * message);
gst-plugins-bad-1.18.6.tar.xz/sys/ipcpipeline/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/ipcpipeline/meson.build
Changed
@@ -1,5 +1,6 @@ ipcpipeline_sources = [ 'gstipcpipeline.c', + 'gstipcpipelineelement.c', 'gstipcpipelinecomm.c', 'gstipcpipelinesink.c', 'gstipcpipelinesrc.c', @@ -14,7 +15,7 @@ ipcpipeline_sources, c_args : gst_plugins_bad_args, include_directories : [configinc], - dependencies : [gstbase_dep], + dependencies : [gstbase_dep] + winsock2, install : true, install_dir : plugins_install_dir, )
gst-plugins-bad-1.18.6.tar.xz/sys/kms/gstkmssink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/kms/gstkmssink.c
Changed
@@ -81,6 +81,8 @@ GST_DEBUG_CATEGORY_GET (CAT_PERFORMANCE, "GST_PERFORMANCE"); G_IMPLEMENT_INTERFACE (GST_TYPE_VIDEO_OVERLAY, gst_kms_sink_video_overlay_init)); +GST_ELEMENT_REGISTER_DEFINE (kmssink, GST_PLUGIN_NAME, GST_RANK_SECONDARY, + GST_TYPE_KMS_SINK); enum { @@ -175,7 +177,7 @@ { static const char *drivers[] = { "i915", "radeon", "nouveau", "vmwgfx", "exynos", "amdgpu", "imx-drm", "rockchip", "atmel-hlcdc", "msm", - "xlnx", "vc4", "meson", "sun4i-drm", "mxsfb-drm", + "xlnx", "vc4", "meson", "sun4i-drm", "mxsfb-drm", "tegra", "xilinx_drm", /* DEPRECATED. Replaced by xlnx */ }; int i, fd = -1; @@ -2084,11 +2086,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, GST_PLUGIN_NAME, GST_RANK_SECONDARY, - GST_TYPE_KMS_SINK)) - return FALSE; - - return TRUE; + return GST_ELEMENT_REGISTER (kmssink, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, kms,
gst-plugins-bad-1.18.6.tar.xz/sys/kms/gstkmssink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/kms/gstkmssink.h
Changed
@@ -100,7 +100,7 @@ }; GType gst_kms_sink_get_type (void) G_GNUC_CONST; - +GST_ELEMENT_REGISTER_DECLARE (kmssink); G_END_DECLS #endif /* __GST_KMS_SINK_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/kms/gstkmsutils.c -> gst-plugins-bad-1.20.1.tar.xz/sys/kms/gstkmsutils.c
Changed
@@ -40,36 +40,47 @@ #define DEF_FMT(fourcc, fmt) \ { DRM_FORMAT_##fourcc,GST_VIDEO_FORMAT_##fmt } - /* DEF_FMT (XRGB1555, ???), */ - /* DEF_FMT (XBGR1555, ???), */ -#if G_BYTE_ORDER == G_LITTLE_ENDIAN + /* Keep sorted by decreasing quality, refer to GST_VIDEO_FORMATS_ALL order + * if unsure */ + + /* 32bits/p RGB with Alpha */ DEF_FMT (ARGB8888, BGRA), - DEF_FMT (XRGB8888, BGRx), DEF_FMT (ABGR8888, RGBA), + + /* 16bits/c YUV 4:2:0 */ + DEF_FMT (P016, P016_LE), + + /* 16bits/c YUV 4:2:0 */ + DEF_FMT (P010, P010_10LE), + + /* YUV 4:4:4 */ + DEF_FMT (NV24, NV24), + + /* 32bits/p RGB opaque */ + DEF_FMT (XRGB8888, BGRx), DEF_FMT (XBGR8888, RGBx), + + /* 24bits RGB opaque */ DEF_FMT (BGR888, RGB), DEF_FMT (RGB888, BGR), - DEF_FMT (P010, P010_10LE), - DEF_FMT (P016, P016_LE), -#else - DEF_FMT (ARGB8888, ARGB), - DEF_FMT (XRGB8888, xRGB), - DEF_FMT (ABGR8888, ABGR), - DEF_FMT (XBGR8888, xBGR), - DEF_FMT (RGB888, RGB), - DEF_FMT (BGR888, BGR), - DEF_FMT (P010, P010_10BE), - DEF_FMT (P016, P016_BE), -#endif + + /* 8bits/c YUV 4:2:2 */ + DEF_FMT (YUV422, Y42B), + DEF_FMT (NV61, NV61), + DEF_FMT (NV16, NV16), DEF_FMT (UYVY, UYVY), - DEF_FMT (YUYV, YUY2), DEF_FMT (YVYU, YVYU), + DEF_FMT (YUYV, YUY2), + + /* 8bits/c YUV 4:2:0 */ DEF_FMT (YUV420, I420), DEF_FMT (YVU420, YV12), - DEF_FMT (YUV422, Y42B), - DEF_FMT (NV12, NV12), DEF_FMT (NV21, NV21), - DEF_FMT (NV16, NV16), + DEF_FMT (NV12, NV12), + + /* 16bits/p RGB */ + DEF_FMT (RGB565, RGB16), + DEF_FMT (BGR565, BGR16), #undef DEF_FMT }; @@ -113,6 +124,8 @@ case DRM_FORMAT_NV12: case DRM_FORMAT_NV21: case DRM_FORMAT_NV16: + case DRM_FORMAT_NV61: + case DRM_FORMAT_NV24: bpp = 8; break; case DRM_FORMAT_P010: @@ -122,6 +135,8 @@ case DRM_FORMAT_YUYV: case DRM_FORMAT_YVYU: case DRM_FORMAT_P016: + case DRM_FORMAT_RGB565: + case DRM_FORMAT_BGR565: bpp = 16; break; case DRM_FORMAT_BGR888: @@ -152,8 +167,12 @@ ret = height * 3 / 2; break; case DRM_FORMAT_NV16: + case DRM_FORMAT_NV61: ret = height * 2; break; + case DRM_FORMAT_NV24: + 
ret = height * 3; + break; default: ret = height; break;
gst-plugins-bad-1.18.6.tar.xz/sys/magicleap/mlaudiosink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/magicleap/mlaudiosink.c
Changed
@@ -113,6 +113,9 @@ }; G_DEFINE_TYPE (GstMLAudioSink, gst_ml_audio_sink, GST_TYPE_AUDIO_SINK); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (mlaudiosink, "mlaudiosink", + GST_RANK_PRIMARY + 10, GST_TYPE_ML_AUDIO_SINK, + GST_DEBUG_CATEGORY_INIT (mgl_debug, "magicleap", 0, "Magic Leap elements")); enum {
gst-plugins-bad-1.18.6.tar.xz/sys/magicleap/mlaudiosink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/magicleap/mlaudiosink.h
Changed
@@ -27,5 +27,5 @@ #define GST_TYPE_ML_AUDIO_SINK gst_ml_audio_sink_get_type () G_DECLARE_FINAL_TYPE (GstMLAudioSink, gst_ml_audio_sink, GST, ML_AUDIO_SINK, GstAudioSink) - +GST_ELEMENT_REGISTER_DECLARE (mlaudiosink); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/magicleap/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/magicleap/plugin.c
Changed
@@ -29,12 +29,7 @@ static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "mlaudiosink", GST_RANK_PRIMARY + 10, - GST_TYPE_ML_AUDIO_SINK)) - return FALSE; - - GST_DEBUG_CATEGORY_INIT (mgl_debug, "magicleap", 0, "Magic Leap elements"); - return TRUE; + return GST_ELEMENT_REGISTER (mlaudiosink, plugin); } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfaacenc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfaacenc.cpp
Changed
@@ -43,7 +43,9 @@ #include <vector> #include <string> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; +/* *INDENT-ON* */ GST_DEBUG_CATEGORY (gst_mf_aac_enc_debug); #define GST_CAT_DEFAULT gst_mf_aac_enc_debug @@ -60,7 +62,7 @@ { GstMFAudioEnc parent; - /* properteies */ + /* properties */ guint bitrate; } GstMFAacEnc; @@ -70,6 +72,7 @@ } GstMFAacEncClass; +/* *INDENT-OFF* */ typedef struct { GstCaps *sink_caps; @@ -79,6 +82,7 @@ guint device_index; std::set<UINT32> bitrate_list; } GstMFAacEncClassData; +/* *INDENT-ON* */ static GstElementClass *parent_class = NULL; @@ -110,25 +114,27 @@ gobject_class->get_property = gst_mf_aac_enc_get_property; gobject_class->set_property = gst_mf_aac_enc_set_property; - bitrate_blurb = - "Bitrate in bit/sec, (0 = auto), valid values are { 0"; + bitrate_blurb = "Bitrate in bit/sec, (0 = auto), valid values are { 0"; + + /* *INDENT-OFF* */ for (auto iter: cdata->bitrate_list) { bitrate_blurb += ", " + std::to_string (iter); /* std::set<> stores values in a sorted fashion */ max_bitrate = iter; } bitrate_blurb += " }"; + /* *INDENT-ON* */ g_object_class_install_property (gobject_class, PROP_BITRATE, - g_param_spec_uint ("bitrate", "Bitrate", bitrate_blurb.c_str(), 0, + g_param_spec_uint ("bitrate", "Bitrate", bitrate_blurb.c_str (), 0, max_bitrate, DEFAULT_BITRATE, (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | - G_PARAM_STATIC_NAME | G_PARAM_STATIC_NICK))); + G_PARAM_STATIC_NAME | G_PARAM_STATIC_NICK))); long_name = g_strdup_printf ("Media Foundation %s", cdata->device_name); classification = g_strdup_printf ("Codec/Encoder/Audio%s", (cdata->enum_flags & MFT_ENUM_FLAG_HARDWARE) == MFT_ENUM_FLAG_HARDWARE ? 
- "/Hardware" : ""); + "/Hardware" : ""); gst_element_class_set_metadata (element_class, long_name, classification, "Microsoft Media Foundation AAC Encoder", @@ -147,8 +153,7 @@ GST_DEBUG_FUNCPTR (gst_mf_aac_enc_get_output_type); mfenc_class->get_input_type = GST_DEBUG_FUNCPTR (gst_mf_aac_enc_get_input_type); - mfenc_class->set_src_caps = - GST_DEBUG_FUNCPTR (gst_mf_aac_enc_set_src_caps); + mfenc_class->set_src_caps = GST_DEBUG_FUNCPTR (gst_mf_aac_enc_set_src_caps); mfenc_class->codec_id = MFAudioFormat_AAC; mfenc_class->enum_flags = cdata->enum_flags; @@ -207,9 +212,9 @@ GstMFTransform *transform = mfenc->transform; GList *output_list = NULL; GList *iter; - ComPtr<IMFMediaType> target_output; - std::vector<ComPtr<IMFMediaType>> filtered_types; - std::set<UINT32> bitrate_list; + ComPtr < IMFMediaType > target_output; + std::vector < ComPtr < IMFMediaType >> filtered_types; + std::set < UINT32 > bitrate_list; UINT32 bitrate; UINT32 target_bitrate = 0; HRESULT hr; @@ -268,12 +273,12 @@ g_list_free_full (output_list, (GDestroyNotify) gst_mf_media_type_release); - if (filtered_types.empty()) { + if (filtered_types.empty ()) { GST_ERROR_OBJECT (self, "Couldn't find target output type"); return FALSE; } - GST_DEBUG_OBJECT (self, "have %d candidate output", filtered_types.size()); + GST_DEBUG_OBJECT (self, "have %d candidate output", filtered_types.size ()); /* 2. 
Find the best matching bitrate */ bitrate = self->bitrate; @@ -310,6 +315,8 @@ } GST_DEBUG_OBJECT (self, "Available bitrates"); + + /* *INDENT-OFF* */ for (auto it: bitrate_list) GST_DEBUG_OBJECT (self, "\t%d", it); @@ -335,13 +342,14 @@ break; } } + /* *INDENT-ON* */ if (!target_output) { GST_ERROR_OBJECT (self, "Failed to decide final output type"); return FALSE; } - *output_type = target_output.Detach(); + *output_type = target_output.Detach (); return TRUE; } @@ -354,9 +362,9 @@ GstMFTransform *transform = mfenc->transform; GList *input_list = NULL; GList *iter; - ComPtr<IMFMediaType> target_input; - std::vector<ComPtr<IMFMediaType>> filtered_types; - std::set<UINT32> bitrate_list; + ComPtr < IMFMediaType > target_input; + std::vector < ComPtr < IMFMediaType >> filtered_types; + std::set < UINT32 > bitrate_list; HRESULT hr; if (!gst_mf_transform_get_input_available_types (transform, &input_list)) { @@ -407,25 +415,24 @@ g_list_free_full (input_list, (GDestroyNotify) gst_mf_media_type_release); - if (filtered_types.empty()) { + if (filtered_types.empty ()) { GST_ERROR_OBJECT (self, "Couldn't find target input type"); return FALSE; } GST_DEBUG_OBJECT (self, "Total %d input types are available", - filtered_types.size()); + filtered_types.size ()); /* Just select the first one */ - target_input = *filtered_types.begin(); + target_input = *filtered_types.begin (); - *input_type = target_input.Detach(); + *input_type = target_input.Detach (); return TRUE; } static gboolean -gst_mf_aac_enc_set_src_caps (GstMFAudioEnc * mfenc, - GstAudioInfo * info) +gst_mf_aac_enc_set_src_caps (GstMFAudioEnc * mfenc, GstAudioInfo * info) { GstMFAacEnc *self = (GstMFAacEnc *) mfenc; HRESULT hr; @@ -434,10 +441,11 @@ UINT8 *blob = NULL; UINT32 blob_size = 0; gboolean ret; - ComPtr<IMFMediaType> output_type; + ComPtr < IMFMediaType > output_type; static const guint config_data_offset = 12; - if (!gst_mf_transform_get_output_current_type (mfenc->transform, &output_type)) { + if 
(!gst_mf_transform_get_output_current_type (mfenc->transform, + &output_type)) { GST_ERROR_OBJECT (self, "Couldn't get current output type"); return FALSE; } @@ -480,7 +488,8 @@ blob + config_data_offset, blob_size - config_data_offset); CoTaskMemFree (blob); - ret = gst_audio_encoder_set_output_format (GST_AUDIO_ENCODER (self), src_caps); + ret = + gst_audio_encoder_set_output_format (GST_AUDIO_ENCODER (self), src_caps); if (!ret) { GST_WARNING_OBJECT (self, "Couldn't set output format %" GST_PTR_FORMAT, src_caps); @@ -494,7 +503,7 @@ gst_mf_aac_enc_register (GstPlugin * plugin, guint rank, const gchar * device_name, guint32 enum_flags, guint device_index, GstCaps * sink_caps, GstCaps * src_caps, - const std::set<UINT32> &bitrate_list) + const std::set < UINT32 > &bitrate_list) { GType type; gchar *type_name; @@ -562,9 +571,9 @@ gchar *device_name = NULL; GList *output_list = NULL; GList *iter; - std::set<UINT32> channels_list; - std::set<UINT32> rate_list; - std::set<UINT32> bitrate_list; + std::set < UINT32 > channels_list; + std::set < UINT32 > rate_list; + std::set < UINT32 > bitrate_list; gboolean config_found = FALSE; GValue channles_value = G_VALUE_INIT; GValue rate_value = G_VALUE_INIT; @@ -583,7 +592,8 @@ goto done; } - GST_INFO_OBJECT (transform, "Have %d output type", g_list_length (output_list)); + GST_INFO_OBJECT (transform, "Have %d output type", + g_list_length (output_list)); for (iter = output_list, i = 0; iter; iter = g_list_next (iter), i++) { UINT32 channels, rate, bitrate; @@ -652,6 +662,7 @@ g_value_init (&channles_value, GST_TYPE_LIST); g_value_init (&rate_value, GST_TYPE_LIST); + /* *INDENT-OFF* */ for (auto it: channels_list) { GValue channles = G_VALUE_INIT; @@ -667,6 +678,7 @@ g_value_set_int (&rate, (gint) it); gst_value_list_append_and_take_value (&rate_value, &rate); } + /* *INDENT-ON* */ gst_caps_set_value (src_caps, "channels", &channles_value); gst_caps_set_value (sink_caps, "channels", &channles_value); @@ -680,7 +692,7 @@ 
gst_mf_aac_enc_register (plugin, rank, device_name, enum_flags, device_index, sink_caps, src_caps, bitrate_list); - done: +done: if (output_list) g_list_free_full (output_list, (GDestroyNotify) gst_mf_media_type_release); g_free (device_name); @@ -703,29 +715,10 @@ output_type.guidSubtype = MFAudioFormat_AAC; enum_params.category = MFT_CATEGORY_AUDIO_ENCODER; - enum_params.enum_flags = (MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_ASYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - enum_params.output_typeinfo = &output_type; - - /* register hardware encoders first (likey no hardware audio encoder) */ - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_aac_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); - - /* register software encoders */ enum_params.enum_flags = (MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); + enum_params.output_typeinfo = &output_type; + i = 0; do { enum_params.device_index = i++;
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfaudioenc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfaudioenc.cpp
Changed
@@ -26,7 +26,9 @@ #include <wrl.h> #include <string.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; +/* *INDENT-ON* */ GST_DEBUG_CATEGORY (gst_mf_audio_enc_debug); #define GST_CAT_DEFAULT gst_mf_audio_enc_debug @@ -35,14 +37,14 @@ G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstMFAudioEnc, gst_mf_audio_enc, GST_TYPE_AUDIO_ENCODER, GST_DEBUG_CATEGORY_INIT (gst_mf_audio_enc_debug, "mfaudioenc", 0, - "mfaudioenc")); + "mfaudioenc")); static gboolean gst_mf_audio_enc_open (GstAudioEncoder * enc); static gboolean gst_mf_audio_enc_close (GstAudioEncoder * enc); static gboolean gst_mf_audio_enc_set_format (GstAudioEncoder * enc, GstAudioInfo * info); static GstFlowReturn gst_mf_audio_enc_handle_frame (GstAudioEncoder * enc, - GstBuffer *buffer); + GstBuffer * buffer); static GstFlowReturn gst_mf_audio_enc_drain (GstAudioEncoder * enc); static void gst_mf_audio_enc_flush (GstAudioEncoder * enc); @@ -56,8 +58,7 @@ audioenc_class->set_format = GST_DEBUG_FUNCPTR (gst_mf_audio_enc_set_format); audioenc_class->handle_frame = GST_DEBUG_FUNCPTR (gst_mf_audio_enc_handle_frame); - audioenc_class->flush = - GST_DEBUG_FUNCPTR (gst_mf_audio_enc_flush); + audioenc_class->flush = GST_DEBUG_FUNCPTR (gst_mf_audio_enc_flush); gst_type_mark_as_plugin_api (GST_TYPE_MF_AUDIO_ENC, (GstPluginAPIFlags) 0); } @@ -112,8 +113,8 @@ { GstMFAudioEnc *self = GST_MF_AUDIO_ENC (enc); GstMFAudioEncClass *klass = GST_MF_AUDIO_ENC_GET_CLASS (enc); - ComPtr<IMFMediaType> in_type; - ComPtr<IMFMediaType> out_type; + ComPtr < IMFMediaType > in_type; + ComPtr < IMFMediaType > out_type; GST_DEBUG_OBJECT (self, "Set format"); @@ -130,7 +131,7 @@ return FALSE; } - gst_mf_dump_attributes (out_type.Get(), "Set output type", GST_LEVEL_DEBUG); + gst_mf_dump_attributes (out_type.Get (), "Set output type", GST_LEVEL_DEBUG); if (!gst_mf_transform_set_output_type (self->transform, out_type.Get ())) { GST_ERROR_OBJECT (self, "Couldn't set output type"); @@ -143,7 +144,7 @@ return FALSE; } - gst_mf_dump_attributes 
(in_type.Get(), "Set input type", GST_LEVEL_DEBUG); + gst_mf_dump_attributes (in_type.Get (), "Set input type", GST_LEVEL_DEBUG); if (!gst_mf_transform_set_input_type (self->transform, in_type.Get ())) { GST_ERROR_OBJECT (self, "Couldn't set input media type"); @@ -175,8 +176,8 @@ gst_mf_audio_enc_process_input (GstMFAudioEnc * self, GstBuffer * buffer) { HRESULT hr; - ComPtr<IMFSample> sample; - ComPtr<IMFMediaBuffer> media_buffer; + ComPtr < IMFSample > sample; + ComPtr < IMFMediaBuffer > media_buffer; BYTE *data; gboolean res = FALSE; GstMapInfo info; @@ -243,12 +244,12 @@ { GstMFAudioEncClass *klass = GST_MF_AUDIO_ENC_GET_CLASS (self); HRESULT hr; - BYTE *data; - ComPtr<IMFMediaBuffer> media_buffer; - ComPtr<IMFSample> sample; + BYTE *data = nullptr; + ComPtr < IMFMediaBuffer > media_buffer; + ComPtr < IMFSample > sample; GstBuffer *buffer; GstFlowReturn res = GST_FLOW_ERROR; - DWORD buffer_len; + DWORD buffer_len = 0; res = gst_mf_transform_get_output (self->transform, sample.GetAddressOf ()); @@ -263,6 +264,13 @@ if (!gst_mf_result (hr)) return GST_FLOW_ERROR; + /* Can happen while draining */ + if (buffer_len == 0 || !data) { + GST_DEBUG_OBJECT (self, "Empty media buffer"); + media_buffer->Unlock (); + return GST_FLOW_OK; + } + buffer = gst_audio_encoder_allocate_output_buffer (GST_AUDIO_ENCODER (self), buffer_len); gst_buffer_fill (buffer, 0, data, buffer_len); @@ -273,8 +281,7 @@ } static GstFlowReturn -gst_mf_audio_enc_handle_frame (GstAudioEncoder * enc, - GstBuffer *buffer) +gst_mf_audio_enc_handle_frame (GstAudioEncoder * enc, GstBuffer * buffer) { GstMFAudioEnc *self = GST_MF_AUDIO_ENC (enc); GstFlowReturn ret;
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfcapturewinrt.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfcapturewinrt.cpp
Changed
@@ -21,6 +21,7 @@ #include "config.h" #endif +#include <gst/base/base.h> #include <gst/video/video.h> #include "gstmfcapturewinrt.h" #include "gstmfutils.h" @@ -29,6 +30,7 @@ #include <memory> #include <algorithm> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; using namespace Microsoft::WRL::Wrappers; using namespace ABI::Windows::Media::MediaProperties; @@ -41,6 +43,7 @@ #define GST_CAT_DEFAULT gst_mf_source_object_debug G_END_DECLS +/* *INDENT-ON* */ enum { @@ -61,7 +64,7 @@ GMainLoop *loop; /* protected by lock */ - GQueue *queue; + GstQueueArray *queue; GstCaps *supported_caps; GstVideoInfo info; @@ -71,6 +74,12 @@ gpointer dispatcher; }; +typedef struct _GstMFCaptureWinRTFrame +{ + IMediaFrameReference *frame; + GstClockTime clock_time; +} GstMFCaptureWinRTFrame; + static void gst_mf_capture_winrt_constructed (GObject * object); static void gst_mf_capture_winrt_finalize (GObject * object); static void gst_mf_capture_winrt_get_property (GObject * object, guint prop_id, @@ -79,20 +88,22 @@ const GValue * value, GParamSpec * pspec); static gboolean gst_mf_capture_winrt_start (GstMFSourceObject * object); -static gboolean gst_mf_capture_winrt_stop (GstMFSourceObject * object); +static gboolean gst_mf_capture_winrt_stop (GstMFSourceObject * object); static GstFlowReturn gst_mf_capture_winrt_fill (GstMFSourceObject * object, GstBuffer * buffer); static gboolean gst_mf_capture_winrt_unlock (GstMFSourceObject * object); static gboolean gst_mf_capture_winrt_unlock_stop (GstMFSourceObject * object); -static GstCaps * gst_mf_capture_winrt_get_caps (GstMFSourceObject * object); +static GstCaps *gst_mf_capture_winrt_get_caps (GstMFSourceObject * object); static gboolean gst_mf_capture_winrt_set_caps (GstMFSourceObject * object, GstCaps * caps); -static HRESULT gst_mf_capture_winrt_on_frame (ISoftwareBitmap * bitmap, - void * user_data); -static HRESULT gst_mf_capture_winrt_on_failed (const std::string &error, - UINT32 error_code, void * user_data); +static HRESULT 
gst_mf_capture_winrt_on_frame (IMediaFrameReference * frame, + void *user_data); +static HRESULT gst_mf_capture_winrt_on_failed (const std::string & error, + UINT32 error_code, void *user_data); static gpointer gst_mf_capture_winrt_thread_func (GstMFCaptureWinRT * self); +static void +gst_mf_capture_winrt_frame_clear (GstMFCaptureWinRTFrame * winrt_frame); #define gst_mf_capture_winrt_parent_class parent_class G_DEFINE_TYPE (GstMFCaptureWinRT, gst_mf_capture_winrt, @@ -113,7 +124,7 @@ g_param_spec_pointer ("dispatcher", "Dispatcher", "ICoreDispatcher COM object to use", (GParamFlags) (G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | - G_PARAM_STATIC_STRINGS))); + G_PARAM_STATIC_STRINGS))); source_class->start = GST_DEBUG_FUNCPTR (gst_mf_capture_winrt_start); source_class->stop = GST_DEBUG_FUNCPTR (gst_mf_capture_winrt_stop); @@ -128,7 +139,10 @@ static void gst_mf_capture_winrt_init (GstMFCaptureWinRT * self) { - self->queue = g_queue_new (); + self->queue = + gst_queue_array_new_for_struct (sizeof (GstMFCaptureWinRTFrame), 2); + gst_queue_array_set_clear_func (self->queue, + (GDestroyNotify) gst_mf_capture_winrt_frame_clear); g_mutex_init (&self->lock); g_cond_init (&self->cond); } @@ -162,7 +176,7 @@ g_main_loop_unref (self->loop); g_main_context_unref (self->context); - g_queue_free (self->queue); + gst_queue_array_free (self->queue); gst_clear_caps (&self->supported_caps); g_mutex_clear (&self->lock); g_cond_clear (&self->cond); @@ -221,13 +235,13 @@ HRESULT hr; guint index; GSource *idle_source; - std::shared_ptr<GstWinRTMediaFrameSourceGroup> target_group; - std::vector<GstWinRTMediaFrameSourceGroup> group_list; + std::shared_ptr < GstWinRTMediaFrameSourceGroup > target_group; + std::vector < GstWinRTMediaFrameSourceGroup > group_list; MediaCaptureWrapperCallbacks callbacks; RoInitializeWrapper init_wrapper (RO_INIT_MULTITHREADED); - self->capture = new MediaCaptureWrapper(self->dispatcher); + self->capture = new MediaCaptureWrapper (self->dispatcher); 
callbacks.frame_arrived = gst_mf_capture_winrt_on_frame; callbacks.failed = gst_mf_capture_winrt_on_failed; self->capture->RegisterCb (callbacks, self); @@ -240,8 +254,9 @@ g_source_attach (idle_source, self->context); g_source_unref (idle_source); - hr = self->capture->EnumrateFrameSourceGroup(group_list); + hr = self->capture->EnumrateFrameSourceGroup (group_list); + /* *INDENT-OFF* */ #ifndef GST_DISABLE_GST_DEBUG index = 0; for (const auto& iter: group_list) { @@ -279,6 +294,7 @@ index++; } + /* *INDENT-ON* */ if (!target_group) { GST_WARNING_OBJECT (self, "No matching device"); @@ -290,15 +306,17 @@ goto run_loop; } - self->capture->SetSourceGroup(*target_group); + self->capture->SetSourceGroup (*target_group); std::sort (target_group->source_list_.begin (), target_group->source_list_.end (), WinRTCapsCompareFunc); self->supported_caps = gst_caps_new_empty (); + /* *INDENT-OFF* */ for (auto iter: target_group->source_list_) gst_caps_append (self->supported_caps, gst_caps_copy (iter.caps_)); + /* *INDENT-ON* */ GST_DEBUG_OBJECT (self, "Available output caps %" GST_PTR_FORMAT, self->supported_caps); @@ -306,10 +324,10 @@ source->opened = TRUE; g_free (source->device_path); - source->device_path = g_strdup (target_group->id_.c_str()); + source->device_path = g_strdup (target_group->id_.c_str ()); g_free (source->device_name); - source->device_name = g_strdup (target_group->display_name_.c_str()); + source->device_name = g_strdup (target_group->display_name_.c_str ()); source->device_index = index; @@ -339,7 +357,7 @@ return FALSE; } - hr = self->capture->StartCapture(); + hr = self->capture->StartCapture (); if (!gst_mf_result (hr)) { GST_ERROR_OBJECT (self, "Capture object doesn't want to start capture"); return FALSE; @@ -359,13 +377,9 @@ return FALSE; } - hr = self->capture->StopCapture(); + hr = self->capture->StopCapture (); - while (!g_queue_is_empty (self->queue)) { - ISoftwareBitmap *buffer = - (ISoftwareBitmap *) g_queue_pop_head (self->queue); - 
buffer->Release (); - } + gst_queue_array_clear (self->queue); if (!gst_mf_result (hr)) { GST_ERROR_OBJECT (self, "Capture object doesn't want to stop capture"); @@ -376,10 +390,10 @@ } static HRESULT -gst_mf_capture_winrt_on_frame (ISoftwareBitmap * bitmap, - void * user_data) +gst_mf_capture_winrt_on_frame (IMediaFrameReference * frame, void *user_data) { GstMFCaptureWinRT *self = GST_MF_CAPTURE_WINRT (user_data); + GstMFCaptureWinRTFrame winrt_frame; g_mutex_lock (&self->lock); if (self->flushing) { @@ -387,8 +401,11 @@ return S_OK; } - g_queue_push_tail (self->queue, bitmap); - bitmap->AddRef (); + winrt_frame.frame = frame; + winrt_frame.clock_time = + gst_mf_source_object_get_running_time (GST_MF_SOURCE_OBJECT (self)); + gst_queue_array_push_tail_struct (self->queue, &winrt_frame); + frame->AddRef (); g_cond_broadcast (&self->cond); g_mutex_unlock (&self->lock); @@ -397,12 +414,12 @@ } static HRESULT -gst_mf_capture_winrt_on_failed (const std::string &error, - UINT32 error_code, void * user_data) +gst_mf_capture_winrt_on_failed (const std::string & error, + UINT32 error_code, void *user_data) { GstMFCaptureWinRT *self = GST_MF_CAPTURE_WINRT (user_data); - GST_DEBUG_OBJECT (self, "Have error %s (%d)", error.c_str(), error_code); + GST_DEBUG_OBJECT (self, "Have error %s (%d)", error.c_str (), error_code); g_mutex_lock (&self->lock); self->got_error = TRUE; @@ -413,22 +430,19 @@ } static GstFlowReturn -gst_mf_capture_winrt_fill (GstMFSourceObject * object, GstBuffer * buffer) +gst_mf_capture_winrt_get_video_media_frame (GstMFCaptureWinRT * self, + IVideoMediaFrame ** media_frame, GstClockTime * timestamp, + GstClockTime * duration) { - GstMFCaptureWinRT *self = GST_MF_CAPTURE_WINRT (object); - GstFlowReturn ret = GST_FLOW_OK; + GstMFCaptureWinRTFrame *winrt_frame = nullptr; + IMediaFrameReference *frame_ref; HRESULT hr; - GstVideoFrame frame; - BYTE *data; - UINT32 size; - gint i, j; - ComPtr<ISoftwareBitmap> bitmap; - ComPtr<IBitmapBuffer> bitmap_buffer; - 
ComPtr<IMemoryBuffer> mem_buf; - ComPtr<IMemoryBufferReference> mem_ref; - ComPtr<Windows::Foundation::IMemoryBufferByteAccess> byte_access; - INT32 plane_count; - BitmapPlaneDescription desc[GST_VIDEO_MAX_PLANES]; + ComPtr < IReference < TimeSpan >> winrt_timestamp; + TimeSpan winrt_duration; + + *media_frame = nullptr; + *timestamp = GST_CLOCK_TIME_NONE; + *duration = GST_CLOCK_TIME_NONE; g_mutex_lock (&self->lock); if (self->got_error) { @@ -441,7 +455,8 @@ return GST_FLOW_FLUSHING; } - while (!self->flushing && !self->got_error && g_queue_is_empty (self->queue)) + while (!self->flushing && !self->got_error && + gst_queue_array_is_empty (self->queue)) g_cond_wait (&self->cond, &self->lock); if (self->got_error) { @@ -454,9 +469,67 @@ return GST_FLOW_FLUSHING; } - bitmap.Attach ((ISoftwareBitmap *) g_queue_pop_head (self->queue)); + winrt_frame = + (GstMFCaptureWinRTFrame *) gst_queue_array_pop_head_struct (self->queue); + + frame_ref = winrt_frame->frame; + g_assert (frame_ref); + + hr = frame_ref->get_VideoMediaFrame (media_frame); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get IVideoMediaFrame"); + *media_frame = nullptr; + goto done; + } + + hr = frame_ref->get_Duration (&winrt_duration); + if (gst_mf_result (hr)) + *duration = winrt_duration.Duration * 100; + + *timestamp = winrt_frame->clock_time; + +done: + gst_mf_capture_winrt_frame_clear (winrt_frame); g_mutex_unlock (&self->lock); + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_mf_capture_winrt_fill (GstMFSourceObject * object, GstBuffer * buffer) +{ + GstMFCaptureWinRT *self = GST_MF_CAPTURE_WINRT (object); + GstFlowReturn ret = GST_FLOW_OK; + HRESULT hr; + GstVideoFrame frame; + BYTE *data; + UINT32 size; + gint i, j; + ComPtr < IVideoMediaFrame > video_frame; + ComPtr < ISoftwareBitmap > bitmap; + ComPtr < IBitmapBuffer > bitmap_buffer; + ComPtr < IMemoryBuffer > mem_buf; + ComPtr < IMemoryBufferReference > mem_ref; + ComPtr < Windows::Foundation::IMemoryBufferByteAccess 
> byte_access; + INT32 plane_count; + BitmapPlaneDescription desc[GST_VIDEO_MAX_PLANES]; + GstClockTime timestamp = GST_CLOCK_TIME_NONE; + GstClockTime duration = GST_CLOCK_TIME_NONE; + + do { + ret = gst_mf_capture_winrt_get_video_media_frame (self, + video_frame.ReleaseAndGetAddressOf (), &timestamp, &duration); + } while (ret == GST_FLOW_OK && !video_frame); + + if (ret != GST_FLOW_OK) + return ret; + + hr = video_frame->get_SoftwareBitmap (&bitmap); + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (self, "Couldn't get ISoftwareBitmap"); + return GST_FLOW_ERROR; + } + hr = bitmap->LockBuffer (BitmapBufferAccessMode::BitmapBufferAccessMode_Read, &bitmap_buffer); if (!gst_mf_result (hr)) { @@ -500,7 +573,7 @@ return GST_FLOW_ERROR; } - hr = mem_ref.As(&byte_access); + hr = mem_ref.As (&byte_access); if (!gst_mf_result (hr)) { GST_ERROR_OBJECT (self, "Cannot get IMemoryBufferByteAccess"); return GST_FLOW_ERROR; @@ -544,6 +617,10 @@ gst_video_frame_unmap (&frame); + GST_BUFFER_PTS (buffer) = timestamp; + GST_BUFFER_DTS (buffer) = GST_CLOCK_TIME_NONE; + GST_BUFFER_DURATION (buffer) = duration; + return ret; } @@ -594,6 +671,7 @@ return NULL; } +/* *INDENT-OFF* */ static gboolean gst_mf_capture_winrt_set_caps (GstMFSourceObject * object, GstCaps * caps) { @@ -629,13 +707,27 @@ return TRUE; } +/* *INDENT-ON* */ + +static void +gst_mf_capture_winrt_frame_clear (GstMFCaptureWinRTFrame * winrt_frame) +{ + if (!winrt_frame) + return; + + if (winrt_frame->frame) + winrt_frame->frame->Release (); + + winrt_frame->frame = nullptr; + winrt_frame->clock_time = GST_CLOCK_TIME_NONE; +} GstMFSourceObject * gst_mf_capture_winrt_new (GstMFSourceType type, gint device_index, const gchar * device_name, const gchar * device_path, gpointer dispatcher) { GstMFSourceObject *self; - ComPtr<ICoreDispatcher> core_dispatcher; + ComPtr < ICoreDispatcher > core_dispatcher; /* Multiple COM init is allowed */ RoInitializeWrapper init_wrapper (RO_INIT_MULTITHREADED);
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfdevice.c -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfdevice.c
Changed
@@ -29,6 +29,22 @@ #include "gstmfdevice.h" +#if GST_MF_WINAPI_DESKTOP +#include "gstwin32devicewatcher.h" + +#ifndef INITGUID +#include <initguid.h> +#endif + +#include <dbt.h> +DEFINE_GUID (GST_KSCATEGORY_CAPTURE, 0x65E8773DL, 0x8F56, + 0x11D0, 0xA3, 0xB9, 0x00, 0xA0, 0xC9, 0x22, 0x31, 0x96); +#endif + +#if GST_MF_WINAPI_APP +#include <gst/winrt/gstwinrt.h> +#endif + GST_DEBUG_CATEGORY_EXTERN (gst_mf_debug); #define GST_CAT_DEFAULT gst_mf_debug @@ -136,12 +152,52 @@ struct _GstMFDeviceProvider { GstDeviceProvider parent; + + GstObject *watcher; + + GMutex lock; + GCond cond; + + gboolean enum_completed; }; G_DEFINE_TYPE (GstMFDeviceProvider, gst_mf_device_provider, GST_TYPE_DEVICE_PROVIDER); +static void gst_mf_device_provider_dispose (GObject * object); +static void gst_mf_device_provider_finalize (GObject * object); + static GList *gst_mf_device_provider_probe (GstDeviceProvider * provider); +static gboolean gst_mf_device_provider_start (GstDeviceProvider * provider); +static void gst_mf_device_provider_stop (GstDeviceProvider * provider); + +#if GST_MF_WINAPI_DESKTOP +static gboolean gst_mf_device_provider_start_win32 (GstDeviceProvider * self); +static void gst_mf_device_provider_device_changed (GstWin32DeviceWatcher * + watcher, WPARAM wparam, LPARAM lparam, gpointer user_data); +#endif + +#if GST_MF_WINAPI_APP +static gboolean gst_mf_device_provider_start_winrt (GstDeviceProvider * self); +static void +gst_mf_device_provider_device_added (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformation * info, + gpointer user_data); +static void +gst_mf_device_provider_device_updated (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data); +static void gst_mf_device_provider_device_removed (GstWinRTDeviceWatcher * + watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data); +static void 
+gst_mf_device_provider_device_enum_completed (GstWinRTDeviceWatcher * + watcher, gpointer user_data); +#endif + +static void +gst_mf_device_provider_on_device_updated (GstMFDeviceProvider * self); static void gst_mf_device_provider_class_init (GstMFDeviceProviderClass * klass) @@ -149,6 +205,8 @@ GstDeviceProviderClass *provider_class = GST_DEVICE_PROVIDER_CLASS (klass); provider_class->probe = GST_DEBUG_FUNCPTR (gst_mf_device_provider_probe); + provider_class->start = GST_DEBUG_FUNCPTR (gst_mf_device_provider_start); + provider_class->stop = GST_DEBUG_FUNCPTR (gst_mf_device_provider_stop); gst_device_provider_class_set_static_metadata (provider_class, "Media Foundation Device Provider", @@ -157,8 +215,54 @@ } static void -gst_mf_device_provider_init (GstMFDeviceProvider * provider) +gst_mf_device_provider_init (GstMFDeviceProvider * self) +{ +#if GST_MF_WINAPI_DESKTOP + GstWin32DeviceWatcherCallbacks win32_callbacks; + + win32_callbacks.device_changed = gst_mf_device_provider_device_changed; + self->watcher = (GstObject *) + gst_win32_device_watcher_new (DBT_DEVTYP_DEVICEINTERFACE, + &GST_KSCATEGORY_CAPTURE, &win32_callbacks, self); +#endif +#if GST_MF_WINAPI_APP + if (!self->watcher) { + GstWinRTDeviceWatcherCallbacks winrt_callbacks; + winrt_callbacks.added = gst_mf_device_provider_device_added; + winrt_callbacks.updated = gst_mf_device_provider_device_updated; + winrt_callbacks.removed = gst_mf_device_provider_device_removed; + winrt_callbacks.enumeration_completed = + gst_mf_device_provider_device_enum_completed; + + self->watcher = (GstObject *) + gst_winrt_device_watcher_new (GST_WINRT_DEVICE_CLASS_VIDEO_CAPTURE, + &winrt_callbacks, self); + } +#endif + + g_mutex_init (&self->lock); + g_cond_init (&self->cond); +} + +static void +gst_mf_device_provider_dispose (GObject * object) { + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (object); + + gst_clear_object (&self->watcher); + + G_OBJECT_CLASS (gst_mf_device_provider_parent_class)->dispose (object); 
+} + +static void +gst_mf_device_provider_finalize (GObject * object) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (object); + + g_mutex_clear (&self->lock); + g_cond_clear (&self->cond); + + G_OBJECT_CLASS (gst_mf_device_provider_parent_class)->finalize (object); } static GList * @@ -223,3 +327,264 @@ return list; } + +#if GST_MF_WINAPI_DESKTOP +static gboolean +gst_mf_device_provider_start_win32 (GstDeviceProvider * provider) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (provider); + GstWin32DeviceWatcher *watcher; + GList *devices = NULL; + GList *iter; + + if (!GST_IS_WIN32_DEVICE_WATCHER (self->watcher)) + return FALSE; + + GST_DEBUG_OBJECT (self, "Starting Win32 watcher"); + + watcher = GST_WIN32_DEVICE_WATCHER (self->watcher); + + devices = gst_mf_device_provider_probe (provider); + if (devices) { + for (iter = devices; iter; iter = g_list_next (iter)) { + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + } + + g_list_free (devices); + } + + return gst_win32_device_watcher_start (watcher); +} +#endif + +#if GST_MF_WINAPI_APP +static gboolean +gst_mf_device_provider_start_winrt (GstDeviceProvider * provider) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (provider); + GstWinRTDeviceWatcher *watcher; + GList *devices = NULL; + GList *iter; + + if (!GST_IS_WINRT_DEVICE_WATCHER (self->watcher)) + return FALSE; + + GST_DEBUG_OBJECT (self, "Starting WinRT watcher"); + watcher = GST_WINRT_DEVICE_WATCHER (self->watcher); + + self->enum_completed = FALSE; + + if (!gst_winrt_device_watcher_start (watcher)) + return FALSE; + + /* Wait for initial enumeration to be completed */ + g_mutex_lock (&self->lock); + while (!self->enum_completed) + g_cond_wait (&self->cond, &self->lock); + + devices = gst_mf_device_provider_probe (provider); + if (devices) { + for (iter = devices; iter; iter = g_list_next (iter)) { + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + } + + g_list_free (devices); + } + 
g_mutex_unlock (&self->lock); + + return TRUE; +} +#endif + +static gboolean +gst_mf_device_provider_start (GstDeviceProvider * provider) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (provider); + gboolean ret = FALSE; + + if (!self->watcher) { + GST_ERROR_OBJECT (self, "DeviceWatcher object wasn't configured"); + return FALSE; + } +#if GST_MF_WINAPI_DESKTOP + ret = gst_mf_device_provider_start_win32 (provider); +#endif + +#if GST_MF_WINAPI_APP + if (!ret) + ret = gst_mf_device_provider_start_winrt (provider); +#endif + + return ret; +} + +static void +gst_mf_device_provider_stop (GstDeviceProvider * provider) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (provider); + + if (self->watcher) { +#if GST_MF_WINAPI_DESKTOP + if (GST_IS_WIN32_DEVICE_WATCHER (self->watcher)) { + gst_win32_device_watcher_stop (GST_WIN32_DEVICE_WATCHER (self->watcher)); + } +#endif +#if GST_MF_WINAPI_APP + if (GST_IS_WINRT_DEVICE_WATCHER (self->watcher)) { + gst_winrt_device_watcher_stop (GST_WINRT_DEVICE_WATCHER (self->watcher)); + } +#endif + } +} + +static gboolean +gst_mf_device_is_in_list (GList * list, GstDevice * device) +{ + GList *iter; + GstStructure *s; + const gchar *device_id; + gboolean found = FALSE; + + s = gst_device_get_properties (device); + g_assert (s); + + device_id = gst_structure_get_string (s, "device.path"); + g_assert (device_id); + + for (iter = list; iter; iter = g_list_next (iter)) { + GstStructure *other_s; + const gchar *other_id; + + other_s = gst_device_get_properties (GST_DEVICE (iter->data)); + g_assert (other_s); + + other_id = gst_structure_get_string (other_s, "device.path"); + g_assert (other_id); + + if (g_ascii_strcasecmp (device_id, other_id) == 0) { + found = TRUE; + } + + gst_structure_free (other_s); + if (found) + break; + } + + gst_structure_free (s); + + return found; +} + +static void +gst_mf_device_provider_update_devices (GstMFDeviceProvider * self) +{ + GstDeviceProvider *provider = GST_DEVICE_PROVIDER_CAST (self); + 
GList *prev_devices = NULL; + GList *new_devices = NULL; + GList *to_add = NULL; + GList *to_remove = NULL; + GList *iter; + + GST_OBJECT_LOCK (self); + prev_devices = g_list_copy_deep (provider->devices, + (GCopyFunc) gst_object_ref, NULL); + GST_OBJECT_UNLOCK (self); + + new_devices = gst_mf_device_provider_probe (provider); + + /* Ownership of GstDevice for gst_device_provider_device_add() + * and gst_device_provider_device_remove() is a bit complicated. + * Remove floating reference here for things to be clear */ + for (iter = new_devices; iter; iter = g_list_next (iter)) + gst_object_ref_sink (iter->data); + + /* Check newly added devices */ + for (iter = new_devices; iter; iter = g_list_next (iter)) { + if (!gst_mf_device_is_in_list (prev_devices, GST_DEVICE (iter->data))) { + to_add = g_list_prepend (to_add, gst_object_ref (iter->data)); + } + } + + /* Check removed device */ + for (iter = prev_devices; iter; iter = g_list_next (iter)) { + if (!gst_mf_device_is_in_list (new_devices, GST_DEVICE (iter->data))) { + to_remove = g_list_prepend (to_remove, gst_object_ref (iter->data)); + } + } + + for (iter = to_remove; iter; iter = g_list_next (iter)) + gst_device_provider_device_remove (provider, GST_DEVICE (iter->data)); + + for (iter = to_add; iter; iter = g_list_next (iter)) + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + + if (prev_devices) + g_list_free_full (prev_devices, (GDestroyNotify) gst_object_unref); + + if (to_add) + g_list_free_full (to_add, (GDestroyNotify) gst_object_unref); + + if (to_remove) + g_list_free_full (to_remove, (GDestroyNotify) gst_object_unref); +} + +#if GST_MF_WINAPI_DESKTOP +static void +gst_mf_device_provider_device_changed (GstWin32DeviceWatcher * watcher, + WPARAM wparam, LPARAM lparam, gpointer user_data) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (user_data); + + if (wparam == DBT_DEVICEARRIVAL || wparam == DBT_DEVICEREMOVECOMPLETE) { + gst_mf_device_provider_update_devices (self); + } 
+} +#endif + +#if GST_MF_WINAPI_APP +static void +gst_mf_device_provider_device_added (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformation * info, + gpointer user_data) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (user_data); + + if (self->enum_completed) + gst_mf_device_provider_update_devices (self); +} + +static void +gst_mf_device_provider_device_removed (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (user_data); + + if (self->enum_completed) + gst_mf_device_provider_update_devices (self); +} + + +static void +gst_mf_device_provider_device_updated (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (user_data); + + gst_mf_device_provider_update_devices (self); +} + +static void +gst_mf_device_provider_device_enum_completed (GstWinRTDeviceWatcher * + watcher, gpointer user_data) +{ + GstMFDeviceProvider *self = GST_MF_DEVICE_PROVIDER (user_data); + + g_mutex_lock (&self->lock); + GST_DEBUG_OBJECT (self, "Enumeration completed"); + self->enum_completed = TRUE; + g_cond_signal (&self->cond); + g_mutex_unlock (&self->lock); +} +#endif
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfdevice.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfdevice.h
Changed
@@ -31,9 +31,6 @@ G_DECLARE_FINAL_TYPE (GstMFDeviceProvider, gst_mf_device_provider, GST, MF_DEVICE_PROVIDER, GstDeviceProvider); -G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstMFDevice, gst_object_unref) -G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstMFDeviceProvider, gst_object_unref) - G_END_DECLS -#endif /* __GST_MF_DEVICE_H__ */ \ No newline at end of file +#endif /* __GST_MF_DEVICE_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfh264enc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfh264enc.cpp
Changed
@@ -35,13 +35,21 @@ #include "config.h" #endif +#include "gstmfconfig.h" + #include <gst/gst.h> #include <gst/pbutils/pbutils.h> #include "gstmfvideoenc.h" #include "gstmfh264enc.h" #include <wrl.h> +#if GST_MF_HAVE_D3D11 +#include <gst/d3d11/gstd3d11.h> +#endif + +/* *INDENT-OFF* */ using namespace Microsoft::WRL; +/* *INDENT-ON* */ GST_DEBUG_CATEGORY (gst_mf_h264_enc_debug); #define GST_CAT_DEFAULT gst_mf_h264_enc_debug @@ -154,6 +162,8 @@ PROP_QP_P, PROP_QP_B, PROP_REF, + PROP_D3D11_AWARE, + PROP_ADAPTER_LUID, }; #define DEFAULT_BITRATE (2 * 1024) @@ -167,7 +177,7 @@ #define DEFAULT_SPS_ID 0 #define DEFAULT_PPS_ID 0 #define DEFAULT_BFRAMES 0 -#define DEFAULT_GOP_SIZE 0 +#define DEFAULT_GOP_SIZE -1 #define DEFAULT_THREADS 0 #define DEFAULT_CONTENT_TYPE GST_MF_H264_ENC_CONTENT_TYPE_UNKNOWN #define DEFAULT_QP 24 @@ -179,44 +189,11 @@ #define DEFAULT_QP_B 26 #define DEFAULT_REF 2 -#define GST_MF_H264_ENC_GET_CLASS(obj) \ - (G_TYPE_INSTANCE_GET_CLASS((obj), G_TYPE_FROM_INSTANCE (obj), GstMFH264EncClass)) - -typedef struct _GstMFH264EncDeviceCaps -{ - /* if CodecAPI is available */ - gboolean rc_mode; /* AVEncCommonRateControlMode */ - gboolean quality; /* AVEncCommonQuality */ - - gboolean adaptive_mode; /* AVEncAdaptiveMode */ - gboolean buffer_size; /* AVEncCommonBufferSize */ - gboolean max_bitrate; /* AVEncCommonMaxBitRate */ - gboolean quality_vs_speed; /* AVEncCommonQualityVsSpeed */ - gboolean cabac; /* AVEncH264CABACEnable */ - gboolean sps_id; /* AVEncH264SPSID */ - gboolean pps_id; /* AVEncH264PPSID */ - gboolean bframes; /* AVEncMPVDefaultBPictureCount */ - gboolean gop_size; /* AVEncMPVGOPSize */ - gboolean threads; /* AVEncNumWorkerThreads */ - gboolean content_type; /* AVEncVideoContentType */ - gboolean qp; /* AVEncVideoEncodeQP */ - gboolean force_keyframe; /* AVEncVideoForceKeyFrame */ - gboolean low_latency; /* AVLowLatencyMode */ - - /* since Windows 8.1 */ - gboolean min_qp; /* AVEncVideoMinQP */ - gboolean max_qp; /* AVEncVideoMaxQP */ - gboolean 
frame_type_qp; /* AVEncVideoEncodeFrameTypeQP */ - gboolean max_num_ref; /* AVEncVideoMaxNumRefFrame */ - guint max_num_ref_high; - guint max_num_ref_low; -} GstMFH264EncDeviceCaps; - typedef struct _GstMFH264Enc { GstMFVideoEnc parent; - /* properteies */ + /* properties */ guint bitrate; /* device dependent properties */ @@ -230,7 +207,7 @@ guint sps_id; guint pps_id; guint bframes; - guint gop_size; + gint gop_size; guint threads; guint content_type; guint qp; @@ -241,34 +218,23 @@ guint qp_p; guint qp_b; guint max_num_ref; + gchar *profile_str; } GstMFH264Enc; typedef struct _GstMFH264EncClass { GstMFVideoEncClass parent_class; - - GstMFH264EncDeviceCaps device_caps; } GstMFH264EncClass; -typedef struct -{ - GstCaps *sink_caps; - GstCaps *src_caps; - gchar *device_name; - guint32 enum_flags; - guint device_index; - GstMFH264EncDeviceCaps device_caps; - gboolean is_default; -} GstMFH264EncClassData; - static GstElementClass *parent_class = NULL; +static void gst_mf_h264_enc_finalize (GObject * object); static void gst_mf_h264_enc_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static void gst_mf_h264_enc_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static gboolean gst_mf_h264_enc_set_option (GstMFVideoEnc * mfenc, - IMFMediaType * output_type); + GstVideoCodecState * state, IMFMediaType * output_type); static gboolean gst_mf_h264_enc_set_src_caps (GstMFVideoEnc * mfenc, GstVideoCodecState * state, IMFMediaType * output_type); @@ -278,14 +244,14 @@ GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *element_class = GST_ELEMENT_CLASS (klass); GstMFVideoEncClass *mfenc_class = GST_MF_VIDEO_ENC_CLASS (klass); - GstMFH264EncClassData *cdata = (GstMFH264EncClassData *) data; - GstMFH264EncDeviceCaps *device_caps = &cdata->device_caps; + GstMFVideoEncClassData *cdata = (GstMFVideoEncClassData *) data; + GstMFVideoEncDeviceCaps *device_caps = &cdata->device_caps; gchar 
*long_name; gchar *classification; parent_class = (GstElementClass *) g_type_class_peek_parent (klass); - klass->device_caps = *device_caps; + gobject_class->finalize = gst_mf_h264_enc_finalize; gobject_class->get_property = gst_mf_h264_enc_get_property; gobject_class->set_property = gst_mf_h264_enc_set_property; @@ -300,7 +266,7 @@ "Rate Control Mode", GST_TYPE_MF_H264_ENC_RC_MODE, DEFAULT_RC_MODE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { @@ -317,7 +283,7 @@ "Quality applied when rc-mode is qvbr", 1, 100, DEFAULT_QUALITY_LEVEL, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->adaptive_mode) { @@ -326,12 +292,12 @@ "Adaptive Mode", GST_TYPE_MF_H264_ENC_ADAPTIVE_MODE, DEFAULT_ADAPTIVE_MODE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { gst_type_mark_as_plugin_api (GST_TYPE_MF_H264_ENC_ADAPTIVE_MODE, - (GstPluginAPIFlags) 0); + (GstPluginAPIFlags) 0); } } @@ -341,7 +307,7 @@ "VBV(HRD) Buffer Size in bytes (0 = MFT default)", 0, G_MAXUINT - 1, DEFAULT_BUFFER_SIZE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->max_bitrate) { @@ -350,7 +316,7 @@ "The maximum bitrate applied when rc-mode is \"pcvbr\" in kbit/sec", 0, (G_MAXUINT >> 10), DEFAULT_MAX_BITRATE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->quality_vs_speed) { @@ 
-360,7 +326,7 @@ "[34, 66]: Medium complexity, [67, 100]: High complexity", 0, 100, DEFAULT_QUALITY_VS_SPEED, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->cabac) { @@ -369,7 +335,7 @@ "Enable CABAC entropy coding", DEFAULT_CABAC, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->sps_id) { @@ -378,7 +344,7 @@ "The SPS id to use", 0, 31, DEFAULT_SPS_ID, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->pps_id) { @@ -387,7 +353,7 @@ "The PPS id to use", 0, 255, DEFAULT_PPS_ID, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->bframes) { @@ -396,16 +362,18 @@ "The maximum number of consecutive B frames", 0, 2, DEFAULT_BFRAMES, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->gop_size) { g_object_class_install_property (gobject_class, PROP_GOP_SIZE, - g_param_spec_uint ("gop-size", "GOP size", - "The number of pictures from one GOP header to the next, " - "(0 = MFT default)", 0, G_MAXUINT - 1, DEFAULT_GOP_SIZE, + g_param_spec_int ("gop-size", "GOP size", + "The number of pictures from one GOP header to the next. 
" + "Depending on GPU vendor implementation, zero gop-size might " + "produce only one keyframe at the beginning (-1 for automatic)", + -1, G_MAXINT, DEFAULT_GOP_SIZE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->threads) { @@ -414,7 +382,7 @@ "The number of worker threads used by a encoder, (0 = MFT default)", 0, 16, DEFAULT_THREADS, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->content_type) { @@ -423,7 +391,7 @@ "Indicates the type of video content", GST_TYPE_MF_H264_ENC_CONTENT_TYPE, DEFAULT_CONTENT_TYPE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { @@ -438,7 +406,7 @@ "QP applied when rc-mode is \"qvbr\"", 16, 51, DEFAULT_QP, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->low_latency) { @@ -447,7 +415,7 @@ "Enable low latency encoding", DEFAULT_LOW_LATENCY, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->min_qp) { @@ -456,7 +424,7 @@ "The minimum allowed QP applied to all rc-mode", 0, 51, DEFAULT_MIN_QP, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->max_qp) { @@ -465,7 +433,7 @@ "The maximum allowed QP applied to all rc-mode", 0, 51, DEFAULT_MAX_QP, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | 
G_PARAM_STATIC_STRINGS))); } if (device_caps->frame_type_qp) { @@ -474,21 +442,21 @@ "QP applied to I frames", 0, 51, DEFAULT_QP_I, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); g_object_class_install_property (gobject_class, PROP_QP_P, g_param_spec_uint ("qp-p", "QP P", "QP applied to P frames", 0, 51, DEFAULT_QP_P, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); g_object_class_install_property (gobject_class, PROP_QP_B, g_param_spec_uint ("qp-b", "QP B", "QP applied to B frames", 0, 51, DEFAULT_QP_B, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->max_num_ref) { @@ -498,13 +466,42 @@ device_caps->max_num_ref_low, device_caps->max_num_ref_high, DEFAULT_REF, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + } + + /** + * GstMFH264Enc:d3d11-aware: + * + * Whether element supports Direct3D11 texture as an input or not + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_D3D11_AWARE, + g_param_spec_boolean ("d3d11-aware", "D3D11 Aware", + "Whether device can support Direct3D11 interop", + device_caps->d3d11_aware, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + /** + * GstMFH264Enc:adapter-luid: + * + * DXGI Adapter LUID for this elemenet + * + * Since: 1.20 + */ + if (device_caps->d3d11_aware) { + g_object_class_install_property (gobject_class, PROP_ADAPTER_LUID, + g_param_spec_int64 ("adapter-luid", "Adapter LUID", + "DXGI Adapter LUID (Locally Unique Identifier) of created device", + G_MININT64, G_MAXINT64, device_caps->adapter_luid, + (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | + 
G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); } long_name = g_strdup_printf ("Media Foundation %s", cdata->device_name); classification = g_strdup_printf ("Codec/Encoder/Video%s", (cdata->enum_flags & MFT_ENUM_FLAG_HARDWARE) == MFT_ENUM_FLAG_HARDWARE ? - "/Hardware" : ""); + "/Hardware" : ""); gst_element_class_set_metadata (element_class, long_name, classification, "Microsoft Media Foundation H.264 Encoder", @@ -525,7 +522,7 @@ mfenc_class->codec_id = MFVideoFormat_H264; mfenc_class->enum_flags = cdata->enum_flags; mfenc_class->device_index = cdata->device_index; - mfenc_class->can_force_keyframe = device_caps->force_keyframe; + mfenc_class->device_caps = *device_caps; g_free (cdata->device_name); gst_caps_unref (cdata->sink_caps); @@ -560,10 +557,21 @@ } static void +gst_mf_h264_enc_finalize (GObject * object) +{ + GstMFH264Enc *self = (GstMFH264Enc *) (object); + + g_free (self->profile_str); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void gst_mf_h264_enc_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstMFH264Enc *self = (GstMFH264Enc *) (object); + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (object); switch (prop_id) { case PROP_BITRATE: @@ -600,7 +608,7 @@ g_value_set_uint (value, self->bframes); break; case PROP_GOP_SIZE: - g_value_set_uint (value, self->gop_size); + g_value_set_int (value, self->gop_size); break; case PROP_THREADS: g_value_set_uint (value, self->threads); @@ -632,6 +640,12 @@ case PROP_REF: g_value_set_uint (value, self->max_num_ref); break; + case PROP_D3D11_AWARE: + g_value_set_boolean (value, klass->device_caps.d3d11_aware); + break; + case PROP_ADAPTER_LUID: + g_value_set_int64 (value, klass->device_caps.adapter_luid); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -679,7 +693,7 @@ self->bframes = g_value_get_uint (value); break; case PROP_GOP_SIZE: - self->gop_size = g_value_get_uint (value); + self->gop_size = 
g_value_get_int (value); break; case PROP_THREADS: self->threads = g_value_get_uint (value); @@ -768,17 +782,21 @@ } G_STMT_END static gboolean -gst_mf_h264_enc_set_option (GstMFVideoEnc * mfenc, IMFMediaType * output_type) +gst_mf_h264_enc_set_option (GstMFVideoEnc * mfenc, GstVideoCodecState * state, + IMFMediaType * output_type) { GstMFH264Enc *self = (GstMFH264Enc *) mfenc; - GstMFH264EncClass *klass = GST_MF_H264_ENC_GET_CLASS (self); - GstMFH264EncDeviceCaps *device_caps = &klass->device_caps; + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (mfenc); + GstMFVideoEncDeviceCaps *device_caps = &klass->device_caps; HRESULT hr; GstCaps *allowed_caps, *template_caps; - guint selected_profile = eAVEncH264VProfile_Main; + eAVEncH264VProfile selected_profile = eAVEncH264VProfile_Main; gint level_idc = -1; GstMFTransform *transform = mfenc->transform; + g_free (self->profile_str); + self->profile_str = g_strdup ("main"); + template_caps = gst_pad_get_pad_template_caps (GST_VIDEO_ENCODER_SRC_PAD (self)); allowed_caps = gst_pad_get_allowed_caps (GST_VIDEO_ENCODER_SRC_PAD (self)); @@ -802,12 +820,21 @@ profile = gst_structure_get_string (s, "profile"); if (profile) { - if (!strcmp (profile, "baseline")) { + /* Although we are setting eAVEncH264VProfile_Base, actual profile + * chosen by MFT seems to be constrained-baseline */ + if (strcmp (profile, "baseline") == 0 || + strcmp (profile, "constrained-baseline") == 0) { selected_profile = eAVEncH264VProfile_Base; + g_free (self->profile_str); + self->profile_str = g_strdup (profile); } else if (g_str_has_prefix (profile, "high")) { selected_profile = eAVEncH264VProfile_High; + g_free (self->profile_str); + self->profile_str = g_strdup (profile); } else if (g_str_has_prefix (profile, "main")) { selected_profile = eAVEncH264VProfile_Main; + g_free (self->profile_str); + self->profile_str = g_strdup (profile); } } @@ -856,8 +883,7 @@ if (device_caps->adaptive_mode) { guint adaptive_mode; - adaptive_mode = - 
gst_mf_h264_enc_adaptive_mode_to_enum (self->adaptive_mode); + adaptive_mode = gst_mf_h264_enc_adaptive_mode_to_enum (self->adaptive_mode); if (adaptive_mode != G_MAXUINT) { hr = gst_mf_transform_set_codec_api_uint32 (transform, &CODECAPI_AVEncAdaptiveMode, adaptive_mode); @@ -880,8 +906,7 @@ if (device_caps->quality_vs_speed) { hr = gst_mf_transform_set_codec_api_uint32 (transform, - &CODECAPI_AVEncCommonQualityVsSpeed, - self->quality_vs_speed); + &CODECAPI_AVEncCommonQualityVsSpeed, self->quality_vs_speed); WARNING_HR (hr, CODECAPI_AVEncCommonQualityVsSpeed); } @@ -903,15 +928,36 @@ WARNING_HR (hr, CODECAPI_AVEncH264PPSID); } + mfenc->has_reorder_frame = FALSE; if (device_caps->bframes && selected_profile != eAVEncH264VProfile_Base) { hr = gst_mf_transform_set_codec_api_uint32 (transform, &CODECAPI_AVEncMPVDefaultBPictureCount, self->bframes); + if (SUCCEEDED (hr) && self->bframes > 0) + mfenc->has_reorder_frame = TRUE; + WARNING_HR (hr, CODECAPI_AVEncMPVDefaultBPictureCount); } if (device_caps->gop_size) { + GstVideoInfo *info = &state->info; + gint gop_size = self->gop_size; + gint fps_n, fps_d; + + /* Set default value (10 sec or 250 frames) like that of x264enc */ + if (gop_size < 0) { + fps_n = GST_VIDEO_INFO_FPS_N (info); + fps_d = GST_VIDEO_INFO_FPS_D (info); + if (fps_n <= 0 || fps_d <= 0) { + gop_size = 250; + } else { + gop_size = 10 * fps_n / fps_d; + } + + GST_DEBUG_OBJECT (self, "Update GOP size to %d", gop_size); + } + hr = gst_mf_transform_set_codec_api_uint32 (transform, - &CODECAPI_AVEncMPVGOPSize, self->gop_size); + &CODECAPI_AVEncMPVGOPSize, gop_size); WARNING_HR (hr, CODECAPI_AVEncMPVGOPSize); } @@ -989,7 +1035,8 @@ s = gst_caps_get_structure (out_caps, 0); gst_structure_set (s, "stream-format", G_TYPE_STRING, "byte-stream", - "alignment", G_TYPE_STRING, "au", NULL); + "alignment", G_TYPE_STRING, "au", "profile", + G_TYPE_STRING, self->profile_str, NULL); out_state = gst_video_encoder_set_output_state (GST_VIDEO_ENCODER (self), out_caps, 
state); @@ -1010,18 +1057,10 @@ return TRUE; } -static void -gst_mf_h264_enc_register (GstPlugin * plugin, guint rank, - const gchar * device_name, const GstMFH264EncDeviceCaps * device_caps, - guint32 enum_flags, guint device_index, - GstCaps * sink_caps, GstCaps * src_caps) +void +gst_mf_h264_enc_plugin_init (GstPlugin * plugin, guint rank, + GList * d3d11_device) { - GType type; - gchar *type_name; - gchar *feature_name; - gint i; - GstMFH264EncClassData *cdata; - gboolean is_default = TRUE; GTypeInfo type_info = { sizeof (GstMFH264EncClass), NULL, @@ -1033,386 +1072,9 @@ 0, (GInstanceInitFunc) gst_mf_h264_enc_init, }; - - cdata = g_new0 (GstMFH264EncClassData, 1); - cdata->sink_caps = sink_caps; - cdata->src_caps = src_caps; - cdata->device_name = g_strdup (device_name); - cdata->device_caps = *device_caps; - cdata->enum_flags = enum_flags; - cdata->device_index = device_index; - type_info.class_data = cdata; - - type_name = g_strdup ("GstMFH264Enc"); - feature_name = g_strdup ("mfh264enc"); - - i = 1; - while (g_type_from_name (type_name) != 0) { - g_free (type_name); - g_free (feature_name); - type_name = g_strdup_printf ("GstMFH264Device%dEnc", i); - feature_name = g_strdup_printf ("mfh264device%denc", i); - is_default = FALSE; - i++; - } - - cdata->is_default = is_default; - - type = - g_type_register_static (GST_TYPE_MF_VIDEO_ENC, type_name, &type_info, - (GTypeFlags) 0); - - /* make lower rank than default device */ - if (rank > 0 && !is_default) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -} - -typedef struct -{ - guint width; - guint height; -} GstMFH264EncResolution; - -typedef struct -{ - eAVEncH264VProfile profile; - const gchar *profile_str; -} GStMFH264EncProfileMap; - -static void -gst_mf_h264_enc_plugin_init_internal (GstPlugin * plugin, guint rank, - GstMFTransform * transform, guint device_index, guint32 
enum_flags) -{ - HRESULT hr; - MFT_REGISTER_TYPE_INFO *infos; - UINT32 info_size; - gint i; - GstCaps *src_caps = NULL; - GstCaps *sink_caps = NULL; - GValue *supported_formats = NULL; - gboolean have_I420 = FALSE; - gchar *device_name = NULL; - GstMFH264EncDeviceCaps device_caps = { 0, }; - IMFActivate *activate; - IMFTransform *encoder; - ICodecAPI *codec_api; - ComPtr<IMFMediaType> out_type; - GstMFH264EncResolution resolutions_to_check[] = { - {1920, 1088}, {2560, 1440}, {3840, 2160}, {4096, 2160}, {8192, 4320} - }; - guint max_width = 0; - guint max_height = 0; - guint resolution; - GStMFH264EncProfileMap profiles_to_check[] = { - { eAVEncH264VProfile_High, "high" }, - { eAVEncH264VProfile_Main, "main" }, - { eAVEncH264VProfile_Base, "baseline" }, - }; - guint num_profiles = 0; - GValue profiles = G_VALUE_INIT; - - /* NOTE: depending on environment, - * some enumerated h/w MFT might not be usable (e.g., multiple GPU case) */ - if (!gst_mf_transform_open (transform)) - return; - - activate = gst_mf_transform_get_activate_handle (transform); - if (!activate) { - GST_WARNING_OBJECT (transform, "No IMFActivate interface available"); - return; - } - - encoder = gst_mf_transform_get_transform_handle (transform); - if (!encoder) { - GST_WARNING_OBJECT (transform, "No IMFTransform interface available"); - return; - } - - codec_api = gst_mf_transform_get_codec_api_handle (transform); - if (!codec_api) { - GST_WARNING_OBJECT (transform, "No ICodecAPI interface available"); - return; - } - - g_object_get (transform, "device-name", &device_name, NULL); - if (!device_name) { - GST_WARNING_OBJECT (transform, "Unknown device name"); - return; - } - - g_value_init (&profiles, GST_TYPE_LIST); - - hr = activate->GetAllocatedBlob (MFT_INPUT_TYPES_Attributes, - (UINT8 **) & infos, &info_size); - if (!gst_mf_result (hr)) - goto done; - - for (i = 0; i < info_size / sizeof (MFT_REGISTER_TYPE_INFO); i++) { - GstVideoFormat vformat; - GValue val = G_VALUE_INIT; - - vformat = 
gst_mf_video_subtype_to_video_format (&infos[i].guidSubtype); - if (vformat == GST_VIDEO_FORMAT_UNKNOWN) - continue; - - if (!supported_formats) { - supported_formats = g_new0 (GValue, 1); - g_value_init (supported_formats, GST_TYPE_LIST); - } - - /* media foundation has duplicated formats IYUV and I420 */ - if (vformat == GST_VIDEO_FORMAT_I420) { - if (have_I420) - continue; - - have_I420 = TRUE; - } - - g_value_init (&val, G_TYPE_STRING); - g_value_set_static_string (&val, gst_video_format_to_string (vformat)); - gst_value_list_append_and_take_value (supported_formats, &val); - } - - CoTaskMemFree (infos); - - if (!supported_formats) - goto done; - - /* check supported profiles and resolutions */ - hr = MFCreateMediaType (out_type.GetAddressOf ()); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetGUID (MF_MT_MAJOR_TYPE, MFMediaType_Video); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetGUID (MF_MT_SUBTYPE, MFVideoFormat_H264); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_AVG_BITRATE, 2048000); - if (!gst_mf_result (hr)) - goto done; - - hr = MFSetAttributeRatio (out_type.Get (), MF_MT_FRAME_RATE, 30, 1); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); - if (!gst_mf_result (hr)) - goto done; - - GST_DEBUG_OBJECT (transform, "Check supported profiles of %s", - device_name); - for (i = 0; i < G_N_ELEMENTS (profiles_to_check); i++) { - GValue profile_val = G_VALUE_INIT; - - hr = out_type->SetUINT32 (MF_MT_MPEG2_PROFILE, - profiles_to_check[i].profile); - if (!gst_mf_result (hr)) - goto done; - - hr = MFSetAttributeSize (out_type.Get (), MF_MT_FRAME_SIZE, - resolutions_to_check[0].width, resolutions_to_check[0].height); - if (!gst_mf_result (hr)) - break; - - if (!gst_mf_transform_set_output_type (transform, out_type.Get ())) - break; - - GST_DEBUG_OBJECT (transform, "MFT supports h264 %s profile", - profiles_to_check[i].profile_str); - 
- g_value_init (&profile_val, G_TYPE_STRING); - g_value_set_static_string (&profile_val, profiles_to_check[i].profile_str); - gst_value_list_append_and_take_value (&profiles, &profile_val); - num_profiles++; - - /* clear media type */ - gst_mf_transform_set_output_type (transform, NULL); - } - - if (num_profiles == 0) { - GST_WARNING_OBJECT (transform, "Couldn't query supported profile"); - goto done; - } - - /* baseline is default profile */ - hr = out_type->SetUINT32 (MF_MT_MPEG2_PROFILE, eAVEncH264VProfile_Base); - if (!gst_mf_result (hr)) - goto done; - - GST_DEBUG_OBJECT (transform, "Check supported resolutions of %s", - device_name); - - /* FIXME: This would take so long time. - * Need to find smart way to find supported resolution*/ -#if 0 - for (i = 0; i < G_N_ELEMENTS (resolutions_to_check); i++) { - guint width, height; - - width = resolutions_to_check[i].width; - height = resolutions_to_check[i].height; - - hr = MFSetAttributeSize (out_type.Get (), MF_MT_FRAME_SIZE, width, height); - if (!gst_mf_result (hr)) - break; - - if (!gst_mf_transform_set_output_type (transform, out_type.Get ())) - break; - - max_width = width; - max_height = height; - - GST_DEBUG_OBJECT (transform, - "MFT supports resolution %dx%d", max_width, max_height); - - /* clear media type */ - gst_mf_transform_set_output_type (transform, NULL); - } - - if (max_width == 0 || max_height == 0) { - GST_WARNING_OBJECT (transform, "Couldn't query supported resolution"); - goto done; - } -#else - /* FIXME: don't hardcode supported resolution */ - max_width = max_height = 8192; -#endif - - /* high profile supported since windows8 */ - src_caps = gst_caps_from_string ("video/x-h264, " - "stream-format=(string) byte-stream, " - "alignment=(string) au"); - gst_caps_set_value (src_caps, "profile", &profiles); - - sink_caps = gst_caps_new_empty_simple ("video/x-raw"); - gst_caps_set_value (sink_caps, "format", supported_formats); - g_value_unset (supported_formats); - g_free (supported_formats); - - 
/* To cover both landscape and portrait, select max value */ - resolution = MAX (max_width, max_height); - gst_caps_set_simple (sink_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - gst_caps_set_simple (src_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - - GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - -#define CHECK_DEVICE_CAPS(codec_obj,api,val) \ - if (SUCCEEDED((codec_obj)->IsSupported(&(api)))) {\ - device_caps.val = TRUE; \ - } - - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonRateControlMode, rc_mode); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonQuality, quality); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncAdaptiveMode, adaptive_mode); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonBufferSize, buffer_size); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonMaxBitRate, max_bitrate); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncCommonQualityVsSpeed, quality_vs_speed); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncH264CABACEnable, cabac); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncH264SPSID, sps_id); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncH264PPSID, pps_id); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVDefaultBPictureCount, bframes); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVGOPSize, gop_size); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncNumWorkerThreads, threads); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoContentType, content_type); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoEncodeQP, qp); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncVideoForceKeyFrame, force_keyframe); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVLowLatencyMode, low_latency); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMinQP, min_qp); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMaxQP, 
max_qp); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncVideoEncodeFrameTypeQP, frame_type_qp); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMaxNumRefFrame, max_num_ref); - if (device_caps.max_num_ref) { - VARIANT min; - VARIANT max; - VARIANT step; - - hr = codec_api->GetParameterRange (&CODECAPI_AVEncVideoMaxNumRefFrame, - &min, &max, &step); - if (SUCCEEDED (hr)) { - device_caps.max_num_ref = TRUE; - device_caps.max_num_ref_high = max.uiVal; - device_caps.max_num_ref_low = min.uiVal; - VariantClear (&min); - VariantClear (&max); - VariantClear (&step); - } - } - - gst_mf_h264_enc_register (plugin, rank, device_name, - &device_caps, enum_flags, device_index, sink_caps, src_caps); - -done: - g_value_unset (&profiles); - g_free (device_name); -} - -void -gst_mf_h264_enc_plugin_init (GstPlugin * plugin, guint rank) -{ - GstMFTransformEnumParams enum_params = { 0, }; - MFT_REGISTER_TYPE_INFO output_type; - GstMFTransform *transform; - gint i; - gboolean do_next; + GUID subtype = MFVideoFormat_H264; GST_DEBUG_CATEGORY_INIT (gst_mf_h264_enc_debug, "mfh264enc", 0, "mfh264enc"); - output_type.guidMajorType = MFMediaType_Video; - output_type.guidSubtype = MFVideoFormat_H264; - - enum_params.category = MFT_CATEGORY_VIDEO_ENCODER; - enum_params.enum_flags = (MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_ASYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - enum_params.output_typeinfo = &output_type; - - /* register hardware encoders first */ - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_h264_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); - - /* register software encoders */ - enum_params.enum_flags = (MFT_ENUM_FLAG_SYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | 
MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_h264_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); + gst_mf_video_enc_register (plugin, rank, &subtype, &type_info, d3d11_device); }
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfh264enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfh264enc.h
Changed
@@ -26,7 +26,8 @@
 G_BEGIN_DECLS

 void gst_mf_h264_enc_plugin_init (GstPlugin * plugin,
-    guint rank);
+    guint rank,
+    GList * d3d11_device);

 G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfh265enc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfh265enc.cpp
Changed
@@ -40,7 +40,9 @@ #include "gstmfh265enc.h" #include <wrl.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; +/* *INDENT-ON* */ GST_DEBUG_CATEGORY (gst_mf_h265_enc_debug); #define GST_CAT_DEFAULT gst_mf_h265_enc_debug @@ -115,6 +117,8 @@ PROP_QP_P, PROP_QP_B, PROP_REF, + PROP_D3D11_AWARE, + PROP_ADAPTER_LUID, }; #define DEFAULT_BITRATE (2 * 1024) @@ -124,7 +128,7 @@ #define DEFAULT_QUALITY_VS_SPEED 50 #define DEFAULT_CABAC TRUE #define DEFAULT_BFRAMES 0 -#define DEFAULT_GOP_SIZE 0 +#define DEFAULT_GOP_SIZE -1 #define DEFAULT_THREADS 0 #define DEFAULT_CONTENT_TYPE GST_MF_H265_ENC_CONTENT_TYPE_UNKNOWN #define DEFAULT_QP 24 @@ -136,35 +140,11 @@ #define DEFAULT_QP_B 26 #define DEFAULT_REF 2 -#define GST_MF_H265_ENC_GET_CLASS(obj) \ - (G_TYPE_INSTANCE_GET_CLASS((obj), G_TYPE_FROM_INSTANCE (obj), GstMFH265EncClass)) - -typedef struct _GstMFH265EncDeviceCaps -{ - gboolean rc_mode; /* AVEncCommonRateControlMode */ - gboolean buffer_size; /* AVEncCommonBufferSize */ - gboolean max_bitrate; /* AVEncCommonMaxBitRate */ - gboolean quality_vs_speed; /* AVEncCommonQualityVsSpeed */ - gboolean bframes; /* AVEncMPVDefaultBPictureCount */ - gboolean gop_size; /* AVEncMPVGOPSize */ - gboolean threads; /* AVEncNumWorkerThreads */ - gboolean content_type; /* AVEncVideoContentType */ - gboolean qp; /* AVEncVideoEncodeQP */ - gboolean force_keyframe; /* AVEncVideoForceKeyFrame */ - gboolean low_latency; /* AVLowLatencyMode */ - gboolean min_qp; /* AVEncVideoMinQP */ - gboolean max_qp; /* AVEncVideoMaxQP */ - gboolean frame_type_qp; /* AVEncVideoEncodeFrameTypeQP */ - gboolean max_num_ref; /* AVEncVideoMaxNumRefFrame */ - guint max_num_ref_high; - guint max_num_ref_low; -} GstMFH265EncDeviceCaps; - typedef struct _GstMFH265Enc { GstMFVideoEnc parent; - /* properteies */ + /* properties */ guint bitrate; /* device dependent properties */ @@ -189,21 +169,8 @@ typedef struct _GstMFH265EncClass { GstMFVideoEncClass parent_class; - - GstMFH265EncDeviceCaps device_caps; } 
GstMFH265EncClass; -typedef struct -{ - GstCaps *sink_caps; - GstCaps *src_caps; - gchar *device_name; - guint32 enum_flags; - guint device_index; - GstMFH265EncDeviceCaps device_caps; - gboolean is_default; -} GstMFH265EncClassData; - static GstElementClass *parent_class = NULL; static void gst_mf_h265_enc_get_property (GObject * object, guint prop_id, @@ -211,7 +178,7 @@ static void gst_mf_h265_enc_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static gboolean gst_mf_h265_enc_set_option (GstMFVideoEnc * mfenc, - IMFMediaType * output_type); + GstVideoCodecState * state, IMFMediaType * output_type); static gboolean gst_mf_h265_enc_set_src_caps (GstMFVideoEnc * mfenc, GstVideoCodecState * state, IMFMediaType * output_type); @@ -221,13 +188,12 @@ GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *element_class = GST_ELEMENT_CLASS (klass); GstMFVideoEncClass *mfenc_class = GST_MF_VIDEO_ENC_CLASS (klass); - GstMFH265EncClassData *cdata = (GstMFH265EncClassData *) data; - GstMFH265EncDeviceCaps *device_caps = &cdata->device_caps; + GstMFVideoEncClassData *cdata = (GstMFVideoEncClassData *) data; + GstMFVideoEncDeviceCaps *device_caps = &cdata->device_caps; gchar *long_name; gchar *classification; parent_class = (GstElementClass *) g_type_class_peek_parent (klass); - klass->device_caps = *device_caps; gobject_class->get_property = gst_mf_h265_enc_get_property; gobject_class->set_property = gst_mf_h265_enc_set_property; @@ -243,12 +209,12 @@ "Rate Control Mode", GST_TYPE_MF_H265_ENC_RC_MODE, DEFAULT_RC_MODE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { gst_type_mark_as_plugin_api (GST_TYPE_MF_H265_ENC_RC_MODE, - (GstPluginAPIFlags) 0); + (GstPluginAPIFlags) 0); } } @@ -258,7 +224,7 @@ "VBV(HRD) Buffer Size in bytes (0 
= MFT default)", 0, G_MAXUINT - 1, DEFAULT_BUFFER_SIZE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->max_bitrate) { @@ -267,7 +233,7 @@ "The maximum bitrate applied when rc-mode is \"pcvbr\" in kbit/sec " "(0 = MFT default)", 0, (G_MAXUINT >> 10), DEFAULT_MAX_BITRATE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->quality_vs_speed) { @@ -277,7 +243,7 @@ "[34, 66]: Medium complexity, [67, 100]: High complexity", 0, 100, DEFAULT_QUALITY_VS_SPEED, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->bframes) { @@ -286,16 +252,18 @@ "The maximum number of consecutive B frames", 0, 2, DEFAULT_BFRAMES, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->gop_size) { g_object_class_install_property (gobject_class, PROP_GOP_SIZE, - g_param_spec_uint ("gop-size", "GOP size", - "The number of pictures from one GOP header to the next, " - "(0 = MFT default)", 0, G_MAXUINT - 1, DEFAULT_GOP_SIZE, + g_param_spec_int ("gop-size", "GOP size", + "The number of pictures from one GOP header to the next. 
" + "Depending on GPU vendor implementation, zero gop-size might " + "produce only one keyframe at the beginning (-1 for automatic)", + -1, G_MAXINT, DEFAULT_GOP_SIZE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->threads) { @@ -304,7 +272,7 @@ "The number of worker threads used by a encoder, (0 = MFT default)", 0, 16, DEFAULT_THREADS, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->content_type) { @@ -313,7 +281,7 @@ "Indicates the type of video content", GST_TYPE_MF_H265_ENC_CONTENT_TYPE, DEFAULT_CONTENT_TYPE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { @@ -327,7 +295,7 @@ g_param_spec_uint ("qp", "qp", "QP applied when rc-mode is \"qvbr\"", 16, 51, DEFAULT_QP, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->low_latency) { @@ -335,7 +303,7 @@ g_param_spec_boolean ("low-latency", "Low Latency", "Enable low latency encoding", DEFAULT_LOW_LATENCY, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->min_qp) { @@ -344,7 +312,7 @@ "The minimum allowed QP applied to all rc-mode", 0, 51, DEFAULT_MIN_QP, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->max_qp) { @@ -353,7 +321,7 @@ "The maximum allowed QP applied to all rc-mode", 0, 51, DEFAULT_MAX_QP, (GParamFlags) 
(GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->frame_type_qp) { @@ -362,21 +330,21 @@ "QP applied to I frames", 0, 51, DEFAULT_QP_I, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); g_object_class_install_property (gobject_class, PROP_QP_P, g_param_spec_uint ("qp-p", "QP P", "QP applied to P frames", 0, 51, DEFAULT_QP_P, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); g_object_class_install_property (gobject_class, PROP_QP_B, g_param_spec_uint ("qp-b", "QP B", "QP applied to B frames", 0, 51, DEFAULT_QP_B, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->max_num_ref) { @@ -386,13 +354,42 @@ device_caps->max_num_ref_low, device_caps->max_num_ref_high, DEFAULT_REF, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + } + + /** + * GstMFH265Enc:d3d11-aware: + * + * Whether element supports Direct3D11 texture as an input or not + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_D3D11_AWARE, + g_param_spec_boolean ("d3d11-aware", "D3D11 Aware", + "Whether device can support Direct3D11 interop", + device_caps->d3d11_aware, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + /** + * GstMFH265Enc:adapter-luid: + * + * DXGI Adapter LUID for this elemenet + * + * Since: 1.20 + */ + if (device_caps->d3d11_aware) { + g_object_class_install_property (gobject_class, PROP_ADAPTER_LUID, + g_param_spec_int64 ("adapter-luid", "Adapter LUID", + "DXGI Adapter LUID (Locally Unique Identifier) of created device", + G_MININT64, 
G_MAXINT64, device_caps->adapter_luid, + (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); } long_name = g_strdup_printf ("Media Foundation %s", cdata->device_name); classification = g_strdup_printf ("Codec/Encoder/Video%s", (cdata->enum_flags & MFT_ENUM_FLAG_HARDWARE) == MFT_ENUM_FLAG_HARDWARE ? - "/Hardware" : ""); + "/Hardware" : ""); gst_element_class_set_metadata (element_class, long_name, classification, "Microsoft Media Foundation H.265 Encoder", @@ -413,7 +410,7 @@ mfenc_class->codec_id = MFVideoFormat_HEVC; mfenc_class->enum_flags = cdata->enum_flags; mfenc_class->device_index = cdata->device_index; - mfenc_class->can_force_keyframe = device_caps->force_keyframe; + mfenc_class->device_caps = *device_caps; g_free (cdata->device_name); gst_caps_unref (cdata->sink_caps); @@ -447,6 +444,7 @@ GValue * value, GParamSpec * pspec) { GstMFH265Enc *self = (GstMFH265Enc *) (object); + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (object); switch (prop_id) { case PROP_BITRATE: @@ -468,7 +466,7 @@ g_value_set_uint (value, self->bframes); break; case PROP_GOP_SIZE: - g_value_set_uint (value, self->gop_size); + g_value_set_int (value, self->gop_size); break; case PROP_THREADS: g_value_set_uint (value, self->threads); @@ -500,6 +498,12 @@ case PROP_REF: g_value_set_uint (value, self->max_num_ref); break; + case PROP_D3D11_AWARE: + g_value_set_boolean (value, klass->device_caps.d3d11_aware); + break; + case PROP_ADAPTER_LUID: + g_value_set_int64 (value, klass->device_caps.adapter_luid); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -532,7 +536,7 @@ self->bframes = g_value_get_uint (value); break; case PROP_GOP_SIZE: - self->gop_size = g_value_get_uint (value); + self->gop_size = g_value_get_int (value); break; case PROP_THREADS: self->threads = g_value_get_uint (value); @@ -604,11 +608,12 @@ } G_STMT_END static gboolean -gst_mf_h265_enc_set_option (GstMFVideoEnc * mfenc, 
IMFMediaType * output_type) +gst_mf_h265_enc_set_option (GstMFVideoEnc * mfenc, GstVideoCodecState * state, + IMFMediaType * output_type) { GstMFH265Enc *self = (GstMFH265Enc *) mfenc; - GstMFH265EncClass *klass = GST_MF_H265_ENC_GET_CLASS (self); - GstMFH265EncDeviceCaps *device_caps = &klass->device_caps; + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (mfenc); + GstMFVideoEncDeviceCaps *device_caps = &klass->device_caps; HRESULT hr; GstMFTransform *transform = mfenc->transform; @@ -658,20 +663,40 @@ if (device_caps->quality_vs_speed) { hr = gst_mf_transform_set_codec_api_uint32 (transform, - &CODECAPI_AVEncCommonQualityVsSpeed, - self->quality_vs_speed); + &CODECAPI_AVEncCommonQualityVsSpeed, self->quality_vs_speed); WARNING_HR (hr, CODECAPI_AVEncCommonQualityVsSpeed); } + mfenc->has_reorder_frame = FALSE; if (device_caps->bframes) { hr = gst_mf_transform_set_codec_api_uint32 (transform, &CODECAPI_AVEncMPVDefaultBPictureCount, self->bframes); + if (SUCCEEDED (hr) && self->bframes > 0) + mfenc->has_reorder_frame = TRUE; + WARNING_HR (hr, CODECAPI_AVEncMPVDefaultBPictureCount); } if (device_caps->gop_size) { + GstVideoInfo *info = &state->info; + gint gop_size = self->gop_size; + gint fps_n, fps_d; + + /* Set default value (10 sec or 250 frames) like that of x264enc */ + if (gop_size < 0) { + fps_n = GST_VIDEO_INFO_FPS_N (info); + fps_d = GST_VIDEO_INFO_FPS_D (info); + if (fps_n <= 0 || fps_d <= 0) { + gop_size = 250; + } else { + gop_size = 10 * fps_n / fps_d; + } + + GST_DEBUG_OBJECT (self, "Update GOP size to %d", gop_size); + } + hr = gst_mf_transform_set_codec_api_uint32 (transform, - &CODECAPI_AVEncMPVGOPSize, self->gop_size); + &CODECAPI_AVEncMPVGOPSize, gop_size); WARNING_HR (hr, CODECAPI_AVEncMPVGOPSize); } @@ -751,6 +776,13 @@ gst_structure_set (s, "stream-format", G_TYPE_STRING, "byte-stream", "alignment", G_TYPE_STRING, "au", NULL); + if (GST_VIDEO_INFO_FORMAT (&mfenc->input_state->info) == + GST_VIDEO_FORMAT_P010_10LE) { + gst_structure_set 
(s, "profile", G_TYPE_STRING, "main-10", NULL); + } else { + gst_structure_set (s, "profile", G_TYPE_STRING, "main", NULL); + } + out_state = gst_video_encoder_set_output_state (GST_VIDEO_ENCODER (self), out_caps, state); @@ -768,21 +800,12 @@ gst_tag_list_unref (tags); return TRUE; - } -static void -gst_mf_h265_enc_register (GstPlugin * plugin, guint rank, - const gchar * device_name, const GstMFH265EncDeviceCaps * device_caps, - guint32 enum_flags, guint device_index, - GstCaps * sink_caps, GstCaps * src_caps) +void +gst_mf_h265_enc_plugin_init (GstPlugin * plugin, guint rank, + GList * d3d11_device) { - GType type; - gchar *type_name; - gchar *feature_name; - gint i; - GstMFH265EncClassData *cdata; - gboolean is_default = TRUE; GTypeInfo type_info = { sizeof (GstMFH265EncClass), NULL, @@ -794,380 +817,9 @@ 0, (GInstanceInitFunc) gst_mf_h265_enc_init, }; - - cdata = g_new0 (GstMFH265EncClassData, 1); - cdata->sink_caps = sink_caps; - cdata->src_caps = src_caps; - cdata->device_name = g_strdup (device_name); - cdata->device_caps = *device_caps; - cdata->enum_flags = enum_flags; - cdata->device_index = device_index; - type_info.class_data = cdata; - - type_name = g_strdup ("GstMFH265Enc"); - feature_name = g_strdup ("mfh265enc"); - - i = 1; - while (g_type_from_name (type_name) != 0) { - g_free (type_name); - g_free (feature_name); - type_name = g_strdup_printf ("GstMFH265Device%dEnc", i); - feature_name = g_strdup_printf ("mfh265device%denc", i); - is_default = FALSE; - i++; - } - - cdata->is_default = is_default; - - type = - g_type_register_static (GST_TYPE_MF_VIDEO_ENC, type_name, &type_info, - (GTypeFlags) 0); - - /* make lower rank than default device */ - if (rank > 0 && !is_default) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -} - -typedef struct -{ - guint width; - guint height; -} GstMFH265EncResolution; - -typedef 
struct -{ - eAVEncH265VProfile profile; - const gchar *profile_str; -} GStMFH265EncProfileMap; - -static void -gst_mf_h265_enc_plugin_init_internal (GstPlugin * plugin, guint rank, - GstMFTransform * transform, guint device_index, guint32 enum_flags) -{ - HRESULT hr; - MFT_REGISTER_TYPE_INFO *infos; - UINT32 info_size; - gint i; - GstCaps *src_caps = NULL; - GstCaps *sink_caps = NULL; - GValue *supported_formats = NULL; - gboolean have_I420 = FALSE; - gchar *device_name = NULL; - GstMFH265EncDeviceCaps device_caps = { 0, }; - IMFActivate *activate; - IMFTransform *encoder; - ICodecAPI *codec_api; - ComPtr<IMFMediaType> out_type; - GstMFH265EncResolution resolutions_to_check[] = { - {1920, 1088}, {2560, 1440}, {3840, 2160}, {4096, 2160}, {8192, 4320} - }; - guint max_width = 0; - guint max_height = 0; - guint resolution; - GStMFH265EncProfileMap profiles_to_check[] = { - { eAVEncH265VProfile_Main_420_8, "main" }, - { eAVEncH265VProfile_Main_420_10, "main-10" }, - }; - guint num_profiles = 0; - GValue profiles = G_VALUE_INIT; - - /* NOTE: depending on environment, - * some enumerated h/w MFT might not be usable (e.g., multiple GPU case) */ - if (!gst_mf_transform_open (transform)) - return; - - activate = gst_mf_transform_get_activate_handle (transform); - if (!activate) { - GST_WARNING_OBJECT (transform, "No IMFActivate interface available"); - return; - } - - encoder = gst_mf_transform_get_transform_handle (transform); - if (!encoder) { - GST_WARNING_OBJECT (transform, "No IMFTransform interface available"); - return; - } - - codec_api = gst_mf_transform_get_codec_api_handle (transform); - if (!codec_api) { - GST_WARNING_OBJECT (transform, "No ICodecAPI interface available"); - return; - } - - g_object_get (transform, "device-name", &device_name, NULL); - if (!device_name) { - GST_WARNING_OBJECT (transform, "Unknown device name"); - return; - } - - g_value_init (&profiles, GST_TYPE_LIST); - - hr = activate->GetAllocatedBlob (MFT_INPUT_TYPES_Attributes, - (UINT8 **) 
& infos, &info_size); - if (!gst_mf_result (hr)) - goto done; - - for (i = 0; i < info_size / sizeof (MFT_REGISTER_TYPE_INFO); i++) { - GstVideoFormat vformat; - GValue val = G_VALUE_INIT; - - vformat = gst_mf_video_subtype_to_video_format (&infos[i].guidSubtype); - if (vformat == GST_VIDEO_FORMAT_UNKNOWN) - continue; - - if (!supported_formats) { - supported_formats = g_new0 (GValue, 1); - g_value_init (supported_formats, GST_TYPE_LIST); - } - - /* media foundation has duplicated formats IYUV and I420 */ - if (vformat == GST_VIDEO_FORMAT_I420) { - if (have_I420) - continue; - - have_I420 = TRUE; - } - - g_value_init (&val, G_TYPE_STRING); - g_value_set_static_string (&val, gst_video_format_to_string (vformat)); - - gst_value_list_append_and_take_value (supported_formats, &val); - } - - CoTaskMemFree (infos); - - if (!supported_formats) - goto done; - - /* check supported resolutions */ - hr = MFCreateMediaType (out_type.GetAddressOf ()); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetGUID (MF_MT_MAJOR_TYPE, MFMediaType_Video); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetGUID (MF_MT_SUBTYPE, MFVideoFormat_HEVC); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_AVG_BITRATE, 2048000); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_MPEG2_PROFILE, eAVEncH265VProfile_Main_420_8); - if (!gst_mf_result (hr)) - goto done; - - hr = MFSetAttributeRatio (out_type.Get (), MF_MT_FRAME_RATE, 30, 1); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); - if (!gst_mf_result (hr)) - goto done; - - GST_DEBUG_OBJECT (transform, "Check supported profiles of %s", - device_name); - for (i = 0; i < G_N_ELEMENTS (profiles_to_check); i++) { - GValue profile_val = G_VALUE_INIT; - - hr = out_type->SetUINT32 (MF_MT_MPEG2_PROFILE, - profiles_to_check[i].profile); - if (!gst_mf_result (hr)) - goto done; - - hr = MFSetAttributeSize 
(out_type.Get (), MF_MT_FRAME_SIZE, - resolutions_to_check[0].width, resolutions_to_check[0].height); - if (!gst_mf_result (hr)) - break; - - if (!gst_mf_transform_set_output_type (transform, out_type.Get ())) - break; - - GST_DEBUG_OBJECT (transform, "MFT supports h265 %s profile", - profiles_to_check[i].profile_str); - - g_value_init (&profile_val, G_TYPE_STRING); - g_value_set_static_string (&profile_val, profiles_to_check[i].profile_str); - gst_value_list_append_and_take_value (&profiles, &profile_val); - num_profiles++; - - /* clear media type */ - gst_mf_transform_set_output_type (transform, NULL); - } - - if (num_profiles == 0) { - GST_WARNING_OBJECT (transform, "Couldn't query supported profile"); - goto done; - } - - hr = out_type->SetUINT32 (MF_MT_MPEG2_PROFILE, eAVEncH265VProfile_Main_420_8); - if (!gst_mf_result (hr)) - goto done; - - /* FIXME: This would take so long time. - * Need to find smart way to find supported resolution*/ -#if 0 - for (i = 0; i < G_N_ELEMENTS (resolutions_to_check); i++) { - guint width, height; - - width = resolutions_to_check[i].width; - height = resolutions_to_check[i].height; - - hr = MFSetAttributeSize (out_type.Get (), MF_MT_FRAME_SIZE, width, height); - if (!gst_mf_result (hr)) - break; - - if (!gst_mf_transform_set_output_type (transform, out_type.Get ())) - break; - - max_width = width; - max_height = height; - - GST_DEBUG_OBJECT (transform, - "MFT supports resolution %dx%d", max_width, max_height); - - /* clear media type */ - gst_mf_transform_set_output_type (transform, NULL); - } - - if (max_width == 0 || max_height == 0) { - GST_WARNING_OBJECT (transform, "Couldn't query supported resolution"); - goto done; - } -#else - /* FIXME: don't hardcode supported resolution */ - max_width = max_height = 8192; -#endif - - src_caps = gst_caps_from_string ("video/x-h265, " - "stream-format=(string) byte-stream, " - "alignment=(string) au"); - gst_caps_set_value (src_caps, "profile", &profiles); - - sink_caps = 
gst_caps_new_empty_simple ("video/x-raw"); - gst_caps_set_value (sink_caps, "format", supported_formats); - g_value_unset (supported_formats); - g_free (supported_formats); - - /* To cover both landscape and portrait, select max value */ - resolution = MAX (max_width, max_height); - gst_caps_set_simple (sink_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - gst_caps_set_simple (src_caps, - "width", GST_TYPE_INT_RANGE, 64, resolution, - "height", GST_TYPE_INT_RANGE, 64, resolution, NULL); - - GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - -#define CHECK_DEVICE_CAPS(codec_obj,api,val) \ - if (SUCCEEDED((codec_obj)->IsSupported(&(api)))) {\ - device_caps.val = TRUE; \ - } - - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonRateControlMode, rc_mode); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonBufferSize, buffer_size); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonMaxBitRate, max_bitrate); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncCommonQualityVsSpeed, quality_vs_speed); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVDefaultBPictureCount, bframes); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVGOPSize, gop_size); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncNumWorkerThreads, threads); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoContentType, content_type); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoEncodeQP, qp); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncVideoForceKeyFrame, force_keyframe); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVLowLatencyMode, low_latency); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMinQP, min_qp); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMaxQP, max_qp); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncVideoEncodeFrameTypeQP, frame_type_qp); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMaxNumRefFrame, max_num_ref); - 
if (device_caps.max_num_ref) { - VARIANT min; - VARIANT max; - VARIANT step; - - hr = codec_api->GetParameterRange (&CODECAPI_AVEncVideoMaxNumRefFrame, - &min, &max, &step); - if (SUCCEEDED (hr)) { - device_caps.max_num_ref = TRUE; - device_caps.max_num_ref_high = max.uiVal; - device_caps.max_num_ref_low = min.uiVal; - VariantClear (&min); - VariantClear (&max); - VariantClear (&step); - } - } - - gst_mf_h265_enc_register (plugin, rank, device_name, - &device_caps, enum_flags, device_index, sink_caps, src_caps); - -done: - g_value_unset (&profiles); - g_free (device_name); -} - -void -gst_mf_h265_enc_plugin_init (GstPlugin * plugin, guint rank) -{ - GstMFTransformEnumParams enum_params = { 0, }; - MFT_REGISTER_TYPE_INFO output_type; - GstMFTransform *transform; - gint i; - gboolean do_next; + GUID subtype = MFVideoFormat_HEVC; GST_DEBUG_CATEGORY_INIT (gst_mf_h265_enc_debug, "mfh265enc", 0, "mfh265enc"); - output_type.guidMajorType = MFMediaType_Video; - output_type.guidSubtype = MFVideoFormat_HEVC; - - enum_params.category = MFT_CATEGORY_VIDEO_ENCODER; - enum_params.enum_flags = (MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_ASYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - enum_params.output_typeinfo = &output_type; - - /* register hardware encoders first */ - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_h265_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); - - /* register software encoders */ - enum_params.enum_flags = (MFT_ENUM_FLAG_SYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - 
gst_mf_h265_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); + gst_mf_video_enc_register (plugin, rank, &subtype, &type_info, d3d11_device); }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfh265enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfh265enc.h
Changed
@@ -26,7 +26,8 @@ G_BEGIN_DECLS void gst_mf_h265_enc_plugin_init (GstPlugin * plugin, - guint rank); + guint rank, + GList * d3d11_device); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfmp3enc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfmp3enc.cpp
Changed
@@ -43,7 +43,9 @@ #include <vector> #include <string> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; +/* *INDENT-ON* */ GST_DEBUG_CATEGORY (gst_mf_mp3_enc_debug); #define GST_CAT_DEFAULT gst_mf_mp3_enc_debug @@ -60,7 +62,7 @@ { GstMFAudioEnc parent; - /* properteies */ + /* properties */ guint bitrate; } GstMFMp3Enc; @@ -70,6 +72,7 @@ } GstMFMp3EncClass; +/* *INDENT-OFF* */ typedef struct { GstCaps *sink_caps; @@ -79,6 +82,7 @@ guint device_index; std::set<UINT32> bitrate_list; } GstMFMp3EncClassData; +/* *INDENT-ON* */ static GstElementClass *parent_class = NULL; @@ -110,25 +114,27 @@ gobject_class->get_property = gst_mf_mp3_enc_get_property; gobject_class->set_property = gst_mf_mp3_enc_set_property; - bitrate_blurb = - "Bitrate in bit/sec, (0 = auto), valid values are { 0"; + bitrate_blurb = "Bitrate in bit/sec, (0 = auto), valid values are { 0"; + + /* *INDENT-OFF* */ for (auto iter: cdata->bitrate_list) { bitrate_blurb += ", " + std::to_string (iter); /* std::set<> stores values in a sorted fashion */ max_bitrate = iter; } bitrate_blurb += " }"; + /* *INDENT-ON* */ g_object_class_install_property (gobject_class, PROP_BITRATE, - g_param_spec_uint ("bitrate", "Bitrate", bitrate_blurb.c_str(), 0, + g_param_spec_uint ("bitrate", "Bitrate", bitrate_blurb.c_str (), 0, max_bitrate, DEFAULT_BITRATE, (GParamFlags) (GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | - G_PARAM_STATIC_NAME | G_PARAM_STATIC_NICK))); + G_PARAM_STATIC_NAME | G_PARAM_STATIC_NICK))); long_name = g_strdup_printf ("Media Foundation %s", cdata->device_name); classification = g_strdup_printf ("Codec/Encoder/Audio%s", (cdata->enum_flags & MFT_ENUM_FLAG_HARDWARE) == MFT_ENUM_FLAG_HARDWARE ? 
- "/Hardware" : ""); + "/Hardware" : ""); gst_element_class_set_metadata (element_class, long_name, classification, "Microsoft Media Foundation MP3 Encoder", @@ -147,8 +153,7 @@ GST_DEBUG_FUNCPTR (gst_mf_mp3_enc_get_output_type); mfenc_class->get_input_type = GST_DEBUG_FUNCPTR (gst_mf_mp3_enc_get_input_type); - mfenc_class->set_src_caps = - GST_DEBUG_FUNCPTR (gst_mf_mp3_enc_set_src_caps); + mfenc_class->set_src_caps = GST_DEBUG_FUNCPTR (gst_mf_mp3_enc_set_src_caps); mfenc_class->codec_id = MFAudioFormat_MP3; mfenc_class->enum_flags = cdata->enum_flags; @@ -207,9 +212,9 @@ GstMFTransform *transform = mfenc->transform; GList *output_list = NULL; GList *iter; - ComPtr<IMFMediaType> target_output; - std::vector<ComPtr<IMFMediaType>> filtered_types; - std::set<UINT32> bitrate_list; + ComPtr < IMFMediaType > target_output; + std::vector < ComPtr < IMFMediaType >> filtered_types; + std::set < UINT32 > bitrate_list; UINT32 bitrate; UINT32 target_bitrate = 0; HRESULT hr; @@ -268,12 +273,12 @@ g_list_free_full (output_list, (GDestroyNotify) gst_mf_media_type_release); - if (filtered_types.empty()) { + if (filtered_types.empty ()) { GST_ERROR_OBJECT (self, "Couldn't find target output type"); return FALSE; } - GST_DEBUG_OBJECT (self, "have %d candidate output", filtered_types.size()); + GST_DEBUG_OBJECT (self, "have %d candidate output", filtered_types.size ()); /* 2. 
Find the best matching bitrate */ bitrate = self->bitrate; @@ -294,6 +299,8 @@ } GST_DEBUG_OBJECT (self, "Available bitrates"); + + /* *INDENT-OFF* */ for (auto it: bitrate_list) GST_DEBUG_OBJECT (self, "\t%d", it); @@ -319,13 +326,14 @@ break; } } + /* *INDENT-ON* */ if (!target_output) { GST_ERROR_OBJECT (self, "Failed to decide final output type"); return FALSE; } - *output_type = target_output.Detach(); + *output_type = target_output.Detach (); return TRUE; } @@ -338,9 +346,9 @@ GstMFTransform *transform = mfenc->transform; GList *input_list = NULL; GList *iter; - ComPtr<IMFMediaType> target_input; - std::vector<ComPtr<IMFMediaType>> filtered_types; - std::set<UINT32> bitrate_list; + ComPtr < IMFMediaType > target_input; + std::vector < ComPtr < IMFMediaType >> filtered_types; + std::set < UINT32 > bitrate_list; HRESULT hr; if (!gst_mf_transform_get_input_available_types (transform, &input_list)) { @@ -391,33 +399,33 @@ g_list_free_full (input_list, (GDestroyNotify) gst_mf_media_type_release); - if (filtered_types.empty()) { + if (filtered_types.empty ()) { GST_ERROR_OBJECT (self, "Couldn't find target input type"); return FALSE; } GST_DEBUG_OBJECT (self, "Total %d input types are available", - filtered_types.size()); + filtered_types.size ()); /* Just select the first one */ - target_input = *filtered_types.begin(); + target_input = *filtered_types.begin (); - *input_type = target_input.Detach(); + *input_type = target_input.Detach (); return TRUE; } static gboolean -gst_mf_mp3_enc_set_src_caps (GstMFAudioEnc * mfenc, - GstAudioInfo * info) +gst_mf_mp3_enc_set_src_caps (GstMFAudioEnc * mfenc, GstAudioInfo * info) { GstMFMp3Enc *self = (GstMFMp3Enc *) mfenc; GstCaps *src_caps; gboolean ret; - ComPtr<IMFMediaType> output_type; + ComPtr < IMFMediaType > output_type; gint version = 1; - if (!gst_mf_transform_get_output_current_type (mfenc->transform, &output_type)) { + if (!gst_mf_transform_get_output_current_type (mfenc->transform, + &output_type)) { 
GST_ERROR_OBJECT (self, "Couldn't get current output type"); return FALSE; } @@ -434,10 +442,10 @@ "mpegaudioversion", G_TYPE_INT, version, "layer", G_TYPE_INT, 3, "channels", G_TYPE_INT, GST_AUDIO_INFO_CHANNELS (info), - "rate", G_TYPE_INT, GST_AUDIO_INFO_RATE (info), - NULL); + "rate", G_TYPE_INT, GST_AUDIO_INFO_RATE (info), NULL); - ret = gst_audio_encoder_set_output_format (GST_AUDIO_ENCODER (self), src_caps); + ret = + gst_audio_encoder_set_output_format (GST_AUDIO_ENCODER (self), src_caps); if (!ret) { GST_WARNING_OBJECT (self, "Couldn't set output format %" GST_PTR_FORMAT, src_caps); @@ -451,7 +459,7 @@ gst_mf_mp3_enc_register (GstPlugin * plugin, guint rank, const gchar * device_name, guint32 enum_flags, guint device_index, GstCaps * sink_caps, GstCaps * src_caps, - const std::set<UINT32> &bitrate_list) + const std::set < UINT32 > &bitrate_list) { GType type; gchar *type_name; @@ -509,41 +517,42 @@ } static gboolean -gst_mf_mp3_enc_create_template_caps (const std::set<UINT32> &rate_list, +gst_mf_mp3_enc_create_template_caps (const std::set < UINT32 > &rate_list, gint channels, GstCaps ** sink_caps, GstCaps ** src_caps) { GstCaps *sink = NULL; GstCaps *src = NULL; GValue rate_value = G_VALUE_INIT; - if (rate_list.empty()) { + if (rate_list.empty ()) { GST_WARNING ("No available rate for channels %d", channels); return FALSE; } if (channels != 0) { sink = - gst_caps_from_string ("audio/x-raw, " - "format = (string) " GST_AUDIO_NE (S16) - ", layout = (string) interleaved"); + gst_caps_from_string ("audio/x-raw, " + "format = (string) " GST_AUDIO_NE (S16) + ", layout = (string) interleaved"); src = - gst_caps_from_string ("audio/mpeg, mpegversion = (int) 1," - "layer = (int) 3"); + gst_caps_from_string ("audio/mpeg, mpegversion = (int) 1," + "layer = (int) 3"); gst_caps_set_simple (sink, "channels", G_TYPE_INT, channels, NULL); gst_caps_set_simple (src, "channels", G_TYPE_INT, channels, NULL); } else { sink = - gst_caps_from_string ("audio/x-raw, " - "format = 
(string) " GST_AUDIO_NE (S16) - ", layout = (string) interleaved, channels = (int) [ 1, 2 ]"); + gst_caps_from_string ("audio/x-raw, " + "format = (string) " GST_AUDIO_NE (S16) + ", layout = (string) interleaved, channels = (int) [ 1, 2 ]"); src = - gst_caps_from_string ("audio/mpeg, mpegversion = (int) 1," - "layer = (int) 3, channels = (int) [ 1, 2 ]"); + gst_caps_from_string ("audio/mpeg, mpegversion = (int) 1," + "layer = (int) 3, channels = (int) [ 1, 2 ]"); } g_value_init (&rate_value, GST_TYPE_LIST); + /* *INDENT-OFF* */ for (const auto &it: rate_list) { GValue rate = G_VALUE_INIT; @@ -551,6 +560,7 @@ g_value_set_int (&rate, (gint) it); gst_value_list_append_and_take_value (&rate_value, &rate); } + /* *INDENT-ON* */ gst_caps_set_value (src, "rate", &rate_value); gst_caps_set_value (sink, "rate", &rate_value); @@ -581,9 +591,9 @@ gchar *device_name = NULL; GList *output_list = NULL; GList *iter; - std::set<UINT32> mono_rate_list; - std::set<UINT32> stereo_rate_list; - std::set<UINT32> bitrate_list; + std::set < UINT32 > mono_rate_list; + std::set < UINT32 > stereo_rate_list; + std::set < UINT32 > bitrate_list; gboolean config_found = FALSE; if (!gst_mf_transform_open (transform)) @@ -600,7 +610,8 @@ goto done; } - GST_INFO_OBJECT (transform, "Have %d output type", g_list_length (output_list)); + GST_INFO_OBJECT (transform, "Have %d output type", + g_list_length (output_list)); for (iter = output_list, i = 0; iter; iter = g_list_next (iter), i++) { UINT32 channels, rate, bitrate; @@ -646,11 +657,11 @@ continue; if (channels == 1) - mono_rate_list.insert(rate); + mono_rate_list.insert (rate); else if (channels == 2) - stereo_rate_list.insert(rate); + stereo_rate_list.insert (rate); else - g_assert_not_reached(); + g_assert_not_reached (); /* convert bytes to bit */ bitrate_list.insert (bitrate * 8); @@ -668,17 +679,17 @@ * * Configure caps per channels if supported rate values are different */ - if (!mono_rate_list.empty() && !stereo_rate_list.empty() && + if 
(!mono_rate_list.empty () && !stereo_rate_list.empty () && mono_rate_list == stereo_rate_list) { gst_mf_mp3_enc_create_template_caps (mono_rate_list, - 0, &sink_caps, &src_caps); + 0, &sink_caps, &src_caps); } else { - if (!mono_rate_list.empty()) { + if (!mono_rate_list.empty ()) { gst_mf_mp3_enc_create_template_caps (mono_rate_list, 1, &sink_caps, &src_caps); } - if (!stereo_rate_list.empty()) { + if (!stereo_rate_list.empty ()) { gst_mf_mp3_enc_create_template_caps (stereo_rate_list, 2, &sink_caps, &src_caps); } @@ -697,7 +708,7 @@ gst_mf_mp3_enc_register (plugin, rank, device_name, enum_flags, device_index, sink_caps, src_caps, bitrate_list); - done: +done: if (output_list) g_list_free_full (output_list, (GDestroyNotify) gst_mf_media_type_release); g_free (device_name); @@ -718,29 +729,10 @@ output_type.guidSubtype = MFAudioFormat_MP3; enum_params.category = MFT_CATEGORY_AUDIO_ENCODER; - enum_params.enum_flags = (MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_ASYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - enum_params.output_typeinfo = &output_type; - - /* register hardware encoders first (likey no hardware audio encoder) */ - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_mp3_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); - - /* register software encoders */ enum_params.enum_flags = (MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); + enum_params.output_typeinfo = &output_type; + i = 0; do { enum_params.device_index = i++;
View file
gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfplatloader.c
Added
@@ -0,0 +1,129 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstmfplatloader.h" +#include "gstmfconfig.h" +#include <gmodule.h> + +/* *INDENT-OFF* */ +G_BEGIN_DECLS + +GST_DEBUG_CATEGORY_EXTERN (gst_mf_debug); +#define GST_CAT_DEFAULT gst_mf_debug + +G_END_DECLS + +#define LOAD_SYMBOL(name,func) G_STMT_START { \ + if (!g_module_symbol (module, G_STRINGIFY (name), (gpointer *) &vtable->func)) { \ + GST_WARNING ("Failed to load '%s', %s", G_STRINGIFY (name), g_module_error()); \ + goto out; \ + } \ +} G_STMT_END; + +typedef struct _GstMFPlatVTable +{ + gboolean loaded; + + HRESULT (__stdcall * GstMFTEnum2) (GUID guidCategory, + UINT32 Flags, + const MFT_REGISTER_TYPE_INFO * pInputType, + const MFT_REGISTER_TYPE_INFO * pOutputType, + IMFAttributes * pAttributes, + IMFActivate *** pppMFTActivate, + UINT32 * pnumMFTActivate); + + HRESULT (__stdcall * GstMFCreateDXGIDeviceManager) (UINT * resetToken, + IMFDXGIDeviceManager ** ppDeviceManager); + + HRESULT (__stdcall * GstMFCreateVideoSampleAllocatorEx) (REFIID riid, + void** ppSampleAllocator); +} GstMFPlatVTable; +/* *INDENT-ON* */ + +static GstMFPlatVTable 
gst_mf_plat_vtable = { 0, }; + +static gboolean +load_library_once (void) +{ + static gsize load_once = 0; + if (g_once_init_enter (&load_once)) { +#if GST_MF_HAVE_D3D11 + GModule *module; + GstMFPlatVTable *vtable = &gst_mf_plat_vtable; + + module = g_module_open ("mfplat.dll", G_MODULE_BIND_LAZY); + if (!module) + goto out; + + LOAD_SYMBOL (MFTEnum2, GstMFTEnum2); + LOAD_SYMBOL (MFCreateDXGIDeviceManager, GstMFCreateDXGIDeviceManager); + LOAD_SYMBOL (MFCreateVideoSampleAllocatorEx, + GstMFCreateVideoSampleAllocatorEx); + + vtable->loaded = TRUE; +#endif + + out: + g_once_init_leave (&load_once, 1); + } + + return gst_mf_plat_vtable.loaded; +} + +gboolean +gst_mf_plat_load_library (void) +{ + return load_library_once (); +} + +HRESULT __stdcall +GstMFTEnum2 (GUID guidCategory, UINT32 Flags, + const MFT_REGISTER_TYPE_INFO * pInputType, + const MFT_REGISTER_TYPE_INFO * pOutputType, + IMFAttributes * pAttributes, IMFActivate *** pppMFTActivate, + UINT32 * pnumMFTActivate) +{ + g_assert (gst_mf_plat_vtable.GstMFTEnum2 != NULL); + + return gst_mf_plat_vtable.GstMFTEnum2 (guidCategory, Flags, pInputType, + pOutputType, pAttributes, pppMFTActivate, pnumMFTActivate); +} + +HRESULT __stdcall +GstMFCreateDXGIDeviceManager (UINT * resetToken, + IMFDXGIDeviceManager ** ppDeviceManager) +{ + g_assert (gst_mf_plat_vtable.GstMFCreateDXGIDeviceManager != NULL); + + return gst_mf_plat_vtable.GstMFCreateDXGIDeviceManager (resetToken, + ppDeviceManager); +} + +HRESULT __stdcall +GstMFCreateVideoSampleAllocatorEx (REFIID riid, void **ppSampleAllocator) +{ + g_assert (gst_mf_plat_vtable.GstMFCreateVideoSampleAllocatorEx != NULL); + + return gst_mf_plat_vtable.GstMFCreateVideoSampleAllocatorEx (riid, + ppSampleAllocator); +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfplatloader.h
Added
@@ -0,0 +1,44 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/gst.h> +#include <windows.h> +#include <mfapi.h> + +G_BEGIN_DECLS + +gboolean gst_mf_plat_load_library (void); + +HRESULT __stdcall GstMFTEnum2 (GUID guidCategory, + UINT32 Flags, + const MFT_REGISTER_TYPE_INFO * pInputType, + const MFT_REGISTER_TYPE_INFO * pOutputType, + IMFAttributes * pAttributes, + IMFActivate *** pppMFTActivate, + UINT32 * pnumMFTActivate); + +HRESULT __stdcall GstMFCreateDXGIDeviceManager (UINT * resetToken, + IMFDXGIDeviceManager ** ppDeviceManager); + +HRESULT __stdcall GstMFCreateVideoSampleAllocatorEx (REFIID riid, + void** ppSampleAllocator); + +G_END_DECLS
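The new `gstmfplatloader` pair above resolves `MFTEnum2` and friends from `mfplat.dll` at runtime into a function-pointer vtable, so the plugin still loads on Windows versions that lack those symbols. The shape of that pattern can be sketched in portable C; `PlatVTable`, `load_once` and `plat_add` are hypothetical stand-ins (the real code uses `g_module_open()`/`g_module_symbol()` and `g_once_init_enter()` instead of the plain static guard used here):

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical miniature of the gstmfplatloader.c pattern: resolve symbols
 * once into a vtable, expose thin wrappers that assert the table is filled. */
typedef struct {
    int loaded;
    int (*add)(int, int);   /* stand-in for a dynamically resolved symbol */
} PlatVTable;

static PlatVTable vtable = { 0 };

static int real_add(int a, int b) { return a + b; }

/* One-shot loader; returns nonzero on success (mirrors load_library_once()).
 * In the real code this step is g_module_symbol() against mfplat.dll and
 * failure simply leaves vtable.loaded at 0. */
static int load_once(void)
{
    static int tried = 0;
    if (!tried) {
        tried = 1;
        vtable.add = real_add;  /* "symbol resolution" */
        vtable.loaded = 1;
    }
    return vtable.loaded;
}

/* Public trampoline, equivalent in spirit to the GstMFTEnum2() wrapper */
static int plat_add(int a, int b)
{
    assert(vtable.add != NULL);
    return vtable.add(a, b);
}
```

The payoff of the trampoline layer is that call sites stay identical whether the symbol came from the OS library or the load failed and the feature is disabled.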
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfsourceobject.c -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfsourceobject.c
Changed
@@ -110,6 +110,8 @@ { self->device_index = DEFAULT_DEVICE_INDEX; self->source_type = DEFAULT_SOURCE_TYPE; + + g_weak_ref_init (&self->client, NULL); } static void @@ -120,6 +122,8 @@ g_free (self->device_path); g_free (self->device_name); + g_weak_ref_clear (&self->client); + G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -274,6 +278,45 @@ return klass->get_caps (object); } +gboolean +gst_mf_source_object_set_client (GstMFSourceObject * object, + GstElement * client) +{ + g_return_val_if_fail (GST_IS_MF_SOURCE_OBJECT (object), FALSE); + + g_weak_ref_set (&object->client, client); + + return TRUE; +} + +GstClockTime +gst_mf_source_object_get_running_time (GstMFSourceObject * object) +{ + GstElement *client = NULL; + GstClockTime timestamp = GST_CLOCK_TIME_NONE; + + g_return_val_if_fail (GST_IS_MF_SOURCE_OBJECT (object), GST_CLOCK_TIME_NONE); + + client = (GstElement *) g_weak_ref_get (&object->client); + if (client) { + GstClockTime basetime = client->base_time; + GstClock *clock; + + clock = gst_element_get_clock (client); + if (clock) { + GstClockTime now; + + now = gst_clock_get_time (clock); + timestamp = now - basetime; + gst_object_unref (clock); + } + + gst_object_unref (client); + } + + return timestamp; +} + static gboolean gst_mf_source_object_use_winrt_api (void) {
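The `gst_mf_source_object_get_running_time()` addition above computes `now - basetime` from the weakly-referenced client element's clock. The arithmetic it relies on can be shown in isolation; `running_time()` and `CLOCK_TIME_NONE` are illustrative names here, and the early-return guard is an assumption added for the sketch (the real code instead returns `GST_CLOCK_TIME_NONE` when the client or its clock is unavailable):

```c
#include <assert.h>
#include <stdint.h>

#define CLOCK_TIME_NONE UINT64_MAX  /* mirrors GST_CLOCK_TIME_NONE */

/* Hypothetical helper showing the core of
 * gst_mf_source_object_get_running_time(): running time is the current
 * clock time minus the element's base time (the clock time at which the
 * pipeline went to PLAYING). */
static uint64_t running_time(uint64_t clock_now, uint64_t base_time)
{
    if (clock_now == CLOCK_TIME_NONE || clock_now < base_time)
        return CLOCK_TIME_NONE;  /* no usable clock yet */
    return clock_now - base_time;
}
```

Using a `GWeakRef` for the client (initialized in `_init`, cleared in `finalize`) avoids a reference cycle between the source object and the element that owns it.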
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfsourceobject.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfsourceobject.h
Changed
@@ -54,6 +54,8 @@ gchar *device_path; gchar *device_name; gint device_index; + + GWeakRef client; }; struct _GstMFSourceObjectClass @@ -102,6 +104,11 @@ gboolean gst_mf_source_object_set_caps (GstMFSourceObject * object, GstCaps * caps); +gboolean gst_mf_source_object_set_client (GstMFSourceObject * object, + GstElement * element); + +GstClockTime gst_mf_source_object_get_running_time (GstMFSourceObject * object); + /* A factory method for subclass impl. selection */ GstMFSourceObject * gst_mf_source_object_new (GstMFSourceType type, gint device_index,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfsourcereader.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfsourcereader.cpp
Changed
@@ -22,6 +22,7 @@ #include "config.h" #endif +#include <gst/base/base.h> #include <gst/video/video.h> #include "gstmfsourcereader.h" #include <string.h> @@ -30,6 +31,7 @@ #include <vector> #include <algorithm> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; G_BEGIN_DECLS @@ -38,6 +40,7 @@ #define GST_CAT_DEFAULT gst_mf_source_object_debug G_END_DECLS +/* *INDENT-ON* */ typedef struct _GstMFStreamMediaType { @@ -71,8 +74,9 @@ GMainLoop *loop; /* protected by lock */ - GQueue *queue; + GstQueueArray *queue; + IMFActivate *activate; IMFMediaSource *source; IMFSourceReader *reader; @@ -86,20 +90,28 @@ gboolean flushing; }; +typedef struct _GstMFSourceReaderSample +{ + IMFSample *sample; + GstClockTime clock_time; +} GstMFSourceReaderSample; + static void gst_mf_source_reader_constructed (GObject * object); static void gst_mf_source_reader_finalize (GObject * object); static gboolean gst_mf_source_reader_start (GstMFSourceObject * object); -static gboolean gst_mf_source_reader_stop (GstMFSourceObject * object); +static gboolean gst_mf_source_reader_stop (GstMFSourceObject * object); static GstFlowReturn gst_mf_source_reader_fill (GstMFSourceObject * object, GstBuffer * buffer); static GstFlowReturn gst_mf_source_reader_create (GstMFSourceObject * object, GstBuffer ** buffer); static gboolean gst_mf_source_reader_unlock (GstMFSourceObject * object); static gboolean gst_mf_source_reader_unlock_stop (GstMFSourceObject * object); -static GstCaps * gst_mf_source_reader_get_caps (GstMFSourceObject * object); +static GstCaps *gst_mf_source_reader_get_caps (GstMFSourceObject * object); static gboolean gst_mf_source_reader_set_caps (GstMFSourceObject * object, GstCaps * caps); +static void +gst_mf_source_reader_sample_clear (GstMFSourceReaderSample * reader_sample); static gboolean gst_mf_source_reader_open (GstMFSourceReader * object, IMFActivate * activate); @@ -136,7 +148,10 @@ static void gst_mf_source_reader_init (GstMFSourceReader * self) { - self->queue = g_queue_new 
(); + self->queue = + gst_queue_array_new_for_struct (sizeof (GstMFSourceReaderSample), 2); + gst_queue_array_set_clear_func (self->queue, + (GDestroyNotify) gst_mf_source_reader_sample_clear); g_mutex_init (&self->lock); g_cond_init (&self->cond); } @@ -165,7 +180,7 @@ gint i, j; HRESULT hr; GList *list = NULL; - std::vector<std::string> unhandled_caps; + std::vector < std::string > unhandled_caps; g_return_val_if_fail (source_reader != NULL, FALSE); g_return_val_if_fail (media_types != NULL, FALSE); @@ -179,7 +194,7 @@ */ i = MF_SOURCE_READER_FIRST_VIDEO_STREAM; for (j = 0;; j++) { - ComPtr<IMFMediaType> media_type; + ComPtr < IMFMediaType > media_type; hr = source_reader->GetNativeMediaType (i, j, &media_type); @@ -199,10 +214,10 @@ name = gst_structure_get_name (s); if (name != "video/x-raw" && name != "image/jpeg") { auto it = - std::find(unhandled_caps.begin(), unhandled_caps.end(), name); - if (it == unhandled_caps.end()) { - GST_FIXME ("Skip not supported format %s", name.c_str()); - unhandled_caps.push_back(name); + std::find (unhandled_caps.begin (), unhandled_caps.end (), name); + if (it == unhandled_caps.end ()) { + GST_FIXME ("Skip not supported format %s", name.c_str ()); + unhandled_caps.push_back (name); } gst_caps_unref (caps); continue; @@ -236,7 +251,7 @@ list = g_list_reverse (list); *media_types = list; - return ! 
!list; + return !!list; } static void @@ -269,11 +284,11 @@ { GList *iter; HRESULT hr; - ComPtr<IMFSourceReader> reader; - ComPtr<IMFMediaSource> source; - ComPtr<IMFAttributes> attr; + ComPtr < IMFSourceReader > reader; + ComPtr < IMFMediaSource > source; + ComPtr < IMFAttributes > attr; - hr = activate->ActivateObject (IID_IMFMediaSource, (void **) &source); + hr = activate->ActivateObject (IID_PPV_ARGS (&source)); if (!gst_mf_result (hr)) return FALSE; @@ -297,6 +312,8 @@ return FALSE; } + self->activate = activate; + activate->AddRef (); self->source = source.Detach (); self->reader = reader.Detach (); @@ -322,6 +339,12 @@ { gst_clear_caps (&self->supported_caps); + if (self->activate) { + self->activate->ShutdownObject (); + self->activate->Release (); + self->activate = NULL; + } + if (self->media_types) { g_list_free_full (self->media_types, (GDestroyNotify) gst_mf_stream_media_type_free); @@ -352,7 +375,7 @@ g_main_loop_unref (self->loop); g_main_context_unref (self->context); - g_queue_free (self->queue); + gst_queue_array_free (self->queue); gst_clear_caps (&self->supported_caps); g_mutex_clear (&self->lock); g_cond_clear (&self->cond); @@ -419,15 +442,23 @@ return TRUE; } +static GstMFSourceReaderSample * +gst_mf_source_reader_sample_new (IMFSample * sample, GstClockTime timestamp) +{ + GstMFSourceReaderSample *reader_sample = g_new0 (GstMFSourceReaderSample, 1); + + reader_sample->sample = sample; + reader_sample->clock_time = timestamp; + + return reader_sample; +} + static gboolean gst_mf_source_reader_stop (GstMFSourceObject * object) { GstMFSourceReader *self = GST_MF_SOURCE_READER (object); - while (!g_queue_is_empty (self->queue)) { - IMFMediaBuffer *buffer = (IMFMediaBuffer *) g_queue_pop_head (self->queue); - buffer->Release (); - } + gst_queue_array_clear (self->queue); return TRUE; } @@ -436,47 +467,55 @@ gst_mf_source_reader_read_sample (GstMFSourceReader * self) { HRESULT hr; - DWORD count = 0, i; DWORD stream_flags = 0; GstMFStreamMediaType 
*type = self->cur_type; - ComPtr<IMFSample> sample; + IMFSample *sample = nullptr; + GstMFSourceReaderSample reader_sample; hr = self->reader->ReadSample (type->stream_index, 0, NULL, &stream_flags, - NULL, &sample); + NULL, &sample); - if (!gst_mf_result (hr)) + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to read sample"); return GST_FLOW_ERROR; + } - if ((stream_flags & MF_SOURCE_READERF_ERROR) == MF_SOURCE_READERF_ERROR) + if ((stream_flags & MF_SOURCE_READERF_ERROR) == MF_SOURCE_READERF_ERROR) { + GST_ERROR_OBJECT (self, "Error while reading sample, sample flags 0x%x", + stream_flags); return GST_FLOW_ERROR; + } - if (!sample) - return GST_FLOW_OK; - - hr = sample->GetBufferCount (&count); - if (!gst_mf_result (hr) || !count) + if (!sample) { + GST_WARNING_OBJECT (self, "Empty sample"); return GST_FLOW_OK; + } - for (i = 0; i < count; i++) { - IMFMediaBuffer *buffer = NULL; - - hr = sample->GetBufferByIndex (i, &buffer); - if (!gst_mf_result (hr) || !buffer) - continue; + reader_sample.sample = sample; + reader_sample.clock_time = + gst_mf_source_object_get_running_time (GST_MF_SOURCE_OBJECT (self)); - g_queue_push_tail (self->queue, buffer); - } + gst_queue_array_push_tail_struct (self->queue, &reader_sample); return GST_FLOW_OK; } static GstFlowReturn gst_mf_source_reader_get_media_buffer (GstMFSourceReader * self, - IMFMediaBuffer ** media_buffer) + IMFMediaBuffer ** buffer, GstClockTime * timestamp, GstClockTime * duration) { GstFlowReturn ret = GST_FLOW_OK; + IMFSample *sample = nullptr; + HRESULT hr; + DWORD count = 0; + LONGLONG mf_timestamp; + GstMFSourceReaderSample *reader_sample = nullptr; + + *buffer = nullptr; + *timestamp = GST_CLOCK_TIME_NONE; + *duration = GST_CLOCK_TIME_NONE; - while (g_queue_is_empty (self->queue)) { + while (gst_queue_array_is_empty (self->queue)) { ret = gst_mf_source_reader_read_sample (self); if (ret != GST_FLOW_OK) return ret; @@ -489,7 +528,37 @@ g_mutex_unlock (&self->lock); } - *media_buffer = 
(IMFMediaBuffer *) g_queue_pop_head (self->queue); + reader_sample = + (GstMFSourceReaderSample *) gst_queue_array_pop_head_struct (self->queue); + sample = reader_sample->sample; + g_assert (sample); + + hr = sample->GetBufferCount (&count); + if (!gst_mf_result (hr) || count == 0) { + GST_WARNING_OBJECT (self, "Empty IMFSample, read again"); + goto done; + } + + /* XXX: read the first buffer and ignore the others for now */ + hr = sample->GetBufferByIndex (0, buffer); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get IMFMediaBuffer from sample"); + goto done; + } + + hr = sample->GetSampleDuration (&mf_timestamp); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get sample duration"); + *duration = GST_CLOCK_TIME_NONE; + } else { + /* Media Foundation uses 100 nano seconds unit */ + *duration = mf_timestamp * 100; + } + + *timestamp = reader_sample->clock_time; + +done: + gst_mf_source_reader_sample_clear (reader_sample); return GST_FLOW_OK; } @@ -499,13 +568,19 @@ { GstMFSourceReader *self = GST_MF_SOURCE_READER (object); GstFlowReturn ret = GST_FLOW_OK; - ComPtr<IMFMediaBuffer> media_buffer; + ComPtr < IMFMediaBuffer > media_buffer; GstVideoFrame frame; BYTE *data; gint i, j; HRESULT hr; + GstClockTime timestamp = GST_CLOCK_TIME_NONE; + GstClockTime duration = GST_CLOCK_TIME_NONE; + + do { + ret = gst_mf_source_reader_get_media_buffer (self, + media_buffer.ReleaseAndGetAddressOf (), ×tamp, &duration); + } while (ret == GST_FLOW_OK && !media_buffer); - ret = gst_mf_source_reader_get_media_buffer (self, &media_buffer); if (ret != GST_FLOW_OK) return ret; @@ -568,6 +643,10 @@ gst_video_frame_unmap (&frame); media_buffer->Unlock (); + GST_BUFFER_PTS (buffer) = timestamp; + GST_BUFFER_DTS (buffer) = GST_CLOCK_TIME_NONE; + GST_BUFFER_DURATION (buffer) = duration; + return GST_FLOW_OK; } @@ -576,14 +655,20 @@ { GstMFSourceReader *self = GST_MF_SOURCE_READER (object); GstFlowReturn ret = GST_FLOW_OK; - ComPtr<IMFMediaBuffer> 
media_buffer; + ComPtr < IMFMediaBuffer > media_buffer; HRESULT hr; BYTE *data; DWORD len = 0; GstBuffer *buf; GstMapInfo info; + GstClockTime timestamp = GST_CLOCK_TIME_NONE; + GstClockTime duration = GST_CLOCK_TIME_NONE; + + do { + ret = gst_mf_source_reader_get_media_buffer (self, + media_buffer.ReleaseAndGetAddressOf (), ×tamp, &duration); + } while (ret == GST_FLOW_OK && !media_buffer); - ret = gst_mf_source_reader_get_media_buffer (self, &media_buffer); if (ret != GST_FLOW_OK) return ret; @@ -606,6 +691,11 @@ media_buffer->Unlock (); + GST_BUFFER_PTS (buf) = timestamp; + /* Set DTS since this is compressed format */ + GST_BUFFER_DTS (buf) = timestamp; + GST_BUFFER_DURATION (buf) = duration; + *buffer = buf; return GST_FLOW_OK; @@ -784,7 +874,7 @@ { HRESULT hr; GList *ret = NULL; - ComPtr<IMFAttributes> attr; + ComPtr < IMFAttributes > attr; IMFActivate **devices = NULL; UINT32 i, count = 0; @@ -818,9 +908,9 @@ switch (source_type) { case GST_MF_SOURCE_TYPE_VIDEO: - hr = activate->GetAllocatedString ( - MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK, - &name, &name_len); + hr = activate->GetAllocatedString + (MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK, &name, + &name_len); break; default: g_assert_not_reached (); @@ -854,7 +944,7 @@ *device_sources = ret; - return ! !ret; + return !!ret; } static void @@ -870,6 +960,19 @@ g_free (activate); } +static void +gst_mf_source_reader_sample_clear (GstMFSourceReaderSample * reader_sample) +{ + if (!reader_sample) + return; + + if (reader_sample->sample) + reader_sample->sample->Release (); + + reader_sample->sample = nullptr; + reader_sample->clock_time = GST_CLOCK_TIME_NONE; +} + GstMFSourceObject * gst_mf_source_reader_new (GstMFSourceType type, gint device_index, const gchar * device_name, const gchar * device_path) @@ -892,4 +995,4 @@ } return self; -} \ No newline at end of file +}
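The source-reader changes above carry a unit conversion worth calling out: Media Foundation reports sample times and durations in 100-nanosecond units, while GStreamer buffer timestamps are plain nanoseconds, hence the `mf_timestamp * 100` in `gst_mf_source_reader_get_media_buffer()`. A minimal sketch of both directions (`mf_to_gst_time`/`gst_to_mf_time` are illustrative names, not GStreamer API):

```c
#include <assert.h>
#include <stdint.h>

/* Media Foundation time unit: 100 ns; GStreamer time unit: 1 ns. */
static uint64_t mf_to_gst_time(int64_t mf_time)
{
    return (uint64_t) mf_time * 100;   /* 100 ns ticks -> nanoseconds */
}

static int64_t gst_to_mf_time(uint64_t gst_time)
{
    return (int64_t) (gst_time / 100); /* nanoseconds -> 100 ns ticks */
}
```

One second is therefore 10,000,000 Media Foundation ticks but `GST_SECOND` (1,000,000,000) in GStreamer; mixing the two scales is a classic source of 100x timestamp bugs.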
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmftransform.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmftransform.cpp
Changed
@@ -22,12 +22,16 @@ #include "config.h" #endif +#include "gstmfconfig.h" + #include <gst/gst.h> #include "gstmftransform.h" #include "gstmfutils.h" +#include "gstmfplatloader.h" #include <string.h> #include <wrl.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; G_BEGIN_DECLS @@ -37,12 +41,192 @@ G_END_DECLS +typedef HRESULT (*GstMFTransformAsyncCallbackOnEvent) (MediaEventType event, + GstObject * client); + +class GstMFTransformAsyncCallback : public IMFAsyncCallback +{ +public: + static HRESULT + CreateInstance (IMFTransform * mft, + GstMFTransformAsyncCallbackOnEvent event_cb, GstObject * client, + GstMFTransformAsyncCallback ** callback) + { + HRESULT hr; + GstMFTransformAsyncCallback *self; + + if (!mft || !callback) + return E_INVALIDARG; + + self = new GstMFTransformAsyncCallback (); + + if (!self) + return E_OUTOFMEMORY; + + hr = self->Initialize (mft, event_cb, client); + + if (!gst_mf_result (hr)) { + self->Release (); + return hr; + } + + *callback = self; + + return S_OK; + } + + HRESULT + BeginGetEvent (void) + { + if (!gen_) + return E_FAIL; + + /* we are running already */ + if (running_) + return S_OK; + + running_ = true; + + return gen_->BeginGetEvent (this, nullptr); + } + + HRESULT + Stop (void) + { + running_ = false; + + return S_OK; + } + + /* IUnknown */ + STDMETHODIMP + QueryInterface (REFIID riid, void ** object) + { + return E_NOTIMPL; + } + + STDMETHODIMP_ (ULONG) + AddRef (void) + { + GST_TRACE ("%p, %d", this, ref_count_); + return InterlockedIncrement (&ref_count_); + } + + STDMETHODIMP_ (ULONG) + Release (void) + { + ULONG ref_count; + + GST_TRACE ("%p, %d", this, ref_count_); + ref_count = InterlockedDecrement (&ref_count_); + + if (ref_count == 0) { + GST_TRACE ("Delete instance %p", this); + delete this; + } + + return ref_count; + } + + /* IMFAsyncCallback */ + STDMETHODIMP + GetParameters (DWORD * flags, DWORD * queue) + { + /* this callback could be blocked */ + *flags = MFASYNC_BLOCKING_CALLBACK; + *queue = 
MFASYNC_CALLBACK_QUEUE_MULTITHREADED; + return S_OK; + } + + STDMETHODIMP + Invoke (IMFAsyncResult * async_result) + { + ComPtr<IMFMediaEvent> event; + HRESULT hr; + bool do_next = true; + + hr = gen_->EndGetEvent (async_result, &event); + + if (!gst_mf_result (hr)) + return hr; + + if (event) { + MediaEventType type; + GstObject *client = nullptr; + hr = event->GetType(&type); + if (!gst_mf_result (hr)) + return hr; + + if (!event_cb_) + return S_OK; + + client = (GstObject *) g_weak_ref_get (&client_); + if (!client) + return S_OK; + + hr = event_cb_ (type, client); + gst_object_unref (client); + if (!gst_mf_result (hr)) + return hr; + + /* On Drain event, this callback object will stop calling BeginGetEvent() + * since there might be no more following events. Client should call + * our BeginGetEvent() method to run again */ + if (type == METransformDrainComplete) + do_next = false; + } + + if (do_next) + gen_->BeginGetEvent(this, nullptr); + + return S_OK; + } + +private: + GstMFTransformAsyncCallback () + : ref_count_ (1) + , running_ (false) + { + g_weak_ref_init (&client_, NULL); + } + + ~GstMFTransformAsyncCallback () + { + g_weak_ref_clear (&client_); + } + + HRESULT + Initialize (IMFTransform * mft, GstMFTransformAsyncCallbackOnEvent event_cb, + GstObject * client) + { + HRESULT hr = mft->QueryInterface(IID_PPV_ARGS(&gen_)); + + if (!gst_mf_result (hr)) + return hr; + + event_cb_ = event_cb; + g_weak_ref_set (&client_, client); + + return S_OK; + } + +private: + ULONG ref_count_; + ComPtr<IMFMediaEventGenerator> gen_; + GstMFTransformAsyncCallbackOnEvent event_cb_; + GWeakRef client_; + + bool running_; +}; +/* *INDENT-ON* */ + enum { PROP_0, PROP_DEVICE_NAME, PROP_HARDWARE, PROP_ENUM_PARAMS, + PROP_D3D11_AWARE, }; struct _GstMFTransform @@ -54,11 +238,12 @@ gchar *device_name; gboolean hardware; + gboolean d3d11_aware; IMFActivate *activate; IMFTransform *transform; - ICodecAPI * codec_api; - IMFMediaEventGenerator *event_gen; + ICodecAPI *codec_api; + 
GstMFTransformAsyncCallback *callback_object; GQueue *output_queue; @@ -68,13 +253,19 @@ gboolean running; gint pending_need_input; - gint pending_have_output; GThread *thread; GMutex lock; GCond cond; + GMutex event_lock; + GCond event_cond; GMainContext *context; GMainLoop *loop; + gboolean draining; + gboolean flushing; + + GstMFTransformNewSampleCallback callback; + gpointer user_data; }; #define gst_mf_transform_parent_class parent_class @@ -89,6 +280,8 @@ static gpointer gst_mf_transform_thread_func (GstMFTransform * self); static gboolean gst_mf_transform_close (GstMFTransform * self); +static HRESULT gst_mf_transform_on_event (MediaEventType event, + GstMFTransform * self); static void gst_mf_transform_class_init (GstMFTransformClass * klass) @@ -112,7 +305,11 @@ g_param_spec_pointer ("enum-params", "Enum Params", "GstMFTransformEnumParams for MFTEnumEx", (GParamFlags) (G_PARAM_WRITABLE | G_PARAM_CONSTRUCT_ONLY | - G_PARAM_STATIC_STRINGS))); + G_PARAM_STATIC_STRINGS))); + g_object_class_install_property (gobject_class, PROP_D3D11_AWARE, + g_param_spec_boolean ("d3d11-aware", "D3D11 Aware", + "Whether Direct3D11 supports Direct3D11", FALSE, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); } static void @@ -121,7 +318,9 @@ self->output_queue = g_queue_new (); g_mutex_init (&self->lock); + g_mutex_init (&self->event_lock); g_cond_init (&self->cond); + g_cond_init (&self->event_cond); self->context = g_main_context_new (); self->loop = g_main_loop_new (self->context, FALSE); @@ -144,7 +343,7 @@ } static void -gst_mf_transform_clear_enum_params (GstMFTransformEnumParams *params) +gst_mf_transform_clear_enum_params (GstMFTransformEnumParams * params) { g_free (params->input_typeinfo); params->input_typeinfo = NULL; @@ -174,7 +373,9 @@ gst_mf_transform_clear_enum_params (&self->enum_params); g_free (self->device_name); g_mutex_clear (&self->lock); + g_mutex_clear (&self->event_lock); g_cond_clear (&self->cond); + g_cond_clear (&self->event_cond); 
G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -192,6 +393,9 @@ case PROP_HARDWARE: g_value_set_boolean (value, self->hardware); break; + case PROP_D3D11_AWARE: + g_value_set_boolean (value, self->d3d11_aware); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -214,6 +418,7 @@ self->enum_params.category = params->category; self->enum_params.enum_flags = params->enum_flags; self->enum_params.device_index = params->device_index; + self->enum_params.adapter_luid = params->adapter_luid; if (params->input_typeinfo) { self->enum_params.input_typeinfo = g_new0 (MFT_REGISTER_TYPE_INFO, 1); memcpy (self->enum_params.input_typeinfo, params->input_typeinfo, @@ -248,7 +453,7 @@ static gpointer gst_mf_transform_thread_func (GstMFTransform * self) { - HRESULT hr; + HRESULT hr = S_OK; IMFActivate **devices = NULL; UINT32 num_devices, i; LPWSTR name = NULL; @@ -264,9 +469,43 @@ g_source_attach (source, self->context); g_source_unref (source); - hr = MFTEnumEx (self->enum_params.category, self->enum_params.enum_flags, - self->enum_params.input_typeinfo, self->enum_params.output_typeinfo, - &devices, &num_devices); + /* NOTE: MFTEnum2 is desktop only and requires Windows 10 */ +#if GST_MF_HAVE_D3D11 + if (gst_mf_plat_load_library () && self->enum_params.adapter_luid && + (self->enum_params.enum_flags & MFT_ENUM_FLAG_HARDWARE) != 0) { + ComPtr < IMFAttributes > attr; + LUID luid; + + hr = MFCreateAttributes (&attr, 1); + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (self, "Couldn't create IMFAttributes"); + goto run_loop; + } + + GST_INFO_OBJECT (self, + "Enumerating MFT for adapter-luid %" G_GINT64_FORMAT, + self->enum_params.adapter_luid); + + luid.LowPart = (DWORD) (self->enum_params.adapter_luid & 0xffffffff); + luid.HighPart = (LONG) (self->enum_params.adapter_luid >> 32); + + hr = attr->SetBlob (GST_GUID_MFT_ENUM_ADAPTER_LUID, (BYTE *) & luid, + sizeof (LUID)); + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (self, "Couldn't set 
MFT_ENUM_ADAPTER_LUID"); + goto run_loop; + } + + hr = GstMFTEnum2 (self->enum_params.category, + self->enum_params.enum_flags, self->enum_params.input_typeinfo, + self->enum_params.output_typeinfo, attr.Get (), &devices, &num_devices); + } else +#endif + { + hr = MFTEnumEx (self->enum_params.category, self->enum_params.enum_flags, + self->enum_params.input_typeinfo, self->enum_params.output_typeinfo, + &devices, &num_devices); + } if (!gst_mf_result (hr)) { GST_WARNING_OBJECT (self, "MFTEnumEx failure"); @@ -290,7 +529,7 @@ devices[i]->Release (); hr = self->activate->GetAllocatedString (MFT_FRIENDLY_NAME_Attribute, - &name, NULL); + &name, NULL); if (gst_mf_result (hr)) { self->device_name = g_utf16_to_utf8 ((const gunichar2 *) name, @@ -302,7 +541,7 @@ CoTaskMemFree (devices); - self->hardware = ! !(self->enum_params.enum_flags & MFT_ENUM_FLAG_HARDWARE); + self->hardware = !!(self->enum_params.enum_flags & MFT_ENUM_FLAG_HARDWARE); self->initialized = TRUE; run_loop: @@ -325,68 +564,6 @@ return NULL; } -static HRESULT -gst_mf_transform_pop_event (GstMFTransform * self, - gboolean no_wait, MediaEventType * event_type) -{ - ComPtr<IMFMediaEvent> event; - MediaEventType type; - HRESULT hr; - DWORD flags = 0; - - if (!self->hardware || !self->event_gen) - return MF_E_NO_EVENTS_AVAILABLE; - - if (no_wait) - flags = MF_EVENT_FLAG_NO_WAIT; - - hr = self->event_gen->GetEvent (flags, event.GetAddressOf ()); - - if (hr == MF_E_NO_EVENTS_AVAILABLE) - return hr; - else if (!gst_mf_result (hr)) - return hr; - - hr = event->GetType (&type); - if (!gst_mf_result (hr)) { - GST_ERROR_OBJECT (self, "Failed to get event, hr: 0x%x", (guint) hr); - - return hr; - } - - *event_type = type; - return S_OK; -} - -static void -gst_mf_transform_drain_all_events (GstMFTransform * self) -{ - HRESULT hr; - - if (!self->hardware) - return; - - do { - MediaEventType type; - - hr = gst_mf_transform_pop_event (self, TRUE, &type); - if (hr == MF_E_NO_EVENTS_AVAILABLE || !gst_mf_result (hr)) - 
return; - - switch (type) { - case METransformNeedInput: - self->pending_need_input++; - break; - case METransformHaveOutput: - self->pending_have_output++; - break; - default: - GST_DEBUG_OBJECT (self, "Unhandled event %d", type); - break; - } - } while (SUCCEEDED (hr)); -} - static GstFlowReturn gst_mf_transform_process_output (GstMFTransform * self) { @@ -408,11 +585,10 @@ if ((out_stream_info.dwFlags & (MFT_OUTPUT_STREAM_PROVIDES_SAMPLES | MFT_OUTPUT_STREAM_CAN_PROVIDE_SAMPLES)) == 0) { - ComPtr<IMFMediaBuffer> buffer; - ComPtr<IMFSample> new_sample; + ComPtr < IMFMediaBuffer > buffer; + ComPtr < IMFSample > new_sample; - hr = MFCreateMemoryBuffer (out_stream_info.cbSize, - buffer.GetAddressOf ()); + hr = MFCreateMemoryBuffer (out_stream_info.cbSize, buffer.GetAddressOf ()); if (!gst_mf_result (hr)) { GST_ERROR_OBJECT (self, "Couldn't create memory buffer"); return GST_FLOW_ERROR; @@ -437,14 +613,11 @@ hr = transform->ProcessOutput (0, 1, &out_data, &status); - if (self->hardware) - self->pending_have_output--; - if (hr == MF_E_TRANSFORM_NEED_MORE_INPUT) { GST_LOG_OBJECT (self, "Need more input data"); ret = GST_MF_TRANSFORM_FLOW_NEED_DATA; } else if (hr == MF_E_TRANSFORM_STREAM_CHANGE) { - ComPtr<IMFMediaType> output_type; + ComPtr < IMFMediaType > output_type; GST_DEBUG_OBJECT (self, "Stream change, set output type again"); @@ -465,14 +638,19 @@ ret = GST_MF_TRANSFORM_FLOW_NEED_DATA; } else if (!gst_mf_result (hr)) { - GST_ERROR_OBJECT (self, "ProcessOutput error"); - ret = GST_FLOW_ERROR; + if (self->flushing) { + GST_DEBUG_OBJECT (self, "Ignore error on flushing"); + ret = GST_FLOW_FLUSHING; + } else { + GST_ERROR_OBJECT (self, "ProcessOutput error, hr 0x%x", hr); + ret = GST_FLOW_ERROR; + } } done: if (ret != GST_FLOW_OK) { if (out_data.pSample) - out_data.pSample->Release(); + out_data.pSample->Release (); return ret; } @@ -482,14 +660,20 @@ return GST_FLOW_OK; } + if (self->callback) { + self->callback (self, out_data.pSample, self->user_data); + 
out_data.pSample->Release (); + return GST_FLOW_OK; + } + g_queue_push_tail (self->output_queue, out_data.pSample); return GST_FLOW_OK; } +/* Must be called with event_lock */ static gboolean -gst_mf_transform_process_input_sync (GstMFTransform * self, - IMFSample * sample) +gst_mf_transform_process_input_sync (GstMFTransform * self, IMFSample * sample) { HRESULT hr; @@ -502,11 +686,10 @@ } gboolean -gst_mf_transform_process_input (GstMFTransform * object, - IMFSample * sample) +gst_mf_transform_process_input (GstMFTransform * object, IMFSample * sample) { HRESULT hr; - GstFlowReturn ret; + gboolean ret = FALSE; g_return_val_if_fail (GST_IS_MF_TRANSFORM (object), FALSE); g_return_val_if_fail (sample != NULL, FALSE); @@ -516,101 +699,79 @@ if (!object->transform) return FALSE; + g_mutex_lock (&object->event_lock); if (!object->running) { + object->pending_need_input = 0; + hr = object->transform->ProcessMessage (MFT_MESSAGE_NOTIFY_START_OF_STREAM, 0); if (!gst_mf_result (hr)) { GST_ERROR_OBJECT (object, "Cannot post start-of-stream message"); - return FALSE; + goto done; } hr = object->transform->ProcessMessage (MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, 0); if (!gst_mf_result (hr)) { GST_ERROR_OBJECT (object, "Cannot post begin-stream message"); - return FALSE; + goto done; + } + + if (object->callback_object) { + hr = object->callback_object->BeginGetEvent (); + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (object, "BeginGetEvent failed"); + goto done; + } } GST_DEBUG_OBJECT (object, "MFT is running now"); object->running = TRUE; + object->flushing = FALSE; } - gst_mf_transform_drain_all_events (object); - + /* Wait METransformNeedInput event. While waiting METransformNeedInput + * event, we can still output data if MFT notifyes METransformHaveOutput + * event. 
*/ if (object->hardware) { - process_output: - /* Process pending output first */ - while (object->pending_have_output > 0) { - GST_TRACE_OBJECT (object, - "Pending have output %d", object->pending_have_output); - ret = gst_mf_transform_process_output (object); - if (ret != GST_FLOW_OK) { - if (ret == GST_VIDEO_ENCODER_FLOW_NEED_DATA) { - GST_TRACE_OBJECT (object, "Need more data"); - ret = GST_FLOW_OK; - break; - } else { - GST_WARNING_OBJECT (object, - "Couldn't process output, ret %s", gst_flow_get_name (ret)); - return FALSE; - } - } - } - - while (object->pending_need_input == 0) { - MediaEventType type; - HRESULT hr; - - GST_TRACE_OBJECT (object, "No pending need input, waiting event"); + while (object->pending_need_input == 0 && !object->flushing) + g_cond_wait (&object->event_cond, &object->event_lock); + } - hr = gst_mf_transform_pop_event (object, FALSE, &type); - if (hr != MF_E_NO_EVENTS_AVAILABLE && !gst_mf_result (hr)) { - GST_DEBUG_OBJECT (object, "failed to pop event, hr: 0x%x", (guint) hr); - return FALSE; - } + if (object->flushing) { + GST_DEBUG_OBJECT (object, "We are flushing"); + ret = TRUE; + goto done; + } - GST_TRACE_OBJECT (object, "Got event type %d", (gint) type); - - switch (type) { - case METransformNeedInput: - object->pending_need_input++; - break; - case METransformHaveOutput: - object->pending_have_output++; - break; - default: - GST_DEBUG_OBJECT (object, "Unhandled event %d", type); - break; - } + ret = gst_mf_transform_process_input_sync (object, sample); - /* If MFT doesn't want to handle input yet but we have pending output, - * process output again */ - if (object->pending_have_output > 0 && object->pending_need_input == 0) { - GST_TRACE_OBJECT (object, - "Only have pending output, process output again"); - goto process_output; - } - } - } +done: + g_mutex_unlock (&object->event_lock); - return gst_mf_transform_process_input_sync (object, sample); + return ret; } GstFlowReturn -gst_mf_transform_get_output (GstMFTransform * 
object, - IMFSample ** sample) +gst_mf_transform_get_output (GstMFTransform * object, IMFSample ** sample) { + GstFlowReturn ret; + g_return_val_if_fail (GST_IS_MF_TRANSFORM (object), GST_FLOW_ERROR); g_return_val_if_fail (sample != NULL, GST_FLOW_ERROR); + /* Hardware MFT must not call this method, instead client must install + * new sample callback so that outputting data from Media Foundation's + * worker thread */ + g_return_val_if_fail (!object->hardware, GST_FLOW_ERROR); if (!object->transform) return GST_FLOW_ERROR; - gst_mf_transform_drain_all_events (object); + ret = gst_mf_transform_process_output (object); - if (!object->hardware || object->pending_have_output) - gst_mf_transform_process_output (object); + if (ret != GST_MF_TRANSFORM_FLOW_NEED_DATA && ret != GST_FLOW_OK) + return ret; if (g_queue_is_empty (object->output_queue)) return GST_MF_TRANSFORM_FLOW_NEED_DATA; @@ -625,11 +786,23 @@ { g_return_val_if_fail (GST_IS_MF_TRANSFORM (object), FALSE); + g_mutex_lock (&object->event_lock); + object->flushing = TRUE; + g_cond_broadcast (&object->event_cond); + g_mutex_unlock (&object->event_lock); + if (object->transform) { - if (object->running) + /* In case of async MFT, there would be no more event after FLUSH, + * then callback object shouldn't wait another event. 
+ * Call Stop() so that our callback object can stop calling BeginGetEvent() + * from its Invoke() method */ + if (object->callback_object) + object->callback_object->Stop (); + + if (object->running) { object->transform->ProcessMessage (MFT_MESSAGE_COMMAND_FLUSH, 0); + } - object->pending_have_output = 0; object->pending_need_input = 0; } @@ -654,47 +827,28 @@ return TRUE; object->running = FALSE; - object->transform->ProcessMessage (MFT_MESSAGE_COMMAND_DRAIN, 0); - - if (object->hardware) { - MediaEventType type; - HRESULT hr; - - do { - hr = gst_mf_transform_pop_event (object, FALSE, &type); - if (hr != MF_E_NO_EVENTS_AVAILABLE && FAILED (hr)) { - GST_DEBUG_OBJECT (object, "failed to pop event, hr: 0x%x", (guint) hr); - break; - } + object->draining = TRUE; - switch (type) { - case METransformNeedInput: - GST_DEBUG_OBJECT (object, "Ignore need input during finish"); - break; - case METransformHaveOutput: - object->pending_have_output++; - gst_mf_transform_process_output (object); - break; - case METransformDrainComplete: - GST_DEBUG_OBJECT (object, "Drain complete"); - return TRUE; - default: - GST_DEBUG_OBJECT (object, "Unhandled event %d", type); - break; - } - } while (SUCCEEDED (hr)); + GST_DEBUG_OBJECT (object, "Start drain"); - /* and drain all the other events if any */ - gst_mf_transform_drain_all_events (object); + object->transform->ProcessMessage (MFT_MESSAGE_COMMAND_DRAIN, 0); - object->pending_have_output = 0; - object->pending_need_input = 0; + if (object->hardware) { + g_mutex_lock (&object->event_lock); + while (object->draining) + g_cond_wait (&object->event_cond, &object->event_lock); + g_mutex_unlock (&object->event_lock); } else { do { ret = gst_mf_transform_process_output (object); } while (ret == GST_FLOW_OK); } + GST_DEBUG_OBJECT (object, "End drain"); + + object->draining = FALSE; + object->pending_need_input = 0; + return TRUE; } @@ -714,8 +868,7 @@ data->ret = FALSE; gst_mf_transform_close (object); - hr = 
object->activate->ActivateObject (IID_IMFTransform, - (void **) &object->transform); + hr = object->activate->ActivateObject (IID_PPV_ARGS (&object->transform)); if (!gst_mf_result (hr)) { GST_WARNING_OBJECT (object, "Couldn't open MFT"); @@ -723,7 +876,8 @@ } if (object->hardware) { - ComPtr<IMFAttributes> attr; + ComPtr < IMFAttributes > attr; + UINT32 supports_d3d11 = 0; hr = object->transform->GetAttributes (attr.GetAddressOf ()); if (!gst_mf_result (hr)) { @@ -737,9 +891,20 @@ goto done; } - hr = object->transform->QueryInterface (IID_IMFMediaEventGenerator, - (void **) &object->event_gen); - if (!gst_mf_result (hr)) { + hr = attr->GetUINT32 (GST_GUID_MF_SA_D3D11_AWARE, &supports_d3d11); + if (gst_mf_result (hr) && supports_d3d11 != 0) { + GST_DEBUG_OBJECT (object, "MFT supports direct3d11"); + object->d3d11_aware = TRUE; + } + + /* Create our IMFAsyncCallback object so that we can listen for METransformNeedInput + * and METransformHaveOutput events. The event callback will be called from + * Media Foundation's worker queue thread */ + hr = GstMFTransformAsyncCallback::CreateInstance (object->transform, + (GstMFTransformAsyncCallbackOnEvent) gst_mf_transform_on_event, + GST_OBJECT_CAST (object), &object->callback_object); + + if (!object->callback_object) { GST_ERROR_OBJECT (object, "IMFMediaEventGenerator unavailable"); goto done; } @@ -752,8 +917,7 @@ object->output_id = 0; } - hr = object->transform->QueryInterface (IID_ICodecAPI, - (void **) &object->codec_api); + hr = object->transform->QueryInterface (IID_PPV_ARGS (&object->codec_api)); if (!gst_mf_result (hr)) { GST_WARNING_OBJECT (object, "ICodecAPI is unavailable"); } @@ -795,6 +959,39 @@ return data.ret; } +gboolean +gst_mf_transform_set_device_manager (GstMFTransform * object, + IMFDXGIDeviceManager * manager) +{ + HRESULT hr; + + g_return_val_if_fail (GST_IS_MF_TRANSFORM (object), FALSE); + + if (!object->transform) { + GST_ERROR_OBJECT (object, "IMFTransform is not configured yet"); + return FALSE; + } + + 
hr = object->transform->ProcessMessage (MFT_MESSAGE_SET_D3D_MANAGER, + (ULONG_PTR) manager); + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (object, "Couldn't set device manager"); + return FALSE; + } + + return TRUE; +} + +void +gst_mf_transform_set_new_sample_callback (GstMFTransform * object, + GstMFTransformNewSampleCallback callback, gpointer user_data) +{ + g_return_if_fail (GST_IS_MF_TRANSFORM (object)); + + object->callback = callback; + object->user_data = user_data; +} + static gboolean gst_mf_transform_close (GstMFTransform * object) { @@ -802,9 +999,14 @@ gst_mf_transform_flush (object); - if (object->event_gen) { - object->event_gen->Release (); - object->event_gen = NULL; + /* Otherwise IMFTransform will be alive even after we release the IMFTransform + * below */ + if (object->activate) + object->activate->ShutdownObject (); + + if (object->callback_object) { + object->callback_object->Release (); + object->callback_object = nullptr; } if (object->codec_api) { @@ -820,6 +1022,56 @@ return TRUE; } +static const gchar * +gst_mf_transform_event_type_to_string (MediaEventType event) +{ + switch (event) { + case METransformNeedInput: + return "METransformNeedInput"; + case METransformHaveOutput: + return "METransformHaveOutput"; + case METransformDrainComplete: + return "METransformDrainComplete"; + case METransformMarker: + return "METransformMarker"; + case METransformInputStreamStateChanged: + return "METransformInputStreamStateChanged"; + default: + break; + } + + return "Unknown"; +} + +static HRESULT +gst_mf_transform_on_event (MediaEventType event, GstMFTransform * self) +{ + GST_TRACE_OBJECT (self, "Have event %s (%d)", + gst_mf_transform_event_type_to_string (event), (gint) event); + + switch (event) { + case METransformNeedInput: + g_mutex_lock (&self->event_lock); + self->pending_need_input++; + g_cond_broadcast (&self->event_cond); + g_mutex_unlock (&self->event_lock); + break; + case METransformHaveOutput: + gst_mf_transform_process_output 
(self); + break; + case METransformDrainComplete: + g_mutex_lock (&self->event_lock); + self->draining = FALSE; + g_cond_broadcast (&self->event_cond); + g_mutex_unlock (&self->event_lock); + break; + default: + break; + } + + return S_OK; +} + IMFActivate * gst_mf_transform_get_activate_handle (GstMFTransform * object) { @@ -848,8 +1100,7 @@ g_return_val_if_fail (GST_IS_MF_TRANSFORM (object), NULL); if (!object->codec_api) { - GST_WARNING_OBJECT (object, - "ICodecAPI is not configured, open MFT first"); + GST_WARNING_OBJECT (object, "ICodecAPI is not configured, open MFT first"); return NULL; } @@ -1115,4 +1366,3 @@ return gst_mf_result (hr); } -
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmftransform.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmftransform.h
Changed
@@ -34,6 +34,26 @@ #define GST_MF_TRANSFORM_FLOW_NEED_DATA GST_FLOW_CUSTOM_SUCCESS +/* NOTE: This GUID is defined in the mfapi.h header but it's available only + * on Windows 10 RS1 or newer. Define the GUID here again so that it can be + * used even if the build target (e.g., WINVER) isn't Windows 10 */ +DEFINE_GUID(GST_GUID_MFT_ENUM_ADAPTER_LUID, + 0x1d39518c, 0xe220, 0x4da8, 0xa0, 0x7f, 0xba, 0x17, 0x25, 0x52, 0xd6, 0xb1); + +/* The GUIDs below are defined in mftransform.h for Windows 8 or greater. + * FIXME: remove these defines when we bump the minimum supported OS version to + * Windows 10 */ +DEFINE_GUID(GST_GUID_MF_SA_D3D11_AWARE, + 0x206b4fc8, 0xfcf9, 0x4c51, 0xaf, 0xe3, 0x97, 0x64, 0x36, 0x9e, 0x33, 0xa0); +DEFINE_GUID(GST_GUID_MF_SA_BUFFERS_PER_SAMPLE, + 0x873c5171, 0x1e3d, 0x4e25, 0x98, 0x8d, 0xb4, 0x33, 0xce, 0x04, 0x19, 0x83); +DEFINE_GUID(GST_GUID_MF_SA_D3D11_USAGE, + 0xe85fe442, 0x2ca3, 0x486e, 0xa9, 0xc7, 0x10, 0x9d, 0xda, 0x60, 0x98, 0x80); +DEFINE_GUID(GST_GUID_MF_SA_D3D11_SHARED_WITHOUT_MUTEX, + 0x39dbd44d, 0x2e44, 0x4931, 0xa4, 0xc8, 0x35, 0x2d, 0x3d, 0xc4, 0x21, 0x15); +DEFINE_GUID(GST_GUID_MF_SA_D3D11_BINDFLAGS, + 0xeacf97ad, 0x065c, 0x4408, 0xbe, 0xe3, 0xfd, 0xcb, 0xfd, 0x12, 0x8b, 0xe2); + typedef struct _GstMFTransformEnumParams { GUID category; @@ -42,12 +62,24 @@ MFT_REGISTER_TYPE_INFO *output_typeinfo; guint device_index; + gint64 adapter_luid; } GstMFTransformEnumParams; +typedef HRESULT (*GstMFTransformNewSampleCallback) (GstMFTransform * object, + IMFSample * sample, + gpointer user_data); + GstMFTransform * gst_mf_transform_new (GstMFTransformEnumParams * params); gboolean gst_mf_transform_open (GstMFTransform * object); +gboolean gst_mf_transform_set_device_manager (GstMFTransform * object, + IMFDXGIDeviceManager * manager); + +void gst_mf_transform_set_new_sample_callback (GstMFTransform * object, + GstMFTransformNewSampleCallback callback, + gpointer user_data); + IMFActivate * gst_mf_transform_get_activate_handle (GstMFTransform * object); 
IMFTransform * gst_mf_transform_get_transform_handle (GstMFTransform * object);
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfutils.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfutils.cpp
Changed
@@ -27,6 +27,7 @@ #include "gstmfutils.h" #include <wrl.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; G_BEGIN_DECLS @@ -92,6 +93,7 @@ {MFVideoFormat_VP90, "video/x-vp9"}, {MFVideoFormat_MJPG, "image/jpeg"}, }; +/* *INDENT-ON* */ GstVideoFormat gst_mf_video_subtype_to_video_format (const GUID * subtype) @@ -506,178 +508,178 @@ } G_STMT_END static const gchar * -gst_mf_guid_to_static_string (const GUID& guid) +gst_mf_guid_to_static_string (const GUID & guid) { - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MAJOR_TYPE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MAJOR_TYPE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_SUBTYPE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_ALL_SAMPLES_INDEPENDENT); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_FIXED_SIZE_SAMPLES); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_COMPRESSED); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_SAMPLE_SIZE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_WRAPPED_TYPE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_NUM_CHANNELS); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_SAMPLES_PER_SECOND); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_FLOAT_SAMPLES_PER_SECOND); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_AVG_BYTES_PER_SECOND); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_BLOCK_ALIGNMENT); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_BITS_PER_SAMPLE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_VALID_BITS_PER_SAMPLE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_SAMPLES_PER_BLOCK); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_CHANNEL_MASK); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_FOLDDOWN_MATRIX); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_PEAKREF); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_PEAKTARGET); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_AVGREF); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_AVGTARGET); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AUDIO_PREFER_WAVEFORMATEX); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AAC_PAYLOAD_TYPE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AAC_AUDIO_PROFILE_LEVEL_INDICATION); - GST_MF_IF_EQUAL_RETURN(guid, 
MF_MT_FRAME_SIZE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE_RANGE_MAX); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE_RANGE_MIN); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_PIXEL_ASPECT_RATIO); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DRM_FLAGS); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_PAD_CONTROL_FLAGS); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_SOURCE_CONTENT_HINT); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_VIDEO_CHROMA_SITING); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_INTERLACE_MODE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_TRANSFER_FUNCTION); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_VIDEO_PRIMARIES); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_YUV_MATRIX); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_VIDEO_LIGHTING); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_VIDEO_NOMINAL_RANGE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_GEOMETRIC_APERTURE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MINIMUM_DISPLAY_APERTURE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_PAN_SCAN_APERTURE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_PAN_SCAN_ENABLED); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AVG_BITRATE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AVG_BIT_ERROR_RATE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MAX_KEYFRAME_SPACING); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DEFAULT_STRIDE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_PALETTE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_USER_DATA); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG_START_TIME_CODE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG2_PROFILE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG2_LEVEL); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG2_FLAGS); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG_SEQUENCE_HEADER); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_SRC_PACK_0); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_CTRL_PACK_0); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_SRC_PACK_1); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_CTRL_PACK_1); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DV_VAUX_SRC_PACK); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_DV_VAUX_CTRL_PACK); - 
GST_MF_IF_EQUAL_RETURN(guid, MF_MT_IMAGE_LOSS_TOLERANT); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG4_SAMPLE_DESCRIPTION); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_MPEG4_CURRENT_SAMPLE_ENTRY); - - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_Audio); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_Video); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_Protected); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_SAMI); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_Script); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_Image); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_HTML); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_Binary); - GST_MF_IF_EQUAL_RETURN(guid, MFMediaType_FileTransfer); - - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_AI44); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_ARGB32); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_AYUV); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_DV25); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_DV50); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_DVH1); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_DVSD); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_DVSL); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_H264); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_H265); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_HEVC); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_HEVC_ES); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_I420); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_IYUV); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_M4S2); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MJPG); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MP43); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MP4S); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MP4V); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MPG1); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MSS1); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_MSS2); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_NV11); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_NV12); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_P010); - GST_MF_IF_EQUAL_RETURN(guid, 
MFVideoFormat_P016); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_P210); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_P216); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_RGB24); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_RGB32); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_RGB555); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_RGB565); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_RGB8); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_UYVY); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_v210); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_v410); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_VP80); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_VP90); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_WMV1); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_WMV2); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_WMV3); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_WVC1); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_Y210); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_Y216); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_Y410); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_Y416); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_Y41P); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_Y41T); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_YUY2); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_YV12); - GST_MF_IF_EQUAL_RETURN(guid, MFVideoFormat_YVYU); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MAJOR_TYPE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MAJOR_TYPE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_SUBTYPE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_ALL_SAMPLES_INDEPENDENT); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_FIXED_SIZE_SAMPLES); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_COMPRESSED); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_SAMPLE_SIZE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_WRAPPED_TYPE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_NUM_CHANNELS); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_SAMPLES_PER_SECOND); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_FLOAT_SAMPLES_PER_SECOND); + GST_MF_IF_EQUAL_RETURN (guid, 
MF_MT_AUDIO_AVG_BYTES_PER_SECOND); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_BLOCK_ALIGNMENT); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_BITS_PER_SAMPLE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_VALID_BITS_PER_SAMPLE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_SAMPLES_PER_BLOCK); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_CHANNEL_MASK); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_FOLDDOWN_MATRIX); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_WMADRC_PEAKREF); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_WMADRC_PEAKTARGET); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_WMADRC_AVGREF); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_WMADRC_AVGTARGET); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AUDIO_PREFER_WAVEFORMATEX); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AAC_PAYLOAD_TYPE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AAC_AUDIO_PROFILE_LEVEL_INDICATION); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_FRAME_SIZE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_FRAME_RATE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_FRAME_RATE_RANGE_MAX); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_FRAME_RATE_RANGE_MIN); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_PIXEL_ASPECT_RATIO); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DRM_FLAGS); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_PAD_CONTROL_FLAGS); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_SOURCE_CONTENT_HINT); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_VIDEO_CHROMA_SITING); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_INTERLACE_MODE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_TRANSFER_FUNCTION); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_VIDEO_PRIMARIES); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_YUV_MATRIX); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_VIDEO_LIGHTING); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_VIDEO_NOMINAL_RANGE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_GEOMETRIC_APERTURE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MINIMUM_DISPLAY_APERTURE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_PAN_SCAN_APERTURE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_PAN_SCAN_ENABLED); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AVG_BITRATE); + 
GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AVG_BIT_ERROR_RATE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MAX_KEYFRAME_SPACING); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DEFAULT_STRIDE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_PALETTE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_USER_DATA); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG_START_TIME_CODE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG2_PROFILE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG2_LEVEL); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG2_FLAGS); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG_SEQUENCE_HEADER); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DV_AAUX_SRC_PACK_0); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DV_AAUX_CTRL_PACK_0); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DV_AAUX_SRC_PACK_1); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DV_AAUX_CTRL_PACK_1); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DV_VAUX_SRC_PACK); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_DV_VAUX_CTRL_PACK); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_IMAGE_LOSS_TOLERANT); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG4_SAMPLE_DESCRIPTION); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_MPEG4_CURRENT_SAMPLE_ENTRY); + + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_Audio); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_Video); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_Protected); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_SAMI); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_Script); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_Image); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_HTML); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_Binary); + GST_MF_IF_EQUAL_RETURN (guid, MFMediaType_FileTransfer); + + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_AI44); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_ARGB32); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_AYUV); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_DV25); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_DV50); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_DVH1); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_DVSD); + GST_MF_IF_EQUAL_RETURN (guid, 
MFVideoFormat_DVSL); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_H264); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_H265); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_HEVC); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_HEVC_ES); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_I420); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_IYUV); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_M4S2); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MJPG); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MP43); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MP4S); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MP4V); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MPG1); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MSS1); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_MSS2); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_NV11); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_NV12); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_P010); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_P016); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_P210); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_P216); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_RGB24); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_RGB32); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_RGB555); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_RGB565); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_RGB8); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_UYVY); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_v210); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_v410); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_VP80); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_VP90); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_WMV1); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_WMV2); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_WMV3); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_WVC1); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_Y210); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_Y216); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_Y410); + 
GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_Y416); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_Y41P); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_Y41T); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_YUY2); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_YV12); + GST_MF_IF_EQUAL_RETURN (guid, MFVideoFormat_YVYU); /* WAVE_FORMAT_PCM */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_PCM); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_PCM); /* WAVE_FORMAT_IEEE_FLOAT */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_Float); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_Float); /* WAVE_FORMAT_DTS */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_DTS); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_DTS); /* WAVE_FORMAT_DOLBY_AC3_SPDIF */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_Dolby_AC3_SPDIF); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_Dolby_AC3_SPDIF); /* WAVE_FORMAT_DRM */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_DRM); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_DRM); /* WAVE_FORMAT_WMAUDIO2 */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudioV8); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_WMAudioV8); /* WAVE_FORMAT_WMAUDIO3 */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudioV9); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_WMAudioV9); /* WAVE_FORMAT_WMAUDIO_LOSSLESS */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudio_Lossless); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_WMAudio_Lossless); /* WAVE_FORMAT_WMASPDIF */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_WMASPDIF); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_WMASPDIF); /* WAVE_FORMAT_WMAVOICE9 */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_MSP1); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_MSP1); /* WAVE_FORMAT_MPEGLAYER3 */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_MP3); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_MP3); /* WAVE_FORMAT_MPEG */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_MPEG); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_MPEG); /* WAVE_FORMAT_MPEG_HEAAC */ - 
GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_AAC); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_AAC); /* WAVE_FORMAT_MPEG_ADTS_AAC */ - GST_MF_IF_EQUAL_RETURN(guid, MFAudioFormat_ADTS); + GST_MF_IF_EQUAL_RETURN (guid, MFAudioFormat_ADTS); #if GST_MF_WINAPI_DESKTOP - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_CUSTOM_VIDEO_PRIMARIES); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_AM_FORMAT_TYPE); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_ARBITRARY_HEADER); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_ARBITRARY_FORMAT); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_ORIGINAL_4CC); - GST_MF_IF_EQUAL_RETURN(guid, MF_MT_ORIGINAL_WAVE_FORMAT_TAG); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_CUSTOM_VIDEO_PRIMARIES); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_AM_FORMAT_TYPE); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_ARBITRARY_HEADER); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_ARBITRARY_FORMAT); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_ORIGINAL_4CC); + GST_MF_IF_EQUAL_RETURN (guid, MF_MT_ORIGINAL_WAVE_FORMAT_TAG); #endif return NULL; } static gchar * -gst_mf_guid_to_string (const GUID& guid) +gst_mf_guid_to_string (const GUID & guid) { const gchar *str = NULL; HRESULT hr; @@ -700,14 +702,14 @@ ret = g_strdup_printf ("%8.8x-%4.4x-%4.4x-%2.2x%2.2x-%2.2x%2.2x%2.2x%2.2x%2.2x%2.2x", (guint) guid.Data1, (guint) guid.Data2, (guint) guid.Data3, - guid.Data4[0], guid.Data4[1], guid.Data4[2], guid.Data4[3], - guid.Data4[4], guid.Data4[5], guid.Data4[6], guid.Data4[7]); + guid.Data4[0], guid.Data4[1], guid.Data4[2], guid.Data4[3], + guid.Data4[4], guid.Data4[5], guid.Data4[6], guid.Data4[7]); return ret; } static gchar * -gst_mf_attribute_value_to_string (const GUID& guid, const PROPVARIANT& var) +gst_mf_attribute_value_to_string (const GUID & guid, const PROPVARIANT & var) { if (IsEqualGUID (guid, MF_MT_FRAME_RATE) || IsEqualGUID (guid, MF_MT_FRAME_RATE_RANGE_MAX) || @@ -716,7 +718,7 @@ IsEqualGUID (guid, MF_MT_PIXEL_ASPECT_RATIO)) { UINT32 high = 0, low = 0; - Unpack2UINT32AsUINT64(var.uhVal.QuadPart, &high, &low); + Unpack2UINT32AsUINT64 
(var.uhVal.QuadPart, &high, &low); return g_strdup_printf ("%dx%d", high, low); } @@ -724,7 +726,7 @@ IsEqualGUID (guid, MF_MT_MINIMUM_DISPLAY_APERTURE) || IsEqualGUID (guid, MF_MT_PAN_SCAN_APERTURE)) { /* FIXME: Not our usecase for now */ - return g_strup ("Not parsed"); + return g_strdup ("Not parsed"); } switch (var.vt) { @@ -750,8 +752,8 @@ static void gst_mf_dump_attribute_value_by_index (IMFAttributes * attr, const gchar * msg, - guint index, GstDebugLevel level, GstDebugCategory * cat, const gchar * file, - const gchar * function, gint line) + guint index, GstDebugLevel level, GstDebugCategory * cat, + const gchar * file, const gchar * function, gint line) { gchar *guid_name = NULL; gchar *value = NULL; @@ -759,9 +761,9 @@ HRESULT hr; PROPVARIANT var; - PropVariantInit(&var); + PropVariantInit (&var); - hr = attr->GetItemByIndex(index, &guid, &var); + hr = attr->GetItemByIndex (index, &guid, &var); if (!gst_mf_result (hr)) goto done; @@ -774,11 +776,10 @@ goto done; gst_debug_log (cat, level, file, function, line, - NULL, "%s attribute %d, %s: %s", msg ? msg : "", index, guid_name, - value); + NULL, "%s attribute %d, %s: %s", msg ? msg : "", index, guid_name, value); done: - PropVariantClear(&var); + PropVariantClear (&var); g_free (guid_name); g_free (value); } @@ -804,4 +805,4 @@ msg, i, level, cat, file, function, line); } #endif -} \ No newline at end of file +}
gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvideobuffer.cpp
Added
@@ -0,0 +1,554 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstmfvideobuffer.h" +#include <string.h> + +/* *INDENT-OFF* */ +G_BEGIN_DECLS + +GST_DEBUG_CATEGORY_EXTERN (gst_mf_video_buffer_debug); +#define GST_CAT_DEFAULT gst_mf_video_buffer_debug + +G_END_DECLS + +IGstMFVideoBuffer::IGstMFVideoBuffer () + : ref_count_ (1) + , current_len_ (0) + , contiguous_len_ (0) + , data_ (nullptr) + , contiguous_data_ (nullptr) + , info_ (nullptr) + , contiguous_info_ (nullptr) + , locked_ (false) + , wrapped_ (false) + , user_data_ (nullptr) + , notify_ (nullptr) +{ +} + +IGstMFVideoBuffer::~IGstMFVideoBuffer () +{ + if (info_) + gst_video_info_free (info_); + if (contiguous_info_) + gst_video_info_free (contiguous_info_); + + g_free (contiguous_data_); + + if (!wrapped_) + g_free (data_); +} + +HRESULT +IGstMFVideoBuffer::CreateInstance (GstVideoInfo * info, + IMFMediaBuffer ** buffer) +{ + HRESULT hr = S_OK; + IGstMFVideoBuffer * self; + + if (!info || !buffer) + return E_INVALIDARG; + + self = new IGstMFVideoBuffer (); + + if (!self) + return E_OUTOFMEMORY; + + hr = self->Initialize (info); + + if (SUCCEEDED (hr)) + hr = self->QueryInterface (IID_PPV_ARGS (buffer)); + + 
self->Release (); + + return hr; +} + +HRESULT +IGstMFVideoBuffer::CreateInstanceWrapped (GstVideoInfo * info, + BYTE * data, DWORD length, IMFMediaBuffer ** buffer) +{ + HRESULT hr = S_OK; + IGstMFVideoBuffer * self; + + if (!info || !data || length == 0 || !buffer) + return E_INVALIDARG; + + self = new IGstMFVideoBuffer (); + + if (!self) + return E_OUTOFMEMORY; + + hr = self->InitializeWrapped (info, data, length); + + if (SUCCEEDED (hr)) + hr = self->QueryInterface (IID_PPV_ARGS (buffer)); + + self->Release (); + + return hr; +} + +HRESULT +IGstMFVideoBuffer::Initialize (GstVideoInfo * info) +{ + if (!info) + return E_INVALIDARG; + + info_ = gst_video_info_copy (info); + contiguous_info_ = gst_video_info_new (); + + /* check if padding is required */ + gst_video_info_set_format (contiguous_info_, GST_VIDEO_INFO_FORMAT (info_), + GST_VIDEO_INFO_WIDTH (info_), GST_VIDEO_INFO_HEIGHT (info_)); + + contiguous_ = GST_VIDEO_INFO_SIZE (info_) == + GST_VIDEO_INFO_SIZE (contiguous_info_); + + contiguous_len_ = GST_VIDEO_INFO_SIZE (contiguous_info_); + /* NOTE: {Set,Get}CurrentLength will not be applied for + * IMF2DBuffer interface */ + + current_len_ = contiguous_len_; + + data_ = (BYTE *) g_malloc0 (GST_VIDEO_INFO_SIZE (info_)); + + if (!data_) + return E_OUTOFMEMORY; + + return S_OK; +} + +HRESULT +IGstMFVideoBuffer::InitializeWrapped (GstVideoInfo * info, BYTE * data, + DWORD length) +{ + if (!info || !data || length == 0) + return E_INVALIDARG; + + if (length < GST_VIDEO_INFO_SIZE (info)) + return E_INVALIDARG; + + info_ = gst_video_info_copy (info); + contiguous_info_ = gst_video_info_new (); + + /* check if padding is required */ + gst_video_info_set_format (contiguous_info_, GST_VIDEO_INFO_FORMAT (info_), + GST_VIDEO_INFO_WIDTH (info_), GST_VIDEO_INFO_HEIGHT (info_)); + + contiguous_ = GST_VIDEO_INFO_SIZE (info_) == + GST_VIDEO_INFO_SIZE (contiguous_info_); + + contiguous_len_ = GST_VIDEO_INFO_SIZE (contiguous_info_); + + current_len_ = contiguous_len_; + + data_ 
= data; + wrapped_ = true; + + return S_OK; +} + +HRESULT +IGstMFVideoBuffer::SetUserData (gpointer user_data, GDestroyNotify notify) +{ + GDestroyNotify old_notify = notify_; + gpointer old_user_data = user_data_; + + if (old_notify) + old_notify (old_user_data); + + user_data_ = user_data; + notify_ = notify; + + return S_OK; +} + +HRESULT +IGstMFVideoBuffer::GetUserData (gpointer * user_data) +{ + if (!user_data) + return E_INVALIDARG; + + *user_data = user_data_; + + return S_OK; +} + +/* IUnknown interface */ +STDMETHODIMP_ (ULONG) +IGstMFVideoBuffer::AddRef (void) +{ + GST_TRACE ("%p, %d", this, ref_count_); + return InterlockedIncrement (&ref_count_); +} + +STDMETHODIMP_ (ULONG) +IGstMFVideoBuffer::Release (void) +{ + ULONG ref_count; + + GST_TRACE ("%p, %d", this, ref_count_); + ref_count = InterlockedDecrement (&ref_count_); + + if (ref_count == 0) { + GDestroyNotify old_notify = notify_; + gpointer old_user_data = user_data_; + + GST_TRACE ("Delete instance %p", this); + delete this; + + if (old_notify) + old_notify (old_user_data); + } + + return ref_count; +} + +STDMETHODIMP +IGstMFVideoBuffer::QueryInterface (REFIID riid, void ** object) +{ + if (!object) + return E_POINTER; + + if (riid == IID_IUnknown) { + GST_TRACE ("query IUnknown interface %p", this); + *object = static_cast<IUnknown *> (static_cast<IMFMediaBuffer *> (this)); + } else if (riid == __uuidof(IMFMediaBuffer)) { + GST_TRACE ("query IMFMediaBuffer interface %p", this); + *object = static_cast<IMFMediaBuffer *> (this); + } else if (riid == __uuidof(IMF2DBuffer)) { + GST_TRACE ("query IMF2DBuffer interface %p", this); + *object = static_cast<IMF2DBuffer *> (this); + } else if (riid == __uuidof(IGstMFVideoBuffer)) { + GST_TRACE ("query IGstMFVideoBuffer interface %p", this); + *object = this; + } else { + *object = NULL; + return E_NOINTERFACE; + } + + AddRef(); + + return S_OK; +} + +/* IMFMediaBuffer interface */ +STDMETHODIMP +IGstMFVideoBuffer::Lock (BYTE ** buffer, DWORD * max_length, 
+ DWORD * current_length) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + if (locked_) { + GST_LOG ("%p, Already locked", this); + return S_OK; + } + + locked_ = true; + + if (contiguous_) { + *buffer = data_; + goto done; + } + + /* IMFMediaBuffer::Lock method should return contiguous memory */ + if (!contiguous_data_) + contiguous_data_ = (BYTE *) g_malloc0 (contiguous_len_); + + ContiguousCopyToUnlocked (contiguous_data_, contiguous_len_); + *buffer = contiguous_data_; + +done: + if (max_length) + *max_length = contiguous_len_; + if (current_length) + *current_length = current_len_; + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::Unlock (void) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + if (!locked_) { + GST_LOG ("%p, No previous Lock call", this); + return S_OK; + } + + locked_ = false; + + if (contiguous_) { + GST_TRACE ("%p, Have configured contiguous data", this); + return S_OK; + } + + /* copy back to original data */ + ContiguousCopyFromUnlocked (contiguous_data_, contiguous_len_); + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::GetCurrentLength (DWORD * length) +{ + std::lock_guard<std::mutex> lock(lock_); + + *length = current_len_; + + GST_TRACE ("%p, %d", this, current_len_); + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::SetCurrentLength (DWORD length) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p %d", this, length); + + if (length > contiguous_len_) { + GST_LOG ("%p, Requested length %d is larger than contiguous_len %d", + this, length, contiguous_len_); + return E_INVALIDARG; + } + + current_len_ = length; + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::GetMaxLength (DWORD * length) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + *length = contiguous_len_; + + return S_OK; +} + +/* IMF2DBuffer */ +STDMETHODIMP +IGstMFVideoBuffer::Lock2D (BYTE ** buffer, LONG * pitch) +{ + std::lock_guard<std::mutex> 
lock(lock_); + + GST_TRACE ("%p", this); + + if (locked_) { + GST_LOG ("%p, Already locked", this); + return MF_E_INVALIDREQUEST; + } + + locked_ = true; + + *buffer = data_; + *pitch = GST_VIDEO_INFO_PLANE_STRIDE (info_, 0); + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::Unlock2D (void) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + if (!locked_) { + GST_LOG ("%p, No previous Lock2D call", this); + return S_OK; + } + + locked_ = false; + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::GetScanline0AndPitch (BYTE ** buffer, LONG * pitch) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + /* Lock2D must be called before */ + if (!locked_) { + GST_LOG ("%p, Invalid call, Lock2D must be called before", this); + return ERROR_INVALID_FUNCTION; + } + + *buffer = data_; + *pitch = GST_VIDEO_INFO_PLANE_STRIDE (info_, 0); + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::IsContiguousFormat (BOOL * contiguous) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + *contiguous = contiguous_; + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::GetContiguousLength (DWORD * length) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + *length = contiguous_len_; + + return S_OK; +} + +STDMETHODIMP +IGstMFVideoBuffer::ContiguousCopyTo (BYTE * dest_buffer, + DWORD dest_buffer_length) +{ + std::lock_guard<std::mutex> lock(lock_); + + return ContiguousCopyToUnlocked (dest_buffer, dest_buffer_length); +} + +STDMETHODIMP +IGstMFVideoBuffer::ContiguousCopyFrom (const BYTE * src_buffer, + DWORD src_buffer_length) +{ + std::lock_guard<std::mutex> lock(lock_); + + GST_TRACE ("%p", this); + + return ContiguousCopyFromUnlocked (src_buffer, src_buffer_length); +} + +HRESULT +IGstMFVideoBuffer::ContiguousCopyToUnlocked (BYTE * dest_buffer, + DWORD dest_buffer_length) +{ + GST_TRACE ("%p", this); + + if (!dest_buffer || dest_buffer_length < contiguous_len_) + return 
E_INVALIDARG; + + if (contiguous_) { + memcpy (dest_buffer, data_, current_len_); + return S_OK; + } + + for (gint i = 0; i < GST_VIDEO_INFO_N_PLANES (info_); i++) { + BYTE *src, *dst; + guint src_stride, dst_stride; + guint width, height; + + src = data_ + GST_VIDEO_INFO_PLANE_OFFSET (info_, i); + dst = dest_buffer + GST_VIDEO_INFO_PLANE_OFFSET (contiguous_info_, i); + + src_stride = GST_VIDEO_INFO_PLANE_STRIDE (info_, i); + dst_stride = GST_VIDEO_INFO_PLANE_STRIDE (contiguous_info_, i); + + width = GST_VIDEO_INFO_COMP_WIDTH (info_, i) + * GST_VIDEO_INFO_COMP_PSTRIDE (info_, i); + height = GST_VIDEO_INFO_COMP_HEIGHT (info_, i); + + for (gint j = 0; j < height; j++) { + memcpy (dst, src, width); + src += src_stride; + dst += dst_stride; + } + } + + return S_OK; +} + +HRESULT +IGstMFVideoBuffer::ContiguousCopyFromUnlocked (const BYTE * src_buffer, + DWORD src_buffer_length) +{ + gint offset; + + GST_TRACE ("%p", this); + + if (!src_buffer) + return E_INVALIDARG; + + /* Nothing to copy */ + if (src_buffer_length == 0) + return S_OK; + + if (contiguous_) { + memcpy (data_, src_buffer, src_buffer_length); + return S_OK; + } + + for (gint i = 0; i < GST_VIDEO_INFO_N_PLANES (info_); i++) { + BYTE *dst; + guint src_stride, dst_stride; + guint width, height; + + offset = GST_VIDEO_INFO_PLANE_OFFSET (contiguous_info_, i); + + dst = data_ + GST_VIDEO_INFO_PLANE_OFFSET (info_, i); + + src_stride = GST_VIDEO_INFO_PLANE_STRIDE (contiguous_info_, i); + dst_stride = GST_VIDEO_INFO_PLANE_STRIDE (info_, i); + + width = GST_VIDEO_INFO_COMP_WIDTH (info_, i) + * GST_VIDEO_INFO_COMP_PSTRIDE (info_, i); + height = GST_VIDEO_INFO_COMP_HEIGHT (info_, i); + + for (gint j = 0; j < height; j++) { + gint to_copy = 0; + + if (offset + width < src_buffer_length) + to_copy = width; + else + to_copy = (gint) src_buffer_length - offset; + + if (to_copy <= 0) + return S_OK; + + memcpy (dst, src_buffer + offset, to_copy); + + offset += src_stride; + dst += dst_stride; + } + } + + return S_OK; +} + 
+/* *INDENT-ON* */
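The heavy lifting in `ContiguousCopyToUnlocked ()` above is a per-plane, row-by-row copy that strips stride padding so that `IMFMediaBuffer::Lock ()` can hand out tightly packed memory. The same idea in a self-contained, platform-neutral sketch (the `PlaneLayout` type and `contiguous_copy ()` name are illustrative stand-ins for `GstVideoInfo` accessors, not from the source):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

/* Hypothetical, simplified stand-in for one plane of a GstVideoInfo. */
struct PlaneLayout {
  size_t offset;    /* byte offset of the plane within the buffer */
  size_t stride;    /* bytes per row, possibly including padding */
  size_t row_bytes; /* meaningful bytes per row (width * pixel stride) */
  size_t rows;      /* number of rows in this plane */
};

/* Copy planes from a possibly padded layout into a tightly packed one,
 * mirroring what ContiguousCopyToUnlocked () does per plane. */
void
contiguous_copy (const uint8_t * src, const std::vector<PlaneLayout> & src_layout,
    uint8_t * dst, const std::vector<PlaneLayout> & dst_layout)
{
  for (size_t i = 0; i < src_layout.size (); i++) {
    const uint8_t *s = src + src_layout[i].offset;
    uint8_t *d = dst + dst_layout[i].offset;

    for (size_t j = 0; j < src_layout[i].rows; j++) {
      /* copy only the meaningful bytes of each row */
      std::memcpy (d, s, src_layout[i].row_bytes);
      s += src_layout[i].stride;  /* skip padding in the source */
      d += dst_layout[i].stride;  /* tightly packed destination */
    }
  }
}
```

With a source stride of 8 bytes but only 4 meaningful bytes per row, the destination ends up contiguous, which is what callers of the `IMFMediaBuffer` interface expect when the stored layout has padding.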
View file
gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvideobuffer.h
Added
@@ -0,0 +1,123 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_MF_VIDEO_BUFFER_H__ +#define __GST_MF_VIDEO_BUFFER_H__ + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <windows.h> +#include <mfobjects.h> +#include <mferror.h> +#include <mutex> + +#ifndef __cplusplus +#error IGstMFVideoBuffer interface doesn't provide C API +#endif + +/* Define UUID for QueryInterface() */ +class DECLSPEC_UUID("ce922806-a8a6-4e1e-871f-e0cdd5fc9899") +IGstMFVideoBuffer : public IMFMediaBuffer, public IMF2DBuffer +{ +public: + static HRESULT CreateInstance (GstVideoInfo * info, + IMFMediaBuffer ** buffer); + + static HRESULT CreateInstanceWrapped (GstVideoInfo * info, + BYTE * data, + DWORD length, + IMFMediaBuffer ** buffer); + + /* notify will be called right after this object is destroyed */ + HRESULT SetUserData (gpointer user_data, + GDestroyNotify notify); + + HRESULT GetUserData (gpointer * user_data); + + /* IUnknown interface */ + STDMETHODIMP_ (ULONG) AddRef (void); + STDMETHODIMP_ (ULONG) Release (void); + STDMETHODIMP QueryInterface (REFIID riid, + void ** object); + + /* IMFMediaBuffer interface + * + * Caller of this interface expects returned raw memory layout 
via Lock() + * to have no padding with default stride. If the stored memory layout has a + * non-default stride and/or some padding, then Lock() / Unlock() would + * therefore cause a memory copy. + * Callers should avoid using this interface whenever the + * IMF2DBuffer interface is available. + */ + STDMETHODIMP Lock (BYTE ** buffer, + DWORD * max_length, + DWORD * current_length); + STDMETHODIMP Unlock (void); + STDMETHODIMP GetCurrentLength (DWORD * length); + STDMETHODIMP SetCurrentLength (DWORD length); + STDMETHODIMP GetMaxLength (DWORD * length); + + /* IMF2DBuffer interface + * + * This interface supports any raw memory layout with a non-default stride, + * but more complex layouts (padding at the bottom, for instance) are not supported. + */ + STDMETHODIMP Lock2D (BYTE ** buffer, + LONG * pitch); + STDMETHODIMP Unlock2D (void); + STDMETHODIMP GetScanline0AndPitch (BYTE ** buffer, + LONG * pitch); + STDMETHODIMP IsContiguousFormat (BOOL * contiguous); + STDMETHODIMP GetContiguousLength (DWORD * length); + STDMETHODIMP ContiguousCopyTo (BYTE * dest_buffer, + DWORD dest_buffer_length); + STDMETHODIMP ContiguousCopyFrom (const BYTE * src_buffer, + DWORD src_buffer_length); + +private: + IGstMFVideoBuffer (void); + ~IGstMFVideoBuffer (void); + + HRESULT Initialize (GstVideoInfo * info); + HRESULT InitializeWrapped (GstVideoInfo * info, + BYTE * data, + DWORD length); + HRESULT ContiguousCopyToUnlocked (BYTE * dest_buffer, + DWORD dest_buffer_length); + HRESULT ContiguousCopyFromUnlocked (const BYTE * src_buffer, + DWORD src_buffer_length); + +private: + ULONG ref_count_; + DWORD current_len_; + DWORD contiguous_len_; + BYTE *data_; + BYTE *contiguous_data_; + GstVideoInfo *info_; + GstVideoInfo *contiguous_info_; + BOOL contiguous_; + std::mutex lock_; + bool locked_; + bool wrapped_; + + gpointer user_data_; + GDestroyNotify notify_; +}; + +#endif /* __GST_MF_VIDEO_BUFFER_H__ */
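One subtle point in the implementation above: `IGstMFVideoBuffer::Release ()` saves the destroy notify and user data into locals before `delete this`, and only invokes the callback afterwards, so the notify can safely free the wrapped GStreamer memory without touching a dead object. A minimal COM-free sketch of that ordering (class and member names are illustrative, not from the source):

```cpp
#include <atomic>
#include <functional>
#include <utility>

/* Illustrative refcounted wrapper; the notify runs strictly after
 * destruction, mirroring IGstMFVideoBuffer::Release (). */
class RefCounted {
public:
  explicit RefCounted (std::function<void ()> notify)
      : ref_count_ (1), notify_ (std::move (notify)) {}

  unsigned AddRef () { return ++ref_count_; }

  unsigned Release ()
  {
    unsigned count = --ref_count_;

    if (count == 0) {
      /* save the callback before the object is destroyed */
      auto notify = std::move (notify_);
      delete this;        /* object is gone from here on */
      if (notify)
        notify ();        /* safe: no member access anymore */
    }

    return count;
  }

private:
  ~RefCounted () = default;  /* heap-only, like a COM object */

  std::atomic<unsigned> ref_count_;
  std::function<void ()> notify_;
};
```

The same ordering matters in the real `Release ()`: if the notify ran before `delete this`, freeing the wrapped data there would leave the buffer's `data_` pointer dangling for the destructor.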
View file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfvideoenc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvideoenc.cpp
Changed
@@ -24,40 +24,76 @@ #include <gst/gst.h> #include "gstmfvideoenc.h" +#include "gstmfvideobuffer.h" +#include "gstmfplatloader.h" #include <wrl.h> +#include <string.h> +#include <cmath> +#if GST_MF_HAVE_D3D11 +#include <d3d10.h> +#endif + +/* *INDENT-OFF* */ using namespace Microsoft::WRL; -GST_DEBUG_CATEGORY (gst_mf_video_enc_debug); +G_BEGIN_DECLS + +GST_DEBUG_CATEGORY_EXTERN (gst_mf_video_enc_debug); #define GST_CAT_DEFAULT gst_mf_video_enc_debug +G_END_DECLS +/* *INDENT-ON* */ + #define gst_mf_video_enc_parent_class parent_class -G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstMFVideoEnc, gst_mf_video_enc, - GST_TYPE_VIDEO_ENCODER, - GST_DEBUG_CATEGORY_INIT (gst_mf_video_enc_debug, "mfvideoenc", 0, - "mfvideoenc")); +G_DEFINE_ABSTRACT_TYPE (GstMFVideoEnc, gst_mf_video_enc, + GST_TYPE_VIDEO_ENCODER); +static void gst_mf_video_enc_dispose (GObject * object); +static void gst_mf_video_enc_set_context (GstElement * element, + GstContext * context); static gboolean gst_mf_video_enc_open (GstVideoEncoder * enc); static gboolean gst_mf_video_enc_close (GstVideoEncoder * enc); +static gboolean gst_mf_video_enc_start (GstVideoEncoder * enc); static gboolean gst_mf_video_enc_set_format (GstVideoEncoder * enc, GstVideoCodecState * state); static GstFlowReturn gst_mf_video_enc_handle_frame (GstVideoEncoder * enc, GstVideoCodecFrame * frame); static GstFlowReturn gst_mf_video_enc_finish (GstVideoEncoder * enc); static gboolean gst_mf_video_enc_flush (GstVideoEncoder * enc); +static gboolean gst_mf_video_enc_propose_allocation (GstVideoEncoder * enc, + GstQuery * query); +static gboolean gst_mf_video_enc_sink_query (GstVideoEncoder * enc, + GstQuery * query); +static gboolean gst_mf_video_enc_src_query (GstVideoEncoder * enc, + GstQuery * query); + +static HRESULT gst_mf_video_on_new_sample (GstMFTransform * object, + IMFSample * sample, GstMFVideoEnc * self); static void gst_mf_video_enc_class_init (GstMFVideoEncClass * klass) { + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + 
GstElementClass *element_class = GST_ELEMENT_CLASS (klass); GstVideoEncoderClass *videoenc_class = GST_VIDEO_ENCODER_CLASS (klass); + gobject_class->dispose = gst_mf_video_enc_dispose; + + element_class->set_context = GST_DEBUG_FUNCPTR (gst_mf_video_enc_set_context); + videoenc_class->open = GST_DEBUG_FUNCPTR (gst_mf_video_enc_open); videoenc_class->close = GST_DEBUG_FUNCPTR (gst_mf_video_enc_close); + videoenc_class->start = GST_DEBUG_FUNCPTR (gst_mf_video_enc_start); videoenc_class->set_format = GST_DEBUG_FUNCPTR (gst_mf_video_enc_set_format); videoenc_class->handle_frame = GST_DEBUG_FUNCPTR (gst_mf_video_enc_handle_frame); videoenc_class->finish = GST_DEBUG_FUNCPTR (gst_mf_video_enc_finish); videoenc_class->flush = GST_DEBUG_FUNCPTR (gst_mf_video_enc_flush); + videoenc_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_mf_video_enc_propose_allocation); + videoenc_class->sink_query = GST_DEBUG_FUNCPTR (gst_mf_video_enc_sink_query); + videoenc_class->src_query = GST_DEBUG_FUNCPTR (gst_mf_video_enc_src_query); gst_type_mark_as_plugin_api (GST_TYPE_MF_VIDEO_ENC, (GstPluginAPIFlags) 0); } @@ -67,15 +103,103 @@ { } +static void +gst_mf_video_enc_dispose (GObject * object) +{ +#if GST_MF_HAVE_D3D11 + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (object); + + gst_clear_object (&self->d3d11_device); + gst_clear_object (&self->other_d3d11_device); +#endif + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_mf_video_enc_set_context (GstElement * element, GstContext * context) +{ +#if GST_MF_HAVE_D3D11 + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (element); + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (self); + GstMFVideoEncDeviceCaps *device_caps = &klass->device_caps; + + if (device_caps->d3d11_aware) { + gst_d3d11_handle_set_context_for_adapter_luid (element, context, + device_caps->adapter_luid, &self->other_d3d11_device); + } +#endif + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + static gboolean 
gst_mf_video_enc_open (GstVideoEncoder * enc) { GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (enc); + GstMFVideoEncDeviceCaps *device_caps = &klass->device_caps; GstMFTransformEnumParams enum_params = { 0, }; MFT_REGISTER_TYPE_INFO output_type; gboolean ret; +#if GST_MF_HAVE_D3D11 + if (device_caps->d3d11_aware) { + HRESULT hr; + ID3D11Device *device_handle; + ComPtr < ID3D10Multithread > multi_thread; + GstD3D11Device *device; + + if (!gst_d3d11_ensure_element_data_for_adapter_luid (GST_ELEMENT (self), + device_caps->adapter_luid, &self->other_d3d11_device)) { + GST_ERROR_OBJECT (self, "Other d3d11 device is unavailable"); + return FALSE; + } + + /* Create our own device with D3D11_CREATE_DEVICE_VIDEO_SUPPORT */ + self->d3d11_device = + gst_d3d11_device_new_for_adapter_luid (device_caps->adapter_luid, + D3D11_CREATE_DEVICE_VIDEO_SUPPORT); + if (!self->d3d11_device) { + GST_ERROR_OBJECT (self, "Couldn't create internal d3d11 device"); + gst_clear_object (&self->other_d3d11_device); + return FALSE; + } + + device = self->d3d11_device; + + hr = GstMFCreateDXGIDeviceManager (&self->reset_token, &self->device_manager); + if (!gst_mf_result (hr)) { + GST_ERROR_OBJECT (self, "Couldn't create DXGI device manager"); + gst_clear_object (&self->other_d3d11_device); + gst_clear_object (&self->d3d11_device); + return FALSE; + } + + device_handle = gst_d3d11_device_get_device_handle (device); + /* Enable multi thread protection as this device will be shared with + * MFT */ + hr = device_handle->QueryInterface (IID_PPV_ARGS (&multi_thread)); + if (!gst_d3d11_result (hr, device)) { + GST_WARNING_OBJECT (self, + "device doesn't support ID3D10Multithread interface"); + gst_clear_object (&self->other_d3d11_device); + gst_clear_object (&self->d3d11_device); + } + + multi_thread->SetMultithreadProtected (TRUE); + + hr = self->device_manager->ResetDevice ((IUnknown *) device_handle, + self->reset_token); + if (!gst_mf_result (hr)) 
{ + GST_ERROR_OBJECT (self, "Couldn't reset device with given d3d11 device"); + gst_clear_object (&self->other_d3d11_device); + gst_clear_object (&self->d3d11_device); + return FALSE; + } + } +#endif + output_type.guidMajorType = MFMediaType_Video; output_type.guidSubtype = klass->codec_id; @@ -84,14 +208,34 @@ enum_params.output_typeinfo = &output_type; enum_params.device_index = klass->device_index; - GST_DEBUG_OBJECT (self, "Create MFT with enum flags 0x%x, device index %d", - klass->enum_flags, klass->device_index); + if (device_caps->d3d11_aware) + enum_params.adapter_luid = device_caps->adapter_luid; + + GST_DEBUG_OBJECT (self, + "Create MFT with enum flags: 0x%x, device index: %d, d3d11 aware: %d, " + "adapter-luid %" G_GINT64_FORMAT, klass->enum_flags, klass->device_index, + device_caps->d3d11_aware, device_caps->adapter_luid); self->transform = gst_mf_transform_new (&enum_params); ret = !!self->transform; - if (!ret) + if (!ret) { GST_ERROR_OBJECT (self, "Cannot create MFT object"); + return FALSE; + } + + /* In case of hardware MFT, it will be running on async mode. 
+ * And new output sample callback will be called from Media Foundation's + * internal worker queue thread */ + if (self->transform && + (enum_params.enum_flags & MFT_ENUM_FLAG_HARDWARE) == + MFT_ENUM_FLAG_HARDWARE) { + self->async_mft = TRUE; + gst_mf_transform_set_new_sample_callback (self->transform, + (GstMFTransformNewSampleCallback) gst_mf_video_on_new_sample, self); + } else { + self->async_mft = FALSE; + } return ret; } @@ -107,6 +251,31 @@ gst_video_codec_state_unref (self->input_state); self->input_state = NULL; } +#if GST_MF_HAVE_D3D11 + if (self->device_manager) { + self->device_manager->Release (); + self->device_manager = nullptr; + } + + if (self->mf_allocator) { + self->mf_allocator->UninitializeSampleAllocator (); + self->mf_allocator->Release (); + self->mf_allocator = NULL; + } + + gst_clear_object (&self->other_d3d11_device); + gst_clear_object (&self->d3d11_device); +#endif + + return TRUE; +} + +static gboolean +gst_mf_video_enc_start (GstVideoEncoder * enc) +{ + /* Media Foundation Transform will shift PTS in case that B-frame is enabled. 
+ * We need to adjust DTS correspondingly */ + gst_video_encoder_set_min_pts (enc, GST_SECOND * 60 * 60 * 1000); return TRUE; } @@ -117,8 +286,8 @@ GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (enc); GstVideoInfo *info = &state->info; - ComPtr<IMFMediaType> in_type; - ComPtr<IMFMediaType> out_type; + ComPtr < IMFMediaType > in_type; + ComPtr < IMFMediaType > out_type; GList *input_types = NULL; GList *iter; HRESULT hr; @@ -128,6 +297,10 @@ gst_mf_video_enc_finish (enc); + self->mf_pts_offset = 0; + self->has_reorder_frame = FALSE; + self->last_ret = GST_FLOW_OK; + if (self->input_state) gst_video_codec_state_unref (self->input_state); self->input_state = gst_video_codec_state_ref (state); @@ -136,6 +309,17 @@ GST_ERROR_OBJECT (self, "Failed to open MFT"); return FALSE; } +#if GST_MF_HAVE_D3D11 + if (self->device_manager) { + if (!gst_mf_transform_set_device_manager (self->transform, + self->device_manager)) { + GST_ERROR_OBJECT (self, "Couldn't set device manager"); + return FALSE; + } else { + GST_DEBUG_OBJECT (self, "set device manager done"); + } + } +#endif hr = MFCreateMediaType (out_type.GetAddressOf ()); if (!gst_mf_result (hr)) @@ -146,7 +330,7 @@ return FALSE; if (klass->set_option) { - if (!klass->set_option (self, out_type.Get ())) { + if (!klass->set_option (self, self->input_state, out_type.Get ())) { GST_ERROR_OBJECT (self, "subclass failed to set option"); return FALSE; } @@ -199,7 +383,7 @@ } if (!gst_mf_transform_get_input_available_types (self->transform, - &input_types)) { + &input_types)) { GST_ERROR_OBJECT (self, "Couldn't get available input types"); return FALSE; } @@ -275,79 +459,193 @@ GST_ERROR_OBJECT (self, "subclass couldn't set src caps"); return FALSE; } +#if GST_MF_HAVE_D3D11 + if (self->mf_allocator) { + self->mf_allocator->UninitializeSampleAllocator (); + self->mf_allocator->Release (); + self->mf_allocator = NULL; + } + + /* Check whether upstream is d3d11 element */ + if 
(state->caps) { + GstCapsFeatures *features; + ComPtr < IMFVideoSampleAllocatorEx > allocator; + + features = gst_caps_get_features (state->caps, 0); + + if (features && + gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + GST_DEBUG_OBJECT (self, "found D3D11 memory feature"); + + hr = GstMFCreateVideoSampleAllocatorEx (IID_PPV_ARGS (&allocator)); + if (!gst_mf_result (hr)) + GST_WARNING_OBJECT (self, + "IMFVideoSampleAllocatorEx interface is unavailable"); + } + + if (allocator) { + do { + ComPtr < IMFAttributes > attr; + + hr = MFCreateAttributes (&attr, 4); + if (!gst_mf_result (hr)) + break; + + /* Only one buffer per sample + * (multiple sample is usually for multi-view things) */ + hr = attr->SetUINT32 (GST_GUID_MF_SA_BUFFERS_PER_SAMPLE, 1); + if (!gst_mf_result (hr)) + break; + + hr = attr->SetUINT32 (GST_GUID_MF_SA_D3D11_USAGE, D3D11_USAGE_DEFAULT); + if (!gst_mf_result (hr)) + break; + + /* TODO: Check if we need to use keyed-mutex */ + hr = attr->SetUINT32 (GST_GUID_MF_SA_D3D11_SHARED_WITHOUT_MUTEX, TRUE); + if (!gst_mf_result (hr)) + break; + + hr = attr->SetUINT32 (GST_GUID_MF_SA_D3D11_BINDFLAGS, + D3D11_BIND_VIDEO_ENCODER); + if (!gst_mf_result (hr)) + break; + + hr = allocator->SetDirectXManager (self->device_manager); + if (!gst_mf_result (hr)) + break; + + hr = allocator->InitializeSampleAllocatorEx ( + /* min samples, since we are running on async mode, + * at least 2 samples would be required */ + 2, + /* max samples, why 16 + 2? 
it's just magic number + * (H264 max dpb size 16 + our min sample size 2) */ + 16 + 2, attr.Get (), in_type.Get () + ); + + if (!gst_mf_result (hr)) + break; + + GST_DEBUG_OBJECT (self, "IMFVideoSampleAllocatorEx is initialized"); + + self->mf_allocator = allocator.Detach (); + } while (0); + } + } +#endif return TRUE; } -typedef struct +static void +gst_mf_video_buffer_free (GstVideoFrame * frame) { - GstClockTime mf_pts; -} GstMFVideoEncFrameData; + if (!frame) + return; + + gst_video_frame_unmap (frame); + g_free (frame); +} static gboolean -gst_mf_video_enc_process_input (GstMFVideoEnc * self, - GstVideoCodecFrame * frame) +gst_mf_video_enc_frame_needs_copy (GstVideoFrame * vframe) { - GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (self); - HRESULT hr; - ComPtr<IMFSample> sample; - ComPtr<IMFMediaBuffer> media_buffer; - GstVideoInfo *info = &self->input_state->info; - gint i, j; - BYTE *data; - GstVideoFrame vframe; - gboolean res = FALSE; - gboolean unset_force_keyframe = FALSE; - GstMFVideoEncFrameData *frame_data = NULL; - - if (!gst_video_frame_map (&vframe, info, frame->input_buffer, GST_MAP_READ)) { - GST_ERROR_OBJECT (self, "Couldn't map input frame"); + /* Single plane data can be used without copy */ + if (GST_VIDEO_FRAME_N_PLANES (vframe) == 1) return FALSE; - } - hr = MFCreateSample (sample.GetAddressOf ()); - if (!gst_mf_result (hr)) - goto done; + switch (GST_VIDEO_FRAME_FORMAT (vframe)) { + case GST_VIDEO_FORMAT_I420: + { + guint8 *data, *other_data; + guint size; + + /* Unexpected stride size, Media Foundation doesn't provide API for + * per plane stride information */ + if (GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 0) != + 2 * GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 1) || + GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 1) != + GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 2)) { + return TRUE; + } - hr = MFCreateMemoryBuffer (GST_VIDEO_INFO_SIZE (info), - media_buffer.GetAddressOf ()); - if (!gst_mf_result (hr)) - goto done; + size = 
GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 0) * + GST_VIDEO_FRAME_HEIGHT (vframe); + if (size + GST_VIDEO_FRAME_PLANE_OFFSET (vframe, 0) != + GST_VIDEO_FRAME_PLANE_OFFSET (vframe, 1)) + return TRUE; - hr = media_buffer->Lock (&data, NULL, NULL); - if (!gst_mf_result (hr)) - goto done; + data = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (vframe, 0); + other_data = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (vframe, 1); + if (data + size != other_data) + return TRUE; - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { - guint8 *src, *dst; - gint src_stride, dst_stride; - gint width; + size = GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 1) * + GST_VIDEO_FRAME_COMP_HEIGHT (vframe, 1); + if (size + GST_VIDEO_FRAME_PLANE_OFFSET (vframe, 1) != + GST_VIDEO_FRAME_PLANE_OFFSET (vframe, 2)) + return TRUE; + + data = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (vframe, 1); + other_data = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (vframe, 2); + if (data + size != other_data) + return TRUE; + + return FALSE; + } + case GST_VIDEO_FORMAT_NV12: + case GST_VIDEO_FORMAT_P010_10LE: + case GST_VIDEO_FORMAT_P016_LE: + { + guint8 *data, *other_data; + guint size; + + /* Unexpected stride size, Media Foundation doesn't provide API for + * per plane stride information */ + if (GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 0) != + GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 1)) { + return TRUE; + } - src = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (&vframe, i); - dst = data + GST_VIDEO_INFO_PLANE_OFFSET (info, i); + size = GST_VIDEO_FRAME_PLANE_STRIDE (vframe, 0) * + GST_VIDEO_FRAME_HEIGHT (vframe); - src_stride = GST_VIDEO_FRAME_PLANE_STRIDE (&vframe, i); - dst_stride = GST_VIDEO_INFO_PLANE_STRIDE (info, i); + /* Unexpected padding */ + if (size + GST_VIDEO_FRAME_PLANE_OFFSET (vframe, 0) != + GST_VIDEO_FRAME_PLANE_OFFSET (vframe, 1)) + return TRUE; - width = GST_VIDEO_INFO_COMP_WIDTH (info, i) - * GST_VIDEO_INFO_COMP_PSTRIDE (info, i); + data = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (vframe, 0); + other_data = (guint8 *) 
GST_VIDEO_FRAME_PLANE_DATA (vframe, 1); + if (data + size != other_data) + return TRUE; - for (j = 0; j < GST_VIDEO_INFO_COMP_HEIGHT (info, i); j++) { - memcpy (dst, src, width); - src += src_stride; - dst += dst_stride; + return FALSE; } + default: + g_assert_not_reached (); + return TRUE; } - media_buffer->Unlock (); + return TRUE; +} - hr = media_buffer->SetCurrentLength (GST_VIDEO_INFO_SIZE (info)); - if (!gst_mf_result (hr)) - goto done; +typedef struct +{ + LONGLONG mf_pts; +} GstMFVideoEncFrameData; - hr = sample->AddBuffer (media_buffer.Get ()); - if (!gst_mf_result (hr)) - goto done; +static gboolean +gst_mf_video_enc_process_input (GstMFVideoEnc * self, + GstVideoCodecFrame * frame, IMFSample * sample) +{ + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (self); + HRESULT hr; + gboolean unset_force_keyframe = FALSE; + GstMFVideoEncFrameData *frame_data = NULL; + gboolean res; frame_data = g_new0 (GstMFVideoEncFrameData, 1); frame_data->mf_pts = frame->pts / 100; @@ -357,15 +655,16 @@ hr = sample->SetSampleTime (frame_data->mf_pts); if (!gst_mf_result (hr)) - goto done; + return FALSE; - hr = sample->SetSampleDuration ( - GST_CLOCK_TIME_IS_VALID (frame->duration) ? frame->duration / 100 : 0); + hr = sample-> + SetSampleDuration (GST_CLOCK_TIME_IS_VALID (frame->duration) ? frame-> + duration / 100 : 0); if (!gst_mf_result (hr)) - goto done; + return FALSE; if (GST_VIDEO_CODEC_FRAME_IS_FORCE_KEYFRAME (frame)) { - if (klass->can_force_keyframe) { + if (klass->device_caps.force_keyframe) { unset_force_keyframe = gst_mf_transform_set_codec_api_uint32 (self->transform, &CODECAPI_AVEncVideoForceKeyFrame, TRUE); @@ -374,66 +673,72 @@ } } - if (!gst_mf_transform_process_input (self->transform, sample.Get ())) { - GST_ERROR_OBJECT (self, "Failed to process input"); - goto done; - } + /* Unlock temporary so that we can output frame from Media Foundation's + * worker thread. 
+ * While we are processing input, MFT might notify + * METransformHaveOutput event from Media Foundation's internal worker queue + * thread. Then we will output encoded data from the thread synchroniously, + * not from streaming (this) thread */ + if (self->async_mft) + GST_VIDEO_ENCODER_STREAM_UNLOCK (self); + res = gst_mf_transform_process_input (self->transform, sample); + if (self->async_mft) + GST_VIDEO_ENCODER_STREAM_LOCK (self); if (unset_force_keyframe) { gst_mf_transform_set_codec_api_uint32 (self->transform, &CODECAPI_AVEncVideoForceKeyFrame, FALSE); } - res = TRUE; - -done: - gst_video_frame_unmap (&vframe); + if (!res) { + GST_ERROR_OBJECT (self, "Failed to process input"); + return FALSE; + } - return res; + return TRUE; } static GstVideoCodecFrame * -gst_mf_video_enc_find_output_frame (GstMFVideoEnc * self, UINT64 mf_dts, - UINT64 mf_pts) +gst_mf_video_enc_find_output_frame (GstMFVideoEnc * self, LONGLONG mf_pts) { GList *l, *walk = gst_video_encoder_get_frames (GST_VIDEO_ENCODER (self)); GstVideoCodecFrame *ret = NULL; + GstVideoCodecFrame *closest = NULL; + LONGLONG min_pts_abs_diff = 0; for (l = walk; l; l = l->next) { GstVideoCodecFrame *frame = (GstVideoCodecFrame *) l->data; GstMFVideoEncFrameData *data = (GstMFVideoEncFrameData *) gst_video_codec_frame_get_user_data (frame); + LONGLONG abs_diff; if (!data) continue; - if (mf_dts == data->mf_pts) { + if (mf_pts == data->mf_pts) { ret = frame; break; } - } - - /* find target with pts */ - if (!ret) { - for (l = walk; l; l = l->next) { - GstVideoCodecFrame *frame = (GstVideoCodecFrame *) l->data; - GstMFVideoEncFrameData *data = (GstMFVideoEncFrameData *) - gst_video_codec_frame_get_user_data (frame); - if (!data) - continue; + abs_diff = std::abs (mf_pts - data->mf_pts); - if (mf_pts == data->mf_pts) { - ret = frame; - break; - } + if (!closest || abs_diff < min_pts_abs_diff) { + closest = frame; + min_pts_abs_diff = abs_diff; } } + if (!ret && closest) + ret = closest; + if (ret) { 
gst_video_codec_frame_ref (ret); } else { - /* just return the oldest one */ + /* XXX: Shouldn't happen, but possible if no GstVideoCodecFrame holds + * user data for some reasons */ + GST_WARNING_OBJECT (self, + "Failed to find closest GstVideoCodecFrame with MF pts %" + G_GINT64_FORMAT, mf_pts); ret = gst_video_encoder_get_oldest_frame (GST_VIDEO_ENCODER (self)); } @@ -443,70 +748,145 @@ return ret; } -static GstFlowReturn -gst_mf_video_enc_process_output (GstMFVideoEnc * self) +static HRESULT +gst_mf_video_enc_finish_sample (GstMFVideoEnc * self, IMFSample * sample) { - HRESULT hr; + HRESULT hr = S_OK; BYTE *data; - ComPtr<IMFMediaBuffer> media_buffer; - ComPtr<IMFSample> sample; + ComPtr < IMFMediaBuffer > media_buffer; GstBuffer *buffer; GstFlowReturn res = GST_FLOW_ERROR; GstVideoCodecFrame *frame; LONGLONG sample_timestamp; LONGLONG sample_duration; + LONGLONG target_mf_pts; + UINT64 mf_dts; UINT32 keyframe = FALSE; - UINT64 mf_dts = GST_CLOCK_TIME_NONE; DWORD buffer_len; - - res = gst_mf_transform_get_output (self->transform, sample.GetAddressOf ()); - - if (res != GST_FLOW_OK) - return res; + GstClockTime pts, dts, duration; hr = sample->GetBufferByIndex (0, media_buffer.GetAddressOf ()); if (!gst_mf_result (hr)) - return GST_FLOW_ERROR; + goto done; hr = media_buffer->Lock (&data, NULL, &buffer_len); if (!gst_mf_result (hr)) - return GST_FLOW_ERROR; + goto done; buffer = gst_buffer_new_allocate (NULL, buffer_len, NULL); gst_buffer_fill (buffer, 0, data, buffer_len); media_buffer->Unlock (); sample->GetSampleTime (&sample_timestamp); + target_mf_pts = sample_timestamp; sample->GetSampleDuration (&sample_duration); sample->GetUINT32 (MFSampleExtension_CleanPoint, &keyframe); hr = sample->GetUINT64 (MFSampleExtension_DecodeTimestamp, &mf_dts); - if (FAILED (hr)) + if (FAILED (hr)) { mf_dts = sample_timestamp; + hr = S_OK; + } + + pts = sample_timestamp * 100; + dts = mf_dts * 100; + duration = sample_duration * 100; + + GST_LOG_OBJECT (self, "Finish sample, 
MF pts %" GST_TIME_FORMAT " MF dts %" + GST_TIME_FORMAT ", MF duration %" GST_TIME_FORMAT, + GST_TIME_ARGS (pts), GST_TIME_ARGS (dts), GST_TIME_ARGS (duration)); + + /* NOTE: When B-frames are enabled, the MFT shows the following pattern + * (input timestamps start from 1000:00:00.000000000, at 30fps) + * + * Frame-1: MF pts 0:00.033333300 MF dts 0:00.000000000 + * Frame-2: MF pts 0:00.133333300 MF dts 0:00.033333300 + * Frame-3: MF pts 0:00.066666600 MF dts 0:00.066666600 + * Frame-4: MF pts 0:00.099999900 MF dts 0:00.100000000 + * + * - The MFT apparently doesn't support negative timestamps, so the PTS of + * each frame seems to be shifted + * - DTS is likely based on the timestamp we set on the input sample, + * but some frames have (especially the Frame-4 case) an unexpected PTS, + * and even PTS < DTS. That would be the result of the PTS shifting + * + * To handle this case, + * - Calculate the timestamp offset "Frame-1 PTS" - "Frame-1 DTS" (== duration), + * and compensate the PTS/DTS of each frame + * - An additional offset is needed for DTS, to compensate for the GST/MF + * timescale difference (MF uses a 100ns timescale). So the DTS offset + * should be "PTS offset + 100ns" + * - Find the corresponding GstVideoCodecFrame by using the compensated PTS. + * Note that the MFT doesn't support user-data for tracing input/output + * sample pairs.
So, timestamp based lookup is the only way to map MF sample + * and our GstVideoCodecFrame + */ + if (self->has_reorder_frame) { + /* This would be the first frame */ + if (self->mf_pts_offset == 0) { + LONGLONG mf_pts_offset = -1; + if (sample_timestamp > mf_dts) { + mf_pts_offset = sample_timestamp - mf_dts; + GST_DEBUG_OBJECT (self, "Calculates PTS offset using \"PTS - DTS\": %" + G_GINT64_FORMAT, mf_pts_offset); + } else if (sample_duration > 0) { + mf_pts_offset = sample_duration; + GST_DEBUG_OBJECT (self, "Calculates PTS offset using duration: %" + G_GINT64_FORMAT, mf_pts_offset); + } else { + GST_WARNING_OBJECT (self, "Cannot calculate PTS offset"); + } + + self->mf_pts_offset = mf_pts_offset; + } + + if (self->mf_pts_offset > 0) { + target_mf_pts -= self->mf_pts_offset; + + pts -= (self->mf_pts_offset * 100); + /* +1 to compensate timescale difference */ + dts -= ((self->mf_pts_offset + 1) * 100); + } + } - frame = gst_mf_video_enc_find_output_frame (self, - mf_dts, (UINT64) sample_timestamp); + frame = gst_mf_video_enc_find_output_frame (self, target_mf_pts); if (frame) { if (keyframe) { GST_DEBUG_OBJECT (self, "Keyframe pts %" GST_TIME_FORMAT, GST_TIME_ARGS (frame->pts)); GST_VIDEO_CODEC_FRAME_SET_SYNC_POINT (frame); - GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); - } else { - GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); } - frame->pts = sample_timestamp * 100; - frame->dts = mf_dts * 100; - frame->duration = sample_duration * 100; frame->output_buffer = buffer; + /* Update DTS only if B-frame was enabled, but use input frame pts as-is. 
+ * Otherwise we would lose up to 100ns of precision */ + if (self->has_reorder_frame) { + frame->dts = dts; + } else { + frame->dts = frame->pts; + } + + /* make sure PTS >= DTS */ + if (GST_CLOCK_TIME_IS_VALID (frame->pts) && + GST_CLOCK_TIME_IS_VALID (frame->dts) && frame->pts < frame->dts) { + GST_WARNING_OBJECT (self, "Calculated DTS %" GST_TIME_FORMAT + " is larger than PTS %" GST_TIME_FORMAT, GST_TIME_ARGS (frame->dts), + GST_TIME_ARGS (frame->pts)); + + /* XXX: just set clock-time-none? */ + frame->dts = frame->pts; + } + + GST_LOG_OBJECT (self, "Frame pts %" GST_TIME_FORMAT ", Frame DTS %" + GST_TIME_FORMAT, GST_TIME_ARGS (frame->pts), + GST_TIME_ARGS (frame->dts)); + res = gst_video_encoder_finish_frame (GST_VIDEO_ENCODER (self), frame); } else { - GST_BUFFER_DTS (buffer) = mf_dts * 100; - GST_BUFFER_PTS (buffer) = sample_timestamp * 100; - GST_BUFFER_DURATION (buffer) = sample_duration * 100; + GST_BUFFER_PTS (buffer) = pts; + GST_BUFFER_DTS (buffer) = dts; + GST_BUFFER_DURATION (buffer) = duration; if (keyframe) { GST_DEBUG_OBJECT (self, "Keyframe pts %" GST_TIME_FORMAT, @@ -516,68 +896,1112 @@ GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DELTA_UNIT); } + GST_LOG_OBJECT (self, "Buffer pts %" GST_TIME_FORMAT ", Buffer DTS %" + GST_TIME_FORMAT, GST_TIME_ARGS (pts), GST_TIME_ARGS (dts)); + res = gst_pad_push (GST_VIDEO_ENCODER_SRC_PAD (self), buffer); } - return res; +done: + self->last_ret = res; + + return hr; } static GstFlowReturn -gst_mf_video_enc_handle_frame (GstVideoEncoder * enc, - GstVideoCodecFrame * frame) +gst_mf_video_enc_process_output (GstMFVideoEnc * self) { - GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); - GstFlowReturn ret = GST_FLOW_OK; - - if (!gst_mf_video_enc_process_input (self, frame)) { - GST_ERROR_OBJECT (self, "Failed to process input"); - ret = GST_FLOW_ERROR; - goto done; - } + ComPtr < IMFSample > sample; + GstFlowReturn res = GST_FLOW_ERROR; - do { - ret = gst_mf_video_enc_process_output (self); - } while (ret == GST_FLOW_OK);
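The PTS/DTS compensation that the NOTE above describes can be sketched in isolation. This is a minimal illustration in plain C++ with all values in Media Foundation's 100-ns ticks; `mf_pts_offset` and `mf_compensate` are hypothetical helper names for this sketch, not symbols from the patch:

```cpp
#include <cassert>
#include <cstdint>

// All values are in Media Foundation's 100-ns ticks.
struct MfTimestamps { int64_t pts; int64_t dts; };

// From the first output frame, derive the shift the encoder applied to keep
// PTS non-negative: "PTS - DTS", falling back to the frame duration.
inline int64_t mf_pts_offset (int64_t first_pts, int64_t first_dts,
    int64_t duration)
{
  if (first_pts > first_dts)
    return first_pts - first_dts;
  if (duration > 0)
    return duration;
  return -1;                    /* cannot be determined */
}

// Undo the shift on every frame; DTS gets one extra tick so the 100 ns
// GStreamer/MF timescale rounding cannot push DTS above PTS.
inline MfTimestamps mf_compensate (int64_t pts, int64_t dts, int64_t offset)
{
  if (offset <= 0)
    return { pts, dts };        /* nothing to compensate */
  return { pts - offset, dts - (offset + 1) };
}
```

With the 30 fps pattern from the comment (frame duration 333333 ticks), Frame-1 yields an offset of 333333, and Frame-2's shifted PTS of 1333333 maps back to 1000000.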
+ res = gst_mf_transform_get_output (self->transform, &sample); - if (ret == GST_MF_TRANSFORM_FLOW_NEED_DATA) - ret = GST_FLOW_OK; + if (res != GST_FLOW_OK) + return res; -done: - gst_video_codec_frame_unref (frame); + gst_mf_video_enc_finish_sample (self, sample.Get ()); - return ret; + return self->last_ret; } -static GstFlowReturn -gst_mf_video_enc_finish (GstVideoEncoder * enc) +static gboolean +gst_mf_video_enc_create_input_sample (GstMFVideoEnc * self, + GstVideoCodecFrame * frame, IMFSample ** sample) { - GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); - GstFlowReturn ret = GST_FLOW_OK; + HRESULT hr; + ComPtr < IMFSample > new_sample; + ComPtr < IMFMediaBuffer > media_buffer; + ComPtr < IGstMFVideoBuffer > video_buffer; + GstVideoInfo *info = &self->input_state->info; + gint i, j; + GstVideoFrame *vframe = NULL; + BYTE *data = NULL; + gboolean need_copy; - if (!self->transform) - return GST_FLOW_OK; + vframe = g_new0 (GstVideoFrame, 1); - gst_mf_transform_drain (self->transform); + if (!gst_video_frame_map (vframe, info, frame->input_buffer, GST_MAP_READ)) { + GST_ERROR_OBJECT (self, "Couldn't map input frame"); + g_free (vframe); + return FALSE; + } - do { - ret = gst_mf_video_enc_process_output (self); - } while (ret == GST_FLOW_OK); + hr = MFCreateSample (&new_sample); + if (!gst_mf_result (hr)) + goto error; - if (ret == GST_MF_TRANSFORM_FLOW_NEED_DATA) - ret = GST_FLOW_OK; + /* Check if we can forward this memory to Media Foundation without copy */ + need_copy = gst_mf_video_enc_frame_needs_copy (vframe); + if (need_copy) { + GST_TRACE_OBJECT (self, "Copy input buffer into Media Foundation memory"); + hr = MFCreateMemoryBuffer (GST_VIDEO_INFO_SIZE (info), &media_buffer); + } else { + GST_TRACE_OBJECT (self, "Can use input buffer without copy"); + hr = IGstMFVideoBuffer::CreateInstanceWrapped (&vframe->info, + (BYTE *) GST_VIDEO_FRAME_PLANE_DATA (vframe, 0), + GST_VIDEO_INFO_SIZE (&vframe->info), &media_buffer); + } - return ret; + if (!gst_mf_result (hr)) 
+ goto error; + + if (!need_copy) { + hr = media_buffer.As (&video_buffer); + if (!gst_mf_result (hr)) + goto error; + } else { + hr = media_buffer->Lock (&data, NULL, NULL); + if (!gst_mf_result (hr)) + goto error; + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { + guint8 *src, *dst; + gint src_stride, dst_stride; + gint width; + + src = (guint8 *) GST_VIDEO_FRAME_PLANE_DATA (vframe, i); + dst = data + GST_VIDEO_INFO_PLANE_OFFSET (info, i); + + src_stride = GST_VIDEO_FRAME_PLANE_STRIDE (vframe, i); + dst_stride = GST_VIDEO_INFO_PLANE_STRIDE (info, i); + + width = GST_VIDEO_INFO_COMP_WIDTH (info, i) + * GST_VIDEO_INFO_COMP_PSTRIDE (info, i); + + for (j = 0; j < GST_VIDEO_INFO_COMP_HEIGHT (info, i); j++) { + memcpy (dst, src, width); + src += src_stride; + dst += dst_stride; + } + } + + media_buffer->Unlock (); + } + + hr = media_buffer->SetCurrentLength (GST_VIDEO_INFO_SIZE (info)); + if (!gst_mf_result (hr)) + goto error; + + hr = new_sample->AddBuffer (media_buffer.Get ()); + if (!gst_mf_result (hr)) + goto error; + + if (!need_copy) { + /* IGstMFVideoBuffer will hold GstVideoFrame (+ GstBuffer), then it will be + * cleared when it's no more referenced by Media Foundation internals */ + hr = video_buffer->SetUserData ((gpointer) vframe, + (GDestroyNotify) gst_mf_video_buffer_free); + if (!gst_mf_result (hr)) + goto error; + } else { + gst_video_frame_unmap (vframe); + g_free (vframe); + vframe = NULL; + } + + *sample = new_sample.Detach (); + + return TRUE; + +error: + if (vframe) { + gst_video_frame_unmap (vframe); + g_free (vframe); + } + + return FALSE; } +#if GST_MF_HAVE_D3D11 static gboolean -gst_mf_video_enc_flush (GstVideoEncoder * enc) +gst_mf_video_enc_create_input_sample_d3d11 (GstMFVideoEnc * self, + GstVideoCodecFrame * frame, IMFSample ** sample) { - GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + HRESULT hr; + ComPtr < IMFSample > new_sample; + ComPtr < IMFMediaBuffer > mf_buffer; + ComPtr < IMFDXGIBuffer > dxgi_buffer; + ComPtr < 
ID3D11Texture2D > mf_texture; + ComPtr < IDXGIResource > dxgi_resource; + ComPtr < ID3D11Texture2D > shared_texture; + ComPtr < ID3D11Query > query; + D3D11_QUERY_DESC query_desc; + BOOL sync_done = FALSE; + HANDLE shared_handle; + GstMemory *mem; + GstD3D11Memory *dmem; + ID3D11Texture2D *texture; + ID3D11Device *device_handle; + ID3D11DeviceContext *context_handle; + GstMapInfo info; + D3D11_BOX src_box = { 0, }; + D3D11_TEXTURE2D_DESC dst_desc, src_desc; + guint subidx; + + if (!self->mf_allocator) { + GST_WARNING_OBJECT (self, "IMFVideoSampleAllocatorEx was configured"); + return FALSE; + } - if (!self->transform) - return TRUE; + mem = gst_buffer_peek_memory (frame->input_buffer, 0); + if (!gst_is_d3d11_memory (mem)) { + GST_WARNING_OBJECT (self, "Non-d3d11 memory"); + return FALSE; + } - gst_mf_transform_flush (self->transform); + dmem = (GstD3D11Memory *) mem; + device_handle = gst_d3d11_device_get_device_handle (dmem->device); + context_handle = gst_d3d11_device_get_device_context_handle (dmem->device); - return TRUE; + /* 1) Allocate new encoding surface */ + hr = self->mf_allocator->AllocateSample (&new_sample); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, + "Couldn't allocate new sample via IMFVideoSampleAllocatorEx"); + return FALSE; + } + + hr = new_sample->GetBufferByIndex (0, &mf_buffer); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get IMFMediaBuffer from sample"); + return FALSE; + } + + hr = mf_buffer.As (&dxgi_buffer); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get IMFDXGIBuffer from IMFMediaBuffer"); + return FALSE; + } + + hr = dxgi_buffer->GetResource (IID_PPV_ARGS (&mf_texture)); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, + "Couldn't get ID3D11Texture2D from IMFDXGIBuffer"); + return FALSE; + } + + hr = mf_texture.As (&dxgi_resource); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, + "Couldn't get IDXGIResource from ID3D11Texture2D"); + return FALSE; + } + + hr = 
dxgi_resource->GetSharedHandle (&shared_handle); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get shared handle from IDXGIResource"); + return FALSE; + } + + /* Allocation succeeded. Now open shared texture to access it from + * other device */ + hr = device_handle->OpenSharedResource (shared_handle, + IID_PPV_ARGS (&shared_texture)); + if (!gst_mf_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't open shared resource"); + return FALSE; + } + + /* 2) Copy upstream texture to mf's texture */ + /* Map memory so that ensure pending upload from staging texture */ + if (!gst_memory_map (mem, &info, + (GstMapFlags) (GST_MAP_READ | GST_MAP_D3D11))) { + GST_ERROR_OBJECT (self, "Couldn't map d3d11 memory"); + return FALSE; + } + + texture = (ID3D11Texture2D *) info.data; + texture->GetDesc (&src_desc); + shared_texture->GetDesc (&dst_desc); + subidx = gst_d3d11_memory_get_subresource_index (dmem); + + /* src/dst texture size might be different if padding was used. + * select smaller size */ + src_box.left = 0; + src_box.top = 0; + src_box.front = 0; + src_box.back = 1; + src_box.right = MIN (src_desc.Width, dst_desc.Width); + src_box.bottom = MIN (src_desc.Height, dst_desc.Height); + + /* CopySubresourceRegion() might not be able to guarantee + * copying. 
To ensure it, we make use of a d3d11 event query */ + query_desc.Query = D3D11_QUERY_EVENT; + query_desc.MiscFlags = 0; + + hr = device_handle->CreateQuery (&query_desc, &query); + if (!gst_d3d11_result (hr, dmem->device)) { + GST_ERROR_OBJECT (self, "Couldn't create event query"); + return FALSE; + } + + gst_d3d11_device_lock (dmem->device); + context_handle->CopySubresourceRegion (shared_texture.Get (), 0, 0, 0, 0, + texture, subidx, &src_box); + context_handle->End (query.Get ()); + + /* Wait until all issued GPU commands are finished */ + do { + hr = context_handle->GetData (query.Get (), &sync_done, sizeof (BOOL), 0); + } while (!sync_done && (hr == S_OK || hr == S_FALSE)); + + if (!gst_d3d11_result (hr, dmem->device)) { + GST_ERROR_OBJECT (self, "Couldn't sync GPU operation"); + gst_d3d11_device_unlock (dmem->device); + gst_memory_unmap (mem, &info); + + return FALSE; + } + + gst_d3d11_device_unlock (dmem->device); + gst_memory_unmap (mem, &info); + + *sample = new_sample.Detach (); + + return TRUE; +} +#endif + +static GstFlowReturn +gst_mf_video_enc_handle_frame (GstVideoEncoder * enc, + GstVideoCodecFrame * frame) +{ + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + GstFlowReturn ret = GST_FLOW_OK; + ComPtr < IMFSample > sample; + + if (self->last_ret != GST_FLOW_OK) { + GST_DEBUG_OBJECT (self, "Last return was %s", + gst_flow_get_name (self->last_ret)); + ret = self->last_ret; + goto done; + } +#if GST_MF_HAVE_D3D11 + if (self->mf_allocator && + !gst_mf_video_enc_create_input_sample_d3d11 (self, frame, &sample)) { + GST_WARNING_OBJECT (self, "Failed to create IMFSample for D3D11"); + sample = nullptr; + } +#endif + + if (!sample && !gst_mf_video_enc_create_input_sample (self, frame, &sample)) { + GST_ERROR_OBJECT (self, "Failed to create IMFSample"); + ret = GST_FLOW_ERROR; + goto done; + } + + if (!gst_mf_video_enc_process_input (self, frame, sample.Get ())) { + GST_ERROR_OBJECT (self, "Failed to process input"); + ret = GST_FLOW_ERROR; + goto done; + } + + /* Don't call
process_output for async (hardware) MFTs. We will output + * encoded data from the gst_mf_video_on_new_sample() callback, which is + * called from Media Foundation's internal worker queue thread */ + if (!self->async_mft) { + do { + ret = gst_mf_video_enc_process_output (self); + } while (ret == GST_FLOW_OK); + } + + if (ret == GST_MF_TRANSFORM_FLOW_NEED_DATA) + ret = GST_FLOW_OK; + +done: + gst_video_codec_frame_unref (frame); + + return ret; +} + +static GstFlowReturn +gst_mf_video_enc_finish (GstVideoEncoder * enc) +{ + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + GstFlowReturn ret = GST_FLOW_OK; + + if (!self->transform) + return GST_FLOW_OK; + + /* Unlock temporarily so that we can output frames from Media Foundation's + * worker thread */ + if (self->async_mft) + GST_VIDEO_ENCODER_STREAM_UNLOCK (enc); + + gst_mf_transform_drain (self->transform); + + if (self->async_mft) + GST_VIDEO_ENCODER_STREAM_LOCK (enc); + + if (!self->async_mft) { + do { + ret = gst_mf_video_enc_process_output (self); + } while (ret == GST_FLOW_OK); + } + + if (ret == GST_MF_TRANSFORM_FLOW_NEED_DATA) + ret = GST_FLOW_OK; + + return ret; +} + +static gboolean +gst_mf_video_enc_flush (GstVideoEncoder * enc) +{ + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + + if (!self->transform) + goto out; + + /* Unlock while flushing; the new-sample callback might fire meanwhile */ + if (self->async_mft) + GST_VIDEO_ENCODER_STREAM_UNLOCK (enc); + + gst_mf_transform_flush (self->transform); + + if (self->async_mft) + GST_VIDEO_ENCODER_STREAM_LOCK (enc); + +out: + self->last_ret = GST_FLOW_OK; + + return TRUE; +} + +static gboolean +gst_mf_video_enc_propose_allocation (GstVideoEncoder * enc, GstQuery * query) +{ +#if GST_MF_HAVE_D3D11 + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + GstVideoInfo info; + GstBufferPool *pool = NULL; + GstCaps *caps; + guint size; + GstD3D11Device *device = self->other_d3d11_device; + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE;
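The per-plane copy in gst_mf_video_enc_create_input_sample() above (the need_copy path) is the standard stride-repacking loop: only the payload bytes of each row are copied, while source and destination pointers advance by their own strides. A self-contained sketch of that loop; `repack_plane` is a hypothetical wrapper added here only for testability:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Copy `height` rows of `width` payload bytes, honouring distinct strides,
// as the patch does for each video plane when the input GstVideoFrame layout
// differs from the layout Media Foundation expects.
inline void copy_plane (uint8_t * dst, int dst_stride,
    const uint8_t * src, int src_stride, int width, int height)
{
  for (int row = 0; row < height; row++) {
    std::memcpy (dst, src, width);
    src += src_stride;
    dst += dst_stride;
  }
}

// Convenience wrapper: repack a padded plane into a plane with a new stride.
inline std::vector<uint8_t> repack_plane (const std::vector<uint8_t> &src,
    int src_stride, int dst_stride, int width, int height)
{
  std::vector<uint8_t> dst (dst_stride * height, 0);
  copy_plane (dst.data (), dst_stride, src.data (), src_stride, width, height);
  return dst;
}
```

For example, a 4-byte-wide plane stored with an 8-byte stride repacks into a tightly packed 4-byte-stride plane, dropping the padding bytes.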
+ + if (!gst_video_info_from_caps (&info, caps)) + return FALSE; + + if (gst_query_get_n_allocation_pools (query) == 0) { + GstCapsFeatures *features; + GstStructure *config; + gboolean is_d3d11 = FALSE; + + features = gst_caps_get_features (caps, 0); + + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)) { + GST_DEBUG_OBJECT (self, "Allocation caps supports d3d11 memory"); + pool = gst_d3d11_buffer_pool_new (device); + is_d3d11 = TRUE; + } else { + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + /* d3d11 pool does not support video alignment */ + if (!is_d3d11) { + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + } else { + GstD3D11AllocationParams *d3d11_params; + guint misc_flags = 0; + gboolean is_hardware = FALSE; + gint i; + + g_object_get (device, "hardware", &is_hardware, NULL); + + /* In case of hardware, set D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX flag + * so that it can be shared with other d3d11 devices */ + if (is_hardware) + misc_flags = D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX; + + d3d11_params = + gst_buffer_pool_config_get_d3d11_allocation_params (config); + if (!d3d11_params) { + d3d11_params = gst_d3d11_allocation_params_new (device, &info, + (GstD3D11AllocationFlags) 0, 0); + } else { + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) + d3d11_params->desc[i].MiscFlags |= misc_flags; + } + + gst_buffer_pool_config_set_d3d11_allocation_params (config, d3d11_params); + gst_d3d11_allocation_params_free (d3d11_params); + } + + size = GST_VIDEO_INFO_SIZE (&info); + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + + if (!gst_buffer_pool_set_config (pool, config)) + goto config_failed; + + /* d3d11 buffer pool will update buffer size based on allocated texture, + * get size from config again */ + if (is_d3d11) { + config = 
gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_get_params (config, + nullptr, &size, nullptr, nullptr); + gst_structure_free (config); + } + + gst_query_add_allocation_pool (query, pool, size, 0, 0); + gst_object_unref (pool); + } + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + + return TRUE; + + /* ERRORS */ +config_failed: + { + GST_ERROR_OBJECT (self, "failed to set config"); + gst_object_unref (pool); + return FALSE; + } + +#else + return GST_VIDEO_ENCODER_CLASS (parent_class)->propose_allocation (enc, + query); +#endif +} + +static gboolean +gst_mf_video_enc_sink_query (GstVideoEncoder * enc, GstQuery * query) +{ +#if GST_MF_HAVE_D3D11 + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (self), + query, self->other_d3d11_device)) { + return TRUE; + } + break; + default: + break; + } +#endif + + return GST_VIDEO_ENCODER_CLASS (parent_class)->sink_query (enc, query); +} + +static gboolean +gst_mf_video_enc_src_query (GstVideoEncoder * enc, GstQuery * query) +{ +#if GST_MF_HAVE_D3D11 + GstMFVideoEnc *self = GST_MF_VIDEO_ENC (enc); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_d3d11_handle_context_query (GST_ELEMENT (self), + query, self->other_d3d11_device)) { + return TRUE; + } + break; + default: + break; + } +#endif + + return GST_VIDEO_ENCODER_CLASS (parent_class)->src_query (enc, query); +} + +static HRESULT +gst_mf_video_on_new_sample (GstMFTransform * object, + IMFSample * sample, GstMFVideoEnc * self) +{ + GST_LOG_OBJECT (self, "New Sample callback"); + + /* NOTE: this callback will be called from Media Foundation's internal + * worker queue thread */ + GST_VIDEO_ENCODER_STREAM_LOCK (self); + gst_mf_video_enc_finish_sample (self, sample); + GST_VIDEO_ENCODER_STREAM_UNLOCK (self); + + return S_OK; +} + +typedef struct +{ + guint profile; + const gchar *profile_str; +} 
GstMFVideoEncProfileMap; + +static void +gst_mf_video_enc_enum_internal (GstMFTransform * transform, GUID & subtype, + GstObject * d3d11_device, GstMFVideoEncDeviceCaps * device_caps, + GstCaps ** sink_template, GstCaps ** src_template) +{ + HRESULT hr; + MFT_REGISTER_TYPE_INFO *infos; + UINT32 info_size; + gint i; + GstCaps *src_caps = NULL; + GstCaps *sink_caps = NULL; + GstCaps *d3d11_caps = NULL; + GValue *supported_formats = NULL; + GValue *profiles = NULL; + gboolean have_I420 = FALSE; + gboolean have_NV12 = FALSE; + gboolean have_P010 = FALSE; + gboolean d3d11_aware = FALSE; + gchar *device_name = NULL; + IMFActivate *activate; + IMFTransform *encoder; + ICodecAPI *codec_api; + ComPtr < IMFMediaType > out_type; + GstMFVideoEncProfileMap h264_profile_map[] = { + {eAVEncH264VProfile_High, "high"}, + {eAVEncH264VProfile_Main, "main"}, + {eAVEncH264VProfile_Base, "baseline"}, + {0, NULL}, + }; + GstMFVideoEncProfileMap hevc_profile_map[] = { + {eAVEncH265VProfile_Main_420_8, "main"}, + {eAVEncH265VProfile_Main_420_10, "main-10"}, + {0, NULL}, + }; + GstMFVideoEncProfileMap *profile_to_check = NULL; + static const gchar *h264_caps_str = + "video/x-h264, stream-format=(string) byte-stream, alignment=(string) au"; + static const gchar *hevc_caps_str = + "video/x-h265, stream-format=(string) byte-stream, alignment=(string) au"; + static const gchar *vp9_caps_str = "video/x-vp9"; + const gchar *codec_caps_str = NULL; + + /* NOTE: depending on environment, + * some enumerated h/w MFT might not be usable (e.g., multiple GPU case) */ + if (!gst_mf_transform_open (transform)) + return; + + activate = gst_mf_transform_get_activate_handle (transform); + if (!activate) { + GST_WARNING_OBJECT (transform, "No IMFActivate interface available"); + return; + } + + encoder = gst_mf_transform_get_transform_handle (transform); + if (!encoder) { + GST_WARNING_OBJECT (transform, "No IMFTransform interface available"); + return; + } + + codec_api = 
gst_mf_transform_get_codec_api_handle (transform); + if (!codec_api) { + GST_WARNING_OBJECT (transform, "No ICodecAPI interface available"); + return; + } + + g_object_get (transform, "device-name", &device_name, NULL); + if (!device_name) { + GST_WARNING_OBJECT (transform, "Unknown device name"); + return; + } + g_free (device_name); + + hr = activate->GetAllocatedBlob (MFT_INPUT_TYPES_Attributes, + (UINT8 **) & infos, &info_size); + if (!gst_mf_result (hr)) + return; + + for (i = 0; i < info_size / sizeof (MFT_REGISTER_TYPE_INFO); i++) { + GstVideoFormat format; + const GstVideoFormatInfo *format_info; + GValue val = G_VALUE_INIT; + + format = gst_mf_video_subtype_to_video_format (&infos[i].guidSubtype); + if (format == GST_VIDEO_FORMAT_UNKNOWN) + continue; + + format_info = gst_video_format_get_info (format); + if (GST_VIDEO_FORMAT_INFO_IS_RGB (format_info)) { + GST_DEBUG_OBJECT (transform, "Skip %s format", + GST_VIDEO_FORMAT_INFO_NAME (format_info)); + continue; + } + + if (!supported_formats) { + supported_formats = g_new0 (GValue, 1); + g_value_init (supported_formats, GST_TYPE_LIST); + } + + switch (format) { + /* media foundation has duplicated formats IYUV and I420 */ + case GST_VIDEO_FORMAT_I420: + if (have_I420) + continue; + + have_I420 = TRUE; + break; + case GST_VIDEO_FORMAT_NV12: + have_NV12 = TRUE; + break; + case GST_VIDEO_FORMAT_P010_10LE: + have_P010 = TRUE; + break; + default: + break; + } + + g_value_init (&val, G_TYPE_STRING); + g_value_set_static_string (&val, gst_video_format_to_string (format)); + gst_value_list_append_and_take_value (supported_formats, &val); + } + CoTaskMemFree (infos); + + if (!supported_formats) { + GST_WARNING_OBJECT (transform, "Couldn't figure out supported format"); + return; + } + + if (IsEqualGUID (MFVideoFormat_H264, subtype)) { + profile_to_check = h264_profile_map; + codec_caps_str = h264_caps_str; + } else if (IsEqualGUID (MFVideoFormat_HEVC, subtype)) { + profile_to_check = hevc_profile_map; + codec_caps_str 
= hevc_caps_str; + } else if (IsEqualGUID (MFVideoFormat_VP90, subtype)) { + codec_caps_str = vp9_caps_str; + } else { + g_assert_not_reached (); + return; + } + + if (profile_to_check) { + hr = MFCreateMediaType (&out_type); + if (!gst_mf_result (hr)) + return; + + hr = out_type->SetGUID (MF_MT_MAJOR_TYPE, MFMediaType_Video); + if (!gst_mf_result (hr)) + return; + + hr = out_type->SetGUID (MF_MT_SUBTYPE, subtype); + if (!gst_mf_result (hr)) + return; + + hr = out_type->SetUINT32 (MF_MT_AVG_BITRATE, 2048000); + if (!gst_mf_result (hr)) + return; + + hr = MFSetAttributeRatio (out_type.Get (), MF_MT_FRAME_RATE, 30, 1); + if (!gst_mf_result (hr)) + return; + + hr = out_type->SetUINT32 (MF_MT_INTERLACE_MODE, + MFVideoInterlace_Progressive); + if (!gst_mf_result (hr)) + return; + + hr = MFSetAttributeSize (out_type.Get (), MF_MT_FRAME_SIZE, 1920, 1080); + if (!gst_mf_result (hr)) + return; + + i = 0; + do { + GValue profile_val = G_VALUE_INIT; + guint mf_profile = profile_to_check[i].profile; + const gchar *profile_str = profile_to_check[i].profile_str; + + i++; + + if (mf_profile == 0) + break; + + g_assert (profile_str != NULL); + + hr = out_type->SetUINT32 (MF_MT_MPEG2_PROFILE, mf_profile); + if (!gst_mf_result (hr)) + return; + + if (!gst_mf_transform_set_output_type (transform, out_type.Get ())) + continue; + + if (!profiles) { + profiles = g_new0 (GValue, 1); + g_value_init (profiles, GST_TYPE_LIST); + } + + /* Add "constrained-baseline" in addition to "baseline" */ + if (profile_str == "baseline") { + g_value_init (&profile_val, G_TYPE_STRING); + g_value_set_static_string (&profile_val, "constrained-baseline"); + gst_value_list_append_and_take_value (profiles, &profile_val); + } + + g_value_init (&profile_val, G_TYPE_STRING); + g_value_set_static_string (&profile_val, profile_str); + gst_value_list_append_and_take_value (profiles, &profile_val); + } while (1); + + if (!profiles) { + GST_WARNING_OBJECT (transform, "Couldn't query supported profile"); + return; + } 
+ } + + src_caps = gst_caps_from_string (codec_caps_str); + if (profiles) { + gst_caps_set_value (src_caps, "profile", profiles); + g_value_unset (profiles); + g_free (profiles); + } + + sink_caps = gst_caps_new_empty_simple ("video/x-raw"); + /* FIXME: don't hardcode max resolution, but MF doesn't provide + * API for querying supported max resolution... */ + gst_caps_set_simple (sink_caps, + "width", GST_TYPE_INT_RANGE, 64, 8192, + "height", GST_TYPE_INT_RANGE, 64, 8192, NULL); + gst_caps_set_simple (src_caps, + "width", GST_TYPE_INT_RANGE, 64, 8192, + "height", GST_TYPE_INT_RANGE, 64, 8192, NULL); + +#if GST_MF_HAVE_D3D11 + /* Check whether this MFT can support D3D11 */ + if (d3d11_device && (have_NV12 || have_P010)) { + g_object_get (transform, "d3d11-aware", &d3d11_aware, NULL); + GST_DEBUG_OBJECT (transform, "d3d11 aware %d", d3d11_aware); + } + + if (d3d11_device && (have_NV12 || have_P010) && d3d11_aware) { + gint64 adapter_luid = 0; + GValue d3d11_formats = G_VALUE_INIT; + + g_object_get (d3d11_device, "adapter-luid", &adapter_luid, NULL); + + d3d11_caps = gst_caps_copy (sink_caps); + + g_value_init (&d3d11_formats, GST_TYPE_LIST); + if (have_NV12) { + GValue val = G_VALUE_INIT; + g_value_init (&val, G_TYPE_STRING); + g_value_set_static_string (&val, "NV12"); + gst_value_list_append_and_take_value (&d3d11_formats, &val); + } + + if (have_P010) { + GValue val = G_VALUE_INIT; + g_value_init (&val, G_TYPE_STRING); + g_value_set_static_string (&val, "P010_10LE"); + gst_value_list_append_and_take_value (&d3d11_formats, &val); + } + + gst_caps_set_value (d3d11_caps, "format", &d3d11_formats); + g_value_unset (&d3d11_formats); + gst_caps_set_features_simple (d3d11_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_D3D11_MEMORY)); + device_caps->d3d11_aware = TRUE; + device_caps->adapter_luid = adapter_luid; + } +#endif + + gst_caps_set_value (sink_caps, "format", supported_formats); + g_value_unset (supported_formats); + g_free (supported_formats); + + 
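The profile-probing loop above follows a simple accept-and-collect pattern: propose each candidate profile on the output media type, keep the names the encoder accepts, and let "baseline" additionally advertise "constrained-baseline". A generic sketch of that pattern; the `accepts` predicate stands in for gst_mf_transform_set_output_type(), and all names here are illustrative:

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Try each (id, name) candidate against the encoder; collect accepted names.
// "baseline" also implies "constrained-baseline", mirroring the patch.
inline std::vector<std::string> probe_profiles (
    const std::vector<std::pair<unsigned, std::string>> &candidates,
    const std::function<bool (unsigned)> &accepts)
{
  std::vector<std::string> out;
  for (const auto &c : candidates) {
    if (!accepts (c.first))
      continue;                 /* encoder rejected this profile */
    if (c.second == "baseline")
      out.push_back ("constrained-baseline");
    out.push_back (c.second);
  }
  return out;
}
```

Probing with a real encoder simply replaces the predicate with a call that sets MF_MT_MPEG2_PROFILE and attempts to commit the output type, exactly as the loop in the patch does.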
if (d3d11_caps) + gst_caps_append (sink_caps, d3d11_caps); + + *sink_template = sink_caps; + *src_template = src_caps; + +#define CHECK_DEVICE_CAPS(codec_obj,api,val) \ + if (SUCCEEDED((codec_obj)->IsSupported(&(api)))) {\ + (device_caps)->val = TRUE; \ + } + + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonRateControlMode, rc_mode); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonQuality, quality); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncAdaptiveMode, adaptive_mode); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonBufferSize, buffer_size); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonMaxBitRate, max_bitrate); + CHECK_DEVICE_CAPS (codec_api, + CODECAPI_AVEncCommonQualityVsSpeed, quality_vs_speed); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncH264CABACEnable, cabac); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncH264SPSID, sps_id); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncH264PPSID, pps_id); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVDefaultBPictureCount, bframes); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVGOPSize, gop_size); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncNumWorkerThreads, threads); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoContentType, content_type); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoEncodeQP, qp); + CHECK_DEVICE_CAPS (codec_api, + CODECAPI_AVEncVideoForceKeyFrame, force_keyframe); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVLowLatencyMode, low_latency); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMinQP, min_qp); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMaxQP, max_qp); + CHECK_DEVICE_CAPS (codec_api, + CODECAPI_AVEncVideoEncodeFrameTypeQP, frame_type_qp); + CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoMaxNumRefFrame, max_num_ref); + if (device_caps->max_num_ref) { + VARIANT min; + VARIANT max; + VARIANT step; + + hr = codec_api->GetParameterRange (&CODECAPI_AVEncVideoMaxNumRefFrame, + &min, &max, &step); + if (SUCCEEDED (hr)) { + device_caps->max_num_ref_high = 
max.uiVal; + device_caps->max_num_ref_low = min.uiVal; + VariantClear (&min); + VariantClear (&max); + VariantClear (&step); + } else { + device_caps->max_num_ref = FALSE; + } + } +#undef CHECK_DEVICE_CAPS + + return; +} + +static GstMFTransform * +gst_mf_video_enc_enum (guint enum_flags, GUID * subtype, guint device_index, + GstMFVideoEncDeviceCaps * device_caps, GstObject * d3d11_device, + GstCaps ** sink_template, GstCaps ** src_template) +{ + GstMFTransformEnumParams enum_params = { 0, }; + MFT_REGISTER_TYPE_INFO output_type; + GstMFTransform *transform; + gint64 adapter_luid = 0; + + *sink_template = NULL; + *src_template = NULL; + memset (device_caps, 0, sizeof (GstMFVideoEncDeviceCaps)); + + if (!IsEqualGUID (MFVideoFormat_H264, *subtype) && + !IsEqualGUID (MFVideoFormat_HEVC, *subtype) && + !IsEqualGUID (MFVideoFormat_VP90, *subtype)) { + GST_ERROR ("Unknown subtype GUID"); + + return NULL; + } + + if (d3d11_device) { + g_object_get (d3d11_device, "adapter-luid", &adapter_luid, NULL); + if (!adapter_luid) { + GST_ERROR ("Couldn't get adapter LUID"); + return NULL; + } + } + + output_type.guidMajorType = MFMediaType_Video; + output_type.guidSubtype = *subtype; + + enum_params.category = MFT_CATEGORY_VIDEO_ENCODER; + enum_params.output_typeinfo = &output_type; + enum_params.device_index = device_index; + enum_params.enum_flags = enum_flags; + enum_params.adapter_luid = adapter_luid; + + transform = gst_mf_transform_new (&enum_params); + if (!transform) + return NULL; + + gst_mf_video_enc_enum_internal (transform, output_type.guidSubtype, + d3d11_device, device_caps, sink_template, src_template); + + return transform; +} + +static void +gst_mf_video_enc_register_internal (GstPlugin * plugin, guint rank, + GUID * subtype, GTypeInfo * type_info, + const GstMFVideoEncDeviceCaps * device_caps, + guint32 enum_flags, guint device_index, GstMFTransform * transform, + GstCaps * sink_caps, GstCaps * src_caps) +{ + GType type; + GTypeInfo local_type_info; + gchar 
*type_name; + gchar *feature_name; + gint i; + GstMFVideoEncClassData *cdata; + gboolean is_default = TRUE; + gchar *device_name = NULL; + const gchar *type_name_prefix = NULL; + const gchar *feature_name_prefix = NULL; + + if (IsEqualGUID (MFVideoFormat_H264, *subtype)) { + type_name_prefix = "H264"; + feature_name_prefix = "h264"; + } else if (IsEqualGUID (MFVideoFormat_HEVC, *subtype)) { + type_name_prefix = "H265"; + feature_name_prefix = "h265"; + } else if (IsEqualGUID (MFVideoFormat_VP90, *subtype)) { + type_name_prefix = "VP9"; + feature_name_prefix = "vp9"; + } else { + g_assert_not_reached (); + return; + } + + /* Must be checked already */ + g_object_get (transform, "device-name", &device_name, NULL); + g_assert (device_name != NULL); + + cdata = g_new0 (GstMFVideoEncClassData, 1); + cdata->sink_caps = gst_caps_copy (sink_caps); + cdata->src_caps = gst_caps_copy (src_caps); + cdata->device_name = device_name; + cdata->device_caps = *device_caps; + cdata->enum_flags = enum_flags; + cdata->device_index = device_index; + + local_type_info = *type_info; + local_type_info.class_data = cdata; + + GST_MINI_OBJECT_FLAG_SET (cdata->sink_caps, + GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + GST_MINI_OBJECT_FLAG_SET (cdata->src_caps, + GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + type_name = g_strdup_printf ("GstMF%sEnc", type_name_prefix); + feature_name = g_strdup_printf ("mf%senc", feature_name_prefix); + + i = 1; + while (g_type_from_name (type_name) != 0) { + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstMF%sDevice%dEnc", type_name_prefix, i); + feature_name = g_strdup_printf ("mf%sdevice%denc", feature_name_prefix, i); + is_default = FALSE; + i++; + } + + cdata->is_default = is_default; + + type = + g_type_register_static (GST_TYPE_MF_VIDEO_ENC, type_name, + &local_type_info, (GTypeFlags) 0); + + /* make lower rank than default device */ + if (rank > 0 && !is_default) + rank--; + + if (!is_default) + 
gst_element_type_set_skip_documentation (type); + + if (!gst_element_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +} + +void +gst_mf_video_enc_register (GstPlugin * plugin, guint rank, GUID * subtype, + GTypeInfo * type_info, GList * d3d11_device) +{ + GstMFTransform *transform = NULL; + GstCaps *sink_template = NULL; + GstCaps *src_template = NULL; + guint enum_flags; + GstMFVideoEncDeviceCaps device_caps; + guint i; + + /* register hardware encoders first */ + enum_flags = (MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_ASYNCMFT | + MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); + + if (d3d11_device) { + GList *iter; + for (iter = d3d11_device; iter; iter = g_list_next (iter)) { + GstObject *device = (GstObject *) iter->data; + + transform = gst_mf_video_enc_enum (enum_flags, subtype, 0, &device_caps, + device, &sink_template, &src_template); + + /* No more MFT to enumerate */ + if (!transform) + break; + + /* Failed to open MFT */ + if (!sink_template) { + gst_clear_object (&transform); + continue; + } + + gst_mf_video_enc_register_internal (plugin, rank, subtype, type_info, + &device_caps, enum_flags, 0, transform, sink_template, src_template); + gst_clear_object (&transform); + gst_clear_caps (&sink_template); + gst_clear_caps (&src_template); + } + } else { + /* AMD seems to be able to support up to 12 GPUs */ + for (i = 0; i < 12; i++) { + transform = gst_mf_video_enc_enum (enum_flags, subtype, i, &device_caps, + NULL, &sink_template, &src_template); + + /* No more MFT to enumerate */ + if (!transform) + break; + + /* Failed to open MFT */ + if (!sink_template) { + gst_clear_object (&transform); + continue; + } + + gst_mf_video_enc_register_internal (plugin, rank, subtype, type_info, + &device_caps, enum_flags, i, transform, sink_template, src_template); + gst_clear_object (&transform); + gst_clear_caps (&sink_template); + 
gst_clear_caps (&src_template); + } + } + + /* register software encoders */ + enum_flags = (MFT_ENUM_FLAG_SYNCMFT | + MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); + + transform = gst_mf_video_enc_enum (enum_flags, subtype, 0, &device_caps, + NULL, &sink_template, &src_template); + + if (!transform) + goto done; + + if (!sink_template) + goto done; + + gst_mf_video_enc_register_internal (plugin, rank, subtype, type_info, + &device_caps, enum_flags, 0, transform, sink_template, src_template); + +done: + gst_clear_object (&transform); + gst_clear_caps (&sink_template); + gst_clear_caps (&src_template); }
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfvideoenc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvideoenc.h
Changed
@@ -21,11 +21,17 @@ #ifndef __GST_MF_VIDEO_ENC_H__ #define __GST_MF_VIDEO_ENC_H__ +#include "gstmfconfig.h" + #include <gst/gst.h> #include <gst/video/video.h> #include "gstmfutils.h" #include "gstmftransform.h" +#if GST_MF_HAVE_D3D11 +#include <gst/d3d11/gstd3d11.h> +#endif + G_BEGIN_DECLS #define GST_TYPE_MF_VIDEO_ENC (gst_mf_video_enc_get_type()) @@ -37,26 +43,92 @@ typedef struct _GstMFVideoEnc GstMFVideoEnc; typedef struct _GstMFVideoEncClass GstMFVideoEncClass; +typedef struct _GstMFVideoEncDeviceCaps GstMFVideoEncDeviceCaps; +typedef struct _GstMFVideoEncClassData GstMFVideoEncClassData; + +struct _GstMFVideoEncDeviceCaps +{ + gboolean rc_mode; /* AVEncCommonRateControlMode */ + gboolean quality; /* AVEncCommonQuality */ + + gboolean adaptive_mode; /* AVEncAdaptiveMode */ + gboolean buffer_size; /* AVEncCommonBufferSize */ + gboolean max_bitrate; /* AVEncCommonMaxBitRate */ + gboolean quality_vs_speed; /* AVEncCommonQualityVsSpeed */ + gboolean cabac; /* AVEncH264CABACEnable */ + gboolean sps_id; /* AVEncH264SPSID */ + gboolean pps_id; /* AVEncH264PPSID */ + gboolean bframes; /* AVEncMPVDefaultBPictureCount */ + gboolean gop_size; /* AVEncMPVGOPSize */ + gboolean threads; /* AVEncNumWorkerThreads */ + gboolean content_type; /* AVEncVideoContentType */ + gboolean qp; /* AVEncVideoEncodeQP */ + gboolean force_keyframe; /* AVEncVideoForceKeyFrame */ + gboolean low_latency; /* AVLowLatencyMode */ + + gboolean min_qp; /* AVEncVideoMinQP */ + gboolean max_qp; /* AVEncVideoMaxQP */ + gboolean frame_type_qp; /* AVEncVideoEncodeFrameTypeQP */ + gboolean max_num_ref; /* AVEncVideoMaxNumRefFrame */ + guint max_num_ref_high; + guint max_num_ref_low; + + /* TRUE if MFT support d3d11 and also we can use d3d11 interop */ + gboolean d3d11_aware; + /* DXGI adapter LUID, valid only when d3d11_aware == TRUE */ + gint64 adapter_luid; +}; + +struct _GstMFVideoEncClassData +{ + GstCaps *sink_caps; + GstCaps *src_caps; + gchar *device_name; + guint32 enum_flags; + guint 
device_index; + GstMFVideoEncDeviceCaps device_caps; + gboolean is_default; +}; struct _GstMFVideoEnc { GstVideoEncoder parent; GstMFTransform *transform; + gboolean async_mft; + GstFlowReturn last_ret; GstVideoCodecState *input_state; + + /* Set by subclass */ + gboolean has_reorder_frame; + + /* Calculated timestamp offset in MF timescale (100ns scale) + * when B-frame is enabled. */ + LONGLONG mf_pts_offset; + +#if GST_MF_HAVE_D3D11 + /* For D3D11 interop. */ + GstD3D11Device *other_d3d11_device; + GstD3D11Device *d3d11_device; + IMFDXGIDeviceManager *device_manager; + UINT reset_token; + IMFVideoSampleAllocatorEx *mf_allocator; +#endif }; struct _GstMFVideoEncClass { GstVideoEncoderClass parent_class; - GUID codec_id; - guint32 enum_flags; - guint device_index; - gboolean can_force_keyframe; + /* Set by subclass */ + GUID codec_id; /* Output subtype of MFT */ + guint32 enum_flags; /* MFT_ENUM_FLAG */ + guint device_index; /* Index of enumerated IMFActivate via MFTEnum */ + GstMFVideoEncDeviceCaps device_caps; gboolean (*set_option) (GstMFVideoEnc * mfenc, + GstVideoCodecState * state, IMFMediaType * output_type); gboolean (*set_src_caps) (GstMFVideoEnc * mfenc, @@ -66,6 +138,12 @@ GType gst_mf_video_enc_get_type (void); +void gst_mf_video_enc_register (GstPlugin * plugin, + guint rank, + GUID * subtype, + GTypeInfo * type_info, + GList * d3d11_device); + G_END_DECLS #endif /* __GST_MF_VIDEO_ENC_H__ */ \ No newline at end of file
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfvideosrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvideosrc.c
Changed
@@ -73,8 +73,8 @@ gboolean started; GstVideoInfo info; - GstClockTime first_pts; guint64 n_frames; + GstClockTime latency; /* properties */ gchar *device_path; @@ -109,6 +109,7 @@ static GstCaps *gst_mf_video_src_fixate (GstBaseSrc * src, GstCaps * caps); static gboolean gst_mf_video_src_unlock (GstBaseSrc * src); static gboolean gst_mf_video_src_unlock_stop (GstBaseSrc * src); +static gboolean gst_mf_video_src_query (GstBaseSrc * src, GstQuery * query); static GstFlowReturn gst_mf_video_src_create (GstPushSrc * pushsrc, GstBuffer ** buffer); @@ -178,6 +179,7 @@ basesrc_class->fixate = GST_DEBUG_FUNCPTR (gst_mf_video_src_fixate); basesrc_class->unlock = GST_DEBUG_FUNCPTR (gst_mf_video_src_unlock); basesrc_class->unlock_stop = GST_DEBUG_FUNCPTR (gst_mf_video_src_unlock_stop); + basesrc_class->query = GST_DEBUG_FUNCPTR (gst_mf_video_src_query); pushsrc_class->create = GST_DEBUG_FUNCPTR (gst_mf_video_src_create); @@ -190,7 +192,6 @@ { gst_base_src_set_format (GST_BASE_SRC (self), GST_FORMAT_TIME); gst_base_src_set_live (GST_BASE_SRC (self), TRUE); - gst_base_src_set_do_timestamp (GST_BASE_SRC (self), TRUE); self->device_index = DEFAULT_DEVICE_INDEX; } @@ -267,10 +268,17 @@ self->source = gst_mf_source_object_new (GST_MF_SOURCE_TYPE_VIDEO, self->device_index, self->device_name, self->device_path, NULL); - self->first_pts = GST_CLOCK_TIME_NONE; self->n_frames = 0; + self->latency = 0; - return ! 
!self->source; + if (!self->source) { + GST_ERROR_OBJECT (self, "Couldn't create capture object"); + return FALSE; + } + + gst_mf_source_object_set_client (self->source, GST_ELEMENT (self)); + + return TRUE; } static gboolean @@ -383,12 +391,35 @@ return TRUE; } +static gboolean +gst_mf_video_src_query (GstBaseSrc * src, GstQuery * query) +{ + GstMFVideoSrc *self = GST_MF_VIDEO_SRC (src); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_LATENCY: + if (self->started) { + gst_query_set_latency (query, TRUE, 0, self->latency); + + return TRUE; + } + break; + default: + break; + } + + return GST_BASE_SRC_CLASS (parent_class)->query (src, query); +} + static GstFlowReturn gst_mf_video_src_create (GstPushSrc * pushsrc, GstBuffer ** buffer) { GstMFVideoSrc *self = GST_MF_VIDEO_SRC (pushsrc); GstFlowReturn ret = GST_FLOW_OK; GstBuffer *buf = NULL; + GstClock *clock; + GstClockTime running_time = GST_CLOCK_TIME_NONE; + GstClockTimeDiff diff; if (!self->started) { if (!gst_mf_source_object_start (self->source)) { @@ -419,6 +450,28 @@ GST_BUFFER_OFFSET_END (buf) = GST_BUFFER_OFFSET (buf) + 1; self->n_frames++; + GST_LOG_OBJECT (self, + "Captured buffer timestamp %" GST_TIME_FORMAT ", duration %" + GST_TIME_FORMAT, GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buf)), + GST_TIME_ARGS (GST_BUFFER_DURATION (buf))); + + /* Update latency */ + clock = gst_element_get_clock (GST_ELEMENT_CAST (self)); + if (clock) { + GstClockTime now; + + now = gst_clock_get_time (clock); + running_time = now - GST_ELEMENT_CAST (self)->base_time; + gst_object_unref (clock); + } + + diff = GST_CLOCK_DIFF (GST_BUFFER_PTS (buf), running_time); + if (diff > self->latency) { + self->latency = (GstClockTime) diff; + GST_DEBUG_OBJECT (self, "Updated latency value %" GST_TIME_FORMAT, + GST_TIME_ARGS (self->latency)); + } + *buffer = buf; return GST_FLOW_OK;
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfvp9enc.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvp9enc.cpp
Changed
@@ -39,7 +39,9 @@ #include "gstmfvp9enc.h" #include <wrl.h> +/* *INDENT-OFF* */ using namespace Microsoft::WRL; +/* *INDENT-ON* */ GST_DEBUG_CATEGORY (gst_mf_vp9_enc_debug); #define GST_CAT_DEFAULT gst_mf_vp9_enc_debug @@ -105,37 +107,24 @@ PROP_THREADS, PROP_CONTENT_TYPE, PROP_LOW_LATENCY, + PROP_D3D11_AWARE, + PROP_ADAPTER_LUID, }; #define DEFAULT_BITRATE (2 * 1024) #define DEFAULT_RC_MODE GST_MF_VP9_ENC_RC_MODE_CBR #define DEFAULT_MAX_BITRATE 0 #define DEFAULT_QUALITY_VS_SPEED 50 -#define DEFAULT_GOP_SIZE 0 +#define DEFAULT_GOP_SIZE -1 #define DEFAULT_THREADS 0 #define DEFAULT_CONTENT_TYPE GST_MF_VP9_ENC_CONTENT_TYPE_UNKNOWN #define DEFAULT_LOW_LATENCY FALSE -#define GST_MF_VP9_ENC_GET_CLASS(obj) \ - (G_TYPE_INSTANCE_GET_CLASS((obj), G_TYPE_FROM_INSTANCE (obj), GstMFVP9EncClass)) - -typedef struct _GstMFVP9EncDeviceCaps -{ - gboolean rc_mode; /* AVEncCommonRateControlMode */ - gboolean max_bitrate; /* AVEncCommonMaxBitRate */ - gboolean quality_vs_speed; /* AVEncCommonQualityVsSpeed */ - gboolean gop_size; /* AVEncMPVGOPSize */ - gboolean threads; /* AVEncNumWorkerThreads */ - gboolean content_type; /* AVEncVideoContentType */ - gboolean force_keyframe; /* AVEncVideoForceKeyFrame */ - gboolean low_latency; /* AVLowLatencyMode */ -} GstMFVP9EncDeviceCaps; - typedef struct _GstMFVP9Enc { GstMFVideoEnc parent; - /* properteies */ + /* properties */ guint bitrate; /* device dependent properties */ @@ -151,21 +140,8 @@ typedef struct _GstMFVP9EncClass { GstMFVideoEncClass parent_class; - - GstMFVP9EncDeviceCaps device_caps; } GstMFVP9EncClass; -typedef struct -{ - GstCaps *sink_caps; - GstCaps *src_caps; - gchar *device_name; - guint32 enum_flags; - guint device_index; - GstMFVP9EncDeviceCaps device_caps; - gboolean is_default; -} GstMFVP9EncClassData; - static GstElementClass *parent_class = NULL; static void gst_mf_vp9_enc_get_property (GObject * object, guint prop_id, @@ -173,7 +149,7 @@ static void gst_mf_vp9_enc_set_property (GObject * object, guint prop_id, 
const GValue * value, GParamSpec * pspec); static gboolean gst_mf_vp9_enc_set_option (GstMFVideoEnc * mfenc, - IMFMediaType * output_type); + GstVideoCodecState * state, IMFMediaType * output_type); static gboolean gst_mf_vp9_enc_set_src_caps (GstMFVideoEnc * mfenc, GstVideoCodecState * state, IMFMediaType * output_type); @@ -183,13 +159,12 @@ GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *element_class = GST_ELEMENT_CLASS (klass); GstMFVideoEncClass *mfenc_class = GST_MF_VIDEO_ENC_CLASS (klass); - GstMFVP9EncClassData *cdata = (GstMFVP9EncClassData *) data; - GstMFVP9EncDeviceCaps *device_caps = &cdata->device_caps; + GstMFVideoEncClassData *cdata = (GstMFVideoEncClassData *) data; + GstMFVideoEncDeviceCaps *device_caps = &cdata->device_caps; gchar *long_name; gchar *classification; parent_class = (GstElementClass *) g_type_class_peek_parent (klass); - klass->device_caps = *device_caps; gobject_class->get_property = gst_mf_vp9_enc_get_property; gobject_class->set_property = gst_mf_vp9_enc_set_property; @@ -205,7 +180,7 @@ "Rate Control Mode", GST_TYPE_MF_VP9_ENC_RC_MODE, DEFAULT_RC_MODE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { @@ -221,7 +196,7 @@ "(0 = MFT default)", 0, (G_MAXUINT >> 10), DEFAULT_MAX_BITRATE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->quality_vs_speed) { @@ -231,17 +206,18 @@ "[34, 66]: Medium complexity, [67, 100]: High complexity", 0, 100, DEFAULT_QUALITY_VS_SPEED, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->gop_size) { g_object_class_install_property (gobject_class, PROP_GOP_SIZE, 
- g_param_spec_uint ("gop-size", "GOP size", - "The number of pictures from one GOP header to the next, " - "(0 = MFT default)", 0, G_MAXUINT - 1, - DEFAULT_GOP_SIZE, + g_param_spec_int ("gop-size", "GOP size", + "The number of pictures from one GOP header to the next. " + "Depending on GPU vendor implementation, zero gop-size might " + "produce only one keyframe at the beginning (-1 for automatic)", + -1, G_MAXINT, DEFAULT_GOP_SIZE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->threads) { @@ -251,7 +227,7 @@ "(0 = MFT default)", 0, 16, DEFAULT_THREADS, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } if (device_caps->content_type) { @@ -260,7 +236,7 @@ "Indicates the type of video content", GST_TYPE_MF_VP9_ENC_CONTENT_TYPE, DEFAULT_CONTENT_TYPE, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); /* NOTE: documentation will be done by only for default device */ if (cdata->is_default) { @@ -275,13 +251,42 @@ "Enable low latency encoding", DEFAULT_LOW_LATENCY, (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | - G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); + } + + /** + * GstMFVP9Enc:d3d11-aware: + * + * Whether element supports Direct3D11 texture as an input or not + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_D3D11_AWARE, + g_param_spec_boolean ("d3d11-aware", "D3D11 Aware", + "Whether device can support Direct3D11 interop", + device_caps->d3d11_aware, + (GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); + + /** + * GstMFVP9Enc:adapter-luid: + * + * DXGI Adapter LUID for this elemenet + * + * Since: 1.20 + */ + if (device_caps->d3d11_aware) { + 
g_object_class_install_property (gobject_class, PROP_ADAPTER_LUID, + g_param_spec_int64 ("adapter-luid", "Adapter LUID", + "DXGI Adapter LUID (Locally Unique Identifier) of created device", + G_MININT64, G_MAXINT64, device_caps->adapter_luid, + (GParamFlags) (GST_PARAM_CONDITIONALLY_AVAILABLE | + G_PARAM_READABLE | G_PARAM_STATIC_STRINGS))); } long_name = g_strdup_printf ("Media Foundation %s", cdata->device_name); classification = g_strdup_printf ("Codec/Encoder/Video%s", (cdata->enum_flags & MFT_ENUM_FLAG_HARDWARE) == MFT_ENUM_FLAG_HARDWARE ? - "/Hardware" : ""); + "/Hardware" : ""); gst_element_class_set_metadata (element_class, long_name, classification, "Microsoft Media Foundation VP9 Encoder", @@ -302,7 +307,7 @@ mfenc_class->codec_id = MFVideoFormat_VP90; mfenc_class->enum_flags = cdata->enum_flags; mfenc_class->device_index = cdata->device_index; - mfenc_class->can_force_keyframe = device_caps->force_keyframe; + mfenc_class->device_caps = *device_caps; g_free (cdata->device_name); gst_caps_unref (cdata->sink_caps); @@ -328,6 +333,7 @@ GValue * value, GParamSpec * pspec) { GstMFVP9Enc *self = (GstMFVP9Enc *) (object); + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (object); switch (prop_id) { case PROP_BITRATE: @@ -343,7 +349,7 @@ g_value_set_uint (value, self->quality_vs_speed); break; case PROP_GOP_SIZE: - g_value_set_uint (value, self->gop_size); + g_value_set_int (value, self->gop_size); break; case PROP_THREADS: g_value_set_uint (value, self->threads); @@ -354,6 +360,12 @@ case PROP_LOW_LATENCY: g_value_set_boolean (value, self->low_latency); break; + case PROP_D3D11_AWARE: + g_value_set_boolean (value, klass->device_caps.d3d11_aware); + break; + case PROP_ADAPTER_LUID: + g_value_set_int64 (value, klass->device_caps.adapter_luid); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -380,7 +392,7 @@ self->quality_vs_speed = g_value_get_uint (value); break; case PROP_GOP_SIZE: - self->gop_size = 
g_value_get_uint (value); + self->gop_size = g_value_get_int (value); break; case PROP_THREADS: self->threads = g_value_get_uint (value); @@ -431,11 +443,12 @@ } G_STMT_END static gboolean -gst_mf_vp9_enc_set_option (GstMFVideoEnc * mfenc, IMFMediaType * output_type) +gst_mf_vp9_enc_set_option (GstMFVideoEnc * mfenc, GstVideoCodecState * state, + IMFMediaType * output_type) { GstMFVP9Enc *self = (GstMFVP9Enc *) mfenc; - GstMFVP9EncClass *klass = GST_MF_VP9_ENC_GET_CLASS (self); - GstMFVP9EncDeviceCaps *device_caps = &klass->device_caps; + GstMFVideoEncClass *klass = GST_MF_VIDEO_ENC_GET_CLASS (mfenc); + GstMFVideoEncDeviceCaps *device_caps = &klass->device_caps; HRESULT hr; GstMFTransform *transform = mfenc->transform; @@ -470,14 +483,30 @@ if (device_caps->quality_vs_speed) { hr = gst_mf_transform_set_codec_api_uint32 (transform, - &CODECAPI_AVEncCommonQualityVsSpeed, - self->quality_vs_speed); + &CODECAPI_AVEncCommonQualityVsSpeed, self->quality_vs_speed); WARNING_HR (hr, CODECAPI_AVEncCommonQualityVsSpeed); } if (device_caps->gop_size) { + GstVideoInfo *info = &state->info; + gint gop_size = self->gop_size; + gint fps_n, fps_d; + + /* Set default value (10 sec or 250 frames) like that of x264enc */ + if (gop_size < 0) { + fps_n = GST_VIDEO_INFO_FPS_N (info); + fps_d = GST_VIDEO_INFO_FPS_D (info); + if (fps_n <= 0 || fps_d <= 0) { + gop_size = 250; + } else { + gop_size = 10 * fps_n / fps_d; + } + + GST_DEBUG_OBJECT (self, "Update GOP size to %d", gop_size); + } + hr = gst_mf_transform_set_codec_api_uint32 (transform, - &CODECAPI_AVEncMPVGOPSize, self->gop_size); + &CODECAPI_AVEncMPVGOPSize, gop_size); WARNING_HR (hr, CODECAPI_AVEncMPVGOPSize); } @@ -536,21 +565,12 @@ gst_tag_list_unref (tags); return TRUE; - } -static void -gst_mf_vp9_enc_register (GstPlugin * plugin, guint rank, - const gchar * device_name, const GstMFVP9EncDeviceCaps * device_caps, - guint32 enum_flags, guint device_index, - GstCaps * sink_caps, GstCaps * src_caps) +void 
+gst_mf_vp9_enc_plugin_init (GstPlugin * plugin, guint rank, + GList * d3d11_device) { - GType type; - gchar *type_name; - gchar *feature_name; - gint i; - GstMFVP9EncClassData *cdata; - gboolean is_default = TRUE; GTypeInfo type_info = { sizeof (GstMFVP9EncClass), NULL, @@ -562,249 +582,9 @@ 0, (GInstanceInitFunc) gst_mf_vp9_enc_init, }; - - cdata = g_new0 (GstMFVP9EncClassData, 1); - cdata->sink_caps = sink_caps; - cdata->src_caps = src_caps; - cdata->device_name = g_strdup (device_name); - cdata->device_caps = *device_caps; - cdata->enum_flags = enum_flags; - cdata->device_index = device_index; - type_info.class_data = cdata; - - type_name = g_strdup ("GstMFVP9Enc"); - feature_name = g_strdup ("mfvp9enc"); - - i = 1; - while (g_type_from_name (type_name) != 0) { - g_free (type_name); - g_free (feature_name); - type_name = g_strdup_printf ("GstMFVP9Device%dEnc", i); - feature_name = g_strdup_printf ("mfvp9device%denc", i); - is_default = FALSE; - i++; - } - - cdata->is_default = is_default; - - type = - g_type_register_static (GST_TYPE_MF_VIDEO_ENC, type_name, &type_info, - (GTypeFlags) 0); - - /* make lower rank than default device */ - if (rank > 0 && !is_default) - rank--; - - if (!gst_element_register (plugin, feature_name, rank, type)) - GST_WARNING ("Failed to register plugin '%s'", type_name); - - g_free (type_name); - g_free (feature_name); -} - -static void -gst_mf_vp9_enc_plugin_init_internal (GstPlugin * plugin, guint rank, - GstMFTransform * transform, guint device_index, guint32 enum_flags) -{ - HRESULT hr; - MFT_REGISTER_TYPE_INFO *infos; - UINT32 info_size; - gint i; - GstCaps *src_caps = NULL; - GstCaps *sink_caps = NULL; - GValue *supported_formats = NULL; - gboolean have_I420 = FALSE; - gchar *device_name = NULL; - GstMFVP9EncDeviceCaps device_caps = { 0, }; - IMFActivate *activate; - IMFTransform *encoder; - ICodecAPI *codec_api; - ComPtr<IMFMediaType> out_type; - - /* NOTE: depending on environment, - * some enumerated h/w MFT might not be 
usable (e.g., multiple GPU case) */ - if (!gst_mf_transform_open (transform)) - return; - - activate = gst_mf_transform_get_activate_handle (transform); - if (!activate) { - GST_WARNING_OBJECT (transform, "No IMFActivate interface available"); - return; - } - - encoder = gst_mf_transform_get_transform_handle (transform); - if (!encoder) { - GST_WARNING_OBJECT (transform, "No IMFTransform interface available"); - return; - } - - codec_api = gst_mf_transform_get_codec_api_handle (transform); - if (!codec_api) { - GST_WARNING_OBJECT (transform, "No ICodecAPI interface available"); - return; - } - - g_object_get (transform, "device-name", &device_name, NULL); - if (!device_name) { - GST_WARNING_OBJECT (transform, "Unknown device name"); - return; - } - - hr = activate->GetAllocatedBlob (MFT_INPUT_TYPES_Attributes, - (UINT8 **) & infos, &info_size); - if (!gst_mf_result (hr)) - goto done; - - for (i = 0; i < info_size / sizeof (MFT_REGISTER_TYPE_INFO); i++) { - GstVideoFormat vformat; - GValue val = G_VALUE_INIT; - - vformat = gst_mf_video_subtype_to_video_format (&infos[i].guidSubtype); - if (vformat == GST_VIDEO_FORMAT_UNKNOWN) - continue; - - if (!supported_formats) { - supported_formats = g_new0 (GValue, 1); - g_value_init (supported_formats, GST_TYPE_LIST); - } - - /* media foundation has duplicated formats IYUV and I420 */ - if (vformat == GST_VIDEO_FORMAT_I420) { - if (have_I420) - continue; - - have_I420 = TRUE; - } - - g_value_init (&val, G_TYPE_STRING); - g_value_set_static_string (&val, gst_video_format_to_string (vformat)); - - gst_value_list_append_and_take_value (supported_formats, &val); - } - - CoTaskMemFree (infos); - - if (!supported_formats) - goto done; - - /* check supported resolutions */ - hr = MFCreateMediaType (out_type.GetAddressOf ()); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetGUID (MF_MT_MAJOR_TYPE, MFMediaType_Video); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetGUID (MF_MT_SUBTYPE, MFVideoFormat_VP90); - 
if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_AVG_BITRATE, 2048000); - if (!gst_mf_result (hr)) - goto done; - - hr = MFSetAttributeRatio (out_type.Get (), MF_MT_FRAME_RATE, 30, 1); - if (!gst_mf_result (hr)) - goto done; - - hr = out_type->SetUINT32 (MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); - if (!gst_mf_result (hr)) - goto done; - - GST_DEBUG_OBJECT (transform, "Check supported profiles of %s", - device_name); - - src_caps = gst_caps_new_empty_simple ("video/x-vp9"); - sink_caps = gst_caps_new_empty_simple ("video/x-raw"); - gst_caps_set_value (sink_caps, "format", supported_formats); - g_value_unset (supported_formats); - g_free (supported_formats); - - /* FIXME: don't hardcode resolution */ - gst_caps_set_simple (sink_caps, - "width", GST_TYPE_INT_RANGE, 64, 8192, - "height", GST_TYPE_INT_RANGE, 64, 8192, NULL); - gst_caps_set_simple (src_caps, - "width", GST_TYPE_INT_RANGE, 64, 8192, - "height", GST_TYPE_INT_RANGE, 64, 8192, NULL); - - GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); - -#define CHECK_DEVICE_CAPS(codec_obj,api,val) \ - if (SUCCEEDED((codec_obj)->IsSupported(&(api)))) {\ - device_caps.val = TRUE; \ - } - - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonRateControlMode, rc_mode); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncCommonMaxBitRate, max_bitrate); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncCommonQualityVsSpeed, quality_vs_speed); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncMPVGOPSize, gop_size); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncNumWorkerThreads, threads); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVEncVideoContentType, content_type); - CHECK_DEVICE_CAPS (codec_api, - CODECAPI_AVEncVideoForceKeyFrame, force_keyframe); - CHECK_DEVICE_CAPS (codec_api, CODECAPI_AVLowLatencyMode, low_latency); - - gst_mf_vp9_enc_register (plugin, rank, device_name, - &device_caps, enum_flags, 
device_index, sink_caps, src_caps); - -done: - g_free (device_name); -} - -void -gst_mf_vp9_enc_plugin_init (GstPlugin * plugin, guint rank) -{ - GstMFTransformEnumParams enum_params = { 0, }; - MFT_REGISTER_TYPE_INFO output_type; - GstMFTransform *transform; - gint i; - gboolean do_next; + GUID subtype = MFVideoFormat_VP90; GST_DEBUG_CATEGORY_INIT (gst_mf_vp9_enc_debug, "mfvp9enc", 0, "mfvp9enc"); - output_type.guidMajorType = MFMediaType_Video; - output_type.guidSubtype = MFVideoFormat_VP90; - - enum_params.category = MFT_CATEGORY_VIDEO_ENCODER; - enum_params.enum_flags = (MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_ASYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - enum_params.output_typeinfo = &output_type; - - /* register hardware encoders first */ - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_vp9_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); - - /* register software encoders */ - enum_params.enum_flags = (MFT_ENUM_FLAG_SYNCMFT | - MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_SORTANDFILTER_APPROVED_ONLY); - i = 0; - do { - enum_params.device_index = i++; - transform = gst_mf_transform_new (&enum_params); - do_next = TRUE; - - if (!transform) { - do_next = FALSE; - } else { - gst_mf_vp9_enc_plugin_init_internal (plugin, rank, transform, - enum_params.device_index, enum_params.enum_flags); - gst_clear_object (&transform); - } - } while (do_next); + gst_mf_video_enc_register (plugin, rank, &subtype, &type_info, d3d11_device); }
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/gstmfvp9enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstmfvp9enc.h
Changed
@@ -25,7 +25,8 @@
 G_BEGIN_DECLS
 
 void gst_mf_vp9_enc_plugin_init (GstPlugin * plugin,
-    guint rank);
+    guint rank,
+    GList * d3d11_device);
 
 G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstwin32devicewatcher.cpp
Added
@@ -0,0 +1,393 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstwin32devicewatcher.h" +#include <dbt.h> + +GST_DEBUG_CATEGORY_STATIC (gst_win32_device_watcher_debug); +#define GST_CAT_DEFAULT gst_win32_device_watcher_debug + +G_LOCK_DEFINE_STATIC (create_lock); + +#define GST_WIN32_HWND_PROP_NAME "gst-win32-device-watcher" + +struct _GstWin32DeviceWatcher +{ + GstObject parent; + + GMutex lock; + GCond cond; + + GThread *thread; + GMainContext *context; + GMainLoop *loop; + + GstWin32DeviceWatcherCallbacks callbacks; + gpointer user_data; + + HDEVNOTIFY device_notify; + HWND hwnd; + DWORD device_type; + GUID class_guid; +}; + +#define gst_win32_device_watcher_parent_class parent_class +G_DEFINE_TYPE (GstWin32DeviceWatcher, gst_win32_device_watcher, + GST_TYPE_OBJECT); + +static void gst_win32_device_watcher_constructed (GObject * object); +static void gst_win32_device_watcher_finalize (GObject * object); + +static gpointer +gst_win32_device_watcher_thread_func (GstWin32DeviceWatcher * self); + +static void +gst_win32_device_watcher_class_init (GstWin32DeviceWatcherClass * klass) +{ + GObjectClass *gobject_class = 
G_OBJECT_CLASS (klass); + + gobject_class->constructed = gst_win32_device_watcher_constructed; + gobject_class->finalize = gst_win32_device_watcher_finalize; + + GST_DEBUG_CATEGORY_INIT (gst_win32_device_watcher_debug, + "win32devicewatcher", 0, "win32devicewatcher"); +} + +static void +gst_win32_device_watcher_init (GstWin32DeviceWatcher * self) +{ + g_mutex_init (&self->lock); + g_cond_init (&self->cond); + self->context = g_main_context_new (); + self->loop = g_main_loop_new (self->context, FALSE); +} + +static void +gst_win32_device_watcher_constructed (GObject * object) +{ + GstWin32DeviceWatcher *self = GST_WIN32_DEVICE_WATCHER (object); + + /* Create a new thread for WIN32 message pumping */ + g_mutex_lock (&self->lock); + self->thread = g_thread_new ("GstWin32DeviceWatcher", + (GThreadFunc) gst_win32_device_watcher_thread_func, self); + while (!g_main_loop_is_running (self->loop)) + g_cond_wait (&self->cond, &self->lock); + g_mutex_unlock (&self->lock); +} + +static void +gst_win32_device_watcher_finalize (GObject * object) +{ + GstWin32DeviceWatcher *self = GST_WIN32_DEVICE_WATCHER (object); + + g_main_loop_quit (self->loop); + if (g_thread_self () != self->thread) { + g_thread_join (self->thread); + g_main_loop_unref (self->loop); + g_main_context_unref (self->context); + } else { + g_warning ("Trying join from self-thread"); + } + + g_mutex_clear (&self->lock); + g_cond_clear (&self->cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static LRESULT CALLBACK +window_proc (HWND hwnd, UINT msg, WPARAM wparam, LPARAM lparam) +{ + GstWin32DeviceWatcher * self; + + switch (msg) { + case WM_CREATE: + self = (GstWin32DeviceWatcher *) + ((LPCREATESTRUCT) lparam)->lpCreateParams; + + GST_DEBUG_OBJECT (self, "WM_CREATE"); + + SetPropA (hwnd, GST_WIN32_HWND_PROP_NAME, self); + break; + case WM_DEVICECHANGE: + { + self = (GstWin32DeviceWatcher *) GetPropA + (hwnd, GST_WIN32_HWND_PROP_NAME); + if (!self) { + GST_WARNING ("Failed to get watcher 
object"); + break; + } + + self->callbacks.device_changed (self, wparam, lparam, self->user_data); + break; + } + default: + break; + } + + return DefWindowProc (hwnd, msg, wparam, lparam); +} + +static HWND +create_hwnd (GstWin32DeviceWatcher * self) +{ + WNDCLASSEXA wc; + ATOM atom = 0; + HINSTANCE hinstance = GetModuleHandle (NULL); + static const gchar *klass_name = "GstWin32DeviceWatcher"; + HWND hwnd; + + G_LOCK (create_lock); + atom = GetClassInfoExA (hinstance, klass_name, &wc); + if (atom == 0) { + GST_LOG_OBJECT (self, "Register window class"); + ZeroMemory (&wc, sizeof (WNDCLASSEX)); + + wc.cbSize = sizeof (WNDCLASSEXA); + wc.lpfnWndProc = window_proc; + wc.hInstance = hinstance; + wc.lpszClassName = klass_name; + atom = RegisterClassExA (&wc); + + if (atom == 0) { + G_UNLOCK (create_lock); + GST_ERROR_OBJECT (self, "Failed to register window class, lastError 0x%x", + (guint) GetLastError ()); + return NULL; + } + } else { + GST_LOG_OBJECT (self, "window class was already registered"); + } + G_UNLOCK (create_lock); + + hwnd = CreateWindowExA (0, klass_name, "", 0, 0, 0, 0, 0, + HWND_MESSAGE, NULL, hinstance, self); + if (!hwnd) { + GST_ERROR_OBJECT (self, "Failed to create window handle, lastError 0x%x", + (guint) GetLastError ()); + return nullptr; + } + + return hwnd; +} + +static gboolean +loop_running_cb (GstWin32DeviceWatcher * self) +{ + g_mutex_lock (&self->lock); + g_cond_signal (&self->cond); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +static gboolean +win32_msg_cb (GIOChannel * source, GIOCondition condition, gpointer data) +{ + MSG msg; + + if (!PeekMessage (&msg, NULL, 0, 0, PM_REMOVE)) + return G_SOURCE_CONTINUE; + + TranslateMessage (&msg); + DispatchMessage (&msg); + + return G_SOURCE_CONTINUE; +} + +static gpointer +gst_win32_device_watcher_thread_func (GstWin32DeviceWatcher * self) +{ + GSource *idle_source; + HWND hwnd = nullptr; + GIOChannel *msg_io_channel = nullptr; + GSource *msg_source = nullptr; + + 
g_main_context_push_thread_default (self->context); + + idle_source = g_idle_source_new (); + g_source_set_callback (idle_source, + (GSourceFunc) loop_running_cb, self, nullptr); + g_source_attach (idle_source, self->context); + g_source_unref (idle_source); + + hwnd = create_hwnd (self); + if (!hwnd) + goto run_loop; + + msg_io_channel = g_io_channel_win32_new_messages ((gsize) hwnd); + msg_source = g_io_create_watch (msg_io_channel, G_IO_IN); + g_source_set_callback (msg_source, + (GSourceFunc) win32_msg_cb, nullptr, nullptr); + g_source_attach (msg_source, self->context); + + self->hwnd = hwnd; + +run_loop: + GST_INFO_OBJECT (self, "Starting loop"); + g_main_loop_run (self->loop); + GST_INFO_OBJECT (self, "Stopped loop"); + + if (self->device_notify) { + UnregisterDeviceNotification (self->device_notify); + self->device_notify = nullptr; + } + + if (msg_source) { + g_source_destroy (msg_source); + g_source_unref (msg_source); + } + + if (msg_io_channel) + g_io_channel_unref (msg_io_channel); + + if (hwnd) + DestroyWindow (hwnd); + + g_main_context_pop_thread_default (self->context); + + return nullptr; +} + +GstWin32DeviceWatcher * +gst_win32_device_watcher_new (DWORD device_type, const GUID * class_guid, + const GstWin32DeviceWatcherCallbacks * callbacks, gpointer user_data) +{ + GstWin32DeviceWatcher *self; + + g_return_val_if_fail (class_guid != nullptr, nullptr); + g_return_val_if_fail (callbacks != nullptr, nullptr); + + self = (GstWin32DeviceWatcher *) + g_object_new (GST_TYPE_WIN32_DEVICE_WATCHER, nullptr); + + if (!self->hwnd) { + gst_object_unref (self); + return nullptr; + } + + self->callbacks = *callbacks; + self->user_data = user_data; + self->device_type = device_type; + self->class_guid = *class_guid; + + gst_object_ref_sink (self); + + return self; +} + +typedef struct +{ + GstWin32DeviceWatcher * self; + + gboolean handled; + gboolean ret; +} DeviceNotificationData; + +static gboolean +register_device_notification (DeviceNotificationData * data) 
+{ + GstWin32DeviceWatcher * self = data->self; + DEV_BROADCAST_DEVICEINTERFACE di = { 0, }; + + if (self->device_notify) + goto out; + + di.dbcc_size = sizeof (di); + di.dbcc_devicetype = self->device_type; + di.dbcc_classguid = self->class_guid; + + self->device_notify = RegisterDeviceNotificationW (self->hwnd, + &di, DEVICE_NOTIFY_WINDOW_HANDLE); + +out: + if (self->device_notify) + data->ret = TRUE; + + g_mutex_lock (&self->lock); + data->handled = TRUE; + g_cond_broadcast (&self->cond); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +gboolean +gst_win32_device_watcher_start (GstWin32DeviceWatcher * watcher) +{ + DeviceNotificationData data; + + g_return_val_if_fail (GST_IS_WIN32_DEVICE_WATCHER (watcher), FALSE); + + data.self = watcher; + data.handled = FALSE; + data.ret = FALSE; + + g_main_context_invoke (watcher->context, + (GSourceFunc) register_device_notification, &data); + g_mutex_lock (&watcher->lock); + while (!data.handled) + g_cond_wait (&watcher->cond, &watcher->lock); + g_mutex_unlock (&watcher->lock); + + return data.ret; +} + +static gboolean +unregister_device_notification (DeviceNotificationData * data) +{ + GstWin32DeviceWatcher * self = data->self; + + if (!self->device_notify) + goto out; + + UnregisterDeviceNotification (self->device_notify); + self->device_notify = nullptr; + +out: + g_mutex_lock (&self->lock); + data->handled = TRUE; + g_cond_broadcast (&self->cond); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +void +gst_win32_device_watcher_stop (GstWin32DeviceWatcher * watcher) +{ + DeviceNotificationData data; + + g_return_if_fail (GST_IS_WIN32_DEVICE_WATCHER (watcher)); + + data.self = watcher; + data.handled = FALSE; + + g_main_context_invoke (watcher->context, + (GSourceFunc) unregister_device_notification, &data); + g_mutex_lock (&watcher->lock); + while (!data.handled) + g_cond_wait (&watcher->cond, &watcher->lock); + g_mutex_unlock (&watcher->lock); +}
gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/gstwin32devicewatcher.h
Added
@@ -0,0 +1,52 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_WIN32_DEVICE_WATCHER_H__ +#define __GST_WIN32_DEVICE_WATCHER_H__ + +#include <gst/gst.h> +#include <windows.h> + +G_BEGIN_DECLS + +#define GST_TYPE_WIN32_DEVICE_WATCHER (gst_win32_device_watcher_get_type()) +G_DECLARE_FINAL_TYPE (GstWin32DeviceWatcher, + gst_win32_device_watcher, GST, WIN32_DEVICE_WATCHER, GstObject); + +typedef struct _GstWin32DeviceWatcherCallbacks +{ + void (*device_changed) (GstWin32DeviceWatcher * watcher, + WPARAM wparam, + LPARAM lparam, + gpointer user_data); + +} GstWin32DeviceWatcherCallbacks; + +GstWin32DeviceWatcher * gst_win32_device_watcher_new (DWORD device_type, + const GUID * class_guid, + const GstWin32DeviceWatcherCallbacks * callbacks, + gpointer user_data); + +gboolean gst_win32_device_watcher_start (GstWin32DeviceWatcher * watcher); + +void gst_win32_device_watcher_stop (GstWin32DeviceWatcher * watcher); + +G_END_DECLS + +#endif /* __GST_WIN32_DEVICE_WATCHER_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/mediacapturewrapper.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/mediacapturewrapper.cpp
Changed
@@ -34,6 +34,7 @@ #include <algorithm> #include <iterator> +/* *INDENT-OFF* */ using namespace ABI::Windows::ApplicationModel::Core; using namespace ABI::Windows::Foundation::Collections; using namespace ABI::Windows::Media::Devices; @@ -1047,19 +1048,11 @@ if (!frame_ref) return S_OK; - hr = frame_ref->get_VideoMediaFrame (&video_frame); - if (!gst_mf_result (hr)) - return hr; - - hr = video_frame->get_SoftwareBitmap (&bitmap); - if (!gst_mf_result (hr) || !bitmap) - return hr; - /* nothing to do if no callback was installed */ if (!user_cb_.frame_arrived) return S_OK; - return user_cb_.frame_arrived (bitmap.Get(), user_data_); + return user_cb_.frame_arrived (frame_ref.Get(), user_data_); } HRESULT @@ -1183,4 +1176,6 @@ const GstWinRTMediaDescription & b) { return gst_mf_source_object_caps_compare (a.caps_, b.caps_) < 0; -} \ No newline at end of file +} + +/* *INDENT-ON* */
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/mediacapturewrapper.h -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/mediacapturewrapper.h
Changed
@@ -112,7 +112,8 @@ typedef struct { - HRESULT (*frame_arrived) (ISoftwareBitmap * bitmap, void * user_data); + HRESULT (*frame_arrived) (IMediaFrameReference * frame, + void * user_data); HRESULT (*failed) (const std::string &error, UINT32 error_code, void * user_data);
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/meson.build
Changed
@@ -12,10 +12,13 @@ 'gstmfaudioenc.cpp', 'gstmfaacenc.cpp', 'gstmfmp3enc.cpp', + 'gstmfvideobuffer.cpp', + 'gstmfplatloader.c', ] mf_desktop_sources = [ 'gstmfsourcereader.cpp', + 'gstwin32devicewatcher.cpp', ] mf_app_sources = [ @@ -36,8 +39,11 @@ winapi_desktop = false winapi_app = false have_capture_engine = false +have_mf_d3d11 = false mf_lib_deps = [] mf_config = configuration_data() +extra_c_args = ['-DCOBJMACROS'] +extra_cpp_args = [] mf_option = get_option('mediafoundation') if host_system != 'windows' or mf_option.disabled() @@ -113,28 +119,52 @@ endif if winapi_app + if not gstwinrt_dep.found() + if mf_option.enabled() + error('The mediafoundation plugin was enabled explicitly, but GstWinRt library is unavailable') + else + subdir_done() + endif + endif + mf_sources += mf_app_sources - mf_lib_deps += [runtimeobject_lib] + mf_lib_deps += [runtimeobject_lib, gstwinrt_dep] endif if winapi_desktop mf_sources += mf_desktop_sources + # We need the d3d11_4.h header for querying "ExtendedNV12SharedTextureSupported" + # Since MFTEnum2 is desktop only, we don't support d3d11 interop for UWP builds + # And because MFTEnum2 is a Windows 10 API, we load the MFTEnum2 symbol + # via g_module_open() so that we keep supporting old OS versions + if gstd3d11_dep.found() and cc.has_header('d3d11_4.h') and cc.has_header('d3d10.h') + have_mf_d3d11 = true + mf_lib_deps += [gstd3d11_dep] + extra_c_args += ['-DGST_USE_UNSTABLE_API'] + extra_cpp_args += ['-DGST_USE_UNSTABLE_API'] + message ('Enable D3D11 interop for MediaFoundation plugin') + endif endif mf_config.set10('GST_MF_WINAPI_APP', winapi_app) mf_config.set10('GST_MF_WINAPI_DESKTOP', winapi_desktop) +mf_config.set10('GST_MF_HAVE_D3D11', have_mf_d3d11) configure_file( output: 'gstmfconfig.h', configuration: mf_config, ) +# Workaround for a Windows SDK header issue +# https://docs.microsoft.com/en-us/cpp/build/reference/permissive-standards-conformance?view=msvc-160#windows-header-issues +extra_cpp_args += 
cxx.get_supported_arguments(['/Zc:twoPhase-']) + gstmediafoundation = library('gstmediafoundation', mf_sources, - c_args : gst_plugins_bad_args + ['-DCOBJMACROS'], - cpp_args : gst_plugins_bad_args, + c_args : gst_plugins_bad_args + extra_c_args, + cpp_args : gst_plugins_bad_args + extra_cpp_args, include_directories : [configinc], - dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep, gstpbutils_dep] + mf_lib_deps, + dependencies : [gstbase_dep, gstvideo_dep, gstaudio_dep, gstpbutils_dep, gmodule_dep] + mf_lib_deps, install : true, install_dir : plugins_install_dir, )
gst-plugins-bad-1.18.6.tar.xz/sys/mediafoundation/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/mediafoundation/plugin.c
Changed
@@ -18,6 +18,41 @@ * Boston, MA 02110-1301, USA. */ +/** + * SECTION:plugin-mediafoundation + * + * Microsoft MediaFoundation plugin. + * + * This plugin consists of various hardware/software video encoders, + * software audio encoders, and video capture (from webcam) elements. + * + * The GstMediaFoundation plugin supports H.264/AVC, H.265/HEVC and VP9 codecs for + * hardware-accelerated encoding. + * + * However, depending on the hardware it runs on, some elements might not be + * registered in case the underlying hardware doesn't support the feature. + * + * Moreover, depending on the hardware vendor's MediaFoundation implementation, + * a secondary GPU may not be usable. In that case, users can fall back to vendor- + * specific plugins, for example the Intel Media SDK and NVCODEC plugins. + * + * For a system with multiple MediaFoundation-compatible hardware devices (i.e., GPUs), + * there can be multiple plugin features having the same role. + * Also, an additional software video encoder element will be registered + * if the system meets the requirements. + * + * The naming rule for a non-primary encoder is `mf{codec}device{index}enc`, + * where `index` is an arbitrary index number of the hardware, starting from 1. 
+ * + * To get a list of all available elements, user can run + * ```sh + * gst-inspect-1.0.exe mediafoundation + * ``` + * + * Since: 1.18 + * + */ + #ifdef HAVE_CONFIG_H #include "config.h" #endif @@ -27,6 +62,7 @@ #include <winapifamily.h> #include <gst/gst.h> +#include <gst/video/video.h> #include "gstmfvideosrc.h" #include "gstmfdevice.h" #include "gstmfutils.h" @@ -36,10 +72,17 @@ #include "gstmfaacenc.h" #include "gstmfmp3enc.h" +#if GST_MF_HAVE_D3D11 +#include <gst/d3d11/gstd3d11.h> +#include <gstmfplatloader.h> +#endif + GST_DEBUG_CATEGORY (gst_mf_debug); GST_DEBUG_CATEGORY (gst_mf_utils_debug); GST_DEBUG_CATEGORY (gst_mf_source_object_debug); GST_DEBUG_CATEGORY (gst_mf_transform_debug); +GST_DEBUG_CATEGORY (gst_mf_video_buffer_debug); +GST_DEBUG_CATEGORY (gst_mf_video_enc_debug); #define GST_CAT_DEFAULT gst_mf_debug @@ -49,17 +92,103 @@ MFShutdown (); } +#if GST_MF_HAVE_D3D11 +static GList * +get_d3d11_devices (void) +{ + GList *ret = NULL; + guint i; + HRESULT hr; + IMFVideoSampleAllocatorEx *allocator = NULL; + + /* Check whether we can use IMFVideoSampleAllocatorEx interface */ + hr = GstMFCreateVideoSampleAllocatorEx (&IID_IMFVideoSampleAllocatorEx, + &allocator); + if (!gst_mf_result (hr)) { + GST_DEBUG ("IMFVideoSampleAllocatorEx interface is unavailable"); + return NULL; + } else { + IMFVideoSampleAllocatorEx_Release (allocator); + } + + /* AMD seems supporting up to 12 cards, and 8 for NVIDIA */ + for (i = 0; i < 12; i++) { + GstD3D11Device *device; + gboolean is_hardware = FALSE; + const GstD3D11Format *d3d11_format; + ID3D11Device *device_handle; + D3D11_FEATURE_DATA_D3D11_OPTIONS4 options = { 0, }; + UINT supported = 0; + + device = gst_d3d11_device_new (i, D3D11_CREATE_DEVICE_VIDEO_SUPPORT); + + if (!device) + break; + + g_object_get (device, "hardware", &is_hardware, NULL); + + if (!is_hardware) { + GST_DEBUG_OBJECT (device, "Given d3d11 device is not for hardware"); + gst_object_unref (device); + continue; + } + + /* device can support NV12 
format? */ + d3d11_format = + gst_d3d11_device_format_from_gst (device, GST_VIDEO_FORMAT_NV12); + if (!d3d11_format || d3d11_format->dxgi_format != DXGI_FORMAT_NV12) { + GST_DEBUG_OBJECT (device, + "Given d3d11 device cannot support NV12 format"); + gst_object_unref (device); + continue; + } + + /* device can support ExtendedNV12SharedTextureSupported? + * + * NOTE: we will make use of a per-encoder-object d3d11 device without + * sharing it in a pipeline, because MF needs D3D11_CREATE_DEVICE_VIDEO_SUPPORT + * but the flag isn't used for our other use cases. + * So we need the texture sharing feature so that we can copy a d3d11 texture into + * the MF-specific texture pool without downloading the texture */ + + device_handle = gst_d3d11_device_get_device_handle (device); + hr = ID3D11Device_CheckFeatureSupport (device_handle, + D3D11_FEATURE_D3D11_OPTIONS4, &options, sizeof (options)); + if (!gst_d3d11_result (hr, device) || + !options.ExtendedNV12SharedTextureSupported) { + GST_DEBUG_OBJECT (device, + "Given d3d11 device cannot support NV12 format for shared texture"); + gst_object_unref (device); + continue; + } + + /* can we bind NV12 texture for encoder? 
*/ + hr = ID3D11Device_CheckFormatSupport (device_handle, + DXGI_FORMAT_NV12, &supported); + + if (!gst_d3d11_result (hr, device)) { + GST_DEBUG_OBJECT (device, "Couldn't query format support"); + gst_object_unref (device); + continue; + } else if ((supported & D3D11_FORMAT_SUPPORT_VIDEO_ENCODER) == 0) { + GST_DEBUG_OBJECT (device, "We cannot bind NV12 format for encoding"); + gst_object_unref (device); + continue; + } + + ret = g_list_append (ret, device); + } + + return ret; +} +#endif + + static gboolean plugin_init (GstPlugin * plugin) { HRESULT hr; GstRank rank = GST_RANK_SECONDARY; - - /** - * plugin-mediafoundation: - * - * Since: 1.18 - */ + GList *device_list = NULL; GST_DEBUG_CATEGORY_INIT (gst_mf_debug, "mf", 0, "media foundation"); GST_DEBUG_CATEGORY_INIT (gst_mf_utils_debug, @@ -68,6 +197,10 @@ "mfsourceobject", 0, "mfsourceobject"); GST_DEBUG_CATEGORY_INIT (gst_mf_transform_debug, "mftransform", 0, "mftransform"); + GST_DEBUG_CATEGORY_INIT (gst_mf_video_buffer_debug, + "mfvideobuffer", 0, "mfvideobuffer"); + GST_DEBUG_CATEGORY_INIT (gst_mf_video_enc_debug, + "mfvideoenc", 0, "mfvideoenc"); hr = MFStartup (MF_VERSION, MFSTARTUP_NOSOCKET); if (!gst_mf_result (hr)) { @@ -80,13 +213,25 @@ rank = GST_RANK_PRIMARY + 1; #endif + /* FIXME: In order to create an MFT for a specific GPU, the MFTEnum2() API is + * required, but it's desktop only. + * So the resulting MFT and D3D11 device might not be compatible in a multi-GPU + * environment on UWP. 
*/ +#if GST_MF_HAVE_D3D11 + if (gst_mf_plat_load_library ()) + device_list = get_d3d11_devices (); +#endif + gst_element_register (plugin, "mfvideosrc", rank, GST_TYPE_MF_VIDEO_SRC); gst_device_provider_register (plugin, "mfdeviceprovider", rank, GST_TYPE_MF_DEVICE_PROVIDER); - gst_mf_h264_enc_plugin_init (plugin, GST_RANK_SECONDARY); - gst_mf_h265_enc_plugin_init (plugin, GST_RANK_SECONDARY); - gst_mf_vp9_enc_plugin_init (plugin, GST_RANK_SECONDARY); + gst_mf_h264_enc_plugin_init (plugin, GST_RANK_SECONDARY, device_list); + gst_mf_h265_enc_plugin_init (plugin, GST_RANK_SECONDARY, device_list); + gst_mf_vp9_enc_plugin_init (plugin, GST_RANK_SECONDARY, device_list); + + if (device_list) + g_list_free_full (device_list, gst_object_unref); gst_mf_aac_enc_plugin_init (plugin, GST_RANK_SECONDARY); gst_mf_mp3_enc_plugin_init (plugin, GST_RANK_SECONDARY);
gst-plugins-bad-1.18.6.tar.xz/sys/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/meson.build
Changed
@@ -1,5 +1,6 @@ subdir('androidmedia') subdir('applemedia') +subdir('asio') subdir('bluez') subdir('d3d11') subdir('d3dvideosink')
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdk.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdk.c
Changed
@@ -30,6 +30,13 @@ * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ +/** + * SECTION: plugin-msdk + * + * Since: 1.12 + * + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif @@ -52,6 +59,9 @@ #ifdef USE_MSDK_VP9_DEC #include "gstmsdkvp9dec.h" #endif +#ifdef USE_MSDK_AV1_DEC +#include "gstmsdkav1dec.h" +#endif #include "gstmsdkvpp.h" GST_DEBUG_CATEGORY (gst_msdk_debug); @@ -70,6 +80,32 @@ GST_DEBUG_CATEGORY (gst_msdkvc1dec_debug); GST_DEBUG_CATEGORY (gst_msdkvp9dec_debug); GST_DEBUG_CATEGORY (gst_msdkvp9enc_debug); +GST_DEBUG_CATEGORY (gst_msdkav1dec_debug); + +static void +plugin_add_dependencies (GstPlugin * plugin) +{ +#ifndef _WIN32 + const gchar *env_vars[] = { "LIBVA_DRIVER_NAME", NULL }; + const gchar *kernel_paths[] = { "/dev/dri", NULL }; + const gchar *kernel_names[] = { "card", "render", NULL }; + + /* features get updated upon changes in /dev/dri/card* */ + gst_plugin_add_dependency (plugin, NULL, kernel_paths, kernel_names, + GST_PLUGIN_DEPENDENCY_FLAG_FILE_NAME_IS_PREFIX); + + /* features get updated upon changes in VA environment variables */ + gst_plugin_add_dependency (plugin, env_vars, NULL, NULL, + GST_PLUGIN_DEPENDENCY_FLAG_NONE); + + /* features get updated upon changes in default VA drivers + * directory */ + gst_plugin_add_dependency_simple (plugin, "LIBVA_DRIVERS_PATH", + VA_DRIVERS_PATH, "_drv_video.so", + GST_PLUGIN_DEPENDENCY_FLAG_FILE_NAME_IS_SUFFIX | + GST_PLUGIN_DEPENDENCY_FLAG_PATHS_ARE_DEFAULT_ONLY); +#endif +} static gboolean plugin_init (GstPlugin * plugin) @@ -100,9 +136,12 @@ GST_DEBUG_CATEGORY_INIT (gst_msdkvc1dec_debug, "msdkvc1dec", 0, "msdkvc1dec"); GST_DEBUG_CATEGORY_INIT (gst_msdkvp9dec_debug, "msdkvp9dec", 0, "msdkvp9dec"); GST_DEBUG_CATEGORY_INIT (gst_msdkvp9enc_debug, "msdkvp9enc", 0, "msdkvp9enc"); + GST_DEBUG_CATEGORY_INIT (gst_msdkav1dec_debug, "msdkav1dec", 0, "msdkav1dec"); + + plugin_add_dependencies (plugin); if (!msdk_is_available ()) - return FALSE; + return TRUE; /* return TRUE to avoid getting 
blacklisted */ ret = gst_element_register (plugin, "msdkh264dec", GST_RANK_NONE, GST_TYPE_MSDKH264DEC); @@ -141,6 +180,10 @@ ret = gst_element_register (plugin, "msdkvp9enc", GST_RANK_NONE, GST_TYPE_MSDKVP9ENC); #endif +#ifdef USE_MSDK_AV1_DEC + ret = gst_element_register (plugin, "msdkav1dec", GST_RANK_NONE, + GST_TYPE_MSDKAV1DEC); +#endif ret = gst_element_register (plugin, "msdkvpp", GST_RANK_NONE, GST_TYPE_MSDKVPP); @@ -150,5 +193,5 @@ GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, msdk, - "Intel Media SDK based elements", + "MFX API (" MFX_API_SDK ") based elements", plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkallocator.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkallocator.h
Changed
@@ -46,7 +46,7 @@ #ifndef _WIN32 VASurfaceID *surface; VAImage image; - VABufferInfo info; + VADRMPRIMESurfaceDescriptor desc; #else /* TODO: This is just to avoid compile errors on Windows. * Implement handling Windows-specific video-memory.
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkallocator_libva.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkallocator_libva.c
Changed
@@ -32,6 +32,7 @@ #include <va/va.h> #include <va/va_drmcommon.h> +#include <unistd.h> #include "gstmsdkallocator.h" #include "gstmsdkallocator_libva.h" #include "msdk_libva.h" @@ -150,6 +151,12 @@ format = VA_RT_FORMAT_YUV444_12; #endif +#if (MFX_VERSION >= 2004) + if (format == VA_RT_FORMAT_YUV444 && (va_fourcc == VA_FOURCC_RGBP + || va_fourcc == VA_FOURCC_BGRP)) + format = VA_RT_FORMAT_RGBP; +#endif + va_status = vaCreateSurfaces (gst_msdk_context_get_handle (context), format, req->Info.Width, req->Info.Height, surfaces, surfaces_num, attribs, @@ -164,37 +171,37 @@ for (i = 0; i < surfaces_num; i++) { /* Get dmabuf handle if MFX_MEMTYPE_EXPORT_FRAME */ if (req->Type & MFX_MEMTYPE_EXPORT_FRAME) { - msdk_mids[i].info.mem_type = VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME; + VADRMPRIMESurfaceDescriptor va_desc = { 0 }; + uint32_t export_flags = VA_EXPORT_SURFACE_SEPARATE_LAYERS | + VA_EXPORT_SURFACE_READ_WRITE; + va_status = - vaDeriveImage (gst_msdk_context_get_handle (context), surfaces[i], - &msdk_mids[i].image); + vaExportSurfaceHandle (gst_msdk_context_get_handle (context), + surfaces[i], VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2, export_flags, + &va_desc); + status = gst_msdk_get_mfx_status_from_va_status (va_status); if (MFX_ERR_NONE != status) { - GST_ERROR ("failed to derive image"); + GST_ERROR ("Failed to export surface"); return status; } - va_status = - vaAcquireBufferHandle (gst_msdk_context_get_handle (context), - msdk_mids[i].image.buf, &msdk_mids[i].info); - status = gst_msdk_get_mfx_status_from_va_status (va_status); + g_assert (va_desc.num_objects); - if (MFX_ERR_NONE != status) { - GST_ERROR ("failed to get dmabuf handle"); - va_status = vaDestroyImage (gst_msdk_context_get_handle (context), - msdk_mids[i].image.image_id); - if (va_status == VA_STATUS_SUCCESS) { - msdk_mids[i].image.image_id = VA_INVALID_ID; - msdk_mids[i].image.buf = VA_INVALID_ID; - } + /* This plugin supports single object only */ + if (va_desc.num_objects > 1) { + GST_ERROR ("Can not 
support multiple objects"); + return MFX_ERR_UNSUPPORTED; } - } else { - /* useful to check the image mapping state later */ - msdk_mids[i].image.image_id = VA_INVALID_ID; - msdk_mids[i].image.buf = VA_INVALID_ID; + + msdk_mids[i].desc = va_desc; } + /* Don't use image for DMABuf */ + msdk_mids[i].image.image_id = VA_INVALID_ID; + msdk_mids[i].image.buf = VA_INVALID_ID; + msdk_mids[i].surface = &surfaces[i]; mids[i] = (mfxMemId *) & msdk_mids[i]; } @@ -225,6 +232,11 @@ surfaces[i] = coded_buf; msdk_mids[i].surface = &surfaces[i]; msdk_mids[i].fourcc = fourcc; + + /* Don't use image for P208 */ + msdk_mids[i].image.image_id = VA_INVALID_ID; + msdk_mids[i].image.buf = VA_INVALID_ID; + mids[i] = (mfxMemId *) & msdk_mids[i]; } } @@ -271,9 +283,12 @@ for (i = 0; i < resp->NumFrameActual; i++) { GstMsdkMemoryID *mem = resp->mids[i]; - /* Release dmabuf handle if used */ - if (mem->info.mem_type == VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME) - vaReleaseBufferHandle (dpy, mem->image.buf); + /* Release prime fd if used */ + if (mem->desc.num_objects) { + g_assert (mem->desc.num_objects == 1); + close (mem->desc.objects[0].fd); + mem->desc.num_objects = 0; + } if (mem->image.image_id != VA_INVALID_ID && vaDestroyImage (dpy, mem->image.image_id) == VA_STATUS_SUCCESS) { @@ -316,7 +331,7 @@ va_surface = mem_id->surface; dpy = gst_msdk_context_get_handle (context); - if (mem_id->info.mem_type == VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME) { + if (mem_id->desc.num_objects) { GST_WARNING ("Couldn't map the buffer since dmabuf is already in use"); return MFX_ERR_LOCK_MEMORY; } @@ -428,6 +443,21 @@ data->A = data->R + 3; break; +#if (MFX_VERSION >= 2004) + case VA_FOURCC_RGBP: + data->Pitch = mem_id->image.pitches[0]; + data->R = buf + mem_id->image.offsets[0]; + data->G = buf + mem_id->image.offsets[1]; + data->B = buf + mem_id->image.offsets[2]; + break; + case VA_FOURCC_BGRP: + data->Pitch = mem_id->image.pitches[0]; + data->B = buf + mem_id->image.offsets[0]; + data->G = buf + 
mem_id->image.offsets[1]; + data->R = buf + mem_id->image.offsets[2]; + break; +#endif + default: g_assert_not_reached (); break; @@ -456,6 +486,8 @@ mem_id = (GstMsdkMemoryID *) mid; dpy = gst_msdk_context_get_handle (context); + g_assert (mem_id->desc.num_objects == 0); + if (mem_id->fourcc != MFX_FOURCC_P8) { vaUnmapBuffer (dpy, mem_id->image.buf); va_status = vaDestroyImage (dpy, mem_id->image.image_id); @@ -510,10 +542,13 @@ g_return_val_if_fail (surface, FALSE); mem_id = (GstMsdkMemoryID *) surface->Data.MemId; + + g_assert (mem_id->desc.num_objects == 1); + if (handle) - *handle = mem_id->info.handle; + *handle = mem_id->desc.objects[0].fd; if (size) - *size = mem_id->info.mem_size; + *size = mem_id->desc.objects[0].size; return TRUE; } @@ -606,6 +641,16 @@ va_fourcc = VA_FOURCC_Y416; break; #endif +#if (MFX_VERSION >= 2004) + case GST_VIDEO_FORMAT_RGBP: + va_chroma = VA_RT_FORMAT_RGBP; + va_fourcc = VA_FOURCC_RGBP; + break; + case GST_VIDEO_FORMAT_BGRP: + va_chroma = VA_RT_FORMAT_RGBP; + va_fourcc = VA_FOURCC_BGRP; + break; +#endif default: goto error_unsupported_format; }
gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkav1dec.c
Added
@@ -0,0 +1,131 @@ +/* GStreamer Intel MSDK plugin + * Copyright (c) 2020, Intel Corporation + * All rights reserved. + * + * Redistribution and use in source and binary forms, with or without + * modification, are permitted provided that the following conditions are met: + * + * 1. Redistributions of source code must retain the above copyright notice, + * this list of conditions and the following disclaimer. + * + * 2. Redistributions in binary form must reproduce the above copyright notice, + * this list of conditions and the following disclaimer in the documentation + * and/or other materials provided with the distribution. + * + * 3. Neither the name of the copyright holder nor the names of its contributors + * may be used to endorse or promote products derived from this software + * without specific prior written permission. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" + * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, + * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR + * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR + * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, + * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, + * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; + * OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, + * WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE + * OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, + * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + */ + + /** + * SECTION: element-msdkav1dec + * @title: msdkav1dec + * @short_description: Intel MSDK AV1 decoder + * + * AV1 video decoder based on Intel MFX + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.ivf ! ivfparse ! msdkav1dec ! 
glimagesink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstmsdkav1dec.h" +#include "gstmsdkvideomemory.h" + +GST_DEBUG_CATEGORY_EXTERN (gst_msdkav1dec_debug); +#define GST_CAT_DEFAULT gst_msdkav1dec_debug + +#define COMMON_FORMAT "{ NV12, P010_10LE, VUYA, Y410 }" + +static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-av1") + ); + +static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_MSDK_CAPS_STR (COMMON_FORMAT, COMMON_FORMAT)) + ); + +#define gst_msdkav1dec_parent_class parent_class +G_DEFINE_TYPE (GstMsdkAV1Dec, gst_msdkav1dec, GST_TYPE_MSDKDEC); + +static gboolean +gst_msdkav1dec_configure (GstMsdkDec * decoder) +{ + decoder->param.mfx.CodecId = MFX_CODEC_AV1; + /* Replaced with width and height rounded up to 16 */ + decoder->param.mfx.FrameInfo.Width = + GST_ROUND_UP_16 (decoder->param.mfx.FrameInfo.CropW); + decoder->param.mfx.FrameInfo.Height = + GST_ROUND_UP_16 (decoder->param.mfx.FrameInfo.CropH); + + decoder->force_reset_on_res_change = FALSE; + + return TRUE; +} + +static gboolean +gst_msdkav1dec_preinit_decoder (GstMsdkDec * decoder) +{ + decoder->param.mfx.FrameInfo.Width = + GST_ROUND_UP_16 (decoder->param.mfx.FrameInfo.Width); + decoder->param.mfx.FrameInfo.Height = + GST_ROUND_UP_16 (decoder->param.mfx.FrameInfo.Height); + + decoder->param.mfx.FrameInfo.PicStruct = + decoder->param.mfx.FrameInfo.PicStruct ? decoder->param.mfx. 
+ FrameInfo.PicStruct : MFX_PICSTRUCT_PROGRESSIVE; + + return TRUE; +} + +static void +gst_msdkav1dec_class_init (GstMsdkAV1DecClass * klass) +{ + GstElementClass *element_class; + GstMsdkDecClass *decoder_class; + + element_class = GST_ELEMENT_CLASS (klass); + decoder_class = GST_MSDKDEC_CLASS (klass); + + decoder_class->configure = GST_DEBUG_FUNCPTR (gst_msdkav1dec_configure); + decoder_class->preinit_decoder = + GST_DEBUG_FUNCPTR (gst_msdkav1dec_preinit_decoder); + + gst_element_class_set_static_metadata (element_class, + "Intel MSDK AV1 decoder", + "Codec/Decoder/Video/Hardware", + "AV1 video decoder based on " MFX_API_SDK, + "Haihao Xiang <haihao.xiang@intel.com>"); + + gst_element_class_add_static_pad_template (element_class, &sink_factory); + gst_element_class_add_static_pad_template (element_class, &src_factory); +} + +static void +gst_msdkav1dec_init (GstMsdkAV1Dec * thiz) +{ +}
gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkav1dec.h
Added
@@ -0,0 +1,67 @@ +/* GStreamer Intel MSDK plugin + * Copyright (c) 2020, Intel Corporation + * All rights reserved. + * + * Redistribution and use in source and binary forms, with or without + * modification, are permitted provided that the following conditions are met: + * + * 1. Redistributions of source code must retain the above copyright notice, + * this list of conditions and the following disclaimer. + * + * 2. Redistributions in binary form must reproduce the above copyright notice, + * this list of conditions and the following disclaimer in the documentation + * and/or other materials provided with the distribution. + * + * 3. Neither the name of the copyright holder nor the names of its contributors + * may be used to endorse or promote products derived from this software + * without specific prior written permission. + * + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" + * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, + * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR + * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR + * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, + * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, + * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; + * OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, + * WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE + * OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, + * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
+ */ + +#ifndef __GST_MSDKAV1DEC_H__ +#define __GST_MSDKAV1DEC_H__ + +#include "gstmsdkdec.h" + +G_BEGIN_DECLS + +#define GST_TYPE_MSDKAV1DEC \ + (gst_msdkav1dec_get_type()) +#define GST_MSDKAV1DEC(obj) \ + (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_MSDKAV1DEC,GstMsdkAV1Dec)) +#define GST_MSDKAV1DEC_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_MSDKAV1DEC,GstMsdkAV1DecClass)) +#define GST_IS_MSDKAV1DEC(obj) \ + (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_MSDKAV1DEC)) +#define GST_IS_MSDKAV1DEC_CLASS(klass) \ + (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_MSDKAV1DEC)) + +typedef struct _GstMsdkAV1Dec GstMsdkAV1Dec; +typedef struct _GstMsdkAV1DecClass GstMsdkAV1DecClass; + +struct _GstMsdkAV1Dec +{ + GstMsdkDec base; +}; + +struct _GstMsdkAV1DecClass +{ + GstMsdkDecClass parent_class; +}; + +GType gst_msdkav1dec_get_type (void); + +G_END_DECLS + +#endif /* __GST_MSDKAV1DEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkcontext.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkcontext.c
Changed
@@ -34,6 +34,7 @@ #ifndef _WIN32 #include <fcntl.h> #include <unistd.h> +#include <xf86drm.h> #include <va/va_drm.h> #include <gudev/gudev.h> #endif @@ -43,15 +44,15 @@ struct _GstMsdkContextPrivate { - mfxSession session; + MsdkSession session; GList *cached_alloc_responses; gboolean hardware; - gboolean is_joined; gboolean has_frame_allocator; GstMsdkContextJobType job_type; gint shared_async_depth; GMutex mutex; GList *child_session_list; + GstMsdkContext *parent_context; #ifndef _WIN32 gint fd; VADisplay dpy; @@ -76,6 +77,30 @@ const gchar *devnode_path; const gchar *devnode_files[2] = { "renderD[0-9]*", "card[0-9]*" }; int fd = -1, i; + const gchar *user_choice = g_getenv ("GST_MSDK_DRM_DEVICE"); + + if (user_choice) { + if (g_str_has_prefix (user_choice, "/dev/dri/")) + fd = open (user_choice, O_RDWR | O_CLOEXEC); + + if (fd >= 0) { + drmVersionPtr drm_version = drmGetVersion (fd); + + if (!drm_version || strncmp (drm_version->name, "i915", 4)) { + GST_ERROR ("The specified device isn't an Intel device"); + drmFreeVersion (drm_version); + close (fd); + fd = -1; + } else { + GST_DEBUG ("Opened the specified drm device %s", user_choice); + drmFreeVersion (drm_version); + } + } else { + GST_ERROR ("The specified device isn't a valid drm device"); + } + + return fd; + } client = g_udev_client_new (NULL); if (!client) @@ -137,7 +162,7 @@ fd = get_device_id (); if (fd < 0) { - GST_ERROR ("Couldn't find a drm device node to open"); + GST_WARNING ("Couldn't find a valid drm device node"); return FALSE; } @@ -153,7 +178,7 @@ goto failed; } - status = MFXVideoCORE_SetHandle (priv->session, MFX_HANDLE_VA_DISPLAY, + status = MFXVideoCORE_SetHandle (priv->session.session, MFX_HANDLE_VA_DISPLAY, (mfxHDL) va_dpy); if (status != MFX_ERR_NONE) { GST_ERROR ("Setting VAAPI handle failed (%s)", @@ -180,12 +205,15 @@ { mfxU16 codename; GstMsdkContextPrivate *priv = context->priv; + MsdkSession msdk_session; priv->job_type = job_type; priv->hardware = hardware; - priv->session = + 
+ msdk_session = msdk_open_session (hardware ? MFX_IMPL_HARDWARE_ANY : MFX_IMPL_SOFTWARE); - if (!priv->session) + priv->session = msdk_session; + if (!priv->session.session) goto failed; #ifndef _WIN32 @@ -197,7 +225,7 @@ } #endif - codename = msdk_get_platform_codename (priv->session); + codename = msdk_get_platform_codename (priv->session.session); if (codename != MFX_PLATFORM_UNKNOWN) GST_INFO ("Detected MFX platform with device code %d", codename); @@ -229,7 +257,7 @@ status = MFXDisjoinSession (_session); if (status != MFX_ERR_NONE) GST_WARNING ("failed to disjoin (%s)", msdk_status_to_string (status)); - msdk_close_session (_session); + msdk_close_mfx_session (_session); } static void @@ -239,12 +267,13 @@ GstMsdkContextPrivate *priv = context->priv; /* child sessions will be closed when the parent session is closed */ - if (priv->is_joined) + if (priv->parent_context) { + gst_object_unref (priv->parent_context); goto done; - else + } else g_list_free_full (priv->child_session_list, release_child_session); - msdk_close_session (priv->session); + msdk_close_session (&priv->session); g_mutex_clear (&priv->mutex); #ifndef _WIN32 @@ -287,44 +316,97 @@ GstMsdkContext *obj = g_object_new (GST_TYPE_MSDK_CONTEXT, NULL); GstMsdkContextPrivate *priv = obj->priv; GstMsdkContextPrivate *parent_priv = parent->priv; + mfxVersion version; + mfxIMPL impl; + MsdkSession child_msdk_session; + mfxHandleType handle_type = 0; + mfxHDL handle = NULL; + + status = MFXQueryIMPL (parent_priv->session.session, &impl); + + if (status == MFX_ERR_NONE) + status = MFXQueryVersion (parent_priv->session.session, &version); - status = MFXCloneSession (parent_priv->session, &priv->session); if (status != MFX_ERR_NONE) { - GST_ERROR ("Failed to clone mfx session"); + GST_ERROR ("Failed to query the session attributes (%s)", + msdk_status_to_string (status)); g_object_unref (obj); return NULL; } - priv->is_joined = TRUE; - priv->hardware = parent_priv->hardware; - priv->job_type = 
parent_priv->job_type; - parent_priv->child_session_list = - g_list_prepend (parent_priv->child_session_list, priv->session); -#ifndef _WIN32 - priv->dpy = parent_priv->dpy; - priv->fd = parent_priv->fd; + if (MFX_IMPL_VIA_VAAPI == (0x0f00 & (impl))) + handle_type = MFX_HANDLE_VA_DISPLAY; + + if (handle_type) { + status = + MFXVideoCORE_GetHandle (parent_priv->session.session, handle_type, + &handle); + + if (status != MFX_ERR_NONE || !handle) { + GST_ERROR ("Failed to get session handle (%s)", + msdk_status_to_string (status)); + g_object_unref (obj); + return NULL; + } + } - if (priv->hardware) { - status = MFXVideoCORE_SetHandle (priv->session, MFX_HANDLE_VA_DISPLAY, - (mfxHDL) parent_priv->dpy); + child_msdk_session.loader = parent_priv->session.loader; + child_msdk_session.session = NULL; + status = msdk_init_msdk_session (impl, &version, &child_msdk_session); + + if (status != MFX_ERR_NONE) { + GST_ERROR ("Failed to create a child mfx session (%s)", + msdk_status_to_string (status)); + g_object_unref (obj); + return NULL; + } + + if (handle) { + status = + MFXVideoCORE_SetHandle (child_msdk_session.session, handle_type, + handle); if (status != MFX_ERR_NONE) { - GST_ERROR ("Setting VA handle failed (%s)", + GST_ERROR ("Failed to set a HW handle (%s)", msdk_status_to_string (status)); + MFXClose (child_msdk_session.session); g_object_unref (obj); return NULL; } + } +#if (MFX_VERSION >= 1025) + status = + MFXJoinSession (parent_priv->session.session, child_msdk_session.session); + if (status != MFX_ERR_NONE) { + GST_ERROR ("Failed to join two sessions (%s)", + msdk_status_to_string (status)); + MFXClose (child_msdk_session.session); + g_object_unref (obj); + return NULL; } #endif + /* Set loader to NULL for child session */ + priv->session.loader = NULL; + priv->session.session = child_msdk_session.session; + priv->hardware = parent_priv->hardware; + priv->job_type = parent_priv->job_type; + parent_priv->child_session_list = + g_list_prepend 
(parent_priv->child_session_list, priv->session.session); +#ifndef _WIN32 + priv->dpy = parent_priv->dpy; + priv->fd = parent_priv->fd; +#endif + priv->parent_context = gst_object_ref (parent); + return obj; } mfxSession gst_msdk_context_get_session (GstMsdkContext * context) { - return context->priv->session; + return context->priv->session.session; } gpointer @@ -633,7 +715,7 @@ if (!priv->has_frame_allocator) { mfxStatus status; - status = MFXVideoCORE_SetFrameAllocator (priv->session, allocator); + status = MFXVideoCORE_SetFrameAllocator (priv->session.session, allocator); if (status != MFX_ERR_NONE) GST_ERROR ("Failed to set frame allocator");
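The `get_device_id` hunk above adds a `GST_MSDK_DRM_DEVICE` override: the value is honoured only if it points under `/dev/dri/` and `drmGetVersion` reports an `i915` device; otherwise the udev scan is used. A minimal shell sketch of the path check (the DRM driver-name check needs libdrm and is omitted here); `renderD128` is a hypothetical node name:

```shell
# Sketch of the first validation step the patch performs on
# GST_MSDK_DRM_DEVICE: only paths under /dev/dri/ are even opened.
check_msdk_device() {
  case "$1" in
    /dev/dri/*) echo accepted ;;
    *) echo rejected ;;
  esac
}

check_msdk_device /dev/dri/renderD128   # -> accepted
check_msdk_device /tmp/not-a-gpu        # -> rejected

# Typical usage, picking a specific Intel render node:
#   GST_MSDK_DRM_DEVICE=/dev/dri/renderD128 gst-launch-1.0 videotestsrc ! msdkh264enc ! fakesink
```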
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkcontext.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkcontext.h
Changed
@@ -36,6 +36,7 @@ #include "msdk.h" #ifndef _WIN32 #include <va/va.h> +#include <va/va_drmcommon.h> #endif G_BEGIN_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkdec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkdec.c
Changed
@@ -83,6 +83,15 @@ static gboolean gst_msdkdec_flush (GstVideoDecoder * decoder); static gboolean gst_msdkdec_negotiate (GstMsdkDec * thiz, gboolean hard_reset); +void +gst_msdkdec_add_bs_extra_param (GstMsdkDec * thiz, mfxExtBuffer * param) +{ + if (thiz->num_bs_extra_params < MAX_BS_EXTRA_PARAMS) { + thiz->bs_extra_params[thiz->num_bs_extra_params] = param; + thiz->num_bs_extra_params++; + } +} + static GstVideoCodecFrame * gst_msdkdec_get_oldest_frame (GstVideoDecoder * decoder) { @@ -288,6 +297,7 @@ if (reset_param) memset (&thiz->param, 0, sizeof (thiz->param)); + thiz->num_bs_extra_params = 0; thiz->initialized = FALSE; gst_adapter_clear (thiz->adapter); } @@ -374,7 +384,7 @@ FrameInfo.ChromaFormat : MFX_CHROMAFORMAT_YUV420; session = gst_msdk_context_get_session (thiz->context); - /* validate parameters and allow the Media SDK to make adjustments */ + /* validate parameters and allow MFX to make adjustments */ status = MFXVideoDECODE_Query (session, &thiz->param, &thiz->param); if (status < MFX_ERR_NONE) { GST_ERROR_OBJECT (thiz, "Video Decode Query failed (%s)", @@ -649,6 +659,37 @@ task->decode_only = FALSE; } +static void +gst_msdkdec_frame_corruption_report (GstMsdkDec * thiz, mfxU16 corruption) +{ + if (!thiz->report_error || !corruption) + return; + + if (corruption & MFX_CORRUPTION_MINOR) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Corruption] Minor corruption detected!"), (NULL)); + + if (corruption & MFX_CORRUPTION_MAJOR) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Corruption] Major corruption detected!"), (NULL)); + + if (corruption & MFX_CORRUPTION_ABSENT_TOP_FIELD) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Corruption] Absent top field!"), (NULL)); + + if (corruption & MFX_CORRUPTION_ABSENT_BOTTOM_FIELD) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Corruption] Absent bottom field!"), (NULL)); + + if (corruption & MFX_CORRUPTION_REFERENCE_FRAME) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Corruption] Corrupted 
reference frame!"), (NULL)); + + if (corruption & MFX_CORRUPTION_REFERENCE_LIST) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Corruption] Corrupted reference list!"), (NULL)); +} + static GstFlowReturn gst_msdkdec_finish_task (GstMsdkDec * thiz, MsdkDecTask * task) { @@ -671,6 +712,8 @@ surface = task->surface; if (surface) { + gst_msdkdec_frame_corruption_report (thiz, + surface->surface->Data.Corrupted); GST_DEBUG_OBJECT (thiz, "Decoded MFX TimeStamp: %" G_GUINT64_FORMAT, (guint64) surface->surface->Data.TimeStamp); pts = surface->surface->Data.TimeStamp; @@ -974,6 +1017,33 @@ return TRUE; } +static void +gst_msdkdec_error_report (GstMsdkDec * thiz) +{ + if (!thiz->report_error) + return; + +#if (MFX_VERSION >= 1025) + else { + if (thiz->error_report.ErrorTypes & MFX_ERROR_SPS) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Error] SPS Error detected!"), (NULL)); + + if (thiz->error_report.ErrorTypes & MFX_ERROR_PPS) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Error] PPS Error detected!"), (NULL)); + + if (thiz->error_report.ErrorTypes & MFX_ERROR_SLICEHEADER) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Error] SliceHeader Error detected!"), (NULL)); + + if (thiz->error_report.ErrorTypes & MFX_ERROR_FRAME_GAP) + GST_ELEMENT_WARNING (thiz, STREAM, DECODE, + ("[Error] Frame Gap Error detected!"), (NULL)); + } +#endif +} + static GstFlowReturn gst_msdkdec_handle_frame (GstVideoDecoder * decoder, GstVideoCodecFrame * frame) { @@ -1034,6 +1104,12 @@ memset (&bitstream, 0, sizeof (bitstream)); + /* Add extended buffers */ + if (thiz->num_bs_extra_params) { + bitstream.NumExtParam = thiz->num_bs_extra_params; + bitstream.ExtParam = thiz->bs_extra_params; + } + if (gst_video_decoder_get_packetized (decoder)) { /* Packetized stream: we prefer to have a parser as a connected upstream * element to the decoder */ @@ -1077,8 +1153,15 @@ * and this information can't be retrieved from the negotiated caps. 
* So instead of introducing a codecparser dependency to parse the headers * inside msdk plugin, we simply use the mfx APIs to extract header information */ +#if (MFX_VERSION >= 1025) + if (thiz->report_error) + thiz->error_report.ErrorTypes = 0; +#endif + status = MFXVideoDECODE_DecodeHeader (session, &bitstream, &thiz->param); GST_DEBUG_OBJECT (decoder, "DecodeHeader => %d", status); + gst_msdkdec_error_report (thiz); + if (status == MFX_ERR_MORE_DATA) { flow = GST_FLOW_OK; goto done; @@ -1163,6 +1246,10 @@ } } } +#if (MFX_VERSION >= 1025) + if (thiz->report_error) + thiz->error_report.ErrorTypes = 0; +#endif status = MFXVideoDECODE_DecodeFrameAsync (session, &bitstream, surface->surface, @@ -1174,6 +1261,7 @@ } GST_DEBUG_OBJECT (decoder, "DecodeFrameAsync => %d", status); + gst_msdkdec_error_report (thiz); /* media-sdk requires complete reset since the surface is inadequate * for further decoding */ @@ -1183,8 +1271,14 @@ * suitable for the current frame. Call MFXVideoDECODE_DecodeHeader to get * the current frame size, then do memory re-allocation, otherwise * MFXVideoDECODE_DecodeFrameAsync will still fail on next call */ +#if (MFX_VERSION >= 1025) + if (thiz->report_error) + thiz->error_report.ErrorTypes = 0; +#endif status = MFXVideoDECODE_DecodeHeader (session, &bitstream, &thiz->param); GST_DEBUG_OBJECT (decoder, "DecodeHeader => %d", status); + gst_msdkdec_error_report (thiz); + if (status == MFX_ERR_MORE_DATA) { flow = GST_FLOW_OK; goto done; @@ -1558,6 +1652,10 @@ if (!surface) return GST_FLOW_ERROR; } +#if (MFX_VERSION >= 1025) + if (thiz->report_error) + thiz->error_report.ErrorTypes = 0; +#endif status = MFXVideoDECODE_DecodeFrameAsync (session, NULL, surface->surface, @@ -1568,6 +1666,8 @@ } GST_DEBUG_OBJECT (decoder, "DecodeFrameAsync => %d", status); + gst_msdkdec_error_report (thiz); + if (G_LIKELY (status == MFX_ERR_NONE)) { thiz->next_task = (thiz->next_task + 1) % thiz->tasks->len; surface = NULL; @@ -1802,6 +1902,7 @@ thiz->do_renego = TRUE; 
thiz->do_realloc = TRUE; thiz->force_reset_on_res_change = TRUE; + thiz->report_error = FALSE; thiz->adapter = gst_adapter_new (); thiz->input_state = NULL; thiz->pool = NULL;
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkdec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkdec.h
Changed
@@ -54,6 +54,8 @@ #define GST_IS_MSDKDEC_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_MSDKDEC)) +#define MAX_BS_EXTRA_PARAMS 8 + typedef struct _GstMsdkDec GstMsdkDec; typedef struct _GstMsdkDecClass GstMsdkDecClass; typedef struct _MsdkDecTask MsdkDecTask; @@ -97,7 +99,15 @@ /* element properties */ gboolean hardware; + gboolean report_error; guint async_depth; + + mfxExtBuffer *bs_extra_params[MAX_BS_EXTRA_PARAMS]; + guint num_bs_extra_params; + +#if (MFX_VERSION >= 1025) + mfxExtDecodeErrorReport error_report; +#endif }; struct _GstMsdkDecClass @@ -117,6 +127,9 @@ GType gst_msdkdec_get_type (void); +void +gst_msdkdec_add_bs_extra_param (GstMsdkDec * thiz, mfxExtBuffer * param); + G_END_DECLS #endif /* __GST_MSDKDEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkdecproputil.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkdecproputil.c
Changed
@@ -42,6 +42,16 @@ G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } +void +gst_msdkdec_prop_install_error_report_property (GObjectClass * gobject_class) +{ + g_object_class_install_property (gobject_class, GST_MSDKDEC_PROP_ERROR_REPORT, + g_param_spec_boolean ("report-error", "report-error", + "Report bitstream error information", + PROP_ERROR_REPORT_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); +} + gboolean gst_msdkdec_prop_check_state (GstState state, GParamSpec * pspec) {
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkdecproputil.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkdecproputil.h
Changed
@@ -37,6 +37,7 @@ G_BEGIN_DECLS #define PROP_OUTPUT_ORDER_DEFAULT GST_MSDKDEC_OUTPUT_ORDER_DISPLAY +#define PROP_ERROR_REPORT_DEFAULT FALSE enum { @@ -44,11 +45,15 @@ GST_MSDKDEC_PROP_HARDWARE, GST_MSDKDEC_PROP_ASYNC_DEPTH, GST_MSDKDEC_PROP_OUTPUT_ORDER, + GST_MSDKDEC_PROP_ERROR_REPORT, }; void gst_msdkdec_prop_install_output_oder_property(GObjectClass * gobject_class); +void +gst_msdkdec_prop_install_error_report_property (GObjectClass * gobject_class); + gboolean gst_msdkdec_prop_check_state(GstState state, GParamSpec * pspec);
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkenc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkenc.c
Changed
@@ -49,6 +49,7 @@ #include "gstmsdkvideomemory.h" #include "gstmsdksystemmemory.h" #include "gstmsdkcontextutil.h" +#include "mfxjpeg.h" #ifndef _WIN32 #include "gstmsdkallocator_libva.h" @@ -105,6 +106,10 @@ #define PROP_ADAPTIVE_I_DEFAULT MFX_CODINGOPTION_OFF #define PROP_ADAPTIVE_B_DEFAULT MFX_CODINGOPTION_OFF +/* External coding properties */ +#define EC_PROPS_STRUCT_NAME "props" +#define EC_PROPS_EXTBRC "extbrc" + #define gst_msdkenc_parent_class parent_class G_DEFINE_TYPE (GstMsdkEnc, gst_msdkenc, GST_TYPE_VIDEO_ENCODER); @@ -213,16 +218,138 @@ } } +static gint16 +coding_option_get_value (const gchar * key, const gchar * nickname) +{ + if (!g_strcmp0 (nickname, "on")) { + return MFX_CODINGOPTION_ON; + } else if (!g_strcmp0 (nickname, "off")) { + return MFX_CODINGOPTION_OFF; + } else if (!g_strcmp0 (nickname, "auto")) { + return MFX_CODINGOPTION_UNKNOWN; + } + + GST_ERROR ("\"%s\" illegal option \"%s\", set to \"off\"", key, nickname); + + return MFX_CODINGOPTION_OFF; +} + +static gboolean +structure_transform (const GstStructure * src, GstStructure * dst) +{ + guint len; + GValue dst_value = G_VALUE_INIT; + gboolean ret = TRUE; + + g_return_val_if_fail (src != NULL, FALSE); + g_return_val_if_fail (dst != NULL, FALSE); + + len = gst_structure_n_fields (src); + + for (guint i = 0; i < len; i++) { + const gchar *key = gst_structure_nth_field_name (src, i); + const GValue *src_value = gst_structure_get_value (src, key); + + if (!gst_structure_has_field (dst, key)) { + GST_ERROR ("structure \"%s\" does not support \"%s\"", + gst_structure_get_name (dst), key); + ret = FALSE; + continue; + } + + g_value_init (&dst_value, gst_structure_get_field_type (dst, key)); + + if (g_value_transform (src_value, &dst_value)) { + gst_structure_set_value (dst, key, &dst_value); + } else { + GST_ERROR ("\"%s\" transform %s to %s failed", key, + G_VALUE_TYPE_NAME (src_value), G_VALUE_TYPE_NAME (&dst_value)); + ret = FALSE; + } + + g_value_unset (&dst_value); + } + + return ret; +} 
+ +/* Supported types: gchar*, gboolean, gint, guint, gfloat, gdouble */ +static gboolean +structure_get_value (const GstStructure * s, const gchar * key, gpointer value) +{ + const GValue *gvalue = gst_structure_get_value (s, key); + if (!gvalue) { + GST_ERROR ("structure \"%s\" does not support \"%s\"", + gst_structure_get_name (s), key); + return FALSE; + } + + switch (G_VALUE_TYPE (gvalue)) { + case G_TYPE_STRING:{ + const gchar **val = (const gchar **) value; + *val = g_value_get_string (gvalue); + break; + } + case G_TYPE_BOOLEAN:{ + gboolean *val = (gboolean *) value; + *val = g_value_get_boolean (gvalue); + break; + } + case G_TYPE_INT:{ + gint *val = (gint *) value; + *val = g_value_get_int (gvalue); + break; + } + case G_TYPE_UINT:{ + guint *val = (guint *) value; + *val = g_value_get_uint (gvalue); + break; + } + case G_TYPE_FLOAT:{ + gfloat *val = (gfloat *) value; + *val = g_value_get_float (gvalue); + break; + } + case G_TYPE_DOUBLE:{ + gdouble *val = (gdouble *) value; + *val = g_value_get_double (gvalue); + break; + } + default: + GST_ERROR ("\"%s\" unsupported type %s", key, G_VALUE_TYPE_NAME (gvalue)); + return FALSE; + } + + return TRUE; +} + +static gboolean +ext_coding_props_get_value (GstMsdkEnc * thiz, + const gchar * key, gpointer value) +{ + gboolean ret; + if (!(ret = structure_get_value (thiz->ext_coding_props, key, value))) { + GST_ERROR_OBJECT (thiz, "structure \"%s\" failed to get value for \"%s\"", + gst_structure_get_name (thiz->ext_coding_props), key); + } + + return ret; +} + void gst_msdkenc_ensure_extended_coding_options (GstMsdkEnc * thiz) { mfxExtCodingOption2 *option2 = &thiz->option2; mfxExtCodingOption3 *option3 = &thiz->option3; + gchar *extbrc; + ext_coding_props_get_value (thiz, EC_PROPS_EXTBRC, &extbrc); + /* Fill ExtendedCodingOption2, set non-zero defaults too */ option2->Header.BufferId = MFX_EXTBUFF_CODING_OPTION2; option2->Header.BufferSz = sizeof (thiz->option2); option2->MBBRC = thiz->mbbrc; + option2->ExtBRC = 
coding_option_get_value (EC_PROPS_EXTBRC, extbrc); option2->AdaptiveI = thiz->adaptive_i; option2->AdaptiveB = thiz->adaptive_b; option2->BitrateLimit = MFX_CODINGOPTION_OFF; @@ -357,6 +484,7 @@ guint i; gboolean need_vpp = TRUE; GstVideoFormat encoder_input_fmt; + mfxExtVideoSignalInfo ext_vsi; if (thiz->initialized) { GST_DEBUG_OBJECT (thiz, "Already initialized"); @@ -449,7 +577,7 @@ break; } - /* validate parameters and allow the Media SDK to make adjustments */ + /* validate parameters and allow MFX to make adjustments */ status = MFXVideoVPP_Query (session, &thiz->vpp_param, &thiz->vpp_param); if (status < MFX_ERR_NONE) { GST_ERROR_OBJECT (thiz, "Video VPP Query failed (%s)", @@ -567,6 +695,12 @@ thiz->param.mfx.FrameInfo.BitDepthLuma = 8; thiz->param.mfx.FrameInfo.BitDepthChroma = 8; break; + case GST_VIDEO_FORMAT_BGR10A2_LE: + thiz->param.mfx.FrameInfo.FourCC = MFX_FOURCC_A2RGB10; + thiz->param.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV444; + thiz->param.mfx.FrameInfo.BitDepthLuma = 10; + thiz->param.mfx.FrameInfo.BitDepthChroma = 10; + break; case GST_VIDEO_FORMAT_YUY2: thiz->param.mfx.FrameInfo.FourCC = MFX_FOURCC_YUY2; thiz->param.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV422; @@ -601,12 +735,33 @@ goto failed; } + /* If color properties are available from upstream, set it and pass to MediaSDK here. + * MJPEG and VP9 are excluded as MediaSDK does not support to handle video param + * extbuff with buffer id equals to MFX_EXTBUFF_VIDEO_SIGNAL_INFO. 
+ */ + if (thiz->param.mfx.CodecId != MFX_CODEC_JPEG && + thiz->param.mfx.CodecId != MFX_CODEC_VP9 && + (info->colorimetry.primaries || info->colorimetry.transfer + || info->colorimetry.matrix)) { + memset (&ext_vsi, 0, sizeof (ext_vsi)); + ext_vsi.Header.BufferId = MFX_EXTBUFF_VIDEO_SIGNAL_INFO; + ext_vsi.Header.BufferSz = sizeof (ext_vsi); + ext_vsi.ColourDescriptionPresent = 1; + ext_vsi.ColourPrimaries = + gst_video_color_primaries_to_iso (info->colorimetry.primaries); + ext_vsi.TransferCharacteristics = + gst_video_transfer_function_to_iso (info->colorimetry.transfer); + ext_vsi.MatrixCoefficients = + gst_video_color_matrix_to_iso (info->colorimetry.matrix); + gst_msdkenc_add_extra_param (thiz, (mfxExtBuffer *) & ext_vsi); + } + if (thiz->num_extra_params) { thiz->param.NumExtParam = thiz->num_extra_params; thiz->param.ExtParam = thiz->extra_params; } - /* validate parameters and allow the Media SDK to make adjustments */ + /* validate parameters and allow MFX to make adjustments */ status = MFXVideoENCODE_Query (session, &thiz->param, &thiz->param); if (status < MFX_ERR_NONE) { GST_ERROR_OBJECT (thiz, "Video Encode Query failed (%s)", @@ -1868,6 +2023,8 @@ gst_clear_object (&thiz->msdk_converted_pool); gst_clear_object (&thiz->old_context); + gst_clear_structure (&thiz->ext_coding_props); + G_OBJECT_CLASS (parent_class)->dispose (object); } @@ -1920,6 +2077,8 @@ klass->need_conversion = gst_msdkenc_need_conversion; klass->need_reconfig = gst_msdkenc_need_reconfig; klass->set_extra_params = gst_msdkenc_set_extra_params; + klass->qp_max = 51; + klass->qp_min = 0; gobject_class->dispose = gst_msdkenc_dispose; @@ -1961,6 +2120,9 @@ thiz->mbbrc = PROP_MBBRC_DEFAULT; thiz->adaptive_i = PROP_ADAPTIVE_I_DEFAULT; thiz->adaptive_b = PROP_ADAPTIVE_B_DEFAULT; + + thiz->ext_coding_props = gst_structure_new (EC_PROPS_STRUCT_NAME, + EC_PROPS_EXTBRC, G_TYPE_STRING, "off", NULL); } /* gst_msdkenc_set_common_property: @@ -2057,6 +2219,16 @@ case GST_MSDKENC_PROP_ADAPTIVE_B: 
thiz->adaptive_b = g_value_get_enum (value); break; + case GST_MSDKENC_PROP_EXT_CODING_PROPS: + { + const GstStructure *s = gst_value_get_structure (value); + const gchar *name = gst_structure_get_name (s); + gst_structure_set_name (thiz->ext_coding_props, name); + if (!structure_transform (s, thiz->ext_coding_props)) { + GST_ERROR_OBJECT (thiz, "failed to transform structure"); + } + break; + } default: ret = FALSE; break; @@ -2150,6 +2322,9 @@ case GST_MSDKENC_PROP_ADAPTIVE_B: g_value_set_enum (value, thiz->adaptive_b); break; + case GST_MSDKENC_PROP_EXT_CODING_PROPS: + gst_value_set_structure (value, thiz->ext_coding_props); + break; default: ret = FALSE; break; @@ -2171,6 +2346,8 @@ { GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GParamSpec *obj_properties[GST_MSDKENC_PROP_MAX] = { NULL, }; + guint qp_range_max = klass->qp_max; + guint qp_range_min = klass->qp_min; obj_properties[GST_MSDKENC_PROP_HARDWARE] = g_param_spec_boolean ("hardware", "Hardware", "Enable hardware encoders", @@ -2200,7 +2377,7 @@ obj_properties[GST_MSDKENC_PROP_MAX_FRAME_SIZE] = g_param_spec_uint ("max-frame-size", "Max Frame Size", - "Maximum possible size (in kb) of any compressed frames (0: auto-calculate)", + "Maximum possible size (in kbyte) of any compressed frames (0: auto-calculate)", 0, G_MAXUINT16, PROP_MAX_FRAME_SIZE_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); @@ -2232,17 +2409,20 @@ g_param_spec_uint ("qpi", "QPI", "Constant quantizer for I frames (0 unlimited). 
Also used as " "ICQQuality or QVBRQuality for different RateControl methods", - 0, 51, PROP_QPI_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + qp_range_min, qp_range_max, PROP_QPI_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); obj_properties[GST_MSDKENC_PROP_QPP] = g_param_spec_uint ("qpp", "QPP", "Constant quantizer for P frames (0 unlimited)", - 0, 51, PROP_QPP_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + qp_range_min, qp_range_max, PROP_QPP_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); obj_properties[GST_MSDKENC_PROP_QPB] = g_param_spec_uint ("qpb", "QPB", "Constant quantizer for B frames (0 unlimited)", - 0, 51, PROP_QPB_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + qp_range_min, qp_range_max, PROP_QPB_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); obj_properties[GST_MSDKENC_PROP_GOP_SIZE] = g_param_spec_uint ("gop-size", "GOP Size", "GOP Size", 0, @@ -2292,6 +2472,31 @@ gst_msdkenc_adaptive_b_get_type (), PROP_ADAPTIVE_B_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + /** + * GstMsdkEnc:ext-coding-props + * + * The properties for the external coding. + * + * Supported properties: + * ``` + * extbrc : External bitrate control + * String. Range: { auto, on, off } Default: off + * ``` + * + * Example: + * ``` + * ext-coding-props="props,extbrc=on" + * ``` + * + * Since: 1.20 + * + */ + obj_properties[GST_MSDKENC_PROP_EXT_CODING_PROPS] = + g_param_spec_boxed ("ext-coding-props", "External coding properties", + "The properties for the external coding, refer to the hotdoc for the " + "supported properties", + GST_TYPE_STRUCTURE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS); + g_object_class_install_properties (gobject_class, GST_MSDKENC_PROP_MAX, obj_properties); }
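The `coding_option_get_value` hunk above validates the string nicknames accepted by the new `ext-coding-props` structure: `on`, `off` and `auto` map to MFX coding options, and anything else logs an error and falls back to `off`. A minimal shell sketch of that mapping:

```shell
# Sketch of the nickname validation done by coding_option_get_value():
# only "on", "off" and "auto" are legal; other values fall back to "off".
coding_option() {
  case "$1" in
    on|off|auto) echo "$1" ;;
    *) echo off ;;
  esac
}

coding_option on      # -> on
coding_option bogus   # -> off (with an error logged in the real code)
```

On the element itself the value is passed as a GstStructure, e.g. `msdkh264enc ext-coding-props="props,extbrc=on"` as shown in the in-tree property documentation.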
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkenc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkenc.h
Changed
@@ -83,6 +83,7 @@ GST_MSDKENC_PROP_MBBRC, GST_MSDKENC_PROP_ADAPTIVE_I, GST_MSDKENC_PROP_ADAPTIVE_B, + GST_MSDKENC_PROP_EXT_CODING_PROPS, GST_MSDKENC_PROP_MAX, }; @@ -158,6 +159,8 @@ gint16 adaptive_i; gint16 adaptive_b; + GstStructure *ext_coding_props; + gboolean reconfig; guint16 codename; @@ -183,6 +186,9 @@ /* Allow sub class set extra frame parameters */ void (*set_extra_params) (GstMsdkEnc * encoder, GstVideoCodecFrame * frame); + + guint qp_max; + guint qp_min; }; struct _MsdkEncTask
View file
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkh264dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkh264dec.c
Changed
@@ -29,6 +29,22 @@ * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ + /** + * SECTION: element-msdkh264dec + * @title: msdkh264dec + * @short_description: Intel MSDK H264 decoder + * + * H264 video decoder based on Intel MFX + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.h264 ! h264parse ! msdkh264dec ! glimagesink + * ``` + * + * Since: 1.12 + * + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif @@ -44,7 +60,7 @@ GST_STATIC_CAPS ("video/x-h264, " "width = (int) [ 1, MAX ], height = (int) [ 1, MAX ], " "stream-format = (string) byte-stream , alignment = (string) au , " - "profile = (string) { high, main, baseline, constrained-baseline }") + "profile = (string) { high, progressive-high, constrained-high, main, baseline, constrained-baseline }") ); #define gst_msdkh264dec_parent_class parent_class @@ -61,6 +77,17 @@ * customers still using this for low-latency streaming of non-b-frame * encoded streams */ decoder->param.mfx.DecodedOrder = h264dec->output_order; + +#if (MFX_VERSION >= 1025) + if (decoder->report_error) { + decoder->error_report.Header.BufferId = MFX_EXTBUFF_DECODE_ERROR_REPORT; + decoder->error_report.Header.BufferSz = sizeof (decoder->error_report); + decoder->error_report.ErrorTypes = 0; + gst_msdkdec_add_bs_extra_param (decoder, + (mfxExtBuffer *) & decoder->error_report); + } +#endif + return TRUE; } @@ -69,6 +96,9 @@ const GValue * value, GParamSpec * pspec) { GstMsdkH264Dec *thiz = GST_MSDKH264DEC (object); +#if (MFX_VERSION >= 1025) + GstMsdkDec *dec = GST_MSDKDEC (object); +#endif GstState state; GST_OBJECT_LOCK (thiz); @@ -83,6 +113,11 @@ case GST_MSDKDEC_PROP_OUTPUT_ORDER: thiz->output_order = g_value_get_enum (value); break; +#if (MFX_VERSION >= 1025) + case GST_MSDKDEC_PROP_ERROR_REPORT: + dec->report_error = g_value_get_boolean (value); + break; +#endif default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -96,12 +131,20 @@ GParamSpec * pspec) { GstMsdkH264Dec *thiz = 
GST_MSDKH264DEC (object); +#if (MFX_VERSION >= 1025) + GstMsdkDec *dec = GST_MSDKDEC (object); +#endif GST_OBJECT_LOCK (thiz); switch (prop_id) { case GST_MSDKDEC_PROP_OUTPUT_ORDER: g_value_set_enum (value, thiz->output_order); break; +#if (MFX_VERSION >= 1025) + case GST_MSDKDEC_PROP_ERROR_REPORT: + g_value_set_boolean (value, dec->report_error); + break; +#endif default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -128,11 +171,15 @@ gst_element_class_set_static_metadata (element_class, "Intel MSDK H264 decoder", "Codec/Decoder/Video/Hardware", - "H264 video decoder based on Intel Media SDK", + "H264 video decoder based on " MFX_API_SDK, "Scott D Phillips <scott.d.phillips@intel.com>"); gst_msdkdec_prop_install_output_oder_property (gobject_class); +#if (MFX_VERSION >= 1025) + gst_msdkdec_prop_install_error_report_property (gobject_class); +#endif + gst_element_class_add_static_pad_template (element_class, &sink_factory); }
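The decoder hunks above wire the new `report-error` property into `mfxExtDecodeErrorReport` (available with MFX >= 1.25), emitting element warnings for SPS/PPS/slice-header errors and frame gaps. A hedged usage sketch — `sample.h264` is a placeholder input, and `-m` prints the warning messages the patch emits:

```shell
gst-launch-1.0 -m filesrc location=sample.h264 ! h264parse ! \
    msdkh264dec report-error=true ! fakesink
```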
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkh264enc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkh264enc.c
Changed
@@ -29,6 +29,22 @@ * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ +/** + * SECTION: element-msdkh264enc + * @title: msdkh264enc + * @short_description: Intel MSDK H264 encoder + * + * H264 video encoder based on Intel MFX + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc num-buffers=90 ! msdkh264enc ! h264parse ! filesink location=output.h264 + * ``` + * + * Since: 1.12 + * + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif @@ -54,6 +70,11 @@ PROP_MAX_SLICE_SIZE, PROP_B_PYRAMID, PROP_TUNE_MODE, + PROP_P_PYRAMID, + PROP_MIN_QP, + PROP_MAX_QP, + PROP_INTRA_REFRESH_TYPE, + PROP_DBLK_IDC, }; enum @@ -70,6 +91,11 @@ #define PROP_MAX_SLICE_SIZE_DEFAULT 0 #define PROP_B_PYRAMID_DEFAULT FALSE #define PROP_TUNE_MODE_DEFAULT MFX_CODINGOPTION_UNKNOWN +#define PROP_P_PYRAMID_DEFAULT FALSE +#define PROP_MIN_QP_DEFAULT 0 +#define PROP_MAX_QP_DEFAULT 0 +#define PROP_INTRA_REFRESH_TYPE_DEFAULT MFX_REFRESH_NO +#define PROP_DBLK_IDC_DEFAULT 0 static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, @@ -376,6 +402,13 @@ encoder->option2.Trellis = thiz->trellis ? 
thiz->trellis : MFX_TRELLIS_OFF; encoder->option2.MaxSliceSize = thiz->max_slice_size; + encoder->option2.MinQPI = encoder->option2.MinQPP = encoder->option2.MinQPB = + thiz->min_qp; + encoder->option2.MaxQPI = encoder->option2.MaxQPP = encoder->option2.MaxQPB = + thiz->max_qp; + encoder->option2.IntRefType = thiz->intra_refresh_type; + encoder->option2.DisableDeblockingIdc = thiz->dblk_idc; + if (encoder->rate_control == MFX_RATECONTROL_LA || encoder->rate_control == MFX_RATECONTROL_LA_HRD || encoder->rate_control == MFX_RATECONTROL_LA_ICQ) @@ -388,6 +421,15 @@ encoder->param.mfx.GopRefDist = 0; } + if (thiz->p_pyramid) { + encoder->option3.PRefType = MFX_P_REF_PYRAMID; + /* MFX_P_REF_PYRAMID is available for GopRefDist = 1 */ + encoder->param.mfx.GopRefDist = 1; + /* SDK decides the DPB size for P pyramid */ + encoder->param.mfx.NumRefFrame = 0; + encoder->enable_extopt3 = TRUE; + } + /* Enable Extended coding options */ gst_msdkenc_ensure_extended_coding_options (encoder); @@ -557,6 +599,21 @@ thiz->tune_mode = g_value_get_enum (value); thiz->prop_flag |= GST_MSDK_FLAG_TUNE_MODE; break; + case PROP_P_PYRAMID: + thiz->p_pyramid = g_value_get_boolean (value); + break; + case PROP_MIN_QP: + thiz->min_qp = g_value_get_uint (value); + break; + case PROP_MAX_QP: + thiz->max_qp = g_value_get_uint (value); + break; + case PROP_INTRA_REFRESH_TYPE: + thiz->intra_refresh_type = g_value_get_enum (value); + break; + case PROP_DBLK_IDC: + thiz->dblk_idc = g_value_get_uint (value); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -602,6 +659,21 @@ case PROP_TUNE_MODE: g_value_set_enum (value, thiz->tune_mode); break; + case PROP_P_PYRAMID: + g_value_set_boolean (value, thiz->p_pyramid); + break; + case PROP_MIN_QP: + g_value_set_uint (value, thiz->min_qp); + break; + case PROP_MAX_QP: + g_value_set_uint (value, thiz->max_qp); + break; + case PROP_INTRA_REFRESH_TYPE: + g_value_set_enum (value, thiz->intra_refresh_type); + break; + case 
PROP_DBLK_IDC: + g_value_set_uint (value, thiz->dblk_idc); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -703,9 +775,39 @@ gst_msdkenc_tune_mode_get_type (), PROP_TUNE_MODE_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + g_object_class_install_property (gobject_class, PROP_P_PYRAMID, + g_param_spec_boolean ("p-pyramid", "P-pyramid", + "Enable P-Pyramid Reference structure", FALSE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_MIN_QP, + g_param_spec_uint ("min-qp", "Min QP", + "Minimal quantizer for I/P/B frames", + 0, 51, PROP_MIN_QP_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_MAX_QP, + g_param_spec_uint ("max-qp", "Max QP", + "Maximum quantizer for I/P/B frames", + 0, 51, PROP_MAX_QP_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_INTRA_REFRESH_TYPE, + g_param_spec_enum ("intra-refresh-type", "Intra refresh type", + "Set intra refresh type", + gst_msdkenc_intra_refresh_type_get_type (), + PROP_INTRA_REFRESH_TYPE_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_DBLK_IDC, + g_param_spec_uint ("dblk-idc", "Disable Deblocking Idc", + "Option of disable deblocking idc", + 0, 2, PROP_DBLK_IDC_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + gst_element_class_set_static_metadata (element_class, "Intel MSDK H264 encoder", "Codec/Encoder/Video/Hardware", - "H264 video encoder based on Intel Media SDK", + "H264 video encoder based on " MFX_API_SDK, "Josep Torra <jtorra@oblong.com>"); gst_element_class_add_static_pad_template (element_class, &src_factory); } @@ -721,4 +823,9 @@ thiz->max_slice_size = PROP_MAX_SLICE_SIZE_DEFAULT; thiz->b_pyramid = PROP_B_PYRAMID_DEFAULT; thiz->tune_mode = PROP_TUNE_MODE_DEFAULT; + thiz->p_pyramid = PROP_P_PYRAMID_DEFAULT; + 
thiz->min_qp = PROP_MIN_QP_DEFAULT; + thiz->max_qp = PROP_MAX_QP_DEFAULT; + thiz->intra_refresh_type = PROP_INTRA_REFRESH_TYPE_DEFAULT; + thiz->dblk_idc = PROP_DBLK_IDC_DEFAULT; }
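The new `min-qp`/`max-qp` properties feed `MinQPI/MinQPP/MinQPB` and `MaxQPI/MaxQPP/MaxQPB` with a single value each, and a default of 0 leaves the quantizer unconstrained. A minimal Python sketch of that clamping contract (an illustration of the property semantics, not SDK code):

```python
def clamp_qp(qp, min_qp=0, max_qp=0):
    """Clamp a quantizer the way min-qp/max-qp constrain it (0 = unset)."""
    if min_qp and qp < min_qp:
        qp = min_qp
    if max_qp and qp > max_qp:
        qp = max_qp
    return qp
```

Because the same value is copied into the I-, P- and B-frame fields, the bound applies uniformly to all frame types.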
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkh264enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkh264enc.h
Changed
@@ -71,6 +71,11 @@
   guint b_pyramid;
   gint tune_mode;
   guint prop_flag;
+  guint p_pyramid;
+  guint min_qp;
+  guint max_qp;
+  guint intra_refresh_type;
+  guint dblk_idc;
 
   GstH264NalParser *parser;
   GArray *cc_sei_array;
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkh265dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkh265dec.c
Changed
@@ -29,6 +29,22 @@ * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ + /** + * SECTION:element-msdkh265dec + * @title: msdkh265dec + * @short_description: Intel MSDK H265 decoder + * + * H265 video decoder based on Intel MFX + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.h265 ! h265parse ! msdkh265dec ! glimagesink + * ``` + * + * Since: 1.12 + * + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif @@ -83,6 +99,17 @@ * customers still using this for low-latency streaming of non-b-frame * encoded streams */ decoder->param.mfx.DecodedOrder = h265dec->output_order; + +#if (MFX_VERSION >= 1025) + if (decoder->report_error) { + decoder->error_report.Header.BufferId = MFX_EXTBUFF_DECODE_ERROR_REPORT; + decoder->error_report.Header.BufferSz = sizeof (decoder->error_report); + decoder->error_report.ErrorTypes = 0; + gst_msdkdec_add_bs_extra_param (decoder, + (mfxExtBuffer *) & decoder->error_report); + } +#endif + return TRUE; } @@ -91,6 +118,9 @@ const GValue * value, GParamSpec * pspec) { GstMsdkH265Dec *thiz = GST_MSDKH265DEC (object); +#if (MFX_VERSION >= 1025) + GstMsdkDec *dec = GST_MSDKDEC (object); +#endif GstState state; GST_OBJECT_LOCK (thiz); @@ -105,6 +135,11 @@ case GST_MSDKDEC_PROP_OUTPUT_ORDER: thiz->output_order = g_value_get_enum (value); break; +#if (MFX_VERSION >= 1025) + case GST_MSDKDEC_PROP_ERROR_REPORT: + dec->report_error = g_value_get_boolean (value); + break; +#endif default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -118,12 +153,20 @@ GParamSpec * pspec) { GstMsdkH265Dec *thiz = GST_MSDKH265DEC (object); +#if (MFX_VERSION >= 1025) + GstMsdkDec *dec = GST_MSDKDEC (object); +#endif GST_OBJECT_LOCK (thiz); switch (prop_id) { case GST_MSDKDEC_PROP_OUTPUT_ORDER: g_value_set_enum (value, thiz->output_order); break; +#if (MFX_VERSION >= 1025) + case GST_MSDKDEC_PROP_ERROR_REPORT: + g_value_set_boolean (value, dec->report_error); + break; +#endif default: 
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -150,11 +193,15 @@ gst_element_class_set_static_metadata (element_class, "Intel MSDK H265 decoder", "Codec/Decoder/Video/Hardware", - "H265 video decoder based on Intel Media SDK", + "H265 video decoder based on " MFX_API_SDK, "Scott D Phillips <scott.d.phillips@intel.com>"); gst_msdkdec_prop_install_output_oder_property (gobject_class); +#if (MFX_VERSION >= 1025) + gst_msdkdec_prop_install_error_report_property (gobject_class); +#endif + gst_element_class_add_static_pad_template (element_class, &sink_factory); gst_element_class_add_static_pad_template (element_class, &src_factory); }
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkh265enc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkh265enc.c
Changed
@@ -29,6 +29,22 @@ * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ +/** + * SECTION:element-msdkh265enc + * @title: msdkh265enc + * @short_description: Intel MSDK H265 encoder + * + * H265 video encoder based on Intel MFX + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc num-buffers=90 ! msdkh265enc ! h265parse ! filesink location=output.h265 + * ``` + * + * Since: 1.12 + * + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif @@ -51,6 +67,13 @@ PROP_TILE_COL, PROP_MAX_SLICE_SIZE, PROP_TUNE_MODE, + PROP_TRANSFORM_SKIP, + PROP_B_PYRAMID, + PROP_P_PYRAMID, + PROP_MIN_QP, + PROP_MAX_QP, + PROP_INTRA_REFRESH_TYPE, + PROP_DBLK_IDC, }; enum @@ -64,9 +87,16 @@ #define PROP_TILE_COL_DEFAULT 1 #define PROP_MAX_SLICE_SIZE_DEFAULT 0 #define PROP_TUNE_MODE_DEFAULT MFX_CODINGOPTION_UNKNOWN - -#define RAW_FORMATS "NV12, I420, YV12, YUY2, UYVY, BGRA, P010_10LE, VUYA" -#define PROFILES "main, main-10, main-444" +#define PROP_TRANSFORM_SKIP_DEFAULT MFX_CODINGOPTION_UNKNOWN +#define PROP_B_PYRAMID_DEFAULT FALSE +#define PROP_P_PYRAMID_DEFAULT FALSE +#define PROP_MIN_QP_DEFAULT 0 +#define PROP_MAX_QP_DEFAULT 0 +#define PROP_INTRA_REFRESH_TYPE_DEFAULT MFX_REFRESH_NO +#define PROP_DBLK_IDC_DEFAULT 0 + +#define RAW_FORMATS "NV12, I420, YV12, YUY2, UYVY, BGRA, BGR10A2_LE, P010_10LE, VUYA" +#define PROFILES "main, main-10, main-444, main-still-picture, main-10-still-picture" #define COMMON_FORMAT "{ " RAW_FORMATS " }" #define PRFOLIE_STR "{ " PROFILES " }" @@ -218,13 +248,119 @@ gst_memory_unref (mem); } +static void +gst_msdkh265enc_add_mdcv_sei (GstMsdkEnc * encoder, GstVideoCodecFrame * frame) +{ + GstMsdkH265Enc *thiz = GST_MSDKH265ENC (encoder); + GstVideoMasteringDisplayInfo *mastering_display_info + = encoder->input_state->mastering_display_info; + GstH265SEIMessage sei; + GstH265MasteringDisplayColourVolume *mdcv; + GstMemory *mem = NULL; + guint i = 0; + + memset (&sei, 0, sizeof (GstH265SEIMessage)); + sei.payloadType = 
GST_H265_SEI_MASTERING_DISPLAY_COLOUR_VOLUME; + mdcv = &sei.payload.mastering_display_colour_volume; + + for (i = 0; i < 3; i++) { + mdcv->display_primaries_x[i] = + mastering_display_info->display_primaries[i].x; + mdcv->display_primaries_y[i] = + mastering_display_info->display_primaries[i].y; + } + + mdcv->white_point_x = mastering_display_info->white_point.x; + mdcv->white_point_y = mastering_display_info->white_point.y; + mdcv->max_display_mastering_luminance = + mastering_display_info->max_display_mastering_luminance; + mdcv->min_display_mastering_luminance = + mastering_display_info->min_display_mastering_luminance; + + if (!thiz->cc_sei_array) + thiz->cc_sei_array = g_array_new (FALSE, FALSE, sizeof (GstH265SEIMessage)); + else + g_array_set_size (thiz->cc_sei_array, 0); + + g_array_append_val (thiz->cc_sei_array, sei); + + if (!thiz->cc_sei_array || !thiz->cc_sei_array->len) + return; + + /* layer_id and temporal_id will be updated by parser later */ + mem = gst_h265_create_sei_memory (0, 1, 4, thiz->cc_sei_array); + + if (!mem) { + GST_WARNING_OBJECT (thiz, "Cannot create SEI nal unit"); + return; + } + + GST_DEBUG_OBJECT (thiz, + "Inserting %d mastering display colout volume SEI message(s)", + thiz->cc_sei_array->len); + + gst_msdkh265enc_insert_sei (thiz, frame, mem); + gst_memory_unref (mem); +} + +static void +gst_msdkh265enc_add_cll_sei (GstMsdkEnc * encoder, GstVideoCodecFrame * frame) +{ + GstMsdkH265Enc *thiz = GST_MSDKH265ENC (encoder); + GstVideoContentLightLevel *content_light_level + = encoder->input_state->content_light_level; + GstH265ContentLightLevel *cll; + GstH265SEIMessage sei; + GstMemory *mem = NULL; + + memset (&sei, 0, sizeof (GstH265SEIMessage)); + sei.payloadType = GST_H265_SEI_CONTENT_LIGHT_LEVEL; + cll = &sei.payload.content_light_level; + + cll->max_content_light_level = content_light_level->max_content_light_level; + cll->max_pic_average_light_level = + content_light_level->max_frame_average_light_level; + + if 
(!thiz->cc_sei_array) + thiz->cc_sei_array = g_array_new (FALSE, FALSE, sizeof (GstH265SEIMessage)); + else + g_array_set_size (thiz->cc_sei_array, 0); + + g_array_append_val (thiz->cc_sei_array, sei); + + if (!thiz->cc_sei_array || !thiz->cc_sei_array->len) + return; + + /* layer_id and temporal_id will be updated by parser later */ + mem = gst_h265_create_sei_memory (0, 1, 4, thiz->cc_sei_array); + + if (!mem) { + GST_WARNING_OBJECT (thiz, "Cannot create SEI nal unit"); + return; + } + + GST_DEBUG_OBJECT (thiz, + "Inserting %d content light level SEI message(s)", + thiz->cc_sei_array->len); + + gst_msdkh265enc_insert_sei (thiz, frame, mem); + gst_memory_unref (mem); +} + static GstFlowReturn gst_msdkh265enc_pre_push (GstVideoEncoder * encoder, GstVideoCodecFrame * frame) { GstMsdkH265Enc *thiz = GST_MSDKH265ENC (encoder); + GstMsdkEnc *msdk_encoder = GST_MSDKENC (encoder); gst_msdkh265enc_add_cc (thiz, frame); + if (msdk_encoder->input_state->mastering_display_info) + gst_msdkh265enc_add_mdcv_sei (msdk_encoder, frame); + + if (msdk_encoder->input_state->content_light_level) + gst_msdkh265enc_add_cll_sei (msdk_encoder, frame); + return GST_FLOW_OK; } @@ -294,7 +430,17 @@ if (!strcmp (h265enc->profile_name, "main-10")) encoder->param.mfx.CodecProfile = MFX_PROFILE_HEVC_MAIN10; - else if (!strcmp (h265enc->profile_name, "main-444") || + else if (!strcmp (h265enc->profile_name, "main-still-picture")) + encoder->param.mfx.CodecProfile = MFX_PROFILE_HEVC_MAINSP; + else if (!strcmp (h265enc->profile_name, "main-10-still-picture")) { + encoder->param.mfx.CodecProfile = MFX_PROFILE_HEVC_MAIN10; + h265enc->ext_param.Header.BufferId = MFX_EXTBUFF_HEVC_PARAM; + h265enc->ext_param.Header.BufferSz = sizeof (h265enc->ext_param); + h265enc->ext_param.GeneralConstraintFlags = + MFX_HEVC_CONSTR_REXT_ONE_PICTURE_ONLY; + gst_msdkenc_add_extra_param (encoder, + (mfxExtBuffer *) & h265enc->ext_param); + } else if (!strcmp (h265enc->profile_name, "main-444") || !strcmp 
(h265enc->profile_name, "main-422-10") || !strcmp (h265enc->profile_name, "main-444-10") || !strcmp (h265enc->profile_name, "main-12")) @@ -314,6 +460,7 @@ break; case MFX_FOURCC_AYUV: case MFX_FOURCC_YUY2: + case MFX_FOURCC_A2RGB10: #if (MFX_VERSION >= 1027) case MFX_FOURCC_Y410: case MFX_FOURCC_Y210: @@ -337,6 +484,36 @@ /* Enable Extended coding options */ encoder->option2.MaxSliceSize = h265enc->max_slice_size; + encoder->option2.MinQPI = encoder->option2.MinQPP = encoder->option2.MinQPB = + h265enc->min_qp; + encoder->option2.MaxQPI = encoder->option2.MaxQPP = encoder->option2.MaxQPB = + h265enc->max_qp; + encoder->option2.IntRefType = h265enc->intra_refresh_type; + encoder->option2.DisableDeblockingIdc = h265enc->dblk_idc; + +#if (MFX_VERSION >= 1026) + if (h265enc->transform_skip != MFX_CODINGOPTION_UNKNOWN) { + encoder->option3.TransformSkip = h265enc->transform_skip; + encoder->enable_extopt3 = TRUE; + } +#endif + + if (h265enc->b_pyramid) { + encoder->option2.BRefType = MFX_B_REF_PYRAMID; + /* Don't define Gop structure for B-pyramid, otherwise EncodeInit + * will throw Invalid param error */ + encoder->param.mfx.GopRefDist = 0; + } + + if (h265enc->p_pyramid) { + encoder->option3.PRefType = MFX_P_REF_PYRAMID; + /* MFX_P_REF_PYRAMID is available for GopRefDist = 1 */ + encoder->param.mfx.GopRefDist = 1; + /* SDK decides the DPB size for P pyramid */ + encoder->param.mfx.NumRefFrame = 0; + encoder->enable_extopt3 = TRUE; + } + gst_msdkenc_ensure_extended_coding_options (encoder); if (h265enc->num_tile_rows > 1 || h265enc->num_tile_cols > 1) { @@ -430,6 +607,10 @@ gst_structure_set (structure, "profile", G_TYPE_STRING, "main-422-10", NULL); break; + case MFX_FOURCC_A2RGB10: + gst_structure_set (structure, "profile", G_TYPE_STRING, "main-444-10", + NULL); + break; #if (MFX_VERSION >= 1027) case MFX_FOURCC_Y410: gst_structure_set (structure, "profile", G_TYPE_STRING, "main-444-10", @@ -516,6 +697,34 @@ thiz->prop_flag |= GST_MSDK_FLAG_TUNE_MODE; break; + case 
PROP_TRANSFORM_SKIP: + thiz->transform_skip = g_value_get_enum (value); + break; + + case PROP_B_PYRAMID: + thiz->b_pyramid = g_value_get_boolean (value); + break; + + case PROP_P_PYRAMID: + thiz->p_pyramid = g_value_get_boolean (value); + break; + + case PROP_MIN_QP: + thiz->min_qp = g_value_get_uint (value); + break; + + case PROP_MAX_QP: + thiz->max_qp = g_value_get_uint (value); + break; + + case PROP_INTRA_REFRESH_TYPE: + thiz->intra_refresh_type = g_value_get_enum (value); + break; + + case PROP_DBLK_IDC: + thiz->dblk_idc = g_value_get_uint (value); + break; + default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -556,6 +765,34 @@ g_value_set_enum (value, thiz->tune_mode); break; + case PROP_TRANSFORM_SKIP: + g_value_set_enum (value, thiz->transform_skip); + break; + + case PROP_B_PYRAMID: + g_value_set_boolean (value, thiz->b_pyramid); + break; + + case PROP_P_PYRAMID: + g_value_set_boolean (value, thiz->p_pyramid); + break; + + case PROP_MIN_QP: + g_value_set_uint (value, thiz->min_qp); + break; + + case PROP_MAX_QP: + g_value_set_uint (value, thiz->max_qp); + break; + + case PROP_INTRA_REFRESH_TYPE: + g_value_set_enum (value, thiz->intra_refresh_type); + break; + + case PROP_DBLK_IDC: + g_value_set_uint (value, thiz->dblk_idc); + break; + default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -589,6 +826,7 @@ switch (GST_VIDEO_INFO_FORMAT (info)) { case GST_VIDEO_FORMAT_NV12: + case GST_VIDEO_FORMAT_BGR10A2_LE: case GST_VIDEO_FORMAT_P010_10LE: case GST_VIDEO_FORMAT_VUYA: #if (MFX_VERSION >= 1027) @@ -675,10 +913,51 @@ gst_msdkenc_tune_mode_get_type (), PROP_TUNE_MODE_DEFAULT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + g_object_class_install_property (gobject_class, PROP_TRANSFORM_SKIP, + g_param_spec_enum ("transform-skip", "Transform Skip", + "Transform Skip option", + gst_msdkenc_transform_skip_get_type (), PROP_TRANSFORM_SKIP_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + 
g_object_class_install_property (gobject_class, PROP_B_PYRAMID, + g_param_spec_boolean ("b-pyramid", "B-pyramid", + "Enable B-Pyramid Reference structure", FALSE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_P_PYRAMID, + g_param_spec_boolean ("p-pyramid", "P-pyramid", + "Enable P-Pyramid Reference structure", FALSE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_MIN_QP, + g_param_spec_uint ("min-qp", "Min QP", + "Minimal quantizer for I/P/B frames", + 0, 51, PROP_MIN_QP_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_MAX_QP, + g_param_spec_uint ("max-qp", "Max QP", + "Maximum quantizer for I/P/B frames", + 0, 51, PROP_MAX_QP_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_INTRA_REFRESH_TYPE, + g_param_spec_enum ("intra-refresh-type", "Intra refresh type", + "Set intra refresh type", + gst_msdkenc_intra_refresh_type_get_type (), + PROP_INTRA_REFRESH_TYPE_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + + g_object_class_install_property (gobject_class, PROP_DBLK_IDC, + g_param_spec_uint ("dblk-idc", "Disable Deblocking Idc", + "Option of disable deblocking idc", + 0, 2, PROP_DBLK_IDC_DEFAULT, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); + gst_element_class_set_static_metadata (element_class, "Intel MSDK H265 encoder", "Codec/Encoder/Video/Hardware", - "H265 video encoder based on Intel Media SDK", + "H265 video encoder based on " MFX_API_SDK, "Josep Torra <jtorra@oblong.com>"); gst_element_class_add_static_pad_template (element_class, &sink_factory); @@ -694,5 +973,12 @@ thiz->num_tile_cols = PROP_TILE_COL_DEFAULT; thiz->max_slice_size = PROP_MAX_SLICE_SIZE_DEFAULT; thiz->tune_mode = PROP_TUNE_MODE_DEFAULT; + thiz->transform_skip = PROP_TRANSFORM_SKIP_DEFAULT; + thiz->b_pyramid = PROP_B_PYRAMID_DEFAULT; + 
thiz->p_pyramid = PROP_P_PYRAMID_DEFAULT; + thiz->min_qp = PROP_MIN_QP_DEFAULT; + thiz->max_qp = PROP_MAX_QP_DEFAULT; + thiz->intra_refresh_type = PROP_INTRA_REFRESH_TYPE_DEFAULT; + thiz->dblk_idc = PROP_DBLK_IDC_DEFAULT; msdk_enc->num_extra_frames = 1; }
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkh265enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkh265enc.h
Changed
@@ -62,8 +62,16 @@
   guint max_slice_size;
   gint tune_mode;
   guint prop_flag;
+  gushort transform_skip;
+  guint b_pyramid;
+  guint p_pyramid;
+  guint min_qp;
+  guint max_qp;
+  guint intra_refresh_type;
+  guint dblk_idc;
 
   mfxExtHEVCTiles ext_tiles;
+  mfxExtHEVCParam ext_param;
 
   /* roi[0] for current ROI and roi[1] for previous ROI */
   mfxExtEncoderROI roi[2];
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkmjpegdec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkmjpegdec.c
Changed
@@ -29,6 +29,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION: element-msdkmjpegdec
+ * @title: msdkmjpegdec
+ * @short_description: Intel MSDK MJPEG decoder
+ *
+ * MJPEG video decoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 filesrc location=sample.jpg ! jpegparse ! msdkmjpegdec ! glimagesink
+ * ```
+ *
+ * Since: 1.12
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -109,7 +125,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK MJPEG decoder",
       "Codec/Decoder/Video/Hardware",
-      "MJPEG video decoder based on Intel Media SDK",
+      "MJPEG video decoder based on " MFX_API_SDK,
       "Scott D Phillips <scott.d.phillips@intel.com>");
 
   gst_element_class_add_static_pad_template (element_class, &sink_factory);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkmjpegenc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkmjpegenc.c
Changed
@@ -29,6 +29,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION: element-msdkmjpegenc
+ * @title: msdkmjpegenc
+ * @short_description: Intel MSDK MJPEG encoder
+ *
+ * MJPEG video encoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 videotestsrc num-buffers=1 ! msdkmjpegenc ! jpegparse ! filesink location=output.jpg
+ * ```
+ *
+ * Since: 1.12
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -177,7 +193,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK MJPEG encoder",
       "Codec/Encoder/Video/Hardware",
-      "MJPEG video encoder based on Intel Media SDK",
+      "MJPEG video encoder based on " MFX_API_SDK,
       "Scott D Phillips <scott.d.phillips@intel.com>");
 
   gst_element_class_add_static_pad_template (element_class, &src_factory);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkmpeg2dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkmpeg2dec.c
Changed
@@ -32,6 +32,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION: element-msdkmpeg2dec
+ * @title: msdkmpeg2dec
+ * @short_description: Intel MSDK MPEG2 decoder
+ *
+ * MPEG2 video decoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 filesrc location=sample.mpeg2 ! mpegvideoparse ! msdkmpeg2dec ! glimagesink
+ * ```
+ *
+ * Since: 1.14
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -129,7 +145,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK MPEG2 decoder",
       "Codec/Decoder/Video/Hardware",
-      "MPEG2 video decoder based on Intel Media SDK",
+      "MPEG2 video decoder based on " MFX_API_SDK,
       "Sreerenj Balachandran <sreerenj.balachandran@intel.com>");
 
   gst_msdkdec_prop_install_output_oder_property (gobject_class);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkmpeg2enc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkmpeg2enc.c
Changed
@@ -29,6 +29,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION:element-msdkmpeg2enc
+ * @title: msdkmpeg2enc
+ * @short_description: Intel MSDK MPEG2 encoder
+ *
+ * MPEG2 video encoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 videotestsrc num-buffers=90 ! msdkmpeg2enc ! mpegvideoparse ! filesink location=output.mpg
+ * ```
+ *
+ * Since: 1.12
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -194,7 +210,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK MPEG2 encoder",
       "Codec/Encoder/Video/Hardware",
-      "MPEG2 video encoder based on Intel Media SDK",
+      "MPEG2 video encoder based on " MFX_API_SDK,
       "Josep Torra <jtorra@oblong.com>");
 
   gst_element_class_add_static_pad_template (element_class, &src_factory);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdksystemmemory.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdksystemmemory.c
Changed
@@ -110,6 +110,7 @@
       mem->surface->Data.Pitch = mem->destination_pitches[0];
       break;
     case GST_VIDEO_FORMAT_BGRA:
+    case GST_VIDEO_FORMAT_BGRx:
       mem->surface->Data.B = mem->cached_data[0];
       mem->surface->Data.G = mem->surface->Data.B + 1;
       mem->surface->Data.R = mem->surface->Data.B + 2;
@@ -159,6 +160,20 @@
       mem->surface->Data.A = mem->surface->Data.U + 6;
       mem->surface->Data.Pitch = mem->destination_pitches[0];
       break;
+#if (MFX_VERSION >= 2004)
+    case GST_VIDEO_FORMAT_RGBP:
+      mem->surface->Data.Pitch = mem->destination_pitches[0];
+      mem->surface->Data.R = mem->cached_data[0];
+      mem->surface->Data.G = mem->cached_data[1];
+      mem->surface->Data.B = mem->cached_data[2];
+      break;
+    case GST_VIDEO_FORMAT_BGRP:
+      mem->surface->Data.Pitch = mem->destination_pitches[0];
+      mem->surface->Data.B = mem->cached_data[0];
+      mem->surface->Data.G = mem->cached_data[1];
+      mem->surface->Data.R = mem->cached_data[2];
+      break;
+#endif
 
     default:
       g_assert_not_reached ();
@@ -265,6 +280,14 @@
       return mem->surface->Data.U; /* The first channel is U */
 #endif
 
+#if (MFX_VERSION >= 2004)
+    case MFX_FOURCC_RGBP:
+      return mem->surface->Data.R;
+
+    case MFX_FOURCC_BGRP:
+      return mem->surface->Data.B;
+#endif
+
     default:
       return mem->surface->Data.Y;
   }
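For the new planar RGBP/BGRP formats each component lives in its own plane, so the R/G/B data pointers above are the three cached planes rather than interleaved byte offsets as in the BGRA case. The plane-offset arithmetic can be sketched like this (illustrative helper, assuming tightly packed planes of equal pitch):

```python
def planar_offsets(pitch, height, n_planes=3):
    """Byte offset of each plane when planes are stacked contiguously."""
    return [i * pitch * height for i in range(n_planes)]
```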
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkvc1dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkvc1dec.c
Changed
@@ -31,7 +31,21 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
-/* sample pipeline: gst-launch-1.0 filesrc location=video.wmv ! asfdemux ! vc1parse ! msdkvc1dec ! videoconvert ! xvimagesink */
+/**
+ * SECTION: element-msdkvc1dec
+ * @title: msdkvc1dec
+ * @short_description: Intel MSDK VC1 decoder
+ *
+ * VC1/WMV video decoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 filesrc location=video.wmv ! asfdemux ! vc1parse ! msdkvc1dec ! videoconvert ! xvimagesink
+ * ```
+ *
+ * Since: 1.14
+ *
+ */
 
 #ifdef HAVE_CONFIG_H
 # include <config.h>
@@ -189,7 +203,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK VC1 decoder",
       "Codec/Decoder/Video/Hardware",
-      "VC1/WMV video decoder based on Intel Media SDK",
+      "VC1/WMV video decoder based on " MFX_API_SDK,
       "Sreerenj Balachandran <sreerenj.balachandran@intel.com>");
 
   gst_msdkdec_prop_install_output_oder_property (gobject_class);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkvideomemory.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkvideomemory.c
Changed
@@ -239,6 +239,7 @@
 
   switch (meta->format) {
     case GST_VIDEO_FORMAT_BGRA:
+    case GST_VIDEO_FORMAT_BGRx:
       *data = mem->surface->Data.B + offset;
       break;
@@ -331,6 +332,14 @@
       return mem->surface->Data.U;
 #endif
 
+#if (MFX_VERSION >= 2004)
+    case MFX_FOURCC_RGBP:
+      return mem->surface->Data.R;
+
+    case MFX_FOURCC_BGRP:
+      return mem->surface->Data.B;
+#endif
+
     default:
       return mem->surface->Data.Y;
   }
@@ -499,18 +508,22 @@
   g_return_val_if_fail (GST_IS_MSDK_DMABUF_ALLOCATOR (allocator), NULL);
 
   mem_id = surface->Data.MemId;
-  fd = mem_id->info.handle;
-  size = mem_id->info.mem_size;
 
-  if (fd < 0 || (fd = dup (fd)) < 0) {
+  g_assert (mem_id->desc.num_objects == 1);
+
+  fd = mem_id->desc.objects[0].fd;
+  size = mem_id->desc.objects[0].size;
+
+  if (fd < 0) {
     GST_ERROR ("Failed to get dmabuf handle");
     return NULL;
   }
 
-  mem = gst_dmabuf_allocator_alloc (allocator, fd, size);
+  mem = gst_fd_allocator_alloc (allocator, fd, size,
+      GST_FD_MEMORY_FLAG_DONT_CLOSE);
+
   if (!mem) {
     GST_ERROR ("failed ! dmabuf fd: %d", fd);
-    close (fd);
     return NULL;
   }
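The dmabuf hunk above changes the fd ownership model: instead of `dup()`ing the exported fd and letting the dmabuf memory close its private copy, the fd is now shared and wrapped with `GST_FD_MEMORY_FLAG_DONT_CLOSE`, so the MSDK allocation remains the sole owner. The two models can be contrasted with plain file descriptors (hypothetical wrapper names, for illustration only):

```python
import os

def wrap_dup(fd):
    """Old model: the wrapper dups the fd and owns (closes) its copy."""
    own = os.dup(fd)
    os.close(own)            # wrapper cleanup closes only the duplicate
    return fd                # exporter's fd is untouched

def wrap_dont_close(fd):
    """New model: share the fd; the wrapper must never close it."""
    return fd                # cleanup is a no-op, the exporter closes later
```

Either way the exporter's fd stays valid, but the DONT_CLOSE variant avoids one fd per wrapped buffer.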
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkvp8dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkvp8dec.c
Changed
@@ -30,6 +30,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION:element-msdkvp8dec
+ * @title: msdkvp8dec
+ * @short_description: Intel MSDK VP8 decoder
+ *
+ * VP8 video decoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 filesrc location=sample.webm ! matroskademux ! msdkvp8dec ! glimagesink
+ * ```
+ *
+ * Since: 1.14
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -174,7 +190,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK VP8 decoder",
       "Codec/Decoder/Video/Hardware",
-      "VP8 video decoder based on Intel Media SDK",
+      "VP8 video decoder based on " MFX_API_SDK,
       "Hyunjun Ko <zzoon@igalia.com>");
 
   gst_msdkdec_prop_install_output_oder_property (gobject_class);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkvp9dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkvp9dec.c
Changed
@@ -31,6 +31,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION: element-msdkvp9dec
+ * @title: msdkvp9dec
+ * @short_description: Intel MSDK VP9 decoderr
+ *
+ * VP9 video decoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 filesrc location=sample.webm ! matroskademux ! msdkvp9dec ! glimagesink
+ * ```
+ *
+ * Since: 1.16
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -38,8 +54,6 @@
 #include "gstmsdkvp9dec.h"
 #include "gstmsdkvideomemory.h"
 
-#include <mfxvp9.h>
-
 GST_DEBUG_CATEGORY_EXTERN (gst_msdkvp9dec_debug);
 #define GST_CAT_DEFAULT gst_msdkvp9dec_debug
@@ -171,7 +185,7 @@
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK VP9 decoder",
       "Codec/Decoder/Video/Hardware",
-      "VP9 video decoder based on Intel Media SDK",
+      "VP9 video decoder based on " MFX_API_SDK,
       "Sreerenj Balachandran <sreerenj.balachandran@intel.com>");
 
   gst_msdkdec_prop_install_output_oder_property (gobject_class);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkvp9enc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkvp9enc.c
Changed
@@ -29,6 +29,22 @@
  * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
  */
 
+/**
+ * SECTION: element-msdkvp9enc
+ * @title: msdkvp9enc
+ * @short_description: Intel MSDK VP9 encoder
+ *
+ * VP9 video encoder based on Intel MFX
+ *
+ * ## Example launch line
+ * ```
+ * gst-launch-1.0 videotestsrc num-buffers=90 ! msdkvp9enc ! matroskamux ! filesink location=output.webm
+ * ```
+ *
+ * Since: 1.18
+ *
+ */
+
 #ifdef HAVE_CONFIG_H
 # include <config.h>
 #endif
@@ -256,13 +272,15 @@
   encoder_class->set_format = gst_msdkvp9enc_set_format;
   encoder_class->configure = gst_msdkvp9enc_configure;
   encoder_class->set_src_caps = gst_msdkvp9enc_set_src_caps;
+  encoder_class->qp_max = 255;
+  encoder_class->qp_min = 0;
 
   gst_msdkenc_install_common_properties (encoder_class);
 
   gst_element_class_set_static_metadata (element_class,
       "Intel MSDK VP9 encoder",
       "Codec/Encoder/Video/Hardware",
-      "VP9 video encoder based on Intel Media SDK",
+      "VP9 video encoder based on " MFX_API_SDK,
       "Haihao Xiang <haihao.xiang@intel.com>");
 
   gst_element_class_add_static_pad_template (element_class, &sink_factory);
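The `qp_max = 255` / `qp_min = 0` override reflects that VP9's quantizer index spans 0-255, unlike the 0-51 range the H.264/H.265 `min-qp`/`max-qp` properties advertise. A tiny sketch of per-codec range validation (illustrative helper, codec names are assumptions):

```python
# Quantizer ranges as exposed by the encoder properties above.
QP_RANGE = {"h264": (0, 51), "h265": (0, 51), "vp9": (0, 255)}

def qp_in_range(codec, qp):
    """Return True when qp is a legal value for the given codec."""
    lo, hi = QP_RANGE[codec]
    return lo <= qp <= hi
```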
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/gstmsdkvpp.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/gstmsdkvpp.c
Changed
@@ -31,6 +31,22 @@ * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ +/** + * SECTION: element-msdkvpp + * @title: msdkvpp + * @short_description: MSDK Video Postprocessor + * + * A MediaSDK Video Postprocessing Filter + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! msdkvpp ! glimagesink + * ``` + * + * Since: 1.16 + * + */ + #ifdef HAVE_CONFIG_H # include <config.h> #endif @@ -54,9 +70,15 @@ #endif #endif -#if (MFX_VERSION >= 1032) -#define EXT_SINK_FORMATS ", RGB16, Y410, Y210" -#define EXT_SRC_FORMATS ", YV12, Y410, Y210" +#if (MFX_VERSION >= 2004) +#define EXT_SINK_FORMATS ", RGB16, Y410, Y210, P012_LE, Y212_LE, Y412_LE" +#define EXT_SRC_FORMATS ", YV12, Y410, Y210, RGBP, BGRP, P012_LE, Y212_LE, Y412_LE" +#elif (MFX_VERSION >= 1032) +#define EXT_SINK_FORMATS ", RGB16, Y410, Y210, P012_LE, Y212_LE, Y412_LE" +#define EXT_SRC_FORMATS ", YV12, Y410, Y210, P012_LE, Y212_LE, Y412_LE" +#elif (MFX_VERSION >= 1031) +#define EXT_SINK_FORMATS ", RGB16, Y410, Y210, P012_LE, Y212_LE, Y412_LE" +#define EXT_SRC_FORMATS ", Y410, Y210, P012_LE, Y212_LE, Y412_LE" #elif (MFX_VERSION >= 1028) #define EXT_SINK_FORMATS ", RGB16, Y410, Y210" #define EXT_SRC_FORMATS ", Y410, Y210" @@ -76,9 +98,9 @@ #define SUPPORTED_DMABUF_FORMAT \ "{ NV12, BGRA, YUY2, UYVY, VUYA, P010_10LE" EXT_SINK_FORMATS "}" #define SRC_SYSTEM_FORMAT \ - "{ BGRA, NV12, YUY2, UYVY, VUYA, BGRx, P010_10LE" EXT_FORMATS EXT_SRC_FORMATS "}" + "{ NV12, BGRA, YUY2, UYVY, VUYA, BGRx, P010_10LE" EXT_FORMATS EXT_SRC_FORMATS "}" #define SRC_DMABUF_FORMAT \ - "{ BGRA, YUY2, UYVY, NV12, VUYA, BGRx, P010_10LE" EXT_FORMATS EXT_SRC_FORMATS "}" + "{ NV12, BGRA, YUY2, UYVY, VUYA, BGRx, P010_10LE" EXT_FORMATS EXT_SRC_FORMATS "}" #ifndef _WIN32 #define DMABUF_SINK_CAPS_STR \ @@ -848,9 +870,10 @@ mfxFrameInfo *in_info = NULL; MsdkSurface *in_surface = NULL; MsdkSurface *out_surface = NULL; + GstBuffer *outbuf_new = NULL; gboolean locked_by_others; + gboolean create_new_surface = FALSE; - timestamp = 
GST_BUFFER_TIMESTAMP (inbuf); free_unlocked_msdk_surfaces (thiz); in_surface = get_msdk_surface_from_input_buffer (thiz, inbuf); @@ -864,6 +887,13 @@ } locked_by_others = ! !in_surface->surface->Data.Locked; + /* always convert timestamp of input surface as msdk timestamp */ + if (inbuf->pts == GST_CLOCK_TIME_NONE) + in_surface->surface->Data.TimeStamp = MFX_TIMESTAMP_UNKNOWN; + else + in_surface->surface->Data.TimeStamp = + gst_util_uint64_scale_round (inbuf->pts, 90000, GST_SECOND); + if (gst_msdk_is_msdk_buffer (outbuf)) { out_surface = g_slice_new0 (MsdkSurface); out_surface->surface = gst_msdk_get_surface_from_buffer (outbuf); @@ -893,11 +923,18 @@ status = MFXVideoVPP_RunFrameVPPAsync (session, in_surface->surface, out_surface->surface, NULL, &sync_point); + timestamp = out_surface->surface->Data.TimeStamp; + if (status != MFX_WRN_DEVICE_BUSY) break; /* If device is busy, wait 1ms and retry, as per MSDK's recommendation */ g_usleep (1000); - }; + } + + if (timestamp == MFX_TIMESTAMP_UNKNOWN) + timestamp = GST_CLOCK_TIME_NONE; + else + timestamp = gst_util_uint64_scale_round (timestamp, GST_SECOND, 90000); if (status == MFX_WRN_INCOMPATIBLE_VIDEO_PARAM) GST_WARNING_OBJECT (thiz, "VPP returned: %s", @@ -918,16 +955,29 @@ MFXVideoCORE_SyncOperation (session, sync_point, 300000) != MFX_ERR_NONE) GST_WARNING_OBJECT (thiz, "failed to do sync operation"); + /* push new output buffer forward after sync operation */ + if (create_new_surface) { + create_new_surface = FALSE; + ret = gst_pad_push (GST_BASE_TRANSFORM_SRC_PAD (trans), outbuf_new); + if (ret != GST_FLOW_OK) + goto error_push_buffer; + } /* More than one output buffers are generated */ if (status == MFX_ERR_MORE_SURFACE) { - GST_BUFFER_TIMESTAMP (outbuf) = timestamp; - GST_BUFFER_DURATION (outbuf) = thiz->buffer_duration; - timestamp += thiz->buffer_duration; - ret = gst_pad_push (GST_BASE_TRANSFORM_SRC_PAD (trans), outbuf); - if (ret != GST_FLOW_OK) - goto error_push_buffer; - outbuf = create_output_buffer 
(thiz); + outbuf_new = create_output_buffer (thiz); + GST_BUFFER_TIMESTAMP (outbuf_new) = timestamp; + GST_BUFFER_DURATION (outbuf_new) = thiz->buffer_duration; + + if (gst_msdk_is_msdk_buffer (outbuf_new)) { + release_out_surface (thiz, out_surface); + out_surface = g_slice_new0 (MsdkSurface); + out_surface->surface = gst_msdk_get_surface_from_buffer (outbuf_new); + create_new_surface = TRUE; + } else { + GST_ERROR_OBJECT (thiz, "Failed to get msdk outsurface!"); + goto vpp_error; + } } else { GST_BUFFER_TIMESTAMP (outbuf) = timestamp; GST_BUFFER_DURATION (outbuf) = thiz->buffer_duration; @@ -1156,8 +1206,10 @@ || GST_VIDEO_INFO_FPS_D (&thiz->sinkpad_info) != GST_VIDEO_INFO_FPS_D (&thiz->srcpad_info))) { thiz->flags |= GST_MSDK_FLAG_FRC; - /* So far this is the only algorithm which is working somewhat good */ - thiz->frc_algm = MFX_FRCALGM_PRESERVE_TIMESTAMP; + /* manually set distributed timestamp as frc algorithm + * as it is more reasonable for framerate conversion + */ + thiz->frc_algm = MFX_FRCALGM_DISTRIBUTED_TIMESTAMP; } /* work-around to avoid zero fps in msdk structure */ @@ -1179,7 +1231,7 @@ thiz->param.ExtParam = thiz->extra_params; } - /* validate parameters and allow the Media SDK to make adjustments */ + /* validate parameters and allow MFX to make adjustments */ status = MFXVideoVPP_Query (session, &thiz->param, &thiz->param); if (status < MFX_ERR_NONE) { GST_ERROR_OBJECT (thiz, "Video VPP Query failed (%s)", @@ -1244,7 +1296,8 @@ gboolean srcpad_info_changed = FALSE; gboolean deinterlace; - if (gst_caps_get_features (caps, 0) != gst_caps_get_features (out_caps, 0)) + if (!gst_caps_features_is_equal (gst_caps_get_features (caps, 0), + gst_caps_get_features (out_caps, 0))) thiz->need_vpp = 1; gst_video_info_from_caps (&in_info, caps); @@ -1624,10 +1677,10 @@ &gst_msdkvpp_sink_factory); gst_element_class_set_static_metadata (element_class, - "MSDK Video Postprocessor", + "Intel MSDK Video Postprocessor", 
"Filter/Converter/Video;Filter/Converter/Video/Scaler;" "Filter/Effect/Video;Filter/Effect/Video/Deinterlace", - "A MediaSDK Video Postprocessing Filter", + "Video Postprocessing Filter based on " MFX_API_SDK, "Sreerenj Balachandran <sreerenj.balachandran@intel.com>"); trans_class->start = GST_DEBUG_FUNCPTR (gst_msdkvpp_start);
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/meson.build
Changed
@@ -32,47 +32,73 @@ have_msdk = false msdk_dep = [] +use_msdk = false +use_onevpl = false msdk_option = get_option('msdk') if msdk_option.disabled() subdir_done() endif -mfx_dep = dependency('libmfx', required: false) -if mfx_dep.found() - mfx_incdir = mfx_dep.get_pkgconfig_variable('includedir') - mfx_inc = [] -else - # Old versions of MediaSDK don't provide a pkg-config file - mfx_root = run_command(python3, '-c', 'import os; print(os.environ.get("INTELMEDIASDKROOT", os.environ.get("MFX_HOME", "")))').stdout().strip() - - if mfx_root != '' - mfx_libdir = [mfx_root + '/lib/lin_x64', mfx_root + '/lib/x64', mfx_root + '/lib64', mfx_root + '/lib'] - if host_machine.system() == 'windows' - if host_machine.cpu_family() == 'x86' - mfx_libdir = [mfx_root + '/lib/win32'] - else - mfx_libdir = [mfx_root + '/lib/x64'] +mfx_api = get_option('mfx_api') + +if mfx_api != 'oneVPL' + mfx_dep = dependency('libmfx', version: ['>= 1.0', '<= 1.99'], required: false) + + if mfx_dep.found() + mfx_incdir = mfx_dep.get_variable('includedir') + mfx_inc = [] + use_msdk = true + else + # Old versions of MediaSDK don't provide a pkg-config file + mfx_root = run_command(python3, '-c', 'import os; print(os.environ.get("INTELMEDIASDKROOT", os.environ.get("MFX_HOME", "")))', check: false).stdout().strip() + + if mfx_root != '' + mfx_libdir = [mfx_root + '/lib/lin_x64', mfx_root + '/lib/x64', mfx_root + '/lib64', mfx_root + '/lib'] + if host_machine.system() == 'windows' + if host_machine.cpu_family() == 'x86' + mfx_libdir = [mfx_root + '/lib/win32'] + else + mfx_libdir = [mfx_root + '/lib/x64'] + endif endif + mfx_incdir = join_paths([mfx_root, 'include']) + mfx_lib = cxx.find_library('mfx', dirs: mfx_libdir, required: msdk_option) + mfx_inc = include_directories(mfx_incdir) + mfx_dep = declare_dependency(include_directories: mfx_inc, dependencies: mfx_lib) + use_msdk = true endif - mfx_incdir = join_paths([mfx_root, 'include']) - mfx_lib = cxx.find_library('mfx', dirs: mfx_libdir, required: 
msdk_option) - mfx_inc = include_directories(mfx_incdir) - mfx_dep = declare_dependency(include_directories: mfx_inc, dependencies: mfx_lib) - elif msdk_option.enabled() - error('msdk plugin enabled but Intel Media SDK not found: consider setting PKG_CONFIG_PATH, INTELMEDIASDKROOT or MFX_HOME') + endif +endif + +if not use_msdk and mfx_api != 'MSDK' + mfx_dep = dependency('vpl', version: '>= 2.2', required: false) + + if mfx_dep.found() + mfx_incdir = mfx_dep.get_variable('includedir') + mfx_inc = [] + use_onevpl = true + endif +endif + +if not use_msdk and not use_onevpl + if msdk_option.enabled() + error('msdk plugin enabled but the Intel Media SDK or the oneVPL SDK not found: consider setting PKG_CONFIG_PATH, INTELMEDIASDKROOT or MFX_HOME') else subdir_done() endif endif -# Old versions of MediaSDK don't have the 'mfx' directory prefix -if cxx.has_header('mfx/mfxdefs.h', args: '-I' + mfx_incdir) +# Check oneVPL firstly +if use_onevpl + mfx_incdir = join_paths([mfx_incdir, 'vpl']) + mfx_inc = include_directories(mfx_incdir) +elif cxx.has_header('mfx/mfxdefs.h', args: '-I' + mfx_incdir) mfx_incdir = join_paths([mfx_incdir, 'mfx']) mfx_inc = include_directories(mfx_incdir) endif -if cxx.has_header('mfxvp9.h', args: '-I' + mfx_incdir) +if use_onevpl or cxx.has_header('mfxvp9.h', args: '-I' + mfx_incdir) msdk_sources += [ 'gstmsdkvp9dec.c' ] cdata.set10('USE_MSDK_VP9_DEC', 1) endif @@ -97,6 +123,21 @@ cdata.set10('USE_MSDK_VP9_ENC', 1) endif +mfx_ver134_check_code = ''' +#include <mfxdefs.h> +#if MFX_VERSION < 1034 +#error "The current version of mfx doesn't support AV1 decoding" +#endif +''' + +have_mfx_ver134 = cc.compiles(mfx_ver134_check_code, + include_directories : [configinc, mfx_inc]) + +if have_mfx_ver134 + msdk_sources += [ 'gstmsdkav1dec.c' ] + cdata.set10('USE_MSDK_AV1_DEC', 1) +endif + if host_machine.system() == 'windows' if cc.get_id() != 'msvc' and msdk_option.enabled() error('msdk plugin can only be built with MSVC') @@ -106,14 +147,24 @@ msdk_deps = 
declare_dependency(dependencies: [d3d11_dep, legacy_stdio_dep]) msdk_deps_found = d3d11_dep.found() and legacy_stdio_dep.found() and cc.get_id() == 'msvc' else - libva_dep = dependency('libva-drm', required: get_option('msdk')) + libva_dep = dependency('libva', required: get_option('msdk')) + libva_drm_dep = dependency('libva-drm', required: get_option('msdk')) libdl_dep = cc.find_library('dl', required: get_option('msdk')) libgudev_dep = dependency('gudev-1.0', required: get_option('msdk')) - msdk_deps = declare_dependency(dependencies: [libva_dep, libdl_dep, libgudev_dep]) - msdk_deps_found = libva_dep.found() and libdl_dep.found() and libgudev_dep.found() + libdrm_dep = dependency('libdrm', required: get_option('msdk')) + msdk_deps = declare_dependency(dependencies: [libva_dep, libva_drm_dep, libdl_dep, libgudev_dep, libdrm_dep]) + msdk_deps_found = libva_dep.found() and libva_drm_dep.found() and libdl_dep.found() and libgudev_dep.found() and libdrm_dep.found() endif if msdk_deps_found + if host_machine.system() != 'windows' + driverdir = libva_dep.get_variable('driverdir', default_value: '') + if driverdir == '' + driverdir = join_paths(get_option('prefix'), get_option('libdir'), 'dri') + endif + cdata.set_quoted('VA_DRIVERS_PATH', '@0@'.format(driverdir)) + endif + gstmsdktag = library('gstmsdk', msdk_sources, c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API'],
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/msdk-enums.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/msdk-enums.c
Changed
@@ -195,6 +195,46 @@ return type; } +GType +gst_msdkenc_transform_skip_get_type (void) +{ + static GType type = 0; + + static const GEnumValue values[] = { + {MFX_CODINGOPTION_UNKNOWN, "SDK decides what to do", "auto"}, + {MFX_CODINGOPTION_OFF, + "transform_skip_enabled_flag will be set to 0 in PPS ", "off"}, + {MFX_CODINGOPTION_ON, + "transform_skip_enabled_flag will be set to 1 in PPS ", "on"}, + {0, NULL, NULL} + }; + + if (!type) { + type = g_enum_register_static ("GstMsdkEncTransformSkip", values); + } + return type; +} + +GType +gst_msdkenc_intra_refresh_type_get_type (void) +{ + static GType type = 0; + + static const GEnumValue values[] = { + {MFX_REFRESH_NO, "No (default)", "no"}, + {MFX_REFRESH_VERTICAL, "Vertical", "vertical"}, + {MFX_REFRESH_HORIZONTAL, "Horizontal ", "horizontal"}, + {MFX_REFRESH_SLICE, "Slice ", "slice"}, + {0, NULL, NULL} + }; + + if (!type) { + type = g_enum_register_static ("GstMsdkEncIntraRefreshType", values); + } + + return type; +} + /*========= MSDK VPP Enums =========================*/ #ifndef GST_REMOVE_DEPRECATED
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/msdk-enums.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/msdk-enums.h
Changed
@@ -101,5 +101,11 @@ GType gst_msdkvpp_frc_algorithm_get_type (void); +GType +gst_msdkenc_transform_skip_get_type (void); + +GType +gst_msdkenc_intra_refresh_type_get_type (void); + G_END_DECLS #endif
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/msdk.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/msdk.c
Changed
@@ -75,6 +75,10 @@ /* Y416 is used for 12bit 4:4:4:4 format in MSDK */ GST_VIDEO_INFO_TO_MFX_MAP (Y412_LE, YUV444, Y416), #endif +#if (MFX_VERSION >=2004) + GST_VIDEO_INFO_TO_MFX_MAP (RGBP, YUV444, RGBP), + GST_VIDEO_INFO_TO_MFX_MAP (BGRP, YUV444, BGRP), +#endif {0, 0, 0} }; @@ -123,10 +127,12 @@ return "device operation failure"; case MFX_ERR_MORE_BITSTREAM: return "expect more bitstream buffers at output"; +#if (MFX_VERSION < 2000) case MFX_ERR_INCOMPATIBLE_AUDIO_PARAM: return "incompatible audio parameters"; case MFX_ERR_INVALID_AUDIO_PARAM: return "invalid audio parameters"; +#endif /* warnings >0 */ case MFX_WRN_IN_EXECUTION: return "the previous asynchronous operation is in execution"; @@ -144,8 +150,10 @@ return "the value is out of valid range"; case MFX_WRN_FILTER_SKIPPED: return "one of requested filters has been skipped"; +#if (MFX_VERSION < 2000) case MFX_WRN_INCOMPATIBLE_AUDIO_PARAM: return "incompatible audio parameters"; +#endif default: break; } @@ -170,8 +178,149 @@ return codename; } +#if (MFX_VERSION >= 2000) + +mfxStatus +msdk_init_msdk_session (mfxIMPL impl, mfxVersion * pver, + MsdkSession * msdk_session) +{ + mfxStatus sts = MFX_ERR_NONE; + mfxLoader loader = NULL; + mfxSession session = NULL; + uint32_t impl_idx = 0; + mfxConfig cfg; + mfxVariant impl_value; + + loader = msdk_session->loader; + + if (!loader) { + loader = MFXLoad (); + + GST_INFO ("Use the Intel oneVPL SDK to create MFX session"); + + if (!loader) { + GST_WARNING ("Failed to create a MFX loader"); + return MFX_ERR_UNKNOWN; + } + + /* Create configurations for implementation */ + cfg = MFXCreateConfig (loader); + + if (!cfg) { + GST_ERROR ("Failed to create a MFX configuration"); + MFXUnload (loader); + return MFX_ERR_UNKNOWN; + } + + impl_value.Type = MFX_VARIANT_TYPE_U32; + impl_value.Data.U32 = + (impl == + MFX_IMPL_SOFTWARE) ? 
MFX_IMPL_TYPE_SOFTWARE : MFX_IMPL_TYPE_HARDWARE; + sts = + MFXSetConfigFilterProperty (cfg, + (const mfxU8 *) "mfxImplDescription.Impl", impl_value); + + if (sts != MFX_ERR_NONE) { + GST_ERROR ("Failed to add an additional MFX configuration (%s)", + msdk_status_to_string (sts)); + MFXUnload (loader); + return sts; + } + + impl_value.Type = MFX_VARIANT_TYPE_U32; + impl_value.Data.U32 = pver->Version; + sts = + MFXSetConfigFilterProperty (cfg, + (const mfxU8 *) "mfxImplDescription.ApiVersion.Version", impl_value); + + if (sts != MFX_ERR_NONE) { + GST_ERROR ("Failed to add an additional MFX configuration (%s)", + msdk_status_to_string (sts)); + MFXUnload (loader); + return sts; + } + } + + while (1) { + /* Enumerate all implementations */ + mfxImplDescription *impl_desc; + + sts = MFXEnumImplementations (loader, impl_idx, + MFX_IMPLCAPS_IMPLDESCSTRUCTURE, (mfxHDL *) & impl_desc); + + /* Failed to find an available implementation */ + if (sts == MFX_ERR_NOT_FOUND) + break; + else if (sts != MFX_ERR_NONE) { + impl_idx++; + continue; + } + + sts = MFXCreateSession (loader, impl_idx, &session); + MFXDispReleaseImplDescription (loader, impl_desc); + + if (sts == MFX_ERR_NONE) + break; + + impl_idx++; + } + + if (sts != MFX_ERR_NONE) { + GST_ERROR ("Failed to create a MFX session (%s)", + msdk_status_to_string (sts)); + + if (!msdk_session->loader) + MFXUnload (loader); + + return sts; + } + + msdk_session->session = session; + msdk_session->loader = loader; + + return MFX_ERR_NONE; +} + +#else + +mfxStatus +msdk_init_msdk_session (mfxIMPL impl, mfxVersion * pver, + MsdkSession * msdk_session) +{ + mfxStatus status; + mfxSession session = NULL; + mfxInitParam init_par = { impl, *pver }; + + GST_INFO ("Use the " MFX_API_SDK " to create MFX session"); + +#if (MFX_VERSION >= 1025) + init_par.GPUCopy = 1; +#endif + + status = MFXInitEx (init_par, &session); + + if (status != MFX_ERR_NONE) { + GST_WARNING ("Failed to initialize a MFX session (%s)", + msdk_status_to_string 
(status)); + return status; + } + + msdk_session->session = session; + msdk_session->loader = NULL; + + return MFX_ERR_NONE; +} + void -msdk_close_session (mfxSession session) +MFXUnload (mfxLoader loader) +{ + g_assert (loader == NULL); +} + +#endif + +void +msdk_close_mfx_session (mfxSession session) { mfxStatus status; @@ -183,7 +332,14 @@ GST_ERROR ("Close failed (%s)", msdk_status_to_string (status)); } -mfxSession +void +msdk_close_session (MsdkSession * msdk_session) +{ + msdk_close_mfx_session (msdk_session->session); + MFXUnload (msdk_session->loader); +} + +MsdkSession msdk_open_session (mfxIMPL impl) { mfxSession session = NULL; @@ -191,18 +347,21 @@ }; mfxIMPL implementation; mfxStatus status; + MsdkSession msdk_session; static const gchar *implementation_names[] = { "AUTO", "SOFTWARE", "HARDWARE", "AUTO_ANY", "HARDWARE_ANY", "HARDWARE2", "HARDWARE3", "HARDWARE4", "RUNTIME" }; - status = MFXInit (impl, &version, &session); - if (status != MFX_ERR_NONE) { - GST_ERROR ("Intel Media SDK not available (%s)", - msdk_status_to_string (status)); - goto failed; - } + msdk_session.session = NULL; + msdk_session.loader = NULL; + status = msdk_init_msdk_session (impl, &version, &msdk_session); + + if (status != MFX_ERR_NONE) + return msdk_session; + else + session = msdk_session.session; status = MFXQueryIMPL (session, &implementation); if (status != MFX_ERR_NONE) { @@ -221,11 +380,13 @@ implementation_names[MFX_IMPL_BASETYPE (implementation)]); GST_INFO ("MFX version: %d.%d", version.Major, version.Minor); - return session; + return msdk_session; failed: - msdk_close_session (session); - return NULL; + msdk_close_session (&msdk_session); + msdk_session.session = NULL; + msdk_session.loader = NULL; + return msdk_session; } gboolean @@ -483,6 +644,7 @@ gst_msdk_load_plugin (mfxSession session, const mfxPluginUID * uid, mfxU32 version, const gchar * plugin) { +#if (MFX_VERSION < 2000) mfxStatus status; status = MFXVideoUSER_Load (session, uid, version); @@ -497,6 
+659,7 @@ GST_WARNING ("Media SDK Plugin for %s load warning: %s", plugin, msdk_status_to_string (status)); } +#endif return TRUE; }
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/msdk.h -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/msdk.h
Changed
@@ -43,7 +43,27 @@ #include <gst/allocators/allocators.h> #include <mfxvideo.h> + +#if (MFX_VERSION < 2000) #include <mfxplugin.h> +#else +#include <mfxdispatcher.h> + +#define mfxPluginUID char +static const char MFX_PLUGINID_HEVCD_SW; +static const char MFX_PLUGINID_HEVCD_HW; +static const char MFX_PLUGINID_HEVCE_SW; +static const char MFX_PLUGINID_HEVCE_HW; +static const char MFX_PLUGINID_VP8D_HW; +static const char MFX_PLUGINID_VP9E_HW; +static const char MFX_PLUGINID_VP9D_HW; +#endif + +#if (MFX_VERSION >= 2000) +#define MFX_API_SDK "Intel(R) oneVPL" +#else +#define MFX_API_SDK "Intel(R) Media SDK" +#endif G_BEGIN_DECLS @@ -63,8 +83,23 @@ GST_MSDK_CAPS_MAKE (format) "; " \ GST_MSDK_CAPS_MAKE_WITH_DMABUF_FEATURE (dmaformat) -mfxSession msdk_open_session (mfxIMPL impl); -void msdk_close_session (mfxSession session); +#if (MFX_VERSION < 2000) +typedef void * mfxLoader; + +void MFXUnload (mfxLoader loader); +#endif + +typedef struct _MsdkSession MsdkSession; + +struct _MsdkSession +{ + mfxSession session; + mfxLoader loader; +}; + +MsdkSession msdk_open_session (mfxIMPL impl); +void msdk_close_mfx_session (mfxSession session); +void msdk_close_session (MsdkSession * session); gboolean msdk_is_available (void); @@ -107,6 +142,10 @@ mfxU16 msdk_get_platform_codename (mfxSession session); +mfxStatus +msdk_init_msdk_session (mfxIMPL impl, mfxVersion * pver, + MsdkSession * msdk_session); + G_END_DECLS #endif /* __MSDK_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/msdk/msdk_libva.c -> gst-plugins-bad-1.20.1.tar.xz/sys/msdk/msdk_libva.c
Changed
@@ -79,6 +79,12 @@ FOURCC_MFX_TO_VA (Y410, Y410), #endif FOURCC_MFX_TO_VA (BGR4, ABGR), + +#if (MFX_VERSION >= 2004) + FOURCC_MFX_TO_VA (RGBP, RGBP), + FOURCC_MFX_TO_VA (BGRP, BGRP), +#endif + {0, 0} };
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/cuda-converter.c
Added
@@ -0,0 +1,2085 @@ +/* GStreamer + * Copyright (C) 2010 David Schleef <ds@schleef.org> + * Copyright (C) 2010 Sebastian Dröge <sebastian.droege@collabora.co.uk> + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:cudaconverter + * @title: GstCudaConverter + * @short_description: Generic video conversion using CUDA + * + * This object is used to convert video frames from one format to another. 
+ * The object can perform conversion of: + * + * * video format + * * video colorspace + * * video size + */ + +/** + * TODO: + * * Add more interpolation method and make it selectable, + * currently default bi-linear interpolation only + * * Add fast-path for conversion like videoconvert + * * Full colorimetry and chroma-siting support + * * cropping, and x, y position support + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "cuda-converter.h" +#include "gstcudautils.h" +#include "gstcudaloader.h" +#include "gstcudanvrtc.h" +#include <string.h> + +#define CUDA_BLOCK_X 16 +#define CUDA_BLOCK_Y 16 +#define DIV_UP(size,block) (((size) + ((block) - 1)) / (block)) + +static gboolean cuda_converter_lookup_path (GstCudaConverter * convert); + +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT ensure_debug_category() +static GstDebugCategory * +ensure_debug_category (void) +{ + static gsize cat_gonce = 0; + + if (g_once_init_enter (&cat_gonce)) { + gsize cat_done; + + cat_done = (gsize) _gst_debug_category_new ("cuda-converter", 0, + "cuda-converter object"); + + g_once_init_leave (&cat_gonce, cat_done); + } + + return (GstDebugCategory *) cat_gonce; +} +#else +#define ensure_debug_category() +#endif + +#define GST_CUDA_KERNEL_FUNC "gst_cuda_kernel_func" + +#define GST_CUDA_KERNEL_FUNC_TO_Y444 "gst_cuda_kernel_func_to_y444" + +#define GST_CUDA_KERNEL_FUNC_Y444_TO_YUV "gst_cuda_kernel_func_y444_to_yuv" + +#define GST_CUDA_KERNEL_FUNC_TO_ARGB "gst_cuda_kernel_func_to_argb" + +#define GST_CUDA_KERNEL_FUNC_SCALE_RGB "gst_cuda_kernel_func_scale_rgb" + +/* *INDENT-OFF* */ +/** + * read_chroma: + * @tex1: a CUDA texture object representing a semi-planar chroma plane + * @tex2: dummy object + * @x: the x coordinate to read data from @tex1 + * @y: the y coordinate to read data from @tex1 + * + * Returns: a #ushort2 vector representing both chroma pixel values + */ +static const gchar READ_CHROMA_FROM_SEMI_PLANAR[] = +"__device__ ushort2\n" +"read_chroma 
(cudaTextureObject_t tex1, cudaTextureObject_t tex2, \n" +" float x, float y)\n" +"{\n" +" return tex2D<ushort2>(tex1, x, y);\n" +"}"; + +/** + * read_chroma: + * @tex1: a CUDA texture object representing a chroma planar plane + * @tex2: a CUDA texture object representing the other planar plane + * @x: the x coordinate to read data from @tex1 and @tex2 + * @y: the y coordinate to read data from @tex1 and @tex2 + * + * Returns: a #ushort2 vector representing both chroma pixel values + */ +static const gchar READ_CHROMA_FROM_PLANAR[] = +"__device__ ushort2\n" +"read_chroma (cudaTextureObject_t tex1, cudaTextureObject_t tex2, \n" +" float x, float y)\n" +"{\n" +" unsigned short u, v;\n" +" u = tex2D<unsigned short>(tex1, x, y);\n" +" v = tex2D<unsigned short>(tex2, x, y);\n" +" return make_ushort2(u, v);\n" +"}"; + +/** + * write_chroma: + * @dst1: a CUDA global memory pointing to a semi-planar chroma plane + * @dst2: dummy + * @u: a pixel value to write @dst1 + * @v: a pixel value to write @dst1 + * @x: the x coordinate to write data into @tex1 + * @x: the y coordinate to write data into @tex1 + * @pstride: the pixel stride of @dst1 + * @mask: bitmask to be applied to high bitdepth plane + * + * Write @u and @v pixel value to @dst1 semi-planar plane + */ +static const gchar WRITE_CHROMA_TO_SEMI_PLANAR[] = +"__device__ void\n" +"write_chroma (unsigned char *dst1, unsigned char *dst2, unsigned short u,\n" +" unsigned short v, int x, int y, int pstride, int stride, int mask)\n" +"{\n" +" if (OUT_DEPTH > 8) {\n" +" *(unsigned short *)&dst1[x * pstride + y * stride] = (u & mask);\n" +" *(unsigned short *)&dst1[x * pstride + 2 + y * stride] = (v & mask);\n" +" } else {\n" +" dst1[x * pstride + y * stride] = u;\n" +" dst1[x * pstride + 1 + y * stride] = v;\n" +" }\n" +"}"; + +/** + * write_chroma: + * @dst1: a CUDA global memory pointing to a planar chroma plane + * @dst2: a CUDA global memory pointing to a the other planar chroma plane + * @u: a pixel value to write @dst1 
+ * @v: a pixel value to write @dst1 + * @x: the x coordinate to write data into @tex1 + * @x: the y coordinate to write data into @tex1 + * @pstride: the pixel stride of @dst1 + * @mask: bitmask to be applied to high bitdepth plane + * + * Write @u and @v pixel value into @dst1 and @dst2 planar planes + */ +static const gchar WRITE_CHROMA_TO_PLANAR[] = +"__device__ void\n" +"write_chroma (unsigned char *dst1, unsigned char *dst2, unsigned short u,\n" +" unsigned short v, int x, int y, int pstride, int stride, int mask)\n" +"{\n" +" if (OUT_DEPTH > 8) {\n" +" *(unsigned short *)&dst1[x * pstride + y * stride] = (u & mask);\n" +" *(unsigned short *)&dst2[x * pstride + y * stride] = (v & mask);\n" +" } else {\n" +" dst1[x * pstride + y * stride] = u;\n" +" dst2[x * pstride + y * stride] = v;\n" +" }\n" +"}"; + +/* CUDA kernel source for from YUV to YUV conversion and scale */ +static const gchar templ_YUV_TO_YUV[] = +"extern \"C\"{\n" +"__constant__ float SCALE_H = %f;\n" +"__constant__ float SCALE_V = %f;\n" +"__constant__ float CHROMA_SCALE_H = %f;\n" +"__constant__ float CHROMA_SCALE_V = %f;\n" +"__constant__ int WIDTH = %d;\n" +"__constant__ int HEIGHT = %d;\n" +"__constant__ int CHROMA_WIDTH = %d;\n" +"__constant__ int CHROMA_HEIGHT = %d;\n" +"__constant__ int IN_DEPTH = %d;\n" +"__constant__ int OUT_DEPTH = %d;\n" +"__constant__ int PSTRIDE = %d;\n" +"__constant__ int CHROMA_PSTRIDE = %d;\n" +"__constant__ int IN_SHIFT = %d;\n" +"__constant__ int OUT_SHIFT = %d;\n" +"__constant__ int MASK = %d;\n" +"__constant__ int SWAP_UV = %d;\n" +"\n" +"__device__ unsigned short\n" +"do_scale_pixel (unsigned short val) \n" +"{\n" +" unsigned int diff;\n" +" if (OUT_DEPTH > IN_DEPTH) {\n" +" diff = OUT_DEPTH - IN_DEPTH;\n" +" return (val << diff) | (val >> (IN_DEPTH - diff));\n" +" } else if (IN_DEPTH > OUT_DEPTH) {\n" +" return val >> (IN_DEPTH - OUT_DEPTH);\n" +" }\n" +" return val;\n" +"}\n" +"\n" +/* __device__ ushort2 + * read_chroma (cudaTextureObject_t tex1, 
cudaTextureObject_t tex2, float x, float y); + */ +"%s\n" +"\n" +/* __device__ void + * write_chroma (unsigned char *dst1, unsigned char *dst2, unsigned short u, + * unsigned short v, int x, int y, int pstride, int stride, int mask); + */ +"%s\n" +"\n" +"__global__ void\n" +GST_CUDA_KERNEL_FUNC +"(cudaTextureObject_t tex0, cudaTextureObject_t tex1, cudaTextureObject_t tex2,\n" +" unsigned char *dst0, unsigned char *dst1, unsigned char *dst2,\n" +" int stride)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * blockDim.y + threadIdx.y;\n" +" if (x_pos < WIDTH && y_pos < HEIGHT) {\n" +" float src_xpos = SCALE_H * x_pos;\n" +" float src_ypos = SCALE_V * y_pos;\n" +" unsigned short y = tex2D<unsigned short>(tex0, src_xpos, src_ypos);\n" +" y = y >> IN_SHIFT;\n" +" y = do_scale_pixel (y);\n" +" y = y << OUT_SHIFT;\n" +" if (OUT_DEPTH > 8) {\n" +" *(unsigned short *)&dst0[x_pos * PSTRIDE + y_pos * stride] = (y & MASK);\n" +" } else {\n" +" dst0[x_pos * PSTRIDE + y_pos * stride] = y;\n" +" }\n" +" }\n" +" if (x_pos < CHROMA_WIDTH && y_pos < CHROMA_HEIGHT) {\n" +" float src_xpos = CHROMA_SCALE_H * x_pos;\n" +" float src_ypos = CHROMA_SCALE_V * y_pos;\n" +" unsigned short u, v;\n" +" ushort2 uv = read_chroma (tex1, tex2, src_xpos, src_ypos);\n" +" u = uv.x;\n" +" v = uv.y;\n" +" u = u >> IN_SHIFT;\n" +" v = v >> IN_SHIFT;\n" +" u = do_scale_pixel (u);\n" +" v = do_scale_pixel (v);\n" +" u = u << OUT_SHIFT;\n" +" v = v << OUT_SHIFT;\n" +" if (SWAP_UV) {\n" +" unsigned short tmp = u;\n" +" u = v;\n" +" v = tmp;\n" +" }\n" +" write_chroma (dst1,\n" +" dst2, u, v, x_pos, y_pos, CHROMA_PSTRIDE, stride, MASK);\n" +" }\n" +"}\n" +"\n" +"}"; + +/* CUDA kernel source for from YUV to RGB conversion and scale */ +static const gchar templ_YUV_TO_RGB[] = +"extern \"C\"{\n" +"__constant__ float offset[3] = {%f, %f, %f};\n" +"__constant__ float rcoeff[3] = {%f, %f, %f};\n" +"__constant__ float gcoeff[3] = {%f, %f, %f};\n" +"__constant__ float 
bcoeff[3] = {%f, %f, %f};\n" +"\n" +"__constant__ float SCALE_H = %f;\n" +"__constant__ float SCALE_V = %f;\n" +"__constant__ float CHROMA_SCALE_H = %f;\n" +"__constant__ float CHROMA_SCALE_V = %f;\n" +"__constant__ int WIDTH = %d;\n" +"__constant__ int HEIGHT = %d;\n" +"__constant__ int CHROMA_WIDTH = %d;\n" +"__constant__ int CHROMA_HEIGHT = %d;\n" +"__constant__ int IN_DEPTH = %d;\n" +"__constant__ int OUT_DEPTH = %d;\n" +"__constant__ int PSTRIDE = %d;\n" +"__constant__ int CHROMA_PSTRIDE = %d;\n" +"__constant__ int IN_SHIFT = %d;\n" +"__constant__ int OUT_SHIFT = %d;\n" +"__constant__ int MASK = %d;\n" +"__constant__ int SWAP_UV = %d;\n" +"__constant__ int MAX_IN_VAL = %d;\n" +"__constant__ int R_IDX = %d;\n" +"__constant__ int G_IDX = %d;\n" +"__constant__ int B_IDX = %d;\n" +"__constant__ int A_IDX = %d;\n" +"__constant__ int X_IDX = %d;\n" +"\n" +"__device__ unsigned short\n" +"do_scale_pixel (unsigned short val) \n" +"{\n" +" unsigned int diff;\n" +" if (OUT_DEPTH > IN_DEPTH) {\n" +" diff = OUT_DEPTH - IN_DEPTH;\n" +" return (val << diff) | (val >> (IN_DEPTH - diff));\n" +" } else if (IN_DEPTH > OUT_DEPTH) {\n" +" return val >> (IN_DEPTH - OUT_DEPTH);\n" +" }\n" +" return val;\n" +"}\n" +"\n" +"__device__ float\n" +"dot(float3 val, float *coeff)\n" +"{\n" +" return val.x * coeff[0] + val.y * coeff[1] + val.z * coeff[2];\n" +"}\n" +"\n" +"__device__ uint3\n" +"yuv_to_rgb (unsigned short y, unsigned short u, unsigned short v, unsigned int max_val)\n" +"{\n" +" float3 yuv = make_float3 (y, u, v);\n" +" uint3 rgb;\n" +" rgb.x = max ((unsigned int)(dot (yuv, rcoeff) + offset[0]), 0);\n" +" rgb.y = max ((unsigned int)(dot (yuv, gcoeff) + offset[1]), 0);\n" +" rgb.z = max ((unsigned int)(dot (yuv, bcoeff) + offset[2]), 0);\n" +" rgb.x = min (rgb.x, max_val);\n" +" rgb.y = min (rgb.y, max_val);\n" +" rgb.z = min (rgb.z, max_val);\n" +" return rgb;\n" +"}\n" +"\n" +/* __device__ ushort2 + * read_chroma (cudaTextureObject_t tex1, cudaTextureObject_t tex2, float x, 
float y); + */ +"%s\n" +"\n" +"__global__ void\n" +GST_CUDA_KERNEL_FUNC +"(cudaTextureObject_t tex0, cudaTextureObject_t tex1, cudaTextureObject_t tex2,\n" +" unsigned char *dstRGB, int stride)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * blockDim.y + threadIdx.y;\n" +" if (x_pos < WIDTH && y_pos < HEIGHT) {\n" +" float src_xpos = SCALE_H * x_pos;\n" +" float src_ypos = SCALE_V * y_pos;\n" +" unsigned short y = tex2D<unsigned short>(tex0, src_xpos, src_ypos);\n" +" ushort2 uv;\n" +" unsigned short u, v;\n" +" uint3 rgb;\n" +" unsigned int clip_max = MAX_IN_VAL;\n" +" src_xpos = CHROMA_SCALE_H * x_pos;\n" +" src_ypos = CHROMA_SCALE_V * y_pos;\n" +" uv = read_chroma (tex1, tex2, src_xpos, src_ypos);\n" +" u = uv.x;\n" +" v = uv.y;\n" +" y = y >> IN_SHIFT;\n" +" u = u >> IN_SHIFT;\n" +" v = v >> IN_SHIFT;\n" +" if (SWAP_UV) {\n" +" unsigned short tmp = u;\n" +" u = v;\n" +" v = tmp;\n" +" }\n" + /* conversion matrix is scaled to higher bitdepth between in/out formats */ +" if (OUT_DEPTH > IN_DEPTH) {\n" +" y = do_scale_pixel (y);\n" +" u = do_scale_pixel (u);\n" +" v = do_scale_pixel (v);\n" +" clip_max = MASK;\n" +" }" +" rgb = yuv_to_rgb (y, u, v, clip_max);\n" +" if (OUT_DEPTH < IN_DEPTH) {\n" +" rgb.x = do_scale_pixel (rgb.x);\n" +" rgb.y = do_scale_pixel (rgb.y);\n" +" rgb.z = do_scale_pixel (rgb.z);\n" +" }" +" if (OUT_DEPTH > 8) {\n" +" unsigned int packed_rgb = 0;\n" + /* A is always MSB, we support only little endian system */ +" packed_rgb = 0xc000 << 16;\n" +" packed_rgb |= (rgb.x << (30 - (R_IDX * 10)));\n" +" packed_rgb |= (rgb.y << (30 - (G_IDX * 10)));\n" +" packed_rgb |= (rgb.z << (30 - (B_IDX * 10)));\n" +" *(unsigned int *)&dstRGB[x_pos * PSTRIDE + y_pos * stride] = packed_rgb;\n" +" } else {\n" +" dstRGB[x_pos * PSTRIDE + R_IDX + y_pos * stride] = (unsigned char) rgb.x;\n" +" dstRGB[x_pos * PSTRIDE + G_IDX + y_pos * stride] = (unsigned char) rgb.y;\n" +" dstRGB[x_pos * PSTRIDE + B_IDX + y_pos * 
stride] = (unsigned char) rgb.z;\n" +" if (A_IDX >= 0 || X_IDX >= 0)\n" +" dstRGB[x_pos * PSTRIDE + A_IDX + y_pos * stride] = 0xff;\n" +" }\n" +" }\n" +"}\n" +"\n" +"}"; + +/** + * GST_CUDA_KERNEL_FUNC_TO_ARGB: + * @srcRGB: a CUDA global memory containing a RGB image + * @dstRGB: a CUDA global memory to store unpacked ARGB image + * @width: the width of @srcRGB and @dstRGB + * @height: the height of @srcRGB and @dstRGB + * @src_stride: the stride of @srcRGB + * @src_pstride: the pixel stride of @srcRGB + * @dst_stride: the stride of @dstRGB + * @r_idx: the index of red component of @srcRGB + * @g_idx: the index of green component of @srcRGB + * @b_idx: the index of blue component of @srcRGB + * @a_idx: the index of alpha component of @srcRGB + * + * Unpack a RGB image from @srcRGB and write the unpacked data into @dstRGB + */ +static const gchar unpack_to_ARGB[] = +"__global__ void\n" +GST_CUDA_KERNEL_FUNC_TO_ARGB +"(unsigned char *srcRGB, unsigned char *dstRGB, int width, int height,\n" +" int src_stride, int src_pstride, int dst_stride,\n" +" int r_idx, int g_idx, int b_idx, int a_idx)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * blockDim.y + threadIdx.y;\n" +" if (x_pos < width && y_pos < height) {\n" +" if (a_idx >= 0) {\n" +" dstRGB[x_pos * 4 + y_pos * dst_stride] =\n" +" srcRGB[x_pos * src_pstride + a_idx + y_pos * src_stride];\n" +" } else {\n" +" dstRGB[x_pos * 4 + y_pos * dst_stride] = 0xff;\n" +" }\n" +" dstRGB[x_pos * 4 + 1 + y_pos * dst_stride] =\n" +" srcRGB[x_pos * src_pstride + r_idx + y_pos * src_stride];\n" +" dstRGB[x_pos * 4 + 2 + y_pos * dst_stride] =\n" +" srcRGB[x_pos * src_pstride + g_idx + y_pos * src_stride];\n" +" dstRGB[x_pos * 4 + 3 + y_pos * dst_stride] =\n" +" srcRGB[x_pos * src_pstride + b_idx + y_pos * src_stride];\n" +" }\n" +"}\n"; + +/** + * GST_CUDA_KERNEL_FUNC_TO_ARGB: + * @srcRGB: a CUDA global memory containing a RGB image + * @dstRGB: a CUDA global memory to store unpacked 
ARGB64 image + * @width: the width of @srcRGB and @dstRGB + * @height: the height of @srcRGB and @dstRGB + * @src_stride: the stride of @srcRGB + * @src_pstride: the pixel stride of @srcRGB + * @dst_stride: the stride of @dstRGB + * @r_idx: the index of red component of @srcRGB + * @g_idx: the index of green component of @srcRGB + * @b_idx: the index of blue component of @srcRGB + * @a_idx: the index of alpha component of @srcRGB + * + * Unpack a RGB image from @srcRGB and write the unpacked data into @dstRGB + */ +static const gchar unpack_to_ARGB64[] = +"__global__ void\n" +GST_CUDA_KERNEL_FUNC_TO_ARGB +"(unsigned char *srcRGB, unsigned char *dstRGB, int width, int height,\n" +" int src_stride, int src_pstride, int dst_stride,\n" +" int r_idx, int g_idx, int b_idx, int a_idx)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * blockDim.y + threadIdx.y;\n" +" if (x_pos < width && y_pos < height) {\n" +" unsigned short a, r, g, b;\n" +" unsigned int read_val;\n" +" read_val = *(unsigned int *)&srcRGB[x_pos * src_pstride + y_pos * src_stride];\n" +" a = (read_val >> 30) & 0x03;\n" +" a = (a << 14) | (a << 12) | (a << 10) | (a << 8) | (a << 6) | (a << 4) | (a << 2) | (a << 0);\n" +" r = ((read_val >> (30 - (r_idx * 10))) & 0x3ff);\n" +" r = (r << 6) | (r >> 4);\n" +" g = ((read_val >> (30 - (g_idx * 10))) & 0x3ff);\n" +" g = (g << 6) | (g >> 4);\n" +" b = ((read_val >> (30 - (b_idx * 10))) & 0x3ff);\n" +" b = (b << 6) | (b >> 4);\n" +" *(unsigned short *)&dstRGB[x_pos * 8 + y_pos * dst_stride] = 0xffff;\n" +" *(unsigned short *)&dstRGB[x_pos * 8 + 2 + y_pos * dst_stride] = r;\n" +" *(unsigned short *)&dstRGB[x_pos * 8 + 4 + y_pos * dst_stride] = g;\n" +" *(unsigned short *)&dstRGB[x_pos * 8 + 6 + y_pos * dst_stride] = b;\n" +" }\n" +"}\n"; + +/* CUDA kernel source for from RGB to YUV conversion and scale */ +static const gchar templ_RGB_TO_YUV[] = +"extern \"C\"{\n" +"__constant__ float offset[3] = {%f, %f, %f};\n" 
+"__constant__ float ycoeff[3] = {%f, %f, %f};\n" +"__constant__ float ucoeff[3] = {%f, %f, %f};\n" +"__constant__ float vcoeff[3] = {%f, %f, %f};\n" +"\n" +"__constant__ float SCALE_H = %f;\n" +"__constant__ float SCALE_V = %f;\n" +"__constant__ float CHROMA_SCALE_H = %f;\n" +"__constant__ float CHROMA_SCALE_V = %f;\n" +"__constant__ int WIDTH = %d;\n" +"__constant__ int HEIGHT = %d;\n" +"__constant__ int CHROMA_WIDTH = %d;\n" +"__constant__ int CHROMA_HEIGHT = %d;\n" +"__constant__ int IN_DEPTH = %d;\n" +"__constant__ int OUT_DEPTH = %d;\n" +"__constant__ int PSTRIDE = %d;\n" +"__constant__ int CHROMA_PSTRIDE = %d;\n" +"__constant__ int IN_SHIFT = %d;\n" +"__constant__ int OUT_SHIFT = %d;\n" +"__constant__ int MASK = %d;\n" +"__constant__ int SWAP_UV = %d;\n" +"\n" +"__device__ unsigned short\n" +"do_scale_pixel (unsigned short val) \n" +"{\n" +" unsigned int diff;\n" +" if (OUT_DEPTH > IN_DEPTH) {\n" +" diff = OUT_DEPTH - IN_DEPTH;\n" +" return (val << diff) | (val >> (IN_DEPTH - diff));\n" +" } else if (IN_DEPTH > OUT_DEPTH) {\n" +" return val >> (IN_DEPTH - OUT_DEPTH);\n" +" }\n" +" return val;\n" +"}\n" +"\n" +"__device__ float\n" +"dot(float3 val, float *coeff)\n" +"{\n" +" return val.x * coeff[0] + val.y * coeff[1] + val.z * coeff[2];\n" +"}\n" +"\n" +"__device__ uint3\n" +"rgb_to_yuv (unsigned short r, unsigned short g, unsigned short b,\n" +" unsigned int max_val)\n" +"{\n" +" float3 rgb = make_float3 (r, g, b);\n" +" uint3 yuv;\n" +" yuv.x = max ((unsigned int)(dot (rgb, ycoeff) + offset[0]), 0);\n" +" yuv.y = max ((unsigned int)(dot (rgb, ucoeff) + offset[1]), 0);\n" +" yuv.z = max ((unsigned int)(dot (rgb, vcoeff) + offset[2]), 0);\n" +" yuv.x = min (yuv.x, max_val);\n" +" yuv.y = min (yuv.y, max_val);\n" +" yuv.z = min (yuv.z, max_val);\n" +" return yuv;\n" +"}\n" +"\n" +/* __global__ void + * GST_CUDA_KERNEL_FUNC_TO_ARGB + */ +"%s\n" +"\n" +/* __device__ ushort2 + * read_chroma (cudaTextureObject_t tex1, cudaTextureObject_t tex2, float x, float y); + 
*/ +"%s\n" +"\n" +/* __device__ void + * write_chroma (unsigned char *dst1, unsigned char *dst2, unsigned short u, + * unsigned short v, int x, int y, int pstride, int stride, int mask); + */ +"%s\n" +"\n" +"__global__ void\n" +GST_CUDA_KERNEL_FUNC_TO_Y444 +"(cudaTextureObject_t srcRGB, unsigned char *dstY, int y_stride,\n" +" unsigned char *dstU, int u_stride, unsigned char *dstV, int v_stride,\n" +" int width, int height, int dst_pstride, int in_depth)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * blockDim.y + threadIdx.y;\n" +" if (x_pos < width && y_pos < height) {\n" +" ushort4 argb = tex2D<ushort4>(srcRGB, x_pos, y_pos);\n" +" uint3 yuv;\n" +" yuv = rgb_to_yuv (argb.y, argb.z, argb.w, (1 << in_depth) - 1);\n" +" if (in_depth > 8) {\n" +" *(unsigned short *)&dstY[x_pos * dst_pstride + y_pos * y_stride] = yuv.x;\n" +" *(unsigned short *)&dstU[x_pos * dst_pstride + y_pos * u_stride] = yuv.y;\n" +" *(unsigned short *)&dstV[x_pos * dst_pstride + y_pos * v_stride] = yuv.z;\n" +" } else {\n" +" dstY[x_pos * dst_pstride + y_pos * y_stride] = yuv.x;\n" +" dstU[x_pos * dst_pstride + y_pos * u_stride] = yuv.y;\n" +" dstV[x_pos * dst_pstride + y_pos * v_stride] = yuv.z;\n" +" }\n" +" }\n" +"}\n" +"\n" +"__global__ void\n" +GST_CUDA_KERNEL_FUNC_Y444_TO_YUV +"(cudaTextureObject_t tex0, cudaTextureObject_t tex1, cudaTextureObject_t tex2,\n" +" unsigned char *dst0, unsigned char *dst1, unsigned char *dst2,\n" +" int stride)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * blockDim.y + threadIdx.y;\n" +" if (x_pos < WIDTH && y_pos < HEIGHT) {\n" +" float src_xpos = SCALE_H * x_pos;\n" +" float src_ypos = SCALE_V * y_pos;\n" +" unsigned short y = tex2D<unsigned short>(tex0, src_xpos, src_ypos);\n" +" y = y >> IN_SHIFT;\n" +" y = do_scale_pixel (y);\n" +" y = y << OUT_SHIFT;\n" +" if (OUT_DEPTH > 8) {\n" +" *(unsigned short *)&dst0[x_pos * PSTRIDE + y_pos * stride] = (y & MASK);\n" +" 
} else {\n" +" dst0[x_pos * PSTRIDE + y_pos * stride] = y;\n" +" }\n" +" }\n" +" if (x_pos < CHROMA_WIDTH && y_pos < CHROMA_HEIGHT) {\n" +" float src_xpos = CHROMA_SCALE_H * x_pos;\n" +" float src_ypos = CHROMA_SCALE_V * y_pos;\n" +" unsigned short u, v;\n" +" ushort2 uv;\n" +" uv = read_chroma (tex1, tex2, src_xpos, src_ypos);\n" +" u = uv.x;\n" +" v = uv.y;\n" +" u = u >> IN_SHIFT;\n" +" v = v >> IN_SHIFT;\n" +" u = do_scale_pixel (u);\n" +" v = do_scale_pixel (v);\n" +" u = u << OUT_SHIFT;\n" +" v = v << OUT_SHIFT;\n" +" if (SWAP_UV) {\n" +" unsigned short tmp = u;\n" +" u = v;\n" +" v = tmp;\n" +" }\n" +" write_chroma (dst1,\n" +" dst2, u, v, x_pos, y_pos, CHROMA_PSTRIDE, stride, MASK);\n" +" }\n" +"}\n" +"\n" +"}"; + +/* CUDA kernel source for from RGB to RGB conversion and scale */ +static const gchar templ_RGB_to_RGB[] = +"extern \"C\"{\n" +"__constant__ float SCALE_H = %f;\n" +"__constant__ float SCALE_V = %f;\n" +"__constant__ int WIDTH = %d;\n" +"__constant__ int HEIGHT = %d;\n" +"__constant__ int IN_DEPTH = %d;\n" +"__constant__ int OUT_DEPTH = %d;\n" +"__constant__ int PSTRIDE = %d;\n" +"__constant__ int R_IDX = %d;\n" +"__constant__ int G_IDX = %d;\n" +"__constant__ int B_IDX = %d;\n" +"__constant__ int A_IDX = %d;\n" +"__constant__ int X_IDX = %d;\n" +"\n" +"__device__ unsigned short\n" +"do_scale_pixel (unsigned short val) \n" +"{\n" +" unsigned int diff;\n" +" if (OUT_DEPTH > IN_DEPTH) {\n" +" diff = OUT_DEPTH - IN_DEPTH;\n" +" return (val << diff) | (val >> (IN_DEPTH - diff));\n" +" } else if (IN_DEPTH > OUT_DEPTH) {\n" +" return val >> (IN_DEPTH - OUT_DEPTH);\n" +" }\n" +" return val;\n" +"}\n" +"\n" +/* __global__ void + * GST_CUDA_KERNEL_FUNC_TO_ARGB + */ +"%s\n" +"\n" +/* convert ARGB or ARGB64 to other RGB formats with scale */ +"__global__ void\n" +GST_CUDA_KERNEL_FUNC_SCALE_RGB +"(cudaTextureObject_t srcRGB, unsigned char *dstRGB, int dst_stride)\n" +"{\n" +" int x_pos = blockIdx.x * blockDim.x + threadIdx.x;\n" +" int y_pos = blockIdx.y * 
blockDim.y + threadIdx.y;\n" +" if (x_pos < WIDTH && y_pos < HEIGHT) {\n" +" float src_xpos = SCALE_H * x_pos;\n" +" float src_ypos = SCALE_V * y_pos;\n" +" ushort4 argb = tex2D<ushort4>(srcRGB, src_xpos, src_ypos);\n" +" argb.x = do_scale_pixel(argb.x);\n" +" argb.y = do_scale_pixel(argb.y);\n" +" argb.z = do_scale_pixel(argb.z);\n" +" argb.w = do_scale_pixel(argb.w);\n" + /* FIXME: RGB10A2_LE or BGR10A2_LE only */ +" if (OUT_DEPTH > 8) {\n" +" unsigned int packed_rgb = 0;\n" +" unsigned int a, r, g, b;" +" a = (argb.x >> 8) & 0x3;\n" +" r = argb.y & 0x3ff;\n" +" g = argb.z & 0x3ff;\n" +" b = argb.w & 0x3ff;\n" + /* A is always MSB, we support only little endian system */ +" packed_rgb = a << 30;\n" +" packed_rgb |= (r << (30 - (R_IDX * 10)));\n" +" packed_rgb |= (g << (30 - (G_IDX * 10)));\n" +" packed_rgb |= (b << (30 - (B_IDX * 10)));\n" +" *(unsigned int *)&dstRGB[x_pos * 4 + y_pos * dst_stride] = packed_rgb;\n" +" } else {\n" +" if (A_IDX >= 0) {\n" +" argb.x = do_scale_pixel(argb.x);\n" +" dstRGB[x_pos * PSTRIDE + A_IDX + y_pos * dst_stride] = argb.x;\n" +" } else if (X_IDX >= 0) {\n" +" dstRGB[x_pos * PSTRIDE + X_IDX + y_pos * dst_stride] = 0xff;\n" +" }\n" +" dstRGB[x_pos * PSTRIDE + R_IDX + y_pos * dst_stride] = argb.y;\n" +" dstRGB[x_pos * PSTRIDE + G_IDX + y_pos * dst_stride] = argb.z;\n" +" dstRGB[x_pos * PSTRIDE + B_IDX + y_pos * dst_stride] = argb.w;\n" +" }\n" +" }\n" +"}\n" +"\n" +"}"; +/* *INDENT-ON* */ + +typedef struct +{ + gint R; + gint G; + gint B; + gint A; + gint X; +} GstCudaRGBOrder; + +typedef struct +{ + CUdeviceptr device_ptr; + gsize cuda_stride; +} GstCudaStageBuffer; + +#define CONVERTER_MAX_NUM_FUNC 4 + +struct _GstCudaConverter +{ + GstVideoInfo in_info; + GstVideoInfo out_info; + gboolean keep_size; + + gint texture_alignment; + + GstCudaContext *cuda_ctx; + CUmodule cuda_module; + CUfunction kernel_func[CONVERTER_MAX_NUM_FUNC]; + const gchar *func_names[CONVERTER_MAX_NUM_FUNC]; + gchar *kernel_source; + gchar *ptx; + 
GstCudaStageBuffer fallback_buffer[GST_VIDEO_MAX_PLANES]; + + gboolean (*convert) (GstCudaConverter * convert, const GstCudaMemory * src, + GstVideoInfo * in_info, GstCudaMemory * dst, GstVideoInfo * out_info, + CUstream cuda_stream); + + const CUdeviceptr src; + GstVideoInfo *cur_in_info; + + CUdeviceptr dest; + GstVideoInfo *cur_out_info; + + /* rgb to {rgb, yuv} only */ + GstCudaRGBOrder in_rgb_order; + GstCudaStageBuffer unpack_surface; + GstCudaStageBuffer y444_surface[GST_VIDEO_MAX_PLANES]; +}; + +#define LOAD_CUDA_FUNC(module,func,name) G_STMT_START { \ + if (!gst_cuda_result (CuModuleGetFunction (&(func), (module), name))) { \ + GST_ERROR ("failed to get %s function", (name)); \ + goto error; \ + } \ +} G_STMT_END + +/** + * gst_cuda_converter_new: + * @in_info: a #GstVideoInfo + * @out_info: a #GstVideoInfo + * @cuda_ctx: (transfer none): a #GstCudaContext + * + * Create a new converter object to convert between @in_info and @out_info. + * + * Returns: a #GstCudaConverter or %NULL if conversion is not possible. 
+ */ +GstCudaConverter * +gst_cuda_converter_new (GstVideoInfo * in_info, GstVideoInfo * out_info, + GstCudaContext * cuda_ctx) +{ + GstCudaConverter *convert; + gint i; + + g_return_val_if_fail (in_info != NULL, NULL); + g_return_val_if_fail (out_info != NULL, NULL); + g_return_val_if_fail (cuda_ctx != NULL, NULL); + /* we won't ever do framerate conversion */ + g_return_val_if_fail (in_info->fps_n == out_info->fps_n, NULL); + g_return_val_if_fail (in_info->fps_d == out_info->fps_d, NULL); + /* we won't ever do deinterlace */ + g_return_val_if_fail (in_info->interlace_mode == out_info->interlace_mode, + NULL); + + convert = g_new0 (GstCudaConverter, 1); + + convert->in_info = *in_info; + convert->out_info = *out_info; + + /* FIXME: should return kernel source */ + if (!gst_cuda_context_push (cuda_ctx)) { + GST_ERROR ("cannot push context"); + goto error; + } + + if (!cuda_converter_lookup_path (convert)) + goto error; + + convert->ptx = gst_cuda_nvrtc_compile (convert->kernel_source); + if (!convert->ptx) { + GST_ERROR ("no PTX data to load"); + goto error; + } + + GST_TRACE ("compiled convert ptx \n%s", convert->ptx); + + if (!gst_cuda_result (CuModuleLoadData (&convert->cuda_module, convert->ptx))) { + gst_cuda_context_pop (NULL); + GST_ERROR ("failed to load cuda module data"); + + goto error; + } + + for (i = 0; i < CONVERTER_MAX_NUM_FUNC; i++) { + if (!convert->func_names[i]) + break; + + LOAD_CUDA_FUNC (convert->cuda_module, convert->kernel_func[i], + convert->func_names[i]); + GST_DEBUG ("kernel function \"%s\" loaded", convert->func_names[i]); + } + + gst_cuda_context_pop (NULL); + convert->cuda_ctx = gst_object_ref (cuda_ctx); + convert->texture_alignment = + gst_cuda_context_get_texture_alignment (cuda_ctx); + + g_free (convert->kernel_source); + g_free (convert->ptx); + convert->kernel_source = NULL; + convert->ptx = NULL; + + return convert; + +error: + gst_cuda_context_pop (NULL); + gst_cuda_converter_free (convert); + + return NULL; +} + +/** + * 
gst_cuda_converter_free: + * @convert: a #GstCudaConverter + * + * Free @convert + */ +void +gst_cuda_converter_free (GstCudaConverter * convert) +{ + g_return_if_fail (convert != NULL); + + if (convert->cuda_ctx) { + if (gst_cuda_context_push (convert->cuda_ctx)) { + gint i; + + if (convert->cuda_module) { + gst_cuda_result (CuModuleUnload (convert->cuda_module)); + } + + for (i = 0; i < GST_VIDEO_MAX_PLANES; i++) { + if (convert->fallback_buffer[i].device_ptr) + gst_cuda_result (CuMemFree (convert->fallback_buffer[i].device_ptr)); + if (convert->y444_surface[i].device_ptr) + gst_cuda_result (CuMemFree (convert->y444_surface[i].device_ptr)); + } + + if (convert->unpack_surface.device_ptr) + gst_cuda_result (CuMemFree (convert->unpack_surface.device_ptr)); + + gst_cuda_context_pop (NULL); + } + + gst_object_unref (convert->cuda_ctx); + } + + g_free (convert->kernel_source); + g_free (convert->ptx); + g_free (convert); +} + +/** + * gst_cuda_converter_frame: + * @convert: a #GstCudaConverter + * @src: a #GstCudaMemory + * @in_info: a #GstVideoInfo representing @src + * @dst: a #GstCudaMemory + * @out_info: a #GstVideoInfo representing @dst + * @cuda_stream: a #CUstream + * + * Convert the pixels of @src into @dst using @convert. 
+ * The caller does not need to wrap this call in gst_cuda_context_push() and gst_cuda_context_pop() + */ +gboolean +gst_cuda_converter_frame (GstCudaConverter * convert, const GstCudaMemory * src, + GstVideoInfo * in_info, GstCudaMemory * dst, GstVideoInfo * out_info, + CUstream cuda_stream) +{ + gboolean ret; + + g_return_val_if_fail (convert, FALSE); + g_return_val_if_fail (src, FALSE); + g_return_val_if_fail (in_info, FALSE); + g_return_val_if_fail (dst, FALSE); + g_return_val_if_fail (out_info, FALSE); + + gst_cuda_context_push (convert->cuda_ctx); + + ret = gst_cuda_converter_frame_unlocked (convert, + src, in_info, dst, out_info, cuda_stream); + + gst_cuda_context_pop (NULL); + + return ret; +} + +/** + * gst_cuda_converter_frame_unlocked: + * @convert: a #GstCudaConverter + * @src: a #GstCudaMemory + * @in_info: a #GstVideoInfo representing @src + * @dst: a #GstCudaMemory + * @out_info: a #GstVideoInfo representing @dst + * @cuda_stream: a #CUstream + * + * Convert the pixels of @src into @dst using @convert. 
+ * Caller should call this method after gst_cuda_context_push() + */ +gboolean +gst_cuda_converter_frame_unlocked (GstCudaConverter * convert, + const GstCudaMemory * src, GstVideoInfo * in_info, GstCudaMemory * dst, + GstVideoInfo * out_info, CUstream cuda_stream) +{ + g_return_val_if_fail (convert, FALSE); + g_return_val_if_fail (src, FALSE); + g_return_val_if_fail (in_info, FALSE); + g_return_val_if_fail (dst, FALSE); + g_return_val_if_fail (out_info, FALSE); + + return convert->convert (convert, src, in_info, dst, out_info, cuda_stream); +} + +/* allocate fallback memory for texture alignment requirement */ +static gboolean +convert_ensure_fallback_memory (GstCudaConverter * convert, + GstVideoInfo * info, guint plane) +{ + CUresult ret; + guint element_size = 8; + + if (convert->fallback_buffer[plane].device_ptr) + return TRUE; + + if (GST_VIDEO_INFO_COMP_DEPTH (info, 0) > 8) + element_size = 16; + + ret = CuMemAllocPitch (&convert->fallback_buffer[plane].device_ptr, + &convert->fallback_buffer[plane].cuda_stride, + GST_VIDEO_INFO_COMP_WIDTH (info, plane) * + GST_VIDEO_INFO_COMP_PSTRIDE (info, plane), + GST_VIDEO_INFO_COMP_HEIGHT (info, plane), element_size); + + if (!gst_cuda_result (ret)) { + GST_ERROR ("failed to allocate fallback memory"); + return FALSE; + } + + return TRUE; +} + +/* create a 2D CUDA texture without alignment check */ +static CUtexObject +convert_create_texture_unchecked (const CUdeviceptr src, gint width, + gint height, gint channels, gint stride, CUarray_format format, + CUfilter_mode mode, CUstream cuda_stream) +{ + CUDA_TEXTURE_DESC texture_desc; + CUDA_RESOURCE_DESC resource_desc; + CUtexObject texture = 0; + CUresult cuda_ret; + + memset (&texture_desc, 0, sizeof (CUDA_TEXTURE_DESC)); + memset (&resource_desc, 0, sizeof (CUDA_RESOURCE_DESC)); + + resource_desc.resType = CU_RESOURCE_TYPE_PITCH2D; + resource_desc.res.pitch2D.format = format; + resource_desc.res.pitch2D.numChannels = channels; + resource_desc.res.pitch2D.width = 
width; + resource_desc.res.pitch2D.height = height; + resource_desc.res.pitch2D.pitchInBytes = stride; + resource_desc.res.pitch2D.devPtr = src; + + texture_desc.filterMode = mode; + texture_desc.flags = CU_TRSF_READ_AS_INTEGER; + + gst_cuda_result (CuStreamSynchronize (cuda_stream)); + cuda_ret = CuTexObjectCreate (&texture, &resource_desc, &texture_desc, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("couldn't create texture"); + + return 0; + } + + return texture; +} + +static CUtexObject +convert_create_texture (GstCudaConverter * convert, const GstCudaMemory * src, + GstVideoInfo * info, guint plane, CUstream cuda_stream) +{ + CUarray_format format = CU_AD_FORMAT_UNSIGNED_INT8; + guint channels = 1; + CUdeviceptr src_ptr; + gsize stride; + CUresult cuda_ret; + CUfilter_mode mode; + + if (GST_VIDEO_INFO_COMP_DEPTH (info, plane) > 8) + format = CU_AD_FORMAT_UNSIGNED_INT16; + + /* FIXME: more graceful method ? */ + if (plane != 0 && + GST_VIDEO_INFO_N_PLANES (info) != GST_VIDEO_INFO_N_COMPONENTS (info)) { + channels = 2; + } + + src_ptr = src->data + src->offset[plane]; + stride = src->stride; + + if (convert->texture_alignment && (src_ptr % convert->texture_alignment)) { + CUDA_MEMCPY2D copy_params = { 0, }; + + if (!convert_ensure_fallback_memory (convert, info, plane)) + return 0; + + GST_LOG ("device memory was not aligned, copy to fallback memory"); + + copy_params.srcMemoryType = CU_MEMORYTYPE_DEVICE; + copy_params.srcPitch = stride; + copy_params.srcDevice = (CUdeviceptr) src_ptr; + + copy_params.dstMemoryType = CU_MEMORYTYPE_DEVICE; + copy_params.dstPitch = convert->fallback_buffer[plane].cuda_stride; + copy_params.dstDevice = convert->fallback_buffer[plane].device_ptr; + copy_params.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (info, plane) + * GST_VIDEO_INFO_COMP_PSTRIDE (info, plane); + copy_params.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, plane); + + cuda_ret = CuMemcpy2DAsync (&copy_params, cuda_stream); + if (!gst_cuda_result (cuda_ret)) { + 
GST_ERROR ("failed to copy to fallback buffer"); + return 0; + } + + src_ptr = convert->fallback_buffer[plane].device_ptr; + stride = convert->fallback_buffer[plane].cuda_stride; + } + + /* Use h/w linear interpolation only when resize is required. + * Otherwise the image might be blurred */ + if (convert->keep_size) + mode = CU_TR_FILTER_MODE_POINT; + else + mode = CU_TR_FILTER_MODE_LINEAR; + + return convert_create_texture_unchecked (src_ptr, + GST_VIDEO_INFO_COMP_WIDTH (info, plane), + GST_VIDEO_INFO_COMP_HEIGHT (info, plane), channels, stride, format, mode, + cuda_stream); +} + +/* main conversion function for YUV to YUV conversion */ +static gboolean +convert_YUV_TO_YUV (GstCudaConverter * convert, + const GstCudaMemory * src, GstVideoInfo * in_info, GstCudaMemory * dst, + GstVideoInfo * out_info, CUstream cuda_stream) +{ + CUtexObject texture[GST_VIDEO_MAX_PLANES] = { 0, }; + CUresult cuda_ret; + gboolean ret = FALSE; + CUdeviceptr dst_ptr[GST_VIDEO_MAX_PLANES] = { 0, }; + gint dst_stride; + gint width, height; + gint i; + + gpointer kernel_args[] = { &texture[0], &texture[1], &texture[2], + &dst_ptr[0], &dst_ptr[1], &dst_ptr[2], &dst_stride + }; + + /* conversion step + * STEP 1: create CUtexObject per plane + * STEP 2: call YUV to YUV conversion kernel function. 
+ * resize, uv reordering and bitdepth conversion will be performed in + * the CUDA kernel function + */ + + /* map CUDA device memory to CUDA texture object */ + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (in_info); i++) { + texture[i] = convert_create_texture (convert, src, in_info, i, cuda_stream); + if (!texture[i]) { + GST_ERROR ("couldn't create texture for %d th plane", i); + goto done; + } + } + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (out_info); i++) + dst_ptr[i] = dst->data + dst->offset[i]; + + dst_stride = dst->stride; + + width = GST_VIDEO_INFO_WIDTH (out_info); + height = GST_VIDEO_INFO_HEIGHT (out_info); + + cuda_ret = + CuLaunchKernel (convert->kernel_func[0], DIV_UP (width, CUDA_BLOCK_X), + DIV_UP (height, CUDA_BLOCK_Y), 1, CUDA_BLOCK_X, CUDA_BLOCK_Y, 1, 0, + cuda_stream, kernel_args, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("could not rescale plane"); + goto done; + } + + ret = TRUE; + gst_cuda_result (CuStreamSynchronize (cuda_stream)); + +done: + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (in_info); i++) { + if (texture[i]) + gst_cuda_result (CuTexObjectDestroy (texture[i])); + } + + return ret; +} + +/* main conversion function for YUV to RGB conversion */ +static gboolean +convert_YUV_TO_RGB (GstCudaConverter * convert, + const GstCudaMemory * src, GstVideoInfo * in_info, GstCudaMemory * dst, + GstVideoInfo * out_info, CUstream cuda_stream) +{ + CUtexObject texture[GST_VIDEO_MAX_PLANES] = { 0, }; + CUresult cuda_ret; + gboolean ret = FALSE; + CUdeviceptr dstRGB = 0; + gint dst_stride; + gint width, height; + gint i; + + gpointer kernel_args[] = { &texture[0], &texture[1], &texture[2], + &dstRGB, &dst_stride + }; + + /* conversion step + * STEP 1: create CUtexObject per plane + * STEP 2: call YUV to RGB conversion kernel function. 
+ * resizing, argb ordering and bitdepth conversion will be performed in + * the CUDA kernel function + */ + + /* map CUDA device memory to CUDA texture object */ + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (in_info); i++) { + texture[i] = convert_create_texture (convert, src, in_info, i, cuda_stream); + if (!texture[i]) { + GST_ERROR ("couldn't create texture for %d th plane", i); + goto done; + } + } + + dstRGB = dst->data; + dst_stride = dst->stride; + + width = GST_VIDEO_INFO_WIDTH (out_info); + height = GST_VIDEO_INFO_HEIGHT (out_info); + + cuda_ret = + CuLaunchKernel (convert->kernel_func[0], DIV_UP (width, CUDA_BLOCK_X), + DIV_UP (height, CUDA_BLOCK_Y), 1, CUDA_BLOCK_X, CUDA_BLOCK_Y, 1, 0, + cuda_stream, kernel_args, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("could not rescale plane"); + goto done; + } + + ret = TRUE; + gst_cuda_result (CuStreamSynchronize (cuda_stream)); + +done: + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (in_info); i++) { + if (texture[i]) + gst_cuda_result (CuTexObjectDestroy (texture[i])); + } + + return ret; +} + +static gboolean +convert_UNPACK_RGB (GstCudaConverter * convert, CUfunction kernel_func, + CUstream cuda_stream, const GstCudaMemory * src, GstVideoInfo * in_info, + CUdeviceptr dst, gint dst_stride, GstCudaRGBOrder * rgb_order) +{ + CUdeviceptr srcRGB = 0; + gint width, height; + gint src_stride, src_pstride; + CUresult cuda_ret; + + gpointer unpack_kernel_args[] = { &srcRGB, &dst, + &width, &height, + &src_stride, &src_pstride, &dst_stride, + &convert->in_rgb_order.R, &convert->in_rgb_order.G, + &convert->in_rgb_order.B, &convert->in_rgb_order.A, + }; + + srcRGB = src->data; + src_stride = src->stride; + + width = GST_VIDEO_INFO_WIDTH (in_info); + height = GST_VIDEO_INFO_HEIGHT (in_info); + src_pstride = GST_VIDEO_INFO_COMP_PSTRIDE (in_info, 0); + + cuda_ret = + CuLaunchKernel (kernel_func, DIV_UP (width, CUDA_BLOCK_X), + DIV_UP (height, CUDA_BLOCK_Y), 1, CUDA_BLOCK_X, CUDA_BLOCK_Y, 1, 0, + cuda_stream, 
unpack_kernel_args, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("could not unpack rgb"); + return FALSE; + } + + return TRUE; +} + +static gboolean +convert_TO_Y444 (GstCudaConverter * convert, CUfunction kernel_func, + CUstream cuda_stream, CUtexObject srcRGB, CUdeviceptr dstY, gint y_stride, + CUdeviceptr dstU, gint u_stride, CUdeviceptr dstV, gint v_stride, + gint width, gint height, gint pstride, gint bitdepth) +{ + CUresult cuda_ret; + + gpointer kernel_args[] = { &srcRGB, &dstY, &y_stride, &dstU, &u_stride, &dstV, + &v_stride, &width, &height, &pstride, &bitdepth, + }; + + cuda_ret = + CuLaunchKernel (kernel_func, DIV_UP (width, CUDA_BLOCK_X), + DIV_UP (height, CUDA_BLOCK_Y), 1, CUDA_BLOCK_X, CUDA_BLOCK_Y, 1, 0, + cuda_stream, kernel_args, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("could not convert to Y444"); + return FALSE; + } + + return TRUE; +} + +/* main conversion function for RGB to YUV conversion */ +static gboolean +convert_RGB_TO_YUV (GstCudaConverter * convert, + const GstCudaMemory * src, GstVideoInfo * in_info, GstCudaMemory * dst, + GstVideoInfo * out_info, CUstream cuda_stream) +{ + CUtexObject texture = 0; + CUtexObject yuv_texture[3] = { 0, }; + CUdeviceptr dst_ptr[GST_VIDEO_MAX_PLANES] = { 0, }; + CUresult cuda_ret; + gboolean ret = FALSE; + gint in_width, in_height; + gint out_width, out_height; + gint dst_stride; + CUarray_format format = CU_AD_FORMAT_UNSIGNED_INT8; + CUfilter_mode mode = CU_TR_FILTER_MODE_POINT; + gint pstride = 1; + gint bitdepth = 8; + gint i; + + gpointer kernel_args[] = { &yuv_texture[0], &yuv_texture[1], &yuv_texture[2], + &dst_ptr[0], &dst_ptr[1], &dst_ptr[2], &dst_stride + }; + + /* conversion step + * STEP 1: unpack src RGB into ARGB or ARGB64 format + * STEP 2: convert unpacked ARGB (or ARGB64) to Y444 (or Y444_16LE) + * STEP 3: convert Y444 (or Y444_16LE) to final YUV format. 
+ * resizing, bitdepth conversion, uv reordering will be performed in + * the CUDA kernel function + */ + if (!convert_UNPACK_RGB (convert, convert->kernel_func[0], cuda_stream, + src, in_info, convert->unpack_surface.device_ptr, + convert->unpack_surface.cuda_stride, &convert->in_rgb_order)) { + GST_ERROR ("could not unpack input rgb"); + + goto done; + } + + in_width = GST_VIDEO_INFO_WIDTH (in_info); + in_height = GST_VIDEO_INFO_HEIGHT (in_info); + + out_width = GST_VIDEO_INFO_WIDTH (out_info); + out_height = GST_VIDEO_INFO_HEIGHT (out_info); + dst_stride = dst->stride; + + if (GST_VIDEO_INFO_COMP_DEPTH (in_info, 0) > 8) { + pstride = 2; + bitdepth = 16; + format = CU_AD_FORMAT_UNSIGNED_INT16; + } + + texture = + convert_create_texture_unchecked (convert->unpack_surface.device_ptr, + in_width, in_height, 4, convert->unpack_surface.cuda_stride, format, + mode, cuda_stream); + + if (!texture) { + GST_ERROR ("could not create texture"); + goto done; + } + + if (!convert_TO_Y444 (convert, convert->kernel_func[1], cuda_stream, texture, + convert->y444_surface[0].device_ptr, + convert->y444_surface[0].cuda_stride, + convert->y444_surface[1].device_ptr, + convert->y444_surface[1].cuda_stride, + convert->y444_surface[2].device_ptr, + convert->y444_surface[2].cuda_stride, in_width, in_height, pstride, + bitdepth)) { + GST_ERROR ("could not convert to Y444 or Y444_16LE"); + goto done; + } + + /* Use h/w linear interpolation only when resize is required. 
+ * Otherwise the image might be blurred */ + if (convert->keep_size) + mode = CU_TR_FILTER_MODE_POINT; + else + mode = CU_TR_FILTER_MODE_LINEAR; + + for (i = 0; i < 3; i++) { + yuv_texture[i] = + convert_create_texture_unchecked (convert->y444_surface[i].device_ptr, + in_width, in_height, 1, convert->y444_surface[i].cuda_stride, format, + mode, cuda_stream); + + if (!yuv_texture[i]) { + GST_ERROR ("could not create %dth yuv texture", i); + goto done; + } + } + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (out_info); i++) + dst_ptr[i] = dst->data + dst->offset[i]; + + cuda_ret = + CuLaunchKernel (convert->kernel_func[2], DIV_UP (out_width, CUDA_BLOCK_X), + DIV_UP (out_height, CUDA_BLOCK_Y), 1, CUDA_BLOCK_X, CUDA_BLOCK_Y, 1, 0, + cuda_stream, kernel_args, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("could not rescale plane"); + goto done; + } + + ret = TRUE; + gst_cuda_result (CuStreamSynchronize (cuda_stream)); + +done: + if (texture) + gst_cuda_result (CuTexObjectDestroy (texture)); + for (i = 0; i < 3; i++) { + if (yuv_texture[i]) + gst_cuda_result (CuTexObjectDestroy (yuv_texture[i])); + } + + return ret; +} + +/* main conversion function for RGB to RGB conversion */ +static gboolean +convert_RGB_TO_RGB (GstCudaConverter * convert, + const GstCudaMemory * src, GstVideoInfo * in_info, GstCudaMemory * dst, + GstVideoInfo * out_info, CUstream cuda_stream) +{ + CUtexObject texture = 0; + CUresult cuda_ret; + gboolean ret = FALSE; + CUdeviceptr dstRGB = 0; + gint in_width, in_height; + gint out_width, out_height; + gint dst_stride; + CUfilter_mode mode; + CUarray_format format = CU_AD_FORMAT_UNSIGNED_INT8; + + gpointer rescale_kernel_args[] = { &texture, &dstRGB, &dst_stride }; + + /* conversion step + * STEP 1: unpack src RGB into ARGB or ARGB64 format + * STEP 2: convert ARGB (or ARGB64) to final RGB format. 
+ * resizing, bitdepth conversion, argb reordering will be performed in + * the CUDA kernel function + */ + + if (!convert_UNPACK_RGB (convert, convert->kernel_func[0], cuda_stream, + src, in_info, convert->unpack_surface.device_ptr, + convert->unpack_surface.cuda_stride, &convert->in_rgb_order)) { + GST_ERROR ("could not unpack input rgb"); + + goto done; + } + + in_width = GST_VIDEO_INFO_WIDTH (in_info); + in_height = GST_VIDEO_INFO_HEIGHT (in_info); + + out_width = GST_VIDEO_INFO_WIDTH (out_info); + out_height = GST_VIDEO_INFO_HEIGHT (out_info); + + dstRGB = dst->data; + dst_stride = dst->stride; + + if (GST_VIDEO_INFO_COMP_DEPTH (in_info, 0) > 8) + format = CU_AD_FORMAT_UNSIGNED_INT16; + + /* Use h/w linear interpolation only when resize is required. + * Otherwise the image might be blurred */ + if (convert->keep_size) + mode = CU_TR_FILTER_MODE_POINT; + else + mode = CU_TR_FILTER_MODE_LINEAR; + + texture = + convert_create_texture_unchecked (convert->unpack_surface.device_ptr, + in_width, in_height, 4, convert->unpack_surface.cuda_stride, format, + mode, cuda_stream); + + if (!texture) { + GST_ERROR ("could not create texture"); + goto done; + } + + cuda_ret = + CuLaunchKernel (convert->kernel_func[1], DIV_UP (out_width, CUDA_BLOCK_X), + DIV_UP (out_height, CUDA_BLOCK_Y), 1, CUDA_BLOCK_X, CUDA_BLOCK_Y, 1, 0, + cuda_stream, rescale_kernel_args, NULL); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("could not rescale plane"); + goto done; + } + + ret = TRUE; + gst_cuda_result (CuStreamSynchronize (cuda_stream)); + +done: + if (texture) + gst_cuda_result (CuTexObjectDestroy (texture)); + + return ret; +} + +/* from video-converter.c */ +typedef struct +{ + gdouble dm[4][4]; +} MatrixData; + +static void +color_matrix_set_identity (MatrixData * m) +{ + gint i, j; + + for (i = 0; i < 4; i++) { + for (j = 0; j < 4; j++) { + m->dm[i][j] = (i == j); + } + } +} + +static void +color_matrix_copy (MatrixData * d, const MatrixData * s) +{ + gint i, j; + + for (i = 0; 
i < 4; i++) + for (j = 0; j < 4; j++) + d->dm[i][j] = s->dm[i][j]; +} + +/* Perform 4x4 matrix multiplication: + * - @dst@ = @a@ * @b@ + * - @dst@ may be a pointer to @a@ andor @b@ + */ +static void +color_matrix_multiply (MatrixData * dst, MatrixData * a, MatrixData * b) +{ + MatrixData tmp; + gint i, j, k; + + for (i = 0; i < 4; i++) { + for (j = 0; j < 4; j++) { + gdouble x = 0; + for (k = 0; k < 4; k++) { + x += a->dm[i][k] * b->dm[k][j]; + } + tmp.dm[i][j] = x; + } + } + color_matrix_copy (dst, &tmp); +} + +static void +color_matrix_offset_components (MatrixData * m, gdouble a1, gdouble a2, + gdouble a3) +{ + MatrixData a; + + color_matrix_set_identity (&a); + a.dm[0][3] = a1; + a.dm[1][3] = a2; + a.dm[2][3] = a3; + color_matrix_multiply (m, &a, m); +} + +static void +color_matrix_scale_components (MatrixData * m, gdouble a1, gdouble a2, + gdouble a3) +{ + MatrixData a; + + color_matrix_set_identity (&a); + a.dm[0][0] = a1; + a.dm[1][1] = a2; + a.dm[2][2] = a3; + color_matrix_multiply (m, &a, m); +} + +static void +color_matrix_debug (const MatrixData * s) +{ + GST_DEBUG ("[%f %f %f %f]", s->dm[0][0], s->dm[0][1], s->dm[0][2], + s->dm[0][3]); + GST_DEBUG ("[%f %f %f %f]", s->dm[1][0], s->dm[1][1], s->dm[1][2], + s->dm[1][3]); + GST_DEBUG ("[%f %f %f %f]", s->dm[2][0], s->dm[2][1], s->dm[2][2], + s->dm[2][3]); + GST_DEBUG ("[%f %f %f %f]", s->dm[3][0], s->dm[3][1], s->dm[3][2], + s->dm[3][3]); +} + +static void +color_matrix_YCbCr_to_RGB (MatrixData * m, gdouble Kr, gdouble Kb) +{ + gdouble Kg = 1.0 - Kr - Kb; + MatrixData k = { + { + {1., 0., 2 * (1 - Kr), 0.}, + {1., -2 * Kb * (1 - Kb) / Kg, -2 * Kr * (1 - Kr) / Kg, 0.}, + {1., 2 * (1 - Kb), 0., 0.}, + {0., 0., 0., 1.}, + } + }; + + color_matrix_multiply (m, &k, m); +} + +static void +color_matrix_RGB_to_YCbCr (MatrixData * m, gdouble Kr, gdouble Kb) +{ + gdouble Kg = 1.0 - Kr - Kb; + MatrixData k; + gdouble x; + + k.dm[0][0] = Kr; + k.dm[0][1] = Kg; + k.dm[0][2] = Kb; + k.dm[0][3] = 0; + + x = 1 / (2 * (1 - 
Kb)); + k.dm[1][0] = -x * Kr; + k.dm[1][1] = -x * Kg; + k.dm[1][2] = x * (1 - Kb); + k.dm[1][3] = 0; + + x = 1 / (2 * (1 - Kr)); + k.dm[2][0] = x * (1 - Kr); + k.dm[2][1] = -x * Kg; + k.dm[2][2] = -x * Kb; + k.dm[2][3] = 0; + + k.dm[3][0] = 0; + k.dm[3][1] = 0; + k.dm[3][2] = 0; + k.dm[3][3] = 1; + + color_matrix_multiply (m, &k, m); +} + +static void +compute_matrix_to_RGB (GstCudaConverter * convert, MatrixData * data, + GstVideoInfo * info) +{ + gdouble Kr = 0, Kb = 0; + gint offset[4], scale[4]; + + /* bring color components to [0..1.0] range */ + gst_video_color_range_offsets (info->colorimetry.range, info->finfo, offset, + scale); + + color_matrix_offset_components (data, -offset[0], -offset[1], -offset[2]); + color_matrix_scale_components (data, 1 / ((float) scale[0]), + 1 / ((float) scale[1]), 1 / ((float) scale[2])); + + if (!GST_VIDEO_INFO_IS_RGB (info)) { + /* bring components to R'G'B' space */ + if (gst_video_color_matrix_get_Kr_Kb (info->colorimetry.matrix, &Kr, &Kb)) + color_matrix_YCbCr_to_RGB (data, Kr, Kb); + } + color_matrix_debug (data); +} + +static void +compute_matrix_to_YUV (GstCudaConverter * convert, MatrixData * data, + GstVideoInfo * info) +{ + gdouble Kr = 0, Kb = 0; + gint offset[4], scale[4]; + + if (!GST_VIDEO_INFO_IS_RGB (info)) { + /* bring components to YCbCr space */ + if (gst_video_color_matrix_get_Kr_Kb (info->colorimetry.matrix, &Kr, &Kb)) + color_matrix_RGB_to_YCbCr (data, Kr, Kb); + } + + /* bring color components to nominal range */ + gst_video_color_range_offsets (info->colorimetry.range, info->finfo, offset, + scale); + + color_matrix_scale_components (data, (float) scale[0], (float) scale[1], + (float) scale[2]); + color_matrix_offset_components (data, offset[0], offset[1], offset[2]); + + color_matrix_debug (data); +} + +static gboolean +cuda_converter_get_matrix (GstCudaConverter * convert, MatrixData * matrix, + GstVideoInfo * in_info, GstVideoInfo * out_info) +{ + gboolean same_matrix, same_bits; + guint in_bits, 
out_bits; + + in_bits = GST_VIDEO_INFO_COMP_DEPTH (in_info, 0); + out_bits = GST_VIDEO_INFO_COMP_DEPTH (out_info, 0); + + same_bits = in_bits == out_bits; + same_matrix = in_info->colorimetry.matrix == out_info->colorimetry.matrix; + + GST_DEBUG ("matrix %d -> %d (%d)", in_info->colorimetry.matrix, + out_info->colorimetry.matrix, same_matrix); + GST_DEBUG ("bits %d -> %d (%d)", in_bits, out_bits, same_bits); + + color_matrix_set_identity (matrix); + + if (same_bits && same_matrix) { + GST_DEBUG ("conversion matrix is not required"); + + return FALSE; + } + + if (in_bits < out_bits) { + gint scale = 1 << (out_bits - in_bits); + color_matrix_scale_components (matrix, + 1 / (float) scale, 1 / (float) scale, 1 / (float) scale); + } + + GST_DEBUG ("to RGB matrix"); + compute_matrix_to_RGB (convert, matrix, in_info); + GST_DEBUG ("current matrix"); + color_matrix_debug (matrix); + + GST_DEBUG ("to YUV matrix"); + compute_matrix_to_YUV (convert, matrix, out_info); + GST_DEBUG ("current matrix"); + color_matrix_debug (matrix); + + if (in_bits > out_bits) { + gint scale = 1 << (in_bits - out_bits); + color_matrix_scale_components (matrix, + (float) scale, (float) scale, (float) scale); + } + + GST_DEBUG ("final matrix"); + color_matrix_debug (matrix); + + return TRUE; +} + +static gboolean +is_uv_swapped (GstVideoFormat format) +{ + static GstVideoFormat swapped_formats[] = { + GST_VIDEO_FORMAT_YV12, + GST_VIDEO_FORMAT_NV21, + }; + gint i; + + for (i = 0; i < G_N_ELEMENTS (swapped_formats); i++) { + if (format == swapped_formats[i]) + return TRUE; + } + + return FALSE; +} + +typedef struct +{ + const gchar *read_chroma; + const gchar *write_chroma; + const gchar *unpack_function; + gfloat scale_h, scale_v; + gfloat chroma_scale_h, chroma_scale_v; + gint width, height; + gint chroma_width, chroma_height; + gint in_depth; + gint out_depth; + gint pstride, chroma_pstride; + gint in_shift, out_shift; + gint mask; + gint swap_uv; + /* RGBA specific variables */ + gint 
max_in_val; + GstCudaRGBOrder rgb_order; +} GstCudaKernelTempl; + +static gchar * +cuda_converter_generate_yuv_to_yuv_kernel_code (GstCudaConverter * convert, + GstCudaKernelTempl * templ) +{ + return g_strdup_printf (templ_YUV_TO_YUV, + templ->scale_h, templ->scale_v, templ->chroma_scale_h, + templ->chroma_scale_v, templ->width, templ->height, templ->chroma_width, + templ->chroma_height, templ->in_depth, templ->out_depth, templ->pstride, + templ->chroma_pstride, templ->in_shift, templ->out_shift, templ->mask, + templ->swap_uv, templ->read_chroma, templ->write_chroma); +} + +static gchar * +cuda_converter_generate_yuv_to_rgb_kernel_code (GstCudaConverter * convert, + GstCudaKernelTempl * templ, MatrixData * matrix) +{ + return g_strdup_printf (templ_YUV_TO_RGB, + matrix->dm[0][3], matrix->dm[1][3], matrix->dm[2][3], + matrix->dm[0][0], matrix->dm[0][1], matrix->dm[0][2], + matrix->dm[1][0], matrix->dm[1][1], matrix->dm[1][2], + matrix->dm[2][0], matrix->dm[2][1], matrix->dm[2][2], + templ->scale_h, templ->scale_v, templ->chroma_scale_h, + templ->chroma_scale_v, templ->width, templ->height, templ->chroma_width, + templ->chroma_height, templ->in_depth, templ->out_depth, templ->pstride, + templ->chroma_pstride, templ->in_shift, templ->out_shift, templ->mask, + templ->swap_uv, templ->max_in_val, templ->rgb_order.R, + templ->rgb_order.G, templ->rgb_order.B, templ->rgb_order.A, + templ->rgb_order.X, templ->read_chroma); +} + +static gchar * +cuda_converter_generate_rgb_to_yuv_kernel_code (GstCudaConverter * convert, + GstCudaKernelTempl * templ, MatrixData * matrix) +{ + return g_strdup_printf (templ_RGB_TO_YUV, + matrix->dm[0][3], matrix->dm[1][3], matrix->dm[2][3], + matrix->dm[0][0], matrix->dm[0][1], matrix->dm[0][2], + matrix->dm[1][0], matrix->dm[1][1], matrix->dm[1][2], + matrix->dm[2][0], matrix->dm[2][1], matrix->dm[2][2], + templ->scale_h, templ->scale_v, templ->chroma_scale_h, + templ->chroma_scale_v, templ->width, templ->height, templ->chroma_width, + 
templ->chroma_height, templ->in_depth, templ->out_depth, templ->pstride, + templ->chroma_pstride, templ->in_shift, templ->out_shift, templ->mask, + templ->swap_uv, templ->unpack_function, templ->read_chroma, + templ->write_chroma); +} + +static gchar * +cuda_converter_generate_rgb_to_rgb_kernel_code (GstCudaConverter * convert, + GstCudaKernelTempl * templ) +{ + return g_strdup_printf (templ_RGB_to_RGB, + templ->scale_h, templ->scale_v, + templ->width, templ->height, + templ->in_depth, templ->out_depth, templ->pstride, + templ->rgb_order.R, templ->rgb_order.G, + templ->rgb_order.B, templ->rgb_order.A, templ->rgb_order.X, + templ->unpack_function); +} + +#define SET_ORDER(o,r,g,b,a,x) G_STMT_START { \ + (o)->R = (r); \ + (o)->G = (g); \ + (o)->B = (b); \ + (o)->A = (a); \ + (o)->X = (x); \ +} G_STMT_END + +static void +cuda_converter_get_rgb_order (GstVideoFormat format, GstCudaRGBOrder * order) +{ + switch (format) { + case GST_VIDEO_FORMAT_RGBA: + SET_ORDER (order, 0, 1, 2, 3, -1); + break; + case GST_VIDEO_FORMAT_RGBx: + SET_ORDER (order, 0, 1, 2, -1, 3); + break; + case GST_VIDEO_FORMAT_BGRA: + SET_ORDER (order, 2, 1, 0, 3, -1); + break; + case GST_VIDEO_FORMAT_BGRx: + SET_ORDER (order, 2, 1, 0, -1, 3); + break; + case GST_VIDEO_FORMAT_ARGB: + SET_ORDER (order, 1, 2, 3, 0, -1); + break; + case GST_VIDEO_FORMAT_ABGR: + SET_ORDER (order, 3, 2, 1, 0, -1); + break; + case GST_VIDEO_FORMAT_RGB: + SET_ORDER (order, 0, 1, 2, -1, -1); + break; + case GST_VIDEO_FORMAT_BGR: + SET_ORDER (order, 2, 1, 0, -1, -1); + break; + case GST_VIDEO_FORMAT_BGR10A2_LE: + SET_ORDER (order, 1, 2, 3, 0, -1); + break; + case GST_VIDEO_FORMAT_RGB10A2_LE: + SET_ORDER (order, 3, 2, 1, 0, -1); + break; + default: + g_assert_not_reached (); + break; + } +} + +static gboolean +cuda_converter_lookup_path (GstCudaConverter * convert) +{ + GstVideoFormat in_format, out_format; + gboolean src_yuv, dst_yuv; + gboolean src_planar, dst_planar; + GstCudaKernelTempl templ = { 0, }; + GstVideoInfo 
*in_info, *out_info; + gboolean ret = FALSE; + CUresult cuda_ret; + + in_info = &convert->in_info; + out_info = &convert->out_info; + + in_format = GST_VIDEO_INFO_FORMAT (in_info); + out_format = GST_VIDEO_INFO_FORMAT (out_info); + + src_yuv = GST_VIDEO_INFO_IS_YUV (in_info); + dst_yuv = GST_VIDEO_INFO_IS_YUV (out_info); + + src_planar = GST_VIDEO_INFO_N_PLANES (in_info) == + GST_VIDEO_INFO_N_COMPONENTS (in_info); + dst_planar = GST_VIDEO_INFO_N_PLANES (out_info) == + GST_VIDEO_INFO_N_COMPONENTS (out_info); + + convert->keep_size = (GST_VIDEO_INFO_WIDTH (&convert->in_info) == + GST_VIDEO_INFO_WIDTH (&convert->out_info) && + GST_VIDEO_INFO_HEIGHT (&convert->in_info) == + GST_VIDEO_INFO_HEIGHT (&convert->out_info)); + + templ.scale_h = (gfloat) GST_VIDEO_INFO_COMP_WIDTH (in_info, 0) / + (gfloat) GST_VIDEO_INFO_COMP_WIDTH (out_info, 0); + templ.scale_v = (gfloat) GST_VIDEO_INFO_COMP_HEIGHT (in_info, 0) / + (gfloat) GST_VIDEO_INFO_COMP_HEIGHT (out_info, 0); + templ.chroma_scale_h = (gfloat) GST_VIDEO_INFO_COMP_WIDTH (in_info, 1) / + (gfloat) GST_VIDEO_INFO_COMP_WIDTH (out_info, 1); + templ.chroma_scale_v = (gfloat) GST_VIDEO_INFO_COMP_HEIGHT (in_info, 1) / + (gfloat) GST_VIDEO_INFO_COMP_HEIGHT (out_info, 1); + templ.width = GST_VIDEO_INFO_COMP_WIDTH (out_info, 0); + templ.height = GST_VIDEO_INFO_COMP_HEIGHT (out_info, 0); + templ.chroma_width = GST_VIDEO_INFO_COMP_WIDTH (out_info, 1); + templ.chroma_height = GST_VIDEO_INFO_COMP_HEIGHT (out_info, 1); + + templ.in_depth = GST_VIDEO_INFO_COMP_DEPTH (in_info, 0); + templ.out_depth = GST_VIDEO_INFO_COMP_DEPTH (out_info, 0); + templ.pstride = GST_VIDEO_INFO_COMP_PSTRIDE (out_info, 0); + templ.chroma_pstride = GST_VIDEO_INFO_COMP_PSTRIDE (out_info, 1); + templ.in_shift = in_info->finfo->shift[0]; + templ.out_shift = out_info->finfo->shift[0]; + templ.mask = ((1 << templ.out_depth) - 1) << templ.out_shift; + templ.swap_uv = (is_uv_swapped (in_format) != is_uv_swapped (out_format)); + + if (src_yuv && dst_yuv) { + 
convert->convert = convert_YUV_TO_YUV; + + if (src_planar && dst_planar) { + templ.read_chroma = READ_CHROMA_FROM_PLANAR; + templ.write_chroma = WRITE_CHROMA_TO_PLANAR; + } else if (!src_planar && dst_planar) { + templ.read_chroma = READ_CHROMA_FROM_SEMI_PLANAR; + templ.write_chroma = WRITE_CHROMA_TO_PLANAR; + } else if (src_planar && !dst_planar) { + templ.read_chroma = READ_CHROMA_FROM_PLANAR; + templ.write_chroma = WRITE_CHROMA_TO_SEMI_PLANAR; + } else { + templ.read_chroma = READ_CHROMA_FROM_SEMI_PLANAR; + templ.write_chroma = WRITE_CHROMA_TO_SEMI_PLANAR; + } + + convert->kernel_source = + cuda_converter_generate_yuv_to_yuv_kernel_code (convert, &templ); + convert->func_names[0] = GST_CUDA_KERNEL_FUNC; + + ret = TRUE; + } else if (src_yuv && !dst_yuv) { + MatrixData matrix; + + if (src_planar) { + templ.read_chroma = READ_CHROMA_FROM_PLANAR; + } else { + templ.read_chroma = READ_CHROMA_FROM_SEMI_PLANAR; + } + + templ.max_in_val = (1 << templ.in_depth) - 1; + cuda_converter_get_rgb_order (out_format, &templ.rgb_order); + + cuda_converter_get_matrix (convert, &matrix, in_info, out_info); + convert->kernel_source = + cuda_converter_generate_yuv_to_rgb_kernel_code (convert, + &templ, &matrix); + convert->func_names[0] = GST_CUDA_KERNEL_FUNC; + + convert->convert = convert_YUV_TO_RGB; + + ret = TRUE; + } else if (!src_yuv && dst_yuv) { + MatrixData matrix; + gsize element_size = 8; + GstVideoFormat unpack_format; + GstVideoFormat y444_format; + GstVideoInfo unpack_info; + GstVideoInfo y444_info; + gint i; + + if (dst_planar) { + templ.write_chroma = WRITE_CHROMA_TO_PLANAR; + } else { + templ.write_chroma = WRITE_CHROMA_TO_SEMI_PLANAR; + } + templ.read_chroma = READ_CHROMA_FROM_PLANAR; + + cuda_converter_get_rgb_order (in_format, &convert->in_rgb_order); + + if (templ.in_depth > 8) { + /* FIXME: RGB10A2_LE and BGR10A2_LE only */ + element_size = 16; + unpack_format = GST_VIDEO_FORMAT_ARGB64; + y444_format = GST_VIDEO_FORMAT_Y444_16LE; + templ.unpack_function = 
unpack_to_ARGB64; + } else { + unpack_format = GST_VIDEO_FORMAT_ARGB; + y444_format = GST_VIDEO_FORMAT_Y444; + templ.unpack_function = unpack_to_ARGB; + } + + gst_video_info_set_format (&unpack_info, + unpack_format, GST_VIDEO_INFO_WIDTH (in_info), + GST_VIDEO_INFO_HEIGHT (in_info)); + gst_video_info_set_format (&y444_info, + y444_format, GST_VIDEO_INFO_WIDTH (in_info), + GST_VIDEO_INFO_HEIGHT (in_info)); + + templ.in_depth = GST_VIDEO_INFO_COMP_DEPTH (&unpack_info, 0); + + cuda_ret = CuMemAllocPitch (&convert->unpack_surface.device_ptr, + &convert->unpack_surface.cuda_stride, + GST_VIDEO_INFO_COMP_WIDTH (&unpack_info, 0) * + GST_VIDEO_INFO_COMP_PSTRIDE (&unpack_info, 0), + GST_VIDEO_INFO_HEIGHT (&unpack_info), element_size); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("couldn't alloc unpack surface"); + return FALSE; + } + + for (i = 0; i < 3; i++) { + cuda_ret = CuMemAllocPitch (&convert->y444_surface[i].device_ptr, + &convert->y444_surface[i].cuda_stride, + GST_VIDEO_INFO_COMP_WIDTH (&y444_info, i) * + GST_VIDEO_INFO_COMP_PSTRIDE (&y444_info, i), + GST_VIDEO_INFO_COMP_HEIGHT (&y444_info, i), element_size); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("couldn't alloc %dth y444 surface", i); + return FALSE; + } + } + + cuda_converter_get_matrix (convert, &matrix, &unpack_info, &y444_info); + + convert->kernel_source = + cuda_converter_generate_rgb_to_yuv_kernel_code (convert, + &templ, &matrix); + + convert->func_names[0] = GST_CUDA_KERNEL_FUNC_TO_ARGB; + convert->func_names[1] = GST_CUDA_KERNEL_FUNC_TO_Y444; + convert->func_names[2] = GST_CUDA_KERNEL_FUNC_Y444_TO_YUV; + + convert->convert = convert_RGB_TO_YUV; + + ret = TRUE; + } else { + gsize element_size = 8; + GstVideoFormat unpack_format; + GstVideoInfo unpack_info; + + cuda_converter_get_rgb_order (in_format, &convert->in_rgb_order); + cuda_converter_get_rgb_order (out_format, &templ.rgb_order); + + if (templ.in_depth > 8) { + /* FIXME: RGB10A2_LE and BGR10A2_LE only */ + element_size = 16; + 
unpack_format = GST_VIDEO_FORMAT_ARGB64; + templ.unpack_function = unpack_to_ARGB64; + } else { + unpack_format = GST_VIDEO_FORMAT_ARGB; + templ.unpack_function = unpack_to_ARGB; + } + + gst_video_info_set_format (&unpack_info, + unpack_format, GST_VIDEO_INFO_WIDTH (in_info), + GST_VIDEO_INFO_HEIGHT (in_info)); + + templ.in_depth = GST_VIDEO_INFO_COMP_DEPTH (&unpack_info, 0); + + cuda_ret = CuMemAllocPitch (&convert->unpack_surface.device_ptr, + &convert->unpack_surface.cuda_stride, + GST_VIDEO_INFO_COMP_WIDTH (&unpack_info, 0) * + GST_VIDEO_INFO_COMP_PSTRIDE (&unpack_info, 0), + GST_VIDEO_INFO_HEIGHT (&unpack_info), element_size); + + if (!gst_cuda_result (cuda_ret)) { + GST_ERROR ("couldn't alloc unpack surface"); + return FALSE; + } + + convert->kernel_source = + cuda_converter_generate_rgb_to_rgb_kernel_code (convert, &templ); + + convert->func_names[0] = GST_CUDA_KERNEL_FUNC_TO_ARGB; + convert->func_names[1] = GST_CUDA_KERNEL_FUNC_SCALE_RGB; + + convert->convert = convert_RGB_TO_RGB; + + ret = TRUE; + } + + if (!ret) { + GST_DEBUG ("no path found"); + + return FALSE; + } + + GST_TRACE ("configured CUDA kernel source\n%s", convert->kernel_source); + + return TRUE; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/cuda-converter.h
Added
@@ -0,0 +1,58 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CUDA_CONVERTER_H__ +#define __GST_CUDA_CONVERTER_H__ + +#include <gst/video/video.h> +#include "gstcudacontext.h" +#include "gstcudamemory.h" + +G_BEGIN_DECLS + +typedef struct _GstCudaConverter GstCudaConverter; + +#define GST_CUDA_CONVERTER_FORMATS \ + "{ I420, YV12, NV12, NV21, P010_10LE, P016_LE, I420_10LE, Y444, Y444_16LE, " \ + "BGRA, RGBA, RGBx, BGRx, ARGB, ABGR, RGB, BGR, BGR10A2_LE, RGB10A2_LE }" + +GstCudaConverter * gst_cuda_converter_new (GstVideoInfo * in_info, + GstVideoInfo * out_info, + GstCudaContext * cuda_ctx); + +void gst_cuda_converter_free (GstCudaConverter * convert); + +gboolean gst_cuda_converter_frame (GstCudaConverter * convert, + const GstCudaMemory * src, + GstVideoInfo * in_info, + GstCudaMemory * dst, + GstVideoInfo * out_info, + CUstream cuda_stream); + +gboolean gst_cuda_converter_frame_unlocked (GstCudaConverter * convert, + const GstCudaMemory * src, + GstVideoInfo * in_info, + GstCudaMemory * dst, + GstVideoInfo * out_info, + CUstream cuda_stream); + + +G_END_DECLS + +#endif /* __GST_CUDA_CONVERTER_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudabasefilter.c
Added
@@ -0,0 +1,321 @@ +/* GStreamer + * Copyright (C) <1999> Erik Walthinsen <omega@cse.ogi.edu> + * Copyright (C) 2005-2012 David Schleef <ds@schleef.org> + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * GstCudaBaseFilter: + * + * Base class for CUDA filters + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstcudabasefilter.h" +#include "gstcudautils.h" +#include <string.h> + +GST_DEBUG_CATEGORY_STATIC (gst_cuda_base_filter_debug); +#define GST_CAT_DEFAULT gst_cuda_base_filter_debug + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY, GST_CUDA_CONVERTER_FORMATS)) + ); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE_WITH_FEATURES + (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY, GST_CUDA_CONVERTER_FORMATS)) + ); + +#define gst_cuda_base_filter_parent_class parent_class +G_DEFINE_ABSTRACT_TYPE (GstCudaBaseFilter, + gst_cuda_base_filter, GST_TYPE_CUDA_BASE_TRANSFORM); + +static void gst_cuda_base_filter_dispose 
(GObject * object); +static GstFlowReturn +gst_cuda_base_filter_transform_frame (GstCudaBaseTransform * btrans, + GstVideoFrame * in_frame, GstCudaMemory * in_cuda_mem, + GstVideoFrame * out_frame, GstCudaMemory * out_cuda_mem); +static gboolean gst_cuda_base_filter_set_info (GstCudaBaseTransform * btrans, + GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps, + GstVideoInfo * out_info); + +static void +gst_cuda_base_filter_class_init (GstCudaBaseFilterClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + GstCudaBaseTransformClass *btrans_class = + GST_CUDA_BASE_TRANSFORM_CLASS (klass); + + gobject_class->dispose = gst_cuda_base_filter_dispose; + + gst_element_class_add_static_pad_template (element_class, &sink_template); + gst_element_class_add_static_pad_template (element_class, &src_template); + + trans_class->passthrough_on_same_caps = TRUE; + + btrans_class->set_info = GST_DEBUG_FUNCPTR (gst_cuda_base_filter_set_info); + btrans_class->transform_frame = + GST_DEBUG_FUNCPTR (gst_cuda_base_filter_transform_frame); + + GST_DEBUG_CATEGORY_INIT (gst_cuda_base_filter_debug, + "cudabasefilter", 0, "CUDA Base Filter"); +} + +static void +gst_cuda_base_filter_init (GstCudaBaseFilter * convert) +{ +} + +static void +gst_cuda_base_filter_dispose (GObject * object) +{ + GstCudaBaseFilter *filter = GST_CUDA_BASE_FILTER (object); + + if (filter->converter) { + gst_cuda_converter_free (filter->converter); + filter->converter = NULL; + } + + if (filter->in_fallback) { + gst_memory_unref (GST_MEMORY_CAST (filter->in_fallback)); + filter->in_fallback = NULL; + } + + if (filter->out_fallback) { + gst_memory_unref (GST_MEMORY_CAST (filter->out_fallback)); + filter->out_fallback = NULL; + } + + gst_clear_object (&filter->allocator); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static gboolean 
+gst_cuda_base_filter_configure (GstCudaBaseFilter * filter, + GstVideoInfo * in_info, GstVideoInfo * out_info) +{ + GstCudaBaseTransform *btrans = GST_CUDA_BASE_TRANSFORM (filter); + + /* cleanup internal pool */ + if (filter->in_fallback) { + gst_memory_unref (GST_MEMORY_CAST (filter->in_fallback)); + filter->in_fallback = NULL; + } + + if (filter->out_fallback) { + gst_memory_unref (GST_MEMORY_CAST (filter->out_fallback)); + filter->out_fallback = NULL; + } + + if (!filter->allocator) + filter->allocator = gst_cuda_allocator_new (btrans->context); + + if (!filter->allocator) { + GST_ERROR_OBJECT (filter, "Failed to create CUDA allocator"); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_cuda_base_filter_set_info (GstCudaBaseTransform * btrans, GstCaps * incaps, + GstVideoInfo * in_info, GstCaps * outcaps, GstVideoInfo * out_info) +{ + GstCudaBaseFilter *filter = GST_CUDA_BASE_FILTER (btrans); + + if (!gst_cuda_base_filter_configure (filter, in_info, out_info)) { + return FALSE; + } + + if (filter->converter) + gst_cuda_converter_free (filter->converter); + + filter->converter = + gst_cuda_converter_new (in_info, out_info, btrans->context); + + if (filter->converter == NULL) + goto no_converter; + + GST_DEBUG_OBJECT (filter, "reconfigured %d %d", + GST_VIDEO_INFO_FORMAT (in_info), GST_VIDEO_INFO_FORMAT (out_info)); + + return TRUE; + +no_converter: + { + GST_ERROR_OBJECT (filter, "could not create converter"); + return FALSE; + } +} + +static GstFlowReturn +gst_cuda_base_filter_transform_frame (GstCudaBaseTransform * btrans, + GstVideoFrame * in_frame, GstCudaMemory * in_cuda_mem, + GstVideoFrame * out_frame, GstCudaMemory * out_cuda_mem) +{ + GstCudaBaseFilter *filter = GST_CUDA_BASE_FILTER (btrans); + gboolean conv_ret; + GstCudaMemory *in_mem; + GstCudaMemory *out_mem; + gint i; + + if (in_cuda_mem) { + in_mem = in_cuda_mem; + } else { + if (!filter->in_fallback) { + GstCudaAllocationParams params; + + memset (¶ms, 0, sizeof 
(GstCudaAllocationParams)); + params.info = btrans->in_info; + + filter->in_fallback = + (GstCudaMemory *) gst_cuda_allocator_alloc (filter->allocator, + GST_VIDEO_INFO_SIZE (¶ms.info), ¶ms); + } + + if (!filter->in_fallback) { + GST_ERROR_OBJECT (filter, "Couldn't allocate fallback memory"); + return GST_FLOW_ERROR; + } + + GST_TRACE_OBJECT (filter, "use CUDA fallback memory input"); + + if (!gst_cuda_context_push (btrans->context)) { + GST_ELEMENT_ERROR (filter, LIBRARY, FAILED, (NULL), + ("Cannot push CUDA context")); + return FALSE; + } + + /* upload frame to device memory */ + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (in_frame); i++) { + CUDA_MEMCPY2D param = { 0, }; + guint width, height; + + width = GST_VIDEO_FRAME_COMP_WIDTH (in_frame, i) * + GST_VIDEO_FRAME_COMP_PSTRIDE (in_frame, i); + height = GST_VIDEO_FRAME_COMP_HEIGHT (in_frame, i); + + param.srcMemoryType = CU_MEMORYTYPE_HOST; + param.srcPitch = GST_VIDEO_FRAME_PLANE_STRIDE (in_frame, i); + param.srcHost = GST_VIDEO_FRAME_PLANE_DATA (in_frame, i); + param.dstMemoryType = CU_MEMORYTYPE_DEVICE; + param.dstPitch = filter->in_fallback->stride; + param.dstDevice = + filter->in_fallback->data + filter->in_fallback->offset[i]; + param.WidthInBytes = width; + param.Height = height; + + if (!gst_cuda_result (CuMemcpy2DAsync (¶m, btrans->cuda_stream))) { + gst_cuda_context_pop (NULL); + GST_ELEMENT_ERROR (filter, LIBRARY, FAILED, (NULL), + ("Cannot upload input video frame")); + return GST_FLOW_ERROR; + } + } + + gst_cuda_result (CuStreamSynchronize (btrans->cuda_stream)); + gst_cuda_context_pop (NULL); + + in_mem = filter->in_fallback; + } + + if (out_cuda_mem) { + out_mem = out_cuda_mem; + } else { + if (!filter->out_fallback) { + GstCudaAllocationParams params; + + memset (¶ms, 0, sizeof (GstCudaAllocationParams)); + params.info = btrans->out_info; + + filter->out_fallback = + (GstCudaMemory *) gst_cuda_allocator_alloc (filter->allocator, + GST_VIDEO_INFO_SIZE (¶ms.info), ¶ms); + } + + if 
(!filter->out_fallback) { + GST_ERROR_OBJECT (filter, "Couldn't allocate fallback memory"); + return GST_FLOW_ERROR; + } + + out_mem = filter->out_fallback; + } + + conv_ret = + gst_cuda_converter_frame (filter->converter, in_mem, &btrans->in_info, + out_mem, &btrans->out_info, btrans->cuda_stream); + + if (!conv_ret) { + GST_ERROR_OBJECT (filter, "Failed to convert frame"); + return GST_FLOW_ERROR; + } + + if (!out_cuda_mem) { + if (!gst_cuda_context_push (btrans->context)) { + GST_ELEMENT_ERROR (filter, LIBRARY, FAILED, (NULL), + ("Cannot push CUDA context")); + return FALSE; + } + + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (out_frame); i++) { + CUDA_MEMCPY2D param = { 0, }; + guint width, height; + + width = GST_VIDEO_FRAME_COMP_WIDTH (out_frame, i) * + GST_VIDEO_FRAME_COMP_PSTRIDE (out_frame, i); + height = GST_VIDEO_FRAME_COMP_HEIGHT (out_frame, i); + + param.srcMemoryType = CU_MEMORYTYPE_DEVICE; + param.srcPitch = out_mem->stride; + param.srcDevice = + filter->out_fallback->data + filter->out_fallback->offset[i]; + param.dstMemoryType = CU_MEMORYTYPE_HOST; + param.dstPitch = GST_VIDEO_FRAME_PLANE_STRIDE (out_frame, i); + param.dstHost = GST_VIDEO_FRAME_PLANE_DATA (out_frame, i); + param.WidthInBytes = width; + param.Height = height; + + if (!gst_cuda_result (CuMemcpy2DAsync (¶m, btrans->cuda_stream))) { + gst_cuda_context_pop (NULL); + GST_ELEMENT_ERROR (filter, LIBRARY, FAILED, (NULL), + ("Cannot upload input video frame")); + return GST_FLOW_ERROR; + } + } + + gst_cuda_result (CuStreamSynchronize (btrans->cuda_stream)); + gst_cuda_context_pop (NULL); + } + + return GST_FLOW_OK; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudabasefilter.h
Added
@@ -0,0 +1,61 @@
+/* GStreamer
+ * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_CUDA_BASE_FILTER_H__
+#define __GST_CUDA_BASE_FILTER_H__
+
+#include <gst/gst.h>
+
+#include "gstcudabasetransform.h"
+#include "cuda-converter.h"
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_CUDA_BASE_FILTER (gst_cuda_base_filter_get_type())
+#define GST_CUDA_BASE_FILTER(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_BASE_FILTER,GstCudaBaseFilter))
+#define GST_CUDA_BASE_FILTER_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_BASE_FILTER,GstCudaBaseFilterClass))
+#define GST_CUDA_BASE_FILTER_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_BASE_FILTER,GstCudaBaseFilterClass))
+#define GST_IS_CUDA_BASE_FILTER(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_BASE_FILTER))
+#define GST_IS_CUDA_BASE_FILTER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_BASE_FILTER))
+
+typedef struct _GstCudaBaseFilter GstCudaBaseFilter;
+typedef struct _GstCudaBaseFilterClass GstCudaBaseFilterClass;
+
+struct _GstCudaBaseFilter
+{
+  GstCudaBaseTransform parent;
+
+  GstCudaConverter *converter;
+
+  /* fallback CUDA memory */
+  GstAllocator *allocator;
+  GstCudaMemory *in_fallback;
+  GstCudaMemory *out_fallback;
+};
+
+struct _GstCudaBaseFilterClass
+{
+  GstCudaBaseTransformClass parent_class;
+};
+
+GType gst_cuda_base_filter_get_type (void);
+
+G_END_DECLS
+
+#endif /* __GST_CUDA_BASE_FILTER_H__ */
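The `in_fallback`/`out_fallback` fields above hold staging CUDA memory that the filter allocates lazily for buffers which do not arrive as device memory, then reuses across frames. A minimal plain-C model of that lazy-staging pattern follows; the `Filter` struct and `ensure_fallback` helper are hypothetical illustrations, not part of the GStreamer API:

```c
#include <stdlib.h>

/* Hypothetical model of the filter's fallback staging memory:
 * allocated on first use, reused until a larger size is required. */
typedef struct {
    void  *fallback;      /* staging buffer, NULL until first needed */
    size_t fallback_size; /* current capacity in bytes */
} Filter;

/* Return the staging buffer, (re)allocating only when too small. */
static void *ensure_fallback (Filter *f, size_t size)
{
    if (f->fallback == NULL || f->fallback_size < size) {
        free (f->fallback);
        f->fallback = malloc (size);
        f->fallback_size = f->fallback ? size : 0;
    }
    return f->fallback;
}
```

The real code keeps one such fallback per direction (input and output) and frees both in the element's stop path.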
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudabasetransform.c
Added
@@ -0,0 +1,619 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * GstCudaBaseTransform: + * + * Base class for CUDA transformers + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstcudabasetransform.h" +#include "gstcudautils.h" + +GST_DEBUG_CATEGORY_STATIC (gst_cuda_base_transform_debug); +#define GST_CAT_DEFAULT gst_cuda_base_transform_debug + +enum +{ + PROP_0, + PROP_DEVICE_ID, +}; + +#define DEFAULT_DEVICE_ID -1 + +#define gst_cuda_base_transform_parent_class parent_class +G_DEFINE_ABSTRACT_TYPE (GstCudaBaseTransform, gst_cuda_base_transform, + GST_TYPE_BASE_TRANSFORM); + +static void gst_cuda_base_transform_set_property (GObject * object, + guint prop_id, const GValue * value, GParamSpec * pspec); +static void gst_cuda_base_transform_get_property (GObject * object, + guint prop_id, GValue * value, GParamSpec * pspec); +static void gst_cuda_base_transform_dispose (GObject * object); +static void gst_cuda_base_transform_set_context (GstElement * element, + GstContext * context); +static gboolean gst_cuda_base_transform_start (GstBaseTransform * trans); +static gboolean gst_cuda_base_transform_stop (GstBaseTransform * trans); 
+static gboolean gst_cuda_base_transform_set_caps (GstBaseTransform * trans, + GstCaps * incaps, GstCaps * outcaps); +static GstFlowReturn gst_cuda_base_transform_transform (GstBaseTransform * + trans, GstBuffer * inbuf, GstBuffer * outbuf); +static gboolean gst_cuda_base_transform_get_unit_size (GstBaseTransform * trans, + GstCaps * caps, gsize * size); +static gboolean gst_cuda_base_transform_propose_allocation (GstBaseTransform * + trans, GstQuery * decide_query, GstQuery * query); +static gboolean gst_cuda_base_transform_decide_allocation (GstBaseTransform * + trans, GstQuery * query); +static gboolean gst_cuda_base_transform_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query); +static GstFlowReturn +gst_cuda_base_transform_transform_frame_default (GstCudaBaseTransform * filter, + GstVideoFrame * in_frame, GstCudaMemory * in_cuda_mem, + GstVideoFrame * out_frame, GstCudaMemory * out_cuda_mem); + +static void +gst_cuda_base_transform_class_init (GstCudaBaseTransformClass * klass) +{ + GObjectClass *gobject_class; + GstElementClass *element_class; + GstBaseTransformClass *trans_class; + + gobject_class = G_OBJECT_CLASS (klass); + element_class = GST_ELEMENT_CLASS (klass); + trans_class = GST_BASE_TRANSFORM_CLASS (klass); + + gobject_class->set_property = gst_cuda_base_transform_set_property; + gobject_class->get_property = gst_cuda_base_transform_get_property; + gobject_class->dispose = gst_cuda_base_transform_dispose; + + g_object_class_install_property (gobject_class, PROP_DEVICE_ID, + g_param_spec_int ("cuda-device-id", + "Cuda Device ID", + "Set the GPU device to use for operations (-1 = auto)", + -1, G_MAXINT, DEFAULT_DEVICE_ID, + G_PARAM_READWRITE | GST_PARAM_MUTABLE_READY | + G_PARAM_STATIC_STRINGS)); + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_cuda_base_transform_set_context); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->start = GST_DEBUG_FUNCPTR (gst_cuda_base_transform_start); + 
trans_class->stop = GST_DEBUG_FUNCPTR (gst_cuda_base_transform_stop); + trans_class->set_caps = GST_DEBUG_FUNCPTR (gst_cuda_base_transform_set_caps); + trans_class->transform = + GST_DEBUG_FUNCPTR (gst_cuda_base_transform_transform); + trans_class->get_unit_size = + GST_DEBUG_FUNCPTR (gst_cuda_base_transform_get_unit_size); + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_cuda_base_transform_propose_allocation); + trans_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_cuda_base_transform_decide_allocation); + trans_class->query = GST_DEBUG_FUNCPTR (gst_cuda_base_transform_query); + + klass->transform_frame = + GST_DEBUG_FUNCPTR (gst_cuda_base_transform_transform_frame_default); + + GST_DEBUG_CATEGORY_INIT (gst_cuda_base_transform_debug, + "cudabasefilter", 0, "cudabasefilter Element"); +} + +static void +gst_cuda_base_transform_init (GstCudaBaseTransform * filter) +{ + filter->device_id = DEFAULT_DEVICE_ID; + + filter->negotiated = FALSE; +} + +static void +gst_cuda_base_transform_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (object); + + switch (prop_id) { + case PROP_DEVICE_ID: + filter->device_id = g_value_get_int (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_cuda_base_transform_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (object); + + switch (prop_id) { + case PROP_DEVICE_ID: + g_value_set_int (value, filter->device_id); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_cuda_base_transform_dispose (GObject * object) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (object); + + gst_clear_object (&filter->context); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void 
+gst_cuda_base_transform_set_context (GstElement * element, GstContext * context) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (element); + + gst_cuda_handle_set_context (element, + context, filter->device_id, &filter->context); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_cuda_base_transform_start (GstBaseTransform * trans) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + CUresult cuda_ret; + + if (!gst_cuda_ensure_element_context (GST_ELEMENT_CAST (filter), + filter->device_id, &filter->context)) { + GST_ERROR_OBJECT (filter, "Failed to get CUDA context"); + return FALSE; + } + + if (gst_cuda_context_push (filter->context)) { + cuda_ret = CuStreamCreate (&filter->cuda_stream, CU_STREAM_DEFAULT); + if (!gst_cuda_result (cuda_ret)) { + GST_WARNING_OBJECT (filter, + "Could not create cuda stream, will use default stream"); + filter->cuda_stream = NULL; + } + gst_cuda_context_pop (NULL); + } + + return TRUE; +} + +static gboolean +gst_cuda_base_transform_stop (GstBaseTransform * trans) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + + if (filter->context && filter->cuda_stream) { + if (gst_cuda_context_push (filter->context)) { + gst_cuda_result (CuStreamDestroy (filter->cuda_stream)); + gst_cuda_context_pop (NULL); + } + } + + gst_clear_object (&filter->context); + filter->cuda_stream = NULL; + + return TRUE; +} + +static gboolean +gst_cuda_base_transform_set_caps (GstBaseTransform * trans, GstCaps * incaps, + GstCaps * outcaps) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + GstVideoInfo in_info, out_info; + GstCudaBaseTransformClass *klass; + gboolean res; + + if (!filter->context) { + GST_ERROR_OBJECT (filter, "No available CUDA context"); + return FALSE; + } + + /* input caps */ + if (!gst_video_info_from_caps (&in_info, incaps)) + goto invalid_caps; + + /* output caps */ + if (!gst_video_info_from_caps (&out_info, outcaps)) + 
goto invalid_caps; + + klass = GST_CUDA_BASE_TRANSFORM_GET_CLASS (filter); + if (klass->set_info) + res = klass->set_info (filter, incaps, &in_info, outcaps, &out_info); + else + res = TRUE; + + if (res) { + filter->in_info = in_info; + filter->out_info = out_info; + } + + filter->negotiated = res; + + return res; + + /* ERRORS */ +invalid_caps: + { + GST_ERROR_OBJECT (filter, "invalid caps"); + filter->negotiated = FALSE; + return FALSE; + } +} + +static gboolean +gst_cuda_base_transform_get_unit_size (GstBaseTransform * trans, GstCaps * caps, + gsize * size) +{ + gboolean ret = FALSE; + GstVideoInfo info; + + ret = gst_video_info_from_caps (&info, caps); + if (ret) + *size = GST_VIDEO_INFO_SIZE (&info); + + return TRUE; +} + +static GstFlowReturn +gst_cuda_base_transform_transform (GstBaseTransform * trans, + GstBuffer * inbuf, GstBuffer * outbuf) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + GstCudaBaseTransformClass *fclass = + GST_CUDA_BASE_TRANSFORM_GET_CLASS (filter); + GstVideoFrame in_frame, out_frame; + GstFlowReturn ret = GST_FLOW_OK; + GstMapFlags in_map_flags, out_map_flags; + GstMemory *mem; + GstCudaMemory *in_cuda_mem = NULL; + GstCudaMemory *out_cuda_mem = NULL; + + if (G_UNLIKELY (!filter->negotiated)) + goto unknown_format; + + in_map_flags = GST_MAP_READ | GST_VIDEO_FRAME_MAP_FLAG_NO_REF; + out_map_flags = GST_MAP_WRITE | GST_VIDEO_FRAME_MAP_FLAG_NO_REF; + + in_cuda_mem = out_cuda_mem = FALSE; + + if (gst_buffer_n_memory (inbuf) == 1 && + (mem = gst_buffer_peek_memory (inbuf, 0)) && gst_is_cuda_memory (mem)) { + GstCudaMemory *cmem = GST_CUDA_MEMORY_CAST (mem); + + if (cmem->context == filter->context || + gst_cuda_context_get_handle (cmem->context) == + gst_cuda_context_get_handle (filter->context) || + (gst_cuda_context_can_access_peer (cmem->context, filter->context) && + gst_cuda_context_can_access_peer (filter->context, + cmem->context))) { + in_map_flags |= GST_MAP_CUDA; + in_cuda_mem = cmem; + } + } + + if 
(gst_buffer_n_memory (outbuf) == 1 && + (mem = gst_buffer_peek_memory (outbuf, 0)) && gst_is_cuda_memory (mem)) { + GstCudaMemory *cmem = GST_CUDA_MEMORY_CAST (mem); + + if (cmem->context == filter->context || + gst_cuda_context_get_handle (cmem->context) == + gst_cuda_context_get_handle (filter->context) || + (gst_cuda_context_can_access_peer (cmem->context, filter->context) && + gst_cuda_context_can_access_peer (filter->context, + cmem->context))) { + out_map_flags |= GST_MAP_CUDA; + out_cuda_mem = cmem; + } + } + + if (!gst_video_frame_map (&in_frame, &filter->in_info, inbuf, in_map_flags)) + goto invalid_buffer; + + if (!gst_video_frame_map (&out_frame, &filter->out_info, outbuf, + out_map_flags)) { + gst_video_frame_unmap (&in_frame); + goto invalid_buffer; + } + + ret = fclass->transform_frame (filter, &in_frame, in_cuda_mem, &out_frame, + out_cuda_mem); + + gst_video_frame_unmap (&out_frame); + gst_video_frame_unmap (&in_frame); + + return ret; + + /* ERRORS */ +unknown_format: + { + GST_ELEMENT_ERROR (filter, CORE, NOT_IMPLEMENTED, (NULL), + ("unknown format")); + return GST_FLOW_NOT_NEGOTIATED; + } +invalid_buffer: + { + GST_ELEMENT_WARNING (trans, CORE, NOT_IMPLEMENTED, (NULL), + ("invalid video buffer received")); + return GST_FLOW_OK; + } +} + +static GstFlowReturn +gst_cuda_base_transform_transform_frame_default (GstCudaBaseTransform * filter, + GstVideoFrame * in_frame, GstCudaMemory * in_cuda_mem, + GstVideoFrame * out_frame, GstCudaMemory * out_cuda_mem) +{ + gint i; + GstFlowReturn ret = GST_FLOW_OK; + + if (in_cuda_mem || out_cuda_mem) { + if (!gst_cuda_context_push (filter->context)) { + GST_ELEMENT_ERROR (filter, LIBRARY, FAILED, (NULL), + ("Cannot push CUDA context")); + + return GST_FLOW_ERROR; + } + + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (in_frame); i++) { + CUDA_MEMCPY2D param = { 0, }; + guint width, height; + + width = GST_VIDEO_FRAME_COMP_WIDTH (in_frame, i) * + GST_VIDEO_FRAME_COMP_PSTRIDE (in_frame, i); + height = 
GST_VIDEO_FRAME_COMP_HEIGHT (in_frame, i); + + if (in_cuda_mem) { + param.srcMemoryType = CU_MEMORYTYPE_DEVICE; + param.srcDevice = in_cuda_mem->data + in_cuda_mem->offset[i]; + param.srcPitch = in_cuda_mem->stride; + } else { + param.srcMemoryType = CU_MEMORYTYPE_HOST; + param.srcHost = GST_VIDEO_FRAME_PLANE_DATA (in_frame, i); + param.srcPitch = GST_VIDEO_FRAME_PLANE_STRIDE (in_frame, i); + } + + if (out_cuda_mem) { + param.dstMemoryType = CU_MEMORYTYPE_DEVICE; + param.dstDevice = out_cuda_mem->data + out_cuda_mem->offset[i]; + param.dstPitch = out_cuda_mem->stride; + } else { + param.dstMemoryType = CU_MEMORYTYPE_HOST; + param.dstHost = GST_VIDEO_FRAME_PLANE_DATA (out_frame, i); + param.dstPitch = GST_VIDEO_FRAME_PLANE_STRIDE (out_frame, i); + } + + param.WidthInBytes = width; + param.Height = height; + + if (!gst_cuda_result (CuMemcpy2DAsync (¶m, filter->cuda_stream))) { + gst_cuda_context_pop (NULL); + GST_ELEMENT_ERROR (filter, LIBRARY, FAILED, (NULL), + ("Cannot upload input video frame")); + + return GST_FLOW_ERROR; + } + } + + CuStreamSynchronize (filter->cuda_stream); + + gst_cuda_context_pop (NULL); + } else { + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (in_frame); i++) { + if (!gst_video_frame_copy_plane (out_frame, in_frame, i)) { + GST_ERROR_OBJECT (filter, "Couldn't copy %dth plane", i); + + return GST_FLOW_ERROR; + } + } + } + + return ret; +} + +static gboolean +gst_cuda_base_transform_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + GstVideoInfo info; + GstBufferPool *pool; + GstCaps *caps; + guint size; + + if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, + decide_query, query)) + return FALSE; + + /* passthrough, we're done */ + if (decide_query == NULL) + return TRUE; + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE; + + if (!gst_video_info_from_caps (&info, caps)) + 
return FALSE; + + if (gst_query_get_n_allocation_pools (query) == 0) { + GstCapsFeatures *features; + GstStructure *config; + GstVideoAlignment align; + GstAllocationParams params = { 0, 31, 0, 0, }; + GstAllocator *allocator = NULL; + gint i; + + features = gst_caps_get_features (caps, 0); + + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)) { + GST_DEBUG_OBJECT (filter, "upstream support CUDA memory"); + pool = gst_cuda_buffer_pool_new (filter->context); + } else { + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + + gst_video_alignment_reset (&align); + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++) { + align.stride_align[i] = 31; + } + gst_video_info_align (&info, &align); + + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + + gst_buffer_pool_config_set_video_alignment (config, &align); + size = GST_VIDEO_INFO_SIZE (&info); + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + gst_query_add_allocation_pool (query, pool, size, 0, 0); + + if (gst_buffer_pool_config_get_allocator (config, &allocator, ¶ms)) { + if (params.align < 31) + params.align = 31; + + gst_query_add_allocation_param (query, allocator, ¶ms); + gst_buffer_pool_config_set_allocator (config, allocator, ¶ms); + } + + if (!gst_buffer_pool_set_config (pool, config)) + goto config_failed; + + gst_object_unref (pool); + } + + return TRUE; + + /* ERRORS */ +config_failed: + { + GST_ERROR_OBJECT (filter, "failed to set config"); + gst_object_unref (pool); + return FALSE; + } +} + +static gboolean +gst_cuda_base_transform_decide_allocation (GstBaseTransform * trans, + GstQuery * query) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + GstCaps *outcaps = NULL; + GstBufferPool *pool = 
NULL; + guint size, min, max; + GstStructure *config; + gboolean update_pool = FALSE; + gboolean need_cuda = FALSE; + GstCapsFeatures *features; + + gst_query_parse_allocation (query, &outcaps, NULL); + + if (!outcaps) + return FALSE; + + features = gst_caps_get_features (outcaps, 0); + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)) { + need_cuda = TRUE; + } + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (need_cuda && pool && !GST_IS_CUDA_BUFFER_POOL (pool)) { + /* when cuda device memory is supported, but pool is not cudabufferpool */ + gst_object_unref (pool); + pool = NULL; + } + + update_pool = TRUE; + } else { + GstVideoInfo vinfo; + gst_video_info_from_caps (&vinfo, outcaps); + size = GST_VIDEO_INFO_SIZE (&vinfo); + min = max = 0; + } + + if (!pool) { + GST_DEBUG_OBJECT (filter, "create our pool"); + + if (need_cuda) + pool = gst_cuda_buffer_pool_new (filter->context); + else + pool = gst_video_buffer_pool_new (); + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_set_config (pool, config); + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + + gst_object_unref (pool); + + return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, + query); +} + +static gboolean +gst_cuda_base_transform_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query) +{ + GstCudaBaseTransform *filter = GST_CUDA_BASE_TRANSFORM (trans); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + { + gboolean ret; + ret = gst_cuda_handle_context_query (GST_ELEMENT (filter), query, + filter->context); + if (ret) + return TRUE; + break; 
+ } + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->query (trans, direction, + query); +}
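`gst_cuda_base_transform_transform_frame_default` above copies each video plane with `CuMemcpy2DAsync`, where the source and destination pitches (strides) may differ and only `WidthInBytes × Height` payload bytes are transferred per plane, row by row. The host-side equivalent of that pitched 2D copy can be sketched as a self-contained function (no CUDA dependency; the `copy_plane_2d` name is illustrative):

```c
#include <stddef.h>
#include <string.h>

/* Host-side equivalent of the per-plane CUDA_MEMCPY2D above:
 * copy `width_bytes` of each of `height` rows, honouring distinct
 * row pitches for source and destination. */
static void copy_plane_2d (unsigned char *dst, size_t dst_pitch,
    const unsigned char *src, size_t src_pitch,
    size_t width_bytes, size_t height)
{
    size_t row;

    for (row = 0; row < height; row++)
        memcpy (dst + row * dst_pitch, src + row * src_pitch, width_bytes);
}
```

This is why the code fills `srcPitch`/`dstPitch` from either the CUDA memory's stride or the mapped frame's plane stride: padding bytes between rows are never copied, only relocated around.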
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudabasetransform.h
Added
@@ -0,0 +1,75 @@
+/* GStreamer
+ * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_CUDA_BASE_TRANSFORM_H__
+#define __GST_CUDA_BASE_TRANSFORM_H__
+
+#include <gst/gst.h>
+#include <gst/base/gstbasetransform.h>
+#include <gst/video/video.h>
+#include "gstcudacontext.h"
+#include "gstcudabufferpool.h"
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_CUDA_BASE_TRANSFORM (gst_cuda_base_transform_get_type())
+#define GST_CUDA_BASE_TRANSFORM(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_BASE_TRANSFORM,GstCudaBaseTransform))
+#define GST_CUDA_BASE_TRANSFORM_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_BASE_TRANSFORM,GstCudaBaseTransformClass))
+#define GST_CUDA_BASE_TRANSFORM_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_BASE_TRANSFORM,GstCudaBaseTransformClass))
+#define GST_IS_CUDA_BASE_TRANSFORM(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_BASE_TRANSFORM))
+#define GST_IS_CUDA_BASE_TRANSFORM_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_BASE_TRANSFORM))
+
+typedef struct _GstCudaBaseTransform GstCudaBaseTransform;
+typedef struct _GstCudaBaseTransformClass GstCudaBaseTransformClass;
+
+struct _GstCudaBaseTransform
+{
+  GstBaseTransform parent;
+
+  gboolean negotiated;
+
+  GstVideoInfo in_info;
+  GstVideoInfo out_info;
+
+  GstCudaContext *context;
+  CUstream cuda_stream;
+
+  gint device_id;
+};
+
+struct _GstCudaBaseTransformClass
+{
+  GstBaseTransformClass parent_class;
+
+  gboolean      (*set_info)        (GstCudaBaseTransform *filter,
+                                    GstCaps *incaps, GstVideoInfo *in_info,
+                                    GstCaps *outcaps, GstVideoInfo *out_info);
+
+  GstFlowReturn (*transform_frame) (GstCudaBaseTransform *filter,
+                                    GstVideoFrame *in_frame,
+                                    GstCudaMemory *in_cuda_mem,
+                                    GstVideoFrame *out_frame,
+                                    GstCudaMemory *out_cuda_mem);
+};
+
+GType gst_cuda_base_transform_get_type (void);
+
+G_END_DECLS
+
+#endif /* __GST_CUDA_BASE_TRANSFORM_H__ */
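`set_info` and `transform_frame` in the class struct above are GObject virtual functions: the base class installs `transform_frame_default` in its `class_init`, and a subclass may overwrite the slot with its own implementation. Stripped of the GObject machinery, the dispatch pattern reduces to a function pointer in a class struct; the type and function names below are hypothetical:

```c
/* Minimal model of the vfunc pattern used above: the base class
 * installs a default transform_frame; subclasses may replace it. */
typedef struct TransformClass {
    int (*transform_frame) (int in);  /* vfunc slot */
} TransformClass;

/* Default: pass-through copy (mirrors transform_frame_default). */
static int transform_frame_default (int in) { return in; }

/* A hypothetical subclass override. */
static int transform_frame_scaled (int in) { return in * 2; }

/* Base class_init fills the slot with the default implementation. */
static void base_class_init (TransformClass *klass)
{
    klass->transform_frame = transform_frame_default;
}
```

In the real code the caller never names the implementation directly; `gst_cuda_base_transform_transform` always dispatches through `fclass->transform_frame`, so subclasses such as CUDA converters plug in transparently.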
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudabufferpool.c
Added
@@ -0,0 +1,259 @@ +/* GStreamer + * Copyright (C) <2018-2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstcudabufferpool.h" +#include "gstcudacontext.h" +#include "gstcudamemory.h" + +GST_DEBUG_CATEGORY_STATIC (gst_cuda_buffer_pool_debug); +#define GST_CAT_DEFAULT gst_cuda_buffer_pool_debug + +struct _GstCudaBufferPoolPrivate +{ + GstCudaContext *context; + GstAllocator *allocator; + GstVideoInfo info; + gboolean add_videometa; + gboolean need_alignment; + GstCudaAllocationParams params; +}; + +#define gst_cuda_buffer_pool_parent_class parent_class +G_DEFINE_TYPE_WITH_PRIVATE (GstCudaBufferPool, gst_cuda_buffer_pool, + GST_TYPE_BUFFER_POOL); + +static const gchar ** +gst_cuda_buffer_pool_get_options (GstBufferPool * pool) +{ + static const gchar *options[] = { GST_BUFFER_POOL_OPTION_VIDEO_META, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT, NULL + }; + + return options; +} + +static gboolean +gst_cuda_buffer_pool_set_config (GstBufferPool * pool, GstStructure * config) +{ + GstCudaBufferPool *cuda_pool = GST_CUDA_BUFFER_POOL_CAST (pool); + GstCudaBufferPoolPrivate *priv = cuda_pool->priv; + GstCaps *caps = NULL; + guint size, min_buffers, max_buffers; + 
guint max_align, n; + GstAllocator *allocator = NULL; + GstAllocationParams *params = (GstAllocationParams *) & priv->params; + GstVideoInfo *info = &priv->params.info; + + if (!gst_buffer_pool_config_get_params (config, &caps, &size, &min_buffers, + &max_buffers)) + goto wrong_config; + + if (caps == NULL) + goto no_caps; + + if (!gst_buffer_pool_config_get_allocator (config, &allocator, params)) + goto wrong_config; + + /* now parse the caps from the config */ + if (!gst_video_info_from_caps (info, caps)) + goto wrong_caps; + + GST_LOG_OBJECT (pool, "%dx%d, caps %" GST_PTR_FORMAT, + GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info), caps); + + gst_clear_object (&priv->allocator); + + if (allocator) { + if (!GST_IS_CUDA_ALLOCATOR (allocator)) { + goto wrong_allocator; + } else { + priv->allocator = gst_object_ref (allocator); + } + } else { + allocator = priv->allocator = gst_cuda_allocator_new (priv->context); + if (G_UNLIKELY (priv->allocator == NULL)) + goto no_allocator; + } + + priv->add_videometa = gst_buffer_pool_config_has_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + priv->need_alignment = gst_buffer_pool_config_has_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + + max_align = params->align; + + /* do memory align */ + if (priv->need_alignment && priv->add_videometa) { + GstVideoAlignment valign; + + gst_buffer_pool_config_get_video_alignment (config, &valign); + + for (n = 0; n < GST_VIDEO_MAX_PLANES; ++n) + max_align |= valign.stride_align[n]; + + for (n = 0; n < GST_VIDEO_MAX_PLANES; ++n) + valign.stride_align[n] = max_align; + + if (!gst_video_info_align (info, &valign)) + goto failed_to_align; + + gst_buffer_pool_config_set_video_alignment (config, &valign); + } + + if (params->align < max_align) { + GST_WARNING_OBJECT (pool, "allocation params alignment %u is smaller " + "than the max specified video stride alignment %u, fixing", + (guint) params->align, max_align); + + params->align = max_align; + 
gst_buffer_pool_config_set_allocator (config, allocator, params); + } + + gst_buffer_pool_config_set_params (config, caps, GST_VIDEO_INFO_SIZE (info), + min_buffers, max_buffers); + + return GST_BUFFER_POOL_CLASS (parent_class)->set_config (pool, config); + + /* ERRORS */ +wrong_config: + { + GST_WARNING_OBJECT (pool, "invalid config"); + return FALSE; + } +no_caps: + { + GST_WARNING_OBJECT (pool, "no caps in config"); + return FALSE; + } +wrong_caps: + { + GST_WARNING_OBJECT (pool, + "failed getting geometry from caps %" GST_PTR_FORMAT, caps); + return FALSE; + } +no_allocator: + { + GST_WARNING_OBJECT (pool, "Could not create new CUDA allocator"); + return FALSE; + } +wrong_allocator: + { + GST_WARNING_OBJECT (pool, "Incorrect allocator type for this pool"); + return FALSE; + } +failed_to_align: + { + GST_WARNING_OBJECT (pool, "Failed to align"); + return FALSE; + } +} + +static GstFlowReturn +gst_cuda_buffer_pool_alloc (GstBufferPool * pool, GstBuffer ** buffer, + GstBufferPoolAcquireParams * params) +{ + GstCudaBufferPool *cuda_pool = GST_CUDA_BUFFER_POOL_CAST (pool); + GstCudaBufferPoolPrivate *priv = cuda_pool->priv; + GstVideoInfo *info; + GstBuffer *cuda; + GstMemory *mem; + + info = &priv->params.info; + + cuda = gst_buffer_new (); + + mem = gst_cuda_allocator_alloc (GST_ALLOCATOR_CAST (priv->allocator), + GST_VIDEO_INFO_SIZE (info), &priv->params); + + if (mem == NULL) { + gst_buffer_unref (cuda); + GST_WARNING_OBJECT (pool, "Cannot create CUDA memory"); + return GST_FLOW_ERROR; + } + gst_buffer_append_memory (cuda, mem); + + if (priv->add_videometa) { + GST_DEBUG_OBJECT (pool, "adding GstVideoMeta"); + gst_buffer_add_video_meta_full (cuda, GST_VIDEO_FRAME_FLAG_NONE, + GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_WIDTH (info), + GST_VIDEO_INFO_HEIGHT (info), GST_VIDEO_INFO_N_PLANES (info), + info->offset, info->stride); + } + + *buffer = cuda; + + return GST_FLOW_OK; +} + +GstBufferPool * +gst_cuda_buffer_pool_new (GstCudaContext * context) +{ + 
GstCudaBufferPool *pool; + + pool = g_object_new (GST_TYPE_CUDA_BUFFER_POOL, NULL); + gst_object_ref_sink (pool); + + pool->priv->context = gst_object_ref (context); + + GST_LOG_OBJECT (pool, "new CUDA buffer pool %p", pool); + + return GST_BUFFER_POOL_CAST (pool); +} + +static void +gst_cuda_buffer_pool_dispose (GObject * object) +{ + GstCudaBufferPool *pool = GST_CUDA_BUFFER_POOL_CAST (object); + GstCudaBufferPoolPrivate *priv = pool->priv; + + GST_LOG_OBJECT (pool, "finalize CUDA buffer pool %p", pool); + + gst_clear_object (&priv->allocator); + gst_clear_object (&priv->context); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + + +static void +gst_cuda_buffer_pool_class_init (GstCudaBufferPoolClass * klass) +{ + GObjectClass *gobject_class = (GObjectClass *) klass; + GstBufferPoolClass *gstbufferpool_class = (GstBufferPoolClass *) klass; + + gobject_class->dispose = gst_cuda_buffer_pool_dispose; + + gstbufferpool_class->get_options = gst_cuda_buffer_pool_get_options; + gstbufferpool_class->set_config = gst_cuda_buffer_pool_set_config; + gstbufferpool_class->alloc_buffer = gst_cuda_buffer_pool_alloc; + + GST_DEBUG_CATEGORY_INIT (gst_cuda_buffer_pool_debug, "cudabufferpool", 0, + "CUDA Buffer Pool"); +} + +static void +gst_cuda_buffer_pool_init (GstCudaBufferPool * pool) +{ + pool->priv = gst_cuda_buffer_pool_get_instance_private (pool); +}
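`gst_cuda_buffer_pool_set_config` above merges the per-plane `stride_align` masks (each an alignment-minus-one value, e.g. 31 for 32-byte alignment) by OR-ing them together, then applies the strictest mask to every plane and raises the allocation params' `align` to match. A self-contained sketch of that merge and the stride rounding it implies (helper names are illustrative, not GStreamer API):

```c
#include <stddef.h>

#define MAX_PLANES 4

/* Model of the alignment merge in gst_cuda_buffer_pool_set_config:
 * OR all per-plane masks to get the strictest requirement, then
 * apply it uniformly to every plane. */
static size_t merge_stride_align (size_t stride_align[MAX_PLANES])
{
    size_t max_align = 0;
    size_t n;

    for (n = 0; n < MAX_PLANES; n++)
        max_align |= stride_align[n];
    for (n = 0; n < MAX_PLANES; n++)
        stride_align[n] = max_align;

    return max_align;
}

/* Round a stride up to the merged alignment mask (mask = align - 1). */
static size_t align_stride (size_t stride, size_t mask)
{
    return (stride + mask) & ~mask;
}
```

OR-ing the masks works only because alignments are powers of two, so each mask is a run of low set bits; the union of two such masks is the mask of the larger alignment.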
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudabufferpool.h
Added
@@ -0,0 +1,66 @@
+/* GStreamer
+ * Copyright (C) <2018-2019> Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_CUDA_BUFFER_POOL_H__
+#define __GST_CUDA_BUFFER_POOL_H__
+
+#include <gst/video/gstvideometa.h>
+#include <gst/video/gstvideopool.h>
+
+#include "gstcudamemory.h"
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_CUDA_BUFFER_POOL (gst_cuda_buffer_pool_get_type ())
+#define GST_CUDA_BUFFER_POOL(obj) (G_TYPE_CHECK_INSTANCE_CAST ((obj),GST_TYPE_CUDA_BUFFER_POOL,GstCudaBufferPool))
+#define GST_CUDA_BUFFER_POOL_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_CUDA_BUFFER_POOL,GstCudaBufferPoolClass))
+#define GST_CUDA_BUFFER_POOL_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_BUFFER_POOL,GstCudaBufferPoolClass))
+#define GST_IS_CUDA_BUFFER_POOL(obj) (G_TYPE_CHECK_INSTANCE_TYPE ((obj),GST_TYPE_CUDA_BUFFER_POOL))
+#define GST_IS_CUDA_BUFFER_POOL_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_CUDA_BUFFER_POOL))
+#define GST_CUDA_BUFFER_POOL_CAST(obj) ((GstCudaBufferPool*)(obj))
+
+typedef struct _GstCudaBufferPool GstCudaBufferPool;
+typedef struct _GstCudaBufferPoolClass GstCudaBufferPoolClass;
+typedef struct _GstCudaBufferPoolPrivate GstCudaBufferPoolPrivate;
+
+/*
+ * GstCudaBufferPool:
+ */
+struct _GstCudaBufferPool
+{
+  GstBufferPool parent;
+
+  GstCudaBufferPoolPrivate *priv;
+};
+
+/*
+ * GstCudaBufferPoolClass:
+ */
+struct _GstCudaBufferPoolClass
+{
+  GstBufferPoolClass parent_class;
+};
+
+GType gst_cuda_buffer_pool_get_type (void);
+
+GstBufferPool * gst_cuda_buffer_pool_new (GstCudaContext * context);
+
+G_END_DECLS
+
+#endif /* __GST_CUDA_BUFFER_POOL_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstcudacontext.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudacontext.c
Changed
@@ -28,6 +28,10 @@ GST_DEBUG_CATEGORY_STATIC (gst_cuda_context_debug); #define GST_CAT_DEFAULT gst_cuda_context_debug +/* store all context object with weak ref */ +static GList *context_list = NULL; +G_LOCK_DEFINE_STATIC (list_lock); + enum { PROP_0, @@ -41,6 +45,10 @@ CUcontext context; CUdevice device; gint device_id; + + gint tex_align; + + GHashTable *accessible_peer; }; #define gst_cuda_context_parent_class parent_class @@ -52,6 +60,10 @@ GValue * value, GParamSpec * pspec); static void gst_cuda_context_constructed (GObject * object); static void gst_cuda_context_finalize (GObject * object); +static void gst_cuda_context_weak_ref_notify (gpointer data, + GstCudaContext * context); +static void gst_cuda_context_enable_peer_access (GstCudaContext * context, + GstCudaContext * peer); static void gst_cuda_context_class_init (GstCudaContextClass * klass) @@ -80,6 +92,7 @@ priv->context = NULL; priv->device_id = DEFAULT_DEVICE_ID; + priv->accessible_peer = g_hash_table_new (g_direct_hash, g_direct_equal); context->priv = priv; } @@ -131,6 +144,8 @@ gchar name[256]; gint min = 0, maj = 0; gint i; + gint tex_align = 0; + GList *iter; if (g_once_init_enter (&once)) { if (CuInit (0) != CUDA_SUCCESS) { @@ -154,11 +169,15 @@ gst_cuda_result (CuDeviceGetAttribute (&maj, CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MAJOR, cdev)) && gst_cuda_result (CuDeviceGetAttribute (&min, - CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MINOR, cdev))) { + CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MINOR, cdev)) && + gst_cuda_result (CuDeviceGetAttribute (&tex_align, + CU_DEVICE_ATTRIBUTE_TEXTURE_ALIGNMENT, cdev))) { GST_INFO ("GPU #%d supports NVENC: %s (%s) (Compute SM %d.%d)", i, (((maj << 4) + min) >= 0x30) ? 
"yes" : "no", name, maj, min); if (priv->device_id == -1 || priv->device_id == cdev) { priv->device_id = cuda_dev = cdev; + priv->tex_align = tex_align; + break; } } } @@ -183,6 +202,101 @@ priv->context = cuda_ctx; priv->device = cuda_dev; + + G_LOCK (list_lock); + g_object_weak_ref (G_OBJECT (object), + (GWeakNotify) gst_cuda_context_weak_ref_notify, NULL); + for (iter = context_list; iter; iter = g_list_next (iter)) { + GstCudaContext *peer = (GstCudaContext *) iter->data; + + /* EnablePeerAccess is unidirectional */ + gst_cuda_context_enable_peer_access (context, peer); + gst_cuda_context_enable_peer_access (peer, context); + } + + context_list = g_list_append (context_list, context); + G_UNLOCK (list_lock); +} + +/* must be called with list_lock taken */ +static void +gst_cuda_context_enable_peer_access (GstCudaContext * context, + GstCudaContext * peer) +{ + GstCudaContextPrivate *priv = context->priv; + GstCudaContextPrivate *peer_priv = peer->priv; + CUdevice device = priv->device; + CUdevice other_dev = peer_priv->device; + CUresult cuda_ret; + gint can_access = 0; + + cuda_ret = CuDeviceCanAccessPeer (&can_access, device, other_dev); + + if (!gst_cuda_result (cuda_ret) || !can_access) { + GST_DEBUG_OBJECT (context, + "Peer access to %" GST_PTR_FORMAT " is not allowed", peer); + return; + } + + gst_cuda_context_push (context); + if (gst_cuda_result (CuCtxEnablePeerAccess (peer_priv->context, 0))) { + GST_DEBUG_OBJECT (context, "Enable peer access to %" GST_PTR_FORMAT, peer); + g_hash_table_add (priv->accessible_peer, peer); + } + + gst_cuda_context_pop (NULL); +} + +static void +gst_cuda_context_weak_ref_notify (gpointer data, GstCudaContext * context) +{ + GList *iter; + + G_LOCK (list_lock); + context_list = g_list_remove (context_list, context); + + /* disable self -> peer access */ + if (context->priv->accessible_peer) { + GHashTableIter iter; + gpointer key; + g_hash_table_iter_init (&iter, context->priv->accessible_peer); + if (gst_cuda_context_push 
(context)) { + while (g_hash_table_iter_next (&iter, &key, NULL)) { + GstCudaContext *peer = GST_CUDA_CONTEXT (key); + CUcontext peer_handle = gst_cuda_context_get_handle (peer); + GST_DEBUG_OBJECT (context, + "Disable peer access to %" GST_PTR_FORMAT, peer); + gst_cuda_result (CuCtxDisablePeerAccess (peer_handle)); + } + gst_cuda_context_pop (NULL); + } + + g_hash_table_destroy (context->priv->accessible_peer); + context->priv->accessible_peer = NULL; + } + + /* disable peer -> self access */ + for (iter = context_list; iter; iter = g_list_next (iter)) { + GstCudaContext *other = (GstCudaContext *) iter->data; + GstCudaContextPrivate *other_priv = other->priv; + CUcontext self_handle; + + if (!other_priv->accessible_peer) + continue; + + if (g_hash_table_lookup (other_priv->accessible_peer, context)) { + if (gst_cuda_context_push (other)) { + self_handle = gst_cuda_context_get_handle (context); + GST_DEBUG_OBJECT (other, + "Disable peer access to %" GST_PTR_FORMAT, context); + gst_cuda_result (CuCtxDisablePeerAccess (self_handle)); + gst_cuda_context_pop (NULL); + } + + g_hash_table_remove (other_priv->accessible_peer, context); + } + } + G_UNLOCK (list_lock); } static void @@ -274,3 +388,47 @@ return ctx->priv->context; } + +/** + * gst_cuda_context_get_texture_alignment: + * @ctx: a #GstCudaContext + * + * Get the texture alignment required by the device + * + * Returns: the texture alignment required by the device of @ctx + */ +gint +gst_cuda_context_get_texture_alignment (GstCudaContext * ctx) +{ + g_return_val_if_fail (ctx, 0); + g_return_val_if_fail (GST_IS_CUDA_CONTEXT (ctx), 0); + + return ctx->priv->tex_align; +} + +/** + * gst_cuda_context_can_access_peer: + * @ctx: a #GstCudaContext + * @peer: a #GstCudaContext + * + * Query whether @ctx can access any memory which belongs to @peer directly.
+ + * Returns: %TRUE if @ctx can access @peer directly + */ +gboolean +gst_cuda_context_can_access_peer (GstCudaContext * ctx, GstCudaContext * peer) +{ + gboolean ret = FALSE; + + g_return_val_if_fail (GST_IS_CUDA_CONTEXT (ctx), FALSE); + g_return_val_if_fail (GST_IS_CUDA_CONTEXT (peer), FALSE); + + G_LOCK (list_lock); + if (ctx->priv->accessible_peer && + g_hash_table_lookup (ctx->priv->accessible_peer, peer)) { + ret = TRUE; + } + G_UNLOCK (list_lock); + + return ret; +}
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstcudacontext.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudacontext.h
Changed
@@ -60,18 +60,19 @@ GType gst_cuda_context_get_type (void); -G_GNUC_INTERNAL GstCudaContext * gst_cuda_context_new (gint device_id); -G_GNUC_INTERNAL gboolean gst_cuda_context_push (GstCudaContext * ctx); -G_GNUC_INTERNAL gboolean gst_cuda_context_pop (CUcontext * cuda_ctx); -G_GNUC_INTERNAL gpointer gst_cuda_context_get_handle (GstCudaContext * ctx); +gint gst_cuda_context_get_texture_alignment (GstCudaContext * ctx); + +gboolean gst_cuda_context_can_access_peer (GstCudaContext * ctx, + GstCudaContext * peer); + G_END_DECLS #endif /* __GST_CUDA_CONTEXT_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudaconvert.c
Added
@@ -0,0 +1,417 @@ +/* GStreamer + * Copyright (C) <1999> Erik Walthinsen <omega@cse.ogi.edu> + * Copyright (C) 2005-2012 David Schleef <ds@schleef.org> + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-cudaconvert + * @title: cudaconvert + * + * Convert video frames between supported video formats. + * + * ## Example launch line + * |[ + * gst-launch-1.0 -v videotestsrc ! video/x-raw,format=Y444_16LE ! cudaupload ! cudaconvert ! cudadownload ! autovideosink + * ]| + * This will output a test video (generated in Y444_16LE format) in a video + * window. If the video sink selected does not support Y444_16LE + * cudaconvert will automatically convert the video to a format understood + * by the video sink. 
+ * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstcudaconvert.h" +#include "gstcudautils.h" + +GST_DEBUG_CATEGORY_STATIC (gst_cuda_convert_debug); +#define GST_CAT_DEFAULT gst_cuda_convert_debug + +#define gst_cuda_convert_parent_class parent_class +G_DEFINE_TYPE (GstCudaConvert, gst_cuda_convert, GST_TYPE_CUDA_BASE_FILTER); + +static GstCaps *gst_cuda_convert_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter); +static GstCaps *gst_cuda_convert_fixate_caps (GstBaseTransform * base, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); +static gboolean gst_cuda_convert_filter_meta (GstBaseTransform * trans, + GstQuery * query, GType api, const GstStructure * params); +static gboolean +gst_cuda_convert_set_info (GstCudaBaseTransform * btrans, GstCaps * incaps, + GstVideoInfo * in_info, GstCaps * outcaps, GstVideoInfo * out_info); + +/* copies the given caps */ +static GstCaps * +gst_cuda_convert_caps_remove_format_info (GstCaps * caps) +{ + GstStructure *st; + GstCapsFeatures *f; + gint i, n; + GstCaps *res; + GstCapsFeatures *feature = + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY); + + res = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + st = gst_caps_get_structure (caps, i); + f = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && gst_caps_is_subset_structure_full (res, st, f)) + continue; + + st = gst_structure_copy (st); + /* Only remove format info for the cases when we can actually convert */ + if (!gst_caps_features_is_any (f) + && gst_caps_features_is_equal (f, feature)) + gst_structure_remove_fields (st, "format", "colorimetry", "chroma-site", + NULL); + + gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); + } + gst_caps_features_free (feature); + + return res; +} + +/* + * This is an 
incomplete matrix of in formats and a score for the preferred output + * format. + * + * out: RGB24 RGB16 ARGB AYUV YUV444 YUV422 YUV420 YUV411 YUV410 PAL GRAY + * in + * RGB24 0 2 1 2 2 3 4 5 6 7 8 + * RGB16 1 0 1 2 2 3 4 5 6 7 8 + * ARGB 2 3 0 1 4 5 6 7 8 9 10 + * AYUV 3 4 1 0 2 5 6 7 8 9 10 + * YUV444 2 4 3 1 0 5 6 7 8 9 10 + * YUV422 3 5 4 2 1 0 6 7 8 9 10 + * YUV420 4 6 5 3 2 1 0 7 8 9 10 + * YUV411 4 6 5 3 2 1 7 0 8 9 10 + * YUV410 6 8 7 5 4 3 2 1 0 9 10 + * PAL 1 3 2 6 4 6 7 8 9 0 10 + * GRAY 1 4 3 2 1 5 6 7 8 9 0 + * + * PAL or GRAY are never preferred, if we can we would convert to PAL instead + * of GRAY, though + * less subsampling is preferred and if any, preferably horizontal + * We would like to keep the alpha, even if we would need to do colorspace conversion + * or lose depth. + */ +#define SCORE_FORMAT_CHANGE 1 +#define SCORE_DEPTH_CHANGE 1 +#define SCORE_ALPHA_CHANGE 1 +#define SCORE_CHROMA_W_CHANGE 1 +#define SCORE_CHROMA_H_CHANGE 1 +#define SCORE_PALETTE_CHANGE 1 + +#define SCORE_COLORSPACE_LOSS 2 /* RGB <-> YUV */ +#define SCORE_DEPTH_LOSS 4 /* change bit depth */ +#define SCORE_ALPHA_LOSS 8 /* lose the alpha channel */ +#define SCORE_CHROMA_W_LOSS 16 /* vertical subsample */ +#define SCORE_CHROMA_H_LOSS 32 /* horizontal subsample */ +#define SCORE_PALETTE_LOSS 64 /* convert to palette format */ +#define SCORE_COLOR_LOSS 128 /* convert to GRAY */ + +#define COLORSPACE_MASK (GST_VIDEO_FORMAT_FLAG_YUV | \ + GST_VIDEO_FORMAT_FLAG_RGB | GST_VIDEO_FORMAT_FLAG_GRAY) +#define ALPHA_MASK (GST_VIDEO_FORMAT_FLAG_ALPHA) +#define PALETTE_MASK (GST_VIDEO_FORMAT_FLAG_PALETTE) + +/* calculate how much loss a conversion would be */ +static void +score_value (GstBaseTransform * base, const GstVideoFormatInfo * in_info, + const GValue * val, gint * min_loss, const GstVideoFormatInfo ** out_info) +{ + const gchar *fname; + const GstVideoFormatInfo *t_info; + GstVideoFormatFlags in_flags, t_flags; + gint loss; + + fname = g_value_get_string (val); + t_info =
gst_video_format_get_info (gst_video_format_from_string (fname)); + if (!t_info) + return; + + /* accept input format immediately without loss */ + if (in_info == t_info) { + *min_loss = 0; + *out_info = t_info; + return; + } + + loss = SCORE_FORMAT_CHANGE; + + in_flags = GST_VIDEO_FORMAT_INFO_FLAGS (in_info); + in_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; + in_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; + in_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; + + t_flags = GST_VIDEO_FORMAT_INFO_FLAGS (t_info); + t_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; + t_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; + t_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; + + if ((t_flags & PALETTE_MASK) != (in_flags & PALETTE_MASK)) { + loss += SCORE_PALETTE_CHANGE; + if (t_flags & PALETTE_MASK) + loss += SCORE_PALETTE_LOSS; + } + + if ((t_flags & COLORSPACE_MASK) != (in_flags & COLORSPACE_MASK)) { + loss += SCORE_COLORSPACE_LOSS; + if (t_flags & GST_VIDEO_FORMAT_FLAG_GRAY) + loss += SCORE_COLOR_LOSS; + } + + if ((t_flags & ALPHA_MASK) != (in_flags & ALPHA_MASK)) { + loss += SCORE_ALPHA_CHANGE; + if (in_flags & ALPHA_MASK) + loss += SCORE_ALPHA_LOSS; + } + + if ((in_info->h_sub[1]) != (t_info->h_sub[1])) { + loss += SCORE_CHROMA_H_CHANGE; + if ((in_info->h_sub[1]) < (t_info->h_sub[1])) + loss += SCORE_CHROMA_H_LOSS; + } + if ((in_info->w_sub[1]) != (t_info->w_sub[1])) { + loss += SCORE_CHROMA_W_CHANGE; + if ((in_info->w_sub[1]) < (t_info->w_sub[1])) + loss += SCORE_CHROMA_W_LOSS; + } + + if ((in_info->bits) != (t_info->bits)) { + loss += SCORE_DEPTH_CHANGE; + if ((in_info->bits) > (t_info->bits)) + loss += SCORE_DEPTH_LOSS; + } + + GST_DEBUG_OBJECT (base, "score %s -> %s = %d", + GST_VIDEO_FORMAT_INFO_NAME (in_info), + GST_VIDEO_FORMAT_INFO_NAME (t_info), loss); + + if (loss < *min_loss) { + GST_DEBUG_OBJECT (base, "found new best %d", loss); + *out_info = t_info; + *min_loss = loss; + } +} + +static void +gst_cuda_convert_class_init (GstCudaConvertClass * klass) +{ + GstElementClass *element_class = GST_ELEMENT_CLASS 
(klass); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass); + GstCudaBaseTransformClass *btrans_class = + GST_CUDA_BASE_TRANSFORM_CLASS (klass); + + gst_element_class_set_static_metadata (element_class, + "CUDA Colorspace converter", + "Filter/Converter/Video/Hardware", + "Converts video from one colorspace to another using CUDA", + "Seungha Yang <seungha.yang@navercorp.com>"); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_cuda_convert_transform_caps); + trans_class->fixate_caps = GST_DEBUG_FUNCPTR (gst_cuda_convert_fixate_caps); + trans_class->filter_meta = GST_DEBUG_FUNCPTR (gst_cuda_convert_filter_meta); + + btrans_class->set_info = GST_DEBUG_FUNCPTR (gst_cuda_convert_set_info); + + GST_DEBUG_CATEGORY_INIT (gst_cuda_convert_debug, + "cudaconvert", 0, "Video ColorSpace convert using CUDA"); + + gst_type_mark_as_plugin_api (GST_TYPE_CUDA_BASE_FILTER, 0); +} + +static void +gst_cuda_convert_init (GstCudaConvert * convert) +{ +} + +static GstCaps * +gst_cuda_convert_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *tmp, *tmp2; + GstCaps *result; + + /* Get all possible caps that we can transform to */ + tmp = gst_cuda_convert_caps_remove_format_info (caps); + + if (filter) { + tmp2 = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + tmp = tmp2; + } + + result = tmp; + + GST_DEBUG_OBJECT (trans, "transformed %" GST_PTR_FORMAT " into %" + GST_PTR_FORMAT, caps, result); + + return result; +} + +/* fork of gstvideoconvert */ +static void +gst_cuda_convert_fixate_format (GstBaseTransform * base, GstCaps * caps, + GstCaps * result) +{ + GstStructure *ins, *outs; + const gchar *in_format; + const GstVideoFormatInfo *in_info, *out_info = NULL; + gint min_loss = G_MAXINT; + guint i, capslen; + + ins = gst_caps_get_structure (caps, 0); + in_format = gst_structure_get_string (ins, 
"format"); + if (!in_format) + return; + + GST_DEBUG_OBJECT (base, "source format %s", in_format); + + in_info = + gst_video_format_get_info (gst_video_format_from_string (in_format)); + if (!in_info) + return; + + outs = gst_caps_get_structure (result, 0); + + capslen = gst_caps_get_size (result); + GST_DEBUG_OBJECT (base, "iterate %d structures", capslen); + for (i = 0; i < capslen; i++) { + GstStructure *tests; + const GValue *format; + + tests = gst_caps_get_structure (result, i); + format = gst_structure_get_value (tests, "format"); + /* should not happen */ + if (format == NULL) + continue; + + if (GST_VALUE_HOLDS_LIST (format)) { + gint j, len; + + len = gst_value_list_get_size (format); + GST_DEBUG_OBJECT (base, "have %d formats", len); + for (j = 0; j < len; j++) { + const GValue *val; + + val = gst_value_list_get_value (format, j); + if (G_VALUE_HOLDS_STRING (val)) { + score_value (base, in_info, val, &min_loss, &out_info); + if (min_loss == 0) + break; + } + } + } else if (G_VALUE_HOLDS_STRING (format)) { + score_value (base, in_info, format, &min_loss, &out_info); + } + } + if (out_info) + gst_structure_set (outs, "format", G_TYPE_STRING, + GST_VIDEO_FORMAT_INFO_NAME (out_info), NULL); +} + +static GstCaps * +gst_cuda_convert_fixate_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GstCaps *result; + + GST_DEBUG_OBJECT (trans, "trying to fixate othercaps %" GST_PTR_FORMAT + " based on caps %" GST_PTR_FORMAT, othercaps, caps); + + result = gst_caps_intersect (othercaps, caps); + if (gst_caps_is_empty (result)) { + gst_caps_unref (result); + result = othercaps; + } else { + gst_caps_unref (othercaps); + } + + GST_DEBUG_OBJECT (trans, "now fixating %" GST_PTR_FORMAT, result); + + result = gst_caps_make_writable (result); + gst_cuda_convert_fixate_format (trans, caps, result); + + /* fixate remaining fields */ + result = gst_caps_fixate (result); + + if (direction == GST_PAD_SINK) { + if 
(gst_caps_is_subset (caps, result)) { + gst_caps_replace (&result, caps); + } + } + + return result; +} + +static gboolean +gst_cuda_convert_filter_meta (GstBaseTransform * trans, GstQuery * query, + GType api, const GstStructure * params) +{ + /* This element cannot passthrough the crop meta, because it would convert the + * wrong sub-region of the image, and worst, our output image may not be large + * enough for the crop to be applied later */ + if (api == GST_VIDEO_CROP_META_API_TYPE) + return FALSE; + + /* propose all other metadata upstream */ + return TRUE; +} + +static gboolean +gst_cuda_convert_set_info (GstCudaBaseTransform * btrans, GstCaps * incaps, + GstVideoInfo * in_info, GstCaps * outcaps, GstVideoInfo * out_info) +{ + /* these must match */ + if (in_info->width != out_info->width || in_info->height != out_info->height + || in_info->fps_n != out_info->fps_n || in_info->fps_d != out_info->fps_d) + goto format_mismatch; + + /* if present, these must match too */ + if (in_info->par_n != out_info->par_n || in_info->par_d != out_info->par_d) + goto format_mismatch; + + /* if present, these must match too */ + if (in_info->interlace_mode != out_info->interlace_mode) + goto format_mismatch; + + return GST_CUDA_BASE_TRANSFORM_CLASS (parent_class)->set_info (btrans, incaps, + in_info, outcaps, out_info); + + /* ERRORS */ +format_mismatch: + { + GST_ERROR_OBJECT (btrans, "input and output formats do not match"); + return FALSE; + } +}
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudaconvert.h
Added
@@ -0,0 +1,53 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CUDA_CONVERT_H__ +#define __GST_CUDA_CONVERT_H__ + +#include <gst/gst.h> + +#include "gstcudabasefilter.h" + +G_BEGIN_DECLS + +#define GST_TYPE_CUDA_CONVERT (gst_cuda_convert_get_type()) +#define GST_CUDA_CONVERT(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_CONVERT,GstCudaConvert)) +#define GST_CUDA_CONVERT_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_CONVERT,GstCudaConvertClass)) +#define GST_CUDA_CONVERT_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_CONVERT,GstCudaConvertClass)) +#define GST_IS_CUDA_CONVERT(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_CONVERT)) +#define GST_IS_CUDA_CONVERT_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_CONVERT)) + +typedef struct _GstCudaConvert GstCudaConvert; +typedef struct _GstCudaConvertClass GstCudaConvertClass; + +struct _GstCudaConvert +{ + GstCudaBaseFilter parent; +}; + +struct _GstCudaConvertClass +{ + GstCudaBaseFilterClass parent_class; +}; + +GType gst_cuda_convert_get_type (void); + +G_END_DECLS + +#endif /* __GST_CUDA_CONVERT_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudadownload.c
Added
@@ -0,0 +1,131 @@ + +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-cudadownload + * + * Downloads data from NVIDIA GPU via CUDA APIs + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstcudadownload.h" + +GST_DEBUG_CATEGORY_STATIC (gst_cuda_download_debug); +#define GST_CAT_DEFAULT gst_cuda_download_debug + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw(" GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY + "); video/x-raw")); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw")); + +G_DEFINE_TYPE (GstCudaDownload, gst_cuda_download, + GST_TYPE_CUDA_BASE_TRANSFORM); + +static GstCaps *gst_cuda_download_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter); + +static void +gst_cuda_download_class_init (GstCudaDownloadClass * klass) +{ + GstElementClass *element_class; + GstBaseTransformClass *trans_class; + + element_class = GST_ELEMENT_CLASS (klass); + trans_class = GST_BASE_TRANSFORM_CLASS (klass); +
gst_element_class_add_static_pad_template (element_class, &sink_template); + gst_element_class_add_static_pad_template (element_class, &src_template); + + gst_element_class_set_static_metadata (element_class, + "CUDA downloader", "Filter/Video", + "Downloads data from NVIDIA GPU via CUDA APIs", + "Seungha Yang <seungha.yang@navercorp.com>"); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_cuda_download_transform_caps); + + GST_DEBUG_CATEGORY_INIT (gst_cuda_download_debug, + "cudadownload", 0, "cudadownload Element"); +} + +static void +gst_cuda_download_init (GstCudaDownload * download) +{ +} + +static GstCaps * +_set_caps_features (const GstCaps * caps, const gchar * feature_name) +{ + GstCaps *tmp = gst_caps_copy (caps); + guint n = gst_caps_get_size (tmp); + guint i = 0; + + for (i = 0; i < n; i++) + gst_caps_set_features (tmp, i, + gst_caps_features_from_string (feature_name)); + + return tmp; +} + +static GstCaps * +gst_cuda_download_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *result, *tmp; + + GST_DEBUG_OBJECT (trans, + "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, + (direction == GST_PAD_SINK) ? "sink" : "src"); + + if (direction == GST_PAD_SINK) { + tmp = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); + tmp = gst_caps_merge (gst_caps_ref (caps), tmp); + } else { + GstCaps *newcaps; + tmp = gst_caps_ref (caps); + + newcaps = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY); + tmp = gst_caps_merge (tmp, newcaps); + } + + if (filter) { + result = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + } else { + result = tmp; + } + + GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, result); + + return result; +}
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudadownload.h
Added
@@ -0,0 +1,51 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CUDA_DOWNLOAD_H__ +#define __GST_CUDA_DOWNLOAD_H__ + +#include "gstcudabasetransform.h" + +G_BEGIN_DECLS + +#define GST_TYPE_CUDA_DOWNLOAD (gst_cuda_download_get_type()) +#define GST_CUDA_DOWNLOAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_DOWNLOAD,GstCudaDownload)) +#define GST_CUDA_DOWNLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_DOWNLOAD,GstCudaDownloadClass)) +#define GST_CUDA_DOWNLOAD_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_DOWNLOAD,GstCudaDownloadClass)) +#define GST_IS_CUDA_DOWNLOAD(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_DOWNLOAD)) +#define GST_IS_CUDA_DOWNLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_DOWNLOAD)) + +typedef struct _GstCudaDownload GstCudaDownload; +typedef struct _GstCudaDownloadClass GstCudaDownloadClass; + +struct _GstCudaDownload +{ + GstCudaBaseTransform parent; +}; + +struct _GstCudaDownloadClass +{ + GstCudaBaseTransformClass parent_class; +}; + +GType gst_cuda_download_get_type (void); + +G_END_DECLS + +#endif /* __GST_CUDA_DOWNLOAD_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudafilter.c
Added
@@ -0,0 +1,56 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstcudafilter.h" +#include "gstcudaloader.h" +#include "gstnvrtcloader.h" +#include "gstcudanvrtc.h" +#include "gstcudaconvert.h" +#include "gstcudascale.h" + +/* *INDENT-OFF* */ +const gchar *nvrtc_test_source = + "__global__ void\n" + "my_kernel (void) {}"; +/* *INDENT-ON* */ + +void +gst_cuda_filter_plugin_init (GstPlugin * plugin) +{ + gchar *test_ptx = NULL; + + if (!gst_nvrtc_load_library ()) + return; + + test_ptx = gst_cuda_nvrtc_compile (nvrtc_test_source); + + if (!test_ptx) { + return; + } + g_free (test_ptx); + + gst_element_register (plugin, "cudaconvert", GST_RANK_NONE, + GST_TYPE_CUDA_CONVERT); + gst_element_register (plugin, "cudascale", GST_RANK_NONE, + GST_TYPE_CUDA_SCALE); +}
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudafilter.h
Added
@@ -0,0 +1,31 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CUDA_FILTER_H__ +#define __GST_CUDA_FILTER_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +void gst_cuda_filter_plugin_init (GstPlugin * plugin); + +G_END_DECLS + +#endif /* __GST_CUDA_FILTER_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstcudaloader.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudaloader.c
Changed
@@ -54,6 +54,9 @@
   CUresult (CUDAAPI * CuCtxPopCurrent) (CUcontext * pctx);
   CUresult (CUDAAPI * CuCtxPushCurrent) (CUcontext ctx);
+  CUresult (CUDAAPI * CuCtxEnablePeerAccess) (CUcontext peerContext,
+      unsigned int Flags);
+  CUresult (CUDAAPI * CuCtxDisablePeerAccess) (CUcontext peerContext);
   CUresult (CUDAAPI * CuGraphicsMapResources) (unsigned int count,
       CUgraphicsResource * resources, CUstream hStream);
   CUresult (CUDAAPI * CuGraphicsUnmapResources) (unsigned int count,
@@ -69,10 +72,14 @@
   CUresult (CUDAAPI * CuMemAlloc) (CUdeviceptr * dptr, unsigned int bytesize);
   CUresult (CUDAAPI * CuMemAllocPitch) (CUdeviceptr * dptr, size_t * pPitch,
       size_t WidthInBytes, size_t Height, unsigned int ElementSizeBytes);
+  CUresult (CUDAAPI * CuMemAllocHost) (void **pp, unsigned int bytesize);
   CUresult (CUDAAPI * CuMemcpy2D) (const CUDA_MEMCPY2D * pCopy);
   CUresult (CUDAAPI * CuMemcpy2DAsync) (const CUDA_MEMCPY2D * pCopy,
       CUstream hStream);
+  CUresult (CUDAAPI * CuMemFree) (CUdeviceptr dptr);
+  CUresult (CUDAAPI * CuMemFreeHost) (void *p);
+
   CUresult (CUDAAPI * CuStreamCreate) (CUstream * phStream,
       unsigned int Flags);
   CUresult (CUDAAPI * CuStreamDestroy) (CUstream hStream);
@@ -83,6 +90,24 @@
   CUresult (CUDAAPI * CuDeviceGetName) (char *name, int len, CUdevice dev);
   CUresult (CUDAAPI * CuDeviceGetAttribute) (int *pi,
       CUdevice_attribute attrib, CUdevice dev);
+  CUresult (CUDAAPI * CuDeviceCanAccessPeer) (int *canAccessPeer,
+      CUdevice dev, CUdevice peerDev);
+  CUresult (CUDAAPI * CuDriverGetVersion) (int *driverVersion);
+
+  CUresult (CUDAAPI * CuModuleLoadData) (CUmodule * module,
+      const void *image);
+  CUresult (CUDAAPI * CuModuleUnload) (CUmodule module);
+  CUresult (CUDAAPI * CuModuleGetFunction) (CUfunction * hfunc,
+      CUmodule hmod, const char *name);
+  CUresult (CUDAAPI * CuTexObjectCreate) (CUtexObject * pTexObject,
+      const CUDA_RESOURCE_DESC * pResDesc, const CUDA_TEXTURE_DESC * pTexDesc,
+      const CUDA_RESOURCE_VIEW_DESC * pResViewDesc);
+  CUresult (CUDAAPI * CuTexObjectDestroy) (CUtexObject texObject);
+  CUresult (CUDAAPI * CuLaunchKernel) (CUfunction f, unsigned int gridDimX,
+      unsigned int gridDimY, unsigned int gridDimZ,
+      unsigned int blockDimX, unsigned int blockDimY, unsigned int blockDimZ,
+      unsigned int sharedMemBytes, CUstream hStream, void **kernelParams,
+      void **extra);
   CUresult (CUDAAPI * CuGraphicsGLRegisterImage) (CUgraphicsResource *
       pCudaResource, unsigned int image, unsigned int target,
@@ -125,6 +150,8 @@
   LOAD_SYMBOL (cuCtxDestroy, CuCtxDestroy);
   LOAD_SYMBOL (cuCtxPopCurrent, CuCtxPopCurrent);
   LOAD_SYMBOL (cuCtxPushCurrent, CuCtxPushCurrent);
+  LOAD_SYMBOL (cuCtxEnablePeerAccess, CuCtxEnablePeerAccess);
+  LOAD_SYMBOL (cuCtxDisablePeerAccess, CuCtxDisablePeerAccess);
   LOAD_SYMBOL (cuGraphicsMapResources, CuGraphicsMapResources);
   LOAD_SYMBOL (cuGraphicsUnmapResources, CuGraphicsUnmapResources);
@@ -136,9 +163,12 @@
   LOAD_SYMBOL (cuMemAlloc, CuMemAlloc);
   LOAD_SYMBOL (cuMemAllocPitch, CuMemAllocPitch);
+  LOAD_SYMBOL (cuMemAllocHost, CuMemAllocHost);
   LOAD_SYMBOL (cuMemcpy2D, CuMemcpy2D);
   LOAD_SYMBOL (cuMemcpy2DAsync, CuMemcpy2DAsync);
+  LOAD_SYMBOL (cuMemFree, CuMemFree);
+  LOAD_SYMBOL (cuMemFreeHost, CuMemFreeHost);
+
   LOAD_SYMBOL (cuStreamCreate, CuStreamCreate);
   LOAD_SYMBOL (cuStreamDestroy, CuStreamDestroy);
@@ -148,6 +178,16 @@
   LOAD_SYMBOL (cuDeviceGetCount, CuDeviceGetCount);
   LOAD_SYMBOL (cuDeviceGetName, CuDeviceGetName);
   LOAD_SYMBOL (cuDeviceGetAttribute, CuDeviceGetAttribute);
+  LOAD_SYMBOL (cuDeviceCanAccessPeer, CuDeviceCanAccessPeer);
+
+  LOAD_SYMBOL (cuDriverGetVersion, CuDriverGetVersion);
+
+  LOAD_SYMBOL (cuModuleLoadData, CuModuleLoadData);
+  LOAD_SYMBOL (cuModuleUnload, CuModuleUnload);
+  LOAD_SYMBOL (cuModuleGetFunction, CuModuleGetFunction);
+  LOAD_SYMBOL (cuTexObjectCreate, CuTexObjectCreate);
+  LOAD_SYMBOL (cuTexObjectDestroy, CuTexObjectDestroy);
+  LOAD_SYMBOL (cuLaunchKernel, CuLaunchKernel);

   /* cudaGL.h */
   LOAD_SYMBOL (cuGraphicsGLRegisterImage, CuGraphicsGLRegisterImage);
@@ -222,6 +262,22 @@
 }

 CUresult CUDAAPI
+CuCtxEnablePeerAccess (CUcontext peerContext, unsigned int Flags)
+{
+  g_assert (gst_cuda_vtable.CuCtxEnablePeerAccess != NULL);
+
+  return gst_cuda_vtable.CuCtxEnablePeerAccess (peerContext, Flags);
+}
+
+CUresult CUDAAPI
+CuCtxDisablePeerAccess (CUcontext peerContext)
+{
+  g_assert (gst_cuda_vtable.CuCtxDisablePeerAccess != NULL);
+
+  return gst_cuda_vtable.CuCtxDisablePeerAccess (peerContext);
+}
+
+CUresult CUDAAPI
 CuGraphicsMapResources (unsigned int count, CUgraphicsResource * resources,
     CUstream hStream)
 {
@@ -286,6 +342,14 @@
 }

 CUresult CUDAAPI
+CuMemAllocHost (void **pp, unsigned int bytesize)
+{
+  g_assert (gst_cuda_vtable.CuMemAllocHost != NULL);
+
+  return gst_cuda_vtable.CuMemAllocHost (pp, bytesize);
+}
+
+CUresult CUDAAPI
 CuMemcpy2D (const CUDA_MEMCPY2D * pCopy)
 {
   g_assert (gst_cuda_vtable.CuMemcpy2D != NULL);
@@ -310,6 +374,14 @@
 }

 CUresult CUDAAPI
+CuMemFreeHost (void *p)
+{
+  g_assert (gst_cuda_vtable.CuMemFreeHost != NULL);
+
+  return gst_cuda_vtable.CuMemFreeHost (p);
+}
+
+CUresult CUDAAPI
 CuStreamCreate (CUstream * phStream, unsigned int Flags)
 {
   g_assert (gst_cuda_vtable.CuStreamCreate != NULL);
@@ -365,6 +437,79 @@
   return gst_cuda_vtable.CuDeviceGetAttribute (pi, attrib, dev);
 }

+CUresult CUDAAPI
+CuDeviceCanAccessPeer (int *canAccessPeer, CUdevice dev, CUdevice peerDev)
+{
+  g_assert (gst_cuda_vtable.CuDeviceCanAccessPeer != NULL);
+
+  return gst_cuda_vtable.CuDeviceCanAccessPeer (canAccessPeer, dev, peerDev);
+}
+
+CUresult CUDAAPI
+CuDriverGetVersion (int *driverVersion)
+{
+  g_assert (gst_cuda_vtable.CuDriverGetVersion != NULL);
+
+  return gst_cuda_vtable.CuDriverGetVersion (driverVersion);
+}
+
+CUresult CUDAAPI
+CuModuleLoadData (CUmodule * module, const void *image)
+{
+  g_assert (gst_cuda_vtable.CuModuleLoadData != NULL);
+
+  return gst_cuda_vtable.CuModuleLoadData (module, image);
+}
+
+CUresult CUDAAPI
+CuModuleUnload (CUmodule module)
+{
+  g_assert (gst_cuda_vtable.CuModuleUnload != NULL);
+
+  return gst_cuda_vtable.CuModuleUnload (module);
+}
+
+CUresult CUDAAPI
+CuModuleGetFunction (CUfunction * hfunc, CUmodule hmod, const char *name)
+{
+  g_assert (gst_cuda_vtable.CuModuleGetFunction != NULL);
+
+  return gst_cuda_vtable.CuModuleGetFunction (hfunc, hmod, name);
+}
+
+CUresult CUDAAPI
+CuTexObjectCreate (CUtexObject * pTexObject,
+    const CUDA_RESOURCE_DESC * pResDesc, const CUDA_TEXTURE_DESC * pTexDesc,
+    const CUDA_RESOURCE_VIEW_DESC * pResViewDesc)
+{
+  g_assert (gst_cuda_vtable.CuTexObjectCreate != NULL);
+
+  return gst_cuda_vtable.CuTexObjectCreate (pTexObject, pResDesc, pTexDesc,
+      pResViewDesc);
+}
+
+CUresult CUDAAPI
+CuTexObjectDestroy (CUtexObject texObject)
+{
+  g_assert (gst_cuda_vtable.CuTexObjectDestroy != NULL);
+
+  return gst_cuda_vtable.CuTexObjectDestroy (texObject);
+}
+
+CUresult CUDAAPI
+CuLaunchKernel (CUfunction f, unsigned int gridDimX,
+    unsigned int gridDimY, unsigned int gridDimZ,
+    unsigned int blockDimX, unsigned int blockDimY, unsigned int blockDimZ,
+    unsigned int sharedMemBytes, CUstream hStream, void **kernelParams,
+    void **extra)
+{
+  g_assert (gst_cuda_vtable.CuLaunchKernel != NULL);
+
+  return gst_cuda_vtable.CuLaunchKernel (f, gridDimX, gridDimY, gridDimZ,
+      blockDimX, blockDimY, blockDimZ, sharedMemBytes, hStream, kernelParams,
+      extra);
+}
+
 /* cudaGL.h */
 CUresult CUDAAPI
 CuGraphicsGLRegisterImage (CUgraphicsResource * pCudaResource,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstcudaloader.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudaloader.h
Changed
@@ -26,123 +26,138 @@

 G_BEGIN_DECLS

-G_GNUC_INTERNAL
 gboolean gst_cuda_load_library (void);

 /* cuda.h */
-G_GNUC_INTERNAL
 CUresult CUDAAPI CuInit (unsigned int Flags);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGetErrorName (CUresult error, const char **pStr);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGetErrorString (CUresult error, const char **pStr);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuCtxCreate (CUcontext * pctx, unsigned int flags,
     CUdevice dev);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuCtxDestroy (CUcontext ctx);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuCtxPopCurrent (CUcontext * pctx);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuCtxPushCurrent (CUcontext ctx);

-G_GNUC_INTERNAL
+CUresult CUDAAPI CuCtxEnablePeerAccess (CUcontext peerContext,
+    unsigned int Flags);
+
+CUresult CUDAAPI CuCtxDisablePeerAccess (CUcontext peerContext);
+
 CUresult CUDAAPI CuGraphicsMapResources (unsigned int count,
     CUgraphicsResource * resources, CUstream hStream);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsUnmapResources (unsigned int count,
     CUgraphicsResource * resources, CUstream hStream);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsSubResourceGetMappedArray (CUarray * pArray,
     CUgraphicsResource resource, unsigned int arrayIndex,
     unsigned int mipLevel);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsResourceGetMappedPointer (CUdeviceptr * pDevPtr,
     size_t * pSize, CUgraphicsResource resource);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsUnregisterResource (CUgraphicsResource resource);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuMemAlloc (CUdeviceptr * dptr, unsigned int bytesize);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuMemAllocPitch (CUdeviceptr * dptr, size_t * pPitch,
     size_t WidthInBytes, size_t Height, unsigned int ElementSizeBytes);

-G_GNUC_INTERNAL
+CUresult CUDAAPI CuMemAllocHost (void **pp,
+    unsigned int bytesize);
+
 CUresult CUDAAPI CuMemcpy2D (const CUDA_MEMCPY2D * pCopy);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuMemcpy2DAsync (const CUDA_MEMCPY2D *pCopy,
     CUstream hStream);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuMemFree (CUdeviceptr dptr);

-G_GNUC_INTERNAL
+CUresult CUDAAPI CuMemFreeHost (void *p);
+
 CUresult CUDAAPI CuStreamCreate (CUstream *phStream, unsigned int Flags);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuStreamDestroy (CUstream hStream);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuStreamSynchronize (CUstream hStream);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuDeviceGet (CUdevice * device, int ordinal);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuDeviceGetCount (int *count);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuDeviceGetName (char *name, int len, CUdevice dev);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuDeviceGetAttribute (int *pi, CUdevice_attribute attrib,
     CUdevice dev);

+CUresult CUDAAPI CuDeviceCanAccessPeer (int *canAccessPeer,
+    CUdevice dev,
+    CUdevice peerDev);
+
+CUresult CUDAAPI CuDriverGetVersion (int * driverVersion);
+
+CUresult CUDAAPI CuModuleLoadData (CUmodule* module,
+    const void *image);
+
+CUresult CUDAAPI CuModuleUnload (CUmodule module);
+
+CUresult CUDAAPI CuModuleGetFunction (CUfunction* hfunc,
+    CUmodule hmod,
+    const char* name);
+
+CUresult CUDAAPI CuTexObjectCreate (CUtexObject *pTexObject,
+    const CUDA_RESOURCE_DESC *pResDesc,
+    const CUDA_TEXTURE_DESC *pTexDesc,
+    const CUDA_RESOURCE_VIEW_DESC *pResViewDesc);
+
+CUresult CUDAAPI CuTexObjectDestroy (CUtexObject texObject);
+
+CUresult CUDAAPI CuLaunchKernel (CUfunction f,
+    unsigned int gridDimX,
+    unsigned int gridDimY,
+    unsigned int gridDimZ,
+    unsigned int blockDimX,
+    unsigned int blockDimY,
+    unsigned int blockDimZ,
+    unsigned int sharedMemBytes,
+    CUstream hStream,
+    void **kernelParams,
+    void **extra);
+
 /* cudaGL.h */
-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsGLRegisterImage (CUgraphicsResource * pCudaResource,
     unsigned int image, unsigned int target, unsigned int Flags);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsGLRegisterBuffer (CUgraphicsResource * pCudaResource,
     unsigned int buffer, unsigned int Flags);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGraphicsResourceSetMapFlags (CUgraphicsResource resource,
     unsigned int flags);

-G_GNUC_INTERNAL
 CUresult CUDAAPI CuGLGetDevices (unsigned int * pCudaDeviceCount,
     CUdevice * pCudaDevices, unsigned int cudaDeviceCount,
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudamemory.c
Added
@@ -0,0 +1,485 @@
+/* GStreamer
+ * Copyright (C) <2018-2019> Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstcudamemory.h"
+#include "gstcudautils.h"
+
+#include <string.h>
+
+GST_DEBUG_CATEGORY_STATIC (cudaallocator_debug);
+#define GST_CAT_DEFAULT cudaallocator_debug
+GST_DEBUG_CATEGORY_STATIC (GST_CAT_MEMORY);
+
+#define gst_cuda_allocator_parent_class parent_class
+G_DEFINE_TYPE (GstCudaAllocator, gst_cuda_allocator, GST_TYPE_ALLOCATOR);
+
+static void gst_cuda_allocator_dispose (GObject * object);
+static void gst_cuda_allocator_free (GstAllocator * allocator,
+    GstMemory * memory);
+
+static gpointer cuda_mem_map (GstCudaMemory * mem, gsize maxsize,
+    GstMapFlags flags);
+static void cuda_mem_unmap_full (GstCudaMemory * mem, GstMapInfo * info);
+static GstMemory *cuda_mem_copy (GstMemory * mem, gssize offset, gssize size);
+
+static GstMemory *
+gst_cuda_allocator_dummy_alloc (GstAllocator * allocator, gsize size,
+    GstAllocationParams * params)
+{
+  g_return_val_if_reached (NULL);
+}
+
+static void
+gst_cuda_allocator_class_init (GstCudaAllocatorClass * klass)
+{
+  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
+  GstAllocatorClass *allocator_class = GST_ALLOCATOR_CLASS (klass);
+
+  gobject_class->dispose = gst_cuda_allocator_dispose;
+
+  allocator_class->alloc = GST_DEBUG_FUNCPTR (gst_cuda_allocator_dummy_alloc);
+  allocator_class->free = GST_DEBUG_FUNCPTR (gst_cuda_allocator_free);
+
+  GST_DEBUG_CATEGORY_INIT (cudaallocator_debug, "cudaallocator", 0,
+      "CUDA Allocator");
+  GST_DEBUG_CATEGORY_GET (GST_CAT_MEMORY, "GST_MEMORY");
+}
+
+static void
+gst_cuda_allocator_init (GstCudaAllocator * allocator)
+{
+  GstAllocator *alloc = GST_ALLOCATOR_CAST (allocator);
+
+  GST_DEBUG_OBJECT (allocator, "init");
+
+  alloc->mem_type = GST_CUDA_MEMORY_TYPE_NAME;
+
+  alloc->mem_map = (GstMemoryMapFunction) cuda_mem_map;
+  alloc->mem_unmap_full = (GstMemoryUnmapFullFunction) cuda_mem_unmap_full;
+  alloc->mem_copy = (GstMemoryCopyFunction) cuda_mem_copy;
+
+  GST_OBJECT_FLAG_SET (allocator, GST_ALLOCATOR_FLAG_CUSTOM_ALLOC);
+}
+
+static void
+gst_cuda_allocator_dispose (GObject * object)
+{
+  GstCudaAllocator *self = GST_CUDA_ALLOCATOR_CAST (object);
+
+  GST_DEBUG_OBJECT (self, "dispose");
+
+  gst_clear_object (&self->context);
+  G_OBJECT_CLASS (parent_class)->dispose (object);
+}
+
+GstMemory *
+gst_cuda_allocator_alloc (GstAllocator * allocator, gsize size,
+    GstCudaAllocationParams * params)
+{
+  GstCudaAllocator *self = GST_CUDA_ALLOCATOR_CAST (allocator);
+  gsize maxsize = size + params->parent.prefix + params->parent.padding;
+  gsize align = params->parent.align;
+  gsize offset = params->parent.prefix;
+  GstMemoryFlags flags = params->parent.flags;
+  CUdeviceptr data;
+  gboolean ret = FALSE;
+  GstCudaMemory *mem;
+  GstVideoInfo *info = &params->info;
+  gint i;
+  guint width, height;
+  gsize stride, plane_offset;
+
+  if (!gst_cuda_context_push (self->context))
+    return NULL;
+
+  /* ensure configured alignment */
+  align |= gst_memory_alignment;
+  /* allocate more to compensate for alignment */
+  maxsize += align;
+
+  GST_CAT_DEBUG_OBJECT (GST_CAT_MEMORY, self, "allocate new cuda memory");
+
+  width = GST_VIDEO_INFO_COMP_WIDTH (info, 0) *
+      GST_VIDEO_INFO_COMP_PSTRIDE (info, 0);
+  height = 0;
+  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++)
+    height += GST_VIDEO_INFO_COMP_HEIGHT (info, i);
+
+  ret = gst_cuda_result (CuMemAllocPitch (&data, &stride, width, height, 16));
+  gst_cuda_context_pop (NULL);
+
+  if (G_UNLIKELY (!ret)) {
+    GST_CAT_ERROR_OBJECT (GST_CAT_MEMORY, self, "CUDA allocation failure");
+    return NULL;
+  }
+
+  mem = g_new0 (GstCudaMemory, 1);
+  g_mutex_init (&mem->lock);
+  mem->data = data;
+  mem->alloc_params = *params;
+  mem->stride = stride;
+
+  plane_offset = 0;
+  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) {
+    mem->offset[i] = plane_offset;
+    plane_offset += stride * GST_VIDEO_INFO_COMP_HEIGHT (info, i);
+  }
+
+  mem->context = gst_object_ref (self->context);
+
+  gst_memory_init (GST_MEMORY_CAST (mem),
+      flags, GST_ALLOCATOR_CAST (self), NULL, maxsize, align, offset, size);
+
+  return GST_MEMORY_CAST (mem);
+}
+
+static void
+gst_cuda_allocator_free (GstAllocator * allocator, GstMemory * memory)
+{
+  GstCudaAllocator *self = GST_CUDA_ALLOCATOR_CAST (allocator);
+  GstCudaMemory *mem = GST_CUDA_MEMORY_CAST (memory);
+
+  GST_CAT_DEBUG_OBJECT (GST_CAT_MEMORY, allocator, "free cuda memory");
+
+  g_mutex_clear (&mem->lock);
+
+  gst_cuda_context_push (self->context);
+  if (mem->data)
+    gst_cuda_result (CuMemFree (mem->data));
+
+  if (mem->map_alloc_data)
+    gst_cuda_result (CuMemFreeHost (mem->map_alloc_data));
+
+  gst_cuda_context_pop (NULL);
+  gst_object_unref (mem->context);
+
+  g_free (mem);
+}
+
+/* called with lock */
+static gboolean
+gst_cuda_memory_upload_transfer (GstCudaMemory * mem)
+{
+  gint i;
+  GstVideoInfo *info = &mem->alloc_params.info;
+  gboolean ret = TRUE;
+
+  if (!mem->map_data) {
+    GST_CAT_ERROR (GST_CAT_MEMORY, "no staging memory to upload");
+    return FALSE;
+  }
+
+  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) {
+    CUDA_MEMCPY2D param = { 0, };
+
+    param.srcMemoryType = CU_MEMORYTYPE_HOST;
+    param.srcHost =
+        (guint8 *) mem->map_data + GST_VIDEO_INFO_PLANE_OFFSET (info, i);
+    param.srcPitch = GST_VIDEO_INFO_PLANE_STRIDE (info, i);
+
+    param.dstMemoryType = CU_MEMORYTYPE_DEVICE;
+    param.dstDevice = mem->data + mem->offset[i];
+    param.dstPitch = mem->stride;
+    param.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (info, i) *
+        GST_VIDEO_INFO_COMP_PSTRIDE (info, i);
+    param.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i);
+
+    if (!gst_cuda_result (CuMemcpy2DAsync (&param, NULL))) {
+      GST_CAT_ERROR (GST_CAT_MEMORY, "Failed to copy %dth plane", i);
+      ret = FALSE;
+      break;
+    }
+  }
+  gst_cuda_result (CuStreamSynchronize (NULL));
+
+  return ret;
+}
+
+/* called with lock */
+static gboolean
+gst_cuda_memory_download_transfer (GstCudaMemory * mem)
+{
+  gint i;
+  GstVideoInfo *info = &mem->alloc_params.info;
+
+  if (!mem->map_data) {
+    GST_CAT_ERROR (GST_CAT_MEMORY, "no staging memory to upload");
+    return FALSE;
+  }
+
+  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) {
+    CUDA_MEMCPY2D param = { 0, };
+
+    param.srcMemoryType = CU_MEMORYTYPE_DEVICE;
+    param.srcDevice = mem->data + mem->offset[i];
+    param.srcPitch = mem->stride;
+
+    param.dstMemoryType = CU_MEMORYTYPE_HOST;
+    param.dstHost =
+        (guint8 *) mem->map_data + GST_VIDEO_INFO_PLANE_OFFSET (info, i);
+    param.dstPitch = GST_VIDEO_INFO_PLANE_STRIDE (info, i);
+    param.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (info, i) *
+        GST_VIDEO_INFO_COMP_PSTRIDE (info, i);
+    param.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i);
+
+    if (!gst_cuda_result (CuMemcpy2DAsync (&param, NULL))) {
+      GST_CAT_ERROR (GST_CAT_MEMORY, "Failed to copy %dth plane", i);
+      CuMemFreeHost (mem->map_alloc_data);
+      mem->map_alloc_data = mem->map_data = mem->align_data = NULL;
+      break;
+    }
+  }
+  gst_cuda_result (CuStreamSynchronize (NULL));
+
+  return ! !mem->map_data;
+}
+
+static gpointer
+gst_cuda_memory_device_memory_map (GstCudaMemory * mem)
+{
+  GstMemory *memory = GST_MEMORY_CAST (mem);
+  gpointer data;
+  gsize aoffset;
+  gsize align = memory->align;
+
+  if (mem->map_data) {
+    return mem->map_data;
+  }
+
+  GST_CAT_DEBUG (GST_CAT_MEMORY, "alloc host memory for map");
+
+  if (!mem->map_alloc_data) {
+    gsize maxsize;
+    guint8 *align_data;
+
+    maxsize = memory->maxsize + align;
+    if (!gst_cuda_context_push (mem->context)) {
+      GST_CAT_ERROR (GST_CAT_MEMORY, "cannot push cuda context");
+
+      return NULL;
+    }
+
+    if (!gst_cuda_result (CuMemAllocHost (&data, maxsize))) {
+      GST_CAT_ERROR (GST_CAT_MEMORY, "cannot alloc host memory");
+      gst_cuda_context_pop (NULL);
+
+      return NULL;
+    }
+
+    if (!gst_cuda_context_pop (NULL)) {
+      GST_CAT_WARNING (GST_CAT_MEMORY, "cannot pop cuda context");
+    }
+
+    mem->map_alloc_data = data;
+    align_data = data;
+
+    /* do align */
+    if ((aoffset = ((guintptr) align_data & align))) {
+      aoffset = (align + 1) - aoffset;
+      align_data += aoffset;
+    }
+    mem->align_data = align_data;
+
+    /* first memory, always need download to staging */
+    GST_MINI_OBJECT_FLAG_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD);
+  }
+
+  mem->map_data = mem->align_data;
+
+  if (GST_MEMORY_FLAG_IS_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD)) {
+    if (!gst_cuda_context_push (mem->context)) {
+      GST_CAT_ERROR (GST_CAT_MEMORY, "cannot push cuda context");
+
+      return NULL;
+    }
+
+    gst_cuda_memory_download_transfer (mem);
+
+    if (!gst_cuda_context_pop (NULL)) {
+      GST_CAT_WARNING (GST_CAT_MEMORY, "cannot pop cuda context");
+    }
+  }
+
+  return mem->map_data;
+}
+
+static gpointer
+cuda_mem_map (GstCudaMemory * mem, gsize maxsize, GstMapFlags flags)
+{
+  gpointer ret = NULL;
+
+  g_mutex_lock (&mem->lock);
+  mem->map_count++;
+
+  if ((flags & GST_MAP_CUDA) == GST_MAP_CUDA) {
+    /* upload from staging to device memory if necessary */
+    if (GST_MEMORY_FLAG_IS_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_UPLOAD)) {
+      if (!gst_cuda_context_push (mem->context)) {
+        GST_CAT_ERROR (GST_CAT_MEMORY, "cannot push cuda context");
+        g_mutex_unlock (&mem->lock);
+
+        return NULL;
+      }
+
+      if (!gst_cuda_memory_upload_transfer (mem)) {
+        g_mutex_unlock (&mem->lock);
+        return NULL;
+      }
+
+      gst_cuda_context_pop (NULL);
+    }
+
+    GST_MEMORY_FLAG_UNSET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_UPLOAD);
+
+    if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE)
+      GST_MINI_OBJECT_FLAG_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD);
+
+    g_mutex_unlock (&mem->lock);
+    return (gpointer) mem->data;
+  }
+
+  ret = gst_cuda_memory_device_memory_map (mem);
+  if (ret == NULL) {
+    mem->map_count--;
+    g_mutex_unlock (&mem->lock);
+    return NULL;
+  }
+
+  if ((flags & GST_MAP_WRITE) == GST_MAP_WRITE)
+    GST_MINI_OBJECT_FLAG_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_UPLOAD);
+
+  GST_MEMORY_FLAG_UNSET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD);
+
+  g_mutex_unlock (&mem->lock);
+
+  return ret;
+}
+
+static void
+cuda_mem_unmap_full (GstCudaMemory * mem, GstMapInfo * info)
+{
+  g_mutex_lock (&mem->lock);
+  mem->map_count--;
+  GST_CAT_TRACE (GST_CAT_MEMORY,
+      "unmap CUDA memory %p, map count %d, have map_data %s",
+      mem, mem->map_count, mem->map_data ? "true" : "false");
+
+  if ((info->flags & GST_MAP_CUDA) == GST_MAP_CUDA) {
+    if ((info->flags & GST_MAP_WRITE) == GST_MAP_WRITE)
+      GST_MINI_OBJECT_FLAG_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD);
+
+    g_mutex_unlock (&mem->lock);
+    return;
+  }
+
+  if ((info->flags & GST_MAP_WRITE))
+    GST_MINI_OBJECT_FLAG_SET (mem, GST_CUDA_MEMORY_TRANSFER_NEED_UPLOAD);
+
+  if (mem->map_count > 0 || !mem->map_data) {
+    g_mutex_unlock (&mem->lock);
+    return;
+  }
+
+  mem->map_data = NULL;
+  g_mutex_unlock (&mem->lock);
+
+  return;
+}
+
+static GstMemory *
+cuda_mem_copy (GstMemory * mem, gssize offset, gssize size)
+{
+  GstMemory *copy;
+  GstCudaMemory *src_mem = GST_CUDA_MEMORY_CAST (mem);
+  GstCudaMemory *dst_mem;
+  GstCudaContext *ctx = GST_CUDA_ALLOCATOR_CAST (mem->allocator)->context;
+  gint i;
+  GstVideoInfo *info;
+
+  /* offset and size are ignored */
+  copy = gst_cuda_allocator_alloc (mem->allocator, mem->size,
+      &src_mem->alloc_params);
+
+  dst_mem = GST_CUDA_MEMORY_CAST (copy);
+
+  info = &src_mem->alloc_params.info;
+
+  if (!gst_cuda_context_push (ctx)) {
+    GST_CAT_ERROR (GST_CAT_MEMORY, "cannot push cuda context");
+    gst_cuda_allocator_free (mem->allocator, copy);
+
+    return NULL;
+  }
+
+  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) {
+    CUDA_MEMCPY2D param = { 0, };
+
+    param.srcMemoryType = CU_MEMORYTYPE_DEVICE;
+    param.srcDevice = src_mem->data + src_mem->offset[i];
+    param.srcPitch = src_mem->stride;
+
+    param.dstMemoryType = CU_MEMORYTYPE_DEVICE;
+    param.dstDevice = dst_mem->data + dst_mem->offset[i];
+    param.dstPitch = dst_mem->stride;
+    param.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (info, i) *
+        GST_VIDEO_INFO_COMP_PSTRIDE (info, i);
+    param.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i);
+
+    if (!gst_cuda_result (CuMemcpy2DAsync (&param, NULL))) {
+      GST_CAT_ERROR_OBJECT (GST_CAT_MEMORY,
+          mem->allocator, "Failed to copy %dth plane", i);
+      gst_cuda_context_pop (NULL);
+      gst_cuda_allocator_free (mem->allocator, copy);
+
+      return NULL;
+    }
+  }
+
+  gst_cuda_result (CuStreamSynchronize (NULL));
+
+  if (!gst_cuda_context_pop (NULL)) {
+    GST_CAT_WARNING (GST_CAT_MEMORY, "cannot pop cuda context");
+  }
+
+  return copy;
+}
+
+GstAllocator *
+gst_cuda_allocator_new (GstCudaContext * context)
+{
+  GstCudaAllocator *allocator;
+
+  g_return_val_if_fail (GST_IS_CUDA_CONTEXT (context), NULL);
+
+  allocator = g_object_new (GST_TYPE_CUDA_ALLOCATOR, NULL);
+  allocator->context = gst_object_ref (context);
+
+  return GST_ALLOCATOR_CAST (allocator);
+}
+
+gboolean
+gst_is_cuda_memory (GstMemory * mem)
+{
+  return mem != NULL && mem->allocator != NULL &&
+      GST_IS_CUDA_ALLOCATOR (mem->allocator);
+}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudamemory.h
Added
@@ -0,0 +1,138 @@
+/* GStreamer
+ * Copyright (C) <2018-2019> Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_CUDA_MEMORY_H__
+#define __GST_CUDA_MEMORY_H__
+
+#include <gst/gst.h>
+#include <gst/gstallocator.h>
+#include <gst/video/video.h>
+#include "gstcudaloader.h"
+#include "gstcudacontext.h"
+
+G_BEGIN_DECLS
+
+#define GST_TYPE_CUDA_ALLOCATOR (gst_cuda_allocator_get_type())
+#define GST_CUDA_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_ALLOCATOR,GstCudaAllocator))
+#define GST_CUDA_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_ALLOCATOR,GstCudaAllocatorClass))
+#define GST_CUDA_ALLOCATOR_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_ALLOCATOR,GstCudaAllocatorClass))
+#define GST_IS_CUDA_ALLOCATOR(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_ALLOCATOR))
+#define GST_IS_CUDA_ALLOCATOR_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_ALLOCATOR))
+#define GST_CUDA_ALLOCATOR_CAST(obj) ((GstCudaAllocator *)(obj))
+#define GST_CUDA_MEMORY_CAST(mem) ((GstCudaMemory *) (mem))
+
+typedef struct _GstCudaAllocationParams GstCudaAllocationParams;
+typedef struct _GstCudaAllocator GstCudaAllocator;
+typedef struct _GstCudaAllocatorClass GstCudaAllocatorClass;
+typedef struct _GstCudaMemory GstCudaMemory;
+
+/**
+ * GST_MAP_CUDA:
+ *
+ * Flag indicating that we should map the CUDA device memory
+ * instead of to system memory.
+ *
+ * Combining #GST_MAP_CUDA with #GST_MAP_WRITE has the same semantics as though
+ * you are writing to CUDA device/host memory.
+ * Conversely, combining #GST_MAP_CUDA with
+ * #GST_MAP_READ has the same semantics as though you are reading from
+ * CUDA device/host memory
+ */
+#define GST_MAP_CUDA (GST_MAP_FLAG_LAST << 1)
+
+#define GST_CUDA_MEMORY_TYPE_NAME "gst.cuda.memory"
+
+/**
+ * GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY:
+ *
+ * Name of the caps feature for indicating the use of #GstCudaMemory
+ */
+#define GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY "memory:CUDAMemory"
+
+struct _GstCudaAllocationParams
+{
+  GstAllocationParams parent;
+
+  GstVideoInfo info;
+};
+
+struct _GstCudaAllocator
+{
+  GstAllocator parent;
+  GstCudaContext *context;
+};
+
+struct _GstCudaAllocatorClass
+{
+  GstAllocatorClass parent_class;
+};
+
+GType gst_cuda_allocator_get_type (void);
+
+GstAllocator * gst_cuda_allocator_new (GstCudaContext * context);
+
+GstMemory * gst_cuda_allocator_alloc (GstAllocator * allocator,
+    gsize size,
+    GstCudaAllocationParams * params);
+
+/**
+ * GstCudaMemoryTransfer:
+ * @GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD: the device memory needs downloading
+ *    to the staging memory
+ * @GST_CUDA_MEMORY_TRANSFER_NEED_UPLOAD: the staging memory needs uploading
+ *    to the device memory
+ */
+typedef enum
+{
+  GST_CUDA_MEMORY_TRANSFER_NEED_DOWNLOAD = (GST_MEMORY_FLAG_LAST << 0),
+  GST_CUDA_MEMORY_TRANSFER_NEED_UPLOAD = (GST_MEMORY_FLAG_LAST << 1)
+} GstCudaMemoryTransfer;
+
+struct _GstCudaMemory
+{
+  GstMemory mem;
+
+  GstCudaContext *context;
+  CUdeviceptr data;
+
+  GstCudaAllocationParams alloc_params;
+
+  /* offset and stride of CUDA device memory */
+  gsize offset[GST_VIDEO_MAX_PLANES];
+  gint stride;
+
+  /* allocated CUDA Host memory */
+  gpointer map_alloc_data;
+
+  /* aligned CUDA Host memory */
+  guint8 *align_data;
+
+  /* pointing align_data if the memory is mapped */
+  gpointer map_data;
+
+  gint map_count;
+
+  GMutex lock;
+};
+
+gboolean gst_is_cuda_memory (GstMemory * mem);
+
+G_END_DECLS
+
+#endif /* __GST_CUDA_MEMORY_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudanvrtc.c
Added
@@ -0,0 +1,120 @@
+/* GStreamer
+ * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstcudanvrtc.h"
+
+GST_DEBUG_CATEGORY_STATIC (gst_cuda_nvrtc_debug);
+#define GST_CAT_DEFAULT gst_cuda_nvrtc_debug
+
+static void
+_init_debug (void)
+{
+  static gsize once_init = 0;
+
+  if (g_once_init_enter (&once_init)) {
+
+    GST_DEBUG_CATEGORY_INIT (gst_cuda_nvrtc_debug, "cudanvrtc", 0,
+        "CUDA runtime compiler");
+    g_once_init_leave (&once_init, 1);
+  }
+}
+
+gchar *
+gst_cuda_nvrtc_compile (const gchar * source)
+{
+  nvrtcProgram prog;
+  nvrtcResult ret;
+  CUresult curet;
+  const gchar *opts[] = { "--gpu-architecture=compute_30" };
+  gsize ptx_size;
+  gchar *ptx = NULL;
+  int driverVersion;
+
+  g_return_val_if_fail (source != NULL, FALSE);
+
+  _init_debug ();
+
+  GST_TRACE ("CUDA kernel source \n%s", source);
+
+  curet = CuDriverGetVersion (&driverVersion);
+  if (curet != CUDA_SUCCESS) {
+    GST_ERROR ("Failed to query CUDA Driver version, ret %d", curet);
+    return NULL;
+  }
+
+  GST_DEBUG ("CUDA Driver Version %d.%d", driverVersion / 1000,
+      (driverVersion % 1000) / 10);
+
+  ret = NvrtcCreateProgram (&prog, source, NULL, 0, NULL, NULL);
+  if (ret != NVRTC_SUCCESS) {
+    GST_ERROR ("couldn't create nvrtc program, ret %d", ret);
+    return NULL;
+  }
+
+  /* Starting from CUDA 11, the lowest supported architecture is 5.2 */
+  if (driverVersion >= 11000)
+    opts[0] = "--gpu-architecture=compute_52";
+
+  ret = NvrtcCompileProgram (prog, 1, opts);
+  if (ret != NVRTC_SUCCESS) {
+    gsize log_size;
+
+    GST_ERROR ("couldn't compile nvrtc program, ret %d", ret);
+    if (NvrtcGetProgramLogSize (prog, &log_size) == NVRTC_SUCCESS &&
+        log_size > 0) {
+      gchar *compile_log = g_alloca (log_size);
+      if (NvrtcGetProgramLog (prog, compile_log) == NVRTC_SUCCESS) {
+        GST_ERROR ("nvrtc compile log %s", compile_log);
+      }
+    }
+
+    goto error;
+  }
+
+  ret = NvrtcGetPTXSize (prog, &ptx_size);
+  if (ret != NVRTC_SUCCESS) {
+    GST_ERROR ("unknown ptx size, ret %d", ret);
+
+    goto error;
+  }
+
+  ptx = g_malloc0 (ptx_size);
+  ret = NvrtcGetPTX (prog, ptx);
+  if (ret != NVRTC_SUCCESS) {
+    GST_ERROR ("couldn't get ptx, ret %d", ret);
+    g_free (ptx);
+
+    goto error;
+  }
+
+  NvrtcDestroyProgram (&prog);
+
+  GST_TRACE ("compiled CUDA PTX %s\n", ptx);
+
+  return ptx;
+
+error:
+  NvrtcDestroyProgram (&prog);
+
+  return NULL;
+}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudanvrtc.h
Added
@@ -0,0 +1,33 @@
+/* GStreamer
+ * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_CUDA_NVRTC_H__
+#define __GST_CUDA_NVRTC_H__
+
+#include <gst/gst.h>
+#include "gstcudaloader.h"
+#include "gstnvrtcloader.h"
+
+G_BEGIN_DECLS
+
+gchar * gst_cuda_nvrtc_compile (const gchar * source);
+
+G_END_DECLS
+
+#endif /* __GST_CUDA_NVRTC_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudascale.c
Added
@@ -0,0 +1,607 @@
+/* GStreamer
+ * Copyright (C) <1999> Erik Walthinsen <omega@cse.ogi.edu>
+ * Copyright (C) 2005-2012 David Schleef <ds@schleef.org>
+ * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+/**
+ * SECTION:element-cudascale
+ * @title: cudascale
+ * @see_also: cudaconvert
+ *
+ * This element resizes video frames. By default the element will try to
+ * negotiate to the same size on the source and sinkpad so that no scaling
+ * is needed. It is therefore safe to insert this element in a pipeline to
+ * get more robust behaviour without any cost if no scaling is needed.
+ *
+ * This element supports some YUV formats which are also supported by
+ * nvidia encoders and decoders.
+ *
+ * ## Example pipelines
+ * |[
+ * gst-launch-1.0 -v filesrc location=videotestsrc.mp4 ! qtdemux ! h264parse ! nvh264dec ! cudaconvert ! cudascale ! cudaconvert ! cudadownload ! autovideosink
+ * ]|
+ * Decode a mp4/h264 and display the video. If the video sink chosen
+ * cannot perform scaling, the video scaling will be performed by cudascale
+ * |[
+ * gst-launch-1.0 -v filesrc location=videotestsrc.mp4 ! qtdemux ! h264parse ! nvh264dec ! cudaconvert ! cudascale ! cudaconvert ! cudadownload ! video/x-raw,width=100 ! autovideosink
+ * ]|
+ * Decode an mp4/h264 and display the video with a width of 100.
+ *
+ * Since: 1.20
+ */
+
+#ifdef HAVE_CONFIG_H
+# include <config.h>
+#endif
+
+#include "gstcudascale.h"
+#include "gstcudautils.h"
+
+GST_DEBUG_CATEGORY_STATIC (gst_cuda_scale_debug);
+#define GST_CAT_DEFAULT gst_cuda_scale_debug
+
+#define gst_cuda_scale_parent_class parent_class
+G_DEFINE_TYPE (GstCudaScale, gst_cuda_scale, GST_TYPE_CUDA_BASE_FILTER);
+
+static GstCaps *gst_cuda_scale_transform_caps (GstBaseTransform * trans,
+    GstPadDirection direction, GstCaps * caps, GstCaps * filter);
+static GstCaps *gst_cuda_scale_fixate_caps (GstBaseTransform * base,
+    GstPadDirection direction, GstCaps * caps, GstCaps * othercaps);
+static gboolean gst_cuda_scale_set_info (GstCudaBaseTransform * filter,
+    GstCaps * incaps, GstVideoInfo * in_info, GstCaps * outcaps,
+    GstVideoInfo * out_info);
+
+static void
+gst_cuda_scale_class_init (GstCudaScaleClass * klass)
+{
+  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
+  GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (klass);
+  GstCudaBaseTransformClass *btrans_class =
+      GST_CUDA_BASE_TRANSFORM_CLASS (klass);
+
+  gst_element_class_set_static_metadata (element_class,
+      "CUDA Video scaler",
+      "Filter/Converter/Video/Scaler/Hardware",
+      "Resizes Video using CUDA", "Seungha Yang <seungha.yang@navercorp.com>");
+
+  trans_class->transform_caps =
+      GST_DEBUG_FUNCPTR (gst_cuda_scale_transform_caps);
+  trans_class->fixate_caps = GST_DEBUG_FUNCPTR (gst_cuda_scale_fixate_caps);
+
+  btrans_class->set_info = GST_DEBUG_FUNCPTR (gst_cuda_scale_set_info);
+
+  GST_DEBUG_CATEGORY_INIT (gst_cuda_scale_debug,
+      "cudascale", 0, "Video Resize using CUDA");
+}
+
+static void
+gst_cuda_scale_init (GstCudaScale * cuda)
+{
+}
+
+static GstCaps *
+gst_cuda_scale_transform_caps (GstBaseTransform * trans,
+    GstPadDirection direction, GstCaps * caps, GstCaps * filter)
+{
+  GstCaps *ret;
+  GstStructure *structure;
+  GstCapsFeatures *features;
+  gint i, n;
+
+  GST_DEBUG_OBJECT (trans,
+      "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps,
+      (direction == GST_PAD_SINK) ? "sink" : "src");
+
+  ret = gst_caps_new_empty ();
+  n = gst_caps_get_size (caps);
+  for (i = 0; i < n; i++) {
+    structure = gst_caps_get_structure (caps, i);
+    features = gst_caps_get_features (caps, i);
+
+    /* If this is already expressed by the existing caps
+     * skip this structure */
+    if (i > 0 && gst_caps_is_subset_structure_full (ret, structure, features))
+      continue;
+
+    /* make copy */
+    structure = gst_structure_copy (structure);
+
+    gst_structure_set (structure, "width", GST_TYPE_INT_RANGE, 1, G_MAXINT,
+        "height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL);
+
+    /* if pixel aspect ratio, make a range of it */
+    if (gst_structure_has_field (structure, "pixel-aspect-ratio")) {
+      gst_structure_set (structure, "pixel-aspect-ratio",
+          GST_TYPE_FRACTION_RANGE, 1, G_MAXINT, G_MAXINT, 1, NULL);
+    }
+
+    gst_caps_append_structure_full (ret, structure,
+        gst_caps_features_copy (features));
+  }
+
+  if (filter) {
+    GstCaps *intersection;
+
+    intersection =
+        gst_caps_intersect_full (filter, ret, GST_CAPS_INTERSECT_FIRST);
+    gst_caps_unref (ret);
+    ret = intersection;
+  }
+
+  GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, ret);
+
+  return ret;
+}
+
+/* fork of gstvideoscale */
+static GstCaps *
+gst_cuda_scale_fixate_caps (GstBaseTransform * base, GstPadDirection direction,
+    GstCaps * caps, GstCaps * othercaps)
+{
+  GstStructure *ins, *outs;
+  const GValue *from_par, *to_par;
+  GValue fpar = { 0, }, tpar = {
+    0,};
+
+  othercaps = gst_caps_truncate (othercaps);
+  othercaps = gst_caps_make_writable (othercaps);
+
+  GST_DEBUG_OBJECT (base, "trying to fixate othercaps %" GST_PTR_FORMAT
+      " based on caps %" GST_PTR_FORMAT, othercaps, caps);
+
+  ins = gst_caps_get_structure (caps, 0);
+  outs = gst_caps_get_structure (othercaps, 0);
+
+  from_par = gst_structure_get_value (ins, "pixel-aspect-ratio");
to_par = gst_structure_get_value (outs, "pixel-aspect-ratio"); + + /* If we're fixating from the sinkpad we always set the PAR and + * assume that missing PAR on the sinkpad means 1/1 and + * missing PAR on the srcpad means undefined + */ + if (direction == GST_PAD_SINK) { + if (!from_par) { + g_value_init (&fpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&fpar, 1, 1); + from_par = &fpar; + } + if (!to_par) { + g_value_init (&tpar, GST_TYPE_FRACTION_RANGE); + gst_value_set_fraction_range_full (&tpar, 1, G_MAXINT, G_MAXINT, 1); + to_par = &tpar; + } + } else { + if (!to_par) { + g_value_init (&tpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&tpar, 1, 1); + to_par = &tpar; + + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, + NULL); + } + if (!from_par) { + g_value_init (&fpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&fpar, 1, 1); + from_par = &fpar; + } + } + + /* we have both PAR but they might not be fixated */ + { + gint from_w, from_h, from_par_n, from_par_d, to_par_n, to_par_d; + gint w = 0, h = 0; + gint from_dar_n, from_dar_d; + gint num, den; + + /* from_par should be fixed */ + g_return_val_if_fail (gst_value_is_fixed (from_par), othercaps); + + from_par_n = gst_value_get_fraction_numerator (from_par); + from_par_d = gst_value_get_fraction_denominator (from_par); + + gst_structure_get_int (ins, "width", &from_w); + gst_structure_get_int (ins, "height", &from_h); + + gst_structure_get_int (outs, "width", &w); + gst_structure_get_int (outs, "height", &h); + + /* if both width and height are already fixed, we can't do anything + * about it anymore */ + if (w && h) { + guint n, d; + + GST_DEBUG_OBJECT (base, "dimensions already set to %dx%d, not fixating", + w, h); + if (!gst_value_is_fixed (to_par)) { + if (gst_video_calculate_display_ratio (&n, &d, from_w, from_h, + from_par_n, from_par_d, w, h)) { + GST_DEBUG_OBJECT (base, "fixating to_par to %dx%d", n, d); + if (gst_structure_has_field (outs, "pixel-aspect-ratio")) + 
gst_structure_fixate_field_nearest_fraction (outs, + "pixel-aspect-ratio", n, d); + else if (n != d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + n, d, NULL); + } + } + goto done; + } + + /* Calculate input DAR */ + if (!gst_util_fraction_multiply (from_w, from_h, from_par_n, from_par_d, + &from_dar_n, &from_dar_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + GST_DEBUG_OBJECT (base, "Input DAR is %d/%d", from_dar_n, from_dar_d); + + /* If either width or height are fixed there's not much we + * can do either except choosing a height or width and PAR + * that matches the DAR as good as possible + */ + if (h) { + GstStructure *tmp; + gint set_w, set_par_n, set_par_d; + + GST_DEBUG_OBJECT (base, "height is fixed (%d)", h); + + /* If the PAR is fixed too, there's not much to do + * except choosing the width that is nearest to the + * width with the same DAR */ + if (gst_value_is_fixed (to_par)) { + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + GST_DEBUG_OBJECT (base, "PAR is fixed %d/%d", to_par_n, to_par_d); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, + to_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (h, num, den); + gst_structure_fixate_field_nearest_int (outs, "width", w); + + goto done; + } + + /* The PAR is not fixed and it's quite likely that we can set + * an arbitrary PAR. 
*/ + + /* Check if we can keep the input width */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + /* Might have failed but try to keep the DAR nonetheless by + * adjusting the PAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, h, set_w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, + &set_par_d); + gst_structure_free (tmp); + + /* Check if the adjusted PAR is accepted */ + if (set_par_n == to_par_n && set_par_d == to_par_d) { + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "width", G_TYPE_INT, set_w, + "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, + NULL); + goto done; + } + + /* Otherwise scale the width to the new PAR and check if the + * adjusted width is accepted. 
If all that fails we can't keep + * the DAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (h, num, den); + gst_structure_fixate_field_nearest_int (outs, "width", w); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + + goto done; + } else if (w) { + GstStructure *tmp; + gint set_h, set_par_n, set_par_d; + + GST_DEBUG_OBJECT (base, "width is fixed (%d)", w); + + /* If the PAR is fixed too, there's not much to do + * except choosing the height that is nearest to the + * height with the same DAR */ + if (gst_value_is_fixed (to_par)) { + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + GST_DEBUG_OBJECT (base, "PAR is fixed %d/%d", to_par_n, to_par_d); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, + to_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + h = (guint) gst_util_uint64_scale_int_round (w, den, num); + gst_structure_fixate_field_nearest_int (outs, "height", h); + + goto done; + } + + /* The PAR is not fixed and it's quite likely that we can set + * an arbitrary PAR. 
*/ + + /* Check if we can keep the input height */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + + /* Might have failed but try to keep the DAR nonetheless by + * adjusting the PAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, + &set_par_d); + gst_structure_free (tmp); + + /* Check if the adjusted PAR is accepted */ + if (set_par_n == to_par_n && set_par_d == to_par_d) { + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "height", G_TYPE_INT, set_h, + "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, + NULL); + goto done; + } + + /* Otherwise scale the height to the new PAR and check if the + * adjusted width is accepted. 
If all that fails we can't keep + * the DAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + h = (guint) gst_util_uint64_scale_int_round (w, den, num); + gst_structure_fixate_field_nearest_int (outs, "height", h); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + + goto done; + } else if (gst_value_is_fixed (to_par)) { + GstStructure *tmp; + gint set_h, set_w, f_h, f_w; + + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + /* Calculate scale factor for the PAR change */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_n, + to_par_d, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + /* Try to keep the input height (because of interlacing) */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + + /* This might have failed but try to scale the width + * to keep the DAR nonetheless */ + w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); + gst_structure_fixate_field_nearest_int (tmp, "width", w); + gst_structure_get_int (tmp, "width", &set_w); + gst_structure_free (tmp); + + /* We kept the DAR and the height is nearest to the original height */ + if (set_w == w) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + goto done; + } + + f_h = set_h; + f_w = set_w; + + /* If the former failed, try to keep the input width at least */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int 
(tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + /* This might have failed but try to scale the width + * to keep the DAR nonetheless */ + h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); + gst_structure_fixate_field_nearest_int (tmp, "height", h); + gst_structure_get_int (tmp, "height", &set_h); + gst_structure_free (tmp); + + /* We kept the DAR and the width is nearest to the original width */ + if (set_h == h) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + goto done; + } + + /* If all this failed, keep the dimensions with the DAR that was closest + * to the correct DAR. This changes the DAR but there's not much else to + * do here. + */ + if (set_w * ABS (set_h - h) < ABS (f_w - w) * f_h) { + f_h = set_h; + f_w = set_w; + } + gst_structure_set (outs, "width", G_TYPE_INT, f_w, "height", G_TYPE_INT, + f_h, NULL); + goto done; + } else { + GstStructure *tmp; + gint set_h, set_w, set_par_n, set_par_d, tmp2; + + /* width, height and PAR are not fixed but passthrough is not possible */ + + /* First try to keep the height and width as good as possible + * and scale PAR */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + gst_structure_fixate_field_nearest_int (tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, set_w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, 
+ &set_par_d); + gst_structure_free (tmp); + + if (set_par_n == to_par_n && set_par_d == to_par_d) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* Otherwise try to scale width to keep the DAR with the set + * PAR and height */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (base, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "width", w); + gst_structure_get_int (tmp, "width", &tmp2); + gst_structure_free (tmp); + + if (tmp2 == w) { + gst_structure_set (outs, "width", G_TYPE_INT, tmp2, "height", + G_TYPE_INT, set_h, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* ... 
or try the same with the height */ + h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", h); + gst_structure_get_int (tmp, "height", &tmp2); + gst_structure_free (tmp); + + if (tmp2 == h) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, tmp2, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* If all fails we can't keep the DAR and take the nearest values + * for everything from the first try */ + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + } + } + +done: + GST_DEBUG_OBJECT (base, "fixated othercaps to %" GST_PTR_FORMAT, othercaps); + + if (from_par == &fpar) + g_value_unset (&fpar); + if (to_par == &tpar) + g_value_unset (&tpar); + + return othercaps; +} + +static gboolean +gst_cuda_scale_set_info (GstCudaBaseTransform * btrans, GstCaps * incaps, + GstVideoInfo * in_info, GstCaps * outcaps, GstVideoInfo * out_info) +{ + if (GST_VIDEO_INFO_WIDTH (in_info) == GST_VIDEO_INFO_WIDTH (out_info) && + GST_VIDEO_INFO_HEIGHT (in_info) == GST_VIDEO_INFO_HEIGHT (out_info) && + GST_VIDEO_INFO_FORMAT (in_info) == GST_VIDEO_INFO_FORMAT (out_info)) { + gst_base_transform_set_passthrough (GST_BASE_TRANSFORM (btrans), TRUE); + } + + return GST_CUDA_BASE_TRANSFORM_CLASS (parent_class)->set_info (btrans, + incaps, in_info, outcaps, out_info); +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudascale.h
Added
@@ -0,0 +1,59 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CUDA_SCALE_H__ +#define __GST_CUDA_SCALE_H__ + +#include <gst/gst.h> + +#include "gstcudabasefilter.h" +#include "cuda-converter.h" + +G_BEGIN_DECLS + +#define GST_TYPE_CUDA_SCALE (gst_cuda_scale_get_type()) +#define GST_CUDA_SCALE(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_SCALE,GstCudaScale)) +#define GST_CUDA_SCALE_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_SCALE,GstCudaScaleClass)) +#define GST_CUDA_SCALE_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_SCALE,GstCudaScaleClass)) +#define GST_IS_CUDA_SCALE(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_SCALE)) +#define GST_IS_CUDA_SCALE_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_SCALE)) + +typedef struct _GstCudaScale GstCudaScale; +typedef struct _GstCudaScaleClass GstCudaScaleClass; + +struct _GstCudaScale +{ + GstCudaBaseFilter parent; + + GstCudaConverter *converter; + + CUdeviceptr in_fallback; + CUdeviceptr out_fallback; +}; + +struct _GstCudaScaleClass +{ + GstCudaBaseFilterClass parent_class; +}; + +GType gst_cuda_scale_get_type (void); + +G_END_DECLS + +#endif /* 
__GST_CUDA_SCALE_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudaupload.c
Added
@@ -0,0 +1,130 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * element-cudaupload: + * + * Uploads data to NVIDIA GPU via CUDA APIs + * + * Since: 1.20 + */ + +#ifdef HAVE_CONFIG_H +# include <config.h> +#endif + +#include "gstcudaupload.h" + +GST_DEBUG_CATEGORY_STATIC (gst_cuda_upload_debug); +#define GST_CAT_DEFAULT gst_cuda_upload_debug + +static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw; video/x-raw(" + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY ")")); + +static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw(" GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY ")")); + +G_DEFINE_TYPE (GstCudaUpload, gst_cuda_upload, GST_TYPE_CUDA_BASE_TRANSFORM); + +static GstCaps *gst_cuda_upload_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter); + +static void +gst_cuda_upload_class_init (GstCudaUploadClass * klass) +{ + GstElementClass *element_class; + GstBaseTransformClass *trans_class; + + element_class = GST_ELEMENT_CLASS (klass); + trans_class = 
GST_BASE_TRANSFORM_CLASS (klass); + + gst_element_class_add_static_pad_template (element_class, &sink_template); + gst_element_class_add_static_pad_template (element_class, &src_template); + + gst_element_class_set_static_metadata (element_class, + "CUDA uploader", "Filter/Video", + "Uploads data into NVIDIA GPU via CUDA APIs", + "Seungha Yang <seungha.yang@navercorp.com>"); + + trans_class->passthrough_on_same_caps = TRUE; + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_cuda_upload_transform_caps); + + gst_type_mark_as_plugin_api (GST_TYPE_CUDA_BASE_TRANSFORM, 0); + GST_DEBUG_CATEGORY_INIT (gst_cuda_upload_debug, + "cudaupload", 0, "cudaupload Element"); +} + +static void +gst_cuda_upload_init (GstCudaUpload * upload) +{ +} + +static GstCaps * +_set_caps_features (const GstCaps * caps, const gchar * feature_name) +{ + GstCaps *tmp = gst_caps_copy (caps); + guint n = gst_caps_get_size (tmp); + guint i = 0; + + for (i = 0; i < n; i++) + gst_caps_set_features (tmp, i, + gst_caps_features_from_string (feature_name)); + + return tmp; +} + +static GstCaps * +gst_cuda_upload_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstCaps *result, *tmp; + + GST_DEBUG_OBJECT (trans, + "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, + (direction == GST_PAD_SINK) ? "sink" : "src"); + + if (direction == GST_PAD_SINK) { + tmp = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY); + tmp = gst_caps_merge (gst_caps_ref (caps), tmp); + } else { + GstCaps *newcaps; + tmp = gst_caps_ref (caps); + + newcaps = _set_caps_features (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); + tmp = gst_caps_merge (tmp, newcaps); + } + + if (filter) { + result = gst_caps_intersect_full (filter, tmp, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (tmp); + } else { + result = tmp; + } + + GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, result); + + return result; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudaupload.h
Added
@@ -0,0 +1,51 @@ +/* GStreamer + * Copyright (C) <2019> Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_CUDA_UPLOAD_H__ +#define __GST_CUDA_UPLOAD_H__ + +#include "gstcudabasetransform.h" + +G_BEGIN_DECLS + +#define GST_TYPE_CUDA_UPLOAD (gst_cuda_upload_get_type()) +#define GST_CUDA_UPLOAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_CUDA_UPLOAD,GstCudaUpload)) +#define GST_CUDA_UPLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_CUDA_UPLOAD,GstCudaUploadClass)) +#define GST_CUDA_UPLOAD_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_CUDA_UPLOAD,GstCudaUploadClass)) +#define GST_IS_CUDA_UPLOAD(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_CUDA_UPLOAD)) +#define GST_IS_CUDA_UPLOAD_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_CUDA_UPLOAD)) + +typedef struct _GstCudaUpload GstCudaUpload; +typedef struct _GstCudaUploadClass GstCudaUploadClass; + +struct _GstCudaUpload +{ + GstCudaBaseTransform parent; +}; + +struct _GstCudaUploadClass +{ + GstCudaBaseTransformClass parent_class; +}; + +GType gst_cuda_upload_get_type (void); + +G_END_DECLS + +#endif /* __GST_CUDA_UPLOAD_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstcudautils.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcudautils.h
Changed
@@ -99,51 +99,40 @@ gboolean mapped; } GstCudaGraphicsResource; -G_GNUC_INTERNAL gboolean gst_cuda_ensure_element_context (GstElement * element, gint device_id, GstCudaContext ** cuda_ctx); -G_GNUC_INTERNAL gboolean gst_cuda_handle_set_context (GstElement * element, GstContext * context, gint device_id, GstCudaContext ** cuda_ctx); -G_GNUC_INTERNAL gboolean gst_cuda_handle_context_query (GstElement * element, GstQuery * query, GstCudaContext * cuda_ctx); -G_GNUC_INTERNAL GstContext * gst_context_new_cuda_context (GstCudaContext * context); -G_GNUC_INTERNAL GQuark gst_cuda_quark_from_id (GstCudaQuarkId id); -G_GNUC_INTERNAL GstCudaGraphicsResource * gst_cuda_graphics_resource_new (GstCudaContext * context, GstObject * graphics_context, GstCudaGraphicsResourceType type); -G_GNUC_INTERNAL gboolean gst_cuda_graphics_resource_register_gl_buffer (GstCudaGraphicsResource * resource, guint buffer, CUgraphicsRegisterFlags flags); -G_GNUC_INTERNAL void gst_cuda_graphics_resource_unregister (GstCudaGraphicsResource * resource); -G_GNUC_INTERNAL CUgraphicsResource gst_cuda_graphics_resource_map (GstCudaGraphicsResource * resource, CUstream stream, CUgraphicsMapResourceFlags flags); -G_GNUC_INTERNAL void gst_cuda_graphics_resource_unmap (GstCudaGraphicsResource * resource, CUstream stream); -G_GNUC_INTERNAL void gst_cuda_graphics_resource_free (GstCudaGraphicsResource * resource); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstcuvidloader.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstcuvidloader.h
Changed
@@ -27,65 +27,50 @@ G_BEGIN_DECLS /* cuvid.h */ -G_GNUC_INTERNAL gboolean gst_cuvid_load_library (guint api_major_ver, guint api_minor_ver); -G_GNUC_INTERNAL gboolean gst_cuvid_get_api_version (guint * api_major_ver, guint * api_minor_ver); -G_GNUC_INTERNAL gboolean gst_cuvid_can_get_decoder_caps (void); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidCtxLockCreate (CUvideoctxlock * pLock, CUcontext ctx); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidCtxLockDestroy (CUvideoctxlock lck); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidCtxLock (CUvideoctxlock lck, unsigned int reserved_flags); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidCtxUnlock (CUvideoctxlock lck, unsigned int reserved_flags); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidCreateDecoder (CUvideodecoder * phDecoder, CUVIDDECODECREATEINFO * pdci); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidDestroyDecoder (CUvideodecoder hDecoder); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidDecodePicture (CUvideodecoder hDecoder, CUVIDPICPARAMS * pPicParams); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidCreateVideoParser (CUvideoparser * pObj, CUVIDPARSERPARAMS * pParams); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidParseVideoData (CUvideoparser obj, CUVIDSOURCEDATAPACKET * pPacket); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidDestroyVideoParser (CUvideoparser obj); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidMapVideoFrame (CUvideodecoder hDecoder, int nPicIdx, guintptr * pDevPtr, unsigned int *pPitch, CUVIDPROCPARAMS * pVPP); -G_GNUC_INTERNAL CUresult CUDAAPI CuvidUnmapVideoFrame (CUvideodecoder hDecoder, guintptr DevPtr); -G_GNUC_INTERNAL + CUresult CUDAAPI CuvidGetDecoderCaps (CUVIDDECODECAPS * pdc); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvbaseenc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvbaseenc.c
Changed
@@ -23,6 +23,7 @@ #include "gstnvbaseenc.h" #include "gstcudautils.h" +#include "gstcudabufferpool.h" #include <gst/pbutils/codec-utils.h> @@ -249,6 +250,8 @@ static gboolean gst_nv_base_enc_stop_bitstream_thread (GstNvBaseEnc * nvenc, gboolean force); static gboolean gst_nv_base_enc_drain_encoder (GstNvBaseEnc * nvenc); +static gboolean gst_nv_base_enc_propose_allocation (GstVideoEncoder * enc, + GstQuery * query); static void gst_nv_base_enc_class_init (GstNvBaseEncClass * klass) @@ -276,6 +279,8 @@ videoenc_class->finish = GST_DEBUG_FUNCPTR (gst_nv_base_enc_finish); videoenc_class->sink_query = GST_DEBUG_FUNCPTR (gst_nv_base_enc_sink_query); videoenc_class->sink_event = GST_DEBUG_FUNCPTR (gst_nv_base_enc_sink_event); + videoenc_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_nv_base_enc_propose_allocation); g_object_class_install_property (gobject_class, PROP_DEVICE_ID, g_param_spec_uint ("cuda-device-id", @@ -564,6 +569,129 @@ return GST_VIDEO_ENCODER_CLASS (parent_class)->sink_query (enc, query); } +#ifdef HAVE_NVCODEC_GST_GL +static gboolean +gst_nv_base_enc_ensure_gl_context (GstNvBaseEnc * nvenc) +{ + if (!nvenc->display) { + GST_DEBUG_OBJECT (nvenc, "No available OpenGL display"); + return FALSE; + } + + if (!gst_gl_query_local_gl_context (GST_ELEMENT (nvenc), GST_PAD_SINK, + (GstGLContext **) & nvenc->gl_context)) { + GST_INFO_OBJECT (nvenc, "failed to query local OpenGL context"); + if (nvenc->gl_context) + gst_object_unref (nvenc->gl_context); + nvenc->gl_context = + (GstObject *) gst_gl_display_get_gl_context_for_thread ((GstGLDisplay *) + nvenc->display, NULL); + if (!nvenc->gl_context + || !gst_gl_display_add_context ((GstGLDisplay *) nvenc->display, + (GstGLContext *) nvenc->gl_context)) { + if (nvenc->gl_context) + gst_object_unref (nvenc->gl_context); + if (!gst_gl_display_create_context ((GstGLDisplay *) nvenc->display, + (GstGLContext *) nvenc->other_context, + (GstGLContext **) & nvenc->gl_context, NULL)) { + GST_ERROR_OBJECT (nvenc, 
"failed to create OpenGL context"); + return FALSE; + } + if (!gst_gl_display_add_context ((GstGLDisplay *) nvenc->display, + (GstGLContext *) nvenc->gl_context)) { + GST_ERROR_OBJECT (nvenc, + "failed to add the OpenGL context to the display"); + return FALSE; + } + } + } + + if (!gst_gl_context_check_gl_version ((GstGLContext *) nvenc->gl_context, + SUPPORTED_GL_APIS, 3, 0)) { + GST_WARNING_OBJECT (nvenc, "OpenGL context could not support PBO download"); + return FALSE; + } + + return TRUE; +} +#endif + +static gboolean +gst_nv_base_enc_propose_allocation (GstVideoEncoder * enc, GstQuery * query) +{ + GstNvBaseEnc *nvenc = GST_NV_BASE_ENC (enc); + GstCaps *caps; + GstVideoInfo info; + GstBufferPool *pool; + GstStructure *config; + GstCapsFeatures *features; + + GST_DEBUG_OBJECT (nvenc, "propose allocation"); + + gst_query_parse_allocation (query, &caps, NULL); + + if (caps == NULL) + return FALSE; + + if (!gst_video_info_from_caps (&info, caps)) { + GST_WARNING_OBJECT (nvenc, "failed to get video info"); + return FALSE; + } + + features = gst_caps_get_features (caps, 0); +#if HAVE_NVCODEC_GST_GL + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { + GST_DEBUG_OBJECT (nvenc, "upsteram support GL memory"); + if (!gst_nv_base_enc_ensure_gl_context (nvenc)) { + GST_WARNING_OBJECT (nvenc, "Could not get gl context"); + goto done; + } + + pool = gst_gl_buffer_pool_new ((GstGLContext *) nvenc->gl_context); + } else +#endif + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)) { + GST_DEBUG_OBJECT (nvenc, "upstream support CUDA memory"); + pool = gst_cuda_buffer_pool_new (nvenc->cuda_ctx); + } else { + GST_DEBUG_OBJECT (nvenc, "use system memory"); + goto done; + } + + if (G_UNLIKELY (pool == NULL)) { + GST_WARNING_OBJECT (nvenc, "cannot create buffer pool"); + goto done; + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, caps, 
GST_VIDEO_INFO_SIZE (&info), + nvenc->items->len, 0); + + gst_query_add_allocation_pool (query, pool, GST_VIDEO_INFO_SIZE (&info), + nvenc->items->len, 0); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + + if (!gst_buffer_pool_set_config (pool, config)) + goto error_pool_config; + + gst_object_unref (pool); + +done: + return GST_VIDEO_ENCODER_CLASS (parent_class)->propose_allocation (enc, + query); + +error_pool_config: + { + if (pool) + gst_object_unref (pool); + GST_WARNING_OBJECT (nvenc, "failed to set config"); + return FALSE; + } +} + static gboolean gst_nv_base_enc_sink_event (GstVideoEncoder * enc, GstEvent * event) { @@ -649,6 +777,10 @@ gst_object_unref (nvenc->other_context); nvenc->other_context = NULL; } + if (nvenc->gl_context) { + gst_object_unref (nvenc->gl_context); + nvenc->gl_context = NULL; + } if (nvenc->items) { g_array_free (nvenc->items, TRUE); @@ -1717,7 +1849,6 @@ if (!reconfigure) { nvenc->input_info = *info; - nvenc->gl_input = FALSE; } if (nvenc->input_state) @@ -1727,9 +1858,7 @@ /* now allocate some buffers only on first configuration */ if (!reconfigure) { -#if HAVE_NVCODEC_GST_GL GstCapsFeatures *features; -#endif guint i; guint input_width, input_height; guint n_bufs; @@ -1744,11 +1873,17 @@ /* input buffers */ g_array_set_size (nvenc->items, n_bufs); -#if HAVE_NVCODEC_GST_GL + nvenc->mem_type = GST_NVENC_MEM_TYPE_SYSTEM; + features = gst_caps_get_features (state->caps, 0); if (gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)) { + nvenc->mem_type = GST_NVENC_MEM_TYPE_CUDA; + } +#if HAVE_NVCODEC_GST_GL + else if (gst_caps_features_contains (features, GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { - nvenc->gl_input = TRUE; + nvenc->mem_type = GST_NVENC_MEM_TYPE_GL; } #endif @@ -1773,7 +1908,7 @@ } resource->nv_resource.version = - gst_nvenc_get_registure_resource_version (); + 
gst_nvenc_get_register_resource_version (); resource->nv_resource.resourceType = NV_ENC_INPUT_RESOURCE_TYPE_CUDADEVICEPTR; resource->nv_resource.width = input_width; @@ -1914,7 +2049,7 @@ gl_buf_obj = gl_mem->pbo; GST_LOG_OBJECT (nvenc, - "registure glbuffer %d to CUDA resource", gl_buf_obj->id); + "register glbuffer %d to CUDA resource", gl_buf_obj->id); if (gst_cuda_graphics_resource_register_gl_buffer (resource, gl_buf_obj->id, CU_GRAPHICS_REGISTER_FLAGS_NONE)) { @@ -2090,26 +2225,37 @@ static gboolean gst_nv_base_enc_upload_frame (GstNvBaseEnc * nvenc, GstVideoFrame * frame, - GstNvEncInputResource * resource) + GstNvEncInputResource * resource, gboolean use_device_memory) { gint i; CUdeviceptr dst = resource->cuda_pointer; GstVideoInfo *info = &frame->info; CUresult cuda_ret; + GstCudaMemory *cuda_mem = NULL; if (!gst_cuda_context_push (nvenc->cuda_ctx)) { GST_ERROR_OBJECT (nvenc, "cannot push context"); return FALSE; } + if (use_device_memory) { + cuda_mem = (GstCudaMemory *) gst_buffer_peek_memory (frame->buffer, 0); + } + for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (frame); i++) { CUDA_MEMCPY2D param = { 0, }; guint dest_stride = _get_cuda_device_stride (&nvenc->input_info, i, resource->cuda_stride); - param.srcMemoryType = CU_MEMORYTYPE_HOST; - param.srcHost = GST_VIDEO_FRAME_PLANE_DATA (frame, i); - param.srcPitch = GST_VIDEO_FRAME_PLANE_STRIDE (frame, i); + if (use_device_memory) { + param.srcMemoryType = CU_MEMORYTYPE_DEVICE; + param.srcDevice = cuda_mem->data + cuda_mem->offset[i]; + param.srcPitch = cuda_mem->stride; + } else { + param.srcMemoryType = CU_MEMORYTYPE_HOST; + param.srcHost = GST_VIDEO_FRAME_PLANE_DATA (frame, i); + param.srcPitch = GST_VIDEO_FRAME_PLANE_STRIDE (frame, i); + } param.dstMemoryType = CU_MEMORYTYPE_DEVICE; param.dstDevice = dst; @@ -2268,6 +2414,7 @@ GstMapFlags in_map_flags = GST_MAP_READ; GstNvEncFrameState *state = NULL; GstNvEncInputResource *resource = NULL; + gboolean use_device_memory = FALSE; g_assert (nvenc->encoder != 
NULL); @@ -2292,10 +2439,29 @@ GST_VIDEO_CODEC_FRAME_SET_FORCE_KEYFRAME (frame); } #if HAVE_NVCODEC_GST_GL - if (nvenc->gl_input) + if (nvenc->mem_type == GST_NVENC_MEM_TYPE_GL) in_map_flags |= GST_MAP_GL; #endif + if (nvenc->mem_type == GST_NVENC_MEM_TYPE_CUDA) { + GstMemory *mem; + + if ((mem = gst_buffer_peek_memory (frame->input_buffer, 0)) && + gst_is_cuda_memory (mem)) { + GstCudaMemory *cmem = GST_CUDA_MEMORY_CAST (mem); + + if (cmem->context == nvenc->cuda_ctx || + gst_cuda_context_get_handle (cmem->context) == + gst_cuda_context_get_handle (nvenc->cuda_ctx) || + (gst_cuda_context_can_access_peer (cmem->context, nvenc->cuda_ctx) && + gst_cuda_context_can_access_peer (nvenc->cuda_ctx, + cmem->context))) { + use_device_memory = TRUE; + in_map_flags |= GST_MAP_CUDA; + } + } + } + if (!gst_video_frame_map (&vframe, info, frame->input_buffer, in_map_flags)) { goto drop; } @@ -2315,7 +2481,7 @@ resource = state->in_buf; #if HAVE_NVCODEC_GST_GL - if (nvenc->gl_input) { + if (nvenc->mem_type == GST_NVENC_MEM_TYPE_GL) { GstGLMemory *gl_mem; GstNvEncGLMapData data; @@ -2335,7 +2501,8 @@ } } else #endif - if (!gst_nv_base_enc_upload_frame (nvenc, &vframe, resource)) { + if (!gst_nv_base_enc_upload_frame (nvenc, + &vframe, resource, use_device_memory)) { flow = GST_FLOW_ERROR; goto unmap_and_drop; }
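The hunk above adds a compatibility check before the encoder maps an input buffer as CUDA device memory: the buffer's CUDA context must be the encoder's own context, share the same underlying handle, or have mutual peer access with it. As a rough illustration only (the `Ctx` class and its `handle`/`peers` fields are hypothetical stand-ins for `GstCudaContext`, not real GStreamer API), the predicate can be sketched like this:

```python
from dataclasses import dataclass, field


@dataclass
class Ctx:
    """Hypothetical stand-in for a GstCudaContext."""
    handle: int                      # raw CUcontext handle
    peers: set = field(default_factory=set)  # handles this device can peer-access

    def can_access_peer(self, other):
        return other.handle in self.peers


def cuda_memory_is_compatible(mem_ctx, enc_ctx):
    # Same context object, same underlying CUcontext handle, or mutual
    # peer access between the two contexts -- mirroring the condition
    # added in gst_nv_base_enc_handle_frame before setting
    # use_device_memory = TRUE.
    return (mem_ctx is enc_ctx
            or mem_ctx.handle == enc_ctx.handle
            or (mem_ctx.can_access_peer(enc_ctx)
                and enc_ctx.can_access_peer(mem_ctx)))
```

When the check fails, the code falls back to the pre-existing host-memory path (`CU_MEMORYTYPE_HOST` in `gst_nv_base_enc_upload_frame`) rather than erroring out.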
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvbaseenc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvbaseenc.h
Changed
@@ -61,6 +61,14 @@ GST_NV_RC_MODE_VBR_HQ, } GstNvRCMode; +typedef enum +{ + GST_NVENC_MEM_TYPE_SYSTEM = 0, + GST_NVENC_MEM_TYPE_GL, + GST_NVENC_MEM_TYPE_CUDA, + /* FIXME: add support D3D11 memory */ +} GstNvEncMemType; + typedef struct { gboolean weighted_prediction; gint rc_modes; @@ -112,7 +120,7 @@ GstVideoCodecState *input_state; gint reconfig; /* ATOMIC */ - gboolean gl_input; + GstNvEncMemType mem_type; /* array of allocated input/output buffers (GstNvEncFrameState), * and hold the ownership of the GstNvEncFrameState. */ @@ -137,6 +145,7 @@ GstObject *display; /* GstGLDisplay */ GstObject *other_context; /* GstGLContext */ + GstObject *gl_context; /* GstGLContext */ GstVideoInfo input_info; /* buffer configuration for buffers sent to NVENC */ @@ -173,7 +182,6 @@ NV_ENC_CONFIG * config); } GstNvBaseEncClass; -G_GNUC_INTERNAL GType gst_nv_base_enc_get_type (void); GType gst_nv_base_enc_register (const char * codec,
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvdec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvdec.c
Changed
@@ -31,12 +31,21 @@ #include "gstnvdec.h" #include "gstcudautils.h" +#include "gstcudabufferpool.h" #include <string.h> GST_DEBUG_CATEGORY_EXTERN (gst_nvdec_debug); #define GST_CAT_DEFAULT gst_nvdec_debug +#define DEFAULT_MAX_DISPLAY_DELAY -1 + +enum +{ + PROP_0, + PROP_MAX_DISPLAY_DELAY, +}; + #ifdef HAVE_NVCODEC_GST_GL #define SUPPORTED_GL_APIS (GST_GL_API_OPENGL | GST_GL_API_OPENGL3 | GST_GL_API_GLES2) @@ -46,7 +55,7 @@ #endif static gboolean -gst_nvdec_copy_device_to_system (GstNvDec * nvdec, +gst_nvdec_copy_device_to_memory (GstNvDec * nvdec, CUVIDPARSERDISPINFO * dispinfo, GstBuffer * output_buffer); #ifdef HAVE_NVCODEC_GST_GL @@ -166,11 +175,47 @@ G_DEFINE_ABSTRACT_TYPE (GstNvDec, gst_nvdec, GST_TYPE_VIDEO_DECODER); static void +gst_nv_dec_set_property (GObject * object, guint prop_id, const GValue * value, + GParamSpec * pspec) +{ + GstNvDec *nvdec = GST_NVDEC (object); + + switch (prop_id) { + case PROP_MAX_DISPLAY_DELAY: + nvdec->max_display_delay = g_value_get_int (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_nv_dec_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstNvDec *nvdec = GST_NVDEC (object); + + switch (prop_id) { + case PROP_MAX_DISPLAY_DELAY: + g_value_set_int (value, nvdec->max_display_delay); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void gst_nvdec_class_init (GstNvDecClass * klass) { + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstVideoDecoderClass *video_decoder_class = GST_VIDEO_DECODER_CLASS (klass); GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + gobject_class->set_property = gst_nv_dec_set_property; + gobject_class->get_property = gst_nv_dec_get_property; + video_decoder_class->open = GST_DEBUG_FUNCPTR (gst_nvdec_open); video_decoder_class->start = GST_DEBUG_FUNCPTR (gst_nvdec_start); video_decoder_class->stop = 
GST_DEBUG_FUNCPTR (gst_nvdec_stop); @@ -188,11 +233,24 @@ element_class->set_context = GST_DEBUG_FUNCPTR (gst_nvdec_set_context); gst_type_mark_as_plugin_api (GST_TYPE_NVDEC, 0); + + /** + * GstNvDec:max-display-delay: + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_MAX_DISPLAY_DELAY, + g_param_spec_int ("max-display-delay", "Max Display Delay", + "Improves pipelining of decode with display, 0 means no delay " + "(auto = -1)", + -1, G_MAXINT, DEFAULT_MAX_DISPLAY_DELAY, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void gst_nvdec_init (GstNvDec * nvdec) { + nvdec->max_display_delay = DEFAULT_MAX_DISPLAY_DELAY; gst_video_decoder_set_packetized (GST_VIDEO_DECODER (nvdec), TRUE); gst_video_decoder_set_needs_format (GST_VIDEO_DECODER (nvdec), TRUE); } @@ -260,6 +318,33 @@ return 8; } +static guint +gst_nvdec_get_max_display_delay (GstNvDec * nvdec) +{ + return nvdec->max_display_delay >= 0 ? nvdec->max_display_delay : + (nvdec->is_live ? 0 : 4); +} + +static gint64 +gst_nvdec_get_latency (GstNvDec * nvdec) +{ + gint fps_n, fps_d; + + if (!nvdec->input_state) + return 0; + fps_n = GST_VIDEO_INFO_FPS_N (&nvdec->input_state->info); + fps_d = GST_VIDEO_INFO_FPS_D (&nvdec->input_state->info); + + /* We assume 25 fps if the input framerate is invalid */ + if (fps_n < 1 || fps_d < 1) { + fps_n = 25; + fps_d = 1; + } + + return gst_util_uint64_scale_int ((nvdec->num_decode_surface + + gst_nvdec_get_max_display_delay (nvdec)) * GST_SECOND, fps_d, fps_n); +} + /* 0: fail, 1: succeeded, > 1: override dpb size of parser * (set by CUVIDPARSERPARAMS::ulMaxNumDecodeSurfaces while creating parser) */ static gint CUDAAPI @@ -274,9 +359,10 @@ GstCudaContext *ctx = nvdec->cuda_ctx; GstStructure *in_s = NULL; gboolean updata = FALSE; - gint num_decode_surface = 0; guint major_api_ver = 0; + guint64 curr_latency, old_latency; + old_latency = gst_nvdec_get_latency (nvdec); width = format->display_area.right - format->display_area.left; height = 
format->display_area.bottom - format->display_area.top; @@ -405,17 +491,24 @@ if (gst_cuvid_get_api_version (&major_api_ver, NULL) && major_api_ver >= 9) { /* min_num_decode_surfaces was introduced in nvcodec sdk 9.0 header */ - num_decode_surface = format->min_num_decode_surfaces; + nvdec->num_decode_surface = format->min_num_decode_surfaces; - GST_DEBUG_OBJECT (nvdec, "Num decode surface: %d", num_decode_surface); + GST_DEBUG_OBJECT (nvdec, + "Num decode surface: %d", nvdec->num_decode_surface); } else { - num_decode_surface = + nvdec->num_decode_surface = calculate_num_decode_surface (format->codec, width, height); GST_DEBUG_OBJECT (nvdec, - "Calculated num decode surface: %d", num_decode_surface); + "Calculated num decode surface: %d", nvdec->num_decode_surface); } + /* Update the latency if it has changed */ + curr_latency = gst_nvdec_get_latency (nvdec); + if (old_latency != curr_latency) + gst_video_decoder_set_latency (GST_VIDEO_DECODER (nvdec), curr_latency, + curr_latency); + if (!nvdec->decoder || !gst_video_info_is_equal (out_info, &prev_out_info)) { updata = TRUE; @@ -436,7 +529,7 @@ GST_DEBUG_OBJECT (nvdec, "creating decoder"); create_info.ulWidth = width; create_info.ulHeight = height; - create_info.ulNumDecodeSurfaces = num_decode_surface; + create_info.ulNumDecodeSurfaces = nvdec->num_decode_surface; create_info.CodecType = format->codec; create_info.ChromaFormat = format->chroma_format; create_info.ulCreationFlags = cudaVideoCreate_Default; @@ -475,7 +568,7 @@ } } - return num_decode_surface; + return nvdec->num_decode_surface; error: nvdec->last_ret = GST_FLOW_ERROR; @@ -506,7 +599,6 @@ state->caps = gst_video_info_to_caps (&state->info); nvdec->mem_type = GST_NVDEC_MEM_TYPE_SYSTEM; -#ifdef HAVE_NVCODEC_GST_GL { GstCaps *caps; caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (nvdec)); @@ -515,40 +607,67 @@ if (!caps || gst_caps_is_any (caps)) { GST_DEBUG_OBJECT (nvdec, "cannot determine output format, use system memory"); - } else if 
(nvdec->gl_display) { + } else { GstCapsFeatures *features; guint size = gst_caps_get_size (caps); guint i; + gboolean have_cuda = FALSE; + gboolean have_gl = FALSE; for (i = 0; i < size; i++) { features = gst_caps_get_features (caps, i); if (features && gst_caps_features_contains (features, - GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { - GST_DEBUG_OBJECT (nvdec, "found GL memory feature, use gl"); - nvdec->mem_type = GST_NVDEC_MEM_TYPE_GL; + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)) { + GST_DEBUG_OBJECT (nvdec, "found CUDA memory feature"); + have_cuda = TRUE; break; } +#ifdef HAVE_NVCODEC_GST_GL + if (nvdec->gl_display && + features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { + GST_DEBUG_OBJECT (nvdec, "found GL memory feature"); + have_gl = TRUE; + } +#endif } + + if (have_cuda) + nvdec->mem_type = GST_NVDEC_MEM_TYPE_CUDA; + else if (have_gl) + nvdec->mem_type = GST_NVDEC_MEM_TYPE_GL; } gst_clear_caps (&caps); } +#ifdef HAVE_NVCODEC_GST_GL if (nvdec->mem_type == GST_NVDEC_MEM_TYPE_GL && !gst_nvdec_ensure_gl_context (nvdec)) { GST_WARNING_OBJECT (nvdec, "OpenGL context is not CUDA-compatible, fallback to system memory"); nvdec->mem_type = GST_NVDEC_MEM_TYPE_SYSTEM; } +#endif - if (nvdec->mem_type == GST_NVDEC_MEM_TYPE_GL) { - gst_caps_set_features (state->caps, 0, - gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_GL_MEMORY, NULL)); - gst_caps_set_simple (state->caps, "texture-target", G_TYPE_STRING, - "2D", NULL); - } else { - GST_DEBUG_OBJECT (nvdec, "use system memory"); - } + switch (nvdec->mem_type) { + case GST_NVDEC_MEM_TYPE_CUDA: + GST_DEBUG_OBJECT (nvdec, "use cuda memory"); + gst_caps_set_features (state->caps, 0, + gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY, NULL)); + break; +#ifdef HAVE_NVCODEC_GST_GL + case GST_NVDEC_MEM_TYPE_GL: + GST_DEBUG_OBJECT (nvdec, "use gl memory"); + gst_caps_set_features (state->caps, 0, + gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_GL_MEMORY, NULL)); + gst_caps_set_simple 
(state->caps, "texture-target", G_TYPE_STRING, + "2D", NULL); + break; #endif + default: + GST_DEBUG_OBJECT (nvdec, "use system memory"); + break; + } if (nvdec->output_state) gst_video_codec_state_unref (nvdec->output_state); @@ -711,7 +830,7 @@ if (!copy_ret) #endif { - copy_ret = gst_nvdec_copy_device_to_system (nvdec, dispinfo, output_buffer); + copy_ret = gst_nvdec_copy_device_to_memory (nvdec, dispinfo, output_buffer); } if (!copy_ret) { @@ -904,6 +1023,7 @@ GstNvDec *nvdec = GST_NVDEC (decoder); GstNvDecClass *klass = GST_NVDEC_GET_CLASS (decoder); CUVIDPARSERPARAMS parser_params = { 0, }; + GstQuery *query; gboolean ret = TRUE; GST_DEBUG_OBJECT (nvdec, "set format"); @@ -916,12 +1036,19 @@ if (!maybe_destroy_decoder_and_parser (nvdec)) return FALSE; + /* Check if pipeline is live */ + nvdec->is_live = FALSE; + query = gst_query_new_latency (); + if (gst_pad_peer_query (GST_VIDEO_DECODER_SINK_PAD (decoder), query)) + gst_query_parse_latency (query, &nvdec->is_live, NULL, NULL); + gst_query_unref (query); + parser_params.CodecType = klass->codec_type; /* ulMaxNumDecodeSurfaces will be updated by the return value of * SequenceCallback */ parser_params.ulMaxNumDecodeSurfaces = 1; parser_params.ulErrorThreshold = 100; - parser_params.ulMaxDisplayDelay = 0; + parser_params.ulMaxDisplayDelay = gst_nvdec_get_max_display_delay (nvdec); parser_params.ulClockRate = GST_SECOND; parser_params.pUserData = nvdec; parser_params.pfnSequenceCallback = @@ -953,7 +1080,7 @@ gst_buffer_replace (&nvdec->codec_data, codec_data); } - /* For all CODEC we get completre picture ... */ + /* For all CODEC we get complete picture ... 
*/ nvdec->recv_complete_picture = TRUE; /* Except for JPEG, for which it depends on the caps */ @@ -1103,7 +1230,7 @@ #endif static gboolean -gst_nvdec_copy_device_to_system (GstNvDec * nvdec, +gst_nvdec_copy_device_to_memory (GstNvDec * nvdec, CUVIDPARSERDISPINFO * dispinfo, GstBuffer * output_buffer) { CUVIDPROCPARAMS params = { 0, }; @@ -1113,16 +1240,35 @@ GstVideoFrame video_frame; GstVideoInfo *info = &nvdec->output_state->info; gint i; + GstMemory *mem; + GstCudaMemory *cuda_mem = NULL; if (!gst_cuda_context_push (nvdec->cuda_ctx)) { GST_WARNING_OBJECT (nvdec, "failed to lock CUDA context"); return FALSE; } - if (!gst_video_frame_map (&video_frame, info, output_buffer, GST_MAP_WRITE)) { - GST_ERROR_OBJECT (nvdec, "frame map failure"); - gst_cuda_context_pop (NULL); - return FALSE; + if (nvdec->mem_type == GST_NVDEC_MEM_TYPE_CUDA && + (mem = gst_buffer_peek_memory (output_buffer, 0)) && + gst_is_cuda_memory (mem)) { + GstCudaMemory *cmem = GST_CUDA_MEMORY_CAST (mem); + + if (cmem->context == nvdec->cuda_ctx || + gst_cuda_context_get_handle (cmem->context) == + gst_cuda_context_get_handle (nvdec->cuda_ctx) || + (gst_cuda_context_can_access_peer (cmem->context, nvdec->cuda_ctx) && + gst_cuda_context_can_access_peer (nvdec->cuda_ctx, + cmem->context))) { + cuda_mem = cmem; + } + } + + if (!cuda_mem) { + if (!gst_video_frame_map (&video_frame, info, output_buffer, GST_MAP_WRITE)) { + GST_ERROR_OBJECT (nvdec, "frame map failure"); + gst_cuda_context_pop (NULL); + return FALSE; + } } params.progressive_frame = dispinfo->progressive_frame; @@ -1139,20 +1285,27 @@ copy_params.srcMemoryType = CU_MEMORYTYPE_DEVICE; copy_params.srcPitch = pitch; - copy_params.dstMemoryType = CU_MEMORYTYPE_HOST; - copy_params.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (info, 0) - * GST_VIDEO_INFO_COMP_PSTRIDE (info, 0); + copy_params.dstMemoryType = + cuda_mem ? 
CU_MEMORYTYPE_DEVICE : CU_MEMORYTYPE_HOST; - for (i = 0; i < GST_VIDEO_FRAME_N_PLANES (&video_frame); i++) { + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { copy_params.srcDevice = dptr + (i * pitch * GST_VIDEO_INFO_HEIGHT (info)); - copy_params.dstHost = GST_VIDEO_FRAME_PLANE_DATA (&video_frame, i); - copy_params.dstPitch = GST_VIDEO_FRAME_PLANE_STRIDE (&video_frame, i); - copy_params.Height = GST_VIDEO_FRAME_COMP_HEIGHT (&video_frame, i); + if (cuda_mem) { + copy_params.dstDevice = cuda_mem->data + cuda_mem->offset[i]; + copy_params.dstPitch = cuda_mem->stride; + } else { + copy_params.dstHost = GST_VIDEO_FRAME_PLANE_DATA (&video_frame, i); + copy_params.dstPitch = GST_VIDEO_FRAME_PLANE_STRIDE (&video_frame, i); + } + copy_params.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (info, i) + * GST_VIDEO_INFO_COMP_PSTRIDE (info, i); + copy_params.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); if (!gst_cuda_result (CuMemcpy2DAsync (©_params, nvdec->cuda_stream))) { GST_ERROR_OBJECT (nvdec, "failed to copy %dth plane", i); CuvidUnmapVideoFrame (nvdec->decoder, dptr); - gst_video_frame_unmap (&video_frame); + if (!cuda_mem) + gst_video_frame_unmap (&video_frame); gst_cuda_context_pop (NULL); return FALSE; } @@ -1160,7 +1313,8 @@ gst_cuda_result (CuStreamSynchronize (nvdec->cuda_stream)); - gst_video_frame_unmap (&video_frame); + if (!cuda_mem) + gst_video_frame_unmap (&video_frame); if (!gst_cuda_result (CuvidUnmapVideoFrame (nvdec->decoder, dptr))) GST_WARNING_OBJECT (nvdec, "failed to unmap video frame"); @@ -1346,13 +1500,9 @@ return TRUE; } -#endif - static gboolean -gst_nvdec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query) +gst_nvdec_ensure_gl_pool (GstNvDec * nvdec, GstQuery * query) { -#ifdef HAVE_NVCODEC_GST_GL - GstNvDec *nvdec = GST_NVDEC (decoder); GstCaps *outcaps; GstBufferPool *pool = NULL; guint n, size, min, max; @@ -1361,10 +1511,6 @@ GST_DEBUG_OBJECT (nvdec, "decide allocation"); - if (nvdec->mem_type == GST_NVDEC_MEM_TYPE_SYSTEM) - 
return GST_VIDEO_DECODER_CLASS (gst_nvdec_parent_class)->decide_allocation - (decoder, query); - gst_query_parse_allocation (query, &outcaps, NULL); n = gst_query_get_n_allocation_pools (query); if (n > 0) @@ -1376,6 +1522,7 @@ } if (!pool) { + GST_DEBUG_OBJECT (nvdec, "no downstream pool, create our pool"); pool = gst_gl_buffer_pool_new (nvdec->gl_context); if (outcaps) @@ -1393,8 +1540,74 @@ else gst_query_add_allocation_pool (query, pool, size, min, max); gst_object_unref (pool); + + return TRUE; +} +#endif + +static gboolean +gst_nvdec_ensure_cuda_pool (GstNvDec * nvdec, GstQuery * query) +{ + GstCaps *outcaps; + GstBufferPool *pool = NULL; + guint n, size, min, max; + GstVideoInfo vinfo = { 0, }; + GstStructure *config; + + gst_query_parse_allocation (query, &outcaps, NULL); + n = gst_query_get_n_allocation_pools (query); + if (n > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (pool && !GST_IS_CUDA_BUFFER_POOL (pool)) { + gst_object_unref (pool); + pool = NULL; + } + } + + if (!pool) { + GST_DEBUG_OBJECT (nvdec, "no downstream pool, create our pool"); + pool = gst_cuda_buffer_pool_new (nvdec->cuda_ctx); + + if (outcaps) + gst_video_info_from_caps (&vinfo, outcaps); + size = (guint) vinfo.size; + min = max = 0; + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_buffer_pool_set_config (pool, config); + if (n > 0) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + gst_object_unref (pool); + + return TRUE; +} + +static gboolean +gst_nvdec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query) +{ + GstNvDec *nvdec = GST_NVDEC (decoder); + + GST_DEBUG_OBJECT (nvdec, "decide allocation"); + + if (nvdec->mem_type == GST_NVDEC_MEM_TYPE_SYSTEM) + goto done; + +#ifdef 
HAVE_NVCODEC_GST_GL + if (nvdec->mem_type == GST_NVDEC_MEM_TYPE_GL) { + if (!gst_nvdec_ensure_gl_pool (nvdec, query)) + return FALSE; + } else #endif + if (!gst_nvdec_ensure_cuda_pool (nvdec, query)) { + return FALSE; + } +done: return GST_VIDEO_DECODER_CLASS (gst_nvdec_parent_class)->decide_allocation (decoder, query); }
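The gstnvdec.c changes above introduce a `max-display-delay` property and report the resulting latency via `gst_video_decoder_set_latency()`. The arithmetic is small enough to restate: the effective delay is the property value when set, otherwise 0 for live pipelines and 4 otherwise, and the latency is that many frames plus the decode surfaces, scaled by the frame duration (falling back to 25 fps when the framerate is invalid). A sketch mirroring `gst_nvdec_get_max_display_delay` and `gst_nvdec_get_latency` from the hunk:

```python
GST_SECOND = 1_000_000_000  # nanoseconds, as in GStreamer


def max_display_delay(prop_value, is_live):
    # Explicit property value wins; auto (-1) means 0 for live
    # pipelines, 4 otherwise.
    return prop_value if prop_value >= 0 else (0 if is_live else 4)


def decoder_latency(num_decode_surface, prop_value, is_live, fps_n, fps_d):
    # (decode surfaces + display delay) frame durations; assume 25/1 fps
    # when the input framerate is invalid, as the C code does.
    if fps_n < 1 or fps_d < 1:
        fps_n, fps_d = 25, 1
    frames = num_decode_surface + max_display_delay(prop_value, is_live)
    return frames * GST_SECOND * fps_d // fps_n
```

For example, 8 decode surfaces in a non-live 30 fps pipeline with the property left at its default yields (8 + 4) / 30 s = 400 ms of reported latency.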
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvdec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvdec.h
Changed
@@ -61,7 +61,8 @@ { GST_NVDEC_MEM_TYPE_SYSTEM = 0, GST_NVDEC_MEM_TYPE_GL, - /* FIXME: add support CUDA, D3D11 memory */ + GST_NVDEC_MEM_TYPE_CUDA, + /* FIXME: add support D3D11 memory */ } GstNvDecMemType; struct _GstNvDec @@ -74,6 +75,10 @@ GstGLContext *other_gl_context; #endif + guint num_decode_surface; + gint max_display_delay; + gboolean is_live; + CUvideoparser parser; CUvideodecoder decoder; GstCudaContext *cuda_ctx;
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvdecoder.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvdecoder.c
Changed
@@ -50,7 +50,9 @@ #include <gst/gl/gstglfuncs.h> #endif +#include "gstcudamemory.h" #include "gstnvdecoder.h" +#include "gstcudabufferpool.h" #include <string.h> GST_DEBUG_CATEGORY_EXTERN (gst_nv_decoder_debug); @@ -65,20 +67,39 @@ gboolean available; } GstNvDecoderFrameInfo; +typedef enum +{ + GST_NV_DECODER_OUTPUT_TYPE_SYSTEM = 0, + GST_NV_DECODER_OUTPUT_TYPE_GL, + GST_NV_DECODER_OUTPUT_TYPE_CUDA, + /* FIXME: add support D3D11 memory */ +} GstNvDecoderOutputType; + struct _GstNvDecoder { GstObject parent; GstCudaContext *context; + CUstream cuda_stream; CUvideodecoder decoder_handle; GstNvDecoderFrameInfo *frame_pool; guint pool_size; GstVideoInfo info; + GstVideoInfo coded_info; + + gboolean configured; + + /* For OpenGL interop. */ + GstObject *gl_display; + GstObject *gl_context; + GstObject *other_gl_context; + + GstNvDecoderOutputType output_type; }; static void gst_nv_decoder_dispose (GObject * object); -static void gst_nv_decoder_finalize (GObject * object); +static void gst_nv_decoder_reset (GstNvDecoder * self); #define parent_class gst_nv_decoder_parent_class G_DEFINE_TYPE (GstNvDecoder, gst_nv_decoder, GST_TYPE_OBJECT); @@ -89,7 +110,6 @@ GObjectClass *gobject_class = G_OBJECT_CLASS (klass); gobject_class->dispose = gst_nv_decoder_dispose; - gobject_class->finalize = gst_nv_decoder_finalize; } static void @@ -102,21 +122,22 @@ { GstNvDecoder *self = GST_NV_DECODER (object); - gst_clear_object (&self->context); + gst_nv_decoder_reset (self); - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_nv_decoder_finalize (GObject * object) -{ - GstNvDecoder *self = GST_NV_DECODER (object); + if (self->context && self->cuda_stream) { + if (gst_cuda_context_push (self->context)) { + gst_cuda_result (CuStreamDestroy (self->cuda_stream)); + gst_cuda_context_pop (NULL); + self->cuda_stream = NULL; + } + } - g_free (self->frame_pool); - if (self->decoder_handle) - gst_cuda_result (CuvidDestroyDecoder (self->decoder_handle)); + gst_clear_object 
(&self->context); + gst_clear_object (&self->gl_display); + gst_clear_object (&self->gl_context); + gst_clear_object (&self->other_gl_context); - G_OBJECT_CLASS (parent_class)->finalize (object); + G_OBJECT_CLASS (parent_class)->dispose (object); } static cudaVideoChromaFormat @@ -141,29 +162,6 @@ return cudaVideoChromaFormat_420; } -static guint -bitdepth_minus8_from_video_format (GstVideoFormat format) -{ - switch (format) { - case GST_VIDEO_FORMAT_NV12: - case GST_VIDEO_FORMAT_Y444: - return 0; - case GST_VIDEO_FORMAT_P010_10LE: - case GST_VIDEO_FORMAT_P010_10BE: - return 2; - case GST_VIDEO_FORMAT_P016_LE: - case GST_VIDEO_FORMAT_P016_BE: - case GST_VIDEO_FORMAT_Y444_16LE: - case GST_VIDEO_FORMAT_Y444_16BE: - return 8; - default: - g_assert_not_reached (); - break; - } - - return 0; -} - static cudaVideoSurfaceFormat output_format_from_video_format (GstVideoFormat format) { @@ -204,32 +202,88 @@ } GstNvDecoder * -gst_nv_decoder_new (GstCudaContext * context, cudaVideoCodec codec, - GstVideoInfo * info, guint pool_size) +gst_nv_decoder_new (GstCudaContext * context) +{ + GstNvDecoder *self; + + g_return_val_if_fail (GST_IS_CUDA_CONTEXT (context), NULL); + + self = g_object_new (GST_TYPE_NV_DECODER, NULL); + self->context = gst_object_ref (context); + gst_object_ref_sink (self); + + if (gst_cuda_context_push (context)) { + CUresult cuda_ret; + cuda_ret = CuStreamCreate (&self->cuda_stream, CU_STREAM_DEFAULT); + if (!gst_cuda_result (cuda_ret)) { + GST_WARNING_OBJECT (self, + "Could not create CUDA stream, will use default stream"); + self->cuda_stream = NULL; + } + + gst_cuda_context_pop (NULL); + } + + return self; +} + +gboolean +gst_nv_decoder_is_configured (GstNvDecoder * decoder) +{ + g_return_val_if_fail (GST_IS_NV_DECODER (decoder), FALSE); + + return decoder->configured; +} + +static void +gst_nv_decoder_reset (GstNvDecoder * self) +{ + g_clear_pointer (&self->frame_pool, g_free); + + if (self->decoder_handle) { + gst_cuda_context_push (self->context); + 
CuvidDestroyDecoder (self->decoder_handle); + gst_cuda_context_pop (NULL); + self->decoder_handle = NULL; + } + + self->output_type = GST_NV_DECODER_OUTPUT_TYPE_SYSTEM; + self->configured = FALSE; +} + +gboolean +gst_nv_decoder_configure (GstNvDecoder * decoder, cudaVideoCodec codec, + GstVideoInfo * info, gint coded_width, gint coded_height, + guint coded_bitdepth, guint pool_size) { - GstNvDecoder *decoder; CUVIDDECODECREATEINFO create_info = { 0, }; GstVideoFormat format; + gboolean ret; - g_return_val_if_fail (GST_IS_CUDA_CONTEXT (context), NULL); - g_return_val_if_fail (codec < cudaVideoCodec_NumCodecs, NULL); - g_return_val_if_fail (info != NULL, NULL); - g_return_val_if_fail (pool_size > 0, NULL); + g_return_val_if_fail (GST_IS_NV_DECODER (decoder), FALSE); + g_return_val_if_fail (codec < cudaVideoCodec_NumCodecs, FALSE); + g_return_val_if_fail (info != NULL, FALSE); + g_return_val_if_fail (coded_width >= GST_VIDEO_INFO_WIDTH (info), FALSE); + g_return_val_if_fail (coded_height >= GST_VIDEO_INFO_HEIGHT (info), FALSE); + g_return_val_if_fail (coded_bitdepth >= 8, FALSE); + g_return_val_if_fail (pool_size > 0, FALSE); - decoder = g_object_new (GST_TYPE_NV_DECODER, NULL); - decoder->context = gst_object_ref (context); - gst_object_ref_sink (decoder); + gst_nv_decoder_reset (decoder); + + decoder->info = *info; + gst_video_info_set_format (&decoder->coded_info, GST_VIDEO_INFO_FORMAT (info), + coded_width, coded_height); format = GST_VIDEO_INFO_FORMAT (info); - /* FIXME: check aligned resolution or actaul coded resolution */ - create_info.ulWidth = GST_VIDEO_INFO_WIDTH (info);; - create_info.ulHeight = GST_VIDEO_INFO_HEIGHT (info);; + /* FIXME: check aligned resolution or actual coded resolution */ + create_info.ulWidth = GST_VIDEO_INFO_WIDTH (&decoder->coded_info); + create_info.ulHeight = GST_VIDEO_INFO_HEIGHT (&decoder->coded_info); create_info.ulNumDecodeSurfaces = pool_size; create_info.CodecType = codec; create_info.ChromaFormat = 
chroma_format_from_video_format (format); create_info.ulCreationFlags = cudaVideoCreate_Default; - create_info.bitDepthMinus8 = bitdepth_minus8_from_video_format (format); + create_info.bitDepthMinus8 = coded_bitdepth - 8; create_info.ulIntraDecodeOnly = 0; create_info.display_area.left = 0; @@ -239,7 +293,7 @@ create_info.OutputFormat = output_format_from_video_format (format); create_info.DeinterlaceMode = cudaVideoDeinterlaceMode_Weave; - create_info.ulTargetWidth = GST_VIDEO_INFO_WIDTH (info);; + create_info.ulTargetWidth = GST_VIDEO_INFO_WIDTH (info); create_info.ulTargetHeight = GST_VIDEO_INFO_HEIGHT (info); /* we always copy decoded picture to output buffer */ create_info.ulNumOutputSurfaces = 1; @@ -249,34 +303,29 @@ create_info.target_rect.right = GST_VIDEO_INFO_WIDTH (info); create_info.target_rect.bottom = GST_VIDEO_INFO_HEIGHT (info); - if (!gst_cuda_context_push (context)) { + if (!gst_cuda_context_push (decoder->context)) { GST_ERROR_OBJECT (decoder, "Failed to lock CUDA context"); - goto error; + return FALSE; } - if (!gst_cuda_result (CuvidCreateDecoder (&decoder->decoder_handle, - &create_info))) { - GST_ERROR_OBJECT (decoder, "Cannot create decoder instance"); - goto error; - } + ret = gst_cuda_result (CuvidCreateDecoder (&decoder->decoder_handle, + &create_info)); + gst_cuda_context_pop (NULL); - if (!gst_cuda_context_pop (NULL)) { - GST_ERROR_OBJECT (decoder, "Failed to unlock CUDA context"); - goto error; + if (!ret) { + GST_ERROR_OBJECT (decoder, "Cannot create decoder instance"); + return FALSE; } if (!gst_nv_decoder_prepare_frame_pool (decoder, pool_size)) { GST_ERROR_OBJECT (decoder, "Cannot prepare internal surface buffer pool"); - goto error; + gst_nv_decoder_reset (decoder); + return FALSE; } - decoder->info = *info; - - return decoder; + decoder->configured = TRUE; -error: - gst_clear_object (&decoder); - return NULL; + return TRUE; } GstNvDecoderFrame * @@ -304,6 +353,7 @@ frame = g_new0 (GstNvDecoderFrame, 1); frame->index = 
index_to_use; frame->decoder = gst_object_ref (decoder); + frame->ref_count = 1; GST_LOG_OBJECT (decoder, "New frame %p (index %d)", frame, frame->index); @@ -367,33 +417,45 @@ frame->mapped = FALSE; } +GstNvDecoderFrame * +gst_nv_decoder_frame_ref (GstNvDecoderFrame * frame) +{ + g_assert (frame != NULL); + + g_atomic_int_add (&frame->ref_count, 1); + + return frame; +} + void -gst_nv_decoder_frame_free (GstNvDecoderFrame * frame) +gst_nv_decoder_frame_unref (GstNvDecoderFrame * frame) { GstNvDecoder *self; g_assert (frame != NULL); - GST_LOG ("Free frame %p (index %d)", frame, frame->index); + if (g_atomic_int_dec_and_test (&frame->ref_count)) { + GST_LOG ("Free frame %p (index %d)", frame, frame->index); - if (frame->decoder) { - self = frame->decoder; - if (frame->mapped && gst_cuda_context_push (self->context)) { - gst_nv_decoder_frame_unmap (frame); - gst_cuda_context_pop (NULL); - } + if (frame->decoder) { + self = frame->decoder; + if (frame->mapped && gst_cuda_context_push (self->context)) { + gst_nv_decoder_frame_unmap (frame); + gst_cuda_context_pop (NULL); + } - if (frame->index < self->pool_size) { - self->frame_pool[frame->index].available = TRUE; - } else { - GST_WARNING_OBJECT (self, - "Frame %p has invalid index %d", frame, frame->index); + if (frame->index < self->pool_size) { + self->frame_pool[frame->index].available = TRUE; + } else { + GST_WARNING_OBJECT (self, + "Frame %p has invalid index %d", frame, frame->index); + } + + gst_object_unref (self); } - gst_object_unref (self); + g_free (frame); } - - g_free (frame); } gboolean @@ -578,7 +640,7 @@ * GST_VIDEO_INFO_COMP_PSTRIDE (info, i); copy_params.srcDevice = frame->devptr + - (i * frame->pitch * GST_VIDEO_INFO_HEIGHT (info)); + (i * frame->pitch * GST_VIDEO_INFO_HEIGHT (&self->info)); copy_params.dstDevice = dst_ptr; copy_params.Height = GST_VIDEO_INFO_COMP_HEIGHT (info, i); @@ -634,7 +696,7 @@ } if (!gst_cuda_context_push (decoder->context)) { - GST_ERROR_OBJECT (decoder, "Failed to pust 
CUDA context"); + GST_ERROR_OBJECT (decoder, "Failed to push CUDA context"); gst_video_frame_unmap (&video_frame); return FALSE; } @@ -652,13 +714,13 @@ copy_params.dstPitch = GST_VIDEO_FRAME_PLANE_STRIDE (&video_frame, i); copy_params.Height = GST_VIDEO_FRAME_COMP_HEIGHT (&video_frame, i); - if (!gst_cuda_result (CuMemcpy2DAsync (©_params, NULL))) { + if (!gst_cuda_result (CuMemcpy2DAsync (©_params, decoder->cuda_stream))) { GST_ERROR_OBJECT (decoder, "failed to copy %dth plane", i); goto done; } } - gst_cuda_result (CuStreamSynchronize (NULL)); + gst_cuda_result (CuStreamSynchronize (decoder->cuda_stream)); ret = TRUE; @@ -672,54 +734,153 @@ return ret; } -gboolean -gst_nv_decoder_finish_frame (GstNvDecoder * decoder, - GstNvDecoderOutputType output_type, GstObject * graphics_context, +static gboolean +gst_nv_decoder_copy_frame_to_cuda (GstNvDecoder * decoder, GstNvDecoderFrame * frame, GstBuffer * buffer) { - gboolean ret; - - g_return_val_if_fail (GST_IS_NV_DECODER (decoder), FALSE); - g_return_val_if_fail (frame != NULL, FALSE); - g_return_val_if_fail (GST_IS_BUFFER (buffer), FALSE); + CUDA_MEMCPY2D copy_params = { 0, }; + GstMemory *mem; + GstCudaMemory *cuda_mem = NULL; + gint i; + gboolean ret = FALSE; -#ifdef HAVE_NVCODEC_GST_GL - if (output_type == GST_NV_DECOCER_OUTPUT_TYPE_GL && !graphics_context) { - if (!GST_IS_GL_CONTEXT (graphics_context)) { - GST_ERROR_OBJECT (decoder, "Invalid GL Context"); - return FALSE; + mem = gst_buffer_peek_memory (buffer, 0); + if (!gst_is_cuda_memory (mem)) { + GST_WARNING_OBJECT (decoder, "Not a CUDA memory"); + return FALSE; + } else { + GstCudaMemory *cmem = GST_CUDA_MEMORY_CAST (mem); + + if (cmem->context == decoder->context || + gst_cuda_context_get_handle (cmem->context) == + gst_cuda_context_get_handle (decoder->context) || + (gst_cuda_context_can_access_peer (cmem->context, decoder->context) && + gst_cuda_context_can_access_peer (decoder->context, + cmem->context))) { + cuda_mem = cmem; } } -#endif + + if 
(!cuda_mem) { + GST_WARNING_OBJECT (decoder, "Access to CUDA memory is not allowed"); + return FALSE; + } if (!gst_cuda_context_push (decoder->context)) { - GST_ERROR_OBJECT (decoder, "Failed to pust CUDA context"); + GST_ERROR_OBJECT (decoder, "Failed to push CUDA context"); + return FALSE; + } + + copy_params.srcMemoryType = CU_MEMORYTYPE_DEVICE; + copy_params.srcPitch = frame->pitch; + copy_params.dstMemoryType = CU_MEMORYTYPE_DEVICE; + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&decoder->info); i++) { + copy_params.srcDevice = frame->devptr + + (i * frame->pitch * GST_VIDEO_INFO_HEIGHT (&decoder->info)); + copy_params.dstDevice = cuda_mem->data + cuda_mem->offset[i]; + copy_params.dstPitch = cuda_mem->stride; + copy_params.WidthInBytes = GST_VIDEO_INFO_COMP_WIDTH (&decoder->info, 0) + * GST_VIDEO_INFO_COMP_PSTRIDE (&decoder->info, 0); + copy_params.Height = GST_VIDEO_INFO_COMP_HEIGHT (&decoder->info, i); + + if (!gst_cuda_result (CuMemcpy2DAsync (&copy_params, decoder->cuda_stream))) { + GST_ERROR_OBJECT (decoder, "failed to copy %dth plane", i); + goto done; + } + } + + gst_cuda_result (CuStreamSynchronize (decoder->cuda_stream)); + + ret = TRUE; + +done: + gst_cuda_context_pop (NULL); + + GST_LOG_OBJECT (decoder, "Copy frame to CUDA ret %d", ret); + + return ret; +} + +gboolean +gst_nv_decoder_finish_frame (GstNvDecoder * decoder, GstVideoDecoder * videodec, + GstNvDecoderFrame * frame, GstBuffer ** buffer) +{ + GstBuffer *outbuf = NULL; + gboolean ret = FALSE; + + g_return_val_if_fail (GST_IS_NV_DECODER (decoder), GST_FLOW_ERROR); + g_return_val_if_fail (GST_IS_VIDEO_DECODER (videodec), GST_FLOW_ERROR); + g_return_val_if_fail (frame != NULL, GST_FLOW_ERROR); + g_return_val_if_fail (buffer != NULL, GST_FLOW_ERROR); + + outbuf = gst_video_decoder_allocate_output_buffer (videodec); + if (!outbuf) { + GST_ERROR_OBJECT (videodec, "Couldn't allocate output buffer"); return FALSE; } + if (!gst_cuda_context_push (decoder->context)) { + GST_ERROR_OBJECT (decoder, "Failed 
to push CUDA context"); + goto error; + } + if (!gst_nv_decoder_frame_map (frame)) { GST_ERROR_OBJECT (decoder, "Couldn't map frame"); gst_cuda_context_pop (NULL); - return FALSE; + goto error; } gst_cuda_context_pop (NULL); + switch (decoder->output_type) { + case GST_NV_DECODER_OUTPUT_TYPE_SYSTEM: + ret = gst_nv_decoder_copy_frame_to_system (decoder, frame, outbuf); + break; #ifdef HAVE_NVCODEC_GST_GL - if (output_type == GST_NV_DECOCER_OUTPUT_TYPE_GL) { - ret = gst_nv_decoder_copy_frame_to_gl (decoder, - GST_GL_CONTEXT (graphics_context), frame, buffer); - } else + case GST_NV_DECODER_OUTPUT_TYPE_GL: + g_assert (decoder->gl_context != NULL); + + ret = gst_nv_decoder_copy_frame_to_gl (decoder, + GST_GL_CONTEXT (decoder->gl_context), frame, outbuf); + break; #endif - { - ret = gst_nv_decoder_copy_frame_to_system (decoder, frame, buffer); + case GST_NV_DECODER_OUTPUT_TYPE_CUDA: + ret = gst_nv_decoder_copy_frame_to_cuda (decoder, frame, outbuf); + break; + default: + g_assert_not_reached (); + goto error; + } + + /* FIXME: This is the case where OpenGL context of downstream glbufferpool + * belongs to non-nvidia (or different device). 
+ * There should be enhancement to ensure nvdec has compatible OpenGL context + */ + if (!ret && decoder->output_type == GST_NV_DECODER_OUTPUT_TYPE_GL) { + GST_WARNING_OBJECT (videodec, + "Couldn't copy frame to GL memory, fallback to system memory"); + decoder->output_type = GST_NV_DECODER_OUTPUT_TYPE_SYSTEM; + + ret = gst_nv_decoder_copy_frame_to_system (decoder, frame, outbuf); } gst_cuda_context_push (decoder->context); gst_nv_decoder_frame_unmap (frame); gst_cuda_context_pop (NULL); - return ret; + if (!ret) { + GST_WARNING_OBJECT (videodec, "Failed to copy frame"); + goto error; + } + + *buffer = outbuf; + + return TRUE; + +error: + gst_clear_buffer (&outbuf); + return FALSE; } typedef enum @@ -734,7 +895,7 @@ } GstNvDecoderFormatFlags; static gboolean -gst_nv_decocer_get_supported_codec_profiles (GValue * profiles, +gst_nv_decoder_get_supported_codec_profiles (GValue * profiles, cudaVideoCodec codec, GstNvDecoderFormatFlags flags) { GValue val = G_VALUE_INIT; @@ -757,6 +918,12 @@ g_value_set_static_string (&val, "high"); gst_value_list_append_value (profiles, &val); + + g_value_set_static_string (&val, "constrained-high"); + gst_value_list_append_value (profiles, &val); + + g_value_set_static_string (&val, "progressive-high"); + gst_value_list_append_value (profiles, &val); } /* NVDEC supports only 4:2:0 8bits h264 decoding. 
@@ -765,6 +932,9 @@ GST_NV_DECODER_FORMAT_FLAG_420_10BITS) { g_value_set_static_string (&val, "high-10"); gst_value_list_append_value (profiles, &val); + + g_value_set_static_string (&val, "progressive-high-10"); + gst_value_list_append_value (profiles, &val); } if ((flags & GST_NV_DECODER_FORMAT_FLAG_420_12BITS) == @@ -820,6 +990,21 @@ ret = TRUE; break; + case cudaVideoCodec_VP9: + if ((flags & GST_NV_DECODER_FORMAT_FLAG_420_8BITS) == + GST_NV_DECODER_FORMAT_FLAG_420_8BITS) { + g_value_set_static_string (&val, "0"); + gst_value_list_append_value (profiles, &val); + } + + if ((flags & GST_NV_DECODER_FORMAT_FLAG_420_10BITS) == + GST_NV_DECODER_FORMAT_FLAG_420_10BITS) { + g_value_set_static_string (&val, "2"); + gst_value_list_append_value (profiles, &val); + } + + ret = TRUE; + break; default: break; } @@ -853,7 +1038,7 @@ {cudaVideoCodec_H264, "h264", "video/x-h264, stream-format = (string) byte-stream" ", alignment = (string) au" - ", profile = (string) { constrained-baseline, baseline, main, high }"}, + ", profile = (string) { constrained-baseline, baseline, main, high, constrained-high, progressive-high }"}, {cudaVideoCodec_JPEG, "jpeg", "image/jpeg"}, #if 0 /* FIXME: need verification */ @@ -915,15 +1100,23 @@ src_templ = gst_caps_from_string (GST_VIDEO_CAPS_MAKE ("NV12")); -#if HAVE_NVCODEC_GST_GL { - GstCaps *gl_caps = gst_caps_copy (src_templ); - gst_caps_set_features_simple (gl_caps, - gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_GL_MEMORY)); - gst_caps_append (src_templ, gl_caps); - } + GstCaps *cuda_caps = gst_caps_copy (src_templ); + gst_caps_set_features_simple (cuda_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)); + +#if HAVE_NVCODEC_GST_GL + { + GstCaps *gl_caps = gst_caps_copy (src_templ); + gst_caps_set_features_simple (gl_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_GL_MEMORY)); + gst_caps_append (src_templ, gl_caps); + } #endif + gst_caps_append (src_templ, cuda_caps); + } + sink_templ = 
gst_caps_from_string (codec_map->sink_caps_string); *src_template = src_templ; @@ -1042,22 +1235,30 @@ gst_caps_set_value (src_templ, "format", &format_list); - /* OpenGL specific */ -#if HAVE_NVCODEC_GST_GL { - GstCaps *gl_caps = gst_caps_copy (src_templ); - gst_caps_set_features_simple (gl_caps, - gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_GL_MEMORY)); - gst_caps_append (src_templ, gl_caps); - } + GstCaps *cuda_caps = gst_caps_copy (src_templ); + gst_caps_set_features_simple (cuda_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)); + + /* OpenGL specific */ +#if HAVE_NVCODEC_GST_GL + { + GstCaps *gl_caps = gst_caps_copy (src_templ); + gst_caps_set_features_simple (gl_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_GL_MEMORY)); + gst_caps_append (src_templ, gl_caps); + } #endif + gst_caps_append (src_templ, cuda_caps); + } + sink_templ = gst_caps_from_string (codec_map->sink_caps_string); gst_caps_set_simple (sink_templ, "width", GST_TYPE_INT_RANGE, min_width, max_width, "height", GST_TYPE_INT_RANGE, min_height, max_height, NULL); - if (gst_nv_decocer_get_supported_codec_profiles (&profile_list, codec, + if (gst_nv_decoder_get_supported_codec_profiles (&profile_list, codec, format_flags)) { gst_caps_set_value (sink_templ, "profile", &profile_list); } @@ -1105,84 +1306,37 @@ } gboolean -gst_nv_decoder_ensure_element_data (GstElement * decoder, guint cuda_device_id, - GstCudaContext ** cuda_context, CUstream * cuda_stream, - GstObject ** gl_display, GstObject ** other_gl_context) +gst_nv_decoder_handle_set_context (GstNvDecoder * decoder, + GstElement * videodec, GstContext * context) { - CUresult cuda_ret; - - g_return_val_if_fail (GST_IS_ELEMENT (decoder), FALSE); - g_return_val_if_fail (cuda_context, FALSE); - g_return_val_if_fail (cuda_stream, FALSE); - g_return_val_if_fail (gl_display, FALSE); - g_return_val_if_fail (other_gl_context, FALSE); - - if (!gst_cuda_ensure_element_context (decoder, cuda_device_id, 
cuda_context)) { - GST_ERROR_OBJECT (decoder, "failed to create CUDA context"); - return FALSE; - } - - if (gst_cuda_context_push (*cuda_context)) { - CUstream stream; - cuda_ret = CuStreamCreate (&stream, CU_STREAM_DEFAULT); - if (!gst_cuda_result (cuda_ret)) { - GST_WARNING_OBJECT (decoder, - "Could not create CUDA stream, will use default stream"); - *cuda_stream = NULL; - } else { - *cuda_stream = stream; - } + g_return_val_if_fail (GST_IS_NV_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_ELEMENT (videodec), FALSE); - gst_cuda_context_pop (NULL); +#ifdef HAVE_NVCODEC_GST_GL + if (gst_gl_handle_set_context (videodec, context, + (GstGLDisplay **) & decoder->gl_display, + (GstGLContext **) & decoder->other_gl_context)) { + return TRUE; } -#if HAVE_NVCODEC_GST_GL - gst_gl_ensure_element_data (decoder, - (GstGLDisplay **) gl_display, (GstGLContext **) other_gl_context); - if (*gl_display) - gst_gl_display_filter_gl_api (GST_GL_DISPLAY (*gl_display), - SUPPORTED_GL_APIS); #endif - return TRUE; -} - -void -gst_nv_decoder_set_context (GstElement * decoder, GstContext * context, - guint cuda_device_id, GstCudaContext ** cuda_context, - GstObject ** gl_display, GstObject ** other_gl_context) -{ - g_return_if_fail (GST_IS_ELEMENT (decoder)); - g_return_if_fail (GST_IS_CONTEXT (context)); - g_return_if_fail (cuda_context != NULL); - g_return_if_fail (gl_display != NULL); - g_return_if_fail (other_gl_context != NULL); - - if (gst_cuda_handle_set_context (decoder, context, cuda_device_id, - cuda_context)) { - return; - } -#ifdef HAVE_NVCODEC_GST_GL - gst_gl_handle_set_context (decoder, context, - (GstGLDisplay **) gl_display, (GstGLContext **) other_gl_context); -#endif + return FALSE; } gboolean -gst_nv_decoder_handle_context_query (GstElement * decoder, GstQuery * query, - GstCudaContext * cuda_context, GstObject * gl_display, - GstObject * gl_context, GstObject * other_gl_context) +gst_nv_decoder_handle_context_query (GstNvDecoder * decoder, + GstVideoDecoder * 
videodec, GstQuery * query) { - g_return_val_if_fail (GST_IS_ELEMENT (decoder), FALSE); + g_return_val_if_fail (GST_IS_NV_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_ELEMENT (videodec), FALSE); - if (gst_cuda_handle_context_query (decoder, query, cuda_context)) { - return TRUE; - } #ifdef HAVE_NVCODEC_GST_GL - if (gst_gl_handle_context_query (GST_ELEMENT (decoder), query, - (GstGLDisplay *) gl_display, - (GstGLContext *) gl_context, (GstGLContext *) other_gl_context)) { - if (gl_display) - gst_gl_display_filter_gl_api (GST_GL_DISPLAY (gl_display), + if (gst_gl_handle_context_query (GST_ELEMENT (videodec), query, + (GstGLDisplay *) decoder->gl_display, + (GstGLContext *) decoder->gl_context, + (GstGLContext *) decoder->other_gl_context)) { + if (decoder->gl_display) + gst_gl_display_filter_gl_api (GST_GL_DISPLAY (decoder->gl_display), SUPPORTED_GL_APIS); return TRUE; } @@ -1214,50 +1368,52 @@ } static gboolean -gst_nv_decoder_ensure_gl_context (GstElement * decoder, GstObject * gl_display, - GstObject * other_gl_context, GstObject ** gl_context) +gst_nv_decoder_ensure_gl_context (GstNvDecoder * decoder, GstElement * videodec) { gboolean ret; GstGLDisplay *display; GstGLContext *context; - if (!gl_display) { - GST_DEBUG_OBJECT (decoder, "No available OpenGL display"); + if (!gst_gl_ensure_element_data (videodec, + (GstGLDisplay **) & decoder->gl_display, + (GstGLContext **) & decoder->other_gl_context)) { + GST_DEBUG_OBJECT (videodec, "No available OpenGL display"); return FALSE; } - display = GST_GL_DISPLAY (gl_display); + display = GST_GL_DISPLAY (decoder->gl_display); - if (!gst_gl_query_local_gl_context (decoder, GST_PAD_SRC, - (GstGLContext **) gl_context)) { - GST_INFO_OBJECT (decoder, "failed to query local OpenGL context"); + if (!gst_gl_query_local_gl_context (videodec, GST_PAD_SRC, + (GstGLContext **) & decoder->gl_context)) { + GST_INFO_OBJECT (videodec, "failed to query local OpenGL context"); - gst_clear_object (gl_context); - *gl_context = 
+ gst_clear_object (&decoder->gl_context); + decoder->gl_context = (GstObject *) gst_gl_display_get_gl_context_for_thread (display, NULL); - if (*gl_context == NULL + if (decoder->gl_context == NULL || !gst_gl_display_add_context (display, - GST_GL_CONTEXT (*gl_context))) { - gst_clear_object (gl_context); + GST_GL_CONTEXT (decoder->gl_context))) { + gst_clear_object (&decoder->gl_context); if (!gst_gl_display_create_context (display, - (GstGLContext *) other_gl_context, - (GstGLContext **) gl_context, NULL)) { - GST_WARNING_OBJECT (decoder, "failed to create OpenGL context"); + (GstGLContext *) decoder->other_gl_context, + (GstGLContext **) & decoder->gl_context, NULL)) { + GST_WARNING_OBJECT (videodec, "failed to create OpenGL context"); return FALSE; } - if (!gst_gl_display_add_context (display, (GstGLContext *) * gl_context)) { - GST_WARNING_OBJECT (decoder, + if (!gst_gl_display_add_context (display, + (GstGLContext *) decoder->gl_context)) { + GST_WARNING_OBJECT (videodec, "failed to add the OpenGL context to the display"); return FALSE; } } } - context = GST_GL_CONTEXT (*gl_context); + context = GST_GL_CONTEXT (decoder->gl_context); if (!gst_gl_context_check_gl_version (context, SUPPORTED_GL_APIS, 3, 0)) { - GST_WARNING_OBJECT (decoder, + GST_WARNING_OBJECT (videodec, "OpenGL context could not support PBO download"); return FALSE; } @@ -1267,7 +1423,7 @@ &ret); if (!ret) { - GST_WARNING_OBJECT (decoder, + GST_WARNING_OBJECT (videodec, "Current OpenGL context is not CUDA-compatible"); return FALSE; } @@ -1277,97 +1433,168 @@ #endif gboolean -gst_nv_decoder_negotiate (GstVideoDecoder * decoder, - GstVideoCodecState * input_state, GstVideoFormat format, guint width, - guint height, GstObject * gl_display, GstObject * other_gl_context, - GstObject ** gl_context, GstVideoCodecState ** output_state, - GstNvDecoderOutputType * output_type) +gst_nv_decoder_negotiate (GstNvDecoder * decoder, + GstVideoDecoder * videodec, GstVideoCodecState * input_state, + 
GstVideoCodecState ** output_state) { GstVideoCodecState *state; + GstVideoInfo *info; - g_return_val_if_fail (GST_IS_VIDEO_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_NV_DECODER (decoder), FALSE); + g_return_val_if_fail (GST_IS_VIDEO_DECODER (videodec), FALSE); g_return_val_if_fail (input_state != NULL, FALSE); - g_return_val_if_fail (format != GST_VIDEO_FORMAT_UNKNOWN, FALSE); - g_return_val_if_fail (width > 0, FALSE); - g_return_val_if_fail (height > 0, FALSE); g_return_val_if_fail (output_state != NULL, FALSE); - g_return_val_if_fail (gl_context != NULL, FALSE); - g_return_val_if_fail (output_type != NULL, FALSE); - state = gst_video_decoder_set_output_state (decoder, - format, width, height, input_state); + if (!decoder->configured) { + GST_ERROR_OBJECT (videodec, "Should configure decoder first"); + return FALSE; + } + + info = &decoder->info; + state = gst_video_decoder_set_interlaced_output_state (videodec, + GST_VIDEO_INFO_FORMAT (info), GST_VIDEO_INFO_INTERLACE_MODE (info), + GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info), input_state); state->caps = gst_video_info_to_caps (&state->info); if (*output_state) gst_video_codec_state_unref (*output_state); *output_state = state; - *output_type = GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM; + decoder->output_type = GST_NV_DECODER_OUTPUT_TYPE_SYSTEM; -#ifdef HAVE_NVCODEC_GST_GL { GstCaps *caps; - caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (decoder)); - GST_DEBUG_OBJECT (decoder, "Allowed caps %" GST_PTR_FORMAT, caps); + caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (videodec)); + GST_DEBUG_OBJECT (videodec, "Allowed caps %" GST_PTR_FORMAT, caps); if (!caps || gst_caps_is_any (caps)) { - GST_DEBUG_OBJECT (decoder, + GST_DEBUG_OBJECT (videodec, "cannot determine output format, using system memory"); - } else if (gl_display) { + } else { GstCapsFeatures *features; guint size = gst_caps_get_size (caps); guint i; + gboolean have_cuda = FALSE; + gboolean have_gl = FALSE; for 
(i = 0; i < size; i++) { features = gst_caps_get_features (caps, i); if (features && gst_caps_features_contains (features, - GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { - GST_DEBUG_OBJECT (decoder, "found GL memory feature, using gl"); - *output_type = GST_NV_DECOCER_OUTPUT_TYPE_GL; + GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)) { + GST_DEBUG_OBJECT (videodec, "found CUDA memory feature"); + have_cuda = TRUE; break; } +#ifdef HAVE_NVCODEC_GST_GL + if (features && gst_caps_features_contains (features, + GST_CAPS_FEATURE_MEMORY_GL_MEMORY)) { + GST_DEBUG_OBJECT (videodec, "found GL memory feature"); + have_gl = TRUE; + } +#endif } + + if (have_cuda) + decoder->output_type = GST_NV_DECODER_OUTPUT_TYPE_CUDA; + else if (have_gl) + decoder->output_type = GST_NV_DECODER_OUTPUT_TYPE_GL; } gst_clear_caps (&caps); } - if (*output_type == GST_NV_DECOCER_OUTPUT_TYPE_GL && - !gst_nv_decoder_ensure_gl_context (GST_ELEMENT (decoder), - gl_display, other_gl_context, gl_context)) { - GST_WARNING_OBJECT (decoder, +#ifdef HAVE_NVCODEC_GST_GL + if (decoder->output_type == GST_NV_DECODER_OUTPUT_TYPE_GL && + !gst_nv_decoder_ensure_gl_context (decoder, GST_ELEMENT (videodec))) { + GST_WARNING_OBJECT (videodec, "OpenGL context is not CUDA-compatible, fallback to system memory"); - *output_type = GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM; + decoder->output_type = GST_NV_DECODER_OUTPUT_TYPE_SYSTEM; } +#endif - if (*output_type == GST_NV_DECOCER_OUTPUT_TYPE_GL) { - gst_caps_set_features (state->caps, 0, - gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_GL_MEMORY, NULL)); - gst_caps_set_simple (state->caps, "texture-target", G_TYPE_STRING, - "2D", NULL); - } else { - GST_DEBUG_OBJECT (decoder, "using system memory"); - } + switch (decoder->output_type) { + case GST_NV_DECODER_OUTPUT_TYPE_CUDA: + GST_DEBUG_OBJECT (videodec, "using CUDA memory"); + gst_caps_set_features (state->caps, 0, + gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY, NULL)); + break; +#ifdef HAVE_NVCODEC_GST_GL + case 
GST_NV_DECODER_OUTPUT_TYPE_GL: + GST_DEBUG_OBJECT (videodec, "using GL memory"); + gst_caps_set_features (state->caps, 0, + gst_caps_features_new (GST_CAPS_FEATURE_MEMORY_GL_MEMORY, NULL)); + gst_caps_set_simple (state->caps, "texture-target", G_TYPE_STRING, + "2D", NULL); + break; #endif + default: + GST_DEBUG_OBJECT (videodec, "using system memory"); + break; + } return TRUE; } -gboolean -gst_nv_decoder_decide_allocation (GstVideoDecoder * decocer, GstQuery * query, - GstObject * gl_context, GstNvDecoderOutputType output_type) +static gboolean +gst_nv_decoder_ensure_cuda_pool (GstNvDecoder * decoder, GstQuery * query) { + GstCaps *outcaps; + GstBufferPool *pool = NULL; + guint n, size, min, max; + GstVideoInfo vinfo = { 0, }; + GstStructure *config; + + gst_query_parse_allocation (query, &outcaps, NULL); + n = gst_query_get_n_allocation_pools (query); + if (n > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (pool && !GST_IS_CUDA_BUFFER_POOL (pool)) { + gst_object_unref (pool); + pool = NULL; + } + } + + if (!pool) { + GST_DEBUG_OBJECT (decoder, "no downstream pool, create our pool"); + pool = gst_cuda_buffer_pool_new (decoder->context); + + if (outcaps) + gst_video_info_from_caps (&vinfo, outcaps); + size = (guint) vinfo.size; + min = max = 0; + } + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_buffer_pool_set_config (pool, config); + if (n > 0) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + gst_object_unref (pool); + + return TRUE; +} + #ifdef HAVE_NVCODEC_GST_GL +static gboolean +gst_nv_decoder_ensure_gl_pool (GstNvDecoder * decoder, GstQuery * query) +{ GstCaps *outcaps; GstBufferPool *pool = NULL; guint n, size, min, max; GstVideoInfo vinfo = { 0, }; GstStructure *config; 
+ GstGLContext *gl_context; - GST_DEBUG_OBJECT (decocer, "decide allocation"); + GST_DEBUG_OBJECT (decoder, "decide allocation"); - /* GstVideoDecoder will take care this case */ - if (output_type == GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM) - return TRUE; + if (!decoder->gl_context) { + GST_ERROR_OBJECT (decoder, "GL context is not available"); + return FALSE; + } + + gl_context = GST_GL_CONTEXT (decoder->gl_context); gst_query_parse_allocation (query, &outcaps, NULL); n = gst_query_get_n_allocation_pools (query); @@ -1380,6 +1607,7 @@ } if (!pool) { + GST_DEBUG_OBJECT (decoder, "no downstream pool, create our pool"); pool = gst_gl_buffer_pool_new (GST_GL_CONTEXT (gl_context)); if (outcaps) @@ -1397,7 +1625,35 @@ else gst_query_add_allocation_pool (query, pool, size, min, max); gst_object_unref (pool); -#endif return TRUE; } +#endif + +gboolean +gst_nv_decoder_decide_allocation (GstNvDecoder * decoder, + GstVideoDecoder * videodec, GstQuery * query) +{ + gboolean ret = TRUE; + + GST_DEBUG_OBJECT (videodec, "decide allocation"); + + switch (decoder->output_type) { + case GST_NV_DECODER_OUTPUT_TYPE_SYSTEM: + /* GstVideoDecoder will take care this case */ + break; +#ifdef HAVE_NVCODEC_GST_GL + case GST_NV_DECODER_OUTPUT_TYPE_GL: + ret = gst_nv_decoder_ensure_gl_pool (decoder, query); + break; +#endif + case GST_NV_DECODER_OUTPUT_TYPE_CUDA: + ret = gst_nv_decoder_ensure_cuda_pool (decoder, query); + break; + default: + g_assert_not_reached (); + return FALSE; + } + + return ret; +}
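The new gst_nv_decoder_copy_frame_to_cuda path in this patch drives each plane through a CUDA_MEMCPY2D: WidthInBytes bytes are copied per row, for Height rows, while source and destination each advance by their own pitch. A minimal CPU-side sketch of that 2D-copy semantic (plain C with a hypothetical helper name, not code from this patch):

```c
#include <stddef.h>
#include <string.h>

/* Copy a 2D plane row by row: width_bytes per row, for rows rows,
 * honouring distinct source and destination pitches. This mirrors
 * the CUDA_MEMCPY2D srcPitch/dstPitch/WidthInBytes/Height fields
 * on the CPU; the real patch uses CuMemcpy2DAsync on device memory. */
static void
copy_plane_2d (unsigned char *dst, size_t dst_pitch,
    const unsigned char *src, size_t src_pitch,
    size_t width_bytes, size_t rows)
{
  size_t y;

  for (y = 0; y < rows; y++)
    memcpy (dst + y * dst_pitch, src + y * src_pitch, width_bytes);
}
```

Pitches are typically larger than the visible row width because decoders align surfaces, which is why the copy cannot be a single flat memcpy.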
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvdecoder.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvdecoder.h
Changed
@@ -42,90 +42,61 @@ /*< private >*/ GstNvDecoder *decoder; + + gint ref_count; } GstNvDecoderFrame; -typedef enum -{ - GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM = 0, - GST_NV_DECOCER_OUTPUT_TYPE_GL, - /* FIXME: add support CUDA, D3D11 memory */ -} GstNvDecoderOutputType; - -G_GNUC_INTERNAL -GstNvDecoder * gst_nv_decoder_new (GstCudaContext * context, - cudaVideoCodec codec, - GstVideoInfo * info, - guint pool_size); - -G_GNUC_INTERNAL +GstNvDecoder * gst_nv_decoder_new (GstCudaContext * context); + +gboolean gst_nv_decoder_is_configured (GstNvDecoder * decoder); + +gboolean gst_nv_decoder_configure (GstNvDecoder * decoder, + cudaVideoCodec codec, + GstVideoInfo * info, + gint coded_width, + gint coded_height, + guint coded_bitdepth, + guint pool_size); + GstNvDecoderFrame * gst_nv_decoder_new_frame (GstNvDecoder * decoder); -G_GNUC_INTERNAL -void gst_nv_decoder_frame_free (GstNvDecoderFrame * frame); +GstNvDecoderFrame * gst_nv_decoder_frame_ref (GstNvDecoderFrame * frame); + +void gst_nv_decoder_frame_unref (GstNvDecoderFrame * frame); -G_GNUC_INTERNAL gboolean gst_nv_decoder_decode_picture (GstNvDecoder * decoder, CUVIDPICPARAMS * params); -G_GNUC_INTERNAL -gboolean gst_nv_decoder_finish_frame (GstNvDecoder * decoder, - GstNvDecoderOutputType output_type, - GstObject * graphics_context, - GstNvDecoderFrame *frame, - GstBuffer *buffer); +gboolean gst_nv_decoder_finish_frame (GstNvDecoder * decoder, + GstVideoDecoder * videodec, + GstNvDecoderFrame *frame, + GstBuffer ** buffer); /* utils for class registration */ -G_GNUC_INTERNAL gboolean gst_nv_decoder_check_device_caps (CUcontext cuda_ctx, cudaVideoCodec codec, GstCaps **sink_template, GstCaps **src_template); -G_GNUC_INTERNAL const gchar * gst_cuda_video_codec_to_string (cudaVideoCodec codec); /* helper methods */ -G_GNUC_INTERNAL -gboolean gst_nv_decoder_ensure_element_data (GstElement * decoder, - guint cuda_device_id, - GstCudaContext ** cuda_context, - CUstream * cuda_stream, - GstObject ** gl_display, - GstObject 
** other_gl_context); - -G_GNUC_INTERNAL -void gst_nv_decoder_set_context (GstElement * decoder, - GstContext * context, - guint cuda_device_id, - GstCudaContext ** cuda_context, - GstObject ** gl_display, - GstObject ** other_gl_context); - -G_GNUC_INTERNAL -gboolean gst_nv_decoder_handle_context_query (GstElement * decoder, - GstQuery * query, - GstCudaContext * cuda_context, - GstObject * gl_display, - GstObject * gl_context, - GstObject * other_gl_context); - -G_GNUC_INTERNAL -gboolean gst_nv_decoder_negotiate (GstVideoDecoder * decoder, +gboolean gst_nv_decoder_handle_set_context (GstNvDecoder * decoder, + GstElement * videodec, + GstContext * context); + +gboolean gst_nv_decoder_handle_context_query (GstNvDecoder * decoder, + GstVideoDecoder * videodec, + GstQuery * query); + +gboolean gst_nv_decoder_negotiate (GstNvDecoder * decoder, + GstVideoDecoder * videodec, GstVideoCodecState * input_state, - GstVideoFormat format, - guint width, - guint height, - GstObject * gl_display, - GstObject * other_gl_context, - GstObject ** gl_context, - GstVideoCodecState ** output_state, - GstNvDecoderOutputType * output_type); - -G_GNUC_INTERNAL -gboolean gst_nv_decoder_decide_allocation (GstVideoDecoder * decocer, - GstQuery * query, - GstObject * gl_context, - GstNvDecoderOutputType output_type); + GstVideoCodecState ** output_state); + +gboolean gst_nv_decoder_decide_allocation (GstNvDecoder * decoder, + GstVideoDecoder * videodec, + GstQuery * query); G_END_DECLS
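The header change above replaces gst_nv_decoder_frame_free with a gst_nv_decoder_frame_ref/gst_nv_decoder_frame_unref pair backed by a gint ref_count. The underlying idiom, atomic increment on ref and destroy on the decrement that reaches zero, can be sketched in standalone C11; the type and function names here are illustrative, not the GStreamer API:

```c
#include <stdatomic.h>
#include <stdlib.h>

typedef struct {
  atomic_int ref_count;
  /* frame payload would live here */
} frame_t;

static frame_t *
frame_new (void)
{
  frame_t *f = calloc (1, sizeof (*f));
  atomic_init (&f->ref_count, 1);     /* caller owns the initial ref */
  return f;
}

static frame_t *
frame_ref (frame_t *f)
{
  atomic_fetch_add (&f->ref_count, 1);
  return f;
}

/* Returns 1 when this unref released the last reference. */
static int
frame_unref (frame_t *f)
{
  if (atomic_fetch_sub (&f->ref_count, 1) == 1) {
    free (f);                          /* last ref gone: destroy */
    return 1;
  }
  return 0;
}
```

In the patch the destroy step additionally unmaps the CUVID surface and returns the slot to the decoder's frame pool, which is why the free function had to become reference-counted once frames could outlive a single decode call.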
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvenc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvenc.c
Changed
@@ -24,6 +24,8 @@ #include "gstnvenc.h" #include "gstnvh264enc.h" #include "gstnvh265enc.h" +#include "gstcudabufferpool.h" + #include <gmodule.h> #if HAVE_NVCODEC_GST_GL @@ -512,6 +514,9 @@ /* put baseline to last since it does not support bframe */ {"baseline", NV_ENC_H264_PROFILE_BASELINE_GUID, NV_ENC_CODEC_H264_GUID, FALSE, FALSE, FALSE}, + {"constrained-baseline", NV_ENC_H264_PROFILE_BASELINE_GUID, + NV_ENC_CODEC_H264_GUID, + FALSE, FALSE, FALSE}, /* hevc profiles */ {"main", NV_ENC_HEVC_PROFILE_MAIN_GUID, NV_ENC_CODEC_HEVC_GUID, FALSE, FALSE, FALSE}, @@ -787,15 +792,21 @@ g_value_unset (interlace_modes); g_free (interlace_modes); } -#if HAVE_NVCODEC_GST_GL + { + GstCaps *cuda_caps = gst_caps_copy (sink_templ); +#if HAVE_NVCODEC_GST_GL GstCaps *gl_caps = gst_caps_copy (sink_templ); gst_caps_set_features_simple (gl_caps, gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_GL_MEMORY)); gst_caps_append (sink_templ, gl_caps); - } #endif + gst_caps_set_features_simple (cuda_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_CUDA_MEMORY)); + gst_caps_append (sink_templ, cuda_caps); + } + name = g_strdup_printf ("video/x-%s", codec); src_templ = gst_caps_new_simple (name, "width", GST_TYPE_INT_RANGE, min_width, max_width, @@ -1083,7 +1094,7 @@ } guint32 -gst_nvenc_get_registure_resource_version (void) +gst_nvenc_get_register_resource_version (void) { /* NV_ENC_REGISTER_RESOURCE_VER == NVENCAPI_STRUCT_VERSION(3) */ return GST_NVENCAPI_STRUCT_VERSION (3, gst_nvenc_api_version);
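The renamed gst_nvenc_get_register_resource_version above still computes its value with GST_NVENCAPI_STRUCT_VERSION (3, gst_nvenc_api_version). In the NVENC SDK's nvEncodeAPI.h, a struct version packs the API version in the low bits, the struct revision in bits 16-23, and a constant 0x7 marker in the top nibble; the sketch below assumes that layout and is not copied from this patch:

```c
#include <stdint.h>

/* Pack an NVENC struct version the way NVENCAPI_STRUCT_VERSION is
 * believed to: low 16 bits carry the API version, bits 16..23 the
 * struct revision, and the top nibble a constant 0x7 marker.
 * Bit layout assumed from nvEncodeAPI.h. */
static uint32_t
nvenc_struct_version (uint32_t api_version, uint32_t struct_revision)
{
  return api_version | (struct_revision << 16) | (0x7u << 28);
}
```

Versioning every struct this way lets the driver reject callers built against an incompatible SDK header before touching any field.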
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvenc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvenc.h
Changed
@@ -26,94 +26,66 @@ #include "gstcudaloader.h" #include "nvEncodeAPI.h" -G_GNUC_INTERNAL gboolean gst_nvenc_cmp_guid (GUID g1, GUID g2); -G_GNUC_INTERNAL NV_ENC_BUFFER_FORMAT gst_nvenc_get_nv_buffer_format (GstVideoFormat fmt); -G_GNUC_INTERNAL gboolean gst_nvenc_get_supported_input_formats (gpointer encoder, GUID codec_id, GValue ** formats); -G_GNUC_INTERNAL GValue * gst_nvenc_get_interlace_modes (gpointer enc, GUID codec_id); -G_GNUC_INTERNAL GValue * gst_nvenc_get_supported_codec_profiles (gpointer enc, GUID codec_id); -G_GNUC_INTERNAL void gst_nvenc_plugin_init (GstPlugin * plugin, guint device_index, CUcontext cuda_ctx); -G_GNUC_INTERNAL guint32 gst_nvenc_get_api_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_caps_param_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_encode_out_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_create_input_buffer_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_create_bitstream_buffer_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_create_mv_buffer_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_rc_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_config_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_initialize_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_reconfigure_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_preset_config_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_pic_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_meonly_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_lock_bitstream_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_lock_input_buffer_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_map_input_resource_version (void); -G_GNUC_INTERNAL -guint32 gst_nvenc_get_registure_resource_version (void); +guint32 gst_nvenc_get_register_resource_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_stat_version (void); -G_GNUC_INTERNAL guint32 
gst_nvenc_get_sequence_param_payload_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_event_params_version (void); -G_GNUC_INTERNAL guint32 gst_nvenc_get_open_encode_session_ex_params_version (void); -G_GNUC_INTERNAL gboolean gst_nvenc_load_library (guint * api_major_ver, guint * api_minor_ver);
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh264dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh264dec.c
Changed
@@ -89,11 +89,7 @@ GstVideoCodecState *output_state; - const GstH264SPS *last_sps; - const GstH264PPS *last_pps; - GstCudaContext *context; - CUstream cuda_stream; GstNvDecoder *decoder; CUVIDPICPARAMS params; @@ -113,14 +109,10 @@ guint bitdepth; guint chroma_format_idc; gint max_dpb_size; - GstVideoFormat out_format; - /* For OpenGL interop. */ - GstObject *gl_display; - GstObject *gl_context; - GstObject *other_gl_context; + gboolean interlaced; - GstNvDecoderOutputType output_type; + GArray *ref_list; }; struct _GstNvH264DecClass @@ -132,6 +124,7 @@ #define gst_nv_h264_dec_parent_class parent_class G_DEFINE_TYPE (GstNvH264Dec, gst_nv_h264_dec, GST_TYPE_H264_DECODER); +static void gst_nv_h264_decoder_dispose (GObject * object); static void gst_nv_h264_decoder_finalize (GObject * object); static void gst_nv_h264_dec_set_context (GstElement * element, GstContext * context); @@ -144,19 +137,24 @@ GstQuery * query); /* GstH264Decoder */ -static gboolean gst_nv_h264_dec_new_sequence (GstH264Decoder * decoder, +static GstFlowReturn gst_nv_h264_dec_new_sequence (GstH264Decoder * decoder, const GstH264SPS * sps, gint max_dpb_size); -static gboolean gst_nv_h264_dec_new_picture (GstH264Decoder * decoder, +static GstFlowReturn gst_nv_h264_dec_new_picture (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture); +static GstFlowReturn gst_nv_h264_dec_new_field_picture (GstH264Decoder * + decoder, const GstH264Picture * first_field, GstH264Picture * second_field); static GstFlowReturn gst_nv_h264_dec_output_picture (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture); -static gboolean gst_nv_h264_dec_start_picture (GstH264Decoder * decoder, +static GstFlowReturn gst_nv_h264_dec_start_picture (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb); -static gboolean gst_nv_h264_dec_decode_slice (GstH264Decoder * decoder, +static GstFlowReturn gst_nv_h264_dec_decode_slice (GstH264Decoder * 
decoder, GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, GArray * ref_pic_list1); -static gboolean gst_nv_h264_dec_end_picture (GstH264Decoder * decoder, +static GstFlowReturn gst_nv_h264_dec_end_picture (GstH264Decoder * decoder, GstH264Picture * picture); +static guint +gst_nv_h264_dec_get_preferred_output_delay (GstH264Decoder * decoder, + gboolean live); static void gst_nv_h264_dec_class_init (GstNvH264DecClass * klass) @@ -172,6 +170,7 @@ * Since: 1.18 */ + object_class->dispose = gst_nv_h264_decoder_dispose; object_class->finalize = gst_nv_h264_decoder_finalize; element_class->set_context = GST_DEBUG_FUNCPTR (gst_nv_h264_dec_set_context); @@ -187,6 +186,8 @@ GST_DEBUG_FUNCPTR (gst_nv_h264_dec_new_sequence); h264decoder_class->new_picture = GST_DEBUG_FUNCPTR (gst_nv_h264_dec_new_picture); + h264decoder_class->new_field_picture = + GST_DEBUG_FUNCPTR (gst_nv_h264_dec_new_field_picture); h264decoder_class->output_picture = GST_DEBUG_FUNCPTR (gst_nv_h264_dec_output_picture); h264decoder_class->start_picture = @@ -195,6 +196,8 @@ GST_DEBUG_FUNCPTR (gst_nv_h264_dec_decode_slice); h264decoder_class->end_picture = GST_DEBUG_FUNCPTR (gst_nv_h264_dec_end_picture); + h264decoder_class->get_preferred_output_delay = + GST_DEBUG_FUNCPTR (gst_nv_h264_dec_get_preferred_output_delay); GST_DEBUG_CATEGORY_INIT (gst_nv_h264_dec_debug, "nvh264dec", 0, "Nvidia H.264 Decoder"); @@ -205,6 +208,20 @@ static void gst_nv_h264_dec_init (GstNvH264Dec * self) { + self->ref_list = g_array_sized_new (FALSE, TRUE, + sizeof (GstH264Picture *), 16); + g_array_set_clear_func (self->ref_list, + (GDestroyNotify) gst_h264_picture_clear); +} + +static void +gst_nv_h264_decoder_dispose (GObject * object) +{ + GstNvH264Dec *self = GST_NV_H264_DEC (object); + + g_clear_pointer (&self->ref_list, g_array_unref); + + G_OBJECT_CLASS (parent_class)->dispose (object); } static void @@ -227,9 +244,15 @@ GST_DEBUG_OBJECT (self, "set context %s", gst_context_get_context_type (context)); 
- gst_nv_decoder_set_context (element, context, klass->cuda_device_id, - &self->context, &self->gl_display, &self->other_gl_context); + if (gst_cuda_handle_set_context (element, context, klass->cuda_device_id, + &self->context)) { + goto done; + } + + if (self->decoder) + gst_nv_decoder_handle_set_context (self->decoder, element, context); +done: GST_ELEMENT_CLASS (parent_class)->set_context (element, context); } @@ -243,8 +266,8 @@ self->coded_height = 0; self->bitdepth = 0; self->chroma_format_idc = 0; - self->out_format = GST_VIDEO_FORMAT_UNKNOWN; self->max_dpb_size = 0; + self->interlaced = FALSE; } static gboolean @@ -253,13 +276,20 @@ GstNvH264Dec *self = GST_NV_H264_DEC (decoder); GstNvH264DecClass *klass = GST_NV_H264_DEC_GET_CLASS (self); - if (!gst_nv_decoder_ensure_element_data (GST_ELEMENT (self), - klass->cuda_device_id, &self->context, &self->cuda_stream, - &self->gl_display, &self->other_gl_context)) { + if (!gst_cuda_ensure_element_context (GST_ELEMENT (self), + klass->cuda_device_id, &self->context)) { GST_ERROR_OBJECT (self, "Required element data is unavailable"); return FALSE; } + self->decoder = gst_nv_decoder_new (self->context); + if (!self->decoder) { + GST_ERROR_OBJECT (self, "Failed to create decoder object"); + gst_clear_object (&self->context); + + return FALSE; + } + gst_d3d11_h264_dec_reset (self); return TRUE; @@ -272,19 +302,7 @@ g_clear_pointer (&self->output_state, gst_video_codec_state_unref); gst_clear_object (&self->decoder); - - if (self->context && self->cuda_stream) { - if (gst_cuda_context_push (self->context)) { - gst_cuda_result (CuStreamDestroy (self->cuda_stream)); - gst_cuda_context_pop (NULL); - } - } - - gst_clear_object (&self->gl_context); - gst_clear_object (&self->other_gl_context); - gst_clear_object (&self->gl_display); gst_clear_object (&self->context); - self->cuda_stream = NULL; return TRUE; } @@ -297,9 +315,8 @@ GST_DEBUG_OBJECT (self, "negotiate"); - gst_nv_decoder_negotiate (decoder, h264dec->input_state, 
self->out_format, - self->width, self->height, self->gl_display, self->other_gl_context, - &self->gl_context, &self->output_state, &self->output_type); + gst_nv_decoder_negotiate (self->decoder, decoder, h264dec->input_state, + &self->output_state); /* TODO: add support D3D11 memory */ @@ -311,8 +328,10 @@ { GstNvH264Dec *self = GST_NV_H264_DEC (decoder); - gst_nv_decoder_decide_allocation (decoder, query, - self->gl_context, self->output_type); + if (!gst_nv_decoder_decide_allocation (self->decoder, decoder, query)) { + GST_WARNING_OBJECT (self, "Failed to handle decide allocation"); + return FALSE; + } return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation (decoder, query); @@ -325,9 +344,11 @@ switch (GST_QUERY_TYPE (query)) { case GST_QUERY_CONTEXT: - if (gst_nv_decoder_handle_context_query (GST_ELEMENT (self), query, - self->context, self->gl_display, self->gl_context, - self->other_gl_context)) { + if (gst_cuda_handle_context_query (GST_ELEMENT (decoder), query, + self->context)) { + return TRUE; + } else if (self->decoder && + gst_nv_decoder_handle_context_query (self->decoder, decoder, query)) { return TRUE; } break; @@ -338,13 +359,14 @@ return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); } -static gboolean +static GstFlowReturn gst_nv_h264_dec_new_sequence (GstH264Decoder * decoder, const GstH264SPS * sps, gint max_dpb_size) { GstNvH264Dec *self = GST_NV_H264_DEC (decoder); gint crop_width, crop_height; gboolean modified = FALSE; + gboolean interlaced; GST_LOG_OBJECT (self, "new sequence"); @@ -379,67 +401,69 @@ modified = TRUE; } + interlaced = !sps->frame_mbs_only_flag; + if (self->interlaced != interlaced) { + GST_INFO_OBJECT (self, "interlaced sequence changed"); + self->interlaced = interlaced; + modified = TRUE; + } + if (self->max_dpb_size < max_dpb_size) { GST_INFO_OBJECT (self, "Requires larger DPB size (%d -> %d)", self->max_dpb_size, max_dpb_size); modified = TRUE; } - if (modified || !self->decoder) { + if 
(modified || !gst_nv_decoder_is_configured (self->decoder)) { GstVideoInfo info; - - self->out_format = GST_VIDEO_FORMAT_UNKNOWN; + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; if (self->bitdepth == 8) { if (self->chroma_format_idc == 1) - self->out_format = GST_VIDEO_FORMAT_NV12; + out_format = GST_VIDEO_FORMAT_NV12; else { GST_FIXME_OBJECT (self, "Could not support 8bits non-4:2:0 format"); } } else if (self->bitdepth == 10) { if (self->chroma_format_idc == 1) - self->out_format = GST_VIDEO_FORMAT_P010_10LE; + out_format = GST_VIDEO_FORMAT_P010_10LE; else { GST_FIXME_OBJECT (self, "Could not support 10bits non-4:2:0 format"); } } - if (self->out_format == GST_VIDEO_FORMAT_UNKNOWN) { + if (out_format == GST_VIDEO_FORMAT_UNKNOWN) { GST_ERROR_OBJECT (self, "Could not support bitdepth/chroma format"); - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; } - gst_clear_object (&self->decoder); - - gst_video_info_set_format (&info, - self->out_format, self->width, self->height); + gst_video_info_set_format (&info, out_format, self->width, self->height); + if (self->interlaced) + GST_VIDEO_INFO_INTERLACE_MODE (&info) = GST_VIDEO_INTERLACE_MODE_MIXED; self->max_dpb_size = max_dpb_size; /* FIXME: add support cudaVideoCodec_H264_SVC and cudaVideoCodec_H264_MVC */ - self->decoder = gst_nv_decoder_new (self->context, cudaVideoCodec_H264, - &info, - /* Additional 2 buffers for margin */ - max_dpb_size + 2); - - if (!self->decoder) { - GST_ERROR_OBJECT (self, "Failed to create decoder"); - return FALSE; + if (!gst_nv_decoder_configure (self->decoder, + cudaVideoCodec_H264, &info, self->coded_width, self->coded_height, + self->bitdepth, + /* Additional 4 buffers for render delay */ + max_dpb_size + 4)) { + GST_ERROR_OBJECT (self, "Failed to configure decoder"); + return GST_FLOW_NOT_NEGOTIATED; } if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; } 
- self->last_sps = NULL; - self->last_pps = NULL; memset (&self->params, 0, sizeof (CUVIDPICPARAMS)); } - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_nv_h264_dec_new_picture (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture) { @@ -449,16 +473,37 @@ nv_frame = gst_nv_decoder_new_frame (self->decoder); if (!nv_frame) { GST_ERROR_OBJECT (self, "No available decoder frame"); - return FALSE; + return GST_FLOW_ERROR; } GST_LOG_OBJECT (self, "New decoder frame %p (index %d)", nv_frame, nv_frame->index); gst_h264_picture_set_user_data (picture, - nv_frame, (GDestroyNotify) gst_nv_decoder_frame_free); + nv_frame, (GDestroyNotify) gst_nv_decoder_frame_unref); - return TRUE; + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_nv_h264_dec_new_field_picture (GstH264Decoder * decoder, + const GstH264Picture * first_field, GstH264Picture * second_field) +{ + GstNvDecoderFrame *nv_frame; + + nv_frame = (GstNvDecoderFrame *) + gst_h264_picture_get_user_data ((GstH264Picture *) first_field); + if (!nv_frame) { + GST_ERROR_OBJECT (decoder, + "No decoder frame in the first picture %p", first_field); + return GST_FLOW_ERROR; + } + + gst_h264_picture_set_user_data (second_field, + gst_nv_decoder_frame_ref (nv_frame), + (GDestroyNotify) gst_nv_decoder_frame_unref); + + return GST_FLOW_OK; } static GstFlowReturn @@ -468,7 +513,6 @@ GstNvH264Dec *self = GST_NV_H264_DEC (decoder); GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); GstNvDecoderFrame *decoder_frame; - gboolean ret G_GNUC_UNUSED = FALSE; GST_LOG_OBJECT (self, "Outputting picture %p (poc %d)", picture, picture->pic_order_cnt); @@ -480,35 +524,21 @@ goto error; } - frame->output_buffer = gst_video_decoder_allocate_output_buffer (vdec); - if (!frame->output_buffer) { - GST_ERROR_OBJECT (self, "Couldn't allocate output buffer"); + if (!gst_nv_decoder_finish_frame (self->decoder, vdec, decoder_frame, + &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed 
to handle output picture"); goto error; } - if (self->output_type == GST_NV_DECOCER_OUTPUT_TYPE_GL) { - ret = gst_nv_decoder_finish_frame (self->decoder, - GST_NV_DECOCER_OUTPUT_TYPE_GL, self->gl_context, - decoder_frame, frame->output_buffer); + if (picture->buffer_flags != 0) { + gboolean interlaced = + (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_INTERLACED) != 0; + gboolean tff = (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_TFF) != 0; - /* FIXME: This is the case where OpenGL context of downstream glbufferpool - * belongs to non-nvidia (or different device). - * There should be enhancement to ensure nvdec has compatible OpenGL context - */ - if (!ret) { - GST_WARNING_OBJECT (self, - "Couldn't copy frame to GL memory, fallback to system memory"); - self->output_type = GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM; - } - } - - if (!ret) { - if (!gst_nv_decoder_finish_frame (self->decoder, - GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM, NULL, decoder_frame, - frame->output_buffer)) { - GST_ERROR_OBJECT (self, "Failed to finish frame"); - goto error; - } + GST_TRACE_OBJECT (self, + "apply buffer flags 0x%x (interlaced %d, top-field-first %d)", + picture->buffer_flags, interlaced, tff); + GST_BUFFER_FLAG_SET (frame->output_buffer, picture->buffer_flags); } gst_h264_picture_unref (picture); @@ -516,8 +546,8 @@ return gst_video_decoder_finish_frame (vdec, frame); error: - gst_video_decoder_drop_frame (vdec, frame); gst_h264_picture_unref (picture); + gst_video_decoder_release_frame (vdec, frame); return GST_FLOW_ERROR; } @@ -564,7 +594,7 @@ const GstH264SPS * sps, gboolean field_pic, CUVIDH264PICPARAMS * params) { params->residual_colour_transform_flag = sps->separate_colour_plane_flag; - params->MbaffFrameFlag = sps->mb_adaptive_frame_field_flag && field_pic; + params->MbaffFrameFlag = sps->mb_adaptive_frame_field_flag && !field_pic; #define COPY_FIELD(f) \ (params)->f = (sps)->f @@ -624,7 +654,68 @@ self->params.pSliceDataOffsets = NULL; } -static gboolean +static void 
+gst_nv_h264_dec_fill_dpb (GstNvH264Dec * self, GstH264Picture * ref, + CUVIDH264DPBENTRY * dpb) +{ + GstNvDecoderFrame *frame; + + dpb->not_existing = ref->nonexisting; + dpb->PicIdx = -1; + + frame = gst_nv_h264_dec_get_decoder_frame_from_picture (self, ref); + if (!frame) { + dpb->not_existing = 1; + } else if (!dpb->not_existing) { + dpb->PicIdx = frame->index; + } + + if (dpb->not_existing) + return; + + if (GST_H264_PICTURE_IS_LONG_TERM_REF (ref)) { + dpb->FrameIdx = ref->long_term_frame_idx; + dpb->is_long_term = 1; + } else { + dpb->FrameIdx = ref->frame_num; + dpb->is_long_term = 0; + } + + switch (ref->field) { + case GST_H264_PICTURE_FIELD_FRAME: + dpb->FieldOrderCnt[0] = ref->top_field_order_cnt; + dpb->FieldOrderCnt[1] = ref->bottom_field_order_cnt; + dpb->used_for_reference = 0x3; + break; + case GST_H264_PICTURE_FIELD_TOP_FIELD: + dpb->FieldOrderCnt[0] = ref->top_field_order_cnt; + dpb->used_for_reference = 0x1; + if (ref->other_field) { + dpb->FieldOrderCnt[1] = ref->other_field->bottom_field_order_cnt; + dpb->used_for_reference |= 0x2; + } else { + dpb->FieldOrderCnt[1] = 0; + } + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + dpb->FieldOrderCnt[1] = ref->bottom_field_order_cnt; + dpb->used_for_reference = 0x2; + if (ref->other_field) { + dpb->FieldOrderCnt[0] = ref->other_field->bottom_field_order_cnt; + dpb->used_for_reference |= 0x1; + } else { + dpb->FieldOrderCnt[0] = 0; + } + break; + default: + dpb->FieldOrderCnt[0] = 0; + dpb->FieldOrderCnt[1] = 0; + dpb->used_for_reference = 0; + break; + } +} + +static GstFlowReturn gst_nv_h264_dec_start_picture (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb) { @@ -635,8 +726,8 @@ const GstH264SPS *sps; const GstH264PPS *pps; GstNvDecoderFrame *frame; - GArray *dpb_array; - gint i; + GArray *ref_list = self->ref_list; + guint i, ref_frame_idx; g_return_val_if_fail (slice_header->pps != NULL, FALSE); g_return_val_if_fail (slice_header->pps->sequence != 
NULL, FALSE); @@ -646,7 +737,7 @@ if (!frame) { GST_ERROR_OBJECT (self, "Couldn't get decoder frame frame picture %p", picture); - return FALSE; + return GST_FLOW_ERROR; } gst_nv_h264_dec_reset_bitstream_params (self); @@ -656,92 +747,70 @@ /* FIXME: update sps/pps related params only when it's required */ params->PicWidthInMbs = sps->pic_width_in_mbs_minus1 + 1; - params->FrameHeightInMbs = sps->pic_height_in_map_units_minus1 + 1; + if (!sps->frame_mbs_only_flag) { + params->FrameHeightInMbs = (sps->pic_height_in_map_units_minus1 + 1) << 1; + } else { + params->FrameHeightInMbs = sps->pic_height_in_map_units_minus1 + 1; + } params->CurrPicIdx = frame->index; - /* TODO: verifiy interlaced */ - params->field_pic_flag = picture->field != GST_H264_PICTURE_FIELD_FRAME; + params->field_pic_flag = slice_header->field_pic_flag; params->bottom_field_flag = picture->field == GST_H264_PICTURE_FIELD_BOTTOM_FIELD; - /* TODO: set second_field here */ - params->second_field = 0; + params->second_field = picture->second_field; + + if (picture->field == GST_H264_PICTURE_FIELD_TOP_FIELD) { + h264_params->CurrFieldOrderCnt[0] = picture->top_field_order_cnt; + h264_params->CurrFieldOrderCnt[1] = 0; + } else if (picture->field == GST_H264_PICTURE_FIELD_BOTTOM_FIELD) { + h264_params->CurrFieldOrderCnt[0] = 0; + h264_params->CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; + } else { + h264_params->CurrFieldOrderCnt[0] = picture->top_field_order_cnt; + h264_params->CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; + } /* nBitstreamDataLen, pBitstreamData, nNumSlices and pSliceDataOffsets * will be set later */ - params->ref_pic_flag = picture->ref; + params->ref_pic_flag = GST_H264_PICTURE_IS_REF (picture); /* will be updated later, if any slices belong to this frame is not * intra slice */ params->intra_pic_flag = 1; h264_params->frame_num = picture->frame_num; - h264_params->ref_pic_flag = picture->ref; - /* FIXME: should be updated depending on field type? 
*/ - h264_params->CurrFieldOrderCnt[0] = picture->top_field_order_cnt; - h264_params->CurrFieldOrderCnt[1] = picture->bottom_field_order_cnt; - - if (!self->last_sps || self->last_sps != sps) { - GST_DEBUG_OBJECT (self, "Update params from SPS and PPS"); - gst_nv_h264_dec_picture_params_from_sps (self, - sps, slice_header->field_pic_flag, h264_params); - gst_nv_h264_dec_picture_params_from_pps (self, pps, h264_params); - self->last_sps = sps; - self->last_pps = pps; - } else if (!self->last_pps || self->last_pps != pps) { - GST_DEBUG_OBJECT (self, "Update params from PPS"); - gst_nv_h264_dec_picture_params_from_pps (self, pps, h264_params); - self->last_pps = pps; - } else { - GST_TRACE_OBJECT (self, "SPS and PPS were not updated"); - } - - memset (&h264_params->dpb, 0, sizeof (h264_params->dpb)); - for (i = 0; i < G_N_ELEMENTS (h264_params->dpb); i++) - h264_params->dpb[i].PicIdx = -1; + h264_params->ref_pic_flag = GST_H264_PICTURE_IS_REF (picture); - dpb_array = gst_h264_dpb_get_pictures_all (dpb); - for (i = 0; i < dpb_array->len && i < G_N_ELEMENTS (h264_params->dpb); i++) { - GstH264Picture *other = g_array_index (dpb_array, GstH264Picture *, i); - GstNvDecoderFrame *other_frame; - gint picture_index = -1; - CUVIDH264DPBENTRY *dpb = &h264_params->dpb[i]; + gst_nv_h264_dec_picture_params_from_sps (self, + sps, slice_header->field_pic_flag, h264_params); + gst_nv_h264_dec_picture_params_from_pps (self, pps, h264_params); - if (!other->ref) - continue; + ref_frame_idx = 0; + g_array_set_size (ref_list, 0); - other_frame = gst_nv_h264_dec_get_decoder_frame_from_picture (self, other); - - if (other_frame) - picture_index = other_frame->index; - - dpb->PicIdx = picture_index; - if (other->long_term) { - dpb->FrameIdx = other->long_term_frame_idx; - dpb->is_long_term = 1; - } else { - dpb->FrameIdx = other->frame_num; - dpb->is_long_term = 0; - } - - dpb->not_existing = other->nonexisting; - if (dpb->not_existing && dpb->PicIdx != -1) { - GST_WARNING_OBJECT (self, - 
"Non-existing frame has valid picture index %d", dpb->PicIdx); - dpb->PicIdx = -1; - } - - /* FIXME: 1=top_field, 2=bottom_field, 3=both_fields */ - dpb->used_for_reference = 3; + memset (&h264_params->dpb, 0, sizeof (h264_params->dpb)); + gst_h264_dpb_get_pictures_short_term_ref (dpb, FALSE, FALSE, ref_list); + for (i = 0; ref_frame_idx < 16 && i < ref_list->len; i++) { + GstH264Picture *other = g_array_index (ref_list, GstH264Picture *, i); + gst_nv_h264_dec_fill_dpb (self, other, &h264_params->dpb[ref_frame_idx]); + ref_frame_idx++; + } + g_array_set_size (ref_list, 0); - dpb->FieldOrderCnt[0] = other->top_field_order_cnt; - dpb->FieldOrderCnt[1] = other->bottom_field_order_cnt; + gst_h264_dpb_get_pictures_long_term_ref (dpb, FALSE, ref_list); + for (i = 0; ref_frame_idx < 16 && i < ref_list->len; i++) { + GstH264Picture *other = g_array_index (ref_list, GstH264Picture *, i); + gst_nv_h264_dec_fill_dpb (self, other, &h264_params->dpb[ref_frame_idx]); + ref_frame_idx++; } + g_array_set_size (ref_list, 0); - g_array_unref (dpb_array); + for (i = ref_frame_idx; i < 16; i++) + h264_params->dpb[i].PicIdx = -1; - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_nv_h264_dec_decode_slice (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, GArray * ref_pic_list1) @@ -779,10 +848,10 @@ !GST_H264_IS_SI_SLICE (&slice->header)) self->params.intra_pic_flag = 0; - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_nv_h264_dec_end_picture (GstH264Decoder * decoder, GstH264Picture * picture) { GstNvH264Dec *self = GST_NV_H264_DEC (decoder); @@ -799,10 +868,24 @@ ret = gst_nv_decoder_decode_picture (self->decoder, &self->params); - if (!ret) + if (!ret) { GST_ERROR_OBJECT (self, "Failed to decode picture"); + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static guint +gst_nv_h264_dec_get_preferred_output_delay (GstH264Decoder * decoder, + gboolean live) +{ + /* 
Prefer to zero latency for live pipeline */ + if (live) + return 0; - return ret; + /* NVCODEC SDK uses 4 frame delay for better throughput performance */ + return 4; } typedef struct @@ -870,7 +953,7 @@ cdata->sink_caps = gst_caps_from_string ("video/x-h264, " "stream-format= (string) { avc, avc3, byte-stream }, " "alignment= (string) au, " - "profile = (string) { high, main, constrained-baseline, baseline }, " + "profile = (string) { high, main, constrained-high, constrained-baseline, baseline }, " "framerate = " GST_VIDEO_FPS_RANGE); s = gst_caps_get_structure (sink_caps, 0);
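A recurring change in this decoder diff is the migration of the GstH264Decoder subclass hooks (new_sequence, new_picture, start_picture, decode_slice, end_picture) from returning gboolean to GstFlowReturn, so that "caps not supported / downstream refused" can propagate as GST_FLOW_NOT_NEGOTIATED rather than a generic failure. The sketch below illustrates that pattern with hypothetical miniature types (FlowReturn, new_sequence_old/new are stand-ins, not GStreamer API):

```c
#include <assert.h>

/* Hypothetical miniature of the 1.18 -> 1.20 virtual-method change:
 * hooks now return a flow code instead of a bare boolean, so a
 * negotiation failure can be told apart from a hard decode error. */
typedef enum
{
  FLOW_OK = 0,
  FLOW_NOT_NEGOTIATED = -4,
  FLOW_ERROR = -5
} FlowReturn;

/* 1.18 style: every failure collapses to FALSE (0). */
int
new_sequence_old (int format_supported, int negotiated)
{
  if (!format_supported)
    return 0;
  if (!negotiated)
    return 0;
  return 1;
}

/* 1.20 style: each failure propagates a distinct flow return,
 * mirroring the "return GST_FLOW_NOT_NEGOTIATED" hunks in the diff. */
FlowReturn
new_sequence_new (int format_supported, int negotiated)
{
  if (!format_supported)
    return FLOW_NOT_NEGOTIATED;  /* unsupported bitdepth/chroma format */
  if (!negotiated)
    return FLOW_NOT_NEGOTIATED;  /* downstream rejected the caps */
  return FLOW_OK;
}
```

The caller can then forward the specific flow return up the pipeline instead of mapping every FALSE to the same error.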
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh264dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh264dec.h
Changed
@@ -33,10 +33,8 @@ typedef struct _GstNvH264Dec GstNvH264Dec; typedef struct _GstNvH264DecClass GstNvH264DecClass; -G_GNUC_INTERNAL GType gst_nv_h264_dec_get_type (void); -G_GNUC_INTERNAL void gst_nv_h264_dec_register (GstPlugin * plugin, guint device_id, guint rank,
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh264enc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh264enc.c
Changed
@@ -69,7 +69,8 @@ #define DOCUMENTATION_SINK_CAPS \ "video/x-raw, " DOCUMENTATION_SINK_CAPS_COMM "; " \ - "video/x-raw(memory:GLMemory), " DOCUMENTATION_SINK_CAPS_COMM + "video/x-raw(memory:GLMemory), " DOCUMENTATION_SINK_CAPS_COMM "; " \ + "video/x-raw(memory:CUDAMemory), " DOCUMENTATION_SINK_CAPS_COMM #define DOCUMENTATION_SRC_CAPS \ "video/x-h264, " \ @@ -78,7 +79,7 @@ "framerate = " GST_VIDEO_FPS_RANGE ", " \ "stream-format = (string) byte-stream, " \ "alignment = (string) au, " \ - "profile = (string) { main, high, high-4:4:4, baseline }" + "profile = (string) { main, high, high-4:4:4, baseline, constrained-baseline }" static gboolean gst_nv_h264_enc_open (GstVideoEncoder * enc); static gboolean gst_nv_h264_enc_close (GstVideoEncoder * enc); @@ -473,7 +474,8 @@ profile = gst_structure_get_string (s, "profile"); if (profile) { - if (!strcmp (profile, "baseline")) { + if (!strcmp (profile, "baseline") + || !strcmp (profile, "constrained-baseline")) { selected_profile = NV_ENC_H264_PROFILE_BASELINE_GUID; } else if (g_str_has_prefix (profile, "high-4:4:4")) { selected_profile = NV_ENC_H264_PROFILE_HIGH_444_GUID;
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh264enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh264enc.h
Changed
@@ -32,7 +32,6 @@ GstNvBaseEncClass video_encoder_class; } GstNvH264EncClass; -G_GNUC_INTERNAL void gst_nv_h264_enc_register (GstPlugin * plugin, guint device_id, guint rank,
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh265dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh265dec.c
Changed
@@ -89,11 +89,7 @@ GstVideoCodecState *output_state; - const GstH265SPS *last_sps; - const GstH265PPS *last_pps; - GstCudaContext *context; - CUstream cuda_stream; GstNvDecoder *decoder; CUVIDPICPARAMS params; @@ -112,14 +108,6 @@ guint coded_width, coded_height; guint bitdepth; guint chroma_format_idc; - GstVideoFormat out_format; - - /* For OpenGL interop. */ - GstObject *gl_display; - GstObject *gl_context; - GstObject *other_gl_context; - - GstNvDecoderOutputType output_type; }; struct _GstNvH265DecClass @@ -143,17 +131,18 @@ GstQuery * query); /* GstH265Decoder */ -static gboolean gst_nv_h265_dec_new_sequence (GstH265Decoder * decoder, +static GstFlowReturn gst_nv_h265_dec_new_sequence (GstH265Decoder * decoder, const GstH265SPS * sps, gint max_dpb_size); -static gboolean gst_nv_h265_dec_new_picture (GstH265Decoder * decoder, - GstH265Picture * picture); +static GstFlowReturn gst_nv_h265_dec_new_picture (GstH265Decoder * decoder, + GstVideoCodecFrame * frame, GstH265Picture * picture); static GstFlowReturn gst_nv_h265_dec_output_picture (GstH265Decoder * - decoder, GstH265Picture * picture); -static gboolean gst_nv_h265_dec_start_picture (GstH265Decoder * decoder, + decoder, GstVideoCodecFrame * frame, GstH265Picture * picture); +static GstFlowReturn gst_nv_h265_dec_start_picture (GstH265Decoder * decoder, GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb); -static gboolean gst_nv_h265_dec_decode_slice (GstH265Decoder * decoder, - GstH265Picture * picture, GstH265Slice * slice); -static gboolean gst_nv_h265_dec_end_picture (GstH265Decoder * decoder, +static GstFlowReturn gst_nv_h265_dec_decode_slice (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, + GArray * ref_pic_list0, GArray * ref_pic_list1); +static GstFlowReturn gst_nv_h265_dec_end_picture (GstH265Decoder * decoder, GstH265Picture * picture); static void @@ -225,9 +214,15 @@ GST_DEBUG_OBJECT (self, "set context %s", gst_context_get_context_type (context)); - 
gst_nv_decoder_set_context (element, context, klass->cuda_device_id, - &self->context, &self->gl_display, &self->other_gl_context); + if (gst_cuda_handle_set_context (element, context, klass->cuda_device_id, + &self->context)) { + goto done; + } + + if (self->decoder) + gst_nv_decoder_handle_set_context (self->decoder, element, context); +done: GST_ELEMENT_CLASS (parent_class)->set_context (element, context); } @@ -237,13 +232,20 @@ GstNvH265Dec *self = GST_NV_H265_DEC (decoder); GstNvH265DecClass *klass = GST_NV_H265_DEC_GET_CLASS (self); - if (!gst_nv_decoder_ensure_element_data (GST_ELEMENT (self), - klass->cuda_device_id, &self->context, &self->cuda_stream, - &self->gl_display, &self->other_gl_context)) { + if (!gst_cuda_ensure_element_context (GST_ELEMENT (self), + klass->cuda_device_id, &self->context)) { GST_ERROR_OBJECT (self, "Required element data is unavailable"); return FALSE; } + self->decoder = gst_nv_decoder_new (self->context); + if (!self->decoder) { + GST_ERROR_OBJECT (self, "Failed to create decoder object"); + gst_clear_object (&self->context); + + return FALSE; + } + return TRUE; } @@ -254,19 +256,7 @@ g_clear_pointer (&self->output_state, gst_video_codec_state_unref); gst_clear_object (&self->decoder); - - if (self->context && self->cuda_stream) { - if (gst_cuda_context_push (self->context)) { - gst_cuda_result (CuStreamDestroy (self->cuda_stream)); - gst_cuda_context_pop (NULL); - } - } - - gst_clear_object (&self->gl_context); - gst_clear_object (&self->other_gl_context); - gst_clear_object (&self->gl_display); gst_clear_object (&self->context); - self->cuda_stream = NULL; return TRUE; } @@ -279,9 +269,8 @@ GST_DEBUG_OBJECT (self, "negotiate"); - gst_nv_decoder_negotiate (decoder, h265dec->input_state, self->out_format, - self->width, self->height, self->gl_display, self->other_gl_context, - &self->gl_context, &self->output_state, &self->output_type); + gst_nv_decoder_negotiate (self->decoder, decoder, h265dec->input_state, + 
      &self->output_state);
 
   /* TODO: add support D3D11 memory */
@@ -293,8 +282,10 @@
 {
   GstNvH265Dec *self = GST_NV_H265_DEC (decoder);
 
-  gst_nv_decoder_decide_allocation (decoder, query,
-      self->gl_context, self->output_type);
+  if (!gst_nv_decoder_decide_allocation (self->decoder, decoder, query)) {
+    GST_WARNING_OBJECT (self, "Failed to handle decide allocation");
+    return FALSE;
+  }
 
   return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation
       (decoder, query);
@@ -307,9 +298,11 @@
   switch (GST_QUERY_TYPE (query)) {
     case GST_QUERY_CONTEXT:
-      if (gst_nv_decoder_handle_context_query (GST_ELEMENT (self), query,
-          self->context, self->gl_display, self->gl_context,
-          self->other_gl_context)) {
+      if (gst_cuda_handle_context_query (GST_ELEMENT (decoder), query,
+              self->context)) {
+        return TRUE;
+      } else if (self->decoder &&
+          gst_nv_decoder_handle_context_query (self->decoder, decoder, query)) {
         return TRUE;
       }
       break;
@@ -320,7 +313,7 @@
   return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query);
 }
 
-static gboolean
+static GstFlowReturn
 gst_nv_h265_dec_new_sequence (GstH265Decoder * decoder, const GstH265SPS * sps,
     gint max_dpb_size)
 {
@@ -361,60 +354,67 @@
     modified = TRUE;
   }
 
-  if (modified || !self->decoder) {
+  if (modified || !gst_nv_decoder_is_configured (self->decoder)) {
     GstVideoInfo info;
-
-    self->out_format = GST_VIDEO_FORMAT_UNKNOWN;
+    GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN;
 
     if (self->bitdepth == 8) {
-      if (self->chroma_format_idc == 1)
-        self->out_format = GST_VIDEO_FORMAT_NV12;
-      else {
-        GST_FIXME_OBJECT (self, "Could not support 8bits non-4:2:0 format");
+      if (self->chroma_format_idc == 1) {
+        out_format = GST_VIDEO_FORMAT_NV12;
+      } else if (self->chroma_format_idc == 3) {
+        out_format = GST_VIDEO_FORMAT_Y444;
+      } else {
+        GST_FIXME_OBJECT (self, "8 bits supports only 4:2:0 or 4:4:4 format");
       }
     } else if (self->bitdepth == 10) {
-      if (self->chroma_format_idc == 1)
-        self->out_format = GST_VIDEO_FORMAT_P010_10LE;
-      else {
-        GST_FIXME_OBJECT (self, "Could not support 10bits non-4:2:0 format");
+      if (self->chroma_format_idc == 1) {
+        out_format = GST_VIDEO_FORMAT_P010_10LE;
+      } else if (self->chroma_format_idc == 3) {
+        out_format = GST_VIDEO_FORMAT_Y444_16LE;
+      } else {
+        GST_FIXME_OBJECT (self, "10 bits supports only 4:2:0 or 4:4:4 format");
+      }
+    } else if (self->bitdepth == 12 || self->bitdepth == 16) {
+      if (self->chroma_format_idc == 1) {
+        out_format = GST_VIDEO_FORMAT_P016_LE;
+      } else if (self->chroma_format_idc == 3) {
+        out_format = GST_VIDEO_FORMAT_Y444_16LE;
+      } else {
+        GST_FIXME_OBJECT (self, "%d bits supports only 4:2:0 or 4:4:4 format",
+            self->bitdepth);
       }
     }
 
-    if (self->out_format == GST_VIDEO_FORMAT_UNKNOWN) {
+    if (out_format == GST_VIDEO_FORMAT_UNKNOWN) {
       GST_ERROR_OBJECT (self, "Could not support bitdepth/chroma format");
-      return FALSE;
+      return GST_FLOW_NOT_NEGOTIATED;
    }
 
-    gst_clear_object (&self->decoder);
-
-    gst_video_info_set_format (&info,
-        self->out_format, self->width, self->height);
+    gst_video_info_set_format (&info, out_format, self->width, self->height);
 
-    self->decoder = gst_nv_decoder_new (self->context, cudaVideoCodec_HEVC,
-        &info,
-        /* Additional 2 buffers for margin */
-        max_dpb_size + 2);
-
-    if (!self->decoder) {
-      GST_ERROR_OBJECT (self, "Failed to create decoder");
-      return FALSE;
+    if (!gst_nv_decoder_configure (self->decoder,
+            cudaVideoCodec_HEVC, &info, self->coded_width, self->coded_height,
+            self->bitdepth,
+            /* Additional 2 buffers for margin */
+            max_dpb_size + 2)) {
+      GST_ERROR_OBJECT (self, "Failed to configure decoder");
+      return GST_FLOW_NOT_NEGOTIATED;
    }
 
     if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) {
       GST_ERROR_OBJECT (self, "Failed to negotiate with downstream");
-      return FALSE;
+      return GST_FLOW_NOT_NEGOTIATED;
    }
 
-    self->last_sps = NULL;
-    self->last_pps = NULL;
     memset (&self->params, 0, sizeof (CUVIDPICPARAMS));
   }
 
-  return TRUE;
+  return GST_FLOW_OK;
 }
 
-static gboolean
-gst_nv_h265_dec_new_picture (GstH265Decoder * decoder, GstH265Picture * picture)
+static GstFlowReturn
+gst_nv_h265_dec_new_picture (GstH265Decoder * decoder,
+    GstVideoCodecFrame * cframe, GstH265Picture * picture)
 {
   GstNvH265Dec *self = GST_NV_H265_DEC (decoder);
   GstNvDecoderFrame *frame;
@@ -422,26 +422,24 @@
   frame = gst_nv_decoder_new_frame (self->decoder);
   if (!frame) {
     GST_ERROR_OBJECT (self, "No available decoder frame");
-    return FALSE;
+    return GST_FLOW_ERROR;
   }
 
   GST_LOG_OBJECT (self, "New decoder frame %p (index %d)", frame, frame->index);
 
   gst_h265_picture_set_user_data (picture,
-      frame, (GDestroyNotify) gst_nv_decoder_frame_free);
+      frame, (GDestroyNotify) gst_nv_decoder_frame_unref);
 
-  return TRUE;
+  return GST_FLOW_OK;
 }
 
 static GstFlowReturn
 gst_nv_h265_dec_output_picture (GstH265Decoder * decoder,
-    GstH265Picture * picture)
+    GstVideoCodecFrame * frame, GstH265Picture * picture)
 {
   GstNvH265Dec *self = GST_NV_H265_DEC (decoder);
-  GstVideoCodecFrame *frame = NULL;
-  GstBuffer *output_buffer = NULL;
+  GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder);
   GstNvDecoderFrame *decoder_frame;
-  gboolean ret G_GNUC_UNUSED = FALSE;
 
   GST_LOG_OBJECT (self, "Outputting picture %p (poc %d)", picture,
       picture->pic_order_cnt);
@@ -450,47 +448,24 @@
       (GstNvDecoderFrame *) gst_h265_picture_get_user_data (picture);
   if (!decoder_frame) {
     GST_ERROR_OBJECT (self, "No decoder frame in picture %p", picture);
-    return GST_FLOW_ERROR;
+    goto error;
   }
 
-  frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self),
-      picture->system_frame_number);
-  if (!frame) {
-    GST_ERROR_OBJECT (self, "Failed to retrieve codec frame");
-    return GST_FLOW_ERROR;
-  }
-
-  output_buffer =
-      gst_video_decoder_allocate_output_buffer (GST_VIDEO_DECODER (self));
-  frame->output_buffer = output_buffer;
-
-  if (self->output_type == GST_NV_DECOCER_OUTPUT_TYPE_GL) {
-    ret = gst_nv_decoder_finish_frame (self->decoder,
-        GST_NV_DECOCER_OUTPUT_TYPE_GL, self->gl_context,
-        decoder_frame, output_buffer);
-
-    /* FIXME: This is the case where OpenGL context of downstream glbufferpool
-     * belongs to non-nvidia (or different device).
-     * There should be enhancement to ensure nvdec has compatible OpenGL context
-     */
-    if (!ret) {
-      GST_WARNING_OBJECT (self,
-          "Couldn't copy frame to GL memory, fallback to system memory");
-      self->output_type = GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM;
-    }
+  if (!gst_nv_decoder_finish_frame (self->decoder, vdec, decoder_frame,
+          &frame->output_buffer)) {
+    GST_ERROR_OBJECT (self, "Failed to handle output picture");
+    goto error;
   }
 
-  if (!ret) {
-    if (!gst_nv_decoder_finish_frame (self->decoder,
-            GST_NV_DECOCER_OUTPUT_TYPE_SYSTEM, NULL, decoder_frame,
-            output_buffer)) {
-      GST_ERROR_OBJECT (self, "Failed to finish frame");
-      gst_video_decoder_drop_frame (GST_VIDEO_DECODER (self), frame);
-      return GST_FLOW_ERROR;
-    }
-  }
+  gst_h265_picture_unref (picture);
 
   return gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame);
+
+error:
+  gst_video_decoder_drop_frame (vdec, frame);
+  gst_h265_picture_unref (picture);
+
+  return GST_FLOW_ERROR;
 }
 
 static GstNvDecoderFrame *
@@ -676,7 +651,7 @@
   self->params.pSliceDataOffsets = NULL;
 }
 
-static gboolean
+static GstFlowReturn
 gst_nv_h265_dec_start_picture (GstH265Decoder * decoder,
     GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb)
 {
@@ -702,15 +677,15 @@
   G_STATIC_ASSERT (sizeof (scaling_list->scaling_lists_32x32) ==
       sizeof (h265_params->ScalingList32x32));
 
-  g_return_val_if_fail (slice_header->pps != NULL, FALSE);
-  g_return_val_if_fail (slice_header->pps->sps != NULL, FALSE);
+  g_return_val_if_fail (slice_header->pps != NULL, GST_FLOW_ERROR);
+  g_return_val_if_fail (slice_header->pps->sps != NULL, GST_FLOW_ERROR);
 
   frame = gst_nv_h265_dec_get_decoder_frame_from_picture (self, picture);
   if (!frame) {
     GST_ERROR_OBJECT (self, "Couldn't get decoder frame frame picture %p",
        picture);
-    return FALSE;
+    return GST_FLOW_ERROR;
   }
 
   gst_nv_h265_dec_reset_bitstream_params (self);
@@ -722,12 +697,6 @@
   params->PicWidthInMbs = sps->pic_width_in_luma_samples / 16;
   params->FrameHeightInMbs = sps->pic_height_in_luma_samples / 16;
   params->CurrPicIdx = frame->index;
-  /* TODO: verifiy interlaced */
-  params->field_pic_flag = picture->field != GST_H265_PICTURE_FIELD_FRAME;
-  params->bottom_field_flag =
-      picture->field == GST_H265_PICTURE_FIELD_BOTTOM_FIELD;
-  /* TODO: set second_field here */
-  params->second_field = 0;
 
   /* nBitstreamDataLen, pBitstreamData, nNumSlices and pSliceDataOffsets
    * will be set later */
@@ -737,42 +706,28 @@
   h265_params->IrapPicFlag = GST_H265_IS_NAL_TYPE_IRAP (slice->nalu.type);
   h265_params->IdrPicFlag = GST_H265_IS_NAL_TYPE_IDR (slice->nalu.type);
 
-  if (!self->last_sps || self->last_sps != sps) {
-    GST_DEBUG_OBJECT (self, "Update params from SPS and PPS");
-    gst_nv_h265_dec_picture_params_from_sps (self, sps, h265_params);
-    if (!gst_nv_h265_dec_picture_params_from_pps (self, pps, h265_params)) {
-      GST_ERROR_OBJECT (self, "Couldn't copy pps");
-      return FALSE;
-    }
-    self->last_sps = sps;
-    self->last_pps = pps;
-  } else if (!self->last_pps || self->last_pps != pps) {
-    GST_DEBUG_OBJECT (self, "Update params from PPS");
-    if (!gst_nv_h265_dec_picture_params_from_pps (self, pps, h265_params)) {
-      GST_ERROR_OBJECT (self, "Couldn't copy pps");
-      return FALSE;
-    }
-    self->last_pps = pps;
-  } else {
-    GST_TRACE_OBJECT (self, "SPS and PPS were not updated");
+  gst_nv_h265_dec_picture_params_from_sps (self, sps, h265_params);
+  if (!gst_nv_h265_dec_picture_params_from_pps (self, pps, h265_params)) {
+    GST_ERROR_OBJECT (self, "Couldn't copy pps");
+    return GST_FLOW_ERROR;
   }
 
   /* Fill reference */
   if (decoder->NumPocStCurrBefore >
       G_N_ELEMENTS (h265_params->RefPicSetStCurrBefore)) {
     GST_ERROR_OBJECT (self, "Too many RefPicSetStCurrBefore");
-    return FALSE;
+    return GST_FLOW_ERROR;
  }
 
   if (decoder->NumPocStCurrAfter >
       G_N_ELEMENTS (h265_params->RefPicSetStCurrAfter)) {
     GST_ERROR_OBJECT (self, "Too many RefPicSetStCurrAfter");
-    return FALSE;
+    return GST_FLOW_ERROR;
   }
 
   if (decoder->NumPocLtCurr > G_N_ELEMENTS (h265_params->RefPicSetLtCurr)) {
     GST_ERROR_OBJECT (self, "Too many RefPicSetLtCurr");
-    return FALSE;
+    return GST_FLOW_ERROR;
   }
 
   /* Fill ref list */
@@ -780,7 +735,7 @@
       slice_header->short_term_ref_pic_set_size;
   h265_params->NumDeltaPocsOfRefRpsIdx =
       slice_header->short_term_ref_pic_sets.NumDeltaPocsOfRefRpsIdx;
-  h265_params->NumPocTotalCurr = decoder->NumPocTotalCurr;
+  h265_params->NumPocTotalCurr = decoder->NumPicTotalCurr;
   h265_params->NumPocStCurrBefore = decoder->NumPocStCurrBefore;
   h265_params->NumPocStCurrAfter = decoder->NumPocStCurrAfter;
   h265_params->NumPocLtCurr = decoder->NumPocLtCurr;
@@ -799,7 +754,7 @@
 
     if (num_ref_pic >= G_N_ELEMENTS (h265_params->RefPicIdx)) {
       GST_ERROR_OBJECT (self, "Too many reference frames");
-      return FALSE;
+      return GST_FLOW_ERROR;
    }
 
     other_frame = gst_nv_h265_dec_get_decoder_frame_from_picture (self, other);
@@ -824,7 +779,7 @@
 
     if (!decoder->RefPicSetStCurrBefore[i]) {
       GST_ERROR_OBJECT (self,
          "Empty RefPicSetStCurrBefore[%d]", i);
-      return FALSE;
+      return GST_FLOW_ERROR;
    }
 
     other = decoder->RefPicSetStCurrBefore[i];
@@ -842,7 +797,7 @@
 
     if (!decoder->RefPicSetStCurrAfter[i]) {
       GST_ERROR_OBJECT (self, "Empty RefPicSetStCurrAfter[%d]", i);
-      return FALSE;
+      return GST_FLOW_ERROR;
    }
 
     other = decoder->RefPicSetStCurrAfter[i];
@@ -860,7 +815,7 @@
 
     if (!decoder->RefPicSetLtCurr[i]) {
       GST_ERROR_OBJECT (self, "Empty RefPicSetLtCurr[%d]", i);
-      return FALSE;
+      return GST_FLOW_ERROR;
    }
 
     other = decoder->RefPicSetLtCurr[i];
@@ -901,12 +856,13 @@
         scaling_list->scaling_list_dc_coef_minus8_32x32[i] + 8;
   }
 
-  return TRUE;
+  return GST_FLOW_OK;
 }
 
-static gboolean
+static GstFlowReturn
 gst_nv_h265_dec_decode_slice (GstH265Decoder * decoder,
-    GstH265Picture * picture, GstH265Slice * slice)
+    GstH265Picture * picture, GstH265Slice * slice,
+    GArray * ref_pic_list0, GArray * ref_pic_list1)
 {
   GstNvH265Dec *self = GST_NV_H265_DEC (decoder);
   gsize new_size;
@@ -937,10 +893,10 @@
       slice->nalu.data + slice->nalu.offset, slice->nalu.size);
   self->bitstream_buffer_offset = new_size;
 
-  return TRUE;
+  return GST_FLOW_OK;
 }
 
-static gboolean
+static GstFlowReturn
 gst_nv_h265_dec_end_picture (GstH265Decoder * decoder, GstH265Picture * picture)
 {
   GstNvH265Dec *self = GST_NV_H265_DEC (decoder);
@@ -957,10 +913,12 @@
 
   ret = gst_nv_decoder_decode_picture (self->decoder, &self->params);
 
-  if (!ret)
+  if (!ret) {
     GST_ERROR_OBJECT (self, "Failed to decode picture");
+    return GST_FLOW_ERROR;
+  }
 
-  return ret;
+  return GST_FLOW_OK;
 }
 
 typedef struct
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh265dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh265dec.h
Changed
@@ -33,10 +33,8 @@
 typedef struct _GstNvH265Dec GstNvH265Dec;
 typedef struct _GstNvH265DecClass GstNvH265DecClass;
 
-G_GNUC_INTERNAL
 GType gst_nv_h265_dec_get_type (void);
 
-G_GNUC_INTERNAL
 void gst_nv_h265_dec_register (GstPlugin * plugin,
                                guint device_id,
                                guint rank,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh265enc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh265enc.c
Changed
@@ -70,7 +70,8 @@
 
 #define DOCUMENTATION_SINK_CAPS \
   "video/x-raw, " DOCUMENTATION_SINK_CAPS_COMM "; " \
-  "video/x-raw(memory:GLMemory), " DOCUMENTATION_SINK_CAPS_COMM
+  "video/x-raw(memory:GLMemory), " DOCUMENTATION_SINK_CAPS_COMM "; " \
+  "video/x-raw(memory:CUDAMemory), " DOCUMENTATION_SINK_CAPS_COMM
 
 #define DOCUMENTATION_SRC_CAPS \
   "video/x-h265, " \
View file
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/gstnvh265enc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvh265enc.h
Changed
@@ -37,7 +37,6 @@
   GstNvBaseEncClass video_encoder_class;
 } GstNvH265EncClass;
 
-G_GNUC_INTERNAL
 void gst_nv_h265_enc_register (GstPlugin * plugin,
                                guint device_id,
                                guint rank,
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvrtcloader.c
Added
@@ -0,0 +1,199 @@
+/* GStreamer
+ * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstnvrtcloader.h"
+#include "gstcudaloader.h"
+
+#include <gmodule.h>
+
+GST_DEBUG_CATEGORY_EXTERN (gst_nvcodec_debug);
+#define GST_CAT_DEFAULT gst_nvcodec_debug
+
+#ifndef G_OS_WIN32
+#define NVRTC_LIBNAME "libnvrtc.so"
+#else
+#define NVRTC_LIBNAME "nvrtc64_%d%d_0.dll"
+#endif
+
+#define LOAD_SYMBOL(name,func) G_STMT_START { \
+  if (!g_module_symbol (module, G_STRINGIFY (name), (gpointer *) &vtable->func)) { \
+    GST_ERROR ("Failed to load '%s' from %s, %s", G_STRINGIFY (name), fname, g_module_error()); \
+    goto error; \
+  } \
+} G_STMT_END;
+
+typedef struct _GstNvCodecNvrtcVtahle
+{
+  gboolean loaded;
+
+  nvrtcResult (*NvrtcCompileProgram) (nvrtcProgram prog, int numOptions,
+      const char **options);
+  nvrtcResult (*NvrtcCreateProgram) (nvrtcProgram * prog, const char *src,
+      const char *name, int numHeaders, const char **headers,
+      const char **includeNames);
+  nvrtcResult (*NvrtcDestroyProgram) (nvrtcProgram * prog);
+  nvrtcResult (*NvrtcGetPTX) (nvrtcProgram prog, char *ptx);
+  nvrtcResult (*NvrtcGetPTXSize) (nvrtcProgram prog, size_t * ptxSizeRet);
+  nvrtcResult (*NvrtcGetProgramLog) (nvrtcProgram prog, char *log);
+  nvrtcResult (*NvrtcGetProgramLogSize) (nvrtcProgram prog,
+      size_t * logSizeRet);
+} GstNvCodecNvrtcVtahle;
+
+static GstNvCodecNvrtcVtahle gst_nvrtc_vtable = { 0, };
+
+gboolean
+gst_nvrtc_load_library (void)
+{
+  GModule *module = NULL;
+  gchar *filename = NULL;
+  const gchar *filename_env;
+  const gchar *fname;
+  gint cuda_version;
+  GstNvCodecNvrtcVtahle *vtable;
+
+  if (gst_nvrtc_vtable.loaded)
+    return TRUE;
+
+  CuDriverGetVersion (&cuda_version);
+
+  fname = filename_env = g_getenv ("GST_NVCODEC_NVRTC_LIBNAME");
+  if (filename_env)
+    module = g_module_open (filename_env, G_MODULE_BIND_LAZY);
+
+  if (!module) {
+#ifndef G_OS_WIN32
+    filename = g_strdup (NVRTC_LIBNAME);
+    fname = filename;
+    module = g_module_open (filename, G_MODULE_BIND_LAZY);
+#else
+    /* XXX: On Windows, minor version of nvrtc library might not be exactly
+     * same as CUDA library */
+    {
+      gint cuda_major_version = cuda_version / 1000;
+      gint cuda_minor_version = (cuda_version % 1000) / 10;
+      gint minor_version;
+
+      for (minor_version = cuda_minor_version; minor_version >= 0;
+          minor_version--) {
+        g_free (filename);
+        filename = g_strdup_printf (NVRTC_LIBNAME, cuda_major_version,
+            minor_version);
+        fname = filename;
+
+        module = g_module_open (filename, G_MODULE_BIND_LAZY);
+        if (module) {
+          GST_INFO ("%s is available", filename);
+          break;
+        }
+
+        GST_DEBUG ("Couldn't open library %s", filename);
+      }
+    }
+#endif
+  }
+
+  if (module == NULL) {
+    GST_WARNING ("Could not open library %s, %s", filename, g_module_error ());
+    g_free (filename);
+    return FALSE;
+  }
+
+  vtable = &gst_nvrtc_vtable;
+
+  LOAD_SYMBOL (nvrtcCompileProgram, NvrtcCompileProgram);
+  LOAD_SYMBOL (nvrtcCreateProgram, NvrtcCreateProgram);
+  LOAD_SYMBOL (nvrtcDestroyProgram, NvrtcDestroyProgram);
+  LOAD_SYMBOL (nvrtcGetPTX, NvrtcGetPTX);
+  LOAD_SYMBOL (nvrtcGetPTXSize, NvrtcGetPTXSize);
+  LOAD_SYMBOL (nvrtcGetProgramLog, NvrtcGetProgramLog);
+  LOAD_SYMBOL (nvrtcGetProgramLogSize, NvrtcGetProgramLogSize);
+
+  vtable->loaded = TRUE;
+  g_free (filename);
+
+  return TRUE;
+
+error:
+  g_module_close (module);
+  g_free (filename);
+
+  return FALSE;
+}
+
+nvrtcResult
+NvrtcCompileProgram (nvrtcProgram prog, int numOptions, const char **options)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcCompileProgram != NULL);
+
+  return gst_nvrtc_vtable.NvrtcCompileProgram (prog, numOptions, options);
+}
+
+nvrtcResult
+NvrtcCreateProgram (nvrtcProgram * prog, const char *src, const char *name,
+    int numHeaders, const char **headers, const char **includeNames)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcCreateProgram != NULL);
+
+  return gst_nvrtc_vtable.NvrtcCreateProgram (prog, src, name, numHeaders,
+      headers, includeNames);
+}
+
+nvrtcResult
+NvrtcDestroyProgram (nvrtcProgram * prog)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcDestroyProgram != NULL);
+
+  return gst_nvrtc_vtable.NvrtcDestroyProgram (prog);
+}
+
+nvrtcResult
+NvrtcGetPTX (nvrtcProgram prog, char *ptx)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcGetPTX != NULL);
+
+  return gst_nvrtc_vtable.NvrtcGetPTX (prog, ptx);
+}
+
+nvrtcResult
+NvrtcGetPTXSize (nvrtcProgram prog, size_t * ptxSizeRet)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcGetPTXSize != NULL);
+
+  return gst_nvrtc_vtable.NvrtcGetPTXSize (prog, ptxSizeRet);
+}
+
+nvrtcResult
+NvrtcGetProgramLog (nvrtcProgram prog, char *log)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcGetProgramLog != NULL);
+
+  return gst_nvrtc_vtable.NvrtcGetProgramLog (prog, log);
+}
+
+nvrtcResult
+NvrtcGetProgramLogSize (nvrtcProgram prog, size_t * logSizeRet)
+{
+  g_assert (gst_nvrtc_vtable.NvrtcGetProgramLogSize != NULL);
+
+  return gst_nvrtc_vtable.NvrtcGetProgramLogSize (prog, logSizeRet);
+}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvrtcloader.h
Added
@@ -0,0 +1,56 @@
+/* GStreamer
+ * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifndef __GST_NVRTC_LOADER_H__
+#define __GST_NVRTC_LOADER_H__
+
+#include <gst/gst.h>
+#include <nvrtc.h>
+
+G_BEGIN_DECLS
+
+gboolean gst_nvrtc_load_library (void);
+
+nvrtcResult NvrtcCompileProgram (nvrtcProgram prog,
+                                 int numOptions,
+                                 const char** options);
+
+nvrtcResult NvrtcCreateProgram  (nvrtcProgram* prog,
+                                 const char* src,
+                                 const char* name,
+                                 int numHeaders,
+                                 const char** headers,
+                                 const char** includeNames);
+
+nvrtcResult NvrtcDestroyProgram (nvrtcProgram* prog);
+
+nvrtcResult NvrtcGetPTX         (nvrtcProgram prog,
+                                 char* ptx);
+
+nvrtcResult NvrtcGetPTXSize     (nvrtcProgram prog,
+                                 size_t* ptxSizeRet);
+
+nvrtcResult NvrtcGetProgramLog  (nvrtcProgram prog,
+                                 char* log);
+
+nvrtcResult NvrtcGetProgramLogSize (nvrtcProgram prog,
+                                    size_t* logSizeRet);
+
+G_END_DECLS
+#endif /* __GST_NVRTC_LOADER_H__ */
View file
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvvp8dec.c
Added
@@ -0,0 +1,557 @@
+/* GStreamer
+ * Copyright (C) 2020 Seungha Yang <seungha@centricular.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include "gstnvvp8dec.h"
+#include "gstcudautils.h"
+#include "gstnvdecoder.h"
+
+#include <string.h>
+
+GST_DEBUG_CATEGORY_STATIC (gst_nv_vp8_dec_debug);
+#define GST_CAT_DEFAULT gst_nv_vp8_dec_debug
+
+/* reference list 4 + 2 margin */
+#define NUM_OUTPUT_VIEW 6
+
+struct _GstNvVp8Dec
+{
+  GstVp8Decoder parent;
+
+  GstVideoCodecState *output_state;
+
+  GstCudaContext *context;
+  GstNvDecoder *decoder;
+  CUVIDPICPARAMS params;
+
+  guint width, height;
+};
+
+struct _GstNvVp8DecClass
+{
+  GstVp8DecoderClass parent_class;
+  guint cuda_device_id;
+};
+
+#define gst_nv_vp8_dec_parent_class parent_class
+
+/**
+ * GstNvVp8Dec:
+ *
+ * Since: 1.20
+ */
+G_DEFINE_TYPE (GstNvVp8Dec, gst_nv_vp8_dec, GST_TYPE_VP8_DECODER);
+
+static void gst_nv_vp8_dec_set_context (GstElement * element,
+    GstContext * context);
+static gboolean gst_nv_vp8_dec_open (GstVideoDecoder * decoder);
+static gboolean gst_nv_vp8_dec_close (GstVideoDecoder * decoder);
+static gboolean gst_nv_vp8_dec_negotiate (GstVideoDecoder * decoder);
+static gboolean gst_nv_vp8_dec_decide_allocation (GstVideoDecoder *
+    decoder, GstQuery * query);
+static gboolean gst_nv_vp8_dec_src_query (GstVideoDecoder * decoder,
+    GstQuery * query);
+
+/* GstVp8Decoder */
+static GstFlowReturn gst_nv_vp8_dec_new_sequence (GstVp8Decoder * decoder,
+    const GstVp8FrameHdr * frame_hdr);
+static GstFlowReturn gst_nv_vp8_dec_new_picture (GstVp8Decoder * decoder,
+    GstVideoCodecFrame * frame, GstVp8Picture * picture);
+static GstFlowReturn gst_nv_vp8_dec_decode_picture (GstVp8Decoder * decoder,
+    GstVp8Picture * picture, GstVp8Parser * parser);
+static GstFlowReturn gst_nv_vp8_dec_output_picture (GstVp8Decoder *
+    decoder, GstVideoCodecFrame * frame, GstVp8Picture * picture);
+static guint gst_nv_vp8_dec_get_preferred_output_delay (GstVp8Decoder * decoder,
+    gboolean is_live);
+
+static void
+gst_nv_vp8_dec_class_init (GstNvVp8DecClass * klass)
+{
+  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
+  GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass);
+  GstVp8DecoderClass *vp8decoder_class = GST_VP8_DECODER_CLASS (klass);
+
+  element_class->set_context = GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_set_context);
+
+  decoder_class->open = GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_open);
+  decoder_class->close = GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_close);
+  decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_negotiate);
+  decoder_class->decide_allocation =
+      GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_decide_allocation);
+  decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_src_query);
+
+  vp8decoder_class->new_sequence =
+      GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_new_sequence);
+  vp8decoder_class->new_picture =
+      GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_new_picture);
+  vp8decoder_class->decode_picture =
+      GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_decode_picture);
+  vp8decoder_class->output_picture =
+      GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_output_picture);
+  vp8decoder_class->get_preferred_output_delay =
+      GST_DEBUG_FUNCPTR (gst_nv_vp8_dec_get_preferred_output_delay);
+
+  GST_DEBUG_CATEGORY_INIT (gst_nv_vp8_dec_debug,
+      "nvvp8dec", 0, "NVIDIA VP8 Decoder");
+
+  gst_type_mark_as_plugin_api (GST_TYPE_NV_VP8_DEC, 0);
+}
+
+static void
+gst_nv_vp8_dec_init (GstNvVp8Dec * self)
+{
+}
+
+static void
+gst_nv_vp8_dec_set_context (GstElement * element, GstContext * context)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (element);
+  GstNvVp8DecClass *klass = GST_NV_VP8_DEC_GET_CLASS (self);
+
+  GST_DEBUG_OBJECT (self, "set context %s",
+      gst_context_get_context_type (context));
+
+  if (gst_cuda_handle_set_context (element, context, klass->cuda_device_id,
+          &self->context)) {
+    goto done;
+  }
+
+  if (self->decoder)
+    gst_nv_decoder_handle_set_context (self->decoder, element, context);
+
+done:
+  GST_ELEMENT_CLASS (parent_class)->set_context (element, context);
+}
+
+static gboolean
+gst_nv_vp8_dec_open (GstVideoDecoder * decoder)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+  GstNvVp8DecClass *klass = GST_NV_VP8_DEC_GET_CLASS (self);
+
+  if (!gst_cuda_ensure_element_context (GST_ELEMENT (self),
+          klass->cuda_device_id, &self->context)) {
+    GST_ERROR_OBJECT (self, "Required element data is unavailable");
+    return FALSE;
+  }
+
+  self->decoder = gst_nv_decoder_new (self->context);
+  if (!self->decoder) {
+    GST_ERROR_OBJECT (self, "Failed to create decoder object");
+    gst_clear_object (&self->context);
+
+    return FALSE;
+  }
+
+  return TRUE;
+}
+
+static gboolean
+gst_nv_vp8_dec_close (GstVideoDecoder * decoder)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+
+  g_clear_pointer (&self->output_state, gst_video_codec_state_unref);
+  gst_clear_object (&self->decoder);
+  gst_clear_object (&self->context);
+
+  return TRUE;
+}
+
+static gboolean
+gst_nv_vp8_dec_negotiate (GstVideoDecoder * decoder)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+  GstVp8Decoder *vp8dec = GST_VP8_DECODER (decoder);
+
+  GST_DEBUG_OBJECT (self, "negotiate");
+
+  gst_nv_decoder_negotiate (self->decoder, decoder, vp8dec->input_state,
+      &self->output_state);
+
+  /* TODO: add support D3D11 memory */
+
+  return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder);
+}
+
+static gboolean
+gst_nv_vp8_dec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+
+  if (!gst_nv_decoder_decide_allocation (self->decoder, decoder, query)) {
+    GST_WARNING_OBJECT (self, "Failed to handle decide allocation");
+    return FALSE;
+  }
+
+  return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation
+      (decoder, query);
+}
+
+static gboolean
+gst_nv_vp8_dec_src_query (GstVideoDecoder * decoder, GstQuery * query)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+
+  switch (GST_QUERY_TYPE (query)) {
+    case GST_QUERY_CONTEXT:
+      if (gst_cuda_handle_context_query (GST_ELEMENT (decoder), query,
+              self->context)) {
+        return TRUE;
+      } else if (self->decoder &&
+          gst_nv_decoder_handle_context_query (self->decoder, decoder, query)) {
+        return TRUE;
+      }
+      break;
+    default:
+      break;
+  }
+
+  return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query);
+}
+
+static GstFlowReturn
+gst_nv_vp8_dec_new_sequence (GstVp8Decoder * decoder,
+    const GstVp8FrameHdr * frame_hdr)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+  gboolean modified = FALSE;
+
+  GST_LOG_OBJECT (self, "new sequence");
+
+  if (self->width != frame_hdr->width || self->height != frame_hdr->height) {
+    if (self->decoder) {
+      GST_INFO_OBJECT (self, "resolution changed %dx%d -> %dx%d",
+          self->width, self->height, frame_hdr->width, frame_hdr->height);
+    }
+
+    self->width = frame_hdr->width;
+    self->height = frame_hdr->height;
+
+    modified = TRUE;
+  }
+
+  if (modified || !gst_nv_decoder_is_configured (self->decoder)) {
+    GstVideoInfo info;
+
+    gst_video_info_set_format (&info,
+        GST_VIDEO_FORMAT_NV12, self->width, self->height);
+
+    if (!gst_nv_decoder_configure (self->decoder,
+            cudaVideoCodec_VP8, &info, self->width, self->height, 8,
+            /* +4 for render delay */
+            NUM_OUTPUT_VIEW + 4)) {
+      GST_ERROR_OBJECT (self, "Failed to configure decoder");
+      return GST_FLOW_NOT_NEGOTIATED;
+    }
+
+    if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) {
+      GST_ERROR_OBJECT (self, "Failed to negotiate with downstream");
+      return GST_FLOW_NOT_NEGOTIATED;
+    }
+
+    memset (&self->params, 0, sizeof (CUVIDPICPARAMS));
+
+    self->params.PicWidthInMbs = GST_ROUND_UP_16 (self->width) >> 4;
+    self->params.FrameHeightInMbs = GST_ROUND_UP_16 (self->height) >> 4;
+
+    self->params.CodecSpecific.vp8.width = self->width;
+    self->params.CodecSpecific.vp8.height = self->height;
+  }
+
+  return GST_FLOW_OK;
+}
+
+static GstFlowReturn
+gst_nv_vp8_dec_new_picture (GstVp8Decoder * decoder,
+    GstVideoCodecFrame * frame, GstVp8Picture * picture)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+  GstNvDecoderFrame *nv_frame;
+
+  nv_frame = gst_nv_decoder_new_frame (self->decoder);
+  if (!nv_frame) {
+    GST_ERROR_OBJECT (self, "No available decoder frame");
+    return GST_FLOW_ERROR;
+  }
+
+  GST_LOG_OBJECT (self,
+      "New decoder frame %p (index %d)", nv_frame, nv_frame->index);
+
+  gst_vp8_picture_set_user_data (picture,
+      nv_frame, (GDestroyNotify) gst_nv_decoder_frame_unref);
+
+  return GST_FLOW_OK;
+}
+
+static GstNvDecoderFrame *
+gst_nv_vp8_dec_get_decoder_frame_from_picture (GstNvVp8Dec * self,
+    GstVp8Picture * picture)
+{
+  GstNvDecoderFrame *frame;
+
+  frame = (GstNvDecoderFrame *) gst_vp8_picture_get_user_data (picture);
+
+  if (!frame)
+    GST_DEBUG_OBJECT (self, "current picture does not have decoder frame");
+
+  return frame;
+}
+
+static GstFlowReturn
+gst_nv_vp8_dec_decode_picture (GstVp8Decoder * decoder,
+    GstVp8Picture * picture, GstVp8Parser * parser)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+  GstVp8FrameHdr *frame_hdr = &picture->frame_hdr;
+  GstNvDecoderFrame *frame;
+  GstNvDecoderFrame *other_frame;
+  guint offset = 0;
+
+  GST_LOG_OBJECT (self, "Decode picture, size %" G_GSIZE_FORMAT, picture->size);
+
+  frame = gst_nv_vp8_dec_get_decoder_frame_from_picture (self, picture);
+  if (!frame) {
+    GST_ERROR_OBJECT (self, "Decoder frame is unavailable");
+    return GST_FLOW_ERROR;
+  }
+
+  self->params.nBitstreamDataLen = picture->size;
+  self->params.pBitstreamData = picture->data;
+  self->params.nNumSlices = 1;
+  self->params.pSliceDataOffsets = &offset;
+
+  self->params.CurrPicIdx = frame->index;
+
+  self->params.CodecSpecific.vp8.first_partition_size =
+      frame_hdr->first_part_size;
+
+  if (decoder->alt_ref_picture) {
+    other_frame =
+        gst_nv_vp8_dec_get_decoder_frame_from_picture (self,
+        decoder->alt_ref_picture);
+    if (!other_frame) {
+      GST_ERROR_OBJECT (self, "Couldn't get decoder frame for AltRef");
+      return GST_FLOW_ERROR;
+    }
+
+    self->params.CodecSpecific.vp8.AltRefIdx = other_frame->index;
+  } else {
+    self->params.CodecSpecific.vp8.AltRefIdx = 0xff;
+  }
+
+  if (decoder->golden_ref_picture) {
+    other_frame =
+        gst_nv_vp8_dec_get_decoder_frame_from_picture (self,
+        decoder->golden_ref_picture);
+    if (!other_frame) {
+      GST_ERROR_OBJECT (self, "Couldn't get decoder frame for GoldenRef");
+      return GST_FLOW_ERROR;
+    }
+
+    self->params.CodecSpecific.vp8.GoldenRefIdx = other_frame->index;
+  } else {
+    self->params.CodecSpecific.vp8.GoldenRefIdx = 0xff;
+  }
+
+  if (decoder->last_picture) {
+    other_frame =
+        gst_nv_vp8_dec_get_decoder_frame_from_picture (self,
+        decoder->last_picture);
+    if (!other_frame) {
+      GST_ERROR_OBJECT (self, "Couldn't get decoder frame for LastRef");
+      return GST_FLOW_ERROR;
+    }
+
+    self->params.CodecSpecific.vp8.LastRefIdx = other_frame->index;
+  } else {
+    self->params.CodecSpecific.vp8.LastRefIdx = 0xff;
+  }
+
+  self->params.CodecSpecific.vp8.vp8_frame_tag.frame_type =
+      frame_hdr->key_frame ? 0 : 1;
+  self->params.CodecSpecific.vp8.vp8_frame_tag.version = frame_hdr->version;
+  self->params.CodecSpecific.vp8.vp8_frame_tag.show_frame =
+      frame_hdr->show_frame;
+  self->params.CodecSpecific.vp8.vp8_frame_tag.update_mb_segmentation_data =
+      parser->segmentation.segmentation_enabled ?
+      parser->segmentation.update_segment_feature_data : 0;
+
+  if (!gst_nv_decoder_decode_picture (self->decoder, &self->params))
+    return GST_FLOW_ERROR;
+
+  return GST_FLOW_OK;
+}
+
+static GstFlowReturn
+gst_nv_vp8_dec_output_picture (GstVp8Decoder * decoder,
+    GstVideoCodecFrame * frame, GstVp8Picture * picture)
+{
+  GstNvVp8Dec *self = GST_NV_VP8_DEC (decoder);
+  GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder);
+  GstNvDecoderFrame *decoder_frame;
+
+  GST_LOG_OBJECT (self, "Outputting picture %p", picture);
+
+  decoder_frame = (GstNvDecoderFrame *) gst_vp8_picture_get_user_data (picture);
+  if (!decoder_frame) {
+    GST_ERROR_OBJECT (self, "No decoder frame in picture %p", picture);
+    goto error;
+  }
+
+  if (!gst_nv_decoder_finish_frame (self->decoder, vdec, decoder_frame,
+          &frame->output_buffer)) {
+    GST_ERROR_OBJECT (self, "Failed to handle output picture");
+    goto error;
+  }
+
+  gst_vp8_picture_unref (picture);
+
+  return gst_video_decoder_finish_frame (vdec, frame);
+
+error:
+  gst_video_decoder_drop_frame (vdec, frame);
+  gst_vp8_picture_unref (picture);
+
+  return GST_FLOW_ERROR;
+}
+
+static guint
+gst_nv_vp8_dec_get_preferred_output_delay (GstVp8Decoder * decoder,
+    gboolean is_live)
+{
+  /* Prefer to zero latency for live pipeline */
+  if (is_live)
+    return 0;
+
+  /* NVCODEC SDK uses 4 frame delay for better throughput performance */
+  return 4;
+}
+
+typedef struct
+{
+  GstCaps *sink_caps;
+  GstCaps *src_caps;
+  guint cuda_device_id;
+  gboolean is_default;
+} GstNvVp8DecClassData;
+
+static void
+gst_nv_vp8_dec_subclass_init (gpointer klass, GstNvVp8DecClassData * cdata)
+{
+  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
+  GstNvVp8DecClass *nvdec_class = (GstNvVp8DecClass *) (klass);
+  gchar *long_name;
+
+  if (cdata->is_default) {
+    long_name = g_strdup_printf ("NVDEC VP8 Stateless Decoder");
+  } else {
+    long_name = g_strdup_printf ("NVDEC VP8 Stateless Decoder with device %d",
+        cdata->cuda_device_id);
+  }
+
+  gst_element_class_set_metadata (element_class, long_name,
+      "Codec/Decoder/Video/Hardware",
+      "NVIDIA VP8 video decoder", "Seungha Yang <seungha@centricular.com>");
+  g_free (long_name);
+
+  gst_element_class_add_pad_template (element_class,
+      gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
+          cdata->sink_caps));
+  gst_element_class_add_pad_template (element_class,
+      gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
+          cdata->src_caps));
+
+  nvdec_class->cuda_device_id = cdata->cuda_device_id;
+
+  gst_caps_unref (cdata->sink_caps);
+  gst_caps_unref (cdata->src_caps);
+  g_free (cdata);
+}
+
+void
+gst_nv_vp8_dec_register (GstPlugin * plugin, guint device_id, guint rank,
+    GstCaps * sink_caps, GstCaps * src_caps, gboolean is_primary)
+{
+  GTypeQuery type_query;
+  GTypeInfo type_info = { 0, };
+  GType subtype;
+  gchar *type_name;
+  gchar *feature_name;
+  GstNvVp8DecClassData *cdata;
+  gboolean is_default = TRUE;
+
+  /**
+   * element-nvvp8sldec:
+   *
+   * Since: 1.20
+   */
+
+  cdata = g_new0 (GstNvVp8DecClassData, 1);
+  cdata->sink_caps = gst_caps_ref (sink_caps);
+  cdata->src_caps = gst_caps_ref (src_caps);
+  cdata->cuda_device_id = device_id;
+
+  g_type_query (GST_TYPE_NV_VP8_DEC, &type_query);
+  memset (&type_info, 0, sizeof (type_info));
+  type_info.class_size = type_query.class_size;
+  type_info.instance_size = type_query.instance_size;
+  type_info.class_init = (GClassInitFunc) gst_nv_vp8_dec_subclass_init;
+  type_info.class_data = cdata;
+
+  if (is_primary) {
+    type_name = g_strdup ("GstNvVP8StatelessPrimaryDec");
+    feature_name = g_strdup ("nvvp8dec");
+  } else {
+    type_name = g_strdup ("GstNvVP8StatelessDec");
+    feature_name = g_strdup ("nvvp8sldec");
+  }
+
+  if (g_type_from_name (type_name) != 0) {
+    g_free (type_name);
+    g_free (feature_name);
+    if (is_primary) {
+      type_name =
+          g_strdup_printf ("GstNvVP8StatelessPrimaryDevice%dDec", device_id);
+      feature_name = g_strdup_printf ("nvvp8device%ddec", device_id);
+    } else {
+      type_name = g_strdup_printf ("GstNvVP8StatelessDevice%dDec", device_id);
+      feature_name = g_strdup_printf ("nvvp8sldevice%ddec", device_id);
+    }
+
+    is_default = FALSE;
+  }
+
+  cdata->is_default = is_default;
+  subtype = g_type_register_static (GST_TYPE_NV_VP8_DEC,
+      type_name, &type_info, 0);
+
+  /* make lower rank than default device */
+  if (rank > 0 && !is_default)
+    rank--;
+
+  if (!gst_element_register (plugin, feature_name, rank, subtype))
+    GST_WARNING ("Failed to register plugin '%s'", type_name);
+
+  g_free (type_name);
+  g_free (feature_name);
+}
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvvp8dec.h
Added
@@ -0,0 +1,47 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_NV_VP8_DEC_H__ +#define __GST_NV_VP8_DEC_H__ + +#include <gst/gst.h> +#include <gst/codecs/gstvp8decoder.h> + +G_BEGIN_DECLS + +#define GST_TYPE_NV_VP8_DEC (gst_nv_vp8_dec_get_type()) +#define GST_NV_VP8_DEC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_NV_VP8_DEC, GstNvVp8Dec)) +#define GST_NV_VP8_DEC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_NV_VP8_DEC, GstNvVp8DecClass)) +#define GST_NV_VP8_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_NV_VP8_DEC, GstNvVp8DecClass)) + +typedef struct _GstNvVp8Dec GstNvVp8Dec; +typedef struct _GstNvVp8DecClass GstNvVp8DecClass; + +GType gst_nv_vp8_dec_get_type (void); + +void gst_nv_vp8_dec_register (GstPlugin * plugin, + guint device_id, + guint rank, + GstCaps * sink_caps, + GstCaps * src_caps, + gboolean is_primary); + +G_END_DECLS + +#endif /* __GST_NV_VP8_DEC_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvvp9dec.c
Added
@@ -0,0 +1,657 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstnvvp9dec.h" +#include "gstcudautils.h" +#include "gstnvdecoder.h" + +#include <string.h> + +GST_DEBUG_CATEGORY_STATIC (gst_nv_vp9_dec_debug); +#define GST_CAT_DEFAULT gst_nv_vp9_dec_debug + +/* reference list 8 + 2 margin */ +#define NUM_OUTPUT_VIEW 10 + +struct _GstNvVp9Dec +{ + GstVp9Decoder parent; + + GstVideoCodecState *output_state; + + GstCudaContext *context; + GstNvDecoder *decoder; + CUVIDPICPARAMS params; + + guint width, height; + GstVP9Profile profile; +}; + +struct _GstNvVp9DecClass +{ + GstVp9DecoderClass parent_class; + guint cuda_device_id; +}; + +#define gst_nv_vp9_dec_parent_class parent_class + +/** + * GstNvVp9Dec: + * + * Since: 1.20 + */ +G_DEFINE_TYPE (GstNvVp9Dec, gst_nv_vp9_dec, GST_TYPE_VP9_DECODER); + +static void gst_nv_vp9_dec_set_context (GstElement * element, + GstContext * context); +static gboolean gst_nv_vp9_dec_open (GstVideoDecoder * decoder); +static gboolean gst_nv_vp9_dec_close (GstVideoDecoder * decoder); +static gboolean gst_nv_vp9_dec_negotiate (GstVideoDecoder * decoder); +static gboolean 
gst_nv_vp9_dec_decide_allocation (GstVideoDecoder * + decoder, GstQuery * query); +static gboolean gst_nv_vp9_dec_src_query (GstVideoDecoder * decoder, + GstQuery * query); + +/* GstVp9Decoder */ +static GstFlowReturn gst_nv_vp9_dec_new_sequence (GstVp9Decoder * decoder, + const GstVp9FrameHeader * frame_hdr); +static GstFlowReturn gst_nv_vp9_dec_new_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture); +static GstVp9Picture *gst_nv_vp9_dec_duplicate_picture (GstVp9Decoder * + decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture); +static GstFlowReturn gst_nv_vp9_dec_decode_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture, GstVp9Dpb * dpb); +static GstFlowReturn gst_nv_vp9_dec_output_picture (GstVp9Decoder * + decoder, GstVideoCodecFrame * frame, GstVp9Picture * picture); +static guint gst_nv_vp9_dec_get_preferred_output_delay (GstVp9Decoder * decoder, + gboolean is_live); + +static void +gst_nv_vp9_dec_class_init (GstNvVp9DecClass * klass) +{ + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstVp9DecoderClass *vp9decoder_class = GST_VP9_DECODER_CLASS (klass); + + element_class->set_context = GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_set_context); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_close); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_decide_allocation); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_src_query); + + vp9decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_new_sequence); + vp9decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_new_picture); + vp9decoder_class->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_duplicate_picture); + vp9decoder_class->decode_picture = + 
GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_decode_picture); + vp9decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_output_picture); + vp9decoder_class->get_preferred_output_delay = + GST_DEBUG_FUNCPTR (gst_nv_vp9_dec_get_preferred_output_delay); + + GST_DEBUG_CATEGORY_INIT (gst_nv_vp9_dec_debug, + "nvvp9dec", 0, "NVIDIA VP9 Decoder"); + + gst_type_mark_as_plugin_api (GST_TYPE_NV_VP9_DEC, 0); +} + +static void +gst_nv_vp9_dec_init (GstNvVp9Dec * self) +{ +} + +static void +gst_nv_vp9_dec_set_context (GstElement * element, GstContext * context) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (element); + GstNvVp9DecClass *klass = GST_NV_VP9_DEC_GET_CLASS (self); + + GST_DEBUG_OBJECT (self, "set context %s", + gst_context_get_context_type (context)); + + if (gst_cuda_handle_set_context (element, context, klass->cuda_device_id, + &self->context)) { + goto done; + } + + if (self->decoder) + gst_nv_decoder_handle_set_context (self->decoder, element, context); + +done: + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static gboolean +gst_nv_vp9_dec_open (GstVideoDecoder * decoder) +{ + GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder); + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + GstNvVp9DecClass *klass = GST_NV_VP9_DEC_GET_CLASS (self); + + if (!gst_cuda_ensure_element_context (GST_ELEMENT (self), + klass->cuda_device_id, &self->context)) { + GST_ERROR_OBJECT (self, "Required element data is unavailable"); + return FALSE; + } + + self->decoder = gst_nv_decoder_new (self->context); + if (!self->decoder) { + GST_ERROR_OBJECT (self, "Failed to create decoder object"); + gst_clear_object (&self->context); + + return FALSE; + } + + /* NVDEC doesn't support non-keyframe resolution change and it will result + * in outputting broken frames */ + gst_vp9_decoder_set_non_keyframe_format_change_support (vp9dec, FALSE); + + return TRUE; +} + +static gboolean +gst_nv_vp9_dec_close (GstVideoDecoder * decoder) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC 
(decoder); + + g_clear_pointer (&self->output_state, gst_video_codec_state_unref); + gst_clear_object (&self->decoder); + gst_clear_object (&self->context); + + return TRUE; +} + +static gboolean +gst_nv_vp9_dec_negotiate (GstVideoDecoder * decoder) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder); + + GST_DEBUG_OBJECT (self, "negotiate"); + + gst_nv_decoder_negotiate (self->decoder, decoder, vp9dec->input_state, + &self->output_state); + + /* TODO: add support D3D11 memory */ + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static gboolean +gst_nv_vp9_dec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + + if (!gst_nv_decoder_decide_allocation (self->decoder, decoder, query)) { + GST_WARNING_OBJECT (self, "Failed to handle decide allocation"); + return FALSE; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static gboolean +gst_nv_vp9_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + if (gst_cuda_handle_context_query (GST_ELEMENT (decoder), query, + self->context)) { + return TRUE; + } else if (self->decoder && + gst_nv_decoder_handle_context_query (self->decoder, decoder, query)) { + return TRUE; + } + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); +} + +static GstFlowReturn +gst_nv_vp9_dec_new_sequence (GstVp9Decoder * decoder, + const GstVp9FrameHeader * frame_hdr) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + GstVideoFormat out_format = GST_VIDEO_FORMAT_UNKNOWN; + GstVideoInfo info; + + GST_LOG_OBJECT (self, "new sequence"); + + self->width = frame_hdr->width; + self->height = frame_hdr->height; + self->profile = frame_hdr->profile; + + if (self->profile == 
GST_VP9_PROFILE_0) { + out_format = GST_VIDEO_FORMAT_NV12; + } else if (self->profile == GST_VP9_PROFILE_2) { + if (frame_hdr->bit_depth == 10) { + out_format = GST_VIDEO_FORMAT_P010_10LE; + } else { + out_format = GST_VIDEO_FORMAT_P016_LE; + } + } + + if (out_format == GST_VIDEO_FORMAT_UNKNOWN) { + GST_ERROR_OBJECT (self, "Could not support profile %d", self->profile); + return GST_FLOW_NOT_NEGOTIATED; + } + + gst_video_info_set_format (&info, out_format, self->width, self->height); + if (!gst_nv_decoder_configure (self->decoder, + cudaVideoCodec_VP9, &info, self->width, self->height, + frame_hdr->bit_depth, + /* +4 for render delay */ + NUM_OUTPUT_VIEW)) { + GST_ERROR_OBJECT (self, "Failed to configure decoder"); + return GST_FLOW_NOT_NEGOTIATED; + } + + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + + memset (&self->params, 0, sizeof (CUVIDPICPARAMS)); + + self->params.CodecSpecific.vp9.colorSpace = frame_hdr->color_space; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_nv_vp9_dec_new_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + GstNvDecoderFrame *nv_frame; + + nv_frame = gst_nv_decoder_new_frame (self->decoder); + if (!nv_frame) { + GST_ERROR_OBJECT (self, "No available decoder frame"); + return GST_FLOW_ERROR; + } + + GST_LOG_OBJECT (self, + "New decoder frame %p (index %d)", nv_frame, nv_frame->index); + + gst_vp9_picture_set_user_data (picture, + nv_frame, (GDestroyNotify) gst_nv_decoder_frame_unref); + + return GST_FLOW_OK; +} + +static GstNvDecoderFrame * +gst_nv_vp9_dec_get_decoder_frame_from_picture (GstNvVp9Dec * self, + GstVp9Picture * picture) +{ + GstNvDecoderFrame *frame; + + frame = (GstNvDecoderFrame *) gst_vp9_picture_get_user_data (picture); + + if (!frame) + GST_DEBUG_OBJECT (self, "current picture does not have decoder 
frame"); + + return frame; +} + +static GstVp9Picture * +gst_nv_vp9_dec_duplicate_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + GstNvDecoderFrame *nv_frame; + GstVp9Picture *new_picture; + + nv_frame = gst_nv_vp9_dec_get_decoder_frame_from_picture (self, picture); + + if (!nv_frame) { + GST_ERROR_OBJECT (self, "Parent picture does not have decoder frame"); + return NULL; + } + + new_picture = gst_vp9_picture_new (); + new_picture->frame_hdr = picture->frame_hdr; + + gst_vp9_picture_set_user_data (new_picture, + gst_nv_decoder_frame_ref (nv_frame), + (GDestroyNotify) gst_nv_decoder_frame_unref); + + return new_picture; +} + +static GstFlowReturn +gst_nv_vp9_dec_decode_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture, GstVp9Dpb * dpb) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + const GstVp9LoopFilterParams *lfp = &frame_hdr->loop_filter_params; + const GstVp9SegmentationParams *sp = &frame_hdr->segmentation_params; + const GstVp9QuantizationParams *qp = &frame_hdr->quantization_params; + CUVIDPICPARAMS *params = &self->params; + CUVIDVP9PICPARAMS *vp9_params = ¶ms->CodecSpecific.vp9; + GstNvDecoderFrame *frame; + GstNvDecoderFrame *other_frame; + guint offset = 0; + guint8 ref_frame_map[GST_VP9_REF_FRAMES]; + gint i; + + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->mbRefLfDelta) == + GST_VP9_MAX_REF_LF_DELTAS); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->mbModeLfDelta) == + GST_VP9_MAX_MODE_LF_DELTAS); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->mb_segment_tree_probs) == + GST_VP9_SEG_TREE_PROBS); + G_STATIC_ASSERT (sizeof (vp9_params->mb_segment_tree_probs) == + sizeof (sp->segmentation_tree_probs)); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->segment_pred_probs) == + GST_VP9_PREDICTION_PROBS); + G_STATIC_ASSERT (sizeof (vp9_params->segment_pred_probs) == + sizeof 
(sp->segmentation_pred_prob)); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->refFrameSignBias) == + GST_VP9_REFS_PER_FRAME + 1); + G_STATIC_ASSERT (sizeof (vp9_params->refFrameSignBias) == + sizeof (frame_hdr->ref_frame_sign_bias)); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->activeRefIdx) == + GST_VP9_REFS_PER_FRAME); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->segmentFeatureEnable) == + GST_VP9_MAX_SEGMENTS); + G_STATIC_ASSERT (sizeof (vp9_params->segmentFeatureEnable) == + sizeof (sp->feature_enabled)); + G_STATIC_ASSERT (G_N_ELEMENTS (vp9_params->segmentFeatureData) == + GST_VP9_MAX_SEGMENTS); + G_STATIC_ASSERT (sizeof (vp9_params->segmentFeatureData) == + sizeof (sp->feature_data)); + + GST_LOG_OBJECT (self, "Decode picture, size %" G_GSIZE_FORMAT, picture->size); + + frame = gst_nv_vp9_dec_get_decoder_frame_from_picture (self, picture); + if (!frame) { + GST_ERROR_OBJECT (self, "Decoder frame is unavailable"); + return GST_FLOW_ERROR; + } + + params->nBitstreamDataLen = picture->size; + params->pBitstreamData = picture->data; + params->nNumSlices = 1; + params->pSliceDataOffsets = &offset; + + params->PicWidthInMbs = GST_ROUND_UP_16 (frame_hdr->width) >> 4; + params->FrameHeightInMbs = GST_ROUND_UP_16 (frame_hdr->height) >> 4; + params->CurrPicIdx = frame->index; + + vp9_params->width = frame_hdr->width; + vp9_params->height = frame_hdr->height; + + for (i = 0; i < GST_VP9_REF_FRAMES; i++) { + if (dpb->pic_list[i]) { + other_frame = gst_nv_vp9_dec_get_decoder_frame_from_picture (self, + dpb->pic_list[i]); + if (!other_frame) { + GST_ERROR_OBJECT (self, "Couldn't get decoder frame from picture"); + return GST_FLOW_ERROR; + } + + ref_frame_map[i] = other_frame->index; + } else { + ref_frame_map[i] = 0xff; + } + } + + vp9_params->LastRefIdx = ref_frame_map[frame_hdr->ref_frame_idx[0]]; + vp9_params->GoldenRefIdx = ref_frame_map[frame_hdr->ref_frame_idx[1]]; + vp9_params->AltRefIdx = ref_frame_map[frame_hdr->ref_frame_idx[2]]; + + vp9_params->profile =
frame_hdr->profile; + vp9_params->frameContextIdx = frame_hdr->frame_context_idx; + vp9_params->frameType = frame_hdr->frame_type; + vp9_params->showFrame = frame_hdr->show_frame; + vp9_params->errorResilient = frame_hdr->error_resilient_mode; + vp9_params->frameParallelDecoding = frame_hdr->frame_parallel_decoding_mode; + vp9_params->subSamplingX = frame_hdr->subsampling_x; + vp9_params->subSamplingY = frame_hdr->subsampling_y; + vp9_params->intraOnly = frame_hdr->intra_only; + vp9_params->allow_high_precision_mv = frame_hdr->allow_high_precision_mv; + vp9_params->refreshEntropyProbs = frame_hdr->refresh_frame_context; + vp9_params->bitDepthMinus8Luma = frame_hdr->bit_depth - 8; + vp9_params->bitDepthMinus8Chroma = frame_hdr->bit_depth - 8; + + vp9_params->loopFilterLevel = lfp->loop_filter_level; + vp9_params->loopFilterSharpness = lfp->loop_filter_sharpness; + vp9_params->modeRefLfEnabled = lfp->loop_filter_delta_enabled; + + vp9_params->log2_tile_columns = frame_hdr->tile_cols_log2; + vp9_params->log2_tile_rows = frame_hdr->tile_rows_log2; + + vp9_params->segmentEnabled = sp->segmentation_enabled; + vp9_params->segmentMapUpdate = sp->segmentation_update_map; + vp9_params->segmentMapTemporalUpdate = sp->segmentation_temporal_update; + vp9_params->segmentFeatureMode = sp->segmentation_abs_or_delta_update; + + vp9_params->qpYAc = qp->base_q_idx; + vp9_params->qpYDc = qp->delta_q_y_dc; + vp9_params->qpChDc = qp->delta_q_uv_dc; + vp9_params->qpChAc = qp->delta_q_uv_ac; + + vp9_params->resetFrameContext = frame_hdr->reset_frame_context; + vp9_params->mcomp_filter_type = frame_hdr->interpolation_filter; + vp9_params->frameTagSize = frame_hdr->frame_header_length_in_bytes; + vp9_params->offsetToDctParts = frame_hdr->header_size_in_bytes; + + for (i = 0; i < GST_VP9_MAX_REF_LF_DELTAS; i++) + vp9_params->mbRefLfDelta[i] = lfp->loop_filter_ref_deltas[i]; + + for (i = 0; i < GST_VP9_MAX_MODE_LF_DELTAS; i++) + vp9_params->mbModeLfDelta[i] = lfp->loop_filter_mode_deltas[i]; 
+ + memcpy (vp9_params->mb_segment_tree_probs, sp->segmentation_tree_probs, + sizeof (sp->segmentation_tree_probs)); + memcpy (vp9_params->segment_pred_probs, sp->segmentation_pred_prob, + sizeof (sp->segmentation_pred_prob)); + memcpy (vp9_params->refFrameSignBias, frame_hdr->ref_frame_sign_bias, + sizeof (frame_hdr->ref_frame_sign_bias)); + + for (i = 0; i < GST_VP9_REFS_PER_FRAME; i++) { + vp9_params->activeRefIdx[i] = frame_hdr->ref_frame_idx[i]; + } + + memcpy (vp9_params->segmentFeatureEnable, sp->feature_enabled, + sizeof (sp->feature_enabled)); + memcpy (vp9_params->segmentFeatureData, sp->feature_data, + sizeof (sp->feature_data)); + + if (!gst_nv_decoder_decode_picture (self->decoder, &self->params)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_nv_vp9_dec_output_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstNvVp9Dec *self = GST_NV_VP9_DEC (decoder); + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstNvDecoderFrame *decoder_frame; + + GST_LOG_OBJECT (self, "Outputting picture %p", picture); + + decoder_frame = (GstNvDecoderFrame *) gst_vp9_picture_get_user_data (picture); + if (!decoder_frame) { + GST_ERROR_OBJECT (self, "No decoder frame in picture %p", picture); + goto error; + } + + if (!gst_nv_decoder_finish_frame (self->decoder, vdec, decoder_frame, + &frame->output_buffer)) { + GST_ERROR_OBJECT (self, "Failed to handle output picture"); + goto error; + } + + gst_vp9_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_video_decoder_drop_frame (vdec, frame); + gst_vp9_picture_unref (picture); + + return GST_FLOW_ERROR; +} + +static guint +gst_nv_vp9_dec_get_preferred_output_delay (GstVp9Decoder * decoder, + gboolean is_live) +{ + /* Prefer to zero latency for live pipeline */ + if (is_live) + return 0; + + /* NVCODEC SDK uses 4 frame delay for better throughput performance */ + return 4; +} + +typedef struct +{ + 
GstCaps *sink_caps; + GstCaps *src_caps; + guint cuda_device_id; + gboolean is_default; +} GstNvVp9DecClassData; + +static void +gst_nv_vp9_dec_subclass_init (gpointer klass, GstNvVp9DecClassData * cdata) +{ + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstNvVp9DecClass *nvdec_class = (GstNvVp9DecClass *) (klass); + gchar *long_name; + + if (cdata->is_default) { + long_name = g_strdup_printf ("NVDEC VP9 Stateless Decoder"); + } else { + long_name = g_strdup_printf ("NVDEC VP9 Stateless Decoder with device %d", + cdata->cuda_device_id); + } + + gst_element_class_set_metadata (element_class, long_name, + "Codec/Decoder/Video/Hardware", + "NVIDIA VP9 video decoder", "Seungha Yang <seungha@centricular.com>"); + g_free (long_name); + + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + cdata->sink_caps)); + gst_element_class_add_pad_template (element_class, + gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + cdata->src_caps)); + + nvdec_class->cuda_device_id = cdata->cuda_device_id; + + gst_caps_unref (cdata->sink_caps); + gst_caps_unref (cdata->src_caps); + g_free (cdata); +} + +void +gst_nv_vp9_dec_register (GstPlugin * plugin, guint device_id, guint rank, + GstCaps * sink_caps, GstCaps * src_caps, gboolean is_primary) +{ + GTypeQuery type_query; + GTypeInfo type_info = { 0, }; + GType subtype; + gchar *type_name; + gchar *feature_name; + GstNvVp9DecClassData *cdata; + gboolean is_default = TRUE; + + /** + * element-nvvp9sldec: + * + * Since: 1.20 + */ + + cdata = g_new0 (GstNvVp9DecClassData, 1); + cdata->sink_caps = gst_caps_copy (sink_caps); + gst_caps_set_simple (cdata->sink_caps, + "alignment", G_TYPE_STRING, "frame", NULL); + GST_MINI_OBJECT_FLAG_SET (cdata->sink_caps, + GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + cdata->src_caps = gst_caps_ref (src_caps); + cdata->cuda_device_id = device_id; + + g_type_query (GST_TYPE_NV_VP9_DEC, &type_query); + memset (&type_info, 0, sizeof 
(type_info)); + type_info.class_size = type_query.class_size; + type_info.instance_size = type_query.instance_size; + type_info.class_init = (GClassInitFunc) gst_nv_vp9_dec_subclass_init; + type_info.class_data = cdata; + + if (is_primary) { + type_name = g_strdup ("GstNvVP9StatelessPrimaryDec"); + feature_name = g_strdup ("nvvp9dec"); + } else { + type_name = g_strdup ("GstNvVP9StatelessDec"); + feature_name = g_strdup ("nvvp9sldec"); + } + + if (g_type_from_name (type_name) != 0) { + g_free (type_name); + g_free (feature_name); + if (is_primary) { + type_name = + g_strdup_printf ("GstNvVP9StatelessPrimaryDevice%dDec", device_id); + feature_name = g_strdup_printf ("nvvp9device%ddec", device_id); + } else { + type_name = g_strdup_printf ("GstNvVP9StatelessDevice%dDec", device_id); + feature_name = g_strdup_printf ("nvvp9sldevice%ddec", device_id); + } + + is_default = FALSE; + } + + cdata->is_default = is_default; + subtype = g_type_register_static (GST_TYPE_NV_VP9_DEC, + type_name, &type_info, 0); + + /* make lower rank than default device */ + if (rank > 0 && !is_default) + rank--; + + if (!gst_element_register (plugin, feature_name, rank, subtype)) + GST_WARNING ("Failed to register plugin '%s'", type_name); + + g_free (type_name); + g_free (feature_name); +}
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/gstnvvp9dec.h
Added
@@ -0,0 +1,47 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_NV_VP9_DEC_H__ +#define __GST_NV_VP9_DEC_H__ + +#include <gst/gst.h> +#include <gst/codecs/gstvp9decoder.h> + +G_BEGIN_DECLS + +#define GST_TYPE_NV_VP9_DEC (gst_nv_vp9_dec_get_type()) +#define GST_NV_VP9_DEC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_NV_VP9_DEC, GstNvVp9Dec)) +#define GST_NV_VP9_DEC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_NV_VP9_DEC, GstNvVp9DecClass)) +#define GST_NV_VP9_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_NV_VP9_DEC, GstNvVp9DecClass)) + +typedef struct _GstNvVp9Dec GstNvVp9Dec; +typedef struct _GstNvVp9DecClass GstNvVp9DecClass; + +GType gst_nv_vp9_dec_get_type (void); + +void gst_nv_vp9_dec_register (GstPlugin * plugin, + guint device_id, + guint rank, + GstCaps * sink_caps, + GstCaps * src_caps, + gboolean is_primary); + +G_END_DECLS + +#endif /* __GST_NV_VP9_DEC_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/meson.build
Changed
@@ -12,6 +12,20 @@ 'gstnvdecoder.c', 'gstnvh264dec.c', 'gstnvh265dec.c', + 'gstcudamemory.c', + 'gstcudabufferpool.c', + 'gstcudabasetransform.c', + 'gstcudadownload.c', + 'gstcudaupload.c', + 'gstcudanvrtc.c', + 'gstnvrtcloader.c', + 'cuda-converter.c', + 'gstcudafilter.c', + 'gstcudabasefilter.c', + 'gstcudaconvert.c', + 'gstcudascale.c', + 'gstnvvp8dec.c', + 'gstnvvp9dec.c', ] if get_option('nvcodec').disabled()
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/plugin.c
Changed
@@ -33,7 +33,12 @@ #include "gstnvenc.h" #include "gstnvh264dec.h" #include "gstnvh265dec.h" +#include "gstnvvp8dec.h" +#include "gstnvvp9dec.h" #include "gstnvdecoder.h" +#include "gstcudadownload.h" +#include "gstcudaupload.h" +#include "gstcudafilter.h" GST_DEBUG_CATEGORY (gst_nvcodec_debug); GST_DEBUG_CATEGORY (gst_nvdec_debug); @@ -56,6 +61,8 @@ const gchar *env; gboolean use_h264_sl_dec = FALSE; gboolean use_h265_sl_dec = FALSE; + gboolean use_vp8_sl_dec = FALSE; + gboolean use_vp9_sl_dec = FALSE; GST_DEBUG_CATEGORY_INIT (gst_nvcodec_debug, "nvcodec", 0, "nvcodec"); GST_DEBUG_CATEGORY_INIT (gst_nvdec_debug, "nvdec", 0, "nvdec"); @@ -108,6 +115,12 @@ } else if (g_ascii_strcasecmp (*iter, "h265") == 0) { GST_INFO ("Found %s in GST_USE_NV_STATELESS_CODEC environment", *iter); use_h265_sl_dec = TRUE; + } else if (g_ascii_strcasecmp (*iter, "vp8") == 0) { + GST_INFO ("Found %s in GST_USE_NV_STATELESS_CODEC environment", *iter); + use_vp8_sl_dec = TRUE; + } else if (g_ascii_strcasecmp (*iter, "vp9") == 0) { + GST_INFO ("Found %s in GST_USE_NV_STATELESS_CODEC environment", *iter); + use_vp9_sl_dec = TRUE; } } @@ -155,7 +168,8 @@ gst_nv_h264_dec_register (plugin, i, GST_RANK_SECONDARY, sink_template, src_template, FALSE); if (use_h264_sl_dec) { - GST_INFO ("Skip register cuvid parser based nvh264dec"); + GST_INFO + ("Skipping registration of CUVID parser based nvh264dec element"); register_cuviddec = FALSE; gst_nv_h264_dec_register (plugin, @@ -166,13 +180,37 @@ gst_nv_h265_dec_register (plugin, i, GST_RANK_SECONDARY, sink_template, src_template, FALSE); if (use_h265_sl_dec) { - GST_INFO ("Skip register cuvid parser based nvh264dec"); + GST_INFO + ("Skipping registration of CUVID parser based nvh265dec element"); register_cuviddec = FALSE; gst_nv_h265_dec_register (plugin, i, GST_RANK_PRIMARY, sink_template, src_template, TRUE); } break; + case cudaVideoCodec_VP8: + gst_nv_vp8_dec_register (plugin, + i, GST_RANK_SECONDARY, sink_template, src_template, FALSE); + if 
(use_vp8_sl_dec) { + GST_INFO + ("Skipping registration of CUVID parser based nvvp8dec element"); + register_cuviddec = FALSE; + + gst_nv_vp8_dec_register (plugin, + i, GST_RANK_PRIMARY, sink_template, src_template, TRUE); + } + break; + case cudaVideoCodec_VP9: + gst_nv_vp9_dec_register (plugin, + i, GST_RANK_SECONDARY, sink_template, src_template, FALSE); + if (use_vp9_sl_dec) { + GST_INFO + ("Skipping registration of CUVID parser based nvvp9dec element"); + register_cuviddec = FALSE; + + gst_nv_vp9_dec_register (plugin, + i, GST_RANK_PRIMARY, sink_template, src_template, TRUE); + } + break; + default: + break; + } @@ -194,6 +232,13 @@ CuCtxDestroy (cuda_ctx); } + gst_element_register (plugin, "cudadownload", GST_RANK_NONE, + GST_TYPE_CUDA_DOWNLOAD); + gst_element_register (plugin, "cudaupload", GST_RANK_NONE, + GST_TYPE_CUDA_UPLOAD); + + gst_cuda_filter_plugin_init (plugin); + return TRUE; }
gst-plugins-bad-1.18.6.tar.xz/sys/nvcodec/stub/cuda.h -> gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/stub/cuda.h
Changed
@@ -27,7 +27,11 @@ typedef gpointer CUgraphicsResource; typedef gpointer CUstream; typedef gpointer CUarray; +typedef gpointer CUmodule; +typedef gpointer CUfunction; +typedef gpointer CUmipmappedArray; +typedef guint64 CUtexObject; typedef guintptr CUdeviceptr; typedef gint CUdevice; @@ -46,6 +50,7 @@ typedef enum { + CU_DEVICE_ATTRIBUTE_TEXTURE_ALIGNMENT = 14, CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MAJOR = 75, CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MINOR = 76, } CUdevice_attribute; @@ -70,6 +75,39 @@ CU_STREAM_NON_BLOCKING = 0x1 } CUstream_flags; +typedef enum +{ + CU_TR_FILTER_MODE_POINT = 0, + CU_TR_FILTER_MODE_LINEAR = 1 +} CUfilter_mode; + +typedef enum +{ + CU_TR_ADDRESS_MODE_WRAP = 0, + CU_TR_ADDRESS_MODE_CLAMP = 1, + CU_TR_ADDRESS_MODE_MIRROR = 2, + CU_TR_ADDRESS_MODE_BORDER = 3 +} CUaddress_mode; + +typedef enum +{ + CU_RESOURCE_TYPE_ARRAY = 0, + CU_RESOURCE_TYPE_MIPMAPPED_ARRAY = 1, + CU_RESOURCE_TYPE_LINEAR = 2, + CU_RESOURCE_TYPE_PITCH2D = 3 +} CUresourcetype; + +typedef enum +{ + CU_AD_FORMAT_UNSIGNED_INT8 = 1, + CU_AD_FORMAT_UNSIGNED_INT16 = 2, +} CUarray_format; + +typedef enum +{ + CU_RES_VIEW_FORMAT_NONE = 0, +} CUresourceViewFormat; + typedef struct { gsize srcXInBytes; @@ -97,6 +135,66 @@ CU_GL_DEVICE_LIST_ALL = 0x01, } CUGLDeviceList; +typedef struct +{ + CUaddress_mode addressMode[3]; + CUfilter_mode filterMode; + guint flags; + guint maxAnisotropy; + CUfilter_mode mipmapFilterMode; + gfloat mipmapLevelBias; + gfloat minMipmapLevelClamp; + gfloat maxMipmapLevelClamp; + gfloat borderColor[4]; + gint reserved[12]; +} CUDA_TEXTURE_DESC; + +typedef struct +{ + CUresourcetype resType; + + union { + struct { + CUarray hArray; + } array; + struct { + CUmipmappedArray hMipmappedArray; + } mipmap; + struct { + CUdeviceptr devPtr; + CUarray_format format; + guint numChannels; + gsize sizeInBytes; + } linear; + struct { + CUdeviceptr devPtr; + CUarray_format format; + guint numChannels; + gsize width; + gsize height; + gsize pitchInBytes; + } pitch2D; + 
struct { + gint reserved[32]; + } reserved; + } res; + + guint flags; +} CUDA_RESOURCE_DESC; + +typedef struct +{ + CUresourceViewFormat format; + gsize width; + gsize height; + gsize depth; + guint firstMipmapLevel; + guint lastMipmapLevel; + guint firstLayer; + guint lastLayer; + guint reserved[16]; +} CUDA_RESOURCE_VIEW_DESC; + #define CUDA_VERSION 10000 #ifdef _WIN32 @@ -114,11 +212,14 @@ #define cuMemAlloc cuMemAlloc_v2 #define cuMemAllocPitch cuMemAllocPitch_v2 +#define cuMemAllocHost cuMemAllocHost_v2 #define cuMemcpy2D cuMemcpy2D_v2 #define cuMemcpy2DAsync cuMemcpy2DAsync_v2 #define cuMemFree cuMemFree_v2 #define cuGLGetDevices cuGLGetDevices_v2 +#define CU_TRSF_READ_AS_INTEGER 1 + G_END_DECLS #endif /* __GST_CUDA_STUB_H__ */
gst-plugins-bad-1.20.1.tar.xz/sys/nvcodec/stub/nvrtc.h
Added
@@ -0,0 +1,34 @@ +/* NVRTC stub header + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_NVRTC_STUB_H__ +#define __GST_NVRTC_STUB_H__ + +#include <gst/gst.h> + +G_BEGIN_DECLS + +typedef struct _nvrtcProgram * nvrtcProgram; + +typedef enum { + NVRTC_SUCCESS = 0, +} nvrtcResult; + +G_END_DECLS + +#endif /* __GST_NVRTC_STUB_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/meson.build
Changed
@@ -4,6 +4,7 @@ 'openslessink.c', 'openslessrc.c', 'opensles.c', + 'openslesplugin.c', ] opensles_option = get_option('opensles')
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/opensles.c -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/opensles.c
Changed
@@ -77,25 +77,12 @@ g_mutex_unlock (&engine_mutex); } -static gboolean -plugin_init (GstPlugin * plugin) +void +opensles_element_init (GstPlugin * plugin) { - g_mutex_init (&engine_mutex); - - if (!gst_element_register (plugin, "openslessink", GST_RANK_PRIMARY, - GST_TYPE_OPENSLES_SINK)) { - return FALSE; - } - if (!gst_element_register (plugin, "openslessrc", GST_RANK_PRIMARY, - GST_TYPE_OPENSLES_SRC)) { - return FALSE; + static gsize res = FALSE; + if (g_once_init_enter (&res)) { + g_mutex_init (&engine_mutex); + g_once_init_leave (&res, TRUE); } - - return TRUE; } - -GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, - GST_VERSION_MINOR, - opensles, - "OpenSL ES support for GStreamer", - plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/opensles.h -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/opensles.h
Changed
@@ -24,6 +24,8 @@ #include <gst/gst.h> #include <SLES/OpenSLES.h> +void opensles_element_init (GstPlugin * plugin); + SLObjectItf gst_opensles_get_engine (void); void gst_opensles_release_engine (SLObjectItf engine_object);
View file
gst-plugins-bad-1.20.1.tar.xz/sys/opensles/openslesplugin.c
Added
@@ -0,0 +1,43 @@ +/* GStreamer + * Copyright (C) 2012 Fluendo S.A. <support@fluendo.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "opensles.h" +#include "openslessink.h" +#include "openslessrc.h" + +static gboolean +plugin_init (GstPlugin * plugin) +{ + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (openslessink, plugin); + ret |= GST_ELEMENT_REGISTER (openslessrc, plugin); + + return ret; +} + +GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, + GST_VERSION_MINOR, + opensles, + "OpenSL ES support for GStreamer", + plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/openslessink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/openslessink.c
Changed
@@ -74,6 +74,8 @@ #define parent_class gst_opensles_sink_parent_class G_DEFINE_TYPE_WITH_CODE (GstOpenSLESSink, gst_opensles_sink, GST_TYPE_AUDIO_BASE_SINK, _do_init); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (openslessink, "openslessink", + GST_RANK_PRIMARY, GST_TYPE_OPENSLES_SINK, opensles_element_init (plugin)); static GstAudioRingBuffer * gst_opensles_sink_create_ringbuffer (GstAudioBaseSink * base)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/openslessink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/openslessink.h
Changed
@@ -52,6 +52,7 @@ }; GType gst_opensles_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (openslessink); G_END_DECLS #endif /* __OPENSLESSINK_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/openslessrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/openslessrc.c
Changed
@@ -35,6 +35,7 @@ # include <config.h> #endif +#include "opensles.h" #include "openslessrc.h" GST_DEBUG_CATEGORY_STATIC (opensles_src_debug); @@ -58,6 +59,8 @@ #define parent_class gst_opensles_src_parent_class G_DEFINE_TYPE_WITH_CODE (GstOpenSLESSrc, gst_opensles_src, GST_TYPE_AUDIO_BASE_SRC, _do_init); +GST_ELEMENT_REGISTER_DEFINE_WITH_CODE (openslessrc, "openslessrc", + GST_RANK_PRIMARY, GST_TYPE_OPENSLES_SRC, opensles_element_init (plugin)); enum {
View file
gst-plugins-bad-1.18.6.tar.xz/sys/opensles/openslessrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/opensles/openslessrc.h
Changed
@@ -48,6 +48,7 @@ }; GType gst_opensles_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (openslessrc); G_END_DECLS #endif /* __OPENSLESSRC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/shm/gstshm.c -> gst-plugins-bad-1.20.1.tar.xz/sys/shm/gstshm.c
Changed
@@ -29,10 +29,12 @@ static gboolean plugin_init (GstPlugin * plugin) { - return gst_element_register (plugin, "shmsrc", - GST_RANK_NONE, GST_TYPE_SHM_SRC) && - gst_element_register (plugin, "shmsink", - GST_RANK_NONE, GST_TYPE_SHM_SINK); + gboolean ret = FALSE; + + ret |= GST_ELEMENT_REGISTER (shmsrc, plugin); + ret |= GST_ELEMENT_REGISTER (shmsink, plugin); + + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/shm/gstshmsink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/shm/gstshmsink.c
Changed
@@ -83,6 +83,8 @@ #define gst_shm_sink_parent_class parent_class G_DEFINE_TYPE (GstShmSink, gst_shm_sink, GST_TYPE_BASE_SINK); +GST_ELEMENT_REGISTER_DEFINE (shmsink, "shmsink", GST_RANK_NONE, + GST_TYPE_SHM_SINK); static void gst_shm_sink_finalize (GObject * object); static void gst_shm_sink_set_property (GObject * object, guint prop_id,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/shm/gstshmsink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/shm/gstshmsink.h
Changed
@@ -78,5 +78,7 @@ GType gst_shm_sink_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (shmsink); + G_END_DECLS #endif /* __GST_SHM_SINK_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/shm/gstshmsrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/shm/gstshmsrc.c
Changed
@@ -76,6 +76,7 @@ #define gst_shm_src_parent_class parent_class G_DEFINE_TYPE (GstShmSrc, gst_shm_src, GST_TYPE_PUSH_SRC); +GST_ELEMENT_REGISTER_DEFINE (shmsrc, "shmsrc", GST_RANK_NONE, GST_TYPE_SHM_SRC); static void gst_shm_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec);
View file
gst-plugins-bad-1.18.6.tar.xz/sys/shm/gstshmsrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/shm/gstshmsrc.h
Changed
@@ -65,6 +65,8 @@ GType gst_shm_src_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (shmsrc); + struct _GstShmPipe { int use_count;
View file
gst-plugins-bad-1.18.6.tar.xz/sys/shm/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/shm/meson.build
Changed
@@ -6,12 +6,14 @@ 'gstshmsink.c', ] +shm_deps = [] +shm_enabled = false if get_option('shm').disabled() subdir_done() endif -shm_deps = [] -if ['darwin', 'ios'].contains(host_system) or host_system.endswith('bsd') +# NetBSD has shm_* in librt +if ['darwin', 'ios', 'freebsd', 'openbsd'].contains(host_system) rt_dep = [] shm_enabled = true else @@ -34,7 +36,7 @@ shm_sources, c_args : gst_plugins_bad_args + ['-DSHM_PIPE_USE_GLIB'], include_directories : [configinc], - dependencies : [gstbase_dep, rt_dep], + dependencies : [gstbase_dep, rt_dep] + network_deps, install : true, install_dir : plugins_install_dir, )
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264.c -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264.c
Changed
@@ -28,26 +28,19 @@ #include <gst/gst.h> #include "gstuvch264_mjpgdemux.h" #include "gstuvch264_src.h" - -extern GType gst_uvc_h264_device_provider_get_type (); +#include "gstuvch264deviceprovider.h" static gboolean plugin_init (GstPlugin * plugin) { - if (!gst_element_register (plugin, "uvch264mjpgdemux", GST_RANK_NONE, - GST_TYPE_UVC_H264_MJPG_DEMUX)) - return FALSE; - - if (!gst_element_register (plugin, "uvch264src", GST_RANK_NONE, - GST_TYPE_UVC_H264_SRC)) - return FALSE; + gboolean ret = FALSE; - if (!gst_device_provider_register (plugin, "uvch264deviceprovider", - GST_RANK_PRIMARY, gst_uvc_h264_device_provider_get_type ())) - return FALSE; + ret |= GST_ELEMENT_REGISTER (uvch264mjpgdemux, plugin); + ret |= GST_ELEMENT_REGISTER (uvch264src, plugin); + ret |= GST_DEVICE_PROVIDER_REGISTER (uvch264deviceprovider, plugin); - return TRUE; + return ret; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264_mjpgdemux.c -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264_mjpgdemux.c
Changed
@@ -130,6 +130,8 @@ #define gst_uvc_h264_mjpg_demux_parent_class parent_class G_DEFINE_TYPE (GstUvcH264MjpgDemux, gst_uvc_h264_mjpg_demux, GST_TYPE_ELEMENT); +GST_ELEMENT_REGISTER_DEFINE (uvch264mjpgdemux, "uvch264mjpgdemux", + GST_RANK_NONE, GST_TYPE_UVC_H264_MJPG_DEMUX); static void gst_uvc_h264_mjpg_demux_class_init (GstUvcH264MjpgDemuxClass * klass)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264_mjpgdemux.h -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264_mjpgdemux.h
Changed
@@ -104,6 +104,8 @@ GType gst_uvc_h264_mjpg_demux_get_type (void); +GST_ELEMENT_REGISTER_DECLARE (uvch264mjpgdemux); + G_END_DECLS #endif /* __GST_UVC_H264_MJPG_DEMUX_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264_src.c -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264_src.c
Changed
@@ -123,6 +123,8 @@ #define gst_uvc_h264_src_parent_class parent_class G_DEFINE_TYPE (GstUvcH264Src, gst_uvc_h264_src, GST_TYPE_BASE_CAMERA_SRC); +GST_ELEMENT_REGISTER_DEFINE (uvch264src, "uvch264src", GST_RANK_NONE, + GST_TYPE_UVC_H264_SRC); #define GST_UVC_H264_SRC_VF_CAPS_STR \ GST_VIDEO_CAPS_MAKE (GST_VIDEO_FORMATS_ALL) ";" \ @@ -2814,8 +2816,8 @@ } if (!gst_element_link (self->v4l2_src, tee)) goto error_remove_all; - vf_pad = gst_element_get_request_pad (tee, "src_%u"); - vid_pad = gst_element_get_request_pad (tee, "src_%u"); + vf_pad = gst_element_request_pad_simple (tee, "src_%u"); + vid_pad = gst_element_request_pad_simple (tee, "src_%u"); } break; }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264_src.h -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264_src.h
Changed
@@ -158,6 +158,8 @@ GstBaseCameraSrcClass parent; }; +GST_ELEMENT_REGISTER_DECLARE (uvch264src); + G_END_DECLS #endif /* __GST_UVC_H264_SRC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264deviceprovider.c -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264deviceprovider.c
Changed
@@ -54,6 +54,9 @@ G_DEFINE_TYPE (GstUvcH264Device, gst_uvc_h264_device, GST_TYPE_DEVICE); /* *INDENT-ON* */ +GST_DEVICE_PROVIDER_REGISTER_DEFINE (uvch264deviceprovider, + "uvch264deviceprovider", GST_RANK_PRIMARY, + gst_uvc_h264_device_provider_get_type ()); static void gst_uvc_h264_device_get_property (GObject * object, guint prop_id,
View file
gst-plugins-bad-1.18.6.tar.xz/sys/uvch264/gstuvch264deviceprovider.h -> gst-plugins-bad-1.20.1.tar.xz/sys/uvch264/gstuvch264deviceprovider.h
Changed
@@ -26,5 +26,5 @@ G_DECLARE_FINAL_TYPE (GstUvcH264Device, gst_uvc_h264_device, GST, UVC_H264_DEVICE, GstDevice) G_DECLARE_FINAL_TYPE (GstUvcH264DeviceProvider, gst_uvc_h264_device_provider, GST, UVC_H264_DEVICE_PROVIDER, GstDeviceProvider) - +GST_DEVICE_PROVIDER_REGISTER_DECLARE (uvch264deviceprovider); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2codecallocator.c -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecallocator.c
Changed
@@ -90,7 +90,8 @@ buf->index = index; buf->num_mems = num_mems; for (i = 0; i < buf->num_mems; i++) { - GstMemory *mem = gst_dmabuf_allocator_alloc (allocator, fds[i], sizes[i]); + GstMemory *mem = gst_fd_allocator_alloc (allocator, fds[i], sizes[i], + GST_FD_MEMORY_FLAG_KEEP_MAPPED); gst_memory_resize (mem, offsets[i], sizes[i] - offsets[i]); GST_MINI_OBJECT (mem)->dispose = gst_v4l2_codec_allocator_release; @@ -212,7 +213,10 @@ while ((buf = g_queue_pop_head (&self->pool))) gst_v4l2_codec_buffer_free (buf); - gst_clear_object (&self->decoder); + if (self->decoder) { + gst_v4l2_codec_allocator_detach (self); + gst_clear_object (&self->decoder); + } G_OBJECT_CLASS (gst_v4l2_codec_allocator_parent_class)->dispose (object); } @@ -222,9 +226,6 @@ { GstV4l2CodecAllocator *self = GST_V4L2_CODEC_ALLOCATOR (object); - if (!self->detached) - gst_v4l2_decoder_request_buffers (self->decoder, self->direction, 0); - g_cond_clear (&self->buffer_cond); G_OBJECT_CLASS (gst_v4l2_codec_allocator_parent_class)->finalize (object);
View file
gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecalphadecodebin.c
Added
@@ -0,0 +1,231 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Daniel Almeida <daniel.almeida@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/pbutils/pbutils.h> + +#include "gstv4l2codecalphadecodebin.h" +#include "gstv4l2decoder.h" + +GST_DEBUG_CATEGORY_STATIC (v4l2_codecalphadecodebin_debug); +#define GST_CAT_DEFAULT (v4l2_codecalphadecodebin_debug) + +typedef struct +{ + GstBin parent; + + gboolean constructed; + const gchar *missing_element; +} GstV4l2CodecAlphaDecodeBinPrivate; + +#define gst_v4l2_codec_alpha_decode_bin_parent_class parent_class +G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstV4l2CodecAlphaDecodeBin, + gst_v4l2_codec_alpha_decode_bin, GST_TYPE_BIN, + G_ADD_PRIVATE (GstV4l2CodecAlphaDecodeBin); + GST_DEBUG_CATEGORY_INIT (v4l2_codecalphadecodebin_debug, + "v4l2codecs-alphadecodebin", 0, "V4L2 stateless alpha decode bin")); + + +static GstStaticPadTemplate gst_alpha_decode_bin_src_template = +GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("ANY") + ); + +static gboolean +gst_v4l2_codec_alpha_decode_bin_open (GstV4l2CodecAlphaDecodeBin * self) +{ + GstV4l2CodecAlphaDecodeBinPrivate *priv = + 
gst_v4l2_codec_alpha_decode_bin_get_instance_private (self); + + if (priv->missing_element) { + gst_element_post_message (GST_ELEMENT (self), + gst_missing_element_message_new (GST_ELEMENT (self), + priv->missing_element)); + } else if (!priv->constructed) { + GST_ELEMENT_ERROR (self, CORE, FAILED, + ("Failed to construct alpha decoder pipeline."), (NULL)); + } + + return priv->constructed; +} + +static GstStateChangeReturn +gst_v4l2_codec_alpha_decode_bin_change_state (GstElement * element, + GstStateChange transition) +{ + GstV4l2CodecAlphaDecodeBin *self = GST_V4L2_CODEC_ALPHA_DECODE_BIN (element); + + switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + if (!gst_v4l2_codec_alpha_decode_bin_open (self)) + return GST_STATE_CHANGE_FAILURE; + break; + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + +static void +gst_v4l2_codec_alpha_decode_bin_constructed (GObject * obj) +{ + GstV4l2CodecAlphaDecodeBin *self = GST_V4L2_CODEC_ALPHA_DECODE_BIN (obj); + GstV4l2CodecAlphaDecodeBinPrivate *priv = + gst_v4l2_codec_alpha_decode_bin_get_instance_private (self); + GstV4l2CodecAlphaDecodeBinClass *klass = + GST_V4L2_CODEC_ALPHA_DECODE_BIN_GET_CLASS (self); + GstPad *src_gpad, *sink_gpad; + GstPad *src_pad = NULL, *sink_pad = NULL; + GstElement *alphademux = NULL; + GstElement *queue = NULL; + GstElement *alpha_queue = NULL; + GstElement *decoder = NULL; + GstElement *alpha_decoder = NULL; + GstElement *alphacombine = NULL; + + /* setup ghost pads */ + sink_gpad = gst_ghost_pad_new_no_target_from_template ("sink", + gst_element_class_get_pad_template (GST_ELEMENT_CLASS (klass), "sink")); + gst_element_add_pad (GST_ELEMENT (self), sink_gpad); + + src_gpad = gst_ghost_pad_new_no_target_from_template ("src", + gst_element_class_get_pad_template (GST_ELEMENT_CLASS (klass), "src")); + gst_element_add_pad (GST_ELEMENT (self), src_gpad); + + /* create elements */ + alphademux = gst_element_factory_make 
("codecalphademux", NULL); + if (!alphademux) { + priv->missing_element = "codecalphademux"; + goto cleanup; + } + + queue = gst_element_factory_make ("queue", NULL); + alpha_queue = gst_element_factory_make ("queue", NULL); + if (!queue || !alpha_queue) { + priv->missing_element = "queue"; + goto cleanup; + } + + decoder = gst_element_factory_make (klass->decoder_name, "maindec"); + if (!decoder) { + priv->missing_element = klass->decoder_name; + goto cleanup; + } + + alpha_decoder = gst_element_factory_make (klass->decoder_name, "alphadec"); + if (!alpha_decoder) { + priv->missing_element = klass->decoder_name; + goto cleanup; + } + + /* We disable QoS on decoders because we need to maintain frame pairing in + * order for alphacombine to work. */ + g_object_set (decoder, "qos", FALSE, NULL); + g_object_set (alpha_decoder, "qos", FALSE, NULL); + + alphacombine = gst_element_factory_make ("alphacombine", NULL); + if (!alphacombine) { + priv->missing_element = "alphacombine"; + goto cleanup; + } + + gst_bin_add_many (GST_BIN (self), alphademux, queue, alpha_queue, decoder, + alpha_decoder, alphacombine, NULL); + + /* link elements */ + sink_pad = gst_element_get_static_pad (alphademux, "sink"); + gst_ghost_pad_set_target (GST_GHOST_PAD (sink_gpad), sink_pad); + gst_clear_object (&sink_pad); + + gst_element_link_pads (alphademux, "src", queue, "sink"); + gst_element_link_pads (queue, "src", decoder, "sink"); + gst_element_link_pads (decoder, "src", alphacombine, "sink"); + + gst_element_link_pads (alphademux, "alpha", alpha_queue, "sink"); + gst_element_link_pads (alpha_queue, "src", alpha_decoder, "sink"); + gst_element_link_pads (alpha_decoder, "src", alphacombine, "alpha"); + + src_pad = gst_element_get_static_pad (alphacombine, "src"); + gst_ghost_pad_set_target (GST_GHOST_PAD (src_gpad), src_pad); + gst_object_unref (src_pad); + + g_object_set (queue, "max-size-bytes", 0, "max-size-time", 0, + "max-size-buffers", 1, NULL); + g_object_set (alpha_queue, 
"max-size-bytes", 0, "max-size-time", 0, + "max-size-buffers", 1, NULL); + + /* signal success, we will handle this in NULL->READY transition */ + priv->constructed = TRUE; + return; + +cleanup: + gst_clear_object (&alphademux); + gst_clear_object (&queue); + gst_clear_object (&alpha_queue); + gst_clear_object (&decoder); + gst_clear_object (&alpha_decoder); + gst_clear_object (&alphacombine); + + G_OBJECT_CLASS (parent_class)->constructed (obj); +} + +static void +gst_v4l2_codec_alpha_decode_bin_class_init (GstV4l2CodecAlphaDecodeBinClass * + klass) +{ + GstElementClass *element_class = (GstElementClass *) klass; + GObjectClass *obj_class = (GObjectClass *) klass; + + /* This is needed to access the subclass class instance, otherwise we cannot + * read the class parameters */ + obj_class->constructed = gst_v4l2_codec_alpha_decode_bin_constructed; + + gst_element_class_add_static_pad_template (element_class, + &gst_alpha_decode_bin_src_template); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_alpha_decode_bin_change_state); + + /* let's make the doc generator happy */ + gst_type_mark_as_plugin_api (GST_TYPE_V4L2_CODEC_ALPHA_DECODE_BIN, 0); +} + +static void +gst_v4l2_codec_alpha_decode_bin_init (GstV4l2CodecAlphaDecodeBin * self) +{ +} + +void +gst_v4l2_codec_alpha_decode_bin_register (GstPlugin * plugin, + GClassInitFunc class_init, gconstpointer class_data, + const gchar * element_name_tmpl, GstV4l2CodecDevice * device, guint rank) +{ + /* TODO check that we have compatible src format */ + + gst_v4l2_decoder_register (plugin, + GST_TYPE_V4L2_CODEC_ALPHA_DECODE_BIN, class_init, class_data, NULL, + element_name_tmpl, device, + rank + GST_V4L2_CODEC_ALPHA_DECODE_BIN_RANK_OFFSET, NULL); +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecalphadecodebin.h
Added
@@ -0,0 +1,57 @@ +/* GStreamer + * Copyright (C) <2021> Collabora Ltd. + * Author: Daniel Almeida <daniel.almeida@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_V4L2_CODEC_ALPHA_DECODE_BIN_H__ +#define __GST_V4L2_CODEC_ALPHA_DECODE_BIN_H__ + +#include <gst/gst.h> +#include <gstv4l2decoder.h> + +/* When wrapping, use the original rank plus this offset. The ad-hoc rules is + * that hardware implementation will use PRIMARY+1 or +2 to override the + * software decoder, so the offset must be large enough to jump over those. + * This should also be small enough so that a marginal (64) or secondary + * wrapper does not cross the PRIMARY line. 
+ */ +#define GST_V4L2_CODEC_ALPHA_DECODE_BIN_RANK_OFFSET 10 + +G_BEGIN_DECLS + +#define GST_TYPE_V4L2_CODEC_ALPHA_DECODE_BIN (gst_v4l2_codec_alpha_decode_bin_get_type()) +G_DECLARE_DERIVABLE_TYPE (GstV4l2CodecAlphaDecodeBin, + gst_v4l2_codec_alpha_decode_bin, GST, V4L2_CODEC_ALPHA_DECODE_BIN, GstBin); + +struct _GstV4l2CodecAlphaDecodeBinClass +{ + GstBinClass parent_class; + gchar *decoder_name; +}; + +void gst_v4l2_codec_alpha_decode_bin_register (GstPlugin * plugin, + GClassInitFunc class_init, + gconstpointer class_data, + const gchar * element_name_tmpl, + GstV4l2CodecDevice * device, + guint rank); + + + +G_END_DECLS +#endif /* __GST_V4L2_CODEC_ALPHA_DECODE_BIN_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2codech264dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codech264dec.c
Changed
@@ -24,7 +24,14 @@ #include "gstv4l2codecallocator.h" #include "gstv4l2codech264dec.h" #include "gstv4l2codecpool.h" -#include "linux/h264-ctrls.h" +#include "gstv4l2format.h" +#include "linux/v4l2-controls.h" + +#define KERNEL_VERSION(a,b,c) (((a) << 16) + ((b) << 8) + (c)) + +#define V4L2_MIN_KERNEL_VER_MAJOR 5 +#define V4L2_MIN_KERNEL_VER_MINOR 11 +#define V4L2_MIN_KERNEL_VERSION KERNEL_VERSION(V4L2_MIN_KERNEL_VER_MAJOR, V4L2_MIN_KERNEL_VER_MINOR, 0) GST_DEBUG_CATEGORY_STATIC (v4l2_h264dec_debug); #define GST_CAT_DEFAULT v4l2_h264dec_debug @@ -46,7 +53,7 @@ static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SRC_NAME, GST_PAD_SRC, GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ NV12, YUY2, NV12_32L32 }"))); + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (GST_V4L2_DEFAULT_VIDEO_FORMATS))); struct _GstV4l2CodecH264Dec { @@ -60,6 +67,8 @@ gint coded_height; guint bitdepth; guint chroma_format_idc; + guint num_slices; + gboolean first_slice; GstV4l2CodecAllocator *sink_allocator; GstV4l2CodecAllocator *src_allocator; @@ -67,57 +76,124 @@ gint min_pool_size; gboolean has_videometa; gboolean need_negotiation; + gboolean interlaced; + gboolean need_sequence; gboolean copy_frames; + gboolean scaling_matrix_present; struct v4l2_ctrl_h264_sps sps; struct v4l2_ctrl_h264_pps pps; struct v4l2_ctrl_h264_scaling_matrix scaling_matrix; struct v4l2_ctrl_h264_decode_params decode_params; + struct v4l2_ctrl_h264_pred_weights pred_weight; GArray *slice_params; - enum v4l2_mpeg_video_h264_decode_mode decode_mode; - enum v4l2_mpeg_video_h264_start_code start_code; + enum v4l2_stateless_h264_decode_mode decode_mode; + enum v4l2_stateless_h264_start_code start_code; GstMemory *bitstream; GstMapInfo bitstream_map; }; -G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstV4l2CodecH264Dec, - gst_v4l2_codec_h264_dec, GST_TYPE_H264_DECODER, - GST_DEBUG_CATEGORY_INIT (v4l2_h264dec_debug, "v4l2codecs-h264dec", 0, - "V4L2 stateless h264 decoder")); +G_DEFINE_ABSTRACT_TYPE 
(GstV4l2CodecH264Dec, gst_v4l2_codec_h264_dec, + GST_TYPE_H264_DECODER); + #define parent_class gst_v4l2_codec_h264_dec_parent_class static gboolean is_frame_based (GstV4l2CodecH264Dec * self) { - return self->decode_mode == V4L2_MPEG_VIDEO_H264_DECODE_MODE_FRAME_BASED; + return self->decode_mode == V4L2_STATELESS_H264_DECODE_MODE_FRAME_BASED; } static gboolean is_slice_based (GstV4l2CodecH264Dec * self) { - return self->decode_mode == V4L2_MPEG_VIDEO_H264_DECODE_MODE_SLICE_BASED; + return self->decode_mode == V4L2_STATELESS_H264_DECODE_MODE_SLICE_BASED; } static gboolean needs_start_codes (GstV4l2CodecH264Dec * self) { - return self->start_code == V4L2_MPEG_VIDEO_H264_START_CODE_ANNEX_B; + return self->start_code == V4L2_STATELESS_H264_START_CODE_ANNEX_B; } +static gboolean +gst_v4l2_decoder_h264_api_check (GstV4l2Decoder * decoder) +{ + guint i, ret_size; + /* *INDENT-OFF* */ + #define SET_ID(cid) .id = (cid), .name = #cid + struct + { + const gchar *name; + unsigned int id; + unsigned int size; + gboolean optional; + } controls[] = { + { + SET_ID (V4L2_CID_STATELESS_H264_SPS), + .size = sizeof(struct v4l2_ctrl_h264_sps), + }, { + SET_ID (V4L2_CID_STATELESS_H264_PPS), + .size = sizeof(struct v4l2_ctrl_h264_pps), + }, { + SET_ID (V4L2_CID_STATELESS_H264_SCALING_MATRIX), + .size = sizeof(struct v4l2_ctrl_h264_scaling_matrix), + .optional = TRUE, + }, { + SET_ID (V4L2_CID_STATELESS_H264_DECODE_PARAMS), + .size = sizeof(struct v4l2_ctrl_h264_decode_params), + }, { + SET_ID (V4L2_CID_STATELESS_H264_SLICE_PARAMS), + .size = sizeof(struct v4l2_ctrl_h264_slice_params), + .optional = TRUE, + }, { + SET_ID (V4L2_CID_STATELESS_H264_PRED_WEIGHTS), + .size = sizeof(struct v4l2_ctrl_h264_pred_weights), + .optional = TRUE, + } + }; + #undef SET_ID + /* *INDENT-ON* */ + + /* + * Compatibility check: make sure the pointer controls are + * the right size. 
+ */ + for (i = 0; i < G_N_ELEMENTS (controls); i++) { + gboolean control_found; + + control_found = gst_v4l2_decoder_query_control_size (decoder, + controls[i].id, &ret_size); + + if (!controls[i].optional && !control_found) { + GST_WARNING ("Driver is missing %s support.", controls[i].name); + return FALSE; + } + + if (control_found && ret_size != controls[i].size) { + GST_WARNING ("%s control size mismatch: got %d bytes but %d expected.", + controls[i].name, ret_size, controls[i].size); + return FALSE; + } + } + + return TRUE; +} static gboolean gst_v4l2_codec_h264_dec_open (GstVideoDecoder * decoder) { GstV4l2CodecH264Dec *self = GST_V4L2_CODEC_H264_DEC (decoder); + /* *INDENT-OFF* */ struct v4l2_ext_control control[] = { { - .id = V4L2_CID_MPEG_VIDEO_H264_DECODE_MODE, + .id = V4L2_CID_STATELESS_H264_DECODE_MODE, }, { - .id = V4L2_CID_MPEG_VIDEO_H264_START_CODE, + .id = V4L2_CID_STATELESS_H264_START_CODE, }, }; /* *INDENT-ON* */ @@ -229,7 +305,7 @@ /* *INDENT-OFF* */ struct v4l2_ext_control control[] = { { - .id = V4L2_CID_MPEG_VIDEO_H264_SPS, + .id = V4L2_CID_STATELESS_H264_SPS, .ptr = &self->sps, .size = sizeof (self->sps), }, @@ -244,11 +320,11 @@ GST_DEBUG_OBJECT (self, "Negotiate"); - gst_v4l2_codec_h264_dec_reset_allocation (self); - gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SINK); gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SRC); + gst_v4l2_codec_h264_dec_reset_allocation (self); + if (!gst_v4l2_decoder_set_sink_fmt (self->decoder, V4L2_PIX_FMT_H264_SLICE, self->coded_width, self->coded_height, get_pixel_bitdepth (self))) { GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, @@ -295,6 +371,9 @@ self->vinfo.finfo->format, self->display_width, self->display_height, h264dec->input_state); + if (self->interlaced) + self->output_state->info.interlace_mode = GST_VIDEO_INTERLACE_MODE_MIXED; + self->output_state->caps = gst_video_info_to_caps (&self->output_state->info); if (GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder)) { @@ -323,7 +402,7 
@@ GstQuery * query) { GstV4l2CodecH264Dec *self = GST_V4L2_CODEC_H264_DEC (decoder); - guint min = 0; + guint min = 0, num_bitstream; self->has_videometa = gst_query_find_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); @@ -336,10 +415,26 @@ min = MAX (2, min); + num_bitstream = 1 + + MAX (1, gst_v4l2_decoder_get_render_delay (self->decoder)); + self->sink_allocator = gst_v4l2_codec_allocator_new (self->decoder, - GST_PAD_SINK, self->min_pool_size + 2); + GST_PAD_SINK, num_bitstream); + if (!self->sink_allocator) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to allocate sink buffers."), (NULL)); + return FALSE; + } + self->src_allocator = gst_v4l2_codec_allocator_new (self->decoder, GST_PAD_SRC, self->min_pool_size + min + 4); + if (!self->src_allocator) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to allocate source buffers."), (NULL)); + g_clear_object (&self->sink_allocator); + return FALSE; + } + self->src_pool = gst_v4l2_codec_pool_new (self->src_allocator, &self->vinfo); /* Our buffer pool is internal, we will let the base class create a video @@ -413,7 +508,7 @@ | (pps->constrained_intra_pred_flag ? V4L2_H264_PPS_FLAG_CONSTRAINED_INTRA_PRED : 0) | (pps->redundant_pic_cnt_present_flag ? V4L2_H264_PPS_FLAG_REDUNDANT_PIC_CNT_PRESENT : 0) | (pps->transform_8x8_mode_flag ? V4L2_H264_PPS_FLAG_TRANSFORM_8X8_MODE : 0) - | (pps->pic_scaling_matrix_present_flag ? V4L2_H264_PPS_FLAG_PIC_SCALING_MATRIX_PRESENT : 0), + | (self->scaling_matrix_present ? 
V4L2_H264_PPS_FLAG_SCALING_MATRIX_PRESENT : 0), }; /* *INDENT-ON* */ } @@ -446,45 +541,168 @@ GstH264SliceHdr * slice_hdr, GstH264Picture * picture, GstH264Dpb * dpb) { GArray *refs = gst_h264_dpb_get_pictures_all (dpb); - gint i; + gint i, entry_id = 0; /* *INDENT-OFF* */ self->decode_params = (struct v4l2_ctrl_h264_decode_params) { - .num_slices = 0, /* will be incremented as we receive slices */ .nal_ref_idc = picture->nal_ref_idc, - .top_field_order_cnt = picture->top_field_order_cnt, - .bottom_field_order_cnt = picture->bottom_field_order_cnt, - .flags = picture->idr ? V4L2_H264_DECODE_PARAM_FLAG_IDR_PIC : 0, + .frame_num = slice_hdr->frame_num, + .idr_pic_id = slice_hdr->idr_pic_id, + .pic_order_cnt_lsb = slice_hdr->pic_order_cnt_lsb, + .delta_pic_order_cnt_bottom = slice_hdr->delta_pic_order_cnt_bottom, + .delta_pic_order_cnt0 = slice_hdr->delta_pic_order_cnt[0], + .delta_pic_order_cnt1 = slice_hdr->delta_pic_order_cnt[1], + .dec_ref_pic_marking_bit_size = slice_hdr->dec_ref_pic_marking.bit_size, + .pic_order_cnt_bit_size = slice_hdr->pic_order_cnt_bit_size, + .slice_group_change_cycle = slice_hdr->slice_group_change_cycle, + .flags = (picture->idr ? V4L2_H264_DECODE_PARAM_FLAG_IDR_PIC : 0) | + (slice_hdr->field_pic_flag ? V4L2_H264_DECODE_PARAM_FLAG_FIELD_PIC : 0) | + (slice_hdr->bottom_field_flag ? 
V4L2_H264_DECODE_PARAM_FLAG_BOTTOM_FIELD : 0), }; + switch (picture->field) { + case GST_H264_PICTURE_FIELD_FRAME: + self->decode_params.top_field_order_cnt = picture->top_field_order_cnt; + self->decode_params.bottom_field_order_cnt = + picture->bottom_field_order_cnt; + break; + case GST_H264_PICTURE_FIELD_TOP_FIELD: + self->decode_params.top_field_order_cnt = picture->top_field_order_cnt; + self->decode_params.bottom_field_order_cnt = 0; + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + self->decode_params.top_field_order_cnt = 0; + self->decode_params.bottom_field_order_cnt = + picture->bottom_field_order_cnt; + break; + } + for (i = 0; i < refs->len; i++) { GstH264Picture *ref_pic = g_array_index (refs, GstH264Picture *, i); gint pic_num = ref_pic->pic_num; + gint frame_num = ref_pic->frame_num; + struct v4l2_h264_dpb_entry *entry; + + /* Skip non-reference as they are not useful to decoding */ + if (!GST_H264_PICTURE_IS_REF (ref_pic)) + continue; + + /* The second field picture will be handled differently */ + if (ref_pic->second_field) + continue; + + /* V4L2 uAPI uses pic_num for both PicNum and LongTermPicNum, and + * frame_num for both FrameNum and LongTermFrameIdx */ + if (GST_H264_PICTURE_IS_LONG_TERM_REF (ref_pic)) { + pic_num = ref_pic->long_term_pic_num; + frame_num = ref_pic->long_term_frame_idx; + } - /* Unwrap pic_num */ - if (pic_num < 0) - pic_num += slice_hdr->max_pic_num; - - self->decode_params.dpb[i] = (struct v4l2_h264_dpb_entry) { + entry = &self->decode_params.dpb[entry_id++]; + *entry = (struct v4l2_h264_dpb_entry) { /* - * The reference is multiplied by 1000 because it's wassed as micro + * The reference is multiplied by 1000 because it's was set as micro * seconds and this TS is nanosecond. 
*/ .reference_ts = (guint64) ref_pic->system_frame_number * 1000, - .frame_num = ref_pic->frame_num, + .frame_num = frame_num, .pic_num = pic_num, - .top_field_order_cnt = ref_pic->pic_order_cnt, - .bottom_field_order_cnt = ref_pic->bottom_field_order_cnt, .flags = V4L2_H264_DPB_ENTRY_FLAG_VALID - | (ref_pic->ref ? V4L2_H264_DPB_ENTRY_FLAG_ACTIVE : 0) - | (ref_pic->long_term ? V4L2_H264_DPB_ENTRY_FLAG_LONG_TERM : 0), + | (GST_H264_PICTURE_IS_REF (ref_pic) ? V4L2_H264_DPB_ENTRY_FLAG_ACTIVE : 0) + | (GST_H264_PICTURE_IS_LONG_TERM_REF (ref_pic) ? V4L2_H264_DPB_ENTRY_FLAG_LONG_TERM : 0), }; + + switch (ref_pic->field) { + case GST_H264_PICTURE_FIELD_FRAME: + entry->top_field_order_cnt = ref_pic->top_field_order_cnt; + entry->bottom_field_order_cnt = ref_pic->bottom_field_order_cnt; + entry->fields = V4L2_H264_FRAME_REF; + break; + case GST_H264_PICTURE_FIELD_TOP_FIELD: + entry->top_field_order_cnt = ref_pic->top_field_order_cnt; + entry->fields = V4L2_H264_TOP_FIELD_REF; + + if (ref_pic->other_field) { + entry->bottom_field_order_cnt = + ref_pic->other_field->bottom_field_order_cnt; + entry->fields |= V4L2_H264_BOTTOM_FIELD_REF; + } else { + entry->flags |= V4L2_H264_DPB_ENTRY_FLAG_FIELD; + } + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + entry->bottom_field_order_cnt = ref_pic->bottom_field_order_cnt; + entry->fields = V4L2_H264_BOTTOM_FIELD_REF; + + if (ref_pic->other_field) { + entry->top_field_order_cnt = + ref_pic->other_field->top_field_order_cnt; + entry->fields |= V4L2_H264_TOP_FIELD_REF; + } else { + entry->flags |= V4L2_H264_DPB_ENTRY_FLAG_FIELD; + } + break; + } } /* *INDENT-ON* */ g_array_unref (refs); } +static void +gst_v4l2_codec_h264_dec_fill_pred_weight (GstV4l2CodecH264Dec * self, + GstH264SliceHdr * slice_hdr) +{ + gint i, j; + + /* *INDENT-OFF* */ + self->pred_weight = (struct v4l2_ctrl_h264_pred_weights) { + .luma_log2_weight_denom = slice_hdr->pred_weight_table.luma_log2_weight_denom, + .chroma_log2_weight_denom = 
slice_hdr->pred_weight_table.chroma_log2_weight_denom, + }; + /* *INDENT-ON* */ + + for (i = 0; i <= slice_hdr->num_ref_idx_l0_active_minus1; i++) { + self->pred_weight.weight_factors[0].luma_weight[i] = + slice_hdr->pred_weight_table.luma_weight_l0[i]; + self->pred_weight.weight_factors[0].luma_offset[i] = + slice_hdr->pred_weight_table.luma_offset_l0[i]; + } + + if (slice_hdr->pps->sequence->chroma_array_type != 0) { + for (i = 0; i <= slice_hdr->num_ref_idx_l0_active_minus1; i++) { + for (j = 0; j < 2; j++) { + self->pred_weight.weight_factors[0].chroma_weight[i][j] = + slice_hdr->pred_weight_table.chroma_weight_l0[i][j]; + self->pred_weight.weight_factors[0].chroma_offset[i][j] = + slice_hdr->pred_weight_table.chroma_offset_l0[i][j]; + } + } + } + + /* Skip l1 if this is not a B-Frames. */ + if (slice_hdr->type % 5 != GST_H264_B_SLICE) + return; + + for (i = 0; i <= slice_hdr->num_ref_idx_l1_active_minus1; i++) { + self->pred_weight.weight_factors[1].luma_weight[i] = + slice_hdr->pred_weight_table.luma_weight_l1[i]; + self->pred_weight.weight_factors[1].luma_offset[i] = + slice_hdr->pred_weight_table.luma_offset_l1[i]; + } + + if (slice_hdr->pps->sequence->chroma_array_type != 0) { + for (i = 0; i <= slice_hdr->num_ref_idx_l1_active_minus1; i++) { + for (j = 0; j < 2; j++) { + self->pred_weight.weight_factors[1].chroma_weight[i][j] = + slice_hdr->pred_weight_table.chroma_weight_l1[i][j]; + self->pred_weight.weight_factors[1].chroma_offset[i][j] = + slice_hdr->pred_weight_table.chroma_offset_l1[i][j]; + } + } + } +} + static guint get_slice_header_bit_size (GstH264Slice * slice) { @@ -496,13 +714,12 @@ gst_v4l2_codec_h264_dec_fill_slice_params (GstV4l2CodecH264Dec * self, GstH264Slice * slice) { - gint n = self->decode_params.num_slices++; + gint n = self->num_slices++; gsize slice_size = slice->nalu.size; struct v4l2_ctrl_h264_slice_params *params; - gint i, j; /* Ensure array is large enough */ - if (self->slice_params->len < self->decode_params.num_slices) + 
if (self->slice_params->len < self->num_slices) g_array_set_size (self->slice_params, self->slice_params->len * 2); if (needs_start_codes (self)) @@ -511,26 +728,11 @@ /* *INDENT-OFF* */ params = &g_array_index (self->slice_params, struct v4l2_ctrl_h264_slice_params, n); *params = (struct v4l2_ctrl_h264_slice_params) { - .size = slice_size, - .start_byte_offset = self->bitstream_map.size, .header_bit_size = get_slice_header_bit_size (slice), .first_mb_in_slice = slice->header.first_mb_in_slice, .slice_type = slice->header.type % 5, - .pic_parameter_set_id = slice->header.pps->id, .colour_plane_id = slice->header.colour_plane_id, .redundant_pic_cnt = slice->header.redundant_pic_cnt, - .frame_num = slice->header.frame_num, - .idr_pic_id = slice->header.idr_pic_id, - .pic_order_cnt_lsb = slice->header.pic_order_cnt_lsb, - .delta_pic_order_cnt_bottom = slice->header.delta_pic_order_cnt_bottom, - .delta_pic_order_cnt0 = slice->header.delta_pic_order_cnt[0], - .delta_pic_order_cnt1 = slice->header.delta_pic_order_cnt[1], - .pred_weight_table = (struct v4l2_h264_pred_weight_table) { - .luma_log2_weight_denom = slice->header.pred_weight_table.luma_log2_weight_denom, - .chroma_log2_weight_denom = slice->header.pred_weight_table.chroma_log2_weight_denom, - }, - .dec_ref_pic_marking_bit_size = slice->header.dec_ref_pic_marking.bit_size, - .pic_order_cnt_bit_size = slice->header.pic_order_cnt_bit_size, .cabac_init_idc = slice->header.cabac_init_idc, .slice_qp_delta = slice->header.slice_qp_delta, .slice_qs_delta = slice->header.slice_qs_delta, @@ -539,54 +741,10 @@ .slice_beta_offset_div2 = slice->header.slice_beta_offset_div2, .num_ref_idx_l0_active_minus1 = slice->header.num_ref_idx_l0_active_minus1, .num_ref_idx_l1_active_minus1 = slice->header.num_ref_idx_l1_active_minus1, - .slice_group_change_cycle = slice->header.slice_group_change_cycle, - - .flags = (slice->header.field_pic_flag ? V4L2_H264_SLICE_FLAG_FIELD_PIC : 0) | - (slice->header.bottom_field_flag ? 
V4L2_H264_SLICE_FLAG_BOTTOM_FIELD : 0) | - (slice->header.direct_spatial_mv_pred_flag ? V4L2_H264_SLICE_FLAG_DIRECT_SPATIAL_MV_PRED : 0) | + .flags = (slice->header.direct_spatial_mv_pred_flag ? V4L2_H264_SLICE_FLAG_DIRECT_SPATIAL_MV_PRED : 0) | (slice->header.sp_for_switch_flag ? V4L2_H264_SLICE_FLAG_SP_FOR_SWITCH : 0), }; /* *INDENT-ON* */ - - for (i = 0; i <= slice->header.num_ref_idx_l0_active_minus1; i++) { - params->pred_weight_table.weight_factors[0].luma_weight[i] = - slice->header.pred_weight_table.luma_weight_l0[i]; - params->pred_weight_table.weight_factors[0].luma_offset[i] = - slice->header.pred_weight_table.luma_offset_l0[i]; - } - - if (slice->header.pps->sequence->chroma_array_type != 0) { - for (i = 0; i <= slice->header.num_ref_idx_l0_active_minus1; i++) { - for (j = 0; j < 2; j++) { - params->pred_weight_table.weight_factors[0].chroma_weight[i][j] = - slice->header.pred_weight_table.chroma_weight_l0[i][j]; - params->pred_weight_table.weight_factors[0].chroma_offset[i][j] = - slice->header.pred_weight_table.chroma_offset_l0[i][j]; - } - } - } - - /* Skip l1 if this is not a B-Frames. 
*/ - if (slice->header.type % 5 != GST_H264_B_SLICE) - return; - - for (i = 0; i <= slice->header.num_ref_idx_l0_active_minus1; i++) { - params->pred_weight_table.weight_factors[0].luma_weight[i] = - slice->header.pred_weight_table.luma_weight_l0[i]; - params->pred_weight_table.weight_factors[0].luma_offset[i] = - slice->header.pred_weight_table.luma_offset_l0[i]; - } - - if (slice->header.pps->sequence->chroma_array_type != 0) { - for (i = 0; i <= slice->header.num_ref_idx_l1_active_minus1; i++) { - for (j = 0; j < 2; j++) { - params->pred_weight_table.weight_factors[1].chroma_weight[i][j] = - slice->header.pred_weight_table.chroma_weight_l1[i][j]; - params->pred_weight_table.weight_factors[1].chroma_offset[i][j] = - slice->header.pred_weight_table.chroma_offset_l1[i][j]; - } - } - } } static guint8 @@ -600,6 +758,10 @@ if (!ref_pic) return 0xff; + /* DPB entries only stores first field in a merged fashion */ + if (ref_pic->second_field && ref_pic->other_field) + ref_pic = ref_pic->other_field; + ref_ts = (guint64) ref_pic->system_frame_number * 1000; for (i = 0; i < 16; i++) { if (dpb[i].flags & V4L2_H264_DPB_ENTRY_FLAG_ACTIVE @@ -610,9 +772,30 @@ return 0xff; } +static guint +_get_v4l2_fields_ref (GstH264Picture * ref_pic, gboolean merge) +{ + if (merge && ref_pic->other_field) + return V4L2_H264_FRAME_REF; + + switch (ref_pic->field) { + case GST_H264_PICTURE_FIELD_FRAME: + return V4L2_H264_FRAME_REF; + break; + case GST_H264_PICTURE_FIELD_TOP_FIELD: + return V4L2_H264_TOP_FIELD_REF; + break; + case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: + return V4L2_H264_BOTTOM_FIELD_REF; + break; + } + + return V4L2_H264_FRAME_REF; +} + static void gst_v4l2_codec_h264_dec_fill_references (GstV4l2CodecH264Dec * self, - GArray * ref_pic_list0, GArray * ref_pic_list1) + gboolean cur_is_frame, GArray * ref_pic_list0, GArray * ref_pic_list1) { struct v4l2_ctrl_h264_slice_params *slice_params; gint i; @@ -628,19 +811,23 @@ for (i = 0; i < ref_pic_list0->len; i++) { GstH264Picture 
*ref_pic = g_array_index (ref_pic_list0, GstH264Picture *, i); - slice_params->ref_pic_list0[i] = + slice_params->ref_pic_list0[i].index = lookup_dpb_index (self->decode_params.dpb, ref_pic); + slice_params->ref_pic_list0[i].fields = + _get_v4l2_fields_ref (ref_pic, cur_is_frame); } for (i = 0; i < ref_pic_list1->len; i++) { GstH264Picture *ref_pic = g_array_index (ref_pic_list1, GstH264Picture *, i); - slice_params->ref_pic_list1[i] = + slice_params->ref_pic_list1[i].index = lookup_dpb_index (self->decode_params.dpb, ref_pic); + slice_params->ref_pic_list1[i].fields = + _get_v4l2_fields_ref (ref_pic, cur_is_frame); } } -static gboolean +static GstFlowReturn gst_v4l2_codec_h264_dec_new_sequence (GstH264Decoder * decoder, const GstH264SPS * sps, gint max_dpb_size) { @@ -648,6 +835,7 @@ gint crop_width = sps->width; gint crop_height = sps->height; gboolean negotiation_needed = FALSE; + gboolean interlaced; if (self->vinfo.finfo->format == GST_VIDEO_FORMAT_UNKNOWN) negotiation_needed = TRUE; @@ -676,6 +864,14 @@ self->coded_width, self->coded_height); } + interlaced = !sps->frame_mbs_only_flag; + if (self->interlaced != interlaced) { + self->interlaced = interlaced; + + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "Interlaced mode changed to %d", interlaced); + } + if (self->bitdepth != sps->bit_depth_luma_minus8 + 8) { self->bitdepth = sps->bit_depth_luma_minus8 + 8; negotiation_needed = TRUE; @@ -690,12 +886,13 @@ } gst_v4l2_codec_h264_dec_fill_sequence (self, sps); + self->need_sequence = TRUE; if (negotiation_needed) { self->need_negotiation = TRUE; if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; } } @@ -720,7 +917,7 @@ self->copy_frames = FALSE; } - return TRUE; + return GST_FLOW_OK; } static gboolean @@ -751,7 +948,7 @@ return TRUE; } -static gboolean +static GstFlowReturn gst_v4l2_codec_h264_dec_start_picture (GstH264Decoder * 
decoder, GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb) { @@ -759,20 +956,31 @@ /* FIXME base class should not call us if negotiation failed */ if (!self->sink_allocator) - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; if (!gst_v4l2_codec_h264_dec_ensure_bitstream (self)) - return FALSE; + return GST_FLOW_ERROR; + + /* + * Scaling matrix is present if there's one provided + * by either the SPS or the PPS. This flag must be + * set to true or false, before filling the PPS V4L2 control. + */ + self->scaling_matrix_present = + slice->header.pps->sequence->scaling_matrix_present_flag || + slice->header.pps->pic_scaling_matrix_present_flag; gst_v4l2_codec_h264_dec_fill_pps (self, slice->header.pps); - gst_v4l2_codec_h264_dec_fill_scaling_matrix (self, slice->header.pps); + + if (self->scaling_matrix_present) + gst_v4l2_codec_h264_dec_fill_scaling_matrix (self, slice->header.pps); + gst_v4l2_codec_h264_dec_fill_decoder_params (self, &slice->header, picture, dpb); - if (is_frame_based (self)) - gst_v4l2_codec_h264_dec_fill_slice_params (self, slice); + self->first_slice = TRUE; - return TRUE; + return GST_FLOW_OK; } static gboolean @@ -823,24 +1031,6 @@ return FALSE; } -static gboolean -gst_v4l2_codec_h264_dec_wait (GstV4l2CodecH264Dec * self, - GstV4l2Request * request) -{ - gint ret = gst_v4l2_request_poll (request, GST_SECOND); - if (ret == 0) { - GST_ELEMENT_ERROR (self, STREAM, DECODE, - ("Decoding frame took too long"), (NULL)); - return FALSE; - } else if (ret < 0) { - GST_ELEMENT_ERROR (self, STREAM, DECODE, - ("Decoding request failed: %s", g_strerror (errno)), (NULL)); - return FALSE; - } - - return TRUE; -} - static GstFlowReturn gst_v4l2_codec_h264_dec_output_picture (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture) @@ -848,40 +1038,29 @@ GstV4l2CodecH264Dec *self = GST_V4L2_CODEC_H264_DEC (decoder); GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); GstV4l2Request *request = gst_h264_picture_get_user_data 
(picture); - guint32 frame_num; - GstH264Picture *other_pic; - GstV4l2Request *other_request; + gint ret; GST_DEBUG_OBJECT (self, "Output picture %u", picture->system_frame_number); - if (gst_v4l2_request_is_done (request)) - goto finish_frame; - - if (!gst_v4l2_codec_h264_dec_wait (self, request)) + ret = gst_v4l2_request_set_done (request); + if (ret == 0) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Decoding frame %u took too long", picture->system_frame_number), + (NULL)); + goto error; + } else if (ret < 0) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Decoding request failed: %s", g_strerror (errno)), (NULL)); goto error; - - while (TRUE) { - if (!gst_v4l2_decoder_dequeue_src (self->decoder, &frame_num)) { - GST_ELEMENT_ERROR (self, STREAM, DECODE, - ("Decoder did not produce a frame"), (NULL)); - goto error; - } - - if (frame_num == picture->system_frame_number) - break; - - other_pic = gst_h264_decoder_get_picture (decoder, frame_num); - if (other_pic) { - other_request = gst_h264_picture_get_user_data (other_pic); - gst_v4l2_request_set_done (other_request); - gst_h264_picture_unref (other_pic); - } } - -finish_frame: - gst_v4l2_request_set_done (request); g_return_val_if_fail (frame->output_buffer, GST_FLOW_ERROR); + if (gst_v4l2_request_failed (request)) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Failed to decode frame %u", picture->system_frame_number), (NULL)); + goto error; + } + /* Hold on reference buffers for the rest of the picture lifetime */ gst_h264_picture_set_user_data (picture, gst_buffer_ref (frame->output_buffer), (GDestroyNotify) gst_buffer_unref); @@ -910,7 +1089,7 @@ self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; } - self->decode_params.num_slices = 0; + self->num_slices = 0; } static gboolean @@ -934,14 +1113,6 @@ return FALSE; } - if (!gst_v4l2_decoder_queue_src_buffer (self->decoder, buffer, - frame->system_frame_number)) { - GST_ELEMENT_ERROR (self, RESOURCE, WRITE, - ("Driver did not accept the picture 
buffer."), (NULL)); - gst_buffer_unref (buffer); - return FALSE; - } - frame->output_buffer = buffer; return TRUE; } @@ -950,103 +1121,124 @@ gst_v4l2_codec_h264_dec_submit_bitstream (GstV4l2CodecH264Dec * self, GstH264Picture * picture, guint flags) { - GstVideoCodecFrame *frame; - GstV4l2Request *prev_request, *request; + GstV4l2Request *prev_request, *request = NULL; gsize bytesused; gboolean ret = FALSE; + guint count = 0; /* *INDENT-OFF* */ + /* Reserve space for controls */ struct v4l2_ext_control control[] = { - { - .id = V4L2_CID_MPEG_VIDEO_H264_SPS, - .ptr = &self->sps, - .size = sizeof (self->sps), - }, - { - .id = V4L2_CID_MPEG_VIDEO_H264_PPS, - .ptr = &self->pps, - .size = sizeof (self->pps), - }, - { - .id = V4L2_CID_MPEG_VIDEO_H264_SCALING_MATRIX, - .ptr = &self->scaling_matrix, - .size = sizeof (self->scaling_matrix), - }, - { - .id = V4L2_CID_MPEG_VIDEO_H264_SLICE_PARAMS, - .ptr = self->slice_params->data, - .size = g_array_get_element_size (self->slice_params) - * self->decode_params.num_slices, - }, - { - .id = V4L2_CID_MPEG_VIDEO_H264_DECODE_PARAMS, - .ptr = &self->decode_params, - .size = sizeof (self->decode_params), - }, + { }, /* SPS */ + { }, /* PPS */ + { }, /* DECODE_PARAMS */ + { }, /* SLICE_PARAMS */ + { }, /* SCALING_MATRIX */ + { }, /* PRED_WEIGHTS */ }; /* *INDENT-ON* */ - request = gst_v4l2_decoder_alloc_request (self->decoder); + prev_request = gst_h264_picture_get_user_data (picture); + + bytesused = self->bitstream_map.size; + gst_memory_unmap (self->bitstream, &self->bitstream_map); + self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + gst_memory_resize (self->bitstream, 0, bytesused); + + if (prev_request) { + request = gst_v4l2_decoder_alloc_sub_request (self->decoder, prev_request, + self->bitstream); + } else { + GstVideoCodecFrame *frame; + + frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), + picture->system_frame_number); + g_return_val_if_fail (frame, FALSE); + + if 
(!gst_v4l2_codec_h264_dec_ensure_output_buffer (self, frame)) + goto done; + + request = gst_v4l2_decoder_alloc_request (self->decoder, + picture->system_frame_number, self->bitstream, frame->output_buffer); + + gst_video_codec_frame_unref (frame); + } + if (!request) { GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, ("Failed to allocate a media request object."), (NULL)); goto done; } + if (self->need_sequence) { + control[count].id = V4L2_CID_STATELESS_H264_SPS; + control[count].ptr = &self->sps; + control[count].size = sizeof (self->sps); + count++; + self->need_sequence = FALSE; + } - frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), - picture->system_frame_number); - g_return_val_if_fail (frame, FALSE); - - if (!gst_v4l2_codec_h264_dec_ensure_output_buffer (self, frame)) - goto done; + if (self->first_slice) { + control[count].id = V4L2_CID_STATELESS_H264_PPS; + control[count].ptr = &self->pps; + control[count].size = sizeof (self->pps); + count++; + + if (self->scaling_matrix_present) { + control[count].id = V4L2_CID_STATELESS_H264_SCALING_MATRIX; + control[count].ptr = &self->scaling_matrix; + control[count].size = sizeof (self->scaling_matrix); + count++; + } - gst_video_codec_frame_unref (frame); + control[count].id = V4L2_CID_STATELESS_H264_DECODE_PARAMS; + control[count].ptr = &self->decode_params; + control[count].size = sizeof (self->decode_params); + count++; - if (!gst_v4l2_decoder_set_controls (self->decoder, request, control, - G_N_ELEMENTS (control))) { - GST_ELEMENT_ERROR (self, RESOURCE, WRITE, - ("Driver did not accept the bitstream parameters."), (NULL)); - goto done; + self->first_slice = FALSE; } - bytesused = self->bitstream_map.size; - gst_memory_unmap (self->bitstream, &self->bitstream_map); - self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + /* If it's not slice-based then it doesn't support per-slice controls. 
*/ + if (is_slice_based (self)) { + control[count].id = V4L2_CID_STATELESS_H264_SLICE_PARAMS; + control[count].ptr = self->slice_params->data; + control[count].size = g_array_get_element_size (self->slice_params) + * self->num_slices; + count++; + + control[count].id = V4L2_CID_STATELESS_H264_PRED_WEIGHTS; + control[count].ptr = &self->pred_weight; + control[count].size = sizeof (self->pred_weight); + count++; + } - if (!gst_v4l2_decoder_queue_sink_mem (self->decoder, request, self->bitstream, - picture->system_frame_number, bytesused, flags)) { + if (!gst_v4l2_decoder_set_controls (self->decoder, request, control, count)) { GST_ELEMENT_ERROR (self, RESOURCE, WRITE, - ("Driver did not accept the bitstream data."), (NULL)); + ("Driver did not accept the bitstream parameters."), (NULL)); goto done; } - if (!gst_v4l2_request_queue (request)) { + if (!gst_v4l2_request_queue (request, flags)) { GST_ELEMENT_ERROR (self, RESOURCE, WRITE, ("Driver did not accept the decode request."), (NULL)); goto done; } - prev_request = gst_h264_picture_get_user_data (picture); - if (prev_request) { - if (!gst_v4l2_codec_h264_dec_wait (self, prev_request)) - goto done; - gst_v4l2_request_set_done (prev_request); - } - gst_h264_picture_set_user_data (picture, g_steal_pointer (&request), - (GDestroyNotify) gst_v4l2_request_free); + (GDestroyNotify) gst_v4l2_request_unref); ret = TRUE; done: if (request) - gst_v4l2_request_free (request); + gst_v4l2_request_unref (request); + gst_v4l2_codec_h264_dec_reset_picture (self); return ret; } -static gboolean +static GstFlowReturn gst_v4l2_codec_h264_dec_decode_slice (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, GArray * ref_pic_list1) @@ -1063,12 +1255,13 @@ if (!gst_v4l2_codec_h264_dec_submit_bitstream (self, picture, V4L2_BUF_FLAG_M2M_HOLD_CAPTURE_BUF) || !gst_v4l2_codec_h264_dec_ensure_bitstream (self)) - return FALSE; + return GST_FLOW_ERROR; } gst_v4l2_codec_h264_dec_fill_slice_params 
(self, slice); - gst_v4l2_codec_h264_dec_fill_references (self, ref_pic_list0, - ref_pic_list1); + gst_v4l2_codec_h264_dec_fill_pred_weight (self, &slice->header); + gst_v4l2_codec_h264_dec_fill_references (self, + GST_H264_PICTURE_IS_FRAME (picture), ref_pic_list0, ref_pic_list1); } bitstream_data = self->bitstream_map.data + self->bitstream_map.size; @@ -1080,7 +1273,7 @@ if (self->bitstream_map.size + nal_size > self->bitstream_map.maxsize) { GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, ("Not enough space to send all slice of an H264 frame."), (NULL)); - return FALSE; + return GST_FLOW_ERROR; } if (needs_start_codes (self)) { @@ -1093,15 +1286,66 @@ slice->nalu.size); self->bitstream_map.size += nal_size; - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_v4l2_codec_h264_dec_end_picture (GstH264Decoder * decoder, GstH264Picture * picture) { GstV4l2CodecH264Dec *self = GST_V4L2_CODEC_H264_DEC (decoder); - return gst_v4l2_codec_h264_dec_submit_bitstream (self, picture, 0); + guint flags = 0; + + /* Hold on the output frame if this is first field of a pair */ + if (picture->field != GST_H264_PICTURE_FIELD_FRAME && !picture->second_field) + flags = V4L2_BUF_FLAG_M2M_HOLD_CAPTURE_BUF; + + if (!gst_v4l2_codec_h264_dec_submit_bitstream (self, picture, flags)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_v4l2_codec_h264_dec_new_field_picture (GstH264Decoder * decoder, + const GstH264Picture * first_field, GstH264Picture * second_field) +{ + GstV4l2CodecH264Dec *self = GST_V4L2_CODEC_H264_DEC (decoder); + GstV4l2Request *request = + gst_h264_picture_get_user_data ((GstH264Picture *) first_field); + + if (!request) { + GST_WARNING_OBJECT (self, + "First picture does not have an associated request"); + return GST_FLOW_OK; + } + + GST_DEBUG_OBJECT (self, "Assigned request %p to second field.", request); + + /* Associate the previous request with the new picture so that + * submit_bitstream can create 
sub-request */ + gst_h264_picture_set_user_data (second_field, gst_v4l2_request_ref (request), + (GDestroyNotify) gst_v4l2_request_unref); + + return GST_FLOW_OK; +} + +static guint +gst_v4l2_codec_h264_dec_get_preferred_output_delay (GstH264Decoder * decoder, + gboolean live) +{ + GstV4l2CodecH264Dec *self = GST_V4L2_CODEC_H264_DEC (decoder); + guint delay; + + if (live) + delay = 0; + else + /* Just one for now, perhaps we can make this configurable in the future. */ + delay = 1; + + gst_v4l2_decoder_set_render_delay (self->decoder, delay); + + return delay; } static void @@ -1260,17 +1504,54 @@ GST_DEBUG_FUNCPTR (gst_v4l2_codec_h264_dec_decode_slice); h264decoder_class->end_picture = GST_DEBUG_FUNCPTR (gst_v4l2_codec_h264_dec_end_picture); + h264decoder_class->new_field_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_h264_dec_new_field_picture); + h264decoder_class->get_preferred_output_delay = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_h264_dec_get_preferred_output_delay); klass->device = device; gst_v4l2_decoder_install_properties (gobject_class, PROP_LAST, device); } void -gst_v4l2_codec_h264_dec_register (GstPlugin * plugin, +gst_v4l2_codec_h264_dec_register (GstPlugin * plugin, GstV4l2Decoder * decoder, GstV4l2CodecDevice * device, guint rank) { - gst_v4l2_decoder_register (plugin, GST_TYPE_V4L2_CODEC_H264_DEC, + GstCaps *src_caps; + guint version; + + GST_DEBUG_CATEGORY_INIT (v4l2_h264dec_debug, "v4l2codecs-h264dec", 0, + "V4L2 stateless h264 decoder"); + + if (!gst_v4l2_decoder_set_sink_fmt (decoder, V4L2_PIX_FMT_H264_SLICE, + 320, 240, 8)) + return; + src_caps = gst_v4l2_decoder_enum_src_formats (decoder); + + if (gst_caps_is_empty (src_caps)) { + GST_WARNING ("Not registering H264 decoder since it produces no " + "supported format"); + goto done; + } + + version = gst_v4l2_decoder_get_version (decoder); + if (version < V4L2_MIN_KERNEL_VERSION) + GST_WARNING ("V4L2 API v%u.%u too old, at least v%u.%u required", + (version >> 16) & 0xff, (version >> 8) & 0xff, + 
V4L2_MIN_KERNEL_VER_MAJOR, V4L2_MIN_KERNEL_VER_MINOR); + + if (!gst_v4l2_decoder_h264_api_check (decoder)) { + GST_WARNING ("Not registering H264 decoder as it failed ABI check."); + goto done; + } + + gst_v4l2_decoder_register (plugin, + GST_TYPE_V4L2_CODEC_H264_DEC, (GClassInitFunc) gst_v4l2_codec_h264_dec_subclass_init, + gst_mini_object_ref (GST_MINI_OBJECT (device)), (GInstanceInitFunc) gst_v4l2_codec_h264_dec_subinit, - "v4l2sl%sh264dec", device, rank); + "v4l2sl%sh264dec", device, rank, NULL); + +done: + gst_caps_unref (src_caps); }
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2codech264dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codech264dec.h
Changed
@@ -45,6 +45,7 @@ GType gst_v4l2_codec_h264_dec_get_type (void); void gst_v4l2_codec_h264_dec_register (GstPlugin * plugin, + GstV4l2Decoder * decoder, GstV4l2CodecDevice * device, guint rank);
gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecmpeg2dec.c
Added
@@ -0,0 +1,1078 @@ +/* GStreamer + * Copyright (C) 2020 Daniel Almeida <daniel.almeida@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstv4l2codecallocator.h" +#include "gstv4l2codecmpeg2dec.h" +#include "gstv4l2codecpool.h" +#include "gstv4l2format.h" +#include "linux/v4l2-controls.h" + +#define KERNEL_VERSION(a,b,c) (((a) << 16) + ((b) << 8) + (c)) + +#define V4L2_MIN_KERNEL_VER_MAJOR 5 +#define V4L2_MIN_KERNEL_VER_MINOR 14 +#define V4L2_MIN_KERNEL_VERSION \ + KERNEL_VERSION(V4L2_MIN_KERNEL_VER_MAJOR, V4L2_MIN_KERNEL_VER_MINOR, 0) + +#define MPEG2_BITDEPTH 8 + +GST_DEBUG_CATEGORY_STATIC (v4l2_mpeg2dec_debug); +#define GST_CAT_DEFAULT v4l2_mpeg2dec_debug + +enum +{ + PROP_0, + PROP_LAST = PROP_0 +}; + +static GstStaticPadTemplate sink_template = +GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SINK_NAME, + GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/mpeg, " + "systemstream=(boolean) false, " + "mpegversion=(int) 2, " "profile=(string) {main, simple} ")); + +static GstStaticPadTemplate src_template = +GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SRC_NAME, + GST_PAD_SRC, GST_PAD_ALWAYS, + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (GST_V4L2_DEFAULT_VIDEO_FORMATS))); + +struct 
_GstV4l2CodecMpeg2Dec +{ + GstMpeg2Decoder parent; + + GstV4l2Decoder *decoder; + GstVideoCodecState *output_state; + GstVideoInfo vinfo; + + guint16 width; + guint16 height; + guint chroma_format; + gboolean interlaced; + GstMpegVideoProfile profile; + guint16 vbv_buffer_size; + gboolean need_sequence; + gboolean need_quantiser; + + struct v4l2_ctrl_mpeg2_sequence v4l2_sequence; + struct v4l2_ctrl_mpeg2_picture v4l2_picture; + struct v4l2_ctrl_mpeg2_quantisation v4l2_quantisation; + + GstV4l2CodecAllocator *sink_allocator; + GstV4l2CodecAllocator *src_allocator; + GstV4l2CodecPool *src_pool; + gint min_pool_size; + gboolean has_videometa; + gboolean need_negotiation; + + GstMemory *bitstream; + GstMapInfo bitstream_map; + + gboolean copy_frames; +}; + +G_DEFINE_ABSTRACT_TYPE (GstV4l2CodecMpeg2Dec, gst_v4l2_codec_mpeg2_dec, + GST_TYPE_MPEG2_DECODER); + +#define parent_class gst_v4l2_codec_mpeg2_dec_parent_class + +static guint +gst_v4l2_codec_mpeg2_dec_get_preferred_output_delay (GstMpeg2Decoder * decoder, + gboolean is_live) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + guint delay; + + if (is_live) + delay = 0; + else + /* Just one for now, perhaps we can make this configurable in the future. 
*/ + delay = 1; + + gst_v4l2_decoder_set_render_delay (self->decoder, delay); + + return delay; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_open (GstVideoDecoder * decoder) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + guint version; + + if (!gst_v4l2_decoder_open (self->decoder)) { + GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ_WRITE, + ("Failed to open mpeg2 decoder"), + ("gst_v4l2_decoder_open() failed: %s", g_strerror (errno))); + return FALSE; + } + + version = gst_v4l2_decoder_get_version (self->decoder); + if (version < V4L2_MIN_KERNEL_VERSION) { + GST_ERROR_OBJECT (self, + "V4L2 API v%u.%u too old, at least v%u.%u required", + (version >> 16) & 0xff, (version >> 8) & 0xff, + V4L2_MIN_KERNEL_VER_MAJOR, V4L2_MIN_KERNEL_VER_MINOR); + + gst_v4l2_decoder_close (self->decoder); + return FALSE; + } + + return TRUE; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_close (GstVideoDecoder * decoder) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + return gst_v4l2_decoder_close (self->decoder); +} + +static void +gst_v4l2_codec_mpeg2_dec_reset_allocation (GstV4l2CodecMpeg2Dec * self) +{ + if (self->sink_allocator) { + gst_v4l2_codec_allocator_detach (self->sink_allocator); + g_clear_object (&self->sink_allocator); + } + + if (self->src_allocator) { + gst_v4l2_codec_allocator_detach (self->src_allocator); + g_clear_object (&self->src_allocator); + g_clear_object (&self->src_pool); + } +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_stop (GstVideoDecoder * decoder) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SINK); + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SRC); + + gst_v4l2_codec_mpeg2_dec_reset_allocation (self); + + if (self->output_state) + gst_video_codec_state_unref (self->output_state); + self->output_state = NULL; + + return GST_VIDEO_DECODER_CLASS (parent_class)->stop (decoder); +} + +static gint +get_pixel_bitdepth 
(GstV4l2CodecMpeg2Dec * self) +{ + gint depth; + + switch (self->chroma_format) { + case 0: + /* 4:0:0 */ + depth = MPEG2_BITDEPTH; + break; + case 1: + /* 4:2:0 */ + depth = MPEG2_BITDEPTH + MPEG2_BITDEPTH / 2; + break; + case 2: + /* 4:2:2 */ + depth = 2 * MPEG2_BITDEPTH; + break; + case 3: + /* 4:4:4 */ + depth = 3 * MPEG2_BITDEPTH; + break; + default: + GST_WARNING_OBJECT (self, "Unsupported chroma format %i", + self->chroma_format); + depth = 0; + break; + } + + return depth; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_negotiate (GstVideoDecoder * decoder) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + GstMpeg2Decoder *mpeg2dec = GST_MPEG2_DECODER (decoder); + /* *INDENT-OFF* */ + struct v4l2_ext_control control[] = { + { + .id = V4L2_CID_STATELESS_MPEG2_SEQUENCE, + .ptr = &self->v4l2_sequence, + .size = sizeof(self->v4l2_sequence), + }, + { + .id = V4L2_CID_STATELESS_MPEG2_QUANTISATION, + .ptr = &self->v4l2_quantisation, + .size = sizeof(self->v4l2_quantisation), + }, + }; + + /* *INDENT-ON* */ + GstCaps *filter, *caps; + + /* Ignore downstream renegotiation request. 
*/ + if (!self->need_negotiation) + return TRUE; + self->need_negotiation = FALSE; + + GST_DEBUG_OBJECT (self, "Negotiate"); + + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SINK); + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SRC); + + gst_v4l2_codec_mpeg2_dec_reset_allocation (self); + + if (!gst_v4l2_decoder_set_sink_fmt (self->decoder, V4L2_PIX_FMT_MPEG2_SLICE, + self->width, self->height, get_pixel_bitdepth (self))) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("Failed to configure mpeg2 decoder"), + ("gst_v4l2_decoder_set_sink_fmt() failed: %s", g_strerror (errno))); + gst_v4l2_decoder_close (self->decoder); + return FALSE; + } + + if (!gst_v4l2_decoder_set_controls (self->decoder, NULL, control, + G_N_ELEMENTS (control))) { + GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, + ("Driver does not support the selected stream."), (NULL)); + return FALSE; + } + + filter = gst_v4l2_decoder_enum_src_formats (self->decoder); + if (!filter) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("No supported decoder output formats"), (NULL)); + return FALSE; + } + GST_DEBUG_OBJECT (self, "Supported output formats: %" GST_PTR_FORMAT, filter); + + caps = gst_pad_peer_query_caps (decoder->srcpad, filter); + gst_caps_unref (filter); + GST_DEBUG_OBJECT (self, "Peer supported formats: %" GST_PTR_FORMAT, caps); + + if (!gst_v4l2_decoder_select_src_format (self->decoder, caps, &self->vinfo)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("Unsupported bitdepth/chroma format"), + ("No support for %ux%u chroma IDC %i", self->width, + self->height, self->chroma_format)); + gst_caps_unref (caps); + return FALSE; + } + gst_caps_unref (caps); + + if (self->output_state) + gst_video_codec_state_unref (self->output_state); + + self->output_state = + gst_video_decoder_set_output_state (GST_VIDEO_DECODER (self), + self->vinfo.finfo->format, self->width, + self->height, mpeg2dec->input_state); + + if (self->interlaced) + self->output_state->info.interlace_mode = + 
GST_VIDEO_INTERLACE_MODE_INTERLEAVED; + + self->output_state->caps = gst_video_info_to_caps (&self->output_state->info); + + if (GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder)) { + if (!gst_v4l2_decoder_streamon (self->decoder, GST_PAD_SINK)) { + GST_ELEMENT_ERROR (self, RESOURCE, FAILED, + ("Could not enable the decoder driver."), + ("VIDIOC_STREAMON(SINK) failed: %s", g_strerror (errno))); + return FALSE; + } + + if (!gst_v4l2_decoder_streamon (self->decoder, GST_PAD_SRC)) { + GST_ELEMENT_ERROR (self, RESOURCE, FAILED, + ("Could not enable the decoder driver."), + ("VIDIOC_STREAMON(SRC) failed: %s", g_strerror (errno))); + return FALSE; + } + + return TRUE; + } + + return FALSE; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + guint min = 0, num_bitstream; + + self->has_videometa = gst_query_find_allocation_meta (query, + GST_VIDEO_META_API_TYPE, NULL); + + g_clear_object (&self->src_pool); + g_clear_object (&self->src_allocator); + + if (gst_query_get_n_allocation_pools (query) > 0) + gst_query_parse_nth_allocation_pool (query, 0, NULL, NULL, &min, NULL); + + min = MAX (2, min); + /* note the dpb size is fixed at 2 */ + num_bitstream = 1 + + MAX (1, gst_v4l2_decoder_get_render_delay (self->decoder)); + + self->sink_allocator = gst_v4l2_codec_allocator_new (self->decoder, + GST_PAD_SINK, num_bitstream); + self->src_allocator = gst_v4l2_codec_allocator_new (self->decoder, + GST_PAD_SRC, self->min_pool_size + min + 4); + self->src_pool = gst_v4l2_codec_pool_new (self->src_allocator, &self->vinfo); + + /* Our buffer pool is internal, we will let the base class create a video + * pool, and use it if we are running out of buffers or if downstream does + * not support GstVideoMeta */ + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + + +static GstFlowReturn 
+gst_v4l2_codec_mpeg2_dec_new_sequence (GstMpeg2Decoder * decoder, + const GstMpegVideoSequenceHdr * seq, + const GstMpegVideoSequenceExt * seq_ext, + const GstMpegVideoSequenceDisplayExt * seq_display_ext, + const GstMpegVideoSequenceScalableExt * seq_scalable_ext) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + gboolean negotiation_needed = FALSE; + gboolean interlaced; + guint16 width; + guint16 height; + guint16 vbv_buffer_size; + GstMpegVideoProfile mpeg_profile; + + GST_LOG_OBJECT (self, "New sequence"); + + interlaced = seq_ext ? !seq_ext->progressive : FALSE; + if (self->interlaced != interlaced) { + GST_INFO_OBJECT (self, "interlaced sequence change"); + self->interlaced = interlaced; + negotiation_needed = TRUE; + } + + width = seq->width; + height = seq->height; + vbv_buffer_size = seq->vbv_buffer_size_value; + if (seq_ext) { + width = (width & 0x0fff) | ((guint32) seq_ext->horiz_size_ext << 12); + height = (height & 0x0fff) | ((guint32) seq_ext->vert_size_ext << 12); + vbv_buffer_size = (vbv_buffer_size & 0x03ff) | ((guint32) + seq_ext->vbv_buffer_size_extension << 10); + } + + if (self->width != width || self->height != height) { + GST_INFO_OBJECT (self, "resolution change %dx%d -> %dx%d", + self->width, self->height, width, height); + self->width = width; + self->height = height; + negotiation_needed = TRUE; + } + + if (self->vbv_buffer_size != vbv_buffer_size) { + GST_INFO_OBJECT (self, "vbv buffer size change %d -> %d", + self->vbv_buffer_size, vbv_buffer_size); + self->vbv_buffer_size = vbv_buffer_size; + negotiation_needed = TRUE; + } + + mpeg_profile = GST_MPEG_VIDEO_PROFILE_MAIN; + if (seq_ext) + mpeg_profile = seq_ext->profile; + + if (mpeg_profile != GST_MPEG_VIDEO_PROFILE_MAIN && + mpeg_profile != GST_MPEG_VIDEO_PROFILE_SIMPLE) { + GST_ERROR_OBJECT (self, "Cannot support profile %d", mpeg_profile); + return GST_FLOW_ERROR; + } + + if (self->profile != mpeg_profile) { + GST_INFO_OBJECT (self, "Profile change %d -> %d", 
+ self->profile, mpeg_profile); + self->profile = mpeg_profile; + self->need_negotiation = TRUE; + } + + if (self->vinfo.finfo->format == GST_VIDEO_FORMAT_UNKNOWN) + negotiation_needed = TRUE; + + /* copy quantiser from the sequence header, + * if none is provided, this will copy the default ones + * added by the parser + */ + memcpy (self->v4l2_quantisation.intra_quantiser_matrix, + seq->intra_quantizer_matrix, + sizeof (self->v4l2_quantisation.intra_quantiser_matrix)); + memcpy (self->v4l2_quantisation.non_intra_quantiser_matrix, + seq->non_intra_quantizer_matrix, + sizeof (self->v4l2_quantisation.non_intra_quantiser_matrix)); + + /* *INDENT-OFF* */ + self->v4l2_sequence = (struct v4l2_ctrl_mpeg2_sequence) { + .horizontal_size = self->width, + .vertical_size = self->height, + .vbv_buffer_size = self->vbv_buffer_size * 16 * 1024, + .profile_and_level_indication = + seq_ext ? (seq_ext->profile << 4) | (seq_ext-> + level << 1) | seq_ext->profile_level_escape_bit : 0, + .chroma_format = seq_ext ? seq_ext->chroma_format : 0, + .flags = (seq_ext && seq_ext->progressive) ? 
V4L2_MPEG2_SEQ_FLAG_PROGRESSIVE : 0, + }; + /* *INDENT-ON* */ + + if (negotiation_needed) { + self->need_negotiation = TRUE; + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_ERROR; + } + } else { + self->need_sequence = TRUE; + self->need_quantiser = TRUE; + } + + /* Check if we can zero-copy buffers */ + if (!self->has_videometa) { + GstVideoInfo ref_vinfo; + gint i; + + gst_video_info_set_format (&ref_vinfo, GST_VIDEO_INFO_FORMAT (&self->vinfo), + self->width, self->height); + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&self->vinfo); i++) { + if (self->vinfo.stride[i] != ref_vinfo.stride[i] || + self->vinfo.offset[i] != ref_vinfo.offset[i]) { + GST_WARNING_OBJECT (self, + "GstVideoMeta support required, copying frames."); + self->copy_frames = TRUE; + break; + } + } + } else { + self->copy_frames = FALSE; + } + + return GST_FLOW_OK; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_ensure_bitstream (GstV4l2CodecMpeg2Dec * self) +{ + if (self->bitstream) + goto done; + + self->bitstream = gst_v4l2_codec_allocator_alloc (self->sink_allocator); + + if (!self->bitstream) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to decode mpeg2 stream."), (NULL)); + return FALSE; + } + + if (!gst_memory_map (self->bitstream, &self->bitstream_map, GST_MAP_WRITE)) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, + ("Could not access bitstream memory for writing"), (NULL)); + g_clear_pointer (&self->bitstream, gst_memory_unref); + return FALSE; + } + +done: + /* We use this field to track how much we have written */ + self->bitstream_map.size = 0; + + return TRUE; +} + +static inline void +_parse_picture_coding_type (struct v4l2_ctrl_mpeg2_picture *v4l2_picture, + GstMpeg2Picture * mpeg2_picture) +{ + switch (mpeg2_picture->type) { + case GST_MPEG_VIDEO_PICTURE_TYPE_I: + v4l2_picture->picture_coding_type = V4L2_MPEG2_PIC_CODING_TYPE_I; + break; + case 
GST_MPEG_VIDEO_PICTURE_TYPE_P: + v4l2_picture->picture_coding_type = V4L2_MPEG2_PIC_CODING_TYPE_P; + break; + case GST_MPEG_VIDEO_PICTURE_TYPE_B: + v4l2_picture->picture_coding_type = V4L2_MPEG2_PIC_CODING_TYPE_B; + break; + case GST_MPEG_VIDEO_PICTURE_TYPE_D: + v4l2_picture->picture_coding_type = V4L2_MPEG2_PIC_CODING_TYPE_D; + break; + } +} + +static inline void +_parse_picture_structure (struct v4l2_ctrl_mpeg2_picture *v4l2_picture, + GstMpeg2Slice * slice) +{ + if (!slice->pic_ext) + return; + switch (slice->pic_ext->picture_structure) { + case GST_MPEG_VIDEO_PICTURE_STRUCTURE_TOP_FIELD: + v4l2_picture->picture_structure = V4L2_MPEG2_PIC_TOP_FIELD; + break; + case GST_MPEG_VIDEO_PICTURE_STRUCTURE_BOTTOM_FIELD: + v4l2_picture->picture_structure = V4L2_MPEG2_PIC_BOTTOM_FIELD; + break; + case GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME: + v4l2_picture->picture_structure = V4L2_MPEG2_PIC_FRAME; + break; + } +} + +static GstFlowReturn +gst_v4l2_codec_mpeg2_dec_start_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice, + GstMpeg2Picture * prev_picture, GstMpeg2Picture * next_picture) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + + /* FIXME base class should not call us if negotiation failed */ + if (!self->sink_allocator) + return GST_FLOW_ERROR; + + if (!gst_v4l2_codec_mpeg2_dec_ensure_bitstream (self)) + return GST_FLOW_ERROR; + + + /* *INDENT-OFF* */ + self->v4l2_picture = (struct v4l2_ctrl_mpeg2_picture) { + .backward_ref_ts = next_picture ? next_picture->system_frame_number * 1000 : GST_CLOCK_TIME_NONE, + .forward_ref_ts = prev_picture ? prev_picture->system_frame_number * 1000 : GST_CLOCK_TIME_NONE, + .intra_dc_precision = slice->pic_ext ? slice->pic_ext->intra_dc_precision : 0, + .flags = (slice->pic_ext && slice->pic_ext->top_field_first ? V4L2_MPEG2_PIC_FLAG_TOP_FIELD_FIRST : 0) | + (slice->pic_ext && slice->pic_ext->frame_pred_frame_dct ? 
V4L2_MPEG2_PIC_FLAG_FRAME_PRED_DCT : 0 ) | + (slice->pic_ext && slice->pic_ext->concealment_motion_vectors ? V4L2_MPEG2_PIC_FLAG_CONCEALMENT_MV : 0) | + (slice->pic_ext && slice->pic_ext->q_scale_type ? V4L2_MPEG2_PIC_FLAG_Q_SCALE_TYPE : 0) | + (slice->pic_ext && slice->pic_ext->intra_vlc_format ? V4L2_MPEG2_PIC_FLAG_INTRA_VLC : 0) | + (slice->pic_ext && slice->pic_ext->alternate_scan ? V4L2_MPEG2_PIC_FLAG_ALT_SCAN : 0) | + (slice->pic_ext && slice->pic_ext->repeat_first_field ? V4L2_MPEG2_PIC_FLAG_REPEAT_FIRST : 0) | + (slice->pic_ext && slice->pic_ext->progressive_frame ? V4L2_MPEG2_PIC_FLAG_PROGRESSIVE : 0), + }; + /* *INDENT-ON* */ + + _parse_picture_coding_type (&self->v4l2_picture, picture); + _parse_picture_structure (&self->v4l2_picture, slice); + + /* slices share pic_ext and quant_matrix for the picture which might be there or not */ + if (slice->pic_ext) + memcpy (&self->v4l2_picture.f_code, slice->pic_ext->f_code, + sizeof (self->v4l2_picture.f_code)); + + /* overwrite the sequence ones if needed, see 6.1.1.6 for reference */ + if (slice->quant_matrix) { + if (slice->quant_matrix->load_intra_quantiser_matrix) + memcpy (self->v4l2_quantisation.intra_quantiser_matrix, + slice->quant_matrix->intra_quantiser_matrix, + sizeof (self->v4l2_quantisation.intra_quantiser_matrix)); + if (slice->quant_matrix->load_non_intra_quantiser_matrix) + memcpy (self->v4l2_quantisation.non_intra_quantiser_matrix, + slice->quant_matrix->non_intra_quantiser_matrix, + sizeof (self->v4l2_quantisation.non_intra_quantiser_matrix)); + if (slice->quant_matrix->load_chroma_intra_quantiser_matrix) + memcpy (self->v4l2_quantisation.chroma_intra_quantiser_matrix, + slice->quant_matrix->chroma_intra_quantiser_matrix, + sizeof (self->v4l2_quantisation.chroma_intra_quantiser_matrix)); + if (slice->quant_matrix->load_chroma_non_intra_quantiser_matrix) + memcpy (self->v4l2_quantisation.chroma_non_intra_quantiser_matrix, + slice->quant_matrix->chroma_non_intra_quantiser_matrix, + sizeof 
(self->v4l2_quantisation.chroma_non_intra_quantiser_matrix)); + + self->need_quantiser |= (slice->quant_matrix->load_intra_quantiser_matrix || + slice->quant_matrix->load_non_intra_quantiser_matrix || + slice->quant_matrix->load_chroma_intra_quantiser_matrix || + slice->quant_matrix->load_chroma_non_intra_quantiser_matrix); + } + + return GST_FLOW_OK; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_copy_output_buffer (GstV4l2CodecMpeg2Dec * self, + GstVideoCodecFrame * codec_frame) +{ + GstVideoFrame src_frame; + GstVideoFrame dest_frame; + GstVideoInfo dest_vinfo; + GstBuffer *buffer; + + gst_video_info_set_format (&dest_vinfo, GST_VIDEO_INFO_FORMAT (&self->vinfo), + self->width, self->height); + + buffer = gst_video_decoder_allocate_output_buffer (GST_VIDEO_DECODER (self)); + if (!buffer) + goto fail; + + if (!gst_video_frame_map (&src_frame, &self->vinfo, + codec_frame->output_buffer, GST_MAP_READ)) + goto fail; + + if (!gst_video_frame_map (&dest_frame, &dest_vinfo, buffer, GST_MAP_WRITE)) { + gst_video_frame_unmap (&src_frame); + goto fail; + } + + GST_VIDEO_INFO_WIDTH (&src_frame.info) = self->width; + GST_VIDEO_INFO_HEIGHT (&src_frame.info) = self->height; + + if (!gst_video_frame_copy (&dest_frame, &src_frame)) { + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + goto fail; + } + + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + gst_buffer_replace (&codec_frame->output_buffer, buffer); + gst_buffer_unref (buffer); + + return TRUE; + +fail: + GST_ERROR_OBJECT (self, "Failed to copy output buffer."); + return FALSE; +} + +static GstFlowReturn +gst_v4l2_codec_mpeg2_dec_output_picture (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, GstMpeg2Picture * picture) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstV4l2Request *request = gst_mpeg2_picture_get_user_data (picture); + gint ret; + + GST_DEBUG_OBJECT (self, "Output 
picture %u", picture->system_frame_number); + + ret = gst_v4l2_request_set_done (request); + if (ret == 0) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Decoding frame %u took too long", picture->system_frame_number), + (NULL)); + goto error; + } else if (ret < 0) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Decoding request failed: %s", g_strerror (errno)), (NULL)); + goto error; + } + g_return_val_if_fail (frame->output_buffer, GST_FLOW_ERROR); + + if (gst_v4l2_request_failed (request)) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Failed to decode frame %u", picture->system_frame_number), (NULL)); + goto error; + } + + /* Hold on reference buffers for the rest of the picture lifetime */ + gst_mpeg2_picture_set_user_data (picture, + gst_buffer_ref (frame->output_buffer), (GDestroyNotify) gst_buffer_unref); + + if (self->copy_frames) + gst_v4l2_codec_mpeg2_dec_copy_output_buffer (self, frame); + + gst_mpeg2_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_video_decoder_drop_frame (vdec, frame); + gst_mpeg2_picture_unref (picture); + + return GST_FLOW_ERROR; +} + +static void +gst_v4l2_codec_mpeg2_dec_reset_picture (GstV4l2CodecMpeg2Dec * self) +{ + if (self->bitstream) { + if (self->bitstream_map.memory) + gst_memory_unmap (self->bitstream, &self->bitstream_map); + g_clear_pointer (&self->bitstream, gst_memory_unref); + self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + } +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_ensure_output_buffer (GstV4l2CodecMpeg2Dec * self, + GstVideoCodecFrame * frame) +{ + GstBuffer *buffer; + GstFlowReturn flow_ret; + + if (frame->output_buffer) + return TRUE; + + flow_ret = gst_buffer_pool_acquire_buffer (GST_BUFFER_POOL (self->src_pool), + &buffer, NULL); + if (flow_ret != GST_FLOW_OK) { + if (flow_ret == GST_FLOW_FLUSHING) + GST_DEBUG_OBJECT (self, "Frame decoding aborted, we are flushing."); + else + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, + ("No more picture 
buffer available."), (NULL)); + return FALSE; + } + + frame->output_buffer = buffer; + return TRUE; +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_submit_bitstream (GstV4l2CodecMpeg2Dec * self, + GstMpeg2Picture * picture) +{ + GstV4l2Request *prev_request = NULL, *request = NULL; + gsize bytesused; + gboolean ret = FALSE; + guint count = 0; + guint flags = 0; + + /* *INDENT-OFF* */ + /* Reserve space for controls */ + struct v4l2_ext_control control[] = { + { }, /* sequence */ + { }, /* picture */ + { }, /* slice */ + { }, /* quantization */ + }; + /* *INDENT-ON* */ + + if (picture->structure != GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME) { + if (picture->first_field) + prev_request = gst_mpeg2_picture_get_user_data (picture->first_field); + else + flags = V4L2_BUF_FLAG_M2M_HOLD_CAPTURE_BUF; + } + + bytesused = self->bitstream_map.size; + gst_memory_unmap (self->bitstream, &self->bitstream_map); + self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + gst_memory_resize (self->bitstream, 0, bytesused); + + if (prev_request) { + request = gst_v4l2_decoder_alloc_sub_request (self->decoder, prev_request, + self->bitstream); + } else { + GstVideoCodecFrame *frame; + + frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), + picture->system_frame_number); + g_return_val_if_fail (frame, FALSE); + + if (!gst_v4l2_codec_mpeg2_dec_ensure_output_buffer (self, frame)) + goto done; + + request = gst_v4l2_decoder_alloc_request (self->decoder, + picture->system_frame_number, self->bitstream, frame->output_buffer); + + gst_video_codec_frame_unref (frame); + } + + if (!request) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Failed to allocate a media request object."), (NULL)); + goto done; + } + + if (self->need_sequence) { + control[count].id = V4L2_CID_STATELESS_MPEG2_SEQUENCE; + control[count].ptr = &self->v4l2_sequence; + control[count].size = sizeof (self->v4l2_sequence); + count++; + self->need_sequence = FALSE; + } + + control[count].id = 
V4L2_CID_STATELESS_MPEG2_PICTURE; + control[count].ptr = &self->v4l2_picture; + control[count].size = sizeof (self->v4l2_picture); + count++; + + if (self->need_quantiser) { + control[count].id = V4L2_CID_STATELESS_MPEG2_QUANTISATION; + control[count].ptr = &self->v4l2_quantisation; + control[count].size = sizeof (self->v4l2_quantisation); + count++; + self->need_quantiser = FALSE; + } + + if (!gst_v4l2_decoder_set_controls (self->decoder, request, control, count)) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, + ("Driver did not accept the bitstream parameters."), (NULL)); + goto done; + } + + if (!gst_v4l2_request_queue (request, flags)) { + GST_ELEMENT_ERROR (self, RESOURCE, WRITE, + ("Driver did not accept the decode request."), (NULL)); + goto done; + } + + gst_mpeg2_picture_set_user_data (picture, g_steal_pointer (&request), + (GDestroyNotify) gst_v4l2_request_unref); + + ret = TRUE; + +done: + if (request) + gst_v4l2_request_unref (request); + + gst_v4l2_codec_mpeg2_dec_reset_picture (self); + + return ret; +} + +static GstFlowReturn +gst_v4l2_codec_mpeg2_dec_decode_slice (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + const gsize slice_size = slice->size; + const gsize slice_offset = slice->sc_offset; + const guint8 *slice_ptr = slice->packet.data + slice_offset; + guint8 *bitstream_ptr = self->bitstream_map.data + self->bitstream_map.size; + + if (self->bitstream_map.size + slice_size > self->bitstream_map.maxsize) { + GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, + ("Not enough space for slice."), (NULL)); + gst_v4l2_codec_mpeg2_dec_reset_picture (self); + return GST_FLOW_ERROR; + } + + memcpy (bitstream_ptr, slice_ptr, slice_size); + self->bitstream_map.size += slice_size; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_v4l2_codec_mpeg2_dec_end_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture) +{ + GstV4l2CodecMpeg2Dec 
*self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + /* FIXME might need to make this lazier in case we get an unpaired field */ + if (!gst_v4l2_codec_mpeg2_dec_submit_bitstream (self, picture)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static void +gst_v4l2_codec_mpeg2_dec_set_flushing (GstV4l2CodecMpeg2Dec * self, + gboolean flushing) +{ + if (self->sink_allocator) + gst_v4l2_codec_allocator_set_flushing (self->sink_allocator, flushing); + if (self->src_allocator) + gst_v4l2_codec_allocator_set_flushing (self->src_allocator, flushing); +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_flush (GstVideoDecoder * decoder) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + + GST_DEBUG_OBJECT (self, "Flushing decoder state."); + + gst_v4l2_decoder_flush (self->decoder); + gst_v4l2_codec_mpeg2_dec_set_flushing (self, FALSE); + + return GST_VIDEO_DECODER_CLASS (parent_class)->flush (decoder); +} + +static gboolean +gst_v4l2_codec_mpeg2_dec_sink_event (GstVideoDecoder * decoder, + GstEvent * event) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (decoder); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + GST_DEBUG_OBJECT (self, "flush start"); + gst_v4l2_codec_mpeg2_dec_set_flushing (self, TRUE); + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstStateChangeReturn +gst_v4l2_codec_mpeg2_dec_change_state (GstElement * element, + GstStateChange transition) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (element); + + if (transition == GST_STATE_CHANGE_PAUSED_TO_READY) + gst_v4l2_codec_mpeg2_dec_set_flushing (self, TRUE); + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + +static void +gst_v4l2_codec_mpeg2_dec_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (object); + GObject *dec = G_OBJECT 
(self->decoder); + + switch (prop_id) { + default: + gst_v4l2_decoder_set_property (dec, prop_id - PROP_LAST, value, pspec); + break; + } +} + +static void +gst_v4l2_codec_mpeg2_dec_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (object); + GObject *dec = G_OBJECT (self->decoder); + + switch (prop_id) { + default: + gst_v4l2_decoder_get_property (dec, prop_id - PROP_LAST, value, pspec); + break; + } +} + +static void +gst_v4l2_codec_mpeg2_dec_init (GstV4l2CodecMpeg2Dec * self) +{ +} + +static void +gst_v4l2_codec_mpeg2_dec_subinit (GstV4l2CodecMpeg2Dec * self, + GstV4l2CodecMpeg2DecClass * klass) +{ + self->decoder = gst_v4l2_decoder_new (klass->device); + gst_video_info_init (&self->vinfo); +} + +static void +gst_v4l2_codec_mpeg2_dec_dispose (GObject * object) +{ + GstV4l2CodecMpeg2Dec *self = GST_V4L2_CODEC_MPEG2_DEC (object); + + g_clear_object (&self->decoder); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_v4l2_codec_mpeg2_dec_class_init (GstV4l2CodecMpeg2DecClass * klass) +{ +} + +static void +gst_v4l2_codec_mpeg2_dec_subclass_init (GstV4l2CodecMpeg2DecClass * klass, + GstV4l2CodecDevice * device) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstMpeg2DecoderClass *mpeg2decoder_class = GST_MPEG2_DECODER_CLASS (klass); + + gobject_class->set_property = gst_v4l2_codec_mpeg2_dec_set_property; + gobject_class->get_property = gst_v4l2_codec_mpeg2_dec_get_property; + gobject_class->dispose = gst_v4l2_codec_mpeg2_dec_dispose; + + gst_element_class_set_static_metadata (element_class, + "V4L2 Stateless Mpeg2 Video Decoder", + "Codec/Decoder/Video/Hardware", + "A V4L2 based Mpeg2 video decoder", + "Daniel Almeida <daniel.almeida@collabora.com>"); + + gst_element_class_add_static_pad_template 
(element_class, &sink_template); + gst_element_class_add_static_pad_template (element_class, &src_template); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_change_state); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_close); + decoder_class->stop = GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_stop); + decoder_class->negotiate = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_decide_allocation); + decoder_class->flush = GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_flush); + decoder_class->sink_event = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_sink_event); + + mpeg2decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_new_sequence); + mpeg2decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_output_picture); + mpeg2decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_start_picture); + mpeg2decoder_class->decode_slice = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_decode_slice); + mpeg2decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_end_picture); + mpeg2decoder_class->get_preferred_output_delay = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_mpeg2_dec_get_preferred_output_delay); + + klass->device = device; + gst_v4l2_decoder_install_properties (gobject_class, PROP_LAST, device); +} + +void +gst_v4l2_codec_mpeg2_dec_register (GstPlugin * plugin, GstV4l2Decoder * decoder, + GstV4l2CodecDevice * device, guint rank) +{ + GstCaps *src_caps; + + GST_DEBUG_CATEGORY_INIT (v4l2_mpeg2dec_debug, "v4l2codecs-mpeg2dec", 0, + "V4L2 stateless mpeg2 decoder"); + + if (!gst_v4l2_decoder_set_sink_fmt (decoder, V4L2_PIX_FMT_MPEG2_SLICE, + 320, 240, 8)) + return; + src_caps = gst_v4l2_decoder_enum_src_formats (decoder); + + if (gst_caps_is_empty (src_caps)) { + GST_WARNING 
("Not registering MPEG2 decoder since it produces no " + "supported format"); + goto done; + } + + gst_v4l2_decoder_register (plugin, GST_TYPE_V4L2_CODEC_MPEG2_DEC, + (GClassInitFunc) gst_v4l2_codec_mpeg2_dec_subclass_init, + gst_mini_object_ref (GST_MINI_OBJECT (device)), + (GInstanceInitFunc) gst_v4l2_codec_mpeg2_dec_subinit, + "v4l2sl%smpeg2dec", device, rank, NULL); + +done: + gst_caps_unref (src_caps); +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecmpeg2dec.h
Added
@@ -0,0 +1,54 @@ +/* GStreamer + * Copyright (C) 2020 Daniel Almeida <daniel.almeida@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_V4L2_CODEC_MPEG2_DEC_H__ +#define __GST_V4L2_CODEC_MPEG2_DEC_H__ + +#define GST_USE_UNSTABLE_API +#include <gst/codecs/gstmpeg2decoder.h> + +#include "gstv4l2decoder.h" + +G_BEGIN_DECLS + +#define GST_TYPE_V4L2_CODEC_MPEG2_DEC (gst_v4l2_codec_mpeg2_dec_get_type()) +#define GST_V4L2_CODEC_MPEG2_DEC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_V4L2_CODEC_MPEG2_DEC,GstV4l2CodecMpeg2Dec)) +#define GST_V4L2_CODEC_MPEG2_DEC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_V4L2_CODEC_MPEG2_DEC,GstV4l2CodecMpeg2DecClass)) +#define GST_V4L2_CODEC_MPEG2_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_V4L2_CODEC_MPEG2_DEC, GstV4l2CodecMpeg2DecClass)) +#define GST_IS_V4L2_CODEC_MPEG2_DEC(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_V4L2_CODEC_MPEG2_DEC)) +#define GST_IS_V4L2_CODEC_MPEG2_DEC_CLASS(obj) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_V4L2_CODEC_MPEG2_DEC)) + +typedef struct _GstV4l2CodecMpeg2Dec GstV4l2CodecMpeg2Dec; +typedef struct _GstV4l2CodecMpeg2DecClass GstV4l2CodecMpeg2DecClass; + +struct _GstV4l2CodecMpeg2DecClass +{ + GstMpeg2DecoderClass parent_class; + 
GstV4l2CodecDevice *device; +}; + +GType gst_v4l2_codec_mpeg2_dec_get_type (void); +void gst_v4l2_codec_mpeg2_dec_register (GstPlugin * plugin, + GstV4l2Decoder * decoder, + GstV4l2CodecDevice * device, + guint rank); + +G_END_DECLS + +#endif /* __GST_V4L2_CODEC_MPEG2_DEC_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2codecvp8dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecvp8dec.c
Changed
@@ -22,9 +22,16 @@ #endif #include "gstv4l2codecallocator.h" +#include "gstv4l2codecalphadecodebin.h" #include "gstv4l2codecpool.h" #include "gstv4l2codecvp8dec.h" -#include "linux/vp8-ctrls.h" +#include "gstv4l2format.h" + +#define KERNEL_VERSION(a,b,c) (((a) << 16) + ((b) << 8) + (c)) + +#define V4L2_MIN_KERNEL_VER_MAJOR 5 +#define V4L2_MIN_KERNEL_VER_MINOR 13 +#define V4L2_MIN_KERNEL_VERSION KERNEL_VERSION(V4L2_MIN_KERNEL_VER_MAJOR, V4L2_MIN_KERNEL_VER_MINOR, 0) GST_DEBUG_CATEGORY_STATIC (v4l2_vp8dec_debug); #define GST_CAT_DEFAULT v4l2_vp8dec_debug @@ -41,10 +48,16 @@ GST_STATIC_CAPS ("video/x-vp8") ); +static GstStaticPadTemplate alpha_template = +GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SINK_NAME, + GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp8, codec-alpha = (boolean) true") + ); + static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SRC_NAME, GST_PAD_SRC, GST_PAD_ALWAYS, - GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ NV12, YUY2, NV12_32L32 }"))); + GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (GST_V4L2_DEFAULT_VIDEO_FORMATS))); struct _GstV4l2CodecVp8Dec { @@ -63,22 +76,40 @@ gboolean need_negotiation; gboolean copy_frames; - struct v4l2_ctrl_vp8_frame_header frame_header; + struct v4l2_ctrl_vp8_frame frame_header; GstMemory *bitstream; GstMapInfo bitstream_map; }; -G_DEFINE_ABSTRACT_TYPE_WITH_CODE (GstV4l2CodecVp8Dec, - gst_v4l2_codec_vp8_dec, GST_TYPE_VP8_DECODER, - GST_DEBUG_CATEGORY_INIT (v4l2_vp8dec_debug, "v4l2codecs-vp8dec", 0, - "V4L2 stateless VP8 decoder")); +G_DEFINE_ABSTRACT_TYPE (GstV4l2CodecVp8Dec, gst_v4l2_codec_vp8_dec, + GST_TYPE_VP8_DECODER); + #define parent_class gst_v4l2_codec_vp8_dec_parent_class +static guint +gst_v4l2_codec_vp8_dec_get_preferred_output_delay (GstVp8Decoder * decoder, + gboolean is_live) +{ + + GstV4l2CodecVp8Dec *self = GST_V4L2_CODEC_VP8_DEC (decoder); + guint delay; + + if (is_live) + delay = 0; + else + /* Just one for now, perhaps we can make this configurable in the future. 
*/ + delay = 1; + + gst_v4l2_decoder_set_render_delay (self->decoder, delay); + return delay; +} + static gboolean gst_v4l2_codec_vp8_dec_open (GstVideoDecoder * decoder) { GstV4l2CodecVp8Dec *self = GST_V4L2_CODEC_VP8_DEC (decoder); + guint version; if (!gst_v4l2_decoder_open (self->decoder)) { GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ_WRITE, @@ -87,6 +118,13 @@ return FALSE; } + version = gst_v4l2_decoder_get_version (self->decoder); + if (version < V4L2_MIN_KERNEL_VERSION) + GST_WARNING_OBJECT (self, + "V4L2 API v%u.%u too old, at least v%u.%u required", + (version >> 16) & 0xff, (version >> 8) & 0xff, + V4L2_MIN_KERNEL_VER_MAJOR, V4L2_MIN_KERNEL_VER_MINOR); + return TRUE; } @@ -138,7 +176,7 @@ /* *INDENT-OFF* */ struct v4l2_ext_control control[] = { { - .id = V4L2_CID_MPEG_VIDEO_VP8_FRAME_HEADER, + .id = V4L2_CID_STATELESS_VP8_FRAME, .ptr = &self->frame_header, .size = sizeof (self->frame_header), }, @@ -153,11 +191,11 @@ GST_DEBUG_OBJECT (self, "Negotiate"); - gst_v4l2_codec_vp8_dec_reset_allocation (self); - gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SINK); gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SRC); + gst_v4l2_codec_vp8_dec_reset_allocation (self); + if (!gst_v4l2_decoder_set_sink_fmt (self->decoder, V4L2_PIX_FMT_VP8_FRAME, self->width, self->height, 12 /* 8 bits 4:2:0 */ )) { GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, @@ -233,6 +271,7 @@ { GstV4l2CodecVp8Dec *self = GST_V4L2_CODEC_VP8_DEC (decoder); guint min = 0; + guint num_bitstream; self->has_videometa = gst_query_find_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); @@ -245,10 +284,26 @@ min = MAX (2, min); + num_bitstream = 1 + + MAX (1, gst_v4l2_decoder_get_render_delay (self->decoder)); + self->sink_allocator = gst_v4l2_codec_allocator_new (self->decoder, - GST_PAD_SINK, self->min_pool_size + 2); + GST_PAD_SINK, num_bitstream); + if (!self->sink_allocator) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to allocate sink buffers."), (NULL)); 
+ return FALSE; + } + self->src_allocator = gst_v4l2_codec_allocator_new (self->decoder, GST_PAD_SRC, self->min_pool_size + min + 4); + if (!self->src_allocator) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to allocate source buffers."), (NULL)); + g_clear_object (&self->sink_allocator); + return FALSE; + } + self->src_pool = gst_v4l2_codec_pool_new (self->src_allocator, &self->vinfo); /* Our buffer pool is internal, we will let the base class create a video @@ -259,57 +314,57 @@ } static void -gst_v4l2_codec_vp8_dec_fill_segment_header (struct v4l2_vp8_segment_header - *segment_header, const GstVp8Segmentation * segmentation) +gst_v4l2_codec_vp8_dec_fill_segment (struct v4l2_vp8_segment + *segment, const GstVp8Segmentation * segmentation) { gint i; /* *INDENT-OFF* */ - segment_header->flags = - (segmentation->segmentation_enabled ? V4L2_VP8_SEGMENT_HEADER_FLAG_ENABLED : 0) | - (segmentation->update_mb_segmentation_map ? V4L2_VP8_SEGMENT_HEADER_FLAG_UPDATE_MAP : 0) | - (segmentation->update_segment_feature_data ? V4L2_VP8_SEGMENT_HEADER_FLAG_UPDATE_FEATURE_DATA : 0) | - (segmentation->segment_feature_mode ? 0 : V4L2_VP8_SEGMENT_HEADER_FLAG_DELTA_VALUE_MODE); + segment->flags = + (segmentation->segmentation_enabled ? V4L2_VP8_SEGMENT_FLAG_ENABLED : 0) | + (segmentation->update_mb_segmentation_map ? V4L2_VP8_SEGMENT_FLAG_UPDATE_MAP : 0) | + (segmentation->update_segment_feature_data ? V4L2_VP8_SEGMENT_FLAG_UPDATE_FEATURE_DATA : 0) | + (segmentation->segment_feature_mode ? 
0 : V4L2_VP8_SEGMENT_FLAG_DELTA_VALUE_MODE); /* *INDENT-ON* */ for (i = 0; i < 4; i++) { - segment_header->quant_update[i] = segmentation->quantizer_update_value[i]; - segment_header->lf_update[i] = segmentation->lf_update_value[i]; + segment->quant_update[i] = segmentation->quantizer_update_value[i]; + segment->lf_update[i] = segmentation->lf_update_value[i]; } for (i = 0; i < 3; i++) - segment_header->segment_probs[i] = segmentation->segment_prob[i]; + segment->segment_probs[i] = segmentation->segment_prob[i]; - segment_header->padding = 0; + segment->padding = 0; } static void -gst_v4l2_codec_vp8_dec_fill_lf_header (struct v4l2_vp8_loopfilter_header - *lf_header, const GstVp8MbLfAdjustments * lf_adj) +gst_v4l2_codec_vp8_dec_fill_lf (struct v4l2_vp8_loop_filter + *lf, const GstVp8MbLfAdjustments * lf_adj) { gint i; - lf_header->flags |= - (lf_adj->loop_filter_adj_enable ? V4L2_VP8_LF_HEADER_ADJ_ENABLE : 0) | - (lf_adj->mode_ref_lf_delta_update ? V4L2_VP8_LF_HEADER_DELTA_UPDATE : 0); + lf->flags |= + (lf_adj->loop_filter_adj_enable ? V4L2_VP8_LF_ADJ_ENABLE : 0) | + (lf_adj->mode_ref_lf_delta_update ? 
V4L2_VP8_LF_DELTA_UPDATE : 0); for (i = 0; i < 4; i++) { - lf_header->ref_frm_delta[i] = lf_adj->ref_frame_delta[i]; - lf_header->mb_mode_delta[i] = lf_adj->mb_mode_delta[i]; + lf->ref_frm_delta[i] = lf_adj->ref_frame_delta[i]; + lf->mb_mode_delta[i] = lf_adj->mb_mode_delta[i]; } } static void -gst_v4l2_codec_vp8_dec_fill_entropy_header (struct v4l2_vp8_entropy_header - *entropy_header, const GstVp8FrameHdr * frame_hdr) +gst_v4l2_codec_vp8_dec_fill_entropy (struct v4l2_vp8_entropy + *entropy, const GstVp8FrameHdr * frame_hdr) { - memcpy (entropy_header->coeff_probs, frame_hdr->token_probs.prob, + memcpy (entropy->coeff_probs, frame_hdr->token_probs.prob, sizeof (frame_hdr->token_probs.prob)); - memcpy (entropy_header->y_mode_probs, frame_hdr->mode_probs.y_prob, + memcpy (entropy->y_mode_probs, frame_hdr->mode_probs.y_prob, sizeof (frame_hdr->mode_probs.y_prob)); - memcpy (entropy_header->uv_mode_probs, frame_hdr->mode_probs.uv_prob, + memcpy (entropy->uv_mode_probs, frame_hdr->mode_probs.uv_prob, sizeof (frame_hdr->mode_probs.uv_prob)); - memcpy (entropy_header->mv_probs, frame_hdr->mv_probs.prob, + memcpy (entropy->mv_probs, frame_hdr->mv_probs.prob, sizeof (frame_hdr->mv_probs.prob)); } @@ -320,13 +375,13 @@ gint i; /* *INDENT-OFF* */ - self->frame_header = (struct v4l2_ctrl_vp8_frame_header) { - .lf_header = (struct v4l2_vp8_loopfilter_header) { + self->frame_header = (struct v4l2_ctrl_vp8_frame) { + .lf = (struct v4l2_vp8_loop_filter) { .sharpness_level = frame_hdr->sharpness_level, .level = frame_hdr->loop_filter_level, .flags = (frame_hdr->filter_type == 1 ? 
V4L2_VP8_LF_FILTER_TYPE_SIMPLE : 0) }, - .quant_header = (struct v4l2_vp8_quantization_header) { + .quant = (struct v4l2_vp8_quantization) { .y_ac_qi = frame_hdr->quant_indices.y_ac_qi, .y_dc_delta = frame_hdr->quant_indices.y_dc_delta, .y2_dc_delta = frame_hdr->quant_indices.y2_dc_delta, @@ -356,19 +411,18 @@ .first_part_size = frame_hdr->first_part_size, .first_part_header_bits = frame_hdr->header_size, - .flags = (frame_hdr->key_frame ? V4L2_VP8_FRAME_HEADER_FLAG_KEY_FRAME : 0) | - (frame_hdr->show_frame ? V4L2_VP8_FRAME_HEADER_FLAG_SHOW_FRAME : 0) | - (frame_hdr->mb_no_skip_coeff ? V4L2_VP8_FRAME_HEADER_FLAG_MB_NO_SKIP_COEFF : 0) | - (frame_hdr->sign_bias_golden ? V4L2_VP8_FRAME_HEADER_FLAG_SIGN_BIAS_GOLDEN : 0) | - (frame_hdr->sign_bias_alternate ? V4L2_VP8_FRAME_HEADER_FLAG_SIGN_BIAS_ALT : 0), + .flags = (frame_hdr->key_frame ? V4L2_VP8_FRAME_FLAG_KEY_FRAME : 0) | + (frame_hdr->show_frame ? V4L2_VP8_FRAME_FLAG_SHOW_FRAME : 0) | + (frame_hdr->mb_no_skip_coeff ? V4L2_VP8_FRAME_FLAG_MB_NO_SKIP_COEFF : 0) | + (frame_hdr->sign_bias_golden ? V4L2_VP8_FRAME_FLAG_SIGN_BIAS_GOLDEN : 0) | + (frame_hdr->sign_bias_alternate ? 
V4L2_VP8_FRAME_FLAG_SIGN_BIAS_ALT : 0), }; /* *INDENT-ON* */ for (i = 0; i < 8; i++) self->frame_header.dct_part_sizes[i] = frame_hdr->partition_size[i]; - gst_v4l2_codec_vp8_dec_fill_entropy_header (&self-> - frame_header.entropy_header, frame_hdr); + gst_v4l2_codec_vp8_dec_fill_entropy (&self->frame_header.entropy, frame_hdr); } static void @@ -394,7 +448,7 @@ (guint32) self->frame_header.alt_frame_ts / 1000); } -static gboolean +static GstFlowReturn gst_v4l2_codec_vp8_dec_new_sequence (GstVp8Decoder * decoder, const GstVp8FrameHdr * frame_hdr) { @@ -419,7 +473,7 @@ self->need_negotiation = TRUE; if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; } } @@ -444,10 +498,10 @@ self->copy_frames = FALSE; } - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_v4l2_codec_vp8_dec_start_picture (GstVp8Decoder * decoder, GstVp8Picture * picture) { @@ -455,7 +509,7 @@ /* FIXME base class should not call us if negotiation failed */ if (!self->sink_allocator) - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; /* Ensure we have a bitstream to write into */ if (!self->bitstream) { @@ -464,24 +518,24 @@ if (!self->bitstream) { GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, ("Not enough memory to decode VP8 stream."), (NULL)); - return FALSE; + return GST_FLOW_ERROR; } if (!gst_memory_map (self->bitstream, &self->bitstream_map, GST_MAP_WRITE)) { GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, ("Could not access bitstream memory for writing"), (NULL)); g_clear_pointer (&self->bitstream, gst_memory_unref); - return FALSE; + return GST_FLOW_ERROR; } } /* We use this field to track how much we have written */ self->bitstream_map.size = 0; - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_v4l2_codec_vp8_dec_decode_picture (GstVp8Decoder * decoder, GstVp8Picture * picture, GstVp8Parser * parser) { @@ 
-491,20 +545,20 @@ if (self->bitstream_map.maxsize < picture->size) { GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, ("Not enough space to send picture bitstream."), (NULL)); - return FALSE; + return GST_FLOW_ERROR; } gst_v4l2_codec_vp8_dec_fill_frame_header (self, &picture->frame_hdr); - gst_v4l2_codec_vp8_dec_fill_segment_header (&self-> - frame_header.segment_header, &parser->segmentation); - gst_v4l2_codec_vp8_dec_fill_lf_header (&self->frame_header.lf_header, + gst_v4l2_codec_vp8_dec_fill_segment (&self->frame_header.segment, + &parser->segmentation); + gst_v4l2_codec_vp8_dec_fill_lf (&self->frame_header.lf, &parser->mb_lf_adjust); gst_v4l2_codec_vp8_dec_fill_references (self); memcpy (bitstream_data, picture->data, picture->size); self->bitstream_map.size = picture->size; - return TRUE; + return GST_FLOW_OK; } static void @@ -518,7 +572,7 @@ } } -static gboolean +static GstFlowReturn gst_v4l2_codec_vp8_dec_end_picture (GstVp8Decoder * decoder, GstVp8Picture * picture) { @@ -526,28 +580,23 @@ GstVideoCodecFrame *frame; GstV4l2Request *request; GstBuffer *buffer; - GstFlowReturn flow_ret; + GstFlowReturn flow_ret = GST_FLOW_OK; gsize bytesused; /* *INDENT-OFF* */ struct v4l2_ext_control control[] = { { - .id = V4L2_CID_MPEG_VIDEO_VP8_FRAME_HEADER, + .id = V4L2_CID_STATELESS_VP8_FRAME, .ptr = &self->frame_header, .size = sizeof(self->frame_header), }, }; /* *INDENT-ON* */ - request = gst_v4l2_decoder_alloc_request (self->decoder); - if (!request) { - GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, - ("Failed to allocate a media request object."), (NULL)); - goto fail; - } - - gst_vp8_picture_set_user_data (picture, request, - (GDestroyNotify) gst_v4l2_request_free); + bytesused = self->bitstream_map.size; + gst_memory_unmap (self->bitstream, &self->bitstream_map); + self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + gst_memory_resize (self->bitstream, 0, bytesused); flow_ret = gst_buffer_pool_acquire_buffer (GST_BUFFER_POOL (self->src_pool), 
&buffer, NULL); @@ -562,49 +611,46 @@ frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), picture->system_frame_number); - g_return_val_if_fail (frame, FALSE); + g_return_val_if_fail (frame, GST_FLOW_ERROR); g_warn_if_fail (frame->output_buffer == NULL); frame->output_buffer = buffer; gst_video_codec_frame_unref (frame); - if (!gst_v4l2_decoder_set_controls (self->decoder, request, control, - G_N_ELEMENTS (control))) { - GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, - ("Driver did not accept the bitstream parameters."), (NULL)); - goto fail; - } - - bytesused = self->bitstream_map.size; - gst_memory_unmap (self->bitstream, &self->bitstream_map); - self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; - - if (!gst_v4l2_decoder_queue_sink_mem (self->decoder, request, self->bitstream, - picture->system_frame_number, bytesused, 0)) { - GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, - ("Driver did not accept the bitstream data."), (NULL)); + request = gst_v4l2_decoder_alloc_request (self->decoder, + picture->system_frame_number, self->bitstream, buffer); + if (!request) { + GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, + ("Failed to allocate a media request object."), (NULL)); goto fail; } + gst_vp8_picture_set_user_data (picture, request, + (GDestroyNotify) gst_v4l2_request_unref); - if (!gst_v4l2_decoder_queue_src_buffer (self->decoder, buffer, - picture->system_frame_number)) { + if (!gst_v4l2_decoder_set_controls (self->decoder, request, control, + G_N_ELEMENTS (control))) { GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, - ("Driver did not accept the picture buffer."), (NULL)); + ("Driver did not accept the bitstream parameters."), (NULL)); goto fail; } - if (!gst_v4l2_request_queue (request)) { + if (!gst_v4l2_request_queue (request, 0)) { GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, ("Driver did not accept the decode request."), (NULL)); goto fail; } gst_v4l2_codec_vp8_dec_reset_picture (self); - return TRUE; + + return GST_FLOW_OK; fail: 
gst_v4l2_codec_vp8_dec_reset_picture (self); - return FALSE; + + if (flow_ret != GST_FLOW_OK) + return flow_ret; + + return GST_FLOW_ERROR; } static gboolean @@ -663,15 +709,10 @@ GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); GstV4l2Request *request = gst_vp8_picture_get_user_data (picture); gint ret; - guint32 frame_num; GST_DEBUG_OBJECT (self, "Output picture %u", picture->system_frame_number); - /* Unlikely, but it would not break this decoding flow */ - if (gst_v4l2_request_is_done (request)) - goto finish_frame; - - ret = gst_v4l2_request_poll (request, GST_SECOND); + ret = gst_v4l2_request_set_done (request); if (ret == 0) { GST_ELEMENT_ERROR (self, STREAM, DECODE, ("Decoding frame took too long"), (NULL)); @@ -682,18 +723,14 @@ goto error; } - do { - if (!gst_v4l2_decoder_dequeue_src (self->decoder, &frame_num)) { - GST_ELEMENT_ERROR (self, STREAM, DECODE, - ("Decoder did not produce a frame"), (NULL)); - goto error; - } - } while (frame_num != picture->system_frame_number); - -finish_frame: - gst_v4l2_request_set_done (request); g_return_val_if_fail (frame->output_buffer, GST_FLOW_ERROR); + if (gst_v4l2_request_failed (request)) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Failed to decode frame %u", picture->system_frame_number), (NULL)); + goto error; + } + /* Hold on reference buffers for the rest of the picture lifetime */ gst_vp8_picture_set_user_data (picture, gst_buffer_ref (frame->output_buffer), (GDestroyNotify) gst_buffer_unref); @@ -865,17 +902,68 @@ GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp8_dec_end_picture); vp8decoder_class->output_picture = GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp8_dec_output_picture); + vp8decoder_class->get_preferred_output_delay = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp8_dec_get_preferred_output_delay); klass->device = device; gst_v4l2_decoder_install_properties (gobject_class, PROP_LAST, device); } +static void gst_v4l2_codec_vp8_alpha_decode_bin_subclass_init + (GstV4l2CodecAlphaDecodeBinClass * klass, gchar * 
decoder_name) +{ + GstV4l2CodecAlphaDecodeBinClass *adbin_class = + (GstV4l2CodecAlphaDecodeBinClass *) klass; + GstElementClass *element_class = (GstElementClass *) klass; + + adbin_class->decoder_name = decoder_name; + gst_element_class_add_static_pad_template (element_class, &alpha_template); + + gst_element_class_set_static_metadata (element_class, + "VP8 Alpha Decoder", "Codec/Decoder/Video", + "Wrapper bin to decode VP8 with alpha stream.", + "Daniel Almeida <daniel.almeida@collabora.com>"); +} + void -gst_v4l2_codec_vp8_dec_register (GstPlugin * plugin, +gst_v4l2_codec_vp8_dec_register (GstPlugin * plugin, GstV4l2Decoder * decoder, GstV4l2CodecDevice * device, guint rank) { + gchar *element_name; + GstCaps *src_caps, *alpha_caps; + + GST_DEBUG_CATEGORY_INIT (v4l2_vp8dec_debug, "v4l2codecs-vp8dec", 0, + "V4L2 stateless VP8 decoder"); + + if (!gst_v4l2_decoder_set_sink_fmt (decoder, V4L2_PIX_FMT_VP8_FRAME, + 320, 240, 8)) + return; + src_caps = gst_v4l2_decoder_enum_src_formats (decoder); + + if (gst_caps_is_empty (src_caps)) { + GST_WARNING ("Not registering VP8 decoder since it produces no " + "supported format"); + goto done; + } + gst_v4l2_decoder_register (plugin, GST_TYPE_V4L2_CODEC_VP8_DEC, (GClassInitFunc) gst_v4l2_codec_vp8_dec_subclass_init, + gst_mini_object_ref (GST_MINI_OBJECT (device)), (GInstanceInitFunc) gst_v4l2_codec_vp8_dec_subinit, - "v4l2sl%svp8dec", device, rank); + "v4l2sl%svp8dec", device, rank, &element_name); + + if (!element_name) + goto done; + + alpha_caps = gst_caps_from_string ("video/x-raw,format={I420, NV12}"); + + if (gst_caps_can_intersect (src_caps, alpha_caps)) + gst_v4l2_codec_alpha_decode_bin_register (plugin, + (GClassInitFunc) gst_v4l2_codec_vp8_alpha_decode_bin_subclass_init, + element_name, "v4l2slvp8%salphadecodebin", device, rank); + + gst_caps_unref (alpha_caps); + +done: + gst_caps_unref (src_caps); }
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2codecvp8dec.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecvp8dec.h
Changed
@@ -45,8 +45,9 @@
 GType gst_v4l2_codec_vp8_dec_get_type (void);
 
 void gst_v4l2_codec_vp8_dec_register (GstPlugin * plugin,
+    GstV4l2Decoder * decoder,
     GstV4l2CodecDevice * device,
-    guint rank);
+    guint rank);
 
 G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecvp9dec.c
Added
@@ -0,0 +1,1181 @@ + +/* GStreamer + * Copyright (C) 2021 Daniel Almeida <daniel.almeida@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include <config.h> +#endif + +#include "gstv4l2codecallocator.h" +#include "gstv4l2codecalphadecodebin.h" +#include "gstv4l2codecpool.h" +#include "gstv4l2codecvp9dec.h" +#include "gstv4l2format.h" +#include "linux/v4l2-controls.h" +#include "linux/videodev2.h" + +GST_DEBUG_CATEGORY_STATIC (v4l2_vp9dec_debug); +#define GST_CAT_DEFAULT v4l2_vp9dec_debug + +/* Used to mark picture that have been outputed */ +#define FLAG_PICTURE_OUTPUTED GST_MINI_OBJECT_FLAG_LAST + +enum +{ + PROP_0, + PROP_LAST = PROP_0 +}; + +static GstStaticPadTemplate sink_template = +GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SINK_NAME, + GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp9, " "alignment=(string) frame") + ); + +static GstStaticPadTemplate alpha_template = +GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SINK_NAME, + GST_PAD_SINK, GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-vp9, codec-alpha = (boolean) true, " + "alignment = frame") + ); + +static GstStaticPadTemplate src_template = +GST_STATIC_PAD_TEMPLATE (GST_VIDEO_DECODER_SRC_NAME, + GST_PAD_SRC, GST_PAD_ALWAYS, + GST_STATIC_CAPS 
(GST_VIDEO_CAPS_MAKE (GST_V4L2_DEFAULT_VIDEO_FORMATS))); + +struct _GstV4l2CodecVp9Dec +{ + GstVp9Decoder parent; + GstV4l2Decoder *decoder; + GstVideoCodecState *output_state; + GstVideoInfo vinfo; + gint width; + gint height; + + GstV4l2CodecAllocator *sink_allocator; + GstV4l2CodecAllocator *src_allocator; + GstV4l2CodecPool *src_pool; + gboolean has_videometa; + gboolean need_negotiation; + gboolean copy_frames; + + struct v4l2_ctrl_vp9_frame v4l2_vp9_frame; + struct v4l2_ctrl_vp9_compressed_hdr v4l2_delta_probs; + + GstMemory *bitstream; + GstMapInfo bitstream_map; + + /* will renegotiate if parser reports new values */ + guint bit_depth; + guint color_range; + guint profile; + guint color_space; + guint subsampling_x; + guint subsampling_y; +}; + +G_DEFINE_ABSTRACT_TYPE (GstV4l2CodecVp9Dec, gst_v4l2_codec_vp9_dec, + GST_TYPE_VP9_DECODER); + +#define parent_class gst_v4l2_codec_vp9_dec_parent_class + +static guint +gst_v4l2_codec_vp9_dec_get_preferred_output_delay (GstVp9Decoder * decoder, + gboolean is_live) +{ + + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + guint delay; + + if (is_live) + delay = 0; + else + delay = 1; + + gst_v4l2_decoder_set_render_delay (self->decoder, delay); + return delay; +} + +static void +gst_v4l2_codec_vp9_dec_fill_lf_params (GstV4l2CodecVp9Dec * self, + const GstVp9LoopFilterParams * lf) +{ + gint i; + + G_STATIC_ASSERT (sizeof (self->v4l2_vp9_frame.lf.ref_deltas) == + sizeof (lf->loop_filter_ref_deltas)); + G_STATIC_ASSERT (sizeof (self->v4l2_vp9_frame.lf.mode_deltas) == + sizeof (lf->loop_filter_mode_deltas)); + + for (i = 0; i < G_N_ELEMENTS (self->v4l2_vp9_frame.lf.ref_deltas); i++) + self->v4l2_vp9_frame.lf.ref_deltas[i] = lf->loop_filter_ref_deltas[i]; + + for (i = 0; i < G_N_ELEMENTS (self->v4l2_vp9_frame.lf.mode_deltas); i++) + self->v4l2_vp9_frame.lf.mode_deltas[i] = lf->loop_filter_mode_deltas[i]; +} + +static void +gst_v4l2_codec_vp9_dec_fill_seg_params (GstV4l2CodecVp9Dec * self, + const 
GstVp9SegmentationParams * s) +{ + guint i; + struct v4l2_vp9_segmentation *v4l2_segmentation = &self->v4l2_vp9_frame.seg; + + G_STATIC_ASSERT (sizeof (v4l2_segmentation->tree_probs) == + sizeof (s->segmentation_tree_probs)); + G_STATIC_ASSERT (sizeof (v4l2_segmentation->pred_probs) == + sizeof (s->segmentation_pred_prob)); + G_STATIC_ASSERT (sizeof (v4l2_segmentation->feature_data) == + sizeof (s->feature_data)); + + for (i = 0; i < G_N_ELEMENTS (v4l2_segmentation->tree_probs); i++) { + v4l2_segmentation->tree_probs[i] = s->segmentation_tree_probs[i]; + } + for (i = 0; i < G_N_ELEMENTS (v4l2_segmentation->pred_probs); i++) { + v4l2_segmentation->pred_probs[i] = s->segmentation_pred_prob[i]; + } + for (i = 0; i < G_N_ELEMENTS (v4l2_segmentation->feature_enabled); i++) { + /* see 3. Symbols (and abbreviated terms) for reference */ + + /* *INDENT-OFF* */ + v4l2_segmentation->feature_enabled[i] = + (s->feature_enabled[i][GST_VP9_SEG_LVL_ALT_Q] ? V4L2_VP9_SEGMENT_FEATURE_ENABLED(V4L2_VP9_SEG_LVL_ALT_Q) : 0) | + (s->feature_enabled[i][GST_VP9_SEG_LVL_ALT_L] ? V4L2_VP9_SEGMENT_FEATURE_ENABLED(V4L2_VP9_SEG_LVL_ALT_L) : 0) | + (s->feature_enabled[i][GST_VP9_SEG_LVL_REF_FRAME] ? V4L2_VP9_SEGMENT_FEATURE_ENABLED(V4L2_VP9_SEG_LVL_REF_FRAME) : 0) | + (s->feature_enabled[i][GST_VP9_SEG_SEG_LVL_SKIP] ? 
V4L2_VP9_SEGMENT_FEATURE_ENABLED(V4L2_VP9_SEG_LVL_SKIP) : 0); + /* *INDENT-ON* */ + } + + memcpy (v4l2_segmentation->feature_data, s->feature_data, + sizeof (v4l2_segmentation->feature_data)); +} + +static void +gst_v4l2_codec_vp9_dec_fill_prob_updates (GstV4l2CodecVp9Dec * self, + const GstVp9FrameHeader * h) +{ + struct v4l2_ctrl_vp9_compressed_hdr *probs = &self->v4l2_delta_probs; + + G_STATIC_ASSERT (sizeof (probs->tx8) == + sizeof (h->delta_probabilities.tx_probs_8x8)); + G_STATIC_ASSERT (sizeof (probs->tx16) == + sizeof (h->delta_probabilities.tx_probs_16x16)); + G_STATIC_ASSERT (sizeof (probs->tx32) == + sizeof (h->delta_probabilities.tx_probs_32x32)); + G_STATIC_ASSERT (sizeof (probs->coef) == + sizeof (h->delta_probabilities.coef)); + G_STATIC_ASSERT (sizeof (probs->skip) == + sizeof (h->delta_probabilities.skip)); + G_STATIC_ASSERT (sizeof (probs->inter_mode) == + sizeof (h->delta_probabilities.inter_mode)); + G_STATIC_ASSERT (sizeof (probs->interp_filter) == + sizeof (h->delta_probabilities.interp_filter)); + G_STATIC_ASSERT (sizeof (probs->is_inter) == + sizeof (h->delta_probabilities.is_inter)); + G_STATIC_ASSERT (sizeof (probs->comp_mode) == + sizeof (h->delta_probabilities.comp_mode)); + G_STATIC_ASSERT (sizeof (probs->single_ref) == + sizeof (h->delta_probabilities.single_ref)); + G_STATIC_ASSERT (sizeof (probs->comp_ref) == + sizeof (h->delta_probabilities.comp_ref)); + G_STATIC_ASSERT (sizeof (probs->y_mode) == + sizeof (h->delta_probabilities.y_mode)); + G_STATIC_ASSERT (sizeof (probs->partition) == + sizeof (h->delta_probabilities.partition)); + G_STATIC_ASSERT (sizeof (probs->mv.joint) == + sizeof (h->delta_probabilities.mv.joint)); + G_STATIC_ASSERT (sizeof (probs->mv.sign) == + sizeof (h->delta_probabilities.mv.sign)); + G_STATIC_ASSERT (sizeof (probs->mv.classes) == + sizeof (h->delta_probabilities.mv.klass)); + G_STATIC_ASSERT (sizeof (probs->mv.class0_bit) == + sizeof (h->delta_probabilities.mv.class0_bit)); + G_STATIC_ASSERT (sizeof 
(probs->mv.bits) == + sizeof (h->delta_probabilities.mv.bits)); + G_STATIC_ASSERT (sizeof (probs->mv.class0_fr) == + sizeof (h->delta_probabilities.mv.class0_fr)); + G_STATIC_ASSERT (sizeof (probs->mv.fr) == + sizeof (h->delta_probabilities.mv.fr)); + G_STATIC_ASSERT (sizeof (probs->mv.class0_hp) == + sizeof (h->delta_probabilities.mv.class0_hp)); + G_STATIC_ASSERT (sizeof (probs->mv.hp) == + sizeof (h->delta_probabilities.mv.hp)); + + memset (probs, 0, sizeof (*probs)); + + probs->tx_mode = h->tx_mode; + memcpy (probs->tx8, h->delta_probabilities.tx_probs_8x8, sizeof (probs->tx8)); + memcpy (probs->tx16, h->delta_probabilities.tx_probs_16x16, + sizeof (probs->tx16)); + memcpy (probs->tx32, h->delta_probabilities.tx_probs_32x32, + sizeof (probs->tx32)); + memcpy (probs->coef, h->delta_probabilities.coef, sizeof (probs->coef)); + memcpy (probs->skip, h->delta_probabilities.skip, sizeof (probs->skip)); + memcpy (probs->inter_mode, h->delta_probabilities.inter_mode, + sizeof (probs->inter_mode)); + memcpy (probs->interp_filter, + h->delta_probabilities.interp_filter, sizeof (probs->interp_filter)); + memcpy (probs->is_inter, h->delta_probabilities.is_inter, + sizeof (probs->is_inter)); + memcpy (probs->comp_mode, h->delta_probabilities.comp_mode, + sizeof (probs->comp_mode)); + memcpy (probs->single_ref, h->delta_probabilities.single_ref, + sizeof (probs->single_ref)); + memcpy (probs->comp_ref, h->delta_probabilities.comp_ref, + sizeof (probs->comp_ref)); + memcpy (probs->y_mode, h->delta_probabilities.y_mode, sizeof (probs->y_mode)); + memcpy (probs->partition, h->delta_probabilities.partition, + sizeof (probs->partition)); + + memcpy (probs->mv.joint, h->delta_probabilities.mv.joint, + sizeof (h->delta_probabilities.mv.joint)); + memcpy (probs->mv.sign, h->delta_probabilities.mv.sign, + sizeof (h->delta_probabilities.mv.sign)); + memcpy (probs->mv.classes, h->delta_probabilities.mv.klass, + sizeof (h->delta_probabilities.mv.klass)); + memcpy (probs->mv.class0_bit, 
+ h->delta_probabilities.mv.class0_bit, + sizeof (h->delta_probabilities.mv.class0_bit)); + memcpy (probs->mv.bits, h->delta_probabilities.mv.bits, + sizeof (h->delta_probabilities.mv.bits)); + memcpy (probs->mv.class0_fr, h->delta_probabilities.mv.class0_fr, + sizeof (h->delta_probabilities.mv.class0_fr)); + memcpy (probs->mv.fr, h->delta_probabilities.mv.fr, + sizeof (h->delta_probabilities.mv.fr)); + memcpy (probs->mv.class0_hp, h->delta_probabilities.mv.class0_hp, + sizeof (h->delta_probabilities.mv.class0_hp)); + memcpy (probs->mv.hp, h->delta_probabilities.mv.hp, + sizeof (h->delta_probabilities.mv.hp)); +} + +static void +gst_v4l2_codecs_vp9_dec_fill_refs (GstV4l2CodecVp9Dec * self, + const GstVp9FrameHeader * h, const GstVp9Dpb * reference_frames) +{ + GstVp9Picture *ref_pic; + + if (reference_frames && reference_frames->pic_list[h->ref_frame_idx[0]]) { + ref_pic = reference_frames->pic_list[h->ref_frame_idx[0]]; + self->v4l2_vp9_frame.last_frame_ts = ref_pic->system_frame_number * 1000; + } + + if (reference_frames && reference_frames->pic_list[h->ref_frame_idx[1]]) { + ref_pic = reference_frames->pic_list[h->ref_frame_idx[1]]; + self->v4l2_vp9_frame.golden_frame_ts = ref_pic->system_frame_number * 1000; + } + + if (reference_frames && reference_frames->pic_list[h->ref_frame_idx[2]]) { + ref_pic = reference_frames->pic_list[h->ref_frame_idx[2]]; + self->v4l2_vp9_frame.alt_frame_ts = ref_pic->system_frame_number * 1000; + } +} + +static void +gst_v4l2_codec_vp9_dec_fill_dec_params (GstV4l2CodecVp9Dec * self, + const GstVp9FrameHeader * h, const GstVp9Dpb * reference_frames) +{ + /* *INDENT-OFF* */ + self->v4l2_vp9_frame = (struct v4l2_ctrl_vp9_frame) { + .flags = + (h->frame_type == GST_VP9_KEY_FRAME ? V4L2_VP9_FRAME_FLAG_KEY_FRAME : 0) | + (h->show_frame ? V4L2_VP9_FRAME_FLAG_SHOW_FRAME : 0) | + (h->error_resilient_mode ? V4L2_VP9_FRAME_FLAG_ERROR_RESILIENT : 0) | + (h->intra_only ? V4L2_VP9_FRAME_FLAG_INTRA_ONLY : 0) | + (h->allow_high_precision_mv ? 
V4L2_VP9_FRAME_FLAG_ALLOW_HIGH_PREC_MV : 0) | + (h->refresh_frame_context ? V4L2_VP9_FRAME_FLAG_REFRESH_FRAME_CTX : 0) | + (h->frame_parallel_decoding_mode ? V4L2_VP9_FRAME_FLAG_PARALLEL_DEC_MODE : 0) | + (self->subsampling_x ? V4L2_VP9_FRAME_FLAG_X_SUBSAMPLING : 0) | + (self->subsampling_y ? V4L2_VP9_FRAME_FLAG_Y_SUBSAMPLING : 0) | + (self->color_range ? V4L2_VP9_FRAME_FLAG_COLOR_RANGE_FULL_SWING : 0), + + .compressed_header_size = h->header_size_in_bytes, + .uncompressed_header_size = h->frame_header_length_in_bytes, + .profile = h->profile, + .frame_context_idx = h->frame_context_idx, + .bit_depth = self->bit_depth, + .interpolation_filter = h->interpolation_filter, + .tile_cols_log2 = h->tile_cols_log2, + .tile_rows_log2 = h->tile_rows_log2, + .reference_mode = h->reference_mode, + .frame_width_minus_1 = h->width - 1, + .frame_height_minus_1 = h->height - 1, + .render_width_minus_1 = h->render_width ? h->render_width - 1 : h->width - 1, + .render_height_minus_1 = h->render_height ? h->render_height - 1: h->height - 1, + .ref_frame_sign_bias = + (h->ref_frame_sign_bias[GST_VP9_REF_FRAME_LAST] ? V4L2_VP9_SIGN_BIAS_LAST : 0) | + (h->ref_frame_sign_bias[GST_VP9_REF_FRAME_GOLDEN] ? V4L2_VP9_SIGN_BIAS_GOLDEN : 0) | + (h->ref_frame_sign_bias[GST_VP9_REF_FRAME_ALTREF] ? V4L2_VP9_SIGN_BIAS_ALT : 0), + + .lf = (struct v4l2_vp9_loop_filter) { + .flags = + (h->loop_filter_params.loop_filter_delta_enabled ? V4L2_VP9_LOOP_FILTER_FLAG_DELTA_ENABLED : 0) | + (h->loop_filter_params.loop_filter_delta_update ? 
V4L2_VP9_LOOP_FILTER_FLAG_DELTA_UPDATE : 0), + .level = h->loop_filter_params.loop_filter_level, + .sharpness = h->loop_filter_params.loop_filter_sharpness + }, + + .quant = (struct v4l2_vp9_quantization) { + .base_q_idx = h->quantization_params.base_q_idx, /* used for Y (luma) AC coefficients */ + .delta_q_y_dc = h->quantization_params.delta_q_y_dc, + .delta_q_uv_dc = h->quantization_params.delta_q_uv_dc, + .delta_q_uv_ac = h->quantization_params.delta_q_uv_ac, + }, + + .seg = (struct v4l2_vp9_segmentation) { + .flags = + (h->segmentation_params.segmentation_enabled ? V4L2_VP9_SEGMENTATION_FLAG_ENABLED : 0) | + (h->segmentation_params.segmentation_update_map ? V4L2_VP9_SEGMENTATION_FLAG_UPDATE_MAP : 0) | + (h->segmentation_params.segmentation_temporal_update ? V4L2_VP9_SEGMENTATION_FLAG_TEMPORAL_UPDATE : 0) | + (h->segmentation_params.segmentation_update_data ? V4L2_VP9_SEGMENTATION_FLAG_UPDATE_DATA : 0) | + (h->segmentation_params.segmentation_abs_or_delta_update ? V4L2_VP9_SEGMENTATION_FLAG_ABS_OR_DELTA_UPDATE : 0), + } + }; + /* *INDENT-ON* */ + + switch (h->reset_frame_context) { + case 0: + case 1: + self->v4l2_vp9_frame.reset_frame_context = V4L2_VP9_RESET_FRAME_CTX_NONE; + break; + case 2: + self->v4l2_vp9_frame.reset_frame_context = V4L2_VP9_RESET_FRAME_CTX_SPEC; + break; + case 3: + self->v4l2_vp9_frame.reset_frame_context = V4L2_VP9_RESET_FRAME_CTX_ALL; + break; + default: + break; + } + + gst_v4l2_codecs_vp9_dec_fill_refs (self, h, reference_frames); + gst_v4l2_codec_vp9_dec_fill_lf_params (self, &h->loop_filter_params); + gst_v4l2_codec_vp9_dec_fill_seg_params (self, &h->segmentation_params); +} + +static gboolean +gst_v4l2_codec_vp9_dec_open (GstVideoDecoder * decoder) +{ + GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder); + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + + if (!gst_v4l2_decoder_open (self->decoder)) { + GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ_WRITE, + ("Failed to open VP9 decoder"), + ("gst_v4l2_decoder_open() 
failed: %s", g_strerror (errno))); + return FALSE; + } + + /* V4L2 does not support non-keyframe resolution change, this will ask the + * base class to drop frame until the next keyframe as a workaround. */ + gst_vp9_decoder_set_non_keyframe_format_change_support (vp9dec, FALSE); + + return TRUE; +} + +static gboolean +gst_v4l2_codec_vp9_dec_close (GstVideoDecoder * decoder) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + gst_v4l2_decoder_close (self->decoder); + return TRUE; +} + +static void +gst_v4l2_codec_vp9_dec_reset_allocation (GstV4l2CodecVp9Dec * self) +{ + if (self->sink_allocator) { + gst_v4l2_codec_allocator_detach (self->sink_allocator); + g_clear_object (&self->sink_allocator); + } + + if (self->src_allocator) { + gst_v4l2_codec_allocator_detach (self->src_allocator); + g_clear_object (&self->src_allocator); + g_clear_object (&self->src_pool); + } +} + +static gboolean +gst_v4l2_codec_vp9_dec_stop (GstVideoDecoder * decoder) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SINK); + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SRC); + + gst_v4l2_codec_vp9_dec_reset_allocation (self); + + if (self->output_state) + gst_video_codec_state_unref (self->output_state); + self->output_state = NULL; + + return GST_VIDEO_DECODER_CLASS (parent_class)->stop (decoder); +} + +static gboolean +gst_v4l2_codec_vp9_dec_negotiate (GstVideoDecoder * decoder) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder); + + /* *INDENT-OFF* */ + struct v4l2_ext_control control[] = { + { + .id = V4L2_CID_STATELESS_VP9_FRAME, + .ptr = &self->v4l2_vp9_frame, + .size = sizeof (self->v4l2_vp9_frame), + }, + }; + /* *INDENT-ON* */ + + GstCaps *filter, *caps; + /* Ignore downstream renegotiation request. 
*/ + if (!self->need_negotiation) + return TRUE; + self->need_negotiation = FALSE; + + GST_DEBUG_OBJECT (self, "Negotiate"); + + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SINK); + gst_v4l2_decoder_streamoff (self->decoder, GST_PAD_SRC); + + gst_v4l2_codec_vp9_dec_reset_allocation (self); + + if (!gst_v4l2_decoder_set_sink_fmt (self->decoder, V4L2_PIX_FMT_VP9_FRAME, + self->width, self->height, self->bit_depth)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("Failed to configure VP9 decoder"), + ("gst_v4l2_decoder_set_sink_fmt() failed: %s", g_strerror (errno))); + gst_v4l2_decoder_close (self->decoder); + return FALSE; + } + if (!gst_v4l2_decoder_set_controls (self->decoder, NULL, control, + G_N_ELEMENTS (control))) { + GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, + ("Driver does not support the selected stream."), (NULL)); + return FALSE; + } + + filter = gst_v4l2_decoder_enum_src_formats (self->decoder); + if (!filter) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("No supported decoder output formats"), (NULL)); + return FALSE; + } + GST_DEBUG_OBJECT (self, "Supported output formats: %" GST_PTR_FORMAT, filter); + + caps = gst_pad_peer_query_caps (decoder->srcpad, filter); + gst_caps_unref (filter); + GST_DEBUG_OBJECT (self, "Peer supported formats: %" GST_PTR_FORMAT, caps); + + if (!gst_v4l2_decoder_select_src_format (self->decoder, caps, &self->vinfo)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, + ("Unsupported pixel format"), + ("No support for %ux%u format %s", self->width, self->height, + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (&self->vinfo)))); + gst_caps_unref (caps); + return FALSE; + } + gst_caps_unref (caps); + + if (self->output_state) + gst_video_codec_state_unref (self->output_state); + + self->output_state = + gst_video_decoder_set_output_state (GST_VIDEO_DECODER (self), + self->vinfo.finfo->format, self->width, + self->height, vp9dec->input_state); + + self->output_state->caps = gst_video_info_to_caps 
(&self->output_state->info); + + if (GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder)) { + if (!gst_v4l2_decoder_streamon (self->decoder, GST_PAD_SINK)) { + GST_ELEMENT_ERROR (self, RESOURCE, FAILED, + ("Could not enable the decoder driver."), + ("VIDIOC_STREAMON(SINK) failed: %s", g_strerror (errno))); + return FALSE; + } + + if (!gst_v4l2_decoder_streamon (self->decoder, GST_PAD_SRC)) { + GST_ELEMENT_ERROR (self, RESOURCE, FAILED, + ("Could not enable the decoder driver."), + ("VIDIOC_STREAMON(SRC) failed: %s", g_strerror (errno))); + return FALSE; + } + + return TRUE; + } + + return FALSE; +} + +static gboolean +gst_v4l2_codec_vp9_dec_decide_allocation (GstVideoDecoder * decoder, + GstQuery * query) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + guint min = 0; + guint num_bitstream; + + self->has_videometa = gst_query_find_allocation_meta (query, + GST_VIDEO_META_API_TYPE, NULL); + + g_clear_object (&self->src_pool); + g_clear_object (&self->src_allocator); + + if (gst_query_get_n_allocation_pools (query) > 0) + gst_query_parse_nth_allocation_pool (query, 0, NULL, NULL, &min, NULL); + + min = MAX (2, min); + + num_bitstream = 1 + + MAX (1, gst_v4l2_decoder_get_render_delay (self->decoder)); + + self->sink_allocator = gst_v4l2_codec_allocator_new (self->decoder, + GST_PAD_SINK, num_bitstream); + if (!self->sink_allocator) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to allocate sink buffers."), (NULL)); + return FALSE; + } + + self->src_allocator = gst_v4l2_codec_allocator_new (self->decoder, + GST_PAD_SRC, GST_VP9_REF_FRAMES + min + 4); + if (!self->src_allocator) { + GST_ELEMENT_ERROR (self, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to allocate source buffers."), (NULL)); + g_clear_object (&self->sink_allocator); + return FALSE; + } + + self->src_pool = gst_v4l2_codec_pool_new (self->src_allocator, &self->vinfo); + + /* Our buffer pool is internal, we will let the base class create a video + * 
pool, and use it if we are running out of buffers or if downstream does + * not support GstVideoMeta */ + return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation + (decoder, query); +} + +static GstFlowReturn +gst_v4l2_codec_vp9_dec_new_sequence (GstVp9Decoder * decoder, + const GstVp9FrameHeader * frame_hdr) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + gboolean negotiation_needed = FALSE; + + if (self->vinfo.finfo->format == GST_VIDEO_FORMAT_UNKNOWN) + negotiation_needed = TRUE; + + /* TODO Check if current buffers are large enough, and reuse them */ + if (self->width != frame_hdr->width || self->height != frame_hdr->height) { + self->width = frame_hdr->width; + self->height = frame_hdr->height; + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "Resolution changed to %dx%d", + self->width, self->height); + } + + if (self->subsampling_x != frame_hdr->subsampling_x || + self->subsampling_y != frame_hdr->subsampling_y) { + GST_DEBUG_OBJECT (self, + "subsampling changed from x: %d, y: %d to x: %d, y: %d", + self->subsampling_x, self->subsampling_y, + frame_hdr->subsampling_x, frame_hdr->subsampling_y); + self->subsampling_x = frame_hdr->subsampling_x; + self->subsampling_y = frame_hdr->subsampling_y; + negotiation_needed = TRUE; + } + + if (frame_hdr->color_space != GST_VP9_CS_UNKNOWN && + frame_hdr->color_space != GST_VP9_CS_RESERVED_2 && + frame_hdr->color_space != self->color_space) { + GST_DEBUG_OBJECT (self, "colorspace changed from %d to %d", + self->color_space, frame_hdr->color_space); + self->color_space = frame_hdr->color_space; + negotiation_needed = TRUE; + } + + if (frame_hdr->color_range != self->color_range) { + GST_DEBUG_OBJECT (self, "color range changed from %d to %d", + self->color_range, frame_hdr->color_range); + self->color_range = frame_hdr->color_range; + negotiation_needed = TRUE; + } + + if (frame_hdr->profile != GST_VP9_PROFILE_UNDEFINED && + frame_hdr->profile != self->profile) { + GST_DEBUG_OBJECT (self, 
"profile changed from %d to %d", self->profile, + frame_hdr->profile); + self->profile = frame_hdr->profile; + negotiation_needed = TRUE; + } + + if (frame_hdr->bit_depth != self->bit_depth) { + GST_DEBUG_OBJECT (self, "bit-depth changed from %d to %d", + self->bit_depth, frame_hdr->bit_depth); + self->bit_depth = frame_hdr->bit_depth; + negotiation_needed = TRUE; + } + + gst_v4l2_codec_vp9_dec_fill_dec_params (self, frame_hdr, NULL); + gst_v4l2_codec_vp9_dec_fill_prob_updates (self, frame_hdr); + + if (negotiation_needed) { + self->need_negotiation = TRUE; + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_ERROR; + } + } + + /* Check if we can zero-copy buffers */ + if (!self->has_videometa) { + GstVideoInfo ref_vinfo; + gint i; + + gst_video_info_set_format (&ref_vinfo, GST_VIDEO_INFO_FORMAT (&self->vinfo), + self->width, self->height); + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&self->vinfo); i++) { + if (self->vinfo.stride[i] != ref_vinfo.stride[i] || + self->vinfo.offset[i] != ref_vinfo.offset[i]) { + GST_WARNING_OBJECT (self, + "GstVideoMeta support required, copying frames."); + self->copy_frames = TRUE; + break; + } + } + } else { + self->copy_frames = FALSE; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_v4l2_codec_vp9_dec_start_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + + /* FIXME base class should not call us if negotiation failed */ + if (!self->sink_allocator) + return GST_FLOW_ERROR; + + /* Ensure we have a bitstream to write into */ + if (!self->bitstream) { + self->bitstream = gst_v4l2_codec_allocator_alloc (self->sink_allocator); + + if (!self->bitstream) { + GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, + ("Not enough memory to decode VP9 stream."), (NULL)); + return GST_FLOW_ERROR; + } + + if (!gst_memory_map (self->bitstream, 
&self->bitstream_map, GST_MAP_WRITE)) { + GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, + ("Could not access bitstream memory for writing"), (NULL)); + g_clear_pointer (&self->bitstream, gst_memory_unref); + return GST_FLOW_ERROR; + } + } + + /* We use this field to track how much we have written */ + self->bitstream_map.size = 0; + + return GST_FLOW_OK; +} + +static void +gst_v4l2_codec_vp9_dec_reset_picture (GstV4l2CodecVp9Dec * self) +{ + if (self->bitstream) { + if (self->bitstream_map.memory) + gst_memory_unmap (self->bitstream, &self->bitstream_map); + g_clear_pointer (&self->bitstream, gst_memory_unref); + self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + } +} + +static GstFlowReturn +gst_v4l2_codec_vp9_dec_decode_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture, GstVp9Dpb * dpb) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + guint8 *bitstream_data = self->bitstream_map.data; + + if (self->bitstream_map.maxsize < picture->size) { + GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, + ("Not enough space to send picture bitstream."), (NULL)); + gst_v4l2_codec_vp9_dec_reset_picture (self); + return GST_FLOW_ERROR; + } + + gst_v4l2_codec_vp9_dec_fill_dec_params (self, &picture->frame_hdr, dpb); + + if (decoder->parse_compressed_headers) + gst_v4l2_codec_vp9_dec_fill_prob_updates (self, &picture->frame_hdr); + + memcpy (bitstream_data, picture->data, picture->size); + self->bitstream_map.size = picture->size; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_v4l2_codec_vp9_dec_end_picture (GstVp9Decoder * decoder, + GstVp9Picture * picture) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + GstVideoCodecFrame *frame; + GstV4l2Request *request = NULL; + GstFlowReturn flow_ret; + gsize bytesused; + + /* *INDENT-OFF* */ + struct v4l2_ext_control decode_params_control[] = { + { + .id = V4L2_CID_STATELESS_VP9_FRAME, + .ptr = &self->v4l2_vp9_frame, + .size = sizeof(self->v4l2_vp9_frame), + }, + { + .id = 
V4L2_CID_STATELESS_VP9_COMPRESSED_HDR, + .ptr = &self->v4l2_delta_probs, + .size = sizeof (self->v4l2_delta_probs), + }, + }; + /* *INDENT-ON* */ + + bytesused = self->bitstream_map.size; + gst_memory_unmap (self->bitstream, &self->bitstream_map); + self->bitstream_map = (GstMapInfo) GST_MAP_INFO_INIT; + gst_memory_resize (self->bitstream, 0, bytesused); + + frame = gst_video_decoder_get_frame (GST_VIDEO_DECODER (self), + picture->system_frame_number); + g_return_val_if_fail (frame, FALSE); + + flow_ret = gst_buffer_pool_acquire_buffer (GST_BUFFER_POOL (self->src_pool), + &frame->output_buffer, NULL); + if (flow_ret != GST_FLOW_OK) { + if (flow_ret == GST_FLOW_FLUSHING) + GST_DEBUG_OBJECT (self, "Frame decoding aborted, we are flushing."); + else + GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, + ("No more picture buffer available."), (NULL)); + goto fail; + } + + request = gst_v4l2_decoder_alloc_request (self->decoder, + picture->system_frame_number, self->bitstream, frame->output_buffer); + + gst_video_codec_frame_unref (frame); + + if (!request) { + GST_ELEMENT_ERROR (decoder, RESOURCE, NO_SPACE_LEFT, + ("Failed to allocate a media request object."), (NULL)); + goto fail; + } + + if (!gst_v4l2_decoder_set_controls (self->decoder, request, + decode_params_control, G_N_ELEMENTS (decode_params_control))) { + GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, + ("Driver did not accept the bitstream parameters."), (NULL)); + goto fail; + } + + if (!gst_v4l2_request_queue (request, 0)) { + GST_ELEMENT_ERROR (decoder, RESOURCE, WRITE, + ("Driver did not accept the decode request."), (NULL)); + goto fail; + } + + gst_vp9_picture_set_user_data (picture, request, + (GDestroyNotify) gst_v4l2_request_unref); + gst_v4l2_codec_vp9_dec_reset_picture (self); + return GST_FLOW_OK; + +fail: + if (request) + gst_v4l2_request_unref (request); + + gst_v4l2_codec_vp9_dec_reset_picture (self); + return GST_FLOW_ERROR; +} + +static gboolean +gst_v4l2_codec_vp9_dec_copy_output_buffer 
(GstV4l2CodecVp9Dec * self, + GstVideoCodecFrame * codec_frame) +{ + GstVideoFrame src_frame; + GstVideoFrame dest_frame; + GstVideoInfo dest_vinfo; + GstBuffer *buffer; + + gst_video_info_set_format (&dest_vinfo, GST_VIDEO_INFO_FORMAT (&self->vinfo), + self->width, self->height); + + buffer = gst_video_decoder_allocate_output_buffer (GST_VIDEO_DECODER (self)); + if (!buffer) + goto fail; + + if (!gst_video_frame_map (&src_frame, &self->vinfo, + codec_frame->output_buffer, GST_MAP_READ)) + goto fail; + + if (!gst_video_frame_map (&dest_frame, &dest_vinfo, buffer, GST_MAP_WRITE)) { + gst_video_frame_unmap (&dest_frame); + goto fail; + } + + /* gst_video_frame_copy can crop this, but does not know, so let make it + * think it's all right */ + GST_VIDEO_INFO_WIDTH (&src_frame.info) = self->width; + GST_VIDEO_INFO_HEIGHT (&src_frame.info) = self->height; + + if (!gst_video_frame_copy (&dest_frame, &src_frame)) { + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + goto fail; + } + + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + gst_buffer_replace (&codec_frame->output_buffer, buffer); + gst_buffer_unref (buffer); + + return TRUE; + +fail: + GST_ERROR_OBJECT (self, "Failed copy output buffer."); + return FALSE; +} + + +static GstVp9Picture * +gst_v4l2_codec_vp9_dec_duplicate_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstVp9Picture *new_picture; + + GST_DEBUG_OBJECT (decoder, "Duplicate picture %u", + picture->system_frame_number); + + new_picture = gst_vp9_picture_new (); + new_picture->frame_hdr = picture->frame_hdr; + new_picture->system_frame_number = frame->system_frame_number; + + if (GST_MINI_OBJECT_FLAG_IS_SET (picture, FLAG_PICTURE_OUTPUTED)) { + GstBuffer *output_buffer = gst_vp9_picture_get_user_data (picture); + if (output_buffer) + frame->output_buffer = gst_buffer_ref (output_buffer); + } else { + GstV4l2Request *request = 
gst_vp9_picture_get_user_data (picture); + gst_vp9_picture_set_user_data (new_picture, gst_v4l2_request_ref (request), + (GDestroyNotify) gst_v4l2_request_unref); + frame->output_buffer = gst_v4l2_request_dup_pic_buf (request); + } + + return new_picture; +} + +static GstFlowReturn +gst_v4l2_codec_vp9_dec_output_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstV4l2Request *request = gst_vp9_picture_get_user_data (picture); + gint ret; + + GST_DEBUG_OBJECT (self, "Output picture %u", picture->system_frame_number); + + if (request) { + ret = gst_v4l2_request_set_done (request); + if (ret == 0) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Decoding frame took too long"), (NULL)); + goto error; + } else if (ret < 0) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Decoding request failed: %s", g_strerror (errno)), (NULL)); + goto error; + } + g_return_val_if_fail (frame->output_buffer, GST_FLOW_ERROR); + + if (gst_v4l2_request_failed (request)) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Failed to decode frame %u", picture->system_frame_number), (NULL)); + goto error; + } + + /* Hold on reference buffers for the rest of the picture lifetime */ + gst_vp9_picture_set_user_data (picture, + gst_buffer_ref (frame->output_buffer), + (GDestroyNotify) gst_buffer_unref); + + GST_MINI_OBJECT_FLAG_SET (picture, FLAG_PICTURE_OUTPUTED); + } + + /* This may happen if we duplicate a picture witch failed to decode */ + if (!frame->output_buffer) { + GST_ELEMENT_ERROR (self, STREAM, DECODE, + ("Failed to decode frame %u", picture->system_frame_number), (NULL)); + goto error; + } + + if (self->copy_frames) + gst_v4l2_codec_vp9_dec_copy_output_buffer (self, frame); + + gst_vp9_picture_unref (picture); + + return gst_video_decoder_finish_frame (vdec, frame); + +error: + gst_video_decoder_drop_frame (vdec, frame); + 
gst_vp9_picture_unref (picture); + + return GST_FLOW_ERROR; +} + +static void +gst_v4l2_codec_vp9_dec_set_flushing (GstV4l2CodecVp9Dec * self, + gboolean flushing) +{ + if (self->sink_allocator) + gst_v4l2_codec_allocator_set_flushing (self->sink_allocator, flushing); + if (self->src_allocator) + gst_v4l2_codec_allocator_set_flushing (self->src_allocator, flushing); +} + +static gboolean +gst_v4l2_codec_vp9_dec_flush (GstVideoDecoder * decoder) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + + GST_DEBUG_OBJECT (self, "Flushing decoder state."); + + gst_v4l2_decoder_flush (self->decoder); + gst_v4l2_codec_vp9_dec_set_flushing (self, FALSE); + + return GST_VIDEO_DECODER_CLASS (parent_class)->flush (decoder); +} + +static gboolean +gst_v4l2_codec_vp9_dec_sink_event (GstVideoDecoder * decoder, GstEvent * event) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (decoder); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_FLUSH_START: + GST_DEBUG_OBJECT (self, "flush start"); + gst_v4l2_codec_vp9_dec_set_flushing (self, TRUE); + break; + default: + break; + } + + return GST_VIDEO_DECODER_CLASS (parent_class)->sink_event (decoder, event); +} + +static GstStateChangeReturn +gst_v4l2_codec_vp9_dec_change_state (GstElement * element, + GstStateChange transition) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (element); + + if (transition == GST_STATE_CHANGE_PAUSED_TO_READY) + gst_v4l2_codec_vp9_dec_set_flushing (self, TRUE); + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + +static void +gst_v4l2_codec_vp9_dec_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (object); + GObject *dec = G_OBJECT (self->decoder); + + switch (prop_id) { + default: + gst_v4l2_decoder_set_property (dec, prop_id - PROP_LAST, value, pspec); + break; + } +} + +static void +gst_v4l2_codec_vp9_dec_get_property (GObject * object, guint 
prop_id, + GValue * value, GParamSpec * pspec) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (object); + GObject *dec = G_OBJECT (self->decoder); + + switch (prop_id) { + default: + gst_v4l2_decoder_get_property (dec, prop_id - PROP_LAST, value, pspec); + break; + } +} + +static void +gst_v4l2_codec_vp9_dec_init (GstV4l2CodecVp9Dec * self) +{ + GstVp9Decoder *parent = GST_VP9_DECODER (self); + parent->parse_compressed_headers = TRUE; +} + +static void +gst_v4l2_codec_vp9_dec_subinit (GstV4l2CodecVp9Dec * self, + GstV4l2CodecVp9DecClass * klass) +{ + self->decoder = gst_v4l2_decoder_new (klass->device); + gst_video_info_init (&self->vinfo); +} + +static void +gst_v4l2_codec_vp9_dec_dispose (GObject * object) +{ + GstV4l2CodecVp9Dec *self = GST_V4L2_CODEC_VP9_DEC (object); + + g_clear_object (&self->decoder); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_v4l2_codec_vp9_dec_class_init (GstV4l2CodecVp9DecClass * klass) +{ +} + +static void +gst_v4l2_codec_vp9_dec_subclass_init (GstV4l2CodecVp9DecClass * klass, + GstV4l2CodecDevice * device) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + GstVp9DecoderClass *vp9decoder_class = GST_VP9_DECODER_CLASS (klass); + + gobject_class->set_property = gst_v4l2_codec_vp9_dec_set_property; + gobject_class->get_property = gst_v4l2_codec_vp9_dec_get_property; + gobject_class->dispose = gst_v4l2_codec_vp9_dec_dispose; + + gst_element_class_set_static_metadata (element_class, + "V4L2 Stateless VP9 Video Decoder", + "Codec/Decoder/Video/Hardware", + "A V4L2 based VP9 video decoder", + "Daniel Almeida <daniel.almeida@collabora.com>"); + + gst_element_class_add_static_pad_template (element_class, &sink_template); + gst_element_class_add_static_pad_template (element_class, &src_template); + element_class->change_state = + GST_DEBUG_FUNCPTR 
(gst_v4l2_codec_vp9_dec_change_state); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_close); + decoder_class->stop = GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_stop); + decoder_class->negotiate = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_negotiate); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_decide_allocation); + decoder_class->flush = GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_flush); + decoder_class->sink_event = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_sink_event); + + vp9decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_new_sequence); + vp9decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_start_picture); + vp9decoder_class->decode_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_decode_picture); + vp9decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_end_picture); + vp9decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_output_picture); + vp9decoder_class->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_duplicate_picture); + vp9decoder_class->get_preferred_output_delay = + GST_DEBUG_FUNCPTR (gst_v4l2_codec_vp9_dec_get_preferred_output_delay); + + klass->device = device; + gst_v4l2_decoder_install_properties (gobject_class, PROP_LAST, device); +} + +static void gst_v4l2_codec_vp9_alpha_decode_bin_subclass_init + (GstV4l2CodecAlphaDecodeBinClass * klass, gchar * decoder_name) +{ + GstV4l2CodecAlphaDecodeBinClass *adbin_class = + (GstV4l2CodecAlphaDecodeBinClass *) klass; + GstElementClass *element_class = (GstElementClass *) klass; + + adbin_class->decoder_name = decoder_name; + gst_element_class_add_static_pad_template (element_class, &alpha_template); + + gst_element_class_set_static_metadata (element_class, + "VP9 Alpha Decoder", "Codec/Decoder/Video", + "Wrapper bin to decode VP9 with alpha stream.", + "Nicolas 
Dufresne <nicolas.dufresne@collabora.com>"); +} + +void +gst_v4l2_codec_vp9_dec_register (GstPlugin * plugin, GstV4l2Decoder * decoder, + GstV4l2CodecDevice * device, guint rank) +{ + gchar *element_name; + GstCaps *src_caps, *alpha_caps; + + GST_DEBUG_CATEGORY_INIT (v4l2_vp9dec_debug, "v4l2codecs-vp9dec", 0, + "V4L2 stateless VP9 decoder"); + + if (!gst_v4l2_decoder_set_sink_fmt (decoder, V4L2_PIX_FMT_VP9_FRAME, + 320, 240, 8)) + return; + src_caps = gst_v4l2_decoder_enum_src_formats (decoder); + + if (gst_caps_is_empty (src_caps)) { + GST_WARNING ("Not registering VP9 decoder since it produces no " + "supported format"); + goto done; + } + + gst_v4l2_decoder_register (plugin, GST_TYPE_V4L2_CODEC_VP9_DEC, + (GClassInitFunc) gst_v4l2_codec_vp9_dec_subclass_init, + gst_mini_object_ref (GST_MINI_OBJECT (device)), + (GInstanceInitFunc) gst_v4l2_codec_vp9_dec_subinit, + "v4l2sl%svp9dec", device, rank, &element_name); + + if (!element_name) + goto done; + + alpha_caps = gst_caps_from_string ("video/x-raw,format={I420, NV12}"); + + if (gst_caps_can_intersect (src_caps, alpha_caps)) + gst_v4l2_codec_alpha_decode_bin_register (plugin, + (GClassInitFunc) gst_v4l2_codec_vp9_alpha_decode_bin_subclass_init, + element_name, "v4l2slvp9%salphadecodebin", device, rank); + + gst_caps_unref (alpha_caps); + +done: + gst_caps_unref (src_caps); +}
gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2codecvp9dec.h
Added
@@ -0,0 +1,55 @@ + +/* GStreamer + * Copyright (C) 2021 Daniel Almeida <daniel.almeida@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_V4L2_CODEC_VP9_DEC_H__ +#define __GST_V4L2_CODEC_VP9_DEC_H__ + +#define GST_USE_UNSTABLE_API +#include <gst/codecs/gstvp9decoder.h> + +#include "gstv4l2decoder.h" + +G_BEGIN_DECLS + +#define GST_TYPE_V4L2_CODEC_VP9_DEC (gst_v4l2_codec_vp9_dec_get_type()) +#define GST_V4L2_CODEC_VP9_DEC(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_V4L2_CODEC_VP9_DEC,GstV4l2CodecVp9Dec)) +#define GST_V4L2_CODEC_VP9_DEC_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_V4L2_CODEC_VP9_DEC,GstV4l2CodecVp9DecClass)) +#define GST_V4L2_CODEC_VP9_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), GST_TYPE_V4L2_CODEC_VP9_DEC, GstV4l2CodecVp9DecClass)) +#define GST_IS_V4L2_CODEC_VP9_DEC(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_V4L2_CODEC_VP9_DEC)) +#define GST_IS_V4L2_CODEC_VP9_DEC_CLASS(obj) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_V4L2_CODEC_VP9_DEC)) + +typedef struct _GstV4l2CodecVp9Dec GstV4l2CodecVp9Dec; +typedef struct _GstV4l2CodecVp9DecClass GstV4l2CodecVp9DecClass; + +struct _GstV4l2CodecVp9DecClass +{ + GstVp9DecoderClass parent_class; + GstV4l2CodecDevice *device; +}; + +GType 
gst_v4l2_codec_vp9_dec_get_type (void); +void gst_v4l2_codec_vp9_dec_register (GstPlugin * plugin, + GstV4l2Decoder * decoder, + GstV4l2CodecDevice * device, + guint rank); + +G_END_DECLS + +#endif /* __GST_V4L2_CODEC_VP9_DEC_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2decoder.c -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2decoder.c
Changed
@@ -29,12 +29,15 @@ #include "linux/videodev2.h" #include <fcntl.h> +#include <sys/ioctl.h> #include <sys/stat.h> #include <sys/types.h> #include <unistd.h> #include <gst/base/base.h> +#define IMAGE_MINSZ 4096 + GST_DEBUG_CATEGORY (v4l2_decoder_debug); #define GST_CAT_DEFAULT v4l2_decoder_debug @@ -47,12 +50,22 @@ struct _GstV4l2Request { + /* non-thread safe */ + gint ref_count; + GstV4l2Decoder *decoder; gint fd; + guint32 frame_num; GstMemory *bitstream; + GstBuffer *pic_buf; GstPoll *poll; GstPollFD pollfd; + + /* request state */ gboolean pending; + gboolean failed; + gboolean hold_pic_buf; + gboolean sub_request; }; struct _GstV4l2Decoder @@ -64,6 +77,7 @@ gint video_fd; GstQueueArray *request_pool; GstQueueArray *pending_requests; + guint version; enum v4l2_buf_type src_buf_type; enum v4l2_buf_type sink_buf_type; @@ -72,12 +86,15 @@ /* properties */ gchar *media_device; gchar *video_device; + guint render_delay; }; G_DEFINE_TYPE_WITH_CODE (GstV4l2Decoder, gst_v4l2_decoder, GST_TYPE_OBJECT, GST_DEBUG_CATEGORY_INIT (v4l2_decoder_debug, "v4l2codecs-decoder", 0, "V4L2 stateless decoder helper")); +static void gst_v4l2_request_free (GstV4l2Request * request); + static guint32 direction_to_buffer_type (GstV4l2Decoder * self, GstPadDirection direction) { @@ -135,6 +152,12 @@ return gst_object_ref_sink (decoder); } +guint +gst_v4l2_decoder_get_version (GstV4l2Decoder * self) +{ + return self->version; +} + gboolean gst_v4l2_decoder_open (GstV4l2Decoder * self) { @@ -163,6 +186,8 @@ return FALSE; } + self->version = querycap.version; + if (querycap.capabilities & V4L2_CAP_DEVICE_CAPS) capabilities = querycap.device_caps; else @@ -192,6 +217,9 @@ { GstV4l2Request *request; + while ((request = gst_queue_array_pop_head (self->pending_requests))) + gst_v4l2_request_unref (request); + while ((request = gst_queue_array_pop_head (self->request_pool))) gst_v4l2_request_free (request); @@ -225,16 +253,18 @@ gboolean gst_v4l2_decoder_streamoff (GstV4l2Decoder * self, 
GstPadDirection direction) { - GstV4l2Request *pending_req; guint32 type = direction_to_buffer_type (self, direction); gint ret; if (direction == GST_PAD_SRC) { + GstV4l2Request *pending_req; + /* STREAMOFF have the effect of cancelling all requests and unqueuing all * buffers, so clear the pending request list */ while ((pending_req = gst_queue_array_pop_head (self->pending_requests))) { g_clear_pointer (&pending_req->bitstream, gst_memory_unref); pending_req->pending = FALSE; + gst_v4l2_request_unref (pending_req); } } @@ -296,8 +326,9 @@ }, }; gint ret; + /* Using raw image size for now, it is guarantied to be large enough */ - gsize sizeimage = (width * height * pixel_bitdepth) / 8; + gsize sizeimage = MAX (IMAGE_MINSZ, (width * height * pixel_bitdepth) / 8); if (self->mplane) format.fmt.pix_mp.plane_fmt[0].sizeimage = sizeimage; @@ -530,12 +561,12 @@ return TRUE; } -gboolean +static gboolean gst_v4l2_decoder_queue_sink_mem (GstV4l2Decoder * self, - GstV4l2Request * request, GstMemory * mem, guint32 frame_num, - gsize bytesused, guint flags) + GstV4l2Request * request, GstMemory * mem, guint32 frame_num, guint flags) { gint ret; + gsize bytesused = gst_memory_get_sizes (mem, NULL, NULL); struct v4l2_plane plane = { .bytesused = bytesused, }; @@ -563,14 +594,11 @@ return FALSE; } - request->bitstream = gst_memory_ref (mem); - return TRUE; } -gboolean -gst_v4l2_decoder_queue_src_buffer (GstV4l2Decoder * self, GstBuffer * buffer, - guint32 frame_num) +static gboolean +gst_v4l2_decoder_queue_src_buffer (GstV4l2Decoder * self, GstBuffer * buffer) { gint i, ret; struct v4l2_plane planes[GST_VIDEO_MAX_PLANES]; @@ -606,7 +634,7 @@ return TRUE; } -gboolean +static gboolean gst_v4l2_decoder_dequeue_sink (GstV4l2Decoder * self) { gint ret; @@ -632,7 +660,7 @@ return TRUE; } -gboolean +static gboolean gst_v4l2_decoder_dequeue_src (GstV4l2Decoder * self, guint32 * out_frame_num) { gint ret; @@ -702,6 +730,29 @@ return TRUE; } +gboolean +gst_v4l2_decoder_query_control_size 
(GstV4l2Decoder * self, + unsigned int control_id, unsigned int *control_size) +{ + gint ret; + struct v4l2_query_ext_ctrl control = { + .id = control_id, + }; + + *control_size = 0; + + ret = ioctl (self->video_fd, VIDIOC_QUERY_EXT_CTRL, &control); + if (ret < 0) + /* + * It's not an error if a control is not supported by this driver. + * Return false but don't print any error. + */ + return FALSE; + + *control_size = control.elem_size; + return TRUE; +} + void gst_v4l2_decoder_install_properties (GObjectClass * gobject_class, gint prop_offset, GstV4l2CodecDevice * device) @@ -765,10 +816,27 @@ } } +/** + * gst_v4l2_decoder_register: + * @plugin: a #GstPlugin + * @dec_type: A #GType for the codec + * @class_init: The #GClassInitFunc for #dec_type + * @instance_init: The #GInstanceInitFunc for #dec_type + * @element_name_tmpl: A string to use for the first codec found and as a template for the next ones. + * @device: (transfer full) A #GstV4l2CodecDevice + * @rank: The rank to use for the element + * @class_data: (nullable) (transfer full) A #gpointer to pass as class_data, set to @device if null + * @element_name (nullable) (out) Sets the pointer to the new element name + * + * Registers a decoder element as a subtype of @dec_type for @plugin. + * Will create a different sub_types for each subsequent @decoder of the + * same type. 
+ */ void gst_v4l2_decoder_register (GstPlugin * plugin, - GType dec_type, GClassInitFunc class_init, GInstanceInitFunc instance_init, - const gchar * element_name_tmpl, GstV4l2CodecDevice * device, guint rank) + GType dec_type, GClassInitFunc class_init, gconstpointer class_data, + GInstanceInitFunc instance_init, const gchar * element_name_tmpl, + GstV4l2CodecDevice * device, guint rank, gchar ** element_name) { GTypeQuery type_query; GTypeInfo type_info = { 0, }; @@ -780,9 +848,11 @@ type_info.class_size = type_query.class_size; type_info.instance_size = type_query.instance_size; type_info.class_init = class_init; - type_info.class_data = gst_mini_object_ref (GST_MINI_OBJECT (device)); + type_info.class_data = class_data; type_info.instance_init = instance_init; - GST_MINI_OBJECT_FLAG_SET (device, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + if (class_data == device) + GST_MINI_OBJECT_FLAG_SET (device, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); /* The first decoder to be registered should use a constant name, like * v4l2slvp8dec, for any additional decoders, we create unique names. Decoder @@ -800,14 +870,81 @@ subtype = g_type_register_static (dec_type, type_name, &type_info, 0); - if (!gst_element_register (plugin, type_name, rank, subtype)) + if (!gst_element_register (plugin, type_name, rank, subtype)) { GST_WARNING ("Failed to register plugin '%s'", type_name); + g_free (type_name); + type_name = NULL; + } + + if (element_name) + *element_name = type_name; + else + g_free (type_name); +} + +/* + * gst_v4l2_decoder_alloc_request: + * @self a #GstV4l2Decoder pointer + * @frame_num: Used as a timestamp to identify references + * @bitstream the #GstMemory that holds the bitstream data + * @pic_buf the #GstBuffer holding the decoded picture + * + * Allocate a Linux media request file descriptor. This request wrapper will + * hold a reference to the requested bitstream memory to decoded and the + * picture buffer this request will decode to. 
This will be used for + * transparent management of the V4L2 queues. + * + * Returns: a new #GstV4l2Request + */ +GstV4l2Request * +gst_v4l2_decoder_alloc_request (GstV4l2Decoder * self, guint32 frame_num, + GstMemory * bitstream, GstBuffer * pic_buf) +{ + GstV4l2Request *request = gst_queue_array_pop_head (self->request_pool); + gint ret; - g_free (type_name); + if (!request) { + request = g_new0 (GstV4l2Request, 1); + + ret = ioctl (self->media_fd, MEDIA_IOC_REQUEST_ALLOC, &request->fd); + if (ret < 0) { + GST_ERROR_OBJECT (self, "MEDIA_IOC_REQUEST_ALLOC failed: %s", + g_strerror (errno)); + return NULL; + } + + request->poll = gst_poll_new (FALSE); + gst_poll_fd_init (&request->pollfd); + request->pollfd.fd = request->fd; + gst_poll_add_fd (request->poll, &request->pollfd); + gst_poll_fd_ctl_pri (request->poll, &request->pollfd, TRUE); + } + + request->decoder = g_object_ref (self); + request->bitstream = gst_memory_ref (bitstream); + request->pic_buf = gst_buffer_ref (pic_buf); + request->frame_num = frame_num; + request->ref_count = 1; + + return request; } +/* + * gst_v4l2_decoder_alloc_sub_request: + * @self: a #GstV4l2Decoder pointer + * @prev_request: the #GstV4l2Request this request continues + * @bitstream: the #GstMemory that holds the bitstream data + * + * Allocate a Linux media request file descriptor. Similar to + * gst_v4l2_decoder_alloc_request(), but used when a request is the + * continuation of the decoding of the same picture. This is notably the case + * for subsequent slices or for the second field of a frame.
+ * + * Returns: a new #GstV4l2Request + */ GstV4l2Request * -gst_v4l2_decoder_alloc_request (GstV4l2Decoder * self) +gst_v4l2_decoder_alloc_sub_request (GstV4l2Decoder * self, + GstV4l2Request * prev_request, GstMemory * bitstream) { GstV4l2Request *request = gst_queue_array_pop_head (self->request_pool); gint ret; @@ -830,24 +967,85 @@ } request->decoder = g_object_ref (self); + request->bitstream = gst_memory_ref (bitstream); + request->pic_buf = gst_buffer_ref (prev_request->pic_buf); + request->frame_num = prev_request->frame_num; + request->sub_request = TRUE; + request->ref_count = 1; + return request; } +/** + * gst_v4l2_decoder_set_render_delay: + * @self: a #GstV4l2Decoder pointer + * @delay: The expected render delay + * + * The decoder will adjust the number of allowed concurrent requests in order + * to allow this delay. The same number of concurrent bitstream buffers will be + * used, so make sure to adjust the number of bitstream buffers. + * + * For per-slice decoders, this is the maximum number of pending slices, so the + * render backlog in frames may be less than the render delay. + */ void +gst_v4l2_decoder_set_render_delay (GstV4l2Decoder * self, guint delay) +{ + self->render_delay = delay; +} + +/** + * gst_v4l2_decoder_get_render_delay: + * @self: a #GstV4l2Decoder pointer + * + * This function is used to avoid storing the render delay in multiple places. + * + * Returns: The currently configured render delay.
+ */ +guint +gst_v4l2_decoder_get_render_delay (GstV4l2Decoder * self) +{ + return self->render_delay; +} + +GstV4l2Request * +gst_v4l2_request_ref (GstV4l2Request * request) +{ + request->ref_count++; + return request; +} + +static void gst_v4l2_request_free (GstV4l2Request * request) { GstV4l2Decoder *decoder = request->decoder; + + request->decoder = NULL; + close (request->fd); + gst_poll_free (request->poll); + g_free (request); + + if (decoder) + g_object_unref (decoder); +} + +void +gst_v4l2_request_unref (GstV4l2Request * request) +{ + GstV4l2Decoder *decoder = request->decoder; gint ret; - if (!decoder) { - close (request->fd); - gst_poll_free (request->poll); - g_free (request); + g_return_if_fail (request->ref_count > 0); + + if (--request->ref_count > 0) return; - } g_clear_pointer (&request->bitstream, gst_memory_unref); - request->decoder = NULL; + g_clear_pointer (&request->pic_buf, gst_buffer_unref); + request->frame_num = G_MAXUINT32; + request->failed = FALSE; + request->hold_pic_buf = FALSE; + request->sub_request = FALSE; if (request->pending) { gint idx; @@ -859,7 +1057,6 @@ gst_queue_array_drop_element (decoder->pending_requests, idx); gst_v4l2_request_free (request); - g_object_unref (decoder); return; } @@ -870,68 +1067,120 @@ GST_ERROR_OBJECT (request->decoder, "MEDIA_REQUEST_IOC_REINIT failed: %s", g_strerror (errno)); gst_v4l2_request_free (request); - g_object_unref (decoder); return; } gst_queue_array_push_tail (decoder->request_pool, request); - g_object_unref (decoder); + g_clear_object (&request->decoder); } gboolean -gst_v4l2_request_queue (GstV4l2Request * request) +gst_v4l2_request_queue (GstV4l2Request * request, guint flags) { + GstV4l2Decoder *decoder = request->decoder; gint ret; + guint max_pending; + + GST_TRACE_OBJECT (decoder, "Queuing request %p.", request); - GST_TRACE_OBJECT (request->decoder, "Queuing request %p.", request); + if (!gst_v4l2_decoder_queue_sink_mem (decoder, request, + request->bitstream, 
request->frame_num, flags)) { + GST_ERROR_OBJECT (decoder, "Driver did not accept the bitstream data."); + return FALSE; + } + + if (!request->sub_request && + !gst_v4l2_decoder_queue_src_buffer (decoder, request->pic_buf)) { + GST_ERROR_OBJECT (decoder, "Driver did not accept the picture buffer."); + return FALSE; + } ret = ioctl (request->fd, MEDIA_REQUEST_IOC_QUEUE, NULL); if (ret < 0) { - GST_ERROR_OBJECT (request->decoder, "MEDIA_REQUEST_IOC_QUEUE, failed: %s", + GST_ERROR_OBJECT (decoder, "MEDIA_REQUEST_IOC_QUEUE, failed: %s", g_strerror (errno)); return FALSE; } + if (flags & V4L2_BUF_FLAG_M2M_HOLD_CAPTURE_BUF) + request->hold_pic_buf = TRUE; + request->pending = TRUE; - gst_queue_array_push_tail (request->decoder->pending_requests, request); + gst_queue_array_push_tail (decoder->pending_requests, + gst_v4l2_request_ref (request)); + + max_pending = MAX (1, decoder->render_delay); + + if (gst_queue_array_get_length (decoder->pending_requests) > max_pending) { + GstV4l2Request *pending_req; + + pending_req = gst_queue_array_peek_head (decoder->pending_requests); + gst_v4l2_request_set_done (pending_req); + } return TRUE; } gint -gst_v4l2_request_poll (GstV4l2Request * request, GstClockTime timeout) -{ - return gst_poll_wait (request->poll, timeout); -} - -void gst_v4l2_request_set_done (GstV4l2Request * request) { - if (request->bitstream) { - GstV4l2Decoder *dec = request->decoder; - GstV4l2Request *pending_req; + GstV4l2Decoder *decoder = request->decoder; + GstV4l2Request *pending_req = NULL; + gint ret; - while ((pending_req = gst_queue_array_pop_head (dec->pending_requests))) { - gst_v4l2_decoder_dequeue_sink (request->decoder); - g_clear_pointer (&pending_req->bitstream, gst_memory_unref); + if (!request->pending) + return 1; - if (pending_req == request) - break; - } + ret = gst_poll_wait (request->poll, GST_SECOND); + if (ret == 0) { + GST_WARNING_OBJECT (decoder, "Request %p took too long.", request); + return 0; + } - /* Pending request should 
always be found in the fifo */ - if (pending_req != request) { - g_warning ("Pending request not found in the pending list."); - gst_v4l2_decoder_dequeue_sink (request->decoder); - g_clear_pointer (&pending_req->bitstream, gst_memory_unref); + if (ret < 0) { + GST_WARNING_OBJECT (decoder, "Request %p error: %s (%i)", + request, g_strerror (errno), errno); + return ret; + } + + while ((pending_req = gst_queue_array_pop_head (decoder->pending_requests))) { + gst_v4l2_decoder_dequeue_sink (decoder); + g_clear_pointer (&pending_req->bitstream, gst_memory_unref); + + if (!pending_req->hold_pic_buf) { + guint32 frame_num = G_MAXUINT32; + + if (!gst_v4l2_decoder_dequeue_src (decoder, &frame_num)) { + pending_req->failed = TRUE; + } else if (frame_num != pending_req->frame_num) { + GST_WARNING_OBJECT (decoder, + "Requested frame %u, but driver returned frame %u.", + pending_req->frame_num, frame_num); + pending_req->failed = TRUE; + } } + + pending_req->pending = FALSE; + gst_v4l2_request_unref (pending_req); + + if (pending_req == request) + break; } - request->pending = FALSE; + /* Pending request must be in the pending request list */ + g_assert (pending_req == request); + + return ret; } gboolean -gst_v4l2_request_is_done (GstV4l2Request * request) +gst_v4l2_request_failed (GstV4l2Request * request) +{ + return request->failed; +} + +GstBuffer * +gst_v4l2_request_dup_pic_buf (GstV4l2Request * request) { - return !request->pending; + return gst_buffer_ref (request->pic_buf); }
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2decoder.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2decoder.h
Changed
@@ -35,6 +35,8 @@ GstV4l2Decoder * gst_v4l2_decoder_new (GstV4l2CodecDevice * device); +guint gst_v4l2_decoder_get_version (GstV4l2Decoder * self); + gboolean gst_v4l2_decoder_open (GstV4l2Decoder * decoder); gboolean gst_v4l2_decoder_close (GstV4l2Decoder * decoder); @@ -72,22 +74,6 @@ gsize * offsets, guint *num_fds); -gboolean gst_v4l2_decoder_queue_sink_mem (GstV4l2Decoder * self, - GstV4l2Request * request, - GstMemory * mem, - guint32 frame_num, - gsize bytesused, - guint flags); - -gboolean gst_v4l2_decoder_dequeue_sink (GstV4l2Decoder * self); - -gboolean gst_v4l2_decoder_queue_src_buffer (GstV4l2Decoder * self, - GstBuffer * buffer, - guint32 frame_num); - -gboolean gst_v4l2_decoder_dequeue_src (GstV4l2Decoder * self, - guint32 *out_frame_num); - gboolean gst_v4l2_decoder_set_controls (GstV4l2Decoder * self, GstV4l2Request * request, struct v4l2_ext_control *control, @@ -97,6 +83,10 @@ struct v4l2_ext_control * control, guint count); +gboolean gst_v4l2_decoder_query_control_size (GstV4l2Decoder * self, + unsigned int control_id, + unsigned int *control_size); + void gst_v4l2_decoder_install_properties (GObjectClass * gobject_class, gint prop_offset, GstV4l2CodecDevice * device); @@ -110,22 +100,43 @@ void gst_v4l2_decoder_register (GstPlugin * plugin, GType dec_type, GClassInitFunc class_init, + gconstpointer class_data, GInstanceInitFunc instance_init, const gchar *element_name_tmpl, GstV4l2CodecDevice * device, - guint rank); + guint rank, + gchar ** element_name); + +GstV4l2Request *gst_v4l2_decoder_alloc_request (GstV4l2Decoder * self, + guint32 frame_num, + GstMemory *bitstream, + GstBuffer * pic_buf); + +GstV4l2Request *gst_v4l2_decoder_alloc_sub_request (GstV4l2Decoder * self, + GstV4l2Request * prev_request, + GstMemory *bitstream); + +void gst_v4l2_decoder_set_render_delay (GstV4l2Decoder * self, + guint delay); + +guint gst_v4l2_decoder_get_render_delay (GstV4l2Decoder * self); + + +GstV4l2Request * gst_v4l2_request_ref (GstV4l2Request * 
request); -GstV4l2Request *gst_v4l2_decoder_alloc_request (GstV4l2Decoder * self); +void gst_v4l2_request_unref (GstV4l2Request * request); -void gst_v4l2_request_free (GstV4l2Request * request); +gboolean gst_v4l2_request_queue (GstV4l2Request * request, + guint flags); -gboolean gst_v4l2_request_queue (GstV4l2Request * request); +gint gst_v4l2_request_poll (GstV4l2Request * request, + GstClockTime timeout); -gint gst_v4l2_request_poll (GstV4l2Request * request, GstClockTime timeout); +gint gst_v4l2_request_set_done (GstV4l2Request * request); -void gst_v4l2_request_set_done (GstV4l2Request * request); +gboolean gst_v4l2_request_failed (GstV4l2Request * request); -gboolean gst_v4l2_request_is_done (GstV4l2Request * request); +GstBuffer * gst_v4l2_request_dup_pic_buf (GstV4l2Request * request); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2format.c -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2format.c
Changed
@@ -35,6 +35,7 @@ {V4L2_PIX_FMT_NV12, 1, GST_VIDEO_FORMAT_NV12, 8, 420}, {V4L2_PIX_FMT_YUYV, 1, GST_VIDEO_FORMAT_YUY2, 8, 422}, {V4L2_PIX_FMT_SUNXI_TILED_NV12, 1, GST_VIDEO_FORMAT_NV12_32L32, 8, 422}, + {V4L2_PIX_FMT_NV12_4L4, 1, GST_VIDEO_FORMAT_NV12_4L4, 8, 420}, {0,} }; @@ -77,6 +78,7 @@ switch (finfo->format) { case GST_VIDEO_FORMAT_NV12: + case GST_VIDEO_FORMAT_NV12_4L4: case GST_VIDEO_FORMAT_NV12_32L32: case GST_VIDEO_FORMAT_NV12_64Z32: case GST_VIDEO_FORMAT_NV16:
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/gstv4l2format.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/gstv4l2format.h
Changed
@@ -24,6 +24,8 @@ #include <gst/video/video.h> #include "linux/videodev2.h" +#define GST_V4L2_DEFAULT_VIDEO_FORMATS "{ NV12, YUY2, NV12_4L4, NV12_32L32 }" + gboolean gst_v4l2_format_to_video_info (struct v4l2_format * fmt, GstVideoInfo * out_info);
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/media.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/linux/media.h
Changed
@@ -20,10 +20,7 @@ #ifndef __LINUX_MEDIA_H #define __LINUX_MEDIA_H -#ifndef __KERNEL__ #include <stdint.h> -#define __user -#endif #include "linux/types-compat.h" struct media_device_info { @@ -127,6 +124,7 @@ #define MEDIA_ENT_F_PROC_VIDEO_STATISTICS (MEDIA_ENT_F_BASE + 0x4006) #define MEDIA_ENT_F_PROC_VIDEO_ENCODER (MEDIA_ENT_F_BASE + 0x4007) #define MEDIA_ENT_F_PROC_VIDEO_DECODER (MEDIA_ENT_F_BASE + 0x4008) +#define MEDIA_ENT_F_PROC_VIDEO_ISP (MEDIA_ENT_F_BASE + 0x4009) /* * Switch and bridge entity functions @@ -167,7 +165,6 @@ __u32 minor; } dev; -#if !defined(__KERNEL__) /* * TODO: this shouldn't have been added without * actual drivers that use this. When the first real driver @@ -199,7 +196,6 @@ __u32 minor; } fb; int dvb; -#endif /* Sub-device specifications */ /* Nothing needed yet */ @@ -236,9 +232,9 @@ struct media_links_enum { __u32 entity; /* Should have enough room for pads elements */ - struct media_pad_desc __user *pads; + struct media_pad_desc *pads; /* Should have enough room for links elements */ - struct media_link_desc __user *links; + struct media_link_desc *links; __u32 reserved[4]; }; @@ -267,21 +263,6 @@ #define MEDIA_INTF_T_ALSA_PCM_PLAYBACK (MEDIA_INTF_T_ALSA_BASE + 1) #define MEDIA_INTF_T_ALSA_CONTROL (MEDIA_INTF_T_ALSA_BASE + 2) -#if defined(__KERNEL__) - -/* - * Connector functions - * - * For now these should not be used in userspace, as some definitions may - * change. - * - * It is the responsibility of the entity drivers to add connectors and links. - */ -#define MEDIA_ENT_F_CONN_RF (MEDIA_ENT_F_BASE + 0x30001) -#define MEDIA_ENT_F_CONN_SVIDEO (MEDIA_ENT_F_BASE + 0x30002) -#define MEDIA_ENT_F_CONN_COMPOSITE (MEDIA_ENT_F_BASE + 0x30003) - -#endif /* * MC next gen API definitions @@ -383,7 +364,6 @@ #define MEDIA_REQUEST_IOC_QUEUE _IO('|', 0x80) #define MEDIA_REQUEST_IOC_REINIT _IO('|', 0x81) -#ifndef __KERNEL__ /* * Legacy symbols used to avoid userspace compilation breakages. 
@@ -435,6 +415,5 @@ /* Obsolete symbol for media_version, no longer used in the kernel */ #define MEDIA_API_VERSION ((0 << 16) | (1 << 8) | 0) -#endif #endif /* __LINUX_MEDIA_H */
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/types-compat.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/linux/types-compat.h
Changed
@@ -24,7 +24,7 @@ #ifndef __TYPES_COMPAT_H__ #define __TYPES_COMPAT_H__ -#define __user +#define __inline__ inline #ifdef __linux__ #include <linux/types.h> @@ -40,6 +40,11 @@ # endif #endif +#if defined(__sun) +/* for _IOR/_IORW on Illumos distros */ +#include <sys/ioccom.h> +#endif + #ifndef __bitwise # ifdef __CHECK_ENDIAN__ # define __bitwise __bitwise__
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/v4l2-common.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/linux/v4l2-common.h
Changed
@@ -92,7 +92,6 @@ __u8 *edid; }; -#ifndef __KERNEL__ /* Backward compatibility target definitions --- to be removed. */ #define V4L2_SEL_TGT_CROP_ACTIVE V4L2_SEL_TGT_CROP #define V4L2_SEL_TGT_COMPOSE_ACTIVE V4L2_SEL_TGT_COMPOSE @@ -105,6 +104,5 @@ #define V4L2_SUBDEV_SEL_FLAG_SIZE_GE V4L2_SEL_FLAG_GE #define V4L2_SUBDEV_SEL_FLAG_SIZE_LE V4L2_SEL_FLAG_LE #define V4L2_SUBDEV_SEL_FLAG_KEEP_CONFIG V4L2_SEL_FLAG_KEEP_CONFIG -#endif #endif /* __V4L2_COMMON__ */
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/v4l2-controls.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/linux/v4l2-controls.h
Changed
@@ -54,7 +54,7 @@ /* Control classes */ #define V4L2_CTRL_CLASS_USER 0x00980000 /* Old-style 'user' controls */ -#define V4L2_CTRL_CLASS_MPEG 0x00990000 /* MPEG-compression controls */ +#define V4L2_CTRL_CLASS_CODEC 0x00990000 /* Stateful codec controls */ #define V4L2_CTRL_CLASS_CAMERA 0x009a0000 /* Camera class controls */ #define V4L2_CTRL_CLASS_FM_TX 0x009b0000 /* FM Modulator controls */ #define V4L2_CTRL_CLASS_FLASH 0x009c0000 /* Camera flash controls */ @@ -65,6 +65,8 @@ #define V4L2_CTRL_CLASS_FM_RX 0x00a10000 /* FM Receiver controls */ #define V4L2_CTRL_CLASS_RF_TUNER 0x00a20000 /* RF tuner controls */ #define V4L2_CTRL_CLASS_DETECT 0x00a30000 /* Detection controls */ +#define V4L2_CTRL_CLASS_CODEC_STATELESS 0x00a40000 /* Stateless codecs controls */ +#define V4L2_CTRL_CLASS_COLORIMETRY 0x00a50000 /* Colorimetry controls */ /* User-class control IDs */ @@ -125,6 +127,7 @@ V4L2_COLORFX_SOLARIZATION = 13, V4L2_COLORFX_ANTIQUE = 14, V4L2_COLORFX_SET_CBCR = 15, + V4L2_COLORFX_SET_RGB = 16, }; #define V4L2_CID_AUTOBRIGHTNESS (V4L2_CID_BASE+32) #define V4L2_CID_BAND_STOP_FILTER (V4L2_CID_BASE+33) @@ -142,9 +145,10 @@ #define V4L2_CID_ALPHA_COMPONENT (V4L2_CID_BASE+41) #define V4L2_CID_COLORFX_CBCR (V4L2_CID_BASE+42) +#define V4L2_CID_COLORFX_RGB (V4L2_CID_BASE+43) /* last CID + 1 */ -#define V4L2_CID_LASTP1 (V4L2_CID_BASE+43) +#define V4L2_CID_LASTP1 (V4L2_CID_BASE+44) /* USER-class private control IDs */ @@ -198,15 +202,31 @@ */ #define V4L2_CID_USER_ATMEL_ISC_BASE (V4L2_CID_USER_BASE + 0x10c0) +/* + * The base for the CODA driver controls. + * We reserve 16 controls for this driver. + */ +#define V4L2_CID_USER_CODA_BASE (V4L2_CID_USER_BASE + 0x10e0) +/* + * The base for MIPI CCS driver controls. + * We reserve 128 controls for this driver. + */ +#define V4L2_CID_USER_CCS_BASE (V4L2_CID_USER_BASE + 0x10f0) +/* + * The base for Allegro driver controls. + * We reserve 16 controls for this driver. 
+ */ +#define V4L2_CID_USER_ALLEGRO_BASE (V4L2_CID_USER_BASE + 0x1170) + /* MPEG-class control IDs */ /* The MPEG controls are applicable to all codec controls * and the 'MPEG' part of the define is historical */ -#define V4L2_CID_MPEG_BASE (V4L2_CTRL_CLASS_MPEG | 0x900) -#define V4L2_CID_MPEG_CLASS (V4L2_CTRL_CLASS_MPEG | 1) +#define V4L2_CID_CODEC_BASE (V4L2_CTRL_CLASS_CODEC | 0x900) +#define V4L2_CID_CODEC_CLASS (V4L2_CTRL_CLASS_CODEC | 1) /* MPEG streams, specific to multiplexed streams */ -#define V4L2_CID_MPEG_STREAM_TYPE (V4L2_CID_MPEG_BASE+0) +#define V4L2_CID_MPEG_STREAM_TYPE (V4L2_CID_CODEC_BASE+0) enum v4l2_mpeg_stream_type { V4L2_MPEG_STREAM_TYPE_MPEG2_PS = 0, /* MPEG-2 program stream */ V4L2_MPEG_STREAM_TYPE_MPEG2_TS = 1, /* MPEG-2 transport stream */ @@ -215,26 +235,26 @@ V4L2_MPEG_STREAM_TYPE_MPEG1_VCD = 4, /* MPEG-1 VCD-compatible stream */ V4L2_MPEG_STREAM_TYPE_MPEG2_SVCD = 5, /* MPEG-2 SVCD-compatible stream */ }; -#define V4L2_CID_MPEG_STREAM_PID_PMT (V4L2_CID_MPEG_BASE+1) -#define V4L2_CID_MPEG_STREAM_PID_AUDIO (V4L2_CID_MPEG_BASE+2) -#define V4L2_CID_MPEG_STREAM_PID_VIDEO (V4L2_CID_MPEG_BASE+3) -#define V4L2_CID_MPEG_STREAM_PID_PCR (V4L2_CID_MPEG_BASE+4) -#define V4L2_CID_MPEG_STREAM_PES_ID_AUDIO (V4L2_CID_MPEG_BASE+5) -#define V4L2_CID_MPEG_STREAM_PES_ID_VIDEO (V4L2_CID_MPEG_BASE+6) -#define V4L2_CID_MPEG_STREAM_VBI_FMT (V4L2_CID_MPEG_BASE+7) +#define V4L2_CID_MPEG_STREAM_PID_PMT (V4L2_CID_CODEC_BASE+1) +#define V4L2_CID_MPEG_STREAM_PID_AUDIO (V4L2_CID_CODEC_BASE+2) +#define V4L2_CID_MPEG_STREAM_PID_VIDEO (V4L2_CID_CODEC_BASE+3) +#define V4L2_CID_MPEG_STREAM_PID_PCR (V4L2_CID_CODEC_BASE+4) +#define V4L2_CID_MPEG_STREAM_PES_ID_AUDIO (V4L2_CID_CODEC_BASE+5) +#define V4L2_CID_MPEG_STREAM_PES_ID_VIDEO (V4L2_CID_CODEC_BASE+6) +#define V4L2_CID_MPEG_STREAM_VBI_FMT (V4L2_CID_CODEC_BASE+7) enum v4l2_mpeg_stream_vbi_fmt { V4L2_MPEG_STREAM_VBI_FMT_NONE = 0, /* No VBI in the MPEG stream */ V4L2_MPEG_STREAM_VBI_FMT_IVTV = 1, /* VBI in private packets, 
IVTV format */ }; /* MPEG audio controls specific to multiplexed streams */ -#define V4L2_CID_MPEG_AUDIO_SAMPLING_FREQ (V4L2_CID_MPEG_BASE+100) +#define V4L2_CID_MPEG_AUDIO_SAMPLING_FREQ (V4L2_CID_CODEC_BASE+100) enum v4l2_mpeg_audio_sampling_freq { V4L2_MPEG_AUDIO_SAMPLING_FREQ_44100 = 0, V4L2_MPEG_AUDIO_SAMPLING_FREQ_48000 = 1, V4L2_MPEG_AUDIO_SAMPLING_FREQ_32000 = 2, }; -#define V4L2_CID_MPEG_AUDIO_ENCODING (V4L2_CID_MPEG_BASE+101) +#define V4L2_CID_MPEG_AUDIO_ENCODING (V4L2_CID_CODEC_BASE+101) enum v4l2_mpeg_audio_encoding { V4L2_MPEG_AUDIO_ENCODING_LAYER_1 = 0, V4L2_MPEG_AUDIO_ENCODING_LAYER_2 = 1, @@ -242,7 +262,7 @@ V4L2_MPEG_AUDIO_ENCODING_AAC = 3, V4L2_MPEG_AUDIO_ENCODING_AC3 = 4, }; -#define V4L2_CID_MPEG_AUDIO_L1_BITRATE (V4L2_CID_MPEG_BASE+102) +#define V4L2_CID_MPEG_AUDIO_L1_BITRATE (V4L2_CID_CODEC_BASE+102) enum v4l2_mpeg_audio_l1_bitrate { V4L2_MPEG_AUDIO_L1_BITRATE_32K = 0, V4L2_MPEG_AUDIO_L1_BITRATE_64K = 1, @@ -259,7 +279,7 @@ V4L2_MPEG_AUDIO_L1_BITRATE_416K = 12, V4L2_MPEG_AUDIO_L1_BITRATE_448K = 13, }; -#define V4L2_CID_MPEG_AUDIO_L2_BITRATE (V4L2_CID_MPEG_BASE+103) +#define V4L2_CID_MPEG_AUDIO_L2_BITRATE (V4L2_CID_CODEC_BASE+103) enum v4l2_mpeg_audio_l2_bitrate { V4L2_MPEG_AUDIO_L2_BITRATE_32K = 0, V4L2_MPEG_AUDIO_L2_BITRATE_48K = 1, @@ -276,7 +296,7 @@ V4L2_MPEG_AUDIO_L2_BITRATE_320K = 12, V4L2_MPEG_AUDIO_L2_BITRATE_384K = 13, }; -#define V4L2_CID_MPEG_AUDIO_L3_BITRATE (V4L2_CID_MPEG_BASE+104) +#define V4L2_CID_MPEG_AUDIO_L3_BITRATE (V4L2_CID_CODEC_BASE+104) enum v4l2_mpeg_audio_l3_bitrate { V4L2_MPEG_AUDIO_L3_BITRATE_32K = 0, V4L2_MPEG_AUDIO_L3_BITRATE_40K = 1, @@ -293,34 +313,34 @@ V4L2_MPEG_AUDIO_L3_BITRATE_256K = 12, V4L2_MPEG_AUDIO_L3_BITRATE_320K = 13, }; -#define V4L2_CID_MPEG_AUDIO_MODE (V4L2_CID_MPEG_BASE+105) +#define V4L2_CID_MPEG_AUDIO_MODE (V4L2_CID_CODEC_BASE+105) enum v4l2_mpeg_audio_mode { V4L2_MPEG_AUDIO_MODE_STEREO = 0, V4L2_MPEG_AUDIO_MODE_JOINT_STEREO = 1, V4L2_MPEG_AUDIO_MODE_DUAL = 2, V4L2_MPEG_AUDIO_MODE_MONO = 3, }; 
-#define V4L2_CID_MPEG_AUDIO_MODE_EXTENSION (V4L2_CID_MPEG_BASE+106) +#define V4L2_CID_MPEG_AUDIO_MODE_EXTENSION (V4L2_CID_CODEC_BASE+106) enum v4l2_mpeg_audio_mode_extension { V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_4 = 0, V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_8 = 1, V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_12 = 2, V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_16 = 3, }; -#define V4L2_CID_MPEG_AUDIO_EMPHASIS (V4L2_CID_MPEG_BASE+107) +#define V4L2_CID_MPEG_AUDIO_EMPHASIS (V4L2_CID_CODEC_BASE+107) enum v4l2_mpeg_audio_emphasis { V4L2_MPEG_AUDIO_EMPHASIS_NONE = 0, V4L2_MPEG_AUDIO_EMPHASIS_50_DIV_15_uS = 1, V4L2_MPEG_AUDIO_EMPHASIS_CCITT_J17 = 2, }; -#define V4L2_CID_MPEG_AUDIO_CRC (V4L2_CID_MPEG_BASE+108) +#define V4L2_CID_MPEG_AUDIO_CRC (V4L2_CID_CODEC_BASE+108) enum v4l2_mpeg_audio_crc { V4L2_MPEG_AUDIO_CRC_NONE = 0, V4L2_MPEG_AUDIO_CRC_CRC16 = 1, }; -#define V4L2_CID_MPEG_AUDIO_MUTE (V4L2_CID_MPEG_BASE+109) -#define V4L2_CID_MPEG_AUDIO_AAC_BITRATE (V4L2_CID_MPEG_BASE+110) -#define V4L2_CID_MPEG_AUDIO_AC3_BITRATE (V4L2_CID_MPEG_BASE+111) +#define V4L2_CID_MPEG_AUDIO_MUTE (V4L2_CID_CODEC_BASE+109) +#define V4L2_CID_MPEG_AUDIO_AAC_BITRATE (V4L2_CID_CODEC_BASE+110) +#define V4L2_CID_MPEG_AUDIO_AC3_BITRATE (V4L2_CID_CODEC_BASE+111) enum v4l2_mpeg_audio_ac3_bitrate { V4L2_MPEG_AUDIO_AC3_BITRATE_32K = 0, V4L2_MPEG_AUDIO_AC3_BITRATE_40K = 1, @@ -342,7 +362,7 @@ V4L2_MPEG_AUDIO_AC3_BITRATE_576K = 17, V4L2_MPEG_AUDIO_AC3_BITRATE_640K = 18, }; -#define V4L2_CID_MPEG_AUDIO_DEC_PLAYBACK (V4L2_CID_MPEG_BASE+112) +#define V4L2_CID_MPEG_AUDIO_DEC_PLAYBACK (V4L2_CID_CODEC_BASE+112) enum v4l2_mpeg_audio_dec_playback { V4L2_MPEG_AUDIO_DEC_PLAYBACK_AUTO = 0, V4L2_MPEG_AUDIO_DEC_PLAYBACK_STEREO = 1, @@ -351,79 +371,85 @@ V4L2_MPEG_AUDIO_DEC_PLAYBACK_MONO = 4, V4L2_MPEG_AUDIO_DEC_PLAYBACK_SWAPPED_STEREO = 5, }; -#define V4L2_CID_MPEG_AUDIO_DEC_MULTILINGUAL_PLAYBACK (V4L2_CID_MPEG_BASE+113) +#define V4L2_CID_MPEG_AUDIO_DEC_MULTILINGUAL_PLAYBACK (V4L2_CID_CODEC_BASE+113) /* MPEG video controls specific 
to multiplexed streams */ -#define V4L2_CID_MPEG_VIDEO_ENCODING (V4L2_CID_MPEG_BASE+200) +#define V4L2_CID_MPEG_VIDEO_ENCODING (V4L2_CID_CODEC_BASE+200) enum v4l2_mpeg_video_encoding { V4L2_MPEG_VIDEO_ENCODING_MPEG_1 = 0, V4L2_MPEG_VIDEO_ENCODING_MPEG_2 = 1, V4L2_MPEG_VIDEO_ENCODING_MPEG_4_AVC = 2, }; -#define V4L2_CID_MPEG_VIDEO_ASPECT (V4L2_CID_MPEG_BASE+201) +#define V4L2_CID_MPEG_VIDEO_ASPECT (V4L2_CID_CODEC_BASE+201) enum v4l2_mpeg_video_aspect { V4L2_MPEG_VIDEO_ASPECT_1x1 = 0, V4L2_MPEG_VIDEO_ASPECT_4x3 = 1, V4L2_MPEG_VIDEO_ASPECT_16x9 = 2, V4L2_MPEG_VIDEO_ASPECT_221x100 = 3, }; -#define V4L2_CID_MPEG_VIDEO_B_FRAMES (V4L2_CID_MPEG_BASE+202) -#define V4L2_CID_MPEG_VIDEO_GOP_SIZE (V4L2_CID_MPEG_BASE+203) -#define V4L2_CID_MPEG_VIDEO_GOP_CLOSURE (V4L2_CID_MPEG_BASE+204) -#define V4L2_CID_MPEG_VIDEO_PULLDOWN (V4L2_CID_MPEG_BASE+205) -#define V4L2_CID_MPEG_VIDEO_BITRATE_MODE (V4L2_CID_MPEG_BASE+206) +#define V4L2_CID_MPEG_VIDEO_B_FRAMES (V4L2_CID_CODEC_BASE+202) +#define V4L2_CID_MPEG_VIDEO_GOP_SIZE (V4L2_CID_CODEC_BASE+203) +#define V4L2_CID_MPEG_VIDEO_GOP_CLOSURE (V4L2_CID_CODEC_BASE+204) +#define V4L2_CID_MPEG_VIDEO_PULLDOWN (V4L2_CID_CODEC_BASE+205) +#define V4L2_CID_MPEG_VIDEO_BITRATE_MODE (V4L2_CID_CODEC_BASE+206) enum v4l2_mpeg_video_bitrate_mode { V4L2_MPEG_VIDEO_BITRATE_MODE_VBR = 0, V4L2_MPEG_VIDEO_BITRATE_MODE_CBR = 1, -}; -#define V4L2_CID_MPEG_VIDEO_BITRATE (V4L2_CID_MPEG_BASE+207) -#define V4L2_CID_MPEG_VIDEO_BITRATE_PEAK (V4L2_CID_MPEG_BASE+208) -#define V4L2_CID_MPEG_VIDEO_TEMPORAL_DECIMATION (V4L2_CID_MPEG_BASE+209) -#define V4L2_CID_MPEG_VIDEO_MUTE (V4L2_CID_MPEG_BASE+210) -#define V4L2_CID_MPEG_VIDEO_MUTE_YUV (V4L2_CID_MPEG_BASE+211) -#define V4L2_CID_MPEG_VIDEO_DECODER_SLICE_INTERFACE (V4L2_CID_MPEG_BASE+212) -#define V4L2_CID_MPEG_VIDEO_DECODER_MPEG4_DEBLOCK_FILTER (V4L2_CID_MPEG_BASE+213) -#define V4L2_CID_MPEG_VIDEO_CYCLIC_INTRA_REFRESH_MB (V4L2_CID_MPEG_BASE+214) -#define V4L2_CID_MPEG_VIDEO_FRAME_RC_ENABLE (V4L2_CID_MPEG_BASE+215) -#define 
V4L2_CID_MPEG_VIDEO_HEADER_MODE (V4L2_CID_MPEG_BASE+216) + V4L2_MPEG_VIDEO_BITRATE_MODE_CQ = 2, +}; +#define V4L2_CID_MPEG_VIDEO_BITRATE (V4L2_CID_CODEC_BASE+207) +#define V4L2_CID_MPEG_VIDEO_BITRATE_PEAK (V4L2_CID_CODEC_BASE+208) +#define V4L2_CID_MPEG_VIDEO_TEMPORAL_DECIMATION (V4L2_CID_CODEC_BASE+209) +#define V4L2_CID_MPEG_VIDEO_MUTE (V4L2_CID_CODEC_BASE+210) +#define V4L2_CID_MPEG_VIDEO_MUTE_YUV (V4L2_CID_CODEC_BASE+211) +#define V4L2_CID_MPEG_VIDEO_DECODER_SLICE_INTERFACE (V4L2_CID_CODEC_BASE+212) +#define V4L2_CID_MPEG_VIDEO_DECODER_MPEG4_DEBLOCK_FILTER (V4L2_CID_CODEC_BASE+213) +#define V4L2_CID_MPEG_VIDEO_CYCLIC_INTRA_REFRESH_MB (V4L2_CID_CODEC_BASE+214) +#define V4L2_CID_MPEG_VIDEO_FRAME_RC_ENABLE (V4L2_CID_CODEC_BASE+215) +#define V4L2_CID_MPEG_VIDEO_HEADER_MODE (V4L2_CID_CODEC_BASE+216) enum v4l2_mpeg_video_header_mode { V4L2_MPEG_VIDEO_HEADER_MODE_SEPARATE = 0, V4L2_MPEG_VIDEO_HEADER_MODE_JOINED_WITH_1ST_FRAME = 1, }; -#define V4L2_CID_MPEG_VIDEO_MAX_REF_PIC (V4L2_CID_MPEG_BASE+217) -#define V4L2_CID_MPEG_VIDEO_MB_RC_ENABLE (V4L2_CID_MPEG_BASE+218) -#define V4L2_CID_MPEG_VIDEO_MULTI_SLICE_MAX_BYTES (V4L2_CID_MPEG_BASE+219) -#define V4L2_CID_MPEG_VIDEO_MULTI_SLICE_MAX_MB (V4L2_CID_MPEG_BASE+220) -#define V4L2_CID_MPEG_VIDEO_MULTI_SLICE_MODE (V4L2_CID_MPEG_BASE+221) +#define V4L2_CID_MPEG_VIDEO_MAX_REF_PIC (V4L2_CID_CODEC_BASE+217) +#define V4L2_CID_MPEG_VIDEO_MB_RC_ENABLE (V4L2_CID_CODEC_BASE+218) +#define V4L2_CID_MPEG_VIDEO_MULTI_SLICE_MAX_BYTES (V4L2_CID_CODEC_BASE+219) +#define V4L2_CID_MPEG_VIDEO_MULTI_SLICE_MAX_MB (V4L2_CID_CODEC_BASE+220) +#define V4L2_CID_MPEG_VIDEO_MULTI_SLICE_MODE (V4L2_CID_CODEC_BASE+221) enum v4l2_mpeg_video_multi_slice_mode { V4L2_MPEG_VIDEO_MULTI_SLICE_MODE_SINGLE = 0, V4L2_MPEG_VIDEO_MULTI_SLICE_MODE_MAX_MB = 1, V4L2_MPEG_VIDEO_MULTI_SLICE_MODE_MAX_BYTES = 2, -#ifndef __KERNEL__ /* Kept for backwards compatibility reasons. Stupid typo... 
*/ V4L2_MPEG_VIDEO_MULTI_SICE_MODE_MAX_MB = 1, V4L2_MPEG_VIDEO_MULTI_SICE_MODE_MAX_BYTES = 2, -#endif }; -#define V4L2_CID_MPEG_VIDEO_VBV_SIZE (V4L2_CID_MPEG_BASE+222) -#define V4L2_CID_MPEG_VIDEO_DEC_PTS (V4L2_CID_MPEG_BASE+223) -#define V4L2_CID_MPEG_VIDEO_DEC_FRAME (V4L2_CID_MPEG_BASE+224) -#define V4L2_CID_MPEG_VIDEO_VBV_DELAY (V4L2_CID_MPEG_BASE+225) -#define V4L2_CID_MPEG_VIDEO_REPEAT_SEQ_HEADER (V4L2_CID_MPEG_BASE+226) -#define V4L2_CID_MPEG_VIDEO_MV_H_SEARCH_RANGE (V4L2_CID_MPEG_BASE+227) -#define V4L2_CID_MPEG_VIDEO_MV_V_SEARCH_RANGE (V4L2_CID_MPEG_BASE+228) -#define V4L2_CID_MPEG_VIDEO_FORCE_KEY_FRAME (V4L2_CID_MPEG_BASE+229) +#define V4L2_CID_MPEG_VIDEO_VBV_SIZE (V4L2_CID_CODEC_BASE+222) +#define V4L2_CID_MPEG_VIDEO_DEC_PTS (V4L2_CID_CODEC_BASE+223) +#define V4L2_CID_MPEG_VIDEO_DEC_FRAME (V4L2_CID_CODEC_BASE+224) +#define V4L2_CID_MPEG_VIDEO_VBV_DELAY (V4L2_CID_CODEC_BASE+225) +#define V4L2_CID_MPEG_VIDEO_REPEAT_SEQ_HEADER (V4L2_CID_CODEC_BASE+226) +#define V4L2_CID_MPEG_VIDEO_MV_H_SEARCH_RANGE (V4L2_CID_CODEC_BASE+227) +#define V4L2_CID_MPEG_VIDEO_MV_V_SEARCH_RANGE (V4L2_CID_CODEC_BASE+228) +#define V4L2_CID_MPEG_VIDEO_FORCE_KEY_FRAME (V4L2_CID_CODEC_BASE+229) +#define V4L2_CID_MPEG_VIDEO_BASELAYER_PRIORITY_ID (V4L2_CID_CODEC_BASE+230) +#define V4L2_CID_MPEG_VIDEO_AU_DELIMITER (V4L2_CID_CODEC_BASE+231) +#define V4L2_CID_MPEG_VIDEO_LTR_COUNT (V4L2_CID_CODEC_BASE+232) +#define V4L2_CID_MPEG_VIDEO_FRAME_LTR_INDEX (V4L2_CID_CODEC_BASE+233) +#define V4L2_CID_MPEG_VIDEO_USE_LTR_FRAMES (V4L2_CID_CODEC_BASE+234) +#define V4L2_CID_MPEG_VIDEO_DEC_CONCEAL_COLOR (V4L2_CID_CODEC_BASE+235) +#define V4L2_CID_MPEG_VIDEO_INTRA_REFRESH_PERIOD (V4L2_CID_CODEC_BASE+236) /* CIDs for the MPEG-2 Part 2 (H.262) codec */ -#define V4L2_CID_MPEG_VIDEO_MPEG2_LEVEL (V4L2_CID_MPEG_BASE+270) +#define V4L2_CID_MPEG_VIDEO_MPEG2_LEVEL (V4L2_CID_CODEC_BASE+270) enum v4l2_mpeg_video_mpeg2_level { V4L2_MPEG_VIDEO_MPEG2_LEVEL_LOW = 0, V4L2_MPEG_VIDEO_MPEG2_LEVEL_MAIN = 1, 
V4L2_MPEG_VIDEO_MPEG2_LEVEL_HIGH_1440 = 2, V4L2_MPEG_VIDEO_MPEG2_LEVEL_HIGH = 3, }; -#define V4L2_CID_MPEG_VIDEO_MPEG2_PROFILE (V4L2_CID_MPEG_BASE+271) +#define V4L2_CID_MPEG_VIDEO_MPEG2_PROFILE (V4L2_CID_CODEC_BASE+271) enum v4l2_mpeg_video_mpeg2_profile { V4L2_MPEG_VIDEO_MPEG2_PROFILE_SIMPLE = 0, V4L2_MPEG_VIDEO_MPEG2_PROFILE_MAIN = 1, @@ -434,28 +460,28 @@ }; /* CIDs for the FWHT codec as used by the vicodec driver. */ -#define V4L2_CID_FWHT_I_FRAME_QP (V4L2_CID_MPEG_BASE + 290) -#define V4L2_CID_FWHT_P_FRAME_QP (V4L2_CID_MPEG_BASE + 291) - -#define V4L2_CID_MPEG_VIDEO_H263_I_FRAME_QP (V4L2_CID_MPEG_BASE+300) -#define V4L2_CID_MPEG_VIDEO_H263_P_FRAME_QP (V4L2_CID_MPEG_BASE+301) -#define V4L2_CID_MPEG_VIDEO_H263_B_FRAME_QP (V4L2_CID_MPEG_BASE+302) -#define V4L2_CID_MPEG_VIDEO_H263_MIN_QP (V4L2_CID_MPEG_BASE+303) -#define V4L2_CID_MPEG_VIDEO_H263_MAX_QP (V4L2_CID_MPEG_BASE+304) -#define V4L2_CID_MPEG_VIDEO_H264_I_FRAME_QP (V4L2_CID_MPEG_BASE+350) -#define V4L2_CID_MPEG_VIDEO_H264_P_FRAME_QP (V4L2_CID_MPEG_BASE+351) -#define V4L2_CID_MPEG_VIDEO_H264_B_FRAME_QP (V4L2_CID_MPEG_BASE+352) -#define V4L2_CID_MPEG_VIDEO_H264_MIN_QP (V4L2_CID_MPEG_BASE+353) -#define V4L2_CID_MPEG_VIDEO_H264_MAX_QP (V4L2_CID_MPEG_BASE+354) -#define V4L2_CID_MPEG_VIDEO_H264_8X8_TRANSFORM (V4L2_CID_MPEG_BASE+355) -#define V4L2_CID_MPEG_VIDEO_H264_CPB_SIZE (V4L2_CID_MPEG_BASE+356) -#define V4L2_CID_MPEG_VIDEO_H264_ENTROPY_MODE (V4L2_CID_MPEG_BASE+357) +#define V4L2_CID_FWHT_I_FRAME_QP (V4L2_CID_CODEC_BASE + 290) +#define V4L2_CID_FWHT_P_FRAME_QP (V4L2_CID_CODEC_BASE + 291) + +#define V4L2_CID_MPEG_VIDEO_H263_I_FRAME_QP (V4L2_CID_CODEC_BASE+300) +#define V4L2_CID_MPEG_VIDEO_H263_P_FRAME_QP (V4L2_CID_CODEC_BASE+301) +#define V4L2_CID_MPEG_VIDEO_H263_B_FRAME_QP (V4L2_CID_CODEC_BASE+302) +#define V4L2_CID_MPEG_VIDEO_H263_MIN_QP (V4L2_CID_CODEC_BASE+303) +#define V4L2_CID_MPEG_VIDEO_H263_MAX_QP (V4L2_CID_CODEC_BASE+304) +#define V4L2_CID_MPEG_VIDEO_H264_I_FRAME_QP (V4L2_CID_CODEC_BASE+350) +#define 
V4L2_CID_MPEG_VIDEO_H264_P_FRAME_QP (V4L2_CID_CODEC_BASE+351) +#define V4L2_CID_MPEG_VIDEO_H264_B_FRAME_QP (V4L2_CID_CODEC_BASE+352) +#define V4L2_CID_MPEG_VIDEO_H264_MIN_QP (V4L2_CID_CODEC_BASE+353) +#define V4L2_CID_MPEG_VIDEO_H264_MAX_QP (V4L2_CID_CODEC_BASE+354) +#define V4L2_CID_MPEG_VIDEO_H264_8X8_TRANSFORM (V4L2_CID_CODEC_BASE+355) +#define V4L2_CID_MPEG_VIDEO_H264_CPB_SIZE (V4L2_CID_CODEC_BASE+356) +#define V4L2_CID_MPEG_VIDEO_H264_ENTROPY_MODE (V4L2_CID_CODEC_BASE+357) enum v4l2_mpeg_video_h264_entropy_mode { V4L2_MPEG_VIDEO_H264_ENTROPY_MODE_CAVLC = 0, V4L2_MPEG_VIDEO_H264_ENTROPY_MODE_CABAC = 1, }; -#define V4L2_CID_MPEG_VIDEO_H264_I_PERIOD (V4L2_CID_MPEG_BASE+358) -#define V4L2_CID_MPEG_VIDEO_H264_LEVEL (V4L2_CID_MPEG_BASE+359) +#define V4L2_CID_MPEG_VIDEO_H264_I_PERIOD (V4L2_CID_CODEC_BASE+358) +#define V4L2_CID_MPEG_VIDEO_H264_LEVEL (V4L2_CID_CODEC_BASE+359) enum v4l2_mpeg_video_h264_level { V4L2_MPEG_VIDEO_H264_LEVEL_1_0 = 0, V4L2_MPEG_VIDEO_H264_LEVEL_1B = 1, @@ -473,16 +499,20 @@ V4L2_MPEG_VIDEO_H264_LEVEL_4_2 = 13, V4L2_MPEG_VIDEO_H264_LEVEL_5_0 = 14, V4L2_MPEG_VIDEO_H264_LEVEL_5_1 = 15, -}; -#define V4L2_CID_MPEG_VIDEO_H264_LOOP_FILTER_ALPHA (V4L2_CID_MPEG_BASE+360) -#define V4L2_CID_MPEG_VIDEO_H264_LOOP_FILTER_BETA (V4L2_CID_MPEG_BASE+361) -#define V4L2_CID_MPEG_VIDEO_H264_LOOP_FILTER_MODE (V4L2_CID_MPEG_BASE+362) + V4L2_MPEG_VIDEO_H264_LEVEL_5_2 = 16, + V4L2_MPEG_VIDEO_H264_LEVEL_6_0 = 17, + V4L2_MPEG_VIDEO_H264_LEVEL_6_1 = 18, + V4L2_MPEG_VIDEO_H264_LEVEL_6_2 = 19, +}; +#define V4L2_CID_MPEG_VIDEO_H264_LOOP_FILTER_ALPHA (V4L2_CID_CODEC_BASE+360) +#define V4L2_CID_MPEG_VIDEO_H264_LOOP_FILTER_BETA (V4L2_CID_CODEC_BASE+361) +#define V4L2_CID_MPEG_VIDEO_H264_LOOP_FILTER_MODE (V4L2_CID_CODEC_BASE+362) enum v4l2_mpeg_video_h264_loop_filter_mode { V4L2_MPEG_VIDEO_H264_LOOP_FILTER_MODE_ENABLED = 0, V4L2_MPEG_VIDEO_H264_LOOP_FILTER_MODE_DISABLED = 1, V4L2_MPEG_VIDEO_H264_LOOP_FILTER_MODE_DISABLED_AT_SLICE_BOUNDARY = 2, }; -#define 
V4L2_CID_MPEG_VIDEO_H264_PROFILE (V4L2_CID_MPEG_BASE+363) +#define V4L2_CID_MPEG_VIDEO_H264_PROFILE (V4L2_CID_CODEC_BASE+363) enum v4l2_mpeg_video_h264_profile { V4L2_MPEG_VIDEO_H264_PROFILE_BASELINE = 0, V4L2_MPEG_VIDEO_H264_PROFILE_CONSTRAINED_BASELINE = 1, @@ -501,11 +531,12 @@ V4L2_MPEG_VIDEO_H264_PROFILE_SCALABLE_HIGH_INTRA = 14, V4L2_MPEG_VIDEO_H264_PROFILE_STEREO_HIGH = 15, V4L2_MPEG_VIDEO_H264_PROFILE_MULTIVIEW_HIGH = 16, + V4L2_MPEG_VIDEO_H264_PROFILE_CONSTRAINED_HIGH = 17, }; -#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT (V4L2_CID_MPEG_BASE+364) -#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH (V4L2_CID_MPEG_BASE+365) -#define V4L2_CID_MPEG_VIDEO_H264_VUI_SAR_ENABLE (V4L2_CID_MPEG_BASE+366) -#define V4L2_CID_MPEG_VIDEO_H264_VUI_SAR_IDC (V4L2_CID_MPEG_BASE+367) +#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT (V4L2_CID_CODEC_BASE+364) +#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH (V4L2_CID_CODEC_BASE+365) +#define V4L2_CID_MPEG_VIDEO_H264_VUI_SAR_ENABLE (V4L2_CID_CODEC_BASE+366) +#define V4L2_CID_MPEG_VIDEO_H264_VUI_SAR_IDC (V4L2_CID_CODEC_BASE+367) enum v4l2_mpeg_video_h264_vui_sar_idc { V4L2_MPEG_VIDEO_H264_VUI_SAR_IDC_UNSPECIFIED = 0, V4L2_MPEG_VIDEO_H264_VUI_SAR_IDC_1x1 = 1, @@ -526,9 +557,9 @@ V4L2_MPEG_VIDEO_H264_VUI_SAR_IDC_2x1 = 16, V4L2_MPEG_VIDEO_H264_VUI_SAR_IDC_EXTENDED = 17, }; -#define V4L2_CID_MPEG_VIDEO_H264_SEI_FRAME_PACKING (V4L2_CID_MPEG_BASE+368) -#define V4L2_CID_MPEG_VIDEO_H264_SEI_FP_CURRENT_FRAME_0 (V4L2_CID_MPEG_BASE+369) -#define V4L2_CID_MPEG_VIDEO_H264_SEI_FP_ARRANGEMENT_TYPE (V4L2_CID_MPEG_BASE+370) +#define V4L2_CID_MPEG_VIDEO_H264_SEI_FRAME_PACKING (V4L2_CID_CODEC_BASE+368) +#define V4L2_CID_MPEG_VIDEO_H264_SEI_FP_CURRENT_FRAME_0 (V4L2_CID_CODEC_BASE+369) +#define V4L2_CID_MPEG_VIDEO_H264_SEI_FP_ARRANGEMENT_TYPE (V4L2_CID_CODEC_BASE+370) enum v4l2_mpeg_video_h264_sei_fp_arrangement_type { V4L2_MPEG_VIDEO_H264_SEI_FP_ARRANGEMENT_TYPE_CHECKERBOARD = 0, V4L2_MPEG_VIDEO_H264_SEI_FP_ARRANGEMENT_TYPE_COLUMN = 1, 
@@ -537,8 +568,8 @@ V4L2_MPEG_VIDEO_H264_SEI_FP_ARRANGEMENT_TYPE_TOP_BOTTOM = 4, V4L2_MPEG_VIDEO_H264_SEI_FP_ARRANGEMENT_TYPE_TEMPORAL = 5, }; -#define V4L2_CID_MPEG_VIDEO_H264_FMO (V4L2_CID_MPEG_BASE+371) -#define V4L2_CID_MPEG_VIDEO_H264_FMO_MAP_TYPE (V4L2_CID_MPEG_BASE+372) +#define V4L2_CID_MPEG_VIDEO_H264_FMO (V4L2_CID_CODEC_BASE+371) +#define V4L2_CID_MPEG_VIDEO_H264_FMO_MAP_TYPE (V4L2_CID_CODEC_BASE+372) enum v4l2_mpeg_video_h264_fmo_map_type { V4L2_MPEG_VIDEO_H264_FMO_MAP_TYPE_INTERLEAVED_SLICES = 0, V4L2_MPEG_VIDEO_H264_FMO_MAP_TYPE_SCATTERED_SLICES = 1, @@ -548,36 +579,45 @@ V4L2_MPEG_VIDEO_H264_FMO_MAP_TYPE_WIPE_SCAN = 5, V4L2_MPEG_VIDEO_H264_FMO_MAP_TYPE_EXPLICIT = 6, }; -#define V4L2_CID_MPEG_VIDEO_H264_FMO_SLICE_GROUP (V4L2_CID_MPEG_BASE+373) -#define V4L2_CID_MPEG_VIDEO_H264_FMO_CHANGE_DIRECTION (V4L2_CID_MPEG_BASE+374) +#define V4L2_CID_MPEG_VIDEO_H264_FMO_SLICE_GROUP (V4L2_CID_CODEC_BASE+373) +#define V4L2_CID_MPEG_VIDEO_H264_FMO_CHANGE_DIRECTION (V4L2_CID_CODEC_BASE+374) enum v4l2_mpeg_video_h264_fmo_change_dir { V4L2_MPEG_VIDEO_H264_FMO_CHANGE_DIR_RIGHT = 0, V4L2_MPEG_VIDEO_H264_FMO_CHANGE_DIR_LEFT = 1, }; -#define V4L2_CID_MPEG_VIDEO_H264_FMO_CHANGE_RATE (V4L2_CID_MPEG_BASE+375) -#define V4L2_CID_MPEG_VIDEO_H264_FMO_RUN_LENGTH (V4L2_CID_MPEG_BASE+376) -#define V4L2_CID_MPEG_VIDEO_H264_ASO (V4L2_CID_MPEG_BASE+377) -#define V4L2_CID_MPEG_VIDEO_H264_ASO_SLICE_ORDER (V4L2_CID_MPEG_BASE+378) -#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING (V4L2_CID_MPEG_BASE+379) -#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING_TYPE (V4L2_CID_MPEG_BASE+380) +#define V4L2_CID_MPEG_VIDEO_H264_FMO_CHANGE_RATE (V4L2_CID_CODEC_BASE+375) +#define V4L2_CID_MPEG_VIDEO_H264_FMO_RUN_LENGTH (V4L2_CID_CODEC_BASE+376) +#define V4L2_CID_MPEG_VIDEO_H264_ASO (V4L2_CID_CODEC_BASE+377) +#define V4L2_CID_MPEG_VIDEO_H264_ASO_SLICE_ORDER (V4L2_CID_CODEC_BASE+378) +#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING (V4L2_CID_CODEC_BASE+379) +#define 
V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING_TYPE (V4L2_CID_CODEC_BASE+380) enum v4l2_mpeg_video_h264_hierarchical_coding_type { V4L2_MPEG_VIDEO_H264_HIERARCHICAL_CODING_B = 0, V4L2_MPEG_VIDEO_H264_HIERARCHICAL_CODING_P = 1, }; -#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING_LAYER (V4L2_CID_MPEG_BASE+381) -#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING_LAYER_QP (V4L2_CID_MPEG_BASE+382) -#define V4L2_CID_MPEG_VIDEO_H264_CONSTRAINED_INTRA_PREDICTION (V4L2_CID_MPEG_BASE+383) -#define V4L2_CID_MPEG_VIDEO_H264_CHROMA_QP_INDEX_OFFSET (V4L2_CID_MPEG_BASE+384) -#define V4L2_CID_MPEG_VIDEO_H264_I_FRAME_MIN_QP (V4L2_CID_MPEG_BASE+385) -#define V4L2_CID_MPEG_VIDEO_H264_I_FRAME_MAX_QP (V4L2_CID_MPEG_BASE+386) -#define V4L2_CID_MPEG_VIDEO_H264_P_FRAME_MIN_QP (V4L2_CID_MPEG_BASE+387) -#define V4L2_CID_MPEG_VIDEO_H264_P_FRAME_MAX_QP (V4L2_CID_MPEG_BASE+388) -#define V4L2_CID_MPEG_VIDEO_MPEG4_I_FRAME_QP (V4L2_CID_MPEG_BASE+400) -#define V4L2_CID_MPEG_VIDEO_MPEG4_P_FRAME_QP (V4L2_CID_MPEG_BASE+401) -#define V4L2_CID_MPEG_VIDEO_MPEG4_B_FRAME_QP (V4L2_CID_MPEG_BASE+402) -#define V4L2_CID_MPEG_VIDEO_MPEG4_MIN_QP (V4L2_CID_MPEG_BASE+403) -#define V4L2_CID_MPEG_VIDEO_MPEG4_MAX_QP (V4L2_CID_MPEG_BASE+404) -#define V4L2_CID_MPEG_VIDEO_MPEG4_LEVEL (V4L2_CID_MPEG_BASE+405) +#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING_LAYER (V4L2_CID_CODEC_BASE+381) +#define V4L2_CID_MPEG_VIDEO_H264_HIERARCHICAL_CODING_LAYER_QP (V4L2_CID_CODEC_BASE+382) +#define V4L2_CID_MPEG_VIDEO_H264_CONSTRAINED_INTRA_PREDICTION (V4L2_CID_CODEC_BASE+383) +#define V4L2_CID_MPEG_VIDEO_H264_CHROMA_QP_INDEX_OFFSET (V4L2_CID_CODEC_BASE+384) +#define V4L2_CID_MPEG_VIDEO_H264_I_FRAME_MIN_QP (V4L2_CID_CODEC_BASE+385) +#define V4L2_CID_MPEG_VIDEO_H264_I_FRAME_MAX_QP (V4L2_CID_CODEC_BASE+386) +#define V4L2_CID_MPEG_VIDEO_H264_P_FRAME_MIN_QP (V4L2_CID_CODEC_BASE+387) +#define V4L2_CID_MPEG_VIDEO_H264_P_FRAME_MAX_QP (V4L2_CID_CODEC_BASE+388) +#define V4L2_CID_MPEG_VIDEO_H264_B_FRAME_MIN_QP 
(V4L2_CID_CODEC_BASE+389) +#define V4L2_CID_MPEG_VIDEO_H264_B_FRAME_MAX_QP (V4L2_CID_CODEC_BASE+390) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L0_BR (V4L2_CID_CODEC_BASE+391) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L1_BR (V4L2_CID_CODEC_BASE+392) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L2_BR (V4L2_CID_CODEC_BASE+393) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L3_BR (V4L2_CID_CODEC_BASE+394) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L4_BR (V4L2_CID_CODEC_BASE+395) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L5_BR (V4L2_CID_CODEC_BASE+396) +#define V4L2_CID_MPEG_VIDEO_H264_HIER_CODING_L6_BR (V4L2_CID_CODEC_BASE+397) +#define V4L2_CID_MPEG_VIDEO_MPEG4_I_FRAME_QP (V4L2_CID_CODEC_BASE+400) +#define V4L2_CID_MPEG_VIDEO_MPEG4_P_FRAME_QP (V4L2_CID_CODEC_BASE+401) +#define V4L2_CID_MPEG_VIDEO_MPEG4_B_FRAME_QP (V4L2_CID_CODEC_BASE+402) +#define V4L2_CID_MPEG_VIDEO_MPEG4_MIN_QP (V4L2_CID_CODEC_BASE+403) +#define V4L2_CID_MPEG_VIDEO_MPEG4_MAX_QP (V4L2_CID_CODEC_BASE+404) +#define V4L2_CID_MPEG_VIDEO_MPEG4_LEVEL (V4L2_CID_CODEC_BASE+405) enum v4l2_mpeg_video_mpeg4_level { V4L2_MPEG_VIDEO_MPEG4_LEVEL_0 = 0, V4L2_MPEG_VIDEO_MPEG4_LEVEL_0B = 1, @@ -588,7 +628,7 @@ V4L2_MPEG_VIDEO_MPEG4_LEVEL_4 = 6, V4L2_MPEG_VIDEO_MPEG4_LEVEL_5 = 7, }; -#define V4L2_CID_MPEG_VIDEO_MPEG4_PROFILE (V4L2_CID_MPEG_BASE+406) +#define V4L2_CID_MPEG_VIDEO_MPEG4_PROFILE (V4L2_CID_CODEC_BASE+406) enum v4l2_mpeg_video_mpeg4_profile { V4L2_MPEG_VIDEO_MPEG4_PROFILE_SIMPLE = 0, V4L2_MPEG_VIDEO_MPEG4_PROFILE_ADVANCED_SIMPLE = 1, @@ -596,40 +636,40 @@ V4L2_MPEG_VIDEO_MPEG4_PROFILE_SIMPLE_SCALABLE = 3, V4L2_MPEG_VIDEO_MPEG4_PROFILE_ADVANCED_CODING_EFFICIENCY = 4, }; -#define V4L2_CID_MPEG_VIDEO_MPEG4_QPEL (V4L2_CID_MPEG_BASE+407) +#define V4L2_CID_MPEG_VIDEO_MPEG4_QPEL (V4L2_CID_CODEC_BASE+407) /* Control IDs for VP8 streams * Although VP8 is not part of MPEG we add these controls to the MPEG class * as that class is already handling other video compression standards */ -#define 
V4L2_CID_MPEG_VIDEO_VPX_NUM_PARTITIONS (V4L2_CID_MPEG_BASE+500) +#define V4L2_CID_MPEG_VIDEO_VPX_NUM_PARTITIONS (V4L2_CID_CODEC_BASE+500) enum v4l2_vp8_num_partitions { V4L2_CID_MPEG_VIDEO_VPX_1_PARTITION = 0, V4L2_CID_MPEG_VIDEO_VPX_2_PARTITIONS = 1, V4L2_CID_MPEG_VIDEO_VPX_4_PARTITIONS = 2, V4L2_CID_MPEG_VIDEO_VPX_8_PARTITIONS = 3, }; -#define V4L2_CID_MPEG_VIDEO_VPX_IMD_DISABLE_4X4 (V4L2_CID_MPEG_BASE+501) -#define V4L2_CID_MPEG_VIDEO_VPX_NUM_REF_FRAMES (V4L2_CID_MPEG_BASE+502) +#define V4L2_CID_MPEG_VIDEO_VPX_IMD_DISABLE_4X4 (V4L2_CID_CODEC_BASE+501) +#define V4L2_CID_MPEG_VIDEO_VPX_NUM_REF_FRAMES (V4L2_CID_CODEC_BASE+502) enum v4l2_vp8_num_ref_frames { V4L2_CID_MPEG_VIDEO_VPX_1_REF_FRAME = 0, V4L2_CID_MPEG_VIDEO_VPX_2_REF_FRAME = 1, V4L2_CID_MPEG_VIDEO_VPX_3_REF_FRAME = 2, }; -#define V4L2_CID_MPEG_VIDEO_VPX_FILTER_LEVEL (V4L2_CID_MPEG_BASE+503) -#define V4L2_CID_MPEG_VIDEO_VPX_FILTER_SHARPNESS (V4L2_CID_MPEG_BASE+504) -#define V4L2_CID_MPEG_VIDEO_VPX_GOLDEN_FRAME_REF_PERIOD (V4L2_CID_MPEG_BASE+505) -#define V4L2_CID_MPEG_VIDEO_VPX_GOLDEN_FRAME_SEL (V4L2_CID_MPEG_BASE+506) +#define V4L2_CID_MPEG_VIDEO_VPX_FILTER_LEVEL (V4L2_CID_CODEC_BASE+503) +#define V4L2_CID_MPEG_VIDEO_VPX_FILTER_SHARPNESS (V4L2_CID_CODEC_BASE+504) +#define V4L2_CID_MPEG_VIDEO_VPX_GOLDEN_FRAME_REF_PERIOD (V4L2_CID_CODEC_BASE+505) +#define V4L2_CID_MPEG_VIDEO_VPX_GOLDEN_FRAME_SEL (V4L2_CID_CODEC_BASE+506) enum v4l2_vp8_golden_frame_sel { V4L2_CID_MPEG_VIDEO_VPX_GOLDEN_FRAME_USE_PREV = 0, V4L2_CID_MPEG_VIDEO_VPX_GOLDEN_FRAME_USE_REF_PERIOD = 1, }; -#define V4L2_CID_MPEG_VIDEO_VPX_MIN_QP (V4L2_CID_MPEG_BASE+507) -#define V4L2_CID_MPEG_VIDEO_VPX_MAX_QP (V4L2_CID_MPEG_BASE+508) -#define V4L2_CID_MPEG_VIDEO_VPX_I_FRAME_QP (V4L2_CID_MPEG_BASE+509) -#define V4L2_CID_MPEG_VIDEO_VPX_P_FRAME_QP (V4L2_CID_MPEG_BASE+510) +#define V4L2_CID_MPEG_VIDEO_VPX_MIN_QP (V4L2_CID_CODEC_BASE+507) +#define V4L2_CID_MPEG_VIDEO_VPX_MAX_QP (V4L2_CID_CODEC_BASE+508) +#define V4L2_CID_MPEG_VIDEO_VPX_I_FRAME_QP 
(V4L2_CID_CODEC_BASE+509) +#define V4L2_CID_MPEG_VIDEO_VPX_P_FRAME_QP (V4L2_CID_CODEC_BASE+510) -#define V4L2_CID_MPEG_VIDEO_VP8_PROFILE (V4L2_CID_MPEG_BASE+511) +#define V4L2_CID_MPEG_VIDEO_VP8_PROFILE (V4L2_CID_CODEC_BASE+511) enum v4l2_mpeg_video_vp8_profile { V4L2_MPEG_VIDEO_VP8_PROFILE_0 = 0, V4L2_MPEG_VIDEO_VP8_PROFILE_1 = 1, @@ -638,42 +678,59 @@ }; /* Deprecated alias for compatibility reasons. */ #define V4L2_CID_MPEG_VIDEO_VPX_PROFILE V4L2_CID_MPEG_VIDEO_VP8_PROFILE -#define V4L2_CID_MPEG_VIDEO_VP9_PROFILE (V4L2_CID_MPEG_BASE+512) +#define V4L2_CID_MPEG_VIDEO_VP9_PROFILE (V4L2_CID_CODEC_BASE+512) enum v4l2_mpeg_video_vp9_profile { V4L2_MPEG_VIDEO_VP9_PROFILE_0 = 0, V4L2_MPEG_VIDEO_VP9_PROFILE_1 = 1, V4L2_MPEG_VIDEO_VP9_PROFILE_2 = 2, V4L2_MPEG_VIDEO_VP9_PROFILE_3 = 3, }; +#define V4L2_CID_MPEG_VIDEO_VP9_LEVEL (V4L2_CID_CODEC_BASE+513) +enum v4l2_mpeg_video_vp9_level { + V4L2_MPEG_VIDEO_VP9_LEVEL_1_0 = 0, + V4L2_MPEG_VIDEO_VP9_LEVEL_1_1 = 1, + V4L2_MPEG_VIDEO_VP9_LEVEL_2_0 = 2, + V4L2_MPEG_VIDEO_VP9_LEVEL_2_1 = 3, + V4L2_MPEG_VIDEO_VP9_LEVEL_3_0 = 4, + V4L2_MPEG_VIDEO_VP9_LEVEL_3_1 = 5, + V4L2_MPEG_VIDEO_VP9_LEVEL_4_0 = 6, + V4L2_MPEG_VIDEO_VP9_LEVEL_4_1 = 7, + V4L2_MPEG_VIDEO_VP9_LEVEL_5_0 = 8, + V4L2_MPEG_VIDEO_VP9_LEVEL_5_1 = 9, + V4L2_MPEG_VIDEO_VP9_LEVEL_5_2 = 10, + V4L2_MPEG_VIDEO_VP9_LEVEL_6_0 = 11, + V4L2_MPEG_VIDEO_VP9_LEVEL_6_1 = 12, + V4L2_MPEG_VIDEO_VP9_LEVEL_6_2 = 13, +}; /* CIDs for HEVC encoding. 
*/ -#define V4L2_CID_MPEG_VIDEO_HEVC_MIN_QP (V4L2_CID_MPEG_BASE + 600) -#define V4L2_CID_MPEG_VIDEO_HEVC_MAX_QP (V4L2_CID_MPEG_BASE + 601) -#define V4L2_CID_MPEG_VIDEO_HEVC_I_FRAME_QP (V4L2_CID_MPEG_BASE + 602) -#define V4L2_CID_MPEG_VIDEO_HEVC_P_FRAME_QP (V4L2_CID_MPEG_BASE + 603) -#define V4L2_CID_MPEG_VIDEO_HEVC_B_FRAME_QP (V4L2_CID_MPEG_BASE + 604) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_QP (V4L2_CID_MPEG_BASE + 605) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_TYPE (V4L2_CID_MPEG_BASE + 606) +#define V4L2_CID_MPEG_VIDEO_HEVC_MIN_QP (V4L2_CID_CODEC_BASE + 600) +#define V4L2_CID_MPEG_VIDEO_HEVC_MAX_QP (V4L2_CID_CODEC_BASE + 601) +#define V4L2_CID_MPEG_VIDEO_HEVC_I_FRAME_QP (V4L2_CID_CODEC_BASE + 602) +#define V4L2_CID_MPEG_VIDEO_HEVC_P_FRAME_QP (V4L2_CID_CODEC_BASE + 603) +#define V4L2_CID_MPEG_VIDEO_HEVC_B_FRAME_QP (V4L2_CID_CODEC_BASE + 604) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_QP (V4L2_CID_CODEC_BASE + 605) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_TYPE (V4L2_CID_CODEC_BASE + 606) enum v4l2_mpeg_video_hevc_hier_coding_type { V4L2_MPEG_VIDEO_HEVC_HIERARCHICAL_CODING_B = 0, V4L2_MPEG_VIDEO_HEVC_HIERARCHICAL_CODING_P = 1, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_LAYER (V4L2_CID_MPEG_BASE + 607) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L0_QP (V4L2_CID_MPEG_BASE + 608) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L1_QP (V4L2_CID_MPEG_BASE + 609) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L2_QP (V4L2_CID_MPEG_BASE + 610) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L3_QP (V4L2_CID_MPEG_BASE + 611) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L4_QP (V4L2_CID_MPEG_BASE + 612) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L5_QP (V4L2_CID_MPEG_BASE + 613) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L6_QP (V4L2_CID_MPEG_BASE + 614) -#define V4L2_CID_MPEG_VIDEO_HEVC_PROFILE (V4L2_CID_MPEG_BASE + 615) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_LAYER (V4L2_CID_CODEC_BASE + 607) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L0_QP 
(V4L2_CID_CODEC_BASE + 608) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L1_QP (V4L2_CID_CODEC_BASE + 609) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L2_QP (V4L2_CID_CODEC_BASE + 610) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L3_QP (V4L2_CID_CODEC_BASE + 611) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L4_QP (V4L2_CID_CODEC_BASE + 612) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L5_QP (V4L2_CID_CODEC_BASE + 613) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L6_QP (V4L2_CID_CODEC_BASE + 614) +#define V4L2_CID_MPEG_VIDEO_HEVC_PROFILE (V4L2_CID_CODEC_BASE + 615) enum v4l2_mpeg_video_hevc_profile { V4L2_MPEG_VIDEO_HEVC_PROFILE_MAIN = 0, V4L2_MPEG_VIDEO_HEVC_PROFILE_MAIN_STILL_PICTURE = 1, V4L2_MPEG_VIDEO_HEVC_PROFILE_MAIN_10 = 2, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_LEVEL (V4L2_CID_MPEG_BASE + 616) +#define V4L2_CID_MPEG_VIDEO_HEVC_LEVEL (V4L2_CID_CODEC_BASE + 616) enum v4l2_mpeg_video_hevc_level { V4L2_MPEG_VIDEO_HEVC_LEVEL_1 = 0, V4L2_MPEG_VIDEO_HEVC_LEVEL_2 = 1, @@ -689,64 +746,81 @@ V4L2_MPEG_VIDEO_HEVC_LEVEL_6_1 = 11, V4L2_MPEG_VIDEO_HEVC_LEVEL_6_2 = 12, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_FRAME_RATE_RESOLUTION (V4L2_CID_MPEG_BASE + 617) -#define V4L2_CID_MPEG_VIDEO_HEVC_TIER (V4L2_CID_MPEG_BASE + 618) +#define V4L2_CID_MPEG_VIDEO_HEVC_FRAME_RATE_RESOLUTION (V4L2_CID_CODEC_BASE + 617) +#define V4L2_CID_MPEG_VIDEO_HEVC_TIER (V4L2_CID_CODEC_BASE + 618) enum v4l2_mpeg_video_hevc_tier { V4L2_MPEG_VIDEO_HEVC_TIER_MAIN = 0, V4L2_MPEG_VIDEO_HEVC_TIER_HIGH = 1, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_MAX_PARTITION_DEPTH (V4L2_CID_MPEG_BASE + 619) -#define V4L2_CID_MPEG_VIDEO_HEVC_LOOP_FILTER_MODE (V4L2_CID_MPEG_BASE + 620) +#define V4L2_CID_MPEG_VIDEO_HEVC_MAX_PARTITION_DEPTH (V4L2_CID_CODEC_BASE + 619) +#define V4L2_CID_MPEG_VIDEO_HEVC_LOOP_FILTER_MODE (V4L2_CID_CODEC_BASE + 620) enum v4l2_cid_mpeg_video_hevc_loop_filter_mode { V4L2_MPEG_VIDEO_HEVC_LOOP_FILTER_MODE_DISABLED = 0, V4L2_MPEG_VIDEO_HEVC_LOOP_FILTER_MODE_ENABLED = 1, 
V4L2_MPEG_VIDEO_HEVC_LOOP_FILTER_MODE_DISABLED_AT_SLICE_BOUNDARY = 2, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_LF_BETA_OFFSET_DIV2 (V4L2_CID_MPEG_BASE + 621) -#define V4L2_CID_MPEG_VIDEO_HEVC_LF_TC_OFFSET_DIV2 (V4L2_CID_MPEG_BASE + 622) -#define V4L2_CID_MPEG_VIDEO_HEVC_REFRESH_TYPE (V4L2_CID_MPEG_BASE + 623) +#define V4L2_CID_MPEG_VIDEO_HEVC_LF_BETA_OFFSET_DIV2 (V4L2_CID_CODEC_BASE + 621) +#define V4L2_CID_MPEG_VIDEO_HEVC_LF_TC_OFFSET_DIV2 (V4L2_CID_CODEC_BASE + 622) +#define V4L2_CID_MPEG_VIDEO_HEVC_REFRESH_TYPE (V4L2_CID_CODEC_BASE + 623) enum v4l2_cid_mpeg_video_hevc_refresh_type { V4L2_MPEG_VIDEO_HEVC_REFRESH_NONE = 0, V4L2_MPEG_VIDEO_HEVC_REFRESH_CRA = 1, V4L2_MPEG_VIDEO_HEVC_REFRESH_IDR = 2, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_REFRESH_PERIOD (V4L2_CID_MPEG_BASE + 624) -#define V4L2_CID_MPEG_VIDEO_HEVC_LOSSLESS_CU (V4L2_CID_MPEG_BASE + 625) -#define V4L2_CID_MPEG_VIDEO_HEVC_CONST_INTRA_PRED (V4L2_CID_MPEG_BASE + 626) -#define V4L2_CID_MPEG_VIDEO_HEVC_WAVEFRONT (V4L2_CID_MPEG_BASE + 627) -#define V4L2_CID_MPEG_VIDEO_HEVC_GENERAL_PB (V4L2_CID_MPEG_BASE + 628) -#define V4L2_CID_MPEG_VIDEO_HEVC_TEMPORAL_ID (V4L2_CID_MPEG_BASE + 629) -#define V4L2_CID_MPEG_VIDEO_HEVC_STRONG_SMOOTHING (V4L2_CID_MPEG_BASE + 630) -#define V4L2_CID_MPEG_VIDEO_HEVC_MAX_NUM_MERGE_MV_MINUS1 (V4L2_CID_MPEG_BASE + 631) -#define V4L2_CID_MPEG_VIDEO_HEVC_INTRA_PU_SPLIT (V4L2_CID_MPEG_BASE + 632) -#define V4L2_CID_MPEG_VIDEO_HEVC_TMV_PREDICTION (V4L2_CID_MPEG_BASE + 633) -#define V4L2_CID_MPEG_VIDEO_HEVC_WITHOUT_STARTCODE (V4L2_CID_MPEG_BASE + 634) -#define V4L2_CID_MPEG_VIDEO_HEVC_SIZE_OF_LENGTH_FIELD (V4L2_CID_MPEG_BASE + 635) +#define V4L2_CID_MPEG_VIDEO_HEVC_REFRESH_PERIOD (V4L2_CID_CODEC_BASE + 624) +#define V4L2_CID_MPEG_VIDEO_HEVC_LOSSLESS_CU (V4L2_CID_CODEC_BASE + 625) +#define V4L2_CID_MPEG_VIDEO_HEVC_CONST_INTRA_PRED (V4L2_CID_CODEC_BASE + 626) +#define V4L2_CID_MPEG_VIDEO_HEVC_WAVEFRONT (V4L2_CID_CODEC_BASE + 627) +#define V4L2_CID_MPEG_VIDEO_HEVC_GENERAL_PB (V4L2_CID_CODEC_BASE + 
628) +#define V4L2_CID_MPEG_VIDEO_HEVC_TEMPORAL_ID (V4L2_CID_CODEC_BASE + 629) +#define V4L2_CID_MPEG_VIDEO_HEVC_STRONG_SMOOTHING (V4L2_CID_CODEC_BASE + 630) +#define V4L2_CID_MPEG_VIDEO_HEVC_MAX_NUM_MERGE_MV_MINUS1 (V4L2_CID_CODEC_BASE + 631) +#define V4L2_CID_MPEG_VIDEO_HEVC_INTRA_PU_SPLIT (V4L2_CID_CODEC_BASE + 632) +#define V4L2_CID_MPEG_VIDEO_HEVC_TMV_PREDICTION (V4L2_CID_CODEC_BASE + 633) +#define V4L2_CID_MPEG_VIDEO_HEVC_WITHOUT_STARTCODE (V4L2_CID_CODEC_BASE + 634) +#define V4L2_CID_MPEG_VIDEO_HEVC_SIZE_OF_LENGTH_FIELD (V4L2_CID_CODEC_BASE + 635) enum v4l2_cid_mpeg_video_hevc_size_of_length_field { V4L2_MPEG_VIDEO_HEVC_SIZE_0 = 0, V4L2_MPEG_VIDEO_HEVC_SIZE_1 = 1, V4L2_MPEG_VIDEO_HEVC_SIZE_2 = 2, V4L2_MPEG_VIDEO_HEVC_SIZE_4 = 3, }; -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L0_BR (V4L2_CID_MPEG_BASE + 636) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L1_BR (V4L2_CID_MPEG_BASE + 637) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L2_BR (V4L2_CID_MPEG_BASE + 638) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L3_BR (V4L2_CID_MPEG_BASE + 639) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L4_BR (V4L2_CID_MPEG_BASE + 640) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L5_BR (V4L2_CID_MPEG_BASE + 641) -#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L6_BR (V4L2_CID_MPEG_BASE + 642) -#define V4L2_CID_MPEG_VIDEO_REF_NUMBER_FOR_PFRAMES (V4L2_CID_MPEG_BASE + 643) -#define V4L2_CID_MPEG_VIDEO_PREPEND_SPSPPS_TO_IDR (V4L2_CID_MPEG_BASE + 644) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L0_BR (V4L2_CID_CODEC_BASE + 636) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L1_BR (V4L2_CID_CODEC_BASE + 637) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L2_BR (V4L2_CID_CODEC_BASE + 638) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L3_BR (V4L2_CID_CODEC_BASE + 639) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L4_BR (V4L2_CID_CODEC_BASE + 640) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L5_BR (V4L2_CID_CODEC_BASE + 641) +#define V4L2_CID_MPEG_VIDEO_HEVC_HIER_CODING_L6_BR 
(V4L2_CID_CODEC_BASE + 642) +#define V4L2_CID_MPEG_VIDEO_REF_NUMBER_FOR_PFRAMES (V4L2_CID_CODEC_BASE + 643) +#define V4L2_CID_MPEG_VIDEO_PREPEND_SPSPPS_TO_IDR (V4L2_CID_CODEC_BASE + 644) +#define V4L2_CID_MPEG_VIDEO_CONSTANT_QUALITY (V4L2_CID_CODEC_BASE + 645) +#define V4L2_CID_MPEG_VIDEO_FRAME_SKIP_MODE (V4L2_CID_CODEC_BASE + 646) +enum v4l2_mpeg_video_frame_skip_mode { + V4L2_MPEG_VIDEO_FRAME_SKIP_MODE_DISABLED = 0, + V4L2_MPEG_VIDEO_FRAME_SKIP_MODE_LEVEL_LIMIT = 1, + V4L2_MPEG_VIDEO_FRAME_SKIP_MODE_BUF_LIMIT = 2, +}; + +#define V4L2_CID_MPEG_VIDEO_HEVC_I_FRAME_MIN_QP (V4L2_CID_CODEC_BASE + 647) +#define V4L2_CID_MPEG_VIDEO_HEVC_I_FRAME_MAX_QP (V4L2_CID_CODEC_BASE + 648) +#define V4L2_CID_MPEG_VIDEO_HEVC_P_FRAME_MIN_QP (V4L2_CID_CODEC_BASE + 649) +#define V4L2_CID_MPEG_VIDEO_HEVC_P_FRAME_MAX_QP (V4L2_CID_CODEC_BASE + 650) +#define V4L2_CID_MPEG_VIDEO_HEVC_B_FRAME_MIN_QP (V4L2_CID_CODEC_BASE + 651) +#define V4L2_CID_MPEG_VIDEO_HEVC_B_FRAME_MAX_QP (V4L2_CID_CODEC_BASE + 652) + +#define V4L2_CID_MPEG_VIDEO_DEC_DISPLAY_DELAY (V4L2_CID_CODEC_BASE + 653) +#define V4L2_CID_MPEG_VIDEO_DEC_DISPLAY_DELAY_ENABLE (V4L2_CID_CODEC_BASE + 654) /* MPEG-class control IDs specific to the CX2341x driver as defined by V4L2 */ -#define V4L2_CID_MPEG_CX2341X_BASE (V4L2_CTRL_CLASS_MPEG | 0x1000) -#define V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE (V4L2_CID_MPEG_CX2341X_BASE+0) +#define V4L2_CID_CODEC_CX2341X_BASE (V4L2_CTRL_CLASS_CODEC | 0x1000) +#define V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE (V4L2_CID_CODEC_CX2341X_BASE+0) enum v4l2_mpeg_cx2341x_video_spatial_filter_mode { V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_MANUAL = 0, V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_AUTO = 1, }; -#define V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER (V4L2_CID_MPEG_CX2341X_BASE+1) -#define V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE (V4L2_CID_MPEG_CX2341X_BASE+2) +#define V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER (V4L2_CID_CODEC_CX2341X_BASE+1) +#define 
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE (V4L2_CID_CODEC_CX2341X_BASE+2) enum v4l2_mpeg_cx2341x_video_luma_spatial_filter_type { V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_OFF = 0, V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_HOR = 1, @@ -754,18 +828,18 @@ V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_HV_SEPARABLE = 3, V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_SYM_NON_SEPARABLE = 4, }; -#define V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE (V4L2_CID_MPEG_CX2341X_BASE+3) +#define V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE (V4L2_CID_CODEC_CX2341X_BASE+3) enum v4l2_mpeg_cx2341x_video_chroma_spatial_filter_type { V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_OFF = 0, V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_1D_HOR = 1, }; -#define V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE (V4L2_CID_MPEG_CX2341X_BASE+4) +#define V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE (V4L2_CID_CODEC_CX2341X_BASE+4) enum v4l2_mpeg_cx2341x_video_temporal_filter_mode { V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_MANUAL = 0, V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_AUTO = 1, }; -#define V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER (V4L2_CID_MPEG_CX2341X_BASE+5) -#define V4L2_CID_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE (V4L2_CID_MPEG_CX2341X_BASE+6) +#define V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER (V4L2_CID_CODEC_CX2341X_BASE+5) +#define V4L2_CID_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE (V4L2_CID_CODEC_CX2341X_BASE+6) enum v4l2_mpeg_cx2341x_video_median_filter_type { V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_OFF = 0, V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR = 1, @@ -773,38 +847,38 @@ V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR_VERT = 3, V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_DIAG = 4, }; -#define V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_BOTTOM (V4L2_CID_MPEG_CX2341X_BASE+7) -#define V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_TOP (V4L2_CID_MPEG_CX2341X_BASE+8) -#define 
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_BOTTOM (V4L2_CID_MPEG_CX2341X_BASE+9) -#define V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_TOP (V4L2_CID_MPEG_CX2341X_BASE+10) -#define V4L2_CID_MPEG_CX2341X_STREAM_INSERT_NAV_PACKETS (V4L2_CID_MPEG_CX2341X_BASE+11) +#define V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_BOTTOM (V4L2_CID_CODEC_CX2341X_BASE+7) +#define V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_TOP (V4L2_CID_CODEC_CX2341X_BASE+8) +#define V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_BOTTOM (V4L2_CID_CODEC_CX2341X_BASE+9) +#define V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_TOP (V4L2_CID_CODEC_CX2341X_BASE+10) +#define V4L2_CID_MPEG_CX2341X_STREAM_INSERT_NAV_PACKETS (V4L2_CID_CODEC_CX2341X_BASE+11) /* MPEG-class control IDs specific to the Samsung MFC 5.1 driver as defined by V4L2 */ -#define V4L2_CID_MPEG_MFC51_BASE (V4L2_CTRL_CLASS_MPEG | 0x1100) +#define V4L2_CID_CODEC_MFC51_BASE (V4L2_CTRL_CLASS_CODEC | 0x1100) -#define V4L2_CID_MPEG_MFC51_VIDEO_DECODER_H264_DISPLAY_DELAY (V4L2_CID_MPEG_MFC51_BASE+0) -#define V4L2_CID_MPEG_MFC51_VIDEO_DECODER_H264_DISPLAY_DELAY_ENABLE (V4L2_CID_MPEG_MFC51_BASE+1) -#define V4L2_CID_MPEG_MFC51_VIDEO_FRAME_SKIP_MODE (V4L2_CID_MPEG_MFC51_BASE+2) +#define V4L2_CID_MPEG_MFC51_VIDEO_DECODER_H264_DISPLAY_DELAY (V4L2_CID_CODEC_MFC51_BASE+0) +#define V4L2_CID_MPEG_MFC51_VIDEO_DECODER_H264_DISPLAY_DELAY_ENABLE (V4L2_CID_CODEC_MFC51_BASE+1) +#define V4L2_CID_MPEG_MFC51_VIDEO_FRAME_SKIP_MODE (V4L2_CID_CODEC_MFC51_BASE+2) enum v4l2_mpeg_mfc51_video_frame_skip_mode { V4L2_MPEG_MFC51_VIDEO_FRAME_SKIP_MODE_DISABLED = 0, V4L2_MPEG_MFC51_VIDEO_FRAME_SKIP_MODE_LEVEL_LIMIT = 1, V4L2_MPEG_MFC51_VIDEO_FRAME_SKIP_MODE_BUF_LIMIT = 2, }; -#define V4L2_CID_MPEG_MFC51_VIDEO_FORCE_FRAME_TYPE (V4L2_CID_MPEG_MFC51_BASE+3) +#define V4L2_CID_MPEG_MFC51_VIDEO_FORCE_FRAME_TYPE (V4L2_CID_CODEC_MFC51_BASE+3) enum v4l2_mpeg_mfc51_video_force_frame_type { V4L2_MPEG_MFC51_VIDEO_FORCE_FRAME_TYPE_DISABLED = 0, 
V4L2_MPEG_MFC51_VIDEO_FORCE_FRAME_TYPE_I_FRAME = 1, V4L2_MPEG_MFC51_VIDEO_FORCE_FRAME_TYPE_NOT_CODED = 2, }; -#define V4L2_CID_MPEG_MFC51_VIDEO_PADDING (V4L2_CID_MPEG_MFC51_BASE+4) -#define V4L2_CID_MPEG_MFC51_VIDEO_PADDING_YUV (V4L2_CID_MPEG_MFC51_BASE+5) -#define V4L2_CID_MPEG_MFC51_VIDEO_RC_FIXED_TARGET_BIT (V4L2_CID_MPEG_MFC51_BASE+6) -#define V4L2_CID_MPEG_MFC51_VIDEO_RC_REACTION_COEFF (V4L2_CID_MPEG_MFC51_BASE+7) -#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_ACTIVITY (V4L2_CID_MPEG_MFC51_BASE+50) -#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_DARK (V4L2_CID_MPEG_MFC51_BASE+51) -#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_SMOOTH (V4L2_CID_MPEG_MFC51_BASE+52) -#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_STATIC (V4L2_CID_MPEG_MFC51_BASE+53) -#define V4L2_CID_MPEG_MFC51_VIDEO_H264_NUM_REF_PIC_FOR_P (V4L2_CID_MPEG_MFC51_BASE+54) +#define V4L2_CID_MPEG_MFC51_VIDEO_PADDING (V4L2_CID_CODEC_MFC51_BASE+4) +#define V4L2_CID_MPEG_MFC51_VIDEO_PADDING_YUV (V4L2_CID_CODEC_MFC51_BASE+5) +#define V4L2_CID_MPEG_MFC51_VIDEO_RC_FIXED_TARGET_BIT (V4L2_CID_CODEC_MFC51_BASE+6) +#define V4L2_CID_MPEG_MFC51_VIDEO_RC_REACTION_COEFF (V4L2_CID_CODEC_MFC51_BASE+7) +#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_ACTIVITY (V4L2_CID_CODEC_MFC51_BASE+50) +#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_DARK (V4L2_CID_CODEC_MFC51_BASE+51) +#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_SMOOTH (V4L2_CID_CODEC_MFC51_BASE+52) +#define V4L2_CID_MPEG_MFC51_VIDEO_H264_ADAPTIVE_RC_STATIC (V4L2_CID_CODEC_MFC51_BASE+53) +#define V4L2_CID_MPEG_MFC51_VIDEO_H264_NUM_REF_PIC_FOR_P (V4L2_CID_CODEC_MFC51_BASE+54) /* Camera class control IDs */ @@ -918,6 +992,13 @@ #define V4L2_CID_PAN_SPEED (V4L2_CID_CAMERA_CLASS_BASE+32) #define V4L2_CID_TILT_SPEED (V4L2_CID_CAMERA_CLASS_BASE+33) +#define V4L2_CID_CAMERA_ORIENTATION (V4L2_CID_CAMERA_CLASS_BASE+34) +#define V4L2_CAMERA_ORIENTATION_FRONT 0 +#define V4L2_CAMERA_ORIENTATION_BACK 1 +#define V4L2_CAMERA_ORIENTATION_EXTERNAL 2 
+ +#define V4L2_CID_CAMERA_SENSOR_ROTATION (V4L2_CID_CAMERA_CLASS_BASE+35) + /* FM Modulator class control IDs */ #define V4L2_CID_FM_TX_CLASS_BASE (V4L2_CTRL_CLASS_FM_TX | 0x900) @@ -1041,6 +1122,7 @@ #define V4L2_CID_TEST_PATTERN_BLUE (V4L2_CID_IMAGE_SOURCE_CLASS_BASE + 6) #define V4L2_CID_TEST_PATTERN_GREENB (V4L2_CID_IMAGE_SOURCE_CLASS_BASE + 7) #define V4L2_CID_UNIT_CELL_SIZE (V4L2_CID_IMAGE_SOURCE_CLASS_BASE + 8) +#define V4L2_CID_NOTIFY_GAINS (V4L2_CID_IMAGE_SOURCE_CLASS_BASE + 9) /* Image processing controls */ @@ -1134,4 +1216,1094 @@ #define V4L2_CID_DETECT_MD_THRESHOLD_GRID (V4L2_CID_DETECT_CLASS_BASE + 3) #define V4L2_CID_DETECT_MD_REGION_GRID (V4L2_CID_DETECT_CLASS_BASE + 4) + +/* Stateless CODECs controls */ +#define V4L2_CID_CODEC_STATELESS_BASE (V4L2_CTRL_CLASS_CODEC_STATELESS | 0x900) +#define V4L2_CID_CODEC_STATELESS_CLASS (V4L2_CTRL_CLASS_CODEC_STATELESS | 1) + +#define V4L2_CID_STATELESS_H264_DECODE_MODE (V4L2_CID_CODEC_STATELESS_BASE + 0) +/** + * enum v4l2_stateless_h264_decode_mode - Decoding mode + * + * @V4L2_STATELESS_H264_DECODE_MODE_SLICE_BASED: indicates that decoding + * is performed one slice at a time. In this mode, + * V4L2_CID_STATELESS_H264_SLICE_PARAMS must contain the parsed slice + * parameters and the OUTPUT buffer must contain a single slice. + * V4L2_BUF_CAP_SUPPORTS_M2M_HOLD_CAPTURE_BUF feature is used + * in order to support multislice frames. + * @V4L2_STATELESS_H264_DECODE_MODE_FRAME_BASED: indicates that + * decoding is performed per frame. The OUTPUT buffer must contain + * all slices and also both fields. This mode is typically supported + * by device drivers that are able to parse the slice(s) header(s) + * in hardware. When this mode is selected, + * V4L2_CID_STATELESS_H264_SLICE_PARAMS is not used. 
+ */ +enum v4l2_stateless_h264_decode_mode { + V4L2_STATELESS_H264_DECODE_MODE_SLICE_BASED, + V4L2_STATELESS_H264_DECODE_MODE_FRAME_BASED, +}; + +#define V4L2_CID_STATELESS_H264_START_CODE (V4L2_CID_CODEC_STATELESS_BASE + 1) +/** + * enum v4l2_stateless_h264_start_code - Start code + * + * @V4L2_STATELESS_H264_START_CODE_NONE: slices are passed + * to the driver without any start code. + * @V4L2_STATELESS_H264_START_CODE_ANNEX_B: slices are passed + * to the driver with an Annex B start code prefix + * (legal start codes can be 3-bytes 0x000001 or 4-bytes 0x00000001). + * This mode is typically supported by device drivers that parse + * the start code in hardware. + */ +enum v4l2_stateless_h264_start_code { + V4L2_STATELESS_H264_START_CODE_NONE, + V4L2_STATELESS_H264_START_CODE_ANNEX_B, +}; + +#define V4L2_H264_SPS_CONSTRAINT_SET0_FLAG 0x01 +#define V4L2_H264_SPS_CONSTRAINT_SET1_FLAG 0x02 +#define V4L2_H264_SPS_CONSTRAINT_SET2_FLAG 0x04 +#define V4L2_H264_SPS_CONSTRAINT_SET3_FLAG 0x08 +#define V4L2_H264_SPS_CONSTRAINT_SET4_FLAG 0x10 +#define V4L2_H264_SPS_CONSTRAINT_SET5_FLAG 0x20 + +#define V4L2_H264_SPS_FLAG_SEPARATE_COLOUR_PLANE 0x01 +#define V4L2_H264_SPS_FLAG_QPPRIME_Y_ZERO_TRANSFORM_BYPASS 0x02 +#define V4L2_H264_SPS_FLAG_DELTA_PIC_ORDER_ALWAYS_ZERO 0x04 +#define V4L2_H264_SPS_FLAG_GAPS_IN_FRAME_NUM_VALUE_ALLOWED 0x08 +#define V4L2_H264_SPS_FLAG_FRAME_MBS_ONLY 0x10 +#define V4L2_H264_SPS_FLAG_MB_ADAPTIVE_FRAME_FIELD 0x20 +#define V4L2_H264_SPS_FLAG_DIRECT_8X8_INFERENCE 0x40 + +#define V4L2_H264_SPS_HAS_CHROMA_FORMAT(sps) \ + ((sps)->profile_idc == 100 || (sps)->profile_idc == 110 || \ + (sps)->profile_idc == 122 || (sps)->profile_idc == 244 || \ + (sps)->profile_idc == 44 || (sps)->profile_idc == 83 || \ + (sps)->profile_idc == 86 || (sps)->profile_idc == 118 || \ + (sps)->profile_idc == 128 || (sps)->profile_idc == 138 || \ + (sps)->profile_idc == 139 || (sps)->profile_idc == 134 || \ + (sps)->profile_idc == 135) + +#define V4L2_CID_STATELESS_H264_SPS 
(V4L2_CID_CODEC_STATELESS_BASE + 2) +/** + * struct v4l2_ctrl_h264_sps - H264 sequence parameter set + * + * All the members on this sequence parameter set structure match the + * sequence parameter set syntax as specified by the H264 specification. + * + * @profile_idc: see H264 specification. + * @constraint_set_flags: see H264 specification. + * @level_idc: see H264 specification. + * @seq_parameter_set_id: see H264 specification. + * @chroma_format_idc: see H264 specification. + * @bit_depth_luma_minus8: see H264 specification. + * @bit_depth_chroma_minus8: see H264 specification. + * @log2_max_frame_num_minus4: see H264 specification. + * @pic_order_cnt_type: see H264 specification. + * @log2_max_pic_order_cnt_lsb_minus4: see H264 specification. + * @max_num_ref_frames: see H264 specification. + * @num_ref_frames_in_pic_order_cnt_cycle: see H264 specification. + * @offset_for_ref_frame: see H264 specification. + * @offset_for_non_ref_pic: see H264 specification. + * @offset_for_top_to_bottom_field: see H264 specification. + * @pic_width_in_mbs_minus1: see H264 specification. + * @pic_height_in_map_units_minus1: see H264 specification. + * @flags: see V4L2_H264_SPS_FLAG_{}. 
+ */ +struct v4l2_ctrl_h264_sps { + __u8 profile_idc; + __u8 constraint_set_flags; + __u8 level_idc; + __u8 seq_parameter_set_id; + __u8 chroma_format_idc; + __u8 bit_depth_luma_minus8; + __u8 bit_depth_chroma_minus8; + __u8 log2_max_frame_num_minus4; + __u8 pic_order_cnt_type; + __u8 log2_max_pic_order_cnt_lsb_minus4; + __u8 max_num_ref_frames; + __u8 num_ref_frames_in_pic_order_cnt_cycle; + __s32 offset_for_ref_frame[255]; + __s32 offset_for_non_ref_pic; + __s32 offset_for_top_to_bottom_field; + __u16 pic_width_in_mbs_minus1; + __u16 pic_height_in_map_units_minus1; + __u32 flags; +}; + +#define V4L2_H264_PPS_FLAG_ENTROPY_CODING_MODE 0x0001 +#define V4L2_H264_PPS_FLAG_BOTTOM_FIELD_PIC_ORDER_IN_FRAME_PRESENT 0x0002 +#define V4L2_H264_PPS_FLAG_WEIGHTED_PRED 0x0004 +#define V4L2_H264_PPS_FLAG_DEBLOCKING_FILTER_CONTROL_PRESENT 0x0008 +#define V4L2_H264_PPS_FLAG_CONSTRAINED_INTRA_PRED 0x0010 +#define V4L2_H264_PPS_FLAG_REDUNDANT_PIC_CNT_PRESENT 0x0020 +#define V4L2_H264_PPS_FLAG_TRANSFORM_8X8_MODE 0x0040 +#define V4L2_H264_PPS_FLAG_SCALING_MATRIX_PRESENT 0x0080 + +#define V4L2_CID_STATELESS_H264_PPS (V4L2_CID_CODEC_STATELESS_BASE + 3) +/** + * struct v4l2_ctrl_h264_pps - H264 picture parameter set + * + * Except where noted, all the members on this picture parameter set + * structure match the picture parameter set syntax as specified + * by the H264 specification. + * + * In particular, V4L2_H264_PPS_FLAG_SCALING_MATRIX_PRESENT flag + * has a specific meaning. This flag should be set if a non-flat + * scaling matrix applies to the picture. In this case, applications + * are expected to use V4L2_CID_STATELESS_H264_SCALING_MATRIX, + * to pass the values of the non-flat matrices. + * + * @pic_parameter_set_id: see H264 specification. + * @seq_parameter_set_id: see H264 specification. + * @num_slice_groups_minus1: see H264 specification. + * @num_ref_idx_l0_default_active_minus1: see H264 specification. + * @num_ref_idx_l1_default_active_minus1: see H264 specification. 
+ * @weighted_bipred_idc: see H264 specification. + * @pic_init_qp_minus26: see H264 specification. + * @pic_init_qs_minus26: see H264 specification. + * @chroma_qp_index_offset: see H264 specification. + * @second_chroma_qp_index_offset: see H264 specification. + * @flags: see V4L2_H264_PPS_FLAG_{}. + */ +struct v4l2_ctrl_h264_pps { + __u8 pic_parameter_set_id; + __u8 seq_parameter_set_id; + __u8 num_slice_groups_minus1; + __u8 num_ref_idx_l0_default_active_minus1; + __u8 num_ref_idx_l1_default_active_minus1; + __u8 weighted_bipred_idc; + __s8 pic_init_qp_minus26; + __s8 pic_init_qs_minus26; + __s8 chroma_qp_index_offset; + __s8 second_chroma_qp_index_offset; + __u16 flags; +}; + +#define V4L2_CID_STATELESS_H264_SCALING_MATRIX (V4L2_CID_CODEC_STATELESS_BASE + 4) +/** + * struct v4l2_ctrl_h264_scaling_matrix - H264 scaling matrices + * + * @scaling_list_4x4: scaling matrix after applying the inverse + * scanning process. Expected list order is Intra Y, Intra Cb, + * Intra Cr, Inter Y, Inter Cb, Inter Cr. The values on each + * scaling list are expected in raster scan order. + * @scaling_list_8x8: scaling matrix after applying the inverse + * scanning process. Expected list order is Intra Y, Inter Y, + * Intra Cb, Inter Cb, Intra Cr, Inter Cr. The values on each + * scaling list are expected in raster scan order. + * + * Note that the list order is different for the 4x4 and 8x8 + * matrices as per the H264 specification, see table 7-2 "Assignment + * of mnemonic names to scaling list indices and specification of + * fall-back rule". 
+ */ +struct v4l2_ctrl_h264_scaling_matrix { + __u8 scaling_list_4x4[6][16]; + __u8 scaling_list_8x8[6][64]; +}; + +struct v4l2_h264_weight_factors { + __s16 luma_weight[32]; + __s16 luma_offset[32]; + __s16 chroma_weight[32][2]; + __s16 chroma_offset[32][2]; +}; + +#define V4L2_H264_CTRL_PRED_WEIGHTS_REQUIRED(pps, slice) \ + ((((pps)->flags & V4L2_H264_PPS_FLAG_WEIGHTED_PRED) && \ + ((slice)->slice_type == V4L2_H264_SLICE_TYPE_P || \ + (slice)->slice_type == V4L2_H264_SLICE_TYPE_SP)) || \ + ((pps)->weighted_bipred_idc == 1 && \ + (slice)->slice_type == V4L2_H264_SLICE_TYPE_B)) + +#define V4L2_CID_STATELESS_H264_PRED_WEIGHTS (V4L2_CID_CODEC_STATELESS_BASE + 5) +/** + * struct v4l2_ctrl_h264_pred_weights - Prediction weight table + * + * Prediction weight table, which matches the syntax specified + * by the H264 specification. + * + * @luma_log2_weight_denom: see H264 specification. + * @chroma_log2_weight_denom: see H264 specification. + * @weight_factors: luma and chroma weight factors. + */ +struct v4l2_ctrl_h264_pred_weights { + __u16 luma_log2_weight_denom; + __u16 chroma_log2_weight_denom; + struct v4l2_h264_weight_factors weight_factors[2]; +}; + +#define V4L2_H264_SLICE_TYPE_P 0 +#define V4L2_H264_SLICE_TYPE_B 1 +#define V4L2_H264_SLICE_TYPE_I 2 +#define V4L2_H264_SLICE_TYPE_SP 3 +#define V4L2_H264_SLICE_TYPE_SI 4 + +#define V4L2_H264_SLICE_FLAG_DIRECT_SPATIAL_MV_PRED 0x01 +#define V4L2_H264_SLICE_FLAG_SP_FOR_SWITCH 0x02 + +#define V4L2_H264_TOP_FIELD_REF 0x1 +#define V4L2_H264_BOTTOM_FIELD_REF 0x2 +#define V4L2_H264_FRAME_REF 0x3 + +/** + * struct v4l2_h264_reference - H264 picture reference + * + * @fields: indicates how the picture is referenced. + * Valid values are V4L2_H264_{}_REF. + * @index: index into v4l2_ctrl_h264_decode_params.dpb[]. + */ +struct v4l2_h264_reference { + __u8 fields; + __u8 index; +}; + +/* + * Maximum DPB size, as specified by section 'A.3.1 Level limits + * common to the Baseline, Main, and Extended profiles'. 
+ */ +#define V4L2_H264_NUM_DPB_ENTRIES 16 +#define V4L2_H264_REF_LIST_LEN (2 * V4L2_H264_NUM_DPB_ENTRIES) + +#define V4L2_CID_STATELESS_H264_SLICE_PARAMS (V4L2_CID_CODEC_STATELESS_BASE + 6) +/** + * struct v4l2_ctrl_h264_slice_params - H264 slice parameters + * + * This structure holds the H264 syntax elements that are specified + * as non-invariant for the slices in a given frame. + * + * Slice invariant syntax elements are contained in struct + * v4l2_ctrl_h264_decode_params. This is done to reduce the API surface + * on frame-based decoders, where slice header parsing is done by the + * hardware. + * + * Slice invariant syntax elements are specified in specification section + * "7.4.3 Slice header semantics". + * + * Except where noted, the members on this struct match the slice header syntax. + * + * @header_bit_size: offset in bits to slice_data() from the beginning of this slice. + * @first_mb_in_slice: see H264 specification. + * @slice_type: see H264 specification. + * @colour_plane_id: see H264 specification. + * @redundant_pic_cnt: see H264 specification. + * @cabac_init_idc: see H264 specification. + * @slice_qp_delta: see H264 specification. + * @slice_qs_delta: see H264 specification. + * @disable_deblocking_filter_idc: see H264 specification. + * @slice_alpha_c0_offset_div2: see H264 specification. + * @slice_beta_offset_div2: see H264 specification. + * @num_ref_idx_l0_active_minus1: see H264 specification. + * @num_ref_idx_l1_active_minus1: see H264 specification. + * @reserved: padding field. Should be zeroed by applications. + * @ref_pic_list0: reference picture list 0 after applying the per-slice modifications. + * @ref_pic_list1: reference picture list 1 after applying the per-slice modifications. + * @flags: see V4L2_H264_SLICE_FLAG_{}. 
+ */ +struct v4l2_ctrl_h264_slice_params { + __u32 header_bit_size; + __u32 first_mb_in_slice; + __u8 slice_type; + __u8 colour_plane_id; + __u8 redundant_pic_cnt; + __u8 cabac_init_idc; + __s8 slice_qp_delta; + __s8 slice_qs_delta; + __u8 disable_deblocking_filter_idc; + __s8 slice_alpha_c0_offset_div2; + __s8 slice_beta_offset_div2; + __u8 num_ref_idx_l0_active_minus1; + __u8 num_ref_idx_l1_active_minus1; + + __u8 reserved; + + struct v4l2_h264_reference ref_pic_list0[V4L2_H264_REF_LIST_LEN]; + struct v4l2_h264_reference ref_pic_list1[V4L2_H264_REF_LIST_LEN]; + + __u32 flags; +}; + +#define V4L2_H264_DPB_ENTRY_FLAG_VALID 0x01 +#define V4L2_H264_DPB_ENTRY_FLAG_ACTIVE 0x02 +#define V4L2_H264_DPB_ENTRY_FLAG_LONG_TERM 0x04 +#define V4L2_H264_DPB_ENTRY_FLAG_FIELD 0x08 + +/** + * struct v4l2_h264_dpb_entry - H264 decoded picture buffer entry + * + * @reference_ts: timestamp of the V4L2 capture buffer to use as reference. + * The timestamp refers to the timestamp field in struct v4l2_buffer. + * Use v4l2_timeval_to_ns() to convert the struct timeval to a __u64. + * @pic_num: matches PicNum variable assigned during the reference + * picture lists construction process. + * @frame_num: frame identifier which matches frame_num syntax element. + * @fields: indicates how the DPB entry is referenced. Valid values are + * V4L2_H264_{}_REF. + * @reserved: padding field. Should be zeroed by applications. + * @top_field_order_cnt: matches TopFieldOrderCnt picture value. + * @bottom_field_order_cnt: matches BottomFieldOrderCnt picture value. + * Note that picture field is indicated by v4l2_buffer.field. + * @flags: see V4L2_H264_DPB_ENTRY_FLAG_{}. 
+ */ +struct v4l2_h264_dpb_entry { + __u64 reference_ts; + __u32 pic_num; + __u16 frame_num; + __u8 fields; + __u8 reserved[5]; + __s32 top_field_order_cnt; + __s32 bottom_field_order_cnt; + __u32 flags; +}; + +#define V4L2_H264_DECODE_PARAM_FLAG_IDR_PIC 0x01 +#define V4L2_H264_DECODE_PARAM_FLAG_FIELD_PIC 0x02 +#define V4L2_H264_DECODE_PARAM_FLAG_BOTTOM_FIELD 0x04 + +#define V4L2_CID_STATELESS_H264_DECODE_PARAMS (V4L2_CID_CODEC_STATELESS_BASE + 7) +/** + * struct v4l2_ctrl_h264_decode_params - H264 decoding parameters + * + * @dpb: decoded picture buffer. + * @nal_ref_idc: slice header syntax element. + * @frame_num: slice header syntax element. + * @top_field_order_cnt: matches TopFieldOrderCnt picture value. + * @bottom_field_order_cnt: matches BottomFieldOrderCnt picture value. + * Note that picture field is indicated by v4l2_buffer.field. + * @idr_pic_id: slice header syntax element. + * @pic_order_cnt_lsb: slice header syntax element. + * @delta_pic_order_cnt_bottom: slice header syntax element. + * @delta_pic_order_cnt0: slice header syntax element. + * @delta_pic_order_cnt1: slice header syntax element. + * @dec_ref_pic_marking_bit_size: size in bits of dec_ref_pic_marking() + * syntax element. + * @pic_order_cnt_bit_size: size in bits of pic order count syntax. + * @slice_group_change_cycle: slice header syntax element. + * @reserved: padding field. Should be zeroed by applications. + * @flags: see V4L2_H264_DECODE_PARAM_FLAG_{}. 
+ */ +struct v4l2_ctrl_h264_decode_params { + struct v4l2_h264_dpb_entry dpb[V4L2_H264_NUM_DPB_ENTRIES]; + __u16 nal_ref_idc; + __u16 frame_num; + __s32 top_field_order_cnt; + __s32 bottom_field_order_cnt; + __u16 idr_pic_id; + __u16 pic_order_cnt_lsb; + __s32 delta_pic_order_cnt_bottom; + __s32 delta_pic_order_cnt0; + __s32 delta_pic_order_cnt1; + __u32 dec_ref_pic_marking_bit_size; + __u32 pic_order_cnt_bit_size; + __u32 slice_group_change_cycle; + + __u32 reserved; + __u32 flags; +}; + + +/* Stateless FWHT control, used by the vicodec driver */ + +/* Current FWHT version */ +#define V4L2_FWHT_VERSION 3 + +/* Set if this is an interlaced format */ +#define V4L2_FWHT_FL_IS_INTERLACED _BITUL(0) +/* Set if this is a bottom-first (NTSC) interlaced format */ +#define V4L2_FWHT_FL_IS_BOTTOM_FIRST _BITUL(1) +/* Set if each 'frame' contains just one field */ +#define V4L2_FWHT_FL_IS_ALTERNATE _BITUL(2) +/* + * If V4L2_FWHT_FL_IS_ALTERNATE was set, then this is set if this + * 'frame' is the bottom field, else it is the top field. 
+ */ +#define V4L2_FWHT_FL_IS_BOTTOM_FIELD _BITUL(3) +/* Set if the Y' plane is uncompressed */ +#define V4L2_FWHT_FL_LUMA_IS_UNCOMPRESSED _BITUL(4) +/* Set if the Cb plane is uncompressed */ +#define V4L2_FWHT_FL_CB_IS_UNCOMPRESSED _BITUL(5) +/* Set if the Cr plane is uncompressed */ +#define V4L2_FWHT_FL_CR_IS_UNCOMPRESSED _BITUL(6) +/* Set if the chroma plane is full height, if cleared it is half height */ +#define V4L2_FWHT_FL_CHROMA_FULL_HEIGHT _BITUL(7) +/* Set if the chroma plane is full width, if cleared it is half width */ +#define V4L2_FWHT_FL_CHROMA_FULL_WIDTH _BITUL(8) +/* Set if the alpha plane is uncompressed */ +#define V4L2_FWHT_FL_ALPHA_IS_UNCOMPRESSED _BITUL(9) +/* Set if this is an I Frame */ +#define V4L2_FWHT_FL_I_FRAME _BITUL(10) + +/* A 4-values flag - the number of components - 1 */ +#define V4L2_FWHT_FL_COMPONENTS_NUM_MSK GENMASK(18, 16) +#define V4L2_FWHT_FL_COMPONENTS_NUM_OFFSET 16 + +/* A 4-values flag - the pixel encoding type */ +#define V4L2_FWHT_FL_PIXENC_MSK GENMASK(20, 19) +#define V4L2_FWHT_FL_PIXENC_OFFSET 19 +#define V4L2_FWHT_FL_PIXENC_YUV (1 << V4L2_FWHT_FL_PIXENC_OFFSET) +#define V4L2_FWHT_FL_PIXENC_RGB (2 << V4L2_FWHT_FL_PIXENC_OFFSET) +#define V4L2_FWHT_FL_PIXENC_HSV (3 << V4L2_FWHT_FL_PIXENC_OFFSET) + +#define V4L2_CID_STATELESS_FWHT_PARAMS (V4L2_CID_CODEC_STATELESS_BASE + 100) +/** + * struct v4l2_ctrl_fwht_params - FWHT parameters + * + * @backward_ref_ts: timestamp of the V4L2 capture buffer to use as reference. + * The timestamp refers to the timestamp field in struct v4l2_buffer. + * Use v4l2_timeval_to_ns() to convert the struct timeval to a __u64. + * @version: must be V4L2_FWHT_VERSION. + * @width: width of frame. + * @height: height of frame. + * @flags: FWHT flags (see V4L2_FWHT_FL_*). + * @colorspace: the colorspace (enum v4l2_colorspace). + * @xfer_func: the transfer function (enum v4l2_xfer_func). + * @ycbcr_enc: the Y'CbCr encoding (enum v4l2_ycbcr_encoding). 
+ * @quantization: the quantization (enum v4l2_quantization). + */ +struct v4l2_ctrl_fwht_params { + __u64 backward_ref_ts; + __u32 version; + __u32 width; + __u32 height; + __u32 flags; + __u32 colorspace; + __u32 xfer_func; + __u32 ycbcr_enc; + __u32 quantization; +}; + +/* Stateless VP8 control */ + +#define V4L2_VP8_SEGMENT_FLAG_ENABLED 0x01 +#define V4L2_VP8_SEGMENT_FLAG_UPDATE_MAP 0x02 +#define V4L2_VP8_SEGMENT_FLAG_UPDATE_FEATURE_DATA 0x04 +#define V4L2_VP8_SEGMENT_FLAG_DELTA_VALUE_MODE 0x08 + +/** + * struct v4l2_vp8_segment - VP8 segment-based adjustments parameters + * + * @quant_update: update values for the segment quantizer. + * @lf_update: update values for the loop filter level. + * @segment_probs: branch probabilities of the segment_id decoding tree. + * @padding: padding field. Should be zeroed by applications. + * @flags: see V4L2_VP8_SEGMENT_FLAG_{}. + * + * This structure contains segment-based adjustments related parameters. + * See the 'update_segmentation()' part of the frame header syntax, + * and section '9.3. Segment-Based Adjustments' of the VP8 specification + * for more details. + */ +struct v4l2_vp8_segment { + __s8 quant_update[4]; + __s8 lf_update[4]; + __u8 segment_probs[3]; + __u8 padding; + __u32 flags; +}; + +#define V4L2_VP8_LF_ADJ_ENABLE 0x01 +#define V4L2_VP8_LF_DELTA_UPDATE 0x02 +#define V4L2_VP8_LF_FILTER_TYPE_SIMPLE 0x04 + +/** + * struct v4l2_vp8_loop_filter - VP8 loop filter parameters + * + * @ref_frm_delta: Reference frame signed delta values. + * @mb_mode_delta: MB prediction mode signed delta values. + * @sharpness_level: matches sharpness_level syntax element. + * @level: matches loop_filter_level syntax element. + * @padding: padding field. Should be zeroed by applications. + * @flags: see V4L2_VP8_LF_FLAG_{}. + * + * This structure contains loop filter related parameters. + * See the 'mb_lf_adjustments()' part of the frame header syntax, + * and section '9.4. 
Loop Filter Type and Levels' of the VP8 specification
+ * for more details.
+ */
+struct v4l2_vp8_loop_filter {
+ __s8 ref_frm_delta[4];
+ __s8 mb_mode_delta[4];
+ __u8 sharpness_level;
+ __u8 level;
+ __u16 padding;
+ __u32 flags;
+};
+
+/**
+ * struct v4l2_vp8_quantization - VP8 quantization indices
+ *
+ * @y_ac_qi: luma AC coefficient table index.
+ * @y_dc_delta: luma DC delta value.
+ * @y2_dc_delta: y2 block DC delta value.
+ * @y2_ac_delta: y2 block AC delta value.
+ * @uv_dc_delta: chroma DC delta value.
+ * @uv_ac_delta: chroma AC delta value.
+ * @padding: padding field. Should be zeroed by applications.
+ *
+ * This structure contains the quantization indices present
+ * in 'quant_indices()' part of the frame header syntax.
+ * See section '9.6. Dequantization Indices' of the VP8 specification
+ * for more details.
+ */
+struct v4l2_vp8_quantization {
+ __u8 y_ac_qi;
+ __s8 y_dc_delta;
+ __s8 y2_dc_delta;
+ __s8 y2_ac_delta;
+ __s8 uv_dc_delta;
+ __s8 uv_ac_delta;
+ __u16 padding;
+};
+
+#define V4L2_VP8_COEFF_PROB_CNT 11
+#define V4L2_VP8_MV_PROB_CNT 19
+
+/**
+ * struct v4l2_vp8_entropy - VP8 update probabilities
+ *
+ * @coeff_probs: coefficient probability update values.
+ * @y_mode_probs: luma intra-prediction probabilities.
+ * @uv_mode_probs: chroma intra-prediction probabilities.
+ * @mv_probs: mv decoding probability.
+ * @padding: padding field. Should be zeroed by applications.
+ *
+ * This structure contains the update probabilities present in
+ * 'token_prob_update()' and 'mv_prob_update()' part of the frame header.
+ * See section '17.2. Probability Updates' of the VP8 specification
+ * for more details.
+ */ +struct v4l2_vp8_entropy { + __u8 coeff_probs[4][8][3][V4L2_VP8_COEFF_PROB_CNT]; + __u8 y_mode_probs[4]; + __u8 uv_mode_probs[3]; + __u8 mv_probs[2][V4L2_VP8_MV_PROB_CNT]; + __u8 padding[3]; +}; + +/** + * struct v4l2_vp8_entropy_coder_state - VP8 boolean coder state + * + * @range: coder state value for "Range" + * @value: coder state value for "Value" + * @bit_count: number of bits left in range "Value". + * @padding: padding field. Should be zeroed by applications. + * + * This structure contains the state for the boolean coder, as + * explained in section '7. Boolean Entropy Decoder' of the VP8 specification. + */ +struct v4l2_vp8_entropy_coder_state { + __u8 range; + __u8 value; + __u8 bit_count; + __u8 padding; +}; + +#define V4L2_VP8_FRAME_FLAG_KEY_FRAME 0x01 +#define V4L2_VP8_FRAME_FLAG_EXPERIMENTAL 0x02 +#define V4L2_VP8_FRAME_FLAG_SHOW_FRAME 0x04 +#define V4L2_VP8_FRAME_FLAG_MB_NO_SKIP_COEFF 0x08 +#define V4L2_VP8_FRAME_FLAG_SIGN_BIAS_GOLDEN 0x10 +#define V4L2_VP8_FRAME_FLAG_SIGN_BIAS_ALT 0x20 + +#define V4L2_VP8_FRAME_IS_KEY_FRAME(hdr) \ + (!!((hdr)->flags & V4L2_VP8_FRAME_FLAG_KEY_FRAME)) + +#define V4L2_CID_STATELESS_VP8_FRAME (V4L2_CID_CODEC_STATELESS_BASE + 200) +/** + * struct v4l2_ctrl_vp8_frame - VP8 frame parameters + * + * @segment: segmentation parameters. See &v4l2_vp8_segment for more details + * @lf: loop filter parameters. See &v4l2_vp8_loop_filter for more details + * @quant: quantization parameters. See &v4l2_vp8_quantization for more details + * @entropy: update probabilities. See &v4l2_vp8_entropy for more details + * @coder_state: boolean coder state. See &v4l2_vp8_entropy_coder_state for more details + * @width: frame width. + * @height: frame height. + * @horizontal_scale: horizontal scaling factor. + * @vertical_scale: vertical scaling factor. + * @version: bitstream version. + * @prob_skip_false: frame header syntax element. + * @prob_intra: frame header syntax element. + * @prob_last: frame header syntax element. 
+ * @prob_gf: frame header syntax element. + * @num_dct_parts: number of DCT coefficients partitions. + * @first_part_size: size of the first partition, i.e. the control partition. + * @first_part_header_bits: size in bits of the first partition header portion. + * @dct_part_sizes: DCT coefficients sizes. + * @last_frame_ts: "last" reference buffer timestamp. + * The timestamp refers to the timestamp field in struct v4l2_buffer. + * Use v4l2_timeval_to_ns() to convert the struct timeval to a __u64. + * @golden_frame_ts: "golden" reference buffer timestamp. + * @alt_frame_ts: "alt" reference buffer timestamp. + * @flags: see V4L2_VP8_FRAME_FLAG_{}. + */ +struct v4l2_ctrl_vp8_frame { + struct v4l2_vp8_segment segment; + struct v4l2_vp8_loop_filter lf; + struct v4l2_vp8_quantization quant; + struct v4l2_vp8_entropy entropy; + struct v4l2_vp8_entropy_coder_state coder_state; + + __u16 width; + __u16 height; + + __u8 horizontal_scale; + __u8 vertical_scale; + + __u8 version; + __u8 prob_skip_false; + __u8 prob_intra; + __u8 prob_last; + __u8 prob_gf; + __u8 num_dct_parts; + + __u32 first_part_size; + __u32 first_part_header_bits; + __u32 dct_part_sizes[8]; + + __u64 last_frame_ts; + __u64 golden_frame_ts; + __u64 alt_frame_ts; + + __u64 flags; +}; + +/* Stateless MPEG-2 controls */ + +#define V4L2_MPEG2_SEQ_FLAG_PROGRESSIVE 0x01 + +#define V4L2_CID_STATELESS_MPEG2_SEQUENCE (V4L2_CID_CODEC_STATELESS_BASE+220) +/** + * struct v4l2_ctrl_mpeg2_sequence - MPEG-2 sequence header + * + * All the members on this structure match the sequence header and sequence + * extension syntaxes as specified by the MPEG-2 specification. + * + * Fields horizontal_size, vertical_size and vbv_buffer_size are a + * combination of respective _value and extension syntax elements, + * as described in section 6.3.3 "Sequence header". + * + * @horizontal_size: combination of elements horizontal_size_value and + * horizontal_size_extension. 
+ * @vertical_size: combination of elements vertical_size_value and
+ * vertical_size_extension.
+ * @vbv_buffer_size: combination of elements vbv_buffer_size_value and
+ * vbv_buffer_size_extension.
+ * @profile_and_level_indication: see MPEG-2 specification.
+ * @chroma_format: see MPEG-2 specification.
+ * @flags: see V4L2_MPEG2_SEQ_FLAG_{}.
+ */
+struct v4l2_ctrl_mpeg2_sequence {
+ __u16 horizontal_size;
+ __u16 vertical_size;
+ __u32 vbv_buffer_size;
+ __u16 profile_and_level_indication;
+ __u8 chroma_format;
+ __u8 flags;
+};
+
+#define V4L2_MPEG2_PIC_CODING_TYPE_I 1
+#define V4L2_MPEG2_PIC_CODING_TYPE_P 2
+#define V4L2_MPEG2_PIC_CODING_TYPE_B 3
+#define V4L2_MPEG2_PIC_CODING_TYPE_D 4
+
+#define V4L2_MPEG2_PIC_TOP_FIELD 0x1
+#define V4L2_MPEG2_PIC_BOTTOM_FIELD 0x2
+#define V4L2_MPEG2_PIC_FRAME 0x3
+
+#define V4L2_MPEG2_PIC_FLAG_TOP_FIELD_FIRST 0x0001
+#define V4L2_MPEG2_PIC_FLAG_FRAME_PRED_DCT 0x0002
+#define V4L2_MPEG2_PIC_FLAG_CONCEALMENT_MV 0x0004
+#define V4L2_MPEG2_PIC_FLAG_Q_SCALE_TYPE 0x0008
+#define V4L2_MPEG2_PIC_FLAG_INTRA_VLC 0x0010
+#define V4L2_MPEG2_PIC_FLAG_ALT_SCAN 0x0020
+#define V4L2_MPEG2_PIC_FLAG_REPEAT_FIRST 0x0040
+#define V4L2_MPEG2_PIC_FLAG_PROGRESSIVE 0x0080
+
+#define V4L2_CID_STATELESS_MPEG2_PICTURE (V4L2_CID_CODEC_STATELESS_BASE+221)
+/**
+ * struct v4l2_ctrl_mpeg2_picture - MPEG-2 picture header
+ *
+ * All the members on this structure match the picture header and picture
+ * coding extension syntaxes as specified by the MPEG-2 specification.
+ *
+ * @backward_ref_ts: timestamp of the V4L2 capture buffer to use as
+ * reference for backward prediction.
+ * @forward_ref_ts: timestamp of the V4L2 capture buffer to use as
+ * reference for forward prediction. These timestamps refer to the
+ * timestamp field in struct v4l2_buffer. Use v4l2_timeval_to_ns()
+ * to convert the struct timeval to a __u64.
+ * @flags: see V4L2_MPEG2_PIC_FLAG_{}.
+ * @f_code: see MPEG-2 specification.
+ * @picture_coding_type: see MPEG-2 specification.
+ * @picture_structure: see V4L2_MPEG2_PIC_{}_FIELD.
+ * @intra_dc_precision: see MPEG-2 specification.
+ * @reserved: padding field. Should be zeroed by applications.
+ */
+struct v4l2_ctrl_mpeg2_picture {
+ __u64 backward_ref_ts;
+ __u64 forward_ref_ts;
+ __u32 flags;
+ __u8 f_code[2][2];
+ __u8 picture_coding_type;
+ __u8 picture_structure;
+ __u8 intra_dc_precision;
+ __u8 reserved[5];
+};
+
+#define V4L2_CID_STATELESS_MPEG2_QUANTISATION (V4L2_CID_CODEC_STATELESS_BASE+222)
+/**
+ * struct v4l2_ctrl_mpeg2_quantisation - MPEG-2 quantisation
+ *
+ * Quantisation matrices as specified by section 6.3.7
+ * "Quant matrix extension".
+ *
+ * @intra_quantiser_matrix: The quantisation matrix coefficients
+ * for intra-coded frames, in zigzag scanning order. It is relevant
+ * for both luma and chroma components, although it can be superseded
+ * by the chroma-specific matrix for non-4:2:0 YUV formats.
+ * @non_intra_quantiser_matrix: The quantisation matrix coefficients
+ * for non-intra-coded frames, in zigzag scanning order. It is relevant
+ * for both luma and chroma components, although it can be superseded
+ * by the chroma-specific matrix for non-4:2:0 YUV formats.
+ * @chroma_intra_quantiser_matrix: The quantisation matrix coefficients
+ * for the chrominance component of intra-coded frames, in zigzag scanning
+ * order. Only relevant for 4:2:2 and 4:4:4 YUV formats.
+ * @chroma_non_intra_quantiser_matrix: The quantisation matrix coefficients
+ * for the chrominance component of non-intra-coded frames, in zigzag scanning
+ * order. Only relevant for 4:2:2 and 4:4:4 YUV formats.
+ */ +struct v4l2_ctrl_mpeg2_quantisation { + __u8 intra_quantiser_matrix[64]; + __u8 non_intra_quantiser_matrix[64]; + __u8 chroma_intra_quantiser_matrix[64]; + __u8 chroma_non_intra_quantiser_matrix[64]; +}; + +#define V4L2_CID_COLORIMETRY_CLASS_BASE (V4L2_CTRL_CLASS_COLORIMETRY | 0x900) +#define V4L2_CID_COLORIMETRY_CLASS (V4L2_CTRL_CLASS_COLORIMETRY | 1) + +#define V4L2_CID_COLORIMETRY_HDR10_CLL_INFO (V4L2_CID_COLORIMETRY_CLASS_BASE + 0) + +struct v4l2_ctrl_hdr10_cll_info { + __u16 max_content_light_level; + __u16 max_pic_average_light_level; +}; + +#define V4L2_CID_COLORIMETRY_HDR10_MASTERING_DISPLAY (V4L2_CID_COLORIMETRY_CLASS_BASE + 1) + +#define V4L2_HDR10_MASTERING_PRIMARIES_X_LOW 5 +#define V4L2_HDR10_MASTERING_PRIMARIES_X_HIGH 37000 +#define V4L2_HDR10_MASTERING_PRIMARIES_Y_LOW 5 +#define V4L2_HDR10_MASTERING_PRIMARIES_Y_HIGH 42000 +#define V4L2_HDR10_MASTERING_WHITE_POINT_X_LOW 5 +#define V4L2_HDR10_MASTERING_WHITE_POINT_X_HIGH 37000 +#define V4L2_HDR10_MASTERING_WHITE_POINT_Y_LOW 5 +#define V4L2_HDR10_MASTERING_WHITE_POINT_Y_HIGH 42000 +#define V4L2_HDR10_MASTERING_MAX_LUMA_LOW 50000 +#define V4L2_HDR10_MASTERING_MAX_LUMA_HIGH 100000000 +#define V4L2_HDR10_MASTERING_MIN_LUMA_LOW 1 +#define V4L2_HDR10_MASTERING_MIN_LUMA_HIGH 50000 + +struct v4l2_ctrl_hdr10_mastering_display { + __u16 display_primaries_x[3]; + __u16 display_primaries_y[3]; + __u16 white_point_x; + __u16 white_point_y; + __u32 max_display_mastering_luminance; + __u32 min_display_mastering_luminance; +}; + +/* Stateless VP9 controls */ + +#define V4L2_VP9_LOOP_FILTER_FLAG_DELTA_ENABLED 0x1 +#define V4L2_VP9_LOOP_FILTER_FLAG_DELTA_UPDATE 0x2 + +/** + * struct v4l2_vp9_loop_filter - VP9 loop filter parameters + * + * @ref_deltas: contains the adjustment needed for the filter level based on the + * chosen reference frame. If this syntax element is not present in the bitstream, + * users should pass its last value. 
+ * @mode_deltas: contains the adjustment needed for the filter level based on the
+ * chosen mode. If this syntax element is not present in the bitstream, users should
+ * pass its last value.
+ * @level: indicates the loop filter strength.
+ * @sharpness: indicates the sharpness level.
+ * @flags: combination of V4L2_VP9_LOOP_FILTER_FLAG_{} flags.
+ * @reserved: padding field. Should be zeroed by applications.
+ *
+ * This structure contains all loop filter related parameters. See section
+ * '7.2.8 Loop filter semantics' of the VP9 specification for more details.
+ */
+struct v4l2_vp9_loop_filter {
+ __s8 ref_deltas[4];
+ __s8 mode_deltas[2];
+ __u8 level;
+ __u8 sharpness;
+ __u8 flags;
+ __u8 reserved[7];
+};
+
+/**
+ * struct v4l2_vp9_quantization - VP9 quantization parameters
+ *
+ * @base_q_idx: indicates the base frame qindex.
+ * @delta_q_y_dc: indicates the Y DC quantizer relative to base_q_idx.
+ * @delta_q_uv_dc: indicates the UV DC quantizer relative to base_q_idx.
+ * @delta_q_uv_ac: indicates the UV AC quantizer relative to base_q_idx.
+ * @reserved: padding field. Should be zeroed by applications.
+ *
+ * Encodes the quantization parameters. See section '7.2.9 Quantization params
+ * syntax' of the VP9 specification for more details.
+ */
+struct v4l2_vp9_quantization {
+ __u8 base_q_idx;
+ __s8 delta_q_y_dc;
+ __s8 delta_q_uv_dc;
+ __s8 delta_q_uv_ac;
+ __u8 reserved[4];
+};
+
+#define V4L2_VP9_SEGMENTATION_FLAG_ENABLED 0x01
+#define V4L2_VP9_SEGMENTATION_FLAG_UPDATE_MAP 0x02
+#define V4L2_VP9_SEGMENTATION_FLAG_TEMPORAL_UPDATE 0x04
+#define V4L2_VP9_SEGMENTATION_FLAG_UPDATE_DATA 0x08
+#define V4L2_VP9_SEGMENTATION_FLAG_ABS_OR_DELTA_UPDATE 0x10
+
+#define V4L2_VP9_SEG_LVL_ALT_Q 0
+#define V4L2_VP9_SEG_LVL_ALT_L 1
+#define V4L2_VP9_SEG_LVL_REF_FRAME 2
+#define V4L2_VP9_SEG_LVL_SKIP 3
+#define V4L2_VP9_SEG_LVL_MAX 4
+
+#define V4L2_VP9_SEGMENT_FEATURE_ENABLED(id) (1 << (id))
+#define V4L2_VP9_SEGMENT_FEATURE_ENABLED_MASK 0xf
+
+/**
+ * struct v4l2_vp9_segmentation - VP9 segmentation parameters
+ *
+ * @feature_data: data attached to each feature. Data entry is only valid if
+ * the feature is enabled. The array shall be indexed with segment number as
+ * the first dimension (0..7) and one of V4L2_VP9_SEG_{} as the second dimension.
+ * @feature_enabled: bitmask defining which features are enabled in each segment.
+ * The value for each segment is a combination of V4L2_VP9_SEGMENT_FEATURE_ENABLED(id)
+ * values where id is one of V4L2_VP9_SEG_LVL_{}.
+ * @tree_probs: specifies the probability values to be used when decoding a
+ * Segment-ID. See '5.15. Segmentation map' section of the VP9 specification
+ * for more details.
+ * @pred_probs: specifies the probability values to be used when decoding a
+ * Predicted-Segment-ID. See '6.4.14. Get segment id syntax' section of the
+ * VP9 specification for more details.
+ * @flags: combination of V4L2_VP9_SEGMENTATION_FLAG_{} flags.
+ * @reserved: padding field. Should be zeroed by applications.
+ *
+ * Encodes the segmentation parameters. See section '7.2.10 Segmentation params syntax' of
+ * the VP9 specification for more details.
+ */ +struct v4l2_vp9_segmentation { + __s16 feature_data[8][4]; + __u8 feature_enabled[8]; + __u8 tree_probs[7]; + __u8 pred_probs[3]; + __u8 flags; + __u8 reserved[5]; +}; + +#define V4L2_VP9_FRAME_FLAG_KEY_FRAME 0x001 +#define V4L2_VP9_FRAME_FLAG_SHOW_FRAME 0x002 +#define V4L2_VP9_FRAME_FLAG_ERROR_RESILIENT 0x004 +#define V4L2_VP9_FRAME_FLAG_INTRA_ONLY 0x008 +#define V4L2_VP9_FRAME_FLAG_ALLOW_HIGH_PREC_MV 0x010 +#define V4L2_VP9_FRAME_FLAG_REFRESH_FRAME_CTX 0x020 +#define V4L2_VP9_FRAME_FLAG_PARALLEL_DEC_MODE 0x040 +#define V4L2_VP9_FRAME_FLAG_X_SUBSAMPLING 0x080 +#define V4L2_VP9_FRAME_FLAG_Y_SUBSAMPLING 0x100 +#define V4L2_VP9_FRAME_FLAG_COLOR_RANGE_FULL_SWING 0x200 + +#define V4L2_VP9_SIGN_BIAS_LAST 0x1 +#define V4L2_VP9_SIGN_BIAS_GOLDEN 0x2 +#define V4L2_VP9_SIGN_BIAS_ALT 0x4 + +#define V4L2_VP9_RESET_FRAME_CTX_NONE 0 +#define V4L2_VP9_RESET_FRAME_CTX_SPEC 1 +#define V4L2_VP9_RESET_FRAME_CTX_ALL 2 + +#define V4L2_VP9_INTERP_FILTER_EIGHTTAP 0 +#define V4L2_VP9_INTERP_FILTER_EIGHTTAP_SMOOTH 1 +#define V4L2_VP9_INTERP_FILTER_EIGHTTAP_SHARP 2 +#define V4L2_VP9_INTERP_FILTER_BILINEAR 3 +#define V4L2_VP9_INTERP_FILTER_SWITCHABLE 4 + +#define V4L2_VP9_REFERENCE_MODE_SINGLE_REFERENCE 0 +#define V4L2_VP9_REFERENCE_MODE_COMPOUND_REFERENCE 1 +#define V4L2_VP9_REFERENCE_MODE_SELECT 2 + +#define V4L2_VP9_PROFILE_MAX 3 + +#define V4L2_CID_STATELESS_VP9_FRAME (V4L2_CID_CODEC_STATELESS_BASE + 300) +/** + * struct v4l2_ctrl_vp9_frame - VP9 frame decoding control + * + * @lf: loop filter parameters. See &v4l2_vp9_loop_filter for more details. + * @quant: quantization parameters. See &v4l2_vp9_quantization for more details. + * @seg: segmentation parameters. See &v4l2_vp9_segmentation for more details. + * @flags: combination of V4L2_VP9_FRAME_FLAG_{} flags. + * @compressed_header_size: compressed header size in bytes. + * @uncompressed_header_size: uncompressed header size in bytes. + * @frame_width_minus_1: add 1 to it and you'll get the frame width expressed in pixels. 
+ * @frame_height_minus_1: add 1 to it and you'll get the frame height expressed in pixels. + * @render_width_minus_1: add 1 to it and you'll get the expected render width expressed in + * pixels. This is not used during the decoding process but might be used by HW scalers + * to prepare a frame that's ready for scanout. + * @render_height_minus_1: add 1 to it and you'll get the expected render height expressed in + * pixels. This is not used during the decoding process but might be used by HW scalers + * to prepare a frame that's ready for scanout. + * @last_frame_ts: "last" reference buffer timestamp. + * The timestamp refers to the timestamp field in struct v4l2_buffer. + * Use v4l2_timeval_to_ns() to convert the struct timeval to a __u64. + * @golden_frame_ts: "golden" reference buffer timestamp. + * The timestamp refers to the timestamp field in struct v4l2_buffer. + * Use v4l2_timeval_to_ns() to convert the struct timeval to a __u64. + * @alt_frame_ts: "alt" reference buffer timestamp. + * The timestamp refers to the timestamp field in struct v4l2_buffer. + * Use v4l2_timeval_to_ns() to convert the struct timeval to a __u64. + * @ref_frame_sign_bias: a bitfield specifying whether the sign bias is set for a given + * reference frame. Either of V4L2_VP9_SIGN_BIAS_{}. + * @reset_frame_context: specifies whether the frame context should be reset to default values. + * Either of V4L2_VP9_RESET_FRAME_CTX_{}. + * @frame_context_idx: frame context that should be used/updated. + * @profile: VP9 profile. Can be 0, 1, 2 or 3. + * @bit_depth: bits per components. Can be 8, 10 or 12. Note that not all profiles support + * 10 and/or 12 bits depths. + * @interpolation_filter: specifies the filter selection used for performing inter prediction. + * Set to one of V4L2_VP9_INTERP_FILTER_{}. + * @tile_cols_log2: specifies the base 2 logarithm of the width of each tile (where the width + * is measured in units of 8x8 blocks). Shall be less than or equal to 6. 
+ * @tile_rows_log2: specifies the base 2 logarithm of the height of each tile (where the height + * is measured in units of 8x8 blocks). + * @reference_mode: specifies the type of inter prediction to be used. + * Set to one of V4L2_VP9_REFERENCE_MODE_{}. + * @reserved: padding field. Should be zeroed by applications. + */ +struct v4l2_ctrl_vp9_frame { + struct v4l2_vp9_loop_filter lf; + struct v4l2_vp9_quantization quant; + struct v4l2_vp9_segmentation seg; + __u32 flags; + __u16 compressed_header_size; + __u16 uncompressed_header_size; + __u16 frame_width_minus_1; + __u16 frame_height_minus_1; + __u16 render_width_minus_1; + __u16 render_height_minus_1; + __u64 last_frame_ts; + __u64 golden_frame_ts; + __u64 alt_frame_ts; + __u8 ref_frame_sign_bias; + __u8 reset_frame_context; + __u8 frame_context_idx; + __u8 profile; + __u8 bit_depth; + __u8 interpolation_filter; + __u8 tile_cols_log2; + __u8 tile_rows_log2; + __u8 reference_mode; + __u8 reserved[7]; +}; + +#define V4L2_VP9_NUM_FRAME_CTX 4 + +/** + * struct v4l2_vp9_mv_probs - VP9 Motion vector probability updates + * @joint: motion vector joint probability updates. + * @sign: motion vector sign probability updates. + * @classes: motion vector class probability updates. + * @class0_bit: motion vector class0 bit probability updates. + * @bits: motion vector bits probability updates. + * @class0_fr: motion vector class0 fractional bit probability updates. + * @fr: motion vector fractional bit probability updates. + * @class0_hp: motion vector class0 high precision fractional bit probability updates. + * @hp: motion vector high precision fractional bit probability updates. + * + * This structure contains new values of motion vector probabilities. + * A value of zero in an array element means there is no update of the relevant probability. + * See `struct v4l2_vp9_prob_updates` for details. 
+ */ +struct v4l2_vp9_mv_probs { + __u8 joint[3]; + __u8 sign[2]; + __u8 classes[2][10]; + __u8 class0_bit[2]; + __u8 bits[2][10]; + __u8 class0_fr[2][2][3]; + __u8 fr[2][3]; + __u8 class0_hp[2]; + __u8 hp[2]; +}; + +#define V4L2_CID_STATELESS_VP9_COMPRESSED_HDR (V4L2_CID_CODEC_STATELESS_BASE + 301) + +#define V4L2_VP9_TX_MODE_ONLY_4X4 0 +#define V4L2_VP9_TX_MODE_ALLOW_8X8 1 +#define V4L2_VP9_TX_MODE_ALLOW_16X16 2 +#define V4L2_VP9_TX_MODE_ALLOW_32X32 3 +#define V4L2_VP9_TX_MODE_SELECT 4 + +/** + * struct v4l2_ctrl_vp9_compressed_hdr - VP9 probability updates control + * @tx_mode: specifies the TX mode. Set to one of V4L2_VP9_TX_MODE_{}. + * @tx8: TX 8x8 probability updates. + * @tx16: TX 16x16 probability updates. + * @tx32: TX 32x32 probability updates. + * @coef: coefficient probability updates. + * @skip: skip probability updates. + * @inter_mode: inter mode probability updates. + * @interp_filter: interpolation filter probability updates. + * @is_inter: is inter-block probability updates. + * @comp_mode: compound prediction mode probability updates. + * @single_ref: single ref probability updates. + * @comp_ref: compound ref probability updates. + * @y_mode: Y prediction mode probability updates. + * @uv_mode: UV prediction mode probability updates. + * @partition: partition probability updates. + * @mv: motion vector probability updates. + * + * This structure holds the probabilities update as parsed in the compressed + * header (Spec 6.3). These values represent the value of probability update after + * being translated with inv_map_table[] (see 6.3.5). A value of zero in an array element + * means that there is no update of the relevant probability. + * + * This control is optional and needs to be used when dealing with the hardware which is + * not capable of parsing the compressed header itself. Only drivers which need it will + * implement it. 
+ */ +struct v4l2_ctrl_vp9_compressed_hdr { + __u8 tx_mode; + __u8 tx8[2][1]; + __u8 tx16[2][2]; + __u8 tx32[2][3]; + __u8 coef[4][2][2][6][6][3]; + __u8 skip[3]; + __u8 inter_mode[7][3]; + __u8 interp_filter[4][2]; + __u8 is_inter[4]; + __u8 comp_mode[5]; + __u8 single_ref[5][2]; + __u8 comp_ref[5]; + __u8 y_mode[4][9]; + __u8 uv_mode[10][9]; + __u8 partition[16][3]; + + struct v4l2_vp9_mv_probs mv; +}; + +/* MPEG-compression definitions kept for backwards compatibility */ +#define V4L2_CTRL_CLASS_MPEG V4L2_CTRL_CLASS_CODEC +#define V4L2_CID_MPEG_CLASS V4L2_CID_CODEC_CLASS +#define V4L2_CID_MPEG_BASE V4L2_CID_CODEC_BASE +#define V4L2_CID_MPEG_CX2341X_BASE V4L2_CID_CODEC_CX2341X_BASE +#define V4L2_CID_MPEG_MFC51_BASE V4L2_CID_CODEC_MFC51_BASE + #endif
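The new `struct v4l2_ctrl_vp9_frame` above stores frame geometry as `*_minus_1` fields so a 16-bit member can cover the full 1..65536 pixel range. A minimal sketch of that convention (the struct and helpers here are illustrative stand-ins, not part of the header):

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative stand-in for the geometry fields of
 * struct v4l2_ctrl_vp9_frame: each dimension is stored minus one. */
struct vp9_geometry {
	uint16_t frame_width_minus_1;
	uint16_t frame_height_minus_1;
};

static void set_geometry(struct vp9_geometry *g, uint32_t width,
			 uint32_t height)
{
	/* Valid dimensions are 1..65536 inclusive. */
	assert(width >= 1 && width <= 65536);
	assert(height >= 1 && height <= 65536);
	g->frame_width_minus_1 = (uint16_t)(width - 1);
	g->frame_height_minus_1 = (uint16_t)(height - 1);
}

static uint32_t get_width(const struct vp9_geometry *g)
{
	/* "add 1 to it and you'll get the frame width" */
	return (uint32_t)g->frame_width_minus_1 + 1;
}
```

The same pattern applies to `render_width_minus_1`/`render_height_minus_1`, which describe the intended scanout size rather than the coded size.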
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/linux/videodev2.h -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/linux/videodev2.h
Changed
@@ -169,6 +169,8 @@ || (type) == V4L2_BUF_TYPE_SDR_OUTPUT \ || (type) == V4L2_BUF_TYPE_META_OUTPUT) +#define V4L2_TYPE_IS_CAPTURE(type) (!V4L2_TYPE_IS_OUTPUT(type)) + enum v4l2_tuner_type { V4L2_TUNER_RADIO = 1, V4L2_TUNER_ANALOG_TV = 2, @@ -217,9 +219,7 @@ V4L2_COLORSPACE_470_SYSTEM_M = 5, /* - * EBU Tech 3213 PAL/SECAM colorspace. This only makes sense when - * dealing with really old PAL/SECAM recordings. Superseded by - * SMPTE 170M. + * EBU Tech 3213 PAL/SECAM colorspace. */ V4L2_COLORSPACE_470_SYSTEM_BG = 6, @@ -324,14 +324,12 @@ /* Rec. 709/EN 61966-2-4 Extended Gamut -- HDTV */ V4L2_YCBCR_ENC_XV709 = 4, -#ifndef __KERNEL__ /* * sYCC (Y'CbCr encoding of sRGB), identical to ENC_601. It was added * originally due to a misunderstanding of the sYCC standard. It should * not be used, instead use V4L2_YCBCR_ENC_601. */ V4L2_YCBCR_ENC_SYCC = 5, -#endif /* BT.2020 Non-constant Luminance Y'CbCr */ V4L2_YCBCR_ENC_BT2020 = 6, @@ -369,9 +367,9 @@ enum v4l2_quantization { /* - * The default for R'G'B' quantization is always full range, except - * for the BT2020 colorspace. For Y'CbCr the quantization is always - * limited range, except for COLORSPACE_JPEG: this is full range. + * The default for R'G'B' quantization is always full range. + * For Y'CbCr the quantization is always limited range, except + * for COLORSPACE_JPEG: this is full range. */ V4L2_QUANTIZATION_DEFAULT = 0, V4L2_QUANTIZATION_FULL_RANGE = 1, @@ -380,14 +378,13 @@ /* * Determine how QUANTIZATION_DEFAULT should map to a proper quantization. - * This depends on whether the image is RGB or not, the colorspace and the - * Y'CbCr encoding. + * This depends on whether the image is RGB or not, the colorspace. + * The Y'CbCr encoding is not used anymore, but is still there for backwards + * compatibility. */ #define V4L2_MAP_QUANTIZATION_DEFAULT(is_rgb_or_hsv, colsp, ycbcr_enc) \ - (((is_rgb_or_hsv) && (colsp) == V4L2_COLORSPACE_BT2020) ? 
\ - V4L2_QUANTIZATION_LIM_RANGE : \ - (((is_rgb_or_hsv) || (colsp) == V4L2_COLORSPACE_JPEG) ? \ - V4L2_QUANTIZATION_FULL_RANGE : V4L2_QUANTIZATION_LIM_RANGE)) + (((is_rgb_or_hsv) || (colsp) == V4L2_COLORSPACE_JPEG) ? \ + V4L2_QUANTIZATION_FULL_RANGE : V4L2_QUANTIZATION_LIM_RANGE) /* * Deprecated names for opRGB colorspace (IEC 61966-2-5) @@ -395,10 +392,8 @@ * WARNING: Please don't use these deprecated defines in your code, as * there is a chance we have to remove them in the future. */ -#ifndef __KERNEL__ #define V4L2_COLORSPACE_ADOBERGB V4L2_COLORSPACE_OPRGB #define V4L2_XFER_FUNC_ADOBERGB V4L2_XFER_FUNC_OPRGB -#endif enum v4l2_priority { V4L2_PRIORITY_UNSET = 0, /* not initialized */ @@ -485,6 +480,8 @@ #define V4L2_CAP_TOUCH 0x10000000 /* Is a touch device */ +#define V4L2_CAP_IO_MC 0x20000000 /* Is input/output controlled by the media controller */ + #define V4L2_CAP_DEVICE_CAPS 0x80000000 /* sets device capabilities field */ /* @@ -512,7 +509,7 @@ /* Pixel format FOURCC depth Description */ -/* RGB formats */ +/* RGB formats (1 or 2 bytes per pixel) */ #define V4L2_PIX_FMT_RGB332 v4l2_fourcc('R', 'G', 'B', '1') /* 8 RGB-3-3-2 */ #define V4L2_PIX_FMT_RGB444 v4l2_fourcc('R', '4', '4', '4') /* 16 xxxxrrrr ggggbbbb */ #define V4L2_PIX_FMT_ARGB444 v4l2_fourcc('A', 'R', '1', '2') /* 16 aaaarrrr ggggbbbb */ @@ -521,12 +518,6 @@ #define V4L2_PIX_FMT_RGBX444 v4l2_fourcc('R', 'X', '1', '2') /* 16 rrrrgggg bbbbxxxx */ #define V4L2_PIX_FMT_ABGR444 v4l2_fourcc('A', 'B', '1', '2') /* 16 aaaabbbb ggggrrrr */ #define V4L2_PIX_FMT_XBGR444 v4l2_fourcc('X', 'B', '1', '2') /* 16 xxxxbbbb ggggrrrr */ - -/* - * Originally this had 'BA12' as fourcc, but this clashed with the older - * V4L2_PIX_FMT_SGRBG12 which inexplicably used that same fourcc. - * So use 'GA12' instead for V4L2_PIX_FMT_BGRA444. 
- */ #define V4L2_PIX_FMT_BGRA444 v4l2_fourcc('G', 'A', '1', '2') /* 16 bbbbgggg rrrraaaa */ #define V4L2_PIX_FMT_BGRX444 v4l2_fourcc('B', 'X', '1', '2') /* 16 bbbbgggg rrrrxxxx */ #define V4L2_PIX_FMT_RGB555 v4l2_fourcc('R', 'G', 'B', 'O') /* 16 RGB-5-5-5 */ @@ -543,6 +534,8 @@ #define V4L2_PIX_FMT_ARGB555X v4l2_fourcc_be('A', 'R', '1', '5') /* 16 ARGB-5-5-5 BE */ #define V4L2_PIX_FMT_XRGB555X v4l2_fourcc_be('X', 'R', '1', '5') /* 16 XRGB-5-5-5 BE */ #define V4L2_PIX_FMT_RGB565X v4l2_fourcc('R', 'G', 'B', 'R') /* 16 RGB-5-6-5 BE */ + +/* RGB formats (3 or 4 bytes per pixel) */ #define V4L2_PIX_FMT_BGR666 v4l2_fourcc('B', 'G', 'R', 'H') /* 18 BGR-6-6-6 */ #define V4L2_PIX_FMT_BGR24 v4l2_fourcc('B', 'G', 'R', '3') /* 24 BGR-8-8-8 */ #define V4L2_PIX_FMT_RGB24 v4l2_fourcc('R', 'G', 'B', '3') /* 24 RGB-8-8-8 */ @@ -587,13 +580,12 @@ #define V4L2_PIX_FMT_YUV444 v4l2_fourcc('Y', '4', '4', '4') /* 16 xxxxyyyy uuuuvvvv */ #define V4L2_PIX_FMT_YUV555 v4l2_fourcc('Y', 'U', 'V', 'O') /* 16 YUV-5-5-5 */ #define V4L2_PIX_FMT_YUV565 v4l2_fourcc('Y', 'U', 'V', 'P') /* 16 YUV-5-6-5 */ +#define V4L2_PIX_FMT_YUV24 v4l2_fourcc('Y', 'U', 'V', '3') /* 24 YUV-8-8-8 */ #define V4L2_PIX_FMT_YUV32 v4l2_fourcc('Y', 'U', 'V', '4') /* 32 YUV-8-8-8-8 */ #define V4L2_PIX_FMT_AYUV32 v4l2_fourcc('A', 'Y', 'U', 'V') /* 32 AYUV-8-8-8-8 */ #define V4L2_PIX_FMT_XYUV32 v4l2_fourcc('X', 'Y', 'U', 'V') /* 32 XYUV-8-8-8-8 */ #define V4L2_PIX_FMT_VUYA32 v4l2_fourcc('V', 'U', 'Y', 'A') /* 32 VUYA-8-8-8-8 */ #define V4L2_PIX_FMT_VUYX32 v4l2_fourcc('V', 'U', 'Y', 'X') /* 32 VUYX-8-8-8-8 */ -#define V4L2_PIX_FMT_HI240 v4l2_fourcc('H', 'I', '2', '4') /* 8 8-bit color */ -#define V4L2_PIX_FMT_HM12 v4l2_fourcc('H', 'M', '1', '2') /* 8 YUV 4:2:0 16x16 macroblocks */ #define V4L2_PIX_FMT_M420 v4l2_fourcc('M', '4', '2', '0') /* 12 YUV 4:2:0 2 lines y, 1 line uv interleaved */ /* two planes -- one Y, one Cr + Cb interleaved */ @@ -609,8 +601,6 @@ #define V4L2_PIX_FMT_NV21M v4l2_fourcc('N', 'M', '2', '1') /* 21 
Y/CrCb 4:2:0 */ #define V4L2_PIX_FMT_NV16M v4l2_fourcc('N', 'M', '1', '6') /* 16 Y/CbCr 4:2:2 */ #define V4L2_PIX_FMT_NV61M v4l2_fourcc('N', 'M', '6', '1') /* 16 Y/CrCb 4:2:2 */ -#define V4L2_PIX_FMT_NV12MT v4l2_fourcc('T', 'M', '1', '2') /* 12 Y/CbCr 4:2:0 64x32 macroblocks */ -#define V4L2_PIX_FMT_NV12MT_16X16 v4l2_fourcc('V', 'M', '1', '2') /* 12 Y/CbCr 4:2:0 16x16 macroblocks */ /* three planes - Y Cb, Cr */ #define V4L2_PIX_FMT_YUV410 v4l2_fourcc('Y', 'U', 'V', '9') /* 9 YUV 4:1:0 */ @@ -628,6 +618,15 @@ #define V4L2_PIX_FMT_YUV444M v4l2_fourcc('Y', 'M', '2', '4') /* 24 YUV444 planar */ #define V4L2_PIX_FMT_YVU444M v4l2_fourcc('Y', 'M', '4', '2') /* 24 YVU444 planar */ +/* Tiled YUV formats */ +#define V4L2_PIX_FMT_NV12_4L4 v4l2_fourcc('V', 'T', '1', '2') /* 12 Y/CbCr 4:2:0 4x4 tiles */ +#define V4L2_PIX_FMT_NV12_16L16 v4l2_fourcc('H', 'M', '1', '2') /* 12 Y/CbCr 4:2:0 16x16 tiles */ +#define V4L2_PIX_FMT_NV12_32L32 v4l2_fourcc('S', 'T', '1', '2') /* 12 Y/CbCr 4:2:0 32x32 tiles */ + +/* Tiled YUV formats, non contiguous planes */ +#define V4L2_PIX_FMT_NV12MT v4l2_fourcc('T', 'M', '1', '2') /* 12 Y/CbCr 4:2:0 64x32 tiles */ +#define V4L2_PIX_FMT_NV12MT_16X16 v4l2_fourcc('V', 'M', '1', '2') /* 12 Y/CbCr 4:2:0 16x16 tiles */ + /* Bayer formats - see http://www.siliconimaging.com/RGB%20Bayer.htm */ #define V4L2_PIX_FMT_SBGGR8 v4l2_fourcc('B', 'A', '8', '1') /* 8 BGBG.. GRGR.. */ #define V4L2_PIX_FMT_SGBRG8 v4l2_fourcc('G', 'B', 'R', 'G') /* 8 GBGB.. RGRG.. 
*/ @@ -696,10 +695,13 @@ #define V4L2_PIX_FMT_VC1_ANNEX_G v4l2_fourcc('V', 'C', '1', 'G') /* SMPTE 421M Annex G compliant stream */ #define V4L2_PIX_FMT_VC1_ANNEX_L v4l2_fourcc('V', 'C', '1', 'L') /* SMPTE 421M Annex L compliant stream */ #define V4L2_PIX_FMT_VP8 v4l2_fourcc('V', 'P', '8', '0') /* VP8 */ +#define V4L2_PIX_FMT_VP8_FRAME v4l2_fourcc('V', 'P', '8', 'F') /* VP8 parsed frame */ #define V4L2_PIX_FMT_VP9 v4l2_fourcc('V', 'P', '9', '0') /* VP9 */ +#define V4L2_PIX_FMT_VP9_FRAME v4l2_fourcc('V', 'P', '9', 'F') /* VP9 parsed frame */ #define V4L2_PIX_FMT_HEVC v4l2_fourcc('H', 'E', 'V', 'C') /* HEVC aka H.265 */ #define V4L2_PIX_FMT_FWHT v4l2_fourcc('F', 'W', 'H', 'T') /* Fast Walsh Hadamard Transform (vicodec) */ #define V4L2_PIX_FMT_FWHT_STATELESS v4l2_fourcc('S', 'F', 'W', 'H') /* Stateless FWHT (vicodec) */ +#define V4L2_PIX_FMT_H264_SLICE v4l2_fourcc('S', '2', '6', '4') /* H264 parsed slices */ /* Vendor-specific formats */ #define V4L2_PIX_FMT_CPIA1 v4l2_fourcc('C', 'P', 'I', 'A') /* cpia1 YUV */ @@ -732,9 +734,10 @@ #define V4L2_PIX_FMT_Y12I v4l2_fourcc('Y', '1', '2', 'I') /* Greyscale 12-bit L/R interleaved */ #define V4L2_PIX_FMT_Z16 v4l2_fourcc('Z', '1', '6', ' ') /* Depth data 16-bit */ #define V4L2_PIX_FMT_MT21C v4l2_fourcc('M', 'T', '2', '1') /* Mediatek compressed block mode */ +#define V4L2_PIX_FMT_MM21 v4l2_fourcc('M', 'M', '2', '1') /* Mediatek 8-bit block mode, two non-contiguous planes */ #define V4L2_PIX_FMT_INZI v4l2_fourcc('I', 'N', 'Z', 'I') /* Intel Planar Greyscale 10-bit and Depth 16-bit */ -#define V4L2_PIX_FMT_SUNXI_TILED_NV12 v4l2_fourcc('S', 'T', '1', '2') /* Sunxi Tiled NV12 Format */ #define V4L2_PIX_FMT_CNF4 v4l2_fourcc('C', 'N', 'F', '4') /* Intel 4-bit packed depth confidence information */ +#define V4L2_PIX_FMT_HI240 v4l2_fourcc('H', 'I', '2', '4') /* BTTV 8-bit dithered RGB */ /* 10bit raw bayer packed, 32 bytes for every 25 pixels, last LSB 6 bits unused */ #define V4L2_PIX_FMT_IPU3_SBGGR10 v4l2_fourcc('i', 'p', '3', 'b') 
/* IPU3 packed 10-bit BGGR bayer */ @@ -765,11 +768,16 @@ #define V4L2_META_FMT_D4XX v4l2_fourcc('D', '4', 'X', 'X') /* D4XX Payload Header metadata */ #define V4L2_META_FMT_VIVID v4l2_fourcc('V', 'I', 'V', 'D') /* Vivid Metadata */ +/* Vendor specific - used for RK_ISP1 camera sub-system */ +#define V4L2_META_FMT_RK_ISP1_PARAMS v4l2_fourcc('R', 'K', '1', 'P') /* Rockchip ISP1 3A Parameters */ +#define V4L2_META_FMT_RK_ISP1_STAT_3A v4l2_fourcc('R', 'K', '1', 'S') /* Rockchip ISP1 3A Statistics */ + /* priv field value to indicates that subsequent fields are valid. */ #define V4L2_PIX_FMT_PRIV_MAGIC 0xfeedcafe /* Flags */ #define V4L2_PIX_FMT_FLAG_PREMUL_ALPHA 0x00000001 +#define V4L2_PIX_FMT_FLAG_SET_CSC 0x00000002 /* * F O R M A T E N U M E R A T I O N @@ -780,13 +788,20 @@ __u32 flags; __u8 description[32]; /* Description string */ __u32 pixelformat; /* Format fourcc */ - __u32 reserved[4]; + __u32 mbus_code; /* Media bus code */ + __u32 reserved[3]; }; #define V4L2_FMT_FLAG_COMPRESSED 0x0001 #define V4L2_FMT_FLAG_EMULATED 0x0002 #define V4L2_FMT_FLAG_CONTINUOUS_BYTESTREAM 0x0004 #define V4L2_FMT_FLAG_DYN_RESOLUTION 0x0008 +#define V4L2_FMT_FLAG_ENC_CAP_FRAME_INTERVAL 0x0010 +#define V4L2_FMT_FLAG_CSC_COLORSPACE 0x0020 +#define V4L2_FMT_FLAG_CSC_XFER_FUNC 0x0040 +#define V4L2_FMT_FLAG_CSC_YCBCR_ENC 0x0080 +#define V4L2_FMT_FLAG_CSC_HSV_ENC V4L2_FMT_FLAG_CSC_YCBCR_ENC +#define V4L2_FMT_FLAG_CSC_QUANTIZATION 0x0100 /* Frame Size and frame rate enumeration */ /* @@ -916,39 +931,26 @@ * M E M O R Y - M A P P I N G B U F F E R S */ -#ifdef __KERNEL__ -/* - * This corresponds to the user space version of timeval - * for 64-bit time_t. sparc64 is different from everyone - * else, using the microseconds in the wrong half of the - * second 64-bit word. 
- */ -struct __kernel_v4l2_timeval { - long long tv_sec; -#if defined(__sparc__) && defined(__arch64__) - int tv_usec; - int __pad; -#else - long long tv_usec; -#endif -}; -#endif struct v4l2_requestbuffers { __u32 count; __u32 type; /* enum v4l2_buf_type */ __u32 memory; /* enum v4l2_memory */ __u32 capabilities; - __u32 reserved[1]; + __u8 flags; + __u8 reserved[3]; }; +#define V4L2_MEMORY_FLAG_NON_COHERENT (1 << 0) + /* capabilities for struct v4l2_requestbuffers and v4l2_create_buffers */ -#define V4L2_BUF_CAP_SUPPORTS_MMAP (1 << 0) -#define V4L2_BUF_CAP_SUPPORTS_USERPTR (1 << 1) -#define V4L2_BUF_CAP_SUPPORTS_DMABUF (1 << 2) -#define V4L2_BUF_CAP_SUPPORTS_REQUESTS (1 << 3) -#define V4L2_BUF_CAP_SUPPORTS_ORPHANED_BUFS (1 << 4) +#define V4L2_BUF_CAP_SUPPORTS_MMAP (1 << 0) +#define V4L2_BUF_CAP_SUPPORTS_USERPTR (1 << 1) +#define V4L2_BUF_CAP_SUPPORTS_DMABUF (1 << 2) +#define V4L2_BUF_CAP_SUPPORTS_REQUESTS (1 << 3) +#define V4L2_BUF_CAP_SUPPORTS_ORPHANED_BUFS (1 << 4) #define V4L2_BUF_CAP_SUPPORTS_M2M_HOLD_CAPTURE_BUF (1 << 5) +#define V4L2_BUF_CAP_SUPPORTS_MMAP_CACHE_HINTS (1 << 6) /** * struct v4l2_plane - plane info for multi-planar buffers @@ -962,8 +964,10 @@ * pointing to this plane * @fd: when memory is V4L2_MEMORY_DMABUF, a userspace file * descriptor associated with this plane + * @m: union of @mem_offset, @userptr and @fd * @data_offset: offset in the plane to the start of data; usually 0, * unless there is a header in front of the data + * @reserved: drivers and applications must zero this array * * Multi-planar buffers consist of one or more planes, e.g. 
an YCbCr buffer * with two planes can have one plane for Y, and another for interleaved CbCr @@ -1005,10 +1009,14 @@ * a userspace file descriptor associated with this buffer * @planes: for multiplanar buffers; userspace pointer to the array of plane * info structs for this buffer + * @m: union of @offset, @userptr, @planes and @fd * @length: size in bytes of the buffer (NOT its payload) for single-plane * buffers (when type != *_MPLANE); number of elements in the * planes array for multi-plane buffers + * @reserved2: drivers and applications must zero this field * @request_fd: fd of the request that this buffer should use + * @reserved: for backwards compatibility with applications that do not know + * about @request_fd * * Contains data exchanged by application and driver using one of the Streaming * I/O methods. @@ -1019,11 +1027,7 @@ __u32 bytesused; __u32 flags; __u32 field; -#ifdef __KERNEL__ - struct __kernel_v4l2_timeval timestamp; -#else struct timeval timestamp; -#endif struct v4l2_timecode timecode; __u32 sequence; @@ -1043,19 +1047,17 @@ }; }; -#ifndef __KERNEL__ /** * v4l2_timeval_to_ns - Convert timeval to nanoseconds - * @ts: pointer to the timeval variable to be converted + * @tv: pointer to the timeval variable to be converted * * Returns the scalar nanosecond representation of the timeval * parameter. */ -static inline __u64 v4l2_timeval_to_ns(const struct timeval *tv) +static __inline__ __u64 v4l2_timeval_to_ns(const struct timeval *tv) { return (__u64)tv->tv_sec * 1000000000ULL + tv->tv_usec * 1000; } -#endif /* Flags for 'flags' field */ /* Buffer is mapped (flag) */ @@ -1107,6 +1109,7 @@ * @flags: flags for newly created file, currently only O_CLOEXEC is * supported, refer to manual of open syscall for more details * @fd: file descriptor associated with DMABUF (set by driver) + * @reserved: drivers and applications must zero this array * * Contains data used for exporting a video buffer as DMABUF file descriptor. 
* The buffer is identified by a 'cookie' returned by VIDIOC_QUERYBUF @@ -1164,16 +1167,16 @@ struct v4l2_clip { struct v4l2_rect c; - struct v4l2_clip __user *next; + struct v4l2_clip *next; }; struct v4l2_window { struct v4l2_rect w; __u32 field; /* enum v4l2_field */ __u32 chromakey; - struct v4l2_clip __user *clips; + struct v4l2_clip *clips; __u32 clipcount; - void __user *bitmap; + void *bitmap; __u8 global_alpha; }; @@ -1712,20 +1715,31 @@ union { __s32 value; __s64 value64; - char __user *string; - __u8 __user *p_u8; - __u16 __user *p_u16; - __u32 __user *p_u32; - struct v4l2_area __user *p_area; - void __user *ptr; + char *string; + __u8 *p_u8; + __u16 *p_u16; + __u32 *p_u32; + struct v4l2_area *p_area; + struct v4l2_ctrl_h264_sps *p_h264_sps; + struct v4l2_ctrl_h264_pps *p_h264_pps; + struct v4l2_ctrl_h264_scaling_matrix *p_h264_scaling_matrix; + struct v4l2_ctrl_h264_pred_weights *p_h264_pred_weights; + struct v4l2_ctrl_h264_slice_params *p_h264_slice_params; + struct v4l2_ctrl_h264_decode_params *p_h264_decode_params; + struct v4l2_ctrl_fwht_params *p_fwht_params; + struct v4l2_ctrl_vp8_frame *p_vp8_frame; + struct v4l2_ctrl_mpeg2_sequence *p_mpeg2_sequence; + struct v4l2_ctrl_mpeg2_picture *p_mpeg2_picture; + struct v4l2_ctrl_mpeg2_quantisation *p_mpeg2_quantisation; + struct v4l2_ctrl_vp9_compressed_hdr *p_vp9_compressed_hdr_probs; + struct v4l2_ctrl_vp9_frame *p_vp9_frame; + void *ptr; }; } __attribute__ ((packed)); struct v4l2_ext_controls { union { -#ifndef __KERNEL__ __u32 ctrl_class; -#endif __u32 which; }; __u32 count; @@ -1763,6 +1777,27 @@ V4L2_CTRL_TYPE_U16 = 0x0101, V4L2_CTRL_TYPE_U32 = 0x0102, V4L2_CTRL_TYPE_AREA = 0x0106, + + V4L2_CTRL_TYPE_HDR10_CLL_INFO = 0x0110, + V4L2_CTRL_TYPE_HDR10_MASTERING_DISPLAY = 0x0111, + + V4L2_CTRL_TYPE_H264_SPS = 0x0200, + V4L2_CTRL_TYPE_H264_PPS = 0x0201, + V4L2_CTRL_TYPE_H264_SCALING_MATRIX = 0x0202, + V4L2_CTRL_TYPE_H264_SLICE_PARAMS = 0x0203, + V4L2_CTRL_TYPE_H264_DECODE_PARAMS = 0x0204, + 
V4L2_CTRL_TYPE_H264_PRED_WEIGHTS = 0x0205, + + V4L2_CTRL_TYPE_FWHT_PARAMS = 0x0220, + + V4L2_CTRL_TYPE_VP8_FRAME = 0x0240, + + V4L2_CTRL_TYPE_MPEG2_QUANTISATION = 0x0250, + V4L2_CTRL_TYPE_MPEG2_SEQUENCE = 0x0251, + V4L2_CTRL_TYPE_MPEG2_PICTURE = 0x0252, + + V4L2_CTRL_TYPE_VP9_COMPRESSED_HDR = 0x0260, + V4L2_CTRL_TYPE_VP9_FRAME = 0x0261, }; /* Used in the VIDIOC_QUERYCTRL ioctl for querying controls */ @@ -2200,6 +2235,7 @@ * this plane will be used * @bytesperline: distance in bytes between the leftmost pixels in two * adjacent lines + * @reserved: drivers and applications must zero this array */ struct v4l2_plane_pix_format { __u32 sizeimage; @@ -2218,8 +2254,10 @@ * @num_planes: number of planes for this format * @flags: format flags (V4L2_PIX_FMT_FLAG_*) * @ycbcr_enc: enum v4l2_ycbcr_encoding, Y'CbCr encoding + * @hsv_enc: enum v4l2_hsv_encoding, HSV encoding * @quantization: enum v4l2_quantization, colorspace quantization * @xfer_func: enum v4l2_xfer_func, colorspace transfer function + * @reserved: drivers and applications must zero this array */ struct v4l2_pix_format_mplane { __u32 width; @@ -2244,6 +2282,7 @@ * struct v4l2_sdr_format - SDR format definition * @pixelformat: little endian four character code (fourcc) * @buffersize: maximum size in bytes required for data + * @reserved: drivers and applications must zero this array */ struct v4l2_sdr_format { __u32 pixelformat; @@ -2270,6 +2309,8 @@ * @vbi: raw VBI capture or output parameters * @sliced: sliced VBI capture or output parameters * @raw_data: placeholder for future extensions and custom formats + * @fmt: union of @pix, @pix_mp, @win, @vbi, @sliced, @sdr, @meta + * and @raw_data */ struct v4l2_format { __u32 type; @@ -2371,11 +2412,7 @@ } u; __u32 pending; __u32 sequence; -#ifdef __KERNEL__ - struct __kernel_timespec timestamp; -#else struct timespec timestamp; -#endif __u32 id; __u32 reserved[8]; }; @@ -2442,6 +2479,9 @@ * @memory: enum v4l2_memory; buffer memory type * @format: frame format, for 
which buffers are requested * @capabilities: capabilities of this buffer type. + * @flags: additional buffer management attributes (ignored unless the + * queue has V4L2_BUF_CAP_SUPPORTS_MMAP_CACHE_HINTS capability + * and configured for MMAP streaming I/O). * @reserved: future extensions */ struct v4l2_create_buffers { @@ -2450,7 +2490,8 @@ __u32 memory; struct v4l2_format format; __u32 capabilities; - __u32 reserved[7]; + __u32 flags; + __u32 reserved[6]; }; /* @@ -2558,4 +2599,8 @@ #define BASE_VIDIOC_PRIVATE 192 /* 192-255 are private */ +/* Deprecated definitions kept for backwards compatibility */ +#define V4L2_PIX_FMT_HM12 V4L2_PIX_FMT_NV12_16L16 +#define V4L2_PIX_FMT_SUNXI_TILED_NV12 V4L2_PIX_FMT_NV12_32L32 + #endif /* __LINUX_VIDEODEV2_H */
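One behavioral change in the videodev2.h hunk above is the simplified `V4L2_MAP_QUANTIZATION_DEFAULT`: the BT.2020 RGB special case is dropped and the Y'CbCr encoding argument is ignored, so RGB/HSV and JPEG now always map to full range. A sketch of the new mapping, using local stand-in constants (values assumed to match this header's enums):

```c
#include <assert.h>

/* Stand-in constants; values assumed to match the header's enums. */
enum { QUANT_FULL_RANGE = 1, QUANT_LIM_RANGE = 2 };
enum { COLORSPACE_REC709 = 3, COLORSPACE_JPEG = 7, COLORSPACE_BT2020 = 10 };

/* The 1.20-era form of V4L2_MAP_QUANTIZATION_DEFAULT: only RGB/HSV
 * content and the JPEG colorspace default to full range; everything
 * else (including BT.2020 RGB, unlike before) could previously differ. */
#define MAP_QUANTIZATION_DEFAULT(is_rgb_or_hsv, colsp) \
	(((is_rgb_or_hsv) || (colsp) == COLORSPACE_JPEG) ? \
	 QUANT_FULL_RANGE : QUANT_LIM_RANGE)
```

Under the old macro, RGB with `COLORSPACE_BT2020` resolved to limited range; with the new form it resolves to full range, which is the compatibility-relevant difference for applications that relied on the default.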
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/meson.build
Changed
@@ -3,10 +3,13 @@ 'gstv4l2codecallocator.c', 'gstv4l2codecdevice.c', 'gstv4l2codech264dec.c', + 'gstv4l2codecmpeg2dec.c', 'gstv4l2codecpool.c', 'gstv4l2codecvp8dec.c', + 'gstv4l2codecvp9dec.c', 'gstv4l2decoder.c', 'gstv4l2format.c', + 'gstv4l2codecalphadecodebin.c', ] libgudev_dep = dependency('gudev-1.0', required: get_option('v4l2codecs')) @@ -37,7 +40,8 @@ c_args : gst_plugins_bad_args, cpp_args: gst_plugins_bad_args, include_directories : [configinc], - dependencies : [gstbase_dep, gstcodecs_dep, gstallocators_dep, libgudev_dep], + dependencies : [gstbase_dep, gstcodecs_dep, gstallocators_dep, libgudev_dep, + gstpbutils_dep,], install : true, install_dir : plugins_install_dir, )
View file
gst-plugins-bad-1.18.6.tar.xz/sys/v4l2codecs/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/v4l2codecs/plugin.c
Changed
@@ -24,10 +24,11 @@ #include "gstv4l2codecdevice.h" #include "gstv4l2codech264dec.h" +#include "gstv4l2codecmpeg2dec.h" #include "gstv4l2codecvp8dec.h" +#include "gstv4l2codecvp9dec.h" #include "gstv4l2decoder.h" -#include "linux/h264-ctrls.h" -#include "linux/vp8-ctrls.h" +#include "linux/v4l2-controls.h" #include "linux/media.h" #define GST_CAT_DEFAULT gstv4l2codecs_debug @@ -50,13 +51,28 @@ case V4L2_PIX_FMT_H264_SLICE: GST_INFO_OBJECT (decoder, "Registering %s as H264 Decoder", device->name); - gst_v4l2_codec_h264_dec_register (plugin, device, GST_RANK_PRIMARY + 1); + gst_v4l2_codec_h264_dec_register (plugin, decoder, device, + GST_RANK_PRIMARY + 1); break; case V4L2_PIX_FMT_VP8_FRAME: GST_INFO_OBJECT (decoder, "Registering %s as VP8 Decoder", device->name); - gst_v4l2_codec_vp8_dec_register (plugin, device, GST_RANK_PRIMARY + 1); + gst_v4l2_codec_vp8_dec_register (plugin, decoder, device, + GST_RANK_PRIMARY + 1); break; + case V4L2_PIX_FMT_MPEG2_SLICE: + GST_INFO_OBJECT (decoder, "Registering %s as Mpeg2 Decoder", + device->name); + gst_v4l2_codec_mpeg2_dec_register (plugin, decoder, device, + GST_RANK_PRIMARY + 1); + break; + case V4L2_PIX_FMT_VP9_FRAME: + GST_INFO_OBJECT (decoder, "Registering %s as VP9 Decoder", + device->name); + gst_v4l2_codec_vp9_dec_register (plugin, decoder, device, + GST_RANK_PRIMARY + 1); + break; + default: GST_FIXME_OBJECT (decoder, "%" GST_FOURCC_FORMAT " is not supported.", GST_FOURCC_ARGS (fmt));
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvaallocator.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvaallocator.c
Changed
@@ -26,38 +26,14 @@ #include <sys/types.h> #include <unistd.h> -#include <va/va_drmcommon.h> #include "gstvacaps.h" +#include "gstvasurfacecopy.h" #include "gstvavideoformat.h" +#include "vasurfaceimage.h" #define GST_CAT_DEFAULT gst_va_memory_debug -GST_DEBUG_CATEGORY_STATIC (gst_va_memory_debug); - -struct _GstVaDmabufAllocator -{ - GstDmaBufAllocator parent; - - /* queue for disposable surfaces */ - GstAtomicQueue *queue; - GstVaDisplay *display; - - GstMemoryMapFunction parent_map; -}; - -static void _init_debug_category (void); - -#define gst_va_dmabuf_allocator_parent_class dmabuf_parent_class -G_DEFINE_TYPE_WITH_CODE (GstVaDmabufAllocator, gst_va_dmabuf_allocator, - GST_TYPE_DMABUF_ALLOCATOR, _init_debug_category ()); - -typedef struct _GstVaBufferSurface GstVaBufferSurface; -struct _GstVaBufferSurface -{ - GstVideoInfo info; - VASurfaceID surface; - gint ref_count; -}; +GST_DEBUG_CATEGORY (gst_va_memory_debug); static void _init_debug_category (void) @@ -72,326 +48,394 @@ #endif } -static gboolean -_destroy_surfaces (GstVaDisplay * display, VASurfaceID * surfaces, - gint num_surfaces) -{ - VADisplay dpy = gst_va_display_get_va_dpy (display); - VAStatus status; +/*=========================== Quarks for GstMemory ===========================*/ - g_return_val_if_fail (num_surfaces > 0, FALSE); +static GQuark +gst_va_buffer_surface_quark (void) +{ + static gsize surface_quark = 0; - gst_va_display_lock (display); - status = vaDestroySurfaces (dpy, surfaces, num_surfaces); - gst_va_display_unlock (display); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR ("vaDestroySurfaces: %s", vaErrorStr (status)); - return FALSE; + if (g_once_init_enter (&surface_quark)) { + GQuark quark = g_quark_from_string ("GstVaBufferSurface"); + g_once_init_leave (&surface_quark, quark); } - return TRUE; - + return surface_quark; } -static gboolean -_create_surfaces (GstVaDisplay * display, guint rt_format, guint fourcc, - guint width, guint height, gint usage_hint, VASurfaceID * 
surfaces, - guint num_surfaces) +static GQuark +gst_va_drm_mod_quark (void) { - VADisplay dpy = gst_va_display_get_va_dpy (display); - /* *INDENT-OFF* */ - VASurfaceAttrib attrs[] = { - { - .type = VASurfaceAttribUsageHint, - .flags = VA_SURFACE_ATTRIB_SETTABLE, - .value.type = VAGenericValueTypeInteger, - .value.value.i = usage_hint, - }, - { - .type = VASurfaceAttribPixelFormat, - .flags = VA_SURFACE_ATTRIB_SETTABLE, - .value.type = VAGenericValueTypeInteger, - .value.value.i = fourcc, - }, - { - .type = VASurfaceAttribMemoryType, - .flags = VA_SURFACE_ATTRIB_SETTABLE, - .value.type = VAGenericValueTypeInteger, - .value.value.i = VA_SURFACE_ATTRIB_MEM_TYPE_VA, - }, - }; - /* *INDENT-ON* */ - VAStatus status; - - g_return_val_if_fail (num_surfaces > 0, FALSE); + static gsize drm_mod_quark = 0; - gst_va_display_lock (display); - status = vaCreateSurfaces (dpy, rt_format, width, height, surfaces, - num_surfaces, attrs, G_N_ELEMENTS (attrs)); - gst_va_display_unlock (display); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR ("vaCreateSurfaces: %s", vaErrorStr (status)); - return FALSE; + if (g_once_init_enter (&drm_mod_quark)) { + GQuark quark = g_quark_from_string ("DRMModifier"); + g_once_init_leave (&drm_mod_quark, quark); } - return TRUE; + return drm_mod_quark; } -static gboolean -_export_surface_to_dmabuf (GstVaDisplay * display, VASurfaceID surface, - guint32 flags, VADRMPRIMESurfaceDescriptor * desc) -{ - VADisplay dpy = gst_va_display_get_va_dpy (display); - VAStatus status; - - gst_va_display_lock (display); - status = vaExportSurfaceHandle (dpy, surface, - VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2, flags, desc); - gst_va_display_unlock (display); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR ("vaExportSurfaceHandle: %s", vaErrorStr (status)); - return FALSE; +static GQuark +gst_va_buffer_aux_surface_quark (void) +{ + static gsize surface_quark = 0; + + if (g_once_init_enter (&surface_quark)) { + GQuark quark = g_quark_from_string 
      ("GstVaBufferAuxSurface");
+    g_once_init_leave (&surface_quark, quark);
   }
-  return TRUE;
+  return surface_quark;
 }

-static gboolean
-_destroy_image (GstVaDisplay * display, VAImageID image_id)
+/*========================= GstVaBufferSurface ===============================*/
+
+typedef struct _GstVaBufferSurface GstVaBufferSurface;
+struct _GstVaBufferSurface
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  GstVaDisplay *display;
+  VASurfaceID surface;
+  guint n_mems;
+  GstMemory *mems[GST_VIDEO_MAX_PLANES];
+  gint ref_count;
+  gint ref_mems_count;
+};

-  gst_va_display_lock (display);
-  status = vaDestroyImage (dpy, image_id);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_ERROR ("vaDestroyImage: %s", vaErrorStr (status));
-    return FALSE;
+static void
+gst_va_buffer_surface_unref (gpointer data)
+{
+  GstVaBufferSurface *buf = data;
+
+  g_return_if_fail (buf && GST_IS_VA_DISPLAY (buf->display));
+
+  if (g_atomic_int_dec_and_test (&buf->ref_count)) {
+    GST_LOG_OBJECT (buf->display, "Destroying surface %#x", buf->surface);
+    va_destroy_surfaces (buf->display, &buf->surface, 1);
+    gst_clear_object (&buf->display);
+    g_slice_free (GstVaBufferSurface, buf);
   }
-  return TRUE;
 }

-static gboolean
-_get_derive_image (GstVaDisplay * display, VASurfaceID surface, VAImage * image)
+static GstVaBufferSurface *
+gst_va_buffer_surface_new (VASurfaceID surface, GstVideoFormat format,
+    gint width, gint height)
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  GstVaBufferSurface *buf = g_slice_new (GstVaBufferSurface);

-  gst_va_display_lock (display);
-  status = vaDeriveImage (dpy, surface, image);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_WARNING ("vaDeriveImage: %s", vaErrorStr (status));
-    return FALSE;
-  }
+  g_atomic_int_set (&buf->ref_count, 0);
+  g_atomic_int_set (&buf->ref_mems_count, 0);
+  buf->surface = surface;
+  buf->display = NULL;
+  buf->n_mems = 0;

-  return TRUE;
+  return buf;
 }

-static gboolean
-_create_image (GstVaDisplay * display, GstVideoFormat format, gint width,
-    gint height, VAImage * image)
+/*=========================== GstVaMemoryPool ================================*/
+
+/* queue for disposed surfaces */
+typedef struct _GstVaMemoryPool GstVaMemoryPool;
+struct _GstVaMemoryPool
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  const VAImageFormat *va_format;
-  VAStatus status;
+  GstAtomicQueue *queue;
+  gint surface_count;

-  va_format = gst_va_image_format_from_video_format (format);
-  if (!va_format)
-    return FALSE;
+  GMutex lock;
+};

-  gst_va_display_lock (display);
-  status =
-      vaCreateImage (dpy, (VAImageFormat *) va_format, width, height, image);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_ERROR ("vaCreateImage: %s", vaErrorStr (status));
-    return FALSE;
-  }
-  return TRUE;
-}
+#define GST_VA_MEMORY_POOL_CAST(obj) ((GstVaMemoryPool *)obj)
+#define GST_VA_MEMORY_POOL_LOCK(obj) g_mutex_lock (&GST_VA_MEMORY_POOL_CAST(obj)->lock)
+#define GST_VA_MEMORY_POOL_UNLOCK(obj) g_mutex_unlock (&GST_VA_MEMORY_POOL_CAST(obj)->lock)

-static gboolean
-_get_image (GstVaDisplay * display, VASurfaceID surface, VAImage * image)
+static void
+gst_va_memory_pool_init (GstVaMemoryPool * self)
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  self->queue = gst_atomic_queue_new (2);

-  gst_va_display_lock (display);
-  status = vaGetImage (dpy, surface, 0, 0, image->width, image->height,
-      image->image_id);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_ERROR ("vaGetImage: %s", vaErrorStr (status));
-    return FALSE;
-  }
+  g_mutex_init (&self->lock);

-  return TRUE;
+  self->surface_count = 0;
 }

-static gboolean
-_sync_surface (GstVaDisplay * display, VASurfaceID surface)
+static void
+gst_va_memory_pool_finalize (GstVaMemoryPool * self)
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  g_mutex_clear (&self->lock);

-  gst_va_display_lock (display);
-  status = vaSyncSurface (dpy, surface);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_WARNING ("vaSyncSurface: %s", vaErrorStr (status));
-    return FALSE;
-  }
-  return TRUE;
+  gst_atomic_queue_unref (self->queue);
 }

-static gboolean
-_map_buffer (GstVaDisplay * display, VABufferID buffer, gpointer * data)
+static void
+gst_va_memory_pool_flush_unlocked (GstVaMemoryPool * self,
+    GstVaDisplay * display)
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  GstMemory *mem;
+  GstVaBufferSurface *buf;

-  gst_va_display_lock (display);
-  status = vaMapBuffer (dpy, buffer, data);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_WARNING ("vaMapBuffer: %s", vaErrorStr (status));
-    return FALSE;
+  while ((mem = gst_atomic_queue_pop (self->queue))) {
+    /* destroy the surface */
+    buf = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem),
+        gst_va_buffer_surface_quark ());
+    if (buf) {
+      if (g_atomic_int_dec_and_test (&buf->ref_count)) {
+        GST_LOG ("Destroying surface %#x", buf->surface);
+        va_destroy_surfaces (display, &buf->surface, 1);
+        self->surface_count -= 1;       /* GstVaDmabufAllocator */
+        g_slice_free (GstVaBufferSurface, buf);
+      }
+    } else {
+      self->surface_count -= 1;         /* GstVaAllocator */
+    }
+
+    GST_MINI_OBJECT_CAST (mem)->dispose = NULL;
+    /* when a memory is pushed to the available queue its allocator is
+     * unreffed, so it is required to ref the allocator here because
+     * the memory's finalize will unref it again */
+    gst_object_ref (mem->allocator);
+    gst_memory_unref (mem);
   }
-  return TRUE;
 }

-static gboolean
-_unmap_buffer (GstVaDisplay * display, VABufferID buffer)
+static void
+gst_va_memory_pool_flush (GstVaMemoryPool * self, GstVaDisplay * display)
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  GST_VA_MEMORY_POOL_LOCK (self);
+  gst_va_memory_pool_flush_unlocked (self, display);
+  GST_VA_MEMORY_POOL_UNLOCK (self);
+}

-  gst_va_display_lock (display);
-  status = vaUnmapBuffer (dpy, buffer);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_WARNING ("vaUnmapBuffer: %s", vaErrorStr (status));
-    return FALSE;
-  }
-  return TRUE;
+static inline void
+gst_va_memory_pool_push (GstVaMemoryPool * self, GstMemory * mem)
+{
+  gst_atomic_queue_push (self->queue, gst_memory_ref (mem));
 }

-static gboolean
-_put_image (GstVaDisplay * display, VASurfaceID surface, VAImage * image)
+static inline GstMemory *
+gst_va_memory_pool_pop (GstVaMemoryPool * self)
 {
-  VADisplay dpy = gst_va_display_get_va_dpy (display);
-  VAStatus status;
+  return gst_atomic_queue_pop (self->queue);
+}

-  if (!_sync_surface (display, surface))
-    return FALSE;
+static inline GstMemory *
+gst_va_memory_pool_peek (GstVaMemoryPool * self)
+{
+  return gst_atomic_queue_peek (self->queue);
+}

-  gst_va_display_lock (display);
-  status = vaPutImage (dpy, surface, image->image_id, 0, 0, image->width,
-      image->height, 0, 0, image->width, image->height);
-  gst_va_display_unlock (display);
-  if (status != VA_STATUS_SUCCESS) {
-    GST_ERROR ("vaPutImage: %s", vaErrorStr (status));
-    return FALSE;
-  }
-  return TRUE;
+static inline guint
+gst_va_memory_pool_surface_count (GstVaMemoryPool * self)
+{
+  return g_atomic_int_get (&self->surface_count);
 }

-static GQuark
-gst_va_buffer_surface_quark (void)
+static inline void
+gst_va_memory_pool_surface_inc (GstVaMemoryPool * self)
 {
-  static gsize surface_quark = 0;
+  g_atomic_int_inc (&self->surface_count);
+}

-  if (g_once_init_enter (&surface_quark)) {
-    GQuark quark = g_quark_from_string ("GstVaBufferSurface");
-    g_once_init_leave (&surface_quark, quark);
+/*=========================== GstVaDmabufAllocator ===========================*/
+
+struct _GstVaDmabufAllocator
+{
+  GstDmaBufAllocator parent;
+
+  GstVaDisplay *display;
+
+  GstMemoryMapFunction parent_map;
+  GstMemoryCopyFunction parent_copy;
+
+  GstVideoInfo info;
+  guint
      usage_hint;
+
+  GstVaSurfaceCopy *copy;
+
+  GstVaMemoryPool pool;
+};
+
+#define gst_va_dmabuf_allocator_parent_class dmabuf_parent_class
+G_DEFINE_TYPE_WITH_CODE (GstVaDmabufAllocator, gst_va_dmabuf_allocator,
+    GST_TYPE_DMABUF_ALLOCATOR, _init_debug_category ());
+
+static GstVaSurfaceCopy *
+_ensure_surface_copy (GstVaSurfaceCopy ** old, GstVaDisplay * display,
+    GstVideoInfo * info)
+{
+  GstVaSurfaceCopy *surface_copy;
+
+  surface_copy = g_atomic_pointer_get (old);
+  if (!surface_copy) {
+    surface_copy = gst_va_surface_copy_new (display, info);
+
+    /* another thread might have created and set one before us */
+    if (surface_copy &&
+        !g_atomic_pointer_compare_and_exchange (old, NULL, surface_copy)) {
+      gst_va_surface_copy_free (surface_copy);
+      surface_copy = g_atomic_pointer_get (old);
+    }
   }
-  return surface_quark;
+  return surface_copy;
 }

-static GQuark
-gst_va_drm_mod_quark (void)
+/* If a buffer contains multiple memories (dmabuf objects) it's very
+ * difficult to provide a reliable way to fast-copy single memories:
+ * while VA-API sees surfaces with dependent dmabufs, GStreamer only
+ * copies dmabufs in isolation; trying to solve it while keeping a
+ * reference of the copied buffer and dmabuf index is very fragile. */
+static GstMemory *
+gst_va_dmabuf_mem_copy (GstMemory * gmem, gssize offset, gssize size)
 {
-  static gsize drm_mod_quark = 0;
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (gmem->allocator);
+  GstVaBufferSurface *buf;
+  guint64 *drm_mod;
+  gsize mem_size;

-  if (g_once_init_enter (&drm_mod_quark)) {
-    GQuark quark = g_quark_from_string ("DRMModifier");
-    g_once_init_leave (&drm_mod_quark, quark);
+  buf = gst_mini_object_get_qdata (GST_MINI_OBJECT (gmem),
+      gst_va_buffer_surface_quark ());
+
+  drm_mod = gst_mini_object_get_qdata (GST_MINI_OBJECT (gmem),
+      gst_va_drm_mod_quark ());
+
+  /* 0 is DRM_FORMAT_MOD_LINEAR, we do not include its header now. */
+  if (buf->n_mems > 1 && *drm_mod != 0) {
+    GST_ERROR_OBJECT (self, "Failed to copy multi-dmabuf because non-linear "
+        "modifier: %#lx.", *drm_mod);
+    return NULL;
   }
-  return drm_mod_quark;
+  /* check if it's full memory copy */
+  mem_size = gst_memory_get_sizes (gmem, NULL, NULL);
+
+  if (size == -1)
+    size = mem_size > offset ? mem_size - offset : 0;
+
+  /* @XXX: if one-memory buffer it's possible to copy */
+  if (offset == 0 && size == mem_size && buf->n_mems == 1) {
+    GstVaBufferSurface *buf_copy = NULL;
+    GstMemory *copy;
+    GstVaSurfaceCopy *copy_func;
+
+    GST_VA_MEMORY_POOL_LOCK (&self->pool);
+    copy = gst_va_memory_pool_pop (&self->pool);
+    GST_VA_MEMORY_POOL_UNLOCK (&self->pool);
+
+    if (copy) {
+      gst_object_ref (copy->allocator);
+
+      buf_copy = gst_mini_object_get_qdata (GST_MINI_OBJECT (copy),
+          gst_va_buffer_surface_quark ());
+
+      g_assert (g_atomic_int_get (&buf_copy->ref_mems_count) == 0);
+
+      g_atomic_int_add (&buf_copy->ref_mems_count, 1);
+    } else {
+      GstBuffer *buffer = gst_buffer_new ();
+
+      if (!gst_va_dmabuf_allocator_setup_buffer (gmem->allocator, buffer)) {
+        GST_WARNING_OBJECT (self, "Failed to create a new dmabuf memory");
+        return NULL;
+      }
+
+      copy = gst_buffer_get_memory (buffer, 0);
+      gst_buffer_unref (buffer);
+
+      buf_copy = gst_mini_object_get_qdata (GST_MINI_OBJECT (copy),
+          gst_va_buffer_surface_quark ());
+    }
+
+    g_assert (buf_copy->n_mems == 1);
+
+    copy_func = _ensure_surface_copy (&self->copy, self->display, &self->info);
+    if (copy_func && gst_va_surface_copy (copy_func, buf_copy->surface,
+            buf->surface))
+      return copy;
+
+    gst_memory_unref (copy);
+
+    /* try system memory */
+  }
+
+  if (*drm_mod != 0) {
+    GST_ERROR_OBJECT (self, "Failed to copy dmabuf because non-linear "
+        "modifier: %#lx.", *drm_mod);
+    return NULL;
+  }
+
+  /* fallback to system memory */
+  return self->parent_copy (gmem, offset, size);
+}

 static gpointer
 gst_va_dmabuf_mem_map (GstMemory * gmem, gsize maxsize, GstMapFlags flags)
 {
   GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (gmem->allocator);
-  VASurfaceID surface = gst_va_memory_get_surface (gmem, NULL);
+  VASurfaceID surface = gst_va_memory_get_surface (gmem);
+  guint64 *drm_mod;

-  _sync_surface (self->display, surface);
+  drm_mod = gst_mini_object_get_qdata (GST_MINI_OBJECT (gmem),
+      gst_va_drm_mod_quark ());

-  /* @TODO: if mapping with flag GST_MAP_VASURFACE return the
-   * VA_SURFACE_ID.
-   * if mapping and drm_modifers are not lineal, use vaDeriveImage */
-#ifndef GST_DISABLE_GST_DEBUG
-  {
-    guint64 *drm_mod;
-
-    drm_mod = gst_mini_object_get_qdata (GST_MINI_OBJECT (gmem),
-        gst_va_drm_mod_quark ());
-    GST_TRACE_OBJECT (self, "DRM modifiers: %#lx", *drm_mod);
+  /* 0 is DRM_FORMAT_MOD_LINEAR, we do not include its header now. */
+  if (*drm_mod != 0) {
+    GST_ERROR_OBJECT (self, "Failed to map the dmabuf because the modifier "
+        "is: %#lx, which is not linear.", *drm_mod);
+    return NULL;
   }
-#endif
+
+  va_sync_surface (self->display, surface);

   return self->parent_map (gmem, maxsize, flags);
 }

 static void
-gst_va_dmabuf_allocator_dispose (GObject * object)
+gst_va_dmabuf_allocator_finalize (GObject * object)
 {
   GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (object);

+  g_clear_pointer (&self->copy, gst_va_surface_copy_free);
+  gst_va_memory_pool_finalize (&self->pool);
   gst_clear_object (&self->display);
-  gst_atomic_queue_unref (self->queue);

-  G_OBJECT_CLASS (dmabuf_parent_class)->dispose (object);
+  G_OBJECT_CLASS (dmabuf_parent_class)->finalize (object);
 }

 static void
-gst_va_dmabuf_allocator_free (GstAllocator * allocator, GstMemory * mem)
+gst_va_dmabuf_allocator_dispose (GObject * object)
 {
-  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (allocator);
-  GstVaBufferSurface *buf;
-
-  /* first close the dmabuf fd */
-  GST_ALLOCATOR_CLASS (dmabuf_parent_class)->free (allocator, mem);
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (object);

-  while ((buf = gst_atomic_queue_pop (self->queue))) {
-    GST_LOG_OBJECT (self, "Destroying surface %#x", buf->surface);
-    _destroy_surfaces (self->display, &buf->surface, 1);
-    g_slice_free (GstVaBufferSurface, buf);
+  gst_va_memory_pool_flush_unlocked (&self->pool, self->display);
+  if (gst_va_memory_pool_surface_count (&self->pool) != 0) {
+    GST_WARNING_OBJECT (self, "Surfaces leaked: %d",
+        gst_va_memory_pool_surface_count (&self->pool));
   }
+
+  G_OBJECT_CLASS (dmabuf_parent_class)->dispose (object);
 }

 static void
 gst_va_dmabuf_allocator_class_init (GstVaDmabufAllocatorClass * klass)
 {
-  GstAllocatorClass *allocator_class = GST_ALLOCATOR_CLASS (klass);
   GObjectClass *object_class = G_OBJECT_CLASS (klass);

   object_class->dispose = gst_va_dmabuf_allocator_dispose;
-  allocator_class->free = gst_va_dmabuf_allocator_free;
+  object_class->finalize = gst_va_dmabuf_allocator_finalize;
 }

 static void
 gst_va_dmabuf_allocator_init (GstVaDmabufAllocator * self)
 {
-  self->queue = gst_atomic_queue_new (2);
+  GstAllocator *allocator = GST_ALLOCATOR (self);

-  self->parent_map = GST_ALLOCATOR (self)->mem_map;
-  GST_ALLOCATOR (self)->mem_map = gst_va_dmabuf_mem_map;
+  self->parent_map = allocator->mem_map;
+  allocator->mem_map = gst_va_dmabuf_mem_map;
+  self->parent_copy = allocator->mem_copy;
+  allocator->mem_copy = gst_va_dmabuf_mem_copy;
+
+  gst_va_memory_pool_init (&self->pool);
 }

 GstAllocator *
@@ -408,19 +452,6 @@
   return GST_ALLOCATOR (self);
 }

-static GstVaBufferSurface *
-_create_buffer_surface (VASurfaceID surface, GstVideoFormat format,
-    gint width, gint height)
-{
-  GstVaBufferSurface *buf = g_slice_new (GstVaBufferSurface);
-
-  g_atomic_int_set (&buf->ref_count, 0);
-  buf->surface = surface;
-  gst_video_info_set_format (&buf->info, format, width, height);
-
-  return buf;
-}
-
 static inline goffset
 _get_fd_size (gint fd)
 {
@@ -428,56 +459,113 @@
 }

 static gboolean
-gst_va_memory_dispose (GstMiniObject * mini_object)
+gst_va_dmabuf_memory_release (GstMiniObject * mini_object)
 {
   GstMemory *mem = GST_MEMORY_CAST (mini_object);
-  GstVaDmabufAllocator *self =
      GST_VA_DMABUF_ALLOCATOR (mem->allocator);
   GstVaBufferSurface *buf;
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (mem->allocator);
+  guint i;
+
+  buf = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem),
+      gst_va_buffer_surface_quark ());
+  if (!buf)
+    return TRUE;                /* free this unknown buffer */
+
+  /* if this is the last reference to the GstVaBufferSurface, iterate
+   * its array of memories and push them into the queue, thread
+   * safely. */
+  GST_VA_MEMORY_POOL_LOCK (&self->pool);
+  if (g_atomic_int_dec_and_test (&buf->ref_mems_count)) {
+    for (i = 0; i < buf->n_mems; i++) {
+      GST_LOG_OBJECT (self, "releasing %p: dmabuf %d, va surface %#x",
+          buf->mems[i], gst_dmabuf_memory_get_fd (buf->mems[i]), buf->surface);
+      gst_va_memory_pool_push (&self->pool, buf->mems[i]);
+    }
+  }
+  GST_VA_MEMORY_POOL_UNLOCK (&self->pool);

-  buf = gst_mini_object_get_qdata (mini_object, gst_va_buffer_surface_quark ());
-  if (buf && g_atomic_int_dec_and_test (&buf->ref_count))
-    gst_atomic_queue_push (self->queue, buf);
+  /* note: if ref_mem_count doesn't reach zero, that memory will
+   * "float" until it's pushed back into the pool by the last va
+   * buffer surface ref */

-  return TRUE;
+  /* Keep last in case we are holding on the last allocator ref */
+  gst_object_unref (mem->allocator);
+
+  /* don't call mini_object's free */
+  return FALSE;
 }

-gboolean
-gst_va_dmabuf_setup_buffer (GstAllocator * allocator, GstBuffer * buffer,
-    GstVaAllocationParams * params)
+/* Creates an exported VASurface and adds it as @buffer's memories
+ * qdata
+ *
+ * If @info is not NULL, a dummy (non-pooled) buffer is created to
+ * update offsets and strides, and it has to be unrefed immediately.
+ */
+static gboolean
+gst_va_dmabuf_allocator_setup_buffer_full (GstAllocator * allocator,
+    GstBuffer * buffer, GstVideoInfo * info)
 {
   GstVaBufferSurface *buf;
   GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (allocator);
   GstVideoFormat format;
   VADRMPRIMESurfaceDescriptor desc = { 0, };
+  VASurfaceAttribExternalBuffers *extbuf = NULL, ext_buf;
   VASurfaceID surface;
   guint32 i, fourcc, rt_format, export_flags;
+  GDestroyNotify buffer_destroy = NULL;

   g_return_val_if_fail (GST_IS_VA_DMABUF_ALLOCATOR (allocator), FALSE);
-  g_return_val_if_fail (params, FALSE);

-  format = GST_VIDEO_INFO_FORMAT (&params->info);
+  format = GST_VIDEO_INFO_FORMAT (&self->info);
   fourcc = gst_va_fourcc_from_video_format (format);
   rt_format = gst_va_chroma_from_video_format (format);
   if (fourcc == 0 || rt_format == 0) {
     GST_ERROR_OBJECT (allocator, "Unsupported format: %s",
-        gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (&params->info)));
+        gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (&self->info)));
     return FALSE;
   }

-  if (!_create_surfaces (self->display, rt_format, fourcc,
-          GST_VIDEO_INFO_WIDTH (&params->info),
-          GST_VIDEO_INFO_HEIGHT (&params->info), params->usage_hint, &surface,
-          1))
+  /* HACK(victor): disable tiling for i965 driver for RGB formats */
+  if (gst_va_display_is_implementation (self->display,
+          GST_VA_IMPLEMENTATION_INTEL_I965)
+      && GST_VIDEO_INFO_IS_RGB (&self->info)) {
+    /* *INDENT-OFF* */
+    ext_buf = (VASurfaceAttribExternalBuffers) {
+      .width = GST_VIDEO_INFO_WIDTH (&self->info),
+      .height = GST_VIDEO_INFO_HEIGHT (&self->info),
+      .num_planes = GST_VIDEO_INFO_N_PLANES (&self->info),
+      .pixel_format = fourcc,
+    };
+    /* *INDENT-ON* */
+
+    extbuf = &ext_buf;
+  }
+
+  if (!va_create_surfaces (self->display, rt_format, fourcc,
+          GST_VIDEO_INFO_WIDTH (&self->info),
+          GST_VIDEO_INFO_HEIGHT (&self->info), self->usage_hint, extbuf,
+          &surface, 1))
     return FALSE;

-  /* Each layer will contain exactly one plane. For example, an NV12
-   * surface will be exported as two layers */
-  export_flags = VA_EXPORT_SURFACE_SEPARATE_LAYERS
-      | VA_EXPORT_SURFACE_READ_WRITE;
-  if (!_export_surface_to_dmabuf (self->display, surface, export_flags, &desc))
+  /* workaround for missing layered dmabuf formats in i965 */
+  if (gst_va_display_is_implementation (self->display,
+          GST_VA_IMPLEMENTATION_INTEL_I965)
+      && (fourcc == VA_FOURCC_YUY2 || fourcc == VA_FOURCC_UYVY)) {
+    /* These are not representable as separate planes */
+    export_flags = VA_EXPORT_SURFACE_COMPOSED_LAYERS;
+  } else {
+    /* Each layer will contain exactly one plane. For example, an NV12
+     * surface will be exported as two layers */
+    export_flags = VA_EXPORT_SURFACE_SEPARATE_LAYERS;
+  }
+
+  export_flags |= VA_EXPORT_SURFACE_READ_WRITE;
+
+  if (!va_export_surface_to_dmabuf (self->display, surface, export_flags,
+          &desc))
     goto failed;

-  g_assert (GST_VIDEO_INFO_N_PLANES (&params->info) == desc.num_layers);
+  g_assert (GST_VIDEO_INFO_N_PLANES (&self->info) == desc.num_layers);

   if (fourcc != desc.fourcc) {
     GST_ERROR ("Unsupported fourcc: %" GST_FOURCC_FORMAT,
@@ -485,7 +573,18 @@
     goto failed;
   }

-  buf = _create_buffer_surface (surface, format, desc.width, desc.height);
+  if (desc.num_objects == 0) {
+    GST_ERROR ("Failed to export surface to dmabuf");
+    goto failed;
+  }
+
+  buf = gst_va_buffer_surface_new (surface, format, desc.width, desc.height);
+  if (G_UNLIKELY (info)) {
+    *info = self->info;
+    GST_VIDEO_INFO_SIZE (info) = 0;
+  }
+
+  buf->n_mems = desc.num_objects;

   for (i = 0; i < desc.num_objects; i++) {
     gint fd = desc.objects[i].fd;
@@ -495,59 +594,305 @@
     guint64 *drm_mod = g_new (guint64, 1);

     gst_buffer_append_memory (buffer, mem);
-
-    GST_MINI_OBJECT (mem)->dispose = gst_va_memory_dispose;
+    buf->mems[i] = mem;
+
+    if (G_LIKELY (!info)) {
+      GST_MINI_OBJECT (mem)->dispose = gst_va_dmabuf_memory_release;
+      g_atomic_int_add (&buf->ref_mems_count, 1);
+    } else {
+      /* if @info is set, the surface will be destroyed as soon as the
+       * buffer is destroyed (e.g. gst_va_dmabuf_allocator_try()) */
+      buf->display = gst_object_ref (self->display);
+      buffer_destroy = gst_va_buffer_surface_unref;
+    }

     g_atomic_int_add (&buf->ref_count, 1);
     gst_mini_object_set_qdata (GST_MINI_OBJECT (mem),
-        gst_va_buffer_surface_quark (), buf, NULL);
+        gst_va_buffer_surface_quark (), buf, buffer_destroy);

     *drm_mod = desc.objects[i].drm_format_modifier;
     gst_mini_object_set_qdata (GST_MINI_OBJECT (mem), gst_va_drm_mod_quark (),
         drm_mod, g_free);
+
+    if (G_UNLIKELY (info))
+      GST_VIDEO_INFO_SIZE (info) += size;
+
+    GST_LOG_OBJECT (self, "buffer %p: new dmabuf %d / surface %#x [%dx%d] "
+        "size %" G_GSIZE_FORMAT " drm mod %#lx", buffer, fd, surface,
+        GST_VIDEO_INFO_WIDTH (&self->info), GST_VIDEO_INFO_HEIGHT (&self->info),
+        GST_VIDEO_INFO_SIZE (&self->info), *drm_mod);
+  }
+
+  if (G_UNLIKELY (info)) {
+    for (i = 0; i < desc.num_layers; i++) {
+      g_assert (desc.layers[i].num_planes == 1);
+      GST_VIDEO_INFO_PLANE_OFFSET (info, i) = desc.layers[i].offset[0];
+      GST_VIDEO_INFO_PLANE_STRIDE (info, i) = desc.layers[i].pitch[0];
+    }
+  } else {
+    gst_va_memory_pool_surface_inc (&self->pool);
+  }
+
+  return TRUE;
+
+failed:
+  {
+    va_destroy_surfaces (self->display, &surface, 1);
+    return FALSE;
   }
+}
+
+gboolean
+gst_va_dmabuf_allocator_setup_buffer (GstAllocator * allocator,
+    GstBuffer * buffer)
+{
+  return gst_va_dmabuf_allocator_setup_buffer_full (allocator, buffer, NULL);
+}
+
+static VASurfaceID
+gst_va_dmabuf_allocator_prepare_buffer_unlocked (GstVaDmabufAllocator * self,
+    GstBuffer * buffer)
+{
+  GstMemory *mems[GST_VIDEO_MAX_PLANES] = { 0, };
+  GstVaBufferSurface *buf;
+  gint i, j, idx;
+
+  mems[0] = gst_va_memory_pool_pop (&self->pool);
+  if (!mems[0])
+    return VA_INVALID_ID;
+
+  buf = gst_mini_object_get_qdata (GST_MINI_OBJECT (mems[0]),
+      gst_va_buffer_surface_quark ());
+  if (!buf)
+    return VA_INVALID_ID;
+
+  if (buf->surface == VA_INVALID_ID)
+    return VA_INVALID_ID;
+
+  for (idx = 1; idx < buf->n_mems; idx++) {
+    /* grab next memory
       from queue */
+    {
+      GstMemory *mem;
+      GstVaBufferSurface *pbuf;
+
+      mem = gst_va_memory_pool_peek (&self->pool);
+      if (!mem)
+        return VA_INVALID_ID;
+
+      pbuf = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem),
+          gst_va_buffer_surface_quark ());
+      if (!pbuf)
+        return VA_INVALID_ID;
+
+      if (pbuf->surface != buf->surface) {
+        GST_WARNING_OBJECT (self,
+            "expecting memory with surface %#x but got %#x: "
+            "possible memory interweaving", buf->surface, pbuf->surface);
+        return VA_INVALID_ID;
+      }
+    }
+
+    mems[idx] = gst_va_memory_pool_pop (&self->pool);
+  };
+
+  /* append memories */
+  for (i = 0; i < buf->n_mems; i++) {
+    gboolean found = FALSE;
+
+    /* find next memory to append */
+    for (j = 0; j < idx; j++) {
+      if (buf->mems[i] == mems[j]) {
+        found = TRUE;
+        break;
+      }
+    }
+
+    /* if not found, free all the popped memories and bail */
+    if (!found) {
+      if (!buf->display)
+        buf->display = gst_object_ref (self->display);
+      for (j = 0; j < idx; j++) {
+        gst_object_ref (buf->mems[j]->allocator);
+        GST_MINI_OBJECT (mems[j])->dispose = NULL;
+        gst_memory_unref (mems[j]);
+      }
+      return VA_INVALID_ID;
+    }
+
+    g_atomic_int_add (&buf->ref_mems_count, 1);
+    gst_object_ref (buf->mems[i]->allocator);
+    gst_buffer_append_memory (buffer, buf->mems[i]);
+
+    GST_LOG ("buffer %p: memory %p - dmabuf %d / surface %#x", buffer,
+        buf->mems[i], gst_dmabuf_memory_get_fd (buf->mems[i]),
+        gst_va_memory_get_surface (buf->mems[i]));
+  }
+
+  return buf->surface;
+}
+
+gboolean
+gst_va_dmabuf_allocator_prepare_buffer (GstAllocator * allocator,
+    GstBuffer * buffer)
+{
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (allocator);
+  VASurfaceID surface;
+
+  GST_VA_MEMORY_POOL_LOCK (&self->pool);
+  surface = gst_va_dmabuf_allocator_prepare_buffer_unlocked (self, buffer);
+  GST_VA_MEMORY_POOL_UNLOCK (&self->pool);
+
+  return (surface != VA_INVALID_ID);
+}
+
+void
+gst_va_dmabuf_allocator_flush (GstAllocator * allocator)
+{
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (allocator);
+
+  gst_va_memory_pool_flush (&self->pool, self->display);
+}
+
+static gboolean
+gst_va_dmabuf_allocator_try (GstAllocator * allocator)
+{
+  GstBuffer *buffer;
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (allocator);
+  GstVideoInfo info = self->info;
+  gboolean ret;
+
+  buffer = gst_buffer_new ();
+  ret = gst_va_dmabuf_allocator_setup_buffer_full (allocator, buffer, &info);
+  gst_buffer_unref (buffer);
+
+  if (ret)
+    self->info = info;
+
+  return ret;
+}
+
+gboolean
+gst_va_dmabuf_allocator_set_format (GstAllocator * allocator,
+    GstVideoInfo * info, guint usage_hint)
+{
+  GstVaDmabufAllocator *self;
+  gboolean ret;
+
+  g_return_val_if_fail (GST_IS_VA_DMABUF_ALLOCATOR (allocator), FALSE);
+  g_return_val_if_fail (info, FALSE);
+
+  self = GST_VA_DMABUF_ALLOCATOR (allocator);
+
+  if (gst_va_memory_pool_surface_count (&self->pool) != 0) {
+    if (GST_VIDEO_INFO_FORMAT (info) == GST_VIDEO_INFO_FORMAT (&self->info)
+        && GST_VIDEO_INFO_WIDTH (info) == GST_VIDEO_INFO_WIDTH (&self->info)
+        && GST_VIDEO_INFO_HEIGHT (info) == GST_VIDEO_INFO_HEIGHT (&self->info)
+        && usage_hint == self->usage_hint) {
+      *info = self->info;       /* update callee info (offset & stride) */
+      return TRUE;
+    }
+    return FALSE;
+  }
+
+  self->usage_hint = usage_hint;
+  self->info = *info;
+
+  g_clear_pointer (&self->copy, gst_va_surface_copy_free);
+
+  ret = gst_va_dmabuf_allocator_try (allocator);
+
+  if (ret)
+    *info = self->info;
+
+  return ret;
+}

-  for (i = 0; i < desc.num_layers; i++) {
-    g_assert (desc.layers[i].num_planes == 1);
-    GST_VIDEO_INFO_PLANE_OFFSET (&buf->info, i) = desc.layers[i].offset[0];
-    GST_VIDEO_INFO_PLANE_STRIDE (&buf->info, i) = desc.layers[i].pitch[0];
-  }
+gboolean
+gst_va_dmabuf_allocator_get_format (GstAllocator * allocator,
+    GstVideoInfo * info, guint * usage_hint)
+{
+  GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (allocator);

-  GST_VIDEO_INFO_SIZE (&buf->info) = gst_buffer_get_size (buffer);
-  GST_LOG_OBJECT (self, "Created surface %#x [%dx%d] size %" G_GSIZE_FORMAT,
-      buf->surface, GST_VIDEO_INFO_WIDTH (&buf->info),
-      GST_VIDEO_INFO_HEIGHT (&buf->info), GST_VIDEO_INFO_SIZE (&buf->info));
+  if (GST_VIDEO_INFO_FORMAT (&self->info) == GST_VIDEO_FORMAT_UNKNOWN)
+    return FALSE;

-  params->info = buf->info;
+  if (info)
+    *info = self->info;
+  if (usage_hint)
+    *usage_hint = self->usage_hint;

   return TRUE;
-
-failed:
-  {
-    _destroy_surfaces (self->display, &surface, 1);
-    return FALSE;
-  }
 }

+/* XXX: use a surface pool to control the created surfaces */
 gboolean
-gst_va_dmabuf_try (GstAllocator * allocator, GstVaAllocationParams * params)
+gst_va_dmabuf_memories_setup (GstVaDisplay * display, GstVideoInfo * info,
+    guint n_planes, GstMemory * mem[GST_VIDEO_MAX_PLANES],
+    uintptr_t * fds, gsize offset[GST_VIDEO_MAX_PLANES], guint usage_hint)
 {
-  GstBuffer *buffer = gst_buffer_new ();
-  GstMapInfo map_info;
+  GstVideoFormat format;
+  GstVaBufferSurface *buf;
+  /* *INDENT-OFF* */
+  VASurfaceAttribExternalBuffers ext_buf = {
+    .width = GST_VIDEO_INFO_WIDTH (info),
+    .height = GST_VIDEO_INFO_HEIGHT (info),
+    .data_size = GST_VIDEO_INFO_SIZE (info),
+    .num_planes = GST_VIDEO_INFO_N_PLANES (info),
+    .buffers = fds,
+    .num_buffers = GST_VIDEO_INFO_N_PLANES (info),
+  };
+  /* *INDENT-ON* */
+  VASurfaceID surface;
+  guint32 fourcc, rt_format;
+  guint i;
   gboolean ret;

-  ret = gst_va_dmabuf_setup_buffer (allocator, buffer, params);
-  if (ret) {
-    /* XXX: radeonsi for kadaveri cannot map dmabufs to user space */
-    if (!gst_buffer_map (buffer, &map_info, GST_MAP_READWRITE)) {
-      GST_WARNING_OBJECT (allocator,
-          "DMABuf backend cannot map frames to user space.");
-    }
-    gst_buffer_unmap (buffer, &map_info);
+  g_return_val_if_fail (GST_IS_VA_DISPLAY (display), FALSE);
+  g_return_val_if_fail (n_planes > 0
+      && n_planes <= GST_VIDEO_MAX_PLANES, FALSE);
+
+  format = GST_VIDEO_INFO_FORMAT (info);
+  if (format == GST_VIDEO_FORMAT_UNKNOWN)
+    return FALSE;
+
+  rt_format = gst_va_chroma_from_video_format (format);
+  if (rt_format == 0)
+    return FALSE;
+
+  fourcc = gst_va_fourcc_from_video_format (format);
+  if (fourcc == 0)
+    return FALSE;
+
+  ext_buf.pixel_format = fourcc;
+
+  for (i = 0; i < n_planes; i++) {
+    ext_buf.pitches[i] = GST_VIDEO_INFO_PLANE_STRIDE (info, i);
+    ext_buf.offsets[i] = offset[i];
   }
-  gst_buffer_unref (buffer);

-  return ret;
+  ret = va_create_surfaces (display, rt_format, ext_buf.pixel_format,
+      ext_buf.width, ext_buf.height, usage_hint, &ext_buf, &surface, 1);
+  if (!ret)
+    return FALSE;
+
+  GST_LOG_OBJECT (display, "Created surface %#x [%dx%d]", surface,
+      ext_buf.width, ext_buf.height);
+
+  buf = gst_va_buffer_surface_new (surface, rt_format, ext_buf.width,
+      ext_buf.height);
+  buf->display = gst_object_ref (display);
+  buf->n_mems = n_planes;
+  memcpy (buf->mems, mem, sizeof (buf->mems));
+
+  for (i = 0; i < n_planes; i++) {
+    g_atomic_int_add (&buf->ref_count, 1);
+    gst_mini_object_set_qdata (GST_MINI_OBJECT (mem[i]),
+        gst_va_buffer_surface_quark (), buf, gst_va_buffer_surface_unref);
+    GST_INFO_OBJECT (display, "setting surface %#x to dmabuf fd %d",
+        buf->surface, gst_dmabuf_memory_get_fd (mem[i]));
+  }
+
+  return TRUE;
 }

 /*===================== GstVaAllocator / GstVaMemory =========================*/
@@ -557,16 +902,29 @@
   GstAllocator parent;

   GstVaDisplay *display;
+  gboolean use_derived;
   GArray *surface_formats;
+
+  GstVideoFormat surface_format;
+  GstVideoFormat img_format;
+  guint32 fourcc;
+  guint32 rt_format;
+
+  GstVideoInfo derived_info;
+  GstVideoInfo info;
+  guint usage_hint;
+
+  GstVaSurfaceCopy *copy;
+
+  GstVaMemoryPool pool;
 };

 typedef struct _GstVaMemory GstVaMemory;
 struct _GstVaMemory
 {
-  GstMemory parent;
+  GstMemory mem;

-  GstVideoInfo info;
   VASurfaceID surface;
   GstVideoFormat surface_format;
   VAImage image;
@@ -583,13 +941,31 @@
 G_DEFINE_TYPE_WITH_CODE (GstVaAllocator, gst_va_allocator, GST_TYPE_ALLOCATOR,
     _init_debug_category ());

+static gboolean _va_unmap (GstVaMemory * mem);
+
 static void
-gst_va_allocator_dispose (GObject * object)
+gst_va_allocator_finalize (GObject * object) { GstVaAllocator *self = GST_VA_ALLOCATOR (object); - gst_clear_object (&self->display); + g_clear_pointer (&self->copy, gst_va_surface_copy_free); + gst_va_memory_pool_finalize (&self->pool); g_clear_pointer (&self->surface_formats, g_array_unref); + gst_clear_object (&self->display); + + G_OBJECT_CLASS (gst_va_allocator_parent_class)->finalize (object); +} + +static void +gst_va_allocator_dispose (GObject * object) +{ + GstVaAllocator *self = GST_VA_ALLOCATOR (object); + + gst_va_memory_pool_flush_unlocked (&self->pool, self->display); + if (gst_va_memory_pool_surface_count (&self->pool) != 0) { + GST_WARNING_OBJECT (self, "Surfaces leaked: %d", + gst_va_memory_pool_surface_count (&self->pool)); + } G_OBJECT_CLASS (gst_va_allocator_parent_class)->dispose (object); } @@ -600,9 +976,17 @@ GstVaAllocator *self = GST_VA_ALLOCATOR (allocator); GstVaMemory *va_mem = (GstVaMemory *) mem; - GST_LOG_OBJECT (self, "Destroying surface %#x", va_mem->surface); + if (va_mem->mapped_data) { + g_warning (G_STRLOC ":%s: Freeing memory %p still mapped", G_STRFUNC, + va_mem); + _va_unmap (va_mem); + } + + if (va_mem->surface != VA_INVALID_ID && mem->parent == NULL) { + GST_LOG_OBJECT (self, "Destroying surface %#x", va_mem->surface); + va_destroy_surfaces (self->display, &va_mem->surface, 1); + } - _destroy_surfaces (self->display, &va_mem->surface, 1); g_mutex_clear (&va_mem->lock); g_slice_free (GstVaMemory, va_mem); @@ -615,6 +999,7 @@ GObjectClass *object_class = G_OBJECT_CLASS (klass); object_class->dispose = gst_va_allocator_dispose; + object_class->finalize = gst_va_allocator_finalize; allocator_class->free = _va_free; } @@ -638,41 +1023,64 @@ g_atomic_int_set (&mem->map_count, 0); g_mutex_init (&mem->lock); - gst_memory_init (GST_MEMORY_CAST (mem), GST_MEMORY_FLAG_NO_SHARE, allocator, - NULL, size, 0 /* align */ , 0 /* offset */ , size); + gst_memory_init (GST_MEMORY_CAST (mem), 0, allocator, NULL, size, + 0 /* align */ , 0 /* 
offset */ , size); } -static inline gboolean -_ensure_image (GstVaDisplay * display, VASurfaceID surface, GstVideoInfo * info, - VAImage * image, gboolean * derived) +static inline void +_update_info (GstVideoInfo * info, const VAImage * image) { - gint i; - gboolean try_derived; + guint i; - if (image->image_id != VA_INVALID_ID) - return TRUE; - - if (!_sync_surface (display, surface)) - return FALSE; + for (i = 0; i < image->num_planes; i++) { + GST_VIDEO_INFO_PLANE_OFFSET (info, i) = image->offsets[i]; + GST_VIDEO_INFO_PLANE_STRIDE (info, i) = image->pitches[i]; + } - try_derived = (derived) ? *derived : FALSE; + GST_VIDEO_INFO_SIZE (info) = image->data_size; +} - if (try_derived && _get_derive_image (display, surface, image)) - goto bail; - if (!_create_image (display, GST_VIDEO_INFO_FORMAT (info), - GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info), image)) +static inline gboolean +_update_image_info (GstVaAllocator * va_allocator) +{ + VASurfaceID surface; + VAImage image = {.image_id = VA_INVALID_ID, }; + + /* Create a test surface first */ + if (!va_create_surfaces (va_allocator->display, va_allocator->rt_format, + va_allocator->fourcc, GST_VIDEO_INFO_WIDTH (&va_allocator->info), + GST_VIDEO_INFO_HEIGHT (&va_allocator->info), va_allocator->usage_hint, + NULL, &surface, 1)) { + GST_ERROR_OBJECT (va_allocator, "Failed to create a test surface"); return FALSE; + } - if (derived) - *derived = FALSE; + GST_DEBUG_OBJECT (va_allocator, "Created surface %#x [%dx%d]", surface, + GST_VIDEO_INFO_WIDTH (&va_allocator->info), + GST_VIDEO_INFO_HEIGHT (&va_allocator->info)); + + /* Try derived first, but different formats can never derive */ + if (va_allocator->surface_format == va_allocator->img_format) { + if (va_get_derive_image (va_allocator->display, surface, &image)) { + va_allocator->use_derived = TRUE; + va_allocator->derived_info = va_allocator->info; + _update_info (&va_allocator->derived_info, &image); + va_destroy_image (va_allocator->display, 
image.image_id); + } + image.image_id = VA_INVALID_ID; /* reset it */ + } -bail: - for (i = 0; i < image->num_planes; i++) { - GST_VIDEO_INFO_PLANE_OFFSET (info, i) = image->offsets[i]; - GST_VIDEO_INFO_PLANE_STRIDE (info, i) = image->pitches[i]; + /* Then we try to create an image. */ + if (!va_create_image (va_allocator->display, va_allocator->img_format, + GST_VIDEO_INFO_WIDTH (&va_allocator->info), + GST_VIDEO_INFO_HEIGHT (&va_allocator->info), &image)) { + va_destroy_surfaces (va_allocator->display, &surface, 1); + return FALSE; } - GST_VIDEO_INFO_SIZE (info) = image->data_size; + _update_info (&va_allocator->info, &image); + va_destroy_image (va_allocator->display, image.image_id); + va_destroy_surfaces (va_allocator->display, &surface, 1); return TRUE; } @@ -681,14 +1089,16 @@ _va_map_unlocked (GstVaMemory * mem, GstMapFlags flags) { GstAllocator *allocator = GST_MEMORY_CAST (mem)->allocator; + GstVideoInfo *info; GstVaAllocator *va_allocator; GstVaDisplay *display; + gboolean use_derived; g_return_val_if_fail (mem->surface != VA_INVALID_ID, NULL); g_return_val_if_fail (GST_IS_VA_ALLOCATOR (allocator), NULL); if (g_atomic_int_get (&mem->map_count) > 0) { - if (mem->prev_mapflags != flags || !mem->mapped_data) + if (!(mem->prev_mapflags & flags) || !mem->mapped_data) return NULL; else goto success; @@ -699,11 +1109,8 @@ if (flags & GST_MAP_WRITE) { mem->is_dirty = TRUE; - mem->is_derived = FALSE; } else { /* GST_MAP_READ only */ mem->is_dirty = FALSE; - mem->is_derived = va_allocator->use_derived && - (GST_VIDEO_INFO_FORMAT (&mem->info) == mem->surface_format); } if (flags & GST_MAP_VA) { @@ -711,18 +1118,47 @@ goto success; } - if (!_ensure_image (display, mem->surface, &mem->info, &mem->image, - &mem->is_derived)) + switch (gst_va_display_get_implementation (display)) { + case GST_VA_IMPLEMENTATION_INTEL_IHD: + /* On Gen7+ Intel graphics the memory is mappable but not + * cached, so normal memcpy() access is very slow to read, but + * it's ok for writing.
So let's assume that users won't prefer + * direct-mapped memory if they request read access. */ + use_derived = va_allocator->use_derived && !(flags & GST_MAP_READ); + break; + case GST_VA_IMPLEMENTATION_INTEL_I965: + /* YUV derived images are tiled, so writing them is also + * problematic */ + use_derived = va_allocator->use_derived && !((flags & GST_MAP_READ) + || ((flags & GST_MAP_WRITE) + && GST_VIDEO_INFO_IS_YUV (&va_allocator->derived_info))); + break; + case GST_VA_IMPLEMENTATION_MESA_GALLIUM: + /* Reading RGB derived images, with non-standard resolutions, + * looks tiled too. TODO(victor): file a bug in Mesa. */ + use_derived = va_allocator->use_derived && !((flags & GST_MAP_READ) + && GST_VIDEO_INFO_IS_RGB (&va_allocator->derived_info)); + break; + default: + use_derived = va_allocator->use_derived; + break; + } + if (use_derived) + info = &va_allocator->derived_info; + else + info = &va_allocator->info; + + if (!va_ensure_image (display, mem->surface, info, &mem->image, use_derived)) return NULL; - va_allocator->use_derived = mem->is_derived; + mem->is_derived = use_derived; if (!mem->is_derived) { - if (!_get_image (display, mem->surface, &mem->image)) + if (!va_get_image (display, mem->surface, &mem->image)) goto fail; } - if (!_map_buffer (display, mem->image.buf, &mem->mapped_data)) + if (!va_map_buffer (display, mem->image.buf, &mem->mapped_data)) goto fail; success: @@ -734,7 +1170,7 @@ fail: { - _destroy_image (display, mem->image.image_id); + va_destroy_image (display, mem->image.image_id); _clean_mem (mem); return NULL; } @@ -769,15 +1205,15 @@ if (mem->image.image_id != VA_INVALID_ID) { if (mem->is_dirty && !mem->is_derived) { - ret = _put_image (display, mem->surface, &mem->image); + ret = va_put_image (display, mem->surface, &mem->image); mem->is_dirty = FALSE; } /* XXX(victor): if it is derived and is dirty, create another surface * and replace it in mem */ } - ret &= _unmap_buffer (display, mem->image.buf); - ret &= _destroy_image
(display, mem->image.image_id); + ret &= va_unmap_buffer (display, mem->image.buf); + ret &= va_destroy_image (display, mem->image.image_id); bail: _clean_mem (mem); @@ -797,50 +1233,103 @@ return ret; } -/* XXX(victor): shallow copy -- only the surface */ static GstMemory * -_va_copy_unlocked (GstVaMemory * mem) +_va_share (GstMemory * mem, gssize offset, gssize size) { - GstVaMemory *ret; - gsize size; + GstVaMemory *vamem = (GstVaMemory *) mem; + GstVaMemory *sub; + GstMemory *parent; - ret = g_slice_new (GstVaMemory); + GST_DEBUG ("%p: share %" G_GSSIZE_FORMAT ", %" G_GSIZE_FORMAT, mem, offset, + size); - size = GST_VIDEO_INFO_SIZE (&mem->info); + /* find real parent */ + if ((parent = vamem->mem.parent) == NULL) + parent = (GstMemory *) vamem; - ret->info = mem->info; - ret->surface = mem->surface; + if (size == -1) + size = mem->maxsize - offset; - _reset_mem (ret, GST_MEMORY_CAST (mem)->allocator, size); + sub = g_slice_new (GstVaMemory); - return GST_MEMORY_CAST (ret); -} + /* the shared memory is always readonly */ + gst_memory_init (GST_MEMORY_CAST (sub), GST_MINI_OBJECT_FLAGS (parent) | + GST_MINI_OBJECT_FLAG_LOCK_READONLY, vamem->mem.allocator, parent, + vamem->mem.maxsize, vamem->mem.align, vamem->mem.offset + offset, size); -static GstMemory * -_va_copy (GstVaMemory * mem, gssize offset, gssize size) -{ - GstMemory *ret; + sub->surface = vamem->surface; + sub->surface_format = vamem->surface_format; - g_mutex_lock (&mem->lock); - ret = _va_copy_unlocked (mem); - g_mutex_unlock (&mem->lock); + _clean_mem (sub); - return ret; + g_atomic_int_set (&sub->map_count, 0); + g_mutex_init (&sub->lock); + + return GST_MEMORY_CAST (sub); } +/* XXX(victor): deep copy implementation.
*/ static GstMemory * -_va_share (GstMemory * mem, gssize offset, gssize size) +_va_copy (GstMemory * mem, gssize offset, gssize size) { - /* VA surfaces are opaque structures, which cannot be shared */ - return NULL; -} + GstMemory *copy; + GstMapInfo sinfo, dinfo; + GstVaAllocator *va_allocator = GST_VA_ALLOCATOR (mem->allocator); + GstVaMemory *va_copy, *va_mem = (GstVaMemory *) mem; + gsize mem_size; -static gboolean -_va_is_span (GstMemory * mem1, GstMemory * mem2, gsize * offset) -{ - /* VA surfaces are opaque structures, which might live in other - * memory. It is impossible to know, so far, if they can mergable. */ - return FALSE; + GST_DEBUG ("%p: copy %" G_GSSIZE_FORMAT ", %" G_GSIZE_FORMAT, mem, offset, + size); + + { + GST_VA_MEMORY_POOL_LOCK (&va_allocator->pool); + copy = gst_va_memory_pool_pop (&va_allocator->pool); + GST_VA_MEMORY_POOL_UNLOCK (&va_allocator->pool); + + if (!copy) { + copy = gst_va_allocator_alloc (mem->allocator); + if (!copy) { + GST_WARNING ("failed to allocate new memory"); + return NULL; + } + } else { + gst_object_ref (mem->allocator); + } + } + + va_copy = (GstVaMemory *) copy; + mem_size = gst_memory_get_sizes (mem, NULL, NULL); + + if (size == -1) + size = mem_size > offset ? 
mem_size - offset : 0; + + if (offset == 0 && size == mem_size) { + GstVaSurfaceCopy *copy_func; + + copy_func = _ensure_surface_copy (&va_allocator->copy, + va_allocator->display, &va_allocator->info); + if (copy_func + && gst_va_surface_copy (copy_func, va_copy->surface, va_mem->surface)) + return copy; + } + + if (!gst_memory_map (mem, &sinfo, GST_MAP_READ)) { + GST_WARNING ("failed to map memory to copy"); + return NULL; + } + + if (!gst_memory_map (copy, &dinfo, GST_MAP_WRITE)) { + GST_WARNING ("could not write map memory %p", copy); + gst_allocator_free (mem->allocator, copy); + gst_memory_unmap (mem, &sinfo); + return NULL; + } + + memcpy (dinfo.data, sinfo.data + offset, size); + gst_memory_unmap (copy, &dinfo); + gst_memory_unmap (mem, &sinfo); + + return copy; } static void @@ -851,68 +1340,65 @@ allocator->mem_type = GST_ALLOCATOR_VASURFACE; allocator->mem_map = (GstMemoryMapFunction) _va_map; allocator->mem_unmap = (GstMemoryUnmapFunction) _va_unmap; - allocator->mem_copy = (GstMemoryCopyFunction) _va_copy; allocator->mem_share = _va_share; - allocator->mem_is_span = _va_is_span; + allocator->mem_copy = _va_copy; - self->use_derived = TRUE; + gst_va_memory_pool_init (&self->pool); GST_OBJECT_FLAG_SET (self, GST_ALLOCATOR_FLAG_CUSTOM_ALLOC); } +static gboolean +gst_va_memory_release (GstMiniObject * mini_object) +{ + GstMemory *mem = GST_MEMORY_CAST (mini_object); + GstVaAllocator *self = GST_VA_ALLOCATOR (mem->allocator); + + GST_LOG ("releasing %p: surface %#x", mem, gst_va_memory_get_surface (mem)); + + gst_va_memory_pool_push (&self->pool, mem); + + /* Keep last in case we are holding on the last allocator ref */ + gst_object_unref (mem->allocator); + + /* don't call mini_object's free */ + return FALSE; +} + GstMemory * -gst_va_allocator_alloc (GstAllocator * allocator, - GstVaAllocationParams * params) +gst_va_allocator_alloc (GstAllocator * allocator) { GstVaAllocator *self; GstVaMemory *mem; - GstVideoFormat format; - VAImage image = { 0, }; 
VASurfaceID surface; - guint32 fourcc, rt_format; g_return_val_if_fail (GST_IS_VA_ALLOCATOR (allocator), NULL); self = GST_VA_ALLOCATOR (allocator); - format = - gst_va_video_surface_format_from_image_format (GST_VIDEO_INFO_FORMAT - (¶ms->info), self->surface_formats); - if (format == GST_VIDEO_FORMAT_UNKNOWN) { - GST_ERROR_OBJECT (allocator, "Unsupported format: %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (¶ms->info))); - return NULL; - } - - fourcc = gst_va_fourcc_from_video_format (format); - rt_format = gst_va_chroma_from_video_format (format); - if (fourcc == 0 || rt_format == 0) { - GST_ERROR_OBJECT (allocator, "Unsupported format: %s", - gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (¶ms->info))); + if (self->rt_format == 0) { + GST_ERROR_OBJECT (self, "Unknown fourcc or chroma format"); return NULL; } - if (!_create_surfaces (self->display, rt_format, fourcc, - GST_VIDEO_INFO_WIDTH (¶ms->info), - GST_VIDEO_INFO_HEIGHT (¶ms->info), params->usage_hint, &surface, - 1)) - return NULL; - - image.image_id = VA_INVALID_ID; - if (!_ensure_image (self->display, surface, ¶ms->info, &image, NULL)) + if (!va_create_surfaces (self->display, self->rt_format, self->fourcc, + GST_VIDEO_INFO_WIDTH (&self->info), + GST_VIDEO_INFO_HEIGHT (&self->info), self->usage_hint, NULL, + &surface, 1)) return NULL; - _destroy_image (self->display, image.image_id); mem = g_slice_new (GstVaMemory); mem->surface = surface; - mem->surface_format = format; - mem->info = params->info; + mem->surface_format = self->surface_format; + + _reset_mem (mem, allocator, GST_VIDEO_INFO_SIZE (&self->info)); - _reset_mem (mem, allocator, GST_VIDEO_INFO_SIZE (¶ms->info)); + GST_MINI_OBJECT (mem)->dispose = gst_va_memory_release; + gst_va_memory_pool_surface_inc (&self->pool); GST_LOG_OBJECT (self, "Created surface %#x [%dx%d]", mem->surface, - GST_VIDEO_INFO_WIDTH (&mem->info), GST_VIDEO_INFO_HEIGHT (&mem->info)); + GST_VIDEO_INFO_WIDTH (&self->info), GST_VIDEO_INFO_HEIGHT (&self->info)); 
return GST_MEMORY_CAST (mem); } @@ -933,55 +1419,287 @@ } gboolean -gst_va_allocator_try (GstAllocator * allocator, GstVaAllocationParams * params) +gst_va_allocator_setup_buffer (GstAllocator * allocator, GstBuffer * buffer) +{ + GstMemory *mem = gst_va_allocator_alloc (allocator); + if (!mem) + return FALSE; + + gst_buffer_append_memory (buffer, mem); + return TRUE; +} + +static VASurfaceID +gst_va_allocator_prepare_buffer_unlocked (GstVaAllocator * self, + GstBuffer * buffer) { GstMemory *mem; + VASurfaceID surface; - mem = gst_va_allocator_alloc (allocator, params); + mem = gst_va_memory_pool_pop (&self->pool); if (!mem) + return VA_INVALID_ID; + + gst_object_ref (mem->allocator); + surface = gst_va_memory_get_surface (mem); + gst_buffer_append_memory (buffer, mem); + + GST_LOG ("buffer %p: memory %p - surface %#x", buffer, mem, surface); + + return surface; +} + +gboolean +gst_va_allocator_prepare_buffer (GstAllocator * allocator, GstBuffer * buffer) +{ + GstVaAllocator *self = GST_VA_ALLOCATOR (allocator); + VASurfaceID surface; + + GST_VA_MEMORY_POOL_LOCK (&self->pool); + surface = gst_va_allocator_prepare_buffer_unlocked (self, buffer); + GST_VA_MEMORY_POOL_UNLOCK (&self->pool); + + return (surface != VA_INVALID_ID); +} + +void +gst_va_allocator_flush (GstAllocator * allocator) +{ + GstVaAllocator *self = GST_VA_ALLOCATOR (allocator); + + gst_va_memory_pool_flush (&self->pool, self->display); +} + +static gboolean +gst_va_allocator_try (GstAllocator * allocator) +{ + GstVaAllocator *self = GST_VA_ALLOCATOR (allocator); + + self->fourcc = 0; + self->rt_format = 0; + self->use_derived = FALSE; + self->img_format = GST_VIDEO_INFO_FORMAT (&self->info); + + self->surface_format = + gst_va_video_surface_format_from_image_format (self->img_format, + self->surface_formats); + if (self->surface_format == GST_VIDEO_FORMAT_UNKNOWN) { + /* try a surface without fourcc but rt_format only */ + self->fourcc = 0; + self->rt_format = gst_va_chroma_from_video_format 
(self->img_format); + } else { + self->fourcc = gst_va_fourcc_from_video_format (self->surface_format); + self->rt_format = gst_va_chroma_from_video_format (self->surface_format); + } + + if (self->rt_format == 0) { + GST_ERROR_OBJECT (allocator, "Unsupported image format: %s", + gst_video_format_to_string (self->img_format)); + return FALSE; + } + + if (!_update_image_info (self)) { + GST_ERROR_OBJECT (allocator, "Failed to update allocator info"); + return FALSE; + } + + GST_INFO_OBJECT (self, + "va allocator info, surface format: %s, image format: %s, " + "use derived: %s, rt format: 0x%x, fourcc: %" GST_FOURCC_FORMAT, + (self->surface_format == GST_VIDEO_FORMAT_UNKNOWN) ? "unknown" + : gst_video_format_to_string (self->surface_format), + gst_video_format_to_string (self->img_format), + self->use_derived ? "true" : "false", self->rt_format, + GST_FOURCC_ARGS (self->fourcc)); + return TRUE; +} + +gboolean +gst_va_allocator_set_format (GstAllocator * allocator, GstVideoInfo * info, + guint usage_hint) +{ + GstVaAllocator *self; + gboolean ret; + + g_return_val_if_fail (GST_IS_VA_ALLOCATOR (allocator), FALSE); + g_return_val_if_fail (info, FALSE); + + self = GST_VA_ALLOCATOR (allocator); + + if (gst_va_memory_pool_surface_count (&self->pool) != 0) { + if (GST_VIDEO_INFO_FORMAT (info) == GST_VIDEO_INFO_FORMAT (&self->info) + && GST_VIDEO_INFO_WIDTH (info) == GST_VIDEO_INFO_WIDTH (&self->info) + && GST_VIDEO_INFO_HEIGHT (info) == GST_VIDEO_INFO_HEIGHT (&self->info) + && usage_hint == self->usage_hint) { + *info = self->info; /* update callee info (offset & stride) */ + return TRUE; + } + return FALSE; + } + + self->usage_hint = usage_hint; + self->info = *info; + + g_clear_pointer (&self->copy, gst_va_surface_copy_free); + + ret = gst_va_allocator_try (allocator); + if (ret) + *info = self->info; + + return ret; +} + +gboolean +gst_va_allocator_get_format (GstAllocator * allocator, GstVideoInfo * info, + guint * usage_hint) +{ + GstVaAllocator *self = 
GST_VA_ALLOCATOR (allocator); + + if (GST_VIDEO_INFO_FORMAT (&self->info) == GST_VIDEO_FORMAT_UNKNOWN) return FALSE; - gst_memory_unref (mem); + + if (info) + *info = self->info; + if (usage_hint) + *usage_hint = self->usage_hint; + return TRUE; } /*============ Utilities =====================================================*/ VASurfaceID -gst_va_memory_get_surface (GstMemory * mem, GstVideoInfo * info) +gst_va_memory_get_surface (GstMemory * mem) { VASurfaceID surface = VA_INVALID_ID; if (!mem->allocator) return VA_INVALID_ID; - if (GST_IS_VA_DMABUF_ALLOCATOR (mem->allocator)) { + if (GST_IS_DMABUF_ALLOCATOR (mem->allocator)) { GstVaBufferSurface *buf; buf = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem), gst_va_buffer_surface_quark ()); - if (buf) { - if (info) - *info = buf->info; + if (buf) surface = buf->surface; - } } else if (GST_IS_VA_ALLOCATOR (mem->allocator)) { GstVaMemory *va_mem = (GstVaMemory *) mem; surface = va_mem->surface; - if (info) - *info = va_mem->info; } return surface; } VASurfaceID -gst_va_buffer_get_surface (GstBuffer * buffer, GstVideoInfo * info) +gst_va_buffer_get_surface (GstBuffer * buffer) +{ + GstMemory *mem; + + mem = gst_buffer_peek_memory (buffer, 0); + if (!mem) + return VA_INVALID_ID; + + return gst_va_memory_get_surface (mem); +} + +gboolean +gst_va_buffer_create_aux_surface (GstBuffer * buffer) +{ + GstMemory *mem; + VASurfaceID surface = VA_INVALID_ID; + GstVaDisplay *display = NULL; + GstVideoFormat format; + gint width, height; + GstVaBufferSurface *surface_buffer; + + mem = gst_buffer_peek_memory (buffer, 0); + if (!mem) + return FALSE; + + /* Already created it. 
*/ + surface_buffer = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem), + gst_va_buffer_aux_surface_quark ()); + if (surface_buffer) + return TRUE; + + if (!mem->allocator) + return FALSE; + + if (GST_IS_VA_DMABUF_ALLOCATOR (mem->allocator)) { + GstVaDmabufAllocator *self = GST_VA_DMABUF_ALLOCATOR (mem->allocator); + guint32 fourcc, rt_format; + + format = GST_VIDEO_INFO_FORMAT (&self->info); + fourcc = gst_va_fourcc_from_video_format (format); + rt_format = gst_va_chroma_from_video_format (format); + if (fourcc == 0 || rt_format == 0) { + GST_ERROR_OBJECT (self, "Unsupported format: %s", + gst_video_format_to_string (GST_VIDEO_INFO_FORMAT (&self->info))); + return FALSE; + } + + display = self->display; + width = GST_VIDEO_INFO_WIDTH (&self->info); + height = GST_VIDEO_INFO_HEIGHT (&self->info); + if (!va_create_surfaces (self->display, rt_format, fourcc, + GST_VIDEO_INFO_WIDTH (&self->info), + GST_VIDEO_INFO_HEIGHT (&self->info), self->usage_hint, NULL, + &surface, 1)) + return FALSE; + } else if (GST_IS_VA_ALLOCATOR (mem->allocator)) { + GstVaAllocator *self = GST_VA_ALLOCATOR (mem->allocator); + + if (self->rt_format == 0) { + GST_ERROR_OBJECT (self, "Unknown fourcc or chroma format"); + return FALSE; + } + + display = self->display; + width = GST_VIDEO_INFO_WIDTH (&self->info); + height = GST_VIDEO_INFO_HEIGHT (&self->info); + format = GST_VIDEO_INFO_FORMAT (&self->info); + if (!va_create_surfaces (self->display, self->rt_format, self->fourcc, + GST_VIDEO_INFO_WIDTH (&self->info), + GST_VIDEO_INFO_HEIGHT (&self->info), self->usage_hint, NULL, + &surface, 1)) + return FALSE; + } else { + g_assert_not_reached (); + } + + if (!display || surface == VA_INVALID_ID) + return FALSE; + + surface_buffer = gst_va_buffer_surface_new (surface, format, width, height); + surface_buffer->display = gst_object_ref (display); + g_atomic_int_add (&surface_buffer->ref_count, 1); + + gst_mini_object_set_qdata (GST_MINI_OBJECT (mem), + gst_va_buffer_aux_surface_quark (), 
surface_buffer, + gst_va_buffer_surface_unref); + + return TRUE; +} + +VASurfaceID +gst_va_buffer_get_aux_surface (GstBuffer * buffer) { + GstVaBufferSurface *surface_buffer; GstMemory *mem; mem = gst_buffer_peek_memory (buffer, 0); if (!mem) return VA_INVALID_ID; - return gst_va_memory_get_surface (mem, info); + surface_buffer = gst_mini_object_get_qdata (GST_MINI_OBJECT (mem), + gst_va_buffer_aux_surface_quark ()); + if (!surface_buffer) + return VA_INVALID_ID; + + /* No one increments it, and its lifetime is the same as the + gstmemory itself */ + g_assert (g_atomic_int_get (&surface_buffer->ref_count) == 1); + + return surface_buffer->surface; }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvaallocator.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvaallocator.h
Changed
@@ -21,29 +21,37 @@ #pragma once #include <gst/allocators/allocators.h> +#include <gst/va/gstvadisplay.h> #include <gst/video/video.h> - -#include "gstvadisplay.h" +#include <stdint.h> +#include <va/va.h> G_BEGIN_DECLS -typedef struct _GstVaAllocationParams GstVaAllocationParams; -struct _GstVaAllocationParams -{ - GstVideoInfo info; - guint32 usage_hint; -}; - #define GST_TYPE_VA_DMABUF_ALLOCATOR (gst_va_dmabuf_allocator_get_type()) G_DECLARE_FINAL_TYPE (GstVaDmabufAllocator, gst_va_dmabuf_allocator, GST, VA_DMABUF_ALLOCATOR, GstDmaBufAllocator); GstAllocator * gst_va_dmabuf_allocator_new (GstVaDisplay * display); -gboolean gst_va_dmabuf_setup_buffer (GstAllocator * allocator, - GstBuffer * buffer, - GstVaAllocationParams * params); -gboolean gst_va_dmabuf_try (GstAllocator * allocator, - GstVaAllocationParams * params); +gboolean gst_va_dmabuf_allocator_setup_buffer (GstAllocator * allocator, + GstBuffer * buffer); +gboolean gst_va_dmabuf_allocator_prepare_buffer (GstAllocator * allocator, + GstBuffer * buffer); +void gst_va_dmabuf_allocator_flush (GstAllocator * allocator); +gboolean gst_va_dmabuf_allocator_set_format (GstAllocator * allocator, + GstVideoInfo * info, + guint usage_hint); +gboolean gst_va_dmabuf_allocator_get_format (GstAllocator * allocator, + GstVideoInfo * info, + guint * usage_hint); + +gboolean gst_va_dmabuf_memories_setup (GstVaDisplay * display, + GstVideoInfo * info, + guint n_planes, + GstMemory * mem[GST_VIDEO_MAX_PLANES], + uintptr_t * fds, + gsize offset[GST_VIDEO_MAX_PLANES], + guint usage_hint); #define GST_TYPE_VA_ALLOCATOR (gst_va_allocator_get_type()) G_DECLARE_FINAL_TYPE (GstVaAllocator, gst_va_allocator, GST, VA_ALLOCATOR, GstAllocator); @@ -54,14 +62,23 @@ GstAllocator * gst_va_allocator_new (GstVaDisplay * display, GArray * surface_formats); -GstMemory * gst_va_allocator_alloc (GstAllocator * allocator, - GstVaAllocationParams * params); -gboolean gst_va_allocator_try (GstAllocator * allocator, - GstVaAllocationParams * 
params); +GstMemory * gst_va_allocator_alloc (GstAllocator * allocator); +gboolean gst_va_allocator_setup_buffer (GstAllocator * allocator, + GstBuffer * buffer); +gboolean gst_va_allocator_prepare_buffer (GstAllocator * allocator, + GstBuffer * buffer); +void gst_va_allocator_flush (GstAllocator * allocator); +gboolean gst_va_allocator_set_format (GstAllocator * allocator, + GstVideoInfo * info, + guint usage_hint); +gboolean gst_va_allocator_get_format (GstAllocator * allocator, + GstVideoInfo * info, + guint * usage_hint); + +VASurfaceID gst_va_memory_get_surface (GstMemory * mem); +VASurfaceID gst_va_buffer_get_surface (GstBuffer * buffer); -VASurfaceID gst_va_memory_get_surface (GstMemory * mem, - GstVideoInfo * info); -VASurfaceID gst_va_buffer_get_surface (GstBuffer * buffer, - GstVideoInfo * info); +gboolean gst_va_buffer_create_aux_surface (GstBuffer * buffer); +VASurfaceID gst_va_buffer_get_aux_surface (GstBuffer * buffer); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvaav1dec.c
Added
@@ -0,0 +1,1017 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vaav1dec + * @title: vaav1dec + * @short_description: A VA-API based AV1 video decoder + * + * vaav1dec decodes AV1 bitstreams to VA surfaces using the + * installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver. + * + * The decoding surfaces can be mapped onto main memory as video + * frames. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.av1 ! ivfparse ! av1parse ! vaav1dec !
autovideosink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/codecs/gstav1decoder.h> +#include "gstvaav1dec.h" +#include "gstvabasedec.h" +#include "gstvaallocator.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_av1dec_debug); +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT gst_va_av1dec_debug +#else +#define GST_CAT_DEFAULT NULL +#endif + +#define GST_VA_AV1_DEC(obj) ((GstVaAV1Dec *) obj) +#define GST_VA_AV1_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaAV1DecClass)) +#define GST_VA_AV1_DEC_CLASS(klass) ((GstVaAV1DecClass *) klass) + +typedef struct _GstVaAV1Dec GstVaAV1Dec; +typedef struct _GstVaAV1DecClass GstVaAV1DecClass; + +struct _GstVaAV1DecClass +{ + GstVaBaseDecClass parent_class; +}; + +struct _GstVaAV1Dec +{ + GstVaBaseDec parent; + + GstFlowReturn last_ret; + + GstAV1SequenceHeaderOBU seq; + gint max_width; + gint max_height; +}; + +static GstElementClass *parent_class = NULL; + +/* *INDENT-OFF* */ +static const gchar *src_caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12, P010_10LE }") " ;" + GST_VIDEO_CAPS_MAKE ("{ NV12, P010_10LE }"); +/* *INDENT-ON* */ + +static const gchar *sink_caps_str = "video/x-av1"; + +static gboolean +gst_va_av1_dec_negotiate (GstVideoDecoder * decoder) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstAV1Decoder *av1dec = GST_AV1_DECODER (decoder); + GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN; + GstCapsFeatures *capsfeatures = NULL; + + /* Ignore downstream renegotiation request. 
*/ + if (!base->need_negotiation) + return TRUE; + + base->need_negotiation = FALSE; + + /* Do not re-create the context if only the frame size changes */ + if (!gst_va_decoder_config_is_equal (base->decoder, base->profile, + base->rt_format, self->max_width, self->max_height)) { + if (gst_va_decoder_is_open (base->decoder) + && !gst_va_decoder_close (base->decoder)) + return FALSE; + + if (!gst_va_decoder_open (base->decoder, base->profile, base->rt_format)) + return FALSE; + + if (!gst_va_decoder_set_frame_size (base->decoder, self->max_width, + self->max_height)) + return FALSE; + } + + if (base->output_state) + gst_video_codec_state_unref (base->output_state); + + gst_va_base_dec_get_preferred_format_and_caps_features (base, &format, + &capsfeatures); + + base->output_state = gst_video_decoder_set_output_state (decoder, format, + base->width, base->height, av1dec->input_state); + + base->output_state->caps = gst_video_info_to_caps (&base->output_state->info); + if (capsfeatures) + gst_caps_set_features_simple (base->output_state->caps, capsfeatures); + + GST_INFO_OBJECT (self, "Negotiated caps %" GST_PTR_FORMAT, + base->output_state->caps); + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static GstCaps * +_complete_sink_caps (GstCaps * sinkcaps) +{ + GstCaps *caps = gst_caps_copy (sinkcaps); + GValue val = G_VALUE_INIT; + + g_value_init (&val, G_TYPE_STRING); + g_value_set_string (&val, "frame"); + gst_caps_set_value (caps, "alignment", &val); + g_value_unset (&val); + + return caps; +} + +static VAProfile +_get_profile (GstVaAV1Dec * self, const GstAV1SequenceHeaderOBU * seq_hdr) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (self); + VAProfile profile = VAProfileNone; + + switch (seq_hdr->seq_profile) { + case GST_AV1_PROFILE_0: + profile = VAProfileAV1Profile0; + break; + case GST_AV1_PROFILE_1: + profile = VAProfileAV1Profile1; + break; + default: + GST_ERROR_OBJECT (self, "Unsupported av1 profile value %d", + 
seq_hdr->seq_profile); + return VAProfileNone; + } + + if (!gst_va_decoder_has_profile (base->decoder, profile)) { + GST_ERROR_OBJECT (self, "Profile %s is not supported by HW", + gst_va_profile_name (profile)); + return VAProfileNone; + } + + return profile; +} + +static guint +_get_rtformat (GstVaAV1Dec * self, VAProfile profile, + const GstAV1SequenceHeaderOBU * seq_header) +{ + /* 6.4.1: + seq_profile Bit depth Monochrome support Chroma subsampling + 0 8 or 10 Yes YUV 4:2:0 + 1 8 or 10 No YUV 4:4:4 + 2 8 or 10 Yes YUV 4:2:2 + 2 12 Yes YUV 4:2:0,YUV 4:2:2,YUV 4:4:4 + */ + + /* TODO: consider Monochrome case. Just return 4:2:0 for Monochrome now. */ + switch (profile) { + case VAProfileAV1Profile0: + if (seq_header->bit_depth == 8) { + return VA_RT_FORMAT_YUV420; + } else if (seq_header->bit_depth == 10) { + return VA_RT_FORMAT_YUV420_10; + } + break; + case VAProfileAV1Profile1: + if (seq_header->bit_depth == 8) { + return VA_RT_FORMAT_YUV444; + } else if (seq_header->bit_depth == 10) { + return VA_RT_FORMAT_YUV444_10; + } + break; + default: + break; + } + + GST_ERROR_OBJECT (self, "Failed to find rtformat for profile:%s, bit_depth:%d", + gst_va_profile_name (profile), seq_header->bit_depth); + return 0; +} + +static GstCaps * +gst_va_av1_dec_getcaps (GstVideoDecoder * decoder, GstCaps * filter) +{ + GstCaps *sinkcaps, *caps = NULL, *tmp; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + if (base->decoder) + caps = gst_va_decoder_get_sinkpad_caps (base->decoder); + + if (caps) { + sinkcaps = _complete_sink_caps (caps); + gst_caps_unref (caps); + if (filter) { + tmp = gst_caps_intersect_full (filter, sinkcaps, + GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (sinkcaps); + caps = tmp; + } else { + caps = sinkcaps; + } + GST_LOG_OBJECT (base, "Returning caps %" GST_PTR_FORMAT, caps); + } else if (!caps) { + caps = gst_video_decoder_proxy_getcaps (decoder, NULL, filter); + } + + return caps; +} + +static GstFlowReturn +gst_va_av1_dec_new_sequence (GstAV1Decoder *
decoder, + const GstAV1SequenceHeaderOBU * seq_hdr) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + VAProfile profile; + guint rt_format; + + GST_LOG_OBJECT (self, "new sequence"); + + profile = _get_profile (self, seq_hdr); + if (profile == VAProfileNone) + return GST_FLOW_NOT_NEGOTIATED; + + rt_format = _get_rtformat (self, profile, seq_hdr); + if (!rt_format) + return GST_FLOW_NOT_NEGOTIATED; + + self->seq = *seq_hdr; + + if (!gst_va_decoder_config_is_equal (base->decoder, profile, + rt_format, seq_hdr->max_frame_width_minus_1 + 1, + seq_hdr->max_frame_height_minus_1 + 1)) { + base->profile = profile; + base->rt_format = rt_format; + self->max_width = seq_hdr->max_frame_width_minus_1 + 1; + self->max_height = seq_hdr->max_frame_height_minus_1 + 1; + base->need_negotiation = TRUE; + + base->min_buffers = 7 + 4; /* dpb size + scratch surfaces */ + + /* May be changed by frame header */ + base->width = self->max_width; + base->height = self->max_height; + base->need_valign = FALSE; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_av1_dec_new_picture (GstAV1Decoder * decoder, + GstVideoCodecFrame * frame, GstAV1Picture * picture) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *pic; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstAV1FrameHeaderOBU *frame_hdr = &picture->frame_hdr; + + if (frame_hdr->upscaled_width != base->width + || frame_hdr->frame_height != base->height) { + base->width = frame_hdr->upscaled_width; + base->height = frame_hdr->frame_height; + + if (base->width < self->max_width || base->height < self->max_height) { + base->need_valign = TRUE; + /* *INDENT-OFF* */ + base->valign = (GstVideoAlignment){ + .padding_bottom = self->max_height - base->height, + .padding_right = self->max_width - base->width, + }; + /* *INDENT-ON* */ + } + + base->need_negotiation = TRUE; + } + + if 
(base->need_negotiation) { + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + self->last_ret = gst_video_decoder_allocate_output_frame (vdec, frame); + if (self->last_ret != GST_FLOW_OK) { + GST_WARNING_OBJECT (self, + "Failed to allocate output buffer, return %s", + gst_flow_get_name (self->last_ret)); + return self->last_ret; + } + + if (picture->apply_grain) { + if (!gst_va_buffer_create_aux_surface (frame->output_buffer)) { + GST_WARNING_OBJECT (self, + "Failed to allocate aux surface for buffer %p", + frame->output_buffer); + return GST_FLOW_ERROR; + } + } + + pic = gst_va_decode_picture_new (base->decoder, frame->output_buffer); + + gst_av1_picture_set_user_data (picture, pic, + (GDestroyNotify) gst_va_decode_picture_free); + + if (picture->apply_grain) { + GST_LOG_OBJECT (self, "New va decode picture %p - %#x(aux: %#x)", pic, + gst_va_decode_picture_get_surface (pic), + gst_va_decode_picture_get_aux_surface (pic)); + } else { + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, + gst_va_decode_picture_get_surface (pic)); + } + + return GST_FLOW_OK; +} + +static GstAV1Picture * +gst_va_av1_dec_duplicate_picture (GstAV1Decoder * decoder, + GstAV1Picture * picture) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *pic; + GstVaDecodePicture *new_pic; + GstAV1Picture *new_picture; + + pic = gst_av1_picture_get_user_data (picture); + if (!pic) { + GST_ERROR_OBJECT (self, "Parent picture does not have a va picture"); + return NULL; + } + + new_picture = gst_av1_picture_new (); + g_assert (pic->gstbuffer); + new_pic = gst_va_decode_picture_new (base->decoder, pic->gstbuffer); + + GST_LOG_OBJECT (self, "Duplicate output with buffer %" GST_PTR_FORMAT + " (surface %#x)", pic, gst_va_decode_picture_get_surface (pic)); + + gst_av1_picture_set_user_data
(new_picture, new_pic, + (GDestroyNotify) gst_va_decode_picture_free); + + return new_picture; +} + +static void +_setup_segment_info (VADecPictureParameterBufferAV1 * pic_param, + GstAV1FrameHeaderOBU * frame_header) +{ + guint i, j; + uint8_t feature_mask; + + for (i = 0; i < GST_AV1_MAX_SEGMENTS; i++) + for (j = 0; j < GST_AV1_SEG_LVL_MAX; j++) + pic_param->seg_info.feature_data[i][j] = + frame_header->segmentation_params.feature_data[i][j]; + + for (i = 0; i < GST_AV1_MAX_SEGMENTS; i++) { + feature_mask = 0; + for (j = 0; j < GST_AV1_SEG_LVL_MAX; j++) { + if (frame_header->segmentation_params.feature_enabled[i][j]) + feature_mask |= 1 << j; + } + pic_param->seg_info.feature_mask[i] = feature_mask; + } +} + +static void +_setup_film_grain_info (VADecPictureParameterBufferAV1 * pic_param, + GstAV1FrameHeaderOBU * frame_header) +{ + guint i; + + if (!frame_header->film_grain_params.apply_grain) + return; + + pic_param->film_grain_info.num_y_points = + frame_header->film_grain_params.num_y_points; + for (i = 0; i < frame_header->film_grain_params.num_y_points; i++) { + pic_param->film_grain_info.point_y_value[i] = + frame_header->film_grain_params.point_y_value[i]; + pic_param->film_grain_info.point_y_scaling[i] = + frame_header->film_grain_params.point_y_scaling[i]; + } + + pic_param->film_grain_info.num_cb_points = + frame_header->film_grain_params.num_cb_points; + for (i = 0; i < frame_header->film_grain_params.num_cb_points; i++) { + pic_param->film_grain_info.point_cb_value[i] = + frame_header->film_grain_params.point_cb_value[i]; + pic_param->film_grain_info.point_cb_scaling[i] = + frame_header->film_grain_params.point_cb_scaling[i]; + } + + pic_param->film_grain_info.num_cr_points = + frame_header->film_grain_params.num_cr_points; + for (i = 0; i < frame_header->film_grain_params.num_cr_points; i++) { + pic_param->film_grain_info.point_cr_value[i] = + frame_header->film_grain_params.point_cr_value[i]; + pic_param->film_grain_info.point_cr_scaling[i] = + 
frame_header->film_grain_params.point_cr_scaling[i]; + } + + + if (pic_param->film_grain_info.num_y_points) { + for (i = 0; i < 24; i++) { + pic_param->film_grain_info.ar_coeffs_y[i] = + frame_header->film_grain_params.ar_coeffs_y_plus_128[i] - 128; + } + } + if (frame_header->film_grain_params.chroma_scaling_from_luma + || pic_param->film_grain_info.num_cb_points) { + for (i = 0; i < GST_AV1_MAX_NUM_POS_LUMA; i++) { + pic_param->film_grain_info.ar_coeffs_cb[i] = + frame_header->film_grain_params.ar_coeffs_cb_plus_128[i] - 128; + } + } + if (frame_header->film_grain_params.chroma_scaling_from_luma + || pic_param->film_grain_info.num_cr_points) { + for (i = 0; i < GST_AV1_MAX_NUM_POS_LUMA; i++) { + pic_param->film_grain_info.ar_coeffs_cr[i] = + frame_header->film_grain_params.ar_coeffs_cr_plus_128[i] - 128; + } + } +} + +static void +_setup_loop_filter_info (VADecPictureParameterBufferAV1 * pic_param, + GstAV1FrameHeaderOBU * frame_header) +{ + guint i; + + pic_param->filter_level[0] = + frame_header->loop_filter_params.loop_filter_level[0]; + pic_param->filter_level[1] = + frame_header->loop_filter_params.loop_filter_level[1]; + pic_param->filter_level_u = + frame_header->loop_filter_params.loop_filter_level[2]; + pic_param->filter_level_v = + frame_header->loop_filter_params.loop_filter_level[3]; + + for (i = 0; i < GST_AV1_TOTAL_REFS_PER_FRAME; i++) + pic_param->ref_deltas[i] = + frame_header->loop_filter_params.loop_filter_ref_deltas[i]; + for (i = 0; i < 2; i++) + pic_param->mode_deltas[i] = + frame_header->loop_filter_params.loop_filter_mode_deltas[i]; +} + +static void +_setup_quantization_info (VADecPictureParameterBufferAV1 * pic_param, + GstAV1FrameHeaderOBU * frame_header) +{ + pic_param->qmatrix_fields.bits.using_qmatrix = + frame_header->quantization_params.using_qmatrix; + if (frame_header->quantization_params.using_qmatrix) { + pic_param->qmatrix_fields.bits.qm_y = + frame_header->quantization_params.qm_y; + pic_param->qmatrix_fields.bits.qm_u = + 
frame_header->quantization_params.qm_u;
+    pic_param->qmatrix_fields.bits.qm_v =
+        frame_header->quantization_params.qm_v;
+  } else {
+    pic_param->qmatrix_fields.bits.qm_y = 0;
+    pic_param->qmatrix_fields.bits.qm_u = 0;
+    pic_param->qmatrix_fields.bits.qm_v = 0;
+  }
+}
+
+static void
+_setup_cdef_info (VADecPictureParameterBufferAV1 * pic_param,
+    GstAV1FrameHeaderOBU * frame_header, guint8 num_planes)
+{
+  guint8 sec_strength;
+  guint i;
+
+  pic_param->cdef_damping_minus_3 = frame_header->cdef_params.cdef_damping - 3;
+  pic_param->cdef_bits = frame_header->cdef_params.cdef_bits;
+  for (i = 0; i < GST_AV1_CDEF_MAX; i++) {
+    sec_strength = frame_header->cdef_params.cdef_y_sec_strength[i];
+    g_assert (sec_strength <= 4);
+    /* may need to subtract 1 in order to merge with the primary value. */
+    if (sec_strength == 4)
+      sec_strength--;
+
+    pic_param->cdef_y_strengths[i] =
+        ((frame_header->cdef_params.cdef_y_pri_strength[i] & 0xf) << 2) |
+        (sec_strength & 0x03);
+  }
+  if (num_planes > 1) {
+    for (i = 0; i < GST_AV1_CDEF_MAX; i++) {
+      sec_strength = frame_header->cdef_params.cdef_uv_sec_strength[i];
+      g_assert (sec_strength <= 4);
+      /* may need to subtract 1 in order to merge with the primary value.
*/ + if (sec_strength == 4) + sec_strength--; + + pic_param->cdef_uv_strengths[i] = + ((frame_header->cdef_params.cdef_uv_pri_strength[i] & 0xf) << 2) | + (sec_strength & 0x03); + } + } else { + for (i = 0; i < GST_AV1_CDEF_MAX; i++) { + pic_param->cdef_uv_strengths[i] = 0; + } + } +} + +static void +_setup_global_motion_info (VADecPictureParameterBufferAV1 * pic_param, + GstAV1FrameHeaderOBU * frame_header) +{ + guint i, j; + + for (i = 0; i < 7; i++) { + /* assuming VAAV1TransformationType and GstAV1WarpModelType are + * equivalent */ + pic_param->wm[i].wmtype = (VAAV1TransformationType) + frame_header->global_motion_params.gm_type[GST_AV1_REF_LAST_FRAME + i]; + + for (j = 0; j < 6; j++) + pic_param->wm[i].wmmat[j] = + frame_header->global_motion_params.gm_params + [GST_AV1_REF_LAST_FRAME + i][j]; + + pic_param->wm[i].wmmat[6] = 0; + pic_param->wm[i].wmmat[7] = 0; + + pic_param->wm[i].invalid = + frame_header->global_motion_params.invalid[GST_AV1_REF_LAST_FRAME + i]; + } +} + +static GstFlowReturn +gst_va_av1_dec_start_picture (GstAV1Decoder * decoder, GstAV1Picture * picture, + GstAV1Dpb * dpb) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstAV1FrameHeaderOBU *frame_header = &picture->frame_hdr; + GstAV1SequenceHeaderOBU *seq_header = &self->seq; + VADecPictureParameterBufferAV1 pic_param = { }; + GstVaDecodePicture *va_pic; + guint i; + + va_pic = gst_av1_picture_get_user_data (picture); + g_assert (va_pic); + + /* *INDENT-OFF* */ + pic_param = (VADecPictureParameterBufferAV1){ + .profile = seq_header->seq_profile, + .order_hint_bits_minus_1 = seq_header->order_hint_bits_minus_1, + .matrix_coefficients = seq_header->color_config.matrix_coefficients, + .seq_info_fields.fields = { + .still_picture = seq_header->still_picture, + .use_128x128_superblock = seq_header->use_128x128_superblock, + .enable_filter_intra = seq_header->enable_filter_intra, + .enable_intra_edge_filter = 
seq_header->enable_intra_edge_filter, + .enable_interintra_compound = seq_header->enable_interintra_compound, + .enable_masked_compound = seq_header->enable_masked_compound, + .enable_dual_filter = seq_header->enable_dual_filter, + .enable_order_hint = seq_header->enable_order_hint, + .enable_jnt_comp = seq_header->enable_jnt_comp, + .enable_cdef = seq_header->enable_cdef, + .mono_chrome = seq_header->color_config.mono_chrome, + .color_range = seq_header->color_config.color_range, + .subsampling_x = seq_header->color_config.subsampling_x, + .subsampling_y = seq_header->color_config.subsampling_y, + .film_grain_params_present = seq_header->film_grain_params_present, + }, + .anchor_frames_num = 0, + .anchor_frames_list = NULL, + .frame_width_minus1 = frame_header->upscaled_width - 1, + .frame_height_minus1 = frame_header->frame_height - 1, + .output_frame_width_in_tiles_minus_1 = 0, + .output_frame_height_in_tiles_minus_1 = 0, + .order_hint = frame_header->order_hint, + /* Segmentation */ + .seg_info.segment_info_fields.bits = { + .enabled = frame_header->segmentation_params.segmentation_enabled, + .update_map = frame_header->segmentation_params.segmentation_update_map, + .temporal_update = + frame_header->segmentation_params.segmentation_temporal_update, + .update_data = + frame_header->segmentation_params.segmentation_update_data, + }, + /* FilmGrain */ + .film_grain_info = { + .film_grain_info_fields.bits = { + .apply_grain = frame_header->film_grain_params.apply_grain, + .chroma_scaling_from_luma = + frame_header->film_grain_params.chroma_scaling_from_luma, + .grain_scaling_minus_8 = + frame_header->film_grain_params.grain_scaling_minus_8, + .ar_coeff_lag = frame_header->film_grain_params.ar_coeff_lag, + .ar_coeff_shift_minus_6 = + frame_header->film_grain_params.ar_coeff_shift_minus_6, + .grain_scale_shift = frame_header->film_grain_params.grain_scale_shift, + .overlap_flag = frame_header->film_grain_params.overlap_flag, + .clip_to_restricted_range = + 
frame_header->film_grain_params.clip_to_restricted_range, + }, + .grain_seed = frame_header->film_grain_params.grain_seed, + .cb_mult = frame_header->film_grain_params.cb_mult, + .cb_luma_mult = frame_header->film_grain_params.cb_luma_mult, + .cb_offset = frame_header->film_grain_params.cb_offset, + .cr_mult = frame_header->film_grain_params.cr_mult, + .cr_luma_mult = frame_header->film_grain_params.cr_luma_mult, + .cr_offset = frame_header->film_grain_params.cr_offset, + }, + .tile_cols = frame_header->tile_info.tile_cols, + .tile_rows = frame_header->tile_info.tile_rows, + .context_update_tile_id = frame_header->tile_info.context_update_tile_id, + .pic_info_fields.bits = { + .frame_type = frame_header->frame_type, + .show_frame = frame_header->show_frame, + .showable_frame = frame_header->showable_frame, + .error_resilient_mode = frame_header->error_resilient_mode, + .disable_cdf_update = frame_header->disable_cdf_update, + .allow_screen_content_tools = frame_header->allow_screen_content_tools, + .force_integer_mv = frame_header->force_integer_mv, + .allow_intrabc = frame_header->allow_intrabc, + .use_superres = frame_header->use_superres, + .allow_high_precision_mv = frame_header->allow_high_precision_mv, + .is_motion_mode_switchable = frame_header->is_motion_mode_switchable, + .use_ref_frame_mvs = frame_header->use_ref_frame_mvs, + .disable_frame_end_update_cdf = + frame_header->disable_frame_end_update_cdf, + .uniform_tile_spacing_flag = + frame_header->tile_info.uniform_tile_spacing_flag, + .allow_warped_motion = frame_header->allow_warped_motion, + }, + .superres_scale_denominator = frame_header->superres_denom, + .interp_filter = frame_header->interpolation_filter, + /* loop filter */ + .loop_filter_info_fields.bits = { + .sharpness_level = + frame_header->loop_filter_params.loop_filter_sharpness, + .mode_ref_delta_enabled = + frame_header->loop_filter_params.loop_filter_delta_enabled, + .mode_ref_delta_update = + 
frame_header->loop_filter_params.loop_filter_delta_update, + }, + .mode_control_fields.bits = { + .delta_lf_present_flag = + frame_header->loop_filter_params.delta_lf_present, + .log2_delta_lf_res = frame_header->loop_filter_params.delta_lf_res, + .delta_lf_multi = frame_header->loop_filter_params.delta_lf_multi, + .delta_q_present_flag = + frame_header->quantization_params.delta_q_present, + .log2_delta_q_res = frame_header->quantization_params.delta_q_res, + .tx_mode = frame_header->tx_mode, + .reference_select = frame_header->reference_select, + .reduced_tx_set_used = frame_header->reduced_tx_set, + .skip_mode_present = frame_header->skip_mode_present, + }, + /* quantization */ + .base_qindex = frame_header->quantization_params.base_q_idx, + .y_dc_delta_q = frame_header->quantization_params.delta_q_y_dc, + .u_dc_delta_q = frame_header->quantization_params.delta_q_u_dc, + .u_ac_delta_q = frame_header->quantization_params.delta_q_u_ac, + .v_dc_delta_q = frame_header->quantization_params.delta_q_v_dc, + .v_ac_delta_q = frame_header->quantization_params.delta_q_v_ac, + /* loop restoration */ + .loop_restoration_fields.bits = { + .yframe_restoration_type = + frame_header->loop_restoration_params.frame_restoration_type[0], + .cbframe_restoration_type = + frame_header->loop_restoration_params.frame_restoration_type[1], + .crframe_restoration_type = + frame_header->loop_restoration_params.frame_restoration_type[2], + .lr_unit_shift = frame_header->loop_restoration_params.lr_unit_shift, + .lr_uv_shift = frame_header->loop_restoration_params.lr_uv_shift, + }, + }; + /* *INDENT-ON* */ + + if (seq_header->bit_depth == 8) { + pic_param.bit_depth_idx = 0; + } else if (seq_header->bit_depth == 10) { + pic_param.bit_depth_idx = 1; + } else if (seq_header->bit_depth == 12) { + pic_param.bit_depth_idx = 2; + } else { + g_assert_not_reached (); + } + + if (frame_header->film_grain_params.apply_grain) { + pic_param.current_frame = gst_va_decode_picture_get_aux_surface (va_pic); + 
pic_param.current_display_picture = + gst_va_decode_picture_get_surface (va_pic); + } else { + pic_param.current_frame = gst_va_decode_picture_get_surface (va_pic); + pic_param.current_display_picture = VA_INVALID_SURFACE; + } + + for (i = 0; i < GST_AV1_NUM_REF_FRAMES; i++) { + if (dpb->pic_list[i]) { + if (dpb->pic_list[i]->apply_grain) { + pic_param.ref_frame_map[i] = gst_va_decode_picture_get_aux_surface + (gst_av1_picture_get_user_data (dpb->pic_list[i])); + } else { + pic_param.ref_frame_map[i] = gst_va_decode_picture_get_surface + (gst_av1_picture_get_user_data (dpb->pic_list[i])); + } + } else { + pic_param.ref_frame_map[i] = VA_INVALID_SURFACE; + } + } + for (i = 0; i < GST_AV1_REFS_PER_FRAME; i++) { + pic_param.ref_frame_idx[i] = frame_header->ref_frame_idx[i]; + } + pic_param.primary_ref_frame = frame_header->primary_ref_frame; + + _setup_segment_info (&pic_param, frame_header); + _setup_film_grain_info (&pic_param, frame_header); + + for (i = 0; i < 63; i++) { + pic_param.width_in_sbs_minus_1[i] = + frame_header->tile_info.width_in_sbs_minus_1[i]; + pic_param.height_in_sbs_minus_1[i] = + frame_header->tile_info.height_in_sbs_minus_1[i]; + } + + _setup_loop_filter_info (&pic_param, frame_header); + _setup_quantization_info (&pic_param, frame_header); + _setup_cdef_info (&pic_param, frame_header, seq_header->num_planes); + _setup_global_motion_info (&pic_param, frame_header); + + if (!gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAPictureParameterBufferType, &pic_param, sizeof (pic_param))) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_av1_dec_decode_tile (GstAV1Decoder * decoder, GstAV1Picture * picture, + GstAV1Tile * tile) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstAV1TileGroupOBU *tile_group = &tile->tile_group; + GstVaDecodePicture *va_pic; + guint i; + VASliceParameterBufferAV1 slice_param[GST_AV1_MAX_TILE_COUNT]; + + GST_TRACE_OBJECT 
(self, "-"); + + for (i = 0; i < tile_group->tg_end - tile_group->tg_start + 1; i++) { + slice_param[i] = (VASliceParameterBufferAV1) { + }; + slice_param[i].slice_data_size = + tile_group->entry[tile_group->tg_start + i].tile_size; + slice_param[i].slice_data_offset = + tile_group->entry[tile_group->tg_start + i].tile_offset; + slice_param[i].tile_row = + tile_group->entry[tile_group->tg_start + i].tile_row; + slice_param[i].tile_column = + tile_group->entry[tile_group->tg_start + i].tile_col; + slice_param[i].slice_data_flag = 0; + } + + va_pic = gst_av1_picture_get_user_data (picture); + + if (!gst_va_decoder_add_slice_buffer_with_n_params (base->decoder, va_pic, + slice_param, sizeof (VASliceParameterBufferAV1), i, tile->obu.data, + tile->obu.obu_size)) { + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_av1_dec_end_picture (GstAV1Decoder * decoder, GstAV1Picture * picture) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *va_pic; + + GST_LOG_OBJECT (self, "end picture %p, (system_frame_number %d)", + picture, picture->system_frame_number); + + va_pic = gst_av1_picture_get_user_data (picture); + + if (!gst_va_decoder_decode_with_aux_surface (base->decoder, va_pic, + picture->apply_grain)) { + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_av1_dec_output_picture (GstAV1Decoder * decoder, + GstVideoCodecFrame * frame, GstAV1Picture * picture) +{ + GstVaAV1Dec *self = GST_VA_AV1_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + g_assert (picture->frame_hdr.show_frame || + picture->frame_hdr.show_existing_frame); + + GST_LOG_OBJECT (self, + "Outputting picture %p (system_frame_number %d)", + picture, picture->system_frame_number); + + if (self->last_ret != GST_FLOW_OK) { + gst_av1_picture_unref (picture); + gst_video_decoder_drop_frame (GST_VIDEO_DECODER (self), frame); + return self->last_ret; + } 
+ + if (picture->frame_hdr.show_existing_frame) { + GstVaDecodePicture *pic; + + g_assert (!frame->output_buffer); + pic = gst_av1_picture_get_user_data (picture); + frame->output_buffer = gst_buffer_ref (pic->gstbuffer); + } + + if (base->copy_frames) + gst_va_base_dec_copy_output_buffer (base, frame); + + gst_av1_picture_unref (picture); + + return gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); +} + +static void +gst_va_av1_dec_init (GTypeInstance * instance, gpointer g_class) +{ + gst_va_base_dec_init (GST_VA_BASE_DEC (instance), GST_CAT_DEFAULT); +} + +static void +gst_va_av1_dec_dispose (GObject * object) +{ + gst_va_base_dec_close (GST_VIDEO_DECODER (object)); + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_av1_dec_class_init (gpointer g_class, gpointer class_data) +{ + GstCaps *src_doc_caps, *sink_doc_caps; + GObjectClass *gobject_class = G_OBJECT_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); + GstAV1DecoderClass *av1decoder_class = GST_AV1_DECODER_CLASS (g_class); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (g_class); + struct CData *cdata = class_data; + gchar *long_name; + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API AV1 Decoder in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API AV1 Decoder"); + } + + gst_element_class_set_metadata (element_class, long_name, + "Codec/Decoder/Video/Hardware", + "VA-API based AV1 video decoder", "He Junyan <junyan.he@intel.com>"); + + sink_doc_caps = gst_caps_from_string (sink_caps_str); + src_doc_caps = gst_caps_from_string (src_caps_str); + + parent_class = g_type_class_peek_parent (g_class); + + gst_va_base_dec_class_init (GST_VA_BASE_DEC_CLASS (g_class), AV1, + cdata->render_device_path, cdata->sink_caps, cdata->src_caps, + src_doc_caps, sink_doc_caps); + + gobject_class->dispose = gst_va_av1_dec_dispose; + + decoder_class->getcaps = GST_DEBUG_FUNCPTR 
(gst_va_av1_dec_getcaps); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_va_av1_dec_negotiate); + + av1decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_new_sequence); + av1decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_new_picture); + av1decoder_class->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_duplicate_picture); + av1decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_start_picture); + av1decoder_class->decode_tile = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_decode_tile); + av1decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_end_picture); + av1decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_va_av1_dec_output_picture); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + gst_caps_unref (cdata->src_caps); + gst_caps_unref (cdata->sink_caps); + g_free (cdata); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_av1dec_debug, "vaav1dec", 0, + "VA AV1 decoder"); + + return NULL; +} + +gboolean +gst_va_av1_dec_register (GstPlugin * plugin, GstVaDevice * device, + GstCaps * sink_caps, GstCaps * src_caps, guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaAV1DecClass), + .class_init = gst_va_av1_dec_class_init, + .instance_size = sizeof (GstVaAV1Dec), + .instance_init = gst_va_av1_dec_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + g_return_val_if_fail (GST_IS_CAPS (sink_caps), FALSE); + g_return_val_if_fail (GST_IS_CAPS (src_caps), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + cdata->sink_caps = _complete_sink_caps (sink_caps); + cdata->src_caps = gst_caps_ref (src_caps); 
+
+  /* class data will be leaked if the element never gets instantiated */
+  GST_MINI_OBJECT_FLAG_SET (cdata->sink_caps,
+      GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED);
+  GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED);
+
+  type_info.class_data = cdata;
+
+  type_name = g_strdup ("GstVaAV1Dec");
+  feature_name = g_strdup ("vaav1dec");
+
+  /* The first decoder to be registered should use a constant name,
+   * like vaav1dec; for any additional decoders, we create unique
+   * names by inserting the render device name. */
+  if (g_type_from_name (type_name)) {
+    gchar *basename = g_path_get_basename (device->render_device_path);
+    g_free (type_name);
+    g_free (feature_name);
+    type_name = g_strdup_printf ("GstVa%sAV1Dec", basename);
+    feature_name = g_strdup_printf ("va%sav1dec", basename);
+    cdata->description = basename;
+
+    /* lower rank for non-first device */
+    if (rank > 0)
+      rank--;
+  }
+
+  g_once (&debug_once, _register_debug_category, NULL);
+
+  type = g_type_register_static (GST_TYPE_AV1_DECODER,
+      type_name, &type_info, 0);
+
+  ret = gst_element_register (plugin, feature_name, rank, type);
+
+  g_free (type_name);
+  g_free (feature_name);
+
+  return ret;
+}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvaav1dec.h
Added
@@ -0,0 +1,33 @@
+/* GStreamer
+ * Copyright (C) 2020 Intel Corporation
+ *     Author: He Junyan <junyan.he@intel.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#include "gstvadevice.h"
+
+G_BEGIN_DECLS
+
+gboolean gst_va_av1_dec_register (GstPlugin * plugin,
+                                  GstVaDevice * device,
+                                  GstCaps * sink_caps,
+                                  GstCaps * src_caps,
+                                  guint rank);
+
+G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvabasedec.c
Added
@@ -0,0 +1,959 @@
+/* GStreamer
+ * Copyright (C) 2020 Igalia, S.L.
+ *     Author: Víctor Jáquez <vjaquez@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#include "gstvabasedec.h"
+
+#include "gstvaallocator.h"
+#include "gstvacaps.h"
+#include "gstvapool.h"
+#include "gstvautils.h"
+#include "gstvavideoformat.h"
+
+#define GST_CAT_DEFAULT (base->debug_category)
+#define GST_VA_BASE_DEC_GET_PARENT_CLASS(obj) (GST_VA_BASE_DEC_GET_CLASS(obj)->parent_decoder_class)
+
+static gboolean
+gst_va_base_dec_open (GstVideoDecoder * decoder)
+{
+  GstVaBaseDec *base = GST_VA_BASE_DEC (decoder);
+  GstVaBaseDecClass *klass = GST_VA_BASE_DEC_GET_CLASS (decoder);
+  gboolean ret = FALSE;
+
+  if (!gst_va_ensure_element_data (decoder, klass->render_device_path,
+          &base->display))
+    return FALSE;
+
+  if (!g_atomic_pointer_get (&base->decoder)) {
+    GstVaDecoder *va_decoder;
+
+    va_decoder = gst_va_decoder_new (base->display, klass->codec);
+    if (va_decoder)
+      ret = TRUE;
+
+    gst_object_replace ((GstObject **) (&base->decoder),
+        (GstObject *) va_decoder);
+    gst_clear_object (&va_decoder);
+  } else {
+    ret = TRUE;
+  }
+
+  base->apply_video_crop = FALSE;
+
+  return ret;
+}
+
+gboolean
+gst_va_base_dec_close (GstVideoDecoder * decoder)
+{
+  GstVaBaseDec *base =
GST_VA_BASE_DEC (decoder); + + gst_clear_object (&base->decoder); + gst_clear_object (&base->display); + + return TRUE; +} + +static gboolean +gst_va_base_dec_stop (GstVideoDecoder * decoder) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + if (!gst_va_decoder_close (base->decoder)) + return FALSE; + + if (base->output_state) + gst_video_codec_state_unref (base->output_state); + base->output_state = NULL; + + if (base->other_pool) + gst_buffer_pool_set_active (base->other_pool, FALSE); + gst_clear_object (&base->other_pool); + + g_clear_pointer (&base->convert, gst_video_converter_free); + + return GST_VIDEO_DECODER_CLASS (GST_VA_BASE_DEC_GET_PARENT_CLASS + (decoder))->stop (decoder); +} + +static GstCaps * +gst_va_base_dec_getcaps (GstVideoDecoder * decoder, GstCaps * filter) +{ + GstCaps *caps = NULL, *tmp; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecoder *va_decoder = NULL; + + gst_object_replace ((GstObject **) & va_decoder, (GstObject *) base->decoder); + + if (va_decoder) { + caps = gst_va_decoder_get_sinkpad_caps (va_decoder); + gst_object_unref (va_decoder); + } + + if (caps) { + if (filter) { + tmp = gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = tmp; + } + GST_LOG_OBJECT (base, "Returning caps %" GST_PTR_FORMAT, caps); + } else { + caps = gst_video_decoder_proxy_getcaps (decoder, NULL, filter); + } + + return caps; +} + +static gboolean +_query_context (GstVaBaseDec * self, GstQuery * query) +{ + GstVaDisplay *display = NULL; + gboolean ret; + + gst_object_replace ((GstObject **) & display, (GstObject *) self->display); + ret = gst_va_handle_context_query (GST_ELEMENT_CAST (self), query, display); + gst_clear_object (&display); + + return ret; +} + +static gboolean +gst_va_base_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + gboolean ret = FALSE; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT:{ + 
ret = _query_context (base, query); + break; + } + case GST_QUERY_CAPS:{ + GstCaps *caps = NULL, *tmp, *filter = NULL; + GstVaDecoder *va_decoder = NULL; + gboolean fixed_caps; + + gst_object_replace ((GstObject **) & va_decoder, + (GstObject *) base->decoder); + + gst_query_parse_caps (query, &filter); + + fixed_caps = GST_PAD_IS_FIXED_CAPS (GST_VIDEO_DECODER_SRC_PAD (decoder)); + + if (!fixed_caps && va_decoder) + caps = gst_va_decoder_get_srcpad_caps (va_decoder); + + gst_clear_object (&va_decoder); + + if (caps) { + if (filter) { + tmp = + gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (caps); + caps = tmp; + } + + GST_LOG_OBJECT (base, "Returning caps %" GST_PTR_FORMAT, caps); + gst_query_set_caps_result (query, caps); + gst_caps_unref (caps); + ret = TRUE; + break; + } + /* else jump to default */ + } + default: + ret = GST_VIDEO_DECODER_CLASS (GST_VA_BASE_DEC_GET_PARENT_CLASS + (decoder))->src_query (decoder, query); + break; + } + + return ret; +} + +static gboolean +gst_va_base_dec_sink_query (GstVideoDecoder * decoder, GstQuery * query) +{ + if (GST_QUERY_TYPE (query) == GST_QUERY_CONTEXT) + return _query_context (GST_VA_BASE_DEC (decoder), query); + return GST_VIDEO_DECODER_CLASS (GST_VA_BASE_DEC_GET_PARENT_CLASS + (decoder))->sink_query (decoder, query); +} + +static GstAllocator * +_create_allocator (GstVaBaseDec * base, GstCaps * caps) +{ + GstAllocator *allocator = NULL; + + if (gst_caps_is_dmabuf (caps)) + allocator = gst_va_dmabuf_allocator_new (base->display); + else { + GArray *surface_formats = + gst_va_decoder_get_surface_formats (base->decoder); + allocator = gst_va_allocator_new (base->display, surface_formats); + } + + return allocator; +} + +static void +_create_other_pool (GstVaBaseDec * base, GstAllocator * allocator, + GstAllocationParams * params, GstCaps * caps, guint size) +{ + GstBufferPool *pool; + GstStructure *config; + + gst_clear_object (&base->other_pool); + + GST_DEBUG_OBJECT (base, "making 
new other pool for copy"); + + pool = gst_video_buffer_pool_new (); + config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + gst_buffer_pool_config_set_allocator (config, allocator, params); + if (!gst_buffer_pool_set_config (pool, config)) { + GST_ERROR_OBJECT (base, "Couldn't configure other pool for copy."); + gst_clear_object (&pool); + } + + base->other_pool = pool; +} + +static gboolean +_need_video_crop (GstVaBaseDec * base) +{ + + if (base->need_valign && + (base->valign.padding_left > 0 || base->valign.padding_top > 0)) + return TRUE; + + return FALSE; +} + +/* This path for pool setting is a little complicated but not commonly + used. We deliberately separate it from the main path of pool setting. */ +static gboolean +_decide_allocation_for_video_crop (GstVideoDecoder * decoder, + GstQuery * query, GstCaps * caps, const GstVideoInfo * info) +{ + GstAllocator *allocator = NULL, *other_allocator = NULL; + GstAllocationParams other_params, params; + gboolean update_pool = FALSE, update_allocator = FALSE; + GstBufferPool *pool = NULL, *other_pool = NULL; + guint size = 0, min, max; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + gboolean ret = TRUE; + GstCaps *va_caps = NULL; + + /* If others provide a valid allocator, just use it. */ + if (gst_query_get_n_allocation_params (query) > 0) { + gst_query_parse_nth_allocation_param (query, 0, &other_allocator, + &other_params); + update_allocator = TRUE; + } else { + gst_allocation_params_init (&other_params); + } + + /* If others provide a valid pool, just use it. 
*/ + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &other_pool, &size, &min, + &max); + + min += base->min_buffers; + size = MAX (size, GST_VIDEO_INFO_SIZE (info)); + update_pool = TRUE; + } else { + size = GST_VIDEO_INFO_SIZE (info); + min = base->min_buffers; + max = 0; + } + + /* Ensure that the other pool is ready */ + if (gst_caps_is_raw (caps)) { + if (GST_IS_VA_POOL (other_pool)) + gst_clear_object (&other_pool); + + if (!other_pool) { + if (other_allocator && (GST_IS_VA_DMABUF_ALLOCATOR (other_allocator) + || GST_IS_VA_ALLOCATOR (other_allocator))) + gst_clear_object (&other_allocator); + + _create_other_pool (base, other_allocator, &other_params, caps, size); + } else { + gst_object_replace ((GstObject **) & base->other_pool, + (GstObject *) other_pool); + } + } else { + GstStructure *other_config; + + if (!GST_IS_VA_POOL (other_pool)) + gst_clear_object (&other_pool); + + if (!other_pool) + other_pool = gst_va_pool_new (); + + if (other_allocator && !(GST_IS_VA_DMABUF_ALLOCATOR (other_allocator) + || GST_IS_VA_ALLOCATOR (other_allocator))) + gst_clear_object (&other_allocator); + + if (!other_allocator) { + other_allocator = _create_allocator (base, caps); + if (!other_allocator) { + ret = FALSE; + goto cleanup; + } + } + + other_config = gst_buffer_pool_get_config (other_pool); + + gst_buffer_pool_config_set_params (other_config, caps, size, min, max); + gst_buffer_pool_config_set_allocator (other_config, other_allocator, + &other_params); + /* Always support VideoMeta but no VideoCropMeta here. 
*/ + gst_buffer_pool_config_add_option (other_config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + gst_buffer_pool_config_set_va_allocation_params (other_config, 0); + + if (!gst_buffer_pool_set_config (other_pool, other_config)) { + ret = FALSE; + goto cleanup; + } + + gst_object_replace ((GstObject **) & base->other_pool, + (GstObject *) other_pool); + } + + /* Now set up the buffer pool for the decoder */ + pool = gst_va_pool_new (); + + va_caps = gst_caps_copy (caps); + gst_caps_set_features_simple (va_caps, + gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_VA)); + + if (!(allocator = _create_allocator (base, va_caps))) { + ret = FALSE; + goto cleanup; + } + + gst_allocation_params_init (&params); + + { + GstStructure *config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_set_params (config, caps, size, min, max); + gst_buffer_pool_config_set_allocator (config, allocator, &params); + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + if (_need_video_crop (base)) { + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + gst_buffer_pool_config_set_video_alignment (config, &base->valign); + } + + gst_buffer_pool_config_set_va_allocation_params (config, + VA_SURFACE_ATTRIB_USAGE_HINT_DECODER); + + if (!gst_buffer_pool_set_config (pool, config)) { + ret = FALSE; + goto cleanup; + } + } + + if (update_allocator) + gst_query_set_nth_allocation_param (query, 0, allocator, &params); + else + gst_query_add_allocation_param (query, allocator, &params); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + + GST_WARNING_OBJECT (base, "We need to copy the output buffer manually " + "because of the top/left alignment, which may hurt performance. 
" + "An element that supports VideoCropMeta, such as 'vapostproc', can " + "avoid this."); + base->copy_frames = TRUE; + base->apply_video_crop = TRUE; + +cleanup: + if (ret != TRUE) + gst_clear_object (&base->other_pool); + gst_clear_object (&allocator); + gst_clear_object (&other_allocator); + gst_clear_object (&pool); + gst_clear_object (&other_pool); + gst_clear_caps (&va_caps); + + return ret; +} + +/* We only support system pool and va pool. For va pool, its allocator + * should be va allocator or dma allocator. + * If output caps is memory:VAMemory, the pool should be a va pool + * with va allocator. + * If output caps is memory:DMABuf, the pool should be a va pool + * with dma allocator. + * We may need the other_pool to copy the decoder picture to the + * output buffer. We need to do this copy when: + * 1). The output caps is raw (system mem), but the downstream does + * not support VideoMeta and the strides and offsets of the va pool + * are different from the system memory pool, which means that + * gst_video_frame_map() cannot map the buffer correctly. Then we + * need a va pool with va allocator as the internal pool and create + * a system pool as the other_pool to copy frames to system mem and + * output it. + * 2). The decoder has a crop_top/left value > 0 (e.g. the conformance + * window in H.265), which means that the real output picture + * is located in the middle of the decoded buffer. If the downstream + * supports VideoCropMeta, a VideoCropMeta is added to report the + * real picture's coordinates and size. But if not, we need to copy + * it manually and the other_pool is needed. We always assume that + * the decoded picture starts at the top-left corner, so there is no + * need to do this if crop_bottom/right value > 0. + * + * 1. 
if the crop_top/left value > 0 and the downstream does not support + * VideoCropMeta, we always have the other_pool to do the copy (the pool + * may be provided by the downstream element, or created by ourselves if + * no suitable one is found). + * 2. get allocator in query + * 2.1 if allocator is not ours and caps is raw, keep it for other_pool. + * 3. get pool in query + * 3.1 if pool is not va, downstream doesn't support video meta and + * caps are raw, keep it as other_pool. + * 3.2 if there's no pool in query and caps is raw, create other_pool + * as GstVideoPool with the non-va allocator from query and query's params. + * 4. create our allocator and pool if they aren't in query + * 5. add or update pool and allocator in query + * 6. set our custom pool configuration + */ +static gboolean +gst_va_base_dec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query) +{ + GstAllocator *allocator = NULL, *other_allocator = NULL; + GstAllocationParams other_params, params; + GstBufferPool *pool = NULL, *other_pool = NULL; + GstCaps *caps = NULL; + GstVideoInfo info; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + guint size = 0, min, max; + gboolean update_pool = FALSE, update_allocator = FALSE; + gboolean has_videometa, has_video_crop_meta; + gboolean ret = TRUE; + + g_assert (base->min_buffers > 0); + + gst_query_parse_allocation (query, &caps, NULL); + + if (!(caps && gst_video_info_from_caps (&info, caps))) + goto wrong_caps; + + has_videometa = gst_query_find_allocation_meta (query, + GST_VIDEO_META_API_TYPE, NULL); + has_video_crop_meta = has_videometa && gst_query_find_allocation_meta (query, + GST_VIDEO_CROP_META_API_TYPE, NULL); + + /* 1. The output picture is located in the middle of the decoded buffer + but the downstream element does not support VideoCropMeta, so we + definitely need a copy. + 2. Some codecs, such as H.265, do not clear the DPB when a new SPS + arrives. The new SPS may set the crop window to the top-left corner, + so no video crop is needed here. 
But we may still have cached frames + in the DPB which need a copy. */ + if ((_need_video_crop (base) && !has_video_crop_meta) || + base->apply_video_crop) { + return _decide_allocation_for_video_crop (decoder, query, caps, &info); + } + + if (gst_query_get_n_allocation_params (query) > 0) { + gst_query_parse_nth_allocation_param (query, 0, &allocator, &other_params); + if (allocator && !(GST_IS_VA_DMABUF_ALLOCATOR (allocator) + || GST_IS_VA_ALLOCATOR (allocator))) { + /* save the allocator for the other pool */ + other_allocator = allocator; + allocator = NULL; + } + update_allocator = TRUE; + } else { + gst_allocation_params_init (&other_params); + } + + gst_allocation_params_init (&params); + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + if (pool) { + if (!GST_IS_VA_POOL (pool)) { + GST_DEBUG_OBJECT (base, + "may need other pool for copy frames %" GST_PTR_FORMAT, pool); + other_pool = pool; + pool = NULL; + } + } + + min += base->min_buffers; + size = MAX (size, GST_VIDEO_INFO_SIZE (&info)); + + update_pool = TRUE; + } else { + size = GST_VIDEO_INFO_SIZE (&info); + min = base->min_buffers; + max = 0; + } + + if (!allocator) { + if (!(allocator = _create_allocator (base, caps))) { + ret = FALSE; + goto cleanup; + } + } + + if (!pool) + pool = gst_va_pool_new (); + + { + GstStructure *config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_set_params (config, caps, size, min, max); + gst_buffer_pool_config_set_allocator (config, allocator, &params); + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_META); + + if (base->need_valign) { + gst_buffer_pool_config_add_option (config, + GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); + gst_buffer_pool_config_set_video_alignment (config, &base->valign); + } + + gst_buffer_pool_config_set_va_allocation_params (config, + VA_SURFACE_ATTRIB_USAGE_HINT_DECODER); + + if (!gst_buffer_pool_set_config (pool, config)) { + ret 
= FALSE; + goto cleanup; + } + } + + if (update_allocator) + gst_query_set_nth_allocation_param (query, 0, allocator, &params); + else + gst_query_add_allocation_param (query, allocator, &params); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + + base->copy_frames = (!has_videometa && gst_va_pool_requires_video_meta (pool) + && gst_caps_is_raw (caps)); + if (base->copy_frames) { + if (other_pool) { + gst_object_replace ((GstObject **) & base->other_pool, + (GstObject *) other_pool); + } else { + _create_other_pool (base, other_allocator, &other_params, caps, size); + } + GST_DEBUG_OBJECT (base, "Use the other pool for copy %" GST_PTR_FORMAT, + base->other_pool); + } else { + gst_clear_object (&base->other_pool); + } + +cleanup: + gst_clear_object (&allocator); + gst_clear_object (&other_allocator); + gst_clear_object (&pool); + gst_clear_object (&other_pool); + + /* There's no need to chain up to the decoder's method since all that is + * needed is done. */ + return ret; + +wrong_caps: + { + GST_WARNING_OBJECT (base, "No valid caps"); + return FALSE; + } +} + +static void +gst_va_base_dec_set_context (GstElement * element, GstContext * context) +{ + GstVaDisplay *old_display, *new_display; + GstVaBaseDec *base = GST_VA_BASE_DEC (element); + GstVaBaseDecClass *klass = GST_VA_BASE_DEC_GET_CLASS (base); + gboolean ret; + + old_display = base->display ? gst_object_ref (base->display) : NULL; + ret = gst_va_handle_set_context (element, context, klass->render_device_path, + &base->display); + new_display = base->display ? 
gst_object_ref (base->display) : NULL; + + if (!ret + || (old_display && new_display && old_display != new_display + && base->decoder)) { + GST_ELEMENT_WARNING (base, RESOURCE, BUSY, + ("Can't replace VA display while operating"), (NULL)); + } + + gst_clear_object (&old_display); + gst_clear_object (&new_display); + + GST_ELEMENT_CLASS (GST_VA_BASE_DEC_GET_PARENT_CLASS + (element))->set_context (element, context); +} + +void +gst_va_base_dec_init (GstVaBaseDec * base, GstDebugCategory * cat) +{ + base->debug_category = cat; +} + +void +gst_va_base_dec_class_init (GstVaBaseDecClass * klass, GstVaCodecs codec, + const gchar * render_device_path, GstCaps * sink_caps, GstCaps * src_caps, + GstCaps * doc_src_caps, GstCaps * doc_sink_caps) +{ + GstPadTemplate *sink_pad_templ, *src_pad_templ; + GstElementClass *element_class = GST_ELEMENT_CLASS (klass); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (klass); + + klass->parent_decoder_class = g_type_class_peek_parent (klass); + + klass->codec = codec; + klass->render_device_path = g_strdup (render_device_path); + + sink_pad_templ = gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + sink_caps); + gst_element_class_add_pad_template (element_class, sink_pad_templ); + + if (doc_sink_caps) { + gst_pad_template_set_documentation_caps (sink_pad_templ, doc_sink_caps); + gst_caps_unref (doc_sink_caps); + } + + src_pad_templ = gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + src_caps); + gst_element_class_add_pad_template (element_class, src_pad_templ); + + if (doc_src_caps) { + gst_pad_template_set_documentation_caps (src_pad_templ, doc_src_caps); + gst_caps_unref (doc_src_caps); + } + + element_class->set_context = GST_DEBUG_FUNCPTR (gst_va_base_dec_set_context); + + decoder_class->open = GST_DEBUG_FUNCPTR (gst_va_base_dec_open); + decoder_class->close = GST_DEBUG_FUNCPTR (gst_va_base_dec_close); + decoder_class->stop = GST_DEBUG_FUNCPTR (gst_va_base_dec_stop); + decoder_class->getcaps = 
GST_DEBUG_FUNCPTR (gst_va_base_dec_getcaps); + decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_va_base_dec_src_query); + decoder_class->sink_query = GST_DEBUG_FUNCPTR (gst_va_base_dec_sink_query); + decoder_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_va_base_dec_decide_allocation); +} + +static GstVideoFormat +_default_video_format_from_chroma (guint chroma_type) +{ + switch (chroma_type) { + /* 4:2:0 */ + case VA_RT_FORMAT_YUV420: + return GST_VIDEO_FORMAT_NV12; + case VA_RT_FORMAT_YUV420_10: + return GST_VIDEO_FORMAT_P010_10LE; + case VA_RT_FORMAT_YUV420_12: + return GST_VIDEO_FORMAT_P012_LE; + /* 4:2:2 */ + case VA_RT_FORMAT_YUV422: + return GST_VIDEO_FORMAT_UYVY; + case VA_RT_FORMAT_YUV422_10: + return GST_VIDEO_FORMAT_Y210; + case VA_RT_FORMAT_YUV422_12: + return GST_VIDEO_FORMAT_Y212_LE; + /* 4:4:4 */ + case VA_RT_FORMAT_YUV444: + return GST_VIDEO_FORMAT_VUYA; + case VA_RT_FORMAT_YUV444_10: + return GST_VIDEO_FORMAT_Y410; + case VA_RT_FORMAT_YUV444_12: + return GST_VIDEO_FORMAT_Y412_LE; + default: + return GST_VIDEO_FORMAT_UNKNOWN; + } +} + +/* Check whether the downstream supports VideoMeta; if not, we need to + * fall back to system memory. 
*/ +static gboolean +_downstream_has_video_meta (GstVaBaseDec * base, GstCaps * caps) +{ + GstQuery *query; + gboolean ret = FALSE; + + query = gst_query_new_allocation (caps, FALSE); + if (gst_pad_peer_query (GST_VIDEO_DECODER_SRC_PAD (base), query)) + ret = gst_query_find_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + gst_query_unref (query); + + return ret; +} + +void +gst_va_base_dec_get_preferred_format_and_caps_features (GstVaBaseDec * base, + GstVideoFormat * format, GstCapsFeatures ** capsfeatures) +{ + GstCaps *peer_caps, *preferred_caps = NULL; + GstCapsFeatures *features; + GstStructure *structure; + const GValue *v_format; + guint num_structures, i; + gboolean is_any; + + g_return_if_fail (base); + + /* verify if peer caps is any */ + { + peer_caps = + gst_pad_peer_query_caps (GST_VIDEO_DECODER_SRC_PAD (base), NULL); + is_any = gst_caps_is_any (peer_caps); + gst_clear_caps (&peer_caps); + } + + peer_caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (base)); + GST_DEBUG_OBJECT (base, "Allowed caps %" GST_PTR_FORMAT, peer_caps); + + /* prefer memory:VASurface over other caps features */ + num_structures = gst_caps_get_size (peer_caps); + for (i = 0; i < num_structures; i++) { + features = gst_caps_get_features (peer_caps, i); + structure = gst_caps_get_structure (peer_caps, i); + + if (gst_caps_features_is_any (features)) + continue; + + if (gst_caps_features_contains (features, GST_CAPS_FEATURE_MEMORY_VA)) { + preferred_caps = gst_caps_new_full (gst_structure_copy (structure), NULL); + gst_caps_set_features_simple (preferred_caps, + gst_caps_features_copy (features)); + break; + } + } + + if (!preferred_caps) + preferred_caps = peer_caps; + else + gst_clear_caps (&peer_caps); + + if (gst_caps_is_empty (preferred_caps)) { + if (capsfeatures) + *capsfeatures = NULL; /* system memory */ + if (format) + *format = _default_video_format_from_chroma (base->rt_format); + goto bail; + } + + if (capsfeatures) { + features = 
gst_caps_get_features (preferred_caps, 0); + if (features) { + *capsfeatures = gst_caps_features_copy (features); + + if (is_any + && !gst_caps_features_is_equal (features, + GST_CAPS_FEATURES_MEMORY_SYSTEM_MEMORY) + && !_downstream_has_video_meta (base, preferred_caps)) { + GST_INFO_OBJECT (base, "Downstream reports ANY caps but without" + " VideoMeta support; fallback to system memory."); + gst_caps_features_free (*capsfeatures); + *capsfeatures = NULL; + } + } else { + *capsfeatures = NULL; + } + } + + if (!format) + goto bail; + + structure = gst_caps_get_structure (preferred_caps, 0); + v_format = gst_structure_get_value (structure, "format"); + if (!v_format) + *format = _default_video_format_from_chroma (base->rt_format); + else if (G_VALUE_HOLDS_STRING (v_format)) + *format = gst_video_format_from_string (g_value_get_string (v_format)); + else if (GST_VALUE_HOLDS_LIST (v_format)) { + guint num_values = gst_value_list_get_size (v_format); + for (i = 0; i < num_values; i++) { + GstVideoFormat fmt; + const GValue *v_fmt = gst_value_list_get_value (v_format, i); + if (!v_fmt) + continue; + fmt = gst_video_format_from_string (g_value_get_string (v_fmt)); + if (gst_va_chroma_from_video_format (fmt) == base->rt_format) { + *format = fmt; + break; + } + } + if (i == num_values) + *format = _default_video_format_from_chroma (base->rt_format); + } + +bail: + gst_clear_caps (&preferred_caps); +} + +static gboolean +_copy_buffer_and_apply_video_crop (GstVaBaseDec * base, + GstVideoFrame * src_frame, GstVideoFrame * dest_frame, + GstVideoCropMeta * video_crop) +{ + GstVideoInfo dst_info = dest_frame->info; + + dst_info.fps_n = src_frame->info.fps_n; + dst_info.fps_d = src_frame->info.fps_d; + + if (base->convert) { + gboolean new_convert = FALSE; + gint x = 0, y = 0, width = 0, height = 0; + const GstStructure *config = gst_video_converter_get_config (base->convert); + + if (!gst_structure_get_int (config, GST_VIDEO_CONVERTER_OPT_SRC_X, &x) + || !gst_structure_get_int 
(config, GST_VIDEO_CONVERTER_OPT_SRC_Y, &y) + || !gst_structure_get_int (config, GST_VIDEO_CONVERTER_OPT_SRC_WIDTH, + &width) + || !gst_structure_get_int (config, GST_VIDEO_CONVERTER_OPT_SRC_HEIGHT, + &height)) + new_convert = TRUE; + + new_convert |= (video_crop->x != x); + new_convert |= (video_crop->y != y); + new_convert |= (video_crop->width != width); + new_convert |= (video_crop->height != height); + + /* No need to check dest, it always has (0,0) -> (width, height) */ + + if (new_convert) + g_clear_pointer (&base->convert, gst_video_converter_free); + } + + if (!base->convert) { + base->convert = gst_video_converter_new (&src_frame->info, &dst_info, + gst_structure_new ("options", + GST_VIDEO_CONVERTER_OPT_DITHER_METHOD, + GST_TYPE_VIDEO_DITHER_METHOD, GST_VIDEO_DITHER_NONE, + GST_VIDEO_CONVERTER_OPT_DITHER_QUANTIZATION, + G_TYPE_UINT, 0, + GST_VIDEO_CONVERTER_OPT_CHROMA_MODE, + GST_TYPE_VIDEO_CHROMA_MODE, GST_VIDEO_CHROMA_MODE_NONE, + GST_VIDEO_CONVERTER_OPT_MATRIX_MODE, + GST_TYPE_VIDEO_MATRIX_MODE, GST_VIDEO_MATRIX_MODE_NONE, + GST_VIDEO_CONVERTER_OPT_SRC_X, G_TYPE_INT, video_crop->x, + GST_VIDEO_CONVERTER_OPT_SRC_Y, G_TYPE_INT, video_crop->y, + GST_VIDEO_CONVERTER_OPT_SRC_WIDTH, G_TYPE_INT, video_crop->width, + GST_VIDEO_CONVERTER_OPT_SRC_HEIGHT, G_TYPE_INT, video_crop->height, + GST_VIDEO_CONVERTER_OPT_DEST_X, G_TYPE_INT, 0, + GST_VIDEO_CONVERTER_OPT_DEST_Y, G_TYPE_INT, 0, + GST_VIDEO_CONVERTER_OPT_DEST_WIDTH, G_TYPE_INT, video_crop->width, + GST_VIDEO_CONVERTER_OPT_DEST_HEIGHT, G_TYPE_INT, video_crop->height, + NULL)); + + if (!base->convert) { + GST_WARNING_OBJECT (base, "failed to create a video converter"); + return FALSE; + } + } + + gst_video_converter_frame (base->convert, src_frame, dest_frame); + + return TRUE; +} + +gboolean +gst_va_base_dec_copy_output_buffer (GstVaBaseDec * base, + GstVideoCodecFrame * codec_frame) +{ + GstVideoFrame src_frame; + GstVideoFrame dest_frame; + GstVideoInfo dest_vinfo; + GstVideoInfo *src_vinfo; + GstBuffer 
*buffer = NULL; + GstVideoCropMeta *video_crop; + GstFlowReturn ret; + + g_return_val_if_fail (base && base->output_state, FALSE); + + if (!base->other_pool) + return FALSE; + + if (!gst_buffer_pool_set_active (base->other_pool, TRUE)) + return FALSE; + + src_vinfo = &base->output_state->info; + gst_video_info_set_format (&dest_vinfo, GST_VIDEO_INFO_FORMAT (src_vinfo), + base->width, base->height); + + ret = gst_buffer_pool_acquire_buffer (base->other_pool, &buffer, NULL); + if (ret != GST_FLOW_OK) + goto fail; + if (!gst_video_frame_map (&src_frame, src_vinfo, codec_frame->output_buffer, + GST_MAP_READ)) + goto fail; + if (!gst_video_frame_map (&dest_frame, &dest_vinfo, buffer, GST_MAP_WRITE)) { + gst_video_frame_unmap (&src_frame); + goto fail; + } + + video_crop = gst_buffer_get_video_crop_meta (codec_frame->output_buffer); + if (video_crop) { + if (!_copy_buffer_and_apply_video_crop (base, + &src_frame, &dest_frame, video_crop)) { + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + GST_ERROR_OBJECT (base, "Failed to apply the video crop."); + goto fail; + } + } else { + /* gst_video_frame_copy can crop this, but does not know it, so let's + * make it think it's all right */ + GST_VIDEO_INFO_WIDTH (&src_frame.info) = base->width; + GST_VIDEO_INFO_HEIGHT (&src_frame.info) = base->height; + + if (!gst_video_frame_copy (&dest_frame, &src_frame)) { + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + goto fail; + } + } + + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + gst_buffer_replace (&codec_frame->output_buffer, buffer); + gst_buffer_unref (buffer); + + return TRUE; + +fail: + if (buffer) + gst_buffer_unref (buffer); + + GST_ERROR_OBJECT (base, "Failed to copy the output buffer."); + return FALSE; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvabasedec.h
Added
@@ -0,0 +1,127 @@ + +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/codecs/gsth264decoder.h> +#include <gst/codecs/gsth265decoder.h> +#include <gst/codecs/gstmpeg2decoder.h> +#include <gst/codecs/gstvp8decoder.h> +#include <gst/codecs/gstvp9decoder.h> +#include <gst/codecs/gstav1decoder.h> + +#include "gstvadevice.h" +#include "gstvadecoder.h" +#include "gstvaprofile.h" + +G_BEGIN_DECLS + +#define GST_VA_BASE_DEC(obj) ((GstVaBaseDec *)(obj)) +#define GST_VA_BASE_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaBaseDecClass)) +#define GST_VA_BASE_DEC_CLASS(klass) ((GstVaBaseDecClass *)(klass)) + +typedef struct _GstVaBaseDec GstVaBaseDec; +typedef struct _GstVaBaseDecClass GstVaBaseDecClass; + +struct _GstVaBaseDec +{ + /* <private> */ + union + { + GstH264Decoder h264; + GstH265Decoder h265; + GstMpeg2Decoder mpeg2; + GstVp8Decoder vp8; + GstVp9Decoder vp9; + GstAV1Decoder av1; + } parent; + + GstDebugCategory *debug_category; + + GstVaDisplay *display; + GstVaDecoder *decoder; + + VAProfile profile; + guint rt_format; + gint width; + gint height; + + guint min_buffers; + + GstVideoCodecState 
*output_state; + GstBufferPool *other_pool; + + gboolean need_valign; + GstVideoAlignment valign; + + gboolean copy_frames; + + gboolean apply_video_crop; + GstVideoConverter *convert; + + gboolean need_negotiation; +}; + +struct _GstVaBaseDecClass +{ + /* <private> */ + union + { + GstH264DecoderClass h264; + GstH265DecoderClass h265; + GstMpeg2DecoderClass mpeg2; + GstVp8DecoderClass vp8; + GstVp9DecoderClass vp9; + GstAV1DecoderClass av1; + } parent_class; + + GstVaCodecs codec; + gchar *render_device_path; + /* The parent class in GType hierarchy */ + GstObjectClass *parent_decoder_class; +}; + +struct CData +{ + gchar *render_device_path; + gchar *description; + GstCaps *sink_caps; + GstCaps *src_caps; +}; + +void gst_va_base_dec_init (GstVaBaseDec * base, + GstDebugCategory * cat); +void gst_va_base_dec_class_init (GstVaBaseDecClass * klass, + GstVaCodecs codec, + const gchar * render_device_path, + GstCaps * sink_caps, + GstCaps * src_caps, + GstCaps * doc_src_caps, + GstCaps * doc_sink_caps); + +gboolean gst_va_base_dec_close (GstVideoDecoder * decoder); +void gst_va_base_dec_get_preferred_format_and_caps_features (GstVaBaseDec * base, + GstVideoFormat * format, + GstCapsFeatures ** capsfeatures); +gboolean gst_va_base_dec_copy_output_buffer (GstVaBaseDec * base, + GstVideoCodecFrame * codec_frame); + +G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvabasetransform.c
Added
@@ -0,0 +1,852 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvabasetransform.h" + +#include "gstvaallocator.h" +#include "gstvacaps.h" +#include "gstvapool.h" +#include "gstvautils.h" + +#define GST_CAT_DEFAULT gst_va_base_transform_debug +GST_DEBUG_CATEGORY_STATIC (GST_CAT_DEFAULT); + +struct _GstVaBaseTransformPrivate +{ + GstVideoInfo srcpad_info; + + GstBufferPool *other_pool; + + GstCaps *sinkpad_caps; + GstVideoInfo sinkpad_info; + GstBufferPool *sinkpad_pool; + + GstCaps *filter_caps; +}; + +/** + * GstVaBaseTransform: + * + * A base class implementation for VA-API filters. 
+ * + * Since: 1.20 + */ +#define gst_va_base_transform_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstVaBaseTransform, gst_va_base_transform, + GST_TYPE_BASE_TRANSFORM, G_ADD_PRIVATE (GstVaBaseTransform) + GST_DEBUG_CATEGORY_INIT (gst_va_base_transform_debug, + "vabasetransform", 0, "vabasetransform element"); + ); + +extern GRecMutex GST_VA_SHARED_LOCK; + +static void +gst_va_base_transform_dispose (GObject * object) +{ + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (object); + + if (self->priv->other_pool) { + gst_buffer_pool_set_active (self->priv->other_pool, FALSE); + gst_clear_object (&self->priv->other_pool); + } + + gst_clear_caps (&self->out_caps); + gst_clear_caps (&self->in_caps); + + gst_clear_caps (&self->priv->filter_caps); + + gst_clear_object (&self->filter); + gst_clear_object (&self->display); + + if (self->priv->sinkpad_pool) { + gst_buffer_pool_set_active (self->priv->sinkpad_pool, FALSE); + gst_clear_object (&self->priv->sinkpad_pool); + } + + gst_clear_caps (&self->priv->sinkpad_caps); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_base_transform_init (GstVaBaseTransform * self) +{ + gst_base_transform_set_qos_enabled (GST_BASE_TRANSFORM (self), TRUE); + + self->priv = gst_va_base_transform_get_instance_private (self); +} + +static gboolean +gst_va_base_transform_query (GstBaseTransform * trans, + GstPadDirection direction, GstQuery * query) +{ + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (trans); + gboolean ret = FALSE; + + switch (GST_QUERY_TYPE (query)) { + case GST_QUERY_CONTEXT: + { + GstVaDisplay *display = NULL; + + gst_object_replace ((GstObject **) & display, + (GstObject *) self->display); + ret = gst_va_handle_context_query (GST_ELEMENT_CAST (self), query, + display); + gst_clear_object (&display); + break; + } + default: + ret = GST_BASE_TRANSFORM_CLASS (parent_class)->query (trans, direction, + query); + break; + } + + return ret; +} + +static gboolean 
+gst_va_base_transform_set_caps (GstBaseTransform * trans, GstCaps * incaps, + GstCaps * outcaps) +{ + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (trans); + GstVaBaseTransformClass *fclass; + GstVideoInfo in_info, out_info; + gboolean res; + + /* input caps */ + if (!gst_video_info_from_caps (&in_info, incaps)) + goto invalid_caps; + + /* output caps */ + if (!gst_video_info_from_caps (&out_info, outcaps)) + goto invalid_caps; + + fclass = GST_VA_BASE_TRANSFORM_GET_CLASS (self); + if (fclass->set_info) + res = fclass->set_info (self, incaps, &in_info, outcaps, &out_info); + else + res = TRUE; + + self->negotiated = res; + + if (res) { + gst_caps_replace (&self->in_caps, incaps); + gst_caps_replace (&self->out_caps, outcaps); + + self->in_info = in_info; + self->out_info = out_info; + } + + if (self->priv->sinkpad_pool) { + gst_buffer_pool_set_active (self->priv->sinkpad_pool, FALSE); + gst_clear_object (&self->priv->sinkpad_pool); + } + + if (self->priv->other_pool) { + gst_buffer_pool_set_active (self->priv->other_pool, FALSE); + gst_clear_object (&self->priv->other_pool); + } + + return res; + + /* ERRORS */ +invalid_caps: + { + GST_ERROR_OBJECT (self, "invalid caps"); + self->negotiated = FALSE; + return FALSE; + } +} + +/* Answer upstream allocation query. 
*/ +static gboolean +gst_va_base_transform_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query) +{ + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (trans); + GstAllocator *allocator = NULL; + GstAllocationParams params = { 0, }; + GstBufferPool *pool; + GstCaps *caps; + GstVideoInfo info; + gboolean update_allocator = FALSE; + guint size, usage_hint = VA_SURFACE_ATTRIB_USAGE_HINT_GENERIC; /* it might be + * used by a va + * decoder */ + + gst_clear_caps (&self->priv->sinkpad_caps); + + if (!GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, + decide_query, query)) + return FALSE; + + /* passthrough, we're done */ + if (!decide_query) + return TRUE; + + if (gst_query_get_n_allocation_pools (query) > 0) + return TRUE; + + gst_query_parse_allocation (query, &caps, NULL); + if (!caps) + return FALSE; + if (!gst_video_info_from_caps (&info, caps)) { + GST_ERROR_OBJECT (self, "Cannot parse caps %" GST_PTR_FORMAT, caps); + return FALSE; + } + + size = GST_VIDEO_INFO_SIZE (&info); + + if (gst_query_get_n_allocation_params (query) > 0) { + gst_query_parse_nth_allocation_param (query, 0, &allocator, &params); + if (!GST_IS_VA_DMABUF_ALLOCATOR (allocator) + && !GST_IS_VA_ALLOCATOR (allocator)) + gst_clear_object (&allocator); + update_allocator = TRUE; + } else { + gst_allocation_params_init (&params); + } + + if (!allocator) { + if (!(allocator = gst_va_base_transform_allocator_from_caps (self, caps))) + return FALSE; + } + + pool = gst_va_pool_new_with_config (caps, size, 1 + self->extra_min_buffers, + 0, usage_hint, allocator, &params); + if (!pool) { + gst_object_unref (allocator); + goto config_failed; + } + + if (update_allocator) + gst_query_set_nth_allocation_param (query, 0, allocator, &params); + else + gst_query_add_allocation_param (query, allocator, &params); + + gst_query_add_allocation_pool (query, pool, size, 1 + self->extra_min_buffers, + 0); + + GST_DEBUG_OBJECT (self, + "proposing %" GST_PTR_FORMAT " with allocator %" 
GST_PTR_FORMAT, + pool, allocator); + + gst_object_unref (allocator); + gst_object_unref (pool); + + gst_query_add_allocation_meta (query, GST_VIDEO_META_API_TYPE, NULL); + + self->priv->sinkpad_caps = gst_caps_ref (caps); + + return TRUE; + + /* ERRORS */ +config_failed: + { + GST_ERROR_OBJECT (self, "failed to set config"); + return FALSE; + } +} + +static GstBufferPool * +_create_other_pool (GstAllocator * allocator, + GstAllocationParams * params, GstCaps * caps, guint size) +{ + GstBufferPool *pool = NULL; + GstStructure *config; + + pool = gst_video_buffer_pool_new (); + config = gst_buffer_pool_get_config (pool); + + gst_buffer_pool_config_set_params (config, caps, size, 0, 0); + gst_buffer_pool_config_set_allocator (config, allocator, params); + if (!gst_buffer_pool_set_config (pool, config)) { + gst_clear_object (&pool); + } + + return pool; +} + +/* Configure the allocation query that was answered downstream; we can + * set some properties on it. This is only called when not in + * passthrough mode. 
*/ +static gboolean +gst_va_base_transform_decide_allocation (GstBaseTransform * trans, + GstQuery * query) +{ + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (trans); + GstAllocator *allocator = NULL, *other_allocator = NULL; + GstAllocationParams params, other_params; + GstBufferPool *pool = NULL, *other_pool = NULL; + GstCaps *outcaps = NULL; + GstStructure *config; + GstVideoInfo vinfo; + guint min, max, size = 0, usage_hint = VA_SURFACE_ATTRIB_USAGE_HINT_VPP_WRITE; + gboolean update_pool, update_allocator, has_videometa, copy_frames; + + gst_query_parse_allocation (query, &outcaps, NULL); + + gst_allocation_params_init (&other_params); + gst_allocation_params_init (&params); + + if (!gst_video_info_from_caps (&vinfo, outcaps)) { + GST_ERROR_OBJECT (self, "Cannot parse caps %" GST_PTR_FORMAT, outcaps); + return FALSE; + } + + if (gst_query_get_n_allocation_params (query) > 0) { + gst_query_parse_nth_allocation_param (query, 0, &allocator, &other_params); + if (allocator && !(GST_IS_VA_DMABUF_ALLOCATOR (allocator) + || GST_IS_VA_ALLOCATOR (allocator))) { + /* save the allocator for the other pool */ + other_allocator = allocator; + allocator = NULL; + } + update_allocator = TRUE; + } else { + update_allocator = FALSE; + } + + if (gst_query_get_n_allocation_pools (query) > 0) { + gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); + + if (pool) { + if (!GST_IS_VA_POOL (pool)) { + GST_DEBUG_OBJECT (self, + "may need other pool for copy frames %" GST_PTR_FORMAT, pool); + other_pool = pool; + pool = NULL; + } + } + + update_pool = TRUE; + } else { + size = GST_VIDEO_INFO_SIZE (&vinfo); + min = 1; + max = 0; + update_pool = FALSE; + } + + if (!allocator) { + /* XXX(victor): USAGE_HINT_VPP_WRITE creates tiled dmabuf frames + * in iHD */ + if (gst_caps_is_dmabuf (outcaps) && GST_VIDEO_INFO_IS_RGB (&vinfo)) + usage_hint = VA_SURFACE_ATTRIB_USAGE_HINT_GENERIC; + if (!(allocator = + gst_va_base_transform_allocator_from_caps (self, outcaps))) + return 
FALSE; + } + + if (!pool) + pool = gst_va_pool_new (); + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_allocator (config, allocator, &params); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + gst_buffer_pool_config_set_params (config, outcaps, size, min, max); + gst_buffer_pool_config_set_va_allocation_params (config, usage_hint); + if (!gst_buffer_pool_set_config (pool, config)) { + gst_object_unref (allocator); + gst_object_unref (pool); + return FALSE; + } + + if (GST_IS_VA_DMABUF_ALLOCATOR (allocator)) { + gst_va_dmabuf_allocator_get_format (allocator, &self->priv->srcpad_info, + NULL); + } else if (GST_IS_VA_ALLOCATOR (allocator)) { + gst_va_allocator_get_format (allocator, &self->priv->srcpad_info, NULL); + } + + if (update_allocator) + gst_query_set_nth_allocation_param (query, 0, allocator, &params); + else + gst_query_add_allocation_param (query, allocator, &params); + + if (update_pool) + gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); + else + gst_query_add_allocation_pool (query, pool, size, min, max); + + has_videometa = gst_query_find_allocation_meta (query, + GST_VIDEO_META_API_TYPE, NULL); + + copy_frames = (!has_videometa && gst_va_pool_requires_video_meta (pool) + && gst_caps_is_raw (outcaps)); + if (copy_frames) { + if (other_pool) { + gst_object_replace ((GstObject **) & self->priv->other_pool, + (GstObject *) other_pool); + } else { + self->priv->other_pool = + _create_other_pool (other_allocator, &other_params, outcaps, size); + } + GST_DEBUG_OBJECT (self, "Use the other pool for copy %" GST_PTR_FORMAT, + self->priv->other_pool); + } else { + gst_clear_object (&self->priv->other_pool); + } + + GST_DEBUG_OBJECT (self, + "decided pool %" GST_PTR_FORMAT " with allocator %" GST_PTR_FORMAT, + pool, allocator); + + gst_object_unref (allocator); + gst_object_unref (pool); + gst_clear_object (&other_allocator); + gst_clear_object (&other_pool); + + /* removes allocation metas */ +
return GST_BASE_TRANSFORM_CLASS (parent_class)->decide_allocation (trans, + query); + +} + +/* output buffers must be from our VA-based pool, they cannot be + * system-allocated */ +static gboolean +gst_va_base_transform_transform_size (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, gsize size, + GstCaps * othercaps, gsize * othersize) +{ + return FALSE; +} + +static GstFlowReturn +gst_va_base_transform_generate_output (GstBaseTransform * trans, + GstBuffer ** outbuf) +{ + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (trans); + GstVideoFrame src_frame; + GstVideoFrame dest_frame; + GstBuffer *buffer = NULL; + GstFlowReturn ret; + + ret = GST_BASE_TRANSFORM_CLASS (parent_class)->generate_output (trans, + outbuf); + + if (ret != GST_FLOW_OK || *outbuf == NULL) + return ret; + + if (!self->priv->other_pool) + return GST_FLOW_OK; + + /* Now we need to copy the output buffer */ + ret = GST_FLOW_ERROR; + + if (!gst_buffer_pool_set_active (self->priv->other_pool, TRUE)) { + GST_WARNING_OBJECT (self, "failed to activate the other pool %" + GST_PTR_FORMAT, self->priv->other_pool); + goto out; + } + + ret = gst_buffer_pool_acquire_buffer (self->priv->other_pool, &buffer, NULL); + if (ret != GST_FLOW_OK) + goto out; + + if (!gst_video_frame_map (&src_frame, &self->priv->srcpad_info, *outbuf, + GST_MAP_READ)) + goto out; + + if (!gst_video_frame_map (&dest_frame, &self->out_info, buffer, + GST_MAP_WRITE)) { + gst_video_frame_unmap (&src_frame); + goto out; + } + + if (!gst_video_frame_copy (&dest_frame, &src_frame)) { + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + goto out; + } + + gst_video_frame_unmap (&src_frame); + gst_video_frame_unmap (&dest_frame); + + gst_buffer_replace (outbuf, buffer); + ret = GST_FLOW_OK; + +out: + gst_clear_buffer (&buffer); + return ret; +} + +static GstStateChangeReturn +gst_va_base_transform_change_state (GstElement * element, + GstStateChange transition) +{ + GstVaBaseTransform *self =
GST_VA_BASE_TRANSFORM (element); + GstVaBaseTransformClass *klass = GST_VA_BASE_TRANSFORM_GET_CLASS (element); + GstStateChangeReturn ret; + + switch (transition) { + case GST_STATE_CHANGE_NULL_TO_READY: + if (!gst_va_ensure_element_data (element, klass->render_device_path, + &self->display)) + goto open_failed; + gst_clear_caps (&self->priv->filter_caps); + gst_clear_object (&self->filter); + self->filter = gst_va_filter_new (self->display); + if (!gst_va_filter_open (self->filter)) + goto open_failed; + if (klass->update_properties) + klass->update_properties (self); + break; + default: + break; + } + + ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); + + switch (transition) { + case GST_STATE_CHANGE_PAUSED_TO_READY: + gst_va_filter_close (self->filter); + break; + case GST_STATE_CHANGE_READY_TO_NULL: + gst_clear_caps (&self->priv->filter_caps); + gst_clear_object (&self->filter); + gst_clear_object (&self->display); + break; + default: + break; + } + + return ret; + + /* Errors */ +open_failed: + { + GST_ELEMENT_ERROR (self, LIBRARY, INIT, (NULL), ("Failed to open VPP")); + return GST_STATE_CHANGE_FAILURE; + } +} + +static void +gst_va_base_transform_set_context (GstElement * element, GstContext * context) +{ + GstVaDisplay *old_display, *new_display; + GstVaBaseTransform *self = GST_VA_BASE_TRANSFORM (element); + GstVaBaseTransformClass *klass = GST_VA_BASE_TRANSFORM_GET_CLASS (self); + gboolean ret; + + old_display = self->display ? gst_object_ref (self->display) : NULL; + ret = gst_va_handle_set_context (element, context, klass->render_device_path, + &self->display); + new_display = self->display ? 
gst_object_ref (self->display) : NULL; + + if (!ret + || (old_display && new_display && old_display != new_display + && self->filter)) { + GST_ELEMENT_WARNING (element, RESOURCE, BUSY, + ("Can't replace VA display while operating"), (NULL)); + } + + gst_clear_object (&old_display); + gst_clear_object (&new_display); + + GST_ELEMENT_CLASS (parent_class)->set_context (element, context); +} + +static void +gst_va_base_transform_class_init (GstVaBaseTransformClass * klass) +{ + GObjectClass *gobject_class; + GstElementClass *element_class; + GstBaseTransformClass *trans_class; + + gobject_class = G_OBJECT_CLASS (klass); + element_class = GST_ELEMENT_CLASS (klass); + trans_class = GST_BASE_TRANSFORM_CLASS (klass); + + gobject_class->dispose = gst_va_base_transform_dispose; + + trans_class->query = GST_DEBUG_FUNCPTR (gst_va_base_transform_query); + trans_class->set_caps = GST_DEBUG_FUNCPTR (gst_va_base_transform_set_caps); + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_va_base_transform_propose_allocation); + trans_class->decide_allocation = + GST_DEBUG_FUNCPTR (gst_va_base_transform_decide_allocation); + trans_class->transform_size = + GST_DEBUG_FUNCPTR (gst_va_base_transform_transform_size); + trans_class->generate_output = + GST_DEBUG_FUNCPTR (gst_va_base_transform_generate_output); + + element_class->set_context = + GST_DEBUG_FUNCPTR (gst_va_base_transform_set_context); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_va_base_transform_change_state); + + gst_type_mark_as_plugin_api (GST_TYPE_VA_BASE_TRANSFORM, 0); +} + +GstAllocator * +gst_va_base_transform_allocator_from_caps (GstVaBaseTransform * self, + GstCaps * caps) +{ + GstAllocator *allocator = NULL; + + if (gst_caps_is_dmabuf (caps)) { + allocator = gst_va_dmabuf_allocator_new (self->display); + } else { + GArray *surface_formats = gst_va_filter_get_surface_formats (self->filter); + allocator = gst_va_allocator_new (self->display, surface_formats); + } + + return allocator; +} + +static 
inline gsize +_get_plane_data_size (GstVideoInfo * info, guint plane) +{ + gint comp[GST_VIDEO_MAX_COMPONENTS]; + gint height, padded_height; + + gst_video_format_info_component (info->finfo, plane, comp); + + height = GST_VIDEO_INFO_HEIGHT (info); + padded_height = + GST_VIDEO_FORMAT_INFO_SCALE_HEIGHT (info->finfo, comp[0], height); + + return GST_VIDEO_INFO_PLANE_STRIDE (info, plane) * padded_height; +} + +static gboolean +_try_import_dmabuf_unlocked (GstVaBaseTransform * self, GstBuffer * inbuf) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + GstVideoMeta *meta; + GstVideoInfo in_info = btrans->in_info; + GstMemory *mems[GST_VIDEO_MAX_PLANES]; + guint i, n_mem, n_planes; + gsize offset[GST_VIDEO_MAX_PLANES]; + uintptr_t fd[GST_VIDEO_MAX_PLANES]; + + n_planes = GST_VIDEO_INFO_N_PLANES (&in_info); + n_mem = gst_buffer_n_memory (inbuf); + meta = gst_buffer_get_video_meta (inbuf); + + /* This will eliminate most non-dmabuf buffers out there */ + if (!gst_is_dmabuf_memory (gst_buffer_peek_memory (inbuf, 0))) + return FALSE; + + /* We cannot have multiple dmabufs per plane */ + if (n_mem > n_planes) + return FALSE; + + /* Update video info based on video meta */ + if (meta) { + GST_VIDEO_INFO_WIDTH (&in_info) = meta->width; + GST_VIDEO_INFO_HEIGHT (&in_info) = meta->height; + + for (i = 0; i < meta->n_planes; i++) { + GST_VIDEO_INFO_PLANE_OFFSET (&in_info, i) = meta->offset[i]; + GST_VIDEO_INFO_PLANE_STRIDE (&in_info, i) = meta->stride[i]; + } + } + + /* Find and validate all memories */ + for (i = 0; i < n_planes; i++) { + guint plane_size; + guint length; + guint mem_idx; + gsize mem_skip; + + plane_size = _get_plane_data_size (&in_info, i); + + if (!gst_buffer_find_memory (inbuf, in_info.offset[i], plane_size, + &mem_idx, &length, &mem_skip)) + return FALSE; + + /* We can't have more than one dmabuf per plane */ + if (length != 1) + return FALSE; + + mems[i] = gst_buffer_peek_memory (inbuf, mem_idx); + + /* And all memory found must be dmabuf */ + if
(!gst_is_dmabuf_memory (mems[i])) + return FALSE; + + offset[i] = mems[i]->offset + mem_skip; + fd[i] = gst_dmabuf_memory_get_fd (mems[i]); + } + + /* Now create a VASurfaceID for the buffer */ + return gst_va_dmabuf_memories_setup (btrans->display, &in_info, n_planes, + mems, fd, offset, VA_SURFACE_ATTRIB_USAGE_HINT_VPP_READ); +} + +static GstBufferPool * +_get_sinkpad_pool (GstVaBaseTransform * self) +{ + GstAllocator *allocator; + GstAllocationParams params = { 0, }; + GstCaps *caps; + GstVideoInfo in_info; + guint size, usage_hint = VA_SURFACE_ATTRIB_USAGE_HINT_VPP_READ; + + if (self->priv->sinkpad_pool) + return self->priv->sinkpad_pool; + + gst_allocation_params_init (&params); + + if (self->priv->sinkpad_caps) { + caps = self->priv->sinkpad_caps; + gst_video_info_from_caps (&in_info, caps); + } else { + caps = self->in_caps; + in_info = self->in_info; + } + + size = GST_VIDEO_INFO_SIZE (&in_info); + + allocator = gst_va_base_transform_allocator_from_caps (self, caps); + self->priv->sinkpad_pool = gst_va_pool_new_with_config (caps, size, 1, 0, + usage_hint, allocator, &params); + if (!self->priv->sinkpad_pool) { + gst_object_unref (allocator); + return NULL; + } + + if (GST_IS_VA_DMABUF_ALLOCATOR (allocator)) { + gst_va_dmabuf_allocator_get_format (allocator, &self->priv->sinkpad_info, + NULL); + } else if (GST_IS_VA_ALLOCATOR (allocator)) { + gst_va_allocator_get_format (allocator, &self->priv->sinkpad_info, NULL); + } + + gst_object_unref (allocator); + + if (!gst_buffer_pool_set_active (self->priv->sinkpad_pool, TRUE)) { + GST_WARNING_OBJECT (self, "failed to activate the sinkpad pool %" + GST_PTR_FORMAT, self->priv->sinkpad_pool); + return NULL; + } + + return self->priv->sinkpad_pool; +} + +static gboolean +_try_import_buffer (GstVaBaseTransform * self, GstBuffer * inbuf) +{ + VASurfaceID surface; + gboolean ret; + + surface = gst_va_buffer_get_surface (inbuf); + if (surface != VA_INVALID_ID) + return TRUE; + + g_rec_mutex_lock (&GST_VA_SHARED_LOCK); + ret =
_try_import_dmabuf_unlocked (self, inbuf); + g_rec_mutex_unlock (&GST_VA_SHARED_LOCK); + + return ret; +} + +GstFlowReturn +gst_va_base_transform_import_buffer (GstVaBaseTransform * self, + GstBuffer * inbuf, GstBuffer ** buf) +{ + GstBuffer *buffer = NULL; + GstBufferPool *pool; + GstFlowReturn ret; + GstVideoFrame in_frame, out_frame; + gboolean imported, copied; + + g_return_val_if_fail (GST_IS_VA_BASE_TRANSFORM (self), GST_FLOW_ERROR); + + imported = _try_import_buffer (self, inbuf); + if (imported) { + *buf = gst_buffer_ref (inbuf); + return GST_FLOW_OK; + } + + /* the input buffer doesn't come from a VA pool, so grab a new + * buffer from our sinkpad pool and copy the input buffer into it */ + if (!(pool = _get_sinkpad_pool (self))) + return GST_FLOW_ERROR; + + ret = gst_buffer_pool_acquire_buffer (pool, &buffer, NULL); + if (ret != GST_FLOW_OK) + return ret; + + GST_LOG_OBJECT (self, "copying input frame"); + + if (!gst_video_frame_map (&in_frame, &self->in_info, inbuf, GST_MAP_READ)) + goto invalid_buffer; + + if (!gst_video_frame_map (&out_frame, &self->priv->sinkpad_info, buffer, + GST_MAP_WRITE)) { + gst_video_frame_unmap (&in_frame); + goto invalid_buffer; + } + + copied = gst_video_frame_copy (&out_frame, &in_frame); + + gst_video_frame_unmap (&out_frame); + gst_video_frame_unmap (&in_frame); + + if (!copied) + goto invalid_buffer; + + /* copy metadata; the default implementation of the base class will + * copy everything we need */ + GST_BASE_TRANSFORM_CLASS (parent_class)->copy_metadata + (GST_BASE_TRANSFORM_CAST (self), inbuf, buffer); + + *buf = buffer; + + return GST_FLOW_OK; + +invalid_buffer: + { + GST_ELEMENT_WARNING (self, CORE, NOT_IMPLEMENTED, (NULL), + ("invalid video buffer received")); + if (buffer) + gst_buffer_unref (buffer); + return GST_FLOW_OK; + } +} + +GstCaps * +gst_va_base_transform_get_filter_caps (GstVaBaseTransform * self) +{ + g_return_val_if_fail (GST_IS_VA_BASE_TRANSFORM (self), NULL); + +
GST_OBJECT_LOCK (self); + if (self->priv->filter_caps) { + GST_OBJECT_UNLOCK (self); + return self->priv->filter_caps; + } + GST_OBJECT_UNLOCK (self); + + if (!self->filter) + return NULL; + + GST_OBJECT_LOCK (self); + self->priv->filter_caps = gst_va_filter_get_caps (self->filter); + GST_OBJECT_UNLOCK (self); + return self->priv->filter_caps; +}
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvabasetransform.h
Added
@@ -0,0 +1,95 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/base/gstbasetransform.h> + +#include "gstvafilter.h" + +G_BEGIN_DECLS + +#define GST_TYPE_VA_BASE_TRANSFORM (gst_va_base_transform_get_type()) +#define GST_VA_BASE_TRANSFORM(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_VA_BASE_TRANSFORM, GstVaBaseTransform)) +#define GST_IS_VA_BASE_TRANSFORM(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_VA_BASE_TRANSFORM)) +#define GST_VA_BASE_TRANSFORM_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_VA_BASE_TRANSFORM, GstVaBaseTransformClass)) +#define GST_IS_VA_BASE_TRANSFORM_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_VA_BASE_TRANSFORM)) +#define GST_VA_BASE_TRANSFORM_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS((obj), GST_TYPE_VA_BASE_TRANSFORM, GstVaBaseTransformClass)) + +typedef struct _GstVaBaseTransform GstVaBaseTransform; +typedef struct _GstVaBaseTransformClass GstVaBaseTransformClass; +typedef struct _GstVaBaseTransformPrivate GstVaBaseTransformPrivate; + +struct _GstVaBaseTransform +{ + GstBaseTransform parent; + + /*< public >*/ + GstVaDisplay *display; + GstVaFilter *filter; + + GstCaps *in_caps; + 
GstCaps *out_caps; + GstVideoInfo in_info; + GstVideoInfo out_info; + + gboolean negotiated; + + guint extra_min_buffers; + + /*< private >*/ + GstVaBaseTransformPrivate *priv; + + gpointer _padding[GST_PADDING]; +}; + +struct _GstVaBaseTransformClass +{ + GstBaseTransformClass parent_class; + + /*< public >*/ + gboolean (*set_info) (GstVaBaseTransform *self, + GstCaps *incaps, GstVideoInfo *in_info, + GstCaps *outcaps, GstVideoInfo *out_info); + + void (*update_properties) (GstVaBaseTransform *self); + + /*< private >*/ + gchar *render_device_path; + + gpointer _padding[GST_PADDING]; +}; + +GType gst_va_base_transform_get_type (void); + +GstAllocator * gst_va_base_transform_allocator_from_caps + (GstVaBaseTransform * self, + GstCaps * caps); + +GstFlowReturn gst_va_base_transform_import_buffer (GstVaBaseTransform * self, + GstBuffer * inbuf, + GstBuffer ** buf); + +GstCaps * gst_va_base_transform_get_filter_caps + (GstVaBaseTransform * self); + +G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVaBaseTransform, gst_object_unref) + +G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvacaps.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvacaps.c
Changed
@@ -24,14 +24,16 @@ #include "gstvacaps.h" +#include <gst/allocators/allocators.h> + #include <va/va_drmcommon.h> -#include "gstvadisplay.h" +#include "gstvadisplay_priv.h" #include "gstvaprofile.h" #include "gstvavideoformat.h" -GST_DEBUG_CATEGORY_EXTERN (gst_va_display_debug); -#define GST_CAT_DEFAULT gst_va_display_debug +GST_DEBUG_CATEGORY_EXTERN (gstva_debug); +#define GST_CAT_DEFAULT gstva_debug static const guint va_rt_format_list[] = { #define R(name) G_PASTE (VA_RT_FORMAT_, name) @@ -93,14 +95,17 @@ return NULL; } -static gboolean -_gst_caps_set_format_array (GstCaps * caps, GArray * formats) +gboolean +gst_caps_set_format_array (GstCaps * caps, GArray * formats) { GstVideoFormat fmt; GValue v_formats = G_VALUE_INIT; const gchar *format; guint i; + g_return_val_if_fail (GST_IS_CAPS (caps), FALSE); + g_return_val_if_fail (formats, FALSE); + if (formats->len == 1) { fmt = g_array_index (formats, GstVideoFormat, 0); if (fmt == GST_VIDEO_FORMAT_UNKNOWN) @@ -144,7 +149,7 @@ gst_va_create_raw_caps_from_config (GstVaDisplay * display, VAConfigID config) { GArray *formats; - GstCaps *caps, *base_caps, *feature_caps; + GstCaps *caps = NULL, *base_caps, *feature_caps; GstCapsFeatures *features; GstVideoFormat format; VASurfaceAttrib *attribs; @@ -188,79 +193,36 @@ /* if driver doesn't report surface formats for current * chroma. 
Gallium AMD bug for 4:2:2 */ - if (formats->len == 0) { - caps = NULL; + if (formats->len == 0) goto bail; - } base_caps = gst_caps_new_simple ("video/x-raw", "width", GST_TYPE_INT_RANGE, min_width, max_width, "height", GST_TYPE_INT_RANGE, min_height, max_height, NULL); - _gst_caps_set_format_array (base_caps, formats); + if (!gst_caps_set_format_array (base_caps, formats)) { + gst_caps_unref (base_caps); + goto bail; + } caps = gst_caps_new_empty (); if (mem_type & VA_SURFACE_ATTRIB_MEM_TYPE_VA) { feature_caps = gst_caps_copy (base_caps); - features = gst_caps_features_from_string ("memory:VAMemory"); + features = gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_VA); gst_caps_set_features_simple (feature_caps, features); caps = gst_caps_merge (caps, feature_caps); } if (mem_type & VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME || mem_type & VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2) { feature_caps = gst_caps_copy (base_caps); - features = gst_caps_features_from_string ("memory:DMABuf"); + features = gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_DMABUF); gst_caps_set_features_simple (feature_caps, features); caps = gst_caps_merge (caps, feature_caps); } - /* raw caps */ - /* XXX(victor): assumption -- drivers can only download to image - * formats with same chroma of surface's format - */ - { - GstCaps *raw_caps; - GArray *image_formats = gst_va_display_get_image_formats (display); - - if (!image_formats) { - raw_caps = gst_caps_copy (base_caps); - } else { - GArray *raw_formats = g_array_new (FALSE, FALSE, sizeof (GstVideoFormat)); - guint j, surface_chroma, image_chroma; - GstVideoFormat image_format; - - raw_caps = - gst_caps_new_simple ("video/x-raw", "width", GST_TYPE_INT_RANGE, - min_width, max_width, "height", GST_TYPE_INT_RANGE, min_height, - max_height, NULL); - - for (i = 0; i < formats->len; i++) { - format = g_array_index (formats, GstVideoFormat, i); - surface_chroma = gst_va_chroma_from_video_format (format); - if (surface_chroma == 0) - continue; - 
g_array_append_val (raw_formats, format); - - for (j = 0; j < image_formats->len; j++) { - image_format = g_array_index (image_formats, GstVideoFormat, j); - image_chroma = gst_va_chroma_from_video_format (image_format); - if (image_format != format && surface_chroma == image_chroma) - g_array_append_val (raw_formats, image_format); - } - } - - if (!_gst_caps_set_format_array (raw_caps, raw_formats)) { - gst_caps_unref (raw_caps); - raw_caps = gst_caps_copy (base_caps); - } - - g_array_unref (raw_formats); - g_array_unref (image_formats); - } - - caps = gst_caps_merge (caps, raw_caps); - } + /* raw caps */ + caps = gst_caps_merge (caps, gst_caps_copy (base_caps)); gst_caps_unref (base_caps); @@ -371,6 +333,49 @@ return caps; } +static GstCaps * +_regroup_raw_caps (GstCaps * caps) +{ + GstCaps *sys_caps, *va_caps, *dma_caps, *tmp; + guint size, i; + + if (gst_caps_is_any (caps) || gst_caps_is_empty (caps)) + return caps; + + size = gst_caps_get_size (caps); + if (size <= 1) + return caps; + + /* We need to simplify caps by features. 
*/ + sys_caps = gst_caps_new_empty (); + va_caps = gst_caps_new_empty (); + dma_caps = gst_caps_new_empty (); + for (i = 0; i < size; i++) { + GstCapsFeatures *ft; + + tmp = gst_caps_copy_nth (caps, i); + ft = gst_caps_get_features (tmp, 0); + if (gst_caps_features_contains (ft, GST_CAPS_FEATURE_MEMORY_DMABUF)) { + dma_caps = gst_caps_merge (dma_caps, tmp); + } else if (gst_caps_features_contains (ft, GST_CAPS_FEATURE_MEMORY_VA)) { + va_caps = gst_caps_merge (va_caps, tmp); + } else { + sys_caps = gst_caps_merge (sys_caps, tmp); + } + } + + sys_caps = gst_caps_simplify (sys_caps); + va_caps = gst_caps_simplify (va_caps); + dma_caps = gst_caps_simplify (dma_caps); + + sys_caps = gst_caps_merge (sys_caps, va_caps); + sys_caps = gst_caps_merge (sys_caps, dma_caps); + + gst_caps_unref (caps); + + return sys_caps; +} + gboolean gst_va_caps_from_profiles (GstVaDisplay * display, GArray * profiles, VAEntrypoint entrypoint, GstCaps ** codedcaps_ptr, GstCaps ** rawcaps_ptr) @@ -458,7 +463,7 @@ gst_caps_replace (&codedcaps, NULL); if ((ret = codedcaps && rawcaps)) { - rawcaps = gst_caps_simplify (rawcaps); + rawcaps = _regroup_raw_caps (rawcaps); codedcaps = gst_caps_simplify (codedcaps); if (rawcaps_ptr) @@ -474,3 +479,33 @@ return ret; } + +static inline gboolean +_caps_is (GstCaps * caps, const gchar * feature) +{ + GstCapsFeatures *features; + + if (!gst_caps_is_fixed (caps)) + return FALSE; + + features = gst_caps_get_features (caps, 0); + return gst_caps_features_contains (features, feature); +} + +gboolean +gst_caps_is_dmabuf (GstCaps * caps) +{ + return _caps_is (caps, GST_CAPS_FEATURE_MEMORY_DMABUF); +} + +gboolean +gst_caps_is_vamemory (GstCaps * caps) +{ + return _caps_is (caps, GST_CAPS_FEATURE_MEMORY_VA); +} + +gboolean +gst_caps_is_raw (GstCaps * caps) +{ + return _caps_is (caps, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); +}
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvacaps.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvacaps.h
Changed
@@ -20,7 +20,8 @@ #pragma once -#include "gstvadisplay.h" +#include <gst/va/gstvadisplay.h> +#include <va/va.h> G_BEGIN_DECLS @@ -37,5 +38,12 @@ GstCaps * gst_va_create_raw_caps_from_config (GstVaDisplay * display, VAConfigID config); +gboolean gst_caps_set_format_array (GstCaps * caps, + GArray * formats); + +gboolean gst_caps_is_dmabuf (GstCaps * caps); +gboolean gst_caps_is_vamemory (GstCaps * caps); +gboolean gst_caps_is_raw (GstCaps * caps); + G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadecoder.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadecoder.c
Changed
@@ -24,9 +24,11 @@ #include "gstvadecoder.h" +#include "gstvaallocator.h" #include "gstvacaps.h" -#include "gstvadisplay_wrapped.h" +#include "gstvadisplay_priv.h" #include "gstvavideoformat.h" +#include <gst/va/gstvadisplay_wrapped.h> struct _GstVaDecoder { @@ -64,6 +66,8 @@ static GParamSpec *g_properties[N_PROPERTIES]; +static gboolean _destroy_buffers (GstVaDecodePicture * pic); + static void gst_va_decoder_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) @@ -115,7 +119,8 @@ { GstVaDecoder *self = GST_VA_DECODER (object); - gst_va_decoder_close (self); + if (!gst_va_decoder_close (self)) + GST_WARNING_OBJECT (self, "VaDecoder is not successfully closed"); g_clear_pointer (&self->available_profiles, g_array_unref); gst_clear_object (&self->display); @@ -292,8 +297,8 @@ } gboolean -gst_va_decoder_set_format (GstVaDecoder * self, gint coded_width, - gint coded_height, GArray * surfaces) +gst_va_decoder_set_frame_size_with_surfaces (GstVaDecoder * self, + gint coded_width, gint coded_height, GArray * surfaces) { VAContextID context; VADisplay dpy; @@ -306,7 +311,7 @@ GST_OBJECT_LOCK (self); if (self->context != VA_INVALID_ID) { GST_OBJECT_UNLOCK (self); - GST_INFO_OBJECT (self, "decoder already has a format"); + GST_INFO_OBJECT (self, "decoder already has a context"); return TRUE; } GST_OBJECT_UNLOCK (self); @@ -342,6 +347,44 @@ return TRUE; } +gboolean +gst_va_decoder_set_frame_size (GstVaDecoder * self, gint coded_width, + gint coded_height) +{ + return gst_va_decoder_set_frame_size_with_surfaces (self, coded_width, + coded_height, NULL); +} + +/* This function is only used by codecs where frame size can change + * without a context reset, as for example VP9 */ +gboolean +gst_va_decoder_update_frame_size (GstVaDecoder * self, gint coded_width, + gint coded_height) +{ + g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); + + if (!gst_va_decoder_is_open (self)) { + GST_ERROR_OBJECT (self, "decoder has not been opened 
yet"); + return FALSE; + } + + GST_OBJECT_LOCK (self); + if (self->context == VA_INVALID_ID) { + GST_OBJECT_UNLOCK (self); + GST_INFO_OBJECT (self, "decoder does not have a context"); + return FALSE; + } + GST_OBJECT_UNLOCK (self); + + + GST_OBJECT_LOCK (self); + self->coded_width = coded_width; + self->coded_height = coded_height; + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + static gboolean _get_codec_caps (GstVaDecoder * self) { @@ -370,7 +413,7 @@ g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); - if (self->srcpad_caps) + if (g_atomic_pointer_get (&self->srcpad_caps)) return gst_caps_ref (self->srcpad_caps); if (_get_codec_caps (self)) @@ -393,7 +436,7 @@ { g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); - if (self->sinkpad_caps) + if (g_atomic_pointer_get (&self->sinkpad_caps)) return gst_caps_ref (self->sinkpad_caps); if (_get_codec_caps (self)) @@ -509,7 +552,6 @@ g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); g_return_val_if_fail (self->context != VA_INVALID_ID, FALSE); g_return_val_if_fail (pic && data && size > 0, FALSE); - g_return_val_if_fail (pic->buffers->len + 1 <= 16, FALSE); dpy = gst_va_display_get_va_dpy (self->display); gst_va_display_lock (self->display); @@ -525,9 +567,9 @@ } gboolean -gst_va_decoder_add_slice_buffer (GstVaDecoder * self, GstVaDecodePicture * pic, - gpointer params_data, gsize params_size, gpointer slice_data, - gsize slice_size) +gst_va_decoder_add_slice_buffer_with_n_params (GstVaDecoder * self, + GstVaDecodePicture * pic, gpointer params_data, gsize params_size, + guint params_num, gpointer slice_data, gsize slice_size) { + VABufferID params_buffer, slice_buffer; VADisplay dpy; @@ -541,7 +583,7 @@ dpy = gst_va_display_get_va_dpy (self->display); gst_va_display_lock (self->display); status = vaCreateBuffer (dpy, self->context, VASliceParameterBufferType, - params_size, 1, params_data, &params_buffer); + params_size, params_num, params_data, &params_buffer); gst_va_display_unlock (self->display); if (status
!= VA_STATUS_SUCCESS) { GST_ERROR_OBJECT (self, "vaCreateBuffer: %s", vaErrorStr (status)); @@ -564,58 +606,81 @@ } gboolean -gst_va_decoder_decode (GstVaDecoder * self, GstVaDecodePicture * pic) +gst_va_decoder_add_slice_buffer (GstVaDecoder * self, GstVaDecodePicture * pic, + gpointer params_data, gsize params_size, gpointer slice_data, + gsize slice_size) +{ + return gst_va_decoder_add_slice_buffer_with_n_params (self, pic, params_data, + params_size, 1, slice_data, slice_size); +} + +gboolean +gst_va_decoder_decode_with_aux_surface (GstVaDecoder * self, + GstVaDecodePicture * pic, gboolean use_aux) { VADisplay dpy; VAStatus status; + VASurfaceID surface = VA_INVALID_ID; gboolean ret = FALSE; g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); g_return_val_if_fail (self->context != VA_INVALID_ID, FALSE); - g_return_val_if_fail (pic && pic->surface != VA_INVALID_ID, FALSE); + g_return_val_if_fail (pic, FALSE); - GST_TRACE_OBJECT (self, "Decode to surface %#x", pic->surface); + if (use_aux) { + surface = gst_va_decode_picture_get_aux_surface (pic); + } else { + surface = gst_va_decode_picture_get_surface (pic); + } + if (surface == VA_INVALID_ID) { + GST_ERROR_OBJECT (self, "Decode picture without VASurfaceID"); + return FALSE; + } + + GST_TRACE_OBJECT (self, "Decode to surface %#x", surface); dpy = gst_va_display_get_va_dpy (self->display); gst_va_display_lock (self->display); - status = vaBeginPicture (dpy, self->context, pic->surface); + status = vaBeginPicture (dpy, self->context, surface); gst_va_display_unlock (self->display); if (status != VA_STATUS_SUCCESS) { - GST_ERROR_OBJECT (self, "vaBeginPicture: %s", vaErrorStr (status)); + GST_WARNING_OBJECT (self, "vaBeginPicture: %s", vaErrorStr (status)); goto fail_end_pic; } - gst_va_display_lock (self->display); - status = vaRenderPicture (dpy, self->context, - (VABufferID *) pic->buffers->data, pic->buffers->len); - gst_va_display_unlock (self->display); - if (status != VA_STATUS_SUCCESS) { - 
GST_ERROR_OBJECT (self, "vaRenderPicture: %s", vaErrorStr (status)); - goto fail_end_pic; + if (pic->buffers->len > 0) { + gst_va_display_lock (self->display); + status = vaRenderPicture (dpy, self->context, + (VABufferID *) pic->buffers->data, pic->buffers->len); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING_OBJECT (self, "vaRenderPicture: %s", vaErrorStr (status)); + goto fail_end_pic; + } } - gst_va_display_lock (self->display); - status = vaRenderPicture (dpy, self->context, - (VABufferID *) pic->slices->data, pic->slices->len); - gst_va_display_unlock (self->display); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR_OBJECT (self, "vaRenderPicture: %s", vaErrorStr (status)); - goto fail_end_pic; + if (pic->slices->len > 0) { + gst_va_display_lock (self->display); + status = vaRenderPicture (dpy, self->context, + (VABufferID *) pic->slices->data, pic->slices->len); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING_OBJECT (self, "vaRenderPicture: %s", vaErrorStr (status)); + goto fail_end_pic; + } } gst_va_display_lock (self->display); status = vaEndPicture (dpy, self->context); gst_va_display_unlock (self->display); - if (status != VA_STATUS_SUCCESS) { - GST_ERROR_OBJECT (self, "vaEndPicture: %s", vaErrorStr (status)); - goto bail; - } - - ret = TRUE; + if (status != VA_STATUS_SUCCESS) + GST_WARNING_OBJECT (self, "vaEndPicture: %s", vaErrorStr (status)); + else + ret = TRUE; bail: - gst_va_decoder_destroy_buffers (self, pic); + _destroy_buffers (pic); return ret; @@ -624,14 +689,59 @@ gst_va_display_lock (self->display); status = vaEndPicture (dpy, self->context); gst_va_display_unlock (self->display); - if (status != VA_STATUS_SUCCESS) - GST_ERROR_OBJECT (self, "vaEndPicture: %s", vaErrorStr (status)); goto bail; } } gboolean -gst_va_decoder_destroy_buffers (GstVaDecoder * self, GstVaDecodePicture * pic) +gst_va_decoder_decode (GstVaDecoder * self, GstVaDecodePicture * pic) 
+{ + return gst_va_decoder_decode_with_aux_surface (self, pic, FALSE); +} + +gboolean +gst_va_decoder_config_is_equal (GstVaDecoder * self, VAProfile new_profile, + guint new_rtformat, gint new_width, gint new_height) +{ + gboolean ret; + + g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); + + /* @TODO: Check if current buffers are large enough, and reuse + * them */ + GST_OBJECT_LOCK (self); + ret = (self->profile == new_profile && self->rt_format == new_rtformat + && self->coded_width == new_width && self->coded_height == new_height); + GST_OBJECT_UNLOCK (self); + + return ret; +} + +gboolean +gst_va_decoder_get_config (GstVaDecoder * self, VAProfile * profile, + guint * rt_format, gint * width, gint * height) +{ + g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); + + if (!gst_va_decoder_is_open (self)) + return FALSE; + + GST_OBJECT_LOCK (self); + if (profile) + *profile = self->profile; + if (rt_format) + *rt_format = self->rt_format; + if (width) + *width = self->coded_width; + if (height) + *height = self->coded_height; + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gboolean +_destroy_buffers (GstVaDecodePicture * pic) { VABufferID buffer; VADisplay dpy; @@ -639,68 +749,105 @@ guint i; gboolean ret = TRUE; - g_return_val_if_fail (GST_IS_VA_DECODER (self), FALSE); - g_return_val_if_fail (pic && pic->surface != VA_INVALID_ID, FALSE); - - GST_TRACE_OBJECT (self, "Destroy buffers of surface %#x", pic->surface); - - dpy = gst_va_display_get_va_dpy (self->display); - - for (i = 0; i < pic->buffers->len; i++) { - buffer = g_array_index (pic->buffers, VABufferID, i); - gst_va_display_lock (self->display); - status = vaDestroyBuffer (dpy, buffer); - gst_va_display_unlock (self->display); - if (status != VA_STATUS_SUCCESS) { - ret = FALSE; - GST_WARNING ("Failed to destroy parameter buffer: %s", - vaErrorStr (status)); + g_return_val_if_fail (GST_IS_VA_DISPLAY (pic->display), FALSE); + + dpy = gst_va_display_get_va_dpy (pic->display); + + if 
(pic->buffers) { + for (i = 0; i < pic->buffers->len; i++) { + buffer = g_array_index (pic->buffers, VABufferID, i); + gst_va_display_lock (pic->display); + status = vaDestroyBuffer (dpy, buffer); + gst_va_display_unlock (pic->display); + if (status != VA_STATUS_SUCCESS) { + ret = FALSE; + GST_WARNING ("Failed to destroy parameter buffer: %s", + vaErrorStr (status)); + } } + + pic->buffers = g_array_set_size (pic->buffers, 0); } - for (i = 0; i < pic->slices->len; i++) { - buffer = g_array_index (pic->slices, VABufferID, i); - gst_va_display_lock (self->display); - status = vaDestroyBuffer (dpy, buffer); - gst_va_display_unlock (self->display); - if (status != VA_STATUS_SUCCESS) { - ret = FALSE; - GST_WARNING ("Failed to destroy slice buffer: %s", vaErrorStr (status)); + if (pic->slices) { + for (i = 0; i < pic->slices->len; i++) { + buffer = g_array_index (pic->slices, VABufferID, i); + gst_va_display_lock (pic->display); + status = vaDestroyBuffer (dpy, buffer); + gst_va_display_unlock (pic->display); + if (status != VA_STATUS_SUCCESS) { + ret = FALSE; + GST_WARNING ("Failed to destroy slice buffer: %s", vaErrorStr (status)); + } } - } - pic->buffers = g_array_set_size (pic->buffers, 0); - pic->slices = g_array_set_size (pic->slices, 0); + pic->slices = g_array_set_size (pic->slices, 0); + } return ret; } - GstVaDecodePicture * -gst_va_decode_picture_new (VASurfaceID surface) +gst_va_decode_picture_new (GstVaDecoder * self, GstBuffer * buffer) { GstVaDecodePicture *pic; - g_return_val_if_fail (surface != VA_INVALID_ID, NULL); + g_return_val_if_fail (buffer && GST_IS_BUFFER (buffer), NULL); + g_return_val_if_fail (self && GST_IS_VA_DECODER (self), NULL); pic = g_slice_new (GstVaDecodePicture); - pic->surface = surface; + pic->gstbuffer = gst_buffer_ref (buffer); pic->buffers = g_array_sized_new (FALSE, FALSE, sizeof (VABufferID), 16); pic->slices = g_array_sized_new (FALSE, FALSE, sizeof (VABufferID), 64); + pic->display = gst_object_ref (self->display); return 
pic; } +VASurfaceID +gst_va_decode_picture_get_surface (GstVaDecodePicture * pic) +{ + g_return_val_if_fail (pic, VA_INVALID_ID); + g_return_val_if_fail (pic->gstbuffer, VA_INVALID_ID); + + return gst_va_buffer_get_surface (pic->gstbuffer); +} + +VASurfaceID +gst_va_decode_picture_get_aux_surface (GstVaDecodePicture * pic) +{ + g_return_val_if_fail (pic, VA_INVALID_ID); + g_return_val_if_fail (pic->gstbuffer, VA_INVALID_ID); + + return gst_va_buffer_get_aux_surface (pic->gstbuffer); +} + void gst_va_decode_picture_free (GstVaDecodePicture * pic) { g_return_if_fail (pic); - if (pic->buffers->len > 0 || pic->slices->len > 0) - GST_WARNING ("VABufferID are leaked"); + _destroy_buffers (pic); + gst_buffer_unref (pic->gstbuffer); g_clear_pointer (&pic->buffers, g_array_unref); g_clear_pointer (&pic->slices, g_array_unref); + gst_clear_object (&pic->display); g_slice_free (GstVaDecodePicture, pic); } + +GstVaDecodePicture * +gst_va_decode_picture_dup (GstVaDecodePicture * pic) +{ + GstVaDecodePicture *dup; + + g_return_val_if_fail (pic, NULL); + + dup = g_slice_new0 (GstVaDecodePicture); + + dup->display = gst_object_ref (pic->display); + /* dups only need gstbuffer */ + dup->gstbuffer = gst_buffer_ref (pic->gstbuffer); + return dup; +}
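The new `gst_va_decoder_config_is_equal()` above simply compares the four cached stream parameters under the object lock. A minimal standalone sketch of that check (plain C; the `decoder_config` struct and the `int`/`unsigned` stand-ins for `VAProfile` and the RT format are illustrative, not the real `GstVaDecoder` layout):

```c
#include <assert.h>
#include <stdbool.h>

/* Illustrative stand-ins for VAProfile and the VA RT format;
 * not the real GstVaDecoder fields. */
struct decoder_config {
  int profile;
  unsigned rt_format;
  int coded_width;
  int coded_height;
};

/* Mirrors the comparison in gst_va_decoder_config_is_equal(): a new
 * negotiation only forces a reconfiguration when any of the four
 * parameters changed. */
static bool
config_is_equal (const struct decoder_config *cur, int new_profile,
    unsigned new_rtformat, int new_width, int new_height)
{
  return cur->profile == new_profile
      && cur->rt_format == new_rtformat
      && cur->coded_width == new_width
      && cur->coded_height == new_height;
}
```

Per the `@TODO` in the diff, this is stricter than strictly needed: existing surfaces that are merely large enough could in principle be reused rather than rebuilt.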
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadecoder.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadecoder.h
Changed
@@ -20,16 +20,18 @@ #pragma once -#include "gstvadisplay.h" +#include <gst/va/gstvadisplay.h> +#include <va/va.h> G_BEGIN_DECLS typedef struct _GstVaDecodePicture GstVaDecodePicture; struct _GstVaDecodePicture { + GstVaDisplay *display; GArray *buffers; GArray *slices; - VASurfaceID surface; + GstBuffer *gstbuffer; }; #define GST_TYPE_VA_DECODER (gst_va_decoder_get_type()) @@ -42,34 +44,64 @@ guint rt_format); gboolean gst_va_decoder_close (GstVaDecoder * self); gboolean gst_va_decoder_is_open (GstVaDecoder * self); -gboolean gst_va_decoder_set_format (GstVaDecoder * self, +gboolean gst_va_decoder_set_frame_size_with_surfaces + (GstVaDecoder * self, gint coded_width, gint coded_height, GArray * surfaces); +gboolean gst_va_decoder_set_frame_size (GstVaDecoder * self, + gint coded_width, + gint coded_height); +gboolean gst_va_decoder_update_frame_size (GstVaDecoder * self, + gint coded_width, + gint coded_height); GstCaps * gst_va_decoder_get_srcpad_caps (GstVaDecoder * self); GstCaps * gst_va_decoder_get_sinkpad_caps (GstVaDecoder * self); gboolean gst_va_decoder_has_profile (GstVaDecoder * self, - VAProfile profile); + VAProfile profile); gint gst_va_decoder_get_mem_types (GstVaDecoder * self); GArray * gst_va_decoder_get_surface_formats (GstVaDecoder * self); gboolean gst_va_decoder_add_param_buffer (GstVaDecoder * self, - GstVaDecodePicture * pic, - gint type, - gpointer data, - gsize size); + GstVaDecodePicture * pic, + gint type, + gpointer data, + gsize size); gboolean gst_va_decoder_add_slice_buffer (GstVaDecoder * self, - GstVaDecodePicture * pic, - gpointer params_data, - gsize params_size, - gpointer slice_data, - gsize slice_size); + GstVaDecodePicture * pic, + gpointer params_data, + gsize params_size, + gpointer slice_data, + gsize slice_size); +gboolean gst_va_decoder_add_slice_buffer_with_n_params + (GstVaDecoder * self, + GstVaDecodePicture * pic, + gpointer params_data, + gsize params_size, + guint params_num, + gpointer slice_data, + gsize 
slice_size); gboolean gst_va_decoder_decode (GstVaDecoder * self, GstVaDecodePicture * pic); -gboolean gst_va_decoder_destroy_buffers (GstVaDecoder * self, - GstVaDecodePicture * pic); +gboolean gst_va_decoder_decode_with_aux_surface (GstVaDecoder * self, + GstVaDecodePicture * pic, + gboolean use_aux); +gboolean gst_va_decoder_config_is_equal (GstVaDecoder * decoder, + VAProfile new_profile, + guint new_rtformat, + gint new_width, + gint new_height); +gboolean gst_va_decoder_get_config (GstVaDecoder * decoder, + VAProfile * profile, + guint * rt_format, + gint * width, + gint * height); -GstVaDecodePicture * gst_va_decode_picture_new (VASurfaceID surface); +GstVaDecodePicture * gst_va_decode_picture_new (GstVaDecoder * self, + GstBuffer * buffer); +VASurfaceID gst_va_decode_picture_get_surface (GstVaDecodePicture * pic); +VASurfaceID gst_va_decode_picture_get_aux_surface (GstVaDecodePicture * pic); void gst_va_decode_picture_free (GstVaDecodePicture * pic); +GstVaDecodePicture * gst_va_decode_picture_dup (GstVaDecodePicture * pic); G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadeinterlace.c
Added
@@ -0,0 +1,865 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vadeinterlace + * @title: vadeinterlace + * @short_description: A VA-API based video deinterlace filter + * + * vadeinterlace deinterlaces interlaced video frames to progressive + * video frames. This element and its deinterlacing methods depend on + * the installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver, but it's usually available with the bob (linear) method. + * + * This element doesn't change the caps features; it only negotiates + * the same downstream and upstream. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=interlaced_video.mp4 ! parsebin ! vah264dec ! vadeinterlace ! vapostproc !
autovideosink + * ``` + * + * Since: 1.20 + * + */ + +/* ToDo: + * + * + field property to select only one field and keep the same framerate + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvadeinterlace.h" + +#include <gst/video/video.h> + +#include <va/va_drmcommon.h> + +#include "gstvaallocator.h" +#include "gstvabasetransform.h" +#include "gstvacaps.h" +#include "gstvadisplay_priv.h" +#include "gstvafilter.h" +#include "gstvapool.h" +#include "gstvautils.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_deinterlace_debug); +#define GST_CAT_DEFAULT gst_va_deinterlace_debug + +#define GST_VA_DEINTERLACE(obj) ((GstVaDeinterlace *) obj) +#define GST_VA_DEINTERLACE_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaDeinterlaceClass)) +#define GST_VA_DEINTERLACE_CLASS(klass) ((GstVaDeinterlaceClass *) klass) + +typedef struct _GstVaDeinterlace GstVaDeinterlace; +typedef struct _GstVaDeinterlaceClass GstVaDeinterlaceClass; + +enum CurrField +{ + UNKNOWN_FIELD, + FIRST_FIELD, + SECOND_FIELD, + FINISHED, +}; + +struct _GstVaDeinterlaceClass +{ + /* GstVideoFilter overlaps functionality */ + GstVaBaseTransformClass parent_class; +}; + +struct _GstVaDeinterlace +{ + GstVaBaseTransform parent; + + gboolean rebuild_filters; + VAProcDeinterlacingType method; + + guint num_backward_references; + + GstBuffer *history[8]; + gint hcount; + gint hdepth; + gint hcurr; + enum CurrField curr_field; + + /* Calculated buffer duration by using upstream framerate */ + GstClockTime default_duration; +}; + +static GstElementClass *parent_class = NULL; + +struct CData +{ + gchar *render_device_path; + gchar *description; +}; + +/* *INDENT-OFF* */ +static const gchar *caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR }") " ;" + GST_VIDEO_CAPS_MAKE ("{ VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, " + "I420, P010_10LE, RGBA, BGRA, ARGB, ABGR }"); +/* 
*INDENT-ON* */ + +static void +_reset_history (GstVaDeinterlace * self) +{ + gint i; + + for (i = 0; i < self->hcount; i++) + gst_buffer_unref (self->history[i]); + self->hcount = 0; +} + +static void +gst_va_deinterlace_dispose (GObject * object) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (object); + + _reset_history (self); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_deinterlace_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (object); + guint method; + + GST_OBJECT_LOCK (object); + switch (prop_id) { + case GST_VA_FILTER_PROP_DEINTERLACE_METHOD: + method = g_value_get_enum (value); + if (method != self->method) { + self->method = method; + g_atomic_int_set (&self->rebuild_filters, TRUE); + } + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + GST_OBJECT_UNLOCK (object); +} + +static void +gst_va_deinterlace_get_property (GObject * object, guint prop_id, + GValue * value, GParamSpec * pspec) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (object); + + GST_OBJECT_LOCK (object); + switch (prop_id) { + case GST_VA_FILTER_PROP_DEINTERLACE_METHOD:{ + g_value_set_enum (value, self->method); + break; + } + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + GST_OBJECT_UNLOCK (object); +} + +static GstFlowReturn +gst_va_deinterlace_submit_input_buffer (GstBaseTransform * trans, + gboolean is_discont, GstBuffer * input) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (trans); + GstVaDeinterlace *self = GST_VA_DEINTERLACE (trans); + GstBuffer *buf, *inbuf; + GstFlowReturn ret; + gint i; + + /* Let baseclass handle QoS first */ + ret = GST_BASE_TRANSFORM_CLASS (parent_class)->submit_input_buffer (trans, + is_discont, input); + if (ret != GST_FLOW_OK) + return ret; + + if (gst_base_transform_is_passthrough (trans)) + return ret; + + /* at this moment, 
baseclass must hold queued_buf */ + g_assert (trans->queued_buf != NULL); + + /* Check if we can use this buffer directly. If not, copy this into + * our fallback buffer */ + buf = trans->queued_buf; + trans->queued_buf = NULL; + + ret = gst_va_base_transform_import_buffer (btrans, buf, &inbuf); + if (ret != GST_FLOW_OK) + return ret; + + gst_buffer_unref (buf); + + if (self->hcount < self->hdepth) { + self->history[self->hcount++] = inbuf; + } else { + gst_clear_buffer (&self->history[0]); + for (i = 0; i + 1 < self->hcount; i++) + self->history[i] = self->history[i + 1]; + self->history[i] = inbuf; + } + + if (self->history[self->hcurr]) + self->curr_field = FIRST_FIELD; + + return ret; +} + +static void +_build_filter (GstVaDeinterlace * self) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + guint i, num_caps; + VAProcFilterCapDeinterlacing *caps; + guint32 num_forward_references; + + caps = gst_va_filter_get_filter_caps (btrans->filter, + VAProcFilterDeinterlacing, &num_caps); + if (!caps) + return; + + for (i = 0; i < num_caps; i++) { + if (caps[i].type != self->method) + continue; + + if (gst_va_filter_add_deinterlace_buffer (btrans->filter, self->method, + &num_forward_references, &self->num_backward_references)) { + self->hdepth = num_forward_references + self->num_backward_references + 1; + if (self->hdepth > 8) { + GST_ELEMENT_ERROR (self, STREAM, FAILED, + ("Pipeline requires too many references: (%u forward, %u backward)", + num_forward_references, self->num_backward_references), (NULL)); + } + GST_INFO_OBJECT (self, "References for method: %u forward / %u backward", + num_forward_references, self->num_backward_references); + self->hcurr = num_forward_references; + return; + } + } + + GST_ELEMENT_ERROR (self, LIBRARY, SETTINGS, + ("Invalid deinterlacing method: %d", self->method), (NULL)); +} + +static void +gst_va_deinterlace_rebuild_filters (GstVaDeinterlace * self) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + + 
if (!g_atomic_int_get (&self->rebuild_filters)) + return; + + _reset_history (self); + gst_va_filter_drop_filter_buffers (btrans->filter); + _build_filter (self); + + /* extra number of buffers for propose_allocation */ + if (self->hdepth > btrans->extra_min_buffers) { + btrans->extra_min_buffers = self->hdepth; + gst_base_transform_reconfigure_sink (GST_BASE_TRANSFORM (self)); + } + + g_atomic_int_set (&self->rebuild_filters, FALSE); +} + +static gboolean +gst_va_deinterlace_set_info (GstVaBaseTransform * btrans, GstCaps * incaps, + GstVideoInfo * in_info, GstCaps * outcaps, GstVideoInfo * out_info) +{ + GstBaseTransform *trans = GST_BASE_TRANSFORM (btrans); + GstVaDeinterlace *self = GST_VA_DEINTERLACE (btrans); + + switch (GST_VIDEO_INFO_INTERLACE_MODE (in_info)) { + case GST_VIDEO_INTERLACE_MODE_PROGRESSIVE: + /* Nothing to do */ + gst_base_transform_set_passthrough (trans, TRUE); + return TRUE; + break; + case GST_VIDEO_INTERLACE_MODE_ALTERNATE: + case GST_VIDEO_INTERLACE_MODE_FIELDS: + GST_ERROR_OBJECT (self, "Unsupported interlace mode."); + return FALSE; + break; + default: + break; + } + + /* Calculate expected buffer duration. We might need to reference this value + * when buffer duration is unknown */ + if (GST_VIDEO_INFO_FPS_N (in_info) > 0 && GST_VIDEO_INFO_FPS_D (in_info) > 0) { + self->default_duration = + gst_util_uint64_scale_int (GST_SECOND, GST_VIDEO_INFO_FPS_D (in_info), + GST_VIDEO_INFO_FPS_N (in_info)); + } else { + /* Assume 25 fps. 
We need this for reporting latency at least */ + self->default_duration = gst_util_uint64_scale_int (GST_SECOND, 1, 25); + } + + if (gst_va_filter_set_video_info (btrans->filter, in_info, out_info)) { + g_atomic_int_set (&self->rebuild_filters, TRUE); + gst_base_transform_set_passthrough (trans, FALSE); + gst_va_deinterlace_rebuild_filters (self); + + return TRUE; + } + + return FALSE; +} + +static void +gst_va_deinterlace_before_transform (GstBaseTransform * trans, + GstBuffer * inbuf) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (trans); + GstClockTime ts, stream_time; + + ts = GST_BUFFER_TIMESTAMP (inbuf); + stream_time = + gst_segment_to_stream_time (&trans->segment, GST_FORMAT_TIME, ts); + + GST_TRACE_OBJECT (self, "sync to %" GST_TIME_FORMAT, GST_TIME_ARGS (ts)); + + if (GST_CLOCK_TIME_IS_VALID (stream_time)) + gst_object_sync_values (GST_OBJECT (self), stream_time); + + gst_va_deinterlace_rebuild_filters (self); +} + +static void +_set_field (GstVaDeinterlace * self, guint32 * surface_flags) +{ + GstBaseTransform *trans = GST_BASE_TRANSFORM (self); + + if (trans->segment.rate < 0) { + if ((self->curr_field == FIRST_FIELD + && (*surface_flags & VA_TOP_FIELD_FIRST)) + || (self->curr_field == SECOND_FIELD + && (*surface_flags & VA_BOTTOM_FIELD_FIRST))) { + *surface_flags |= VA_BOTTOM_FIELD; + } else { + *surface_flags |= VA_TOP_FIELD; + } + } else { + if ((self->curr_field == FIRST_FIELD + && (*surface_flags & VA_BOTTOM_FIELD_FIRST)) + || (self->curr_field == SECOND_FIELD + && (*surface_flags & VA_TOP_FIELD_FIRST))) { + *surface_flags |= VA_BOTTOM_FIELD; + } else { + *surface_flags |= VA_TOP_FIELD; + } + } +} + +static GstFlowReturn +gst_va_deinterlace_transform (GstBaseTransform * trans, GstBuffer * inbuf, + GstBuffer * outbuf) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (trans); + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (trans); + GstFlowReturn res = GST_FLOW_OK; + GstVaSample src, dst; + GstVideoInfo *info = &btrans->in_info; + 
VASurfaceID forward_references[8], backward_references[8]; + guint i, surface_flags; + + if (G_UNLIKELY (!btrans->negotiated)) + goto unknown_format; + + g_assert (self->curr_field == FIRST_FIELD + || self->curr_field == SECOND_FIELD); + + surface_flags = gst_va_buffer_get_surface_flags (inbuf, info); + if (surface_flags != VA_FRAME_PICTURE) + _set_field (self, &surface_flags); + + GST_TRACE_OBJECT (self, "Processing %d field (flags = %u): %" GST_PTR_FORMAT, + self->curr_field, surface_flags, inbuf); + + for (i = 0; i < self->hcurr; i++) { + forward_references[i] = + gst_va_buffer_get_surface (self->history[self->hcurr - i - 1]); + } + for (i = 0; i < self->num_backward_references; i++) { + backward_references[i] = + gst_va_buffer_get_surface (self->history[self->hcurr + i + 1]); + } + + /* *INDENT-OFF* */ + src = (GstVaSample) { + .buffer = inbuf, + .flags = surface_flags, + .forward_references = forward_references, + .num_forward_references = self->hcurr, + .backward_references = backward_references, + .num_backward_references = self->num_backward_references, + }; + dst = (GstVaSample) { + .buffer = outbuf, + }; + /* *INDENT-ON* */ + + if (!gst_va_filter_process (btrans->filter, &src, &dst)) { + gst_buffer_set_flags (outbuf, GST_BUFFER_FLAG_CORRUPTED); + } + + return res; + + /* ERRORS */ +unknown_format: + { + GST_ELEMENT_ERROR (self, CORE, NOT_IMPLEMENTED, (NULL), ("unknown format")); + return GST_FLOW_NOT_NEGOTIATED; + } +} + +static GstFlowReturn +gst_va_deinterlace_generate_output (GstBaseTransform * trans, + GstBuffer ** outbuf) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (trans); + GstFlowReturn ret; + GstBuffer *inbuf, *buf = NULL; + + if (gst_base_transform_is_passthrough (trans)) { + return GST_BASE_TRANSFORM_CLASS (parent_class)->generate_output (trans, + outbuf); + } + + *outbuf = NULL; + + if (self->curr_field == FINISHED) + return GST_FLOW_OK; + + inbuf = self->history[self->hcurr]; + if (!inbuf) + return GST_FLOW_OK; + + if 
(!self->history[self->hdepth - 1]) + return GST_FLOW_OK; + + ret = GST_BASE_TRANSFORM_CLASS (parent_class)->prepare_output_buffer (trans, + inbuf, &buf); + if (ret != GST_FLOW_OK || !buf) { + GST_WARNING_OBJECT (self, "Could not get buffer from pool: %s", + gst_flow_get_name (ret)); + return ret; + } + + ret = gst_va_deinterlace_transform (trans, inbuf, buf); + if (ret != GST_FLOW_OK) { + gst_buffer_unref (buf); + return ret; + } + + if (!GST_BUFFER_PTS_IS_VALID (inbuf)) { + GST_LOG_OBJECT (self, "Input buffer timestamp is unknown"); + } else { + GstClockTime duration; + + if (GST_BUFFER_DURATION_IS_VALID (inbuf)) + duration = GST_BUFFER_DURATION (inbuf) / 2; + else + duration = self->default_duration / 2; + + GST_BUFFER_DURATION (buf) = duration; + if (self->curr_field == SECOND_FIELD) + GST_BUFFER_PTS (buf) = GST_BUFFER_PTS (buf) + duration; + } + + *outbuf = buf; + + GST_TRACE_OBJECT (self, "Pushing %" GST_PTR_FORMAT, buf); + + if (self->curr_field == FIRST_FIELD) + self->curr_field = SECOND_FIELD; + else if (self->curr_field == SECOND_FIELD) + self->curr_field = FINISHED; + + return ret; +} + +static GstCaps * +gst_va_deinterlace_remove_interlace (GstCaps * caps) +{ + GstStructure *st; + gint i, n; + GstCaps *res; + GstCapsFeatures *f; + + res = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + st = gst_caps_get_structure (caps, i); + f = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && gst_caps_is_subset_structure_full (res, st, f)) + continue; + + st = gst_structure_copy (st); + gst_structure_remove_fields (st, "interlace-mode", "field-order", + "framerate", NULL); + + gst_caps_append_structure_full (res, st, gst_caps_features_copy (f)); + } + + return res; +} + +static GstCaps * +gst_va_deinterlace_transform_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * filter) +{ + GstVaDeinterlace *self = 
GST_VA_DEINTERLACE (trans); + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (trans); + GstCaps *ret, *filter_caps; + + GST_DEBUG_OBJECT (self, + "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, + (direction == GST_PAD_SINK) ? "sink" : "src"); + + filter_caps = gst_va_base_transform_get_filter_caps (btrans); + if (filter_caps && !gst_caps_can_intersect (caps, filter_caps)) { + ret = gst_caps_ref (caps); + goto bail; + } + + ret = gst_va_deinterlace_remove_interlace (caps); + +bail: + if (filter) { + GstCaps *intersection; + + intersection = + gst_caps_intersect_full (filter, ret, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (ret); + ret = intersection; + } + + GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, ret); + + return ret; +} + +static GstCaps * +gst_va_deinterlace_fixate_caps (GstBaseTransform * trans, + GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (trans); + GstCapsFeatures *out_f; + GstStructure *in_s, *out_s; + gint fps_n, fps_d; + const gchar *in_interlace_mode, *out_interlace_mode; + + GST_DEBUG_OBJECT (self, + "trying to fixate othercaps %" GST_PTR_FORMAT " based on caps %" + GST_PTR_FORMAT, othercaps, caps); + + othercaps = gst_caps_truncate (othercaps); + othercaps = gst_caps_make_writable (othercaps); + + if (direction == GST_PAD_SRC) { + othercaps = gst_caps_fixate (othercaps); + goto bail; + } + + in_s = gst_caps_get_structure (caps, 0); + in_interlace_mode = gst_structure_get_string (in_s, "interlace-mode"); + + out_s = gst_caps_get_structure (othercaps, 0); + + if (g_strcmp0 ("progressive", in_interlace_mode) == 0) { + /* Just forward interlace-mode=progressive and framerate + * By this way, basetransform will enable passthrough for non-interlaced + * stream */ + const GValue *framerate = gst_structure_get_value (in_s, "framerate"); + gst_structure_set_value (out_s, "framerate", framerate); + gst_structure_set (out_s, "interlace-mode", 
G_TYPE_STRING, "progressive", + NULL); + + goto bail; + } + + out_f = gst_caps_get_features (othercaps, 0); + out_interlace_mode = gst_structure_get_string (out_s, "interlace-mode"); + + if ((!out_interlace_mode + || (g_strcmp0 ("progressive", out_interlace_mode) == 0)) + && (gst_caps_features_contains (out_f, GST_CAPS_FEATURE_MEMORY_VA) + || gst_caps_features_contains (out_f, GST_CAPS_FEATURE_MEMORY_DMABUF) + || gst_caps_features_contains (out_f, + GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY))) { + gst_structure_set (out_s, "interlace-mode", G_TYPE_STRING, "progressive", + NULL); + + if (gst_structure_get_fraction (in_s, "framerate", &fps_n, &fps_d)) { + fps_n *= 2; + gst_structure_set (out_s, "framerate", GST_TYPE_FRACTION, fps_n, fps_d, + NULL); + } + } else { + /* if caps features aren't supported, just forward interlace-mode + * and framerate */ + const GValue *framerate = gst_structure_get_value (in_s, "framerate"); + gst_structure_set_value (out_s, "framerate", framerate); + gst_structure_set (out_s, "interlace-mode", G_TYPE_STRING, + in_interlace_mode, NULL); + } + +bail: + GST_DEBUG_OBJECT (self, "fixated othercaps to %" GST_PTR_FORMAT, othercaps); + + return othercaps; +} + +static gboolean +gst_va_deinterlace_query (GstBaseTransform * trans, GstPadDirection direction, + GstQuery * query) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (trans); + + if (direction == GST_PAD_SRC && GST_QUERY_TYPE (query) == GST_QUERY_LATENCY) { + GstPad *peer; + GstClockTime latency, min, max; + gboolean res = FALSE; + gboolean live; + + if (gst_base_transform_is_passthrough (trans)) + return FALSE; + + peer = gst_pad_get_peer (GST_BASE_TRANSFORM_SINK_PAD (trans)); + if (!peer) + return FALSE; + + res = gst_pad_query (peer, query); + gst_object_unref (peer); + if (!res) + return FALSE; + + gst_query_parse_latency (query, &live, &min, &max); + + GST_DEBUG_OBJECT (self, "Peer latency: min %" GST_TIME_FORMAT " max %" + GST_TIME_FORMAT, GST_TIME_ARGS (min), GST_TIME_ARGS (max)); + + 
/* add our own latency: number of fields + history depth */ + latency = (2 + self->hdepth) * self->default_duration; + + GST_DEBUG_OBJECT (self, "Our latency: min %" GST_TIME_FORMAT ", max %" + GST_TIME_FORMAT, GST_TIME_ARGS (latency), GST_TIME_ARGS (latency)); + + min += latency; + if (max != GST_CLOCK_TIME_NONE) + max += latency; + + GST_DEBUG_OBJECT (self, "Calculated total latency : min %" GST_TIME_FORMAT + " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min), GST_TIME_ARGS (max)); + + gst_query_set_latency (query, live, min, max); + + return TRUE; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->query (trans, direction, + query); +} + +static void +gst_va_deinterlace_class_init (gpointer g_class, gpointer class_data) +{ + GstCaps *doc_caps, *sink_caps = NULL, *src_caps = NULL; + GstPadTemplate *sink_pad_templ, *src_pad_templ; + GObjectClass *object_class = G_OBJECT_CLASS (g_class); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); + GstVaBaseTransformClass *btrans_class = GST_VA_BASE_TRANSFORM_CLASS (g_class); + GstVaDisplay *display; + GstVaFilter *filter; + struct CData *cdata = class_data; + gchar *long_name; + + parent_class = g_type_class_peek_parent (g_class); + + btrans_class->render_device_path = g_strdup (cdata->render_device_path); + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API Deinterlacer in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API Deinterlacer"); + } + + gst_element_class_set_metadata (element_class, long_name, + "Filter/Effect/Video/Deinterlace", + "VA-API based deinterlacer", "Víctor Jáquez <vjaquez@igalia.com>"); + + display = gst_va_display_drm_new_from_path (btrans_class->render_device_path); + filter = gst_va_filter_new (display); + + if (gst_va_filter_open (filter)) { + src_caps = gst_va_filter_get_caps (filter); + /* adds any to enable passthrough */ + { + GstCaps *any_caps = gst_caps_new_empty_simple 
("video/x-raw"); + gst_caps_set_features_simple (any_caps, gst_caps_features_new_any ()); + src_caps = gst_caps_merge (src_caps, any_caps); + } + } else { + src_caps = gst_caps_from_string (caps_str); + } + + sink_caps = gst_va_deinterlace_remove_interlace (src_caps); + + doc_caps = gst_caps_from_string (caps_str); + + sink_pad_templ = gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + sink_caps); + gst_element_class_add_pad_template (element_class, sink_pad_templ); + gst_pad_template_set_documentation_caps (sink_pad_templ, + gst_caps_ref (doc_caps)); + + src_pad_templ = gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + src_caps); + gst_element_class_add_pad_template (element_class, src_pad_templ); + gst_pad_template_set_documentation_caps (src_pad_templ, + gst_caps_ref (doc_caps)); + gst_caps_unref (doc_caps); + + gst_caps_unref (src_caps); + gst_caps_unref (sink_caps); + + object_class->dispose = gst_va_deinterlace_dispose; + object_class->set_property = gst_va_deinterlace_set_property; + object_class->get_property = gst_va_deinterlace_get_property; + + trans_class->transform_caps = + GST_DEBUG_FUNCPTR (gst_va_deinterlace_transform_caps); + trans_class->fixate_caps = GST_DEBUG_FUNCPTR (gst_va_deinterlace_fixate_caps); + trans_class->before_transform = + GST_DEBUG_FUNCPTR (gst_va_deinterlace_before_transform); + trans_class->transform = GST_DEBUG_FUNCPTR (gst_va_deinterlace_transform); + trans_class->submit_input_buffer = + GST_DEBUG_FUNCPTR (gst_va_deinterlace_submit_input_buffer); + trans_class->generate_output = + GST_DEBUG_FUNCPTR (gst_va_deinterlace_generate_output); + trans_class->query = GST_DEBUG_FUNCPTR (gst_va_deinterlace_query); + + trans_class->transform_ip_on_passthrough = FALSE; + + btrans_class->set_info = GST_DEBUG_FUNCPTR (gst_va_deinterlace_set_info); + + gst_va_filter_install_deinterlace_properties (filter, object_class); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + g_free 
(cdata); + gst_object_unref (filter); + gst_object_unref (display); +} + +static void +gst_va_deinterlace_init (GTypeInstance * instance, gpointer g_class) +{ + GstVaDeinterlace *self = GST_VA_DEINTERLACE (instance); + GParamSpec *pspec; + + pspec = g_object_class_find_property (g_class, "method"); + g_assert (pspec); + self->method = g_value_get_enum (g_param_spec_get_default_value (pspec)); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_deinterlace_debug, "vadeinterlace", 0, + "VA Video Deinterlace"); + + return NULL; +} + +gboolean +gst_va_deinterlace_register (GstPlugin * plugin, GstVaDevice * device, + guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaDeinterlaceClass), + .class_init = gst_va_deinterlace_class_init, + .instance_size = sizeof (GstVaDeinterlace), + .instance_init = gst_va_deinterlace_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + + type_info.class_data = cdata; + + type_name = g_strdup ("GstVaDeinterlace"); + feature_name = g_strdup ("vadeinterlace"); + + /* The first postprocessor to be registered should use a constant + * name, like vadeinterlace; for any additional postprocessors, we + * create unique names by inserting the render device name.
*/ + if (g_type_from_name (type_name)) { + gchar *basename = g_path_get_basename (device->render_device_path); + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstVa%sDeinterlace", basename); + feature_name = g_strdup_printf ("va%sdeinterlace", basename); + cdata->description = basename; + + /* lower rank for non-first device */ + if (rank > 0) + rank--; + } + + g_once (&debug_once, _register_debug_category, NULL); + + type = g_type_register_static (GST_TYPE_VA_BASE_TRANSFORM, type_name, + &type_info, 0); + + ret = gst_element_register (plugin, feature_name, rank, type); + + g_free (type_name); + g_free (feature_name); + + return ret; +}
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadeinterlace.h
Added
@@ -0,0 +1,31 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadevice.h" + +G_BEGIN_DECLS + +gboolean gst_va_deinterlace_register (GstPlugin * plugin, + GstVaDevice * device, + guint rank); + +G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadevice.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadevice.c
Changed
@@ -24,8 +24,9 @@ #include "gstvadevice.h" +#if HAVE_GUDEV #include <gudev/gudev.h> -#include "gstvadisplay_drm.h" +#endif #define GST_CAT_DEFAULT gstva_debug GST_DEBUG_CATEGORY_EXTERN (gstva_debug); @@ -35,8 +36,7 @@ static void gst_va_device_free (GstVaDevice * device) { - if (device->display) - gst_object_unref (device->display); + gst_clear_object (&device->display); g_free (device->render_device_path); g_free (device); } @@ -56,6 +56,15 @@ return device; } +static gint +compare_device_path (gconstpointer a, gconstpointer b, gpointer user_data) +{ + const GstVaDevice *pa = a, *pb = b; + + return strcmp (pa->render_device_path, pb->render_device_path); +} + +#if HAVE_GUDEV GList * gst_va_device_find_devices (void) { @@ -81,14 +90,40 @@ continue; GST_INFO ("Found VA-API device: %s", path); - g_queue_push_tail (&devices, gst_va_device_new (dpy, path)); + g_queue_push_head (&devices, gst_va_device_new (dpy, path)); } + g_queue_sort (&devices, compare_device_path, NULL); g_list_free_full (udev_devices, g_object_unref); g_object_unref (client); return devices.head; } +#else +GList * +gst_va_device_find_devices (void) +{ + GstVaDisplay *dpy; + GQueue devices = G_QUEUE_INIT; + gchar path[64]; + guint i; + + for (i = 0; i < 8; i++) { + g_snprintf (path, sizeof (path), "/dev/dri/renderD%d", 128 + i); + if (!g_file_test (path, G_FILE_TEST_EXISTS)) + continue; + + if (!(dpy = gst_va_display_drm_new_from_path (path))) + continue; + + GST_INFO ("Found VA-API device: %s", path); + g_queue_push_head (&devices, gst_va_device_new (dpy, path)); + } + + g_queue_sort (&devices, compare_device_path, NULL); + return devices.head; +} +#endif void gst_va_device_list_free (GList * devices)
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvadevice.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadevice.h
Changed
@@ -24,7 +24,7 @@ G_BEGIN_DECLS -#include "gstvadisplay.h" +#include <gst/va/gstvadisplay_drm.h> #define GST_TYPE_VA_DEVICE (gst_va_device_get_type()) #define GST_IS_VA_DEVICE(obj) (GST_IS_MINI_OBJECT_TYPE((obj), GST_TYPE_VA_DEVICE))
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadisplay_priv.c
Added
@@ -0,0 +1,175 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvadisplay_priv.h" +#include "gstvaprofile.h" +#include "gstvavideoformat.h" + +GArray * +gst_va_display_get_profiles (GstVaDisplay * self, guint32 codec, + VAEntrypoint entrypoint) +{ + GArray *ret = NULL; + VADisplay dpy; + VAEntrypoint *entrypoints; + VAProfile *profiles; + VAStatus status; + gint i, j, num_entrypoints = 0, num_profiles = 0; + + g_return_val_if_fail (GST_IS_VA_DISPLAY (self), NULL); + + dpy = gst_va_display_get_va_dpy (self); + + gst_va_display_lock (self); + num_profiles = vaMaxNumProfiles (dpy); + num_entrypoints = vaMaxNumEntrypoints (dpy); + gst_va_display_unlock (self); + + profiles = g_new (VAProfile, num_profiles); + entrypoints = g_new (VAEntrypoint, num_entrypoints); + + gst_va_display_lock (self); + status = vaQueryConfigProfiles (dpy, profiles, &num_profiles); + gst_va_display_unlock (self); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaQueryConfigProfile: %s", vaErrorStr (status)); + goto bail; + } + + for (i = 0; i < num_profiles; i++) { + if (codec != gst_va_profile_codec (profiles[i])) + 
continue; + + gst_va_display_lock (self); + status = vaQueryConfigEntrypoints (dpy, profiles[i], entrypoints, + &num_entrypoints); + gst_va_display_unlock (self); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaQueryConfigEntrypoints: %s", vaErrorStr (status)); + goto bail; + } + + for (j = 0; j < num_entrypoints; j++) { + if (entrypoints[j] == entrypoint) { + if (!ret) + ret = g_array_new (FALSE, FALSE, sizeof (VAProfile)); + g_array_append_val (ret, profiles[i]); + break; + } + } + } + +bail: + g_free (entrypoints); + g_free (profiles); + return ret; +} + +GArray * +gst_va_display_get_image_formats (GstVaDisplay * self) +{ + GArray *ret = NULL; + GstVideoFormat format; + VADisplay dpy; + VAImageFormat *va_formats; + VAStatus status; + int i, max, num = 0; + + g_return_val_if_fail (GST_IS_VA_DISPLAY (self), NULL); + + dpy = gst_va_display_get_va_dpy (self); + + gst_va_display_lock (self); + max = vaMaxNumImageFormats (dpy); + gst_va_display_unlock (self); + if (max == 0) + return NULL; + + va_formats = g_new (VAImageFormat, max); + + gst_va_display_lock (self); + status = vaQueryImageFormats (dpy, va_formats, &num); + gst_va_display_unlock (self); + + gst_va_video_format_fix_map (va_formats, num); + + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaQueryImageFormats: %s", vaErrorStr (status)); + goto bail; + } + + ret = g_array_sized_new (FALSE, FALSE, sizeof (GstVideoFormat), num); + for (i = 0; i < num; i++) { + format = gst_va_video_format_from_va_image_format (&va_formats[i]); + if (format != GST_VIDEO_FORMAT_UNKNOWN) + g_array_append_val (ret, format); + } + + if (ret->len == 0) { + g_array_unref (ret); + ret = NULL; + } + +bail: + g_free (va_formats); + return ret; +} + +gboolean +gst_va_display_has_vpp (GstVaDisplay * self) +{ + VADisplay dpy; + VAEntrypoint *entrypoints; + VAStatus status; + int i, max, num; + gboolean found = FALSE; + g_return_val_if_fail (GST_IS_VA_DISPLAY (self), FALSE); + + dpy = gst_va_display_get_va_dpy (self); + + 
gst_va_display_lock (self); + max = vaMaxNumEntrypoints (dpy); + gst_va_display_unlock (self); + + entrypoints = g_new (VAEntrypoint, max); + + gst_va_display_lock (self); + status = vaQueryConfigEntrypoints (dpy, VAProfileNone, entrypoints, &num); + gst_va_display_unlock (self); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaQueryConfigEntrypoints: %s", vaErrorStr (status)); + goto bail; + } + + for (i = 0; i < num; i++) { + if (entrypoints[i] == VAEntrypointVideoProc) { + found = TRUE; + break; + } + } + +bail: + g_free (entrypoints); + return found; +}
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvadisplay_priv.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/va/gstvadisplay.h> +#include <va/va.h> + +G_BEGIN_DECLS +GArray * gst_va_display_get_profiles (GstVaDisplay * self, + guint32 codec, + VAEntrypoint entrypoint); +GArray * gst_va_display_get_image_formats (GstVaDisplay * self); +gboolean gst_va_display_has_vpp (GstVaDisplay * self); + +G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvafilter.c
Added
@@ -0,0 +1,1795 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvafilter.h" + +#include <gst/video/video.h> + +#include <va/va_drmcommon.h> + +#include "gstvaallocator.h" +#include "gstvacaps.h" +#include "gstvadisplay_priv.h" +#include "gstvavideoformat.h" +#include "vasurfaceimage.h" + +struct _GstVaFilter +{ + GstObject parent; + + GstVaDisplay *display; + VAConfigID config; + VAContextID context; + + /* hardware constraints */ + VAProcPipelineCaps pipeline_caps; + + guint32 mem_types; + gint min_width; + gint max_width; + gint min_height; + gint max_height; + + GArray *surface_formats; + GArray *image_formats; + + GArray *available_filters; + + /* stream information */ + guint mirror; + guint rotation; + GstVideoOrientationMethod orientation; + + gboolean crop_enabled; + + VARectangle input_region; + VARectangle output_region; + + VAProcColorStandardType input_color_standard; + VAProcColorProperties input_color_properties; + VAProcColorStandardType output_color_standard; + VAProcColorProperties output_color_properties; + + GArray *filters; +}; + +GST_DEBUG_CATEGORY_STATIC (gst_va_filter_debug); 
+#define GST_CAT_DEFAULT gst_va_filter_debug + +#define gst_va_filter_parent_class parent_class +G_DEFINE_TYPE_WITH_CODE (GstVaFilter, gst_va_filter, GST_TYPE_OBJECT, + GST_DEBUG_CATEGORY_INIT (gst_va_filter_debug, "vafilter", 0, "VA Filter")); + +enum +{ + PROP_DISPLAY = 1, + N_PROPERTIES +}; + +static GParamSpec *g_properties[N_PROPERTIES]; + +static void +gst_va_filter_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstVaFilter *self = GST_VA_FILTER (object); + + switch (prop_id) { + case PROP_DISPLAY:{ + g_assert (!self->display); + self->display = g_value_dup_object (value); + break; + } + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_va_filter_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstVaFilter *self = GST_VA_FILTER (object); + + switch (prop_id) { + case PROP_DISPLAY: + g_value_set_object (value, self->display); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } +} + +static void +gst_va_filter_dispose (GObject * object) +{ + GstVaFilter *self = GST_VA_FILTER (object); + + gst_va_filter_close (self); + + g_clear_pointer (&self->available_filters, g_array_unref); + g_clear_pointer (&self->image_formats, g_array_unref); + g_clear_pointer (&self->surface_formats, g_array_unref); + gst_clear_object (&self->display); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_filter_class_init (GstVaFilterClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->set_property = gst_va_filter_set_property; + gobject_class->get_property = gst_va_filter_get_property; + gobject_class->dispose = gst_va_filter_dispose; + + g_properties[PROP_DISPLAY] = + g_param_spec_object ("display", "GstVaDisplay", "GstVADisplay object", + GST_TYPE_VA_DISPLAY, + G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS); + 
+ g_object_class_install_properties (gobject_class, N_PROPERTIES, g_properties); +} + +static void +gst_va_filter_init (GstVaFilter * self) +{ + self->config = VA_INVALID_ID; + self->context = VA_INVALID_ID; + + self->min_height = 1; + self->max_height = G_MAXINT; + self->min_width = 1; + self->max_width = G_MAXINT; +} + +GstVaFilter * +gst_va_filter_new (GstVaDisplay * display) +{ + g_return_val_if_fail (GST_IS_VA_DISPLAY (display), NULL); + + return g_object_new (GST_TYPE_VA_FILTER, "display", display, NULL); +} + +gboolean +gst_va_filter_is_open (GstVaFilter * self) +{ + gboolean ret; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + GST_OBJECT_LOCK (self); + ret = (self->config != VA_INVALID_ID && self->context != VA_INVALID_ID); + GST_OBJECT_UNLOCK (self); + return ret; +} + +static gboolean +gst_va_filter_ensure_config_attributes (GstVaFilter * self, + guint32 * rt_formats_ptr) +{ + VAConfigAttrib attribs[] = { + {.type = VAConfigAttribMaxPictureWidth,}, + {.type = VAConfigAttribMaxPictureHeight,}, + {.type = VAConfigAttribRTFormat,}, + }; + VADisplay dpy; + VAStatus status; + guint i, value, rt_formats = 0, max_width = 0, max_height = 0; + + dpy = gst_va_display_get_va_dpy (self->display); + + gst_va_display_lock (self->display); + status = vaGetConfigAttributes (dpy, VAProfileNone, VAEntrypointVideoProc, + attribs, G_N_ELEMENTS (attribs)); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaGetConfigAttributes: %s", vaErrorStr (status)); + return FALSE; + } + + for (i = 0; i < G_N_ELEMENTS (attribs); i++) { + value = attribs[i].value; + if (value == VA_ATTRIB_NOT_SUPPORTED) + continue; + switch (attribs[i].type) { + case VAConfigAttribMaxPictureHeight: + max_height = value; + break; + case VAConfigAttribMaxPictureWidth: + max_width = value; + break; + case VAConfigAttribRTFormat: + rt_formats = value; + break; + default: + break; + } + } + + if (rt_formats_ptr && rt_formats != 0) + 
*rt_formats_ptr = rt_formats; + if (max_width > 0 && max_width < G_MAXINT) + self->max_width = max_width; + if (max_height > 0 && max_height < G_MAXINT) + self->max_height = max_height; + + return TRUE; +} + +/* There are formats that are not handled correctly by driver */ +static gboolean +format_is_accepted (GstVaFilter * self, GstVideoFormat format) +{ + /* https://github.com/intel/media-driver/issues/690 + * https://github.com/intel/media-driver/issues/644 */ + if (!gst_va_display_is_implementation (self->display, + GST_VA_IMPLEMENTATION_INTEL_IHD)) + return TRUE; + + switch (format) { + case GST_VIDEO_FORMAT_ARGB: + case GST_VIDEO_FORMAT_xRGB: + case GST_VIDEO_FORMAT_ABGR: + case GST_VIDEO_FORMAT_xBGR: + return FALSE; + default: + break; + } + + return TRUE; +} + +static gboolean +gst_va_filter_ensure_surface_attributes (GstVaFilter * self) +{ + GArray *surface_formats; + GstVideoFormat format; + VASurfaceAttrib *attribs; + guint i, attrib_count; + + attribs = + gst_va_get_surface_attribs (self->display, self->config, &attrib_count); + if (!attribs) + return FALSE; + surface_formats = g_array_new (FALSE, FALSE, sizeof (GstVideoFormat)); + + for (i = 0; i < attrib_count; i++) { + if (attribs[i].value.type != VAGenericValueTypeInteger) + continue; + switch (attribs[i].type) { + case VASurfaceAttribPixelFormat: + format = gst_va_video_format_from_va_fourcc (attribs[i].value.value.i); + if (format != GST_VIDEO_FORMAT_UNKNOWN + && format_is_accepted (self, format)) + g_array_append_val (surface_formats, format); + break; + case VASurfaceAttribMinWidth: + self->min_width = MAX (self->min_width, attribs[i].value.value.i); + break; + case VASurfaceAttribMaxWidth: + if (self->max_width > 0) + self->max_width = MIN (self->max_width, attribs[i].value.value.i); + else + self->max_width = attribs[i].value.value.i; + break; + case VASurfaceAttribMinHeight: + self->min_height = MAX (self->min_height, attribs[i].value.value.i); + break; + case VASurfaceAttribMaxHeight: + if 
(self->max_height > 0) + self->max_height = MIN (self->max_height, attribs[i].value.value.i); + else + self->max_height = attribs[i].value.value.i; + break; + case VASurfaceAttribMemoryType: + self->mem_types = attribs[i].value.value.i; + break; + default: + break; + } + } + + if (surface_formats->len == 0) + g_clear_pointer (&surface_formats, g_array_unref); + + self->surface_formats = surface_formats; + + g_free (attribs); + + return TRUE; +} + +static gboolean +gst_va_filter_ensure_pipeline_caps (GstVaFilter * self) +{ + VADisplay dpy; + VAStatus status; + + dpy = gst_va_display_get_va_dpy (self->display); + + gst_va_display_lock (self->display); + status = vaQueryVideoProcPipelineCaps (dpy, self->context, NULL, 0, + &self->pipeline_caps); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaQueryVideoProcPipelineCaps: %s", + vaErrorStr (status)); + return FALSE; + } + + return TRUE; +} + +/* Not thread-safe API */ +gboolean +gst_va_filter_open (GstVaFilter * self) +{ + VAConfigAttrib attrib = { + .type = VAConfigAttribRTFormat, + }; + VADisplay dpy; + VAStatus status; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + if (gst_va_filter_is_open (self)) + return TRUE; + + if (!gst_va_filter_ensure_config_attributes (self, &attrib.value)) + return FALSE; + + if (!gst_va_filter_ensure_pipeline_caps (self)) + return FALSE; + + self->image_formats = gst_va_display_get_image_formats (self->display); + if (!self->image_formats) + return FALSE; + + dpy = gst_va_display_get_va_dpy (self->display); + + gst_va_display_lock (self->display); + status = vaCreateConfig (dpy, VAProfileNone, VAEntrypointVideoProc, &attrib, + 1, &self->config); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaCreateConfig: %s", vaErrorStr (status)); + return FALSE; + } + + if (!gst_va_filter_ensure_surface_attributes (self)) + goto bail; + + gst_va_display_lock 
(self->display); + status = vaCreateContext (dpy, self->config, 0, 0, 0, NULL, 0, + &self->context); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaCreateContext: %s", vaErrorStr (status)); + goto bail; + } + + return TRUE; + +bail: + { + gst_va_display_lock (self->display); + status = vaDestroyConfig (dpy, self->config); + gst_va_display_unlock (self->display); + + return FALSE; + } +} + +/* Not thread-safe API */ +gboolean +gst_va_filter_close (GstVaFilter * self) +{ + VADisplay dpy; + VAStatus status; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + if (!gst_va_filter_is_open (self)) + return TRUE; + + dpy = gst_va_display_get_va_dpy (self->display); + + if (self->context != VA_INVALID_ID) { + gst_va_display_lock (self->display); + status = vaDestroyContext (dpy, self->context); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) + GST_ERROR_OBJECT (self, "vaDestroyContext: %s", vaErrorStr (status)); + } + + gst_va_display_lock (self->display); + status = vaDestroyConfig (dpy, self->config); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaDestroyConfig: %s", vaErrorStr (status)); + return FALSE; + } + + g_clear_pointer (&self->available_filters, g_array_unref); + g_clear_pointer (&self->filters, g_array_unref); + + gst_va_filter_init (self); + + return TRUE; +} + +/* *INDENT-OFF* */ +static const struct VaFilterCapMap { + VAProcFilterType type; + guint count; +} filter_cap_map[] = { + { VAProcFilterNoiseReduction, 1 }, + { VAProcFilterDeinterlacing, VAProcDeinterlacingCount }, + { VAProcFilterSharpening, 1 }, + { VAProcFilterColorBalance, VAProcColorBalanceCount }, + { VAProcFilterSkinToneEnhancement, 1 }, + { VAProcFilterTotalColorCorrection, VAProcTotalColorCorrectionCount }, + { VAProcFilterHVSNoiseReduction, 0 }, + { VAProcFilterHighDynamicRangeToneMapping, 1 }, +}; +/* *INDENT-ON* */ + +static const 
struct VaFilterCapMap * +gst_va_filter_get_filter_cap (VAProcFilterType type) +{ + guint i; + + for (i = 0; i < G_N_ELEMENTS (filter_cap_map); i++) { + if (filter_cap_map[i].type == type) + return &filter_cap_map[i]; + } + + return NULL; +} + +static guint +gst_va_filter_get_filter_cap_count (VAProcFilterType type) +{ + const struct VaFilterCapMap *map = gst_va_filter_get_filter_cap (type); + return map ? map->count : 0; +} + +struct VaFilter +{ + VAProcFilterType type; + guint num_caps; + union + { + VAProcFilterCap simple; + VAProcFilterCapDeinterlacing deint[VAProcDeinterlacingCount]; + VAProcFilterCapColorBalance cb[VAProcColorBalanceCount]; + VAProcFilterCapTotalColorCorrection cc[VAProcTotalColorCorrectionCount]; + VAProcFilterCapHighDynamicRange hdr; + } caps; +}; + +static gboolean +gst_va_filter_ensure_filters (GstVaFilter * self) +{ + GArray *filters; + VADisplay dpy; + VAProcFilterType *filter_types; + VAStatus status; + guint i, num = VAProcFilterCount; + gboolean ret = FALSE; + + GST_OBJECT_LOCK (self); + if (self->available_filters) { + GST_OBJECT_UNLOCK (self); + return TRUE; + } + GST_OBJECT_UNLOCK (self); + + filter_types = g_malloc_n (num, sizeof (*filter_types)); + + dpy = gst_va_display_get_va_dpy (self->display); + + gst_va_display_lock (self->display); + status = vaQueryVideoProcFilters (dpy, self->context, filter_types, &num); + gst_va_display_unlock (self->display); + if (status == VA_STATUS_ERROR_MAX_NUM_EXCEEDED) { + filter_types = g_try_realloc_n (filter_types, num, sizeof (*filter_types)); + gst_va_display_lock (self->display); + status = vaQueryVideoProcFilters (dpy, self->context, filter_types, &num); + gst_va_display_unlock (self->display); + } + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaQueryVideoProcFilters: %s", vaErrorStr (status)); + goto bail; + } + + if (num == 0) + goto bail; + + filters = g_array_sized_new (FALSE, FALSE, sizeof (struct VaFilter), num); + + for (i = 0; i < num; i++) { + guint num_caps = 
gst_va_filter_get_filter_cap_count (filter_types[i]); + struct VaFilter filter = { filter_types[i], num_caps, {{{0,}}} }; + + if (num_caps > 0) { + gst_va_display_lock (self->display); + status = vaQueryVideoProcFilterCaps (dpy, self->context, filter.type, + &filter.caps, &filter.num_caps); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING_OBJECT (self, "vaQueryVideoProcFiltersCaps: %s", + vaErrorStr (status)); + continue; + } + } + + g_array_append_val (filters, filter); + } + + GST_OBJECT_LOCK (self); + g_clear_pointer (&self->available_filters, g_array_unref); + self->available_filters = filters; + GST_OBJECT_UNLOCK (self); + + ret = TRUE; + +bail: + g_free (filter_types); + + return ret; +} + +/* *INDENT-OFF* */ +static const struct _CBDesc { + const char *name; + const char *nick; + const char *blurb; + guint prop_id; +} cb_desc[VAProcColorBalanceCount] = { + [VAProcColorBalanceHue] = + { "hue", "Hue", "Color hue value", GST_VA_FILTER_PROP_HUE }, + [VAProcColorBalanceSaturation] = + { "saturation", "Saturation", "Color saturation value", + GST_VA_FILTER_PROP_SATURATION }, + [VAProcColorBalanceBrightness] = + { "brightness", "Brightness", "Color brightness value", + GST_VA_FILTER_PROP_BRIGHTNESS }, + [VAProcColorBalanceContrast] = + { "contrast", "Contrast", "Color contrast value", + GST_VA_FILTER_PROP_CONTRAST }, + [VAProcColorBalanceAutoSaturation] = + { "auto-saturation", "Auto-Saturation", "Enable auto saturation", + GST_VA_FILTER_PROP_AUTO_SATURATION }, + [VAProcColorBalanceAutoBrightness] = + { "auto-brightness", "Auto-Brightness", "Enable auto brightness", + GST_VA_FILTER_PROP_AUTO_BRIGHTNESS }, + [VAProcColorBalanceAutoContrast] = + { "auto-contrast", "Auto-Contrast", "Enable auto contrast", + GST_VA_FILTER_PROP_AUTO_CONTRAST }, +}; +/* *INDENT-ON* */ + +gboolean +gst_va_filter_install_properties (GstVaFilter * self, GObjectClass * klass) +{ + guint i; + const GParamFlags common_flags = G_PARAM_READWRITE + | 
GST_PARAM_CONDITIONALLY_AVAILABLE | G_PARAM_STATIC_STRINGS + | GST_PARAM_MUTABLE_PLAYING | G_PARAM_CONTROLLABLE; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!gst_va_filter_ensure_filters (self)) + return FALSE; + + for (i = 0; i < self->available_filters->len; i++) { + const struct VaFilter *filter = + &g_array_index (self->available_filters, struct VaFilter, i); + + switch (filter->type) { + case VAProcFilterNoiseReduction:{ + const VAProcFilterCap *caps = &filter->caps.simple; + + g_object_class_install_property (klass, GST_VA_FILTER_PROP_DENOISE, + g_param_spec_float ("denoise", "Noise reduction", + "Noise reduction factor", caps->range.min_value, + caps->range.max_value, caps->range.default_value, + common_flags)); + break; + } + case VAProcFilterSharpening:{ + const VAProcFilterCap *caps = &filter->caps.simple; + + g_object_class_install_property (klass, GST_VA_FILTER_PROP_SHARPEN, + g_param_spec_float ("sharpen", "Sharpening Level", + "Sharpening/blurring filter", caps->range.min_value, + caps->range.max_value, caps->range.default_value, + common_flags)); + break; + } + case VAProcFilterSkinToneEnhancement:{ + const VAProcFilterCap *caps = &filter->caps.simple; + GParamSpec *pspec; + + /* i965 filter */ + if (filter->num_caps == 0) { + pspec = g_param_spec_boolean ("skin-tone", "Skin Tone Enhancement", + "Skin Tone Enhancement filter", FALSE, common_flags); + } else { + pspec = g_param_spec_float ("skin-tone", "Skin Tone Enhancement", + "Skin Tone Enhancement filter", caps->range.min_value, + caps->range.max_value, caps->range.default_value, common_flags); + } + + g_object_class_install_property (klass, GST_VA_FILTER_PROP_SKINTONE, + pspec); + break; + } + case VAProcFilterColorBalance:{ + const VAProcFilterCapColorBalance *caps = filter->caps.cb; + GParamSpec *pspec; + guint j, k; + + for (j = 0; j < filter->num_caps; j++) { + k = caps[j].type; + if (caps[j].range.min_value < 
caps[j].range.max_value) { + pspec = g_param_spec_float (cb_desc[k].name, cb_desc[k].nick, + cb_desc[k].blurb, caps[j].range.min_value, + caps[j].range.max_value, caps[j].range.default_value, + common_flags); + } else { + pspec = g_param_spec_boolean (cb_desc[k].name, cb_desc[k].nick, + cb_desc[k].blurb, FALSE, common_flags); + } + + g_object_class_install_property (klass, cb_desc[k].prop_id, pspec); + } + + break; + } + default: + break; + } + } + + if (self->pipeline_caps.mirror_flags != VA_MIRROR_NONE + || self->pipeline_caps.rotation_flags != VA_ROTATION_NONE) { + g_object_class_install_property (klass, GST_VA_FILTER_PROP_VIDEO_DIR, + g_param_spec_enum ("video-direction", "Video Direction", + "Video direction: rotation and flipping", + GST_TYPE_VIDEO_ORIENTATION_METHOD, GST_VIDEO_ORIENTATION_IDENTITY, + common_flags)); + } + + /** + * GstVaPostProc:disable-passthrough: + * + * If set to %TRUE the filter will not enable passthrough mode, thus + * each frame will be processed. It's useful for cropping, for + * example. + * + * Since: 1.20 + */ + g_object_class_install_property (klass, + GST_VA_FILTER_PROP_DISABLE_PASSTHROUGH, + g_param_spec_boolean ("disable-passthrough", "Disable Passthrough", + "Forces passing buffers through the postprocessor", FALSE, + G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS + | GST_PARAM_MUTABLE_READY)); + + /** + * GstVaPostProc:add-borders: + * + * If set to %TRUE the filter will add black borders if necessary to + * keep the display aspect ratio. + * + * Since: 1.20 + */ + g_object_class_install_property (klass, GST_VA_FILTER_PROP_ADD_BORDERS, + g_param_spec_boolean ("add-borders", "Add Borders", + "Add black borders if necessary to keep the display aspect ratio", + FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS + | GST_PARAM_MUTABLE_PLAYING)); + + + return TRUE; +} + +/** + * GstVaDeinterlaceMethods: + * @GST_VA_DEINTERLACE_BOB: Interpolating missing lines by using the + * adjacent lines. 
+ * @GST_VA_DEINTERLACE_WEAVE: Show both fields per frame. (don't use) + * @GST_VA_DEINTERLACE_ADAPTIVE: Interpolating missing lines by using + * spatial/temporal references. + * @GST_VA_DEINTERLACE_COMPENSATED: Recreating missing lines by using + * motion vector. + * + * Since: 1.20 + */ +/* *INDENT-OFF* */ +static const GEnumValue di_desc[] = { + [GST_VA_DEINTERLACE_BOB] = + { VAProcDeinterlacingBob, + "Bob: Interpolating missing lines by using the adjacent lines.", "bob" }, + [GST_VA_DEINTERLACE_WEAVE] = + { VAProcDeinterlacingWeave, "Weave: Show both fields per frame. (don't use)", + "weave" }, + [GST_VA_DEINTERLACE_ADAPTIVE] = + { VAProcDeinterlacingMotionAdaptive, + "Adaptive: Interpolating missing lines by using spatial/temporal references.", + "adaptive" }, + [GST_VA_DEINTERLACE_COMPENSATED] = + { VAProcDeinterlacingMotionCompensated, + "Compensation: Recreating missing lines by using motion vector.", + "compensated" }, +}; +/* *INDENT-ON* */ + +static GType +gst_va_deinterlace_methods_get_type (guint num_caps, + const VAProcFilterCapDeinterlacing * caps) +{ + guint i, j = 0; + static GType deinterlace_methods_type = 0; + static GEnumValue methods_types[VAProcDeinterlacingCount]; + + if (deinterlace_methods_type > 0) + return deinterlace_methods_type; + + for (i = 0; i < num_caps; i++) { + if (caps[i].type > VAProcDeinterlacingNone + && caps[i].type < VAProcDeinterlacingCount) + methods_types[j++] = di_desc[caps[i].type]; + } + + /* *INDENT-OFF* */ + methods_types[j] = (GEnumValue) { 0, NULL, NULL }; + /* *INDENT-ON* */ + + deinterlace_methods_type = g_enum_register_static ("GstVaDeinterlaceMethods", + (const GEnumValue *) methods_types); + + return deinterlace_methods_type; +} + +gboolean +gst_va_filter_install_deinterlace_properties (GstVaFilter * self, + GObjectClass * klass) +{ + GType type; + guint i; + const GParamFlags common_flags = G_PARAM_READWRITE + | G_PARAM_STATIC_STRINGS | GST_PARAM_MUTABLE_PLAYING; + + g_return_val_if_fail (GST_IS_VA_FILTER 
(self), FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!gst_va_filter_ensure_filters (self)) + return FALSE; + + for (i = 0; i < self->available_filters->len; i++) { + const struct VaFilter *filter = + &g_array_index (self->available_filters, struct VaFilter, i); + + if (filter->type == VAProcFilterDeinterlacing) { + guint i, default_method = 0; + const VAProcFilterCapDeinterlacing *caps = filter->caps.deint; + if (!caps) + break; + + /* use the first method in the list as default */ + for (i = 0; i < filter->num_caps; i++) { + if (caps[i].type > VAProcDeinterlacingNone + && caps[i].type < VAProcDeinterlacingCount) { + default_method = caps[i].type; + break; + } + } + + if (default_method == 0) + break; + + type = gst_va_deinterlace_methods_get_type (filter->num_caps, caps); + gst_type_mark_as_plugin_api (type, 0); + + /** + * GstVaDeinterlace:method + * + * Selects the deinterlacing algorithm to use. + * + * The number of available algorithms depends on the driver; they + * provide different quality and have different processing + * costs. 
+ * + * Since: 1.20 + */ + g_object_class_install_property (klass, + GST_VA_FILTER_PROP_DEINTERLACE_METHOD, + g_param_spec_enum ("method", "Method", "Deinterlace Method", + type, default_method, common_flags)); + + return TRUE; + } + } + + return FALSE; +} + +gboolean +gst_va_filter_has_filter (GstVaFilter * self, VAProcFilterType type) +{ + guint i; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!gst_va_filter_ensure_filters (self)) + return FALSE; + + for (i = 0; i < self->available_filters->len; i++) { + const struct VaFilter *filter = + &g_array_index (self->available_filters, struct VaFilter, i); + + if (filter->type == type) + return TRUE; + } + + return FALSE; +} + +const gpointer +gst_va_filter_get_filter_caps (GstVaFilter * self, VAProcFilterType type, + guint * num_caps) +{ + struct VaFilter *filter = NULL; + /* *INDENT-OFF* */ + static const VAProcFilterCap i965_ste_caps = { + .range = { + .min_value = 0.0, + .max_value = 1.0, + .default_value = 0.0, + .step = 1.0, + }, + }; + /* *INDENT-ON* */ + gpointer ret = NULL; + guint i; + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!gst_va_filter_ensure_filters (self)) + return FALSE; + + GST_OBJECT_LOCK (self); + for (i = 0; i < self->available_filters->len; i++) { + filter = &g_array_index (self->available_filters, struct VaFilter, i); + + if (filter->type == type) { + if (filter->num_caps > 0) + ret = &filter->caps; + else if (type == VAProcFilterSkinToneEnhancement && filter->num_caps == 0) + ret = (gpointer) & i965_ste_caps; + break; + } + } + + if (ret && filter && num_caps) + *num_caps = filter->num_caps; + GST_OBJECT_UNLOCK (self); + + return ret; +} + +guint32 +gst_va_filter_get_mem_types (GstVaFilter * self) +{ + guint32 ret; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), 0); + + GST_OBJECT_LOCK (self); + ret = self->mem_types; + GST_OBJECT_UNLOCK (self); + + return ret; +} + +GArray * 
+gst_va_filter_get_surface_formats (GstVaFilter * self) +{ + GArray *ret; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), NULL); + + GST_OBJECT_LOCK (self); + ret = self->surface_formats ? g_array_ref (self->surface_formats) : NULL; + GST_OBJECT_UNLOCK (self); + + return ret; +} + +static gboolean +_from_video_orientation_method (GstVideoOrientationMethod orientation, + guint * mirror, guint * rotation) +{ + switch (orientation) { + case GST_VIDEO_ORIENTATION_IDENTITY: + *mirror = VA_MIRROR_NONE; + *rotation = VA_ROTATION_NONE; + break; + case GST_VIDEO_ORIENTATION_HORIZ: + *mirror = VA_MIRROR_HORIZONTAL; + *rotation = VA_ROTATION_NONE; + break; + case GST_VIDEO_ORIENTATION_VERT: + *mirror = VA_MIRROR_VERTICAL; + *rotation = VA_ROTATION_NONE; + break; + case GST_VIDEO_ORIENTATION_90R: + *mirror = VA_MIRROR_NONE; + *rotation = VA_ROTATION_90; + break; + case GST_VIDEO_ORIENTATION_180: + *mirror = VA_MIRROR_NONE; + *rotation = VA_ROTATION_180; + break; + case GST_VIDEO_ORIENTATION_90L: + *mirror = VA_MIRROR_NONE; + *rotation = VA_ROTATION_270; + break; + case GST_VIDEO_ORIENTATION_UL_LR: + *mirror = VA_MIRROR_HORIZONTAL; + *rotation = VA_ROTATION_90; + break; + case GST_VIDEO_ORIENTATION_UR_LL: + *mirror = VA_MIRROR_VERTICAL; + *rotation = VA_ROTATION_90; + break; + default: + return FALSE; + break; + } + + return TRUE; +} + +gboolean +gst_va_filter_set_orientation (GstVaFilter * self, + GstVideoOrientationMethod orientation) +{ + guint32 mirror = VA_MIRROR_NONE, rotation = VA_ROTATION_NONE; + guint32 mirror_flags, rotation_flags; + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!_from_video_orientation_method (orientation, &mirror, &rotation)) + return FALSE; + + GST_OBJECT_LOCK (self); + mirror_flags = self->pipeline_caps.mirror_flags; + GST_OBJECT_UNLOCK (self); + + if (mirror != VA_MIRROR_NONE && !(mirror_flags & mirror)) + return FALSE; + + GST_OBJECT_LOCK (self); + rotation_flags = self->pipeline_caps.rotation_flags; + GST_OBJECT_UNLOCK (self); 
+ + if (rotation != VA_ROTATION_NONE && !(rotation_flags & (1 << rotation))) + return FALSE; + + GST_OBJECT_LOCK (self); + self->orientation = orientation; + self->mirror = mirror; + self->rotation = rotation; + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +GstVideoOrientationMethod +gst_va_filter_get_orientation (GstVaFilter * self) +{ + GstVideoOrientationMethod ret; + + GST_OBJECT_LOCK (self); + ret = self->orientation; + GST_OBJECT_UNLOCK (self); + + return ret; +} + +void +gst_va_filter_enable_cropping (GstVaFilter * self, gboolean cropping) +{ + GST_OBJECT_LOCK (self); + if (cropping != self->crop_enabled) + self->crop_enabled = cropping; + GST_OBJECT_UNLOCK (self); +} + +static inline GstCaps * +_create_base_caps (GstVaFilter * self) +{ + return gst_caps_new_simple ("video/x-raw", "width", GST_TYPE_INT_RANGE, + self->min_width, self->max_width, "height", GST_TYPE_INT_RANGE, + self->min_height, self->max_height, NULL); +} + +GstCaps * +gst_va_filter_get_caps (GstVaFilter * self) +{ + GArray *surface_formats = NULL, *image_formats = NULL; + GstCaps *caps, *base_caps, *feature_caps; + GstCapsFeatures *features; + guint32 mem_types; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), NULL); + + if (!gst_va_filter_is_open (self)) + return NULL; + + surface_formats = gst_va_filter_get_surface_formats (self); + if (!surface_formats) + return NULL; + + base_caps = _create_base_caps (self); + + if (!gst_caps_set_format_array (base_caps, surface_formats)) + goto fail; + + g_array_unref (surface_formats); + + caps = gst_caps_new_empty (); + + mem_types = gst_va_filter_get_mem_types (self); + + if (mem_types & VA_SURFACE_ATTRIB_MEM_TYPE_VA) { + feature_caps = gst_caps_copy (base_caps); + features = gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_VA); + gst_caps_set_features_simple (feature_caps, features); + caps = gst_caps_merge (caps, feature_caps); + } + if (mem_types & VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME + || mem_types & 
VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2) { + feature_caps = gst_caps_copy (base_caps); + features = gst_caps_features_from_string (GST_CAPS_FEATURE_MEMORY_DMABUF); + gst_caps_set_features_simple (feature_caps, features); + caps = gst_caps_merge (caps, feature_caps); + } + + gst_caps_unref (base_caps); + + base_caps = _create_base_caps (self); + + GST_OBJECT_LOCK (self); + image_formats = + self->image_formats ? g_array_ref (self->image_formats) : NULL; + GST_OBJECT_UNLOCK (self); + + if (image_formats) { + if (!gst_caps_set_format_array (base_caps, image_formats)) + goto fail; + g_array_unref (image_formats); + } + + return gst_caps_merge (caps, base_caps); + +fail: + { + g_clear_pointer (&surface_formats, g_array_unref); + g_clear_pointer (&image_formats, g_array_unref); + gst_caps_unref (base_caps); + return NULL; + } +} + +/* from va_vpp.h */ +/* *INDENT-OFF* */ +static const struct ColorPropertiesMap +{ + VAProcColorStandardType standard; + guint8 primaries; + guint8 transfer; + guint8 matrix; +} color_properties_map[] = { + { VAProcColorStandardBT601, 5, 6, 5 }, + { VAProcColorStandardBT601, 6, 6, 6 }, + { VAProcColorStandardBT709, 1, 1, 1 }, + { VAProcColorStandardBT470M, 4, 4, 4 }, + { VAProcColorStandardBT470BG, 5, 5, 5 }, + { VAProcColorStandardSMPTE170M, 6, 6, 6 }, + { VAProcColorStandardSMPTE240M, 7, 7, 7 }, + { VAProcColorStandardGenericFilm, 8, 1, 1 }, + { VAProcColorStandardSRGB, 1, 13, 0 }, + /* { VAProcColorStandardSTRGB, ?, ?, ? 
}, */ + { VAProcColorStandardXVYCC601, 1, 11, 5 }, + { VAProcColorStandardXVYCC709, 1, 11, 1 }, + { VAProcColorStandardBT2020, 9, 14, 9 }, +}; + /* *INDENT-ON* */ + +static guint8 +_get_chroma_siting (GstVideoChromaSite chrome_site) +{ + /* *INDENT-OFF* */ + static const struct ChromaSiteMap { + GstVideoChromaSite gst; + guint8 va; + } chroma_site_map[] = { + { GST_VIDEO_CHROMA_SITE_UNKNOWN, VA_CHROMA_SITING_UNKNOWN }, + { GST_VIDEO_CHROMA_SITE_NONE, VA_CHROMA_SITING_VERTICAL_CENTER + | VA_CHROMA_SITING_HORIZONTAL_CENTER }, + { GST_VIDEO_CHROMA_SITE_H_COSITED, VA_CHROMA_SITING_VERTICAL_CENTER + | VA_CHROMA_SITING_HORIZONTAL_LEFT }, + { GST_VIDEO_CHROMA_SITE_V_COSITED, VA_CHROMA_SITING_VERTICAL_TOP + | VA_CHROMA_SITING_VERTICAL_BOTTOM }, + { GST_VIDEO_CHROMA_SITE_COSITED, VA_CHROMA_SITING_VERTICAL_CENTER + | VA_CHROMA_SITING_HORIZONTAL_LEFT + | VA_CHROMA_SITING_VERTICAL_TOP + | VA_CHROMA_SITING_VERTICAL_BOTTOM }, + { GST_VIDEO_CHROMA_SITE_JPEG, VA_CHROMA_SITING_VERTICAL_CENTER + | VA_CHROMA_SITING_HORIZONTAL_CENTER }, + { GST_VIDEO_CHROMA_SITE_MPEG2, VA_CHROMA_SITING_VERTICAL_CENTER + | VA_CHROMA_SITING_HORIZONTAL_LEFT }, + { GST_VIDEO_CHROMA_SITE_DV, VA_CHROMA_SITING_VERTICAL_TOP + | VA_CHROMA_SITING_HORIZONTAL_LEFT }, + }; + /* *INDENT-ON* */ + guint i; + + for (i = 0; i < G_N_ELEMENTS (chroma_site_map); i++) { + if (chrome_site == chroma_site_map[i].gst) + return chroma_site_map[i].va; + } + + return VA_CHROMA_SITING_UNKNOWN; +} + +static guint8 +_get_color_range (GstVideoColorRange range) +{ + /* *INDENT-OFF* */ + static const struct ColorRangeMap { + GstVideoColorRange gst; + guint8 va; + } color_range_map[] = { + { GST_VIDEO_COLOR_RANGE_UNKNOWN, VA_SOURCE_RANGE_UNKNOWN }, + { GST_VIDEO_COLOR_RANGE_0_255, VA_SOURCE_RANGE_FULL }, + { GST_VIDEO_COLOR_RANGE_16_235, VA_SOURCE_RANGE_REDUCED }, + }; + /* *INDENT-ON* */ + guint i; + + for (i = 0; i < G_N_ELEMENTS (color_range_map); i++) { + if (range == color_range_map[i].gst) + return color_range_map[i].va; + } + + 
return VA_SOURCE_RANGE_UNKNOWN; +} + +static VAProcColorStandardType +_gst_video_colorimetry_to_va (const GstVideoColorimetry * const colorimetry) +{ + if (!colorimetry + || colorimetry->primaries == GST_VIDEO_COLOR_PRIMARIES_UNKNOWN) + return VAProcColorStandardNone; + + if (gst_video_colorimetry_matches (colorimetry, GST_VIDEO_COLORIMETRY_BT709)) + return VAProcColorStandardBT709; + + /* NOTE: VAProcColorStandardBT2020 in VAAPI is the same as + * GST_VIDEO_COLORIMETRY_BT2020_10 in gstreamer. */ + if (gst_video_colorimetry_matches (colorimetry, + GST_VIDEO_COLORIMETRY_BT2020_10) || + gst_video_colorimetry_matches (colorimetry, GST_VIDEO_COLORIMETRY_BT2020)) + return VAProcColorStandardBT2020; + + if (gst_video_colorimetry_matches (colorimetry, GST_VIDEO_COLORIMETRY_BT601)) + return VAProcColorStandardBT601; + + if (gst_video_colorimetry_matches (colorimetry, + GST_VIDEO_COLORIMETRY_SMPTE240M)) + return VAProcColorStandardSMPTE240M; + + if (gst_video_colorimetry_matches (colorimetry, GST_VIDEO_COLORIMETRY_SRGB)) + return VAProcColorStandardSRGB; + + return VAProcColorStandardNone; +} + +static void +_config_color_properties (VAProcColorStandardType * std, + VAProcColorProperties * props, const GstVideoInfo * info, + VAProcColorStandardType * standards, guint32 num_standards) +{ + GstVideoColorimetry colorimetry = GST_VIDEO_INFO_COLORIMETRY (info); + VAProcColorStandardType best; + gboolean has_explicit; + guint i, j, k; + gint score, bestscore = -1, worstscore; + + best = _gst_video_colorimetry_to_va (&colorimetry); + + has_explicit = FALSE; + for (i = 0; i < num_standards; i++) { + /* Find the exact match standard. 
*/ + if (standards[i] != VAProcColorStandardNone && standards[i] == best) + break; + + if (standards[i] == VAProcColorStandardExplicit) + has_explicit = TRUE; + } + + if (i < num_standards) { + *std = best; + goto set_properties; + } else if (has_explicit) { + *std = VAProcColorStandardExplicit; + goto set_properties; + } + + worstscore = 4 * (colorimetry.matrix != GST_VIDEO_COLOR_MATRIX_UNKNOWN + && colorimetry.matrix != GST_VIDEO_COLOR_MATRIX_RGB) + + 2 * (colorimetry.transfer != GST_VIDEO_TRANSFER_UNKNOWN) + + (colorimetry.primaries != GST_VIDEO_COLOR_PRIMARIES_UNKNOWN); + + if (worstscore == 0) { + /* No properties specified, there's not a useful choice. */ + *std = VAProcColorStandardNone; + *props = (VAProcColorProperties) { + }; + + return; + } + + best = VAProcColorStandardNone; + k = -1; + for (i = 0; i < num_standards; i++) { + for (j = 0; j < G_N_ELEMENTS (color_properties_map); j++) { + if (color_properties_map[j].standard != standards[i]) + continue; + + score = 0; + if (colorimetry.matrix != GST_VIDEO_COLOR_MATRIX_UNKNOWN + && colorimetry.matrix != GST_VIDEO_COLOR_MATRIX_RGB) + score += 4 * (colorimetry.matrix != color_properties_map[j].matrix); + if (colorimetry.transfer != GST_VIDEO_TRANSFER_UNKNOWN) + score += 2 * (colorimetry.transfer != color_properties_map[j].transfer); + if (colorimetry.primaries != GST_VIDEO_COLOR_PRIMARIES_UNKNOWN) + score += (colorimetry.primaries != color_properties_map[j].primaries); + + if (score < worstscore && (bestscore == -1 || score < bestscore)) { + bestscore = score; + best = color_properties_map[j].standard; + k = j; + } + } + } + + if (best != VAProcColorStandardNone) { + *std = best; + colorimetry.matrix = color_properties_map[k].matrix; + colorimetry.transfer = color_properties_map[k].transfer; + colorimetry.primaries = color_properties_map[k].primaries; + } + +set_properties: + /* *INDENT-OFF* */ + *props = (VAProcColorProperties) { + .chroma_sample_location = + _get_chroma_siting (GST_VIDEO_INFO_CHROMA_SITE 
(info)), + .color_range = _get_color_range (colorimetry.range), + .colour_primaries = + gst_video_color_primaries_to_iso (colorimetry.primaries), + .transfer_characteristics = + gst_video_transfer_function_to_iso (colorimetry.transfer), + .matrix_coefficients = + gst_video_color_matrix_to_iso (colorimetry.matrix), + }; + /* *INDENT-ON* */ +} + +gboolean +gst_va_filter_set_video_info (GstVaFilter * self, GstVideoInfo * in_info, + GstVideoInfo * out_info) +{ + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + g_return_val_if_fail (out_info && in_info, FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + GST_OBJECT_LOCK (self); + /* *INDENT-OFF* */ + self->input_region = (VARectangle) { + .width = GST_VIDEO_INFO_WIDTH (in_info), + .height = GST_VIDEO_INFO_HEIGHT (in_info), + }; + + self->output_region = (VARectangle) { + .width = GST_VIDEO_INFO_WIDTH (out_info), + .height = GST_VIDEO_INFO_HEIGHT (out_info), + }; + /* *INDENT-ON* */ + + _config_color_properties (&self->input_color_standard, + &self->input_color_properties, in_info, + self->pipeline_caps.input_color_standards, + self->pipeline_caps.num_input_color_standards); + _config_color_properties (&self->output_color_standard, + &self->output_color_properties, out_info, + self->pipeline_caps.output_color_standards, + self->pipeline_caps.num_output_color_standards); + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gboolean +_query_pipeline_caps (GstVaFilter * self, GArray * filters, + VAProcPipelineCaps * caps) +{ + VABufferID *va_filters = NULL; + VADisplay dpy; + VAStatus status; + guint32 num_filters = 0; + + GST_OBJECT_LOCK (self); + if (filters) { + num_filters = filters->len; + va_filters = (num_filters > 0) ? 
(VABufferID *) filters->data : NULL; + } + GST_OBJECT_UNLOCK (self); + + dpy = gst_va_display_get_va_dpy (self->display); + + gst_va_display_lock (self->display); + status = vaQueryVideoProcPipelineCaps (dpy, self->context, va_filters, + num_filters, caps); + gst_va_display_unlock (self->display); + + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaQueryVideoProcPipelineCaps: %s", + vaErrorStr (status)); + return FALSE; + } + + return TRUE; +} + +gboolean +gst_va_filter_add_deinterlace_buffer (GstVaFilter * self, + VAProcDeinterlacingType method, guint32 * forward, guint32 * backward) +{ + GArray *filters = NULL; + VAProcFilterParameterBufferDeinterlacing params = { + .type = VAProcFilterDeinterlacing, + .algorithm = method, + }; + VAProcPipelineCaps pipeline_caps = { 0, }; + gboolean ret; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!(method != VAProcDeinterlacingNone + && method != VAProcDeinterlacingCount)) + return FALSE; + + if (!gst_va_filter_add_filter_buffer (self, ¶ms, sizeof (params), 1)) + return FALSE; + + GST_OBJECT_LOCK (self); + if (self->filters) + filters = g_array_ref (self->filters); + GST_OBJECT_UNLOCK (self); + ret = _query_pipeline_caps (self, filters, &pipeline_caps); + if (filters) + g_array_unref (filters); + if (!ret) + return FALSE; + + if (forward) + *forward = pipeline_caps.num_forward_references; + if (backward) + *backward = pipeline_caps.num_backward_references; + + return TRUE; +} + +gboolean +gst_va_filter_add_filter_buffer (GstVaFilter * self, gpointer data, gsize size, + guint num) +{ + VABufferID buffer; + VADisplay dpy; + VAStatus status; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + g_return_val_if_fail (data && size > 0, FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + dpy = gst_va_display_get_va_dpy (self->display); + gst_va_display_lock (self->display); + status = vaCreateBuffer (dpy, self->context, 
VAProcFilterParameterBufferType, + size, num, data, &buffer); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaCreateBuffer: %s", vaErrorStr (status)); + return FALSE; + } + + /* lazy creation */ + GST_OBJECT_LOCK (self); + if (!self->filters) + self->filters = g_array_sized_new (FALSE, FALSE, sizeof (VABufferID), 16); + + g_array_append_val (self->filters, buffer); + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gboolean +_destroy_filters_unlocked (GstVaFilter * self) +{ + VABufferID buffer; + VADisplay dpy; + VAStatus status; + gboolean ret = TRUE; + guint i; + + GST_TRACE_OBJECT (self, "Destroying %u filter buffers", self->filters->len); + + dpy = gst_va_display_get_va_dpy (self->display); + + for (i = 0; i < self->filters->len; i++) { + buffer = g_array_index (self->filters, VABufferID, i); + + gst_va_display_lock (self->display); + status = vaDestroyBuffer (dpy, buffer); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + ret = FALSE; + GST_WARNING_OBJECT (self, "Failed to destroy filter buffer: %s", + vaErrorStr (status)); + } + } + + self->filters = g_array_set_size (self->filters, 0); + + return ret; +} + +gboolean +gst_va_filter_drop_filter_buffers (GstVaFilter * self) +{ + gboolean ret = TRUE; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + + GST_OBJECT_LOCK (self); + if (self->filters) + ret = _destroy_filters_unlocked (self); + GST_OBJECT_UNLOCK (self); + + return ret; +} + +static gboolean +_fill_va_sample (GstVaFilter * self, GstVaSample * sample, + GstPadDirection direction) +{ + GstVideoCropMeta *crop = NULL; + + if (sample->buffer) + sample->surface = gst_va_buffer_get_surface (sample->buffer); + if (sample->surface == VA_INVALID_ID) + return FALSE; + + /* @FIXME: in gallium vaQuerySurfaceStatus only seems to work with + * encoder's surfaces */ + if (!GST_VA_DISPLAY_IS_IMPLEMENTATION (self->display, MESA_GALLIUM)) { + if (!va_check_surface 
(self->display, sample->surface)) + return FALSE; + } + + /* XXX: cropping occurs only in input frames */ + if (direction == GST_PAD_SRC) { + GST_OBJECT_LOCK (self); + sample->rect = self->output_region; + sample->rect.x = sample->borders_w / 2; + sample->rect.y = sample->borders_h / 2; + sample->rect.width -= sample->borders_w; + sample->rect.height -= sample->borders_h; + GST_OBJECT_UNLOCK (self); + + return TRUE; + } + + /* if buffer has crop meta, its real size is in video meta */ + if (sample->buffer) + crop = gst_buffer_get_video_crop_meta (sample->buffer); + + GST_OBJECT_LOCK (self); + if (crop && self->crop_enabled) { + /* *INDENT-OFF* */ + sample->rect = (VARectangle) { + .x = crop->x, + .y = crop->y, + .width = crop->width, + .height = crop->height + }; + /* *INDENT-ON* */ + } else { + sample->rect = self->input_region; + } + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gboolean +_create_pipeline_buffer (GstVaFilter * self, GstVaSample * src, + GstVaSample * dst, GArray * filters, VABufferID * buffer) +{ + VADisplay dpy; + VAStatus status; + VABufferID *va_filters = NULL; + VAProcPipelineParameterBuffer params; + guint32 num_filters = 0; + + GST_OBJECT_LOCK (self); + + /* *INDENT-OFF* */ + if (filters) { + num_filters = filters->len; + va_filters = (num_filters > 0) ? 
(VABufferID *) filters->data : NULL; + } + params = (VAProcPipelineParameterBuffer) { + .surface = src->surface, + .surface_region = &src->rect, + .surface_color_standard = self->input_color_standard, + .output_region = &dst->rect, + .output_background_color = 0xff000000, /* ARGB black */ + .output_color_standard = self->output_color_standard, + .filters = va_filters, + .num_filters = num_filters, + .forward_references = src->forward_references, + .num_forward_references = src->num_forward_references, + .backward_references = src->backward_references, + .num_backward_references = src->num_backward_references, + .rotation_state = self->rotation, + .mirror_state = self->mirror, + .input_surface_flag = src->flags, + .output_surface_flag = dst->flags, + .input_color_properties = self->input_color_properties, + .output_color_properties = self->output_color_properties, + }; + /* *INDENT-ON* */ + + GST_OBJECT_UNLOCK (self); + + dpy = gst_va_display_get_va_dpy (self->display); + gst_va_display_lock (self->display); + status = vaCreateBuffer (dpy, self->context, + VAProcPipelineParameterBufferType, sizeof (params), 1, ¶ms, buffer); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaCreateBuffer: %s", vaErrorStr (status)); + return FALSE; + } + + GST_TRACE_OBJECT (self, "Created VABufferID %#x with %u filters", *buffer, + num_filters); + + return TRUE; +} + +gboolean +gst_va_filter_process (GstVaFilter * self, GstVaSample * src, GstVaSample * dst) +{ + GArray *filters = NULL; + VABufferID buffer; + VADisplay dpy; + VAProcPipelineCaps pipeline_caps = { 0, }; + VAStatus status; + gboolean ret = FALSE; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + g_return_val_if_fail (src, FALSE); + g_return_val_if_fail (dst, FALSE); + + if (!gst_va_filter_is_open (self)) + return FALSE; + + if (!(_fill_va_sample (self, src, GST_PAD_SINK) + && _fill_va_sample (self, dst, GST_PAD_SRC))) + return FALSE; + + GST_OBJECT_LOCK 
(self); + if (self->filters) + filters = g_array_ref (self->filters); + GST_OBJECT_UNLOCK (self); + + if (!_query_pipeline_caps (self, filters, &pipeline_caps)) + return FALSE; + + if (!_create_pipeline_buffer (self, src, dst, filters, &buffer)) + return FALSE; + + if (filters) + g_array_unref (filters); + + dpy = gst_va_display_get_va_dpy (self->display); + + gst_va_display_lock (self->display); + status = vaBeginPicture (dpy, self->context, dst->surface); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaBeginPicture: %s", vaErrorStr (status)); + return FALSE; + } + + gst_va_display_lock (self->display); + status = vaRenderPicture (dpy, self->context, &buffer, 1); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaRenderPicture: %s", vaErrorStr (status)); + goto fail_end_pic; + } + + gst_va_display_lock (self->display); + status = vaEndPicture (dpy, self->context); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR_OBJECT (self, "vaEndPicture: %s", vaErrorStr (status)); + goto bail; + } + + ret = TRUE; + +bail: + gst_va_display_lock (self->display); + status = vaDestroyBuffer (dpy, buffer); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING_OBJECT (self, "Failed to destroy pipeline buffer: %s", + vaErrorStr (status)); + } + + return ret; + +fail_end_pic: + { + gst_va_display_lock (self->display); + status = vaEndPicture (dpy, self->context); + gst_va_display_unlock (self->display); + if (status != VA_STATUS_SUCCESS) + GST_ERROR_OBJECT (self, "vaEndPicture: %s", vaErrorStr (status)); + goto bail; + } +} + +/** + * gst_va_buffer_get_surface_flags: + * @buffer: the #GstBuffer to check. + * @info: the #GstVideoInfo with info. + * + * Gets the surface flags, related with interlace given @buffer and + * @info. + * + * Returns: VA surface flags. 
+ */ +guint32 +gst_va_buffer_get_surface_flags (GstBuffer * buffer, GstVideoInfo * info) +{ + guint32 surface_flags = 0; + + if (GST_VIDEO_INFO_INTERLACE_MODE (info) == GST_VIDEO_INTERLACE_MODE_MIXED + || (GST_VIDEO_INFO_INTERLACE_MODE (info) == + GST_VIDEO_INTERLACE_MODE_INTERLEAVED + && GST_VIDEO_INFO_FIELD_ORDER (info) == + GST_VIDEO_FIELD_ORDER_UNKNOWN)) { + if (GST_BUFFER_FLAG_IS_SET (buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED)) { + if (GST_BUFFER_FLAG_IS_SET (buffer, GST_VIDEO_BUFFER_FLAG_TFF)) { + surface_flags = VA_TOP_FIELD_FIRST; + } else { + surface_flags = VA_BOTTOM_FIELD_FIRST; + } + } else { + surface_flags = VA_FRAME_PICTURE; + } + } else if (GST_VIDEO_INFO_FIELD_ORDER (info) == + GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST) { + surface_flags = VA_BOTTOM_FIELD_FIRST; + } else if (GST_VIDEO_INFO_FIELD_ORDER (info) == + GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST) { + surface_flags = VA_TOP_FIELD_FIRST; + } + + return surface_flags; +} + +gboolean +gst_va_filter_has_video_format (GstVaFilter * self, GstVideoFormat format, + GstCapsFeatures * feature) +{ + guint i; + GstVideoFormat fmt; + + g_return_val_if_fail (GST_IS_VA_FILTER (self), FALSE); + g_return_val_if_fail (format != GST_VIDEO_FORMAT_UNKNOWN, FALSE); + g_return_val_if_fail (GST_IS_CAPS_FEATURES (feature) + && !gst_caps_features_is_any (feature), FALSE); + + + GST_OBJECT_LOCK (self); + for (i = 0; i < self->surface_formats->len; i++) { + fmt = g_array_index (self->surface_formats, GstVideoFormat, i); + if (format == fmt) { + GST_OBJECT_UNLOCK (self); + return TRUE; + } + } + GST_OBJECT_UNLOCK (self); + + if (!gst_caps_features_is_equal (feature, + GST_CAPS_FEATURES_MEMORY_SYSTEM_MEMORY)) + return FALSE; + + GST_OBJECT_LOCK (self); + for (i = 0; i < self->image_formats->len; i++) { + fmt = g_array_index (self->image_formats, GstVideoFormat, i); + if (format == fmt) { + GST_OBJECT_UNLOCK (self); + return TRUE; + } + } + GST_OBJECT_UNLOCK (self); + + return FALSE; +}
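The interlace handling in `gst_va_buffer_get_surface_flags()` above follows one rule: for mixed-mode streams (or interleaved streams whose caps carry no field order) the decision is made per buffer from the buffer flags, otherwise the stream-wide field order wins. A minimal standalone sketch of that decision, using hypothetical stand-in constants and enums instead of the real libva/GStreamer ones:

```c
#include <assert.h>

/* Hypothetical stand-ins for the libva surface flags from <va/va.h>;
 * the concrete values here are for illustration only. */
#define VA_FRAME_PICTURE      0x0
#define VA_TOP_FIELD_FIRST    0x1
#define VA_BOTTOM_FIELD_FIRST 0x2

/* Simplified models of GstVideoInterlaceMode / GstVideoFieldOrder. */
typedef enum { MODE_PROGRESSIVE, MODE_INTERLEAVED, MODE_MIXED } InterlaceMode;
typedef enum { ORDER_UNKNOWN, ORDER_TFF, ORDER_BFF } FieldOrder;

/* Per-buffer flags, mirroring GST_VIDEO_BUFFER_FLAG_INTERLACED / _TFF. */
typedef struct { int interlaced; int tff; } BufFlags;

static unsigned
surface_flags (InterlaceMode mode, FieldOrder order, BufFlags f)
{
  /* Mixed streams, and interleaved streams without a caps-level field
   * order, are decided per buffer. */
  if (mode == MODE_MIXED || (mode == MODE_INTERLEAVED && order == ORDER_UNKNOWN)) {
    if (!f.interlaced)
      return VA_FRAME_PICTURE;
    return f.tff ? VA_TOP_FIELD_FIRST : VA_BOTTOM_FIELD_FIRST;
  }
  /* Otherwise the stream-wide field order wins. */
  if (order == ORDER_BFF)
    return VA_BOTTOM_FIELD_FIRST;
  if (order == ORDER_TFF)
    return VA_TOP_FIELD_FIRST;
  return VA_FRAME_PICTURE;  /* progressive / unknown */
}
```

This is only a model of the control flow, not the upstream function; the real code reads the mode and order out of a `GstVideoInfo` and the flags out of a `GstBuffer`.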
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvafilter.h
Added
@@ -0,0 +1,125 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/va/gstvadisplay.h> +#include <gst/video/video.h> + +#include <va/va.h> +#include <va/va_vpp.h> + +G_BEGIN_DECLS + +#define GST_TYPE_VA_FILTER (gst_va_filter_get_type()) +G_DECLARE_FINAL_TYPE (GstVaFilter, gst_va_filter, GST, VA_FILTER, GstObject) + +typedef enum { + GST_VA_DEINTERLACE_BOB = VAProcDeinterlacingBob, + GST_VA_DEINTERLACE_WEAVE = VAProcDeinterlacingWeave, + GST_VA_DEINTERLACE_ADAPTIVE = VAProcDeinterlacingMotionAdaptive, + GST_VA_DEINTERLACE_COMPENSATED = VAProcDeinterlacingMotionCompensated, +} GstVaDeinterlaceMethods; + +enum { + GST_VA_FILTER_PROP_DENOISE = 1, + GST_VA_FILTER_PROP_SHARPEN, + GST_VA_FILTER_PROP_SKINTONE, + GST_VA_FILTER_PROP_VIDEO_DIR, + GST_VA_FILTER_PROP_HUE, + GST_VA_FILTER_PROP_SATURATION, + GST_VA_FILTER_PROP_BRIGHTNESS, + GST_VA_FILTER_PROP_CONTRAST, + GST_VA_FILTER_PROP_AUTO_SATURATION, + GST_VA_FILTER_PROP_AUTO_BRIGHTNESS, + GST_VA_FILTER_PROP_AUTO_CONTRAST, + GST_VA_FILTER_PROP_DISABLE_PASSTHROUGH, + GST_VA_FILTER_PROP_DEINTERLACE_METHOD, + GST_VA_FILTER_PROP_ADD_BORDERS, + GST_VA_FILTER_PROP_LAST +}; + +typedef struct _GstVaSample 
GstVaSample; +struct _GstVaSample +{ + GstBuffer *buffer; + guint32 flags; + + /* references for (de)interlacing */ + VASurfaceID *forward_references; + guint num_forward_references; + VASurfaceID *backward_references; + guint num_backward_references; + + /* borders to preserve dar */ + gint borders_h; + gint borders_w; + + /*< private >*/ + VASurfaceID surface; + VARectangle rect; +}; + +GstVaFilter * gst_va_filter_new (GstVaDisplay * display); +gboolean gst_va_filter_open (GstVaFilter * self); +gboolean gst_va_filter_close (GstVaFilter * self); +gboolean gst_va_filter_is_open (GstVaFilter * self); +gboolean gst_va_filter_has_filter (GstVaFilter * self, + VAProcFilterType type); +gboolean gst_va_filter_install_properties (GstVaFilter * self, + GObjectClass * klass); +gboolean gst_va_filter_install_deinterlace_properties + (GstVaFilter * self, + GObjectClass * klass); +gboolean gst_va_filter_set_orientation (GstVaFilter * self, + GstVideoOrientationMethod orientation); +GstVideoOrientationMethod gst_va_filter_get_orientation (GstVaFilter * self); +void gst_va_filter_enable_cropping (GstVaFilter * self, + gboolean cropping); +const gpointer gst_va_filter_get_filter_caps (GstVaFilter * self, + VAProcFilterType type, + guint * num_caps); +guint32 gst_va_filter_get_mem_types (GstVaFilter * self); +GArray * gst_va_filter_get_surface_formats (GstVaFilter * self); +GstCaps * gst_va_filter_get_caps (GstVaFilter * self); +gboolean gst_va_filter_set_video_info (GstVaFilter * self, + GstVideoInfo * in_info, + GstVideoInfo * out_info); +gboolean gst_va_filter_add_filter_buffer (GstVaFilter * self, + gpointer data, + gsize size, + guint num); +gboolean gst_va_filter_add_deinterlace_buffer(GstVaFilter * self, + VAProcDeinterlacingType method, + guint32 * forward, + guint32 * backward); +gboolean gst_va_filter_drop_filter_buffers (GstVaFilter * self); +gboolean gst_va_filter_process (GstVaFilter * self, + GstVaSample * src, + GstVaSample * dest); + +guint32 
gst_va_buffer_get_surface_flags (GstBuffer * buffer, + GstVideoInfo * info); + +gboolean gst_va_filter_has_video_format (GstVaFilter * self, + GstVideoFormat format, + GstCapsFeatures * feature); + +G_END_DECLS
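The color-standard negotiation in `_config_color_properties()` (gstvafilter.c above) picks the candidate with the lowest weighted mismatch: a matrix mismatch costs 4, a transfer mismatch 2, a primaries mismatch 1, and a candidate only wins if it beats the score of matching nothing at all. A simplified self-contained sketch of that selection, with a cut-down candidate table taken from the in-file `color_properties_map` (ISO-style numeric codes; the upstream special-casing of RGB matrices is omitted here):

```c
#include <assert.h>
#include <stddef.h>

/* Simplified model of the weighted color-standard match in
 * _config_color_properties(); 0 means "unknown" for each field. */
typedef struct { int standard; int primaries; int transfer; int matrix; } Entry;

/* Cut-down candidate table following the in-file map:
 * BT709 = 1/1/1, BT601 = 5/6/5, BT2020 = 9/14/9. */
static const Entry table[] = {
  { 709,  1,  1, 1 },
  { 601,  5,  6, 5 },
  { 2020, 9, 14, 9 },
};

/* Returns the best-matching standard, or 0 when no candidate beats the
 * all-mismatch score, mirroring the upstream fallback to "None". */
static int
best_standard (int primaries, int transfer, int matrix)
{
  int worst = 4 * (matrix != 0) + 2 * (transfer != 0) + (primaries != 0);
  int best = 0, bestscore = -1;
  size_t i;

  if (worst == 0)
    return 0;  /* no colorimetry information at all */

  for (i = 0; i < sizeof (table) / sizeof (table[0]); i++) {
    int score = 0;
    if (matrix != 0)
      score += 4 * (matrix != table[i].matrix);
    if (transfer != 0)
      score += 2 * (transfer != table[i].transfer);
    if (primaries != 0)
      score += (primaries != table[i].primaries);
    if (score < worst && (bestscore == -1 || score < bestscore)) {
      bestscore = score;
      best = table[i].standard;
    }
  }
  return best;
}
```

So BT.709 colorimetry (1/1/1) matches exactly with score 0, while e.g. primaries/transfer of BT.709 paired with a BT.601 matrix scores 3 against the BT.601 row and is mapped there; colorimetry that mismatches every row is rejected, as in the upstream fallback.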
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvah264dec.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvah264dec.c
Changed
@@ -18,28 +18,46 @@ * Boston, MA 02110-1301, USA. */ +/** + * SECTION:element-vah264dec + * @title: vah264dec + * @short_description: A VA-API based H264 video decoder + * + * vah264dec decodes H264 bitstreams to VA surfaces using the + * installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver. + * + * The decoding surfaces can be mapped onto main memory as video + * frames. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=big_buck_bunny.mov ! parsebin ! vah264dec ! autovideosink + * ``` + * + * Since: 1.18 + * + */ + +/* ToDo: + * + * + multiview and stereo profiles + */ + #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstvah264dec.h" -#include <gst/codecs/gsth264decoder.h> - -#include <va/va_drmcommon.h> - -#include "gstvaallocator.h" -#include "gstvacaps.h" -#include "gstvadecoder.h" -#include "gstvadevice.h" -#include "gstvadisplay_drm.h" -#include "gstvapool.h" -#include "gstvaprofile.h" -#include "gstvautils.h" -#include "gstvavideoformat.h" +#include "gstvabasedec.h" GST_DEBUG_CATEGORY_STATIC (gst_va_h264dec_debug); +#ifndef GST_DISABLE_GST_DEBUG #define GST_CAT_DEFAULT gst_va_h264dec_debug +#else +#define GST_CAT_DEFAULT NULL +#endif #define GST_VA_H264_DEC(obj) ((GstVaH264Dec *) obj) #define GST_VA_H264_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaH264DecClass)) @@ -50,128 +68,58 @@ struct _GstVaH264DecClass { - GstH264DecoderClass parent_class; - - gchar *render_device_path; + GstVaBaseDecClass parent_class; }; struct _GstVaH264Dec { - GstH264Decoder parent; - - GstVaDisplay *display; - GstVaDecoder *decoder; - - GstBufferPool *other_pool; + GstVaBaseDec parent; GstFlowReturn last_ret; - GstVideoCodecState *output_state; - VAProfile profile; - gint display_width; - gint display_height; gint coded_width; gint coded_height; - guint rt_format; gint dpb_size; - gboolean need_negotiation; - gboolean need_cropping; - gboolean has_videometa; - gboolean copy_frames; + 
/* Used to fill VAPictureParameterBufferH264.ReferenceFrames */ + GArray *ref_list; + + gboolean interlaced; }; static GstElementClass *parent_class = NULL; -struct CData -{ - gchar *render_device_path; - gchar *description; - GstCaps *sink_caps; - GstCaps *src_caps; -}; - /* *INDENT-OFF* */ -static const gchar *src_caps_str = GST_VIDEO_CAPS_MAKE_WITH_FEATURES ("memory:VAMemory", - "{ NV12, P010_10LE }") " ;" GST_VIDEO_CAPS_MAKE ("{ NV12, P010_10LE }"); +static const gchar *src_caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12, P010_10LE }") " ;" + GST_VIDEO_CAPS_MAKE ("{ NV12, P010_10LE }"); /* *INDENT-ON* */ static const gchar *sink_caps_str = "video/x-h264"; -static gboolean +static GstFlowReturn gst_va_h264_dec_end_picture (GstH264Decoder * decoder, GstH264Picture * picture) { - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); GstVaDecodePicture *va_pic; - GST_LOG_OBJECT (self, "end picture %p, (poc %d)", + GST_LOG_OBJECT (base, "end picture %p, (poc %d)", picture, picture->pic_order_cnt); va_pic = gst_h264_picture_get_user_data (picture); - return gst_va_decoder_decode (self->decoder, va_pic); -} - -static gboolean -_copy_output_buffer (GstVaH264Dec * self, GstVideoCodecFrame * codec_frame) -{ - GstVideoFrame src_frame; - GstVideoFrame dest_frame; - GstVideoInfo dest_vinfo; - GstBuffer *buffer; - GstFlowReturn ret; - - if (!self->other_pool) - return FALSE; - - if (!gst_buffer_pool_set_active (self->other_pool, TRUE)) - return FALSE; - - gst_video_info_set_format (&dest_vinfo, - GST_VIDEO_INFO_FORMAT (&self->output_state->info), self->display_width, - self->display_height); - - ret = gst_buffer_pool_acquire_buffer (self->other_pool, &buffer, NULL); - if (ret != GST_FLOW_OK) - goto fail; + if (!gst_va_decoder_decode (base->decoder, va_pic)) + return GST_FLOW_ERROR; - if (!gst_video_frame_map (&src_frame, &self->output_state->info, - codec_frame->output_buffer, GST_MAP_READ)) - 
goto fail; - - if (!gst_video_frame_map (&dest_frame, &dest_vinfo, buffer, GST_MAP_WRITE)) { - gst_video_frame_unmap (&dest_frame); - goto fail; - } - - /* gst_video_frame_copy can crop this, but does not know, so let - * make it think it's all right */ - GST_VIDEO_INFO_WIDTH (&src_frame.info) = self->display_width; - GST_VIDEO_INFO_HEIGHT (&src_frame.info) = self->display_height; - - if (!gst_video_frame_copy (&dest_frame, &src_frame)) { - gst_video_frame_unmap (&src_frame); - gst_video_frame_unmap (&dest_frame); - goto fail; - } - - gst_video_frame_unmap (&src_frame); - gst_video_frame_unmap (&dest_frame); - gst_buffer_replace (&codec_frame->output_buffer, buffer); - gst_buffer_unref (buffer); - - return TRUE; - -fail: - GST_ERROR_OBJECT (self, "Failed copy output buffer."); - return FALSE; + return GST_FLOW_OK; } - static GstFlowReturn gst_va_h264_dec_output_picture (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture) { + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); GstVaH264Dec *self = GST_VA_H264_DEC (decoder); GST_LOG_OBJECT (self, @@ -183,16 +131,19 @@ return self->last_ret; } - if (self->copy_frames) - _copy_output_buffer (self, frame); + if (base->copy_frames) + gst_va_base_dec_copy_output_buffer (base, frame); - GST_BUFFER_PTS (frame->output_buffer) = GST_BUFFER_PTS (frame->input_buffer); - GST_BUFFER_DTS (frame->output_buffer) = GST_CLOCK_TIME_NONE; - GST_BUFFER_DURATION (frame->output_buffer) = - GST_BUFFER_DURATION (frame->input_buffer); + if (picture->buffer_flags != 0) { + gboolean interlaced = + (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_INTERLACED) != 0; + gboolean tff = (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_TFF) != 0; - GST_LOG_OBJECT (self, "Finish frame %" GST_TIME_FORMAT, - GST_TIME_ARGS (GST_BUFFER_PTS (frame->output_buffer))); + GST_TRACE_OBJECT (self, + "apply buffer flags 0x%x (interlaced %d, top-field-first %d)", + picture->buffer_flags, interlaced, tff); + GST_BUFFER_FLAG_SET 
(frame->output_buffer, picture->buffer_flags); + } gst_h264_picture_unref (picture); @@ -210,7 +161,8 @@ } static void -_fill_vaapi_pic (VAPictureH264 * va_picture, GstH264Picture * picture) +_fill_vaapi_pic (VAPictureH264 * va_picture, GstH264Picture * picture, + gboolean merge_other_field) { GstVaDecodePicture *va_pic; @@ -221,14 +173,14 @@ return; } - va_picture->picture_id = va_pic->surface; + va_picture->picture_id = gst_va_decode_picture_get_surface (va_pic); va_picture->flags = 0; - if (picture->ref && picture->long_term) { + if (GST_H264_PICTURE_IS_LONG_TERM_REF (picture)) { va_picture->flags |= VA_PICTURE_H264_LONG_TERM_REFERENCE; va_picture->frame_idx = picture->long_term_frame_idx; } else { - if (picture->ref) + if (GST_H264_PICTURE_IS_SHORT_TERM_REF (picture)) va_picture->flags |= VA_PICTURE_H264_SHORT_TERM_REFERENCE; va_picture->frame_idx = picture->frame_num; } @@ -239,13 +191,23 @@ va_picture->BottomFieldOrderCnt = picture->bottom_field_order_cnt; break; case GST_H264_PICTURE_FIELD_TOP_FIELD: - va_picture->flags |= VA_PICTURE_H264_TOP_FIELD; + if (merge_other_field && picture->other_field) { + va_picture->BottomFieldOrderCnt = + picture->other_field->bottom_field_order_cnt; + } else { + va_picture->flags |= VA_PICTURE_H264_TOP_FIELD; + va_picture->BottomFieldOrderCnt = 0; + } va_picture->TopFieldOrderCnt = picture->top_field_order_cnt; - va_picture->BottomFieldOrderCnt = 0; break; case GST_H264_PICTURE_FIELD_BOTTOM_FIELD: - va_picture->flags |= VA_PICTURE_H264_BOTTOM_FIELD; - va_picture->TopFieldOrderCnt = 0; + if (merge_other_field && picture->other_field) { + va_picture->TopFieldOrderCnt = + picture->other_field->top_field_order_cnt; + } else { + va_picture->flags |= VA_PICTURE_H264_BOTTOM_FIELD; + va_picture->TopFieldOrderCnt = 0; + } va_picture->BottomFieldOrderCnt = picture->bottom_field_order_cnt; break; default: @@ -258,13 +220,21 @@ /* fill the VA API reference picture lists from the GstCodec reference * picture list */ static void 
-_fill_ref_pic_list (VAPictureH264 va_reflist[32], GArray * reflist) +_fill_ref_pic_list (VAPictureH264 va_reflist[32], GArray * reflist, + GstH264Picture * current_picture) { guint i; for (i = 0; i < reflist->len; i++) { GstH264Picture *picture = g_array_index (reflist, GstH264Picture *, i); - _fill_vaapi_pic (&va_reflist[i], picture); + + if (picture) { + _fill_vaapi_pic (&va_reflist[i], picture, + GST_H264_PICTURE_IS_FRAME (current_picture)); + } else { + /* list might include null picture if reference picture was missing */ + _init_vaapi_pic (&va_reflist[i]); + } } for (; i < 32; i++) @@ -353,19 +323,16 @@ return 8 * nal_header_bytes + header->header_size - epb_count * 8; } -static gboolean +static GstFlowReturn gst_va_h264_dec_decode_slice (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GArray * ref_pic_list0, GArray * ref_pic_list1) { GstH264SliceHdr *header = &slice->header; GstH264NalUnit *nalu = &slice->nalu; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); GstVaDecodePicture *va_pic; - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); VASliceParameterBufferH264 slice_param; - gboolean ret; - - GST_TRACE_OBJECT (self, "-"); /* *INDENT-OFF* */ slice_param = (VASliceParameterBufferH264) { @@ -387,37 +354,35 @@ }; /* *INDENT-ON* */ - _fill_ref_pic_list (slice_param.RefPicList0, ref_pic_list0); - _fill_ref_pic_list (slice_param.RefPicList1, ref_pic_list1); + _fill_ref_pic_list (slice_param.RefPicList0, ref_pic_list0, picture); + _fill_ref_pic_list (slice_param.RefPicList1, ref_pic_list1, picture); _fill_pred_weight_table (header, &slice_param); va_pic = gst_h264_picture_get_user_data (picture); - ret = gst_va_decoder_add_slice_buffer (self->decoder, va_pic, &slice_param, - sizeof (slice_param), slice->nalu.data + slice->nalu.offset, - slice->nalu.size); - if (!ret) { - gst_va_decoder_destroy_buffers (self->decoder, va_pic); - return FALSE; + if (!gst_va_decoder_add_slice_buffer (base->decoder, va_pic, &slice_param, + sizeof 
(slice_param), slice->nalu.data + slice->nalu.offset, + slice->nalu.size)) { + return GST_FLOW_ERROR; } - return TRUE; + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_va_h264_dec_start_picture (GstH264Decoder * decoder, GstH264Picture * picture, GstH264Slice * slice, GstH264Dpb * dpb) { + GstVaH264Dec *self = GST_VA_H264_DEC (decoder); GstH264PPS *pps; GstH264SPS *sps; - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); GstVaDecodePicture *va_pic; VAIQMatrixBufferH264 iq_matrix = { 0, }; VAPictureParameterBufferH264 pic_param; guint i, n; - - GST_TRACE_OBJECT (self, "-"); + GArray *ref_list = self->ref_list; va_pic = gst_h264_picture_get_user_data (picture); @@ -430,7 +395,8 @@ /* .ReferenceFrames */ .picture_width_in_mbs_minus1 = sps->pic_width_in_mbs_minus1, .picture_height_in_mbs_minus1 = - sps->pic_height_in_map_units_minus1 << !sps->frame_mbs_only_flag, + ((sps->pic_height_in_map_units_minus1 + 1) << + !sps->frame_mbs_only_flag) -1, .bit_depth_luma_minus8 = sps->bit_depth_luma_minus8, .bit_depth_chroma_minus8 = sps->bit_depth_chroma_minus8, .num_ref_frames = sps->num_ref_frames, @@ -469,35 +435,34 @@ }; /* *INDENT-ON* */ - _fill_vaapi_pic (&pic_param.CurrPic, picture); + _fill_vaapi_pic (&pic_param.CurrPic, picture, FALSE); /* reference frames */ { - GArray *ref_list = g_array_sized_new (FALSE, FALSE, - sizeof (GstH264Picture *), 16); - g_array_set_clear_func (ref_list, (GDestroyNotify) gst_h264_picture_clear); + guint ref_frame_idx = 0; + g_array_set_size (ref_list, 0); - gst_h264_dpb_get_pictures_short_term_ref (dpb, ref_list); - for (i = 0; i < 16 && i < ref_list->len; i++) { + gst_h264_dpb_get_pictures_short_term_ref (dpb, FALSE, FALSE, ref_list); + for (i = 0; ref_frame_idx < 16 && i < ref_list->len; i++) { GstH264Picture *pic = g_array_index (ref_list, GstH264Picture *, i); - _fill_vaapi_pic (&pic_param.ReferenceFrames[i], pic); + _fill_vaapi_pic 
(&pic_param.ReferenceFrames[ref_frame_idx++], pic, TRUE); } g_array_set_size (ref_list, 0); - gst_h264_dpb_get_pictures_long_term_ref (dpb, ref_list); - for (; i < 16 && i < ref_list->len; i++) { + gst_h264_dpb_get_pictures_long_term_ref (dpb, FALSE, ref_list); + for (i = 0; ref_frame_idx < 16 && i < ref_list->len; i++) { GstH264Picture *pic = g_array_index (ref_list, GstH264Picture *, i); - _fill_vaapi_pic (&pic_param.ReferenceFrames[i], pic); + _fill_vaapi_pic (&pic_param.ReferenceFrames[ref_frame_idx++], pic, TRUE); } - g_array_unref (ref_list); + g_array_set_size (ref_list, 0); - for (; i < 16; i++) - _init_vaapi_pic (&pic_param.ReferenceFrames[i]); + for (; ref_frame_idx < 16; ref_frame_idx++) + _init_vaapi_pic (&pic_param.ReferenceFrames[ref_frame_idx]); } - if (!gst_va_decoder_add_param_buffer (self->decoder, va_pic, + if (!gst_va_decoder_add_param_buffer (base->decoder, va_pic, VAPictureParameterBufferType, &pic_param, sizeof (pic_param))) - goto fail; + return GST_FLOW_ERROR; /* there are always 6 4x4 scaling lists */ for (i = 0; i < 6; i++) { @@ -514,51 +479,74 @@ [i], pps->scaling_lists_8x8[i]); } - if (!gst_va_decoder_add_param_buffer (self->decoder, va_pic, + if (!gst_va_decoder_add_param_buffer (base->decoder, va_pic, VAIQMatrixBufferType, &iq_matrix, sizeof (iq_matrix))) - goto fail; + return GST_FLOW_ERROR; - return TRUE; - -fail: - { - gst_va_decoder_destroy_buffers (self->decoder, va_pic); - return FALSE; - } + return GST_FLOW_OK; } -static gboolean +static GstFlowReturn gst_va_h264_dec_new_picture (GstH264Decoder * decoder, GstVideoCodecFrame * frame, GstH264Picture * picture) { GstVaH264Dec *self = GST_VA_H264_DEC (decoder); GstVaDecodePicture *pic; GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); - VASurfaceID surface; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + if (base->need_negotiation) { + if (!gst_video_decoder_negotiate (vdec)) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return 
GST_FLOW_NOT_NEGOTIATED; + } + } self->last_ret = gst_video_decoder_allocate_output_frame (vdec, frame); if (self->last_ret != GST_FLOW_OK) goto error; - surface = gst_va_buffer_get_surface (frame->output_buffer, NULL); + pic = gst_va_decode_picture_new (base->decoder, frame->output_buffer); - pic = gst_va_decode_picture_new (surface); gst_h264_picture_set_user_data (picture, pic, (GDestroyNotify) gst_va_decode_picture_free); - GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, pic->surface); + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, + gst_va_decode_picture_get_surface (pic)); - return TRUE; + return GST_FLOW_OK; error: { GST_WARNING_OBJECT (self, "Failed to allocate output buffer, return %s", gst_flow_get_name (self->last_ret)); - return FALSE; + return self->last_ret; } } +static GstFlowReturn +gst_va_h264_dec_new_field_picture (GstH264Decoder * decoder, + const GstH264Picture * first_field, GstH264Picture * second_field) +{ + GstVaDecodePicture *first_pic, *second_pic; + GstVaH264Dec *self = GST_VA_H264_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + first_pic = gst_h264_picture_get_user_data ((GstH264Picture *) first_field); + if (!first_pic) + return GST_FLOW_ERROR; + + second_pic = gst_va_decode_picture_new (base->decoder, first_pic->gstbuffer); + gst_h264_picture_set_user_data (second_field, second_pic, + (GDestroyNotify) gst_va_decode_picture_free); + + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", second_pic, + gst_va_decode_picture_get_surface (second_pic)); + + return GST_FLOW_OK; +} + static inline guint _get_num_views (const GstH264SPS * sps) { @@ -619,6 +607,7 @@ static VAProfile _get_profile (GstVaH264Dec * self, const GstH264SPS * sps, gint max_dpb_size) { + GstVaBaseDec *base = GST_VA_BASE_DEC (self); VAProfile profiles[4]; gint i = 0, j; @@ -631,11 +620,21 @@ switch (sps->profile_idc) { case GST_H264_PROFILE_BASELINE: - if (sps->constraint_set1_flag) { /* A.2.2 (main profile) */ + { + 
GstH264DecoderCompliance compliance = GST_H264_DECODER_COMPLIANCE_STRICT; + + g_object_get (G_OBJECT (self), "compliance", &compliance, NULL); + + /* A.2 compliant or not strict */ + if (sps->constraint_set0_flag || sps->constraint_set1_flag + || sps->constraint_set2_flag + || compliance != GST_H264_DECODER_COMPLIANCE_STRICT) { profiles[i++] = VAProfileH264ConstrainedBaseline; profiles[i++] = VAProfileH264Main; } + break; + } case GST_H264_PROFILE_EXTENDED: if (sps->constraint_set1_flag) { /* A.2.2 (main profile) */ profiles[i++] = VAProfileH264Main; @@ -653,7 +652,7 @@ } for (j = 0; j < i && j < G_N_ELEMENTS (profiles); j++) { - if (gst_va_decoder_has_profile (self->decoder, profiles[j])) + if (gst_va_decoder_has_profile (base->decoder, profiles[j])) return profiles[j]; } @@ -662,66 +661,19 @@ return VAProfileNone; } -static gboolean -_format_changed (GstVaH264Dec * self, VAProfile new_profile, guint new_rtformat, - gint new_width, gint new_height) -{ - VAProfile profile = VAProfileNone; - guint rt_format = VA_RT_FORMAT_YUV420; - gint width = 0, height = 0; - - g_object_get (self->decoder, "va-profile", &profile, "va-rt-format", - &rt_format, "coded-width", &width, "coded-height", &height, NULL); - - /* @TODO: Check if current buffers are large enough, and reuse - * them */ - return !(profile == new_profile && rt_format == new_rtformat - && width == new_width && height == new_height); -} - -static void -_set_latency (GstVaH264Dec * self, const GstH264SPS * sps) -{ - GstClockTime duration, min, max; - gint fps_d, fps_n; - guint32 num_reorder_frames; - - fps_d = self->output_state->info.fps_d; - fps_n = self->output_state->info.fps_n; - - /* if 0/1 then 25/1 */ - if (fps_n == 0) { - fps_n = 25; - fps_d = 1; - } - - num_reorder_frames = 1; - if (sps->vui_parameters_present_flag - && sps->vui_parameters.bitstream_restriction_flag) - num_reorder_frames = sps->vui_parameters.num_reorder_frames; - if (num_reorder_frames > self->dpb_size) - num_reorder_frames = 1; - - 
duration = gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n); - min = num_reorder_frames * duration; - max = self->dpb_size * duration; - - GST_LOG_OBJECT (self, - "latency min %" G_GUINT64_FORMAT " max %" G_GUINT64_FORMAT, min, max); - - gst_video_decoder_set_latency (GST_VIDEO_DECODER (self), min, max); -} - -static gboolean +static GstFlowReturn gst_va_h264_dec_new_sequence (GstH264Decoder * decoder, const GstH264SPS * sps, gint max_dpb_size) { + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); GstVaH264Dec *self = GST_VA_H264_DEC (decoder); VAProfile profile; gint display_width; gint display_height; + gint padding_left, padding_right, padding_top, padding_bottom; guint rt_format; gboolean negotiation_needed = FALSE; + gboolean interlaced; if (self->dpb_size < max_dpb_size) self->dpb_size = max_dpb_size; @@ -729,23 +681,29 @@ if (sps->frame_cropping_flag) { display_width = sps->crop_rect_width; display_height = sps->crop_rect_height; + padding_left = sps->crop_rect_x; + padding_right = sps->width - sps->crop_rect_x - display_width; + padding_top = sps->crop_rect_y; + padding_bottom = sps->height - sps->crop_rect_y - display_height; } else { display_width = sps->width; display_height = sps->height; + padding_left = padding_right = padding_top = padding_bottom = 0; } profile = _get_profile (self, sps, max_dpb_size); if (profile == VAProfileNone) - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; rt_format = _get_rtformat (self, sps->bit_depth_luma_minus8 + 8, sps->chroma_format_idc); if (rt_format == 0) - return FALSE; + return GST_FLOW_NOT_NEGOTIATED; - if (_format_changed (self, profile, rt_format, sps->width, sps->height)) { - self->profile = profile; - self->rt_format = rt_format; + if (!gst_va_decoder_config_is_equal (base->decoder, profile, + rt_format, sps->width, sps->height)) { + base->profile = profile; + base->rt_format = rt_format; self->coded_width = sps->width; self->coded_height = sps->height; @@ -755,57 +713,49 @@ self->coded_height); } - if 
(self->display_width != display_width - || self->display_height != display_height) { - self->display_width = display_width; - self->display_height = display_height; + if (base->width != display_width || base->height != display_height) { + base->width = display_width; + base->height = display_height; negotiation_needed = TRUE; - GST_INFO_OBJECT (self, "Resolution changed to %dx%d", self->display_width, - self->display_height); + GST_INFO_OBJECT (self, "Resolution changed to %dx%d", base->width, + base->height); } - self->need_cropping = self->display_width < self->coded_width - || self->display_height < self->coded_height; - - if (negotiation_needed) { - self->need_negotiation = TRUE; - if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { - GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); - return FALSE; - } + interlaced = !sps->frame_mbs_only_flag; + if (self->interlaced != interlaced) { + self->interlaced = interlaced; - _set_latency (self, sps); + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "Interlaced mode changed to %d", interlaced); } - return TRUE; -} - -static gboolean -gst_va_h264_dec_open (GstVideoDecoder * decoder) -{ - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); - GstVaH264DecClass *klass = GST_VA_H264_DEC_GET_CLASS (decoder); - - if (!gst_va_ensure_element_data (decoder, klass->render_device_path, - &self->display)) - return FALSE; - - if (!self->decoder) - self->decoder = gst_va_decoder_new (self->display, H264); + base->need_valign = base->width < self->coded_width + || base->height < self->coded_height; + if (base->need_valign) { + /* *INDENT-OFF* */ + if (base->valign.padding_left != padding_left || + base->valign.padding_right != padding_right || + base->valign.padding_top != padding_top || + base->valign.padding_bottom != padding_bottom) { + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "crop rect changed to (%d,%d)-->(%d,%d)", + padding_left, padding_top, padding_right, padding_bottom); + } + 
base->valign = (GstVideoAlignment) { + .padding_left = padding_left, + .padding_right = padding_right, + .padding_top = padding_top, + .padding_bottom = padding_bottom, + }; + /* *INDENT-ON* */ + } - return (self->decoder != NULL); -} + base->min_buffers = self->dpb_size + 4; /* dpb size + scratch surfaces */ -static gboolean -gst_va_h264_dec_close (GstVideoDecoder * decoder) -{ - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); - - gst_clear_object (&self->decoder); - gst_clear_object (&self->display); + base->need_negotiation = negotiation_needed; - return TRUE; + return GST_FLOW_OK; } static GstCaps * @@ -837,13 +787,13 @@ } static GstCaps * -gst_va_h264_dec_sink_getcaps (GstVideoDecoder * decoder, GstCaps * filter) +gst_va_h264_dec_getcaps (GstVideoDecoder * decoder, GstCaps * filter) { GstCaps *sinkcaps, *caps = NULL, *tmp; - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); - if (self->decoder) - caps = gst_va_decoder_get_sinkpad_caps (self->decoder); + if (base->decoder) + caps = gst_va_decoder_get_sinkpad_caps (base->decoder); if (caps) { sinkcaps = _complete_sink_caps (caps); @@ -856,8 +806,8 @@ } else { caps = sinkcaps; } - GST_LOG_OBJECT (self, "Returning caps %" GST_PTR_FORMAT, caps); - } else if (!caps) { + GST_LOG_OBJECT (base, "Returning caps %" GST_PTR_FORMAT, caps); + } else { caps = gst_video_decoder_proxy_getcaps (decoder, NULL, filter); } @@ -865,500 +815,61 @@ } static gboolean -gst_va_h264_dec_src_query (GstVideoDecoder * decoder, GstQuery * query) -{ - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); - gboolean ret = FALSE; - - switch (GST_QUERY_TYPE (query)) { - case GST_QUERY_CONTEXT:{ - return gst_va_handle_context_query (GST_ELEMENT_CAST (self), query, - self->display); - } - case GST_QUERY_CAPS:{ - GstCaps *caps = NULL, *tmp, *filter = NULL; - - gst_query_parse_caps (query, &filter); - if (self->decoder) - caps = gst_va_decoder_get_srcpad_caps (self->decoder); - if (caps) { - if (filter) { - 
tmp = - gst_caps_intersect_full (filter, caps, GST_CAPS_INTERSECT_FIRST); - gst_caps_unref (caps); - caps = tmp; - } - - GST_LOG_OBJECT (self, "Returning caps %" GST_PTR_FORMAT, caps); - gst_query_set_caps_result (query, caps); - gst_caps_unref (caps); - ret = TRUE; - break; - } - /* else jump to default */ - } - default: - ret = GST_VIDEO_DECODER_CLASS (parent_class)->src_query (decoder, query); - break; - } - - return ret; -} - -static gboolean -gst_va_h264_dec_sink_query (GstVideoDecoder * decoder, GstQuery * query) -{ - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); - - if (GST_QUERY_TYPE (query) == GST_QUERY_CONTEXT) { - return gst_va_handle_context_query (GST_ELEMENT_CAST (self), query, - self->display); - } - - return GST_VIDEO_DECODER_CLASS (parent_class)->sink_query (decoder, query); -} - -static gboolean -gst_va_h264_dec_stop (GstVideoDecoder * decoder) -{ - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); - - if (!gst_va_decoder_close (self->decoder)) - return FALSE; - - if (self->output_state) - gst_video_codec_state_unref (self->output_state); - self->output_state = NULL; - - if (self->other_pool) - gst_buffer_pool_set_active (self->other_pool, FALSE); - gst_clear_object (&self->other_pool); - - return GST_VIDEO_DECODER_CLASS (parent_class)->stop (decoder); -} - -static GstVideoFormat -_default_video_format_from_chroma (guint chroma_type) -{ - switch (chroma_type) { - case VA_RT_FORMAT_YUV420: - case VA_RT_FORMAT_YUV422: - case VA_RT_FORMAT_YUV444: - return GST_VIDEO_FORMAT_NV12; - case VA_RT_FORMAT_YUV420_10: - case VA_RT_FORMAT_YUV422_10: - case VA_RT_FORMAT_YUV444_10: - return GST_VIDEO_FORMAT_P010_10LE; - default: - return GST_VIDEO_FORMAT_UNKNOWN; - } -} - -static void -_get_preferred_format_and_caps_features (GstVaH264Dec * self, - GstVideoFormat * format, GstCapsFeatures ** capsfeatures) -{ - GstCaps *peer_caps, *preferred_caps = NULL; - GstCapsFeatures *features; - GstStructure *structure; - const GValue *v_format; - guint num_structures, i; - 
- peer_caps = gst_pad_get_allowed_caps (GST_VIDEO_DECODER_SRC_PAD (self)); - GST_DEBUG_OBJECT (self, "Allowed caps %" GST_PTR_FORMAT, peer_caps); - - /* prefer memory:VASurface over other caps features */ - num_structures = gst_caps_get_size (peer_caps); - for (i = 0; i < num_structures; i++) { - features = gst_caps_get_features (peer_caps, i); - structure = gst_caps_get_structure (peer_caps, i); - - if (gst_caps_features_is_any (features)) - continue; - - if (gst_caps_features_contains (features, "memory:VAMemory")) { - preferred_caps = gst_caps_new_full (gst_structure_copy (structure), NULL); - gst_caps_set_features_simple (preferred_caps, - gst_caps_features_copy (features)); - break; - } - } - - if (!preferred_caps) - preferred_caps = peer_caps; - else - gst_clear_caps (&peer_caps); - - if (gst_caps_is_empty (preferred_caps) - || gst_caps_is_any (preferred_caps)) { - /* if any or not linked yet then system memory and nv12 */ - if (capsfeatures) - *capsfeatures = NULL; - if (format) - *format = _default_video_format_from_chroma (self->rt_format); - goto bail; - } - - features = gst_caps_get_features (preferred_caps, 0); - if (features && capsfeatures) - *capsfeatures = gst_caps_features_copy (features); - - if (!format) - goto bail; - - structure = gst_caps_get_structure (preferred_caps, 0); - v_format = gst_structure_get_value (structure, "format"); - if (!v_format) - *format = _default_video_format_from_chroma (self->rt_format); - else if (G_VALUE_HOLDS_STRING (v_format)) - *format = gst_video_format_from_string (g_value_get_string (v_format)); - else if (GST_VALUE_HOLDS_LIST (v_format)) { - guint num_values = gst_value_list_get_size (v_format); - for (i = 0; i < num_values; i++) { - GstVideoFormat fmt; - const GValue *v_fmt = gst_value_list_get_value (v_format, i); - if (!v_fmt) - continue; - fmt = gst_video_format_from_string (g_value_get_string (v_fmt)); - if (gst_va_chroma_from_video_format (fmt) == self->rt_format) { - *format = fmt; - break; - } - } - if 
(i == num_values) - *format = _default_video_format_from_chroma (self->rt_format); - } - -bail: - gst_clear_caps (&preferred_caps); -} - -static gboolean gst_va_h264_dec_negotiate (GstVideoDecoder * decoder) { + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); GstVaH264Dec *self = GST_VA_H264_DEC (decoder); GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN; GstCapsFeatures *capsfeatures = NULL; GstH264Decoder *h264dec = GST_H264_DECODER (decoder); /* Ignore downstream renegotiation request. */ - if (!self->need_negotiation) + if (!base->need_negotiation) return TRUE; - self->need_negotiation = FALSE; + base->need_negotiation = FALSE; - if (gst_va_decoder_is_open (self->decoder) - && !gst_va_decoder_close (self->decoder)) + if (gst_va_decoder_is_open (base->decoder) + && !gst_va_decoder_close (base->decoder)) return FALSE; - if (!gst_va_decoder_open (self->decoder, self->profile, self->rt_format)) + if (!gst_va_decoder_open (base->decoder, base->profile, base->rt_format)) return FALSE; - if (!gst_va_decoder_set_format (self->decoder, self->coded_width, - self->coded_height, NULL)) + if (!gst_va_decoder_set_frame_size (base->decoder, self->coded_width, + self->coded_height)) return FALSE; - if (self->output_state) - gst_video_codec_state_unref (self->output_state); + if (base->output_state) + gst_video_codec_state_unref (base->output_state); - _get_preferred_format_and_caps_features (self, &format, &capsfeatures); + gst_va_base_dec_get_preferred_format_and_caps_features (base, &format, + &capsfeatures); - self->output_state = + base->output_state = gst_video_decoder_set_output_state (decoder, format, - self->display_width, self->display_height, h264dec->input_state); + base->width, base->height, h264dec->input_state); + if (self->interlaced) + base->output_state->info.interlace_mode = GST_VIDEO_INTERLACE_MODE_MIXED; - self->output_state->caps = gst_video_info_to_caps (&self->output_state->info); + base->output_state->caps = gst_video_info_to_caps 
(&base->output_state->info); if (capsfeatures) - gst_caps_set_features_simple (self->output_state->caps, capsfeatures); + gst_caps_set_features_simple (base->output_state->caps, capsfeatures); GST_INFO_OBJECT (self, "Negotiated caps %" GST_PTR_FORMAT, - self->output_state->caps); + base->output_state->caps); return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); } -static inline gboolean -_caps_is_dmabuf (GstVaH264Dec * self, GstCaps * caps) -{ - GstCapsFeatures *features; - - features = gst_caps_get_features (caps, 0); - return gst_caps_features_contains (features, GST_CAPS_FEATURE_MEMORY_DMABUF) - && (gst_va_decoder_get_mem_types (self->decoder) - & VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME); -} - -static inline gboolean -_caps_is_va_memory (GstCaps * caps) -{ - GstCapsFeatures *features; - - features = gst_caps_get_features (caps, 0); - return gst_caps_features_contains (features, "memory:VAMemory"); -} - -static inline void -_shall_copy_frames (GstVaH264Dec * self, GstVideoInfo * info) -{ - GstVideoInfo ref_info; - guint i; - - self->copy_frames = FALSE; - - if (self->has_videometa) - return; - - gst_video_info_set_format (&ref_info, GST_VIDEO_INFO_FORMAT (info), - self->display_width, self->display_height); - - for (i = 0; i < GST_VIDEO_INFO_N_PLANES (info); i++) { - if (info->stride[i] != ref_info.stride[i] || - info->offset[i] != ref_info.offset[i]) { - GST_WARNING_OBJECT (self, - "GstVideoMeta support required, copying frames."); - self->copy_frames = TRUE; - break; - } - } -} - -static gboolean -_try_allocator (GstVaH264Dec * self, GstAllocator * allocator, GstCaps * caps, - guint * size) -{ - GstVaAllocationParams params = { - .usage_hint = VA_SURFACE_ATTRIB_USAGE_HINT_DECODER, - }; - - if (!gst_video_info_from_caps (¶ms.info, caps)) - return FALSE; - if (self->need_cropping) { - GST_VIDEO_INFO_WIDTH (¶ms.info) = self->coded_width; - GST_VIDEO_INFO_HEIGHT (¶ms.info) = self->coded_height; - } - - if (GST_IS_VA_DMABUF_ALLOCATOR (allocator)) { - if 
(!gst_va_dmabuf_try (allocator, ¶ms)) - return FALSE; - } else if (GST_IS_VA_ALLOCATOR (allocator)) { - if (!gst_va_allocator_try (allocator, ¶ms)) - return FALSE; - if (!_caps_is_va_memory (caps)) - _shall_copy_frames (self, ¶ms.info); - } else { - return FALSE; - } - - if (size) - *size = GST_VIDEO_INFO_SIZE (¶ms.info); - - return TRUE; -} - -static GstAllocator * -_create_allocator (GstVaH264Dec * self, GstCaps * caps, guint * size) -{ - GstAllocator *allocator = NULL; - GstVaDisplay *display = NULL; - - g_object_get (self->decoder, "display", &display, NULL); - - if (_caps_is_dmabuf (self, caps)) - allocator = gst_va_dmabuf_allocator_new (display); - else { - GArray *surface_formats = - gst_va_decoder_get_surface_formats (self->decoder); - allocator = gst_va_allocator_new (display, surface_formats); - } - - gst_object_unref (display); - - if (!_try_allocator (self, allocator, caps, size)) - gst_clear_object (&allocator); - - return allocator; -} - -/* 1. get allocator in query - * 1.1 if allocator is not ours and downstream doesn't handle - * videometa, keep it for other_pool - * 2. get pool in query - * 2.1 if pool is not va, keep it as other_pool if downstream - * doesn't handle videometa or (it doesn't handle alignment and - * the stream needs cropping) - * 2.2 if there's no pool in query and downstream doesn't handle - * videometa, create other_pool as GstVideoPool with the non-va - * from query and query's params - * 3. create our allocator and pool if they aren't in query - * 4. add or update pool and allocator in query - * 5. 
set our custom pool configuration - */ -static gboolean -gst_va_h264_dec_decide_allocation (GstVideoDecoder * decoder, GstQuery * query) -{ - GstAllocator *allocator = NULL, *other_allocator = NULL; - GstAllocationParams other_params, params; - GstBufferPool *pool = NULL; - GstCaps *caps = NULL; - GstStructure *config; - GstVideoInfo info; - GstVaH264Dec *self = GST_VA_H264_DEC (decoder); - guint size, min, max; - gboolean update_pool = FALSE, update_allocator = FALSE, has_videoalignment; - - gst_query_parse_allocation (query, &caps, NULL); - - if (!(caps && gst_video_info_from_caps (&info, caps))) - goto wrong_caps; - - self->has_videometa = gst_query_find_allocation_meta (query, - GST_VIDEO_META_API_TYPE, NULL); - - if (gst_query_get_n_allocation_params (query) > 0) { - gst_query_parse_nth_allocation_param (query, 0, &allocator, &other_params); - if (allocator && !(GST_IS_VA_DMABUF_ALLOCATOR (allocator) - || GST_IS_VA_ALLOCATOR (allocator))) { - /* save the allocator for the other pool */ - other_allocator = allocator; - allocator = NULL; - } - update_allocator = TRUE; - } else { - gst_allocation_params_init (&other_params); - } - - gst_allocation_params_init (¶ms); - - if (gst_query_get_n_allocation_pools (query) > 0) { - gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max); - if (pool) { - if (!GST_IS_VA_POOL (pool)) { - has_videoalignment = gst_buffer_pool_has_option (pool, - GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); - if (!self->has_videometa || (!has_videoalignment - && self->need_cropping)) { - GST_DEBUG_OBJECT (self, - "keeping other pool for copy %" GST_PTR_FORMAT, pool); - gst_object_replace ((GstObject **) & self->other_pool, - (GstObject *) pool); - gst_object_unref (pool); /* decrease previous increase */ - } - gst_clear_object (&pool); - } - } - - min = MAX (16 + 4, min); /* max num pic references + scratch surfaces */ - size = MAX (size, GST_VIDEO_INFO_SIZE (&info)); - - update_pool = TRUE; - } else { - size = GST_VIDEO_INFO_SIZE 
(&info); - - if (!self->has_videometa && !_caps_is_va_memory (caps)) { - GST_DEBUG_OBJECT (self, "making new other pool for copy"); - self->other_pool = gst_video_buffer_pool_new (); - config = gst_buffer_pool_get_config (self->other_pool); - gst_buffer_pool_config_set_params (config, caps, size, 0, 0); - gst_buffer_pool_config_set_allocator (config, other_allocator, - &other_params); - if (!gst_buffer_pool_set_config (self->other_pool, config)) { - GST_ERROR_OBJECT (self, "couldn't configure other pool for copy"); - gst_clear_object (&self->other_pool); - } - } else { - gst_clear_object (&other_allocator); - } - - min = 16 + 4; /* max num pic references + scratch surfaces */ - max = 0; - } - - if (!allocator) { - if (!(allocator = _create_allocator (self, caps, &size))) - return FALSE; - } - - if (!pool) - pool = gst_va_pool_new (); - - { - GstStructure *config = gst_buffer_pool_get_config (pool); - - gst_buffer_pool_config_set_params (config, caps, size, min, max); - gst_buffer_pool_config_set_allocator (config, allocator, ¶ms); - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_META); - - if (self->need_cropping) { - GstVideoAlignment video_align = { - .padding_bottom = self->coded_height - self->display_height, - .padding_left = self->coded_width - self->display_width, - }; - gst_buffer_pool_config_add_option (config, - GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); - gst_buffer_pool_config_set_video_alignment (config, &video_align); - } - - gst_buffer_pool_config_set_va_allocation_params (config, - VA_SURFACE_ATTRIB_USAGE_HINT_DECODER); - - if (!gst_buffer_pool_set_config (pool, config)) - return FALSE; - } - - if (update_allocator) - gst_query_set_nth_allocation_param (query, 0, allocator, ¶ms); - else - gst_query_add_allocation_param (query, allocator, ¶ms); - - if (update_pool) - gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max); - else - gst_query_add_allocation_pool (query, pool, size, min, max); - - gst_object_unref 
(allocator); - gst_object_unref (pool); - - return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation (decoder, - query); - -wrong_caps: - { - GST_WARNING_OBJECT (self, "No valid caps"); - return FALSE; - } -} - static void -gst_va_h264_dec_set_context (GstElement * element, GstContext * context) +gst_va_h264_dec_dispose (GObject * object) { - GstVaDisplay *old_display, *new_display; - GstVaH264Dec *self = GST_VA_H264_DEC (element); - GstVaH264DecClass *klass = GST_VA_H264_DEC_GET_CLASS (self); - gboolean ret; + GstVaH264Dec *self = GST_VA_H264_DEC (object); - old_display = self->display ? gst_object_ref (self->display) : NULL; - ret = gst_va_handle_set_context (element, context, klass->render_device_path, - &self->display); - new_display = self->display ? gst_object_ref (self->display) : NULL; + gst_va_base_dec_close (GST_VIDEO_DECODER (object)); + g_clear_pointer (&self->ref_list, g_array_unref); - if (!ret - || (old_display && new_display && old_display != new_display - && self->decoder)) { - GST_ELEMENT_WARNING (element, RESOURCE, BUSY, - ("Can't replace VA display while operating"), (NULL)); - } - - gst_clear_object (&old_display); - gst_clear_object (&new_display); - - GST_ELEMENT_CLASS (parent_class)->set_context (element, context); -} - -static void -gst_va_h264_dec_dispose (GObject * object) -{ - gst_va_h264_dec_close (GST_VIDEO_DECODER (object)); G_OBJECT_CLASS (parent_class)->dispose (object); } @@ -1366,19 +877,13 @@ gst_va_h264_dec_class_init (gpointer g_class, gpointer class_data) { GstCaps *src_doc_caps, *sink_doc_caps; - GstPadTemplate *sink_pad_templ, *src_pad_templ; GObjectClass *gobject_class = G_OBJECT_CLASS (g_class); GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); GstH264DecoderClass *h264decoder_class = GST_H264_DECODER_CLASS (g_class); - GstVaH264DecClass *klass = GST_VA_H264_DEC_CLASS (g_class); GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (g_class); struct CData *cdata = class_data; gchar *long_name; - 
parent_class = g_type_class_peek_parent (g_class); - - klass->render_device_path = g_strdup (cdata->render_device_path); - if (cdata->description) { long_name = g_strdup_printf ("VA-API H.264 Decoder in %s", cdata->description); @@ -1391,33 +896,19 @@ "VA-API based H.264 video decoder", "Víctor Jáquez <vjaquez@igalia.com>"); - sink_pad_templ = gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, - cdata->sink_caps); - gst_element_class_add_pad_template (element_class, sink_pad_templ); sink_doc_caps = gst_caps_from_string (sink_caps_str); - gst_pad_template_set_documentation_caps (sink_pad_templ, sink_doc_caps); - gst_caps_unref (sink_doc_caps); - - src_pad_templ = gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, - cdata->src_caps); - gst_element_class_add_pad_template (element_class, src_pad_templ); src_doc_caps = gst_caps_from_string (src_caps_str); - gst_pad_template_set_documentation_caps (src_pad_templ, src_doc_caps); - gst_caps_unref (src_doc_caps); - gobject_class->dispose = gst_va_h264_dec_dispose; + parent_class = g_type_class_peek_parent (g_class); - element_class->set_context = GST_DEBUG_FUNCPTR (gst_va_h264_dec_set_context); + gst_va_base_dec_class_init (GST_VA_BASE_DEC_CLASS (g_class), H264, + cdata->render_device_path, cdata->sink_caps, cdata->src_caps, + src_doc_caps, sink_doc_caps); - decoder_class->open = GST_DEBUG_FUNCPTR (gst_va_h264_dec_open); - decoder_class->close = GST_DEBUG_FUNCPTR (gst_va_h264_dec_close); - decoder_class->stop = GST_DEBUG_FUNCPTR (gst_va_h264_dec_stop); - decoder_class->getcaps = GST_DEBUG_FUNCPTR (gst_va_h264_dec_sink_getcaps); - decoder_class->src_query = GST_DEBUG_FUNCPTR (gst_va_h264_dec_src_query); - decoder_class->sink_query = GST_DEBUG_FUNCPTR (gst_va_h264_dec_sink_query); + gobject_class->dispose = gst_va_h264_dec_dispose; + + decoder_class->getcaps = GST_DEBUG_FUNCPTR (gst_va_h264_dec_getcaps); decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_va_h264_dec_negotiate); - 
decoder_class->decide_allocation = - GST_DEBUG_FUNCPTR (gst_va_h264_dec_decide_allocation); h264decoder_class->new_sequence = GST_DEBUG_FUNCPTR (gst_va_h264_dec_new_sequence); @@ -1432,6 +923,8 @@ GST_DEBUG_FUNCPTR (gst_va_h264_dec_start_picture); h264decoder_class->end_picture = GST_DEBUG_FUNCPTR (gst_va_h264_dec_end_picture); + h264decoder_class->new_field_picture = + GST_DEBUG_FUNCPTR (gst_va_h264_dec_new_field_picture); g_free (long_name); g_free (cdata->description); @@ -1444,8 +937,16 @@ static void gst_va_h264_dec_init (GTypeInstance * instance, gpointer g_class) { + GstVaH264Dec *self = GST_VA_H264_DEC (instance); + + gst_va_base_dec_init (GST_VA_BASE_DEC (instance), GST_CAT_DEFAULT); gst_h264_decoder_set_process_ref_pic_lists (GST_H264_DECODER (instance), TRUE); + + self->ref_list = g_array_sized_new (FALSE, TRUE, + sizeof (GstH264Picture *), 16); + g_array_set_clear_func (self->ref_list, + (GDestroyNotify) gst_h264_picture_clear); } static gpointer
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvah265dec.c
Added
@@ -0,0 +1,1396 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * Copyright (C) 2020 Collabora + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vah265dec + * @title: vah265dec + * @short_description: A VA-API based H265 video decoder + * + * vah265dec decodes H265 bitstreams to VA surfaces using the + * installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver. + * + * The decoding surfaces can be mapped onto main memory as video + * frames. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=big_buck_bunny.mov ! parsebin ! vah265dec ! 
autovideosink + * ``` + * + * Since: 1.20 + * + */ + +/* ToDo: + * + * + interlaced streams + * + multiview and stereo profiles + * + SCC extension buffer + * + Add 10bit support + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvah265dec.h" + +#include "gstvabasedec.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_h265dec_debug); +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT gst_va_h265dec_debug +#else +#define GST_CAT_DEFAULT NULL +#endif + +#define GST_VA_H265_DEC(obj) ((GstVaH265Dec *) obj) +#define GST_VA_H265_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaH265DecClass)) +#define GST_VA_H265_DEC_CLASS(klass) ((GstVaH265DecClass *) klass) + +struct slice +{ + guint8 *data; + guint size; + + VASliceParameterBufferHEVCExtension param; +}; + +typedef struct _GstVaH265Dec GstVaH265Dec; +typedef struct _GstVaH265DecClass GstVaH265DecClass; + +struct _GstVaH265DecClass +{ + GstVaBaseDecClass parent_class; +}; + +struct _GstVaH265Dec +{ + GstVaBaseDec parent; + + GstFlowReturn last_ret; + + gint coded_width; + gint coded_height; + gint dpb_size; + + VAPictureParameterBufferHEVCExtension pic_param; + + gint32 WpOffsetHalfRangeC; + + struct slice prev_slice; +}; + +static GstElementClass *parent_class = NULL; + +/* *INDENT-OFF* */ +static const gchar *src_caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12, P010_10LE }") " ;" + GST_VIDEO_CAPS_MAKE ("{ NV12, P010_10LE }"); +/* *INDENT-ON* */ + +static const gchar *sink_caps_str = "video/x-h265"; + +static gboolean +_is_range_extension_profile (VAProfile profile) +{ + if (profile == VAProfileHEVCMain422_10 + || profile == VAProfileHEVCMain444 + || profile == VAProfileHEVCMain444_10 + || profile == VAProfileHEVCMain12 + || profile == VAProfileHEVCMain444_12 + || profile == VAProfileHEVCMain422_12) + return TRUE; + return FALSE; +} + +static gboolean +_is_screen_content_ext_profile (VAProfile profile) +{ + if (profile == 
VAProfileHEVCSccMain + || profile == VAProfileHEVCSccMain10 + || profile == VAProfileHEVCSccMain444) + return TRUE; + + return FALSE; +} + +static inline void +_set_last_slice_flag (GstVaH265Dec * self) +{ + self->prev_slice.param.base.LongSliceFlags.fields.LastSliceOfPic = 1; +} + +static void +_replace_previous_slice (GstVaH265Dec * self, guint8 * data, guint size) +{ + struct slice *slice = &self->prev_slice; + gboolean do_reset = (slice->size < size); + + if (!data || do_reset) { + g_clear_pointer (&slice->data, g_free); + slice->size = 0; + } + + if (!data) + return; + + if (do_reset) { + GST_LOG_OBJECT (self, "allocating slice data %u", size); + slice->data = g_malloc (size); + } + + memcpy (slice->data, data, size); + slice->size = size; +} + +static gboolean +_submit_previous_slice (GstVaBaseDec * base, GstVaDecodePicture * va_pic) +{ + GstVaH265Dec *self = GST_VA_H265_DEC (base); + struct slice *slice; + gboolean ret; + gsize param_size; + + slice = &self->prev_slice; + if (!slice->data && slice->size == 0) + return TRUE; + if (!slice->data || slice->size == 0) + return FALSE; + + param_size = _is_range_extension_profile (self->parent.profile) + || _is_screen_content_ext_profile (self->parent.profile) ? 
+ sizeof (slice->param) : sizeof (slice->param.base); + ret = gst_va_decoder_add_slice_buffer (base->decoder, va_pic, &slice->param, + param_size, slice->data, slice->size); + + return ret; +} + +static GstFlowReturn +gst_va_h265_dec_end_picture (GstH265Decoder * decoder, GstH265Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + GstVaDecodePicture *va_pic; + gboolean ret; + + GST_LOG_OBJECT (base, "end picture %p, (poc %d)", + picture, picture->pic_order_cnt); + + va_pic = gst_h265_picture_get_user_data (picture); + + _set_last_slice_flag (self); + ret = _submit_previous_slice (base, va_pic); + + /* TODO(victor): optimization: this could be done at decoder's + * stop() vmethod */ + _replace_previous_slice (self, NULL, 0); + + if (!ret) { + GST_ERROR_OBJECT (self, "Failed to submit the previous slice"); + return GST_FLOW_ERROR; + } + + ret = gst_va_decoder_decode (base->decoder, va_pic); + if (!ret) { + GST_ERROR_OBJECT (self, "Failed at end picture %p, (poc %d)", + picture, picture->pic_order_cnt); + return GST_FLOW_ERROR; + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_h265_dec_output_picture (GstH265Decoder * decoder, + GstVideoCodecFrame * frame, GstH265Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + GstVaDecodePicture *va_pic; + + va_pic = gst_h265_picture_get_user_data (picture); + g_assert (va_pic->gstbuffer); + + GST_LOG_OBJECT (self, + "Outputting picture %p (poc %d)", picture, picture->pic_order_cnt); + + if (self->last_ret != GST_FLOW_OK) { + gst_h265_picture_unref (picture); + _replace_previous_slice (self, NULL, 0); + gst_video_decoder_drop_frame (GST_VIDEO_DECODER (self), frame); + return self->last_ret; + } + + gst_buffer_replace (&frame->output_buffer, va_pic->gstbuffer); + + if (base->copy_frames) + gst_va_base_dec_copy_output_buffer (base, frame); + + gst_h265_picture_unref (picture); + 
+ return gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); +} + +static void +_init_vaapi_pic (VAPictureHEVC * va_picture) +{ + va_picture->picture_id = VA_INVALID_ID; + va_picture->flags = VA_PICTURE_HEVC_INVALID; + va_picture->pic_order_cnt = 0; +} + +static gint +_find_frame_rps_type (GstH265Decoder * decoder, GstH265Picture * ref_pic) +{ + gint i; + + for (i = 0; i < G_N_ELEMENTS (decoder->RefPicSetStCurrBefore); i++) { + if (ref_pic == decoder->RefPicSetStCurrBefore[i]) + return VA_PICTURE_HEVC_RPS_ST_CURR_BEFORE; + } + + for (i = 0; i < G_N_ELEMENTS (decoder->RefPicSetStCurrAfter); i++) { + if (ref_pic == decoder->RefPicSetStCurrAfter[i]) + return VA_PICTURE_HEVC_RPS_ST_CURR_AFTER; + } + + for (i = 0; i < G_N_ELEMENTS (decoder->RefPicSetLtCurr); i++) { + if (ref_pic == decoder->RefPicSetLtCurr[i]) + return VA_PICTURE_HEVC_RPS_LT_CURR; + } + + return 0; +} + + +static void +_fill_vaapi_pic (GstH265Decoder * decoder, VAPictureHEVC * va_picture, + GstH265Picture * picture) +{ + GstVaDecodePicture *va_pic; + + va_pic = gst_h265_picture_get_user_data (picture); + + if (!va_pic) { + _init_vaapi_pic (va_picture); + return; + } + + va_picture->picture_id = gst_va_decode_picture_get_surface (va_pic); + va_picture->pic_order_cnt = picture->pic_order_cnt; + va_picture->flags = 0; + + if (picture->ref && picture->long_term) + va_picture->flags |= VA_PICTURE_HEVC_LONG_TERM_REFERENCE; + + va_picture->flags |= _find_frame_rps_type (decoder, picture); +} + +static guint8 +_get_reference_index (GstH265Decoder * decoder, GstH265Picture * picture) +{ + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + guint8 i; + + for (i = 0; i < 15; i++) { + VAPictureHEVC *ref_va_pic = &self->pic_param.base.ReferenceFrames[i]; + + if (ref_va_pic->picture_id == VA_INVALID_ID) + break; + + if (ref_va_pic->pic_order_cnt == picture->pic_order_cnt) + return i; + } + + return 0xFF; +} + +/* fill the VA API reference picture lists from the GstCodec reference + * picture list */ 
+static void +_fill_ref_pic_list (GstH265Decoder * decoder, GstH265Picture * cur_pic, + guint8 va_reflist[15], GArray * reflist) +{ + guint i; + + for (i = 0; i < reflist->len && i < 15; i++) { + GstH265Picture *picture = g_array_index (reflist, GstH265Picture *, i); + va_reflist[i] = _get_reference_index (decoder, picture); + } + + for (; i < 15; i++) + va_reflist[i] = 0xFF; +} + +static void +_fill_pred_weight_table (GstVaH265Dec * self, GstH265SliceHdr * header, + VASliceParameterBufferHEVCExtension * slice_param) +{ + gint chroma_weight, chroma_log2_weight_denom; + gint i, j; + GstH265PPS *pps = header->pps; + gboolean is_rext = _is_range_extension_profile (self->parent.profile); + + if (GST_H265_IS_I_SLICE (header) || + (!pps->weighted_pred_flag && GST_H265_IS_P_SLICE (header)) || + (!pps->weighted_bipred_flag && GST_H265_IS_B_SLICE (header))) + return; + + slice_param->base.luma_log2_weight_denom = + header->pred_weight_table.luma_log2_weight_denom; + + if (pps->sps->chroma_array_type != 0) + slice_param->base.delta_chroma_log2_weight_denom = + header->pred_weight_table.delta_chroma_log2_weight_denom; + + for (i = 0; i <= header->num_ref_idx_l0_active_minus1; i++) { + if (!header->pred_weight_table.luma_weight_l0_flag[i]) + continue; + + slice_param->base.delta_luma_weight_l0[i] = + header->pred_weight_table.delta_luma_weight_l0[i]; + slice_param->base.luma_offset_l0[i] = + header->pred_weight_table.luma_offset_l0[i]; + + if (is_rext) { + slice_param->rext.luma_offset_l0[i] = + header->pred_weight_table.luma_offset_l0[i]; + } + } + + chroma_log2_weight_denom = slice_param->base.luma_log2_weight_denom + + slice_param->base.delta_chroma_log2_weight_denom; + + for (i = 0; i <= header->num_ref_idx_l0_active_minus1; i++) { + if (!header->pred_weight_table.chroma_weight_l0_flag[i]) + continue; + + for (j = 0; j < 2; j++) { + gint16 delta_chroma_offset_l0 = + header->pred_weight_table.delta_chroma_offset_l0[i][j]; + gint chroma_offset; + + 
slice_param->base.delta_chroma_weight_l0[i][j] = + header->pred_weight_table.delta_chroma_weight_l0[i][j]; + + /* Find ChromaWeightL0 */ + chroma_weight = (1 << chroma_log2_weight_denom) + + header->pred_weight_table.delta_chroma_weight_l0[i][j]; + chroma_offset = self->WpOffsetHalfRangeC + delta_chroma_offset_l0 + - ((self->WpOffsetHalfRangeC * chroma_weight) + >> chroma_log2_weight_denom); + + /* 7-56 */ + slice_param->base.ChromaOffsetL0[i][j] = CLAMP (chroma_offset, + -self->WpOffsetHalfRangeC, self->WpOffsetHalfRangeC - 1); + + if (is_rext) { + slice_param->rext.ChromaOffsetL0[i][j] = + slice_param->base.ChromaOffsetL0[i][j]; + } + } + } + + /* Skip l1 if this is not a B-Frame. */ + if (!GST_H265_IS_B_SLICE (header)) + return; + + for (i = 0; i <= header->num_ref_idx_l1_active_minus1; i++) { + if (!header->pred_weight_table.luma_weight_l1_flag[i]) + continue; + + slice_param->base.delta_luma_weight_l1[i] = + header->pred_weight_table.delta_luma_weight_l1[i]; + slice_param->base.luma_offset_l1[i] = + header->pred_weight_table.luma_offset_l1[i]; + + if (is_rext) { + slice_param->rext.luma_offset_l1[i] = + header->pred_weight_table.luma_offset_l1[i]; + } + } + + for (i = 0; i <= header->num_ref_idx_l1_active_minus1; i++) { + if (!header->pred_weight_table.chroma_weight_l1_flag[i]) + continue; + + for (j = 0; j < 2; j++) { + gint16 delta_chroma_offset_l1 = + header->pred_weight_table.delta_chroma_offset_l1[i][j]; + gint chroma_offset; + + slice_param->base.delta_chroma_weight_l1[i][j] = + header->pred_weight_table.delta_chroma_weight_l1[i][j]; + + /* Find ChromaWeightL1 */ + chroma_weight = (1 << chroma_log2_weight_denom) + + header->pred_weight_table.delta_chroma_weight_l1[i][j]; + + chroma_offset = self->WpOffsetHalfRangeC + delta_chroma_offset_l1 + - ((self->WpOffsetHalfRangeC * chroma_weight) + >> chroma_log2_weight_denom); + + /* 7-56 */ + slice_param->base.ChromaOffsetL1[i][j] = CLAMP (chroma_offset, + -self->WpOffsetHalfRangeC, self->WpOffsetHalfRangeC - 
1); + + if (is_rext) { + slice_param->rext.ChromaOffsetL1[i][j] = + slice_param->base.ChromaOffsetL1[i][j]; + } + } + } +} + +static inline guint +_get_slice_data_byte_offset (GstH265SliceHdr * slice_hdr, + guint nal_header_bytes) +{ + guint epb_count; + + epb_count = slice_hdr->n_emulation_prevention_bytes; + return nal_header_bytes + (slice_hdr->header_size + 7) / 8 - epb_count; +} + +static GstFlowReturn +gst_va_h265_dec_decode_slice (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, GArray * ref_pic_list0, + GArray * ref_pic_list1) +{ + GstH265SliceHdr *header = &slice->header; + GstH265NalUnit *nalu = &slice->nalu; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + GstVaDecodePicture *va_pic; + VASliceParameterBufferHEVCExtension *slice_param; + + va_pic = gst_h265_picture_get_user_data (picture); + if (!_submit_previous_slice (base, va_pic)) { + _replace_previous_slice (self, NULL, 0); + GST_ERROR_OBJECT (base, "Failed to submit previous slice buffers"); + return GST_FLOW_ERROR; + } + + slice_param = &self->prev_slice.param; + + /* *INDENT-OFF* */ + slice_param->base = (VASliceParameterBufferHEVC) { + .slice_data_size = nalu->size, + .slice_data_offset = 0, + .slice_data_flag = VA_SLICE_DATA_FLAG_ALL, + .slice_data_byte_offset = _get_slice_data_byte_offset (header, nalu->header_bytes), + .slice_segment_address = header->segment_address, + .collocated_ref_idx = header->temporal_mvp_enabled_flag ? 
header->collocated_ref_idx : 0xFF, + .num_ref_idx_l0_active_minus1 = header->num_ref_idx_l0_active_minus1, + .num_ref_idx_l1_active_minus1 = header->num_ref_idx_l1_active_minus1, + .slice_qp_delta = header->qp_delta, + .slice_cb_qp_offset = header->cb_qp_offset, + .slice_cr_qp_offset = header->cr_qp_offset, + .slice_beta_offset_div2 = header->beta_offset_div2, + .slice_tc_offset_div2 = header->tc_offset_div2, + .five_minus_max_num_merge_cand = header->five_minus_max_num_merge_cand, + .num_entry_point_offsets = header->num_entry_point_offsets, + .entry_offset_to_subset_array = 0, /* does not exist in spec */ + .slice_data_num_emu_prevn_bytes = header->n_emulation_prevention_bytes, + .LongSliceFlags.fields = { + .LastSliceOfPic = 0, /* the last one will be set on end_picture() */ + .dependent_slice_segment_flag = header->dependent_slice_segment_flag, + .slice_type = header->type, + .color_plane_id = header->colour_plane_id, + .slice_sao_luma_flag = header->sao_luma_flag, + .slice_sao_chroma_flag = header->sao_chroma_flag, + .mvd_l1_zero_flag = header->mvd_l1_zero_flag, + .cabac_init_flag = header->cabac_init_flag, + .slice_temporal_mvp_enabled_flag = header->temporal_mvp_enabled_flag, + .slice_deblocking_filter_disabled_flag = + header->deblocking_filter_disabled_flag, + .collocated_from_l0_flag = header->collocated_from_l0_flag, + .slice_loop_filter_across_slices_enabled_flag = + header->loop_filter_across_slices_enabled_flag, + }, + }; + /* *INDENT-ON* */ + + if (_is_range_extension_profile (base->profile) + || _is_screen_content_ext_profile (base->profile)) { + /* *INDENT-OFF* */ + slice_param->rext = (VASliceParameterBufferHEVCRext) { + .slice_ext_flags.bits = { + .cu_chroma_qp_offset_enabled_flag = header->cu_chroma_qp_offset_enabled_flag, + .use_integer_mv_flag = header->use_integer_mv_flag, + }, + .slice_act_y_qp_offset = header->slice_act_y_qp_offset, + .slice_act_cb_qp_offset = header->slice_act_cb_qp_offset, + .slice_act_cr_qp_offset = 
header->slice_act_cr_qp_offset, + }; + /* *INDENT-ON* */ + } + + _fill_ref_pic_list (decoder, picture, slice_param->base.RefPicList[0], + ref_pic_list0); + _fill_ref_pic_list (decoder, picture, slice_param->base.RefPicList[1], + ref_pic_list1); + + _fill_pred_weight_table (GST_VA_H265_DEC (decoder), header, slice_param); + + _replace_previous_slice (self, slice->nalu.data + slice->nalu.offset, + slice->nalu.size); + + return GST_FLOW_OK; +} + +static void +_fill_picture_range_ext_parameter (GstVaH265Dec * decoder, + GstH265SPS * sps, GstH265PPS * pps) +{ + VAPictureParameterBufferHEVCRext *pic_param = &decoder->pic_param.rext; + + GstH265SPSExtensionParams *sps_ext = &sps->sps_extnsion_params; + GstH265PPSExtensionParams *pps_ext = &pps->pps_extension_params; + + /* *INDENT-OFF* */ + *pic_param = (VAPictureParameterBufferHEVCRext) { + .range_extension_pic_fields.bits = { + .transform_skip_rotation_enabled_flag = sps_ext->transform_skip_rotation_enabled_flag, + .transform_skip_context_enabled_flag = sps_ext->transform_skip_context_enabled_flag, + .implicit_rdpcm_enabled_flag = sps_ext->implicit_rdpcm_enabled_flag, + .explicit_rdpcm_enabled_flag = sps_ext->explicit_rdpcm_enabled_flag, + .extended_precision_processing_flag = sps_ext->extended_precision_processing_flag, + .intra_smoothing_disabled_flag = sps_ext->intra_smoothing_disabled_flag, + .high_precision_offsets_enabled_flag = sps_ext->high_precision_offsets_enabled_flag, + .persistent_rice_adaptation_enabled_flag = sps_ext->persistent_rice_adaptation_enabled_flag, + .cabac_bypass_alignment_enabled_flag = sps_ext->cabac_bypass_alignment_enabled_flag, + .cross_component_prediction_enabled_flag = pps_ext->cross_component_prediction_enabled_flag, + .chroma_qp_offset_list_enabled_flag = pps_ext->chroma_qp_offset_list_enabled_flag, + }, + .diff_cu_chroma_qp_offset_depth = pps_ext->diff_cu_chroma_qp_offset_depth, + .chroma_qp_offset_list_len_minus1 = pps_ext->chroma_qp_offset_list_len_minus1, + 
.log2_sao_offset_scale_luma = pps_ext->log2_sao_offset_scale_luma, + .log2_sao_offset_scale_chroma = pps_ext->log2_sao_offset_scale_chroma, + .log2_max_transform_skip_block_size_minus2 = pps_ext->log2_max_transform_skip_block_size_minus2, + }; + /* *INDENT-ON* */ + + memcpy (pic_param->cb_qp_offset_list, pps_ext->cb_qp_offset_list, + sizeof (pic_param->cb_qp_offset_list)); + memcpy (pic_param->cr_qp_offset_list, pps_ext->cr_qp_offset_list, + sizeof (pic_param->cr_qp_offset_list)); +} + +static void +_fill_screen_content_ext_parameter (GstVaH265Dec * decoder, + GstH265SPS * sps, GstH265PPS * pps) +{ + VAPictureParameterBufferHEVCScc *pic_param = &decoder->pic_param.scc; + const GstH265PPSSccExtensionParams *pps_scc = &pps->pps_scc_extension_params; + const GstH265SPSSccExtensionParams *sps_scc = &sps->sps_scc_extension_params; + guint32 num_comps; + guint i, n; + + /* *INDENT-OFF* */ + *pic_param = (VAPictureParameterBufferHEVCScc) { + .screen_content_pic_fields.bits = { + .pps_curr_pic_ref_enabled_flag = pps_scc->pps_curr_pic_ref_enabled_flag, + .palette_mode_enabled_flag = sps_scc->palette_mode_enabled_flag, + .motion_vector_resolution_control_idc = sps_scc->motion_vector_resolution_control_idc, + .intra_boundary_filtering_disabled_flag = sps_scc->intra_boundary_filtering_disabled_flag, + .residual_adaptive_colour_transform_enabled_flag = pps_scc->residual_adaptive_colour_transform_enabled_flag, + .pps_slice_act_qp_offsets_present_flag = pps_scc->pps_slice_act_qp_offsets_present_flag, + }, + .palette_max_size = sps_scc->palette_max_size, + .delta_palette_max_predictor_size = sps_scc->delta_palette_max_predictor_size, + .pps_act_y_qp_offset_plus5 = pps_scc->pps_act_y_qp_offset_plus5, + .pps_act_cb_qp_offset_plus5 = pps_scc->pps_act_cb_qp_offset_plus5, + .pps_act_cr_qp_offset_plus3 = pps_scc->pps_act_cr_qp_offset_plus3, + }; + /* *INDENT-ON* */ + + /* firstly use the pps, then sps */ + num_comps = sps->chroma_format_idc ? 
3 : 1; + + if (pps_scc->pps_palette_predictor_initializers_present_flag) { + pic_param->predictor_palette_size = + pps_scc->pps_num_palette_predictor_initializer; + for (n = 0; n < num_comps; n++) + for (i = 0; i < pps_scc->pps_num_palette_predictor_initializer; i++) + pic_param->predictor_palette_entries[n][i] = + (uint16_t) pps_scc->pps_palette_predictor_initializer[n][i]; + } else if (sps_scc->sps_palette_predictor_initializers_present_flag) { + pic_param->predictor_palette_size = + sps_scc->sps_num_palette_predictor_initializer_minus1 + 1; + for (n = 0; n < num_comps; n++) + for (i = 0; + i < sps_scc->sps_num_palette_predictor_initializer_minus1 + 1; i++) + pic_param->predictor_palette_entries[n][i] = + (uint16_t) sps_scc->sps_palette_predictor_initializer[n][i]; + } +} + +static GstFlowReturn +gst_va_h265_dec_start_picture (GstH265Decoder * decoder, + GstH265Picture * picture, GstH265Slice * slice, GstH265Dpb * dpb) +{ + GstH265PPS *pps; + GstH265SPS *sps; + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + GstVaBaseDec *base = &self->parent; + GstVaDecodePicture *va_pic; + GstH265ScalingList *scaling_list = NULL; + VAIQMatrixBufferHEVC iq_matrix = { 0, }; + VAPictureParameterBufferHEVCExtension *pic_param = &self->pic_param; + gsize pic_param_size; + guint i; + + va_pic = gst_h265_picture_get_user_data (picture); + + pps = slice->header.pps; + sps = pps->sps; + + /* *INDENT-OFF* */ + pic_param->base = (VAPictureParameterBufferHEVC) { + .pic_width_in_luma_samples = sps->pic_width_in_luma_samples, + .pic_height_in_luma_samples = sps->pic_height_in_luma_samples, + .sps_max_dec_pic_buffering_minus1 = sps->max_dec_pic_buffering_minus1[sps->max_sub_layers_minus1], + .bit_depth_luma_minus8 = sps->bit_depth_luma_minus8, + .bit_depth_chroma_minus8 = sps->bit_depth_chroma_minus8, + .pcm_sample_bit_depth_luma_minus1 = sps->pcm_sample_bit_depth_luma_minus1, + .pcm_sample_bit_depth_chroma_minus1 = sps->pcm_sample_bit_depth_chroma_minus1, + 
.log2_min_luma_coding_block_size_minus3 = sps->log2_min_luma_coding_block_size_minus3, + .log2_diff_max_min_luma_coding_block_size = sps->log2_diff_max_min_luma_coding_block_size, + .log2_min_transform_block_size_minus2 = sps->log2_min_transform_block_size_minus2, + .log2_diff_max_min_transform_block_size = sps->log2_diff_max_min_transform_block_size, + .log2_min_pcm_luma_coding_block_size_minus3 = sps->log2_min_pcm_luma_coding_block_size_minus3, + .log2_diff_max_min_pcm_luma_coding_block_size = sps->log2_diff_max_min_pcm_luma_coding_block_size, + .max_transform_hierarchy_depth_intra = sps->max_transform_hierarchy_depth_intra, + .max_transform_hierarchy_depth_inter = sps->max_transform_hierarchy_depth_inter, + .init_qp_minus26 = pps->init_qp_minus26, + .diff_cu_qp_delta_depth = pps->diff_cu_qp_delta_depth, + .pps_cb_qp_offset = pps->cb_qp_offset, + .pps_cr_qp_offset = pps->cr_qp_offset, + .log2_parallel_merge_level_minus2 = pps->log2_parallel_merge_level_minus2, + .num_tile_columns_minus1 = pps->num_tile_columns_minus1, + .num_tile_rows_minus1 = pps->num_tile_rows_minus1, + .log2_max_pic_order_cnt_lsb_minus4 = sps->log2_max_pic_order_cnt_lsb_minus4, + .num_short_term_ref_pic_sets = sps->num_short_term_ref_pic_sets, + .num_long_term_ref_pic_sps = sps->num_long_term_ref_pics_sps, + .num_ref_idx_l0_default_active_minus1 = pps->num_ref_idx_l0_default_active_minus1, + .num_ref_idx_l1_default_active_minus1 = pps->num_ref_idx_l1_default_active_minus1, + .pps_beta_offset_div2 = pps->beta_offset_div2, + .pps_tc_offset_div2 = pps->tc_offset_div2, + .num_extra_slice_header_bits = pps->num_extra_slice_header_bits, + .st_rps_bits = slice->header.short_term_ref_pic_set_size, /* FIXME missing emulation bits removal */ + .pic_fields.bits = { + .chroma_format_idc = sps->chroma_format_idc, + .separate_colour_plane_flag = sps->separate_colour_plane_flag, + .pcm_enabled_flag = sps->pcm_enabled_flag, + .scaling_list_enabled_flag = sps->scaling_list_enabled_flag, + 
.transform_skip_enabled_flag = pps->transform_skip_enabled_flag, + .amp_enabled_flag = sps->amp_enabled_flag, + .strong_intra_smoothing_enabled_flag = sps->strong_intra_smoothing_enabled_flag, + .sign_data_hiding_enabled_flag = pps->sign_data_hiding_enabled_flag, + .constrained_intra_pred_flag = pps->constrained_intra_pred_flag, + .cu_qp_delta_enabled_flag = pps->cu_qp_delta_enabled_flag, + .weighted_pred_flag = pps->weighted_pred_flag, + .weighted_bipred_flag = pps->weighted_bipred_flag, + .transquant_bypass_enabled_flag = pps->transquant_bypass_enabled_flag, + .tiles_enabled_flag = pps->tiles_enabled_flag, + .entropy_coding_sync_enabled_flag = pps->entropy_coding_sync_enabled_flag, + .pps_loop_filter_across_slices_enabled_flag = pps->loop_filter_across_slices_enabled_flag, + .loop_filter_across_tiles_enabled_flag = pps->loop_filter_across_tiles_enabled_flag, + .pcm_loop_filter_disabled_flag = sps->pcm_loop_filter_disabled_flag, + /* Not set by FFMPEG either */ + .NoPicReorderingFlag = 0, + .NoBiPredFlag = 0, + }, + .slice_parsing_fields.bits = { + .lists_modification_present_flag = pps->lists_modification_present_flag, + .long_term_ref_pics_present_flag = sps->long_term_ref_pics_present_flag, + .sps_temporal_mvp_enabled_flag = sps->temporal_mvp_enabled_flag, + .cabac_init_present_flag = pps->cabac_init_present_flag, + .output_flag_present_flag = pps->output_flag_present_flag, + .dependent_slice_segments_enabled_flag = pps->dependent_slice_segments_enabled_flag, + .pps_slice_chroma_qp_offsets_present_flag = pps->slice_chroma_qp_offsets_present_flag, + .sample_adaptive_offset_enabled_flag = sps->sample_adaptive_offset_enabled_flag, + .deblocking_filter_override_enabled_flag = pps->deblocking_filter_override_enabled_flag, + .pps_disable_deblocking_filter_flag = pps->deblocking_filter_disabled_flag, + .slice_segment_header_extension_present_flag = pps->slice_segment_header_extension_present_flag, + .RapPicFlag = picture->RapPicFlag, + .IdrPicFlag = 
GST_H265_IS_NAL_TYPE_IDR (slice->nalu.type), + .IntraPicFlag = GST_H265_IS_NAL_TYPE_IRAP (slice->nalu.type), + }, + }; + /* *INDENT-ON* */ + + if (_is_range_extension_profile (self->parent.profile) + || _is_screen_content_ext_profile (self->parent.profile)) { + _fill_picture_range_ext_parameter (self, sps, pps); + if (_is_screen_content_ext_profile (self->parent.profile)) + _fill_screen_content_ext_parameter (self, sps, pps); + } + + for (i = 0; i <= pps->num_tile_columns_minus1; i++) + pic_param->base.column_width_minus1[i] = pps->column_width_minus1[i]; + + for (i = 0; i <= pps->num_tile_rows_minus1; i++) + pic_param->base.row_height_minus1[i] = pps->row_height_minus1[i]; + + _fill_vaapi_pic (decoder, &pic_param->base.CurrPic, picture); + + /* reference frames */ + { + GArray *ref_list = gst_h265_dpb_get_pictures_all (dpb); + guint j; + + i = 0; + for (j = 0; j < 15 && j < ref_list->len; j++) { + GstH265Picture *pic = g_array_index (ref_list, GstH265Picture *, j); + + if (pic->ref) { + _fill_vaapi_pic (decoder, &pic_param->base.ReferenceFrames[i], pic); + i++; + } + } + g_array_unref (ref_list); + + /* 7.4.3.3.3, the current decoded picture is marked as "used for + long-term reference". Current picture is not in the DPB now. */ + if (pps->pps_scc_extension_params.pps_curr_pic_ref_enabled_flag && i < 15) { + pic_param->base.ReferenceFrames[i].picture_id = + gst_va_decode_picture_get_surface (gst_h265_picture_get_user_data + (picture)); + pic_param->base.ReferenceFrames[i].pic_order_cnt = picture->pic_order_cnt; + pic_param->base.ReferenceFrames[i].flags |= + VA_PICTURE_HEVC_LONG_TERM_REFERENCE; + pic_param->base.ReferenceFrames[i].flags |= + _find_frame_rps_type (decoder, picture); + i++; + } + + for (; i < 15; i++) + _init_vaapi_pic (&pic_param->base.ReferenceFrames[i]); + } + + pic_param_size = _is_range_extension_profile (self->parent.profile) + || _is_screen_content_ext_profile (self->parent.profile) ? 
+ sizeof (*pic_param) : sizeof (pic_param->base); + if (!gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAPictureParameterBufferType, pic_param, pic_param_size)) + return GST_FLOW_ERROR; + + if (pps->scaling_list_data_present_flag || + (sps->scaling_list_enabled_flag + && !sps->scaling_list_data_present_flag)) { + scaling_list = &pps->scaling_list; + GST_DEBUG_OBJECT (decoder, "Passing scaling list from PPS"); + } else if (sps->scaling_list_enabled_flag && + sps->scaling_list_data_present_flag) { + scaling_list = &sps->scaling_list; + GST_DEBUG_OBJECT (decoder, "Passing scaling list from SPS"); + } + + if (scaling_list) { + for (i = 0; i < G_N_ELEMENTS (iq_matrix.ScalingList4x4); i++) + gst_h265_quant_matrix_4x4_get_raster_from_uprightdiagonal + (iq_matrix.ScalingList4x4[i], scaling_list->scaling_lists_4x4[i]); + + for (i = 0; i < G_N_ELEMENTS (iq_matrix.ScalingList8x8); i++) + gst_h265_quant_matrix_8x8_get_raster_from_uprightdiagonal + (iq_matrix.ScalingList8x8[i], scaling_list->scaling_lists_8x8[i]); + + for (i = 0; i < G_N_ELEMENTS (iq_matrix.ScalingList16x16); i++) + gst_h265_quant_matrix_16x16_get_raster_from_uprightdiagonal + (iq_matrix.ScalingList16x16[i], scaling_list->scaling_lists_16x16[i]); + + for (i = 0; i < G_N_ELEMENTS (iq_matrix.ScalingList32x32); i++) + gst_h265_quant_matrix_32x32_get_raster_from_uprightdiagonal + (iq_matrix.ScalingList32x32[i], scaling_list->scaling_lists_32x32[i]); + + for (i = 0; i < 6; i++) + iq_matrix.ScalingListDC16x16[i] = + scaling_list->scaling_list_dc_coef_minus8_16x16[i] + 8; + + for (i = 0; i < 2; i++) + iq_matrix.ScalingListDC32x32[i] = + scaling_list->scaling_list_dc_coef_minus8_32x32[i] + 8; + + if (!gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAIQMatrixBufferType, &iq_matrix, sizeof (iq_matrix))) { + return GST_FLOW_ERROR; + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_h265_dec_new_picture (GstH265Decoder * decoder, + GstVideoCodecFrame * frame, GstH265Picture * picture) 
+{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + GstVaDecodePicture *pic; + GstBuffer *output_buffer; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + + if (base->need_negotiation) { + if (!gst_video_decoder_negotiate (vdec)) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + output_buffer = gst_video_decoder_allocate_output_buffer (vdec); + if (!output_buffer) { + self->last_ret = GST_FLOW_ERROR; + goto error; + } + self->last_ret = GST_FLOW_OK; + + pic = gst_va_decode_picture_new (base->decoder, output_buffer); + gst_buffer_unref (output_buffer); + + gst_h265_picture_set_user_data (picture, pic, + (GDestroyNotify) gst_va_decode_picture_free); + + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, + gst_va_decode_picture_get_surface (pic)); + + return GST_FLOW_OK; + +error: + { + GST_WARNING_OBJECT (self, + "Failed to allocated output buffer, return %s", + gst_flow_get_name (self->last_ret)); + return self->last_ret; + } +} + +static guint +_get_rtformat (GstVaH265Dec * self, guint8 bit_depth_luma, + guint8 bit_depth_chroma, guint8 chroma_format_idc) +{ + guint8 bit_num = MAX (bit_depth_luma, bit_depth_chroma); + + switch (bit_num) { + case 11: + case 12: + if (chroma_format_idc == 3) + return VA_RT_FORMAT_YUV444_12; + if (chroma_format_idc == 2) + return VA_RT_FORMAT_YUV422_12; + else + return VA_RT_FORMAT_YUV420_12; + break; + case 9: + case 10: + if (chroma_format_idc == 3) + return VA_RT_FORMAT_YUV444_10; + if (chroma_format_idc == 2) + return VA_RT_FORMAT_YUV422_10; + else + return VA_RT_FORMAT_YUV420_10; + break; + case 8: + if (chroma_format_idc == 3) + return VA_RT_FORMAT_YUV444; + if (chroma_format_idc == 2) + return VA_RT_FORMAT_YUV422; + else + return VA_RT_FORMAT_YUV420; + break; + default: + GST_ERROR_OBJECT (self, "Unsupported chroma format: %d " + "(with depth luma: %d, with depth chroma: %d)", + chroma_format_idc, 
bit_depth_luma, bit_depth_chroma); + return 0; + } +} + +/* *INDENT-OFF* */ +static const struct +{ + GstH265Profile profile; + VAProfile va_profile; +} profile_map[] = { +#define P(idc, va) { G_PASTE (GST_H265_PROFILE_, idc), G_PASTE (VAProfileHEVC, va) } + P (MAIN, Main), + P (MAIN_10, Main10), + /*P (MAIN_STILL_PICTURE, ), + P (MONOCHROME, ), + P (MONOCHROME_12, ), + P (MONOCHROME_16, ),*/ + P (MAIN_12, Main12), + P (MAIN_422_10, Main422_10), + P (MAIN_422_12, Main422_12), + P (MAIN_444, Main444), + P (MAIN_444_10, Main444_10), + P (MAIN_444_12, Main444_12), + /*P (MAIN_INTRA, ), + P (MAIN_10_INTRA, ), + P (MAIN_12_INTRA, ), + P (MAIN_422_10_INTRA, ), + P (MAIN_422_12_INTRA, ), + P (MAIN_444_INTRA, ), + P (MAIN_444_10_INTRA, ), + P (MAIN_444_12_INTRA, ), + P (MAIN_444_16_INTRA, ), + P (MAIN_444_STILL_PICTURE, ), + P (MAIN_444_16_STILL_PICTURE, ), + P (MONOCHROME_10, ), + P (HIGH_THROUGHPUT_444, ), + P (HIGH_THROUGHPUT_444_10, ), + P (HIGH_THROUGHPUT_444_14, ), + P (HIGH_THROUGHPUT_444_16_INTRA, ),*/ + P (SCREEN_EXTENDED_MAIN, SccMain), + P (SCREEN_EXTENDED_MAIN_10, SccMain10), + P (SCREEN_EXTENDED_MAIN_444, SccMain444), + /*P (SCREEN_EXTENDED_MAIN_444_10, ), + P (SCREEN_EXTENDED_HIGH_THROUGHPUT_444, ), + P (SCREEN_EXTENDED_HIGH_THROUGHPUT_444_10, ), + P (SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14, ), + P (MULTIVIEW_MAIN, ), + P (SCALABLE_MAIN, ), + P (SCALABLE_MAIN_10, ), + P (SCALABLE_MONOCHROME, ), + P (SCALABLE_MONOCHROME_12, ), + P (SCALABLE_MONOCHROME_16, ), + P (SCALABLE_MAIN_444, ), + P (3D_MAIN, ),*/ +#undef P +}; +/* *INDENT-ON* */ + +static VAProfile +_get_profile (GstVaH265Dec * self, const GstH265SPS * sps, gint max_dpb_size) +{ + GstH265Decoder *h265_decoder = GST_H265_DECODER (self); + GstVaBaseDec *base = GST_VA_BASE_DEC (self); + GstH265Profile profile = gst_h265_get_profile_from_sps ((GstH265SPS *) sps); + VAProfile profiles[4]; + gint i = 0, j; + + /* 1. The profile directly specified by the SPS should always be the + first choice. 
It is the exact one. + 2. The profile in the input caps may contain the compatible profile + chosen by the upstream element. Upstream element such as the parse + may already decide the best compatible profile for us. We also need + to consider it as a choice. */ + + for (j = 0; j < G_N_ELEMENTS (profile_map); j++) { + if (profile_map[j].profile == profile) { + profiles[i++] = profile_map[j].va_profile; + break; + } + } + + if (h265_decoder->input_state->caps + && gst_caps_is_fixed (h265_decoder->input_state->caps)) { + GstH265Profile compatible_profile = GST_H265_PROFILE_INVALID; + GstStructure *structure; + const gchar *profile_str; + + structure = gst_caps_get_structure (h265_decoder->input_state->caps, 0); + + profile_str = gst_structure_get_string (structure, "profile"); + if (profile_str) + compatible_profile = gst_h265_profile_from_string (profile_str); + + if (compatible_profile != profile) { + GST_INFO_OBJECT (self, "The upstream set the compatible profile %s, " + "also consider it as a candidate.", profile_str); + + for (j = 0; j < G_N_ELEMENTS (profile_map); j++) { + if (profile_map[j].profile == compatible_profile) { + profiles[i++] = profile_map[j].va_profile; + break; + } + } + } + } + + for (j = 0; j < i && j < G_N_ELEMENTS (profiles); j++) { + if (gst_va_decoder_has_profile (base->decoder, profiles[j])) + return profiles[j]; + } + + GST_ERROR_OBJECT (self, "Unsupported profile: %d", profile); + + return VAProfileNone; +} + +static GstFlowReturn +gst_va_h265_dec_new_sequence (GstH265Decoder * decoder, const GstH265SPS * sps, + gint max_dpb_size) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + VAProfile profile; + gint display_width; + gint display_height; + gint padding_left, padding_right, padding_top, padding_bottom; + guint rt_format; + gboolean negotiation_needed = FALSE; + + if (self->dpb_size < max_dpb_size) + self->dpb_size = max_dpb_size; + + if (sps->conformance_window_flag) { + 
display_width = sps->crop_rect_width; + display_height = sps->crop_rect_height; + padding_left = sps->crop_rect_x; + padding_right = sps->width - sps->crop_rect_x - display_width; + padding_top = sps->crop_rect_y; + padding_bottom = sps->height - sps->crop_rect_y - display_height; + } else { + display_width = sps->width; + display_height = sps->height; + padding_left = padding_right = padding_top = padding_bottom = 0; + } + + profile = _get_profile (self, sps, max_dpb_size); + if (profile == VAProfileNone) + return GST_FLOW_NOT_NEGOTIATED; + + rt_format = _get_rtformat (self, sps->bit_depth_luma_minus8 + 8, + sps->bit_depth_chroma_minus8 + 8, sps->chroma_format_idc); + if (rt_format == 0) + return GST_FLOW_NOT_NEGOTIATED; + + if (!gst_va_decoder_config_is_equal (base->decoder, profile, + rt_format, sps->width, sps->height)) { + base->profile = profile; + base->rt_format = rt_format; + self->coded_width = sps->width; + self->coded_height = sps->height; + + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "Format changed to %s [%x] (%dx%d)", + gst_va_profile_name (profile), rt_format, self->coded_width, + self->coded_height); + } + + if (base->width != display_width || base->height != display_height) { + base->width = display_width; + base->height = display_height; + + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "Resolution changed to %dx%d", base->width, + base->height); + } + + base->need_valign = base->width < self->coded_width + || base->height < self->coded_height; + if (base->need_valign) { + /* *INDENT-OFF* */ + if (base->valign.padding_left != padding_left || + base->valign.padding_right != padding_right || + base->valign.padding_top != padding_top || + base->valign.padding_bottom != padding_bottom) { + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "crop rect changed to (%d,%d)-->(%d,%d)", + padding_left, padding_top, padding_right, padding_bottom); + } + base->valign = (GstVideoAlignment) { + .padding_left = padding_left, + .padding_right = 
padding_right, + .padding_top = padding_top, + .padding_bottom = padding_bottom, + }; + /* *INDENT-ON* */ + } + + base->min_buffers = self->dpb_size + 4; /* dpb size + scratch surfaces */ + + base->need_negotiation = negotiation_needed; + + { + /* FIXME: We don't have parser API for sps_range_extension, so + * assuming high_precision_offsets_enabled_flag as zero */ + guint high_precision_offsets_enabled_flag = 0, bitdepthC = 0; + + /* Calculate WpOffsetHalfRangeC: (7-34) */ + bitdepthC = sps->bit_depth_chroma_minus8 + 8; + self->WpOffsetHalfRangeC = + 1 << (high_precision_offsets_enabled_flag ? (bitdepthC - 1) : 7); + } + + return GST_FLOW_OK; +} + +static GstCaps * +_complete_sink_caps (GstCaps * sinkcaps) +{ + GstCaps *caps = gst_caps_copy (sinkcaps); + GValue val = G_VALUE_INIT; + const gchar *streamformat[] = { "hvc1", "hev1", "byte-stream" }; + gint i; + + g_value_init (&val, G_TYPE_STRING); + g_value_set_string (&val, "au"); + gst_caps_set_value (caps, "alignment", &val); + g_value_unset (&val); + + gst_value_list_init (&val, G_N_ELEMENTS (streamformat)); + for (i = 0; i < G_N_ELEMENTS (streamformat); i++) { + GValue v = G_VALUE_INIT; + + g_value_init (&v, G_TYPE_STRING); + g_value_set_string (&v, streamformat[i]); + gst_value_list_append_value (&val, &v); + g_value_unset (&v); + } + gst_caps_set_value (caps, "stream-format", &val); + g_value_unset (&val); + + return caps; +} + +static GstCaps * +gst_va_h265_dec_getcaps (GstVideoDecoder * decoder, GstCaps * filter) +{ + GstCaps *sinkcaps, *caps = NULL, *tmp; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + if (base->decoder) + caps = gst_va_decoder_get_sinkpad_caps (base->decoder); + + if (caps) { + sinkcaps = _complete_sink_caps (caps); + gst_caps_unref (caps); + if (filter) { + tmp = gst_caps_intersect_full (filter, sinkcaps, + GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (sinkcaps); + caps = tmp; + } else { + caps = sinkcaps; + } + GST_LOG_OBJECT (base, "Returning caps %" GST_PTR_FORMAT, caps); + } 
else if (!caps) { + caps = gst_video_decoder_proxy_getcaps (decoder, NULL, filter); + } + + return caps; +} + +static gboolean +gst_va_h265_dec_negotiate (GstVideoDecoder * decoder) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaH265Dec *self = GST_VA_H265_DEC (decoder); + GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN; + GstCapsFeatures *capsfeatures = NULL; + GstH265Decoder *h265dec = GST_H265_DECODER (decoder); + + /* Ignore downstream renegotiation request. */ + if (!base->need_negotiation) + return TRUE; + + base->need_negotiation = FALSE; + + if (gst_va_decoder_is_open (base->decoder) + && !gst_va_decoder_close (base->decoder)) + return FALSE; + + if (!gst_va_decoder_open (base->decoder, base->profile, base->rt_format)) + return FALSE; + + if (!gst_va_decoder_set_frame_size (base->decoder, self->coded_width, + self->coded_height)) + return FALSE; + + if (base->output_state) + gst_video_codec_state_unref (base->output_state); + + gst_va_base_dec_get_preferred_format_and_caps_features (base, &format, + &capsfeatures); + + base->output_state = + gst_video_decoder_set_output_state (decoder, format, + base->width, base->height, h265dec->input_state); + + base->output_state->caps = gst_video_info_to_caps (&base->output_state->info); + if (capsfeatures) + gst_caps_set_features_simple (base->output_state->caps, capsfeatures); + + GST_INFO_OBJECT (self, "Negotiated caps %" GST_PTR_FORMAT, + base->output_state->caps); + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static void +gst_va_h265_dec_dispose (GObject * object) +{ + g_free (GST_VA_H265_DEC (object)->prev_slice.data); + + gst_va_base_dec_close (GST_VIDEO_DECODER (object)); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_h265_dec_class_init (gpointer g_class, gpointer class_data) +{ + GstCaps *src_doc_caps, *sink_doc_caps; + GObjectClass *gobject_class = G_OBJECT_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS 
(g_class); + GstH265DecoderClass *h265decoder_class = GST_H265_DECODER_CLASS (g_class); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (g_class); + struct CData *cdata = class_data; + gchar *long_name; + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API H.265 Decoder in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API H.265 Decoder"); + } + + gst_element_class_set_metadata (element_class, long_name, + "Codec/Decoder/Video/Hardware", + "VA-API based H.265 video decoder", + "Nicolas Dufresne <nicolas.dufresne@collabora.com>"); + + sink_doc_caps = gst_caps_from_string (sink_caps_str); + src_doc_caps = gst_caps_from_string (src_caps_str); + + parent_class = g_type_class_peek_parent (g_class); + + gst_va_base_dec_class_init (GST_VA_BASE_DEC_CLASS (g_class), HEVC, + cdata->render_device_path, cdata->sink_caps, cdata->src_caps, + src_doc_caps, sink_doc_caps); + + gobject_class->dispose = gst_va_h265_dec_dispose; + + decoder_class->getcaps = GST_DEBUG_FUNCPTR (gst_va_h265_dec_getcaps); + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_va_h265_dec_negotiate); + + h265decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_va_h265_dec_new_sequence); + h265decoder_class->decode_slice = + GST_DEBUG_FUNCPTR (gst_va_h265_dec_decode_slice); + + h265decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_va_h265_dec_new_picture); + h265decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_va_h265_dec_output_picture); + h265decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_va_h265_dec_start_picture); + h265decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_va_h265_dec_end_picture); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + gst_caps_unref (cdata->src_caps); + gst_caps_unref (cdata->sink_caps); + g_free (cdata); +} + +static void +gst_va_h265_dec_init (GTypeInstance * instance, gpointer g_class) +{ + gst_va_base_dec_init (GST_VA_BASE_DEC (instance), 
GST_CAT_DEFAULT); + gst_h265_decoder_set_process_ref_pic_lists (GST_H265_DECODER (instance), + TRUE); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_h265dec_debug, "vah265dec", 0, + "VA H265 decoder"); + + return NULL; +} + +gboolean +gst_va_h265_dec_register (GstPlugin * plugin, GstVaDevice * device, + GstCaps * sink_caps, GstCaps * src_caps, guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaH265DecClass), + .class_init = gst_va_h265_dec_class_init, + .instance_size = sizeof (GstVaH265Dec), + .instance_init = gst_va_h265_dec_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + g_return_val_if_fail (GST_IS_CAPS (sink_caps), FALSE); + g_return_val_if_fail (GST_IS_CAPS (src_caps), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + cdata->sink_caps = _complete_sink_caps (sink_caps); + cdata->src_caps = gst_caps_ref (src_caps); + + /* class data will be leaked if the element never gets instantiated */ + GST_MINI_OBJECT_FLAG_SET (cdata->sink_caps, + GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + type_info.class_data = cdata; + + type_name = g_strdup ("GstVaH265Dec"); + feature_name = g_strdup ("vah265dec"); + + /* The first decoder to be registered should use a constant name, + * like vah265dec; for any additional decoders, we create unique + * names by inserting the render device name. 
*/ + if (g_type_from_name (type_name)) { + gchar *basename = g_path_get_basename (device->render_device_path); + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstVa%sH265Dec", basename); + feature_name = g_strdup_printf ("va%sh265dec", basename); + cdata->description = basename; + + /* lower rank for non-first device */ + if (rank > 0) + rank--; + } + + g_once (&debug_once, _register_debug_category, NULL); + + type = g_type_register_static (GST_TYPE_H265_DECODER, + type_name, &type_info, 0); + + ret = gst_element_register (plugin, feature_name, rank, type); + + g_free (type_name); + g_free (feature_name); + + return ret; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvah265dec.h
Added
@@ -0,0 +1,35 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * Copyright (C) 2020 Collabora + * Author: Nicolas Dufresne <nicolas.dufresne@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadevice.h" + +G_BEGIN_DECLS + +gboolean gst_va_h265_dec_register (GstPlugin * plugin, + GstVaDevice * device, + GstCaps * sink_caps, + GstCaps * src_caps, + guint rank); + +G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvampeg2dec.c
Added
@@ -0,0 +1,736 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vampeg2dec + * @title: vampeg2dec + * @short_description: A VA-API based Mpeg2 video decoder + * + * vampeg2dec decodes Mpeg2 bitstreams to VA surfaces using the + * installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver. + * + * The decoding surfaces can be mapped onto main memory as video + * frames. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.mpg ! parsebin ! vampeg2dec ! 
autovideosink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvampeg2dec.h" + +#include "gstvabasedec.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_mpeg2dec_debug); +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT gst_va_mpeg2dec_debug +#else +#define GST_CAT_DEFAULT NULL +#endif + +#define GST_VA_MPEG2_DEC(obj) ((GstVaMpeg2Dec *) obj) +#define GST_VA_MPEG2_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaMpeg2DecClass)) +#define GST_VA_MPEG2_DEC_CLASS(klass) ((GstVaMpeg2DecClass *) klass) + +typedef struct _GstVaMpeg2Dec GstVaMpeg2Dec; +typedef struct _GstVaMpeg2DecClass GstVaMpeg2DecClass; + +struct _GstVaMpeg2DecClass +{ + GstVaBaseDecClass parent_class; +}; + +struct _GstVaMpeg2Dec +{ + GstVaBaseDec parent; + + gboolean progressive; + GstMpegVideoSequenceHdr seq; +}; + +static GstElementClass *parent_class = NULL; + +/* *INDENT-OFF* */ +static const gchar *src_caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12 }") " ;" + GST_VIDEO_CAPS_MAKE ("{ NV12 }"); +/* *INDENT-ON* */ + +static const gchar *sink_caps_str = "video/x-mpeg2"; + +static gboolean +gst_va_mpeg2_dec_negotiate (GstVideoDecoder * decoder) +{ + GstCapsFeatures *capsfeatures = NULL; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN; + GstMpeg2Decoder *mpeg2dec = GST_MPEG2_DECODER (decoder); + + /* Ignore downstream renegotiation request. 
*/ + if (!base->need_negotiation) + return TRUE; + + base->need_negotiation = FALSE; + + if (gst_va_decoder_is_open (base->decoder) + && !gst_va_decoder_close (base->decoder)) + return FALSE; + + if (!gst_va_decoder_open (base->decoder, base->profile, base->rt_format)) + return FALSE; + + if (!gst_va_decoder_set_frame_size (base->decoder, base->width, base->height)) + return FALSE; + + if (base->output_state) + gst_video_codec_state_unref (base->output_state); + + gst_va_base_dec_get_preferred_format_and_caps_features (base, &format, + &capsfeatures); + + base->output_state = + gst_video_decoder_set_output_state (decoder, format, + base->width, base->height, mpeg2dec->input_state); + + if (!self->progressive) + base->output_state->info.interlace_mode = GST_VIDEO_INTERLACE_MODE_MIXED; + + base->output_state->caps = gst_video_info_to_caps (&base->output_state->info); + if (capsfeatures) + gst_caps_set_features_simple (base->output_state->caps, capsfeatures); + + GST_INFO_OBJECT (self, "Negotiated caps %" GST_PTR_FORMAT, + base->output_state->caps); + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static VAProfile +_map_profile (GstMpegVideoProfile profile) +{ + VAProfile p = VAProfileNone; + + switch (profile) { + case GST_MPEG_VIDEO_PROFILE_SIMPLE: + p = VAProfileMPEG2Simple; + break; + case GST_MPEG_VIDEO_PROFILE_MAIN: + p = VAProfileMPEG2Main; + break; + default: + p = VAProfileNone; + break; + } + + return p; +} + +static VAProfile +_get_profile (GstVaMpeg2Dec * self, GstMpegVideoProfile profile, + const GstMpegVideoSequenceExt * seq_ext, + const GstMpegVideoSequenceScalableExt * seq_scalable_ext) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (self); + VAProfile hw_profile; + + hw_profile = _map_profile (profile); + if (hw_profile == VAProfileNone) + return hw_profile; + + /* promote the profile if hw does not support, until we get one */ + do { + if (gst_va_decoder_has_profile (base->decoder, hw_profile)) + return hw_profile; + + /* 
Otherwise, try to map to a higher profile */ + switch (profile) { + case GST_MPEG_VIDEO_PROFILE_SIMPLE: + hw_profile = VAProfileMPEG2Main; + break; + case GST_MPEG_VIDEO_PROFILE_HIGH: + /* Try to map to main profile if no high profile specific bits used */ + if (!seq_scalable_ext && (seq_ext && seq_ext->chroma_format == 1)) { + hw_profile = VAProfileMPEG2Main; + break; + } + /* fall-through */ + default: + GST_ERROR_OBJECT (self, "profile %d is unsupported.", profile); + hw_profile = VAProfileNone; + break; + } + } while (hw_profile != VAProfileNone); + + return hw_profile; +} + +static guint +_get_rtformat (GstVaMpeg2Dec * self, GstMpegVideoChromaFormat chroma_format) +{ + guint ret = 0; + + switch (chroma_format) { + case GST_MPEG_VIDEO_CHROMA_420: + ret = VA_RT_FORMAT_YUV420; + break; + case GST_MPEG_VIDEO_CHROMA_422: + ret = VA_RT_FORMAT_YUV422; + break; + case GST_MPEG_VIDEO_CHROMA_444: + ret = VA_RT_FORMAT_YUV444; + break; + default: + GST_ERROR_OBJECT (self, "Unsupported chroma format: %d ", chroma_format); + break; + } + + return ret; +} + +static GstFlowReturn +gst_va_mpeg2_dec_new_sequence (GstMpeg2Decoder * decoder, + const GstMpegVideoSequenceHdr * seq, + const GstMpegVideoSequenceExt * seq_ext, + const GstMpegVideoSequenceDisplayExt * seq_display_ext, + const GstMpegVideoSequenceScalableExt * seq_scalable_ext) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + VAProfile profile; + GstMpegVideoProfile mpeg_profile; + gboolean negotiation_needed = FALSE; + guint rt_format; + gint width, height; + gboolean progressive; + + self->seq = *seq; + + width = seq->width; + height = seq->height; + if (seq_ext) { + width = (width & 0x0fff) | ((guint32) seq_ext->horiz_size_ext << 12); + height = (height & 0x0fff) | ((guint32) seq_ext->vert_size_ext << 12); + } + + mpeg_profile = GST_MPEG_VIDEO_PROFILE_MAIN; + if (seq_ext) + mpeg_profile = seq_ext->profile; + + profile = _get_profile (self, mpeg_profile, 
seq_ext, seq_scalable_ext); + if (profile == VAProfileNone) + return GST_FLOW_NOT_NEGOTIATED; + + rt_format = _get_rtformat (self, + seq_ext ? seq_ext->chroma_format : GST_MPEG_VIDEO_CHROMA_420); + if (rt_format == 0) + return GST_FLOW_NOT_NEGOTIATED; + + if (!gst_va_decoder_config_is_equal (base->decoder, profile, + rt_format, width, height)) { + base->profile = profile; + base->rt_format = rt_format; + base->width = width; + base->height = height; + + negotiation_needed = TRUE; + + GST_INFO_OBJECT (self, "Format changed to %s [%x] (%dx%d)", + gst_va_profile_name (profile), rt_format, base->width, base->height); + } + + progressive = seq_ext ? seq_ext->progressive : 1; + if (self->progressive != progressive) { + self->progressive = progressive; + + negotiation_needed = TRUE; + GST_INFO_OBJECT (self, "Interlaced mode changed to %d", !progressive); + } + + base->need_valign = FALSE; + + base->min_buffers = 2 + 4; /* max num pic references + scratch surfaces */ + + base->need_negotiation = negotiation_needed; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_mpeg2_dec_new_picture (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, GstMpeg2Picture * picture) +{ + GstFlowReturn ret; + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + GstVaDecodePicture *pic; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + + if (base->need_negotiation) { + if (!gst_video_decoder_negotiate (vdec)) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + ret = gst_video_decoder_allocate_output_frame (vdec, frame); + if (ret != GST_FLOW_OK) + goto error; + + pic = gst_va_decode_picture_new (base->decoder, frame->output_buffer); + + gst_mpeg2_picture_set_user_data (picture, pic, + (GDestroyNotify) gst_va_decode_picture_free); + + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, + gst_va_decode_picture_get_surface (pic)); + + return GST_FLOW_OK; + 
+error: + { + GST_WARNING_OBJECT (self, "Failed to allocated output buffer, return %s", + gst_flow_get_name (ret)); + return ret; + } +} + +static GstFlowReturn +gst_va_mpeg2_dec_new_field_picture (GstMpeg2Decoder * decoder, + const GstMpeg2Picture * first_field, GstMpeg2Picture * second_field) +{ + GstVaDecodePicture *first_pic, *second_pic; + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + first_pic = gst_mpeg2_picture_get_user_data ((GstMpeg2Picture *) first_field); + if (!first_pic) + return GST_FLOW_ERROR; + + second_pic = gst_va_decode_picture_new (base->decoder, first_pic->gstbuffer); + gst_mpeg2_picture_set_user_data (second_field, second_pic, + (GDestroyNotify) gst_va_decode_picture_free); + + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", second_pic, + gst_va_decode_picture_get_surface (second_pic)); + + return GST_FLOW_OK; +} + +static inline guint32 +_pack_f_code (guint8 f_code[2][2]) +{ + return (((guint32) f_code[0][0] << 12) + | ((guint32) f_code[0][1] << 8) + | ((guint32) f_code[1][0] << 4) + | (f_code[1][1])); +} + +static inline void +_copy_quant_matrix (guint8 dst[64], const guint8 src[64]) +{ + memcpy (dst, src, 64); +} + +static gboolean +gst_va_mpeg2_dec_add_quant_matrix (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + GstMpegVideoQuantMatrixExt *const quant_matrix = slice->quant_matrix; + guint8 *intra_quant_matrix = NULL; + guint8 *non_intra_quant_matrix = NULL; + guint8 *chroma_intra_quant_matrix = NULL; + guint8 *chroma_non_intra_quant_matrix = NULL; + VAIQMatrixBufferMPEG2 iq_matrix = { 0 }; + GstVaDecodePicture *va_pic; + + intra_quant_matrix = self->seq.intra_quantizer_matrix; + non_intra_quant_matrix = self->seq.non_intra_quantizer_matrix; + + if (quant_matrix) { + if (quant_matrix->load_intra_quantiser_matrix) + intra_quant_matrix = 
quant_matrix->intra_quantiser_matrix; + if (quant_matrix->load_non_intra_quantiser_matrix) + non_intra_quant_matrix = quant_matrix->non_intra_quantiser_matrix; + if (quant_matrix->load_chroma_intra_quantiser_matrix) + chroma_intra_quant_matrix = quant_matrix->chroma_intra_quantiser_matrix; + if (quant_matrix->load_chroma_non_intra_quantiser_matrix) + chroma_non_intra_quant_matrix = + quant_matrix->chroma_non_intra_quantiser_matrix; + } + + iq_matrix.load_intra_quantiser_matrix = intra_quant_matrix != NULL; + if (intra_quant_matrix) + _copy_quant_matrix (iq_matrix.intra_quantiser_matrix, intra_quant_matrix); + + iq_matrix.load_non_intra_quantiser_matrix = non_intra_quant_matrix != NULL; + if (non_intra_quant_matrix) + _copy_quant_matrix (iq_matrix.non_intra_quantiser_matrix, + non_intra_quant_matrix); + + iq_matrix.load_chroma_intra_quantiser_matrix = + chroma_intra_quant_matrix != NULL; + if (chroma_intra_quant_matrix) + _copy_quant_matrix (iq_matrix.chroma_intra_quantiser_matrix, + chroma_intra_quant_matrix); + + iq_matrix.load_chroma_non_intra_quantiser_matrix = + chroma_non_intra_quant_matrix != NULL; + if (chroma_non_intra_quant_matrix) + _copy_quant_matrix (iq_matrix.chroma_non_intra_quantiser_matrix, + chroma_non_intra_quant_matrix); + + va_pic = gst_mpeg2_picture_get_user_data (picture); + return gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAIQMatrixBufferType, &iq_matrix, sizeof (iq_matrix)); +} + +static inline uint32_t +_is_frame_start (GstMpeg2Picture * picture) +{ + return (!picture->first_field + || (picture->structure == GST_MPEG_VIDEO_PICTURE_STRUCTURE_FRAME)) + ? 
1 : 0; +} + +static inline VASurfaceID +_get_surface_id (GstMpeg2Picture * picture) +{ + GstVaDecodePicture *va_pic; + + if (!picture) + return VA_INVALID_ID; + + va_pic = gst_mpeg2_picture_get_user_data (picture); + if (!va_pic) + return VA_INVALID_ID; + return gst_va_decode_picture_get_surface (va_pic); +} + +static GstFlowReturn +gst_va_mpeg2_dec_start_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice, + GstMpeg2Picture * prev_picture, GstMpeg2Picture * next_picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + GstVaDecodePicture *va_pic; + VAPictureParameterBufferMPEG2 pic_param; + + va_pic = gst_mpeg2_picture_get_user_data (picture); + + /* *INDENT-OFF* */ + pic_param = (VAPictureParameterBufferMPEG2) { + .horizontal_size = base->width, + .vertical_size = base->height, + .forward_reference_picture = VA_INVALID_ID, + .backward_reference_picture = VA_INVALID_ID, + .picture_coding_type = slice->pic_hdr->pic_type, + .f_code = _pack_f_code (slice->pic_ext->f_code), + .picture_coding_extension.bits = { + .is_first_field = _is_frame_start (picture), + .intra_dc_precision = slice->pic_ext->intra_dc_precision, + .picture_structure = slice->pic_ext->picture_structure, + .top_field_first = slice->pic_ext->top_field_first, + .frame_pred_frame_dct = slice->pic_ext->frame_pred_frame_dct, + .concealment_motion_vectors = slice->pic_ext->concealment_motion_vectors, + .q_scale_type = slice->pic_ext->q_scale_type, + .intra_vlc_format = slice->pic_ext->intra_vlc_format, + .alternate_scan = slice->pic_ext->alternate_scan, + .repeat_first_field = slice->pic_ext->repeat_first_field, + .progressive_frame = slice->pic_ext->progressive_frame, + }, + }; + /* *INDENT-ON* */ + + switch (picture->type) { + case GST_MPEG_VIDEO_PICTURE_TYPE_B:{ + VASurfaceID surface = _get_surface_id (next_picture); + if (surface == VA_INVALID_ID) { + GST_WARNING_OBJECT (self, "Missing the backward reference 
picture"); + if (GST_VA_DISPLAY_IS_IMPLEMENTATION (base->display, MESA_GALLIUM)) + return GST_FLOW_ERROR; + else if (GST_VA_DISPLAY_IS_IMPLEMENTATION (base->display, INTEL_IHD)) + surface = gst_va_decode_picture_get_surface (va_pic); + } + pic_param.backward_reference_picture = surface; + } + /* fall-through */ + case GST_MPEG_VIDEO_PICTURE_TYPE_P:{ + VASurfaceID surface = _get_surface_id (prev_picture); + if (surface == VA_INVALID_ID) { + GST_WARNING_OBJECT (self, "Missing the forward reference picture"); + if (GST_VA_DISPLAY_IS_IMPLEMENTATION (base->display, MESA_GALLIUM)) + return GST_FLOW_ERROR; + else if (GST_VA_DISPLAY_IS_IMPLEMENTATION (base->display, INTEL_IHD)) + surface = gst_va_decode_picture_get_surface (va_pic); + } + pic_param.forward_reference_picture = surface; + } + default: + break; + } + + if (!gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAPictureParameterBufferType, &pic_param, sizeof (pic_param))) + return GST_FLOW_ERROR; + + if (!gst_va_mpeg2_dec_add_quant_matrix (decoder, picture, slice)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_mpeg2_dec_decode_slice (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture, GstMpeg2Slice * slice) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstMpegVideoSliceHdr *header = &slice->header; + GstMpegVideoPacket *packet = &slice->packet; + GstVaDecodePicture *va_pic; + VASliceParameterBufferMPEG2 slice_param; + + /* *INDENT-OFF* */ + slice_param = (VASliceParameterBufferMPEG2) { + .slice_data_size = slice->size, + .slice_data_offset = 0, + .slice_data_flag = VA_SLICE_DATA_FLAG_ALL, + .macroblock_offset = header->header_size + 32, + .slice_horizontal_position = header->mb_column, + .slice_vertical_position = header->mb_row, + .quantiser_scale_code = header->quantiser_scale_code, + .intra_slice_flag = header->intra_slice, + }; + /* *INDENT-ON* */ + + va_pic = gst_mpeg2_picture_get_user_data (picture); + if (!gst_va_decoder_add_slice_buffer 
(base->decoder, va_pic, + &slice_param, sizeof (slice_param), + (guint8 *) (packet->data + slice->sc_offset), slice->size)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_mpeg2_dec_end_picture (GstMpeg2Decoder * decoder, + GstMpeg2Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *va_pic; + + GST_LOG_OBJECT (base, "end picture %p, (poc %d)", + picture, picture->pic_order_cnt); + + va_pic = gst_mpeg2_picture_get_user_data (picture); + + if (!gst_va_decoder_decode (base->decoder, va_pic)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_mpeg2_dec_output_picture (GstMpeg2Decoder * decoder, + GstVideoCodecFrame * frame, GstMpeg2Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaMpeg2Dec *self = GST_VA_MPEG2_DEC (decoder); + + GST_LOG_OBJECT (self, + "Outputting picture %p (poc %d)", picture, picture->pic_order_cnt); + + if (base->copy_frames) + gst_va_base_dec_copy_output_buffer (base, frame); + + if (picture->buffer_flags != 0) { + gboolean interlaced = + (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_INTERLACED) != 0; + gboolean tff = (picture->buffer_flags & GST_VIDEO_BUFFER_FLAG_TFF) != 0; + + GST_TRACE_OBJECT (self, + "apply buffer flags 0x%x (interlaced %d, top-field-first %d)", + picture->buffer_flags, interlaced, tff); + GST_BUFFER_FLAG_SET (frame->output_buffer, picture->buffer_flags); + } + + gst_mpeg2_picture_unref (picture); + + return gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); +} + +static void +gst_va_mpeg2_dec_init (GTypeInstance * instance, gpointer g_class) +{ + gst_va_base_dec_init (GST_VA_BASE_DEC (instance), GST_CAT_DEFAULT); +} + +static void +gst_va_mpeg2_dec_dispose (GObject * object) +{ + gst_va_base_dec_close (GST_VIDEO_DECODER (object)); + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_mpeg2_dec_class_init (gpointer g_class, gpointer class_data) +{ + 
GstCaps *src_doc_caps, *sink_doc_caps; + GObjectClass *gobject_class = G_OBJECT_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); + GstMpeg2DecoderClass *mpeg2decoder_class = GST_MPEG2_DECODER_CLASS (g_class); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (g_class); + struct CData *cdata = class_data; + gchar *long_name; + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API Mpeg2 Decoder in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API Mpeg2 Decoder"); + } + + gst_element_class_set_metadata (element_class, long_name, + "Codec/Decoder/Video/Hardware", + "VA-API based Mpeg2 video decoder", "He Junyan <junyan.he@intel.com>"); + + sink_doc_caps = gst_caps_from_string (sink_caps_str); + src_doc_caps = gst_caps_from_string (src_caps_str); + + parent_class = g_type_class_peek_parent (g_class); + + gst_va_base_dec_class_init (GST_VA_BASE_DEC_CLASS (g_class), MPEG2, + cdata->render_device_path, cdata->sink_caps, cdata->src_caps, + src_doc_caps, sink_doc_caps); + + gobject_class->dispose = gst_va_mpeg2_dec_dispose; + + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_negotiate); + + mpeg2decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_new_sequence); + mpeg2decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_new_picture); + mpeg2decoder_class->new_field_picture = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_new_field_picture); + mpeg2decoder_class->start_picture = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_start_picture); + mpeg2decoder_class->decode_slice = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_decode_slice); + mpeg2decoder_class->end_picture = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_end_picture); + mpeg2decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_va_mpeg2_dec_output_picture); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + gst_caps_unref (cdata->src_caps); + gst_caps_unref (cdata->sink_caps); 
+ g_free (cdata); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_mpeg2dec_debug, "vampeg2dec", 0, + "VA Mpeg2 decoder"); + + return NULL; +} + +gboolean +gst_va_mpeg2_dec_register (GstPlugin * plugin, GstVaDevice * device, + GstCaps * sink_caps, GstCaps * src_caps, guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaMpeg2DecClass), + .class_init = gst_va_mpeg2_dec_class_init, + .instance_size = sizeof (GstVaMpeg2Dec), + .instance_init = gst_va_mpeg2_dec_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + g_return_val_if_fail (GST_IS_CAPS (sink_caps), FALSE); + g_return_val_if_fail (GST_IS_CAPS (src_caps), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + cdata->sink_caps = gst_caps_ref (sink_caps); + cdata->src_caps = gst_caps_ref (src_caps); + + /* class data will be leaked if the element never gets instantiated */ + GST_MINI_OBJECT_FLAG_SET (cdata->sink_caps, + GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + type_info.class_data = cdata; + + type_name = g_strdup ("GstVaMpeg2dec"); + feature_name = g_strdup ("vampeg2dec"); + + /* The first decoder to be registered should use a constant name, + * like vampeg2dec, for any additional decoders, we create unique + * names, using inserting the render device name. 
*/ + if (g_type_from_name (type_name)) { + gchar *basename = g_path_get_basename (device->render_device_path); + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstVa%sMpeg2Dec", basename); + feature_name = g_strdup_printf ("va%smpeg2dec", basename); + cdata->description = basename; + + /* lower rank for non-first device */ + if (rank > 0) + rank--; + } + + g_once (&debug_once, _register_debug_category, NULL); + + type = g_type_register_static (GST_TYPE_MPEG2_DECODER, + type_name, &type_info, 0); + + ret = gst_element_register (plugin, feature_name, rank, type); + + g_free (type_name); + g_free (feature_name); + + return ret; +}
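The `_pack_f_code` helper in the gstvampeg2dec.c hunk above folds the four MPEG-2 f_code values (4 bits each) into the single packed field that VA-API expects in `VAPictureParameterBufferMPEG2.f_code`. A standalone sketch of the same bit layout (renamed here to avoid clashing with the in-tree static function):

```c
#include <assert.h>
#include <stdint.h>

/* Pack the four MPEG-2 f_code values, 4 bits each, into one 16-bit
 * field: forward-horizontal in the most significant nibble, then
 * forward-vertical, backward-horizontal, backward-vertical.  This
 * mirrors the shifts used by _pack_f_code in the diff above. */
static inline uint32_t
pack_f_code (const uint8_t f_code[2][2])
{
  return ((uint32_t) f_code[0][0] << 12)
      | ((uint32_t) f_code[0][1] << 8)
      | ((uint32_t) f_code[1][0] << 4)
      | f_code[1][1];
}
```

For example, f_code values {1, 2, 3, 4} pack to 0x1234, which makes the nibble ordering easy to verify by eye.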
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvampeg2dec.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadevice.h" + +G_BEGIN_DECLS + +gboolean gst_va_mpeg2_dec_register (GstPlugin * plugin, + GstVaDevice * device, + GstCaps * sink_caps, + GstCaps * src_caps, + guint rank); + +G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvapool.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvapool.c
Changed
@@ -25,6 +25,7 @@ #include "gstvapool.h" #include "gstvaallocator.h" +#include "gstvacaps.h" GST_DEBUG_CATEGORY_STATIC (gst_va_pool_debug); #define GST_CAT_DEFAULT gst_va_pool_debug @@ -36,10 +37,12 @@ GstVideoInfo alloc_info; GstVideoInfo caps_info; GstAllocator *allocator; - guint32 usage_hint; + gboolean force_videometa; gboolean add_videometa; - gboolean need_alignment; - GstVideoAlignment video_align; + gint crop_left; + gint crop_top; + + gboolean starting; }; #define gst_va_pool_parent_class parent_class @@ -74,10 +77,11 @@ GstVideoAlignment video_align = { 0, }; GstVideoInfo caps_info, alloc_info; gint width, height; - guint size, min_buffers, max_buffers; + guint i, min_buffers, max_buffers; guint32 usage_hint; + gboolean has_alignment; - if (!gst_buffer_pool_config_get_params (config, &caps, &size, &min_buffers, + if (!gst_buffer_pool_config_get_params (config, &caps, NULL, &min_buffers, &max_buffers)) goto wrong_config; @@ -87,9 +91,6 @@ if (!gst_video_info_from_caps (&caps_info, caps)) goto wrong_caps; - if (size < GST_VIDEO_INFO_SIZE (&caps_info)) - goto wrong_size; - if (!gst_buffer_pool_config_get_allocator (config, &allocator, NULL)) goto wrong_config; @@ -103,49 +104,61 @@ width = GST_VIDEO_INFO_WIDTH (&caps_info); height = GST_VIDEO_INFO_HEIGHT (&caps_info); - GST_LOG_OBJECT (vpool, "%dx%d - %u | caps %" GST_PTR_FORMAT, width, height, - size, caps); - - if (vpool->allocator) - gst_object_unref (vpool->allocator); - if ((vpool->allocator = allocator)) - gst_object_ref (allocator); + GST_LOG_OBJECT (vpool, "%dx%d | %" GST_PTR_FORMAT, width, height, caps); /* enable metadata based on config of the pool */ vpool->add_videometa = gst_buffer_pool_config_has_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); /* parse extra alignment info */ - vpool->need_alignment = gst_buffer_pool_config_has_option (config, + has_alignment = gst_buffer_pool_config_has_option (config, GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT); - if (vpool->need_alignment && 
vpool->add_videometa) { + if (has_alignment) { gst_buffer_pool_config_get_video_alignment (config, &video_align); width += video_align.padding_left + video_align.padding_right; height += video_align.padding_bottom + video_align.padding_top; - /* apply the alignment to the info */ - if (!gst_video_info_align (&caps_info, &video_align)) - goto failed_to_align; - - gst_buffer_pool_config_set_video_alignment (config, &video_align); + if (video_align.padding_left > 0) + vpool->crop_left = video_align.padding_left; + if (video_align.padding_top > 0) + vpool->crop_top = video_align.padding_top; } - GST_VIDEO_INFO_SIZE (&caps_info) = - MAX (size, GST_VIDEO_INFO_SIZE (&caps_info)); - + /* update allocation info with aligned size */ alloc_info = caps_info; GST_VIDEO_INFO_WIDTH (&alloc_info) = width; GST_VIDEO_INFO_HEIGHT (&alloc_info) = height; + if (GST_IS_VA_DMABUF_ALLOCATOR (allocator)) { + if (!gst_va_dmabuf_allocator_set_format (allocator, &alloc_info, + usage_hint)) + goto failed_allocator; + } else if (GST_IS_VA_ALLOCATOR (allocator)) { + if (!gst_va_allocator_set_format (allocator, &alloc_info, usage_hint)) + goto failed_allocator; + } + + gst_object_replace ((GstObject **) & vpool->allocator, + GST_OBJECT (allocator)); + vpool->caps_info = caps_info; vpool->alloc_info = alloc_info; - vpool->usage_hint = usage_hint; - vpool->video_align = video_align; + + for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&caps_info); i++) { + if (GST_VIDEO_INFO_PLANE_STRIDE (&caps_info, i) != + GST_VIDEO_INFO_PLANE_STRIDE (&alloc_info, i) || + GST_VIDEO_INFO_PLANE_OFFSET (&caps_info, i) != + GST_VIDEO_INFO_PLANE_OFFSET (&alloc_info, i)) { + GST_INFO_OBJECT (vpool, "Video meta is required in buffer."); + vpool->force_videometa = TRUE; + break; + } + } gst_buffer_pool_config_set_params (config, caps, - GST_VIDEO_INFO_SIZE (&caps_info), min_buffers, max_buffers); + GST_VIDEO_INFO_SIZE (&alloc_info), min_buffers, max_buffers); return GST_BUFFER_POOL_CLASS (parent_class)->set_config (pool, 
config); @@ -166,16 +179,9 @@ "failed getting geometry from caps %" GST_PTR_FORMAT, caps); return FALSE; } -wrong_size: +failed_allocator: { - GST_WARNING_OBJECT (vpool, - "Provided size is to small for the caps: %u < %" G_GSIZE_FORMAT, size, - GST_VIDEO_INFO_SIZE (&caps_info)); - return FALSE; - } -failed_to_align: - { - GST_WARNING_OBJECT (pool, "Failed to align"); + GST_WARNING_OBJECT (vpool, "Failed to set format to allocator"); return FALSE; } } @@ -185,38 +191,57 @@ GstBufferPoolAcquireParams * params) { GstBuffer *buf; - GstVideoMeta *vmeta; GstVaPool *vpool = GST_VA_POOL (pool); - GstVaAllocationParams alloc_params = { - .info = vpool->alloc_info, - .usage_hint = vpool->usage_hint, - }; buf = gst_buffer_new (); if (GST_IS_VA_DMABUF_ALLOCATOR (vpool->allocator)) { - if (!gst_va_dmabuf_setup_buffer (vpool->allocator, buf, &alloc_params)) - goto no_memory; + if (vpool->starting) { + if (!gst_va_dmabuf_allocator_setup_buffer (vpool->allocator, buf)) + goto no_memory; + } else if (!gst_va_dmabuf_allocator_prepare_buffer (vpool->allocator, buf)) { + if (!gst_va_dmabuf_allocator_setup_buffer (vpool->allocator, buf)) + goto no_memory; + } } else if (GST_IS_VA_ALLOCATOR (vpool->allocator)) { - GstMemory *mem = gst_va_allocator_alloc (vpool->allocator, &alloc_params); - if (!mem) - goto no_memory; - gst_buffer_append_memory (buf, mem); + if (vpool->starting) { + if (!gst_va_allocator_setup_buffer (vpool->allocator, buf)) + goto no_memory; + } else if (!gst_va_allocator_prepare_buffer (vpool->allocator, buf)) { + if (!gst_va_allocator_setup_buffer (vpool->allocator, buf)) + goto no_memory; + } } else goto no_memory; if (vpool->add_videometa) { - /* GstVaAllocator may update offset/stride given the physical - * memory */ - vmeta = gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE, - GST_VIDEO_INFO_FORMAT (&vpool->caps_info), - GST_VIDEO_INFO_WIDTH (&vpool->caps_info), - GST_VIDEO_INFO_HEIGHT (&vpool->caps_info), - GST_VIDEO_INFO_N_PLANES 
(&vpool->caps_info), - alloc_params.info.offset, alloc_params.info.stride); - - if (vpool->need_alignment) - gst_video_meta_set_alignment (vmeta, vpool->video_align); + if (vpool->crop_left > 0 || vpool->crop_top > 0) { + GstVideoCropMeta *crop_meta; + + /* For video crop, its video meta's width and height should be + the full size of uncropped resolution. */ + gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE, + GST_VIDEO_INFO_FORMAT (&vpool->alloc_info), + GST_VIDEO_INFO_WIDTH (&vpool->alloc_info), + GST_VIDEO_INFO_HEIGHT (&vpool->alloc_info), + GST_VIDEO_INFO_N_PLANES (&vpool->alloc_info), + vpool->alloc_info.offset, vpool->alloc_info.stride); + + crop_meta = gst_buffer_add_video_crop_meta (buf); + crop_meta->x = vpool->crop_left; + crop_meta->y = vpool->crop_top; + crop_meta->width = GST_VIDEO_INFO_WIDTH (&vpool->caps_info); + crop_meta->height = GST_VIDEO_INFO_HEIGHT (&vpool->caps_info); + } else { + /* GstVaAllocator may update offset/stride given the physical + * memory */ + gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE, + GST_VIDEO_INFO_FORMAT (&vpool->caps_info), + GST_VIDEO_INFO_WIDTH (&vpool->caps_info), + GST_VIDEO_INFO_HEIGHT (&vpool->caps_info), + GST_VIDEO_INFO_N_PLANES (&vpool->caps_info), + vpool->alloc_info.offset, vpool->alloc_info.stride); + } } *buffer = buf; @@ -232,6 +257,35 @@ } } +static gboolean +gst_va_pool_start (GstBufferPool * pool) +{ + GstVaPool *vpool = GST_VA_POOL (pool); + gboolean ret; + + vpool->starting = TRUE; + ret = GST_BUFFER_POOL_CLASS (parent_class)->start (pool); + vpool->starting = FALSE; + + return ret; +} + +static gboolean +gst_va_pool_stop (GstBufferPool * pool) +{ + GstVaPool *vpool = GST_VA_POOL (pool); + gboolean ret; + + ret = GST_BUFFER_POOL_CLASS (parent_class)->stop (pool); + + if (GST_IS_VA_DMABUF_ALLOCATOR (vpool->allocator)) + gst_va_dmabuf_allocator_flush (vpool->allocator); + else if (GST_IS_VA_ALLOCATOR (vpool->allocator)) + gst_va_allocator_flush (vpool->allocator); + + 
return ret; +} + static void gst_va_pool_dispose (GObject * object) { @@ -255,6 +309,8 @@ gstbufferpool_class->get_options = gst_va_pool_get_options; gstbufferpool_class->set_config = gst_va_pool_set_config; gstbufferpool_class->alloc_buffer = gst_va_pool_alloc; + gstbufferpool_class->start = gst_va_pool_start; + gstbufferpool_class->stop = gst_va_pool_stop; } static void @@ -281,3 +337,32 @@ { gst_structure_set (config, "usage-hint", G_TYPE_UINT, usage_hint, NULL); } + +gboolean +gst_va_pool_requires_video_meta (GstBufferPool * pool) +{ + return GST_VA_POOL (pool)->force_videometa; +} + +GstBufferPool * +gst_va_pool_new_with_config (GstCaps * caps, guint size, guint min_buffers, + guint max_buffers, guint usage_hint, GstAllocator * allocator, + GstAllocationParams * alloc_params) +{ + GstBufferPool *pool; + GstStructure *config; + + pool = gst_va_pool_new (); + + config = gst_buffer_pool_get_config (pool); + gst_buffer_pool_config_set_params (config, caps, size, min_buffers, + max_buffers); + gst_buffer_pool_config_set_va_allocation_params (config, usage_hint); + gst_buffer_pool_config_set_allocator (config, allocator, alloc_params); + gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META); + + if (!gst_buffer_pool_set_config (pool, config)) + gst_clear_object (&pool); + + return pool; +}
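The gstvapool.c changes above switch from `gst_video_info_align()` to a crop-meta approach: when alignment padding is requested, the buffer's video meta carries the full padded (uncropped) resolution, and a `GstVideoCropMeta` records where the visible pixels sit. A toy sketch of that bookkeeping, with illustrative struct and function names that are not GStreamer API:

```c
#include <assert.h>

/* Illustrative mirror of the new gstvapool.c alignment logic: the
 * allocation carries the padded resolution, while a crop rectangle
 * records the visible (caps) region.  Names here are made up. */
typedef struct {
  int padding_left, padding_right, padding_top, padding_bottom;
} Align;

typedef struct {
  int alloc_width, alloc_height;  /* what the surface is allocated at */
  int crop_x, crop_y;             /* where the visible pixels start   */
  int crop_width, crop_height;    /* visible (caps) resolution        */
} PoolLayout;

static PoolLayout
compute_layout (int caps_width, int caps_height, const Align *a)
{
  PoolLayout l;

  /* widen/heighten the allocation by the requested paddings */
  l.alloc_width  = caps_width  + a->padding_left + a->padding_right;
  l.alloc_height = caps_height + a->padding_top  + a->padding_bottom;
  /* only left/top paddings shift the visible origin */
  l.crop_x = a->padding_left > 0 ? a->padding_left : 0;
  l.crop_y = a->padding_top  > 0 ? a->padding_top  : 0;
  l.crop_width  = caps_width;
  l.crop_height = caps_height;
  return l;
}
```

With right/bottom-only padding (the common case for decoder surface alignment), the crop origin stays at (0, 0) and only the allocation grows, which is why the diff only stores `crop_left`/`crop_top` when those paddings are positive.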
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvapool.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvapool.h
Changed
@@ -28,6 +28,16 @@ G_DECLARE_FINAL_TYPE (GstVaPool, gst_va_pool, GST, VA_POOL, GstBufferPool) GstBufferPool * gst_va_pool_new (void); +gboolean gst_va_pool_requires_video_meta (GstBufferPool * pool); void gst_buffer_pool_config_set_va_allocation_params (GstStructure * config, guint usage_hint); + +GstBufferPool * gst_va_pool_new_with_config (GstCaps * caps, + guint size, + guint min_buffers, + guint max_buffers, + guint usage_hint, + GstAllocator * allocator, + GstAllocationParams * alloc_params); + G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvaprofile.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvaprofile.c
Changed
@@ -71,10 +71,10 @@ /* "profile = (string) { multiview-high, stereo-high }"), */ P (HEVC, Main, "video/x-h265", "profile = (string) main"), P (HEVC, Main10, "video/x-h265", "profile = (string) main-10"), - P (VP9, Profile0, "video/x-vp9", "profile = (string) profile0"), - P (VP9, Profile1, "video/x-vp9", "profile = (string) profile1"), - P (VP9, Profile2, "video/x-vp9", "profile = (string) profile2"), - P (VP9, Profile3, "video/x-vp9", "profile = (string) profile3"), + P (VP9, Profile0, "video/x-vp9", "profile = (string) 0"), + P (VP9, Profile1, "video/x-vp9", "profile = (string) 1"), + P (VP9, Profile2, "video/x-vp9", "profile = (string) 2"), + P (VP9, Profile3, "video/x-vp9", "profile = (string) 3"), P (HEVC, Main12, "video/x-h265", "profile = (string) main-12"), P (HEVC, Main422_10, "video/x-h265", "profile = (string) main-422-10"), P (HEVC, Main422_12, "video/x-h265", "profile = (string) main-422-12"), @@ -87,8 +87,24 @@ P (HEVC, SccMain444, "video/x-h265", "profile = (string) screen-extended-main-444"), #if VA_CHECK_VERSION(1,7,0) - P (AV1, Profile0, "video/x-av1", NULL), - P (AV1, Profile1, "video/x-av1", NULL), + /* Spec A.2: + "Main" compliant decoders must be able to decode streams with + seq_profile equal to 0. + "High" compliant decoders must be able to decode streams with + seq_profile less than or equal to 1. + "Professional" compliant decoders must be able to decode streams + with seq_profile less than or equal to 2. + + The correct relationship between profile "main" "high" "professional" + and seq_profile "0" "1" "2" should be: + main <------> { 0 } + high <------> { main, 1 } + professional <------> { high, 2 } + + So far, all va decoders can support "0" when they support "1", + we just map "0" to "main" and "1" to "high". */ + P (AV1, Profile0, "video/x-av1", "profile = (string) main"), + P (AV1, Profile1, "video/x-av1", "profile = (string) high"), #endif #if VA_CHECK_VERSION(1, 8, 0) P (HEVC, SccMain444_10, "video/x-h265",
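The long comment added to gstvaprofile.c above maps AV1 `seq_profile` values to caps profile strings: 0 → "main", 1 → "high", and (per spec section A.2) 2 would be "professional". A toy lookup with the same mapping, purely for illustration:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Map an AV1 seq_profile value to the caps profile string used in the
 * diff above.  Returns NULL for out-of-range values. */
static const char *
av1_seq_profile_to_string (int seq_profile)
{
  switch (seq_profile) {
    case 0:  return "main";
    case 1:  return "high";
    case 2:  return "professional";
    default: return NULL;
  }
}
```

The containment relationship in the spec (a "high" decoder also handles seq_profile 0, a "professional" one also handles 1) is why mapping each VAProfile to a single string is safe for the decoders the comment describes.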
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvasurfacecopy.c
Added
@@ -0,0 +1,157 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstvasurfacecopy.h" + +#include "gstvaallocator.h" +#include "gstvadisplay_priv.h" +#include "gstvafilter.h" +#include "vasurfaceimage.h" + +#define GST_CAT_DEFAULT gst_va_memory_debug +GST_DEBUG_CATEGORY_EXTERN (gst_va_memory_debug); + +struct _GstVaSurfaceCopy +{ + GstVaDisplay *display; + + GstVideoInfo info; + gboolean has_copy; + + GRecMutex lock; + GstVaFilter *filter; +}; + +static gboolean +_has_copy (GstVaDisplay * display) +{ +#if VA_CHECK_VERSION (1, 12, 0) + VADisplay dpy; + VADisplayAttribute attr = { + .type = VADisplayAttribCopy, + .flags = VA_DISPLAY_ATTRIB_GETTABLE, + }; + VAStatus status; + + dpy = gst_va_display_get_va_dpy (display); + + gst_va_display_lock (display); + status = vaGetDisplayAttributes (dpy, &attr, 1); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_INFO ("vaGetDisplayAttribures: %s", vaErrorStr (status)); + return FALSE; + } + + return TRUE; +#else + return FALSE; +#endif +} + +GstVaSurfaceCopy * +gst_va_surface_copy_new (GstVaDisplay * display, GstVideoInfo * vinfo) +{ + GstVaSurfaceCopy *self; + + g_return_val_if_fail 
(GST_IS_VA_DISPLAY (display), NULL); + g_return_val_if_fail (vinfo != NULL, NULL); + + self = g_slice_new (GstVaSurfaceCopy); + self->display = gst_object_ref (display); + self->has_copy = _has_copy (display); + self->info = *vinfo; + self->filter = NULL; + g_rec_mutex_init (&self->lock); + + if (gst_va_display_has_vpp (display)) { + self->filter = gst_va_filter_new (display); + if (!(gst_va_filter_open (self->filter) + && gst_va_filter_set_video_info (self->filter, vinfo, vinfo))) + gst_clear_object (&self->filter); + } + + return self; +} + +void +gst_va_surface_copy_free (GstVaSurfaceCopy * self) +{ + g_return_if_fail (self && GST_IS_VA_DISPLAY (self->display)); + + gst_clear_object (&self->display); + if (self->filter) { + gst_va_filter_close (self->filter); + gst_clear_object (&self->filter); + } + + g_rec_mutex_clear (&self->lock); + + g_slice_free (GstVaSurfaceCopy, self); +} + +static gboolean +_vpp_copy_surface (GstVaSurfaceCopy * self, VASurfaceID dst, VASurfaceID src) +{ + gboolean ret; + + GstVaSample gst_src = { + .surface = src, + }; + GstVaSample gst_dst = { + .surface = dst, + }; + + g_rec_mutex_lock (&self->lock); + ret = gst_va_filter_process (self->filter, &gst_src, &gst_dst); + g_rec_mutex_unlock (&self->lock); + + return ret; +} + +gboolean +gst_va_surface_copy (GstVaSurfaceCopy * self, VASurfaceID dst, VASurfaceID src) +{ + VAImage image = {.image_id = VA_INVALID_ID, }; + gboolean ret; + + g_return_val_if_fail (self && GST_IS_VA_DISPLAY (self->display), FALSE); + + if (self->has_copy && va_copy_surface (self->display, dst, src)) { + GST_LOG ("GPU copy of %#x to %#x", src, dst); + return TRUE; + } + + if (self->filter && _vpp_copy_surface (self, dst, src)) { + GST_LOG ("VPP copy of %#x to %#x", src, dst); + return TRUE; + } + + if (!va_ensure_image (self->display, src, &self->info, &image, FALSE)) + return FALSE; + + if ((ret = va_put_image (self->display, dst, &image))) + GST_LOG ("shallow copy of %#x to %#x", src, dst); + + va_unmap_buffer 
(self->display, image.buf); + va_destroy_image (self->display, image.image_id); + + return ret; +}
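`gst_va_surface_copy()` in the new gstvasurfacecopy.c tries three strategies in order: the driver's GPU copy (`vaCopy`, when `VADisplayAttribCopy` is available), the VPP post-processor, and finally a CPU round-trip through `vaPutImage`. A generic sketch of that fallback cascade, with made-up names standing in for the VA calls:

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* Generic fallback cascade: try each copy strategy in preference
 * order, skipping ones that are unavailable (NULL), and report which
 * one succeeded.  All names here are illustrative. */
typedef bool (*CopyFunc) (int dst, int src);

typedef enum { COPY_GPU, COPY_VPP, COPY_IMAGE, COPY_FAILED } CopyMethod;

static CopyMethod
copy_with_fallbacks (int dst, int src,
    CopyFunc gpu_copy, CopyFunc vpp_copy, CopyFunc image_copy)
{
  if (gpu_copy && gpu_copy (dst, src))
    return COPY_GPU;
  if (vpp_copy && vpp_copy (dst, src))
    return COPY_VPP;
  if (image_copy && image_copy (dst, src))
    return COPY_IMAGE;
  return COPY_FAILED;
}

/* Stub strategies for demonstration. */
static bool always_fail (int dst, int src) { (void) dst; (void) src; return false; }
static bool always_ok   (int dst, int src) { (void) dst; (void) src; return true; }
```

Ordering the strategies from cheapest (pure GPU) to most expensive (CPU image copy), and falling through on failure rather than erroring out, matches the structure of the real function.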
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvasurfacecopy.h
Added
@@ -0,0 +1,43 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/va/gstvadisplay.h> +#include <gst/video/video.h> +#include <va/va.h> + +G_BEGIN_DECLS + +/** + * Opaque object helper for copying surfaces. + * + * Its purpose is to avoid circular dependencies. + */ +typedef struct _GstVaSurfaceCopy GstVaSurfaceCopy; + +GstVaSurfaceCopy * gst_va_surface_copy_new (GstVaDisplay * display, + GstVideoInfo * vinfo); +void gst_va_surface_copy_free (GstVaSurfaceCopy * self); +gboolean gst_va_surface_copy (GstVaSurfaceCopy * self, + VASurfaceID dst, + VASurfaceID src); + +G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvautils.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvautils.c
Changed
@@ -22,9 +22,9 @@ #include "config.h" #endif -#include "gstvadisplay_drm.h" -#include "gstvadisplay_wrapped.h" #include "gstvautils.h" +#include <gst/va/gstvadisplay_drm.h> +#include <gst/va/gstvadisplay_wrapped.h> GST_DEBUG_CATEGORY_STATIC (GST_CAT_CONTEXT); @@ -158,13 +158,13 @@ GstMessage *msg; if (!display) { - GST_ERROR_OBJECT (element, "Could not get GL display connection"); + GST_ERROR_OBJECT (element, "Could not get VA display connection"); return; } _init_context_debug (); - ctxt = gst_context_new ("gst.va.display.handle", TRUE); + ctxt = gst_context_new (GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR, TRUE); gst_context_set_va_display (ctxt, display); GST_CAT_INFO_OBJECT (GST_CAT_CONTEXT, element, @@ -186,24 +186,26 @@ /* 1) Check if the element already has a context of the specific * type. */ - if (gst_va_display_found (element, *display_ptr)) + if (gst_va_display_found (element, g_atomic_pointer_get (display_ptr))) goto done; - _gst_context_query (element, "gst.va.display.handle"); + _gst_context_query (element, GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR); /* Neighbour found and it updated the display */ - if (gst_va_display_found (element, *display_ptr)) + if (gst_va_display_found (element, g_atomic_pointer_get (display_ptr))) goto done; /* If no neighbor, or application not interested, use drm */ display = gst_va_display_drm_new_from_path (render_device_path); - *display_ptr = display; + gst_object_replace ((GstObject **) display_ptr, (GstObject *) display); gst_va_element_propagate_display_context (element, display); + gst_clear_object (&display); + done: - return *display_ptr != NULL; + return g_atomic_pointer_get (display_ptr) != NULL; } gboolean @@ -220,7 +222,7 @@ context_type = gst_context_get_context_type (context); - if (g_strcmp0 (context_type, "gst.va.display.handle") == 0) { + if (g_strcmp0 (context_type, GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR) == 0) { type_name = G_OBJECT_TYPE_NAME (element); if (!gst_context_get_va_display (context, type_name, 
render_device_path, &display_replacement)) { @@ -231,11 +233,9 @@ } if (display_replacement) { - GstVaDisplay *old = *display_ptr; - *display_ptr = display_replacement; - - if (old) - gst_object_unref (old); + gst_object_replace ((GstObject **) display_ptr, + (GstObject *) display_replacement); + gst_object_unref (display_replacement); } return TRUE; @@ -258,7 +258,8 @@ "handle context query %" GST_PTR_FORMAT, query); gst_query_parse_context_type (query, &context_type); - if (!display || g_strcmp0 (context_type, "gst.va.display.handle") != 0) + if (!display + || g_strcmp0 (context_type, GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR) != 0) return FALSE; gst_query_parse_context (query, &old_ctxt); @@ -266,7 +267,7 @@ if (old_ctxt) ctxt = gst_context_copy (old_ctxt); else - ctxt = gst_context_new ("gst.va.display.handle", TRUE); + ctxt = gst_context_new (GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR, TRUE); gst_context_set_va_display (ctxt, display); gst_query_set_context (query, ctxt); @@ -292,7 +293,7 @@ is_devnode = (g_strstr_len (type_name, -1, "renderD") != NULL); s = gst_context_get_structure (context); - if (gst_structure_get (s, "gst-display", GST_TYPE_VA_DISPLAY, &display, NULL)) { + if (gst_structure_get (s, "gst-display", GST_TYPE_OBJECT, &display, NULL)) { gchar *device_path = NULL; gboolean ret; @@ -302,7 +303,7 @@ g_free (device_path); if (ret) goto accept; - } else if (!is_devnode) { + } else if (GST_IS_VA_DISPLAY (display) && !is_devnode) { goto accept; } @@ -314,7 +315,7 @@ * VADisplay from users */ if (!is_devnode && gst_structure_get (s, "va-display", G_TYPE_POINTER, &dpy, NULL)) { - if ((display = gst_va_display_wrapped_new ((guintptr) dpy))) + if ((display = gst_va_display_wrapped_new (dpy))) goto accept; } @@ -339,11 +340,12 @@ g_return_if_fail (context != NULL); - if (display) + if (display) { GST_CAT_LOG (GST_CAT_CONTEXT, "setting GstVaDisplay (%" GST_PTR_FORMAT ") on context (%" GST_PTR_FORMAT ")", display, context); + } s = gst_context_writable_structure 
(context); - gst_structure_set (s, "gst-display", GST_TYPE_VA_DISPLAY, display, NULL); + gst_structure_set (s, "gst-display", GST_TYPE_OBJECT, display, NULL); }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvautils.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvautils.h
Changed
@@ -20,29 +20,28 @@ #pragma once -#include "gstvadisplay.h" +#include <gst/va/gstvadisplay.h> G_BEGIN_DECLS gboolean gst_va_ensure_element_data (gpointer element, - const gchar *render_device_path, - GstVaDisplay ** display_ptr); + const gchar *render_device_path, + GstVaDisplay ** display_ptr); gboolean gst_va_handle_set_context (GstElement * element, - GstContext * context, - const gchar *render_device_path, - GstVaDisplay ** display_ptr); + GstContext * context, + const gchar *render_device_path, + GstVaDisplay ** display_ptr); gboolean gst_va_handle_context_query (GstElement * element, - GstQuery * query, - GstVaDisplay * display); + GstQuery * query, + GstVaDisplay * display); void gst_va_element_propagate_display_context (GstElement * element, - GstVaDisplay * display); + GstVaDisplay * display); gboolean gst_context_get_va_display (GstContext * context, - const gchar * type_name, - const gchar * render_device_path, - GstVaDisplay ** display_ptr); + const gchar * type_name, + const gchar * render_device_path, + GstVaDisplay ** display_ptr); void gst_context_set_va_display (GstContext * context, - GstVaDisplay * display); - + GstVaDisplay * display); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvavideoformat.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavideoformat.c
Changed
@@ -24,11 +24,12 @@ #include "gstvavideoformat.h" +GST_DEBUG_CATEGORY_STATIC (gstva_debug); + #define VA_NSB_FIRST 0 /* No Significant Bit */ -/* XXX(victor): add RGB fourcc runtime checkups for screwed drivers */ /* *INDENT-OFF* */ -static const struct FormatMap +static struct FormatMap { GstVideoFormat format; guint va_rtformat; @@ -46,15 +47,20 @@ G (VUYA, ('A', 'Y', 'U', 'V'), YUV444, LSB, 32), F (RGBA, ('R', 'G', 'B', 'A'), RGB32, LSB, 32, 32, 0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000), - /* F (????, RGBX), */ + F (RGBx, ('R', 'G', 'B', 'X'), RGB32, LSB, 32, 24, 0x000000ff, + 0x0000ff00, 0x00ff0000, 0x00000000), F (BGRA, ('B', 'G', 'R', 'A'), RGB32, LSB, 32, 32, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000), F (ARGB, ('A', 'R', 'G', 'B'), RGB32, LSB, 32, 32, 0x0000ff00, 0x00ff0000, 0xff000000, 0x000000ff), - /* F (????, XRGB), */ + F (xRGB, ('X', 'R', 'G', 'B'), RGB32, LSB, 32, 24, 0x0000ff00, + 0x00ff0000, 0xff000000, 0x00000000), F (ABGR, ('A', 'B', 'G', 'R'), RGB32, LSB, 32, 32, 0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff), - /* F (????, XBGR), */ + F (xBGR, ('X', 'B', 'G', 'R'), RGB32, LSB, 32, 24, 0xff000000, + 0x00ff0000, 0x0000ff00, 0x00000000), + F (BGRx, ('B', 'G', 'R', 'X'), RGB32, LSB, 32, 24, 0x00ff0000, + 0x0000ff00, 0x000000ff, 0x00000000), G (UYVY, ('U', 'Y', 'V', 'Y'), YUV422, NSB, 16), G (YUY2, ('Y', 'U', 'Y', '2'), YUV422, NSB, 16), G (AYUV, ('A', 'Y', 'U', 'V'), YUV444, LSB, 32), @@ -78,9 +84,12 @@ G (Y210, ('Y', '2', '1', '0'), YUV422_10, NSB, 32), /* F (????, Y216), */ G (Y410, ('Y', '4', '1', '0'), YUV444_10, NSB, 32), + G (Y212_LE, ('Y', '2', '1', '2'), YUV422_12, NSB, 32), + G (Y412_LE, ('Y', '4', '1', '2'), YUV444_12, NSB, 32), /* F (????, Y416), */ /* F (????, YV16), */ G (P010_10LE, ('P', '0', '1', '0'), YUV420_10, NSB, 24), + G (P012_LE, ('P', '0', '1', '2'), YUV420_12, NSB, 24), /* F (P016_LE, P016, ????), */ /* F (????, I010), */ /* F (????, IYUV), */ @@ -89,14 +98,67 @@ /* F (????, X2R10G10B10), */ /* F (????, 
X2B10G10R10), */ G (GRAY8, ('Y', '8', '0', '0'), YUV400, NSB, 8), + G (Y444, ('4', '4', '4', 'P'), YUV444, NSB, 24), /* F (????, Y16), */ /* G (VYUY, VYUY, YUV422), */ /* G (YVYU, YVYU, YUV422), */ /* F (ARGB64, ARGB64, ????), */ /* F (????, ABGR64), */ + F (RGB16, ('R', 'G', '1', '6'), RGB16, NSB, 16, 16, 0x0000f800, + 0x000007e0, 0x0000001f, 0x00000000), + F (RGB, ('R', 'G', '2', '4'), RGB32, NSB, 32, 24, 0x00ff0000, + 0x0000ff00, 0x000000ff, 0x00000000), + F (BGR10A2_LE, ('A', 'R', '3', '0'), RGB32, LSB, 32, 30, 0x3ff00000, + 0x000ffc00, 0x000003ff, 0x30000000), #undef F #undef G }; + +static const struct RBG32FormatMap +{ + GstVideoFormat format; + VAImageFormat va_format[2]; +} rgb32_format_map[] = { +#define F(fourcc, order, bpp, depth, r, g, b, a) \ + { VA_FOURCC fourcc, G_PASTE (G_PASTE (VA_, order), _FIRST), bpp, depth, r, g, b, a } +#define A(fourcc, order, r, g, b, a) F (fourcc, order, 32, 32, r, g, b, a) +#define X(fourcc, order, r, g, b) F (fourcc, order, 32, 24, r, g, b, 0x0) + { GST_VIDEO_FORMAT_ARGB, { + A (('B', 'G', 'R', 'A'), LSB, 0x0000ff00, 0x00ff0000, 0xff000000, 0x000000ff), + A (('A', 'R', 'G', 'B'), MSB, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000), + } }, + { GST_VIDEO_FORMAT_RGBA, { + A (('A', 'B', 'G', 'R'), LSB, 0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000), + A (('R', 'G', 'B', 'A'), MSB, 0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff), + } }, + { GST_VIDEO_FORMAT_ABGR, { + A (('R', 'G', 'B', 'A'), LSB, 0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff), + A (('A', 'B', 'G', 'R'), MSB, 0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000), + } }, + { GST_VIDEO_FORMAT_BGRA, { + A (('A', 'R', 'G', 'B'), LSB, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000), + A (('B', 'G', 'R', 'A'), MSB, 0x0000ff00, 0x00ff0000, 0xff000000, 0x000000ff), + } }, + { GST_VIDEO_FORMAT_xRGB, { + X (('B', 'G', 'R', 'X'), LSB, 0x0000ff00, 0x00ff0000, 0xff000000), + X (('X', 'R', 'G', 'B'), MSB, 0x00ff0000, 0x0000ff00, 0x000000ff), + } }, + { GST_VIDEO_FORMAT_RGBx, { 
+ X (('X', 'B', 'G', 'R'), LSB, 0x000000ff, 0x0000ff00, 0x00ff0000), + X (('R', 'G', 'B', 'X'), MSB, 0xff000000, 0x00ff0000, 0x0000ff00), + } }, + { GST_VIDEO_FORMAT_xBGR, { + X (('R', 'G', 'B', 'X'), LSB, 0xff000000, 0x00ff0000, 0x0000ff00), + X (('X', 'B', 'G', 'R'), MSB, 0x000000ff, 0x0000ff00, 0x00ff0000), + } }, + { GST_VIDEO_FORMAT_BGRx, { + X (('X', 'R', 'G', 'B'), LSB, 0x00ff0000, 0x0000ff00, 0x000000ff), + X (('B', 'G', 'R', 'X'), MSB, 0x0000ff00, 0x00ff0000, 0xff000000), + } }, +#undef X +#undef A +#undef F +}; /* *INDENT-ON* */ static const struct FormatMap * @@ -112,7 +174,7 @@ return NULL; } -static const struct FormatMap * +static struct FormatMap * get_format_map_from_video_format (GstVideoFormat format) { int i; @@ -210,7 +272,7 @@ GArray * surface_formats) { GstVideoFormat surface_format; - guint i, image_chroma, surface_chroma; + guint i, image_chroma; if (image_format == GST_VIDEO_FORMAT_UNKNOWN) return GST_VIDEO_FORMAT_UNKNOWN; @@ -227,12 +289,102 @@ if (surface_format == image_format) return surface_format; + } + + return GST_VIDEO_FORMAT_UNKNOWN; +} - surface_chroma = gst_va_chroma_from_video_format (surface_format); +static GstVideoFormat +find_gst_video_format_in_rgb32_map (VAImageFormat * image_format) +{ + guint i, j; - if (surface_chroma == image_chroma) - return surface_format; + for (i = 0; i < G_N_ELEMENTS (rgb32_format_map); i++) { + for (j = 0; j < G_N_ELEMENTS (rgb32_format_map[i].va_format); j++) { + if (va_format_is_same (&rgb32_format_map[i].va_format[j], image_format)) + return rgb32_format_map[i].format; + } } return GST_VIDEO_FORMAT_UNKNOWN; } + +struct ImageFormatArray +{ + VAImageFormat *image_formats; + gint len; +}; + +static gpointer +fix_map (gpointer data) +{ + struct ImageFormatArray *args = data; + GstVideoFormat format; + VAImageFormat *image_format; + struct FormatMap *map; + guint i; + + GST_DEBUG_CATEGORY_GET (gstva_debug, "va"); + + for (i = 0; i < args->len; i++) { + image_format = &args->image_formats[i]; + if 
(!va_format_is_rgb (image_format)) + continue; + format = find_gst_video_format_in_rgb32_map (image_format); + if (format == GST_VIDEO_FORMAT_UNKNOWN) + continue; + map = get_format_map_from_video_format (format); + if (!map) + continue; + if (va_format_is_same (&map->va_format, image_format)) + continue; + + map->va_format = *image_format; + + GST_CAT_INFO (gstva_debug, "GST_VIDEO_FORMAT_%s => { fourcc %" + GST_FOURCC_FORMAT ", %s, bpp %d, depth %d, R %#010x, G %#010x, " + "B %#010x, A %#010x }", gst_video_format_to_string (map->format), + GST_FOURCC_ARGS (map->va_format.fourcc), + (map->va_format.byte_order == 1) ? "LSB" : "MSB", + map->va_format.bits_per_pixel, map->va_format.depth, + map->va_format.red_mask, map->va_format.green_mask, + map->va_format.blue_mask, map->va_format.alpha_mask); + } + + return NULL; +} + +/* XXX: RGB32 LSB VAImageFormats don't map statically with GStreamer + * color formats. Each driver does what they want. + * + * For MSB, there is no ambiguity: same order in define, memory and + * CPU. For example, + * + * RGBA is RGBA in memory and RGBA with channel mask R:0xFF0000 + * G:0x00FF0000 B:0x0000FF00 A:0x000000FF in CPU. + * + * For LSB, CPU's perspective and memory's perspective are + * different. For example, + * + * From CPU's perspective, it's RGBA order in memory, but when it is + * stored in memory, because CPU's little endianness, it will be + * re-ordered, with mask R:0x000000FF G:0x0000FF00 B:0x00FF0000 + * A:0xFF000000. + * + * In other words, from memory's perspective, RGBA LSB is equal as + * ABGR MSB. + * + * These definitions are mixed used all over the media system and we + * need to correct the mapping form VA video format to GStreamer + * video format in both manners. 
+ * + * https://gitlab.freedesktop.org/gstreamer/gstreamer-vaapi/-/merge_requests/123 + */ +void +gst_va_video_format_fix_map (VAImageFormat * image_formats, gint num) +{ + static GOnce once = G_ONCE_INIT; + struct ImageFormatArray args = { image_formats, num }; + + g_once (&once, fix_map, &args); +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/gstvavideoformat.h -> gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavideoformat.h
Changed
@@ -32,5 +32,7 @@ GstVideoFormat gst_va_video_format_from_va_image_format (const VAImageFormat * va_format); GstVideoFormat gst_va_video_surface_format_from_image_format (GstVideoFormat image_format, GArray * surface_formats); +void gst_va_video_format_fix_map (VAImageFormat * image_formats, + gint num); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavp8dec.c
Added
@@ -0,0 +1,605 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vavp8dec + * @title: vavp8dec + * @short_description: A VA-API based VP8 video decoder + * + * vavp8dec decodes VP8 bitstreams to VA surfaces using the + * installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver. + * + * The decoding surfaces can be mapped onto main memory as video + * frames. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.webm ! parsebin ! vavp8dec ! 
autovideosink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvavp8dec.h" + +#include "gstvabasedec.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_vp8dec_debug); +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT gst_va_vp8dec_debug +#else +#define GST_CAT_DEFAULT NULL +#endif + +#define GST_VA_VP8_DEC(obj) ((GstVaVp8Dec *) obj) +#define GST_VA_VP8_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaVp8DecClass)) +#define GST_VA_VP8_DEC_CLASS(klass) ((GstVaVp8DecClass *) klass) + +typedef struct _GstVaVp8Dec GstVaVp8Dec; +typedef struct _GstVaVp8DecClass GstVaVp8DecClass; + +struct _GstVaVp8DecClass +{ + GstVaBaseDecClass parent_class; +}; + +struct _GstVaVp8Dec +{ + GstVaBaseDec parent; + + GstFlowReturn last_ret; +}; + +static GstElementClass *parent_class = NULL; + +/* *INDENT-OFF* */ +static const gchar *src_caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12 }") " ;" + GST_VIDEO_CAPS_MAKE ("{ NV12 }"); +/* *INDENT-ON* */ + +static const gchar *sink_caps_str = "video/x-vp8"; + +static gboolean +gst_va_vp8_dec_negotiate (GstVideoDecoder * decoder) +{ + GstCapsFeatures *capsfeatures = NULL; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp8Dec *self = GST_VA_VP8_DEC (decoder); + GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN; + GstVp8Decoder *vp8dec = GST_VP8_DECODER (decoder); + + /* Ignore downstream renegotiation request. 
*/ + if (!base->need_negotiation) + return TRUE; + + base->need_negotiation = FALSE; + + if (gst_va_decoder_is_open (base->decoder) + && !gst_va_decoder_close (base->decoder)) + return FALSE; + + if (!gst_va_decoder_open (base->decoder, base->profile, base->rt_format)) + return FALSE; + + if (!gst_va_decoder_set_frame_size (base->decoder, base->width, base->height)) + return FALSE; + + if (base->output_state) + gst_video_codec_state_unref (base->output_state); + + gst_va_base_dec_get_preferred_format_and_caps_features (base, &format, + &capsfeatures); + + base->output_state = + gst_video_decoder_set_output_state (decoder, format, + base->width, base->height, vp8dec->input_state); + + base->output_state->caps = gst_video_info_to_caps (&base->output_state->info); + if (capsfeatures) + gst_caps_set_features_simple (base->output_state->caps, capsfeatures); + + GST_INFO_OBJECT (self, "Negotiated caps %" GST_PTR_FORMAT, + base->output_state->caps); + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static VAProfile +_get_profile (GstVaVp8Dec * self, const GstVp8FrameHdr * frame_hdr) +{ + + if (frame_hdr->version > 3) { + GST_ERROR_OBJECT (self, "Unsupported vp8 version: %d", frame_hdr->version); + return VAProfileNone; + } + + return VAProfileVP8Version0_3; +} + +static GstFlowReturn +gst_va_vp8_dec_new_sequence (GstVp8Decoder * decoder, + const GstVp8FrameHdr * frame_hdr) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp8Dec *self = GST_VA_VP8_DEC (decoder); + VAProfile profile; + guint rt_format; + gboolean negotiation_needed = FALSE; + + GST_LOG_OBJECT (self, "new sequence"); + + profile = _get_profile (self, frame_hdr); + if (profile == VAProfileNone) + return GST_FLOW_NOT_NEGOTIATED; + + if (!gst_va_decoder_has_profile (base->decoder, profile)) { + GST_ERROR_OBJECT (self, "Profile %s is not supported", + gst_va_profile_name (profile)); + return GST_FLOW_NOT_NEGOTIATED; + } + + /* VP8 always use 8 bits 4:2:0 */ + rt_format = 
VA_RT_FORMAT_YUV420; + + if (!gst_va_decoder_config_is_equal (base->decoder, profile, + rt_format, frame_hdr->width, frame_hdr->height)) { + base->profile = profile; + base->width = frame_hdr->width; + base->height = frame_hdr->height; + base->rt_format = rt_format; + negotiation_needed = TRUE; + } + + base->min_buffers = 3 + 4; /* max num pic references + scratch surfaces */ + + base->need_negotiation = negotiation_needed; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_vp8_dec_new_picture (GstVp8Decoder * decoder, + GstVideoCodecFrame * frame, GstVp8Picture * picture) +{ + GstVaVp8Dec *self = GST_VA_VP8_DEC (decoder); + GstVaDecodePicture *pic; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + if (base->need_negotiation) { + if (!gst_video_decoder_negotiate (vdec)) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + self->last_ret = gst_video_decoder_allocate_output_frame (vdec, frame); + if (self->last_ret != GST_FLOW_OK) + goto error; + + pic = gst_va_decode_picture_new (base->decoder, frame->output_buffer); + + gst_vp8_picture_set_user_data (picture, pic, + (GDestroyNotify) gst_va_decode_picture_free); + + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, + gst_va_decode_picture_get_surface (pic)); + + return GST_FLOW_OK; + +error: + { + GST_WARNING_OBJECT (self, + "Failed to allocated output buffer, return %s", + gst_flow_get_name (self->last_ret)); + return self->last_ret; + } +} + +static gboolean +_fill_quant_matrix (GstVp8Decoder * decoder, GstVp8Picture * picture, + GstVp8Parser * parser) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVp8FrameHdr const *frame_hdr = &picture->frame_hdr; + GstVp8Segmentation *const seg = &parser->segmentation; + VAIQMatrixBufferVP8 iq_matrix = { }; + const gint8 QI_MAX = 127; + gint16 qi, qi_base; + gint i; + + /* Fill in VAIQMatrixBufferVP8 */ + for (i = 0; i < 4; i++) { 
+ if (seg->segmentation_enabled) { + qi_base = seg->quantizer_update_value[i]; + if (!seg->segment_feature_mode) /* 0 means delta update */ + qi_base += frame_hdr->quant_indices.y_ac_qi; + } else + qi_base = frame_hdr->quant_indices.y_ac_qi; + + qi = qi_base; + iq_matrix.quantization_index[i][0] = CLAMP (qi, 0, QI_MAX); + qi = qi_base + frame_hdr->quant_indices.y_dc_delta; + iq_matrix.quantization_index[i][1] = CLAMP (qi, 0, QI_MAX); + qi = qi_base + frame_hdr->quant_indices.y2_dc_delta; + iq_matrix.quantization_index[i][2] = CLAMP (qi, 0, QI_MAX); + qi = qi_base + frame_hdr->quant_indices.y2_ac_delta; + iq_matrix.quantization_index[i][3] = CLAMP (qi, 0, QI_MAX); + qi = qi_base + frame_hdr->quant_indices.uv_dc_delta; + iq_matrix.quantization_index[i][4] = CLAMP (qi, 0, QI_MAX); + qi = qi_base + frame_hdr->quant_indices.uv_ac_delta; + iq_matrix.quantization_index[i][5] = CLAMP (qi, 0, QI_MAX); + } + + return gst_va_decoder_add_param_buffer (base->decoder, + gst_vp8_picture_get_user_data (picture), VAIQMatrixBufferType, &iq_matrix, + sizeof (iq_matrix)); +} + +static gboolean +_fill_probability_table (GstVp8Decoder * decoder, GstVp8Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVp8FrameHdr const *frame_hdr = &picture->frame_hdr; + VAProbabilityDataBufferVP8 prob_table = { }; + + /* Fill in VAProbabilityDataBufferVP8 */ + memcpy (prob_table.dct_coeff_probs, frame_hdr->token_probs.prob, + sizeof (frame_hdr->token_probs.prob)); + + return gst_va_decoder_add_param_buffer (base->decoder, + gst_vp8_picture_get_user_data (picture), VAProbabilityBufferType, + &prob_table, sizeof (prob_table)); +} + +static gboolean +_fill_picture (GstVp8Decoder * decoder, GstVp8Picture * picture, + GstVp8Parser * parser) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *va_pic; + VAPictureParameterBufferVP8 pic_param; + GstVp8FrameHdr const *frame_hdr = &picture->frame_hdr; + GstVp8Segmentation *const seg = &parser->segmentation; + guint 
i; + + if (!_fill_quant_matrix (decoder, picture, parser)) + return FALSE; + + if (!_fill_probability_table (decoder, picture)) + return FALSE; + + /* *INDENT-OFF* */ + pic_param = (VAPictureParameterBufferVP8) { + .frame_width = base->width, + .frame_height = base->height, + .last_ref_frame = VA_INVALID_SURFACE, + .golden_ref_frame = VA_INVALID_SURFACE, + .alt_ref_frame = VA_INVALID_SURFACE, + .out_of_loop_frame = VA_INVALID_SURFACE, // not used currently + .pic_fields.bits.key_frame = !frame_hdr->key_frame, + .pic_fields.bits.version = frame_hdr->version, + .pic_fields.bits.segmentation_enabled = seg->segmentation_enabled, + .pic_fields.bits.update_mb_segmentation_map = + seg->update_mb_segmentation_map, + .pic_fields.bits.update_segment_feature_data = + seg->update_segment_feature_data, + .pic_fields.bits.filter_type = frame_hdr->filter_type, + .pic_fields.bits.sharpness_level = frame_hdr->sharpness_level, + .pic_fields.bits.loop_filter_adj_enable = + parser->mb_lf_adjust.loop_filter_adj_enable, + .pic_fields.bits.mode_ref_lf_delta_update = + parser->mb_lf_adjust.mode_ref_lf_delta_update, + .pic_fields.bits.sign_bias_golden = frame_hdr->sign_bias_golden, + .pic_fields.bits.sign_bias_alternate = frame_hdr->sign_bias_alternate, + .pic_fields.bits.mb_no_coeff_skip = frame_hdr->mb_no_skip_coeff, + /* In decoding, the only loop filter settings that matter are those + in the frame header (9.1) */ + .pic_fields.bits.loop_filter_disable = frame_hdr->loop_filter_level == 0, + .prob_skip_false = frame_hdr->prob_skip_false, + .prob_intra = frame_hdr->prob_intra, + .prob_last = frame_hdr->prob_last, + .prob_gf = frame_hdr->prob_gf, + .bool_coder_ctx.range = frame_hdr->rd_range, + .bool_coder_ctx.value = frame_hdr->rd_value, + .bool_coder_ctx.count = frame_hdr->rd_count, + }; + /* *INDENT-ON* */ + + if (!frame_hdr->key_frame) { + if (decoder->last_picture) { + va_pic = gst_vp8_picture_get_user_data (decoder->last_picture); + pic_param.last_ref_frame = 
gst_va_decode_picture_get_surface (va_pic); + } + if (decoder->golden_ref_picture) { + va_pic = gst_vp8_picture_get_user_data (decoder->golden_ref_picture); + pic_param.golden_ref_frame = gst_va_decode_picture_get_surface (va_pic); + } + if (decoder->alt_ref_picture) { + va_pic = gst_vp8_picture_get_user_data (decoder->alt_ref_picture); + pic_param.alt_ref_frame = gst_va_decode_picture_get_surface (va_pic); + } + } + + for (i = 0; i < 3; i++) + pic_param.mb_segment_tree_probs[i] = seg->segment_prob[i]; + + for (i = 0; i < 4; i++) { + gint8 level; + if (seg->segmentation_enabled) { + level = seg->lf_update_value[i]; + /* 0 means delta update */ + if (!seg->segment_feature_mode) + level += frame_hdr->loop_filter_level; + } else + level = frame_hdr->loop_filter_level; + pic_param.loop_filter_level[i] = CLAMP (level, 0, 63); + + pic_param.loop_filter_deltas_ref_frame[i] = + parser->mb_lf_adjust.ref_frame_delta[i]; + pic_param.loop_filter_deltas_mode[i] = + parser->mb_lf_adjust.mb_mode_delta[i]; + } + + memcpy (pic_param.y_mode_probs, frame_hdr->mode_probs.y_prob, + sizeof (frame_hdr->mode_probs.y_prob)); + memcpy (pic_param.uv_mode_probs, frame_hdr->mode_probs.uv_prob, + sizeof (frame_hdr->mode_probs.uv_prob)); + memcpy (pic_param.mv_probs, frame_hdr->mv_probs.prob, + sizeof (frame_hdr->mv_probs)); + + va_pic = gst_vp8_picture_get_user_data (picture); + return gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAPictureParameterBufferType, &pic_param, sizeof (pic_param)); +} + +static gboolean +_add_slice (GstVp8Decoder * decoder, GstVp8Picture * picture, + GstVp8Parser * parser) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVp8FrameHdr const *frame_hdr = &picture->frame_hdr; + VASliceParameterBufferVP8 slice_param; + GstVaDecodePicture *va_pic; + gint i; + + /* *INDENT-OFF* */ + slice_param = (VASliceParameterBufferVP8) { + .slice_data_size = picture->size, + .slice_data_offset = frame_hdr->data_chunk_size, + .macroblock_offset = 
frame_hdr->header_size, + .num_of_partitions = (1 << frame_hdr->log2_nbr_of_dct_partitions) + 1, + }; + /* *INDENT-ON* */ + + slice_param.partition_size[0] = + frame_hdr->first_part_size - ((slice_param.macroblock_offset + 7) >> 3); + for (i = 1; i < slice_param.num_of_partitions; i++) + slice_param.partition_size[i] = frame_hdr->partition_size[i - 1]; + for (; i < G_N_ELEMENTS (slice_param.partition_size); i++) + slice_param.partition_size[i] = 0; + + va_pic = gst_vp8_picture_get_user_data (picture); + return gst_va_decoder_add_slice_buffer (base->decoder, va_pic, &slice_param, + sizeof (slice_param), (gpointer) picture->data, picture->size); +} + +static gboolean +gst_va_vp8_dec_decode_picture (GstVp8Decoder * decoder, GstVp8Picture * picture, + GstVp8Parser * parser) +{ + if (_fill_picture (decoder, picture, parser) && + _add_slice (decoder, picture, parser)) + return GST_FLOW_OK; + + return GST_FLOW_ERROR; +} + +static GstFlowReturn +gst_va_vp8_dec_end_picture (GstVp8Decoder * decoder, GstVp8Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *va_pic; + + GST_LOG_OBJECT (base, "end picture %p, (system_frame_number %d)", + picture, picture->system_frame_number); + + va_pic = gst_vp8_picture_get_user_data (picture); + + if (!gst_va_decoder_decode (base->decoder, va_pic)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_vp8_dec_output_picture (GstVp8Decoder * decoder, + GstVideoCodecFrame * frame, GstVp8Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp8Dec *self = GST_VA_VP8_DEC (decoder); + + GST_LOG_OBJECT (self, + "Outputting picture %p (system_frame_number %d)", + picture, picture->system_frame_number); + + if (self->last_ret != GST_FLOW_OK) { + gst_vp8_picture_unref (picture); + gst_video_decoder_drop_frame (GST_VIDEO_DECODER (self), frame); + return self->last_ret; + } + + if (base->copy_frames) + gst_va_base_dec_copy_output_buffer (base, frame); + + 
gst_vp8_picture_unref (picture); + + return gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); +} + +static void +gst_va_vp8_dec_init (GTypeInstance * instance, gpointer g_class) +{ + gst_va_base_dec_init (GST_VA_BASE_DEC (instance), GST_CAT_DEFAULT); +} + +static void +gst_va_vp8_dec_dispose (GObject * object) +{ + gst_va_base_dec_close (GST_VIDEO_DECODER (object)); + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_vp8_dec_class_init (gpointer g_class, gpointer class_data) +{ + GstCaps *src_doc_caps, *sink_doc_caps; + GObjectClass *gobject_class = G_OBJECT_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); + GstVp8DecoderClass *vp8decoder_class = GST_VP8_DECODER_CLASS (g_class); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (g_class); + struct CData *cdata = class_data; + gchar *long_name; + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API VP8 Decoder in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API VP8 Decoder"); + } + + gst_element_class_set_metadata (element_class, long_name, + "Codec/Decoder/Video/Hardware", + "VA-API based VP8 video decoder", "He Junyan <junyan.he@intel.com>"); + + sink_doc_caps = gst_caps_from_string (sink_caps_str); + src_doc_caps = gst_caps_from_string (src_caps_str); + + parent_class = g_type_class_peek_parent (g_class); + + gst_va_base_dec_class_init (GST_VA_BASE_DEC_CLASS (g_class), VP8, + cdata->render_device_path, cdata->sink_caps, cdata->src_caps, + src_doc_caps, sink_doc_caps); + + gobject_class->dispose = gst_va_vp8_dec_dispose; + + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_va_vp8_dec_negotiate); + + vp8decoder_class->new_sequence = + GST_DEBUG_FUNCPTR (gst_va_vp8_dec_new_sequence); + vp8decoder_class->new_picture = + GST_DEBUG_FUNCPTR (gst_va_vp8_dec_new_picture); + vp8decoder_class->decode_picture = + GST_DEBUG_FUNCPTR (gst_va_vp8_dec_decode_picture); + vp8decoder_class->end_picture = + 
GST_DEBUG_FUNCPTR (gst_va_vp8_dec_end_picture); + vp8decoder_class->output_picture = + GST_DEBUG_FUNCPTR (gst_va_vp8_dec_output_picture); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + gst_caps_unref (cdata->src_caps); + gst_caps_unref (cdata->sink_caps); + g_free (cdata); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_vp8dec_debug, "vavp8dec", 0, + "VA VP8 decoder"); + + return NULL; +} + +gboolean +gst_va_vp8_dec_register (GstPlugin * plugin, GstVaDevice * device, + GstCaps * sink_caps, GstCaps * src_caps, guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaVp8DecClass), + .class_init = gst_va_vp8_dec_class_init, + .instance_size = sizeof (GstVaVp8Dec), + .instance_init = gst_va_vp8_dec_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + g_return_val_if_fail (GST_IS_CAPS (sink_caps), FALSE); + g_return_val_if_fail (GST_IS_CAPS (src_caps), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + cdata->sink_caps = gst_caps_ref (sink_caps); + cdata->src_caps = gst_caps_ref (src_caps); + + /* class data will be leaked if the element never gets instantiated */ + GST_MINI_OBJECT_FLAG_SET (cdata->sink_caps, + GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + type_info.class_data = cdata; + + type_name = g_strdup ("GstVaVp8dec"); + feature_name = g_strdup ("vavp8dec"); + + /* The first decoder to be registered should use a constant name, + * like vavp8dec, for any additional decoders, we create unique + * names, using inserting the render device name. 
*/ + if (g_type_from_name (type_name)) { + gchar *basename = g_path_get_basename (device->render_device_path); + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstVa%sVP8Dec", basename); + feature_name = g_strdup_printf ("va%svp8dec", basename); + cdata->description = basename; + + /* lower rank for non-first device */ + if (rank > 0) + rank--; + } + + g_once (&debug_once, _register_debug_category, NULL); + + type = g_type_register_static (GST_TYPE_VP8_DECODER, + type_name, &type_info, 0); + + ret = gst_element_register (plugin, feature_name, rank, type); + + g_free (type_name); + g_free (feature_name); + + return ret; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavp8dec.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadevice.h" + +G_BEGIN_DECLS + +gboolean gst_va_vp8_dec_register (GstPlugin * plugin, + GstVaDevice * device, + GstCaps * sink_caps, + GstCaps * src_caps, + guint rank); + +G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavp9dec.c
Added
@@ -0,0 +1,764 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vavp9dec + * @title: vavp9dec + * @short_description: A VA-API based VP9 video decoder + * + * vavp9dec decodes VP9 bitstreams to VA surfaces using the + * installed and chosen [VA-API](https://01.org/linuxmedia/vaapi) + * driver. + * + * The decoding surfaces can be mapped onto main memory as video + * frames. + * + * ## Example launch line + * ``` + * gst-launch-1.0 filesrc location=sample.webm ! parsebin ! vavp9dec ! 
autovideosink + * ``` + * + * Since: 1.20 + * + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvavp9dec.h" + +#include "gstvabasedec.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_vp9dec_debug); +#ifndef GST_DISABLE_GST_DEBUG +#define GST_CAT_DEFAULT gst_va_vp9dec_debug +#else +#define GST_CAT_DEFAULT NULL +#endif + +#define GST_VA_VP9_DEC(obj) ((GstVaVp9Dec *) obj) +#define GST_VA_VP9_DEC_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaVp9DecClass)) +#define GST_VA_VP9_DEC_CLASS(klass) ((GstVaVp9DecClass *) klass) + +typedef struct _GstVaVp9Dec GstVaVp9Dec; +typedef struct _GstVaVp9DecClass GstVaVp9DecClass; + +struct _GstVaVp9DecClass +{ + GstVaBaseDecClass parent_class; +}; + +struct _GstVaVp9Dec +{ + GstVaBaseDec parent; + GstVp9Segmentation segmentation[GST_VP9_MAX_SEGMENTS]; +}; + +static GstElementClass *parent_class = NULL; + +/* *INDENT-OFF* */ +static const gchar *src_caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12 }") " ;" + GST_VIDEO_CAPS_MAKE ("{ NV12 }"); +/* *INDENT-ON* */ + +static const gchar *sink_caps_str = "video/x-vp9"; + +static guint +_get_rtformat (GstVaVp9Dec * self, GstVP9Profile profile, + GstVp9BitDepth bit_depth, gint subsampling_x, gint subsampling_y) +{ + switch (profile) { + case GST_VP9_PROFILE_0: + return VA_RT_FORMAT_YUV420; + case GST_VP9_PROFILE_1: + if (subsampling_x == 1 && subsampling_y == 0) + return VA_RT_FORMAT_YUV422; + else if (subsampling_x == 0 && subsampling_y == 0) + return VA_RT_FORMAT_YUV444; + break; + case GST_VP9_PROFILE_2: + if (bit_depth == GST_VP9_BIT_DEPTH_10) + return VA_RT_FORMAT_YUV420_10; + else if (bit_depth == GST_VP9_BIT_DEPTH_12) + return VA_RT_FORMAT_YUV420_12; + break; + case GST_VP9_PROFILE_3: + if (subsampling_x == 1 && subsampling_y == 0) { + if (bit_depth == GST_VP9_BIT_DEPTH_10) + return VA_RT_FORMAT_YUV422_10; + else if (bit_depth == GST_VP9_BIT_DEPTH_12) + return VA_RT_FORMAT_YUV422_12; + } else if 
(subsampling_x == 0 && subsampling_y == 0) { + if (bit_depth == GST_VP9_BIT_DEPTH_10) + return VA_RT_FORMAT_YUV444_10; + else if (bit_depth == GST_VP9_BIT_DEPTH_12) + return VA_RT_FORMAT_YUV444_12; + } + break; + default: + break; + } + + GST_ERROR_OBJECT (self, "Unsupported chroma format"); + return 0; +} + +static VAProfile +_get_profile (GstVaVp9Dec * self, GstVP9Profile profile) +{ + switch (profile) { + case GST_VP9_PROFILE_0: + return VAProfileVP9Profile0; + case GST_VP9_PROFILE_1: + return VAProfileVP9Profile1; + case GST_VP9_PROFILE_2: + return VAProfileVP9Profile2; + case GST_VP9_PROFILE_3: + return VAProfileVP9Profile3; + default: + break; + } + + GST_ERROR_OBJECT (self, "Unsupported profile"); + return VAProfileNone; +} + +static GstFlowReturn +gst_va_vp9_new_sequence (GstVp9Decoder * decoder, + const GstVp9FrameHeader * frame_hdr) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp9Dec *self = GST_VA_VP9_DEC (decoder); + VAProfile profile; + gboolean negotiation_needed = FALSE; + guint rt_format; + + profile = _get_profile (self, frame_hdr->profile); + if (profile == VAProfileNone) + return GST_FLOW_NOT_NEGOTIATED; + + if (!gst_va_decoder_has_profile (base->decoder, profile)) { + GST_ERROR_OBJECT (self, "Profile %s is not supported", + gst_va_profile_name (profile)); + return GST_FLOW_NOT_NEGOTIATED; + } + + rt_format = _get_rtformat (self, frame_hdr->profile, frame_hdr->bit_depth, + frame_hdr->subsampling_x, frame_hdr->subsampling_y); + if (rt_format == 0) + return GST_FLOW_NOT_NEGOTIATED; + + if (!gst_va_decoder_config_is_equal (base->decoder, profile, + rt_format, frame_hdr->width, frame_hdr->height)) { + base->profile = profile; + base->width = frame_hdr->width; + base->height = frame_hdr->height; + base->rt_format = rt_format; + negotiation_needed = TRUE; + } + + base->min_buffers = GST_VP9_REF_FRAMES; + + base->need_negotiation = negotiation_needed; + + return GST_FLOW_OK; +} + +static GstFlowReturn +_check_resolution_change 
(GstVaVp9Dec * self, GstVp9Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (self); + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + + if ((base->width != frame_hdr->width) || base->height != frame_hdr->height) { + base->width = frame_hdr->width; + base->height = frame_hdr->height; + + base->need_negotiation = TRUE; + if (!gst_video_decoder_negotiate (GST_VIDEO_DECODER (self))) { + GST_ERROR_OBJECT (self, "Resolution changed, but failed to" + " negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + + /* @TODO: if negotiation fails, decoder should resize output + * frame. For that we would need an auxiliary allocator, and + * later use GstVaFilter or GstVideoConverter. */ + } + } + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_vp9_dec_new_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstFlowReturn ret; + GstVaVp9Dec *self = GST_VA_VP9_DEC (decoder); + GstVaDecodePicture *pic; + GstVideoDecoder *vdec = GST_VIDEO_DECODER (decoder); + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + + ret = _check_resolution_change (self, picture); + if (ret != GST_FLOW_OK) + return ret; + + if (base->need_negotiation) { + if (!gst_video_decoder_negotiate (vdec)) { + GST_ERROR_OBJECT (self, "Failed to negotiate with downstream"); + return GST_FLOW_NOT_NEGOTIATED; + } + } + + ret = gst_video_decoder_allocate_output_frame (vdec, frame); + if (ret != GST_FLOW_OK) + goto error; + + pic = gst_va_decode_picture_new (base->decoder, frame->output_buffer); + + gst_vp9_picture_set_user_data (picture, pic, + (GDestroyNotify) gst_va_decode_picture_free); + + GST_LOG_OBJECT (self, "New va decode picture %p - %#x", pic, + gst_va_decode_picture_get_surface (pic)); + + return GST_FLOW_OK; + +error: + { + GST_WARNING_OBJECT (self, "Failed to allocate output buffer, return %s", + gst_flow_get_name (ret)); + return ret; + } +} + +static inline gboolean +_fill_param (GstVp9Decoder * decoder, GstVp9Picture * 
picture, GstVp9Dpb * dpb) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaDecodePicture *va_pic; + const GstVp9FrameHeader *frame_hdr = &picture->frame_hdr; + const GstVp9LoopFilterParams *lfp = &frame_hdr->loop_filter_params; + const GstVp9SegmentationParams *sp = &frame_hdr->segmentation_params; + VADecPictureParameterBufferVP9 pic_param; + guint i; + + /* *INDENT-OFF* */ + pic_param = (VADecPictureParameterBufferVP9) { + .frame_width = base->width, + .frame_height = base->height, + + .pic_fields.bits = { + .subsampling_x = frame_hdr->subsampling_x, + .subsampling_y = frame_hdr->subsampling_y, + .frame_type = frame_hdr->frame_type, + .show_frame = frame_hdr->show_frame, + .error_resilient_mode = frame_hdr->error_resilient_mode, + .intra_only = frame_hdr->intra_only, + .allow_high_precision_mv = frame_hdr->allow_high_precision_mv, + .mcomp_filter_type = frame_hdr->interpolation_filter, + .frame_parallel_decoding_mode = frame_hdr->frame_parallel_decoding_mode, + .reset_frame_context = frame_hdr->reset_frame_context, + .refresh_frame_context = frame_hdr->refresh_frame_context, + .frame_context_idx = frame_hdr->frame_context_idx, + + .segmentation_enabled = sp->segmentation_enabled, + .segmentation_temporal_update = sp->segmentation_temporal_update, + .segmentation_update_map = sp->segmentation_update_map, + + .last_ref_frame = + frame_hdr->ref_frame_idx[GST_VP9_REF_FRAME_LAST - 1], + .last_ref_frame_sign_bias = + frame_hdr->ref_frame_sign_bias[GST_VP9_REF_FRAME_LAST], + .golden_ref_frame = + frame_hdr->ref_frame_idx[GST_VP9_REF_FRAME_GOLDEN - 1], + .golden_ref_frame_sign_bias = + frame_hdr->ref_frame_sign_bias[GST_VP9_REF_FRAME_GOLDEN], + .alt_ref_frame = + frame_hdr->ref_frame_idx[GST_VP9_REF_FRAME_ALTREF - 1], + .alt_ref_frame_sign_bias = + frame_hdr->ref_frame_sign_bias[GST_VP9_REF_FRAME_ALTREF], + + .lossless_flag = frame_hdr->lossless_flag, + }, + + .filter_level = lfp->loop_filter_level, + .sharpness_level = lfp->loop_filter_sharpness, + 
.log2_tile_rows = frame_hdr->tile_rows_log2, + .log2_tile_columns = frame_hdr->tile_cols_log2, + + .frame_header_length_in_bytes = frame_hdr->frame_header_length_in_bytes, + .first_partition_size = frame_hdr->header_size_in_bytes, + + .profile = frame_hdr->profile, + .bit_depth = frame_hdr->bit_depth + }; + /* *INDENT-ON* */ + + memcpy (pic_param.mb_segment_tree_probs, sp->segmentation_tree_probs, + sizeof (sp->segmentation_tree_probs)); + + if (sp->segmentation_temporal_update) { + memcpy (pic_param.segment_pred_probs, sp->segmentation_pred_prob, + sizeof (sp->segmentation_pred_prob)); + } else { + memset (pic_param.segment_pred_probs, 255, + sizeof (pic_param.segment_pred_probs)); + } + + for (i = 0; i < GST_VP9_REF_FRAMES; i++) { + if (dpb->pic_list[i]) { + GstVaDecodePicture *va_pic = + gst_vp9_picture_get_user_data (dpb->pic_list[i]); + + pic_param.reference_frames[i] = + gst_va_decode_picture_get_surface (va_pic); + } else { + pic_param.reference_frames[i] = VA_INVALID_ID; + } + } + + va_pic = gst_vp9_picture_get_user_data (picture); + + return gst_va_decoder_add_param_buffer (base->decoder, va_pic, + VAPictureParameterBufferType, &pic_param, sizeof (pic_param)); +} + +static void +_update_segmentation (GstVaVp9Dec * self, GstVp9FrameHeader * header) +{ + const GstVp9LoopFilterParams *lfp = &header->loop_filter_params; + const GstVp9QuantizationParams *qp = &header->quantization_params; + const GstVp9SegmentationParams *sp = &header->segmentation_params; + guint8 n_shift = lfp->loop_filter_level >> 5; + guint i; + + for (i = 0; i < GST_VP9_MAX_SEGMENTS; i++) { + gint16 luma_dc_quant_scale; + gint16 luma_ac_quant_scale; + gint16 chroma_dc_quant_scale; + gint16 chroma_ac_quant_scale; + guint8 qindex; + guint8 lvl_lookup[GST_VP9_MAX_REF_LF_DELTAS][GST_VP9_MAX_MODE_LF_DELTAS]; + gint lvl_seg = lfp->loop_filter_level; + + /* 8.6.1 Dequantization functions */ + qindex = gst_vp9_get_qindex (sp, qp, i); + luma_dc_quant_scale = + gst_vp9_get_dc_quant (qindex, 
qp->delta_q_y_dc, header->bit_depth); + luma_ac_quant_scale = gst_vp9_get_ac_quant (qindex, 0, header->bit_depth); + chroma_dc_quant_scale = + gst_vp9_get_dc_quant (qindex, qp->delta_q_uv_dc, header->bit_depth); + chroma_ac_quant_scale = + gst_vp9_get_ac_quant (qindex, qp->delta_q_uv_ac, header->bit_depth); + + if (!lfp->loop_filter_level) { + memset (lvl_lookup, 0, sizeof (lvl_lookup)); + } else { + /* 8.8.1 Loop filter frame init process */ + if (gst_vp9_seg_feature_active (sp, i, GST_VP9_SEG_LVL_ALT_L)) { + if (sp->segmentation_abs_or_delta_update) { + lvl_seg = sp->feature_data[i][GST_VP9_SEG_LVL_ALT_L]; + } else { + lvl_seg += sp->feature_data[i][GST_VP9_SEG_LVL_ALT_L]; + } + + lvl_seg = CLAMP (lvl_seg, 0, GST_VP9_MAX_LOOP_FILTER); + } + + if (!lfp->loop_filter_delta_enabled) { + memset (lvl_lookup, lvl_seg, sizeof (lvl_lookup)); + } else { + guint8 ref, mode; + gint intra_lvl = lvl_seg + + (lfp->loop_filter_ref_deltas[GST_VP9_REF_FRAME_INTRA] << n_shift); + + memcpy (lvl_lookup, self->segmentation[i].filter_level, + sizeof (lvl_lookup)); + + lvl_lookup[GST_VP9_REF_FRAME_INTRA][0] = + CLAMP (intra_lvl, 0, GST_VP9_MAX_LOOP_FILTER); + for (ref = GST_VP9_REF_FRAME_LAST; ref < GST_VP9_REF_FRAME_MAX; ref++) { + for (mode = 0; mode < GST_VP9_MAX_MODE_LF_DELTAS; mode++) { + intra_lvl = lvl_seg + (lfp->loop_filter_ref_deltas[ref] << n_shift) + + (lfp->loop_filter_mode_deltas[mode] << n_shift); + lvl_lookup[ref][mode] = + CLAMP (intra_lvl, 0, GST_VP9_MAX_LOOP_FILTER); + } + } + } + } + + /* *INDENT-OFF* */ + self->segmentation[i] = (GstVp9Segmentation) { + .luma_dc_quant_scale = luma_dc_quant_scale, + .luma_ac_quant_scale = luma_ac_quant_scale, + .chroma_dc_quant_scale = chroma_dc_quant_scale, + .chroma_ac_quant_scale = chroma_ac_quant_scale, + + .reference_frame_enabled = sp->feature_enabled[i][GST_VP9_SEG_LVL_REF_FRAME], + .reference_frame = sp->feature_data[i][GST_VP9_SEG_LVL_REF_FRAME], + .reference_skip = sp->feature_enabled[i][GST_VP9_SEG_SEG_LVL_SKIP], + }; + /* 
*INDENT-ON* */ + + memcpy (self->segmentation[i].filter_level, lvl_lookup, + sizeof (lvl_lookup)); + } +} + +static inline gboolean +_fill_slice (GstVp9Decoder * decoder, GstVp9Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp9Dec *self = GST_VA_VP9_DEC (decoder); + GstVaDecodePicture *va_pic; + const GstVp9Segmentation *seg; + VASliceParameterBufferVP9 slice_param; + guint i; + + _update_segmentation (self, &picture->frame_hdr); + + /* *INDENT-OFF* */ + slice_param = (VASliceParameterBufferVP9) { + .slice_data_size = picture->size, + .slice_data_offset = 0, + .slice_data_flag = VA_SLICE_DATA_FLAG_ALL, + }; + /* *INDENT-ON* */ + + for (i = 0; i < GST_VP9_MAX_SEGMENTS; i++) { + seg = &self->segmentation[i]; + + /* *INDENT-OFF* */ + slice_param.seg_param[i] = (VASegmentParameterVP9) { + .segment_flags.fields = { + .segment_reference_enabled = seg->reference_frame_enabled, + .segment_reference = seg->reference_frame, + .segment_reference_skipped = seg->reference_skip, + }, + .luma_dc_quant_scale = seg->luma_dc_quant_scale, + .luma_ac_quant_scale = seg->luma_ac_quant_scale, + .chroma_dc_quant_scale = seg->chroma_dc_quant_scale, + .chroma_ac_quant_scale = seg->chroma_ac_quant_scale, + }; + /* *INDENT-ON* */ + + memcpy (slice_param.seg_param[i].filter_level, seg->filter_level, + sizeof (slice_param.seg_param[i].filter_level)); + } + + va_pic = gst_vp9_picture_get_user_data (picture); + + return gst_va_decoder_add_slice_buffer (base->decoder, va_pic, &slice_param, + sizeof (slice_param), (gpointer) picture->data, picture->size); +} + +static GstFlowReturn +gst_va_vp9_decode_picture (GstVp9Decoder * decoder, GstVp9Picture * picture, + GstVp9Dpb * dpb) +{ + if (_fill_param (decoder, picture, dpb) && _fill_slice (decoder, picture)) + return GST_FLOW_OK; + + return GST_FLOW_ERROR; +} + +static GstFlowReturn +gst_va_vp9_dec_end_picture (GstVp9Decoder * decoder, GstVp9Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + 
GstVaDecodePicture *va_pic; + + GST_LOG_OBJECT (base, "end picture %p", picture); + + va_pic = gst_vp9_picture_get_user_data (picture); + + if (!gst_va_decoder_decode (base->decoder, va_pic)) + return GST_FLOW_ERROR; + + return GST_FLOW_OK; +} + +static GstFlowReturn +gst_va_vp9_dec_output_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp9Dec *self = GST_VA_VP9_DEC (decoder); + + GST_LOG_OBJECT (self, "Outputting picture %p", picture); + + if (base->copy_frames) + gst_va_base_dec_copy_output_buffer (base, frame); + + gst_vp9_picture_unref (picture); + + return gst_video_decoder_finish_frame (GST_VIDEO_DECODER (self), frame); +} + +static GstVp9Picture * +gst_va_vp9_dec_duplicate_picture (GstVp9Decoder * decoder, + GstVideoCodecFrame * frame, GstVp9Picture * picture) +{ + GstVaDecodePicture *va_pic, *va_dup; + GstVp9Picture *new_picture; + + if (_check_resolution_change (GST_VA_VP9_DEC (decoder), picture) != + GST_FLOW_OK) { + return NULL; + } + + va_pic = gst_vp9_picture_get_user_data (picture); + va_dup = gst_va_decode_picture_dup (va_pic); + + new_picture = gst_vp9_picture_new (); + new_picture->frame_hdr = picture->frame_hdr; + + frame->output_buffer = gst_buffer_ref (va_dup->gstbuffer); + + gst_vp9_picture_set_user_data (new_picture, va_dup, + (GDestroyNotify) gst_va_decode_picture_free); + + return new_picture; +} + +static gboolean +gst_va_vp9_dec_negotiate (GstVideoDecoder * decoder) +{ + GstCapsFeatures *capsfeatures = NULL; + GstVaBaseDec *base = GST_VA_BASE_DEC (decoder); + GstVaVp9Dec *self = GST_VA_VP9_DEC (decoder); + GstVideoFormat format = GST_VIDEO_FORMAT_UNKNOWN; + GstVp9Decoder *vp9dec = GST_VP9_DECODER (decoder); + gboolean need_open; + + /* Ignore downstream renegotiation request. 
*/ + if (!base->need_negotiation) + return TRUE; + + base->need_negotiation = FALSE; + + need_open = TRUE; + /* VP9 profile entry should have the ability to handle dynamic + * resolution changes. If only the resolution changes, we should not + * re-create the config and context. */ + if (gst_va_decoder_is_open (base->decoder)) { + VAProfile cur_profile; + guint cur_rtformat; + gint cur_width, cur_height; + + if (!gst_va_decoder_get_config (base->decoder, &cur_profile, + &cur_rtformat, &cur_width, &cur_height)) + return FALSE; + + if (base->profile == cur_profile && base->rt_format == cur_rtformat) { + if (!gst_va_decoder_update_frame_size (base->decoder, base->width, + base->height)) + return FALSE; + + GST_INFO_OBJECT (self, "dynamic resolution changes from %dx%d to" + " %dx%d", cur_width, cur_height, base->width, base->height); + + need_open = FALSE; + } else { + if (!gst_va_decoder_close (base->decoder)) + return FALSE; + } + } + + if (need_open) { + if (!gst_va_decoder_open (base->decoder, base->profile, base->rt_format)) + return FALSE; + + if (!gst_va_decoder_set_frame_size (base->decoder, base->width, + base->height)) + return FALSE; + } + + if (base->output_state) + gst_video_codec_state_unref (base->output_state); + + gst_va_base_dec_get_preferred_format_and_caps_features (base, &format, + &capsfeatures); + + base->output_state = + gst_video_decoder_set_output_state (decoder, format, + base->width, base->height, vp9dec->input_state); + + base->output_state->caps = gst_video_info_to_caps (&base->output_state->info); + if (capsfeatures) + gst_caps_set_features_simple (base->output_state->caps, capsfeatures); + + GST_INFO_OBJECT (self, "Negotiated caps %" GST_PTR_FORMAT, + base->output_state->caps); + + return GST_VIDEO_DECODER_CLASS (parent_class)->negotiate (decoder); +} + +static void +gst_va_vp9_dec_dispose (GObject * object) +{ + gst_va_base_dec_close (GST_VIDEO_DECODER (object)); + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void 
+gst_va_vp9_dec_class_init (gpointer g_class, gpointer class_data) +{ + GstCaps *src_doc_caps, *sink_doc_caps; + GObjectClass *gobject_class = G_OBJECT_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); + GstVideoDecoderClass *decoder_class = GST_VIDEO_DECODER_CLASS (g_class); + GstVp9DecoderClass *vp9_class = GST_VP9_DECODER_CLASS (g_class); + struct CData *cdata = class_data; + gchar *long_name; + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API VP9 Decoder in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API VP9 Decoder"); + } + + gst_element_class_set_metadata (element_class, long_name, + "Codec/Decoder/Video/Hardware", "VA-API based VP9 video decoder", + "Víctor Jáquez <vjaquez@igalia.com>"); + + sink_doc_caps = gst_caps_from_string (sink_caps_str); + src_doc_caps = gst_caps_from_string (src_caps_str); + + parent_class = g_type_class_peek_parent (g_class); + + gst_va_base_dec_class_init (GST_VA_BASE_DEC_CLASS (g_class), VP9, + cdata->render_device_path, cdata->sink_caps, cdata->src_caps, + src_doc_caps, sink_doc_caps); + + gobject_class->dispose = gst_va_vp9_dec_dispose; + + decoder_class->negotiate = GST_DEBUG_FUNCPTR (gst_va_vp9_dec_negotiate); + + vp9_class->new_sequence = GST_DEBUG_FUNCPTR (gst_va_vp9_new_sequence); + vp9_class->new_picture = GST_DEBUG_FUNCPTR (gst_va_vp9_dec_new_picture); + vp9_class->decode_picture = GST_DEBUG_FUNCPTR (gst_va_vp9_decode_picture); + vp9_class->end_picture = GST_DEBUG_FUNCPTR (gst_va_vp9_dec_end_picture); + vp9_class->output_picture = GST_DEBUG_FUNCPTR (gst_va_vp9_dec_output_picture); + vp9_class->duplicate_picture = + GST_DEBUG_FUNCPTR (gst_va_vp9_dec_duplicate_picture); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + gst_caps_unref (cdata->src_caps); + gst_caps_unref (cdata->sink_caps); + g_free (cdata); +} + +static void +gst_va_vp9_dec_init (GTypeInstance * instance, gpointer g_class) +{ + 
gst_va_base_dec_init (GST_VA_BASE_DEC (instance), GST_CAT_DEFAULT); +} + +/* This element doesn't parse superframes. Let's delegate it to the + * parser. */ +static GstCaps * +_complete_sink_caps (GstCaps * sinkcaps) +{ + gst_caps_set_simple (sinkcaps, "alignment", G_TYPE_STRING, "frame", NULL); + return gst_caps_ref (sinkcaps); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_vp9dec_debug, "vavp9dec", 0, + "VA VP9 decoder"); + + return NULL; +} + +gboolean +gst_va_vp9_dec_register (GstPlugin * plugin, GstVaDevice * device, + GstCaps * sink_caps, GstCaps * src_caps, guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaVp9DecClass), + .class_init = gst_va_vp9_dec_class_init, + .instance_size = sizeof (GstVaVp9Dec), + .instance_init = gst_va_vp9_dec_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + g_return_val_if_fail (GST_IS_CAPS (sink_caps), FALSE); + g_return_val_if_fail (GST_IS_CAPS (src_caps), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + cdata->sink_caps = _complete_sink_caps (sink_caps); + cdata->src_caps = gst_caps_ref (src_caps); + + /* class data will be leaked if the element never gets instantiated */ + GST_MINI_OBJECT_FLAG_SET (sink_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + GST_MINI_OBJECT_FLAG_SET (src_caps, GST_MINI_OBJECT_FLAG_MAY_BE_LEAKED); + + type_info.class_data = cdata; + + type_name = g_strdup ("GstVaVp9Dec"); + feature_name = g_strdup ("vavp9dec"); + + /* The first decoder to be registered should use a constant name, + * like vavp9dec. For any additional decoders, we create unique + * names by inserting the render device name. 
*/ + if (g_type_from_name (type_name)) { + gchar *basename = g_path_get_basename (device->render_device_path); + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstVa%sVp9Dec", basename); + feature_name = g_strdup_printf ("va%svp9dec", basename); + cdata->description = basename; + + /* lower rank for non-first device */ + if (rank > 0) + rank--; + } + + g_once (&debug_once, _register_debug_category, NULL); + + type = g_type_register_static (GST_TYPE_VP9_DECODER, + type_name, &type_info, 0); + + ret = gst_element_register (plugin, feature_name, rank, type); + + g_free (type_name); + g_free (feature_name); + + return ret; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavp9dec.h
Added
@@ -0,0 +1,33 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadevice.h" + +G_BEGIN_DECLS + +gboolean gst_va_vp9_dec_register (GstPlugin * plugin, + GstVaDevice * device, + GstCaps * sink_caps, + GstCaps * src_caps, + guint rank); + +G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavpp.c
Added
@@ -0,0 +1,2297 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/** + * SECTION:element-vapostproc + * @title: vapostproc + * @short_description: A VA-API based video postprocessing filter + * + * vapostproc applies different video filters to VA surfaces. These + * filters vary depending on the installed and chosen + * [VA-API](https://01.org/linuxmedia/vaapi) driver, but usually + * resizing and color conversion are available. + * + * The generated surfaces can be mapped onto main memory as video + * frames. + * + * Use gst-inspect-1.0 to introspect the available capabilities of the + * driver's post-processor entry point. + * + * ## Example launch line + * ``` + * gst-launch-1.0 videotestsrc ! "video/x-raw,format=(string)NV12" ! vapostproc ! autovideosink + * ``` + * + * Cropping is supported via buffers' crop meta. It's only done if the + * postprocessor is not in passthrough mode or if downstream doesn't + * support the crop meta API. + * + * ### Cropping example + * ``` + * gst-launch-1.0 videotestsrc ! "video/x-raw,format=(string)NV12" ! videocrop bottom=50 left=100 ! vapostproc ! 
autovideosink + * ``` + * + * If the VA driver supports the color balance filter, with controls such + * as hue, brightness, contrast, etc., those controls are exposed both + * as element properties and through the #GstColorBalance interface. + * + * Since: 1.20 + * + */ + +/* ToDo: + * + * + HDR tone mapping + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstvavpp.h" + +#include <gst/video/video.h> + +#include <va/va_drmcommon.h> + +#include "gstvaallocator.h" +#include "gstvabasetransform.h" +#include "gstvacaps.h" +#include "gstvadisplay_priv.h" +#include "gstvafilter.h" +#include "gstvapool.h" +#include "gstvautils.h" + +GST_DEBUG_CATEGORY_STATIC (gst_va_vpp_debug); +#define GST_CAT_DEFAULT gst_va_vpp_debug + +#define GST_VA_VPP(obj) ((GstVaVpp *) obj) +#define GST_VA_VPP_GET_CLASS(obj) (G_TYPE_INSTANCE_GET_CLASS ((obj), G_TYPE_FROM_INSTANCE (obj), GstVaVppClass)) +#define GST_VA_VPP_CLASS(klass) ((GstVaVppClass *) klass) + +#define SWAP(a, b) do { const __typeof__ (a) t = a; a = b; b = t; } while (0) + +typedef struct _GstVaVpp GstVaVpp; +typedef struct _GstVaVppClass GstVaVppClass; + +struct _GstVaVppClass +{ + /* GstVideoFilter overlaps functionality */ + GstVaBaseTransformClass parent_class; +}; + +struct _GstVaVpp +{ + GstVaBaseTransform parent; + + gboolean rebuild_filters; + guint op_flags; + + /* filters */ + float denoise; + float sharpen; + float skintone; + float brightness; + float contrast; + float hue; + float saturation; + gboolean auto_contrast; + gboolean auto_brightness; + gboolean auto_saturation; + GstVideoOrientationMethod direction; + GstVideoOrientationMethod prev_direction; + GstVideoOrientationMethod tag_direction; + gboolean add_borders; + gint borders_h; + gint borders_w; + + GList *channels; +}; + +static GstElementClass *parent_class = NULL; + +struct CData +{ + gchar *render_device_path; + gchar *description; +}; + +/* conversions that disable passthrough */ +enum +{ + VPP_CONVERT_SIZE = 1 << 0, + 
VPP_CONVERT_FORMAT = 1 << 1, + VPP_CONVERT_FILTERS = 1 << 2, + VPP_CONVERT_DIRECTION = 1 << 3, + VPP_CONVERT_FEATURE = 1 << 4, + VPP_CONVERT_CROP = 1 << 5, + VPP_CONVERT_DUMMY = 1 << 6, +}; + +/* *INDENT-OFF* */ +static const gchar *caps_str = + GST_VIDEO_CAPS_MAKE_WITH_FEATURES (GST_CAPS_FEATURE_MEMORY_VA, + "{ NV12, I420, YV12, YUY2, RGBA, BGRA, P010_10LE, ARGB, ABGR }") " ;" + GST_VIDEO_CAPS_MAKE ("{ VUYA, GRAY8, NV12, NV21, YUY2, UYVY, YV12, " + "I420, P010_10LE, RGBA, BGRA, ARGB, ABGR }"); +/* *INDENT-ON* */ + +#define META_TAG_COLORSPACE meta_tag_colorspace_quark +static GQuark meta_tag_colorspace_quark; +#define META_TAG_SIZE meta_tag_size_quark +static GQuark meta_tag_size_quark; +#define META_TAG_ORIENTATION meta_tag_orientation_quark +static GQuark meta_tag_orientation_quark; +#define META_TAG_VIDEO meta_tag_video_quark +static GQuark meta_tag_video_quark; + +static void gst_va_vpp_colorbalance_init (gpointer iface, gpointer data); +static void gst_va_vpp_rebuild_filters (GstVaVpp * self); + +static void +gst_va_vpp_dispose (GObject * object) +{ + GstVaVpp *self = GST_VA_VPP (object); + + if (self->channels) + g_list_free_full (g_steal_pointer (&self->channels), g_object_unref); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_va_vpp_update_passthrough (GstVaVpp * self, gboolean reconf) +{ + GstBaseTransform *trans = GST_BASE_TRANSFORM (self); + gboolean old, new; + + old = gst_base_transform_is_passthrough (trans); + + GST_OBJECT_LOCK (self); + new = (self->op_flags == 0); + GST_OBJECT_UNLOCK (self); + + if (old != new) { + GST_INFO_OBJECT (self, "%s passthrough", new ? 
"enabling" : "disabling"); + if (reconf) + gst_base_transform_reconfigure_src (trans); + gst_base_transform_set_passthrough (trans, new); + } +} + +static void +_update_properties_unlocked (GstVaVpp * self) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + + if (!btrans->filter) + return; + + if ((self->direction != GST_VIDEO_ORIENTATION_AUTO + && self->direction != self->prev_direction) + || (self->direction == GST_VIDEO_ORIENTATION_AUTO + && self->tag_direction != self->prev_direction)) { + + GstVideoOrientationMethod direction = + (self->direction == GST_VIDEO_ORIENTATION_AUTO) ? + self->tag_direction : self->direction; + + if (!gst_va_filter_set_orientation (btrans->filter, direction)) { + if (self->direction == GST_VIDEO_ORIENTATION_AUTO) + self->tag_direction = self->prev_direction; + else + self->direction = self->prev_direction; + + self->op_flags &= ~VPP_CONVERT_DIRECTION; + + /* FIXME: unlocked bus warning message */ + GST_WARNING_OBJECT (self, + "Driver cannot set requested orientation. 
Setting it back."); + } else { + self->prev_direction = direction; + + self->op_flags |= VPP_CONVERT_DIRECTION; + + gst_base_transform_reconfigure_src (GST_BASE_TRANSFORM (self)); + } + } else { + self->op_flags &= ~VPP_CONVERT_DIRECTION; + } +} + +static void +gst_va_vpp_set_property (GObject * object, guint prop_id, + const GValue * value, GParamSpec * pspec) +{ + GstVaVpp *self = GST_VA_VPP (object); + + GST_OBJECT_LOCK (object); + switch (prop_id) { + case GST_VA_FILTER_PROP_DENOISE: + self->denoise = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_SHARPEN: + self->sharpen = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_SKINTONE: + if (G_VALUE_TYPE (value) == G_TYPE_BOOLEAN) + self->skintone = (float) g_value_get_boolean (value); + else + self->skintone = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_VIDEO_DIR:{ + GstVideoOrientationMethod direction = g_value_get_enum (value); + self->prev_direction = (direction == GST_VIDEO_ORIENTATION_AUTO) ? 
+ self->tag_direction : self->direction; + self->direction = direction; + break; + } + case GST_VA_FILTER_PROP_HUE: + self->hue = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_SATURATION: + self->saturation = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_BRIGHTNESS: + self->brightness = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_CONTRAST: + self->contrast = g_value_get_float (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_AUTO_SATURATION: + self->auto_saturation = g_value_get_boolean (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_AUTO_BRIGHTNESS: + self->auto_brightness = g_value_get_boolean (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_AUTO_CONTRAST: + self->auto_contrast = g_value_get_boolean (value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + break; + case GST_VA_FILTER_PROP_DISABLE_PASSTHROUGH:{ + gboolean disable_passthrough = g_value_get_boolean (value); + if (disable_passthrough) + self->op_flags |= VPP_CONVERT_DUMMY; + else + self->op_flags &= ~VPP_CONVERT_DUMMY; + break; + } + case GST_VA_FILTER_PROP_ADD_BORDERS: + self->add_borders = g_value_get_boolean (value); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + + _update_properties_unlocked (self); + GST_OBJECT_UNLOCK (object); + + gst_va_vpp_update_passthrough (self, FALSE); +} + +static void +gst_va_vpp_get_property (GObject * object, guint prop_id, GValue * value, + GParamSpec * pspec) +{ + GstVaVpp *self = GST_VA_VPP (object); + + GST_OBJECT_LOCK (object); + switch (prop_id) { + case GST_VA_FILTER_PROP_DENOISE: + g_value_set_float (value, self->denoise); + break; + case GST_VA_FILTER_PROP_SHARPEN: + 
g_value_set_float (value, self->sharpen); + break; + case GST_VA_FILTER_PROP_SKINTONE: + if (G_VALUE_TYPE (value) == G_TYPE_BOOLEAN) + g_value_set_boolean (value, self->skintone > 0); + else + g_value_set_float (value, self->skintone); + break; + case GST_VA_FILTER_PROP_VIDEO_DIR: + g_value_set_enum (value, self->direction); + break; + case GST_VA_FILTER_PROP_HUE: + g_value_set_float (value, self->hue); + break; + case GST_VA_FILTER_PROP_SATURATION: + g_value_set_float (value, self->saturation); + break; + case GST_VA_FILTER_PROP_BRIGHTNESS: + g_value_set_float (value, self->brightness); + break; + case GST_VA_FILTER_PROP_CONTRAST: + g_value_set_float (value, self->contrast); + break; + case GST_VA_FILTER_PROP_AUTO_SATURATION: + g_value_set_boolean (value, self->auto_saturation); + break; + case GST_VA_FILTER_PROP_AUTO_BRIGHTNESS: + g_value_set_boolean (value, self->auto_brightness); + break; + case GST_VA_FILTER_PROP_AUTO_CONTRAST: + g_value_set_boolean (value, self->auto_contrast); + break; + case GST_VA_FILTER_PROP_DISABLE_PASSTHROUGH: + g_value_set_boolean (value, (self->op_flags & VPP_CONVERT_DUMMY)); + break; + case GST_VA_FILTER_PROP_ADD_BORDERS: + g_value_set_boolean (value, self->add_borders); + break; + default: + G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); + break; + } + GST_OBJECT_UNLOCK (object); +} + +static gboolean +gst_va_vpp_propose_allocation (GstBaseTransform * trans, + GstQuery * decide_query, GstQuery * query) +{ + /* if we are not passthrough, we can handle crop meta */ + if (decide_query) + gst_query_add_allocation_meta (query, GST_VIDEO_CROP_META_API_TYPE, NULL); + + return GST_BASE_TRANSFORM_CLASS (parent_class)->propose_allocation (trans, + decide_query, query); +} + +static void +gst_va_vpp_update_properties (GstVaBaseTransform * btrans) +{ + GstVaVpp *self = GST_VA_VPP (btrans); + + gst_va_vpp_rebuild_filters (self); + _update_properties_unlocked (self); +} + +static gboolean +gst_va_vpp_set_info (GstVaBaseTransform * 
btrans, GstCaps * incaps, + GstVideoInfo * in_info, GstCaps * outcaps, GstVideoInfo * out_info) +{ + GstVaVpp *self = GST_VA_VPP (btrans); + GstCapsFeatures *infeat, *outfeat; + gint from_dar_n, from_dar_d, to_dar_n, to_dar_d; + + if (GST_VIDEO_INFO_INTERLACE_MODE (in_info) != + GST_VIDEO_INFO_INTERLACE_MODE (out_info)) { + GST_ERROR_OBJECT (self, "input and output interlace modes do not match"); + return FALSE; + } + + /* calculate possible borders if the display-aspect-ratio changes */ + { + if (!gst_util_fraction_multiply (GST_VIDEO_INFO_WIDTH (in_info), + GST_VIDEO_INFO_HEIGHT (in_info), GST_VIDEO_INFO_PAR_N (in_info), + GST_VIDEO_INFO_PAR_D (in_info), &from_dar_n, &from_dar_d)) { + from_dar_n = from_dar_d = -1; + } + + if (!gst_util_fraction_multiply (GST_VIDEO_INFO_WIDTH (out_info), + GST_VIDEO_INFO_HEIGHT (out_info), GST_VIDEO_INFO_PAR_N (out_info), + GST_VIDEO_INFO_PAR_D (out_info), &to_dar_n, &to_dar_d)) { + to_dar_n = to_dar_d = -1; + } + + /* if video-orientation changes consider it for borders */ + switch (gst_va_filter_get_orientation (btrans->filter)) { + case GST_VIDEO_ORIENTATION_90R: + case GST_VIDEO_ORIENTATION_90L: + case GST_VIDEO_ORIENTATION_UL_LR: + case GST_VIDEO_ORIENTATION_UR_LL: + SWAP (from_dar_n, from_dar_d); + break; + default: + break; + } + + self->borders_h = self->borders_w = 0; + if (to_dar_n != from_dar_n || to_dar_d != from_dar_d) { + if (self->add_borders) { + gint n, d, to_h, to_w; + + if (from_dar_n != -1 && from_dar_d != -1 + && gst_util_fraction_multiply (from_dar_n, from_dar_d, + out_info->par_d, out_info->par_n, &n, &d)) { + to_h = gst_util_uint64_scale_int (out_info->width, d, n); + if (to_h <= out_info->height) { + self->borders_h = out_info->height - to_h; + self->borders_w = 0; + } else { + to_w = gst_util_uint64_scale_int (out_info->height, n, d); + g_assert (to_w <= out_info->width); + self->borders_h = 0; + self->borders_w = out_info->width - to_w; + } + } else { + GST_WARNING_OBJECT (self, "Can't calculate borders"); + } + } 
else { + GST_WARNING_OBJECT (self, "Can't keep DAR!"); + } + } + } + + if (!gst_video_info_is_equal (in_info, out_info)) { + if (GST_VIDEO_INFO_FORMAT (in_info) != GST_VIDEO_INFO_FORMAT (out_info)) + self->op_flags |= VPP_CONVERT_FORMAT; + else + self->op_flags &= ~VPP_CONVERT_FORMAT; + + if (GST_VIDEO_INFO_WIDTH (in_info) != GST_VIDEO_INFO_WIDTH (out_info) + || GST_VIDEO_INFO_HEIGHT (in_info) != GST_VIDEO_INFO_HEIGHT (out_info) + || self->borders_h > 0 || self->borders_w > 0) + self->op_flags |= VPP_CONVERT_SIZE; + else + self->op_flags &= ~VPP_CONVERT_SIZE; + } else { + self->op_flags &= ~VPP_CONVERT_FORMAT & ~VPP_CONVERT_SIZE; + } + + infeat = gst_caps_get_features (incaps, 0); + outfeat = gst_caps_get_features (outcaps, 0); + if (!gst_caps_features_is_equal (infeat, outfeat)) + self->op_flags |= VPP_CONVERT_FEATURE; + else + self->op_flags &= ~VPP_CONVERT_FEATURE; + + if (gst_va_filter_set_video_info (btrans->filter, in_info, out_info)) { + gst_va_vpp_update_passthrough (self, FALSE); + return TRUE; + } + + return FALSE; +} + +static inline gboolean +_get_filter_value (GstVaVpp * self, VAProcFilterType type, gfloat * value) +{ + gboolean ret = TRUE; + + GST_OBJECT_LOCK (self); + switch (type) { + case VAProcFilterNoiseReduction: + *value = self->denoise; + break; + case VAProcFilterSharpening: + *value = self->sharpen; + break; + case VAProcFilterSkinToneEnhancement: + *value = self->skintone; + break; + default: + ret = FALSE; + break; + } + GST_OBJECT_UNLOCK (self); + + return ret; +} + +static inline gboolean +_add_filter_buffer (GstVaVpp * self, VAProcFilterType type, + const VAProcFilterCap * cap) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + VAProcFilterParameterBuffer param; + gfloat value = 0; + + if (!_get_filter_value (self, type, &value)) + return FALSE; + if (value == cap->range.default_value) + return FALSE; + + /* *INDENT-OFF* */ + param = (VAProcFilterParameterBuffer) { + .type = type, + .value = value, + }; + /* *INDENT-ON* 
*/ + + return gst_va_filter_add_filter_buffer (btrans->filter, ¶m, + sizeof (param), 1); +} + +static inline gboolean +_get_filter_cb_value (GstVaVpp * self, VAProcColorBalanceType type, + gfloat * value) +{ + gboolean ret = TRUE; + + GST_OBJECT_LOCK (self); + switch (type) { + case VAProcColorBalanceHue: + *value = self->hue; + break; + case VAProcColorBalanceSaturation: + *value = self->saturation; + break; + case VAProcColorBalanceBrightness: + *value = self->brightness; + break; + case VAProcColorBalanceContrast: + *value = self->contrast; + break; + case VAProcColorBalanceAutoSaturation: + *value = self->auto_saturation; + break; + case VAProcColorBalanceAutoBrightness: + *value = self->auto_brightness; + break; + case VAProcColorBalanceAutoContrast: + *value = self->auto_contrast; + break; + default: + ret = FALSE; + break; + } + GST_OBJECT_UNLOCK (self); + + return ret; +} + +static inline gboolean +_add_filter_cb_buffer (GstVaVpp * self, + const VAProcFilterCapColorBalance * caps, guint num_caps) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + VAProcFilterParameterBufferColorBalance param[VAProcColorBalanceCount] = + { 0, }; + gfloat value; + guint i, c = 0; + + value = 0; + for (i = 0; i < num_caps && i < VAProcColorBalanceCount; i++) { + if (!_get_filter_cb_value (self, caps[i].type, &value)) + continue; + if (value == caps[i].range.default_value) + continue; + + /* *INDENT-OFF* */ + param[c++] = (VAProcFilterParameterBufferColorBalance) { + .type = VAProcFilterColorBalance, + .attrib = caps[i].type, + .value = value, + }; + /* *INDENT-ON* */ + } + + if (c > 0) { + return gst_va_filter_add_filter_buffer (btrans->filter, param, + sizeof (*param), c); + } + return FALSE; +} + +static void +_build_filters (GstVaVpp * self) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + static const VAProcFilterType filter_types[] = { VAProcFilterNoiseReduction, + VAProcFilterSharpening, VAProcFilterSkinToneEnhancement, + 
VAProcFilterColorBalance, + }; + guint i, num_caps; + gboolean apply = FALSE; + + for (i = 0; i < G_N_ELEMENTS (filter_types); i++) { + const gpointer caps = gst_va_filter_get_filter_caps (btrans->filter, + filter_types[i], &num_caps); + if (!caps) + continue; + + switch (filter_types[i]) { + case VAProcFilterNoiseReduction: + apply |= _add_filter_buffer (self, filter_types[i], caps); + break; + case VAProcFilterSharpening: + apply |= _add_filter_buffer (self, filter_types[i], caps); + break; + case VAProcFilterSkinToneEnhancement: + apply |= _add_filter_buffer (self, filter_types[i], caps); + break; + case VAProcFilterColorBalance: + apply |= _add_filter_cb_buffer (self, caps, num_caps); + break; + default: + break; + } + } + + GST_OBJECT_LOCK (self); + if (apply) + self->op_flags |= VPP_CONVERT_FILTERS; + else + self->op_flags &= ~VPP_CONVERT_FILTERS; + GST_OBJECT_UNLOCK (self); +} + +static void +gst_va_vpp_rebuild_filters (GstVaVpp * self) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + + if (!g_atomic_int_get (&self->rebuild_filters)) + return; + + gst_va_filter_drop_filter_buffers (btrans->filter); + _build_filters (self); + g_atomic_int_set (&self->rebuild_filters, FALSE); +} + +static void +gst_va_vpp_before_transform (GstBaseTransform * trans, GstBuffer * inbuf) +{ + GstVaVpp *self = GST_VA_VPP (trans); + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + GstClockTime ts, stream_time; + gboolean is_passthrough; + + ts = GST_BUFFER_TIMESTAMP (inbuf); + stream_time = + gst_segment_to_stream_time (&trans->segment, GST_FORMAT_TIME, ts); + + GST_TRACE_OBJECT (self, "sync to %" GST_TIME_FORMAT, GST_TIME_ARGS (ts)); + + if (GST_CLOCK_TIME_IS_VALID (stream_time)) + gst_object_sync_values (GST_OBJECT (self), stream_time); + + gst_va_vpp_rebuild_filters (self); + gst_va_vpp_update_passthrough (self, TRUE); + + /* cropping is only enabled if vapostproc is not in passthrough */ + is_passthrough = gst_base_transform_is_passthrough (trans); + 
GST_OBJECT_LOCK (self); + if (!is_passthrough && gst_buffer_get_video_crop_meta (inbuf)) { + self->op_flags |= VPP_CONVERT_CROP; + } else { + self->op_flags &= ~VPP_CONVERT_CROP; + } + gst_va_filter_enable_cropping (btrans->filter, + (self->op_flags & VPP_CONVERT_CROP)); + GST_OBJECT_UNLOCK (self); +} + +static GstFlowReturn +gst_va_vpp_transform (GstBaseTransform * trans, GstBuffer * inbuf, + GstBuffer * outbuf) +{ + GstVaVpp *self = GST_VA_VPP (trans); + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (trans); + GstBuffer *buf = NULL; + GstFlowReturn res = GST_FLOW_OK; + GstVaSample src, dst; + + if (G_UNLIKELY (!btrans->negotiated)) + goto unknown_format; + + res = gst_va_base_transform_import_buffer (btrans, inbuf, &buf); + if (res != GST_FLOW_OK) + return res; + + /* *INDENT-OFF* */ + src = (GstVaSample) { + .buffer = buf, + .flags = gst_va_buffer_get_surface_flags (buf, &btrans->in_info), + }; + + dst = (GstVaSample) { + .buffer = outbuf, + .borders_h = self->borders_h, + .borders_w = self->borders_w, + .flags = gst_va_buffer_get_surface_flags (outbuf, &btrans->out_info), + }; + /* *INDENT-ON* */ + + if (!gst_va_filter_process (btrans->filter, &src, &dst)) { + gst_buffer_set_flags (outbuf, GST_BUFFER_FLAG_CORRUPTED); + } + + gst_buffer_unref (buf); + + return res; + + /* ERRORS */ +unknown_format: + { + GST_ELEMENT_ERROR (self, CORE, NOT_IMPLEMENTED, (NULL), ("unknown format")); + return GST_FLOW_NOT_NEGOTIATED; + } +} + +static gboolean +gst_va_vpp_transform_meta (GstBaseTransform * trans, GstBuffer * inbuf, + GstMeta * meta, GstBuffer * outbuf) +{ + GstVaVpp *self = GST_VA_VPP (trans); + const GstMetaInfo *info = meta->info; + const gchar *const *tags; + + tags = gst_meta_api_type_get_tags (info->api); + + if (!tags) + return TRUE; + + /* don't copy colorspace/size/orientation specific metadata */ + if ((self->op_flags & VPP_CONVERT_FORMAT) + && gst_meta_api_type_has_tag (info->api, META_TAG_COLORSPACE)) + return FALSE; + else if ((self->op_flags & 
(VPP_CONVERT_SIZE | VPP_CONVERT_CROP)) + && gst_meta_api_type_has_tag (info->api, META_TAG_SIZE)) + return FALSE; + else if ((self->op_flags & VPP_CONVERT_DIRECTION) + && gst_meta_api_type_has_tag (info->api, META_TAG_ORIENTATION)) + return FALSE; + else if (gst_meta_api_type_has_tag (info->api, META_TAG_VIDEO)) + return TRUE; + + return FALSE; +} + +/* In structures with supported caps features it's: + * + Rangified resolution size. + * + Rangified "pixel-aspect-ratio" if present. + * + Removed "format", "colorimetry", "chroma-site" + * + * Structures with unsupported caps features are copied as-is. + */ +static GstCaps * +gst_va_vpp_caps_remove_fields (GstCaps * caps) +{ + GstCaps *ret; + GstStructure *structure; + GstCapsFeatures *features; + gint i, j, n, m; + + ret = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + structure = gst_caps_get_structure (caps, i); + features = gst_caps_get_features (caps, i); + + /* If this is already expressed by the existing caps + * skip this structure */ + if (i > 0 && gst_caps_is_subset_structure_full (ret, structure, features)) + continue; + + structure = gst_structure_copy (structure); + + m = gst_caps_features_get_size (features); + for (j = 0; j < m; j++) { + const gchar *feature = gst_caps_features_get_nth (features, j); + + if (g_strcmp0 (feature, GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY) == 0 + || g_strcmp0 (feature, GST_CAPS_FEATURE_MEMORY_DMABUF) == 0 + || g_strcmp0 (feature, GST_CAPS_FEATURE_MEMORY_VA) == 0) { + + /* rangify frame size */ + gst_structure_set (structure, "width", GST_TYPE_INT_RANGE, 1, G_MAXINT, + "height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL); + + /* if pixel aspect ratio, make a range of it */ + if (gst_structure_has_field (structure, "pixel-aspect-ratio")) { + gst_structure_set (structure, "pixel-aspect-ratio", + GST_TYPE_FRACTION_RANGE, 1, G_MAXINT, G_MAXINT, 1, NULL); + } + + /* remove format-related fields */ + gst_structure_remove_fields (structure, 
"format", "colorimetry", + "chroma-site", NULL); + + break; + } + } + + gst_caps_append_structure_full (ret, structure, + gst_caps_features_copy (features)); + } + + return ret; +} + +/* Returns all structures in @caps without @feature_name but now with + * @feature_name */ +static GstCaps * +gst_va_vpp_complete_caps_features (const GstCaps * caps, + const gchar * feature_name) +{ + guint i, j, m, n; + GstCaps *tmp; + + tmp = gst_caps_new_empty (); + + n = gst_caps_get_size (caps); + for (i = 0; i < n; i++) { + GstCapsFeatures *features, *orig_features; + GstStructure *s = gst_caps_get_structure (caps, i); + gboolean contained = FALSE; + + orig_features = gst_caps_get_features (caps, i); + features = gst_caps_features_new (feature_name, NULL); + + m = gst_caps_features_get_size (orig_features); + for (j = 0; j < m; j++) { + const gchar *feature = gst_caps_features_get_nth (orig_features, j); + + /* if we already have the features */ + if (gst_caps_features_contains (features, feature)) { + contained = TRUE; + break; + } + } + + if (!contained && !gst_caps_is_subset_structure_full (tmp, s, features)) + gst_caps_append_structure_full (tmp, gst_structure_copy (s), features); + else + gst_caps_features_free (features); + } + + return tmp; +} + +static GstCaps * +gst_va_vpp_transform_caps (GstBaseTransform * trans, GstPadDirection direction, + GstCaps * caps, GstCaps * filter) +{ + GstVaVpp *self = GST_VA_VPP (trans); + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (trans); + GstCaps *ret, *tmp, *filter_caps; + + GST_DEBUG_OBJECT (self, + "Transforming caps %" GST_PTR_FORMAT " in direction %s", caps, + (direction == GST_PAD_SINK) ? 
"sink" : "src"); + + filter_caps = gst_va_base_transform_get_filter_caps (btrans); + if (filter_caps && !gst_caps_can_intersect (caps, filter_caps)) { + ret = gst_caps_ref (caps); + goto bail; + } + + ret = gst_va_vpp_caps_remove_fields (caps); + + tmp = gst_va_vpp_complete_caps_features (ret, GST_CAPS_FEATURE_MEMORY_VA); + if (!gst_caps_is_subset (tmp, ret)) { + gst_caps_append (ret, tmp); + } else { + gst_caps_unref (tmp); + } + + tmp = gst_va_vpp_complete_caps_features (ret, GST_CAPS_FEATURE_MEMORY_DMABUF); + if (!gst_caps_is_subset (tmp, ret)) { + gst_caps_append (ret, tmp); + } else { + gst_caps_unref (tmp); + } + + tmp = gst_va_vpp_complete_caps_features (ret, + GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY); + if (!gst_caps_is_subset (tmp, ret)) { + gst_caps_append (ret, tmp); + } else { + gst_caps_unref (tmp); + } + +bail: + if (filter) { + GstCaps *intersection; + + intersection = + gst_caps_intersect_full (filter, ret, GST_CAPS_INTERSECT_FIRST); + gst_caps_unref (ret); + ret = intersection; + } + + GST_DEBUG_OBJECT (trans, "returning caps: %" GST_PTR_FORMAT, ret); + + return ret; +} + +/* + * This is an incomplete matrix of input formats and a score for the preferred output + * format. + * + * out: RGB24 RGB16 ARGB AYUV YUV444 YUV422 YUV420 YUV411 YUV410 PAL GRAY + * in + * RGB24 0 2 1 2 2 3 4 5 6 7 8 + * RGB16 1 0 1 2 2 3 4 5 6 7 8 + * ARGB 2 3 0 1 4 5 6 7 8 9 10 + * AYUV 3 4 1 0 2 5 6 7 8 9 10 + * YUV444 2 4 3 1 0 5 6 7 8 9 10 + * YUV422 3 5 4 2 1 0 6 7 8 9 10 + * YUV420 4 6 5 3 2 1 0 7 8 9 10 + * YUV411 4 6 5 3 2 1 7 0 8 9 10 + * YUV410 6 8 7 5 4 3 2 1 0 9 10 + * PAL 1 3 2 6 4 6 7 8 9 0 10 + * GRAY 1 4 3 2 1 5 6 7 8 9 0 + * + * PAL or GRAY are never preferred, if we can we would convert to PAL instead + * of GRAY, though + * less subsampling is preferred and if any, preferably horizontal + * We would like to keep the alpha, even if we would need to do colorspace conversion + * or lose depth. 
+ */ +#define SCORE_FORMAT_CHANGE 1 +#define SCORE_DEPTH_CHANGE 1 +#define SCORE_ALPHA_CHANGE 1 +#define SCORE_CHROMA_W_CHANGE 1 +#define SCORE_CHROMA_H_CHANGE 1 +#define SCORE_PALETTE_CHANGE 1 + +#define SCORE_COLORSPACE_LOSS 2 /* RGB <-> YUV */ +#define SCORE_DEPTH_LOSS 4 /* change bit depth */ +#define SCORE_ALPHA_LOSS 8 /* lose the alpha channel */ +#define SCORE_CHROMA_W_LOSS 16 /* horizontal subsample */ +#define SCORE_CHROMA_H_LOSS 32 /* vertical subsample */ +#define SCORE_PALETTE_LOSS 64 /* convert to palette format */ +#define SCORE_COLOR_LOSS 128 /* convert to GRAY */ + +#define COLORSPACE_MASK (GST_VIDEO_FORMAT_FLAG_YUV | \ + GST_VIDEO_FORMAT_FLAG_RGB | GST_VIDEO_FORMAT_FLAG_GRAY) +#define ALPHA_MASK (GST_VIDEO_FORMAT_FLAG_ALPHA) +#define PALETTE_MASK (GST_VIDEO_FORMAT_FLAG_PALETTE) + +/* calculate how much loss a conversion would incur */ +static gboolean +score_value (GstVaVpp * self, const GstVideoFormatInfo * in_info, + GstVideoFormat format, gint * min_loss, + const GstVideoFormatInfo ** out_info) +{ + const GstVideoFormatInfo *t_info; + GstVideoFormatFlags in_flags, t_flags; + gint loss; + + t_info = gst_video_format_get_info (format); + if (!t_info || t_info->format == GST_VIDEO_FORMAT_UNKNOWN) + return FALSE; + + /* accept input format immediately without loss */ + if (in_info == t_info) { + *min_loss = 0; + *out_info = t_info; + return TRUE; + } + + loss = SCORE_FORMAT_CHANGE; + + in_flags = GST_VIDEO_FORMAT_INFO_FLAGS (in_info); + in_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; + in_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; + in_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; + + t_flags = GST_VIDEO_FORMAT_INFO_FLAGS (t_info); + t_flags &= ~GST_VIDEO_FORMAT_FLAG_LE; + t_flags &= ~GST_VIDEO_FORMAT_FLAG_COMPLEX; + t_flags &= ~GST_VIDEO_FORMAT_FLAG_UNPACK; + + if ((t_flags & PALETTE_MASK) != (in_flags & PALETTE_MASK)) { + loss += SCORE_PALETTE_CHANGE; + if (t_flags & PALETTE_MASK) + loss += SCORE_PALETTE_LOSS; + } + + if ((t_flags & COLORSPACE_MASK) != (in_flags & 
COLORSPACE_MASK)) { + loss += SCORE_COLORSPACE_LOSS; + if (t_flags & GST_VIDEO_FORMAT_FLAG_GRAY) + loss += SCORE_COLOR_LOSS; + } + + if ((t_flags & ALPHA_MASK) != (in_flags & ALPHA_MASK)) { + loss += SCORE_ALPHA_CHANGE; + if (in_flags & ALPHA_MASK) + loss += SCORE_ALPHA_LOSS; + } + + if ((in_info->h_sub[1]) != (t_info->h_sub[1])) { + loss += SCORE_CHROMA_H_CHANGE; + if ((in_info->h_sub[1]) < (t_info->h_sub[1])) + loss += SCORE_CHROMA_H_LOSS; + } + if ((in_info->w_sub[1]) != (t_info->w_sub[1])) { + loss += SCORE_CHROMA_W_CHANGE; + if ((in_info->w_sub[1]) < (t_info->w_sub[1])) + loss += SCORE_CHROMA_W_LOSS; + } + + if ((in_info->bits) != (t_info->bits)) { + loss += SCORE_DEPTH_CHANGE; + if ((in_info->bits) > (t_info->bits)) + loss += SCORE_DEPTH_LOSS; + } + + GST_DEBUG_OBJECT (self, "score %s -> %s = %d", + GST_VIDEO_FORMAT_INFO_NAME (in_info), + GST_VIDEO_FORMAT_INFO_NAME (t_info), loss); + + if (loss < *min_loss) { + GST_DEBUG_OBJECT (self, "found new best %d", loss); + *out_info = t_info; + *min_loss = loss; + return TRUE; + } + + return FALSE; +} + +static GstCaps * +gst_va_vpp_fixate_format (GstVaVpp * self, GstCaps * caps, GstCaps * result) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + GstStructure *ins; + const gchar *in_format; + const GstVideoFormatInfo *in_info, *out_info = NULL; + GstCapsFeatures *features; + GstVideoFormat fmt; + gint min_loss = G_MAXINT; + guint i, best_i, capslen; + + ins = gst_caps_get_structure (caps, 0); + in_format = gst_structure_get_string (ins, "format"); + if (!in_format) + return NULL; + + GST_DEBUG_OBJECT (self, "source format %s", in_format); + + in_info = + gst_video_format_get_info (gst_video_format_from_string (in_format)); + if (!in_info) + return NULL; + + best_i = 0; + capslen = gst_caps_get_size (result); + GST_DEBUG_OBJECT (self, "iterate %d structures", capslen); + for (i = 0; i < capslen; i++) { + GstStructure *tests; + const GValue *format; + + tests = gst_caps_get_structure (result, i); + 
format = gst_structure_get_value (tests, "format"); + /* should not happen */ + if (format == NULL) + continue; + + features = gst_caps_get_features (result, i); + + if (GST_VALUE_HOLDS_LIST (format)) { + gint j, len; + + len = gst_value_list_get_size (format); + GST_DEBUG_OBJECT (self, "have %d formats", len); + for (j = 0; j < len; j++) { + const GValue *val; + + val = gst_value_list_get_value (format, j); + if (G_VALUE_HOLDS_STRING (val)) { + fmt = gst_video_format_from_string (g_value_get_string (val)); + if (!gst_va_filter_has_video_format (btrans->filter, fmt, features)) + continue; + if (score_value (self, in_info, fmt, &min_loss, &out_info)) + best_i = i; + if (min_loss == 0) + break; + } + } + } else if (G_VALUE_HOLDS_STRING (format)) { + fmt = gst_video_format_from_string (g_value_get_string (format)); + if (!gst_va_filter_has_video_format (btrans->filter, fmt, features)) + continue; + if (score_value (self, in_info, fmt, &min_loss, &out_info)) + best_i = i; + } + + if (min_loss == 0) + break; + } + + if (out_info) { + GstCaps *ret; + GstStructure *out; + + features = gst_caps_features_copy (gst_caps_get_features (result, best_i)); + out = gst_structure_copy (gst_caps_get_structure (result, best_i)); + gst_structure_set (out, "format", G_TYPE_STRING, + GST_VIDEO_FORMAT_INFO_NAME (out_info), NULL); + ret = gst_caps_new_full (out, NULL); + gst_caps_set_features_simple (ret, features); + return ret; + } + + return NULL; +} + +static void +gst_va_vpp_fixate_size (GstVaVpp * self, GstPadDirection direction, + GstCaps * caps, GstCaps * othercaps) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + GstStructure *ins, *outs; + const GValue *from_par, *to_par; + GValue fpar = { 0, }; + GValue tpar = { 0, }; + + ins = gst_caps_get_structure (caps, 0); + outs = gst_caps_get_structure (othercaps, 0); + + from_par = gst_structure_get_value (ins, "pixel-aspect-ratio"); + to_par = gst_structure_get_value (outs, "pixel-aspect-ratio"); + + /* If we're 
fixating from the sinkpad we always set the PAR and + * assume that missing PAR on the sinkpad means 1/1 and + * missing PAR on the srcpad means undefined + */ + if (direction == GST_PAD_SINK) { + if (!from_par) { + g_value_init (&fpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&fpar, 1, 1); + from_par = &fpar; + } + if (!to_par) { + g_value_init (&tpar, GST_TYPE_FRACTION_RANGE); + gst_value_set_fraction_range_full (&tpar, 1, G_MAXINT, G_MAXINT, 1); + to_par = &tpar; + } + } else { + if (!to_par) { + g_value_init (&tpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&tpar, 1, 1); + to_par = &tpar; + + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, + NULL); + } + if (!from_par) { + g_value_init (&fpar, GST_TYPE_FRACTION); + gst_value_set_fraction (&fpar, 1, 1); + from_par = &fpar; + } + } + + /* we have both PAR but they might not be fixated */ + { + gint from_w, from_h, from_par_n, from_par_d, to_par_n, to_par_d; + gint w = 0, h = 0; + gint from_dar_n, from_dar_d; + gint num, den; + + /* from_par should be fixed */ + g_return_if_fail (gst_value_is_fixed (from_par)); + + from_par_n = gst_value_get_fraction_numerator (from_par); + from_par_d = gst_value_get_fraction_denominator (from_par); + + gst_structure_get_int (ins, "width", &from_w); + gst_structure_get_int (ins, "height", &from_h); + + gst_structure_get_int (outs, "width", &w); + gst_structure_get_int (outs, "height", &h); + + /* if video-orientation changes */ + switch (gst_va_filter_get_orientation (btrans->filter)) { + case GST_VIDEO_ORIENTATION_90R: + case GST_VIDEO_ORIENTATION_90L: + case GST_VIDEO_ORIENTATION_UL_LR: + case GST_VIDEO_ORIENTATION_UR_LL: + SWAP (from_w, from_h); + SWAP (from_par_n, from_par_d); + break; + default: + break; + } + + /* if both width and height are already fixed, we can't do anything + * about it anymore */ + if (w && h) { + guint n, d; + + GST_DEBUG_OBJECT (self, "dimensions already set to %dx%d, not fixating", + w, h); + if (!gst_value_is_fixed 
(to_par)) { + if (gst_video_calculate_display_ratio (&n, &d, from_w, from_h, + from_par_n, from_par_d, w, h)) { + GST_DEBUG_OBJECT (self, "fixating to_par to %dx%d", n, d); + if (gst_structure_has_field (outs, "pixel-aspect-ratio")) + gst_structure_fixate_field_nearest_fraction (outs, + "pixel-aspect-ratio", n, d); + else if (n != d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + n, d, NULL); + } + } + goto done; + } + + /* Calculate input DAR */ + if (!gst_util_fraction_multiply (from_w, from_h, from_par_n, from_par_d, + &from_dar_n, &from_dar_d)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + GST_DEBUG_OBJECT (self, "Input DAR is %d/%d", from_dar_n, from_dar_d); + + /* If either width or height is fixed there's not much we + * can do either except choosing a height or width and PAR + * that matches the DAR as well as possible + */ + if (h) { + GstStructure *tmp; + gint set_w, set_par_n, set_par_d; + + GST_DEBUG_OBJECT (self, "height is fixed (%d)", h); + + /* If the PAR is fixed too, there's not much to do + * except choosing the width that is nearest to the + * width with the same DAR */ + if (gst_value_is_fixed (to_par)) { + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + GST_DEBUG_OBJECT (self, "PAR is fixed %d/%d", to_par_n, to_par_d); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, + to_par_n, &num, &den)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (h, num, den); + gst_structure_fixate_field_nearest_int (outs, "width", w); + + goto done; + } + + /* The PAR is not fixed and it's quite likely that we can set + * an arbitrary PAR. 
*/ + + /* Check if we can keep the input width */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + /* Might have failed but try to keep the DAR nonetheless by + * adjusting the PAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, h, set_w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, + &set_par_d); + gst_structure_free (tmp); + + /* Check if the adjusted PAR is accepted */ + if (set_par_n == to_par_n && set_par_d == to_par_d) { + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "width", G_TYPE_INT, set_w, + "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, + NULL); + goto done; + } + + /* Otherwise scale the width to the new PAR and check if the + * adjusted width is accepted. 
If all that fails we can't keep + * the DAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (h, num, den); + gst_structure_fixate_field_nearest_int (outs, "width", w); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + + goto done; + } else if (w) { + GstStructure *tmp; + gint set_h, set_par_n, set_par_d; + + GST_DEBUG_OBJECT (self, "width is fixed (%d)", w); + + /* If the PAR is fixed too, there's not much to do + * except choosing the height that is nearest to the + * height with the same DAR */ + if (gst_value_is_fixed (to_par)) { + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + GST_DEBUG_OBJECT (self, "PAR is fixed %d/%d", to_par_n, to_par_d); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_d, + to_par_n, &num, &den)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + h = (guint) gst_util_uint64_scale_int_round (w, den, num); + gst_structure_fixate_field_nearest_int (outs, "height", h); + + goto done; + } + + /* The PAR is not fixed and it's quite likely that we can set + * an arbitrary PAR. 
*/ + + /* Check if we can keep the input height */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + + /* Might have failed but try to keep the DAR nonetheless by + * adjusting the PAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, + &set_par_d); + gst_structure_free (tmp); + + /* Check if the adjusted PAR is accepted */ + if (set_par_n == to_par_n && set_par_d == to_par_d) { + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "height", G_TYPE_INT, set_h, + "pixel-aspect-ratio", GST_TYPE_FRACTION, set_par_n, set_par_d, + NULL); + goto done; + } + + /* Otherwise scale the height to the new PAR and check if the + * adjusted width is accepted. 
If all that fails we can't keep + * the DAR */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + h = (guint) gst_util_uint64_scale_int_round (w, den, num); + gst_structure_fixate_field_nearest_int (outs, "height", h); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + + goto done; + } else if (gst_value_is_fixed (to_par)) { + GstStructure *tmp; + gint set_h, set_w, f_h, f_w; + + to_par_n = gst_value_get_fraction_numerator (to_par); + to_par_d = gst_value_get_fraction_denominator (to_par); + + /* Calculate scale factor for the PAR change */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, to_par_n, + to_par_d, &num, &den)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + /* Try to keep the input height (because of interlacing) */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + + /* This might have failed but try to scale the width + * to keep the DAR nonetheless */ + w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); + gst_structure_fixate_field_nearest_int (tmp, "width", w); + gst_structure_get_int (tmp, "width", &set_w); + gst_structure_free (tmp); + + /* We kept the DAR and the height is nearest to the original height */ + if (set_w == w) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + goto done; + } + + f_h = set_h; + f_w = set_w; + + /* If the former failed, try to keep the input width at least */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int 
(tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + /* This might have failed but try to scale the width + * to keep the DAR nonetheless */ + h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); + gst_structure_fixate_field_nearest_int (tmp, "height", h); + gst_structure_get_int (tmp, "height", &set_h); + gst_structure_free (tmp); + + /* We kept the DAR and the width is nearest to the original width */ + if (set_h == h) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + goto done; + } + + /* If all this failed, keep the dimensions with the DAR that was closest + * to the correct DAR. This changes the DAR but there's not much else to + * do here. + */ + if (set_w * ABS (set_h - h) < ABS (f_w - w) * f_h) { + f_h = set_h; + f_w = set_w; + } + gst_structure_set (outs, "width", G_TYPE_INT, f_w, "height", G_TYPE_INT, + f_h, NULL); + goto done; + } else { + GstStructure *tmp; + gint set_h, set_w, set_par_n, set_par_d, tmp2; + + /* width, height and PAR are not fixed but passthrough is not possible */ + + /* First try to keep the height and width as good as possible + * and scale PAR */ + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", from_h); + gst_structure_get_int (tmp, "height", &set_h); + gst_structure_fixate_field_nearest_int (tmp, "width", from_w); + gst_structure_get_int (tmp, "width", &set_w); + + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_h, set_w, + &to_par_n, &to_par_d)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + gst_structure_free (tmp); + goto done; + } + + if (!gst_structure_has_field (tmp, "pixel-aspect-ratio")) + gst_structure_set_value (tmp, "pixel-aspect-ratio", to_par); + gst_structure_fixate_field_nearest_fraction (tmp, "pixel-aspect-ratio", + to_par_n, to_par_d); + gst_structure_get_fraction (tmp, "pixel-aspect-ratio", &set_par_n, 
+ &set_par_d); + gst_structure_free (tmp); + + if (set_par_n == to_par_n && set_par_d == to_par_d) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* Otherwise try to scale width to keep the DAR with the set + * PAR and height */ + if (!gst_util_fraction_multiply (from_dar_n, from_dar_d, set_par_d, + set_par_n, &num, &den)) { + GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, (NULL), + ("Error calculating the output scaled size - integer overflow")); + goto done; + } + + w = (guint) gst_util_uint64_scale_int_round (set_h, num, den); + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "width", w); + gst_structure_get_int (tmp, "width", &tmp2); + gst_structure_free (tmp); + + if (tmp2 == w) { + gst_structure_set (outs, "width", G_TYPE_INT, tmp2, "height", + G_TYPE_INT, set_h, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* ... 
or try the same with the height */ + h = (guint) gst_util_uint64_scale_int_round (set_w, den, num); + tmp = gst_structure_copy (outs); + gst_structure_fixate_field_nearest_int (tmp, "height", h); + gst_structure_get_int (tmp, "height", &tmp2); + gst_structure_free (tmp); + + if (tmp2 == h) { + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, tmp2, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + goto done; + } + + /* If all fails we can't keep the DAR and take the nearest values + * for everything from the first try */ + gst_structure_set (outs, "width", G_TYPE_INT, set_w, "height", + G_TYPE_INT, set_h, NULL); + if (gst_structure_has_field (outs, "pixel-aspect-ratio") || + set_par_n != set_par_d) + gst_structure_set (outs, "pixel-aspect-ratio", GST_TYPE_FRACTION, + set_par_n, set_par_d, NULL); + } + } + +done: + if (from_par == &fpar) + g_value_unset (&fpar); + if (to_par == &tpar) + g_value_unset (&tpar); +} + +static gboolean +subsampling_unchanged (GstVideoInfo * in_info, GstVideoInfo * out_info) +{ + gint i; + const GstVideoFormatInfo *in_format, *out_format; + + if (GST_VIDEO_INFO_N_COMPONENTS (in_info) != + GST_VIDEO_INFO_N_COMPONENTS (out_info)) + return FALSE; + + in_format = in_info->finfo; + out_format = out_info->finfo; + + for (i = 0; i < GST_VIDEO_INFO_N_COMPONENTS (in_info); i++) { + if (GST_VIDEO_FORMAT_INFO_W_SUB (in_format, + i) != GST_VIDEO_FORMAT_INFO_W_SUB (out_format, i)) + return FALSE; + if (GST_VIDEO_FORMAT_INFO_H_SUB (in_format, + i) != GST_VIDEO_FORMAT_INFO_H_SUB (out_format, i)) + return FALSE; + } + + return TRUE; +} + +static void +transfer_colorimetry_from_input (GstVaVpp * self, GstCaps * in_caps, + GstCaps * out_caps) +{ + GstStructure *out_caps_s = gst_caps_get_structure (out_caps, 0); + GstStructure *in_caps_s = gst_caps_get_structure (in_caps, 0); + gboolean 
have_colorimetry = + gst_structure_has_field (out_caps_s, "colorimetry"); + gboolean have_chroma_site = + gst_structure_has_field (out_caps_s, "chroma-site"); + + /* If the output already has colorimetry and chroma-site, stop, + * otherwise try and transfer what we can from the input caps */ + if (have_colorimetry && have_chroma_site) + return; + + { + GstVideoInfo in_info, out_info; + const GValue *in_colorimetry = + gst_structure_get_value (in_caps_s, "colorimetry"); + + if (!gst_video_info_from_caps (&in_info, in_caps)) { + GST_WARNING_OBJECT (self, + "Failed to convert sink pad caps to video info"); + return; + } + if (!gst_video_info_from_caps (&out_info, out_caps)) { + GST_WARNING_OBJECT (self, "Failed to convert src pad caps to video info"); + return; + } + + if (!have_colorimetry && in_colorimetry != NULL) { + if ((GST_VIDEO_INFO_IS_YUV (&out_info) + && GST_VIDEO_INFO_IS_YUV (&in_info)) + || (GST_VIDEO_INFO_IS_RGB (&out_info) + && GST_VIDEO_INFO_IS_RGB (&in_info)) + || (GST_VIDEO_INFO_IS_GRAY (&out_info) + && GST_VIDEO_INFO_IS_GRAY (&in_info))) { + /* Can transfer the colorimetry intact from the input if it has it */ + gst_structure_set_value (out_caps_s, "colorimetry", in_colorimetry); + } else { + gchar *colorimetry_str; + + /* Changing between YUV/RGB - forward primaries and transfer function, but use + * default range and matrix. + * the primaries is used for conversion between RGB and XYZ (CIE 1931 coordinate). + * the transfer function could be another reference (e.g., HDR) + */ + out_info.colorimetry.primaries = in_info.colorimetry.primaries; + out_info.colorimetry.transfer = in_info.colorimetry.transfer; + + colorimetry_str = + gst_video_colorimetry_to_string (&out_info.colorimetry); + gst_caps_set_simple (out_caps, "colorimetry", G_TYPE_STRING, + colorimetry_str, NULL); + g_free (colorimetry_str); + } + } + + /* Only YUV output needs chroma-site. If the input was also YUV and had the same chroma + * subsampling, transfer the siting. 
If the sub-sampling is changing, then the planes get + * scaled anyway so there's no real reason to prefer the input siting. */ + if (!have_chroma_site && GST_VIDEO_INFO_IS_YUV (&out_info)) { + if (GST_VIDEO_INFO_IS_YUV (&in_info)) { + const GValue *in_chroma_site = + gst_structure_get_value (in_caps_s, "chroma-site"); + if (in_chroma_site != NULL + && subsampling_unchanged (&in_info, &out_info)) + gst_structure_set_value (out_caps_s, "chroma-site", in_chroma_site); + } + } + } +} + +static void +copy_misc_fields_from_input (GstCaps * in_caps, GstCaps * out_caps) +{ + const gchar *fields[] = { "interlace-mode", "field-order", "multiview-mode", + "multiview-flags", "framerate" + }; + GstStructure *out_caps_s = gst_caps_get_structure (out_caps, 0); + GstStructure *in_caps_s = gst_caps_get_structure (in_caps, 0); + int i; + + for (i = 0; i < G_N_ELEMENTS (fields); i++) { + const GValue *in_field = gst_structure_get_value (in_caps_s, fields[i]); + const GValue *out_field = gst_structure_get_value (out_caps_s, fields[i]); + + if (out_field && gst_value_is_fixed (out_field)) + continue; + + if (in_field) + gst_structure_set_value (out_caps_s, fields[i], in_field); + } +} + +static GstCaps * +gst_va_vpp_fixate_caps (GstBaseTransform * trans, GstPadDirection direction, + GstCaps * caps, GstCaps * othercaps) +{ + GstVaVpp *self = GST_VA_VPP (trans); + GstCaps *result; + + GST_DEBUG_OBJECT (self, + "trying to fixate othercaps %" GST_PTR_FORMAT " based on caps %" + GST_PTR_FORMAT, othercaps, caps); + + /* will iterate over all structures to find one with "best color" */ + result = gst_va_vpp_fixate_format (self, caps, othercaps); + if (!result) + return othercaps; + + gst_clear_caps (&othercaps); + + gst_va_vpp_fixate_size (self, direction, caps, result); + + /* some fields might be lost during feature caps conversion */ + copy_misc_fields_from_input (caps, result); + + /* fixate remaining fields */ + result = gst_caps_fixate (result); + + if (direction == GST_PAD_SINK) { + if 
(gst_caps_is_subset (caps, result)) { + gst_caps_replace (&result, caps); + } else { + /* Try and preserve input colorimetry / chroma information */ + transfer_colorimetry_from_input (self, caps, result); + } + } + + GST_DEBUG_OBJECT (self, "fixated othercaps to %" GST_PTR_FORMAT, result); + + return result; +} + +static void +_get_scale_factor (GstVaVpp * self, gdouble * w_factor, gdouble * h_factor) +{ + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (self); + gdouble w = GST_VIDEO_INFO_WIDTH (&btrans->in_info); + gdouble h = GST_VIDEO_INFO_HEIGHT (&btrans->in_info); + + switch (self->direction) { + case GST_VIDEO_ORIENTATION_90R: + case GST_VIDEO_ORIENTATION_90L: + case GST_VIDEO_ORIENTATION_UR_LL: + case GST_VIDEO_ORIENTATION_UL_LR:{ + gdouble tmp = h; + h = w; + w = tmp; + break; + } + default: + break; + } + + *w_factor = GST_VIDEO_INFO_WIDTH (&btrans->out_info); + *w_factor /= w; + + *h_factor = GST_VIDEO_INFO_HEIGHT (&btrans->out_info); + *h_factor /= h; +} + +static gboolean +gst_va_vpp_src_event (GstBaseTransform * trans, GstEvent * event) +{ + GstVaVpp *self = GST_VA_VPP (trans); + GstVaBaseTransform *btrans = GST_VA_BASE_TRANSFORM (trans); + GstStructure *structure; + const GstVideoInfo *in_info = &btrans->in_info, *out_info = &btrans->out_info; + gdouble new_x = 0, new_y = 0, x = 0, y = 0, w_factor = 1, h_factor = 1; + gboolean ret; + + GST_TRACE_OBJECT (self, "handling %s event", GST_EVENT_TYPE_NAME (event)); + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_NAVIGATION: + if (GST_VIDEO_INFO_WIDTH (in_info) != GST_VIDEO_INFO_WIDTH (out_info) + || GST_VIDEO_INFO_HEIGHT (in_info) != GST_VIDEO_INFO_HEIGHT (out_info) + || gst_va_filter_get_orientation (btrans->filter) != + GST_VIDEO_ORIENTATION_IDENTITY) { + + event = + GST_EVENT (gst_mini_object_make_writable (GST_MINI_OBJECT (event))); + + structure = (GstStructure *) gst_event_get_structure (event); + if (!gst_structure_get_double (structure, "pointer_x", &x) + || !gst_structure_get_double 
(structure, "pointer_y", &y)) + break; + + /* video-direction compensation */ + switch (self->direction) { + case GST_VIDEO_ORIENTATION_90R: + new_x = y; + new_y = GST_VIDEO_INFO_WIDTH (in_info) - 1 - x; + break; + case GST_VIDEO_ORIENTATION_90L: + new_x = GST_VIDEO_INFO_HEIGHT (in_info) - 1 - y; + new_y = x; + break; + case GST_VIDEO_ORIENTATION_UR_LL: + new_x = GST_VIDEO_INFO_HEIGHT (in_info) - 1 - y; + new_y = GST_VIDEO_INFO_WIDTH (in_info) - 1 - x; + break; + case GST_VIDEO_ORIENTATION_UL_LR: + new_x = y; + new_y = x; + break; + case GST_VIDEO_ORIENTATION_180: + /* FIXME: is this correct? */ + new_x = GST_VIDEO_INFO_WIDTH (in_info) - 1 - x; + new_y = GST_VIDEO_INFO_HEIGHT (in_info) - 1 - y; + break; + case GST_VIDEO_ORIENTATION_HORIZ: + new_x = GST_VIDEO_INFO_WIDTH (in_info) - 1 - x; + new_y = y; + break; + case GST_VIDEO_ORIENTATION_VERT: + new_x = x; + new_y = GST_VIDEO_INFO_HEIGHT (in_info) - 1 - y; + break; + default: + new_x = x; + new_y = y; + break; + } + + /* scale compensation */ + _get_scale_factor (self, &w_factor, &h_factor); + new_x *= w_factor; + new_y *= h_factor; + + /* crop compensation is done by videocrop */ + + GST_TRACE_OBJECT (self, "from %fx%f to %fx%f", x, y, new_x, new_y); + gst_structure_set (structure, "pointer_x", G_TYPE_DOUBLE, new_x, + "pointer_y", G_TYPE_DOUBLE, new_y, NULL); + } + break; + default: + break; + } + + ret = GST_BASE_TRANSFORM_CLASS (parent_class)->src_event (trans, event); + + return ret; +} + +static gboolean +gst_va_vpp_sink_event (GstBaseTransform * trans, GstEvent * event) +{ + GstVaVpp *self = GST_VA_VPP (trans); + GstTagList *taglist; + gchar *orientation; + + switch (GST_EVENT_TYPE (event)) { + case GST_EVENT_TAG: + gst_event_parse_tag (event, &taglist); + + if (!gst_tag_list_get_string (taglist, "image-orientation", &orientation)) + break; + + if (self->direction != GST_VIDEO_ORIENTATION_AUTO) + break; + + GST_DEBUG_OBJECT (self, "tag orientation %s", orientation); + + GST_OBJECT_LOCK (self); + if 
(!g_strcmp0 ("rotate-0", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_IDENTITY; + else if (!g_strcmp0 ("rotate-90", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_90R; + else if (!g_strcmp0 ("rotate-180", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_180; + else if (!g_strcmp0 ("rotate-270", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_90L; + else if (!g_strcmp0 ("flip-rotate-0", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_HORIZ; + else if (!g_strcmp0 ("flip-rotate-90", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_UL_LR; + else if (!g_strcmp0 ("flip-rotate-180", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_VERT; + else if (!g_strcmp0 ("flip-rotate-270", orientation)) + self->tag_direction = GST_VIDEO_ORIENTATION_UR_LL; + + _update_properties_unlocked (self); + GST_OBJECT_UNLOCK (self); + + gst_va_vpp_update_passthrough (self, FALSE); + + break; + default: + break; + } + + return GST_BASE_TRANSFORM_CLASS (parent_class)->sink_event (trans, event); +} + +static void +gst_va_vpp_class_init (gpointer g_class, gpointer class_data) +{ + GstCaps *doc_caps, *caps = NULL; + GstPadTemplate *sink_pad_templ, *src_pad_templ; + GObjectClass *object_class = G_OBJECT_CLASS (g_class); + GstBaseTransformClass *trans_class = GST_BASE_TRANSFORM_CLASS (g_class); + GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); + GstVaBaseTransformClass *btrans_class = GST_VA_BASE_TRANSFORM_CLASS (g_class); + GstVaDisplay *display; + GstVaFilter *filter; + struct CData *cdata = class_data; + gchar *long_name; + + parent_class = g_type_class_peek_parent (g_class); + + btrans_class->render_device_path = g_strdup (cdata->render_device_path); + + if (cdata->description) { + long_name = g_strdup_printf ("VA-API Video Postprocessor in %s", + cdata->description); + } else { + long_name = g_strdup ("VA-API Video Postprocessor"); + } + + gst_element_class_set_metadata (element_class, long_name, + 
"Filter/Converter/Video/Scaler/Hardware", + "VA-API based video postprocessor", + "Víctor Jáquez <vjaquez@igalia.com>"); + + display = gst_va_display_drm_new_from_path (btrans_class->render_device_path); + filter = gst_va_filter_new (display); + + if (gst_va_filter_open (filter)) { + caps = gst_va_filter_get_caps (filter); + + /* adds any to enable passthrough */ + { + GstCaps *any_caps = gst_caps_new_empty_simple ("video/x-raw"); + gst_caps_set_features_simple (any_caps, gst_caps_features_new_any ()); + caps = gst_caps_merge (caps, any_caps); + } + } else { + caps = gst_caps_from_string (caps_str); + } + + doc_caps = gst_caps_from_string (caps_str); + + sink_pad_templ = gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, + caps); + gst_element_class_add_pad_template (element_class, sink_pad_templ); + gst_pad_template_set_documentation_caps (sink_pad_templ, + gst_caps_ref (doc_caps)); + + src_pad_templ = gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, + caps); + gst_element_class_add_pad_template (element_class, src_pad_templ); + gst_pad_template_set_documentation_caps (src_pad_templ, + gst_caps_ref (doc_caps)); + gst_caps_unref (doc_caps); + + gst_caps_unref (caps); + + object_class->dispose = gst_va_vpp_dispose; + object_class->set_property = gst_va_vpp_set_property; + object_class->get_property = gst_va_vpp_get_property; + + trans_class->propose_allocation = + GST_DEBUG_FUNCPTR (gst_va_vpp_propose_allocation); + trans_class->transform_caps = GST_DEBUG_FUNCPTR (gst_va_vpp_transform_caps); + trans_class->fixate_caps = GST_DEBUG_FUNCPTR (gst_va_vpp_fixate_caps); + trans_class->before_transform = + GST_DEBUG_FUNCPTR (gst_va_vpp_before_transform); + trans_class->transform = GST_DEBUG_FUNCPTR (gst_va_vpp_transform); + trans_class->transform_meta = GST_DEBUG_FUNCPTR (gst_va_vpp_transform_meta); + trans_class->src_event = GST_DEBUG_FUNCPTR (gst_va_vpp_src_event); + trans_class->sink_event = GST_DEBUG_FUNCPTR (gst_va_vpp_sink_event); + + 
trans_class->transform_ip_on_passthrough = FALSE; + + btrans_class->set_info = GST_DEBUG_FUNCPTR (gst_va_vpp_set_info); + btrans_class->update_properties = + GST_DEBUG_FUNCPTR (gst_va_vpp_update_properties); + + gst_va_filter_install_properties (filter, object_class); + + g_free (long_name); + g_free (cdata->description); + g_free (cdata->render_device_path); + g_free (cdata); + gst_object_unref (filter); + gst_object_unref (display); +} + +static inline void +_create_colorbalance_channel (GstVaVpp * self, const gchar * label) +{ + GstColorBalanceChannel *channel; + + channel = g_object_new (GST_TYPE_COLOR_BALANCE_CHANNEL, NULL); + channel->label = g_strdup_printf ("VA-%s", label); + channel->min_value = -1000; + channel->max_value = 1000; + + self->channels = g_list_append (self->channels, channel); +} + +static void +gst_va_vpp_init (GTypeInstance * instance, gpointer g_class) +{ + GstVaVpp *self = GST_VA_VPP (instance); + GParamSpec *pspec; + + self->direction = GST_VIDEO_ORIENTATION_IDENTITY; + self->prev_direction = self->direction; + self->tag_direction = GST_VIDEO_ORIENTATION_AUTO; + + pspec = g_object_class_find_property (g_class, "denoise"); + if (pspec) + self->denoise = g_value_get_float (g_param_spec_get_default_value (pspec)); + + pspec = g_object_class_find_property (g_class, "sharpen"); + if (pspec) + self->sharpen = g_value_get_float (g_param_spec_get_default_value (pspec)); + + pspec = g_object_class_find_property (g_class, "skin-tone"); + if (pspec) { + const GValue *value = g_param_spec_get_default_value (pspec); + if (G_VALUE_TYPE (value) == G_TYPE_BOOLEAN) + self->skintone = g_value_get_boolean (value); + else + self->skintone = g_value_get_float (value); + } + + /* color balance */ + pspec = g_object_class_find_property (g_class, "brightness"); + if (pspec) { + self->brightness = + g_value_get_float (g_param_spec_get_default_value (pspec)); + _create_colorbalance_channel (self, "BRIGHTNESS"); + } + pspec = g_object_class_find_property 
(g_class, "contrast"); + if (pspec) { + self->contrast = g_value_get_float (g_param_spec_get_default_value (pspec)); + _create_colorbalance_channel (self, "CONTRAST"); + } + pspec = g_object_class_find_property (g_class, "hue"); + if (pspec) { + self->hue = g_value_get_float (g_param_spec_get_default_value (pspec)); + _create_colorbalance_channel (self, "HUE"); + } + pspec = g_object_class_find_property (g_class, "saturation"); + if (pspec) { + self->saturation = + g_value_get_float (g_param_spec_get_default_value (pspec)); + _create_colorbalance_channel (self, "SATURATION"); + } + + /* enable QoS */ + gst_base_transform_set_qos_enabled (GST_BASE_TRANSFORM (instance), TRUE); +} + +static gpointer +_register_debug_category (gpointer data) +{ + GST_DEBUG_CATEGORY_INIT (gst_va_vpp_debug, "vapostproc", 0, + "VA Video Postprocessor"); + +#define D(type) \ + G_PASTE (META_TAG_, type) = \ + g_quark_from_static_string (G_PASTE (G_PASTE (GST_META_TAG_VIDEO_, type), _STR)) + D (COLORSPACE); + D (SIZE); + D (ORIENTATION); +#undef D + META_TAG_VIDEO = g_quark_from_static_string (GST_META_TAG_VIDEO_STR); + + return NULL; +} + +gboolean +gst_va_vpp_register (GstPlugin * plugin, GstVaDevice * device, + gboolean has_colorbalance, guint rank) +{ + static GOnce debug_once = G_ONCE_INIT; + GType type; + GTypeInfo type_info = { + .class_size = sizeof (GstVaVppClass), + .class_init = gst_va_vpp_class_init, + .instance_size = sizeof (GstVaVpp), + .instance_init = gst_va_vpp_init, + }; + struct CData *cdata; + gboolean ret; + gchar *type_name, *feature_name; + + g_return_val_if_fail (GST_IS_PLUGIN (plugin), FALSE); + g_return_val_if_fail (GST_IS_VA_DEVICE (device), FALSE); + + cdata = g_new (struct CData, 1); + cdata->description = NULL; + cdata->render_device_path = g_strdup (device->render_device_path); + + type_info.class_data = cdata; + + type_name = g_strdup ("GstVaPostProc"); + feature_name = g_strdup ("vapostproc"); + + /* The first postprocessor to be registered should use a 
constant + * name, like vapostproc; for any additional postprocessors we + * create unique names by inserting the render device name. */ + if (g_type_from_name (type_name)) { + gchar *basename = g_path_get_basename (device->render_device_path); + g_free (type_name); + g_free (feature_name); + type_name = g_strdup_printf ("GstVa%sPostProc", basename); + feature_name = g_strdup_printf ("va%spostproc", basename); + cdata->description = basename; + + /* lower rank for non-first device */ + if (rank > 0) + rank--; + } + + g_once (&debug_once, _register_debug_category, NULL); + + type = g_type_register_static (GST_TYPE_VA_BASE_TRANSFORM, type_name, + &type_info, 0); + + if (has_colorbalance) { + const GInterfaceInfo info = { gst_va_vpp_colorbalance_init, NULL, NULL }; + g_type_add_interface_static (type, GST_TYPE_COLOR_BALANCE, &info); + } + + ret = gst_element_register (plugin, feature_name, rank, type); + + g_free (type_name); + g_free (feature_name); + + return ret; +} + +/* Color Balance interface */ +static const GList * +gst_va_vpp_colorbalance_list_channels (GstColorBalance * balance) +{ + GstVaVpp *self = GST_VA_VPP (balance); + + return self->channels; +} + +/* This assumes (as happens with Intel drivers) that the max values are + * bigger than the symmetric counterparts of the min values */ +static float +make_max_simmetrical (GParamSpecFloat * fpspec) +{ + gfloat max; + + if (fpspec->default_value == 0) + max = -fpspec->minimum; + else + max = fpspec->default_value + ABS (fpspec->minimum - fpspec->default_value); + + return MIN (max, fpspec->maximum); +} + +static gboolean +_set_cb_val (GstVaVpp * self, const gchar * name, + GstColorBalanceChannel * channel, gint value, gfloat * cb) +{ + GObjectClass *klass = G_OBJECT_CLASS (GST_VA_VPP_GET_CLASS (self)); + GParamSpec *pspec; + GParamSpecFloat *fpspec; + gfloat new_value, max; + gboolean changed; + + pspec = g_object_class_find_property (klass, name); + if (!pspec) + return FALSE; + + fpspec = G_PARAM_SPEC_FLOAT 
(pspec); + max = make_max_simmetrical (fpspec); + + new_value = (value - channel->min_value) * (max - fpspec->minimum) + / (channel->max_value - channel->min_value) + fpspec->minimum; + + GST_OBJECT_LOCK (self); + changed = new_value != *cb; + *cb = new_value; + value = (*cb + fpspec->minimum) * (channel->max_value - channel->min_value) + / (max - fpspec->minimum) + channel->min_value; + GST_OBJECT_UNLOCK (self); + + if (changed) { + GST_INFO_OBJECT (self, "%s: %d / %f", channel->label, value, new_value); + gst_color_balance_value_changed (GST_COLOR_BALANCE (self), channel, value); + g_atomic_int_set (&self->rebuild_filters, TRUE); + } + + return TRUE; +} + +static void +gst_va_vpp_colorbalance_set_value (GstColorBalance * balance, + GstColorBalanceChannel * channel, gint value) +{ + GstVaVpp *self = GST_VA_VPP (balance); + + if (g_str_has_suffix (channel->label, "HUE")) + _set_cb_val (self, "hue", channel, value, &self->hue); + else if (g_str_has_suffix (channel->label, "BRIGHTNESS")) + _set_cb_val (self, "brightness", channel, value, &self->brightness); + else if (g_str_has_suffix (channel->label, "CONTRAST")) + _set_cb_val (self, "contrast", channel, value, &self->contrast); + else if (g_str_has_suffix (channel->label, "SATURATION")) + _set_cb_val (self, "saturation", channel, value, &self->saturation); +} + +static gboolean +_get_cb_val (GstVaVpp * self, const gchar * name, + GstColorBalanceChannel * channel, gfloat * cb, gint * val) +{ + GObjectClass *klass = G_OBJECT_CLASS (GST_VA_VPP_GET_CLASS (self)); + GParamSpec *pspec; + GParamSpecFloat *fpspec; + gfloat max; + + pspec = g_object_class_find_property (klass, name); + if (!pspec) + return FALSE; + + fpspec = G_PARAM_SPEC_FLOAT (pspec); + max = make_max_simmetrical (fpspec); + + GST_OBJECT_LOCK (self); + *val = (*cb + fpspec->minimum) * (channel->max_value - channel->min_value) + / (max - fpspec->minimum) + channel->min_value; + GST_OBJECT_UNLOCK (self); + + return TRUE; +} + +static gint 
+gst_va_vpp_colorbalance_get_value (GstColorBalance * balance, + GstColorBalanceChannel * channel) +{ + GstVaVpp *self = GST_VA_VPP (balance); + gint value = 0; + + if (g_str_has_suffix (channel->label, "HUE")) + _get_cb_val (self, "hue", channel, &self->hue, &value); + else if (g_str_has_suffix (channel->label, "BRIGHTNESS")) + _get_cb_val (self, "brightness", channel, &self->brightness, &value); + else if (g_str_has_suffix (channel->label, "CONTRAST")) + _get_cb_val (self, "contrast", channel, &self->contrast, &value); + else if (g_str_has_suffix (channel->label, "SATURATION")) + _get_cb_val (self, "saturation", channel, &self->saturation, &value); + + return value; +} + +static GstColorBalanceType +gst_va_vpp_colorbalance_get_balance_type (GstColorBalance * balance) +{ + return GST_COLOR_BALANCE_HARDWARE; +} + +static void +gst_va_vpp_colorbalance_init (gpointer iface, gpointer data) +{ + GstColorBalanceInterface *cbiface = iface; + + cbiface->list_channels = gst_va_vpp_colorbalance_list_channels; + cbiface->set_value = gst_va_vpp_colorbalance_set_value; + cbiface->get_value = gst_va_vpp_colorbalance_get_value; + cbiface->get_balance_type = gst_va_vpp_colorbalance_get_balance_type; +}
View file
gst-plugins-bad-1.20.1.tar.xz/sys/va/gstvavpp.h
Added
@@ -0,0 +1,32 @@ +/* GStreamer + * Copyright (C) 2020 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include "gstvadevice.h" + +G_BEGIN_DECLS + +gboolean gst_va_vpp_register (GstPlugin * plugin, + GstVaDevice * device, + gboolean has_colorbalance, + guint rank); + +G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/va/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/va/meson.build
Changed
@@ -1,43 +1,40 @@
 va_sources = [
   'plugin.c',
   'gstvaallocator.c',
+  'gstvabasedec.c',
+  'gstvabasetransform.c',
   'gstvacaps.c',
   'gstvadecoder.c',
-  'gstvadisplay.c',
-  'gstvadisplay_drm.c',
-  'gstvadisplay_wrapped.c',
+  'gstvadeinterlace.c',
   'gstvadevice.c',
+  'gstvadisplay_priv.c',
+  'gstvafilter.c',
   'gstvah264dec.c',
+  'gstvah265dec.c',
   'gstvapool.c',
   'gstvaprofile.c',
+  'gstvasurfacecopy.c',
   'gstvautils.c',
   'gstvavideoformat.c',
+  'gstvavp8dec.c',
+  'gstvavp9dec.c',
+  'gstvampeg2dec.c',
+  'gstvavpp.c',
+  'vasurfaceimage.c',
 ]
 
-va_option = get_option('va')
-if va_option.disabled()
+if not gstva_dep.found()
   subdir_done()
 endif
 
-libva_req = ['>= 1.6']
+libgudev_dep = dependency('gudev-1.0', required: false)
+cdata.set10('HAVE_GUDEV', libgudev_dep.found())
 
-libva_dep = dependency('libva', version: libva_req, required: va_option)
-libva_drm_dep = dependency('libva-drm', version: libva_req, required: va_option)
-libgudev_dep = dependency('gudev-1.0', required: va_option)
-libdrm_dep = dependency('libdrm', required: false,
-  fallback: ['libdrm', 'ext_libdrm'])
-
-have_va = libva_dep.found() and libva_drm_dep.found()
-if not (have_va and libgudev_dep.found())
-  if va_option.enabled()
-    error('The va plugin was enabled explicity, but required dependencies were not found.')
-  endif
-  subdir_done()
+if libva_dep.version().version_compare('>= 1.8')
+  va_sources += 'gstvaav1dec.c'
 endif
 
-cdata.set10('HAVE_LIBDRM', libdrm_dep.found())
-
-driverdir = libva_dep.get_pkgconfig_variable('driverdir')
+driverdir = libva_dep.get_variable('driverdir', default_value: '')
 if driverdir == ''
   driverdir = join_paths(get_option('prefix'), get_option('libdir'), 'dri')
 endif
@@ -47,7 +44,7 @@
   va_sources,
   c_args : gst_plugins_bad_args + extra_c_args + gstva_cargs + ['-std=c99'],
   include_directories : [configinc],
-  dependencies : [gstbase_dep, gstvideo_dep, gstcodecs_dep, libva_dep, gstallocators_dep, libva_drm_dep, libgudev_dep, libdrm_dep] + extra_dep,
+  dependencies : [gstvideo_dep, gstcodecs_dep, gstallocators_dep, gstva_dep, libgudev_dep] + extra_dep,
   install : true,
   install_dir : plugins_install_dir,
 )
gst-plugins-bad-1.18.6.tar.xz/sys/va/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/va/plugin.c
Changed
@@ -27,14 +27,26 @@ #include "config.h" #endif +#include "gstvaav1dec.h" #include "gstvacaps.h" +#include "gstvadeinterlace.h" #include "gstvadevice.h" +#include "gstvafilter.h" #include "gstvah264dec.h" +#include "gstvah265dec.h" +#include "gstvampeg2dec.h" #include "gstvaprofile.h" +#include "gstvavp8dec.h" +#include "gstvavp9dec.h" +#include "gstvavpp.h" #define GST_CAT_DEFAULT gstva_debug GST_DEBUG_CATEGORY (gstva_debug); +/* big bad mutex to exclusive access to shared stream buffers, such as + * DMABuf after a tee */ +GRecMutex GST_VA_SHARED_LOCK = { 0, }; + static void plugin_add_dependencies (GstPlugin * plugin) { @@ -91,6 +103,43 @@ device->render_device_path); } break; + case HEVC: + if (!gst_va_h265_dec_register (plugin, device, sinkcaps, srccaps, + GST_RANK_NONE)) { + GST_WARNING ("Failed to register H265 decoder: %s", + device->render_device_path); + } + break; + case VP8: + if (!gst_va_vp8_dec_register (plugin, device, sinkcaps, srccaps, + GST_RANK_NONE)) { + GST_WARNING ("Failed to register VP8 decoder: %s", + device->render_device_path); + } + break; + case VP9: + if (!gst_va_vp9_dec_register (plugin, device, sinkcaps, srccaps, + GST_RANK_NONE)) { + GST_WARNING ("Failed to register VP9 decoder: %s", + device->render_device_path); + } + break; + case MPEG2: + if (!gst_va_mpeg2_dec_register (plugin, device, sinkcaps, srccaps, + GST_RANK_NONE)) { + GST_WARNING ("Failed to register Mpeg2 decoder: %s", + device->render_device_path); + } + break; +#if VA_CHECK_VERSION(1, 8, 0) + case AV1: + if (!gst_va_av1_dec_register (plugin, device, sinkcaps, srccaps, + GST_RANK_NONE)) { + GST_WARNING ("Failed to register AV1 decoder: %s", + device->render_device_path); + } + break; +#endif default: GST_DEBUG ("No decoder implementation for %" GST_FOURCC_FORMAT, GST_FOURCC_ARGS (codec)); @@ -138,6 +187,38 @@ } } +static void +plugin_register_vpp (GstPlugin * plugin, GstVaDevice * device) +{ + GstVaFilter *filter; + gboolean has_colorbalance, has_deinterlace; + + 
has_colorbalance = FALSE; + has_deinterlace = FALSE; + filter = gst_va_filter_new (device->display); + if (gst_va_filter_open (filter)) { + has_colorbalance = + gst_va_filter_has_filter (filter, VAProcFilterColorBalance); + has_deinterlace = + gst_va_filter_has_filter (filter, VAProcFilterDeinterlacing); + } else { + GST_WARNING ("Failed open VA filter"); + gst_object_unref (filter); + return; + } + gst_object_unref (filter); + + if (!gst_va_vpp_register (plugin, device, has_colorbalance, GST_RANK_NONE)) + GST_WARNING ("Failed to register postproc: %s", device->render_device_path); + + if (has_deinterlace) { + if (!gst_va_deinterlace_register (plugin, device, GST_RANK_NONE)) { + GST_WARNING ("Failed to register deinterlace: %s", + device->render_device_path); + } + } +} + static inline void _insert_profile_in_table (GHashTable * table, VAProfile profile) { @@ -167,7 +248,7 @@ VAStatus status; GHashTable *decoders, *encoders, *encoderslp, *encodersimg; gint i, j, num_entrypoints = 0, num_profiles = 0; - gboolean ret = FALSE; + gboolean has_vpp = FALSE, ret = FALSE; decoders = g_hash_table_new_full (g_int64_hash, g_int64_equal, (GDestroyNotify) g_free, (GDestroyNotify) g_array_unref); @@ -201,6 +282,8 @@ _insert_profile_in_table (encoderslp, profiles[i]); else if (entrypoints[j] == VAEntrypointEncPicture) _insert_profile_in_table (encodersimg, profiles[i]); + else if (entrypoints[j] == VAEntrypointVideoProc) + has_vpp = TRUE; } } @@ -209,6 +292,8 @@ plugin_register_encoders (plugin, device, encoderslp, VAEntrypointEncSliceLP); plugin_register_encoders (plugin, device, encodersimg, VAEntrypointEncPicture); + if (has_vpp) + plugin_register_vpp (plugin, device); ret = TRUE;
gst-plugins-bad-1.20.1.tar.xz/sys/va/vasurfaceimage.c
Added
@@ -0,0 +1,352 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "vasurfaceimage.h" + +#include "gstvavideoformat.h" +#include <va/va.h> + +gboolean +va_destroy_surfaces (GstVaDisplay * display, VASurfaceID * surfaces, + gint num_surfaces) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + g_return_val_if_fail (num_surfaces > 0, FALSE); + + gst_va_display_lock (display); + status = vaDestroySurfaces (dpy, surfaces, num_surfaces); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaDestroySurfaces: %s", vaErrorStr (status)); + return FALSE; + } + + return TRUE; + +} + +gboolean +va_create_surfaces (GstVaDisplay * display, guint rt_format, guint fourcc, + guint width, guint height, gint usage_hint, + VASurfaceAttribExternalBuffers * ext_buf, VASurfaceID * surfaces, + guint num_surfaces) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + /* *INDENT-OFF* */ + VASurfaceAttrib attrs[5] = { + { + .type = VASurfaceAttribUsageHint, + .flags = VA_SURFACE_ATTRIB_SETTABLE, + .value.type = VAGenericValueTypeInteger, + .value.value.i = usage_hint, + }, + { + .type = VASurfaceAttribMemoryType, + 
.flags = VA_SURFACE_ATTRIB_SETTABLE, + .value.type = VAGenericValueTypeInteger, + .value.value.i = (ext_buf && ext_buf->num_buffers > 0) + ? VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME + : VA_SURFACE_ATTRIB_MEM_TYPE_VA, + }, + }; + /* *INDENT-ON* */ + VAStatus status; + guint num_attrs = 2; + + g_return_val_if_fail (num_surfaces > 0, FALSE); + + if (fourcc > 0) { + /* *INDENT-OFF* */ + attrs[num_attrs++] = (VASurfaceAttrib) { + .type = VASurfaceAttribPixelFormat, + .flags = VA_SURFACE_ATTRIB_SETTABLE, + .value.type = VAGenericValueTypeInteger, + .value.value.i = fourcc, + }; + /* *INDENT-ON* */ + } + + if (ext_buf) { + /* *INDENT-OFF* */ + attrs[num_attrs++] = (VASurfaceAttrib) { + .type = VASurfaceAttribExternalBufferDescriptor, + .flags = VA_SURFACE_ATTRIB_SETTABLE, + .value.type = VAGenericValueTypePointer, + .value.value.p = ext_buf, + }; + /* *INDENT-ON* */ + } + + gst_va_display_lock (display); + status = vaCreateSurfaces (dpy, rt_format, width, height, surfaces, + num_surfaces, attrs, num_attrs); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaCreateSurfaces: %s", vaErrorStr (status)); + return FALSE; + } + + return TRUE; +} + +gboolean +va_export_surface_to_dmabuf (GstVaDisplay * display, VASurfaceID surface, + guint32 flags, VADRMPRIMESurfaceDescriptor * desc) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaExportSurfaceHandle (dpy, surface, + VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2, flags, desc); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaExportSurfaceHandle: %s", vaErrorStr (status)); + return FALSE; + } + + return TRUE; +} + +gboolean +va_destroy_image (GstVaDisplay * display, VAImageID image_id) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaDestroyImage (dpy, image_id); + gst_va_display_unlock (display); + if (status != 
VA_STATUS_SUCCESS) { + GST_ERROR ("vaDestroyImage: %s", vaErrorStr (status)); + return FALSE; + } + return TRUE; +} + +gboolean +va_get_derive_image (GstVaDisplay * display, VASurfaceID surface, + VAImage * image) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaDeriveImage (dpy, surface, image); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING ("vaDeriveImage: %s", vaErrorStr (status)); + return FALSE; + } + + return TRUE; +} + +gboolean +va_create_image (GstVaDisplay * display, GstVideoFormat format, gint width, + gint height, VAImage * image) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + const VAImageFormat *va_format; + VAStatus status; + + va_format = gst_va_image_format_from_video_format (format); + if (!va_format) + return FALSE; + + gst_va_display_lock (display); + status = + vaCreateImage (dpy, (VAImageFormat *) va_format, width, height, image); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaCreateImage: %s", vaErrorStr (status)); + return FALSE; + } + return TRUE; +} + +gboolean +va_get_image (GstVaDisplay * display, VASurfaceID surface, VAImage * image) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaGetImage (dpy, surface, 0, 0, image->width, image->height, + image->image_id); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaGetImage: %s", vaErrorStr (status)); + return FALSE; + } + + return TRUE; +} + +gboolean +va_sync_surface (GstVaDisplay * display, VASurfaceID surface) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaSyncSurface (dpy, surface); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING ("vaSyncSurface: %s", vaErrorStr (status)); + return FALSE; + } + 
return TRUE; +} + +gboolean +va_map_buffer (GstVaDisplay * display, VABufferID buffer, gpointer * data) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaMapBuffer (dpy, buffer, data); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING ("vaMapBuffer: %s", vaErrorStr (status)); + return FALSE; + } + return TRUE; +} + +gboolean +va_unmap_buffer (GstVaDisplay * display, VABufferID buffer) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + gst_va_display_lock (display); + status = vaUnmapBuffer (dpy, buffer); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_WARNING ("vaUnmapBuffer: %s", vaErrorStr (status)); + return FALSE; + } + return TRUE; +} + +gboolean +va_put_image (GstVaDisplay * display, VASurfaceID surface, VAImage * image) +{ + VADisplay dpy = gst_va_display_get_va_dpy (display); + VAStatus status; + + if (!va_sync_surface (display, surface)) + return FALSE; + + gst_va_display_lock (display); + status = vaPutImage (dpy, surface, image->image_id, 0, 0, image->width, + image->height, 0, 0, image->width, image->height); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_ERROR ("vaPutImage: %s", vaErrorStr (status)); + return FALSE; + } + return TRUE; +} + +gboolean +va_ensure_image (GstVaDisplay * display, VASurfaceID surface, + GstVideoInfo * info, VAImage * image, gboolean derived) +{ + gboolean ret = TRUE; + + if (image->image_id != VA_INVALID_ID) + return TRUE; + + if (!va_sync_surface (display, surface)) + return FALSE; + + if (derived) { + ret = va_get_derive_image (display, surface, image); + } else { + ret = va_create_image (display, GST_VIDEO_INFO_FORMAT (info), + GST_VIDEO_INFO_WIDTH (info), GST_VIDEO_INFO_HEIGHT (info), image); + } + + return ret; +} + +gboolean +va_check_surface (GstVaDisplay * display, VASurfaceID surface) +{ + VADisplay dpy = 
gst_va_display_get_va_dpy (display); + VAStatus status; + VASurfaceStatus state; + + gst_va_display_lock (display); + status = vaQuerySurfaceStatus (dpy, surface, &state); + gst_va_display_unlock (display); + + if (status != VA_STATUS_SUCCESS) + GST_ERROR ("vaQuerySurfaceStatus: %s", vaErrorStr (status)); + + GST_LOG ("surface %#x status %d", surface, state); + + return (status == VA_STATUS_SUCCESS); +} + +gboolean +va_copy_surface (GstVaDisplay * display, VASurfaceID dst, VASurfaceID src) +{ +#if VA_CHECK_VERSION (1, 12, 0) + VADisplay dpy = gst_va_display_get_va_dpy (display); + /* *INDENT-OFF* */ + VACopyObject obj_src = { + .obj_type = VACopyObjectSurface, + .object = { + .surface_id = src, + }, + }; + VACopyObject obj_dst = { + .obj_type = VACopyObjectSurface, + .object = { + .surface_id = dst, + }, + }; + VACopyOption option = { + .bits = { + .va_copy_sync = VA_EXEC_SYNC, + .va_copy_mode = VA_EXEC_MODE_DEFAULT, + }, + }; + /* *INDENT-ON* */ + VAStatus status; + + gst_va_display_lock (display); + status = vaCopy (dpy, &obj_dst, &obj_src, option); + gst_va_display_unlock (display); + if (status != VA_STATUS_SUCCESS) { + GST_INFO ("vaCopy: %s", vaErrorStr (status)); + return FALSE; + } + return TRUE; +#else + return FALSE; +#endif +}
gst-plugins-bad-1.20.1.tar.xz/sys/va/vasurfaceimage.h
Added
@@ -0,0 +1,82 @@ +/* GStreamer + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#pragma once + +#include <gst/va/gstvadisplay.h> +#include <gst/video/video.h> +#include <va/va.h> +#include <va/va_drmcommon.h> + +G_BEGIN_DECLS + +/* surfaces */ +gboolean va_create_surfaces (GstVaDisplay * display, + guint rt_format, guint fourcc, + guint width, guint height, + gint usage_hint, + VASurfaceAttribExternalBuffers * ext_buf, + VASurfaceID * surfaces, + guint num_surfaces); +gboolean va_destroy_surfaces (GstVaDisplay * display, + VASurfaceID * surfaces, + gint num_surfaces); +gboolean va_export_surface_to_dmabuf (GstVaDisplay * display, + VASurfaceID surface, + guint32 flags, + VADRMPRIMESurfaceDescriptor * desc); +gboolean va_sync_surface (GstVaDisplay * display, + VASurfaceID surface); +gboolean va_check_surface (GstVaDisplay * display, + VASurfaceID surface); +gboolean va_copy_surface (GstVaDisplay * display, + VASurfaceID dst, + VASurfaceID src); + +/* images */ +gboolean va_create_image (GstVaDisplay * display, + GstVideoFormat format, + gint width, gint height, + VAImage * image); +gboolean va_destroy_image (GstVaDisplay * display, + VAImageID image_id); +gboolean va_get_image 
(GstVaDisplay * display, + VASurfaceID surface, + VAImage * image); +gboolean va_get_derive_image (GstVaDisplay * display, + VASurfaceID surface, + VAImage * image); +gboolean va_put_image (GstVaDisplay * display, + VASurfaceID surface, + VAImage * image); +gboolean va_ensure_image (GstVaDisplay * display, + VASurfaceID surface, + GstVideoInfo * info, + VAImage * image, + gboolean derived); + +/* mapping */ +gboolean va_map_buffer (GstVaDisplay * display, + VABufferID buffer, + gpointer * data); +gboolean va_unmap_buffer (GstVaDisplay * display, + VABufferID buffer); + +G_END_DECLS
gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstmmdeviceenumerator.cpp
Added
@@ -0,0 +1,472 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "gstmmdeviceenumerator.h" + +#ifndef INITGUID +#include <initguid.h> +#endif + +/* *INDENT-OFF* */ +G_BEGIN_DECLS + +GST_DEBUG_CATEGORY_EXTERN (gst_wasapi_debug); +#define GST_CAT_DEFAULT gst_wasapi_debug + +G_END_DECLS + +/* IMMNotificationClient implementation */ +class GstIMMNotificationClient : public IMMNotificationClient +{ +public: + static HRESULT + CreateInstance (GstMMDeviceEnumerator * enumerator, + const GstMMNotificationClientCallbacks * callbacks, + gpointer user_data, + IMMNotificationClient ** client) + { + GstIMMNotificationClient *self; + + self = new GstIMMNotificationClient (); + + self->callbacks_ = *callbacks; + self->user_data_ = user_data; + g_weak_ref_set (&self->enumerator_, enumerator); + + *client = (IMMNotificationClient *) self; + + return S_OK; + } + + /* IUnknown */ + STDMETHODIMP + QueryInterface (REFIID riid, void ** object) + { + if (!object) + return E_POINTER; + + if (riid == IID_IUnknown) { + *object = static_cast<IUnknown *> (this); + } else if (riid == __uuidof(IMMNotificationClient)) { + *object = 
static_cast<IMMNotificationClient *> (this); + } else { + *object = nullptr; + return E_NOINTERFACE; + } + + AddRef (); + + return S_OK; + } + + STDMETHODIMP_ (ULONG) + AddRef (void) + { + GST_TRACE ("%p, %d", this, (guint) ref_count_); + return InterlockedIncrement (&ref_count_); + } + + STDMETHODIMP_ (ULONG) + Release (void) + { + ULONG ref_count; + + GST_TRACE ("%p, %d", this, (guint) ref_count_); + ref_count = InterlockedDecrement (&ref_count_); + + if (ref_count == 0) { + GST_TRACE ("Delete instance %p", this); + delete this; + } + + return ref_count; + } + + /* IMMNotificationClient */ + STDMETHODIMP + OnDeviceStateChanged (LPCWSTR device_id, DWORD new_state) + { + GstMMDeviceEnumerator *listener; + HRESULT hr; + + if (!callbacks_.device_state_changed) + return S_OK; + + listener = (GstMMDeviceEnumerator *) g_weak_ref_get (&enumerator_); + if (!listener) + return S_OK; + + hr = callbacks_.device_state_changed (listener, device_id, new_state, + user_data_); + gst_object_unref (listener); + + return hr; + } + + STDMETHODIMP + OnDeviceAdded (LPCWSTR device_id) + { + GstMMDeviceEnumerator *listener; + HRESULT hr; + + if (!callbacks_.device_added) + return S_OK; + + listener = (GstMMDeviceEnumerator *) g_weak_ref_get (&enumerator_); + if (!listener) + return S_OK; + + hr = callbacks_.device_added (listener, device_id, user_data_); + gst_object_unref (listener); + + return hr; + } + + STDMETHODIMP + OnDeviceRemoved (LPCWSTR device_id) + { + GstMMDeviceEnumerator *listener; + HRESULT hr; + + if (!callbacks_.device_removed) + return S_OK; + + listener = (GstMMDeviceEnumerator *) g_weak_ref_get (&enumerator_); + if (!listener) + return S_OK; + + hr = callbacks_.device_removed (listener, device_id, user_data_); + gst_object_unref (listener); + + return hr; + } + + STDMETHODIMP + OnDefaultDeviceChanged (EDataFlow flow, ERole role, LPCWSTR default_device_id) + { + GstMMDeviceEnumerator *listener; + HRESULT hr; + + if (!callbacks_.default_device_changed) + return S_OK; + 
+ listener = (GstMMDeviceEnumerator *) g_weak_ref_get (&enumerator_); + if (!listener) + return S_OK; + + hr = callbacks_.default_device_changed (listener, + flow, role, default_device_id, user_data_); + gst_object_unref (listener); + + return hr; + } + + STDMETHODIMP + OnPropertyValueChanged (LPCWSTR device_id, const PROPERTYKEY key) + { + GstMMDeviceEnumerator *listener; + HRESULT hr; + + if (!callbacks_.property_value_changed) + return S_OK; + + listener = (GstMMDeviceEnumerator *) g_weak_ref_get (&enumerator_); + if (!device_id) + return S_OK; + + hr = callbacks_.property_value_changed (listener, + device_id, key, user_data_); + gst_object_unref (listener); + + return hr; + } + +private: + GstIMMNotificationClient () + : ref_count_ (1) + { + g_weak_ref_init (&enumerator_, nullptr); + } + + virtual ~GstIMMNotificationClient () + { + g_weak_ref_clear (&enumerator_); + } + +private: + ULONG ref_count_; + GstMMNotificationClientCallbacks callbacks_; + gpointer user_data_; + GWeakRef enumerator_; +}; +/* *INDENT-ON* */ + +struct _GstMMDeviceEnumerator +{ + GstObject parent; + + IMMDeviceEnumerator *handle; + IMMNotificationClient *client; + + GMutex lock; + GCond cond; + + GThread *thread; + GMainContext *context; + GMainLoop *loop; + + gboolean running; +}; + +static void gst_mm_device_enumerator_constructed (GObject * object); +static void gst_mm_device_enumerator_finalize (GObject * object); + +static gpointer +gst_mm_device_enumerator_thread_func (GstMMDeviceEnumerator * self); + +#define gst_mm_device_enumerator_parent_class parent_class +G_DEFINE_TYPE (GstMMDeviceEnumerator, + gst_mm_device_enumerator, GST_TYPE_OBJECT); + +static void +gst_mm_device_enumerator_class_init (GstMMDeviceEnumeratorClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + + gobject_class->constructed = gst_mm_device_enumerator_constructed; + gobject_class->finalize = gst_mm_device_enumerator_finalize; +} + +static void +gst_mm_device_enumerator_init 
(GstMMDeviceEnumerator * self) +{ + g_mutex_init (&self->lock); + g_cond_init (&self->cond); + self->context = g_main_context_new (); + self->loop = g_main_loop_new (self->context, FALSE); +} + +static void +gst_mm_device_enumerator_constructed (GObject * object) +{ + GstMMDeviceEnumerator *self = GST_MM_DEVICE_ENUMERATOR (object); + + g_mutex_lock (&self->lock); + self->thread = g_thread_new ("GstMMDeviceEnumerator", + (GThreadFunc) gst_mm_device_enumerator_thread_func, self); + while (!g_main_loop_is_running (self->loop)) + g_cond_wait (&self->cond, &self->lock); + g_mutex_unlock (&self->lock); +} + +static void +gst_mm_device_enumerator_finalize (GObject * object) +{ + GstMMDeviceEnumerator *self = GST_MM_DEVICE_ENUMERATOR (object); + + g_main_loop_quit (self->loop); + g_thread_join (self->thread); + g_main_loop_unref (self->loop); + g_main_context_unref (self->context); + + g_mutex_clear (&self->lock); + g_cond_clear (&self->cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static gboolean +loop_running_cb (GstMMDeviceEnumerator * self) +{ + g_mutex_lock (&self->lock); + g_cond_signal (&self->cond); + g_mutex_unlock (&self->lock); + + return G_SOURCE_REMOVE; +} + +static gpointer +gst_mm_device_enumerator_thread_func (GstMMDeviceEnumerator * self) +{ + GSource *idle_source; + IMMDeviceEnumerator *enumerator = nullptr; + HRESULT hr; + + CoInitializeEx (NULL, COINIT_MULTITHREADED); + g_main_context_push_thread_default (self->context); + + idle_source = g_idle_source_new (); + g_source_set_callback (idle_source, + (GSourceFunc) loop_running_cb, self, nullptr); + g_source_attach (idle_source, self->context); + g_source_unref (idle_source); + + hr = CoCreateInstance (__uuidof (MMDeviceEnumerator), + nullptr, CLSCTX_ALL, IID_PPV_ARGS (&enumerator)); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to create IMMDeviceEnumerator instance"); + goto run_loop; + } + + self->handle = enumerator; + +run_loop: + GST_INFO_OBJECT (self, "Starting loop"); 
+ g_main_loop_run (self->loop); + GST_INFO_OBJECT (self, "Stopped loop"); + + if (self->client && self->handle) { + self->handle->UnregisterEndpointNotificationCallback (self->client); + + self->client->Release (); + } + + if (self->handle) + self->handle->Release (); + + g_main_context_pop_thread_default (self->context); + CoUninitialize (); + + return nullptr; +} + +GstMMDeviceEnumerator * +gst_mm_device_enumerator_new (void) +{ + GstMMDeviceEnumerator *self; + + self = (GstMMDeviceEnumerator *) g_object_new (GST_TYPE_MM_DEVICE_ENUMERATOR, + nullptr); + + if (!self->handle) { + gst_object_unref (self); + return nullptr; + } + + gst_object_ref_sink (self); + + return self; +} + +IMMDeviceEnumerator * +gst_mm_device_enumerator_get_handle (GstMMDeviceEnumerator * enumerator) +{ + g_return_val_if_fail (GST_IS_MM_DEVICE_ENUMERATOR (enumerator), nullptr); + + return enumerator->handle; +} + +typedef struct +{ + GstMMDeviceEnumerator *self; + GstMMNotificationClientCallbacks *callbacks; + gpointer user_data; + + gboolean handled; + GMutex lock; + GCond cond; + + gboolean ret; +} SetNotificationCallbackData; + +static gboolean +set_notification_callback (SetNotificationCallbackData * data) +{ + GstMMDeviceEnumerator *self = data->self; + HRESULT hr; + + g_mutex_lock (&data->lock); + g_mutex_lock (&self->lock); + + data->ret = TRUE; + + if (self->client) { + self->handle->UnregisterEndpointNotificationCallback (self->client); + self->client->Release (); + self->client = nullptr; + } + + if (data->callbacks) { + IMMNotificationClient *client; + + hr = GstIMMNotificationClient::CreateInstance (self, data->callbacks, + data->user_data, &client); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, + "Failed to create IMMNotificationClient instance"); + data->ret = FALSE; + goto out; + } + + hr = self->handle->RegisterEndpointNotificationCallback (client); + if (FAILED (hr)) { + GST_ERROR_OBJECT (self, "Failed to register callback"); + client->Release (); + data->ret = FALSE; + 
goto out; + } + + self->client = client; + } + +out: + data->handled = TRUE; + g_cond_signal (&data->cond); + g_mutex_unlock (&self->lock); + g_mutex_unlock (&data->lock); + + return G_SOURCE_REMOVE; +} + +gboolean +gst_mm_device_enumerator_set_notification_callback (GstMMDeviceEnumerator * + enumerator, GstMMNotificationClientCallbacks * callbacks, + gpointer user_data) +{ + SetNotificationCallbackData data; + gboolean ret; + + g_return_val_if_fail (GST_IS_MM_DEVICE_ENUMERATOR (enumerator), FALSE); + + data.self = enumerator; + data.callbacks = callbacks; + data.user_data = user_data; + data.handled = FALSE; + + g_mutex_init (&data.lock); + g_cond_init (&data.cond); + + g_main_context_invoke (enumerator->context, + (GSourceFunc) set_notification_callback, &data); + g_mutex_lock (&data.lock); + while (!data.handled) + g_cond_wait (&data.cond, &data.lock); + g_mutex_unlock (&data.lock); + + ret = data.ret; + + g_mutex_clear (&data.lock); + g_cond_clear (&data.cond); + + return ret; +}
gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstmmdeviceenumerator.h
Added
@@ -0,0 +1,69 @@ +/* + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_MM_DEVICE_ENUMERATOR_H__ +#define __GST_MM_DEVICE_ENUMERATOR_H__ + +#include <gst/gst.h> +#include <mmdeviceapi.h> + +G_BEGIN_DECLS + +#define GST_TYPE_MM_DEVICE_ENUMERATOR (gst_mm_device_enumerator_get_type ()) +G_DECLARE_FINAL_TYPE (GstMMDeviceEnumerator, gst_mm_device_enumerator, + GST, MM_DEVICE_ENUMERATOR, GstObject); + +typedef struct +{ + HRESULT (*device_state_changed) (GstMMDeviceEnumerator * enumerator, + LPCWSTR device_id, + DWORD new_state, + gpointer user_data); + + HRESULT (*device_added) (GstMMDeviceEnumerator * enumerator, + LPCWSTR device_id, + gpointer user_data); + + HRESULT (*device_removed) (GstMMDeviceEnumerator * provider, + LPCWSTR device_id, + gpointer user_data); + + HRESULT (*default_device_changed) (GstMMDeviceEnumerator * provider, + EDataFlow flow, + ERole role, + LPCWSTR default_device_id, + gpointer user_data); + + HRESULT (*property_value_changed) (GstMMDeviceEnumerator * provider, + LPCWSTR device_id, + const PROPERTYKEY key, + gpointer user_data); +} GstMMNotificationClientCallbacks; + +GstMMDeviceEnumerator * gst_mm_device_enumerator_new (void); + +IMMDeviceEnumerator * 
gst_mm_device_enumerator_get_handle (GstMMDeviceEnumerator * enumerator); + +gboolean gst_mm_device_enumerator_set_notification_callback (GstMMDeviceEnumerator * enumerator, + GstMMNotificationClientCallbacks * callbacks, + gpointer user_data); + +G_END_DECLS + +#endif /* __GST_MM_DEVICE_ENUMERATOR_H__ */
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapidevice.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapidevice.c
Changed
@@ -1,5 +1,6 @@ /* GStreamer * Copyright (C) 2018 Nirbheek Chauhan <nirbheek@centricular.com> + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public @@ -23,11 +24,27 @@ #include "gstwasapidevice.h" +GST_DEBUG_CATEGORY_EXTERN (gst_wasapi_debug); +#define GST_CAT_DEFAULT gst_wasapi_debug + G_DEFINE_TYPE (GstWasapiDeviceProvider, gst_wasapi_device_provider, GST_TYPE_DEVICE_PROVIDER); static void gst_wasapi_device_provider_finalize (GObject * object); static GList *gst_wasapi_device_provider_probe (GstDeviceProvider * provider); +static gboolean gst_wasapi_device_provider_start (GstDeviceProvider * provider); +static void gst_wasapi_device_provider_stop (GstDeviceProvider * provider); + +static HRESULT +gst_wasapi_device_provider_device_added (GstMMDeviceEnumerator * enumerator, + LPCWSTR device_id, gpointer user_data); +static HRESULT +gst_wasapi_device_provider_device_removed (GstMMDeviceEnumerator * enumerator, + LPCWSTR device_id, gpointer user_data); +static HRESULT +gst_wasapi_device_provider_default_device_changed (GstMMDeviceEnumerator * + enumerator, EDataFlow flow, ERole role, LPCWSTR device_id, + gpointer user_data); static void gst_wasapi_device_provider_class_init (GstWasapiDeviceProviderClass * klass) @@ -38,6 +55,8 @@ gobject_class->finalize = gst_wasapi_device_provider_finalize; dm_class->probe = gst_wasapi_device_provider_probe; + dm_class->start = gst_wasapi_device_provider_start; + dm_class->stop = gst_wasapi_device_provider_stop; gst_device_provider_class_set_static_metadata (dm_class, "WASAPI (Windows Audio Session API) Device Provider", @@ -46,15 +65,68 @@ } static void -gst_wasapi_device_provider_init (GstWasapiDeviceProvider * provider) +gst_wasapi_device_provider_init (GstWasapiDeviceProvider * self) { - CoInitializeEx (NULL, COINIT_MULTITHREADED); + self->enumerator = gst_mm_device_enumerator_new (); +} + 
+static gboolean +gst_wasapi_device_provider_start (GstDeviceProvider * provider) +{ + GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER (provider); + GstMMNotificationClientCallbacks callbacks = { NULL, }; + GList *devices = NULL; + GList *iter; + + if (!self->enumerator) { + GST_WARNING_OBJECT (self, "Enumerator wasn't configured"); + return FALSE; + } + + callbacks.device_added = gst_wasapi_device_provider_device_added; + callbacks.device_removed = gst_wasapi_device_provider_device_removed; + callbacks.default_device_changed = + gst_wasapi_device_provider_default_device_changed; + + if (!gst_mm_device_enumerator_set_notification_callback (self->enumerator, + &callbacks, self)) { + GST_WARNING_OBJECT (self, "Failed to set callbacks"); + return FALSE; + } + + /* baseclass will not call probe() once it's started, but we can get + * notification only add/remove or change case. To this manually */ + devices = gst_wasapi_device_provider_probe (provider); + if (devices) { + for (iter = devices; iter; iter = g_list_next (iter)) { + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + } + + g_list_free (devices); + } + + return TRUE; +} + +static void +gst_wasapi_device_provider_stop (GstDeviceProvider * provider) +{ + GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER (provider); + + if (self->enumerator) { + gst_mm_device_enumerator_set_notification_callback (self->enumerator, + NULL, NULL); + } } static void gst_wasapi_device_provider_finalize (GObject * object) { - CoUninitialize (); + GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER (object); + + gst_clear_object (&self->enumerator); + + G_OBJECT_CLASS (gst_wasapi_device_provider_parent_class)->finalize (object); } static GList * @@ -63,12 +135,137 @@ GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER (provider); GList *devices = NULL; - if (!gst_wasapi_util_get_devices (GST_OBJECT (self), TRUE, &devices)) + if (!gst_wasapi_util_get_devices (self->enumerator, TRUE, 
&devices)) GST_ERROR_OBJECT (self, "Failed to enumerate devices"); return devices; } +static gboolean +gst_wasapi_device_is_in_list (GList * list, GstDevice * device) +{ + GList *iter; + GstStructure *s; + const gchar *device_id; + gboolean found = FALSE; + + s = gst_device_get_properties (device); + g_assert (s); + + device_id = gst_structure_get_string (s, "device.strid"); + g_assert (device_id); + + for (iter = list; iter; iter = g_list_next (iter)) { + GstStructure *other_s; + const gchar *other_id; + + other_s = gst_device_get_properties (GST_DEVICE (iter->data)); + g_assert (other_s); + + other_id = gst_structure_get_string (other_s, "device.strid"); + g_assert (other_id); + + if (g_ascii_strcasecmp (device_id, other_id) == 0) { + found = TRUE; + } + + gst_structure_free (other_s); + if (found) + break; + } + + gst_structure_free (s); + + return found; +} + +static void +gst_wasapi_device_provider_update_devices (GstWasapiDeviceProvider * self) +{ + GstDeviceProvider *provider = GST_DEVICE_PROVIDER_CAST (self); + GList *prev_devices = NULL; + GList *new_devices = NULL; + GList *to_add = NULL; + GList *to_remove = NULL; + GList *iter; + + GST_OBJECT_LOCK (self); + prev_devices = g_list_copy_deep (provider->devices, + (GCopyFunc) gst_object_ref, NULL); + GST_OBJECT_UNLOCK (self); + + new_devices = gst_wasapi_device_provider_probe (provider); + + /* Ownership of GstDevice for gst_device_provider_device_add() + * and gst_device_provider_device_remove() is a bit complicated. 
+ * Remove floating reference here for things to be clear */ + for (iter = new_devices; iter; iter = g_list_next (iter)) + gst_object_ref_sink (iter->data); + + /* Check newly added devices */ + for (iter = new_devices; iter; iter = g_list_next (iter)) { + if (!gst_wasapi_device_is_in_list (prev_devices, GST_DEVICE (iter->data))) { + to_add = g_list_prepend (to_add, gst_object_ref (iter->data)); + } + } + + /* Check removed device */ + for (iter = prev_devices; iter; iter = g_list_next (iter)) { + if (!gst_wasapi_device_is_in_list (new_devices, GST_DEVICE (iter->data))) { + to_remove = g_list_prepend (to_remove, gst_object_ref (iter->data)); + } + } + + for (iter = to_remove; iter; iter = g_list_next (iter)) + gst_device_provider_device_remove (provider, GST_DEVICE (iter->data)); + + for (iter = to_add; iter; iter = g_list_next (iter)) + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + + if (prev_devices) + g_list_free_full (prev_devices, (GDestroyNotify) gst_object_unref); + + if (to_add) + g_list_free_full (to_add, (GDestroyNotify) gst_object_unref); + + if (to_remove) + g_list_free_full (to_remove, (GDestroyNotify) gst_object_unref); +} + +static HRESULT +gst_wasapi_device_provider_device_added (GstMMDeviceEnumerator * enumerator, + LPCWSTR device_id, gpointer user_data) +{ + GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER (user_data); + + gst_wasapi_device_provider_update_devices (self); + + return S_OK; +} + +static HRESULT +gst_wasapi_device_provider_device_removed (GstMMDeviceEnumerator * enumerator, + LPCWSTR device_id, gpointer user_data) +{ + GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER (user_data); + + gst_wasapi_device_provider_update_devices (self); + + return S_OK; +} + +static HRESULT +gst_wasapi_device_provider_default_device_changed (GstMMDeviceEnumerator * + enumerator, EDataFlow flow, ERole role, LPCWSTR device_id, + gpointer user_data) +{ + GstWasapiDeviceProvider *self = GST_WASAPI_DEVICE_PROVIDER 
(user_data); + + gst_wasapi_device_provider_update_devices (self); + + return S_OK; +} + /* GstWasapiDevice begins */ enum
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapidevice.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapidevice.h
Changed
@@ -38,6 +38,8 @@ struct _GstWasapiDeviceProvider { GstDeviceProvider parent; + + GstMMDeviceEnumerator *enumerator; }; struct _GstWasapiDeviceProviderClass
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapisink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapisink.c
Changed
@@ -180,7 +180,7 @@ self->cancellable = CreateEvent (NULL, TRUE, FALSE, NULL); self->client_needs_restart = FALSE; - CoInitializeEx (NULL, COINIT_MULTITHREADED); + self->enumerator = gst_mm_device_enumerator_new (); } static void @@ -208,6 +208,8 @@ self->render_client = NULL; } + gst_clear_object (&self->enumerator); + G_OBJECT_CLASS (gst_wasapi_sink_parent_class)->dispose (object); } @@ -219,8 +221,6 @@ CoTaskMemFree (self->mix_format); self->mix_format = NULL; - CoUninitialize (); - if (self->cached_caps != NULL) { gst_caps_unref (self->cached_caps); self->cached_caps = NULL; @@ -412,8 +412,10 @@ * even if the old device was unplugged. We need to handle this somehow. * For example, perhaps we should automatically switch to the new device if * the default device is changed and a device isn't explicitly selected. */ - if (!gst_wasapi_util_get_device_client (GST_ELEMENT (self), eRender, - self->role, self->device_strid, &device, &client)) { + if (!gst_wasapi_util_get_device (self->enumerator, eRender, + self->role, self->device_strid, &device) + || !gst_wasapi_util_get_audio_client (GST_ELEMENT (self), + device, &client)) { if (!self->device_strid) GST_ELEMENT_ERROR (self, RESOURCE, OPEN_WRITE, (NULL), ("Failed to get default device")); @@ -483,7 +485,12 @@ guint bpf, rate, devicep_frames; HRESULT hr; - CoInitializeEx (NULL, COINIT_MULTITHREADED); + if (!self->client) { + GST_DEBUG_OBJECT (self, "no IAudioClient, creating a new one"); + if (!gst_wasapi_util_get_audio_client (GST_ELEMENT (self), + self->device, &self->client)) + goto beach; + } if (gst_wasapi_sink_can_audioclient3 (self)) { if (!gst_wasapi_util_initialize_audioclient3 (GST_ELEMENT (self), spec, @@ -590,7 +597,8 @@ GstWasapiSink *self = GST_WASAPI_SINK (asink); if (self->client != NULL) { - IAudioClient_Stop (self->client); + IUnknown_Release (self->client); + self->client = NULL; } if (self->render_client != NULL) { @@ -598,8 +606,6 @@ self->render_client = NULL; } - CoUninitialize (); - return 
TRUE; }
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapisink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapisink.h
Changed
@@ -40,6 +40,8 @@ { GstAudioSink parent; + GstMMDeviceEnumerator *enumerator; + IMMDevice *device; IAudioClient *client; IAudioRenderClient *render_client;
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapisrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapisrc.c
Changed
@@ -199,7 +199,7 @@ self->loopback_event_handle = CreateEvent (NULL, FALSE, FALSE, NULL); self->loopback_cancellable = CreateEvent (NULL, TRUE, FALSE, NULL); - CoInitializeEx (NULL, COINIT_MULTITHREADED); + self->enumerator = gst_mm_device_enumerator_new (); } static void @@ -247,6 +247,8 @@ self->loopback_cancellable = NULL; } + gst_clear_object (&self->enumerator); + G_OBJECT_CLASS (parent_class)->dispose (object); } @@ -258,8 +260,6 @@ CoTaskMemFree (self->mix_format); self->mix_format = NULL; - CoUninitialize (); - g_clear_pointer (&self->cached_caps, gst_caps_unref); g_clear_pointer (&self->positions, g_free); g_clear_pointer (&self->device_strid, g_free); @@ -426,9 +426,11 @@ * even if the old device was unplugged. We need to handle this somehow. * For example, perhaps we should automatically switch to the new device if * the default device is changed and a device isn't explicitly selected. */ - if (!gst_wasapi_util_get_device_client (GST_ELEMENT (self), + if (!gst_wasapi_util_get_device (self->enumerator, self->loopback ? 
eRender : eCapture, self->role, self->device_strid, - &device, &client)) { + &device) + || !gst_wasapi_util_get_audio_client (GST_ELEMENT (self), + device, &client)) { if (!self->device_strid) GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, (NULL), ("Failed to get default device")); @@ -445,9 +447,10 @@ * we will keep pusing silence data to into wasapi client so that make audio * client report audio data in any case */ - if (!gst_wasapi_util_get_device_client (GST_ELEMENT (self), - eRender, self->role, self->device_strid, - &loopback_device, &self->loopback_client)) { + if (!gst_wasapi_util_get_device (self->enumerator, + eRender, self->role, self->device_strid, &loopback_device) + || !gst_wasapi_util_get_audio_client (GST_ELEMENT (self), + loopback_device, &self->loopback_client)) { if (!self->device_strid) GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, (NULL), ("Failed to get default device for loopback")); @@ -601,8 +604,6 @@ guint bpf, rate, devicep_frames, buffer_frames; HRESULT hr; - CoInitializeEx (NULL, COINIT_MULTITHREADED); - if (gst_wasapi_src_can_audioclient3 (self)) { if (!gst_wasapi_util_initialize_audioclient3 (GST_ELEMENT (self), spec, (IAudioClient3 *) self->client, self->mix_format, self->low_latency, @@ -741,8 +742,6 @@ self->client_clock_freq = 0; - CoUninitialize (); - return TRUE; }
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapisrc.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapisrc.h
Changed
@@ -40,6 +40,8 @@ { GstAudioSrc parent; + GstMMDeviceEnumerator *enumerator; + IMMDevice *device; IAudioClient *client; IAudioClock *client_clock;
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapiutil.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapiutil.c
Changed
@@ -305,39 +305,29 @@ return error_text; } -static IMMDeviceEnumerator * -gst_wasapi_util_get_device_enumerator (GstObject * self) -{ - HRESULT hr; - IMMDeviceEnumerator *enumerator = NULL; - - hr = CoCreateInstance (&CLSID_MMDeviceEnumerator, NULL, CLSCTX_ALL, - &IID_IMMDeviceEnumerator, (void **) &enumerator); - HR_FAILED_RET (hr, CoCreateInstance (MMDeviceEnumerator), NULL); - - return enumerator; -} - gboolean -gst_wasapi_util_get_devices (GstObject * self, gboolean active, - GList ** devices) +gst_wasapi_util_get_devices (GstMMDeviceEnumerator * self, + gboolean active, GList ** devices) { gboolean res = FALSE; static GstStaticCaps scaps = GST_STATIC_CAPS (GST_WASAPI_STATIC_CAPS); DWORD dwStateMask = active ? DEVICE_STATE_ACTIVE : DEVICE_STATEMASK_ALL; IMMDeviceCollection *device_collection = NULL; - IMMDeviceEnumerator *enumerator = NULL; + IMMDeviceEnumerator *enum_handle = NULL; const gchar *device_class, *element_name; guint ii, count; HRESULT hr; *devices = NULL; - enumerator = gst_wasapi_util_get_device_enumerator (self); - if (!enumerator) + if (!self) + return FALSE; + + enum_handle = gst_mm_device_enumerator_get_handle (self); + if (!enum_handle) return FALSE; - hr = IMMDeviceEnumerator_EnumAudioEndpoints (enumerator, eAll, dwStateMask, + hr = IMMDeviceEnumerator_EnumAudioEndpoints (enum_handle, eAll, dwStateMask, &device_collection); HR_FAILED_GOTO (hr, IMMDeviceEnumerator::EnumAudioEndpoints, err); @@ -459,8 +449,6 @@ res = TRUE; err: - if (enumerator) - IUnknown_Release (enumerator); if (device_collection) IUnknown_Release (device_collection); return res; @@ -533,25 +521,28 @@ } gboolean -gst_wasapi_util_get_device_client (GstElement * self, +gst_wasapi_util_get_device (GstMMDeviceEnumerator * self, gint data_flow, gint role, const wchar_t * device_strid, - IMMDevice ** ret_device, IAudioClient ** ret_client) + IMMDevice ** ret_device) { gboolean res = FALSE; HRESULT hr; - IMMDeviceEnumerator *enumerator = NULL; + IMMDeviceEnumerator *enum_handle = 
NULL; IMMDevice *device = NULL; - IAudioClient *client = NULL; - if (!(enumerator = gst_wasapi_util_get_device_enumerator (GST_OBJECT (self)))) - goto beach; + if (!self) + return FALSE; + + enum_handle = gst_mm_device_enumerator_get_handle (self); + if (!enum_handle) + return FALSE; if (!device_strid) { - hr = IMMDeviceEnumerator_GetDefaultAudioEndpoint (enumerator, data_flow, + hr = IMMDeviceEnumerator_GetDefaultAudioEndpoint (enum_handle, data_flow, role, &device); HR_FAILED_GOTO (hr, IMMDeviceEnumerator::GetDefaultAudioEndpoint, beach); } else { - hr = IMMDeviceEnumerator_GetDevice (enumerator, device_strid, &device); + hr = IMMDeviceEnumerator_GetDevice (enum_handle, device_strid, &device); if (hr != S_OK) { gchar *msg = gst_wasapi_util_hresult_to_string (hr); GST_ERROR_OBJECT (self, "IMMDeviceEnumerator::GetDevice (%S) failed" @@ -561,6 +552,26 @@ } } + IUnknown_AddRef (device); + *ret_device = device; + + res = TRUE; + +beach: + if (device != NULL) + IUnknown_Release (device); + + return res; +} + +gboolean +gst_wasapi_util_get_audio_client (GstElement * self, + IMMDevice * device, IAudioClient ** ret_client) +{ + IAudioClient *client = NULL; + gboolean res = FALSE; + HRESULT hr; + if (gst_wasapi_util_have_audioclient3 ()) hr = IMMDevice_Activate (device, &IID_IAudioClient3, CLSCTX_ALL, NULL, (void **) &client); @@ -570,9 +581,7 @@ HR_FAILED_GOTO (hr, IMMDevice::Activate (IID_IAudioClient), beach); IUnknown_AddRef (client); - IUnknown_AddRef (device); *ret_client = client; - *ret_device = device; res = TRUE; @@ -580,12 +589,6 @@ if (client != NULL) IUnknown_Release (client); - if (device != NULL) - IUnknown_Release (device); - - if (enumerator != NULL) - IUnknown_Release (enumerator); - return res; }
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/gstwasapiutil.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/gstwasapiutil.h
Changed
@@ -29,6 +29,7 @@ #include <audioclient.h> #include "gstaudioclient3.h" +#include "gstmmdeviceenumerator.h" /* Static Caps shared between source, sink, and device provider */ #define GST_WASAPI_STATIC_CAPS "audio/x-raw, " \ @@ -92,12 +93,16 @@ gchar *gst_wasapi_util_hresult_to_string (HRESULT hr); -gboolean gst_wasapi_util_get_devices (GstObject * element, gboolean active, - GList ** devices); +gboolean gst_wasapi_util_get_devices (GstMMDeviceEnumerator * enumerator, + gboolean active, + GList ** devices); -gboolean gst_wasapi_util_get_device_client (GstElement * element, +gboolean gst_wasapi_util_get_device (GstMMDeviceEnumerator * enumerator, gint data_flow, gint role, const wchar_t * device_strid, - IMMDevice ** ret_device, IAudioClient ** ret_client); + IMMDevice ** ret_device); + +gboolean gst_wasapi_util_get_audio_client (GstElement * self, + IMMDevice * device, IAudioClient ** ret_client); gboolean gst_wasapi_util_get_device_format (GstElement * element, gint device_mode, IMMDevice * device, IAudioClient * client,
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi/meson.build
Changed
@@ -1,4 +1,5 @@ wasapi_sources = [ + 'gstmmdeviceenumerator.cpp', 'gstwasapi.c', 'gstwasapisrc.c', 'gstwasapisink.c', @@ -30,6 +31,7 @@ gstwasapi = library('gstwasapi', wasapi_sources, c_args : gst_plugins_bad_args + wasapi_args, + cpp_args: gst_plugins_bad_args, include_directories : [configinc], dependencies : [gstaudio_dep, ole32_dep, ksuser_dep], install : true,
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2client.cpp -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2client.cpp
Changed
@@ -41,6 +41,7 @@ #include <locale> #include <codecvt> +/* *INDENT-OFF* */ using namespace ABI::Windows::ApplicationModel::Core; using namespace ABI::Windows::Foundation; using namespace ABI::Windows::Foundation::Collections; @@ -57,11 +58,13 @@ #define GST_CAT_DEFAULT gst_wasapi2_client_debug G_END_DECLS +/* *INDENT-ON* */ static void gst_wasapi2_client_on_device_activated (GstWasapi2Client * client, - IAudioClient3 * audio_client); + IAudioClient * audio_client); +/* *INDENT-OFF* */ class GstWasapiDeviceActivator : public RuntimeClass<RuntimeClassFlags<ClassicCom>, FtmBase, IActivateAudioInterfaceCompletionHandler> @@ -101,7 +104,7 @@ STDMETHOD(ActivateCompleted) (IActivateAudioInterfaceAsyncOperation *async_op) { - ComPtr<IAudioClient3> audio_client; + ComPtr<IAudioClient> audio_client; HRESULT hr = S_OK; HRESULT hr_async_op = S_OK; ComPtr<IUnknown> audio_interface; @@ -161,7 +164,7 @@ HRESULT async_hr = S_OK; async_hr = ActivateAudioInterfaceAsync (device_id.c_str (), - __uuidof(IAudioClient3), nullptr, this, &async_op); + __uuidof(IAudioClient), nullptr, this, &async_op); /* for debugging */ gst_wasapi2_result (async_hr); @@ -205,6 +208,7 @@ GWeakRef listener_; ComPtr<ICoreDispatcher> dispatcher_; }; +/* *INDENT-ON* */ typedef enum { @@ -221,47 +225,29 @@ PROP_DEVICE_NAME, PROP_DEVICE_INDEX, PROP_DEVICE_CLASS, - PROP_LOW_LATENCY, PROP_DISPATCHER, + PROP_CAN_AUTO_ROUTING, }; #define DEFAULT_DEVICE_INDEX -1 #define DEFAULT_DEVICE_CLASS GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE -#define DEFAULT_LOW_LATENCY FALSE struct _GstWasapi2Client { GstObject parent; GstWasapi2ClientDeviceClass device_class; - gboolean low_latency; gchar *device_id; gchar *device_name; gint device_index; gpointer dispatcher; + gboolean can_auto_routing; - IAudioClient3 *audio_client; - IAudioCaptureClient *audio_capture_client; - IAudioRenderClient *audio_render_client; - ISimpleAudioVolume *audio_volume; + IAudioClient *audio_client; GstWasapiDeviceActivator *activator; - WAVEFORMATEX 
*mix_format; GstCaps *supported_caps; - HANDLE event_handle; - HANDLE cancellable; - gboolean opened; - gboolean running; - - guint32 device_period; - guint32 buffer_frame_count; - - GstAudioChannelPosition *positions; - - /* Used for capture mode */ - GstAdapter *adapter; - GThread *thread; GMutex lock; GCond cond; @@ -281,7 +267,9 @@ static const GEnumValue types[] = { {GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE, "Capture", "capture"}, {GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, "Render", "render"}, - {0, NULL, NULL} + {GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE, "Loopback-Capture", + "loopback-capture"}, + {0, nullptr, nullptr} }; if (g_once_init_enter (&class_type)) { @@ -293,7 +281,6 @@ } static void gst_wasapi2_client_constructed (GObject * object); -static void gst_wasapi2_client_dispose (GObject * object); static void gst_wasapi2_client_finalize (GObject * object); static void gst_wasapi2_client_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); @@ -305,8 +292,7 @@ gst_wasapi2_client_main_loop_running_cb (GstWasapi2Client * self); #define gst_wasapi2_client_parent_class parent_class -G_DEFINE_TYPE (GstWasapi2Client, - gst_wasapi2_client, GST_TYPE_OBJECT); +G_DEFINE_TYPE (GstWasapi2Client, gst_wasapi2_client, GST_TYPE_OBJECT); static void gst_wasapi2_client_class_init (GstWasapi2ClientClass * klass) @@ -317,17 +303,16 @@ G_PARAM_STATIC_STRINGS); gobject_class->constructed = gst_wasapi2_client_constructed; - gobject_class->dispose = gst_wasapi2_client_dispose; gobject_class->finalize = gst_wasapi2_client_finalize; gobject_class->get_property = gst_wasapi2_client_get_property; gobject_class->set_property = gst_wasapi2_client_set_property; g_object_class_install_property (gobject_class, PROP_DEVICE, g_param_spec_string ("device", "Device", - "WASAPI playback device as a GUID string", NULL, param_flags)); + "WASAPI playback device as a GUID string", nullptr, param_flags)); g_object_class_install_property (gobject_class, 
PROP_DEVICE_NAME, g_param_spec_string ("device-name", "Device Name", - "The human-readable device name", NULL, param_flags)); + "The human-readable device name", nullptr, param_flags)); g_object_class_install_property (gobject_class, PROP_DEVICE_INDEX, g_param_spec_int ("device-index", "Device Index", "The zero-based device index", -1, G_MAXINT, DEFAULT_DEVICE_INDEX, @@ -336,13 +321,13 @@ g_param_spec_enum ("device-class", "Device Class", "Device class", GST_TYPE_WASAPI2_CLIENT_DEVICE_CLASS, DEFAULT_DEVICE_CLASS, param_flags)); - g_object_class_install_property (gobject_class, PROP_LOW_LATENCY, - g_param_spec_boolean ("low-latency", "Low latency", - "Optimize all settings for lowest latency. Always safe to enable.", - DEFAULT_LOW_LATENCY, param_flags)); g_object_class_install_property (gobject_class, PROP_DISPATCHER, g_param_spec_pointer ("dispatcher", "Dispatcher", "ICoreDispatcher COM object to use", param_flags)); + g_object_class_install_property (gobject_class, PROP_CAN_AUTO_ROUTING, + g_param_spec_boolean ("auto-routing", "Auto Routing", + "Whether client can support automatic stream routing", FALSE, + (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS))); } static void @@ -350,11 +335,7 @@ { self->device_index = DEFAULT_DEVICE_INDEX; self->device_class = DEFAULT_DEVICE_CLASS; - self->low_latency = DEFAULT_LOW_LATENCY; - - self->adapter = gst_adapter_new (); - self->event_handle = CreateEvent (NULL, FALSE, FALSE, NULL); - self->cancellable = CreateEvent (NULL, TRUE, FALSE, NULL); + self->can_auto_routing = FALSE; g_mutex_init (&self->lock); g_cond_init (&self->cond); @@ -371,7 +352,9 @@ gst_wasapi2_client_constructed (GObject * object) { GstWasapi2Client *self = GST_WASAPI2_CLIENT (object); + /* *INDENT-OFF* */ ComPtr<GstWasapiDeviceActivator> activator; + /* *INDENT-ON* */ /* Create a new thread to ensure that COM thread can be MTA thread. 
* We cannot ensure whether CoInitializeEx() was called outside of here for @@ -388,44 +371,26 @@ } static void -gst_wasapi2_client_dispose (GObject * object) +gst_wasapi2_client_finalize (GObject * object) { GstWasapi2Client *self = GST_WASAPI2_CLIENT (object); - GST_DEBUG_OBJECT (self, "dispose"); - - gst_clear_caps (&self->supported_caps); - if (self->loop) { g_main_loop_quit (self->loop); g_thread_join (self->thread); g_main_context_unref (self->context); g_main_loop_unref (self->loop); - self->thread = NULL; - self->context = NULL; - self->loop = NULL; + self->thread = nullptr; + self->context = nullptr; + self->loop = nullptr; } - g_clear_object (&self->adapter); - - G_OBJECT_CLASS (parent_class)->dispose (object); -} - -static void -gst_wasapi2_client_finalize (GObject * object) -{ - GstWasapi2Client *self = GST_WASAPI2_CLIENT (object); + gst_clear_caps (&self->supported_caps); g_free (self->device_id); g_free (self->device_name); - g_free (self->positions); - - CoTaskMemFree (self->mix_format); - CloseHandle (self->event_handle); - CloseHandle (self->cancellable); - g_mutex_clear (&self->lock); g_cond_clear (&self->cond); @@ -454,12 +419,12 @@ case PROP_DEVICE_CLASS: g_value_set_enum (value, self->device_class); break; - case PROP_LOW_LATENCY: - g_value_set_boolean (value, self->low_latency); - break; case PROP_DISPATCHER: g_value_set_pointer (value, self->dispatcher); break; + case PROP_CAN_AUTO_ROUTING: + g_value_set_boolean (value, self->can_auto_routing); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -488,9 +453,6 @@ self->device_class = (GstWasapi2ClientDeviceClass) g_value_get_enum (value); break; - case PROP_LOW_LATENCY: - self->low_latency = g_value_get_boolean (value); - break; case PROP_DISPATCHER: self->dispatcher = g_value_get_pointer (value); break; @@ -514,13 +476,13 @@ static void gst_wasapi2_client_on_device_activated (GstWasapi2Client * self, - IAudioClient3 * audio_client) + IAudioClient * 
audio_client) { GST_INFO_OBJECT (self, "Device activated"); g_mutex_lock (&self->init_lock); if (audio_client) { - audio_client->AddRef(); + audio_client->AddRef (); self->audio_client = audio_client; self->activate_state = GST_WASAPI2_CLIENT_ACTIVATE_DONE; } else { @@ -531,6 +493,7 @@ g_mutex_unlock (&self->init_lock); } +/* *INDENT-OFF* */ static std::string convert_wstring_to_string (const std::wstring &wstr) { @@ -573,17 +536,20 @@ return ret; } +/* *INDENT-ON* */ static gboolean gst_wasapi2_client_activate_async (GstWasapi2Client * self, GstWasapiDeviceActivator * activator) { - HRESULT hr; + /* *INDENT-OFF* */ ComPtr<IDeviceInformationStatics> device_info_static; ComPtr<IAsyncOperation<DeviceInformationCollection*>> async_op; ComPtr<IVectorView<DeviceInformation*>> device_list; HStringReference hstr_device_info = HStringReference(RuntimeClass_Windows_Devices_Enumeration_DeviceInformation); + /* *INDENT-ON* */ + HRESULT hr; DeviceClass device_class; unsigned int count = 0; gint device_index = 0; @@ -596,8 +562,9 @@ GST_INFO_OBJECT (self, "requested device info, device-class: %s, device: %s, device-index: %d", - self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE ? "capture" : - "render", GST_STR_NULL (self->device_id), self->device_index); + self->device_class == + GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE ? 
"capture" : "render", + GST_STR_NULL (self->device_id), self->device_index); if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE) { device_class = DeviceClass::DeviceClass_AudioCapture; @@ -623,17 +590,22 @@ * Note that default device is much preferred * See https://docs.microsoft.com/en-us/windows/win32/coreaudio/automatic-stream-routing */ - if (self->device_id && - g_ascii_strcasecmp (self->device_id, default_device_id.c_str()) == 0) { - GST_DEBUG_OBJECT (self, "Default device was requested"); - use_default_device = TRUE; - } else if (self->device_index < 0 && !self->device_id) { - GST_DEBUG_OBJECT (self, - "No device was explicitly requested, use default device"); - use_default_device = TRUE; - } else if (!self->device_id && self->device_index == 0) { - GST_DEBUG_OBJECT (self, "device-index == zero means default device"); - use_default_device = TRUE; + + /* DEVINTERFACE_AUDIO_CAPTURE and DEVINTERFACE_AUDIO_RENDER are available + * as of Windows 10 */ + if (gst_wasapi2_can_automatic_stream_routing ()) { + if (self->device_id && + g_ascii_strcasecmp (self->device_id, default_device_id.c_str ()) == 0) { + GST_DEBUG_OBJECT (self, "Default device was requested"); + use_default_device = TRUE; + } else if (self->device_index < 0 && !self->device_id) { + GST_DEBUG_OBJECT (self, + "No device was explicitly requested, use default device"); + use_default_device = TRUE; + } else if (!self->device_id && self->device_index == 0) { + GST_DEBUG_OBJECT (self, "device-index == zero means default device"); + use_default_device = TRUE; + } } if (use_default_device) { @@ -646,7 +618,7 @@ goto activate; } - hr = GetActivationFactory (hstr_device_info.Get(), &device_info_static); + hr = GetActivationFactory (hstr_device_info.Get (), &device_info_static); if (!gst_wasapi2_result (hr)) goto failed; @@ -655,7 +627,9 @@ if (!gst_wasapi2_result (hr)) goto failed; + /* *INDENT-OFF* */ hr = SyncWait<DeviceInformationCollection*>(async_op.Get ()); + /* *INDENT-ON* */ if 
(!gst_wasapi2_result (hr)) goto failed; @@ -683,10 +657,17 @@ GST_DEBUG_OBJECT (self, "Available device count: %d", count); - /* zero is for default device */ - device_index = 1; + if (gst_wasapi2_can_automatic_stream_routing ()) { + /* zero is for default device */ + device_index = 1; + } else { + device_index = 0; + } + for (unsigned int i = 0; i < count; i++) { + /* *INDENT-OFF* */ ComPtr<IDeviceInformation> device_info; + /* *INDENT-ON* */ HString id; HString name; boolean b_value; @@ -709,16 +690,16 @@ /* To ensure device id and device name are available, * will query this later again once target device is determined */ - hr = device_info->get_Id (id.GetAddressOf()); + hr = device_info->get_Id (id.GetAddressOf ()); if (!gst_wasapi2_result (hr)) continue; - if (!id.IsValid()) { + if (!id.IsValid ()) { GST_WARNING_OBJECT (self, "Device index %d has invalid id", i); continue; } - hr = device_info->get_Name (name.GetAddressOf()); + hr = device_info->get_Name (name.GetAddressOf ()); if (!gst_wasapi2_result (hr)) continue; @@ -740,7 +721,16 @@ } GST_DEBUG_OBJECT (self, "device [%d] id: %s, name: %s", - device_index, cur_device_id.c_str(), cur_device_name.c_str()); + device_index, cur_device_id.c_str (), cur_device_name.c_str ()); + + if (self->device_index < 0 && !self->device_id) { + GST_INFO_OBJECT (self, "Select the first device, device id %s", + cur_device_id.c_str ()); + target_device_id_wstring = id.GetRawBuffer (nullptr); + target_device_id = cur_device_id; + target_device_name = cur_device_name; + break; + } if (self->device_id && g_ascii_strcasecmp (self->device_id, cur_device_id.c_str ()) == 0) { @@ -774,12 +764,14 @@ activate: /* fill device id and name */ g_free (self->device_id); - self->device_id = g_strdup (target_device_id.c_str()); + self->device_id = g_strdup (target_device_id.c_str ()); g_free (self->device_name); self->device_name = g_strdup (target_device_name.c_str ()); self->device_index = device_index; + /* default device supports automatic 
stream routing */ + self->can_auto_routing = use_default_device; hr = activator->ActivateDeviceAsync (target_device_id_wstring); if (!gst_wasapi2_result (hr)) { @@ -825,10 +817,13 @@ RoInitializeWrapper initialize (RO_INIT_MULTITHREADED); GSource *source; HRESULT hr; + /* *INDENT-OFF* */ ComPtr<GstWasapiDeviceActivator> activator; hr = MakeAndInitialize<GstWasapiDeviceActivator> (&activator, self, self->dispatcher); + /* *INDENT-ON* */ + if (!gst_wasapi2_result (hr)) { GST_ERROR_OBJECT (self, "Could not create activator object"); self->activate_state = GST_WASAPI2_CLIENT_ACTIVATE_FAILED; @@ -850,7 +845,7 @@ source = g_idle_source_new (); g_source_set_callback (source, - (GSourceFunc) gst_wasapi2_client_main_loop_running_cb, self, NULL); + (GSourceFunc) gst_wasapi2_client_main_loop_running_cb, self, nullptr); g_source_attach (source, self->context); g_source_unref (source); @@ -860,27 +855,7 @@ g_main_context_pop_thread_default (self->context); - gst_wasapi2_client_stop (self); - - if (self->audio_volume) { - self->audio_volume->Release (); - self->audio_volume = NULL; - } - - if (self->audio_render_client) { - self->audio_render_client->Release (); - self->audio_render_client = NULL; - } - - if (self->audio_capture_client) { - self->audio_capture_client->Release (); - self->audio_capture_client = NULL; - } - - if (self->audio_client) { - self->audio_client->Release (); - self->audio_client = NULL; - } + GST_WASAPI2_CLEAR_COM (self->audio_client); /* Reset explicitly to ensure that it happens before * RoInitializeWrapper dtor is called */ @@ -888,929 +863,48 @@ GST_DEBUG_OBJECT (self, "Exit thread function"); - return NULL; -} - -static const gchar * -gst_waveformatex_to_audio_format (WAVEFORMATEXTENSIBLE * format) -{ - const gchar *fmt_str = NULL; - GstAudioFormat fmt = GST_AUDIO_FORMAT_UNKNOWN; - - if (format->Format.wFormatTag == WAVE_FORMAT_PCM) { - fmt = gst_audio_format_build_integer (TRUE, G_LITTLE_ENDIAN, - format->Format.wBitsPerSample, 
format->Format.wBitsPerSample); - } else if (format->Format.wFormatTag == WAVE_FORMAT_IEEE_FLOAT) { - if (format->Format.wBitsPerSample == 32) - fmt = GST_AUDIO_FORMAT_F32LE; - else if (format->Format.wBitsPerSample == 64) - fmt = GST_AUDIO_FORMAT_F64LE; - } else if (format->Format.wFormatTag == WAVE_FORMAT_EXTENSIBLE) { - if (IsEqualGUID (format->SubFormat, KSDATAFORMAT_SUBTYPE_PCM)) { - fmt = gst_audio_format_build_integer (TRUE, G_LITTLE_ENDIAN, - format->Format.wBitsPerSample, format->Samples.wValidBitsPerSample); - } else if (IsEqualGUID (format->SubFormat, - KSDATAFORMAT_SUBTYPE_IEEE_FLOAT)) { - if (format->Format.wBitsPerSample == 32 - && format->Samples.wValidBitsPerSample == 32) - fmt = GST_AUDIO_FORMAT_F32LE; - else if (format->Format.wBitsPerSample == 64 && - format->Samples.wValidBitsPerSample == 64) - fmt = GST_AUDIO_FORMAT_F64LE; - } - } - - if (fmt != GST_AUDIO_FORMAT_UNKNOWN) - fmt_str = gst_audio_format_to_string (fmt); - - return fmt_str; -} - -static void -gst_wasapi_util_channel_position_all_none (guint channels, - GstAudioChannelPosition * position) -{ - int ii; - for (ii = 0; ii < channels; ii++) - position[ii] = GST_AUDIO_CHANNEL_POSITION_NONE; -} - -static struct -{ - guint64 wasapi_pos; - GstAudioChannelPosition gst_pos; -} wasapi_to_gst_pos[] = { - {SPEAKER_FRONT_LEFT, GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT}, - {SPEAKER_FRONT_RIGHT, GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT}, - {SPEAKER_FRONT_CENTER, GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER}, - {SPEAKER_LOW_FREQUENCY, GST_AUDIO_CHANNEL_POSITION_LFE1}, - {SPEAKER_BACK_LEFT, GST_AUDIO_CHANNEL_POSITION_REAR_LEFT}, - {SPEAKER_BACK_RIGHT, GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT}, - {SPEAKER_FRONT_LEFT_OF_CENTER, - GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER}, - {SPEAKER_FRONT_RIGHT_OF_CENTER, - GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER}, - {SPEAKER_BACK_CENTER, GST_AUDIO_CHANNEL_POSITION_REAR_CENTER}, - /* Enum values diverge from this point onwards */ - {SPEAKER_SIDE_LEFT, 
GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT}, - {SPEAKER_SIDE_RIGHT, GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT}, - {SPEAKER_TOP_CENTER, GST_AUDIO_CHANNEL_POSITION_TOP_CENTER}, - {SPEAKER_TOP_FRONT_LEFT, GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_LEFT}, - {SPEAKER_TOP_FRONT_CENTER, GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_CENTER}, - {SPEAKER_TOP_FRONT_RIGHT, GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_RIGHT}, - {SPEAKER_TOP_BACK_LEFT, GST_AUDIO_CHANNEL_POSITION_TOP_REAR_LEFT}, - {SPEAKER_TOP_BACK_CENTER, GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER}, - {SPEAKER_TOP_BACK_RIGHT, GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT} -}; - -/* Parse WAVEFORMATEX to get the gstreamer channel mask, and the wasapi channel - * positions so GstAudioRingbuffer can reorder the audio data to match the - * gstreamer channel order. */ -static guint64 -gst_wasapi_util_waveformatex_to_channel_mask (WAVEFORMATEXTENSIBLE * format, - GstAudioChannelPosition ** out_position) -{ - int ii, ch; - guint64 mask = 0; - WORD nChannels = format->Format.nChannels; - DWORD dwChannelMask = format->dwChannelMask; - GstAudioChannelPosition *pos = NULL; - - pos = g_new (GstAudioChannelPosition, nChannels); - gst_wasapi_util_channel_position_all_none (nChannels, pos); - - /* Too many channels, have to assume that they are all non-positional */ - if (nChannels > G_N_ELEMENTS (wasapi_to_gst_pos)) { - GST_INFO ("Got too many (%i) channels, assuming non-positional", nChannels); - goto out; - } - - /* Too many bits in the channel mask, and the bits don't match nChannels */ - if (dwChannelMask >> (G_N_ELEMENTS (wasapi_to_gst_pos) + 1) != 0) { - GST_WARNING ("Too many bits in channel mask (%lu), assuming " - "non-positional", dwChannelMask); - goto out; - } - - /* Map WASAPI's channel mask to Gstreamer's channel mask and positions. - * If the no. of bits in the mask > nChannels, we will ignore the extra. 
*/ - for (ii = 0, ch = 0; ii < G_N_ELEMENTS (wasapi_to_gst_pos) && ch < nChannels; - ii++) { - if (!(dwChannelMask & wasapi_to_gst_pos[ii].wasapi_pos)) - /* no match, try next */ - continue; - mask |= G_GUINT64_CONSTANT (1) << wasapi_to_gst_pos[ii].gst_pos; - pos[ch++] = wasapi_to_gst_pos[ii].gst_pos; - } - - /* XXX: Warn if some channel masks couldn't be mapped? */ - - GST_DEBUG ("Converted WASAPI mask 0x%" G_GINT64_MODIFIER "x -> 0x%" - G_GINT64_MODIFIER "x", (guint64) dwChannelMask, (guint64) mask); - -out: - if (out_position) - *out_position = pos; - return mask; -} - -static gboolean -gst_wasapi2_util_parse_waveformatex (WAVEFORMATEXTENSIBLE * format, - GstCaps * template_caps, GstCaps ** out_caps, - GstAudioChannelPosition ** out_positions) -{ - int ii; - const gchar *afmt; - guint64 channel_mask; - - *out_caps = NULL; - - /* TODO: handle SPDIF and other encoded formats */ - - /* 1 or 2 channels <= 16 bits sample size OR - * 1 or 2 channels > 16 bits sample size or >2 channels */ - if (format->Format.wFormatTag != WAVE_FORMAT_PCM && - format->Format.wFormatTag != WAVE_FORMAT_IEEE_FLOAT && - format->Format.wFormatTag != WAVE_FORMAT_EXTENSIBLE) - /* Unhandled format tag */ - return FALSE; - - /* WASAPI can only tell us one canonical mix format that it will accept. The - * alternative is calling IsFormatSupported on all combinations of formats. 
- * Instead, it's simpler and faster to require conversion inside gstreamer */ - afmt = gst_waveformatex_to_audio_format (format); - if (afmt == NULL) - return FALSE; - - *out_caps = gst_caps_copy (template_caps); - - /* This will always return something that might be usable */ - channel_mask = - gst_wasapi_util_waveformatex_to_channel_mask (format, out_positions); - - for (ii = 0; ii < gst_caps_get_size (*out_caps); ii++) { - GstStructure *s = gst_caps_get_structure (*out_caps, ii); - - gst_structure_set (s, - "format", G_TYPE_STRING, afmt, - "channels", G_TYPE_INT, format->Format.nChannels, - "rate", G_TYPE_INT, format->Format.nSamplesPerSec, NULL); - - if (channel_mask) { - gst_structure_set (s, - "channel-mask", GST_TYPE_BITMASK, channel_mask, NULL); - } - } - - return TRUE; + return nullptr; } GstCaps * gst_wasapi2_client_get_caps (GstWasapi2Client * client) { - WAVEFORMATEX *format = NULL; + WAVEFORMATEX *mix_format = nullptr; static GstStaticCaps static_caps = GST_STATIC_CAPS (GST_WASAPI2_STATIC_CAPS); GstCaps *scaps; HRESULT hr; - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), NULL); + g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), nullptr); if (client->supported_caps) return gst_caps_ref (client->supported_caps); if (!client->audio_client) { GST_WARNING_OBJECT (client, "IAudioClient3 wasn't configured"); - return NULL; + return nullptr; } - CoTaskMemFree (client->mix_format); - client->mix_format = nullptr; - - g_clear_pointer (&client->positions, g_free); - - hr = client->audio_client->GetMixFormat (&format); - if (!gst_wasapi2_result (hr)) - return NULL; + hr = client->audio_client->GetMixFormat (&mix_format); + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (client, "Failed to get mix format"); + return nullptr; + } scaps = gst_static_caps_get (&static_caps); - gst_wasapi2_util_parse_waveformatex ((WAVEFORMATEXTENSIBLE *) format, - scaps, &client->supported_caps, &client->positions); + gst_wasapi2_util_parse_waveformatex (mix_format, 
+ scaps, &client->supported_caps, nullptr); gst_caps_unref (scaps); - client->mix_format = format; + CoTaskMemFree (mix_format); if (!client->supported_caps) { GST_ERROR_OBJECT (client, "No caps from subclass"); - return NULL; + return nullptr; } return gst_caps_ref (client->supported_caps); } -static gboolean -gst_wasapi2_client_initialize_audio_client3 (GstWasapi2Client * self) -{ - HRESULT hr; - UINT32 default_period, fundamental_period, min_period, max_period; - DWORD stream_flags = AUDCLNT_STREAMFLAGS_EVENTCALLBACK; - WAVEFORMATEX *format = NULL; - UINT32 period; - gboolean ret = FALSE; - IAudioClient3 *audio_client = self->audio_client; - - hr = audio_client->GetSharedModeEnginePeriod (self->mix_format, - &default_period, &fundamental_period, &min_period, &max_period); - if (!gst_wasapi2_result (hr)) - goto done; - - GST_INFO_OBJECT (self, "Using IAudioClient3, default period %d frames, " - "fundamental period %d frames, minimum period %d frames, maximum period " - "%d frames", default_period, fundamental_period, min_period, max_period); - - hr = audio_client->InitializeSharedAudioStream (stream_flags, min_period, - self->mix_format, nullptr); - - if (!gst_wasapi2_result (hr)) { - GST_WARNING_OBJECT (self, "Failed to initialize IAudioClient3"); - goto done; - } - - /* query period again to be ensured */ - hr = audio_client->GetCurrentSharedModeEnginePeriod (&format, &period); - if (!gst_wasapi2_result (hr)) { - GST_WARNING_OBJECT (self, "Failed to get current period"); - goto done; - } - - self->device_period = period; - ret = TRUE; - -done: - CoTaskMemFree (format); - - return ret; -} - -static void -gst_wasapi2_util_get_best_buffer_sizes (GstAudioRingBufferSpec * spec, - REFERENCE_TIME default_period, REFERENCE_TIME min_period, - REFERENCE_TIME * ret_period, REFERENCE_TIME * ret_buffer_duration) -{ - REFERENCE_TIME use_period, use_buffer; - - /* Shared mode always runs at the default period, so if we want a larger - * period (for lower CPU usage), we do it 
as a multiple of that */ - use_period = default_period; - - /* Ensure that the period (latency_time) used is an integral multiple of - * either the default period or the minimum period */ - use_period = use_period * MAX ((spec->latency_time * 10) / use_period, 1); - - /* Ask WASAPI to create a software ringbuffer of at least this size; it may - * be larger so the actual buffer time may be different, which is why after - * initialization we read the buffer duration actually in-use and set - * segsize/segtotal from that. */ - use_buffer = spec->buffer_time * 10; - /* Has to be at least twice the period */ - if (use_buffer < 2 * use_period) - use_buffer = 2 * use_period; - - *ret_period = use_period; - *ret_buffer_duration = use_buffer; -} - -static gboolean -gst_wasapi2_client_initialize_audio_client (GstWasapi2Client * self, - GstAudioRingBufferSpec * spec) -{ - REFERENCE_TIME default_period, min_period; - REFERENCE_TIME device_period, device_buffer_duration; - guint rate; - DWORD stream_flags = AUDCLNT_STREAMFLAGS_EVENTCALLBACK; - HRESULT hr; - IAudioClient3 *audio_client = self->audio_client; - - hr = audio_client->GetDevicePeriod (&default_period, &min_period); - if (!gst_wasapi2_result (hr)) { - GST_WARNING_OBJECT (self, "Couldn't get device period info"); - return FALSE; - } - - GST_INFO_OBJECT (self, "wasapi2 default period: %" G_GINT64_FORMAT - ", min period: %" G_GINT64_FORMAT, default_period, min_period); - - rate = GST_AUDIO_INFO_RATE (&spec->info); - - if (self->low_latency) { - device_period = default_period; - /* this should be same as hnsPeriodicity - * when AUDCLNT_STREAMFLAGS_EVENTCALLBACK is used - * And in case of shared mode, hnsPeriodicity should be zero, so - * this value should be zero as well */ - device_buffer_duration = 0; - } else { - /* Clamp values to integral multiples of an appropriate period */ - gst_wasapi2_util_get_best_buffer_sizes (spec, - default_period, min_period, &device_period, &device_buffer_duration); - } - - hr = 
audio_client->Initialize (AUDCLNT_SHAREMODE_SHARED, stream_flags, - device_buffer_duration, - /* This must always be 0 in shared mode */ - 0, - self->mix_format, nullptr); - if (!gst_wasapi2_result (hr)) { - GST_WARNING_OBJECT (self, "Couldn't initialize audioclient"); - return FALSE; - } - - /* device_period can be a non-power-of-10 value so round while converting */ - self->device_period = - gst_util_uint64_scale_round (device_period, rate * 100, GST_SECOND); - - return TRUE; -} - -gboolean -gst_wasapi2_client_open (GstWasapi2Client * client, GstAudioRingBufferSpec * spec, - GstAudioRingBuffer * buf) -{ - HRESULT hr; - REFERENCE_TIME latency_rt; - guint bpf, rate; - IAudioClient3 *audio_client; - ComPtr<ISimpleAudioVolume> audio_volume; - gboolean initialized = FALSE; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - - /* FIXME: Once IAudioClient3 was initialized, we may need to re-open - * IAudioClient3 in order to handle audio format change */ - if (client->opened) { - GST_INFO_OBJECT (client, "IAudioClient3 object is initialized already"); - return TRUE; - } - - audio_client = client->audio_client; - - if (!audio_client) { - GST_ERROR_OBJECT (client, "IAudioClient3 object wasn't configured"); - return FALSE; - } - - if (!client->mix_format) { - GST_ERROR_OBJECT (client, "Unknown mix format"); - return FALSE; - } - - /* Only use audioclient3 when low-latency is requested because otherwise - * very slow machines and VMs with 1 CPU allocated will get glitches: - * https://bugzilla.gnome.org/show_bug.cgi?id=794497 */ - if (client->low_latency) - initialized = gst_wasapi2_client_initialize_audio_client3 (client); - - /* Try again if IAudioClinet3 API is unavailable. 
- * NOTE: IAudioClinet3:: methods might not be available for default device - * NOTE: The default device is a special device which is needed for supporting - * automatic stream routing - * https://docs.microsoft.com/en-us/windows/win32/coreaudio/automatic-stream-routing - */ - if (!initialized) - initialized = gst_wasapi2_client_initialize_audio_client (client, spec); - - if (!initialized) { - GST_ERROR_OBJECT (client, "Failed to initialize audioclient"); - return FALSE; - } - - bpf = GST_AUDIO_INFO_BPF (&spec->info); - rate = GST_AUDIO_INFO_RATE (&spec->info); - - /* Total size in frames of the allocated buffer that we will read from */ - hr = audio_client->GetBufferSize (&client->buffer_frame_count); - if (!gst_wasapi2_result (hr)) { - return FALSE; - } - - GST_INFO_OBJECT (client, "buffer size is %i frames, device period is %i " - "frames, bpf is %i bytes, rate is %i Hz", client->buffer_frame_count, - client->device_period, bpf, rate); - - /* Actual latency-time/buffer-time will be different now */ - spec->segsize = client->device_period * bpf; - - /* We need a minimum of 2 segments to ensure glitch-free playback */ - spec->segtotal = MAX (client->buffer_frame_count * bpf / spec->segsize, 2); - - GST_INFO_OBJECT (client, "segsize is %i, segtotal is %i", spec->segsize, - spec->segtotal); - - /* Get WASAPI latency for logging */ - hr = audio_client->GetStreamLatency (&latency_rt); - if (!gst_wasapi2_result (hr)) { - return FALSE; - } - - GST_INFO_OBJECT (client, "wasapi2 stream latency: %" G_GINT64_FORMAT " (%" - G_GINT64_FORMAT " ms)", latency_rt, latency_rt / 10000); - - /* Set the event handler which will trigger read/write */ - hr = audio_client->SetEventHandle (client->event_handle); - if (!gst_wasapi2_result (hr)) - return FALSE; - - if (client->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER) { - ComPtr<IAudioRenderClient> render_client; - - hr = audio_client->GetService (IID_PPV_ARGS (&render_client)); - if (!gst_wasapi2_result (hr)) - return FALSE; 
- - client->audio_render_client = render_client.Detach (); - } else { - ComPtr<IAudioCaptureClient> capture_client; - - hr = audio_client->GetService (IID_PPV_ARGS (&capture_client)); - if (!gst_wasapi2_result (hr)) - return FALSE; - - client->audio_capture_client = capture_client.Detach (); - } - - hr = audio_client->GetService (IID_PPV_ARGS (&audio_volume)); - if (!gst_wasapi2_result (hr)) - return FALSE; - - client->audio_volume = audio_volume.Detach (); - - gst_audio_ring_buffer_set_channel_positions (buf, client->positions); - - client->opened = TRUE; - - return TRUE; -} - -/* Get the empty space in the buffer that we have to write to */ -static gint -gst_wasapi2_client_get_can_frames (GstWasapi2Client * self) -{ - HRESULT hr; - UINT32 n_frames_padding; - IAudioClient3 *audio_client = self->audio_client; - - if (!audio_client) { - GST_WARNING_OBJECT (self, "IAudioClient3 wasn't configured"); - return -1; - } - - /* Frames the card hasn't rendered yet */ - hr = audio_client->GetCurrentPadding (&n_frames_padding); - if (!gst_wasapi2_result (hr)) - return -1; - - GST_LOG_OBJECT (self, "%d unread frames (padding)", n_frames_padding); - - /* We can write out these many frames */ - return self->buffer_frame_count - n_frames_padding; -} - -gboolean -gst_wasapi2_client_start (GstWasapi2Client * client) -{ - HRESULT hr; - IAudioClient3 *audio_client; - WAVEFORMATEX *mix_format; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - - audio_client = client->audio_client; - mix_format = client->mix_format; - - if (!audio_client) { - GST_ERROR_OBJECT (client, "IAudioClient3 object wasn't configured"); - return FALSE; - } - - if (!mix_format) { - GST_ERROR_OBJECT (client, "Unknown MixFormat"); - return FALSE; - } - - if (client->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE && - !client->audio_capture_client) { - GST_ERROR_OBJECT (client, "IAudioCaptureClient wasn't configured"); - return FALSE; - } - - if (client->device_class == 
GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER && - !client->audio_render_client) { - GST_ERROR_OBJECT (client, "IAudioRenderClient wasn't configured"); - return FALSE; - } - - ResetEvent (client->cancellable); - - if (client->running) { - GST_WARNING_OBJECT (client, "IAudioClient3 is running already"); - return TRUE; - } - - /* To avoid start-up glitches, before starting the streaming, we fill the - * buffer with silence as recommended by the documentation: - * https://msdn.microsoft.com/en-us/library/windows/desktop/dd370879%28v=vs.85%29.aspx */ - if (client->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER) { - IAudioRenderClient *render_client = client->audio_render_client; - gint n_frames, len; - BYTE *dst = NULL; - - n_frames = gst_wasapi2_client_get_can_frames (client); - if (n_frames < 1) { - GST_ERROR_OBJECT (client, - "should have more than %i frames to write", n_frames); - return FALSE; - } - - len = n_frames * mix_format->nBlockAlign; - - hr = render_client->GetBuffer (n_frames, &dst); - if (!gst_wasapi2_result (hr)) { - GST_ERROR_OBJECT (client, "Couldn't get buffer"); - return FALSE; - } - - GST_DEBUG_OBJECT (client, "pre-wrote %i bytes of silence", len); - - hr = render_client->ReleaseBuffer (n_frames, AUDCLNT_BUFFERFLAGS_SILENT); - if (!gst_wasapi2_result (hr)) { - GST_ERROR_OBJECT (client, "Couldn't release buffer"); - return FALSE; - } - } - - hr = audio_client->Start (); - client->running = gst_wasapi2_result (hr); - gst_adapter_clear (client->adapter); - - return client->running; -} - -gboolean -gst_wasapi2_client_stop (GstWasapi2Client * client) -{ - HRESULT hr; - IAudioClient3 *audio_client; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - - audio_client = client->audio_client; - - if (!client->running) { - GST_DEBUG_OBJECT (client, "We are not running now"); - return TRUE; - } - - if (!client->audio_client) { - GST_ERROR_OBJECT (client, "IAudioClient3 object wasn't configured"); - return FALSE; - } - - client->running = 
FALSE; - SetEvent (client->cancellable); - - hr = audio_client->Stop (); - if (!gst_wasapi2_result (hr)) - return FALSE; - - /* reset state for reuse case */ - hr = audio_client->Reset (); - return gst_wasapi2_result (hr); -} - -gint -gst_wasapi2_client_read (GstWasapi2Client * client, gpointer data, guint length) -{ - IAudioCaptureClient *capture_client; - WAVEFORMATEX *mix_format; - HRESULT hr; - BYTE *from = NULL; - guint wanted = length; - guint bpf; - DWORD flags; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - g_return_val_if_fail (client->audio_capture_client != NULL, -1); - g_return_val_if_fail (client->mix_format != NULL, -1); - - capture_client = client->audio_capture_client; - mix_format = client->mix_format; - - if (!client->running) { - GST_ERROR_OBJECT (client, "client is not running now"); - return -1; - } - - /* If we've accumulated enough data, return it immediately */ - if (gst_adapter_available (client->adapter) >= wanted) { - memcpy (data, gst_adapter_map (client->adapter, wanted), wanted); - gst_adapter_flush (client->adapter, wanted); - GST_DEBUG_OBJECT (client, "Adapter has enough data, returning %i", wanted); - return wanted; - } - - bpf = mix_format->nBlockAlign; - - while (wanted > 0) { - DWORD dwWaitResult; - guint got_frames, avail_frames, n_frames, want_frames, read_len; - HANDLE event_handle[2]; - - event_handle[0] = client->event_handle; - event_handle[1] = client->cancellable; - - /* Wait for data to become available */ - dwWaitResult = WaitForMultipleObjects (2, event_handle, FALSE, INFINITE); - if (dwWaitResult != WAIT_OBJECT_0 && dwWaitResult != WAIT_OBJECT_0 + 1) { - GST_ERROR_OBJECT (client, "Error waiting for event handle: %x", - (guint) dwWaitResult); - return -1; - } - - if (!client->running) { - GST_DEBUG_OBJECT (client, "Cancelled"); - return -1; - } - - hr = capture_client->GetBuffer (&from, &got_frames, &flags, nullptr, - nullptr); - if (!gst_wasapi2_result (hr)) { - if (hr == AUDCLNT_S_BUFFER_EMPTY) { 
- GST_INFO_OBJECT (client, "Client buffer is empty, retry"); - return 0; - } - - GST_ERROR_OBJECT (client, "Couldn't get buffer from capture client"); - return -1; - } - - if (got_frames == 0) { - GST_DEBUG_OBJECT (client, "No buffer to read"); - capture_client->ReleaseBuffer (got_frames); - return 0; - } - - if (G_UNLIKELY (flags != 0)) { - /* https://docs.microsoft.com/en-us/windows/win32/api/audioclient/ne-audioclient-_audclnt_bufferflags */ - if (flags & AUDCLNT_BUFFERFLAGS_DATA_DISCONTINUITY) - GST_DEBUG_OBJECT (client, "WASAPI reported discontinuity (glitch?)"); - if (flags & AUDCLNT_BUFFERFLAGS_TIMESTAMP_ERROR) - GST_DEBUG_OBJECT (client, "WASAPI reported a timestamp error"); - } - - /* Copy all the frames we got into the adapter, and then extract at most - * @wanted size of frames from it. This helps when ::GetBuffer returns more - * data than we can handle right now. */ - { - GstBuffer *tmp = gst_buffer_new_allocate (NULL, got_frames * bpf, NULL); - /* If flags has AUDCLNT_BUFFERFLAGS_SILENT, we will ignore the actual - * data and write out silence, see: - * https://docs.microsoft.com/en-us/windows/win32/api/audioclient/ne-audioclient-_audclnt_bufferflags */ - if (flags & AUDCLNT_BUFFERFLAGS_SILENT) - memset (from, 0, got_frames * bpf); - gst_buffer_fill (tmp, 0, from, got_frames * bpf); - gst_adapter_push (client->adapter, tmp); - } - - /* Release all captured buffers; we copied them above */ - hr = capture_client->ReleaseBuffer (got_frames); - from = NULL; - if (!gst_wasapi2_result (hr)) { - GST_ERROR_OBJECT (client, "Failed to release buffer"); - return -1; - } - - want_frames = wanted / bpf; - avail_frames = gst_adapter_available (client->adapter) / bpf; - - /* Only copy data that will fit into the allocated buffer of size @length */ - n_frames = MIN (avail_frames, want_frames); - read_len = n_frames * bpf; - - if (read_len == 0) { - GST_WARNING_OBJECT (client, "No data to read"); - return 0; - } - - GST_LOG_OBJECT (client, "frames captured: %d (%d 
bytes), " - "can read: %d (%d bytes), will read: %d (%d bytes), " - "adapter has: %d (%d bytes)", got_frames, got_frames * bpf, want_frames, - wanted, n_frames, read_len, avail_frames, avail_frames * bpf); - - memcpy (data, gst_adapter_map (client->adapter, read_len), read_len); - gst_adapter_flush (client->adapter, read_len); - wanted -= read_len; - } - - return length; -} - -gint -gst_wasapi2_client_write (GstWasapi2Client * client, gpointer data, - guint length) -{ - IAudioRenderClient *render_client; - WAVEFORMATEX *mix_format; - HRESULT hr; - BYTE *dst = nullptr; - DWORD dwWaitResult; - guint can_frames, have_frames, n_frames, write_len = 0; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), -1); - g_return_val_if_fail (client->audio_render_client != NULL, -1); - g_return_val_if_fail (client->mix_format != NULL, -1); - - if (!client->running) { - GST_WARNING_OBJECT (client, "client is not running now"); - return -1; - } - - render_client = client->audio_render_client; - mix_format = client->mix_format; - - /* We have N frames to be written out */ - have_frames = length / (mix_format->nBlockAlign); - - /* In shared mode we can write parts of the buffer, so only wait - * in case we can't write anything */ - can_frames = gst_wasapi2_client_get_can_frames (client); - if (can_frames < 0) { - GST_ERROR_OBJECT (client, "Error getting frames to write to"); - return -1; - } - - if (can_frames == 0) { - HANDLE event_handle[2]; - - event_handle[0] = client->event_handle; - event_handle[1] = client->cancellable; - - dwWaitResult = WaitForMultipleObjects (2, event_handle, FALSE, INFINITE); - if (dwWaitResult != WAIT_OBJECT_0 && dwWaitResult != WAIT_OBJECT_0 + 1) { - GST_ERROR_OBJECT (client, "Error waiting for event handle: %x", - (guint) dwWaitResult); - return -1; - } - - if (!client->running) { - GST_DEBUG_OBJECT (client, "Cancelled"); - return -1; - } - - can_frames = gst_wasapi2_client_get_can_frames (client); - if (can_frames < 0) { - GST_ERROR_OBJECT (client, 
"Error getting frames to write to"); - return -1; - } - } - - /* We will write out these many frames, and this much length */ - n_frames = MIN (can_frames, have_frames); - write_len = n_frames * mix_format->nBlockAlign; - - GST_LOG_OBJECT (client, "total: %d, have_frames: %d (%d bytes), " - "can_frames: %d, will write: %d (%d bytes)", client->buffer_frame_count, - have_frames, length, can_frames, n_frames, write_len); - - hr = render_client->GetBuffer (n_frames, &dst); - if (!gst_wasapi2_result (hr)) { - GST_ERROR_OBJECT (client, "Couldn't get buffer from client"); - return -1; - } - - memcpy (dst, data, write_len); - hr = render_client->ReleaseBuffer (n_frames, 0); - - return write_len; -} - -guint -gst_wasapi2_client_delay (GstWasapi2Client * client) -{ - HRESULT hr; - UINT32 delay; - IAudioClient3 *audio_client; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), 0); - - audio_client = client->audio_client; - - if (!audio_client) { - GST_WARNING_OBJECT (client, "IAudioClient3 wasn't configured"); - return 0; - } - - hr = audio_client->GetCurrentPadding (&delay); - if (!gst_wasapi2_result (hr)) - return 0; - - return delay; -} - -gboolean -gst_wasapi2_client_set_mute (GstWasapi2Client * client, gboolean mute) -{ - HRESULT hr; - ISimpleAudioVolume *audio_volume; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - - audio_volume = client->audio_volume; - - if (!audio_volume) { - GST_WARNING_OBJECT (client, "ISimpleAudioVolume object wasn't configured"); - return FALSE; - } - - hr = audio_volume->SetMute (mute, nullptr); - GST_DEBUG_OBJECT (client, "Set mute %s, hr: 0x%x", - mute ? 
"enabled" : "disabled", (gint) hr); - - return gst_wasapi2_result (hr); -} - -gboolean -gst_wasapi2_client_get_mute (GstWasapi2Client * client, gboolean * mute) -{ - HRESULT hr; - ISimpleAudioVolume *audio_volume; - BOOL current_mute = FALSE; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - g_return_val_if_fail (mute != NULL, FALSE); - - audio_volume = client->audio_volume; - - if (!audio_volume) { - GST_WARNING_OBJECT (client, "ISimpleAudioVolume object wasn't configured"); - return FALSE; - } - - hr = audio_volume->GetMute (¤t_mute); - if (!gst_wasapi2_result (hr)) - return FALSE; - - *mute = (gboolean) current_mute; - - return TRUE; -} - -gboolean -gst_wasapi2_client_set_volume (GstWasapi2Client * client, gfloat volume) -{ - HRESULT hr; - ISimpleAudioVolume *audio_volume; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - g_return_val_if_fail (volume >= 0 && volume <= 1.0, FALSE); - - audio_volume = client->audio_volume; - - if (!audio_volume) { - GST_WARNING_OBJECT (client, "ISimpleAudioVolume object wasn't configured"); - return FALSE; - } - - hr = audio_volume->SetMasterVolume (volume, nullptr); - GST_DEBUG_OBJECT (client, "Set volume %.2f hr: 0x%x", volume, (gint) hr); - - return gst_wasapi2_result (hr); -} - -gboolean -gst_wasapi2_client_get_volume (GstWasapi2Client * client, gfloat * volume) -{ - HRESULT hr; - ISimpleAudioVolume *audio_volume; - float current_volume = FALSE; - - g_return_val_if_fail (GST_IS_WASAPI2_CLIENT (client), FALSE); - g_return_val_if_fail (volume != NULL, FALSE); - - audio_volume = client->audio_volume; - - if (!audio_volume) { - GST_WARNING_OBJECT (client, "ISimpleAudioVolume object wasn't configured"); - return FALSE; - } - - hr = audio_volume->GetMasterVolume (¤t_volume); - if (!gst_wasapi2_result (hr)) - return FALSE; - - *volume = current_volume; - - return TRUE; -} - gboolean gst_wasapi2_client_ensure_activation (GstWasapi2Client * client) { @@ -1830,21 +924,23 @@ static HRESULT 
find_dispatcher (ICoreDispatcher ** dispatcher) { + /* *INDENT-OFF* */ HStringReference hstr_core_app = HStringReference(RuntimeClass_Windows_ApplicationModel_Core_CoreApplication); + ComPtr<ICoreApplication> core_app; + ComPtr<ICoreApplicationView> core_app_view; + ComPtr<ICoreWindow> core_window; + /* *INDENT-ON* */ HRESULT hr; - ComPtr<ICoreApplication> core_app; - hr = GetActivationFactory (hstr_core_app.Get(), &core_app); + hr = GetActivationFactory (hstr_core_app.Get (), &core_app); if (FAILED (hr)) return hr; - ComPtr<ICoreApplicationView> core_app_view; hr = core_app->GetCurrentView (&core_app_view); if (FAILED (hr)) return hr; - ComPtr<ICoreWindow> core_window; hr = core_app_view->get_CoreWindow (&core_window); if (FAILED (hr)) return hr; @@ -1854,11 +950,12 @@ GstWasapi2Client * gst_wasapi2_client_new (GstWasapi2ClientDeviceClass device_class, - gboolean low_latency, gint device_index, const gchar * device_id, - gpointer dispatcher) + gint device_index, const gchar * device_id, gpointer dispatcher) { GstWasapi2Client *self; + /* *INDENT-OFF* */ ComPtr<ICoreDispatcher> core_dispatcher; + /* *INDENT-ON* */ /* Multiple COM init is allowed */ RoInitializeWrapper init_wrapper (RO_INIT_MULTITHREADED); @@ -1879,9 +976,8 @@ } self = (GstWasapi2Client *) g_object_new (GST_TYPE_WASAPI2_CLIENT, - "device-class", device_class, "low-latency", low_latency, - "device-index", device_index, "device", device_id, - "dispatcher", dispatcher, NULL); + "device-class", device_class, "device-index", device_index, + "device", device_id, "dispatcher", dispatcher, nullptr); /* Reset explicitly to ensure that it happens before * RoInitializeWrapper dtor is called */ @@ -1889,10 +985,18 @@ if (self->activate_state == GST_WASAPI2_CLIENT_ACTIVATE_FAILED) { gst_object_unref (self); - return NULL; + return nullptr; } gst_object_ref_sink (self); return self; } + +IAudioClient * +gst_wasapi2_client_get_handle (GstWasapi2Client * client) +{ + g_return_val_if_fail (GST_IS_WASAPI2_CLIENT 
(client), nullptr); + + return client->audio_client; +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2client.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2client.h
Changed
@@ -22,6 +22,7 @@ #include <gst/gst.h> #include <gst/audio/audio.h> +#include "gstwasapi2util.h" G_BEGIN_DECLS @@ -29,6 +30,7 @@ { GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE = 0, GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, + GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE, } GstWasapi2ClientDeviceClass; #define GST_TYPE_WASAPI2_CLIENT_DEVICE_CLASS (gst_wasapi2_client_device_class_get_type()) @@ -38,47 +40,16 @@ G_DECLARE_FINAL_TYPE (GstWasapi2Client, gst_wasapi2_client, GST, WASAPI2_CLIENT, GstObject); -GstCaps * gst_wasapi2_client_get_caps (GstWasapi2Client * client); - -gboolean gst_wasapi2_client_open (GstWasapi2Client * client, - GstAudioRingBufferSpec * spec, - GstAudioRingBuffer * buf); - -gboolean gst_wasapi2_client_start (GstWasapi2Client * client); - -gboolean gst_wasapi2_client_stop (GstWasapi2Client * client); - -gint gst_wasapi2_client_read (GstWasapi2Client * client, - gpointer data, - guint length); - -gint gst_wasapi2_client_write (GstWasapi2Client * client, - gpointer data, - guint length); - -guint gst_wasapi2_client_delay (GstWasapi2Client * client); - -gboolean gst_wasapi2_client_set_mute (GstWasapi2Client * client, - gboolean mute); - -gboolean gst_wasapi2_client_get_mute (GstWasapi2Client * client, - gboolean * mute); - -gboolean gst_wasapi2_client_set_volume (GstWasapi2Client * client, - gfloat volume); - -gboolean gst_wasapi2_client_get_volume (GstWasapi2Client * client, - gfloat * volume); - -gboolean gst_wasapi2_client_ensure_activation (GstWasapi2Client * client); - GstWasapi2Client * gst_wasapi2_client_new (GstWasapi2ClientDeviceClass device_class, - gboolean low_latency, gint device_index, const gchar * device_id, gpointer dispatcher); -G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstWasapi2Client, gst_object_unref) +gboolean gst_wasapi2_client_ensure_activation (GstWasapi2Client * client); + +IAudioClient * gst_wasapi2_client_get_handle (GstWasapi2Client * client); + +GstCaps * gst_wasapi2_client_get_caps (GstWasapi2Client * client); G_END_DECLS
View file
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2device.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2device.c
Changed
@@ -25,6 +25,7 @@ #include "gstwasapi2device.h" #include "gstwasapi2client.h" #include "gstwasapi2util.h" +#include <gst/winrt/gstwinrt.h> GST_DEBUG_CATEGORY_EXTERN (gst_wasapi2_debug); #define GST_CAT_DEFAULT gst_wasapi2_debug @@ -41,6 +42,7 @@ gchar *device_id; const gchar *factory_name; + GstWasapi2ClientDeviceClass device_class; }; G_DEFINE_TYPE (GstWasapi2Device, gst_wasapi2_device, GST_TYPE_DEVICE); @@ -96,6 +98,9 @@ g_object_set (elem, "device", self->device_id, NULL); + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE) + g_object_set (elem, "loopback", TRUE, NULL); + return elem; } @@ -131,33 +136,128 @@ } } -struct _GstWasapi2DeviceProvider +typedef struct _GstWasapi2DeviceProvider { GstDeviceProvider parent; -}; -G_DEFINE_TYPE (GstWasapi2DeviceProvider, gst_wasapi2_device_provider, - GST_TYPE_DEVICE_PROVIDER); + GstWinRTDeviceWatcher *watcher; + + GMutex lock; + GCond cond; + + gboolean enum_completed; +} GstWasapi2DeviceProvider; + +typedef struct _GstWasapi2DeviceProviderClass +{ + GstDeviceProviderClass parent_class; + + GstWinRTDeviceClass winrt_device_class; +} GstWasapi2DeviceProviderClass; + +static GstDeviceProviderClass *parent_class = NULL; + +#define GST_WASAPI2_DEVICE_PROVIDER(object) \ + ((GstWasapi2DeviceProvider *) (object)) +#define GST_WASAPI2_DEVICE_PROVIDER_GET_CLASS(object) \ + (G_TYPE_INSTANCE_GET_CLASS ((object),G_TYPE_FROM_INSTANCE (object),GstWasapi2DeviceProviderClass)) + +static void gst_wasapi2_device_provider_dispose (GObject * object); +static void gst_wasapi2_device_provider_finalize (GObject * object); static GList *gst_wasapi2_device_provider_probe (GstDeviceProvider * provider); +static gboolean +gst_wasapi2_device_provider_start (GstDeviceProvider * provider); +static void gst_wasapi2_device_provider_stop (GstDeviceProvider * provider); static void -gst_wasapi2_device_provider_class_init (GstWasapi2DeviceProviderClass * klass) +gst_wasapi2_device_provider_device_added (GstWinRTDeviceWatcher * 
watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformation * info, + gpointer user_data); +static void +gst_wasapi2_device_provider_device_updated (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data); +static void gst_wasapi2_device_provider_device_removed (GstWinRTDeviceWatcher * + watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data); +static void +gst_wasapi2_device_provider_device_enum_completed (GstWinRTDeviceWatcher * + watcher, gpointer user_data); + +static void +gst_wasapi2_device_provider_class_init (GstWasapi2DeviceProviderClass * klass, + gpointer data) { + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstDeviceProviderClass *provider_class = GST_DEVICE_PROVIDER_CLASS (klass); + gobject_class->dispose = gst_wasapi2_device_provider_dispose; + gobject_class->finalize = gst_wasapi2_device_provider_finalize; + provider_class->probe = GST_DEBUG_FUNCPTR (gst_wasapi2_device_provider_probe); + provider_class->start = GST_DEBUG_FUNCPTR (gst_wasapi2_device_provider_start); + provider_class->stop = GST_DEBUG_FUNCPTR (gst_wasapi2_device_provider_stop); + + parent_class = (GstDeviceProviderClass *) g_type_class_peek_parent (klass); + + klass->winrt_device_class = (GstWinRTDeviceClass) GPOINTER_TO_INT (data); + if (klass->winrt_device_class == GST_WINRT_DEVICE_CLASS_AUDIO_CAPTURE) { + gst_device_provider_class_set_static_metadata (provider_class, + "WASAPI (Windows Audio Session API) Capture Device Provider", + "Source/Audio", "List WASAPI source devices", + "Nirbheek Chauhan <nirbheek@centricular.com>, " + "Seungha Yang <seungha@centricular.com>"); + } else { + gst_device_provider_class_set_static_metadata (provider_class, + "WASAPI (Windows Audio Session API) Render and Loopback Capture Device Provider", + "Source/Sink/Audio", "List WASAPI loop back source and sink devices", + "Nirbheek Chauhan 
<nirbheek@centricular.com>, " + "Seungha Yang <seungha@centricular.com>"); + } +} + +static void +gst_wasapi2_device_provider_init (GstWasapi2DeviceProvider * self) +{ + GstWasapi2DeviceProviderClass *klass = + GST_WASAPI2_DEVICE_PROVIDER_GET_CLASS (self); + GstWinRTDeviceWatcherCallbacks callbacks; + + g_mutex_init (&self->lock); + g_cond_init (&self->cond); + + callbacks.added = gst_wasapi2_device_provider_device_added; + callbacks.updated = gst_wasapi2_device_provider_device_updated; + callbacks.removed = gst_wasapi2_device_provider_device_removed; + callbacks.enumeration_completed = + gst_wasapi2_device_provider_device_enum_completed; + + self->watcher = + gst_winrt_device_watcher_new (klass->winrt_device_class, + &callbacks, self); +} + +static void +gst_wasapi2_device_provider_dispose (GObject * object) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (object); - gst_device_provider_class_set_static_metadata (provider_class, - "WASAPI (Windows Audio Session API) Device Provider", - "Source/Sink/Audio", "List WASAPI source and sink devices", - "Nirbheek Chauhan <nirbheek@centricular.com>, " - "Seungha Yang <seungha@centricular.com>"); + gst_clear_object (&self->watcher); + + G_OBJECT_CLASS (parent_class)->dispose (object); } static void -gst_wasapi2_device_provider_init (GstWasapi2DeviceProvider * provider) +gst_wasapi2_device_provider_finalize (GObject * object) { + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (object); + + g_mutex_clear (&self->lock); + g_cond_clear (&self->cond); + + G_OBJECT_CLASS (parent_class)->finalize (object); } static void @@ -183,7 +283,7 @@ gchar *device_id = NULL; gchar *device_name = NULL; - client = gst_wasapi2_client_new (client_class, FALSE, i, NULL, NULL); + client = gst_wasapi2_client_new (client_class, i, NULL, NULL); if (!client) return; @@ -208,15 +308,28 @@ } props = gst_structure_new ("wasapi2-proplist", - "device.api", G_TYPE_STRING, "wasapi", + "device.api", G_TYPE_STRING, "wasapi2", 
"device.id", G_TYPE_STRING, device_id, "device.default", G_TYPE_BOOLEAN, i == 0, - "wasapi.device.description", G_TYPE_STRING, device_name, NULL); + "wasapi2.device.description", G_TYPE_STRING, device_name, NULL); + switch (client_class) { + case GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE: + gst_structure_set (props, + "wasapi2.device.loopback", G_TYPE_BOOLEAN, FALSE, NULL); + break; + case GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE: + gst_structure_set (props, + "wasapi2.device.loopback", G_TYPE_BOOLEAN, TRUE, NULL); + break; + default: + break; + } device = g_object_new (GST_TYPE_WASAPI2_DEVICE, "device", device_id, "display-name", device_name, "caps", caps, "device-class", device_class, "properties", props, NULL); GST_WASAPI2_DEVICE (device)->factory_name = factory_name; + GST_WASAPI2_DEVICE (device)->device_class = client_class; *devices = g_list_append (*devices, device); @@ -236,12 +349,247 @@ gst_wasapi2_device_provider_probe (GstDeviceProvider * provider) { GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (provider); + GstWasapi2DeviceProviderClass *klass = + GST_WASAPI2_DEVICE_PROVIDER_GET_CLASS (self); GList *devices = NULL; - gst_wasapi2_device_provider_probe_internal (self, - GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE, &devices); - gst_wasapi2_device_provider_probe_internal (self, - GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, &devices); + if (klass->winrt_device_class == GST_WINRT_DEVICE_CLASS_AUDIO_CAPTURE) { + gst_wasapi2_device_provider_probe_internal (self, + GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE, &devices); + } else { + gst_wasapi2_device_provider_probe_internal (self, + GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE, &devices); + gst_wasapi2_device_provider_probe_internal (self, + GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, &devices); + } return devices; } + +static gboolean +gst_wasapi2_device_provider_start (GstDeviceProvider * provider) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (provider); + GList *devices = 
NULL; + GList *iter; + + if (!self->watcher) { + GST_ERROR_OBJECT (self, "DeviceWatcher object wasn't configured"); + return FALSE; + } + + self->enum_completed = FALSE; + + if (!gst_winrt_device_watcher_start (self->watcher)) + return FALSE; + + /* Wait for initial enumeration to be completed */ + g_mutex_lock (&self->lock); + while (!self->enum_completed) + g_cond_wait (&self->cond, &self->lock); + + devices = gst_wasapi2_device_provider_probe (provider); + if (devices) { + for (iter = devices; iter; iter = g_list_next (iter)) { + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + } + + g_list_free (devices); + } + g_mutex_unlock (&self->lock); + + return TRUE; +} + +static void +gst_wasapi2_device_provider_stop (GstDeviceProvider * provider) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (provider); + + if (self->watcher) + gst_winrt_device_watcher_stop (self->watcher); +} + +static gboolean +gst_wasapi2_device_is_in_list (GList * list, GstDevice * device) +{ + GList *iter; + GstStructure *s; + const gchar *device_id; + gboolean found = FALSE; + + s = gst_device_get_properties (device); + g_assert (s); + + device_id = gst_structure_get_string (s, "device.id"); + g_assert (device_id); + + for (iter = list; iter; iter = g_list_next (iter)) { + GstStructure *other_s; + const gchar *other_id; + + other_s = gst_device_get_properties (GST_DEVICE (iter->data)); + g_assert (other_s); + + other_id = gst_structure_get_string (other_s, "device.id"); + g_assert (other_id); + + if (g_ascii_strcasecmp (device_id, other_id) == 0) { + found = TRUE; + } + + gst_structure_free (other_s); + if (found) + break; + } + + gst_structure_free (s); + + return found; +} + +static void +gst_wasapi2_device_provider_update_devices (GstWasapi2DeviceProvider * self) +{ + GstDeviceProvider *provider = GST_DEVICE_PROVIDER_CAST (self); + GList *prev_devices = NULL; + GList *new_devices = NULL; + GList *to_add = NULL; + GList *to_remove = NULL; + GList *iter; 
+ + GST_OBJECT_LOCK (self); + prev_devices = g_list_copy_deep (provider->devices, + (GCopyFunc) gst_object_ref, NULL); + GST_OBJECT_UNLOCK (self); + + new_devices = gst_wasapi2_device_provider_probe (provider); + + /* Ownership of GstDevice for gst_device_provider_device_add() + * and gst_device_provider_device_remove() is a bit complicated. + * Remove floating reference here for things to be clear */ + for (iter = new_devices; iter; iter = g_list_next (iter)) + gst_object_ref_sink (iter->data); + + /* Check newly added devices */ + for (iter = new_devices; iter; iter = g_list_next (iter)) { + if (!gst_wasapi2_device_is_in_list (prev_devices, GST_DEVICE (iter->data))) { + to_add = g_list_prepend (to_add, gst_object_ref (iter->data)); + } + } + + /* Check removed device */ + for (iter = prev_devices; iter; iter = g_list_next (iter)) { + if (!gst_wasapi2_device_is_in_list (new_devices, GST_DEVICE (iter->data))) { + to_remove = g_list_prepend (to_remove, gst_object_ref (iter->data)); + } + } + + for (iter = to_remove; iter; iter = g_list_next (iter)) + gst_device_provider_device_remove (provider, GST_DEVICE (iter->data)); + + for (iter = to_add; iter; iter = g_list_next (iter)) + gst_device_provider_device_add (provider, GST_DEVICE (iter->data)); + + if (prev_devices) + g_list_free_full (prev_devices, (GDestroyNotify) gst_object_unref); + + if (to_add) + g_list_free_full (to_add, (GDestroyNotify) gst_object_unref); + + if (to_remove) + g_list_free_full (to_remove, (GDestroyNotify) gst_object_unref); +} + +static void +gst_wasapi2_device_provider_device_added (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformation * info, + gpointer user_data) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (user_data); + + if (self->enum_completed) + gst_wasapi2_device_provider_update_devices (self); +} + +static void +gst_wasapi2_device_provider_device_removed (GstWinRTDeviceWatcher * watcher, + 
__x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * + info_update, gpointer user_data) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (user_data); + + if (self->enum_completed) + gst_wasapi2_device_provider_update_devices (self); +} + +static void +gst_wasapi2_device_provider_device_updated (GstWinRTDeviceWatcher * watcher, + __x_ABI_CWindows_CDevices_CEnumeration_CIDeviceInformationUpdate * info, + gpointer user_data) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (user_data); + + gst_wasapi2_device_provider_update_devices (self); +} + +static void +gst_wasapi2_device_provider_device_enum_completed (GstWinRTDeviceWatcher * + watcher, gpointer user_data) +{ + GstWasapi2DeviceProvider *self = GST_WASAPI2_DEVICE_PROVIDER (user_data); + + g_mutex_lock (&self->lock); + GST_DEBUG_OBJECT (self, "Enumeration completed"); + self->enum_completed = TRUE; + g_cond_signal (&self->cond); + g_mutex_unlock (&self->lock); +} + +static void +gst_wasapi2_device_provider_register_internal (GstPlugin * plugin, + guint rank, GstWinRTDeviceClass winrt_device_class) +{ + GType type; + const gchar *type_name = NULL; + const gchar *feature_name = NULL; + GTypeInfo type_info = { + sizeof (GstWasapi2DeviceProviderClass), + NULL, + NULL, + (GClassInitFunc) gst_wasapi2_device_provider_class_init, + NULL, + NULL, + sizeof (GstWasapi2DeviceProvider), + 0, + (GInstanceInitFunc) gst_wasapi2_device_provider_init, + }; + + type_info.class_data = GINT_TO_POINTER (winrt_device_class); + + if (winrt_device_class == GST_WINRT_DEVICE_CLASS_AUDIO_CAPTURE) { + type_name = "GstWasapi2CaputreDeviceProvider"; + feature_name = "wasapi2capturedeviceprovider"; + } else if (winrt_device_class == GST_WINRT_DEVICE_CLASS_AUDIO_RENDER) { + type_name = "GstWasapi2RenderDeviceProvider"; + feature_name = "wasapi2renderdeviceprovider"; + } else { + g_assert_not_reached (); + return; + } + + type = g_type_register_static (GST_TYPE_DEVICE_PROVIDER, + type_name, 
&type_info, (GTypeFlags) 0); + + if (!gst_device_provider_register (plugin, feature_name, rank, type)) + GST_WARNING ("Failed to register provider '%s'", type_name); +} + +void +gst_wasapi2_device_provider_register (GstPlugin * plugin, guint rank) +{ + gst_wasapi2_device_provider_register_internal (plugin, rank, + GST_WINRT_DEVICE_CLASS_AUDIO_CAPTURE); + gst_wasapi2_device_provider_register_internal (plugin, rank, + GST_WINRT_DEVICE_CLASS_AUDIO_RENDER); +}
View file
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2device.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2device.h
Changed
@@ -25,12 +25,10 @@ G_BEGIN_DECLS #define GST_TYPE_WASAPI2_DEVICE (gst_wasapi2_device_get_type()) -#define GST_TYPE_WASAPI2_DEVICE_PROVIDER (gst_wasapi2_device_provider_get_type()) - G_DECLARE_FINAL_TYPE (GstWasapi2Device, gst_wasapi2_device, GST, WASAPI2_DEVICE, GstDevice); -G_DECLARE_FINAL_TYPE (GstWasapi2DeviceProvider, gst_wasapi2_device_provider, - GST, WASAPI2_DEVICE_PROVIDER, GstDeviceProvider); + +void gst_wasapi2_device_provider_register (GstPlugin * plugin, guint rank); G_END_DECLS
View file
gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2ringbuffer.cpp
Added
@@ -0,0 +1,1448 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include "gstwasapi2ringbuffer.h" +#include <string.h> +#include <mfapi.h> +#include <wrl.h> + +GST_DEBUG_CATEGORY_STATIC (gst_wasapi2_ring_buffer_debug); +#define GST_CAT_DEFAULT gst_wasapi2_ring_buffer_debug + +static HRESULT gst_wasapi2_ring_buffer_io_callback (GstWasapi2RingBuffer * buf); +static HRESULT +gst_wasapi2_ring_buffer_loopback_callback (GstWasapi2RingBuffer * buf); + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; + +class GstWasapiAsyncCallback : public IMFAsyncCallback +{ +public: + GstWasapiAsyncCallback(GstWasapi2RingBuffer *listener, + DWORD queue_id, + gboolean loopback) + : ref_count_(1) + , queue_id_(queue_id) + , loopback_(loopback) + { + g_weak_ref_init (&listener_, listener); + } + + virtual ~GstWasapiAsyncCallback () + { + g_weak_ref_set (&listener_, nullptr); + } + + /* IUnknown */ + STDMETHODIMP_ (ULONG) + AddRef (void) + { + GST_TRACE ("%p, %d", this, ref_count_); + return InterlockedIncrement (&ref_count_); + } + STDMETHODIMP_ (ULONG) + Release (void) + { + ULONG ref_count; + + GST_TRACE ("%p, %d", this, ref_count_); + ref_count = InterlockedDecrement (&ref_count_); + + if (ref_count 
== 0) { + GST_TRACE ("Delete instance %p", this); + delete this; + } + + return ref_count; + } + + STDMETHODIMP + QueryInterface (REFIID riid, void ** object) + { + if (!object) + return E_POINTER; + + if (riid == IID_IUnknown) { + GST_TRACE ("query IUnknown interface %p", this); + *object = static_cast<IUnknown *> (static_cast<GstWasapiAsyncCallback *> (this)); + } else if (riid == __uuidof (IMFAsyncCallback)) { + GST_TRACE ("query IUnknown interface %p", this); + *object = static_cast<IUnknown *> (static_cast<GstWasapiAsyncCallback *> (this)); + } else { + *object = nullptr; + return E_NOINTERFACE; + } + + AddRef (); + + return S_OK; + } + + /* IMFAsyncCallback */ + STDMETHODIMP + GetParameters(DWORD * pdwFlags, DWORD * pdwQueue) + { + *pdwFlags = 0; + *pdwQueue = queue_id_; + + return S_OK; + } + + STDMETHODIMP + Invoke(IMFAsyncResult * pAsyncResult) + { + GstWasapi2RingBuffer *ringbuffer; + HRESULT hr; + + ringbuffer = (GstWasapi2RingBuffer *) g_weak_ref_get (&listener_); + if (!ringbuffer) { + GST_WARNING ("Listener was removed"); + return S_OK; + } + + if (loopback_) + hr = gst_wasapi2_ring_buffer_loopback_callback (ringbuffer); + else + hr = gst_wasapi2_ring_buffer_io_callback (ringbuffer); + gst_object_unref (ringbuffer); + + return hr; + } + +private: + ULONG ref_count_; + DWORD queue_id_; + GWeakRef listener_; + gboolean loopback_; +}; +/* *INDENT-ON* */ + +struct _GstWasapi2RingBuffer +{ + GstAudioRingBuffer parent; + + GstWasapi2ClientDeviceClass device_class; + gchar *device_id; + gboolean low_latency; + gboolean mute; + gdouble volume; + gpointer dispatcher; + gboolean can_auto_routing; + + GstWasapi2Client *client; + GstWasapi2Client *loopback_client; + IAudioCaptureClient *capture_client; + IAudioRenderClient *render_client; + ISimpleAudioVolume *volume_object; + + GstWasapiAsyncCallback *callback_object; + IMFAsyncResult *callback_result; + MFWORKITEM_KEY callback_key; + HANDLE event_handle; + + GstWasapiAsyncCallback *loopback_callback_object; + 
IMFAsyncResult *loopback_callback_result; + MFWORKITEM_KEY loopback_callback_key; + HANDLE loopback_event_handle; + + guint64 expected_position; + gboolean is_first; + gboolean running; + UINT32 buffer_size; + UINT32 loopback_buffer_size; + + gint segoffset; + guint64 write_frame_offset; + + GMutex volume_lock; + gboolean mute_changed; + gboolean volume_changed; + + GstCaps *supported_caps; +}; + +static void gst_wasapi2_ring_buffer_constructed (GObject * object); +static void gst_wasapi2_ring_buffer_dispose (GObject * object); +static void gst_wasapi2_ring_buffer_finalize (GObject * object); + +static gboolean gst_wasapi2_ring_buffer_open_device (GstAudioRingBuffer * buf); +static gboolean gst_wasapi2_ring_buffer_close_device (GstAudioRingBuffer * buf); +static gboolean gst_wasapi2_ring_buffer_acquire (GstAudioRingBuffer * buf, + GstAudioRingBufferSpec * spec); +static gboolean gst_wasapi2_ring_buffer_release (GstAudioRingBuffer * buf); +static gboolean gst_wasapi2_ring_buffer_start (GstAudioRingBuffer * buf); +static gboolean gst_wasapi2_ring_buffer_resume (GstAudioRingBuffer * buf); +static gboolean gst_wasapi2_ring_buffer_pause (GstAudioRingBuffer * buf); +static gboolean gst_wasapi2_ring_buffer_stop (GstAudioRingBuffer * buf); +static guint gst_wasapi2_ring_buffer_delay (GstAudioRingBuffer * buf); + +#define gst_wasapi2_ring_buffer_parent_class parent_class +G_DEFINE_TYPE (GstWasapi2RingBuffer, gst_wasapi2_ring_buffer, + GST_TYPE_AUDIO_RING_BUFFER); + +static void +gst_wasapi2_ring_buffer_class_init (GstWasapi2RingBufferClass * klass) +{ + GObjectClass *gobject_class = G_OBJECT_CLASS (klass); + GstAudioRingBufferClass *ring_buffer_class = + GST_AUDIO_RING_BUFFER_CLASS (klass); + + gobject_class->constructed = gst_wasapi2_ring_buffer_constructed; + gobject_class->dispose = gst_wasapi2_ring_buffer_dispose; + gobject_class->finalize = gst_wasapi2_ring_buffer_finalize; + + ring_buffer_class->open_device = + GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_open_device); 
+ ring_buffer_class->close_device = + GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_close_device); + ring_buffer_class->acquire = + GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_acquire); + ring_buffer_class->release = + GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_release); + ring_buffer_class->start = GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_start); + ring_buffer_class->resume = + GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_resume); + ring_buffer_class->pause = GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_pause); + ring_buffer_class->stop = GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_stop); + ring_buffer_class->delay = GST_DEBUG_FUNCPTR (gst_wasapi2_ring_buffer_delay); + + GST_DEBUG_CATEGORY_INIT (gst_wasapi2_ring_buffer_debug, + "wasapi2ringbuffer", 0, "wasapi2ringbuffer"); +} + +static void +gst_wasapi2_ring_buffer_init (GstWasapi2RingBuffer * self) +{ + self->volume = 1.0f; + self->mute = FALSE; + + self->event_handle = CreateEvent (nullptr, FALSE, FALSE, nullptr); + self->loopback_event_handle = CreateEvent (nullptr, FALSE, FALSE, nullptr); + g_mutex_init (&self->volume_lock); +} + +static void +gst_wasapi2_ring_buffer_constructed (GObject * object) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (object); + HRESULT hr; + DWORD task_id = 0; + DWORD queue_id = 0; + + hr = MFLockSharedWorkQueue (L"Pro Audio", 0, &task_id, &queue_id); + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (self, "Failed to get work queue id"); + goto out; + } + + self->callback_object = new GstWasapiAsyncCallback (self, queue_id, FALSE); + hr = MFCreateAsyncResult (nullptr, self->callback_object, nullptr, + &self->callback_result); + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (self, "Failed to create IAsyncResult"); + GST_WASAPI2_CLEAR_COM (self->callback_object); + } + + /* Create another callback object for loopback silence feed */ + self->loopback_callback_object = + new GstWasapiAsyncCallback (self, queue_id, TRUE); + hr = MFCreateAsyncResult (nullptr, 
self->loopback_callback_object, nullptr, + &self->loopback_callback_result); + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (self, "Failed to create IAsyncResult"); + GST_WASAPI2_CLEAR_COM (self->callback_object); + GST_WASAPI2_CLEAR_COM (self->callback_result); + GST_WASAPI2_CLEAR_COM (self->loopback_callback_object); + } + +out: + G_OBJECT_CLASS (parent_class)->constructed (object); +} + +static void +gst_wasapi2_ring_buffer_dispose (GObject * object) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (object); + + GST_WASAPI2_CLEAR_COM (self->render_client); + GST_WASAPI2_CLEAR_COM (self->capture_client); + GST_WASAPI2_CLEAR_COM (self->volume_object); + GST_WASAPI2_CLEAR_COM (self->callback_result); + GST_WASAPI2_CLEAR_COM (self->callback_object); + GST_WASAPI2_CLEAR_COM (self->loopback_callback_result); + GST_WASAPI2_CLEAR_COM (self->loopback_callback_object); + + gst_clear_object (&self->client); + gst_clear_object (&self->loopback_client); + gst_clear_caps (&self->supported_caps); + + G_OBJECT_CLASS (parent_class)->dispose (object); +} + +static void +gst_wasapi2_ring_buffer_finalize (GObject * object) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (object); + + g_free (self->device_id); + CloseHandle (self->event_handle); + CloseHandle (self->loopback_event_handle); + g_mutex_clear (&self->volume_lock); + + G_OBJECT_CLASS (parent_class)->finalize (object); +} + +static void +gst_wasapi2_ring_buffer_post_open_error (GstWasapi2RingBuffer * self) +{ + GstElement *parent = (GstElement *) GST_OBJECT_PARENT (self); + + if (!parent) { + GST_WARNING_OBJECT (self, "Cannot find parent"); + return; + } + + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER) { + GST_ELEMENT_ERROR (parent, RESOURCE, OPEN_WRITE, + (nullptr), ("Failed to open device")); + } else { + GST_ELEMENT_ERROR (parent, RESOURCE, OPEN_READ, + (nullptr), ("Failed to open device")); + } +} + +static void +gst_wasapi2_ring_buffer_post_scheduling_error 
(GstWasapi2RingBuffer * self) +{ + GstElement *parent = (GstElement *) GST_OBJECT_PARENT (self); + + if (!parent) { + GST_WARNING_OBJECT (self, "Cannot find parent"); + return; + } + + GST_ELEMENT_ERROR (parent, RESOURCE, FAILED, + (nullptr), ("Failed to schedule next I/O")); +} + +static void +gst_wasapi2_ring_buffer_post_io_error (GstWasapi2RingBuffer * self, HRESULT hr) +{ + GstElement *parent = (GstElement *) GST_OBJECT_PARENT (self); + gchar *error_msg; + + if (!parent) { + GST_WARNING_OBJECT (self, "Cannot find parent"); + return; + } + + error_msg = gst_wasapi2_util_get_error_message (hr); + + GST_ERROR_OBJECT (self, "Posting I/O error %s (hr: 0x%x)", error_msg, hr); + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER) { + GST_ELEMENT_ERROR (parent, RESOURCE, WRITE, + ("Failed to write to device"), ("%s, hr: 0x%x", error_msg, hr)); + } else { + GST_ELEMENT_ERROR (parent, RESOURCE, READ, + ("Failed to read from device"), ("%s hr: 0x%x", error_msg, hr)); + } + + g_free (error_msg); +} + +static gboolean +gst_wasapi2_ring_buffer_open_device (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (self, "Open"); + + if (self->client) { + GST_DEBUG_OBJECT (self, "Already opened"); + return TRUE; + } + + self->client = gst_wasapi2_client_new (self->device_class, + -1, self->device_id, self->dispatcher); + if (!self->client) { + gst_wasapi2_ring_buffer_post_open_error (self); + return FALSE; + } + + g_object_get (self->client, "auto-routing", &self->can_auto_routing, nullptr); + + /* Open another render client to feed silence */ + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE) { + self->loopback_client = + gst_wasapi2_client_new (GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, + -1, self->device_id, self->dispatcher); + + if (!self->loopback_client) { + gst_wasapi2_ring_buffer_post_open_error (self); + gst_clear_object (&self->client); + + return FALSE; + } + } + + return 
TRUE; +} + +static gboolean +gst_wasapi2_ring_buffer_close_device_internal (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (self, "Close device"); + + if (self->running) + gst_wasapi2_ring_buffer_stop (buf); + + GST_WASAPI2_CLEAR_COM (self->capture_client); + GST_WASAPI2_CLEAR_COM (self->render_client); + + g_mutex_lock (&self->volume_lock); + if (self->volume_object) + self->volume_object->SetMute (FALSE, nullptr); + GST_WASAPI2_CLEAR_COM (self->volume_object); + g_mutex_unlock (&self->volume_lock); + + gst_clear_object (&self->client); + gst_clear_object (&self->loopback_client); + + return TRUE; +} + +static gboolean +gst_wasapi2_ring_buffer_close_device (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (self, "Close"); + + gst_wasapi2_ring_buffer_close_device_internal (buf); + + gst_clear_caps (&self->supported_caps); + + return TRUE; +} + +static HRESULT +gst_wasapi2_ring_buffer_read (GstWasapi2RingBuffer * self) +{ + GstAudioRingBuffer *ringbuffer = GST_AUDIO_RING_BUFFER_CAST (self); + BYTE *data = nullptr; + UINT32 to_read = 0; + guint32 to_read_bytes; + DWORD flags = 0; + HRESULT hr; + guint64 position; + GstAudioInfo *info = &ringbuffer->spec.info; + IAudioCaptureClient *capture_client = self->capture_client; + guint gap_size = 0; + guint offset = 0; + gint segment; + guint8 *readptr; + gint len; + + if (!capture_client) { + GST_ERROR_OBJECT (self, "IAudioCaptureClient is not available"); + return E_FAIL; + } + + hr = capture_client->GetBuffer (&data, &to_read, &flags, &position, nullptr); + if (hr == AUDCLNT_S_BUFFER_EMPTY || to_read == 0) { + GST_LOG_OBJECT (self, "Empty buffer"); + to_read = 0; + goto out; + } + + to_read_bytes = to_read * GST_AUDIO_INFO_BPF (info); + + GST_LOG_OBJECT (self, "Reading %d frames offset at %" G_GUINT64_FORMAT + ", expected position %" G_GUINT64_FORMAT, to_read, position, + 
self->expected_position); + + if (self->is_first) { + self->expected_position = position + to_read; + self->is_first = FALSE; + } else { + if (position > self->expected_position) { + guint gap_frames; + + gap_frames = (guint) (position - self->expected_position); + GST_WARNING_OBJECT (self, "Found %u frames gap", gap_frames); + gap_size = gap_frames * GST_AUDIO_INFO_BPF (info); + } + + self->expected_position = position + to_read; + } + + /* Fill gap data if any */ + while (gap_size > 0) { + if (!gst_audio_ring_buffer_prepare_read (ringbuffer, + &segment, &readptr, &len)) { + GST_INFO_OBJECT (self, "No segment available"); + goto out; + } + + g_assert (self->segoffset >= 0); + + len -= self->segoffset; + if (len > gap_size) + len = gap_size; + + gst_audio_format_info_fill_silence (ringbuffer->spec.info.finfo, + readptr + self->segoffset, len); + + self->segoffset += len; + gap_size -= len; + + if (self->segoffset == ringbuffer->spec.segsize) { + gst_audio_ring_buffer_advance (ringbuffer, 1); + self->segoffset = 0; + } + } + + while (to_read_bytes) { + if (!gst_audio_ring_buffer_prepare_read (ringbuffer, + &segment, &readptr, &len)) { + GST_INFO_OBJECT (self, "No segment available"); + goto out; + } + + len -= self->segoffset; + if (len > to_read_bytes) + len = to_read_bytes; + + if ((flags & AUDCLNT_BUFFERFLAGS_SILENT) == AUDCLNT_BUFFERFLAGS_SILENT) { + gst_audio_format_info_fill_silence (ringbuffer->spec.info.finfo, + readptr + self->segoffset, len); + } else { + memcpy (readptr + self->segoffset, data + offset, len); + } + + self->segoffset += len; + offset += len; + to_read_bytes -= len; + + if (self->segoffset == ringbuffer->spec.segsize) { + gst_audio_ring_buffer_advance (ringbuffer, 1); + self->segoffset = 0; + } + } + +out: + hr = capture_client->ReleaseBuffer (to_read); + /* For debugging */ + gst_wasapi2_result (hr); + + return hr; +} + +static HRESULT +gst_wasapi2_ring_buffer_write (GstWasapi2RingBuffer * self, gboolean preroll) +{ + GstAudioRingBuffer 
*ringbuffer = GST_AUDIO_RING_BUFFER_CAST (self); + HRESULT hr; + IAudioClient *client_handle; + IAudioRenderClient *render_client; + guint32 padding_frames = 0; + guint32 can_write; + guint32 can_write_bytes; + gint segment; + guint8 *readptr; + gint len; + BYTE *data = nullptr; + + client_handle = gst_wasapi2_client_get_handle (self->client); + if (!client_handle) { + GST_ERROR_OBJECT (self, "IAudioClient is not available"); + return E_FAIL; + } + + render_client = self->render_client; + if (!render_client) { + GST_ERROR_OBJECT (self, "IAudioRenderClient is not available"); + return E_FAIL; + } + + hr = client_handle->GetCurrentPadding (&padding_frames); + if (!gst_wasapi2_result (hr)) + return hr; + + if (padding_frames >= self->buffer_size) { + GST_INFO_OBJECT (self, + "Padding size %d is larger than or equal to buffer size %d", + padding_frames, self->buffer_size); + return S_OK; + } + + can_write = self->buffer_size - padding_frames; + can_write_bytes = can_write * GST_AUDIO_INFO_BPF (&ringbuffer->spec.info); + if (preroll) { + GST_INFO_OBJECT (self, "Pre-fill %d frames with silence", can_write); + + hr = render_client->GetBuffer (can_write, &data); + if (!gst_wasapi2_result (hr)) + return hr; + + hr = render_client->ReleaseBuffer (can_write, AUDCLNT_BUFFERFLAGS_SILENT); + return gst_wasapi2_result (hr); + } + + GST_LOG_OBJECT (self, "Writing %d frames offset at %" G_GUINT64_FORMAT, + can_write, self->write_frame_offset); + self->write_frame_offset += can_write; + + while (can_write_bytes > 0) { + if (!gst_audio_ring_buffer_prepare_read (ringbuffer, + &segment, &readptr, &len)) { + GST_INFO_OBJECT (self, "No segment available, fill silence"); + + /* This would be case where in the middle of PAUSED state change. 
+ * Just fill silent buffer to avoid immediate I/O callback after + * we return here */ + hr = render_client->GetBuffer (can_write, &data); + if (!gst_wasapi2_result (hr)) + return hr; + + hr = render_client->ReleaseBuffer (can_write, AUDCLNT_BUFFERFLAGS_SILENT); + /* for debugging */ + gst_wasapi2_result (hr); + return hr; + } + + len -= self->segoffset; + + if (len > can_write_bytes) + len = can_write_bytes; + + can_write = len / GST_AUDIO_INFO_BPF (&ringbuffer->spec.info); + if (can_write == 0) + break; + + hr = render_client->GetBuffer (can_write, &data); + if (!gst_wasapi2_result (hr)) + return hr; + + memcpy (data, readptr + self->segoffset, len); + hr = render_client->ReleaseBuffer (can_write, 0); + + self->segoffset += len; + can_write_bytes -= len; + + if (self->segoffset == ringbuffer->spec.segsize) { + gst_audio_ring_buffer_clear (ringbuffer, segment); + gst_audio_ring_buffer_advance (ringbuffer, 1); + self->segoffset = 0; + } + + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (self, "Failed to release buffer"); + break; + } + } + + return S_OK; +} + +static HRESULT +gst_wasapi2_ring_buffer_io_callback (GstWasapi2RingBuffer * self) +{ + HRESULT hr = E_FAIL; + + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (self), E_FAIL); + + if (!self->running) { + GST_INFO_OBJECT (self, "We are not running now"); + return S_OK; + } + + switch (self->device_class) { + case GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE: + case GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE: + hr = gst_wasapi2_ring_buffer_read (self); + break; + case GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER: + hr = gst_wasapi2_ring_buffer_write (self, FALSE); + break; + default: + g_assert_not_reached (); + break; + } + + /* We can ignore errors for device unplugged event if client can support + * automatic stream routing, but except for loopback capture. 
+ * loopback capture client doesn't seem to be able to recover status from this + * situation */ + if (self->can_auto_routing && + self->device_class != GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE && + (hr == AUDCLNT_E_ENDPOINT_CREATE_FAILED + || hr == AUDCLNT_E_DEVICE_INVALIDATED)) { + GST_WARNING_OBJECT (self, + "Device was unplugged but client can support automatic routing"); + hr = S_OK; + } + + if (self->running) { + if (gst_wasapi2_result (hr)) { + hr = MFPutWaitingWorkItem (self->event_handle, 0, self->callback_result, + &self->callback_key); + + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to put item"); + gst_wasapi2_ring_buffer_post_scheduling_error (self); + + return hr; + } + } + } else { + GST_INFO_OBJECT (self, "We are not running now"); + return S_OK; + } + + if (FAILED (hr)) + gst_wasapi2_ring_buffer_post_io_error (self, hr); + + return hr; +} + +static HRESULT +gst_wasapi2_ring_buffer_fill_loopback_silence (GstWasapi2RingBuffer * self) +{ + HRESULT hr; + IAudioClient *client_handle; + IAudioRenderClient *render_client; + guint32 padding_frames = 0; + guint32 can_write; + BYTE *data = nullptr; + + client_handle = gst_wasapi2_client_get_handle (self->loopback_client); + if (!client_handle) { + GST_ERROR_OBJECT (self, "IAudioClient is not available"); + return E_FAIL; + } + + render_client = self->render_client; + if (!render_client) { + GST_ERROR_OBJECT (self, "IAudioRenderClient is not available"); + return E_FAIL; + } + + hr = client_handle->GetCurrentPadding (&padding_frames); + if (!gst_wasapi2_result (hr)) + return hr; + + if (padding_frames >= self->buffer_size) { + GST_INFO_OBJECT (self, + "Padding size %d is larger than or equal to buffer size %d", + padding_frames, self->buffer_size); + return S_OK; + } + + can_write = self->buffer_size - padding_frames; + + GST_TRACE_OBJECT (self, + "Writing %d silent frames offset at %" G_GUINT64_FORMAT, can_write); + + hr = render_client->GetBuffer (can_write, &data); + if 
(!gst_wasapi2_result (hr)) + return hr; + + hr = render_client->ReleaseBuffer (can_write, AUDCLNT_BUFFERFLAGS_SILENT); + return gst_wasapi2_result (hr); +} + +static HRESULT +gst_wasapi2_ring_buffer_loopback_callback (GstWasapi2RingBuffer * self) +{ + HRESULT hr = E_FAIL; + + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (self), E_FAIL); + g_return_val_if_fail (self->device_class == + GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE, E_FAIL); + + if (!self->running) { + GST_INFO_OBJECT (self, "We are not running now"); + return S_OK; + } + + hr = gst_wasapi2_ring_buffer_fill_loopback_silence (self); + + if (self->running) { + if (gst_wasapi2_result (hr)) { + hr = MFPutWaitingWorkItem (self->loopback_event_handle, 0, + self->loopback_callback_result, &self->loopback_callback_key); + + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to put item"); + gst_wasapi2_ring_buffer_post_scheduling_error (self); + + return hr; + } + } + } else { + GST_INFO_OBJECT (self, "We are not running now"); + return S_OK; + } + + if (FAILED (hr)) + gst_wasapi2_ring_buffer_post_io_error (self, hr); + + return hr; +} + +static HRESULT +gst_wasapi2_ring_buffer_initialize_audio_client3 (GstWasapi2RingBuffer * self, + IAudioClient * client_handle, WAVEFORMATEX * mix_format, guint * period) +{ + HRESULT hr = S_OK; + UINT32 default_period, fundamental_period, min_period, max_period; + /* AUDCLNT_STREAMFLAGS_NOPERSIST is not allowed for + * InitializeSharedAudioStream */ + DWORD stream_flags = AUDCLNT_STREAMFLAGS_EVENTCALLBACK; + ComPtr < IAudioClient3 > audio_client; + + hr = client_handle->QueryInterface (IID_PPV_ARGS (&audio_client)); + if (!gst_wasapi2_result (hr)) { + GST_INFO_OBJECT (self, "IAudioClient3 interface is unavailable"); + return hr; + } + + hr = audio_client->GetSharedModeEnginePeriod (mix_format, + &default_period, &fundamental_period, &min_period, &max_period); + if (!gst_wasapi2_result (hr)) { + GST_INFO_OBJECT (self, "Couldn't get period"); + return hr; 
+ } + + GST_INFO_OBJECT (self, "Using IAudioClient3, default period %d frames, " + "fundamental period %d frames, minimum period %d frames, maximum period " + "%d frames", default_period, fundamental_period, min_period, max_period); + + *period = min_period; + + hr = audio_client->InitializeSharedAudioStream (stream_flags, min_period, + mix_format, nullptr); + + if (!gst_wasapi2_result (hr)) + GST_WARNING_OBJECT (self, "Failed to initialize IAudioClient3"); + + return hr; +} + +static HRESULT +gst_wasapi2_ring_buffer_initialize_audio_client (GstWasapi2RingBuffer * self, + IAudioClient * client_handle, WAVEFORMATEX * mix_format, guint * period, + DWORD extra_flags) +{ + GstAudioRingBuffer *ringbuffer = GST_AUDIO_RING_BUFFER_CAST (self); + REFERENCE_TIME default_period, min_period; + DWORD stream_flags = + AUDCLNT_STREAMFLAGS_EVENTCALLBACK | AUDCLNT_STREAMFLAGS_NOPERSIST; + HRESULT hr; + + stream_flags |= extra_flags; + + hr = client_handle->GetDevicePeriod (&default_period, &min_period); + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't get device period info"); + return hr; + } + + GST_INFO_OBJECT (self, "wasapi2 default period: %" G_GINT64_FORMAT + ", min period: %" G_GINT64_FORMAT, default_period, min_period); + + hr = client_handle->Initialize (AUDCLNT_SHAREMODE_SHARED, stream_flags, + /* hnsBufferDuration should be same as hnsPeriodicity + * when AUDCLNT_STREAMFLAGS_EVENTCALLBACK is used. 
+ * And in case of shared mode, hnsPeriodicity should be zero, so + * this value should be zero as well */ + 0, + /* This must always be 0 in shared mode */ + 0, mix_format, nullptr); + + if (!gst_wasapi2_result (hr)) { + GST_WARNING_OBJECT (self, "Couldn't initialize audioclient"); + return hr; + } + + *period = gst_util_uint64_scale_round (default_period * 100, + GST_AUDIO_INFO_RATE (&ringbuffer->spec.info), GST_SECOND); + + return S_OK; +} + +static gboolean +gst_wasapi2_ring_buffer_prepare_loopback_client (GstWasapi2RingBuffer * self) +{ + IAudioClient *client_handle; + HRESULT hr; + WAVEFORMATEX *mix_format = nullptr; + guint period = 0; + ComPtr < IAudioRenderClient > render_client; + + if (!self->loopback_client) { + GST_ERROR_OBJECT (self, "No configured client object"); + return FALSE; + } + + if (!gst_wasapi2_client_ensure_activation (self->loopback_client)) { + GST_ERROR_OBJECT (self, "Failed to activate audio client"); + return FALSE; + } + + client_handle = gst_wasapi2_client_get_handle (self->loopback_client); + if (!client_handle) { + GST_ERROR_OBJECT (self, "IAudioClient handle is not available"); + return FALSE; + } + + hr = client_handle->GetMixFormat (&mix_format); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to get mix format"); + return FALSE; + } + + hr = gst_wasapi2_ring_buffer_initialize_audio_client (self, client_handle, + mix_format, &period, 0); + + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to initialize audio client"); + return FALSE; + } + + hr = client_handle->SetEventHandle (self->loopback_event_handle); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to set event handle"); + return FALSE; + } + + hr = client_handle->GetBufferSize (&self->loopback_buffer_size); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to query buffer size"); + return FALSE; + } + + hr = client_handle->GetService (IID_PPV_ARGS (&render_client)); + if (!gst_wasapi2_result (hr)) 
{ + GST_ERROR_OBJECT (self, "IAudioRenderClient is unavailable"); + return FALSE; + } + + self->render_client = render_client.Detach (); + + return TRUE; +} + +static gboolean +gst_wasapi2_ring_buffer_acquire (GstAudioRingBuffer * buf, + GstAudioRingBufferSpec * spec) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + IAudioClient *client_handle; + HRESULT hr; + WAVEFORMATEX *mix_format = nullptr; + ComPtr < ISimpleAudioVolume > audio_volume; + GstAudioChannelPosition *position = nullptr; + guint period = 0; + + GST_DEBUG_OBJECT (buf, "Acquire"); + + if (!self->client && !gst_wasapi2_ring_buffer_open_device (buf)) + return FALSE; + + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE) { + if (!gst_wasapi2_ring_buffer_prepare_loopback_client (self)) { + GST_ERROR_OBJECT (self, "Failed to prepare loopback client"); + goto error; + } + } + + if (!gst_wasapi2_client_ensure_activation (self->client)) { + GST_ERROR_OBJECT (self, "Failed to activate audio client"); + goto error; + } + + client_handle = gst_wasapi2_client_get_handle (self->client); + if (!client_handle) { + GST_ERROR_OBJECT (self, "IAudioClient handle is not available"); + goto error; + } + + /* TODO: convert given caps to mix format */ + hr = client_handle->GetMixFormat (&mix_format); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to get mix format"); + goto error; + } + + /* Only use audioclient3 when low-latency is requested because otherwise + * very slow machines and VMs with 1 CPU allocated will get glitches: + * https://bugzilla.gnome.org/show_bug.cgi?id=794497 */ + hr = E_FAIL; + if (self->low_latency && + /* AUDCLNT_STREAMFLAGS_LOOPBACK is not allowed for + * InitializeSharedAudioStream */ + self->device_class != GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE) { + hr = gst_wasapi2_ring_buffer_initialize_audio_client3 (self, client_handle, + mix_format, &period); + } + + /* Try again if IAudioClient3 API is unavailable.
+ * NOTE: IAudioClient3 methods might not be available for default device + * NOTE: The default device is a special device which is needed for supporting + * automatic stream routing + * https://docs.microsoft.com/en-us/windows/win32/coreaudio/automatic-stream-routing + */ + if (FAILED (hr)) { + DWORD extra_flags = 0; + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE) + extra_flags = AUDCLNT_STREAMFLAGS_LOOPBACK; + + hr = gst_wasapi2_ring_buffer_initialize_audio_client (self, client_handle, + mix_format, &period, extra_flags); + } + + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to initialize audio client"); + goto error; + } + + hr = client_handle->SetEventHandle (self->event_handle); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to set event handle"); + goto error; + } + + gst_wasapi2_util_waveformatex_to_channel_mask (mix_format, &position); + if (position) + gst_audio_ring_buffer_set_channel_positions (buf, position); + g_free (position); + + CoTaskMemFree (mix_format); + + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to init audio client"); + goto error; + } + + hr = client_handle->GetBufferSize (&self->buffer_size); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to query buffer size"); + goto error; + } + + g_assert (period > 0); + + if (self->buffer_size > period) { + GST_INFO_OBJECT (self, "Updating buffer size %d -> %d", self->buffer_size, + period); + self->buffer_size = period; + } + + spec->segsize = period * GST_AUDIO_INFO_BPF (&buf->spec.info); + spec->segtotal = 2; + + GST_INFO_OBJECT (self, + "Buffer size: %d frames, period: %d frames, segsize: %d bytes", + self->buffer_size, period, spec->segsize); + + if (self->device_class == GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER) { + ComPtr < IAudioRenderClient > render_client; + + hr = client_handle->GetService (IID_PPV_ARGS (&render_client)); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT
(self, "IAudioRenderClient is unavailable"); + goto error; + } + + self->render_client = render_client.Detach (); + } else { + ComPtr < IAudioCaptureClient > capture_client; + + hr = client_handle->GetService (IID_PPV_ARGS (&capture_client)); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "IAudioCaptureClient is unavailable"); + goto error; + } + + self->capture_client = capture_client.Detach (); + } + + hr = client_handle->GetService (IID_PPV_ARGS (&audio_volume)); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "ISimpleAudioVolume is unavailable"); + goto error; + } + + g_mutex_lock (&self->volume_lock); + self->volume_object = audio_volume.Detach (); + + if (self->mute_changed) { + self->volume_object->SetMute (self->mute, nullptr); + self->mute_changed = FALSE; + } else { + self->volume_object->SetMute (FALSE, nullptr); + } + + if (self->volume_changed) { + self->volume_object->SetMasterVolume (self->volume, nullptr); + self->volume_changed = FALSE; + } + g_mutex_unlock (&self->volume_lock); + + buf->size = spec->segtotal * spec->segsize; + buf->memory = (guint8 *) g_malloc (buf->size); + gst_audio_format_info_fill_silence (buf->spec.info.finfo, + buf->memory, buf->size); + + return TRUE; + +error: + GST_WASAPI2_CLEAR_COM (self->render_client); + GST_WASAPI2_CLEAR_COM (self->capture_client); + GST_WASAPI2_CLEAR_COM (self->volume_object); + + gst_wasapi2_ring_buffer_post_open_error (self); + + return FALSE; +} + +static gboolean +gst_wasapi2_ring_buffer_release (GstAudioRingBuffer * buf) +{ + GST_DEBUG_OBJECT (buf, "Release"); + + g_clear_pointer (&buf->memory, g_free); + + /* IAudioClient handle is not reusable once it's initialized */ + gst_wasapi2_ring_buffer_close_device_internal (buf); + + return TRUE; +} + +static gboolean +gst_wasapi2_ring_buffer_start_internal (GstWasapi2RingBuffer * self) +{ + IAudioClient *client_handle; + HRESULT hr; + + if (self->running) { + GST_INFO_OBJECT (self, "We are running already"); + return TRUE; + 
} + + client_handle = gst_wasapi2_client_get_handle (self->client); + self->is_first = TRUE; + self->running = TRUE; + self->segoffset = 0; + self->write_frame_offset = 0; + + switch (self->device_class) { + case GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER: + /* render client might read data from buffer immediately once it's prepared. + * Pre-fill with silence in order to avoid a start-up glitch */ + hr = gst_wasapi2_ring_buffer_write (self, TRUE); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to pre-fill buffer with silence"); + goto error; + } + break; + case GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE: + { + IAudioClient *loopback_client_handle; + + /* Start silence feed client first */ + loopback_client_handle = + gst_wasapi2_client_get_handle (self->loopback_client); + + hr = loopback_client_handle->Start (); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to start loopback client"); + self->running = FALSE; + goto error; + } + + hr = MFPutWaitingWorkItem (self->loopback_event_handle, + 0, self->loopback_callback_result, &self->loopback_callback_key); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to put waiting item"); + loopback_client_handle->Stop (); + self->running = FALSE; + goto error; + } + break; + } + default: + break; + } + + hr = client_handle->Start (); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to start client"); + self->running = FALSE; + goto error; + } + + hr = MFPutWaitingWorkItem (self->event_handle, 0, self->callback_result, + &self->callback_key); + if (!gst_wasapi2_result (hr)) { + GST_ERROR_OBJECT (self, "Failed to put waiting item"); + client_handle->Stop (); + self->running = FALSE; + goto error; + } + + return TRUE; + +error: + gst_wasapi2_ring_buffer_post_open_error (self); + return FALSE; +} + +static gboolean +gst_wasapi2_ring_buffer_start (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT
(self, "Start"); + + return gst_wasapi2_ring_buffer_start_internal (self); +} + +static gboolean +gst_wasapi2_ring_buffer_resume (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (self, "Resume"); + + return gst_wasapi2_ring_buffer_start_internal (self); +} + +static gboolean +gst_wasapi2_ring_buffer_stop_internal (GstWasapi2RingBuffer * self) +{ + IAudioClient *client_handle; + HRESULT hr; + + if (!self->client) { + GST_DEBUG_OBJECT (self, "No configured client"); + return TRUE; + } + + if (!self->running) { + GST_DEBUG_OBJECT (self, "We are not running"); + return TRUE; + } + + client_handle = gst_wasapi2_client_get_handle (self->client); + + self->running = FALSE; + MFCancelWorkItem (self->callback_key); + + hr = client_handle->Stop (); + gst_wasapi2_result (hr); + + /* Call reset for later reuse case */ + hr = client_handle->Reset (); + self->expected_position = 0; + self->write_frame_offset = 0; + + if (self->loopback_client) { + client_handle = gst_wasapi2_client_get_handle (self->loopback_client); + + MFCancelWorkItem (self->loopback_callback_key); + + hr = client_handle->Stop (); + gst_wasapi2_result (hr); + + client_handle->Reset (); + } + + return TRUE; +} + +static gboolean +gst_wasapi2_ring_buffer_stop (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (buf, "Stop"); + + return gst_wasapi2_ring_buffer_stop_internal (self); +} + +static gboolean +gst_wasapi2_ring_buffer_pause (GstAudioRingBuffer * buf) +{ + GstWasapi2RingBuffer *self = GST_WASAPI2_RING_BUFFER (buf); + + GST_DEBUG_OBJECT (buf, "Pause"); + + return gst_wasapi2_ring_buffer_stop_internal (self); +} + +static guint +gst_wasapi2_ring_buffer_delay (GstAudioRingBuffer * buf) +{ + /* NOTE: WASAPI supports GetCurrentPadding() method for querying + * currently unread buffer size, but it doesn't seem to be quite useful + * here because: + * + * In case of capture client, 
GetCurrentPadding() will return the number of + * unread frames, which is identical to the pNumFramesToRead value + * returned by IAudioCaptureClient::GetBuffer(). Since we are running in + * event-driven mode and WASAPI signals the event whenever data is + * available, the padding is likely zero at this moment. There is also a + * chance of returning an incorrect value here because our IO callback + * happens on another thread. + * + * And the render client's padding size will be close to the total buffer + * size, which is likely larger than twice our period and doesn't + * correctly represent the number of frames queued in the device + */ + return 0; +} + +GstAudioRingBuffer * +gst_wasapi2_ring_buffer_new (GstWasapi2ClientDeviceClass device_class, + gboolean low_latency, const gchar * device_id, gpointer dispatcher, + const gchar * name) +{ + GstWasapi2RingBuffer *self; + + self = (GstWasapi2RingBuffer *) + g_object_new (GST_TYPE_WASAPI2_RING_BUFFER, "name", name, nullptr); + + if (!self->callback_object) { + gst_object_unref (self); + return nullptr; + } + + self->device_class = device_class; + self->low_latency = low_latency; + self->device_id = g_strdup (device_id); + self->dispatcher = dispatcher; + + return GST_AUDIO_RING_BUFFER_CAST (self); +} + +GstCaps * +gst_wasapi2_ring_buffer_get_caps (GstWasapi2RingBuffer * buf) +{ + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (buf), nullptr); + + if (buf->supported_caps) + return gst_caps_ref (buf->supported_caps); + + if (!buf->client) + return nullptr; + + if (!gst_wasapi2_client_ensure_activation (buf->client)) { + GST_ERROR_OBJECT (buf, "Failed to activate audio client"); + return nullptr; + } + + buf->supported_caps = gst_wasapi2_client_get_caps (buf->client); + if (buf->supported_caps) + return gst_caps_ref (buf->supported_caps); + + return nullptr; +} + +HRESULT +gst_wasapi2_ring_buffer_set_mute (GstWasapi2RingBuffer * buf, gboolean mute) +{ + HRESULT hr = S_OK; + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (buf),
E_INVALIDARG); + + g_mutex_lock (&buf->volume_lock); + buf->mute = mute; + if (buf->volume_object) + hr = buf->volume_object->SetMute (mute, nullptr); + else + buf->mute_changed = TRUE; + g_mutex_unlock (&buf->volume_lock); + + return hr; +} + +HRESULT +gst_wasapi2_ring_buffer_get_mute (GstWasapi2RingBuffer * buf, gboolean * mute) +{ + BOOL mute_val; + HRESULT hr = S_OK; + + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (buf), E_INVALIDARG); + g_return_val_if_fail (mute != nullptr, E_INVALIDARG); + + mute_val = buf->mute; + + g_mutex_lock (&buf->volume_lock); + if (buf->volume_object) + hr = buf->volume_object->GetMute (&mute_val); + g_mutex_unlock (&buf->volume_lock); + + *mute = mute_val ? TRUE : FALSE; + + return hr; +} + +HRESULT +gst_wasapi2_ring_buffer_set_volume (GstWasapi2RingBuffer * buf, gfloat volume) +{ + HRESULT hr = S_OK; + + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (buf), E_INVALIDARG); + g_return_val_if_fail (volume >= 0 && volume <= 1.0, E_INVALIDARG); + + g_mutex_lock (&buf->volume_lock); + buf->volume = volume; + if (buf->volume_object) + hr = buf->volume_object->SetMasterVolume (volume, nullptr); + else + buf->volume_changed = TRUE; + g_mutex_unlock (&buf->volume_lock); + + return hr; +} + +HRESULT +gst_wasapi2_ring_buffer_get_volume (GstWasapi2RingBuffer * buf, gfloat * volume) +{ + gfloat volume_val; + HRESULT hr = S_OK; + + g_return_val_if_fail (GST_IS_WASAPI2_RING_BUFFER (buf), E_INVALIDARG); + g_return_val_if_fail (volume != nullptr, E_INVALIDARG); + + g_mutex_lock (&buf->volume_lock); + volume_val = buf->volume; + if (buf->volume_object) + hr = buf->volume_object->GetMasterVolume (&volume_val); + g_mutex_unlock (&buf->volume_lock); + + *volume = volume_val; + + return hr; +}
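The period handling above converts a WASAPI device period, expressed as a REFERENCE_TIME in 100-nanosecond units, into a period in audio frames via gst_util_uint64_scale_round (default_period * 100, rate, GST_SECOND). A minimal standalone sketch of that arithmetic (ref_time_to_frames is a hypothetical helper written for illustration, not part of the GStreamer or WASAPI API):

```c
#include <stdint.h>

/* REFERENCE_TIME is in 100-nanosecond units, so ref_time * 100 is the
 * period in nanoseconds; scaling by rate / 1e9 with rounding yields the
 * period in frames, mirroring the gst_util_uint64_scale_round call.
 * (ref_time_to_frames is a hypothetical stand-in, not a GStreamer API.) */
static uint32_t
ref_time_to_frames (int64_t ref_time, int rate)
{
  int64_t ns = ref_time * 100;  /* 100 ns units -> nanoseconds */
  return (uint32_t) ((ns * (int64_t) rate + 500000000) / 1000000000);
}

/* A typical shared-mode device period of 10 ms is a REFERENCE_TIME of
 * 100000; at 48 kHz that is 480 frames, so with segtotal = 2 and
 * 2-channel float samples (8 bytes per frame) the ring buffer would use
 * segsize = 480 * 8 = 3840 bytes per segment. */
```

At 44.1 kHz the same 10 ms period rounds to 441 frames, which is how the segment size ends up tracking the device period rather than a fixed byte count.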
View file
gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2ringbuffer.h
Added
@@ -0,0 +1,55 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifndef __GST_WASAPI2_RING_BUFFER_H__ +#define __GST_WASAPI2_RING_BUFFER_H__ + +#include <gst/gst.h> +#include <gst/audio/audio.h> +#include "gstwasapi2client.h" + +G_BEGIN_DECLS + +#define GST_TYPE_WASAPI2_RING_BUFFER (gst_wasapi2_ring_buffer_get_type()) +G_DECLARE_FINAL_TYPE (GstWasapi2RingBuffer, gst_wasapi2_ring_buffer, + GST, WASAPI2_RING_BUFFER, GstAudioRingBuffer); + +GstAudioRingBuffer * gst_wasapi2_ring_buffer_new (GstWasapi2ClientDeviceClass device_class, + gboolean low_latency, + const gchar *device_id, + gpointer dispatcher, + const gchar * name); + +GstCaps * gst_wasapi2_ring_buffer_get_caps (GstWasapi2RingBuffer * buf); + +HRESULT gst_wasapi2_ring_buffer_set_mute (GstWasapi2RingBuffer * buf, + gboolean mute); + +HRESULT gst_wasapi2_ring_buffer_get_mute (GstWasapi2RingBuffer * buf, + gboolean * mute); + +HRESULT gst_wasapi2_ring_buffer_set_volume (GstWasapi2RingBuffer * buf, + gfloat volume); + +HRESULT gst_wasapi2_ring_buffer_get_volume (GstWasapi2RingBuffer * buf, + gfloat * volume); + +G_END_DECLS + +#endif /* __GST_WASAPI2_RING_BUFFER_H__ */
View file
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2sink.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2sink.c
Changed
@@ -31,11 +31,11 @@ * * ## Example pipelines * |[ - * gst-launch-1.0 -v audiotestsrc samplesperbuffer=160 ! wasapi2sink - * ]| Generate 20 ms buffers and render to the default audio device. + * gst-launch-1.0 -v audiotestsrc ! wasapi2sink + * ]| Generate audio test buffers and render to the default audio device. * * |[ - * gst-launch-1.0 -v audiotestsrc samplesperbuffer=160 ! wasapi2sink low-latency=true + * gst-launch-1.0 -v audiotestsrc ! wasapi2sink low-latency=true * ]| Same as above, but with the minimum possible latency * */ @@ -45,7 +45,7 @@ #include "gstwasapi2sink.h" #include "gstwasapi2util.h" -#include "gstwasapi2client.h" +#include "gstwasapi2ringbuffer.h" GST_DEBUG_CATEGORY_STATIC (gst_wasapi2_sink_debug); #define GST_CAT_DEFAULT gst_wasapi2_sink_debug @@ -59,9 +59,6 @@ #define DEFAULT_MUTE FALSE #define DEFAULT_VOLUME 1.0 -#define GST_WASAPI2_SINK_LOCK(s) g_mutex_lock(&(s)->lock) -#define GST_WASAPI2_SINK_UNLOCK(s) g_mutex_unlock(&(s)->lock) - enum { PROP_0, @@ -74,11 +71,7 @@ struct _GstWasapi2Sink { - GstAudioSink parent; - - GstWasapi2Client *client; - GstCaps *cached_caps; - gboolean started; + GstAudioBaseSink parent; /* properties */ gchar *device_id; @@ -89,30 +82,21 @@ gboolean mute_changed; gboolean volume_changed; - - /* to protect audioclient from set/get property */ - GMutex lock; }; -static void gst_wasapi2_sink_dispose (GObject * object); static void gst_wasapi2_sink_finalize (GObject * object); static void gst_wasapi2_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_wasapi2_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); +static GstStateChangeReturn gst_wasapi2_sink_change_state (GstElement * + element, GstStateChange transition); + static GstCaps *gst_wasapi2_sink_get_caps (GstBaseSink * bsink, GstCaps * filter); - -static gboolean gst_wasapi2_sink_prepare (GstAudioSink * asink, - GstAudioRingBufferSpec * spec);
-static gboolean gst_wasapi2_sink_unprepare (GstAudioSink * asink); -static gboolean gst_wasapi2_sink_open (GstAudioSink * asink); -static gboolean gst_wasapi2_sink_close (GstAudioSink * asink); -static gint gst_wasapi2_sink_write (GstAudioSink * asink, - gpointer data, guint length); -static guint gst_wasapi2_sink_delay (GstAudioSink * asink); -static void gst_wasapi2_sink_reset (GstAudioSink * asink); +static GstAudioRingBuffer *gst_wasapi2_sink_create_ringbuffer (GstAudioBaseSink + * sink); static void gst_wasapi2_sink_set_mute (GstWasapi2Sink * self, gboolean mute); static gboolean gst_wasapi2_sink_get_mute (GstWasapi2Sink * self); @@ -120,7 +104,8 @@ static gdouble gst_wasapi2_sink_get_volume (GstWasapi2Sink * self); #define gst_wasapi2_sink_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstWasapi2Sink, gst_wasapi2_sink, GST_TYPE_AUDIO_SINK, +G_DEFINE_TYPE_WITH_CODE (GstWasapi2Sink, gst_wasapi2_sink, + GST_TYPE_AUDIO_BASE_SINK, G_IMPLEMENT_INTERFACE (GST_TYPE_STREAM_VOLUME, NULL)); static void @@ -129,9 +114,9 @@ GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *element_class = GST_ELEMENT_CLASS (klass); GstBaseSinkClass *basesink_class = GST_BASE_SINK_CLASS (klass); - GstAudioSinkClass *audiosink_class = GST_AUDIO_SINK_CLASS (klass); + GstAudioBaseSinkClass *audiobasesink_class = + GST_AUDIO_BASE_SINK_CLASS (klass); - gobject_class->dispose = gst_wasapi2_sink_dispose; gobject_class->finalize = gst_wasapi2_sink_finalize; gobject_class->set_property = gst_wasapi2_sink_set_property; gobject_class->get_property = gst_wasapi2_sink_get_property; @@ -184,15 +169,13 @@ "Ole André Vadla Ravnås <ole.andre.ravnas@tandberg.com>, " "Seungha Yang <seungha@centricular.com>"); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_wasapi2_sink_change_state); + basesink_class->get_caps = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_get_caps); - audiosink_class->prepare = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_prepare); - audiosink_class->unprepare = 
GST_DEBUG_FUNCPTR (gst_wasapi2_sink_unprepare); - audiosink_class->open = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_open); - audiosink_class->close = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_close); - audiosink_class->write = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_write); - audiosink_class->delay = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_delay); - audiosink_class->reset = GST_DEBUG_FUNCPTR (gst_wasapi2_sink_reset); + audiobasesink_class->create_ringbuffer = + GST_DEBUG_FUNCPTR (gst_wasapi2_sink_create_ringbuffer); GST_DEBUG_CATEGORY_INIT (gst_wasapi2_sink_debug, "wasapi2sink", 0, "Windows audio session API sink"); @@ -204,21 +187,6 @@ self->low_latency = DEFAULT_LOW_LATENCY; self->mute = DEFAULT_MUTE; self->volume = DEFAULT_VOLUME; - - g_mutex_init (&self->lock); -} - -static void -gst_wasapi2_sink_dispose (GObject * object) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (object); - - GST_WASAPI2_SINK_LOCK (self); - gst_clear_object (&self->client); - gst_clear_caps (&self->cached_caps); - GST_WASAPI2_SINK_UNLOCK (self); - - G_OBJECT_CLASS (parent_class)->dispose (object); } static void @@ -227,7 +195,6 @@ GstWasapi2Sink *self = GST_WASAPI2_SINK (object); g_free (self->device_id); - g_mutex_clear (&self->lock); G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -286,29 +253,59 @@ } } +static GstStateChangeReturn +gst_wasapi2_sink_change_state (GstElement * element, GstStateChange transition) +{ + GstWasapi2Sink *self = GST_WASAPI2_SINK (element); + GstAudioBaseSink *asink = GST_AUDIO_BASE_SINK_CAST (element); + + switch (transition) { + case GST_STATE_CHANGE_READY_TO_PAUSED: + /* If we have pending volume/mute values to set, do here */ + GST_OBJECT_LOCK (self); + if (asink->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (asink->ringbuffer); + + if (self->volume_changed) { + gst_wasapi2_ring_buffer_set_volume (ringbuffer, self->volume); + self->volume_changed = FALSE; + } + + if (self->mute_changed) { + gst_wasapi2_ring_buffer_set_mute (ringbuffer, 
self->mute); + self->mute_changed = FALSE; + } + } + GST_OBJECT_UNLOCK (self); + break; + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + static GstCaps * gst_wasapi2_sink_get_caps (GstBaseSink * bsink, GstCaps * filter) { - GstWasapi2Sink *self = GST_WASAPI2_SINK (bsink); + GstAudioBaseSink *asink = GST_AUDIO_BASE_SINK_CAST (bsink); GstCaps *caps = NULL; - /* In case of UWP, device activation might not be finished yet */ - if (self->client && !gst_wasapi2_client_ensure_activation (self->client)) { - GST_ELEMENT_ERROR (self, RESOURCE, OPEN_WRITE, (NULL), - ("Failed to activate device")); - return NULL; - } - - if (self->client) - caps = gst_wasapi2_client_get_caps (self->client); + GST_OBJECT_LOCK (bsink); + if (asink->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (asink->ringbuffer); - /* store one caps here so that we can return device caps even if - * audioclient was closed due to unprepare() */ - if (!self->cached_caps && caps) - self->cached_caps = gst_caps_ref (caps); + gst_object_ref (ringbuffer); + GST_OBJECT_UNLOCK (bsink); - if (!caps && self->cached_caps) - caps = gst_caps_ref (self->cached_caps); + /* Get caps might be able to block if device is not activated yet */ + caps = gst_wasapi2_ring_buffer_get_caps (ringbuffer); + gst_object_unref (ringbuffer); + } else { + GST_OBJECT_UNLOCK (bsink); + } if (!caps) caps = gst_pad_get_pad_template_caps (bsink->sinkpad); @@ -320,212 +317,81 @@ caps = filtered; } - GST_DEBUG_OBJECT (self, "returning caps %" GST_PTR_FORMAT, caps); + GST_DEBUG_OBJECT (bsink, "returning caps %" GST_PTR_FORMAT, caps); return caps; } -static gboolean -gst_wasapi2_sink_open_unlocked (GstAudioSink * asink) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - - self->client = - gst_wasapi2_client_new (GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, - self->low_latency, -1, self->device_id, self->dispatcher); - - return ! 
!self->client; -} - -static gboolean -gst_wasapi2_sink_open (GstAudioSink * asink) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - gboolean ret; - - GST_DEBUG_OBJECT (self, "Opening device"); - - GST_WASAPI2_SINK_LOCK (self); - ret = gst_wasapi2_sink_open_unlocked (asink); - GST_WASAPI2_SINK_UNLOCK (self); - - if (!ret) { - GST_ELEMENT_ERROR (self, RESOURCE, OPEN_WRITE, (NULL), - ("Failed to open device")); - return FALSE; - } - - return TRUE; -} - -static gboolean -gst_wasapi2_sink_close (GstAudioSink * asink) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - - GST_WASAPI2_SINK_LOCK (self); - - gst_clear_object (&self->client); - gst_clear_caps (&self->cached_caps); - self->started = FALSE; - - GST_WASAPI2_SINK_UNLOCK (self); - - return TRUE; -} - -static gboolean -gst_wasapi2_sink_prepare (GstAudioSink * asink, GstAudioRingBufferSpec * spec) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - GstAudioBaseSink *bsink = GST_AUDIO_BASE_SINK (asink); - gboolean ret = FALSE; - - GST_WASAPI2_SINK_LOCK (self); - if (!self->client && !gst_wasapi2_sink_open_unlocked (asink)) { - GST_ERROR_OBJECT (self, "No audio client was configured"); - goto done; - } - - if (!gst_wasapi2_client_ensure_activation (self->client)) { - GST_ERROR_OBJECT (self, "Couldn't activate audio device"); - goto done; - } - - if (!gst_wasapi2_client_open (self->client, spec, bsink->ringbuffer)) { - GST_ERROR_OBJECT (self, "Couldn't open audio client"); - goto done; - } - - /* Set mute and volume here again, maybe when "mute" property was set, audioclient - * might not be configured at that moment */ - if (self->mute_changed) { - gst_wasapi2_client_set_mute (self->client, self->mute); - self->mute_changed = FALSE; - } - - if (self->volume_changed) { - gst_wasapi2_client_set_volume (self->client, self->volume); - self->volume_changed = FALSE; - } - - /* Will start IAudioClient on the first write request */ - self->started = FALSE; - ret = TRUE; - -done: - GST_WASAPI2_SINK_UNLOCK 
(self); - - return ret; -} - -static gboolean -gst_wasapi2_sink_unprepare (GstAudioSink * asink) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - - self->started = FALSE; - - /* Will reopen device later prepare() */ - GST_WASAPI2_SINK_LOCK (self); - if (self->client) { - gst_wasapi2_client_stop (self->client); - gst_clear_object (&self->client); - } - GST_WASAPI2_SINK_UNLOCK (self); - - return TRUE; -} - -static gint -gst_wasapi2_sink_write (GstAudioSink * asink, gpointer data, guint length) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - - if (!self->client) { - GST_ERROR_OBJECT (self, "No audio client was configured"); - return -1; - } - - if (!self->started) { - if (!gst_wasapi2_client_start (self->client)) { - GST_ERROR_OBJECT (self, "Failed to re-start client"); - return -1; - } - - self->started = TRUE; - } - - return gst_wasapi2_client_write (self->client, data, length); -} - -static guint -gst_wasapi2_sink_delay (GstAudioSink * asink) +static GstAudioRingBuffer * +gst_wasapi2_sink_create_ringbuffer (GstAudioBaseSink * sink) { - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); + GstWasapi2Sink *self = GST_WASAPI2_SINK (sink); + GstAudioRingBuffer *ringbuffer; + gchar *name; - if (!self->client) - return 0; + name = g_strdup_printf ("%s-ringbuffer", GST_OBJECT_NAME (sink)); - return gst_wasapi2_client_delay (self->client); -} + ringbuffer = + gst_wasapi2_ring_buffer_new (GST_WASAPI2_CLIENT_DEVICE_CLASS_RENDER, + self->low_latency, self->device_id, self->dispatcher, name); -static void -gst_wasapi2_sink_reset (GstAudioSink * asink) -{ - GstWasapi2Sink *self = GST_WASAPI2_SINK (asink); - - GST_INFO_OBJECT (self, "reset called"); - - self->started = FALSE; + g_free (name); - if (!self->client) - return; - - gst_wasapi2_client_stop (self->client); + return ringbuffer; } static void gst_wasapi2_sink_set_mute (GstWasapi2Sink * self, gboolean mute) { - GST_WASAPI2_SINK_LOCK (self); + GstAudioBaseSink *bsink = GST_AUDIO_BASE_SINK_CAST (self); + HRESULT 
hr; + + GST_OBJECT_LOCK (self); self->mute = mute; self->mute_changed = TRUE; - if (self->client) { - if (!gst_wasapi2_client_set_mute (self->client, mute)) { + if (bsink->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsink->ringbuffer); + + hr = gst_wasapi2_ring_buffer_set_mute (ringbuffer, mute); + + if (FAILED (hr)) { GST_INFO_OBJECT (self, "Couldn't set mute"); } else { self->mute_changed = FALSE; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SINK_UNLOCK (self); + GST_OBJECT_UNLOCK (self); } static gboolean gst_wasapi2_sink_get_mute (GstWasapi2Sink * self) { + GstAudioBaseSink *bsink = GST_AUDIO_BASE_SINK_CAST (self); gboolean mute; + HRESULT hr; - GST_WASAPI2_SINK_LOCK (self); + GST_OBJECT_LOCK (self); mute = self->mute; - if (self->client) { - if (!gst_wasapi2_client_get_mute (self->client, &mute)) { - GST_INFO_OBJECT (self, "Couldn't get mute state"); + if (bsink->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsink->ringbuffer); + + hr = gst_wasapi2_ring_buffer_get_mute (ringbuffer, &mute); + + if (FAILED (hr)) { + GST_INFO_OBJECT (self, "Couldn't get mute"); } else { self->mute = mute; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SINK_UNLOCK (self); + GST_OBJECT_UNLOCK (self); return mute; } @@ -533,7 +399,10 @@ static void gst_wasapi2_sink_set_volume (GstWasapi2Sink * self, gdouble volume) { - GST_WASAPI2_SINK_LOCK (self); + GstAudioBaseSink *bsink = GST_AUDIO_BASE_SINK_CAST (self); + HRESULT hr; + + GST_OBJECT_LOCK (self); self->volume = volume; /* clip volume value */ @@ -541,39 +410,47 @@ self->volume = MIN (1.0, self->volume); self->volume_changed = TRUE; - if (self->client) { - if (!gst_wasapi2_client_set_volume (self->client, (gfloat) self->volume)) { + if (bsink->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsink->ringbuffer); + + hr = 
gst_wasapi2_ring_buffer_set_volume (ringbuffer, (gfloat) self->volume); + + if (FAILED (hr)) { GST_INFO_OBJECT (self, "Couldn't set volume"); } else { self->volume_changed = FALSE; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SINK_UNLOCK (self); + GST_OBJECT_UNLOCK (self); } static gdouble gst_wasapi2_sink_get_volume (GstWasapi2Sink * self) { + GstAudioBaseSink *bsink = GST_AUDIO_BASE_SINK_CAST (self); gfloat volume; + HRESULT hr; - GST_WASAPI2_SINK_LOCK (self); + GST_OBJECT_LOCK (self); volume = (gfloat) self->volume; - if (self->client) { - if (!gst_wasapi2_client_get_volume (self->client, &volume)) { - GST_INFO_OBJECT (self, "Couldn't get volume"); + if (bsink->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsink->ringbuffer); + + hr = gst_wasapi2_ring_buffer_get_volume (ringbuffer, &volume); + + if (FAILED (hr)) { + GST_INFO_OBJECT (self, "Couldn't set volume"); } else { self->volume = volume; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SINK_UNLOCK (self); + GST_OBJECT_UNLOCK (self); volume = MAX (0.0, volume); volume = MIN (1.0, volume);
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2sink.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2sink.h
Changed
@@ -27,7 +27,7 @@ #define GST_TYPE_WASAPI2_SINK (gst_wasapi2_sink_get_type ()) G_DECLARE_FINAL_TYPE (GstWasapi2Sink, - gst_wasapi2_sink, GST, WASAPI2_SINK, GstAudioSink); + gst_wasapi2_sink, GST, WASAPI2_SINK, GstAudioBaseSink); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2src.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2src.c
Changed
@@ -29,12 +29,12 @@ * * ## Example pipelines * |[ - * gst-launch-1.0 -v wasapi2src ! fakesrc - * ]| Capture from the default audio device and render to fakesrc. + * gst-launch-1.0 -v wasapi2src ! fakesink + * ]| Capture from the default audio device and render to fakesink. * * |[ - * gst-launch-1.0 -v wasapi2src low-latency=true ! fakesrc - * ]| Capture from the default audio device with the minimum possible latency and render to fakesrc. + * gst-launch-1.0 -v wasapi2src low-latency=true ! fakesink + * ]| Capture from the default audio device with the minimum possible latency and render to fakesink. * */ #ifdef HAVE_CONFIG_H @@ -43,7 +43,7 @@ #include "gstwasapi2src.h" #include "gstwasapi2util.h" -#include "gstwasapi2client.h" +#include "gstwasapi2ringbuffer.h" GST_DEBUG_CATEGORY_STATIC (gst_wasapi2_src_debug); #define GST_CAT_DEFAULT gst_wasapi2_src_debug @@ -56,9 +56,7 @@ #define DEFAULT_LOW_LATENCY FALSE #define DEFAULT_MUTE FALSE #define DEFAULT_VOLUME 1.0 - -#define GST_WASAPI2_SRC_LOCK(s) g_mutex_lock(&(s)->lock) -#define GST_WASAPI2_SRC_UNLOCK(s) g_mutex_unlock(&(s)->lock) +#define DEFAULT_LOOPBACK FALSE enum { @@ -68,15 +66,12 @@ PROP_MUTE, PROP_VOLUME, PROP_DISPATCHER, + PROP_LOOPBACK, }; struct _GstWasapi2Src { - GstAudioSrc parent; - - GstWasapi2Client *client; - GstCaps *cached_caps; - gboolean started; + GstAudioBaseSrc parent; /* properties */ gchar *device_id; @@ -84,32 +79,24 @@ gboolean mute; gdouble volume; gpointer dispatcher; + gboolean loopback; gboolean mute_changed; gboolean volume_changed; - - /* to protect audioclient from set/get property */ - GMutex lock; }; -static void gst_wasapi2_src_dispose (GObject * object); static void gst_wasapi2_src_finalize (GObject * object); static void gst_wasapi2_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_wasapi2_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); -static GstCaps *gst_wasapi2_src_get_caps 
(GstBaseSrc * bsrc, GstCaps * filter); +static GstStateChangeReturn gst_wasapi2_src_change_state (GstElement * + element, GstStateChange transition); -static gboolean gst_wasapi2_src_open (GstAudioSrc * asrc); -static gboolean gst_wasapi2_src_close (GstAudioSrc * asrc); -static gboolean gst_wasapi2_src_prepare (GstAudioSrc * asrc, - GstAudioRingBufferSpec * spec); -static gboolean gst_wasapi2_src_unprepare (GstAudioSrc * asrc); -static guint gst_wasapi2_src_read (GstAudioSrc * asrc, gpointer data, - guint length, GstClockTime * timestamp); -static guint gst_wasapi2_src_delay (GstAudioSrc * asrc); -static void gst_wasapi2_src_reset (GstAudioSrc * asrc); +static GstCaps *gst_wasapi2_src_get_caps (GstBaseSrc * bsrc, GstCaps * filter); +static GstAudioRingBuffer *gst_wasapi2_src_create_ringbuffer (GstAudioBaseSrc * + src); static void gst_wasapi2_src_set_mute (GstWasapi2Src * self, gboolean mute); static gboolean gst_wasapi2_src_get_mute (GstWasapi2Src * self); @@ -117,7 +104,8 @@ static gdouble gst_wasapi2_src_get_volume (GstWasapi2Src * self); #define gst_wasapi2_src_parent_class parent_class -G_DEFINE_TYPE_WITH_CODE (GstWasapi2Src, gst_wasapi2_src, GST_TYPE_AUDIO_SRC, +G_DEFINE_TYPE_WITH_CODE (GstWasapi2Src, gst_wasapi2_src, + GST_TYPE_AUDIO_BASE_SRC, G_IMPLEMENT_INTERFACE (GST_TYPE_STREAM_VOLUME, NULL)); static void @@ -126,9 +114,8 @@ GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *element_class = GST_ELEMENT_CLASS (klass); GstBaseSrcClass *basesrc_class = GST_BASE_SRC_CLASS (klass); - GstAudioSrcClass *audiosrc_class = GST_AUDIO_SRC_CLASS (klass); + GstAudioBaseSrcClass *audiobasesrc_class = GST_AUDIO_BASE_SRC_CLASS (klass); - gobject_class->dispose = gst_wasapi2_src_dispose; gobject_class->finalize = gst_wasapi2_src_finalize; gobject_class->set_property = gst_wasapi2_src_set_property; gobject_class->get_property = gst_wasapi2_src_get_property; @@ -173,6 +160,19 @@ "reference count management", GST_PARAM_MUTABLE_READY | G_PARAM_WRITABLE | 
G_PARAM_STATIC_STRINGS)); + /** + * GstWasapi2Src:loopback: + * + * Open render device for loopback recording + * + * Since: 1.20 + */ + g_object_class_install_property (gobject_class, PROP_LOOPBACK, + g_param_spec_boolean ("loopback", "Loopback recording", + "Open render device for loopback recording", DEFAULT_LOOPBACK, + GST_PARAM_MUTABLE_READY | G_PARAM_READWRITE | + G_PARAM_STATIC_STRINGS)); + gst_element_class_add_static_pad_template (element_class, &src_template); gst_element_class_set_static_metadata (element_class, "Wasapi2Src", "Source/Audio/Hardware", @@ -181,15 +181,13 @@ "Ole André Vadla Ravnås <ole.andre.ravnas@tandberg.com>, " "Seungha Yang <seungha@centricular.com>"); + element_class->change_state = + GST_DEBUG_FUNCPTR (gst_wasapi2_src_change_state); + basesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_wasapi2_src_get_caps); - audiosrc_class->open = GST_DEBUG_FUNCPTR (gst_wasapi2_src_open); - audiosrc_class->close = GST_DEBUG_FUNCPTR (gst_wasapi2_src_close); - audiosrc_class->read = GST_DEBUG_FUNCPTR (gst_wasapi2_src_read); - audiosrc_class->prepare = GST_DEBUG_FUNCPTR (gst_wasapi2_src_prepare); - audiosrc_class->unprepare = GST_DEBUG_FUNCPTR (gst_wasapi2_src_unprepare); - audiosrc_class->delay = GST_DEBUG_FUNCPTR (gst_wasapi2_src_delay); - audiosrc_class->reset = GST_DEBUG_FUNCPTR (gst_wasapi2_src_reset); + audiobasesrc_class->create_ringbuffer = + GST_DEBUG_FUNCPTR (gst_wasapi2_src_create_ringbuffer); GST_DEBUG_CATEGORY_INIT (gst_wasapi2_src_debug, "wasapi2src", 0, "Windows audio session API source"); @@ -201,21 +199,7 @@ self->mute = DEFAULT_MUTE; self->volume = DEFAULT_VOLUME; self->low_latency = DEFAULT_LOW_LATENCY; - - g_mutex_init (&self->lock); -} - -static void -gst_wasapi2_src_dispose (GObject * object) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (object); - - GST_WASAPI2_SRC_LOCK (self); - gst_clear_object (&self->client); - gst_clear_caps (&self->cached_caps); - GST_WASAPI2_SRC_UNLOCK (self); - - G_OBJECT_CLASS (parent_class)->dispose 
(object); + self->loopback = DEFAULT_LOOPBACK; } static void @@ -224,7 +208,6 @@ GstWasapi2Src *self = GST_WASAPI2_SRC (object); g_free (self->device_id); - g_mutex_clear (&self->lock); G_OBJECT_CLASS (parent_class)->finalize (object); } @@ -252,6 +235,9 @@ case PROP_DISPATCHER: self->dispatcher = g_value_get_pointer (value); break; + case PROP_LOOPBACK: + self->loopback = g_value_get_boolean (value); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; @@ -277,35 +263,68 @@ case PROP_VOLUME: g_value_set_double (value, gst_wasapi2_src_get_volume (self)); break; + case PROP_LOOPBACK: + g_value_set_boolean (value, self->loopback); + break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } +static GstStateChangeReturn +gst_wasapi2_src_change_state (GstElement * element, GstStateChange transition) +{ + GstWasapi2Src *self = GST_WASAPI2_SRC (element); + GstAudioBaseSrc *asrc = GST_AUDIO_BASE_SRC_CAST (element); + + switch (transition) { + case GST_STATE_CHANGE_READY_TO_PAUSED: + /* If we have pending volume/mute values to set, do here */ + GST_OBJECT_LOCK (self); + if (asrc->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (asrc->ringbuffer); + + if (self->volume_changed) { + gst_wasapi2_ring_buffer_set_volume (ringbuffer, self->volume); + self->volume_changed = FALSE; + } + + if (self->mute_changed) { + gst_wasapi2_ring_buffer_set_mute (ringbuffer, self->mute); + self->mute_changed = FALSE; + } + } + GST_OBJECT_UNLOCK (self); + break; + default: + break; + } + + return GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); +} + static GstCaps * gst_wasapi2_src_get_caps (GstBaseSrc * bsrc, GstCaps * filter) { - GstWasapi2Src *self = GST_WASAPI2_SRC (bsrc); + GstAudioBaseSrc *asrc = GST_AUDIO_BASE_SRC_CAST (bsrc); GstCaps *caps = NULL; - /* In case of UWP, device activation might not be finished yet */ - if (self->client && !gst_wasapi2_client_ensure_activation 
(self->client)) { - GST_ELEMENT_ERROR (self, RESOURCE, OPEN_WRITE, (NULL), - ("Failed to activate device")); - return NULL; - } - - if (self->client) - caps = gst_wasapi2_client_get_caps (self->client); + GST_OBJECT_LOCK (bsrc); + if (asrc->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (asrc->ringbuffer); - /* store one caps here so that we can return device caps even if - * audioclient was closed due to unprepare() */ - if (!self->cached_caps && caps) - self->cached_caps = gst_caps_ref (caps); + gst_object_ref (ringbuffer); + GST_OBJECT_UNLOCK (bsrc); - if (!caps && self->cached_caps) - caps = gst_caps_ref (self->cached_caps); + /* Get caps might be able to block if device is not activated yet */ + caps = gst_wasapi2_ring_buffer_get_caps (ringbuffer); + gst_object_unref (ringbuffer); + } else { + GST_OBJECT_UNLOCK (bsrc); + } if (!caps) caps = gst_pad_get_pad_template_caps (bsrc->srcpad); @@ -317,213 +336,84 @@ caps = filtered; } - GST_DEBUG_OBJECT (self, "returning caps %" GST_PTR_FORMAT, caps); + GST_DEBUG_OBJECT (bsrc, "returning caps %" GST_PTR_FORMAT, caps); return caps; } -static gboolean -gst_wasapi2_src_open_unlocked (GstAudioSrc * asrc) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - - self->client = - gst_wasapi2_client_new (GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE, - self->low_latency, -1, self->device_id, self->dispatcher); - - return ! 
!self->client; -} - -static gboolean -gst_wasapi2_src_open (GstAudioSrc * asrc) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - gboolean ret; - - GST_DEBUG_OBJECT (self, "Opening device"); - - GST_WASAPI2_SRC_LOCK (self); - ret = gst_wasapi2_src_open_unlocked (asrc); - GST_WASAPI2_SRC_UNLOCK (self); - - if (!ret) { - GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, (NULL), - ("Failed to open device")); - return FALSE; - } - - return TRUE; -} - -static gboolean -gst_wasapi2_src_close (GstAudioSrc * asrc) +static GstAudioRingBuffer * +gst_wasapi2_src_create_ringbuffer (GstAudioBaseSrc * src) { - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); + GstWasapi2Src *self = GST_WASAPI2_SRC (src); + GstAudioRingBuffer *ringbuffer; + gchar *name; + GstWasapi2ClientDeviceClass device_class = + GST_WASAPI2_CLIENT_DEVICE_CLASS_CAPTURE; - GST_WASAPI2_SRC_LOCK (self); + if (self->loopback) + device_class = GST_WASAPI2_CLIENT_DEVICE_CLASS_LOOPBACK_CAPTURE; - gst_clear_object (&self->client); - gst_clear_caps (&self->cached_caps); - self->started = FALSE; + name = g_strdup_printf ("%s-ringbuffer", GST_OBJECT_NAME (src)); - GST_WASAPI2_SRC_UNLOCK (self); + ringbuffer = + gst_wasapi2_ring_buffer_new (device_class, + self->low_latency, self->device_id, self->dispatcher, name); + g_free (name); - return TRUE; -} - -static gboolean -gst_wasapi2_src_prepare (GstAudioSrc * asrc, GstAudioRingBufferSpec * spec) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - GstAudioBaseSrc *bsrc = GST_AUDIO_BASE_SRC (asrc); - gboolean ret = FALSE; - - GST_WASAPI2_SRC_LOCK (self); - if (!self->client && !gst_wasapi2_src_open_unlocked (asrc)) { - GST_ERROR_OBJECT (self, "No audio client was configured"); - goto done; - } - - if (!gst_wasapi2_client_ensure_activation (self->client)) { - GST_ERROR_OBJECT (self, "Couldn't activate audio device"); - goto done; - } - - if (!gst_wasapi2_client_open (self->client, spec, bsrc->ringbuffer)) { - GST_ERROR_OBJECT (self, "Couldn't open audio client"); - goto done; - 
} - - /* Set mute and volume here again, maybe when "mute" property was set, audioclient - * might not be configured at that moment */ - if (self->mute_changed) { - gst_wasapi2_client_set_mute (self->client, self->mute); - self->mute_changed = FALSE; - } - - if (self->volume_changed) { - gst_wasapi2_client_set_volume (self->client, self->volume); - self->volume_changed = FALSE; - } - - /* Will start IAudioClient on the first read request */ - self->started = FALSE; - ret = TRUE; - -done: - GST_WASAPI2_SRC_UNLOCK (self); - - return ret; -} - -static gboolean -gst_wasapi2_src_unprepare (GstAudioSrc * asrc) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - - self->started = FALSE; - - /* Will reopen device later prepare() */ - GST_WASAPI2_SRC_LOCK (self); - if (self->client) { - gst_wasapi2_client_stop (self->client); - gst_clear_object (&self->client); - } - GST_WASAPI2_SRC_UNLOCK (self); - - return TRUE; -} - -static guint -gst_wasapi2_src_read (GstAudioSrc * asrc, gpointer data, guint length, - GstClockTime * timestamp) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - - if (!self->client) { - GST_ERROR_OBJECT (self, "No audio client was configured"); - return -1; - } - - if (!self->started) { - if (!gst_wasapi2_client_start (self->client)) { - GST_ERROR_OBJECT (self, "Failed to re-start client"); - return -1; - } - - self->started = TRUE; - } - - return gst_wasapi2_client_read (self->client, data, length); -} - -static guint -gst_wasapi2_src_delay (GstAudioSrc * asrc) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - - if (!self->client) - return 0; - - return gst_wasapi2_client_delay (self->client); -} - -static void -gst_wasapi2_src_reset (GstAudioSrc * asrc) -{ - GstWasapi2Src *self = GST_WASAPI2_SRC (asrc); - - GST_DEBUG_OBJECT (self, "reset called"); - - self->started = FALSE; - - if (!self->client) - return; - - gst_wasapi2_client_stop (self->client); + return ringbuffer; } static void gst_wasapi2_src_set_mute (GstWasapi2Src * self, gboolean mute) { 
- GST_WASAPI2_SRC_LOCK (self); + GstAudioBaseSrc *bsrc = GST_AUDIO_BASE_SRC_CAST (self); + HRESULT hr; + + GST_OBJECT_LOCK (self); self->mute = mute; self->mute_changed = TRUE; - if (self->client) { - if (!gst_wasapi2_client_set_mute (self->client, mute)) { + if (bsrc->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsrc->ringbuffer); + + hr = gst_wasapi2_ring_buffer_set_mute (ringbuffer, mute); + if (FAILED (hr)) { GST_INFO_OBJECT (self, "Couldn't set mute"); } else { self->mute_changed = FALSE; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SRC_UNLOCK (self); + GST_OBJECT_UNLOCK (self); } static gboolean gst_wasapi2_src_get_mute (GstWasapi2Src * self) { + GstAudioBaseSrc *bsrc = GST_AUDIO_BASE_SRC_CAST (self); gboolean mute; + HRESULT hr; - GST_WASAPI2_SRC_LOCK (self); + GST_OBJECT_LOCK (self); mute = self->mute; - if (self->client) { - if (!gst_wasapi2_client_get_mute (self->client, &mute)) { - GST_INFO_OBJECT (self, "Couldn't get mute state"); + if (bsrc->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsrc->ringbuffer); + + hr = gst_wasapi2_ring_buffer_get_mute (ringbuffer, &mute); + + if (FAILED (hr)) { + GST_INFO_OBJECT (self, "Couldn't get mute"); } else { self->mute = mute; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SRC_UNLOCK (self); + GST_OBJECT_UNLOCK (self); return mute; } @@ -531,7 +421,10 @@ static void gst_wasapi2_src_set_volume (GstWasapi2Src * self, gdouble volume) { - GST_WASAPI2_SRC_LOCK (self); + GstAudioBaseSrc *bsrc = GST_AUDIO_BASE_SRC_CAST (self); + HRESULT hr; + + GST_OBJECT_LOCK (self); self->volume = volume; /* clip volume value */ @@ -539,39 +432,47 @@ self->volume = MIN (1.0, self->volume); self->volume_changed = TRUE; - if (self->client) { - if (!gst_wasapi2_client_set_volume (self->client, (gfloat) self->volume)) { + if (bsrc->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + 
GST_WASAPI2_RING_BUFFER (bsrc->ringbuffer); + + hr = gst_wasapi2_ring_buffer_set_volume (ringbuffer, (gfloat) self->volume); + + if (FAILED (hr)) { GST_INFO_OBJECT (self, "Couldn't set volume"); } else { self->volume_changed = FALSE; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SRC_UNLOCK (self); + GST_OBJECT_UNLOCK (self); } static gdouble gst_wasapi2_src_get_volume (GstWasapi2Src * self) { + GstAudioBaseSrc *bsrc = GST_AUDIO_BASE_SRC_CAST (self); gfloat volume; + HRESULT hr; - GST_WASAPI2_SRC_LOCK (self); + GST_OBJECT_LOCK (self); volume = (gfloat) self->volume; - if (self->client) { - if (!gst_wasapi2_client_get_volume (self->client, &volume)) { - GST_INFO_OBJECT (self, "Couldn't get volume"); + if (bsrc->ringbuffer) { + GstWasapi2RingBuffer *ringbuffer = + GST_WASAPI2_RING_BUFFER (bsrc->ringbuffer); + + hr = gst_wasapi2_ring_buffer_get_volume (ringbuffer, &volume); + + if (FAILED (hr)) { + GST_INFO_OBJECT (self, "Couldn't set volume"); } else { self->volume = volume; } - } else { - GST_DEBUG_OBJECT (self, "audio client is not configured yet"); } - GST_WASAPI2_SRC_UNLOCK (self); + GST_OBJECT_UNLOCK (self); volume = MAX (0.0, volume); volume = MIN (1.0, volume);
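The new `loopback` property (since 1.20) makes the source open a render device for loopback capture instead of a capture device; the selection is a one-line branch in `gst_wasapi2_src_create_ringbuffer`, which also derives the ring buffer name from the element name. A sketch with hypothetical plain-C names (`DeviceClass`, `src_pick_device_class`) mirroring the `GST_WASAPI2_CLIENT_DEVICE_CLASS_*` enum:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical mirror of GstWasapi2ClientDeviceClass. */
typedef enum {
  DEVICE_CLASS_CAPTURE,
  DEVICE_CLASS_RENDER,
  DEVICE_CLASS_LOOPBACK_CAPTURE
} DeviceClass;

/* Mirrors the branch in gst_wasapi2_src_create_ringbuffer(): plain
 * capture by default, loopback capture when the property is set. */
static DeviceClass
src_pick_device_class (bool loopback)
{
  return loopback ? DEVICE_CLASS_LOOPBACK_CAPTURE : DEVICE_CLASS_CAPTURE;
}

/* Mirrors the g_strdup_printf ("%s-ringbuffer", ...) naming. */
static void
src_ringbuffer_name (const char *element_name, char *out, size_t out_size)
{
  snprintf (out, out_size, "%s-ringbuffer", element_name);
}
```

In a pipeline this would be enabled as e.g. `wasapi2src loopback=true`, recording what the default render device is playing.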
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2src.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2src.h
Changed
@@ -27,7 +27,7 @@ #define GST_TYPE_WASAPI2_SRC (gst_wasapi2_src_get_type ()) G_DECLARE_FINAL_TYPE (GstWasapi2Src, - gst_wasapi2_src, GST, WASAPI2_SRC, GstAudioSrc); + gst_wasapi2_src, GST, WASAPI2_SRC, GstAudioBaseSrc); G_END_DECLS
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2util.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2util.c
Changed
@@ -25,10 +25,57 @@ #include "gstwasapi2util.h" #include <audioclient.h> +#include <mmdeviceapi.h> +#include <winternl.h> GST_DEBUG_CATEGORY_EXTERN (gst_wasapi2_debug); #define GST_CAT_DEFAULT gst_wasapi2_debug +/* Desktop only defines */ +#ifndef KSAUDIO_SPEAKER_MONO +#define KSAUDIO_SPEAKER_MONO (SPEAKER_FRONT_CENTER) +#endif +#ifndef KSAUDIO_SPEAKER_1POINT1 +#define KSAUDIO_SPEAKER_1POINT1 (SPEAKER_FRONT_CENTER | SPEAKER_LOW_FREQUENCY) +#endif +#ifndef KSAUDIO_SPEAKER_STEREO +#define KSAUDIO_SPEAKER_STEREO (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT) +#endif +#ifndef KSAUDIO_SPEAKER_2POINT1 +#define KSAUDIO_SPEAKER_2POINT1 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | SPEAKER_LOW_FREQUENCY) +#endif +#ifndef KSAUDIO_SPEAKER_3POINT0 +#define KSAUDIO_SPEAKER_3POINT0 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | SPEAKER_FRONT_CENTER) +#endif +#ifndef KSAUDIO_SPEAKER_3POINT1 +#define KSAUDIO_SPEAKER_3POINT1 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | \ + SPEAKER_FRONT_CENTER | SPEAKER_LOW_FREQUENCY) +#endif +#ifndef KSAUDIO_SPEAKER_QUAD +#define KSAUDIO_SPEAKER_QUAD (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | \ + SPEAKER_BACK_LEFT | SPEAKER_BACK_RIGHT) +#endif +#define KSAUDIO_SPEAKER_SURROUND (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | \ + SPEAKER_FRONT_CENTER | SPEAKER_BACK_CENTER) +#ifndef KSAUDIO_SPEAKER_5POINT0 +#define KSAUDIO_SPEAKER_5POINT0 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | SPEAKER_FRONT_CENTER | \ + SPEAKER_SIDE_LEFT | SPEAKER_SIDE_RIGHT) +#endif +#define KSAUDIO_SPEAKER_5POINT1 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | \ + SPEAKER_FRONT_CENTER | SPEAKER_LOW_FREQUENCY | \ + SPEAKER_BACK_LEFT | SPEAKER_BACK_RIGHT) +#ifndef KSAUDIO_SPEAKER_7POINT0 +#define KSAUDIO_SPEAKER_7POINT0 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | SPEAKER_FRONT_CENTER | \ + SPEAKER_BACK_LEFT | SPEAKER_BACK_RIGHT | \ + SPEAKER_SIDE_LEFT | SPEAKER_SIDE_RIGHT) +#endif +#ifndef KSAUDIO_SPEAKER_7POINT1 +#define KSAUDIO_SPEAKER_7POINT1 (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT | \ + 
SPEAKER_FRONT_CENTER | SPEAKER_LOW_FREQUENCY | \ + SPEAKER_BACK_LEFT | SPEAKER_BACK_RIGHT | \ + SPEAKER_FRONT_LEFT_OF_CENTER | SPEAKER_FRONT_RIGHT_OF_CENTER) +#endif + /* *INDENT-OFF* */ static struct { @@ -57,6 +104,27 @@ {SPEAKER_TOP_BACK_CENTER, GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER}, {SPEAKER_TOP_BACK_RIGHT, GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT} }; + +static DWORD default_ch_masks[] = { + 0, + KSAUDIO_SPEAKER_MONO, + /* 2ch */ + KSAUDIO_SPEAKER_STEREO, + /* 2.1ch */ + /* KSAUDIO_SPEAKER_3POINT0 ? */ + KSAUDIO_SPEAKER_2POINT1, + /* 4ch */ + /* KSAUDIO_SPEAKER_3POINT1 or KSAUDIO_SPEAKER_SURROUND ? */ + KSAUDIO_SPEAKER_QUAD, + /* 5ch */ + KSAUDIO_SPEAKER_5POINT0, + /* 5.1ch */ + KSAUDIO_SPEAKER_5POINT1, + /* 7ch */ + KSAUDIO_SPEAKER_7POINT0, + /* 7.1ch */ + KSAUDIO_SPEAKER_7POINT1, +}; /* *INDENT-ON* */ static const gchar * @@ -157,6 +225,20 @@ return s; } +gchar * +gst_wasapi2_util_get_error_message (HRESULT hr) +{ + gchar *error_text = NULL; + + error_text = g_win32_error_message ((gint) hr); + if (!error_text || strlen (error_text) == 0) { + g_free (error_text); + error_text = g_strdup (hresult_to_string_fallback (hr)); + } + + return error_text; +} + gboolean _gst_wasapi2_result (HRESULT hr, GstDebugCategory * cat, const gchar * file, const gchar * function, gint line) @@ -192,3 +274,219 @@ return SUCCEEDED (hr); #endif } + +static void +gst_wasapi_util_channel_position_all_none (guint channels, + GstAudioChannelPosition * position) +{ + guint i; + + for (i = 0; i < channels; i++) + position[i] = GST_AUDIO_CHANNEL_POSITION_NONE; +} + +guint64 +gst_wasapi2_util_waveformatex_to_channel_mask (WAVEFORMATEX * format, + GstAudioChannelPosition ** out_position) +{ + guint i, ch; + guint64 mask = 0; + GstAudioChannelPosition *pos = NULL; + WORD nChannels = 0; + DWORD dwChannelMask = 0; + + nChannels = format->nChannels; + if (format->wFormatTag == WAVE_FORMAT_EXTENSIBLE) { + WAVEFORMATEXTENSIBLE *extensible = (WAVEFORMATEXTENSIBLE *) format; + dwChannelMask 
= extensible->dwChannelMask; + } + + if (out_position) + *out_position = NULL; + + if (nChannels > 2 && !dwChannelMask) { + GST_WARNING ("Unknown channel mask value for %d channel stream", nChannels); + + if (nChannels >= G_N_ELEMENTS (default_ch_masks)) { + GST_ERROR ("To may channels %d", nChannels); + return 0; + } + + dwChannelMask = default_ch_masks[nChannels]; + } + + pos = g_new (GstAudioChannelPosition, nChannels); + gst_wasapi_util_channel_position_all_none (nChannels, pos); + + /* Too many channels, have to assume that they are all non-positional */ + if (nChannels > G_N_ELEMENTS (wasapi_to_gst_pos)) { + GST_INFO ("Got too many (%i) channels, assuming non-positional", nChannels); + goto out; + } + + /* Too many bits in the channel mask, and the bits don't match nChannels */ + if (dwChannelMask >> (G_N_ELEMENTS (wasapi_to_gst_pos) + 1) != 0) { + GST_WARNING ("Too many bits in channel mask (%lu), assuming " + "non-positional", dwChannelMask); + goto out; + } + + /* Map WASAPI's channel mask to Gstreamer's channel mask and positions. + * If the no. of bits in the mask > nChannels, we will ignore the extra. */ + for (i = 0, ch = 0; i < G_N_ELEMENTS (wasapi_to_gst_pos) && ch < nChannels; + i++) { + if (!(dwChannelMask & wasapi_to_gst_pos[i].wasapi_pos)) + /* no match, try next */ + continue; + mask |= G_GUINT64_CONSTANT (1) << wasapi_to_gst_pos[i].gst_pos; + pos[ch++] = wasapi_to_gst_pos[i].gst_pos; + } + + /* XXX: Warn if some channel masks couldn't be mapped? 
*/ + + GST_DEBUG ("Converted WASAPI mask 0x%" G_GINT64_MODIFIER "x -> 0x%" + G_GINT64_MODIFIER "x", (guint64) dwChannelMask, (guint64) mask); + +out: + if (out_position) { + *out_position = pos; + } else { + g_free (pos); + } + + return mask; +} + +const gchar * +gst_wasapi2_util_waveformatex_to_audio_format (WAVEFORMATEX * format) +{ + const gchar *fmt_str = NULL; + GstAudioFormat fmt = GST_AUDIO_FORMAT_UNKNOWN; + + switch (format->wFormatTag) { + case WAVE_FORMAT_PCM: + fmt = gst_audio_format_build_integer (TRUE, G_LITTLE_ENDIAN, + format->wBitsPerSample, format->wBitsPerSample); + break; + case WAVE_FORMAT_IEEE_FLOAT: + if (format->wBitsPerSample == 32) + fmt = GST_AUDIO_FORMAT_F32LE; + else if (format->wBitsPerSample == 64) + fmt = GST_AUDIO_FORMAT_F64LE; + break; + case WAVE_FORMAT_EXTENSIBLE: + { + WAVEFORMATEXTENSIBLE *ex = (WAVEFORMATEXTENSIBLE *) format; + if (IsEqualGUID (&ex->SubFormat, &KSDATAFORMAT_SUBTYPE_PCM)) { + fmt = gst_audio_format_build_integer (TRUE, G_LITTLE_ENDIAN, + format->wBitsPerSample, ex->Samples.wValidBitsPerSample); + } else if (IsEqualGUID (&ex->SubFormat, &KSDATAFORMAT_SUBTYPE_IEEE_FLOAT)) { + if (format->wBitsPerSample == 32 + && ex->Samples.wValidBitsPerSample == 32) + fmt = GST_AUDIO_FORMAT_F32LE; + else if (format->wBitsPerSample == 64 && + ex->Samples.wValidBitsPerSample == 64) + fmt = GST_AUDIO_FORMAT_F64LE; + } + break; + } + default: + break; + } + + if (fmt != GST_AUDIO_FORMAT_UNKNOWN) + fmt_str = gst_audio_format_to_string (fmt); + + return fmt_str; +} + +gboolean +gst_wasapi2_util_parse_waveformatex (WAVEFORMATEX * format, + GstCaps * template_caps, GstCaps ** out_caps, + GstAudioChannelPosition ** out_positions) +{ + const gchar *afmt; + guint64 channel_mask; + + *out_caps = NULL; + + /* TODO: handle SPDIF and other encoded formats */ + + /* 1 or 2 channels <= 16 bits sample size OR + * 1 or 2 channels > 16 bits sample size or >2 channels */ + if (format->wFormatTag != WAVE_FORMAT_PCM && + format->wFormatTag != 
WAVE_FORMAT_IEEE_FLOAT && + format->wFormatTag != WAVE_FORMAT_EXTENSIBLE) + /* Unhandled format tag */ + return FALSE; + + /* WASAPI can only tell us one canonical mix format that it will accept. The + * alternative is calling IsFormatSupported on all combinations of formats. + * Instead, it's simpler and faster to require conversion inside gstreamer */ + afmt = gst_wasapi2_util_waveformatex_to_audio_format (format); + if (afmt == NULL) + return FALSE; + + *out_caps = gst_caps_copy (template_caps); + + channel_mask = gst_wasapi2_util_waveformatex_to_channel_mask (format, + out_positions); + + gst_caps_set_simple (*out_caps, + "format", G_TYPE_STRING, afmt, + "channels", G_TYPE_INT, format->nChannels, + "rate", G_TYPE_INT, format->nSamplesPerSec, NULL); + + if (channel_mask) { + gst_caps_set_simple (*out_caps, + "channel-mask", GST_TYPE_BITMASK, channel_mask, NULL); + } + + return TRUE; +} + +gboolean +gst_wasapi2_can_automatic_stream_routing (void) +{ +#ifdef GST_WASAPI2_WINAPI_ONLY_APP + /* Assume we are on very recent OS */ + return TRUE; +#else + static gboolean ret = FALSE; + static gsize version_once = 0; + + if (g_once_init_enter (&version_once)) { + OSVERSIONINFOEXW osverinfo; + typedef NTSTATUS (WINAPI fRtlGetVersion) (PRTL_OSVERSIONINFOEXW); + fRtlGetVersion *RtlGetVersion = NULL; + HMODULE hmodule = NULL; + + memset (&osverinfo, 0, sizeof (OSVERSIONINFOEXW)); + osverinfo.dwOSVersionInfoSize = sizeof (OSVERSIONINFOEXW); + + hmodule = LoadLibraryW (L"ntdll.dll"); + if (hmodule) + RtlGetVersion = + (fRtlGetVersion *) GetProcAddress (hmodule, "RtlGetVersion"); + + if (RtlGetVersion) { + RtlGetVersion (&osverinfo); + + /* automatic stream routing requires Windows 10 + * Anniversary Update (version 1607, build number 14393.0) */ + if (osverinfo.dwMajorVersion > 10 || + (osverinfo.dwMajorVersion == 10 && osverinfo.dwBuildNumber >= 14393)) + ret = TRUE; + } + + if (hmodule) + FreeLibrary (hmodule); + + g_once_init_leave (&version_once, 1); + } + + GST_INFO 
("Automatic stream routing support: %d", ret); + + return ret; +#endif +}
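`gst_wasapi2_util_waveformatex_to_channel_mask` above falls back to a table of default speaker masks when a multi-channel stream reports no `dwChannelMask`, then walks the WASAPI speaker bits from the LSB, assigning one position per channel and ignoring extra bits. A reduced, self-contained sketch of that walk; the speaker-bit values are the ksmedia.h ones, but the fallback table is truncated for brevity (entries needing side-speaker bits are omitted):

```c
#include <assert.h>
#include <stddef.h>

/* WASAPI speaker bits (subset; values as in ksmedia.h). */
#define SPEAKER_FRONT_LEFT     0x1u
#define SPEAKER_FRONT_RIGHT    0x2u
#define SPEAKER_FRONT_CENTER   0x4u
#define SPEAKER_LOW_FREQUENCY  0x8u
#define SPEAKER_BACK_LEFT      0x10u
#define SPEAKER_BACK_RIGHT     0x20u

#define KSAUDIO_SPEAKER_STEREO  (SPEAKER_FRONT_LEFT | SPEAKER_FRONT_RIGHT)
#define KSAUDIO_SPEAKER_5POINT1 (KSAUDIO_SPEAKER_STEREO | \
    SPEAKER_FRONT_CENTER | SPEAKER_LOW_FREQUENCY | \
    SPEAKER_BACK_LEFT | SPEAKER_BACK_RIGHT)

/* Resolve speaker positions for `channels` from `mask`, falling back to
 * a per-channel-count default when a multi-channel stream reports no
 * mask (the default_ch_masks[] idea).  Returns the number of positions
 * resolved. */
static size_t
map_positions (unsigned channels, unsigned mask, unsigned *positions)
{
  static const unsigned fallback[] = {
    0,
    SPEAKER_FRONT_CENTER,                               /* mono */
    KSAUDIO_SPEAKER_STEREO,                             /* 2ch */
    KSAUDIO_SPEAKER_STEREO | SPEAKER_LOW_FREQUENCY,     /* 2.1ch */
    KSAUDIO_SPEAKER_STEREO | SPEAKER_BACK_LEFT | SPEAKER_BACK_RIGHT, /* quad */
    0,                           /* 5.0 needs side-speaker bits, omitted */
    KSAUDIO_SPEAKER_5POINT1,                            /* 5.1ch */
  };
  size_t ch = 0;
  unsigned bit;

  if (channels > 2 && mask == 0 && channels < 7)
    mask = fallback[channels];

  /* Walk bits from the LSB, assigning positions in order and ignoring
   * any bits beyond the channel count. */
  for (bit = SPEAKER_FRONT_LEFT; bit <= SPEAKER_BACK_RIGHT && ch < channels;
      bit <<= 1) {
    if (mask & bit)
      positions[ch++] = bit;
  }
  return ch;
}
```

The real function additionally translates each WASAPI bit to a `GstAudioChannelPosition` and builds the 64-bit GStreamer channel mask; channels left unresolved stay `GST_AUDIO_CHANNEL_POSITION_NONE`.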
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/gstwasapi2util.h -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/gstwasapi2util.h
Changed
@@ -23,6 +23,8 @@ #include <gst/gst.h> #include <gst/audio/audio.h> #include <windows.h> +#include <initguid.h> +#include <audioclient.h> G_BEGIN_DECLS @@ -33,6 +35,13 @@ "rate = " GST_AUDIO_RATE_RANGE ", " \ "channels = " GST_AUDIO_CHANNELS_RANGE +#define GST_WASAPI2_CLEAR_COM(obj) G_STMT_START { \ + if (obj) { \ + (obj)->Release (); \ + (obj) = NULL; \ + } \ + } G_STMT_END + gboolean _gst_wasapi2_result (HRESULT hr, GstDebugCategory * cat, const gchar * file, @@ -42,6 +51,20 @@ #define gst_wasapi2_result(result) \ _gst_wasapi2_result (result, GST_CAT_DEFAULT, __FILE__, GST_FUNCTION, __LINE__) +guint64 gst_wasapi2_util_waveformatex_to_channel_mask (WAVEFORMATEX * format, + GstAudioChannelPosition ** out_position); + +const gchar * gst_wasapi2_util_waveformatex_to_audio_format (WAVEFORMATEX * format); + +gboolean gst_wasapi2_util_parse_waveformatex (WAVEFORMATEX * format, + GstCaps * template_caps, + GstCaps ** out_caps, + GstAudioChannelPosition ** out_positions); + +gchar * gst_wasapi2_util_get_error_message (HRESULT hr); + +gboolean gst_wasapi2_can_automatic_stream_routing (void); + G_END_DECLS #endif /* __GST_WASAPI_UTIL_H__ */
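The `gst_wasapi2_can_automatic_stream_routing()` helper declared above (implemented via `RtlGetVersion` in gstwasapi2util.c) gates automatic stream routing on Windows 10 Anniversary Update (version 1607, build 14393) or newer. The comparison on the `OSVERSIONINFOEXW` fields reduces to this pure function:

```c
#include <assert.h>
#include <stdbool.h>

/* Automatic stream routing requires Windows 10 Anniversary Update
 * (version 1607, build 14393) or newer; same comparison as the one
 * applied to the RtlGetVersion result. */
static bool
can_automatic_stream_routing (unsigned long major, unsigned long build)
{
  return major > 10 || (major == 10 && build >= 14393);
}
```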
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/meson.build -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/meson.build
Changed
@@ -4,6 +4,7 @@
   'gstwasapi2util.c',
   'gstwasapi2client.cpp',
   'gstwasapi2device.c',
+  'gstwasapi2ringbuffer.cpp',
   'plugin.c',
 ]
 
@@ -26,8 +27,9 @@
 ksuser_dep = cc.find_library('ksuser', required : get_option('wasapi2'))
 runtimeobject_dep = cc.find_library('runtimeobject', required : get_option('wasapi2'))
 mmdeviceapi_dep = cc.find_library('mmdevapi', required : get_option('wasapi2'))
-wasapi2_dep = [ole32_dep, ksuser_dep, runtimeobject_dep, mmdeviceapi_dep]
-have_symbols = false
+mfplat_dep = cc.find_library('mfplat', required : get_option('wasapi2'))
+wasapi2_dep = [ole32_dep, ksuser_dep, runtimeobject_dep, mmdeviceapi_dep, mfplat_dep]
+extra_args = ['-DGST_USE_UNSTABLE_API']
 
 foreach dep: wasapi2_dep
   if not dep.found()
@@ -71,7 +73,7 @@
                              return 0;
                            } ''',
                            dependencies: wasapi2_dep,
-                           name: 'checking if building winapi-partiion-app')
+                           name: 'building for WINAPI_PARTITION_APP')
 
 if not winapi_app
   if wasapi2_option.enabled()
@@ -81,12 +83,67 @@
   endif
 endif
 
+winapi_desktop = cxx.compiles('''#include <winapifamily.h>
+    #if !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP)
+    #error "not win32"
+    #endif''',
+    name: 'building for WINAPI_PARTITION_DESKTOP')
+
+if winapi_app and not winapi_desktop
+  extra_args += ['-DGST_WASAPI2_WINAPI_ONLY_APP']
+endif
+
+win10_sdk = cxx.compiles('''#include <windows.h>
+    #ifndef WDK_NTDDI_VERSION
+    #error "unknown Windows SDK version"
+    #endif
+    #if (WDK_NTDDI_VERSION < 0x0A000000)
+    #error "Not a Windows 10 SDK"
+    #endif
+    ''',
+    name: 'building with Windows 10 SDK')
+
+if not win10_sdk
+  if wasapi2_option.enabled()
+    error('wasapi2 plugin was enabled explicitly, but Windows 10 SDK is unavailable')
+  else
+    subdir_done()
+  endif
+endif
+
+building_for_win10 = cxx.compiles('''#include <windows.h>
+    #ifndef WINVER
+    #error "unknown minimum supported OS version"
+    #endif
+    #if (WINVER < 0x0A00)
+    #error "Windows 10 API is not guaranteed"
+    #endif
+    ''',
+    name: 'building for Windows 10')
+
+if not building_for_win10
+  message('Bumping target Windows version to Windows 10 for building wasapi2 plugin')
+  extra_args += ['-DWINVER=0x0A00', '-D_WIN32_WINNT=0x0A00', '-DNTDDI_VERSION=WDK_NTDDI_VERSION']
+endif
+
+if not gstwinrt_dep.found()
+  if wasapi2_option.enabled()
+    error('wasapi2 plugin was enabled explicitly, but GstWinRt library is unavailable')
+  else
+    subdir_done()
+  endif
+endif
+
+# Work around for Windows SDK header issue
+# https://docs.microsoft.com/en-us/cpp/build/reference/permissive-standards-conformance?view=msvc-160#windows-header-issues
+extra_cpp_args = cxx.get_supported_arguments(['/Zc:twoPhase-'])
+
 gstwasapi2 = library('gstwasapi2',
   wasapi2_sources,
-  c_args : gst_plugins_bad_args + ['-DCOBJMACROS'],
-  cpp_args : gst_plugins_bad_args,
+  c_args : gst_plugins_bad_args + ['-DCOBJMACROS'] + extra_args,
+  cpp_args : gst_plugins_bad_args + extra_args + extra_cpp_args,
   include_directories : [configinc],
-  dependencies : [gstaudio_dep] + wasapi2_dep,
+  dependencies : [gstaudio_dep, gstwinrt_dep] + wasapi2_dep,
   install : true,
   install_dir : plugins_install_dir)
 
 pkgconfig.generate(gstwasapi2, install_dir : plugins_pkgconfig_install_dir)
View file
gst-plugins-bad-1.18.6.tar.xz/sys/wasapi2/plugin.c -> gst-plugins-bad-1.20.1.tar.xz/sys/wasapi2/plugin.c
Changed
@@ -26,14 +26,23 @@
 #include "gstwasapi2sink.h"
 #include "gstwasapi2src.h"
 #include "gstwasapi2device.h"
+#include "gstwasapi2util.h"
+#include <mfapi.h>
 
 GST_DEBUG_CATEGORY (gst_wasapi2_debug);
 GST_DEBUG_CATEGORY (gst_wasapi2_client_debug);
 
+static void
+plugin_deinit (gpointer data)
+{
+  MFShutdown ();
+}
+
 static gboolean
 plugin_init (GstPlugin * plugin)
 {
-  GstRank rank = GST_RANK_SECONDARY;
+  GstRank rank = GST_RANK_PRIMARY + 1;
+  HRESULT hr;
 
   /**
    * plugin-wasapi2:
@@ -41,10 +50,11 @@
    * Since: 1.18
    */
 
-#if WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_APP) && !WINAPI_FAMILY_PARTITION(WINAPI_PARTITION_DESKTOP)
-  /* If we are building for UWP, wasapi2 plugin should have the highest rank */
-  rank = GST_RANK_PRIMARY + 1;
-#endif
+  hr = MFStartup (MF_VERSION, MFSTARTUP_NOSOCKET);
+  if (!gst_wasapi2_result (hr)) {
+    GST_WARNING ("MFStartup failure, hr: 0x%x", hr);
+    return TRUE;
+  }
 
   GST_DEBUG_CATEGORY_INIT (gst_wasapi2_debug, "wasapi2", 0, "wasapi2");
   GST_DEBUG_CATEGORY_INIT (gst_wasapi2_client_debug, "wasapi2client",
@@ -53,8 +63,11 @@
   gst_element_register (plugin, "wasapi2sink", rank, GST_TYPE_WASAPI2_SINK);
   gst_element_register (plugin, "wasapi2src", rank, GST_TYPE_WASAPI2_SRC);
 
-  gst_device_provider_register (plugin, "wasapi2deviceprovider",
-      rank, GST_TYPE_WASAPI2_DEVICE_PROVIDER);
+  gst_wasapi2_device_provider_register (plugin, rank);
+
+  g_object_set_data_full (G_OBJECT (plugin),
+      "plugin-wasapi2-shutdown", "shutdown-data",
+      (GDestroyNotify) plugin_deinit);
 
   return TRUE;
 }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/winscreencap/dxgicapture.c -> gst-plugins-bad-1.20.1.tar.xz/sys/winscreencap/dxgicapture.c
Changed
@@ -60,6 +60,7 @@
   /*Direct3D pointers */
   ID3D11Device *d3d11_device;
   ID3D11DeviceContext *d3d11_context;
+  IDXGIOutput1 *dxgi_output1;
   IDXGIOutputDuplication *dxgi_dupl;
 
   /* Texture that has been rotated and combined fragments. */
@@ -205,13 +206,67 @@
   return ! !GstD3DCompileFunc;
 }
 
+static GstFlowReturn
+initialize_output_duplication (DxgiCapture * self)
+{
+  HDESK hdesk;
+  HRESULT hr;
+  DXGI_OUTDUPL_DESC old_dupl_desc;
+  GstDXGIScreenCapSrc *src = self->src;
+
+  PTR_RELEASE (self->dxgi_dupl);
+
+  hdesk = OpenInputDesktop (0, FALSE, GENERIC_ALL);
+  if (hdesk) {
+    if (!SetThreadDesktop (hdesk)) {
+      GST_WARNING_OBJECT (src, "SetThreadDesktop() failed. Error code: %lu",
+          GetLastError ());
+    }
+
+    CloseDesktop (hdesk);
+  } else {
+    GST_WARNING_OBJECT (src, "OpenInputDesktop() failed. Error code: %lu",
+        GetLastError ());
+  }
+
+  hr = IDXGIOutput1_DuplicateOutput (self->dxgi_output1,
+      (IUnknown *) (self->d3d11_device), &self->dxgi_dupl);
+  if (hr != S_OK) {
+    gchar *msg = get_hresult_to_string (hr);
+    GST_WARNING_OBJECT (src, "IDXGIOutput1::DuplicateOutput() failed (%x): %s",
+        (guint) hr, msg);
+    g_free (msg);
+    if (hr == E_ACCESSDENIED) {
+      /* Happens temporarily during resolution changes. */
+      return GST_FLOW_OK;
+    }
+    return GST_FLOW_ERROR;
+  }
+
+  old_dupl_desc = self->dupl_desc;
+  IDXGIOutputDuplication_GetDesc (self->dxgi_dupl, &self->dupl_desc);
+
+  if (self->readable_texture &&
+      (self->dupl_desc.ModeDesc.Width != old_dupl_desc.ModeDesc.Width ||
+          self->dupl_desc.ModeDesc.Height != old_dupl_desc.ModeDesc.Height ||
+          self->dupl_desc.Rotation != old_dupl_desc.Rotation)) {
+    PTR_RELEASE (self->readable_texture);
+    PTR_RELEASE (self->work_texture);
+
+    _setup_texture (self);
+
+    return GST_DXGICAP_FLOW_RESOLUTION_CHANGE;
+  }
+
+  return GST_FLOW_OK;
+}
+
 DxgiCapture *
 dxgicap_new (HMONITOR monitor, GstDXGIScreenCapSrc * src)
 {
   int i, j;
   HRESULT hr;
   IDXGIFactory1 *dxgi_factory1 = NULL;
-  IDXGIOutput1 *dxgi_output1 = NULL;
   IDXGIAdapter1 *dxgi_adapter1 = NULL;
   ID3D11InputLayout *vertex_input_layout = NULL;
   ID3DBlob *vertex_shader_blob = NULL;
@@ -227,7 +282,6 @@
   hr = CreateDXGIFactory1 (&IID_IDXGIFactory1, (void **) &dxgi_factory1);
   HR_FAILED_GOTO (hr, CreateDXGIFactory1, new_error);
 
-  dxgi_output1 = NULL;
   for (i = 0;
       IDXGIFactory1_EnumAdapters1 (dxgi_factory1, i,
           &dxgi_adapter1) != DXGI_ERROR_NOT_FOUND; ++i) {
@@ -249,11 +303,11 @@
         DXGI_ERROR_NOT_FOUND; ++j) {
       DXGI_OUTPUT_DESC output_desc;
       hr = IDXGIOutput_QueryInterface (dxgi_output, &IID_IDXGIOutput1,
-          (void **) &dxgi_output1);
+          (void **) &self->dxgi_output1);
       PTR_RELEASE (dxgi_output);
       HR_FAILED_GOTO (hr, IDXGIOutput::QueryInterface, new_error);
 
-      hr = IDXGIOutput1_GetDesc (dxgi_output1, &output_desc);
+      hr = IDXGIOutput1_GetDesc (self->dxgi_output1, &output_desc);
       HR_FAILED_GOTO (hr, IDXGIOutput1::GetDesc, new_error);
 
       if (output_desc.Monitor == monitor) {
@@ -261,13 +315,12 @@
         break;
       }
 
-      PTR_RELEASE (dxgi_output1);
-      dxgi_output1 = NULL;
+      PTR_RELEASE (self->dxgi_output1);
     }
 
     PTR_RELEASE (dxgi_adapter1);
 
-    if (NULL != dxgi_output1) {
+    if (NULL != self->dxgi_output1) {
       break;
     }
 
@@ -275,18 +328,16 @@
     PTR_RELEASE (self->d3d11_context);
   }
 
-  if (NULL == dxgi_output1) {
+  if (NULL == self->dxgi_output1) {
     goto new_error;
   }
 
   PTR_RELEASE (dxgi_factory1);
 
-  hr = IDXGIOutput1_DuplicateOutput (dxgi_output1,
-      (IUnknown *) (self->d3d11_device), &self->dxgi_dupl);
-  PTR_RELEASE (dxgi_output1);
-  HR_FAILED_GOTO (hr, IDXGIOutput1::DuplicateOutput, new_error);
+  if (initialize_output_duplication (self) == GST_FLOW_ERROR) {
+    goto new_error;
+  }
 
-  IDXGIOutputDuplication_GetDesc (self->dxgi_dupl, &self->dupl_desc);
   self->pointer_buffer_capacity = INITIAL_POINTER_BUFFER_CAPACITY;
   self->pointer_buffer = g_malloc (self->pointer_buffer_capacity);
   if (NULL == self->pointer_buffer) {
@@ -388,6 +439,7 @@
   PTR_RELEASE (self->target_view);
   PTR_RELEASE (self->readable_texture);
   PTR_RELEASE (self->work_texture);
+  PTR_RELEASE (self->dxgi_output1);
   PTR_RELEASE (self->dxgi_dupl);
   PTR_RELEASE (self->d3d11_context);
   PTR_RELEASE (self->d3d11_device);
@@ -418,17 +470,24 @@
   PTR_RELEASE (self->work_texture);
 }
 
-gboolean
+GstFlowReturn
 dxgicap_acquire_next_frame (DxgiCapture * self, gboolean show_cursor,
     guint timeout)
 {
-  gboolean ret = FALSE;
+  GstFlowReturn ret = GST_FLOW_ERROR;
   HRESULT hr;
   GstDXGIScreenCapSrc *src = self->src;
 
   DXGI_OUTDUPL_FRAME_INFO frame_info;
   IDXGIResource *desktop_resource = NULL;
 
+  if (!self->dxgi_dupl) {
+    /* Desktop duplication interface became invalid due to desktop switch,
+     * UAC prompt popping up, or similar event. Try to reinitialize. */
+    ret = initialize_output_duplication (self);
+    goto end;
+  }
+
   /* Get the latest desktop frames. */
   hr = IDXGIOutputDuplication_AcquireNextFrame (self->dxgi_dupl,
       timeout, &frame_info, &desktop_resource);
@@ -436,7 +495,13 @@
     /* In case of DXGI_ERROR_WAIT_TIMEOUT,
      * it has not changed from the last time. */
     GST_LOG_OBJECT (src, "DXGI_ERROR_WAIT_TIMEOUT");
-    ret = TRUE;
+    ret = GST_FLOW_OK;
+    goto end;
+  } else if (hr == DXGI_ERROR_ACCESS_LOST) {
+    GST_LOG_OBJECT (src, "DXGI_ERROR_ACCESS_LOST; reinitializing output "
+        "duplication...");
+    PTR_RELEASE (self->dxgi_dupl);
+    ret = GST_FLOW_OK;
     goto end;
   }
   HR_FAILED_GOTO (hr, IDXGIOutputDuplication::AcquireNextFrame, end);
@@ -476,15 +541,17 @@
       }
       HR_FAILED_GOTO (hr, IDXGIOutputDuplication::GetFramePointerShape, end);
       self->pointer_shape_info = pointer_shape_info;
-      ret = TRUE;
+      ret = GST_FLOW_OK;
    } else {
-      ret = TRUE;
+      ret = GST_FLOW_OK;
    }
  } else {
-    ret = TRUE;
+    ret = GST_FLOW_OK;
  }
 
 end:
-  IDXGIOutputDuplication_ReleaseFrame (self->dxgi_dupl);
+  if (self->dxgi_dupl) {
+    IDXGIOutputDuplication_ReleaseFrame (self->dxgi_dupl);
+  }
   PTR_RELEASE (desktop_resource);
   return ret;
 }
View file
gst-plugins-bad-1.18.6.tar.xz/sys/winscreencap/dxgicapture.h -> gst-plugins-bad-1.20.1.tar.xz/sys/winscreencap/dxgicapture.h
Changed
@@ -56,6 +56,7 @@
   } \
 } G_STMT_END
 
+#define GST_DXGICAP_FLOW_RESOLUTION_CHANGE GST_FLOW_CUSTOM_SUCCESS_1
 
 typedef struct _GstDXGIScreenCapSrc GstDXGIScreenCapSrc;
 
@@ -69,7 +70,7 @@
 gboolean dxgicap_start (DxgiCapture * _this);
 void dxgicap_stop (DxgiCapture * _this);
 
-gint dxgicap_acquire_next_frame (DxgiCapture * _this, gboolean show_cursor,
+GstFlowReturn dxgicap_acquire_next_frame (DxgiCapture * _this, gboolean show_cursor,
     guint timeout);
 gboolean dxgicap_copy_buffer (DxgiCapture * _this, gboolean show_cursor,
     LPRECT src_rect, GstVideoInfo * video_info, GstBuffer * buf);
View file
gst-plugins-bad-1.18.6.tar.xz/sys/winscreencap/gstdxgiscreencapsrc.c -> gst-plugins-bad-1.20.1.tar.xz/sys/winscreencap/gstdxgiscreencapsrc.c
Changed
@@ -126,9 +126,8 @@
 static gboolean gst_dxgi_screen_cap_src_unlock (GstBaseSrc * bsrc);
 
-static GstFlowReturn gst_dxgi_screen_cap_src_fill (GstPushSrc * pushsrc,
-    GstBuffer * buffer);
-
+static GstFlowReturn gst_dxgi_screen_cap_src_create (GstBaseSrc * pushsrc,
+    guint64 offset, guint length, GstBuffer ** buffer);
 
 static HMONITOR _get_hmonitor (GstDXGIScreenCapSrc * src);
 
@@ -139,12 +138,10 @@
   GObjectClass *go_class;
   GstElementClass *e_class;
   GstBaseSrcClass *bs_class;
-  GstPushSrcClass *ps_class;
 
   go_class = G_OBJECT_CLASS (klass);
   e_class = GST_ELEMENT_CLASS (klass);
   bs_class = GST_BASE_SRC_CLASS (klass);
-  ps_class = GST_PUSH_SRC_CLASS (klass);
 
   go_class->set_property = gst_dxgi_screen_cap_src_set_property;
   go_class->get_property = gst_dxgi_screen_cap_src_get_property;
@@ -156,7 +153,7 @@
   bs_class->stop = GST_DEBUG_FUNCPTR (gst_dxgi_screen_cap_src_stop);
   bs_class->unlock = GST_DEBUG_FUNCPTR (gst_dxgi_screen_cap_src_unlock);
   bs_class->fixate = GST_DEBUG_FUNCPTR (gst_dxgi_screen_cap_src_fixate);
-  ps_class->fill = GST_DEBUG_FUNCPTR (gst_dxgi_screen_cap_src_fill);
+  bs_class->create = GST_DEBUG_FUNCPTR (gst_dxgi_screen_cap_src_create);
 
   g_object_class_install_property (go_class, PROP_MONITOR,
       g_param_spec_int ("monitor", "Monitor",
@@ -442,12 +439,15 @@
 }
 
 static GstFlowReturn
-gst_dxgi_screen_cap_src_fill (GstPushSrc * push_src, GstBuffer * buf)
+gst_dxgi_screen_cap_src_create (GstBaseSrc * base_src, guint64 offset,
+    guint length, GstBuffer ** buf)
 {
-  GstDXGIScreenCapSrc *src = GST_DXGI_SCREEN_CAP_SRC (push_src);
+  GstDXGIScreenCapSrc *src = GST_DXGI_SCREEN_CAP_SRC (base_src);
   GstClock *clock;
   GstClockTime buf_time, buf_dur;
   guint64 frame_number;
+  GstFlowReturn ret;
+  GstBuffer *buffer = NULL;
 
   if (G_UNLIKELY (!src->dxgi_capture)) {
     GST_DEBUG_OBJECT (src, "format wasn't negotiated before create function");
@@ -534,15 +534,40 @@
   }
 
   /* Get the latest desktop frame. */
-  if (dxgicap_acquire_next_frame (src->dxgi_capture, src->show_cursor, 0)) {
-    /* Copy the latest desktop frame to the video frame. */
-    if (dxgicap_copy_buffer (src->dxgi_capture, src->show_cursor,
-            &src->src_rect, &src->video_info, buf)) {
-      GST_BUFFER_TIMESTAMP (buf) = buf_time;
-      GST_BUFFER_DURATION (buf) = buf_dur;
-      return GST_FLOW_OK;
+  do {
+    ret = dxgicap_acquire_next_frame (src->dxgi_capture, src->show_cursor, 0);
+    if (ret == GST_DXGICAP_FLOW_RESOLUTION_CHANGE) {
+      GST_DEBUG_OBJECT (src, "Resolution change detected.");
+
+      if (!gst_base_src_negotiate (GST_BASE_SRC (src))) {
+        return GST_FLOW_NOT_NEGOTIATED;
+      }
     }
+  } while (ret == GST_DXGICAP_FLOW_RESOLUTION_CHANGE);
+
+  if (ret != GST_FLOW_OK) {
+    return ret;
+  }
+
+  ret =
+      GST_BASE_SRC_CLASS (g_type_class_peek_parent (parent_class))->alloc
+      (base_src, offset, length, &buffer);
+  if (ret != GST_FLOW_OK) {
+    return ret;
   }
+
+  /* Copy the latest desktop frame to the video frame. */
+  if (dxgicap_copy_buffer (src->dxgi_capture, src->show_cursor,
+          &src->src_rect, &src->video_info, buffer)) {
+    GST_BUFFER_TIMESTAMP (buffer) = buf_time;
+    GST_BUFFER_DURATION (buffer) = buf_dur;
+    *buf = buffer;
+
+    return GST_FLOW_OK;
+  }
+
+  gst_clear_buffer (&buffer);
+  return GST_FLOW_ERROR;
 }
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/aesdec.c
Added
@@ -0,0 +1,181 @@ +/* GStreamer + * + * Copyright (C) 2021 Aaron Boxer <aaron.boxer@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif +#include <gst/gst.h> +#include <gst/check/gstcheck.h> +#include <gst/check/gstharness.h> + +unsigned char plain16[] = { + 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, + 0x08, 0x09, 0x0A, 0x0B, 0x0C, 0x0D, 0x0E, 0x0F +}; + +unsigned char enc16[] = { + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0x40, 0xee, 0xfd, 0xcb, 0x3b, 0xbe, 0xf3, 0x0b, + 0xa7, 0xaf, 0x5e, 0x20, 0x87, 0x78, 0x8a, 0x45 +}; + +unsigned char enc16_serialize[] = { + 0xe9, 0xaa, 0x8e, 0x83, 0x4d, 0x8d, 0x70, 0xb7, + 0xe0, 0xd2, 0x54, 0xff, 0x67, 0x0d, 0xd7, 0x18, + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0x40, 0xee, 0xfd, 0xcb, 0x3b, 0xbe, 0xf3, 0x0b, + 0xa7, 0xaf, 0x5e, 0x20, 0x87, 0x78, 0x8a, 0x45 +}; + +unsigned char plain17[] = { + 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, + 0x08, 0x09, 0x0A, 0x0B, 0x0C, 0x0D, 0x0E, 0x0F, + 0x10 +}; + +unsigned char enc17[] = { + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0xe1, 0xe0, 
0xaa, 0xf4, 0xe8, 0x29, 0x7c, 0x9f, + 0xc4, 0xe3, 0x11, 0x4a, 0x97, 0x58, 0x9c, 0xa5 +}; + +unsigned char enc17_serialize[] = { + 0xe9, 0xaa, 0x8e, 0x83, 0x4d, 0x8d, 0x70, 0xb7, + 0xe0, 0xd2, 0x54, 0xff, 0x67, 0x0d, 0xd7, 0x18, + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0xe1, 0xe0, 0xaa, 0xf4, 0xe8, 0x29, 0x7c, 0x9f, + 0xc4, 0xe3, 0x11, 0x4a, 0x97, 0x58, 0x9c, 0xa5 +}; + +static void +run (gboolean per_buffer_padding, + gboolean serialize_iv, + guchar * in_ref, gsize in_ref_len, guchar * out_ref, gsize out_ref_len) +{ + + GstHarness *h; + GstBuffer *buf, *outbuf; + gint out_ref_len_truncated = out_ref_len & ~0xF; + + h = gst_harness_new ("aesdec"); + gst_harness_set_src_caps_str (h, "video/x-raw"); + + g_object_set (h->element, + "key", "1f9423681beb9a79215820f6bda73d0f", + "iv", "e9aa8e834d8d70b7e0d254ff670dd718", + "per-buffer-padding", per_buffer_padding, + "serialize-iv", serialize_iv, NULL); + + buf = gst_buffer_new_and_alloc (in_ref_len); + gst_buffer_fill (buf, 0, in_ref, in_ref_len); + outbuf = gst_harness_push_and_pull (h, gst_buffer_ref (buf)); + + fail_unless (gst_buffer_memcmp (outbuf, 0, out_ref, + out_ref_len_truncated) == 0); + + gst_buffer_unref (outbuf); + gst_buffer_unref (buf); + + if (!per_buffer_padding) { + gint final_bytes = out_ref_len & 0xF; + + if (final_bytes != 0) { + gst_harness_push_event (h, gst_event_new_eos ()); + outbuf = gst_harness_try_pull (h); + fail_unless (outbuf); + fail_unless (gst_buffer_get_size (outbuf) == final_bytes); + fail_unless (gst_buffer_memcmp (outbuf, 0, + out_ref + out_ref_len_truncated, final_bytes) == 0); + gst_buffer_unref (outbuf); + } + } + + gst_harness_teardown (h); +} + +GST_START_TEST (text16) +{ + run (TRUE, FALSE, enc16, sizeof (enc16), plain16, sizeof (plain16)); +} + +GST_END_TEST; + +GST_START_TEST (text16_serialize) +{ + run (TRUE, TRUE, enc16_serialize, sizeof (enc16_serialize), plain16, + sizeof (plain16)); +} + +GST_END_TEST; + 
+GST_START_TEST (text16_serialize_no_per_buffer_padding) +{ + run (FALSE, TRUE, enc16_serialize, sizeof (enc16_serialize), plain16, + sizeof (plain16)); +} + +GST_END_TEST; + + +GST_START_TEST (text17) +{ + run (TRUE, FALSE, enc17, sizeof (enc17), plain17, sizeof (plain17)); +} + +GST_END_TEST; + +GST_START_TEST (text17_serialize) +{ + run (TRUE, TRUE, enc17_serialize, sizeof (enc17_serialize), plain17, + sizeof (plain17)); +} + +GST_END_TEST; + + +GST_START_TEST (text17_serialize_no_per_buffer_padding) +{ + run (FALSE, TRUE, enc17_serialize, sizeof (enc17_serialize), plain17, + sizeof (plain17)); +} + +GST_END_TEST; + +static Suite * +aesdec_suite (void) +{ + Suite *s = suite_create ("aesdec"); + TCase *tc = tcase_create ("general"); + + suite_add_tcase (s, tc); + tcase_add_test (tc, text16); + tcase_add_test (tc, text16_serialize); + tcase_add_test (tc, text16_serialize_no_per_buffer_padding); + tcase_add_test (tc, text17); + tcase_add_test (tc, text17_serialize); + tcase_add_test (tc, text17_serialize_no_per_buffer_padding); + return s; +} + +GST_CHECK_MAIN (aesdec);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/aesenc.c
Added
@@ -0,0 +1,179 @@ +/* GStreamer + * + * Copyright (C) 2021 Aaron Boxer <aaron.boxer@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif +#include <gst/gst.h> +#include <gst/check/gstcheck.h> +#include <gst/check/gstharness.h> + +unsigned char plain16[] = { + 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, + 0x08, 0x09, 0x0A, 0x0B, 0x0C, 0x0D, 0x0E, 0x0F +}; + +unsigned char enc16[] = { + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0x40, 0xee, 0xfd, 0xcb, 0x3b, 0xbe, 0xf3, 0x0b, + 0xa7, 0xaf, 0x5e, 0x20, 0x87, 0x78, 0x8a, 0x45 +}; + +unsigned char enc16_serialize[] = { + 0xe9, 0xaa, 0x8e, 0x83, 0x4d, 0x8d, 0x70, 0xb7, + 0xe0, 0xd2, 0x54, 0xff, 0x67, 0x0d, 0xd7, 0x18, + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0x40, 0xee, 0xfd, 0xcb, 0x3b, 0xbe, 0xf3, 0x0b, + 0xa7, 0xaf, 0x5e, 0x20, 0x87, 0x78, 0x8a, 0x45 +}; + +unsigned char enc16_serialize_no_per_buffer_padding[] = { + 0xe9, 0xaa, 0x8e, 0x83, 0x4d, 0x8d, 0x70, 0xb7, + 0xe0, 0xd2, 0x54, 0xff, 0x67, 0x0d, 0xd7, 0x18, + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27 +}; + + +unsigned char 
plain17[] = { + 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, + 0x08, 0x09, 0x0A, 0x0B, 0x0C, 0x0D, 0x0E, 0x0F, + 0x10 +}; + +unsigned char enc17[] = { + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0xe1, 0xe0, 0xaa, 0xf4, 0xe8, 0x29, 0x7c, 0x9f, + 0xc4, 0xe3, 0x11, 0x4a, 0x97, 0x58, 0x9c, 0xa5 +}; + +unsigned char enc17_serialize[] = { + 0xe9, 0xaa, 0x8e, 0x83, 0x4d, 0x8d, 0x70, 0xb7, + 0xe0, 0xd2, 0x54, 0xff, 0x67, 0x0d, 0xd7, 0x18, + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, + 0xe1, 0xe0, 0xaa, 0xf4, 0xe8, 0x29, 0x7c, 0x9f, + 0xc4, 0xe3, 0x11, 0x4a, 0x97, 0x58, 0x9c, 0xa5 +}; + +unsigned char enc17_serialize_no_per_buffer_padding[] = { + 0xe9, 0xaa, 0x8e, 0x83, 0x4d, 0x8d, 0x70, 0xb7, + 0xe0, 0xd2, 0x54, 0xff, 0x67, 0x0d, 0xd7, 0x18, + 0xfc, 0x49, 0x14, 0xc6, 0xee, 0x06, 0xe1, 0xb1, + 0xc7, 0xa2, 0x3a, 0x05, 0x13, 0x15, 0x29, 0x27, +}; + +static void +run (gboolean per_buffer_padding, + gboolean serialize_iv, + guchar * in_ref, gsize in_ref_len, guchar * out_ref, gsize out_ref_len) +{ + GstHarness *h; + GstBuffer *buf, *outbuf; + + h = gst_harness_new ("aesenc"); + gst_harness_set_src_caps_str (h, "video/x-raw"); + + g_object_set (h->element, + "key", "1f9423681beb9a79215820f6bda73d0f", + "iv", "e9aa8e834d8d70b7e0d254ff670dd718", + "per-buffer-padding", per_buffer_padding, + "serialize-iv", serialize_iv, NULL); + + buf = gst_buffer_new_and_alloc (in_ref_len); + gst_buffer_fill (buf, 0, in_ref, in_ref_len); + outbuf = gst_harness_push_and_pull (h, gst_buffer_ref (buf)); + + fail_unless (gst_buffer_memcmp (outbuf, 0, out_ref, out_ref_len) == 0); + + gst_buffer_unref (outbuf); + gst_buffer_unref (buf); + + gst_harness_teardown (h); +} + +GST_START_TEST (text16) +{ + run (TRUE, FALSE, plain16, sizeof (plain16), enc16, sizeof (enc16)); +} + +GST_END_TEST; + +GST_START_TEST (text16_serialize) +{ + run (TRUE, TRUE, plain16, sizeof (plain16), enc16_serialize, 
+ sizeof (enc16_serialize)); +} + +GST_END_TEST; + +GST_START_TEST (text16_serialize_no_per_buffer_padding) +{ + run (FALSE, TRUE, plain16, sizeof (plain16), + enc16_serialize_no_per_buffer_padding, + sizeof (enc16_serialize_no_per_buffer_padding)); +} + +GST_END_TEST; + +GST_START_TEST (text17) +{ + run (TRUE, FALSE, plain17, sizeof (plain17), enc17, sizeof (enc17)); +} + +GST_END_TEST; + +GST_START_TEST (text17_serialize) +{ + run (TRUE, TRUE, plain17, sizeof (plain17), enc17_serialize, + sizeof (enc17_serialize)); +} + +GST_END_TEST; + +GST_START_TEST (text17_serialize_no_per_buffer_padding) +{ + run (FALSE, TRUE, plain17, sizeof (plain17), + enc17_serialize_no_per_buffer_padding, + sizeof (enc17_serialize_no_per_buffer_padding)); +} + +GST_END_TEST; + +static Suite * +aesenc_suite (void) +{ + Suite *s = suite_create ("aesenc"); + TCase *tc = tcase_create ("general"); + + suite_add_tcase (s, tc); + tcase_add_test (tc, text16); + tcase_add_test (tc, text16_serialize); + tcase_add_test (tc, text16_serialize_no_per_buffer_padding); + tcase_add_test (tc, text17); + tcase_add_test (tc, text17_serialize); + tcase_add_test (tc, text17_serialize_no_per_buffer_padding); + return s; +} + +GST_CHECK_MAIN (aesenc);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/asfmux.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/asfmux.c
Changed
@@ -66,7 +66,7 @@
   ASSERT_OBJECT_REFCOUNT (srcpad, "srcpad", 1);
 
   if (!(sinkpad = gst_element_get_static_pad (element, sinkname)))
-    sinkpad = gst_element_get_request_pad (element, sinkname);
+    sinkpad = gst_element_request_pad_simple (element, sinkname);
   fail_if (sinkpad == NULL, "Could not get sink pad from %s",
       GST_ELEMENT_NAME (element));
   /* references are owned by: 1) us, 2) asfmux, 3) collect pads */
@@ -90,7 +90,7 @@
   /* clean up floating src pad */
   padname = g_strdup_printf (sinkname, 1);
   if (!(sinkpad = gst_element_get_static_pad (element, padname)))
-    sinkpad = gst_element_get_request_pad (element, padname);
+    sinkpad = gst_element_request_pad_simple (element, padname);
   g_free (padname);
 
   fail_if (sinkpad == NULL, "sinkpad is null");
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/av1parse.c
Added
@@ -0,0 +1,378 @@ +/* + * GStreamer + * + * unit test for h265parse + * + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/check/gstcheck.h> +#include <gst/check/gstharness.h> +#include "av1parse.h" +#include <string.h> + +static void +check_caps_event (GstHarness * h) +{ + GstEvent *event; + GstCaps *caps = NULL; + GstStructure *s; + const gchar *profile; + gint width, height; + guint depth; + + while ((event = gst_harness_try_pull_event (h))) { + GstCaps *event_caps; + if (GST_EVENT_TYPE (event) != GST_EVENT_CAPS) { + gst_event_unref (event); + continue; + } + + gst_event_parse_caps (event, &event_caps); + gst_caps_replace (&caps, event_caps); + gst_event_unref (event); + } + + fail_unless (caps != NULL); + s = gst_caps_get_structure (caps, 0); + fail_unless (gst_structure_get_int (s, "width", &width)); + fail_unless (gst_structure_get_int (s, "height", &height)); + fail_unless ((profile = gst_structure_get_string (s, "profile"))); + fail_unless (gst_structure_get_uint (s, "bit-depth-chroma", &depth)); + + fail_unless_equals_int (width, 400); + fail_unless_equals_int (height, 300); + fail_unless_equals_int 
(depth, 8); + fail_unless_equals_string (profile, "main"); + gst_caps_unref (caps); +} + +GST_START_TEST (test_byte_to_frame) +{ + GstHarness *h; + GstBuffer *in_buf, *out_buf = NULL; + GstMapInfo map; + GstFlowReturn ret; + gint i = 0; + guint offset; + guint len; + guint output_buf_num; + + h = gst_harness_new_parse ("av1parse"); + fail_unless (h != NULL); + + gst_harness_set_sink_caps_str (h, "video/x-av1,parsed=(boolean)true," + "alignment=(string)frame,stream-format=(string)obu-stream"); + gst_harness_set_src_caps_str (h, "video/x-av1"); + + gst_harness_play (h); + + output_buf_num = 0; + offset = 0; + len = stream_no_annexb_av1_len / 5; + for (i = 0; i < 5; i++) { + if (i == 4) + len = stream_no_annexb_av1_len - offset; + + in_buf = gst_buffer_new_and_alloc (len); + gst_buffer_map (in_buf, &map, GST_MAP_WRITE); + memcpy (map.data, stream_no_annexb_av1 + offset, len); + gst_buffer_unmap (in_buf, &map); + offset += len; + + ret = gst_harness_push (h, in_buf); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + + gst_clear_buffer (&out_buf); + while ((out_buf = gst_harness_try_pull (h)) != NULL) { + if (output_buf_num == 0) + check_caps_event (h); + + fail_unless (gst_buffer_get_size (out_buf) == + stream_av1_frame_size[output_buf_num]); + + gst_clear_buffer (&out_buf); + output_buf_num++; + } + } + + fail_unless (output_buf_num == 14); + + gst_harness_teardown (h); + +} + +GST_END_TEST; + +GST_START_TEST (test_byte_to_annexb) +{ + GstHarness *h; + GstBuffer *in_buf, *out_buf = NULL; + GstMapInfo map; + GstFlowReturn ret; + gint i = 0; + guint offset; + guint len; + guint output_buf_num; + + h = gst_harness_new_parse ("av1parse"); + fail_unless (h != NULL); + + gst_harness_set_sink_caps_str (h, "video/x-av1,parsed=(boolean)true," + "alignment=(string)tu,stream-format=(string)annexb"); + gst_harness_set_src_caps_str (h, "video/x-av1,alignment=(string)byte"); + + gst_harness_play (h); + + output_buf_num = 0; + offset = 0; + len 
= stream_no_annexb_av1_len / 5; + for (i = 0; i < 5; i++) { + if (i == 4) + len = stream_no_annexb_av1_len - offset; + + in_buf = gst_buffer_new_and_alloc (len); + gst_buffer_map (in_buf, &map, GST_MAP_WRITE); + memcpy (map.data, stream_no_annexb_av1 + offset, len); + gst_buffer_unmap (in_buf, &map); + offset += len; + + ret = gst_harness_push (h, in_buf); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + + gst_clear_buffer (&out_buf); + while ((out_buf = gst_harness_try_pull (h)) != NULL) { + if (output_buf_num == 0) + check_caps_event (h); + + fail_unless (gst_buffer_get_size (out_buf) == + stream_annexb_av1_tu_len[output_buf_num]); + + gst_clear_buffer (&out_buf); + output_buf_num++; + } + } + + /* The last TU need EOS */ + fail_unless (gst_harness_push_event (h, gst_event_new_eos ())); + out_buf = gst_harness_try_pull (h); + fail_unless (out_buf); + fail_unless (gst_buffer_get_size (out_buf) == + stream_annexb_av1_tu_len[output_buf_num]); + output_buf_num++; + gst_clear_buffer (&out_buf); + + fail_unless (output_buf_num == 10); + + gst_harness_teardown (h); +} + +GST_END_TEST; + +GST_START_TEST (test_annexb_to_frame) +{ + GstHarness *h; + GstBuffer *in_buf, *out_buf = NULL; + GstMapInfo map; + GstFlowReturn ret; + gint i = 0; + guint offset; + guint output_buf_num; + + h = gst_harness_new_parse ("av1parse"); + fail_unless (h != NULL); + + gst_harness_set_sink_caps_str (h, "video/x-av1,parsed=(boolean)true," + "alignment=(string)frame,stream-format=(string)obu-stream"); + gst_harness_set_src_caps_str (h, "video/x-av1,alignment=(string)tu," + "stream-format=(string)annexb"); + + gst_harness_play (h); + + output_buf_num = 0; + offset = 0; + for (i = 0; i < G_N_ELEMENTS (stream_annexb_av1_tu_len); i++) { + in_buf = gst_buffer_new_and_alloc (stream_annexb_av1_tu_len[i]); + gst_buffer_map (in_buf, &map, GST_MAP_WRITE); + memcpy (map.data, stream_annexb_av1 + offset, stream_annexb_av1_tu_len[i]); + gst_buffer_unmap (in_buf, 
&map); + offset += stream_annexb_av1_tu_len[i]; + + ret = gst_harness_push (h, in_buf); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + + gst_clear_buffer (&out_buf); + while ((out_buf = gst_harness_try_pull (h)) != NULL) { + if (output_buf_num == 0) + check_caps_event (h); + + fail_unless (gst_buffer_get_size (out_buf) == + stream_av1_frame_size[output_buf_num]); + + gst_clear_buffer (&out_buf); + output_buf_num++; + } + } + + fail_unless (output_buf_num == 14); + + gst_harness_teardown (h); +} + +GST_END_TEST; + +GST_START_TEST (test_annexb_to_obu) +{ + GstHarness *h; + GstBuffer *in_buf, *out_buf = NULL; + GstMapInfo map; + GstFlowReturn ret; + gint i = 0; + guint offset; + guint output_buf_num; + + h = gst_harness_new_parse ("av1parse"); + fail_unless (h != NULL); + + gst_harness_set_sink_caps_str (h, "video/x-av1,parsed=(boolean)true," + "alignment=(string)obu"); + gst_harness_set_src_caps_str (h, "video/x-av1,alignment=(string)tu," + "stream-format=(string)annexb"); + + gst_harness_play (h); + + output_buf_num = 0; + offset = 0; + for (i = 0; i < G_N_ELEMENTS (stream_annexb_av1_tu_len); i++) { + in_buf = gst_buffer_new_and_alloc (stream_annexb_av1_tu_len[i]); + gst_buffer_map (in_buf, &map, GST_MAP_WRITE); + memcpy (map.data, stream_annexb_av1 + offset, stream_annexb_av1_tu_len[i]); + gst_buffer_unmap (in_buf, &map); + offset += stream_annexb_av1_tu_len[i]; + + ret = gst_harness_push (h, in_buf); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + + gst_clear_buffer (&out_buf); + while ((out_buf = gst_harness_try_pull (h)) != NULL) { + if (output_buf_num == 0) + check_caps_event (h); + + fail_unless (gst_buffer_get_size (out_buf) == + stream_av1_obu_size[output_buf_num]); + + gst_clear_buffer (&out_buf); + output_buf_num++; + } + } + + fail_unless (output_buf_num == G_N_ELEMENTS (stream_av1_obu_size)); + + gst_harness_teardown (h); +} + +GST_END_TEST; + +GST_START_TEST 
(test_byte_to_obu) +{ + GstHarness *h; + GstBuffer *in_buf, *out_buf = NULL; + GstMapInfo map; + GstFlowReturn ret; + gint i = 0; + guint offset; + guint len; + guint output_buf_num; + + h = gst_harness_new_parse ("av1parse"); + fail_unless (h != NULL); + + gst_harness_set_sink_caps_str (h, "video/x-av1,parsed=(boolean)true," + "alignment=(string)obu,stream-format=(string)obu-stream"); + gst_harness_set_src_caps_str (h, "video/x-av1"); + + gst_harness_play (h); + + output_buf_num = 0; + offset = 0; + len = stream_no_annexb_av1_len / 5; + for (i = 0; i < 5; i++) { + if (i == 4) + len = stream_no_annexb_av1_len - offset; + + in_buf = gst_buffer_new_and_alloc (len); + gst_buffer_map (in_buf, &map, GST_MAP_WRITE); + memcpy (map.data, stream_no_annexb_av1 + offset, len); + gst_buffer_unmap (in_buf, &map); + offset += len; + + ret = gst_harness_push (h, in_buf); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + + gst_clear_buffer (&out_buf); + while ((out_buf = gst_harness_try_pull (h)) != NULL) { + if (output_buf_num == 0) + check_caps_event (h); + + fail_unless (gst_buffer_get_size (out_buf) == + stream_av1_obu_size[output_buf_num]); + + gst_clear_buffer (&out_buf); + output_buf_num++; + } + } + + fail_unless (output_buf_num == G_N_ELEMENTS (stream_av1_obu_size)); + + gst_harness_teardown (h); + +} + +GST_END_TEST; + +static Suite * +av1parse_suite (void) +{ + Suite *s; + TCase *tc_chain; + + s = suite_create ("av1parse"); + tc_chain = tcase_create ("general"); + + suite_add_tcase (s, tc_chain); + + tcase_add_test (tc_chain, test_byte_to_frame); + tcase_add_test (tc_chain, test_byte_to_annexb); + tcase_add_test (tc_chain, test_annexb_to_frame); + tcase_add_test (tc_chain, test_annexb_to_obu); + tcase_add_test (tc_chain, test_byte_to_obu); + + return s; +} + +GST_CHECK_MAIN (av1parse);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/av1parse.h
Added
@@ -0,0 +1,1799 @@ +/* GStreamer + * Copyright (C) 2020 Intel Corporation + * Author: He Junyan <junyan.he@intel.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include <gst/gst.h> + +static guint8 stream_annexb_av1[] = { + 0xd0, 0x2a, 0xce, 0x2a, 0x01, 0x10, 0x0c, 0x08, 0x00, 0x00, 0x00, 0x04, + 0x46, 0x3e, 0x56, 0xff, 0xfc, 0xc0, 0x20, 0xbd, 0x2a, 0x30, 0x14, 0x00, + 0xa6, 0x40, 0x10, 0xf3, 0x47, 0x52, 0x5f, 0xe1, 0xe0, 0x0e, 0x22, 0xf0, + 0x95, 0xe2, 0x9c, 0x49, 0x32, 0xd9, 0x73, 0xfb, 0x89, 0x0e, 0x9f, 0xe2, + 0xd2, 0x80, 0x2d, 0xaf, 0x80, 0x0d, 0xa6, 0xae, 0x2b, 0x72, 0x90, 0xcc, + 0x79, 0x15, 0x18, 0x61, 0xda, 0x0a, 0xe3, 0xff, 0x5d, 0x14, 0x2b, 0x2f, + 0x0e, 0x08, 0x0b, 0x08, 0x09, 0xb2, 0xa8, 0x72, 0x7a, 0x8d, 0x97, 0x14, + 0x53, 0x7a, 0xdd, 0x64, 0xf1, 0x33, 0x42, 0x65, 0x7c, 0x01, 0x14, 0xd0, + 0x9a, 0x59, 0x0b, 0xab, 0x92, 0x59, 0xbd, 0xef, 0xa2, 0x3c, 0xaf, 0xc5, + 0x94, 0x28, 0x0a, 0x59, 0xd1, 0x5b, 0xd8, 0x0c, 0x4c, 0x4c, 0xeb, 0x4e, + 0x0c, 0x3f, 0x9b, 0xcf, 0x11, 0xfb, 0xdf, 0x7d, 0x28, 0xc7, 0x80, 0xfd, + 0x23, 0x2c, 0x44, 0xa7, 0x69, 0xd4, 0xa2, 0x25, 0xeb, 0x3a, 0x10, 0xdd, + 0x7b, 0x95, 0x06, 0xf1, 0xe9, 0x89, 0x04, 0xd5, 0x5a, 0x33, 0x76, 0xc0, + 0x44, 0x0d, 0x0a, 0x0c, 0x4d, 0xd3, 0xfb, 0x06, 0xb0, 0x4d, 0xdd, 0x39, + 0x5b, 0xf7, 
0xcc, 0x51, 0x5f, 0x54, 0x6d, 0x5e, 0xbf, 0x62, 0x50, 0xd8, + 0xaa, 0x4d, 0x72, 0x56, 0x12, 0x28, 0x37, 0xae, 0x66, 0x3e, 0x1e, 0x66, + 0x62, 0x6b, 0xd9, 0x74, 0xf7, 0xc3, 0x1b, 0xc2, 0x9b, 0xd3, 0xb7, 0xff, + 0xfc, 0xcc, 0xf4, 0x6b, 0x39, 0x81, 0xd9, 0xf0, 0x89, 0x4d, 0xed, 0xf5, + 0xc7, 0xd1, 0x64, 0x12, 0x7c, 0x7b, 0xe9, 0x98, 0x84, 0x73, 0x42, 0xe9, + 0x4e, 0xbe, 0xa9, 0xc4, 0x6f, 0x4c, 0xce, 0xd5, 0x01, 0x10, 0x45, 0x6f, + 0x18, 0x59, 0x99, 0xdf, 0x0f, 0x37, 0x36, 0xae, 0xa0, 0x62, 0x5f, 0xd8, + 0x5f, 0x53, 0x2a, 0x4b, 0xa2, 0x6c, 0x30, 0xa2, 0x4d, 0xd2, 0x93, 0x66, + 0x5c, 0x04, 0x53, 0xe1, 0x02, 0xe9, 0xaa, 0x3d, 0xc7, 0x21, 0x9a, 0x0e, + 0x93, 0xa6, 0xfe, 0x7b, 0x19, 0x4f, 0x6d, 0x35, 0xa8, 0xbf, 0xed, 0x9b, + 0x36, 0xe3, 0xcf, 0xc7, 0x2d, 0x9f, 0xcd, 0xb9, 0x49, 0xe2, 0x6e, 0x9f, + 0x92, 0xce, 0xea, 0x7c, 0x65, 0x50, 0xd1, 0xaf, 0x4e, 0xfe, 0x03, 0x2e, + 0x57, 0x75, 0x49, 0x6f, 0x1e, 0xf1, 0x5c, 0x65, 0x59, 0xcb, 0x36, 0x23, + 0xc2, 0xad, 0xcb, 0xb2, 0x89, 0x53, 0xf9, 0xd2, 0xcd, 0xab, 0x20, 0x4d, + 0xc5, 0x98, 0x74, 0x48, 0x4f, 0xbe, 0xd3, 0x6c, 0x01, 0xa2, 0xee, 0xab, + 0xf0, 0xf4, 0xf0, 0x17, 0xc8, 0xd6, 0xa9, 0x91, 0xbd, 0x11, 0x4f, 0xca, + 0x3b, 0x06, 0x5a, 0x58, 0xe9, 0x96, 0xa0, 0x30, 0xe9, 0x39, 0x57, 0xa8, + 0x65, 0xae, 0x9c, 0x87, 0xed, 0x2e, 0x8e, 0xbf, 0x8e, 0xcf, 0x73, 0x16, + 0x08, 0xe2, 0xfe, 0x63, 0x78, 0x73, 0xe3, 0xd0, 0x10, 0x14, 0x81, 0xf3, + 0x33, 0xcf, 0x69, 0x2e, 0x94, 0xb5, 0x77, 0x00, 0x50, 0x90, 0xf9, 0x4c, + 0xb4, 0x22, 0xfa, 0x60, 0xf4, 0xe7, 0xae, 0xbc, 0x25, 0x44, 0x82, 0x9d, + 0x19, 0xee, 0xa9, 0x5c, 0x5c, 0x93, 0x62, 0x70, 0x37, 0x04, 0xa8, 0x4f, + 0x2f, 0x4d, 0xbb, 0xbb, 0xe0, 0x8e, 0x97, 0x39, 0x08, 0xe1, 0x70, 0xb4, + 0x9b, 0x66, 0x2a, 0x14, 0x3d, 0x44, 0x88, 0x6c, 0x8a, 0x0d, 0x02, 0xfe, + 0xf3, 0x88, 0xc6, 0x75, 0x28, 0x3f, 0x67, 0xee, 0xc3, 0xb7, 0xdc, 0x7a, + 0x74, 0xd1, 0xfd, 0xdf, 0x0d, 0xee, 0xc8, 0xcd, 0xba, 0xf2, 0x16, 0x06, + 0x71, 0x73, 0x1e, 0xf7, 0x81, 0x21, 0x92, 0x75, 0x91, 0x44, 0x39, 0x04, + 0x57, 0xa0, 
0xe0, 0xac, 0x26, 0x68, 0x50, 0x53, 0x3d, 0xe9, 0xb3, 0x98, + 0xd4, 0xa0, 0xf3, 0x31, 0x9e, 0xd4, 0x53, 0x2f, 0x12, 0xdc, 0x3c, 0x1d, + 0x2a, 0x65, 0xb9, 0x26, 0xc0, 0x15, 0xa9, 0x37, 0x1f, 0x6d, 0x61, 0xf7, + 0x10, 0xdd, 0x18, 0xfa, 0x22, 0x8e, 0x00, 0xad, 0x9c, 0x39, 0xeb, 0x9d, + 0xce, 0x43, 0xcf, 0x97, 0xb0, 0x49, 0x53, 0xf0, 0xf4, 0x59, 0xd8, 0x68, + 0xce, 0xa4, 0xc5, 0xad, 0x5c, 0xd1, 0x10, 0xcf, 0x95, 0x7f, 0x61, 0x81, + 0xd5, 0x4c, 0xb5, 0x10, 0x80, 0x52, 0x0c, 0x53, 0x46, 0xa3, 0x1d, 0xe9, + 0x70, 0x13, 0x2a, 0xd9, 0x7b, 0x32, 0x87, 0x36, 0x58, 0x16, 0x81, 0xe1, + 0x12, 0x91, 0x48, 0x4d, 0xfc, 0x5c, 0x04, 0x6a, 0x0b, 0xbd, 0x03, 0x8d, + 0x6a, 0x2f, 0xdb, 0x36, 0x5c, 0xde, 0xc1, 0xfc, 0xe8, 0x5a, 0x3e, 0x67, + 0xd4, 0xc2, 0xff, 0x06, 0x89, 0x12, 0x55, 0xbe, 0xd4, 0xae, 0xee, 0x8e, + 0x8e, 0x0b, 0x6b, 0x87, 0x44, 0x3c, 0x04, 0x18, 0x8e, 0xcb, 0x2b, 0x8c, + 0x69, 0xd6, 0xe3, 0x9d, 0xe3, 0xbd, 0x01, 0xa0, 0xe8, 0xaf, 0x77, 0x10, + 0x27, 0x27, 0x2c, 0xa7, 0x38, 0xb6, 0x52, 0xfe, 0x69, 0x88, 0xd0, 0x01, + 0x4d, 0x99, 0xf2, 0x51, 0xe9, 0x0f, 0x1d, 0xb8, 0x21, 0xf0, 0x96, 0xf8, + 0xb9, 0x11, 0x12, 0x8c, 0x5b, 0xdf, 0x34, 0xb9, 0xe6, 0xaa, 0xb8, 0x53, + 0x07, 0x8f, 0xa3, 0x4b, 0xdf, 0x21, 0x89, 0x69, 0xb8, 0x30, 0x6e, 0x72, + 0xe7, 0x34, 0x5d, 0x5a, 0x99, 0x13, 0xa5, 0x0e, 0x9e, 0x74, 0x9b, 0xa0, + 0xaf, 0x48, 0x29, 0x87, 0xbc, 0x3c, 0xf3, 0x4e, 0xdd, 0x41, 0x4f, 0x8c, + 0x47, 0x30, 0x9a, 0x75, 0x4a, 0xdf, 0x76, 0xfe, 0x1f, 0x80, 0x75, 0x3a, + 0xfa, 0xb8, 0xa7, 0xda, 0xcc, 0x02, 0x94, 0x29, 0xe0, 0x2e, 0x2e, 0x3b, + 0x67, 0xcc, 0x22, 0x74, 0xe7, 0x89, 0xdd, 0xc8, 0x14, 0x3f, 0x24, 0xf1, + 0x33, 0xab, 0x9c, 0x4f, 0x69, 0xe8, 0x02, 0x44, 0xec, 0xbc, 0x43, 0x66, + 0x47, 0xcb, 0xe4, 0x90, 0xb7, 0x1d, 0x57, 0x41, 0x76, 0xeb, 0x4b, 0xe8, + 0xb4, 0x2c, 0x39, 0x4e, 0xb8, 0x56, 0x34, 0xa0, 0x87, 0x41, 0xd2, 0x08, + 0x8f, 0x5e, 0xeb, 0x67, 0x39, 0x99, 0x36, 0xe0, 0x73, 0x64, 0x24, 0xd8, + 0xdc, 0xb1, 0x9e, 0xda, 0x37, 0xbc, 0x0e, 0x39, 0xb4, 0xde, 0xda, 0x06, + 0xd0, 0xf4, 
0xca, 0x93, 0x78, 0x23, 0x73, 0xef, 0x74, 0x38, 0xaa, 0x7d, + 0xa1, 0x76, 0xf9, 0xb4, 0x68, 0xbc, 0xc8, 0x89, 0xb7, 0x85, 0x53, 0x04, + 0x88, 0x97, 0xed, 0xaa, 0xf1, 0xe2, 0xc3, 0xc1, 0x18, 0xe3, 0xd8, 0xc9, + 0x2c, 0xc0, 0x9d, 0x01, 0x1d, 0x78, 0x68, 0xdf, 0x3f, 0x6c, 0x0f, 0xb3, + 0xc3, 0xa5, 0x83, 0x80, 0x6d, 0x9a, 0x0f, 0x88, 0xcf, 0x46, 0x81, 0x53, + 0x51, 0x19, 0xa9, 0x47, 0x5a, 0xea, 0x49, 0xfa, 0xb1, 0x59, 0x34, 0x54, + 0x8d, 0xb2, 0x23, 0x2d, 0x9b, 0xb4, 0x43, 0xf0, 0x93, 0x4a, 0xe1, 0xeb, + 0x33, 0xd9, 0x13, 0x02, 0x2d, 0xc7, 0x69, 0xc5, 0xf1, 0xa4, 0x5e, 0x95, + 0x21, 0x95, 0xa4, 0x56, 0x68, 0x73, 0x16, 0x25, 0x1c, 0xd9, 0x53, 0x65, + 0x1d, 0xea, 0xd4, 0xec, 0xd5, 0xe7, 0x87, 0x04, 0x1a, 0x33, 0xdf, 0x42, + 0x8c, 0x2c, 0x61, 0x32, 0x20, 0x90, 0xba, 0x93, 0x9e, 0x64, 0x84, 0xc0, + 0xd9, 0xe0, 0xe6, 0xfa, 0x8c, 0x6d, 0xbe, 0x8c, 0x83, 0x15, 0xee, 0x97, + 0xf4, 0xc2, 0x87, 0x2a, 0x97, 0xb4, 0x22, 0xac, 0xf2, 0x72, 0x1e, 0xeb, + 0x89, 0x0d, 0xae, 0x75, 0xea, 0x36, 0xfe, 0xe1, 0x2c, 0xcd, 0x03, 0x3a, + 0xeb, 0x01, 0x3e, 0x08, 0xbf, 0xb4, 0x09, 0x92, 0x42, 0x4b, 0x45, 0x37, + 0xfd, 0x7b, 0xcd, 0x3c, 0x88, 0x66, 0x63, 0xab, 0x36, 0x2f, 0xd2, 0x14, + 0x60, 0x8d, 0xbd, 0xad, 0x21, 0xb4, 0x12, 0xe8, 0x08, 0x3c, 0x61, 0x82, + 0x94, 0xd8, 0x8c, 0x93, 0x28, 0x73, 0x73, 0xab, 0xcc, 0xf7, 0x5e, 0xd8, + 0x7e, 0x53, 0x5d, 0x87, 0xdc, 0x09, 0x40, 0x23, 0xb7, 0x50, 0x25, 0xda, + 0x26, 0x49, 0xb0, 0x2b, 0xd8, 0xe0, 0xc6, 0xd6, 0xc7, 0xc9, 0xb1, 0xe9, + 0xb6, 0xf1, 0xd3, 0x16, 0xdb, 0xf1, 0xff, 0x0f, 0x9a, 0x57, 0x5b, 0x87, + 0x93, 0x7e, 0xeb, 0x22, 0x31, 0xde, 0xbc, 0xeb, 0xa2, 0x99, 0x1b, 0x98, + 0x35, 0x71, 0x75, 0xfa, 0x79, 0x1d, 0xfb, 0xea, 0x04, 0x9c, 0xc7, 0x7e, + 0x1c, 0xdf, 0xa3, 0x99, 0x68, 0x13, 0x86, 0x6d, 0xb1, 0x4c, 0xfd, 0x2a, + 0xd2, 0x8e, 0x4d, 0x3d, 0x6e, 0x50, 0xd7, 0xab, 0xfa, 0x8e, 0x50, 0x43, + 0x2e, 0x1d, 0xf0, 0xdc, 0x5a, 0x3e, 0xaa, 0x1a, 0x0c, 0x13, 0xe3, 0xd2, + 0xf1, 0x0e, 0x91, 0xe4, 0xa4, 0x36, 0x7a, 0xfd, 0xf8, 0x5a, 0x4a, 0x39, + 0x66, 0x64, 
0xbe, 0x93, 0x3b, 0x4e, 0xa7, 0x62, 0xa3, 0xae, 0x23, 0xe6, + 0x47, 0xa0, 0x6c, 0x00, 0xfc, 0xe0, 0x6b, 0xee, 0xb3, 0x7e, 0xca, 0xab, + 0x53, 0x55, 0xe5, 0xe2, 0x19, 0xbb, 0x22, 0x89, 0x2d, 0xa0, 0xcc, 0xae, + 0xd0, 0x77, 0x1f, 0xa1, 0xf8, 0x83, 0x22, 0x07, 0xa8, 0x81, 0xad, 0xe5, + 0xab, 0xc6, 0x91, 0x59, 0xd7, 0x90, 0xa9, 0x63, 0x56, 0x7f, 0x94, 0x78, + 0x74, 0x69, 0xa3, 0xdd, 0xd3, 0x8e, 0x13, 0xec, 0xc3, 0x78, 0x99, 0x8c, + 0x72, 0x52, 0xa8, 0x7b, 0x4f, 0x77, 0x5a, 0x9f, 0xbe, 0xf1, 0xe1, 0xb7, + 0xea, 0x5f, 0xa6, 0xc3, 0x83, 0x60, 0x81, 0x3f, 0xda, 0xe8, 0x78, 0x07, + 0x96, 0x2c, 0x63, 0xc9, 0x2a, 0xa1, 0x52, 0xb6, 0x72, 0xbc, 0xf6, 0xe1, + 0xdc, 0x9f, 0x24, 0x18, 0x07, 0xf1, 0x33, 0x0e, 0xb3, 0x8d, 0xee, 0xd8, + 0xdb, 0x92, 0xc1, 0x6b, 0x0a, 0x4a, 0xf0, 0x5d, 0x08, 0xb4, 0x50, 0x1f, + 0x5a, 0x78, 0x14, 0x9f, 0xdf, 0x7d, 0x9c, 0x79, 0x09, 0x10, 0x39, 0x38, + 0x0f, 0x4e, 0x52, 0x42, 0x36, 0x94, 0x86, 0xe3, 0xcb, 0x2a, 0xd5, 0xab, + 0xe1, 0x16, 0xc5, 0x6b, 0x5f, 0x1f, 0xf5, 0x02, 0xb0, 0xb1, 0x63, 0x33, + 0xe1, 0x06, 0xe6, 0xfe, 0x38, 0xb9, 0x3d, 0x85, 0x13, 0x6c, 0xd9, 0xb9, + 0xc4, 0xfd, 0x27, 0xd1, 0x6f, 0xbe, 0x86, 0x4c, 0x52, 0xb2, 0x5f, 0xe9, + 0x9b, 0xb8, 0xa0, 0x50, 0x02, 0x27, 0xfd, 0x04, 0x41, 0x87, 0x02, 0x89, + 0xff, 0x7b, 0x80, 0xbb, 0x82, 0x98, 0xed, 0x27, 0xc6, 0x84, 0x53, 0x34, + 0xae, 0xb5, 0x7f, 0xcf, 0x1c, 0xa7, 0x85, 0x6e, 0x20, 0xca, 0xa8, 0x47, + 0x1a, 0x2f, 0x0b, 0x15, 0x54, 0xec, 0x9a, 0x68, 0xb3, 0x91, 0x30, 0xe5, + 0x57, 0xc8, 0x56, 0x87, 0x8a, 0x15, 0xb2, 0xc5, 0x88, 0x6b, 0xa1, 0xa6, + 0xb1, 0x42, 0x83, 0xd6, 0xcb, 0x4a, 0x59, 0x2c, 0x45, 0xa1, 0xf9, 0x74, + 0x83, 0x38, 0x70, 0x65, 0xa2, 0x5a, 0x8f, 0xf4, 0x39, 0x1f, 0xe6, 0xd3, + 0x3c, 0x60, 0x60, 0x86, 0x34, 0x6f, 0xb2, 0x56, 0xcd, 0x38, 0x10, 0xa5, + 0x24, 0xfa, 0x25, 0x5d, 0xab, 0x9c, 0xd2, 0xb4, 0x03, 0x47, 0xea, 0xa4, + 0x8e, 0x33, 0xb2, 0xa7, 0xef, 0x4d, 0x0e, 0x9b, 0x03, 0xd1, 0xfc, 0x32, + 0x5f, 0xcc, 0xcf, 0x12, 0x36, 0x5b, 0x74, 0xf2, 0xbf, 0x6d, 0xab, 0x84, + 0x05, 0x2a, 
0x46, 0x00, 0x3f, 0x85, 0x7c, 0x01, 0x80, 0x4e, 0x47, 0x0f, + 0x13, 0xe7, 0x96, 0x39, 0x89, 0x5d, 0x3e, 0x2c, 0xe5, 0x03, 0xfe, 0x0f, + 0xe2, 0x91, 0x3e, 0xf1, 0x96, 0xca, 0x16, 0x6c, 0x10, 0x4c, 0x14, 0xfa, + 0x13, 0xa1, 0xeb, 0x72, 0x49, 0xe3, 0x00, 0xcb, 0xe2, 0xac, 0xfa, 0xae, + 0xbd, 0x1c, 0xfc, 0x66, 0xb9, 0x5b, 0xa4, 0xa7, 0xdd, 0x0c, 0x33, 0xd3, + 0x05, 0x0d, 0x79, 0xeb, 0x2b, 0xef, 0x9c, 0x36, 0x92, 0xd0, 0xdd, 0xe8, + 0x72, 0x8a, 0x41, 0x9b, 0x7b, 0xbc, 0xcd, 0x3f, 0x69, 0x93, 0x89, 0xb5, + 0xb1, 0xf0, 0x16, 0xce, 0xfd, 0x19, 0x91, 0x4d, 0x82, 0xb1, 0x5f, 0x53, + 0x13, 0x51, 0xc2, 0x0b, 0x1e, 0xd2, 0xf2, 0xe2, 0x2e, 0x79, 0x48, 0xd5, + 0x47, 0xd0, 0xdc, 0x94, 0xc2, 0xe2, 0x3a, 0x2b, 0x28, 0x48, 0x7d, 0x17, + 0xdb, 0x39, 0x76, 0xa1, 0x63, 0x8b, 0x97, 0xc9, 0x9a, 0x89, 0xcf, 0x39, + 0xed, 0xd9, 0xea, 0xca, 0x82, 0x81, 0xb7, 0xce, 0x49, 0x6c, 0x92, 0x61, + 0x48, 0xaf, 0xde, 0xa7, 0x7e, 0x74, 0xa4, 0xac, 0x41, 0x0c, 0xee, 0xed, + 0x07, 0xed, 0x24, 0xa4, 0xa8, 0xf7, 0x37, 0xaa, 0xa9, 0x28, 0x26, 0x45, + 0x3c, 0xb7, 0x80, 0x11, 0x3d, 0x39, 0x7e, 0xf2, 0x04, 0x27, 0x04, 0xff, + 0x0f, 0xa3, 0x49, 0x03, 0xdf, 0xcd, 0x38, 0xa1, 0xf3, 0x14, 0xef, 0x0f, + 0xed, 0xb0, 0xf1, 0xa5, 0xc3, 0x60, 0x0d, 0x7a, 0x25, 0x21, 0x4c, 0x87, + 0x4d, 0xe0, 0x57, 0xcb, 0x5f, 0x92, 0x12, 0x5b, 0x6c, 0xf7, 0x4e, 0x90, + 0xd9, 0x30, 0x01, 0x10, 0xa1, 0x5a, 0x68, 0xff, 0x19, 0x99, 0xf7, 0x19, + 0x9b, 0x5b, 0x2f, 0x8d, 0x9e, 0x56, 0x1f, 0x86, 0xea, 0xb9, 0x1b, 0xf5, + 0xd0, 0xbf, 0xfc, 0x2e, 0x4b, 0x3b, 0x16, 0xbf, 0xb4, 0xb0, 0x3e, 0xda, + 0x53, 0xb9, 0x64, 0x2d, 0xee, 0x89, 0xd4, 0xa7, 0x7c, 0xcf, 0xb0, 0x6b, + 0x36, 0x74, 0x55, 0xac, 0x5b, 0x97, 0xbf, 0x56, 0x70, 0xf6, 0xeb, 0x1f, + 0xa8, 0xe1, 0x18, 0xe0, 0xff, 0x45, 0x7b, 0xfe, 0x4d, 0x4b, 0x1c, 0xd1, + 0x6a, 0x75, 0x33, 0x20, 0xd6, 0x67, 0x1f, 0xf1, 0xd3, 0x51, 0x9d, 0xcf, + 0xb5, 0x01, 0x45, 0x53, 0xa3, 0xd4, 0x01, 0x55, 0x16, 0xc3, 0xd9, 0x07, + 0x0d, 0xc1, 0xdc, 0x02, 0x66, 0x00, 0xb6, 0x90, 0x5c, 0x37, 0x24, 0x99, + 0xdf, 0xe8, 
0xca, 0x89, 0x99, 0x02, 0x2e, 0xf7, 0xa7, 0x77, 0xc6, 0x3f, + 0x15, 0x17, 0x36, 0x45, 0x7a, 0xc1, 0x73, 0x20, 0x67, 0xce, 0x84, 0x75, + 0xc2, 0xbf, 0x27, 0x34, 0x69, 0x12, 0x16, 0x14, 0x8c, 0x53, 0x82, 0xc2, + 0x6f, 0xa1, 0x78, 0x1e, 0x31, 0x36, 0x9b, 0x59, 0xc1, 0x05, 0x5d, 0xb9, + 0x6c, 0xee, 0x04, 0x44, 0x47, 0x09, 0x69, 0xb8, 0xed, 0x66, 0x31, 0x09, + 0x3f, 0x1f, 0x0f, 0xb7, 0x4f, 0x3b, 0xa1, 0xc2, 0x69, 0x89, 0x53, 0x99, + 0x62, 0x59, 0xad, 0x51, 0x40, 0x92, 0xf3, 0x25, 0x33, 0xfe, 0x9f, 0x86, + 0x66, 0x7c, 0x0b, 0xa8, 0xee, 0x53, 0x5d, 0x57, 0x9f, 0x21, 0x59, 0x31, + 0x20, 0xb3, 0xb2, 0xe0, 0xdc, 0xa8, 0xb6, 0x9e, 0x74, 0x36, 0xd3, 0x28, + 0x3d, 0xd8, 0x8f, 0x79, 0xa8, 0x08, 0x5b, 0x77, 0x18, 0xd0, 0x49, 0xcc, + 0x1a, 0xde, 0x4e, 0xb8, 0xef, 0x49, 0x52, 0xf1, 0x19, 0x1c, 0x45, 0x08, + 0x2b, 0x8a, 0x6e, 0x55, 0x30, 0xe4, 0x64, 0xcc, 0x32, 0x65, 0x93, 0x76, + 0xa6, 0xa8, 0x42, 0xc8, 0x01, 0x6d, 0x83, 0xa9, 0x04, 0xcd, 0xe1, 0xee, + 0xd1, 0xad, 0xe4, 0x35, 0x09, 0xb6, 0x43, 0x38, 0x81, 0x0a, 0x98, 0xee, + 0x7c, 0xa4, 0x42, 0x0b, 0x39, 0xc4, 0x57, 0x65, 0xdf, 0x62, 0x0e, 0x34, + 0xd1, 0x6b, 0xc8, 0x2b, 0x64, 0xa3, 0xeb, 0x3d, 0xca, 0x82, 0x89, 0x29, + 0x4a, 0xd4, 0x63, 0xb6, 0xf7, 0xa6, 0x8c, 0x1d, 0x44, 0x47, 0x17, 0x67, + 0x0c, 0x61, 0x2b, 0x77, 0x70, 0x8d, 0x87, 0xec, 0xb9, 0xc6, 0xcd, 0x8c, + 0x31, 0x35, 0xb7, 0x69, 0x0b, 0xbb, 0xf5, 0x1c, 0xd2, 0x0d, 0x47, 0xef, + 0x42, 0x28, 0x71, 0xee, 0x1d, 0x1b, 0xad, 0xbe, 0xd0, 0x42, 0x60, 0x49, + 0x1f, 0xe8, 0x98, 0xad, 0x62, 0x51, 0xba, 0x83, 0x84, 0x2e, 0x6c, 0x1c, + 0x78, 0x47, 0xf9, 0x0e, 0x99, 0xef, 0x22, 0x2a, 0x7b, 0x46, 0x2a, 0x93, + 0xc8, 0x2e, 0x18, 0xbb, 0xd2, 0xe9, 0x0c, 0x4a, 0x33, 0x16, 0x45, 0x7c, + 0x85, 0xfe, 0x99, 0x87, 0xa6, 0x10, 0x97, 0x2c, 0xa0, 0x99, 0xe0, 0x18, + 0xc4, 0x77, 0x1e, 0x39, 0xd2, 0xab, 0x73, 0x1c, 0x2f, 0xc0, 0xfe, 0x82, + 0x7d, 0xc6, 0xe9, 0x39, 0x36, 0x29, 0x9b, 0xd5, 0xe9, 0x6a, 0x06, 0x9c, + 0x19, 0xe2, 0x47, 0x04, 0x6e, 0xe8, 0xaf, 0x5f, 0xd9, 0xaf, 0xc2, 0xa0, + 0x11, 0x7c, 
0xff, 0xa7, 0x8a, 0x2b, 0x05, 0x0b, 0x3e, 0x97, 0xc8, 0x43, + 0xf1, 0x64, 0x67, 0xaa, 0xfc, 0xea, 0x41, 0xbf, 0x1c, 0x13, 0x6b, 0x97, + 0x44, 0x56, 0x8a, 0x92, 0x2f, 0xf3, 0x21, 0x6c, 0x5b, 0xed, 0xca, 0x7b, + 0x75, 0x6a, 0x9e, 0x0f, 0xb0, 0x6c, 0x3d, 0x70, 0x98, 0x61, 0x17, 0xc9, + 0x4c, 0x46, 0x99, 0xfd, 0x88, 0x50, 0x5e, 0xbe, 0x71, 0xb9, 0xbc, 0xea, + 0x16, 0x09, 0xca, 0x24, 0xe3, 0x9c, 0xb7, 0x3d, 0x05, 0x5b, 0x6d, 0x62, + 0x6f, 0xb9, 0xd8, 0x99, 0xda, 0x45, 0xcb, 0x7c, 0xa6, 0xc6, 0x46, 0x36, + 0x5b, 0x20, 0xb6, 0xa7, 0x39, 0x73, 0xf8, 0x97, 0x71, 0xaf, 0xcd, 0x66, + 0x58, 0xd4, 0xf1, 0x1c, 0x4c, 0x16, 0x79, 0xc6, 0x63, 0x56, 0xca, 0x79, + 0x15, 0x7b, 0xad, 0x40, 0xaa, 0xf6, 0x18, 0x64, 0x0e, 0xb7, 0x28, 0xd4, + 0x19, 0xf0, 0xa4, 0x3a, 0x92, 0x9d, 0x25, 0x97, 0xef, 0x5e, 0x75, 0x2a, + 0x13, 0x0c, 0xac, 0x2a, 0xf8, 0x62, 0x36, 0xa5, 0x8e, 0xb2, 0x8d, 0xd7, + 0x20, 0xc1, 0x0f, 0xc2, 0xbd, 0xdd, 0x96, 0x55, 0x98, 0xae, 0x74, 0xfc, + 0x71, 0x26, 0xb6, 0x63, 0x35, 0x19, 0x95, 0x8f, 0xce, 0x0f, 0x66, 0xcd, + 0x72, 0x83, 0x05, 0x5d, 0xe3, 0x44, 0x12, 0x62, 0x9e, 0x4f, 0xe0, 0xf2, + 0x4f, 0x74, 0x76, 0x8f, 0x79, 0x7f, 0x54, 0x1c, 0x81, 0x46, 0x2e, 0x31, + 0x5a, 0xf3, 0x9d, 0x55, 0x0e, 0x23, 0x7e, 0x19, 0xe8, 0xba, 0xe1, 0x1f, + 0xe3, 0xdb, 0x79, 0x0e, 0xa6, 0x46, 0xbf, 0xb5, 0x39, 0x91, 0xff, 0xbb, + 0xe9, 0xb2, 0x36, 0x73, 0x1a, 0x2a, 0xc5, 0xc6, 0xff, 0x32, 0x80, 0xb8, + 0xff, 0xe7, 0x03, 0xc0, 0xfa, 0x5e, 0xd5, 0x60, 0x13, 0x1d, 0x35, 0xe3, + 0x9f, 0x19, 0xf7, 0xca, 0x2d, 0xf7, 0x07, 0xcc, 0xa4, 0x6b, 0xa1, 0x49, + 0xbe, 0x5e, 0x5b, 0x30, 0xea, 0x5a, 0x8a, 0x26, 0x53, 0xe4, 0x30, 0xc9, + 0x76, 0x4b, 0x4d, 0x19, 0xdf, 0xb5, 0xec, 0xcf, 0xd0, 0xae, 0x1a, 0x2c, + 0x03, 0x87, 0x30, 0xe8, 0xb3, 0xf5, 0xd0, 0x7d, 0x95, 0x01, 0xef, 0x4b, + 0xa8, 0xde, 0x45, 0x5e, 0x9c, 0x2f, 0xf3, 0xe9, 0xc4, 0x9e, 0x94, 0xf9, + 0x64, 0xef, 0x84, 0x1d, 0x86, 0xa8, 0x6c, 0x2d, 0x35, 0x05, 0x86, 0x57, + 0x98, 0x11, 0xf1, 0xdd, 0xaf, 0xd9, 0xda, 0xc5, 0x0e, 0x34, 0x34, 0x86, + 0x6a, 0xdc, 
0xe9, 0x53, 0x12, 0x62, 0xd3, 0x83, 0xcb, 0x67, 0x66, 0xea, + 0x09, 0xb0, 0x45, 0xf6, 0x1b, 0xe6, 0x10, 0x69, 0x68, 0x60, 0xd9, 0xac, + 0x1f, 0xbf, 0xf2, 0x23, 0x57, 0x8d, 0xd3, 0xad, 0xf0, 0xa7, 0xc8, 0x4c, + 0x41, 0x15, 0x19, 0xb6, 0x55, 0xab, 0xee, 0x22, 0x0c, 0x0d, 0xac, 0xf5, + 0x24, 0x85, 0xc7, 0x1c, 0x63, 0x6c, 0xca, 0xee, 0x42, 0xc6, 0xda, 0x87, + 0x2d, 0x43, 0xd2, 0xe7, 0xfd, 0xbf, 0x95, 0x93, 0x83, 0x05, 0x33, 0xf4, + 0xa8, 0x56, 0x91, 0x01, 0xe1, 0xb0, 0x96, 0xa6, 0xf1, 0xf4, 0xb1, 0x9d, + 0x83, 0x06, 0x70, 0xba, 0xef, 0xad, 0xa2, 0x7f, 0xd3, 0x7f, 0x83, 0xa4, + 0x0b, 0x3c, 0x7e, 0x80, 0x2e, 0xd8, 0xf4, 0x76, 0x6e, 0x68, 0xa1, 0x14, + 0x33, 0x6c, 0xc0, 0x7e, 0x4d, 0x67, 0xf2, 0xb3, 0x4a, 0x8a, 0xbc, 0x02, + 0x1a, 0xa9, 0x03, 0x98, 0xfb, 0xa9, 0xfc, 0x1a, 0xba, 0xdb, 0xe9, 0x48, + 0x32, 0x14, 0x63, 0x11, 0x11, 0x85, 0xfc, 0x25, 0xe3, 0xe2, 0x4a, 0x96, + 0x01, 0x5f, 0xa9, 0xf4, 0xd6, 0x82, 0xf6, 0x09, 0x56, 0x15, 0xaa, 0xa7, + 0x52, 0xbb, 0x90, 0x18, 0x74, 0xc1, 0xc6, 0x10, 0xc0, 0x65, 0xb0, 0x2d, + 0x5a, 0xd6, 0xfe, 0xe4, 0x13, 0x11, 0xc0, 0x62, 0xc3, 0x4e, 0xb0, 0xd3, + 0x8d, 0x38, 0xae, 0x30, 0xce, 0x1a, 0x76, 0x71, 0x36, 0x93, 0x33, 0x65, + 0x2b, 0x0e, 0xcc, 0x44, 0xd7, 0x7c, 0xde, 0x01, 0x67, 0x33, 0x34, 0xae, + 0xe4, 0x4f, 0xd0, 0x36, 0x13, 0xac, 0x4b, 0x57, 0xff, 0xd0, 0x80, 0x74, + 0xa7, 0x10, 0x1d, 0x9e, 0x47, 0xdf, 0x78, 0xe4, 0x47, 0x25, 0xf8, 0xf2, + 0x6a, 0x47, 0x27, 0x84, 0x5b, 0x56, 0x15, 0xd8, 0xf1, 0x48, 0x48, 0x33, + 0x13, 0x09, 0x23, 0xea, 0x29, 0x1a, 0x87, 0x63, 0xe6, 0x52, 0xd9, 0x00, + 0xe5, 0x8f, 0x93, 0x1c, 0x9f, 0x1b, 0x68, 0x18, 0x95, 0xb9, 0xa7, 0x67, + 0x52, 0x7b, 0x42, 0x36, 0xb6, 0xff, 0x4a, 0x00, 0x01, 0x6b, 0x8a, 0xb5, + 0x70, 0x4e, 0x36, 0x79, 0xc7, 0xed, 0x2a, 0xab, 0xc1, 0x50, 0x6e, 0xcf, + 0x90, 0xda, 0x7f, 0x2e, 0xbf, 0x7e, 0x78, 0x98, 0xe2, 0xd4, 0xbe, 0x56, + 0x83, 0x1c, 0xd5, 0xd0, 0x94, 0xbd, 0x81, 0xfb, 0x7b, 0xa8, 0x29, 0xa3, + 0x73, 0x14, 0x1c, 0x63, 0x29, 0x60, 0xf0, 0x34, 0x54, 0x1b, 0xc3, 0x10, + 0xb9, 0x78, 
0xa1, 0xb2, 0x99, 0x77, 0x8b, 0xbc, 0x57, 0x15, 0x40, 0x8c, + 0x97, 0x4c, 0x80, 0x6f, 0xa9, 0xc6, 0xcd, 0xee, 0x60, 0x9b, 0x09, 0xe4, + 0x7c, 0xd6, 0x58, 0x45, 0x13, 0x00, 0xe3, 0x4d, 0x06, 0xe3, 0xdd, 0x47, + 0x0a, 0x2d, 0x95, 0xd0, 0x40, 0x0b, 0xbb, 0x2b, 0xf2, 0xc7, 0x48, 0x97, + 0x94, 0x36, 0xe3, 0x6e, 0x36, 0xe3, 0xff, 0x01, 0x9e, 0xa1, 0x7b, 0xef, + 0xfa, 0x9c, 0x26, 0xdf, 0x31, 0x51, 0x5d, 0x16, 0x52, 0x00, 0x29, 0x92, + 0xa0, 0xcb, 0x03, 0xf5, 0x70, 0x7b, 0x23, 0xa9, 0x60, 0xf2, 0x44, 0x54, + 0xd3, 0x05, 0xee, 0xfb, 0xd0, 0xa7, 0xff, 0xcd, 0x49, 0xb6, 0x7b, 0x56, + 0x92, 0x34, 0x2d, 0xef, 0xcb, 0x7b, 0x10, 0xdf, 0x02, 0x9b, 0x80, 0xce, + 0x71, 0x39, 0x77, 0xb6, 0xc2, 0x17, 0x90, 0x88, 0x78, 0xe4, 0x6c, 0x83, + 0xa2, 0x13, 0x92, 0x3d, 0xaf, 0xd0, 0x83, 0x22, 0xd1, 0x07, 0x8d, 0xd2, + 0x70, 0xa4, 0x39, 0x32, 0x85, 0x9c, 0x59, 0x57, 0x08, 0x03, 0x25, 0xa5, + 0x87, 0xfe, 0x19, 0x64, 0x30, 0x74, 0xb1, 0x5a, 0x28, 0x3c, 0xf7, 0xce, + 0xb7, 0x50, 0xf0, 0x20, 0x0e, 0x6c, 0xf2, 0xe9, 0xfd, 0x87, 0xe1, 0x7e, + 0x55, 0x87, 0x40, 0x2a, 0x1c, 0x98, 0xc6, 0xbd, 0x74, 0xae, 0xe5, 0x1d, + 0xa2, 0x4e, 0x97, 0xb4, 0x6d, 0x6d, 0x90, 0xd2, 0x9b, 0xd1, 0x15, 0xc8, + 0xaf, 0xf3, 0x76, 0x63, 0x33, 0xf7, 0x40, 0x6d, 0xc3, 0x82, 0x26, 0x83, + 0xe5, 0x40, 0xa2, 0xb3, 0xd0, 0x8f, 0x9c, 0xa2, 0x14, 0xc9, 0x3a, 0xe4, + 0x31, 0xa8, 0xab, 0x2e, 0x06, 0x2e, 0x3e, 0x05, 0xee, 0x06, 0x2e, 0xf7, + 0x2b, 0x2f, 0x1b, 0xbb, 0x3f, 0x3e, 0x82, 0xed, 0x4a, 0x82, 0xae, 0x23, + 0x12, 0xc7, 0x8e, 0xdc, 0x95, 0xe5, 0x6c, 0x3d, 0xcb, 0x17, 0x36, 0x94, + 0xe1, 0xba, 0x4e, 0x2e, 0xca, 0x5f, 0x77, 0x95, 0x50, 0xf8, 0xa0, 0x85, + 0xfd, 0x65, 0x66, 0x80, 0x87, 0xbe, 0xc5, 0x97, 0x04, 0x4f, 0x36, 0xc6, + 0xa6, 0xa8, 0x81, 0x7b, 0xdb, 0xc5, 0xbe, 0x3f, 0x7b, 0x80, 0x7d, 0x7d, + 0xc1, 0xd1, 0x48, 0x62, 0x8e, 0x6f, 0x78, 0xdb, 0x6b, 0xdd, 0x66, 0xcd, + 0xa8, 0x21, 0x13, 0xcd, 0x5d, 0xe0, 0x0c, 0xb2, 0x0d, 0x5b, 0xbe, 0x18, + 0xf3, 0x75, 0x50, 0xc4, 0x8a, 0xaf, 0x1e, 0x21, 0x99, 0xf7, 0xce, 0x07, + 0x2c, 0x53, 
0xf1, 0xaf, 0x96, 0x91, 0x0b, 0xcb, 0x48, 0xad, 0x1e, 0xe4, + 0x8b, 0x2b, 0xb1, 0x30, 0x87, 0x65, 0xfc, 0xee, 0xdf, 0x02, 0x7d, 0x4b, + 0x05, 0x57, 0xd9, 0xc6, 0x2b, 0x9a, 0x51, 0x97, 0xa9, 0xd8, 0xee, 0xd6, + 0x5d, 0x28, 0x8e, 0x26, 0x20, 0xee, 0xa0, 0xfd, 0x58, 0x5d, 0x58, 0x5d, + 0x58, 0xc4, 0xd9, 0x06, 0xe6, 0x2b, 0x71, 0xcd, 0x51, 0xf6, 0x9d, 0xe3, + 0x90, 0x68, 0x01, 0x8d, 0x9a, 0x7d, 0x8b, 0x55, 0x7d, 0xc3, 0x26, 0x3f, + 0x49, 0xfd, 0x34, 0xc2, 0x97, 0x03, 0x40, 0x9b, 0x26, 0xe1, 0x6e, 0x33, + 0x92, 0xd8, 0x5e, 0x71, 0x48, 0x1f, 0xb8, 0x2e, 0x01, 0x8d, 0x8d, 0xa2, + 0xd3, 0xdb, 0x7e, 0x95, 0x20, 0x74, 0xb2, 0x36, 0x05, 0xf5, 0xd0, 0xbc, + 0xc3, 0x6c, 0x8e, 0xdc, 0x0b, 0x3a, 0xb2, 0xdd, 0x34, 0x1b, 0x5a, 0xd2, + 0x9a, 0x05, 0x0a, 0x5e, 0x7f, 0x1f, 0x8b, 0x21, 0xa1, 0x72, 0x36, 0x3e, + 0xbf, 0x38, 0x10, 0x1b, 0xcf, 0x23, 0x8f, 0x92, 0x92, 0xcf, 0x3c, 0x1d, + 0xca, 0x20, 0xec, 0xb8, 0x65, 0x28, 0x34, 0x8c, 0x7a, 0xb0, 0xc9, 0xf6, + 0xa7, 0xcb, 0xf0, 0xc2, 0xf5, 0xc0, 0x90, 0xc4, 0x03, 0x79, 0x88, 0x51, + 0x38, 0xe3, 0xa6, 0xf9, 0x55, 0xd9, 0x56, 0x6d, 0x76, 0x0e, 0xaa, 0x14, + 0xa2, 0x1c, 0x7b, 0xe2, 0x4f, 0x90, 0x4d, 0x1c, 0x7d, 0x87, 0xd9, 0xac, + 0x22, 0x43, 0xf9, 0xb7, 0x4e, 0xaf, 0xc3, 0x96, 0x09, 0xbe, 0x4c, 0x1b, + 0xdf, 0x72, 0x45, 0x6a, 0x9b, 0xe3, 0x57, 0x18, 0x64, 0x95, 0xa9, 0xa0, + 0x00, 0x30, 0xe3, 0xc0, 0x65, 0x92, 0x8c, 0x32, 0x1e, 0x1a, 0x66, 0x4f, + 0x4b, 0x1a, 0x02, 0x0f, 0xc2, 0x79, 0x7a, 0x3a, 0x62, 0x53, 0x15, 0xea, + 0x63, 0x39, 0x99, 0x8c, 0xb0, 0x65, 0xe6, 0x57, 0x90, 0x5e, 0x0f, 0x31, + 0xc7, 0x9f, 0xb1, 0x8b, 0x40, 0x00, 0x55, 0xd9, 0xac, 0x91, 0x20, 0x7f, + 0x73, 0x12, 0x9d, 0xb9, 0x53, 0x1b, 0x2e, 0x3a, 0x0e, 0x2d, 0x48, 0x7a, + 0x65, 0x54, 0xdf, 0x4b, 0xbd, 0x8e, 0xc7, 0xe4, 0x47, 0x10, 0xe2, 0x1f, + 0x2e, 0x7b, 0x55, 0x18, 0xac, 0x05, 0x2c, 0xb8, 0x62, 0xe8, 0xef, 0xf4, + 0x52, 0x64, 0xe4, 0xf9, 0x32, 0xf6, 0xb1, 0x1c, 0x99, 0xc7, 0x43, 0x5a, + 0x59, 0xd0, 0xd3, 0x92, 0x88, 0x11, 0x54, 0x22, 0x60, 0x54, 0x00, 0x12, + 0xb0, 0x4a, 
0x32, 0xbd, 0xef, 0xb4, 0x29, 0x71, 0xbe, 0x5f, 0x33, 0xc6, + 0x79, 0xa3, 0xd6, 0x02, 0xd2, 0xa6, 0x92, 0x86, 0x55, 0x4d, 0xab, 0x42, + 0x8e, 0xbc, 0x62, 0x9c, 0xa9, 0x71, 0x70, 0x8c, 0x69, 0x99, 0x82, 0x4b, + 0x4e, 0x95, 0x5b, 0x76, 0xf2, 0xa9, 0x87, 0x7b, 0xda, 0xd3, 0x3a, 0x7d, + 0xab, 0x40, 0x00, 0x45, 0xdf, 0x62, 0xb3, 0x02, 0xa6, 0x44, 0x6b, 0xb5, + 0x70, 0xa5, 0x2b, 0xab, 0x2d, 0x9e, 0xef, 0xc3, 0x4c, 0xe3, 0x14, 0xab, + 0x0a, 0xeb, 0xba, 0x14, 0x51, 0xef, 0x7f, 0xf6, 0xa9, 0xaa, 0x03, 0x7f, + 0xa6, 0x16, 0x28, 0x78, 0xe8, 0x09, 0xda, 0x00, 0x01, 0x68, 0xf1, 0x9a, + 0x17, 0x65, 0xdd, 0x88, 0xc8, 0xd9, 0x15, 0x1e, 0x44, 0x0a, 0x2e, 0x8e, + 0x67, 0x45, 0xe4, 0xb8, 0xda, 0xd9, 0xa0, 0x3c, 0x22, 0x52, 0xaf, 0x97, + 0x99, 0x1f, 0xdc, 0xd9, 0xed, 0xbf, 0x90, 0x38, 0xce, 0xe7, 0x50, 0x5a, + 0x00, 0x02, 0x89, 0x6f, 0xa2, 0x75, 0xc5, 0xc1, 0x4e, 0xda, 0x8f, 0x34, + 0xb1, 0x67, 0xbc, 0xd4, 0x26, 0x80, 0x67, 0x46, 0x86, 0x38, 0x5d, 0x76, + 0xca, 0xfa, 0x3b, 0x85, 0x5a, 0x6a, 0xd4, 0x09, 0x88, 0x83, 0x04, 0x4a, + 0x1b, 0x01, 0x3e, 0x3e, 0x09, 0x97, 0x74, 0xa3, 0x36, 0x86, 0x63, 0x21, + 0x65, 0xf9, 0xf9, 0xe9, 0x5c, 0xc0, 0x48, 0x2d, 0x33, 0xf9, 0x32, 0xe2, + 0x7b, 0x94, 0xab, 0x49, 0xfa, 0x37, 0x73, 0xa5, 0xaf, 0x52, 0xcb, 0xb9, + 0x5e, 0xd2, 0xf0, 0x00, 0x02, 0x30, 0x7f, 0x0f, 0xea, 0x76, 0x3d, 0x46, + 0x5f, 0x68, 0x8a, 0x90, 0xc7, 0x31, 0x21, 0x7b, 0x5a, 0x8c, 0x5a, 0x7b, + 0xa0, 0xc9, 0x8f, 0xba, 0x34, 0x7e, 0x22, 0x5b, 0x47, 0xb1, 0xcb, 0x5b, + 0x05, 0x88, 0x97, 0x51, 0x17, 0x78, 0x00, 0x01, 0x6b, 0xb0, 0xfc, 0x59, + 0xa9, 0x4e, 0x3f, 0x6c, 0x69, 0x42, 0xc9, 0xfd, 0x7b, 0xa0, 0x15, 0x7d, + 0xf6, 0x44, 0x85, 0x08, 0xa6, 0xc4, 0x3d, 0x11, 0x8b, 0xdb, 0x2c, 0x19, + 0x92, 0x3b, 0x73, 0x7f, 0x9e, 0x75, 0x4f, 0x81, 0x49, 0x4f, 0x88, 0xad, + 0x00, 0x01, 0x7a, 0x58, 0x11, 0xa0, 0xc8, 0xf4, 0xd8, 0x88, 0x78, 0x0c, + 0x37, 0xa5, 0x0b, 0xd5, 0xc7, 0x32, 0x3a, 0x01, 0x12, 0x12, 0x91, 0x96, + 0xb2, 0xc3, 0xde, 0x0d, 0x43, 0xcc, 0x84, 0x77, 0xab, 0xa5, 0xd8, 0x5b, + 0x12, 0x92, 
0x98, 0x08, 0xe7, 0x19, 0x43, 0x7c, 0x11, 0xce, 0x2b, 0xa1, + 0x1d, 0x20, 0x62, 0x91, 0x4e, 0xfd, 0xe1, 0x54, 0x0b, 0xee, 0x36, 0x9f, + 0x62, 0xb5, 0x8d, 0x65, 0x85, 0xf1, 0xe0, 0x00, 0x03, 0x33, 0xea, 0xfe, + 0xf5, 0x80, 0xbf, 0x61, 0x93, 0x16, 0xae, 0xf0, 0xdb, 0x03, 0x98, 0x72, + 0x34, 0x13, 0x28, 0x36, 0x0c, 0x5a, 0x00, 0xb8, 0x19, 0xce, 0x2a, 0x93, + 0x93, 0xe8, 0xb3, 0x37, 0x11, 0x50, 0x5d, 0x6a, 0xde, 0x00, 0x00, 0x51, + 0xd9, 0x01, 0xa2, 0x2e, 0x9b, 0x31, 0xac, 0xbc, 0x8c, 0x2e, 0x93, 0x7c, + 0x01, 0xc6, 0x79, 0xc3, 0xba, 0x9e, 0x0d, 0xab, 0xfa, 0xaf, 0x60, 0xc5, + 0xe7, 0xf7, 0x31, 0x99, 0x82, 0x7f, 0xda, 0xc2, 0xa4, 0x15, 0x43, 0x3c, + 0xe4, 0xb5, 0x7d, 0xf2, 0x4f, 0xf7, 0xa1, 0x07, 0xfe, 0x2b, 0x2c, 0x9d, + 0x43, 0x3d, 0xb4, 0xc7, 0x28, 0x83, 0xe2, 0xa9, 0x74, 0x39, 0x0c, 0xca, + 0xc0, 0x77, 0xea, 0xfd, 0x1a, 0x16, 0x60, 0xf1, 0x16, 0xb5, 0x79, 0xf2, + 0xec, 0x1b, 0x05, 0x69, 0x66, 0xbf, 0xf3, 0x75, 0xfe, 0x44, 0xae, 0x9a, + 0x6f, 0x27, 0xea, 0x1d, 0x26, 0x8a, 0xf8, 0x31, 0x77, 0xb7, 0x35, 0x75, + 0xb5, 0xf7, 0xdb, 0xcc, 0x8b, 0x16, 0x7e, 0xb1, 0x8e, 0x83, 0x80, 0x24, + 0x24, 0x68, 0x00, 0x08, 0xaf, 0xa2, 0xe0, 0x97, 0xd0, 0x6d, 0x13, 0xfc, + 0x9c, 0x5c, 0x44, 0x2e, 0x77, 0x15, 0x87, 0x5b, 0x8f, 0xda, 0x00, 0x9d, + 0x8c, 0x8d, 0xaa, 0xd2, 0xc9, 0x5a, 0x77, 0x6e, 0xd0, 0x9e, 0xc4, 0x88, + 0xb0, 0x2e, 0x86, 0x0a, 0x41, 0xa0, 0x00, 0x40, 0x59, 0x37, 0x17, 0xba, + 0xe4, 0xc1, 0xff, 0xac, 0xb8, 0x55, 0xd4, 0xa6, 0x17, 0x7e, 0xb4, 0xaf, + 0x1f, 0x02, 0x6f, 0x54, 0x44, 0xea, 0xb6, 0x19, 0x2e, 0x76, 0x47, 0xd0, + 0x0e, 0x8e, 0x2e, 0xaf, 0xb2, 0x0c, 0xd1, 0x4a, 0xd2, 0xae, 0x6c, 0x8d, + 0x2a, 0xb5, 0xb2, 0x28, 0x4d, 0xfc, 0xb1, 0x7a, 0xea, 0x6a, 0xd5, 0x29, + 0x24, 0x0c, 0x95, 0x5e, 0x7c, 0x6a, 0x24, 0x1d, 0xe3, 0x4d, 0x3e, 0x94, + 0x26, 0xb4, 0x77, 0x11, 0xb7, 0xdb, 0x04, 0x78, 0x63, 0x81, 0x6d, 0x45, + 0xf4, 0xe8, 0xe2, 0x26, 0x3f, 0xce, 0xb0, 0x10, 0x1c, 0x7e, 0xb0, 0x46, + 0x97, 0x48, 0xc1, 0xc6, 0xac, 0xf9, 0xc5, 0xc0, 0xba, 0xf2, 0x15, 0xc7, + 0x91, 0x57, 
0x61, 0xc9, 0xa1, 0x40, 0xe6, 0xac, 0x6b, 0xd6, 0x2f, 0x24, + 0xa2, 0x89, 0xa0, 0x00, 0x37, 0xd3, 0x24, 0x56, 0xed, 0x6a, 0x25, 0x9e, + 0x40, 0x85, 0x08, 0x51, 0x11, 0xbc, 0x6c, 0x85, 0x0d, 0x10, 0xd6, 0xd6, + 0xf3, 0x70, 0x7e, 0x89, 0x09, 0xf7, 0x8e, 0x26, 0xff, 0xf2, 0x56, 0xbb, + 0x79, 0xd1, 0x9e, 0x44, 0xea, 0x56, 0x80, 0x00, 0xd8, 0x83, 0x87, 0x25, + 0xad, 0x6e, 0x7e, 0x86, 0x8c, 0xeb, 0x42, 0x3b, 0xeb, 0x22, 0xac, 0x8b, + 0x6f, 0xb3, 0x2a, 0x5c, 0xdf, 0x2c, 0x9d, 0x85, 0xbc, 0x61, 0xe2, 0x1a, + 0x61, 0x0a, 0x4e, 0xde, 0xe9, 0xb6, 0x0e, 0x63, 0x2b, 0x40, 0x00, 0x5f, + 0x7b, 0x67, 0xa7, 0x79, 0x71, 0x52, 0x16, 0x0d, 0x83, 0xeb, 0x35, 0x33, + 0xee, 0xeb, 0x77, 0xfd, 0xd1, 0x8b, 0x3a, 0xc0, 0x89, 0x3f, 0xb9, 0xa8, + 0x14, 0x89, 0xe2, 0x3d, 0x03, 0x9b, 0xe8, 0xf7, 0x47, 0xdc, 0xd0, 0x3e, + 0xac, 0x90, 0xbe, 0x54, 0x9a, 0xbc, 0x11, 0xda, 0xde, 0xd6, 0x44, 0xc1, + 0xa4, 0xf8, 0xe9, 0xe3, 0x31, 0x09, 0x38, 0xf7, 0xe5, 0x90, 0x71, 0x4e, + 0x99, 0x95, 0x7b, 0x92, 0x07, 0x85, 0xe0, 0x50, 0xce, 0x9b, 0xc0, 0x00, + 0x04, 0x80, 0xf0, 0x75, 0x88, 0xfd, 0xdf, 0xf0, 0xda, 0x6c, 0xb5, 0xbb, + 0xfe, 0xbc, 0x02, 0x09, 0x56, 0xa8, 0xae, 0x06, 0x68, 0xdc, 0xc3, 0xbd, + 0x2d, 0x0d, 0xe7, 0x1a, 0xce, 0x8a, 0x75, 0xc1, 0xaa, 0xd5, 0xf4, 0x65, + 0xe0, 0x00, 0x04, 0xd5, 0xbc, 0xfc, 0xf2, 0xfc, 0x3f, 0xa8, 0xe8, 0x42, + 0x73, 0x01, 0x72, 0xea, 0x47, 0x75, 0x47, 0xba, 0xc2, 0x44, 0x45, 0x78, + 0xa6, 0xad, 0x65, 0xa2, 0xac, 0x9d, 0x81, 0x7f, 0xea, 0x59, 0xe0, 0x32, + 0x1f, 0x78, 0xad, 0x00, 0x01, 0x90, 0xc5, 0x8c, 0x2c, 0x3f, 0x26, 0xda, + 0x01, 0x98, 0x64, 0xb3, 0x91, 0x4a, 0x89, 0xe2, 0xf2, 0x2c, 0x93, 0x18, + 0x94, 0x0e, 0x01, 0x12, 0xea, 0x3c, 0xb2, 0xa1, 0xbf, 0xa8, 0x81, 0xb2, + 0x3c, 0xab, 0xa4, 0xe4, 0xec, 0x51, 0xb9, 0x50, 0xaa, 0x64, 0x0a, 0x50, + 0xf4, 0x35, 0xe4, 0xfe, 0x5c, 0xdc, 0xce, 0x01, 0x13, 0x7f, 0x2e, 0x90, + 0x41, 0x4c, 0x65, 0x0c, 0xe2, 0xa7, 0x55, 0x69, 0x3a, 0x7a, 0x65, 0x30, + 0x91, 0x2b, 0x8c, 0x93, 0x87, 0xaf, 0xe2, 0x3e, 0x74, 0xcb, 0x5d, 0x57, + 0x9b, 0x78, 
0x9f, 0x0e, 0xff, 0xbc, 0x00, 0x00, 0xac, 0x2e, 0x02, 0xb2, + 0xdc, 0xa0, 0xe7, 0x29, 0x19, 0x99, 0x01, 0x42, 0x09, 0x22, 0xd0, 0x48, + 0x58, 0x57, 0x40, 0xb8, 0x53, 0xd9, 0x96, 0x9e, 0x38, 0x33, 0xe9, 0x20, + 0x69, 0x08, 0x46, 0x6a, 0x4c, 0x80, 0x37, 0x7d, 0x3c, 0x64, 0x4f, 0x00, + 0x00, 0x28, 0x00, 0x62, 0x35, 0x47, 0xad, 0x6f, 0x13, 0xcb, 0x2b, 0x1a, + 0x7e, 0x89, 0xaa, 0x9b, 0x6a, 0x59, 0x1b, 0x2f, 0x30, 0x72, 0xbc, 0xfa, + 0xe1, 0xb2, 0x9f, 0xc5, 0x8a, 0x5f, 0xbf, 0x67, 0x9f, 0x87, 0x26, 0x7d, + 0xdb, 0xc9, 0xe0, 0x01, 0xbc, 0xf7, 0xae, 0x3e, 0xb5, 0x3a, 0xd7, 0x8b, + 0xb5, 0x96, 0x20, 0x5f, 0x16, 0x99, 0xe9, 0xe1, 0x20, 0x97, 0xa5, 0x69, + 0x33, 0x9a, 0xa3, 0x17, 0x63, 0xcb, 0x50, 0xbb, 0x02, 0x1d, 0x00, 0x37, + 0x80, 0x00, 0x20, 0xbc, 0x04, 0xeb, 0xc1, 0x44, 0xc2, 0x8d, 0x51, 0x53, + 0x4b, 0xd0, 0x13, 0x36, 0xfb, 0x7d, 0x2d, 0xb6, 0xe8, 0x27, 0xf9, 0x4f, + 0xa4, 0xbf, 0x33, 0xb3, 0x9b, 0x55, 0x05, 0xbc, 0x5f, 0xb4, 0xab, 0x5d, + 0x62, 0x94, 0x69, 0xee, 0x1b, 0xc0, 0x00, 0x04, 0x15, 0x08, 0x75, 0xa3, + 0x94, 0xfa, 0x0e, 0xdb, 0x22, 0x09, 0x9c, 0x52, 0x60, 0x61, 0xf3, 0x97, + 0x04, 0x81, 0x58, 0x09, 0xaf, 0x58, 0x6f, 0x1d, 0x3c, 0x36, 0x01, 0xc8, + 0xdb, 0xf5, 0x0d, 0x0f, 0x19, 0x59, 0xb8, 0x07, 0x80, 0xca, 0xce, 0xa1, + 0x83, 0xaa, 0x7c, 0x81, 0x34, 0x1c, 0x19, 0xb4, 0x37, 0x1e, 0x7f, 0xc8, + 0x77, 0x03, 0xfa, 0xc7, 0x42, 0xf3, 0xef, 0x38, 0x4b, 0x42, 0x3c, 0xd1, + 0xfa, 0x83, 0xcf, 0x8a, 0xe8, 0x5a, 0x2c, 0xfb, 0x92, 0x20, 0xb0, 0x9f, + 0xad, 0x0b, 0xbe, 0x8a, 0x9a, 0xfc, 0x72, 0x3d, 0x32, 0x8d, 0x98, 0x6c, + 0x9a, 0x6b, 0xd6, 0x8b, 0x4c, 0x5e, 0x0c, 0xea, 0xbf, 0xc0, 0x0a, 0xf4, + 0x2b, 0x0c, 0x78, 0x81, 0xd4, 0x65, 0xb6, 0x53, 0x2c, 0xdc, 0x27, 0x8d, + 0xd4, 0x54, 0xfc, 0xbf, 0x3b, 0xe4, 0x81, 0x68, 0x32, 0xbb, 0xe7, 0x20, + 0x54, 0x8b, 0x37, 0xce, 0xec, 0x57, 0xe8, 0xd1, 0xe1, 0xe3, 0xa9, 0x04, + 0xfb, 0x89, 0xdd, 0x97, 0x84, 0x44, 0x18, 0x44, 0x7d, 0xea, 0x02, 0x70, + 0xc0, 0xbe, 0x4b, 0xd4, 0xb5, 0x5a, 0xa0, 0x2e, 0x7d, 0x76, 0x97, 0xdc, + 0x2a, 0x9a, 
0x69, 0xe7, 0xd5, 0x13, 0x10, 0x6c, 0x95, 0xe8, 0x69, 0xc7, + 0xc1, 0xc4, 0x7c, 0xf7, 0x20, 0x01, 0x5e, 0x8e, 0xe3, 0xe7, 0xd8, 0xb7, + 0x2e, 0x12, 0x75, 0x47, 0x76, 0x81, 0x1f, 0xde, 0x87, 0x10, 0xaa, 0x15, + 0x4c, 0xcb, 0xd8, 0x3f, 0x2b, 0xf9, 0x7e, 0x4a, 0x15, 0xeb, 0x22, 0xc7, + 0x9d, 0x89, 0x09, 0x89, 0x09, 0x89, 0x10, 0x57, 0xed, 0xdf, 0x9f, 0xd3, + 0xbd, 0x30, 0x07, 0x94, 0xb8, 0x03, 0x5f, 0x8b, 0x0c, 0xbd, 0xfe, 0x10, + 0x29, 0x50, 0xd8, 0x3a, 0x8f, 0x84, 0xab, 0xe5, 0x1a, 0x90, 0x50, 0xf2, + 0xd9, 0x6b, 0xcd, 0xe1, 0xa4, 0xc1, 0x7e, 0xdd, 0x3f, 0x6b, 0xcc, 0x40, + 0x09, 0x97, 0x2b, 0xf0, 0xca, 0xa7, 0xc5, 0x8e, 0x34, 0x6a, 0x16, 0x7b, + 0xbe, 0x26, 0x7a, 0x47, 0xe3, 0x84, 0x17, 0x0d, 0x66, 0x9b, 0x38, 0x44, + 0x70, 0x68, 0x7d, 0x7f, 0xfb, 0x1a, 0x2a, 0xad, 0xfb, 0x27, 0x63, 0x9d, + 0x3f, 0xa3, 0xf8, 0x30, 0x0d, 0x81, 0xf2, 0x66, 0xd6, 0xe2, 0xd1, 0x8e, + 0x40, 0xee, 0x10, 0xc9, 0xfb, 0xdc, 0x07, 0x45, 0x1c, 0xdc, 0xfb, 0x50, + 0xb4, 0xee, 0x42, 0x14, 0x61, 0xb0, 0x3e, 0x38, 0xd5, 0x26, 0xca, 0x5b, + 0x9a, 0x39, 0x43, 0x9a, 0x1b, 0x9a, 0x39, 0x42, 0x38, 0xc9, 0x84, 0x2c, + 0x7e, 0x15, 0xc2, 0x6f, 0x26, 0xea, 0x47, 0x24, 0x83, 0xda, 0x7a, 0x55, + 0x54, 0x75, 0x47, 0x76, 0x07, 0x01, 0xf0, 0x5f, 0xc6, 0x91, 0xd5, 0xab, + 0xce, 0x86, 0xe5, 0x88, 0x4a, 0xdb, 0x70, 0xcb, 0x19, 0xd4, 0xae, 0xf2, + 0xce, 0x6c, 0xd5, 0x6a, 0xc3, 0xb3, 0x73, 0x1f, 0x02, 0x96, 0xb3, 0xec, + 0x47, 0xe2, 0x74, 0xc6, 0x61, 0x6e, 0x52, 0x9a, 0xcd, 0x07, 0x4f, 0xc5, + 0xaa, 0x31, 0xff, 0x4b, 0x8a, 0x59, 0x32, 0x05, 0xa7, 0xd3, 0x0e, 0xf0, + 0x69, 0xe7, 0x74, 0x96, 0x1f, 0x5a, 0xb4, 0x42, 0xff, 0x04, 0x21, 0xe2, + 0xf5, 0xc8, 0x01, 0x66, 0x98, 0x53, 0x30, 0x0e, 0x63, 0x9a, 0x2f, 0x61, + 0x0e, 0x7b, 0xb4, 0x66, 0x14, 0xe4, 0x0c, 0xf2, 0x2d, 0xc8, 0x8b, 0x33, + 0xe1, 0x33, 0x56, 0x21, 0x89, 0xe9, 0xe0, 0x19, 0x67, 0x12, 0xbe, 0xe6, + 0xd2, 0xbc, 0x0c, 0xac, 0xbf, 0x94, 0xf6, 0x80, 0x00, 0xb3, 0x9c, 0x19, + 0xcb, 0x9a, 0xd6, 0xda, 0x95, 0x6c, 0x03, 0xfa, 0x72, 0xf7, 0xe9, 0xa0, + 0xda, 0xfe, 
0x0d, 0xce, 0x84, 0xe8, 0xa1, 0x9c, 0xe8, 0xe3, 0x3b, 0x61, + 0x0a, 0x84, 0x61, 0x53, 0x90, 0x2e, 0x27, 0xed, 0x7d, 0xc3, 0x40, 0x00, + 0x64, 0xe0, 0xed, 0xe8, 0x8d, 0x67, 0x22, 0x22, 0xe8, 0x19, 0x7c, 0x52, + 0x05, 0xf8, 0x9e, 0x23, 0xb2, 0xaa, 0xdf, 0x76, 0x18, 0xaf, 0x58, 0xf2, + 0xbd, 0x2e, 0xdf, 0xc2, 0xe8, 0xaf, 0xb2, 0x7d, 0xe4, 0xe1, 0x81, 0x43, + 0xa9, 0xff, 0x32, 0xa0, 0x4e, 0xc8, 0xd4, 0xf3, 0x75, 0x9a, 0x99, 0xa9, + 0x66, 0x2f, 0xc0, 0x9c, 0xe5, 0x13, 0x32, 0x39, 0x06, 0xc0, 0x57, 0x75, + 0xd1, 0x50, 0x44, 0xc0, 0xec, 0x54, 0x33, 0xe0, 0x0f, 0xa4, 0x35, 0x54, + 0xe1, 0xd9, 0x9a, 0x00, 0x09, 0x49, 0xf0, 0x80, 0x9f, 0x2d, 0x4e, 0xb4, + 0xab, 0xfa, 0xa8, 0xad, 0xb8, 0x5f, 0xf7, 0x03, 0x80, 0x05, 0x66, 0x47, + 0xfe, 0x41, 0x1c, 0x5d, 0x92, 0xe2, 0xab, 0x52, 0xd3, 0x05, 0xe0, 0x84, + 0xdc, 0xbd, 0x7f, 0x5d, 0x06, 0xea, 0x9a, 0x00, 0x03, 0x3b, 0xa7, 0xb5, + 0x06, 0x3c, 0x8d, 0xbc, 0xbd, 0x81, 0x08, 0xb3, 0x43, 0x8d, 0x97, 0x9a, + 0x1a, 0x5b, 0x75, 0xa7, 0x86, 0x45, 0x53, 0x95, 0xf7, 0x7e, 0xfc, 0xc1, + 0x4f, 0x3a, 0x20, 0x32, 0xf6, 0xa6, 0xce, 0xea, 0x09, 0xad, 0x37, 0x4f, + 0x96, 0x0c, 0xcb, 0xca, 0x65, 0xfd, 0x7a, 0x09, 0x08, 0x52, 0xf7, 0xb7, + 0x2c, 0x93, 0xa3, 0xae, 0x55, 0x10, 0xa9, 0x83, 0x77, 0x62, 0xf3, 0xce, + 0xe2, 0xf4, 0x85, 0xb4, 0xc5, 0xa2, 0x1b, 0xf7, 0x7b, 0xe5, 0x8f, 0x2c, + 0x49, 0x5e, 0xdf, 0x21, 0xfd, 0x52, 0x41, 0xbc, 0x00, 0x00, 0x93, 0xe8, + 0xf8, 0x22, 0x65, 0xe0, 0xe6, 0x9d, 0x24, 0x65, 0xf5, 0xa3, 0xe9, 0x4a, + 0xf5, 0x9d, 0x85, 0xaa, 0x3a, 0x55, 0x7a, 0x79, 0xc1, 0xc6, 0xac, 0x22, + 0xe4, 0x57, 0x02, 0x7e, 0x30, 0xf8, 0xc5, 0x46, 0x80, 0xbc, 0x00, 0x00, + 0xc3, 0x01, 0x78, 0x7f, 0x4c, 0xfb, 0xe6, 0x35, 0x2b, 0xdc, 0x22, 0xc6, + 0xd1, 0xcd, 0x80, 0x42, 0x58, 0xe5, 0xaa, 0xc7, 0x13, 0x1c, 0x0a, 0xb9, + 0xcb, 0xbb, 0xac, 0x7b, 0x86, 0xe5, 0xd9, 0x65, 0x20, 0x6c, 0x4f, 0x8c, + 0x55, 0x50, 0xd0, 0xb9, 0xae, 0xdf, 0xca, 0x8d, 0x1e, 0x9f, 0x32, 0xd0, + 0xac, 0xa3, 0x4c, 0xcc, 0x3d, 0xeb, 0x50, 0x7b, 0x36, 0xd9, 0xe4, 0x23, + 0xc7, 0x95, 
0xf2, 0x27, 0x20, 0xb8, 0x1c, 0x20, 0xdb, 0x59, 0xe0, 0x0a, + 0xd2, 0x56, 0x93, 0xbe, 0x06, 0x47, 0x80, 0x9d, 0xe7, 0x8a, 0x22, 0x6a, + 0xd2, 0xb8, 0x6a, 0x1e, 0x64, 0x5e, 0x5f, 0x77, 0x0b, 0x03, 0x4f, 0x61, + 0xa7, 0x4c, 0x71, 0xad, 0xf6, 0x7e, 0xb8, 0x5d, 0x77, 0x7b, 0x4b, 0xea, + 0x10, 0x5c, 0x50, 0x2e, 0xe4, 0x78, 0x00, 0x01, 0x0a, 0xf8, 0x9c, 0x5d, + 0x08, 0x35, 0x16, 0x70, 0x5f, 0x72, 0x1b, 0x8d, 0x05, 0x5c, 0x9a, 0x6c, + 0x25, 0x9f, 0x89, 0x29, 0x1e, 0x23, 0xe1, 0xa3, 0x4f, 0xbc, 0x0a, 0x0e, + 0x60, 0xff, 0x4c, 0xd6, 0xeb, 0xd2, 0x24, 0x11, 0xe0, 0x2e, 0x62, 0xc2, + 0x8f, 0xad, 0xe4, 0x22, 0xb8, 0x12, 0xd6, 0x16, 0x05, 0x20, 0xb5, 0x0a, + 0xb3, 0x0a, 0x01, 0x10, 0x0c, 0x08, 0x00, 0x00, 0x00, 0x04, 0x46, 0x3e, + 0x56, 0xff, 0xfc, 0xc0, 0x20, 0xa2, 0x0a, 0x30, 0x14, 0x00, 0x23, 0x20, + 0x02, 0x10, 0x60, 0xa2, 0x41, 0xb7, 0x02, 0xfa, 0xa4, 0x60, 0x8f, 0x83, + 0xec, 0xec, 0x15, 0x58, 0xc2, 0xa3, 0x0c, 0x6f, 0x72, 0x4f, 0xff, 0xff, + 0xfa, 0x7e, 0x91, 0x29, 0x87, 0x38, 0x8c, 0x0e, 0xd7, 0xde, 0xd9, 0x96, + 0x14, 0x8d, 0xcf, 0x01, 0x9d, 0x56, 0x4d, 0xc2, 0x18, 0x94, 0x81, 0xfd, + 0x30, 0x0c, 0xfe, 0xe6, 0x98, 0x13, 0x47, 0x7f, 0x94, 0x6c, 0x4f, 0x15, + 0xf4, 0xcb, 0x91, 0xf3, 0x7d, 0x6a, 0xad, 0x5a, 0x0d, 0x7b, 0xbc, 0xff, + 0x03, 0x89, 0x02, 0xbb, 0x59, 0xbd, 0x71, 0x89, 0xec, 0x91, 0xb2, 0x5f, + 0x97, 0x5a, 0x93, 0x6b, 0x53, 0x74, 0x69, 0x29, 0xc1, 0x4f, 0xc4, 0xd2, + 0x94, 0x12, 0x65, 0xb3, 0x96, 0xde, 0x48, 0x68, 0x01, 0xed, 0x27, 0x98, + 0xf8, 0x62, 0xe1, 0x3f, 0xbd, 0x9d, 0x21, 0x70, 0xe3, 0xf4, 0x05, 0xcf, + 0xbf, 0x3e, 0xce, 0x46, 0x73, 0x31, 0x69, 0xa2, 0x9e, 0x37, 0xd7, 0xef, + 0x6c, 0x69, 0x9c, 0x25, 0x82, 0x2c, 0x23, 0xad, 0x92, 0x41, 0x32, 0x83, + 0xcd, 0x36, 0x33, 0xa7, 0x3a, 0x8f, 0x43, 0xef, 0x59, 0xf6, 0x0d, 0x64, + 0x90, 0x0d, 0x44, 0x46, 0x29, 0x6d, 0x53, 0x68, 0xcf, 0xab, 0x3f, 0x3d, + 0x17, 0xd7, 0xbd, 0xbc, 0xe9, 0x14, 0x30, 0x7f, 0x8e, 0xaa, 0x4b, 0x59, + 0xa9, 0xeb, 0x5c, 0xeb, 0x22, 0xc1, 0x5c, 0x8e, 0x7e, 0x5c, 0x8d, 0xe1, + 0x60, 0xc4, 
0xe5, 0xbd, 0xe5, 0x0e, 0x20, 0x43, 0x3a, 0x88, 0x2d, 0xdd, + 0xd1, 0x3e, 0x48, 0x2d, 0x10, 0x44, 0xf6, 0x7f, 0xe3, 0x81, 0x6c, 0xbd, + 0xb7, 0x5a, 0x89, 0x49, 0xd3, 0xf7, 0x1e, 0x9b, 0xbc, 0xf6, 0x7c, 0x41, + 0xbd, 0xec, 0x60, 0x20, 0xa5, 0x4f, 0x01, 0xd9, 0xc2, 0x10, 0x62, 0x66, + 0x17, 0x7e, 0x67, 0xcb, 0x31, 0xda, 0x63, 0x18, 0xbc, 0xbc, 0x52, 0xc1, + 0xbf, 0x7b, 0x5b, 0x21, 0x73, 0xdb, 0x2d, 0x6a, 0xab, 0x7c, 0xbc, 0x92, + 0x73, 0xfd, 0xdd, 0xce, 0x1e, 0x49, 0x49, 0x11, 0x78, 0xbd, 0x8f, 0x36, + 0xea, 0xc0, 0x96, 0x91, 0xc0, 0xc5, 0xdb, 0xdd, 0x4a, 0x66, 0xdd, 0xee, + 0xce, 0x3a, 0x00, 0xba, 0x64, 0x0e, 0x6a, 0xad, 0x3f, 0x74, 0x3a, 0xe3, + 0x52, 0x55, 0x80, 0x04, 0x04, 0xbe, 0xa2, 0xaa, 0x19, 0xb6, 0x59, 0x71, + 0xa6, 0x96, 0xd8, 0x1b, 0x94, 0x99, 0xfb, 0xba, 0x31, 0x22, 0x85, 0xdd, + 0x06, 0xc9, 0x81, 0x37, 0x11, 0x45, 0x78, 0x76, 0xa5, 0x47, 0x7c, 0xbb, + 0x5f, 0xbb, 0x3e, 0x1b, 0xe1, 0x02, 0xfa, 0x1c, 0xe0, 0x09, 0xe1, 0x95, + 0x1b, 0x07, 0xf5, 0x25, 0x49, 0xba, 0x6f, 0xa0, 0x27, 0x3b, 0xce, 0xb0, + 0xc4, 0x35, 0x46, 0xe8, 0xd3, 0xff, 0x6a, 0xb4, 0x1d, 0xff, 0xf5, 0x0b, + 0x11, 0x80, 0x0b, 0xf1, 0xfb, 0x11, 0xaf, 0xa0, 0xbd, 0x74, 0xdf, 0xdc, + 0x5d, 0x62, 0xae, 0xa0, 0x41, 0xfe, 0x31, 0x21, 0xce, 0xa7, 0x12, 0xd8, + 0x1f, 0x2f, 0x5e, 0xdb, 0x6c, 0x0b, 0x6d, 0xbe, 0x34, 0x72, 0x19, 0x46, + 0x07, 0x83, 0x23, 0x47, 0x07, 0x88, 0x53, 0x5b, 0x81, 0x27, 0xb3, 0x6d, + 0xcc, 0xbd, 0x32, 0xa7, 0x4f, 0xf4, 0xda, 0x67, 0xfe, 0x46, 0x93, 0x53, + 0xa9, 0xa3, 0xce, 0xfe, 0xb5, 0xa9, 0xfd, 0xe6, 0x6d, 0x9e, 0x18, 0xb0, + 0x21, 0xac, 0x84, 0xc5, 0x33, 0x51, 0x83, 0x88, 0xe9, 0x0d, 0xce, 0x3e, + 0x14, 0xca, 0x67, 0xd6, 0x1d, 0xde, 0x6d, 0xa5, 0x46, 0x66, 0xbe, 0x83, + 0xba, 0x83, 0x55, 0x65, 0x06, 0x32, 0xd0, 0xd1, 0x4d, 0x42, 0xc2, 0xf7, + 0xb9, 0xde, 0x6f, 0xad, 0xc7, 0x55, 0xad, 0x65, 0x9c, 0x63, 0x10, 0x2a, + 0x9f, 0x76, 0x6c, 0x01, 0x84, 0xbc, 0x52, 0xb8, 0x7d, 0xe4, 0x90, 0x27, + 0xb4, 0x04, 0x1d, 0x2f, 0x56, 0x5a, 0xac, 0x4d, 0x37, 0x7b, 0x50, 0x12, + 0x09, 0x16, 
0x96, 0x04, 0x25, 0x58, 0x2a, 0xc5, 0xb6, 0x56, 0x30, 0x3f, + 0x1d, 0x70, 0xed, 0x05, 0xd3, 0xc8, 0xd7, 0x46, 0xda, 0x69, 0xf1, 0x86, + 0x28, 0x83, 0xe9, 0x5f, 0x1f, 0xeb, 0xce, 0xa2, 0x2c, 0x24, 0x75, 0x63, + 0xa9, 0x05, 0x00, 0x31, 0x18, 0x35, 0xf1, 0xab, 0xce, 0xdf, 0x34, 0xc8, + 0xec, 0x2b, 0x61, 0x06, 0x66, 0x0d, 0x7f, 0x30, 0xc3, 0x98, 0x58, 0x89, + 0x1b, 0x1f, 0x2f, 0x8d, 0xd6, 0x91, 0xac, 0x1a, 0xd8, 0x10, 0x61, 0x29, + 0xd1, 0x50, 0x8b, 0xae, 0x1a, 0xfe, 0xfd, 0xed, 0x68, 0xbd, 0x75, 0x75, + 0xd1, 0x59, 0x1c, 0xd9, 0x37, 0xfe, 0xd5, 0xc3, 0xbd, 0x98, 0x8d, 0xc7, + 0x68, 0x07, 0x8d, 0x69, 0x76, 0x9c, 0x4a, 0x4c, 0x32, 0x91, 0x2f, 0xba, + 0xf9, 0x59, 0x8b, 0xfa, 0xe6, 0x8a, 0xa5, 0x41, 0xd7, 0xe5, 0x45, 0xe4, + 0x26, 0x3f, 0x2b, 0x48, 0x6a, 0x6d, 0xe8, 0x37, 0x4c, 0xb3, 0xbd, 0x2b, + 0xe8, 0xe4, 0x92, 0xbb, 0x97, 0x62, 0x63, 0x65, 0xe4, 0x51, 0x44, 0x6b, + 0xfe, 0x37, 0xbe, 0x96, 0xce, 0xed, 0x84, 0x3f, 0xab, 0x41, 0xf6, 0x6e, + 0x0e, 0x1d, 0x73, 0xab, 0xdb, 0x3b, 0xe5, 0x66, 0xe8, 0x4a, 0xc9, 0xdf, + 0xe0, 0x40, 0x6a, 0xa8, 0x8e, 0xcf, 0x85, 0x8b, 0x3c, 0x65, 0xb5, 0x9e, + 0xfe, 0x72, 0x95, 0xad, 0xb5, 0x59, 0x30, 0x27, 0xd2, 0x66, 0x65, 0x82, + 0xc9, 0x3b, 0x27, 0x85, 0x49, 0x7b, 0xfd, 0xc8, 0x60, 0xf1, 0x55, 0xda, + 0x18, 0x14, 0x2c, 0xf3, 0xd1, 0xa3, 0xb1, 0x48, 0x9e, 0xd1, 0x5b, 0x3e, + 0x40, 0x6f, 0x41, 0x4a, 0xeb, 0x9a, 0x98, 0xa7, 0x2b, 0xa1, 0x7a, 0x0c, + 0x8f, 0x94, 0x13, 0xda, 0x89, 0x3e, 0xd4, 0xc5, 0xfc, 0xa1, 0x6f, 0x40, + 0x66, 0x18, 0x1e, 0x41, 0x80, 0xda, 0xb0, 0x87, 0x9b, 0xa3, 0x81, 0x2d, + 0x67, 0xd6, 0x11, 0x92, 0x7b, 0x74, 0x6c, 0xe8, 0xda, 0x26, 0xee, 0x5e, + 0x28, 0x70, 0x22, 0x48, 0xcc, 0x60, 0xa8, 0xcd, 0x7d, 0x7f, 0x45, 0x11, + 0x1e, 0x47, 0x11, 0x00, 0x58, 0xea, 0x51, 0xe4, 0xb9, 0x23, 0x30, 0x4e, + 0x66, 0xf6, 0xdc, 0x53, 0xc9, 0x34, 0x26, 0xba, 0x3f, 0xc2, 0x52, 0x88, + 0x23, 0xfb, 0x4a, 0x77, 0x31, 0x8c, 0x29, 0x52, 0x30, 0x5e, 0x53, 0x99, + 0xde, 0xaa, 0xc9, 0xd0, 0xa9, 0xed, 0xd0, 0x2c, 0x1a, 0xaa, 0x2a, 0x02, + 0x2b, 0x92, 
0x7a, 0xf1, 0xab, 0x88, 0xb3, 0xf0, 0xaa, 0x4e, 0x7a, 0xde, + 0x0a, 0x76, 0x24, 0x46, 0xa0, 0x9a, 0x67, 0x19, 0xcf, 0xa7, 0xc0, 0x46, + 0x69, 0x2d, 0x02, 0xfa, 0x39, 0xaa, 0x1f, 0x83, 0x6f, 0x3c, 0xaa, 0xb6, + 0x5e, 0xac, 0x4d, 0x34, 0x26, 0x07, 0xea, 0x50, 0x96, 0xd8, 0x42, 0x47, + 0xbf, 0xa0, 0xe8, 0x4b, 0xad, 0xe1, 0x86, 0xf4, 0x87, 0x52, 0x30, 0x04, + 0x72, 0x23, 0x79, 0xc4, 0x98, 0xf4, 0x8b, 0x4f, 0x42, 0xf4, 0x58, 0x6e, + 0x11, 0x48, 0xc2, 0x69, 0x3d, 0x70, 0xc5, 0x28, 0x00, 0xbe, 0x03, 0x9d, + 0x10, 0xae, 0x6c, 0x8a, 0xa9, 0xf9, 0xe0, 0x1a, 0xf4, 0xf1, 0x3e, 0x97, + 0x9e, 0xf5, 0x2f, 0x29, 0xec, 0x1d, 0x6a, 0xf5, 0xbe, 0xbc, 0x5a, 0xd7, + 0x9f, 0x84, 0x6f, 0xb0, 0xec, 0xd3, 0x96, 0x7a, 0xd9, 0x00, 0x9a, 0xaa, + 0x62, 0x7e, 0xf1, 0x5a, 0xd7, 0x1c, 0xf6, 0xe3, 0x6c, 0x4d, 0x87, 0xc6, + 0xc2, 0xae, 0xd0, 0x91, 0xed, 0xdf, 0xa0, 0x9d, 0x37, 0x73, 0x55, 0x89, + 0x53, 0x9b, 0xe1, 0x0e, 0xae, 0x5a, 0xcb, 0xfa, 0xf1, 0x40, 0xaa, 0xb6, + 0x7c, 0xa1, 0x15, 0x45, 0x67, 0x87, 0x28, 0x4f, 0x34, 0x8c, 0xd7, 0xc9, + 0x5e, 0x91, 0x06, 0x06, 0x43, 0x44, 0xf5, 0x27, 0x1c, 0x7c, 0x36, 0xff, + 0xd9, 0x11, 0xa6, 0x1a, 0x55, 0x04, 0x57, 0x20, 0x72, 0xdb, 0xb0, 0xad, + 0x4c, 0xdd, 0x74, 0x06, 0x4b, 0x17, 0x64, 0x33, 0xca, 0xec, 0x7e, 0x37, + 0xa6, 0x68, 0x89, 0xea, 0x39, 0x4b, 0x2e, 0xa4, 0x5f, 0xb5, 0x0d, 0xd2, + 0x1a, 0x00, 0xb5, 0x95, 0x07, 0x97, 0x6d, 0xb0, 0xff, 0x8a, 0xbf, 0xa7, + 0xb8, 0xff, 0xae, 0x3f, 0x0c, 0xd8, 0xb7, 0x67, 0xee, 0x09, 0xf3, 0x74, + 0xda, 0x2b, 0x10, 0xc1, 0x33, 0xf7, 0xb0, 0x8d, 0xfa, 0x37, 0x18, 0x15, + 0xc3, 0xaa, 0xce, 0x95, 0x11, 0xb2, 0x44, 0xee, 0xad, 0x98, 0x2c, 0x5a, + 0x95, 0x67, 0x19, 0x08, 0xd8, 0x20, 0xd6, 0x87, 0xfa, 0x3e, 0x63, 0x61, + 0xb5, 0x12, 0x93, 0x6f, 0x23, 0x32, 0xcb, 0x3e, 0x0c, 0x0b, 0xaa, 0x28, + 0xff, 0xca, 0xfb, 0x53, 0x41, 0x3a, 0x78, 0xa6, 0xad, 0xfe, 0x5f, 0x06, + 0x8f, 0xe4, 0x90, 0x21, 0x9b, 0x25, 0xcc, 0x66, 0xe3, 0x10, 0xea, 0x90, + 0x8c, 0x2c, 0x10, 0x50, 0x83, 0xec, 0x46, 0xc1, 0x8b, 0x86, 0x6f, 0x64, + 0x3b, 0x03, 
0xcf, 0xd4, 0x1c, 0x49, 0xa7, 0xb8, 0x04, 0xc9, 0xa0, 0xfc, + 0xe8, 0x89, 0xb1, 0x43, 0xb0, 0xef, 0xd7, 0xed, 0x16, 0xda, 0xdd, 0x42, + 0x4f, 0xcd, 0x46, 0x2e, 0x1f, 0xa2, 0x6c, 0xf1, 0x35, 0xaf, 0xca, 0xa7, + 0x20, 0x99, 0x70, 0x0f, 0x73, 0x65, 0x69, 0x03, 0xc7, 0xe1, 0x14, 0x6d, + 0xdb, 0x0a, 0xac, 0x7d, 0x73, 0x52, 0x0e, 0x1c, 0xe1, 0xf7, 0x4e, 0x70, + 0x49, 0x5f, 0x50, 0x47, 0xf2, 0x5c, 0xad, 0x82, 0xd6, 0x58, 0x28, 0xa1, + 0x2b, 0x43, 0x70, 0xcb, 0xbb, 0xd6, 0xb4, 0x35, 0xf5, 0x69, 0x81, 0x42, + 0x10, 0x5f, 0x17, 0xf9, 0x73, 0x4e, 0xf0, 0x24, 0xe9, 0x41, 0x96, 0xb6, + 0xe8, 0xb3, 0x4c, 0xfb, 0xd6, 0x48, 0x91, 0x9e, 0x71, 0x28, 0xaa, 0xb0, + 0xd4, 0x79, 0x45, 0x22, 0xa0, 0xf4, 0x52, 0xc3, 0x27, 0xc1, 0x12, 0xa8, + 0xe0, 0xb9, 0x8f, 0x94, 0x32, 0x94, 0xa6, 0x9a, 0x15, 0x4b, 0xeb, 0x52, + 0x2c, 0xa4, 0x28, 0xe1, 0x0b, 0x2f, 0xe6, 0x70, 0xc7, 0xde, 0x3a, 0x2f, + 0xec, 0x9b, 0x10, 0x1d, 0x01, 0x10, 0x1a, 0x30, 0x29, 0x04, 0x70, 0x20, + 0x00, 0x00, 0x01, 0xa6, 0x50, 0x30, 0x42, 0x0c, 0x14, 0x58, 0x00, 0x00, + 0x10, 0x00, 0xd4, 0x33, 0xcf, 0x95, 0x8b, 0xec, 0xf4, 0x18, 0x17, 0x30, + 0x29, 0x02, 0x70, 0x40, 0x00, 0x00, 0x41, 0xa7, 0x28, 0x02, 0x10, 0x60, + 0xa2, 0xc0, 0x00, 0x01, 0xc0, 0x00, 0xd9, 0x98, 0xd6, 0xe0, 0xc1, 0x0b, + 0xbf, 0x0b, 0x30, 0x29, 0x01, 0x40, 0x80, 0x00, 0x20, 0x41, 0xa7, 0x80, + 0x09, 0xb5, 0x74, 0x02, 0xc9, 0xff, 0xb2, 0xa0, 0x00, 0x55, 0x04, 0xfc, + 0xb8, 0xfc, 0xf1, 0xe7, 0x91, 0xda, 0x85, 0x5e, 0x63, 0xf0, 0x9b, 0xf8, + 0xee, 0xe9, 0xac, 0x98, 0xca, 0x41, 0x44, 0xa6, 0xde, 0xba, 0xdd, 0xe2, + 0x54, 0xd0, 0xab, 0xb6, 0xa6, 0x51, 0x54, 0x04, 0x74, 0x11, 0xce, 0x4f, + 0xb9, 0xc5, 0x26, 0xbd, 0x59, 0x0b, 0xeb, 0xe7, 0x9f, 0xa4, 0x8e, 0x07, + 0x19, 0x65, 0x9d, 0xd9, 0x5d, 0xae, 0xa1, 0xcd, 0xcb, 0xfd, 0xe5, 0x9e, + 0xd6, 0x5a, 0x0b, 0x40, 0x2d, 0x64, 0x1e, 0x39, 0x7c, 0x9e, 0xc1, 0xf6, + 0xa0, 0x14, 0x3f, 0x85, 0x9d, 0x65, 0xf6, 0x43, 0x3e, 0xa3, 0xb7, 0xa9, + 0x66, 0x10, 0x3d, 0x02, 0xad, 0x46, 0xb7, 0xd6, 0x5f, 0x15, 0x4b, 0x7b, + 0x18, 0x7d, 
0xa6, 0x41, 0x36, 0x06, 0x8f, 0x13, 0x80, 0xd5, 0xc5, 0x7d, + 0x26, 0x33, 0x6b, 0xfd, 0x1e, 0xe9, 0x6e, 0xfc, 0x95, 0x36, 0x42, 0x99, + 0xa6, 0x7b, 0x43, 0x84, 0x5c, 0x29, 0xac, 0xd9, 0x42, 0xaf, 0x3d, 0xd8, + 0xe0, 0x5d, 0xbe, 0x62, 0x23, 0xd1, 0xaa, 0xf8, 0xbb, 0x95, 0x08, 0x3f, + 0x57, 0x96, 0x3a, 0xf0, 0x66, 0x5b, 0x3f, 0xd2, 0x4a, 0xd0, 0x30, 0x74, + 0x57, 0x13, 0xf8, 0x94, 0x75, 0x03, 0x4a, 0xe6, 0xdd, 0x3d, 0xd9, 0x98, + 0x9a, 0xa0, 0x01, 0x02, 0x45, 0xe1, 0xba, 0xfc, 0x28, 0x77, 0x94, 0x5c, + 0x10, 0xdf, 0x0f, 0x96, 0xec, 0x38, 0xf0, 0xe3, 0xd6, 0xdd, 0x87, 0x4d, + 0xf9, 0x79, 0x43, 0x04, 0x03, 0x5b, 0x88, 0xe5, 0xbd, 0x64, 0x5a, 0x96, + 0xf1, 0xd3, 0xf0, 0x3e, 0xf2, 0x83, 0x09, 0x22, 0xf9, 0x18, 0xa5, 0x53, + 0x6d, 0x80, 0x0f, 0xc1, 0x46, 0x2d, 0x8f, 0xae, 0x4c, 0xef, 0x87, 0x24, + 0x74, 0xc3, 0xf7, 0x70, 0x02, 0x0a, 0xd9, 0x62, 0xc9, 0xda, 0x99, 0x46, + 0x4f, 0x0e, 0xcc, 0x67, 0x19, 0xe9, 0x3a, 0x5c, 0x70, 0x36, 0x1a, 0xea, + 0x22, 0x76, 0xf5, 0x14, 0x59, 0x22, 0x86, 0xfe, 0xec, 0x22, 0xd9, 0x11, + 0x1a, 0x7b, 0xd1, 0xd4, 0x11, 0x3d, 0xa1, 0x92, 0x3f, 0x16, 0xe4, 0x42, + 0xb6, 0x39, 0x2f, 0xaa, 0x26, 0x60, 0x2b, 0x5e, 0xe7, 0xbb, 0x0c, 0x59, + 0x5d, 0xa7, 0x84, 0x78, 0x9f, 0xaf, 0xa0, 0x9d, 0x66, 0xb8, 0x7c, 0xfa, + 0xfd, 0x62, 0x37, 0x21, 0xe9, 0x2b, 0xd5, 0x46, 0xba, 0x92, 0x54, 0x45, + 0xe0, 0xa3, 0xba, 0x2d, 0xfa, 0x32, 0x4f, 0x30, 0xf9, 0x00, 0x29, 0x01, + 0x92, 0x69, 0x62, 0x59, 0xb4, 0xb1, 0x04, 0x00, 0x81, 0x1a, 0xc8, 0x4f, + 0x8a, 0x2c, 0xfc, 0x8a, 0xe6, 0xda, 0x1e, 0x82, 0x7f, 0x66, 0xb8, 0x4f, + 0x3f, 0x0e, 0x36, 0x42, 0x97, 0x43, 0xad, 0x92, 0x3d, 0x7c, 0x18, 0x1a, + 0xbd, 0xf7, 0x36, 0x13, 0xb7, 0x0d, 0x0e, 0x28, 0x6e, 0xa4, 0x54, 0x92, + 0x06, 0x21, 0xf3, 0xda, 0x9a, 0x67, 0x63, 0xf8, 0x81, 0x08, 0xb4, 0x7a, + 0x6c, 0x65, 0xf1, 0xdf, 0xf6, 0xb7, 0x0d, 0x9c, 0x8a, 0x04, 0x7b, 0xda, + 0x4c, 0x21, 0x20, 0x43, 0x44, 0x94, 0x81, 0xa4, 0x11, 0xfa, 0x43, 0xa8, + 0x2c, 0xb9, 0xb0, 0xee, 0x99, 0x20, 0xa3, 0x05, 0x82, 0x5f, 0x25, 0xaa, + 0xd2, 0xcb, 
0x18, 0x01, 0xca, 0xdf, 0xf9, 0x08, 0x20, 0x57, 0xf0, 0x74, + 0xca, 0x9a, 0x0f, 0x58, 0x9a, 0x59, 0xfa, 0x28, 0x0f, 0x46, 0x0e, 0x80, + 0x56, 0x8b, 0x66, 0x99, 0xb0, 0x1c, 0xa1, 0xe4, 0x9a, 0x7b, 0x8b, 0x44, + 0x3a, 0xa0, 0x4a, 0x2b, 0xc0, 0x32, 0x60, 0xfa, 0xf3, 0x8f, 0xa5, 0x88, + 0xa4, 0x57, 0xb4, 0xa3, 0xc1, 0x85, 0x11, 0x1b, 0x0f, 0xb9, 0x17, 0xec, + 0x8e, 0xb2, 0x01, 0xe2, 0x74, 0x12, 0x60, 0xd7, 0x8b, 0x3a, 0x0e, 0x08, + 0xe6, 0x83, 0x20, 0x0f, 0x20, 0xbf, 0x0a, 0x03, 0x7e, 0x27, 0x43, 0x66, + 0x50, 0xb1, 0xb2, 0x10, 0xd0, 0xf3, 0xb7, 0x2b, 0x6c, 0x77, 0x65, 0x65, + 0x69, 0x98, 0x93, 0xe6, 0x87, 0xad, 0xb4, 0xd8, 0x5c, 0x70, 0xca, 0x1f, + 0x05, 0x81, 0x84, 0x92, 0x7b, 0x99, 0x81, 0xf6, 0xec, 0x2c, 0xd1, 0x30, + 0x1b, 0x96, 0xd7, 0x57, 0xd6, 0x8a, 0x35, 0x08, 0x9b, 0x6c, 0xbe, 0x26, + 0x59, 0xc5, 0xf5, 0x70, 0xa1, 0x42, 0xc9, 0x47, 0x9d, 0x46, 0xd1, 0xdc, + 0xe2, 0x7b, 0xee, 0xa1, 0x33, 0xc3, 0x97, 0x23, 0x59, 0x8e, 0x95, 0x83, + 0x67, 0x42, 0xcc, 0x6e, 0x00, 0xc1, 0xf9, 0x5c, 0xdb, 0x8b, 0xc3, 0x87, + 0x6a, 0xe8, 0x11, 0xf8, 0xdf, 0xc9, 0x06, 0x2a, 0x6f, 0x50, 0xce, 0x02, + 0x5c, 0x33, 0x1b, 0x36, 0xd8, 0x0f, 0x02, 0x12, 0x48, 0x7d, 0xbd, 0x62, + 0x12, 0xfa, 0xa5, 0xc3, 0xf1, 0x9b, 0x1c, 0x01, 0x81, 0xed, 0x8b, 0x5b, + 0x37, 0x93, 0xe8, 0x14, 0x63, 0x06, 0x7a, 0xd0, 0x63, 0x6b, 0x54, 0x77, + 0x49, 0x52, 0xf1, 0xe2, 0x1f, 0xd8, 0xc9, 0xab, 0xe0, 0xdc, 0x88, 0xaf, + 0xb6, 0x1a, 0x6e, 0x64, 0x6a, 0xcd, 0x56, 0x6e, 0x84, 0xaf, 0x0c, 0xdc, + 0xe4, 0x12, 0xe9, 0xa0, 0x2d, 0xae, 0x4c, 0x6a, 0xb8, 0x90, 0x87, 0x3d, + 0x07, 0x66, 0x4c, 0x2e, 0x99, 0xe9, 0x4a, 0x6f, 0x77, 0x52, 0x54, 0x7e, + 0x57, 0x59, 0x00, 0xb7, 0x14, 0x0a, 0x0a, 0xce, 0x7f, 0x99, 0xad, 0x2a, + 0x6f, 0xe1, 0x8d, 0xe6, 0x76, 0x9b, 0x46, 0xbe, 0x38, 0x56, 0x1a, 0x25, + 0xa6, 0xb1, 0x81, 0x13, 0x47, 0xb0, 0x86, 0xb5, 0x57, 0xff, 0xce, 0xca, + 0xa9, 0x52, 0xf2, 0x55, 0xdb, 0x02, 0x63, 0xbd, 0xb6, 0x56, 0xf1, 0xcf, + 0xa0, 0xdc, 0xf7, 0xea, 0x2d, 0x00, 0x29, 0x0e, 0x72, 0x3e, 0xeb, 0x7f, + 0x08, 0xd0, 
0x1a, 0x23, 0x15, 0xea, 0x5c, 0xfa, 0xc0, 0xbd, 0xaa, 0x73, + 0x86, 0x64, 0xb9, 0x21, 0xd9, 0xd7, 0x2b, 0x7b, 0xe6, 0x08, 0xa1, 0x33, + 0x43, 0xf8, 0x12, 0x89, 0x84, 0x1f, 0xd9, 0x82, 0xb0, 0x13, 0xc5, 0x40, + 0x68, 0x72, 0x1a, 0xd8, 0xf4, 0x20, 0xd5, 0x75, 0xbe, 0xcb, 0xca, 0x13, + 0x6b, 0x73, 0x60, 0x19, 0x87, 0x19, 0x18, 0x57, 0x72, 0xce, 0xb7, 0xd4, + 0x6a, 0x3d, 0x93, 0x69, 0xa4, 0xe2, 0x04, 0xf7, 0x32, 0xb8, 0xad, 0xa1, + 0x5f, 0x5f, 0xc1, 0x53, 0xf7, 0x8e, 0x7d, 0x9f, 0x1f, 0x22, 0xe1, 0x34, + 0x94, 0x2a, 0x33, 0x4d, 0x35, 0xea, 0xed, 0x37, 0x7e, 0xcd, 0x25, 0xbb, + 0xff, 0xaa, 0x6d, 0x72, 0xbf, 0xb4, 0xc2, 0x73, 0xf7, 0xdc, 0x94, 0x9d, + 0xd0, 0x44, 0x0f, 0xdd, 0x18, 0x88, 0x56, 0x9c, 0x73, 0x36, 0x1c, 0x45, + 0xef, 0x2e, 0x99, 0xb1, 0x38, 0xd5, 0x4a, 0xf9, 0x26, 0x4d, 0x7f, 0x0b, + 0x03, 0xde, 0x0d, 0xd5, 0x98, 0x9d, 0xcf, 0xd8, 0xdc, 0xd4, 0x58, 0x77, + 0xc3, 0x71, 0x26, 0x91, 0x29, 0xd3, 0xa1, 0x55, 0x0e, 0xdc, 0xed, 0xfc, + 0x37, 0xf5, 0xf8, 0x40, 0xfc, 0x1c, 0x60, 0xd8, 0xbf, 0xd8, 0xc5, 0xf1, + 0xf5, 0x70, 0x1c, 0xd8, 0x13, 0x72, 0xc6, 0xaf, 0x35, 0xa6, 0x1f, 0xe1, + 0x99, 0xff, 0x89, 0x8e, 0xe4, 0x16, 0x75, 0x13, 0xf4, 0x52, 0xb4, 0x21, + 0x3c, 0xfb, 0xfc, 0x79, 0x2a, 0x91, 0xe8, 0x14, 0x4c, 0x76, 0x00, 0xc6, + 0x20, 0x2e, 0x51, 0xbd, 0x4e, 0x56, 0x49, 0x95, 0x44, 0xe1, 0x48, 0x85, + 0x6d, 0xcc, 0x46, 0x90, 0x75, 0xb7, 0x2b, 0xb6, 0x25, 0x24, 0xd1, 0x83, + 0x07, 0x10, 0x39, 0x43, 0x6e, 0x86, 0x33, 0x16, 0xc2, 0x87, 0x02, 0x67, + 0xf3, 0x1a, 0x0d, 0xa3, 0x77, 0x99, 0xaa, 0x84, 0x8e, 0x08, 0x8e, 0x1d, + 0xe8, 0x80, 0xfb, 0xfd, 0xb4, 0x16, 0x39, 0x46, 0xdd, 0x2f, 0x54, 0x69, + 0x2e, 0x3e, 0x0d, 0xed, 0xb0, 0xf3, 0xd0, 0xcc, 0x25, 0x91, 0x5a, 0x3b, + 0x49, 0xab, 0x52, 0xc8, 0x95, 0xf9, 0x04, 0x84, 0x1b, 0xfa, 0x60, 0xf5, + 0x6d, 0x36, 0xb5, 0xae, 0xaa, 0xd1, 0x6c, 0xd1, 0x1a, 0x0c, 0x4d, 0xf3, + 0x0e, 0xb0, 0x9c, 0xd3, 0x78, 0x20, 0x4a, 0x49, 0x1d, 0x77, 0x80, 0x71, + 0x78, 0xb1, 0x65, 0xdb, 0x8f, 0xd9, 0xe5, 0x4a, 0x29, 0x93, 0xd9, 0x23, + 0x31, 0x8e, 
0xae, 0x54, 0x5b, 0x74, 0x02, 0xa9, 0x89, 0x93, 0x64, 0x1e, + 0x5b, 0xbb, 0xd8, 0xe2, 0x44, 0x4b, 0xbb, 0xab, 0x8e, 0x6e, 0xb0, 0x17, + 0x09, 0xe0, 0xe9, 0xc6, 0x68, 0xb1, 0xd6, 0x59, 0x53, 0xcc, 0x35, 0x75, + 0x03, 0xe9, 0x51, 0x7f, 0x11, 0x54, 0xe5, 0x6b, 0x06, 0x11, 0x8e, 0x28, + 0x39, 0xd9, 0x15, 0x80, 0x8c, 0x01, 0x1c, 0x50, 0xa9, 0x74, 0x97, 0x39, + 0x61, 0xed, 0xad, 0x15, 0xc6, 0xfa, 0x9b, 0x30, 0x55, 0x13, 0x53, 0x8e, + 0x9f, 0xcc, 0xee, 0x37, 0x4a, 0x1b, 0x0d, 0x77, 0x87, 0x80, 0x8e, 0xbf, + 0xcf, 0x0f, 0x94, 0xd9, 0xfe, 0xbc, 0x91, 0x53, 0x19, 0xad, 0x24, 0xb9, + 0xd0, 0x2e, 0xc2, 0xbf, 0xd2, 0xa4, 0x12, 0x6d, 0x95, 0xec, 0xd6, 0xf2, + 0x26, 0xa4, 0xb4, 0xd2, 0x48, 0x78, 0x36, 0xf0, 0xc0, 0xcc, 0xc8, 0xab, + 0xf4, 0xfb, 0x4e, 0x3b, 0x1b, 0x41, 0xff, 0x38, 0xb7, 0xa3, 0x6d, 0xeb, + 0xf0, 0x9a, 0x4e, 0xf4, 0x30, 0xe1, 0xc8, 0x59, 0xb4, 0xee, 0x99, 0xe6, + 0x21, 0xdf, 0xbc, 0x39, 0x66, 0x79, 0x86, 0x1c, 0xa3, 0x3f, 0x92, 0xbb, + 0xf3, 0x78, 0xdb, 0xc2, 0x47, 0x3b, 0x32, 0x87, 0xde, 0xe6, 0xf3, 0x72, + 0x82, 0x39, 0x48, 0xfb, 0x4a, 0x03, 0x02, 0xc4, 0xea, 0x8e, 0xee, 0x3d, + 0xde, 0xbc, 0x43, 0x4b, 0xc7, 0xc2, 0x6a, 0xe4, 0x8e, 0x4f, 0xed, 0xd3, + 0xeb, 0x6b, 0x24, 0x6d, 0x72, 0x53, 0xdb, 0xfb, 0xa2, 0x31, 0x47, 0xdb, + 0xff, 0x16, 0x60, 0xc9, 0xf3, 0xf9, 0x62, 0x7f, 0x64, 0xfe, 0x31, 0xc3, + 0x1a, 0xe2, 0xbb, 0x6c, 0xf3, 0x13, 0x9d, 0xe4, 0x2f, 0x4f, 0xeb, 0x67, + 0x95, 0xe4, 0x57, 0x02, 0x05, 0xaa, 0x21, 0xaa, 0x46, 0x3d, 0x40, 0x6e, + 0x52, 0xdb, 0x6f, 0x3c, 0x4a, 0xe2, 0x53, 0x8a, 0x5f, 0xcf, 0x70, 0x64, + 0x6b, 0xd7, 0x89, 0xbc, 0x43, 0x75, 0x6c, 0xac, 0x44, 0xd9, 0xb0, 0x7d, + 0x38, 0xa4, 0xb9, 0x81, 0x93, 0x39, 0x87, 0x0d, 0x52, 0x27, 0x56, 0x1b, + 0x07, 0xe7, 0xf6, 0xe9, 0xbb, 0xde, 0xb2, 0xaf, 0x44, 0x00, 0xe2, 0x12, + 0x9b, 0xea, 0xf1, 0xdb, 0x4f, 0xdd, 0xa4, 0xe0, 0xc3, 0xcd, 0x0b, 0x29, + 0x5e, 0x61, 0xd5, 0x21, 0xa5, 0x3a, 0x71, 0xd1, 0xb8, 0x67, 0x8b, 0x7a, + 0x76, 0xda, 0xd1, 0xb9, 0xb5, 0xde, 0x39, 0x73, 0x1c, 0x82, 0x07, 0xa7, + 0x8c, 0x0b, 
0xcb, 0x0e, 0xe7, 0x1b, 0x55, 0x50, 0xc7, 0x8e, 0x3c, 0xe3, + 0xc2, 0x31, 0x5e, 0x30, 0xee, 0x91, 0xcb, 0xec, 0xc3, 0x4b, 0xbb, 0xa4, + 0x4e, 0x93, 0x5e, 0x44, 0x30, 0x88, 0xd2, 0x84, 0x44, 0x0f, 0x90, 0xb9, + 0xf1, 0x9a, 0x4c, 0x8f, 0xb4, 0xd0, 0x95, 0x39, 0x0b, 0x13, 0xc8, 0xb5, + 0x6b, 0x02, 0xbb, 0x7f, 0x01, 0x67, 0xe8, 0x65, 0x50, 0x9f, 0x04, 0x9d, + 0x04, 0x30, 0x32, 0x01, 0xe2, 0x00, 0x00, 0x68, 0xa3, 0x4f, 0x90, 0x2c, + 0x7f, 0x3c, 0xc5, 0x9d, 0xfe, 0x55, 0x40, 0x00, 0xfe, 0x87, 0x7b, 0xc1, + 0x84, 0xc9, 0x5f, 0x4b, 0xc2, 0x7d, 0x54, 0x41, 0x97, 0xa9, 0x62, 0x7f, + 0xfc, 0x92, 0x55, 0x9e, 0x0c, 0x17, 0xe4, 0x04, 0x19, 0xd1, 0xc4, 0x17, + 0xee, 0x1a, 0xcd, 0x27, 0xf0, 0x52, 0xbb, 0xb0, 0xb1, 0x3c, 0x01, 0x42, + 0x5d, 0xbd, 0x28, 0xa5, 0x1e, 0x17, 0xe0, 0x10, 0xe7, 0x75, 0x24, 0xc4, + 0xa8, 0xeb, 0xa4, 0xf9, 0xf3, 0x97, 0xd7, 0xa7, 0x61, 0x7d, 0x28, 0x6b, + 0x0f, 0xa6, 0x85, 0x9f, 0x40, 0x14, 0x14, 0xaf, 0x27, 0x64, 0xc6, 0x18, + 0x14, 0xb3, 0xea, 0x33, 0x7f, 0xfe, 0x75, 0xea, 0xfa, 0x00, 0x5e, 0x00, + 0x5f, 0x2b, 0xc8, 0x00, 0x2c, 0xaf, 0xaf, 0x8f, 0xe7, 0x7a, 0xa4, 0xea, + 0xb5, 0x9f, 0xd1, 0xb8, 0x03, 0x7e, 0x5b, 0x69, 0x07, 0x0b, 0x7b, 0x4b, + 0x2b, 0xb4, 0xe7, 0x98, 0x85, 0xe0, 0xb3, 0xc5, 0xcb, 0x91, 0xd3, 0x71, + 0x4c, 0x14, 0xaf, 0x53, 0xdd, 0xee, 0xfe, 0x99, 0x3b, 0x87, 0x44, 0xc0, + 0x23, 0xa0, 0xc9, 0x0d, 0x3d, 0x61, 0x51, 0x71, 0x7f, 0xc8, 0xa7, 0x55, + 0x12, 0x7b, 0x66, 0xb9, 0xe3, 0x20, 0x3b, 0x8c, 0x6c, 0xf9, 0x0e, 0xdd, + 0x9c, 0xe4, 0x6f, 0xa4, 0x19, 0x0a, 0x54, 0x3e, 0xa9, 0xca, 0x1f, 0xd5, + 0x08, 0x20, 0xb8, 0x87, 0xff, 0xde, 0x7d, 0x3b, 0x93, 0xa9, 0x64, 0x65, + 0x9f, 0x28, 0xd2, 0x9f, 0x2e, 0xbd, 0x6b, 0x25, 0x3c, 0xb3, 0xe7, 0x02, + 0xf2, 0xac, 0x5d, 0xbf, 0x5d, 0x67, 0xe6, 0x38, 0xf1, 0xce, 0x71, 0x8e, + 0xf1, 0xe1, 0xd0, 0x01, 0x33, 0x91, 0x96, 0x94, 0xdc, 0xce, 0x95, 0x16, + 0x66, 0x3d, 0x2b, 0xea, 0xd7, 0x84, 0xb3, 0xc7, 0x89, 0xd4, 0x8a, 0x81, + 0x06, 0xa2, 0xb4, 0xf5, 0x05, 0xb3, 0x94, 0x4a, 0xd9, 0xc7, 0x3d, 0x9c, + 0x48, 0xcd, 
0x54, 0xab, 0xbf, 0x60, 0xc9, 0x79, 0x0f, 0x34, 0x88, 0x97, + 0xdd, 0xe4, 0x77, 0x41, 0xe4, 0x94, 0x91, 0x24, 0xcc, 0xd4, 0xa5, 0xb8, + 0x8b, 0xb2, 0xb3, 0xd4, 0x1b, 0xa9, 0xe4, 0x8b, 0x26, 0xa5, 0xbd, 0x32, + 0x69, 0x52, 0xe3, 0x34, 0x5e, 0xfa, 0x79, 0x5f, 0xbe, 0xe3, 0x81, 0x2a, + 0x2d, 0xc4, 0xa3, 0xb3, 0x7a, 0x68, 0xde, 0xf5, 0x92, 0xe6, 0x3c, 0x5c, + 0x1b, 0x0e, 0xbd, 0xea, 0x1b, 0x68, 0x5a, 0x0e, 0xb2, 0xd2, 0x1c, 0x7b, + 0x1d, 0x7b, 0xdc, 0x8a, 0xc3, 0xb7, 0x7f, 0xc4, 0x47, 0xd5, 0xb3, 0xfa, + 0x6b, 0xd6, 0x0e, 0x87, 0xfb, 0x3e, 0x9b, 0x5f, 0xd1, 0xeb, 0x28, 0x0c, + 0xce, 0x6c, 0x32, 0x17, 0xc2, 0x36, 0xe6, 0x8f, 0x72, 0x4f, 0x73, 0x0d, + 0x65, 0x1e, 0xe7, 0xec, 0xac, 0x00, 0xad, 0x80, 0x8a, 0x90, 0x3e, 0x6f, + 0x98, 0xa0, 0x17, 0x51, 0x77, 0x40, 0x0e, 0xd9, 0x8a, 0x94, 0x81, 0xc6, + 0x37, 0x71, 0x0c, 0xd4, 0x05, 0xa5, 0x51, 0x87, 0xb7, 0xca, 0xf4, 0x15, + 0xa1, 0x56, 0xa1, 0x04, 0xf8, 0x35, 0xef, 0x38, 0x20, 0x3a, 0x07, 0x27, + 0x0f, 0x3a, 0xb2, 0x4c, 0x1d, 0x12, 0x99, 0x81, 0x8d, 0xf3, 0xdd, 0xa3, + 0xa3, 0xdf, 0x62, 0x49, 0x21, 0xdc, 0xf3, 0xc0, 0xef, 0x31, 0x48, 0xe7, + 0x74, 0xa6, 0x1d, 0xc6, 0x9c, 0x6a, 0x92, 0x2e, 0x24, 0x3d, 0x8a, 0xd6, + 0x90, 0x6f, 0x05, 0x92, 0x04, 0x10, 0x6e, 0x82, 0x8d, 0xc1, 0xe8, 0x2e, + 0xad, 0xed, 0x4c, 0x5d, 0xcb, 0x20, 0x0f, 0xc3, 0x91, 0x11, 0xca, 0xc7, + 0x85, 0x82, 0x1e, 0xe2, 0x81, 0xaf, 0x30, 0x3b, 0x53, 0x1e, 0x3b, 0xed, + 0x66, 0x15, 0xb6, 0x28, 0x59, 0x26, 0x12, 0xfd, 0x9c, 0xe6, 0x73, 0xa0, + 0xa6, 0x8a, 0x1f, 0x4d, 0xc1, 0x68, 0xc3, 0x7e, 0x09, 0xf3, 0x3a, 0x29, + 0xc5, 0x9f, 0x2a, 0x93, 0x99, 0x6d, 0x73, 0x5a, 0x65, 0x62, 0xc3, 0xd1, + 0xab, 0x15, 0xe9, 0xd2, 0xd5, 0x70, 0xbe, 0x4e, 0xf5, 0xc5, 0x7c, 0x88, + 0x2d, 0x70, 0x06, 0x05, 0x01, 0x10, 0x02, 0x18, 0xb8, 0xad, 0x04, 0xab, + 0x04, 0x01, 0x10, 0xa7, 0x04, 0x30, 0x32, 0x03, 0x24, 0x07, 0x00, 0x40, + 0xa3, 0x4f, 0xb0, 0x2e, 0x7e, 0x79, 0x85, 0x99, 0xfe, 0x55, 0xa0, 0x00, + 0xfe, 0x6d, 0xc6, 0x23, 0x45, 0x72, 0xd9, 0x05, 0xe9, 0xbe, 0xfa, 0xe1, + 0xef, 0xf6, 
0xb7, 0xbc, 0x02, 0x97, 0x0d, 0xcb, 0x31, 0x30, 0x5c, 0xc4, + 0x67, 0xcd, 0x6b, 0x10, 0xba, 0xe8, 0xaa, 0x61, 0xf1, 0xfb, 0x91, 0x08, + 0xd0, 0x5d, 0x39, 0xd0, 0xa4, 0x55, 0x12, 0xfa, 0xa0, 0x8d, 0xc7, 0x60, + 0x65, 0x52, 0x17, 0xe9, 0x87, 0xb4, 0x87, 0xf6, 0x2e, 0x76, 0x29, 0x5c, + 0x9f, 0x9f, 0xbd, 0x87, 0x3c, 0x83, 0xd7, 0x80, 0x33, 0xef, 0xd1, 0x35, + 0x60, 0x9c, 0x77, 0xa0, 0x76, 0xd1, 0xe7, 0xb2, 0x4e, 0xb9, 0x65, 0x79, + 0xc6, 0x82, 0x76, 0x05, 0xbd, 0xcf, 0x4d, 0xa2, 0x60, 0x1a, 0x47, 0xb0, + 0x1d, 0xb0, 0x58, 0xec, 0x58, 0x31, 0xd3, 0xa2, 0xaa, 0x69, 0xda, 0x54, + 0x6d, 0x30, 0x68, 0x3e, 0xbe, 0x8c, 0x11, 0x7e, 0xe5, 0x25, 0x48, 0xf3, + 0xd9, 0x04, 0x6c, 0x70, 0x85, 0xce, 0xda, 0xb8, 0x54, 0x5b, 0xef, 0x9a, + 0xf5, 0x62, 0x5a, 0x90, 0x2c, 0x1a, 0xf5, 0x51, 0xdd, 0x23, 0xc7, 0x46, + 0x95, 0x71, 0xd7, 0xc7, 0x42, 0x62, 0x5c, 0x08, 0xe5, 0xc9, 0xb2, 0x6f, + 0x44, 0x18, 0x3d, 0x0d, 0xd3, 0x6b, 0x5e, 0x16, 0x89, 0x0a, 0x45, 0x04, + 0xbb, 0x3c, 0x77, 0xf2, 0x8d, 0x9d, 0xb1, 0x9e, 0x13, 0xb3, 0x78, 0x76, + 0x4c, 0x52, 0x5e, 0x3c, 0xa0, 0x2e, 0x39, 0x23, 0xab, 0xa8, 0xd1, 0xe6, + 0xb7, 0xb6, 0xde, 0xd6, 0x38, 0x5e, 0xc4, 0x3f, 0x1d, 0x48, 0xd5, 0x60, + 0x79, 0xf4, 0xd6, 0x5a, 0xae, 0xc9, 0xa9, 0x42, 0x3d, 0x14, 0x06, 0x41, + 0xb0, 0x53, 0x3e, 0xeb, 0xf6, 0x16, 0x12, 0x4a, 0x9b, 0x1d, 0xc9, 0xef, + 0x4f, 0x27, 0xa3, 0x5c, 0xe6, 0x42, 0x72, 0xa1, 0x42, 0xaa, 0x41, 0x23, + 0xad, 0x6e, 0xa4, 0xd4, 0xa8, 0x49, 0xad, 0xc1, 0x89, 0xb5, 0xff, 0x1e, + 0x13, 0xa3, 0xad, 0x42, 0x82, 0x75, 0xb6, 0x42, 0x4b, 0x7f, 0x2e, 0x88, + 0x90, 0xc7, 0xc4, 0x95, 0x8a, 0x25, 0x44, 0x08, 0xbb, 0x90, 0x97, 0x40, + 0x6c, 0xf4, 0x57, 0xc4, 0x15, 0xc6, 0xd2, 0x97, 0x17, 0xea, 0xfb, 0xd7, + 0x07, 0xeb, 0xf0, 0x23, 0x74, 0x09, 0x92, 0x6a, 0xf5, 0x4a, 0x08, 0x8e, + 0x78, 0x1b, 0xa2, 0x98, 0x97, 0x3e, 0xd5, 0x40, 0x0d, 0x74, 0x4d, 0x41, + 0xe0, 0xe0, 0x78, 0x5b, 0x58, 0x5a, 0x5b, 0x93, 0x0e, 0x51, 0x3a, 0x39, + 0x40, 0xa5, 0x77, 0xe6, 0x17, 0xa6, 0xc2, 0x84, 0x03, 0xa5, 0x3f, 0x3e, + 0xca, 0x02, 
0x72, 0xb1, 0x53, 0xff, 0xe8, 0x8d, 0xce, 0x1b, 0x6d, 0x25, + 0x2e, 0x21, 0xa9, 0x30, 0x4c, 0x8d, 0x33, 0x16, 0x69, 0x7b, 0x4e, 0x4c, + 0x7d, 0xb3, 0x9d, 0x8a, 0xa2, 0x07, 0xca, 0x4e, 0x9c, 0x3c, 0x54, 0x08, + 0x09, 0x85, 0xbc, 0x10, 0x27, 0x39, 0x9e, 0x30, 0x78, 0x99, 0x8b, 0x0f, + 0x13, 0xa8, 0xd9, 0x4b, 0xe1, 0x7c, 0xf0, 0xf1, 0x33, 0xb5, 0x08, 0xf9, + 0x36, 0x4f, 0x14, 0x1e, 0xc6, 0x5f, 0x86, 0xab, 0xd9, 0x1e, 0xfa, 0x26, + 0xea, 0xdd, 0x8d, 0xdc, 0x98, 0x5b, 0x9c, 0x36, 0x0c, 0x5c, 0xfa, 0x96, + 0x49, 0xea, 0x7f, 0xfb, 0x5c, 0x87, 0x34, 0x0d, 0xc4, 0xc1, 0x82, 0x2d, + 0x1a, 0xf4, 0xfb, 0x06, 0x28, 0x43, 0x31, 0x75, 0xda, 0x3d, 0xf2, 0xef, + 0xb4, 0x33, 0xcb, 0xcf, 0x6c, 0x21, 0xf9, 0x73, 0x6c, 0x5a, 0xd5, 0xac, + 0x5d, 0xe1, 0x8f, 0xdb, 0xad, 0xf7, 0x0e, 0x08, 0x36, 0x83, 0xfc, 0xe1, + 0x1c, 0x3e, 0xf4, 0xe6, 0x0c, 0x6f, 0xd7, 0x98, 0x84, 0x3a, 0x46, 0xa1, + 0xa6, 0xaa, 0xab, 0xa3, 0xd2, 0xe6, 0x9c, 0x36, 0x90, 0x5c, 0x84, 0x45, + 0xf7, 0x27, 0x68, 0x42, 0x14, 0xaa, 0x76, 0xfe, 0x39, 0x56, 0x1b, 0xa7, + 0x40, 0x25, 0x46, 0x66, 0x65, 0xd3, 0x6e, 0x42, 0xd8, 0x66, 0x3c, 0x26, + 0xba, 0x2d, 0xe6, 0x59, 0x7f, 0x49, 0x86, 0xf6, 0x19, 0x75, 0xfc, 0x28, + 0xc2, 0x75, 0x5b, 0x80, 0x06, 0x05, 0x01, 0x10, 0x02, 0x18, 0xa8, 0xd6, + 0x07, 0x81, 0x07, 0x01, 0x10, 0xfd, 0x06, 0x30, 0x29, 0x03, 0x54, 0x02, + 0xa0, 0x46, 0x4e, 0x9e, 0x40, 0x47, 0xa4, 0x48, 0x8b, 0x27, 0xfe, 0x0d, + 0x00, 0x3d, 0x5d, 0xaa, 0x74, 0xa1, 0xa9, 0x46, 0x95, 0x54, 0xc4, 0x88, + 0x99, 0xb7, 0x9a, 0x46, 0x10, 0x50, 0x44, 0x35, 0xf4, 0xc4, 0x19, 0x14, + 0x90, 0xd7, 0x46, 0x85, 0xce, 0x13, 0x23, 0x4b, 0xad, 0x18, 0x20, 0xac, + 0x71, 0x11, 0xf0, 0x9e, 0x3b, 0x8e, 0xaf, 0x68, 0x12, 0xa5, 0x26, 0xec, + 0x84, 0xe5, 0x93, 0x00, 0x85, 0x91, 0x72, 0x97, 0xd2, 0xcd, 0x4b, 0xf6, + 0x12, 0x14, 0x38, 0xd6, 0xa2, 0x70, 0x71, 0x96, 0xea, 0xde, 0xf9, 0xc6, + 0xd3, 0x62, 0x29, 0xb9, 0xdd, 0x9c, 0xcb, 0x02, 0xff, 0x42, 0x48, 0xac, + 0xe9, 0x4f, 0x99, 0x28, 0x60, 0xc4, 0x27, 0x2c, 0xfc, 0xd1, 0xfc, 0xfb, + 0x5f, 0xe0, 
0x72, 0xe3, 0xdb, 0x04, 0xc3, 0x1d, 0x09, 0x72, 0x9d, 0x3f, + 0xf9, 0xb3, 0x3e, 0xad, 0xd5, 0x6e, 0x6c, 0x89, 0xd1, 0x58, 0x7f, 0xcb, + 0xe1, 0x2d, 0xa2, 0x95, 0xc0, 0xaf, 0x00, 0x5e, 0x44, 0x43, 0xa8, 0x54, + 0x41, 0xd9, 0xfb, 0x0b, 0x65, 0x1d, 0x70, 0x8a, 0xa7, 0x00, 0xae, 0xb1, + 0x6b, 0x51, 0xed, 0xd5, 0x26, 0xd8, 0x1d, 0xad, 0xf8, 0x78, 0x8d, 0xc8, + 0x2e, 0x33, 0xa0, 0x49, 0x08, 0xb0, 0xb4, 0xaa, 0xec, 0x95, 0xe9, 0x4d, + 0xec, 0x7a, 0x36, 0xa5, 0xf8, 0xaf, 0x43, 0xc7, 0x52, 0x7e, 0x7e, 0xe0, + 0xab, 0x28, 0x4a, 0x90, 0x7e, 0x78, 0x38, 0x96, 0xea, 0x43, 0x60, 0x2a, + 0xf4, 0x5c, 0xa6, 0x5f, 0x3c, 0xeb, 0xe6, 0x47, 0x4f, 0x94, 0xf8, 0xa1, + 0x4c, 0x36, 0x5e, 0xe1, 0xc2, 0x40, 0xea, 0xfe, 0xf9, 0xf1, 0x84, 0x60, + 0x93, 0xb0, 0xdd, 0xb4, 0x00, 0x78, 0xc5, 0xec, 0xa8, 0x5a, 0x77, 0x93, + 0xf4, 0xff, 0x51, 0x9f, 0xd8, 0xba, 0xd3, 0x48, 0x1d, 0x7e, 0xb9, 0x51, + 0x83, 0x3c, 0x52, 0xda, 0x61, 0x89, 0x51, 0x6b, 0xf8, 0x08, 0xb0, 0xaa, + 0x3d, 0x03, 0xc5, 0x5e, 0xd4, 0xd9, 0x7d, 0xd2, 0xca, 0x39, 0x9e, 0xc3, + 0x87, 0xab, 0x88, 0xca, 0x2a, 0x8c, 0xf5, 0xe0, 0x3b, 0x5c, 0x1f, 0x8b, + 0x06, 0x80, 0xf4, 0xd7, 0xcc, 0x26, 0x77, 0xb2, 0x9a, 0xc3, 0x6e, 0xd0, + 0x17, 0x4c, 0x50, 0x05, 0xcf, 0x84, 0x51, 0xb9, 0xf8, 0x33, 0x28, 0x02, + 0x1f, 0x56, 0x12, 0x16, 0x2a, 0x70, 0x97, 0xc6, 0xe9, 0xc8, 0xe7, 0x4f, + 0x00, 0x0e, 0x9f, 0x54, 0x40, 0xed, 0xb1, 0x34, 0x26, 0x52, 0x90, 0xe8, + 0x98, 0xfd, 0xeb, 0x06, 0xd3, 0xf0, 0x22, 0x4f, 0x82, 0xb2, 0xfa, 0xc2, + 0x32, 0x09, 0x39, 0x46, 0xc6, 0xfa, 0xd5, 0xaf, 0x12, 0x45, 0x85, 0x1a, + 0x96, 0x2d, 0x3c, 0x61, 0x90, 0x33, 0x3d, 0xf3, 0xd2, 0x7c, 0x11, 0x1a, + 0x04, 0x10, 0x8c, 0xb3, 0x55, 0x0d, 0xf7, 0x6a, 0x72, 0xec, 0xa9, 0xa4, + 0xe8, 0x61, 0x0c, 0x42, 0x80, 0x19, 0xed, 0xf3, 0x95, 0x06, 0x02, 0x86, + 0x18, 0x3c, 0x3d, 0x67, 0x29, 0x71, 0xd5, 0x16, 0xb9, 0x2b, 0x99, 0xae, + 0x68, 0x2b, 0x19, 0x68, 0xcc, 0x06, 0xdd, 0xcf, 0x96, 0x72, 0x08, 0x3a, + 0x75, 0x52, 0x7f, 0xf9, 0x49, 0x84, 0x55, 0x23, 0xc2, 0x96, 0x99, 0xd8, + 0x73, 0x6a, 
0x00, 0xc6, 0x07, 0x33, 0x43, 0xa9, 0x35, 0x83, 0x6e, 0x1b, + 0x94, 0x04, 0x89, 0xb8, 0x1b, 0x2a, 0x72, 0xa6, 0xcb, 0x53, 0xc5, 0xe6, + 0x4c, 0x40, 0xd2, 0x83, 0x77, 0x6c, 0x18, 0x66, 0x68, 0x1b, 0x33, 0x9a, + 0x55, 0xf3, 0x0b, 0xc3, 0x34, 0xf9, 0xaa, 0x43, 0x6d, 0x53, 0x42, 0x2c, + 0x55, 0x1b, 0xc3, 0x79, 0xf7, 0x3c, 0x30, 0xc0, 0xd6, 0xdd, 0x43, 0x80, + 0x1f, 0x3f, 0x5a, 0x81, 0x9d, 0xa8, 0xaf, 0xb2, 0x15, 0x82, 0x28, 0x4c, + 0x8a, 0x45, 0x97, 0xbc, 0x88, 0x9a, 0x08, 0x52, 0x49, 0xac, 0x1a, 0x90, + 0x5e, 0xd9, 0xa8, 0x43, 0xf8, 0xb2, 0x96, 0x09, 0x16, 0x90, 0xc1, 0x4a, + 0x5a, 0x1a, 0xd3, 0x59, 0x4b, 0xc7, 0x90, 0xa7, 0x35, 0xfc, 0x19, 0xa0, + 0x26, 0x5f, 0xff, 0x99, 0x3e, 0xc8, 0xd6, 0xf1, 0x02, 0x3e, 0xc0, 0x81, + 0x65, 0x55, 0x1b, 0x73, 0x3c, 0xe6, 0x6e, 0x9b, 0xed, 0x5b, 0xb3, 0x9f, + 0x47, 0x34, 0xf7, 0xaa, 0x08, 0x44, 0xf2, 0x8c, 0x02, 0x6a, 0x9e, 0x33, + 0x25, 0x45, 0x6a, 0x8e, 0x29, 0xe6, 0x76, 0x7d, 0x1a, 0x29, 0xd2, 0xf8, + 0xe6, 0xb2, 0xed, 0x86, 0x1a, 0x11, 0x79, 0xa8, 0xf6, 0xf0, 0x3d, 0x8e, + 0xda, 0x07, 0x31, 0x7f, 0xbb, 0x42, 0x6c, 0x90, 0xdc, 0xdb, 0x8a, 0x0b, + 0xd8, 0x93, 0x72, 0x8e, 0x17, 0xf7, 0xb7, 0x7f, 0x39, 0xda, 0xdc, 0x70, + 0x50, 0x26, 0x87, 0x11, 0x35, 0xc4, 0x13, 0xe0, 0x03, 0x31, 0x72, 0x8f, + 0x92, 0x7d, 0x72, 0x32, 0x1c, 0x3c, 0xef, 0x89, 0x2c, 0x22, 0x91, 0xb3, + 0x4d, 0x03, 0x6c, 0xf6, 0x45, 0x31, 0x15, 0x08, 0x8e, 0x7e, 0x92, 0x52, + 0x03, 0xc1, 0x75, 0x95, 0x3a, 0xd6, 0x6d, 0x89, 0x42, 0x7b, 0xca, 0xf2, + 0x73, 0x69, 0xdb, 0x23, 0x5b, 0x91, 0x2a, 0x03, 0x09, 0xee, 0x80, 0x03, + 0xe7, 0x7a, 0xf6, 0x43, 0xcc, 0xa1, 0xea, 0x81, 0xc5, 0x36, 0x47, 0x44, + 0x31, 0xe2, 0x38, 0xd8, 0xd7, 0x61, 0x8f, 0x92, 0x0f, 0xe2, 0x27, 0x36, + 0x68, 0x83, 0xc4, 0xbf, 0xd1, 0x94, 0x1c, 0xe5, 0xe4, 0x9e, 0x59, 0xc6, + 0x55, 0x03, 0xeb, 0xa4, 0x25, 0xbf, 0x88, 0xe0, 0x29, 0x05, 0xbc, 0xa4, + 0xf6, 0x3b, 0x06, 0x30, 0xfc, 0xbc, 0xb5, 0xeb, 0x17, 0x00, 0x8f, 0x49, + 0x30, 0xa1, 0x17, 0xda, 0x8a, 0x8f, 0x61, 0x62, 0xde, 0x0e, 0x47, 0xee, + 0x5a, 0xaa, 
0x85, 0x93, 0x72, 0xe5, 0xf5, 0x88, 0xfa, 0xf4, 0x08, 0x87, + 0x30, 0x32, 0xa5, 0x89, 0x40, 0xad, 0x84, 0x02, 0x0c, 0xcb, 0xd6, 0x6b, + 0x73, 0x8a, 0x73, 0x85, 0x1d, 0x84, 0x19, 0x4f, 0x74, 0x7b, 0x42, 0x5f, + 0x35, 0xf8, 0xe2, 0x4a, 0x24, 0xdb, 0x3e, 0x0f, 0xe1, 0xce, 0xc1, 0x6a, + 0x95, 0x86, 0x5f, 0xa1, 0x38, 0x59, 0x84, 0xc1, 0xef, 0xbe, 0xcf, 0xd7, + 0x49, 0xaf, 0xcf, 0x62, 0x38, 0xd9, 0xb5, 0xab, 0xe4, 0x7f, 0x17, 0x46, + 0xaf, 0x30, 0xb6, 0x52, 0x75, 0x73, 0x0c, 0x6d, 0x1c, 0x46, 0x93, 0x9f, + 0xfc, 0x56, 0x80, 0x6b, 0x00, 0x81, 0x65, 0x21, 0xb0, 0xd9, 0xc1, 0x42, + 0xd2, 0xc1, 0xd9, 0x26, 0x8b, 0xbf, 0xe7, 0xbc, 0x14, 0x07, 0x2b, 0xb1, + 0xdf, 0x27, 0xe6, 0xaa, 0x72, 0x79, 0xe7, 0x3e, 0xcc, 0xf3, 0xec, 0x3f, + 0xce, 0xc5, 0x1d, 0xcb, 0xc6, 0xa2, 0x4a, 0x24, 0x17, 0x95, 0x16, 0xf5, + 0x52, 0x51, 0x30, 0x32, 0x05, 0x30, 0x05, 0x60, 0xcc, 0xa3, 0x4f, 0x90, + 0x27, 0x99, 0xa4, 0x05, 0x8f, 0xff, 0x95, 0x40, 0x00, 0x83, 0x22, 0x34, + 0x01, 0x56, 0x70, 0xf3, 0x5c, 0x52, 0xbe, 0x53, 0x3b, 0x9a, 0x61, 0xeb, + 0xf3, 0x28, 0x7b, 0xea, 0xaa, 0x62, 0x22, 0x17, 0x95, 0x6d, 0xcc, 0x8a, + 0xc9, 0x3a, 0x3b, 0xd5, 0xb4, 0x58, 0x35, 0xcb, 0x51, 0x24, 0x24, 0xbe, + 0x93, 0x4e, 0x14, 0x4c, 0x10, 0x33, 0x03, 0x59, 0x1d, 0x59, 0xec, 0x1a, + 0xcf, 0xe8, 0xf6, 0x4c, 0x35, 0xeb, 0x98, 0x57, 0xf1, 0x44, 0x16, 0x06, + 0x05, 0x01, 0x10, 0x02, 0x18, 0xe8, 0x5c, 0x5b, 0x01, 0x10, 0x58, 0x30, + 0x32, 0x07, 0x22, 0x0d, 0xd8, 0xa8, 0xa3, 0x4f, 0x90, 0x10, 0xc1, 0xa3, + 0x05, 0x89, 0xfe, 0x5d, 0x40, 0x00, 0xfe, 0x4e, 0x3b, 0x7d, 0x8a, 0xb9, + 0x8a, 0x54, 0xce, 0xfa, 0x71, 0xbe, 0x85, 0x22, 0x22, 0x23, 0xc1, 0xf1, + 0x05, 0xf9, 0x98, 0x70, 0xda, 0xad, 0x77, 0xb1, 0x0d, 0x0d, 0x3b, 0xa7, + 0x5c, 0xe5, 0x0a, 0x41, 0x25, 0x07, 0x4e, 0x87, 0x79, 0xab, 0x08, 0x4e, + 0x04, 0xdf, 0xeb, 0x06, 0x4e, 0x5a, 0x60, 0x75, 0x27, 0x81, 0x62, 0x9e, + 0x2b, 0x7e, 0x72, 0x3b, 0x26, 0xd3, 0x11, 0x72, 0x9b, 0x26, 0x12, 0x48, + 0xc0, 0xb6, 0xf0, 0x1a, 0x19, 0x01, 0x10, 0x16, 0x30, 0x32, 0x08, 0x00, + 0x09, 0xa8, 
0x5c, 0x83, 0x4f, 0xf0, 0x10, 0x91, 0xa3, 0x05, 0x80, 0x00, + 0x01, 0x00, 0xca, 0x9c, 0x87, 0x80 +}; + +static guint32 stream_annexb_av1_tu_len[] = { + 5458, 1335, 2077, 7, 559, 7, 984, 7, 93, 27 +}; + +static guint8 stream_no_annexb_av1[] = { + 0x12, 0x00, 0x0a, 0x0b, 0x00, 0x00, 0x00, 0x04, 0x46, 0x3e, 0x56, 0xff, + 0xfc, 0xc0, 0x20, 0x32, 0xbc, 0x2a, 0x14, 0x00, 0xa6, 0x40, 0x10, 0xf3, + 0x47, 0x52, 0x5f, 0xe1, 0xe0, 0x0e, 0x22, 0xf0, 0x95, 0xe2, 0x9c, 0x49, + 0x32, 0xd9, 0x73, 0xfb, 0x89, 0x0e, 0x9f, 0xe2, 0xd2, 0x80, 0x2d, 0xaf, + 0x80, 0x0d, 0xa6, 0xae, 0x2b, 0x72, 0x90, 0xcc, 0x79, 0x15, 0x18, 0x61, + 0xda, 0x0a, 0xe3, 0xff, 0x5d, 0x14, 0x2b, 0x2f, 0x0e, 0x08, 0x0b, 0x08, + 0x09, 0xb2, 0xa8, 0x72, 0x7a, 0x8d, 0x97, 0x14, 0x53, 0x7a, 0xdd, 0x64, + 0xf1, 0x33, 0x42, 0x65, 0x7c, 0x01, 0x14, 0xd0, 0x9a, 0x59, 0x0b, 0xab, + 0x92, 0x59, 0xbd, 0xef, 0xa2, 0x3c, 0xaf, 0xc5, 0x94, 0x28, 0x0a, 0x59, + 0xd1, 0x5b, 0xd8, 0x0c, 0x4c, 0x4c, 0xeb, 0x4e, 0x0c, 0x3f, 0x9b, 0xcf, + 0x11, 0xfb, 0xdf, 0x7d, 0x28, 0xc7, 0x80, 0xfd, 0x23, 0x2c, 0x44, 0xa7, + 0x69, 0xd4, 0xa2, 0x25, 0xeb, 0x3a, 0x10, 0xdd, 0x7b, 0x95, 0x06, 0xf1, + 0xe9, 0x89, 0x04, 0xd5, 0x5a, 0x33, 0x76, 0xc0, 0x44, 0x0d, 0x0a, 0x0c, + 0x4d, 0xd3, 0xfb, 0x06, 0xb0, 0x4d, 0xdd, 0x39, 0x5b, 0xf7, 0xcc, 0x51, + 0x5f, 0x54, 0x6d, 0x5e, 0xbf, 0x62, 0x50, 0xd8, 0xaa, 0x4d, 0x72, 0x56, + 0x12, 0x28, 0x37, 0xae, 0x66, 0x3e, 0x1e, 0x66, 0x62, 0x6b, 0xd9, 0x74, + 0xf7, 0xc3, 0x1b, 0xc2, 0x9b, 0xd3, 0xb7, 0xff, 0xfc, 0xcc, 0xf4, 0x6b, + 0x39, 0x81, 0xd9, 0xf0, 0x89, 0x4d, 0xed, 0xf5, 0xc7, 0xd1, 0x64, 0x12, + 0x7c, 0x7b, 0xe9, 0x98, 0x84, 0x73, 0x42, 0xe9, 0x4e, 0xbe, 0xa9, 0xc4, + 0x6f, 0x4c, 0xce, 0xd5, 0x01, 0x10, 0x45, 0x6f, 0x18, 0x59, 0x99, 0xdf, + 0x0f, 0x37, 0x36, 0xae, 0xa0, 0x62, 0x5f, 0xd8, 0x5f, 0x53, 0x2a, 0x4b, + 0xa2, 0x6c, 0x30, 0xa2, 0x4d, 0xd2, 0x93, 0x66, 0x5c, 0x04, 0x53, 0xe1, + 0x02, 0xe9, 0xaa, 0x3d, 0xc7, 0x21, 0x9a, 0x0e, 0x93, 0xa6, 0xfe, 0x7b, + 0x19, 0x4f, 0x6d, 0x35, 0xa8, 0xbf, 0xed, 0x9b, 
0x36, 0xe3, 0xcf, 0xc7, + 0x2d, 0x9f, 0xcd, 0xb9, 0x49, 0xe2, 0x6e, 0x9f, 0x92, 0xce, 0xea, 0x7c, + 0x65, 0x50, 0xd1, 0xaf, 0x4e, 0xfe, 0x03, 0x2e, 0x57, 0x75, 0x49, 0x6f, + 0x1e, 0xf1, 0x5c, 0x65, 0x59, 0xcb, 0x36, 0x23, 0xc2, 0xad, 0xcb, 0xb2, + 0x89, 0x53, 0xf9, 0xd2, 0xcd, 0xab, 0x20, 0x4d, 0xc5, 0x98, 0x74, 0x48, + 0x4f, 0xbe, 0xd3, 0x6c, 0x01, 0xa2, 0xee, 0xab, 0xf0, 0xf4, 0xf0, 0x17, + 0xc8, 0xd6, 0xa9, 0x91, 0xbd, 0x11, 0x4f, 0xca, 0x3b, 0x06, 0x5a, 0x58, + 0xe9, 0x96, 0xa0, 0x30, 0xe9, 0x39, 0x57, 0xa8, 0x65, 0xae, 0x9c, 0x87, + 0xed, 0x2e, 0x8e, 0xbf, 0x8e, 0xcf, 0x73, 0x16, 0x08, 0xe2, 0xfe, 0x63, + 0x78, 0x73, 0xe3, 0xd0, 0x10, 0x14, 0x81, 0xf3, 0x33, 0xcf, 0x69, 0x2e, + 0x94, 0xb5, 0x77, 0x00, 0x50, 0x90, 0xf9, 0x4c, 0xb4, 0x22, 0xfa, 0x60, + 0xf4, 0xe7, 0xae, 0xbc, 0x25, 0x44, 0x82, 0x9d, 0x19, 0xee, 0xa9, 0x5c, + 0x5c, 0x93, 0x62, 0x70, 0x37, 0x04, 0xa8, 0x4f, 0x2f, 0x4d, 0xbb, 0xbb, + 0xe0, 0x8e, 0x97, 0x39, 0x08, 0xe1, 0x70, 0xb4, 0x9b, 0x66, 0x2a, 0x14, + 0x3d, 0x44, 0x88, 0x6c, 0x8a, 0x0d, 0x02, 0xfe, 0xf3, 0x88, 0xc6, 0x75, + 0x28, 0x3f, 0x67, 0xee, 0xc3, 0xb7, 0xdc, 0x7a, 0x74, 0xd1, 0xfd, 0xdf, + 0x0d, 0xee, 0xc8, 0xcd, 0xba, 0xf2, 0x16, 0x06, 0x71, 0x73, 0x1e, 0xf7, + 0x81, 0x21, 0x92, 0x75, 0x91, 0x44, 0x39, 0x04, 0x57, 0xa0, 0xe0, 0xac, + 0x26, 0x68, 0x50, 0x53, 0x3d, 0xe9, 0xb3, 0x98, 0xd4, 0xa0, 0xf3, 0x31, + 0x9e, 0xd4, 0x53, 0x2f, 0x12, 0xdc, 0x3c, 0x1d, 0x2a, 0x65, 0xb9, 0x26, + 0xc0, 0x15, 0xa9, 0x37, 0x1f, 0x6d, 0x61, 0xf7, 0x10, 0xdd, 0x18, 0xfa, + 0x22, 0x8e, 0x00, 0xad, 0x9c, 0x39, 0xeb, 0x9d, 0xce, 0x43, 0xcf, 0x97, + 0xb0, 0x49, 0x53, 0xf0, 0xf4, 0x59, 0xd8, 0x68, 0xce, 0xa4, 0xc5, 0xad, + 0x5c, 0xd1, 0x10, 0xcf, 0x95, 0x7f, 0x61, 0x81, 0xd5, 0x4c, 0xb5, 0x10, + 0x80, 0x52, 0x0c, 0x53, 0x46, 0xa3, 0x1d, 0xe9, 0x70, 0x13, 0x2a, 0xd9, + 0x7b, 0x32, 0x87, 0x36, 0x58, 0x16, 0x81, 0xe1, 0x12, 0x91, 0x48, 0x4d, + 0xfc, 0x5c, 0x04, 0x6a, 0x0b, 0xbd, 0x03, 0x8d, 0x6a, 0x2f, 0xdb, 0x36, + 0x5c, 0xde, 0xc1, 0xfc, 0xe8, 0x5a, 0x3e, 0x67, 
0xd4, 0xc2, 0xff, 0x06, + 0x89, 0x12, 0x55, 0xbe, 0xd4, 0xae, 0xee, 0x8e, 0x8e, 0x0b, 0x6b, 0x87, + 0x44, 0x3c, 0x04, 0x18, 0x8e, 0xcb, 0x2b, 0x8c, 0x69, 0xd6, 0xe3, 0x9d, + 0xe3, 0xbd, 0x01, 0xa0, 0xe8, 0xaf, 0x77, 0x10, 0x27, 0x27, 0x2c, 0xa7, + 0x38, 0xb6, 0x52, 0xfe, 0x69, 0x88, 0xd0, 0x01, 0x4d, 0x99, 0xf2, 0x51, + 0xe9, 0x0f, 0x1d, 0xb8, 0x21, 0xf0, 0x96, 0xf8, 0xb9, 0x11, 0x12, 0x8c, + 0x5b, 0xdf, 0x34, 0xb9, 0xe6, 0xaa, 0xb8, 0x53, 0x07, 0x8f, 0xa3, 0x4b, + 0xdf, 0x21, 0x89, 0x69, 0xb8, 0x30, 0x6e, 0x72, 0xe7, 0x34, 0x5d, 0x5a, + 0x99, 0x13, 0xa5, 0x0e, 0x9e, 0x74, 0x9b, 0xa0, 0xaf, 0x48, 0x29, 0x87, + 0xbc, 0x3c, 0xf3, 0x4e, 0xdd, 0x41, 0x4f, 0x8c, 0x47, 0x30, 0x9a, 0x75, + 0x4a, 0xdf, 0x76, 0xfe, 0x1f, 0x80, 0x75, 0x3a, 0xfa, 0xb8, 0xa7, 0xda, + 0xcc, 0x02, 0x94, 0x29, 0xe0, 0x2e, 0x2e, 0x3b, 0x67, 0xcc, 0x22, 0x74, + 0xe7, 0x89, 0xdd, 0xc8, 0x14, 0x3f, 0x24, 0xf1, 0x33, 0xab, 0x9c, 0x4f, + 0x69, 0xe8, 0x02, 0x44, 0xec, 0xbc, 0x43, 0x66, 0x47, 0xcb, 0xe4, 0x90, + 0xb7, 0x1d, 0x57, 0x41, 0x76, 0xeb, 0x4b, 0xe8, 0xb4, 0x2c, 0x39, 0x4e, + 0xb8, 0x56, 0x34, 0xa0, 0x87, 0x41, 0xd2, 0x08, 0x8f, 0x5e, 0xeb, 0x67, + 0x39, 0x99, 0x36, 0xe0, 0x73, 0x64, 0x24, 0xd8, 0xdc, 0xb1, 0x9e, 0xda, + 0x37, 0xbc, 0x0e, 0x39, 0xb4, 0xde, 0xda, 0x06, 0xd0, 0xf4, 0xca, 0x93, + 0x78, 0x23, 0x73, 0xef, 0x74, 0x38, 0xaa, 0x7d, 0xa1, 0x76, 0xf9, 0xb4, + 0x68, 0xbc, 0xc8, 0x89, 0xb7, 0x85, 0x53, 0x04, 0x88, 0x97, 0xed, 0xaa, + 0xf1, 0xe2, 0xc3, 0xc1, 0x18, 0xe3, 0xd8, 0xc9, 0x2c, 0xc0, 0x9d, 0x01, + 0x1d, 0x78, 0x68, 0xdf, 0x3f, 0x6c, 0x0f, 0xb3, 0xc3, 0xa5, 0x83, 0x80, + 0x6d, 0x9a, 0x0f, 0x88, 0xcf, 0x46, 0x81, 0x53, 0x51, 0x19, 0xa9, 0x47, + 0x5a, 0xea, 0x49, 0xfa, 0xb1, 0x59, 0x34, 0x54, 0x8d, 0xb2, 0x23, 0x2d, + 0x9b, 0xb4, 0x43, 0xf0, 0x93, 0x4a, 0xe1, 0xeb, 0x33, 0xd9, 0x13, 0x02, + 0x2d, 0xc7, 0x69, 0xc5, 0xf1, 0xa4, 0x5e, 0x95, 0x21, 0x95, 0xa4, 0x56, + 0x68, 0x73, 0x16, 0x25, 0x1c, 0xd9, 0x53, 0x65, 0x1d, 0xea, 0xd4, 0xec, + 0xd5, 0xe7, 0x87, 0x04, 0x1a, 0x33, 0xdf, 0x42, 
0x8c, 0x2c, 0x61, 0x32, + 0x20, 0x90, 0xba, 0x93, 0x9e, 0x64, 0x84, 0xc0, 0xd9, 0xe0, 0xe6, 0xfa, + 0x8c, 0x6d, 0xbe, 0x8c, 0x83, 0x15, 0xee, 0x97, 0xf4, 0xc2, 0x87, 0x2a, + 0x97, 0xb4, 0x22, 0xac, 0xf2, 0x72, 0x1e, 0xeb, 0x89, 0x0d, 0xae, 0x75, + 0xea, 0x36, 0xfe, 0xe1, 0x2c, 0xcd, 0x03, 0x3a, 0xeb, 0x01, 0x3e, 0x08, + 0xbf, 0xb4, 0x09, 0x92, 0x42, 0x4b, 0x45, 0x37, 0xfd, 0x7b, 0xcd, 0x3c, + 0x88, 0x66, 0x63, 0xab, 0x36, 0x2f, 0xd2, 0x14, 0x60, 0x8d, 0xbd, 0xad, + 0x21, 0xb4, 0x12, 0xe8, 0x08, 0x3c, 0x61, 0x82, 0x94, 0xd8, 0x8c, 0x93, + 0x28, 0x73, 0x73, 0xab, 0xcc, 0xf7, 0x5e, 0xd8, 0x7e, 0x53, 0x5d, 0x87, + 0xdc, 0x09, 0x40, 0x23, 0xb7, 0x50, 0x25, 0xda, 0x26, 0x49, 0xb0, 0x2b, + 0xd8, 0xe0, 0xc6, 0xd6, 0xc7, 0xc9, 0xb1, 0xe9, 0xb6, 0xf1, 0xd3, 0x16, + 0xdb, 0xf1, 0xff, 0x0f, 0x9a, 0x57, 0x5b, 0x87, 0x93, 0x7e, 0xeb, 0x22, + 0x31, 0xde, 0xbc, 0xeb, 0xa2, 0x99, 0x1b, 0x98, 0x35, 0x71, 0x75, 0xfa, + 0x79, 0x1d, 0xfb, 0xea, 0x04, 0x9c, 0xc7, 0x7e, 0x1c, 0xdf, 0xa3, 0x99, + 0x68, 0x13, 0x86, 0x6d, 0xb1, 0x4c, 0xfd, 0x2a, 0xd2, 0x8e, 0x4d, 0x3d, + 0x6e, 0x50, 0xd7, 0xab, 0xfa, 0x8e, 0x50, 0x43, 0x2e, 0x1d, 0xf0, 0xdc, + 0x5a, 0x3e, 0xaa, 0x1a, 0x0c, 0x13, 0xe3, 0xd2, 0xf1, 0x0e, 0x91, 0xe4, + 0xa4, 0x36, 0x7a, 0xfd, 0xf8, 0x5a, 0x4a, 0x39, 0x66, 0x64, 0xbe, 0x93, + 0x3b, 0x4e, 0xa7, 0x62, 0xa3, 0xae, 0x23, 0xe6, 0x47, 0xa0, 0x6c, 0x00, + 0xfc, 0xe0, 0x6b, 0xee, 0xb3, 0x7e, 0xca, 0xab, 0x53, 0x55, 0xe5, 0xe2, + 0x19, 0xbb, 0x22, 0x89, 0x2d, 0xa0, 0xcc, 0xae, 0xd0, 0x77, 0x1f, 0xa1, + 0xf8, 0x83, 0x22, 0x07, 0xa8, 0x81, 0xad, 0xe5, 0xab, 0xc6, 0x91, 0x59, + 0xd7, 0x90, 0xa9, 0x63, 0x56, 0x7f, 0x94, 0x78, 0x74, 0x69, 0xa3, 0xdd, + 0xd3, 0x8e, 0x13, 0xec, 0xc3, 0x78, 0x99, 0x8c, 0x72, 0x52, 0xa8, 0x7b, + 0x4f, 0x77, 0x5a, 0x9f, 0xbe, 0xf1, 0xe1, 0xb7, 0xea, 0x5f, 0xa6, 0xc3, + 0x83, 0x60, 0x81, 0x3f, 0xda, 0xe8, 0x78, 0x07, 0x96, 0x2c, 0x63, 0xc9, + 0x2a, 0xa1, 0x52, 0xb6, 0x72, 0xbc, 0xf6, 0xe1, 0xdc, 0x9f, 0x24, 0x18, + 0x07, 0xf1, 0x33, 0x0e, 0xb3, 0x8d, 0xee, 0xd8, 
0xdb, 0x92, 0xc1, 0x6b, + 0x0a, 0x4a, 0xf0, 0x5d, 0x08, 0xb4, 0x50, 0x1f, 0x5a, 0x78, 0x14, 0x9f, + 0xdf, 0x7d, 0x9c, 0x79, 0x09, 0x10, 0x39, 0x38, 0x0f, 0x4e, 0x52, 0x42, + 0x36, 0x94, 0x86, 0xe3, 0xcb, 0x2a, 0xd5, 0xab, 0xe1, 0x16, 0xc5, 0x6b, + 0x5f, 0x1f, 0xf5, 0x02, 0xb0, 0xb1, 0x63, 0x33, 0xe1, 0x06, 0xe6, 0xfe, + 0x38, 0xb9, 0x3d, 0x85, 0x13, 0x6c, 0xd9, 0xb9, 0xc4, 0xfd, 0x27, 0xd1, + 0x6f, 0xbe, 0x86, 0x4c, 0x52, 0xb2, 0x5f, 0xe9, 0x9b, 0xb8, 0xa0, 0x50, + 0x02, 0x27, 0xfd, 0x04, 0x41, 0x87, 0x02, 0x89, 0xff, 0x7b, 0x80, 0xbb, + 0x82, 0x98, 0xed, 0x27, 0xc6, 0x84, 0x53, 0x34, 0xae, 0xb5, 0x7f, 0xcf, + 0x1c, 0xa7, 0x85, 0x6e, 0x20, 0xca, 0xa8, 0x47, 0x1a, 0x2f, 0x0b, 0x15, + 0x54, 0xec, 0x9a, 0x68, 0xb3, 0x91, 0x30, 0xe5, 0x57, 0xc8, 0x56, 0x87, + 0x8a, 0x15, 0xb2, 0xc5, 0x88, 0x6b, 0xa1, 0xa6, 0xb1, 0x42, 0x83, 0xd6, + 0xcb, 0x4a, 0x59, 0x2c, 0x45, 0xa1, 0xf9, 0x74, 0x83, 0x38, 0x70, 0x65, + 0xa2, 0x5a, 0x8f, 0xf4, 0x39, 0x1f, 0xe6, 0xd3, 0x3c, 0x60, 0x60, 0x86, + 0x34, 0x6f, 0xb2, 0x56, 0xcd, 0x38, 0x10, 0xa5, 0x24, 0xfa, 0x25, 0x5d, + 0xab, 0x9c, 0xd2, 0xb4, 0x03, 0x47, 0xea, 0xa4, 0x8e, 0x33, 0xb2, 0xa7, + 0xef, 0x4d, 0x0e, 0x9b, 0x03, 0xd1, 0xfc, 0x32, 0x5f, 0xcc, 0xcf, 0x12, + 0x36, 0x5b, 0x74, 0xf2, 0xbf, 0x6d, 0xab, 0x84, 0x05, 0x2a, 0x46, 0x00, + 0x3f, 0x85, 0x7c, 0x01, 0x80, 0x4e, 0x47, 0x0f, 0x13, 0xe7, 0x96, 0x39, + 0x89, 0x5d, 0x3e, 0x2c, 0xe5, 0x03, 0xfe, 0x0f, 0xe2, 0x91, 0x3e, 0xf1, + 0x96, 0xca, 0x16, 0x6c, 0x10, 0x4c, 0x14, 0xfa, 0x13, 0xa1, 0xeb, 0x72, + 0x49, 0xe3, 0x00, 0xcb, 0xe2, 0xac, 0xfa, 0xae, 0xbd, 0x1c, 0xfc, 0x66, + 0xb9, 0x5b, 0xa4, 0xa7, 0xdd, 0x0c, 0x33, 0xd3, 0x05, 0x0d, 0x79, 0xeb, + 0x2b, 0xef, 0x9c, 0x36, 0x92, 0xd0, 0xdd, 0xe8, 0x72, 0x8a, 0x41, 0x9b, + 0x7b, 0xbc, 0xcd, 0x3f, 0x69, 0x93, 0x89, 0xb5, 0xb1, 0xf0, 0x16, 0xce, + 0xfd, 0x19, 0x91, 0x4d, 0x82, 0xb1, 0x5f, 0x53, 0x13, 0x51, 0xc2, 0x0b, + 0x1e, 0xd2, 0xf2, 0xe2, 0x2e, 0x79, 0x48, 0xd5, 0x47, 0xd0, 0xdc, 0x94, + 0xc2, 0xe2, 0x3a, 0x2b, 0x28, 0x48, 0x7d, 0x17, 
0xdb, 0x39, 0x76, 0xa1, + 0x63, 0x8b, 0x97, 0xc9, 0x9a, 0x89, 0xcf, 0x39, 0xed, 0xd9, 0xea, 0xca, + 0x82, 0x81, 0xb7, 0xce, 0x49, 0x6c, 0x92, 0x61, 0x48, 0xaf, 0xde, 0xa7, + 0x7e, 0x74, 0xa4, 0xac, 0x41, 0x0c, 0xee, 0xed, 0x07, 0xed, 0x24, 0xa4, + 0xa8, 0xf7, 0x37, 0xaa, 0xa9, 0x28, 0x26, 0x45, 0x3c, 0xb7, 0x80, 0x11, + 0x3d, 0x39, 0x7e, 0xf2, 0x04, 0x27, 0x04, 0xff, 0x0f, 0xa3, 0x49, 0x03, + 0xdf, 0xcd, 0x38, 0xa1, 0xf3, 0x14, 0xef, 0x0f, 0xed, 0xb0, 0xf1, 0xa5, + 0xc3, 0x60, 0x0d, 0x7a, 0x25, 0x21, 0x4c, 0x87, 0x4d, 0xe0, 0x57, 0xcb, + 0x5f, 0x92, 0x12, 0x5b, 0x6c, 0xf7, 0x4e, 0x90, 0xd9, 0x30, 0x01, 0x10, + 0xa1, 0x5a, 0x68, 0xff, 0x19, 0x99, 0xf7, 0x19, 0x9b, 0x5b, 0x2f, 0x8d, + 0x9e, 0x56, 0x1f, 0x86, 0xea, 0xb9, 0x1b, 0xf5, 0xd0, 0xbf, 0xfc, 0x2e, + 0x4b, 0x3b, 0x16, 0xbf, 0xb4, 0xb0, 0x3e, 0xda, 0x53, 0xb9, 0x64, 0x2d, + 0xee, 0x89, 0xd4, 0xa7, 0x7c, 0xcf, 0xb0, 0x6b, 0x36, 0x74, 0x55, 0xac, + 0x5b, 0x97, 0xbf, 0x56, 0x70, 0xf6, 0xeb, 0x1f, 0xa8, 0xe1, 0x18, 0xe0, + 0xff, 0x45, 0x7b, 0xfe, 0x4d, 0x4b, 0x1c, 0xd1, 0x6a, 0x75, 0x33, 0x20, + 0xd6, 0x67, 0x1f, 0xf1, 0xd3, 0x51, 0x9d, 0xcf, 0xb5, 0x01, 0x45, 0x53, + 0xa3, 0xd4, 0x01, 0x55, 0x16, 0xc3, 0xd9, 0x07, 0x0d, 0xc1, 0xdc, 0x02, + 0x66, 0x00, 0xb6, 0x90, 0x5c, 0x37, 0x24, 0x99, 0xdf, 0xe8, 0xca, 0x89, + 0x99, 0x02, 0x2e, 0xf7, 0xa7, 0x77, 0xc6, 0x3f, 0x15, 0x17, 0x36, 0x45, + 0x7a, 0xc1, 0x73, 0x20, 0x67, 0xce, 0x84, 0x75, 0xc2, 0xbf, 0x27, 0x34, + 0x69, 0x12, 0x16, 0x14, 0x8c, 0x53, 0x82, 0xc2, 0x6f, 0xa1, 0x78, 0x1e, + 0x31, 0x36, 0x9b, 0x59, 0xc1, 0x05, 0x5d, 0xb9, 0x6c, 0xee, 0x04, 0x44, + 0x47, 0x09, 0x69, 0xb8, 0xed, 0x66, 0x31, 0x09, 0x3f, 0x1f, 0x0f, 0xb7, + 0x4f, 0x3b, 0xa1, 0xc2, 0x69, 0x89, 0x53, 0x99, 0x62, 0x59, 0xad, 0x51, + 0x40, 0x92, 0xf3, 0x25, 0x33, 0xfe, 0x9f, 0x86, 0x66, 0x7c, 0x0b, 0xa8, + 0xee, 0x53, 0x5d, 0x57, 0x9f, 0x21, 0x59, 0x31, 0x20, 0xb3, 0xb2, 0xe0, + 0xdc, 0xa8, 0xb6, 0x9e, 0x74, 0x36, 0xd3, 0x28, 0x3d, 0xd8, 0x8f, 0x79, + 0xa8, 0x08, 0x5b, 0x77, 0x18, 0xd0, 0x49, 0xcc, 
0x1a, 0xde, 0x4e, 0xb8, + 0xef, 0x49, 0x52, 0xf1, 0x19, 0x1c, 0x45, 0x08, 0x2b, 0x8a, 0x6e, 0x55, + 0x30, 0xe4, 0x64, 0xcc, 0x32, 0x65, 0x93, 0x76, 0xa6, 0xa8, 0x42, 0xc8, + 0x01, 0x6d, 0x83, 0xa9, 0x04, 0xcd, 0xe1, 0xee, 0xd1, 0xad, 0xe4, 0x35, + 0x09, 0xb6, 0x43, 0x38, 0x81, 0x0a, 0x98, 0xee, 0x7c, 0xa4, 0x42, 0x0b, + 0x39, 0xc4, 0x57, 0x65, 0xdf, 0x62, 0x0e, 0x34, 0xd1, 0x6b, 0xc8, 0x2b, + 0x64, 0xa3, 0xeb, 0x3d, 0xca, 0x82, 0x89, 0x29, 0x4a, 0xd4, 0x63, 0xb6, + 0xf7, 0xa6, 0x8c, 0x1d, 0x44, 0x47, 0x17, 0x67, 0x0c, 0x61, 0x2b, 0x77, + 0x70, 0x8d, 0x87, 0xec, 0xb9, 0xc6, 0xcd, 0x8c, 0x31, 0x35, 0xb7, 0x69, + 0x0b, 0xbb, 0xf5, 0x1c, 0xd2, 0x0d, 0x47, 0xef, 0x42, 0x28, 0x71, 0xee, + 0x1d, 0x1b, 0xad, 0xbe, 0xd0, 0x42, 0x60, 0x49, 0x1f, 0xe8, 0x98, 0xad, + 0x62, 0x51, 0xba, 0x83, 0x84, 0x2e, 0x6c, 0x1c, 0x78, 0x47, 0xf9, 0x0e, + 0x99, 0xef, 0x22, 0x2a, 0x7b, 0x46, 0x2a, 0x93, 0xc8, 0x2e, 0x18, 0xbb, + 0xd2, 0xe9, 0x0c, 0x4a, 0x33, 0x16, 0x45, 0x7c, 0x85, 0xfe, 0x99, 0x87, + 0xa6, 0x10, 0x97, 0x2c, 0xa0, 0x99, 0xe0, 0x18, 0xc4, 0x77, 0x1e, 0x39, + 0xd2, 0xab, 0x73, 0x1c, 0x2f, 0xc0, 0xfe, 0x82, 0x7d, 0xc6, 0xe9, 0x39, + 0x36, 0x29, 0x9b, 0xd5, 0xe9, 0x6a, 0x06, 0x9c, 0x19, 0xe2, 0x47, 0x04, + 0x6e, 0xe8, 0xaf, 0x5f, 0xd9, 0xaf, 0xc2, 0xa0, 0x11, 0x7c, 0xff, 0xa7, + 0x8a, 0x2b, 0x05, 0x0b, 0x3e, 0x97, 0xc8, 0x43, 0xf1, 0x64, 0x67, 0xaa, + 0xfc, 0xea, 0x41, 0xbf, 0x1c, 0x13, 0x6b, 0x97, 0x44, 0x56, 0x8a, 0x92, + 0x2f, 0xf3, 0x21, 0x6c, 0x5b, 0xed, 0xca, 0x7b, 0x75, 0x6a, 0x9e, 0x0f, + 0xb0, 0x6c, 0x3d, 0x70, 0x98, 0x61, 0x17, 0xc9, 0x4c, 0x46, 0x99, 0xfd, + 0x88, 0x50, 0x5e, 0xbe, 0x71, 0xb9, 0xbc, 0xea, 0x16, 0x09, 0xca, 0x24, + 0xe3, 0x9c, 0xb7, 0x3d, 0x05, 0x5b, 0x6d, 0x62, 0x6f, 0xb9, 0xd8, 0x99, + 0xda, 0x45, 0xcb, 0x7c, 0xa6, 0xc6, 0x46, 0x36, 0x5b, 0x20, 0xb6, 0xa7, + 0x39, 0x73, 0xf8, 0x97, 0x71, 0xaf, 0xcd, 0x66, 0x58, 0xd4, 0xf1, 0x1c, + 0x4c, 0x16, 0x79, 0xc6, 0x63, 0x56, 0xca, 0x79, 0x15, 0x7b, 0xad, 0x40, + 0xaa, 0xf6, 0x18, 0x64, 0x0e, 0xb7, 0x28, 0xd4, 
0x19, 0xf0, 0xa4, 0x3a, + 0x92, 0x9d, 0x25, 0x97, 0xef, 0x5e, 0x75, 0x2a, 0x13, 0x0c, 0xac, 0x2a, + 0xf8, 0x62, 0x36, 0xa5, 0x8e, 0xb2, 0x8d, 0xd7, 0x20, 0xc1, 0x0f, 0xc2, + 0xbd, 0xdd, 0x96, 0x55, 0x98, 0xae, 0x74, 0xfc, 0x71, 0x26, 0xb6, 0x63, + 0x35, 0x19, 0x95, 0x8f, 0xce, 0x0f, 0x66, 0xcd, 0x72, 0x83, 0x05, 0x5d, + 0xe3, 0x44, 0x12, 0x62, 0x9e, 0x4f, 0xe0, 0xf2, 0x4f, 0x74, 0x76, 0x8f, + 0x79, 0x7f, 0x54, 0x1c, 0x81, 0x46, 0x2e, 0x31, 0x5a, 0xf3, 0x9d, 0x55, + 0x0e, 0x23, 0x7e, 0x19, 0xe8, 0xba, 0xe1, 0x1f, 0xe3, 0xdb, 0x79, 0x0e, + 0xa6, 0x46, 0xbf, 0xb5, 0x39, 0x91, 0xff, 0xbb, 0xe9, 0xb2, 0x36, 0x73, + 0x1a, 0x2a, 0xc5, 0xc6, 0xff, 0x32, 0x80, 0xb8, 0xff, 0xe7, 0x03, 0xc0, + 0xfa, 0x5e, 0xd5, 0x60, 0x13, 0x1d, 0x35, 0xe3, 0x9f, 0x19, 0xf7, 0xca, + 0x2d, 0xf7, 0x07, 0xcc, 0xa4, 0x6b, 0xa1, 0x49, 0xbe, 0x5e, 0x5b, 0x30, + 0xea, 0x5a, 0x8a, 0x26, 0x53, 0xe4, 0x30, 0xc9, 0x76, 0x4b, 0x4d, 0x19, + 0xdf, 0xb5, 0xec, 0xcf, 0xd0, 0xae, 0x1a, 0x2c, 0x03, 0x87, 0x30, 0xe8, + 0xb3, 0xf5, 0xd0, 0x7d, 0x95, 0x01, 0xef, 0x4b, 0xa8, 0xde, 0x45, 0x5e, + 0x9c, 0x2f, 0xf3, 0xe9, 0xc4, 0x9e, 0x94, 0xf9, 0x64, 0xef, 0x84, 0x1d, + 0x86, 0xa8, 0x6c, 0x2d, 0x35, 0x05, 0x86, 0x57, 0x98, 0x11, 0xf1, 0xdd, + 0xaf, 0xd9, 0xda, 0xc5, 0x0e, 0x34, 0x34, 0x86, 0x6a, 0xdc, 0xe9, 0x53, + 0x12, 0x62, 0xd3, 0x83, 0xcb, 0x67, 0x66, 0xea, 0x09, 0xb0, 0x45, 0xf6, + 0x1b, 0xe6, 0x10, 0x69, 0x68, 0x60, 0xd9, 0xac, 0x1f, 0xbf, 0xf2, 0x23, + 0x57, 0x8d, 0xd3, 0xad, 0xf0, 0xa7, 0xc8, 0x4c, 0x41, 0x15, 0x19, 0xb6, + 0x55, 0xab, 0xee, 0x22, 0x0c, 0x0d, 0xac, 0xf5, 0x24, 0x85, 0xc7, 0x1c, + 0x63, 0x6c, 0xca, 0xee, 0x42, 0xc6, 0xda, 0x87, 0x2d, 0x43, 0xd2, 0xe7, + 0xfd, 0xbf, 0x95, 0x93, 0x83, 0x05, 0x33, 0xf4, 0xa8, 0x56, 0x91, 0x01, + 0xe1, 0xb0, 0x96, 0xa6, 0xf1, 0xf4, 0xb1, 0x9d, 0x83, 0x06, 0x70, 0xba, + 0xef, 0xad, 0xa2, 0x7f, 0xd3, 0x7f, 0x83, 0xa4, 0x0b, 0x3c, 0x7e, 0x80, + 0x2e, 0xd8, 0xf4, 0x76, 0x6e, 0x68, 0xa1, 0x14, 0x33, 0x6c, 0xc0, 0x7e, + 0x4d, 0x67, 0xf2, 0xb3, 0x4a, 0x8a, 0xbc, 0x02, 
0x1a, 0xa9, 0x03, 0x98, + 0xfb, 0xa9, 0xfc, 0x1a, 0xba, 0xdb, 0xe9, 0x48, 0x32, 0x14, 0x63, 0x11, + 0x11, 0x85, 0xfc, 0x25, 0xe3, 0xe2, 0x4a, 0x96, 0x01, 0x5f, 0xa9, 0xf4, + 0xd6, 0x82, 0xf6, 0x09, 0x56, 0x15, 0xaa, 0xa7, 0x52, 0xbb, 0x90, 0x18, + 0x74, 0xc1, 0xc6, 0x10, 0xc0, 0x65, 0xb0, 0x2d, 0x5a, 0xd6, 0xfe, 0xe4, + 0x13, 0x11, 0xc0, 0x62, 0xc3, 0x4e, 0xb0, 0xd3, 0x8d, 0x38, 0xae, 0x30, + 0xce, 0x1a, 0x76, 0x71, 0x36, 0x93, 0x33, 0x65, 0x2b, 0x0e, 0xcc, 0x44, + 0xd7, 0x7c, 0xde, 0x01, 0x67, 0x33, 0x34, 0xae, 0xe4, 0x4f, 0xd0, 0x36, + 0x13, 0xac, 0x4b, 0x57, 0xff, 0xd0, 0x80, 0x74, 0xa7, 0x10, 0x1d, 0x9e, + 0x47, 0xdf, 0x78, 0xe4, 0x47, 0x25, 0xf8, 0xf2, 0x6a, 0x47, 0x27, 0x84, + 0x5b, 0x56, 0x15, 0xd8, 0xf1, 0x48, 0x48, 0x33, 0x13, 0x09, 0x23, 0xea, + 0x29, 0x1a, 0x87, 0x63, 0xe6, 0x52, 0xd9, 0x00, 0xe5, 0x8f, 0x93, 0x1c, + 0x9f, 0x1b, 0x68, 0x18, 0x95, 0xb9, 0xa7, 0x67, 0x52, 0x7b, 0x42, 0x36, + 0xb6, 0xff, 0x4a, 0x00, 0x01, 0x6b, 0x8a, 0xb5, 0x70, 0x4e, 0x36, 0x79, + 0xc7, 0xed, 0x2a, 0xab, 0xc1, 0x50, 0x6e, 0xcf, 0x90, 0xda, 0x7f, 0x2e, + 0xbf, 0x7e, 0x78, 0x98, 0xe2, 0xd4, 0xbe, 0x56, 0x83, 0x1c, 0xd5, 0xd0, + 0x94, 0xbd, 0x81, 0xfb, 0x7b, 0xa8, 0x29, 0xa3, 0x73, 0x14, 0x1c, 0x63, + 0x29, 0x60, 0xf0, 0x34, 0x54, 0x1b, 0xc3, 0x10, 0xb9, 0x78, 0xa1, 0xb2, + 0x99, 0x77, 0x8b, 0xbc, 0x57, 0x15, 0x40, 0x8c, 0x97, 0x4c, 0x80, 0x6f, + 0xa9, 0xc6, 0xcd, 0xee, 0x60, 0x9b, 0x09, 0xe4, 0x7c, 0xd6, 0x58, 0x45, + 0x13, 0x00, 0xe3, 0x4d, 0x06, 0xe3, 0xdd, 0x47, 0x0a, 0x2d, 0x95, 0xd0, + 0x40, 0x0b, 0xbb, 0x2b, 0xf2, 0xc7, 0x48, 0x97, 0x94, 0x36, 0xe3, 0x6e, + 0x36, 0xe3, 0xff, 0x01, 0x9e, 0xa1, 0x7b, 0xef, 0xfa, 0x9c, 0x26, 0xdf, + 0x31, 0x51, 0x5d, 0x16, 0x52, 0x00, 0x29, 0x92, 0xa0, 0xcb, 0x03, 0xf5, + 0x70, 0x7b, 0x23, 0xa9, 0x60, 0xf2, 0x44, 0x54, 0xd3, 0x05, 0xee, 0xfb, + 0xd0, 0xa7, 0xff, 0xcd, 0x49, 0xb6, 0x7b, 0x56, 0x92, 0x34, 0x2d, 0xef, + 0xcb, 0x7b, 0x10, 0xdf, 0x02, 0x9b, 0x80, 0xce, 0x71, 0x39, 0x77, 0xb6, + 0xc2, 0x17, 0x90, 0x88, 0x78, 0xe4, 0x6c, 0x83, 
0xa2, 0x13, 0x92, 0x3d, + 0xaf, 0xd0, 0x83, 0x22, 0xd1, 0x07, 0x8d, 0xd2, 0x70, 0xa4, 0x39, 0x32, + 0x85, 0x9c, 0x59, 0x57, 0x08, 0x03, 0x25, 0xa5, 0x87, 0xfe, 0x19, 0x64, + 0x30, 0x74, 0xb1, 0x5a, 0x28, 0x3c, 0xf7, 0xce, 0xb7, 0x50, 0xf0, 0x20, + 0x0e, 0x6c, 0xf2, 0xe9, 0xfd, 0x87, 0xe1, 0x7e, 0x55, 0x87, 0x40, 0x2a, + 0x1c, 0x98, 0xc6, 0xbd, 0x74, 0xae, 0xe5, 0x1d, 0xa2, 0x4e, 0x97, 0xb4, + 0x6d, 0x6d, 0x90, 0xd2, 0x9b, 0xd1, 0x15, 0xc8, 0xaf, 0xf3, 0x76, 0x63, + 0x33, 0xf7, 0x40, 0x6d, 0xc3, 0x82, 0x26, 0x83, 0xe5, 0x40, 0xa2, 0xb3, + 0xd0, 0x8f, 0x9c, 0xa2, 0x14, 0xc9, 0x3a, 0xe4, 0x31, 0xa8, 0xab, 0x2e, + 0x06, 0x2e, 0x3e, 0x05, 0xee, 0x06, 0x2e, 0xf7, 0x2b, 0x2f, 0x1b, 0xbb, + 0x3f, 0x3e, 0x82, 0xed, 0x4a, 0x82, 0xae, 0x23, 0x12, 0xc7, 0x8e, 0xdc, + 0x95, 0xe5, 0x6c, 0x3d, 0xcb, 0x17, 0x36, 0x94, 0xe1, 0xba, 0x4e, 0x2e, + 0xca, 0x5f, 0x77, 0x95, 0x50, 0xf8, 0xa0, 0x85, 0xfd, 0x65, 0x66, 0x80, + 0x87, 0xbe, 0xc5, 0x97, 0x04, 0x4f, 0x36, 0xc6, 0xa6, 0xa8, 0x81, 0x7b, + 0xdb, 0xc5, 0xbe, 0x3f, 0x7b, 0x80, 0x7d, 0x7d, 0xc1, 0xd1, 0x48, 0x62, + 0x8e, 0x6f, 0x78, 0xdb, 0x6b, 0xdd, 0x66, 0xcd, 0xa8, 0x21, 0x13, 0xcd, + 0x5d, 0xe0, 0x0c, 0xb2, 0x0d, 0x5b, 0xbe, 0x18, 0xf3, 0x75, 0x50, 0xc4, + 0x8a, 0xaf, 0x1e, 0x21, 0x99, 0xf7, 0xce, 0x07, 0x2c, 0x53, 0xf1, 0xaf, + 0x96, 0x91, 0x0b, 0xcb, 0x48, 0xad, 0x1e, 0xe4, 0x8b, 0x2b, 0xb1, 0x30, + 0x87, 0x65, 0xfc, 0xee, 0xdf, 0x02, 0x7d, 0x4b, 0x05, 0x57, 0xd9, 0xc6, + 0x2b, 0x9a, 0x51, 0x97, 0xa9, 0xd8, 0xee, 0xd6, 0x5d, 0x28, 0x8e, 0x26, + 0x20, 0xee, 0xa0, 0xfd, 0x58, 0x5d, 0x58, 0x5d, 0x58, 0xc4, 0xd9, 0x06, + 0xe6, 0x2b, 0x71, 0xcd, 0x51, 0xf6, 0x9d, 0xe3, 0x90, 0x68, 0x01, 0x8d, + 0x9a, 0x7d, 0x8b, 0x55, 0x7d, 0xc3, 0x26, 0x3f, 0x49, 0xfd, 0x34, 0xc2, + 0x97, 0x03, 0x40, 0x9b, 0x26, 0xe1, 0x6e, 0x33, 0x92, 0xd8, 0x5e, 0x71, + 0x48, 0x1f, 0xb8, 0x2e, 0x01, 0x8d, 0x8d, 0xa2, 0xd3, 0xdb, 0x7e, 0x95, + 0x20, 0x74, 0xb2, 0x36, 0x05, 0xf5, 0xd0, 0xbc, 0xc3, 0x6c, 0x8e, 0xdc, + 0x0b, 0x3a, 0xb2, 0xdd, 0x34, 0x1b, 0x5a, 0xd2, 
0x9a, 0x05, 0x0a, 0x5e, + 0x7f, 0x1f, 0x8b, 0x21, 0xa1, 0x72, 0x36, 0x3e, 0xbf, 0x38, 0x10, 0x1b, + 0xcf, 0x23, 0x8f, 0x92, 0x92, 0xcf, 0x3c, 0x1d, 0xca, 0x20, 0xec, 0xb8, + 0x65, 0x28, 0x34, 0x8c, 0x7a, 0xb0, 0xc9, 0xf6, 0xa7, 0xcb, 0xf0, 0xc2, + 0xf5, 0xc0, 0x90, 0xc4, 0x03, 0x79, 0x88, 0x51, 0x38, 0xe3, 0xa6, 0xf9, + 0x55, 0xd9, 0x56, 0x6d, 0x76, 0x0e, 0xaa, 0x14, 0xa2, 0x1c, 0x7b, 0xe2, + 0x4f, 0x90, 0x4d, 0x1c, 0x7d, 0x87, 0xd9, 0xac, 0x22, 0x43, 0xf9, 0xb7, + 0x4e, 0xaf, 0xc3, 0x96, 0x09, 0xbe, 0x4c, 0x1b, 0xdf, 0x72, 0x45, 0x6a, + 0x9b, 0xe3, 0x57, 0x18, 0x64, 0x95, 0xa9, 0xa0, 0x00, 0x30, 0xe3, 0xc0, + 0x65, 0x92, 0x8c, 0x32, 0x1e, 0x1a, 0x66, 0x4f, 0x4b, 0x1a, 0x02, 0x0f, + 0xc2, 0x79, 0x7a, 0x3a, 0x62, 0x53, 0x15, 0xea, 0x63, 0x39, 0x99, 0x8c, + 0xb0, 0x65, 0xe6, 0x57, 0x90, 0x5e, 0x0f, 0x31, 0xc7, 0x9f, 0xb1, 0x8b, + 0x40, 0x00, 0x55, 0xd9, 0xac, 0x91, 0x20, 0x7f, 0x73, 0x12, 0x9d, 0xb9, + 0x53, 0x1b, 0x2e, 0x3a, 0x0e, 0x2d, 0x48, 0x7a, 0x65, 0x54, 0xdf, 0x4b, + 0xbd, 0x8e, 0xc7, 0xe4, 0x47, 0x10, 0xe2, 0x1f, 0x2e, 0x7b, 0x55, 0x18, + 0xac, 0x05, 0x2c, 0xb8, 0x62, 0xe8, 0xef, 0xf4, 0x52, 0x64, 0xe4, 0xf9, + 0x32, 0xf6, 0xb1, 0x1c, 0x99, 0xc7, 0x43, 0x5a, 0x59, 0xd0, 0xd3, 0x92, + 0x88, 0x11, 0x54, 0x22, 0x60, 0x54, 0x00, 0x12, 0xb0, 0x4a, 0x32, 0xbd, + 0xef, 0xb4, 0x29, 0x71, 0xbe, 0x5f, 0x33, 0xc6, 0x79, 0xa3, 0xd6, 0x02, + 0xd2, 0xa6, 0x92, 0x86, 0x55, 0x4d, 0xab, 0x42, 0x8e, 0xbc, 0x62, 0x9c, + 0xa9, 0x71, 0x70, 0x8c, 0x69, 0x99, 0x82, 0x4b, 0x4e, 0x95, 0x5b, 0x76, + 0xf2, 0xa9, 0x87, 0x7b, 0xda, 0xd3, 0x3a, 0x7d, 0xab, 0x40, 0x00, 0x45, + 0xdf, 0x62, 0xb3, 0x02, 0xa6, 0x44, 0x6b, 0xb5, 0x70, 0xa5, 0x2b, 0xab, + 0x2d, 0x9e, 0xef, 0xc3, 0x4c, 0xe3, 0x14, 0xab, 0x0a, 0xeb, 0xba, 0x14, + 0x51, 0xef, 0x7f, 0xf6, 0xa9, 0xaa, 0x03, 0x7f, 0xa6, 0x16, 0x28, 0x78, + 0xe8, 0x09, 0xda, 0x00, 0x01, 0x68, 0xf1, 0x9a, 0x17, 0x65, 0xdd, 0x88, + 0xc8, 0xd9, 0x15, 0x1e, 0x44, 0x0a, 0x2e, 0x8e, 0x67, 0x45, 0xe4, 0xb8, + 0xda, 0xd9, 0xa0, 0x3c, 0x22, 0x52, 0xaf, 0x97, 
0x99, 0x1f, 0xdc, 0xd9, + 0xed, 0xbf, 0x90, 0x38, 0xce, 0xe7, 0x50, 0x5a, 0x00, 0x02, 0x89, 0x6f, + 0xa2, 0x75, 0xc5, 0xc1, 0x4e, 0xda, 0x8f, 0x34, 0xb1, 0x67, 0xbc, 0xd4, + 0x26, 0x80, 0x67, 0x46, 0x86, 0x38, 0x5d, 0x76, 0xca, 0xfa, 0x3b, 0x85, + 0x5a, 0x6a, 0xd4, 0x09, 0x88, 0x83, 0x04, 0x4a, 0x1b, 0x01, 0x3e, 0x3e, + 0x09, 0x97, 0x74, 0xa3, 0x36, 0x86, 0x63, 0x21, 0x65, 0xf9, 0xf9, 0xe9, + 0x5c, 0xc0, 0x48, 0x2d, 0x33, 0xf9, 0x32, 0xe2, 0x7b, 0x94, 0xab, 0x49, + 0xfa, 0x37, 0x73, 0xa5, 0xaf, 0x52, 0xcb, 0xb9, 0x5e, 0xd2, 0xf0, 0x00, + 0x02, 0x30, 0x7f, 0x0f, 0xea, 0x76, 0x3d, 0x46, 0x5f, 0x68, 0x8a, 0x90, + 0xc7, 0x31, 0x21, 0x7b, 0x5a, 0x8c, 0x5a, 0x7b, 0xa0, 0xc9, 0x8f, 0xba, + 0x34, 0x7e, 0x22, 0x5b, 0x47, 0xb1, 0xcb, 0x5b, 0x05, 0x88, 0x97, 0x51, + 0x17, 0x78, 0x00, 0x01, 0x6b, 0xb0, 0xfc, 0x59, 0xa9, 0x4e, 0x3f, 0x6c, + 0x69, 0x42, 0xc9, 0xfd, 0x7b, 0xa0, 0x15, 0x7d, 0xf6, 0x44, 0x85, 0x08, + 0xa6, 0xc4, 0x3d, 0x11, 0x8b, 0xdb, 0x2c, 0x19, 0x92, 0x3b, 0x73, 0x7f, + 0x9e, 0x75, 0x4f, 0x81, 0x49, 0x4f, 0x88, 0xad, 0x00, 0x01, 0x7a, 0x58, + 0x11, 0xa0, 0xc8, 0xf4, 0xd8, 0x88, 0x78, 0x0c, 0x37, 0xa5, 0x0b, 0xd5, + 0xc7, 0x32, 0x3a, 0x01, 0x12, 0x12, 0x91, 0x96, 0xb2, 0xc3, 0xde, 0x0d, + 0x43, 0xcc, 0x84, 0x77, 0xab, 0xa5, 0xd8, 0x5b, 0x12, 0x92, 0x98, 0x08, + 0xe7, 0x19, 0x43, 0x7c, 0x11, 0xce, 0x2b, 0xa1, 0x1d, 0x20, 0x62, 0x91, + 0x4e, 0xfd, 0xe1, 0x54, 0x0b, 0xee, 0x36, 0x9f, 0x62, 0xb5, 0x8d, 0x65, + 0x85, 0xf1, 0xe0, 0x00, 0x03, 0x33, 0xea, 0xfe, 0xf5, 0x80, 0xbf, 0x61, + 0x93, 0x16, 0xae, 0xf0, 0xdb, 0x03, 0x98, 0x72, 0x34, 0x13, 0x28, 0x36, + 0x0c, 0x5a, 0x00, 0xb8, 0x19, 0xce, 0x2a, 0x93, 0x93, 0xe8, 0xb3, 0x37, + 0x11, 0x50, 0x5d, 0x6a, 0xde, 0x00, 0x00, 0x51, 0xd9, 0x01, 0xa2, 0x2e, + 0x9b, 0x31, 0xac, 0xbc, 0x8c, 0x2e, 0x93, 0x7c, 0x01, 0xc6, 0x79, 0xc3, + 0xba, 0x9e, 0x0d, 0xab, 0xfa, 0xaf, 0x60, 0xc5, 0xe7, 0xf7, 0x31, 0x99, + 0x82, 0x7f, 0xda, 0xc2, 0xa4, 0x15, 0x43, 0x3c, 0xe4, 0xb5, 0x7d, 0xf2, + 0x4f, 0xf7, 0xa1, 0x07, 0xfe, 0x2b, 0x2c, 0x9d, 
0x43, 0x3d, 0xb4, 0xc7, + 0x28, 0x83, 0xe2, 0xa9, 0x74, 0x39, 0x0c, 0xca, 0xc0, 0x77, 0xea, 0xfd, + 0x1a, 0x16, 0x60, 0xf1, 0x16, 0xb5, 0x79, 0xf2, 0xec, 0x1b, 0x05, 0x69, + 0x66, 0xbf, 0xf3, 0x75, 0xfe, 0x44, 0xae, 0x9a, 0x6f, 0x27, 0xea, 0x1d, + 0x26, 0x8a, 0xf8, 0x31, 0x77, 0xb7, 0x35, 0x75, 0xb5, 0xf7, 0xdb, 0xcc, + 0x8b, 0x16, 0x7e, 0xb1, 0x8e, 0x83, 0x80, 0x24, 0x24, 0x68, 0x00, 0x08, + 0xaf, 0xa2, 0xe0, 0x97, 0xd0, 0x6d, 0x13, 0xfc, 0x9c, 0x5c, 0x44, 0x2e, + 0x77, 0x15, 0x87, 0x5b, 0x8f, 0xda, 0x00, 0x9d, 0x8c, 0x8d, 0xaa, 0xd2, + 0xc9, 0x5a, 0x77, 0x6e, 0xd0, 0x9e, 0xc4, 0x88, 0xb0, 0x2e, 0x86, 0x0a, + 0x41, 0xa0, 0x00, 0x40, 0x59, 0x37, 0x17, 0xba, 0xe4, 0xc1, 0xff, 0xac, + 0xb8, 0x55, 0xd4, 0xa6, 0x17, 0x7e, 0xb4, 0xaf, 0x1f, 0x02, 0x6f, 0x54, + 0x44, 0xea, 0xb6, 0x19, 0x2e, 0x76, 0x47, 0xd0, 0x0e, 0x8e, 0x2e, 0xaf, + 0xb2, 0x0c, 0xd1, 0x4a, 0xd2, 0xae, 0x6c, 0x8d, 0x2a, 0xb5, 0xb2, 0x28, + 0x4d, 0xfc, 0xb1, 0x7a, 0xea, 0x6a, 0xd5, 0x29, 0x24, 0x0c, 0x95, 0x5e, + 0x7c, 0x6a, 0x24, 0x1d, 0xe3, 0x4d, 0x3e, 0x94, 0x26, 0xb4, 0x77, 0x11, + 0xb7, 0xdb, 0x04, 0x78, 0x63, 0x81, 0x6d, 0x45, 0xf4, 0xe8, 0xe2, 0x26, + 0x3f, 0xce, 0xb0, 0x10, 0x1c, 0x7e, 0xb0, 0x46, 0x97, 0x48, 0xc1, 0xc6, + 0xac, 0xf9, 0xc5, 0xc0, 0xba, 0xf2, 0x15, 0xc7, 0x91, 0x57, 0x61, 0xc9, + 0xa1, 0x40, 0xe6, 0xac, 0x6b, 0xd6, 0x2f, 0x24, 0xa2, 0x89, 0xa0, 0x00, + 0x37, 0xd3, 0x24, 0x56, 0xed, 0x6a, 0x25, 0x9e, 0x40, 0x85, 0x08, 0x51, + 0x11, 0xbc, 0x6c, 0x85, 0x0d, 0x10, 0xd6, 0xd6, 0xf3, 0x70, 0x7e, 0x89, + 0x09, 0xf7, 0x8e, 0x26, 0xff, 0xf2, 0x56, 0xbb, 0x79, 0xd1, 0x9e, 0x44, + 0xea, 0x56, 0x80, 0x00, 0xd8, 0x83, 0x87, 0x25, 0xad, 0x6e, 0x7e, 0x86, + 0x8c, 0xeb, 0x42, 0x3b, 0xeb, 0x22, 0xac, 0x8b, 0x6f, 0xb3, 0x2a, 0x5c, + 0xdf, 0x2c, 0x9d, 0x85, 0xbc, 0x61, 0xe2, 0x1a, 0x61, 0x0a, 0x4e, 0xde, + 0xe9, 0xb6, 0x0e, 0x63, 0x2b, 0x40, 0x00, 0x5f, 0x7b, 0x67, 0xa7, 0x79, + 0x71, 0x52, 0x16, 0x0d, 0x83, 0xeb, 0x35, 0x33, 0xee, 0xeb, 0x77, 0xfd, + 0xd1, 0x8b, 0x3a, 0xc0, 0x89, 0x3f, 0xb9, 0xa8, 
0x14, 0x89, 0xe2, 0x3d, + 0x03, 0x9b, 0xe8, 0xf7, 0x47, 0xdc, 0xd0, 0x3e, 0xac, 0x90, 0xbe, 0x54, + 0x9a, 0xbc, 0x11, 0xda, 0xde, 0xd6, 0x44, 0xc1, 0xa4, 0xf8, 0xe9, 0xe3, + 0x31, 0x09, 0x38, 0xf7, 0xe5, 0x90, 0x71, 0x4e, 0x99, 0x95, 0x7b, 0x92, + 0x07, 0x85, 0xe0, 0x50, 0xce, 0x9b, 0xc0, 0x00, 0x04, 0x80, 0xf0, 0x75, + 0x88, 0xfd, 0xdf, 0xf0, 0xda, 0x6c, 0xb5, 0xbb, 0xfe, 0xbc, 0x02, 0x09, + 0x56, 0xa8, 0xae, 0x06, 0x68, 0xdc, 0xc3, 0xbd, 0x2d, 0x0d, 0xe7, 0x1a, + 0xce, 0x8a, 0x75, 0xc1, 0xaa, 0xd5, 0xf4, 0x65, 0xe0, 0x00, 0x04, 0xd5, + 0xbc, 0xfc, 0xf2, 0xfc, 0x3f, 0xa8, 0xe8, 0x42, 0x73, 0x01, 0x72, 0xea, + 0x47, 0x75, 0x47, 0xba, 0xc2, 0x44, 0x45, 0x78, 0xa6, 0xad, 0x65, 0xa2, + 0xac, 0x9d, 0x81, 0x7f, 0xea, 0x59, 0xe0, 0x32, 0x1f, 0x78, 0xad, 0x00, + 0x01, 0x90, 0xc5, 0x8c, 0x2c, 0x3f, 0x26, 0xda, 0x01, 0x98, 0x64, 0xb3, + 0x91, 0x4a, 0x89, 0xe2, 0xf2, 0x2c, 0x93, 0x18, 0x94, 0x0e, 0x01, 0x12, + 0xea, 0x3c, 0xb2, 0xa1, 0xbf, 0xa8, 0x81, 0xb2, 0x3c, 0xab, 0xa4, 0xe4, + 0xec, 0x51, 0xb9, 0x50, 0xaa, 0x64, 0x0a, 0x50, 0xf4, 0x35, 0xe4, 0xfe, + 0x5c, 0xdc, 0xce, 0x01, 0x13, 0x7f, 0x2e, 0x90, 0x41, 0x4c, 0x65, 0x0c, + 0xe2, 0xa7, 0x55, 0x69, 0x3a, 0x7a, 0x65, 0x30, 0x91, 0x2b, 0x8c, 0x93, + 0x87, 0xaf, 0xe2, 0x3e, 0x74, 0xcb, 0x5d, 0x57, 0x9b, 0x78, 0x9f, 0x0e, + 0xff, 0xbc, 0x00, 0x00, 0xac, 0x2e, 0x02, 0xb2, 0xdc, 0xa0, 0xe7, 0x29, + 0x19, 0x99, 0x01, 0x42, 0x09, 0x22, 0xd0, 0x48, 0x58, 0x57, 0x40, 0xb8, + 0x53, 0xd9, 0x96, 0x9e, 0x38, 0x33, 0xe9, 0x20, 0x69, 0x08, 0x46, 0x6a, + 0x4c, 0x80, 0x37, 0x7d, 0x3c, 0x64, 0x4f, 0x00, 0x00, 0x28, 0x00, 0x62, + 0x35, 0x47, 0xad, 0x6f, 0x13, 0xcb, 0x2b, 0x1a, 0x7e, 0x89, 0xaa, 0x9b, + 0x6a, 0x59, 0x1b, 0x2f, 0x30, 0x72, 0xbc, 0xfa, 0xe1, 0xb2, 0x9f, 0xc5, + 0x8a, 0x5f, 0xbf, 0x67, 0x9f, 0x87, 0x26, 0x7d, 0xdb, 0xc9, 0xe0, 0x01, + 0xbc, 0xf7, 0xae, 0x3e, 0xb5, 0x3a, 0xd7, 0x8b, 0xb5, 0x96, 0x20, 0x5f, + 0x16, 0x99, 0xe9, 0xe1, 0x20, 0x97, 0xa5, 0x69, 0x33, 0x9a, 0xa3, 0x17, + 0x63, 0xcb, 0x50, 0xbb, 0x02, 0x1d, 0x00, 0x37, 
0x80, 0x00, 0x20, 0xbc, + 0x04, 0xeb, 0xc1, 0x44, 0xc2, 0x8d, 0x51, 0x53, 0x4b, 0xd0, 0x13, 0x36, + 0xfb, 0x7d, 0x2d, 0xb6, 0xe8, 0x27, 0xf9, 0x4f, 0xa4, 0xbf, 0x33, 0xb3, + 0x9b, 0x55, 0x05, 0xbc, 0x5f, 0xb4, 0xab, 0x5d, 0x62, 0x94, 0x69, 0xee, + 0x1b, 0xc0, 0x00, 0x04, 0x15, 0x08, 0x75, 0xa3, 0x94, 0xfa, 0x0e, 0xdb, + 0x22, 0x09, 0x9c, 0x52, 0x60, 0x61, 0xf3, 0x97, 0x04, 0x81, 0x58, 0x09, + 0xaf, 0x58, 0x6f, 0x1d, 0x3c, 0x36, 0x01, 0xc8, 0xdb, 0xf5, 0x0d, 0x0f, + 0x19, 0x59, 0xb8, 0x07, 0x80, 0xca, 0xce, 0xa1, 0x83, 0xaa, 0x7c, 0x81, + 0x34, 0x1c, 0x19, 0xb4, 0x37, 0x1e, 0x7f, 0xc8, 0x77, 0x03, 0xfa, 0xc7, + 0x42, 0xf3, 0xef, 0x38, 0x4b, 0x42, 0x3c, 0xd1, 0xfa, 0x83, 0xcf, 0x8a, + 0xe8, 0x5a, 0x2c, 0xfb, 0x92, 0x20, 0xb0, 0x9f, 0xad, 0x0b, 0xbe, 0x8a, + 0x9a, 0xfc, 0x72, 0x3d, 0x32, 0x8d, 0x98, 0x6c, 0x9a, 0x6b, 0xd6, 0x8b, + 0x4c, 0x5e, 0x0c, 0xea, 0xbf, 0xc0, 0x0a, 0xf4, 0x2b, 0x0c, 0x78, 0x81, + 0xd4, 0x65, 0xb6, 0x53, 0x2c, 0xdc, 0x27, 0x8d, 0xd4, 0x54, 0xfc, 0xbf, + 0x3b, 0xe4, 0x81, 0x68, 0x32, 0xbb, 0xe7, 0x20, 0x54, 0x8b, 0x37, 0xce, + 0xec, 0x57, 0xe8, 0xd1, 0xe1, 0xe3, 0xa9, 0x04, 0xfb, 0x89, 0xdd, 0x97, + 0x84, 0x44, 0x18, 0x44, 0x7d, 0xea, 0x02, 0x70, 0xc0, 0xbe, 0x4b, 0xd4, + 0xb5, 0x5a, 0xa0, 0x2e, 0x7d, 0x76, 0x97, 0xdc, 0x2a, 0x9a, 0x69, 0xe7, + 0xd5, 0x13, 0x10, 0x6c, 0x95, 0xe8, 0x69, 0xc7, 0xc1, 0xc4, 0x7c, 0xf7, + 0x20, 0x01, 0x5e, 0x8e, 0xe3, 0xe7, 0xd8, 0xb7, 0x2e, 0x12, 0x75, 0x47, + 0x76, 0x81, 0x1f, 0xde, 0x87, 0x10, 0xaa, 0x15, 0x4c, 0xcb, 0xd8, 0x3f, + 0x2b, 0xf9, 0x7e, 0x4a, 0x15, 0xeb, 0x22, 0xc7, 0x9d, 0x89, 0x09, 0x89, + 0x09, 0x89, 0x10, 0x57, 0xed, 0xdf, 0x9f, 0xd3, 0xbd, 0x30, 0x07, 0x94, + 0xb8, 0x03, 0x5f, 0x8b, 0x0c, 0xbd, 0xfe, 0x10, 0x29, 0x50, 0xd8, 0x3a, + 0x8f, 0x84, 0xab, 0xe5, 0x1a, 0x90, 0x50, 0xf2, 0xd9, 0x6b, 0xcd, 0xe1, + 0xa4, 0xc1, 0x7e, 0xdd, 0x3f, 0x6b, 0xcc, 0x40, 0x09, 0x97, 0x2b, 0xf0, + 0xca, 0xa7, 0xc5, 0x8e, 0x34, 0x6a, 0x16, 0x7b, 0xbe, 0x26, 0x7a, 0x47, + 0xe3, 0x84, 0x17, 0x0d, 0x66, 0x9b, 0x38, 0x44, 
0x70, 0x68, 0x7d, 0x7f, + 0xfb, 0x1a, 0x2a, 0xad, 0xfb, 0x27, 0x63, 0x9d, 0x3f, 0xa3, 0xf8, 0x30, + 0x0d, 0x81, 0xf2, 0x66, 0xd6, 0xe2, 0xd1, 0x8e, 0x40, 0xee, 0x10, 0xc9, + 0xfb, 0xdc, 0x07, 0x45, 0x1c, 0xdc, 0xfb, 0x50, 0xb4, 0xee, 0x42, 0x14, + 0x61, 0xb0, 0x3e, 0x38, 0xd5, 0x26, 0xca, 0x5b, 0x9a, 0x39, 0x43, 0x9a, + 0x1b, 0x9a, 0x39, 0x42, 0x38, 0xc9, 0x84, 0x2c, 0x7e, 0x15, 0xc2, 0x6f, + 0x26, 0xea, 0x47, 0x24, 0x83, 0xda, 0x7a, 0x55, 0x54, 0x75, 0x47, 0x76, + 0x07, 0x01, 0xf0, 0x5f, 0xc6, 0x91, 0xd5, 0xab, 0xce, 0x86, 0xe5, 0x88, + 0x4a, 0xdb, 0x70, 0xcb, 0x19, 0xd4, 0xae, 0xf2, 0xce, 0x6c, 0xd5, 0x6a, + 0xc3, 0xb3, 0x73, 0x1f, 0x02, 0x96, 0xb3, 0xec, 0x47, 0xe2, 0x74, 0xc6, + 0x61, 0x6e, 0x52, 0x9a, 0xcd, 0x07, 0x4f, 0xc5, 0xaa, 0x31, 0xff, 0x4b, + 0x8a, 0x59, 0x32, 0x05, 0xa7, 0xd3, 0x0e, 0xf0, 0x69, 0xe7, 0x74, 0x96, + 0x1f, 0x5a, 0xb4, 0x42, 0xff, 0x04, 0x21, 0xe2, 0xf5, 0xc8, 0x01, 0x66, + 0x98, 0x53, 0x30, 0x0e, 0x63, 0x9a, 0x2f, 0x61, 0x0e, 0x7b, 0xb4, 0x66, + 0x14, 0xe4, 0x0c, 0xf2, 0x2d, 0xc8, 0x8b, 0x33, 0xe1, 0x33, 0x56, 0x21, + 0x89, 0xe9, 0xe0, 0x19, 0x67, 0x12, 0xbe, 0xe6, 0xd2, 0xbc, 0x0c, 0xac, + 0xbf, 0x94, 0xf6, 0x80, 0x00, 0xb3, 0x9c, 0x19, 0xcb, 0x9a, 0xd6, 0xda, + 0x95, 0x6c, 0x03, 0xfa, 0x72, 0xf7, 0xe9, 0xa0, 0xda, 0xfe, 0x0d, 0xce, + 0x84, 0xe8, 0xa1, 0x9c, 0xe8, 0xe3, 0x3b, 0x61, 0x0a, 0x84, 0x61, 0x53, + 0x90, 0x2e, 0x27, 0xed, 0x7d, 0xc3, 0x40, 0x00, 0x64, 0xe0, 0xed, 0xe8, + 0x8d, 0x67, 0x22, 0x22, 0xe8, 0x19, 0x7c, 0x52, 0x05, 0xf8, 0x9e, 0x23, + 0xb2, 0xaa, 0xdf, 0x76, 0x18, 0xaf, 0x58, 0xf2, 0xbd, 0x2e, 0xdf, 0xc2, + 0xe8, 0xaf, 0xb2, 0x7d, 0xe4, 0xe1, 0x81, 0x43, 0xa9, 0xff, 0x32, 0xa0, + 0x4e, 0xc8, 0xd4, 0xf3, 0x75, 0x9a, 0x99, 0xa9, 0x66, 0x2f, 0xc0, 0x9c, + 0xe5, 0x13, 0x32, 0x39, 0x06, 0xc0, 0x57, 0x75, 0xd1, 0x50, 0x44, 0xc0, + 0xec, 0x54, 0x33, 0xe0, 0x0f, 0xa4, 0x35, 0x54, 0xe1, 0xd9, 0x9a, 0x00, + 0x09, 0x49, 0xf0, 0x80, 0x9f, 0x2d, 0x4e, 0xb4, 0xab, 0xfa, 0xa8, 0xad, + 0xb8, 0x5f, 0xf7, 0x03, 0x80, 0x05, 0x66, 0x47, 
0xfe, 0x41, 0x1c, 0x5d, + 0x92, 0xe2, 0xab, 0x52, 0xd3, 0x05, 0xe0, 0x84, 0xdc, 0xbd, 0x7f, 0x5d, + 0x06, 0xea, 0x9a, 0x00, 0x03, 0x3b, 0xa7, 0xb5, 0x06, 0x3c, 0x8d, 0xbc, + 0xbd, 0x81, 0x08, 0xb3, 0x43, 0x8d, 0x97, 0x9a, 0x1a, 0x5b, 0x75, 0xa7, + 0x86, 0x45, 0x53, 0x95, 0xf7, 0x7e, 0xfc, 0xc1, 0x4f, 0x3a, 0x20, 0x32, + 0xf6, 0xa6, 0xce, 0xea, 0x09, 0xad, 0x37, 0x4f, 0x96, 0x0c, 0xcb, 0xca, + 0x65, 0xfd, 0x7a, 0x09, 0x08, 0x52, 0xf7, 0xb7, 0x2c, 0x93, 0xa3, 0xae, + 0x55, 0x10, 0xa9, 0x83, 0x77, 0x62, 0xf3, 0xce, 0xe2, 0xf4, 0x85, 0xb4, + 0xc5, 0xa2, 0x1b, 0xf7, 0x7b, 0xe5, 0x8f, 0x2c, 0x49, 0x5e, 0xdf, 0x21, + 0xfd, 0x52, 0x41, 0xbc, 0x00, 0x00, 0x93, 0xe8, 0xf8, 0x22, 0x65, 0xe0, + 0xe6, 0x9d, 0x24, 0x65, 0xf5, 0xa3, 0xe9, 0x4a, 0xf5, 0x9d, 0x85, 0xaa, + 0x3a, 0x55, 0x7a, 0x79, 0xc1, 0xc6, 0xac, 0x22, 0xe4, 0x57, 0x02, 0x7e, + 0x30, 0xf8, 0xc5, 0x46, 0x80, 0xbc, 0x00, 0x00, 0xc3, 0x01, 0x78, 0x7f, + 0x4c, 0xfb, 0xe6, 0x35, 0x2b, 0xdc, 0x22, 0xc6, 0xd1, 0xcd, 0x80, 0x42, + 0x58, 0xe5, 0xaa, 0xc7, 0x13, 0x1c, 0x0a, 0xb9, 0xcb, 0xbb, 0xac, 0x7b, + 0x86, 0xe5, 0xd9, 0x65, 0x20, 0x6c, 0x4f, 0x8c, 0x55, 0x50, 0xd0, 0xb9, + 0xae, 0xdf, 0xca, 0x8d, 0x1e, 0x9f, 0x32, 0xd0, 0xac, 0xa3, 0x4c, 0xcc, + 0x3d, 0xeb, 0x50, 0x7b, 0x36, 0xd9, 0xe4, 0x23, 0xc7, 0x95, 0xf2, 0x27, + 0x20, 0xb8, 0x1c, 0x20, 0xdb, 0x59, 0xe0, 0x0a, 0xd2, 0x56, 0x93, 0xbe, + 0x06, 0x47, 0x80, 0x9d, 0xe7, 0x8a, 0x22, 0x6a, 0xd2, 0xb8, 0x6a, 0x1e, + 0x64, 0x5e, 0x5f, 0x77, 0x0b, 0x03, 0x4f, 0x61, 0xa7, 0x4c, 0x71, 0xad, + 0xf6, 0x7e, 0xb8, 0x5d, 0x77, 0x7b, 0x4b, 0xea, 0x10, 0x5c, 0x50, 0x2e, + 0xe4, 0x78, 0x00, 0x01, 0x0a, 0xf8, 0x9c, 0x5d, 0x08, 0x35, 0x16, 0x70, + 0x5f, 0x72, 0x1b, 0x8d, 0x05, 0x5c, 0x9a, 0x6c, 0x25, 0x9f, 0x89, 0x29, + 0x1e, 0x23, 0xe1, 0xa3, 0x4f, 0xbc, 0x0a, 0x0e, 0x60, 0xff, 0x4c, 0xd6, + 0xeb, 0xd2, 0x24, 0x11, 0xe0, 0x2e, 0x62, 0xc2, 0x8f, 0xad, 0xe4, 0x22, + 0xb8, 0x12, 0xd6, 0x16, 0x05, 0x20, 0x12, 0x00, 0x0a, 0x0b, 0x00, 0x00, + 0x00, 0x04, 0x46, 0x3e, 0x56, 0xff, 0xfc, 0xc0, 
0x20, 0x32, 0xa1, 0x0a, + 0x14, 0x00, 0x23, 0x20, 0x02, 0x10, 0x60, 0xa2, 0x41, 0xb7, 0x02, 0xfa, + 0xa4, 0x60, 0x8f, 0x83, 0xec, 0xec, 0x15, 0x58, 0xc2, 0xa3, 0x0c, 0x6f, + 0x72, 0x4f, 0xff, 0xff, 0xfa, 0x7e, 0x91, 0x29, 0x87, 0x38, 0x8c, 0x0e, + 0xd7, 0xde, 0xd9, 0x96, 0x14, 0x8d, 0xcf, 0x01, 0x9d, 0x56, 0x4d, 0xc2, + 0x18, 0x94, 0x81, 0xfd, 0x30, 0x0c, 0xfe, 0xe6, 0x98, 0x13, 0x47, 0x7f, + 0x94, 0x6c, 0x4f, 0x15, 0xf4, 0xcb, 0x91, 0xf3, 0x7d, 0x6a, 0xad, 0x5a, + 0x0d, 0x7b, 0xbc, 0xff, 0x03, 0x89, 0x02, 0xbb, 0x59, 0xbd, 0x71, 0x89, + 0xec, 0x91, 0xb2, 0x5f, 0x97, 0x5a, 0x93, 0x6b, 0x53, 0x74, 0x69, 0x29, + 0xc1, 0x4f, 0xc4, 0xd2, 0x94, 0x12, 0x65, 0xb3, 0x96, 0xde, 0x48, 0x68, + 0x01, 0xed, 0x27, 0x98, 0xf8, 0x62, 0xe1, 0x3f, 0xbd, 0x9d, 0x21, 0x70, + 0xe3, 0xf4, 0x05, 0xcf, 0xbf, 0x3e, 0xce, 0x46, 0x73, 0x31, 0x69, 0xa2, + 0x9e, 0x37, 0xd7, 0xef, 0x6c, 0x69, 0x9c, 0x25, 0x82, 0x2c, 0x23, 0xad, + 0x92, 0x41, 0x32, 0x83, 0xcd, 0x36, 0x33, 0xa7, 0x3a, 0x8f, 0x43, 0xef, + 0x59, 0xf6, 0x0d, 0x64, 0x90, 0x0d, 0x44, 0x46, 0x29, 0x6d, 0x53, 0x68, + 0xcf, 0xab, 0x3f, 0x3d, 0x17, 0xd7, 0xbd, 0xbc, 0xe9, 0x14, 0x30, 0x7f, + 0x8e, 0xaa, 0x4b, 0x59, 0xa9, 0xeb, 0x5c, 0xeb, 0x22, 0xc1, 0x5c, 0x8e, + 0x7e, 0x5c, 0x8d, 0xe1, 0x60, 0xc4, 0xe5, 0xbd, 0xe5, 0x0e, 0x20, 0x43, + 0x3a, 0x88, 0x2d, 0xdd, 0xd1, 0x3e, 0x48, 0x2d, 0x10, 0x44, 0xf6, 0x7f, + 0xe3, 0x81, 0x6c, 0xbd, 0xb7, 0x5a, 0x89, 0x49, 0xd3, 0xf7, 0x1e, 0x9b, + 0xbc, 0xf6, 0x7c, 0x41, 0xbd, 0xec, 0x60, 0x20, 0xa5, 0x4f, 0x01, 0xd9, + 0xc2, 0x10, 0x62, 0x66, 0x17, 0x7e, 0x67, 0xcb, 0x31, 0xda, 0x63, 0x18, + 0xbc, 0xbc, 0x52, 0xc1, 0xbf, 0x7b, 0x5b, 0x21, 0x73, 0xdb, 0x2d, 0x6a, + 0xab, 0x7c, 0xbc, 0x92, 0x73, 0xfd, 0xdd, 0xce, 0x1e, 0x49, 0x49, 0x11, + 0x78, 0xbd, 0x8f, 0x36, 0xea, 0xc0, 0x96, 0x91, 0xc0, 0xc5, 0xdb, 0xdd, + 0x4a, 0x66, 0xdd, 0xee, 0xce, 0x3a, 0x00, 0xba, 0x64, 0x0e, 0x6a, 0xad, + 0x3f, 0x74, 0x3a, 0xe3, 0x52, 0x55, 0x80, 0x04, 0x04, 0xbe, 0xa2, 0xaa, + 0x19, 0xb6, 0x59, 0x71, 0xa6, 0x96, 0xd8, 0x1b, 
0x94, 0x99, 0xfb, 0xba, + 0x31, 0x22, 0x85, 0xdd, 0x06, 0xc9, 0x81, 0x37, 0x11, 0x45, 0x78, 0x76, + 0xa5, 0x47, 0x7c, 0xbb, 0x5f, 0xbb, 0x3e, 0x1b, 0xe1, 0x02, 0xfa, 0x1c, + 0xe0, 0x09, 0xe1, 0x95, 0x1b, 0x07, 0xf5, 0x25, 0x49, 0xba, 0x6f, 0xa0, + 0x27, 0x3b, 0xce, 0xb0, 0xc4, 0x35, 0x46, 0xe8, 0xd3, 0xff, 0x6a, 0xb4, + 0x1d, 0xff, 0xf5, 0x0b, 0x11, 0x80, 0x0b, 0xf1, 0xfb, 0x11, 0xaf, 0xa0, + 0xbd, 0x74, 0xdf, 0xdc, 0x5d, 0x62, 0xae, 0xa0, 0x41, 0xfe, 0x31, 0x21, + 0xce, 0xa7, 0x12, 0xd8, 0x1f, 0x2f, 0x5e, 0xdb, 0x6c, 0x0b, 0x6d, 0xbe, + 0x34, 0x72, 0x19, 0x46, 0x07, 0x83, 0x23, 0x47, 0x07, 0x88, 0x53, 0x5b, + 0x81, 0x27, 0xb3, 0x6d, 0xcc, 0xbd, 0x32, 0xa7, 0x4f, 0xf4, 0xda, 0x67, + 0xfe, 0x46, 0x93, 0x53, 0xa9, 0xa3, 0xce, 0xfe, 0xb5, 0xa9, 0xfd, 0xe6, + 0x6d, 0x9e, 0x18, 0xb0, 0x21, 0xac, 0x84, 0xc5, 0x33, 0x51, 0x83, 0x88, + 0xe9, 0x0d, 0xce, 0x3e, 0x14, 0xca, 0x67, 0xd6, 0x1d, 0xde, 0x6d, 0xa5, + 0x46, 0x66, 0xbe, 0x83, 0xba, 0x83, 0x55, 0x65, 0x06, 0x32, 0xd0, 0xd1, + 0x4d, 0x42, 0xc2, 0xf7, 0xb9, 0xde, 0x6f, 0xad, 0xc7, 0x55, 0xad, 0x65, + 0x9c, 0x63, 0x10, 0x2a, 0x9f, 0x76, 0x6c, 0x01, 0x84, 0xbc, 0x52, 0xb8, + 0x7d, 0xe4, 0x90, 0x27, 0xb4, 0x04, 0x1d, 0x2f, 0x56, 0x5a, 0xac, 0x4d, + 0x37, 0x7b, 0x50, 0x12, 0x09, 0x16, 0x96, 0x04, 0x25, 0x58, 0x2a, 0xc5, + 0xb6, 0x56, 0x30, 0x3f, 0x1d, 0x70, 0xed, 0x05, 0xd3, 0xc8, 0xd7, 0x46, + 0xda, 0x69, 0xf1, 0x86, 0x28, 0x83, 0xe9, 0x5f, 0x1f, 0xeb, 0xce, 0xa2, + 0x2c, 0x24, 0x75, 0x63, 0xa9, 0x05, 0x00, 0x31, 0x18, 0x35, 0xf1, 0xab, + 0xce, 0xdf, 0x34, 0xc8, 0xec, 0x2b, 0x61, 0x06, 0x66, 0x0d, 0x7f, 0x30, + 0xc3, 0x98, 0x58, 0x89, 0x1b, 0x1f, 0x2f, 0x8d, 0xd6, 0x91, 0xac, 0x1a, + 0xd8, 0x10, 0x61, 0x29, 0xd1, 0x50, 0x8b, 0xae, 0x1a, 0xfe, 0xfd, 0xed, + 0x68, 0xbd, 0x75, 0x75, 0xd1, 0x59, 0x1c, 0xd9, 0x37, 0xfe, 0xd5, 0xc3, + 0xbd, 0x98, 0x8d, 0xc7, 0x68, 0x07, 0x8d, 0x69, 0x76, 0x9c, 0x4a, 0x4c, + 0x32, 0x91, 0x2f, 0xba, 0xf9, 0x59, 0x8b, 0xfa, 0xe6, 0x8a, 0xa5, 0x41, + 0xd7, 0xe5, 0x45, 0xe4, 0x26, 0x3f, 0x2b, 0x48, 
0x6a, 0x6d, 0xe8, 0x37, + 0x4c, 0xb3, 0xbd, 0x2b, 0xe8, 0xe4, 0x92, 0xbb, 0x97, 0x62, 0x63, 0x65, + 0xe4, 0x51, 0x44, 0x6b, 0xfe, 0x37, 0xbe, 0x96, 0xce, 0xed, 0x84, 0x3f, + 0xab, 0x41, 0xf6, 0x6e, 0x0e, 0x1d, 0x73, 0xab, 0xdb, 0x3b, 0xe5, 0x66, + 0xe8, 0x4a, 0xc9, 0xdf, 0xe0, 0x40, 0x6a, 0xa8, 0x8e, 0xcf, 0x85, 0x8b, + 0x3c, 0x65, 0xb5, 0x9e, 0xfe, 0x72, 0x95, 0xad, 0xb5, 0x59, 0x30, 0x27, + 0xd2, 0x66, 0x65, 0x82, 0xc9, 0x3b, 0x27, 0x85, 0x49, 0x7b, 0xfd, 0xc8, + 0x60, 0xf1, 0x55, 0xda, 0x18, 0x14, 0x2c, 0xf3, 0xd1, 0xa3, 0xb1, 0x48, + 0x9e, 0xd1, 0x5b, 0x3e, 0x40, 0x6f, 0x41, 0x4a, 0xeb, 0x9a, 0x98, 0xa7, + 0x2b, 0xa1, 0x7a, 0x0c, 0x8f, 0x94, 0x13, 0xda, 0x89, 0x3e, 0xd4, 0xc5, + 0xfc, 0xa1, 0x6f, 0x40, 0x66, 0x18, 0x1e, 0x41, 0x80, 0xda, 0xb0, 0x87, + 0x9b, 0xa3, 0x81, 0x2d, 0x67, 0xd6, 0x11, 0x92, 0x7b, 0x74, 0x6c, 0xe8, + 0xda, 0x26, 0xee, 0x5e, 0x28, 0x70, 0x22, 0x48, 0xcc, 0x60, 0xa8, 0xcd, + 0x7d, 0x7f, 0x45, 0x11, 0x1e, 0x47, 0x11, 0x00, 0x58, 0xea, 0x51, 0xe4, + 0xb9, 0x23, 0x30, 0x4e, 0x66, 0xf6, 0xdc, 0x53, 0xc9, 0x34, 0x26, 0xba, + 0x3f, 0xc2, 0x52, 0x88, 0x23, 0xfb, 0x4a, 0x77, 0x31, 0x8c, 0x29, 0x52, + 0x30, 0x5e, 0x53, 0x99, 0xde, 0xaa, 0xc9, 0xd0, 0xa9, 0xed, 0xd0, 0x2c, + 0x1a, 0xaa, 0x2a, 0x02, 0x2b, 0x92, 0x7a, 0xf1, 0xab, 0x88, 0xb3, 0xf0, + 0xaa, 0x4e, 0x7a, 0xde, 0x0a, 0x76, 0x24, 0x46, 0xa0, 0x9a, 0x67, 0x19, + 0xcf, 0xa7, 0xc0, 0x46, 0x69, 0x2d, 0x02, 0xfa, 0x39, 0xaa, 0x1f, 0x83, + 0x6f, 0x3c, 0xaa, 0xb6, 0x5e, 0xac, 0x4d, 0x34, 0x26, 0x07, 0xea, 0x50, + 0x96, 0xd8, 0x42, 0x47, 0xbf, 0xa0, 0xe8, 0x4b, 0xad, 0xe1, 0x86, 0xf4, + 0x87, 0x52, 0x30, 0x04, 0x72, 0x23, 0x79, 0xc4, 0x98, 0xf4, 0x8b, 0x4f, + 0x42, 0xf4, 0x58, 0x6e, 0x11, 0x48, 0xc2, 0x69, 0x3d, 0x70, 0xc5, 0x28, + 0x00, 0xbe, 0x03, 0x9d, 0x10, 0xae, 0x6c, 0x8a, 0xa9, 0xf9, 0xe0, 0x1a, + 0xf4, 0xf1, 0x3e, 0x97, 0x9e, 0xf5, 0x2f, 0x29, 0xec, 0x1d, 0x6a, 0xf5, + 0xbe, 0xbc, 0x5a, 0xd7, 0x9f, 0x84, 0x6f, 0xb0, 0xec, 0xd3, 0x96, 0x7a, + 0xd9, 0x00, 0x9a, 0xaa, 0x62, 0x7e, 0xf1, 0x5a, 
0xd7, 0x1c, 0xf6, 0xe3, + 0x6c, 0x4d, 0x87, 0xc6, 0xc2, 0xae, 0xd0, 0x91, 0xed, 0xdf, 0xa0, 0x9d, + 0x37, 0x73, 0x55, 0x89, 0x53, 0x9b, 0xe1, 0x0e, 0xae, 0x5a, 0xcb, 0xfa, + 0xf1, 0x40, 0xaa, 0xb6, 0x7c, 0xa1, 0x15, 0x45, 0x67, 0x87, 0x28, 0x4f, + 0x34, 0x8c, 0xd7, 0xc9, 0x5e, 0x91, 0x06, 0x06, 0x43, 0x44, 0xf5, 0x27, + 0x1c, 0x7c, 0x36, 0xff, 0xd9, 0x11, 0xa6, 0x1a, 0x55, 0x04, 0x57, 0x20, + 0x72, 0xdb, 0xb0, 0xad, 0x4c, 0xdd, 0x74, 0x06, 0x4b, 0x17, 0x64, 0x33, + 0xca, 0xec, 0x7e, 0x37, 0xa6, 0x68, 0x89, 0xea, 0x39, 0x4b, 0x2e, 0xa4, + 0x5f, 0xb5, 0x0d, 0xd2, 0x1a, 0x00, 0xb5, 0x95, 0x07, 0x97, 0x6d, 0xb0, + 0xff, 0x8a, 0xbf, 0xa7, 0xb8, 0xff, 0xae, 0x3f, 0x0c, 0xd8, 0xb7, 0x67, + 0xee, 0x09, 0xf3, 0x74, 0xda, 0x2b, 0x10, 0xc1, 0x33, 0xf7, 0xb0, 0x8d, + 0xfa, 0x37, 0x18, 0x15, 0xc3, 0xaa, 0xce, 0x95, 0x11, 0xb2, 0x44, 0xee, + 0xad, 0x98, 0x2c, 0x5a, 0x95, 0x67, 0x19, 0x08, 0xd8, 0x20, 0xd6, 0x87, + 0xfa, 0x3e, 0x63, 0x61, 0xb5, 0x12, 0x93, 0x6f, 0x23, 0x32, 0xcb, 0x3e, + 0x0c, 0x0b, 0xaa, 0x28, 0xff, 0xca, 0xfb, 0x53, 0x41, 0x3a, 0x78, 0xa6, + 0xad, 0xfe, 0x5f, 0x06, 0x8f, 0xe4, 0x90, 0x21, 0x9b, 0x25, 0xcc, 0x66, + 0xe3, 0x10, 0xea, 0x90, 0x8c, 0x2c, 0x10, 0x50, 0x83, 0xec, 0x46, 0xc1, + 0x8b, 0x86, 0x6f, 0x64, 0x3b, 0x03, 0xcf, 0xd4, 0x1c, 0x49, 0xa7, 0xb8, + 0x04, 0xc9, 0xa0, 0xfc, 0xe8, 0x89, 0xb1, 0x43, 0xb0, 0xef, 0xd7, 0xed, + 0x16, 0xda, 0xdd, 0x42, 0x4f, 0xcd, 0x46, 0x2e, 0x1f, 0xa2, 0x6c, 0xf1, + 0x35, 0xaf, 0xca, 0xa7, 0x20, 0x99, 0x70, 0x0f, 0x73, 0x65, 0x69, 0x03, + 0xc7, 0xe1, 0x14, 0x6d, 0xdb, 0x0a, 0xac, 0x7d, 0x73, 0x52, 0x0e, 0x1c, + 0xe1, 0xf7, 0x4e, 0x70, 0x49, 0x5f, 0x50, 0x47, 0xf2, 0x5c, 0xad, 0x82, + 0xd6, 0x58, 0x28, 0xa1, 0x2b, 0x43, 0x70, 0xcb, 0xbb, 0xd6, 0xb4, 0x35, + 0xf5, 0x69, 0x81, 0x42, 0x10, 0x5f, 0x17, 0xf9, 0x73, 0x4e, 0xf0, 0x24, + 0xe9, 0x41, 0x96, 0xb6, 0xe8, 0xb3, 0x4c, 0xfb, 0xd6, 0x48, 0x91, 0x9e, + 0x71, 0x28, 0xaa, 0xb0, 0xd4, 0x79, 0x45, 0x22, 0xa0, 0xf4, 0x52, 0xc3, + 0x27, 0xc1, 0x12, 0xa8, 0xe0, 0xb9, 0x8f, 0x94, 
0x32, 0x94, 0xa6, 0x9a, + 0x15, 0x4b, 0xeb, 0x52, 0x2c, 0xa4, 0x28, 0xe1, 0x0b, 0x2f, 0xe6, 0x70, + 0xc7, 0xde, 0x3a, 0x2f, 0xec, 0x12, 0x00, 0x32, 0x19, 0x29, 0x04, 0x70, + 0x20, 0x00, 0x00, 0x01, 0xa6, 0x50, 0x30, 0x42, 0x0c, 0x14, 0x58, 0x00, + 0x00, 0x10, 0x00, 0xd4, 0x33, 0xcf, 0x95, 0x8b, 0xec, 0xf4, 0x32, 0x16, + 0x29, 0x02, 0x70, 0x40, 0x00, 0x00, 0x41, 0xa7, 0x28, 0x02, 0x10, 0x60, + 0xa2, 0xc0, 0x00, 0x01, 0xc0, 0x00, 0xd9, 0x98, 0xd6, 0xe0, 0x32, 0xbe, + 0x0b, 0x29, 0x01, 0x40, 0x80, 0x00, 0x20, 0x41, 0xa7, 0x80, 0x09, 0xb5, + 0x74, 0x02, 0xc9, 0xff, 0xb2, 0xa0, 0x00, 0x55, 0x04, 0xfc, 0xb8, 0xfc, + 0xf1, 0xe7, 0x91, 0xda, 0x85, 0x5e, 0x63, 0xf0, 0x9b, 0xf8, 0xee, 0xe9, + 0xac, 0x98, 0xca, 0x41, 0x44, 0xa6, 0xde, 0xba, 0xdd, 0xe2, 0x54, 0xd0, + 0xab, 0xb6, 0xa6, 0x51, 0x54, 0x04, 0x74, 0x11, 0xce, 0x4f, 0xb9, 0xc5, + 0x26, 0xbd, 0x59, 0x0b, 0xeb, 0xe7, 0x9f, 0xa4, 0x8e, 0x07, 0x19, 0x65, + 0x9d, 0xd9, 0x5d, 0xae, 0xa1, 0xcd, 0xcb, 0xfd, 0xe5, 0x9e, 0xd6, 0x5a, + 0x0b, 0x40, 0x2d, 0x64, 0x1e, 0x39, 0x7c, 0x9e, 0xc1, 0xf6, 0xa0, 0x14, + 0x3f, 0x85, 0x9d, 0x65, 0xf6, 0x43, 0x3e, 0xa3, 0xb7, 0xa9, 0x66, 0x10, + 0x3d, 0x02, 0xad, 0x46, 0xb7, 0xd6, 0x5f, 0x15, 0x4b, 0x7b, 0x18, 0x7d, + 0xa6, 0x41, 0x36, 0x06, 0x8f, 0x13, 0x80, 0xd5, 0xc5, 0x7d, 0x26, 0x33, + 0x6b, 0xfd, 0x1e, 0xe9, 0x6e, 0xfc, 0x95, 0x36, 0x42, 0x99, 0xa6, 0x7b, + 0x43, 0x84, 0x5c, 0x29, 0xac, 0xd9, 0x42, 0xaf, 0x3d, 0xd8, 0xe0, 0x5d, + 0xbe, 0x62, 0x23, 0xd1, 0xaa, 0xf8, 0xbb, 0x95, 0x08, 0x3f, 0x57, 0x96, + 0x3a, 0xf0, 0x66, 0x5b, 0x3f, 0xd2, 0x4a, 0xd0, 0x30, 0x74, 0x57, 0x13, + 0xf8, 0x94, 0x75, 0x03, 0x4a, 0xe6, 0xdd, 0x3d, 0xd9, 0x98, 0x9a, 0xa0, + 0x01, 0x02, 0x45, 0xe1, 0xba, 0xfc, 0x28, 0x77, 0x94, 0x5c, 0x10, 0xdf, + 0x0f, 0x96, 0xec, 0x38, 0xf0, 0xe3, 0xd6, 0xdd, 0x87, 0x4d, 0xf9, 0x79, + 0x43, 0x04, 0x03, 0x5b, 0x88, 0xe5, 0xbd, 0x64, 0x5a, 0x96, 0xf1, 0xd3, + 0xf0, 0x3e, 0xf2, 0x83, 0x09, 0x22, 0xf9, 0x18, 0xa5, 0x53, 0x6d, 0x80, + 0x0f, 0xc1, 0x46, 0x2d, 0x8f, 0xae, 0x4c, 0xef, 
0x87, 0x24, 0x74, 0xc3, + 0xf7, 0x70, 0x02, 0x0a, 0xd9, 0x62, 0xc9, 0xda, 0x99, 0x46, 0x4f, 0x0e, + 0xcc, 0x67, 0x19, 0xe9, 0x3a, 0x5c, 0x70, 0x36, 0x1a, 0xea, 0x22, 0x76, + 0xf5, 0x14, 0x59, 0x22, 0x86, 0xfe, 0xec, 0x22, 0xd9, 0x11, 0x1a, 0x7b, + 0xd1, 0xd4, 0x11, 0x3d, 0xa1, 0x92, 0x3f, 0x16, 0xe4, 0x42, 0xb6, 0x39, + 0x2f, 0xaa, 0x26, 0x60, 0x2b, 0x5e, 0xe7, 0xbb, 0x0c, 0x59, 0x5d, 0xa7, + 0x84, 0x78, 0x9f, 0xaf, 0xa0, 0x9d, 0x66, 0xb8, 0x7c, 0xfa, 0xfd, 0x62, + 0x37, 0x21, 0xe9, 0x2b, 0xd5, 0x46, 0xba, 0x92, 0x54, 0x45, 0xe0, 0xa3, + 0xba, 0x2d, 0xfa, 0x32, 0x4f, 0x30, 0xf9, 0x00, 0x29, 0x01, 0x92, 0x69, + 0x62, 0x59, 0xb4, 0xb1, 0x04, 0x00, 0x81, 0x1a, 0xc8, 0x4f, 0x8a, 0x2c, + 0xfc, 0x8a, 0xe6, 0xda, 0x1e, 0x82, 0x7f, 0x66, 0xb8, 0x4f, 0x3f, 0x0e, + 0x36, 0x42, 0x97, 0x43, 0xad, 0x92, 0x3d, 0x7c, 0x18, 0x1a, 0xbd, 0xf7, + 0x36, 0x13, 0xb7, 0x0d, 0x0e, 0x28, 0x6e, 0xa4, 0x54, 0x92, 0x06, 0x21, + 0xf3, 0xda, 0x9a, 0x67, 0x63, 0xf8, 0x81, 0x08, 0xb4, 0x7a, 0x6c, 0x65, + 0xf1, 0xdf, 0xf6, 0xb7, 0x0d, 0x9c, 0x8a, 0x04, 0x7b, 0xda, 0x4c, 0x21, + 0x20, 0x43, 0x44, 0x94, 0x81, 0xa4, 0x11, 0xfa, 0x43, 0xa8, 0x2c, 0xb9, + 0xb0, 0xee, 0x99, 0x20, 0xa3, 0x05, 0x82, 0x5f, 0x25, 0xaa, 0xd2, 0xcb, + 0x18, 0x01, 0xca, 0xdf, 0xf9, 0x08, 0x20, 0x57, 0xf0, 0x74, 0xca, 0x9a, + 0x0f, 0x58, 0x9a, 0x59, 0xfa, 0x28, 0x0f, 0x46, 0x0e, 0x80, 0x56, 0x8b, + 0x66, 0x99, 0xb0, 0x1c, 0xa1, 0xe4, 0x9a, 0x7b, 0x8b, 0x44, 0x3a, 0xa0, + 0x4a, 0x2b, 0xc0, 0x32, 0x60, 0xfa, 0xf3, 0x8f, 0xa5, 0x88, 0xa4, 0x57, + 0xb4, 0xa3, 0xc1, 0x85, 0x11, 0x1b, 0x0f, 0xb9, 0x17, 0xec, 0x8e, 0xb2, + 0x01, 0xe2, 0x74, 0x12, 0x60, 0xd7, 0x8b, 0x3a, 0x0e, 0x08, 0xe6, 0x83, + 0x20, 0x0f, 0x20, 0xbf, 0x0a, 0x03, 0x7e, 0x27, 0x43, 0x66, 0x50, 0xb1, + 0xb2, 0x10, 0xd0, 0xf3, 0xb7, 0x2b, 0x6c, 0x77, 0x65, 0x65, 0x69, 0x98, + 0x93, 0xe6, 0x87, 0xad, 0xb4, 0xd8, 0x5c, 0x70, 0xca, 0x1f, 0x05, 0x81, + 0x84, 0x92, 0x7b, 0x99, 0x81, 0xf6, 0xec, 0x2c, 0xd1, 0x30, 0x1b, 0x96, + 0xd7, 0x57, 0xd6, 0x8a, 0x35, 0x08, 0x9b, 0x6c, 
0xbe, 0x26, 0x59, 0xc5, + 0xf5, 0x70, 0xa1, 0x42, 0xc9, 0x47, 0x9d, 0x46, 0xd1, 0xdc, 0xe2, 0x7b, + 0xee, 0xa1, 0x33, 0xc3, 0x97, 0x23, 0x59, 0x8e, 0x95, 0x83, 0x67, 0x42, + 0xcc, 0x6e, 0x00, 0xc1, 0xf9, 0x5c, 0xdb, 0x8b, 0xc3, 0x87, 0x6a, 0xe8, + 0x11, 0xf8, 0xdf, 0xc9, 0x06, 0x2a, 0x6f, 0x50, 0xce, 0x02, 0x5c, 0x33, + 0x1b, 0x36, 0xd8, 0x0f, 0x02, 0x12, 0x48, 0x7d, 0xbd, 0x62, 0x12, 0xfa, + 0xa5, 0xc3, 0xf1, 0x9b, 0x1c, 0x01, 0x81, 0xed, 0x8b, 0x5b, 0x37, 0x93, + 0xe8, 0x14, 0x63, 0x06, 0x7a, 0xd0, 0x63, 0x6b, 0x54, 0x77, 0x49, 0x52, + 0xf1, 0xe2, 0x1f, 0xd8, 0xc9, 0xab, 0xe0, 0xdc, 0x88, 0xaf, 0xb6, 0x1a, + 0x6e, 0x64, 0x6a, 0xcd, 0x56, 0x6e, 0x84, 0xaf, 0x0c, 0xdc, 0xe4, 0x12, + 0xe9, 0xa0, 0x2d, 0xae, 0x4c, 0x6a, 0xb8, 0x90, 0x87, 0x3d, 0x07, 0x66, + 0x4c, 0x2e, 0x99, 0xe9, 0x4a, 0x6f, 0x77, 0x52, 0x54, 0x7e, 0x57, 0x59, + 0x00, 0xb7, 0x14, 0x0a, 0x0a, 0xce, 0x7f, 0x99, 0xad, 0x2a, 0x6f, 0xe1, + 0x8d, 0xe6, 0x76, 0x9b, 0x46, 0xbe, 0x38, 0x56, 0x1a, 0x25, 0xa6, 0xb1, + 0x81, 0x13, 0x47, 0xb0, 0x86, 0xb5, 0x57, 0xff, 0xce, 0xca, 0xa9, 0x52, + 0xf2, 0x55, 0xdb, 0x02, 0x63, 0xbd, 0xb6, 0x56, 0xf1, 0xcf, 0xa0, 0xdc, + 0xf7, 0xea, 0x2d, 0x00, 0x29, 0x0e, 0x72, 0x3e, 0xeb, 0x7f, 0x08, 0xd0, + 0x1a, 0x23, 0x15, 0xea, 0x5c, 0xfa, 0xc0, 0xbd, 0xaa, 0x73, 0x86, 0x64, + 0xb9, 0x21, 0xd9, 0xd7, 0x2b, 0x7b, 0xe6, 0x08, 0xa1, 0x33, 0x43, 0xf8, + 0x12, 0x89, 0x84, 0x1f, 0xd9, 0x82, 0xb0, 0x13, 0xc5, 0x40, 0x68, 0x72, + 0x1a, 0xd8, 0xf4, 0x20, 0xd5, 0x75, 0xbe, 0xcb, 0xca, 0x13, 0x6b, 0x73, + 0x60, 0x19, 0x87, 0x19, 0x18, 0x57, 0x72, 0xce, 0xb7, 0xd4, 0x6a, 0x3d, + 0x93, 0x69, 0xa4, 0xe2, 0x04, 0xf7, 0x32, 0xb8, 0xad, 0xa1, 0x5f, 0x5f, + 0xc1, 0x53, 0xf7, 0x8e, 0x7d, 0x9f, 0x1f, 0x22, 0xe1, 0x34, 0x94, 0x2a, + 0x33, 0x4d, 0x35, 0xea, 0xed, 0x37, 0x7e, 0xcd, 0x25, 0xbb, 0xff, 0xaa, + 0x6d, 0x72, 0xbf, 0xb4, 0xc2, 0x73, 0xf7, 0xdc, 0x94, 0x9d, 0xd0, 0x44, + 0x0f, 0xdd, 0x18, 0x88, 0x56, 0x9c, 0x73, 0x36, 0x1c, 0x45, 0xef, 0x2e, + 0x99, 0xb1, 0x38, 0xd5, 0x4a, 0xf9, 0x26, 0x4d, 
0x7f, 0x0b, 0x03, 0xde, + 0x0d, 0xd5, 0x98, 0x9d, 0xcf, 0xd8, 0xdc, 0xd4, 0x58, 0x77, 0xc3, 0x71, + 0x26, 0x91, 0x29, 0xd3, 0xa1, 0x55, 0x0e, 0xdc, 0xed, 0xfc, 0x37, 0xf5, + 0xf8, 0x40, 0xfc, 0x1c, 0x60, 0xd8, 0xbf, 0xd8, 0xc5, 0xf1, 0xf5, 0x70, + 0x1c, 0xd8, 0x13, 0x72, 0xc6, 0xaf, 0x35, 0xa6, 0x1f, 0xe1, 0x99, 0xff, + 0x89, 0x8e, 0xe4, 0x16, 0x75, 0x13, 0xf4, 0x52, 0xb4, 0x21, 0x3c, 0xfb, + 0xfc, 0x79, 0x2a, 0x91, 0xe8, 0x14, 0x4c, 0x76, 0x00, 0xc6, 0x20, 0x2e, + 0x51, 0xbd, 0x4e, 0x56, 0x49, 0x95, 0x44, 0xe1, 0x48, 0x85, 0x6d, 0xcc, + 0x46, 0x90, 0x75, 0xb7, 0x2b, 0xb6, 0x25, 0x24, 0xd1, 0x83, 0x07, 0x10, + 0x39, 0x43, 0x6e, 0x86, 0x33, 0x16, 0xc2, 0x87, 0x02, 0x67, 0xf3, 0x1a, + 0x0d, 0xa3, 0x77, 0x99, 0xaa, 0x84, 0x8e, 0x08, 0x8e, 0x1d, 0xe8, 0x80, + 0xfb, 0xfd, 0xb4, 0x16, 0x39, 0x46, 0xdd, 0x2f, 0x54, 0x69, 0x2e, 0x3e, + 0x0d, 0xed, 0xb0, 0xf3, 0xd0, 0xcc, 0x25, 0x91, 0x5a, 0x3b, 0x49, 0xab, + 0x52, 0xc8, 0x95, 0xf9, 0x04, 0x84, 0x1b, 0xfa, 0x60, 0xf5, 0x6d, 0x36, + 0xb5, 0xae, 0xaa, 0xd1, 0x6c, 0xd1, 0x1a, 0x0c, 0x4d, 0xf3, 0x0e, 0xb0, + 0x9c, 0xd3, 0x78, 0x20, 0x4a, 0x49, 0x1d, 0x77, 0x80, 0x71, 0x78, 0xb1, + 0x65, 0xdb, 0x8f, 0xd9, 0xe5, 0x4a, 0x29, 0x93, 0xd9, 0x23, 0x31, 0x8e, + 0xae, 0x54, 0x5b, 0x74, 0x02, 0xa9, 0x89, 0x93, 0x64, 0x1e, 0x5b, 0xbb, + 0xd8, 0xe2, 0x44, 0x4b, 0xbb, 0xab, 0x8e, 0x6e, 0xb0, 0x17, 0x09, 0xe0, + 0xe9, 0xc6, 0x68, 0xb1, 0xd6, 0x59, 0x53, 0xcc, 0x35, 0x75, 0x03, 0xe9, + 0x51, 0x7f, 0x11, 0x54, 0xe5, 0x6b, 0x06, 0x11, 0x8e, 0x28, 0x39, 0xd9, + 0x15, 0x80, 0x8c, 0x01, 0x1c, 0x50, 0xa9, 0x74, 0x97, 0x39, 0x61, 0xed, + 0xad, 0x15, 0xc6, 0xfa, 0x9b, 0x30, 0x55, 0x13, 0x53, 0x8e, 0x9f, 0xcc, + 0xee, 0x37, 0x4a, 0x1b, 0x0d, 0x77, 0x87, 0x80, 0x8e, 0xbf, 0xcf, 0x0f, + 0x94, 0xd9, 0xfe, 0xbc, 0x91, 0x53, 0x19, 0xad, 0x24, 0xb9, 0xd0, 0x2e, + 0xc2, 0xbf, 0xd2, 0xa4, 0x12, 0x6d, 0x95, 0xec, 0xd6, 0xf2, 0x26, 0xa4, + 0xb4, 0xd2, 0x48, 0x78, 0x36, 0xf0, 0xc0, 0xcc, 0xc8, 0xab, 0xf4, 0xfb, + 0x4e, 0x3b, 0x1b, 0x41, 0xff, 0x38, 0xb7, 0xa3, 
0x6d, 0xeb, 0xf0, 0x9a, + 0x4e, 0xf4, 0x30, 0xe1, 0xc8, 0x59, 0xb4, 0xee, 0x99, 0xe6, 0x21, 0xdf, + 0xbc, 0x39, 0x66, 0x79, 0x86, 0x1c, 0xa3, 0x3f, 0x92, 0xbb, 0xf3, 0x78, + 0xdb, 0xc2, 0x47, 0x3b, 0x32, 0x87, 0xde, 0xe6, 0xf3, 0x72, 0x82, 0x39, + 0x48, 0xfb, 0x4a, 0x03, 0x02, 0xc4, 0xea, 0x8e, 0xee, 0x3d, 0xde, 0xbc, + 0x43, 0x4b, 0xc7, 0xc2, 0x6a, 0xe4, 0x8e, 0x4f, 0xed, 0xd3, 0xeb, 0x6b, + 0x24, 0x6d, 0x72, 0x53, 0xdb, 0xfb, 0xa2, 0x31, 0x47, 0xdb, 0xff, 0x16, + 0x60, 0xc9, 0xf3, 0xf9, 0x62, 0x7f, 0x64, 0xfe, 0x31, 0xc3, 0x1a, 0xe2, + 0xbb, 0x6c, 0xf3, 0x13, 0x9d, 0xe4, 0x2f, 0x4f, 0xeb, 0x67, 0x95, 0xe4, + 0x57, 0x02, 0x05, 0xaa, 0x21, 0xaa, 0x46, 0x3d, 0x40, 0x6e, 0x52, 0xdb, + 0x6f, 0x3c, 0x4a, 0xe2, 0x53, 0x8a, 0x5f, 0xcf, 0x70, 0x64, 0x6b, 0xd7, + 0x89, 0xbc, 0x43, 0x75, 0x6c, 0xac, 0x44, 0xd9, 0xb0, 0x7d, 0x38, 0xa4, + 0xb9, 0x81, 0x93, 0x39, 0x87, 0x0d, 0x52, 0x27, 0x56, 0x1b, 0x07, 0xe7, + 0xf6, 0xe9, 0xbb, 0xde, 0xb2, 0xaf, 0x44, 0x00, 0xe2, 0x12, 0x9b, 0xea, + 0xf1, 0xdb, 0x4f, 0xdd, 0xa4, 0xe0, 0xc3, 0xcd, 0x0b, 0x29, 0x5e, 0x61, + 0xd5, 0x21, 0xa5, 0x3a, 0x71, 0xd1, 0xb8, 0x67, 0x8b, 0x7a, 0x76, 0xda, + 0xd1, 0xb9, 0xb5, 0xde, 0x39, 0x73, 0x1c, 0x82, 0x07, 0xa7, 0x8c, 0x0b, + 0xcb, 0x0e, 0xe7, 0x1b, 0x55, 0x50, 0xc7, 0x8e, 0x3c, 0xe3, 0xc2, 0x31, + 0x5e, 0x30, 0xee, 0x91, 0xcb, 0xec, 0xc3, 0x4b, 0xbb, 0xa4, 0x4e, 0x93, + 0x5e, 0x44, 0x30, 0x88, 0xd2, 0x84, 0x44, 0x0f, 0x90, 0xb9, 0xf1, 0x9a, + 0x4c, 0x8f, 0xb4, 0xd0, 0x95, 0x39, 0x0b, 0x13, 0xc8, 0xb5, 0x6b, 0x02, + 0xbb, 0x7f, 0x01, 0x67, 0xe8, 0x65, 0x50, 0x32, 0x9c, 0x04, 0x32, 0x01, + 0xe2, 0x00, 0x00, 0x68, 0xa3, 0x4f, 0x90, 0x2c, 0x7f, 0x3c, 0xc5, 0x9d, + 0xfe, 0x55, 0x40, 0x00, 0xfe, 0x87, 0x7b, 0xc1, 0x84, 0xc9, 0x5f, 0x4b, + 0xc2, 0x7d, 0x54, 0x41, 0x97, 0xa9, 0x62, 0x7f, 0xfc, 0x92, 0x55, 0x9e, + 0x0c, 0x17, 0xe4, 0x04, 0x19, 0xd1, 0xc4, 0x17, 0xee, 0x1a, 0xcd, 0x27, + 0xf0, 0x52, 0xbb, 0xb0, 0xb1, 0x3c, 0x01, 0x42, 0x5d, 0xbd, 0x28, 0xa5, + 0x1e, 0x17, 0xe0, 0x10, 0xe7, 0x75, 0x24, 0xc4, 
0xa8, 0xeb, 0xa4, 0xf9, + 0xf3, 0x97, 0xd7, 0xa7, 0x61, 0x7d, 0x28, 0x6b, 0x0f, 0xa6, 0x85, 0x9f, + 0x40, 0x14, 0x14, 0xaf, 0x27, 0x64, 0xc6, 0x18, 0x14, 0xb3, 0xea, 0x33, + 0x7f, 0xfe, 0x75, 0xea, 0xfa, 0x00, 0x5e, 0x00, 0x5f, 0x2b, 0xc8, 0x00, + 0x2c, 0xaf, 0xaf, 0x8f, 0xe7, 0x7a, 0xa4, 0xea, 0xb5, 0x9f, 0xd1, 0xb8, + 0x03, 0x7e, 0x5b, 0x69, 0x07, 0x0b, 0x7b, 0x4b, 0x2b, 0xb4, 0xe7, 0x98, + 0x85, 0xe0, 0xb3, 0xc5, 0xcb, 0x91, 0xd3, 0x71, 0x4c, 0x14, 0xaf, 0x53, + 0xdd, 0xee, 0xfe, 0x99, 0x3b, 0x87, 0x44, 0xc0, 0x23, 0xa0, 0xc9, 0x0d, + 0x3d, 0x61, 0x51, 0x71, 0x7f, 0xc8, 0xa7, 0x55, 0x12, 0x7b, 0x66, 0xb9, + 0xe3, 0x20, 0x3b, 0x8c, 0x6c, 0xf9, 0x0e, 0xdd, 0x9c, 0xe4, 0x6f, 0xa4, + 0x19, 0x0a, 0x54, 0x3e, 0xa9, 0xca, 0x1f, 0xd5, 0x08, 0x20, 0xb8, 0x87, + 0xff, 0xde, 0x7d, 0x3b, 0x93, 0xa9, 0x64, 0x65, 0x9f, 0x28, 0xd2, 0x9f, + 0x2e, 0xbd, 0x6b, 0x25, 0x3c, 0xb3, 0xe7, 0x02, 0xf2, 0xac, 0x5d, 0xbf, + 0x5d, 0x67, 0xe6, 0x38, 0xf1, 0xce, 0x71, 0x8e, 0xf1, 0xe1, 0xd0, 0x01, + 0x33, 0x91, 0x96, 0x94, 0xdc, 0xce, 0x95, 0x16, 0x66, 0x3d, 0x2b, 0xea, + 0xd7, 0x84, 0xb3, 0xc7, 0x89, 0xd4, 0x8a, 0x81, 0x06, 0xa2, 0xb4, 0xf5, + 0x05, 0xb3, 0x94, 0x4a, 0xd9, 0xc7, 0x3d, 0x9c, 0x48, 0xcd, 0x54, 0xab, + 0xbf, 0x60, 0xc9, 0x79, 0x0f, 0x34, 0x88, 0x97, 0xdd, 0xe4, 0x77, 0x41, + 0xe4, 0x94, 0x91, 0x24, 0xcc, 0xd4, 0xa5, 0xb8, 0x8b, 0xb2, 0xb3, 0xd4, + 0x1b, 0xa9, 0xe4, 0x8b, 0x26, 0xa5, 0xbd, 0x32, 0x69, 0x52, 0xe3, 0x34, + 0x5e, 0xfa, 0x79, 0x5f, 0xbe, 0xe3, 0x81, 0x2a, 0x2d, 0xc4, 0xa3, 0xb3, + 0x7a, 0x68, 0xde, 0xf5, 0x92, 0xe6, 0x3c, 0x5c, 0x1b, 0x0e, 0xbd, 0xea, + 0x1b, 0x68, 0x5a, 0x0e, 0xb2, 0xd2, 0x1c, 0x7b, 0x1d, 0x7b, 0xdc, 0x8a, + 0xc3, 0xb7, 0x7f, 0xc4, 0x47, 0xd5, 0xb3, 0xfa, 0x6b, 0xd6, 0x0e, 0x87, + 0xfb, 0x3e, 0x9b, 0x5f, 0xd1, 0xeb, 0x28, 0x0c, 0xce, 0x6c, 0x32, 0x17, + 0xc2, 0x36, 0xe6, 0x8f, 0x72, 0x4f, 0x73, 0x0d, 0x65, 0x1e, 0xe7, 0xec, + 0xac, 0x00, 0xad, 0x80, 0x8a, 0x90, 0x3e, 0x6f, 0x98, 0xa0, 0x17, 0x51, + 0x77, 0x40, 0x0e, 0xd9, 0x8a, 0x94, 0x81, 0xc6, 
0x37, 0x71, 0x0c, 0xd4, + 0x05, 0xa5, 0x51, 0x87, 0xb7, 0xca, 0xf4, 0x15, 0xa1, 0x56, 0xa1, 0x04, + 0xf8, 0x35, 0xef, 0x38, 0x20, 0x3a, 0x07, 0x27, 0x0f, 0x3a, 0xb2, 0x4c, + 0x1d, 0x12, 0x99, 0x81, 0x8d, 0xf3, 0xdd, 0xa3, 0xa3, 0xdf, 0x62, 0x49, + 0x21, 0xdc, 0xf3, 0xc0, 0xef, 0x31, 0x48, 0xe7, 0x74, 0xa6, 0x1d, 0xc6, + 0x9c, 0x6a, 0x92, 0x2e, 0x24, 0x3d, 0x8a, 0xd6, 0x90, 0x6f, 0x05, 0x92, + 0x04, 0x10, 0x6e, 0x82, 0x8d, 0xc1, 0xe8, 0x2e, 0xad, 0xed, 0x4c, 0x5d, + 0xcb, 0x20, 0x0f, 0xc3, 0x91, 0x11, 0xca, 0xc7, 0x85, 0x82, 0x1e, 0xe2, + 0x81, 0xaf, 0x30, 0x3b, 0x53, 0x1e, 0x3b, 0xed, 0x66, 0x15, 0xb6, 0x28, + 0x59, 0x26, 0x12, 0xfd, 0x9c, 0xe6, 0x73, 0xa0, 0xa6, 0x8a, 0x1f, 0x4d, + 0xc1, 0x68, 0xc3, 0x7e, 0x09, 0xf3, 0x3a, 0x29, 0xc5, 0x9f, 0x2a, 0x93, + 0x99, 0x6d, 0x73, 0x5a, 0x65, 0x62, 0xc3, 0xd1, 0xab, 0x15, 0xe9, 0xd2, + 0xd5, 0x70, 0xbe, 0x4e, 0xf5, 0xc5, 0x7c, 0x88, 0x2d, 0x70, 0x12, 0x00, + 0x1a, 0x01, 0xb8, 0x12, 0x00, 0x32, 0xa6, 0x04, 0x32, 0x03, 0x24, 0x07, + 0x00, 0x40, 0xa3, 0x4f, 0xb0, 0x2e, 0x7e, 0x79, 0x85, 0x99, 0xfe, 0x55, + 0xa0, 0x00, 0xfe, 0x6d, 0xc6, 0x23, 0x45, 0x72, 0xd9, 0x05, 0xe9, 0xbe, + 0xfa, 0xe1, 0xef, 0xf6, 0xb7, 0xbc, 0x02, 0x97, 0x0d, 0xcb, 0x31, 0x30, + 0x5c, 0xc4, 0x67, 0xcd, 0x6b, 0x10, 0xba, 0xe8, 0xaa, 0x61, 0xf1, 0xfb, + 0x91, 0x08, 0xd0, 0x5d, 0x39, 0xd0, 0xa4, 0x55, 0x12, 0xfa, 0xa0, 0x8d, + 0xc7, 0x60, 0x65, 0x52, 0x17, 0xe9, 0x87, 0xb4, 0x87, 0xf6, 0x2e, 0x76, + 0x29, 0x5c, 0x9f, 0x9f, 0xbd, 0x87, 0x3c, 0x83, 0xd7, 0x80, 0x33, 0xef, + 0xd1, 0x35, 0x60, 0x9c, 0x77, 0xa0, 0x76, 0xd1, 0xe7, 0xb2, 0x4e, 0xb9, + 0x65, 0x79, 0xc6, 0x82, 0x76, 0x05, 0xbd, 0xcf, 0x4d, 0xa2, 0x60, 0x1a, + 0x47, 0xb0, 0x1d, 0xb0, 0x58, 0xec, 0x58, 0x31, 0xd3, 0xa2, 0xaa, 0x69, + 0xda, 0x54, 0x6d, 0x30, 0x68, 0x3e, 0xbe, 0x8c, 0x11, 0x7e, 0xe5, 0x25, + 0x48, 0xf3, 0xd9, 0x04, 0x6c, 0x70, 0x85, 0xce, 0xda, 0xb8, 0x54, 0x5b, + 0xef, 0x9a, 0xf5, 0x62, 0x5a, 0x90, 0x2c, 0x1a, 0xf5, 0x51, 0xdd, 0x23, + 0xc7, 0x46, 0x95, 0x71, 0xd7, 0xc7, 0x42, 0x62, 
0x5c, 0x08, 0xe5, 0xc9, + 0xb2, 0x6f, 0x44, 0x18, 0x3d, 0x0d, 0xd3, 0x6b, 0x5e, 0x16, 0x89, 0x0a, + 0x45, 0x04, 0xbb, 0x3c, 0x77, 0xf2, 0x8d, 0x9d, 0xb1, 0x9e, 0x13, 0xb3, + 0x78, 0x76, 0x4c, 0x52, 0x5e, 0x3c, 0xa0, 0x2e, 0x39, 0x23, 0xab, 0xa8, + 0xd1, 0xe6, 0xb7, 0xb6, 0xde, 0xd6, 0x38, 0x5e, 0xc4, 0x3f, 0x1d, 0x48, + 0xd5, 0x60, 0x79, 0xf4, 0xd6, 0x5a, 0xae, 0xc9, 0xa9, 0x42, 0x3d, 0x14, + 0x06, 0x41, 0xb0, 0x53, 0x3e, 0xeb, 0xf6, 0x16, 0x12, 0x4a, 0x9b, 0x1d, + 0xc9, 0xef, 0x4f, 0x27, 0xa3, 0x5c, 0xe6, 0x42, 0x72, 0xa1, 0x42, 0xaa, + 0x41, 0x23, 0xad, 0x6e, 0xa4, 0xd4, 0xa8, 0x49, 0xad, 0xc1, 0x89, 0xb5, + 0xff, 0x1e, 0x13, 0xa3, 0xad, 0x42, 0x82, 0x75, 0xb6, 0x42, 0x4b, 0x7f, + 0x2e, 0x88, 0x90, 0xc7, 0xc4, 0x95, 0x8a, 0x25, 0x44, 0x08, 0xbb, 0x90, + 0x97, 0x40, 0x6c, 0xf4, 0x57, 0xc4, 0x15, 0xc6, 0xd2, 0x97, 0x17, 0xea, + 0xfb, 0xd7, 0x07, 0xeb, 0xf0, 0x23, 0x74, 0x09, 0x92, 0x6a, 0xf5, 0x4a, + 0x08, 0x8e, 0x78, 0x1b, 0xa2, 0x98, 0x97, 0x3e, 0xd5, 0x40, 0x0d, 0x74, + 0x4d, 0x41, 0xe0, 0xe0, 0x78, 0x5b, 0x58, 0x5a, 0x5b, 0x93, 0x0e, 0x51, + 0x3a, 0x39, 0x40, 0xa5, 0x77, 0xe6, 0x17, 0xa6, 0xc2, 0x84, 0x03, 0xa5, + 0x3f, 0x3e, 0xca, 0x02, 0x72, 0xb1, 0x53, 0xff, 0xe8, 0x8d, 0xce, 0x1b, + 0x6d, 0x25, 0x2e, 0x21, 0xa9, 0x30, 0x4c, 0x8d, 0x33, 0x16, 0x69, 0x7b, + 0x4e, 0x4c, 0x7d, 0xb3, 0x9d, 0x8a, 0xa2, 0x07, 0xca, 0x4e, 0x9c, 0x3c, + 0x54, 0x08, 0x09, 0x85, 0xbc, 0x10, 0x27, 0x39, 0x9e, 0x30, 0x78, 0x99, + 0x8b, 0x0f, 0x13, 0xa8, 0xd9, 0x4b, 0xe1, 0x7c, 0xf0, 0xf1, 0x33, 0xb5, + 0x08, 0xf9, 0x36, 0x4f, 0x14, 0x1e, 0xc6, 0x5f, 0x86, 0xab, 0xd9, 0x1e, + 0xfa, 0x26, 0xea, 0xdd, 0x8d, 0xdc, 0x98, 0x5b, 0x9c, 0x36, 0x0c, 0x5c, + 0xfa, 0x96, 0x49, 0xea, 0x7f, 0xfb, 0x5c, 0x87, 0x34, 0x0d, 0xc4, 0xc1, + 0x82, 0x2d, 0x1a, 0xf4, 0xfb, 0x06, 0x28, 0x43, 0x31, 0x75, 0xda, 0x3d, + 0xf2, 0xef, 0xb4, 0x33, 0xcb, 0xcf, 0x6c, 0x21, 0xf9, 0x73, 0x6c, 0x5a, + 0xd5, 0xac, 0x5d, 0xe1, 0x8f, 0xdb, 0xad, 0xf7, 0x0e, 0x08, 0x36, 0x83, + 0xfc, 0xe1, 0x1c, 0x3e, 0xf4, 0xe6, 0x0c, 0x6f, 
0xd7, 0x98, 0x84, 0x3a, + 0x46, 0xa1, 0xa6, 0xaa, 0xab, 0xa3, 0xd2, 0xe6, 0x9c, 0x36, 0x90, 0x5c, + 0x84, 0x45, 0xf7, 0x27, 0x68, 0x42, 0x14, 0xaa, 0x76, 0xfe, 0x39, 0x56, + 0x1b, 0xa7, 0x40, 0x25, 0x46, 0x66, 0x65, 0xd3, 0x6e, 0x42, 0xd8, 0x66, + 0x3c, 0x26, 0xba, 0x2d, 0xe6, 0x59, 0x7f, 0x49, 0x86, 0xf6, 0x19, 0x75, + 0xfc, 0x28, 0xc2, 0x75, 0x5b, 0x80, 0x12, 0x00, 0x1a, 0x01, 0xa8, 0x12, + 0x00, 0x32, 0xfc, 0x06, 0x29, 0x03, 0x54, 0x02, 0xa0, 0x46, 0x4e, 0x9e, + 0x40, 0x47, 0xa4, 0x48, 0x8b, 0x27, 0xfe, 0x0d, 0x00, 0x3d, 0x5d, 0xaa, + 0x74, 0xa1, 0xa9, 0x46, 0x95, 0x54, 0xc4, 0x88, 0x99, 0xb7, 0x9a, 0x46, + 0x10, 0x50, 0x44, 0x35, 0xf4, 0xc4, 0x19, 0x14, 0x90, 0xd7, 0x46, 0x85, + 0xce, 0x13, 0x23, 0x4b, 0xad, 0x18, 0x20, 0xac, 0x71, 0x11, 0xf0, 0x9e, + 0x3b, 0x8e, 0xaf, 0x68, 0x12, 0xa5, 0x26, 0xec, 0x84, 0xe5, 0x93, 0x00, + 0x85, 0x91, 0x72, 0x97, 0xd2, 0xcd, 0x4b, 0xf6, 0x12, 0x14, 0x38, 0xd6, + 0xa2, 0x70, 0x71, 0x96, 0xea, 0xde, 0xf9, 0xc6, 0xd3, 0x62, 0x29, 0xb9, + 0xdd, 0x9c, 0xcb, 0x02, 0xff, 0x42, 0x48, 0xac, 0xe9, 0x4f, 0x99, 0x28, + 0x60, 0xc4, 0x27, 0x2c, 0xfc, 0xd1, 0xfc, 0xfb, 0x5f, 0xe0, 0x72, 0xe3, + 0xdb, 0x04, 0xc3, 0x1d, 0x09, 0x72, 0x9d, 0x3f, 0xf9, 0xb3, 0x3e, 0xad, + 0xd5, 0x6e, 0x6c, 0x89, 0xd1, 0x58, 0x7f, 0xcb, 0xe1, 0x2d, 0xa2, 0x95, + 0xc0, 0xaf, 0x00, 0x5e, 0x44, 0x43, 0xa8, 0x54, 0x41, 0xd9, 0xfb, 0x0b, + 0x65, 0x1d, 0x70, 0x8a, 0xa7, 0x00, 0xae, 0xb1, 0x6b, 0x51, 0xed, 0xd5, + 0x26, 0xd8, 0x1d, 0xad, 0xf8, 0x78, 0x8d, 0xc8, 0x2e, 0x33, 0xa0, 0x49, + 0x08, 0xb0, 0xb4, 0xaa, 0xec, 0x95, 0xe9, 0x4d, 0xec, 0x7a, 0x36, 0xa5, + 0xf8, 0xaf, 0x43, 0xc7, 0x52, 0x7e, 0x7e, 0xe0, 0xab, 0x28, 0x4a, 0x90, + 0x7e, 0x78, 0x38, 0x96, 0xea, 0x43, 0x60, 0x2a, 0xf4, 0x5c, 0xa6, 0x5f, + 0x3c, 0xeb, 0xe6, 0x47, 0x4f, 0x94, 0xf8, 0xa1, 0x4c, 0x36, 0x5e, 0xe1, + 0xc2, 0x40, 0xea, 0xfe, 0xf9, 0xf1, 0x84, 0x60, 0x93, 0xb0, 0xdd, 0xb4, + 0x00, 0x78, 0xc5, 0xec, 0xa8, 0x5a, 0x77, 0x93, 0xf4, 0xff, 0x51, 0x9f, + 0xd8, 0xba, 0xd3, 0x48, 0x1d, 0x7e, 0xb9, 0x51, 
0x83, 0x3c, 0x52, 0xda, + 0x61, 0x89, 0x51, 0x6b, 0xf8, 0x08, 0xb0, 0xaa, 0x3d, 0x03, 0xc5, 0x5e, + 0xd4, 0xd9, 0x7d, 0xd2, 0xca, 0x39, 0x9e, 0xc3, 0x87, 0xab, 0x88, 0xca, + 0x2a, 0x8c, 0xf5, 0xe0, 0x3b, 0x5c, 0x1f, 0x8b, 0x06, 0x80, 0xf4, 0xd7, + 0xcc, 0x26, 0x77, 0xb2, 0x9a, 0xc3, 0x6e, 0xd0, 0x17, 0x4c, 0x50, 0x05, + 0xcf, 0x84, 0x51, 0xb9, 0xf8, 0x33, 0x28, 0x02, 0x1f, 0x56, 0x12, 0x16, + 0x2a, 0x70, 0x97, 0xc6, 0xe9, 0xc8, 0xe7, 0x4f, 0x00, 0x0e, 0x9f, 0x54, + 0x40, 0xed, 0xb1, 0x34, 0x26, 0x52, 0x90, 0xe8, 0x98, 0xfd, 0xeb, 0x06, + 0xd3, 0xf0, 0x22, 0x4f, 0x82, 0xb2, 0xfa, 0xc2, 0x32, 0x09, 0x39, 0x46, + 0xc6, 0xfa, 0xd5, 0xaf, 0x12, 0x45, 0x85, 0x1a, 0x96, 0x2d, 0x3c, 0x61, + 0x90, 0x33, 0x3d, 0xf3, 0xd2, 0x7c, 0x11, 0x1a, 0x04, 0x10, 0x8c, 0xb3, + 0x55, 0x0d, 0xf7, 0x6a, 0x72, 0xec, 0xa9, 0xa4, 0xe8, 0x61, 0x0c, 0x42, + 0x80, 0x19, 0xed, 0xf3, 0x95, 0x06, 0x02, 0x86, 0x18, 0x3c, 0x3d, 0x67, + 0x29, 0x71, 0xd5, 0x16, 0xb9, 0x2b, 0x99, 0xae, 0x68, 0x2b, 0x19, 0x68, + 0xcc, 0x06, 0xdd, 0xcf, 0x96, 0x72, 0x08, 0x3a, 0x75, 0x52, 0x7f, 0xf9, + 0x49, 0x84, 0x55, 0x23, 0xc2, 0x96, 0x99, 0xd8, 0x73, 0x6a, 0x00, 0xc6, + 0x07, 0x33, 0x43, 0xa9, 0x35, 0x83, 0x6e, 0x1b, 0x94, 0x04, 0x89, 0xb8, + 0x1b, 0x2a, 0x72, 0xa6, 0xcb, 0x53, 0xc5, 0xe6, 0x4c, 0x40, 0xd2, 0x83, + 0x77, 0x6c, 0x18, 0x66, 0x68, 0x1b, 0x33, 0x9a, 0x55, 0xf3, 0x0b, 0xc3, + 0x34, 0xf9, 0xaa, 0x43, 0x6d, 0x53, 0x42, 0x2c, 0x55, 0x1b, 0xc3, 0x79, + 0xf7, 0x3c, 0x30, 0xc0, 0xd6, 0xdd, 0x43, 0x80, 0x1f, 0x3f, 0x5a, 0x81, + 0x9d, 0xa8, 0xaf, 0xb2, 0x15, 0x82, 0x28, 0x4c, 0x8a, 0x45, 0x97, 0xbc, + 0x88, 0x9a, 0x08, 0x52, 0x49, 0xac, 0x1a, 0x90, 0x5e, 0xd9, 0xa8, 0x43, + 0xf8, 0xb2, 0x96, 0x09, 0x16, 0x90, 0xc1, 0x4a, 0x5a, 0x1a, 0xd3, 0x59, + 0x4b, 0xc7, 0x90, 0xa7, 0x35, 0xfc, 0x19, 0xa0, 0x26, 0x5f, 0xff, 0x99, + 0x3e, 0xc8, 0xd6, 0xf1, 0x02, 0x3e, 0xc0, 0x81, 0x65, 0x55, 0x1b, 0x73, + 0x3c, 0xe6, 0x6e, 0x9b, 0xed, 0x5b, 0xb3, 0x9f, 0x47, 0x34, 0xf7, 0xaa, + 0x08, 0x44, 0xf2, 0x8c, 0x02, 0x6a, 0x9e, 0x33, 
0x25, 0x45, 0x6a, 0x8e, + 0x29, 0xe6, 0x76, 0x7d, 0x1a, 0x29, 0xd2, 0xf8, 0xe6, 0xb2, 0xed, 0x86, + 0x1a, 0x11, 0x79, 0xa8, 0xf6, 0xf0, 0x3d, 0x8e, 0xda, 0x07, 0x31, 0x7f, + 0xbb, 0x42, 0x6c, 0x90, 0xdc, 0xdb, 0x8a, 0x0b, 0xd8, 0x93, 0x72, 0x8e, + 0x17, 0xf7, 0xb7, 0x7f, 0x39, 0xda, 0xdc, 0x70, 0x50, 0x26, 0x87, 0x11, + 0x35, 0xc4, 0x13, 0xe0, 0x03, 0x31, 0x72, 0x8f, 0x92, 0x7d, 0x72, 0x32, + 0x1c, 0x3c, 0xef, 0x89, 0x2c, 0x22, 0x91, 0xb3, 0x4d, 0x03, 0x6c, 0xf6, + 0x45, 0x31, 0x15, 0x08, 0x8e, 0x7e, 0x92, 0x52, 0x03, 0xc1, 0x75, 0x95, + 0x3a, 0xd6, 0x6d, 0x89, 0x42, 0x7b, 0xca, 0xf2, 0x73, 0x69, 0xdb, 0x23, + 0x5b, 0x91, 0x2a, 0x03, 0x09, 0xee, 0x80, 0x03, 0xe7, 0x7a, 0xf6, 0x43, + 0xcc, 0xa1, 0xea, 0x81, 0xc5, 0x36, 0x47, 0x44, 0x31, 0xe2, 0x38, 0xd8, + 0xd7, 0x61, 0x8f, 0x92, 0x0f, 0xe2, 0x27, 0x36, 0x68, 0x83, 0xc4, 0xbf, + 0xd1, 0x94, 0x1c, 0xe5, 0xe4, 0x9e, 0x59, 0xc6, 0x55, 0x03, 0xeb, 0xa4, + 0x25, 0xbf, 0x88, 0xe0, 0x29, 0x05, 0xbc, 0xa4, 0xf6, 0x3b, 0x06, 0x30, + 0xfc, 0xbc, 0xb5, 0xeb, 0x17, 0x00, 0x8f, 0x49, 0x30, 0xa1, 0x17, 0xda, + 0x8a, 0x8f, 0x61, 0x62, 0xde, 0x0e, 0x47, 0xee, 0x5a, 0xaa, 0x85, 0x93, + 0x72, 0xe5, 0xf5, 0x88, 0xfa, 0xf4, 0x08, 0x87, 0x30, 0x32, 0xa5, 0x89, + 0x40, 0xad, 0x84, 0x02, 0x0c, 0xcb, 0xd6, 0x6b, 0x73, 0x8a, 0x73, 0x85, + 0x1d, 0x84, 0x19, 0x4f, 0x74, 0x7b, 0x42, 0x5f, 0x35, 0xf8, 0xe2, 0x4a, + 0x24, 0xdb, 0x3e, 0x0f, 0xe1, 0xce, 0xc1, 0x6a, 0x95, 0x86, 0x5f, 0xa1, + 0x38, 0x59, 0x84, 0xc1, 0xef, 0xbe, 0xcf, 0xd7, 0x49, 0xaf, 0xcf, 0x62, + 0x38, 0xd9, 0xb5, 0xab, 0xe4, 0x7f, 0x17, 0x46, 0xaf, 0x30, 0xb6, 0x52, + 0x75, 0x73, 0x0c, 0x6d, 0x1c, 0x46, 0x93, 0x9f, 0xfc, 0x56, 0x80, 0x6b, + 0x00, 0x81, 0x65, 0x21, 0xb0, 0xd9, 0xc1, 0x42, 0xd2, 0xc1, 0xd9, 0x26, + 0x8b, 0xbf, 0xe7, 0xbc, 0x14, 0x07, 0x2b, 0xb1, 0xdf, 0x27, 0xe6, 0xaa, + 0x72, 0x79, 0xe7, 0x3e, 0xcc, 0xf3, 0xec, 0x3f, 0xce, 0xc5, 0x1d, 0xcb, + 0xc6, 0xa2, 0x4a, 0x24, 0x17, 0x95, 0x16, 0xf5, 0x32, 0x50, 0x32, 0x05, + 0x30, 0x05, 0x60, 0xcc, 0xa3, 0x4f, 0x90, 0x27, 
0x99, 0xa4, 0x05, 0x8f, + 0xff, 0x95, 0x40, 0x00, 0x83, 0x22, 0x34, 0x01, 0x56, 0x70, 0xf3, 0x5c, + 0x52, 0xbe, 0x53, 0x3b, 0x9a, 0x61, 0xeb, 0xf3, 0x28, 0x7b, 0xea, 0xaa, + 0x62, 0x22, 0x17, 0x95, 0x6d, 0xcc, 0x8a, 0xc9, 0x3a, 0x3b, 0xd5, 0xb4, + 0x58, 0x35, 0xcb, 0x51, 0x24, 0x24, 0xbe, 0x93, 0x4e, 0x14, 0x4c, 0x10, + 0x33, 0x03, 0x59, 0x1d, 0x59, 0xec, 0x1a, 0xcf, 0xe8, 0xf6, 0x4c, 0x35, + 0xeb, 0x98, 0x57, 0xf1, 0x44, 0x16, 0x12, 0x00, 0x1a, 0x01, 0xe8, 0x12, + 0x00, 0x32, 0x57, 0x32, 0x07, 0x22, 0x0d, 0xd8, 0xa8, 0xa3, 0x4f, 0x90, + 0x10, 0xc1, 0xa3, 0x05, 0x89, 0xfe, 0x5d, 0x40, 0x00, 0xfe, 0x4e, 0x3b, + 0x7d, 0x8a, 0xb9, 0x8a, 0x54, 0xce, 0xfa, 0x71, 0xbe, 0x85, 0x22, 0x22, + 0x23, 0xc1, 0xf1, 0x05, 0xf9, 0x98, 0x70, 0xda, 0xad, 0x77, 0xb1, 0x0d, + 0x0d, 0x3b, 0xa7, 0x5c, 0xe5, 0x0a, 0x41, 0x25, 0x07, 0x4e, 0x87, 0x79, + 0xab, 0x08, 0x4e, 0x04, 0xdf, 0xeb, 0x06, 0x4e, 0x5a, 0x60, 0x75, 0x27, + 0x81, 0x62, 0x9e, 0x2b, 0x7e, 0x72, 0x3b, 0x26, 0xd3, 0x11, 0x72, 0x9b, + 0x26, 0x12, 0x48, 0xc0, 0xb6, 0xf0, 0x12, 0x00, 0x32, 0x15, 0x32, 0x08, + 0x00, 0x09, 0xa8, 0x5c, 0x83, 0x4f, 0xf0, 0x10, 0x91, 0xa3, 0x05, 0x80, + 0x00, 0x01, 0x00, 0xca, 0x9c, 0x87, 0x80 +}; + +static guint32 stream_no_annexb_av1_len = 10519; + +static gsize stream_av1_frame_size[] = { + 5454, 1331, 29, 24, 1473, 543, 5, 555, 5, 897, 82, 5, 91, 25 +}; + +static gsize stream_av1_obu_size[] = { + 2, 13, 5439, 2, 13, 1316, 2, 27, 24, 1473, 543, 2, + 3, 2, 553, 2, 3, 2, 895, 82, 2, 3, 2, 89, 2, 23 +};
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/avtpcrfbase.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/avtpcrfbase.c
Changed
@@ -29,6 +29,10 @@ static struct avtp_crf_pdu * generate_crf_pdu (int data_len, guint64 first_tstamp) { + const guint64 base_freq = 48000; + const guint64 interval = 160; + const gdouble interval_time = 1.0e9 / base_freq * interval; + struct avtp_crf_pdu *crf_pdu = g_malloc0 (sizeof (struct avtp_crf_pdu) + data_len); @@ -36,12 +40,14 @@ avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_SV, 1); avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_STREAM_ID, 0xABCD1234ABCD1234); avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_TYPE, AVTP_CRF_TYPE_AUDIO_SAMPLE); - avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_BASE_FREQ, 48000); + avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_BASE_FREQ, base_freq); avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_PULL, 1); avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_CRF_DATA_LEN, data_len); - avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_TIMESTAMP_INTERVAL, 160); - for (int i = 0; i < data_len / 8; i++) - crf_pdu->crf_data[i] = htobe64 (first_tstamp + i * 3333333); + avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_TIMESTAMP_INTERVAL, interval); + for (int i = 0; i < data_len / 8; i++) { + const guint64 offset = i * interval_time; + crf_pdu->crf_data[i] = htobe64 (first_tstamp + offset); + } return crf_pdu; } @@ -481,8 +487,7 @@ GST_END_TEST; static void -setup_thread_defaults (GstAvtpCrfBase * avtpcrfbase, - GstClockTime * past_periods) +setup_thread_defaults (GstAvtpCrfBase * avtpcrfbase, gdouble * past_periods) { GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -501,7 +506,7 @@ { int data_len = 64; struct avtp_crf_pdu *crf_pdu = generate_crf_pdu (data_len, 1000); - GstClockTime past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; + gdouble past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -513,8 +518,8 @@ thread_data->periods_stored = 2; calculate_average_period (avtpcrfbase, crf_pdu); - 
fail_unless_equals_uint64 (thread_data->average_period, 20777); - fail_unless_equals_uint64 (thread_data->past_periods[2], 20833); + fail_unless_equals_float (thread_data->average_period, 20777.7775); + fail_unless_equals_float (thread_data->past_periods[2], 20833.3325); fail_unless_equals_uint64 (thread_data->current_ts, 1000); gst_object_unref (avtpcrfbase); @@ -524,6 +529,54 @@ GST_END_TEST; /* + * Test for rounding error + */ +GST_START_TEST (test_calculate_average_period_rounding_error) +{ + /* the presentation time in ns */ + const GstClockTimeDiff ptime = 50000000; + /* the time in ns of one sync event e.g. one audio sample @48kHz */ + const gdouble event_interval = 1.0e9 / 48000; + /* the presentation time measured in sync events (e.g. sample rate) + * for class B traffic with a presentation time of 50ms. + */ + const GstClockTime ptime_in_events = ptime / event_interval; + + /* With 4 timestamps generate_crf_pdu() multiples the interval time + * with 3. This results into an integer time stamp in nsi without decimal + * digits. Therefore the rounding issue when generating the timestamps for + * the CRF PDU is avoided here. + */ + int data_len = 32; + struct avtp_crf_pdu *crf_pdu = generate_crf_pdu (data_len, 1000); + gdouble past_periods[10] = { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 }; + GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); + GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; + + setup_thread_defaults (avtpcrfbase, past_periods); + + thread_data->timestamp_interval = 160; + thread_data->num_pkt_tstamps = data_len / sizeof (uint64_t); + thread_data->past_periods_iter = 0; + thread_data->periods_stored = 0; + + calculate_average_period (avtpcrfbase, crf_pdu); + + /* When internally using integer for average_period calculation the following + * multiplication will result to (20833 * 2400=) 49999200ns. This value + * differs by 800ns from the original presentation time of 50ms. 
When using + * double this rounding error is avoided. + */ + fail_unless_equals_float ((thread_data->average_period * ptime_in_events), + ptime); + + gst_object_unref (avtpcrfbase); + g_free (crf_pdu); +} + +GST_END_TEST; + +/* * Test for an overflow in the 64-bit CRF timestamp in the CRF AVTPDU when * there are multiple CRF timestamps in a packet. */ @@ -532,7 +585,7 @@ int data_len = 64; struct avtp_crf_pdu *crf_pdu = generate_crf_pdu (data_len, 18446744073709501615ULL); - GstClockTime past_periods[10] = + gdouble past_periods[10] = { 21000, 20500, 21220, 21345, 20990, 21996, 20220, 20915, 21324, 23123 }; GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -545,8 +598,8 @@ thread_data->periods_stored = 10; calculate_average_period (avtpcrfbase, crf_pdu); - fail_unless_equals_uint64 (thread_data->average_period, 21147); - fail_unless_equals_uint64 (thread_data->past_periods[5], 20833); + fail_unless_equals_float (thread_data->average_period, 21147.03325); + fail_unless_equals_float (thread_data->past_periods[5], 20833.3325); fail_unless_equals_uint64 (thread_data->current_ts, 18446744073709501615ULL); g_free (crf_pdu); @@ -563,7 +616,7 @@ { int data_len = 8; struct avtp_crf_pdu *crf_pdu = generate_crf_pdu (data_len, 21833); - GstClockTime past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; + gdouble past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -579,8 +632,8 @@ avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_SEQ_NUM, 10); calculate_average_period (avtpcrfbase, crf_pdu); - fail_unless_equals_uint64 (thread_data->average_period, 20777); - fail_unless_equals_uint64 (thread_data->past_periods[2], 20833); + fail_unless_equals_float (thread_data->average_period, 20777.6666666); + fail_unless_equals_float 
(thread_data->past_periods[2], 20833); fail_unless_equals_uint64 (thread_data->last_seqnum, 10); fail_unless_equals_uint64 (thread_data->last_received_tstamp, 21833); fail_unless_equals_uint64 (thread_data->current_ts, 21833); @@ -600,7 +653,7 @@ int data_len = 8; struct avtp_crf_pdu *crf_pdu1 = generate_crf_pdu (data_len, 1000); struct avtp_crf_pdu *crf_pdu2 = generate_crf_pdu (data_len, 21833); - GstClockTime past_periods[10] = { 0 }; + gdouble past_periods[10] = { 0 }; GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -613,15 +666,15 @@ avtp_crf_pdu_set (crf_pdu2, AVTP_CRF_FIELD_SEQ_NUM, 11); calculate_average_period (avtpcrfbase, crf_pdu1); - fail_unless_equals_uint64 (thread_data->past_periods[0], 0); + fail_unless_equals_float (thread_data->past_periods[0], 0); fail_unless_equals_uint64 (thread_data->last_seqnum, 10); - fail_unless_equals_uint64 (thread_data->average_period, 20854); + fail_unless_equals_float (thread_data->average_period, 20854); fail_unless_equals_uint64 (thread_data->current_ts, 1000); calculate_average_period (avtpcrfbase, crf_pdu2); - fail_unless_equals_uint64 (thread_data->past_periods[0], 20833); + fail_unless_equals_float (thread_data->past_periods[0], 20833); fail_unless_equals_uint64 (thread_data->last_seqnum, 11); - fail_unless_equals_uint64 (thread_data->average_period, 20833); + fail_unless_equals_float (thread_data->average_period, 20833); fail_unless_equals_uint64 (thread_data->current_ts, 21833); g_free (crf_pdu1); @@ -632,6 +685,51 @@ GST_END_TEST; /* + * Test to ensure average_period is calculated correctly + * when receiving multiple CRF AVTPDUs with single CRF timestamp + * with timestamp_interval > 1 + */ +GST_START_TEST (test_calculate_average_period_single_crf_tstamp_interval) +{ + int data_len = 8; + struct avtp_crf_pdu *crf_pdu1 = generate_crf_pdu (data_len, 1000); + /* Used timestamp + * = sample_time * timestamp_interval + 
first_tstamp + * = 1/48kHz * 160 + 1000 + */ + struct avtp_crf_pdu *crf_pdu2 = generate_crf_pdu (data_len, 3334280); + gdouble past_periods[10] = { 0 }; + GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); + GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; + + setup_thread_defaults (avtpcrfbase, past_periods); + + thread_data->timestamp_interval = 160; + thread_data->num_pkt_tstamps = 1; + + avtp_crf_pdu_set (crf_pdu1, AVTP_CRF_FIELD_SEQ_NUM, 10); + avtp_crf_pdu_set (crf_pdu2, AVTP_CRF_FIELD_SEQ_NUM, 11); + + calculate_average_period (avtpcrfbase, crf_pdu1); + fail_unless_equals_float (thread_data->past_periods[0], 0); + fail_unless_equals_uint64 (thread_data->last_seqnum, 10); + fail_unless_equals_float (thread_data->average_period, 20854); + fail_unless_equals_uint64 (thread_data->current_ts, 1000); + + calculate_average_period (avtpcrfbase, crf_pdu2); + fail_unless_equals_float (thread_data->past_periods[0], 20833); + fail_unless_equals_uint64 (thread_data->last_seqnum, 11); + fail_unless_equals_float (thread_data->average_period, 20833); + fail_unless_equals_uint64 (thread_data->current_ts, 3334280); + + g_free (crf_pdu1); + g_free (crf_pdu2); + gst_object_unref (avtpcrfbase); +} + +GST_END_TEST; + +/* * Test for an overflow in the 64-bit CRF timestamp in the CRF AVTPDU when * there is a single CRF timestamp in a packet. 
*/ @@ -639,7 +737,7 @@ { int data_len = 8; struct avtp_crf_pdu *crf_pdu = generate_crf_pdu (data_len, 20833); - GstClockTime past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; + gdouble past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -655,8 +753,8 @@ avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_SEQ_NUM, 10); calculate_average_period (avtpcrfbase, crf_pdu); - fail_unless_equals_uint64 (thread_data->average_period, 20778); - fail_unless_equals_uint64 (thread_data->past_periods[2], 20834); + fail_unless_equals_float (thread_data->average_period, 20778); + fail_unless_equals_float (thread_data->past_periods[2], 20834); fail_unless_equals_uint64 (thread_data->last_seqnum, 10); fail_unless_equals_uint64 (thread_data->last_received_tstamp, 20833); fail_unless_equals_uint64 (thread_data->current_ts, 20833); @@ -676,7 +774,7 @@ { int data_len = 8; struct avtp_crf_pdu *crf_pdu = generate_crf_pdu (data_len, 21833); - GstClockTime past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; + gdouble past_periods[10] = { 21000, 20500, 0, 0, 0, 0, 0, 0, 0, 0 }; GstAvtpCrfBase *avtpcrfbase = g_object_new (GST_TYPE_AVTP_CRF_BASE, NULL); GstAvtpCrfThreadData *thread_data = &avtpcrfbase->thread_data; @@ -692,8 +790,8 @@ avtp_crf_pdu_set (crf_pdu, AVTP_CRF_FIELD_SEQ_NUM, 12); calculate_average_period (avtpcrfbase, crf_pdu); - fail_unless_equals_uint64 (thread_data->average_period, 20750); - fail_unless_equals_uint64 (thread_data->past_periods[2], 0); + fail_unless_equals_float (thread_data->average_period, 20750); + fail_unless_equals_float (thread_data->past_periods[2], 0); fail_unless_equals_uint64 (thread_data->last_seqnum, 12); fail_unless_equals_uint64 (thread_data->last_received_tstamp, 21833); fail_unless_equals_uint64 (thread_data->current_ts, 21833); @@ -732,12 +830,15 @@ tcase_add_test (tc_chain, 
test_validate_crf_pdu_tstamps_not_monotonic); tcase_add_test (tc_chain, test_gst_base_freq_multiplier); tcase_add_test (tc_chain, test_calculate_average_period_multiple_crf_tstamps); + tcase_add_test (tc_chain, test_calculate_average_period_rounding_error); tcase_add_test (tc_chain, test_calculate_average_period_multiple_crf_tstamps_64_bit_overflow); tcase_add_test (tc_chain, test_calculate_average_period_single_crf_tstamp); tcase_add_test (tc_chain, test_calculate_average_period_single_crf_tstamp_init); tcase_add_test (tc_chain, + test_calculate_average_period_single_crf_tstamp_interval); + tcase_add_test (tc_chain, test_calculate_average_period_single_crf_tstamp_64_bit_overflow); tcase_add_test (tc_chain, test_calculate_average_period_single_crf_tstamp_seq_num_skip);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/avtpcrfcheck.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/avtpcrfcheck.c
Changed
@@ -106,11 +106,13 @@ struct avtp_stream_pdu *pdu; GstMapInfo info; guint32 type; + gint r; gst_buffer_map (buf, &info, GST_MAP_WRITE); pdu = (struct avtp_stream_pdu *) info.data; - avtp_pdu_get ((struct avtp_common_pdu *) pdu, AVTP_FIELD_SUBTYPE, &type); + r = avtp_pdu_get ((struct avtp_common_pdu *) pdu, AVTP_FIELD_SUBTYPE, &type); + g_assert (r == 0); if (type == AVTP_SUBTYPE_AAF) avtp_aaf_pdu_set (pdu, AVTP_AAF_FIELD_TIMESTAMP, avtp_tstamp); else if (type == AVTP_SUBTYPE_CVF) {
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/avtpcrfsync.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/avtpcrfsync.c
Changed
@@ -112,6 +112,7 @@ struct avtp_stream_pdu *pdu; GstMapInfo info; guint32 type; + gint r; gst_buffer_map (buf, &info, GST_MAP_WRITE); pdu = (struct avtp_stream_pdu *) info.data; @@ -119,7 +120,8 @@ GST_BUFFER_PTS (buf) = orig->buf_pts; GST_BUFFER_DTS (buf) = orig->buf_dts; - avtp_pdu_get ((struct avtp_common_pdu *) pdu, AVTP_FIELD_SUBTYPE, &type); + r = avtp_pdu_get ((struct avtp_common_pdu *) pdu, AVTP_FIELD_SUBTYPE, &type); + g_assert (r == 0); if (type == AVTP_SUBTYPE_AAF) avtp_aaf_pdu_set (pdu, AVTP_AAF_FIELD_TIMESTAMP, orig->avtp_ts); else if (type == AVTP_SUBTYPE_CVF) {
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/avtpcvfdepay.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/avtpcvfdepay.c
Changed
@@ -64,8 +64,7 @@ guint8 nal_size[4]; gst_buffer_extract (buffer, 0, nal_size, 4); - return (nal_size[0] << 24) | (nal_size[1] << 16) | (nal_size[2] << 8) | - nal_size[3]; + return GST_READ_UINT32_BE (nal_size); } static gsize @@ -88,7 +87,7 @@ return NULL; gst_buffer_extract (buffer, *offset, buf, 4); - nal_size = (buf[0] << 24) | (buf[1] << 16) | (buf[2] << 8) | buf[3]; + nal_size = GST_READ_UINT32_BE (buf); ret = gst_buffer_copy_region (buffer, GST_BUFFER_COPY_MEMORY, *offset, @@ -1472,14 +1471,12 @@ { Suite *s = suite_create ("avtpcvfdepay"); TCase *tc_chain = tcase_create ("general"); + TCase *tc_slow = tcase_create ("slow"); suite_add_tcase (s, tc_chain); - /* 'fragmented_big' may take some time to run, so give it a bit more time */ - tcase_set_timeout (tc_chain, 10); tcase_add_test (tc_chain, test_depayloader_single); tcase_add_test (tc_chain, test_depayloader_multiple_single); tcase_add_test (tc_chain, test_depayloader_fragmented); - tcase_add_test (tc_chain, test_depayloader_fragmented_big); tcase_add_test (tc_chain, test_depayloader_single_and_fragmented); tcase_add_test (tc_chain, test_depayloader_property); tcase_add_test (tc_chain, test_depayloader_lost_packet); @@ -1494,6 +1491,11 @@ tcase_add_test (tc_chain, test_depayloader_multiple_lost_eos); tcase_add_test (tc_chain, test_depayloader_fragment_and_single); + suite_add_tcase (s, tc_slow); + /* 'fragmented_big' may take some time to run, so give it a bit more time */ + tcase_set_timeout (tc_slow, 20); + tcase_add_test (tc_slow, test_depayloader_fragmented_big); + return s; }
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/camerabin.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/camerabin.c
Changed
@@ -335,6 +335,7 @@ gst_test_video_src_alloc (GstPushSrc * src, GstBuffer ** buf) { GstTestVideoSrc *self = GST_TEST_VIDEO_SRC (src); + GstVideoInfo vinfo; guint8 *data; gsize data_size; @@ -344,7 +345,10 @@ self->caps = NULL; } - data_size = self->width * self->height * 3; /* RGB size */ + gst_video_info_set_format (&vinfo, GST_VIDEO_FORMAT_RGB, self->width, + self->height); + + data_size = vinfo.size; data = g_malloc (data_size); *buf = gst_buffer_new_wrapped (data, data_size);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/cccombiner.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/cccombiner.c
Changed
@@ -31,8 +31,6 @@ static GstStaticCaps foo_bar_caps = GST_STATIC_CAPS ("foo/bar"); static GstStaticCaps cea708_cc_data_caps = GST_STATIC_CAPS ("closedcaption/x-cea-708,format=(string) cc_data"); -static GstStaticCaps cea708_cdp_caps = -GST_STATIC_CAPS ("closedcaption/x-cea-708,format=(string) cdp"); GST_START_TEST (no_captions) { @@ -91,7 +89,6 @@ buflist = gst_sample_get_buffer_list (captions_sample); fail_unless_equals_int (gst_buffer_list_length (buflist), 1); - fail_unless (gst_buffer_list_get (buflist, 0) == expected_caption_buffer); gst_sample_unref (captions_sample); gst_object_unref (caption_pad); @@ -106,10 +103,11 @@ GstCaps *caps; GstVideoCaptionMeta *meta; GstBuffer *second_video_buf, *second_caption_buf; + const guint8 cc_data[3] = { 0xfc, 0x20, 0x20 }; h = gst_harness_new_with_padnames ("cccombiner", "sink", "src"); h2 = gst_harness_new_with_element (h->element, NULL, NULL); - caption_pad = gst_element_get_request_pad (h->element, "caption"); + caption_pad = gst_element_request_pad_simple (h->element, "caption"); gst_harness_add_element_sink_pad (h2, caption_pad); gst_object_unref (caption_pad); @@ -127,7 +125,8 @@ expected_video_buffer = buf; gst_harness_push (h, buf); - buf = gst_buffer_new_and_alloc (128); + buf = gst_buffer_new_and_alloc (3); + gst_buffer_fill (buf, 0, cc_data, 3); GST_BUFFER_PTS (buf) = 0; GST_BUFFER_DURATION (buf) = 40 * GST_MSECOND; expected_caption_buffer = buf; @@ -141,7 +140,8 @@ second_video_buf = buf; gst_harness_push (h, buf); - buf = gst_buffer_new_and_alloc (128); + buf = gst_buffer_new_and_alloc (3); + gst_buffer_fill (buf, 0, cc_data, 3); GST_BUFFER_PTS (buf) = 40 * GST_MSECOND; GST_BUFFER_DURATION (buf) = 40 * GST_MSECOND; second_caption_buf = buf; @@ -158,7 +158,7 @@ fail_unless (meta != NULL); fail_unless_equals_int (meta->caption_type, GST_VIDEO_CAPTION_TYPE_CEA708_RAW); - fail_unless_equals_int (meta->size, 128); + fail_unless_equals_int (meta->size, 3); gst_buffer_unref (outbuf); @@ -174,91 +174,7 @@ fail_unless 
(meta != NULL); fail_unless_equals_int (meta->caption_type, GST_VIDEO_CAPTION_TYPE_CEA708_RAW); - fail_unless_equals_int (meta->size, 128); - - gst_buffer_unref (outbuf); - - /* Caps should be equal to input caps */ - caps = gst_pad_get_current_caps (h->sinkpad); - fail_unless (caps != NULL); - fail_unless (gst_caps_can_intersect (caps, - gst_static_caps_get (&foo_bar_caps))); - gst_caps_unref (caps); - - gst_harness_teardown (h); - gst_harness_teardown (h2); -} - -GST_END_TEST; - -GST_START_TEST (captions_type_change_and_eos) -{ - GstHarness *h, *h2; - GstBuffer *buf, *outbuf; - GstPad *caption_pad; - GstCaps *caps; - GstVideoCaptionMeta *meta; - - h = gst_harness_new_with_padnames ("cccombiner", "sink", "src"); - h2 = gst_harness_new_with_element (h->element, NULL, NULL); - caption_pad = gst_element_get_request_pad (h->element, "caption"); - gst_harness_add_element_sink_pad (h2, caption_pad); - gst_object_unref (caption_pad); - - gst_harness_set_src_caps_str (h, foo_bar_caps.string); - gst_harness_set_src_caps_str (h2, cea708_cc_data_caps.string); - - /* Push a buffer and caption buffer */ - buf = gst_buffer_new_and_alloc (128); - GST_BUFFER_PTS (buf) = 0; - GST_BUFFER_DURATION (buf) = 40 * GST_MSECOND; - gst_harness_push (h, buf); - - buf = gst_buffer_new_and_alloc (128); - GST_BUFFER_PTS (buf) = 0; - GST_BUFFER_DURATION (buf) = 40 * GST_MSECOND; - gst_harness_push (h2, buf); - - /* Change caption type */ - gst_harness_set_src_caps_str (h2, cea708_cdp_caps.string); - - /* And another one: the first video buffer should be retrievable - * after the second caption buffer is pushed */ - buf = gst_buffer_new_and_alloc (128); - GST_BUFFER_PTS (buf) = 40 * GST_MSECOND; - GST_BUFFER_DURATION (buf) = 40 * GST_MSECOND; - gst_harness_push (h, buf); - - buf = gst_buffer_new_and_alloc (128); - GST_BUFFER_PTS (buf) = 40 * GST_MSECOND; - GST_BUFFER_DURATION (buf) = 40 * GST_MSECOND; - gst_harness_push (h2, buf); - - /* Pull the first output buffer */ - outbuf = 
gst_harness_pull (h); - fail_unless (outbuf != NULL); - - meta = gst_buffer_get_video_caption_meta (outbuf); - fail_unless (meta != NULL); - fail_unless_equals_int (meta->caption_type, - GST_VIDEO_CAPTION_TYPE_CEA708_RAW); - fail_unless_equals_int (meta->size, 128); - - gst_buffer_unref (outbuf); - - /* Push EOS on both pads get the second output buffer, we otherwise wait - * in case there are further captions for the current video buffer */ - gst_harness_push_event (h, gst_event_new_eos ()); - gst_harness_push_event (h2, gst_event_new_eos ()); - - outbuf = gst_harness_pull (h); - fail_unless (outbuf != NULL); - - meta = gst_buffer_get_video_caption_meta (outbuf); - fail_unless (meta != NULL); - fail_unless_equals_int (meta->caption_type, - GST_VIDEO_CAPTION_TYPE_CEA708_CDP); - fail_unless_equals_int (meta->size, 128); + fail_unless_equals_int (meta->size, 3); gst_buffer_unref (outbuf); @@ -285,7 +201,6 @@ tcase_add_test (tc, no_captions); tcase_add_test (tc, captions_and_eos); - tcase_add_test (tc, captions_type_change_and_eos); return s; }
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/cudaconvert.c
Added
@@ -0,0 +1,216 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include <gst/check/gstcheck.h> + +static gboolean +bus_cb (GstBus * bus, GstMessage * message, gpointer data) +{ + GMainLoop *loop = (GMainLoop *) data; + switch (GST_MESSAGE_TYPE (message)) { + case GST_MESSAGE_ERROR:{ + GError *err = NULL; + gchar *debug = NULL; + + gst_message_parse_error (message, &err, &debug); + + GST_ERROR ("Error: %s : %s", err->message, debug); + g_error_free (err); + g_free (debug); + + fail_if (TRUE, "failed"); + g_main_loop_quit (loop); + } + break; + case GST_MESSAGE_EOS: + g_main_loop_quit (loop); + break; + default: + break; + } + return TRUE; +} + +static void +run_convert_pipelne (const gchar * in_format, const gchar * out_format) +{ + GstBus *bus; + GMainLoop *loop = g_main_loop_new (NULL, FALSE); + gchar *pipeline_str = + g_strdup_printf ("videotestsrc num-buffers=1 is-live=true ! " + "video/x-raw,format=%s,framerate=3/1 ! cudaupload ! " + "cudaconvert ! cudadownload ! video/x-raw,format=%s ! " + "videoconvert ! 
autovideosink", in_format, out_format); + GstElement *pipeline; + + pipeline = gst_parse_launch (pipeline_str, NULL); + fail_unless (pipeline != NULL); + g_free (pipeline_str); + + bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline)); + gst_bus_add_watch (bus, (GstBusFunc) bus_cb, loop); + + gst_element_set_state (pipeline, GST_STATE_PLAYING); + g_main_loop_run (loop); + gst_element_set_state (pipeline, GST_STATE_NULL); + + gst_bus_remove_watch (bus); + gst_object_unref (bus); + gst_object_unref (pipeline); + g_main_loop_unref (loop); +} + +GST_START_TEST (test_convert_yuv_yuv) +{ + const gchar *format_list[] = { + "I420", "YV12", "NV12", "NV21", "P010_10LE", "I420_10LE", + "Y444", "Y444_16LE", + }; + + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (format_list); i++) { + for (j = 0; j < G_N_ELEMENTS (format_list); j++) { + if (i == j) + continue; + + GST_DEBUG ("run conversion %s to %s", format_list[i], format_list[j]); + run_convert_pipelne (format_list[i], format_list[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_convert_yuv_rgb) +{ + const gchar *in_format_list[] = { + "I420", "YV12", "NV12", "NV21", "P010_10LE", "I420_10LE", + "Y444", "Y444_16LE", + }; + const gchar *out_format_list[] = { + "BGRA", "RGBA", "RGBx", "BGRx", "ARGB", "ABGR", "RGB", "BGR", "BGR10A2_LE", + "RGB10A2_LE", + }; + + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (in_format_list); i++) { + for (j = 0; j < G_N_ELEMENTS (out_format_list); j++) { + GST_DEBUG ("run conversion %s to %s", in_format_list[i], + out_format_list[j]); + run_convert_pipelne (in_format_list[i], out_format_list[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_convert_rgb_yuv) +{ + const gchar *in_format_list[] = { + "BGRA", "RGBA", "RGBx", "BGRx", "ARGB", "ABGR", "RGB", "BGR", "BGR10A2_LE", + "RGB10A2_LE", + }; + const gchar *out_format_list[] = { + "I420", "YV12", "NV12", "NV21", "P010_10LE", "I420_10LE", + "Y444", "Y444_16LE", + }; + + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (in_format_list); i++) 
{ + for (j = 0; j < G_N_ELEMENTS (out_format_list); j++) { + GST_DEBUG ("run conversion %s to %s", in_format_list[i], + out_format_list[j]); + run_convert_pipelne (in_format_list[i], out_format_list[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_convert_rgb_rgb) +{ + const gchar *format_list[] = { + "BGRA", "RGBA", "RGBx", "BGRx", "ARGB", "ABGR", "RGB", "BGR", "BGR10A2_LE", + "RGB10A2_LE", + }; + + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (format_list); i++) { + for (j = 0; j < G_N_ELEMENTS (format_list); j++) { + if (i == j) + continue; + + GST_DEBUG ("run conversion %s to %s", format_list[i], format_list[j]); + run_convert_pipelne (format_list[i], format_list[j]); + } + } +} + +GST_END_TEST; + +static gboolean +check_cuda_convert_available (void) +{ + gboolean ret = TRUE; + GstElement *upload; + + upload = gst_element_factory_make ("cudaconvert", NULL); + if (!upload) { + GST_WARNING ("cudaconvert is not available, possibly driver load failure"); + return FALSE; + } + + gst_object_unref (upload); + + return ret; +} + +static Suite * +cudaconvert_suite (void) +{ + Suite *s; + TCase *tc_chain; + + /* HACK: cuda device init/deinit with fork seems to problematic */ + g_setenv ("CK_FORK", "no", TRUE); + + s = suite_create ("cudaconvert"); + tc_chain = tcase_create ("general"); + + suite_add_tcase (s, tc_chain); + + if (!check_cuda_convert_available ()) { + GST_DEBUG ("Skip cudaconvert test since cannot open device"); + goto end; + } + + tcase_add_test (tc_chain, test_convert_yuv_yuv); + tcase_add_test (tc_chain, test_convert_yuv_rgb); + tcase_add_test (tc_chain, test_convert_rgb_yuv); + tcase_add_test (tc_chain, test_convert_rgb_rgb); + +end: + return s; +} + +GST_CHECK_MAIN (cudaconvert);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/cudafilter.c
Added
@@ -0,0 +1,168 @@ +/* GStreamer + * Copyright (C) 2019 Seungha Yang <seungha.yang@navercorp.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include <gst/check/gstcheck.h> +#include <gst/check/gstharness.h> +#include <gst/video/video.h> + + +static void +test_buffer_meta_common (const gchar * in_caps, const gchar * out_caps, + const gchar * pipeline) +{ + GstHarness *h; + GstElement *capsfilter; + GstCaps *caps, *srccaps; + GstBuffer *in_buf, *out_buf = NULL; + GstFlowReturn ret; + GstVideoInfo info; + GstVideoTimeCodeMeta *meta = NULL; + + h = gst_harness_new_parse (pipeline); + fail_unless (h != NULL); + + capsfilter = gst_harness_find_element (h, "capsfilter"); + + gst_harness_play (h); + + srccaps = gst_caps_from_string (in_caps); + fail_unless (srccaps != NULL); + fail_unless (gst_video_info_from_caps (&info, srccaps)); + + gst_harness_set_src_caps (h, srccaps); + + /* enforce cuda memory */ + caps = gst_caps_from_string (out_caps); + g_object_set (capsfilter, "caps", caps, NULL); + gst_caps_unref (caps); + gst_object_unref (capsfilter); + + in_buf = gst_buffer_new_and_alloc (GST_VIDEO_INFO_SIZE (&info)); + gst_buffer_memset (in_buf, 0, 0, GST_VIDEO_INFO_SIZE (&info)); + + GST_BUFFER_DURATION (in_buf) = GST_SECOND; + GST_BUFFER_PTS (in_buf) = 0; + 
GST_BUFFER_DTS (in_buf) = GST_CLOCK_TIME_NONE; + + gst_buffer_add_video_time_code_meta_full (in_buf, 30, 1, + NULL, GST_VIDEO_TIME_CODE_FLAGS_NONE, 0, 0, 1, 1, 0); + + ret = gst_harness_push (h, gst_buffer_ref (in_buf)); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + + out_buf = gst_harness_try_pull (h); + fail_unless (out_buf != NULL, "No output buffer"); + + meta = gst_buffer_get_video_time_code_meta (out_buf); + fail_unless (meta != NULL, "output buffer has no meta"); + fail_unless_equals_int (meta->tc.config.fps_n, 30); + fail_unless_equals_int (meta->tc.config.fps_d, 1); + fail_unless_equals_int (meta->tc.seconds, 1); + + gst_buffer_unref (in_buf); + gst_buffer_unref (out_buf); + + gst_harness_teardown (h); +} + +GST_START_TEST (test_buffer_meta) +{ + /* test whether buffer meta would be preserved or not */ + + test_buffer_meta_common + ("video/x-raw,format=(string)NV12,width=340,height=240", + "video/x-raw(memory:CUDAMemory)", "cudaupload ! capsfilter"); + test_buffer_meta_common + ("video/x-raw,format=(string)NV12,width=340,height=240", "video/x-raw", + "cudaupload ! cudadownload ! capsfilter"); + test_buffer_meta_common + ("video/x-raw,format=(string)NV12,width=340,height=240", + "video/x-raw,format=(string)I420,width=340,height=240", + "cudaupload ! cudaconvert ! cudadownload ! capsfilter"); + test_buffer_meta_common + ("video/x-raw,format=(string)NV12,width=340,height=240", + "video/x-raw,format=(string)NV12,width=640,height=480", + "cudaupload ! cudaconvert ! cudascale ! cudaconvert ! cudadownload ! 
capsfilter"); +} + +GST_END_TEST; + +static gboolean +check_cuda_available (void) +{ + GstElement *elem; + + elem = gst_element_factory_make ("cudaupload", NULL); + if (!elem) { + GST_WARNING ("cudaupload is not available, possibly driver load failure"); + return FALSE; + } + gst_object_unref (elem); + + elem = gst_element_factory_make ("cudadownload", NULL); + if (!elem) { + GST_WARNING ("cudadownload is not available, possibly driver load failure"); + return FALSE; + } + gst_object_unref (elem); + + elem = gst_element_factory_make ("cudaconvert", NULL); + if (!elem) { + GST_WARNING ("cudaconvert is not available, possibly driver load failure"); + return FALSE; + } + gst_object_unref (elem); + + elem = gst_element_factory_make ("cudadownload", NULL); + if (!elem) { + GST_WARNING ("cudascale is not available, possibly driver load failure"); + return FALSE; + } + gst_object_unref (elem); + + return TRUE; +} + +static Suite * +cudafilter_suite (void) +{ + Suite *s; + TCase *tc_chain; + + /* HACK: cuda device init/deinit with fork seems to problematic */ + g_setenv ("CK_FORK", "no", TRUE); + + s = suite_create ("cudafilter"); + tc_chain = tcase_create ("general"); + + suite_add_tcase (s, tc_chain); + + if (!check_cuda_available ()) { + GST_DEBUG ("Skip cuda filter test since cannot open device"); + goto end; + } + + tcase_add_test (tc_chain, test_buffer_meta); + +end: + return s; +} + +GST_CHECK_MAIN (cudafilter);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/curlsftpsink.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/curlsftpsink.c
Changed
@@ -52,6 +52,7 @@ gchar *res_passphrase = NULL; gchar *res_kh_file = NULL; gchar *res_host_pubkey_md5 = NULL; + gchar *res_host_pubkey_sha256 = NULL; guint res_auth_type = 0; gboolean res_accept_unkh = FALSE; @@ -76,6 +77,10 @@ g_object_set (G_OBJECT (sink), "ssh-knownhosts", "known_hosts", NULL); g_object_set (G_OBJECT (sink), "ssh-host-pubkey-md5", "00112233445566778899aabbccddeeff", NULL); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + g_object_set (G_OBJECT (sink), "ssh-host-pubkey-sha256", + "TQtiu1/zwGEEKG4z/PDfPE/ak47AF9nbWHykx4CWcu9", NULL); +#endif g_object_set (G_OBJECT (sink), "ssh-accept-unknownhost", TRUE, NULL); g_object_set (G_OBJECT (sink), "ssh-key-passphrase", "SoMePaSsPhRaSe", NULL); @@ -94,6 +99,10 @@ "create-dirs", &res_create_dirs, "ssh-key-passphrase", &res_passphrase, NULL); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + g_object_get (sink, "ssh-host-pubkey-sha256", &res_host_pubkey_sha256, NULL); +#endif + fail_unless (strncmp (res_location, "test_location", strlen ("test_location")) == 0); fail_unless (strncmp (res_user, "test_user", strlen ("test_user")) == 0); @@ -113,6 +122,11 @@ == 0); fail_unless (strncmp (res_host_pubkey_md5, "00112233445566778899aabbccddeeff", strlen ("00112233445566778899aabbccddeeff")) == 0); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + fail_unless (strncmp (res_host_pubkey_sha256, + "TQtiu1/zwGEEKG4z/PDfPE/ak47AF9nbWHykx4CWcu9", + strlen ("TQtiu1/zwGEEKG4z/PDfPE/ak47AF9nbWHykx4CWcu9")) == 0); +#endif fail_unless (strncmp (res_passphrase, "SoMePaSsPhRaSe", strlen ("SoMePaSsPhRaSe")) == 0); fail_unless (res_accept_unkh == TRUE); @@ -127,6 +141,7 @@ g_free (res_passphrase); g_free (res_kh_file); g_free (res_host_pubkey_md5); + g_free (res_host_pubkey_sha256); /* ------- change properties ------------- */ @@ -145,6 +160,10 @@ g_object_set (G_OBJECT (sink), "ssh-knownhosts", "/zzz/known_hosts", NULL); g_object_set (G_OBJECT (sink), "ssh-host-pubkey-md5", "ffeeddccbbaa99887766554433221100", NULL); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + 
g_object_set (G_OBJECT (sink), "ssh-host-pubkey-sha256", + "TUtitut/wGEEKG4z/PDfPE/ak47AF7nbWHykAxCWcu5", NULL); +#endif g_object_set (G_OBJECT (sink), "ssh-accept-unknownhost", FALSE, NULL); g_object_set (G_OBJECT (sink), "ssh-key-passphrase", "OtherPASSphrase", NULL); @@ -163,6 +182,10 @@ "ssh-key-passphrase", &res_passphrase, "create-dirs", &res_create_dirs, NULL); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + g_object_get (sink, "ssh-host-pubkey-sha256", &res_host_pubkey_sha256, NULL); +#endif + fail_unless (strncmp (res_location, "new_location", strlen ("new_location")) == 0); fail_unless (strncmp (res_user, "new_user", strlen ("new_user")) == 0); @@ -182,6 +205,11 @@ strlen ("/zzz/known_host")) == 0); fail_unless (strncmp (res_host_pubkey_md5, "ffeeddccbbaa99887766554433221100", strlen ("ffeeddccbbaa99887766554433221100")) == 0); +#if CURL_AT_LEAST_VERSION(7, 80, 0) + fail_unless (strncmp (res_host_pubkey_sha256, + "TUtitut/wGEEKG4z/PDfPE/ak47AF7nbWHykAxCWcu5", + strlen ("TUtitut/wGEEKG4z/PDfPE/ak47AF7nbWHykAxCWcu5")) == 0); +#endif fail_unless (strncmp (res_passphrase, "OtherPASSphrase", strlen ("OtherPASSphrase")) == 0); fail_unless (res_accept_unkh == FALSE); @@ -196,6 +224,7 @@ g_free (res_passphrase); g_free (res_kh_file); g_free (res_host_pubkey_md5); + g_free (res_host_pubkey_sha256); cleanup_curlsftpsink (sink); }
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/d3d11colorconvert.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/d3d11colorconvert.c
Changed
@@ -45,6 +45,24 @@ static const guint8 rgba_reorder_data[] = { 0x49, 0x24, 0x72, 0xff }; static const guint8 bgra_reorder_data[] = { 0x72, 0x24, 0x49, 0xff }; +static const gchar *YUV_FORMATS[] = { + "VUYA", "NV12", "P010_10LE", "P012_LE", "P016_LE", "I420", "I420_10LE", + "I420_12LE", "YV12", "NV21", "Y444", "Y444_10LE", "Y444_12LE", "Y444_16LE", + "Y42B", "I422_10LE", "I422_12LE", +}; + +static const gchar *RGB_FORMATS[] = { + "BGRA", "RGBA", "RGB10A2_LE", "BGRx", "RGBx", +}; + +static const gchar *PACKED_YUV_FORMATS[] = { + "Y410", +}; + +static const gchar *GRAY_FORMATS[] = { + "GRAY8", "GRAY16_LE" +}; + static TestFrame test_rgba_reorder[] = { {1, 1, GST_VIDEO_FORMAT_RGBA, {(guint8 *) & rgba_reorder_data}}, {1, 1, GST_VIDEO_FORMAT_BGRA, {(guint8 *) & bgra_reorder_data}}, @@ -53,7 +71,7 @@ GST_START_TEST (test_d3d11_color_convert_rgba_reorder) { GstHarness *h = - gst_harness_new_parse ("d3d11upload ! d3d11colorconvert ! d3d11download"); + gst_harness_new_parse ("d3d11upload ! d3d11convert ! d3d11download"); gint i, j, k; for (i = 0; i < G_N_ELEMENTS (test_rgba_reorder); i++) { @@ -102,7 +120,6 @@ GST_END_TEST; -#if RUN_VISUAL_TEST static gboolean bus_cb (GstBus * bus, GstMessage * message, gpointer data) { @@ -139,7 +156,7 @@ gchar *pipeline_str = g_strdup_printf ("videotestsrc num-buffers=1 is-live=true ! " "video/x-raw,format=%s,framerate=3/1 ! d3d11upload ! " - "d3d11colorconvert ! d3d11download ! video/x-raw,format=%s ! " + "d3d11convert ! d3d11download ! video/x-raw,format=%s ! " "videoconvert ! 
d3d11videosink", in_format, out_format); GstElement *pipeline; @@ -162,19 +179,15 @@ GST_START_TEST (test_d3d11_color_convert_yuv_yuv) { - const gchar *format_list[] = { - "VUYA", "NV12", "P010_10LE", "P016_LE", "I420", "I420_10LE" - }; - gint i, j; - for (i = 0; i < G_N_ELEMENTS (format_list); i++) { - for (j = 0; j < G_N_ELEMENTS (format_list); j++) { + for (i = 0; i < G_N_ELEMENTS (YUV_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (YUV_FORMATS); j++) { if (i == j) continue; - GST_DEBUG ("run conversion %s to %s", format_list[i], format_list[j]); - run_convert_pipelne (format_list[i], format_list[j]); + GST_DEBUG ("run conversion %s to %s", YUV_FORMATS[i], YUV_FORMATS[j]); + run_convert_pipelne (YUV_FORMATS[i], YUV_FORMATS[j]); } } } @@ -183,24 +196,32 @@ GST_START_TEST (test_d3d11_color_convert_yuv_rgb) { - const gchar *in_format_list[] = { - "VUYA", "NV12", "P010_10LE", "P016_LE", "I420", "I420_10LE" - }; - const gchar *out_format_list[] = { - "BGRA", "RGBA", "RGB10A2_LE", - }; + gint i, j; + for (i = 0; i < G_N_ELEMENTS (YUV_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (RGB_FORMATS); j++) { + if (i == j) + continue; + + GST_DEBUG ("run conversion %s to %s", YUV_FORMATS[i], RGB_FORMATS[j]); + run_convert_pipelne (YUV_FORMATS[i], RGB_FORMATS[j]); + } + } +} + +GST_END_TEST; +GST_START_TEST (test_d3d11_color_convert_yuv_gray) +{ gint i, j; - for (i = 0; i < G_N_ELEMENTS (in_format_list); i++) { - for (j = 0; j < G_N_ELEMENTS (out_format_list); j++) { + for (i = 0; i < G_N_ELEMENTS (YUV_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (GRAY_FORMATS); j++) { if (i == j) continue; - GST_DEBUG ("run conversion %s to %s", in_format_list[i], - out_format_list[j]); - run_convert_pipelne (in_format_list[i], out_format_list[j]); + GST_DEBUG ("run conversion %s to %s", YUV_FORMATS[i], GRAY_FORMATS[j]); + run_convert_pipelne (YUV_FORMATS[i], GRAY_FORMATS[j]); } } } @@ -209,20 +230,12 @@ GST_START_TEST (test_d3d11_color_convert_rgb_yuv) { - const gchar *in_format_list[] 
= { - "BGRA", "RGBA", "RGB10A2_LE", - }; - const gchar *out_format_list[] = { - "VUYA", "NV12", "P010_10LE", "P016_LE", "I420", "I420_10LE" - }; - gint i, j; - for (i = 0; i < G_N_ELEMENTS (in_format_list); i++) { - for (j = 0; j < G_N_ELEMENTS (out_format_list); j++) { - GST_DEBUG ("run conversion %s to %s", in_format_list[i], - out_format_list[j]); - run_convert_pipelne (in_format_list[i], out_format_list[j]); + for (i = 0; i < G_N_ELEMENTS (RGB_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (YUV_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", RGB_FORMATS[i], YUV_FORMATS[j]); + run_convert_pipelne (RGB_FORMATS[i], YUV_FORMATS[j]); } } } @@ -231,40 +244,130 @@ GST_START_TEST (test_d3d11_color_convert_rgb_rgb) { - const gchar *format_list[] = { - "BGRA", "RGBA", "RGB10A2_LE", - }; - gint i, j; - for (i = 0; i < G_N_ELEMENTS (format_list); i++) { - for (j = 0; j < G_N_ELEMENTS (format_list); j++) { + for (i = 0; i < G_N_ELEMENTS (RGB_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (RGB_FORMATS); j++) { if (i == j) continue; - GST_DEBUG ("run conversion %s to %s", format_list[i], format_list[j]); - run_convert_pipelne (format_list[i], format_list[j]); + GST_DEBUG ("run conversion %s to %s", RGB_FORMATS[i], RGB_FORMATS[j]); + run_convert_pipelne (RGB_FORMATS[i], RGB_FORMATS[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_d3d11_color_convert_rgb_gray) +{ + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (RGB_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (GRAY_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", RGB_FORMATS[i], GRAY_FORMATS[j]); + run_convert_pipelne (RGB_FORMATS[i], GRAY_FORMATS[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_d3d11_color_convert_packed_yuv_yuv) +{ + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (PACKED_YUV_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (YUV_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", PACKED_YUV_FORMATS[i], + YUV_FORMATS[j]); + run_convert_pipelne (PACKED_YUV_FORMATS[i], 
YUV_FORMATS[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_d3d11_color_convert_packed_yuv_rgb) +{ + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (PACKED_YUV_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (RGB_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", PACKED_YUV_FORMATS[i], + RGB_FORMATS[j]); + run_convert_pipelne (PACKED_YUV_FORMATS[i], RGB_FORMATS[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_d3d11_color_convert_packed_yuv_gray) +{ + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (PACKED_YUV_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (GRAY_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", PACKED_YUV_FORMATS[i], + GRAY_FORMATS[j]); + run_convert_pipelne (PACKED_YUV_FORMATS[i], GRAY_FORMATS[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_d3d11_color_convert_gray_yuv) +{ + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (GRAY_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (YUV_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", GRAY_FORMATS[i], YUV_FORMATS[j]); + run_convert_pipelne (GRAY_FORMATS[i], YUV_FORMATS[j]); + } + } +} + +GST_END_TEST; + +GST_START_TEST (test_d3d11_color_convert_gray_rgb) +{ + gint i, j; + + for (i = 0; i < G_N_ELEMENTS (GRAY_FORMATS); i++) { + for (j = 0; j < G_N_ELEMENTS (RGB_FORMATS); j++) { + GST_DEBUG ("run conversion %s to %s", GRAY_FORMATS[i], RGB_FORMATS[j]); + run_convert_pipelne (GRAY_FORMATS[i], RGB_FORMATS[j]); } } } GST_END_TEST; -#endif /* RUN_VISUAL_TEST */ static Suite * d3d11colorconvert_suite (void) { Suite *s = suite_create ("d3d11colorconvert"); TCase *tc_basic = tcase_create ("general"); + const gchar *run_visual_test = g_getenv ("RUN_VISUAL_TEST"); suite_add_tcase (s, tc_basic); tcase_add_test (tc_basic, test_d3d11_color_convert_rgba_reorder); -#if RUN_VISUAL_TEST - tcase_add_test (tc_basic, test_d3d11_color_convert_yuv_yuv); - tcase_add_test (tc_basic, test_d3d11_color_convert_yuv_rgb); - tcase_add_test (tc_basic, test_d3d11_color_convert_rgb_yuv); - 
tcase_add_test (tc_basic, test_d3d11_color_convert_rgb_rgb); -#endif + if (run_visual_test != NULL) { + tcase_add_test (tc_basic, test_d3d11_color_convert_yuv_yuv); + tcase_add_test (tc_basic, test_d3d11_color_convert_yuv_rgb); + tcase_add_test (tc_basic, test_d3d11_color_convert_yuv_gray); + tcase_add_test (tc_basic, test_d3d11_color_convert_rgb_yuv); + tcase_add_test (tc_basic, test_d3d11_color_convert_rgb_rgb); + tcase_add_test (tc_basic, test_d3d11_color_convert_rgb_gray); + tcase_add_test (tc_basic, test_d3d11_color_convert_packed_yuv_yuv); + tcase_add_test (tc_basic, test_d3d11_color_convert_packed_yuv_rgb); + tcase_add_test (tc_basic, test_d3d11_color_convert_packed_yuv_gray); + tcase_add_test (tc_basic, test_d3d11_color_convert_gray_yuv); + tcase_add_test (tc_basic, test_d3d11_color_convert_gray_rgb); + } return s; }
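The refactor above replaces per-test format lists with shared YUV/RGB/packed-YUV/gray tables and drives every source/destination pair through run_convert_pipelne. A minimal plain-C sketch of that pair enumeration follows (the helper name is hypothetical); note that the `if (i == j) continue;` guard only has an effect when the input and output tables are the same array:

```c
#include <assert.h>
#include <stddef.h>

/* Stand-in for GLib's G_N_ELEMENTS macro used by the tests. */
#define N_ELEMENTS(arr) (sizeof (arr) / sizeof ((arr)[0]))

static const char *yuv_formats[] = { "VUYA", "NV12", "I420" };
static const char *rgb_formats[] = { "BGRA", "RGBA" };

/* Count the conversion pairs a cross-format test would exercise,
 * skipping identical in/out entries when both tables are the same
 * one (mirroring the "if (i == j) continue;" guard in the tests). */
static size_t
count_conversion_pairs (const char **in, size_t n_in,
    const char **out, size_t n_out)
{
  size_t i, j, count = 0;

  for (i = 0; i < n_in; i++) {
    for (j = 0; j < n_out; j++) {
      if (in == out && i == j)
        continue;
      count++;
    }
  }
  return count;
}
```

Centralizing the tables means newly supported formats (e.g. the added Y444/I422 depths) automatically get every cross-combination covered, instead of each test carrying its own stale copy of the list.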
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/dash_mpd.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/dash_mpd.c
Changed
@@ -51,6 +51,9 @@ #include <gst/check/gstcheck.h> +#include <libxml/parser.h> +#include <libxml/tree.h> + GST_DEBUG_CATEGORY (gst_dash_demux_debug); /* @@ -1392,6 +1395,58 @@ gboolean ret; GstMPDClient *mpdclient = gst_mpd_client_new (); + gchar *str; + + ret = gst_mpd_client_parse (mpdclient, xml, (gint) strlen (xml)); + assert_equals_int (ret, TRUE); + + periodNode = (GstMPDPeriodNode *) mpdclient->mpd_root_node->Periods->data; + adaptationSet = (GstMPDAdaptationSetNode *) periodNode->AdaptationSets->data; + representationBase = GST_MPD_REPRESENTATION_BASE_NODE (adaptationSet); + contentProtection = + (GstMPDDescriptorTypeNode *) representationBase->ContentProtection->data; + assert_equals_string (contentProtection->schemeIdUri, "TestSchemeIdUri"); + + /* We can't do a simple compare of value (which should be an XML dump + of the ContentProtection element), because the whitespace + formatting from xmlDump might differ between versions of libxml */ + str = strstr (contentProtection->value, "<ContentProtection"); + fail_if (str == NULL); + str = strstr (contentProtection->value, "value=\"TestValue\""); + fail_if (str == NULL); + str = strstr (contentProtection->value, "</ContentProtection>"); + fail_if (str == NULL); + + gst_mpd_client_free (mpdclient); +} + +GST_END_TEST; + +/* + * Test parsing Period AdaptationSet RepresentationBase ContentProtection + * with custom ContentProtection content. 
+ */ +GST_START_TEST + (dash_mpdparser_period_adaptationSet_representationBase_contentProtection_with_content) +{ + GstMPDPeriodNode *periodNode; + GstMPDAdaptationSetNode *adaptationSet; + GstMPDRepresentationBaseNode *representationBase; + GstMPDDescriptorTypeNode *contentProtection; + const gchar *xml = + "<?xml version=\"1.0\"?>" + "<MPD xmlns=\"urn:mpeg:dash:schema:mpd:2011\"" + " profiles=\"urn:mpeg:dash:profile:isoff-main:2011\"" + " xmlns:customns=\"foo\">" + " <Period>" + " <AdaptationSet>" + " <ContentProtection schemeIdUri=\"TestSchemeIdUri\">" + " <customns:bar>Hello world</customns:bar>" + " </ContentProtection></AdaptationSet></Period></MPD>"; + + gboolean ret; + GstMPDClient *mpdclient = gst_mpd_client_new (); + gchar *str; ret = gst_mpd_client_parse (mpdclient, xml, (gint) strlen (xml)); assert_equals_int (ret, TRUE); @@ -1402,13 +1457,25 @@ contentProtection = (GstMPDDescriptorTypeNode *) representationBase->ContentProtection->data; assert_equals_string (contentProtection->schemeIdUri, "TestSchemeIdUri"); - assert_equals_string (contentProtection->value, "TestValue"); + + /* We can't do a simple compare of value (which should be an XML dump + of the ContentProtection element), because the whitespace + formatting from xmlDump might differ between versions of libxml */ + str = strstr (contentProtection->value, "<ContentProtection"); + fail_if (str == NULL); + str = + strstr (contentProtection->value, + "<customns:bar>Hello world</customns:bar>"); + fail_if (str == NULL); + str = strstr (contentProtection->value, "</ContentProtection>"); + fail_if (str == NULL); gst_mpd_client_free (mpdclient); } GST_END_TEST; + /* * Test parsing ContentProtection element that has no value attribute */ @@ -1515,6 +1582,93 @@ GST_END_TEST; /* + * Test parsing Period AdaptationSet RepresentationBase ContentProtection + * attributes + */ +GST_START_TEST + (dash_mpdparser_period_adaptationSet_representationBase_contentProtection_xml_namespaces) +{ + const gchar *xml =
"<?xml version=\"1.0\"?>" + "<MPD xmlns=\"urn:mpeg:dash:schema:mpd:2011\" minBufferTime=\"PT1.500S\"" + " type=\"static\" mediaPresentationDuration=\"PT0H24M28.000S\"" + " maxSegmentDuration=\"PT0H0M4.000S\"" + " profiles=\"urn:mpeg:dash:profile:isoff-live:2011,http://dashif.org/guidelines/dash264\"" + " xmlns:cenc=\"urn:mpeg:cenc:2013\" xmlns:clearkey=\"http://dashif.org/guidelines/clearKey\">" + " <Period>" " <AdaptationSet>" + " <ContentProtection schemeIdUri=\"urn:mpeg:dash:mp4protection:2011\"" + " value=\"cenc\" cenc:default_KID=\"33969335-53A5-4E78-BA99-9054CD1B2871\">" + " </ContentProtection>" + " <ContentProtection value=\"ClearKey1.0\"" + " schemeIdUri=\"urn:uuid:e2719d58-a985-b3c9-781a-b030af78d30e\">" + " <clearkey:Laurl Lic_type=\"EME-1.0\">https://drm.test.example/AcquireLicense</clearkey:Laurl>" + " </ContentProtection></AdaptationSet></Period></MPD>"; + GstMPDPeriodNode *periodNode; + GstMPDAdaptationSetNode *adaptationSet; + GstMPDRepresentationBaseNode *representationBase; + GstMPDDescriptorTypeNode *contentProtection; + gboolean ret; + GstMPDClient *mpdclient = gst_mpd_client_new (); + xmlDocPtr doc; + xmlNode *root_element = NULL, *node; + xmlChar *property = NULL; + + ret = gst_mpd_client_parse (mpdclient, xml, (gint) strlen (xml)); + assert_equals_int (ret, TRUE); + + periodNode = (GstMPDPeriodNode *) mpdclient->mpd_root_node->Periods->data; + adaptationSet = (GstMPDAdaptationSetNode *) periodNode->AdaptationSets->data; + representationBase = GST_MPD_REPRESENTATION_BASE_NODE (adaptationSet); + assert_equals_int (g_list_length (representationBase->ContentProtection), 2); + contentProtection = (GstMPDDescriptorTypeNode *) + g_list_nth_data (representationBase->ContentProtection, 0); + assert_equals_string (contentProtection->schemeIdUri, + "urn:mpeg:dash:mp4protection:2011"); + + contentProtection = (GstMPDDescriptorTypeNode *) + g_list_nth_data (representationBase->ContentProtection, 1); + assert_equals_string (contentProtection->schemeIdUri, 
+ "urn:uuid:e2719d58-a985-b3c9-781a-b030af78d30e"); + + /* We can't do a simple string compare of value, because the whitespace + formatting from xmlDump might differ between versions of libxml */ + LIBXML_TEST_VERSION; + doc = + xmlReadMemory (contentProtection->value, + strlen (contentProtection->value), "ContentProtection.xml", NULL, + XML_PARSE_NONET); + fail_if (!doc); + root_element = xmlDocGetRootElement (doc); + fail_if (root_element->type != XML_ELEMENT_NODE); + fail_if (xmlStrcmp (root_element->name, + (xmlChar *) "ContentProtection") != 0); + fail_if ((property = + xmlGetNoNsProp (root_element, (const xmlChar *) "value")) == NULL); + fail_if (xmlStrcmp (property, (xmlChar *) "ClearKey1.0") != 0); + xmlFree (property); + fail_if ((property = + xmlGetNoNsProp (root_element, + (const xmlChar *) "schemeIdUri")) == NULL); + assert_equals_string ((const gchar *) property, + "urn:uuid:e2719d58-a985-b3c9-781a-b030af78d30e"); + xmlFree (property); + + for (node = root_element->children; node; node = node->next) { + if (node->type == XML_ELEMENT_NODE) + break; + } + assert_equals_string ((const gchar *) node->name, "Laurl"); + assert_equals_string ((const gchar *) node->children->content, + "https://drm.test.example/AcquireLicense"); + + xmlFreeDoc (doc); + + gst_mpd_client_free (mpdclient); +} + +GST_END_TEST; + +/* * Test parsing Period AdaptationSet Accessibility attributes * */ @@ -5852,7 +6006,7 @@ /* constructs initial mpd using external xml uri */ /* For invalid URI, mpdparser should be ignore it */ xml_joined = g_strjoin ("", xml_frag_start, - xml_uri_front, "http://404/ERROR/XML.period", xml_uri_rear, + xml_uri_front, "http://404.invalid/ERROR/XML.period", xml_uri_rear, xml_uri_front, (const char *) file_uri_single_period, xml_uri_rear, xml_uri_front, (const char *) file_uri_double_period, xml_uri_rear, xml_frag_end, NULL); @@ -6389,6 +6543,10 @@ tcase_add_test (tc_simpleMPD, dash_mpdparser_contentProtection_no_value_no_encoding); tcase_add_test 
(tc_simpleMPD, + dash_mpdparser_period_adaptationSet_representationBase_contentProtection_with_content); + tcase_add_test (tc_simpleMPD, + dash_mpdparser_period_adaptationSet_representationBase_contentProtection_xml_namespaces); + tcase_add_test (tc_simpleMPD, dash_mpdparser_period_adaptationSet_accessibility); tcase_add_test (tc_simpleMPD, dash_mpdparser_period_adaptationSet_role); tcase_add_test (tc_simpleMPD, dash_mpdparser_period_adaptationSet_rating);
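The updated assertions above deliberately avoid comparing the xmlDump output byte-for-byte, because the whitespace libxml2 emits differs between versions; instead they probe the serialized ContentProtection node for key substrings. A small standalone sketch of that style of check in plain C (helper name hypothetical):

```c
#include <assert.h>
#include <string.h>

/* Return non-zero if the serialized XML fragment contains the expected
 * markup. strstr-based probes tolerate whatever indentation or line
 * breaks the serializer chose, unlike an exact string comparison. */
static int
xml_dump_contains (const char *dump, const char *needle)
{
  return dump != NULL && strstr (dump, needle) != NULL;
}
```

The trade-off is looser matching: substring probes verify that the expected elements and attributes are present, not that the dump contains nothing else, which is acceptable for these parser tests.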
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/dtls.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/dtls.c
Changed
@@ -147,22 +147,22 @@ gst_element_set_state (c_enc, GST_STATE_PAUSED); - target = gst_element_get_request_pad (c_dec, "src"); + target = gst_element_request_pad_simple (c_dec, "src"); ghost = gst_ghost_pad_new ("src", target); gst_element_add_pad (s_bin, ghost); gst_object_unref (target); - target = gst_element_get_request_pad (s_enc, "sink"); + target = gst_element_request_pad_simple (s_enc, "sink"); ghost = gst_ghost_pad_new ("sink", target); gst_element_add_pad (s_bin, ghost); gst_object_unref (target); - target = gst_element_get_request_pad (s_dec, "src"); + target = gst_element_request_pad_simple (s_dec, "src"); ghost = gst_ghost_pad_new ("src", target); gst_element_add_pad (c_bin, ghost); gst_object_unref (target); - target = gst_element_get_request_pad (c_enc, "sink"); + target = gst_element_request_pad_simple (c_enc, "sink"); ghost = gst_ghost_pad_new ("sink", target); gst_element_add_pad (c_bin, ghost); gst_object_unref (target);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/h264parse.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/h264parse.c
Changed
@@ -23,6 +23,10 @@ * Boston, MA 02110-1301, USA. */ +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + #include <gst/check/check.h> #include <gst/video/video.h> #include "gst-libs/gst/codecparsers/gsth264parser.h" @@ -801,12 +805,20 @@ } /* These were generated using pipeline: - * gst-launch-1.0 videotestsrc num-buffers=1 pattern=green \ + * gst-launch-1.0 videotestsrc num-buffers=2 pattern=green \ * ! video/x-raw,width=128,height=128 \ * ! openh264enc num-slices=2 \ * ! fakesink dump=1 */ +/* codec-data */ +static guint8 h264_slicing_codec_data[] = { + 0x01, 0x42, 0xc0, 0x0b, 0xff, 0xe1, 0x00, 0x0e, + 0x67, 0x42, 0xc0, 0x0b, 0x8c, 0x8d, 0x41, 0x02, + 0x24, 0x03, 0xc2, 0x21, 0x1a, 0x80, 0x01, 0x00, + 0x04, 0x68, 0xce, 0x3c, 0x80 +}; + /* SPS */ static guint8 h264_slicing_sps[] = { 0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0xc0, 0x0b, @@ -841,6 +853,18 @@ 0xd7, 0x5d, 0x75, 0xd7, 0x5e }; +/* P Slice 1 */ +static guint8 h264_slice_1[] = { + 0x00, 0x00, 0x00, 0x01, 0x61, 0xe0, 0x00, 0x40, + 0x00, 0x9c, 0x82, 0x3c, 0x10, 0xc0 +}; + +/* P Slice 2 */ +static guint8 h264_slice_2[] = { + 0x00, 0x00, 0x00, 0x01, 0x61, 0x04, 0x38, 0x00, + 0x10, 0x00, 0x27, 0x20, 0x8f, 0x04, 0x30 +}; + static inline GstBuffer * wrap_buffer (const guint8 * buf, gsize size, GstClockTime pts, GstBufferFlags flags) @@ -1257,6 +1281,187 @@ GST_END_TEST; +typedef enum +{ + PACKETIZED_AU = 0, + /* TODO: packetized with nal alignment if we expect that should work? 
*/ + BYTESTREAM_AU, + BYTESTREAM_NAL, +} H264ParseStreamType; + +static const gchar * +stream_type_to_caps_str (H264ParseStreamType type) +{ + switch (type) { + case PACKETIZED_AU: + return "video/x-h264,stream-format=avc,alignment=au"; + case BYTESTREAM_AU: + return "video/x-h264,stream-format=byte-stream,alignment=au"; + case BYTESTREAM_NAL: + return "video/x-h264,stream-format=byte-stream,alignment=nal"; + } + + g_assert_not_reached (); + + return NULL; +} + +static GstMemory * +nalu_to_memory (H264ParseStreamType type, const guint8 * data, gsize size) +{ + gpointer dump = g_memdup2 (data, size); + + if (type == PACKETIZED_AU) { + guint32 nalu_size; + + nalu_size = size - 4; + nalu_size = GUINT32_TO_BE (nalu_size); + memcpy (dump, &nalu_size, sizeof (nalu_size)); + } + + return gst_memory_new_wrapped (0, dump, size, 0, size, dump, g_free); +} + +static GList * +create_aud_test_buffers (H264ParseStreamType type, gboolean inband_aud) +{ + GList *list = NULL; + GstBuffer *buf = NULL; + +#define APPEND_NALU_TO_BUFFER(type,nalu,end_of_au) G_STMT_START { \ + if (!buf) { \ + buf = gst_buffer_new (); \ + } \ + gst_buffer_append_memory (buf, nalu_to_memory (type, nalu, \ + sizeof (nalu))); \ + if (type == BYTESTREAM_NAL || end_of_au) { \ + list = g_list_append (list, buf); \ + buf = NULL; \ + } \ +} G_STMT_END + + if (inband_aud) + APPEND_NALU_TO_BUFFER (type, h264_aud, FALSE); + + APPEND_NALU_TO_BUFFER (type, h264_slicing_sps, FALSE); + APPEND_NALU_TO_BUFFER (type, h264_slicing_pps, FALSE); + APPEND_NALU_TO_BUFFER (type, h264_idr_slice_1, FALSE); + APPEND_NALU_TO_BUFFER (type, h264_idr_slice_2, TRUE); + + if (inband_aud) + APPEND_NALU_TO_BUFFER (type, h264_aud, FALSE); + + APPEND_NALU_TO_BUFFER (type, h264_slice_1, FALSE); + APPEND_NALU_TO_BUFFER (type, h264_slice_2, TRUE); + +#undef APPEND_NALU_TO_BUFFER + + return list; +} + +static void +check_aud_insertion (gboolean inband_aud, H264ParseStreamType in_type, + H264ParseStreamType out_type) +{ + GstHarness *h; + GList 
*in_buffers = NULL; + GList *expected_buffers = NULL; + GList *result_buffers = NULL; + GList *iter, *walk; + GstCaps *in_caps, *out_caps; + gboolean aud_in_output; + GstBuffer *buf; + + h = gst_harness_new ("h264parse"); + + in_caps = gst_caps_from_string (stream_type_to_caps_str (in_type)); + if (in_type == PACKETIZED_AU) { + GstBuffer *cdata_buf = gst_buffer_new_memdup (h264_slicing_codec_data, + sizeof (h264_slicing_codec_data)); + gst_caps_set_simple (in_caps, + "codec_data", GST_TYPE_BUFFER, cdata_buf, NULL); + gst_buffer_unref (cdata_buf); + } + + out_caps = gst_caps_from_string (stream_type_to_caps_str (out_type)); + + gst_harness_set_caps (h, in_caps, out_caps); + + in_buffers = create_aud_test_buffers (in_type, inband_aud); + + if (out_type == BYTESTREAM_AU || out_type == BYTESTREAM_NAL) { + /* In case of byte-stream output, parse will insert AUD always */ + aud_in_output = TRUE; + } else if (inband_aud) { + /* Parse will not drop AUD in any case */ + aud_in_output = TRUE; + } else { + /* Cases where input bitstream doesn't contain AUD and output format is + * packetized. 
In this case parse will not insert AUD */ + aud_in_output = FALSE; + } + + expected_buffers = create_aud_test_buffers (out_type, aud_in_output); + + for (iter = in_buffers; iter; iter = g_list_next (iter)) { + buf = (GstBuffer *) iter->data; + fail_unless_equals_int (gst_harness_push (h, gst_buffer_ref (buf)), + GST_FLOW_OK); + } + + /* EOS for pending buffers to be drained if any */ + gst_harness_push_event (h, gst_event_new_eos ()); + + while ((buf = gst_harness_try_pull (h))) + result_buffers = g_list_append (result_buffers, buf); + + fail_unless_equals_int (g_list_length (result_buffers), + g_list_length (expected_buffers)); + + for (iter = expected_buffers, walk = result_buffers; iter && walk; + iter = g_list_next (iter), walk = g_list_next (walk)) { + GstBuffer *buf1, *buf2; + GstMapInfo map1, map2; + + buf1 = (GstBuffer *) iter->data; + buf2 = (GstBuffer *) walk->data; + + gst_buffer_map (buf1, &map1, GST_MAP_READ); + gst_buffer_map (buf2, &map2, GST_MAP_READ); + + fail_unless_equals_int (map1.size, map2.size); + fail_unless (memcmp (map1.data, map2.data, map1.size) == 0); + gst_buffer_unmap (buf1, &map1); + gst_buffer_unmap (buf2, &map2); + } + + g_list_free_full (in_buffers, (GDestroyNotify) gst_buffer_unref); + g_list_free_full (expected_buffers, (GDestroyNotify) gst_buffer_unref); + g_list_free_full (result_buffers, (GDestroyNotify) gst_buffer_unref); + + gst_harness_teardown (h); +} + +GST_START_TEST (test_parse_aud_insert) +{ + gboolean inband_aud[] = { + TRUE, FALSE + }; + H264ParseStreamType stream_types[] = { + PACKETIZED_AU, BYTESTREAM_AU, BYTESTREAM_NAL + }; + guint i, j, k; + + for (i = 0; i < G_N_ELEMENTS (inband_aud); i++) { + for (j = 0; j < G_N_ELEMENTS (stream_types); j++) { + for (k = 0; k < G_N_ELEMENTS (stream_types); k++) { + check_aud_insertion (inband_aud[i], stream_types[j], stream_types[k]); + } + } + } +} + +GST_END_TEST; /* * TODO: @@ -1369,6 +1574,7 @@ tcase_add_test (tc_chain, test_parse_sei_closedcaptions); tcase_add_test 
(tc_chain, test_parse_compatible_caps); tcase_add_test (tc_chain, test_parse_skip_to_4bytes_sc); + tcase_add_test (tc_chain, test_parse_aud_insert); nf += gst_check_run_suite (s, "h264parse", __FILE__); }
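In the new AUD-insertion test, nalu_to_memory rewrites the 4-byte Annex-B start code as a big-endian NAL-unit length whenever the stream type is packetized (avc). A standalone sketch of that in-place rewrite (function name hypothetical; the actual test uses GUINT32_TO_BE plus memcpy):

```c
#include <stddef.h>
#include <stdint.h>

/* Replace the leading 00 00 00 01 start code of an Annex-B NAL unit
 * with the big-endian length of the payload that follows it, which is
 * the framing that packetized (avc) caps expect. */
static void
annexb_to_avc (uint8_t *nalu, size_t size)
{
  uint32_t payload = (uint32_t) (size - 4);  /* bytes after the prefix */

  nalu[0] = (uint8_t) (payload >> 24);
  nalu[1] = (uint8_t) (payload >> 16);
  nalu[2] = (uint8_t) (payload >> 8);
  nalu[3] = (uint8_t) payload;
}
```

Because the start code and the length field are both 4 bytes, the buffer size never changes, so the test can build the packetized and byte-stream variants of each NAL from the same arrays.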
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/kate.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/kate.c
Changed
@@ -271,8 +271,7 @@ GstBuffer *buf; GstCaps *caps = NULL; - buf = gst_buffer_new_wrapped (g_memdup (kate_header_0x80, - sizeof (kate_header_0x80)), sizeof (kate_header_0x80)); + buf = gst_buffer_new_memdup (kate_header_0x80, sizeof (kate_header_0x80)); GST_BUFFER_OFFSET (buf) = 0; caps = gst_type_find_helper_for_buffer (NULL, buf, &prob); @@ -344,8 +343,8 @@ gst_check_setup_events (mydecsrcpad, katedec, caps, GST_FORMAT_TIME); gst_caps_unref (caps); - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x80, - sizeof (kate_header_0x80)), sizeof (kate_header_0x80)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x80, sizeof (kate_header_0x80)); ASSERT_BUFFER_REFCOUNT (inbuffer, "inbuffer", 1); gst_buffer_ref (inbuffer); @@ -357,8 +356,8 @@ gst_buffer_unref (inbuffer); fail_unless (g_list_length (buffers) == 0); - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x81, - sizeof (kate_header_0x81)), sizeof (kate_header_0x81)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x81, sizeof (kate_header_0x81)); ASSERT_BUFFER_REFCOUNT (inbuffer, "inbuffer", 1); gst_buffer_ref (inbuffer); @@ -490,8 +489,7 @@ "could not set to playing"); bus = gst_bus_new (); - inbuffer = gst_buffer_new_wrapped (g_memdup (test_string, - strlen (test_string) + 1), strlen (test_string) + 1); + inbuffer = gst_buffer_new_memdup (test_string, strlen (test_string) + 1); GST_BUFFER_TIMESTAMP (inbuffer) = GST_BUFFER_OFFSET (inbuffer) = 1 * GST_SECOND; @@ -544,8 +542,7 @@ "could not set to playing"); bus = gst_bus_new (); - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_spu, sizeof (kate_spu)), - sizeof (kate_spu)); + inbuffer = gst_buffer_new_memdup (kate_spu, sizeof (kate_spu)); GST_BUFFER_TIMESTAMP (inbuffer) = GST_BUFFER_OFFSET (inbuffer) = 1 * GST_SECOND; @@ -662,19 +659,19 @@ gst_caps_unref (caps); /* push headers */ - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x80, - sizeof (kate_header_0x80)), sizeof (kate_header_0x80)); + inbuffer = + gst_buffer_new_memdup 
(kate_header_0x80, sizeof (kate_header_0x80)); GST_BUFFER_OFFSET (inbuffer) = GST_BUFFER_OFFSET_END (inbuffer) = 0; fail_unless_equals_int (gst_pad_push (pad, inbuffer), GST_FLOW_OK); - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x81, - sizeof (kate_header_0x81)), sizeof (kate_header_0x81)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x81, sizeof (kate_header_0x81)); GST_BUFFER_OFFSET (inbuffer) = GST_BUFFER_OFFSET_END (inbuffer) = 0; fail_unless_equals_int (gst_pad_push (pad, inbuffer), GST_FLOW_OK); for (i = 2; i < 8; ++i) { - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x8x, - sizeof (kate_header_0x8x)), sizeof (kate_header_0x8x)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x8x, sizeof (kate_header_0x8x)); fail_if (gst_buffer_map (inbuffer, &info, GST_MAP_WRITE) != TRUE); info.data[0] = 0x80 | i; gst_buffer_unmap (inbuffer, &info); @@ -682,8 +679,8 @@ fail_unless_equals_int (gst_pad_push (pad, inbuffer), GST_FLOW_OK); } - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x88, - sizeof (kate_header_0x88)), sizeof (kate_header_0x88)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x88, sizeof (kate_header_0x88)); GST_BUFFER_OFFSET (inbuffer) = GST_BUFFER_OFFSET_END (inbuffer) = 0; fail_unless_equals_int (gst_pad_push (pad, inbuffer), GST_FLOW_OK); } @@ -705,8 +702,8 @@ test_kate_send_headers (kateparse, myparsesrcpad); /* push a text packet */ - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x00, - sizeof (kate_header_0x00)), sizeof (kate_header_0x00)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x00, sizeof (kate_header_0x00)); GST_BUFFER_TIMESTAMP (inbuffer) = GST_BUFFER_OFFSET (inbuffer) = 1 * GST_SECOND; GST_BUFFER_DURATION (inbuffer) = 5 * GST_SECOND; @@ -714,8 +711,8 @@ fail_unless_equals_int (gst_pad_push (myparsesrcpad, inbuffer), GST_FLOW_OK); /* push a eos packet */ - inbuffer = gst_buffer_new_wrapped (g_memdup (kate_header_0x7f, - sizeof (kate_header_0x7f)), sizeof 
(kate_header_0x7f)); + inbuffer = + gst_buffer_new_memdup (kate_header_0x7f, sizeof (kate_header_0x7f)); GST_BUFFER_TIMESTAMP (inbuffer) = GST_BUFFER_OFFSET (inbuffer) = 6 * GST_SECOND; GST_BUFFER_DURATION (inbuffer) = 0;
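The kate test churn above is a mechanical move from gst_buffer_new_wrapped (g_memdup (data, size), size) to the 1.20 convenience gst_buffer_new_memdup (data, size). Besides passing the size once instead of twice, the newer copy path (g_memdup2 in GLib) takes a gsize rather than g_memdup's guint. The underlying copy is just this (plain-C sketch, helper name hypothetical):

```c
#include <stdlib.h>
#include <string.h>

/* Duplicate a memory region the way g_memdup2 () does: allocate size
 * bytes and copy the source into them. The caller owns (and must
 * free) the returned block; NULL signals allocation failure. */
static void *
mem_dup (const void *data, size_t size)
{
  void *copy = malloc (size);

  if (copy != NULL)
    memcpy (copy, data, size);
  return copy;
}
```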
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/line21.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/line21.c
Changed
@@ -31,7 +31,7 @@ GstHarness *h; GstBuffer *buf, *outbuf; GstVideoInfo info; - GstVideoCaptionMeta *in_cc_meta, *out_cc_meta; + GstVideoCaptionMeta *out_cc_meta; guint i; guint8 empty_data[] = { 0x8c, 0x80, 0x80, 0x0, 0x80, 0x80 }; guint8 full_data[] = { 0x8c, 0x42, 0x43, 0x0, 0x44, 0x45 }; @@ -58,7 +58,6 @@ GST_VIDEO_CAPTION_META_API_TYPE), 1); out_cc_meta = gst_buffer_get_video_caption_meta (outbuf); - fail_unless (out_cc_meta != NULL); fail_unless (out_cc_meta->size == 6); @@ -70,7 +69,7 @@ buf = gst_buffer_new_and_alloc (info.size); gst_buffer_add_video_caption_meta (buf, GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A, full_data, 6); - in_cc_meta = gst_buffer_get_video_caption_meta (buf); + outbuf = gst_harness_push_and_pull (h, buf); fail_unless (outbuf != NULL); @@ -78,7 +77,7 @@ GST_VIDEO_CAPTION_META_API_TYPE), 1); out_cc_meta = gst_buffer_get_video_caption_meta (outbuf); - fail_unless (in_cc_meta != out_cc_meta); + fail_unless (out_cc_meta != NULL); for (i = 0; i < out_cc_meta->size; i++) fail_unless (out_cc_meta->data[i] == full_data[i]); @@ -89,6 +88,45 @@ GST_END_TEST; +GST_START_TEST (remove_caption_meta) +{ + GstHarness *h; + GstBuffer *buf, *outbuf; + GstVideoInfo info; + GstVideoCaptionMeta *out_cc_meta; + guint8 full_data[] = { 0x8c, 0x42, 0x43, 0x0, 0x44, 0x45 }; + GstCaps *caps = gst_caps_new_simple ("video/x-raw", + "format", G_TYPE_STRING, "I420", + "width", G_TYPE_INT, 720, + "height", G_TYPE_INT, 525, + "interlace-mode", G_TYPE_STRING, "interleaved", + NULL); + + h = gst_harness_new_parse ("line21encoder remove-caption-meta=true"); + gst_harness_set_caps (h, gst_caps_ref (caps), gst_caps_ref (caps)); + + gst_video_info_from_caps (&info, caps); + + gst_caps_unref (caps); + + buf = gst_buffer_new_and_alloc (info.size); + gst_buffer_add_video_caption_meta (buf, GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A, + full_data, 6); + + outbuf = gst_harness_push_and_pull (h, buf); + fail_unless (outbuf != NULL); + fail_unless_equals_int (gst_buffer_get_n_meta (outbuf, + 
GST_VIDEO_CAPTION_META_API_TYPE), 0); + + out_cc_meta = gst_buffer_get_video_caption_meta (outbuf); + fail_unless (out_cc_meta == NULL); + + gst_buffer_unref (outbuf); + gst_harness_teardown (h); +} + +GST_END_TEST; + static Suite * line21_suite (void) { @@ -98,6 +136,7 @@ suite_add_tcase (s, tc); tcase_add_test (tc, basic); + tcase_add_test (tc, remove_caption_meta); return s; }
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/mpegtsmux.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/mpegtsmux.c
Changed
@@ -71,7 +71,7 @@ fail_if (srcpad == NULL, "Could not create a srcpad"); if (!(sinkpad = gst_element_get_static_pad (element, sinkname))) - sinkpad = gst_element_get_request_pad (element, sinkname); + sinkpad = gst_element_request_pad_simple (element, sinkname); fail_if (sinkpad == NULL, "Could not get sink pad from %s", GST_ELEMENT_NAME (element)); /* we can't test the reference count of the sinkpad here because it's either @@ -95,7 +95,7 @@ /* clean up floating src pad */ if (!(sinkpad = gst_element_get_static_pad (element, sinkname))) - sinkpad = gst_element_get_request_pad (element, sinkname); + sinkpad = gst_element_request_pad_simple (element, sinkname); srcpad = gst_pad_get_peer (sinkpad); gst_pad_unlink (srcpad, sinkpad);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/mplex.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/mplex.c
Changed
@@ -128,7 +128,7 @@ fail_if (srcpad == NULL, "Could not create a srcpad"); ASSERT_OBJECT_REFCOUNT (srcpad, "srcpad", 1); - sinkpad = gst_element_get_request_pad (element, sinkname); + sinkpad = gst_element_request_pad_simple (element, sinkname); fail_if (sinkpad == NULL, "Could not get sink pad from %s", GST_ELEMENT_NAME (element)); /* references are owned by: 1) us, 2) mplex, 3) mplex list */
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/msdkh264enc.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/msdkh264enc.c
Changed
@@ -47,12 +47,11 @@ GstCaps *srccaps = NULL; GstBus *bus = NULL; + element = gst_check_setup_element ("msdkh264enc"); if (caps) { srccaps = gst_caps_from_string (caps); fail_unless (srccaps != NULL); } - element = gst_check_setup_element ("msdkh264enc"); - fail_unless (element != NULL); srcpad = gst_check_setup_src_pad (element, &h264enc_srctemp); sinkpad = gst_check_setup_sink_pad (element, &h264enc_sinktemp); gst_pad_set_active (srcpad, TRUE); @@ -66,12 +65,10 @@ GST_STATE_PLAYING) != GST_STATE_CHANGE_FAILURE, "could not set to playing"); - if (srccaps) - gst_caps_unref (srccaps); + gst_caps_unref (srccaps); buffers = NULL; return element; - } static void @@ -151,11 +148,16 @@ msdkh264enc_suite (void) { Suite *s = suite_create ("msdkh264enc"); - TCase *tc_chain = tcase_create ("general"); + GstElementFactory *factory; suite_add_tcase (s, tc_chain); - tcase_add_test (tc_chain, msdk_h264enc); + + factory = gst_element_factory_find ("msdkh264enc"); + if (factory) { + tcase_add_test (tc_chain, msdk_h264enc); + gst_object_unref (factory); + } return s; }
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/netsim.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/netsim.c
Changed
@@ -29,10 +29,12 @@ { GstHarness *h = gst_harness_new_parse ("netsim delay-probability=0.5"); GstCaps *caps = gst_caps_from_string ("mycaps"); - GstBuffer *buf = gst_harness_create_buffer (h, 100); + GstBuffer *buf; GstHarnessThread *state, *push; GstSegment segment; + gst_harness_set_src_caps (h, gst_caps_ref (caps)); + buf = gst_harness_create_buffer (h, 100); gst_segment_init (&segment, GST_FORMAT_TIME); state = gst_harness_stress_statechange_start (h); push = gst_harness_stress_push_buffer_start (h, caps, &segment, buf);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/openjpeg.c
Added
@@ -0,0 +1,272 @@ +/* GStreamer + * + * Copyright (c) 2010 Sebastian Dröge <sebastian.droege@collabora.co.uk> + * Copyright (c) 2010 David Schleef <ds@schleef.org> + * Copyright (c) 2014 Thijs Vermeir <thijs.vermeir@barco.com> + * Copyright (c) 2021 Stéphane Cerveau <scerveau@collabora.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. 
+ */ + +#include <gst/check/gstcheck.h> + +static GstStaticPadTemplate enc_sinktemplate = GST_STATIC_PAD_TEMPLATE ("sink", + GST_PAD_SINK, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("image/x-j2c, " + "width = (int) [16, MAX], " + "height = (int) [16, MAX], " "framerate = (fraction) [0, MAX]")); + +static GstStaticPadTemplate enc_srctemplate = GST_STATIC_PAD_TEMPLATE ("src", + GST_PAD_SRC, + GST_PAD_ALWAYS, + GST_STATIC_CAPS ("video/x-raw, " + "format = (string) I420, " + "width = (int) [16, MAX], " + "height = (int) [16, MAX], " "framerate = (fraction) [0, MAX]")); + +#define MAX_THREADS 8 +#define NUM_BUFFERS 4 +#define FRAME_RATE 1000 + +static GstPad *sinkpad, *srcpad; + +typedef struct _OpenJPEGData +{ + GMainLoop *loop; + gboolean failing_pipeline; +} OpenJPEGData; + +static GstElement * +setup_openjpegenc (const gchar * src_caps_str, gint num_stripes) +{ + GstElement *openjpegenc; + GstCaps *srccaps = NULL; + GstBus *bus; + + if (src_caps_str) { + srccaps = gst_caps_from_string (src_caps_str); + fail_unless (srccaps != NULL); + } + + openjpegenc = gst_check_setup_element ("openjpegenc"); + fail_unless (openjpegenc != NULL); + g_object_set (openjpegenc, "num-stripes", num_stripes, NULL); + srcpad = gst_check_setup_src_pad (openjpegenc, &enc_srctemplate); + sinkpad = gst_check_setup_sink_pad (openjpegenc, &enc_sinktemplate); + gst_pad_set_active (srcpad, TRUE); + gst_pad_set_active (sinkpad, TRUE); + + gst_check_setup_events (srcpad, openjpegenc, srccaps, GST_FORMAT_TIME); + + bus = gst_bus_new (); + gst_element_set_bus (openjpegenc, bus); + + fail_unless (gst_element_set_state (openjpegenc, + GST_STATE_PLAYING) != GST_STATE_CHANGE_FAILURE, + "could not set to playing"); + + if (srccaps) + gst_caps_unref (srccaps); + + buffers = NULL; + return openjpegenc; +} + +static void +cleanup_openjpegenc (GstElement * openjpegenc) +{ + GstBus *bus; + + /* Free parsed buffers */ + gst_check_drop_buffers (); + + bus = GST_ELEMENT_BUS (openjpegenc); + gst_bus_set_flushing (bus, 
TRUE); + gst_object_unref (bus); + + gst_pad_set_active (srcpad, FALSE); + gst_pad_set_active (sinkpad, FALSE); + gst_check_teardown_src_pad (openjpegenc); + gst_check_teardown_sink_pad (openjpegenc); + gst_check_teardown_element (openjpegenc); +} + +GST_START_TEST (test_openjpeg_encode_simple) +{ + GstElement *openjpegenc; + GstBuffer *buffer; + gint i; + GList *l; + GstCaps *outcaps, *sinkcaps; + GstSegment seg; + + openjpegenc = + setup_openjpegenc + ("video/x-raw,format=(string)I420,width=(int)320,height=(int)240,framerate=(fraction)25/1", + 1); + + gst_segment_init (&seg, GST_FORMAT_TIME); + seg.stop = gst_util_uint64_scale (10, GST_SECOND, 25); + + fail_unless (gst_pad_push_event (srcpad, gst_event_new_segment (&seg))); + + buffer = gst_buffer_new_allocate (NULL, 320 * 240 + 2 * 160 * 120, NULL); + gst_buffer_memset (buffer, 0, 0, -1); + + for (i = 0; i < 10; i++) { + GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (i, GST_SECOND, 25); + GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (1, GST_SECOND, 25); + fail_unless (gst_pad_push (srcpad, gst_buffer_ref (buffer)) == GST_FLOW_OK); + } + + gst_buffer_unref (buffer); + + fail_unless (gst_pad_push_event (srcpad, gst_event_new_eos ())); + + /* All buffers must be there now */ + fail_unless_equals_int (g_list_length (buffers), 10); + + outcaps = + gst_caps_from_string + ("image/x-j2c,width=(int)320,height=(int)240,framerate=(fraction)25/1"); + + for (l = buffers, i = 0; l; l = l->next, i++) { + buffer = l->data; + + fail_unless_equals_uint64 (GST_BUFFER_DURATION (buffer), + gst_util_uint64_scale (1, GST_SECOND, 25)); + + sinkcaps = gst_pad_get_current_caps (sinkpad); + fail_unless (gst_caps_can_intersect (sinkcaps, outcaps)); + gst_caps_unref (sinkcaps); + } + + gst_caps_unref (outcaps); + + cleanup_openjpegenc (openjpegenc); +} + +GST_END_TEST; + +static gboolean +bus_cb (GstBus * bus, GstMessage * message, gpointer data) +{ + OpenJPEGData *opj_data = (OpenJPEGData *) data; + switch (GST_MESSAGE_TYPE 
(message)) { + case GST_MESSAGE_ERROR:{ + GError *err = NULL; + gchar *debug = NULL; + + gst_message_parse_error (message, &err, &debug); + + GST_ERROR ("Error: %s : %s", err->message, debug); + g_error_free (err); + g_free (debug); + fail_if (!opj_data->failing_pipeline, "failed"); + g_main_loop_quit (opj_data->loop); + break; + } + case GST_MESSAGE_EOS: + fail_if (opj_data->failing_pipeline, "failed"); + g_main_loop_quit (opj_data->loop); + break; + default: + break; + } + return TRUE; +} + +static void +run_openjpeg_pipeline (const gchar * in_format, gint width, gint height, + gint num_stripes, gint enc_threads, gint dec_threads, + gboolean failing_pipeline) +{ + GstBus *bus; + OpenJPEGData opj_data; + GstElement *pipeline; + gchar *pipeline_str = + g_strdup_printf ("videotestsrc num-buffers=%d ! " + "video/x-raw,format=%s, width=%d, height=%d, framerate=%d/1 ! openjpegenc num-stripes=%d num-threads=%d ! jpeg2000parse" + " ! openjpegdec max-threads=%d ! fakevideosink", + NUM_BUFFERS, in_format, width, height, FRAME_RATE, num_stripes, + enc_threads, dec_threads); + GST_LOG ("Running pipeline: %s", pipeline_str); + pipeline = gst_parse_launch (pipeline_str, NULL); + fail_unless (pipeline != NULL); + g_free (pipeline_str); + + opj_data.loop = g_main_loop_new (NULL, FALSE); + opj_data.failing_pipeline = failing_pipeline; + + bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline)); + gst_bus_add_watch (bus, (GstBusFunc) bus_cb, &opj_data); + + gst_element_set_state (pipeline, GST_STATE_PLAYING); + g_main_loop_run (opj_data.loop); + gst_element_set_state (pipeline, GST_STATE_NULL); + + gst_bus_remove_watch (bus); + gst_object_unref (bus); + gst_object_unref (pipeline); + g_main_loop_unref (opj_data.loop); +} + + +GST_START_TEST (test_openjpeg_simple) +{ + int i; + const gchar *in_format_list[] = { + "ARGB64", "ARGB", "xRGB", "AYUV64", "Y444_10LE", "I422_10LE", "I420_10LE", + "AYUV", "Y444", "Y42B", "Y41B", "YUV9", "I420", "GRAY8", "GRAY16_LE" + }; + + for (i = 0; i < 
G_N_ELEMENTS (in_format_list); i++) { + run_openjpeg_pipeline (in_format_list[i], 320, 200, 1, 1, 1, FALSE); + } + + /* Check that the pipeline is failing properly */ + run_openjpeg_pipeline (in_format_list[0], 16, 16, 1, 0, 0, TRUE); + run_openjpeg_pipeline (in_format_list[0], 16, 16, 1, 1, 1, TRUE); + + for (i = 1; i < 8; i++) { + run_openjpeg_pipeline (in_format_list[0], 320, 200, i, 0, 0, FALSE); + run_openjpeg_pipeline (in_format_list[0], 320, 200, i, 1, 0, FALSE); + run_openjpeg_pipeline (in_format_list[0], 320, 200, i, 0, 1, FALSE); + run_openjpeg_pipeline (in_format_list[0], 320, 200, i, 0, 4, FALSE); + run_openjpeg_pipeline (in_format_list[0], 320, 200, i, 5, 3, FALSE); + run_openjpeg_pipeline (in_format_list[0], 320, 200, i, 8, 8, FALSE); + } +} + +GST_END_TEST; + + +static Suite * +openjpeg_suite (void) +{ + Suite *s = suite_create ("openjpeg"); + TCase *tc_chain = tcase_create ("general"); + + suite_add_tcase (s, tc_chain); + + tcase_add_test (tc_chain, test_openjpeg_encode_simple); + tcase_add_test (tc_chain, test_openjpeg_simple); + tcase_set_timeout (tc_chain, 5 * 60); + return s; +} + +GST_CHECK_MAIN (openjpeg);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/pcapparse.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/pcapparse.c
Changed
@@ -97,8 +97,7 @@ data_size = sizeof (zerosize_data); - in_buf = gst_buffer_new_wrapped (g_memdup (zerosize_data, data_size), - data_size); + in_buf = gst_buffer_new_memdup (zerosize_data, data_size); gst_harness_push (h, in_buf); gst_harness_play (h);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/rtpsrc.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/rtpsrc.c
Changed
@@ -28,7 +28,7 @@ /* Sets properties to non-default values (make sure this stays in sync) */ g_object_set (rtpsrc, "uri", "rtp://1.230.1.2:1234?" - "latency=300" "&ttl=8" "&ttl-mc=9", NULL); + "latency=300" "&ttl=8" "&ttl-mc=9" "&multicast-iface=dummy", NULL); g_object_get (rtpsrc, "latency", &latency, "ttl-mc", &ttl_mc, "ttl", &ttl, NULL);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/shm.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/shm.c
Changed
@@ -215,10 +215,10 @@ g_signal_emit_by_name (sink, "pull-sample", &sample); gst_sample_unref (sample); - state_res = gst_element_set_state (consumer, GST_STATE_NULL); + state_res = gst_element_set_state (producer, GST_STATE_NULL); fail_unless (state_res != GST_STATE_CHANGE_FAILURE); - state_res = gst_element_set_state (producer, GST_STATE_NULL); + state_res = gst_element_set_state (consumer, GST_STATE_NULL); fail_unless (state_res != GST_STATE_CHANGE_FAILURE); gst_object_unref (consumer);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/vapostproc.c
Added
@@ -0,0 +1,123 @@ +/* GStreamer + * + * unit test for VA allocators + * + * Copyright (C) 2021 Igalia, S.L. + * Author: Víctor Jáquez <vjaquez@igalia.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +# include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/check/gstcheck.h> +#include <gst/check/gstharness.h> +#include <gst/video/video.h> + +GST_START_TEST (raw_copy) +{ + GstHarness *h; + GstBuffer *buf, *buf_copy; + gboolean ret; + + h = gst_harness_new_parse ("videotestsrc num-buffers=1 ! " + "video/x-raw, width=(int)1024, height=(int)768 ! 
vapostproc"); + ck_assert (h); + + gst_harness_set_sink_caps_str (h, + "video/x-raw, format=(string)NV12, width=(int)3840, height=(int)2160"); + + gst_harness_add_propose_allocation_meta (h, GST_VIDEO_META_API_TYPE, NULL); + gst_harness_play (h); + + buf = gst_harness_pull (h); + ck_assert (buf); + + buf_copy = gst_buffer_new (); + ret = gst_buffer_copy_into (buf_copy, buf, + GST_BUFFER_COPY_MEMORY | GST_BUFFER_COPY_DEEP, 0, -1); + ck_assert (ret); + + gst_clear_buffer (&buf); + gst_clear_buffer (&buf_copy); + + gst_harness_teardown (h); +} + +GST_END_TEST; + +GST_START_TEST (dmabuf_copy) +{ + GstHarness *h; + GstBuffer *buf, *buf_copy; + gboolean ret; + + h = gst_harness_new_parse ("videotestsrc num-buffers=1 ! " + "video/x-raw, width=(int)1024, height=(int)768 ! vapostproc"); + ck_assert (h); + + gst_harness_set_sink_caps_str (h, + "video/x-raw(memory:DMABuf), format=(string)NV12, width=(int)3840, height=(int)2160"); + + gst_harness_add_propose_allocation_meta (h, GST_VIDEO_META_API_TYPE, NULL); + gst_harness_play (h); + + buf = gst_harness_pull (h); + ck_assert (buf); + + buf_copy = gst_buffer_new (); + ret = gst_buffer_copy_into (buf_copy, buf, + GST_BUFFER_COPY_MEMORY | GST_BUFFER_COPY_DEEP, 0, -1); + + if (gst_buffer_n_memory (buf_copy) == 1) + ck_assert (ret == TRUE); + /* else it will depend on the drm modifier */ + + gst_clear_buffer (&buf); + gst_clear_buffer (&buf_copy); + + gst_harness_teardown (h); +} + +GST_END_TEST; + +int +main (int argc, char **argv) +{ + GstElement *vpp; + Suite *s; + TCase *tc_chain; + + gst_check_init (&argc, &argv); + + vpp = gst_element_factory_make ("vapostproc", NULL); + if (!vpp) + return EXIT_SUCCESS; /* not available vapostproc */ + gst_object_unref (vpp); + + s = suite_create ("va"); + tc_chain = tcase_create ("copy"); + + suite_add_tcase (s, tc_chain); + + tcase_add_test (tc_chain, raw_copy); + tcase_add_test (tc_chain, dmabuf_copy); + + return gst_check_run_suite (s, "va", __FILE__); +}
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/vp9parse.c
Added
@@ -0,0 +1,160 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/check/gstcheck.h> +#include <gst/check/gstharness.h> +#include "vp9parse.h" +#include <string.h> + +typedef struct +{ + const guint8 *data; + guint len; + gboolean superframe; + guint subframe_len[2]; +} GstVp9ParseTestFrameData; + +static void +run_split_superframe_with_caps (const gchar * in_caps) +{ + GstHarness *h; + GstBuffer *in_buf, *out_buf = NULL; + GstMapInfo map; + GstFlowReturn ret; + gint i = 0; + GstVp9ParseTestFrameData frames[] = { + {profile_0_frame0, profile_0_frame0_len, FALSE, {profile_0_frame0_len, 0}}, + {profile_0_frame1, profile_0_frame1_len, TRUE, {profile_0_frame1_first_len, + profile_0_frame1_last_len}}, + {profile_0_frame2, profile_0_frame2_len, FALSE, {profile_0_frame2_len, 0}}, + }; + + h = gst_harness_new_parse ("vp9parse"); + fail_unless (h != NULL); + + gst_harness_set_sink_caps_str (h, "video/x-vp9,alignment=(string)frame"); + + /* default alignment is super-frame */ + gst_harness_set_src_caps_str (h, in_caps); + + gst_harness_play (h); + for (i = 0; i < G_N_ELEMENTS (frames); i++) { + in_buf = gst_buffer_new_and_alloc 
(frames[i].len); + gst_buffer_map (in_buf, &map, GST_MAP_WRITE); + memcpy (map.data, frames[i].data, frames[i].len); + gst_buffer_unmap (in_buf, &map); + + ret = gst_harness_push (h, in_buf); + fail_unless (ret == GST_FLOW_OK, "GstFlowReturn was %s", + gst_flow_get_name (ret)); + out_buf = gst_harness_try_pull (h); + fail_unless (out_buf); + fail_unless_equals_int (gst_buffer_get_size (out_buf), + frames[i].subframe_len[0]); + + if (i == 0) { + GstEvent *event; + GstCaps *caps = NULL; + GstStructure *s; + const gchar *profile; + gint width, height; + + fail_if (GST_BUFFER_FLAG_IS_SET (out_buf, GST_BUFFER_FLAG_DELTA_UNIT)); + + while ((event = gst_harness_try_pull_event (h))) { + GstCaps *event_caps; + if (GST_EVENT_TYPE (event) != GST_EVENT_CAPS) { + gst_event_unref (event); + continue; + } + + gst_event_parse_caps (event, &event_caps); + gst_caps_replace (&caps, event_caps); + gst_event_unref (event); + } + + fail_unless (caps != NULL); + s = gst_caps_get_structure (caps, 0); + fail_unless (gst_structure_get_int (s, "width", &width)); + fail_unless (gst_structure_get_int (s, "height", &height)); + fail_unless ((profile = gst_structure_get_string (s, "profile"))); + + fail_unless_equals_int (width, 256); + fail_unless_equals_int (height, 144); + fail_unless_equals_string (profile, "0"); + gst_caps_unref (caps); + } else { + fail_unless (GST_BUFFER_FLAG_IS_SET (out_buf, + GST_BUFFER_FLAG_DELTA_UNIT)); + } + + if (frames[i].superframe) { + /* this is decoding only frame */ + fail_unless (GST_BUFFER_FLAG_IS_SET (out_buf, + GST_BUFFER_FLAG_DECODE_ONLY)); + fail_unless (GST_BUFFER_FLAG_IS_SET (out_buf, + GST_BUFFER_FLAG_DELTA_UNIT)); + gst_clear_buffer (&out_buf); + + out_buf = gst_harness_try_pull (h); + fail_unless (out_buf); + fail_unless_equals_int (gst_buffer_get_size (out_buf), + frames[i].subframe_len[1]); + fail_unless (GST_BUFFER_FLAG_IS_SET (out_buf, + GST_BUFFER_FLAG_DELTA_UNIT)); + } + + gst_clear_buffer (&out_buf); + } + + gst_harness_teardown (h); +} + 
+GST_START_TEST (test_split_superframe) +{ + /* vp9parse will split frame if downstream alignment is frame + * whatever the upstream alignment was specified */ + run_split_superframe_with_caps ("video/x-vp9"); + run_split_superframe_with_caps ("video/x-vp9,alignment=(string)super-frame"); + run_split_superframe_with_caps ("video/x-vp9,alignment=(string)frame"); +} + +GST_END_TEST; + +static Suite * +vp9parse_suite (void) +{ + Suite *s; + TCase *tc_chain; + + s = suite_create ("vp9parse"); + tc_chain = tcase_create ("general"); + + suite_add_tcase (s, tc_chain); + + tcase_add_test (tc_chain, test_split_superframe); + + return s; +} + +GST_CHECK_MAIN (vp9parse);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/vp9parse.h
Added
@@ -0,0 +1,1305 @@ +/* GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include <gst/gst.h> + +/* Part of vp9 test bitstream available at + * https://www.webmproject.org/vp9/levels/ + */ + +/* crowd_run_256X144_fr15_bd8_8buf_l1.webm + * 256x144 + * profile0 + */ +static const guint8 profile_0_frame0[] = { + 0x82, 0x49, 0x83, 0x42, 0x00, 0x0f, 0xf0, 0x08, 0xf4, 0x18, 0x38, 0x24, + 0x1c, 0x18, 0xee, 0x00, 0x02, 0x20, 0x7f, 0xd1, 0x5d, 0x32, 0xd6, 0x6f, + 0x51, 0xe2, 0x8f, 0x24, 0xca, 0xed, 0xd1, 0x9b, 0x97, 0x35, 0x93, 0x88, + 0xde, 0x8c, 0xc7, 0xa7, 0x75, 0x52, 0xb3, 0x63, 0xa2, 0xb8, 0xe9, 0x00, + 0x17, 0xba, 0x18, 0x00, 0x7a, 0xa9, 0xff, 0xff, 0xd9, 0x9e, 0x97, 0xd4, + 0x01, 0x93, 0x4c, 0xd3, 0xc5, 0x2f, 0xbd, 0xe0, 0x13, 0xef, 0x24, 0x4c, + 0x27, 0xc9, 0x3e, 0x91, 0x16, 0x22, 0xe0, 0x91, 0x88, 0x6b, 0xce, 0x67, + 0xa1, 0xde, 0xa4, 0x3f, 0xa2, 0x5c, 0x55, 0x6f, 0x27, 0xce, 0x96, 0x42, + 0x72, 0x99, 0xe2, 0x6a, 0x79, 0x48, 0xbc, 0x1f, 0x12, 0x52, 0x13, 0x49, + 0x92, 0xb1, 0xb3, 0x69, 0x25, 0x1e, 0x35, 0x86, 0x03, 0x4f, 0x68, 0x53, + 0xa7, 0x1f, 0xf8, 0xf1, 0x46, 0xa6, 0xd2, 0x4b, 0xe3, 0xc1, 0x61, 0x56, + 0x09, 0x2d, 0x59, 0xfc, 0x40, 0xb7, 0x8b, 0x5b, 0x3c, 0xcc, 0xf8, 0x66, + 0xfc, 0x98, 
0xda, 0xbc, 0x08, 0x01, 0xb2, 0x7b, 0x8d, 0x6d, 0x14, 0x4f, + 0x86, 0xf9, 0x56, 0xf7, 0x5a, 0xb5, 0x50, 0x44, 0xef, 0xca, 0x71, 0x7e, + 0x21, 0xfe, 0x15, 0x8d, 0xaa, 0xfe, 0x5b, 0x87, 0xc0, 0xed, 0x76, 0x91, + 0x2c, 0xd2, 0x61, 0x6e, 0xd9, 0x52, 0x8c, 0x3b, 0x46, 0xb3, 0x72, 0x21, + 0xac, 0xbb, 0xc3, 0x15, 0x54, 0x3f, 0x2d, 0x2d, 0xcf, 0x73, 0xce, 0x6d, + 0x4c, 0x36, 0xde, 0xdd, 0xa5, 0xd4, 0xa4, 0x9f, 0xfb, 0x9e, 0x93, 0xc7, + 0xf6, 0x86, 0x24, 0x15, 0x15, 0x2e, 0x27, 0x9b, 0x61, 0x5e, 0x92, 0x67, + 0x93, 0x6e, 0x04, 0x1a, 0x0f, 0x98, 0x51, 0x12, 0xc6, 0xb3, 0x07, 0x80, + 0x3e, 0xd4, 0x12, 0x29, 0x5d, 0x75, 0x05, 0x6f, 0xfd, 0x55, 0x6b, 0xaf, + 0xc0, 0x78, 0x21, 0xd8, 0xc0, 0xb1, 0x75, 0x99, 0x81, 0xea, 0xf8, 0x03, + 0xb7, 0x3f, 0x1a, 0x6f, 0x9e, 0xda, 0x3d, 0x39, 0xe2, 0xbb, 0xc4, 0xc6, + 0xcd, 0xb4, 0xc8, 0x4c, 0xcc, 0xcd, 0xd6, 0x32, 0xbf, 0x15, 0x21, 0xcf, + 0x4d, 0x00, 0xcc, 0x69, 0x29, 0x77, 0x2c, 0xd2, 0xfb, 0x82, 0x3f, 0x7a, + 0xb0, 0xbd, 0x75, 0x9a, 0x38, 0xcf, 0x91, 0x8f, 0x1b, 0x5e, 0x4b, 0x35, + 0x06, 0x5a, 0xef, 0x36, 0x4a, 0x3c, 0x95, 0xf8, 0xb4, 0x5a, 0x5a, 0x9e, + 0xc1, 0x68, 0x16, 0xfc, 0xee, 0x85, 0x81, 0xfa, 0xca, 0x58, 0x0d, 0x70, + 0x24, 0x5d, 0xa0, 0x00, 0x88, 0xbd, 0x57, 0x35, 0x16, 0xaa, 0xe0, 0xec, + 0x0f, 0xf0, 0xca, 0xf0, 0xe5, 0x59, 0x44, 0x2a, 0xf3, 0xa3, 0x74, 0x1f, + 0xce, 0x19, 0xa3, 0x65, 0x26, 0x38, 0x71, 0xdb, 0x08, 0x86, 0x1d, 0xef, + 0x0a, 0x11, 0x62, 0xc1, 0x91, 0xf1, 0x2a, 0x6f, 0xa2, 0xa5, 0x25, 0x15, + 0x70, 0x90, 0x29, 0xe8, 0xb7, 0xea, 0x45, 0x6a, 0xdd, 0xf2, 0x0e, 0x17, + 0x0d, 0x02, 0xe1, 0x6e, 0x6a, 0xbf, 0xa2, 0xd6, 0x03, 0x88, 0xba, 0x2a, + 0xd7, 0x30, 0xa4, 0x77, 0xd9, 0xf6, 0x36, 0x8f, 0xc1, 0x5f, 0xe8, 0xc3, + 0x9f, 0xc2, 0xf1, 0xf8, 0xfb, 0x6c, 0x53, 0x98, 0x22, 0x52, 0x1e, 0x32, + 0x96, 0xd8, 0x9b, 0x70, 0xe7, 0x37, 0x42, 0x73, 0x67, 0x9f, 0x67, 0xe6, + 0x4c, 0xe5, 0xac, 0x49, 0xcc, 0x96, 0x5a, 0x4c, 0xa4, 0x33, 0xd4, 0x8e, + 0xf2, 0x68, 0x83, 0x3d, 0x49, 0x1b, 0x8c, 0x91, 0x5b, 0x2e, 0xa3, 0x38, + 0x38, 0xb8, 
0x32, 0x76, 0xed, 0x9a, 0x5e, 0x9a, 0xf2, 0x8d, 0x22, 0x5a, + 0x42, 0xa0, 0x3b, 0xdc, 0x74, 0x1c, 0x50, 0xff, 0xa7, 0xe1, 0x12, 0xbb, + 0xe8, 0x41, 0x6c, 0xe3, 0x6a, 0xde, 0x62, 0x42, 0xed, 0xba, 0xcc, 0x25, + 0xdd, 0xcc, 0x08, 0x7d, 0xd4, 0x3b, 0xac, 0xa1, 0x2f, 0xd1, 0xa7, 0x5a, + 0x79, 0xf6, 0x8a, 0x06, 0xd7, 0xd5, 0x39, 0xea, 0xe1, 0x5e, 0xc0, 0xca, + 0x60, 0x1f, 0xd4, 0xf0, 0x06, 0x31, 0x20, 0xab, 0x2e, 0xed, 0x45, 0x00, + 0x9d, 0xfc, 0x7f, 0xf0, 0x3d, 0xfd, 0x12, 0x01, 0xa7, 0x06, 0x29, 0x51, + 0xb0, 0xb7, 0x3e, 0x12, 0x24, 0xfd, 0xf4, 0x81, 0xc9, 0xc6, 0x80, 0xb5, + 0x8f, 0x79, 0x28, 0x91, 0xc6, 0x49, 0x75, 0x5b, 0x8c, 0x8b, 0xc0, 0x06, + 0xc4, 0x0a, 0xa5, 0x05, 0xed, 0xd8, 0xf9, 0x7a, 0x2e, 0x2c, 0x6b, 0x03, + 0x4b, 0x61, 0x78, 0x8f, 0x1e, 0x4d, 0x72, 0xed, 0xd4, 0x3a, 0x73, 0x3d, + 0x2c, 0xca, 0xe6, 0xb8, 0x9c, 0x68, 0x71, 0x9f, 0x20, 0x75, 0x4a, 0xd1, + 0x85, 0x6d, 0xe2, 0x05, 0x54, 0x8d, 0xd0, 0x45, 0xc4, 0x6e, 0xa0, 0xfa, + 0x8e, 0xf7, 0x55, 0xf9, 0x46, 0x5c, 0xe3, 0x1a, 0xe7, 0x45, 0xfc, 0xa5, + 0xe9, 0xd0, 0xc6, 0x68, 0x07, 0x0a, 0x0d, 0x29, 0x70, 0x16, 0x17, 0x02, + 0x70, 0x93, 0x00, 0x5a, 0x7c, 0x07, 0x3f, 0x85, 0x4a, 0x9a, 0x23, 0x0b, + 0x14, 0xf9, 0x26, 0x27, 0x8c, 0xf4, 0x33, 0x84, 0x1c, 0xba, 0xff, 0xf7, + 0x99, 0x49, 0xe7, 0xc9, 0xfd, 0x09, 0x8f, 0x32, 0xce, 0x03, 0xfe, 0x2c, + 0xe7, 0x46, 0x2c, 0xdf, 0xd7, 0xce, 0xef, 0x35, 0x37, 0xe9, 0x4f, 0xd6, + 0xd2, 0x8b, 0x5b, 0x79, 0x01, 0x39, 0xc2, 0xa8, 0xf2, 0x2c, 0x01, 0x84, + 0x30, 0x9c, 0x26, 0xb0, 0x74, 0xdd, 0x7a, 0xcc, 0xa8, 0x36, 0x45, 0xc5, + 0x73, 0xc5, 0xbe, 0xa6, 0x1a, 0x01, 0x44, 0xc9, 0x04, 0x96, 0x99, 0x53, + 0xc8, 0xa7, 0xb0, 0xd9, 0xca, 0x54, 0x06, 0xcb, 0xa9, 0xb2, 0x98, 0x94, + 0x07, 0x11, 0xff, 0xa5, 0x28, 0xb4, 0xb6, 0x17, 0xe5, 0xee, 0x7f, 0x75, + 0xc4, 0xbd, 0x63, 0x07, 0x50, 0x73, 0xd1, 0x18, 0x48, 0x42, 0xd3, 0x61, + 0x4f, 0xbe, 0xbe, 0x67, 0x9c, 0xd6, 0xe6, 0x41, 0xcb, 0x7c, 0x54, 0x26, + 0x99, 0x45, 0x33, 0x6f, 0xb7, 0xf8, 0xdb, 0xd9, 0x96, 0x3c, 0xae, 0x7d, + 0x37, 0xa8, 
0x52, 0xdc, 0x5a, 0xe0, 0x41, 0x5b, 0xac, 0x0f, 0xc0, 0x9f, + 0x62, 0x93, 0x73, 0xd6, 0x4a, 0x06, 0x2c, 0x5d, 0x47, 0x3f, 0x7b, 0xb2, + 0x8f, 0x9d, 0x29, 0x05, 0xfa, 0x38, 0xc6, 0x43, 0x55, 0x1e, 0x26, 0x26, + 0xb8, 0x8f, 0x66, 0xd5, 0x23, 0x0d, 0x22, 0x03, 0x45, 0x8c, 0x64, 0xef, + 0xea, 0xc7, 0x93, 0x20, 0x47, 0xae, 0xe5, 0x72, 0xff, 0x7e, 0xb5, 0x48, + 0x67, 0x83, 0x32, 0x2c, 0xb1, 0x7f, 0xdb, 0xf5, 0xf7, 0x27, 0x5a, 0xc7, + 0xc4, 0x37, 0xe8, 0x65, 0x51, 0x0c, 0xa6, 0xc9, 0x8a, 0x5a, 0x41, 0x4f, + 0xa1, 0xf3, 0xee, 0xc4, 0xbb, 0xd9, 0xb8, 0x63, 0xad, 0x79, 0x53, 0xad, + 0x10, 0x36, 0xbb, 0x68, 0xcc, 0xf9, 0x70, 0xff, 0x78, 0xce, 0x58, 0x23, + 0x66, 0xac, 0x7c, 0x0f, 0x61, 0xe0, 0xf0, 0x94, 0x47, 0xbb, 0x23, 0x66, + 0xc5, 0x0d, 0x52, 0x3e, 0x8f, 0xe4, 0xa6, 0x2a, 0x6a, 0x93, 0x2d, 0x74, + 0x19, 0x1e, 0xc1, 0xa3, 0xcc, 0x76, 0x62, 0x43, 0x46, 0x59, 0x41, 0xc2, + 0x1a, 0x1d, 0xf7, 0xa4, 0xf1, 0xa9, 0xaf, 0xfe, 0xdd, 0xe3, 0x32, 0xe1, + 0x19, 0x2f, 0xa9, 0x9b, 0xa4, 0xc5, 0xee, 0xc5, 0x6c, 0xcf, 0xdf, 0xaf, + 0x40, 0x79, 0x91, 0xd6, 0x6f, 0x58, 0x5f, 0x31, 0x7e, 0xa8, 0x26, 0x61, + 0xf2, 0x88, 0xc5, 0xb5, 0x8a, 0xa5, 0x5e, 0xfc, 0x0b, 0xae, 0xba, 0x77, + 0x71, 0xcd, 0xca, 0xd9, 0x91, 0xee, 0x52, 0x6e, 0xee, 0x2f, 0xbb, 0x11, + 0x40, 0x97, 0x59, 0x4c, 0xd7, 0x1d, 0x47, 0xc4, 0xce, 0xaf, 0x71, 0xcc, + 0xc8, 0x2e, 0x3f, 0x6a, 0x8b, 0x57, 0x6f, 0xd2, 0xd7, 0x2e, 0x7c, 0x1e, + 0x50, 0x5f, 0x09, 0x2c, 0x7c, 0x57, 0x86, 0xb6, 0xf0, 0x94, 0x23, 0x1e, + 0x2f, 0x8e, 0x50, 0xa2, 0x03, 0x43, 0x32, 0x13, 0x59, 0x4e, 0x29, 0x10, + 0x0f, 0xfb, 0x9b, 0xa7, 0xfc, 0xe1, 0x5a, 0x26, 0x55, 0x84, 0xdd, 0x71, + 0xd0, 0xe3, 0x92, 0x1d, 0xba, 0x98, 0x6e, 0x78, 0xe4, 0x5a, 0xf8, 0x2e, + 0x35, 0x42, 0x94, 0x80, 0xf4, 0x77, 0x36, 0x2b, 0x23, 0xc9, 0x14, 0x94, + 0x47, 0x19, 0x1e, 0x4e, 0x5c, 0x93, 0x0e, 0x31, 0x35, 0xbb, 0xc7, 0x24, + 0x18, 0x9f, 0xca, 0x25, 0xa2, 0x3c, 0x3d, 0xa0, 0xb1, 0x5d, 0xac, 0x15, + 0x57, 0x57, 0x56, 0x40, 0x2a, 0x66, 0x64, 0xe2, 0xa8, 0x69, 0x4d, 0xda, + 0xed, 0xe1, 
0x8e, 0x24, 0xd0, 0xc4, 0xdd, 0x49, 0x55, 0xa8, 0xca, 0x9f, + 0x83, 0xd7, 0x93, 0xc9, 0x96, 0x65, 0xf2, 0x01, 0xca, 0xd7, 0x3e, 0x31, + 0x9f, 0x16, 0x3e, 0x77, 0x99, 0x3d, 0x51, 0x82, 0xc7, 0x20, 0x3c, 0x5d, + 0xa1, 0x76, 0xa5, 0x4b, 0xb3, 0x8a, 0xb2, 0x8b, 0x79, 0xdd, 0x2c, 0x29, + 0x2c, 0x3a, 0x0a, 0x6f, 0xdd, 0xaa, 0x2a, 0xb8, 0xd5, 0x3d, 0xca, 0x54, + 0x0f, 0x6a, 0x1a, 0x25, 0x89, 0xfb, 0x52, 0x94, 0xfe, 0xcd, 0xa9, 0x04, + 0x86, 0x88, 0x0e, 0x59, 0x81, 0x22, 0xd3, 0xea, 0x02, 0xe7, 0xc4, 0xf3, + 0x2c, 0x12, 0x20, 0x67, 0xbe, 0x0f, 0x8f, 0x5c, 0x7d, 0xba, 0x7f, 0x72, + 0x14, 0xb0, 0x4a, 0x6d, 0x6f, 0x38, 0x52, 0x8b, 0x1f, 0xb3, 0x6e, 0xad, + 0x3c, 0x86, 0x7c, 0x84, 0x86, 0x6d, 0xea, 0x55, 0xaf, 0x68, 0xd7, 0x0f, + 0xb0, 0x35, 0xb8, 0x00, 0x6b, 0x37, 0x2a, 0x3b, 0x33, 0xca, 0xfd, 0x68, + 0xb4, 0x79, 0x2c, 0x58, 0xcb, 0x00, 0x39, 0x74, 0x3a, 0x21, 0x55, 0x8d, + 0x1e, 0xbd, 0x66, 0x62, 0x83, 0xa1, 0x82, 0xe6, 0x06, 0xb4, 0xdd, 0x09, + 0xa4, 0xdc, 0x3f, 0x66, 0xe8, 0xac, 0xf2, 0x1d, 0x79, 0x19, 0xa3, 0x76, + 0x0d, 0x99, 0xa5, 0xc9, 0xb9, 0xb0, 0x00, 0x5d, 0x9e, 0x48, 0xe6, 0xa4, + 0xd4, 0xf9, 0x6b, 0xbb, 0xa9, 0x0a, 0xab, 0x5e, 0xf6, 0x52, 0x8b, 0x3b, + 0x8a, 0x3b, 0x34, 0x29, 0x52, 0xb4, 0x07, 0xe6, 0x28, 0xbf, 0xde, 0x4b, + 0x8e, 0x20, 0xc2, 0x90, 0x10, 0x5f, 0x13, 0xa1, 0xc3, 0x1c, 0xed, 0xd2, + 0xaa, 0x63, 0x82, 0x6c, 0xed, 0xf4, 0x1f, 0x37, 0xf2, 0xf2, 0x26, 0xa0, + 0xf3, 0x9d, 0xfb, 0x6d, 0xc3, 0x38, 0xaf, 0xb2, 0x5c, 0x73, 0x2b, 0x7e, + 0xab, 0x5e, 0x17, 0x08, 0x64, 0x7a, 0x4f, 0xf9, 0x87, 0x65, 0x4a, 0xe9, + 0x45, 0x17, 0x44, 0x03, 0xb1, 0xdd, 0x25, 0x24, 0x9e, 0x77, 0x7e, 0xd8, + 0xa5, 0x90, 0xab, 0x74, 0x33, 0x83, 0x37, 0xc3, 0xee, 0xac, 0xd2, 0xad, + 0x47, 0x57, 0x1e, 0x60, 0x51, 0x0c, 0xc4, 0x91, 0x4a, 0xb1, 0x54, 0x98, + 0x50, 0x7c, 0x52, 0xaf, 0xbb, 0x5c, 0xa9, 0x6b, 0x18, 0x73, 0x98, 0xa6, + 0x4c, 0xf2, 0x19, 0xb0, 0x6c, 0x86, 0x98, 0x41, 0x4a, 0xac, 0x99, 0x38, + 0x72, 0x82, 0x7e, 0x3e, 0xab, 0xcc, 0xf8, 0x97, 0x3a, 0xd2, 0x09, 0xa1, + 0xfd, 0x75, 
0x85, 0x60, 0x02, 0x76, 0xa4, 0x62, 0x6c, 0x66, 0x9d, 0xc7, + 0x54, 0x2f, 0xf0, 0xee, 0x66, 0x68, 0x31, 0x76, 0xd8, 0x88, 0x1d, 0xfd, + 0x81, 0x41, 0x6c, 0x86, 0x51, 0x2c, 0x82, 0xd1, 0x7b, 0x4f, 0x6c, 0x2c, + 0xea, 0x1f, 0xc0, 0xc0, 0x88, 0xc2, 0xa3, 0x6f, 0x9e, 0xa4, 0x8e, 0x8e, + 0x90, 0x5d, 0xc1, 0xbe, 0x58, 0xb2, 0xcc, 0xea, 0x01, 0x6e, 0x98, 0x5e, + 0xac, 0x93, 0xb3, 0x39, 0xca, 0x1a, 0x49, 0x34, 0x39, 0x11, 0x2f, 0xe0, + 0x49, 0x49, 0x8e, 0xe7, 0xdf, 0x79, 0x8c, 0x4f, 0xba, 0x0b, 0x09, 0x6f, + 0xe1, 0x03, 0xdc, 0xea, 0xdf, 0x42, 0x1d, 0xa2, 0x0f, 0x3f, 0x08, 0x82, + 0xb2, 0xd8, 0xb2, 0xd4, 0xb2, 0xea, 0xa9, 0x51, 0xc2, 0x8f, 0x14, 0x76, + 0x91, 0xd9, 0x0d, 0x6a, 0x6a, 0xc9, 0xf8, 0xf5, 0xf4, 0x48, 0xbf, 0xe5, + 0x34, 0x74, 0x41, 0x84, 0xe7, 0x05, 0xb9, 0xa9, 0xdd, 0xa6, 0x8a, 0x48, + 0xa4, 0x24, 0x38, 0xa8, 0x6b, 0x68, 0xd8, 0x5f, 0x0a, 0xd1, 0x1f, 0x01, + 0xec, 0xf2, 0x85, 0x6c, 0x36, 0xde, 0x2f, 0xdd, 0xc8, 0x53, 0x6f, 0xf1, + 0x53, 0xfa, 0x35, 0x24, 0x47, 0xd2, 0x14, 0xf5, 0xab, 0x56, 0xa1, 0xe8, + 0x95, 0x3d, 0x44, 0x89, 0x7c, 0x6e, 0xd3, 0x4f, 0x57, 0xd5, 0xde, 0x93, + 0x0f, 0xa9, 0x33, 0xf9, 0x0e, 0x6e, 0x85, 0x93, 0x6a, 0xf2, 0xda, 0xdb, + 0xc1, 0x42, 0x89, 0x0b, 0xbb, 0xc1, 0x81, 0x1d, 0x3e, 0xf7, 0xf8, 0x11, + 0xa2, 0x5f, 0x26, 0x81, 0x75, 0x35, 0xd3, 0xa0, 0xdc, 0xcc, 0x14, 0xd7, + 0x89, 0xb1, 0x1d, 0x90, 0xee, 0x62, 0x7f, 0x28, 0x74, 0xce, 0xa6, 0xc8, + 0x1c, 0x46, 0xb3, 0x4f, 0x55, 0x35, 0xab, 0xfe, 0xd0, 0x3e, 0xa7, 0xd8, + 0x79, 0x1c, 0x22, 0xc4, 0xa2, 0x26, 0xf6, 0x40, 0x4e, 0x91, 0xb3, 0x68, + 0x43, 0xdb, 0x66, 0xde, 0xa7, 0xb0, 0xc9, 0x28, 0xb1, 0x36, 0xdc, 0x3e, + 0x1b, 0x19, 0x4e, 0xfb, 0xac, 0xde, 0x13, 0xd0, 0x02, 0x42, 0x76, 0x79, + 0xdc, 0xf2, 0xea, 0xc7, 0xcf, 0x16, 0xab, 0x19, 0x42, 0xaa, 0x2a, 0x59, + 0x79, 0xe8, 0xf5, 0xb0, 0x66, 0x92, 0xfc, 0x98, 0x1b, 0x76, 0x46, 0x31, + 0x65, 0x47, 0xe1, 0x7b, 0x70, 0x4a, 0x06, 0xa7, 0xdb, 0xc6, 0xc2, 0x8b, + 0xca, 0x4c, 0x58, 0x6a, 0x11, 0xa3, 0xb7, 0xe2, 0xca, 0x06, 0x25, 0xdc, + 0xc3, 0xc1, 
0x37, 0xbb, 0xb7, 0xab, 0xce, 0x12, 0x43, 0xfe, 0x8b, 0xa9, + 0x01, 0x7d, 0xaf, 0xac, 0x89, 0x28, 0x2d, 0x2c, 0xc2, 0x4c, 0x87, 0xcf, + 0xe8, 0x4c, 0x5b, 0x87, 0x7f, 0xb5, 0x22, 0xa1, 0xd9, 0x40, 0x4d, 0xa9, + 0x66, 0x10, 0x8b, 0x1b, 0xeb, 0x3f, 0x47, 0x5b, 0x67, 0x34, 0xb2, 0xf6, + 0x84, 0x88, 0x54, 0x66, 0xf0, 0x5d, 0x9d, 0xa5, 0x27, 0x66, 0x55, 0xf7, + 0x42, 0x3d, 0xd5, 0xca, 0xca, 0x6e, 0xbc, 0xfd, 0x37, 0x39, 0x99, 0xdb, + 0xa1, 0xa2, 0x4c, 0x69, 0x43, 0x42, 0xc8, 0xf5, 0xb9, 0x8f, 0xbe, 0x52, + 0xf8, 0xcc, 0x87, 0xf6, 0x0e, 0x30, 0xc5, 0x2b, 0xf4, 0xa8, 0x85, 0xab, + 0x8f, 0xa2, 0x11, 0xbc, 0x4f, 0x22, 0x9c, 0x94, 0x44, 0x6b, 0x86, 0x0e, + 0x3f, 0xaa, 0x5c, 0xb7, 0x41, 0x1b, 0xed, 0x83, 0x88, 0xf3, 0x0e, 0x0c, + 0xea, 0x19, 0x9d, 0xb5, 0x9f, 0xe2, 0xe8, 0xa2, 0x9f, 0x4b, 0x0f, 0x9c, + 0xe5, 0x0f, 0xc9, 0x18, 0xa4, 0x3b, 0x86, 0x16, 0x15, 0x45, 0x2b, 0x97, + 0x8f, 0x7a, 0xf3, 0x18, 0x30, 0xf8, 0x97, 0xdc, 0x3e, 0xc4, 0xca, 0x26, + 0xa0, 0x24, 0x53, 0xd9, 0xcd, 0x8e, 0x5c, 0xc0, 0x95, 0x31, 0xc2, 0xf7, + 0x24, 0x5f, 0x19, 0x3e, 0x5b, 0x4b, 0x08, 0x13, 0x4d, 0x8f, 0x77, 0x15, + 0xf1, 0x67, 0xbe, 0x55, 0xa8, 0x7c, 0xf0, 0xcc, 0x97, 0x83, 0x7b, 0xaa, + 0x8c, 0xab, 0xa4, 0xb7, 0xe7, 0xdc, 0x5d, 0x62, 0xc5, 0x6a, 0x21, 0xf6, + 0xe3, 0xca, 0x4c, 0x96, 0x44, 0x06, 0xaa, 0x38, 0x6f, 0xc4, 0x88, 0x8c, + 0x58, 0x54, 0xdb, 0x7f, 0x1c, 0xac, 0x14, 0x15, 0x7c, 0xce, 0x26, 0x79, + 0x4b, 0xae, 0xa0, 0x24, 0xb8, 0xec, 0xb6, 0xfb, 0x8c, 0x35, 0x9b, 0x16, + 0xcb, 0xc0, 0x96, 0x47, 0xf2, 0x18, 0xff, 0xf7, 0x91, 0x02, 0x87, 0xe1, + 0xb3, 0xd0, 0xa4, 0x29, 0x13, 0x25, 0xb7, 0xde, 0x8f, 0xa6, 0xe4, 0x22, + 0x6c, 0x92, 0x3e, 0x29, 0x90, 0x3d, 0xf2, 0xa5, 0xff, 0xeb, 0xf9, 0x0a, + 0xa9, 0x3b, 0xd3, 0x50, 0xbb, 0x96, 0x18, 0x2c, 0x8c, 0xdf, 0xc2, 0x5b, + 0x8f, 0x83, 0x66, 0x3c, 0x64, 0x81, 0x1b, 0x6f, 0x26, 0x47, 0x46, 0x48, + 0x79, 0x06, 0x19, 0x07, 0xb8, 0xab, 0xc2, 0x58, 0x22, 0xd7, 0x00, 0xed, + 0xcd, 0x3c, 0x13, 0x81, 0xe8, 0x72, 0x7d, 0x97, 0xa9, 0x9a, 0xc2, 0x21, + 0xe5, 0x9a, 
0xf1, 0xba, 0xfa, 0xb7, 0xe8, 0x0f, 0xfb, 0xa9, 0x4a, 0xd9, + 0x23, 0x7a, 0xe1, 0x06, 0x04, 0x14, 0x23, 0x77, 0x46, 0x93, 0x91, 0xc4, + 0xd2, 0x7c, 0x24, 0x4c, 0xad, 0xee, 0xba, 0x7d, 0xd0, 0x11, 0xd2, 0x3a, + 0x0f, 0x46, 0xae, 0x21, 0xe6, 0x08, 0x43, 0x12, 0x2c, 0x95, 0x4f, 0x52, + 0x87, 0x8e, 0x51, 0x35, 0xf4, 0x28, 0x3a, 0x12, 0x55, 0x14, 0x33, 0x5d, + 0xd2, 0xd2, 0x14, 0xa1, 0xcd, 0x83, 0x4d, 0x5d, 0xfe, 0x80, 0x5c, 0x5a, + 0x2b, 0x34, 0x17, 0x8b, 0x7b, 0xdb, 0x78, 0x98, 0x90, 0xdc, 0x04, 0x52, + 0x96, 0x58, 0x50, 0x47, 0x38, 0x48, 0x39, 0x16, 0xd1, 0x87, 0x2d, 0x3d, + 0xcb, 0xba, 0xaf, 0x5b, 0x9f, 0x57, 0xda, 0x13, 0x5a, 0xf2, 0xe7, 0x46, + 0x37, 0x44, 0x1c, 0x85, 0x1c, 0x7d, 0xdc, 0xba, 0xb2, 0x24, 0xd1, 0xad, + 0x28, 0x86, 0xdf, 0xf5, 0x26, 0x09, 0x3f, 0x7c, 0x2b, 0xec, 0xfc, 0xf0, + 0x25, 0x96, 0x5c, 0x1d, 0x45, 0xef, 0xba, 0xd9, 0x54, 0x4a, 0x9d, 0x9b, + 0xc3, 0x9a, 0x73, 0xa0, 0x09, 0x82, 0x8d, 0xa2, 0x5c, 0x8a, 0x18, 0xb3, + 0x36, 0x04, 0x0e, 0xe9, 0x53, 0x2c, 0x47, 0xe6, 0x1c, 0xc9, 0xd3, 0xe1, + 0x61, 0x44, 0x1b, 0x9f, 0x20, 0xa9, 0x7a, 0x7f, 0x14, 0x13, 0x2d, 0xf5, + 0xf9, 0x75, 0xbe, 0x25, 0xa4, 0xbf, 0x65, 0x93, 0x06, 0xd2, 0xfb, 0xa4, + 0xa8, 0x22, 0x72, 0x11, 0x97, 0xd0, 0x66, 0x90, 0x62, 0xc3, 0x81, 0xfd, + 0x06, 0xd1, 0xe1, 0x18, 0xea, 0x1c, 0xc7, 0x22, 0xfe, 0x39, 0x7d, 0xc6, + 0x37, 0xb8, 0x39, 0x36, 0xe9, 0xb3, 0xe2, 0x8f, 0x44, 0xec, 0x9a, 0x3a, + 0x5e, 0x71, 0x08, 0xac, 0x4e, 0xbd, 0x37, 0x07, 0x17, 0x89, 0xf0, 0xf5, + 0x4f, 0x35, 0xcf, 0x89, 0x7a, 0xf8, 0x71, 0x2f, 0x06, 0x22, 0xf5, 0x51, + 0x4b, 0x8d, 0x53, 0x24, 0x16, 0xe7, 0x7c, 0x5a, 0x17, 0x7a, 0x6a, 0xc5, + 0xbe, 0x14, 0x8d, 0xa6, 0x25, 0xf1, 0x5c, 0xd9, 0x07, 0x74, 0x9b, 0x45, + 0x2d, 0xdd, 0x31, 0x7a, 0xbd, 0xd7, 0x9c, 0x63, 0x7c, 0x2f, 0x4f, 0x1e, + 0x9e, 0x4f, 0x16, 0x1f, 0x23, 0x61, 0xe9, 0x20, 0x43, 0x7e, 0x96, 0xf3, + 0x1b, 0xa4, 0x63, 0xe6, 0x7f, 0xb5, 0x31, 0xd8, 0x9b, 0xd6, 0xe7, 0x0c, + 0xbb, 0x18, 0x62, 0x88, 0x4f, 0x5c, 0x6c, 0xcf, 0x8f, 0x4e, 0x29, 0x90, + 0xc1, 0x2f, 
0x9f, 0x72, 0x9c, 0xc5, 0xcb, 0xd6, 0xe5, 0x56, 0x5a, 0x11, + 0x98, 0xce, 0x84, 0xd1, 0x40, 0x0c, 0x14, 0xcc, 0x00, 0xf3, 0x25, 0x13, + 0xb6, 0xe2, 0x5b, 0x56, 0xee, 0x70, 0xe3, 0x81, 0x04, 0x89, 0x45, 0x38, + 0x03, 0xf8, 0x29, 0xdd, 0xc9, 0xd1, 0x43, 0xfc, 0x09, 0x57, 0x5f, 0x3a, + 0x6a, 0xba, 0x37, 0x8a, 0x95, 0x9a, 0xbd, 0x4b, 0x57, 0xaa, 0xba, 0x89, + 0x34, 0x75, 0x2a, 0x02, 0x92, 0xc0, 0x88, 0xa9, 0xdc, 0x21, 0x12, 0x24, + 0x6b, 0xab, 0x87, 0xe5, 0x0e, 0x28, 0xb6, 0x14, 0xb3, 0x8c, 0xa5, 0x91, + 0x4a, 0x53, 0xdf, 0x16, 0x23, 0x9f, 0xea, 0x17, 0xa7, 0xb6, 0x9c, 0x15, + 0x7a, 0x02, 0x65, 0xc6, 0x2e, 0xd9, 0xd4, 0xd4, 0x93, 0xe8, 0xab, 0x6f, + 0x1b, 0xc8, 0x25, 0xa5, 0xea, 0x1f, 0x07, 0x64, 0xf1, 0x12, 0x8d, 0xd3, + 0x89, 0xe8, 0x6d, 0x28, 0xa2, 0xbf, 0xa5, 0x34, 0xea, 0xd6, 0x06, 0x2f, + 0x3a, 0xd6, 0xb2, 0xe5, 0x30, 0xe0, 0xc5, 0x3a, 0x44, 0xe2, 0x55, 0x88, + 0x38, 0xc4, 0x2b, 0xc3, 0x69, 0x2e, 0xf6, 0x95, 0xb4, 0x41, 0xd2, 0xe5, + 0xaa, 0x89, 0x9c, 0x75, 0x5e, 0x29, 0xff, 0x4e, 0x61, 0xf2, 0xd0, 0x74, + 0x7a, 0x6d, 0xbd, 0xc3, 0x55, 0xd1, 0x63, 0x75, 0x35, 0x5a, 0x27, 0x26, + 0x61, 0x12, 0xcc, 0xd2, 0x54, 0xcf, 0x1d, 0xdf, 0x83, 0x7e, 0x9c, 0xf5, + 0x88, 0x3b, 0x1a, 0x5b, 0x66, 0x8f, 0xc1, 0x71, 0x95, 0x3e, 0xae, 0x4e, + 0xa3, 0x86, 0x82, 0xc9, 0xa1, 0x0a, 0x1a, 0x2d, 0x36, 0x44, 0xa6, 0x37, + 0x1d, 0x4d, 0x3e, 0x72, 0xfb, 0x15, 0x8b, 0x5b, 0xec, 0x22, 0x13, 0x34, + 0x3a, 0xdb, 0xc1, 0xf4, 0x78, 0xe9, 0x51, 0xb7, 0x60, 0x53, 0xff, 0x71, + 0x07, 0x86, 0x1a, 0xc9, 0x38, 0xfa, 0xa6, 0xeb, 0x7a, 0x10, 0x75, 0xb3, + 0xf1, 0xf0, 0x28, 0xb9, 0x7d, 0x08, 0x59, 0x36, 0x81, 0xa0, 0xdc, 0x7c, + 0x71, 0x8f, 0xdf, 0x85, 0x45, 0xa7, 0x49, 0xcd, 0x1f, 0xf5, 0x0f, 0x46, + 0x4f, 0xa6, 0x7a, 0x6d, 0x64, 0x05, 0x48, 0x82, 0x8f, 0xe7, 0x55, 0xf5, + 0xdc, 0xf5, 0xcb, 0xd7, 0x14, 0x97, 0x05, 0x3d, 0x3b, 0x62, 0x90, 0x64, + 0x2d, 0x87, 0x06, 0x97, 0x3c, 0xc0, 0xde, 0x79, 0x73, 0x0f, 0xae, 0xee, + 0x99, 0x1e, 0x4d, 0xda, 0xbf, 0x6d, 0x2d, 0xe4, 0xc8, 0xde, 0x31, 0x16, + 0x2b, 0x5a, 
0x6f, 0x1b, 0x71, 0x92, 0xd6, 0x6b, 0x4c, 0x34, 0x39, 0xdc, + 0x1e, 0xf5, 0x62, 0x59, 0x49, 0xb9, 0xd4, 0x6e, 0x1b, 0x82, 0x70, 0xa8, + 0xf0, 0xf2, 0x19, 0x75, 0x23, 0x25, 0xa8, 0xcc, 0x26, 0xe3, 0xd0, 0xca, + 0x0e, 0xe6, 0x47, 0x5f, 0x99, 0xc7, 0x6b, 0x05, 0xb8, 0xba, 0x6c, 0xe5, + 0xab, 0xfb, 0x76, 0x45, 0x1c, 0x31, 0x9a, 0x76, 0xfe, 0xae, 0x3e, 0x07, + 0xad, 0x22, 0x2c, 0x2f, 0x91, 0x45, 0xa7, 0x7b, 0xad, 0xe3, 0xf3, 0x2f, + 0xe1, 0x53, 0x32, 0x73, 0x40, 0x34, 0x0e, 0x9e, 0x5b, 0xa2, 0xa4, 0x4c, + 0x79, 0x3c, 0xa8, 0x5f, 0x6d, 0x96, 0xe8, 0x20, 0x3c, 0x1a, 0x33, 0x0a, + 0x7b, 0x3d, 0xad, 0xd8, 0x81, 0xbc, 0x3b, 0x99, 0xfc, 0xcd, 0xdf, 0x6b, + 0xe7, 0xa6, 0xa7, 0xe3, 0xdc, 0xfc, 0x3f, 0xd2, 0xca, 0xc8, 0x51, 0x00, + 0xdc, 0x95, 0x07, 0x5c, 0x17, 0xea, 0x97, 0xf1, 0x42, 0x3f, 0x00, 0x9e, + 0xc2, 0xab, 0xf3, 0x05, 0xfd, 0x1a, 0xc2, 0xdf, 0x2f, 0x56, 0x23, 0x64, + 0x43, 0x35, 0x31, 0x85, 0x62, 0xbb, 0x99, 0xb0, 0x01, 0xb6, 0x9b, 0xdb, + 0x07, 0xed, 0xa0, 0xf4, 0x21, 0xa1, 0xd4, 0x61, 0xb7, 0x57, 0x16, 0x15, + 0x69, 0xf4, 0x48, 0xd4, 0x7b, 0x55, 0x6f, 0xac, 0x94, 0xd5, 0xbd, 0x4a, + 0x31, 0x92, 0x45, 0xb0, 0x7d, 0x71, 0x09, 0xfa, 0x5f, 0x50, 0xf8, 0x44, + 0x38, 0xfe, 0x23, 0x55, 0x87, 0xb3, 0xbe, 0xdc, 0x75, 0x72, 0xf9, 0x57, + 0xd8, 0xeb, 0xd7, 0x65, 0x85, 0x93, 0xb8, 0xd3, 0x0d, 0x92, 0xd9, 0x68, + 0x09, 0xbe, 0x27, 0x38, 0x82, 0x64, 0xbd, 0xc8, 0xa3, 0x75, 0x2c, 0xb5, + 0xef, 0x76, 0x07, 0x89, 0xfe, 0xd9, 0x4c, 0x32, 0x09, 0x7a, 0xf8, 0x8b, + 0x22, 0x26, 0xee, 0x7c, 0xf9, 0x90, 0xbe, 0x8b, 0x01, 0xd0, 0xa5, 0x00, + 0x0a, 0x1f, 0x26, 0x21, 0x31, 0x48, 0xc0, 0xf3, 0xaf, 0x6d, 0x19, 0xb9, + 0x12, 0x7c, 0xa7, 0x88, 0xad, 0x55, 0x79, 0x1c, 0x10, 0x94, 0x40, 0xb8, + 0xf6, 0x07, 0x0e, 0xfd, 0x4f, 0xd1, 0x12, 0x29, 0x31, 0x04, 0x4d, 0x95, + 0xf5, 0x5d, 0x95, 0x9a, 0xb6, 0x47, 0x39, 0x5b, 0x5a, 0xba, 0x2a, 0xd8, + 0x43, 0xf9, 0x27, 0x22, 0x9c, 0x48, 0xdf, 0x78, 0x87, 0xfe, 0xc5, 0x19, + 0xfe, 0xe1, 0x0a, 0x9b, 0x67, 0x42, 0xc1, 0x69, 0x8c, 0x2d, 0x83, 0xa5, + 0x25, 0xf3, 
0xcf, 0xeb, 0xfb, 0x32, 0xa7, 0x14, 0xef, 0x98, 0xc1, 0xa1, + 0x91, 0x56, 0x22, 0xf9, 0x4c, 0xa6, 0x94, 0xf0, 0x7d, 0x18, 0xd7, 0x97, + 0x36, 0xe1, 0x7e, 0x51, 0xeb, 0xac, 0x65, 0x42, 0x9e, 0x59, 0xc7, 0xfc, + 0x4c, 0x57, 0x2f, 0x77, 0x47, 0xe5, 0x64, 0x1a, 0x61, 0xa1, 0x73, 0x38, + 0x71, 0x87, 0x8b, 0x9f, 0xe5, 0xe7, 0x10, 0xc9, 0x26, 0x07, 0x05, 0x45, + 0x1e, 0x3b, 0x58, 0x1d, 0xc2, 0x90, 0x44, 0xa5, 0x57, 0x5b, 0xc8, 0xd5, + 0xf7, 0x4d, 0xb0, 0x8b, 0xc2, 0xc3, 0xe5, 0x6d, 0xe7, 0x9a, 0xdf, 0xdb, + 0xfc, 0x96, 0x3f, 0x0b, 0x23, 0x95, 0x48, 0xe1, 0xb4, 0x87, 0x9e, 0x8a, + 0x09, 0xf2, 0xc4, 0x41, 0x95, 0xe5, 0x0d, 0x74, 0x09, 0xa4, 0x78, 0xc0, + 0x9b, 0xbb, 0x89, 0x91, 0xed, 0x03, 0xb6, 0x5b, 0x69, 0xfc, 0x86, 0x62, + 0x4b, 0xa9, 0xcc, 0xa4, 0x02, 0x81, 0x5f, 0xaa, 0x39, 0xa8, 0xfb, 0xd3, + 0x5c, 0xf2, 0x73, 0xe8, 0xed, 0x44, 0x22, 0xf0, 0xd5, 0x5f, 0x4b, 0x75, + 0x04, 0x98, 0x09, 0x9b, 0x39, 0x4f, 0xb5, 0xb5, 0x93, 0x07, 0xcc, 0xb8, + 0x70, 0x8d, 0x61, 0xf2, 0xe8, 0x49, 0x1c, 0xda, 0xd5, 0xcc, 0x4f, 0xf2, + 0x08, 0x8a, 0x0a, 0xf2, 0x87, 0xc8, 0x4b, 0x3b, 0x10, 0x1b, 0x0a, 0xd4, + 0xcb, 0x75, 0xb0, 0x78, 0xa8, 0xae, 0x8f, 0x4d, 0x5d, 0x3f, 0xeb, 0x6e, + 0xb0, 0xb2, 0xe0, 0x4f, 0x6c, 0x9d, 0x26, 0x48, 0x11, 0x2b, 0xd7, 0xba, + 0x0a, 0x3f, 0x38, 0xc2, 0x40, 0x0d, 0xb9, 0xc9, 0x1a, 0x24, 0x41, 0xde, + 0x0c, 0xd9, 0x67, 0x4f, 0x7e, 0xaa, 0xe3, 0x37, 0x51, 0x0d, 0x46, 0x4a, + 0x95, 0xc9, 0x44, 0x25, 0x74, 0xe5, 0x78, 0xbc, 0xeb, 0x5f, 0x20, 0xa0, + 0x7e, 0xad, 0xfa, 0xc5, 0x34, 0xd8, 0x85, 0xbe, 0xea, 0x93, 0xd1, 0x5a, + 0x20, 0x14, 0x6b, 0x74, 0x40, 0xde, 0x05, 0xad, 0x57, 0xd8, 0xba, 0x34, + 0x92, 0x64, 0xed, 0x80, 0xb4, 0xe6, 0x18, 0x89, 0xea, 0x15, 0x40, 0xd8, + 0xcf, 0x1e, 0xf7, 0x34, 0xc8, 0x3e, 0x39, 0x9d, 0x40, 0x5b, 0x7c, 0x36, + 0x90, 0xaf, 0xe4, 0x8c, 0xac, 0xc5, 0xe8, 0x50, 0xe9, 0x9d, 0x76, 0x60, + 0x08, 0xc3, 0x85, 0xbf, 0x37, 0xdd, 0x63, 0x01, 0x4c, 0xe9, 0xa4, 0x30, + 0x39, 0x88, 0x8b, 0x1b, 0x80, 0x79, 0xaa, 0x46, 0xf0, 0x4f, 0x7e, 0x97, + 0x81, 0x2a, 
0xfc, 0x9f, 0xfc, 0xbf, 0xe9, 0xf6, 0xa0, 0x04, 0x33, 0xcb, + 0xc9, 0x5b, 0xde, 0xd3, 0xf4, 0x6a, 0x5d, 0x11, 0x45, 0x37, 0x29, 0xca, + 0xb5, 0x3e, 0x6d, 0xd9, 0x39, 0x95, 0xaa, 0x32, 0x14, 0xd1, 0xbd, 0x24, + 0xb7, 0x3f, 0xd7, 0x43, 0x86, 0x9a, 0x80, 0x8e, 0x53, 0xfc, 0x13, 0xb9, + 0x36, 0x31, 0x8c, 0x58, 0x73, 0x41, 0x57, 0xc1, 0xf0, 0x5b, 0xa9, 0x6c, + 0x6c, 0xf5, 0x5c, 0x95, 0xaa, 0x19, 0x82, 0x84, 0x36, 0x50, 0x19, 0x89, + 0xc3, 0x2b, 0x09, 0x59, 0x7a, 0x2d, 0xd9, 0x5e, 0x01, 0x9c, 0xd6, 0xfc, + 0xb7, 0xfe, 0xac, 0x31, 0x2e, 0x86, 0x32, 0xbf, 0x4e, 0xed, 0x53, 0xed, + 0x76, 0x68, 0x20, 0x0d, 0xfa, 0x18, 0x62, 0x47, 0x5e, 0x25, 0x5a, 0xd0, + 0x80, 0x4a, 0xae, 0xa7, 0x74, 0xe9, 0xfa, 0x9a, 0x12, 0x48, 0x7c, 0xcd, + 0xd5, 0xcd, 0x02, 0xcb, 0x47, 0x84, 0xbf, 0xed, 0xa3, 0x82, 0x29, 0xe4, + 0x69, 0x89, 0xa5, 0xbc, 0xff, 0xac, 0x06, 0x3a, 0x16, 0xc0, 0xdc, 0xe1, + 0xf7, 0xfa, 0x99, 0xaf, 0x02, 0x04, 0xbf, 0xa9, 0x56, 0xe2, 0xbe, 0xd0, + 0x5d, 0x85, 0x5a, 0x6e, 0xc4, 0xac, 0x5c, 0xac, 0xbb, 0x9f, 0x41, 0xa6, + 0x65, 0x43, 0x77, 0xe2, 0x32, 0x1b, 0xfb, 0x64, 0x7b, 0xbd, 0x1a, 0x83, + 0xff, 0xba, 0xc6, 0x99, 0x79, 0x3c, 0x10, 0x78, 0xc7, 0x5b, 0x9f, 0x0d, + 0xf3, 0xdc, 0xc1, 0xc1, 0xfb, 0xc1, 0xf3, 0x0e, 0x56, 0x25, 0x19, 0xa1, + 0xea, 0xbf, 0xc0, 0xda, 0x2b, 0x0b, 0xc5, 0x4c, 0x4d, 0x22, 0x1b, 0x9b, + 0x53, 0x27, 0xe0, 0x73, 0x82, 0x75, 0xf9, 0xbe, 0x6b, 0xa3, 0x75, 0xa3, + 0x9f, 0x63, 0x0b, 0xfa, 0x10, 0x1b, 0xb1, 0x49, 0x2c, 0x1d, 0xfa, 0x47, + 0xd5, 0xfc, 0x4b, 0x32, 0x62, 0xd9, 0x30, 0x1e, 0x22, 0xb3, 0xce, 0xb8, + 0x24, 0xd7, 0x17, 0xbb, 0xb5, 0xb0, 0x39, 0x2f, 0x21, 0x12, 0xe4, 0xd8, + 0x87, 0xa7, 0xe3, 0xa8, 0xb4, 0x59, 0x90, 0x19, 0x6b, 0x27, 0xd8, 0x40, + 0xc6, 0xc0, 0xc1, 0xd4, 0x0a, 0x69, 0xe1, 0xc9, 0xb1, 0x2e, 0xb5, 0xb5, + 0xb0, 0x60, 0xf1, 0x01, 0x90, 0x8d, 0x80, 0xe0, 0x1f, 0xd3, 0x5a, 0x1c, + 0xa0, 0x83, 0x5c, 0x07, 0xe2, 0x9a, 0x0b, 0x64, 0xbc, 0x78, 0x45, 0x11, + 0x9f, 0x0a, 0x60, 0xae, 0x78, 0x56, 0x74, 0xda, 0x2d, 0x0f, 0x66, 0x9d, + 0x3c, 0xc9, 
0x05, 0x59, 0x0f, 0x0a, 0x48, 0x9a, 0x04, 0x4b, 0xa7, 0x6d, + 0xd4, 0xaf, 0x23, 0xeb, 0xea, 0x6e, 0xfe, 0xa3, 0xfa, 0x10, 0x02, 0x42, + 0xb5, 0x49, 0x0f, 0x73, 0x29, 0xcd, 0x43, 0x62, 0x06, 0xb6, 0xc8, 0xc4, + 0xaa, 0xb1, 0x55, 0x6b, 0x2a, 0x50, 0x08, 0x8f, 0x82, 0x33, 0x74, 0xcb, + 0xc5, 0x18, 0xc2, 0x60, 0xa8, 0x1d, 0x75, 0x11, 0x90, 0x78, 0x82, 0x99, + 0x8b, 0x91, 0xa8, 0x60, 0x7e, 0xe8, 0x66, 0x76, 0xa7, 0x1b, 0x15, 0xfa, + 0x0b, 0xa2, 0x22, 0x9e, 0x85, 0x83, 0xa4, 0x2c, 0x45, 0x70, 0x9f, 0xf6, + 0x3c, 0xe4, 0x67, 0xbe, 0x96, 0x02, 0x40, 0x11, 0x3b, 0xa2, 0xeb, 0x01, + 0x08, 0x7b, 0xbe, 0x2b, 0xd2, 0x45, 0x8e, 0x51, 0x25, 0x6c, 0x33, 0x5d, + 0xf1, 0x4c, 0x4c, 0xe7, 0x28, 0x75, 0x2f, 0xd9, 0x90, 0xf6, 0xf8, 0x11, + 0x7a, 0x0d, 0x5e, 0xf2, 0xad, 0xaf, 0xf3, 0x83, 0x45, 0x78, 0x1e, 0xbc, + 0xe7, 0x36, 0xfb, 0x53, 0xf7, 0x32, 0xab, 0x22, 0xc3, 0x3e, 0x58, 0x3d, + 0x9d, 0xe1, 0x18, 0xac, 0x57, 0x04, 0x21, 0xcf, 0x53, 0xf1, 0xc6, 0xe6, + 0x33, 0x57, 0x83, 0xf2, 0xef, 0x5f, 0x3b, 0x8b, 0x94, 0x31, 0x91, 0x52, + 0x19, 0x0f, 0x6c, 0xd0, 0xa3, 0x5f, 0x18, 0x6f, 0x9b, 0x6d, 0x92, 0xe9, + 0x03, 0xd8, 0x41, 0x10, 0x00, 0xc4, 0x35, 0xb2, 0x60, 0x58, 0xc5, 0xcf, + 0x28, 0x9f, 0x5b, 0x13, 0xa5, 0x83, 0xb9, 0x68, 0x99, 0xc2, 0x3b, 0xbf, + 0x6b, 0x49, 0xc3, 0xba, 0x12, 0xb5, 0xc3, 0x53, 0x87, 0x45, 0x21, 0xca, + 0x20, 0xeb, 0x18, 0xd7, 0x4c, 0xc3, 0xc9, 0xe3, 0xe7, 0x83, 0x1c, 0x49, + 0xbc, 0x29, 0x19, 0x51, 0xa5, 0x66, 0x3c, 0xa1, 0xda, 0x32, 0xe7, 0xb8, + 0x24, 0x2f, 0xb8, 0xdc, 0x76, 0xe7, 0xe0, 0xbf, 0x8b, 0x39, 0x38, 0x91, + 0x13, 0x36, 0x09, 0x23, 0x85, 0x2c, 0x49, 0x86, 0x9b, 0x97, 0x2a, 0xb7, + 0xc7, 0x0b, 0xb9, 0xc5, 0xf7, 0xe1, 0x72, 0x30, 0x85, 0xf5, 0x2d, 0x36, + 0xdc, 0x4a, 0x89, 0x45, 0xd1, 0x82, 0x5a, 0x6e, 0xe3, 0x54, 0x8f, 0x30, + 0x57, 0x26, 0x9b, 0x7d, 0x52, 0x65, 0xbf, 0x05, 0x91, 0x7f, 0xf0, 0xb8, + 0x3d, 0x8e, 0x81, 0xec, 0xaa, 0xf8, 0x50, 0xa5, 0x55, 0x08, 0xad, 0x4c, + 0x7a, 0x78, 0xa4, 0x02, 0xc7, 0x03, 0xdd, 0x67, 0x64, 0x09, 0x66, 0x79, + 0x7f, 0x5c, 
0x16, 0x12, 0x9c, 0xda, 0x38, 0x41, 0x30, 0xa0, 0xa0, 0x17, + 0x88, 0x42, 0x7f, 0xf1, 0x5b, 0x83, 0xfa, 0xe3, 0xc4, 0xea, 0xdb, 0x78, + 0x5f, 0xa1, 0x92, 0xf9, 0xbf, 0x0b, 0xed, 0x94, 0x3b, 0x7d, 0x2e, 0x9c, + 0xf5, 0xbb, 0x7e, 0x80, 0x2c, 0x36, 0x66, 0x4f, 0x4e, 0xf2, 0x20, 0x22, + 0x20, 0xe8, 0xe5, 0xba, 0xa6, 0xe5, 0x31, 0x39, 0x52, 0x0c, 0x65, 0x26, + 0xdf, 0xe5, 0x0d, 0x92, 0x12, 0x38, 0xd9, 0x0f, 0x0c, 0xe1, 0xfc, 0xce, + 0xce, 0x69, 0x03, 0x8a, 0x5c, 0x19, 0xc9, 0xfa, 0xed, 0x90, 0xb4, 0xe1, + 0xe1, 0x6f, 0x62, 0xd2, 0x35, 0xce, 0x8f, 0xe6, 0xfc, 0x37, 0x99, 0x44, + 0xa5, 0xbd, 0x61, 0x06, 0xdb, 0x71, 0x66, 0x07, 0x74, 0x8e, 0xec, 0xe7, + 0x0d, 0xc0, 0xc8, 0xc9, 0x7c, 0x51, 0xd0, 0x55, 0xeb, 0x56, 0x87, 0x08, + 0x16, 0x71, 0x44, 0x5e, 0x2d, 0x5d, 0x6a, 0xef, 0xc0, 0xd8, 0xe3, 0x9d, + 0x57, 0x32, 0x01, 0xc7, 0xce, 0xff, 0x70, 0xee, 0xbc, 0x22, 0xd0, 0xe3, + 0x90, 0x84, 0x77, 0x4a, 0x94, 0xaa, 0x8e, 0xeb, 0x5b, 0x87, 0xec, 0x4e, + 0x1a, 0x4e, 0xae, 0x1d, 0x39, 0x1c, 0xff, 0x6e, 0x9c, 0x0f, 0x90, 0x4d, + 0x56, 0x3b, 0xff, 0xac, 0xe5, 0xbb, 0x22, 0xb2, 0x2c, 0xfd, 0x76, 0xf4, + 0x42, 0x5c, 0x5b, 0x9b, 0x7f, 0x58, 0xc0, 0x53, 0xcd, 0x17, 0xa2, 0x61, + 0xec, 0x83, 0x2d, 0xe8, 0x69, 0x42, 0x5e, 0x5f, 0xc1, 0x30, 0x85, 0xa0, + 0x2b, 0x7e, 0xb3, 0xe6, 0x14, 0x6d, 0x93, 0xb6, 0xda, 0xef, 0x4c, 0xaf, + 0x8a, 0x1f, 0x7b, 0x0f, 0x4a, 0xd3, 0x47, 0x89, 0x0f, 0x62, 0xe1, 0x70, + 0xcd, 0xa2, 0x1f, 0xa2, 0x8b, 0x5c, 0xd9, 0x2b, 0x26, 0xef, 0x5e, 0x68, + 0x11, 0x71, 0x89, 0x33, 0x17, 0x51, 0x41, 0x88, 0x60, 0x99, 0x48, 0x2d, + 0x39, 0x1b, 0xf2, 0x8a, 0xf0, 0x98, 0xf9, 0xe0, 0x49, 0xb3, 0xde, 0x10, + 0x30, 0xbd, 0x02, 0x55, 0xb2, 0x3f, 0xbb, 0xcd, 0x8a, 0x28, 0xd5, 0xb7, + 0x91, 0xc7, 0x27, 0xd6, 0x99, 0x0c, 0x32, 0x35, 0xba, 0x14, 0x3e, 0xd5, + 0x1c, 0xb0, 0x46, 0x39, 0x91, 0xd3, 0x05, 0x4f, 0xba, 0x63, 0xee, 0x3c, + 0x76, 0xe5, 0xcb, 0x54, 0xb7, 0x8c, 0x69, 0x81, 0x53, 0xa3, 0xb3, 0xa2, + 0xef, 0x3a, 0x35, 0x54, 0xaf, 0xbc, 0x5e, 0x1c, 0x2a, 0x0f, 0x94, 0x2b, + 0x69, 0x20, 
0x1f, 0xa1, 0x0c, 0xa3, 0x82, 0x36, 0x80, 0xdb, 0x47, 0xa3, + 0x41, 0x7e, 0xd1, 0x9c, 0x5f, 0x80, 0x58, 0x72, 0xac, 0x8c, 0x7a, 0x64, + 0x13, 0xe6, 0x19, 0x9f, 0x15, 0xc2, 0x20, 0xf7, 0xe8, 0x3e, 0x4a, 0x4c, + 0x0e, 0x3a, 0x63, 0x11, 0x6c, 0xf7, 0x09, 0x76, 0xc3, 0x12, 0xfc, 0x88, + 0xd5, 0xc8, 0xd0, 0xbb, 0xfa, 0x0b, 0x46, 0xc2, 0xe3, 0x3b, 0x84, 0xf4, + 0x0f, 0xcc, 0x84, 0x59, 0x98, 0xc5, 0x93, 0xdd, 0xb7, 0x03, 0x46, 0xf9, + 0x09, 0xd3, 0x94, 0x8c, 0x23, 0xd3, 0x99, 0x42, 0xf1, 0xc3, 0xa8, 0x5b, + 0x0c, 0xb0, 0xbc, 0x23, 0xb8, 0xea, 0x84, 0xab, 0x41, 0x3f, 0x6e, 0xb2, + 0xc2, 0xa7, 0xbe, 0x5e, 0x43, 0x5c, 0x25, 0x0d, 0x1e, 0x2a, 0xae, 0xc5, + 0xbe, 0xc5, 0xdc, 0xd6, 0xc7, 0xfa, 0xe7, 0x04, 0x8d, 0xc8, 0xba, 0xca, + 0xbd, 0xd0, 0x09, 0x23, 0xd6, 0xee, 0x31, 0x75, 0x34, 0x92, 0x6d, 0x26, + 0x2c, 0x0c, 0x87, 0xa8, 0x5d, 0x0b, 0x6a, 0xe4, 0xec, 0x50, 0x77, 0xcd, + 0x92, 0x2a, 0xbb, 0x61, 0x09, 0x48, 0xe5, 0x6c, 0xb1, 0x99, 0xe8, 0x03, + 0xe2, 0x8f, 0xbb, 0x1a, 0xf9, 0x6a, 0x7d, 0xac, 0xba, 0xd2, 0x88, 0x41, + 0xd6, 0xd8, 0x79, 0x1d, 0x3d, 0xed, 0x66, 0xf1, 0x0f, 0xd4, 0x27, 0x61, + 0xbf, 0xc3, 0x31, 0x4e, 0xdc, 0x86, 0x67, 0x6e, 0x90, 0xab, 0x34, 0x2f, + 0xf6, 0xcd, 0xdf, 0xb8, 0x25, 0xb2, 0x01, 0xc3, 0xa0, 0xd7, 0x90, 0xa6, + 0x90, 0xde, 0xcd, 0x97, 0xf8, 0xa7, 0xb8, 0x24, 0x96, 0x31, 0x19, 0x2c, + 0xf1, 0xde, 0x6a, 0x0b, 0x19, 0x1d, 0xbb, 0xce, 0xef, 0x4b, 0x72, 0x5a, + 0xde, 0x50, 0x02, 0x61, 0x0f, 0x5e, 0xa4, 0x47, 0x6c, 0x4b, 0x2c, 0xd2, + 0x4d, 0xf4, 0xe7, 0x67, 0xb9, 0xff, 0x52, 0x5d, 0x9f, 0x9f, 0x96, 0xb7, + 0x17, 0x5e, 0xb8, 0x91, 0x58, 0x1a, 0xd5, 0xf6, 0x6d, 0x67, 0x9d, 0x74, + 0xfb, 0x55, 0xbb, 0xf9, 0xaf, 0x3c, 0xec, 0x9e, 0x0d, 0x72, 0xfd, 0x00, + 0x24, 0xc3, 0x41, 0x57, 0x34, 0x04, 0x8d, 0x82, 0xbb, 0x82, 0xc0, 0x24, + 0x41, 0x1f, 0xf8, 0x20, 0xf9, 0x65, 0x0a, 0x2b, 0xad, 0x50, 0xae, 0xa6, + 0x23, 0x6a, 0x5d, 0xdf, 0x6a, 0x37, 0x2b, 0x42, 0x46, 0x63, 0x9b, 0xee, + 0xd9, 0x30, 0xf3, 0x5d, 0x69, 0xb8, 0x9c, 0x0f, 0x46, 0x82, 0x44, 0xb0, + 0x05, 0xec, 
0x3d, 0x25, 0xb2, 0x41, 0xe7, 0xb4, 0x4c, 0x10, 0xa5, 0x96, + 0x17, 0xec, 0x3d, 0x40, 0x39, 0xc4, 0x64, 0x10, 0x71, 0x7a, 0x78, 0xc6, + 0x5a, 0x5d, 0x31, 0xcc, 0x1f, 0x8a, 0x25, 0xe2, 0x06, 0x5e, 0x3f, 0xa1, + 0x98, 0xb5, 0xee, 0x9a, 0x3e, 0x4b, 0xa9, 0x10, 0xd9, 0xd6, 0x32, 0x61, + 0xf5, 0x63, 0xfc, 0x03, 0x2d, 0xbf, 0xad, 0x3f, 0x47, 0xa4, 0x01, 0xb6, + 0x5f, 0x4d, 0xdc, 0x76, 0xfb, 0x9e, 0x52, 0x18, 0x42, 0xd1, 0xe7, 0x12, + 0x7b, 0xc7, 0xf8, 0x23, 0xb7, 0x7c, 0xba, 0x48, 0xb1, 0x8b, 0x0c, 0x89, + 0x98, 0xa5, 0x42, 0xe8, 0xc6, 0xbc, 0x58, 0xa6, 0x20, 0x04, 0xe7, 0xbe, + 0xf3, 0x48, 0x63, 0xa3, 0x6e, 0xac, 0xfa, 0xdd, 0x57, 0xf5, 0x4c, 0xcc, + 0x6b, 0xa5, 0xc7, 0x23, 0xb3, 0x18, 0x00, 0x49, 0x61, 0x1b, 0x6f, 0x3e, + 0x41, 0x47, 0x35, 0x41, 0xc1, 0x6a, 0x7d, 0xc1, 0x8b, 0x4d, 0x3f, 0xeb, + 0x4d, 0x0f, 0xc1, 0x46, 0xbc, 0xa5, 0x49, 0x22, 0xc4, 0x43, 0x08, 0xf2, + 0xc6, 0x51, 0x66, 0x03, 0x92, 0x24, 0xc0, 0xfb, 0x4d, 0x16, 0x96, 0x93, + 0xd6, 0xbf, 0xd4, 0x81, 0xa9, 0x33, 0x5e, 0x5f, 0x94, 0xb5, 0x70, 0x2a, + 0xd9, 0x13, 0x6b, 0xa0, 0xdb, 0xec, 0x54, 0x65, 0x4a, 0x56, 0x17, 0xef, + 0x54, 0x43, 0x1b, 0xb2, 0x1a, 0x06, 0x7c, 0x71, 0x04, 0x59, 0x66, 0x28, + 0x95, 0xa1, 0xa5, 0x7e, 0x31, 0xe4, 0xb2, 0xe1, 0x3b, 0x18, 0xc4, 0xdb, + 0x48, 0x33, 0x70, 0xed, 0xe8, 0xcf, 0x09, 0x9b, 0x0d, 0xf0, 0xd7, 0x17, + 0x6b, 0x9f, 0x4a, 0x23, 0xad, 0x92, 0x83, 0x59, 0x91, 0xdd, 0x5f, 0xc6, + 0x38, 0xb9, 0x99, 0xba, 0xee, 0x13, 0x2b, 0xb8, 0x91, 0x1c, 0x49, 0x30, + 0x9b, 0xab, 0xcb, 0x3f, 0xa4, 0x43, 0x03, 0x89, 0x8f, 0x86, 0x04, 0xbe, + 0x34, 0x3e, 0xbe, 0xda, 0x6f, 0x49, 0x8e, 0x2c, 0x08, 0x20, 0xe3, 0xb1, + 0x6d, 0xbd, 0xec, 0x3f, 0xf9, 0xd5, 0x1a, 0xde, 0x05, 0xfe, 0x71, 0x74, + 0xe5, 0x35, 0x4b, 0xc4, 0x4f, 0x19, 0x68, 0x24, 0x5f, 0x88, 0xbe, 0x7c, + 0xd7, 0xa8, 0x97, 0x3e, 0x6f, 0xa1, 0xab, 0xc7, 0x36, 0x9b, 0xfc, 0x42, + 0x8e, 0x20, 0x70, 0x22, 0x0c, 0x77, 0x6e, 0xaf, 0xae, 0x34, 0x85, 0x27, + 0xca, 0xa3, 0x7d, 0xcd, 0xe5, 0x90, 0x11, 0x10, 0x8e, 0x5b, 0x59, 0x48, + 0x8c, 0x9d, 
0xa4, 0x7f, 0x4a, 0x0d, 0xa8, 0x00, 0xb2, 0x6a, 0xe6, 0x03, + 0xc6, 0xcf, 0x10, 0xa3, 0xdb, 0x0d, 0x61, 0xfd, 0x34, 0x4c, 0xf0, 0xe0, + 0x63, 0xed, 0x15, 0xd6, 0x56, 0x3c, 0xa7, 0x75, 0xd6, 0x45, 0x31, 0xd0, + 0x71, 0x92, 0x50, 0x8e, 0xab, 0x6a, 0x1c, 0x42, 0xf6, 0xbf, 0x16, 0x56, + 0x95, 0x71, 0xee, 0x87, 0x75, 0x4e, 0x89, 0xe4, 0x0d, 0xc0, 0xc2, 0x8e, + 0x7e, 0x90, 0xe9, 0x3b, 0x62, 0x48, 0xfa, 0x0a, 0x02, 0x64, 0xc8, 0xbd, + 0xa7, 0x6d, 0xe8, 0x8a, 0xf8, 0x68, 0xeb, 0xce, 0x66, 0x0f, 0x39, 0x02, + 0xa9, 0x92, 0x9f, 0xba, 0x38, 0x1e, 0x68, 0xbd, 0x3c, 0x33, 0x9f, 0x2a, + 0x44, 0x5c, 0xf2, 0xe3, 0xd7, 0x2b, 0x1e, 0x81, 0x94, 0xba, 0x38, 0xd8, + 0x47, 0x24, 0x8b, 0x7e, 0x37, 0x33, 0xc1, 0xba, 0xf4, 0x4d, 0x09, 0x99, + 0x60, 0x07, 0xf3, 0x71, 0x19, 0x84, 0x53, 0xe5, 0x89, 0x73, 0x8e, 0xed, + 0xe0, 0xef, 0x3b, 0x1e, 0x3a, 0x00, 0xf9, 0x46, 0x91, 0x40, 0xff, 0x94, + 0x49, 0x4b, 0xd6, 0x23, 0xff, 0x5b, 0x7f, 0xdf, 0xdc, 0x99, 0xbe, 0xae, + 0x5d, 0x0a, 0x0d, 0x4f, 0xcf, 0x3c, 0x8e, 0x44, 0x65, 0x8b, 0x20, 0xd8, + 0xc9, 0x3f, 0xa7, 0xa7, 0xd1, 0xd7, 0x36, 0xf7, 0xb4, 0x7c, 0xfc, 0x62, + 0xfc, 0x80, 0x33, 0x36, 0x9b, 0xba, 0x74, 0x95, 0x11, 0xed, 0xcb, 0x2b, + 0x9d, 0x90, 0x6b, 0x44, 0x17, 0xbd, 0x43, 0xbf, 0x54, 0x0a, 0x8e, 0x25, + 0xaa, 0xe5, 0x6c, 0xd5, 0xb7, 0xe1, 0x2d, 0x86, 0xe4, 0xed, 0xf6, 0xec, + 0xd1, 0xe8, 0x16, 0x1d, 0xda, 0x27, 0x5d, 0xca, 0xa9, 0xcd, 0x74, 0xb5, + 0xbb, 0x2e, 0x5c, 0x65, 0x6d, 0xee, 0xb1, 0x69, 0xd8, 0x9b, 0x63, 0x56, + 0xdc, 0x55, 0xb6, 0xe9, 0xa0, 0x38, 0x79, 0x86, 0x50, 0xc2, 0x55, 0xf3, + 0x3a, 0x77, 0xd4, 0xda, 0x59, 0x93, 0xc7, 0xc6, 0x4c, 0x04, 0x56, 0x0a, + 0x64, 0xae, 0xab, 0x54, 0xaa, 0x26, 0x5d, 0xcb, 0x90, 0x4b, 0x5c, 0xa4, + 0x76, 0xeb, 0x0c, 0x81, 0x69, 0xb1, 0xf7, 0x3f, 0xba, 0xf2, 0xf3, 0x20, + 0x55, 0xdd, 0xe8, 0xd4, 0x32, 0xf4, 0x43, 0x86, 0x47, 0x27, 0x3c, 0xab, + 0xde, 0x0b, 0x49, 0xd2, 0xc0, 0x53, 0x01, 0xe2, 0x87, 0x38, 0x1f, 0x6d, + 0x47, 0x37, 0x5c, 0x46, 0x15, 0xfe, 0x25, 0x82, 0x9f, 0xf6, 0x6b, 0x84, + 0x90, 0x1f, 
0x86, 0x0d, 0xaf, 0x9c, 0x23, 0x02, 0x11, 0xcb, 0x62, 0x87, + 0xdf, 0x1d, 0xae, 0x45, 0xdf, 0x6a, 0xd4, 0x34, 0x74, 0x0c, 0x08, 0x46, + 0x7b, 0x79, 0x26, 0x44, 0x33, 0xbd, 0xaf, 0x3f, 0x45, 0xd0, 0x8a, 0x1e, + 0x8f, 0xf6, 0xa8, 0xa6, 0xd0, 0xcb, 0x08, 0x78, 0x2a, 0xdc, 0xf4, 0xd2, + 0x48, 0xaa, 0x53, 0xde, 0x2c, 0x13, 0x24, 0x4c, 0x2a, 0x14, 0x73, 0x92, + 0x66, 0xbc, 0x3e, 0x18, 0x38, 0xe1, 0x42, 0x0e, 0x2e, 0x4d, 0x94, 0x19, + 0xed, 0x22, 0xb5, 0xd3, 0x84, 0xab, 0xb4, 0xbe, 0x9d, 0xf4, 0x18, 0xd6, + 0xb2, 0x9d, 0x4b, 0xe9, 0xa3, 0x66, 0xc4, 0x87, 0x75, 0x1c, 0x06, 0x16, + 0xba, 0x5d, 0x5b, 0x2c, 0x77, 0x3c, 0x62, 0xe0, 0x3b, 0x73, 0xd5, 0x25, + 0xb9, 0x5b, 0x2e, 0xca, 0xd2, 0xc7, 0x05, 0x4c, 0xf5, 0x0e, 0xbe, 0x97, + 0x1c, 0xf5, 0xdb, 0x5b, 0xbe, 0x8a, 0xa5, 0xaf, 0xb3, 0xe0, 0x71, 0x16, + 0x09, 0xf6, 0x4e, 0x33, 0xe0, 0x05, 0x32, 0x1f, 0x3e, 0xa3, 0xe7, 0x35, + 0xf8, 0xf7, 0xbf, 0xbc, 0x43, 0xc9, 0xf9, 0xfd, 0x8d, 0xd8, 0x54, 0x9f, + 0x94, 0x4a, 0x68, 0x9e, 0x54, 0x70, 0x12, 0x76, 0xfa, 0x91, 0xbf, 0x31, + 0xe8, 0xdc, 0x5c, 0xe4, 0x9f, 0xe6, 0xb7, 0x97, 0x34, 0x2c, 0x7f, 0xbd, + 0x5f, 0x1d, 0xbd, 0x67, 0xd2, 0xe2, 0x3c, 0x06, 0x7b, 0xd1, 0x70, 0x29, + 0x36, 0x93, 0x4e, 0x9e, 0xc1, 0xa2, 0xfb, 0xad, 0xde, 0xaa, 0x6c, 0xb6, + 0x2b, 0xf8, 0x0e, 0xff, 0xad, 0x51, 0x4a, 0xc7, 0x64, 0x30, 0x75, 0x63, + 0xd2, 0x86, 0x07, 0xc0, 0x70, 0xef, 0xcd, 0xa0, 0x3b, 0x41, 0x11, 0xaa, + 0x59, 0xb4, 0x45, 0x4f, 0xdb, 0xf9, 0xf3, 0xb6, 0x0c, 0x78, 0x94, 0x14, + 0x32, 0x81, 0x71, 0xf0, 0xd5, 0x52, 0x14, 0x99, 0xd5, 0x30, 0x5c, 0xea, + 0xa2, 0x97, 0x28, 0x73, 0xe8, 0xa5, 0x24, 0xf7, 0x51, 0x0a, 0x23, 0x2d, + 0x5f, 0x5a, 0x59, 0x84, 0x66, 0x83, 0x38, 0x7a, 0x41, 0x3d, 0x47, 0x2a, + 0xa0, 0x45, 0x1f, 0x1f, 0xc8, 0x5a, 0x1b, 0xd7, 0x18, 0x30, 0x18, 0xb5, + 0x2c, 0xaa, 0x57, 0xc0, 0x14, 0x0d, 0x2c, 0xc3, 0x99, 0xfc, 0x6b, 0xd2, + 0xf8, 0x34, 0x7e, 0xd5, 0xca, 0x3a, 0xfc, 0xc8, 0xa0, 0x90, 0x38, 0x1a, + 0x88, 0x69, 0x52, 0x90, 0x8c, 0x4a, 0xfb, 0xf1, 0xd3, 0xe1, 0x8e, 0x03, + 0x22, 0x48, 
0x34, 0x30, 0xe5, 0x29, 0x3e, 0xb1, 0xea, 0x54, 0x08, 0x4f, + 0xcd, 0x76, 0x93, 0xdb, 0xf3, 0xe2, 0x92, 0xb4, 0x47, 0xb5, 0x24, 0xf6, + 0xd5, 0x9a, 0xea, 0xcb, 0xd6, 0x9e, 0xfe, 0xf4, 0x41, 0x4f, 0x79, 0x93, + 0x3c, 0xa9, 0xd2, 0x40, 0x1f, 0xc9, 0x43, 0x6c, 0x21, 0x53, 0x2c, 0x69, + 0x70, 0x6f, 0x3a, 0x8c, 0x85, 0xe3, 0xb3, 0x12, 0x28, 0xf0, 0x27, 0x28, + 0xc6, 0x65, 0x8d, 0xe2, 0x5c, 0x26, 0x46, 0x4c, 0xfa, 0x46, 0x14, 0xf2, + 0xe1, 0x68, 0x1d, 0x99, 0xe5, 0xfc, 0xe5, 0x6a, 0xce, 0x32, 0xce, 0x37, + 0x1d, 0x17, 0xb5, 0x7b, 0x38, 0x32, 0xb3, 0x6e, 0x42, 0xcd, 0x4a, 0x16, + 0x1e, 0x54, 0x94, 0x56, 0x8b, 0xf6, 0xeb, 0x60, 0x31, 0x7e, 0xd7, 0xdc, + 0xcc, 0x96, 0xf5, 0x51, 0x1e, 0x19, 0xdc, 0x7d, 0x06, 0xa6, 0xf4, 0x59, + 0xda, 0xa1, 0xa2, 0xb2, 0x7f, 0x83, 0xf8, 0x7a, 0x73, 0xef, 0x8f, 0x51, + 0xac, 0xf5, 0xdf, 0xf1, 0xe6, 0x81, 0x94, 0xad, 0x40, 0x3c, 0xc0, 0x08, + 0x80, 0xe8, 0x35, 0x98, 0x4c, 0x5f, 0x4b, 0x06, 0x92, 0xa0, 0x6c, 0xd4, + 0x1f, 0xfa, 0xbc, 0xe0, 0x3e, 0xd7, 0xe0, 0xa5, 0x74, 0xf3, 0x44, 0xd9, + 0xa7, 0xc6, 0x8a, 0xea, 0x6a, 0x79, 0xe5, 0x77, 0x13, 0x38, 0x15, 0xf0, + 0x82, 0x5c, 0x58, 0x22, 0x3e, 0x40, 0x64, 0x4d, 0x16, 0xa8, 0x67, 0x65, + 0x17, 0x97, 0x85, 0x88, 0xe7, 0x3c, 0x65, 0x21, 0x23, 0xb8, 0xe9, 0xb4, + 0x1a, 0x43, 0x6a, 0xc7, 0xa0, 0x9b, 0x8b, 0xce, 0xb2, 0x8d, 0x42, 0x32, + 0x56, 0x17, 0xcd, 0x84, 0x90, 0x95, 0x1f, 0xab, 0x5c, 0xf0, 0x1f, 0xc4, + 0x93, 0x09, 0x4a, 0xa4, 0xad, 0xb1, 0x83, 0xa2, 0x02, 0xc1, 0xe1, 0x21, + 0xaa, 0x00, 0x2f, 0xc5, 0x9d, 0x15, 0xb8, 0xa8, 0xbe, 0x92, 0x7f, 0x0d, + 0x2f, 0xc7, 0xb4, 0x32, 0xf9, 0x87, 0xec, 0x7f, 0x1f, 0x3e, 0xcd, 0x32, + 0x66, 0xc5, 0xbc, 0x8b, 0x0b, 0x7a, 0x30, 0x0d, 0x99, 0xde, 0x02, 0x82, + 0xe7, 0x89, 0x80, 0x55, 0xc6, 0x51, 0xdb, 0xd3, 0xb6, 0xa9, 0xc2, 0xc0, + 0xc5, 0x10, 0x44, 0x2c, 0xde, 0x27, 0x08, 0x95, 0x68, 0x61, 0x43, 0x79, + 0x5d, 0x9a, 0xb1, 0x67, 0x4a, 0x2f, 0x0b, 0x4e, 0x84, 0xa4, 0x4c, 0x6d, + 0xe0, 0x7b, 0xed, 0x07, 0xd5, 0xe9, 0xd7, 0x51, 0xb0, 0x75, 0xbc, 0xe4, + 0x0f, 0xf4, 
0xeb, 0x76, 0x9f, 0xf4, 0x05, 0x0d, 0x0b, 0x9a, 0x9a, 0xa4, + 0xbb, 0x4f, 0xa7, 0x16, 0x79, 0x6c, 0xb9, 0xa8, 0x12, 0xd5, 0x85, 0x2f, + 0x73, 0x17, 0x55, 0xb5, 0x8b, 0xc5, 0xf8, 0xa5, 0x4e, 0x58, 0x1c, 0x55, + 0xd4, 0x37, 0xc6, 0x0b, 0x52, 0x8c, 0x6c, 0x43, 0xc2, 0xb6, 0x25, 0x76, + 0x78, 0x37, 0xc4, 0xdb, 0xeb, 0xf7, 0xaa, 0x13, 0xfc, 0x47, 0x10, 0x13, + 0x18, 0xdf, 0x80, 0x10, 0x15, 0xa7, 0x52, 0xdd, 0x55, 0x1c, 0xb0, 0xa0, + 0xca, 0x75, 0x0f, 0x34, 0x13, 0x0b, 0xae, 0x28, 0x09, 0xb7, 0xa9, 0x3d, + 0xd2, 0xf1, 0x59, 0x98, 0xed, 0x71, 0xfa, 0xfa, 0xbb, 0x28, 0x16, 0xa9, + 0x95, 0xfb, 0xc8, 0xfe, 0x2c, 0xb0, 0x4f, 0x86, 0x8c, 0x1d, 0x9d, 0xde, + 0xe5, 0x3e, 0x1b, 0xb0, 0x3d, 0x43, 0x4a, 0x31, 0xa1, 0x0e, 0xc6, 0xac, + 0xa9, 0x5f, 0x60, 0x5b, 0xfa, 0xa5, 0x2a, 0x95, 0x54, 0x3c, 0x50, 0x7f, + 0x9c, 0x9f, 0x5c, 0xfa, 0xb0, 0x5c, 0x57, 0xe9, 0x13, 0x6b, 0x47, 0xb2, + 0xda, 0x72, 0x9b, 0xbc, 0x66, 0x3c, 0x9b, 0x5a, 0x75, 0x9a, 0x1e, 0x23, + 0xac, 0xbd, 0x47, 0x00, 0x8f, 0x8f, 0xe8, 0xbf, 0xa9, 0xbf, 0x82, 0x99, + 0x23, 0x8e, 0x5f, 0x43, 0xda, 0x06, 0x7e, 0xfc, 0xf4, 0x52, 0x81, 0x5e, + 0xea, 0xab, 0x16, 0xee, 0x69, 0x73, 0xfc, 0xeb, 0xfe, 0x8d, 0xeb, 0xf8, + 0xcc, 0x80, 0xa2, 0x09, 0xb9, 0x6e, 0x24, 0x8a, 0xea, 0x23, 0x7c, 0xdf, + 0xbe, 0xc8, 0x4d, 0x03, 0x81, 0x1c, 0x48, 0xcd, 0x47, 0x7e, 0xf8, 0x23, + 0xf1, 0xd4, 0x5f, 0xce, 0x5b, 0xfc, 0x09, 0xe8, 0xe0, 0xc9, 0x4d, 0x80, + 0x35, 0xe1, 0x6c, 0x3e, 0x9a, 0xe0, 0x92, 0xa8, 0x65, 0xa8, 0x5e, 0x76, + 0x51, 0x4b, 0x56, 0xa5, 0x99, 0x60, 0x50, 0xcb, 0x04, 0xcf, 0xaa, 0xad, + 0x79, 0x82, 0x23, 0x1a, 0x38, 0x9b, 0x8f, 0x52, 0xdb, 0x4d, 0x82, 0xa6, + 0xaf, 0xcf, 0xde, 0x3f, 0x7f, 0xca, 0x2c, 0xba, 0x56, 0x2d, 0x7c, 0x2c, + 0x5d, 0x09, 0xc3, 0xcd, 0x74, 0xc6, 0x79, 0xeb, 0x1e, 0x33, 0x4b, 0x25, + 0x5c, 0x5c, 0xc4, 0x8e, 0xa4, 0x36, 0x0a, 0xf7, 0xb8, 0x3c, 0x42, 0x83, + 0xab, 0x7d, 0xc3, 0x07, 0xa6, 0x93, 0x06, 0x1d, 0x85, 0xb3, 0xf5, 0x0a, + 0x65, 0x7d, 0x14, 0xeb, 0x4f, 0x4d, 0xe1, 0xb7, 0x65, 0xeb, 0x15, 0x9d, + 0xf8, 0x36, 
0x81, 0x60, 0x2c, 0x5f, 0x88, 0xb6, 0x97, 0xb6, 0xd1, 0xf1, + 0xa5, 0x98, 0x51, 0x29, 0x19, 0xae, 0x20, 0xba, 0x6e, 0xc4, 0x61, 0x6b, + 0x5e, 0x03, 0x39, 0x22, 0x43, 0xd0, 0x74, 0xc6, 0x2a, 0xdb, 0x4f, 0x47, + 0x92, 0xf4, 0x1d, 0x2d, 0x19, 0x79, 0xc4, 0x43, 0x74, 0xc4, 0xd0, 0xbe, + 0x6d, 0x8a, 0x8e, 0xa2, 0xcc, 0x77, 0x81, 0x5f, 0x9f, 0x8a, 0x35, 0xfe, + 0x6e, 0x57, 0xfb, 0x5e, 0x37, 0x25, 0x7d, 0xd9, 0x48, 0x98, 0xed, 0x5e, + 0xb3, 0x9a, 0x48, 0x7e, 0x67, 0x33, 0xa4, 0xc3, 0x97, 0x5e, 0x01, 0x87, + 0xea, 0xd8, 0x33, 0xe7, 0x0c, 0xc2, 0x11, 0x88, 0x54, 0xf8, 0x5b, 0xbb, + 0x04, 0xe2, 0x6a, 0xa9, 0x35, 0xf8, 0x3e, 0x3f, 0x70, 0xa7, 0xe3, 0xdd, + 0xd5, 0x56, 0x98, 0x09, 0x3e, 0xfb, 0xef, 0xbf, 0x40, 0x9c, 0xa9, 0x60, + 0xab, 0x41, 0xfd, 0xb9, 0xfd, 0x6b, 0x21, 0x75, 0xa6, 0x35, 0xd9, 0x90, + 0xc6, 0xe8, 0xbf, 0xc3, 0x8f, 0xdf, 0xed, 0x34, 0xa9, 0xc3, 0x3e, 0x82, + 0xbc, 0xcd, 0x24, 0xf5, 0x12, 0x65, 0xa8, 0xf3, 0xec, 0x6b, 0x29, 0xa4, + 0xbb, 0xd7, 0x6c, 0xe0, 0xb4, 0xd0, 0x32, 0x31, 0xdb, 0x9b, 0x63, 0xc7, + 0xdd, 0xd6, 0x35, 0x64, 0x11, 0xe0, 0xe3, 0x19, 0xc7, 0x67, 0x0f, 0xea, + 0x98, 0x53, 0xf3, 0x95, 0x3d, 0x92, 0xfc, 0x86, 0xd4, 0xd5, 0x30, 0x7e, + 0x33, 0x73, 0x61, 0xcb, 0x4c, 0x52, 0x24, 0x7a, 0xb2, 0xf5, 0x96, 0x72, + 0x6f, 0x56, 0xf0, 0xcf, 0xca, 0x8d, 0xaf, 0xe0, 0xbf, 0x62, 0xbb, 0x4c, + 0x81, 0x06, 0xc3, 0xef, 0x0a, 0x48, 0x19, 0x72, 0x5d, 0x87, 0x92, 0xe6, + 0x0c, 0x61, 0xf5, 0xbc, 0x41, 0xbe, 0x88, 0x34, 0x44, 0x5d, 0xbb, 0x57, + 0x3f, 0x7a, 0x9b, 0x19, 0xa2, 0x74, 0x3d, 0x83, 0x81, 0xef, 0xe0, 0x30, + 0xdb, 0xbd, 0xf6, 0x8e, 0xab, 0x34, 0x68, 0x34, 0x7b, 0x50, 0x77, 0x65, + 0xfe, 0xcf, 0x42, 0x00, 0x2f, 0x53, 0x79, 0x30, 0x30, 0xf4, 0xf0, 0x55, + 0x09, 0xeb, 0xc4, 0x44, 0x03, 0x69, 0x32, 0x20, 0x7a, 0x82, 0x21, 0x62, + 0x09, 0x0a, 0x17, 0xc2, 0x2c, 0x2d, 0x59, 0x59, 0x0f, 0x22, 0x93, 0xe9, + 0x47, 0x5c, 0x31, 0xf3, 0xde, 0xcf, 0x0a, 0xc8, 0x7e, 0x72, 0xe6, 0xea, + 0x23, 0x1c, 0xe1, 0x7f, 0x54, 0xc0, 0x92, 0x69, 0x27, 0x8b, 0x6c, 0xa4, + 0xd5, 0x55, 
0xb9, 0xbe, 0x9c, 0xe1, 0x39, 0x5b, 0xa5, 0xb8, 0xf1, 0x89, + 0x24, 0xbe, 0xea, 0xa7, 0xf3, 0xea, 0xb9, 0xef, 0x3c, 0xbb, 0x08, 0x49, + 0x53, 0x6b, 0x38, 0xd9, 0x95, 0xcf, 0x5e, 0x07, 0xaf, 0xf9, 0x5b, 0x86, + 0x68, 0x38, 0x46, 0x77, 0x25, 0x6f, 0xac, 0x2f, 0xf2, 0x52, 0x6e, 0x72, + 0x19, 0x8b, 0x58, 0xf8, 0x9c, 0xde, 0x86, 0x86, 0x93, 0x0f, 0x0a, 0x15, + 0x93, 0x24, 0x82, 0xfd, 0xdb, 0x39, 0x5b, 0xc1, 0x7b, 0x32, 0xb7, 0xfa, + 0x22, 0x1f, 0xf0, 0xa7, 0xd0, 0x48, 0xa6, 0xc0, 0x8d, 0x1c, 0xa8, 0xdb, + 0x57, 0x0a, 0x05, 0x67, 0x06, 0x70, 0x8e, 0x03, 0x65, 0xf6, 0xd9, 0x1f, + 0xc1, 0x4f, 0x01, 0xa5, 0xa9, 0x1a, 0xf9, 0x7b, 0x97, 0x27, 0x9c, 0x81, + 0xaf, 0xbb, 0xbe, 0xf4, 0x99, 0x6e, 0xf0, 0xd1, 0x2d, 0x01, 0xfb, 0x83, + 0x03, 0x27, 0x02, 0xe9, 0xcb, 0x65, 0x43, 0x5c, 0x8d, 0x55, 0x3c, 0xc9, + 0x3c, 0xc9, 0xfe, 0x17, 0xd8, 0xea, 0x7d, 0xd7, 0xa5, 0xc8, 0xe7, 0xd5, + 0xc4, 0x9f, 0x51, 0xe9, 0xf4, 0x7b, 0x62, 0x25, 0xc6, 0xf4, 0xf5, 0x06, + 0x3f, 0x42, 0xf1, 0x85, 0xf2, 0x7b, 0xf4, 0x7c, 0xc4, 0x51, 0x8e, 0xe6, + 0x88, 0x49, 0xbf, 0xf7, 0x84, 0x2b, 0xbd, 0x27, 0xaa, 0x69, 0xb6, 0xbc, + 0x86, 0x05, 0x3a, 0xe6, 0x07, 0xe7, 0xa8, 0x00, 0xb5, 0x66, 0x40, 0x4e, + 0x3d, 0x3f, 0xa8, 0x8b, 0xe9, 0xfa, 0x82, 0x0c, 0x1b, 0x35, 0xea, 0x0c, + 0x4e, 0x3f, 0xa9, 0xa0, 0x73, 0xd5, 0x96, 0x2b, 0x32, 0xdf, 0x1d, 0x30, + 0x1d, 0x1b, 0xca, 0x7c, 0xc1, 0xd9, 0xe3, 0x46, 0x17, 0xa0, 0xa2, 0x49, + 0x01, 0x42, 0x86, 0x9f, 0x75, 0x30, 0xfa, 0x6d, 0x7a, 0x86, 0x89, 0x32, + 0x85, 0xfd, 0xf3, 0x90, 0x6a, 0x3e, 0x84, 0x8d, 0x8c, 0x42, 0xfd, 0xb5, + 0x4c, 0xf4, 0x37, 0xae, 0x50, 0x95, 0xba, 0x4b, 0x98, 0x31, 0x3e, 0x79, + 0xfa, 0x64, 0x21, 0xe5, 0x75, 0xa3, 0x49, 0x8c, 0x73, 0x49, 0x68, 0x92, + 0xcc, 0x19, 0x9d, 0x08, 0xf9, 0x62, 0xde, 0x42, 0xb8, 0x20, 0x47, 0xec, + 0x4f, 0xf3, 0x92, 0xa1, 0x89, 0xf1, 0x4a, 0x95, 0xa7, 0x69, 0xdc, 0xe0, + 0x4a, 0xb3, 0x0e, 0x85, 0x75, 0x02, 0x9c, 0x79, 0xd0, 0x24, 0xc3, 0x18, + 0x66, 0xe0, 0x63, 0x91, 0x7a, 0x0b, 0xc6, 0x3b, 0x31, 0x5c, 0x1e, 0x47, + 0x0a, 0x71, 
0x2b, 0xa2, 0x21, 0x4d, 0x69, 0x2c, 0x72, 0xd0, 0xcc, 0x66, + 0xf0, 0xd0, 0x45, 0x9d, 0x04, 0xf2, 0x6c, 0x42, 0x99, 0xd5, 0x1e, 0xb8, + 0x6d, 0x9b, 0xbd, 0x27, 0xf2, 0xcf, 0xda, 0xdb, 0x1a, 0x16, 0x8f, 0xf3, + 0x8a, 0x5f, 0xb6, 0x81, 0xc1, 0xe9, 0xbd, 0x90, 0x71, 0xe9, 0x53, 0xd2, + 0x1f, 0xaa, 0x53, 0xce, 0xf0, 0x80, 0x70, 0x92, 0x6c, 0x72, 0x14, 0xb1, + 0x9b, 0x75, 0x90, 0x5a, 0x15, 0x41, 0xa8, 0xbe, 0x26, 0x29, 0xd1, 0xa8, + 0x96, 0x21, 0xda, 0x73, 0xb1, 0x4e, 0xbc, 0x96, 0x36, 0x19, 0x78, 0x87, + 0x2a, 0x76, 0x09, 0x37, 0x11, 0x6b, 0x59, 0xeb, 0x9b, 0x3f, 0xe3, 0x8e, + 0x13, 0xbd, 0x71, 0x0e, 0x78, 0xc3, 0xb5, 0xc4, 0xac, 0x7e, 0x17, 0x18, + 0x8a, 0x49, 0xad, 0xf8, 0x53, 0xe0, 0x1c, 0xd4, 0x82, 0xe0, 0x66, 0x18, + 0x1e, 0xfa, 0x12, 0x88, 0xce, 0x72, 0x2a, 0xae, 0xfc, 0x18, 0x2c, 0xd2, + 0xe0, 0xf1, 0x6b, 0x47, 0x0f, 0x92, 0xe9, 0x61, 0x59, 0x13, 0x31, 0x51, + 0xb8, 0xdf, 0x08, 0x74, 0xef, 0xa5, 0xbe, 0x7b, 0x81, 0xd4, 0x95, 0x90, + 0x33, 0x59, 0x56, 0x68, 0x9f, 0x50, 0x17, 0x92, 0x38, 0xb5, 0x16, 0x6c, + 0x31, 0x3f, 0xb2, 0xe9, 0x3a, 0x62, 0x0e, 0x63, 0xfe, 0x2b, 0xff, 0x75, + 0xc6, 0x11, 0x6e, 0xae, 0xab, 0xfc, 0xdc, 0x95, 0xaa, 0x87, 0x09, 0x0a, + 0x23, 0xb9, 0x6f, 0xc3, 0x19, 0xb1, 0xcb, 0xc0, 0x4e, 0x6e, 0x6d, 0xc6, + 0x6e, 0xf8, 0x70, 0x99, 0x5a, 0x8f, 0x40, 0x54, 0x5d, 0xe9, 0x3c, 0xb6, + 0x0c, 0xb0, 0x66, 0xf0, 0xc7, 0xdd, 0xd3, 0xe0, 0xd5, 0x11, 0xc1, 0x4b, + 0x0c, 0x99, 0x8c, 0x77, 0xbb, 0xf1, 0xe0, 0x21, 0xf6, 0xe9, 0x03, 0x4b, + 0x4f, 0x78, 0xc2, 0x6d, 0xf1, 0x7b, 0x38, 0x81, 0x32, 0xe7, 0x02, 0x4b, + 0xbd, 0xdd, 0x32, 0x33, 0x19, 0x04, 0x02, 0x57, 0xed, 0x75, 0xe1, 0x18, + 0x6e, 0x0f, 0x2b, 0x9c, 0xc3, 0xa9, 0x29, 0x03, 0x04, 0x1a, 0xd0, 0xe5, + 0x04, 0xdd, 0x7f, 0x45, 0x73, 0x08, 0x9f, 0x84, 0x6e, 0xe1, 0x55, 0xea, + 0xf9, 0xcc, 0xa6, 0x87, 0x62, 0xef, 0x32, 0x29, 0xaa, 0x28, 0x7f, 0xe5, + 0x5a, 0xbe, 0x2f, 0x00, 0x74, 0xa3, 0x11, 0xf6, 0x8b, 0xf6, 0xfa, 0x93, + 0x7e, 0xcf, 0x99, 0x60, 0xb6, 0x9c, 0xa6, 0x58, 0x26, 0x6e, 0x4c, 0x23, + 0x3c, 0x17, 
0x35, 0x4f, 0x52, 0xbc, 0x84, 0x8e, 0xa8, 0x57, 0x92, 0x89, + 0xb8, 0xa2, 0xb4, 0x6f, 0xc4, 0x43, 0x0f, 0x54, 0x29, 0x47, 0x73, 0x79, + 0x0a, 0x0a, 0x4f, 0xa5, 0x26, 0x72, 0x1b, 0x0f, 0x92, 0x08, 0xe1, 0xf5, + 0xde, 0x20, 0xee, 0xbb, 0x85, 0xed, 0xcc, 0xd7, 0x85, 0xc1, 0x90, 0xaa, + 0xaf, 0xb7, 0x30, 0x3b, 0x13, 0x38, 0x34, 0x1a, 0xdc, 0x6f, 0x97, 0x61, + 0x3e, 0xd1, 0xb5, 0x93, 0xda, 0x6c, 0x9e, 0xbb, 0xb0, 0x02, 0xbc, 0x12, + 0xc3, 0xdf, 0x71, 0xb6, 0x61, 0x8d, 0x78, 0x94, 0xc6, 0x5a, 0x0e, 0xf1, + 0x17, 0x70, 0x3e, 0x9c, 0x95, 0xcb, 0x06, 0x24, 0xb5, 0xb1, 0x38, 0x6c, + 0xb8, 0x8d, 0xd3, 0x8c, 0x30, 0x3b, 0x56, 0x65, 0x4a, 0xe2, 0x43, 0x5b, + 0xcc, 0x9d, 0xeb, 0xca, 0x3a, 0x6d, 0x3d, 0xd8, 0x10, 0x20, 0xf0, 0x01, + 0x15, 0x32, 0xf9, 0x43, 0x61, 0x4a, 0xe6, 0x35, 0xc7, 0x77, 0x32, 0x7d, + 0xcd, 0x04, 0x77, 0x12, 0x19, 0xbc, 0x2f, 0xa7, 0xed, 0xbd, 0xbd, 0xdf, + 0xb1, 0xcf, 0x19, 0xb8, 0xce, 0x12, 0x0a, 0x25, 0x0f, 0xf7, 0xad, 0x84, + 0xb1, 0x41, 0xba, 0x1e, 0xf2, 0xb7, 0x85, 0x02, 0xd1, 0x74, 0x56, 0x32, + 0xb4, 0xf2, 0x60, 0xe5, 0x4b, 0x1b, 0xf6, 0x6a, 0xb2, 0x50, 0x16, 0x83, + 0xa4, 0x94, 0xc8, 0x9d, 0x1e, 0x5f, 0x38, 0xed, 0xa9, 0xd5, 0x81, 0x46, + 0xf2, 0x90, 0x22, 0xe8, 0xcc, 0x1f, 0x24, 0x77, 0x4d, 0xbd, 0x75, 0xa4, + 0xc1, 0x1b, 0x52, 0x97, 0xd2, 0xf6, 0x4c, 0xa9, 0x81, 0x33, 0x4d, 0xaf, + 0xb5, 0xf8, 0x2f, 0x24, 0xf2, 0xb6, 0x36, 0xc8, 0x28, 0x57, 0xc4, 0xe5, + 0x99, 0x69, 0x96, 0xac, 0xab, 0x87, 0xd3, 0x3e, 0xea, 0xf5, 0x02, 0xb7, + 0x6a, 0x6a, 0x58, 0x1f, 0x58, 0x1b, 0x26, 0xae, 0xaf, 0x6e, 0x24, 0xaf, + 0xa9, 0xc7, 0xfe, 0x29, 0x75, 0xe4, 0x8e, 0x5d, 0x89, 0xef, 0x05, 0x15, + 0xea, 0xc4, 0x26, 0x9d, 0xc8, 0x55, 0x69, 0x21, 0xa4, 0xac, 0xcf, 0x21, + 0x15, 0x0d, 0x4b, 0x9e, 0xd7, 0xde, 0x17, 0xcd, 0xc0, 0x36, 0xc2, 0xeb, + 0x2b, 0x2e, 0xc0, 0x31, 0x03, 0x18, 0x44, 0x79, 0x50, 0x3d, 0x66, 0x18, + 0x21, 0x14, 0xbf, 0x2c, 0x4c, 0xf7, 0xe9, 0x6f, 0xb1, 0xe3, 0x48, 0xfd, + 0xfd, 0x99, 0x72, 0xd8, 0xe5, 0xa7, 0x9e, 0xca, 0xf0, 0x89, 0x69, 0x9d, + 0xac, 0x84, 
0x5c, 0xc8, 0xe4, 0x5d, 0x04, 0x48, 0xd3, 0xb4, 0x2c, 0xa8, + 0xf5, 0x19, 0x8c, 0x22, 0x2b, 0x27, 0x7d, 0x9b, 0x70, 0x42, 0x9c, 0x3f, + 0x64, 0x34, 0x91, 0x20, 0x5b, 0xb0, 0x4e, 0xd9, 0x7e, 0xaf, 0xf6, 0x6b, + 0x2b, 0xdb, 0x6a, 0x50, 0xb8, 0xd9, 0xf4, 0x5c, 0x2d, 0xac, 0xec, 0x0b, + 0xaa, 0xd5, 0x36, 0x7c, 0x2b, 0x24, 0xdc, 0x48, 0xe6, 0x92, 0x2c, 0x36, + 0x63, 0xb1, 0x40, 0xad, 0x3e, 0xbe, 0x29, 0x4f, 0x74, 0x93, 0xfd, 0x76, + 0x7c, 0xa2, 0xa4, 0x04, 0x05, 0x17, 0xe7, 0xb5, 0x29, 0x8a, 0xe8, 0x6a, + 0x7f, 0xb2, 0x5e, 0x24, 0xd5, 0x11, 0xf3, 0x83, 0x38, 0x70, 0x9a, 0x12, + 0x73, 0x88, 0x51, 0xf1, 0x25, 0x68, 0xd5, 0xe9, 0x06, 0x8c, 0x26, 0x99, + 0x59, 0xd9, 0x60, 0x92, 0xce, 0x32, 0x8f, 0x75, 0xa4, 0x86, 0xf1, 0xea, + 0xb3, 0xee, 0xf5, 0x50, 0x86, 0x7a, 0x6a, 0xa9, 0xfd, 0x3a, 0x70, 0x35, + 0xf5, 0x00, 0x0a, 0x31, 0xaf, 0xd3, 0x64, 0x21, 0x40, 0xa4, 0xe3, 0x28, + 0x3a, 0xec, 0x33, 0x70, 0x0f, 0xc3, 0xcd, 0x99, 0x3f, 0xd3, 0xb4, 0xd6, + 0xc9, 0x2f, 0x17, 0x05, 0xaf, 0xd2, 0xe7, 0xe5, 0xf5, 0x4d, 0x03, 0x84, + 0x0d, 0x8a, 0x3d, 0x76, 0x59, 0x0b, 0x10, 0xc4, 0x53, 0x46, 0x8b, 0x82, + 0x10, 0x57, 0x48, 0x28, 0x21, 0x23, 0x8b, 0x9b, 0xcf, 0x29, 0xb7, 0x51, + 0xaa, 0xaa, 0xa2, 0x26, 0x1b, 0x61, 0xd2, 0x48, 0xcc, 0x7f, 0xc7, 0x5c, + 0xa8, 0xba, 0x8f, 0x6c, 0x79, 0x63, 0x2e, 0xe3, 0x04, 0xe3, 0xb2, 0xb2, + 0x52, 0xe5, 0xb9, 0xe0, 0xa9, 0xd9, 0xbb, 0xc1, 0x83, 0xf0, 0xbf, 0x11, + 0x1e, 0x69, 0x59, 0xfd, 0x0f, 0x94, 0x4c, 0x85, 0x44, 0xef, 0xbd, 0x65, + 0xbc, 0xb8, 0xb7, 0x0e, 0xb0, 0x48, 0xc8, 0x36, 0x24, 0xd5, 0xe8, 0x75, + 0x99, 0xa0, 0x4e, 0x7b, 0x82, 0x90, 0x54, 0x11, 0x9c, 0xbf, 0x48, 0x9e, + 0xf3, 0xd7, 0x66, 0x8e, 0x22, 0xbb, 0x05, 0x49, 0xc1, 0xb3, 0x51, 0xa4, + 0xe1, 0x10, 0x3f, 0x57, 0x2a, 0x27, 0x08, 0x0f, 0x86, 0xf6, 0xd1, 0x3b, + 0x49, 0x58, 0xf4, 0xbd, 0xd1, 0xc7, 0xf1, 0xe2, 0xed, 0xf7, 0x33, 0x63, + 0x61, 0x6e, 0xe2, 0xb5, 0x73, 0x96, 0xcd, 0xbb, 0xaf, 0x5d, 0xf1, 0x31, + 0xe8, 0xd3, 0x25, 0xcd, 0x35, 0xba, 0xbd, 0x7e, 0x1d, 0x14, 0x10, 0x3a, + 0x29, 0x87, 
0xe1, 0xdd, 0x53, 0x80, 0x42, 0xf8, 0xff, 0x12, 0xdb, 0x4c, + 0x8a, 0xdd, 0x5f, 0xcb, 0x7c, 0xe7, 0xa8, 0xa9, 0xad, 0x1c, 0x33, 0xfe, + 0x0f, 0x17, 0x57, 0x2b, 0xdd, 0x2f, 0xc1, 0x3c, 0xf4, 0x02, 0x05, 0x2b, + 0x27, 0x82, 0xd7, 0x34, 0x72, 0x2f, 0x25, 0x18, 0xe8, 0x3e, 0x3b, 0x38, + 0x43, 0x7b, 0xb4, 0xb7, 0x81, 0xf8, 0x16, 0x92, 0x16, 0x93, 0x6c, 0x78, + 0xe7, 0xbb, 0x60, 0x87, 0xf7, 0x49, 0x47, 0x40, 0xbb, 0x52, 0x7d, 0x13, + 0xab, 0x27, 0x4b, 0x7a, 0x07, 0x92, 0x02, 0x83, 0xd2, 0xaa, 0x34, 0x3b, + 0xe8, 0x5d, 0xc8, 0xb4, 0xcc, 0xa2, 0xea, 0xa3, 0x78, 0x75, 0x76, 0x32, + 0x36, 0xc6, 0x36, 0x85, 0x89, 0x8c, 0x6e, 0x6c, 0xcd, 0xf4, 0x43, 0x32, + 0xf6, 0x21, 0x03, 0x7d, 0x42, 0x08, 0xd1, 0x11, 0xb8, 0x14, 0x08, 0x29, + 0x1c, 0x14, 0x00, 0x67, 0xe6, 0x2c, 0x21, 0xc9, 0xc6, 0x2a, 0xb5, 0x6f, + 0xc5, 0x7c, 0xa8, 0x00, 0x81, 0xa4, 0x86, 0x53, 0xed, 0x66, 0x5d, 0x7e, + 0x21, 0xd4, 0x68, 0x11, 0xec, 0x59, 0xc5, 0x3d, 0x5c, 0x5f, 0x28, 0xff, + 0xd0, 0xe2, 0xd6, 0x52, 0x1e, 0x43, 0xd2, 0x8e, 0x00 +}; +static const guint profile_0_frame0_len = 8085; + +/* superframe, consists of two frames. 
+ * decoding only frame is followed by normal frame + * first frame size: 5796 + * the last frame size: 369 */ +static const guint8 profile_0_frame1[] = { + 0x84, 0x00, 0x80, 0x49, 0x72, 0x70, 0x9e, 0xc0, 0x00, 0x5c, 0x7f, 0x93, + 0x59, 0x50, 0x2b, 0xf4, 0xe0, 0x5f, 0x1d, 0xdd, 0xec, 0xcc, 0x24, 0x77, + 0xbd, 0x77, 0x68, 0x93, 0x7a, 0xf9, 0x77, 0xbc, 0x02, 0x97, 0x24, 0x5c, + 0x07, 0x33, 0xae, 0xbe, 0x87, 0xf3, 0xa0, 0xbf, 0x8f, 0xf5, 0x79, 0x1f, + 0xf0, 0x6a, 0xe9, 0x90, 0x5d, 0x33, 0x00, 0x00, 0x7f, 0x2d, 0x32, 0xce, + 0xf8, 0x93, 0x00, 0x6e, 0xaf, 0x72, 0xcf, 0x0b, 0x37, 0xdf, 0x4e, 0xa0, + 0x20, 0x60, 0xf7, 0xce, 0xd8, 0x2a, 0x54, 0x5b, 0xb6, 0xcd, 0x44, 0xa8, + 0x37, 0xea, 0x08, 0x46, 0x4f, 0x36, 0xd6, 0x6c, 0xe2, 0x0f, 0x4a, 0x2d, + 0xff, 0x84, 0xc5, 0x82, 0x3b, 0x58, 0x1a, 0x37, 0xb9, 0x92, 0x85, 0x92, + 0x08, 0x3e, 0x8b, 0x56, 0x7e, 0x89, 0x83, 0xf6, 0x5f, 0x8d, 0x37, 0x9f, + 0x65, 0x89, 0x6d, 0x3a, 0x9b, 0x50, 0x81, 0xcd, 0xc8, 0xb2, 0xc2, 0xe9, + 0x9f, 0x4e, 0x70, 0xb9, 0xe2, 0x48, 0xcd, 0x0e, 0x4c, 0xfd, 0xcc, 0xc6, + 0x58, 0x3a, 0x02, 0x61, 0x6c, 0x16, 0xf9, 0xe6, 0x64, 0x51, 0x61, 0x49, + 0x51, 0x5d, 0xbe, 0xcb, 0x27, 0xc1, 0xdb, 0xec, 0x09, 0x8a, 0x67, 0x6c, + 0x9a, 0xe3, 0x1b, 0x0d, 0x14, 0xb9, 0x2a, 0x5a, 0x55, 0xf5, 0x9d, 0xa3, + 0x4a, 0x9b, 0x76, 0x72, 0x43, 0x2d, 0x0a, 0x7a, 0xfd, 0x5f, 0x26, 0x6d, + 0x80, 0x04, 0x03, 0x7a, 0x45, 0x89, 0xda, 0xd1, 0x1e, 0x0e, 0x86, 0xa4, + 0x45, 0x5e, 0x6d, 0xe9, 0xd3, 0xf4, 0xd0, 0x12, 0x67, 0xcb, 0xd2, 0xa3, + 0xe7, 0x35, 0x08, 0xc8, 0x84, 0xbb, 0xc5, 0xc8, 0xe5, 0xb0, 0x02, 0x71, + 0xa2, 0xff, 0xaa, 0x02, 0xef, 0x11, 0x6c, 0x39, 0xa8, 0x6b, 0x4b, 0x4c, + 0x1f, 0x5a, 0xd4, 0x7d, 0xa7, 0xda, 0x12, 0x00, 0xcc, 0x79, 0xd1, 0x70, + 0x97, 0x60, 0x20, 0x95, 0xfb, 0x25, 0x92, 0xb7, 0xf4, 0x8f, 0xab, 0x09, + 0x2d, 0x01, 0x16, 0x8e, 0x48, 0xc1, 0xe7, 0x01, 0xc1, 0x96, 0xf3, 0xf9, + 0xea, 0xa9, 0x4f, 0x64, 0xb7, 0x56, 0x17, 0x48, 0x1c, 0xdd, 0x61, 0x99, + 0x42, 0xed, 0x51, 0xe3, 0xe3, 0x14, 0x96, 0x4a, 0x9a, 0x94, 0x59, 
0x47, + 0xa4, 0x4b, 0xdc, 0xea, 0x35, 0x58, 0xa8, 0xea, 0x5e, 0x06, 0xbd, 0x91, + 0xbd, 0xa1, 0x8c, 0xb1, 0x27, 0xd0, 0x76, 0x30, 0x55, 0xfc, 0x79, 0x8a, + 0x39, 0xe0, 0x38, 0xeb, 0xd9, 0x4b, 0x6a, 0x6a, 0x28, 0xfa, 0xb9, 0xf7, + 0x85, 0x25, 0x65, 0xd7, 0xb3, 0x99, 0xbc, 0xfb, 0x4f, 0x90, 0xe5, 0x02, + 0x01, 0x6d, 0xcb, 0xbe, 0x0a, 0xc4, 0xa6, 0xb4, 0x05, 0x91, 0xdb, 0xaa, + 0xc7, 0xb9, 0x5f, 0xdc, 0x18, 0xc2, 0xc9, 0x2c, 0xca, 0xa4, 0xe9, 0x11, + 0x65, 0xad, 0xc5, 0x15, 0xc4, 0x4d, 0x3f, 0x3a, 0xc8, 0xfe, 0x03, 0x64, + 0x05, 0xbd, 0xf7, 0xd6, 0x76, 0xd0, 0xe0, 0xf0, 0x0b, 0x7b, 0xef, 0xdc, + 0x34, 0xab, 0xfa, 0xbf, 0xb0, 0xaf, 0xec, 0xfd, 0x54, 0x70, 0x4a, 0xe1, + 0x26, 0x1d, 0xac, 0x69, 0xa0, 0x66, 0x7c, 0x87, 0xed, 0xe6, 0xf7, 0x19, + 0xf3, 0xeb, 0xcf, 0x6c, 0xb7, 0x0d, 0x46, 0x09, 0x6c, 0xf6, 0x97, 0x16, + 0x38, 0x6f, 0x13, 0x3a, 0xde, 0xd4, 0x41, 0xec, 0x4c, 0x65, 0x24, 0xd5, + 0x63, 0xe7, 0x9f, 0x9d, 0xb5, 0x3e, 0xdb, 0xc2, 0xef, 0x58, 0xba, 0x2f, + 0x98, 0x8e, 0x48, 0x08, 0x0a, 0x0e, 0x7e, 0xbc, 0xa4, 0x9f, 0x5a, 0xd0, + 0x89, 0xba, 0x75, 0xa6, 0xc5, 0xe9, 0x24, 0x9d, 0x28, 0x6b, 0x5b, 0x06, + 0xd8, 0xf7, 0xfe, 0x9f, 0x38, 0x10, 0x31, 0x4f, 0x7e, 0x21, 0xd6, 0x81, + 0xf3, 0x13, 0x7a, 0x21, 0x90, 0xda, 0x52, 0x84, 0x53, 0x74, 0x00, 0x8b, + 0x85, 0xf2, 0x90, 0xbe, 0x4c, 0x88, 0x9c, 0x01, 0xd2, 0xd1, 0x1c, 0x6a, + 0xf4, 0x5a, 0xb6, 0x0f, 0x7f, 0x5d, 0x70, 0xff, 0x78, 0x9b, 0x5d, 0x4d, + 0xfe, 0xc7, 0x87, 0xdf, 0x96, 0x4d, 0x27, 0x04, 0x55, 0x56, 0x65, 0x13, + 0xb1, 0xc2, 0x30, 0xce, 0x02, 0x7f, 0x36, 0x9a, 0xa2, 0x06, 0x5d, 0x6a, + 0xf9, 0x4b, 0xb7, 0x3b, 0x61, 0xeb, 0x03, 0xd6, 0x05, 0xfd, 0xf7, 0x34, + 0xfe, 0xb7, 0x87, 0x43, 0xb8, 0xbf, 0xc0, 0xd1, 0xed, 0x48, 0x65, 0x2f, + 0x8e, 0xf7, 0x21, 0x3f, 0x5a, 0x84, 0x58, 0x1d, 0x8f, 0x0d, 0x4d, 0x0f, + 0x6a, 0x25, 0xec, 0x8a, 0xdf, 0xeb, 0x06, 0xd5, 0xba, 0xc5, 0xbb, 0xc7, + 0x0d, 0x1b, 0x4d, 0xf7, 0x3b, 0x88, 0x85, 0x81, 0x06, 0xc5, 0x61, 0x36, + 0x60, 0xfb, 0x9a, 0xdf, 0xef, 0xd4, 0x35, 0x9b, 0x37, 0xfb, 0xcd, 
0x4a, + 0x55, 0xed, 0xaf, 0xf2, 0xa9, 0x7b, 0x1f, 0x27, 0xfa, 0x4e, 0x02, 0x28, + 0x24, 0xc8, 0xcf, 0xac, 0x23, 0x12, 0x46, 0x84, 0xb5, 0xad, 0x0a, 0x7a, + 0x44, 0x8c, 0xbc, 0xc7, 0x82, 0x5b, 0xc1, 0x2c, 0x5f, 0x02, 0x27, 0x8f, + 0xd1, 0xa0, 0x96, 0x6c, 0x7c, 0x20, 0x53, 0x8d, 0xaa, 0x75, 0x2e, 0xfa, + 0xbd, 0xd2, 0xc2, 0x30, 0xdb, 0xe4, 0x70, 0x4c, 0x40, 0x1e, 0xdd, 0xd7, + 0xf0, 0x41, 0x31, 0x3e, 0x12, 0xd2, 0xd5, 0x6a, 0x6e, 0xfc, 0xec, 0xde, + 0xb6, 0x5c, 0xd3, 0x43, 0x7e, 0x50, 0xab, 0xb5, 0x80, 0x39, 0x65, 0xcd, + 0x38, 0x98, 0x64, 0x4e, 0x03, 0x79, 0x52, 0x73, 0x7e, 0x80, 0xe7, 0x14, + 0xf7, 0xc6, 0xa9, 0xda, 0x68, 0x5e, 0x7c, 0x26, 0x56, 0xf6, 0x67, 0x6b, + 0xb4, 0xaa, 0x8b, 0x76, 0xbe, 0xa1, 0x29, 0xc7, 0x85, 0xcc, 0x28, 0x56, + 0x47, 0x6e, 0x81, 0x2b, 0xc4, 0xf0, 0x66, 0xa2, 0x3b, 0xd5, 0xe4, 0x9d, + 0x85, 0xfe, 0xfd, 0x21, 0xb6, 0x86, 0x29, 0xe7, 0xf6, 0x01, 0xc2, 0x97, + 0x24, 0x3c, 0x9b, 0xc7, 0x8b, 0x9f, 0x4b, 0x0a, 0x94, 0xf3, 0x63, 0x62, + 0xf0, 0x73, 0xf8, 0xbe, 0xb1, 0xee, 0x2f, 0xed, 0xa7, 0xa8, 0xfd, 0x24, + 0x09, 0x14, 0x2a, 0x12, 0xe2, 0xbd, 0x13, 0x33, 0x15, 0xa0, 0xca, 0x17, + 0x36, 0x22, 0xc2, 0x64, 0x98, 0xf0, 0x44, 0xb4, 0x5e, 0x9d, 0x28, 0xdf, + 0xca, 0xb3, 0x7a, 0x92, 0x4c, 0x37, 0x6e, 0xea, 0x62, 0x00, 0xf1, 0x7a, + 0xc6, 0xe3, 0xbc, 0x46, 0x79, 0xad, 0xc8, 0xce, 0x94, 0x67, 0x40, 0x2f, + 0x6b, 0x2e, 0x57, 0x84, 0xca, 0xca, 0x16, 0x54, 0xbb, 0xc8, 0xf7, 0x78, + 0x31, 0x6f, 0x10, 0x59, 0x8b, 0x7b, 0xf6, 0x15, 0x7d, 0x11, 0x21, 0x24, + 0x86, 0xdb, 0x32, 0x95, 0x12, 0x6f, 0x96, 0x95, 0xd7, 0x76, 0xcd, 0xef, + 0x1e, 0xec, 0x4f, 0x01, 0x5e, 0x89, 0xf1, 0xbb, 0x08, 0x8a, 0xa2, 0xe9, + 0xeb, 0x77, 0x3a, 0x1c, 0x9e, 0x38, 0x16, 0x7a, 0x64, 0xc2, 0x24, 0xa5, + 0x78, 0x62, 0xa7, 0x35, 0x6e, 0x53, 0x9d, 0xa0, 0x82, 0xca, 0x54, 0xd7, + 0x63, 0xcf, 0xe7, 0x3a, 0xb9, 0x0b, 0x7d, 0x5a, 0xe5, 0xdb, 0x57, 0x39, + 0x4d, 0xb4, 0xf3, 0x1d, 0xc8, 0x00, 0xa0, 0x85, 0x47, 0xf1, 0xb7, 0xf2, + 0x53, 0xdb, 0x91, 0x42, 0xfb, 0xed, 0x87, 0x0f, 0x9b, 0xb2, 0x35, 
0xe7, + 0x8f, 0x9a, 0x55, 0xd8, 0x34, 0x74, 0x71, 0x15, 0x6b, 0xb9, 0x08, 0x3e, + 0x09, 0xd6, 0x3f, 0xc0, 0x27, 0x91, 0xd7, 0xf9, 0xfa, 0x99, 0xa4, 0xd5, + 0xb3, 0xe4, 0x18, 0xbf, 0x3b, 0x4a, 0xfb, 0xbd, 0xc4, 0xcf, 0x9f, 0x01, + 0xa5, 0xd4, 0x6a, 0x1e, 0x7e, 0x0c, 0x0e, 0x09, 0x7e, 0x16, 0xb2, 0x84, + 0x32, 0xd3, 0x66, 0x71, 0x46, 0x4e, 0x4b, 0xe0, 0xc6, 0xb6, 0x80, 0x0b, + 0x55, 0xf1, 0x46, 0x89, 0x75, 0x98, 0x5f, 0xef, 0x1b, 0x26, 0xb5, 0xa2, + 0x92, 0xdd, 0x31, 0xd5, 0x15, 0xdb, 0xf8, 0x86, 0xb3, 0x78, 0x13, 0x9b, + 0x0a, 0xe7, 0x79, 0xa8, 0x42, 0xa4, 0x63, 0xdf, 0xde, 0x16, 0x6c, 0x43, + 0xe7, 0xe5, 0x35, 0x8d, 0x05, 0x22, 0xe0, 0xb4, 0xdd, 0x2f, 0x87, 0xe4, + 0xe0, 0xb1, 0x9d, 0x3a, 0x08, 0x31, 0x16, 0xf2, 0x2b, 0x43, 0x50, 0xd8, + 0x9c, 0x0c, 0x59, 0x84, 0x03, 0x00, 0xd1, 0x68, 0xfb, 0x22, 0xee, 0xcb, + 0x60, 0x7c, 0x0c, 0xf9, 0x22, 0xd1, 0x3e, 0xb4, 0xf5, 0xc9, 0xa9, 0x43, + 0x9c, 0x36, 0x9d, 0x13, 0x65, 0x62, 0x08, 0x90, 0x7e, 0x23, 0x9c, 0xc8, + 0x19, 0x93, 0xb9, 0x26, 0x2d, 0x3a, 0x7b, 0x9b, 0x1c, 0x9d, 0x39, 0x2e, + 0xa3, 0x79, 0x4b, 0x78, 0x49, 0x3e, 0xf3, 0xb2, 0xc7, 0xa9, 0xa4, 0x08, + 0x5f, 0xd2, 0x54, 0xda, 0xcd, 0xf5, 0x2d, 0x8f, 0x97, 0xa1, 0x22, 0xd7, + 0x0c, 0x6c, 0xde, 0x9b, 0x75, 0x37, 0xb1, 0xdf, 0x51, 0x4c, 0xc9, 0xa0, + 0xe6, 0x42, 0x53, 0xca, 0x74, 0x77, 0x67, 0xa4, 0xc8, 0xa4, 0x5b, 0x24, + 0x5f, 0xa2, 0x46, 0x4b, 0x8f, 0x2b, 0x71, 0x5a, 0xcb, 0x0d, 0xa8, 0xca, + 0xbd, 0xdb, 0x9a, 0x89, 0x9c, 0xd2, 0xbd, 0xd4, 0x0a, 0xc0, 0xba, 0x49, + 0x84, 0xd6, 0xa5, 0xd4, 0x5b, 0x27, 0x91, 0x6e, 0xab, 0x1c, 0x66, 0x29, + 0x63, 0xbc, 0x78, 0x75, 0x00, 0xf4, 0x66, 0x3b, 0x7e, 0xd2, 0xb6, 0x2b, + 0xfc, 0x2e, 0xf4, 0x1d, 0x76, 0x45, 0x20, 0x20, 0x00, 0x4c, 0x3d, 0x10, + 0xc0, 0x7d, 0x98, 0x0d, 0xc0, 0xc1, 0xf3, 0x40, 0x48, 0x98, 0x6b, 0xd5, + 0x7a, 0x76, 0x3b, 0xeb, 0x5b, 0x0a, 0x29, 0xd2, 0x4a, 0x53, 0x57, 0x9d, + 0x5c, 0xa6, 0xe7, 0xd3, 0xe9, 0x21, 0x7c, 0x23, 0xbb, 0x45, 0x79, 0xe4, + 0xfb, 0x1e, 0xb2, 0x85, 0x90, 0x5d, 0xa0, 0xfe, 0xc5, 0xbb, 0x2a, 
0x49, + 0x23, 0x00, 0x81, 0xf2, 0x68, 0x68, 0xed, 0x5d, 0x9a, 0xd3, 0x77, 0x1e, + 0xcf, 0xa2, 0x07, 0xa2, 0xcf, 0x71, 0x28, 0xb3, 0xa3, 0x41, 0xfa, 0xd2, + 0x68, 0xa2, 0xa0, 0x1c, 0x7c, 0x8a, 0x16, 0x7d, 0x34, 0x88, 0x29, 0xe5, + 0x7a, 0x0e, 0x24, 0xba, 0x79, 0x3c, 0x93, 0x3a, 0xf7, 0x49, 0xb8, 0x54, + 0x0c, 0xb7, 0x8f, 0x50, 0x96, 0x9c, 0x70, 0x40, 0xfe, 0x91, 0x17, 0x08, + 0xec, 0x5b, 0x2a, 0xa9, 0x32, 0x85, 0xe8, 0x88, 0xf7, 0x4a, 0x48, 0x45, + 0x0d, 0x33, 0xd4, 0xe9, 0xb3, 0xac, 0x52, 0xdc, 0xb8, 0xe6, 0x34, 0xab, + 0xf0, 0xa9, 0xd3, 0x00, 0x81, 0x39, 0x73, 0x9d, 0x46, 0x59, 0x12, 0xa1, + 0x18, 0x26, 0xb0, 0xa6, 0x38, 0xb8, 0x74, 0xf8, 0x27, 0xb9, 0xa8, 0xc0, + 0x0f, 0x71, 0x5d, 0xce, 0x57, 0x53, 0xb4, 0x93, 0x15, 0x17, 0x28, 0x2c, + 0x44, 0xea, 0xdd, 0x67, 0x52, 0x8c, 0x68, 0x03, 0x80, 0x1d, 0x6e, 0x60, + 0x40, 0x80, 0xeb, 0x99, 0x18, 0xe1, 0xe9, 0x1f, 0x81, 0xcc, 0x6c, 0x17, + 0x2d, 0xf3, 0xea, 0x93, 0x3b, 0xbd, 0x22, 0x45, 0xd4, 0x4b, 0xe1, 0xb0, + 0x0d, 0x29, 0x59, 0x2f, 0xe3, 0x1a, 0x8f, 0xa1, 0x45, 0x51, 0x63, 0x8f, + 0x11, 0x7a, 0x61, 0x4b, 0x33, 0xcd, 0xbd, 0xbc, 0x25, 0xe3, 0xd3, 0xc0, + 0x8d, 0xfa, 0x23, 0x40, 0x3d, 0xef, 0x9c, 0x42, 0x7e, 0xff, 0x6c, 0x3c, + 0x3c, 0x3d, 0x75, 0xba, 0xe2, 0xd6, 0x19, 0xee, 0x20, 0x67, 0x2f, 0xbe, + 0x0c, 0x23, 0x37, 0x01, 0x42, 0x33, 0xe2, 0xb0, 0xa5, 0x4e, 0x9e, 0xbe, + 0x2c, 0xa0, 0xc7, 0x59, 0x20, 0xfb, 0xbe, 0x3c, 0xed, 0x1f, 0xc1, 0xb0, + 0x72, 0x9e, 0x40, 0xb2, 0xa7, 0xac, 0x4e, 0x5c, 0x7b, 0x8a, 0x13, 0x90, + 0xcc, 0x6a, 0xbe, 0x34, 0x75, 0x2e, 0xeb, 0x4f, 0x13, 0x6c, 0x7c, 0xc6, + 0x48, 0x7e, 0xaf, 0x4b, 0x19, 0xcf, 0x19, 0x38, 0xe1, 0x6a, 0x49, 0x00, + 0x53, 0xfc, 0x82, 0x2b, 0x75, 0x28, 0x72, 0x56, 0x85, 0x85, 0x17, 0x53, + 0x50, 0x2c, 0x3b, 0x06, 0x35, 0xf9, 0x8f, 0xc8, 0xf9, 0xa7, 0x4d, 0x0c, + 0x14, 0x62, 0x61, 0xbd, 0xc8, 0x0e, 0x55, 0x17, 0x65, 0x5a, 0xd9, 0x6c, + 0x24, 0x0c, 0x42, 0x42, 0x50, 0x18, 0x9c, 0xfa, 0xc7, 0xf8, 0xd8, 0x47, + 0x2d, 0x41, 0x91, 0x4f, 0x9c, 0x78, 0xed, 0xd6, 0x36, 0xb4, 0xe7, 
0x70, + 0xb0, 0x3d, 0x2c, 0xc5, 0x65, 0x45, 0x00, 0xa1, 0x7a, 0x9b, 0xdc, 0xa3, + 0xd8, 0x2d, 0x47, 0x5b, 0xc0, 0x4d, 0x0f, 0x3f, 0xb4, 0xb3, 0xa8, 0xec, + 0xbd, 0xeb, 0x0b, 0x3d, 0x0d, 0xe9, 0x52, 0xd6, 0xe2, 0xc5, 0x9a, 0x96, + 0x6a, 0xdc, 0xed, 0xa6, 0x3a, 0x96, 0x33, 0x77, 0x6a, 0x83, 0x19, 0x98, + 0xfc, 0x92, 0x93, 0x44, 0x6e, 0xca, 0x76, 0x83, 0xfe, 0xd1, 0x5a, 0x02, + 0x0f, 0x92, 0x59, 0xbc, 0x9d, 0x2e, 0xe0, 0x3e, 0x5c, 0x7d, 0x6a, 0x0f, + 0x41, 0xfe, 0xdd, 0xb2, 0x11, 0x78, 0xf9, 0x5b, 0x13, 0x26, 0x65, 0xe6, + 0xc0, 0xd4, 0xc2, 0x25, 0x2d, 0x7a, 0xcb, 0xbb, 0xdb, 0x8c, 0x9f, 0xf4, + 0xd5, 0xfb, 0x57, 0xfc, 0xc7, 0xa0, 0x25, 0xbe, 0x14, 0x4a, 0xc3, 0xd6, + 0xbe, 0x7f, 0xa8, 0xc2, 0xa6, 0x02, 0x60, 0x14, 0x92, 0x70, 0xbe, 0xc2, + 0x96, 0x95, 0x0f, 0x1e, 0xf4, 0x2f, 0xf9, 0xe6, 0xeb, 0x0a, 0xcc, 0xa4, + 0xcf, 0x05, 0x18, 0x8d, 0x16, 0x30, 0x4c, 0x47, 0xe7, 0x05, 0x06, 0x92, + 0x42, 0x6d, 0x43, 0x60, 0xf5, 0xba, 0xff, 0xee, 0x86, 0xf3, 0xb3, 0xf9, + 0x8e, 0x07, 0xd7, 0x00, 0xa4, 0xc1, 0x38, 0x16, 0x6c, 0xbe, 0x70, 0x31, + 0x0c, 0x7c, 0xc2, 0x56, 0xe7, 0xc5, 0x5e, 0x80, 0x2b, 0xee, 0x14, 0xe5, + 0x19, 0xbd, 0xb3, 0x80, 0x4d, 0x80, 0x23, 0x11, 0x91, 0xb4, 0xd4, 0xdc, + 0x4c, 0xc3, 0xe3, 0x2a, 0x74, 0x85, 0xd9, 0xb0, 0xa1, 0x0f, 0x1c, 0x32, + 0x1f, 0xfc, 0x35, 0x9e, 0xdf, 0x49, 0xa8, 0x55, 0x78, 0x1a, 0x75, 0x95, + 0xab, 0x8a, 0x12, 0xaa, 0xf6, 0x0c, 0xec, 0xad, 0xb5, 0x3e, 0xf1, 0x07, + 0x01, 0x30, 0x8e, 0xc5, 0xdb, 0xbe, 0xd9, 0x2d, 0x37, 0xa4, 0x76, 0x49, + 0x79, 0xac, 0xda, 0xe7, 0xc8, 0x46, 0x89, 0x82, 0x8d, 0x42, 0x70, 0x88, + 0x4f, 0xfd, 0x65, 0x48, 0x62, 0x29, 0xae, 0x49, 0x0c, 0xbd, 0xa0, 0xe7, + 0x22, 0x06, 0xdd, 0x92, 0x18, 0xd7, 0xc0, 0x4d, 0x8a, 0x2f, 0x03, 0xcd, + 0xd4, 0x6e, 0xf9, 0x40, 0xfb, 0xb5, 0x08, 0x33, 0x63, 0xb6, 0xbc, 0x1f, + 0xd7, 0x31, 0x8c, 0x3b, 0x6c, 0x4e, 0x9c, 0x1e, 0x5a, 0x00, 0xb9, 0x57, + 0x9b, 0x25, 0xed, 0x74, 0x43, 0xf4, 0xaa, 0xa7, 0x15, 0x89, 0x36, 0x2e, + 0x63, 0x8c, 0x40, 0xfc, 0x7b, 0x03, 0xfc, 0x66, 0x7b, 0x88, 0x1a, 
0xf6, + 0xf7, 0xf8, 0xa1, 0x3b, 0xa9, 0x99, 0x6b, 0x3f, 0xe3, 0xa8, 0xe1, 0x18, + 0x74, 0xa2, 0xa7, 0x28, 0xac, 0x27, 0x6f, 0xf3, 0x16, 0x43, 0xaa, 0x1c, + 0x2c, 0x08, 0xdb, 0xc6, 0x15, 0x02, 0xe6, 0x19, 0x6d, 0x8f, 0x7b, 0xfd, + 0x81, 0x58, 0xcc, 0xda, 0x69, 0xa1, 0x9d, 0x53, 0xcd, 0xd5, 0x53, 0xce, + 0x1b, 0x3c, 0x75, 0xc3, 0x14, 0xbf, 0x96, 0x79, 0x34, 0x8a, 0x08, 0xa2, + 0xe7, 0x69, 0x7a, 0xb5, 0x0f, 0x77, 0x5f, 0xe1, 0xb8, 0xd1, 0x51, 0xce, + 0x78, 0xb3, 0xa3, 0xc1, 0x8e, 0x08, 0x73, 0x6b, 0x2a, 0x74, 0xe1, 0xf1, + 0x22, 0x43, 0xf0, 0x60, 0xcc, 0x59, 0x48, 0x44, 0xd9, 0xee, 0xf8, 0x1a, + 0xb8, 0xc7, 0x26, 0x7a, 0x74, 0xf6, 0x9b, 0xfb, 0x15, 0x43, 0x6c, 0x9f, + 0x59, 0xfa, 0x53, 0xa3, 0x94, 0x7d, 0x8e, 0x42, 0xaa, 0xd8, 0x05, 0x0e, + 0xa5, 0xe3, 0xda, 0xd5, 0xed, 0x8b, 0x14, 0x81, 0x13, 0x05, 0xa5, 0x9d, + 0xdc, 0xb5, 0x8b, 0xdf, 0x66, 0xc5, 0xd4, 0xb1, 0x14, 0x20, 0xc7, 0xee, + 0x55, 0x91, 0xc8, 0xf4, 0x69, 0x2e, 0xe0, 0x43, 0x65, 0x2b, 0x34, 0xa9, + 0x55, 0x4d, 0x9d, 0x24, 0x91, 0x58, 0x8a, 0xe2, 0x4a, 0xc9, 0x00, 0x1c, + 0x31, 0x52, 0x58, 0x87, 0xf7, 0x64, 0x4b, 0xb5, 0x8d, 0xee, 0x78, 0xae, + 0x09, 0xe5, 0xa1, 0x44, 0x21, 0x4a, 0xe8, 0x30, 0xb3, 0x67, 0x78, 0x89, + 0x44, 0x5e, 0x15, 0x4f, 0xb5, 0xfc, 0xf7, 0x82, 0xef, 0xc5, 0xe3, 0xff, + 0x21, 0x69, 0xca, 0x45, 0x56, 0x5b, 0x2d, 0x8f, 0x63, 0x6a, 0xd2, 0x67, + 0x45, 0x65, 0xd0, 0xf0, 0x37, 0xfa, 0x6c, 0x3d, 0x56, 0xbd, 0xe5, 0x64, + 0xae, 0x96, 0xed, 0xa6, 0xbd, 0x6f, 0xf1, 0xa4, 0x5c, 0x0b, 0xfd, 0x2d, + 0x1b, 0x31, 0x82, 0x10, 0x0f, 0x1d, 0xae, 0x4d, 0xa9, 0xf3, 0xbf, 0xe9, + 0x68, 0x33, 0x1d, 0x64, 0x66, 0x34, 0xc8, 0x03, 0x8d, 0x7c, 0x80, 0x97, + 0x03, 0x80, 0x84, 0x02, 0xf5, 0x39, 0xc6, 0xea, 0xc5, 0x89, 0x57, 0x87, + 0x2d, 0x05, 0x59, 0xa7, 0x1d, 0xc6, 0x26, 0x76, 0xce, 0x96, 0xdb, 0x1b, + 0x23, 0x3c, 0x51, 0x9b, 0xab, 0x43, 0xab, 0x84, 0x88, 0x30, 0x41, 0xaf, + 0xad, 0xc4, 0x03, 0xc0, 0xe7, 0x57, 0x08, 0x0c, 0x93, 0x6b, 0x41, 0x44, + 0xcb, 0x72, 0xaf, 0x7b, 0xd0, 0x64, 0x90, 0x42, 0x13, 0x0f, 0x59, 
0xd9, + 0xd7, 0xf3, 0xc2, 0xdb, 0xa3, 0xc1, 0xd2, 0x9d, 0x79, 0xcc, 0x4e, 0x31, + 0x84, 0x88, 0xa9, 0x42, 0x97, 0xab, 0xe7, 0xe8, 0x00, 0xeb, 0x5e, 0x8a, + 0x50, 0xe3, 0x05, 0xe8, 0x9b, 0xc8, 0xa5, 0xcf, 0xe6, 0xdf, 0x31, 0xff, + 0x9f, 0x5c, 0x16, 0xee, 0x3f, 0xc6, 0x4e, 0x3d, 0x7a, 0x1b, 0x5b, 0x48, + 0xb0, 0xf9, 0x29, 0xa2, 0xed, 0x64, 0x41, 0x28, 0xec, 0xbb, 0x2f, 0x7c, + 0xa9, 0xaa, 0x33, 0x78, 0xb2, 0x9a, 0x44, 0x91, 0x69, 0xfe, 0x4a, 0x6d, + 0xf2, 0xa4, 0xdc, 0xb8, 0x83, 0xee, 0xee, 0x31, 0x73, 0xe4, 0x3c, 0x4a, + 0xf3, 0xfa, 0xa6, 0xd2, 0x33, 0xa8, 0x03, 0x86, 0xe7, 0x9b, 0x9d, 0x8d, + 0xfd, 0x4f, 0x2a, 0xf7, 0x15, 0x26, 0x06, 0x1e, 0xcd, 0xa9, 0xd1, 0xa5, + 0x3d, 0x17, 0xf9, 0xdd, 0x6b, 0x05, 0x75, 0xf1, 0x56, 0xa7, 0x6b, 0xa0, + 0xc8, 0xc9, 0x3d, 0x21, 0x26, 0xa3, 0xf8, 0xb7, 0x74, 0xda, 0x78, 0x09, + 0xf1, 0x32, 0xf8, 0x2c, 0x3f, 0x47, 0x7d, 0xad, 0x8c, 0xc0, 0x97, 0x1d, + 0x82, 0x4c, 0x4a, 0x67, 0x81, 0x18, 0xe1, 0x87, 0x79, 0xa5, 0x13, 0x18, + 0xd2, 0x73, 0xc3, 0x6b, 0x03, 0xcc, 0x10, 0x83, 0x51, 0x14, 0x5b, 0x11, + 0x1f, 0xe2, 0x51, 0x4d, 0x81, 0x54, 0x1f, 0xee, 0xdd, 0xe8, 0x2d, 0xde, + 0xe6, 0x4d, 0x72, 0x18, 0x2c, 0xf4, 0x27, 0xe9, 0x08, 0x90, 0x9b, 0xed, + 0x03, 0xe4, 0x78, 0xac, 0xc7, 0x18, 0x91, 0x5d, 0x81, 0x69, 0x36, 0x7f, + 0xd7, 0xa6, 0x00, 0xbe, 0xf2, 0xb5, 0xef, 0xf7, 0x68, 0x17, 0xe6, 0x86, + 0x4a, 0x47, 0x32, 0xb1, 0xc5, 0x32, 0x21, 0x76, 0xe0, 0x43, 0x8d, 0xae, + 0xc6, 0x5f, 0xdc, 0xe5, 0x56, 0x9f, 0x99, 0x58, 0x54, 0xc0, 0xf3, 0x0d, + 0xea, 0xd6, 0x25, 0xfb, 0xbc, 0x4f, 0xc7, 0xdc, 0xc0, 0xf2, 0x0c, 0x57, + 0x80, 0x61, 0x9c, 0x0f, 0x24, 0xc0, 0x7e, 0x18, 0x51, 0xbd, 0x35, 0x33, + 0x33, 0x4b, 0x25, 0xf8, 0x45, 0x3d, 0x48, 0xcd, 0xd4, 0x45, 0x40, 0x33, + 0xb9, 0x0a, 0x6c, 0xfc, 0xd8, 0x43, 0xaf, 0x8c, 0x09, 0x0b, 0x5b, 0x14, + 0x39, 0xc3, 0xdb, 0x0e, 0x85, 0x77, 0x15, 0xf8, 0xed, 0x9b, 0x7e, 0xd6, + 0xb1, 0x03, 0x23, 0x6d, 0xab, 0xb6, 0xbd, 0xb3, 0x88, 0x03, 0xea, 0x44, + 0xd8, 0x61, 0x9e, 0xf3, 0x1a, 0xec, 0x2a, 0x94, 0xbb, 0xbe, 0xb9, 
0x3b, + 0xd3, 0x85, 0x63, 0x83, 0xc6, 0x2c, 0x83, 0x4e, 0x85, 0xfe, 0x07, 0xc2, + 0x76, 0x38, 0x04, 0xbd, 0xdd, 0x65, 0x05, 0xab, 0x4c, 0x36, 0xc1, 0xc1, + 0x69, 0x8d, 0x10, 0xd5, 0xc0, 0xc6, 0x60, 0x2d, 0xf0, 0xbc, 0xfa, 0x0a, + 0x13, 0x14, 0xe7, 0x2c, 0xcb, 0x09, 0x49, 0x04, 0x1e, 0xe8, 0x81, 0x75, + 0x42, 0x54, 0xa0, 0x08, 0x92, 0x51, 0x06, 0x9f, 0xa7, 0x60, 0x85, 0x59, + 0xd9, 0x98, 0x1f, 0x70, 0xcf, 0x92, 0x9d, 0x3e, 0x41, 0xd9, 0xcc, 0x2c, + 0x78, 0x8e, 0x67, 0x4a, 0xce, 0x0d, 0x61, 0x80, 0xf7, 0xd4, 0x40, 0x43, + 0x60, 0xba, 0xae, 0xd2, 0x24, 0xe6, 0xc5, 0x51, 0x26, 0x06, 0x74, 0x35, + 0x72, 0xd9, 0x33, 0x34, 0x46, 0x2f, 0x7d, 0x08, 0x2f, 0x14, 0xaa, 0x37, + 0xc6, 0xda, 0x94, 0xcf, 0x34, 0x2f, 0x7c, 0xeb, 0x63, 0x3a, 0xcf, 0xc5, + 0xbe, 0xae, 0x55, 0x0c, 0xc2, 0xf1, 0x92, 0x2e, 0xf1, 0x61, 0x2f, 0x35, + 0xd0, 0x17, 0xbb, 0x5d, 0xc1, 0x54, 0x1b, 0x9c, 0xc0, 0x9f, 0x46, 0x12, + 0xb1, 0xb7, 0x34, 0x5a, 0x3b, 0xec, 0x57, 0x81, 0x24, 0x5b, 0xd1, 0xad, + 0xcd, 0x33, 0x7a, 0xb6, 0x94, 0x51, 0xac, 0xc1, 0x3c, 0x0f, 0xe4, 0x15, + 0x6f, 0xfd, 0x43, 0xa5, 0x09, 0xe5, 0x2c, 0xa4, 0xcd, 0xa4, 0x7d, 0x3a, + 0x10, 0x5e, 0x21, 0xcf, 0x23, 0x30, 0x61, 0xfb, 0x61, 0x0c, 0xe7, 0x91, + 0x0f, 0x79, 0x38, 0x60, 0xc1, 0x42, 0x41, 0xd8, 0x86, 0x1b, 0xd8, 0x12, + 0x39, 0xd0, 0xa7, 0x76, 0xfc, 0x51, 0x6e, 0xa5, 0x8a, 0xfa, 0xd4, 0xa6, + 0x4e, 0xae, 0xa0, 0x8b, 0x58, 0x07, 0xfb, 0x02, 0xa1, 0x2b, 0x20, 0xb9, + 0xe2, 0x16, 0x05, 0xae, 0x6e, 0xfc, 0x5d, 0xbe, 0xa8, 0x3f, 0x0f, 0x07, + 0xb3, 0x06, 0xbd, 0x24, 0x70, 0x76, 0x8b, 0xaf, 0xaf, 0xc3, 0xdc, 0x36, + 0xee, 0xbd, 0x83, 0xa4, 0x44, 0xbd, 0x78, 0x6f, 0x81, 0xab, 0x3d, 0x66, + 0x9a, 0xba, 0xae, 0x04, 0xdc, 0x35, 0x0a, 0xa8, 0x7c, 0x36, 0x63, 0x3d, + 0x16, 0xe1, 0x25, 0x8d, 0xb7, 0x4a, 0x99, 0xfa, 0xe9, 0x4f, 0x54, 0x7c, + 0xc0, 0x14, 0x05, 0xfc, 0x4f, 0x81, 0xf1, 0xbf, 0x59, 0x2f, 0x0f, 0xfb, + 0xe4, 0x23, 0xdb, 0x2c, 0xd2, 0xf9, 0x4c, 0x53, 0x3e, 0x2e, 0xd1, 0xd6, + 0x92, 0x21, 0x02, 0x6f, 0x4e, 0x68, 0x3e, 0xf1, 0x3b, 0x5a, 0x44, 
0xcf, + 0x1e, 0x0d, 0xda, 0x77, 0xa9, 0x10, 0x7d, 0xb2, 0x95, 0x7e, 0x9b, 0x36, + 0xbc, 0xd7, 0xae, 0x59, 0xf1, 0x00, 0x3a, 0x44, 0x0d, 0x41, 0x63, 0xf6, + 0x12, 0xc1, 0xc3, 0xd2, 0x12, 0x25, 0xf6, 0x8b, 0xc0, 0xaa, 0xcb, 0xf2, + 0x4d, 0xcc, 0x27, 0xef, 0x21, 0x7e, 0x46, 0x13, 0xdf, 0xaf, 0x55, 0xd2, + 0xec, 0x42, 0xdf, 0xf4, 0x8b, 0xee, 0xd5, 0x42, 0x9e, 0x30, 0xcb, 0x67, + 0xbe, 0x00, 0x0b, 0x31, 0x56, 0xd2, 0x64, 0x84, 0xf4, 0x63, 0xed, 0xcb, + 0x2a, 0xb1, 0x14, 0x03, 0x82, 0xa3, 0xee, 0x97, 0x53, 0xa4, 0x1f, 0x92, + 0x13, 0x3c, 0x13, 0x42, 0x34, 0xea, 0xfe, 0x59, 0x14, 0x6b, 0x7f, 0x35, + 0x2d, 0x9f, 0x9a, 0x46, 0x4c, 0x40, 0x7e, 0x92, 0xe2, 0xa8, 0x81, 0x74, + 0x88, 0x05, 0xb6, 0x4d, 0xf3, 0x86, 0xa3, 0x25, 0xf8, 0x5e, 0x3d, 0x1b, + 0xbf, 0x3b, 0x82, 0xd6, 0x14, 0x68, 0xed, 0xf3, 0x33, 0xc8, 0x18, 0x28, + 0xfa, 0x6f, 0x14, 0xa4, 0xeb, 0x08, 0xbd, 0x7f, 0x3c, 0x20, 0x49, 0xb9, + 0xdb, 0x85, 0xc6, 0x30, 0x0d, 0x40, 0xce, 0x88, 0x3e, 0x5c, 0xe2, 0x21, + 0xb9, 0x3a, 0x0f, 0x29, 0x16, 0x04, 0x19, 0xb9, 0x14, 0xa6, 0xf3, 0xcf, + 0xac, 0x83, 0xfa, 0x8c, 0x5b, 0x7d, 0xb4, 0xcc, 0xb6, 0x97, 0x97, 0xb6, + 0x0a, 0x56, 0x55, 0x5a, 0xed, 0xa8, 0xb7, 0xe7, 0x29, 0x57, 0x56, 0x1d, + 0xdf, 0xf9, 0x19, 0x38, 0xa5, 0x11, 0x23, 0x2a, 0xd0, 0x2c, 0xe3, 0x1c, + 0x83, 0x0a, 0x8a, 0x6f, 0x52, 0xd8, 0xf0, 0x1d, 0x3d, 0x84, 0x6a, 0xf8, + 0xe1, 0xd8, 0x54, 0xe0, 0x26, 0x87, 0xdb, 0x7e, 0x1b, 0x23, 0xb8, 0xee, + 0xdf, 0x8e, 0x36, 0xfb, 0x4f, 0x2d, 0xe5, 0xa9, 0x2c, 0x44, 0xee, 0xe5, + 0xf7, 0x80, 0x16, 0xed, 0x09, 0xda, 0xe6, 0x75, 0xfc, 0xe9, 0xf8, 0x8e, + 0xc4, 0x70, 0x66, 0xa9, 0xb0, 0x9d, 0xaf, 0x39, 0x5d, 0x3f, 0x10, 0x86, + 0x38, 0x45, 0x3c, 0x6e, 0x39, 0xe3, 0xc7, 0x4d, 0x53, 0x04, 0xf9, 0x09, + 0xa7, 0xfc, 0xc6, 0x3b, 0x82, 0x3c, 0x41, 0x33, 0x52, 0x21, 0xcc, 0x50, + 0x52, 0x01, 0x58, 0x2a, 0xbf, 0x19, 0x47, 0x52, 0x86, 0x44, 0x06, 0x2c, + 0x6a, 0x8b, 0x51, 0x9d, 0x58, 0x6c, 0x5b, 0xe8, 0xba, 0x3d, 0xf3, 0x3b, + 0x19, 0xdc, 0xb9, 0x70, 0xd1, 0x68, 0x2d, 0x3e, 0xda, 0x37, 0x7c, 
0xf4, + 0xe2, 0x05, 0x35, 0x15, 0xbb, 0x3d, 0x5a, 0x6d, 0xc6, 0x32, 0x81, 0x4f, + 0xe6, 0xec, 0x1a, 0x88, 0x0c, 0xbf, 0x95, 0x37, 0x76, 0xcc, 0x1e, 0xa9, + 0x91, 0xf0, 0x80, 0xb3, 0x0e, 0x86, 0x64, 0x2c, 0xe9, 0xf2, 0xab, 0xce, + 0xb8, 0xa8, 0x3b, 0x0a, 0x26, 0x1b, 0x58, 0x7d, 0x85, 0x51, 0xc7, 0x7c, + 0x8f, 0x44, 0x0a, 0x2b, 0x6a, 0x65, 0x04, 0x2e, 0xbf, 0x85, 0x89, 0x21, + 0xab, 0x0a, 0x8b, 0xef, 0x9a, 0xd9, 0x62, 0x2e, 0x18, 0x0f, 0x5d, 0xc0, + 0x43, 0xdd, 0xbe, 0x68, 0x5a, 0xe6, 0x61, 0x7e, 0xaf, 0x85, 0xbe, 0xd1, + 0x75, 0x5e, 0x34, 0x64, 0x7d, 0x6e, 0xe2, 0xad, 0xfd, 0x45, 0xf0, 0x52, + 0xa5, 0x66, 0x18, 0x3b, 0x50, 0x70, 0xe6, 0x28, 0xb3, 0x5f, 0x11, 0x72, + 0xcd, 0x67, 0x04, 0xc3, 0x53, 0x75, 0x8d, 0x4f, 0x57, 0xe7, 0xd6, 0xa7, + 0xa1, 0x24, 0xcc, 0xf5, 0x90, 0x8c, 0xfb, 0xf4, 0x7d, 0x0d, 0xd8, 0x13, + 0xa4, 0x3a, 0x94, 0x96, 0x1a, 0xbb, 0x5b, 0xb0, 0x49, 0xcd, 0xf5, 0x49, + 0x14, 0xfe, 0x7f, 0x42, 0x6d, 0x0e, 0xe0, 0xf0, 0x0e, 0x04, 0xdd, 0xbf, + 0x61, 0x9c, 0xec, 0xcd, 0xc9, 0x24, 0xe3, 0x73, 0xae, 0x13, 0xa8, 0xce, + 0xe2, 0xc6, 0x51, 0x51, 0xaf, 0xed, 0xbd, 0x19, 0x7b, 0x9d, 0x70, 0x65, + 0xb6, 0xa3, 0xeb, 0x59, 0xb1, 0x58, 0x82, 0xfd, 0x91, 0xe4, 0x90, 0xeb, + 0xe2, 0x92, 0xe7, 0x85, 0x6e, 0xef, 0x7f, 0x9a, 0x62, 0x10, 0xf5, 0x93, + 0x23, 0x57, 0xce, 0x42, 0x2d, 0x84, 0x37, 0x1e, 0x1c, 0xd7, 0x08, 0x68, + 0x37, 0x01, 0x83, 0x02, 0x8d, 0x19, 0xb6, 0x4b, 0xa0, 0x0b, 0xb2, 0x67, + 0x56, 0xd5, 0xf8, 0xbf, 0xeb, 0xcc, 0xc4, 0x3b, 0x64, 0x63, 0xa1, 0xe6, + 0x5d, 0xc0, 0xe2, 0x4f, 0x93, 0x93, 0x9c, 0xf9, 0x78, 0xad, 0xcf, 0x65, + 0x06, 0x50, 0x34, 0x7b, 0x3d, 0x9e, 0xb0, 0x37, 0xad, 0xc7, 0xd0, 0x72, + 0xad, 0xc0, 0xd1, 0x2f, 0xac, 0x48, 0xfd, 0x98, 0xb6, 0xb5, 0xcc, 0x20, + 0xb2, 0xcc, 0xb5, 0xa1, 0xb8, 0xf3, 0x5f, 0x49, 0x6d, 0x95, 0xdb, 0x31, + 0xad, 0x69, 0x2f, 0xb0, 0x7a, 0xac, 0x32, 0xdc, 0x3f, 0xdc, 0xbb, 0xf0, + 0x24, 0xeb, 0x5c, 0x10, 0x31, 0xfd, 0x91, 0xaa, 0x5e, 0xa3, 0xe4, 0xc6, + 0x3a, 0xd7, 0xaa, 0xb4, 0xd9, 0x80, 0x19, 0xb3, 0xf8, 0x9e, 0xcb, 
0xc4, + 0x73, 0x3f, 0x61, 0x6a, 0x59, 0x77, 0x63, 0x2e, 0x4e, 0x82, 0x5c, 0x4e, + 0x8b, 0x5c, 0xdc, 0x15, 0xc3, 0xf2, 0xab, 0x3a, 0x5b, 0x6e, 0xcc, 0x08, + 0x2b, 0xd1, 0x9e, 0x48, 0x7b, 0xfd, 0x75, 0x91, 0xf5, 0x86, 0x46, 0xfa, + 0x89, 0x21, 0x3f, 0x4b, 0x07, 0x92, 0xb3, 0xa4, 0x6c, 0xa8, 0x87, 0x3b, + 0x8e, 0x9e, 0x9e, 0xeb, 0x2d, 0x87, 0xd0, 0x45, 0x42, 0xce, 0x6b, 0x31, + 0xe6, 0x50, 0xc9, 0xcf, 0xa1, 0x34, 0xda, 0x5a, 0x4c, 0x5a, 0x93, 0x5f, + 0x5b, 0x6d, 0xc5, 0x22, 0x75, 0x53, 0x44, 0xbd, 0xe6, 0xa1, 0x78, 0xec, + 0xd5, 0xd7, 0x53, 0xa1, 0x67, 0xd7, 0xf0, 0x83, 0x85, 0x75, 0xd6, 0xcb, + 0x30, 0x1e, 0x6f, 0xd4, 0xc2, 0xcd, 0x82, 0xf1, 0x08, 0x7c, 0x1e, 0xa4, + 0xed, 0xef, 0xb1, 0xc8, 0xdc, 0xf2, 0x64, 0x65, 0xdf, 0xe7, 0x21, 0xed, + 0x4d, 0x5a, 0x87, 0xe8, 0x68, 0x87, 0xac, 0x8d, 0x9d, 0xc2, 0xb7, 0x19, + 0x80, 0x62, 0xac, 0xbb, 0x73, 0x7d, 0x39, 0xd0, 0xf6, 0x7b, 0xb8, 0xc1, + 0x53, 0x1b, 0xe9, 0x46, 0x28, 0x50, 0x05, 0x6e, 0x65, 0xf9, 0x3d, 0x1c, + 0x23, 0x94, 0x09, 0xd8, 0x8f, 0xcf, 0xf6, 0x19, 0x52, 0x29, 0xae, 0x1d, + 0x46, 0xf5, 0x05, 0x61, 0x82, 0xb5, 0xd1, 0x52, 0xc5, 0x03, 0x49, 0x9d, + 0xcf, 0xe2, 0x49, 0x0b, 0x6a, 0x77, 0x73, 0x91, 0x95, 0x7a, 0x56, 0xb9, + 0x13, 0xa9, 0xb4, 0x9b, 0xf9, 0xa4, 0x39, 0x06, 0x1a, 0xd4, 0xd7, 0x7f, + 0xec, 0xb4, 0xa6, 0xcd, 0x8a, 0x60, 0x3d, 0x4d, 0xba, 0xbc, 0x91, 0xc7, + 0x1f, 0x50, 0xac, 0xd7, 0x75, 0x77, 0x3c, 0xb1, 0xd6, 0x5c, 0x34, 0x88, + 0xf0, 0xe1, 0xd9, 0x63, 0x99, 0xdd, 0xa0, 0x37, 0xde, 0xa1, 0xb4, 0xc1, + 0xc3, 0x4d, 0x8c, 0x0d, 0x8a, 0x0d, 0x96, 0xba, 0x48, 0xc2, 0x0c, 0x3d, + 0x6f, 0x7b, 0x10, 0xb5, 0x22, 0x9b, 0x79, 0x84, 0x6f, 0x24, 0x4d, 0xaf, + 0x61, 0x03, 0xd6, 0x33, 0xc8, 0x2e, 0xde, 0x24, 0x39, 0x59, 0x49, 0x5f, + 0x11, 0xad, 0x3a, 0x99, 0x49, 0x3c, 0xda, 0xdc, 0xe8, 0xea, 0x66, 0x37, + 0xa9, 0x6e, 0xf3, 0x79, 0x6a, 0x6f, 0xea, 0x43, 0x10, 0x6e, 0xf6, 0x41, + 0xc2, 0x54, 0x6e, 0x42, 0x43, 0xe9, 0xcf, 0x3e, 0xce, 0xeb, 0x99, 0xf2, + 0x56, 0xb7, 0x26, 0x4a, 0xce, 0xd6, 0x57, 0x59, 0x8f, 0x30, 0xa0, 
0xdc, + 0xe4, 0xed, 0x42, 0xe0, 0x1d, 0x15, 0x9e, 0x39, 0xc9, 0xdd, 0x10, 0xfe, + 0x66, 0x34, 0x54, 0xd4, 0x25, 0xfc, 0x48, 0x87, 0xae, 0xff, 0x77, 0x96, + 0x01, 0x30, 0xd7, 0xd5, 0x2b, 0xa3, 0xba, 0xe0, 0x34, 0xce, 0x67, 0x96, + 0x4c, 0x3e, 0x88, 0x84, 0x7d, 0x49, 0xeb, 0x01, 0x07, 0x2d, 0xad, 0x3d, + 0xa5, 0x22, 0x47, 0xed, 0x5b, 0x43, 0xfc, 0xcf, 0x69, 0x1c, 0xff, 0x65, + 0x00, 0x93, 0x60, 0x3b, 0x94, 0x10, 0xd6, 0xcd, 0x71, 0xa7, 0x93, 0xfd, + 0x58, 0x53, 0xcf, 0x9f, 0x4f, 0x56, 0xa7, 0x7b, 0x39, 0x9d, 0x9b, 0x5d, + 0x0d, 0xf1, 0xb5, 0x98, 0x96, 0x9b, 0x04, 0x7a, 0xa7, 0xda, 0x8f, 0x44, + 0x92, 0x05, 0x22, 0xbd, 0xb6, 0x49, 0x15, 0x53, 0xf6, 0x0b, 0x9c, 0xb4, + 0xf3, 0x61, 0xfa, 0xf3, 0xc1, 0x88, 0xd5, 0x09, 0xb9, 0x3c, 0x0f, 0x2b, + 0xa6, 0xbe, 0x1d, 0xa8, 0xc6, 0x65, 0x61, 0x46, 0x95, 0xf0, 0x04, 0xfd, + 0x2e, 0xc7, 0xfe, 0x67, 0xb3, 0xb9, 0x1e, 0xc8, 0xb8, 0x9a, 0x54, 0x45, + 0x0b, 0x9d, 0x1a, 0x26, 0x6e, 0x3c, 0x7a, 0xca, 0x77, 0xb9, 0xf5, 0x5a, + 0x95, 0x1c, 0x99, 0x12, 0x97, 0xfb, 0x53, 0xdd, 0x63, 0x9c, 0x2e, 0x5e, + 0x9d, 0x13, 0x19, 0xa0, 0x21, 0x24, 0xe0, 0x7a, 0xa4, 0xda, 0x8c, 0x64, + 0x13, 0xce, 0xa8, 0xd8, 0x2d, 0x09, 0x15, 0x1a, 0x50, 0xb4, 0x25, 0x32, + 0xc5, 0xef, 0x03, 0xf4, 0x1e, 0xca, 0x8c, 0x62, 0xcd, 0x8d, 0x01, 0x79, + 0x7b, 0xf0, 0x06, 0x69, 0xd1, 0x2f, 0x56, 0x01, 0x94, 0xc1, 0xa1, 0xcd, + 0x9d, 0x22, 0x6e, 0x49, 0xfb, 0x7a, 0xab, 0x4f, 0x3e, 0xae, 0x6b, 0xad, + 0xce, 0x33, 0xd6, 0x9c, 0x86, 0x0c, 0x5c, 0x97, 0x34, 0x5e, 0x59, 0xce, + 0x3a, 0x43, 0x99, 0x5e, 0xca, 0x59, 0xd0, 0x6c, 0x1e, 0xb6, 0xf8, 0x36, + 0xef, 0x28, 0x0c, 0x0d, 0x26, 0xc6, 0xb8, 0x5a, 0x91, 0xe3, 0x0e, 0xa1, + 0x9b, 0xf2, 0x31, 0x02, 0x6f, 0x98, 0x47, 0x1e, 0x92, 0x55, 0x1a, 0x0f, + 0x9d, 0x24, 0x63, 0x51, 0x14, 0xdf, 0xdd, 0x82, 0xbc, 0xb3, 0x10, 0x1d, + 0xa9, 0xe9, 0xb3, 0x1a, 0xb9, 0x74, 0x68, 0xc2, 0x24, 0xec, 0x8e, 0x9c, + 0xc8, 0xc3, 0x14, 0xec, 0x05, 0xe5, 0xa7, 0x96, 0xc7, 0xba, 0x3e, 0x62, + 0xd2, 0x90, 0xd8, 0xec, 0x4a, 0xae, 0x71, 0x67, 0x0a, 0xd5, 0x78, 
0x72, + 0x2f, 0x66, 0x2a, 0x48, 0x77, 0x5b, 0x51, 0xe2, 0xe9, 0x2a, 0x0c, 0x53, + 0xaa, 0xf9, 0xa1, 0x01, 0xb8, 0x4c, 0x39, 0xba, 0x55, 0xc1, 0x3a, 0xe4, + 0x6b, 0x14, 0x0e, 0x95, 0x78, 0x52, 0x81, 0xea, 0x60, 0x76, 0xc0, 0x54, + 0xc0, 0x1d, 0x84, 0xf8, 0x4e, 0xb5, 0x9c, 0x20, 0xc8, 0x01, 0x82, 0xd3, + 0x52, 0xfa, 0xee, 0xd5, 0xf6, 0x82, 0x55, 0x4b, 0xb3, 0x52, 0x9f, 0xb8, + 0xcc, 0x69, 0x38, 0x52, 0x90, 0xbd, 0xba, 0xc0, 0xd4, 0xa8, 0x5c, 0x9e, + 0x59, 0x14, 0xb4, 0x35, 0x5f, 0x0e, 0xf6, 0x51, 0x29, 0xa7, 0x1a, 0xd2, + 0xdd, 0xfd, 0xda, 0x8a, 0x8e, 0xa3, 0x6d, 0x7f, 0x19, 0x81, 0xab, 0x0c, + 0xed, 0x15, 0x28, 0x8c, 0x68, 0xdb, 0xf7, 0x72, 0x4f, 0xf3, 0x36, 0x1d, + 0x62, 0x44, 0xe1, 0xa2, 0x48, 0x81, 0x78, 0x1c, 0xfc, 0x53, 0xc9, 0x9e, + 0x5c, 0x62, 0x46, 0x71, 0x51, 0xf1, 0xa4, 0x88, 0x1b, 0x1f, 0xb5, 0xdf, + 0x85, 0xfb, 0xf6, 0xfc, 0xc2, 0x13, 0x14, 0xf9, 0xda, 0xa1, 0xd1, 0x3b, + 0x8b, 0x81, 0xf1, 0x77, 0xa2, 0xff, 0xfa, 0x9f, 0xa4, 0x25, 0x1d, 0xea, + 0x08, 0x69, 0xcf, 0x2e, 0x9c, 0xac, 0xe8, 0x92, 0x03, 0x59, 0x4e, 0xf2, + 0xf7, 0xa8, 0x57, 0x84, 0x77, 0xa4, 0xde, 0xac, 0x7f, 0x52, 0xf1, 0xe9, + 0x5b, 0x65, 0x46, 0xd1, 0x20, 0x4e, 0x86, 0xf0, 0xb4, 0xb1, 0x52, 0x6a, + 0x00, 0xd9, 0x32, 0x13, 0x80, 0x5b, 0xe1, 0x17, 0x47, 0xc7, 0xfd, 0xc5, + 0x98, 0x9a, 0x12, 0xe2, 0xf2, 0x8d, 0xea, 0xb9, 0x3a, 0xac, 0x6d, 0x3c, + 0xbb, 0xfb, 0x07, 0x38, 0x48, 0xf6, 0x5b, 0x57, 0xb0, 0xe3, 0x41, 0xc5, + 0x4e, 0xeb, 0xdc, 0x76, 0x33, 0xc1, 0x0b, 0x6c, 0xae, 0xd9, 0x6a, 0xed, + 0x7f, 0x4f, 0xb1, 0xc5, 0x21, 0xa7, 0x1d, 0xff, 0xfc, 0x93, 0x83, 0x0a, + 0x72, 0x82, 0xed, 0xde, 0xa8, 0xe9, 0x4a, 0x7e, 0x06, 0x79, 0x14, 0x95, + 0x4a, 0x4a, 0x4f, 0xda, 0xcf, 0xa7, 0x69, 0x69, 0x56, 0x78, 0x67, 0x9c, + 0x51, 0x55, 0x77, 0x6a, 0x3d, 0xad, 0x4c, 0xc8, 0x86, 0x99, 0xa0, 0xfc, + 0x03, 0xa7, 0x0d, 0x9c, 0x4f, 0x8c, 0x63, 0x3a, 0xf3, 0x58, 0xd9, 0xd9, + 0xcd, 0x67, 0x2a, 0xc1, 0x45, 0x05, 0xa3, 0x1b, 0x51, 0xf5, 0x48, 0x08, + 0x2b, 0x56, 0xe4, 0xe2, 0x28, 0x24, 0xaa, 0x46, 0x19, 0xd3, 0x34, 
0xda, + 0x5a, 0x95, 0x2f, 0x67, 0x99, 0x33, 0xd5, 0xc8, 0x34, 0x23, 0x23, 0x1a, + 0x97, 0xc4, 0xf2, 0x6d, 0xf5, 0xc9, 0x2a, 0xd3, 0x87, 0xc5, 0xe7, 0x8f, + 0xc5, 0x16, 0x41, 0xbf, 0x04, 0x04, 0x0f, 0x50, 0xb8, 0x91, 0x0d, 0x0c, + 0xfd, 0xb5, 0xc3, 0xe6, 0xbf, 0xfd, 0xdb, 0xef, 0x21, 0xc3, 0xf4, 0x4e, + 0x88, 0xb4, 0x33, 0xfe, 0x19, 0xee, 0xc1, 0x3b, 0x5a, 0xcf, 0x71, 0x75, + 0x40, 0xbe, 0x6d, 0x07, 0xb9, 0x9f, 0x51, 0x81, 0x97, 0xca, 0xca, 0xa1, + 0x8a, 0x45, 0xa9, 0x05, 0x5a, 0x8f, 0x36, 0xcf, 0x66, 0x56, 0x29, 0x96, + 0xc4, 0x52, 0x76, 0x44, 0x72, 0xff, 0xb0, 0x22, 0xc9, 0xce, 0xf1, 0x9f, + 0x05, 0x61, 0xa0, 0xa2, 0x95, 0x7e, 0x5a, 0xf6, 0x39, 0x47, 0xf8, 0x81, + 0x80, 0x00, 0x1e, 0xa5, 0x45, 0x4d, 0xb1, 0x90, 0xc9, 0xf5, 0xba, 0xe6, + 0x1a, 0xe1, 0x99, 0xc7, 0x60, 0x77, 0xc1, 0xb6, 0x12, 0x23, 0xa5, 0xd6, + 0xe4, 0x27, 0x49, 0x95, 0xc5, 0xa8, 0x30, 0xb8, 0xb6, 0x66, 0x47, 0x31, + 0xf3, 0x8d, 0x21, 0xbe, 0x47, 0x4c, 0xe5, 0x71, 0x95, 0xa2, 0x17, 0x1f, + 0x4c, 0x0c, 0x10, 0xc0, 0x9b, 0x32, 0x29, 0xf8, 0xb5, 0xbd, 0xc7, 0x39, + 0x4a, 0x5b, 0xb6, 0xc5, 0x73, 0x73, 0x96, 0x49, 0x13, 0x35, 0xbd, 0xa1, + 0xa6, 0x7d, 0x6f, 0xfb, 0x4f, 0x1d, 0x43, 0x3f, 0x80, 0x1f, 0xba, 0x0b, + 0xcf, 0x47, 0x80, 0x69, 0xa7, 0xda, 0x86, 0x12, 0xc9, 0x74, 0x83, 0x41, + 0x70, 0x20, 0x66, 0xbb, 0x18, 0xd8, 0xb5, 0xf3, 0x89, 0xfb, 0x13, 0xad, + 0x18, 0xc7, 0x2d, 0x94, 0x91, 0xb0, 0x5d, 0xd2, 0x57, 0xe4, 0x99, 0x0e, + 0xd8, 0x50, 0xb3, 0xb6, 0x12, 0x08, 0xa8, 0x23, 0x06, 0x7a, 0x72, 0x43, + 0x30, 0x9c, 0x96, 0x64, 0xfc, 0x48, 0xf9, 0x00, 0x02, 0xcf, 0xf9, 0x26, + 0x15, 0x82, 0x91, 0xd9, 0xdc, 0x48, 0x3e, 0x56, 0xd5, 0x01, 0xaa, 0x83, + 0xb4, 0x99, 0x48, 0x59, 0x72, 0x93, 0xdb, 0x30, 0x40, 0xd7, 0x89, 0xf2, + 0xc6, 0x3e, 0x96, 0x6a, 0xe2, 0x50, 0x12, 0x17, 0x2c, 0xc0, 0xe9, 0x2a, + 0xf6, 0xa6, 0x20, 0xc5, 0xd5, 0xbe, 0x79, 0x40, 0xeb, 0x17, 0x0e, 0xb9, + 0x1f, 0x17, 0xc0, 0xab, 0x7d, 0xdd, 0x75, 0x0f, 0x6f, 0x32, 0x99, 0x48, + 0x9d, 0x8b, 0x3b, 0xb3, 0xfc, 0xaf, 0xb7, 0x79, 0xfc, 0xdf, 0xab, 
0xb1, + 0xad, 0x68, 0x66, 0xe9, 0xc8, 0x78, 0x9e, 0xe8, 0x11, 0x58, 0x74, 0x02, + 0xb8, 0x1a, 0xfe, 0x64, 0x2d, 0x6b, 0x74, 0x2c, 0x2c, 0xee, 0x3a, 0x27, + 0x51, 0xee, 0x96, 0x04, 0xb7, 0xc4, 0x4b, 0x14, 0x1d, 0x1e, 0x94, 0x4b, + 0x78, 0xa2, 0xfb, 0x07, 0x73, 0xe2, 0x1a, 0x75, 0x11, 0x46, 0xe9, 0xb3, + 0xe2, 0x15, 0x37, 0x40, 0xe3, 0xaa, 0x2f, 0x9d, 0x0d, 0x80, 0x50, 0xac, + 0x85, 0x67, 0x85, 0x12, 0xf0, 0xe2, 0x01, 0xdc, 0xe9, 0x68, 0x0c, 0x3a, + 0x25, 0xca, 0xf3, 0x93, 0x76, 0x39, 0xeb, 0x00, 0xbf, 0x29, 0xf7, 0xdb, + 0x79, 0xd0, 0xb0, 0x7a, 0x6d, 0x81, 0x48, 0xe8, 0xad, 0xe3, 0x2c, 0xcc, + 0x53, 0x16, 0x2e, 0xed, 0xd6, 0xc7, 0x93, 0xff, 0x18, 0x21, 0x43, 0x99, + 0x60, 0xfb, 0x46, 0x77, 0x2c, 0x75, 0x3b, 0xb1, 0x0a, 0x87, 0xca, 0x51, + 0xd2, 0x28, 0xde, 0x4d, 0xf9, 0x68, 0x4a, 0x5f, 0xb1, 0x52, 0x9d, 0x5b, + 0x78, 0xf2, 0x2b, 0xaa, 0x17, 0x29, 0x9a, 0x5a, 0x76, 0x07, 0x37, 0x64, + 0xf2, 0x07, 0xe7, 0x2e, 0x93, 0x57, 0x60, 0xa2, 0xfb, 0x2c, 0xd7, 0x4e, + 0x61, 0x3e, 0x2b, 0xd3, 0xda, 0xcc, 0x2a, 0xfb, 0x38, 0x5e, 0xec, 0xf4, + 0x82, 0xa5, 0x4b, 0x9e, 0x39, 0x0e, 0xf4, 0xca, 0xbe, 0xf3, 0x43, 0x3a, + 0x5e, 0x16, 0x3a, 0xc9, 0x0d, 0x7d, 0x15, 0xa1, 0x0e, 0x2f, 0x42, 0x89, + 0x54, 0xf0, 0x4b, 0xcb, 0x0b, 0x54, 0xa4, 0x5e, 0x21, 0xae, 0x8f, 0xd0, + 0x14, 0x23, 0x46, 0xf6, 0xeb, 0x29, 0x77, 0x0a, 0x7c, 0xd6, 0x2e, 0x31, + 0xdf, 0xe8, 0x68, 0x80, 0x01, 0xa1, 0xd9, 0x30, 0x2b, 0x5f, 0x2a, 0x33, + 0xf4, 0xaa, 0x9a, 0x17, 0x38, 0x54, 0x8e, 0xdc, 0x07, 0xa4, 0xc3, 0x8a, + 0x64, 0xda, 0x9c, 0x1d, 0x0b, 0x40, 0xe5, 0xef, 0x38, 0x2d, 0x77, 0x4e, + 0x5d, 0x12, 0xf6, 0x9f, 0xa8, 0x71, 0x2a, 0xb8, 0x04, 0x0a, 0x93, 0x1f, + 0x05, 0x9d, 0xab, 0xee, 0x4f, 0xa6, 0xec, 0x7c, 0xa9, 0x9a, 0x73, 0x09, + 0xde, 0xfc, 0x29, 0xb9, 0xf6, 0xd4, 0x75, 0x99, 0xbb, 0x77, 0x23, 0xbc, + 0x31, 0x76, 0x7e, 0x3a, 0xe7, 0x12, 0x37, 0x83, 0x69, 0xdb, 0xac, 0x5b, + 0xcd, 0x4e, 0x12, 0x96, 0xa9, 0x1b, 0x2a, 0x21, 0x71, 0xe0, 0xd9, 0xc9, + 0x62, 0xbc, 0xa4, 0x48, 0xf8, 0xe5, 0x9e, 0x9c, 0xe6, 0x0b, 0x2f, 
0xee, + 0xa5, 0xc8, 0xc2, 0x72, 0x82, 0xbf, 0xad, 0x2e, 0x53, 0x03, 0x3f, 0x02, + 0x30, 0x9b, 0xa5, 0xcc, 0xd1, 0x61, 0x3c, 0xab, 0xc2, 0xbc, 0xcf, 0x4f, + 0x0e, 0xf4, 0x5b, 0x57, 0xce, 0x61, 0x8f, 0xcf, 0xaa, 0x3d, 0x2e, 0xeb, + 0x4d, 0xbc, 0x29, 0x8a, 0x81, 0x38, 0xe7, 0xa1, 0x7b, 0x53, 0x59, 0xbc, + 0x9f, 0xbf, 0xc9, 0xb1, 0x49, 0xd8, 0xcf, 0x44, 0xd3, 0x6c, 0x50, 0x09, + 0x93, 0x22, 0x2f, 0xd1, 0xed, 0x70, 0x52, 0xa4, 0x72, 0x82, 0x14, 0xab, + 0xdc, 0x0e, 0x8f, 0x9e, 0x8c, 0x27, 0xa9, 0x5a, 0x28, 0x6a, 0x77, 0xb3, + 0xc6, 0x30, 0x2f, 0xbf, 0xc8, 0x84, 0x94, 0xa6, 0x70, 0xa9, 0xab, 0x64, + 0x88, 0x85, 0xb0, 0x8e, 0x39, 0x20, 0x03, 0xb3, 0x2d, 0x76, 0x95, 0xc2, + 0xdb, 0x4f, 0x3c, 0x2b, 0x8f, 0xd0, 0xc3, 0x41, 0x5b, 0xdf, 0x84, 0x5c, + 0xc6, 0xf8, 0xdb, 0x05, 0x3d, 0xf9, 0x2b, 0x76, 0x18, 0xdb, 0xbc, 0xa2, + 0xa1, 0xc5, 0xb2, 0xd7, 0xf1, 0x80, 0x1b, 0xce, 0xce, 0x23, 0xd9, 0x07, + 0x3b, 0xf8, 0x47, 0xc9, 0x5f, 0x0d, 0xc7, 0x23, 0xc0, 0x2b, 0x64, 0xf7, + 0xfa, 0x8a, 0xa1, 0x26, 0x56, 0xd4, 0x14, 0xbe, 0xda, 0x6a, 0xb6, 0x15, + 0x7e, 0xa7, 0x2e, 0xce, 0x91, 0x6d, 0x45, 0xd6, 0x58, 0x19, 0x47, 0xcb, + 0xaf, 0xd4, 0x79, 0xee, 0xc1, 0x68, 0xad, 0xf8, 0xee, 0xda, 0x9d, 0xa2, + 0x6d, 0x91, 0x1a, 0xd1, 0x4a, 0x0c, 0xed, 0xb0, 0x8c, 0x3e, 0x01, 0xff, + 0xf0, 0x20, 0x16, 0x4f, 0x0b, 0x80, 0xfa, 0xce, 0x25, 0xb9, 0xb8, 0x27, + 0xa7, 0xf5, 0x70, 0xd4, 0x98, 0x3d, 0x3a, 0xce, 0x9c, 0xb7, 0xb3, 0xa4, + 0x46, 0x64, 0x3b, 0x07, 0xaf, 0x88, 0xd2, 0xbc, 0x69, 0x72, 0x37, 0x7d, + 0x24, 0x38, 0x2a, 0xee, 0xa5, 0xfd, 0xfe, 0xad, 0xdf, 0xcc, 0xcb, 0x36, + 0xd3, 0x32, 0x4f, 0x8a, 0x04, 0xd9, 0xac, 0x4e, 0x87, 0x05, 0xd5, 0x77, + 0x2a, 0xc6, 0xa2, 0xd7, 0x5b, 0x14, 0xff, 0x74, 0x45, 0xe4, 0x55, 0x1e, + 0xd9, 0x37, 0x2d, 0xb3, 0xf2, 0x0f, 0xc3, 0x9d, 0x52, 0x9f, 0xec, 0x0c, + 0x89, 0xa1, 0x97, 0xdb, 0x8a, 0x53, 0x52, 0x27, 0x8c, 0x30, 0x55, 0x60, + 0x3e, 0x90, 0x01, 0xec, 0x59, 0x12, 0xf0, 0xc9, 0x7c, 0xc7, 0x00, 0x75, + 0xd7, 0x52, 0xe0, 0xdf, 0x4b, 0xe4, 0xfa, 0xba, 0x34, 0x6a, 0x09, 
0xf9, + 0x55, 0xec, 0xce, 0xab, 0xa3, 0x51, 0x1d, 0xb5, 0xb2, 0x9f, 0x12, 0x60, + 0x81, 0x86, 0xa7, 0xba, 0x2f, 0x7e, 0x6e, 0x5f, 0x29, 0xde, 0x01, 0x49, + 0xcc, 0x35, 0xf1, 0xd9, 0xdf, 0xad, 0xf9, 0x06, 0xfd, 0x0a, 0xb9, 0x00, + 0x61, 0x1e, 0xfb, 0x00, 0x10, 0x8e, 0x11, 0x4e, 0xff, 0xed, 0x61, 0x34, + 0x06, 0x3c, 0x03, 0x38, 0x46, 0xff, 0x04, 0x56, 0x64, 0xdb, 0xcd, 0x41, + 0x05, 0x68, 0x30, 0x77, 0x55, 0x21, 0x07, 0x4b, 0xe9, 0x01, 0x88, 0x0c, + 0x1b, 0x69, 0x64, 0xc4, 0x8e, 0xda, 0x72, 0x09, 0x01, 0x93, 0xc0, 0x03, + 0x34, 0xec, 0x59, 0xfb, 0xd0, 0x9e, 0x3c, 0xbb, 0x29, 0xe8, 0x30, 0x76, + 0xb9, 0xaf, 0x9d, 0x3b, 0xe4, 0x9e, 0x7c, 0xd5, 0x6a, 0x66, 0x03, 0x8c, + 0x2e, 0x43, 0x98, 0xed, 0xbe, 0xdc, 0xc9, 0xf8, 0x1b, 0xe7, 0xc9, 0x6f, + 0x42, 0x3f, 0x09, 0xe3, 0x83, 0x5d, 0x4c, 0xfe, 0x19, 0xa7, 0xb4, 0xdc, + 0xc6, 0x8d, 0x85, 0x4c, 0x51, 0xe5, 0x21, 0x7e, 0xe7, 0xa2, 0x2f, 0x48, + 0x88, 0x59, 0xd0, 0x49, 0xab, 0x09, 0x63, 0x1d, 0xa9, 0x35, 0x90, 0x27, + 0x1a, 0xd0, 0xd9, 0x0e, 0x9d, 0x2b, 0xec, 0xf6, 0x38, 0x02, 0x02, 0x62, + 0xaa, 0xff, 0x30, 0x07, 0xf6, 0x14, 0x66, 0x92, 0x23, 0x32, 0x1f, 0x5f, + 0xc0, 0xa9, 0x94, 0xf3, 0xb8, 0xaf, 0x52, 0x8d, 0xfd, 0x07, 0xe5, 0xc6, + 0x1d, 0x25, 0xb6, 0x36, 0xdd, 0xb1, 0xb5, 0x69, 0x54, 0x34, 0x2f, 0x4c, + 0xa3, 0xc5, 0x3d, 0xc9, 0x6d, 0x4f, 0x4a, 0xc8, 0xad, 0x9a, 0x58, 0x2f, + 0x42, 0xb6, 0x7f, 0x9b, 0xe9, 0x83, 0x86, 0xb0, 0x59, 0xfb, 0x8d, 0x5c, + 0xed, 0x42, 0x72, 0x6b, 0xff, 0x35, 0xa5, 0xe5, 0x3f, 0x6e, 0x6e, 0x51, + 0xea, 0xff, 0xfd, 0x66, 0xf8, 0xff, 0x5e, 0x1d, 0x56, 0x53, 0x75, 0xc5, + 0x89, 0x44, 0x53, 0xa3, 0xac, 0xd4, 0xff, 0x0e, 0x84, 0x66, 0x56, 0xae, + 0x97, 0x8b, 0x43, 0x37, 0x4b, 0xa6, 0xc9, 0x37, 0x2a, 0x22, 0xcb, 0x77, + 0xe3, 0xfe, 0x25, 0xee, 0x50, 0x56, 0xc3, 0xb2, 0xfb, 0x3a, 0x2d, 0xff, + 0x5a, 0xdb, 0xf6, 0x2f, 0x6b, 0x76, 0x7c, 0x52, 0xc1, 0x87, 0x37, 0x5f, + 0x8c, 0x82, 0x1d, 0x8a, 0xce, 0x90, 0x87, 0xda, 0x6a, 0xe4, 0xd8, 0x00, + 0x86, 0x02, 0x00, 0x96, 0xe0, 0xc1, 0x50, 0x80, 0x00, 0x64, 0x70, 
0xce, + 0x3f, 0xd1, 0xf9, 0x0e, 0xe4, 0xfc, 0x2f, 0xe3, 0xf6, 0x37, 0x50, 0x3c, + 0xff, 0xb7, 0xdd, 0x70, 0x7f, 0xff, 0x6f, 0xf6, 0xc6, 0x80, 0x00, 0x33, + 0xf8, 0xb8, 0x5a, 0x9c, 0xb7, 0x39, 0xa5, 0xca, 0x1e, 0x02, 0x24, 0xdf, + 0xd1, 0x47, 0x14, 0x68, 0x1e, 0xc7, 0x93, 0xd7, 0x4d, 0x1f, 0x54, 0xe7, + 0x15, 0xec, 0x93, 0xf2, 0x3a, 0x57, 0x86, 0x30, 0xd0, 0xb7, 0x7e, 0x31, + 0x03, 0x8c, 0x96, 0x3c, 0xde, 0x72, 0x8d, 0x76, 0x4d, 0x1f, 0xec, 0xda, + 0x3a, 0x0d, 0x4d, 0x75, 0x41, 0x06, 0x80, 0x3a, 0xec, 0xfb, 0xa0, 0xa2, + 0xe0, 0xda, 0x9b, 0x0c, 0x68, 0x17, 0xc3, 0xb3, 0x64, 0xd0, 0x88, 0xe4, + 0xf1, 0xf3, 0x6e, 0x38, 0x7c, 0x80, 0x2f, 0x7a, 0xd9, 0xab, 0x67, 0x97, + 0xf0, 0x19, 0x6e, 0x67, 0x4b, 0x7d, 0x58, 0xd1, 0x23, 0x98, 0xd0, 0xf9, + 0x48, 0xb7, 0x75, 0x6c, 0x90, 0x4c, 0x2d, 0xcd, 0x50, 0x1b, 0xb0, 0x61, + 0xb1, 0xa3, 0xc1, 0xbf, 0x77, 0xb9, 0xbf, 0xf8, 0x89, 0x89, 0xb8, 0x43, + 0xc1, 0xdf, 0x8c, 0xfa, 0xfc, 0xf6, 0xaa, 0xd0, 0xaf, 0x48, 0xd4, 0x8a, + 0x4f, 0x64, 0x74, 0x2f, 0x1b, 0xc2, 0xfb, 0x32, 0xbb, 0x4b, 0x35, 0x0a, + 0x64, 0x0e, 0x0a, 0xe7, 0xd7, 0x00, 0xd6, 0x2a, 0x3e, 0x16, 0x7d, 0x43, + 0x38, 0x46, 0xd2, 0x85, 0x2b, 0x8f, 0x14, 0x5e, 0x24, 0x39, 0x3c, 0xe9, + 0x52, 0xd0, 0xcf, 0xf6, 0xe3, 0xff, 0x54, 0xb8, 0x38, 0xe3, 0xa6, 0x28, + 0x72, 0xd5, 0x89, 0x40, 0x75, 0xb7, 0xbf, 0x96, 0x19, 0x00, 0x52, 0x7c, + 0x4a, 0xc0, 0x91, 0x01, 0x54, 0xba, 0x63, 0xb3, 0xca, 0xed, 0xab, 0x3d, + 0xdf, 0xf9, 0x11, 0x54, 0x1f, 0xc6, 0x24, 0x4f, 0x21, 0x93, 0xea, 0x1b, + 0xc9, 0xd6, 0xd1, 0x29, 0x8a, 0x57, 0x66, 0x44, 0x10, 0xed, 0x38, 0xca, + 0x7b, 0xa0, 0xa6, 0xc2, 0x7c, 0x41, 0x2f, 0x39, 0x3c, 0xb6, 0x44, 0x99, + 0x0f, 0x47, 0x96, 0xa3, 0xab, 0xc9, 0x8e, 0x96, 0x63, 0x24, 0x33, 0xef, + 0x7d, 0x4f, 0xdb, 0x42, 0x2a, 0x9f, 0x76, 0xd2, 0x1e, 0xf4, 0x81, 0x35, + 0x41, 0xba, 0x42, 0xf1, 0x4c, 0x87, 0x34, 0x8d, 0x1e, 0x07, 0xa5, 0x48, + 0x70, 0xeb, 0x3c, 0xe5, 0xf4, 0x21, 0x4b, 0xde, 0xb5, 0x4b, 0xdd, 0x44, + 0x71, 0x1d, 0xcf, 0xd3, 0xae, 0x9a, 0x75, 0xa6, 0x45, 0xe2, 0x80, 
0xaf, + 0x3a, 0xf6, 0xa7, 0x0f, 0x79, 0xfe, 0xe2, 0x4d, 0x2a, 0x8c, 0x2e, 0xa5, + 0x32, 0x36, 0xdd, 0xaa, 0xec, 0x88, 0x07, 0xf8, 0xea, 0xc6, 0x5a, 0xf5, + 0x29, 0x01, 0x30, 0xe0, 0x01, 0xcf, 0x83, 0xd0, 0x00, 0xc9, 0xa4, 0x16, + 0x71, 0x01, 0xc9 +}; +static const guint profile_0_frame1_len = 6171; +static const guint profile_0_frame1_first_len = 5796; +static const guint profile_0_frame1_last_len = 369; + +static const guint8 profile_0_frame2[] = { + 0x86, 0x04, 0x18, 0x96, 0x98, 0x60, 0x54, 0x40, 0x00, 0x14, 0x77, 0x67, + 0x6b, 0x66, 0x9f, 0x27, 0x00, 0x52, 0x03, 0x88, 0xbb, 0x8b, 0xe5, 0xb8, + 0x03, 0xc7, 0xfb, 0x7f, 0x00, 0x00, 0x62, 0xde, 0xa8, 0x9d, 0xec, 0x3c, + 0x11, 0x21, 0xf8, 0x45, 0xca, 0x83, 0x6c, 0x72, 0x35, 0x09, 0x7e, 0x88, + 0x66, 0x79, 0xa2, 0xac, 0x5c, 0xae, 0x69, 0x00, 0x66, 0xcb, 0x5b, 0x63, + 0x0e, 0x8c, 0xd3, 0x73, 0x06, 0xb5, 0xc1, 0xeb, 0x8b, 0x6e, 0xcd, 0xe8, + 0xdc, 0xe9, 0x25, 0xbc, 0x9e, 0x0e, 0x89, 0xe8, 0x83, 0xa2, 0x8f, 0xc6, + 0x87, 0x59, 0xa9, 0x2a, 0x12, 0x5e, 0x6f, 0x48, 0xa0, 0xb6, 0x52, 0xb4, + 0x5e, 0x9a, 0xb0, 0xb6, 0x1b, 0x7d, 0x44, 0x61, 0x4c, 0x9b, 0xac, 0x11, + 0x44, 0x14, 0xf3, 0xde, 0xf7, 0x82, 0x8e, 0x57, 0x1a, 0x41, 0x1f, 0xe9, + 0x99, 0x4b, 0x93, 0xbd, 0xbf, 0x41, 0x20, 0x57, 0x5d, 0xaa, 0x1f, 0xd2, + 0x47, 0x29, 0xe5, 0xb9, 0x7a, 0x74, 0x98, 0x94, 0x71, 0xe2, 0xd8, 0xaa, + 0xeb, 0x7b, 0x35, 0x9c, 0x8e, 0xa5, 0xc0, 0xed, 0x5f, 0xc7, 0x64, 0x0e, + 0x22, 0x25, 0x12, 0xbd, 0x8e, 0x55, 0x94, 0x03, 0xcb, 0x92, 0xdf, 0x8c, + 0x5e, 0xb3, 0xa0, 0x59, 0x6c, 0xdc, 0x97, 0xbd, 0x33, 0x90, 0x89, 0xc1, + 0x7d, 0x20, 0x72, 0x29, 0x3c, 0x52, 0x7b, 0x3a, 0x89, 0x7a, 0x69, 0x7f, + 0x6a, 0x9f, 0xd2, 0x86, 0x91, 0x9c, 0x3d, 0x84, 0x41, 0xbe, 0x33, 0x88, + 0x9e, 0x47, 0x5d, 0x89, 0x34, 0xc6, 0xe7, 0x18, 0xcb, 0x9c, 0x88, 0x5e, + 0xed, 0xe4, 0x53, 0x35, 0xb3, 0x1d, 0x89, 0x4f, 0xfb, 0x59, 0xd0, 0xeb, + 0xe8, 0x2e, 0x9a, 0xb3, 0xbc, 0xc0, 0x81, 0x07, 0x37, 0xc8, 0xad, 0x17, + 0x24, 0x89, 0x12, 0xf7, 0x09, 0x35, 0x29, 0x54, 0x76, 0x9f, 
0x0c, 0x5c, + 0x97, 0xa1, 0x9e, 0x52, 0xae, 0xbf, 0xfc, 0x11, 0x0d, 0xce, 0x50, 0x56, + 0xf2, 0x61, 0xd3, 0x35, 0xba, 0x99, 0x88, 0x9f, 0xca, 0xfe, 0xf8, 0x1a, + 0x0f, 0x33, 0x28, 0x9e, 0x19, 0x70, 0x38, 0x14, 0xc5, 0xa3, 0x2a, 0x0b, + 0x42, 0x98, 0xed, 0xfc, 0x68, 0x6c, 0xc8, 0x13, 0x63, 0x86, 0xd1, 0x3f, + 0x09, 0x26, 0x84, 0x0d, 0x62, 0x0d, 0x91, 0x1a, 0xfd, 0x8c, 0x8f, 0xe0, + 0xca, 0x66, 0x60, 0xee, 0xcb, 0x72, 0x7c, 0x55, 0xd3, 0x6b, 0x22, 0x09, + 0x61, 0x3a, 0x7a, 0xbe, 0x98, 0x4b, 0x31, 0xf2, 0x61, 0x38, 0xb7, 0x99, + 0x83, 0xd5, 0x38, 0x12, 0xf8, 0x54, 0x53, 0x21, 0xd4, 0x9a, 0xca, 0xcf, + 0xf8, 0x26, 0xa3, 0xfe, 0x1a, 0xc4, 0xc6, 0x3e, 0xbd, 0x2c, 0x0d, 0xa9, + 0x79, 0xc4, 0x6f, 0xa3, 0x9a, 0x92, 0x56, 0xc5, 0xe2, 0xea, 0xcc, 0xdf, + 0x12, 0xe9, 0x03, 0x9d, 0x1c, 0x5a, 0x4f, 0x76, 0xfb, 0x08, 0xf5, 0x48, + 0xac, 0xe1, 0x2f, 0x98, 0xed, 0x75, 0x3e, 0xb8, 0x56, 0x50, 0x3c, 0xff, + 0x0a, 0x02, 0x01, 0xcd, 0x66, 0xe9, 0xed, 0x23, 0x79, 0x31, 0x5d, 0x97, + 0x3e, 0x61, 0x1f, 0x9d, 0x0c, 0x99, 0x3c, 0x93, 0x2b, 0x75, 0x5f, 0xf3, + 0x05, 0x88, 0x07, 0x7b, 0xc4, 0x22, 0x09, 0xc4, 0x04, 0x53, 0xb4, 0x9e, + 0x37, 0x24, 0x87, 0x38, 0x6e, 0x49, 0x43, 0xe3, 0x89, 0xda, 0x9e, 0x03, + 0x2a, 0x0c, 0x92, 0x79, 0x60, 0x2b, 0x7b, 0xa2, 0x33, 0xf2, 0x44, 0x11, + 0x10, 0xff, 0x1a, 0x90, 0x9d, 0xa1, 0xa7, 0x36, 0xac, 0xc9, 0x21, 0xbc, + 0x16, 0x5e, 0x08, 0x30, 0x64, 0xba, 0xe3, 0x6a, 0x10, 0xb8, 0x08, 0xfe, + 0xad, 0x41, 0xa0, 0x6d, 0xd9, 0xba, 0xd7, 0xdf, 0x60, 0x5d, 0x88, 0x1f, + 0x52, 0xa5, 0x49, 0x95, 0x66, 0xc4, 0x91, 0xc5, 0x1d, 0xa4, 0xef, 0xfe, + 0xab, 0xab, 0x6b, 0x34, 0x99, 0xa7, 0xf0, 0x66, 0x3f, 0x80, 0xdc, 0xa2, + 0x53, 0x85, 0x13, 0x28, 0x8b, 0xfe, 0x4a, 0xf0, 0x22, 0xe5, 0x2d, 0x61, + 0x91, 0x17, 0x94, 0x25, 0x0c, 0x90, 0xf1, 0x44, 0x36, 0x08, 0x8d, 0x4b, + 0xd9, 0xcf, 0x6e, 0x49, 0x46, 0x5b, 0x57, 0x42, 0x37, 0x68, 0x73, 0x91, + 0x91, 0x87, 0x64, 0x36, 0x74, 0x1e, 0x45, 0x3c, 0x7a, 0x4f, 0x33, 0x84, + 0x2c, 0x94, 0x30, 0x1c, 0x03, 0x01, 0x83, 0x7f, 0x4c, 0x26, 
0xdd, 0x49, + 0x19, 0x1d, 0xd9, 0xc1, 0xb5, 0xad, 0xd6, 0x05, 0x13, 0xad, 0x1e, 0xd1, + 0x74, 0x87, 0x42, 0x06, 0x90, 0xce, 0x7f, 0x44, 0x18, 0x6a, 0x7c, 0xa8, + 0xa9, 0xbf, 0x4e, 0x26, 0x26, 0x96, 0x9c, 0xe9, 0x12, 0xb1, 0x75, 0xaf, + 0xc9, 0xe9, 0x60, 0x9f, 0x8c, 0x56, 0x4a, 0x6b, 0x1d, 0x18, 0x3e, 0x8f, + 0xa9, 0x70, 0xff, 0x65, 0x4c, 0xa6, 0x15, 0x9b, 0x17, 0xfb, 0xf7, 0x96, + 0xe3, 0x4b, 0x29, 0x74, 0xcf, 0x0e, 0x2d, 0x86, 0x86, 0xf9, 0x37, 0xe0, + 0xe5, 0x64, 0x37, 0x0a, 0x29, 0xa6, 0x0a, 0x1f, 0x4f, 0x32, 0x65, 0x6b, + 0xe2, 0xcc, 0x5a, 0xf1, 0x45, 0xdd, 0xd5, 0xc4, 0x5b, 0xe2, 0xdb, 0x2a, + 0x0a, 0x05, 0x0b, 0x90, 0xed, 0x92, 0xc7, 0x48, 0x54, 0xac, 0xf1, 0xdb, + 0x59, 0x78, 0xa8, 0x78, 0x12, 0x42, 0x6c, 0x92, 0x78, 0x5d, 0x0e, 0x54, + 0xd7, 0x3f, 0x47, 0xc9, 0xd2, 0x1e, 0x2d, 0x82, 0xb5, 0x21, 0x1f, 0x8d, + 0x3a, 0x48, 0x88, 0x63, 0x35, 0x24, 0xb8, 0x1c, 0xd1, 0x27, 0x06, 0x53, + 0x05, 0x07, 0xdf, 0xeb, 0x0d, 0x57, 0xaa, 0x76, 0x20, 0xa1, 0x94, 0xe0, + 0xc3, 0x1d, 0x2b, 0x26, 0xf2, 0xbd, 0x31, 0x68, 0x4c, 0x66, 0x19, 0x9a, + 0xe2, 0xd6, 0x51, 0xe4, 0xfe, 0x8d, 0xd9, 0xd7, 0x38, 0x4c, 0x26, 0xcf, + 0x40, 0x35, 0x85, 0x96, 0xd8, 0xd0, 0x1b, 0x23, 0x67, 0x67, 0x46, 0x00, + 0x36, 0xaf, 0x26, 0xcc, 0xfc, 0x96, 0xe3, 0x7b, 0x5e, 0xe0, 0x42, 0x9b, + 0xe0, 0x74, 0x2d, 0x2a, 0xe8, 0xe8, 0xce, 0xcc, 0x00, 0xb0, 0xe1, 0x74, + 0x3c, 0x81, 0xb5, 0x9d, 0x5d, 0x6f, 0xba, 0x42, 0x03, 0x28, 0xb2, 0xaa, + 0xb9, 0xd1, 0xea, 0x49, 0x4d, 0x8f, 0x91, 0xb0, 0xf4, 0x97, 0x67, 0xa2, + 0xd5, 0x90, 0x8a, 0x14, 0xf7, 0x5c, 0x98, 0xd5, 0xf7, 0x14, 0x16, 0x61, + 0x68, 0x4f, 0x4c, 0xa0, 0x06, 0x00 +}; +static const guint profile_0_frame2_len = 834;
gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/wasapi.c
Added
@@ -0,0 +1,291 @@ +/* GStreamer unit test for wasapi plugin + * + * Copyright (C) 2021 Jakub Janků <janku.jakub.jj@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/check/gstcheck.h> + +typedef struct +{ + GMainLoop *loop; + GstElement *pipeline; + guint n_buffers; + guint restart_count; + GstState reuse_state; +} SrcReuseTestData; + +static gboolean +src_reuse_bus_handler (GstBus * bus, GstMessage * message, + SrcReuseTestData * data) +{ + if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_ERROR) { + GST_ERROR ("Got error message from pipeline"); + g_main_loop_quit (data->loop); + } + + return TRUE; +} + +static void +start_pipeline (SrcReuseTestData * data) +{ + GstStateChangeReturn ret; + + GST_INFO ("Start pipeline"); + ret = gst_element_set_state (data->pipeline, GST_STATE_PLAYING); + fail_unless (ret != GST_STATE_CHANGE_FAILURE); +} + +static gboolean +restart_pipeline (SrcReuseTestData * data) +{ + data->restart_count++; + start_pipeline (data); + + return G_SOURCE_REMOVE; +} + +static gboolean +handle_handoff (SrcReuseTestData * data) +{ + data->n_buffers++; + + /* Restart every 10 packets */ + if (data->n_buffers > 10) { + GstStateChangeReturn ret; + 
data->n_buffers = 0; + + ret = gst_element_set_state (data->pipeline, data->reuse_state); + fail_unless (ret != GST_STATE_CHANGE_FAILURE); + + if (data->restart_count < 2) { + GST_INFO ("Restart pipeline, current restart count %d", + data->restart_count); + g_timeout_add_seconds (1, (GSourceFunc) restart_pipeline, data); + } else { + GST_INFO ("Finish test"); + g_main_loop_quit (data->loop); + } + } + + return G_SOURCE_REMOVE; +} + +static void +on_sink_handoff (GstElement * element, GstBuffer * buffer, GstPad * pad, + SrcReuseTestData * data) +{ + g_idle_add ((GSourceFunc) handle_handoff, data); +} + +static void +wasapisrc_reuse (GstState reuse_state) +{ + GstBus *bus; + GstElement *sink; + SrcReuseTestData data; + + memset (&data, 0, sizeof (SrcReuseTestData)); + + data.loop = g_main_loop_new (NULL, FALSE); + + data.pipeline = gst_parse_launch ("wasapisrc provide-clock=false ! queue ! " + "fakesink name=sink async=false", NULL); + fail_unless (data.pipeline != NULL); + data.reuse_state = reuse_state; + + sink = gst_bin_get_by_name (GST_BIN (data.pipeline), "sink"); + fail_unless (sink); + + g_object_set (G_OBJECT (sink), "signal-handoffs", TRUE, NULL); + g_signal_connect (sink, "handoff", G_CALLBACK (on_sink_handoff), &data); + + bus = gst_element_get_bus (GST_ELEMENT (data.pipeline)); + fail_unless (bus != NULL); + + gst_bus_add_watch (bus, (GstBusFunc) src_reuse_bus_handler, &data); + start_pipeline (&data); + g_main_loop_run (data.loop); + + fail_unless (data.restart_count == 2); + + gst_element_set_start_time (data.pipeline, GST_STATE_NULL); + gst_bus_remove_watch (bus); + gst_object_unref (bus); + + gst_object_unref (data.pipeline); + g_main_loop_unref (data.loop); +} + +/* https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1110 */ +GST_START_TEST (test_wasapisrc_reuse_null) +{ + wasapisrc_reuse (GST_STATE_NULL); +} + +GST_END_TEST; + +GST_START_TEST (test_wasapisrc_reuse_ready) +{ + wasapisrc_reuse (GST_STATE_READY); +} + +GST_END_TEST; + 
+typedef struct +{ + GMainLoop *loop; + GstElement *pipe; + guint rem_st_changes; + GstState reuse_state; +} SinkPlayReadyTData; + +static gboolean +sink_reuse_bus_watch_cb (GstBus * bus, GstMessage * message, gpointer user_data) +{ + fail_unless (message->type != GST_MESSAGE_ERROR); + + return G_SOURCE_CONTINUE; +} + +static gboolean +state_timer_cb (gpointer user_data) +{ + SinkPlayReadyTData *tdata = user_data; + GstState nxt_st = tdata->rem_st_changes % 2 == 1 ? + tdata->reuse_state : GST_STATE_PLAYING; + + ASSERT_SET_STATE (tdata->pipe, nxt_st, GST_STATE_CHANGE_SUCCESS); + tdata->rem_st_changes--; + + if (tdata->rem_st_changes == 0) { + g_main_loop_quit (tdata->loop); + return G_SOURCE_REMOVE; + } + return G_SOURCE_CONTINUE; +} + +/* Test that the wasapisink can survive the state change + * from PLAYING to READY and then back to PLAYING */ +static void +wasapisink_reuse (GstState reuse_state) +{ + SinkPlayReadyTData tdata; + GstBus *bus; + + tdata.pipe = gst_parse_launch ("audiotestsrc ! 
wasapisink async=false", NULL); + fail_unless (tdata.pipe != NULL); + bus = gst_element_get_bus (tdata.pipe); + fail_unless (bus != NULL); + gst_bus_add_watch (bus, sink_reuse_bus_watch_cb, NULL); + + tdata.reuse_state = reuse_state; + + ASSERT_SET_STATE (tdata.pipe, GST_STATE_PLAYING, GST_STATE_CHANGE_SUCCESS); + tdata.rem_st_changes = 3; /* -> READY -> PLAYING -> QUIT */ + g_timeout_add_seconds (1, state_timer_cb, &tdata); + + tdata.loop = g_main_loop_new (NULL, FALSE); + g_main_loop_run (tdata.loop); + + g_main_loop_unref (tdata.loop); + gst_bus_remove_watch (bus); + gst_object_unref (bus); + gst_object_unref (tdata.pipe); +} + +GST_START_TEST (test_wasapisink_reuse_null) +{ + wasapisink_reuse (GST_STATE_NULL); +} + +GST_END_TEST; + +GST_START_TEST (test_wasapisink_reuse_ready) +{ + wasapisink_reuse (GST_STATE_READY); +} + +GST_END_TEST; + +static gboolean +check_wasapi_element (gboolean is_src) +{ + gboolean ret = TRUE; + GstElement *elem; + const gchar *elem_name; + + if (is_src) + elem_name = "wasapisrc"; + else + elem_name = "wasapisink"; + + elem = gst_element_factory_make (elem_name, NULL); + if (!elem) { + GST_INFO ("%s is not available", elem_name); + return FALSE; + } + + /* GST_STATE_READY is meaning that camera is available */ + if (gst_element_set_state (elem, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS) { + GST_INFO ("cannot open device"); + ret = FALSE; + } + + gst_element_set_state (elem, GST_STATE_NULL); + gst_object_unref (elem); + + return ret; +} + +static Suite * +wasapi_suite (void) +{ + Suite *s = suite_create ("wasapi"); + TCase *tc_basic = tcase_create ("general"); + gboolean have_src = FALSE; + gboolean have_sink = FALSE; + + suite_add_tcase (s, tc_basic); + + have_src = check_wasapi_element (TRUE); + have_sink = check_wasapi_element (FALSE); + + if (!have_src && !have_sink) { + GST_INFO ("Skipping tests, wasapisrc/wasapisink are unavailable"); + } else { + if (have_src) { + tcase_add_test (tc_basic, test_wasapisrc_reuse_null); + 
tcase_add_test (tc_basic, test_wasapisrc_reuse_ready); + } + + if (have_sink) { + tcase_add_test (tc_basic, test_wasapisink_reuse_null); + tcase_add_test (tc_basic, test_wasapisink_reuse_ready); + } + } + + return s; +} + +GST_CHECK_MAIN (wasapi)
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/wasapi2.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/wasapi2.c
Changed
@@ -32,10 +32,12 @@ GstElement *pipeline; guint n_buffers; guint restart_count; + GstState reuse_state; } SrcReuseTestData; static gboolean -bus_handler (GstBus * bus, GstMessage * message, SrcReuseTestData * data) +src_reuse_bus_handler (GstBus * bus, GstMessage * message, + SrcReuseTestData * data) { if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_ERROR) { GST_ERROR ("Got error message from pipeline"); @@ -74,7 +76,7 @@ GstStateChangeReturn ret; data->n_buffers = 0; - ret = gst_element_set_state (data->pipeline, GST_STATE_NULL); + ret = gst_element_set_state (data->pipeline, data->reuse_state); fail_unless (ret != GST_STATE_CHANGE_FAILURE); if (data->restart_count < 2) { @@ -97,8 +99,8 @@ g_idle_add ((GSourceFunc) handle_handoff, data); } -/* https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1110 */ -GST_START_TEST (test_wasapi2src_reuse) +static void +wasapi2src_reuse (GstState reuse_state) { GstBus *bus; GstElement *sink; @@ -111,6 +113,7 @@ data.pipeline = gst_parse_launch ("wasapi2src provide-clock=false ! queue ! 
" "fakesink name=sink async=false", NULL); fail_unless (data.pipeline != NULL); + data.reuse_state = reuse_state; sink = gst_bin_get_by_name (GST_BIN (data.pipeline), "sink"); fail_unless (sink); @@ -121,37 +124,138 @@ bus = gst_element_get_bus (GST_ELEMENT (data.pipeline)); fail_unless (bus != NULL); - gst_bus_add_watch (bus, (GstBusFunc) bus_handler, &data); + gst_bus_add_watch (bus, (GstBusFunc) src_reuse_bus_handler, &data); start_pipeline (&data); g_main_loop_run (data.loop); fail_unless (data.restart_count == 2); + + gst_element_set_start_time (data.pipeline, GST_STATE_NULL); + gst_bus_remove_watch (bus); + gst_object_unref (bus); + gst_object_unref (data.pipeline); g_main_loop_unref (data.loop); } +/* https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1110 */ +GST_START_TEST (test_wasapi2src_reuse_null) +{ + wasapi2src_reuse (GST_STATE_NULL); +} + +GST_END_TEST; + +GST_START_TEST (test_wasapi2src_reuse_ready) +{ + wasapi2src_reuse (GST_STATE_READY); +} + GST_END_TEST; +typedef struct +{ + GMainLoop *loop; + GstElement *pipe; + guint rem_st_changes; + GstState reuse_state; +} SinkPlayReadyTData; + +static gboolean +sink_reuse_bus_watch_cb (GstBus * bus, GstMessage * message, gpointer user_data) +{ + fail_unless (message->type != GST_MESSAGE_ERROR); + + return G_SOURCE_CONTINUE; +} + static gboolean -check_wasapi2_src_available (void) +state_timer_cb (gpointer user_data) +{ + SinkPlayReadyTData *tdata = user_data; + GstState nxt_st = tdata->rem_st_changes % 2 == 1 ? 
+ tdata->reuse_state : GST_STATE_PLAYING; + + ASSERT_SET_STATE (tdata->pipe, nxt_st, GST_STATE_CHANGE_SUCCESS); + tdata->rem_st_changes--; + + if (tdata->rem_st_changes == 0) { + g_main_loop_quit (tdata->loop); + return G_SOURCE_REMOVE; + } + return G_SOURCE_CONTINUE; +} + +/* Test that the wasapisink can survive the state change + * from PLAYING to READY and then back to PLAYING */ +static void +wasapi2sink_reuse (GstState reuse_state) +{ + SinkPlayReadyTData tdata; + GstBus *bus; + + tdata.pipe = + gst_parse_launch ("audiotestsrc ! wasapi2sink async=false", NULL); + fail_unless (tdata.pipe != NULL); + bus = gst_element_get_bus (tdata.pipe); + fail_unless (bus != NULL); + gst_bus_add_watch (bus, sink_reuse_bus_watch_cb, NULL); + + tdata.reuse_state = reuse_state; + + ASSERT_SET_STATE (tdata.pipe, GST_STATE_PLAYING, GST_STATE_CHANGE_SUCCESS); + tdata.rem_st_changes = 3; /* -> READY -> PLAYING -> QUIT */ + g_timeout_add_seconds (1, state_timer_cb, &tdata); + + tdata.loop = g_main_loop_new (NULL, FALSE); + g_main_loop_run (tdata.loop); + + g_main_loop_unref (tdata.loop); + gst_bus_remove_watch (bus); + gst_object_unref (bus); + gst_object_unref (tdata.pipe); +} + +GST_START_TEST (test_wasapi2sink_reuse_null) +{ + wasapi2sink_reuse (GST_STATE_NULL); +} + +GST_END_TEST; + +GST_START_TEST (test_wasapi2sink_reuse_ready) +{ + wasapi2sink_reuse (GST_STATE_READY); +} + +GST_END_TEST; + +static gboolean +check_wasapi2_element (gboolean is_src) { gboolean ret = TRUE; - GstElement *src; + GstElement *elem; + const gchar *elem_name; + + if (is_src) + elem_name = "wasapi2src"; + else + elem_name = "wasapi2sink"; - src = gst_element_factory_make ("wasapi2src", NULL); - if (!src) { - GST_INFO ("wasapisrc is not available"); + elem = gst_element_factory_make (elem_name, NULL); + if (!elem) { + GST_INFO ("%s is not available", elem_name); return FALSE; } /* GST_STATE_READY is meaning that camera is available */ - if (gst_element_set_state (src, GST_STATE_READY) != 
GST_STATE_CHANGE_SUCCESS) { + if (gst_element_set_state (elem, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS) { GST_INFO ("cannot open device"); ret = FALSE; } - gst_element_set_state (src, GST_STATE_NULL); - gst_object_unref (src); + gst_element_set_state (elem, GST_STATE_NULL); + gst_object_unref (elem); return ret; } @@ -162,15 +266,25 @@ Suite *s = suite_create ("wasapi2"); TCase *tc_basic = tcase_create ("general"); gboolean have_src = FALSE; + gboolean have_sink = FALSE; suite_add_tcase (s, tc_basic); - have_src = check_wasapi2_src_available (); + have_src = check_wasapi2_element (TRUE); + have_sink = check_wasapi2_element (FALSE); - if (have_src) { - tcase_add_test (tc_basic, test_wasapi2src_reuse); + if (!have_src && !have_sink) { + GST_INFO ("Skipping tests, wasapi2src/wasapi2sink are unavailable"); } else { - GST_INFO ("Skipping tests, wasapi2src is unavailable"); + if (have_src) { + tcase_add_test (tc_basic, test_wasapi2src_reuse_null); + tcase_add_test (tc_basic, test_wasapi2src_reuse_ready); + } + + if (have_sink) { + tcase_add_test (tc_basic, test_wasapi2sink_reuse_null); + tcase_add_test (tc_basic, test_wasapi2sink_reuse_ready); + } } return s;
gst-plugins-bad-1.18.6.tar.xz/tests/check/elements/webrtcbin.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/elements/webrtcbin.c
Changed
@@ -36,8 +36,9 @@ #define OPUS_RTP_CAPS(pt) "application/x-rtp,payload=" G_STRINGIFY(pt) ",encoding-name=OPUS,media=audio,clock-rate=48000,ssrc=(uint)3384078950" #define VP8_RTP_CAPS(pt) "application/x-rtp,payload=" G_STRINGIFY(pt) ",encoding-name=VP8,media=video,clock-rate=90000,ssrc=(uint)3484078950" +#define H264_RTP_CAPS(pt) "application/x-rtp,payload=" G_STRINGIFY(pt) ",encoding-name=H264,media=video,clock-rate=90000,ssrc=(uint)3484078951" -#define TEST_IS_OFFER_ELEMENT(t, e) ((t)->offerror == 1 && (e) == (t)->webrtc1 ? TRUE : FALSE) +#define TEST_IS_OFFER_ELEMENT(t, e) ((((t)->offerror == 1 && (e) == (t)->webrtc1) || ((t)->offerror == 2 && (e) == (t)->webrtc2)) ? TRUE : FALSE) #define TEST_GET_OFFEROR(t) (TEST_IS_OFFER_ELEMENT(t, t->webrtc1) ? (t)->webrtc1 : t->webrtc2) #define TEST_GET_ANSWERER(t) (TEST_IS_OFFER_ELEMENT(t, t->webrtc1) ? (t)->webrtc2 : t->webrtc1) @@ -62,6 +63,7 @@ struct test_webrtc { GList *harnesses; + GstTestClock *test_clock; GThread *thread; GMainLoop *loop; GstBus *bus1; @@ -159,12 +161,13 @@ GstElement *answerer = TEST_GET_ANSWERER (t); g_mutex_lock (&t->lock); - if (++t->answer_set_count >= 2 && t->on_answer_set) { - t->on_answer_set (t, answerer, promise, t->answer_set_data); + if (++t->answer_set_count >= 2) { + if (t->on_answer_set) + t->on_answer_set (t, answerer, promise, t->answer_set_data); + if (t->state == STATE_ANSWER_CREATED) + t->state = STATE_ANSWER_SET; + g_cond_broadcast (&t->cond); } - if (t->state == STATE_ANSWER_CREATED) - t->state = STATE_ANSWER_SET; - g_cond_broadcast (&t->cond); gst_promise_unref (promise); g_mutex_unlock (&t->lock); } @@ -177,14 +180,19 @@ GstElement *answerer = TEST_GET_ANSWERER (t); const GstStructure *reply; GstWebRTCSessionDescription *answer = NULL; - gchar *desc; + GError *error = NULL; reply = gst_promise_get_reply (promise); - gst_structure_get (reply, "answer", - GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &answer, NULL); - desc = gst_sdp_message_as_text (answer->sdp); - GST_INFO ("Created 
Answer: %s", desc); - g_free (desc); + if (gst_structure_get (reply, "answer", + GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &answer, NULL)) { + gchar *desc = gst_sdp_message_as_text (answer->sdp); + GST_INFO ("Created Answer: %s", desc); + g_free (desc); + } else if (gst_structure_get (reply, "error", G_TYPE_ERROR, &error, NULL)) { + GST_INFO ("Creating answer resulted in error: %s", error->message); + } else { + g_assert_not_reached (); + } g_mutex_lock (&t->lock); @@ -196,15 +204,28 @@ } gst_promise_unref (promise); - promise = gst_promise_new_with_change_func (_on_answer_set, t, NULL); - g_signal_emit_by_name (answerer, "set-local-description", t->answer_desc, - promise); - promise = gst_promise_new_with_change_func (_on_answer_set, t, NULL); - g_signal_emit_by_name (offeror, "set-remote-description", t->answer_desc, - promise); + if (error) + goto error; + + if (t->answer_desc) { + promise = gst_promise_new_with_change_func (_on_answer_set, t, NULL); + g_signal_emit_by_name (answerer, "set-local-description", t->answer_desc, + promise); + promise = gst_promise_new_with_change_func (_on_answer_set, t, NULL); + g_signal_emit_by_name (offeror, "set-remote-description", t->answer_desc, + promise); + } test_webrtc_signal_state_unlocked (t, STATE_ANSWER_CREATED); g_mutex_unlock (&t->lock); + return; + +error: + g_clear_error (&error); + if (t->state < STATE_ERROR) + test_webrtc_signal_state_unlocked (t, STATE_ERROR); + g_mutex_unlock (&t->lock); + return; } static void @@ -214,12 +235,13 @@ GstElement *offeror = TEST_GET_OFFEROR (t); g_mutex_lock (&t->lock); - if (++t->offer_set_count >= 2 && t->on_offer_set) { - t->on_offer_set (t, offeror, promise, t->offer_set_data); + if (++t->offer_set_count >= 2) { + if (t->on_offer_set) + t->on_offer_set (t, offeror, promise, t->offer_set_data); + if (t->state == STATE_OFFER_CREATED) + t->state = STATE_OFFER_SET; + g_cond_broadcast (&t->cond); } - if (t->state == STATE_OFFER_CREATED) - t->state = STATE_OFFER_SET; - g_cond_broadcast 
(&t->cond); gst_promise_unref (promise); g_mutex_unlock (&t->lock); } @@ -232,14 +254,19 @@ GstElement *answerer = TEST_GET_ANSWERER (t); const GstStructure *reply; GstWebRTCSessionDescription *offer = NULL; - gchar *desc; + GError *error = NULL; reply = gst_promise_get_reply (promise); - gst_structure_get (reply, "offer", - GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &offer, NULL); - desc = gst_sdp_message_as_text (offer->sdp); - GST_INFO ("Created offer: %s", desc); - g_free (desc); + if (gst_structure_get (reply, "offer", + GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &offer, NULL)) { + gchar *desc = gst_sdp_message_as_text (offer->sdp); + GST_INFO ("Created offer: %s", desc); + g_free (desc); + } else if (gst_structure_get (reply, "error", G_TYPE_ERROR, &error, NULL)) { + GST_INFO ("Creating offer resulted in error: %s", error->message); + } else { + g_assert_not_reached (); + } g_mutex_lock (&t->lock); @@ -251,18 +278,31 @@ } gst_promise_unref (promise); - promise = gst_promise_new_with_change_func (_on_offer_set, t, NULL); - g_signal_emit_by_name (offeror, "set-local-description", t->offer_desc, - promise); - promise = gst_promise_new_with_change_func (_on_offer_set, t, NULL); - g_signal_emit_by_name (answerer, "set-remote-description", t->offer_desc, - promise); + if (error) + goto error; + + if (t->offer_desc) { + promise = gst_promise_new_with_change_func (_on_offer_set, t, NULL); + g_signal_emit_by_name (offeror, "set-local-description", t->offer_desc, + promise); + promise = gst_promise_new_with_change_func (_on_offer_set, t, NULL); + g_signal_emit_by_name (answerer, "set-remote-description", t->offer_desc, + promise); - promise = gst_promise_new_with_change_func (_on_answer_received, t, NULL); - g_signal_emit_by_name (answerer, "create-answer", NULL, promise); + promise = gst_promise_new_with_change_func (_on_answer_received, t, NULL); + g_signal_emit_by_name (answerer, "create-answer", NULL, promise); + } test_webrtc_signal_state_unlocked (t, STATE_OFFER_CREATED); 
g_mutex_unlock (&t->lock); + return; + +error: + g_clear_error (&error); + if (t->state < STATE_ERROR) + test_webrtc_signal_state_unlocked (t, STATE_ERROR); + g_mutex_unlock (&t->lock); + return; } static gboolean @@ -416,6 +456,14 @@ { switch (GST_MESSAGE_TYPE (msg)) { case GST_MESSAGE_ERROR:{ + GError *err = NULL; + gchar *dbg = NULL; + + gst_message_parse_error (msg, &err, &dbg); + g_error ("ERROR from element %s: %s (Debugging info: %s)", + GST_OBJECT_NAME (msg->src), err->message, (dbg) ? dbg : "none"); + g_error_free (err); + g_free (dbg); g_assert_not_reached (); break; } @@ -495,10 +543,13 @@ ret->on_answer_created = _offer_answer_not_reached; ret->on_data_channel = _on_data_channel_not_reached; ret->bus_message = _bus_no_errors; + ret->offerror = 1; g_mutex_init (&ret->lock); g_cond_init (&ret->cond); + ret->test_clock = GST_TEST_CLOCK (gst_test_clock_new ()); + ret->thread = g_thread_new ("test-webrtc", (GThreadFunc) _bus_thread, ret); g_mutex_lock (&ret->lock); @@ -514,6 +565,9 @@ ret->webrtc2 = gst_element_factory_make ("webrtcbin", NULL); fail_unless (ret->webrtc1 != NULL && ret->webrtc2 != NULL); + gst_element_set_clock (ret->webrtc1, GST_CLOCK (ret->test_clock)); + gst_element_set_clock (ret->webrtc2, GST_CLOCK (ret->test_clock)); + gst_element_set_bus (ret->webrtc1, ret->bus1); gst_element_set_bus (ret->webrtc2, ret->bus2); @@ -578,6 +632,8 @@ g_thread_join (t->thread); + g_object_unref (t->test_clock); + gst_bus_remove_watch (t->bus1); gst_bus_remove_watch (t->bus2); @@ -625,13 +681,13 @@ } static void -test_webrtc_create_offer (struct test_webrtc *t, GstElement * webrtc) +test_webrtc_create_offer (struct test_webrtc *t) { GstPromise *promise; + GstElement *offeror = TEST_GET_OFFEROR (t); - t->offerror = webrtc == t->webrtc1 ? 
1 : 2; promise = gst_promise_new_with_change_func (_on_offer_received, t, NULL); - g_signal_emit_by_name (webrtc, "create-offer", NULL, promise); + g_signal_emit_by_name (offeror, "create-offer", NULL, promise); } static void @@ -656,7 +712,6 @@ test_webrtc_wait_for_state_mask (t, states); } -#if 0 static void test_webrtc_wait_for_ice_gathering_complete (struct test_webrtc *t) { @@ -673,6 +728,7 @@ g_mutex_unlock (&t->lock); } +#if 0 static void test_webrtc_wait_for_ice_connection (struct test_webrtc *t, GstWebRTCICEConnectionState states) @@ -691,6 +747,7 @@ g_mutex_unlock (&t->lock); } #endif + static void _pad_added_fakesink (struct test_webrtc *t, GstElement * element, GstPad * pad, gpointer user_data) @@ -712,7 +769,7 @@ { guint *flag = (guint *) user_data; - *flag = 1; + *flag |= 1 << ((element == t->webrtc1) ? 1 : 2); } typedef void (*ValidateSDPFunc) (struct test_webrtc * t, GstElement * element, @@ -736,7 +793,7 @@ struct validate_sdp *validate = user_data; GstWebRTCSessionDescription *desc = NULL; - if (t->offerror == 1 && t->webrtc1 == element) + if (TEST_IS_OFFER_ELEMENT (t, element)) desc = t->offer_desc; else desc = t->answer_desc; @@ -774,7 +831,7 @@ GST_STATE_READY) == GST_STATE_CHANGE_FAILURE); } - test_webrtc_create_offer (t, t->webrtc1); + test_webrtc_create_offer (t); if (wait_mask == 0) { test_webrtc_wait_for_answer_error_eos (t); @@ -803,13 +860,12 @@ GST_START_TEST (test_sdp_no_media) { struct test_webrtc *t = test_webrtc_new (); - VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (0), NULL); - VAL_SDP_INIT (answer, _count_num_sdp_media, GUINT_TO_POINTER (0), NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (0), NULL); /* check that a no stream connection creates 0 media sections */ t->on_negotiation_needed = NULL; - test_validate_sdp (t, &offer, &answer); + test_validate_sdp (t, &count, &count); test_webrtc_free (t); } @@ -817,6 +873,115 @@ GST_END_TEST; static void +on_sdp_media_direction (struct test_webrtc 
*t, GstElement * element, + GstWebRTCSessionDescription * desc, gpointer user_data) +{ + gchar **expected_directions = user_data; + int i; + + for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { + const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); + + if (g_strcmp0 (gst_sdp_media_get_media (media), "audio") == 0 + || g_strcmp0 (gst_sdp_media_get_media (media), "video") == 0) { + gboolean have_direction = FALSE; + int j; + + for (j = 0; j < gst_sdp_media_attributes_len (media); j++) { + const GstSDPAttribute *attr = gst_sdp_media_get_attribute (media, j); + + if (g_strcmp0 (attr->key, "inactive") == 0) { + fail_unless (have_direction == FALSE, + "duplicate/multiple directions for media %u", j); + have_direction = TRUE; + fail_unless_equals_string (attr->key, expected_directions[i]); + } else if (g_strcmp0 (attr->key, "sendonly") == 0) { + fail_unless (have_direction == FALSE, + "duplicate/multiple directions for media %u", j); + have_direction = TRUE; + fail_unless_equals_string (attr->key, expected_directions[i]); + } else if (g_strcmp0 (attr->key, "recvonly") == 0) { + fail_unless (have_direction == FALSE, + "duplicate/multiple directions for media %u", j); + have_direction = TRUE; + fail_unless_equals_string (attr->key, expected_directions[i]); + } else if (g_strcmp0 (attr->key, "sendrecv") == 0) { + fail_unless (have_direction == FALSE, + "duplicate/multiple directions for media %u", j); + have_direction = TRUE; + fail_unless_equals_string (attr->key, expected_directions[i]); + } + } + fail_unless (have_direction, "no direction attribute in media %u", i); + } + } +} + +static void +on_sdp_media_no_duplicate_payloads (struct test_webrtc *t, GstElement * element, + GstWebRTCSessionDescription * desc, gpointer user_data) +{ + int i, j, k; + + for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { + const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); + + GArray *media_formats = g_array_new (FALSE, FALSE, sizeof 
(int)); + for (j = 0; j < gst_sdp_media_formats_len (media); j++) { + int pt = atoi (gst_sdp_media_get_format (media, j)); + for (k = 0; k < media_formats->len; k++) { + int val = g_array_index (media_formats, int, k); + if (pt == val) + fail ("found an unexpected duplicate payload type %u within media %u", + pt, i); + } + g_array_append_val (media_formats, pt); + } + g_array_free (media_formats, TRUE); + } +} + +static void +on_sdp_media_count_formats (struct test_webrtc *t, GstElement * element, + GstWebRTCSessionDescription * desc, gpointer user_data) +{ + guint *expected_n_media_formats = user_data; + int i; + + for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { + const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); + fail_unless_equals_int (gst_sdp_media_formats_len (media), + expected_n_media_formats[i]); + } +} + +static void +on_sdp_media_setup (struct test_webrtc *t, GstElement * element, + GstWebRTCSessionDescription * desc, gpointer user_data) +{ + gchar **expected_setup = user_data; + int i; + + for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { + const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); + gboolean have_setup = FALSE; + int j; + + for (j = 0; j < gst_sdp_media_attributes_len (media); j++) { + const GstSDPAttribute *attr = gst_sdp_media_get_attribute (media, j); + + if (g_strcmp0 (attr->key, "setup") == 0) { + fail_unless (have_setup == FALSE, + "duplicate/multiple setup for media %u", j); + have_setup = TRUE; + fail_unless_equals_string (attr->value, expected_setup[i]); + } + } + fail_unless (have_setup, "no setup attribute in media %u", i); + } +} + +static void add_fake_audio_src_harness (GstHarness * h, gint pt) { GstCaps *caps = gst_caps_from_string (OPUS_RTP_CAPS (pt)); @@ -856,8 +1021,24 @@ GST_START_TEST (test_audio) { struct test_webrtc *t = create_audio_test (); - VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); - VAL_SDP_INIT (answer, 
_count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint media_format_count[] = { 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &media_formats); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, &count); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &count); + const gchar *expected_offer_direction[] = { "sendrecv", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); /* check that a single stream connection creates the associated number * of media sections */ @@ -868,6 +1049,78 @@ GST_END_TEST; +static void +_check_ice_port_restriction (struct test_webrtc *t, GstElement * element, + guint mlineindex, gchar * candidate, GstElement * other, gpointer user_data) +{ + GRegex *regex; + GMatchInfo *match_info; + + gchar *candidate_port; + gchar *candidate_protocol; + gchar *candidate_typ; + guint port_as_int; + guint peer_number; + + regex = + g_regex_new ("candidate:(\\d+) (1) (UDP|TCP) (\\d+) ([0-9.]+|[0-9a-f:]+)" + " (\\d+) typ ([a-z]+)", 0, 0, NULL); + + g_regex_match (regex, candidate, 0, &match_info); + fail_unless (g_match_info_get_match_count (match_info) == 8, candidate); + + candidate_protocol = g_match_info_fetch (match_info, 2); + candidate_port = g_match_info_fetch (match_info, 6); + candidate_typ = g_match_info_fetch (match_info, 7); + + peer_number = t->webrtc1 == element ? 
1 : 2; + + port_as_int = atoi (candidate_port); + + if (!g_strcmp0 (candidate_typ, "host") && port_as_int != 9) { + guint expected_min = peer_number * 10000 + 1000; + guint expected_max = expected_min + 999; + + fail_unless (port_as_int >= expected_min); + fail_unless (port_as_int <= expected_max); + } + + g_free (candidate_port); + g_free (candidate_protocol); + g_free (candidate_typ); + g_match_info_free (match_info); + g_regex_unref (regex); +} + +GST_START_TEST (test_ice_port_restriction) +{ + struct test_webrtc *t = create_audio_test (); + GObject *webrtcice; + + VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (answer, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + + /* + * Ports are defined as follows "{peer}{protocol}000" + * - peer number: "1" for t->webrtc1, "2" for t->webrtc2 + */ + g_object_get (t->webrtc1, "ice-agent", &webrtcice, NULL); + g_object_set (webrtcice, "min-rtp-port", 11000, "max-rtp-port", 11999, NULL); + g_object_unref (webrtcice); + + g_object_get (t->webrtc2, "ice-agent", &webrtcice, NULL); + g_object_set (webrtcice, "min-rtp-port", 21000, "max-rtp-port", 21999, NULL); + g_object_unref (webrtcice); + + t->on_ice_candidate = _check_ice_port_restriction; + test_validate_sdp (t, &offer, &answer); + + test_webrtc_wait_for_ice_gathering_complete (t); + test_webrtc_free (t); +} + +GST_END_TEST; + static struct test_webrtc * create_audio_video_test (void) { @@ -884,8 +1137,24 @@ GST_START_TEST (test_audio_video) { struct test_webrtc *t = create_audio_video_test (); - VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); - VAL_SDP_INIT (answer, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint media_format_count[] = { 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (count, _count_num_sdp_media, 
GUINT_TO_POINTER (2), + &media_formats); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, &count); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &count); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); /* check that a dual stream connection creates the associated number * of media sections */ @@ -896,64 +1165,29 @@ GST_END_TEST; -static void -on_sdp_media_direction (struct test_webrtc *t, GstElement * element, - GstWebRTCSessionDescription * desc, gpointer user_data) -{ - gchar **expected_directions = user_data; - int i; - - for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { - const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); - - if (g_strcmp0 (gst_sdp_media_get_media (media), "audio") == 0 - || g_strcmp0 (gst_sdp_media_get_media (media), "video") == 0) { - gboolean have_direction = FALSE; - int j; - - for (j = 0; j < gst_sdp_media_attributes_len (media); j++) { - const GstSDPAttribute *attr = gst_sdp_media_get_attribute (media, j); - - if (g_strcmp0 (attr->key, "inactive") == 0) { - fail_unless (have_direction == FALSE, - "duplicate/multiple directions for media %u", j); - have_direction = TRUE; - fail_unless (g_strcmp0 (attr->key, expected_directions[i]) == 0); - } else if (g_strcmp0 (attr->key, "sendonly") == 0) { - fail_unless (have_direction == FALSE, - "duplicate/multiple directions for media %u", j); - have_direction = TRUE; - fail_unless (g_strcmp0 (attr->key, expected_directions[i]) == 0); - } else if (g_strcmp0 (attr->key, "recvonly") == 0) { - fail_unless (have_direction 
== FALSE, - "duplicate/multiple directions for media %u", j); - have_direction = TRUE; - fail_unless (g_strcmp0 (attr->key, expected_directions[i]) == 0); - } else if (g_strcmp0 (attr->key, "sendrecv") == 0) { - fail_unless (have_direction == FALSE, - "duplicate/multiple directions for media %u", j); - have_direction = TRUE; - fail_unless (g_strcmp0 (attr->key, expected_directions[i]) == 0); - } - } - fail_unless (have_direction, "no direction attribute in media %u", i); - } - } -} - GST_START_TEST (test_media_direction) { struct test_webrtc *t = create_audio_video_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv", "recvonly" }; + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint media_format_count[] = { 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, &count); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &count); + + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendrecv", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); GstHarness *h; - VAL_SDP_INIT (offer_direction, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (2), - &offer_direction); - VAL_SDP_INIT (answer_direction, on_sdp_media_direction, expected_answer, - NULL); - VAL_SDP_INIT (answer, _count_num_sdp_media, GUINT_TO_POINTER (2), - 
&answer_direction); /* check the default media directions for transceivers */ @@ -972,9 +1206,10 @@ GstWebRTCSessionDescription * desc, gpointer user_data) { const GstSDPMedia *vmedia; + guint video_mline = GPOINTER_TO_UINT (user_data); guint j; - vmedia = gst_sdp_message_get_media (desc->sdp, 1); + vmedia = gst_sdp_message_get_media (desc->sdp, video_mline); for (j = 0; j < gst_sdp_media_attributes_len (vmedia); j++) { const GstSDPAttribute *attr = gst_sdp_media_get_attribute (vmedia, j); @@ -990,6 +1225,8 @@ fail_unless_equals_string (attr->value, "99 rtx/90000"); } else if (g_str_has_prefix (attr->value, "100")) { fail_unless_equals_string (attr->value, "100 rtx/90000"); + } else if (g_str_has_prefix (attr->value, "101")) { + fail_unless_equals_string (attr->value, "101 H264/90000"); } } } @@ -1000,8 +1237,19 @@ GST_START_TEST (test_payload_types) { struct test_webrtc *t = create_audio_video_test (); - VAL_SDP_INIT (payloads, on_sdp_media_payload_types, NULL, NULL); - VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (2), &payloads); + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint media_format_count[] = { 1, 5, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (payloads, on_sdp_media_payload_types, GUINT_TO_POINTER (1), + &media_formats); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), &payloads); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, &count); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); GstWebRTCRTPTransceiver *trans; GArray *transceivers; @@ -1019,47 +1267,6 @@ GST_END_TEST; -static void -on_sdp_media_setup (struct test_webrtc *t, GstElement * element, - GstWebRTCSessionDescription * desc, gpointer 
user_data) -{ - gchar **expected_setup = user_data; - int i; - - for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { - const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); - gboolean have_setup = FALSE; - int j; - - for (j = 0; j < gst_sdp_media_attributes_len (media); j++) { - const GstSDPAttribute *attr = gst_sdp_media_get_attribute (media, j); - - if (g_strcmp0 (attr->key, "setup") == 0) { - fail_unless (have_setup == FALSE, - "duplicate/multiple setup for media %u", j); - have_setup = TRUE; - fail_unless (g_strcmp0 (attr->value, expected_setup[i]) == 0); - } - } - fail_unless (have_setup, "no setup attribute in media %u", i); - } -} - -GST_START_TEST (test_media_setup) -{ - struct test_webrtc *t = create_audio_test (); - const gchar *expected_offer[] = { "actpass" }; - const gchar *expected_answer[] = { "active" }; - VAL_SDP_INIT (offer, on_sdp_media_setup, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_setup, expected_answer, NULL); - - /* check the default dtls setup negotiation values */ - test_validate_sdp (t, &offer, &answer); - test_webrtc_free (t); -} - -GST_END_TEST; - GST_START_TEST (test_no_nice_elements_request_pad) { struct test_webrtc *t = test_webrtc_new (); @@ -1081,7 +1288,7 @@ t->bus_message = NULL; - pad = gst_element_get_request_pad (t->webrtc1, "sink_0"); + pad = gst_element_request_pad_simple (t->webrtc1, "sink_0"); fail_unless (pad == NULL); test_webrtc_wait_for_answer_error_eos (t); @@ -1384,14 +1591,15 @@ GST_START_TEST (test_add_transceiver) { struct test_webrtc *t = test_webrtc_new (); - GstWebRTCRTPTransceiverDirection direction; + GstWebRTCRTPTransceiverDirection direction, trans_direction; GstWebRTCRTPTransceiver *trans; direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV; g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, NULL, &trans); fail_unless (trans != NULL); - fail_unless_equals_int (direction, trans->direction); + g_object_get (trans, "direction", &trans_direction, 
NULL); + fail_unless_equals_int (direction, trans_direction); gst_object_unref (trans); @@ -1425,10 +1633,24 @@ struct test_webrtc *t = test_webrtc_new (); GstWebRTCRTPTransceiverDirection direction; GstWebRTCRTPTransceiver *trans; - const gchar *expected_offer[] = { "recvonly" }; - const gchar *expected_answer[] = { "sendonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &media_formats); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, &count); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &count); + const gchar *expected_offer_direction[] = { "recvonly", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); GstCaps *caps; GstHarness *h; @@ -1463,10 +1685,24 @@ struct test_webrtc *t = test_webrtc_new (); GstWebRTCRTPTransceiverDirection direction; GstWebRTCRTPTransceiver *trans; - const gchar *expected_offer[] = { "recvonly", "sendonly" }; - const gchar *expected_answer[] = { "sendonly", "recvonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, 
on_sdp_media_count_formats, + media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, &count); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &count); + const gchar *expected_offer_direction[] = { "recvonly", "sendonly" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); GstCaps *caps; GstHarness *h; GArray *transceivers; @@ -1492,8 +1728,10 @@ t->harnesses = g_list_prepend (t->harnesses, h); g_signal_emit_by_name (t->webrtc1, "get-transceivers", &transceivers); fail_unless (transceivers != NULL); + fail_unless_equals_int (transceivers->len, 2); trans = g_array_index (transceivers, GstWebRTCRTPTransceiver *, 1); - trans->direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY; + g_object_set (trans, "direction", + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY, NULL); g_array_unref (transceivers); @@ -1538,8 +1776,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); gchar *label; t->on_negotiation_needed = NULL; @@ -1558,7 +1796,7 @@ g_signal_connect (channel, "on-error", G_CALLBACK (on_channel_error_not_reached), NULL); - test_validate_sdp (t, &offer, &answer); + test_validate_sdp (t, &offer, &offer); g_object_unref (channel); g_free (label); @@ -1592,8 +1830,8 @@ { struct 
test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; @@ -1616,7 +1854,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, &offer, &answer, 1 << STATE_CUSTOM, FALSE); + test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); g_object_unref (channel); test_webrtc_free (t); @@ -1665,8 +1903,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; @@ -1689,7 +1927,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, &offer, &answer, 1 << STATE_CUSTOM, FALSE); + test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); g_object_unref (channel); test_webrtc_free (t); @@ -1746,8 +1984,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; @@ -1770,7 +2008,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, 
&offer, &answer, 1 << STATE_CUSTOM, FALSE); + test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); g_object_unref (channel); test_webrtc_free (t); @@ -1799,8 +2037,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; @@ -1823,7 +2061,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, &offer, &answer, 1 << STATE_CUSTOM, FALSE); + test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); g_object_unref (channel); test_webrtc_free (t); @@ -1831,6 +2069,183 @@ GST_END_TEST; +struct test_data_channel +{ + GObject *dc1; + GObject *dc2; + gint n_open; + gint n_closed; + gint n_destroyed; +}; + +static void +have_data_channel_mark_open (struct test_webrtc *t, + GstElement * element, GObject * our, gpointer user_data) +{ + struct test_data_channel *tdc = t->data_channel_data; + + tdc->dc2 = our; + if (g_atomic_int_add (&tdc->n_open, 1) == 1) { + test_webrtc_signal_state_unlocked (t, STATE_CUSTOM); + } +} + +static gboolean +is_data_channel_open (GObject * channel) +{ + GstWebRTCDataChannelState ready_state = GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED; + + if (channel) { + g_object_get (channel, "ready-state", &ready_state, NULL); + } + + return ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_OPEN; +} + +static void +on_data_channel_open (GObject * channel, GParamSpec * pspec, + struct test_webrtc *t) +{ + struct test_data_channel *tdc = t->data_channel_data; + + if (is_data_channel_open (channel)) { + if (g_atomic_int_add (&tdc->n_open, 1) == 1) { + test_webrtc_signal_state (t, STATE_CUSTOM); + } + } +} + +static void 
+on_data_channel_close (GObject * channel, GParamSpec * pspec, + struct test_webrtc *t) +{ + struct test_data_channel *tdc = t->data_channel_data; + GstWebRTCDataChannelState ready_state; + + g_object_get (channel, "ready-state", &ready_state, NULL); + + if (ready_state == GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED) { + g_atomic_int_add (&tdc->n_closed, 1); + } +} + +static void +on_data_channel_destroyed (gpointer data, GObject * where_the_object_was) +{ + struct test_webrtc *t = data; + struct test_data_channel *tdc = t->data_channel_data; + + if (where_the_object_was == tdc->dc1) { + tdc->dc1 = NULL; + } else if (where_the_object_was == tdc->dc2) { + tdc->dc2 = NULL; + } + + if (g_atomic_int_add (&tdc->n_destroyed, 1) == 1) { + test_webrtc_signal_state (t, STATE_CUSTOM); + } +} + +GST_START_TEST (test_data_channel_close) +{ +#define NUM_CHANNELS 3 + struct test_webrtc *t = test_webrtc_new (); + struct test_data_channel tdc = { NULL, NULL, 0, 0, 0 }; + guint channel_id[NUM_CHANNELS] = { 0, 1, 2 }; + gulong sigid = 0; + int i; + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_data_channel = have_data_channel_mark_open; + t->data_channel_data = &tdc; + + fail_if (gst_element_set_state (t->webrtc1, + GST_STATE_READY) == GST_STATE_CHANGE_FAILURE); + fail_if (gst_element_set_state (t->webrtc2, + GST_STATE_READY) == GST_STATE_CHANGE_FAILURE); + + /* open and close NUM_CHANNELS data channels to verify that we can reuse the + * stream id of a previously closed data channel and that we have the same + * behaviour no matter if we create the channel in READY or PLAYING state */ + for (i = 0; i < NUM_CHANNELS; i++) { + tdc.n_open = 0; + tdc.n_closed = 0; + tdc.n_destroyed = 0; + + g_signal_emit_by_name (t->webrtc1, "create-data-channel", "label", NULL, + &tdc.dc1); + g_assert_nonnull (tdc.dc1); + g_object_unref 
(tdc.dc1); /* webrtcbin should still hold a ref */ + g_object_weak_ref (tdc.dc1, on_data_channel_destroyed, t); + g_signal_connect (tdc.dc1, "on-error", + G_CALLBACK (on_channel_error_not_reached), NULL); + sigid = g_signal_connect (tdc.dc1, "notify::ready-state", + G_CALLBACK (on_data_channel_open), t); + + if (i == 0) { + fail_if (gst_element_set_state (t->webrtc1, + GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); + fail_if (gst_element_set_state (t->webrtc2, + GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); + + test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); + } else { + /* FIXME: Creating a data channel may result in "on-open" being sent + * before we even had a chance to register the signal. For this test we + * want to make sure that the channel is actually open before we try to + * close it. So if we didn't receive the signal we fall back to a 1s + * timeout where we explicitly check if both channels are open. */ + gint64 timeout = g_get_monotonic_time () + 1 * G_TIME_SPAN_SECOND; + g_mutex_lock (&t->lock); + while (((1 << t->state) & STATE_CUSTOM) == 0) { + if (!g_cond_wait_until (&t->cond, &t->lock, timeout)) { + g_assert (is_data_channel_open (tdc.dc1) + && is_data_channel_open (tdc.dc2)); + break; + } + } + g_mutex_unlock (&t->lock); + } + + g_object_get (tdc.dc1, "id", &channel_id[i], NULL); + + g_signal_handler_disconnect (tdc.dc1, sigid); + g_object_weak_ref (tdc.dc2, on_data_channel_destroyed, t); + g_signal_connect (tdc.dc1, "notify::ready-state", + G_CALLBACK (on_data_channel_close), t); + g_signal_connect (tdc.dc2, "notify::ready-state", + G_CALLBACK (on_data_channel_close), t); + test_webrtc_signal_state (t, STATE_NEW); + + /* currently we assume there is no renegotiation if the last data channel is + * removed but if it changes this test could be extended to verify both + * the behaviour of removing the last channel as well as the behaviour when + * there are still data channels remaining */ + t->on_negotiation_needed = 
_negotiation_not_reached; + g_signal_emit_by_name (tdc.dc1, "close"); + + test_webrtc_wait_for_state_mask (t, 1 << STATE_CUSTOM); + + assert_equals_int (g_atomic_int_get (&tdc.n_closed), 2); + assert_equals_pointer (tdc.dc1, NULL); + assert_equals_pointer (tdc.dc2, NULL); + + test_webrtc_signal_state (t, STATE_NEW); + } + + /* verify the same stream id has been reused for each data channel */ + assert_equals_int (channel_id[0], channel_id[1]); + assert_equals_int (channel_id[0], channel_id[2]); + + test_webrtc_free (t); +#undef NUM_CHANNELS +} + +GST_END_TEST; + static void on_buffered_amount_low_emitted (GObject * channel, struct test_webrtc *t) { @@ -1854,8 +2269,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; @@ -1878,7 +2293,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, &offer, &answer, 1 << STATE_CUSTOM, FALSE); + test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); g_object_unref (channel); test_webrtc_free (t); @@ -1923,8 +2338,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; @@ -1945,7 +2360,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, &offer, &answer, 1 << STATE_CUSTOM, FALSE); 
+ test_validate_sdp_full (t, &offer, &offer, 1 << STATE_CUSTOM, FALSE); g_object_unref (channel); test_webrtc_free (t); @@ -1973,8 +2388,8 @@ { struct test_webrtc *t = test_webrtc_new (); GObject *channel1 = NULL, *channel2 = NULL; - VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, NULL); - VAL_SDP_INIT (answer, on_sdp_has_datachannel, NULL, NULL); + VAL_SDP_INIT (media_count, _count_num_sdp_media, GUINT_TO_POINTER (1), NULL); + VAL_SDP_INIT (offer, on_sdp_has_datachannel, NULL, &media_count); GstStructure *s; gint n_ready = 0; @@ -2001,7 +2416,7 @@ fail_if (gst_element_set_state (t->webrtc2, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); - test_validate_sdp_full (t, &offer, &answer, 0, FALSE); + test_validate_sdp_full (t, &offer, &offer, 0, FALSE); t->data_channel_data = &n_ready; @@ -2053,7 +2468,7 @@ GStrv expected = user_data; guint i; - fail_unless (_parse_bundle (sd->sdp, &bundled)); + fail_unless (_parse_bundle (sd->sdp, &bundled, NULL)); if (!bundled) { fail_unless_equals_int (g_strv_length (expected), 0); @@ -2091,16 +2506,33 @@ const gchar *offer_bundle_only[] = { "video1", NULL }; const gchar *answer_bundle_only[] = { NULL }; - VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); - VAL_SDP_INIT (bundle_tag, _check_bundle_tag, bundle, &count); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + VAL_SDP_INIT (bundle_tag, _check_bundle_tag, bundle, &payloads); VAL_SDP_INIT (offer_non_reject, _count_non_rejected_media, GUINT_TO_POINTER (1), &bundle_tag); VAL_SDP_INIT (answer_non_reject, _count_non_rejected_media, GUINT_TO_POINTER (2), &bundle_tag); - VAL_SDP_INIT (offer, _check_bundle_only_media, &offer_bundle_only, + VAL_SDP_INIT (offer_bundle, _check_bundle_only_media, &offer_bundle_only, 
&offer_non_reject); - VAL_SDP_INIT (answer, _check_bundle_only_media, &answer_bundle_only, + VAL_SDP_INIT (answer_bundle, _check_bundle_only_media, &answer_bundle_only, &answer_non_reject); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &offer_bundle); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &answer_bundle); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); /* We set a max-bundle policy on the offering webrtcbin, * this means that all the offered medias should be part @@ -2130,12 +2562,28 @@ const gchar *bundle[] = { "audio0", "video1", NULL }; const gchar *bundle_only[] = { NULL }; - VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); VAL_SDP_INIT (bundle_tag, _check_bundle_tag, bundle, &count); VAL_SDP_INIT (count_non_reject, _count_non_rejected_media, GUINT_TO_POINTER (2), &bundle_tag); VAL_SDP_INIT (bundle_sdp, _check_bundle_only_media, &bundle_only, &count_non_reject); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &bundle_sdp); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &bundle_sdp); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, 
on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); /* We set a max-compat policy on the offering webrtcbin, * this means that all the offered medias should be part @@ -2152,7 +2600,7 @@ gst_util_set_object_arg (G_OBJECT (t->webrtc2), "bundle-policy", "max-bundle"); - test_validate_sdp (t, &bundle_sdp, &bundle_sdp); + test_validate_sdp (t, &offer, &answer); test_webrtc_free (t); } @@ -2162,22 +2610,39 @@ GST_START_TEST (test_bundle_audio_video_max_bundle_none) { struct test_webrtc *t = create_audio_video_test (); - const gchar *offer_bundle[] = { "audio0", "video1", NULL }; + const gchar *offer_mid[] = { "audio0", "video1", NULL }; const gchar *offer_bundle_only[] = { "video1", NULL }; - const gchar *answer_bundle[] = { NULL }; + const gchar *answer_mid[] = { NULL }; const gchar *answer_bundle_only[] = { NULL }; - VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); VAL_SDP_INIT (count_non_reject, _count_non_rejected_media, - GUINT_TO_POINTER (1), &count); - VAL_SDP_INIT (offer_bundle_tag, _check_bundle_tag, offer_bundle, + GUINT_TO_POINTER (1), &payloads); + VAL_SDP_INIT (offer_bundle_tag, _check_bundle_tag, offer_mid, &count_non_reject); - VAL_SDP_INIT (answer_bundle_tag, _check_bundle_tag, answer_bundle, + VAL_SDP_INIT (answer_bundle_tag, _check_bundle_tag, answer_mid, &count_non_reject); - VAL_SDP_INIT (offer, _check_bundle_only_media, &offer_bundle_only, + VAL_SDP_INIT (offer_bundle, _check_bundle_only_media, &offer_bundle_only, &offer_bundle_tag); - VAL_SDP_INIT (answer, 
_check_bundle_only_media, &answer_bundle_only, + VAL_SDP_INIT (answer_bundle, _check_bundle_only_media, &answer_bundle_only, &answer_bundle_tag); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &offer_bundle); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &answer_bundle); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); /* We set a max-bundle policy on the offering webrtcbin, * this means that all the offered medias should be part @@ -2203,21 +2668,40 @@ GST_START_TEST (test_bundle_audio_video_data) { struct test_webrtc *t = create_audio_video_test (); - const gchar *bundle[] = { "audio0", "video1", "application2", NULL }; + const gchar *mids[] = { "audio0", "video1", "application2", NULL }; const gchar *offer_bundle_only[] = { "video1", "application2", NULL }; const gchar *answer_bundle_only[] = { NULL }; GObject *channel = NULL; - VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (3), NULL); - VAL_SDP_INIT (bundle_tag, _check_bundle_tag, bundle, &count); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (3), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + VAL_SDP_INIT (bundle_tag, _check_bundle_tag, mids, &payloads); VAL_SDP_INIT (offer_non_reject, _count_non_rejected_media, GUINT_TO_POINTER (1), &bundle_tag); VAL_SDP_INIT (answer_non_reject, _count_non_rejected_media, GUINT_TO_POINTER (3), &bundle_tag); - 
VAL_SDP_INIT (offer, _check_bundle_only_media, &offer_bundle_only, + VAL_SDP_INIT (offer_bundle, _check_bundle_only_media, &offer_bundle_only, &offer_non_reject); - VAL_SDP_INIT (answer, _check_bundle_only_media, &answer_bundle_only, + VAL_SDP_INIT (answer_bundle, _check_bundle_only_media, &answer_bundle_only, &answer_non_reject); + const gchar *expected_offer_setup[] = { "actpass", "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &offer_bundle); + const gchar *expected_answer_setup[] = { "active", "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &answer_bundle); + const gchar *expected_offer_direction[] = + { "sendrecv", "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = + { "recvonly", "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); /* We set a max-bundle policy on the offering webrtcbin, * this means that all the offered medias should be part @@ -2253,10 +2737,24 @@ GST_START_TEST (test_duplicate_nego) { struct test_webrtc *t = create_audio_video_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv", "recvonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] 
= { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendrecv", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); GstHarness *h; guint negotiation_flag = 0; @@ -2270,7 +2768,7 @@ t->harnesses = g_list_prepend (t->harnesses, h); test_validate_sdp (t, &offer, &answer); - fail_unless_equals_int (negotiation_flag, 1); + fail_unless (negotiation_flag & (1 << 2)); test_webrtc_reset_negotiation (t); test_validate_sdp (t, &offer, &answer); @@ -2283,13 +2781,28 @@ GST_START_TEST (test_dual_audio) { struct test_webrtc *t = create_audio_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv", }; - const gchar *expected_answer[] = { "sendrecv", "recvonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendrecv", "recvonly" }; + VAL_SDP_INIT (answer, 
on_sdp_media_direction, expected_answer_direction, + &answer_setup); GstHarness *h; GstWebRTCRTPTransceiver *trans; GArray *transceivers; + guint mline; /* test that each mline gets a unique transceiver even with the same caps */ @@ -2310,11 +2823,13 @@ trans = g_array_index (transceivers, GstWebRTCRTPTransceiver *, 0); fail_unless (trans != NULL); - fail_unless_equals_int (trans->mline, 0); + g_object_get (trans, "mlineindex", &mline, NULL); + fail_unless_equals_int (mline, 0); trans = g_array_index (transceivers, GstWebRTCRTPTransceiver *, 1); fail_unless (trans != NULL); - fail_unless_equals_int (trans->mline, 1); + g_object_get (trans, "mlineindex", &mline, NULL); + fail_unless_equals_int (mline, 1); g_array_unref (transceivers); test_webrtc_free (t); @@ -2446,10 +2961,26 @@ GST_START_TEST (test_renego_add_stream) { struct test_webrtc *t = create_audio_video_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv", "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv", "recvonly", "recvonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = + { "sendrecv", "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar 
*expected_answer_direction[] = + { "sendrecv", "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); VAL_SDP_INIT (renego_mid, sdp_media_equal_mid, NULL, NULL); VAL_SDP_INIT (renego_ice_params, sdp_media_equal_ice_params, NULL, &renego_mid); @@ -2471,8 +3002,8 @@ add_fake_audio_src_harness (h, 98); t->harnesses = g_list_prepend (t->harnesses, h); - offer.next = &renego_fingerprint; - answer.next = &renego_fingerprint; + media_formats.next = &renego_fingerprint; + count.user_data = GUINT_TO_POINTER (3); /* renegotiate! */ test_webrtc_reset_negotiation (t); @@ -2486,11 +3017,25 @@ GST_START_TEST (test_renego_stream_add_data_channel) { struct test_webrtc *t = create_audio_video_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv", "recvonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv", NULL }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendrecv", "recvonly", NULL }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); VAL_SDP_INIT 
(renego_mid, sdp_media_equal_mid, NULL, NULL); VAL_SDP_INIT (renego_ice_params, sdp_media_equal_ice_params, NULL, &renego_mid); @@ -2512,8 +3057,8 @@ g_signal_emit_by_name (t->webrtc1, "create-data-channel", "label", NULL, &channel); - offer.next = &renego_fingerprint; - answer.next = &renego_fingerprint; + media_formats.next = &renego_fingerprint; + count.user_data = GUINT_TO_POINTER (3); /* renegotiate! */ test_webrtc_reset_negotiation (t); @@ -2528,10 +3073,24 @@ GST_START_TEST (test_renego_data_channel_add_stream) { struct test_webrtc *t = test_webrtc_new (); - const gchar *expected_offer[] = { NULL, "sendrecv" }; - const gchar *expected_answer[] = { NULL, "recvonly" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { NULL, "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { NULL, "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); VAL_SDP_INIT (renego_mid, sdp_media_equal_mid, NULL, NULL); VAL_SDP_INIT (renego_ice_params, sdp_media_equal_ice_params, NULL, &renego_mid); @@ -2543,9 +3102,10 @@ GObject *channel; GstHarness *h; - /* negotiate an AV stream and then renegotiate a data channel 
*/ + /* negotiate an data channel and then renegotiate to add a av stream */ t->on_negotiation_needed = NULL; t->on_ice_candidate = NULL; + t->on_data_channel = NULL; t->on_pad_added = _pad_added_fakesink; fail_if (gst_element_set_state (t->webrtc1, @@ -2562,8 +3122,78 @@ add_fake_audio_src_harness (h, 97); t->harnesses = g_list_prepend (t->harnesses, h); - offer.next = &renego_fingerprint; - answer.next = &renego_fingerprint; + media_formats.next = &renego_fingerprint; + count.user_data = GUINT_TO_POINTER (2); + + /* renegotiate! */ + test_webrtc_reset_negotiation (t); + test_validate_sdp_full (t, &offer, &answer, 0, FALSE); + + g_object_unref (channel); + test_webrtc_free (t); +} + +GST_END_TEST; + + +GST_START_TEST (test_renego_stream_data_channel_add_stream) +{ + struct test_webrtc *t = test_webrtc_new (); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", NULL, "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", NULL, "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + VAL_SDP_INIT (renego_mid, sdp_media_equal_mid, NULL, NULL); + VAL_SDP_INIT (renego_ice_params, sdp_media_equal_ice_params, NULL, + &renego_mid); + VAL_SDP_INIT (renego_sess_id, sdp_equal_session_id, NULL, 
&renego_ice_params); + VAL_SDP_INIT (renego_sess_ver, sdp_increasing_session_version, NULL, + &renego_sess_id); + VAL_SDP_INIT (renego_fingerprint, sdp_media_equal_fingerprint, NULL, + &renego_sess_ver); + GObject *channel; + GstHarness *h; + + /* Negotiate a stream and a data channel, then renogotiate with a new stream */ + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_data_channel = NULL; + t->on_pad_added = _pad_added_fakesink; + + h = gst_harness_new_with_element (t->webrtc1, "sink_0", NULL); + add_fake_audio_src_harness (h, 97); + t->harnesses = g_list_prepend (t->harnesses, h); + + fail_if (gst_element_set_state (t->webrtc1, + GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); + fail_if (gst_element_set_state (t->webrtc2, + GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); + + g_signal_emit_by_name (t->webrtc1, "create-data-channel", "label", NULL, + &channel); + + test_validate_sdp_full (t, &offer, &answer, 0, FALSE); + + h = gst_harness_new_with_element (t->webrtc1, "sink_2", NULL); + add_fake_audio_src_harness (h, 97); + t->harnesses = g_list_prepend (t->harnesses, h); + + media_formats.next = &renego_fingerprint; + count.user_data = GUINT_TO_POINTER (3); /* renegotiate! 
*/ test_webrtc_reset_negotiation (t); @@ -2578,15 +3208,31 @@ GST_START_TEST (test_bundle_renego_add_stream) { struct test_webrtc *t = create_audio_video_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv", "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv", "recvonly", "recvonly" }; const gchar *bundle[] = { "audio0", "video1", "audio2", NULL }; const gchar *offer_bundle_only[] = { "video1", "audio2", NULL }; const gchar *answer_bundle_only[] = { NULL }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = + { "sendrecv", "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = + { "sendrecv", "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); - VAL_SDP_INIT (renego_mid, sdp_media_equal_mid, NULL, NULL); + VAL_SDP_INIT (renego_mid, sdp_media_equal_mid, NULL, &payloads); VAL_SDP_INIT (renego_ice_params, sdp_media_equal_ice_params, NULL, &renego_mid); VAL_SDP_INIT (renego_sess_id, sdp_equal_session_id, NULL, &renego_ice_params); @@ -2631,8 +3277,9 @@ add_fake_audio_src_harness (h, 98); t->harnesses = g_list_prepend (t->harnesses, h); 
- offer.next = &offer_bundle_only_sdp; - answer.next = &answer_bundle_only_sdp; + offer_setup.next = &offer_bundle_only_sdp; + answer_setup.next = &answer_bundle_only_sdp; + count.user_data = GUINT_TO_POINTER (3); /* renegotiate! */ test_webrtc_reset_negotiation (t); @@ -2646,12 +3293,28 @@ GST_START_TEST (test_bundle_max_compat_max_bundle_renego_add_stream) { struct test_webrtc *t = create_audio_video_test (); - const gchar *expected_offer[] = { "sendrecv", "sendrecv", "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv", "recvonly", "recvonly" }; const gchar *bundle[] = { "audio0", "video1", "audio2", NULL }; const gchar *bundle_only[] = { NULL }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, 1, 1 }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = + { "sendrecv", "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = + { "sendrecv", "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); VAL_SDP_INIT (renego_mid, sdp_media_equal_mid, NULL, NULL); VAL_SDP_INIT (renego_ice_params, sdp_media_equal_ice_params, NULL, @@ -2694,8 +3357,8 @@ add_fake_audio_src_harness (h, 98); t->harnesses = g_list_prepend 
(t->harnesses, h); - offer.next = &bundle_sdp; - answer.next = &bundle_sdp; + media_formats.next = &bundle_sdp; + count.user_data = GUINT_TO_POINTER (3); /* renegotiate! */ test_webrtc_reset_negotiation (t); @@ -2709,10 +3372,24 @@ GST_START_TEST (test_renego_transceiver_set_direction) { struct test_webrtc *t = create_audio_test (); - const gchar *expected_offer[] = { "sendrecv" }; - const gchar *expected_answer[] = { "sendrecv" }; - VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer, NULL); - VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer, NULL); + guint media_format_count[] = { 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendrecv", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); GstWebRTCRTPTransceiver *transceiver; GstHarness *h; GstPad *pad; @@ -2730,8 +3407,8 @@ fail_unless (transceiver != NULL); g_object_set (transceiver, "direction", GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_INACTIVE, NULL); - expected_offer[0] = "inactive"; - expected_answer[0] = "inactive"; + expected_offer_direction[0] = "inactive"; + expected_answer_direction[0] = "inactive"; /* TODO: also validate EOS events from the inactive change */ @@ -2745,6 +3422,955 @@ GST_END_TEST; +static void +offer_remove_last_media (struct test_webrtc *t, 
GstElement * element, + GstPromise * promise, gpointer user_data) +{ + guint i, n; + GstSDPMessage *new, *old; + const GstSDPOrigin *origin; + const GstSDPConnection *conn; + + old = t->offer_desc->sdp; + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_new (&new)); + + origin = gst_sdp_message_get_origin (old); + conn = gst_sdp_message_get_connection (old); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_set_version (new, + gst_sdp_message_get_version (old))); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_set_origin (new, + origin->username, origin->sess_id, origin->sess_version, + origin->nettype, origin->addrtype, origin->addr)); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_set_session_name (new, + gst_sdp_message_get_session_name (old))); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_set_information (new, + gst_sdp_message_get_information (old))); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_set_uri (new, + gst_sdp_message_get_uri (old))); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_set_connection (new, + conn->nettype, conn->addrtype, conn->address, conn->ttl, + conn->addr_number)); + + n = gst_sdp_message_attributes_len (old); + for (i = 0; i < n; i++) { + const GstSDPAttribute *a = gst_sdp_message_get_attribute (old, i); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_add_attribute (new, + a->key, a->value)); + } + + n = gst_sdp_message_medias_len (old); + fail_unless (n > 0); + for (i = 0; i < n - 1; i++) { + const GstSDPMedia *m = gst_sdp_message_get_media (old, i); + GstSDPMedia *new_m; + + fail_unless_equals_int (GST_SDP_OK, gst_sdp_media_copy (m, &new_m)); + fail_unless_equals_int (GST_SDP_OK, gst_sdp_message_add_media (new, new_m)); + gst_sdp_media_init (new_m); + gst_sdp_media_free (new_m); + } + + gst_webrtc_session_description_free (t->offer_desc); + t->offer_desc = gst_webrtc_session_description_new (GST_WEBRTC_SDP_TYPE_OFFER, + new); +} + +static void +offer_set_produced_error (struct 
test_webrtc *t, GstElement * element, + GstPromise * promise, gpointer user_data) +{ + const GstStructure *reply; + GError *error = NULL; + + reply = gst_promise_get_reply (promise); + fail_unless (gst_structure_get (reply, "error", G_TYPE_ERROR, &error, NULL)); + GST_INFO ("error produced: %s", error->message); + g_clear_error (&error); + + test_webrtc_signal_state_unlocked (t, STATE_CUSTOM); +} + +static void +offer_created_produced_error (struct test_webrtc *t, GstElement * element, + GstPromise * promise, gpointer user_data) +{ + const GstStructure *reply; + GError *error = NULL; + + reply = gst_promise_get_reply (promise); + fail_unless (gst_structure_get (reply, "error", G_TYPE_ERROR, &error, NULL)); + GST_INFO ("error produced: %s", error->message); + g_clear_error (&error); +} + +GST_START_TEST (test_renego_lose_media_fails) +{ + struct test_webrtc *t = create_audio_video_test (); + VAL_SDP_INIT (offer, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); + VAL_SDP_INIT (answer, _count_num_sdp_media, GUINT_TO_POINTER (2), NULL); + + /* check that removing an m=line will produce an error */ + + test_validate_sdp (t, &offer, &answer); + + test_webrtc_reset_negotiation (t); + + t->on_offer_created = offer_remove_last_media; + t->on_offer_set = offer_set_produced_error; + t->on_answer_created = NULL; + + test_webrtc_create_offer (t); + test_webrtc_wait_for_state_mask (t, 1 << STATE_CUSTOM); + + test_webrtc_free (t); +} + +GST_END_TEST; + +GST_START_TEST (test_bundle_codec_preferences_rtx_no_duplicate_payloads) +{ + struct test_webrtc *t = test_webrtc_new (); + GstWebRTCRTPTransceiverDirection direction; + GstWebRTCRTPTransceiver *trans; + guint offer_media_format_count[] = { 2, }; + guint answer_media_format_count[] = { 1, }; + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, NULL); + VAL_SDP_INIT (offer_media_formats, on_sdp_media_count_formats, + offer_media_format_count, &payloads); + VAL_SDP_INIT (answer_media_formats, 
on_sdp_media_count_formats, + answer_media_format_count, &payloads); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &offer_media_formats); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &answer_media_formats); + const gchar *expected_offer_direction[] = { "recvonly", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + GstCaps *caps; + GstHarness *h; + + /* add a transceiver that will only receive an opus stream and check that + * the created offer is marked as recvonly */ + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + gst_util_set_object_arg (G_OBJECT (t->webrtc1), "bundle-policy", + "max-bundle"); + gst_util_set_object_arg (G_OBJECT (t->webrtc2), "bundle-policy", + "max-bundle"); + + /* setup recvonly transceiver */ + caps = gst_caps_from_string (VP8_RTP_CAPS (96)); + direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY; + g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, caps, + &trans); + g_object_set (GST_OBJECT (trans), "do-nack", TRUE, NULL); + gst_caps_unref (caps); + fail_unless (trans != NULL); + gst_object_unref (trans); + + /* setup sendonly peer */ + h = gst_harness_new_with_element (t->webrtc2, "sink_0", NULL); + add_fake_video_src_harness (h, 96); + t->harnesses = g_list_prepend (t->harnesses, h); + test_validate_sdp (t, &offer, &answer); + + test_webrtc_free (t); +} + +GST_END_TEST; + +static void +on_sdp_media_no_duplicate_extmaps (struct test_webrtc *t, GstElement * element, + GstWebRTCSessionDescription * desc, gpointer user_data) +{ + const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, 0); + + 
fail_unless (media != NULL); + + fail_unless_equals_string (gst_sdp_media_get_attribute_val_n (media, "extmap", + 0), "1 foobar"); + + fail_unless (gst_sdp_media_get_attribute_val_n (media, "extmap", 1) == NULL); +} + +/* In this test, we validate that identical extmaps for multiple formats + * in the caps of a single transceiver are deduplicated. This is necessary + * because Firefox will complain about duplicate extmap ids and fail negotiation + * otherwise. */ +GST_START_TEST (test_codec_preferences_no_duplicate_extmaps) +{ + struct test_webrtc *t = test_webrtc_new (); + GstWebRTCRTPTransceiver *trans; + GstWebRTCRTPTransceiverDirection direction; + VAL_SDP_INIT (extmaps, on_sdp_media_no_duplicate_extmaps, NULL, NULL); + GstCaps *caps; + GstStructure *s; + + caps = gst_caps_new_empty (); + + s = gst_structure_from_string (VP8_RTP_CAPS (96), NULL); + gst_structure_set (s, "extmap-1", G_TYPE_STRING, "foobar", NULL); + gst_caps_append_structure (caps, s); + s = gst_structure_from_string (H264_RTP_CAPS (97), NULL); + gst_structure_set (s, "extmap-1", G_TYPE_STRING, "foobar", NULL); + gst_caps_append_structure (caps, s); + + direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY; + g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, caps, + &trans); + gst_caps_unref (caps); + fail_unless (trans != NULL); + + t->on_negotiation_needed = NULL; + t->on_pad_added = NULL; + t->on_ice_candidate = NULL; + + test_validate_sdp (t, &extmaps, NULL); + + test_webrtc_free (t); +} + +GST_END_TEST; + +/* In this test, we validate that trying to use different values + * for the same extmap id in multiple formats in the caps of a + * single transceiver errors out when creating the offer. 
*/ +GST_START_TEST (test_codec_preferences_incompatible_extmaps) +{ + struct test_webrtc *t = test_webrtc_new (); + GstWebRTCRTPTransceiver *trans; + GstWebRTCRTPTransceiverDirection direction; + GstCaps *caps; + GstStructure *s; + + caps = gst_caps_new_empty (); + + s = gst_structure_from_string (VP8_RTP_CAPS (96), NULL); + gst_structure_set (s, "extmap-1", G_TYPE_STRING, "foobar", NULL); + gst_caps_append_structure (caps, s); + s = gst_structure_from_string (H264_RTP_CAPS (97), NULL); + gst_structure_set (s, "extmap-1", G_TYPE_STRING, "foobaz", NULL); + gst_caps_append_structure (caps, s); + + direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY; + g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, caps, + &trans); + gst_caps_unref (caps); + fail_unless (trans != NULL); + + t->on_negotiation_needed = NULL; + t->on_pad_added = NULL; + t->on_ice_candidate = NULL; + t->on_offer_created = offer_created_produced_error; + + test_validate_sdp_full (t, NULL, NULL, STATE_OFFER_CREATED, TRUE); + + test_webrtc_free (t); +} + +GST_END_TEST; + +/* In this test, we validate that extmap values must be of the correct type */ +GST_START_TEST (test_codec_preferences_invalid_extmap) +{ + struct test_webrtc *t = test_webrtc_new (); + GstWebRTCRTPTransceiver *trans; + GstWebRTCRTPTransceiverDirection direction; + GstCaps *caps; + GstStructure *s; + + caps = gst_caps_new_empty (); + + s = gst_structure_from_string (VP8_RTP_CAPS (96), NULL); + gst_structure_set (s, "extmap-1", G_TYPE_INT, 42, NULL); + gst_caps_append_structure (caps, s); + + direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY; + g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, caps, + &trans); + gst_caps_unref (caps); + fail_unless (trans != NULL); + + t->on_negotiation_needed = NULL; + t->on_pad_added = NULL; + t->on_ice_candidate = NULL; + t->on_offer_created = offer_created_produced_error; + + test_validate_sdp_full (t, NULL, NULL, STATE_OFFER_CREATED, TRUE); + + 
test_webrtc_free (t); +} + +GST_END_TEST; + +GST_START_TEST (test_reject_request_pad) +{ + struct test_webrtc *t = test_webrtc_new (); + GstWebRTCRTPTransceiverDirection direction; + GstWebRTCRTPTransceiver *trans, *trans2; + guint offer_media_format_count[] = { 1, }; + guint answer_media_format_count[] = { 1, }; + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, NULL); + VAL_SDP_INIT (offer_media_formats, on_sdp_media_count_formats, + offer_media_format_count, &payloads); + VAL_SDP_INIT (answer_media_formats, on_sdp_media_count_formats, + answer_media_format_count, &payloads); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &offer_media_formats); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &answer_media_formats); + const gchar *expected_offer_direction[] = { "recvonly", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "sendonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + GstCaps *caps; + GstHarness *h; + GstPad *pad; + GstPadTemplate *templ; + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + gst_util_set_object_arg (G_OBJECT (t->webrtc1), "bundle-policy", + "max-bundle"); + gst_util_set_object_arg (G_OBJECT (t->webrtc2), "bundle-policy", + "max-bundle"); + + /* setup recvonly transceiver */ + caps = gst_caps_from_string (VP8_RTP_CAPS (96)); + direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_RECVONLY; + g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, caps, + &trans); + gst_caps_unref (caps); + fail_unless (trans != NULL); + + h = gst_harness_new_with_element (t->webrtc2, "sink_0", NULL); + add_fake_video_src_harness (h, 96); + t->harnesses = g_list_prepend 
(t->harnesses, h); + + test_validate_sdp (t, &offer, &answer); + + /* This should fail because the direction is wrong */ + pad = gst_element_request_pad_simple (t->webrtc1, "sink_0"); + fail_unless (pad == NULL); + + g_object_set (trans, "direction", + GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDRECV, NULL); + + templ = gst_element_get_pad_template (t->webrtc1, "sink_%u"); + fail_unless (templ != NULL); + + /* This should fail because the caps are wrong */ + caps = gst_caps_from_string (OPUS_RTP_CAPS (96)); + pad = gst_element_request_pad (t->webrtc1, templ, "sink_0", caps); + fail_unless (pad == NULL); + + g_object_set (trans, "codec-preferences", NULL, NULL); + + /* This should fail because the kind doesn't match */ + pad = gst_element_request_pad (t->webrtc1, templ, "sink_0", caps); + fail_unless (pad == NULL); + gst_caps_unref (caps); + + /* This should succeed and give us sink_0 */ + pad = gst_element_request_pad_simple (t->webrtc1, "sink_0"); + fail_unless (pad != NULL); + + g_object_get (pad, "transceiver", &trans2, NULL); + + fail_unless (trans == trans2); + + gst_object_unref (pad); + gst_object_unref (trans); + gst_object_unref (trans2); + + test_webrtc_free (t); +} + +GST_END_TEST; + +static void +_verify_media_types (struct test_webrtc *t, GstElement * element, + GstWebRTCSessionDescription * desc, gpointer user_data) +{ + gchar **media_types = user_data; + int i; + + for (i = 0; i < gst_sdp_message_medias_len (desc->sdp); i++) { + const GstSDPMedia *media = gst_sdp_message_get_media (desc->sdp, i); + + fail_unless_equals_string (gst_sdp_media_get_media (media), media_types[i]); + } +} + +GST_START_TEST (test_reject_create_offer) +{ + struct test_webrtc *t = test_webrtc_new (); + GstHarness *h; + GstPromise *promise; + GstPromiseResult res; + const GstStructure *s; + GError *error = NULL; + + const gchar *media_types[] = { "video", "audio" }; + VAL_SDP_INIT (media_type, _verify_media_types, &media_types, NULL); + guint media_format_count[] = { 1, 1 }; + 
VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &media_type); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", "actpass" }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", "active" }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", "sendrecv" }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", "recvonly" }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + /* setup sendonly peer */ + h = gst_harness_new_with_element (t->webrtc1, "sink_1", NULL); + add_fake_audio_src_harness (h, 96); + t->harnesses = g_list_prepend (t->harnesses, h); + + /* Check that if there is no 0, we can't create an offer with a hole */ + promise = gst_promise_new (); + g_signal_emit_by_name (t->webrtc1, "create-offer", NULL, promise); + res = gst_promise_wait (promise); + fail_unless_equals_int (res, GST_PROMISE_RESULT_REPLIED); + s = gst_promise_get_reply (promise); + fail_unless (s != NULL); + gst_structure_get (s, "error", G_TYPE_ERROR, &error, NULL); + fail_unless (g_error_matches (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE)); + fail_unless (g_str_match_string + ("has locked mline 1 but the whole offer only has 0 sections", + error->message, FALSE)); + g_clear_error (&error); + gst_promise_unref (promise); + + h = gst_harness_new_with_element (t->webrtc1, "sink_%u", NULL); + add_fake_video_src_harness (h, 97); + t->harnesses = g_list_prepend 
(t->harnesses, h); + + /* Adding a second sink, which will fill m-line 0, should fix it */ + test_validate_sdp (t, &offer, &answer); + + test_webrtc_free (t); +} + +GST_END_TEST; + +GST_START_TEST (test_reject_set_description) +{ + struct test_webrtc *t = test_webrtc_new (); + GstHarness *h; + GstPromise *promise; + GstPromiseResult res; + const GstStructure *s; + GError *error = NULL; + GstWebRTCSessionDescription *desc = NULL; + GstPadTemplate *templ; + GstCaps *caps; + GstPad *pad; + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + /* setup peer 1 */ + h = gst_harness_new_with_element (t->webrtc1, "sink_0", NULL); + add_fake_audio_src_harness (h, 96); + t->harnesses = g_list_prepend (t->harnesses, h); + + /* Create a second side with specific video caps */ + templ = gst_element_get_pad_template (t->webrtc2, "sink_%u"); + fail_unless (templ != NULL); + caps = gst_caps_from_string (VP8_RTP_CAPS (97)); + pad = gst_element_request_pad (t->webrtc2, templ, "sink_0", caps); + fail_unless (pad != NULL); + gst_caps_unref (caps); + gst_object_unref (pad); + + /* Create an offer */ + promise = gst_promise_new (); + g_signal_emit_by_name (t->webrtc1, "create-offer", NULL, promise); + res = gst_promise_wait (promise); + fail_unless_equals_int (res, GST_PROMISE_RESULT_REPLIED); + s = gst_promise_get_reply (promise); + fail_unless (s != NULL); + gst_structure_get (s, "offer", GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &desc, + NULL); + fail_unless (desc != NULL); + gst_promise_unref (promise); + + fail_if (gst_element_set_state (t->webrtc2, + GST_STATE_READY) == GST_STATE_CHANGE_FAILURE); + + /* Verify that setting an offer where there is a forced m-line with + a different kind fails. 
*/ + promise = gst_promise_new (); + g_signal_emit_by_name (t->webrtc2, "set-remote-description", desc, promise); + res = gst_promise_wait (promise); + fail_unless_equals_int (res, GST_PROMISE_RESULT_REPLIED); + s = gst_promise_get_reply (promise); + gst_structure_get (s, "error", G_TYPE_ERROR, &error, NULL); + fail_unless (g_error_matches (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE)); + fail_unless (g_str_match_string + ("m-line 0 was locked to video, but SDP has audio media", error->message, + FALSE)); + + g_clear_error (&error); + fail_unless (s != NULL); + gst_promise_unref (promise); + gst_webrtc_session_description_free (desc); + + test_webrtc_free (t); +} + +GST_END_TEST; + +GST_START_TEST (test_force_second_media) +{ + struct test_webrtc *t = test_webrtc_new (); + const gchar *media_types[] = { "audio" }; + VAL_SDP_INIT (media_type, _verify_media_types, &media_types, NULL); + guint media_format_count[] = { 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, &media_type); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &media_formats); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &media_formats); + const gchar *expected_offer_direction[] = { "sendrecv", }; + VAL_SDP_INIT (offer_direction, on_sdp_media_direction, + expected_offer_direction, &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", }; + VAL_SDP_INIT (answer_direction, on_sdp_media_direction, + expected_answer_direction, &answer_setup); + VAL_SDP_INIT (answer_count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &answer_direction); + VAL_SDP_INIT (offer_count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &offer_direction); + + const gchar *second_media_types[] = { "audio", "video" }; + VAL_SDP_INIT (second_media_type, _verify_media_types, &second_media_types, + 
NULL); + guint second_media_format_count[] = { 1, 1 }; + VAL_SDP_INIT (second_media_formats, on_sdp_media_count_formats, + second_media_format_count, &second_media_type); + const gchar *second_expected_offer_setup[] = { "active", "actpass" }; + VAL_SDP_INIT (second_offer_setup, on_sdp_media_setup, + second_expected_offer_setup, &second_media_formats); + const gchar *second_expected_answer_setup[] = { "passive", "active" }; + VAL_SDP_INIT (second_answer_setup, on_sdp_media_setup, + second_expected_answer_setup, &second_media_formats); + const gchar *second_expected_answer_direction[] = { "sendonly", "recvonly" }; + VAL_SDP_INIT (second_answer_direction, on_sdp_media_direction, + second_expected_answer_direction, &second_answer_setup); + const gchar *second_expected_offer_direction[] = { "recvonly", "sendrecv" }; + VAL_SDP_INIT (second_offer_direction, on_sdp_media_direction, + second_expected_offer_direction, &second_offer_setup); + VAL_SDP_INIT (second_answer_count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &second_answer_direction); + VAL_SDP_INIT (second_offer_count, _count_num_sdp_media, GUINT_TO_POINTER (2), + &second_offer_direction); + + GstHarness *h; + guint negotiation_flag = 0; + GstPadTemplate *templ; + GstCaps *caps; + GstPad *pad; + + /* add a transceiver that will only receive an opus stream and check that + * the created offer is marked as recvonly */ + t->on_negotiation_needed = on_negotiation_needed_hit; + t->negotiation_data = &negotiation_flag; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + /* setup peer */ + h = gst_harness_new_with_element (t->webrtc1, "sink_0", NULL); + add_fake_audio_src_harness (h, 96); + t->harnesses = g_list_prepend (t->harnesses, h); + + /* Create a second side with specific video caps */ + templ = gst_element_get_pad_template (t->webrtc2, "sink_%u"); + fail_unless (templ != NULL); + caps = gst_caps_from_string (VP8_RTP_CAPS (97)); + pad = gst_element_request_pad (t->webrtc2, templ, NULL, 
caps); + gst_caps_unref (caps); + fail_unless (pad != NULL); + h = gst_harness_new_with_element (t->webrtc2, GST_PAD_NAME (pad), NULL); + gst_object_unref (pad); + add_fake_video_src_harness (h, 97); + t->harnesses = g_list_prepend (t->harnesses, h); + + test_validate_sdp (t, &offer_count, &answer_count); + fail_unless (negotiation_flag & 1 << 2); + + test_webrtc_reset_negotiation (t); + + t->offerror = 2; + test_validate_sdp (t, &second_offer_count, &second_answer_count); + + test_webrtc_free (t); +} + +GST_END_TEST; + +GST_START_TEST (test_codec_preferences_caps) +{ + GstHarness *h; + GstPad *pad; + GstWebRTCRTPTransceiver *trans; + GstCaps *caps, *caps2; + + h = gst_harness_new_with_padnames ("webrtcbin", "sink_0", NULL); + pad = gst_element_get_static_pad (h->element, "sink_0"); + + g_object_get (pad, "transceiver", &trans, NULL); + + caps = gst_caps_from_string ("application/x-rtp, media=video," + "encoding-name=VP8, payload=115; application/x-rtp, media=video," + " encoding-name=H264, payload=104"); + g_object_set (trans, "codec-preferences", caps, NULL); + + caps2 = gst_pad_query_caps (pad, NULL); + fail_unless (gst_caps_is_equal (caps, caps2)); + gst_caps_unref (caps2); + gst_caps_unref (caps); + + caps = gst_caps_from_string (VP8_RTP_CAPS (115)); + fail_unless (gst_pad_query_accept_caps (pad, caps)); + gst_harness_set_src_caps (h, g_steal_pointer (&caps)); + + caps = gst_caps_from_string (VP8_RTP_CAPS (99)); + fail_unless (!gst_pad_query_accept_caps (pad, caps)); + gst_caps_unref (caps); + + gst_object_unref (pad); + gst_object_unref (trans); + gst_harness_teardown (h); +} + +GST_END_TEST; + +GST_START_TEST (test_codec_preferences_negotiation_sinkpad) +{ + struct test_webrtc *t = test_webrtc_new (); + guint media_format_count[] = { 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &media_formats); + VAL_SDP_INIT (payloads2, 
on_sdp_media_payload_types, GUINT_TO_POINTER (0), + &count); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &payloads2); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + + GstPad *pad; + GstWebRTCRTPTransceiver *transceiver; + GstHarness *h; + GstCaps *caps; + GstPromise *promise; + GstPromiseResult res; + const GstStructure *s; + GError *error = NULL; + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + h = gst_harness_new_with_element (t->webrtc1, "sink_0", NULL); + pad = gst_element_get_static_pad (t->webrtc1, "sink_0"); + g_object_get (pad, "transceiver", &transceiver, NULL); + caps = gst_caps_from_string (VP8_RTP_CAPS (115) ";" VP8_RTP_CAPS (97)); + g_object_set (transceiver, "codec-preferences", caps, NULL); + gst_caps_unref (caps); + gst_object_unref (transceiver); + gst_object_unref (pad); + + add_fake_video_src_harness (h, 96); + t->harnesses = g_list_prepend (t->harnesses, h); + + promise = gst_promise_new (); + g_signal_emit_by_name (t->webrtc1, "create-offer", NULL, promise); + res = gst_promise_wait (promise); + fail_unless_equals_int (res, GST_PROMISE_RESULT_REPLIED); + s = gst_promise_get_reply (promise); + fail_unless (s != NULL); + gst_structure_get (s, "error", G_TYPE_ERROR, &error, NULL); + fail_unless (g_error_matches (error, GST_WEBRTC_ERROR, + GST_WEBRTC_ERROR_INTERNAL_FAILURE)); + fail_unless (g_str_match_string + ("Caps 
negotiation on pad sink_0 failed against codec preferences", + error->message, FALSE)); + g_clear_error (&error); + gst_promise_unref (promise); + + caps = gst_caps_from_string (VP8_RTP_CAPS (97)); + gst_harness_set_src_caps (h, caps); + + test_validate_sdp (t, &offer, &answer); + + test_webrtc_free (t); +} + +GST_END_TEST; + + +static void +add_audio_test_src_harness (GstHarness * h) +{ +#define L16_CAPS "application/x-rtp, payload=11, media=audio," \ + " encoding-name=L16, clock-rate=44100, ssrc=(uint)3484078952" + GstCaps *caps = gst_caps_from_string (L16_CAPS); + gst_harness_set_src_caps (h, caps); + gst_harness_add_src_parse (h, "audiotestsrc is-live=true ! rtpL16pay ! " + L16_CAPS " ! identity", TRUE); +} + +static void +_pad_added_harness (struct test_webrtc *t, GstElement * element, + GstPad * pad, gpointer user_data) +{ + GstHarness *h; + GstHarness **sink_harness = user_data; + + if (GST_PAD_DIRECTION (pad) != GST_PAD_SRC) + return; + + h = gst_harness_new_with_element (element, NULL, GST_OBJECT_NAME (pad)); + t->harnesses = g_list_prepend (t->harnesses, h); + + if (sink_harness) { + *sink_harness = h; + g_cond_broadcast (&t->cond); + } +} + +static void +new_jitterbuffer_set_fast_start (GstElement * rtpbin, + GstElement * rtpjitterbuffer, guint session_id, guint ssrc, + gpointer user_data) +{ + g_object_set (rtpjitterbuffer, "faststart-min-packets", 1, NULL); +} + +GST_START_TEST (test_codec_preferences_negotiation_srcpad) +{ + struct test_webrtc *t = test_webrtc_new (); + guint media_format_count[] = { 1, }; + VAL_SDP_INIT (media_formats, on_sdp_media_count_formats, + media_format_count, NULL); + VAL_SDP_INIT (count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &media_formats); + VAL_SDP_INIT (payloads, on_sdp_media_no_duplicate_payloads, NULL, &count); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &payloads); + const gchar *expected_answer_setup[] = { "active", }; + 
VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &payloads); + const gchar *expected_offer_direction[] = { "sendrecv", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + VAL_SDP_INIT (answer_non_reject, _count_non_rejected_media, + GUINT_TO_POINTER (0), &count); + GstHarness *h; + GstHarness *sink_harness = NULL; + guint i; + GstElement *rtpbin2; + GstBuffer *buf; + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_harness; + t->pad_added_data = &sink_harness; + + rtpbin2 = gst_bin_get_by_name (GST_BIN (t->webrtc2), "rtpbin"); + fail_unless (rtpbin2 != NULL); + g_signal_connect (rtpbin2, "new-jitterbuffer", + G_CALLBACK (new_jitterbuffer_set_fast_start), NULL); + g_object_unref (rtpbin2); + + h = gst_harness_new_with_element (t->webrtc1, "sink_0", NULL); + add_audio_test_src_harness (h); + t->harnesses = g_list_prepend (t->harnesses, h); + + test_validate_sdp (t, &offer, &answer); + + fail_if (gst_element_set_state (t->webrtc1, + GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); + fail_if (gst_element_set_state (t->webrtc2, + GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE); + + for (i = 0; i < 10; i++) + gst_harness_push_from_src (h); + + g_mutex_lock (&t->lock); + while (sink_harness == NULL) { + gst_harness_push_from_src (h); + g_cond_wait_until (&t->cond, &t->lock, g_get_monotonic_time () + 5000); + } + g_mutex_unlock (&t->lock); + fail_unless (sink_harness->element == t->webrtc2); + + /* Get one buffer out, this makes sure the capsfilter is primed and + * avoids races. 
+ */ + buf = gst_harness_pull (sink_harness); + fail_unless (buf != NULL); + gst_buffer_unref (buf); + + gst_harness_set_sink_caps_str (sink_harness, OPUS_RTP_CAPS (100)); + + test_webrtc_reset_negotiation (t); + test_validate_sdp_full (t, &offer, &answer_non_reject, 0, FALSE); + + test_webrtc_free (t); +} + +GST_END_TEST; + +static void +_on_new_transceiver_codec_preferences_h264 (GstElement * webrtcbin, + GstWebRTCRTPTransceiver * trans, gpointer * user_data) +{ + GstCaps *caps; + + caps = gst_caps_from_string ("application/x-rtp,encoding-name=(string)H264"); + g_object_set (trans, "codec-preferences", caps, NULL); + gst_caps_unref (caps); +} + +static void +on_sdp_media_payload_types_only_h264 (struct test_webrtc *t, + GstElement * element, GstWebRTCSessionDescription * desc, + gpointer user_data) +{ + const GstSDPMedia *vmedia; + guint video_mline = GPOINTER_TO_UINT (user_data); + guint j; + + vmedia = gst_sdp_message_get_media (desc->sdp, video_mline); + + for (j = 0; j < gst_sdp_media_attributes_len (vmedia); j++) { + const GstSDPAttribute *attr = gst_sdp_media_get_attribute (vmedia, j); + + if (!g_strcmp0 (attr->key, "rtpmap")) { + fail_unless_equals_string (attr->value, "101 H264/90000"); + } + } +} + + +GST_START_TEST (test_codec_preferences_in_on_new_transceiver) +{ + struct test_webrtc *t = test_webrtc_new (); + GstWebRTCRTPTransceiverDirection direction; + GstWebRTCRTPTransceiver *trans; + VAL_SDP_INIT (no_duplicate_payloads, on_sdp_media_no_duplicate_payloads, + NULL, NULL); + guint offer_media_format_count[] = { 2 }; + guint answer_media_format_count[] = { 1 }; + VAL_SDP_INIT (offer_media_formats, on_sdp_media_count_formats, + offer_media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (answer_media_formats, on_sdp_media_count_formats, + answer_media_format_count, &no_duplicate_payloads); + VAL_SDP_INIT (offer_count, _count_num_sdp_media, GUINT_TO_POINTER (1), + &offer_media_formats); + VAL_SDP_INIT (answer_count, _count_num_sdp_media, 
GUINT_TO_POINTER (1), + &answer_media_formats); + VAL_SDP_INIT (offer_payloads, on_sdp_media_payload_types, + GUINT_TO_POINTER (0), &offer_count); + VAL_SDP_INIT (answer_payloads, on_sdp_media_payload_types_only_h264, + GUINT_TO_POINTER (0), &answer_count); + const gchar *expected_offer_setup[] = { "actpass", }; + VAL_SDP_INIT (offer_setup, on_sdp_media_setup, expected_offer_setup, + &offer_payloads); + const gchar *expected_answer_setup[] = { "active", }; + VAL_SDP_INIT (answer_setup, on_sdp_media_setup, expected_answer_setup, + &answer_payloads); + const gchar *expected_offer_direction[] = { "sendonly", }; + VAL_SDP_INIT (offer, on_sdp_media_direction, expected_offer_direction, + &offer_setup); + const gchar *expected_answer_direction[] = { "recvonly", }; + VAL_SDP_INIT (answer, on_sdp_media_direction, expected_answer_direction, + &answer_setup); + GstCaps *caps; + GstHarness *h; + + t->on_negotiation_needed = NULL; + t->on_ice_candidate = NULL; + t->on_pad_added = _pad_added_fakesink; + + /* setup sendonly transceiver with VP8 and H264 */ + caps = gst_caps_from_string (VP8_RTP_CAPS (97) ";" H264_RTP_CAPS (101)); + direction = GST_WEBRTC_RTP_TRANSCEIVER_DIRECTION_SENDONLY; + g_signal_emit_by_name (t->webrtc1, "add-transceiver", direction, caps, + &trans); + gst_caps_unref (caps); + fail_unless (trans != NULL); + gst_object_unref (trans); + + /* setup recvonly peer */ + h = gst_harness_new_with_element (t->webrtc2, "sink_0", NULL); + add_fake_video_src_harness (h, 101); + t->harnesses = g_list_prepend (t->harnesses, h); + + /* connect to "on-new-transceiver" to set codec-preferences to H264 */ + g_signal_connect (t->webrtc2, "on-new-transceiver", + G_CALLBACK (_on_new_transceiver_codec_preferences_h264), NULL); + + /* Answer SDP should now have H264 only. 
Without the codec-preferences it + * would only have VP8 because that comes first in the SDP */ + + test_validate_sdp (t, &offer, &answer); + test_webrtc_free (t); +} + +GST_END_TEST; + static Suite * webrtcbin_suite (void) { @@ -2768,9 +4394,9 @@ tcase_add_test (tc, test_sdp_no_media); tcase_add_test (tc, test_session_stats); tcase_add_test (tc, test_audio); + tcase_add_test (tc, test_ice_port_restriction); tcase_add_test (tc, test_audio_video); tcase_add_test (tc, test_media_direction); - tcase_add_test (tc, test_media_setup); tcase_add_test (tc, test_add_transceiver); tcase_add_test (tc, test_get_transceivers); tcase_add_test (tc, test_add_recvonly_transceiver); @@ -2785,18 +4411,34 @@ tcase_add_test (tc, test_bundle_renego_add_stream); tcase_add_test (tc, test_bundle_max_compat_max_bundle_renego_add_stream); tcase_add_test (tc, test_renego_transceiver_set_direction); + tcase_add_test (tc, test_renego_lose_media_fails); + tcase_add_test (tc, + test_bundle_codec_preferences_rtx_no_duplicate_payloads); + tcase_add_test (tc, test_reject_request_pad); + tcase_add_test (tc, test_reject_create_offer); + tcase_add_test (tc, test_reject_set_description); + tcase_add_test (tc, test_force_second_media); + tcase_add_test (tc, test_codec_preferences_caps); + tcase_add_test (tc, test_codec_preferences_negotiation_sinkpad); + tcase_add_test (tc, test_codec_preferences_negotiation_srcpad); + tcase_add_test (tc, test_codec_preferences_in_on_new_transceiver); + tcase_add_test (tc, test_codec_preferences_no_duplicate_extmaps); + tcase_add_test (tc, test_codec_preferences_incompatible_extmaps); + tcase_add_test (tc, test_codec_preferences_invalid_extmap); if (sctpenc && sctpdec) { tcase_add_test (tc, test_data_channel_create); tcase_add_test (tc, test_data_channel_remote_notify); tcase_add_test (tc, test_data_channel_transfer_string); tcase_add_test (tc, test_data_channel_transfer_data); tcase_add_test (tc, test_data_channel_create_after_negotiate); + tcase_add_test (tc, 
test_data_channel_close); tcase_add_test (tc, test_data_channel_low_threshold); tcase_add_test (tc, test_data_channel_max_message_size); tcase_add_test (tc, test_data_channel_pre_negotiated); tcase_add_test (tc, test_bundle_audio_video_data); tcase_add_test (tc, test_renego_stream_add_data_channel); tcase_add_test (tc, test_renego_data_channel_add_stream); + tcase_add_test (tc, test_renego_stream_data_channel_add_stream); } else { GST_WARNING ("Some required elements were not found. " "All datachannel tests are disabled. sctpenc %p, sctpdec %p", sctpenc,
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/libs/av1parser.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/libs/av1parser.c
Changed
@@ -225,7 +225,7 @@
   assert_equals_int (frame.frame_header.loop_filter_params.
       loop_filter_delta_update, 1);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[0], 0);
+      loop_filter_params.loop_filter_ref_deltas[0], 1);
   assert_equals_int (frame.frame_header.
       loop_filter_params.loop_filter_ref_deltas[1], 0);
   assert_equals_int (frame.frame_header.
@@ -233,13 +233,13 @@
   assert_equals_int (frame.frame_header.
       loop_filter_params.loop_filter_ref_deltas[3], 0);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[4], 0);
+      loop_filter_params.loop_filter_ref_deltas[4], -1);
   assert_equals_int (frame.frame_header.
       loop_filter_params.loop_filter_ref_deltas[5], 0);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[6], 0);
+      loop_filter_params.loop_filter_ref_deltas[6], -1);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[7], 0);
+      loop_filter_params.loop_filter_ref_deltas[7], -1);
   assert_equals_int (frame.frame_header.loop_filter_params.
       loop_filter_mode_deltas[0], 0);
   assert_equals_int (frame.frame_header.loop_filter_params.
@@ -333,7 +333,7 @@
   assert_equals_int (frame.frame_header.loop_filter_params.
       loop_filter_delta_update, 1);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[0], 0);
+      loop_filter_params.loop_filter_ref_deltas[0], 1);
   assert_equals_int (frame.frame_header.
       loop_filter_params.loop_filter_ref_deltas[1], 0);
   assert_equals_int (frame.frame_header.
@@ -341,13 +341,13 @@
   assert_equals_int (frame.frame_header.
       loop_filter_params.loop_filter_ref_deltas[3], 0);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[4], 0);
+      loop_filter_params.loop_filter_ref_deltas[4], -1);
   assert_equals_int (frame.frame_header.
       loop_filter_params.loop_filter_ref_deltas[5], 0);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[6], 0);
+      loop_filter_params.loop_filter_ref_deltas[6], -1);
   assert_equals_int (frame.frame_header.
-      loop_filter_params.loop_filter_ref_deltas[7], 0);
+      loop_filter_params.loop_filter_ref_deltas[7], -1);
   assert_equals_int (frame.frame_header.loop_filter_params.
       loop_filter_mode_deltas[0], 0);
   assert_equals_int (frame.frame_header.loop_filter_params.
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/libs/d3d11device.cpp
Added
@@ -0,0 +1,208 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/check/gstcheck.h> +#include <gst/d3d11/gstd3d11.h> +#include <wrl.h> + +/* *INDENT-OFF* */ +using namespace Microsoft::WRL; +/* *INDENT-ON* */ + +static gboolean have_multiple_adapters = FALSE; + +GST_START_TEST (test_device_new) +{ + GstD3D11Device *device = nullptr; + guint adapter_index = G_MAXINT; + + device = gst_d3d11_device_new (0, 0); + fail_unless (GST_IS_D3D11_DEVICE (device)); + + g_object_get (device, "adapter", &adapter_index, nullptr); + fail_unless_equals_int (adapter_index, 0); + gst_clear_object (&device); + + if (have_multiple_adapters) { + device = gst_d3d11_device_new (1, 0); + fail_unless (GST_IS_D3D11_DEVICE (device)); + + g_object_get (device, "adapter", &adapter_index, nullptr); + fail_unless_equals_int (adapter_index, 1); + } + + gst_clear_object (&device); +} + +GST_END_TEST; + +GST_START_TEST (test_device_for_adapter_luid) +{ + GstD3D11Device *device = nullptr; + HRESULT hr; + ComPtr < IDXGIAdapter1 > adapter; + ComPtr < IDXGIFactory1 > factory; + DXGI_ADAPTER_DESC desc; + guint adapter_index = G_MAXINT; + 
gint64 adapter_luid = 0; + gint64 luid; + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (SUCCEEDED (hr)) + hr = factory->EnumAdapters1 (0, &adapter); + + if (SUCCEEDED (hr)) + hr = adapter->GetDesc (&desc); + + if (SUCCEEDED (hr)) { + luid = gst_d3d11_luid_to_int64 (&desc.AdapterLuid); + device = gst_d3d11_device_new_for_adapter_luid (luid, 0); + fail_unless (GST_IS_D3D11_DEVICE (device)); + + g_object_get (device, "adapter", &adapter_index, "adapter-luid", + &adapter_luid, nullptr); + + /* adapter_luid is corresponding to the first enumerated adapter, + * so adapter index should be zero here */ + fail_unless_equals_int (adapter_index, 0); + fail_unless_equals_int64 (adapter_luid, luid); + } + + gst_clear_object (&device); + adapter = nullptr; + + if (have_multiple_adapters) { + if (SUCCEEDED (hr)) + hr = factory->EnumAdapters1 (1, &adapter); + + if (SUCCEEDED (hr)) + hr = adapter->GetDesc (&desc); + + if (SUCCEEDED (hr)) { + luid = gst_d3d11_luid_to_int64 (&desc.AdapterLuid); + device = gst_d3d11_device_new_for_adapter_luid (luid, 0); + fail_unless (GST_IS_D3D11_DEVICE (device)); + + g_object_get (device, "adapter", &adapter_index, "adapter-luid", + &adapter_luid, nullptr); + + fail_unless_equals_int (adapter_index, 1); + fail_unless_equals_int64 (adapter_luid, luid); + } + } + + gst_clear_object (&device); +} + +GST_END_TEST; + +GST_START_TEST (test_device_new_wrapped) +{ + GstD3D11Device *device = nullptr; + GstD3D11Device *device_clone = nullptr; + ID3D11Device *device_handle, *device_handle_clone; + ID3D11DeviceContext *context_handle, *context_handle_clone; + guint adapter_index = 0; + guint index; + gint64 luid, luid_clone; + + if (have_multiple_adapters) + adapter_index = 1; + + device = gst_d3d11_device_new (adapter_index, 0); + fail_unless (GST_IS_D3D11_DEVICE (device)); + + device_handle = gst_d3d11_device_get_device_handle (device); + fail_unless (device_handle != nullptr); + + context_handle = gst_d3d11_device_get_device_context_handle 
(device); + fail_unless (context_handle != nullptr); + + g_object_get (device, "adapter", &index, "adapter-luid", &luid, nullptr); + fail_unless_equals_int (index, adapter_index); + + device_clone = gst_d3d11_device_new_wrapped (device_handle); + fail_unless (GST_IS_D3D11_DEVICE (device_clone)); + + device_handle_clone = gst_d3d11_device_get_device_handle (device_clone); + fail_unless_equals_pointer (device_handle, device_handle_clone); + + context_handle_clone = + gst_d3d11_device_get_device_context_handle (device_clone); + fail_unless_equals_pointer (context_handle, context_handle_clone); + + g_object_get (device_clone, + "adapter", &index, "adapter-luid", &luid_clone, nullptr); + fail_unless_equals_int (index, adapter_index); + fail_unless_equals_int64 (luid, luid_clone); + + gst_clear_object (&device); + gst_clear_object (&device_clone); +} + +GST_END_TEST; + +static gboolean +check_d3d11_available (void) +{ + HRESULT hr; + ComPtr < IDXGIAdapter1 > adapter; + ComPtr < IDXGIFactory1 > factory; + + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (FAILED (hr)) + return FALSE; + + hr = factory->EnumAdapters1 (0, &adapter); + if (FAILED (hr)) + return FALSE; + + adapter = nullptr; + hr = factory->EnumAdapters1 (1, &adapter); + if (SUCCEEDED (hr)) + have_multiple_adapters = TRUE; + + return TRUE; +} + +static Suite * +d3d11device_suite (void) +{ + Suite *s = suite_create ("d3d11device"); + TCase *tc_basic = tcase_create ("general"); + + suite_add_tcase (s, tc_basic); + + if (!check_d3d11_available ()) + goto out; + + tcase_add_test (tc_basic, test_device_new); + tcase_add_test (tc_basic, test_device_for_adapter_luid); + tcase_add_test (tc_basic, test_device_new_wrapped); + +out: + return s; +} + +GST_CHECK_MAIN (d3d11device);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/libs/h265parser.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/libs/h265parser.c
Changed
@@ -390,6 +390,15 @@ ptl->lower_bit_rate_constraint_flag = lower_bit_rate_constraint_flag; } +static void +set_chroma_idc_and_depth (GstH265SPS * sps, guint8 chroma_idc, + guint8 depth_luma, guint8 depth_chroma) +{ + sps->chroma_format_idc = chroma_idc; + sps->bit_depth_luma_minus8 = depth_luma - 8; + sps->bit_depth_chroma_minus8 = depth_chroma - 8; +} + GST_START_TEST (test_h265_format_range_profiles_exact_match) { /* Test all the combinations from Table A.2 */ @@ -558,9 +567,8 @@ ptl.profile_idc = 11; set_format_range_fields (&ptl, 1, 1, 1, 0, 0, 0, 0, 0, 0, 1); - /* identical to screen-extended-main-444-10 */ g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, - GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444_10); + GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_10); } GST_END_TEST; @@ -568,73 +576,91 @@ GST_START_TEST (test_h265_format_range_profiles_partial_match) { /* Test matching compatible profiles from non-standard bitstream */ - GstH265ProfileTierLevel ptl; + GstH265SPS sps; + GstH265ProfileTierLevel *ptl = &sps.profile_tier_level; - memset (&ptl, 0, sizeof (ptl)); - ptl.profile_idc = 4; - set_format_range_fields (&ptl, 0, 1, 1, 1, 1, 0, 0, 0, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + memset (&sps, 0, sizeof (sps)); + ptl->profile_idc = 4; + set_format_range_fields (ptl, 0, 1, 1, 1, 1, 0, 0, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_MAIN_444); - ptl.profile_idc = 5; + ptl->profile_idc = 5; /* wrong max_monochrome_constraint_flag, should still be compatible with GST_H265_PROFILE_HIGH_THROUGHPUT_444_10 */ - set_format_range_fields (&ptl, 1, 1, 1, 0, 0, 0, 1, 0, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 1, 1, 1, 0, 0, 0, 1, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_HIGH_THROUGHPUT_444_10); /* wrong 
max_12bit_constraint_flag, should still be compatible with GST_H265_PROFILE_HIGH_THROUGHPUT_444_14 */ - set_format_range_fields (&ptl, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_HIGH_THROUGHPUT_444_14); /* wrong intra_constraint_flag, GST_H265_PROFILE_HIGH_THROUGHPUT_444_14 and GST_H265_PROFILE_HIGH_THROUGHPUT_444_16_INTRA are both compatible, but GST_H265_PROFILE_HIGH_THROUGHPUT_444_16_INTRA should be chosen because of the higher priority. */ - set_format_range_fields (&ptl, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_HIGH_THROUGHPUT_444_16_INTRA); - ptl.profile_idc = 6; + ptl->profile_idc = 6; /* wrong max_12bit_constraint_flag, should not be compatible with any */ - set_format_range_fields (&ptl, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_INVALID); - ptl.profile_idc = 7; + ptl->profile_idc = 7; /* wrong max_monochrome_constraint_flag, and intra_constraint_flag, still compatible with GST_H265_PROFILE_SCALABLE_MAIN_10 */ - set_format_range_fields (&ptl, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_SCALABLE_MAIN_10); - ptl.profile_idc = 8; + ptl->profile_idc = 8; /* wrong one_picture_only_constraint_flag, still compatible with 
GST_H265_PROFILE_3D_MAIN */ - set_format_range_fields (&ptl, 0, 1, 1, 1, 1, 1, 0, 0, 1, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 0, 1, 1, 1, 1, 1, 0, 0, 1, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_3D_MAIN); - ptl.profile_idc = 9; + ptl->profile_idc = 9; /* wrong one_picture_only_constraint_flag, still compatible with GST_H265_PROFILE_SCREEN_EXTENDED_MAIN */ - set_format_range_fields (&ptl, 1, 1, 1, 1, 1, 1, 0, 0, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 1, 1, 1, 1, 1, 1, 0, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_SCREEN_EXTENDED_MAIN); + /* wrong indications but have right chroma_format_idc and bit_depth in SPS, + should be recognized as GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444 */ + set_format_range_fields (ptl, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, + GST_H265_PROFILE_INVALID); + set_chroma_idc_and_depth (&sps, 3, 8, 8); + g_assert_cmpuint (gst_h265_get_profile_from_sps (&sps), ==, + GST_H265_PROFILE_SCREEN_EXTENDED_MAIN_444); - ptl.profile_idc = 10; + ptl->profile_idc = 10; /* wrong max_10bit_constraint_flag, still compatible with GST_H265_PROFILE_SCALABLE_MONOCHROME_16 */ - set_format_range_fields (&ptl, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1); - g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_SCALABLE_MONOCHROME_16); - ptl.profile_idc = 11; + ptl->profile_idc = 11; /* wrong max_12bit_constraint_flag and max_422chroma_constraint_flag, should be recognized as GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14 */ - set_format_range_fields (&ptl, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1); - g_assert_cmpuint 
(gst_h265_profile_tier_level_get_profile (&ptl), ==, + set_format_range_fields (ptl, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1); + g_assert_cmpuint (gst_h265_profile_tier_level_get_profile (ptl), ==, GST_H265_PROFILE_SCREEN_EXTENDED_HIGH_THROUGHPUT_444_14); + + ptl->profile_idc = 2; + /* main and main10 compatibility flags but with 10 bith depth */ + ptl->profile_compatibility_flag[1] = 1; + ptl->profile_compatibility_flag[2] = 1; + set_format_range_fields (ptl, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0); + set_chroma_idc_and_depth (&sps, 1, 10, 10); + g_assert_cmpuint (gst_h265_get_profile_from_sps (&sps), ==, + GST_H265_PROFILE_MAIN_10); } GST_END_TEST;
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/libs/mpegts.c -> gst-plugins-bad-1.20.1.tar.xz/tests/check/libs/mpegts.c
Changed
@@ -16,6 +16,9 @@
  * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
  * Boston, MA 02110-1301, USA.
  */
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
 
 #include <gst/check/gstcheck.h>
 #include <gst/mpegts/mpegts.h>
@@ -509,7 +512,7 @@
   guint8 *data;
   GstDateTime *dt;
 
-  data = g_memdup (stt_data_check, 20);
+  data = g_memdup2 (stt_data_check, 20);
   section = gst_mpegts_section_new (0x1ffb, data, 20);
   stt = gst_mpegts_section_get_atsc_stt (section);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/check/libs/play.c
Added
@@ -0,0 +1,1838 @@ +/* GStreamer + * + * Copyright (C) 2014-2015 Sebastian Dröge <sebastian@centricular.com> + * Copyright (C) 2015 Brijesh Singh <brijesh.ksingh@gmail.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +/* TODO: + * - start with pause, go to playing + * - play, pause, play + * - set uri in play/pause + * - play/pause after eos + * - seek in play/pause/stopped, after eos, back to 0, after duration + * - http buffering + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#ifdef HAVE_VALGRIND +# include <valgrind/valgrind.h> +#endif + +#define SOUP_VERSION_MIN_REQUIRED (SOUP_VERSION_2_40) +#include <libsoup/soup.h> +#if !defined(SOUP_MINOR_VERSION) || SOUP_MINOR_VERSION < 44 +#define SoupStatus SoupKnownStatusCode +#endif + +#include <gst/check/gstcheck.h> + +#define fail_unless_equals_int(a, b) \ +G_STMT_START { \ + int first = a; \ + int second = b; \ + fail_unless(first == second, \ + "'" #a "' (%d) is not equal to '" #b"' (%d)", first, second); \ +} G_STMT_END; + +#define fail_unless_equals_uint64(a, b) \ +G_STMT_START { \ + guint64 first = a; \ + guint64 second = b; \ + fail_unless(first == second, \ + "'" #a "' (%" G_GUINT64_FORMAT ") is not equal to '" #b"' (%" \ + G_GUINT64_FORMAT ")", first, second); \ +} G_STMT_END; + +#define 
fail_unless_equals_double(a, b) \ +G_STMT_START { \ + double first = a; \ + double second = b; \ + fail_unless(first == second, \ + "'" #a "' (%lf) is not equal to '" #b"' (%lf)", first, second); \ +} G_STMT_END; + +#include <gst/play/play.h> + +START_TEST (test_create_and_free) +{ + GstPlay *player; + + player = gst_play_new (NULL); + fail_unless (player != NULL); + g_object_unref (player); +} + +END_TEST; + +START_TEST (test_set_and_get_uri) +{ + GstPlay *player; + gchar *uri; + + player = gst_play_new (NULL); + + fail_unless (player != NULL); + + gst_play_set_uri (player, "file:///path/to/a/file"); + uri = gst_play_get_uri (player); + + fail_unless (g_strcmp0 (uri, "file:///path/to/a/file") == 0); + + g_free (uri); + g_object_unref (player); +} + +END_TEST; + +START_TEST (test_set_and_get_position_update_interval) +{ + GstPlay *player; + guint interval = 0; + GstStructure *config; + + player = gst_play_new (NULL); + + fail_unless (player != NULL); + + config = gst_play_get_config (player); + gst_play_config_set_position_update_interval (config, 500); + interval = gst_play_config_get_position_update_interval (config); + fail_unless (interval == 500); + gst_play_set_config (player, config); + + g_object_unref (player); +} + +END_TEST; + +typedef enum +{ + STATE_CHANGE_BUFFERING, + STATE_CHANGE_DURATION_CHANGED, + STATE_CHANGE_END_OF_STREAM, + STATE_CHANGE_ERROR, + STATE_CHANGE_WARNING, + STATE_CHANGE_POSITION_UPDATED, + STATE_CHANGE_STATE_CHANGED, + STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED, + STATE_CHANGE_MEDIA_INFO_UPDATED, + STATE_CHANGE_SEEK_DONE, + STATE_CHANGE_URI_LOADED, +} TestPlayerStateChange; + +static const gchar * +test_play_state_change_get_name (TestPlayerStateChange change) +{ + switch (change) { + case STATE_CHANGE_BUFFERING: + return "buffering"; + case STATE_CHANGE_DURATION_CHANGED: + return "duration-changed"; + case STATE_CHANGE_END_OF_STREAM: + return "end-of-stream"; + case STATE_CHANGE_WARNING: + return "warning"; + case STATE_CHANGE_ERROR: + 
return "error"; + case STATE_CHANGE_POSITION_UPDATED: + return "position-updated"; + case STATE_CHANGE_STATE_CHANGED: + return "state-changed"; + case STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED: + return "video-dimensions-changed"; + case STATE_CHANGE_MEDIA_INFO_UPDATED: + return "media-info-updated"; + case STATE_CHANGE_SEEK_DONE: + return "seek-done"; + case STATE_CHANGE_URI_LOADED: + return "uri-loaded"; + default: + g_assert_not_reached (); + break; + } +} + +typedef struct _TestPlayerState TestPlayerState; +struct _TestPlayerState +{ + gint buffering_percent; + guint64 position, duration, seek_done_position; + gboolean end_of_stream, is_error, is_warning, seek_done; + GstPlayState state; + guint width, height; + GstPlayMediaInfo *media_info; + gchar *uri_loaded; + GstClockTime last_position; + gboolean done; + GError *error; + GstStructure *error_details; + + void (*test_callback) (GstPlay * player, TestPlayerStateChange change, + TestPlayerState * old_state, TestPlayerState * new_state); + gpointer test_data; +}; + +static void process_play_messages (GstPlay * player, TestPlayerState * state); + +static void +test_play_state_change_debug (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + GST_DEBUG_OBJECT (player, "Changed %s:\n" + "\tbuffering %d%% -> %d%%\n" + "\tposition %" GST_TIME_FORMAT " -> %" GST_TIME_FORMAT "\n" + "\tduration %" GST_TIME_FORMAT " -> %" GST_TIME_FORMAT "\n" + "\tseek position %" GST_TIME_FORMAT " -> %" GST_TIME_FORMAT "\n" + "\tend-of-stream %d -> %d\n" + "\terror %d -> %d\n" + "\tseek_done %d -> %d\n" + "\tstate %s -> %s\n" + "\twidth/height %d/%d -> %d/%d\n" + "\tmedia_info %p -> %p\n" + "\turi_loaded %s -> %s", + test_play_state_change_get_name (change), + old_state->buffering_percent, new_state->buffering_percent, + GST_TIME_ARGS (old_state->position), GST_TIME_ARGS (new_state->position), + GST_TIME_ARGS (old_state->duration), GST_TIME_ARGS (new_state->duration), + 
GST_TIME_ARGS (old_state->seek_done_position), + GST_TIME_ARGS (new_state->seek_done_position), old_state->end_of_stream, + new_state->end_of_stream, old_state->is_error, new_state->is_error, + old_state->seek_done, new_state->seek_done, + gst_play_state_get_name (old_state->state), + gst_play_state_get_name (new_state->state), old_state->width, + old_state->height, new_state->width, new_state->height, + old_state->media_info, new_state->media_info, + old_state->uri_loaded, new_state->uri_loaded); +} + +static void +test_play_state_reset (GstPlay * player, TestPlayerState * state) +{ + state->buffering_percent = 100; + state->position = state->duration = state->seek_done_position = -1; + state->end_of_stream = state->is_error = state->seek_done = FALSE; + state->state = GST_PLAY_STATE_STOPPED; + state->width = state->height = 0; + state->media_info = NULL; + state->last_position = GST_CLOCK_TIME_NONE; + state->done = FALSE; + g_clear_pointer (&state->uri_loaded, g_free); + g_clear_error (&state->error); + gst_clear_structure (&state->error_details); +} + +static GstPlay * +test_play_new (TestPlayerState * state) +{ + GstPlay *player; + GstElement *playbin, *fakesink; + + player = gst_play_new (NULL); + fail_unless (player != NULL); + + test_play_state_reset (player, state); + + playbin = gst_play_get_pipeline (player); + fakesink = gst_element_factory_make ("fakesink", "audio-sink"); + g_object_set (fakesink, "sync", TRUE, NULL); + g_object_set (playbin, "audio-sink", fakesink, NULL); + fakesink = gst_element_factory_make ("fakesink", "video-sink"); + g_object_set (fakesink, "sync", TRUE, NULL); + g_object_set (playbin, "video-sink", fakesink, NULL); + gst_object_unref (playbin); + + return player; +} + +static void +test_play_stopped_cb (GstPlay * player, TestPlayerStateChange change, + TestPlayerState * old_state, TestPlayerState * new_state) +{ + if (new_state->state == GST_PLAY_STATE_STOPPED) { + new_state->done = TRUE; + } +} + +static void +stop_player 
(GstPlay * player, TestPlayerState * state) +{ + if (state->state != GST_PLAY_STATE_STOPPED) { + /* Make sure all pending operations are finished so the player won't be + * appear as 'leaked' to leak detection tools. */ + state->test_callback = test_play_stopped_cb; + gst_play_stop (player); + state->done = FALSE; + process_play_messages (player, state); + } + test_play_state_reset (player, state); +} + +static void +test_play_audio_video_eos_cb (GstPlay * player, TestPlayerStateChange change, + TestPlayerState * old_state, TestPlayerState * new_state) +{ + gint step = GPOINTER_TO_INT (new_state->test_data); + gboolean video; + + video = ! !(step & 0x10); + step = (step & (~0x10)); + + switch (step) { + case 0: + fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); + if (video) + fail_unless (g_str_has_suffix (new_state->uri_loaded, + "audio-video-short.ogg")); + else + fail_unless (g_str_has_suffix (new_state->uri_loaded, + "audio-short.ogg")); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 1: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_STOPPED); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_BUFFERING); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 2: + fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 3: + fail_unless_equals_int (change, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED); + if (video) { + fail_unless_equals_int (new_state->width, 320); + fail_unless_equals_int (new_state->height, 240); + } else { + fail_unless_equals_int (new_state->width, 0); + fail_unless_equals_int (new_state->height, 0); + } + new_state->test_data = + GINT_TO_POINTER ((video ? 
0x10 : 0x00) | (step + 1)); + break; + case 4: + fail_unless_equals_int (change, STATE_CHANGE_DURATION_CHANGED); + fail_unless_equals_uint64 (new_state->duration, + G_GUINT64_CONSTANT (464399092)); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 5: + fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 6: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_BUFFERING); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_PLAYING); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 7: + fail_unless_equals_int (change, STATE_CHANGE_POSITION_UPDATED); + g_assert (new_state->position <= old_state->duration); + if (new_state->position == old_state->duration) + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 8: + fail_unless_equals_int (change, STATE_CHANGE_END_OF_STREAM); + fail_unless_equals_uint64 (new_state->position, old_state->duration); + new_state->test_data = + GINT_TO_POINTER ((video ? 0x10 : 0x00) | (step + 1)); + break; + case 9: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_PLAYING); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_STOPPED); + new_state->test_data = + GINT_TO_POINTER ((video ? 
0x10 : 0x00) | (step + 1)); + new_state->done = TRUE; + break; + default: + fail (); + break; + } +} + +void +process_play_messages (GstPlay * player, TestPlayerState * state) +{ + GstBus *bus = gst_play_get_message_bus (player); + do { + GstMessage *msg = + gst_bus_timed_pop_filtered (bus, -1, GST_MESSAGE_APPLICATION); + GST_INFO ("message: %" GST_PTR_FORMAT, msg); + + if (gst_play_is_play_message (msg)) { + TestPlayerState old_state = *state; + GstPlayMessage type; + + gst_play_message_parse_type (msg, &type); + switch (type) { + case GST_PLAY_MESSAGE_URI_LOADED: + state->uri_loaded = gst_play_get_uri (player); + state->test_callback (player, STATE_CHANGE_URI_LOADED, &old_state, + state); + break; + case GST_PLAY_MESSAGE_POSITION_UPDATED: + gst_play_message_parse_position_updated (msg, &state->position); + test_play_state_change_debug (player, STATE_CHANGE_POSITION_UPDATED, + &old_state, state); + state->test_callback (player, STATE_CHANGE_POSITION_UPDATED, + &old_state, state); + break; + case GST_PLAY_MESSAGE_DURATION_CHANGED:{ + GstClockTime duration; + gst_play_message_parse_duration_updated (msg, &duration); + state->duration = duration; + test_play_state_change_debug (player, STATE_CHANGE_DURATION_CHANGED, + &old_state, state); + state->test_callback (player, STATE_CHANGE_DURATION_CHANGED, + &old_state, state); + break; + } + case GST_PLAY_MESSAGE_STATE_CHANGED:{ + GstPlayState play_state; + gst_play_message_parse_state_changed (msg, &play_state); + + state->state = play_state; + + if (play_state == GST_PLAY_STATE_STOPPED) + test_play_state_reset (player, state); + + test_play_state_change_debug (player, STATE_CHANGE_STATE_CHANGED, + &old_state, state); + state->test_callback (player, STATE_CHANGE_STATE_CHANGED, &old_state, + state); + break; + } + case GST_PLAY_MESSAGE_BUFFERING:{ + guint percent; + gst_play_message_parse_buffering_percent (msg, &percent); + + state->buffering_percent = percent; + test_play_state_change_debug (player, 
STATE_CHANGE_BUFFERING, + &old_state, state); + state->test_callback (player, STATE_CHANGE_BUFFERING, &old_state, + state); + break; + } + case GST_PLAY_MESSAGE_END_OF_STREAM: + state->end_of_stream = TRUE; + test_play_state_change_debug (player, STATE_CHANGE_END_OF_STREAM, + &old_state, state); + state->test_callback (player, STATE_CHANGE_END_OF_STREAM, &old_state, + state); + break; + case GST_PLAY_MESSAGE_ERROR:{ + gst_play_message_parse_error (msg, &state->error, + &state->error_details); + GST_DEBUG ("error: %s details: %" GST_PTR_FORMAT, + state->error ? state->error->message : "", state->error_details); + state->is_error = TRUE; + test_play_state_change_debug (player, STATE_CHANGE_ERROR, + &old_state, state); + state->test_callback (player, STATE_CHANGE_ERROR, &old_state, state); + break; + } + case GST_PLAY_MESSAGE_WARNING:{ + gst_play_message_parse_error (msg, &state->error, + &state->error_details); + GST_DEBUG ("error: %s details: %" GST_PTR_FORMAT, + state->error ? state->error->message : "", state->error_details); + state->is_warning = TRUE; + test_play_state_change_debug (player, STATE_CHANGE_WARNING, + &old_state, state); + state->test_callback (player, STATE_CHANGE_WARNING, &old_state, + state); + break; + } + case GST_PLAY_MESSAGE_VIDEO_DIMENSIONS_CHANGED:{ + guint width, height; + gst_play_message_parse_video_dimensions_changed (msg, &width, + &height); + + state->width = width; + state->height = height; + test_play_state_change_debug (player, + STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED, &old_state, state); + state->test_callback (player, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED, + &old_state, state); + break; + } + case GST_PLAY_MESSAGE_MEDIA_INFO_UPDATED:{ + GstPlayMediaInfo *media_info; + gst_play_message_parse_media_info_updated (msg, &media_info); + + state->media_info = media_info; + test_play_state_change_debug (player, + STATE_CHANGE_MEDIA_INFO_UPDATED, &old_state, state); + state->test_callback (player, STATE_CHANGE_MEDIA_INFO_UPDATED, + 
&old_state, state); + break; + } + case GST_PLAY_MESSAGE_VOLUME_CHANGED:{ + gdouble volume; + gst_play_message_parse_volume_changed (msg, &volume); + break; + } + case GST_PLAY_MESSAGE_MUTE_CHANGED:{ + gboolean is_muted; + gst_play_message_parse_muted_changed (msg, &is_muted); + break; + } + case GST_PLAY_MESSAGE_SEEK_DONE: + state->seek_done = TRUE; + state->seek_done_position = gst_play_get_position (player); + test_play_state_change_debug (player, STATE_CHANGE_SEEK_DONE, + &old_state, state); + state->test_callback (player, STATE_CHANGE_SEEK_DONE, &old_state, + state); + break; + } + } + gst_message_unref (msg); + if (state->done) + break; + } while (1); + gst_object_unref (bus); +} + +START_TEST (test_play_audio_eos) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_audio_video_eos_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio-short.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 10); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_audio_info (GstPlayMediaInfo * media_info) +{ + gint i = 0; + GList *list; + + for (list = gst_play_media_info_get_audio_streams (media_info); + list != NULL; list = list->next) { + GstPlayStreamInfo *stream = (GstPlayStreamInfo *) list->data; + GstPlayAudioInfo *audio_info = (GstPlayAudioInfo *) stream; + + fail_unless (gst_play_stream_info_get_tags (stream) != NULL); + fail_unless (gst_play_stream_info_get_caps (stream) != NULL); + fail_unless_equals_string (gst_play_stream_info_get_stream_type (stream), + "audio"); + + if (i == 0) { + fail_unless_equals_string (gst_play_stream_info_get_codec (stream), + 
"MPEG-1 Layer 3 (MP3)"); + fail_unless_equals_int (gst_play_audio_info_get_sample_rate + (audio_info), 48000); + fail_unless_equals_int (gst_play_audio_info_get_channels (audio_info), 2); + fail_unless_equals_int (gst_play_audio_info_get_max_bitrate + (audio_info), 192000); + fail_unless (gst_play_audio_info_get_language (audio_info) != NULL); + } else { + fail_unless_equals_string (gst_play_stream_info_get_codec (stream), + "MPEG-4 AAC"); + fail_unless_equals_int (gst_play_audio_info_get_sample_rate + (audio_info), 48000); + fail_unless_equals_int (gst_play_audio_info_get_channels (audio_info), 6); + fail_unless (gst_play_audio_info_get_language (audio_info) != NULL); + } + + i++; + } +} + +static void +test_video_info (GstPlayMediaInfo * media_info) +{ + GList *list; + + for (list = gst_play_media_info_get_video_streams (media_info); + list != NULL; list = list->next) { + gint fps_d, fps_n; + guint par_d, par_n; + GstPlayStreamInfo *stream = (GstPlayStreamInfo *) list->data; + GstPlayVideoInfo *video_info = (GstPlayVideoInfo *) stream; + + fail_unless (gst_play_stream_info_get_tags (stream) != NULL); + fail_unless (gst_play_stream_info_get_caps (stream) != NULL); + fail_unless_equals_int (gst_play_stream_info_get_index (stream), 0); + fail_unless (strstr (gst_play_stream_info_get_codec (stream), + "H.264") != NULL + || strstr (gst_play_stream_info_get_codec (stream), "H264") != NULL); + fail_unless_equals_int (gst_play_video_info_get_width (video_info), 320); + fail_unless_equals_int (gst_play_video_info_get_height (video_info), 240); + gst_play_video_info_get_framerate (video_info, &fps_n, &fps_d); + fail_unless_equals_int (fps_n, 24); + fail_unless_equals_int (fps_d, 1); + gst_play_video_info_get_pixel_aspect_ratio (video_info, &par_n, &par_d); + fail_unless_equals_int (par_n, 33); + fail_unless_equals_int (par_d, 20); + } +} + +static void +test_subtitle_info (GstPlayMediaInfo * media_info) +{ + GList *list; + + for (list = 
gst_play_media_info_get_subtitle_streams (media_info); + list != NULL; list = list->next) { + GstPlayStreamInfo *stream = (GstPlayStreamInfo *) list->data; + GstPlaySubtitleInfo *sub = (GstPlaySubtitleInfo *) stream; + + fail_unless_equals_string (gst_play_stream_info_get_stream_type (stream), + "subtitle"); + fail_unless (gst_play_stream_info_get_tags (stream) != NULL); + fail_unless (gst_play_stream_info_get_caps (stream) != NULL); + fail_unless_equals_string (gst_play_stream_info_get_codec (stream), + "Timed Text"); + fail_unless (gst_play_subtitle_info_get_language (sub) != NULL); + } +} + +static void +test_media_info_object (GstPlay * player, GstPlayMediaInfo * media_info) +{ + GList *list; + + /* global tag */ + fail_unless (gst_play_media_info_is_seekable (media_info) == TRUE); + fail_unless (gst_play_media_info_get_tags (media_info) != NULL); + fail_unless_equals_string (gst_play_media_info_get_title (media_info), + "Sintel"); + fail_unless_equals_string (gst_play_media_info_get_container_format + (media_info), "Matroska"); + fail_unless (gst_play_media_info_get_image_sample (media_info) == NULL); + fail_unless (strstr (gst_play_media_info_get_uri (media_info), + "sintel.mkv") != NULL); + + /* number of streams */ + list = gst_play_media_info_get_stream_list (media_info); + fail_unless (list != NULL); + fail_unless_equals_int (g_list_length (list), 10); + + list = gst_play_media_info_get_video_streams (media_info); + fail_unless (list != NULL); + fail_unless_equals_int (g_list_length (list), 1); + + list = gst_play_media_info_get_audio_streams (media_info); + fail_unless (list != NULL); + fail_unless_equals_int (g_list_length (list), 2); + + list = gst_play_media_info_get_subtitle_streams (media_info); + fail_unless (list != NULL); + fail_unless_equals_int (g_list_length (list), 7); + + /* test subtitle */ + test_subtitle_info (media_info); + + /* test audio */ + test_audio_info (media_info); + + /* test video */ + test_video_info (media_info); +} + 
+static void +test_play_media_info_cb (GstPlay * player, TestPlayerStateChange change, + TestPlayerState * old_state, TestPlayerState * new_state) +{ + gint completed = GPOINTER_TO_INT (new_state->test_data); + + if (change == STATE_CHANGE_MEDIA_INFO_UPDATED) { + test_media_info_object (player, new_state->media_info); + new_state->test_data = GINT_TO_POINTER (completed + 1); + new_state->done = TRUE; + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->done = TRUE; + } +} + +START_TEST (test_play_media_info) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_media_info_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 1); + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_stream_disable_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data) & 0xf; + gint mask = GPOINTER_TO_INT (new_state->test_data) & 0xf0; + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + new_state->test_data = GINT_TO_POINTER (0x10 + steps + 1); + gst_play_set_audio_track_enabled (player, FALSE); + + } else if (mask == 0x10 && change == STATE_CHANGE_POSITION_UPDATED) { + GstPlayAudioInfo *audio; + + audio = gst_play_get_current_audio_track (player); + fail_unless (audio == NULL); + new_state->test_data = GINT_TO_POINTER (0x20 + steps + 1); + gst_play_set_subtitle_track_enabled (player, FALSE); + + } else if (mask == 0x20 && change == 
STATE_CHANGE_POSITION_UPDATED) { + GstPlaySubtitleInfo *sub; + + sub = gst_play_get_current_subtitle_track (player); + fail_unless (sub == NULL); + new_state->test_data = GINT_TO_POINTER (0x30 + steps + 1); + new_state->done = TRUE; + + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->done = TRUE; + } +} + +START_TEST (test_play_stream_disable) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_stream_disable_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 0x33); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_stream_switch_audio_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + gint ret; + + new_state->test_data = GINT_TO_POINTER (steps + 1); + ret = gst_play_set_audio_track (player, 1); + fail_unless_equals_int (ret, 1); + + } else if (steps && change == STATE_CHANGE_POSITION_UPDATED) { + gint index; + GstPlayAudioInfo *audio; + + audio = gst_play_get_current_audio_track (player); + fail_unless (audio != NULL); + index = gst_play_stream_info_get_index ((GstPlayStreamInfo *) audio); + fail_unless_equals_int (index, 1); + g_object_unref (audio); + + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->done = TRUE; + } +} + +START_TEST 
(test_play_stream_switch_audio) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_stream_switch_audio_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_stream_switch_subtitle_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + gint ret; + + new_state->test_data = GINT_TO_POINTER (steps + 1); + ret = gst_play_set_subtitle_track (player, 5); + fail_unless_equals_int (ret, 1); + + } else if (steps && change == STATE_CHANGE_POSITION_UPDATED) { + gint index; + GstPlaySubtitleInfo *sub; + + sub = gst_play_get_current_subtitle_track (player); + fail_unless (sub != NULL); + index = gst_play_stream_info_get_index ((GstPlayStreamInfo *) sub); + fail_unless_equals_int (index, 5); + g_object_unref (sub); + + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->done = TRUE; + } +} + +START_TEST (test_play_stream_switch_subtitle) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_stream_switch_subtitle_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH 
"/sintel.mkv", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_error_invalid_external_suburi_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + gchar *suburi; + + suburi = gst_filename_to_uri (TEST_PATH "/foo.srt", NULL); + fail_unless (suburi != NULL); + + new_state->test_data = GINT_TO_POINTER (steps + 1); + /* load invalid suburi */ + gst_play_set_subtitle_uri (player, suburi); + g_free (suburi); + + } else if (steps && change == STATE_CHANGE_WARNING) { + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + } +} + +START_TEST (test_play_error_invalid_external_suburi) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_error_invalid_external_suburi_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio-video.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static gboolean +has_subtitle_stream (TestPlayerState * new_state) +{ + if (gst_play_media_info_get_subtitle_streams 
(new_state->media_info)) + return TRUE; + + return FALSE; +} + +static void +test_play_external_suburi_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + gchar *suburi; + + suburi = gst_filename_to_uri (TEST_PATH "/test_sub.srt", NULL); + fail_unless (suburi != NULL); + + gst_play_set_subtitle_uri (player, suburi); + g_free (suburi); + new_state->test_data = GINT_TO_POINTER (steps + 1); + + } else if (change == STATE_CHANGE_MEDIA_INFO_UPDATED && + has_subtitle_stream (new_state)) { + gchar *current_suburi, *suburi; + + current_suburi = gst_play_get_subtitle_uri (player); + fail_unless (current_suburi != NULL); + suburi = gst_filename_to_uri (TEST_PATH "/test_sub.srt", NULL); + fail_unless (suburi != NULL); + + fail_unless_equals_int (g_strcmp0 (current_suburi, suburi), 0); + + g_free (current_suburi); + g_free (suburi); + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) + new_state->done = TRUE; +} + +START_TEST (test_play_external_suburi) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_external_suburi_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio-video.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_rate_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, 
+ TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data) & 0xf; + gint mask = GPOINTER_TO_INT (new_state->test_data) & 0xf0; + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + guint64 dur = -1, pos = -1; + + g_object_get (player, "position", &pos, "duration", &dur, NULL); + pos = pos + dur * 0.2; /* seek 20% */ + gst_play_seek (player, pos); + + /* default rate should be 1.0 */ + fail_unless_equals_double (gst_play_get_rate (player), 1.0); + new_state->test_data = GINT_TO_POINTER (mask + steps + 1); + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->done = TRUE; + } else if (steps == 1 && change == STATE_CHANGE_SEEK_DONE) { + if (mask == 0x10) + gst_play_set_rate (player, 1.5); + else if (mask == 0x20) + gst_play_set_rate (player, -1.0); + + new_state->test_data = GINT_TO_POINTER (mask + steps + 1); + } else if (steps && (change == STATE_CHANGE_POSITION_UPDATED)) { + if (steps == 10) { + new_state->done = TRUE; + } else { + if (mask == 0x10 && (new_state->position > old_state->position)) + new_state->test_data = GINT_TO_POINTER (mask + steps + 1); + else if (mask == 0x20 && (new_state->position < old_state->position)) + new_state->test_data = GINT_TO_POINTER (mask + steps + 1); + } + } +} + +START_TEST (test_play_forward_rate) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_rate_cb; + state.test_data = GINT_TO_POINTER (0x10); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & 0xf, 10); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +START_TEST 
(test_play_backward_rate) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_rate_cb; + state.test_data = GINT_TO_POINTER (0x20); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & 0xf, 10); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +START_TEST (test_play_audio_video_eos) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_audio_video_eos_cb; + state.test_data = GINT_TO_POINTER (0x10); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio-video-short.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & (~0x10), 10); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_error_invalid_uri_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint step = GPOINTER_TO_INT (new_state->test_data); + + switch (step) { + case 0: + fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); + fail_unless_equals_string (new_state->uri_loaded, "foo://bar"); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 1: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_STOPPED); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_BUFFERING); + 
new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 2: + fail_unless_equals_int (change, STATE_CHANGE_ERROR); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 3: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_BUFFERING); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_STOPPED); + new_state->test_data = GINT_TO_POINTER (step + 1); + new_state->done = TRUE; + break; + default: + fail (); + break; + } +} + +START_TEST (test_play_error_invalid_uri) +{ + GstPlay *player; + TestPlayerState state; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_error_invalid_uri_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + gst_play_set_uri (player, "foo://bar"); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 4); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_error_invalid_uri_and_play_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint step = GPOINTER_TO_INT (new_state->test_data); + gchar *uri; + + switch (step) { + case 0: + fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); + fail_unless_equals_string (new_state->uri_loaded, "foo://bar"); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 1: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_STOPPED); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_BUFFERING); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 2: + fail_unless_equals_int (change, STATE_CHANGE_ERROR); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 3: + fail_unless_equals_int (change, 
STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_BUFFERING); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_STOPPED); + new_state->test_data = GINT_TO_POINTER (step + 1); + + uri = gst_filename_to_uri (TEST_PATH "/audio-short.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + break; + case 4: + fail_unless_equals_int (change, STATE_CHANGE_URI_LOADED); + fail_unless (g_str_has_suffix (new_state->uri_loaded, "audio-short.ogg")); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 5: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_STOPPED); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_BUFFERING); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 6: + fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 7: + fail_unless_equals_int (change, STATE_CHANGE_VIDEO_DIMENSIONS_CHANGED); + fail_unless_equals_int (new_state->width, 0); + fail_unless_equals_int (new_state->height, 0); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 8: + fail_unless_equals_int (change, STATE_CHANGE_DURATION_CHANGED); + fail_unless_equals_uint64 (new_state->duration, + G_GUINT64_CONSTANT (464399092)); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 9: + fail_unless_equals_int (change, STATE_CHANGE_MEDIA_INFO_UPDATED); + new_state->test_data = GINT_TO_POINTER (step + 1); + break; + case 10: + fail_unless_equals_int (change, STATE_CHANGE_STATE_CHANGED); + fail_unless_equals_int (old_state->state, GST_PLAY_STATE_BUFFERING); + fail_unless_equals_int (new_state->state, GST_PLAY_STATE_PLAYING); + new_state->test_data = GINT_TO_POINTER (step + 1); + new_state->done = TRUE; + break; + default: + fail (); + break; + } +} + +START_TEST 
(test_play_error_invalid_uri_and_play) +{ + GstPlay *player; + TestPlayerState state; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_error_invalid_uri_and_play_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + gst_play_set_uri (player, "foo://bar"); + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 11); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_seek_done_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint step = GPOINTER_TO_INT (new_state->test_data) & (~0x10); + + if (new_state->state == GST_PLAY_STATE_PAUSED && !step) { + gst_play_seek (player, 0); + new_state->test_data = GINT_TO_POINTER (step + 1); + } else if (change == STATE_CHANGE_SEEK_DONE && step == 1) { + fail_unless_equals_int (change, STATE_CHANGE_SEEK_DONE); + fail_unless_equals_uint64 (new_state->seek_done_position, + G_GUINT64_CONSTANT (0)); + new_state->test_data = GINT_TO_POINTER (step + 1); + new_state->done = TRUE; + } +} + +START_TEST (test_play_audio_video_seek_done) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_seek_done_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/audio-video.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_pause (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data) & (~0x10), 2); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_play_position_update_interval_cb (GstPlay * player, + TestPlayerStateChange 
change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (new_state->state == GST_PLAY_STATE_PLAYING && !steps) { + new_state->test_data = GINT_TO_POINTER (steps + 1); + } else if (steps && change == STATE_CHANGE_POSITION_UPDATED) { + GstStructure *config = gst_play_get_config (player); + guint update_interval = + gst_play_config_get_position_update_interval (config); + GstClockTime position = gst_play_get_position (player); + new_state->test_data = GINT_TO_POINTER (steps + 1); + + gst_structure_free (config); + if (GST_CLOCK_TIME_IS_VALID (old_state->last_position)) { + GstClockTime delta = GST_CLOCK_DIFF (old_state->last_position, position); + GST_DEBUG_OBJECT (player, + "current delta: %" GST_TIME_FORMAT " interval: %" GST_TIME_FORMAT, + GST_TIME_ARGS (delta), GST_TIME_ARGS (update_interval)); + + if (update_interval > 10) { + fail_unless (delta > ((update_interval - 10) * GST_MSECOND) + && delta < ((update_interval + 10) * GST_MSECOND)); + } + } + + new_state->last_position = position; + + if (position >= 2000 * GST_MSECOND) { + new_state->done = TRUE; + } + } else if (change == STATE_CHANGE_END_OF_STREAM || + change == STATE_CHANGE_ERROR) { + new_state->done = TRUE; + } +} + +START_TEST (test_play_position_update_interval) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + GstStructure *config; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_play_position_update_interval_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + config = gst_play_get_config (player); + gst_play_config_set_position_update_interval (config, 600); + gst_play_set_config (player, config); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri (TEST_PATH "/sintel.mkv", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + + 
fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 5); + + /* Disable position updates */ + gst_play_stop (player); + + config = gst_play_get_config (player); + gst_play_config_set_position_update_interval (config, 0); + gst_play_set_config (player, config); + state.last_position = GST_CLOCK_TIME_NONE; + + gst_play_play (player); + process_play_messages (player, &state); + + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 6); + + stop_player (player, &state); + g_object_unref (player); +} + +END_TEST; + +static void +test_restart_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (!steps && change == STATE_CHANGE_URI_LOADED) { + fail_unless (g_str_has_suffix (new_state->uri_loaded, "sintel.mkv")); + new_state->test_data = GINT_TO_POINTER (steps + 1); + } else if (change == STATE_CHANGE_STATE_CHANGED + && new_state->state == GST_PLAY_STATE_BUFFERING) { + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + } +} + +static void +test_restart_cb2 (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + gint steps = GPOINTER_TO_INT (new_state->test_data); + + if (!steps && change == STATE_CHANGE_URI_LOADED) { + fail_unless (g_str_has_suffix (new_state->uri_loaded, "audio-short.ogg")); + new_state->test_data = GINT_TO_POINTER (steps + 1); + } else if (change == STATE_CHANGE_STATE_CHANGED + && new_state->state == GST_PLAY_STATE_BUFFERING) { + new_state->test_data = GINT_TO_POINTER (steps + 1); + new_state->done = TRUE; + } +} + + +START_TEST (test_restart) +{ + GstPlay *player; + TestPlayerState state; + gchar *uri; + + memset (&state, 0, sizeof (state)); + state.test_callback = test_restart_cb; + state.test_data = GINT_TO_POINTER (0); + + player = test_play_new (&state); + + fail_unless (player != NULL); + + uri = gst_filename_to_uri 
(TEST_PATH "/sintel.mkv", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); + stop_player (player, &state); + + /* Try again with another URI */ + state.test_data = GINT_TO_POINTER (0); + state.test_callback = test_restart_cb2; + + uri = gst_filename_to_uri (TEST_PATH "/audio-short.ogg", NULL); + fail_unless (uri != NULL); + gst_play_set_uri (player, uri); + g_free (uri); + + gst_play_play (player); + process_play_messages (player, &state); + fail_unless_equals_int (GPOINTER_TO_INT (state.test_data), 2); + stop_player (player, &state); + + g_object_unref (player); +} + +END_TEST; + +static void +do_get (SoupMessage * msg, const char *path) +{ + char *uri; + SoupStatus status = SOUP_STATUS_OK; + + uri = soup_uri_to_string (soup_message_get_uri (msg), FALSE); + GST_DEBUG ("request: \"%s\"", uri); + + if (status != (SoupStatus) SOUP_STATUS_OK) + goto beach; + + if (msg->method == SOUP_METHOD_GET) { + char *full_path = g_strconcat (TEST_PATH, path, NULL); + char *buf; + gsize buflen; + + if (!g_file_get_contents (full_path, &buf, &buflen, NULL)) { + status = SOUP_STATUS_NOT_FOUND; + g_free (full_path); + goto beach; + } + + g_free (full_path); + soup_message_body_append (msg->response_body, SOUP_MEMORY_TAKE, + buf, buflen); + } + +beach: + soup_message_set_status (msg, status); + g_free (uri); +} + +static void +server_callback (SoupServer * server, SoupMessage * msg, + const char *path, GHashTable * query, + SoupClientContext * context, gpointer data) +{ + GST_DEBUG ("%s %s HTTP/1.%d", msg->method, path, + soup_message_get_http_version (msg)); + if (msg->request_body->length) + GST_DEBUG ("%s", msg->request_body->data); + + if (msg->method == SOUP_METHOD_GET) + do_get (msg, path); + else + soup_message_set_status (msg, SOUP_STATUS_NOT_IMPLEMENTED); + + GST_DEBUG (" -> %d %s", msg->status_code, 
msg->reason_phrase); +} + +static guint +get_port_from_server (SoupServer * server) +{ + GSList *uris; + guint port; + + uris = soup_server_get_uris (server); + g_assert (g_slist_length (uris) == 1); + port = soup_uri_get_port (uris->data); + g_slist_free_full (uris, (GDestroyNotify) soup_uri_free); + + return port; +} + +typedef struct +{ + GMainLoop *loop; + GMainContext *ctx; + GThread *thread; + SoupServer *server; + GMutex lock; + GCond cond; +} ServerContext; + +static gboolean +main_loop_running_cb (gpointer data) +{ + ServerContext *context = (ServerContext *) data; + + g_mutex_lock (&context->lock); + g_cond_signal (&context->cond); + g_mutex_unlock (&context->lock); + + return G_SOURCE_REMOVE; +} + +static gpointer +http_main (gpointer data) +{ + ServerContext *context = (ServerContext *) data; + GSource *source; + + context->server = soup_server_new (NULL, NULL); + soup_server_add_handler (context->server, NULL, server_callback, NULL, NULL); + + g_main_context_push_thread_default (context->ctx); + + { + GSocketAddress *address; + GError *err = NULL; + SoupServerListenOptions listen_flags = 0; + + address = + g_inet_socket_address_new_from_string ("0.0.0.0", + SOUP_ADDRESS_ANY_PORT); + soup_server_listen (context->server, address, listen_flags, &err); + g_object_unref (address); + + if (err) { + GST_ERROR ("Failed to start HTTP server: %s", err->message); + g_object_unref (context->server); + g_error_free (err); + } + } + + source = g_idle_source_new (); + g_source_set_callback (source, (GSourceFunc) main_loop_running_cb, context, + NULL); + g_source_attach (source, context->ctx); + g_source_unref (source); + + g_main_loop_run (context->loop); + g_main_context_pop_thread_default (context->ctx); + g_object_unref (context->server); + return NULL; +} + +#define TEST_USER_AGENT "test user agent" + +static void +test_user_agent_cb (GstPlay * player, + TestPlayerStateChange change, TestPlayerState * old_state, + TestPlayerState * new_state) +{ + if (change == 
STATE_CHANGE_STATE_CHANGED + && new_state->state == GST_PLAY_STATE_PAUSED) { + GstElement *pipeline; + GstElement *source; + gchar *user_agent; + + pipeline = gst_play_get_pipeline (player); + source = gst_bin_get_by_name (GST_BIN_CAST (pipeline), "source"); + g_object_get (source, "user-agent", &user_agent, NULL); + fail_unless_equals_string (user_agent, TEST_USER_AGENT); + g_free (user_agent); + gst_object_unref (source); + gst_object_unref (pipeline); + new_state->done = TRUE; + } +} + +START_TEST (test_user_agent) +{ + GstPlay *player; + GstStructure *config; + gchar *user_agent; + TestPlayerState state; + guint port; + gchar *url; + ServerContext *context = g_new (ServerContext, 1); + + g_mutex_init (&context->lock); + g_cond_init (&context->cond); + context->ctx = g_main_context_new (); + context->loop = g_main_loop_new (context->ctx, FALSE); + context->server = NULL; + + g_mutex_lock (&context->lock); + context->thread = g_thread_new ("HTTP Server", http_main, context); + while (!g_main_loop_is_running (context->loop)) + g_cond_wait (&context->cond, &context->lock); + g_mutex_unlock (&context->lock); + + if (context->server == NULL) { + g_print ("Failed to start up HTTP server"); + /* skip this test */ + goto beach; + } + + memset (&state, 0, sizeof (state)); + state.test_callback = test_user_agent_cb; + state.test_data = GINT_TO_POINTER (0); + + player = gst_play_new (NULL); + fail_unless (player != NULL); + + port = get_port_from_server (context->server); + url = g_strdup_printf ("http://127.0.0.1:%u/audio.ogg", port); + fail_unless (url != NULL); + + gst_play_set_uri (player, url); + g_free (url); + + config = gst_play_get_config (player); + gst_play_config_set_user_agent (config, TEST_USER_AGENT); + + user_agent = gst_play_config_get_user_agent (config); + fail_unless_equals_string (user_agent, TEST_USER_AGENT); + g_free (user_agent); + + gst_play_set_config (player, config); + + gst_play_pause (player); + process_play_messages (player, &state); + + 
stop_player (player, &state); + g_object_unref (player); + +beach: + g_main_loop_quit (context->loop); + g_thread_unref (context->thread); + g_main_loop_unref (context->loop); + context->loop = NULL; + g_main_context_unref (context->ctx); + context->ctx = NULL; + g_free (context); +} + +END_TEST; + +static Suite * +play_suite (void) +{ + Suite *s = suite_create ("GstPlay"); + + TCase *tc_general = tcase_create ("general"); + + /* Use a longer timeout */ +#ifdef HAVE_VALGRIND + if (RUNNING_ON_VALGRIND) { + tcase_set_timeout (tc_general, 5 * 60); + } else +#endif + { + tcase_set_timeout (tc_general, 2 * 60); + } + + tcase_add_test (tc_general, test_create_and_free); + tcase_add_test (tc_general, test_set_and_get_uri); + tcase_add_test (tc_general, test_set_and_get_position_update_interval); + +#ifdef HAVE_VALGRIND + if (RUNNING_ON_VALGRIND) { + } else +#endif + { + tcase_add_test (tc_general, test_play_position_update_interval); + } + tcase_add_test (tc_general, test_play_audio_eos); + tcase_add_test (tc_general, test_play_audio_video_eos); + tcase_add_test (tc_general, test_play_error_invalid_uri); + tcase_add_test (tc_general, test_play_error_invalid_uri_and_play); + tcase_add_test (tc_general, test_play_media_info); + tcase_add_test (tc_general, test_play_stream_disable); + tcase_add_test (tc_general, test_play_stream_switch_audio); + tcase_add_test (tc_general, test_play_stream_switch_subtitle); + tcase_add_test (tc_general, test_play_error_invalid_external_suburi); + tcase_add_test (tc_general, test_play_external_suburi); + tcase_add_test (tc_general, test_play_forward_rate); + tcase_add_test (tc_general, test_play_backward_rate); + tcase_add_test (tc_general, test_play_audio_video_seek_done); + tcase_add_test (tc_general, test_restart); + tcase_add_test (tc_general, test_user_agent); + + suite_add_tcase (s, tc_general); + + return s; +} + +GST_CHECK_MAIN (play)
View file
gst-plugins-bad-1.18.6.tar.xz/tests/check/meson.build -> gst-plugins-bad-1.20.1.tar.xz/tests/check/meson.build
Changed
@@ -15,16 +15,25 @@ # Since nalutils API is internal, need to build it again nalutils_dep = gstcodecparsers_dep.partial_dependency (compile_args: true, includes: true) -enable_gst_player_tests = get_option('gst_player_tests') +enable_gst_play_tests = get_option('gst_play_tests') +libsoup_dep = dependency('libsoup-2.4', version : '>=2.48', required : enable_gst_play_tests, + fallback : ['libsoup', 'libsoup_dep']) # name, condition when to skip the test and extra dependencies base_tests = [ + [['elements/aesenc.c'], not aes_dep.found(), [aes_dep]], + [['elements/aesdec.c'], not aes_dep.found(), [aes_dep]], [['elements/aiffparse.c']], [['elements/asfmux.c']], [['elements/autoconvert.c']], [['elements/autovideoconvert.c']], [['elements/avwait.c']], [['elements/camerabin.c']], + [['elements/ccconverter.c'], not closedcaption_dep.found(), [gstvideo_dep]], + [['elements/cccombiner.c'], not closedcaption_dep.found(), ], + [['elements/ccextractor.c'], not closedcaption_dep.found(), ], + [['elements/cudaconvert.c'], false, [gmodule_dep, gstgl_dep]], + [['elements/cudafilter.c'], false, [gmodule_dep, gstgl_dep]], [['elements/d3d11colorconvert.c'], host_machine.system() != 'windows', ], [['elements/gdpdepay.c']], [['elements/gdppay.c']], @@ -35,6 +44,7 @@ [['elements/id3mux.c']], [['elements/interlace.c']], [['elements/jpeg2000parse.c'], false, [libparser_dep, gstcodecparsers_dep]], + [['elements/line21.c'], not closedcaption_dep.found(), ], [['elements/mfvideosrc.c'], host_machine.system() != 'windows', ], [['elements/mpegtsdemux.c'], false, [gstmpegts_dep]], [['elements/mpegtsmux.c'], false, [gstmpegts_dep]], @@ -46,6 +56,7 @@ [['elements/nvenc.c'], false, [gmodule_dep, gstgl_dep]], [['elements/nvdec.c'], not gstgl_dep.found(), [gmodule_dep, gstgl_dep]], [['elements/svthevcenc.c'], not svthevcenc_dep.found(), [svthevcenc_dep]], + [['elements/openjpeg.c'], not openjpeg_dep.found(), [openjpeg_dep]], [['elements/pcapparse.c'], false, [libparser_dep]], [['elements/pnm.c']], 
[['elements/ristrtpext.c']], @@ -56,6 +67,9 @@ [['elements/switchbin.c']], [['elements/videoframe-audiolevel.c']], [['elements/viewfinderbin.c']], + [['elements/vp9parse.c'], false, [gstcodecparsers_dep]], + [['elements/av1parse.c'], false, [gstcodecparsers_dep]], + [['elements/wasapi.c'], host_machine.system() != 'windows', ], [['elements/wasapi2.c'], host_machine.system() != 'windows', ], [['libs/h264parser.c'], false, [gstcodecparsers_dep]], [['libs/h265parser.c'], false, [gstcodecparsers_dep]], @@ -65,7 +79,7 @@ [['libs/mpegts.c'], false, [gstmpegts_dep]], [['libs/mpegvideoparser.c'], false, [gstcodecparsers_dep]], [['libs/planaraudioadapter.c'], false, [gstbadaudio_dep]], - [['libs/player.c'], not enable_gst_player_tests, [gstplayer_dep]], + [['libs/play.c'], not enable_gst_play_tests, [gstplay_dep, libsoup_dep]], [['libs/vc1parser.c'], false, [gstcodecparsers_dep]], [['libs/vp8parser.c'], false, [gstcodecparsers_dep]], [['libs/vp9parser.c'], false, [gstcodecparsers_dep]], @@ -78,6 +92,7 @@ [['libs/vkcommandpool.c'], not gstvulkan_dep.found(), [gstvulkan_dep]], [['libs/vkimage.c'], not gstvulkan_dep.found(), [gstvulkan_dep]], [['libs/vkinstance.c'], not gstvulkan_dep.found(), [gstvulkan_dep]], + [['libs/d3d11device.cpp'], not gstd3d11_dep.found(), [gstd3d11_dep]], ] # FIXME: unistd dependency, unstable or not tested yet on windows @@ -94,11 +109,7 @@ [['elements/avtpcvfdepay.c'], not avtp_dep.found(), [avtp_dep]], [['elements/avtpsink.c'], not avtp_dep.found(), [avtp_dep]], [['elements/avtpsrc.c'], not avtp_dep.found(), [avtp_dep]], - [['elements/ccconverter.c'], not gstvideo_dep.found(), [gstvideo_dep]], - [['elements/cccombiner.c']], - [['elements/ccextractor.c']], [['elements/clockselect.c']], - [['elements/line21.c']], [['elements/curlhttpsink.c'], not curl_dep.found(), [curl_dep]], [['elements/curlhttpsrc.c'], not curl_dep.found(), [curl_dep, gio_dep]], [['elements/curlfilesink.c'], @@ -129,6 +140,13 @@ ] endif +# linux only tests +if 
host_machine.system() == 'linux' + base_tests += [ + [['elements/vapostproc.c'], not gstva_dep.found(), [gstva_dep]], + ] +endif + test_defines = [ '-UG_DISABLE_ASSERT', '-UG_DISABLE_CAST_CHECKS', @@ -150,14 +168,24 @@ if gst_dep.type_name() == 'pkgconfig' pbase = dependency('gstreamer-plugins-base-' + api_version) - pluginsdirs = [gst_dep.get_pkgconfig_variable('pluginsdir'), - pbase.get_pkgconfig_variable('pluginsdir')] - gst_plugin_scanner_dir = gst_dep.get_pkgconfig_variable('pluginscannerdir') + pluginsdirs = [gst_dep.get_variable('pluginsdir'), + pbase.get_variable('pluginsdir')] + gst_plugin_scanner_dir = gst_dep.get_variable('pluginscannerdir') else gst_plugin_scanner_dir = subproject('gstreamer').get_variable('gst_scanner_dir') endif gst_plugin_scanner_path = join_paths(gst_plugin_scanner_dir, 'gst-plugin-scanner') +extra_args = [] +# XXX: our MinGW 32bits toolchain complains when ComPtr is in use +if host_system == 'windows' and cc.get_id() != 'msvc' + mingw_args = cc.get_supported_arguments([ + '-Wno-redundant-decls', + ]) + + extra_args += mingw_args +endif + foreach t : base_tests fnames = t.get(0) test_name = fnames[0].split('.').get(0).underscorify() @@ -177,8 +205,8 @@ if not skip_test exe = executable(test_name, fnames, extra_sources, include_directories : [configinc], - c_args : gst_plugins_bad_args + test_defines, - cpp_args : gst_plugins_bad_args + test_defines, + c_args : gst_plugins_bad_args + test_defines + extra_args, + cpp_args : gst_plugins_bad_args + test_defines + extra_args, dependencies : [libm] + test_deps + extra_deps, ) @@ -186,11 +214,11 @@ env.set('GST_PLUGIN_SYSTEM_PATH_1_0', '') env.set('CK_DEFAULT_TIMEOUT', '20') env.set('GST_STATE_IGNORE_ELEMENTS', '') - env.set('GST_PLUGIN_PATH_1_0', [meson.build_root()] + pluginsdirs) + env.set('GST_PLUGIN_PATH_1_0', [meson.global_build_root()] + pluginsdirs) env.set('GST_REGISTRY', join_paths(meson.current_build_dir(), '@0@.registry'.format(test_name)))
env.set('GST_PLUGIN_LOADING_WHITELIST', 'gstreamer', 'gst-plugins-base', 'gst-plugins-good', 'gst-plugins-ugly','gst-libav', 'libnice', - 'gst-plugins-bad@' + meson.build_root()) + 'gst-plugins-bad@' + meson.project_build_root()) env.set('GST_PLUGIN_SCANNER_1_0', gst_plugin_scanner_path) test(test_name, exe, env: env, timeout: 3 * 60) endif @@ -225,6 +253,6 @@ endforeach endif -if enable_gst_player_tests +if enable_gst_play_tests subdir ('media') endif
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11device.cpp
Added
@@ -0,0 +1,402 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include "d3d11device.h" +#include <d3dcompiler.h> +#include <wrl.h> +#include <string.h> + +using namespace Microsoft::WRL; + +HRESULT +prepare_d3d11_device (ID3D11Device ** d3d11_device, + ID3D11DeviceContext ** d3d11_context, IDXGIFactory2 ** dxgi_factory) +{ + HRESULT hr; + static const D3D_FEATURE_LEVEL feature_levels[] = { + D3D_FEATURE_LEVEL_11_1, + D3D_FEATURE_LEVEL_11_0, + D3D_FEATURE_LEVEL_10_1, + D3D_FEATURE_LEVEL_10_0, + D3D_FEATURE_LEVEL_9_3, + D3D_FEATURE_LEVEL_9_2, + D3D_FEATURE_LEVEL_9_1 + }; + D3D_FEATURE_LEVEL selected_level; + + ComPtr<IDXGIFactory1> factory; + hr = CreateDXGIFactory1 (IID_PPV_ARGS (&factory)); + if (FAILED (hr)) { + gst_printerrln ("IDXGIFactory1 is unavailable, hr 0x%x", (guint) hr); + return hr; + } + + ComPtr<IDXGIFactory2> factory2; + hr = factory.As (&factory2); + if (FAILED (hr)) { + gst_printerrln ("IDXGIFactory2 is unavailable, hr 0x%x", (guint) hr); + return hr; + } + + ComPtr<IDXGIAdapter1> adapter; + hr = factory->EnumAdapters1 (0, &adapter); + if (FAILED (hr)) { + gst_printerrln ("IDXGIAdapter1 is unavailable, hr 
0x%x", (guint) hr); + return hr; + } + + ComPtr<ID3D11Device> device; + ComPtr<ID3D11DeviceContext> context; + hr = D3D11CreateDevice (adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, + NULL, D3D11_CREATE_DEVICE_BGRA_SUPPORT, feature_levels, + G_N_ELEMENTS (feature_levels), D3D11_SDK_VERSION, &device, + &selected_level, &context); + + /* Try again with excluding D3D_FEATURE_LEVEL_11_1 */ + if (FAILED (hr)) { + hr = D3D11CreateDevice (adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, + NULL, D3D11_CREATE_DEVICE_BGRA_SUPPORT, &feature_levels[1], + G_N_ELEMENTS (feature_levels) - 1, D3D11_SDK_VERSION, &device, + &selected_level, &context); + } + + if (FAILED (hr)) { + gst_printerrln ("ID3D11Device is unavailable, hr 0x%x", (guint) hr); + return hr; + } + + if (d3d11_device) + *d3d11_device = device.Detach(); + if (d3d11_context) + *d3d11_context = context.Detach(); + if (dxgi_factory) + *dxgi_factory = factory2.Detach(); + + return hr; +} + +HRESULT +prepare_shared_texture (ID3D11Device * d3d11_device, guint width, + guint height, DXGI_FORMAT format, UINT misc_flags, + ID3D11Texture2D ** texture, ID3D11ShaderResourceView ** srv, + IDXGIKeyedMutex ** keyed_mutex, HANDLE * shared_handle) +{ + D3D11_TEXTURE2D_DESC texture_desc = { 0, }; + HRESULT hr; + + /* Texture size doesn't need to be identical to that of backbuffer */ + texture_desc.Width = width; + texture_desc.Height = height; + texture_desc.MipLevels = 1; + texture_desc.Format = format; + texture_desc.SampleDesc.Count = 1; + texture_desc.ArraySize = 1; + texture_desc.Usage = D3D11_USAGE_DEFAULT; + texture_desc.BindFlags = + D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE; + texture_desc.MiscFlags = misc_flags; + + ComPtr<ID3D11Texture2D> shared_texture; + hr = d3d11_device->CreateTexture2D (&texture_desc, nullptr, &shared_texture); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11Texture2D"); + return hr; + } + + ComPtr<ID3D11ShaderResourceView> shader_resource_view; + if (srv) { + D3D11_SHADER_RESOURCE_VIEW_DESC 
srv_desc = { 0, }; + srv_desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; + srv_desc.Texture2D.MipLevels = 1; + srv_desc.Format = format; + + hr = d3d11_device->CreateShaderResourceView (shared_texture.Get(), &srv_desc, + &shader_resource_view); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11ShaderResourceView"); + return hr; + } + } + + ComPtr<IDXGIKeyedMutex> keyed; + if ((misc_flags & D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX) != 0 && keyed_mutex) { + hr = shared_texture.As (&keyed); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get IDXGIKeyedMutex"); + return hr; + } + } + + ComPtr<IDXGIResource> dxgi_resource; + hr = shared_texture.As (&dxgi_resource); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get IDXGIResource handle"); + return hr; + } + + HANDLE handle; + if ((misc_flags & D3D11_RESOURCE_MISC_SHARED_NTHANDLE) != 0) { + ComPtr<IDXGIResource1> dxgi_resource1; + hr = dxgi_resource.As (&dxgi_resource1); + + if (FAILED (hr)) { + gst_printerrln ("Couldn't get IDXGIResource1"); + return hr; + } + + hr = dxgi_resource1->CreateSharedHandle (nullptr, + DXGI_SHARED_RESOURCE_READ | DXGI_SHARED_RESOURCE_WRITE, nullptr, + &handle); + } else { + hr = dxgi_resource->GetSharedHandle (&handle); + } + + if (FAILED (hr)) { + gst_printerrln ("Couldn't get shared handle from texture"); + return hr; + } + + *texture = shared_texture.Detach(); + if (srv) + *srv = shader_resource_view.Detach(); + *shared_handle = handle; + if (keyed && keyed_mutex) + *keyed_mutex = keyed.Detach(); + + return S_OK; +} + +static HRESULT +d3d_compile (const gchar * source, gboolean is_pixel_shader, ID3DBlob ** code) +{ + HRESULT hr; + const gchar *shader_target = "ps_4_0"; + + if (!is_pixel_shader) + shader_target = "vs_4_0"; + + ComPtr<ID3DBlob> blob; + ComPtr<ID3DBlob> error; + hr = D3DCompile (source, strlen (source), nullptr, nullptr, + nullptr, "main", shader_target, 0, 0, &blob, &error); + + if (FAILED (hr)) { + const gchar *err = nullptr; + if (error) + err = (const 
gchar *) error->GetBufferPointer (); + + gst_printerrln ("Couldn't compile pixel shader, error: %s", + GST_STR_NULL (err)); + return hr; + } + + *code = blob.Detach(); + + return S_OK; +} + +HRESULT +prepare_shader (ID3D11Device * d3d11_device, ID3D11DeviceContext * context, + ID3D11SamplerState ** sampler, ID3D11PixelShader ** ps, + ID3D11VertexShader ** vs, ID3D11InputLayout ** layout, + ID3D11Buffer ** vertex, ID3D11Buffer ** index) +{ + static const gchar ps_code[] = + "Texture2D shaderTexture;\n" + "SamplerState samplerState;\n" + "\n" + "struct PS_INPUT\n" + "{\n" + " float4 Position: SV_POSITION;\n" + " float3 Texture: TEXCOORD0;\n" + "};\n" + "\n" + "struct PS_OUTPUT\n" + "{\n" + " float4 Plane: SV_Target;\n" + "};\n" + "\n" + "PS_OUTPUT main(PS_INPUT input)\n" + "{\n" + " PS_OUTPUT output;\n" + " output.Plane = shaderTexture.Sample(samplerState, input.Texture);\n" + " return output;\n" + "}\n"; + + static const gchar vs_code[] = + "struct VS_INPUT\n" + "{\n" + " float4 Position : POSITION;\n" + " float4 Texture : TEXCOORD0;\n" + "};\n" + "\n" + "struct VS_OUTPUT\n" + "{\n" + " float4 Position: SV_POSITION;\n" + " float4 Texture: TEXCOORD0;\n" + "};\n" + "\n" + "VS_OUTPUT main(VS_INPUT input)\n" + "{\n" + " return input;\n" + "}\n"; + + D3D11_SAMPLER_DESC sampler_desc = { 0, }; + sampler_desc.Filter = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; + sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP; + sampler_desc.ComparisonFunc = D3D11_COMPARISON_ALWAYS; + sampler_desc.MinLOD = 0; + sampler_desc.MaxLOD = D3D11_FLOAT32_MAX; + + ComPtr<ID3D11SamplerState> sampler_state; + HRESULT hr = d3d11_device->CreateSamplerState (&sampler_desc, &sampler_state); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11SamplerState"); + return hr; + } + + ComPtr<ID3DBlob> code; + hr = d3d_compile (ps_code, TRUE, &code); + if (FAILED (hr)) + return hr; + + 
ComPtr<ID3D11PixelShader> pixel_shader; + hr = d3d11_device->CreatePixelShader (code->GetBufferPointer(), + code->GetBufferSize(), nullptr, &pixel_shader); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11PixelShader"); + return hr; + } + + hr = d3d_compile (vs_code, FALSE, code.ReleaseAndGetAddressOf()); + if (FAILED (hr)) + return hr; + + ComPtr<ID3D11VertexShader> vertex_shader; + hr = d3d11_device->CreateVertexShader (code->GetBufferPointer(), + code->GetBufferSize(), nullptr, &vertex_shader); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11VertexShader"); + return hr; + } + + D3D11_INPUT_ELEMENT_DESC input_desc[] = { + { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, + D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 }, + { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0, + D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 } + }; + + ComPtr<ID3D11InputLayout> input_layout; + hr = d3d11_device->CreateInputLayout (input_desc, G_N_ELEMENTS (input_desc), + code->GetBufferPointer(), code->GetBufferSize(), &input_layout); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11InputLayout"); + return hr; + } + + D3D11_BUFFER_DESC buffer_desc = { 0, }; + buffer_desc.Usage = D3D11_USAGE_DYNAMIC; + buffer_desc.ByteWidth = sizeof (VertexData) * 4; + buffer_desc.BindFlags = D3D11_BIND_VERTEX_BUFFER; + buffer_desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; + + ComPtr<ID3D11Buffer> vertex_buffer; + hr = d3d11_device->CreateBuffer (&buffer_desc, nullptr, &vertex_buffer); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11Buffer for vertex buffer"); + return hr; + } + + D3D11_MAPPED_SUBRESOURCE map; + hr = context->Map (vertex_buffer.Get(), 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + if (FAILED (hr)) { + gst_printerrln ("Couldn't map vertex buffer"); + return hr; + } + + VertexData *vertex_data = (VertexData *) map.pData; + vertex_data[0].position.x = -1.0f; + vertex_data[0].position.y = -1.0f; + 
vertex_data[0].position.z = 0.0f; + vertex_data[0].texture.x = 0.0f; + vertex_data[0].texture.y = 1.0f; + + vertex_data[1].position.x = -1.0f; + vertex_data[1].position.y = 1.0f; + vertex_data[1].position.z = 0.0f; + vertex_data[1].texture.x = 0.0f; + vertex_data[1].texture.y = 0.0f; + + vertex_data[2].position.x = 1.0f; + vertex_data[2].position.y = 1.0f; + vertex_data[2].position.z = 0.0f; + vertex_data[2].texture.x = 1.0f; + vertex_data[2].texture.y = 0.0f; + + vertex_data[3].position.x = 1.0f; + vertex_data[3].position.y = -1.0f; + vertex_data[3].position.z = 0.0f; + vertex_data[3].texture.x = 1.0f; + vertex_data[3].texture.y = 1.0f; + + context->Unmap (vertex_buffer.Get(), 0); + + ComPtr<ID3D11Buffer> index_buffer; + buffer_desc.ByteWidth = sizeof (WORD) * 2 * 3; + buffer_desc.BindFlags = D3D11_BIND_INDEX_BUFFER; + hr = d3d11_device->CreateBuffer (&buffer_desc, nullptr, &index_buffer); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create ID3D11Buffer for index buffer"); + return hr; + } + + hr = context->Map (index_buffer.Get(), 0, D3D11_MAP_WRITE_DISCARD, 0, &map); + if (FAILED (hr)) { + gst_printerrln ("Couldn't map index buffer"); + return hr; + } + + WORD *indices = (WORD *) map.pData; + indices[0] = 0; + indices[1] = 1; + indices[2] = 2; + + indices[3] = 3; + indices[4] = 0; + indices[5] = 2; + + context->Unmap (index_buffer.Get(), 0); + + *sampler = sampler_state.Detach(); + *ps = pixel_shader.Detach(); + *vs = vertex_shader.Detach(); + *layout = input_layout.Detach(); + *vertex = vertex_buffer.Detach(); + *index = index_buffer.Detach(); + + return S_OK; +} \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11device.h
Added
@@ -0,0 +1,63 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#include <gst/gst.h> +#include <windows.h> +#include <d3d11.h> +#include <dxgi1_2.h> + +typedef struct +{ + struct { + FLOAT x; + FLOAT y; + FLOAT z; + } position; + struct { + FLOAT x; + FLOAT y; + } texture; +} VertexData; + +HRESULT +prepare_d3d11_device (ID3D11Device ** d3d11_device, + ID3D11DeviceContext ** d3d11_context, + IDXGIFactory2 ** dxgi_factory); + +HRESULT +prepare_shared_texture (ID3D11Device * d3d11_device, + guint width, + guint height, + DXGI_FORMAT format, + UINT misc_flags, + ID3D11Texture2D ** texture, + ID3D11ShaderResourceView ** srv, + IDXGIKeyedMutex ** keyed_mutex, + HANDLE * shared_handle); + +HRESULT +prepare_shader (ID3D11Device * d3d11_device, + ID3D11DeviceContext * context, + ID3D11SamplerState ** sampler, + ID3D11PixelShader ** ps, + ID3D11VertexShader ** vs, + ID3D11InputLayout ** layout, + ID3D11Buffer ** vertex, + ID3D11Buffer ** index); \ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11screencapturesrc.cpp
Added
@@ -0,0 +1,215 @@ +/* GStreamer + * Copyright (C) 2021 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <windows.h> + +static GstDevice * +enum_devices (gboolean only_show, gint monitor_idx, guint64 monitor_handle) +{ + GstDeviceMonitor *monitor; + GstCaps *caps; + GList *devices, *iter; + gboolean ret; + guint i; + GstDevice *target = nullptr; + + monitor = gst_device_monitor_new (); + + /* Filtering by using d3d11 memory caps with "Source/Monitor" class */ + caps = gst_caps_from_string ("video/x-raw(memory:D3D11Memory)"); + ret = gst_device_monitor_add_filter (monitor, "Source/Monitor", caps); + gst_caps_unref (caps); + + if (!ret) { + gst_object_unref (monitor); + g_warning ("Failed to setup device monitor"); + return nullptr; + } + + gst_device_monitor_start (monitor); + devices = gst_device_monitor_get_devices (monitor); + + if (!devices) { + g_warning ("No detected d3d11 monitor device"); + gst_device_monitor_stop (monitor); + gst_object_unref (monitor); + return nullptr; + } + + gst_println ("Found %d monitor device(s)", g_list_length (devices)); + + for (iter = devices, i = 0; iter; iter = g_list_next (iter), i++) { + GstDevice *dev 
= GST_DEVICE (iter->data); + GstStructure *s; + guint disp_coord_left, disp_coord_top, disp_coord_right, disp_coord_bottom; + gchar *name = nullptr; + gchar *adapter_desc = nullptr; + guint64 hmonitor; + HMONITOR handle; + gboolean primary; + + s = gst_device_get_properties (dev); + + gst_structure_get (s, + "display.coordinates.left", G_TYPE_INT, &disp_coord_left, + "display.coordinates.top", G_TYPE_INT, &disp_coord_top, + "display.coordinates.right", G_TYPE_INT, &disp_coord_right, + "display.coordinates.bottom", G_TYPE_INT, &disp_coord_bottom, + "device.adapter.description", G_TYPE_STRING, &adapter_desc, + "device.hmonitor", G_TYPE_UINT64, &hmonitor, + "device.primary", G_TYPE_BOOLEAN, &primary, nullptr); + + name = gst_device_get_display_name (dev); + + handle = (HMONITOR) hmonitor; + + gst_println ("Monitor %d (%s - %s):", i, name, adapter_desc); + gst_println (" HMONITOR: %p (%" G_GUINT64_FORMAT ")", handle, hmonitor); + gst_println (" Display Coordinates (left:top:right:bottom): %d:%d:%d:%d\n", + disp_coord_left, disp_coord_top, disp_coord_right, disp_coord_bottom); + + g_free (adapter_desc); + g_free (name); + + if (!only_show && !target) { + if (monitor_handle != 0) { + if (monitor_handle == hmonitor) { + target = (GstDevice *) gst_object_ref (dev); + } + } else if (monitor_idx < 0) { + if (primary) { + target = (GstDevice *) gst_object_ref (dev); + } + } else if (monitor_idx == i) { + target = (GstDevice *) gst_object_ref (dev); + } + + if (target) + gst_println ("Found target monitor device"); + } + } + g_list_free_full (devices, gst_object_unref); + + return target; +} + +static gboolean +bus_msg (GstBus * bus, GstMessage * msg, GMainLoop * loop) +{ + switch (GST_MESSAGE_TYPE (msg)) { + case GST_MESSAGE_ERROR:{ + GError *err; + gchar *dbg; + + gst_message_parse_error (msg, &err, &dbg); + g_printerr ("ERROR %s \n", err->message); + if (dbg != NULL) + g_printerr ("ERROR debug information: %s\n", dbg); + g_clear_error (&err); + g_free (dbg); + + 
g_main_loop_quit (loop); + break; + } + default: + break; + } + + return TRUE; +} + +gint +main (gint argc, gchar ** argv) +{ + GstElement *pipeline, *src, *queue, *sink; + GMainLoop *loop; + gboolean ret; + gboolean show_devices = FALSE; + gint64 hmonitor = 0; + gint monitor_index = -1; + GError *err = nullptr; + GstDevice *device; + GOptionContext *option_ctx; + GOptionEntry options[] = { + {"show-devices", 0, 0, G_OPTION_ARG_NONE, &show_devices, + "Display available monitor devices", nullptr}, + {"hmonitor", 0, 0, G_OPTION_ARG_INT64, &hmonitor, + "Address of HMONITOR handle", nullptr}, + {"index", 0, 0, G_OPTION_ARG_INT, &monitor_index, + "Monitor index to capture (-1 for primary monitor)", nullptr}, + {nullptr} + }; + + option_ctx = g_option_context_new ("D3D11 screen capture example"); + g_option_context_add_main_entries (option_ctx, options, NULL); + g_option_context_add_group (option_ctx, gst_init_get_option_group ()); + ret = g_option_context_parse (option_ctx, &argc, &argv, &err); + g_option_context_free (option_ctx); + + if (!ret) { + g_printerr ("option parsing failed: %s\n", err->message); + g_clear_error (&err); + return 1; + } + + device = enum_devices (show_devices, monitor_index, (guint64) hmonitor); + if (show_devices) { + gst_clear_object (&device); + return 0; + } + + if (!device) { + gst_println ("Failed to find monitor device"); + return 1; + } + + src = gst_device_create_element (device, nullptr); + gst_object_unref (device); + if (!src) { + g_warning ("Failed to create d3d11screencapture element"); + return 1; + } + + loop = g_main_loop_new (nullptr, FALSE); + pipeline = gst_pipeline_new (nullptr); + + queue = gst_element_factory_make ("queue", nullptr); + sink = gst_element_factory_make ("d3d11videosink", nullptr); + + gst_bin_add_many (GST_BIN (pipeline), src, queue, sink, nullptr); + gst_element_link_many (src, queue, sink, nullptr); + + gst_bus_add_watch (GST_ELEMENT_BUS (pipeline), (GstBusFunc) bus_msg, loop); + gst_element_set_state 
(pipeline, GST_STATE_PLAYING); + + g_main_loop_run (loop); + + gst_element_set_state (pipeline, GST_STATE_NULL); + gst_bus_remove_watch (GST_ELEMENT_BUS (pipeline)); + + gst_object_unref (pipeline); + g_main_loop_unref (loop); + + return 0; +}
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11videosink-kb.c
Changed
(renamed from tests/examples/d3d11videosink/d3d11videosink-kb.c)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11videosink-kb.h
Changed
(renamed from tests/examples/d3d11videosink/d3d11videosink-kb.h)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11videosink-shared-texture-d3d9ex.cpp
Added
@@ -0,0 +1,398 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/video/video.h> + +#include "d3d11device.h" +#include <wrl.h> +#include <mutex> +#include <d3d9.h> + +using namespace Microsoft::WRL; + +static gchar *uri = nullptr; + +static GMainLoop *loop = nullptr; +static gboolean visible = FALSE; +static HWND hwnd = nullptr; +std::mutex lock; + +/* Device handles */ +ComPtr<ID3D11Device> device; +ComPtr<ID3D11DeviceContext> context; +ComPtr<IDXGIFactory2> factory; + +ComPtr<IDirect3D9Ex> d3d9_handle; +ComPtr<IDirect3DDevice9Ex> d3d9_device; + +/* SwapChain resources */ +ComPtr<IDirect3DSwapChain9> swapchain; + +/* Shared texture resources */ +ComPtr<ID3D11Texture2D> shared_texture; +ComPtr<IDirect3DTexture9> shared_d3d9_texture; +ComPtr<IDirect3DSurface9> d3d9_surface; +HANDLE shared_handle = nullptr; + +static void +on_begin_draw (GstElement * videosink, gpointer user_data) +{ + std::lock_guard<std::mutex> lk (lock); + GstElement *sink = GST_ELEMENT (user_data); + gboolean ret = TRUE; + HRESULT hr; + ComPtr<IDirect3DSurface9> backbuffer; + + /* Window was destroyed, nothing to draw */ + if
(!hwnd) + return; + + if (!shared_handle) { + gst_printerrln ("Shared handle wasn't configured"); + exit (-1); + } + + if (!swapchain) { + gst_printerrln ("SwapChain wasn't configured"); + exit (-1); + } + + g_signal_emit_by_name (sink, + "draw", shared_handle, D3D11_RESOURCE_MISC_SHARED, 0, 0, &ret); + + if (!ret) { + gst_printerrln ("Failed to draw on shared handle"); + if (loop) + g_main_loop_quit (loop); + + return; + } + + /* Get swapchain's backbuffer */ + hr = swapchain->GetBackBuffer (0, D3DBACKBUFFER_TYPE_MONO, &backbuffer); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get backbuffer"); + exit (-1); + } + + /* Copy shared texture to backbuffer */ + hr = d3d9_device->StretchRect (d3d9_surface.Get(), nullptr, + backbuffer.Get(), nullptr, D3DTEXF_LINEAR); + if (FAILED (hr)) { + gst_printerrln ("StretchRect failed"); + exit (-1); + } + + hr = d3d9_device->BeginScene (); + if (FAILED (hr)) { + gst_printerrln ("BeginScene failed"); + exit (-1); + } + + hr = swapchain->Present (nullptr, nullptr, nullptr, nullptr, 0); + if (FAILED (hr)) { + gst_printerrln ("Present failed"); + exit (-1); + } + + hr = d3d9_device->EndScene (); + if (FAILED (hr)) { + gst_printerrln ("EndScene failed"); + exit (-1); + } +} + +static void +on_resize (void) +{ + std::lock_guard<std::mutex> lk (lock); + RECT client_rect; + guint width, height; + HRESULT hr; + + GetClientRect (hwnd, &client_rect); + + width = MAX (1, (client_rect.right - client_rect.left)); + height = MAX (1, (client_rect.bottom - client_rect.top)); + + D3DPRESENT_PARAMETERS params = { 0, }; + + if (!swapchain) { + params.Windowed = TRUE; + params.SwapEffect = D3DSWAPEFFECT_DISCARD; + params.hDeviceWindow = hwnd; + /* GST_VIDEO_FORMAT_BGRA */ + params.BackBufferFormat = D3DFMT_A8R8G8B8; + + hr = d3d9_device->CreateAdditionalSwapChain (&params, &swapchain); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create swapchain"); + exit (-1); + } + } else { + /* Check whether we need to re-create swapchain */ + hr = 
swapchain->GetPresentParameters (&params); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get swapchain parameters"); + exit (-1); + } + + if (params.BackBufferWidth != width || params.BackBufferHeight != height) { + /* Backbuffer will have client area size */ + params.BackBufferWidth = 0; + params.BackBufferHeight = 0; + + swapchain = nullptr; + hr = d3d9_device->CreateAdditionalSwapChain (&params, &swapchain); + if (FAILED (hr)) { + gst_printerrln ("Couldn't re-create swapchain"); + exit (-1); + } + } + } +} + +static LRESULT CALLBACK +window_proc (HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam) +{ + switch (message) { + case WM_DESTROY: + { + std::lock_guard<std::mutex> lk (lock); + hwnd = NULL; + if (loop) + g_main_loop_quit (loop); + break; + } + case WM_SIZE: + on_resize (); + break; + default: + break; + } + + return DefWindowProc (hWnd, message, wParam, lParam); +} + +static gboolean +bus_msg (GstBus * bus, GstMessage * msg, GstElement * pipeline) +{ + switch (GST_MESSAGE_TYPE (msg)) { + case GST_MESSAGE_ASYNC_DONE: + /* make window visible when we have something to show */ + if (!visible && hwnd) { + ShowWindow (hwnd, SW_SHOW); + visible = TRUE; + } + gst_element_set_state (pipeline, GST_STATE_PLAYING); + break; + case GST_MESSAGE_ERROR:{ + GError *err; + gchar *dbg; + + gst_message_parse_error (msg, &err, &dbg); + g_printerr ("ERROR %s \n", err->message); + if (dbg != NULL) + g_printerr ("ERROR debug information: %s\n", dbg); + g_clear_error (&err); + g_free (dbg); + + g_main_loop_quit (loop); + break; + } + default: + break; + } + + return TRUE; +} + +static gboolean +msg_cb (GIOChannel * source, GIOCondition condition, gpointer data) +{ + MSG msg; + + if (!PeekMessage (&msg, NULL, 0, 0, PM_REMOVE)) + return G_SOURCE_CONTINUE; + + TranslateMessage (&msg); + DispatchMessage (&msg); + + return G_SOURCE_CONTINUE; +} + +gint +main (gint argc, gchar ** argv) +{ + GstElement *pipeline, *sink; + GstStateChangeReturn sret; + WNDCLASSEX wc = { 0, }; + HINSTANCE 
hinstance = GetModuleHandle (NULL); + GIOChannel *msg_io_channel = NULL; + RECT wr = { 0, 0, 320, 240 }; + HRESULT hr; + GOptionEntry options[] = { + {"uri", 0, 0, G_OPTION_ARG_STRING, &uri, + "URI to test (if unspecified, videotestsrc will be used)", + NULL}, + {NULL} + }; + GOptionContext *option_ctx; + gboolean ret; + GError *error = nullptr; + + option_ctx = g_option_context_new ("d3d11videosink shared-texture with d3d9 interop example"); + g_option_context_add_main_entries (option_ctx, options, NULL); + g_option_context_add_group (option_ctx, gst_init_get_option_group ()); + ret = g_option_context_parse (option_ctx, &argc, &argv, &error); + g_option_context_free (option_ctx); + + if (!ret) { + g_printerr ("option parsing failed: %s\n", error->message); + g_clear_error (&error); + exit (1); + } + + /* 1) Prepare window */ + wc.cbSize = sizeof (WNDCLASSEX); + wc.style = CS_HREDRAW | CS_VREDRAW; + wc.lpfnWndProc = (WNDPROC) window_proc; + wc.hInstance = hinstance; + wc.hCursor = LoadCursor (NULL, IDC_ARROW); + wc.lpszClassName = TEXT ("GstD3D11VideoSinkSharedTextureD3D9ExExample"); + RegisterClassEx (&wc); + + AdjustWindowRect (&wr, WS_OVERLAPPEDWINDOW, FALSE); + + hwnd = CreateWindowEx (0, wc.lpszClassName, + TEXT ("GstD3D11VideoSinkSharedTextureD3D9ExExample"), + WS_CLIPSIBLINGS | WS_CLIPCHILDREN | WS_OVERLAPPEDWINDOW, + CW_USEDEFAULT, CW_USEDEFAULT, + wr.right - wr.left, wr.bottom - wr.top, (HWND) NULL, (HMENU) NULL, + hinstance, NULL); + + /* 2) Prepare D3D11 device */ + hr = prepare_d3d11_device (&device, &context, &factory); + if (FAILED (hr)) { + gst_printerrln ("D3D11 device is unavailable"); + return -1; + } + + /* 3) Prepare D3D9EX device */ + Direct3DCreate9Ex (D3D_SDK_VERSION, &d3d9_handle); + if (!d3d9_handle) { + gst_printerrln ("D3D9 handle is unavailable"); + return -1; + } + + D3DPRESENT_PARAMETERS params = { 0,}; + params.Windowed = TRUE; + params.SwapEffect = D3DSWAPEFFECT_DISCARD; + params.hDeviceWindow = hwnd; + params.PresentationInterval =
D3DPRESENT_INTERVAL_IMMEDIATE; + + hr = d3d9_handle->CreateDeviceEx (D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, + D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_MULTITHREADED | D3DCREATE_FPU_PRESERVE, + &params, nullptr, &d3d9_device); + if (FAILED (hr)) { + gst_printerrln ("D3D9 device is unavailable"); + return -1; + } + + /* 4) Create shared texture */ + /* Texture size doesn't need to be identical to that of backbuffer */ + hr = prepare_shared_texture (device.Get(), 1280, 720, + DXGI_FORMAT_B8G8R8A8_UNORM, + /* NOTE: D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX is incompatible with + * d3d9. User should use D3D11_RESOURCE_MISC_SHARED in case of d3d9 */ + D3D11_RESOURCE_MISC_SHARED, + &shared_texture, nullptr, nullptr, &shared_handle); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create texture to share with d3d11videosink"); + return -1; + } + + hr = d3d9_device->CreateTexture (1280, 720, 1, D3DUSAGE_RENDERTARGET, + D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &shared_d3d9_texture, &shared_handle); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create shared d3d9 texture"); + return -1; + } + + hr = shared_d3d9_texture->GetSurfaceLevel (0, &d3d9_surface); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get surface from shared d3d9 texture"); + return -1; + } + + /* Call initial resize to prepare swapchain */ + on_resize(); + + loop = g_main_loop_new (NULL, FALSE); + msg_io_channel = g_io_channel_win32_new_messages ((gsize) hwnd); + g_io_add_watch (msg_io_channel, G_IO_IN, msg_cb, NULL); + + /* Enable drawing on our texture and add signal handler */ + sink = gst_element_factory_make ("d3d11videosink", NULL); + g_object_set (sink, "draw-on-shared-texture", TRUE, nullptr); + g_signal_connect (sink, "begin-draw", G_CALLBACK (on_begin_draw), sink); + + if (uri) { + pipeline = gst_element_factory_make ("playbin", nullptr); + g_object_set (pipeline, "uri", uri, "video-sink", sink, nullptr); + } else { + GstElement *src = gst_element_factory_make ("videotestsrc", nullptr); + +
pipeline = gst_pipeline_new ("d3d11videosink-pipeline"); + gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL); + gst_element_link (src, sink); + } + + gst_bus_add_watch (GST_ELEMENT_BUS (pipeline), (GstBusFunc) bus_msg, + pipeline); + + sret = gst_element_set_state (pipeline, GST_STATE_PAUSED); + if (sret == GST_STATE_CHANGE_FAILURE) { + g_printerr ("Pipeline doesn't want to pause\n"); + } else { + g_main_loop_run (loop); + gst_element_set_state (pipeline, GST_STATE_NULL); + } + + gst_bus_remove_watch (GST_ELEMENT_BUS (pipeline)); + + if (hwnd) + DestroyWindow (hwnd); + + gst_object_unref (pipeline); + if (msg_io_channel) + g_io_channel_unref (msg_io_channel); + g_main_loop_unref (loop); + + gst_deinit (); + g_free (uri); + + return 0; +}
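The d3d11videosink-shared-texture-d3d9ex example above hinges on two interop constraints: the D3D11 texture must be created with the legacy D3D11_RESOURCE_MISC_SHARED flag (not the keyed-mutex variant, which D3D9Ex cannot open), and the DXGI format on the D3D11 side must pair with the matching D3DFMT on the D3D9 side (DXGI_FORMAT_B8G8R8A8_UNORM with D3DFMT_A8R8G8B8). A stand-alone C sketch of those constraints; the constants are copied from the Windows SDK headers and defined locally here only so the sketch builds without them:

```c
#include <stdint.h>

/* Values as documented in d3d11.h / dxgiformat.h / d3d9types.h
 * (assumed here so the sketch compiles without the Windows SDK). */
#define D3D11_RESOURCE_MISC_SHARED            0x2u
#define D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX 0x10u
#define DXGI_FORMAT_B8G8R8A8_UNORM            87u
#define D3DFMT_A8R8G8B8                       21u

/* D3D9Ex can only open legacy shared handles, so a texture meant for
 * d3d9 interop must carry SHARED but not SHARED_KEYEDMUTEX. */
static int
misc_flags_ok_for_d3d9 (uint32_t misc_flags)
{
  return (misc_flags & D3D11_RESOURCE_MISC_SHARED) != 0 &&
      (misc_flags & D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX) == 0;
}

/* Map the DXGI format used on the D3D11 side to the D3D9 format the
 * example passes to CreateTexture(); returns 0 for unsupported formats. */
static uint32_t
d3d9_format_for_dxgi (uint32_t dxgi_format)
{
  if (dxgi_format == DXGI_FORMAT_B8G8R8A8_UNORM)
    return D3DFMT_A8R8G8B8;
  return 0;
}
```

This mirrors why the example passes D3D11_RESOURCE_MISC_SHARED to prepare_shared_texture() and D3DFMT_A8R8G8B8 to CreateTexture(); other format pairings would need their own mapping entries.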
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11videosink-shared-texture.cpp
Added
@@ -0,0 +1,424 @@ +/* + * GStreamer + * Copyright (C) 2020 Seungha Yang <seungha@centricular.com> + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Library General Public + * License as published by the Free Software Foundation; either + * version 2 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Library General Public License for more details. + * + * You should have received a copy of the GNU Library General Public + * License along with this library; if not, write to the + * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, + * Boston, MA 02110-1301, USA. + */ + +#ifdef HAVE_CONFIG_H +#include "config.h" +#endif + +#include <gst/gst.h> +#include <gst/video/video.h> + +#include "d3d11device.h" +#include <wrl.h> +#include <mutex> + +using namespace Microsoft::WRL; + +static gboolean use_keyed_mutex = FALSE; +static gboolean use_nt_handle = FALSE; +static gchar *texture_foramt = nullptr; +static gchar *uri = nullptr; + +static GMainLoop *loop = nullptr; +static gboolean visible = FALSE; +static HWND hwnd = nullptr; +std::mutex lock; + +/* Device handles */ +ComPtr<ID3D11Device> device; +ComPtr<ID3D11DeviceContext> context; +ComPtr<IDXGIFactory2> factory; + +/* SwapChain resources */ +ComPtr<IDXGISwapChain1> swapchain; +ComPtr<ID3D11RenderTargetView> rtv; + +/* Shared texture resources */ +ComPtr<ID3D11Texture2D> shared_texture; +ComPtr<ID3D11ShaderResourceView> srv; +ComPtr<IDXGIKeyedMutex> keyed_mutex; +HANDLE shared_handle = nullptr; +UINT misc_flags = 0; + +static void +on_begin_draw (GstElement * videosink, gpointer user_data) +{ + std::lock_guard<std::mutex> lk (lock); + GstElement *sink = GST_ELEMENT (user_data); + gboolean ret = TRUE; + HRESULT hr; + + /* Window was
destroyed, nothing to draw */ + if (!hwnd) + return; + + if (!shared_handle) { + gst_printerrln ("Shared handle wasn't configured"); + exit (-1); + } + + g_signal_emit_by_name (sink, + "draw", shared_handle, misc_flags, 0, 0, &ret); + + if (!ret) { + gst_printerrln ("Failed to draw on shared handle"); + if (loop) + g_main_loop_quit (loop); + + return; + } + + /* Acquire sync */ + if (keyed_mutex) { + hr = keyed_mutex->AcquireSync (0, INFINITE); + if (FAILED (hr)) { + gst_printerrln ("Failed to acquire sync"); + exit (-1); + } + } + + ID3D11RenderTargetView *render_target_view = rtv.Get(); + context->OMSetRenderTargets (1, &render_target_view, nullptr); + + ID3D11ShaderResourceView *shader_resource = srv.Get(); + context->PSSetShaderResources (0, 1, &shader_resource); + + context->DrawIndexed (6, 0, 0); + if (keyed_mutex) + keyed_mutex->ReleaseSync (0); + + swapchain->Present (0, 0); +} + +static void +on_resize (void) +{ + std::lock_guard<std::mutex> lk (lock); + ComPtr<ID3D11Texture2D> backbuffer; + + rtv = nullptr; + + HRESULT hr = swapchain->ResizeBuffers (0, + /* Specify zero width/height to use the size of current client area */ + 0, 0, + /* Reuse configured format */ + DXGI_FORMAT_UNKNOWN, + 0); + if (FAILED (hr)) { + gst_printerrln ("Couldn't resize swapchain"); + exit(-1); + } + + hr = swapchain->GetBuffer (0, IID_PPV_ARGS (&backbuffer)); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get backbuffer from swapchain"); + exit(-1); + } + + hr = device->CreateRenderTargetView (backbuffer.Get(), nullptr, &rtv); + if (FAILED (hr)) { + gst_printerrln ("Couldn't get ID3D11RenderTargetView from backbuffer"); + exit(-1); + } + + D3D11_TEXTURE2D_DESC desc; + backbuffer->GetDesc(&desc); + + D3D11_VIEWPORT viewport; + viewport.TopLeftX = 0; + viewport.TopLeftY = 0; + viewport.Width = desc.Width; + viewport.Height = desc.Height; + viewport.MinDepth = 0.0f; + viewport.MaxDepth = 1.0f; + + context->RSSetViewports (1, &viewport); +} + +static LRESULT CALLBACK +window_proc 
(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam) +{ + switch (message) { + case WM_DESTROY: + { + std::lock_guard<std::mutex> lk (lock); + hwnd = NULL; + if (loop) + g_main_loop_quit (loop); + break; + } + case WM_SIZE: + on_resize (); + break; + default: + break; + } + + return DefWindowProc (hWnd, message, wParam, lParam); +} + +static gboolean +bus_msg (GstBus * bus, GstMessage * msg, GstElement * pipeline) +{ + switch (GST_MESSAGE_TYPE (msg)) { + case GST_MESSAGE_ASYNC_DONE: + /* make window visible when we have something to show */ + if (!visible && hwnd) { + ShowWindow (hwnd, SW_SHOW); + visible = TRUE; + } + gst_element_set_state (pipeline, GST_STATE_PLAYING); + break; + case GST_MESSAGE_ERROR:{ + GError *err; + gchar *dbg; + + gst_message_parse_error (msg, &err, &dbg); + g_printerr ("ERROR %s \n", err->message); + if (dbg != NULL) + g_printerr ("ERROR debug information: %s\n", dbg); + g_clear_error (&err); + g_free (dbg); + + g_main_loop_quit (loop); + break; + } + default: + break; + } + + return TRUE; +} + +static gboolean +msg_cb (GIOChannel * source, GIOCondition condition, gpointer data) +{ + MSG msg; + + if (!PeekMessage (&msg, NULL, 0, 0, PM_REMOVE)) + return G_SOURCE_CONTINUE; + + TranslateMessage (&msg); + DispatchMessage (&msg); + + return G_SOURCE_CONTINUE; +} + +gint +main (gint argc, gchar ** argv) +{ + GstElement *pipeline, *sink; + GstStateChangeReturn sret; + WNDCLASSEX wc = { 0, }; + HINSTANCE hinstance = GetModuleHandle (NULL); + GIOChannel *msg_io_channel = NULL; + RECT wr = { 0, 0, 320, 240 }; + ComPtr<ID3D11SamplerState> sampler; + ComPtr<ID3D11PixelShader> ps; + ComPtr<ID3D11VertexShader> vs; + ComPtr<ID3D11InputLayout> layout; + ComPtr<ID3D11Buffer> vertex; + ComPtr<ID3D11Buffer> index; + HRESULT hr; + GOptionEntry options[] = { + {"use-keyed-mutex", 0, 0, G_OPTION_ARG_NONE, &use_keyed_mutex, + "Allocate shared texture with D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX flag", + NULL}, + {"use-nt-handle", 0, 0, G_OPTION_ARG_NONE, 
&use_nt_handle, + "Allocate shared texture with D3D11_RESOURCE_MISC_SHARED_NTHANDLE flag", + NULL}, + {"texture-format", 0, 0, G_OPTION_ARG_STRING, &texture_foramt, + "texture format to test, supported arguments are { BGRA, RGBA, RGB10A2_LE }", + NULL}, + {"uri", 0, 0, G_OPTION_ARG_STRING, &uri, + "URI to test (if unspecified, videotestsrc will be used)", + NULL}, + {NULL} + }; + GOptionContext *option_ctx; + gboolean ret; + GError *error = nullptr; + DXGI_FORMAT format = DXGI_FORMAT_B8G8R8A8_UNORM; + + option_ctx = g_option_context_new ("d3d11videosink shared-texture example"); + g_option_context_add_main_entries (option_ctx, options, NULL); + g_option_context_add_group (option_ctx, gst_init_get_option_group ()); + ret = g_option_context_parse (option_ctx, &argc, &argv, &error); + g_option_context_free (option_ctx); + + if (!ret) { + g_printerr ("option parsing failed: %s\n", error->message); + g_clear_error (&error); + exit (1); + } + + if (g_strcmp0 (texture_foramt, "RGBA") == 0) { + gst_println ("Use DXGI_FORMAT_R8G8B8A8_UNORM (RGBA) format"); + format = DXGI_FORMAT_R8G8B8A8_UNORM; + } else if (g_strcmp0 (texture_foramt, "RGB10A2_LE") == 0) { + gst_println ("Use DXGI_FORMAT_R10G10B10A2_UNORM (RGB10A2_LE) format"); + format = DXGI_FORMAT_R10G10B10A2_UNORM; + } else { + gst_println ("Use DXGI_FORMAT_B8G8R8A8_UNORM format"); + format = DXGI_FORMAT_B8G8R8A8_UNORM; + } + + /* NT handle needs to be used with keyed mutex */ + if (use_nt_handle) { + misc_flags = D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX | D3D11_RESOURCE_MISC_SHARED_NTHANDLE; + } else if (use_keyed_mutex) { + misc_flags = D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX; + } else { + misc_flags = D3D11_RESOURCE_MISC_SHARED; + } + + gst_println ("Use keyed-mutex: %d, use_nt_handle: %d", use_keyed_mutex, + use_nt_handle); + + /* 1) Prepare window */ + wc.cbSize = sizeof (WNDCLASSEX); + wc.style = CS_HREDRAW | CS_VREDRAW; + wc.lpfnWndProc = (WNDPROC) window_proc; + wc.hInstance = hinstance; + wc.hCursor = LoadCursor
(NULL, IDC_ARROW); + wc.lpszClassName = TEXT ("GstD3D11VideoSinkSharedTextureExExample"); + RegisterClassEx (&wc); + + AdjustWindowRect (&wr, WS_OVERLAPPEDWINDOW, FALSE); + + hwnd = CreateWindowEx (0, wc.lpszClassName, + TEXT ("GstD3D11VideoSinkSharedTextureExExample"), + WS_CLIPSIBLINGS | WS_CLIPCHILDREN | WS_OVERLAPPEDWINDOW, + CW_USEDEFAULT, CW_USEDEFAULT, + wr.right - wr.left, wr.bottom - wr.top, (HWND) NULL, (HMENU) NULL, + hinstance, NULL); + + /* 2) Prepare D3D11 device */ + hr = prepare_d3d11_device (&device, &context, &factory); + if (FAILED (hr)) { + gst_printerrln ("D3D11 device is unavailable"); + return -1; + } + + hr = prepare_shader (device.Get(), context.Get(), &sampler, + &ps, &vs, &layout, &vertex, &index); + if (FAILED (hr)) { + gst_printerrln ("Couldn't setup shader"); + return -1; + } + + /* 3) Prepare SwapChain */ + DXGI_SWAP_CHAIN_DESC1 desc = { 0, }; + desc.Width = 0; + desc.Height = 0; + desc.Format = format; + desc.Stereo = FALSE; + desc.SampleDesc.Count = 1; + desc.SampleDesc.Quality = 0; + desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT; + desc.BufferCount = 2; + desc.Scaling = DXGI_SCALING_NONE; + desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL; + desc.AlphaMode = DXGI_ALPHA_MODE_UNSPECIFIED; + + hr = factory->CreateSwapChainForHwnd (device.Get(), hwnd, &desc, + nullptr, nullptr, &swapchain); + if (FAILED (hr)) { + gst_printerrln ("IDXGISwapChain1 is unavailable"); + return -1; + } + + /* 4) Create shared texture */ + /* Texture size doesn't need to be identical to that of backbuffer */ + hr = prepare_shared_texture (device.Get(), 1280, 720, + format, misc_flags, &shared_texture, &srv, &keyed_mutex, &shared_handle); + if (FAILED (hr)) { + gst_printerrln ("Couldn't create texture to share with d3d11videosink"); + return -1; + } + + context->IASetPrimitiveTopology (D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST); + context->IASetInputLayout (layout.Get()); + ID3D11Buffer *buf = vertex.Get(); + UINT offsets = 0; + UINT stride = 
sizeof(VertexData); + context->IASetVertexBuffers (0, 1, &buf, &stride, &offsets); + context->IASetIndexBuffer (index.Get(), DXGI_FORMAT_R16_UINT, 0); + + ID3D11SamplerState *sampler_state = sampler.Get(); + context->PSSetSamplers (0, 1, &sampler_state); + context->VSSetShader (vs.Get(), nullptr, 0); + context->PSSetShader (ps.Get(), nullptr, 0); + + /* Call initial resize to prepare resources */ + on_resize(); + + loop = g_main_loop_new (NULL, FALSE); + msg_io_channel = g_io_channel_win32_new_messages ((gsize) hwnd); + g_io_add_watch (msg_io_channel, G_IO_IN, msg_cb, NULL); + + /* Enable drawing on our texture and add signal handler */ + sink = gst_element_factory_make ("d3d11videosink", NULL); + g_object_set (sink, "draw-on-shared-texture", TRUE, nullptr); + g_signal_connect (sink, "begin-draw", G_CALLBACK (on_begin_draw), sink); + + if (uri) { + pipeline = gst_element_factory_make ("playbin", nullptr); + g_object_set (pipeline, "uri", uri, "video-sink", sink, nullptr); + } else { + GstElement *src = gst_element_factory_make ("videotestsrc", nullptr); + + pipeline = gst_pipeline_new ("d3d11videosink-pipeline"); + gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL); + gst_element_link (src, sink); + } + + gst_bus_add_watch (GST_ELEMENT_BUS (pipeline), (GstBusFunc) bus_msg, + pipeline); + + sret = gst_element_set_state (pipeline, GST_STATE_PAUSED); + if (sret == GST_STATE_CHANGE_FAILURE) { + g_printerr ("Pipeline doesn't want to pause\n"); + } else { + g_main_loop_run (loop); + gst_element_set_state (pipeline, GST_STATE_NULL); + } + + gst_bus_remove_watch (GST_ELEMENT_BUS (pipeline)); + + if (hwnd) + DestroyWindow (hwnd); + + gst_object_unref (pipeline); + if (msg_io_channel) + g_io_channel_unref (msg_io_channel); + g_main_loop_unref (loop); + + gst_deinit (); + + g_free (texture_foramt); + g_free (uri); + + /* NT handle should be explicitly closed to avoid leak */ + if (use_nt_handle) + CloseHandle (shared_handle); + + return 0; +}
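In the d3d11videosink-shared-texture example above, the --use-nt-handle and --use-keyed-mutex options map onto D3D11 misc flags, with an NT handle always implying a keyed mutex. That selection logic as a self-contained C sketch; the flag values are copied from d3d11.h and defined locally here only so the sketch compiles without the Windows SDK:

```c
#include <stdint.h>

/* Values as documented in d3d11.h (assumed here, see lead-in). */
#define D3D11_RESOURCE_MISC_SHARED            0x2u
#define D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX 0x10u
#define D3D11_RESOURCE_MISC_SHARED_NTHANDLE   0x800u

static uint32_t
select_misc_flags (int use_nt_handle, int use_keyed_mutex)
{
  /* NT handle needs to be used with keyed mutex */
  if (use_nt_handle)
    return D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX |
        D3D11_RESOURCE_MISC_SHARED_NTHANDLE;
  if (use_keyed_mutex)
    return D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX;
  /* Legacy shared handle: no synchronization, but d3d9-compatible */
  return D3D11_RESOURCE_MISC_SHARED;
}
```

The resulting flags are what the example passes both to prepare_shared_texture() and to the sink's "draw" action signal, and they also decide whether on_begin_draw() has an IDXGIKeyedMutex to acquire and release around rendering.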
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/d3d11videosink.c
Changed
(renamed from tests/examples/d3d11videosink/d3d11videosink.c)
gst-plugins-bad-1.20.1.tar.xz/tests/examples/d3d11/meson.build
Added
@@ -0,0 +1,47 @@ +if host_system == 'windows' + executable('d3d11videosink', + ['d3d11videosink.c', 'd3d11videosink-kb.c'], + c_args : gst_plugins_bad_args, + include_directories : [configinc, libsinc], + dependencies: [gst_dep, gstbase_dep, gstvideo_dep], + install: false, + ) + + executable('d3d11screencapturesrc', + ['d3d11screencapturesrc.cpp'], + c_args : gst_plugins_bad_args, + include_directories : [configinc, libsinc], + dependencies: [gst_dep, gstbase_dep, gstvideo_dep], + install: false, + ) + + d3d11_lib = cc.find_library('d3d11', required : false) + dxgi_lib = cc.find_library('dxgi', required : false) + d3dcompiler_lib = cc.find_library('d3dcompiler', required: false) + have_d3d11_h = cc.has_header('d3d11.h') + have_dxgi_h = cc.has_header('dxgi1_2.h') + have_d3d11compiler_h = cc.has_header('d3dcompiler.h') + + d3d9_dep = cc.find_library('d3d9', required : false) + have_d3d9_h = cc.has_header('d3d9.h') + + if d3d11_lib.found() and dxgi_lib.found() and d3dcompiler_lib.found() and have_d3d11_h and have_dxgi_h and have_d3d11compiler_h + executable('d3d11videosink-shared-texture', + ['d3d11videosink-shared-texture.cpp', 'd3d11device.cpp'], + c_args : gst_plugins_bad_args, + include_directories : [configinc, libsinc], + dependencies: [gst_dep, gstbase_dep, gstvideo_dep, d3d11_lib, dxgi_lib, d3dcompiler_lib], + install: false, + ) + + if d3d9_dep.found() and have_d3d9_h + executable('d3d11videosink-shared-texture-d3d9ex', + ['d3d11videosink-shared-texture-d3d9ex.cpp', 'd3d11device.cpp'], + c_args : gst_plugins_bad_args, + include_directories : [configinc, libsinc], + dependencies: [gst_dep, gstbase_dep, gstvideo_dep, d3d11_lib, dxgi_lib, d3dcompiler_lib, d3d9_dep], + install: false, + ) + endif + endif +endif
gst-plugins-bad-1.18.6.tar.xz/tests/examples/meson.build -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/meson.build
Changed
@@ -2,7 +2,7 @@ subdir('avsamplesink') subdir('camerabin2') subdir('codecparsers') -subdir('d3d11videosink') +subdir('d3d11') subdir('directfb') subdir('ipcpipeline') subdir('mpegts') @@ -14,6 +14,7 @@ subdir('va') subdir('waylandsink') subdir('webrtc') +subdir('wpe') executable('playout', 'playout.c',
gst-plugins-bad-1.18.6.tar.xz/tests/examples/mpegts/meson.build -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/mpegts/meson.build
Changed
@@ -1,4 +1,4 @@ -foreach fname : ['ts-parser.c', 'ts-section-writer.c', 'ts-scte-writer.c'] +foreach fname : ['ts-parser.c', 'ts-section-writer.c', 'ts-scte-writer.c', 'tsmux-prog-map.c'] exe_name = fname.split('.').get(0).underscorify() executable(exe_name,
gst-plugins-bad-1.18.6.tar.xz/tests/examples/mpegts/ts-parser.c -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/mpegts/ts-parser.c
Changed
@@ -80,12 +80,20 @@ #define dump_memory_content(desc, spacing) dump_memory_bytes((desc)->data + 2, (desc)->length, spacing) static const gchar * -descriptor_name (gint val) +descriptor_name (GstMpegtsDescriptor * desc) { - GEnumValue *en; + GEnumValue *en = NULL; + gint val = desc->tag; - en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek - (GST_TYPE_MPEGTS_DESCRIPTOR_TYPE)), val); + /* Treat extended descriptors */ + if (val == 0x7f) { + en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek + (GST_TYPE_MPEGTS_DVB_EXTENDED_DESCRIPTOR_TYPE)), + desc->tag_extension); + } + if (en == NULL) + en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek + (GST_TYPE_MPEGTS_DESCRIPTOR_TYPE)), val); if (en == NULL) /* Else try with DVB enum types */ en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek @@ -99,6 +107,10 @@ en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek (GST_TYPE_MPEGTS_ISDB_DESCRIPTOR_TYPE)), val); if (en == NULL) + /* Else try with SCTE enum types */ + en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek + (GST_TYPE_MPEGTS_SCTE_DESCRIPTOR_TYPE)), val); + if (en == NULL) /* Else try with misc enum types */ en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek (GST_TYPE_MPEGTS_MISC_DESCRIPTOR_TYPE)), val); @@ -139,6 +151,10 @@ en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek (GST_TYPE_MPEGTS_STREAM_TYPE)), val); if (en == NULL) + /* Else try with HDMV enum types */ + en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek + (GST_TYPE_MPEGTS_HDMV_STREAM_TYPE)), val); + if (en == NULL) /* Else try with SCTE enum types */ en = g_enum_get_value (G_ENUM_CLASS (g_type_class_peek (GST_TYPE_MPEGTS_SCTE_STREAM_TYPE)), val); @@ -502,284 +518,311 @@ } } +#define SAFE_CHAR(a) (g_ascii_isprint(a) ? 
a : '.') + +/* Single descriptor dump + * Descriptors that can only appear in specific tables should be handled before */ static void -dump_descriptors (GPtrArray * descriptors, guint spacing) +dump_generic_descriptor (GstMpegtsDescriptor * desc, guint spacing) { - guint i; - - for (i = 0; i < descriptors->len; i++) { - GstMpegtsDescriptor *desc = g_ptr_array_index (descriptors, i); - g_printf ("%*s [descriptor 0x%02x (%s) length:%d]\n", spacing, "", - desc->tag, descriptor_name (desc->tag), desc->length); - if (DUMP_DESCRIPTORS) - dump_memory_content (desc, spacing + 2); - switch (desc->tag) { - case GST_MTS_DESC_REGISTRATION: - { - const guint8 *data = desc->data + 2; -#define SAFE_CHAR(a) (g_ascii_isprint(a) ? a : '.') - g_printf ("%*s Registration : %c%c%c%c [%02x%02x%02x%02x]\n", spacing, - "", SAFE_CHAR (data[0]), SAFE_CHAR (data[1]), SAFE_CHAR (data[2]), - SAFE_CHAR (data[3]), data[0], data[1], data[2], data[3]); + switch (desc->tag) { + case GST_MTS_DESC_REGISTRATION: + { + const guint8 *data = desc->data + 2; + g_printf ("%*s Registration : %c%c%c%c [%02x%02x%02x%02x]\n", spacing, + "", SAFE_CHAR (data[0]), SAFE_CHAR (data[1]), SAFE_CHAR (data[2]), + SAFE_CHAR (data[3]), data[0], data[1], data[2], data[3]); - break; - } - case GST_MTS_DESC_CA: - { - guint16 ca_pid, ca_system_id; - const guint8 *private_data; - gsize private_data_size; - if (gst_mpegts_descriptor_parse_ca (desc, &ca_system_id, &ca_pid, - &private_data, &private_data_size)) { - g_printf ("%*s CA system id : 0x%04x\n", spacing, "", ca_system_id); - g_printf ("%*s CA PID : 0x%04x\n", spacing, "", ca_pid); - if (private_data_size) { - g_printf ("%*s Private Data :\n", spacing, ""); - dump_memory_bytes ((guint8 *) private_data, private_data_size, - spacing + 2); - } - } - break; - } - case GST_MTS_DESC_DVB_NETWORK_NAME: - { - gchar *network_name; - if (gst_mpegts_descriptor_parse_dvb_network_name (desc, &network_name)) { - g_printf ("%*s Network Name : %s\n", spacing, "", network_name); - g_free 
(network_name); + break; + } + case GST_MTS_DESC_CA: + { + guint16 ca_pid, ca_system_id; + const guint8 *private_data; + gsize private_data_size; + if (gst_mpegts_descriptor_parse_ca (desc, &ca_system_id, &ca_pid, + &private_data, &private_data_size)) { + g_printf ("%*s CA system id : 0x%04x\n", spacing, "", ca_system_id); + g_printf ("%*s CA PID : 0x%04x\n", spacing, "", ca_pid); + if (private_data_size) { + g_printf ("%*s Private Data :\n", spacing, ""); + dump_memory_bytes ((guint8 *) private_data, private_data_size, + spacing + 2); } - break; } - case GST_MTS_DESC_DVB_SERVICE_LIST: - { - dump_dvb_service_list (desc, spacing + 2); - break; + break; + } + case GST_MTS_DESC_DVB_NETWORK_NAME: + { + gchar *network_name; + if (gst_mpegts_descriptor_parse_dvb_network_name (desc, &network_name)) { + g_printf ("%*s Network Name : %s\n", spacing, "", network_name); + g_free (network_name); } - case GST_MTS_DESC_DVB_CABLE_DELIVERY_SYSTEM: - dump_cable_delivery_descriptor (desc, spacing + 2); - break; - case GST_MTS_DESC_DVB_TERRESTRIAL_DELIVERY_SYSTEM: - dump_terrestrial_delivery (desc, spacing + 2); - break; - case GST_MTS_DESC_DVB_BOUQUET_NAME: - { - gchar *bouquet_name; - if (gst_mpegts_descriptor_parse_dvb_bouquet_name (desc, &bouquet_name)) { - g_printf ("%*s Bouquet Name Descriptor, bouquet_name:%s\n", spacing, - "", bouquet_name); - g_free (bouquet_name); - } - break; + break; + } + case GST_MTS_DESC_DVB_SERVICE_LIST: + { + dump_dvb_service_list (desc, spacing + 2); + break; + } + case GST_MTS_DESC_DVB_CABLE_DELIVERY_SYSTEM: + dump_cable_delivery_descriptor (desc, spacing + 2); + break; + case GST_MTS_DESC_DVB_TERRESTRIAL_DELIVERY_SYSTEM: + dump_terrestrial_delivery (desc, spacing + 2); + break; + case GST_MTS_DESC_DVB_BOUQUET_NAME: + { + gchar *bouquet_name; + if (gst_mpegts_descriptor_parse_dvb_bouquet_name (desc, &bouquet_name)) { + g_printf ("%*s Bouquet Name Descriptor, bouquet_name:%s\n", spacing, + "", bouquet_name); + g_free (bouquet_name); } - case 
GST_MTS_DESC_DTG_LOGICAL_CHANNEL: - dump_logical_channel_descriptor (desc, spacing + 2); - break; - case GST_MTS_DESC_DVB_SERVICE: - { - gchar *service_name, *provider_name; - GstMpegtsDVBServiceType service_type; - if (gst_mpegts_descriptor_parse_dvb_service (desc, &service_type, - &service_name, &provider_name)) { - g_printf ("%*s Service Descriptor, type:0x%02x (%s)\n", spacing, "", - service_type, enum_name (GST_TYPE_MPEGTS_DVB_SERVICE_TYPE, - service_type)); - g_printf ("%*s service_name : %s\n", spacing, "", service_name); - g_printf ("%*s provider_name : %s\n", spacing, "", - provider_name); - g_free (service_name); - g_free (provider_name); + break; + } + case GST_MTS_DESC_DVB_SERVICE: + { + gchar *service_name, *provider_name; + GstMpegtsDVBServiceType service_type; + if (gst_mpegts_descriptor_parse_dvb_service (desc, &service_type, + &service_name, &provider_name)) { + g_printf ("%*s Service Descriptor, type:0x%02x (%s)\n", spacing, "", + service_type, enum_name (GST_TYPE_MPEGTS_DVB_SERVICE_TYPE, + service_type)); + g_printf ("%*s service_name : %s\n", spacing, "", service_name); + g_printf ("%*s provider_name : %s\n", spacing, "", provider_name); + g_free (service_name); + g_free (provider_name); - } - break; - } - case GST_MTS_DESC_DVB_MULTILINGUAL_BOUQUET_NAME: - { - dump_multiligual_bouquet_name (desc, spacing + 2); - break; } - case GST_MTS_DESC_DVB_MULTILINGUAL_NETWORK_NAME: - { - dump_multiligual_network_name (desc, spacing + 2); - break; - } - case GST_MTS_DESC_DVB_MULTILINGUAL_SERVICE_NAME: - { - dump_multiligual_service_name (desc, spacing + 2); - break; - } - case GST_MTS_DESC_DVB_MULTILINGUAL_COMPONENT: - { - dump_multiligual_component (desc, spacing + 2); - break; - } - case GST_MTS_DESC_DVB_PRIVATE_DATA_SPECIFIER: - { - guint32 specifier; - guint8 len = 0, *data = NULL; - - if (gst_mpegts_descriptor_parse_dvb_private_data_specifier (desc, - &specifier, &data, &len)) { - g_printf ("%*s private_data_specifier : 0x%08x\n", spacing, "", - 
specifier); - if (len > 0) { - dump_memory_bytes (data, len, spacing + 2); - g_free (data); - } + break; + } + case GST_MTS_DESC_DVB_MULTILINGUAL_BOUQUET_NAME: + { + dump_multiligual_bouquet_name (desc, spacing + 2); + break; + } + case GST_MTS_DESC_DVB_MULTILINGUAL_NETWORK_NAME: + { + dump_multiligual_network_name (desc, spacing + 2); + break; + } + case GST_MTS_DESC_DVB_MULTILINGUAL_SERVICE_NAME: + { + dump_multiligual_service_name (desc, spacing + 2); + break; + } + case GST_MTS_DESC_DVB_MULTILINGUAL_COMPONENT: + { + dump_multiligual_component (desc, spacing + 2); + break; + } + case GST_MTS_DESC_DVB_PRIVATE_DATA_SPECIFIER: + { + guint32 specifier; + guint8 len = 0, *data = NULL; + + if (gst_mpegts_descriptor_parse_dvb_private_data_specifier (desc, + &specifier, &data, &len)) { + g_printf ("%*s private_data_specifier : 0x%08x\n", spacing, "", + specifier); + if (len > 0) { + dump_memory_bytes (data, len, spacing + 2); + g_free (data); } - break; } - case GST_MTS_DESC_DVB_FREQUENCY_LIST: - { - gboolean offset; - GArray *list; - if (gst_mpegts_descriptor_parse_dvb_frequency_list (desc, &offset, - &list)) { - guint j; - for (j = 0; j < list->len; j++) { - guint32 freq = g_array_index (list, guint32, j); - g_printf ("%*s Frequency : %u %s\n", spacing, "", freq, - offset ? "kHz" : "Hz"); - } - g_array_unref (list); + break; + } + case GST_MTS_DESC_DVB_FREQUENCY_LIST: + { + gboolean offset; + GArray *list; + if (gst_mpegts_descriptor_parse_dvb_frequency_list (desc, &offset, &list)) { + guint j; + for (j = 0; j < list->len; j++) { + guint32 freq = g_array_index (list, guint32, j); + g_printf ("%*s Frequency : %u %s\n", spacing, "", freq, + offset ? 
               "kHz" : "Hz");
         }
-        break;
+        g_array_unref (list);
       }
-      case GST_MTS_DESC_DVB_LINKAGE:
-        dump_linkage (desc, spacing + 2);
-        break;
-      case GST_MTS_DESC_DVB_COMPONENT:
-        dump_component (desc, spacing + 2);
-        break;
-      case GST_MTS_DESC_DVB_STREAM_IDENTIFIER:
-      {
-        guint8 tag;
-        if (gst_mpegts_descriptor_parse_dvb_stream_identifier (desc, &tag)) {
-          g_printf ("%*s Component Tag : 0x%02x\n", spacing, "", tag);
-        }
-        break;
+      break;
+    }
+    case GST_MTS_DESC_DVB_LINKAGE:
+      dump_linkage (desc, spacing + 2);
+      break;
+    case GST_MTS_DESC_DVB_COMPONENT:
+      dump_component (desc, spacing + 2);
+      break;
+    case GST_MTS_DESC_DVB_STREAM_IDENTIFIER:
+    {
+      guint8 tag;
+      if (gst_mpegts_descriptor_parse_dvb_stream_identifier (desc, &tag)) {
+        g_printf ("%*s Component Tag : 0x%02x\n", spacing, "", tag);
       }
-      }
-      case GST_MTS_DESC_DVB_CA_IDENTIFIER:
-      {
-        GArray *list;
-        guint j;
-        guint16 ca_id;
-        if (gst_mpegts_descriptor_parse_dvb_ca_identifier (desc, &list)) {
-          for (j = 0; j < list->len; j++) {
-            ca_id = g_array_index (list, guint16, j);
-            g_printf ("%*s CA Identifier : 0x%04x\n", spacing, "", ca_id);
-          }
-          g_array_unref (list);
+      break;
+    }
+    case GST_MTS_DESC_DVB_CA_IDENTIFIER:
+    {
+      GArray *list;
+      guint j;
+      guint16 ca_id;
+      if (gst_mpegts_descriptor_parse_dvb_ca_identifier (desc, &list)) {
+        for (j = 0; j < list->len; j++) {
+          ca_id = g_array_index (list, guint16, j);
+          g_printf ("%*s CA Identifier : 0x%04x\n", spacing, "", ca_id);
         }
-        break;
+        g_array_unref (list);
       }
-      case GST_MTS_DESC_DVB_CONTENT:
-        dump_content (desc, spacing + 2);
-        break;
-      case GST_MTS_DESC_DVB_PARENTAL_RATING:
-      {
-        GPtrArray *ratings;
-        guint j;
-
-        if (gst_mpegts_descriptor_parse_dvb_parental_rating (desc, &ratings)) {
-          for (j = 0; j < ratings->len; j++) {
-            GstMpegtsDVBParentalRatingItem *item =
-                g_ptr_array_index (ratings, j);
-            g_printf ("%*s country_code : %s\n", spacing, "",
-                item->country_code);
-            g_printf ("%*s rating age : %d\n", spacing, "", item->rating);
-          }
-          g_ptr_array_unref (ratings);
+      break;
+    }
+    case GST_MTS_DESC_DVB_CONTENT:
+      dump_content (desc, spacing + 2);
+      break;
+    case GST_MTS_DESC_DVB_PARENTAL_RATING:
+    {
+      GPtrArray *ratings;
+      guint j;
+
+      if (gst_mpegts_descriptor_parse_dvb_parental_rating (desc, &ratings)) {
+        for (j = 0; j < ratings->len; j++) {
+          GstMpegtsDVBParentalRatingItem *item = g_ptr_array_index (ratings, j);
+          g_printf ("%*s country_code : %s\n", spacing, "",
+              item->country_code);
+          g_printf ("%*s rating age : %d\n", spacing, "", item->rating);
         }
-        break;
+        g_ptr_array_unref (ratings);
       }
-      case GST_MTS_DESC_DVB_DATA_BROADCAST:
-      {
-        GstMpegtsDataBroadcastDescriptor *res;
-
-        if (gst_mpegts_descriptor_parse_dvb_data_broadcast (desc, &res)) {
-          g_printf ("%*s data_broadcast_id : 0x%04x\n", spacing, "",
-              res->data_broadcast_id);
-          g_printf ("%*s component_tag : 0x%02x\n", spacing, "",
-              res->component_tag);
-          if (res->length > 0) {
-            g_printf ("%*s selector_bytes:\n", spacing, "");
-            dump_memory_bytes (res->selector_bytes, res->length, spacing + 2);
-          }
-          g_printf ("%*s text : %s\n", spacing, "",
-              res->text ? res->text : "NULL");
-          gst_mpegts_dvb_data_broadcast_descriptor_free (res);
+      break;
+    }
+    case GST_MTS_DESC_DVB_DATA_BROADCAST:
+    {
+      GstMpegtsDataBroadcastDescriptor *res;
+
+      if (gst_mpegts_descriptor_parse_dvb_data_broadcast (desc, &res)) {
+        g_printf ("%*s data_broadcast_id : 0x%04x\n", spacing, "",
+            res->data_broadcast_id);
+        g_printf ("%*s component_tag : 0x%02x\n", spacing, "",
+            res->component_tag);
+        if (res->length > 0) {
+          g_printf ("%*s selector_bytes:\n", spacing, "");
+          dump_memory_bytes (res->selector_bytes, res->length, spacing + 2);
        }
+        g_printf ("%*s text : %s\n", spacing, "",
+            res->text ? res->text : "NULL");
+        gst_mpegts_dvb_data_broadcast_descriptor_free (res);
      }
-      case GST_MTS_DESC_ISO_639_LANGUAGE:
-        dump_iso_639_language (desc, spacing + 2);
-        break;
-      case GST_MTS_DESC_DVB_SHORT_EVENT:
-      {
-        gchar *language_code, *event_name, *text;
-        if (gst_mpegts_descriptor_parse_dvb_short_event (desc, &language_code,
-                &event_name, &text)) {
-          g_printf ("%*s Short Event, language_code:%s\n", spacing, "",
-              language_code);
-          g_printf ("%*s event_name : %s\n", spacing, "", event_name);
-          g_printf ("%*s text : %s\n", spacing, "", text);
-          g_free (language_code);
-          g_free (event_name);
-          g_free (text);
-        }
+      break;
+    }
+    case GST_MTS_DESC_ISO_639_LANGUAGE:
+      dump_iso_639_language (desc, spacing + 2);
+      break;
+    case GST_MTS_DESC_DVB_SHORT_EVENT:
+    {
+      gchar *language_code, *event_name, *text;
+      if (gst_mpegts_descriptor_parse_dvb_short_event (desc, &language_code,
+              &event_name, &text)) {
+        g_printf ("%*s Short Event, language_code:%s\n", spacing, "",
+            language_code);
+        g_printf ("%*s event_name : %s\n", spacing, "", event_name);
+        g_printf ("%*s text : %s\n", spacing, "", text);
+        g_free (language_code);
+        g_free (event_name);
+        g_free (text);
       }
-      break;
-      case GST_MTS_DESC_DVB_EXTENDED_EVENT:
-      {
-        dump_dvb_extended_event (desc, spacing + 2);
-        break;
+    }
+    break;
+    case GST_MTS_DESC_DVB_EXTENDED_EVENT:
+    {
+      dump_dvb_extended_event (desc, spacing + 2);
+      break;
+    }
+    case GST_MTS_DESC_DVB_SUBTITLING:
+    {
+      gchar *lang;
+      guint8 type;
+      guint16 composition;
+      guint16 ancillary;
+      guint j;
+
+      for (j = 0;
+          gst_mpegts_descriptor_parse_dvb_subtitling_idx (desc, j, &lang,
+              &type, &composition, &ancillary); j++) {
+        g_printf ("%*s Subtitling, language_code:%s\n", spacing, "", lang);
+        g_printf ("%*s type : %u\n", spacing, "", type);
+        g_printf ("%*s composition page id : %u\n", spacing, "",
+            composition);
+        g_printf ("%*s ancillary page id : %u\n", spacing, "",
+            ancillary);
+        g_free (lang);
       }
-      case GST_MTS_DESC_DVB_SUBTITLING:
-      {
-        gchar *lang;
-        guint8 type;
-        guint16 composition;
-        guint16 ancillary;
-        guint j;
-
-        for (j = 0;
-            gst_mpegts_descriptor_parse_dvb_subtitling_idx (desc, j, &lang,
-                &type, &composition, &ancillary); j++) {
-          g_printf ("%*s Subtitling, language_code:%s\n", spacing, "", lang);
-          g_printf ("%*s type : %u\n", spacing, "", type);
-          g_printf ("%*s composition page id : %u\n", spacing, "",
-              composition);
-          g_printf ("%*s ancillary page id : %u\n", spacing, "",
-              ancillary);
-          g_free (lang);
-        }
+    }
+      break;
+    case GST_MTS_DESC_DVB_TELETEXT:
+    {
+      GstMpegtsDVBTeletextType type;
+      gchar *lang;
+      guint8 magazine, page_number;
+      guint j;
+
+      for (j = 0;
+          gst_mpegts_descriptor_parse_dvb_teletext_idx (desc, j, &lang, &type,
+              &magazine, &page_number); j++) {
+        g_printf ("%*s Teletext, type:0x%02x (%s)\n", spacing, "", type,
+            enum_name (GST_TYPE_MPEGTS_DVB_TELETEXT_TYPE, type));
+        g_printf ("%*s language : %s\n", spacing, "", lang);
+        g_printf ("%*s magazine : %u\n", spacing, "", magazine);
+        g_printf ("%*s page number : %u\n", spacing, "", page_number);
+        g_free (lang);
       }
-      break;
-      case GST_MTS_DESC_DVB_TELETEXT:
-      {
-        GstMpegtsDVBTeletextType type;
-        gchar *lang;
-        guint8 magazine, page_number;
-        guint j;
+    }
+      break;
+    default:
+      break;
+  }
+}
 
-        for (j = 0;
-            gst_mpegts_descriptor_parse_dvb_teletext_idx (desc, j, &lang, &type,
-                &magazine, &page_number); j++) {
-          g_printf ("%*s Teletext, type:0x%02x (%s)\n", spacing, "", type,
-              enum_name (GST_TYPE_MPEGTS_DVB_TELETEXT_TYPE, type));
-          g_printf ("%*s language : %s\n", spacing, "", lang);
-          g_printf ("%*s magazine : %u\n", spacing, "", magazine);
-          g_printf ("%*s page number : %u\n", spacing, "", page_number);
-          g_free (lang);
-        }
-      }
+static void
+dump_descriptors (GPtrArray * descriptors, guint spacing)
+{
+  guint i;
+
+  for (i = 0; i < descriptors->len; i++) {
+    GstMpegtsDescriptor *desc = g_ptr_array_index (descriptors, i);
+    g_printf ("%*s [descriptor 0x%02x (%s) length:%d]\n", spacing, "",
+        desc->tag, descriptor_name (desc), desc->length);
+    if (DUMP_DESCRIPTORS)
+      dump_memory_content (desc, spacing + 2);
+    dump_generic_descriptor (desc, spacing + 2);
+  }
+}
+
+static void
+dump_nit_descriptors (GPtrArray * descriptors, guint spacing)
+{
+  /* Descriptors that can only appear in NIT */
+  guint i;
+
+  for (i = 0; i < descriptors->len; i++) {
+    GstMpegtsDescriptor *desc = g_ptr_array_index (descriptors, i);
+    g_printf ("%*s [descriptor 0x%02x (%s) length:%d]\n", spacing, "",
+        desc->tag, descriptor_name (desc), desc->length);
+    if (DUMP_DESCRIPTORS)
+      dump_memory_content (desc, spacing + 2);
+    switch (desc->tag) {
+      case GST_MTS_DESC_DTG_LOGICAL_CHANNEL:
+        dump_logical_channel_descriptor (desc, spacing + 2);
         break;
       default:
+        dump_generic_descriptor (desc, spacing + 2);
         break;
     }
   }
 }
+
 static void
 dump_pat (GstMpegtsSection * section)
 {
@@ -973,7 +1016,7 @@
 
     g_printf ("     transport_stream_id:0x%04x , original_network_id:0x%02x\n",
        stream->transport_stream_id, stream->original_network_id);
-    dump_descriptors (stream->descriptors, 9);
+    dump_nit_descriptors (stream->descriptors, 9);
   }
 }
@@ -1027,6 +1070,29 @@
   }
 }
 
+
+static void
+dump_sit (GstMpegtsSection * section)
+{
+  const GstMpegtsSIT *sit = gst_mpegts_section_get_sit (section);
+  guint i, len;
+
+  g_assert (sit);
+
+  dump_descriptors (sit->descriptors, 7);
+  len = sit->services->len;
+  g_printf ("   %d Services:\n", len);
+  for (i = 0; i < len; i++) {
+    GstMpegtsSITService *service = g_ptr_array_index (sit->services, i);
+    g_print
+        ("     service_id:0x%04x, running_status:0x%02x (%s)\n",
+        service->service_id, service->running_status,
+        enum_name (GST_TYPE_MPEGTS_RUNNING_STATUS, service->running_status));
+    dump_descriptors (service->descriptors, 9);
+  }
+}
+
+
 static void
 dump_tdt (GstMpegtsSection * section)
 {
@@ -1248,6 +1314,9 @@
     case GST_MPEGTS_SECTION_EIT:
       dump_eit (section);
       break;
+    case GST_MPEGTS_SECTION_SIT:
+      dump_sit (section);
+      break;
     case GST_MPEGTS_SECTION_ATSC_MGT:
       dump_mgt (section);
       break;
@@ -1340,8 +1409,10 @@
   g_type_class_ref (GST_TYPE_MPEGTS_RUNNING_STATUS);
   g_type_class_ref (GST_TYPE_MPEGTS_DESCRIPTOR_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_DVB_DESCRIPTOR_TYPE);
+  g_type_class_ref (GST_TYPE_MPEGTS_DVB_EXTENDED_DESCRIPTOR_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_ATSC_DESCRIPTOR_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_ISDB_DESCRIPTOR_TYPE);
+  g_type_class_ref (GST_TYPE_MPEGTS_SCTE_DESCRIPTOR_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_MISC_DESCRIPTOR_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_ISO639_AUDIO_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_DVB_SERVICE_TYPE);
@@ -1361,6 +1432,7 @@
   g_type_class_ref (GST_TYPE_MPEGTS_COMPONENT_STREAM_CONTENT);
   g_type_class_ref (GST_TYPE_MPEGTS_CONTENT_NIBBLE_HI);
   g_type_class_ref (GST_TYPE_MPEGTS_SCTE_STREAM_TYPE);
+  g_type_class_ref (GST_TYPE_MPEGTS_HDMV_STREAM_TYPE);
   g_type_class_ref (GST_TYPE_MPEGTS_SECTION_SCTE_TABLE_ID);
 
   mainloop = g_main_loop_new (NULL, FALSE);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/examples/mpegts/ts-scte-writer.c -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/mpegts/ts-scte-writer.c
Changed
@@ -29,9 +29,9 @@
   /* Splice is at 5s for 30s */
   if (out)
-    sit = gst_mpegts_scte_splice_out_new (1, 5 * 90000, 30 * 90000);
+    sit = gst_mpegts_scte_splice_out_new (1, 5 * GST_SECOND, 30 * GST_SECOND);
   else
-    sit = gst_mpegts_scte_splice_in_new (2, 35 * 90000);
+    sit = gst_mpegts_scte_splice_in_new (2, 35 * GST_SECOND);
 
   section = gst_mpegts_section_from_scte_sit (sit, 123);
   gst_mpegts_section_send_event (section, mux);
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/mpegts/tsmux-prog-map.c
Added
@@ -0,0 +1,99 @@
+/* GStreamer
+ *
+ * tsmux-prog-map.c: sample application to show how to construct the
+ * mpegtsmux prog-map property.
+ *
+ * MIT License
+ *
+ * Copyright (C) 2021 Jan Schmidt <jan@centricular.com>
+ *
+ * Permission is hereby granted, free of charge, to any person obtaining a copy
+ * of this software and associated documentation files (the "Software"), to deal
+ * in the Software without restriction, including without limitation the rights
+ * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ * copies of the Software, and to permit persons to whom the Software is
+ * furnished to do so, subject to the following conditions:
+ *
+ * The above copyright notice and this permission notice shall be included in all
+ * copies or substantial portions of the Software.
+ *
+ * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ * SOFTWARE.*
+ *
+ */
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#include <gst/gst.h>
+
+/* Test pipeline with h264video on PID 101 and AAC audio on PID 102
+ * The streams will be assigned to Program #2, with PMT on PID 100, and
+ * the PCR on the video stream */
+#define TEST_PIPELINE "videotestsrc num-buffers=90 ! video/x-raw,framerate=30/1 ! x264enc ! queue ! .sink_101 mpegtsmux name=mux ! fakesink audiotestsrc samplesperbuffer=4800 num-buffers=30 ! audio/x-raw,rate=48000 ! fdkaacenc ! aacparse ! mux.sink_102"
+
+static void
+_on_bus_message (GstBus * bus, GstMessage * message, GMainLoop * mainloop)
+{
+  switch (GST_MESSAGE_TYPE (message)) {
+    case GST_MESSAGE_ERROR:
+    case GST_MESSAGE_EOS:
+      g_main_loop_quit (mainloop);
+      break;
+    default:
+      break;
+  }
+}
+
+int
+main (int argc, char **argv)
+{
+  GstElement *pipeline = NULL;
+  GError *error = NULL;
+  GstBus *bus;
+  GMainLoop *mainloop;
+  GstElement *muxer;
+  GstStructure *prog_map;
+
+  gst_init (&argc, &argv);
+
+  pipeline = gst_parse_launch (TEST_PIPELINE, &error);
+  if (error) {
+    g_print ("Error constructing pipeline: %s\n", error->message);
+    g_clear_error (&error);
+    return 1;
+  }
+
+  mainloop = g_main_loop_new (NULL, FALSE);
+
+  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
+  gst_bus_add_signal_watch (bus);
+  g_signal_connect (bus, "message", (GCallback) _on_bus_message, mainloop);
+
+  /* Configure the program map here. The elementary streams get their PIDs from the pad name.
+   * Assign them to program # 2, with PMT on PID 100 */
+  muxer = gst_bin_get_by_name (GST_BIN (pipeline), "mux");
+
+  prog_map = gst_structure_new ("x-prog-map",
+      "sink_101", G_TYPE_INT, 2, "sink_102", G_TYPE_INT, 2,
+      "PMT_2", G_TYPE_UINT, 100, "PCR_2", G_TYPE_STRING, "sink_101", NULL);
+  g_object_set (muxer, "prog-map", prog_map, NULL);
+
+  gst_element_set_state (pipeline, GST_STATE_PLAYING);
+
+  g_main_loop_run (mainloop);
+
+  gst_element_set_state (pipeline, GST_STATE_NULL);
+
+  gst_object_unref (pipeline);
+  gst_object_unref (bus);
+
+  return 0;
+}
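The GstStructure built in main() above also has a string serialization, which is handy for quick gst-launch-1.0 experiments, assuming structure-typed properties can be given in that form on the command line (a hedged sketch; the full pipeline is kept as a comment because it needs GStreamer 1.20 with the x264enc/fdkaacenc plugins):

```shell
# Hypothetical gst-launch-1.0 equivalent of the C example (untested sketch):
#
#   gst-launch-1.0 \
#     videotestsrc num-buffers=90 ! x264enc ! queue ! mux.sink_101 \
#     audiotestsrc num-buffers=30 ! fdkaacenc ! aacparse ! mux.sink_102 \
#     mpegtsmux name=mux \
#       prog-map="x-prog-map,sink_101=2,sink_102=2,PMT_2=(uint)100,PCR_2=sink_101" \
#     ! fakesink
#
# The serialized structure mirrors the gst_structure_new() call: bare
# integers parse as gint (program numbers), "(uint)" forces the PMT PID type.
printf 'x-prog-map,sink_101=2,sink_102=2,PMT_2=(uint)100,PCR_2=sink_101\n'
```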
View file
gst-plugins-bad-1.18.6.tar.xz/tests/examples/playout.c -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/playout.c
Changed
@@ -824,7 +824,7 @@
   /* Hook up to mixers and remove the probes blocking the pads */
   if (item->audio_pad) {
     GST_DEBUG ("%s: hooking up audio pad to mixer", item->fn);
-    sinkpad = gst_element_get_request_pad (app->audio_mixer, "sink_%u");
+    sinkpad = gst_element_request_pad_simple (app->audio_mixer, "sink_%u");
     gst_pad_link (item->audio_pad, sinkpad);
 
     segment_time = playout_item_pad_get_segment_time (item->audio_pad);
@@ -857,7 +857,7 @@
 
   if (item->video_pad) {
     GST_DEBUG ("%s: hooking up video pad to mixer", item->fn);
-    sinkpad = gst_element_get_request_pad (app->video_mixer, "sink_%u");
+    sinkpad = gst_element_request_pad_simple (app->video_mixer, "sink_%u");
 
     /* Get new height/width/xpos/ypos such that the video scales up or down to
      * fit within the output video size without any cropping */
View file
gst-plugins-bad-1.18.6.tar.xz/tests/examples/va/main.c -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/va/main.c
Changed
@@ -14,6 +14,7 @@
 #include <gst/gst.h>
 #include <gst/app/gstappsink.h>
 #include <gst/video/video.h>
+#include <gst/va/gstvadisplay.h>
 
 #include <va/va_x11.h>
 
@@ -44,11 +45,11 @@
 
   gst_println ("got need context %s", context_type);
 
-  if (g_strcmp0 (context_type, "gst.va.display.handle") == 0) {
+  if (g_strcmp0 (context_type, GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR) == 0) {
     GstContext *context;
     GstStructure *s;
 
-    context = gst_context_new ("gst.va.display.handle", TRUE);
+    context = gst_context_new (GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR, TRUE);
     s = gst_context_writable_structure (context);
     gst_structure_set (s, "va-display", G_TYPE_POINTER, app->va_dpy, NULL);
     gst_element_set_context (GST_ELEMENT (msg->src), context);
@@ -228,7 +229,7 @@
   gst_object_unref (src);
 
   sink = gst_bin_get_by_name (GST_BIN (app->pipeline), "sink");
-  caps = gst_caps_from_string ("video/x-raw(memory:VAMemory)");
+  caps = gst_caps_from_string ("video/x-raw(" GST_CAPS_FEATURE_MEMORY_VA ")");
   g_object_set (sink, "caps", caps, NULL);
   gst_caps_unref (caps);
   gst_app_sink_set_callbacks (GST_APP_SINK (sink), &callbacks, app, NULL);
View file
gst-plugins-bad-1.18.6.tar.xz/tests/examples/va/meson.build -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/va/meson.build
Changed
@@ -1,14 +1,26 @@
+if not gstva_dep.found()
+  subdir_done()
+endif
+
 gtk_dep = dependency('gtk+-3.0', required : get_option('examples'))
 gtk_x11_dep = dependency('gtk+-x11-3.0', required : get_option('examples'))
 x11_dep = dependency('x11', required : get_option('examples'))
 libva_x11_dep = dependency('libva-x11', version: libva_req, required: get_option('examples'))
 
-if have_va and gtk_dep.found() and gtk_x11_dep.found() and x11_dep.found() and libva_x11_dep.found()
+if gtk_dep.found() and gtk_x11_dep.found() and x11_dep.found() and libva_x11_dep.found()
   executable('va-x11-render',
     'main.c',
     install: false,
     include_directories : [configinc],
-    dependencies : [gtk_dep, gtk_x11_dep, x11_dep, gst_dep, gstapp_dep, gstvideo_dep, libva_dep, libva_x11_dep],
+    dependencies : [gtk_dep, gtk_x11_dep, x11_dep, gst_dep, gstapp_dep, gstvideo_dep, gstva_dep, libva_x11_dep],
     c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API'],
   )
 endif
+
+executable('multiple-vpp',
+  'multiple-vpp.c',
+  install: false,
+  include_directories : [configinc],
+  dependencies : [gst_dep, gstvideo_dep, gstva_dep, gstcontroller_dep],
+  c_args : gst_plugins_bad_args + ['-DGST_USE_UNSTABLE_API'],
+)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/va/multiple-vpp.c
Added
@@ -0,0 +1,456 @@ +#include <stdlib.h> + +#include <gst/gst.h> +#include <gst/video/video.h> +#include <gst/controller/gstinterpolationcontrolsource.h> +#include <gst/controller/gstdirectcontrolbinding.h> +#include <gst/va/gstvadisplay.h> + +#define CHANGE_DIR_WITH_EVENT 0 + +static gint num_buffers = 50; +static gboolean camera = FALSE; +static gboolean randomcb = FALSE; +static gboolean randomdir = FALSE; +static gboolean randomsharpen = FALSE; +static gboolean randomcrop = FALSE; + +static GOptionEntry entries[] = { + {"num-buffers", 'n', 0, G_OPTION_ARG_INT, &num_buffers, + "Number of buffers (<= 0 : forever)", "N"}, + {"camera", 'c', 0, G_OPTION_ARG_NONE, &camera, + "Use default v4l2src as video source", NULL}, + {"random-cb", 'r', 0, G_OPTION_ARG_NONE, &randomcb, + "Change colorbalance randomly every second (if supported)", NULL}, + {"random-dir", 'd', 0, G_OPTION_ARG_NONE, &randomdir, + "Change video direction randomly every second (if supported)", NULL}, + {"random-sharpen", 's', 0, G_OPTION_ARG_NONE, &randomsharpen, + "Change sharpen filter randomly every second (if supported)", NULL}, + {"random-crop", 'p', 0, G_OPTION_ARG_NONE, &randomcrop, + "Change cropping randomly every 150 miliseconds", NULL}, + {NULL}, +}; + +struct _app +{ + GMainLoop *loop; + GstObject *display; + GstElement *pipeline; + GstElement *vpp; + GstElement *crop; + GMutex mutex; + + GstControlSource *sharpen; + gint right, left, top, bottom; + gint ldir, rdir, tdir, bdir; +}; + +static GstBusSyncReply +context_handler (GstBus * bus, GstMessage * msg, gpointer data) +{ + struct _app *app = data; + const gchar *context_type; + + switch (GST_MESSAGE_TYPE (msg)) { + case GST_MESSAGE_HAVE_CONTEXT:{ + GstContext *context = NULL; + + gst_message_parse_have_context (msg, &context); + if (context) { + context_type = gst_context_get_context_type (context); + + if (g_strcmp0 (context_type, + GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR) == 0) { + const GstStructure *s = gst_context_get_structure 
(context); + GstObject *display = NULL; + + gst_printerr ("got have context %s from %s: ", context_type, + GST_MESSAGE_SRC_NAME (msg)); + + gst_structure_get (s, "gst-display", GST_TYPE_OBJECT, &display, NULL); + gst_printerrln ("%s", display ? + GST_OBJECT_NAME (display) : "no gst display"); + gst_context_unref (context); + + if (display) { + g_mutex_lock (&app->mutex); + gst_object_replace (&app->display, display); + gst_object_unref (display); + g_mutex_unlock (&app->mutex); + } + } + } + + gst_message_unref (msg); + + return GST_BUS_DROP; + } + case GST_MESSAGE_NEED_CONTEXT: + gst_message_parse_context_type (msg, &context_type); + + if (g_strcmp0 (context_type, GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR) == 0) { + GstContext *context; + GstStructure *s; + + gst_printerr ("got need context %s from %s: ", context_type, + GST_MESSAGE_SRC_NAME (msg)); + + g_mutex_lock (&app->mutex); + if (!app->display) { + g_mutex_unlock (&app->mutex); + gst_printerrln ("no gst display yet"); + gst_message_unref (msg); + return GST_BUS_DROP; + } + + context = + gst_context_new (GST_VA_DISPLAY_HANDLE_CONTEXT_TYPE_STR, TRUE); + s = gst_context_writable_structure (context); + gst_structure_set (s, "gst-display", GST_TYPE_OBJECT, app->display, + NULL); + gst_printerrln ("%s", GST_OBJECT_NAME (app->display)); + gst_element_set_context (GST_ELEMENT (GST_MESSAGE_SRC (msg)), context); + gst_context_unref (context); + g_mutex_unlock (&app->mutex); + + } + + gst_message_unref (msg); + + return GST_BUS_DROP; + + default: + break; + } + + return GST_BUS_PASS; +} + +static gboolean +message_handler (GstBus * bus, GstMessage * msg, gpointer data) +{ + struct _app *app = data; + + switch (GST_MESSAGE_TYPE (msg)) { + case GST_MESSAGE_EOS: + g_main_loop_quit (app->loop); + break; + case GST_MESSAGE_ERROR:{ + gchar *debug = NULL; + GError *err = NULL; + + gst_message_parse_error (msg, &err, &debug); + + if (err) { + gst_printerrln ("GStreamer error: %s\n%s", err->message, + debug ? 
debug : ""); + g_error_free (err); + } + + if (debug) + g_free (debug); + + g_main_loop_quit (app->loop); + break; + } + default: + break; + } + + return TRUE; +} + +static void +config_simple (struct _app *app) +{ + GParamSpec *pspec; + GObjectClass *g_class = G_OBJECT_GET_CLASS (app->vpp); + const static gchar *props[] = { "brightness", "hue", "saturation", + "contrast" + }; + gfloat max; + guint i; + + if (camera && (pspec = g_object_class_find_property (g_class, "skin-tone"))) { + if (G_PARAM_SPEC_TYPE (pspec) == G_TYPE_BOOLEAN) { + g_object_set (app->vpp, "skin-tone", TRUE, NULL); + } else { + max = ((GParamSpecFloat *) pspec)->maximum; + g_object_set (app->vpp, "skin-tone", max, NULL); + } + + return; + } + + for (i = 0; i < G_N_ELEMENTS (props); i++) { + pspec = g_object_class_find_property (g_class, props[i]); + if (!pspec) + continue; + + max = ((GParamSpecFloat *) pspec)->maximum; + g_object_set (app->vpp, props[i], max, NULL); + } +} + +static gboolean +build_pipeline (struct _app *app) +{ + GstElement *src; + GstBus *bus; + GError *err = NULL; + GString *cmd = g_string_new (NULL); + const gchar *source = camera ? "v4l2src" : "videotestsrc"; + + g_string_printf (cmd, "%s name=src ! tee name=t " + "t. ! queue ! videocrop name=crop ! vapostproc name=vpp ! " + "fpsdisplaysink video-sink=autovideosink " + "t. ! queue ! vapostproc ! timeoverlay ! 
autovideosink", source); + + app->pipeline = gst_parse_launch (cmd->str, &err); + g_string_free (cmd, TRUE); + if (err) { + gst_printerrln ("Couldn't create pipeline: %s", err->message); + g_error_free (err); + return FALSE; + } + + if (num_buffers > 0) { + src = gst_bin_get_by_name (GST_BIN (app->pipeline), "src"); + g_object_set (src, "num-buffers", num_buffers, NULL); + gst_object_unref (src); + } + + app->vpp = gst_bin_get_by_name (GST_BIN (app->pipeline), "vpp"); + if (!randomcb && !randomdir && !randomsharpen && !randomcrop) + config_simple (app); + + app->crop = gst_bin_get_by_name (GST_BIN (app->pipeline), "crop"); + + bus = gst_pipeline_get_bus (GST_PIPELINE (app->pipeline)); + gst_bus_set_sync_handler (bus, context_handler, app, NULL); + gst_bus_add_watch (bus, message_handler, app); + gst_object_unref (bus); + + return TRUE; +} + +static gboolean +change_cb_randomly (gpointer data) +{ + struct _app *app = data; + GstColorBalance *cb; + GList *channels; + + if (!GST_COLOR_BALANCE_GET_INTERFACE (app->vpp)) + return G_SOURCE_REMOVE; + + cb = GST_COLOR_BALANCE (app->vpp); + channels = (GList *) gst_color_balance_list_channels (cb); + for (; channels && channels->data; channels = channels->next) { + GstColorBalanceChannel *channel = channels->data; + gint value = + g_random_int_range (channel->min_value, channel->max_value + 1); + + gst_color_balance_set_value (cb, channel, value); + } + + return G_SOURCE_CONTINUE; +} + +static gboolean +change_dir_randomly (gpointer data) +{ + struct _app *app = data; + GObjectClass *g_class = G_OBJECT_GET_CLASS (app->vpp); + GParamSpec *pspec; + + pspec = g_object_class_find_property (g_class, "video-direction"); + if (!pspec) + return G_SOURCE_REMOVE; + + /* choose either sent direction by property or by event */ +#if !CHANGE_DIR_WITH_EVENT + { + GEnumClass *enumclass; + guint idx, value; + + enumclass = G_PARAM_SPEC_ENUM (pspec)->enum_class; + idx = g_random_int_range (0, enumclass->n_values); + value = 
enumclass->values[idx].value; + + g_object_set (app->vpp, "video-direction", value, NULL); + } +#else + { + GstEvent *event; + guint idx; + static const gchar *orientation[] = { + "rotate-0", "rotate-90", "rotate-180", "rotate-270", + "flip-rotate-0", "flip-rotate-90", "flip-rotate-180", "flip-rotate-270", + "undefined", + }; + + idx = g_random_int_range (0, G_N_ELEMENTS (orientation)); + + event = gst_event_new_tag (gst_tag_list_new (GST_TAG_IMAGE_ORIENTATION, + orientation[idx], NULL)); + gst_element_send_event (app->pipeline, event); + } +#endif + + return G_SOURCE_CONTINUE; +} + +static inline GParamSpec * +vpp_has_sharpen (GstElement * vpp) +{ + GObjectClass *g_class = G_OBJECT_GET_CLASS (vpp); + return g_object_class_find_property (g_class, "sharpen"); +} + +static gboolean +change_sharpen_randomly (gpointer data) +{ + struct _app *app = data; + GParamSpec *pspec; + gdouble value; + + pspec = vpp_has_sharpen (app->vpp); + if (!pspec) + return G_SOURCE_REMOVE; + value = g_random_double_range (G_PARAM_SPEC_FLOAT (pspec)->minimum, + G_PARAM_SPEC_FLOAT (pspec)->maximum); + + gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE + (app->sharpen), GST_SECOND, value); + + return G_SOURCE_CONTINUE; +} + +static gboolean +change_crop_randomly (gpointer data) +{ + struct _app *app = data; + + g_object_set (app->crop, "bottom", app->bottom, "top", app->top, "left", + app->left, "right", app->right, NULL); + + app->top += app->tdir; + if (app->top >= 80) + app->tdir = -10; + else if (app->top < 10) + app->tdir = 10; + + app->bottom += app->bdir; + if (app->bottom >= 60) + app->bdir = -10; + else if (app->bottom < 10) + app->bdir = 10; + + app->left += app->ldir; + if (app->left >= 100) + app->ldir = -10; + else if (app->left < 10) + app->ldir = 10; + + app->right += app->rdir; + if (app->right >= 80) + app->rdir = -10; + else if (app->right < 10) + app->rdir = 10; + + return G_SOURCE_CONTINUE; +} + +static gboolean +parse_arguments (int *argc, char ***argv) 
+{ + GOptionContext *ctxt; + GError *err = NULL; + + ctxt = g_option_context_new ("— Multiple VA postprocessors"); + g_option_context_add_main_entries (ctxt, entries, NULL); + g_option_context_add_group (ctxt, gst_init_get_option_group ()); + + if (!g_option_context_parse (ctxt, argc, argv, &err)) { + gst_printerrln ("option parsing failed: %s", err->message); + g_error_free (err); + return FALSE; + } + + g_option_context_free (ctxt); + return TRUE; +} + +int +main (int argc, char **argv) +{ + GstBus *bus; + struct _app app = { NULL, }; + int ret = EXIT_FAILURE; + + if (!parse_arguments (&argc, &argv)) + return EXIT_FAILURE; + + g_mutex_init (&app.mutex); + + app.loop = g_main_loop_new (NULL, TRUE); + + if (!build_pipeline (&app)) + goto gst_failed; + + if (randomcb) + g_timeout_add_seconds (1, change_cb_randomly, &app); + + if (randomdir) { +#if CHANGE_DIR_WITH_EVENT + gst_util_set_object_arg (G_OBJECT (app.vpp), "video-direction", "auto"); +#endif + g_timeout_add_seconds (1, change_dir_randomly, &app); + } + + if (randomsharpen && vpp_has_sharpen (app.vpp)) { + GstControlBinding *bind; + + app.sharpen = gst_interpolation_control_source_new (); + bind = gst_direct_control_binding_new_absolute (GST_OBJECT (app.vpp), + "sharpen", app.sharpen); + gst_object_add_control_binding (GST_OBJECT (app.vpp), bind); + g_object_set (app.sharpen, "mode", GST_INTERPOLATION_MODE_LINEAR, NULL); + + change_sharpen_randomly (&app); + g_timeout_add_seconds (1, change_sharpen_randomly, &app); + } + + if (randomcrop) { + app.bdir = app.ldir = app.rdir = app.tdir = 10; + g_timeout_add (150, change_crop_randomly, &app); + } + + gst_element_set_state (app.pipeline, GST_STATE_PLAYING); + + g_main_loop_run (app.loop); + + gst_element_set_state (app.pipeline, GST_STATE_NULL); + + bus = gst_pipeline_get_bus (GST_PIPELINE (app.pipeline)); + gst_bus_remove_watch (bus); + gst_object_unref (bus); + + gst_clear_object (&app.display); + + ret = EXIT_SUCCESS; + + gst_clear_object (&app.vpp); + 
gst_clear_object (&app.pipeline); + gst_clear_object (&app.sharpen); + gst_clear_object (&app.crop); + +gst_failed: + g_mutex_clear (&app.mutex); + g_main_loop_unref (app.loop); + + gst_deinit (); + + return ret; +}
View file
gst-plugins-bad-1.18.6.tar.xz/tests/examples/waylandsink/main.c -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/waylandsink/main.c
Changed
@@ -254,10 +254,10 @@
   } else {
     if (live) {
       d->pipeline = gst_parse_launch ("videotestsrc pattern=18 "
-          "background-color=0x000062FF is-live=true ! waylandsink", NULL);
+          "background-color=0xFF0062FF is-live=true ! waylandsink", NULL);
     } else {
       d->pipeline = gst_parse_launch ("videotestsrc pattern=18 "
-          "background-color=0x000062FF ! waylandsink", NULL);
+          "background-color=0xFF0062FF ! waylandsink", NULL);
     }
   }
 
View file
gst-plugins-bad-1.18.6.tar.xz/tests/examples/webrtc/webrtcrenego.c -> gst-plugins-bad-1.20.1.tar.xz/tests/examples/webrtc/webrtcrenego.c
Changed
@@ -229,6 +229,7 @@
 
   gst_pad_unlink (pad, peer);
   gst_element_release_request_pad (webrtc1, peer);
+  gst_object_unref (transceiver);
   gst_object_unref (peer);
   gst_object_unref (pad);
 
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/wpe
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/wpe/meson.build
Added
@@ -0,0 +1,14 @@
+examples = ['wpe']
+
+foreach example : examples
+  exe_name = example
+  src_file = '@0@.c'.format(example)
+
+  executable(exe_name,
+    src_file,
+    install: false,
+    include_directories : [configinc],
+    dependencies : [glib_dep, gst_dep],
+    c_args : ['-DHAVE_CONFIG_H=1'],
+  )
+endforeach
View file
gst-plugins-bad-1.20.1.tar.xz/tests/examples/wpe/wpe.c
Added
@@ -0,0 +1,157 @@
+/* Copyright (C) <2018, 2019> Philippe Normand <philn@igalia.com>
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public
+ * License along with this library; if not, write to the
+ * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#include <gst/gst.h>
+
+static GMainLoop *loop;
+static GstElement *pipe1;
+static GstBus *bus1;
+
+static gboolean
+_bus_watch (GstBus * bus, GstMessage * msg, GstElement * pipe)
+{
+  switch (GST_MESSAGE_TYPE (msg)) {
+    case GST_MESSAGE_STATE_CHANGED:
+      if (GST_ELEMENT (msg->src) == pipe) {
+        GstState old, new, pending;
+
+        gst_message_parse_state_changed (msg, &old, &new, &pending);
+
+        {
+          gchar *dump_name = g_strconcat ("state_changed-",
+              gst_element_state_get_name (old), "_",
+              gst_element_state_get_name (new), NULL);
+          GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (msg->src),
+              GST_DEBUG_GRAPH_SHOW_ALL, dump_name);
+          g_free (dump_name);
+        }
+      }
+      break;
+    case GST_MESSAGE_ERROR:{
+      GError *err = NULL;
+      gchar *dbg_info = NULL;
+
+      GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (pipe),
+          GST_DEBUG_GRAPH_SHOW_ALL, "error");
+
+      gst_message_parse_error (msg, &err, &dbg_info);
+      g_printerr ("ERROR from element %s: %s\n",
+          GST_OBJECT_NAME (msg->src), err->message);
+      g_printerr ("Debugging info: %s\n", (dbg_info) ? dbg_info : "none");
+      g_error_free (err);
+      g_free (dbg_info);
+      g_main_loop_quit (loop);
+      break;
+    }
+    case GST_MESSAGE_EOS:{
+      GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (pipe),
+          GST_DEBUG_GRAPH_SHOW_ALL, "eos");
+      g_print ("EOS received\n");
+      g_main_loop_quit (loop);
+      break;
+    }
+    default:
+      break;
+  }
+
+  return TRUE;
+}
+
+static void
+_wpe_pad_added (GstElement * src, GstPad * new_pad, GstElement * pipe)
+{
+  GstElement *out;
+  GstPad *sink;
+  gchar *name = gst_pad_get_name (new_pad);
+  gchar *bin_name;
+
+  out = gst_parse_bin_from_description
+      ("audioresample ! audioconvert ! autoaudiosink", TRUE, NULL);
+  bin_name = g_strdup_printf ("%s-bin", name);
+  g_free (name);
+
+  gst_object_set_name (GST_OBJECT_CAST (out), bin_name);
+  g_free (bin_name);
+
+  gst_bin_add (GST_BIN (pipe), out);
+  sink = out->sinkpads->data;
+  gst_pad_link (new_pad, sink);
+  gst_element_sync_state_with_parent (out);
+}
+
+static void
+_wpe_pad_removed (GstElement * src, GstPad * pad, GstElement * pipe)
+{
+  gchar *name = gst_pad_get_name (pad);
+  gchar *bin_name = g_strdup_printf ("%s-bin", name);
+  GstElement *bin = gst_bin_get_by_name (GST_BIN_CAST (pipe), bin_name);
+
+  if (GST_IS_ELEMENT (bin)) {
+    gst_bin_remove (GST_BIN_CAST (pipe), bin);
+    gst_element_set_state (bin, GST_STATE_NULL);
+  }
+  g_free (name);
+  g_free (bin_name);
+}
+
+int
+main (int argc, char *argv[])
+{
+  GstElement *src;
+
+  if (argc < 2) {
+    g_printerr ("Usage: %s <website url>\n", argv[0]);
+    return 1;
+  }
+
+  gst_init (&argc, &argv);
+
+  loop = g_main_loop_new (NULL, FALSE);
+  pipe1 =
+      gst_parse_launch
+      ("wpesrc name=wpesrc ! queue ! glcolorconvert ! gtkglsink enable-last-sample=0",
+      NULL);
+  bus1 = gst_pipeline_get_bus (GST_PIPELINE (pipe1));
+  gst_bus_add_watch (bus1, (GstBusFunc) _bus_watch, pipe1);
+
+  src = gst_bin_get_by_name (GST_BIN (pipe1), "wpesrc");
+
+  gst_element_set_state (GST_ELEMENT (pipe1), GST_STATE_READY);
+
+  g_signal_connect (src, "pad-added", G_CALLBACK (_wpe_pad_added), pipe1);
+  g_signal_connect (src, "pad-removed", G_CALLBACK (_wpe_pad_removed), pipe1);
+
+  g_object_set (src, "location", argv[1], NULL);
+  gst_clear_object (&src);
+
+  g_print ("Starting pipeline\n");
+  gst_element_set_state (GST_ELEMENT (pipe1), GST_STATE_PLAYING);
+
+  g_main_loop_run (loop);
+
+  gst_element_set_state (GST_ELEMENT (pipe1), GST_STATE_NULL);
+  g_print ("Pipeline stopped\n");
+
+  gst_bus_remove_watch (bus1);
+  gst_object_unref (bus1);
+  gst_object_unref (pipe1);
+
+  gst_deinit ();
+
+  return 0;
+}
View file
gst-plugins-bad-1.18.6.tar.xz/tests/meson.build -> gst-plugins-bad-1.20.1.tar.xz/tests/meson.build
Changed
@@ -1,6 +1,7 @@
 if not get_option('tests').disabled() and gstcheck_dep.found()
   subdir('check')
   subdir('icles')
+  subdir('validate')
 endif
 
 if not get_option('examples').disabled()
   subdir('examples')
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate/meson.build
Added
@@ -0,0 +1,29 @@
+gst_tester = find_program('gst-tester-@0@'.format(api_version), required: get_option('tests'))
+if not gst_tester.found()
+  subdir_done()
+endif
+
+tests = [
+  'opencv/cvtracker'
+]
+
+env = environment()
+env.set('GST_PLUGIN_PATH_1_0', meson.global_build_root(), pluginsdirs)
+env.set('GST_PLUGIN_SYSTEM_PATH_1_0', '')
+env.set('GST_REGISTRY', '@0@/@1@.registry'.format(meson.current_build_dir(), 'validate'))
+env.set('GST_PLUGIN_SCANNER_1_0', gst_plugin_scanner_path)
+env.set('GST_PLUGIN_LOADING_WHITELIST', 'gstreamer', 'gst-validate', 'gst-plugins-base',
+    'gst-plugins-bad@' + meson.project_build_root())
+
+foreach t: tests
+  test_dir_name = t.split('/')
+  test_name = 'validate'
+  foreach c: test_dir_name
+    test_name += '.' + c
+  endforeach
+  test_env = env
+  test_env.set('GST_VALIDATE_LOGSDIR', join_paths(meson.current_build_dir(), test_name))
+  test_file = join_paths(meson.current_source_dir(), t + '.validatetest')
+  test(test_name, gst_tester, args: [test_file, '--use-fakesinks'],
+    env: test_env, timeout : 3 * 60, protocol: 'tap')
+endforeach
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate/opencv
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate/opencv/cvtracker
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate/opencv/cvtracker.validatetest
Added
@@ -0,0 +1,8 @@
+meta,
+    args = {
+        "videotestsrc pattern=ball animation-mode=frames num-buffers=90 ! video/x-raw,framerate=30/1,width=320,height=240 ! cvtracker name=tracker object-initial-x=135 object-initial-y=95 object-initial-width=50 object-initial-height=50 algorithm=1 ! videoconvert ! $(videosink) sync=true",
+    },
+    configs = {
+        "$(validateflow), pad=tracker:src, record-buffers=true",
+    }
+crank-clock, repeat=91
\ No newline at end of file
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate/opencv/cvtracker/flow-expectations
Added
+(directory)
View file
gst-plugins-bad-1.20.1.tar.xz/tests/validate/opencv/cvtracker/flow-expectations/log-tracker-src-expected
Added
@@ -0,0 +1,94 @@
+event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)1;
+event caps: video/x-raw, format=(string)RGB, framerate=(fraction)30/1, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, width=(int)320;
+event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=none, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
+buffer: pts=0:00:00.000000000, dur=0:00:00.033333333, flags=discont, meta=GstVideoMeta
+buffer: pts=0:00:00.033333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=139, y=99, width=50, height=50]
+buffer: pts=0:00:00.066666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=144, y=104, width=49, height=49]
+buffer: pts=0:00:00.100000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=149, y=108, width=50, height=50]
+buffer: pts=0:00:00.133333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=153, y=113, width=49, height=49]
+buffer: pts=0:00:00.166666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=157, y=117, width=50, height=50]
+buffer: pts=0:00:00.200000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=161, y=121, width=50, height=50]
+buffer: pts=0:00:00.233333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=166, y=126, width=49, height=49]
+buffer: pts=0:00:00.266666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=171, y=130, width=49, height=49]
+buffer: pts=0:00:00.300000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=175, y=135, width=48, height=48]
+buffer: pts=0:00:00.333333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=179, y=138, width=49, height=49]
+buffer: pts=0:00:00.366666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=183, y=142, width=50, height=50]
+buffer: pts=0:00:00.400000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=187, y=146, width=49, height=49]
+buffer: pts=0:00:00.433333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=191, y=150, width=50, height=50]
+buffer: pts=0:00:00.466666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=194, y=153, width=50, height=50]
+buffer: pts=0:00:00.500000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=199, y=157, width=49, height=49]
+buffer: pts=0:00:00.533333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=202, y=160, width=50, height=50]
+buffer: pts=0:00:00.566666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=206, y=163, width=50, height=50]
+buffer: pts=0:00:00.600000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=210, y=166, width=50, height=50]
+buffer: pts=0:00:00.633333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=214, y=170, width=49, height=49]
+buffer: pts=0:00:00.666666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=218, y=173, width=49, height=49]
+buffer: pts=0:00:00.700000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=221, y=176, width=50, height=50]
+buffer: pts=0:00:00.733333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=223, y=178, width=51, height=51]
+buffer: pts=0:00:00.766666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=228, y=179, width=50, height=50]
+buffer: pts=0:00:00.800000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=231, y=183, width=49, height=49]
+buffer: pts=0:00:00.833333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=235, y=185, width=50, height=50]
+buffer: pts=0:00:00.866666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=237, y=186, width=50, height=50]
+buffer: pts=0:00:00.900000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=240, y=188, width=49, height=49]
+buffer: pts=0:00:00.933333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=243, y=190, width=50, height=50]
+buffer: pts=0:00:00.966666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=246, y=191, width=49, height=49]
+buffer: pts=0:00:01.000000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=249, y=192, width=50, height=50]
+buffer: pts=0:00:01.033333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=251, y=193, width=49, height=49]
+buffer: pts=0:00:01.066666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=254, y=194, width=49, height=49]
+buffer: pts=0:00:01.100000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=257, y=195, width=49, height=49]
+buffer: pts=0:00:01.133333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=258, y=195, width=50, height=50]
+buffer: pts=0:00:01.166666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=260, y=194, width=50, height=50]
+buffer: pts=0:00:01.200000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=261, y=195, width=49, height=49]
+buffer: pts=0:00:01.233333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=265, y=195, width=49, height=49]
+buffer: pts=0:00:01.266666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=265, y=195, width=50, height=50]
+buffer: pts=0:00:01.300000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=267, y=194, width=49, height=49]
+buffer: pts=0:00:01.333333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=268, y=193, width=50, height=50]
+buffer: pts=0:00:01.366666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=269, y=191, width=50, height=50]
+buffer: pts=0:00:01.400000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=271, y=191, width=49, height=49]
+buffer: pts=0:00:01.433333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=272, y=190, width=49, height=49]
+buffer: pts=0:00:01.466666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=274, y=188, width=49, height=49]
+buffer: pts=0:00:01.500000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=273, y=186, width=50, height=50]
+buffer: pts=0:00:01.533333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=274, y=184, width=49, height=49]
+buffer: pts=0:00:01.566666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=274, y=182, width=50, height=50]
+buffer: pts=0:00:01.600000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=275, y=179, width=49, height=49]
+buffer: pts=0:00:01.633333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=275, y=178, width=49, height=49]
+buffer: pts=0:00:01.666666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=275, y=175, width=50, height=50]
+buffer: pts=0:00:01.700000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=275, y=172, width=49, height=49]
+buffer: pts=0:00:01.733333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=275, y=169, width=49, height=49]
+buffer: pts=0:00:01.766666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=276, y=166, width=49, height=49]
+buffer: pts=0:00:01.800000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=274, y=164, width=49, height=49]
+buffer: pts=0:00:01.833333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=274, y=159, width=50, height=50]
+buffer: pts=0:00:01.866666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=273, y=155, width=50, height=50]
+buffer: pts=0:00:01.900000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=272, y=152, width=50, height=50]
+buffer: pts=0:00:01.933333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=271, y=148, width=50, height=50]
+buffer: pts=0:00:01.966666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=270, y=145, width=49, height=49]
+buffer: pts=0:00:02.000000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=269, y=141, width=49, height=49]
+buffer: pts=0:00:02.033333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=267, y=137, width=50, height=50]
+buffer: pts=0:00:02.066666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=265, y=133, width=49, height=49]
+buffer: pts=0:00:02.100000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=265, y=129, width=49, height=49]
+buffer: pts=0:00:02.133333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=262, y=125, width=49, height=49]
+buffer: pts=0:00:02.166666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=260, y=120, width=50, height=50]
+buffer: pts=0:00:02.200000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=258, y=116, width=49, height=49]
+buffer: pts=0:00:02.233333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=255, y=111, width=50, height=50]
+buffer: pts=0:00:02.266666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=253, y=107, width=49, height=49]
+buffer: pts=0:00:02.300000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=252, y=103, width=50, height=50]
+buffer: pts=0:00:02.333333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=248, y=98, width=49, height=49]
+buffer: pts=0:00:02.366666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=246, y=94, width=50, height=50]
+buffer: pts=0:00:02.400000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=243, y=89, width=49, height=49]
+buffer: pts=0:00:02.433333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=241, y=85, width=49, height=49]
+buffer: pts=0:00:02.466666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=237, y=81, width=49, height=49]
+buffer: pts=0:00:02.500000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=235, y=77, width=50, height=50]
+buffer: pts=0:00:02.533333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=231, y=72, width=49, height=49]
+buffer: pts=0:00:02.566666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=228, y=68, width=49, height=49]
+buffer: pts=0:00:02.600000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=225, y=63, width=50, height=50]
+buffer: pts=0:00:02.633333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=221, y=59, width=49, height=49]
+buffer: pts=0:00:02.666666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=218, y=55, width=49, height=49]
+buffer: pts=0:00:02.700000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=214, y=51, width=50, height=50]
+buffer: pts=0:00:02.733333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=210, y=47, width=49, height=49]
+buffer: pts=0:00:02.766666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=207, y=43, width=49, height=49]
+buffer: pts=0:00:02.800000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=203, y=40, width=50, height=50]
+buffer: pts=0:00:02.833333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=199, y=36, width=49, height=49]
+buffer: pts=0:00:02.866666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=195, y=33, width=49, height=49]
+buffer: pts=0:00:02.900000000, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=191, y=29, width=49, height=49]
+buffer: pts=0:00:02.933333333, dur=0:00:00.033333333, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=188, y=26, width=49, height=49]
+buffer: pts=0:00:02.966666666, dur=0:00:00.033333334, meta=GstVideoMeta, GstVideoRegionOfInterestMeta[x=182, y=22, width=50, height=50]
+event eos: (no structure)
View file
gst-plugins-bad-1.18.6.tar.xz/tools/gst-transcoder.c -> gst-plugins-bad-1.20.1.tar.xz/tools/gst-transcoder.c
Changed
@@ -22,6 +22,9 @@
 #include "utils.h"
 #include <gst/transcoder/gsttranscoder.h>
+#ifdef G_OS_UNIX
+#include <glib-unix.h>
+#endif
 
 static const gchar *HELP_SUMMARY =
     "gst-transcoder-1.0 transcodes a stream defined by its first <input-uri>\n"
@@ -65,6 +68,45 @@
   gchar *framerate;
 } Settings;
 
+#ifdef G_OS_UNIX
+static guint signal_watch_hup_id;
+static guint signal_watch_intr_id;
+
+static gboolean
+intr_handler (gpointer user_data)
+{
+  GstTranscoder *self = GST_TRANSCODER (user_data);
+  GstElement *pipeline = gst_transcoder_get_pipeline (self);
+
+  g_print ("handling interrupt.\n");
+
+  if (pipeline) {
+    gst_element_send_event (pipeline, gst_event_new_eos ());
+    g_object_unref (pipeline);
+  }
+
+  signal_watch_intr_id = 0;
+  return G_SOURCE_REMOVE;
+}
+
+static gboolean
+hup_handler (gpointer user_data)
+{
+  GstTranscoder *self = GST_TRANSCODER (user_data);
+  GstElement *pipeline = gst_transcoder_get_pipeline (self);
+
+  g_print ("handling hang up.\n");
+
+  if (pipeline) {
+    gst_element_send_event (pipeline, gst_event_new_eos ());
+    g_object_unref (pipeline);
+  }
+
+  signal_watch_intr_id = 0;
+  return G_SOURCE_REMOVE;
+}
+#endif /* G_OS_UNIX */
+
 static void
 position_updated_cb (GstTranscoder * transcoder, GstClockTime pos)
 {
@@ -236,10 +278,11 @@
           GST_TYPE_PAD_LINK_RETURN, &lret, "msg-source-type",
           G_TYPE_GTYPE, &type, NULL)
       && type == g_type_from_name ("GstTranscodeBin")) {
+    const gchar *debug = gst_structure_get_string (details, "debug");
+
     error ("\nCould not setup transcoding pipeline,"
         " make sure that your transcoding format parameters"
-        " are compatible with the input stream.");
-
+        " are compatible with the input stream.\n\n%s", debug);
     return;
   }
 }
@@ -281,6 +324,7 @@
   GError *err = NULL;
   GstTranscoder *transcoder;
   GOptionContext *ctx;
+  GstTranscoderSignalAdapter *signal_adapter;
   Settings settings = {
     .cpu_usage = 100,
     .rate = -1,
@@ -370,22 +414,39 @@
   }
 
   transcoder = gst_transcoder_new_full (settings.src_uri, settings.dest_uri,
-      settings.profile, NULL);
+      settings.profile);
   gst_transcoder_set_avoid_reencoding (transcoder, TRUE);
-
   gst_transcoder_set_cpu_usage (transcoder, settings.cpu_usage);
-  g_signal_connect (transcoder, "position-updated",
-      G_CALLBACK (position_updated_cb), NULL);
-  g_signal_connect (transcoder, "warning", G_CALLBACK (_warning_cb), NULL);
-  g_signal_connect (transcoder, "error", G_CALLBACK (_error_cb), NULL);
-  g_assert (transcoder);
+  signal_adapter = gst_transcoder_get_signal_adapter (transcoder, NULL);
+  g_signal_connect_swapped (signal_adapter, "position-updated",
+      G_CALLBACK (position_updated_cb), transcoder);
+  g_signal_connect_swapped (signal_adapter, "warning", G_CALLBACK (_warning_cb),
+      transcoder);
+  g_signal_connect_swapped (signal_adapter, "error", G_CALLBACK (_error_cb),
+      transcoder);
+
+
+#ifdef G_OS_UNIX
+  signal_watch_intr_id =
+      g_unix_signal_add (SIGINT, (GSourceFunc) intr_handler, transcoder);
+  signal_watch_hup_id =
+      g_unix_signal_add (SIGHUP, (GSourceFunc) hup_handler, transcoder);
+#endif
 
   ok ("Starting transcoding...");
   gst_transcoder_run (transcoder, &err);
+  g_object_unref (signal_adapter);
 
   if (!err)
     ok ("\nDONE.");
 
+#ifdef G_OS_UNIX
+  if (signal_watch_intr_id > 0)
+    g_source_remove (signal_watch_intr_id);
+  if (signal_watch_hup_id > 0)
+    g_source_remove (signal_watch_hup_id);
+#endif
+
 done:
   g_free (settings.dest_uri);
   g_free (settings.src_uri);
View file
gst-plugins-bad-1.18.6.tar.xz/tools/meson.build -> gst-plugins-bad-1.20.1.tar.xz/tools/meson.build
Changed
@@ -2,4 +2,5 @@
   'gst-transcoder.c', 'utils.c',
   install : true,
   dependencies : [gst_dep, gstpbutils_dep, gst_transcoder_dep],
+  c_args: ['-DG_LOG_DOMAIN="gst-transcoder-@0@"'.format(api_version)],
 )